Compare commits

..

15 Commits

Author SHA1 Message Date
93d50ac56b refactor: porcail api 2021-04-22 10:04:38 +02:00
0e73af4d49 refactor: update procelain import 2021-04-21 21:30:43 +02:00
00d5e622af feat: update replication 2021-04-21 14:33:18 +02:00
b358d3b305 refactor: change push api 2021-04-21 14:32:56 +02:00
203c10ccd3 feat: missing class attribute for registration 2021-04-21 11:22:53 +02:00
da349dd4a9 refactor: io_bpy architecture revamp 2021-04-21 11:10:24 +02:00
845bb11b8e fix: bl_object 2021-04-20 09:53:59 +02:00
9312d6a8c5 Merge branch 'develop' into 173-differential-revision-milestone-2-replication-refactoring 2021-04-20 09:47:23 +02:00
ca78a42076 feat: update submodules 2021-04-06 14:23:10 +02:00
ee886e00c8 feat: generate subtree 2021-04-06 14:21:26 +02:00
9c7043e84c fix: animation data error 2021-03-26 16:14:27 +01:00
e659b7da94 refactor: move implementation to static def 2021-03-26 12:30:15 +01:00
e3af69a9c8 feat: add replication to the submodules 2021-03-25 14:55:53 +01:00
328c651cea remove data handlign from ReplicatedDatablock 2021-03-24 16:08:12 +01:00
d4224c789a refactor: move commit to porcelain; feractor: remove is_ readonly 2021-03-24 11:20:40 +01:00
76 changed files with 1476 additions and 2404 deletions

.gitignore vendored

@ -13,5 +13,4 @@ multi_user_updater/
_build
# ignore generated zip generated from blender_addon_tester
*.zip
libs
*.zip


@ -1,13 +0,0 @@
stages:
- test
- build
- deploy
- doc
include:
- local: .gitlab/ci/test.gitlab-ci.yml
- local: .gitlab/ci/build.gitlab-ci.yml
- local: .gitlab/ci/deploy.gitlab-ci.yml
- local: .gitlab/ci/doc.gitlab-ci.yml


@ -8,5 +8,3 @@ build:
name: multi_user
paths:
- multi_user
variables:
GIT_SUBMODULE_STRATEGY: recursive


@ -5,7 +5,6 @@ deploy:
variables:
DOCKER_DRIVER: overlay2
DOCKER_TLS_CERTDIR: "/certs"
GIT_SUBMODULE_STRATEGY: recursive
services:
- docker:19.03.12-dind


@ -3,5 +3,3 @@ test:
image: slumber/blender-addon-testing:latest
script:
- python3 scripts/test_addon.py
variables:
GIT_SUBMODULE_STRATEGY: recursive

.gitmodules vendored

@ -1,3 +1,3 @@
[submodule "multi_user/libs/replication"]
path = multi_user/libs/replication
url = https://gitlab.com/slumber/replication.git
url = https://gitlab.com/slumber/replication


@ -32,32 +32,32 @@ Currently, not all data-block are supported for replication over the wire. The f
| Name | Status | Comment |
| -------------- | :----: | :----------------------------------------------------------: |
| action | ✔️ | |
| armature | ❗ | Not stable |
| camera | ✔️ | |
| collection | ✔️ | |
| curve | ❗ | Nurbs surfaces not supported |
| gpencil | ✔️ | |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| node_groups | ✔️ | Material & Geometry only |
| node_groups | | Material & Geometry only |
| geometry nodes | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| textures | ❗ | Supported for modifiers/materials/geo nodes only |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| volumes | ✔️ | |
| lightprobes | ✔️ | |
| physics | ✔️ | |
| curve | | Nurbs surfaces not supported |
| textures | | Supported for modifiers/materials/geo nodes only |
| armature | | Not stable |
| compositing | | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |
| texts | | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
| nla | | |
| volumes | ✔️ | |
| particles | ❗ | The cache isn't syncing. |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❗ | Mask and Clip not supported yet |
| physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| libraries | ❗ | Partial |
| nla | ❌ | |
| texts | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
| compositing | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |


@ -19,10 +19,10 @@ import sys
project = 'multi-user'
copyright = '2020, Swann Martinez'
author = 'Swann Martinez, Poochy, Fabian'
author = 'Swann Martinez, with contributions from Poochy'
# The full version, including alpha/beta/rc tags
release = '0.5.0-develop'
release = '0.2.0'
# -- General configuration ---------------------------------------------------

5 binary image files deleted (previews not shown): 320 KiB, 7.3 KiB, 4.2 KiB, 9.0 KiB, 3.2 KiB.


@ -108,69 +108,36 @@ Before starting make sure that you have access to the session IP address and por
1. Fill in your user information
--------------------------------
Joining a server
=======================
Follow the user-info_ section for this step.
--------------
Network setup
--------------
----------------
2. Network setup
----------------
In the network panel, select **JOIN**.
The **join sub-panel** (see image below) allows you to configure your client to join a
collaborative session which is already hosted.
.. figure:: img/server_preset_image_normal_server.png
:align: center
:width: 200px
.. figure:: img/quickstart_join.png
:align: center
:alt: Connect menu
Connection pannel
Connection panel
Fill in the fields with your information:
- **IP**: the host's IP address.
- **Port**: the host's port number.
- **Connect as admin**: connect yourself with **admin rights** (see :ref:`admin` ) to the session.
Once you've configured every field, hit the button **CONNECT** to join the session !
When the :ref:`session-status` is **ONLINE** you are online and ready to start co-creating.
.. note::
If you want to have **administrator rights** (see :ref:`admin` ) on the server, just enter the password created by the host in the **Connect as admin** section
.. figure:: img/server_preset_image_admin.png
:align: center
:width: 200px
Admin password
---------------
Server presets
---------------
You can save your server presets in a preset list below the 'JOIN' and 'HOST' buttons. This allows you to quickly access and manage your servers.
To add a server, first enter the ip address and the port (plus the password if needed), then click on the + icon to add a name to your preset. To remove a server from the list, select it and click on the - icon.
.. figure:: img/server_preset_exemple.gif
:align: center
:width: 200px
.. warning:: Be careful, if you don't rename your new preset, or if it has the same name as an existing preset, the old preset will be overwritten.
.. figure:: img/server_preset_image_report.png
:align: center
:width: 200px
.. note::
Two presets are already present when the addon is launched:
- The 'localhost' preset, to host and join a local session quickly
- The 'public session' preset, to join the public sessions of the multi-user server (official discord to participate : https://discord.gg/aBPvGws)
.. Maybe something more explicit here
.. note::
Additional configuration settings can be found in the :ref:`advanced` section.
Once you've configured every field, hit the button **CONNECT** to join the session !
When the :ref:`session-status` is **ONLINE** you are online and ready to start co-creating.
.. note::
When starting a **dedicated server**, the session status screen will take you to the **LOBBY**, awaiting an admin to start the session.


@ -76,7 +76,7 @@ Hit 'Create a network'(see image below) and go to the network settings.
:align: center
:width: 450px
Admin password
Network page
Now that the network is created, let's configure it.


@ -19,7 +19,7 @@
bl_info = {
"name": "Multi-User",
"author": "Swann Martinez",
"version": (0, 5, 0),
"version": (0, 3, 0),
"description": "Enable real-time collaborative workflow inside blender",
"blender": (2, 82, 0),
"location": "3D View > Sidebar > Multi-User tab",
@ -41,12 +41,11 @@ import bpy
from bpy.app.handlers import persistent
from . import environment
from uuid import uuid4
LIBS = os.path.dirname(os.path.abspath(__file__))+"/libs/replication"
module_error_msg = "Insufficient rights to install the multi-user \
dependencies, aunch blender with administrator rights."
def register():
# Setup logging policy
logging.basicConfig(
@ -54,12 +53,17 @@ def register():
datefmt='%H:%M:%S',
level=logging.INFO)
try:
environment.register()
for module_name in list(sys.modules.keys()):
if 'replication' in module_name:
del sys.modules[module_name]
if LIBS not in sys.path:
logging.info('Adding local modules dir to the path')
sys.path.insert(0, LIBS)
try:
from . import presence
from . import operators
from . import handlers
from . import ui
from . import preferences
from . import addon_updater_ops
@ -68,11 +72,10 @@ def register():
addon_updater_ops.register(bl_info)
presence.register()
operators.register()
handlers.register()
ui.register()
except ModuleNotFoundError as e:
raise Exception(module_error_msg)
logging.error(module_error_msg)
logging.error(e)
bpy.types.WindowManager.session = bpy.props.PointerProperty(
type=preferences.SessionProps)
@ -89,7 +92,6 @@ def register():
def unregister():
from . import presence
from . import operators
from . import handlers
from . import ui
from . import preferences
from . import addon_updater_ops
@ -99,7 +101,6 @@ def unregister():
presence.unregister()
addon_updater_ops.unregister()
ui.unregister()
handlers.unregister()
operators.unregister()
preferences.unregister()
@ -107,5 +108,3 @@ def unregister():
del bpy.types.ID.uuid
del bpy.types.WindowManager.online_users
del bpy.types.WindowManager.user_index
environment.unregister()


@ -1688,7 +1688,10 @@ class GitlabEngine(object):
# Could clash with tag names and if it does, it will
# download TAG zip instead of branch zip to get
# direct path, would need.
return f"https://gitlab.com/slumber/multi-user/-/jobs/artifacts/{branch}/download?job=build"
return "{}{}{}".format(
self.form_repo_url(updater),
"/repository/archive.zip?sha=",
branch)
def get_zip_url(self, sha, updater):
return "{base}/repository/archive.zip?sha={sha}".format(
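The replaced branch-zip URL above is now assembled from the repository API base plus the archive endpoint. A tiny sketch of that assembly (the base URL below is illustrative, not the updater's real value):

```python
def archive_url(repo_api_base: str, ref: str) -> str:
    """Build a GitLab repository-archive URL for a branch name or sha,
    matching the '{base}/repository/archive.zip?sha={ref}' shape used here."""
    return f"{repo_api_base}/repository/archive.zip?sha={ref}"

print(archive_url("https://gitlab.com/api/v4/projects/12345", "develop"))
# https://gitlab.com/api/v4/projects/12345/repository/archive.zip?sha=develop
```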


@ -1,106 +0,0 @@
import bpy
import mathutils
from . import dump_anything
from replication.protocol import ReplicatedDatablock
from .bl_datablock import get_datablock_from_uuid
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
def dump_textures_slots(texture_slots: bpy.types.bpy_prop_collection) -> list:
""" Dump every texture slot collection as the form:
[(index, slot_texture_uuid, slot_texture_name), (), ...]
"""
dumped_slots = []
for index, slot in enumerate(texture_slots):
if slot and slot.texture:
dumped_slots.append((index, slot.texture.uuid, slot.texture.name))
return dumped_slots
def load_texture_slots(dumped_slots: list, target_slots: bpy.types.bpy_prop_collection):
"""
"""
for index, slot in enumerate(target_slots):
if slot:
target_slots.clear(index)
for index, slot_uuid, slot_name in dumped_slots:
target_slots.create(index).texture = get_datablock_from_uuid(
slot_uuid, slot_name
)
IGNORED_ATTR = [
"is_embedded_data",
"is_evaluated",
"is_fluid",
"is_library_indirect",
"users"
]
class BlParticle(ReplicatedDatablock):
use_delta = True
bl_id = "particles"
bl_class = bpy.types.ParticleSettings
bl_icon = "PARTICLES"
bl_check_common = False
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.particles.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
dump_anything.load(datablock, data)
dump_anything.load(datablock.effector_weights, data["effector_weights"])
# Force field
force_field_1 = data.get("force_field_1", None)
if force_field_1:
dump_anything.load(datablock.force_field_1, force_field_1)
force_field_2 = data.get("force_field_2", None)
if force_field_2:
dump_anything.load(datablock.force_field_2, force_field_2)
# Texture slots
load_texture_slots(data["texture_slots"], datablock.texture_slots)
@staticmethod
def dump(datablock: object) -> dict:
dumper = dump_anything.Dumper()
dumper.depth = 1
dumper.exclude_filter = IGNORED_ATTR
data = dumper.dump(datablock)
# Particle effectors
data["effector_weights"] = dumper.dump(datablock.effector_weights)
if datablock.force_field_1:
data["force_field_1"] = dumper.dump(datablock.force_field_1)
if datablock.force_field_2:
data["force_field_2"] = dumper.dump(datablock.force_field_2)
# Texture slots
data["texture_slots"] = dump_textures_slots(datablock.texture_slots)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.particles)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = [t.texture for t in datablock.texture_slots if t and t.texture]
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.ParticleSettings
_class = BlParticle
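The `dump_textures_slots` helper in this new file flattens Blender's texture-slot collection into plain tuples. A bpy-free sketch of the same pattern, with `SimpleNamespace` objects standing in for bpy texture slots (the uuid/name values are made up):

```python
from types import SimpleNamespace

def dump_texture_slots(texture_slots) -> list:
    """Flatten a texture-slot collection into
    [(index, slot_texture_uuid, slot_texture_name), ...], skipping
    empty slots -- the same shape the BlParticle serializer produces."""
    dumped = []
    for index, slot in enumerate(texture_slots):
        if slot and slot.texture:
            dumped.append((index, slot.texture.uuid, slot.texture.name))
    return dumped

# Stand-ins for bpy texture slots; in the add-on these are bpy types.
tex = SimpleNamespace(uuid="abc-123", name="Noise")
slots = [None, SimpleNamespace(texture=tex), SimpleNamespace(texture=None)]
print(dump_texture_slots(slots))  # [(1, 'abc-123', 'Noise')]
```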


@ -24,25 +24,20 @@ import sys
from pathlib import Path
import socket
import re
import bpy
VERSION_EXPR = re.compile('\d+.\d+.\d+')
THIRD_PARTY = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs")
DEFAULT_CACHE_DIR = os.path.join(
os.path.dirname(os.path.abspath(__file__)), "cache")
REPLICATION_DEPENDENCIES = {
"zmq",
"deepdiff"
}
LIBS = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs")
REPLICATION = os.path.join(LIBS,"replication")
PYTHON_PATH = None
SUBPROCESS_DIR = None
rtypes = []
def module_can_be_imported(name: str) -> bool:
def module_can_be_imported(name):
try:
__import__(name)
return True
@ -55,7 +50,7 @@ def install_pip():
subprocess.run([str(PYTHON_PATH), "-m", "ensurepip"])
def install_package(name: str, install_dir: str):
def install_package(name, version):
logging.info(f"installing {name} version...")
env = os.environ
if "PIP_REQUIRE_VIRTUALENV" in env:
@ -65,13 +60,12 @@ def install_package(name: str, install_dir: str):
# env var for the subprocess.
env = os.environ.copy()
del env["PIP_REQUIRE_VIRTUALENV"]
subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", f"{name}", "-t", install_dir], env=env)
subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", f"{name}=={version}"], env=env)
if name in sys.modules:
del sys.modules[name]
def check_package_version(name: str, required_version: str):
def check_package_version(name, required_version):
logging.info(f"Checking {name} version...")
out = subprocess.run([str(PYTHON_PATH), "-m", "pip", "show", name], capture_output=True)
@ -83,7 +77,6 @@ def check_package_version(name: str, required_version: str):
logging.info(f"{name} need an update")
return False
def get_ip():
"""
Retrieve the main network interface IP.
@ -101,25 +94,7 @@ def check_dir(dir):
os.makedirs(dir)
def setup_paths(paths: list):
""" Add missing path to sys.path
"""
for path in paths:
if path not in sys.path:
logging.debug(f"Adding {path} dir to the path.")
sys.path.insert(0, path)
def remove_paths(paths: list):
""" Remove list of path from sys.path
"""
for path in paths:
if path in sys.path:
logging.debug(f"Removing {path} dir from the path.")
sys.path.remove(path)
def install_modules(dependencies: list, python_path: str, install_dir: str):
def setup(dependencies, python_path):
global PYTHON_PATH, SUBPROCESS_DIR
PYTHON_PATH = Path(python_path)
@ -128,23 +103,9 @@ def install_modules(dependencies: list, python_path: str, install_dir: str):
if not module_can_be_imported("pip"):
install_pip()
for package_name in dependencies:
for package_name, package_version in dependencies:
if not module_can_be_imported(package_name):
install_package(package_name, install_dir=install_dir)
install_package(package_name, package_version)
module_can_be_imported(package_name)
def register():
if bpy.app.version[1] >= 91:
python_binary_path = sys.executable
else:
python_binary_path = bpy.app.binary_path_python
for module_name in list(sys.modules.keys()):
if 'replication' in module_name:
del sys.modules[module_name]
setup_paths([LIBS, REPLICATION])
install_modules(REPLICATION_DEPENDENCIES, python_binary_path, install_dir=LIBS)
def unregister():
remove_paths([REPLICATION, LIBS])
elif not check_package_version(package_name, package_version):
install_package(package_name, package_version)
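The new `setup()` loop installs a dependency when it is missing and reinstalls it when the installed version differs from the pinned one. A bpy-free sketch of that decision logic (the package names and versions here are illustrative, not the add-on's actual pins):

```python
def plan_dependency_actions(dependencies, installed):
    """Decide, per (name, version) pair, whether to install, reinstall,
    or skip. `installed` maps module name -> installed version string.
    Mirrors the loop above: install when missing, reinstall on mismatch."""
    actions = []
    for name, version in dependencies:
        current = installed.get(name)
        if current is None:
            actions.append((name, f"install {name}=={version}"))
        elif current != version:
            actions.append((name, f"reinstall {name}=={version}"))
        else:
            actions.append((name, "skip"))
    return actions

deps = [("zmq", "22.3.0"), ("deepdiff", "5.7.0")]
print(plan_dependency_actions(deps, {"zmq": "22.3.0"}))
# [('zmq', 'skip'), ('deepdiff', 'install deepdiff==5.7.0')]
```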


@ -1,152 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import bpy
from bpy.app.handlers import persistent
from replication import porcelain
from replication.constants import RP_COMMON, STATE_ACTIVE, STATE_SYNCING, UP
from replication.exception import ContextError, NonAuthorizedOperationError
from replication.interface import session
from . import shared_data, utils
def sanitize_deps_graph(remove_nodes: bool = False):
""" Cleanup the replication graph
"""
if session and session.state == STATE_ACTIVE:
start = utils.current_milli_time()
rm_cpt = 0
for node in session.repository.graph.values():
node.instance = session.repository.rdp.resolve(node.data)
if node is None \
or (node.state == UP and not node.instance):
if remove_nodes:
try:
porcelain.rm(session.repository,
node.uuid,
remove_dependencies=False)
logging.info(f"Removing {node.uuid}")
rm_cpt += 1
except NonAuthorizedOperationError:
continue
logging.info(f"Sanitize took { utils.current_milli_time()-start} ms, removed {rm_cpt} nodes")
def update_external_dependencies():
"""Force external dependencies(files such as images) evaluation
"""
external_types = ['WindowsPath', 'PosixPath', 'Image']
nodes_ids = [n.uuid for n in session.repository.graph.values() if n.data['type_id'] in external_types]
for node_id in nodes_ids:
node = session.repository.graph.get(node_id)
if node and node.owner in [session.repository.username, RP_COMMON]:
porcelain.commit(session.repository, node_id)
porcelain.push(session.repository, 'origin', node_id)
@persistent
def on_scene_update(scene):
"""Forward blender depsgraph update to replication
"""
if session and session.state == STATE_ACTIVE:
context = bpy.context
blender_depsgraph = bpy.context.view_layer.depsgraph
dependency_updates = [u for u in blender_depsgraph.updates]
settings = utils.get_preferences()
incoming_updates = shared_data.session.applied_updates
distant_update = [getattr(u.id, 'uuid', None) for u in dependency_updates if getattr(u.id, 'uuid', None) in incoming_updates]
if distant_update:
for u in distant_update:
shared_data.session.applied_updates.remove(u)
logging.debug(f"Ignoring distant update of {dependency_updates[0].id.name}")
return
update_external_dependencies()
# NOTE: maybe we don't need to check each update but only the first
for update in reversed(dependency_updates):
update_uuid = getattr(update.id, 'uuid', None)
if update_uuid:
node = session.repository.graph.get(update.id.uuid)
check_common = session.repository.rdp.get_implementation(update.id).bl_check_common
if node and (node.owner == session.repository.username or check_common):
logging.debug(f"Evaluate {update.id.name}")
if node.state == UP:
try:
porcelain.commit(session.repository, node.uuid)
porcelain.push(session.repository,
'origin', node.uuid)
except ReferenceError:
logging.debug(f"Reference error {node.uuid}")
except ContextError as e:
logging.debug(e)
except Exception as e:
logging.error(e)
else:
continue
elif isinstance(update.id, bpy.types.Scene):
scene = bpy.data.scenes.get(update.id.name)
scn_uuid = porcelain.add(session.repository, scene)
porcelain.commit(session.repository, scn_uuid)
porcelain.push(session.repository, 'origin', scn_uuid)
@persistent
def resolve_deps_graph(dummy):
"""Resolve deps graph
Temporary solution to resolve each node pointers after a Undo.
A future solution should be to avoid storing dataclock reference...
"""
if session and session.state == STATE_ACTIVE:
sanitize_deps_graph(remove_nodes=True)
@persistent
def load_pre_handler(dummy):
if session and session.state in [STATE_ACTIVE, STATE_SYNCING]:
bpy.ops.session.stop()
@persistent
def update_client_frame(scene):
if session and session.state == STATE_ACTIVE:
porcelain.update_user_metadata(session.repository, {
'frame_current': scene.frame_current
})
def register():
bpy.app.handlers.undo_post.append(resolve_deps_graph)
bpy.app.handlers.redo_post.append(resolve_deps_graph)
bpy.app.handlers.load_pre.append(load_pre_handler)
bpy.app.handlers.frame_change_pre.append(update_client_frame)
def unregister():
bpy.app.handlers.undo_post.remove(resolve_deps_graph)
bpy.app.handlers.redo_post.remove(resolve_deps_graph)
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.frame_change_pre.remove(update_client_frame)
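The deleted `sanitize_deps_graph()` re-resolves every node's datablock pointer and prunes nodes whose datablock has vanished (typically after an undo). A minimal stand-alone sketch of that walk, with plain objects standing in for replication graph nodes and a dict standing in for Blender's datablocks:

```python
from types import SimpleNamespace

UP = "UP"  # stand-in for replication.constants.UP

def sanitize_graph(nodes, resolve, remove_node) -> int:
    """Re-resolve each node's instance and remove UP nodes whose
    datablock can no longer be found. Returns the removal count."""
    removed = 0
    for node in nodes:
        node.instance = resolve(node.data)
        if node.state == UP and not node.instance:
            remove_node(node.uuid)
            removed += 1
    return removed

# Two fake nodes: 'a' still resolves, 'b' points at a deleted datablock.
datablocks = {"cube-uuid": "Cube"}
nodes = [
    SimpleNamespace(uuid="a", data={"uuid": "cube-uuid"}, state=UP, instance=None),
    SimpleNamespace(uuid="b", data={"uuid": "gone-uuid"}, state=UP, instance=None),
]
removed_uuids = []
count = sanitize_graph(nodes, lambda d: datablocks.get(d["uuid"]), removed_uuids.append)
print(count, removed_uuids)  # 1 ['b']
```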


@ -16,40 +16,38 @@
# ##### END GPL LICENSE BLOCK #####
import bpy
from replication.protocol import ReplicatedDatablock
__all__ = [
'bl_object',
'bl_mesh',
'bl_camera',
# 'bl_camera',
'bl_collection',
'bl_curve',
'bl_gpencil',
'bl_image',
'bl_light',
# 'bl_curve',
# 'bl_gpencil',
# 'bl_image',
# 'bl_light',
'bl_scene',
'bl_material',
'bl_armature',
'bl_action',
'bl_world',
'bl_metaball',
'bl_lattice',
'bl_lightprobe',
'bl_speaker',
'bl_font',
'bl_sound',
'bl_file',
'bl_node_group',
'bl_texture',
"bl_particle",
# 'bl_library',
# 'bl_armature',
# 'bl_action',
# 'bl_world',
# 'bl_metaball',
# 'bl_lattice',
# 'bl_lightprobe',
# 'bl_speaker',
# 'bl_font',
# 'bl_sound',
# 'bl_file',
# 'bl_sequencer',
# 'bl_node_group',
# 'bl_texture',
# "bl_particle",
] # Order here defines execution order
if bpy.app.version[1] >= 91:
__all__.append('bl_volume')
from . import *
def types_to_register():
return __all__
# if bpy.app.version[1] >= 91:
# __all__.append('bl_volume')
from replication.protocol import DataTranslationProtocol


@ -24,9 +24,14 @@ from enum import Enum
from .. import utils
from .dump_anything import (
Dumper, Loader, np_dump_collection, np_load_collection, remove_items_from_dict)
Dumper,
Loader,
np_dump_collection,
np_load_collection,
remove_items_from_dict)
from .bl_datablock import stamp_uuid
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from replication.objects import Node
KEYFRAME = [
'amplitude',
@ -41,6 +46,7 @@ KEYFRAME = [
'interpolation',
]
def has_action(datablock):
""" Check if the datablock datablock has actions
"""
@ -69,7 +75,8 @@ def load_driver(target_datablock, src_driver):
loader = Loader()
drivers = target_datablock.animation_data.drivers
src_driver_data = src_driver['driver']
new_driver = drivers.new(src_driver['data_path'], index=src_driver['array_index'])
new_driver = drivers.new(
src_driver['data_path'], index=src_driver['array_index'])
# Settings
new_driver.driver.type = src_driver_data['type']
@ -85,10 +92,10 @@ def load_driver(target_datablock, src_driver):
for src_target in src_var_data['targets']:
src_target_data = src_var_data['targets'][src_target]
src_id = src_target_data.get('id')
if src_id:
new_var.targets[src_target].id = utils.resolve_from_id(src_target_data['id'], src_target_data['id_type'])
loader.load(new_var.targets[src_target], src_target_data)
new_var.targets[src_target].id = utils.resolve_from_id(
src_target_data['id'], src_target_data['id_type'])
loader.load(
new_var.targets[src_target], src_target_data)
# Fcurve
new_fcurve = new_driver.keyframe_points
@ -121,6 +128,7 @@ def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy: bool = True) -> dict:
points = fcurve.keyframe_points
fcurve_data['keyframes_count'] = len(fcurve.keyframe_points)
fcurve_data['keyframe_points'] = np_dump_collection(points, KEYFRAME)
else: # Legacy method
dumper = Dumper()
fcurve_data["keyframe_points"] = []
@ -130,18 +138,6 @@ def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy: bool = True) -> dict:
dumper.dump(k)
)
if fcurve.modifiers:
dumper = Dumper()
dumper.exclude_filter = [
'is_valid',
'active'
]
dumped_modifiers = []
for modfifier in fcurve.modifiers:
dumped_modifiers.append(dumper.dump(modfifier))
fcurve_data['modifiers'] = dumped_modifiers
return fcurve_data
@ -154,7 +150,7 @@ def load_fcurve(fcurve_data, fcurve):
:type fcurve: bpy.types.FCurve
"""
use_numpy = fcurve_data.get('use_numpy')
loader = Loader()
keyframe_points = fcurve.keyframe_points
# Remove all keyframe points
@ -199,55 +195,37 @@ def load_fcurve(fcurve_data, fcurve):
fcurve.update()
dumped_fcurve_modifiers = fcurve_data.get('modifiers', None)
if dumped_fcurve_modifiers:
# clear modifiers
for fmod in fcurve.modifiers:
fcurve.modifiers.remove(fmod)
# Load each modifiers in order
for modifier_data in dumped_fcurve_modifiers:
modifier = fcurve.modifiers.new(modifier_data['type'])
loader.load(modifier, modifier_data)
elif fcurve.modifiers:
for fmod in fcurve.modifiers:
fcurve.modifiers.remove(fmod)
def dump_animation_data(datablock):
animation_data = {}
def dump_animation_data(datablock, data):
if has_action(datablock):
animation_data['action'] = datablock.animation_data.action.uuid
dumper = Dumper()
dumper.include_filter = ['action']
data['animation_data'] = dumper.dump(datablock.animation_data)
if has_driver(datablock):
animation_data['drivers'] = []
dumped_drivers = {'animation_data': {'drivers': []}}
for driver in datablock.animation_data.drivers:
animation_data['drivers'].append(dump_driver(driver))
dumped_drivers['animation_data']['drivers'].append(
dump_driver(driver))
return animation_data
data.update(dumped_drivers)
def load_animation_data(animation_data, datablock):
def load_animation_data(data, datablock):
# Load animation data
if animation_data:
if 'animation_data' in data.keys():
if datablock.animation_data is None:
datablock.animation_data_create()
for d in datablock.animation_data.drivers:
datablock.animation_data.drivers.remove(d)
if 'drivers' in animation_data:
for driver in animation_data['drivers']:
if 'drivers' in data['animation_data']:
for driver in data['animation_data']['drivers']:
load_driver(datablock, driver)
action = animation_data.get('action')
if action:
action = resolve_datablock_from_uuid(action, bpy.data.actions)
datablock.animation_data.action = action
elif datablock.animation_data.action:
datablock.animation_data.action = None
if 'action' in data['animation_data']:
datablock.animation_data.action = bpy.data.actions[data['animation_data']['action']]
# Remove existing animation data if there is not more to load
elif hasattr(datablock, 'animation_data') and datablock.animation_data:
datablock.animation_data_clear()
@ -261,8 +239,6 @@ def resolve_animation_dependencies(datablock):
class BlAction(ReplicatedDatablock):
use_delta = True
bl_id = "actions"
bl_class = bpy.types.Action
bl_check_common = False
@ -295,6 +271,8 @@ class BlAction(ReplicatedDatablock):
@staticmethod
def dump(datablock: object) -> dict:
stamp_uuid(datablock)
dumper = Dumper()
dumper.exclude_filter = [
'name_full',
@ -317,15 +295,3 @@ class BlAction(ReplicatedDatablock):
data["fcurves"].append(dump_fcurve(fcurve, use_numpy=True))
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.actions)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
return []
_type = bpy.types.Action
_class = BlAction
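In this diff `dump_animation_data()` changes to take the target dict and write an `'animation_data'` entry into it, instead of returning a fresh dict. A bpy-free sketch of that in-place shape (the stand-in block is a plain dict, not a real datablock, and the key names follow the hunk above):

```python
def dump_animation_data(block: dict, data: dict) -> None:
    """Write the action reference and dumped drivers into `data`,
    mirroring the new (datablock, data) signature."""
    anim = {}
    if block.get("action"):
        anim["action"] = block["action"]
    if block.get("drivers"):
        anim["drivers"] = list(block["drivers"])
    if anim:
        data["animation_data"] = anim

data = {}
dump_animation_data({"action": "WalkCycle", "drivers": ["driver0"]}, data)
print(data)  # {'animation_data': {'action': 'WalkCycle', 'drivers': ['driver0']}}
```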


@ -22,9 +22,8 @@ import mathutils
from .dump_anything import Loader, Dumper
from .. import presence, operators, utils
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .bl_datablock import BlDatablock
def get_roll(bone: bpy.types.Bone) -> float:
""" Compute the actuall roll of a pose bone
@ -36,20 +35,16 @@ def get_roll(bone: bpy.types.Bone) -> float:
return bone.AxisRollFromMatrix(bone.matrix_local.to_3x3())[1]
class BlArmature(ReplicatedDatablock):
use_delta = True
class BlArmature(BlDatablock):
bl_id = "armatures"
bl_class = bpy.types.Armature
bl_check_common = False
bl_icon = 'ARMATURE_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.armatures.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
# Load parent object
parent_object = utils.find_from_attr(
@ -60,7 +55,7 @@ class BlArmature(ReplicatedDatablock):
if parent_object is None:
parent_object = bpy.data.objects.new(
data['user_name'], datablock)
data['user_name'], target)
parent_object.uuid = data['user']
is_object_in_master = (
@ -95,10 +90,10 @@ class BlArmature(ReplicatedDatablock):
bpy.ops.object.mode_set(mode='EDIT')
for bone in data['bones']:
if bone not in datablock.edit_bones:
new_bone = datablock.edit_bones.new(bone)
if bone not in target.edit_bones:
new_bone = target.edit_bones.new(bone)
else:
new_bone = datablock.edit_bones[bone]
new_bone = target.edit_bones[bone]
bone_data = data['bones'].get(bone)
@ -109,7 +104,7 @@ class BlArmature(ReplicatedDatablock):
new_bone.roll = bone_data['roll']
if 'parent' in bone_data:
new_bone.parent = datablock.edit_bones[data['bones']
new_bone.parent = target.edit_bones[data['bones']
[bone]['parent']]
new_bone.use_connect = bone_data['use_connect']
@ -124,10 +119,9 @@ class BlArmature(ReplicatedDatablock):
if 'EDIT' in current_mode:
bpy.ops.object.mode_set(mode='EDIT')
load_animation_data(data.get('animation_data'), datablock)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 4
dumper.include_filter = [
@ -141,14 +135,14 @@ class BlArmature(ReplicatedDatablock):
'name',
'layers',
]
data = dumper.dump(datablock)
data = dumper.dump(instance)
for bone in datablock.bones:
for bone in instance.bones:
if bone.parent:
data['bones'][bone.name]['parent'] = bone.parent.name
# get the parent Object
# TODO: Use id_data instead
object_users = utils.get_datablock_users(datablock)[0]
object_users = utils.get_datablock_users(instance)[0]
data['user'] = object_users.uuid
data['user_name'] = object_users.name
@ -159,25 +153,7 @@ class BlArmature(ReplicatedDatablock):
data['user_scene'] = [
item.name for item in container_users if isinstance(item, bpy.types.Scene)]
for bone in datablock.bones:
for bone in instance.bones:
data['bones'][bone.name]['roll'] = get_roll(bone)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
name = data.get('name')
datablock = resolve_datablock_from_uuid(uuid, bpy.data.armatures)
if datablock is None:
datablock = bpy.data.armatures.get(name)
return datablock
@staticmethod
def resolve_deps(datablock: object) -> [object]:
return resolve_animation_dependencies(datablock)
_type = bpy.types.Armature
_class = BlArmature


@ -20,58 +20,47 @@ import bpy
import mathutils
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .bl_datablock import BlDatablock
class BlCamera(ReplicatedDatablock):
use_delta = True
class BlCamera(BlDatablock):
bl_id = "cameras"
bl_class = bpy.types.Camera
bl_check_common = False
bl_icon = 'CAMERA_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.cameras.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
dof_settings = data.get('dof')
load_animation_data(data.get('animation_data'), datablock)
# DOF settings
if dof_settings:
loader.load(datablock.dof, dof_settings)
loader.load(target.dof, dof_settings)
background_images = data.get('background_images')
datablock.background_images.clear()
# TODO: Use image uuid
target.background_images.clear()
if background_images:
for img_name, img_data in background_images.items():
img_id = img_data.get('image')
if img_id:
target_img = datablock.background_images.new()
target_img = target.background_images.new()
target_img.image = bpy.data.images[img_id]
loader.load(target_img, img_data)
img_user = img_data.get('image_user')
if img_user:
loader.load(target_img.image_user, img_user)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
# TODO: background image support
dumper = Dumper()
dumper.depth = 3
dumper.include_filter = [
@@ -112,37 +101,15 @@ class BlCamera(ReplicatedDatablock):
'scale',
'use_flip_x',
'use_flip_y',
'image_user',
'image',
'frame_duration',
'frame_start',
'frame_offset',
'use_cyclic',
'use_auto_refresh'
'image'
]
data = dumper.dump(datablock)
data['animation_data'] = dump_animation_data(datablock)
for index, image in enumerate(datablock.background_images):
if image.image_user:
data['background_images'][index]['image_user'] = dumper.dump(image.image_user)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.cameras)
return dumper.dump(instance)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
for background in datablock.background_images:
if background.image:
deps.append(background.image)
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.Camera
_class = BlCamera


@@ -19,12 +19,10 @@
import bpy
import mathutils
from deepdiff import DeepDiff, Delta
from .. import utils
from replication.protocol import ReplicatedDatablock
from .dump_anything import Loader, Dumper
from .bl_datablock import resolve_datablock_from_uuid
from replication.protocol import ReplicatedDatablock
from replication.objects import Node
def dump_collection_children(collection):
collection_children = []
@@ -89,17 +87,15 @@ class BlCollection(ReplicatedDatablock):
bl_class = bpy.types.Collection
bl_check_common = True
bl_reload_parent = False
use_delta = True
@staticmethod
def construct(data: dict) -> object:
instance = bpy.data.collections.new(data["name"])
return instance
datablock = bpy.data.collections.new(node.data["name"])
return datablock
@staticmethod
def load(data: dict, datablock: object):
data = node.data
loader = Loader()
loader.load(datablock, data)
@@ -113,9 +109,10 @@ class BlCollection(ReplicatedDatablock):
# Keep other user from deleting collection object by flushing their history
utils.flush_history()
@staticmethod
def dump(datablock: object) -> dict:
assert(datablock)
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
@@ -132,33 +129,9 @@ class BlCollection(ReplicatedDatablock):
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.collections)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
return resolve_collection_dependencies(datablock)
@staticmethod
def compute_delta(last_data: dict, current_data: dict) -> Delta:
diff_params = {
'ignore_order': True,
'report_repetition': True
}
delta_params = {
# 'mutate': True
}
return Delta(
DeepDiff(last_data,
current_data,
cache_size=5000,
**diff_params),
**delta_params)
_type = bpy.types.Collection
_class = BlCollection
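`BlCollection.compute_delta` above builds a `deepdiff.Delta` so only changes travel over the wire. A pure-stdlib stand-in for that replicate-by-delta idea is sketched below for flat dicts; the real code uses the deepdiff package and handles nested structures, removals, and repetition, none of which this toy version attempts.

```python
# Stdlib stand-in for the DeepDiff/Delta flow: derive a minimal patch
# between two dumped states, then apply it to reproduce the new state.
def compute_delta(last_data: dict, current_data: dict) -> dict:
    # keep only the keys whose value changed (or newly appeared)
    return {key: value for key, value in current_data.items()
            if last_data.get(key) != value}

def apply_delta(state: dict, delta: dict) -> dict:
    patched = dict(state)
    patched.update(delta)
    return patched

last = {"name": "Collection", "objects": ["Cube"]}
current = {"name": "Collection", "objects": ["Cube", "Light"]}

delta = compute_delta(last, current)
print(delta)                                # {'objects': ['Cube', 'Light']}
print(apply_delta(last, delta) == current)  # True
```

Key deletions are not represented here, which is one reason the real implementation reaches for deepdiff rather than a dict merge.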


@@ -21,15 +21,13 @@ import bpy.types as T
import mathutils
import logging
from ..utils import get_preferences
from replication.protocol import ReplicatedDatablock
from .. import utils
from .bl_datablock import BlDatablock
from .dump_anything import (Dumper, Loader,
np_load_collection,
np_dump_collection)
from .bl_datablock import get_datablock_from_uuid
from .bl_material import dump_materials_slots, load_materials_slots
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
SPLINE_BEZIER_POINT = [
# "handle_left_type",
@@ -136,31 +134,25 @@ SPLINE_METADATA = [
]
class BlCurve(ReplicatedDatablock):
use_delta = True
class BlCurve(BlDatablock):
bl_id = "curves"
bl_class = bpy.types.Curve
bl_check_common = False
bl_icon = 'CURVE_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.curves.new(data["name"], data["type"])
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
datablock.splines.clear()
target.splines.clear()
# load splines
for spline in data['splines'].values():
new_spline = datablock.splines.new(spline['type'])
new_spline = target.splines.new(spline['type'])
# Load curve geometry data
if new_spline.type == 'BEZIER':
@@ -181,14 +173,15 @@ class BlCurve(ReplicatedDatablock):
# MATERIAL SLOTS
src_materials = data.get('materials', None)
if src_materials:
load_materials_slots(src_materials, datablock.materials)
load_materials_slots(src_materials, target.materials)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
# Conflicting attributes
# TODO: remove them with the NURBS support
dumper.include_filter = CURVE_METADATA
dumper.exclude_filter = [
'users',
'order_u',
@@ -197,16 +190,14 @@ class BlCurve(ReplicatedDatablock):
'point_count_u',
'active_textbox'
]
if datablock.use_auto_texspace:
if instance.use_auto_texspace:
dumper.exclude_filter.extend([
'texspace_location',
'texspace_size'])
data = dumper.dump(datablock)
data['animation_data'] = dump_animation_data(datablock)
data = dumper.dump(instance)
data['splines'] = {}
for index, spline in enumerate(datablock.splines):
for index, spline in enumerate(instance.splines):
dumper.depth = 2
dumper.include_filter = SPLINE_METADATA
spline_data = dumper.dump(spline)
@@ -220,25 +211,19 @@ class BlCurve(ReplicatedDatablock):
spline.bezier_points, SPLINE_BEZIER_POINT)
data['splines'][index] = spline_data
if isinstance(datablock, T.SurfaceCurve):
if isinstance(instance, T.SurfaceCurve):
data['type'] = 'SURFACE'
elif isinstance(datablock, T.TextCurve):
elif isinstance(instance, T.TextCurve):
data['type'] = 'FONT'
elif isinstance(datablock, T.Curve):
elif isinstance(instance, T.Curve):
data['type'] = 'CURVE'
data['materials'] = dump_materials_slots(datablock.materials)
data['materials'] = dump_materials_slots(instance.materials)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.curves)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
# TODO: resolve material
deps = []
curve = datablock
@@ -249,19 +234,15 @@ class BlCurve(ReplicatedDatablock):
curve.font_bold_italic,
curve.font_italic])
for material in datablock.materials:
for material in curve.materials:
if material:
deps.append(material)
deps.extend(resolve_animation_dependencies(datablock))
return deps
@staticmethod
def needs_update(datablock: object, data: dict) -> bool:
return 'EDIT' not in bpy.context.mode \
or get_preferences().sync_flags.sync_during_editmode
_type = [bpy.types.Curve, bpy.types.TextCurve]
_class = BlCurve
def diff(self):
if 'EDIT' in bpy.context.mode \
and not self.preferences.sync_flags.sync_during_editmode:
return False
else:
return super().diff()


@@ -23,10 +23,14 @@ import bpy
import mathutils
from replication.constants import DIFF_BINARY, DIFF_JSON, UP
from replication.protocol import ReplicatedDatablock
from replication.objects import Node
from uuid import uuid4
from .. import utils
from .dump_anything import Dumper, Loader
def get_datablock_from_uuid(uuid, default, ignore=[]):
if not uuid:
return default
@@ -38,8 +42,18 @@ def get_datablock_from_uuid(uuid, default, ignore=[]):
return item
return default
def resolve_datablock_from_uuid(uuid, bpy_collection):
for item in bpy_collection:
if getattr(item, 'uuid', None) == uuid:
return item
return None
def resolve_datablock_from_root(node:Node, root)->object:
datablock_ref = utils.find_from_attr('uuid', node.uuid, root)
if not datablock_ref:
try:
datablock_ref = root[node.data['name']]
except Exception:
pass
return datablock_ref
def stamp_uuid(datablock):
if not datablock.uuid:
datablock.uuid = str(uuid4())
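The `resolve_datablock_from_uuid` helper added above does a linear scan over a `bpy.data` collection for a matching `uuid` attribute. It can be exercised without Blender using plain stand-in objects; the helper body below is copied from the diff, while `FakeDatablock` and the sample uuids are purely illustrative.

```python
# Self-contained exercise of the uuid lookup helper, no bpy required.
class FakeDatablock:
    def __init__(self, uuid, name):
        self.uuid = uuid
        self.name = name

def resolve_datablock_from_uuid(uuid, bpy_collection):
    # linear scan: return the first item carrying the requested uuid
    for item in bpy_collection:
        if getattr(item, 'uuid', None) == uuid:
            return item
    return None

armatures = [FakeDatablock("3f2a", "Rig"), FakeDatablock("9c1d", "Rig.001")]
print(resolve_datablock_from_uuid("9c1d", armatures).name)  # Rig.001
print(resolve_datablock_from_uuid("0000", armatures))       # None
```

The `getattr(..., None)` guard matters because not every item in a real `bpy.data` collection has been stamped with a uuid yet (see `stamp_uuid` above).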


@@ -19,7 +19,7 @@
import logging
import os
import sys
from pathlib import Path, WindowsPath, PosixPath
from pathlib import Path
import bpy
import mathutils
@@ -27,7 +27,6 @@ from replication.constants import DIFF_BINARY, UP
from replication.protocol import ReplicatedDatablock
from .. import utils
from ..utils import get_preferences
from .dump_anything import Dumper, Loader
@@ -59,16 +58,33 @@ class BlFile(ReplicatedDatablock):
bl_icon = 'FILE'
bl_reload_parent = True
@staticmethod
def construct(data: dict) -> object:
return Path(get_filepath(data['name']))
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.instance = kwargs.get('instance', None)
if self.instance and not self.instance.exists():
raise FileNotFoundError(str(self.instance))
self.preferences = utils.get_preferences()
@staticmethod
def resolve(data: dict) -> object:
return Path(get_filepath(data['name']))
def resolve(self, construct = True):
self.instance = Path(get_filepath(self.data['name']))
file_exists = self.instance.exists()
if not file_exists:
logging.debug("File doesn't exist, loading it.")
self._load(self.data, self.instance)
return file_exists
@staticmethod
def dump(datablock: object) -> dict:
def push(self, socket, identity=None, check_data=False):
super().push(socket, identity=None, check_data=False)
if self.preferences.clear_memory_filecache:
del self.data['file']
def dump(self, instance=None):
"""
Read the file and return a dict as:
{
@@ -80,62 +96,44 @@ class BlFile(ReplicatedDatablock):
logging.info(f"Extracting file metadata")
data = {
'name': datablock.name,
'name': self.instance.name,
}
logging.info(f"Reading {datablock.name} content: {datablock.stat().st_size} bytes")
logging.info(
f"Reading {self.instance.name} content: {self.instance.stat().st_size} bytes")
try:
file = open(datablock, "rb")
file = open(self.instance, "rb")
data['file'] = file.read()
file.close()
except IOError:
logging.warning(f"{datablock} doesn't exist, skipping")
logging.warning(f"{self.instance} doesn't exist, skipping")
else:
file.close()
return data
@staticmethod
def load(data: dict, datablock: object):
def load(self, data, target):
"""
Writing the file
"""
try:
file = open(datablock, "wb")
file = open(target, "wb")
file.write(data['file'])
if get_preferences().clear_memory_filecache:
del data['file']
if self.preferences.clear_memory_filecache:
del self.data['file']
except IOError:
logging.warning(f"{datablock} doesn't exist, skipping")
logging.warning(f"{target} doesn't exist, skipping")
else:
file.close()
@staticmethod
def resolve_deps(datablock: object) -> [object]:
return []
@staticmethod
def needs_update(datablock: object, data:dict)-> bool:
if get_preferences().clear_memory_filecache:
def diff(self):
if self.preferences.clear_memory_filecache:
return False
else:
if not datablock:
return None
if not data:
return True
memory_size = sys.getsizeof(data['file'])-33
disk_size = datablock.stat().st_size
if memory_size != disk_size:
return True
else:
return False
_type = [WindowsPath, PosixPath]
_class = BlFile
memory_size = sys.getsizeof(self.data['file'])-33
disk_size = self.instance.stat().st_size
return memory_size != disk_size


@@ -22,19 +22,18 @@ from pathlib import Path
import bpy
from replication.protocol import ReplicatedDatablock
from .bl_datablock import BlDatablock
from .bl_file import get_filepath, ensure_unpacked
from .dump_anything import Dumper, Loader
from .bl_datablock import resolve_datablock_from_uuid
class BlFont(ReplicatedDatablock):
class BlFont(BlDatablock):
bl_id = "fonts"
bl_class = bpy.types.VectorFont
bl_check_common = False
bl_icon = 'FILE_FONT'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
filename = data.get('filename')
@@ -43,29 +42,25 @@ class BlFont(ReplicatedDatablock):
else:
return bpy.data.fonts.load(get_filepath(filename))
@staticmethod
def load(data: dict, datablock: object):
def load(self, data, target):
pass
@staticmethod
def dump(datablock: object) -> dict:
if datablock.filepath == '<builtin>':
def dump(self, instance=None):
if instance.filepath == '<builtin>':
filename = '<builtin>'
else:
filename = Path(datablock.filepath).name
filename = Path(instance.filepath).name
if not filename:
raise FileExistsError(datablock.filepath)
raise FileExistsError(instance.filepath)
return {
'filename': filename,
'name': datablock.name
'name': instance.name
}
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.fonts)
def diff(self):
return False
@staticmethod
def resolve_deps(datablock: object) -> [object]:
@@ -76,10 +71,3 @@ class BlFont(ReplicatedDatablock):
deps.append(Path(bpy.path.abspath(datablock.filepath)))
return deps
@staticmethod
def needs_update(datablock: object, data:dict)-> bool:
return False
_type = bpy.types.VectorFont
_class = BlFont


@@ -24,12 +24,10 @@ from .dump_anything import (Dumper,
Loader,
np_dump_collection,
np_load_collection)
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from ..utils import get_preferences
from ..timers import is_annotating
from .bl_material import load_materials_slots, dump_materials_slots
from .bl_datablock import BlDatablock
# GPencil data api is structured as it follow:
# GP-Object --> GP-Layers --> GP-Frames --> GP-Strokes --> GP-Stroke-Points
STROKE_POINT = [
'co',
@@ -66,9 +64,36 @@ def dump_stroke(stroke):
:param stroke: target grease pencil stroke
:type stroke: bpy.types.GPencilStroke
:return: (p_count, p_data)
:return: dict
"""
return (len(stroke.points), np_dump_collection(stroke.points, STROKE_POINT))
assert(stroke)
dumper = Dumper()
dumper.include_filter = [
"aspect",
"display_mode",
"draw_cyclic",
"end_cap_mode",
"hardeness",
"line_width",
"material_index",
"start_cap_mode",
"uv_rotation",
"uv_scale",
"uv_translation",
"vertex_color_fill",
]
dumped_stroke = dumper.dump(stroke)
# Stroke points
p_count = len(stroke.points)
dumped_stroke['p_count'] = p_count
dumped_stroke['points'] = np_dump_collection(stroke.points, STROKE_POINT)
# TODO: uv_factor, uv_rotation
return dumped_stroke
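The stroke dump above pairs per-stroke metadata with a flat per-attribute array of point data produced by `np_dump_collection`. A simplified, numpy-free stand-in for that dump/load pair is sketched below; the real helpers use `foreach_get`/`foreach_set` with numpy buffers for speed, and the attribute list here is a toy subset, not the real `STROKE_POINT`.

```python
# Serialize a collection of point-like objects attribute-by-attribute,
# then restore it into a pre-sized destination collection.
POINT_ATTRS = ['co', 'pressure']  # illustrative subset

class Point:
    def __init__(self, co=(0.0, 0.0, 0.0), pressure=1.0):
        self.co = co
        self.pressure = pressure

def dump_collection(points, attributes):
    # one flat list per attribute, like np_dump_collection's per-field arrays
    return {attr: [getattr(p, attr) for p in points] for attr in attributes}

def load_collection(data, points, attributes):
    for attr in attributes:
        for point, value in zip(points, data[attr]):
            setattr(point, attr, value)

src = [Point((0.0, 0.0, 0.0), 1.0), Point((1.0, 2.0, 3.0), 0.5)]
dumped = dump_collection(src, POINT_ATTRS)

dst = [Point(), Point()]  # mirrors `stroke.points.add(p_count)` pre-sizing
load_collection(dumped, dst, POINT_ATTRS)
print(dst[1].pressure)  # 0.5
```

Pre-sizing the destination before loading mirrors the `stroke.points.add(stroke_data["p_count"])` call in `load_stroke` above.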
def load_stroke(stroke_data, stroke):
@@ -81,13 +106,12 @@ def load_stroke(stroke_data, stroke):
"""
assert(stroke and stroke_data)
stroke.points.add(stroke_data[0])
np_load_collection(stroke_data[1], stroke.points, STROKE_POINT)
stroke.points.add(stroke_data["p_count"])
np_load_collection(stroke_data['points'], stroke.points, STROKE_POINT)
# HACK: Temporary fix to trigger a BKE_gpencil_stroke_geometry_update to
# fix fill issues
stroke.uv_scale = 1.0
stroke.uv_scale = stroke_data["uv_scale"]
def dump_frame(frame):
""" Dump a grease pencil frame to a dict
@@ -121,15 +145,12 @@ def load_frame(frame_data, frame):
assert(frame and frame_data)
# Load stroke points
for stroke_data in frame_data['strokes_points']:
target_stroke = frame.strokes.new()
load_stroke(stroke_data, target_stroke)
# Load stroke metadata
np_load_collection(frame_data['strokes'], frame.strokes, STROKE)
def dump_layer(layer):
""" Dump a grease pencil layer
@@ -146,6 +167,7 @@ def dump_layer(layer):
'opacity',
'channel_color',
'color',
# 'thickness', #TODO: enabling only for annotation
'tint_color',
'tint_factor',
'vertex_paint_opacity',
@@ -162,7 +184,7 @@
'hide',
'annotation_hide',
'lock',
'lock_frame',
# 'lock_frame',
# 'lock_material',
# 'use_mask_layer',
'use_lights',
@@ -170,13 +192,12 @@
'select',
'show_points',
'show_in_front',
# 'thickness'
# 'parent',
# 'parent_type',
# 'parent_bone',
# 'matrix_inverse',
]
if layer.thickness != 0:
if layer.id_data.is_annotation:
dumper.include_filter.append('thickness')
dumped_layer = dumper.dump(layer)
@@ -207,83 +228,68 @@ def load_layer(layer_data, layer):
load_frame(frame_data, target_frame)
def layer_changed(datablock: object, data: dict) -> bool:
if datablock.layers.active and \
datablock.layers.active.info != data["active_layers"]:
return True
else:
return False
def frame_changed(data: dict) -> bool:
return bpy.context.scene.frame_current != data["eval_frame"]
class BlGpencil(ReplicatedDatablock):
class BlGpencil(BlDatablock):
bl_id = "grease_pencils"
bl_class = bpy.types.GreasePencil
bl_check_common = False
bl_icon = 'GREASEPENCIL'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.grease_pencils.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
# MATERIAL SLOTS
src_materials = data.get('materials', None)
if src_materials:
load_materials_slots(src_materials, datablock.materials)
target.materials.clear()
if "materials" in data.keys():
for mat in data['materials']:
target.materials.append(bpy.data.materials[mat])
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
# TODO: reuse existing layer
for layer in datablock.layers:
datablock.layers.remove(layer)
for layer in target.layers:
target.layers.remove(layer)
if "layers" in data.keys():
for layer in data["layers"]:
layer_data = data["layers"].get(layer)
# if layer not in datablock.layers.keys():
target_layer = datablock.layers.new(data["layers"][layer]["info"])
# if layer not in target.layers.keys():
target_layer = target.layers.new(data["layers"][layer]["info"])
# else:
# target_layer = target.layers[layer]
# target_layer.clear()
load_layer(layer_data, target_layer)
datablock.layers.update()
target.layers.update()
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
'materials',
'name',
'zdepth_offset',
'stroke_thickness_space',
'pixel_factor',
'stroke_depth_order'
]
data = dumper.dump(datablock)
data['materials'] = dump_materials_slots(datablock.materials)
data = dumper.dump(instance)
data['layers'] = {}
for layer in datablock.layers:
for layer in instance.layers:
data['layers'][layer.info] = dump_layer(layer)
data["active_layers"] = datablock.layers.active.info if datablock.layers.active else "None"
data["active_layers"] = instance.layers.active.info
data["eval_frame"] = bpy.context.scene.frame_current
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.grease_pencils)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
@@ -293,13 +299,17 @@ class BlGpencil(ReplicatedDatablock):
return deps
@staticmethod
def needs_update(datablock: object, data: dict) -> bool:
return bpy.context.mode == 'OBJECT' \
or layer_changed(datablock, data) \
or frame_changed(data) \
or get_preferences().sync_flags.sync_during_editmode \
or is_annotating(bpy.context)
def layer_changed(self):
return self.instance.layers.active.info != self.data["active_layers"]
_type = bpy.types.GreasePencil
_class = BlGpencil
def frame_changed(self):
return bpy.context.scene.frame_current != self.data["eval_frame"]
def diff(self):
if self.layer_changed() \
or self.frame_changed() \
or bpy.context.mode == 'OBJECT' \
or self.preferences.sync_flags.sync_during_editmode:
return super().diff()
else:
return False


@@ -24,12 +24,9 @@ import bpy
import mathutils
from .. import utils
from replication.protocol import ReplicatedDatablock
from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader
from .bl_file import get_filepath, ensure_unpacked
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
format_to_ext = {
'BMP': 'bmp',
@@ -51,14 +48,13 @@
}
class BlImage(ReplicatedDatablock):
class BlImage(BlDatablock):
bl_id = "images"
bl_class = bpy.types.Image
bl_check_common = False
bl_icon = 'IMAGE_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.images.new(
name=data['name'],
@@ -66,22 +62,18 @@ class BlImage(ReplicatedDatablock):
height=data['size'][1]
)
@staticmethod
def load(data: dict, datablock: object):
def load(self, data, target):
loader = Loader()
loader.load(datablock, data)
loader.load(data, target)
# datablock.name = data.get('name')
datablock.source = 'FILE'
datablock.filepath_raw = get_filepath(data['filename'])
color_space_name = data.get("colorspace")
target.source = 'FILE'
target.filepath_raw = get_filepath(data['filename'])
target.colorspace_settings.name = data["colorspace_settings"]["name"]
if color_space_name:
datablock.colorspace_settings.name = color_space_name
def dump(self, instance=None):
assert(instance)
@staticmethod
def dump(datablock: object) -> dict:
filename = Path(datablock.filepath).name
filename = Path(instance.filepath).name
data = {
"filename": filename
@@ -91,18 +83,23 @@ class BlImage(ReplicatedDatablock):
dumper.depth = 2
dumper.include_filter = [
"name",
# 'source',
'size',
'alpha_mode']
data.update(dumper.dump(datablock))
data['colorspace'] = datablock.colorspace_settings.name
'height',
'alpha',
'float_buffer',
'alpha_mode',
'colorspace_settings']
data.update(dumper.dump(instance))
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.images)
def diff(self):
if self.instance.is_dirty:
self.instance.save()
if self.instance and (self.instance.name != self.data['name']):
return True
else:
return False
@staticmethod
def resolve_deps(datablock: object) -> [object]:
@@ -125,13 +122,3 @@ class BlImage(ReplicatedDatablock):
deps.append(Path(bpy.path.abspath(datablock.filepath)))
return deps
@staticmethod
def needs_update(datablock: object, data:dict)-> bool:
if datablock.is_dirty:
datablock.save()
return True
_type = bpy.types.Image
_class = BlImage


@@ -20,41 +20,33 @@ import bpy
import mathutils
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
from replication.protocol import ReplicatedDatablock
from .bl_datablock import BlDatablock
from replication.exception import ContextError
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
POINT = ['co', 'weight_softbody', 'co_deform']
class BlLattice(ReplicatedDatablock):
use_delta = True
class BlLattice(BlDatablock):
bl_id = "lattices"
bl_class = bpy.types.Lattice
bl_check_common = False
bl_icon = 'LATTICE_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.lattices.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
if datablock.is_editmode:
if target.is_editmode:
raise ContextError("lattice is in edit mode")
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
np_load_collection(data['points'], datablock.points, POINT)
np_load_collection(data['points'], target.points, POINT)
@staticmethod
def dump(datablock: object) -> dict:
if datablock.is_editmode:
if instance.is_editmode:
raise ContextError("lattice is in edit mode")
dumper = Dumper()
@@ -70,20 +62,9 @@ class BlLattice(ReplicatedDatablock):
'interpolation_type_w',
'use_outside'
]
data = dumper.dump(datablock)
data = dumper.dump(instance)
data['points'] = np_dump_collection(instance.points, POINT)
data['points'] = np_dump_collection(datablock.points, POINT)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.lattices)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
return resolve_animation_dependencies(datablock)
_type = bpy.types.Lattice
_class = BlLattice


@@ -0,0 +1,45 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
class BlLibrary(BlDatablock):
bl_id = "libraries"
bl_class = bpy.types.Library
bl_check_common = False
bl_icon = 'LIBRARY_DATA_DIRECT'
bl_reload_parent = False
def construct(data: dict) -> object:
with bpy.data.libraries.load(filepath=data["filepath"], link=True) as (sourceData, targetData):
targetData = sourceData
return sourceData
def load(self, data, target):
pass
def dump(self, instance=None):
assert(instance)
dumper = Dumper()
return dumper.dump(instance)


@@ -20,34 +20,25 @@ import bpy
import mathutils
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .bl_datablock import BlDatablock
class BlLight(ReplicatedDatablock):
use_delta = True
class BlLight(BlDatablock):
bl_id = "lights"
bl_class = bpy.types.Light
bl_check_common = False
bl_icon = 'LIGHT_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
instance = bpy.data.lights.new(data["name"], data["type"])
instance.uuid = data.get("uuid")
return instance
return bpy.data.lights.new(data["name"], data["type"])
@staticmethod
def load(data: dict, datablock: object):
loader = Loader()
loader.load(datablock, data)
load_animation_data(data.get('animation_data'), datablock)
loader.load(target, data)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 3
dumper.include_filter = [
@@ -76,23 +67,9 @@ class BlLight(ReplicatedDatablock):
'spot_size',
'spot_blend'
]
data = dumper.dump(datablock)
data['animation_data'] = dump_animation_data(datablock)
data = dumper.dump(instance)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.lights)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = [bpy.types.SpotLight, bpy.types.PointLight, bpy.types.AreaLight, bpy.types.SunLight]
_class = BlLight


@@ -21,19 +21,16 @@ import mathutils
import logging
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_datablock import BlDatablock
class BlLightprobe(ReplicatedDatablock):
use_delta = True
class BlLightprobe(BlDatablock):
bl_id = "lightprobes"
bl_class = bpy.types.LightProbe
bl_check_common = False
bl_icon = 'LIGHTPROBE_GRID'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
type = 'CUBE' if data['type'] == 'CUBEMAP' else data['type']
# See https://developer.blender.org/D6396
@@ -42,13 +39,12 @@ class BlLightprobe(ReplicatedDatablock):
else:
logging.warning("Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
@staticmethod
def load(data: dict, datablock: object):
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
if bpy.app.version[1] < 83:
logging.warning("Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
@@ -75,16 +71,7 @@ class BlLightprobe(ReplicatedDatablock):
'visibility_blur'
]
return dumper.dump(datablock)
return dumper.dump(instance)
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.lightprobes)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
return []
_type = bpy.types.LightProbe
_class = BlLightprobe


@@ -24,10 +24,9 @@ import re
from uuid import uuid4
from .dump_anything import Loader, Dumper
from .bl_datablock import get_datablock_from_uuid, stamp_uuid
from replication.protocol import ReplicatedDatablock
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from replication.objects import Node
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']
@@ -37,7 +36,7 @@ def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
:arg node_data: dumped node data
:type node_data: dict
:arg node_tree: target node_tree
:arg node_tree: datablock node_tree
:type node_tree: bpy.types.NodeTree
"""
loader = Loader()
@@ -48,11 +47,7 @@ def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
node_tree_uuid = node_data.get('node_tree_uuid', None)
if image_uuid and not target_node.image:
image = resolve_datablock_from_uuid(image_uuid, bpy.data.images)
if image is None:
logging.error(f"Fail to find material image from uuid {image_uuid}")
else:
target_node.image = image
target_node.image = get_datablock_from_uuid(image_uuid, None)
if node_tree_uuid:
target_node.node_tree = get_datablock_from_uuid(node_tree_uuid, None)
@@ -95,7 +90,7 @@ def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
def dump_node(node: bpy.types.ShaderNode) -> dict:
""" Dump a single node to a dict
:arg node: target node
:arg node: datablock node
:type node: bpy.types.Node
:return: dict
"""
@@ -124,7 +119,8 @@ def dump_node(node: bpy.types.ShaderNode) -> dict:
"show_preview",
"show_texture",
"outputs",
"width_hidden"
"width_hidden",
"image"
]
dumped_node = node_dumper.dump(node)
@@ -255,7 +251,7 @@ def dump_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
def dump_node_tree_sockets(sockets: bpy.types.Collection) -> dict:
""" dump sockets of a shader_node_tree
:arg target_node_tree: target node_tree
:arg target_node_tree: datablock node_tree
:type target_node_tree: bpy.types.NodeTree
:arg socket_id: socket identifer
:type socket_id: str
@@ -278,7 +274,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
sockets_data: dict):
""" load sockets of a shader_node_tree
:arg target_node_tree: target node_tree
:arg target_node_tree: datablock node_tree
:type target_node_tree: bpy.types.NodeTree
:arg socket_id: socket identifer
:type socket_id: str
@@ -306,7 +302,7 @@ def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeT
:arg node_tree_data: dumped node data
:type node_tree_data: dict
:arg target_node_tree: target node_tree
:arg target_node_tree: datablock node_tree
:type target_node_tree: bpy.types.NodeTree
"""
# TODO: load only required nodes
@@ -379,7 +375,7 @@ def load_materials_slots(src_materials: list, dst_materials: bpy.types.bpy_prop_
:arg src_materials: dumped material collection (ex: object.materials)
:type src_materials: list of tuples (uuid, name)
:arg dst_materials: target material collection pointer
:arg dst_materials: datablock material collection pointer
:type dst_materials: bpy.types.bpy_prop_collection
"""
# MATERIAL SLOTS
@@ -387,22 +383,20 @@ def load_materials_slots(src_materials: list, dst_materials: bpy.types.bpy_prop_
for mat_uuid, mat_name in src_materials:
mat_ref = None
if mat_uuid:
if mat_uuid is not None:
mat_ref = get_datablock_from_uuid(mat_uuid, None)
else:
mat_ref = bpy.data.materials[mat_name]
dst_materials.append(mat_ref)
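The uuid-first, name-fallback lookup performed by `load_materials_slots` can be sketched without Blender. This is a minimal illustration only; `resolve_slot`, `by_uuid` and `by_name` are hypothetical stand-ins for `get_datablock_from_uuid` and `bpy.data.materials`:

```python
# Hypothetical sketch: resolve a dumped (uuid, name) material slot,
# preferring the uuid lookup and falling back to lookup by name.
def resolve_slot(mat_uuid, mat_name, by_uuid, by_name):
    if mat_uuid is not None:
        return by_uuid.get(mat_uuid)   # stands in for get_datablock_from_uuid
    return by_name.get(mat_name)       # stands in for bpy.data.materials[name]

def load_slots(src_materials, by_uuid, by_name):
    # src_materials mirrors the dumped form: [(uuid, name), ...]
    return [resolve_slot(u, n, by_uuid, by_name) for u, n in src_materials]
```

Resolving by uuid first keeps slots stable across renames; the name fallback only matters for datablocks that were never stamped.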
class BlMaterial(ReplicatedDatablock):
use_delta = True
bl_id = "materials"
bl_class = bpy.types.Material
bl_check_common = False
bl_icon = 'MATERIAL_DATA'
bl_reload_parent = False
bl_reload_child = True
@staticmethod
def construct(data: dict) -> object:
@@ -410,6 +404,7 @@ class BlMaterial(ReplicatedDatablock):
@staticmethod
def load(data: dict, datablock: object):
data = data
loader = Loader()
is_grease_pencil = data.get('is_grease_pencil')
@@ -426,11 +421,11 @@ class BlMaterial(ReplicatedDatablock):
datablock.use_nodes = True
load_node_tree(data['node_tree'], datablock.node_tree)
load_animation_data(data.get('nodes_animation_data'), datablock.node_tree)
load_animation_data(data.get('animation_data'), datablock)
@staticmethod
def dump(datablock: object) -> dict:
stamp_uuid(datablock)
mat_dumper = Dumper()
mat_dumper.depth = 2
mat_dumper.include_filter = [
@@ -495,27 +490,18 @@ class BlMaterial(ReplicatedDatablock):
data['grease_pencil'] = gp_mat_dumper.dump(datablock.grease_pencil)
elif datablock.use_nodes:
data['node_tree'] = dump_node_tree(datablock.node_tree)
data['nodes_animation_data'] = dump_animation_data(datablock.node_tree)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.materials)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
# TODO: resolve node group deps
deps = []
if datablock.use_nodes:
deps.extend(get_node_tree_dependencies(datablock.node_tree))
deps.extend(resolve_animation_dependencies(datablock.node_tree))
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.Material
_class = BlMaterial
_class = BlMaterial
@@ -22,16 +22,20 @@ import mathutils
import logging
import numpy as np
from .dump_anything import Dumper, Loader, np_load_collection_primitives, np_dump_collection_primitive, np_load_collection, np_dump_collection
from .dump_anything import (Dumper,
Loader,
np_load_collection_primitives,
np_dump_collection_primitive,
np_load_collection, np_dump_collection)
from replication.constants import DIFF_BINARY
from replication.exception import ContextError
from replication.protocol import ReplicatedDatablock
from replication.objects import Node
from .bl_datablock import get_datablock_from_uuid
from .bl_datablock import get_datablock_from_uuid, stamp_uuid
from .bl_material import dump_materials_slots, load_materials_slots
from ..utils import get_preferences
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from ..preferences import get_preferences
VERTICE = ['co']
@@ -54,9 +58,8 @@ POLYGON = [
'material_index',
]
class BlMesh(ReplicatedDatablock):
use_delta = True
class BlMesh(ReplicatedDatablock):
bl_id = "meshes"
bl_class = bpy.types.Mesh
bl_check_common = False
@@ -65,15 +68,17 @@ class BlMesh(ReplicatedDatablock):
@staticmethod
def construct(data: dict) -> object:
return bpy.data.meshes.new(data.get("name"))
datablock = bpy.data.meshes.new(data["name"])
datablock.uuid = data['uuid']
return datablock
@staticmethod
def load(data: dict, datablock: object):
data = data
if not datablock or datablock.is_editmode:
raise ContextError
else:
load_animation_data(data.get('animation_data'), datablock)
loader = Loader()
loader.load(datablock, data)
@@ -95,7 +100,7 @@ class BlMesh(ReplicatedDatablock):
np_load_collection(data['vertices'], datablock.vertices, VERTICE)
np_load_collection(data['edges'], datablock.edges, EDGE)
np_load_collection(data['loops'], datablock.loops, LOOP)
np_load_collection(data["polygons"],datablock.polygons, POLYGON)
np_load_collection(data["polygons"], datablock.polygons, POLYGON)
# UV Layers
if 'uv_layers' in data.keys():
@@ -104,10 +109,10 @@ class BlMesh(ReplicatedDatablock):
datablock.uv_layers.new(name=layer)
np_load_collection_primitives(
datablock.uv_layers[layer].data,
'uv',
datablock.uv_layers[layer].data,
'uv',
data["uv_layers"][layer]['data'])
# Vertex color
if 'vertex_colors' in data.keys():
for color_layer in data['vertex_colors']:
@@ -115,8 +120,8 @@ class BlMesh(ReplicatedDatablock):
datablock.vertex_colors.new(name=color_layer)
np_load_collection_primitives(
datablock.vertex_colors[color_layer].data,
'color',
datablock.vertex_colors[color_layer].data,
'color',
data["vertex_colors"][color_layer]['data'])
datablock.validate()
@@ -124,6 +129,8 @@ class BlMesh(ReplicatedDatablock):
@staticmethod
def dump(datablock: object) -> dict:
stamp_uuid(datablock)
if (datablock.is_editmode or bpy.context.mode == "SCULPT") and not get_preferences().sync_flags.sync_during_editmode:
raise ContextError("Mesh is in edit mode")
mesh = datablock
@@ -131,6 +138,7 @@ class BlMesh(ReplicatedDatablock):
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
'uuid',
'name',
'use_auto_smooth',
'auto_smooth_angle',
@@ -140,8 +148,6 @@ class BlMesh(ReplicatedDatablock):
data = dumper.dump(mesh)
data['animation_data'] = dump_animation_data(datablock)
# VERTICES
data["vertex_count"] = len(mesh.vertices)
data["vertices"] = np_dump_collection(mesh.vertices, VERTICE)
@@ -163,19 +169,21 @@ class BlMesh(ReplicatedDatablock):
data['uv_layers'] = {}
for layer in mesh.uv_layers:
data['uv_layers'][layer.name] = {}
data['uv_layers'][layer.name]['data'] = np_dump_collection_primitive(layer.data, 'uv')
data['uv_layers'][layer.name]['data'] = np_dump_collection_primitive(
layer.data, 'uv')
# Vertex color
if mesh.vertex_colors:
data['vertex_colors'] = {}
for color_map in mesh.vertex_colors:
data['vertex_colors'][color_map.name] = {}
data['vertex_colors'][color_map.name]['data'] = np_dump_collection_primitive(color_map.data, 'color')
data['vertex_colors'][color_map.name]['data'] = np_dump_collection_primitive(
color_map.data, 'color')
# Materials
data['materials'] = dump_materials_slots(datablock.materials)
return data
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
@@ -184,19 +192,14 @@ class BlMesh(ReplicatedDatablock):
if material:
deps.append(material)
deps.extend(resolve_animation_dependencies(datablock))
return deps
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.meshes)
@staticmethod
def needs_update(datablock: object, data: dict) -> bool:
return ('EDIT' not in bpy.context.mode and bpy.context.mode != 'SCULPT') \
or get_preferences().sync_flags.sync_during_editmode
def diff(self):
if 'EDIT' in bpy.context.mode \
and not get_preferences().sync_flags.sync_during_editmode:
return False
else:
return super().diff()
_type = bpy.types.Mesh
_class = BlMesh
_class = BlMesh
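The mesh dump/load path above relies on Blender's `foreach_get`/`foreach_set` buffer API via `np_dump_collection_primitive` and `np_load_collection_primitives`. A rough sketch of that numpy round-trip, using a mock collection in place of a real `bpy_prop_collection` (names and the fixed `float64` dtype are assumptions for illustration):

```python
import numpy as np

# MockVertex/MockCollection stand in for bpy_prop_collection, which
# exposes the same foreach_get/foreach_set bulk-transfer API.
class MockVertex:
    def __init__(self, co):
        self.co = list(co)

class MockCollection(list):
    def foreach_get(self, attr, buf):
        buf[:] = [x for item in self for x in getattr(item, attr)]

    def foreach_set(self, attr, buf):
        size = len(buf) // len(self)
        for i, item in enumerate(self):
            setattr(item, attr, list(buf[i * size:(i + 1) * size]))

def dump_primitive(collection, attr, dimension=3):
    buf = np.zeros(len(collection) * dimension, dtype=np.float64)
    collection.foreach_get(attr, buf)      # flatten into one contiguous buffer
    return buf.tobytes()                   # raw bytes travel well over the wire

def load_primitive(dumped, collection, attr):
    buf = np.frombuffer(dumped, dtype=np.float64)
    collection.foreach_set(attr, buf)      # scatter back onto the collection
```

Bulk-copying through one numpy buffer avoids a Python-level loop per vertex, which is why the replication code dumps `vertices`, `edges`, `loops` and `polygons` this way.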
@@ -23,9 +23,7 @@ from .dump_anything import (
Dumper, Loader, np_dump_collection_primitive, np_load_collection_primitives,
np_dump_collection, np_load_collection)
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .bl_datablock import BlDatablock
ELEMENT = [
@@ -64,35 +62,29 @@ def load_metaball_elements(elements_data, elements):
np_load_collection(elements_data, elements, ELEMENT)
class BlMetaball(ReplicatedDatablock):
use_delta = True
class BlMetaball(BlDatablock):
bl_id = "metaballs"
bl_class = bpy.types.MetaBall
bl_check_common = False
bl_icon = 'META_BALL'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.metaballs.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
datablock.elements.clear()
target.elements.clear()
for mtype in data["elements"]['type']:
new_element = datablock.elements.new()
new_element = target.elements.new()
load_metaball_elements(data['elements'], datablock.elements)
load_metaball_elements(data['elements'], target.elements)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
@@ -106,24 +98,7 @@ class BlMetaball(ReplicatedDatablock):
'texspace_size'
]
data = dumper.dump(datablock)
data['animation_data'] = dump_animation_data(datablock)
data['elements'] = dump_metaball_elements(datablock.elements)
data = dumper.dump(instance)
data['elements'] = dump_metaball_elements(instance.elements)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.metaballs)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.MetaBall
_class = BlMetaball
@@ -20,45 +20,27 @@ import bpy
import mathutils
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
from replication.protocol import ReplicatedDatablock
from .bl_datablock import BlDatablock
from .bl_material import (dump_node_tree,
load_node_tree,
get_node_tree_dependencies)
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
class BlNodeGroup(ReplicatedDatablock):
use_delta = True
class BlNodeGroup(BlDatablock):
bl_id = "node_groups"
bl_class = bpy.types.NodeTree
bl_check_common = False
bl_icon = 'NODETREE'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.node_groups.new(data["name"], data["type"])
@staticmethod
def load(data: dict, datablock: object):
load_node_tree(data, datablock)
load_node_tree(data, target)
@staticmethod
def dump(datablock: object) -> dict:
return dump_node_tree(datablock)
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.node_groups)
return dump_node_tree(instance)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
deps.extend(get_node_tree_dependencies(datablock))
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = [bpy.types.ShaderNodeTree, bpy.types.GeometryNodeTree]
_class = BlNodeGroup
return get_node_tree_dependencies(datablock)
@@ -21,12 +21,17 @@ import re
import bpy
import mathutils
from replication.exception import ContextError
from replication.objects import Node
from replication.protocol import ReplicatedDatablock
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
from .bl_datablock import get_datablock_from_uuid, stamp_uuid
from .bl_action import (load_animation_data,
dump_animation_data,
resolve_animation_dependencies)
from ..preferences import get_preferences
from .bl_datablock import get_datablock_from_uuid
from .bl_material import IGNORED_SOCKETS
from ..utils import get_preferences
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .dump_anything import (
Dumper,
Loader,
@@ -40,14 +45,6 @@ SKIN_DATA = [
'use_root'
]
SHAPEKEY_BLOCK_ATTR = [
'mute',
'value',
'slider_min',
'slider_max',
]
if bpy.app.version[1] >= 93:
SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str, float)
else:
@@ -55,7 +52,6 @@ else:
logging.warning("Geometry node Float parameter not supported in \
blender 2.92.")
def get_node_group_inputs(node_group):
inputs = []
for inpt in node_group.inputs:
@@ -94,7 +90,6 @@ def dump_physics(target: bpy.types.Object)->dict:
return physics_data
def load_physics(dumped_settings: dict, target: bpy.types.Object):
""" Load all physics settings from a given object excluding modifier
related physics settings (such as softbody, cloth, dynapaint and fluid)
@@ -120,8 +115,7 @@ def load_physics(dumped_settings: dict, target: bpy.types.Object):
loader.load(target.rigid_body_constraint, dumped_settings['rigid_body_constraint'])
elif target.rigid_body_constraint:
bpy.ops.rigidbody.constraint_remove({"object": target})
def dump_modifier_geometry_node_inputs(modifier: bpy.types.Modifier) -> list:
""" Dump geometry node modifier input properties
@@ -173,45 +167,40 @@ def load_pose(target_bone, data):
def find_data_from_name(name=None):
instance = None
datablock = None
if not name:
pass
elif name in bpy.data.meshes.keys():
instance = bpy.data.meshes[name]
datablock = bpy.data.meshes[name]
elif name in bpy.data.lights.keys():
instance = bpy.data.lights[name]
datablock = bpy.data.lights[name]
elif name in bpy.data.cameras.keys():
instance = bpy.data.cameras[name]
datablock = bpy.data.cameras[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
datablock = bpy.data.curves[name]
elif name in bpy.data.metaballs.keys():
instance = bpy.data.metaballs[name]
datablock = bpy.data.metaballs[name]
elif name in bpy.data.armatures.keys():
instance = bpy.data.armatures[name]
datablock = bpy.data.armatures[name]
elif name in bpy.data.grease_pencils.keys():
instance = bpy.data.grease_pencils[name]
datablock = bpy.data.grease_pencils[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
datablock = bpy.data.curves[name]
elif name in bpy.data.lattices.keys():
instance = bpy.data.lattices[name]
datablock = bpy.data.lattices[name]
elif name in bpy.data.speakers.keys():
instance = bpy.data.speakers[name]
datablock = bpy.data.speakers[name]
elif name in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
instance = bpy.data.lightprobes[name]
datablock = bpy.data.lightprobes[name]
else:
logging.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
elif bpy.app.version[1] >= 91 and name in bpy.data.volumes.keys():
# Only supported since 2.91
instance = bpy.data.volumes[name]
return instance
def load_data(object, name):
logging.info("loading data")
pass
datablock = bpy.data.volumes[name]
return datablock
def _is_editmode(object: bpy.types.Object) -> bool:
@@ -258,6 +247,7 @@ def find_geometry_nodes_dependencies(modifiers: bpy.types.bpy_prop_collection) -
return dependencies
def dump_vertex_groups(src_object: bpy.types.Object) -> dict:
""" Dump object's vertex groups
@@ -303,228 +293,43 @@ def load_vertex_groups(dumped_vertex_groups: dict, target_object: bpy.types.Obje
vertex_group.add([index], weight, 'REPLACE')
def dump_shape_keys(target_key: bpy.types.Key)->dict:
""" Dump the target shape_keys datablock to a dict using numpy
:param target_key: target key datablock
:type target_key: bpy.types.Key
:return: dict
"""
dumped_key_blocks = []
dumper = Dumper()
dumper.include_filter = [
'name',
'mute',
'value',
'slider_min',
'slider_max',
]
for key in target_key.key_blocks:
dumped_key_block = dumper.dump(key)
dumped_key_block['data'] = np_dump_collection(key.data, ['co'])
dumped_key_block['relative_key'] = key.relative_key.name
dumped_key_blocks.append(dumped_key_block)
return {
'reference_key': target_key.reference_key.name,
'use_relative': target_key.use_relative,
'key_blocks': dumped_key_blocks,
'animation_data': dump_animation_data(target_key)
}
def load_shape_keys(dumped_shape_keys: dict, target_object: bpy.types.Object):
""" Load dumped shape_keys data onto the target object using numpy
:param dumped_shape_keys: dumped shape_keys data
:type dumped_shape_keys: dict
:param target_object: object used to load the shapekeys data onto
:type target_object: bpy.types.Object
"""
loader = Loader()
# Remove existing ones
target_object.shape_key_clear()
# Create keys and load vertices coords
dumped_key_blocks = dumped_shape_keys.get('key_blocks')
for dumped_key_block in dumped_key_blocks:
key_block = target_object.shape_key_add(name=dumped_key_block['name'])
loader.load(key_block, dumped_key_block)
np_load_collection(dumped_key_block['data'], key_block.data, ['co'])
# Load relative key after all
for dumped_key_block in dumped_key_blocks:
relative_key_name = dumped_key_block.get('relative_key')
key_name = dumped_key_block.get('name')
target_keyblock = target_object.data.shape_keys.key_blocks[key_name]
relative_key = target_object.data.shape_keys.key_blocks[relative_key_name]
target_keyblock.relative_key = relative_key
# Shape keys animation data
anim_data = dumped_shape_keys.get('animation_data')
if anim_data:
load_animation_data(anim_data, target_object.data.shape_keys)
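`load_shape_keys` above deliberately wires `relative_key` in a second pass, because a block's relative key may not exist yet while blocks are still being created. A minimal sketch of that two-pass idea, with hypothetical stand-in classes instead of `bpy.types.Key`:

```python
# Hypothetical sketch of the two-pass relative-key wiring: create every
# key block first, then resolve the by-name relative_key references once
# all targets exist.
class KeyBlock:
    def __init__(self, name):
        self.name = name
        self.relative_key = None

def load_blocks(dumped_key_blocks):
    # dumped form mirrors dump_shape_keys: [{'name': ..., 'relative_key': ...}, ...]
    blocks = {d['name']: KeyBlock(d['name']) for d in dumped_key_blocks}
    for d in dumped_key_blocks:  # second pass: every referenced block now exists
        blocks[d['name']].relative_key = blocks[d['relative_key']]
    return blocks
```

Serializing the reference as a name and resolving it late is the same trick the dump uses for `reference_key`.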
def dump_modifiers(modifiers: bpy.types.bpy_prop_collection)->list:
""" Dump all modifiers of a modifier collection into a list
:param modifiers: modifiers
:type modifiers: bpy.types.bpy_prop_collection
:return: list
"""
dumped_modifiers = []
dumper = Dumper()
dumper.depth = 1
dumper.exclude_filter = ['is_active']
for modifier in modifiers:
dumped_modifier = dumper.dump(modifier)
# hack to dump geometry nodes inputs
if modifier.type == 'NODES':
dumped_inputs = dump_modifier_geometry_node_inputs(
modifier)
dumped_modifier['inputs'] = dumped_inputs
elif modifier.type == 'PARTICLE_SYSTEM':
dumper.exclude_filter = [
"is_edited",
"is_editable",
"is_global_hair"
]
dumped_modifier['particle_system'] = dumper.dump(modifier.particle_system)
dumped_modifier['particle_system']['settings_uuid'] = modifier.particle_system.settings.uuid
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
dumped_modifier['settings'] = dumper.dump(modifier.settings)
elif modifier.type == 'UV_PROJECT':
dumped_modifier['projectors'] =[p.object.name for p in modifier.projectors if p and p.object]
dumped_modifiers.append(dumped_modifier)
return dumped_modifiers
def dump_constraints(constraints: bpy.types.bpy_prop_collection)->list:
"""Dump all constraints to a list
:param constraints: constraints
:type constraints: bpy.types.bpy_prop_collection
:return: list
"""
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = None
dumped_constraints = []
for constraint in constraints:
dumped_constraints.append(dumper.dump(constraint))
return dumped_constraints
def load_constraints(dumped_constraints: list, constraints: bpy.types.bpy_prop_collection):
""" Load dumped constraints
:param dumped_constraints: list of constraints to load
:type dumped_constraints: list
:param constraints: constraints
:type constraints: bpy.types.bpy_prop_collection
"""
loader = Loader()
constraints.clear()
for dumped_constraint in dumped_constraints:
constraint_type = dumped_constraint.get('type')
new_constraint = constraints.new(constraint_type)
loader.load(new_constraint, dumped_constraint)
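`load_constraints` (and `load_modifiers` below it) both follow a clear-and-recreate pattern: wipe the target collection, then re-create each dumped entry from its `type` and refill its attributes. A hedged sketch with mock classes standing in for `bpy_prop_collection` and `Loader().load`:

```python
# Hypothetical sketch of the clear-and-recreate pattern. Item/Collection
# are mocks; setattr stands in for the dump_anything Loader.
class Item:
    def __init__(self, type_name):
        self.type = type_name

class Collection(list):
    def clear(self):
        del self[:]

    def new(self, type_name):
        self.append(Item(type_name))
        return self[-1]

def load_items(dumped_items, collection):
    collection.clear()                         # drop stale entries first
    for dumped in dumped_items:
        item = collection.new(dumped['type'])  # the dumped type drives creation
        for key, value in dumped.items():
            setattr(item, key, value)
```

Rebuilding from scratch sidesteps diffing individual entries, at the cost of losing any local-only state on the replaced items.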
def load_modifiers(dumped_modifiers: list, modifiers: bpy.types.bpy_prop_collection):
""" Load dumped modifiers into a modifier collection
:param dumped_modifiers: list of modifiers to load
:type dumped_modifiers: list
:param modifiers: modifiers
:type modifiers: bpy.types.bpy_prop_collection
"""
loader = Loader()
modifiers.clear()
for dumped_modifier in dumped_modifiers:
name = dumped_modifier.get('name')
mtype = dumped_modifier.get('type')
loaded_modifier = modifiers.new(name, mtype)
loader.load(loaded_modifier, dumped_modifier)
if loaded_modifier.type == 'NODES':
load_modifier_geometry_node_inputs(dumped_modifier, loaded_modifier)
elif loaded_modifier.type == 'PARTICLE_SYSTEM':
default = loaded_modifier.particle_system.settings
dumped_particles = dumped_modifier['particle_system']
loader.load(loaded_modifier.particle_system, dumped_particles)
settings = get_datablock_from_uuid(dumped_particles['settings_uuid'], None)
if settings:
loaded_modifier.particle_system.settings = settings
# Hack to remove the default generated particle settings
if not default.uuid:
bpy.data.particles.remove(default)
elif loaded_modifier.type in ['SOFT_BODY', 'CLOTH']:
loader.load(loaded_modifier.settings, dumped_modifier['settings'])
elif loaded_modifier.type == 'UV_PROJECT':
for projector_index, projector_object in enumerate(dumped_modifier['projectors']):
target_object = bpy.data.objects.get(projector_object)
if target_object:
loaded_modifier.projectors[projector_index].object = target_object
else:
logging.error(f"Couldn't load projector target object {projector_object}")
def load_modifiers_custom_data(dumped_modifiers: dict, modifiers: bpy.types.bpy_prop_collection):
""" Load modifiers custom data not managed by the dump_anything loader
:param dumped_modifiers: modifiers to load
:type dumped_modifiers: dict
:param modifiers: target modifiers collection
:type modifiers: bpy.types.bpy_prop_collection
"""
loader = Loader()
for modifier in modifiers:
dumped_modifier = dumped_modifiers.get(modifier.name)
class BlObject(ReplicatedDatablock):
use_delta = True
bl_id = "objects"
bl_class = bpy.types.Object
bl_check_common = False
bl_icon = 'OBJECT_DATA'
bl_reload_parent = False
is_root = False
@staticmethod
def construct(data: dict) -> object:
instance = None
def construct(data: dict) -> bpy.types.Object:
datablock = None
# TODO: refactoring
object_name = data.get("name")
data_uuid = data.get("data_uuid")
data_id = data.get("data")
data_type = data.get("type")
object_uuid = data.get('uuid')
object_data = get_datablock_from_uuid(
data_uuid,
find_data_from_name(data_id),
ignore=['images']) # TODO: use resolve_from_id
if data_type != 'EMPTY' and object_data is None:
raise Exception(f"Fail to load object {data['name']})")
if object_data is None and data_uuid:
raise Exception(f"Fail to load object {data['name']}({object_uuid})")
return bpy.data.objects.new(object_name, object_data)
datablock = bpy.data.objects.new(object_name, object_data)
datablock.uuid = object_uuid
return datablock
@staticmethod
def load(data: dict, datablock: object):
def load(data: dict, datablock: bpy.types.Object):
data = datablock.data
load_animation_data(data, datablock)
loader = Loader()
load_animation_data(data.get('animation_data'), datablock)
data_uuid = data.get("data_uuid")
data_id = data.get("data")
@@ -540,9 +345,24 @@ class BlObject(ReplicatedDatablock):
object_data = datablock.data
# SHAPE KEYS
shape_keys = data.get('shape_keys')
if shape_keys:
load_shape_keys(shape_keys, datablock)
if 'shape_keys' in data:
datablock.shape_key_clear()
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data = data['shape_keys']['key_blocks'][key_block]
datablock.shape_key_add(name=key_block)
loader.load(
datablock.data.shape_keys.key_blocks[key_block], key_data)
for vert in key_data['data']:
datablock.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
# Load relative key after all
for key_block in data['shape_keys']['key_blocks']:
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
datablock.data.shape_keys.key_blocks[key_block].relative_key = datablock.data.shape_keys.key_blocks[reference]
# Load transformation data
loader.load(datablock, data)
@@ -568,26 +388,26 @@ class BlObject(ReplicatedDatablock):
# Bone groups
for bg_name in data['pose']['bone_groups']:
bg_data = data['pose']['bone_groups'].get(bg_name)
bg_target = datablock.pose.bone_groups.get(bg_name)
bg_datablock = datablock.pose.bone_groups.get(bg_name)
if not bg_target:
bg_target = datablock.pose.bone_groups.new(name=bg_name)
if not bg_datablock:
bg_datablock = datablock.pose.bone_groups.new(name=bg_name)
loader.load(bg_target, bg_data)
loader.load(bg_datablock, bg_data)
# datablock.pose.bone_groups.get
# Bones
for bone in data['pose']['bones']:
target_bone = datablock.pose.bones.get(bone)
datablock_bone = datablock.pose.bones.get(bone)
bone_data = data['pose']['bones'].get(bone)
if 'constraints' in bone_data.keys():
loader.load(target_bone, bone_data['constraints'])
loader.load(datablock_bone, bone_data['constraints'])
load_pose(target_bone, bone_data)
load_pose(datablock_bone, bone_data)
if 'bone_index' in bone_data.keys():
target_bone.bone_group = datablock.pose.bone_group[bone_data['bone_group_index']]
datablock_bone.bone_group = datablock.pose.bone_group[bone_data['bone_group_index']]
# TODO: find another way...
if datablock.empty_display_type == "IMAGE":
@@ -608,26 +428,52 @@ class BlObject(ReplicatedDatablock):
and 'cycles_visibility' in data:
loader.load(datablock.cycles_visibility, data['cycles_visibility'])
# TODO: handle geometry nodes input from dump_anything
if hasattr(datablock, 'modifiers'):
load_modifiers(data['modifiers'], datablock.modifiers)
nodes_modifiers = [
mod for mod in datablock.modifiers if mod.type == 'NODES']
for modifier in nodes_modifiers:
load_modifier_geometry_node_inputs(
data['modifiers'][modifier.name], modifier)
constraints = data.get('constraints')
if constraints:
load_constraints(constraints, datablock.constraints)
particles_modifiers = [
mod for mod in datablock.modifiers if mod.type == 'PARTICLE_SYSTEM']
for mod in particles_modifiers:
default = mod.particle_system.settings
dumped_particles = data['modifiers'][mod.name]['particle_system']
loader.load(mod.particle_system, dumped_particles)
settings = get_datablock_from_uuid(dumped_particles['settings_uuid'], None)
if settings:
mod.particle_system.settings = settings
# Hack to remove the default generated particle settings
if not default.uuid:
bpy.data.particles.remove(default)
phys_modifiers = [
mod for mod in datablock.modifiers if mod.type in ['SOFT_BODY', 'CLOTH']]
for mod in phys_modifiers:
loader.load(mod.settings, data['modifiers'][mod.name]['settings'])
# PHYSICS
load_physics(data, datablock)
transform = data.get('transforms', None)
if transform:
datablock.matrix_parent_inverse = mathutils.Matrix(transform['matrix_parent_inverse'])
datablock.matrix_parent_inverse = mathutils.Matrix(
transform['matrix_parent_inverse'])
datablock.matrix_basis = mathutils.Matrix(transform['matrix_basis'])
datablock.matrix_local = mathutils.Matrix(transform['matrix_local'])
@staticmethod
def dump(datablock: object) -> dict:
assert(datablock)
if _is_editmode(datablock):
if get_preferences().sync_flags.sync_during_editmode:
if self.preferences.sync_flags.sync_during_editmode:
datablock.update_from_editmode()
else:
raise ContextError("Object is in edit-mode.")
@@ -635,6 +481,7 @@ class BlObject(ReplicatedDatablock):
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"uuid",
"name",
"rotation_mode",
"data",
@@ -664,15 +511,11 @@ class BlObject(ReplicatedDatablock):
'show_all_edges',
'show_texture_space',
'show_in_front',
'type',
'parent_type',
'parent_bone',
'track_axis',
'up_axis',
'type'
]
data = dumper.dump(datablock)
data['animation_data'] = dump_animation_data(datablock)
dumper.include_filter = [
'matrix_parent_inverse',
'matrix_local',
@@ -690,9 +533,34 @@ class BlObject(ReplicatedDatablock):
data['parent_uid'] = (datablock.parent.uuid, datablock.parent.name)
# MODIFIERS
modifiers = getattr(datablock, 'modifiers', None)
if hasattr(datablock, 'modifiers'):
data['modifiers'] = dump_modifiers(modifiers)
data["modifiers"] = {}
modifiers = getattr(datablock, 'modifiers', None)
if modifiers:
dumper.include_filter = None
dumper.depth = 1
dumper.exclude_filter = ['is_active']
for index, modifier in enumerate(modifiers):
dumped_modifier = dumper.dump(modifier)
# hack to dump geometry nodes inputs
if modifier.type == 'NODES':
dumped_inputs = dump_modifier_geometry_node_inputs(
modifier)
dumped_modifier['inputs'] = dumped_inputs
elif modifier.type == 'PARTICLE_SYSTEM':
dumper.exclude_filter = [
"is_edited",
"is_editable",
"is_global_hair"
]
dumped_modifier['particle_system'] = dumper.dump(modifier.particle_system)
dumped_modifier['particle_system']['settings_uuid'] = modifier.particle_system.settings.uuid
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
dumped_modifier['settings'] = dumper.dump(modifier.settings)
data["modifiers"][modifier.name] = dumped_modifier
gp_modifiers = getattr(datablock, 'grease_pencil_modifiers', None)
@@ -718,7 +586,9 @@ class BlObject(ReplicatedDatablock):
# CONSTRAINTS
if hasattr(datablock, 'constraints'):
data["constraints"] = dump_constraints(datablock.constraints)
dumper.include_filter = None
dumper.depth = 3
data["constraints"] = dumper.dump(datablock.constraints)
# POSE
if hasattr(datablock, 'pose') and datablock.pose:
@@ -765,7 +635,30 @@ class BlObject(ReplicatedDatablock):
# SHAPE KEYS
object_data = datablock.data
if hasattr(object_data, 'shape_keys') and object_data.shape_keys:
data['shape_keys'] = dump_shape_keys(object_data.shape_keys)
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
'reference_key',
'use_relative'
]
data['shape_keys'] = dumper.dump(object_data.shape_keys)
data['shape_keys']['reference_key'] = object_data.shape_keys.reference_key.name
key_blocks = {}
for key in object_data.shape_keys.key_blocks:
dumper.depth = 3
dumper.include_filter = [
'name',
'data',
'mute',
'value',
'slider_min',
'slider_max',
'data',
'co'
]
key_blocks[key.name] = dumper.dump(key)
key_blocks[key.name]['relative_key'] = key.relative_key.name
data['shape_keys']['key_blocks'] = key_blocks
# SKIN VERTICES
if hasattr(object_data, 'skin_vertices') and object_data.skin_vertices:
@@ -785,15 +678,16 @@ class BlObject(ReplicatedDatablock):
'scatter',
'shadow',
]
data['cycles_visibility'] = dumper.dump(datablock.cycles_visibility)
data['cycles_visibility'] = dumper.dump(
datablock.cycles_visibility)
# PHYSICS
data.update(dump_physics(datablock))
data.update(dump_physics(instance))
return data
@staticmethod
def resolve_deps(datablock: object) -> [object]:
def resolve_deps(datablock: bpy.types.Object) -> list:
deps = []
# Avoid Empty case
@@ -811,22 +705,13 @@ class BlObject(ReplicatedDatablock):
# TODO: uuid based
deps.append(datablock.instance_collection)
deps.extend(resolve_animation_dependencies(datablock))
if datablock.modifiers:
deps.extend(find_textures_dependencies(datablock.modifiers))
deps.extend(find_geometry_nodes_dependencies(datablock.modifiers))
if hasattr(datablock.data, 'shape_keys') and datablock.data.shape_keys:
deps.extend(resolve_animation_dependencies(datablock.data.shape_keys))
deps.extend(resolve_animation_dependencies(datablock))
return deps
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.objects)
_type = bpy.types.Object
_class = BlObject
_class = BlObject
@@ -0,0 +1,90 @@
import bpy
import mathutils
from . import dump_anything
from .bl_datablock import BlDatablock, get_datablock_from_uuid
def dump_textures_slots(texture_slots: bpy.types.bpy_prop_collection) -> list:
""" Dump every texture slot collection in the form:
[(index, slot_texture_uuid, slot_texture_name), (), ...]
"""
dumped_slots = []
for index, slot in enumerate(texture_slots):
if slot and slot.texture:
dumped_slots.append((index, slot.texture.uuid, slot.texture.name))
return dumped_slots
def load_texture_slots(dumped_slots: list, target_slots: bpy.types.bpy_prop_collection):
""" Load dumped texture slots into the given texture slot collection
"""
for index, slot in enumerate(target_slots):
if slot:
target_slots.clear(index)
for index, slot_uuid, slot_name in dumped_slots:
target_slots.create(index).texture = get_datablock_from_uuid(
slot_uuid, slot_name
)
IGNORED_ATTR = [
"is_embedded_data",
"is_evaluated",
"is_fluid",
"is_library_indirect",
"users"
]
class BlParticle(BlDatablock):
bl_id = "particles"
bl_class = bpy.types.ParticleSettings
bl_icon = "PARTICLES"
bl_check_common = False
bl_reload_parent = False
def _construct(self, data):
instance = bpy.data.particles.new(data["name"])
instance.uuid = self.uuid
return instance
def _load_implementation(self, data, target):
dump_anything.load(target, data)
dump_anything.load(target.effector_weights, data["effector_weights"])
# Force field
force_field_1 = data.get("force_field_1", None)
if force_field_1:
dump_anything.load(target.force_field_1, force_field_1)
force_field_2 = data.get("force_field_2", None)
if force_field_2:
dump_anything.load(target.force_field_2, force_field_2)
# Texture slots
load_texture_slots(data["texture_slots"], target.texture_slots)
def _dump_implementation(self, data, instance=None):
assert instance
dumper = dump_anything.Dumper()
dumper.depth = 1
dumper.exclude_filter = IGNORED_ATTR
data = dumper.dump(instance)
# Particle effectors
data["effector_weights"] = dumper.dump(instance.effector_weights)
if instance.force_field_1:
data["force_field_1"] = dumper.dump(instance.force_field_1)
if instance.force_field_2:
data["force_field_2"] = dumper.dump(instance.force_field_2)
# Texture slots
data["texture_slots"] = dump_textures_slots(instance.texture_slots)
return data
def _resolve_deps_implementation(self):
return [t.texture for t in self.instance.texture_slots if t and t.texture]
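The slot helpers above round-trip each occupied texture slot as an `(index, uuid, name)` tuple. A bpy-free sketch of that format, using plain dicts in place of Blender's texture slot collection (the helper names and dict layout here are illustrative, not the addon's API):

```python
# Simplified stand-ins for dump_textures_slots / load_texture_slots:
# each occupied slot becomes an (index, uuid, name) tuple, and loading
# rebuilds slots at the same indices from a uuid lookup table.

def dump_slots(slots):
    """Dump occupied slots as [(index, uuid, name), ...]."""
    return [(i, s["uuid"], s["name"]) for i, s in enumerate(slots) if s]

def load_slots(dumped, lookup):
    """Rebuild a sparse slot list from dumped tuples."""
    size = max((i for i, _, _ in dumped), default=-1) + 1
    slots = [None] * size
    for index, uuid, name in dumped:
        # Fall back to the dumped name when the uuid is unknown,
        # mirroring get_datablock_from_uuid's name fallback.
        slots[index] = lookup.get(uuid, {"uuid": uuid, "name": name})
    return slots

textures = {"u1": {"uuid": "u1", "name": "noise"}}
dumped = dump_slots([None, textures["u1"]])
assert dumped == [(1, "u1", "noise")]       # empty slot 0 is skipped
assert load_slots(dumped, textures)[1]["name"] == "noise"
```

Keeping the index in the tuple lets loading restore slots to their original positions even when earlier slots are empty.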

View File

@ -18,23 +18,26 @@
import logging
from pathlib import Path
from uuid import uuid4
import bpy
import mathutils
from deepdiff import DeepDiff, Delta
from deepdiff import DeepDiff
from replication.constants import DIFF_JSON, MODIFIED
from replication.protocol import ReplicatedDatablock
from replication.objects import Node
from ..utils import flush_history, get_preferences
from .bl_action import (dump_animation_data, load_animation_data,
resolve_animation_dependencies)
from ..utils import flush_history
from .bl_collection import (dump_collection_children, dump_collection_objects,
load_collection_childrens, load_collection_objects,
resolve_collection_dependencies)
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import (load_animation_data,
dump_animation_data,
resolve_animation_dependencies)
from .bl_datablock import stamp_uuid
from .bl_file import get_filepath
from .dump_anything import Dumper, Loader
from ..preferences import get_preferences
RENDER_SETTINGS = [
'dither_intensity',
@ -304,8 +307,7 @@ def dump_sequence(sequence: bpy.types.Sequence) -> dict:
return data
def load_sequence(sequence_data: dict,
sequence_editor: bpy.types.SequenceEditor):
def load_sequence(sequence_data: dict, sequence_editor: bpy.types.SequenceEditor):
""" Load sequence from dumped data
:arg sequence_data: sequence to dump
@ -365,7 +367,7 @@ def load_sequence(sequence_data: dict,
**seq)
loader = Loader()
# TODO: Support filepath updates
loader.exclure_filter = ['filepath', 'sound', 'filenames', 'fps']
loader.load(sequence, sequence_data)
sequence.select = False
@ -373,7 +375,6 @@ def load_sequence(sequence_data: dict,
class BlScene(ReplicatedDatablock):
is_root = True
use_delta = True
bl_id = "scenes"
bl_class = bpy.types.Scene
@ -383,12 +384,13 @@ class BlScene(ReplicatedDatablock):
@staticmethod
def construct(data: dict) -> object:
return bpy.data.scenes.new(data["name"])
datablock = bpy.data.scenes.new(data["name"])
datablock.uuid = data.get("uuid")
return datablock
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
# Load other meshes metadata
loader = Loader()
loader.load(datablock, data)
@ -403,9 +405,8 @@ class BlScene(ReplicatedDatablock):
datablock.world = bpy.data.worlds[data['world']]
# Annotation
gpencil_uid = data.get('grease_pencil')
if gpencil_uid:
datablock.grease_pencil = resolve_datablock_from_uuid(gpencil_uid, bpy.data.grease_pencils)
if 'grease_pencil' in data.keys():
datablock.grease_pencil = bpy.data.grease_pencils[data['grease_pencil']]
if get_preferences().sync_flags.sync_render_settings:
if 'eevee' in data.keys():
@ -417,19 +418,20 @@ class BlScene(ReplicatedDatablock):
if 'render' in data.keys():
loader.load(datablock.render, data['render'])
view_settings = data.get('view_settings')
if view_settings:
loader.load(datablock.view_settings, view_settings)
if 'view_settings' in data.keys():
loader.load(datablock.view_settings, data['view_settings'])
if datablock.view_settings.use_curve_mapping and \
'curve_mapping' in view_settings:
'curve_mapping' in data['view_settings']:
# TODO: change this ugly fix
datablock.view_settings.curve_mapping.white_level = view_settings['curve_mapping']['white_level']
datablock.view_settings.curve_mapping.black_level = view_settings['curve_mapping']['black_level']
datablock.view_settings.curve_mapping.white_level = data[
'view_settings']['curve_mapping']['white_level']
datablock.view_settings.curve_mapping.black_level = data[
'view_settings']['curve_mapping']['black_level']
datablock.view_settings.curve_mapping.update()
# Sequencer
sequences = data.get('sequences')
if sequences:
# Create sequencer data
datablock.sequence_editor_create()
@ -440,29 +442,19 @@ class BlScene(ReplicatedDatablock):
if seq.name not in sequences:
vse.sequences.remove(seq)
# Load existing sequences
for seq_data in sequences.values():
for seq_name, seq_data in sequences.items():
load_sequence(seq_data, vse)
# If the sequence is no longer used, clear it
elif datablock.sequence_editor and not sequences:
datablock.sequence_editor_clear()
# Timeline markers
markers = data.get('timeline_markers')
if markers:
datablock.timeline_markers.clear()
for name, frame, camera in markers:
marker = datablock.timeline_markers.new(name, frame=frame)
if camera:
marker.camera = resolve_datablock_from_uuid(camera, bpy.data.objects)
marker.select = False
# FIXME: Find a better way after the replication big refactoring
# Keep other users from deleting collection objects by flushing their history
flush_history()
@staticmethod
def dump(datablock: object) -> dict:
data = {}
data['animation_data'] = dump_animation_data(datablock)
stamp_uuid(datablock)
# Metadata
scene_dumper = Dumper()
@ -471,14 +463,17 @@ class BlScene(ReplicatedDatablock):
'name',
'world',
'id',
'grease_pencil',
'frame_start',
'frame_end',
'frame_step',
'uuid'
]
if get_preferences().sync_flags.sync_active_camera:
scene_dumper.include_filter.append('camera')
data.update(scene_dumper.dump(datablock))
data = scene_dumper.dump(datablock)
dump_animation_data(datablock, data)
# Master collection
data['collection'] = {}
@ -526,13 +521,6 @@ class BlScene(ReplicatedDatablock):
dumped_sequences[seq.name] = dump_sequence(seq)
data['sequences'] = dumped_sequences
# Timeline markers
if datablock.timeline_markers:
data['timeline_markers'] = [(m.name, m.frame, getattr(m.camera, 'uuid', None)) for m in datablock.timeline_markers]
if datablock.grease_pencil:
data['grease_pencil'] = datablock.grease_pencil.uuid
return data
@staticmethod
@ -550,8 +538,6 @@ class BlScene(ReplicatedDatablock):
if datablock.grease_pencil:
deps.append(datablock.grease_pencil)
deps.extend(resolve_animation_dependencies(datablock))
# Sequences
vse = datablock.sequence_editor
if vse:
@ -564,22 +550,11 @@ class BlScene(ReplicatedDatablock):
for elem in sequence.elements:
deps.append(
Path(bpy.path.abspath(sequence.directory),
elem.filename))
elem.filename))
return deps
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
name = data.get('name')
datablock = resolve_datablock_from_uuid(uuid, bpy.data.scenes)
if datablock is None:
datablock = bpy.data.scenes.get(name)
return datablock
@staticmethod
def compute_delta(last_data: dict, current_data: dict) -> Delta:
def diff(self):
exclude_path = []
if not get_preferences().sync_flags.sync_render_settings:
@ -591,22 +566,7 @@ class BlScene(ReplicatedDatablock):
if not get_preferences().sync_flags.sync_active_camera:
exclude_path.append("root['camera']")
diff_params = {
'exclude_paths': exclude_path,
'ignore_order': True,
'report_repetition': True
}
delta_params = {
# 'mutate': True
}
return Delta(
DeepDiff(last_data,
current_data,
cache_size=5000,
**diff_params),
**delta_params)
return DeepDiff(self.data, self._dump(datablock=self.datablock), exclude_paths=exclude_path)
_type = bpy.types.Scene
_class = BlScene
_class = BlScene
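`BlScene.compute_delta` above wraps a `DeepDiff` in a `Delta`, with exclude paths that depend on the sync flags (render settings, active camera). A dependency-free sketch of the same idea, diffing nested dicts while skipping excluded top-level keys (the helper name and dict shapes are illustrative; the real code relies on the `deepdiff` package):

```python
def shallow_delta(last, current, exclude=()):
    """Return the keys whose values changed, skipping excluded ones
    (analogous to DeepDiff's exclude_paths=["root['key']"])."""
    keys = set(last) | set(current)
    return {k: current.get(k)
            for k in keys
            if k not in exclude and last.get(k) != current.get(k)}

last = {"camera": "cam_a", "render": {"fps": 24}, "name": "Scene"}
cur = {"camera": "cam_b", "render": {"fps": 30}, "name": "Scene"}

# With sync_active_camera disabled, 'camera' changes are ignored.
assert shallow_delta(last, cur, exclude={"camera"}) == {"render": {"fps": 30}}
# With it enabled, the camera change is part of the delta.
assert "camera" in shallow_delta(last, cur)
```

Excluding paths at diff time, rather than stripping them from the dump, keeps the dumped data complete while letting each peer decide which scene properties to synchronise.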

View File

@ -23,39 +23,38 @@ from pathlib import Path
import bpy
from .bl_file import get_filepath, ensure_unpacked
from replication.protocol import ReplicatedDatablock
from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader
from .bl_datablock import resolve_datablock_from_uuid
class BlSound(ReplicatedDatablock):
class BlSound(BlDatablock):
bl_id = "sounds"
bl_class = bpy.types.Sound
bl_check_common = False
bl_icon = 'SOUND'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
filename = data.get('filename')
return bpy.data.sounds.load(get_filepath(filename))
@staticmethod
def load(data: dict, datablock: object):
def load(self, data, target):
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
@staticmethod
def dump(datablock: object) -> dict:
filename = Path(datablock.filepath).name
def diff(self):
return False
def dump(self, instance=None):
filename = Path(instance.filepath).name
if not filename:
raise FileExistsError(datablock.filepath)
raise FileExistsError(instance.filepath)
return {
'filename': filename,
'name': datablock.name
'name': instance.name
}
@staticmethod
@ -63,19 +62,7 @@ class BlSound(ReplicatedDatablock):
deps = []
if datablock.filepath and datablock.filepath != '<builtin>':
ensure_unpacked(datablock)
deps.append(Path(bpy.path.abspath(datablock.filepath)))
return deps
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.sounds)
@staticmethod
def needs_update(datablock: object, data:dict)-> bool:
return False
_type = bpy.types.Sound
_class = BlSound

View File

@ -20,31 +20,26 @@ import bpy
import mathutils
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .bl_datablock import BlDatablock
class BlSpeaker(ReplicatedDatablock):
use_delta = True
class BlSpeaker(BlDatablock):
bl_id = "speakers"
bl_class = bpy.types.Speaker
bl_check_common = False
bl_icon = 'SPEAKER'
bl_reload_parent = False
@staticmethod
def load(data: dict, datablock: object):
loader = Loader()
loader.load(datablock, data)
load_animation_data(data.get('animation_data'), datablock)
loader.load(target, data)
@staticmethod
def construct(data: dict) -> object:
return bpy.data.speakers.new(data["name"])
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
@ -63,18 +58,10 @@ class BlSpeaker(ReplicatedDatablock):
'cone_volume_outer'
]
data = dumper.dump(datablock)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.speakers)
return dumper.dump(instance)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
# TODO: resolve material
deps = []
sound = datablock.sound
@ -82,8 +69,6 @@ class BlSpeaker(ReplicatedDatablock):
if sound:
deps.append(sound)
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.Speaker
_class = BlSpeaker

View File

@ -20,32 +20,25 @@ import bpy
import mathutils
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
import bpy.types as T
from .bl_datablock import BlDatablock
class BlTexture(ReplicatedDatablock):
use_delta = True
class BlTexture(BlDatablock):
bl_id = "textures"
bl_class = bpy.types.Texture
bl_check_common = False
bl_icon = 'TEXTURE'
bl_reload_parent = False
@staticmethod
def load(data: dict, datablock: object):
loader = Loader()
loader.load(datablock, data)
load_animation_data(data.get('animation_data'), datablock)
loader.load(target, data)
@staticmethod
def construct(data: dict) -> object:
return bpy.data.textures.new(data["name"], data["type"])
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 1
@ -59,22 +52,15 @@ class BlTexture(ReplicatedDatablock):
'name_full'
]
data = dumper.dump(datablock)
color_ramp = getattr(datablock, 'color_ramp', None)
data = dumper.dump(instance)
color_ramp = getattr(instance, 'color_ramp', None)
if color_ramp:
dumper.depth = 4
data['color_ramp'] = dumper.dump(color_ramp)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.textures)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
deps = []
@ -84,14 +70,6 @@ class BlTexture(ReplicatedDatablock):
if image:
deps.append(image)
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = [T.WoodTexture, T.VoronoiTexture,
T.StucciTexture, T.NoiseTexture,
T.MusgraveTexture, T.MarbleTexture,
T.MagicTexture, T.ImageTexture,
T.DistortedNoiseTexture, T.CloudsTexture,
T.BlendTexture]
_class = BlTexture

View File

@ -21,26 +21,32 @@ import mathutils
from pathlib import Path
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
from .bl_datablock import BlDatablock, get_datablock_from_uuid
from .bl_material import dump_materials_slots, load_materials_slots
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
class BlVolume(ReplicatedDatablock):
use_delta = True
class BlVolume(BlDatablock):
bl_id = "volumes"
bl_class = bpy.types.Volume
bl_check_common = False
bl_icon = 'VOLUME_DATA'
bl_reload_parent = False
@staticmethod
def load(data: dict, datablock: object):
loader = Loader()
loader.load(target, data)
loader.load(target.display, data['display'])
# MATERIAL SLOTS
src_materials = data.get('materials', None)
if src_materials:
load_materials_slots(src_materials, target.materials)
def construct(data: dict) -> object:
return bpy.data.volumes.new(data["name"])
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
dumper = Dumper()
dumper.depth = 1
dumper.exclude_filter = [
@ -54,35 +60,17 @@ class BlVolume(ReplicatedDatablock):
'use_fake_user'
]
data = dumper.dump(datablock)
data = dumper.dump(instance)
data['display'] = dumper.dump(datablock.display)
data['display'] = dumper.dump(instance.display)
# Fix material index
data['materials'] = dump_materials_slots(datablock.materials)
data['animation_data'] = dump_animation_data(datablock)
data['materials'] = dump_materials_slots(instance.materials)
return data
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
loader = Loader()
loader.load(datablock, data)
loader.load(datablock.display, data['display'])
# MATERIAL SLOTS
src_materials = data.get('materials', None)
if src_materials:
load_materials_slots(src_materials, datablock.materials)
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.volumes)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
# TODO: resolve material
deps = []
external_vdb = Path(bpy.path.abspath(datablock.filepath))
@ -93,9 +81,6 @@ class BlVolume(ReplicatedDatablock):
if material:
deps.append(material)
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.Volume
_class = BlVolume

View File

@ -20,42 +20,35 @@ import bpy
import mathutils
from .dump_anything import Loader, Dumper
from replication.protocol import ReplicatedDatablock
from .bl_datablock import BlDatablock
from .bl_material import (load_node_tree,
dump_node_tree,
get_node_tree_dependencies)
from .bl_datablock import resolve_datablock_from_uuid
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
class BlWorld(ReplicatedDatablock):
use_delta = True
class BlWorld(BlDatablock):
bl_id = "worlds"
bl_class = bpy.types.World
bl_check_common = True
bl_icon = 'WORLD_DATA'
bl_reload_parent = False
@staticmethod
def construct(data: dict) -> object:
return bpy.data.worlds.new(data["name"])
@staticmethod
def load(data: dict, datablock: object):
load_animation_data(data.get('animation_data'), datablock)
loader = Loader()
loader.load(datablock, data)
loader.load(target, data)
if data["use_nodes"]:
if datablock.node_tree is None:
datablock.use_nodes = True
if target.node_tree is None:
target.use_nodes = True
load_node_tree(data['node_tree'], datablock.node_tree)
load_node_tree(data['node_tree'], target.node_tree)
@staticmethod
def dump(datablock: object) -> dict:
assert(instance)
world_dumper = Dumper()
world_dumper.depth = 1
world_dumper.include_filter = [
@ -63,17 +56,11 @@ class BlWorld(ReplicatedDatablock):
"name",
"color"
]
data = world_dumper.dump(datablock)
if datablock.use_nodes:
data['node_tree'] = dump_node_tree(datablock.node_tree)
data = world_dumper.dump(instance)
if instance.use_nodes:
data['node_tree'] = dump_node_tree(instance.node_tree)
data['animation_data'] = dump_animation_data(datablock)
return data
@staticmethod
def resolve(data: dict) -> object:
uuid = data.get('uuid')
return resolve_datablock_from_uuid(uuid, bpy.data.worlds)
@staticmethod
def resolve_deps(datablock: object) -> [object]:
@ -82,8 +69,4 @@ class BlWorld(ReplicatedDatablock):
if datablock.use_nodes:
deps.extend(get_node_tree_dependencies(datablock.node_tree))
deps.extend(resolve_animation_dependencies(datablock))
return deps
_type = bpy.types.World
_class = BlWorld

View File

@ -507,12 +507,16 @@ class Loader:
_constructors = {
T.ColorRampElement: (CONSTRUCTOR_NEW, ["position"]),
T.ParticleSettingsTextureSlot: (CONSTRUCTOR_ADD, []),
T.Modifier: (CONSTRUCTOR_NEW, ["name", "type"]),
T.GpencilModifier: (CONSTRUCTOR_NEW, ["name", "type"]),
T.Constraint: (CONSTRUCTOR_NEW, ["type"]),
}
destructors = {
T.ColorRampElement: DESTRUCTOR_REMOVE,
T.Modifier: DESTRUCTOR_CLEAR,
T.GpencilModifier: DESTRUCTOR_CLEAR,
T.Constraint: DESTRUCTOR_REMOVE,
}
element_type = element.bl_rna_property.fixed_type
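The `Loader` hunk above registers per-type construction and destruction strategies for synchronised collections (modifiers, constraints, color ramp elements). A minimal sketch of that type-keyed dispatch pattern with plain classes (the class and strategy names here are illustrative stand-ins for the `bpy.types` entries):

```python
CONSTRUCTOR_NEW, CONSTRUCTOR_ADD = "new", "add"

class Modifier:
    """Stand-in for T.Modifier: constructed from a name and a type."""
    def __init__(self, name, type):
        self.name, self.type = name, type

# Map a target type to (strategy, constructor argument names), as the
# Loader's _constructors table does for T.Modifier, T.Constraint, etc.
_constructors = {
    Modifier: (CONSTRUCTOR_NEW, ["name", "type"]),
}

def construct(cls, dumped):
    """Build an element from dumped data using the registered strategy."""
    strategy, arg_names = _constructors[cls]
    assert strategy == CONSTRUCTOR_NEW
    return cls(*[dumped[a] for a in arg_names])

m = construct(Modifier, {"name": "Subsurf", "type": "SUBSURF"})
assert (m.name, m.type) == ("Subsurf", "SUBSURF")
```

Keeping the constructor arguments in the table means the generic loader never needs per-type branches: adding GPencil modifiers or constraints, as this hunk does, is a one-line registration.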

View File

@ -27,14 +27,12 @@ import shutil
import string
import sys
import time
import traceback
from datetime import datetime
from operator import itemgetter
from pathlib import Path
from queue import Queue
from time import gmtime, strftime
from bpy.props import FloatProperty
import traceback
try:
import _pickle as pickle
@ -45,17 +43,15 @@ import bpy
import mathutils
from bpy.app.handlers import persistent
from bpy_extras.io_utils import ExportHelper, ImportHelper
from replication import porcelain
from replication.constants import (COMMITED, FETCHED, RP_COMMON, STATE_ACTIVE,
STATE_INITIAL, STATE_SYNCING, UP)
from replication.protocol import DataTranslationProtocol
from replication.exception import ContextError, NonAuthorizedOperationError
from replication.interface import session
from replication.objects import Node
from replication.protocol import DataTranslationProtocol
from replication import porcelain
from replication.repository import Repository
from . import bl_types, environment, shared_data, timers, ui, utils
from .handlers import on_scene_update, sanitize_deps_graph
from . import io_bpy, environment, timers, ui, utils
from .presence import SessionStatusWidget, renderer, view3d_find
from .timers import registry
@ -63,6 +59,7 @@ background_execution_queue = Queue()
deleyables = []
stop_modal_executor = False
def session_callback(name):
""" Session callback wrapper
@ -81,39 +78,41 @@ def session_callback(name):
def initialize_session():
"""Session connection init hander
"""
logging.info("Intializing the scene")
settings = utils.get_preferences()
runtime_settings = bpy.context.window_manager.session
if not runtime_settings.is_host:
logging.info("Intializing the scene")
# Step 1: Construct nodes
logging.info("Instantiating nodes")
for node in session.repository.index_sorted:
node_ref = session.repository.graph.get(node)
if node_ref is None:
logging.error(f"Can't construct node {node}")
elif node_ref.state == FETCHED:
node_ref.instance = session.repository.rdp.resolve(node_ref.data)
if node_ref.instance is None:
node_ref.instance = session.repository.rdp.construct(node_ref.data)
node_ref.instance.uuid = node_ref.uuid
# Step 1: Construct nodes
logging.info("Constructing nodes")
for node in session.repository.list_ordered():
node_ref = session.repository.get_node(node)
if node_ref is None:
logging.error(f"Can't construct node {node}")
elif node_ref.state == FETCHED:
node_ref.resolve()
# Step 2: Load nodes
logging.info("Applying nodes")
for node in session.repository.heads:
porcelain.apply(session.repository, node)
# Step 2: Load nodes
logging.info("Loading nodes")
for node in session.repository.list_ordered():
node_ref = session.repository.get_node(node)
if node_ref is None:
logging.error(f"Can't load node {node}")
elif node_ref.state == FETCHED:
node_ref.apply()
logging.info("Registering timers")
# Step 4: Register blender timers
for d in deleyables:
d.register()
bpy.ops.session.apply_armature_operator('INVOKE_DEFAULT')
# Step 5: Clearing history
utils.flush_history()
# Step 6: Launch deps graph update handling
bpy.app.handlers.depsgraph_update_post.append(on_scene_update)
bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
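The reworked `initialize_session` above loads the graph in two passes: first each `FETCHED` node is resolved to an existing datablock and only constructed on a miss, then the heads are applied in dependency order. A bpy-free sketch of that resolve-or-construct pass (the node layout and lookup dict are illustrative, not the replication API):

```python
FETCHED = "FETCHED"

existing = {"u1": "Scene"}   # stands in for resolving uuids against bpy.data
constructed = []

class Node:
    def __init__(self, uuid, state):
        self.uuid, self.state, self.instance = uuid, state, None

def resolve_or_construct(node):
    """First pass: reuse an existing datablock, else construct a new one."""
    node.instance = existing.get(node.uuid)
    if node.instance is None:
        node.instance = f"new:{node.uuid}"
        constructed.append(node.uuid)

nodes = [Node("u1", FETCHED), Node("u2", FETCHED)]
for n in nodes:
    if n.state == FETCHED:
        resolve_or_construct(n)

assert nodes[0].instance == "Scene"   # resolved, not rebuilt
assert constructed == ["u2"]          # only the missing node is constructed
```

Resolving before constructing is what makes rejoining a session non-destructive: datablocks that already exist locally are adopted instead of duplicated.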
@session_callback('on_exit')
@ -133,8 +132,8 @@ def on_connection_end(reason="none"):
stop_modal_executor = True
if on_scene_update in bpy.app.handlers.depsgraph_update_post:
bpy.app.handlers.depsgraph_update_post.remove(on_scene_update)
if depsgraph_evaluation in bpy.app.handlers.depsgraph_update_post:
bpy.app.handlers.depsgraph_update_post.remove(depsgraph_evaluation)
# Step 3: remove file handled
logger = logging.getLogger()
@ -142,7 +141,8 @@ def on_connection_end(reason="none"):
if isinstance(handler, logging.FileHandler):
logger.removeHandler(handler)
if reason != "user":
bpy.ops.session.notify('INVOKE_DEFAULT', message=f"Disconnected from session. Reason: {reason}. ")
bpy.ops.session.notify(
'INVOKE_DEFAULT', message=f"Disconnected from session. Reason: {reason}. ")
# OPERATORS
@ -163,7 +163,7 @@ class SessionStartOperator(bpy.types.Operator):
settings = utils.get_preferences()
runtime_settings = context.window_manager.session
users = bpy.data.window_managers['WinMan'].online_users
admin_pass = settings.password
admin_pass = runtime_settings.password
users.clear()
deleyables.clear()
@ -191,102 +191,81 @@ class SessionStartOperator(bpy.types.Operator):
handler.setFormatter(formatter)
bpy_protocol = bl_types.get_data_translation_protocol()
bpy_protocol = io_bpy.get_data_translation_protocol()
# Check if supported_datablocks are up to date before starting
# the session
for dcc_type_id in bpy_protocol.implementations.keys():
if dcc_type_id not in settings.supported_datablocks:
logging.info(f"{dcc_type_id} not found, \
for impl in bpy_protocol.implementations.values():
if impl.__name__ not in settings.supported_datablocks:
logging.info(f"{impl.__name__} not found, \
regenerate type settings...")
settings.generate_supported_types()
# Ensure blender 2.8 compatibility
if bpy.app.version[1] >= 91:
python_binary_path = sys.executable
else:
python_binary_path = bpy.app.binary_path_python
repo = Repository(
rdp=bpy_protocol,
username=settings.username)
# Host a session
# HOST
if self.host:
if settings.init_method == 'EMPTY':
utils.clean_scene()
runtime_settings.is_host = True
runtime_settings.internet_ip = environment.get_ip()
# Start the server locally
server = porcelain.serve(port=settings.port,
timeout=settings.connection_timeout,
admin_password=admin_pass,
log_directory=settings.cache_directory)
try:
# Init repository
for scene in bpy.data.scenes:
porcelain.add(repo, scene)
# Init repository
repo = porcelain.init(bare=False,
data_protocol=bpy_protocol)
porcelain.remote_add(
repo,
'origin',
'127.0.0.1',
settings.port,
admin_password=admin_pass)
session.host(
repository= repo,
remote='origin',
timeout=settings.connection_timeout,
password=admin_pass,
cache_directory=settings.cache_directory,
server_log_level=logging.getLevelName(
logging.getLogger().level),
)
except Exception as e:
self.report({'ERROR'}, repr(e))
logging.error(f"Error: {e}")
traceback.print_exc()
# Join a session
# Add the existing scenes
for scene in bpy.data.scenes:
porcelain.add(repo, scene)
porcelain.remote_add(repo,
'server',
'127.0.0.1',
settings.port)
porcelain.sync(repo, 'server')
porcelain.push(repo, 'server')
# JOIN
else:
if not runtime_settings.admin:
utils.clean_scene()
# regular session, no password needed
admin_pass = None
utils.clean_scene()
try:
porcelain.remote_add(
repo,
'origin',
settings.ip,
settings.port,
admin_password=admin_pass)
session.connect(
repository= repo,
timeout=settings.connection_timeout,
password=admin_pass
)
except Exception as e:
self.report({'ERROR'}, str(e))
logging.error(str(e))
repo = porcelain.clone(settings.ip, settings.port)
# Background client updates service
deleyables.append(timers.ClientUpdate())
deleyables.append(timers.DynamicRightSelectTimer())
deleyables.append(timers.ApplyTimer(timeout=settings.depsgraph_update_rate))
# deleyables.append(timers.ClientUpdate())
# deleyables.append(timers.DynamicRightSelectTimer())
# deleyables.append(timers.ApplyTimer(
# timeout=settings.depsgraph_update_rate))
# deleyables.append(timers.PushTimer(
# queue=stagging,
# timeout=settings.depsgraph_update_rate
# ))
# session_update = timers.SessionStatusUpdate()
# session_user_sync = timers.SessionUserSync()
# session_background_executor = timers.MainThreadExecutor(
# execution_queue=background_execution_queue)
# session_listen = timers.SessionListenTimer(timeout=0.001)
session_update = timers.SessionStatusUpdate()
session_user_sync = timers.SessionUserSync()
session_background_executor = timers.MainThreadExecutor(execution_queue=background_execution_queue)
session_listen = timers.SessionListenTimer(timeout=0.001)
# session_listen.register()
# session_update.register()
# session_user_sync.register()
# session_background_executor.register()
session_listen.register()
session_update.register()
session_user_sync.register()
session_background_executor.register()
deleyables.append(session_background_executor)
deleyables.append(session_update)
deleyables.append(session_user_sync)
deleyables.append(session_listen)
deleyables.append(timers.AnnotationUpdates())
# deleyables.append(session_background_executor)
# deleyables.append(session_update)
# deleyables.append(session_user_sync)
# deleyables.append(session_listen)
self.report(
{'INFO'},
f"connecting to tcp://{settings.ip}:{settings.port}")
return {"FINISHED"}
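The host branch of `SessionStartOperator` above now goes through the porcelain layer in a fixed order: serve, add each scene, register the remote, sync, then push. A sketch that records that call order with stubs (the function names follow the diff's `replication.porcelain` usage; the stub bodies and signatures are illustrative):

```python
calls = []

# Stubs standing in for replication.porcelain; only the order matters here.
def serve(**kw): calls.append("serve")
def add(repo, datablock): calls.append(f"add:{datablock}")
def remote_add(repo, name, ip, port): calls.append(f"remote_add:{name}")
def sync(repo, remote): calls.append(f"sync:{remote}")
def push(repo, remote): calls.append(f"push:{remote}")

def host(repo, scenes, port=5555):
    """Mirror the host flow: server first, then populate and publish."""
    serve(port=port)
    for scene in scenes:
        add(repo, scene)
    remote_add(repo, "server", "127.0.0.1", port)
    sync(repo, "server")
    push(repo, "server")

host(repo=None, scenes=["Scene"])
assert calls == ["serve", "add:Scene", "remote_add:server",
                 "sync:server", "push:server"]
```

Starting the server before adding scenes ensures the local repository has a live remote to sync against, so the initial push publishes the whole scene graph in one pass.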
@ -325,7 +304,6 @@ class SessionInitOperator(bpy.types.Operator):
porcelain.add(session.repository, scene)
session.init()
context.window_manager.session.is_host = True
return {"FINISHED"}
@ -372,7 +350,7 @@ class SessionKickOperator(bpy.types.Operator):
assert(session)
try:
porcelain.kick(session.repository, self.user)
session.kick(self.user)
except Exception as e:
self.report({'ERROR'}, repr(e))
@ -401,7 +379,7 @@ class SessionPropertyRemoveOperator(bpy.types.Operator):
def execute(self, context):
try:
porcelain.rm(session.repository, self.property_path)
session.remove(self.property_path)
return {"FINISHED"}
except: # NonAuthorizedOperationError:
@ -443,17 +421,10 @@ class SessionPropertyRightOperator(bpy.types.Operator):
runtime_settings = context.window_manager.session
if session:
if runtime_settings.clients == RP_COMMON:
porcelain.unlock(session.repository,
self.key,
session.change_owner(self.key,
runtime_settings.clients,
ignore_warnings=True,
affect_dependencies=self.recursive)
else:
porcelain.lock(session.repository,
self.key,
runtime_settings.clients,
ignore_warnings=True,
affect_dependencies=self.recursive)
return {"FINISHED"}
@ -568,7 +539,7 @@ class SessionSnapTimeOperator(bpy.types.Operator):
def modal(self, context, event):
is_running = context.window_manager.session.user_snap_running
if not is_running:
if event.type in {'RIGHTMOUSE', 'ESC'} or not is_running:
self.cancel(context)
return {'CANCELLED'}
@ -601,28 +572,22 @@ class SessionApply(bpy.types.Operator):
def execute(self, context):
logging.debug(f"Running apply on {self.target}")
try:
node_ref = session.repository.graph.get(self.target)
node_ref = session.repository.get_node(self.target)
porcelain.apply(session.repository,
self.target,
force=True)
impl = session.repository.rdp.get_implementation(node_ref.instance)
# NOTE: find another way to handle child and parent automatic reloading
if impl.bl_reload_parent:
for parent in session.repository.graph.get_parents(self.target):
force=True,
force_dependencies=self.reset_dependencies)
if node_ref.bl_reload_parent:
for parent in session.repository.get_parents(self.target):
logging.debug(f"Refresh parent {parent}")
porcelain.apply(session.repository,
parent.uuid,
force=True)
if hasattr(impl, 'bl_reload_child') and impl.bl_reload_child:
for dep in node_ref.dependencies:
porcelain.apply(session.repository,
dep,
force=True)
except Exception as e:
self.report({'ERROR'}, repr(e))
traceback.print_exc()
return {"CANCELLED"}
return {"CANCELLED"}
return {"FINISHED"}
@ -641,12 +606,55 @@ class SessionCommit(bpy.types.Operator):
def execute(self, context):
try:
porcelain.commit(session.repository, self.target)
porcelain.push(session.repository, 'origin', self.target, force=True)
porcelain.commit(session.repository, uuid=self.target)
session.push(self.target)
return {"FINISHED"}
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"CANCELLED"}
return {"CANCELED"}
class ApplyArmatureOperator(bpy.types.Operator):
"""Operator which runs its self from a timer"""
bl_idname = "session.apply_armature_operator"
bl_label = "Modal Executor Operator"
_timer = None
def modal(self, context, event):
global stop_modal_executor, modal_executor_queue
if stop_modal_executor:
self.cancel(context)
return {'CANCELLED'}
if event.type == 'TIMER':
if session and session.state == STATE_ACTIVE:
nodes = session.list(filter=io_bpy.bl_armature.BlArmature)
for node in nodes:
node_ref = session.repository.get_node(node)
if node_ref.state == FETCHED:
try:
porcelain.apply(session.repository, node)
except Exception as e:
logging.error("Fail to apply armature: {e}")
return {'PASS_THROUGH'}
def execute(self, context):
wm = context.window_manager
self._timer = wm.event_timer_add(2, window=context.window)
wm.modal_handler_add(self)
return {'RUNNING_MODAL'}
def cancel(self, context):
global stop_modal_executor
wm = context.window_manager
wm.event_timer_remove(self._timer)
stop_modal_executor = False
class SessionClearCache(bpy.types.Operator):
@ -690,7 +698,6 @@ class SessionPurgeOperator(bpy.types.Operator):
def execute(self, context):
try:
sanitize_deps_graph(remove_nodes=True)
porcelain.purge_orphan_nodes(session.repository)
except Exception as e:
self.report({'ERROR'}, repr(e))
@ -760,7 +767,7 @@ class SessionSaveBackupOperator(bpy.types.Operator, ExportHelper):
recorder.register()
deleyables.append(recorder)
else:
session.repository.dumps(self.filepath)
session.save(self.filepath)
return {'FINISHED'}
@ -803,25 +810,58 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
def execute(self, context):
from replication.repository import Repository
# init the factory with supported types
bpy_protocol = bl_types.get_data_translation_protocol()
repo = Repository(bpy_protocol)
repo.loads(self.filepath)
utils.clean_scene()
try:
f = gzip.open(self.filepath, "rb")
db = pickle.load(f)
except OSError as e:
f = open(self.filepath, "rb")
db = pickle.load(f)
nodes = [repo.graph.get(n) for n in repo.index_sorted]
if db:
logging.info(f"Reading {self.filepath}")
nodes = db.get("nodes")
# Step 1: Construct nodes
for node in nodes:
node.instance = bpy_protocol.resolve(node.data)
if node.instance is None:
node.instance = bpy_protocol.construct(node.data)
node.instance.uuid = node.uuid
logging.info(f"{len(nodes)} Nodes to load")
# Step 2: Load nodes
for node in nodes:
porcelain.apply(repo, node.uuid)
# init the factory with supported types
bpy_protocol = DataTranslationProtocol()
for type in io_bpy.types_to_register():
type_module = getattr(io_bpy, type)
name = [e.capitalize() for e in type.split('_')[1:]]
type_impl_name = 'Bl'+''.join(name)
type_module_class = getattr(type_module, type_impl_name)
bpy_protocol.register_type(
type_module_class.bl_class,
type_module_class)
graph = Repository()
for node, node_data in nodes:
node_type = node_data.get('str_type')
impl = bpy_protocol.get_implementation_from_net(node_type)
if impl:
logging.info(f"Loading {node}")
instance = impl(owner=node_data['owner'],
uuid=node,
dependencies=node_data['dependencies'],
data=node_data['data'])
graph.do_commit(instance)
instance.state = FETCHED
logging.info("Graph successfully loaded")
utils.clean_scene()
# Step 1: Construct nodes
for node in graph.list_ordered():
graph[node].resolve()
# Step 2: Load nodes
for node in graph.list_ordered():
graph[node].apply()
return {'FINISHED'}
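The load path above works in two passes over a dependency-ordered node list (`repo.index_sorted` / `graph.list_ordered()`): construct every datablock first, then apply them so cross-references already exist. A minimal sketch of that pattern, with a plain dict standing in for the replication `Repository` (field names here are hypothetical):

```python
def list_ordered(graph):
    # Depth-first post-order: a node's dependencies always come before
    # the node itself, mirroring Repository.index_sorted/list_ordered().
    visited, order = set(), []

    def visit(uid):
        if uid in visited:
            return
        visited.add(uid)
        for dep in graph[uid]["dependencies"]:
            visit(dep)
        order.append(uid)

    for uid in graph:
        visit(uid)
    return order

def load(graph):
    order = list_ordered(graph)
    for uid in order:   # Step 1: construct/resolve every node
        graph[uid]["instance"] = f"constructed:{uid}"
    for uid in order:   # Step 2: apply, dependencies first
        graph[uid]["applied"] = True
    return order
```

Constructing everything before applying anything is what lets a node safely point at its dependencies during the apply step.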
@ -829,78 +869,10 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
def poll(cls, context):
return True
class SessionPresetServerAdd(bpy.types.Operator):
"""Add a server to the server preset list"""
bl_idname = "session.preset_server_add"
bl_label = "add server preset"
bl_description = "add the current server to the server preset list"
bl_options = {"REGISTER"}
name : bpy.props.StringProperty(default="server_preset")
@classmethod
def poll(cls, context):
return True
def invoke(self, context, event):
assert(context)
return context.window_manager.invoke_props_dialog(self)
def draw(self, context):
layout = self.layout
col = layout.column()
settings = utils.get_preferences()
col.prop(settings, "server_name", text="server name")
def execute(self, context):
assert(context)
settings = utils.get_preferences()
existing_preset = settings.server_preset.get(settings.server_name)
new_server = existing_preset if existing_preset else settings.server_preset.add()
new_server.name = settings.server_name
new_server.server_ip = settings.ip
new_server.server_port = settings.port
new_server.server_password = settings.password
settings.server_preset_interface = settings.server_name
if new_server == existing_preset :
self.report({'INFO'}, "Server '" + settings.server_name + "' overridden")
else :
self.report({'INFO'}, "New '" + settings.server_name + "' server preset")
return {'FINISHED'}
class SessionPresetServerRemove(bpy.types.Operator):
"""Remove a server from the server preset list"""
bl_idname = "session.preset_server_remove"
bl_label = "remove server preset"
bl_description = "remove the current server from the server preset list"
bl_options = {"REGISTER"}
@classmethod
def poll(cls, context):
return True
def execute(self, context):
assert(context)
settings = utils.get_preferences()
settings.server_preset.remove(settings.server_preset.find(settings.server_preset_interface))
return {'FINISHED'}
def menu_func_import(self, context):
self.layout.operator(SessionLoadSaveOperator.bl_idname, text='Multi-user session snapshot (.db)')
self.layout.operator(SessionLoadSaveOperator.bl_idname,
text='Multi-user session snapshot (.db)')
classes = (
@ -912,25 +884,132 @@ classes = (
SessionPropertyRightOperator,
SessionApply,
SessionCommit,
ApplyArmatureOperator,
SessionKickOperator,
SessionInitOperator,
SessionClearCache,
SessionNotifyOperator,
SessionNotifyOperator,
SessionSaveBackupOperator,
SessionLoadSaveOperator,
SessionStopAutoSaveOperator,
SessionPurgeOperator,
SessionPresetServerAdd,
SessionPresetServerRemove,
)
def update_external_dependencies():
nodes_ids = session.list(filter=io_bpy.bl_file.BlFile)
for node_id in nodes_ids:
node = session.repository.get_node(node_id)
if node and node.owner in [session.id, RP_COMMON] \
and node.has_changed():
porcelain.commit(session.repository, node_id)
session.push(node_id, check_data=False)
def sanitize_deps_graph(remove_nodes: bool = False):
""" Cleanup the replication graph
"""
if session and session.state == STATE_ACTIVE:
start = utils.current_milli_time()
rm_cpt = 0
for node_key in session.list():
node = session.repository.get_node(node_key)
if node is None \
or (node.state == UP and not node.resolve(construct=False)):
if remove_nodes:
try:
session.remove(node.uuid, remove_dependencies=False)
logging.info(f"Removing {node.uuid}")
rm_cpt += 1
except NonAuthorizedOperationError:
continue
logging.info(f"Sanitize took {utils.current_milli_time()-start} ms")
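`sanitize_deps_graph` above walks every node and drops the ones whose Blender datablock can no longer be resolved (a stale pointer after an undo, for instance). A standalone sketch of that pruning, using plain dicts in place of replication nodes (field names are hypothetical):

```python
UP = 2  # stand-in value for replication.constants.UP

def sanitize(nodes, remove_nodes=True):
    # Remove committed nodes whose datablock pointer cannot be resolved.
    removed = []
    for uuid, node in list(nodes.items()):
        dead = node is None or (node["state"] == UP and not node["resolvable"])
        if dead and remove_nodes:
            del nodes[uuid]
            removed.append(uuid)
    return removed
```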
@persistent
def resolve_deps_graph(dummy):
"""Resolve deps graph
Temporary solution to resolve each node's pointers after an Undo.
A future solution should be to avoid storing datablock references...
"""
if session and session.state == STATE_ACTIVE:
sanitize_deps_graph(remove_nodes=True)
@persistent
def load_pre_handler(dummy):
if session and session.state in [STATE_ACTIVE, STATE_SYNCING]:
bpy.ops.session.stop()
@persistent
def update_client_frame(scene):
if session and session.state == STATE_ACTIVE:
session.update_user_metadata({
'frame_current': scene.frame_current
})
@persistent
def depsgraph_evaluation(scene):
if session and session.state == STATE_ACTIVE:
context = bpy.context
blender_depsgraph = bpy.context.view_layer.depsgraph
dependency_updates = [u for u in blender_depsgraph.updates]
settings = utils.get_preferences()
update_external_dependencies()
# NOTE: maybe we don't need to check each update but only the first
for update in reversed(dependency_updates):
# Is the object tracked ?
if update.id.uuid:
# Retrieve local version
node = session.repository.get_node(update.id.uuid)
# Check our right on this update:
# - if its ours or ( under common and diff), launch the
# update process
# - if its to someone else, ignore the update
if node and (node.owner == session.id or node.bl_check_common):
if node.state == UP:
try:
if node.has_changed():
porcelain.commit(session.repository, node.uuid)
session.push(node.uuid, check_data=False)
except ReferenceError:
logging.debug(f"Reference error {node.uuid}")
except ContextError as e:
logging.debug(e)
except Exception as e:
logging.error(e)
else:
continue
# A new scene is created
elif isinstance(update.id, bpy.types.Scene):
ref = session.repository.get_node_by_datablock(update.id)
if ref:
ref.resolve()
else:
scn_uuid = porcelain.add(session.repository, update.id)
porcelain.commit(session.repository, scn_uuid)
porcelain.push(session.repository)
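The depsgraph handler above only commits and pushes updates for nodes the local user owns or that are under `RP_COMMON`. That ownership gate can be sketched on its own (plain dicts stand in for replication nodes; field names are assumptions):

```python
RP_COMMON = "COMMON"  # stand-in for replication.constants.RP_COMMON

def pushable_updates(update_uuids, nodes, local_id):
    # Keep only tracked updates the local user is allowed to push:
    # either we own the node, or it is under common ownership.
    allowed = []
    for uuid in update_uuids:
        node = nodes.get(uuid)
        if node and node["owner"] in (local_id, RP_COMMON):
            allowed.append(uuid)
    return allowed
```

Updates owned by another user fall through and are ignored, which is exactly the `else: continue` branch above.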
def register():
from bpy.utils import register_class
for cls in classes:
for cls in classes:
register_class(cls)
bpy.app.handlers.undo_post.append(resolve_deps_graph)
bpy.app.handlers.redo_post.append(resolve_deps_graph)
bpy.app.handlers.load_pre.append(load_pre_handler)
bpy.app.handlers.frame_change_pre.append(update_client_frame)
def unregister():
if session and session.state == STATE_ACTIVE:
@ -939,3 +1018,9 @@ def unregister():
from bpy.utils import unregister_class
for cls in reversed(classes):
unregister_class(cls)
bpy.app.handlers.undo_post.remove(resolve_deps_graph)
bpy.app.handlers.redo_post.remove(resolve_deps_graph)
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.frame_change_pre.remove(update_client_frame)


@ -24,7 +24,7 @@ import os
from pathlib import Path
from . import bl_types, environment, addon_updater_ops, presence, ui
from . import io_bpy, environment, addon_updater_ops, presence, ui
from .utils import get_preferences, get_expanded_icon
from replication.constants import RP_COMMON
from replication.interface import session
@ -33,19 +33,6 @@ from replication.interface import session
IP_REGEX = re.compile("^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$")
HOSTNAME_REGEX = re.compile("^(([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9\-]*[a-zA-Z0-9])\.)*([A-Za-z0-9]|[A-Za-z0-9][A-Za-z0-9\-]*[A-Za-z0-9])$")
DEFAULT_PRESETS = {
"localhost" : {
"server_ip": "localhost",
"server_port": 5555,
"server_password": "admin"
},
"public session" : {
"server_ip": "51.75.71.183",
"server_port": 5555,
"server_password": ""
},
}
def randomColor():
"""Generate a random color """
r = random.random()
@ -78,11 +65,8 @@ def update_ip(self, context):
logging.error("Wrong IP format")
self['ip'] = "127.0.0.1"
def update_server_preset_interface(self, context):
self.server_name = self.server_preset.get(self.server_preset_interface).name
self.ip = self.server_preset.get(self.server_preset_interface).server_ip
self.port = self.server_preset.get(self.server_preset_interface).server_port
self.password = self.server_preset.get(self.server_preset_interface).server_password
def update_directory(self, context):
new_dir = Path(self.cache_directory)
@ -109,10 +93,6 @@ class ReplicatedDatablock(bpy.types.PropertyGroup):
auto_push: bpy.props.BoolProperty(default=True)
icon: bpy.props.StringProperty()
class ServerPreset(bpy.types.PropertyGroup):
server_ip: bpy.props.StringProperty()
server_port: bpy.props.IntProperty(default=5555)
server_password: bpy.props.StringProperty(default="admin", subtype = "PASSWORD")
def set_sync_render_settings(self, value):
self['sync_render_settings'] = value
@ -165,7 +145,7 @@ class SessionPrefs(bpy.types.AddonPreferences):
ip: bpy.props.StringProperty(
name="ip",
description='Distant host ip',
default="localhost",
default="127.0.0.1",
update=update_ip)
username: bpy.props.StringProperty(
name="Username",
@ -180,17 +160,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
description='Distant host port',
default=5555
)
server_name: bpy.props.StringProperty(
name="server_name",
description="Custom name of the server",
default='localhost',
)
password: bpy.props.StringProperty(
name="password",
default=random_string_digits(),
description='Session password',
subtype='PASSWORD'
)
sync_flags: bpy.props.PointerProperty(
type=ReplicationFlags
)
@ -273,13 +242,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
step=1,
subtype='PERCENTAGE',
)
presence_mode_distance: bpy.props.FloatProperty(
name="Distance mode visibility",
description="Adjust the distance visibility of user's mode",
min=0.1,
max=1000,
default=100,
)
conf_session_identity_expanded: bpy.props.BoolProperty(
name="Identity",
description="Identity",
@ -359,25 +321,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
max=59
)
# Server preset
def server_list_callback(scene, context):
settings = get_preferences()
enum = []
for i in settings.server_preset:
enum.append((i.name, i.name, ""))
return enum
server_preset: bpy.props.CollectionProperty(
name="server preset",
type=ServerPreset,
)
server_preset_interface: bpy.props.EnumProperty(
name="servers",
description="servers enum",
items=server_list_callback,
update=update_server_preset_interface,
)
# Custom panel
panel_category: bpy.props.StringProperty(
description="Choose a name for the category of the panel",
@ -453,11 +396,10 @@ class SessionPrefs(bpy.types.AddonPreferences):
col = box.column(align=True)
col.prop(self, "presence_hud_scale", expand=True)
col.prop(self, "presence_hud_hpos", expand=True)
col.prop(self, "presence_hud_vpos", expand=True)
col.prop(self, "presence_mode_distance", expand=True)
if self.category == 'UPDATE':
from . import addon_updater_ops
addon_updater_ops.update_settings_ui(self, context)
@ -465,32 +407,19 @@ class SessionPrefs(bpy.types.AddonPreferences):
def generate_supported_types(self):
self.supported_datablocks.clear()
bpy_protocol = bl_types.get_data_translation_protocol()
bpy_protocol = io_bpy.get_data_translation_protocol()
# init the factory with supported types
for dcc_type_id, impl in bpy_protocol.implementations.items():
for impl in bpy_protocol.implementations.values():
new_db = self.supported_datablocks.add()
new_db.name = dcc_type_id
new_db.type_name = dcc_type_id
new_db.name = impl.__name__
new_db.type_name = impl.__name__
new_db.use_as_filter = True
new_db.icon = impl.bl_icon
new_db.bl_name = impl.bl_id
# custom at launch server preset
def generate_default_presets(self):
for preset_name, preset_data in DEFAULT_PRESETS.items():
existing_preset = self.server_preset.get(preset_name)
if existing_preset :
continue
new_server = self.server_preset.add()
new_server.name = preset_name
new_server.server_ip = preset_data.get('server_ip')
new_server.server_port = preset_data.get('server_port')
new_server.server_password = preset_data.get('server_password',None)
def client_list_callback(scene, context):
from . import operators
@ -546,11 +475,6 @@ class SessionProps(bpy.types.PropertyGroup):
description='Enable user overlay ',
default=True,
)
presence_show_mode: bpy.props.BoolProperty(
name="Show users current mode",
description='Enable user mode overlay ',
default=False,
)
presence_show_far_user: bpy.props.BoolProperty(
name="Show users on different scenes",
description="Show user on different scenes",
@ -566,20 +490,16 @@ class SessionProps(bpy.types.PropertyGroup):
description='Show only owned datablocks',
default=True
)
filter_name: bpy.props.StringProperty(
name="filter_name",
default="",
description='Node name filter',
)
admin: bpy.props.BoolProperty(
name="admin",
description='Connect as admin',
default=False
)
internet_ip: bpy.props.StringProperty(
name="internet ip",
default="not found",
description='Internet interface ip',
password: bpy.props.StringProperty(
name="password",
default=random_string_digits(),
description='Session password',
subtype='PASSWORD'
)
user_snap_running: bpy.props.BoolProperty(
default=False
@ -587,17 +507,15 @@ class SessionProps(bpy.types.PropertyGroup):
time_snap_running: bpy.props.BoolProperty(
default=False
)
is_host: bpy.props.BoolProperty(
default=False
)
def get_preferences():
return bpy.context.preferences.addons[__package__].preferences
classes = (
SessionUser,
SessionProps,
ReplicationFlags,
ReplicatedDatablock,
ServerPreset,
SessionPrefs,
)
@ -610,12 +528,8 @@ def register():
prefs = bpy.context.preferences.addons[__package__].preferences
if len(prefs.supported_datablocks) == 0:
logging.debug('Generating bl_types preferences')
logging.debug('Generating io_bpy preferences')
prefs.generate_supported_types()
# at launch server presets
prefs.generate_default_presets()
def unregister():


@ -94,41 +94,15 @@ def project_to_viewport(region: bpy.types.Region, rv3d: bpy.types.RegionView3D,
return [target.x, target.y, target.z]
def bbox_from_obj(obj: bpy.types.Object, index: int = 1) -> list:
def bbox_from_obj(obj: bpy.types.Object, radius: float) -> list:
""" Generate a bounding box for a given object by using its world matrix
:param obj: target object
:type obj: bpy.types.Object
:param index: index offset
:type index: int
:return: list of 8 points [(x,y,z),...], list of 12 links between these points [(1,2),...]
:param radius: bounding box radius
:type radius: float
:return: list of 8 points [(x,y,z),...]
"""
radius = 1.0 # Radius of the bounding box
index = 8*index
vertex_indices = (
(0+index, 1+index), (0+index, 2+index), (1+index, 3+index), (2+index, 3+index),
(4+index, 5+index), (4+index, 6+index), (5+index, 7+index), (6+index, 7+index),
(0+index, 4+index), (1+index, 5+index), (2+index, 6+index), (3+index, 7+index))
if obj.type == 'EMPTY':
radius = obj.empty_display_size
elif obj.type == 'LIGHT':
radius = obj.data.shadow_soft_size
elif obj.type == 'LIGHT_PROBE':
radius = obj.data.influence_distance
elif obj.type == 'CAMERA':
radius = obj.data.display_size
elif hasattr(obj, 'bound_box'):
vertex_indices = (
(0+index, 1+index), (1+index, 2+index),
(2+index, 3+index), (0+index, 3+index),
(4+index, 5+index), (5+index, 6+index),
(6+index, 7+index), (4+index, 7+index),
(0+index, 4+index), (1+index, 5+index),
(2+index, 6+index), (3+index, 7+index))
vertex_pos = get_bb_coords_from_obj(obj)
return vertex_pos, vertex_indices
coords = [
(-radius, -radius, -radius), (+radius, -radius, -radius),
(-radius, +radius, -radius), (+radius, +radius, -radius),
@ -138,32 +112,9 @@ def bbox_from_obj(obj: bpy.types.Object, index: int = 1) -> list:
base = obj.matrix_world
bbox_corners = [base @ mathutils.Vector(corner) for corner in coords]
vertex_pos = [(point.x, point.y, point.z) for point in bbox_corners]
return [(point.x, point.y, point.z)
for point in bbox_corners]
return vertex_pos, vertex_indices
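The refactored `bbox_from_obj` above reduces to eight cube corners of a per-type radius, transformed by the object's world matrix. The corner layout can be sketched without Blender's `mathutils` by using a translation only (a simplification of the full matrix transform):

```python
def bbox_corners(radius, location=(0.0, 0.0, 0.0)):
    # Eight corners of an axis-aligned cube of the given radius,
    # offset by a location (stand-in for the matrix_world transform).
    x0, y0, z0 = location
    return [(x0 + sx * radius, y0 + sy * radius, z0 + sz * radius)
            for sz in (-1, 1) for sy in (-1, 1) for sx in (-1, 1)]

# The 12 edges linking those corners, as used by the LINES batch.
BBOX_EDGES = ((0, 1), (0, 2), (1, 3), (2, 3),
              (4, 5), (4, 6), (5, 7), (6, 7),
              (0, 4), (1, 5), (2, 6), (3, 7))
```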
def bbox_from_instance_collection(ic: bpy.types.Object, index: int = 0) -> list:
""" Generate a bounding box for a given instance collection by using its objects
:param ic: target instance collection
:type ic: bpy.types.Object
:param index: index offset
:type index: int
:return: list of 8*objs points [(x,y,z),...], tuple of 12*objs links between these points [(1,2),...]
"""
vertex_pos = []
vertex_indices = ()
for obj_index, obj in enumerate(ic.instance_collection.objects):
vertex_pos_temp, vertex_indices_temp = bbox_from_obj(obj, index=index+obj_index)
vertex_pos += vertex_pos_temp
vertex_indices += vertex_indices_temp
bbox_corners = [ic.matrix_world @ mathutils.Vector(vertex) for vertex in vertex_pos]
vertex_pos = [(point.x, point.y, point.z) for point in bbox_corners]
return vertex_pos, vertex_indices
def generate_user_camera() -> list:
""" Generate a basic camera representation of the user's point of view
@ -224,7 +175,7 @@ def get_bb_coords_from_obj(object: bpy.types.Object, instance: bpy.types.Object
bbox_corners = [base @ mathutils.Vector(
corner) for corner in object.bound_box]
return [(point.x, point.y, point.z) for point in bbox_corners]
@ -252,13 +203,6 @@ class Widget(object):
"""
return True
def configure_bgl(self):
bgl.glLineWidth(2.)
bgl.glEnable(bgl.GL_DEPTH_TEST)
bgl.glEnable(bgl.GL_BLEND)
bgl.glEnable(bgl.GL_LINE_SMOOTH)
def draw(self):
"""How to draw the widget
"""
@ -312,6 +256,11 @@ class UserFrustumWidget(Widget):
{"pos": positions},
indices=self.indices)
bgl.glLineWidth(2.)
bgl.glEnable(bgl.GL_DEPTH_TEST)
bgl.glEnable(bgl.GL_BLEND)
bgl.glEnable(bgl.GL_LINE_SMOOTH)
shader.bind()
shader.uniform_float("color", self.data.get('color'))
batch.draw(shader)
@ -323,8 +272,6 @@ class UserSelectionWidget(Widget):
username):
self.username = username
self.settings = bpy.context.window_manager.session
self.current_selection_ids = []
self.current_selected_objects = []
@property
def data(self):
@ -334,15 +281,6 @@ class UserSelectionWidget(Widget):
else:
return None
@property
def selected_objects(self):
user_selection = self.data.get('selected_objects')
if self.current_selection_ids != user_selection:
self.current_selected_objects = [find_from_attr("uuid", uid, bpy.data.objects) for uid in user_selection]
self.current_selection_ids = user_selection
return self.current_selected_objects
def poll(self):
if self.data is None:
return False
@ -357,31 +295,48 @@ class UserSelectionWidget(Widget):
self.settings.enable_presence
def draw(self):
vertex_pos = []
vertex_ind = []
collection_offset = 0
for obj_index, obj in enumerate(self.selected_objects):
if obj is None:
continue
obj_index+=collection_offset
if hasattr(obj, 'instance_collection') and obj.instance_collection:
bbox_pos, bbox_ind = bbox_from_instance_collection(obj, index=obj_index)
collection_offset+=len(obj.instance_collection.objects)-1
else :
bbox_pos, bbox_ind = bbox_from_obj(obj, index=obj_index)
vertex_pos += bbox_pos
vertex_ind += bbox_ind
user_selection = self.data.get('selected_objects')
for select_ob in user_selection:
ob = find_from_attr("uuid", select_ob, bpy.data.objects)
if not ob:
return
shader = gpu.shader.from_builtin('3D_UNIFORM_COLOR')
batch = batch_for_shader(
shader,
'LINES',
{"pos": vertex_pos},
indices=vertex_ind)
vertex_pos = bbox_from_obj(ob, 1.0)
vertex_indices = ((0, 1), (0, 2), (1, 3), (2, 3),
(4, 5), (4, 6), (5, 7), (6, 7),
(0, 4), (1, 5), (2, 6), (3, 7))
if ob.instance_collection:
for obj in ob.instance_collection.objects:
if obj.type == 'MESH' and hasattr(obj, 'bound_box'):
vertex_pos = get_bb_coords_from_obj(obj, instance=ob)
break
elif ob.type == 'EMPTY':
vertex_pos = bbox_from_obj(ob, ob.empty_display_size)
elif ob.type == 'LIGHT':
vertex_pos = bbox_from_obj(ob, ob.data.shadow_soft_size)
elif ob.type == 'LIGHT_PROBE':
vertex_pos = bbox_from_obj(ob, ob.data.influence_distance)
elif ob.type == 'CAMERA':
vertex_pos = bbox_from_obj(ob, ob.data.display_size)
elif hasattr(ob, 'bound_box'):
vertex_indices = (
(0, 1), (1, 2), (2, 3), (0, 3),
(4, 5), (5, 6), (6, 7), (4, 7),
(0, 4), (1, 5), (2, 6), (3, 7))
vertex_pos = get_bb_coords_from_obj(ob)
shader = gpu.shader.from_builtin('3D_UNIFORM_COLOR')
batch = batch_for_shader(
shader,
'LINES',
{"pos": vertex_pos},
indices=vertex_indices)
shader.bind()
shader.uniform_float("color", self.data.get('color'))
batch.draw(shader)
shader.bind()
shader.uniform_float("color", self.data.get('color'))
batch.draw(shader)
class UserNameWidget(Widget):
draw_type = 'POST_PIXEL'
@ -425,62 +380,6 @@ class UserNameWidget(Widget):
blf.color(0, color[0], color[1], color[2], color[3])
blf.draw(0, self.username)
class UserModeWidget(Widget):
draw_type = 'POST_PIXEL'
def __init__(
self,
username):
self.username = username
self.settings = bpy.context.window_manager.session
self.preferences = get_preferences()
@property
def data(self):
user = session.online_users.get(self.username)
if user:
return user.get('metadata')
else:
return None
def poll(self):
if self.data is None:
return False
scene_current = self.data.get('scene_current')
mode_current = self.data.get('mode_current')
user_selection = self.data.get('selected_objects')
return (scene_current == bpy.context.scene.name or
mode_current == bpy.context.mode or
self.settings.presence_show_far_user) and \
user_selection and \
self.settings.presence_show_mode and \
self.settings.enable_presence
def draw(self):
user_selection = self.data.get('selected_objects')
area, region, rv3d = view3d_find()
viewport_coord = project_to_viewport(region, rv3d, (0, 0))
obj = find_from_attr("uuid", user_selection[0], bpy.data.objects)
if not obj:
return
mode_current = self.data.get('mode_current')
color = self.data.get('color')
origin_coord = project_to_screen(obj.location)
distance_viewport_object = math.sqrt((viewport_coord[0]-obj.location[0])**2+(viewport_coord[1]-obj.location[1])**2+(viewport_coord[2]-obj.location[2])**2)
if distance_viewport_object > self.preferences.presence_mode_distance :
return
if origin_coord :
blf.position(0, origin_coord[0]+8, origin_coord[1]-15, 0)
blf.size(0, 16, 72)
blf.color(0, color[0], color[1], color[2], color[3])
blf.draw(0, mode_current)
class SessionStatusWidget(Widget):
draw_type = 'POST_PIXEL'
@ -563,7 +462,6 @@ class DrawFactory(object):
try:
for widget in self.widgets.values():
if widget.draw_type == 'POST_VIEW' and widget.poll():
widget.configure_bgl()
widget.draw()
except Exception as e:
logging.error(
@ -573,7 +471,6 @@ class DrawFactory(object):
try:
for widget in self.widgets.values():
if widget.draw_type == 'POST_PIXEL' and widget.poll():
widget.configure_bgl()
widget.draw()
except Exception as e:
logging.error(
@ -586,7 +483,6 @@ this.renderer = DrawFactory()
def register():
this.renderer.register_handlers()
this.renderer.add_widget("session_status", SessionStatusWidget())


@ -1,48 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
from replication.constants import STATE_INITIAL
class SessionData():
""" A structure to easily share the current session data across the addon
modules.
This object will completely replace the Singleton lying in replication
interface module.
"""
def __init__(self):
self.repository = None # The current repository
self.remote = None # The active remote
self.server = None
self.applied_updates = []
@property
def state(self):
if self.remote is None:
return STATE_INITIAL
else:
return self.remote.connection_status
def clear(self):
self.remote = None
self.repository = None
self.server = None
self.applied_updates = []
session = SessionData()


@ -27,12 +27,10 @@ from replication.interface import session
from replication import porcelain
from . import operators, utils
from .presence import (UserFrustumWidget, UserNameWidget, UserModeWidget, UserSelectionWidget,
from .presence import (UserFrustumWidget, UserNameWidget, UserSelectionWidget,
generate_user_camera, get_view_matrix, refresh_3d_view,
refresh_sidebar_view, renderer)
from . import shared_data
this = sys.modules[__name__]
# Registered timers
@ -41,8 +39,7 @@ this.registry = dict()
def is_annotating(context: bpy.types.Context):
""" Check if the annotate mode is enabled
"""
active_tool = bpy.context.workspace.tools.from_space_view3d_mode('OBJECT', create=False)
return (active_tool and active_tool.idname == 'builtin.annotate')
return bpy.context.workspace.tools.from_space_view3d_mode('OBJECT', create=False).idname == 'builtin.annotate'
class Timer(object):
@ -75,7 +72,6 @@ class Timer(object):
except Exception as e:
logging.error(e)
self.unregister()
traceback.print_exc()
session.disconnect(reason=f"Error during timer {self.id} execution")
else:
if self.is_running:
@ -92,7 +88,7 @@ class Timer(object):
if bpy.app.timers.is_registered(self.main):
logging.info(f"Unregistering {self.id}")
bpy.app.timers.unregister(self.main)
del this.registry[self.id]
self.is_running = False
@ -103,7 +99,7 @@ class SessionBackupTimer(Timer):
def execute(self):
session.repository.dumps(self._filepath)
session.save(self._filepath)
class SessionListenTimer(Timer):
def execute(self):
@ -112,76 +108,32 @@ class SessionListenTimer(Timer):
class ApplyTimer(Timer):
def execute(self):
if session and session.state == STATE_ACTIVE:
for node in session.repository.graph.keys():
node_ref = session.repository.graph.get(node)
nodes = session.list()
for node in nodes:
node_ref = session.repository.get_node(node)
if node_ref.state == FETCHED:
try:
shared_data.session.applied_updates.append(node)
porcelain.apply(session.repository, node)
except Exception as e:
logging.error(f"Fail to apply {node_ref.uuid}")
traceback.print_exc()
else:
impl = session.repository.rdp.get_implementation(node_ref.instance)
if impl.bl_reload_parent:
for parent in session.repository.graph.get_parents(node):
if node_ref.bl_reload_parent:
for parent in session.repository.get_parents(node):
logging.debug(f"Refresh parent {node}")
porcelain.apply(session.repository,
parent.uuid,
force=True)
if hasattr(impl, 'bl_reload_child') and impl.bl_reload_child:
for dep in node_ref.dependencies:
porcelain.apply(session.repository,
dep,
force=True)
class AnnotationUpdates(Timer):
def __init__(self, timeout=1):
self._annotating = False
self._settings = utils.get_preferences()
super().__init__(timeout)
def execute(self):
if session and session.state == STATE_ACTIVE:
ctx = bpy.context
annotation_gp = ctx.scene.grease_pencil
if annotation_gp and not annotation_gp.uuid:
ctx.scene.update_tag()
# if an annotation exist and is tracked
if annotation_gp and annotation_gp.uuid:
registered_gp = session.repository.graph.get(annotation_gp.uuid)
if is_annotating(bpy.context):
# try to get the right on it
if registered_gp.owner == RP_COMMON:
self._annotating = True
logging.debug(
"Getting the right on the annotation GP")
porcelain.lock(session.repository,
[registered_gp.uuid],
ignore_warnings=True,
affect_dependencies=False)
if registered_gp.owner == self._settings.username:
porcelain.commit(session.repository, annotation_gp.uuid)
porcelain.push(session.repository, 'origin', annotation_gp.uuid)
elif self._annotating:
porcelain.unlock(session.repository,
[registered_gp.uuid],
ignore_warnings=True,
affect_dependencies=False)
self._annotating = False
class DynamicRightSelectTimer(Timer):
def __init__(self, timeout=.1):
super().__init__(timeout)
self._last_selection = set()
self._last_selection = []
self._user = None
self._annotating = False
def execute(self):
settings = utils.get_preferences()
@ -192,46 +144,88 @@ class DynamicRightSelectTimer(Timer):
self._user = session.online_users.get(settings.username)
if self._user:
current_selection = set(utils.get_selected_objects(
ctx = bpy.context
annotation_gp = ctx.scene.grease_pencil
if annotation_gp and not annotation_gp.uuid:
ctx.scene.update_tag()
# if an annotation exist and is tracked
if annotation_gp and annotation_gp.uuid:
registered_gp = session.repository.get_node(annotation_gp.uuid)
if is_annotating(bpy.context):
# try to get the right on it
if registered_gp.owner == RP_COMMON:
self._annotating = True
logging.debug(
"Getting the right on the annotation GP")
session.change_owner(
registered_gp.uuid,
settings.username,
ignore_warnings=True,
affect_dependencies=False)
if registered_gp.owner == settings.username:
gp_node = session.repository.get_node(annotation_gp.uuid)
if gp_node.has_changed():
porcelain.commit(session.repository, gp_node.uuid)
session.push(gp_node.uuid, check_data=False)
elif self._annotating:
session.change_owner(
registered_gp.uuid,
RP_COMMON,
ignore_warnings=True,
affect_dependencies=False)
current_selection = utils.get_selected_objects(
bpy.context.scene,
bpy.data.window_managers['WinMan'].windows[0].view_layer
))
)
if current_selection != self._last_selection:
to_lock = list(current_selection.difference(self._last_selection))
to_release = list(self._last_selection.difference(current_selection))
instances_to_lock = list()
obj_common = [
o for o in self._last_selection if o not in current_selection]
obj_ours = [
o for o in current_selection if o not in self._last_selection]
for node_id in to_lock:
node = session.repository.graph.get(node_id)
instance_mode = node.data.get('instance_type')
if instance_mode and instance_mode == 'COLLECTION':
to_lock.remove(node_id)
instances_to_lock.append(node_id)
if instances_to_lock:
try:
porcelain.lock(session.repository,
instances_to_lock,
ignore_warnings=True,
affect_dependencies=False)
except NonAuthorizedOperationError as e:
logging.warning(e)
# change old selection right to common
for obj in obj_common:
node = session.repository.get_node(obj)
if to_release:
try:
porcelain.unlock(session.repository,
to_release,
ignore_warnings=True,
affect_dependencies=True)
except NonAuthorizedOperationError as e:
logging.warning(e)
if to_lock:
try:
porcelain.lock(session.repository,
to_lock,
ignore_warnings=True,
affect_dependencies=True)
except NonAuthorizedOperationError as e:
logging.warning(e)
if node and (node.owner == settings.username or node.owner == RP_COMMON):
recursive = True
if node.data and 'instance_type' in node.data.keys():
recursive = node.data['instance_type'] != 'COLLECTION'
try:
session.change_owner(
node.uuid,
RP_COMMON,
ignore_warnings=True,
affect_dependencies=recursive)
except NonAuthorizedOperationError:
logging.warning(
f"Not authorized to change {node} owner")
# change new selection to our
for obj in obj_ours:
node = session.repository.get_node(obj)
if node and node.owner == RP_COMMON:
recursive = True
if node.data and 'instance_type' in node.data.keys():
recursive = node.data['instance_type'] != 'COLLECTION'
try:
session.change_owner(
node.uuid,
settings.username,
ignore_warnings=True,
affect_dependencies=recursive)
except NonAuthorizedOperationError:
logging.warning(
f"Not authorized to change {node} owner")
else:
return
self._last_selection = current_selection
@ -239,29 +233,32 @@ class DynamicRightSelectTimer(Timer):
'selected_objects': current_selection
}
porcelain.update_user_metadata(session.repository, user_metadata)
session.update_user_metadata(user_metadata)
logging.debug("Update selection")
# Fix deselection until rights management refactoring (with Roles concepts)
if len(current_selection) == 0 :
owned_keys = [k for k, v in session.repository.graph.items() if v.owner==settings.username]
if owned_keys:
owned_keys = session.list(
filter_owner=settings.username)
for key in owned_keys:
node = session.repository.get_node(key)
try:
porcelain.unlock(session.repository,
owned_keys,
ignore_warnings=True,
affect_dependencies=True)
except NonAuthorizedOperationError as e:
logging.warning(e)
session.change_owner(
key,
RP_COMMON,
ignore_warnings=True,
affect_dependencies=recursive)
except NonAuthorizedOperationError:
logging.warning(
f"Not authorized to change {key} owner")
# Objects selectability
for obj in bpy.data.objects:
object_uuid = getattr(obj, 'uuid', None)
if object_uuid:
is_selectable = not session.repository.is_node_readonly(object_uuid)
node = session.repository.get_node(object_uuid)
is_selectable = not node.owner in [settings.username, RP_COMMON]
if obj.hide_select != is_selectable:
obj.hide_select = is_selectable
shared_data.session.applied_updates.append(object_uuid)
class ClientUpdate(Timer):
@@ -275,8 +272,7 @@ class ClientUpdate(Timer):
if session and renderer:
if session.state in [STATE_ACTIVE, STATE_LOBBY]:
local_user = session.online_users.get(
settings.username)
local_user = session.online_users.get(settings.username)
if not local_user:
return
@@ -311,24 +307,20 @@ class ClientUpdate(Timer):
settings.client_color.b,
1),
'frame_current': bpy.context.scene.frame_current,
'scene_current': scene_current,
'mode_current': bpy.context.mode
'scene_current': scene_current
}
porcelain.update_user_metadata(session.repository, metadata)
session.update_user_metadata(metadata)
# Update client representation
# Update client current scene
elif scene_current != local_user_metadata['scene_current']:
local_user_metadata['scene_current'] = scene_current
porcelain.update_user_metadata(session.repository, local_user_metadata)
session.update_user_metadata(local_user_metadata)
elif 'view_corners' in local_user_metadata and current_view_corners != local_user_metadata['view_corners']:
local_user_metadata['view_corners'] = current_view_corners
local_user_metadata['view_matrix'] = get_view_matrix(
)
porcelain.update_user_metadata(session.repository, local_user_metadata)
elif bpy.context.mode != local_user_metadata['mode_current']:
local_user_metadata['mode_current'] = bpy.context.mode
porcelain.update_user_metadata(session.repository, local_user_metadata)
session.update_user_metadata(local_user_metadata)
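Each branch above samples a value (scene, view corners, mode), compares it against the cached user metadata, and republishes only on change. The underlying change-detection idea, sketched with a stub publisher (hypothetical names; the real code republishes the whole metadata dict through `porcelain.update_user_metadata`):

```python
def sync_metadata(cached, sampled, publish):
    """Publish only the fields of `sampled` that differ from the cache."""
    changed = {k: v for k, v in sampled.items() if cached.get(k) != v}
    if changed:
        cached.update(changed)
        publish(changed)
    return changed

pushed = []
cached = {"scene_current": "Scene", "frame_current": 1}
sync_metadata(cached, {"scene_current": "Scene.001", "frame_current": 1},
              pushed.append)
# only the changed field is pushed: [{'scene_current': 'Scene.001'}]
```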
class SessionStatusUpdate(Timer):
@@ -356,7 +348,6 @@ class SessionUserSync(Timer):
renderer.remove_widget(f"{user.username}_cam")
renderer.remove_widget(f"{user.username}_select")
renderer.remove_widget(f"{user.username}_name")
renderer.remove_widget(f"{user.username}_mode")
ui_users.remove(index)
break
@@ -372,8 +363,6 @@ class SessionUserSync(Timer):
f"{user}_select", UserSelectionWidget(user))
renderer.add_widget(
f"{user}_name", UserNameWidget(user))
renderer.add_widget(
f"{user}_mode", UserModeWidget(user))
class MainThreadExecutor(Timer):


@@ -107,12 +107,10 @@ class SESSION_PT_settings(bpy.types.Panel):
row = row.grid_flow(row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
row.prop(settings.sync_flags, "sync_render_settings",text="",icon_only=True, icon='SCENE')
row.prop(settings.sync_flags, "sync_during_editmode", text="",icon_only=True, icon='EDITMODE_HLT')
row.prop(settings.sync_flags, "sync_active_camera", text="",icon_only=True, icon='VIEW_CAMERA')
row.prop(settings.sync_flags, "sync_active_camera", text="",icon_only=True, icon='OBJECT_DATAMODE')
row= layout.row()
if current_state in [STATE_ACTIVE] and runtime_settings.is_host:
info_msg = f"LAN: {runtime_settings.internet_ip}"
if current_state == STATE_LOBBY:
info_msg = "Waiting for the session to start."
@@ -156,13 +154,7 @@ class SESSION_PT_settings_network(bpy.types.Panel):
row = layout.row()
row.prop(runtime_settings, "session_mode", expand=True)
row = layout.row()
col = row.row(align=True)
col.prop(settings, "server_preset_interface", text="")
col.operator("session.preset_server_add", icon='ADD', text="")
col.operator("session.preset_server_remove", icon='REMOVE', text="")
row = layout.row()
box = row.box()
if runtime_settings.session_mode == 'HOST':
@@ -174,7 +166,7 @@ class SESSION_PT_settings_network(bpy.types.Panel):
row.prop(settings, "init_method", text="")
row = box.row()
row.label(text="Admin password:")
row.prop(settings, "password", text="")
row.prop(runtime_settings, "password", text="")
row = box.row()
row.operator("session.start", text="HOST").host = True
else:
@@ -190,10 +182,11 @@ class SESSION_PT_settings_network(bpy.types.Panel):
if runtime_settings.admin:
row = box.row()
row.label(text="Password:")
row.prop(settings, "password", text="")
row.prop(runtime_settings, "password", text="")
row = box.row()
row.operator("session.start", text="CONNECT").host = False
class SESSION_PT_settings_user(bpy.types.Panel):
bl_idname = "MULTIUSER_SETTINGS_USER_PT_panel"
bl_label = "User info"
@@ -343,10 +336,9 @@ class SESSION_PT_user(bpy.types.Panel):
box = row.box()
split = box.split(factor=0.35)
split.label(text="user")
split = split.split(factor=0.3)
split.label(text="mode")
split.label(text="frame")
split = split.split(factor=0.5)
split.label(text="location")
split.label(text="frame")
split.label(text="ping")
row = layout.row()
@@ -384,8 +376,6 @@ class SESSION_UL_users(bpy.types.UIList):
ping = '-'
frame_current = '-'
scene_current = '-'
mode_current = '-'
mode_icon = 'BLANK1'
status_icon = 'BLANK1'
if session:
user = session.online_users.get(item.username)
@@ -395,45 +385,13 @@ class SESSION_UL_users(bpy.types.UIList):
if metadata and 'frame_current' in metadata:
frame_current = str(metadata.get('frame_current','-'))
scene_current = metadata.get('scene_current','-')
mode_current = metadata.get('mode_current','-')
if mode_current == "OBJECT" :
mode_icon = "OBJECT_DATAMODE"
elif mode_current == "EDIT_MESH" :
mode_icon = "EDITMODE_HLT"
elif mode_current == 'EDIT_CURVE':
mode_icon = "CURVE_DATA"
elif mode_current == 'EDIT_SURFACE':
mode_icon = "SURFACE_DATA"
elif mode_current == 'EDIT_TEXT':
mode_icon = "FILE_FONT"
elif mode_current == 'EDIT_ARMATURE':
mode_icon = "ARMATURE_DATA"
elif mode_current == 'EDIT_METABALL':
mode_icon = "META_BALL"
elif mode_current == 'EDIT_LATTICE':
mode_icon = "LATTICE_DATA"
elif mode_current == 'POSE':
mode_icon = "POSE_HLT"
elif mode_current == 'SCULPT':
mode_icon = "SCULPTMODE_HLT"
elif mode_current == 'PAINT_WEIGHT':
mode_icon = "WPAINT_HLT"
elif mode_current == 'PAINT_VERTEX':
mode_icon = "VPAINT_HLT"
elif mode_current == 'PAINT_TEXTURE':
mode_icon = "TPAINT_HLT"
elif mode_current == 'PARTICLE':
mode_icon = "PARTICLES"
elif mode_current == 'PAINT_GPENCIL' or mode_current =='EDIT_GPENCIL' or mode_current =='SCULPT_GPENCIL' or mode_current =='WEIGHT_GPENCIL' or mode_current =='VERTEX_GPENCIL':
mode_icon = "GREASEPENCIL"
if user['admin']:
status_icon = 'FAKE_USER_ON'
split = layout.split(factor=0.35)
split.label(text=item.username, icon=status_icon)
split = split.split(factor=0.3)
split.label(icon=mode_icon)
split.label(text=frame_current)
split = split.split(factor=0.5)
split.label(text=scene_current)
split.label(text=frame_current)
split.label(text=ping)
@@ -460,35 +418,26 @@ class SESSION_PT_presence(bpy.types.Panel):
settings = context.window_manager.session
pref = get_preferences()
layout.active = settings.enable_presence
row = layout.row()
row = row.grid_flow(row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
row.prop(settings, "presence_show_selected",text="",icon_only=True, icon='CUBE')
row.prop(settings, "presence_show_user", text="",icon_only=True, icon='CAMERA_DATA')
row.prop(settings, "presence_show_mode", text="",icon_only=True, icon='OBJECT_DATAMODE')
row.prop(settings, "presence_show_far_user", text="",icon_only=True, icon='SCENE_DATA')
col = layout.column()
if settings.presence_show_mode :
row = col.column()
row.prop(pref, "presence_mode_distance", expand=True)
col.prop(settings, "presence_show_session_status")
if settings.presence_show_session_status :
row = col.column()
row.active = settings.presence_show_session_status
row.prop(pref, "presence_hud_scale", expand=True)
row = col.column(align=True)
row.active = settings.presence_show_session_status
row.prop(pref, "presence_hud_hpos", expand=True)
row.prop(pref, "presence_hud_vpos", expand=True)
row = col.column()
row.active = settings.presence_show_session_status
row.prop(pref, "presence_hud_scale", expand=True)
row = col.column(align=True)
row.active = settings.presence_show_session_status
row.prop(pref, "presence_hud_hpos", expand=True)
row.prop(pref, "presence_hud_vpos", expand=True)
col.prop(settings, "presence_show_selected")
col.prop(settings, "presence_show_user")
row = layout.column()
row.active = settings.presence_show_user
row.prop(settings, "presence_show_far_user")
def draw_property(context, parent, property_uuid, level=0):
settings = get_preferences()
runtime_settings = context.window_manager.session
item = session.repository.graph.get(property_uuid)
type_id = item.data.get('type_id')
item = session.repository.get_node(property_uuid)
area_msg = parent.row(align=True)
if item.state == ERROR:
@@ -499,10 +448,11 @@ def draw_property(context, parent, property_uuid, level=0):
line = area_msg.box()
name = item.data['name'] if item.data else item.uuid
icon = settings.supported_datablocks[type_id].icon if type_id else 'ERROR'
detail_item_box = line.row(align=True)
detail_item_box.label(text="", icon=icon)
detail_item_box.label(text="",
icon=settings.supported_datablocks[item.str_type].icon)
detail_item_box.label(text=f"{name}")
# Operations
@@ -589,27 +539,40 @@ class SESSION_PT_repository(bpy.types.Panel):
else:
row.operator('session.save', icon="FILE_TICK")
box = layout.box()
row = box.row()
row.prop(runtime_settings, "filter_owned", text="Show only owned Nodes", icon_only=True, icon="DECORATE_UNLOCKED")
row = box.row()
row.prop(runtime_settings, "filter_name", text="Filter")
row = box.row()
flow = layout.grid_flow(
row_major=True,
columns=0,
even_columns=True,
even_rows=False,
align=True)
for item in settings.supported_datablocks:
col = flow.column(align=True)
col.prop(item, "use_as_filter", text="", icon=item.icon)
row = layout.row(align=True)
row.prop(runtime_settings, "filter_owned", text="Show only owned")
row = layout.row(align=True)
# Properties
owned_nodes = [k for k, v in session.repository.graph.items() if v.owner==settings.username]
types_filter = [t.type_name for t in settings.supported_datablocks
if t.use_as_filter]
filtered_node = owned_nodes if runtime_settings.filter_owned else list(session.repository.graph.keys())
key_to_filter = session.list(
filter_owner=settings.username) if runtime_settings.filter_owned else session.list()
if runtime_settings.filter_name:
filtered_node = [n for n in filtered_node if runtime_settings.filter_name.lower() in session.repository.graph.get(n).data.get('name').lower()]
client_keys = [key for key in key_to_filter
if session.repository.get_node(key).str_type
in types_filter]
if filtered_node:
if client_keys:
col = layout.column(align=True)
for key in filtered_node:
for key in client_keys:
draw_property(context, col, key)
else:
layout.row().label(text="Empty")
row.label(text="Empty")
elif session.state == STATE_LOBBY and usr and usr['admin']:
row.operator("session.init", icon='TOOL_SETTINGS', text="Init")
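The repository panel now builds its list in two passes: an ownership pass driven by `filter_owned`, then a per-type pass driven by the `use_as_filter` toggles. The same filtering, sketched over plain dictionaries (hypothetical node layout, not the session's actual API):

```python
def list_keys(graph, username, types_filter, owned_only):
    """Mirror the panel filters: optionally restrict to nodes owned by
    `username`, then keep only the enabled datablock types."""
    if owned_only:
        keys = [k for k, v in graph.items() if v["owner"] == username]
    else:
        keys = list(graph)
    return [k for k in keys if graph[k]["str_type"] in types_filter]

graph = {
    "u1": {"owner": "alice", "str_type": "BlObject"},
    "u2": {"owner": "bob", "str_type": "BlObject"},
    "u3": {"owner": "alice", "str_type": "BlMesh"},
}
list_keys(graph, "alice", {"BlObject"}, owned_only=True)   # ['u1']
list_keys(graph, "alice", {"BlObject"}, owned_only=False)  # ['u1', 'u2']
```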
@@ -629,32 +592,23 @@ class VIEW3D_PT_overlay_session(bpy.types.Panel):
def draw(self, context):
layout = self.layout
settings = context.window_manager.session
pref = get_preferences()
layout.active = settings.enable_presence
row = layout.row()
row = row.grid_flow(row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
row.prop(settings, "presence_show_selected",text="",icon_only=True, icon='CUBE')
row.prop(settings, "presence_show_user", text="",icon_only=True, icon='CAMERA_DATA')
row.prop(settings, "presence_show_mode", text="",icon_only=True, icon='OBJECT_DATAMODE')
row.prop(settings, "presence_show_far_user", text="",icon_only=True, icon='SCENE_DATA')
view = context.space_data
overlay = view.overlay
display_all = overlay.show_overlays
col = layout.column()
row = col.row(align=True)
settings = context.window_manager.session
layout.active = settings.enable_presence
col = layout.column()
if settings.presence_show_mode :
row = col.column()
row.prop(pref, "presence_mode_distance", expand=True)
col.prop(settings, "presence_show_session_status")
if settings.presence_show_session_status :
row = col.column()
row.active = settings.presence_show_session_status
row.prop(pref, "presence_hud_scale", expand=True)
row = col.column(align=True)
row.active = settings.presence_show_session_status
row.prop(pref, "presence_hud_hpos", expand=True)
row.prop(pref, "presence_hud_vpos", expand=True)
col.prop(settings, "presence_show_selected")
col.prop(settings, "presence_show_user")
row = layout.column()
row.active = settings.presence_show_user
row.prop(settings, "presence_show_far_user")
classes = (
SESSION_UL_users,


@@ -38,14 +38,6 @@ from replication.constants import (STATE_ACTIVE, STATE_AUTH,
STATE_LOBBY,
CONNECTING)
CLEARED_DATABLOCKS = ['actions', 'armatures', 'cache_files', 'cameras',
'collections', 'curves', 'filepath', 'fonts',
'grease_pencils', 'images', 'lattices', 'libraries',
'lightprobes', 'lights', 'linestyles', 'masks',
'materials', 'meshes', 'metaballs', 'movieclips',
'node_groups', 'objects', 'paint_curves', 'particles',
'scenes', 'shape_keys', 'sounds', 'speakers', 'texts',
'textures', 'volumes', 'worlds']
def find_from_attr(attr_name, attr_value, list):
for item in list:
@@ -109,25 +101,17 @@ def get_state_str(state):
def clean_scene():
for type_name in CLEARED_DATABLOCKS:
sub_collection_to_avoid = [
bpy.data.linestyles.get('LineStyle'),
bpy.data.materials.get('Dots Stroke')
]
type_collection = getattr(bpy.data, type_name)
items_to_remove = [i for i in type_collection if i not in sub_collection_to_avoid]
for item in items_to_remove:
try:
for type_name in dir(bpy.data):
try:
type_collection = getattr(bpy.data, type_name)
for item in type_collection:
type_collection.remove(item)
logging.info(item.name)
except:
continue
except:
continue
# Clear sequencer
bpy.context.scene.sequence_editor_clear()
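The rewritten `clean_scene` above drops the hard-coded `CLEARED_DATABLOCKS` list and instead probes every attribute of `bpy.data`, clearing whatever behaves like a removable collection and letting the `except` skip everything else. A bpy-free sketch of that duck-typed sweep (hypothetical `FakeData` container; note the copy before removal, which avoids skipping items while mutating):

```python
class FakeData:
    """Hypothetical stand-in for bpy.data: two clearable collections plus
    a non-collection attribute that the except clause skips."""
    meshes = ["Cube"]
    objects = ["Cube.obj"]
    version = (2, 92)

def clean(data):
    for type_name in dir(data):
        try:
            collection = getattr(data, type_name)
            for item in list(collection):   # copy first: removing while
                collection.remove(item)     # iterating would skip items
        except (AttributeError, TypeError):
            continue

data = FakeData()
clean(data)
# data.meshes and data.objects are emptied; data.version is untouched
```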
def get_selected_objects(scene, active_view_layer):
return [obj.uuid for obj in scene.objects if obj.select_get(view_layer=active_view_layer)]


@@ -1,4 +1,4 @@
import re
init_py = open("multi_user/libs/replication/replication/__init__.py").read()
init_py = open("multi_user/__init__.py").read()
print(re.search(r"\d+\.\d+\.\d+\w\d+|\d+\.\d+\.\d+", init_py).group(0))
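The version regex tries the pre-release form (`major.minor.patch` plus a letter-digit suffix) before falling back to the plain triple; written as a raw string to avoid invalid-escape warnings:

```python
import re

# Same pattern as the get_version script above.
VERSION_RE = re.compile(r"\d+\.\d+\.\d+\w\d+|\d+\.\d+\.\d+")

# Pre-release tags are captured by the first alternative...
print(VERSION_RE.search('version = "0.3.0b1"').group(0))  # 0.3.0b1
# ...plain releases fall through to the second.
print(VERSION_RE.search('version = "0.3.0"').group(0))    # 0.3.0
```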


@@ -13,7 +13,7 @@ def main():
if len(sys.argv) > 2:
blender_rev = sys.argv[2]
else:
blender_rev = "2.93.0"
blender_rev = "2.92.0"
try:
exit_val = BAT.test_blender_addon(addon_path=addon, blender_revision=blender_rev)


@@ -5,10 +5,9 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_action import BlAction
from multi_user.io_bpy.bl_action import BlAction
INTERPOLATION = ['CONSTANT', 'LINEAR', 'BEZIER', 'SINE', 'QUAD', 'CUBIC', 'QUART', 'QUINT', 'EXPO', 'CIRC', 'BACK', 'BOUNCE', 'ELASTIC']
FMODIFIERS = ['GENERATOR', 'FNGENERATOR', 'ENVELOPE', 'CYCLES', 'NOISE', 'LIMITS', 'STEPPED']
# @pytest.mark.parametrize('blendname', ['test_action.blend'])
def test_action(clear_blend):
@@ -23,20 +22,17 @@ def test_action(clear_blend):
point.co[1] = random.randint(-10,10)
point.interpolation = INTERPOLATION[random.randint(0, len(INTERPOLATION)-1)]
for mod_type in FMODIFIERS:
fcurve_sample.modifiers.new(mod_type)
bpy.ops.mesh.primitive_plane_add()
bpy.data.objects[0].animation_data_create()
bpy.data.objects[0].animation_data.action = datablock
# Test
implementation = BlAction()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.actions.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)
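All of the io_bpy tests in this change follow the same round trip: `_dump` a real datablock, delete it, `_construct` a fresh one from the dump, `_load` the state back, `_dump` again, and assert the two dumps match. The pattern in isolation, with a toy implementation standing in for a `ReplicatedDatablock` subclass (hypothetical; runs without bpy or deepdiff):

```python
class ToyImplementation:
    """Stand-in for an io_bpy ReplicatedDatablock subclass (hypothetical)."""

    def _dump(self, datablock):
        return dict(datablock)            # serialize the state

    def _construct(self, data):
        return {"name": data["name"]}     # rebuild a bare datablock

    def _load(self, data, datablock):
        datablock.update(data)            # apply the dumped state

impl = ToyImplementation()
original = {"name": "toto", "frames": [1, 10]}

expected = impl._dump(original)
del original                              # the real tests remove the datablock
rebuilt = impl._construct(expected)
impl._load(expected, rebuilt)
result = impl._dump(rebuilt)

assert expected == result                 # the round trip is lossless
```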


@@ -5,18 +5,18 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_armature import BlArmature
from multi_user.io_bpy.bl_armature import BlArmature
def test_armature(clear_blend):
bpy.ops.object.armature_add()
datablock = bpy.data.armatures[0]
implementation = BlArmature()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.armatures.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_camera import BlCamera
from multi_user.io_bpy.bl_camera import BlCamera
@pytest.mark.parametrize('camera_type', ['PANO','PERSP','ORTHO'])
@@ -15,11 +15,11 @@ def test_camera(clear_blend, camera_type):
datablock.type = camera_type
camera_dumper = BlCamera()
expected = camera_dumper.dump(datablock)
expected = camera_dumper._dump(datablock)
bpy.data.cameras.remove(datablock)
test = camera_dumper.construct(expected)
camera_dumper.load(expected, test)
result = camera_dumper.dump(test)
test = camera_dumper._construct(expected)
camera_dumper._load(expected, test)
result = camera_dumper._dump(test)
assert not DeepDiff(expected, result)


@@ -5,7 +5,7 @@ from deepdiff import DeepDiff
from uuid import uuid4
import bpy
import random
from multi_user.bl_types.bl_collection import BlCollection
from multi_user.io_bpy.bl_collection import BlCollection
def test_collection(clear_blend):
# Generate a collection with childrens and a cube
@@ -23,11 +23,11 @@ def test_collection(clear_blend):
# Test
implementation = BlCollection()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.collections.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -5,7 +5,7 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_curve import BlCurve
from multi_user.io_bpy.bl_curve import BlCurve
@pytest.mark.parametrize('curve_type', ['TEXT','BEZIER'])
def test_curve(clear_blend, curve_type):
@@ -19,11 +19,11 @@ def test_curve(clear_blend, curve_type):
datablock = bpy.data.curves[0]
implementation = BlCurve()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.curves.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_gpencil import BlGpencil
from multi_user.io_bpy.bl_gpencil import BlGpencil
def test_gpencil(clear_blend):
@@ -13,11 +13,11 @@ def test_gpencil(clear_blend):
datablock = bpy.data.grease_pencils[0]
implementation = BlGpencil()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.grease_pencils.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_lattice import BlLattice
from multi_user.io_bpy.bl_lattice import BlLattice
def test_lattice(clear_blend):
@@ -13,11 +13,11 @@ def test_lattice(clear_blend):
datablock = bpy.data.lattices[0]
implementation = BlLattice()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.lattices.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_lightprobe import BlLightprobe
from multi_user.io_bpy.bl_lightprobe import BlLightprobe
@pytest.mark.skipif(bpy.app.version[1] < 83, reason="requires blender 2.83 or higher")
@@ -14,11 +14,11 @@ def test_lightprobes(clear_blend, lightprobe_type):
blender_light = bpy.data.lightprobes[0]
lightprobe_dumper = BlLightprobe()
expected = lightprobe_dumper.dump(blender_light)
expected = lightprobe_dumper._dump(blender_light)
bpy.data.lightprobes.remove(blender_light)
test = lightprobe_dumper.construct(expected)
lightprobe_dumper.load(expected, test)
result = lightprobe_dumper.dump(test)
test = lightprobe_dumper._construct(expected)
lightprobe_dumper._load(expected, test)
result = lightprobe_dumper._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_light import BlLight
from multi_user.io_bpy.bl_light import BlLight
@pytest.mark.parametrize('light_type', ['SPOT','SUN','POINT','AREA'])
@@ -13,11 +13,11 @@ def test_light(clear_blend, light_type):
blender_light = bpy.data.lights[0]
light_dumper = BlLight()
expected = light_dumper.dump(blender_light)
expected = light_dumper._dump(blender_light)
bpy.data.lights.remove(blender_light)
test = light_dumper.construct(expected)
light_dumper.load(expected, test)
result = light_dumper.dump(test)
test = light_dumper._construct(expected)
light_dumper._load(expected, test)
result = light_dumper._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_material import BlMaterial
from multi_user.io_bpy.bl_material import BlMaterial
def test_material_nodes(clear_blend):
@@ -17,12 +17,12 @@ def test_material_nodes(clear_blend):
datablock.node_tree.nodes.new(ntype)
implementation = BlMaterial()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.materials.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)
@@ -32,11 +32,11 @@ def test_material_gpencil(clear_blend):
bpy.data.materials.create_gpencil_data(datablock)
implementation = BlMaterial()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.materials.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -5,7 +5,7 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_mesh import BlMesh
from multi_user.io_bpy.bl_mesh import BlMesh
@pytest.mark.parametrize('mesh_type', ['EMPTY','FILLED'])
def test_mesh(clear_blend, mesh_type):
@@ -18,11 +18,11 @@ def test_mesh(clear_blend, mesh_type):
# Test
implementation = BlMesh()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.meshes.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -4,7 +4,7 @@ import pytest
from deepdiff import DeepDiff
import bpy
from multi_user.bl_types.bl_metaball import BlMetaball
from multi_user.io_bpy.bl_metaball import BlMetaball
@pytest.mark.parametrize('metaballs_type', ['PLANE','CAPSULE','BALL','ELLIPSOID','CUBE'])
@@ -13,11 +13,11 @@ def test_metaball(clear_blend, metaballs_type):
datablock = bpy.data.metaballs[0]
dumper = BlMetaball()
expected = dumper.dump(datablock)
expected = dumper._dump(datablock)
bpy.data.metaballs.remove(datablock)
test = dumper.construct(expected)
dumper.load(expected, test)
result = dumper.dump(test)
test = dumper._construct(expected)
dumper._load(expected, test)
result = dumper._dump(test)
assert not DeepDiff(expected, result)


@@ -5,7 +5,7 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_object import BlObject
from multi_user.io_bpy.bl_object import BlObject
# Removed the 'BUILD' and 'SOFT_BODY' modifiers because their seed doesn't seem
# to be correctly initialized (#TODO: report the bug)
@@ -65,11 +65,11 @@ def test_object(clear_blend):
datablock.shape_key_add(name='shape2')
implementation = BlObject()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.objects.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
print(DeepDiff(expected, result))
assert not DeepDiff(expected, result)


@@ -5,23 +5,21 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_scene import BlScene
from multi_user.io_bpy.bl_scene import BlScene
from multi_user.utils import get_preferences
def test_scene(clear_blend):
get_preferences().sync_flags.sync_render_settings = True
datablock = bpy.data.scenes.new("toto")
datablock.timeline_markers.new('toto', frame=10)
datablock.timeline_markers.new('tata', frame=1)
datablock.view_settings.use_curve_mapping = True
# Test
implementation = BlScene()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.scenes.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -5,18 +5,18 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_speaker import BlSpeaker
from multi_user.io_bpy.bl_speaker import BlSpeaker
def test_speaker(clear_blend):
bpy.ops.object.speaker_add()
datablock = bpy.data.speakers[0]
implementation = BlSpeaker()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.speakers.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -5,7 +5,7 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_texture import BlTexture
from multi_user.io_bpy.bl_texture import BlTexture
TEXTURE_TYPES = ['NONE', 'BLEND', 'CLOUDS', 'DISTORTED_NOISE', 'IMAGE', 'MAGIC', 'MARBLE', 'MUSGRAVE', 'NOISE', 'STUCCI', 'VORONOI', 'WOOD']
@@ -14,11 +14,11 @@ def test_texture(clear_blend, texture_type):
datablock = bpy.data.textures.new('test', texture_type)
implementation = BlTexture()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.textures.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -5,17 +5,17 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_volume import BlVolume
from multi_user.io_bpy.bl_volume import BlVolume
def test_volume(clear_blend):
datablock = bpy.data.volumes.new("Test")
implementation = BlVolume()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.volumes.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)


@@ -5,18 +5,18 @@ from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_world import BlWorld
from multi_user.io_bpy.bl_world import BlWorld
def test_world(clear_blend):
datablock = bpy.data.worlds.new('test')
datablock.use_nodes = True
implementation = BlWorld()
expected = implementation.dump(datablock)
expected = implementation._dump(datablock)
bpy.data.worlds.remove(datablock)
test = implementation.construct(expected)
implementation.load(expected, test)
result = implementation.dump(test)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)