Merge branch '49-connection-preset-system' of https://gitlab.com/slumber/multi-user into 49-connection-preset-system
@@ -2,9 +2,12 @@ stages:
     - test
     - build
     - deploy
+    - doc

 include:
     - local: .gitlab/ci/test.gitlab-ci.yml
     - local: .gitlab/ci/build.gitlab-ci.yml
     - local: .gitlab/ci/deploy.gitlab-ci.yml
+    - local: .gitlab/ci/doc.gitlab-ci.yml
@@ -1,5 +1,6 @@
 build:
     stage: build
+    needs: ["test"]
     image: debian:stable-slim
     script:
         - rm -rf tests .git .gitignore script
@@ -1,5 +1,6 @@
 deploy:
     stage: deploy
+    needs: ["build"]
     image: slumber/docker-python
     variables:
         DOCKER_DRIVER: overlay2
.gitlab/ci/doc.gitlab-ci.yml (new file, 16 lines)
@@ -0,0 +1,16 @@
+pages:
+    stage: doc
+    needs: ["deploy"]
+    image: python
+    script:
+        - pip install -U sphinx sphinx_rtd_theme sphinx-material
+        - sphinx-build -b html ./docs public
+    artifacts:
+        paths:
+            - public
+    only:
+        refs:
+            - master
+            - develop
CHANGELOG.md (31 lines changed)
@@ -157,4 +157,33 @@ All notable changes to this project will be documented in this file.
 - Empty and Light object selection highlights
 - Material renaming
 - Default material nodes input parameters
 - blender 2.91 python api compatibility
+
+## [0.3.0] - 2021-04-14
+
+### Added
+
+- Curve material support
+- Cycle visibility settings
+- Session save/load operator
+- Add new scene support
+- Physic initial support
+- Geometry node initial support
+- Blender 2.93 compatibility
+
+### Changed
+
+- Host documentation on Gitlab Page
+- Event driven update (from the blender deps graph)
+
+### Fixed
+
+- Vertex group assignation
+- Parent relation can't be removed
+- Separate object
+- Delete animation
+- Sync missing holdout option for grease pencil material
+- Sync missing `skin_vertices`
+- Exception access violation during Undo/Redo
+- Sync missing armature bone Roll
+- Sync missing driver data_path
+- Constraint replication
README.md (64 lines changed)
@@ -19,44 +19,46 @@ This tool aims to allow multiple users to work on the same scene over the network.

 ## Usage

-See the [documentation](https://multi-user.readthedocs.io/en/latest/) for details.
+See the [documentation](https://slumber.gitlab.io/multi-user/index.html) for details.

 ## Troubleshooting

-See the [troubleshooting guide](https://multi-user.readthedocs.io/en/latest/getting_started/troubleshooting.html) for tips on the most common issues.
+See the [troubleshooting guide](https://slumber.gitlab.io/multi-user/getting_started/troubleshooting.html) for tips on the most common issues.

 ## Current development status

 Currently, not all data-blocks are supported for replication over the wire. The following list summarizes the status of each one.

 | Name           | Status | Comment                                                                      |
-| -----------    | :----: | :--------------------------------------------------------------------------: |
+| -------------- | :----: | :----------------------------------------------------------:                |
 | action         | ✔️     |                                                                              |
 | armature       | ❗     | Not stable                                                                   |
 | camera         | ✔️     |                                                                              |
 | collection     | ✔️     |                                                                              |
-| curve          | ❗     | Nurbs not supported                                                          |
+| curve          | ❗     | Nurbs surfaces not supported                                                 |
-| gpencil        | ✔️     | [Airbrush not supported](https://gitlab.com/slumber/multi-user/-/issues/123) |
+| gpencil        | ✔️     |                                                                              |
 | image          | ✔️     |                                                                              |
 | mesh           | ✔️     |                                                                              |
 | material       | ✔️     |                                                                              |
-| node_groups    | ❗     | Material only                                                                |
+| node_groups    | ❗     | Material & Geometry only                                                     |
+| geometry nodes | ✔️     |                                                                              |
 | metaball       | ✔️     |                                                                              |
 | object         | ✔️     |                                                                              |
-| textures       | ❗     | Supported for modifiers only                                                 |
+| textures       | ❗     | Supported for modifiers/materials/geo nodes only                             |
 | texts          | ✔️     |                                                                              |
 | scene          | ✔️     |                                                                              |
 | world          | ✔️     |                                                                              |
 | lightprobes    | ✔️     |                                                                              |
 | compositing    | ❌     | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46)                 |
 | texts          | ❌     | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81)                 |
 | nla            | ❌     |                                                                              |
 | volumes        | ✔️     |                                                                              |
-| particles      | ❌     | [On-going](https://gitlab.com/slumber/multi-user/-/issues/24)                |
+| particles      | ❗     | The cache isn't syncing.                                                     |
 | speakers       | ❗     | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65)                 |
 | vse            | ❗     | Mask and Clip not supported yet                                              |
 | physics        | ❌     | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45)                 |
 | libraries      | ❗     | Partial                                                                      |

 ### Performance issues

@@ -74,7 +76,7 @@ I'm working on it.

 ## Contributing

-See [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation.
+See [contributing section](https://slumber.gitlab.io/multi-user/ways_to_contribute.html) of the documentation.

 Feel free to [join the discord server](https://discord.gg/aBPvGws) to chat, seek help and contribute.
@@ -374,15 +374,6 @@ Network

 Advanced network settings

-**IPC Port** is the port used for Inter Process Communication. This port is used
-by the multi-user subprocesses to communicate with each other. If different instances
-of multi-user are using the same IPC port, this will create conflict !
-
-.. note::
-    You only need to modify this setting if you need to launch multiple clients from the same
-    computer (or if you try to host and join from the same computer). To resolve this, you simply need to enter a different
-    **IPC port** for each blender instance.
-
 **Timeout (in milliseconds)** is the maximum ping authorized before auto-disconnecting.
 You should only increase it if you have a bad connection.
@@ -19,7 +19,7 @@
 bl_info = {
     "name": "Multi-User",
     "author": "Swann Martinez",
-    "version": (0, 3, 0),
+    "version": (0, 4, 0),
     "description": "Enable real-time collaborative workflow inside blender",
     "blender": (2, 82, 0),
     "location": "3D View > Sidebar > Multi-User tab",
@@ -44,7 +44,7 @@ from . import environment


 DEPENDENCIES = {
-    ("replication", '0.1.26'),
+    ("replication", '0.1.36'),
 }
@@ -122,13 +122,13 @@ class addon_updater_install_popup(bpy.types.Operator):
     # if true, run clean install - ie remove all files before adding new
     # equivalent to deleting the addon and reinstalling, except the
     # updater folder/backup folder remains
-    clean_install = bpy.props.BoolProperty(
+    clean_install: bpy.props.BoolProperty(
         name="Clean install",
         description="If enabled, completely clear the addon's folder before installing new update, creating a fresh install",
         default=False,
         options={'HIDDEN'}
     )
-    ignore_enum = bpy.props.EnumProperty(
+    ignore_enum: bpy.props.EnumProperty(
         name="Process update",
         description="Decide to install, ignore, or defer new addon update",
         items=[
@@ -264,7 +264,7 @@ class addon_updater_update_now(bpy.types.Operator):
     # if true, run clean install - ie remove all files before adding new
     # equivalent to deleting the addon and reinstalling, except the
     # updater folder/backup folder remains
-    clean_install = bpy.props.BoolProperty(
+    clean_install: bpy.props.BoolProperty(
         name="Clean install",
         description="If enabled, completely clear the addon's folder before installing new update, creating a fresh install",
         default=False,
@@ -332,7 +332,7 @@ class addon_updater_update_target(bpy.types.Operator):
             i+=1
         return ret

-    target = bpy.props.EnumProperty(
+    target: bpy.props.EnumProperty(
         name="Target version to install",
         description="Select the version to install",
         items=target_version
@@ -341,7 +341,7 @@ class addon_updater_update_target(bpy.types.Operator):
     # if true, run clean install - ie remove all files before adding new
     # equivalent to deleting the addon and reinstalling, except the
     # updater folder/backup folder remains
-    clean_install = bpy.props.BoolProperty(
+    clean_install: bpy.props.BoolProperty(
         name="Clean install",
         description="If enabled, completely clear the addon's folder before installing new update, creating a fresh install",
         default=False,
@@ -399,7 +399,7 @@ class addon_updater_install_manually(bpy.types.Operator):
     bl_description = "Proceed to manually install update"
     bl_options = {'REGISTER', 'INTERNAL'}

-    error = bpy.props.StringProperty(
+    error: bpy.props.StringProperty(
         name="Error Occurred",
         default="",
         options={'HIDDEN'}
@@ -461,7 +461,7 @@ class addon_updater_updated_successful(bpy.types.Operator):
     bl_description = "Update installation response"
     bl_options = {'REGISTER', 'INTERNAL', 'UNDO'}

-    error = bpy.props.StringProperty(
+    error: bpy.props.StringProperty(
         name="Error Occurred",
         default="",
         options={'HIDDEN'}
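The hunks above switch every `bpy.props` definition from assignment (`=`) to annotation (`:`) syntax. Since Blender 2.80, properties on Operator and PropertyGroup classes must be declared as class annotations so the registration system can discover them in `__annotations__`. A bpy-free sketch of why the syntax matters (the `FakeBoolProperty` stand-in is hypothetical; real code uses `bpy.props.BoolProperty` and friends):

```python
# Sketch: annotation-style declarations are recorded in __annotations__,
# which is where Blender's register_class() looks for deferred properties.
# Assignment-style declarations become plain class attributes instead,
# which 2.80+ warns about and cannot register the same way.

def FakeBoolProperty(name="", default=False):
    # Hypothetical stand-in: bpy.props functions also return a
    # deferred-registration record rather than an actual value.
    return ("BoolProperty", {"name": name, "default": default})

class AddonUpdaterInstallPopup:
    # Annotation syntax: stored in __annotations__, no attribute is created.
    clean_install: FakeBoolProperty(name="Clean install", default=False)
    # Old assignment syntax (pre-2.80 style): a normal class attribute.
    legacy_flag = FakeBoolProperty(name="Legacy", default=True)

# The registration machinery can find the annotated property...
assert "clean_install" in AddonUpdaterInstallPopup.__annotations__
# ...while it never appears as a regular attribute,
assert not hasattr(AddonUpdaterInstallPopup, "clean_install")
# and the assignment form is only a plain attribute.
assert hasattr(AddonUpdaterInstallPopup, "legacy_flag")
```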
@@ -42,13 +42,14 @@ __all__ = [
     # 'bl_sequencer',
     'bl_node_group',
     'bl_texture',
+    "bl_particle",
 ]  # Order here defines execution order

 if bpy.app.version[1] >= 91:
     __all__.append('bl_volume')

 from . import *
-from replication.data import ReplicatedDataFactory
+from replication.data import DataTranslationProtocol

 def types_to_register():
     return __all__
@@ -25,7 +25,7 @@ from enum import Enum
 from .. import utils
 from .dump_anything import (
     Dumper, Loader, np_dump_collection, np_load_collection, remove_items_from_dict)
-from .bl_datablock import BlDatablock
+from .bl_datablock import BlDatablock, has_action, has_driver, dump_driver, load_driver


 KEYFRAME = [
@@ -61,7 +61,6 @@ def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy: bool = True) -> dict:
         points = fcurve.keyframe_points
         fcurve_data['keyframes_count'] = len(fcurve.keyframe_points)
         fcurve_data['keyframe_points'] = np_dump_collection(points, KEYFRAME)
-
     else:  # Legacy method
         dumper = Dumper()
         fcurve_data["keyframe_points"] = []
@@ -71,6 +70,18 @@ def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy: bool = True) -> dict:
                 dumper.dump(k)
             )

+    if fcurve.modifiers:
+        dumper = Dumper()
+        dumper.exclude_filter = [
+            'is_valid',
+            'active'
+        ]
+        dumped_modifiers = []
+        for modifier in fcurve.modifiers:
+            dumped_modifiers.append(dumper.dump(modifier))
+
+        fcurve_data['modifiers'] = dumped_modifiers
+
     return fcurve_data
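The new `dump_fcurve` block serializes F-Curve modifiers with a `Dumper` whose `exclude_filter` drops fields like `is_valid` and `active` that are read-only or UI state. A rough bpy-free sketch of that pattern (`SimpleDumper` and `NoiseModifier` are illustrative stand-ins, not the project's actual `Dumper` or `bpy.types.FModifier`):

```python
# Sketch of exclude_filter-style dumping: copy an object's attributes into
# a plain dict, skipping fields that should not be replicated.

class SimpleDumper:
    def __init__(self):
        self.exclude_filter = []

    def dump(self, obj):
        # Keep every instance attribute not named in the exclude list.
        return {
            name: getattr(obj, name)
            for name in vars(obj)
            if name not in self.exclude_filter
        }

class NoiseModifier:
    # Illustrative stand-in for an F-Curve modifier.
    def __init__(self):
        self.type = 'NOISE'
        self.strength = 1.0
        self.is_valid = True   # read-only in Blender, pointless to replicate
        self.active = False    # UI selection state, not animation data

dumper = SimpleDumper()
dumper.exclude_filter = ['is_valid', 'active']

dumped_modifiers = [dumper.dump(m) for m in [NoiseModifier()]]
assert dumped_modifiers[0] == {'type': 'NOISE', 'strength': 1.0}
```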
@@ -83,7 +94,7 @@ def load_fcurve(fcurve_data, fcurve):
     :type fcurve: bpy.types.FCurve
     """
     use_numpy = fcurve_data.get('use_numpy')
-
+    loader = Loader()
     keyframe_points = fcurve.keyframe_points

     # Remove all keyframe points
@@ -128,6 +139,64 @@ def load_fcurve(fcurve_data, fcurve):

     fcurve.update()

+    dumped_fcurve_modifiers = fcurve_data.get('modifiers', None)
+
+    if dumped_fcurve_modifiers:
+        # clear modifiers
+        for fmod in fcurve.modifiers:
+            fcurve.modifiers.remove(fmod)
+
+        # Load each modifier in order
+        for modifier_data in dumped_fcurve_modifiers:
+            modifier = fcurve.modifiers.new(modifier_data['type'])
+
+            loader.load(modifier, modifier_data)
+    elif fcurve.modifiers:
+        for fmod in fcurve.modifiers:
+            fcurve.modifiers.remove(fmod)
+
+
+def dump_animation_data(datablock):
+    animation_data = {}
+    if has_action(datablock):
+        animation_data['action'] = datablock.animation_data.action.name
+    if has_driver(datablock):
+        animation_data['drivers'] = []
+        for driver in datablock.animation_data.drivers:
+            animation_data['drivers'].append(dump_driver(driver))
+
+    return animation_data
+
+
+def load_animation_data(animation_data, datablock):
+    # Load animation data
+    if animation_data:
+        if datablock.animation_data is None:
+            datablock.animation_data_create()
+
+        for d in datablock.animation_data.drivers:
+            datablock.animation_data.drivers.remove(d)
+
+        if 'drivers' in animation_data:
+            for driver in animation_data['drivers']:
+                load_driver(datablock, driver)
+
+        if 'action' in animation_data:
+            datablock.animation_data.action = bpy.data.actions[animation_data['action']]
+        elif datablock.animation_data.action:
+            datablock.animation_data.action = None
+
+    # Remove existing animation data if there is no more to load
+    elif hasattr(datablock, 'animation_data') and datablock.animation_data:
+        datablock.animation_data_clear()
+
+
+def resolve_animation_dependencies(datablock):
+    if has_action(datablock):
+        return [datablock.animation_data.action]
+    else:
+        return []
+
+
 class BlAction(BlDatablock):
     bl_id = "actions"
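The new `dump_animation_data`/`load_animation_data` helpers reduce a datablock's animation state to a small dict (action name plus dumped drivers) and restore it on the other side, clearing whatever is no longer present in the dump. A bpy-free sketch of that round-trip shape, using plain dicts as illustrative stand-ins for datablocks:

```python
# Sketch: round-trip an "animation data" dict {action, drivers}, clearing
# receiver-side state the dump no longer contains, as the helpers above do.

def dump_animation_data(datablock):
    animation_data = {}
    if datablock.get('action'):
        animation_data['action'] = datablock['action']
    if datablock.get('drivers'):
        animation_data['drivers'] = list(datablock['drivers'])
    return animation_data

def load_animation_data(animation_data, datablock):
    if animation_data:
        # Drivers are always rebuilt from scratch to avoid stale entries.
        datablock['drivers'] = list(animation_data.get('drivers', []))
        # Restore the action if dumped, otherwise drop any stale one.
        if 'action' in animation_data:
            datablock['action'] = animation_data['action']
        else:
            datablock['action'] = None
    else:
        # Nothing to load: clear existing animation state entirely.
        datablock['action'] = None
        datablock['drivers'] = []

src = {'action': 'WalkCycle', 'drivers': ['location.x']}
dst = {'action': 'Stale', 'drivers': ['rotation.z']}
load_animation_data(dump_animation_data(src), dst)
assert dst == {'action': 'WalkCycle', 'drivers': ['location.x']}
```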
@@ -56,6 +56,11 @@ class BlCamera(BlDatablock):
             target_img.image = bpy.data.images[img_id]
             loader.load(target_img, img_data)

+            img_user = img_data.get('image_user')
+            if img_user:
+                loader.load(target_img.image_user, img_user)
+

     def _dump_implementation(self, data, instance=None):
         assert(instance)

@@ -101,10 +106,19 @@ class BlCamera(BlDatablock):
             'scale',
             'use_flip_x',
             'use_flip_y',
-            'image'
+            'image_user',
+            'image',
+            'frame_duration',
+            'frame_start',
+            'frame_offset',
+            'use_cyclic',
+            'use_auto_refresh'
         ]
-        return dumper.dump(instance)
+        data = dumper.dump(instance)
+        for index, image in enumerate(instance.background_images):
+            if image.image_user:
+                data['background_images'][index]['image_user'] = dumper.dump(image.image_user)
+        return data

     def _resolve_deps_implementation(self):
         deps = []
         for background in self.instance.background_images:
@@ -72,10 +72,10 @@ def load_driver(target_datablock, src_driver):

         for src_target in src_var_data['targets']:
             src_target_data = src_var_data['targets'][src_target]
-            new_var.targets[src_target].id = utils.resolve_from_id(
-                src_target_data['id'], src_target_data['id_type'])
-            loader.load(
-                new_var.targets[src_target], src_target_data)
+            src_id = src_target_data.get('id')
+            if src_id:
+                new_var.targets[src_target].id = utils.resolve_from_id(src_target_data['id'], src_target_data['id_type'])
+            loader.load(new_var.targets[src_target], src_target_data)

     # Fcurve
     new_fcurve = new_driver.keyframe_points
@@ -127,16 +127,14 @@ class BlDatablock(ReplicatedDatablock):
         instance.uuid = self.uuid

     def resolve(self, construct = True):
-        datablock_ref = None
         datablock_root = getattr(bpy.data, self.bl_id)
-        try:
-            datablock_ref = datablock_root[self.data['name']]
-        except Exception:
-            pass
+        datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)

         if not datablock_ref:
-            datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
+            try:
+                datablock_ref = datablock_root[self.data['name']]
+            except Exception:
+                pass

         if construct and not datablock_ref:
             name = self.data.get('name')
@@ -163,19 +161,17 @@ class BlDatablock(ReplicatedDatablock):
     def _dump(self, instance=None):
         dumper = Dumper()
         data = {}
+        animation_data = {}
         # Dump animation data
         if has_action(instance):
-            dumper = Dumper()
-            dumper.include_filter = ['action']
-            data['animation_data'] = dumper.dump(instance.animation_data)
+            animation_data['action'] = instance.animation_data.action.name

         if has_driver(instance):
-            dumped_drivers = {'animation_data': {'drivers': []}}
+            animation_data['drivers'] = []
             for driver in instance.animation_data.drivers:
-                dumped_drivers['animation_data']['drivers'].append(
-                    dump_driver(driver))
+                animation_data['drivers'].append(dump_driver(driver))

-            data.update(dumped_drivers)
+        if animation_data:
+            data['animation_data'] = animation_data

         if self.is_library:
             data.update(dumper.dump(instance))
@@ -202,6 +198,9 @@ class BlDatablock(ReplicatedDatablock):

             if 'action' in data['animation_data']:
                 target.animation_data.action = bpy.data.actions[data['animation_data']['action']]
+            elif target.animation_data.action:
+                target.animation_data.action = None
+
         # Remove existing animation data if there is no more to load
         elif hasattr(target, 'animation_data') and target.animation_data:
             target.animation_data_clear()
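The reworked `resolve` looks the datablock up by its replicated `uuid` first and only falls back to the collision-prone name lookup when no uuid match exists. A minimal sketch of that lookup order, using plain dicts in place of the `bpy.data` collections (all names here are illustrative):

```python
# Sketch: uuid-first resolution with a name-based fallback, mirroring the
# reordered logic in BlDatablock.resolve().

def find_from_attr(attr, value, collection):
    # Analogue of utils.find_from_attr: linear scan for a matching attribute.
    for item in collection.values():
        if item.get(attr) == value:
            return item
    return None

def resolve(uuid, name, datablock_root):
    # 1) uuid match wins: it survives renames on either side of the session.
    datablock_ref = find_from_attr('uuid', uuid, datablock_root)
    # 2) fall back to the name only when the uuid is unknown (fresh datablock).
    if not datablock_ref:
        datablock_ref = datablock_root.get(name)
    return datablock_ref

cameras = {
    'Camera':     {'uuid': 'aaa', 'name': 'Camera'},
    'Camera.001': {'uuid': 'bbb', 'name': 'Camera.001'},
}
# A renamed datablock is still found through its uuid, not its stale name.
assert resolve('bbb', 'Camera', cameras)['name'] == 'Camera.001'
# Unknown uuid: the name lookup is used instead.
assert resolve('zzz', 'Camera', cameras)['name'] == 'Camera'
```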
@@ -68,12 +68,15 @@ class BlFile(ReplicatedDatablock):
         self.preferences = utils.get_preferences()

     def resolve(self, construct = True):
-        if self.data:
-            self.instance = Path(get_filepath(self.data['name']))
+        self.instance = Path(get_filepath(self.data['name']))

-            if not self.instance.exists():
-                logging.debug("File don't exist, loading it.")
-                self._load(self.data, self.instance)
+        file_exists = self.instance.exists()
+        if not file_exists:
+            logging.debug("File doesn't exist, loading it.")
+            self._load(self.data, self.instance)
+
+        return file_exists

     def push(self, socket, identity=None, check_data=False):
         super().push(socket, identity=None, check_data=False)
@@ -131,6 +134,8 @@ class BlFile(ReplicatedDatablock):
         if self.preferences.clear_memory_filecache:
             return False
         else:
+            if not self.instance:
+                return False
             memory_size = sys.getsizeof(self.data['file'])-33
             disk_size = self.instance.stat().st_size
             return memory_size != disk_size
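`BlFile.diff()` compares the replicated file's in-memory size against the on-disk size; the `-33` accounts for the fixed object-header overhead of a CPython `bytes` object (on common 64-bit builds, `sys.getsizeof(b'') == 33`). A small sketch of that heuristic; since the constant is implementation-dependent, this version subtracts the measured empty-bytes overhead instead of hard-coding 33:

```python
import sys
import tempfile
from pathlib import Path

# Sketch: detect a memory/disk mismatch by size, as BlFile.diff() does.
# sys.getsizeof(bytes_obj) includes the object header, so subtract the
# overhead of an empty bytes object to recover the payload length.
BYTES_OVERHEAD = sys.getsizeof(b'')

def file_changed(payload: bytes, path: Path) -> bool:
    memory_size = sys.getsizeof(payload) - BYTES_OVERHEAD
    disk_size = path.stat().st_size
    return memory_size != disk_size

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "cache.bin"
    path.write_bytes(b"hello")
    assert not file_changed(b"hello", path)    # same 5 bytes on both sides
    assert file_changed(b"hello world", path)  # payload grew, disk is stale
```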
@@ -68,7 +68,10 @@ class BlImage(BlDatablock):

         target.source = 'FILE'
         target.filepath_raw = get_filepath(data['filename'])
-        target.colorspace_settings.name = data["colorspace_settings"]["name"]
+        color_space_name = data["colorspace_settings"]["name"]
+
+        if color_space_name:
+            target.colorspace_settings.name = color_space_name

     def _dump(self, instance=None):
         assert(instance)
@@ -83,6 +86,7 @@ class BlImage(BlDatablock):
         dumper.depth = 2
         dumper.include_filter = [
             "name",
+            # 'source',
             'size',
             'height',
             'alpha',
@ -27,7 +27,7 @@ from .dump_anything import Loader, Dumper
|
|||||||
from .bl_datablock import BlDatablock, get_datablock_from_uuid
|
from .bl_datablock import BlDatablock, get_datablock_from_uuid
|
||||||
|
|
||||||
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
|
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
|
||||||
|
IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']
|
||||||
|
|
||||||
def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
|
def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
|
||||||
""" Load a node into a node_tree from a dict
|
""" Load a node into a node_tree from a dict
|
||||||
@ -52,31 +52,137 @@ def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
|
|||||||
|
|
||||||
inputs_data = node_data.get('inputs')
|
inputs_data = node_data.get('inputs')
|
||||||
if inputs_data:
|
if inputs_data:
|
||||||
inputs = target_node.inputs
|
inputs = [i for i in target_node.inputs if i.type not in IGNORED_SOCKETS]
|
||||||
for idx, inpt in enumerate(inputs_data):
|
for idx, inpt in enumerate(inputs):
|
||||||
if idx < len(inputs) and hasattr(inputs[idx], "default_value"):
|
if idx < len(inputs_data) and hasattr(inpt, "default_value"):
|
||||||
|
loaded_input = inputs_data[idx]
|
||||||
try:
|
try:
|
||||||
inputs[idx].default_value = inpt
|
if inpt.type in ['OBJECT', 'COLLECTION']:
|
||||||
|
inpt.default_value = get_datablock_from_uuid(loaded_input, None)
|
||||||
|
else:
|
||||||
|
inpt.default_value = loaded_input
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logging.warning(f"Node {target_node.name} input {inputs[idx].name} parameter not supported, skipping ({e})")
|
logging.warning(f"Node {target_node.name} input {inpt.name} parameter not supported, skipping ({e})")
|
||||||
else:
|
else:
|
||||||
logging.warning(f"Node {target_node.name} input length mismatch.")
|
logging.warning(f"Node {target_node.name} input length mismatch.")
|
||||||
|
|
||||||
outputs_data = node_data.get('outputs')
|
outputs_data = node_data.get('outputs')
|
||||||
if outputs_data:
|
if outputs_data:
|
||||||
outputs = target_node.outputs
|
outputs = [o for o in target_node.outputs if o.type not in IGNORED_SOCKETS]
|
||||||
for idx, output in enumerate(outputs_data):
|
for idx, output in enumerate(outputs):
|
||||||
if idx < len(outputs) and hasattr(outputs[idx], "default_value"):
|
if idx < len(outputs_data) and hasattr(output, "default_value"):
|
||||||
|
loaded_output = outputs_data[idx]
|
||||||
try:
|
try:
|
||||||
outputs[idx].default_value = output
|
if output.type in ['OBJECT', 'COLLECTION']:
|
||||||
except:
|
output.default_value = get_datablock_from_uuid(loaded_output, None)
|
||||||
|
else:
|
||||||
|
output.default_value = loaded_output
|
||||||
|
except Exception as e:
|
||||||
logging.warning(
|
logging.warning(
|
||||||
f"Node {target_node.name} output {outputs[idx].name} parameter not supported, skipping ({e})")
|
f"Node {target_node.name} output {output.name} parameter not supported, skipping ({e})")
|
||||||
else:
|
else:
|
||||||
logging.warning(
|
logging.warning(
|
||||||
f"Node {target_node.name} output length mismatch.")
|
f"Node {target_node.name} output length mismatch.")
|
||||||
|
|
||||||
|
|
||||||
|
def dump_node(node: bpy.types.ShaderNode) -> dict:
|
||||||
|
""" Dump a single node to a dict
|
||||||
|
|
||||||
|
:arg node: target node
|
||||||
|
:type node: bpy.types.Node
|
||||||
|
:retrun: dict
|
||||||
|
"""
|
||||||
|
|
||||||
|
node_dumper = Dumper()
|
||||||
|
node_dumper.depth = 1
|
||||||
|
node_dumper.exclude_filter = [
|
||||||
|
"dimensions",
|
||||||
|
"show_expanded",
|
||||||
|
"name_full",
|
||||||
|
"select",
|
||||||
|
"bl_label",
|
||||||
|
"bl_height_min",
|
||||||
|
"bl_height_max",
|
||||||
|
"bl_height_default",
|
||||||
|
"bl_width_min",
|
||||||
|
"bl_width_max",
|
||||||
|
"type",
|
||||||
|
"bl_icon",
|
||||||
|
"bl_width_default",
|
||||||
|
"bl_static_type",
|
||||||
|
"show_tetxure",
|
||||||
|
"is_active_output",
|
||||||
|
"hide",
|
||||||
|
"show_options",
|
||||||
|
"show_preview",
|
||||||
|
"show_texture",
|
||||||
|
"outputs",
|
||||||
|
"width_hidden",
|
||||||
|
"image"
|
||||||
|
]
|
||||||
|
|
||||||
|
dumped_node = node_dumper.dump(node)
|
||||||
|
|
||||||
|
if node.parent:
|
||||||
|
dumped_node['parent'] = node.parent.name
|
||||||
|
|
||||||
|
dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
|
||||||
|
|
||||||
|
if dump_io_needed:
|
||||||
|
io_dumper = Dumper()
|
||||||
|
io_dumper.depth = 2
|
||||||
|
io_dumper.include_filter = ["default_value"]
|
||||||
|
|
||||||
|
if hasattr(node, 'inputs'):
|
||||||
|
dumped_node['inputs'] = []
|
||||||
|
inputs = [i for i in node.inputs if i.type not in IGNORED_SOCKETS]
|
||||||
|
for idx, inpt in enumerate(inputs):
|
||||||
|
if hasattr(inpt, 'default_value'):
|
||||||
|
if isinstance(inpt.default_value, bpy.types.ID):
|
||||||
|
dumped_input = inpt.default_value.uuid
|
||||||
|
else:
|
||||||
|
dumped_input = io_dumper.dump(inpt.default_value)
|
||||||
|
|
||||||
|
dumped_node['inputs'].append(dumped_input)
|
||||||
|
|
||||||
|
if hasattr(node, 'outputs'):
|
||||||
|
dumped_node['outputs'] = []
|
||||||
|
for idx, output in enumerate(node.outputs):
|
||||||
|
if output.type not in IGNORED_SOCKETS:
|
||||||
|
if hasattr(output, 'default_value'):
|
||||||
|
dumped_node['outputs'].append(
|
||||||
|
io_dumper.dump(output.default_value))
|
||||||
|
|
||||||
|
if hasattr(node, 'color_ramp'):
|
||||||
|
ramp_dumper = Dumper()
|
||||||
|
ramp_dumper.depth = 4
|
||||||
|
ramp_dumper.include_filter = [
|
||||||
|
'elements',
|
||||||
|
'alpha',
|
||||||
|
'color',
|
||||||
|
'position',
|
||||||
|
'interpolation',
|
||||||
|
'hue_interpolation',
|
||||||
|
'color_mode'
|
||||||
|
]
|
||||||
|
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
|
||||||
|
if hasattr(node, 'mapping'):
|
||||||
|
curve_dumper = Dumper()
|
||||||
|
curve_dumper.depth = 5
|
||||||
|
curve_dumper.include_filter = [
|
||||||
|
'curves',
|
||||||
|
'points',
|
||||||
|
'location'
|
||||||
|
]
|
||||||
|
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
|
||||||
|
if hasattr(node, 'image') and getattr(node, 'image'):
|
||||||
|
dumped_node['image_uuid'] = node.image.uuid
|
||||||
|
if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
|
||||||
|
dumped_node['node_tree_uuid'] = node.node_tree.uuid
|
||||||
|
return dumped_node
|
||||||
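The `dump_node` function above leans on a `Dumper` configured with a `depth` and an `exclude_filter`. Outside Blender, that pattern can be sketched with plain objects; `MiniDumper` and `Node` below are hypothetical stand-ins, not the real `dump_anything.Dumper` API:

```python
# Minimal sketch of a depth-limited, filtered attribute dumper
# (hypothetical stand-in for dump_anything.Dumper, not the real API).
class MiniDumper:
    def __init__(self, depth=1, exclude_filter=None):
        self.depth = depth
        self.exclude_filter = exclude_filter or []

    def dump(self, obj, _level=0):
        if _level >= self.depth or not hasattr(obj, '__dict__'):
            return obj  # past max depth: keep the raw value
        return {
            name: self.dump(value, _level + 1)
            for name, value in vars(obj).items()
            if name not in self.exclude_filter
        }

class Node:
    def __init__(self):
        self.name = "Principled BSDF"
        self.select = True
        self.hide = False

dumper = MiniDumper(depth=1, exclude_filter=["select", "hide"])
print(dumper.dump(Node()))  # {'name': 'Principled BSDF'}
```

Excluding UI-only attributes (`select`, `hide`, `dimensions`, ...) keeps the replicated payload limited to state that actually matters for reconstruction.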
|
|
||||||
|
|
||||||
|
|
||||||
def load_links(links_data, node_tree):
|
def load_links(links_data, node_tree):
|
||||||
""" Load node_tree links from a list
|
""" Load node_tree links from a list
|
||||||
|
|
||||||
@ -119,94 +225,7 @@ def dump_links(links):
|
|||||||
return links_data
|
return links_data
|
||||||
|
|
||||||
|
|
||||||
def dump_node(node: bpy.types.ShaderNode) -> dict:
|
def dump_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
|
||||||
""" Dump a single node to a dict
|
|
||||||
|
|
||||||
:arg node: target node
|
|
||||||
:type node: bpy.types.Node
|
|
||||||
:retrun: dict
|
|
||||||
"""
|
|
||||||
|
|
||||||
node_dumper = Dumper()
|
|
||||||
node_dumper.depth = 1
|
|
||||||
node_dumper.exclude_filter = [
|
|
||||||
"dimensions",
|
|
||||||
"show_expanded",
|
|
||||||
"name_full",
|
|
||||||
"select",
|
|
||||||
"bl_label",
|
|
||||||
"bl_height_min",
|
|
||||||
"bl_height_max",
|
|
||||||
"bl_height_default",
|
|
||||||
"bl_width_min",
|
|
||||||
"bl_width_max",
|
|
||||||
"type",
|
|
||||||
"bl_icon",
|
|
||||||
"bl_width_default",
|
|
||||||
"bl_static_type",
|
|
||||||
"show_tetxure",
|
|
||||||
"is_active_output",
|
|
||||||
"hide",
|
|
||||||
"show_options",
|
|
||||||
"show_preview",
|
|
||||||
"show_texture",
|
|
||||||
"outputs",
|
|
||||||
"width_hidden",
|
|
||||||
"image"
|
|
||||||
]
|
|
||||||
|
|
||||||
dumped_node = node_dumper.dump(node)
|
|
||||||
|
|
||||||
dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
|
|
||||||
|
|
||||||
if dump_io_needed:
|
|
||||||
io_dumper = Dumper()
|
|
||||||
io_dumper.depth = 2
|
|
||||||
io_dumper.include_filter = ["default_value"]
|
|
||||||
|
|
||||||
if hasattr(node, 'inputs'):
|
|
||||||
dumped_node['inputs'] = []
|
|
||||||
for idx, inpt in enumerate(node.inputs):
|
|
||||||
if hasattr(inpt, 'default_value'):
|
|
||||||
dumped_node['inputs'].append(
|
|
||||||
io_dumper.dump(inpt.default_value))
|
|
||||||
|
|
||||||
if hasattr(node, 'outputs'):
|
|
||||||
dumped_node['outputs'] = []
|
|
||||||
for idx, output in enumerate(node.outputs):
|
|
||||||
if hasattr(output, 'default_value'):
|
|
||||||
dumped_node['outputs'].append(
|
|
||||||
io_dumper.dump(output.default_value))
|
|
||||||
|
|
||||||
if hasattr(node, 'color_ramp'):
|
|
||||||
ramp_dumper = Dumper()
|
|
||||||
ramp_dumper.depth = 4
|
|
||||||
ramp_dumper.include_filter = [
|
|
||||||
'elements',
|
|
||||||
'alpha',
|
|
||||||
'color',
|
|
||||||
'position',
|
|
||||||
'interpolation',
|
|
||||||
'color_mode'
|
|
||||||
]
|
|
||||||
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
|
|
||||||
if hasattr(node, 'mapping'):
|
|
||||||
curve_dumper = Dumper()
|
|
||||||
curve_dumper.depth = 5
|
|
||||||
curve_dumper.include_filter = [
|
|
||||||
'curves',
|
|
||||||
'points',
|
|
||||||
'location'
|
|
||||||
]
|
|
||||||
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
|
|
||||||
if hasattr(node, 'image') and getattr(node, 'image'):
|
|
||||||
dumped_node['image_uuid'] = node.image.uuid
|
|
||||||
if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
|
|
||||||
dumped_node['node_tree_uuid'] = node.node_tree.uuid
|
|
||||||
return dumped_node
|
|
||||||
|
|
||||||
|
|
||||||
def dump_shader_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
|
|
||||||
""" Dump a shader node_tree to a dict including links and nodes
|
""" Dump a shader node_tree to a dict including links and nodes
|
||||||
|
|
||||||
:arg node_tree: dumped shader node tree
|
:arg node_tree: dumped shader node tree
|
||||||
@ -262,7 +281,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
|
|||||||
"""
|
"""
|
||||||
# Check for removed sockets
|
# Check for removed sockets
|
||||||
for socket in sockets:
|
for socket in sockets:
|
||||||
if not [s for s in sockets_data if socket['uuid'] == s[2]]:
|
if not [s for s in sockets_data if 'uuid' in socket and socket['uuid'] == s[2]]:
|
||||||
sockets.remove(socket)
|
sockets.remove(socket)
|
||||||
|
|
||||||
# Check for new sockets
|
# Check for new sockets
|
||||||
@ -276,7 +295,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
|
|||||||
s['uuid'] = socket_data[2]
|
s['uuid'] = socket_data[2]
|
||||||
|
|
||||||
|
|
||||||
def load_shader_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
|
def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
|
||||||
"""Load a shader node_tree from dumped data
|
"""Load a shader node_tree from dumped data
|
||||||
|
|
||||||
:arg node_tree_data: dumped node data
|
:arg node_tree_data: dumped node data
|
||||||
@ -302,6 +321,14 @@ def load_shader_node_tree(node_tree_data: dict, target_node_tree: bpy.types.Shad
|
|||||||
for node in node_tree_data["nodes"]:
|
for node in node_tree_data["nodes"]:
|
||||||
load_node(node_tree_data["nodes"][node], target_node_tree)
|
load_node(node_tree_data["nodes"][node], target_node_tree)
|
||||||
|
|
||||||
|
for node_id, node_data in node_tree_data["nodes"].items():
|
||||||
|
target_node = target_node_tree.nodes.get(node_id, None)
|
||||||
|
if target_node is None:
|
||||||
|
continue
|
||||||
|
elif 'parent' in node_data:
|
||||||
|
target_node.parent = target_node_tree.nodes[node_data['parent']]
|
||||||
|
else:
|
||||||
|
target_node.parent = None
|
||||||
# TODO: load only required nodes links
|
# TODO: load only required nodes links
|
||||||
# Load nodes links
|
# Load nodes links
|
||||||
target_node_tree.links.clear()
|
target_node_tree.links.clear()
|
||||||
@ -316,6 +343,8 @@ def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
|
|||||||
def has_node_group(node): return (
|
def has_node_group(node): return (
|
||||||
hasattr(node, 'node_tree') and node.node_tree)
|
hasattr(node, 'node_tree') and node.node_tree)
|
||||||
|
|
||||||
|
def has_texture(node): return (
|
||||||
|
node.type in ['ATTRIBUTE_SAMPLE_TEXTURE','TEXTURE'] and node.texture)
|
||||||
deps = []
|
deps = []
|
||||||
|
|
||||||
for node in node_tree.nodes:
|
for node in node_tree.nodes:
|
||||||
@ -323,6 +352,8 @@ def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
|
|||||||
deps.append(node.image)
|
deps.append(node.image)
|
||||||
elif has_node_group(node):
|
elif has_node_group(node):
|
||||||
deps.append(node.node_tree)
|
deps.append(node.node_tree)
|
||||||
|
elif has_texture(node):
|
||||||
|
deps.append(node.texture)
|
||||||
|
|
||||||
return deps
|
return deps
|
||||||
|
|
||||||
@ -353,10 +384,7 @@ def load_materials_slots(src_materials: list, dst_materials: bpy.types.bpy_prop_
|
|||||||
if mat_uuid is not None:
|
if mat_uuid is not None:
|
||||||
mat_ref = get_datablock_from_uuid(mat_uuid, None)
|
mat_ref = get_datablock_from_uuid(mat_uuid, None)
|
||||||
else:
|
else:
|
||||||
mat_ref = bpy.data.materials.get(mat_name, None)
|
mat_ref = bpy.data.materials[mat_name]
|
||||||
|
|
||||||
if mat_ref is None:
|
|
||||||
raise Exception(f"Material {mat_name} doesn't exist")
|
|
||||||
|
|
||||||
dst_materials.append(mat_ref)
|
dst_materials.append(mat_ref)
|
||||||
|
|
||||||
@ -387,7 +415,7 @@ class BlMaterial(BlDatablock):
|
|||||||
if target.node_tree is None:
|
if target.node_tree is None:
|
||||||
target.use_nodes = True
|
target.use_nodes = True
|
||||||
|
|
||||||
load_shader_node_tree(data['node_tree'], target.node_tree)
|
load_node_tree(data['node_tree'], target.node_tree)
|
||||||
|
|
||||||
def _dump_implementation(self, data, instance=None):
|
def _dump_implementation(self, data, instance=None):
|
||||||
assert(instance)
|
assert(instance)
|
||||||
@ -454,7 +482,7 @@ class BlMaterial(BlDatablock):
|
|||||||
]
|
]
|
||||||
data['grease_pencil'] = gp_mat_dumper.dump(instance.grease_pencil)
|
data['grease_pencil'] = gp_mat_dumper.dump(instance.grease_pencil)
|
||||||
elif instance.use_nodes:
|
elif instance.use_nodes:
|
||||||
data['node_tree'] = dump_shader_node_tree(instance.node_tree)
|
data['node_tree'] = dump_node_tree(instance.node_tree)
|
||||||
|
|
||||||
return data
|
return data
|
||||||
|
|
||||||
|
@ -21,13 +21,13 @@ import mathutils
|
|||||||
|
|
||||||
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
|
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
|
||||||
from .bl_datablock import BlDatablock
|
from .bl_datablock import BlDatablock
|
||||||
from .bl_material import (dump_shader_node_tree,
|
from .bl_material import (dump_node_tree,
|
||||||
load_shader_node_tree,
|
load_node_tree,
|
||||||
get_node_tree_dependencies)
|
get_node_tree_dependencies)
|
||||||
|
|
||||||
class BlNodeGroup(BlDatablock):
|
class BlNodeGroup(BlDatablock):
|
||||||
bl_id = "node_groups"
|
bl_id = "node_groups"
|
||||||
bl_class = bpy.types.ShaderNodeTree
|
bl_class = bpy.types.NodeTree
|
||||||
bl_check_common = False
|
bl_check_common = False
|
||||||
bl_icon = 'NODETREE'
|
bl_icon = 'NODETREE'
|
||||||
bl_reload_parent = False
|
bl_reload_parent = False
|
||||||
@ -36,10 +36,10 @@ class BlNodeGroup(BlDatablock):
|
|||||||
return bpy.data.node_groups.new(data["name"], data["type"])
|
return bpy.data.node_groups.new(data["name"], data["type"])
|
||||||
|
|
||||||
def _load_implementation(self, data, target):
|
def _load_implementation(self, data, target):
|
||||||
load_shader_node_tree(data, target)
|
load_node_tree(data, target)
|
||||||
|
|
||||||
def _dump_implementation(self, data, instance=None):
|
def _dump_implementation(self, data, instance=None):
|
||||||
return dump_shader_node_tree(instance)
|
return dump_node_tree(instance)
|
||||||
|
|
||||||
def _resolve_deps_implementation(self):
|
def _resolve_deps_implementation(self):
|
||||||
return get_node_tree_dependencies(self.instance)
|
return get_node_tree_dependencies(self.instance)
|
@ -17,12 +17,14 @@
|
|||||||
|
|
||||||
|
|
||||||
import logging
|
import logging
|
||||||
|
import re
|
||||||
import bpy
|
import bpy
|
||||||
import mathutils
|
import mathutils
|
||||||
from replication.exception import ContextError
|
from replication.exception import ContextError
|
||||||
|
|
||||||
from .bl_datablock import BlDatablock, get_datablock_from_uuid
|
from .bl_datablock import BlDatablock, get_datablock_from_uuid
|
||||||
|
from .bl_material import IGNORED_SOCKETS
|
||||||
|
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||||
from .dump_anything import (
|
from .dump_anything import (
|
||||||
Dumper,
|
Dumper,
|
||||||
Loader,
|
Loader,
|
||||||
@ -36,6 +38,127 @@ SKIN_DATA = [
|
|||||||
'use_root'
|
'use_root'
|
||||||
]
|
]
|
||||||
|
|
||||||
|
SHAPEKEY_BLOCK_ATTR = [
|
||||||
|
'mute',
|
||||||
|
'value',
|
||||||
|
'slider_min',
|
||||||
|
'slider_max',
|
||||||
|
]
|
||||||
|
if bpy.app.version[1] >= 93:
|
||||||
|
SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str, float)
|
||||||
|
else:
|
||||||
|
SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str)
|
||||||
|
logging.warning("Geometry node Float parameter not supported in \
|
||||||
|
Blender 2.92.")
|
||||||
|
|
||||||
|
def get_node_group_inputs(node_group):
|
||||||
|
inputs = []
|
||||||
|
for inpt in node_group.inputs:
|
||||||
|
if inpt.type in IGNORED_SOCKETS:
|
||||||
|
continue
|
||||||
|
else:
|
||||||
|
inputs.append(inpt)
|
||||||
|
return inputs
|
||||||
|
# return [inpt.identifier for inpt in node_group.inputs if inpt.type not in IGNORED_SOCKETS]
|
||||||
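`get_node_group_inputs` simply keeps the sockets whose type is not in the ignored set. A bpy-free sketch of that filter, using dicts in place of socket objects (the `IGNORED_SOCKETS` value here is an illustrative guess, not the addon's actual list):

```python
# Sketch of the socket-filtering logic: keep only inputs whose type
# is not in the ignored set. IGNORED_SOCKETS is a guessed example value.
IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']

def filter_inputs(inputs):
    return [inpt for inpt in inputs if inpt['type'] not in IGNORED_SOCKETS]

sockets = [
    {'identifier': 'Input_1', 'type': 'VALUE'},
    {'identifier': 'Input_2', 'type': 'GEOMETRY'},
    {'identifier': 'Input_3', 'type': 'OBJECT'},
]
print([s['identifier'] for s in filter_inputs(sockets)])  # ['Input_1', 'Input_3']
```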
|
|
||||||
|
|
||||||
|
def dump_physics(target: bpy.types.Object)->dict:
|
||||||
|
"""
|
||||||
|
Dump all physics settings from a given object excluding modifier
|
||||||
|
related physics settings (such as softbody, cloth, dynapaint and fluid)
|
||||||
|
"""
|
||||||
|
dumper = Dumper()
|
||||||
|
dumper.depth = 1
|
||||||
|
physics_data = {}
|
||||||
|
|
||||||
|
# Collisions (collision)
|
||||||
|
if target.collision and target.collision.use:
|
||||||
|
physics_data['collision'] = dumper.dump(target.collision)
|
||||||
|
|
||||||
|
# Field (field)
|
||||||
|
if target.field and target.field.type != "NONE":
|
||||||
|
physics_data['field'] = dumper.dump(target.field)
|
||||||
|
|
||||||
|
# Rigid Body (rigid_body)
|
||||||
|
if target.rigid_body:
|
||||||
|
physics_data['rigid_body'] = dumper.dump(target.rigid_body)
|
||||||
|
|
||||||
|
# Rigid Body constraint (rigid_body_constraint)
|
||||||
|
if target.rigid_body_constraint:
|
||||||
|
physics_data['rigid_body_constraint'] = dumper.dump(target.rigid_body_constraint)
|
||||||
|
|
||||||
|
return physics_data
|
||||||
|
|
||||||
|
def load_physics(dumped_settings: dict, target: bpy.types.Object):
|
||||||
|
""" Load all physics settings from a given object excluding modifier
|
||||||
|
related physics settings (such as softbody, cloth, dynapaint and fluid)
|
||||||
|
"""
|
||||||
|
loader = Loader()
|
||||||
|
|
||||||
|
if 'collision' in dumped_settings:
|
||||||
|
loader.load(target.collision, dumped_settings['collision'])
|
||||||
|
|
||||||
|
if 'field' in dumped_settings:
|
||||||
|
loader.load(target.field, dumped_settings['field'])
|
||||||
|
|
||||||
|
if 'rigid_body' in dumped_settings:
|
||||||
|
if not target.rigid_body:
|
||||||
|
bpy.ops.rigidbody.object_add({"object": target})
|
||||||
|
loader.load(target.rigid_body, dumped_settings['rigid_body'])
|
||||||
|
elif target.rigid_body:
|
||||||
|
bpy.ops.rigidbody.object_remove({"object": target})
|
||||||
|
|
||||||
|
if 'rigid_body_constraint' in dumped_settings:
|
||||||
|
if not target.rigid_body_constraint:
|
||||||
|
bpy.ops.rigidbody.constraint_add({"object": target})
|
||||||
|
loader.load(target.rigid_body_constraint, dumped_settings['rigid_body_constraint'])
|
||||||
|
elif target.rigid_body_constraint:
|
||||||
|
bpy.ops.rigidbody.constraint_remove({"object": target})
|
||||||
|
|
||||||
|
def dump_modifier_geometry_node_inputs(modifier: bpy.types.Modifier) -> list:
|
||||||
|
""" Dump geometry node modifier input properties
|
||||||
|
|
||||||
|
:arg modifier: geometry node modifier to dump
|
||||||
|
:type modifier: bpy.type.Modifier
|
||||||
|
"""
|
||||||
|
dumped_inputs = []
|
||||||
|
for inpt in get_node_group_inputs(modifier.node_group):
|
||||||
|
input_value = modifier[inpt.identifier]
|
||||||
|
|
||||||
|
dumped_input = None
|
||||||
|
if isinstance(input_value, bpy.types.ID):
|
||||||
|
dumped_input = input_value.uuid
|
||||||
|
elif isinstance(input_value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
|
||||||
|
dumped_input = input_value
|
||||||
|
elif hasattr(input_value, 'to_list'):
|
||||||
|
dumped_input = input_value.to_list()
|
||||||
|
dumped_inputs.append(dumped_input)
|
||||||
|
|
||||||
|
return dumped_inputs
|
||||||
|
|
||||||
|
|
||||||
|
def load_modifier_geometry_node_inputs(dumped_modifier: dict, target_modifier: bpy.types.Modifier):
|
||||||
|
""" Load geometry node modifier inputs
|
||||||
|
|
||||||
|
:arg dumped_modifier: source dumped modifier to load
|
||||||
|
:type dumped_modifier: dict
|
||||||
|
:arg target_modifier: target geometry node modifier
|
||||||
|
:type target_modifier: bpy.type.Modifier
|
||||||
|
"""
|
||||||
|
|
||||||
|
for input_index, inpt in enumerate(get_node_group_inputs(target_modifier.node_group)):
|
||||||
|
dumped_value = dumped_modifier['inputs'][input_index]
|
||||||
|
input_value = target_modifier[inpt.identifier]
|
||||||
|
if isinstance(input_value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
|
||||||
|
target_modifier[inpt.identifier] = dumped_value
|
||||||
|
elif hasattr(input_value, 'to_list'):
|
||||||
|
for index in range(len(input_value)):
|
||||||
|
input_value[index] = dumped_value[index]
|
||||||
|
elif inpt.type in ['COLLECTION', 'OBJECT']:
|
||||||
|
target_modifier[inpt.identifier] = get_datablock_from_uuid(
|
||||||
|
dumped_value, None)
|
||||||
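The dump/load pair above serializes each exposed geometry-node input by kind: ID datablocks become uuid strings, supported scalars pass through, and vector-like values flatten to lists. That dispatch can be sketched without bpy; `FakeID` and `FakeVector` below are stand-ins for Blender types, not part of the addon:

```python
# bpy-free sketch of how geometry-node modifier inputs are serialized.
# FakeID / FakeVector are hypothetical stand-ins for Blender types.
class FakeID:
    def __init__(self, uuid):
        self.uuid = uuid

class FakeVector(list):
    def to_list(self):
        return list(self)

SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str, float)

def dump_input(value):
    if isinstance(value, FakeID):
        return value.uuid           # datablocks travel as uuid strings
    if isinstance(value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
        return value                # plain scalars pass through
    if hasattr(value, 'to_list'):
        return value.to_list()      # vectors flatten to plain lists
    return None                     # unsupported kinds are skipped

print(dump_input(FakeID("ab-12")))              # ab-12
print(dump_input(FakeVector([1.0, 2.0, 3.0])))  # [1.0, 2.0, 3.0]
```

On load, the same kinds are recognized in reverse: uuid strings are resolved back to datablocks via `get_datablock_from_uuid`, and lists are written element-wise into the existing vector.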
|
|
||||||
|
|
||||||
def load_pose(target_bone, data):
|
def load_pose(target_bone, data):
|
||||||
target_bone.rotation_mode = data['rotation_mode']
|
target_bone.rotation_mode = data['rotation_mode']
|
||||||
loader = Loader()
|
loader = Loader()
|
||||||
@ -91,19 +214,43 @@ def _is_editmode(object: bpy.types.Object) -> bool:
|
|||||||
child_data.is_editmode)
|
child_data.is_editmode)
|
||||||
|
|
||||||
|
|
||||||
def find_textures_dependencies(collection):
|
def find_textures_dependencies(modifiers: bpy.types.bpy_prop_collection) -> [bpy.types.Texture]:
|
||||||
""" Check collection
|
""" Find textures lying in a modifier stack
|
||||||
|
|
||||||
|
:arg modifiers: modifiers collection
|
||||||
|
:type modifiers: bpy.types.bpy_prop_collection
|
||||||
|
:return: list of bpy.types.Texture pointers
|
||||||
"""
|
"""
|
||||||
textures = []
|
textures = []
|
||||||
for item in collection:
|
for mod in modifiers:
|
||||||
for attr in dir(item):
|
modifier_attributes = [getattr(mod, attr_name)
|
||||||
inst = getattr(item, attr)
|
for attr_name in mod.bl_rna.properties.keys()]
|
||||||
if issubclass(type(inst), bpy.types.Texture) and inst is not None:
|
for attr in modifier_attributes:
|
||||||
textures.append(inst)
|
if issubclass(type(attr), bpy.types.Texture) and attr is not None:
|
||||||
|
textures.append(attr)
|
||||||
|
|
||||||
return textures
|
return textures
|
||||||
|
|
||||||
|
|
||||||
|
def find_geometry_nodes_dependencies(modifiers: bpy.types.bpy_prop_collection) -> [bpy.types.NodeTree]:
|
||||||
|
""" Find geometry nodes dependencies from a modifier stack
|
||||||
|
|
||||||
|
:arg modifiers: modifiers collection
|
||||||
|
:type modifiers: bpy.types.bpy_prop_collection
|
||||||
|
:return: list of bpy.types.NodeTree pointers
|
||||||
|
"""
|
||||||
|
dependencies = []
|
||||||
|
for mod in modifiers:
|
||||||
|
if mod.type == 'NODES' and mod.node_group:
|
||||||
|
dependencies.append(mod.node_group)
|
||||||
|
# for inpt in get_node_group_inputs(mod.node_group):
|
||||||
|
# parameter = mod.get(inpt.identifier)
|
||||||
|
# if parameter and isinstance(parameter, bpy.types.ID):
|
||||||
|
# dependencies.append(parameter)
|
||||||
|
|
||||||
|
return dependencies
|
||||||
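`find_geometry_nodes_dependencies` walks the modifier stack and collects the node groups referenced by `NODES` modifiers. A minimal sketch of the same scan, with plain dicts standing in for bpy modifiers:

```python
# Sketch of scanning a modifier stack for geometry-node dependencies
# (plain dicts stand in for bpy.types.Modifier objects).
def find_geometry_nodes_dependencies(modifiers):
    return [mod['node_group'] for mod in modifiers
            if mod['type'] == 'NODES' and mod.get('node_group')]

stack = [
    {'type': 'SUBSURF', 'node_group': None},
    {'type': 'NODES', 'node_group': 'Geometry Nodes'},
    {'type': 'NODES', 'node_group': None},
]
print(find_geometry_nodes_dependencies(stack))  # ['Geometry Nodes']
```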
|
|
||||||
|
|
||||||
def dump_vertex_groups(src_object: bpy.types.Object) -> dict:
|
def dump_vertex_groups(src_object: bpy.types.Object) -> dict:
|
||||||
""" Dump object's vertex groups
|
""" Dump object's vertex groups
|
||||||
|
|
||||||
@ -148,6 +295,147 @@ def load_vertex_groups(dumped_vertex_groups: dict, target_object: bpy.types.Obje
|
|||||||
for index, weight in vg['vertices']:
|
for index, weight in vg['vertices']:
|
||||||
vertex_group.add([index], weight, 'REPLACE')
|
vertex_group.add([index], weight, 'REPLACE')
|
||||||
|
|
||||||
|
def dump_shape_keys(target_key: bpy.types.Key)->dict:
|
||||||
|
""" Dump the target shape_keys datablock to a dict using numpy
|
||||||
|
|
||||||
|
:param target_key: target key datablock
|
||||||
|
:type target_key: bpy.types.Key
|
||||||
|
:return: dict
|
||||||
|
"""
|
||||||
|
|
||||||
|
dumped_key_blocks = []
|
||||||
|
dumper = Dumper()
|
||||||
|
dumper.include_filter = [
|
||||||
|
'name',
|
||||||
|
'mute',
|
||||||
|
'value',
|
||||||
|
'slider_min',
|
||||||
|
'slider_max',
|
||||||
|
]
|
||||||
|
for key in target_key.key_blocks:
|
||||||
|
dumped_key_block = dumper.dump(key)
|
||||||
|
dumped_key_block['data'] = np_dump_collection(key.data, ['co'])
|
||||||
|
dumped_key_block['relative_key'] = key.relative_key.name
|
||||||
|
dumped_key_blocks.append(dumped_key_block)
|
||||||
|
|
||||||
|
return {
|
||||||
|
'reference_key': target_key.reference_key.name,
|
||||||
|
'use_relative': target_key.use_relative,
|
||||||
|
'key_blocks': dumped_key_blocks,
|
||||||
|
'animation_data': dump_animation_data(target_key)
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def load_shape_keys(dumped_shape_keys: dict, target_object: bpy.types.Object):
|
||||||
|
""" Load the target shape_keys datablock to a dict using numpy
|
||||||
|
|
||||||
|
:param dumped_shape_keys: dumped shape keys data
|
||||||
|
:type dumped_shape_keys: dict
|
||||||
|
:param target_object: object used to load the shapekeys data onto
|
||||||
|
:type target_object: bpy.types.Object
|
||||||
|
"""
|
||||||
|
loader = Loader()
|
||||||
|
# Remove existing ones
|
||||||
|
target_object.shape_key_clear()
|
||||||
|
|
||||||
|
# Create keys and load vertices coords
|
||||||
|
dumped_key_blocks = dumped_shape_keys.get('key_blocks')
|
||||||
|
for dumped_key_block in dumped_key_blocks:
|
||||||
|
key_block = target_object.shape_key_add(name=dumped_key_block['name'])
|
||||||
|
|
||||||
|
loader.load(key_block, dumped_key_block)
|
||||||
|
np_load_collection(dumped_key_block['data'], key_block.data, ['co'])
|
||||||
|
|
||||||
|
# Load relative key after all
|
||||||
|
for dumped_key_block in dumped_key_blocks:
|
||||||
|
relative_key_name = dumped_key_block.get('relative_key')
|
||||||
|
key_name = dumped_key_block.get('name')
|
||||||
|
|
||||||
|
target_keyblock = target_object.data.shape_keys.key_blocks[key_name]
|
||||||
|
relative_key = target_object.data.shape_keys.key_blocks[relative_key_name]
|
||||||
|
|
||||||
|
target_keyblock.relative_key = relative_key
|
||||||
|
|
||||||
|
# Shape keys animation data
|
||||||
|
anim_data = dumped_shape_keys.get('animation_data')
|
||||||
|
|
||||||
|
if anim_data:
|
||||||
|
load_animation_data(anim_data, target_object.data.shape_keys)
|
||||||
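`load_shape_keys` deliberately creates every key block before wiring the `relative_key` references, since a block may point at one that is created later in the list. The two-pass pattern can be sketched without bpy, using dicts in place of key blocks:

```python
# Sketch of the two-pass relative-key wiring used by load_shape_keys
# (dicts stand in for bpy key blocks).
def load_keys(dumped_blocks):
    # Pass 1: create all blocks so every name is resolvable.
    blocks = {d['name']: {'name': d['name'], 'relative_key': None}
              for d in dumped_blocks}
    # Pass 2: wire relative-key references by name.
    for d in dumped_blocks:
        blocks[d['name']]['relative_key'] = blocks[d['relative_key']]
    return blocks

dumped = [
    {'name': 'Basis', 'relative_key': 'Basis'},
    {'name': 'Smile', 'relative_key': 'Basis'},
]
keys = load_keys(dumped)
print(keys['Smile']['relative_key']['name'])  # Basis
```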
|
|
||||||
|
|
||||||
|
def dump_modifiers(modifiers: bpy.types.bpy_prop_collection)->dict:
|
||||||
|
""" Dump all modifiers of a modifier collection into a dict
|
||||||
|
|
||||||
|
:param modifiers: modifiers
|
||||||
|
:type modifiers: bpy.types.bpy_prop_collection
|
||||||
|
:return: dict
|
||||||
|
"""
|
||||||
|
dumped_modifiers = {}
|
||||||
|
dumper = Dumper()
|
||||||
|
dumper.depth = 1
|
||||||
|
dumper.exclude_filter = ['is_active']
|
||||||
|
|
||||||
|
for index, modifier in enumerate(modifiers):
|
||||||
|
dumped_modifier = dumper.dump(modifier)
|
||||||
|
# hack to dump geometry nodes inputs
|
||||||
|
if modifier.type == 'NODES':
|
||||||
|
dumped_inputs = dump_modifier_geometry_node_inputs(
|
||||||
|
modifier)
|
||||||
|
dumped_modifier['inputs'] = dumped_inputs
|
||||||
|
|
||||||
|
elif modifier.type == 'PARTICLE_SYSTEM':
|
||||||
|
dumper.exclude_filter = [
|
||||||
|
"is_edited",
|
||||||
|
"is_editable",
|
||||||
|
"is_global_hair"
|
||||||
|
]
|
||||||
|
dumped_modifier['particle_system'] = dumper.dump(modifier.particle_system)
|
||||||
|
dumped_modifier['particle_system']['settings_uuid'] = modifier.particle_system.settings.uuid
|
||||||
|
|
||||||
|
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
|
||||||
|
dumped_modifier['settings'] = dumper.dump(modifier.settings)
|
||||||
|
elif modifier.type == 'UV_PROJECT':
|
||||||
|
dumped_modifier['projectors'] = [p.object.name for p in modifier.projectors if p and p.object]
|
||||||
|
|
||||||
|
dumped_modifiers[modifier.name] = dumped_modifier
|
||||||
|
return dumped_modifiers
|
||||||
|
|
||||||
|
|
||||||
|
def load_modifiers_custom_data(dumped_modifiers: dict, modifiers: bpy.types.bpy_prop_collection):
|
||||||
|
""" Load modifiers custom data not managed by the dump_anything loader
|
||||||
|
|
||||||
|
:param dumped_modifiers: modifiers to load
|
||||||
|
:type dumped_modifiers: dict
|
||||||
|
:param modifiers: target modifiers collection
|
||||||
|
:type modifiers: bpy.types.bpy_prop_collection
|
||||||
|
"""
|
||||||
|
loader = Loader()
|
||||||
|
|
||||||
|
for modifier in modifiers:
|
||||||
|
dumped_modifier = dumped_modifiers.get(modifier.name)
|
||||||
|
if modifier.type == 'NODES':
|
||||||
|
load_modifier_geometry_node_inputs(dumped_modifier, modifier)
|
||||||
|
elif modifier.type == 'PARTICLE_SYSTEM':
|
||||||
|
default = modifier.particle_system.settings
|
||||||
|
dumped_particles = dumped_modifier['particle_system']
|
||||||
|
loader.load(modifier.particle_system, dumped_particles)
|
||||||
|
|
||||||
|
settings = get_datablock_from_uuid(dumped_particles['settings_uuid'], None)
|
||||||
|
if settings:
|
||||||
|
modifier.particle_system.settings = settings
|
||||||
|
# Hack to remove the default generated particle settings
|
||||||
|
if not default.uuid:
|
||||||
|
bpy.data.particles.remove(default)
|
||||||
|
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
|
||||||
|
loader.load(modifier.settings, dumped_modifier['settings'])
|
||||||
|
elif modifier.type == 'UV_PROJECT':
|
||||||
|
for projector_index, projector_object in enumerate(dumped_modifier['projectors']):
|
||||||
|
target_object = bpy.data.objects.get(projector_object)
|
||||||
|
if target_object:
|
||||||
|
modifier.projectors[projector_index].object = target_object
|
||||||
|
else:
|
||||||
|
logging.error("Could't load projector target object {projector_object}")
|
||||||
|
|
||||||
class BlObject(BlDatablock):
|
class BlObject(BlDatablock):
|
||||||
bl_id = "objects"
|
bl_id = "objects"
|
||||||
bl_class = bpy.types.Object
|
bl_class = bpy.types.Object
|
||||||
@ -171,13 +459,14 @@ class BlObject(BlDatablock):
|
|||||||
object_name = data.get("name")
|
object_name = data.get("name")
|
||||||
data_uuid = data.get("data_uuid")
|
data_uuid = data.get("data_uuid")
|
||||||
data_id = data.get("data")
|
data_id = data.get("data")
|
||||||
|
data_type = data.get("type")
|
||||||
|
|
||||||
object_data = get_datablock_from_uuid(
|
object_data = get_datablock_from_uuid(
|
||||||
data_uuid,
|
data_uuid,
|
||||||
find_data_from_name(data_id),
|
find_data_from_name(data_id),
|
||||||
ignore=['images']) # TODO: use resolve_from_id
|
ignore=['images']) # TODO: use resolve_from_id
|
||||||
|
|
||||||
if object_data is None and data_uuid:
|
if data_type != 'EMPTY' and object_data is None:
|
||||||
raise Exception(f"Fail to load object {data['name']}({self.uuid})")
|
raise Exception(f"Fail to load object {data['name']}({self.uuid})")
|
||||||
|
|
||||||
instance = bpy.data.objects.new(object_name, object_data)
|
instance = bpy.data.objects.new(object_name, object_data)
|
||||||
@ -203,31 +492,27 @@ class BlObject(BlDatablock):
|
|||||||
object_data = target.data
|
object_data = target.data
|
||||||
|
|
||||||
# SHAPE KEYS
|
# SHAPE KEYS
|
||||||
if 'shape_keys' in data:
|
shape_keys = data.get('shape_keys')
|
||||||
target.shape_key_clear()
|
if shape_keys:
|
||||||
|
load_shape_keys(shape_keys, target)
|
||||||
# Create keys and load vertices coords
|
|
||||||
for key_block in data['shape_keys']['key_blocks']:
|
|
||||||
key_data = data['shape_keys']['key_blocks'][key_block]
|
|
||||||
target.shape_key_add(name=key_block)
|
|
||||||
|
|
||||||
loader.load(
|
|
||||||
target.data.shape_keys.key_blocks[key_block], key_data)
|
|
||||||
for vert in key_data['data']:
|
|
||||||
target.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
|
|
||||||
|
|
||||||
# Load relative key after all
|
|
||||||
for key_block in data['shape_keys']['key_blocks']:
|
|
||||||
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
|
|
||||||
|
|
||||||
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
|
|
||||||
|
|
||||||
# Load transformation data
|
# Load transformation data
|
||||||
loader.load(target, data)
|
loader.load(target, data)
|
||||||
|
|
||||||
|
# Object display fields
|
||||||
if 'display' in data:
|
if 'display' in data:
|
||||||
loader.load(target.display, data['display'])
|
loader.load(target.display, data['display'])
|
||||||
|
|
||||||
|
# Parenting
|
||||||
|
parent_id = data.get('parent_uid')
|
||||||
|
if parent_id:
|
||||||
|
parent = get_datablock_from_uuid(parent_id[0], bpy.data.objects[parent_id[1]])
|
||||||
|
# Avoid reloading
|
||||||
|
if target.parent != parent and parent is not None:
|
||||||
|
target.parent = parent
|
||||||
|
elif target.parent:
|
||||||
|
target.parent = None
|
||||||
|
|
||||||
# Pose
|
# Pose
|
||||||
if 'pose' in data:
|
if 'pose' in data:
|
||||||
if not target.pose:
|
if not target.pose:
|
||||||
@ -272,9 +557,23 @@ class BlObject(BlDatablock):
|
|||||||
SKIN_DATA)
|
SKIN_DATA)
|
||||||
|
|
||||||
if hasattr(target, 'cycles_visibility') \
|
if hasattr(target, 'cycles_visibility') \
|
||||||
and 'cycles_visibility' in data:
|
and 'cycles_visibility' in data:
|
||||||
loader.load(target.cycles_visibility, data['cycles_visibility'])
|
loader.load(target.cycles_visibility, data['cycles_visibility'])
|
||||||
|
|
||||||
|
if hasattr(target, 'modifiers'):
|
||||||
|
load_modifiers_custom_data(data['modifiers'], target.modifiers)
|
||||||
|
|
||||||
|
# PHYSICS
|
||||||
+        load_physics(data, target)
+
+        transform = data.get('transforms', None)
+
+        if transform:
+            target.matrix_parent_inverse = mathutils.Matrix(
+                transform['matrix_parent_inverse'])
+            target.matrix_basis = mathutils.Matrix(transform['matrix_basis'])
+            target.matrix_local = mathutils.Matrix(transform['matrix_local'])
+
+
     def _dump_implementation(self, data, instance=None):
         assert(instance)
 
@@ -289,7 +588,6 @@ class BlObject(BlDatablock):
         dumper.include_filter = [
             "name",
             "rotation_mode",
-            "parent",
             "data",
             "library",
             "empty_display_type",
@@ -304,8 +602,6 @@ class BlObject(BlDatablock):
             "color",
             "instance_collection",
             "instance_type",
-            "location",
-            "scale",
             'lock_location',
             'lock_rotation',
             'lock_scale',
@@ -319,12 +615,16 @@ class BlObject(BlDatablock):
             'show_all_edges',
             'show_texture_space',
             'show_in_front',
-            'type',
-            'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler',
+            'type'
         ]
 
         data = dumper.dump(instance)
 
+        dumper.include_filter = [
+            'matrix_parent_inverse',
+            'matrix_local',
+            'matrix_basis']
+        data['transforms'] = dumper.dump(instance)
+
         dumper.include_filter = [
             'show_shadows',
         ]
@@ -334,15 +634,14 @@ class BlObject(BlDatablock):
         if self.is_library:
             return data
 
+        # PARENTING
+        if instance.parent:
+            data['parent_uid'] = (instance.parent.uuid, instance.parent.name)
+
         # MODIFIERS
+        modifiers = getattr(instance, 'modifiers', None)
         if hasattr(instance, 'modifiers'):
-            data["modifiers"] = {}
-            modifiers = getattr(instance, 'modifiers', None)
-            if modifiers:
-                dumper.include_filter = None
-                dumper.depth = 1
-                for index, modifier in enumerate(modifiers):
-                    data["modifiers"][modifier.name] = dumper.dump(modifier)
+            data['modifiers'] = dump_modifiers(modifiers)
 
         gp_modifiers = getattr(instance, 'grease_pencil_modifiers', None)
 
@@ -365,6 +664,7 @@ class BlObject(BlDatablock):
                     'location']
                 gp_mod_data['curve'] = curve_dumper.dump(modifier.curve)
 
 
         # CONSTRAINTS
         if hasattr(instance, 'constraints'):
             dumper.include_filter = None
@@ -409,7 +709,6 @@ class BlObject(BlDatablock):
                 bone_groups[group.name] = dumper.dump(group)
             data['pose']['bone_groups'] = bone_groups
 
-
         # VERTEx GROUP
         if len(instance.vertex_groups) > 0:
             data['vertex_groups'] = dump_vertex_groups(instance)
@@ -417,36 +716,14 @@ class BlObject(BlDatablock):
         # SHAPE KEYS
         object_data = instance.data
         if hasattr(object_data, 'shape_keys') and object_data.shape_keys:
-            dumper = Dumper()
-            dumper.depth = 2
-            dumper.include_filter = [
-                'reference_key',
-                'use_relative'
-            ]
-            data['shape_keys'] = dumper.dump(object_data.shape_keys)
-            data['shape_keys']['reference_key'] = object_data.shape_keys.reference_key.name
-            key_blocks = {}
-            for key in object_data.shape_keys.key_blocks:
-                dumper.depth = 3
-                dumper.include_filter = [
-                    'name',
-                    'data',
-                    'mute',
-                    'value',
-                    'slider_min',
-                    'slider_max',
-                    'data',
-                    'co'
-                ]
-                key_blocks[key.name] = dumper.dump(key)
-                key_blocks[key.name]['relative_key'] = key.relative_key.name
-            data['shape_keys']['key_blocks'] = key_blocks
+            data['shape_keys'] = dump_shape_keys(object_data.shape_keys)
 
         # SKIN VERTICES
         if hasattr(object_data, 'skin_vertices') and object_data.skin_vertices:
             skin_vertices = list()
             for skin_data in object_data.skin_vertices:
-                skin_vertices.append(np_dump_collection(skin_data.data, SKIN_DATA))
+                skin_vertices.append(
+                    np_dump_collection(skin_data.data, SKIN_DATA))
             data['skin_vertices'] = skin_vertices
 
         # CYCLE SETTINGS
@@ -461,6 +738,9 @@ class BlObject(BlDatablock):
         ]
         data['cycles_visibility'] = dumper.dump(instance.cycles_visibility)
 
+        # PHYSICS
+        data.update(dump_physics(instance))
+
         return data
 
     def _resolve_deps_implementation(self):
@@ -469,17 +749,25 @@ class BlObject(BlDatablock):
         # Avoid Empty case
         if self.instance.data:
             deps.append(self.instance.data)
-        if self.instance.parent :
-            deps.append(self.instance.parent)
+
+        # Particle systems
+        for particle_slot in self.instance.particle_systems:
+            deps.append(particle_slot.settings)
 
         if self.is_library:
             deps.append(self.instance.library)
 
+        if self.instance.parent:
+            deps.append(self.instance.parent)
+
         if self.instance.instance_type == 'COLLECTION':
             # TODO: uuid based
             deps.append(self.instance.instance_collection)
 
         if self.instance.modifiers:
             deps.extend(find_textures_dependencies(self.instance.modifiers))
+            deps.extend(find_geometry_nodes_dependencies(self.instance.modifiers))
+
+        if hasattr(self.instance.data, 'shape_keys') and self.instance.data.shape_keys:
+            deps.extend(resolve_animation_dependencies(self.instance.data.shape_keys))
 
         return deps
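The hunks above move object transforms into a dedicated `transforms` dict: `matrix_parent_inverse`, `matrix_local` and `matrix_basis` are dumped on the sending side and rebuilt with `mathutils.Matrix` on load. A minimal sketch of that round-trip, assuming a 4x4 matrix serializes to nested lists (`dump_transforms` and `load_transforms` are hypothetical stand-ins, since `bpy` and `mathutils` are unavailable outside Blender):

```python
# Hypothetical stand-ins for the Dumper/mathutils.Matrix round-trip:
# each 4x4 matrix is dumped as nested lists and rebuilt on load.
def dump_transforms(matrices: dict) -> dict:
    # Serialize every matrix to plain nested lists (JSON-friendly).
    return {name: [list(row) for row in m] for name, m in matrices.items()}

def load_transforms(dumped: dict) -> dict:
    # Rebuild an immutable matrix representation from the nested lists.
    return {name: tuple(tuple(row) for row in rows)
            for name, rows in dumped.items()}

identity = tuple(tuple(1.0 if i == j else 0.0 for j in range(4))
                 for i in range(4))
transforms = {'matrix_parent_inverse': identity,
              'matrix_local': identity,
              'matrix_basis': identity}
assert load_transforms(dump_transforms(transforms)) == transforms
```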
multi_user/bl_types/bl_particle.py (new file, 90 lines)
@@ -0,0 +1,90 @@
+import bpy
+import mathutils
+
+from . import dump_anything
+from .bl_datablock import BlDatablock, get_datablock_from_uuid
+
+
+def dump_textures_slots(texture_slots: bpy.types.bpy_prop_collection) -> list:
+    """ Dump every texture slot collection as the form:
+        [(index, slot_texture_uuid, slot_texture_name), (), ...]
+    """
+    dumped_slots = []
+    for index, slot in enumerate(texture_slots):
+        if slot and slot.texture:
+            dumped_slots.append((index, slot.texture.uuid, slot.texture.name))
+
+    return dumped_slots
+
+
+def load_texture_slots(dumped_slots: list, target_slots: bpy.types.bpy_prop_collection):
+    """
+    """
+    for index, slot in enumerate(target_slots):
+        if slot:
+            target_slots.clear(index)
+
+    for index, slot_uuid, slot_name in dumped_slots:
+        target_slots.create(index).texture = get_datablock_from_uuid(
+            slot_uuid, slot_name
+        )
+
+
+IGNORED_ATTR = [
+    "is_embedded_data",
+    "is_evaluated",
+    "is_fluid",
+    "is_library_indirect",
+    "users"
+]
+
+
+class BlParticle(BlDatablock):
+    bl_id = "particles"
+    bl_class = bpy.types.ParticleSettings
+    bl_icon = "PARTICLES"
+    bl_check_common = False
+    bl_reload_parent = False
+
+    def _construct(self, data):
+        instance = bpy.data.particles.new(data["name"])
+        instance.uuid = self.uuid
+        return instance
+
+    def _load_implementation(self, data, target):
+        dump_anything.load(target, data)
+
+        dump_anything.load(target.effector_weights, data["effector_weights"])
+
+        # Force field
+        force_field_1 = data.get("force_field_1", None)
+        if force_field_1:
+            dump_anything.load(target.force_field_1, force_field_1)
+
+        force_field_2 = data.get("force_field_2", None)
+        if force_field_2:
+            dump_anything.load(target.force_field_2, force_field_2)
+
+        # Texture slots
+        load_texture_slots(data["texture_slots"], target.texture_slots)
+
+    def _dump_implementation(self, data, instance=None):
+        assert instance
+
+        dumper = dump_anything.Dumper()
+        dumper.depth = 1
+        dumper.exclude_filter = IGNORED_ATTR
+        data = dumper.dump(instance)
+
+        # Particle effectors
+        data["effector_weights"] = dumper.dump(instance.effector_weights)
+        if instance.force_field_1:
+            data["force_field_1"] = dumper.dump(instance.force_field_1)
+        if instance.force_field_2:
+            data["force_field_2"] = dumper.dump(instance.force_field_2)
+
+        # Texture slots
+        data["texture_slots"] = dump_textures_slots(instance.texture_slots)
+
+        return data
+
+    def _resolve_deps_implementation(self):
+        return [t.texture for t in self.instance.texture_slots if t and t.texture]
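The texture-slot wire format introduced by `bl_particle.py` is a flat list of `(index, texture_uuid, texture_name)` tuples, skipping empty slots. A self-contained sketch of the dump side, with `Texture` and `Slot` as hypothetical stand-ins for the `bpy` types:

```python
# Hypothetical stand-ins for bpy.types texture slots, used to
# illustrate dump_textures_slots() from the new file above.
class Texture:
    def __init__(self, uuid, name):
        self.uuid, self.name = uuid, name

class Slot:
    def __init__(self, texture=None):
        self.texture = texture

def dump_textures_slots(texture_slots) -> list:
    # Keep only occupied slots, remembering their original index so
    # load_texture_slots() can recreate them at the same position.
    dumped = []
    for index, slot in enumerate(texture_slots):
        if slot and slot.texture:
            dumped.append((index, slot.texture.uuid, slot.texture.name))
    return dumped

slots = [Slot(Texture("uuid-1", "noise")), Slot(), Slot(Texture("uuid-2", "voronoi"))]
assert dump_textures_slots(slots) == [(0, "uuid-1", "noise"), (2, "uuid-2", "voronoi")]
```

The empty middle slot is dropped but the surviving entries keep their indices, which is what lets the loader rebuild a sparse slot collection.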
@@ -368,6 +368,8 @@ def load_sequence(sequence_data: dict, sequence_editor: bpy.types.SequenceEditor
 
 
 class BlScene(BlDatablock):
+    is_root = True
+
     bl_id = "scenes"
     bl_class = bpy.types.Scene
     bl_check_common = True
@@ -21,8 +21,8 @@ import mathutils
 
 from .dump_anything import Loader, Dumper
 from .bl_datablock import BlDatablock
-from .bl_material import (load_shader_node_tree,
-                          dump_shader_node_tree,
+from .bl_material import (load_node_tree,
+                          dump_node_tree,
                           get_node_tree_dependencies)
 
 
@@ -44,7 +44,7 @@ class BlWorld(BlDatablock):
         if target.node_tree is None:
             target.use_nodes = True
 
-        load_shader_node_tree(data['node_tree'], target.node_tree)
+        load_node_tree(data['node_tree'], target.node_tree)
 
     def _dump_implementation(self, data, instance=None):
         assert(instance)
@@ -58,7 +58,7 @@ class BlWorld(BlDatablock):
         ]
         data = world_dumper.dump(instance)
         if instance.use_nodes:
-            data['node_tree'] = dump_shader_node_tree(instance.node_tree)
+            data['node_tree'] = dump_node_tree(instance.node_tree)
 
         return data
 
@@ -596,6 +596,8 @@ class Loader:
             instance.write(bpy.data.textures.get(dump))
         elif isinstance(rna_property_type, T.ColorRamp):
             self._load_default(instance, dump)
+        elif isinstance(rna_property_type, T.NodeTree):
+            instance.write(bpy.data.node_groups.get(dump))
         elif isinstance(rna_property_type, T.Object):
             instance.write(bpy.data.objects.get(dump))
         elif isinstance(rna_property_type, T.Mesh):
@@ -608,6 +610,8 @@ class Loader:
             instance.write(bpy.data.fonts.get(dump))
         elif isinstance(rna_property_type, T.Sound):
             instance.write(bpy.data.sounds.get(dump))
+        # elif isinstance(rna_property_type, T.ParticleSettings):
+        #     instance.write(bpy.data.particles.get(dump))
 
     def _load_matrix(self, matrix, dump):
         matrix.write(mathutils.Matrix(dump))
Submodule multi_user/libs/replication added at 8c27d0cec6
@@ -32,6 +32,7 @@ from operator import itemgetter
 from pathlib import Path
 from queue import Queue
 from time import gmtime, strftime
+import traceback
 
 try:
     import _pickle as pickle
@@ -44,9 +45,11 @@ from bpy.app.handlers import persistent
 from bpy_extras.io_utils import ExportHelper, ImportHelper
 from replication.constants import (COMMITED, FETCHED, RP_COMMON, STATE_ACTIVE,
                                    STATE_INITIAL, STATE_SYNCING, UP)
-from replication.data import ReplicatedDataFactory
-from replication.exception import NonAuthorizedOperationError, ContextError
+from replication.data import DataTranslationProtocol
+from replication.exception import ContextError, NonAuthorizedOperationError
 from replication.interface import session
+from replication.porcelain import add, apply
+from replication.repository import Repository
 
 from . import bl_types, environment, timers, ui, utils
 from .presence import SessionStatusWidget, renderer, view3d_find
@@ -80,8 +83,8 @@ def initialize_session():
 
     # Step 1: Constrect nodes
     logging.info("Constructing nodes")
-    for node in session._graph.list_ordered():
-        node_ref = session.get(uuid=node)
+    for node in session.repository.list_ordered():
+        node_ref = session.repository.get_node(node)
         if node_ref is None:
             logging.error(f"Can't construct node {node}")
         elif node_ref.state == FETCHED:
@@ -89,8 +92,8 @@ def initialize_session():
 
     # Step 2: Load nodes
     logging.info("Loading nodes")
-    for node in session._graph.list_ordered():
-        node_ref = session.get(uuid=node)
+    for node in session.repository.list_ordered():
+        node_ref = session.repository.get_node(node)
 
         if node_ref is None:
             logging.error(f"Can't load node {node}")
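`initialize_session()` above walks the repository in dependency order twice: a first pass constructs every fetched datablock, then a second pass loads their contents once all targets exist. A pure-Python sketch of that two-pass scheme (`Repository` and `Node` here are hypothetical stand-ins, not the real replication API):

```python
# Hypothetical stand-ins illustrating the two-pass construct/load startup.
FETCHED, UP = "FETCHED", "UP"

class Node:
    def __init__(self, uuid):
        self.uuid, self.state, self.log = uuid, FETCHED, []

class Repository:
    def __init__(self, nodes):
        self._nodes = {n.uuid: n for n in nodes}
        self._order = [n.uuid for n in nodes]  # parents before children

    def list_ordered(self):
        return list(self._order)

    def get_node(self, uuid):
        return self._nodes.get(uuid)

def initialize(repo):
    # Step 1: construct every fetched node (containers first).
    for uuid in repo.list_ordered():
        node = repo.get_node(uuid)
        if node and node.state == FETCHED:
            node.log.append("construct")
    # Step 2: load node contents now that every target exists.
    for uuid in repo.list_ordered():
        node = repo.get_node(uuid)
        if node:
            node.log.append("load")
            node.state = UP

repo = Repository([Node("scene"), Node("object")])
initialize(repo)
assert repo.get_node("scene").log == ["construct", "load"]
```

Splitting construction from loading matters because a node's load step may reference datablocks that only exist after every construct step has run.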
@@ -186,7 +189,7 @@ class SessionStartOperator(bpy.types.Operator):
 
         handler.setFormatter(formatter)
 
-        bpy_factory = ReplicatedDataFactory()
+        bpy_protocol = DataTranslationProtocol()
         supported_bl_types = []
 
         # init the factory with supported types
@@ -205,22 +208,17 @@ class SessionStartOperator(bpy.types.Operator):
 
             type_local_config = settings.supported_datablocks[type_impl_name]
 
-            bpy_factory.register_type(
+            bpy_protocol.register_type(
                 type_module_class.bl_class,
                 type_module_class,
                 check_common=type_module_class.bl_check_common)
 
-        deleyables.append(timers.ApplyTimer(timeout=settings.depsgraph_update_rate))
-
         if bpy.app.version[1] >= 91:
             python_binary_path = sys.executable
         else:
             python_binary_path = bpy.app.binary_path_python
 
-        session.configure(
-            factory=bpy_factory,
-            python_path=python_binary_path,
-            external_update_handling=True)
+        repo = Repository(data_protocol=bpy_protocol)
 
         # Host a session
         if self.host:
@@ -231,13 +229,14 @@ class SessionStartOperator(bpy.types.Operator):
                 runtime_settings.internet_ip = environment.get_ip()
 
             try:
+                # Init repository
                 for scene in bpy.data.scenes:
-                    session.add(scene)
+                    add(repo, scene)
 
                 session.host(
+                    repository= repo,
                     id=settings.username,
                     port=settings.port,
-                    ipc_port=settings.ipc_port,
                     timeout=settings.connection_timeout,
                     password=admin_pass,
                     cache_directory=settings.cache_directory,
@@ -247,7 +246,6 @@ class SessionStartOperator(bpy.types.Operator):
             except Exception as e:
                 self.report({'ERROR'}, repr(e))
                 logging.error(f"Error: {e}")
-                import traceback
                 traceback.print_exc()
         # Join a session
         else:
@@ -258,10 +256,10 @@ class SessionStartOperator(bpy.types.Operator):
 
             try:
                 session.connect(
+                    repository= repo,
                     id=settings.username,
                     address=settings.ip,
                     port=settings.port,
-                    ipc_port=settings.ipc_port,
                     timeout=settings.connection_timeout,
                     password=admin_pass
                 )
@@ -272,6 +270,7 @@ class SessionStartOperator(bpy.types.Operator):
         # Background client updates service
         deleyables.append(timers.ClientUpdate())
         deleyables.append(timers.DynamicRightSelectTimer())
+        deleyables.append(timers.ApplyTimer(timeout=settings.depsgraph_update_rate))
         # deleyables.append(timers.PushTimer(
         #     queue=stagging,
         #     timeout=settings.depsgraph_update_rate
@@ -280,7 +279,9 @@ class SessionStartOperator(bpy.types.Operator):
         session_user_sync = timers.SessionUserSync()
         session_background_executor = timers.MainThreadExecutor(
             execution_queue=background_execution_queue)
+        session_listen = timers.SessionListenTimer(timeout=0.001)
 
+        session_listen.register()
         session_update.register()
         session_user_sync.register()
         session_background_executor.register()
@@ -288,7 +289,7 @@ class SessionStartOperator(bpy.types.Operator):
         deleyables.append(session_background_executor)
         deleyables.append(session_update)
         deleyables.append(session_user_sync)
+        deleyables.append(session_listen)
 
         self.report(
@@ -329,7 +330,7 @@ class SessionInitOperator(bpy.types.Operator):
         utils.clean_scene()
 
         for scene in bpy.data.scenes:
-            session.add(scene)
+            add(session.repository, scene)
 
         session.init()
 
@@ -351,7 +352,7 @@ class SessionStopOperator(bpy.types.Operator):
 
         if session:
             try:
-                session.disconnect()
+                session.disconnect(reason='user')
 
             except Exception as e:
                 self.report({'ERROR'}, repr(e))
@@ -600,17 +601,22 @@ class SessionApply(bpy.types.Operator):
     def execute(self, context):
         logging.debug(f"Running apply on {self.target}")
         try:
-            node_ref = session.get(uuid=self.target)
-            session.apply(self.target,
-                          force=True,
-                          force_dependencies=self.reset_dependencies)
+            node_ref = session.repository.get_node(self.target)
+            apply(session.repository,
+                  self.target,
+                  force=True,
+                  force_dependencies=self.reset_dependencies)
             if node_ref.bl_reload_parent:
-                for parent in session._graph.find_parents(self.target):
+                for parent in session.repository.get_parents(self.target):
                     logging.debug(f"Refresh parent {parent}")
-                    session.apply(parent, force=True)
+                    apply(session.repository,
+                          parent.uuid,
+                          force=True)
         except Exception as e:
             self.report({'ERROR'}, repr(e))
-            return {"CANCELED"}
+            traceback.print_exc()
+            return {"CANCELLED"}
 
         return {"FINISHED"}
 
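The `SessionApply` hunk above applies the target node, then re-applies any parent flagged `bl_reload_parent` so containers (e.g. an object wrapping a mesh) pick up the change. A pure-Python sketch of that ordering, with all names as hypothetical stand-ins for the replication API:

```python
# Hypothetical stand-in for the apply-then-refresh-parents pattern.
def apply_with_parents(repository: dict, target: str, applied: list):
    # Apply the target node first.
    applied.append(target)
    node = repository["nodes"][target]
    # Then re-apply each parent if the node requests a parent reload.
    if node.get("bl_reload_parent"):
        for parent in repository["parents"].get(target, []):
            applied.append(parent)

repo = {
    "nodes": {"mesh": {"bl_reload_parent": True}, "obj": {}},
    "parents": {"mesh": ["obj"]},
}
applied = []
apply_with_parents(repo, "mesh", applied)
assert applied == ["mesh", "obj"]
```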
@@ -650,15 +656,15 @@ class ApplyArmatureOperator(bpy.types.Operator):
             return {'CANCELLED'}
 
         if event.type == 'TIMER':
-            if session and session.state['STATE'] == STATE_ACTIVE:
+            if session and session.state == STATE_ACTIVE:
                 nodes = session.list(filter=bl_types.bl_armature.BlArmature)
 
                 for node in nodes:
-                    node_ref = session.get(uuid=node)
+                    node_ref = session.repository.get_node(node)
 
                     if node_ref.state == FETCHED:
                         try:
-                            session.apply(node)
+                            apply(session.repository, node)
                         except Exception as e:
                             logging.error("Fail to apply armature: {e}")
 
@@ -795,7 +801,7 @@ class SessionSaveBackupOperator(bpy.types.Operator, ExportHelper):
 
     @classmethod
     def poll(cls, context):
-        return session.state['STATE'] == STATE_ACTIVE
+        return session.state == STATE_ACTIVE
 
 class SessionStopAutoSaveOperator(bpy.types.Operator):
     bl_idname = "session.cancel_autosave"
@@ -804,7 +810,7 @@ class SessionStopAutoSaveOperator(bpy.types.Operator):
 
     @classmethod
     def poll(cls, context):
-        return (session.state['STATE'] == STATE_ACTIVE and 'SessionBackupTimer' in registry)
+        return (session.state == STATE_ACTIVE and 'SessionBackupTimer' in registry)
 
     def execute(self, context):
         autosave_timer = registry.get('SessionBackupTimer')
@@ -829,7 +835,7 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
     )
 
     def execute(self, context):
-        from replication.graph import ReplicationGraph
+        from replication.repository import Repository
 
         # TODO: add filechecks
 
@@ -849,7 +855,7 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
 
 
         # init the factory with supported types
-        bpy_factory = ReplicatedDataFactory()
+        bpy_protocol = DataTranslationProtocol()
         for type in bl_types.types_to_register():
             type_module = getattr(bl_types, type)
             name = [e.capitalize() for e in type.split('_')[1:]]
@@ -857,16 +863,16 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
             type_module_class = getattr(type_module, type_impl_name)
 
 
-            bpy_factory.register_type(
+            bpy_protocol.register_type(
                 type_module_class.bl_class,
                 type_module_class)
 
-        graph = ReplicationGraph()
+        graph = Repository()
 
         for node, node_data in nodes:
             node_type = node_data.get('str_type')
 
-            impl = bpy_factory.get_implementation_from_net(node_type)
+            impl = bpy_protocol.get_implementation_from_net(node_type)
 
             if impl:
                 logging.info(f"Loading {node}")
@@ -874,7 +880,7 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
                     uuid=node,
                     dependencies=node_data['dependencies'],
                     data=node_data['data'])
-                instance.store(graph)
+                graph.do_commit(instance)
                 instance.state = FETCHED
 
         logging.info("Graph succefully loaded")
@@ -923,7 +929,7 @@ classes = (
 def update_external_dependencies():
     nodes_ids = session.list(filter=bl_types.bl_file.BlFile)
     for node_id in nodes_ids:
-        node = session.get(node_id)
+        node = session.repository.get_node(node_id)
         if node and node.owner in [session.id, RP_COMMON] \
                 and node.has_changed():
             session.commit(node_id)
@@ -932,11 +938,11 @@ def update_external_dependencies():
 def sanitize_deps_graph(remove_nodes: bool = False):
     """ Cleanup the replication graph
     """
-    if session and session.state['STATE'] == STATE_ACTIVE:
+    if session and session.state == STATE_ACTIVE:
         start = utils.current_milli_time()
         rm_cpt = 0
         for node_key in session.list():
-            node = session.get(node_key)
+            node = session.repository.get_node(node_key)
             if node is None \
                     or (node.state == UP and not node.resolve(construct=False)):
                 if remove_nodes:
@@ -957,18 +963,18 @@ def resolve_deps_graph(dummy):
     A future solution should be to avoid storing dataclock reference...
 
     """
-    if session and session.state['STATE'] == STATE_ACTIVE:
+    if session and session.state == STATE_ACTIVE:
         sanitize_deps_graph(remove_nodes=True)
 
 @persistent
 def load_pre_handler(dummy):
-    if session and session.state['STATE'] in [STATE_ACTIVE, STATE_SYNCING]:
+    if session and session.state in [STATE_ACTIVE, STATE_SYNCING]:
         bpy.ops.session.stop()
 
 
 @persistent
 def update_client_frame(scene):
-    if session and session.state['STATE'] == STATE_ACTIVE:
+    if session and session.state == STATE_ACTIVE:
         session.update_user_metadata({
             'frame_current': scene.frame_current
         })
@@ -976,7 +982,7 @@ def update_client_frame(scene):
 
 @persistent
 def depsgraph_evaluation(scene):
-    if session and session.state['STATE'] == STATE_ACTIVE:
+    if session and session.state == STATE_ACTIVE:
         context = bpy.context
         blender_depsgraph = bpy.context.view_layer.depsgraph
         dependency_updates = [u for u in blender_depsgraph.updates]
@@ -989,13 +995,13 @@ def depsgraph_evaluation(scene):
             # Is the object tracked ?
             if update.id.uuid:
                 # Retrieve local version
-                node = session.get(uuid=update.id.uuid)
+                node = session.repository.get_node(update.id.uuid)
 
                 # Check our right on this update:
                 #   - if its ours or ( under common and diff), launch the
                 #     update process
                 #   - if its to someone else, ignore the update
-                if node and node.owner in [session.id, RP_COMMON]:
+                if node and (node.owner == session.id or node.bl_check_common):
                     if node.state == UP:
                         try:
                             if node.has_changed():
@@ -1013,11 +1019,11 @@ def depsgraph_evaluation(scene):
                 continue
             # A new scene is created
             elif isinstance(update.id, bpy.types.Scene):
-                ref = session.get(reference=update.id)
+                ref = session.repository.get_node_by_datablock(update.id)
                 if ref:
                     ref.resolve()
                 else:
-                    scn_uuid = session.add(update.id)
+                    scn_uuid = add(session.repository, update.id)
                     session.commit(scn_uuid)
                     session.push(scn_uuid, check_data=False)
 def register():
@@ -1035,7 +1041,7 @@ def register():
 
 
 def unregister():
-    if session and session.state['STATE'] == STATE_ACTIVE:
+    if session and session.state == STATE_ACTIVE:
         session.disconnect()
 
     from bpy.utils import unregister_class
@@ -66,14 +66,6 @@ def update_ip(self, context):
         self['ip'] = "127.0.0.1"


-def update_port(self, context):
-    max_port = self.port + 3
-
-    if self.ipc_port < max_port and \
-            self['ipc_port'] >= self.port:
-        logging.error(
-            "IPC Port in conflict with the port, assigning a random value")
-        self['ipc_port'] = random.randrange(self.port+4, 10000)


 def update_directory(self, context):
@@ -174,12 +166,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
     supported_datablocks: bpy.props.CollectionProperty(
         type=ReplicatedDatablock,
     )
-    ipc_port: bpy.props.IntProperty(
-        name="ipc_port",
-        description='internal ttl port(only useful for multiple local instances)',
-        default=random.randrange(5570, 70000),
-        update=update_port,
-    )
     init_method: bpy.props.EnumProperty(
         name='init_method',
         description='Init repo',
@@ -195,7 +181,7 @@ class SessionPrefs(bpy.types.AddonPreferences):
     connection_timeout: bpy.props.IntProperty(
         name='connection timeout',
         description='connection timeout before disconnection',
-        default=1000
+        default=5000
     )
     # Replication update settings
     depsgraph_update_rate: bpy.props.FloatProperty(
@@ -30,7 +30,7 @@ import mathutils
 from bpy_extras import view3d_utils
 from gpu_extras.batch import batch_for_shader
 from replication.constants import (STATE_ACTIVE, STATE_AUTH, STATE_CONFIG,
-                                   STATE_INITIAL, STATE_LAUNCHING_SERVICES,
+                                   STATE_INITIAL, CONNECTING,
                                    STATE_LOBBY, STATE_QUITTING, STATE_SRV_SYNC,
                                    STATE_SYNCING, STATE_WAITING)
 from replication.interface import session
@@ -399,7 +399,7 @@ class SessionStatusWidget(Widget):
         text_scale = self.preferences.presence_hud_scale
         ui_scale = bpy.context.preferences.view.ui_scale
         color = [1, 1, 0, 1]
-        state = session.state.get('STATE')
+        state = session.state
         state_str = f"{get_state_str(state)}"

         if state == STATE_ACTIVE:
@@ -17,13 +17,14 @@

 import logging
 import sys
+import traceback
 import bpy
 from replication.constants import (FETCHED, RP_COMMON, STATE_ACTIVE,
                                    STATE_INITIAL, STATE_LOBBY, STATE_QUITTING,
                                    STATE_SRV_SYNC, STATE_SYNCING, UP)
 from replication.exception import NonAuthorizedOperationError, ContextError
 from replication.interface import session
+from replication.porcelain import apply, add

 from . import operators, utils
 from .presence import (UserFrustumWidget, UserNameWidget, UserSelectionWidget,
@@ -71,7 +72,7 @@ class Timer(object):
         except Exception as e:
             logging.error(e)
             self.unregister()
-            session.disconnect()
+            session.disconnect(reason=f"Error during timer {self.id} execution")
         else:
             if self.is_running:
                 return self._timeout
@@ -100,24 +101,31 @@ class SessionBackupTimer(Timer):
     def execute(self):
         session.save(self._filepath)

+class SessionListenTimer(Timer):
+    def execute(self):
+        session.listen()
+
 class ApplyTimer(Timer):
     def execute(self):
-        if session and session.state['STATE'] == STATE_ACTIVE:
+        if session and session.state == STATE_ACTIVE:
             nodes = session.list()

             for node in nodes:
-                node_ref = session.get(uuid=node)
+                node_ref = session.repository.get_node(node)

                 if node_ref.state == FETCHED:
                     try:
-                        session.apply(node)
+                        apply(session.repository, node)
                     except Exception as e:
-                        logging.error(f"Fail to apply {node_ref.uuid}: {e}")
+                        logging.error(f"Fail to apply {node_ref.uuid}")
+                        traceback.print_exc()
                 else:
                     if node_ref.bl_reload_parent:
-                        for parent in session._graph.find_parents(node):
+                        for parent in session.repository.get_parents(node):
                             logging.debug("Refresh parent {node}")
-                            session.apply(parent, force=True)
+                            apply(session.repository,
+                                  parent.uuid,
+                                  force=True)


 class DynamicRightSelectTimer(Timer):
@@ -130,7 +138,7 @@ class DynamicRightSelectTimer(Timer):
     def execute(self):
         settings = utils.get_preferences()

-        if session and session.state['STATE'] == STATE_ACTIVE:
+        if session and session.state == STATE_ACTIVE:
             # Find user
             if self._user is None:
                 self._user = session.online_users.get(settings.username)
@@ -144,7 +152,7 @@ class DynamicRightSelectTimer(Timer):

             # if an annotation exist and is tracked
             if annotation_gp and annotation_gp.uuid:
-                registered_gp = session.get(uuid=annotation_gp.uuid)
+                registered_gp = session.repository.get_node(annotation_gp.uuid)
                 if is_annotating(bpy.context):
                     # try to get the right on it
                     if registered_gp.owner == RP_COMMON:
@@ -158,7 +166,7 @@ class DynamicRightSelectTimer(Timer):
                             affect_dependencies=False)

                 if registered_gp.owner == settings.username:
-                    gp_node = session.get(uuid=annotation_gp.uuid)
+                    gp_node = session.repository.get_node(annotation_gp.uuid)
                     if gp_node.has_changed():
                         session.commit(gp_node.uuid)
                         session.push(gp_node.uuid, check_data=False)
@@ -182,7 +190,7 @@ class DynamicRightSelectTimer(Timer):

                 # change old selection right to common
                 for obj in obj_common:
-                    node = session.get(uuid=obj)
+                    node = session.repository.get_node(obj)

                     if node and (node.owner == settings.username or node.owner == RP_COMMON):
                         recursive = True
@@ -200,7 +208,7 @@ class DynamicRightSelectTimer(Timer):

                 # change new selection to our
                 for obj in obj_ours:
-                    node = session.get(uuid=obj)
+                    node = session.repository.get_node(obj)

                     if node and node.owner == RP_COMMON:
                         recursive = True
@@ -233,7 +241,7 @@ class DynamicRightSelectTimer(Timer):
                 owned_keys = session.list(
                     filter_owner=settings.username)
                 for key in owned_keys:
-                    node = session.get(uuid=key)
+                    node = session.repository.get_node(key)
                     try:
                         session.change_owner(
                             key,
@@ -262,7 +270,7 @@ class ClientUpdate(Timer):
         settings = utils.get_preferences()

         if session and renderer:
-            if session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]:
+            if session.state in [STATE_ACTIVE, STATE_LOBBY]:
                 local_user = session.online_users.get(
                     settings.username)

@@ -26,7 +26,7 @@ from replication.constants import (ADDED, ERROR, FETCHED,
                                    STATE_INITIAL, STATE_SRV_SYNC,
                                    STATE_WAITING, STATE_QUITTING,
                                    STATE_LOBBY,
-                                   STATE_LAUNCHING_SERVICES)
+                                   CONNECTING)
 from replication import __version__
 from replication.interface import session
 from .timers import registry
@@ -71,9 +71,9 @@ class SESSION_PT_settings(bpy.types.Panel):

     def draw_header(self, context):
         layout = self.layout
-        if session and session.state['STATE'] != STATE_INITIAL:
+        if session and session.state != STATE_INITIAL:
             cli_state = session.state
-            state = session.state.get('STATE')
+            state = session.state
             connection_icon = "KEYTYPE_MOVING_HOLD_VEC"

             if state == STATE_ACTIVE:
@@ -81,7 +81,7 @@ class SESSION_PT_settings(bpy.types.Panel):
             else:
                 connection_icon = 'PROP_CON'

-            layout.label(text=f"Session - {get_state_str(cli_state['STATE'])}", icon=connection_icon)
+            layout.label(text=f"Session - {get_state_str(cli_state)}", icon=connection_icon)
         else:
             layout.label(text=f"Session - v{__version__}",icon="PROP_OFF")

@@ -94,13 +94,13 @@ class SESSION_PT_settings(bpy.types.Panel):
         if hasattr(context.window_manager, 'session'):
             # STATE INITIAL
             if not session \
-                    or (session and session.state['STATE'] == STATE_INITIAL):
+                    or (session and session.state == STATE_INITIAL):
                 pass
             else:
-                cli_state = session.state
+                progress = session.state_progress
                 row = layout.row()

-                current_state = cli_state['STATE']
+                current_state = session.state
                 info_msg = None

                 if current_state in [STATE_ACTIVE]:
@@ -124,8 +124,8 @@ class SESSION_PT_settings(bpy.types.Panel):
                 if current_state in [STATE_SYNCING, STATE_SRV_SYNC, STATE_WAITING]:
                     info_box = row.box()
                     info_box.row().label(text=printProgressBar(
-                        cli_state['CURRENT'],
-                        cli_state['TOTAL'],
+                        progress['current'],
+                        progress['total'],
                         length=16
                     ))

@@ -141,7 +141,7 @@ class SESSION_PT_settings_network(bpy.types.Panel):
     @classmethod
     def poll(cls, context):
         return not session \
-            or (session and session.state['STATE'] == 0)
+            or (session and session.state == 0)

     def draw_header(self, context):
         self.layout.label(text="", icon='URL')
@@ -199,7 +199,7 @@ class SESSION_PT_settings_user(bpy.types.Panel):
     @classmethod
     def poll(cls, context):
         return not session \
-            or (session and session.state['STATE'] == 0)
+            or (session and session.state == 0)

     def draw_header(self, context):
         self.layout.label(text="", icon='USER')
@@ -230,7 +230,7 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
     @classmethod
     def poll(cls, context):
         return not session \
-            or (session and session.state['STATE'] == 0)
+            or (session and session.state == 0)

     def draw_header(self, context):
         self.layout.label(text="", icon='PREFERENCES')
@@ -251,9 +251,6 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
                         emboss=False)

         if settings.sidebar_advanced_net_expanded:
-            net_section_row = net_section.row()
-            net_section_row.label(text="IPC Port:")
-            net_section_row.prop(settings, "ipc_port", text="")
             net_section_row = net_section.row()
             net_section_row.label(text="Timeout (ms):")
             net_section_row.prop(settings, "connection_timeout", text="")
@@ -322,7 +319,7 @@ class SESSION_PT_user(bpy.types.Panel):

     @classmethod
     def poll(cls, context):
-        return session and session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
+        return session and session.state in [STATE_ACTIVE, STATE_LOBBY]

     def draw_header(self, context):
         self.layout.label(text="", icon='USER')
@@ -353,7 +350,7 @@ class SESSION_PT_user(bpy.types.Panel):
         if active_user != 0 and active_user.username != settings.username:
             row = layout.row()
             user_operations = row.split()
-            if session.state['STATE'] == STATE_ACTIVE:
+            if session.state == STATE_ACTIVE:

                 user_operations.alert = context.window_manager.session.time_snap_running
                 user_operations.operator(
@@ -411,7 +408,7 @@ class SESSION_PT_presence(bpy.types.Panel):
     @classmethod
     def poll(cls, context):
         return not session \
-            or (session and session.state['STATE'] in [STATE_INITIAL, STATE_ACTIVE])
+            or (session and session.state in [STATE_INITIAL, STATE_ACTIVE])

     def draw_header(self, context):
         self.layout.prop(context.window_manager.session,
@@ -441,7 +438,7 @@ def draw_property(context, parent, property_uuid, level=0):
     settings = get_preferences()
     runtime_settings = context.window_manager.session
-    item = session.get(uuid=property_uuid)
+    item = session.repository.get_node(property_uuid)

     area_msg = parent.row(align=True)

@@ -519,8 +516,8 @@ class SESSION_PT_repository(bpy.types.Panel):
         admin = usr['admin']
         return hasattr(context.window_manager, 'session') and \
             session and \
-            (session.state['STATE'] == STATE_ACTIVE or \
-            session.state['STATE'] == STATE_LOBBY and admin)
+            (session.state == STATE_ACTIVE or \
+            session.state == STATE_LOBBY and admin)

     def draw_header(self, context):
         self.layout.label(text="", icon='OUTLINER_OB_GROUP_INSTANCE')
@@ -536,7 +533,7 @@ class SESSION_PT_repository(bpy.types.Panel):

         row = layout.row()

-        if session.state['STATE'] == STATE_ACTIVE:
+        if session.state == STATE_ACTIVE:
             if 'SessionBackupTimer' in registry:
                 row.alert = True
                 row.operator('session.cancel_autosave', icon="CANCEL")
@@ -568,7 +565,7 @@ class SESSION_PT_repository(bpy.types.Panel):
                 filter_owner=settings.username) if runtime_settings.filter_owned else session.list()

             client_keys = [key for key in key_to_filter
-                           if session.get(uuid=key).str_type
+                           if session.repository.get_node(key).str_type
                            in types_filter]

             if client_keys:
@@ -579,7 +576,7 @@ class SESSION_PT_repository(bpy.types.Panel):
             else:
                 row.label(text="Empty")

-        elif session.state['STATE'] == STATE_LOBBY and usr and usr['admin']:
+        elif session.state == STATE_LOBBY and usr and usr['admin']:
             row.operator("session.init", icon='TOOL_SETTINGS', text="Init")
         else:
             row.label(text="Waiting to start")
@@ -36,7 +36,7 @@ from replication.constants import (STATE_ACTIVE, STATE_AUTH,
                                    STATE_INITIAL, STATE_SRV_SYNC,
                                    STATE_WAITING, STATE_QUITTING,
                                    STATE_LOBBY,
-                                   STATE_LAUNCHING_SERVICES)
+                                   CONNECTING)


 def find_from_attr(attr_name, attr_value, list):
@@ -92,7 +92,7 @@ def get_state_str(state):
         state_str = 'OFFLINE'
     elif state == STATE_QUITTING:
         state_str = 'QUITTING'
-    elif state == STATE_LAUNCHING_SERVICES:
+    elif state == CONNECTING:
         state_str = 'LAUNCHING SERVICES'
     elif state == STATE_LOBBY:
         state_str = 'LOBBY'
@@ -13,7 +13,7 @@ def main():
     if len(sys.argv) > 2:
         blender_rev = sys.argv[2]
     else:
-        blender_rev = "2.91.0"
+        blender_rev = "2.92.0"

     try:
         exit_val = BAT.test_blender_addon(addon_path=addon, blender_revision=blender_rev)
@@ -8,6 +8,7 @@ import random
 from multi_user.bl_types.bl_action import BlAction

 INTERPOLATION = ['CONSTANT', 'LINEAR', 'BEZIER', 'SINE', 'QUAD', 'CUBIC', 'QUART', 'QUINT', 'EXPO', 'CIRC', 'BACK', 'BOUNCE', 'ELASTIC']
+FMODIFIERS = ['GENERATOR', 'FNGENERATOR', 'ENVELOPE', 'CYCLES', 'NOISE', 'LIMITS', 'STEPPED']

 # @pytest.mark.parametrize('blendname', ['test_action.blend'])
 def test_action(clear_blend):
@@ -22,6 +23,9 @@ def test_action(clear_blend):
         point.co[1] = random.randint(-10,10)
         point.interpolation = INTERPOLATION[random.randint(0, len(INTERPOLATION)-1)]

+    for mod_type in FMODIFIERS:
+        fcurve_sample.modifiers.new(mod_type)
+
     bpy.ops.mesh.primitive_plane_add()
     bpy.data.objects[0].animation_data_create()
     bpy.data.objects[0].animation_data.action = datablock
@@ -7,7 +7,7 @@ import bpy
 import random
 from multi_user.bl_types.bl_object import BlObject

-# Removed 'BUILD' modifier because the seed doesn't seems to be
+# Removed 'BUILD', 'SOFT_BODY' modifier because the seed doesn't seems to be
 # correctly initialized (#TODO: report the bug)
 MOFIFIERS_TYPES = [
     'DATA_TRANSFER', 'MESH_CACHE', 'MESH_SEQUENCE_CACHE',
@@ -22,8 +22,7 @@ MOFIFIERS_TYPES = [
     'MESH_DEFORM', 'SHRINKWRAP', 'SIMPLE_DEFORM', 'SMOOTH',
     'CORRECTIVE_SMOOTH', 'LAPLACIANSMOOTH', 'SURFACE_DEFORM',
     'WARP', 'WAVE', 'CLOTH', 'COLLISION', 'DYNAMIC_PAINT',
-    'EXPLODE', 'FLUID', 'OCEAN', 'PARTICLE_INSTANCE',
-    'SOFT_BODY', 'SURFACE']
+    'EXPLODE', 'FLUID', 'OCEAN', 'PARTICLE_INSTANCE', 'SURFACE']

 GP_MODIFIERS_TYPE = [
     'GP_ARRAY', 'GP_BUILD', 'GP_MIRROR', 'GP_MULTIPLY',
@@ -72,5 +71,5 @@ def test_object(clear_blend):
     test = implementation._construct(expected)
     implementation._load(expected, test)
     result = implementation._dump(test)
-
+    print(DeepDiff(expected, result))
     assert not DeepDiff(expected, result)