Compare commits
60 Commits
46-composi... to 222-server

Commit SHA1s:
060b7507b6, a4f9f6e051, 10de88cdc9, e4fa34c984, 0dd685d009, 3e8c30c0ab, 21cc3cd917,
81e620ee3d, fb9bd108bd, cab6625399, 1b81251a11, 77bf269fb5, 1e675132d4, 781287c390,
d4476baa1b, 467e98906e, 64a25f94a3, e6996316be, cf4cd94096, e9ab633aac, 297639e80f,
f0cc63b6f0, d433e8f241, 963a551a1e, d01a434fb7, 3a5a5fc633, 8926ab44e1, a8f96581c5,
440a4cc1cd, a207c51973, e706c8e0bf, e590e896da, 4140b62a8e, 6d9c9c4532, e9e1911840,
ab350ca7bc, 0a8f0b5f88, 2238a15c11, de73f022e6, f517205647, f33c3d8481, 71c69000ec,
de1e684b3c, d87730cffb, 3f005b86ab, 5098e5135d, 37cfed489c, 9003abcd18, a199e0df00,
3774419b7e, 3e552cb406, 9f381b44c8, ad795caed5, 504dd77405, 82022c9e4d, d81b4dc014,
63affa079f, fcf5a12dd0, b0529e4444, bdfd89c085
CHANGELOG.md (32 lines changed)

@@ -186,4 +186,34 @@ All notable changes to this project will be documented in this file.

- Exception access violation during Undo/Redo
- Sync missing armature bone Roll
- Sync missing driver data_path
- Constraint replication
- Constraint replication

## [0.4.0] - 2021-07-20

### Added

- Connection preset system (@Kysios)
- Display connected users' active mode (users panel and viewport) (@Kysios)
- Delta-based replication
- Sync timeline marker
- Sync images settings (@Kysios)
- Sync parent relation type (@Kysios)
- Sync uv project modifier
- Sync FCurves modifiers

### Changed

- User selection optimizations (draw and sync) (@Kysios)
- Improved shapekey syncing performance
- Improved gpencil syncing performance
- Integrate replication as a submodule
- The dependencies are now installed in a folder (the blender addon folder) that no longer requires administrative rights
- Presence overlay UI optimization (@Kysios)

### Fixed

- User selection bounding box glitches for non-mesh objects (@Kysios)
- Transforms replication for animated objects
- GPencil fill stroke
- Sculpt and GPencil brushes deleted when joining a session (@Kysios)
- Auto-updater doesn't work for master and develop builds
README.md (63 lines changed)

@@ -11,9 +11,8 @@ This tool aims to allow multiple users to work on the same scene over the network

## Quick installation

- 1. Download latest release [multi_user.zip](https://gitlab.com/slumber/multi-user/-/jobs/artifacts/master/download?job=build).
- 2. Run blender as administrator (dependencies installation).
- 3. Install last_version.zip from your addon preferences.
+ 1. Download [latest build](https://gitlab.com/slumber/multi-user/-/jobs/artifacts/develop/download?job=build) or [stable build](https://gitlab.com/slumber/multi-user/-/jobs/artifacts/master/download?job=build).
+ 2. Install last_version.zip from your addon preferences.

[Dependencies](#dependencies) will be automatically added to your blender python during installation.
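The changelog above explains that dependencies now land inside the addon folder rather than a system location, which is why administrator rights are no longer needed. For context, a minimal sketch of how such a per-addon install is commonly done with pip; the folder name, the function name, and the reliance on `sys.executable` are illustrative assumptions, not the addon's actual code.

```python
# Illustrative sketch only: per-addon dependency installation into a user-writable folder.
import subprocess
import sys
from pathlib import Path

ADDON_LIBS = Path(__file__).parent / "libs"   # hypothetical vendored-deps folder inside the addon


def ensure_dependency(package: str) -> None:
    """Install `package` next to the addon so no administrator rights are needed."""
    ADDON_LIBS.mkdir(exist_ok=True)
    if str(ADDON_LIBS) not in sys.path:
        sys.path.insert(0, str(ADDON_LIBS))
    try:
        __import__(package)
    except ImportError:
        # Blender 2.91+ exposes its bundled interpreter through sys.executable.
        subprocess.check_call([
            sys.executable, "-m", "pip", "install",
            "--target", str(ADDON_LIBS), package,
        ])
```

Installing with `--target` into a folder the user already owns is what removes the old "run blender as administrator" step.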
@@ -29,35 +28,35 @@ See the [troubleshooting guide](https://slumber.gitlab.io/multi-user/getting_sta

Currently, not all data-blocks are supported for replication over the wire. The following list summarizes the status of each one.
Before:

| Name | Status | Comment |
| -------------- | :----: | :----------------------------------------------------------: |
| action | ✔️ | |
| camera | ✔️ | |
| collection | ✔️ | |
| gpencil | ✔️ | |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| node_groups | ✔️ | Material & Geometry only |
| geometry nodes | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| volumes | ✔️ | |
| lightprobes | ✔️ | |
| physics | ✔️ | |
| curve | ❗ | Nurbs surfaces not supported |
| textures | ❗ | Supported for modifiers/materials/geo nodes only |
| armature | ❗ | Not stable |
| particles | ❗ | The cache isn't syncing. |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❗ | Mask and Clip not supported yet |
| libraries | ❗ | Partial |
| nla | ❌ | |
| texts | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
| compositing | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |

After:

| Name | Status | Comment |
| -------------- | :----: | :---------------------------------------------------------------------: |
| action | ✔️ | |
| camera | ✔️ | |
| collection | ✔️ | |
| gpencil | ✔️ | |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| node_groups | ✔️ | Material & Geometry only |
| geometry nodes | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| volumes | ✔️ | |
| lightprobes | ✔️ | |
| physics | ✔️ | |
| textures | ✔️ | |
| curve | ❗ | Nurbs surfaces not supported |
| armature | ❗ | Only for Mesh. [Planned for GPencil](https://gitlab.com/slumber/multi-user/-/issues/161). Not stable yet |
| particles | ❗ | The cache isn't syncing. |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❗ | Mask and Clip not supported yet |
| libraries | ❌ | |
| nla | ❌ | |
| texts | ❌ | [Planned for v0.5.0](https://gitlab.com/slumber/multi-user/-/issues/81) |
| compositing | ❌ | [Planned for v0.5.0](https://gitlab.com/slumber/multi-user/-/issues/46) |
@@ -19,10 +19,10 @@ import sys

project = 'multi-user'
copyright = '2020, Swann Martinez'
- author = 'Swann Martinez, with contributions from Poochy'
+ author = 'Swann Martinez, Poochy, Fabian'

# The full version, including alpha/beta/rc tags
- release = '0.2.0'
+ release = '0.5.0-develop'


# -- General configuration ---------------------------------------------------
Image file size changes:

| Before | After |
| --- | --- |
| 12 KiB | 15 KiB |
| 15 KiB | 22 KiB |
| 17 KiB | 18 KiB |
| 14 KiB | 20 KiB |
| 70 KiB | 365 KiB |
| 18 KiB | 26 KiB |
@@ -215,8 +215,10 @@ One of the most vital tools is the **Online user panel**. It lists all connected
users' information including your own:

* **Role**: if a user is an admin or a regular user.
* **Location**: Where the user is actually working.
* **Username**: Name of the user.
* **Mode**: User's active editing mode (edit_mesh, paint, etc.).
* **Frame**: When (on which frame) the user is working.
* **Location**: Where the user is actually working.
* **Ping**: user's connection delay in milliseconds

.. figure:: img/quickstart_users.png

@@ -273,6 +275,7 @@ it draws users' related information in your viewport such as:

* Username
* User point of view
* User active mode
* User selection

.. figure:: img/quickstart_presence.png
@@ -19,9 +19,9 @@

bl_info = {
    "name": "Multi-User",
    "author": "Swann Martinez",
-   "version": (0, 5, 0),
+   "version": (0, 4, 0),
    "description": "Enable real-time collaborative workflow inside blender",
-   "blender": (2, 93, 0),
+   "blender": (2, 82, 0),
    "location": "3D View > Sidebar > Multi-User tab",
    "warning": "Unstable addon, use it at your own risks",
    "category": "Collaboration",

@@ -59,6 +59,7 @@ def register():

    from . import presence
    from . import operators
+   from . import handlers
    from . import ui
    from . import preferences
    from . import addon_updater_ops

@@ -67,6 +68,7 @@ def register():
        addon_updater_ops.register(bl_info)
        presence.register()
        operators.register()
+       handlers.register()
        ui.register()
    except ModuleNotFoundError as e:
        raise Exception(module_error_msg)

@@ -87,6 +89,7 @@ def register():
def unregister():
    from . import presence
    from . import operators
+   from . import handlers
    from . import ui
    from . import preferences
    from . import addon_updater_ops

@@ -96,6 +99,7 @@ def unregister():
    presence.unregister()
    addon_updater_ops.unregister()
    ui.unregister()
+   handlers.unregister()
    operators.unregister()
    preferences.unregister()
@@ -41,7 +41,6 @@ __all__ = [
    'bl_node_group',
    'bl_texture',
    "bl_particle",
-   # 'bl_compositor',
] # Order here defines execution order

if bpy.app.version[1] >= 91:
@@ -219,7 +219,7 @@ def load_fcurve(fcurve_data, fcurve):

def dump_animation_data(datablock):
    animation_data = {}
    if has_action(datablock):
-       animation_data['action'] = datablock.animation_data.action.name
+       animation_data['action'] = datablock.animation_data.action.uuid
    if has_driver(datablock):
        animation_data['drivers'] = []
        for driver in datablock.animation_data.drivers:

@@ -241,8 +241,10 @@ def load_animation_data(animation_data, datablock):
        for driver in animation_data['drivers']:
            load_driver(datablock, driver)

-       if 'action' in animation_data:
-           datablock.animation_data.action = bpy.data.actions[animation_data['action']]
+       action = animation_data.get('action')
+       if action:
+           action = resolve_datablock_from_uuid(action, bpy.data.actions)
+           datablock.animation_data.action = action
        elif datablock.animation_data.action:
            datablock.animation_data.action = None

@@ -259,6 +261,8 @@ def resolve_animation_dependencies(datablock):


class BlAction(ReplicatedDatablock):
+   use_delta = True

    bl_id = "actions"
    bl_class = bpy.types.Action
    bl_check_common = False
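In the bl_action hunks above, the dumped animation data now references the action by its `uuid` instead of its name, and loading resolves it through `resolve_datablock_from_uuid`. A minimal sketch of that round trip, assuming (as the rest of the diff suggests) that every replicated datablock carries a `uuid` custom property and that the helper simply scans a `bpy.data` collection for it; the object name and the helper body here are illustrative, the real helper lives in bl_datablock.py.

```python
import bpy

def resolve_datablock_from_uuid(uuid, collection):
    # Assumed behaviour of the bl_datablock helper: linear scan for a matching uuid property.
    for datablock in collection:
        if getattr(datablock, "uuid", None) == uuid:
            return datablock
    return None

# Dump side: store the stable uuid rather than the (renameable) action name.
obj = bpy.data.objects["Cube"]                      # illustrative object with an assigned action
dumped = {"action": obj.animation_data.action.uuid}

# Load side: the lookup still succeeds even if the action was renamed on either peer.
obj.animation_data.action = resolve_datablock_from_uuid(dumped["action"], bpy.data.actions)
```

Referencing by uuid is what keeps `load_animation_data` stable across renames; the name-based `bpy.data.actions[...]` lookup it replaces raises a KeyError as soon as one peer renames the action.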
@ -37,6 +37,8 @@ def get_roll(bone: bpy.types.Bone) -> float:
|
||||
|
||||
|
||||
class BlArmature(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "armatures"
|
||||
bl_class = bpy.types.Armature
|
||||
bl_check_common = False
|
||||
|
@ -26,6 +26,8 @@ from .bl_action import dump_animation_data, load_animation_data, resolve_animati
|
||||
|
||||
|
||||
class BlCamera(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "cameras"
|
||||
bl_class = bpy.types.Camera
|
||||
bl_check_common = False
|
||||
@ -54,7 +56,7 @@ class BlCamera(ReplicatedDatablock):
|
||||
background_images = data.get('background_images')
|
||||
|
||||
datablock.background_images.clear()
|
||||
|
||||
# TODO: Use image uuid
|
||||
if background_images:
|
||||
for img_name, img_data in background_images.items():
|
||||
img_id = img_data.get('image')
|
||||
|
@ -1,81 +0,0 @@
|
||||
# ##### BEGIN GPL LICENSE BLOCK #####
|
||||
#
|
||||
# This program is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
#
|
||||
# ##### END GPL LICENSE BLOCK #####
|
||||
|
||||
import bpy
|
||||
import mathutils
|
||||
import logging
|
||||
import re
|
||||
|
||||
from uuid import uuid4
|
||||
|
||||
from .dump_anything import Loader, Dumper
|
||||
from replication.protocol import ReplicatedDatablock
|
||||
|
||||
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
from .node_tree import load_node_tree, dump_node_tree, get_node_tree_dependencies
|
||||
|
||||
class BlCompositor(ReplicatedDatablock):
|
||||
bl_id = "compositor"
|
||||
bl_class = bpy.types.CompositorNodeTree
|
||||
bl_check_common = True
|
||||
bl_icon = 'COMPOSITOR_NODE'
|
||||
bl_reload_parent = False
|
||||
|
||||
@staticmethod
|
||||
def construct(data: dict) -> object:
|
||||
return bpy.data.scenes["Scene"].node_tree # TODO: resolve_datablock_from_uuid for multiple scenes
|
||||
|
||||
@staticmethod
|
||||
def load(data: dict, datablock: object):
|
||||
load_animation_data(data.get('animation_data'), datablock)
|
||||
loader = Loader()
|
||||
loader.load(datablock, data)
|
||||
load_node_tree(data['node_tree'], datablock)
|
||||
|
||||
@staticmethod
|
||||
def dump(datablock: object) -> dict:
|
||||
comp_dumper = Dumper()
|
||||
comp_dumper.depth = 1
|
||||
comp_dumper.include_filter = [
|
||||
'use_nodes',
|
||||
'name',
|
||||
]
|
||||
data = comp_dumper.dump(datablock)
|
||||
|
||||
data['node_tree'] = dump_node_tree(datablock)
|
||||
|
||||
data['animation_data'] = dump_animation_data(datablock)
|
||||
return data
|
||||
|
||||
@staticmethod
|
||||
def resolve(data: dict) -> object:
|
||||
uuid = data.get('uuid')
|
||||
return resolve_datablock_from_uuid(uuid, bpy.data.scenes["Scene"].node_tree)
|
||||
|
||||
@staticmethod
|
||||
def resolve_deps(datablock: object) -> [object]:
|
||||
deps = []
|
||||
|
||||
deps.extend(get_node_tree_dependencies(datablock))
|
||||
|
||||
deps.extend(resolve_animation_dependencies(datablock))
|
||||
|
||||
return deps
|
||||
|
||||
_type = bpy.types.CompositorNodeTree
|
||||
_class = BlCompositor
|
@ -137,6 +137,8 @@ SPLINE_METADATA = [
|
||||
|
||||
|
||||
class BlCurve(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "curves"
|
||||
bl_class = bpy.types.Curve
|
||||
bl_check_common = False
|
||||
|
@ -28,7 +28,8 @@ from replication.protocol import ReplicatedDatablock
|
||||
from .bl_datablock import resolve_datablock_from_uuid
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
from ..utils import get_preferences
|
||||
|
||||
from ..timers import is_annotating
|
||||
from .bl_material import load_materials_slots, dump_materials_slots
|
||||
|
||||
STROKE_POINT = [
|
||||
'co',
|
||||
@ -65,36 +66,9 @@ def dump_stroke(stroke):
|
||||
|
||||
:param stroke: target grease pencil stroke
|
||||
:type stroke: bpy.types.GPencilStroke
|
||||
:return: dict
|
||||
:return: (p_count, p_data)
|
||||
"""
|
||||
|
||||
assert(stroke)
|
||||
|
||||
dumper = Dumper()
|
||||
dumper.include_filter = [
|
||||
"aspect",
|
||||
"display_mode",
|
||||
"draw_cyclic",
|
||||
"end_cap_mode",
|
||||
"hardeness",
|
||||
"line_width",
|
||||
"material_index",
|
||||
"start_cap_mode",
|
||||
"uv_rotation",
|
||||
"uv_scale",
|
||||
"uv_translation",
|
||||
"vertex_color_fill",
|
||||
]
|
||||
dumped_stroke = dumper.dump(stroke)
|
||||
|
||||
# Stoke points
|
||||
p_count = len(stroke.points)
|
||||
dumped_stroke['p_count'] = p_count
|
||||
dumped_stroke['points'] = np_dump_collection(stroke.points, STROKE_POINT)
|
||||
|
||||
# TODO: uv_factor, uv_rotation
|
||||
|
||||
return dumped_stroke
|
||||
return (len(stroke.points), np_dump_collection(stroke.points, STROKE_POINT))
|
||||
|
||||
|
||||
def load_stroke(stroke_data, stroke):
|
||||
@ -107,12 +81,12 @@ def load_stroke(stroke_data, stroke):
|
||||
"""
|
||||
assert(stroke and stroke_data)
|
||||
|
||||
stroke.points.add(stroke_data["p_count"])
|
||||
np_load_collection(stroke_data['points'], stroke.points, STROKE_POINT)
|
||||
stroke.points.add(stroke_data[0])
|
||||
np_load_collection(stroke_data[1], stroke.points, STROKE_POINT)
|
||||
|
||||
# HACK: Temporary fix to trigger a BKE_gpencil_stroke_geometry_update to
|
||||
# fix fill issues
|
||||
stroke.uv_scale = stroke_data["uv_scale"]
|
||||
stroke.uv_scale = 1.0
|
||||
|
||||
|
||||
def dump_frame(frame):
|
||||
@ -147,10 +121,12 @@ def load_frame(frame_data, frame):
|
||||
|
||||
assert(frame and frame_data)
|
||||
|
||||
# Load stroke points
|
||||
for stroke_data in frame_data['strokes_points']:
|
||||
target_stroke = frame.strokes.new()
|
||||
load_stroke(stroke_data, target_stroke)
|
||||
|
||||
# Load stroke metadata
|
||||
np_load_collection(frame_data['strokes'], frame.strokes, STROKE)
|
||||
|
||||
|
||||
@ -170,7 +146,6 @@ def dump_layer(layer):
|
||||
'opacity',
|
||||
'channel_color',
|
||||
'color',
|
||||
# 'thickness', #TODO: enabling only for annotation
|
||||
'tint_color',
|
||||
'tint_factor',
|
||||
'vertex_paint_opacity',
|
||||
@ -187,7 +162,7 @@ def dump_layer(layer):
|
||||
'hide',
|
||||
'annotation_hide',
|
||||
'lock',
|
||||
# 'lock_frame',
|
||||
'lock_frame',
|
||||
# 'lock_material',
|
||||
# 'use_mask_layer',
|
||||
'use_lights',
|
||||
@ -195,12 +170,13 @@ def dump_layer(layer):
|
||||
'select',
|
||||
'show_points',
|
||||
'show_in_front',
|
||||
# 'thickness'
|
||||
# 'parent',
|
||||
# 'parent_type',
|
||||
# 'parent_bone',
|
||||
# 'matrix_inverse',
|
||||
]
|
||||
if layer.id_data.is_annotation:
|
||||
if layer.thickness != 0:
|
||||
dumper.include_filter.append('thickness')
|
||||
|
||||
dumped_layer = dumper.dump(layer)
|
||||
@ -255,10 +231,10 @@ class BlGpencil(ReplicatedDatablock):
|
||||
|
||||
@staticmethod
|
||||
def load(data: dict, datablock: object):
|
||||
datablock.materials.clear()
|
||||
if "materials" in data.keys():
|
||||
for mat in data['materials']:
|
||||
datablock.materials.append(bpy.data.materials[mat])
|
||||
# MATERIAL SLOTS
|
||||
src_materials = data.get('materials', None)
|
||||
if src_materials:
|
||||
load_materials_slots(src_materials, datablock.materials)
|
||||
|
||||
loader = Loader()
|
||||
loader.load(datablock, data)
|
||||
@ -286,7 +262,6 @@ class BlGpencil(ReplicatedDatablock):
|
||||
dumper = Dumper()
|
||||
dumper.depth = 2
|
||||
dumper.include_filter = [
|
||||
'materials',
|
||||
'name',
|
||||
'zdepth_offset',
|
||||
'stroke_thickness_space',
|
||||
@ -294,7 +269,7 @@ class BlGpencil(ReplicatedDatablock):
|
||||
'stroke_depth_order'
|
||||
]
|
||||
data = dumper.dump(datablock)
|
||||
|
||||
data['materials'] = dump_materials_slots(datablock.materials)
|
||||
data['layers'] = {}
|
||||
|
||||
for layer in datablock.layers:
|
||||
@ -323,7 +298,8 @@ class BlGpencil(ReplicatedDatablock):
|
||||
return bpy.context.mode == 'OBJECT' \
|
||||
or layer_changed(datablock, data) \
|
||||
or frame_changed(data) \
|
||||
or get_preferences().sync_flags.sync_during_editmode
|
||||
or get_preferences().sync_flags.sync_during_editmode \
|
||||
or is_annotating(bpy.context)
|
||||
|
||||
_type = bpy.types.GreasePencil
|
||||
_class = BlGpencil
|
||||
|
@ -69,11 +69,12 @@ class BlImage(ReplicatedDatablock):
|
||||
@staticmethod
|
||||
def load(data: dict, datablock: object):
|
||||
loader = Loader()
|
||||
loader.load(data, datablock)
|
||||
loader.load(datablock, data)
|
||||
|
||||
# datablock.name = data.get('name')
|
||||
datablock.source = 'FILE'
|
||||
datablock.filepath_raw = get_filepath(data['filename'])
|
||||
color_space_name = data["colorspace_settings"]["name"]
|
||||
color_space_name = data.get("colorspace")
|
||||
|
||||
if color_space_name:
|
||||
datablock.colorspace_settings.name = color_space_name
|
||||
@ -92,12 +93,10 @@ class BlImage(ReplicatedDatablock):
|
||||
"name",
|
||||
# 'source',
|
||||
'size',
|
||||
'height',
|
||||
'alpha',
|
||||
'float_buffer',
|
||||
'alpha_mode',
|
||||
'colorspace_settings']
|
||||
'alpha_mode']
|
||||
data.update(dumper.dump(datablock))
|
||||
data['colorspace'] = datablock.colorspace_settings.name
|
||||
|
||||
return data
|
||||
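The BlImage hunks above replace the nested `colorspace_settings` dump with a flat `colorspace` key written by `dump` and read back defensively by `load`. A small usage sketch of that round trip, run inside Blender's Python; the image name and size are illustrative.

```python
import bpy

img = bpy.data.images.new("demo", 4, 4)              # any image datablock

# Dump side: a single flat key instead of the old nested colorspace_settings block.
data = {"colorspace": img.colorspace_settings.name}

# Load side: only assign when the key is present, so dumps made before this change still load.
color_space_name = data.get("colorspace")
if color_space_name:
    img.colorspace_settings.name = color_space_name
```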
|
||||
@staticmethod
|
||||
@ -132,10 +131,7 @@ class BlImage(ReplicatedDatablock):
|
||||
if datablock.is_dirty:
|
||||
datablock.save()
|
||||
|
||||
if not data or (datablock and (datablock.name != data.get('name'))):
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
return True
|
||||
|
||||
_type = bpy.types.Image
|
||||
_class = BlImage
|
||||
|
@ -29,6 +29,8 @@ POINT = ['co', 'weight_softbody', 'co_deform']
|
||||
|
||||
|
||||
class BlLattice(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "lattices"
|
||||
bl_class = bpy.types.Lattice
|
||||
bl_check_common = False
|
||||
|
@ -26,6 +26,8 @@ from .bl_action import dump_animation_data, load_animation_data, resolve_animati
|
||||
|
||||
|
||||
class BlLight(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "lights"
|
||||
bl_class = bpy.types.Light
|
||||
bl_check_common = False
|
||||
|
@ -25,6 +25,8 @@ from replication.protocol import ReplicatedDatablock
|
||||
from .bl_datablock import resolve_datablock_from_uuid
|
||||
|
||||
class BlLightprobe(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "lightprobes"
|
||||
bl_class = bpy.types.LightProbe
|
||||
bl_check_common = False
|
||||
|
@ -28,7 +28,341 @@ from replication.protocol import ReplicatedDatablock
|
||||
|
||||
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
from .node_tree import load_node_tree, dump_node_tree, get_node_tree_dependencies
|
||||
|
||||
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
|
||||
IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']
|
||||
|
||||
def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
|
||||
""" Load a node into a node_tree from a dict
|
||||
|
||||
:arg node_data: dumped node data
|
||||
:type node_data: dict
|
||||
:arg node_tree: target node_tree
|
||||
:type node_tree: bpy.types.NodeTree
|
||||
"""
|
||||
loader = Loader()
|
||||
target_node = node_tree.nodes.new(type=node_data["bl_idname"])
|
||||
target_node.select = False
|
||||
loader.load(target_node, node_data)
|
||||
image_uuid = node_data.get('image_uuid', None)
|
||||
node_tree_uuid = node_data.get('node_tree_uuid', None)
|
||||
|
||||
if image_uuid and not target_node.image:
|
||||
image = resolve_datablock_from_uuid(image_uuid, bpy.data.images)
|
||||
if image is None:
|
||||
logging.error(f"Fail to find material image from uuid {image_uuid}")
|
||||
else:
|
||||
target_node.image = image
|
||||
|
||||
if node_tree_uuid:
|
||||
target_node.node_tree = get_datablock_from_uuid(node_tree_uuid, None)
|
||||
|
||||
inputs_data = node_data.get('inputs')
|
||||
if inputs_data:
|
||||
inputs = [i for i in target_node.inputs if i.type not in IGNORED_SOCKETS]
|
||||
for idx, inpt in enumerate(inputs):
|
||||
if idx < len(inputs_data) and hasattr(inpt, "default_value"):
|
||||
loaded_input = inputs_data[idx]
|
||||
try:
|
||||
if inpt.type in ['OBJECT', 'COLLECTION']:
|
||||
inpt.default_value = get_datablock_from_uuid(loaded_input, None)
|
||||
else:
|
||||
inpt.default_value = loaded_input
|
||||
except Exception as e:
|
||||
logging.warning(f"Node {target_node.name} input {inpt.name} parameter not supported, skipping ({e})")
|
||||
else:
|
||||
logging.warning(f"Node {target_node.name} input length mismatch.")
|
||||
|
||||
outputs_data = node_data.get('outputs')
|
||||
if outputs_data:
|
||||
outputs = [o for o in target_node.outputs if o.type not in IGNORED_SOCKETS]
|
||||
for idx, output in enumerate(outputs):
|
||||
if idx < len(outputs_data) and hasattr(output, "default_value"):
|
||||
loaded_output = outputs_data[idx]
|
||||
try:
|
||||
if output.type in ['OBJECT', 'COLLECTION']:
|
||||
output.default_value = get_datablock_from_uuid(loaded_output, None)
|
||||
else:
|
||||
output.default_value = loaded_output
|
||||
except Exception as e:
|
||||
logging.warning(
|
||||
f"Node {target_node.name} output {output.name} parameter not supported, skipping ({e})")
|
||||
else:
|
||||
logging.warning(
|
||||
f"Node {target_node.name} output length mismatch.")
|
||||
|
||||
|
||||
def dump_node(node: bpy.types.ShaderNode) -> dict:
|
||||
""" Dump a single node to a dict
|
||||
|
||||
:arg node: target node
|
||||
:type node: bpy.types.Node
|
||||
:retrun: dict
|
||||
"""
|
||||
|
||||
node_dumper = Dumper()
|
||||
node_dumper.depth = 1
|
||||
node_dumper.exclude_filter = [
|
||||
"dimensions",
|
||||
"show_expanded",
|
||||
"name_full",
|
||||
"select",
|
||||
"bl_label",
|
||||
"bl_height_min",
|
||||
"bl_height_max",
|
||||
"bl_height_default",
|
||||
"bl_width_min",
|
||||
"bl_width_max",
|
||||
"type",
|
||||
"bl_icon",
|
||||
"bl_width_default",
|
||||
"bl_static_type",
|
||||
"show_tetxure",
|
||||
"is_active_output",
|
||||
"hide",
|
||||
"show_options",
|
||||
"show_preview",
|
||||
"show_texture",
|
||||
"outputs",
|
||||
"width_hidden"
|
||||
]
|
||||
|
||||
dumped_node = node_dumper.dump(node)
|
||||
|
||||
if node.parent:
|
||||
dumped_node['parent'] = node.parent.name
|
||||
|
||||
dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
|
||||
|
||||
if dump_io_needed:
|
||||
io_dumper = Dumper()
|
||||
io_dumper.depth = 2
|
||||
io_dumper.include_filter = ["default_value"]
|
||||
|
||||
if hasattr(node, 'inputs'):
|
||||
dumped_node['inputs'] = []
|
||||
inputs = [i for i in node.inputs if i.type not in IGNORED_SOCKETS]
|
||||
for idx, inpt in enumerate(inputs):
|
||||
if hasattr(inpt, 'default_value'):
|
||||
if isinstance(inpt.default_value, bpy.types.ID):
|
||||
dumped_input = inpt.default_value.uuid
|
||||
else:
|
||||
dumped_input = io_dumper.dump(inpt.default_value)
|
||||
|
||||
dumped_node['inputs'].append(dumped_input)
|
||||
|
||||
if hasattr(node, 'outputs'):
|
||||
dumped_node['outputs'] = []
|
||||
for idx, output in enumerate(node.outputs):
|
||||
if output.type not in IGNORED_SOCKETS:
|
||||
if hasattr(output, 'default_value'):
|
||||
dumped_node['outputs'].append(
|
||||
io_dumper.dump(output.default_value))
|
||||
|
||||
if hasattr(node, 'color_ramp'):
|
||||
ramp_dumper = Dumper()
|
||||
ramp_dumper.depth = 4
|
||||
ramp_dumper.include_filter = [
|
||||
'elements',
|
||||
'alpha',
|
||||
'color',
|
||||
'position',
|
||||
'interpolation',
|
||||
'hue_interpolation',
|
||||
'color_mode'
|
||||
]
|
||||
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
|
||||
if hasattr(node, 'mapping'):
|
||||
curve_dumper = Dumper()
|
||||
curve_dumper.depth = 5
|
||||
curve_dumper.include_filter = [
|
||||
'curves',
|
||||
'points',
|
||||
'location'
|
||||
]
|
||||
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
|
||||
if hasattr(node, 'image') and getattr(node, 'image'):
|
||||
dumped_node['image_uuid'] = node.image.uuid
|
||||
if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
|
||||
dumped_node['node_tree_uuid'] = node.node_tree.uuid
|
||||
return dumped_node
|
||||
|
||||
|
||||
|
||||
def load_links(links_data, node_tree):
|
||||
""" Load node_tree links from a list
|
||||
|
||||
:arg links_data: dumped node links
|
||||
:type links_data: list
|
||||
:arg node_tree: node links collection
|
||||
:type node_tree: bpy.types.NodeTree
|
||||
"""
|
||||
|
||||
for link in links_data:
|
||||
input_socket = node_tree.nodes[link['to_node']
|
||||
].inputs[int(link['to_socket'])]
|
||||
output_socket = node_tree.nodes[link['from_node']].outputs[int(
|
||||
link['from_socket'])]
|
||||
node_tree.links.new(input_socket, output_socket)
|
||||
|
||||
|
||||
def dump_links(links):
|
||||
""" Dump node_tree links collection to a list
|
||||
|
||||
:arg links: node links collection
|
||||
:type links: bpy.types.NodeLinks
|
||||
:retrun: list
|
||||
"""
|
||||
|
||||
links_data = []
|
||||
|
||||
for link in links:
|
||||
to_socket = NODE_SOCKET_INDEX.search(
|
||||
link.to_socket.path_from_id()).group(1)
|
||||
from_socket = NODE_SOCKET_INDEX.search(
|
||||
link.from_socket.path_from_id()).group(1)
|
||||
links_data.append({
|
||||
'to_node': link.to_node.name,
|
||||
'to_socket': to_socket,
|
||||
'from_node': link.from_node.name,
|
||||
'from_socket': from_socket,
|
||||
})
|
||||
|
||||
return links_data
|
||||
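`dump_links` above identifies each link endpoint by the socket's index, extracted from its RNA path with `NODE_SOCKET_INDEX`. A standalone illustration of that extraction; the example path is the typical shape returned by `socket.path_from_id()` for a node input.

```python
import re

NODE_SOCKET_INDEX = re.compile(r'\[(\d*)\]')

rna_path = 'nodes["Principled BSDF"].inputs[7]'   # typical socket.path_from_id() result

socket_index = NODE_SOCKET_INDEX.search(rna_path).group(1)
print(socket_index)   # "7", which load_links later converts with int(link['to_socket'])
```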
|
||||
|
||||
def dump_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
|
||||
""" Dump a shader node_tree to a dict including links and nodes
|
||||
|
||||
:arg node_tree: dumped shader node tree
|
||||
:type node_tree: bpy.types.ShaderNodeTree
|
||||
:return: dict
|
||||
"""
|
||||
node_tree_data = {
|
||||
'nodes': {node.name: dump_node(node) for node in node_tree.nodes},
|
||||
'links': dump_links(node_tree.links),
|
||||
'name': node_tree.name,
|
||||
'type': type(node_tree).__name__
|
||||
}
|
||||
|
||||
for socket_id in ['inputs', 'outputs']:
|
||||
socket_collection = getattr(node_tree, socket_id)
|
||||
node_tree_data[socket_id] = dump_node_tree_sockets(socket_collection)
|
||||
|
||||
return node_tree_data
|
||||
|
||||
|
||||
def dump_node_tree_sockets(sockets: bpy.types.Collection) -> dict:
|
||||
""" dump sockets of a shader_node_tree
|
||||
|
||||
:arg target_node_tree: target node_tree
|
||||
:type target_node_tree: bpy.types.NodeTree
|
||||
:arg socket_id: socket identifer
|
||||
:type socket_id: str
|
||||
:return: dict
|
||||
"""
|
||||
sockets_data = []
|
||||
for socket in sockets:
|
||||
try:
|
||||
socket_uuid = socket['uuid']
|
||||
except Exception:
|
||||
socket_uuid = str(uuid4())
|
||||
socket['uuid'] = socket_uuid
|
||||
|
||||
sockets_data.append((socket.name, socket.bl_socket_idname, socket_uuid))
|
||||
|
||||
return sockets_data
|
||||
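`dump_node_tree_sockets` above tags every interface socket with a persistent uuid (stored as an ID property) and serializes the interface as `(name, bl_socket_idname, uuid)` tuples. A sketch of that payload; the socket names, idnames, and uuids are illustrative values.

```python
# One tuple per interface socket of the node tree.
sockets_data = [
    ("Geometry", "NodeSocketGeometry", "5f0c5d9a-1b7e-4d55-9c1d-3a2b6f0e8c11"),
    ("Scale",    "NodeSocketFloat",    "a41b2c33-8d90-4f6a-b6de-7c51e2a90f04"),
]

# load_node_tree_sockets keeps existing sockets whose stored uuid still appears here,
# realigns their names by position, and creates any missing ones with
# sockets.new(bl_socket_idname, name) before re-tagging them with the dumped uuid.
```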
|
||||
|
||||
def load_node_tree_sockets(sockets: bpy.types.Collection,
|
||||
sockets_data: dict):
|
||||
""" load sockets of a shader_node_tree
|
||||
|
||||
:arg target_node_tree: target node_tree
|
||||
:type target_node_tree: bpy.types.NodeTree
|
||||
:arg socket_id: socket identifer
|
||||
:type socket_id: str
|
||||
:arg socket_data: dumped socket data
|
||||
:type socket_data: dict
|
||||
"""
|
||||
# Check for removed sockets
|
||||
for socket in sockets:
|
||||
if not [s for s in sockets_data if 'uuid' in socket and socket['uuid'] == s[2]]:
|
||||
sockets.remove(socket)
|
||||
|
||||
# Check for new sockets
|
||||
for idx, socket_data in enumerate(sockets_data):
|
||||
try:
|
||||
checked_socket = sockets[idx]
|
||||
if checked_socket.name != socket_data[0]:
|
||||
checked_socket.name = socket_data[0]
|
||||
except Exception:
|
||||
s = sockets.new(socket_data[1], socket_data[0])
|
||||
s['uuid'] = socket_data[2]
|
||||
|
||||
|
||||
def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
|
||||
"""Load a shader node_tree from dumped data
|
||||
|
||||
:arg node_tree_data: dumped node data
|
||||
:type node_tree_data: dict
|
||||
:arg target_node_tree: target node_tree
|
||||
:type target_node_tree: bpy.types.NodeTree
|
||||
"""
|
||||
# TODO: load only required nodes
|
||||
target_node_tree.nodes.clear()
|
||||
|
||||
if not target_node_tree.is_property_readonly('name'):
|
||||
target_node_tree.name = node_tree_data['name']
|
||||
|
||||
if 'inputs' in node_tree_data:
|
||||
socket_collection = getattr(target_node_tree, 'inputs')
|
||||
load_node_tree_sockets(socket_collection, node_tree_data['inputs'])
|
||||
|
||||
if 'outputs' in node_tree_data:
|
||||
socket_collection = getattr(target_node_tree, 'outputs')
|
||||
load_node_tree_sockets(socket_collection, node_tree_data['outputs'])
|
||||
|
||||
# Load nodes
|
||||
for node in node_tree_data["nodes"]:
|
||||
load_node(node_tree_data["nodes"][node], target_node_tree)
|
||||
|
||||
for node_id, node_data in node_tree_data["nodes"].items():
|
||||
target_node = target_node_tree.nodes.get(node_id, None)
|
||||
if target_node is None:
|
||||
continue
|
||||
elif 'parent' in node_data:
|
||||
target_node.parent = target_node_tree.nodes[node_data['parent']]
|
||||
else:
|
||||
target_node.parent = None
|
||||
# TODO: load only required nodes links
|
||||
# Load nodes links
|
||||
target_node_tree.links.clear()
|
||||
|
||||
load_links(node_tree_data["links"], target_node_tree)
|
||||
|
||||
|
||||
def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
|
||||
def has_image(node): return (
|
||||
node.type in ['TEX_IMAGE', 'TEX_ENVIRONMENT'] and node.image)
|
||||
|
||||
def has_node_group(node): return (
|
||||
hasattr(node, 'node_tree') and node.node_tree)
|
||||
|
||||
def has_texture(node): return (
|
||||
node.type in ['ATTRIBUTE_SAMPLE_TEXTURE','TEXTURE'] and node.texture)
|
||||
deps = []
|
||||
|
||||
for node in node_tree.nodes:
|
||||
if has_image(node):
|
||||
deps.append(node.image)
|
||||
elif has_node_group(node):
|
||||
deps.append(node.node_tree)
|
||||
elif has_texture(node):
|
||||
deps.append(node.texture)
|
||||
|
||||
return deps
|
||||
|
||||
|
||||
def dump_materials_slots(materials: bpy.types.bpy_prop_collection) -> list:
|
||||
""" Dump material slots collection
|
||||
@ -53,20 +387,22 @@ def load_materials_slots(src_materials: list, dst_materials: bpy.types.bpy_prop_
|
||||
|
||||
for mat_uuid, mat_name in src_materials:
|
||||
mat_ref = None
|
||||
if mat_uuid is not None:
|
||||
if mat_uuid:
|
||||
mat_ref = get_datablock_from_uuid(mat_uuid, None)
|
||||
else:
|
||||
mat_ref = bpy.data.materials[mat_name]
|
||||
|
||||
dst_materials.append(mat_ref)
|
||||
|
||||
|
||||
class BlMaterial(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "materials"
|
||||
bl_class = bpy.types.Material
|
||||
bl_check_common = False
|
||||
bl_icon = 'MATERIAL_DATA'
|
||||
bl_reload_parent = False
|
||||
bl_reload_child = True
|
||||
|
||||
@staticmethod
|
||||
def construct(data: dict) -> object:
|
||||
@ -74,8 +410,6 @@ class BlMaterial(ReplicatedDatablock):
|
||||
|
||||
@staticmethod
|
||||
def load(data: dict, datablock: object):
|
||||
load_animation_data(data.get('animation_data'), datablock)
|
||||
|
||||
loader = Loader()
|
||||
|
||||
is_grease_pencil = data.get('is_grease_pencil')
|
||||
@ -92,6 +426,8 @@ class BlMaterial(ReplicatedDatablock):
|
||||
datablock.use_nodes = True
|
||||
|
||||
load_node_tree(data['node_tree'], datablock.node_tree)
|
||||
load_animation_data(data.get('nodes_animation_data'), datablock.node_tree)
|
||||
load_animation_data(data.get('animation_data'), datablock)
|
||||
|
||||
@staticmethod
|
||||
def dump(datablock: object) -> dict:
|
||||
@ -159,8 +495,10 @@ class BlMaterial(ReplicatedDatablock):
|
||||
data['grease_pencil'] = gp_mat_dumper.dump(datablock.grease_pencil)
|
||||
elif datablock.use_nodes:
|
||||
data['node_tree'] = dump_node_tree(datablock.node_tree)
|
||||
data['nodes_animation_data'] = dump_animation_data(datablock.node_tree)
|
||||
|
||||
data['animation_data'] = dump_animation_data(datablock)
|
||||
|
||||
return data
|
||||
|
||||
@staticmethod
|
||||
@ -174,7 +512,7 @@ class BlMaterial(ReplicatedDatablock):
|
||||
|
||||
if datablock.use_nodes:
|
||||
deps.extend(get_node_tree_dependencies(datablock.node_tree))
|
||||
|
||||
deps.extend(resolve_animation_dependencies(datablock.node_tree))
|
||||
deps.extend(resolve_animation_dependencies(datablock))
|
||||
|
||||
return deps
|
||||
|
@ -55,6 +55,8 @@ POLYGON = [
|
||||
]
|
||||
|
||||
class BlMesh(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "meshes"
|
||||
bl_class = bpy.types.Mesh
|
||||
bl_check_common = False
|
||||
|
@ -65,6 +65,8 @@ def load_metaball_elements(elements_data, elements):
|
||||
|
||||
|
||||
class BlMetaball(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "metaballs"
|
||||
bl_class = bpy.types.MetaBall
|
||||
bl_check_common = False
|
||||
|
@ -28,6 +28,8 @@ from .bl_datablock import resolve_datablock_from_uuid
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
|
||||
class BlNodeGroup(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "node_groups"
|
||||
bl_class = bpy.types.NodeTree
|
||||
bl_check_common = False
|
||||
|
@ -24,7 +24,7 @@ from replication.exception import ContextError
|
||||
|
||||
from replication.protocol import ReplicatedDatablock
|
||||
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
|
||||
from .node_tree import IGNORED_SOCKETS
|
||||
from .bl_material import IGNORED_SOCKETS
|
||||
from ..utils import get_preferences
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
from .dump_anything import (
|
||||
@ -493,6 +493,8 @@ def load_modifiers_custom_data(dumped_modifiers: dict, modifiers: bpy.types.bpy_
|
||||
|
||||
|
||||
class BlObject(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "objects"
|
||||
bl_class = bpy.types.Object
|
||||
bl_check_common = False
|
||||
@ -618,10 +620,8 @@ class BlObject(ReplicatedDatablock):
|
||||
|
||||
transform = data.get('transforms', None)
|
||||
if transform:
|
||||
datablock.matrix_parent_inverse = mathutils.Matrix(
|
||||
transform['matrix_parent_inverse'])
|
||||
datablock.matrix_parent_inverse = mathutils.Matrix(transform['matrix_parent_inverse'])
|
||||
datablock.matrix_basis = mathutils.Matrix(transform['matrix_basis'])
|
||||
datablock.matrix_local = mathutils.Matrix(transform['matrix_local'])
|
||||
|
||||
|
||||
@staticmethod
|
||||
|
@ -3,7 +3,8 @@ import mathutils
|
||||
|
||||
from . import dump_anything
|
||||
from replication.protocol import ReplicatedDatablock
|
||||
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
|
||||
from .bl_datablock import get_datablock_from_uuid
|
||||
from .bl_datablock import resolve_datablock_from_uuid
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
|
||||
|
||||
@ -40,6 +41,8 @@ IGNORED_ATTR = [
|
||||
]
|
||||
|
||||
class BlParticle(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "particles"
|
||||
bl_class = bpy.types.ParticleSettings
|
||||
bl_icon = "PARTICLES"
|
||||
|
@ -19,7 +19,6 @@
|
||||
import logging
|
||||
from pathlib import Path
|
||||
from uuid import uuid4
|
||||
import re
|
||||
|
||||
import bpy
|
||||
import mathutils
|
||||
@ -30,12 +29,10 @@ from replication.protocol import ReplicatedDatablock
|
||||
from ..utils import flush_history, get_preferences
|
||||
from .bl_action import (dump_animation_data, load_animation_data,
|
||||
resolve_animation_dependencies)
|
||||
from .node_tree import (get_node_tree_dependencies, load_node_tree,
|
||||
dump_node_tree)
|
||||
from .bl_collection import (dump_collection_children, dump_collection_objects,
|
||||
load_collection_childrens, load_collection_objects,
|
||||
resolve_collection_dependencies)
|
||||
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
|
||||
from .bl_datablock import resolve_datablock_from_uuid
|
||||
from .bl_file import get_filepath
|
||||
from .dump_anything import Dumper, Loader
|
||||
|
||||
@ -306,6 +303,7 @@ def dump_sequence(sequence: bpy.types.Sequence) -> dict:
|
||||
|
||||
return data
|
||||
|
||||
|
||||
def load_sequence(sequence_data: dict,
|
||||
sequence_editor: bpy.types.SequenceEditor):
|
||||
""" Load sequence from dumped data
|
||||
@ -372,6 +370,7 @@ def load_sequence(sequence_data: dict,
|
||||
loader.load(sequence, sequence_data)
|
||||
sequence.select = False
|
||||
|
||||
|
||||
class BlScene(ReplicatedDatablock):
|
||||
is_root = True
|
||||
use_delta = True
|
||||
@ -404,8 +403,9 @@ class BlScene(ReplicatedDatablock):
|
||||
datablock.world = bpy.data.worlds[data['world']]
|
||||
|
||||
# Annotation
|
||||
if 'grease_pencil' in data.keys():
|
||||
datablock.grease_pencil = bpy.data.grease_pencils[data['grease_pencil']]
|
||||
gpencil_uid = data.get('grease_pencil')
|
||||
if gpencil_uid:
|
||||
datablock.grease_pencil = resolve_datablock_from_uuid(gpencil_uid, bpy.data.grease_pencils)
|
||||
|
||||
if get_preferences().sync_flags.sync_render_settings:
|
||||
if 'eevee' in data.keys():
|
||||
@ -446,16 +446,17 @@ class BlScene(ReplicatedDatablock):
|
||||
elif datablock.sequence_editor and not sequences:
|
||||
datablock.sequence_editor_clear()
|
||||
|
||||
# Timeline markers
|
||||
markers = data.get('timeline_markers')
|
||||
if markers:
|
||||
datablock.timeline_markers.clear()
|
||||
for name, frame, camera in markers:
|
||||
marker = datablock.timeline_markers.new(name, frame=frame)
|
||||
if camera:
|
||||
marker.camera = resolve_datablock_from_uuid(camera, bpy.data.objects)
|
||||
marker.select = False
|
||||
# FIXME: Find a better way after the replication big refacotoring
|
||||
# Keep other user from deleting collection object by flushing their history
|
||||
|
||||
# Compositor
|
||||
if data["use_nodes"]:
|
||||
if datablock.node_tree is None:
|
||||
datablock.use_nodes = True
|
||||
|
||||
load_node_tree(data['node_tree'], datablock.node_tree)
|
||||
|
||||
flush_history()
|
||||
|
||||
@staticmethod
|
||||
@ -467,11 +468,9 @@ class BlScene(ReplicatedDatablock):
|
||||
scene_dumper = Dumper()
|
||||
scene_dumper.depth = 1
|
||||
scene_dumper.include_filter = [
|
||||
'use_nodes',
|
||||
'name',
|
||||
'world',
|
||||
'id',
|
||||
'grease_pencil',
|
||||
'frame_start',
|
||||
'frame_end',
|
||||
'frame_step',
|
||||
@ -526,12 +525,14 @@ class BlScene(ReplicatedDatablock):
|
||||
for seq in vse.sequences_all:
|
||||
dumped_sequences[seq.name] = dump_sequence(seq)
|
||||
data['sequences'] = dumped_sequences
|
||||
|
||||
# Compositor
|
||||
if datablock.use_nodes:
|
||||
data['node_tree'] = dump_node_tree(datablock.node_tree)
|
||||
data['animation_data'] = dump_animation_data(datablock)
|
||||
|
||||
# Timeline markers
|
||||
if datablock.timeline_markers:
|
||||
data['timeline_markers'] = [(m.name, m.frame, getattr(m.camera, 'uuid', None)) for m in datablock.timeline_markers]
|
||||
|
||||
if datablock.grease_pencil:
|
||||
data['grease_pencil'] = datablock.grease_pencil.uuid
|
||||
|
||||
return data
|
||||
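The `BlScene.dump` hunk above serializes timeline markers as `(name, frame, camera_uuid)` tuples, with `None` when no camera is bound. A sketch of that payload and of how the load side shown earlier in this diff rebuilds it; the marker names and the uuid are illustrative.

```python
timeline_markers = [
    ("shot_010", 1,   "0b2f6c1e-53aa-4f02-8f3d-9d7c41f5a6b2"),   # marker bound to a camera
    ("shot_020", 250, None),                                      # marker without a camera
]

# On load the collection is cleared, each marker is recreated with
# timeline_markers.new(name, frame=frame), and a non-None camera uuid is resolved
# back to its object via resolve_datablock_from_uuid(camera, bpy.data.objects).
```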
|
||||
@staticmethod
|
||||
@ -565,12 +566,6 @@ class BlScene(ReplicatedDatablock):
|
||||
Path(bpy.path.abspath(sequence.directory),
|
||||
elem.filename))
|
||||
|
||||
# Compositor
|
||||
if datablock.use_nodes:
|
||||
deps.extend(get_node_tree_dependencies(datablock.node_tree))
|
||||
|
||||
deps.extend(resolve_animation_dependencies(datablock))
|
||||
|
||||
return deps
|
||||
|
||||
@staticmethod
|
||||
|
@ -25,6 +25,8 @@ from .bl_datablock import resolve_datablock_from_uuid
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
|
||||
class BlSpeaker(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "speakers"
|
||||
bl_class = bpy.types.Speaker
|
||||
bl_check_common = False
|
||||
|
@ -26,6 +26,8 @@ from .bl_action import dump_animation_data, load_animation_data, resolve_animati
|
||||
import bpy.types as T
|
||||
|
||||
class BlTexture(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "textures"
|
||||
bl_class = bpy.types.Texture
|
||||
bl_check_common = False
|
||||
|
@ -27,6 +27,8 @@ from .bl_material import dump_materials_slots, load_materials_slots
|
||||
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
|
||||
|
||||
class BlVolume(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "volumes"
|
||||
bl_class = bpy.types.Volume
|
||||
bl_check_common = False
|
||||
|
@ -21,7 +21,7 @@ import mathutils
|
||||
|
||||
from .dump_anything import Loader, Dumper
|
||||
from replication.protocol import ReplicatedDatablock
|
||||
from .node_tree import (load_node_tree,
|
||||
from .bl_material import (load_node_tree,
|
||||
dump_node_tree,
|
||||
get_node_tree_dependencies)
|
||||
|
||||
@ -30,6 +30,8 @@ from .bl_action import dump_animation_data, load_animation_data, resolve_animati
|
||||
|
||||
|
||||
class BlWorld(ReplicatedDatablock):
|
||||
use_delta = True
|
||||
|
||||
bl_id = "worlds"
|
||||
bl_class = bpy.types.World
|
||||
bl_check_common = True
|
||||
|
@ -1,362 +0,0 @@
|
||||
# ##### BEGIN GPL LICENSE BLOCK #####
|
||||
#
|
||||
# This program is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
#
|
||||
# ##### END GPL LICENSE BLOCK #####
|
||||
|
||||
import bpy
|
||||
import mathutils
|
||||
import logging
|
||||
import re
|
||||
|
||||
from uuid import uuid4
|
||||
|
||||
from .dump_anything import Loader, Dumper
|
||||
|
||||
from .bl_datablock import get_datablock_from_uuid, resolve_datablock_from_uuid
|
||||
|
||||
IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']
|
||||
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
|
||||
|
||||
def load_node(node_data: dict, node_tree: bpy.types.NodeTree):
|
||||
""" Load a node into a node_tree from a dict
|
||||
|
||||
:arg node_data: dumped node data
|
||||
:type node_data: dict
|
||||
:arg node_tree: target node_tree
|
||||
:type node_tree: bpy.types.NodeTree
|
||||
"""
|
||||
loader = Loader()
|
||||
target_node = node_tree.nodes.new(type=node_data["bl_idname"])
|
||||
target_node.select = False
|
||||
loader.load(target_node, node_data)
|
||||
image_uuid = node_data.get('image_uuid', None)
|
||||
node_tree_uuid = node_data.get('node_tree_uuid', None)
|
||||
|
||||
if image_uuid and not target_node.image:
|
||||
image = resolve_datablock_from_uuid(image_uuid, bpy.data.images)
|
||||
if image is None:
|
||||
logging.error(f"Fail to find material image from uuid {image_uuid}")
|
||||
else:
|
||||
target_node.image = image
|
||||
|
||||
if node_tree_uuid:
|
||||
target_node.node_tree = get_datablock_from_uuid(node_tree_uuid, None)
|
||||
|
||||
inputs_data = node_data.get('inputs')
|
||||
if inputs_data:
|
||||
inputs = [i for i in target_node.inputs if i.type not in IGNORED_SOCKETS]
|
||||
for idx, inpt in enumerate(inputs):
|
||||
if idx < len(inputs_data) and hasattr(inpt, "default_value"):
|
||||
loaded_input = inputs_data[idx]
|
||||
try:
|
||||
if inpt.type in ['OBJECT', 'COLLECTION']:
|
||||
inpt.default_value = get_datablock_from_uuid(loaded_input, None)
|
||||
else:
|
||||
inpt.default_value = loaded_input
|
||||
except Exception as e:
|
||||
logging.warning(f"Node {target_node.name} input {inpt.name} parameter not supported, skipping ({e})")
|
||||
else:
|
||||
logging.warning(f"Node {target_node.name} input length mismatch.")
|
||||
|
||||
outputs_data = node_data.get('outputs')
|
||||
if outputs_data:
|
||||
outputs = [o for o in target_node.outputs if o.type not in IGNORED_SOCKETS]
|
||||
for idx, output in enumerate(outputs):
|
||||
if idx < len(outputs_data) and hasattr(output, "default_value"):
|
||||
loaded_output = outputs_data[idx]
|
||||
try:
|
||||
if output.type in ['OBJECT', 'COLLECTION']:
|
||||
output.default_value = get_datablock_from_uuid(loaded_output, None)
|
||||
else:
|
||||
output.default_value = loaded_output
|
||||
except Exception as e:
|
||||
logging.warning(
|
||||
f"Node {target_node.name} output {output.name} parameter not supported, skipping ({e})")
|
||||
else:
|
||||
logging.warning(
|
||||
f"Node {target_node.name} output length mismatch.")
|
||||
|
||||
|
||||
def dump_node(node: bpy.types.Node) -> dict:
|
||||
""" Dump a single node to a dict
|
||||
|
||||
:arg node: target node
|
||||
:type node: bpy.types.Node
|
||||
:retrun: dict
|
||||
"""
|
||||
|
||||
node_dumper = Dumper()
|
||||
node_dumper.depth = 1
|
||||
node_dumper.exclude_filter = [
|
||||
"dimensions",
|
||||
"show_expanded",
|
||||
"name_full",
|
||||
"select",
|
||||
"bl_label",
|
||||
"bl_height_min",
|
||||
"bl_height_max",
|
||||
"bl_height_default",
|
||||
"bl_width_min",
|
||||
"bl_width_max",
|
||||
"type",
|
||||
"bl_icon",
|
||||
"bl_width_default",
|
||||
"bl_static_type",
|
||||
"show_tetxure",
|
||||
"is_active_output",
|
||||
"hide",
|
||||
"show_options",
|
||||
"show_preview",
|
||||
"show_texture",
|
||||
"outputs",
|
||||
"width_hidden",
|
||||
"image"
|
||||
]
|
||||
|
||||
dumped_node = node_dumper.dump(node)
|
||||
|
||||
if node.parent:
|
||||
dumped_node['parent'] = node.parent.name
|
||||
|
||||
dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
|
||||
|
||||
|
||||
if dump_io_needed:
|
||||
io_dumper = Dumper()
|
||||
io_dumper.depth = 2
|
||||
io_dumper.include_filter = ["default_value"]
|
||||
|
||||
if hasattr(node, 'inputs'):
|
||||
dumped_node['inputs'] = []
|
||||
inputs = [i for i in node.inputs if i.type not in IGNORED_SOCKETS]
|
||||
for idx, inpt in enumerate(inputs):
|
||||
if hasattr(inpt, 'default_value'):
|
||||
if isinstance(inpt.default_value, bpy.types.ID):
|
||||
dumped_input = inpt.default_value.uuid
|
||||
else:
|
||||
dumped_input = io_dumper.dump(inpt.default_value)
|
||||
|
||||
dumped_node['inputs'].append(dumped_input)
|
||||
|
||||
if hasattr(node, 'outputs'):
|
||||
dumped_node['outputs'] = []
|
||||
for idx, output in enumerate(node.outputs):
|
||||
if output.type not in IGNORED_SOCKETS:
|
||||
if hasattr(output, 'default_value'):
|
||||
dumped_node['outputs'].append(
|
||||
io_dumper.dump(output.default_value))
|
||||
|
||||
if hasattr(node, 'color_ramp'):
|
||||
ramp_dumper = Dumper()
|
||||
ramp_dumper.depth = 4
|
||||
ramp_dumper.include_filter = [
|
||||
'elements',
|
||||
'alpha',
|
||||
'color',
|
||||
'position',
|
||||
'interpolation',
|
||||
'hue_interpolation',
|
||||
'color_mode'
|
||||
]
|
||||
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
|
||||
if hasattr(node, 'mapping'):
|
||||
curve_dumper = Dumper()
|
||||
curve_dumper.depth = 5
|
||||
curve_dumper.include_filter = [
|
||||
'curves',
|
||||
'points',
|
||||
'location'
|
||||
]
|
||||
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
|
||||
if hasattr(node, 'image') and getattr(node, 'image'):
|
||||
dumped_node['image_uuid'] = node.image.uuid
|
||||
if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
|
||||
dumped_node['node_tree_uuid'] = node.node_tree.uuid
|
||||
return dumped_node
|
||||
|
||||
|
||||
def load_links(links_data, node_tree):
|
||||
""" Load node_tree links from a list
|
||||
|
||||
:arg links_data: dumped node links
|
||||
:type links_data: list
|
||||
:arg node_tree: node links collection
|
||||
:type node_tree: bpy.types.NodeTree
|
||||
"""
|
||||
|
||||
for link in links_data:
|
||||
input_socket = node_tree.nodes[link['to_node']
|
||||
].inputs[int(link['to_socket'])]
|
||||
output_socket = node_tree.nodes[link['from_node']].outputs[int(
|
||||
link['from_socket'])]
|
||||
node_tree.links.new(input_socket, output_socket)
|
||||
|
||||
|
||||
def dump_links(links):
|
||||
""" Dump node_tree links collection to a list
|
||||
|
||||
:arg links: node links collection
|
||||
:type links: bpy.types.NodeLinks
|
||||
:retrun: list
|
||||
"""
|
||||
|
||||
links_data = []
|
||||
|
||||
for link in links:
|
||||
to_socket = NODE_SOCKET_INDEX.search(
|
||||
link.to_socket.path_from_id()).group(1)
|
||||
from_socket = NODE_SOCKET_INDEX.search(
|
||||
link.from_socket.path_from_id()).group(1)
|
||||
links_data.append({
|
||||
'to_node': link.to_node.name,
|
||||
'to_socket': to_socket,
|
||||
'from_node': link.from_node.name,
|
||||
'from_socket': from_socket,
|
||||
})
|
||||
|
||||
return links_data
|
||||
|
||||
|
||||
def dump_node_tree(node_tree: bpy.types.NodeTree) -> dict:
|
||||
""" Dump a node_tree to a dict including links and nodes
|
||||
|
||||
:arg node_tree: dumped node tree
|
||||
:type node_tree: bpy.types.NodeTree
|
||||
:return: dict
|
||||
"""
|
||||
node_tree_data = {
|
||||
'nodes': {node.name: dump_node(node) for node in node_tree.nodes},
|
||||
'links': dump_links(node_tree.links),
|
||||
'name': node_tree.name,
|
||||
'type': type(node_tree).__name__
|
||||
}
|
||||
|
||||
for socket_id in ['inputs', 'outputs']:
|
||||
socket_collection = getattr(node_tree, socket_id)
|
||||
node_tree_data[socket_id] = dump_node_tree_sockets(socket_collection)
|
||||
|
||||
return node_tree_data
|
||||
|
||||
|
||||
def dump_node_tree_sockets(sockets: bpy.types.Collection) -> list:
    """ dump sockets of a shader_node_tree

    :arg sockets: socket collection to dump
    :type sockets: bpy.types.Collection
    :return: list of (name, socket idname, uuid) tuples
    """
    sockets_data = []
    for socket in sockets:
        try:
            socket_uuid = socket['uuid']
        except Exception:
            socket_uuid = str(uuid4())
            socket['uuid'] = socket_uuid

        sockets_data.append((socket.name, socket.bl_socket_idname, socket_uuid))

    return sockets_data


def load_node_tree_sockets(sockets: bpy.types.Collection,
                           sockets_data: list):
    """ load sockets of a shader_node_tree

    :arg sockets: target socket collection
    :type sockets: bpy.types.Collection
    :arg sockets_data: dumped socket data
    :type sockets_data: list
    """
    # Check for removed sockets
    for socket in sockets:
        if not [s for s in sockets_data if 'uuid' in socket and socket['uuid'] == s[2]]:
            sockets.remove(socket)

    # Check for new sockets
    for idx, socket_data in enumerate(sockets_data):
        try:
            checked_socket = sockets[idx]
            if checked_socket.name != socket_data[0]:
                checked_socket.name = socket_data[0]
        except Exception:
            s = sockets.new(socket_data[1], socket_data[0])
            s['uuid'] = socket_data[2]


def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.NodeTree):
    """Load a shader node_tree from dumped data

    :arg node_tree_data: dumped node tree data
    :type node_tree_data: dict
    :arg target_node_tree: target node_tree
    :type target_node_tree: bpy.types.NodeTree
    """
    # TODO: load only required nodes
    target_node_tree.nodes.clear()

    if not target_node_tree.is_property_readonly('name'):
        target_node_tree.name = node_tree_data['name']

    if 'inputs' in node_tree_data:
        socket_collection = getattr(target_node_tree, 'inputs')
        load_node_tree_sockets(socket_collection, node_tree_data['inputs'])

    if 'outputs' in node_tree_data:
        socket_collection = getattr(target_node_tree, 'outputs')
        load_node_tree_sockets(socket_collection, node_tree_data['outputs'])

    # Load nodes
    for node in node_tree_data["nodes"]:
        load_node(node_tree_data["nodes"][node], target_node_tree)

    for node_id, node_data in node_tree_data["nodes"].items():
        target_node = target_node_tree.nodes.get(node_id, None)
        if target_node is None:
            continue
        elif 'parent' in node_data:
            target_node.parent = target_node_tree.nodes[node_data['parent']]
        else:
            target_node.parent = None

    # TODO: load only required node links
    # Load node links
    target_node_tree.links.clear()

    load_links(node_tree_data["links"], target_node_tree)


def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
    def has_image(node): return (
        node.type in ['TEX_IMAGE', 'TEX_ENVIRONMENT', 'IMAGE', 'R_LAYER'] and node.image)

    def has_node_group(node): return (
        hasattr(node, 'node_tree') and node.node_tree)

    def has_texture(node): return (
        node.type in ['ATTRIBUTE_SAMPLE_TEXTURE', 'TEXTURE'] and node.texture)

    deps = []

    for node in node_tree.nodes:
        if has_image(node):
            deps.append(node.image)
        elif has_node_group(node):
            deps.append(node.node_tree)
        elif has_texture(node):
            deps.append(node.texture)

    return deps
152
multi_user/handlers.py
Normal file
@ -0,0 +1,152 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####


import logging

import bpy
from bpy.app.handlers import persistent
from replication import porcelain
from replication.constants import RP_COMMON, STATE_ACTIVE, STATE_SYNCING, UP
from replication.exception import ContextError, NonAuthorizedOperationError
from replication.interface import session

from . import shared_data, utils


def sanitize_deps_graph(remove_nodes: bool = False):
    """ Cleanup the replication graph
    """
    if session and session.state == STATE_ACTIVE:
        start = utils.current_milli_time()
        rm_cpt = 0
        for node in session.repository.graph.values():
            node.instance = session.repository.rdp.resolve(node.data)
            if node is None \
                    or (node.state == UP and not node.instance):
                if remove_nodes:
                    try:
                        porcelain.rm(session.repository,
                                     node.uuid,
                                     remove_dependencies=False)
                        logging.info(f"Removing {node.uuid}")
                        rm_cpt += 1
                    except NonAuthorizedOperationError:
                        continue
        logging.info(f"Sanitize took {utils.current_milli_time()-start} ms, removed {rm_cpt} nodes")


def update_external_dependencies():
    """Force external dependencies (files such as images) evaluation
    """
    external_types = ['WindowsPath', 'PosixPath', 'Image']
    nodes_ids = [n.uuid for n in session.repository.graph.values() if n.data['type_id'] in external_types]
    for node_id in nodes_ids:
        node = session.repository.graph.get(node_id)
        if node and node.owner in [session.repository.username, RP_COMMON]:
            porcelain.commit(session.repository, node_id)
            porcelain.push(session.repository, 'origin', node_id)


@persistent
def on_scene_update(scene):
    """Forward blender depsgraph update to replication
    """
    if session and session.state == STATE_ACTIVE:
        context = bpy.context
        blender_depsgraph = bpy.context.view_layer.depsgraph
        dependency_updates = [u for u in blender_depsgraph.updates]
        settings = utils.get_preferences()
        incoming_updates = shared_data.session.applied_updates

        distant_update = [getattr(u.id, 'uuid', None) for u in dependency_updates if getattr(u.id, 'uuid', None) in incoming_updates]
        if distant_update:
            for u in distant_update:
                shared_data.session.applied_updates.remove(u)
            logging.debug(f"Ignoring distant update of {dependency_updates[0].id.name}")
            return

        update_external_dependencies()

        # NOTE: maybe we don't need to check each update but only the first
        for update in reversed(dependency_updates):
            update_uuid = getattr(update.id, 'uuid', None)
            if update_uuid:
                node = session.repository.graph.get(update.id.uuid)
                check_common = session.repository.rdp.get_implementation(update.id).bl_check_common

                if node and (node.owner == session.repository.username or check_common):
                    logging.debug(f"Evaluate {update.id.name}")
                    if node.state == UP:
                        try:
                            porcelain.commit(session.repository, node.uuid)
                            porcelain.push(session.repository,
                                           'origin', node.uuid)
                        except ReferenceError:
                            logging.debug(f"Reference error {node.uuid}")
                        except ContextError as e:
                            logging.debug(e)
                        except Exception as e:
                            logging.error(e)
                else:
                    continue
            elif isinstance(update.id, bpy.types.Scene):
                scene = bpy.data.scenes.get(update.id.name)
                scn_uuid = porcelain.add(session.repository, scene)
                porcelain.commit(session.repository, scn_uuid)
                porcelain.push(session.repository, 'origin', scn_uuid)


@persistent
def resolve_deps_graph(dummy):
    """Resolve deps graph

    Temporary solution to resolve each node's pointers after an Undo.
    A future solution should be to avoid storing datablock references...

    """
    if session and session.state == STATE_ACTIVE:
        sanitize_deps_graph(remove_nodes=True)


@persistent
def load_pre_handler(dummy):
    if session and session.state in [STATE_ACTIVE, STATE_SYNCING]:
        bpy.ops.session.stop()


@persistent
def update_client_frame(scene):
    if session and session.state == STATE_ACTIVE:
        porcelain.update_user_metadata(session.repository, {
            'frame_current': scene.frame_current
        })


def register():
    bpy.app.handlers.undo_post.append(resolve_deps_graph)
    bpy.app.handlers.redo_post.append(resolve_deps_graph)

    bpy.app.handlers.load_pre.append(load_pre_handler)
    bpy.app.handlers.frame_change_pre.append(update_client_frame)


def unregister():
    bpy.app.handlers.undo_post.remove(resolve_deps_graph)
    bpy.app.handlers.redo_post.remove(resolve_deps_graph)

    bpy.app.handlers.load_pre.remove(load_pre_handler)
    bpy.app.handlers.frame_change_pre.remove(update_client_frame)
@ -27,12 +27,12 @@ import shutil
|
||||
import string
|
||||
import sys
|
||||
import time
|
||||
import traceback
|
||||
from datetime import datetime
|
||||
from operator import itemgetter
|
||||
from pathlib import Path
|
||||
from queue import Queue
|
||||
from time import gmtime, strftime
|
||||
import traceback
|
||||
|
||||
from bpy.props import FloatProperty
|
||||
|
||||
@ -45,16 +45,17 @@ import bpy
|
||||
import mathutils
|
||||
from bpy.app.handlers import persistent
|
||||
from bpy_extras.io_utils import ExportHelper, ImportHelper
|
||||
from replication import porcelain
|
||||
from replication.constants import (COMMITED, FETCHED, RP_COMMON, STATE_ACTIVE,
|
||||
STATE_INITIAL, STATE_SYNCING, UP)
|
||||
from replication.protocol import DataTranslationProtocol
|
||||
from replication.exception import ContextError, NonAuthorizedOperationError
|
||||
from replication.interface import session
|
||||
from replication import porcelain
|
||||
from replication.repository import Repository
|
||||
from replication.objects import Node
|
||||
from replication.protocol import DataTranslationProtocol
|
||||
from replication.repository import Repository
|
||||
|
||||
from . import bl_types, environment, timers, ui, utils
|
||||
from . import bl_types, environment, shared_data, timers, ui, utils
|
||||
from .handlers import on_scene_update, sanitize_deps_graph
|
||||
from .presence import SessionStatusWidget, renderer, view3d_find
|
||||
from .timers import registry
|
||||
|
||||
@ -99,7 +100,7 @@ def initialize_session():
|
||||
|
||||
# Step 2: Load nodes
|
||||
logging.info("Applying nodes")
|
||||
for node in session.repository.index_sorted:
|
||||
for node in session.repository.heads:
|
||||
porcelain.apply(session.repository, node)
|
||||
|
||||
logging.info("Registering timers")
|
||||
@ -112,7 +113,7 @@ def initialize_session():
|
||||
utils.flush_history()
|
||||
|
||||
# Step 6: Launch deps graph update handling
|
||||
bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
|
||||
bpy.app.handlers.depsgraph_update_post.append(on_scene_update)
|
||||
|
||||
|
||||
@session_callback('on_exit')
|
||||
@ -132,8 +133,8 @@ def on_connection_end(reason="none"):
|
||||
|
||||
stop_modal_executor = True
|
||||
|
||||
if depsgraph_evaluation in bpy.app.handlers.depsgraph_update_post:
|
||||
bpy.app.handlers.depsgraph_update_post.remove(depsgraph_evaluation)
|
||||
if on_scene_update in bpy.app.handlers.depsgraph_update_post:
|
||||
bpy.app.handlers.depsgraph_update_post.remove(on_scene_update)
|
||||
|
||||
# Step 3: remove file handled
|
||||
logger = logging.getLogger()
|
||||
@ -272,8 +273,7 @@ class SessionStartOperator(bpy.types.Operator):
|
||||
|
||||
session_update = timers.SessionStatusUpdate()
|
||||
session_user_sync = timers.SessionUserSync()
|
||||
session_background_executor = timers.MainThreadExecutor(
|
||||
execution_queue=background_execution_queue)
|
||||
session_background_executor = timers.MainThreadExecutor(execution_queue=background_execution_queue)
|
||||
session_listen = timers.SessionListenTimer(timeout=0.001)
|
||||
|
||||
session_listen.register()
|
||||
@ -285,6 +285,7 @@ class SessionStartOperator(bpy.types.Operator):
|
||||
deleyables.append(session_update)
|
||||
deleyables.append(session_user_sync)
|
||||
deleyables.append(session_listen)
|
||||
deleyables.append(timers.AnnotationUpdates())
|
||||
|
||||
return {"FINISHED"}
|
||||
|
||||
@ -603,9 +604,9 @@ class SessionApply(bpy.types.Operator):
|
||||
node_ref = session.repository.graph.get(self.target)
|
||||
porcelain.apply(session.repository,
|
||||
self.target,
|
||||
force=True,
|
||||
force_dependencies=self.reset_dependencies)
|
||||
force=True)
|
||||
impl = session.repository.rdp.get_implementation(node_ref.instance)
|
||||
# NOTE: find another way to handle child and parent automatic reloading
|
||||
if impl.bl_reload_parent:
|
||||
for parent in session.repository.graph.get_parents(self.target):
|
||||
logging.debug(f"Refresh parent {parent}")
|
||||
@ -613,6 +614,11 @@ class SessionApply(bpy.types.Operator):
|
||||
porcelain.apply(session.repository,
|
||||
parent.uuid,
|
||||
force=True)
|
||||
if hasattr(impl, 'bl_reload_child') and impl.bl_reload_child:
|
||||
for dep in node_ref.dependencies:
|
||||
porcelain.apply(session.repository,
|
||||
dep,
|
||||
force=True)
|
||||
except Exception as e:
|
||||
self.report({'ERROR'}, repr(e))
|
||||
traceback.print_exc()
|
||||
@ -636,7 +642,7 @@ class SessionCommit(bpy.types.Operator):
|
||||
def execute(self, context):
|
||||
try:
|
||||
porcelain.commit(session.repository, self.target)
|
||||
porcelain.push(session.repository, 'origin', self.target)
|
||||
porcelain.push(session.repository, 'origin', self.target, force=True)
|
||||
return {"FINISHED"}
|
||||
except Exception as e:
|
||||
self.report({'ERROR'}, repr(e))
|
||||
@ -684,6 +690,7 @@ class SessionPurgeOperator(bpy.types.Operator):
|
||||
def execute(self, context):
|
||||
try:
|
||||
sanitize_deps_graph(remove_nodes=True)
|
||||
porcelain.purge_orphan_nodes(session.repository)
|
||||
except Exception as e:
|
||||
self.report({'ERROR'}, repr(e))
|
||||
|
||||
@ -716,7 +723,6 @@ class SessionNotifyOperator(bpy.types.Operator):
|
||||
layout = self.layout
|
||||
layout.row().label(text=self.message)
|
||||
|
||||
|
||||
def invoke(self, context, event):
|
||||
return context.window_manager.invoke_props_dialog(self)
|
||||
|
||||
@ -778,6 +784,22 @@ class SessionStopAutoSaveOperator(bpy.types.Operator):
|
||||
|
||||
return {'FINISHED'}
|
||||
|
||||
class SessionGetInfo(bpy.types.Operator):
|
||||
bl_idname = "session.get_info"
|
||||
bl_label = "Get session info"
|
||||
bl_description = "Get session info"
|
||||
|
||||
target_server: bpy.props.StringProperty(default="127.0.0.1:5555")
|
||||
|
||||
@classmethod
|
||||
def poll(cls, context):
|
||||
return (session.state != STATE_ACTIVE)
|
||||
|
||||
def execute(self, context):
|
||||
infos = porcelain.request_session_info(self.target_server, timeout=100)
|
||||
logging.info(f"Session info: {infos}")
|
||||
|
||||
return {'FINISHED'}
|
||||
|
||||
class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
|
||||
bl_idname = "session.load"
|
||||
@ -916,113 +938,10 @@ classes = (
|
||||
SessionPurgeOperator,
|
||||
SessionPresetServerAdd,
|
||||
SessionPresetServerRemove,
|
||||
SessionGetInfo,
|
||||
)
|
||||
|
||||
|
||||
def update_external_dependencies():
|
||||
nodes_ids = [n.uuid for n in session.repository.graph.values() if n.data['type_id'] in ['WindowsPath', 'PosixPath']]
|
||||
for node_id in nodes_ids:
|
||||
node = session.repository.graph.get(node_id)
|
||||
if node and node.owner in [session.repository.username, RP_COMMON]:
|
||||
porcelain.commit(session.repository, node_id)
|
||||
porcelain.push(session.repository,'origin', node_id)
|
||||
|
||||
|
||||
def sanitize_deps_graph(remove_nodes: bool = False):
|
||||
""" Cleanup the replication graph
|
||||
"""
|
||||
if session and session.state == STATE_ACTIVE:
|
||||
start = utils.current_milli_time()
|
||||
rm_cpt = 0
|
||||
for node in session.repository.graph.values():
|
||||
node.instance = session.repository.rdp.resolve(node.data)
|
||||
if node is None \
|
||||
or (node.state == UP and not node.instance):
|
||||
if remove_nodes:
|
||||
try:
|
||||
porcelain.rm(session.repository,
|
||||
node.uuid,
|
||||
remove_dependencies=False)
|
||||
logging.info(f"Removing {node.uuid}")
|
||||
rm_cpt += 1
|
||||
except NonAuthorizedOperationError:
|
||||
continue
|
||||
logging.info(f"Sanitize took { utils.current_milli_time()-start} ms, removed {rm_cpt} nodes")
|
||||
|
||||
|
||||
@persistent
|
||||
def resolve_deps_graph(dummy):
|
||||
"""Resolve deps graph
|
||||
|
||||
Temporary solution to resolve each node pointers after a Undo.
|
||||
A future solution should be to avoid storing dataclock reference...
|
||||
|
||||
"""
|
||||
if session and session.state == STATE_ACTIVE:
|
||||
sanitize_deps_graph(remove_nodes=True)
|
||||
|
||||
|
||||
@persistent
|
||||
def load_pre_handler(dummy):
|
||||
if session and session.state in [STATE_ACTIVE, STATE_SYNCING]:
|
||||
bpy.ops.session.stop()
|
||||
|
||||
|
||||
@persistent
|
||||
def update_client_frame(scene):
|
||||
if session and session.state == STATE_ACTIVE:
|
||||
porcelain.update_user_metadata(session.repository, {
|
||||
'frame_current': scene.frame_current
|
||||
})
|
||||
|
||||
|
||||
@persistent
|
||||
def depsgraph_evaluation(scene):
|
||||
if session and session.state == STATE_ACTIVE:
|
||||
context = bpy.context
|
||||
blender_depsgraph = bpy.context.view_layer.depsgraph
|
||||
dependency_updates = [u for u in blender_depsgraph.updates]
|
||||
settings = utils.get_preferences()
|
||||
|
||||
update_external_dependencies()
|
||||
|
||||
is_internal = [u for u in dependency_updates if u.is_updated_geometry or u.is_updated_shading or u.is_updated_transform]
|
||||
|
||||
# NOTE: maybe we don't need to check each update but only the first
|
||||
if not is_internal:
|
||||
return
|
||||
for update in reversed(dependency_updates):
|
||||
# Is the object tracked ?
|
||||
if update.id.uuid:
|
||||
# Retrieve local version
|
||||
node = session.repository.graph.get(update.id.uuid)
|
||||
check_common = session.repository.rdp.get_implementation(update.id).bl_check_common
|
||||
# Check our right on this update:
|
||||
# - if its ours or ( under common and diff), launch the
|
||||
# update process
|
||||
# - if its to someone else, ignore the update
|
||||
if node and (node.owner == session.repository.username or check_common):
|
||||
if node.state == UP:
|
||||
try:
|
||||
porcelain.commit(session.repository, node.uuid)
|
||||
porcelain.push(session.repository, 'origin', node.uuid)
|
||||
except ReferenceError:
|
||||
logging.debug(f"Reference error {node.uuid}")
|
||||
except ContextError as e:
|
||||
logging.debug(e)
|
||||
except Exception as e:
|
||||
logging.error(e)
|
||||
else:
|
||||
continue
|
||||
# A new scene is created
|
||||
elif isinstance(update.id, bpy.types.Scene):
|
||||
ref = session.repository.get_node_by_datablock(update.id)
|
||||
if ref:
|
||||
pass
|
||||
else:
|
||||
scn_uuid = porcelain.add(session.repository, update.id)
|
||||
porcelain.commit(session.node_id, scn_uuid)
|
||||
porcelain.push(session.repository,'origin', scn_uuid)
|
||||
def register():
|
||||
from bpy.utils import register_class
|
||||
|
||||
@ -1030,13 +949,6 @@ def register():
|
||||
register_class(cls)
|
||||
|
||||
|
||||
bpy.app.handlers.undo_post.append(resolve_deps_graph)
|
||||
bpy.app.handlers.redo_post.append(resolve_deps_graph)
|
||||
|
||||
bpy.app.handlers.load_pre.append(load_pre_handler)
|
||||
bpy.app.handlers.frame_change_pre.append(update_client_frame)
|
||||
|
||||
|
||||
def unregister():
|
||||
if session and session.state == STATE_ACTIVE:
|
||||
session.disconnect()
|
||||
@ -1044,9 +956,3 @@ def unregister():
|
||||
from bpy.utils import unregister_class
|
||||
for cls in reversed(classes):
|
||||
unregister_class(cls)
|
||||
|
||||
bpy.app.handlers.undo_post.remove(resolve_deps_graph)
|
||||
bpy.app.handlers.redo_post.remove(resolve_deps_graph)
|
||||
|
||||
bpy.app.handlers.load_pre.remove(load_pre_handler)
|
||||
bpy.app.handlers.frame_change_pre.remove(update_client_frame)
|
||||
|
@ -273,6 +273,13 @@ class SessionPrefs(bpy.types.AddonPreferences):
|
||||
step=1,
|
||||
subtype='PERCENTAGE',
|
||||
)
|
||||
presence_mode_distance: bpy.props.FloatProperty(
|
||||
name="Distance mode visibilty",
|
||||
description="Adjust the distance visibilty of user's mode",
|
||||
min=0.1,
|
||||
max=1000,
|
||||
default=100,
|
||||
)
|
||||
conf_session_identity_expanded: bpy.props.BoolProperty(
|
||||
name="Identity",
|
||||
description="Identity",
|
||||
@ -446,10 +453,11 @@ class SessionPrefs(bpy.types.AddonPreferences):
|
||||
col = box.column(align=True)
|
||||
col.prop(self, "presence_hud_scale", expand=True)
|
||||
|
||||
|
||||
col.prop(self, "presence_hud_hpos", expand=True)
|
||||
col.prop(self, "presence_hud_vpos", expand=True)
|
||||
|
||||
col.prop(self, "presence_mode_distance", expand=True)
|
||||
|
||||
if self.category == 'UPDATE':
|
||||
from . import addon_updater_ops
|
||||
addon_updater_ops.update_settings_ui(self, context)
|
||||
@ -538,6 +546,11 @@ class SessionProps(bpy.types.PropertyGroup):
|
||||
description='Enable user overlay ',
|
||||
default=True,
|
||||
)
|
||||
presence_show_mode: bpy.props.BoolProperty(
|
||||
name="Show users current mode",
|
||||
description='Enable user mode overlay ',
|
||||
default=False,
|
||||
)
|
||||
presence_show_far_user: bpy.props.BoolProperty(
|
||||
name="Show users on different scenes",
|
||||
description="Show user on different scenes",
|
||||
|
@ -94,15 +94,41 @@ def project_to_viewport(region: bpy.types.Region, rv3d: bpy.types.RegionView3D,
|
||||
return [target.x, target.y, target.z]
|
||||
|
||||
|
||||
def bbox_from_obj(obj: bpy.types.Object, radius: float) -> list:
|
||||
def bbox_from_obj(obj: bpy.types.Object, index: int = 1) -> list:
|
||||
""" Generate a bounding box for a given object by using its world matrix
|
||||
|
||||
:param obj: target object
|
||||
:type obj: bpy.types.Object
|
||||
:param radius: bounding box radius
|
||||
:type radius: float
|
||||
:return: list of 8 points [(x,y,z),...]
|
||||
:param index: index offset
|
||||
:type index: int
|
||||
:return: list of 8 points [(x,y,z),...], list of 12 link between these points [(1,2),...]
|
||||
"""
|
||||
radius = 1.0 # Radius of the bounding box
|
||||
index = 8*index
|
||||
vertex_indices = (
|
||||
(0+index, 1+index), (0+index, 2+index), (1+index, 3+index), (2+index, 3+index),
|
||||
(4+index, 5+index), (4+index, 6+index), (5+index, 7+index), (6+index, 7+index),
|
||||
(0+index, 4+index), (1+index, 5+index), (2+index, 6+index), (3+index, 7+index))
|
||||
|
||||
if obj.type == 'EMPTY':
|
||||
radius = obj.empty_display_size
|
||||
elif obj.type == 'LIGHT':
|
||||
radius = obj.data.shadow_soft_size
|
||||
elif obj.type == 'LIGHT_PROBE':
|
||||
radius = obj.data.influence_distance
|
||||
elif obj.type == 'CAMERA':
|
||||
radius = obj.data.display_size
|
||||
elif hasattr(obj, 'bound_box'):
|
||||
vertex_indices = (
|
||||
(0+index, 1+index), (1+index, 2+index),
|
||||
(2+index, 3+index), (0+index, 3+index),
|
||||
(4+index, 5+index), (5+index, 6+index),
|
||||
(6+index, 7+index), (4+index, 7+index),
|
||||
(0+index, 4+index), (1+index, 5+index),
|
||||
(2+index, 6+index), (3+index, 7+index))
|
||||
vertex_pos = get_bb_coords_from_obj(obj)
|
||||
return vertex_pos, vertex_indices
|
||||
|
||||
coords = [
|
||||
(-radius, -radius, -radius), (+radius, -radius, -radius),
|
||||
(-radius, +radius, -radius), (+radius, +radius, -radius),
|
||||
@ -112,9 +138,32 @@ def bbox_from_obj(obj: bpy.types.Object, radius: float) -> list:
|
||||
base = obj.matrix_world
|
||||
bbox_corners = [base @ mathutils.Vector(corner) for corner in coords]
|
||||
|
||||
return [(point.x, point.y, point.z)
|
||||
for point in bbox_corners]
|
||||
vertex_pos = [(point.x, point.y, point.z) for point in bbox_corners]
|
||||
|
||||
return vertex_pos, vertex_indices
|
||||
|
||||
def bbox_from_instance_collection(ic: bpy.types.Object, index: int = 0) -> list:
|
||||
""" Generate a bounding box for a given instance collection by using its objects
|
||||
|
||||
:param ic: target instance collection
|
||||
:type ic: bpy.types.Object
|
||||
:param index: index offset
|
||||
:type index: int
|
||||
:return: list of 8*objs points [(x,y,z),...], tuple of 12*objs link between these points [(1,2),...]
|
||||
"""
|
||||
vertex_pos = []
|
||||
vertex_indices = ()
|
||||
|
||||
for obj_index, obj in enumerate(ic.instance_collection.objects):
|
||||
vertex_pos_temp, vertex_indices_temp = bbox_from_obj(obj, index=index+obj_index)
|
||||
vertex_pos += vertex_pos_temp
|
||||
vertex_indices += vertex_indices_temp
|
||||
|
||||
bbox_corners = [ic.matrix_world @ mathutils.Vector(vertex) for vertex in vertex_pos]
|
||||
|
||||
vertex_pos = [(point.x, point.y, point.z) for point in bbox_corners]
|
||||
|
||||
return vertex_pos, vertex_indices
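`bbox_from_obj` and `bbox_from_instance_collection` now return both the corner positions and the line indices, so a caller can feed them straight into a line batch. A minimal sketch of that consumption, assuming it runs inside a `POST_VIEW` draw callback where a GPU context exists and `bbox_from_obj` is in scope:

```python
import bpy
import gpu
from gpu_extras.batch import batch_for_shader

# 8 corner positions and 12 line indices for the active object.
positions, indices = bbox_from_obj(bpy.context.active_object, index=0)

shader = gpu.shader.from_builtin('3D_UNIFORM_COLOR')
batch = batch_for_shader(shader, 'LINES', {"pos": positions}, indices=indices)
shader.bind()
shader.uniform_float("color", (1.0, 0.5, 0.0, 1.0))
batch.draw(shader)
```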
def generate_user_camera() -> list:
|
||||
""" Generate a basic camera represention of the user point of view
|
||||
@ -175,7 +224,7 @@ def get_bb_coords_from_obj(object: bpy.types.Object, instance: bpy.types.Object
|
||||
|
||||
bbox_corners = [base @ mathutils.Vector(
|
||||
corner) for corner in object.bound_box]
|
||||
|
||||
|
||||
|
||||
return [(point.x, point.y, point.z) for point in bbox_corners]
|
||||
|
||||
@ -203,6 +252,13 @@ class Widget(object):
|
||||
"""
|
||||
return True
|
||||
|
||||
def configure_bgl(self):
|
||||
bgl.glLineWidth(2.)
|
||||
bgl.glEnable(bgl.GL_DEPTH_TEST)
|
||||
bgl.glEnable(bgl.GL_BLEND)
|
||||
bgl.glEnable(bgl.GL_LINE_SMOOTH)
|
||||
|
||||
|
||||
def draw(self):
|
||||
"""How to draw the widget
|
||||
"""
|
||||
@ -256,11 +312,6 @@ class UserFrustumWidget(Widget):
|
||||
{"pos": positions},
|
||||
indices=self.indices)
|
||||
|
||||
bgl.glLineWidth(2.)
|
||||
bgl.glEnable(bgl.GL_DEPTH_TEST)
|
||||
bgl.glEnable(bgl.GL_BLEND)
|
||||
bgl.glEnable(bgl.GL_LINE_SMOOTH)
|
||||
|
||||
shader.bind()
|
||||
shader.uniform_float("color", self.data.get('color'))
|
||||
batch.draw(shader)
|
||||
@ -272,6 +323,8 @@ class UserSelectionWidget(Widget):
|
||||
username):
|
||||
self.username = username
|
||||
self.settings = bpy.context.window_manager.session
|
||||
self.current_selection_ids = []
|
||||
self.current_selected_objects = []
|
||||
|
||||
@property
|
||||
def data(self):
|
||||
@ -281,6 +334,15 @@ class UserSelectionWidget(Widget):
|
||||
else:
|
||||
return None
|
||||
|
||||
@property
|
||||
def selected_objects(self):
|
||||
user_selection = self.data.get('selected_objects')
|
||||
if self.current_selection_ids != user_selection:
|
||||
self.current_selected_objects = [find_from_attr("uuid", uid, bpy.data.objects) for uid in user_selection]
|
||||
self.current_selection_ids = user_selection
|
||||
|
||||
return self.current_selected_objects
|
||||
|
||||
def poll(self):
|
||||
if self.data is None:
|
||||
return False
|
||||
@ -295,49 +357,31 @@ class UserSelectionWidget(Widget):
|
||||
self.settings.enable_presence
|
||||
|
||||
def draw(self):
|
||||
user_selection = self.data.get('selected_objects')
|
||||
for select_ob in user_selection:
|
||||
ob = find_from_attr("uuid", select_ob, bpy.data.objects)
|
||||
if not ob:
|
||||
return
|
||||
vertex_pos = []
|
||||
vertex_ind = []
|
||||
collection_offset = 0
|
||||
for obj_index, obj in enumerate(self.selected_objects):
|
||||
if obj is None:
|
||||
continue
|
||||
obj_index+=collection_offset
|
||||
if hasattr(obj, 'instance_collection') and obj.instance_collection:
|
||||
bbox_pos, bbox_ind = bbox_from_instance_collection(obj, index=obj_index)
|
||||
collection_offset+=len(obj.instance_collection.objects)-1
|
||||
else :
|
||||
bbox_pos, bbox_ind = bbox_from_obj(obj, index=obj_index)
|
||||
vertex_pos += bbox_pos
|
||||
vertex_ind += bbox_ind
|
||||
|
||||
vertex_pos = bbox_from_obj(ob, 1.0)
|
||||
vertex_indices = (
|
||||
(0, 1), (1, 2), (2, 3), (0, 3),
|
||||
(4, 5), (5, 6), (6, 7), (4, 7),
|
||||
(0, 4), (1, 5), (2, 6), (3, 7))
|
||||
|
||||
if ob.instance_collection:
|
||||
for obj in ob.instance_collection.objects:
|
||||
if obj.type == 'MESH' and hasattr(obj, 'bound_box'):
|
||||
vertex_pos = get_bb_coords_from_obj(obj, instance=ob)
|
||||
break
|
||||
elif ob.type == 'EMPTY':
|
||||
vertex_pos = bbox_from_obj(ob, ob.empty_display_size)
|
||||
elif ob.type == 'LIGHT':
|
||||
vertex_pos = bbox_from_obj(ob, ob.data.shadow_soft_size)
|
||||
elif ob.type == 'LIGHT_PROBE':
|
||||
vertex_pos = bbox_from_obj(ob, ob.data.influence_distance)
|
||||
elif ob.type == 'CAMERA':
|
||||
vertex_pos = bbox_from_obj(ob, ob.data.display_size)
|
||||
elif hasattr(ob, 'bound_box'):
|
||||
vertex_indices = (
|
||||
(0, 1), (1, 2), (2, 3), (0, 3),
|
||||
(4, 5), (5, 6), (6, 7), (4, 7),
|
||||
(0, 4), (1, 5), (2, 6), (3, 7))
|
||||
vertex_pos = get_bb_coords_from_obj(ob)
|
||||
|
||||
shader = gpu.shader.from_builtin('3D_UNIFORM_COLOR')
|
||||
batch = batch_for_shader(
|
||||
shader,
|
||||
'LINES',
|
||||
{"pos": vertex_pos},
|
||||
indices=vertex_indices)
|
||||
|
||||
shader.bind()
|
||||
shader.uniform_float("color", self.data.get('color'))
|
||||
batch.draw(shader)
|
||||
shader = gpu.shader.from_builtin('3D_UNIFORM_COLOR')
|
||||
batch = batch_for_shader(
|
||||
shader,
|
||||
'LINES',
|
||||
{"pos": vertex_pos},
|
||||
indices=vertex_ind)
|
||||
|
||||
shader.bind()
|
||||
shader.uniform_float("color", self.data.get('color'))
|
||||
batch.draw(shader)
|
||||
|
||||
class UserNameWidget(Widget):
|
||||
draw_type = 'POST_PIXEL'
|
||||
@ -381,6 +425,62 @@ class UserNameWidget(Widget):
|
||||
blf.color(0, color[0], color[1], color[2], color[3])
|
||||
blf.draw(0, self.username)
|
||||
|
||||
class UserModeWidget(Widget):
|
||||
draw_type = 'POST_PIXEL'
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
username):
|
||||
self.username = username
|
||||
self.settings = bpy.context.window_manager.session
|
||||
self.preferences = get_preferences()
|
||||
|
||||
@property
|
||||
def data(self):
|
||||
user = session.online_users.get(self.username)
|
||||
if user:
|
||||
return user.get('metadata')
|
||||
else:
|
||||
return None
|
||||
|
||||
def poll(self):
|
||||
if self.data is None:
|
||||
return False
|
||||
|
||||
scene_current = self.data.get('scene_current')
|
||||
mode_current = self.data.get('mode_current')
|
||||
user_selection = self.data.get('selected_objects')
|
||||
|
||||
return (scene_current == bpy.context.scene.name or
|
||||
mode_current == bpy.context.mode or
|
||||
self.settings.presence_show_far_user) and \
|
||||
user_selection and \
|
||||
self.settings.presence_show_mode and \
|
||||
self.settings.enable_presence
|
||||
|
||||
def draw(self):
|
||||
user_selection = self.data.get('selected_objects')
|
||||
area, region, rv3d = view3d_find()
|
||||
viewport_coord = project_to_viewport(region, rv3d, (0, 0))
|
||||
|
||||
obj = find_from_attr("uuid", user_selection[0], bpy.data.objects)
|
||||
if not obj:
|
||||
return
|
||||
mode_current = self.data.get('mode_current')
|
||||
color = self.data.get('color')
|
||||
origin_coord = project_to_screen(obj.location)
|
||||
|
||||
distance_viewport_object = math.sqrt((viewport_coord[0]-obj.location[0])**2+(viewport_coord[1]-obj.location[1])**2+(viewport_coord[2]-obj.location[2])**2)
|
||||
|
||||
if distance_viewport_object > self.preferences.presence_mode_distance :
|
||||
return
|
||||
|
||||
if origin_coord :
|
||||
blf.position(0, origin_coord[0]+8, origin_coord[1]-15, 0)
|
||||
blf.size(0, 16, 72)
|
||||
blf.color(0, color[0], color[1], color[2], color[3])
|
||||
blf.draw(0, mode_current)
|
||||
|
||||
|
||||
class SessionStatusWidget(Widget):
|
||||
draw_type = 'POST_PIXEL'
|
||||
@ -463,6 +563,7 @@ class DrawFactory(object):
|
||||
try:
|
||||
for widget in self.widgets.values():
|
||||
if widget.draw_type == 'POST_VIEW' and widget.poll():
|
||||
widget.configure_bgl()
|
||||
widget.draw()
|
||||
except Exception as e:
|
||||
logging.error(
|
||||
@ -472,6 +573,7 @@ class DrawFactory(object):
|
||||
try:
|
||||
for widget in self.widgets.values():
|
||||
if widget.draw_type == 'POST_PIXEL' and widget.poll():
|
||||
widget.configure_bgl()
|
||||
widget.draw()
|
||||
except Exception as e:
|
||||
logging.error(
|
||||
@ -484,6 +586,7 @@ this.renderer = DrawFactory()
|
||||
|
||||
def register():
|
||||
this.renderer.register_handlers()
|
||||
|
||||
|
||||
this.renderer.add_widget("session_status", SessionStatusWidget())
|
||||
|
||||
|
48
multi_user/shared_data.py
Normal file
@ -0,0 +1,48 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####

from replication.constants import STATE_INITIAL


class SessionData():
    """ A structure to easily share the current session data across the addon
    modules.
    This object will completely replace the Singleton lying in the replication
    interface module.
    """

    def __init__(self):
        self.repository = None  # The current repository
        self.remote = None  # The active remote
        self.server = None
        self.applied_updates = []

    @property
    def state(self):
        if self.remote is None:
            return STATE_INITIAL
        else:
            return self.remote.connection_status

    def clear(self):
        self.remote = None
        self.repository = None
        self.server = None
        self.applied_updates = []


session = SessionData()
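A short usage sketch of the new module (the `multi_user` package name is assumed from the repository layout): other modules import the shared instance instead of passing session state around.

```python
from multi_user import shared_data
from replication.constants import STATE_INITIAL

if shared_data.session.state == STATE_INITIAL:
    print("No active remote yet")

# Timers record the uuids they just applied so the depsgraph handler
# can ignore the resulting local updates instead of pushing them back.
shared_data.session.applied_updates.append("some-node-uuid")
```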
@ -27,10 +27,12 @@ from replication.interface import session
|
||||
from replication import porcelain
|
||||
|
||||
from . import operators, utils
|
||||
from .presence import (UserFrustumWidget, UserNameWidget, UserSelectionWidget,
|
||||
from .presence import (UserFrustumWidget, UserNameWidget, UserModeWidget, UserSelectionWidget,
|
||||
generate_user_camera, get_view_matrix, refresh_3d_view,
|
||||
refresh_sidebar_view, renderer)
|
||||
|
||||
from . import shared_data
|
||||
|
||||
this = sys.modules[__name__]
|
||||
|
||||
# Registered timers
|
||||
@ -39,7 +41,8 @@ this.registry = dict()
|
||||
def is_annotating(context: bpy.types.Context):
|
||||
""" Check if the annotate mode is enabled
|
||||
"""
|
||||
return bpy.context.workspace.tools.from_space_view3d_mode('OBJECT', create=False).idname == 'builtin.annotate'
|
||||
active_tool = bpy.context.workspace.tools.from_space_view3d_mode('OBJECT', create=False)
|
||||
return (active_tool and active_tool.idname == 'builtin.annotate')
|
||||
|
||||
|
||||
class Timer(object):
|
||||
@ -89,7 +92,7 @@ class Timer(object):
|
||||
if bpy.app.timers.is_registered(self.main):
|
||||
logging.info(f"Unregistering {self.id}")
|
||||
bpy.app.timers.unregister(self.main)
|
||||
|
||||
|
||||
del this.registry[self.id]
|
||||
self.is_running = False
|
||||
|
||||
@ -114,6 +117,7 @@ class ApplyTimer(Timer):
|
||||
|
||||
if node_ref.state == FETCHED:
|
||||
try:
|
||||
shared_data.session.applied_updates.append(node)
|
||||
porcelain.apply(session.repository, node)
|
||||
except Exception as e:
|
||||
logging.error(f"Fail to apply {node_ref.uuid}")
|
||||
@ -126,14 +130,58 @@ class ApplyTimer(Timer):
|
||||
porcelain.apply(session.repository,
|
||||
parent.uuid,
|
||||
force=True)
|
||||
if hasattr(impl, 'bl_reload_child') and impl.bl_reload_child:
|
||||
for dep in node_ref.dependencies:
|
||||
porcelain.apply(session.repository,
|
||||
dep,
|
||||
force=True)
|
||||
|
||||
|
||||
class AnnotationUpdates(Timer):
|
||||
def __init__(self, timeout=1):
|
||||
self._annotating = False
|
||||
self._settings = utils.get_preferences()
|
||||
|
||||
super().__init__(timeout)
|
||||
|
||||
def execute(self):
|
||||
if session and session.state == STATE_ACTIVE:
|
||||
ctx = bpy.context
|
||||
annotation_gp = ctx.scene.grease_pencil
|
||||
|
||||
if annotation_gp and not annotation_gp.uuid:
|
||||
ctx.scene.update_tag()
|
||||
|
||||
# if an annotation exist and is tracked
|
||||
if annotation_gp and annotation_gp.uuid:
|
||||
registered_gp = session.repository.graph.get(annotation_gp.uuid)
|
||||
if is_annotating(bpy.context):
|
||||
# try to get the right on it
|
||||
if registered_gp.owner == RP_COMMON:
|
||||
self._annotating = True
|
||||
logging.debug(
|
||||
"Getting the right on the annotation GP")
|
||||
porcelain.lock(session.repository,
|
||||
[registered_gp.uuid],
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=False)
|
||||
|
||||
if registered_gp.owner == self._settings.username:
|
||||
porcelain.commit(session.repository, annotation_gp.uuid)
|
||||
porcelain.push(session.repository, 'origin', annotation_gp.uuid)
|
||||
|
||||
elif self._annotating:
|
||||
porcelain.unlock(session.repository,
|
||||
[registered_gp.uuid],
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=False)
|
||||
self._annotating = False
|
||||
|
||||
class DynamicRightSelectTimer(Timer):
|
||||
def __init__(self, timeout=.1):
|
||||
super().__init__(timeout)
|
||||
self._last_selection = []
|
||||
self._last_selection = set()
|
||||
self._user = None
|
||||
self._annotating = False
|
||||
|
||||
def execute(self):
|
||||
settings = utils.get_preferences()
|
||||
@ -144,83 +192,46 @@ class DynamicRightSelectTimer(Timer):
|
||||
self._user = session.online_users.get(settings.username)
|
||||
|
||||
if self._user:
|
||||
ctx = bpy.context
|
||||
annotation_gp = ctx.scene.grease_pencil
|
||||
|
||||
if annotation_gp and not annotation_gp.uuid:
|
||||
ctx.scene.update_tag()
|
||||
|
||||
# if an annotation exist and is tracked
|
||||
if annotation_gp and annotation_gp.uuid:
|
||||
registered_gp = session.repository.graph.get(annotation_gp.uuid)
|
||||
if is_annotating(bpy.context):
|
||||
# try to get the right on it
|
||||
if registered_gp.owner == RP_COMMON:
|
||||
self._annotating = True
|
||||
logging.debug(
|
||||
"Getting the right on the annotation GP")
|
||||
porcelain.lock(session.repository,
|
||||
registered_gp.uuid,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=False)
|
||||
|
||||
if registered_gp.owner == settings.username:
|
||||
gp_node = session.repository.graph.get(annotation_gp.uuid)
|
||||
porcelain.commit(session.repository, gp_node.uuid)
|
||||
porcelain.push(session.repository, 'origin', gp_node.uuid)
|
||||
|
||||
elif self._annotating:
|
||||
porcelain.unlock(session.repository,
|
||||
registered_gp.uuid,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=False)
|
||||
|
||||
current_selection = utils.get_selected_objects(
|
||||
current_selection = set(utils.get_selected_objects(
|
||||
bpy.context.scene,
|
||||
bpy.data.window_managers['WinMan'].windows[0].view_layer
|
||||
)
|
||||
))
|
||||
if current_selection != self._last_selection:
|
||||
obj_common = [
|
||||
o for o in self._last_selection if o not in current_selection]
|
||||
obj_ours = [
|
||||
o for o in current_selection if o not in self._last_selection]
|
||||
to_lock = list(current_selection.difference(self._last_selection))
|
||||
to_release = list(self._last_selection.difference(current_selection))
|
||||
instances_to_lock = list()
|
||||
|
||||
# change old selection right to common
|
||||
for obj in obj_common:
|
||||
node = session.repository.graph.get(obj)
|
||||
for node_id in to_lock:
|
||||
node = session.repository.graph.get(node_id)
|
||||
instance_mode = node.data.get('instance_type')
|
||||
if instance_mode and instance_mode == 'COLLECTION':
|
||||
to_lock.remove(node_id)
|
||||
instances_to_lock.append(node_id)
|
||||
if instances_to_lock:
|
||||
try:
|
||||
porcelain.lock(session.repository,
|
||||
instances_to_lock,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=False)
|
||||
except NonAuthorizedOperationError as e:
|
||||
logging.warning(e)
|
||||
|
||||
if node and (node.owner == settings.username or node.owner == RP_COMMON):
|
||||
recursive = True
|
||||
if node.data and 'instance_type' in node.data.keys():
|
||||
recursive = node.data['instance_type'] != 'COLLECTION'
|
||||
try:
|
||||
porcelain.unlock(session.repository,
|
||||
node.uuid,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=recursive)
|
||||
except NonAuthorizedOperationError:
|
||||
logging.warning(
|
||||
f"Not authorized to change {node} owner")
|
||||
|
||||
# change new selection to our
|
||||
for obj in obj_ours:
|
||||
node = session.repository.graph.get(obj)
|
||||
|
||||
if node and node.owner == RP_COMMON:
|
||||
recursive = True
|
||||
if node.data and 'instance_type' in node.data.keys():
|
||||
recursive = node.data['instance_type'] != 'COLLECTION'
|
||||
|
||||
try:
|
||||
porcelain.lock(session.repository,
|
||||
node.uuid,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=recursive)
|
||||
except NonAuthorizedOperationError:
|
||||
logging.warning(
|
||||
f"Not authorized to change {node} owner")
|
||||
else:
|
||||
return
|
||||
if to_release:
|
||||
try:
|
||||
porcelain.unlock(session.repository,
|
||||
to_release,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=True)
|
||||
except NonAuthorizedOperationError as e:
|
||||
logging.warning(e)
|
||||
if to_lock:
|
||||
try:
|
||||
porcelain.lock(session.repository,
|
||||
to_lock,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=True)
|
||||
except NonAuthorizedOperationError as e:
|
||||
logging.warning(e)
|
||||
|
||||
self._last_selection = current_selection
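The selection tracking now works on sets, which makes the lock/release decision a pair of set differences rather than two list comprehensions. A pure-Python toy version of that diffing step, with no replication calls:

```python
last_selection = {"uuid-a", "uuid-b"}
current_selection = {"uuid-b", "uuid-c"}

to_lock = list(current_selection - last_selection)      # entered the selection -> lock
to_release = list(last_selection - current_selection)   # left the selection -> release
print(to_lock, to_release)                               # ['uuid-c'] ['uuid-a']
```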
@ -234,23 +245,23 @@ class DynamicRightSelectTimer(Timer):
|
||||
# Fix deselection until rights management refactoring (with Roles concepts)
|
||||
if len(current_selection) == 0 :
|
||||
owned_keys = [k for k, v in session.repository.graph.items() if v.owner==settings.username]
|
||||
for key in owned_keys:
|
||||
node = session.repository.graph.get(key)
|
||||
if owned_keys:
|
||||
try:
|
||||
porcelain.unlock(session.repository,
|
||||
key,
|
||||
owned_keys,
|
||||
ignore_warnings=True,
|
||||
affect_dependencies=True)
|
||||
except NonAuthorizedOperationError:
|
||||
logging.warning(
|
||||
f"Not authorized to change {key} owner")
|
||||
except NonAuthorizedOperationError as e:
|
||||
logging.warning(e)
|
||||
|
||||
# Objects selectability
|
||||
for obj in bpy.data.objects:
|
||||
object_uuid = getattr(obj, 'uuid', None)
|
||||
if object_uuid:
|
||||
is_selectable = not session.repository.is_node_readonly(object_uuid)
|
||||
if obj.hide_select != is_selectable:
|
||||
obj.hide_select = is_selectable
|
||||
shared_data.session.applied_updates.append(object_uuid)
|
||||
|
||||
|
||||
class ClientUpdate(Timer):
|
||||
@ -300,7 +311,8 @@ class ClientUpdate(Timer):
|
||||
settings.client_color.b,
|
||||
1),
|
||||
'frame_current': bpy.context.scene.frame_current,
|
||||
'scene_current': scene_current
|
||||
'scene_current': scene_current,
|
||||
'mode_current': bpy.context.mode
|
||||
}
|
||||
porcelain.update_user_metadata(session.repository, metadata)
|
||||
|
||||
@ -314,6 +326,9 @@ class ClientUpdate(Timer):
|
||||
local_user_metadata['view_matrix'] = get_view_matrix(
|
||||
)
|
||||
porcelain.update_user_metadata(session.repository, local_user_metadata)
|
||||
elif bpy.context.mode != local_user_metadata['mode_current']:
|
||||
local_user_metadata['mode_current'] = bpy.context.mode
|
||||
porcelain.update_user_metadata(session.repository, local_user_metadata)
|
||||
|
||||
|
||||
class SessionStatusUpdate(Timer):
|
||||
@ -341,6 +356,7 @@ class SessionUserSync(Timer):
|
||||
renderer.remove_widget(f"{user.username}_cam")
|
||||
renderer.remove_widget(f"{user.username}_select")
|
||||
renderer.remove_widget(f"{user.username}_name")
|
||||
renderer.remove_widget(f"{user.username}_mode")
|
||||
ui_users.remove(index)
|
||||
break
|
||||
|
||||
@ -356,6 +372,8 @@ class SessionUserSync(Timer):
|
||||
f"{user}_select", UserSelectionWidget(user))
|
||||
renderer.add_widget(
|
||||
f"{user}_name", UserNameWidget(user))
|
||||
renderer.add_widget(
|
||||
f"{user}_mode", UserModeWidget(user))
|
||||
|
||||
|
||||
class MainThreadExecutor(Timer):
|
||||
|
122
multi_user/ui.py
@ -107,7 +107,7 @@ class SESSION_PT_settings(bpy.types.Panel):
|
||||
row = row.grid_flow(row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
|
||||
row.prop(settings.sync_flags, "sync_render_settings",text="",icon_only=True, icon='SCENE')
|
||||
row.prop(settings.sync_flags, "sync_during_editmode", text="",icon_only=True, icon='EDITMODE_HLT')
|
||||
row.prop(settings.sync_flags, "sync_active_camera", text="",icon_only=True, icon='OBJECT_DATAMODE')
|
||||
row.prop(settings.sync_flags, "sync_active_camera", text="",icon_only=True, icon='VIEW_CAMERA')
|
||||
|
||||
row= layout.row()
|
||||
|
||||
@ -343,9 +343,10 @@ class SESSION_PT_user(bpy.types.Panel):
|
||||
box = row.box()
|
||||
split = box.split(factor=0.35)
|
||||
split.label(text="user")
|
||||
split = split.split(factor=0.5)
|
||||
split.label(text="location")
|
||||
split = split.split(factor=0.3)
|
||||
split.label(text="mode")
|
||||
split.label(text="frame")
|
||||
split.label(text="location")
|
||||
split.label(text="ping")
|
||||
|
||||
row = layout.row()
|
||||
@ -383,6 +384,8 @@ class SESSION_UL_users(bpy.types.UIList):
|
||||
ping = '-'
|
||||
frame_current = '-'
|
||||
scene_current = '-'
|
||||
mode_current = '-'
|
||||
mode_icon = 'BLANK1'
|
||||
status_icon = 'BLANK1'
|
||||
if session:
|
||||
user = session.online_users.get(item.username)
|
||||
@ -392,13 +395,45 @@ class SESSION_UL_users(bpy.types.UIList):
|
||||
if metadata and 'frame_current' in metadata:
|
||||
frame_current = str(metadata.get('frame_current','-'))
|
||||
scene_current = metadata.get('scene_current','-')
|
||||
mode_current = metadata.get('mode_current','-')
|
||||
if mode_current == "OBJECT" :
|
||||
mode_icon = "OBJECT_DATAMODE"
|
||||
elif mode_current == "EDIT_MESH" :
|
||||
mode_icon = "EDITMODE_HLT"
|
||||
elif mode_current == 'EDIT_CURVE':
|
||||
mode_icon = "CURVE_DATA"
|
||||
elif mode_current == 'EDIT_SURFACE':
|
||||
mode_icon = "SURFACE_DATA"
|
||||
elif mode_current == 'EDIT_TEXT':
|
||||
mode_icon = "FILE_FONT"
|
||||
elif mode_current == 'EDIT_ARMATURE':
|
||||
mode_icon = "ARMATURE_DATA"
|
||||
elif mode_current == 'EDIT_METABALL':
|
||||
mode_icon = "META_BALL"
|
||||
elif mode_current == 'EDIT_LATTICE':
|
||||
mode_icon = "LATTICE_DATA"
|
||||
elif mode_current == 'POSE':
|
||||
mode_icon = "POSE_HLT"
|
||||
elif mode_current == 'SCULPT':
|
||||
mode_icon = "SCULPTMODE_HLT"
|
||||
elif mode_current == 'PAINT_WEIGHT':
|
||||
mode_icon = "WPAINT_HLT"
|
||||
elif mode_current == 'PAINT_VERTEX':
|
||||
mode_icon = "VPAINT_HLT"
|
||||
elif mode_current == 'PAINT_TEXTURE':
|
||||
mode_icon = "TPAINT_HLT"
|
||||
elif mode_current == 'PARTICLE':
|
||||
mode_icon = "PARTICLES"
|
||||
elif mode_current == 'PAINT_GPENCIL' or mode_current =='EDIT_GPENCIL' or mode_current =='SCULPT_GPENCIL' or mode_current =='WEIGHT_GPENCIL' or mode_current =='VERTEX_GPENCIL':
|
||||
mode_icon = "GREASEPENCIL"
|
||||
if user['admin']:
|
||||
status_icon = 'FAKE_USER_ON'
|
||||
split = layout.split(factor=0.35)
|
||||
split.label(text=item.username, icon=status_icon)
|
||||
split = split.split(factor=0.5)
|
||||
split.label(text=scene_current)
|
||||
split = split.split(factor=0.3)
|
||||
split.label(icon=mode_icon)
|
||||
split.label(text=frame_current)
|
||||
split.label(text=scene_current)
|
||||
split.label(text=ping)
|
||||
|
||||
|
||||
@ -425,20 +460,29 @@ class SESSION_PT_presence(bpy.types.Panel):
|
||||
settings = context.window_manager.session
|
||||
pref = get_preferences()
|
||||
layout.active = settings.enable_presence
|
||||
|
||||
row = layout.row()
|
||||
row = row.grid_flow(row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
|
||||
row.prop(settings, "presence_show_selected",text="",icon_only=True, icon='CUBE')
|
||||
row.prop(settings, "presence_show_user", text="",icon_only=True, icon='CAMERA_DATA')
|
||||
row.prop(settings, "presence_show_mode", text="",icon_only=True, icon='OBJECT_DATAMODE')
|
||||
row.prop(settings, "presence_show_far_user", text="",icon_only=True, icon='SCENE_DATA')
|
||||
|
||||
col = layout.column()
|
||||
if settings.presence_show_mode :
|
||||
row = col.column()
|
||||
row.prop(pref, "presence_mode_distance", expand=True)
|
||||
|
||||
col.prop(settings, "presence_show_session_status")
|
||||
row = col.column()
|
||||
row.active = settings.presence_show_session_status
|
||||
row.prop(pref, "presence_hud_scale", expand=True)
|
||||
row = col.column(align=True)
|
||||
row.active = settings.presence_show_session_status
|
||||
row.prop(pref, "presence_hud_hpos", expand=True)
|
||||
row.prop(pref, "presence_hud_vpos", expand=True)
|
||||
col.prop(settings, "presence_show_selected")
|
||||
col.prop(settings, "presence_show_user")
|
||||
row = layout.column()
|
||||
row.active = settings.presence_show_user
|
||||
row.prop(settings, "presence_show_far_user")
|
||||
if settings.presence_show_session_status :
|
||||
row = col.column()
|
||||
row.active = settings.presence_show_session_status
|
||||
row.prop(pref, "presence_hud_scale", expand=True)
|
||||
row = col.column(align=True)
|
||||
row.active = settings.presence_show_session_status
|
||||
row.prop(pref, "presence_hud_hpos", expand=True)
|
||||
row.prop(pref, "presence_hud_vpos", expand=True)
|
||||
|
||||
|
||||
def draw_property(context, parent, property_uuid, level=0):
|
||||
settings = get_preferences()
|
||||
@ -555,20 +599,15 @@ class SESSION_PT_repository(bpy.types.Panel):
|
||||
# Properties
|
||||
owned_nodes = [k for k, v in session.repository.graph.items() if v.owner==settings.username]
|
||||
|
||||
filtered_node = owned_nodes if runtime_settings.filter_owned else session.repository.graph.keys()
|
||||
filtered_node = owned_nodes if runtime_settings.filter_owned else list(session.repository.graph.keys())
|
||||
|
||||
if runtime_settings.filter_name:
|
||||
for node_id in filtered_node:
|
||||
node_instance = session.repository.graph.get(node_id)
|
||||
name = node_instance.data.get('name')
|
||||
if runtime_settings.filter_name not in name:
|
||||
filtered_node.remove(node_id)
|
||||
filtered_node = [n for n in filtered_node if runtime_settings.filter_name.lower() in session.repository.graph.get(n).data.get('name').lower()]
|
||||
|
||||
if filtered_node:
|
||||
col = layout.column(align=True)
|
||||
for key in filtered_node:
|
||||
draw_property(context, col, key)
|
||||
|
||||
else:
|
||||
layout.row().label(text="Empty")
|
||||
|
||||
@ -590,23 +629,32 @@ class VIEW3D_PT_overlay_session(bpy.types.Panel):
|
||||
def draw(self, context):
|
||||
layout = self.layout
|
||||
|
||||
view = context.space_data
|
||||
overlay = view.overlay
|
||||
display_all = overlay.show_overlays
|
||||
|
||||
col = layout.column()
|
||||
|
||||
row = col.row(align=True)
|
||||
settings = context.window_manager.session
|
||||
pref = get_preferences()
|
||||
layout.active = settings.enable_presence
|
||||
|
||||
row = layout.row()
|
||||
row = row.grid_flow(row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
|
||||
row.prop(settings, "presence_show_selected",text="",icon_only=True, icon='CUBE')
|
||||
row.prop(settings, "presence_show_user", text="",icon_only=True, icon='CAMERA_DATA')
|
||||
row.prop(settings, "presence_show_mode", text="",icon_only=True, icon='OBJECT_DATAMODE')
|
||||
row.prop(settings, "presence_show_far_user", text="",icon_only=True, icon='SCENE_DATA')
|
||||
|
||||
col = layout.column()
|
||||
if settings.presence_show_mode :
|
||||
row = col.column()
|
||||
row.prop(pref, "presence_mode_distance", expand=True)
|
||||
|
||||
col.prop(settings, "presence_show_session_status")
|
||||
col.prop(settings, "presence_show_selected")
|
||||
col.prop(settings, "presence_show_user")
|
||||
|
||||
row = layout.column()
|
||||
row.active = settings.presence_show_user
|
||||
row.prop(settings, "presence_show_far_user")
|
||||
if settings.presence_show_session_status :
|
||||
row = col.column()
|
||||
row.active = settings.presence_show_session_status
|
||||
row.prop(pref, "presence_hud_scale", expand=True)
|
||||
row = col.column(align=True)
|
||||
row.active = settings.presence_show_session_status
|
||||
row.prop(pref, "presence_hud_hpos", expand=True)
|
||||
row.prop(pref, "presence_hud_vpos", expand=True)
|
||||
|
||||
|
||||
classes = (
|
||||
SESSION_UL_users,
|
||||
|
@ -38,6 +38,14 @@ from replication.constants import (STATE_ACTIVE, STATE_AUTH,
|
||||
STATE_LOBBY,
|
||||
CONNECTING)
|
||||
|
||||
CLEARED_DATABLOCKS = ['actions', 'armatures', 'cache_files', 'cameras',
|
||||
'collections', 'curves', 'filepath', 'fonts',
|
||||
'grease_pencils', 'images', 'lattices', 'libraries',
|
||||
'lightprobes', 'lights', 'linestyles', 'masks',
|
||||
'materials', 'meshes', 'metaballs', 'movieclips',
|
||||
'node_groups', 'objects', 'paint_curves', 'particles',
|
||||
'scenes', 'shape_keys', 'sounds', 'speakers', 'texts',
|
||||
'textures', 'volumes', 'worlds']
|
||||
|
||||
def find_from_attr(attr_name, attr_value, list):
|
||||
for item in list:
|
||||
@ -101,23 +109,25 @@ def get_state_str(state):
|
||||
|
||||
|
||||
def clean_scene():
|
||||
to_delete = [f for f in dir(bpy.data) if f not in ['brushes', 'palettes']]
|
||||
for type_name in to_delete:
|
||||
try:
|
||||
sub_collection_to_avoid = [bpy.data.linestyles['LineStyle'], bpy.data.materials['Dots Stroke']]
|
||||
type_collection = getattr(bpy.data, type_name)
|
||||
items_to_remove = [i for i in type_collection if i not in sub_collection_to_avoid]
|
||||
for item in items_to_remove:
|
||||
try:
|
||||
type_collection.remove(item)
|
||||
except:
|
||||
continue
|
||||
except:
|
||||
continue
|
||||
|
||||
for type_name in CLEARED_DATABLOCKS:
|
||||
sub_collection_to_avoid = [
|
||||
bpy.data.linestyles.get('LineStyle'),
|
||||
bpy.data.materials.get('Dots Stroke')
|
||||
]
|
||||
|
||||
type_collection = getattr(bpy.data, type_name)
|
||||
items_to_remove = [i for i in type_collection if i not in sub_collection_to_avoid]
|
||||
for item in items_to_remove:
|
||||
try:
|
||||
type_collection.remove(item)
|
||||
logging.info(item.name)
|
||||
except:
|
||||
continue
|
||||
|
||||
# Clear sequencer
|
||||
bpy.context.scene.sequence_editor_clear()
|
||||
|
||||
|
||||
def get_selected_objects(scene, active_view_layer):
|
||||
return [obj.uuid for obj in scene.objects if obj.select_get(view_layer=active_view_layer)]
|
||||
|
||||
|
@ -8,7 +8,7 @@ from multi_user.bl_types.bl_material import BlMaterial
|
||||
|
||||
|
||||
def test_material_nodes(clear_blend):
|
||||
nodes_types = [node.bl_rna.identifier for node in bpy.types.ShaderNode.__subclasses__()] # Faire un peu comme ici
|
||||
nodes_types = [node.bl_rna.identifier for node in bpy.types.ShaderNode.__subclasses__()]
|
||||
|
||||
datablock = bpy.data.materials.new("test")
|
||||
datablock.use_nodes = True
|
||||
|
@ -11,8 +11,9 @@ from multi_user.utils import get_preferences
|
||||
def test_scene(clear_blend):
|
||||
get_preferences().sync_flags.sync_render_settings = True
|
||||
|
||||
# datablock = bpy.data.scenes.new("toto") # TODO: trouver datablock -> active compositing 'Use nodes'
|
||||
datablock = bpy.data.scenes["Scene"].use_nodes
|
||||
datablock = bpy.data.scenes.new("toto")
|
||||
datablock.timeline_markers.new('toto', frame=10)
|
||||
datablock.timeline_markers.new('tata', frame=1)
|
||||
datablock.view_settings.use_curve_mapping = True
|
||||
# Test
|
||||
implementation = BlScene()
|
||||
|