Compare commits

..

2 Commits

Author SHA1 Message Date
48866b74d3 refacctor: remove wrong charaters 2020-07-07 22:35:56 +02:00
d9f1031107 feat: initial version 2020-07-07 22:34:40 +02:00
60 changed files with 2003 additions and 3273 deletions

View File

@ -1,9 +1,8 @@
stages:
- test
- build
- deploy
include:
- local: .gitlab/ci/test.gitlab-ci.yml
- local: .gitlab/ci/build.gitlab-ci.yml
- local: .gitlab/ci/deploy.gitlab-ci.yml

View File

@ -1,8 +1,12 @@
build:
stage: build
image: debian:stable-slim
image: python:latest
script:
- git submodule init
- git submodule update
- cd multi_user/libs/replication
- rm -rf tests .git .gitignore script
artifacts:
name: multi_user
paths:

View File

@ -1,18 +0,0 @@
deploy:
stage: deploy
image: slumber/docker-python
variables:
DOCKER_DRIVER: overlay2
DOCKER_TLS_CERTDIR: "/certs"
services:
- docker:19.03.12-dind
script:
- RP_VERSION="$(python scripts/get_replication_version.py)"
- VERSION="$(python scripts/get_addon_version.py)"
- echo "Building docker image with replication ${RP_VERSION}"
- docker build --build-arg replication_version=${RP_VERSION} --build-arg version=${VERSION} -t registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION} ./scripts/docker_server
- echo "Pushing to gitlab registry ${VERSION}"
- docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
- docker push registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION}

View File

@ -1,5 +1,14 @@
test:
stage: test
image: slumber/blender-addon-testing:latest
image: python:latest
script:
- git submodule init
- git submodule update
- apt update
# install blender to get all required dependencies
# TODO: install only dependencies
- apt install -f -y gcc python-dev python3.7-dev
- apt install -f -y blender
- python3 -m pip install blender-addon-tester
- python3 scripts/test_addon.py

3
.gitmodules vendored
View File

@ -0,0 +1,3 @@
[submodule "multi_user/libs/replication"]
path = multi_user/libs/replication
url = https://gitlab.com/slumber/replication.git

View File

@ -37,7 +37,7 @@ All notable changes to this project will be documented in this file.
- Serialization is now based on marshal (2x performance improvements).
- Let pip choose the python dependencies install path.
## [0.0.3] - 2020-07-29
## [0.0.3] - Upcoming
### Added
@ -60,39 +60,8 @@ All notable changes to this project will be documented in this file.
- user localization
- repository init
### Removed
- Unused strict right management strategy
- Legacy config management system
## [0.1.0] - preview
### Added
- Dependency graph driven updates [experimental]
- Edit Mode updates
- Late join mechanism
- Sync Axis lock replication
- Sync collection offset
- Sync camera orthographic scale
- Sync custom fonts
- Sync sound files
- Logging configuration (file output and level)
- Object visibility type replication
- Optional sync for active camera
- Curve->Mesh conversion
- Mesh->gpencil conversion
### Changed
- Auto updater now handles installation from branches
- Use uuid for collection loading
- Moved session instance to replication package
### Fixed
- Prevent unsupported data types from crashing the session
- Modifier vertex group assignment
- World sync
- Snapshot UUID error
- The world is not synchronized
- Legacy config management system

View File

@ -11,7 +11,7 @@ This tool aims to allow multiple users to work on the same scene over the networ
## Quick installation
1. Download latest release [multi_user.zip](https://gitlab.com/slumber/multi-user/-/jobs/artifacts/master/download?job=build).
1. Download latest release [multi_user.zip](/uploads/8aef79c7cf5b1d9606dc58307fd9ad8b/multi_user.zip).
2. Run blender as administrator (dependencies installation).
3. Install last_version.zip from your addon preferences.
@ -25,32 +25,27 @@ See the [documentation](https://multi-user.readthedocs.io/en/latest/) for detail
Currently, not all data-blocks are supported for replication over the wire. The following list summarizes the status of each one.
| Name | Status | Comment |
| ----------- | :----: | :--------------------------------------------------------------------------: |
| action | ✔️ | |
| armature | ❗ | Not stable |
| camera | ✔️ | |
| collection | ✔️ | |
| curve | | Nurbs not supported |
| gpencil | ✔️ | [Airbrush not supported](https://gitlab.com/slumber/multi-user/-/issues/123) |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| lightprobes | ✔️ | |
| compositing | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |
| texts | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
| nla | ❌ | |
| volumes | | |
| particles | ❌ | [On-going](https://gitlab.com/slumber/multi-user/-/issues/24) |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| libraries | ❗ | Partial |
| Name | Status | Comment |
| ----------- | :----: | :-----------------------------------------------------------: |
| action | | Not stable |
| armature | ❗ | Not stable |
| camera | ✔️ | |
| collection | ✔️ | |
| curve       | ✔️      | NURBS surfaces don't load correctly |
| gpencil | ✔️ | |
| image | | Not stable yet |
| mesh | ✔️ | |
| material | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| lightprobes | ✔️ | |
| particles | ❌ | [On-going](https://gitlab.com/slumber/multi-user/-/issues/24) |
| speakers | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| libraries | | Partial |
### Performance issues
@ -62,16 +57,14 @@ I'm working on it.
| Dependencies | Version | Needed |
| ------------ | :-----: | -----: |
| Replication | latest | yes |
| ZeroMQ | latest | yes |
| JsonDiff | latest | yes |
## Contributing
See [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation.
Feel free to [join the discord server](https://discord.gg/aBPvGws) to chat, seek help and contribute.
## Licensing
See [license](LICENSE)


View File

@ -8,4 +8,5 @@ Getting started
install
quickstart
known_problems
glossary

View File

@ -0,0 +1,46 @@
.. _known-problems:
==============
Known problems
==============
.. rubric:: What do you need to do in order to use Multi-User over the internet?
1. Use Hamachi or ZeroTier (I prefer Hamachi) and create a network.
2. All participants need to join this network.
3. Go to Blender and install Multi-User in the preferences.
4. Setup and start the session:
* **Host**: After activating Multi-User as an add-on, press N and go to the Multi-User tab.
Then, enter the IP of your network where the IP is asked for.
Leave Port and IPC Port on their defaults (5555 and 5561). Increase the Timeout (ms) if the connection is not stable.
Then press "host".
* **Guest**: After activating Multi-User as an add-on, press N and go to the Multi-User tab.
Then, enter the IP of your network where the IP is asked for.
Leave Port and IPC Port on their defaults (5555 and 5561), i.e. the same settings the host is using.
Note that each client needs 4 ports for communication, so on a shared machine you need to enter 5555 plus the count of guests [up to 4].
Increase the Timeout (ms) if the connection is not stable. Then press "connect".
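The port arithmetic described above can be sketched as follows. This is illustrative only: the 4-ports-per-client figure comes from the text above, and the helper name is hypothetical, not part of Multi-User.

```python
# Hypothetical sketch: each client on the same machine needs its own
# block of 4 consecutive ports, so offset the base port per client.
BASE_PORT = 5555
PORTS_PER_CLIENT = 4  # assumption taken from the text above

def ports_for_client(index: int) -> list:
    """Return the 4 consecutive ports the `index`-th client would use."""
    start = BASE_PORT + index * PORTS_PER_CLIENT
    return list(range(start, start + PORTS_PER_CLIENT))
```

So the first guest on a shared machine would enter 5555, the second 5559, and so on.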
.. rubric:: What do you need to check if you can't host?
You need to check that the IP and all ports are correct. If hosting fails because you loaded a project before hosting, it's not your fault:
the version is simply not stable yet (the project contains data that is not yet supported).
.. rubric:: What do you need to check if you can't connect?
Check that you are connected to the host's network (VPN), and that all of your connection settings match the host's.
You may also be running different versions (which shouldn't happen once the Auto-Updater is introduced).
.. rubric:: You are connected, but you don't see anything?
After pressing N, go to the presence overlay and check the box.
Also, go down and uncheck the box "Show only owned" (unless you need privacy ( ͡° ͜ʖ ͡°) ).
If it's still not working, ask in the support channel of the "multi-user" discord server. This little help text is based on my own experience
(Ultr-X).
To bring attention to other problems, please @ me on the support channel. Every problem brought to me will be documented so this text can be updated and improved.
Thank you and have fun with Multi-User, brought to you by "swann".
Here is the discord server: https://discord.gg/v5eKgm

View File

@ -161,19 +161,6 @@ The collaboration quality directly depend on the communication quality. This sec
various tools made in an effort to ease the communication between the different session users.
Feel free to suggest any idea for communication tools `here <https://gitlab.com/slumber/multi-user/-/issues/75>`_ .
---------------------------
Change replication behavior
---------------------------
During a session, the multi-user will replicate your modifications to other instances.
To avoid annoying other users while you are experimenting, some of those modifications can be ignored via
various flags present at the top of the panel (see the red area in the image below). Those flags are explained in the :ref:`replication` section.
.. figure:: img/quickstart_replication.png
:align: center
Session replication flags
--------------------
Monitor online users
--------------------
@ -255,8 +242,6 @@ various drawn parts via the following flags:
- **Show users**: display users current viewpoint
- **Show different scenes**: display users working on other scenes
-----------
Manage data
-----------
@ -314,105 +299,37 @@ Here is a quick list of available actions:
.. _advanced:
Advanced settings
=================
Advanced configuration
======================
This section contains optional settings to configure the session behavior.
.. figure:: img/quickstart_advanced.png
:align: center
Advanced configuration panel
Repository panel
-------
Network
-------
.. figure:: img/quickstart_advanced_network.png
:align: center
Advanced network settings
.. rubric:: Network
**IPC Port** is the port used for Inter-Process Communication. This port is used
by the multi-user subprocesses to communicate with each other. If different instances
of the multi-user use the same IPC port, it will create conflicts!
.. note::
You only need to modify it if you need to launch multiple clients from the same
computer(or if you try to host and join on the same computer). You should just enter a different
**IPC port** for each blender instance.
You only need to modify it if you need to launch multiple clients from the same
computer(or if you try to host and join on the same computer). You should just enter a different
**IPC port** for each blender instance.
**Timeout (in milliseconds)** is the maximum ping allowed before auto-disconnecting.
You should only increase it if you have a bad connection.
.. _replication:
-----------
Replication
-----------
.. figure:: img/quickstart_advanced_replication.png
:align: center
Advanced replication settings
.. rubric:: Replication
**Synchronize render settings** (host only) enables replication of EEVEE and Cycles render settings so renders match between clients.
**Synchronize active camera** syncs the scene's active camera.
**Edit Mode Updates** enables object updates while you are in Edit Mode.
.. warning:: Edit Mode Updates kills performance with complex objects (heavy meshes, gpencil, etc.).
**Update method** allows you to change how replication updates are triggered. Two update methods are currently implemented:
- **Default**: use external threads to monitor datablock changes; slower and less accurate.
- **Depsgraph ⚠️**: use the Blender dependency graph to trigger updates; faster but experimental and unstable!
**Properties frequency grid** allows you to set a custom replication frequency for each type of data-block:
- **Refresh**: pushed data update rate (in seconds)
- **Apply**: pulled data update rate (in seconds)
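The Refresh/Apply pair can be pictured with a minimal timer sketch. This is illustrative only, not the actual multi-user implementation: each data-block type pushes local changes every `refresh` seconds and pulls remote updates every `apply` seconds.

```python
# Illustrative sketch of per-datablock push/pull rates (not the real code).
class ReplicationTimer:
    def __init__(self, refresh: float, apply: float):
        self.refresh = refresh   # push rate, in seconds
        self.apply = apply       # pull rate, in seconds
        self._last_push = 0.0
        self._last_pull = 0.0

    def should_push(self, now: float) -> bool:
        """True when `refresh` seconds have elapsed since the last push."""
        if now - self._last_push >= self.refresh:
            self._last_push = now
            return True
        return False

    def should_pull(self, now: float) -> bool:
        """True when `apply` seconds have elapsed since the last pull."""
        if now - self._last_pull >= self.apply:
            self._last_pull = now
            return True
        return False
```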
-----
Cache
-----
.. note:: Per-data type settings will soon be revamped for simplification purposes
The multi-user allows replicating external blend dependencies such as images, movies and sounds.
On each client, those files are stored in the cache folder.
.. figure:: img/quickstart_advanced_cache.png
:align: center
Advanced cache settings
**cache_directory** lets you choose where cached files (images, sounds, movies) are saved.
**Clear memory filecache** saves memory at runtime by removing file content from memory as soon as it has been written to disk.
**Clear cache** removes all files from the cache folder.
.. warning:: Clearing the cache could break your scene's images/movies/sounds if they are still used in the blend!
---
Log
---
.. figure:: img/quickstart_advanced_logging.png
:align: center
Advanced log settings
**log level** sets the logging level of detail. Here is the detail for each value:
+-----------+-----------------------------------------------+
| Log level | Description |
+===========+===============================================+
| ERROR     | Shows only critical errors                    |
+-----------+-----------------------------------------------+
| WARNING   | Shows only errors (all kinds)                 |
+-----------+-----------------------------------------------+
| INFO | Shows only status related messages and errors |
+-----------+-----------------------------------------------+
| DEBUG | Shows every possible information. |
+-----------+-----------------------------------------------+
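As a hedged illustration, the levels in the table above map directly onto Python's standard `logging` levels. The format string below is the one the addon's own `register()` uses; the mapping helper itself is hypothetical, and the actual wiring may differ.

```python
import logging

# Map the addon's log-level names onto stdlib logging levels.
LOG_LEVELS = {
    "ERROR": logging.ERROR,
    "WARNING": logging.WARNING,
    "INFO": logging.INFO,
    "DEBUG": logging.DEBUG,
}

def configure_logging(level_name: str) -> None:
    """Configure logging with the addon's own format string."""
    logging.basicConfig(
        format='%(asctime)s CLIENT %(levelname)-8s %(message)s',
        datefmt='%H:%M:%S',
        level=LOG_LEVELS[level_name])
```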

View File

@ -48,6 +48,7 @@ Documentation is organized into the following sections:
getting_started/install
getting_started/quickstart
getting_started/known_problems
getting_started/glossary
.. toctree::

View File

@ -186,24 +186,25 @@ Using a regular command line
You can run the dedicated server on any platform by following these steps:
1. First, download and install Python 3 (3.6 or above).
2. Install the replication library:
2. Download and extract the dedicated server from `here <https://gitlab.com/slumber/replication/-/archive/develop/replication-develop.zip>`_
3. Open a terminal in the extracted folder and install python dependencies by running:
.. code-block:: bash
python -m pip install replication
python -m pip install -r requirements.txt
4. Launch the server with:
4. Launch the server from the same terminal with:
.. code-block:: bash
replication.serve
python scripts/server.py
.. hint::
You can also specify a custom **port** (-p), **timeout** (-t), **admin password** (-pwd), **log level (ERROR, WARNING, INFO or DEBUG)** (-l) and **log file** (-lf) with the following optional arguments
You can also specify a custom **port** (-p), **timeout** (-t) and **admin password** (-pwd) with the following optional arguments
.. code-block:: bash
replication.serve -p 5555 -pwd toto -t 1000 -l INFO -lf server.log
python scripts/server.py -p 5555 -pwd toto -t 1000
As soon as the dedicated server is running, you can connect to it from blender (follow :ref:`how-to-join`).
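Before connecting from Blender, it can help to verify that the server's port is reachable from the client machine. This is a hypothetical helper, not part of the project; it just attempts a TCP connection with the standard library.

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `port_open("192.168.1.10", 5555)` should return True once the dedicated server is listening on its default port.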

View File

@ -19,9 +19,9 @@
bl_info = {
"name": "Multi-User",
"author": "Swann Martinez",
"version": (0, 1, 0),
"version": (0, 0, 3),
"description": "Enable real-time collaborative workflow inside blender",
"blender": (2, 82, 0),
"blender": (2, 80, 0),
"location": "3D View > Sidebar > Multi-User tab",
"warning": "Unstable addon, use it at your own risks",
"category": "Collaboration",
@ -43,38 +43,42 @@ from bpy.app.handlers import persistent
from . import environment, utils
# TODO: remove dependency as soon as replication is installed as a module
DEPENDENCIES = {
("replication", '0.0.21a15'),
("zmq","zmq"),
("jsondiff","jsondiff"),
("deepdiff", "deepdiff"),
("psutil","psutil")
}
module_error_msg = "Insufficient rights to install the multi-user \
dependencies, launch blender with administrator rights."
libs = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs", "replication", "replication")
def register():
# Setup logging policy
logging.basicConfig(
format='%(asctime)s CLIENT %(levelname)-8s %(message)s',
datefmt='%H:%M:%S',
level=logging.INFO)
logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)
if libs not in sys.path:
sys.path.append(libs)
try:
environment.setup(DEPENDENCIES, bpy.app.binary_path_python)
except ModuleNotFoundError:
logging.fatal("Fail to install multi-user dependencies, try to execute blender with admin rights.")
return
from . import presence
from . import operators
from . import ui
from . import preferences
from . import addon_updater_ops
from . import presence
from . import operators
from . import ui
from . import preferences
from . import addon_updater_ops
preferences.register()
addon_updater_ops.register(bl_info)
presence.register()
operators.register()
ui.register()
preferences.register()
addon_updater_ops.register(bl_info)
presence.register()
operators.register()
ui.register()
except ModuleNotFoundError as e:
raise Exception(module_error_msg)
logging.error(module_error_msg)
bpy.types.WindowManager.session = bpy.props.PointerProperty(
type=preferences.SessionProps)
bpy.types.ID.uuid = bpy.props.StringProperty(

View File

@ -23,11 +23,7 @@ https://github.com/CGCookie/blender-addon-updater
"""
__version__ = "1.0.8"
import errno
import traceback
import platform
import ssl
import urllib.request
import urllib
@ -102,7 +98,6 @@ class Singleton_updater(object):
# runtime variables, initial conditions
self._verbose = False
self._use_print_traces = True
self._fake_install = False
self._async_checking = False # only true when async daemon started
self._update_ready = None
@ -138,13 +133,6 @@ class Singleton_updater(object):
self._select_link = select_link_function
# called from except blocks, to print the exception details,
# according to the use_print_traces option
def print_trace():
if self._use_print_traces:
traceback.print_exc()
# -------------------------------------------------------------------------
# Getters and setters
# -------------------------------------------------------------------------
@ -178,7 +166,7 @@ class Singleton_updater(object):
try:
self._auto_reload_post_update = bool(value)
except:
raise ValueError("auto_reload_post_update must be a boolean value")
raise ValueError("Must be a boolean value")
@property
def backup_current(self):
@ -363,7 +351,7 @@ class Singleton_updater(object):
try:
self._repo = str(value)
except:
raise ValueError("repo must be a string value")
raise ValueError("User must be a string")
@property
def select_link(self):
@ -389,7 +377,6 @@ class Singleton_updater(object):
os.makedirs(value)
except:
if self._verbose: print("Error trying to create staging path")
self.print_trace()
return
self._updater_path = value
@ -459,16 +446,6 @@ class Singleton_updater(object):
except:
raise ValueError("Verbose must be a boolean value")
@property
def use_print_traces(self):
return self._use_print_traces
@use_print_traces.setter
def use_print_traces(self, value):
try:
self._use_print_traces = bool(value)
except:
raise ValueError("use_print_traces must be a boolean value")
@property
def version_max_update(self):
return self._version_max_update
@ -660,9 +637,6 @@ class Singleton_updater(object):
else:
if self._verbose: print("Tokens not setup for engine yet")
# Always set user agent
request.add_header('User-Agent', "Python/"+str(platform.python_version()))
# run the request
try:
if context:
@ -678,7 +652,6 @@ class Singleton_updater(object):
self._error = "HTTP error"
self._error_msg = str(e.code)
print(self._error, self._error_msg)
self.print_trace()
self._update_ready = None
except urllib.error.URLError as e:
reason = str(e.reason)
@ -690,7 +663,6 @@ class Singleton_updater(object):
self._error = "URL error, check internet connection"
self._error_msg = reason
print(self._error, self._error_msg)
self.print_trace()
self._update_ready = None
return None
else:
@ -712,7 +684,6 @@ class Singleton_updater(object):
self._error_msg = str(e.reason)
self._update_ready = None
print(self._error, self._error_msg)
self.print_trace()
return None
else:
return None
@ -729,17 +700,15 @@ class Singleton_updater(object):
if self._verbose: print("Preparing staging folder for download:\n",local)
if os.path.isdir(local) == True:
try:
shutil.rmtree(local, ignore_errors=True)
shutil.rmtree(local)
os.makedirs(local)
except:
error = "failed to remove existing staging directory"
self.print_trace()
else:
try:
os.makedirs(local)
except:
error = "failed to create staging directory"
self.print_trace()
if error != None:
if self._verbose: print("Error: Aborting update, "+error)
@ -764,10 +733,6 @@ class Singleton_updater(object):
request.add_header('PRIVATE-TOKEN',self._engine.token)
else:
if self._verbose: print("Tokens not setup for selected engine yet")
# Always set user agent
request.add_header('User-Agent', "Python/"+str(platform.python_version()))
self.urlretrieve(urllib.request.urlopen(request,context=context), self._source_zip)
# add additional checks on file size being non-zero
if self._verbose: print("Successfully downloaded update zip")
@ -778,7 +743,6 @@ class Singleton_updater(object):
if self._verbose:
print("Error retrieving download, bad link?")
print("Error: {}".format(e))
self.print_trace()
return False
@ -793,18 +757,16 @@ class Singleton_updater(object):
if os.path.isdir(local):
try:
shutil.rmtree(local, ignore_errors=True)
shutil.rmtree(local)
except:
if self._verbose: print("Failed to remove previous backup folder, continuing")
self.print_trace()
# remove the temp folder; shouldn't exist but could if previously interrupted
if os.path.isdir(tempdest):
try:
shutil.rmtree(tempdest, ignore_errors=True)
shutil.rmtree(tempdest)
except:
if self._verbose: print("Failed to remove existing temp folder, continuing")
self.print_trace()
# make the full addon copy, which temporarily places outside the addon folder
if self._backup_ignore_patterns != None:
shutil.copytree(
@ -832,7 +794,7 @@ class Singleton_updater(object):
# make the copy
shutil.move(backuploc,tempdest)
shutil.rmtree(self._addon_root, ignore_errors=True)
shutil.rmtree(self._addon_root)
os.rename(tempdest,self._addon_root)
self._json["backup_date"] = ""
@ -853,7 +815,7 @@ class Singleton_updater(object):
# clear the existing source folder in case previous files remain
outdir = os.path.join(self._updater_path, "source")
try:
shutil.rmtree(outdir, ignore_errors=True)
shutil.rmtree(outdir)
if self._verbose:
print("Source folder cleared")
except:
@ -866,7 +828,6 @@ class Singleton_updater(object):
except Exception as err:
print("Error occurred while making extract dir:")
print(str(err))
self.print_trace()
self._error = "Install failed"
self._error_msg = "Failed to make extract directory"
return -1
@ -908,7 +869,6 @@ class Singleton_updater(object):
if exc.errno != errno.EEXIST:
self._error = "Install failed"
self._error_msg = "Could not create folder from zip"
self.print_trace()
return -1
else:
with open(os.path.join(outdir, subpath), "wb") as outfile:
@ -1002,13 +962,12 @@ class Singleton_updater(object):
print("Clean removing file {}".format(os.path.join(base,f)))
for f in folders:
if os.path.join(base,f)==self._updater_path: continue
shutil.rmtree(os.path.join(base,f), ignore_errors=True)
shutil.rmtree(os.path.join(base,f))
print("Clean removing folder and contents {}".format(os.path.join(base,f)))
except Exception as err:
error = "failed to create clean existing addon folder"
print(error, str(err))
self.print_trace()
# Walk through the base addon folder for rules on pre-removing
# but avoid removing/altering backup and updater file
@ -1024,7 +983,6 @@ class Singleton_updater(object):
if self._verbose: print("Pre-removed file "+file)
except OSError:
print("Failed to pre-remove "+file)
self.print_trace()
# Walk through the temp addon sub folder for replacements
# this implements the overwrite rules, which apply after
@ -1048,7 +1006,7 @@ class Singleton_updater(object):
# otherwise, check each file to see if matches an overwrite pattern
replaced=False
for ptrn in self._overwrite_patterns:
if fnmatch.filter([file],ptrn):
if fnmatch.filter([destFile],ptrn):
replaced=True
break
if replaced:
@ -1064,11 +1022,10 @@ class Singleton_updater(object):
# now remove the temp staging folder and downloaded zip
try:
shutil.rmtree(staging_path, ignore_errors=True)
shutil.rmtree(staging_path)
except:
error = "Error: Failed to remove existing staging directory, consider manually removing "+staging_path
if self._verbose: print(error)
self.print_trace()
def reload_addon(self):
@ -1084,16 +1041,9 @@ class Singleton_updater(object):
# not allowed in restricted context, such as register module
# toggle to refresh
if "addon_disable" in dir(bpy.ops.wm): # 2.7
bpy.ops.wm.addon_disable(module=self._addon_package)
bpy.ops.wm.addon_refresh()
bpy.ops.wm.addon_enable(module=self._addon_package)
print("2.7 reload complete")
else: # 2.8
bpy.ops.preferences.addon_disable(module=self._addon_package)
bpy.ops.preferences.addon_refresh()
bpy.ops.preferences.addon_enable(module=self._addon_package)
print("2.8 reload complete")
bpy.ops.wm.addon_disable(module=self._addon_package)
bpy.ops.wm.addon_refresh()
bpy.ops.wm.addon_enable(module=self._addon_package)
# -------------------------------------------------------------------------
@ -1425,26 +1375,26 @@ class Singleton_updater(object):
if "last_check" not in self._json or self._json["last_check"] == "":
return True
else:
now = datetime.now()
last_check = datetime.strptime(self._json["last_check"],
"%Y-%m-%d %H:%M:%S.%f")
next_check = last_check
offset = timedelta(
days=self._check_interval_days + 30*self._check_interval_months,
hours=self._check_interval_hours,
minutes=self._check_interval_minutes
)
now = datetime.now()
last_check = datetime.strptime(self._json["last_check"],
"%Y-%m-%d %H:%M:%S.%f")
next_check = last_check
offset = timedelta(
days=self._check_interval_days + 30*self._check_interval_months,
hours=self._check_interval_hours,
minutes=self._check_interval_minutes
)
delta = (now - offset) - last_check
if delta.total_seconds() > 0:
if self._verbose:
print("{} Updater: Time to check for updates!".format(self._addon))
return True
if self._verbose:
print("{} Updater: Determined it's not yet time to check for updates".format(self._addon))
return False
delta = (now - offset) - last_check
if delta.total_seconds() > 0:
if self._verbose:
print("{} Updater: Time to check for updates!".format(self._addon))
return True
else:
if self._verbose:
print("{} Updater: Determined it's not yet time to check for updates".format(self._addon))
return False
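The interval check above can be condensed into a standalone sketch. As in the updater's own offset computation, months are approximated as 30 days; the function name is invented for illustration.

```python
from datetime import datetime, timedelta

def past_interval(last_check_str, now, days=0, months=0, hours=0, minutes=0):
    """True when `now` is past last_check plus the configured interval."""
    last_check = datetime.strptime(last_check_str, "%Y-%m-%d %H:%M:%S.%f")
    # Months approximated as 30 days, matching the updater's offset.
    offset = timedelta(days=days + 30 * months, hours=hours, minutes=minutes)
    return (now - offset - last_check).total_seconds() > 0
```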
def get_json_path(self):
"""Returns the full path to the JSON state file used by this updater.
@ -1463,7 +1413,6 @@ class Singleton_updater(object):
except Exception as err:
print("Other OS error occurred while trying to rename old JSON")
print(err)
self.print_trace()
return json_path
def set_updater_json(self):
@ -1564,7 +1513,6 @@ class Singleton_updater(object):
except Exception as exception:
print("Checking for update error:")
print(exception)
self.print_trace()
if not self._error:
self._update_ready = False
self._update_version = None
@ -1676,7 +1624,10 @@ class GitlabEngine(object):
return "{}{}{}".format(self.api_url,"/api/v4/projects/",updater.repo)
def form_tags_url(self, updater):
return "{}{}".format(self.form_repo_url(updater),"/repository/tags")
if updater.use_releases:
return "{}{}".format(self.form_repo_url(updater),"/releases")
else:
return "{}{}".format(self.form_repo_url(updater),"/repository/tags")
def form_branch_list_url(self, updater):
# does not validate branch name.
@ -1704,7 +1655,12 @@ class GitlabEngine(object):
def parse_tags(self, response, updater):
if response == None:
return []
return [{"name": tag["name"], "zipball_url": self.get_zip_url(tag["commit"]["id"], updater)} for tag in response]
# Return asset links from release
if updater.use_releases:
return [{"name": release["name"], "zipball_url": release["assets"]["links"][0]["url"]} for release in response]
else:
return [{"name": tag["name"], "zipball_url": self.get_zip_url(tag["commit"]["id"], updater)} for tag in response]
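To illustrate the two branches of `parse_tags` above, here is a self-contained sketch with sample payloads shaped like GitLab v4 API responses. The sample data is invented, and the tags branch returns the bare commit id where the real code builds a zip URL via `get_zip_url`.

```python
# Hypothetical GitLab API payloads (field names follow the v4 API).
tags_response = [
    {"name": "v0.0.3", "commit": {"id": "d9f1031107"}},
]
releases_response = [
    {"name": "v0.0.3",
     "assets": {"links": [{"url": "https://gitlab.com/.../multi_user.zip"}]}},
]

def parse(response, use_releases):
    if response is None:
        return []
    if use_releases:
        # Releases expose a direct asset link to the built zip.
        return [{"name": r["name"],
                 "zipball_url": r["assets"]["links"][0]["url"]}
                for r in response]
    # Tags only carry a commit id; the real code turns it into a zip URL.
    return [{"name": t["name"], "zipball_url": t["commit"]["id"]}
            for t in response]
```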
# -----------------------------------------------------------------------------

File diff suppressed because it is too large Load Diff

View File

@ -34,14 +34,11 @@ __all__ = [
'bl_metaball',
'bl_lattice',
'bl_lightprobe',
'bl_speaker',
'bl_font',
'bl_sound',
'bl_file'
'bl_speaker'
] # Order here defines execution order
from . import *
from replication.data import ReplicatedDataFactory
from ..libs.replication.replication.data import ReplicatedDataFactory
def types_to_register():
return __all__

View File

@ -134,7 +134,6 @@ class BlAction(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'ACTION_TWEAK'
def _construct(self, data):

View File

@ -31,7 +31,6 @@ class BlArmature(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 0
bl_automatic_push = True
bl_check_common = False
bl_icon = 'ARMATURE_DATA'
def _construct(self, data):
@ -93,7 +92,6 @@ class BlArmature(BlDatablock):
new_bone.head = bone_data['head_local']
new_bone.tail_radius = bone_data['tail_radius']
new_bone.head_radius = bone_data['head_radius']
# new_bone.roll = bone_data['roll']
if 'parent' in bone_data:
new_bone.parent = target.edit_bones[data['bones']
@ -125,8 +123,7 @@ class BlArmature(BlDatablock):
'use_connect',
'parent',
'name',
'layers',
# 'roll',
'layers'
]
data = dumper.dump(instance)

View File

@ -29,7 +29,6 @@ class BlCamera(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'CAMERA_DATA'
def _construct(self, data):
@ -37,7 +36,7 @@ class BlCamera(BlDatablock):
def _load_implementation(self, data, target):
loader = Loader()
loader = Loader()
loader.load(target, data)
dof_settings = data.get('dof')
@ -46,22 +45,13 @@ class BlCamera(BlDatablock):
if dof_settings:
loader.load(target.dof, dof_settings)
background_images = data.get('background_images')
if background_images:
target.background_images.clear()
for img_name, img_data in background_images.items():
target_img = target.background_images.new()
target_img.image = bpy.data.images[img_name]
loader.load(target_img, img_data)
def _dump_implementation(self, data, instance=None):
assert(instance)
# TODO: background image support
dumper = Dumper()
dumper.depth = 3
dumper.depth = 2
dumper.include_filter = [
"name",
'type',
@ -80,7 +70,6 @@ class BlCamera(BlDatablock):
'aperture_fstop',
'aperture_blades',
'aperture_rotation',
'ortho_scale',
'aperture_ratio',
'display_size',
'show_limits',
@ -90,24 +79,7 @@ class BlCamera(BlDatablock):
'sensor_fit',
'sensor_height',
'sensor_width',
'show_background_images',
'background_images',
'alpha',
'display_depth',
'frame_method',
'offset',
'rotation',
'scale',
'use_flip_x',
'use_flip_y',
'image'
]
return dumper.dump(instance)
def _resolve_deps_implementation(self):
deps = []
for background in self.instance.background_images:
if background.image:
deps.append(background.image)
return deps

View File

@ -21,55 +21,6 @@ import mathutils
from .. import utils
from .bl_datablock import BlDatablock
from .dump_anything import Loader, Dumper
def dump_collection_children(collection):
collection_children = []
for child in collection.children:
if child not in collection_children:
collection_children.append(child.uuid)
return collection_children
def dump_collection_objects(collection):
collection_objects = []
for object in collection.objects:
if object not in collection_objects:
collection_objects.append(object.uuid)
return collection_objects
def load_collection_objects(dumped_objects, collection):
for object in dumped_objects:
object_ref = utils.find_from_attr('uuid', object, bpy.data.objects)
if object_ref is None:
continue
elif object_ref.name not in collection.objects.keys():
collection.objects.link(object_ref)
for object in collection.objects:
if object.uuid not in dumped_objects:
collection.objects.unlink(object)
def load_collection_childrens(dumped_childrens, collection):
for child_collection in dumped_childrens:
collection_ref = utils.find_from_attr(
'uuid',
child_collection,
bpy.data.collections)
if collection_ref is None:
continue
if collection_ref.name not in collection.children.keys():
collection.children.link(collection_ref)
for child_collection in collection.children:
if child_collection.uuid not in dumped_childrens:
collection.children.unlink(child_collection)
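The four helpers above reduce collection syncing to a link/unlink reconciliation against the dumped uuid lists: link any dumped member that is missing locally, unlink any local member absent from the dump. The same pattern can be sketched without `bpy`, using a plain dict as a stand-in for the collection (`sync_members` and the stand-in values are illustrative, not part of the addon):

```python
def sync_members(dumped_ids, local):
    """Reconcile a local {uuid: item} mapping against a dumped uuid list:
    add missing entries, drop entries no longer present in the dump."""
    for uuid in dumped_ids:
        if uuid not in local:
            local[uuid] = f"linked-{uuid}"  # stand-in for collection.objects.link()
    for uuid in list(local):
        if uuid not in dumped_ids:
            del local[uuid]                 # stand-in for collection.objects.unlink()
    return local
```

Existing members keep their identity; only membership changes, which mirrors why the addon iterates both the dumped list and the live collection.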
class BlCollection(BlDatablock):
@@ -79,47 +30,71 @@ class BlCollection(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
def _construct(self, data):
if self.is_library:
with bpy.data.libraries.load(filepath=bpy.data.libraries[self.data['library']].filepath, link=True) as (sourceData, targetData):
targetData.collections = [
name for name in sourceData.collections if name == self.data['name']]
instance = bpy.data.collections[self.data['name']]
return instance
instance = bpy.data.collections.new(data["name"])
return instance
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
# Load other meshes metadata
target.name = data["name"]
# Objects
load_collection_objects(data['objects'], target)
for object in data["objects"]:
object_ref = bpy.data.objects.get(object)
if object_ref is None:
continue
if object not in target.objects.keys():
target.objects.link(object_ref)
for object in target.objects:
if object.name not in data["objects"]:
target.objects.unlink(object)
# Link childrens
load_collection_childrens(data['children'], target)
for collection in data["children"]:
collection_ref = bpy.data.collections.get(collection)
if collection_ref is None:
continue
if collection_ref.name not in target.children.keys():
target.children.link(collection_ref)
for collection in target.children:
if collection.name not in data["children"]:
target.children.unlink(collection)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"name",
"instance_offset"
]
data = dumper.dump(instance)
data = {}
data['name'] = instance.name
# dump objects
data['objects'] = dump_collection_objects(instance)
collection_objects = []
for object in instance.objects:
if object not in collection_objects:
collection_objects.append(object.name)
data['objects'] = collection_objects
# dump children collections
data['children'] = dump_collection_children(instance)
collection_children = []
for child in instance.children:
if child not in collection_children:
collection_children.append(child.name)
data['children'] = collection_children
return data

View File

@@ -46,105 +46,12 @@ SPLINE_POINT = [
"radius",
]
CURVE_METADATA = [
'align_x',
'align_y',
'bevel_depth',
'bevel_factor_end',
'bevel_factor_mapping_end',
'bevel_factor_mapping_start',
'bevel_factor_start',
'bevel_object',
'bevel_resolution',
'body',
'body_format',
'dimensions',
'eval_time',
'extrude',
'family',
'fill_mode',
'follow_curve',
'font',
'font_bold',
'font_bold_italic',
'font_italic',
'make_local',
'materials',
'name',
'offset',
'offset_x',
'offset_y',
'overflow',
'original',
'override_create',
'override_library',
'path_duration',
'preview',
'render_resolution_u',
'render_resolution_v',
'resolution_u',
'resolution_v',
'shape_keys',
'shear',
'size',
'small_caps_scale',
'space_character',
'space_line',
'space_word',
'type',
'taper_object',
'texspace_location',
'texspace_size',
'transform',
'twist_mode',
'twist_smooth',
'underline_height',
'underline_position',
'use_auto_texspace',
'use_deform_bounds',
'use_fake_user',
'use_fill_caps',
'use_fill_deform',
'use_map_taper',
'use_path',
'use_path_follow',
'use_radius',
'use_stretch',
]
SPLINE_METADATA = [
'hide',
'material_index',
# 'order_u',
# 'order_v',
# 'point_count_u',
# 'point_count_v',
'points',
'radius_interpolation',
'resolution_u',
'resolution_v',
'tilt_interpolation',
'type',
'use_bezier_u',
'use_bezier_v',
'use_cyclic_u',
'use_cyclic_v',
'use_endpoint_u',
'use_endpoint_v',
'use_smooth',
]
class BlCurve(BlDatablock):
bl_id = "curves"
bl_class = bpy.types.Curve
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'CURVE_DATA'
def _construct(self, data):
@@ -155,7 +62,6 @@ class BlCurve(BlDatablock):
loader.load(target, data)
target.splines.clear()
# load splines
for spline in data['splines'].values():
new_spline = target.splines.new(spline['type'])
@@ -166,12 +72,8 @@
bezier_points = new_spline.bezier_points
bezier_points.add(spline['bezier_points_count'])
np_load_collection(spline['bezier_points'], bezier_points, SPLINE_BEZIER_POINT)
if new_spline.type == 'POLY':
points = new_spline.points
points.add(spline['points_count'])
np_load_collection(spline['points'], points, SPLINE_POINT)
# Not working for now...
# Not really working for now...
# See https://blender.stackexchange.com/questions/7020/create-nurbs-surface-with-python
if new_spline.type == 'NURBS':
logging.error("NURBS not supported.")
@@ -181,14 +83,11 @@
# new_spline.points[point_index], data['splines'][spline]["points"][point_index])
loader.load(new_spline, spline)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = Dumper()
# Conflicting attributes
# TODO: remove them with the NURBS support
dumper.include_filter = CURVE_METADATA
dumper.exclude_filter = [
'users',
'order_u',
@@ -206,13 +105,8 @@ class BlCurve(BlDatablock):
for index, spline in enumerate(instance.splines):
dumper.depth = 2
dumper.include_filter = SPLINE_METADATA
spline_data = dumper.dump(spline)
if spline.type == 'POLY':
spline_data['points_count'] = len(spline.points)-1
spline_data['points'] = np_dump_collection(spline.points, SPLINE_POINT)
# spline_data['points'] = np_dump_collection(spline.points, SPLINE_POINT)
spline_data['bezier_points_count'] = len(spline.bezier_points)-1
spline_data['bezier_points'] = np_dump_collection(spline.bezier_points, SPLINE_BEZIER_POINT)
data['splines'][index] = spline_data
@@ -224,17 +118,3 @@
elif isinstance(instance, T.Curve):
data['type'] = 'CURVE'
return data
def _resolve_deps_implementation(self):
# TODO: resolve material
deps = []
curve = self.instance
if isinstance(curve, T.TextCurve):
deps.extend([
curve.font,
curve.font_bold,
curve.font_bold_italic,
curve.font_italic])
return deps

View File

@@ -16,16 +16,13 @@
# ##### END GPL LICENSE BLOCK #####
import logging
from collections.abc import Iterable
import bpy
import mathutils
from replication.constants import DIFF_BINARY, UP
from replication.data import ReplicatedDatablock
from .. import utils
from .dump_anything import Dumper, Loader
from .dump_anything import Loader, Dumper
from ..libs.replication.replication.data import ReplicatedDatablock
from ..libs.replication.replication.constants import (UP, DIFF_BINARY)
def has_action(target):
@@ -89,19 +86,6 @@ def load_driver(target_datablock, src_driver):
loader.load(new_point, src_driver['keyframe_points'][src_point])
def get_datablock_from_uuid(uuid, default, ignore=[]):
if not uuid:
return default
for category in dir(bpy.data):
root = getattr(bpy.data, category)
if isinstance(root, Iterable) and category not in ignore:
for item in root:
if getattr(item, 'uuid', None) == uuid:
return item
return default
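`get_datablock_from_uuid` performs a linear scan of every `bpy.data` category for an item whose `uuid` attribute matches, returning `default` when nothing is found. A minimal sketch of that lookup over plain namespaces (the `find_by_uuid` name and the sample data are illustrative, not from the addon):

```python
from types import SimpleNamespace

def find_by_uuid(uuid, default, categories, ignore=[]):
    # Mirrors get_datablock_from_uuid: scan each category's items
    # for a matching .uuid attribute, else return the fallback.
    if not uuid:
        return default
    for name, items in categories.items():
        if name in ignore:
            continue
        for item in items:
            if getattr(item, 'uuid', None) == uuid:
                return item
    return default

# Stand-in for bpy.data's categories
data = {'meshes': [SimpleNamespace(uuid='m1', name='Cube')],
        'images': [SimpleNamespace(uuid='i1', name='Tex')]}
```

The `ignore` list matters in the addon because some categories (e.g. images) are resolved by other means and must not shadow the fallback.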
class BlDatablock(ReplicatedDatablock):
"""BlDatablock
@@ -111,14 +95,11 @@ class BlDatablock(ReplicatedDatablock):
bl_delay_apply : refresh rate in sec for apply
bl_automatic_push : boolean
bl_icon : type icon (blender icon name)
bl_check_common: enable check even in common rights
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
instance = kwargs.get('instance', None)
self.preferences = utils.get_preferences()
# TODO: use is_library_indirect
self.is_library = (instance and hasattr(instance, 'library') and
@@ -136,27 +117,15 @@ class BlDatablock(ReplicatedDatablock):
datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
if not datablock_ref:
try:
datablock_ref = datablock_root[self.data['name']]
except Exception:
name = self.data.get('name')
logging.debug(f"Constructing {name}")
datablock_ref = self._construct(data=self.data)
datablock_ref = datablock_root.get(
self.data['name'], # Resolve by name
self._construct(data=self.data)) # If it doesn't exist create it
if datablock_ref:
setattr(datablock_ref, 'uuid', self.uuid)
self.instance = datablock_ref
def remove_instance(self):
"""
Remove instance from blender data
"""
assert(self.instance)
datablock_root = getattr(bpy.data, self.bl_id)
datablock_root.remove(self.instance)
def _dump(self, instance=None):
dumper = Dumper()
data = {}
@@ -217,7 +186,6 @@ class BlDatablock(ReplicatedDatablock):
if not self.is_library:
dependencies.extend(self._resolve_deps_implementation())
logging.debug(f"{self.instance.name} dependencies: {dependencies}")
return dependencies
def _resolve_deps_implementation(self):

View File

@@ -1,140 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
import sys
from pathlib import Path
import bpy
import mathutils
from replication.constants import DIFF_BINARY, UP
from replication.data import ReplicatedDatablock
from .. import utils
from .dump_anything import Dumper, Loader
def get_filepath(filename):
"""
Construct the local filepath
"""
return str(Path(
utils.get_preferences().cache_directory,
filename
))
def ensure_unpacked(datablock):
if datablock.packed_file:
logging.info(f"Unpacking {datablock.name}")
filename = Path(bpy.path.abspath(datablock.filepath)).name
datablock.filepath = get_filepath(filename)
datablock.unpack(method="WRITE_ORIGINAL")
class BlFile(ReplicatedDatablock):
bl_id = 'file'
bl_name = "file"
bl_class = Path
bl_delay_refresh = 0
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'FILE'
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.instance = kwargs.get('instance', None)
if self.instance and not self.instance.exists():
raise FileNotFoundError(self.instance)
self.preferences = utils.get_preferences()
self.diff_method = DIFF_BINARY
def resolve(self):
if self.data:
self.instance = Path(get_filepath(self.data['name']))
if not self.instance.exists():
logging.debug("File doesn't exist, loading it.")
self._load(self.data, self.instance)
def push(self, socket, identity=None):
super().push(socket, identity=None)
if self.preferences.clear_memory_filecache:
del self.data['file']
def _dump(self, instance=None):
"""
Read the file and return a dict as:
{
name : filename
extension :
file: file content
}
"""
logging.info("Extracting file metadata")
data = {
'name': self.instance.name,
}
logging.info(
f"Reading {self.instance.name} content: {self.instance.stat().st_size} bytes")
try:
file = open(self.instance, "rb")
data['file'] = file.read()
file.close()
except IOError:
logging.warning(f"{self.instance} doesn't exist, skipping")
else:
file.close()
return data
def _load(self, data, target):
"""
Writing the file
"""
# TODO: check for empty data
if target.exists() and not self.diff():
logging.info(f"{data['name']} already on the disk, skipping.")
return
try:
file = open(target, "wb")
file.write(data['file'])
if self.preferences.clear_memory_filecache:
del self.data['file']
except IOError:
logging.warning(f"{target} doesn't exist, skipping")
else:
file.close()
def diff(self):
memory_size = sys.getsizeof(self.data['file'])-33
disk_size = self.instance.stat().st_size
return memory_size == disk_size
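`diff()` avoids re-reading the file by comparing the size of the cached bytes against the on-disk size; the hard-coded `33` is CPython's per-object overhead for a `bytes` instance on 64-bit builds, an implementation detail rather than a guaranteed constant. A quick check of that assumption (comparing `len(self.data['file'])` to `st_size` would sidestep it entirely):

```python
import sys

# sys.getsizeof(b) == len(b) + fixed header overhead on CPython;
# the addon hard-codes 33 for that overhead on 64-bit builds.
overhead = sys.getsizeof(b'')
for payload in (b'', b'x', b'x' * 1024):
    assert sys.getsizeof(payload) == overhead + len(payload)
print(overhead)  # 33 on a typical 64-bit CPython
```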

View File

@@ -1,74 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
from pathlib import Path
import bpy
from .bl_datablock import BlDatablock
from .bl_file import get_filepath, ensure_unpacked
from .dump_anything import Dumper, Loader
class BlFont(BlDatablock):
bl_id = "fonts"
bl_class = bpy.types.VectorFont
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'FILE_FONT'
def _construct(self, data):
filename = data.get('filename')
if filename == '<builtin>':
return bpy.data.fonts.load(filename)
else:
return bpy.data.fonts.load(get_filepath(filename))
def _load(self, data, target):
pass
def _dump(self, instance=None):
if instance.filepath == '<builtin>':
filename = '<builtin>'
else:
filename = Path(instance.filepath).name
if not filename:
raise FileExistsError(instance.filepath)
return {
'filename': filename,
'name': instance.name
}
def diff(self):
return False
def _resolve_deps_implementation(self):
deps = []
if self.instance.filepath and self.instance.filepath != '<builtin>':
ensure_unpacked(self.instance)
deps.append(Path(bpy.path.abspath(self.instance.filepath)))
return deps

View File

@@ -218,7 +218,6 @@ class BlGpencil(BlDatablock):
bl_delay_refresh = 2
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'GREASEPENCIL'
def _construct(self, data):

View File

@@ -16,108 +16,90 @@
# ##### END GPL LICENSE BLOCK #####
import logging
import os
from pathlib import Path
import bpy
import mathutils
import os
import logging
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader
from .bl_file import get_filepath, ensure_unpacked
format_to_ext = {
'BMP': 'bmp',
'IRIS': 'sgi',
'PNG': 'png',
'JPEG': 'jpg',
'JPEG2000': 'jp2',
'TARGA': 'tga',
'TARGA_RAW': 'tga',
'CINEON': 'cin',
'DPX': 'dpx',
'OPEN_EXR_MULTILAYER': 'exr',
'OPEN_EXR': 'exr',
'HDR': 'hdr',
'TIFF': 'tiff',
'AVI_JPEG': 'avi',
'AVI_RAW': 'avi',
'FFMPEG': 'mpeg',
}
def dump_image(image):
pixels = None
if image.source == "GENERATED" or image.packed_file is not None:
prefs = utils.get_preferences()
img_name = f"{image.name}.png"
# Cache the image on the disk
image.filepath_raw = os.path.join(prefs.cache_directory, img_name)
os.makedirs(prefs.cache_directory, exist_ok=True)
image.file_format = "PNG"
image.save()
if image.source == "FILE":
image_path = bpy.path.abspath(image.filepath_raw)
image_directory = os.path.dirname(image_path)
os.makedirs(image_directory, exist_ok=True)
image.save()
file = open(image_path, "rb")
pixels = file.read()
file.close()
else:
raise ValueError()
return pixels
class BlImage(BlDatablock):
bl_id = "images"
bl_class = bpy.types.Image
bl_delay_refresh = 1
bl_delay_refresh = 0
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_automatic_push = False
bl_icon = 'IMAGE_DATA'
def _construct(self, data):
return bpy.data.images.new(
name=data['name'],
width=data['size'][0],
height=data['size'][1]
)
name=data['name'],
width=data['size'][0],
height=data['size'][1]
)
def _load(self, data, target):
loader = Loader()
loader.load(data, target)
image = target
prefs = utils.get_preferences()
img_name = f"{image.name}.png"
img_path = os.path.join(prefs.cache_directory,img_name)
os.makedirs(prefs.cache_directory, exist_ok=True)
file = open(img_path, 'wb')
file.write(data["pixels"])
file.close()
image.source = 'FILE'
image.filepath = img_path
image.colorspace_settings.name = data["colorspace_settings"]["name"]
target.source = 'FILE'
target.filepath_raw = get_filepath(data['filename'])
target.colorspace_settings.name = data["colorspace_settings"]["name"]
def _dump(self, instance=None):
assert(instance)
filename = Path(instance.filepath).name
data = {
"filename": filename
}
data = {}
data['pixels'] = dump_image(instance)
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
"name",
'size',
'height',
'alpha',
'float_buffer',
'alpha_mode',
'colorspace_settings']
dumper.include_filter = [
"name",
'size',
'height',
'alpha',
'float_buffer',
'filepath',
'source',
'colorspace_settings']
data.update(dumper.dump(instance))
return data
def diff(self):
if self.instance and (self.instance.name != self.data['name']):
return True
else:
return False
return False
def _resolve_deps_implementation(self):
deps = []
if self.instance.filepath:
if self.instance.packed_file:
filename = Path(bpy.path.abspath(self.instance.filepath)).name
self.instance.filepath = get_filepath(filename)
self.instance.save()
# An image can't be unpacked to the modified path
# TODO: make a bug report
self.instance.unpack(method="REMOVE")
elif self.instance.source == "GENERATED":
filename = f"{self.instance.name}.png"
self.instance.filepath = get_filepath(filename)
self.instance.save()
deps.append(Path(bpy.path.abspath(self.instance.filepath)))
return deps

View File

@@ -21,7 +21,7 @@ import mathutils
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
from .bl_datablock import BlDatablock
from replication.exception import ContextError
from ..libs.replication.replication.exception import ContextError
POINT = ['co', 'weight_softbody', 'co_deform']
@@ -32,7 +32,6 @@ class BlLattice(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LATTICE_DATA'
def _construct(self, data):

View File

@@ -29,7 +29,6 @@ class BlLibrary(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIBRARY_DATA_DIRECT'
def _construct(self, data):

View File

@@ -29,7 +29,6 @@ class BlLight(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIGHT_DATA'
def _construct(self, data):

View File

@@ -30,7 +30,6 @@ class BlLightprobe(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIGHTPROBE_GRID'
def _construct(self, data):

View File

@@ -19,12 +19,11 @@
import bpy
import mathutils
import logging
import re
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock, get_datablock_from_uuid
from .bl_datablock import BlDatablock
NODE_SOCKET_INDEX = re.compile(r'\[(\d*)\]')
def load_node(node_data, node_tree):
""" Load a node into a node_tree from a dict
@@ -37,24 +36,21 @@ def load_node(node_data, node_tree):
loader = Loader()
target_node = node_tree.nodes.new(type=node_data["bl_idname"])
loader.load(target_node, node_data)
image_uuid = node_data.get('image_uuid', None)
loader.load(target_node, node_data)
if image_uuid and not target_node.image:
target_node.image = get_datablock_from_uuid(image_uuid,None)
for input in node_data["inputs"]:
if hasattr(target_node.inputs[input], "default_value"):
try:
target_node.inputs[input].default_value = node_data["inputs"][input]["default_value"]
except:
logging.error(
f"Material {input} parameter not supported, skipping")
logging.error(f"Material {input} parameter not supported, skipping")
def load_links(links_data, node_tree):
""" Load node_tree links from a list
:arg links_data: dumped node links
:type links_data: list
:arg node_tree: node links collection
@@ -64,6 +60,7 @@ def load_links(links_data, node_tree):
for link in links_data:
input_socket = node_tree.nodes[link['to_node']].inputs[int(link['to_socket'])]
output_socket = node_tree.nodes[link['from_node']].outputs[int(link['from_socket'])]
node_tree.links.new(input_socket, output_socket)
@@ -78,13 +75,11 @@ def dump_links(links):
links_data = []
for link in links:
to_socket = NODE_SOCKET_INDEX.search(link.to_socket.path_from_id()).group(1)
from_socket = NODE_SOCKET_INDEX.search(link.from_socket.path_from_id()).group(1)
links_data.append({
'to_node': link.to_node.name,
'to_socket': to_socket,
'from_node': link.from_node.name,
'from_socket': from_socket,
'to_node':link.to_node.name,
'to_socket':link.to_socket.path_from_id()[-2:-1],
'from_node':link.from_node.name,
'from_socket':link.from_socket.path_from_id()[-2:-1],
})
return links_data
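The rewritten `dump_links` swaps the old `path_from_id()[-2:-1]` slice, which keeps only a single character and therefore truncates socket indices of 10 or more, for the `NODE_SOCKET_INDEX` regex. A small demonstration of the difference, assuming a socket path of the form Blender produces:

```python
import re

NODE_SOCKET_INDEX = re.compile(r'\[(\d*)\]')

path = 'nodes["Mix"].inputs[12]'
assert path[-2:-1] == '2'                                # old slice: last digit only
assert NODE_SOCKET_INDEX.search(path).group(1) == '12'   # regex: full index
```

The regex skips `["Mix"]` because the group only accepts digits, so the first digits-only bracket pair is the socket index.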
@@ -121,10 +116,9 @@ def dump_node(node):
"show_preview",
"show_texture",
"outputs",
"width_hidden",
"image"
"width_hidden"
]
dumped_node = node_dumper.dump(node)
if hasattr(node, 'inputs'):
@@ -157,8 +151,7 @@ def dump_node(node):
'location'
]
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
if hasattr(node, 'image') and getattr(node, 'image'):
dumped_node['image_uuid'] = node.image.uuid
return dumped_node
@@ -168,7 +161,6 @@ class BlMaterial(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'MATERIAL_DATA'
def _construct(self, data):
@@ -184,14 +176,15 @@ class BlMaterial(BlDatablock):
loader.load(
target.grease_pencil, data['grease_pencil'])
if data["use_nodes"]:
if target.node_tree is None:
target.use_nodes = True
target.node_tree.nodes.clear()
loader.load(target, data)
loader.load(target,data)
# Load nodes
for node in data["node_tree"]["nodes"]:
load_node(data["node_tree"]["nodes"][node], target.node_tree)
@@ -228,9 +221,9 @@ class BlMaterial(BlDatablock):
for node in instance.node_tree.nodes:
nodes[node.name] = dump_node(node)
data["node_tree"]['nodes'] = nodes
data["node_tree"]["links"] = dump_links(instance.node_tree.links)
if instance.is_grease_pencil:
gp_mat_dumper = Dumper()
gp_mat_dumper.depth = 3
@@ -255,7 +248,7 @@ class BlMaterial(BlDatablock):
'texture_clamp',
'gradient_type',
'mix_color',
'flip'
'flip'
]
data['grease_pencil'] = gp_mat_dumper.dump(instance.grease_pencil)
return data
@@ -266,9 +259,10 @@ class BlMaterial(BlDatablock):
if self.instance.use_nodes:
for node in self.instance.node_tree.nodes:
if node.type in ['TEX_IMAGE','TEX_ENVIRONMENT']:
if node.type == 'TEX_IMAGE':
deps.append(node.image)
if self.is_library:
deps.append(self.instance.library)
return deps

View File

@@ -23,10 +23,11 @@ import logging
import numpy as np
from .dump_anything import Dumper, Loader, np_load_collection_primitives, np_dump_collection_primitive, np_load_collection, np_dump_collection
from replication.constants import DIFF_BINARY
from replication.exception import ContextError
from ..libs.replication.replication.constants import DIFF_BINARY
from ..libs.replication.replication.exception import ContextError
from .bl_datablock import BlDatablock
VERTICE = ['co']
EDGE = [
@@ -52,7 +53,6 @@ class BlMesh(BlDatablock):
bl_delay_refresh = 2
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'MESH_DATA'
def _construct(self, data):
@@ -89,34 +89,32 @@ class BlMesh(BlDatablock):
np_load_collection(data["polygons"],target.polygons, POLYGON)
# UV Layers
if 'uv_layers' in data.keys():
for layer in data['uv_layers']:
if layer not in target.uv_layers:
target.uv_layers.new(name=layer)
for layer in data['uv_layers']:
if layer not in target.uv_layers:
target.uv_layers.new(name=layer)
np_load_collection_primitives(
target.uv_layers[layer].data,
'uv',
data["uv_layers"][layer]['data'])
np_load_collection_primitives(
target.uv_layers[layer].data,
'uv',
data["uv_layers"][layer]['data'])
# Vertex color
if 'vertex_colors' in data.keys():
for color_layer in data['vertex_colors']:
if color_layer not in target.vertex_colors:
target.vertex_colors.new(name=color_layer)
for color_layer in data['vertex_colors']:
if color_layer not in target.vertex_colors:
target.vertex_colors.new(name=color_layer)
np_load_collection_primitives(
target.vertex_colors[color_layer].data,
'color',
data["vertex_colors"][color_layer]['data'])
np_load_collection_primitives(
target.vertex_colors[color_layer].data,
'color',
data["vertex_colors"][color_layer]['data'])
target.validate()
target.update()
def _dump_implementation(self, data, instance=None):
assert(instance)
if instance.is_editmode and not self.preferences.sync_flags.sync_during_editmode:
if instance.is_editmode:
raise ContextError("Mesh is in edit mode")
mesh = instance
@@ -149,18 +147,16 @@ class BlMesh(BlDatablock):
data["loops"] = np_dump_collection(mesh.loops, LOOP)
# UV Layers
if mesh.uv_layers:
data['uv_layers'] = {}
for layer in mesh.uv_layers:
data['uv_layers'][layer.name] = {}
data['uv_layers'][layer.name]['data'] = np_dump_collection_primitive(layer.data, 'uv')
data['uv_layers'] = {}
for layer in mesh.uv_layers:
data['uv_layers'][layer.name] = {}
data['uv_layers'][layer.name]['data'] = np_dump_collection_primitive(layer.data, 'uv')
# Vertex color
if mesh.vertex_colors:
data['vertex_colors'] = {}
for color_map in mesh.vertex_colors:
data['vertex_colors'][color_map.name] = {}
data['vertex_colors'][color_map.name]['data'] = np_dump_collection_primitive(color_map.data, 'color')
data['vertex_colors'] = {}
for color_map in mesh.vertex_colors:
data['vertex_colors'][color_map.name] = {}
data['vertex_colors'][color_map.name]['data'] = np_dump_collection_primitive(color_map.data, 'color')
# Fix material index
m_list = []

View File

@@ -68,7 +68,6 @@ class BlMetaball(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'META_BALL'
def _construct(self, data):

View File

@@ -16,15 +16,13 @@
# ##### END GPL LICENSE BLOCK #####
import logging
import bpy
import mathutils
from replication.exception import ContextError
import logging
from .bl_datablock import BlDatablock, get_datablock_from_uuid
from .dump_anything import Dumper, Loader
from replication.exception import ReparentException
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from ..libs.replication.replication.exception import ContextError
def load_pose(target_bone, data):
@@ -33,59 +31,12 @@ def load_pose(target_bone, data):
loader.load(target_bone, data)
def find_data_from_name(name=None):
instance = None
if not name:
pass
elif name in bpy.data.meshes.keys():
instance = bpy.data.meshes[name]
elif name in bpy.data.lights.keys():
instance = bpy.data.lights[name]
elif name in bpy.data.cameras.keys():
instance = bpy.data.cameras[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
elif name in bpy.data.metaballs.keys():
instance = bpy.data.metaballs[name]
elif name in bpy.data.armatures.keys():
instance = bpy.data.armatures[name]
elif name in bpy.data.grease_pencils.keys():
instance = bpy.data.grease_pencils[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
elif name in bpy.data.lattices.keys():
instance = bpy.data.lattices[name]
elif name in bpy.data.speakers.keys():
instance = bpy.data.speakers[name]
elif name in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
instance = bpy.data.lightprobes[name]
else:
logging.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
return instance
def load_data(object, name):
logging.info("loading data")
pass
def _is_editmode(object: bpy.types.Object) -> bool:
child_data = getattr(object, 'data', None)
return (child_data and
hasattr(child_data, 'is_editmode') and
child_data.is_editmode)
class BlObject(BlDatablock):
bl_id = "objects"
bl_class = bpy.types.Object
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'OBJECT_DATA'
def _construct(self, data):
@@ -101,67 +52,45 @@ class BlObject(BlDatablock):
return instance
# TODO: refactoring
object_name = data.get("name")
data_uuid = data.get("data_uuid")
data_id = data.get("data")
object_data = get_datablock_from_uuid(
data_uuid,
find_data_from_name(data_id),
ignore=['images']) #TODO: use resolve_from_id
instance = bpy.data.objects.new(object_name, object_data)
if "data" not in data:
pass
elif data["data"] in bpy.data.meshes.keys():
instance = bpy.data.meshes[data["data"]]
elif data["data"] in bpy.data.lights.keys():
instance = bpy.data.lights[data["data"]]
elif data["data"] in bpy.data.cameras.keys():
instance = bpy.data.cameras[data["data"]]
elif data["data"] in bpy.data.curves.keys():
instance = bpy.data.curves[data["data"]]
elif data["data"] in bpy.data.metaballs.keys():
instance = bpy.data.metaballs[data["data"]]
elif data["data"] in bpy.data.armatures.keys():
instance = bpy.data.armatures[data["data"]]
elif data["data"] in bpy.data.grease_pencils.keys():
instance = bpy.data.grease_pencils[data["data"]]
elif data["data"] in bpy.data.curves.keys():
instance = bpy.data.curves[data["data"]]
elif data["data"] in bpy.data.lattices.keys():
instance = bpy.data.lattices[data["data"]]
elif data["data"] in bpy.data.speakers.keys():
instance = bpy.data.speakers[data["data"]]
elif data["data"] in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
instance = bpy.data.lightprobes[data["data"]]
else:
logging.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
instance = bpy.data.objects.new(data["name"], instance)
instance.uuid = self.uuid
return instance
def _load_implementation(self, data, target):
loader = Loader()
data_uuid = data.get("data_uuid")
data_id = data.get("data")
if target.type != data['type']:
raise ReparentException()
elif target.data and (target.data.name != data_id):
target.data = get_datablock_from_uuid(data_uuid, find_data_from_name(data_id), ignore=['images'])
# vertex groups
if 'vertex_groups' in data:
target.vertex_groups.clear()
for vg in data['vertex_groups']:
vertex_group=target.vertex_groups.new(name = vg['name'])
point_attr='vertices' if 'vertices' in vg else 'points'
for vert in vg[point_attr]:
vertex_group.add(
[vert['index']], vert['weight'], 'REPLACE')
# SHAPE KEYS
if 'shape_keys' in data:
target.shape_key_clear()
object_data=target.data
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data=data['shape_keys']['key_blocks'][key_block]
target.shape_key_add(name = key_block)
loader.load(
target.data.shape_keys.key_blocks[key_block], key_data)
for vert in key_data['data']:
target.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
# Load relative key after all
for key_block in data['shape_keys']['key_blocks']:
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
# Load transformation data
loader = Loader()
loader.load(target, data)
loader.load(target.display, data['display'])
# Pose
if 'pose' in data:
if not target.pose:
@@ -185,25 +114,51 @@
if 'constraints' in bone_data.keys():
loader.load(target_bone, bone_data['constraints'])
load_pose(target_bone, bone_data)
if 'bone_group_index' in bone_data.keys():
target_bone.bone_group = target.pose.bone_groups[bone_data['bone_group_index']]
# TODO: find another way...
if target.type == 'EMPTY':
img_uuid = data.get('data_uuid')
if target.data is None and img_uuid:
target.data = get_datablock_from_uuid(img_uuid, None)#bpy.data.images.get(img_key, None)
# vertex groups
if 'vertex_groups' in data:
target.vertex_groups.clear()
for vg in data['vertex_groups']:
vertex_group = target.vertex_groups.new(name=vg['name'])
point_attr = 'vertices' if 'vertices' in vg else 'points'
for vert in vg[point_attr]:
vertex_group.add(
[vert['index']], vert['weight'], 'REPLACE')
# SHAPE KEYS
if 'shape_keys' in data:
target.shape_key_clear()
object_data = target.data
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data = data['shape_keys']['key_blocks'][key_block]
target.shape_key_add(name=key_block)
loader.load(
target.data.shape_keys.key_blocks[key_block], key_data)
for vert in key_data['data']:
target.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
# Load relative key after all
for key_block in data['shape_keys']['key_blocks']:
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
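The relative-key pass above deliberately runs only after every key block has been created, since a block may reference one that appears later in the dump. A dependency-free sketch of that two-pass pattern (plain dicts stand in for bpy shape keys; the names are illustrative):

```python
def load_shape_keys(dump):
    # Pass 1: create every key block first.
    key_blocks = {name: {"co": block["co"]} for name, block in dump.items()}
    # Pass 2: wire relative_key references, now that all blocks exist.
    for name, block in dump.items():
        key_blocks[name]["relative_key"] = key_blocks[block["relative_key"]]
    return key_blocks

dump = {
    "Basis": {"co": (0.0, 0.0, 0.0), "relative_key": "Basis"},
    "Smile": {"co": (0.1, 0.0, 0.0), "relative_key": "Basis"},
}
keys = load_shape_keys(dump)
assert keys["Smile"]["relative_key"] is keys["Basis"]
```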
def _dump_implementation(self, data, instance=None):
assert(instance)
if _is_editmode(instance):
if self.preferences.sync_flags.sync_during_editmode:
instance.update_from_editmode()
else:
raise ContextError("Object is in edit-mode.")
child_data = getattr(instance, 'data', None)
if child_data and hasattr(child_data, 'is_editmode') and child_data.is_editmode:
raise ContextError("Object is in edit-mode.")
dumper = Dumper()
dumper.depth = 1
@@ -216,55 +171,28 @@ class BlObject(BlDatablock):
"library",
"empty_display_type",
"empty_display_size",
"empty_image_offset",
"empty_image_depth",
"empty_image_side",
"show_empty_image_orthographic",
"show_empty_image_perspective",
"show_empty_image_only_axis_aligned",
"use_empty_image_alpha",
"color",
"instance_collection",
"instance_type",
"location",
"scale",
'lock_location',
'lock_rotation',
'lock_scale',
'hide_render',
'display_type',
'display_bounds_type',
'show_bounds',
'show_name',
'show_axis',
'show_wire',
'show_all_edges',
'show_texture_space',
'show_in_front',
'type',
'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler',
]
data = dumper.dump(instance)
dumper.include_filter = [
'show_shadows',
]
data['display'] = dumper.dump(instance.display)
data['data_uuid'] = getattr(instance.data, 'uuid', None)
if self.is_library:
return data
# MODIFIERS
if hasattr(instance, 'modifiers'):
dumper.include_filter = None
dumper.depth = 1
dumper.depth = 2
data["modifiers"] = {}
for index, modifier in enumerate(instance.modifiers):
data["modifiers"][modifier.name] = dumper.dump(modifier)
# CONSTRAINTS
# OBJECT
if hasattr(instance, 'constraints'):
dumper.depth = 3
data["constraints"] = dumper.dump(instance.constraints)
@@ -317,8 +245,7 @@ class BlObject(BlDatablock):
# VERTEX GROUP
if len(instance.vertex_groups) > 0:
points_attr = 'vertices' if isinstance(
instance.data, bpy.types.Mesh) else 'points'
points_attr = 'vertices' if isinstance(instance.data, bpy.types.Mesh) else 'points'
vg_data = []
for vg in instance.vertex_groups:
vg_idx = vg.index
@@ -373,7 +300,7 @@ class BlObject(BlDatablock):
def _resolve_deps_implementation(self):
deps = []
# Avoid Empty case
if self.instance.data:
deps.append(self.instance.data)
@@ -388,3 +315,4 @@ class BlObject(BlDatablock):
deps.append(self.instance.instance_collection)
return deps
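`_dump_implementation` drives everything through the Dumper's `include_filter`, which whitelists the properties that end up in the replication payload. A simplified stand-in for that filtering (not the addon's real Dumper; `FakeObject` is invented for illustration):

```python
class MiniDumper:
    """Stand-in for the addon's Dumper: copy only whitelisted attributes."""

    def __init__(self):
        self.include_filter = None  # None means "dump every public attribute"

    def dump(self, instance):
        names = self.include_filter or [
            n for n in vars(instance) if not n.startswith('_')]
        return {name: getattr(instance, name) for name in names}


class FakeObject:
    def __init__(self):
        self.name = "Cube"
        self.location = (0.0, 0.0, 0.0)
        self.users = 3  # runtime-only info that should not replicate


dumper = MiniDumper()
dumper.include_filter = ["name", "location"]
assert dumper.dump(FakeObject()) == {"name": "Cube",
                                     "location": (0.0, 0.0, 0.0)}
```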

View File

@@ -21,10 +21,8 @@ import mathutils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from .bl_collection import dump_collection_children, dump_collection_objects, load_collection_childrens, load_collection_objects
from replication.constants import (DIFF_JSON, MODIFIED)
from deepdiff import DeepDiff
import logging
from ..utils import get_preferences
class BlScene(BlDatablock):
bl_id = "scenes"
@@ -32,14 +30,8 @@ class BlScene(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'SCENE_DATA'
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.diff_method = DIFF_JSON
def _construct(self, data):
instance = bpy.data.scenes.new(data["name"])
return instance
@@ -50,8 +42,24 @@ class BlScene(BlDatablock):
loader.load(target, data)
# Load master collection
load_collection_objects(data['collection']['objects'], target.collection)
load_collection_childrens(data['collection']['children'], target.collection)
for object in data["collection"]["objects"]:
if object not in target.collection.objects.keys():
target.collection.objects.link(bpy.data.objects[object])
for object in target.collection.objects.keys():
if object not in data["collection"]["objects"]:
target.collection.objects.unlink(bpy.data.objects[object])
# load collections
for collection in data["collection"]["children"]:
if collection not in target.collection.children.keys():
target.collection.children.link(
bpy.data.collections[collection])
for collection in target.collection.children.keys():
if collection not in data["collection"]["children"]:
target.collection.children.unlink(
bpy.data.collections[collection])
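The four loops above converge the local master collection onto the replicated name lists: link anything present in the dump but missing locally, unlink the reverse. The same reconciliation, reduced to set arithmetic (names are illustrative):

```python
def reconcile(local_names, wanted_names):
    """Return (to_link, to_unlink) so local ends up equal to wanted."""
    local, wanted = set(local_names), set(wanted_names)
    return wanted - local, local - wanted

to_link, to_unlink = reconcile(["Cube", "Lamp"], ["Cube", "Camera"])
assert to_link == {"Camera"}
assert to_unlink == {"Lamp"}
```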
if 'world' in data.keys():
target.world = bpy.data.worlds[data['world']]
@@ -60,23 +68,19 @@ class BlScene(BlDatablock):
if 'grease_pencil' in data.keys():
target.grease_pencil = bpy.data.grease_pencils[data['grease_pencil']]
if self.preferences.sync_flags.sync_render_settings:
if 'eevee' in data.keys():
loader.load(target.eevee, data['eevee'])
if 'eevee' in data.keys():
loader.load(target.eevee, data['eevee'])
if 'cycles' in data.keys():
loader.load(target.eevee, data['cycles'])
if 'cycles' in data.keys():
loader.load(target.eevee, data['cycles'])
if 'render' in data.keys():
loader.load(target.render, data['render'])
if 'view_settings' in data.keys():
loader.load(target.view_settings, data['view_settings'])
if target.view_settings.use_curve_mapping:
#TODO: change this ugly fix
target.view_settings.curve_mapping.white_level = data['view_settings']['curve_mapping']['white_level']
target.view_settings.curve_mapping.black_level = data['view_settings']['curve_mapping']['black_level']
target.view_settings.curve_mapping.update()
if 'view_settings' in data.keys():
loader.load(target.view_settings, data['view_settings'])
if target.view_settings.use_curve_mapping:
#TODO: change this ugly fix
target.view_settings.curve_mapping.white_level = data['view_settings']['curve_mapping']['white_level']
target.view_settings.curve_mapping.black_level = data['view_settings']['curve_mapping']['black_level']
target.view_settings.curve_mapping.update()
def _dump_implementation(self, data, instance=None):
assert(instance)
@@ -88,27 +92,22 @@ class BlScene(BlDatablock):
'name',
'world',
'id',
'camera',
'grease_pencil',
'frame_start',
'frame_end',
'frame_step',
]
if self.preferences.sync_flags.sync_active_camera:
scene_dumper.include_filter.append('camera')
data = scene_dumper.dump(instance)
scene_dumper.depth = 3
scene_dumper.include_filter = ['children','objects','name']
data['collection'] = {}
data['collection']['children'] = dump_collection_children(instance.collection)
data['collection']['objects'] = dump_collection_objects(instance.collection)
data['collection'] = scene_dumper.dump(instance.collection)
scene_dumper.depth = 1
scene_dumper.include_filter = None
pref = get_preferences()
if self.preferences.sync_flags.sync_render_settings:
if pref.sync_flags.sync_render_settings:
scene_dumper.exclude_filter = [
'gi_cache_info',
'feature_set',
@@ -122,15 +121,12 @@ class BlScene(BlDatablock):
'preview_samples',
'sample_clamp_indirect',
'samples',
'volume_bounces',
'file_extension',
'use_denoising'
'volume_bounces'
]
data['eevee'] = scene_dumper.dump(instance.eevee)
data['cycles'] = scene_dumper.dump(instance.cycles)
data['view_settings'] = scene_dumper.dump(instance.view_settings)
data['render'] = scene_dumper.dump(instance.render)
if instance.view_settings.use_curve_mapping:
data['view_settings']['curve_mapping'] = scene_dumper.dump(instance.view_settings.curve_mapping)
scene_dumper.depth = 5
@@ -164,17 +160,3 @@ class BlScene(BlDatablock):
deps.append(self.instance.grease_pencil)
return deps
def diff(self):
exclude_path = []
if not self.preferences.sync_flags.sync_render_settings:
exclude_path.append("root['eevee']")
exclude_path.append("root['cycles']")
exclude_path.append("root['view_settings']")
exclude_path.append("root['render']")
if not self.preferences.sync_flags.sync_active_camera:
exclude_path.append("root['camera']")
return DeepDiff(self.data, self._dump(instance=self.instance),exclude_paths=exclude_path, cache_size=5000)
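`exclude_paths` keeps locally-ignored settings out of the DeepDiff result, so disabling `sync_render_settings` also stops those keys from generating spurious updates. A dependency-free approximation of the idea (this is not the DeepDiff API, just the top-level behaviour):

```python
def shallow_diff(old, new, exclude=()):
    """Report changed top-level keys, skipping any key in exclude."""
    keys = (set(old) | set(new)) - set(exclude)
    return {key for key in keys if old.get(key) != new.get(key)}

old = {"name": "Scene", "camera": "CamA", "eevee": {"taa_samples": 16}}
new = {"name": "Scene", "camera": "CamB", "eevee": {"taa_samples": 64}}

assert shallow_diff(old, new) == {"camera", "eevee"}
# With camera/render sync disabled, those changes are invisible:
assert shallow_diff(old, new, exclude=("camera", "eevee")) == set()
```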

View File

@@ -1,69 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
from pathlib import Path
import bpy
from .bl_file import get_filepath, ensure_unpacked
from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader
class BlSound(BlDatablock):
bl_id = "sounds"
bl_class = bpy.types.Sound
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SOUND'
def _construct(self, data):
filename = data.get('filename')
return bpy.data.sounds.load(get_filepath(filename))
def _load(self, data, target):
loader = Loader()
loader.load(target, data)
def diff(self):
return False
def _dump(self, instance=None):
filename = Path(instance.filepath).name
if not filename:
raise FileExistsError(instance.filepath)
return {
'filename': filename,
'name': instance.name
}
def _resolve_deps_implementation(self):
deps = []
if self.instance.filepath and self.instance.filepath != '<builtin>':
ensure_unpacked(self.instance)
deps.append(Path(bpy.path.abspath(self.instance.filepath)))
return deps

View File

@@ -29,7 +29,6 @@ class BlSpeaker(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SPEAKER'
def _load_implementation(self, data, target):
@@ -49,7 +48,6 @@ class BlSpeaker(BlDatablock):
'volume',
'name',
'pitch',
'sound',
'volume_min',
'volume_max',
'attenuation',
@@ -62,15 +60,6 @@ class BlSpeaker(BlDatablock):
return dumper.dump(instance)
def _resolve_deps_implementation(self):
# resolve sound dependency
deps = []
sound = self.instance.sound
if sound:
deps.append(sound)
return deps

View File

@@ -30,16 +30,12 @@ class BlWorld(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'WORLD_DATA'
def _construct(self, data):
return bpy.data.worlds.new(data["name"])
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
if data["use_nodes"]:
if target.node_tree is None:
target.use_nodes = True
@@ -59,15 +55,19 @@ class BlWorld(BlDatablock):
assert(instance)
world_dumper = Dumper()
world_dumper.depth = 1
world_dumper.include_filter = [
"use_nodes",
"name",
"color"
world_dumper.depth = 2
world_dumper.exclude_filter = [
"preview",
"original",
"uuid",
"color",
"cycles",
"light_settings",
"users",
"view_center"
]
data = world_dumper.dump(instance)
if instance.use_nodes:
data['node_tree'] = {}
nodes = {}
for node in instance.node_tree.nodes:
@@ -84,7 +84,7 @@ class BlWorld(BlDatablock):
if self.instance.use_nodes:
for node in self.instance.node_tree.nodes:
if node.type in ['TEX_IMAGE','TEX_ENVIRONMENT']:
if node.type == 'TEX_IMAGE':
deps.append(node.image)
if self.is_library:
deps.append(self.instance.library)

View File

@@ -115,7 +115,7 @@ def np_dump_collection_primitive(collection: bpy.types.CollectionProperty, attri
:return: numpy byte buffer
"""
if len(collection) == 0:
logging.debug(f'Skipping empty {attribute} attribute')
logging.warning(f'Skipping empty {attribute} attribute')
return {}
attr_infos = collection[0].bl_rna.properties.get(attribute)
@@ -192,7 +192,7 @@ def np_load_collection_primitives(collection: bpy.types.CollectionProperty, attr
:type sequence: str
"""
if len(collection) == 0 or not sequence:
logging.debug(f"Skipping loading {attribute}")
logging.warning(f"Skipping loading {attribute}")
return
attr_infos = collection[0].bl_rna.properties.get(attribute)
@@ -301,7 +301,7 @@ class Dumper:
self._dump_ID = (lambda x, depth: x.name, self._dump_default_as_branch)
self._dump_collection = (
self._dump_default_as_leaf, self._dump_collection_as_branch)
self._dump_array = (self._dump_array_as_branch,
self._dump_array = (self._dump_default_as_leaf,
self._dump_array_as_branch)
self._dump_matrix = (self._dump_matrix_as_leaf,
self._dump_matrix_as_leaf)
@@ -593,10 +593,6 @@ class Loader:
instance.write(bpy.data.materials.get(dump))
elif isinstance(rna_property_type, T.Collection):
instance.write(bpy.data.collections.get(dump))
elif isinstance(rna_property_type, T.VectorFont):
instance.write(bpy.data.fonts.get(dump))
elif isinstance(rna_property_type, T.Sound):
instance.write(bpy.data.sounds.get(dump))
def _load_matrix(self, matrix, dump):
matrix.write(mathutils.Matrix(dump))

View File

@@ -19,25 +19,20 @@ import logging
import bpy
from . import presence, utils
from replication.constants import (FETCHED,
UP,
RP_COMMON,
STATE_INITIAL,
STATE_QUITTING,
STATE_ACTIVE,
STATE_SYNCING,
STATE_LOBBY,
STATE_SRV_SYNC,
REPARENT)
from . import operators, presence, utils
from .libs.replication.replication.constants import (FETCHED,
RP_COMMON,
STATE_INITIAL,
STATE_QUITTING,
STATE_ACTIVE,
STATE_SYNCING,
STATE_LOBBY,
STATE_SRV_SYNC)
from replication.interface import session
class Delayable():
"""Delayable task interface
"""
def __init__(self):
self.is_registered = False
def register(self):
raise NotImplementedError
@@ -56,20 +51,13 @@ class Timer(Delayable):
"""
def __init__(self, duration=1):
super().__init__()
self._timeout = duration
self._running = True
def register(self):
"""Register the timer into the blender timer system
"""
if not self.is_registered:
bpy.app.timers.register(self.main)
self.is_registered = True
logging.debug(f"Register {self.__class__.__name__}")
else:
logging.debug(f"Timer {self.__class__.__name__} already registered")
bpy.app.timers.register(self.main)
def main(self):
self.execute()
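The `is_registered` guard above exists because `bpy.app.timers.register` would happily add the same callback twice and double-fire it. Stripped of Blender, the register-once pattern is just this (a minimal sketch, not the addon's real class):

```python
class Registrable:
    """Minimal sketch of the Delayable register-once guard."""

    def __init__(self):
        self.is_registered = False
        self.register_count = 0  # stands in for bpy.app.timers.register

    def register(self):
        if self.is_registered:
            return  # already registered: a second registration would double-fire
        self.register_count += 1
        self.is_registered = True

timer = Registrable()
timer.register()
timer.register()  # no-op
assert timer.register_count == 1
```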
@@ -97,29 +85,18 @@ class ApplyTimer(Timer):
super().__init__(timout)
def execute(self):
if session and session.state['STATE'] == STATE_ACTIVE:
if self._type:
nodes = session.list(filter=self._type)
else:
nodes = session.list()
client = operators.client
if client and client.state['STATE'] == STATE_ACTIVE:
nodes = client.list(filter=self._type)
for node in nodes:
node_ref = session.get(uuid=node)
node_ref = client.get(uuid=node)
if node_ref.state == FETCHED:
try:
session.apply(node, force=True)
client.apply(node)
except Exception as e:
logging.error(f"Fail to apply {node_ref.uuid}: {e}")
elif node_ref.state == REPARENT:
# Reload the node
node_ref.remove_instance()
node_ref.resolve()
session.apply(node, force=True)
for parent in session._graph.find_parents(node):
logging.info(f"Applying parent {parent}")
session.apply(parent, force=True)
node_ref.state = UP
class DynamicRightSelectTimer(Timer):
@@ -130,6 +107,7 @@ class DynamicRightSelectTimer(Timer):
self._right_strategy = RP_COMMON
def execute(self):
session = operators.client
settings = utils.get_preferences()
if session and session.state['STATE'] == STATE_ACTIVE:
@@ -213,16 +191,11 @@ class DynamicRightSelectTimer(Timer):
class Draw(Delayable):
def __init__(self):
super().__init__()
self._handler = None
def register(self):
if not self.is_registered:
self._handler = bpy.types.SpaceView3D.draw_handler_add(
self.execute, (), 'WINDOW', 'POST_VIEW')
logging.debug(f"Register {self.__class__.__name__}")
else:
logging.debug(f"Draw {self.__class__.__name__} already registered")
self._handler = bpy.types.SpaceView3D.draw_handler_add(
self.execute, (), 'WINDOW', 'POST_VIEW')
def execute(self):
raise NotImplementedError()
@@ -237,6 +210,7 @@ class Draw(Delayable):
class DrawClient(Draw):
def execute(self):
session = getattr(operators, 'client', None)
renderer = getattr(presence, 'renderer', None)
prefs = utils.get_preferences()
@@ -265,28 +239,27 @@ class ClientUpdate(Timer):
class ClientUpdate(Timer):
def __init__(self, timout=.1):
def __init__(self, timout=.016):
super().__init__(timout)
self.handle_quit = False
self.users_metadata = {}
def execute(self):
settings = utils.get_preferences()
session = getattr(operators, 'client', None)
renderer = getattr(presence, 'renderer', None)
if session and renderer:
if session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]:
local_user = session.online_users.get(
settings.username)
local_user = operators.client.online_users.get(settings.username)
if not local_user:
return
else:
for username, user_data in session.online_users.items():
for username, user_data in operators.client.online_users.items():
if username != settings.username:
cached_user_data = self.users_metadata.get(
username)
new_user_data = session.online_users[username]['metadata']
cached_user_data = self.users_metadata.get(username)
new_user_data = operators.client.online_users[username]['metadata']
if cached_user_data is None:
self.users_metadata[username] = user_data['metadata']
@@ -299,7 +272,7 @@ class ClientUpdate(Timer):
local_user_metadata = local_user.get('metadata')
scene_current = bpy.context.scene.name
local_user = session.online_users.get(settings.username)
local_user = session.online_users.get(settings.username)
current_view_corners = presence.get_view_corners()
# Init client metadata
@@ -308,9 +281,9 @@ class ClientUpdate(Timer):
'view_corners': presence.get_view_matrix(),
'view_matrix': presence.get_view_matrix(),
'color': (settings.client_color.r,
settings.client_color.g,
settings.client_color.b,
1),
settings.client_color.g,
settings.client_color.b,
1),
'frame_current': bpy.context.scene.frame_current,
'scene_current': scene_current
}
@@ -323,52 +296,33 @@ class ClientUpdate(Timer):
session.update_user_metadata(local_user_metadata)
elif 'view_corners' in local_user_metadata and current_view_corners != local_user_metadata['view_corners']:
local_user_metadata['view_corners'] = current_view_corners
local_user_metadata['view_matrix'] = presence.get_view_matrix(
)
local_user_metadata['view_matrix'] = presence.get_view_matrix()
session.update_user_metadata(local_user_metadata)
# sync online users
session_users = operators.client.online_users
ui_users = bpy.context.window_manager.online_users
for index, user in enumerate(ui_users):
if user.username not in session_users.keys():
ui_users.remove(index)
renderer.flush_selection()
renderer.flush_users()
break
class SessionStatusUpdate(Timer):
def __init__(self, timout=1):
super().__init__(timout)
for user in session_users:
if user not in ui_users:
new_key = ui_users.add()
new_key.name = user
new_key.username = user
elif session.state['STATE'] == STATE_QUITTING:
presence.refresh_sidebar_view()
self.handle_quit = True
elif session.state['STATE'] == STATE_INITIAL and self.handle_quit:
self.handle_quit = False
presence.refresh_sidebar_view()
def execute(self):
presence.refresh_sidebar_view()
operators.unregister_delayables()
presence.renderer.stop()
class SessionUserSync(Timer):
def __init__(self, timout=1):
super().__init__(timout)
def execute(self):
renderer = getattr(presence, 'renderer', None)
if session and renderer:
# sync online users
session_users = session.online_users
ui_users = bpy.context.window_manager.online_users
for index, user in enumerate(ui_users):
if user.username not in session_users.keys():
ui_users.remove(index)
renderer.flush_selection()
renderer.flush_users()
break
for user in session_users:
if user not in ui_users:
new_key = ui_users.add()
new_key.name = user
new_key.username = user
class MainThreadExecutor(Timer):
def __init__(self, timout=1, execution_queue=None):
super().__init__(timout)
self.execution_queue = execution_queue
def execute(self):
while not self.execution_queue.empty():
function = self.execution_queue.get()
logging.debug(f"Executing {function.__name__}")
function()
presence.refresh_sidebar_view()
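`MainThreadExecutor` exists because session callbacks fire on the replication network thread, while Blender data may only be touched from the main thread; the executor drains a `queue.Queue` of callables from a timer. The drain loop in isolation (stdlib only, with hypothetical callbacks):

```python
import queue

def drain(execution_queue):
    """Run every queued callable, FIFO, on the calling thread."""
    while not execution_queue.empty():
        function = execution_queue.get()
        function()

results = []
background_queue = queue.Queue()
background_queue.put(lambda: results.append("on_connection"))
background_queue.put(lambda: results.append("on_exit"))
drain(background_queue)
assert results == ["on_connection", "on_exit"]
```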

View File

@@ -23,9 +23,6 @@ import subprocess
import sys
from pathlib import Path
import socket
import re
VERSION_EXPR = re.compile(r'\d+\.\d+\.\d+\w\d+')
THIRD_PARTY = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs")
DEFAULT_CACHE_DIR = os.path.join(
@@ -50,29 +47,10 @@ def install_pip():
subprocess.run([str(PYTHON_PATH), "-m", "ensurepip"])
def install_package(name, version):
logging.info(f"installing {name} version...")
env = os.environ
if "PIP_REQUIRE_VIRTUALENV" in env:
# PIP_REQUIRE_VIRTUALENV is an env var to ensure pip cannot install packages outside a virtual env
# https://docs.python-guide.org/dev/pip-virtualenv/
# But since Blender's pip is outside of a virtual env, it can block our packages installation, so we unset the
# env var for the subprocess.
env = os.environ.copy()
del env["PIP_REQUIRE_VIRTUALENV"]
subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", f"{name}=={version}"], env=env)
def install_package(name):
logging.debug(f"Using {PYTHON_PATH} for installation")
subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", name])
def check_package_version(name, required_version):
logging.info(f"Checking {name} version...")
out = subprocess.run([str(PYTHON_PATH), "-m", "pip", "show", name], capture_output=True)
version = VERSION_EXPR.search(out.stdout.decode())
if version and version.group() == required_version:
logging.info(f"{name} is up to date")
return True
else:
logging.info(f"{name} need an update")
return False
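`VERSION_EXPR` is worth a second look: `\d+\.\d+\.\d+\w\d+` only matches pre-release style versions such as `0.0.21a15` (three numbers plus a single letter and digits), so a plain release like `1.2.3` would never match and `check_package_version` would fall through to the update branch. For example:

```python
import re

VERSION_EXPR = re.compile(r'\d+\.\d+\.\d+\w\d+')

# Typical `pip show` output for a pre-release build:
pip_show_output = "Name: replication\nVersion: 0.0.21a15\nLocation: ..."
match = VERSION_EXPR.search(pip_show_output)
assert match is not None and match.group() == "0.0.21a15"

# A plain release number has no letter+digits suffix, so it never matches:
assert VERSION_EXPR.search("Version: 1.2.3\n") is None
```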
def get_ip():
"""
@@ -100,9 +78,7 @@ def setup(dependencies, python_path):
if not module_can_be_imported("pip"):
install_pip()
for package_name, package_version in dependencies:
if not module_can_be_imported(package_name):
install_package(package_name, package_version)
for module_name, package_name in dependencies:
if not module_can_be_imported(module_name):
install_package(package_name)
module_can_be_imported(package_name)
elif not check_package_version(package_name, package_version):
install_package(package_name, package_version)

View File

View File

@@ -25,81 +25,31 @@ import string
import time
from operator import itemgetter
from pathlib import Path
import shutil
from pathlib import Path
from queue import Queue
from subprocess import PIPE, Popen, TimeoutExpired
import zmq
import bpy
import mathutils
from bpy.app.handlers import persistent
from . import bl_types, delayable, environment, presence, ui, utils
from replication.constants import (FETCHED, STATE_ACTIVE,
STATE_INITIAL,
STATE_SYNCING, RP_COMMON, UP)
from replication.data import ReplicatedDataFactory
from replication.exception import NonAuthorizedOperationError
from replication.interface import session
from .libs.replication.replication.constants import (FETCHED, STATE_ACTIVE,
STATE_INITIAL,
STATE_SYNCING)
from .libs.replication.replication.data import ReplicatedDataFactory
from .libs.replication.replication.exception import NonAuthorizedOperationError
from .libs.replication.replication.interface import Session
background_execution_queue = Queue()
client = None
delayables = []
stop_modal_executor = False
modal_executor_queue = None
def session_callback(name):
""" Session callback wrapper
This allows encapsulating session callbacks in background_execution_queue.
That way, callbacks are executed from the main thread.
"""
def func_wrapper(func):
@session.register(name)
def add_background_task():
background_execution_queue.put(func)
return add_background_task
return func_wrapper
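`session_callback` is a decorator factory: the replication layer's event does not run the callback directly, it only enqueues it, and `MainThreadExecutor` later pops and runs it on Blender's main thread. The wrapping mechanics, with the session registry replaced by a plain dict (`events` is a stand-in, not part of the real replication API):

```python
from queue import Queue

background_execution_queue = Queue()
events = {}  # stand-in for session.register(name)

def session_callback(name):
    def func_wrapper(func):
        def add_background_task():
            # Only enqueue; the real work runs later on the main thread.
            background_execution_queue.put(func)
        events[name] = add_background_task
        return add_background_task
    return func_wrapper

ran = []

@session_callback('on_connection')
def initialize_session():
    ran.append('init')

events['on_connection']()           # the network thread fires the event...
background_execution_queue.get()()  # ...the main thread runs it later
assert ran == ['init']
```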
@session_callback('on_connection')
def initialize_session():
"""Session connection init hander
"""
settings = utils.get_preferences()
runtime_settings = bpy.context.window_manager.session
# Step 1: Construct nodes
for node in session._graph.list_ordered():
node_ref = session.get(node)
if node_ref.state == FETCHED:
node_ref.resolve()
# Step 2: Load nodes
for node in session._graph.list_ordered():
node_ref = session.get(node)
if node_ref.state == FETCHED:
node_ref.apply()
# Step 3: Launch presence overlay
if runtime_settings.enable_presence:
presence.renderer.run()
# Step 4: Register blender timers
for d in delayables:
d.register()
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
@session_callback('on_exit')
def on_connection_end():
"""Session connection finished handler
"""
def unregister_delayables():
global delayables, stop_modal_executor
settings = utils.get_preferences()
# Step 1: Unregister blender timers
for d in delayables:
try:
d.unregister()
@@ -108,21 +58,9 @@ def on_connection_end():
stop_modal_executor = True
# Step 2: Unregister presence renderer
presence.renderer.stop()
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.remove(
depsgraph_evaluation)
# Step 3: remove file handled
logger = logging.getLogger()
for handler in logger.handlers:
if isinstance(handler, logging.FileHandler):
logger.removeHandler(handler)
# OPERATORS
class SessionStartOperator(bpy.types.Operator):
bl_idname = "session.start"
bl_label = "start"
@@ -135,38 +73,17 @@ class SessionStartOperator(bpy.types.Operator):
return True
def execute(self, context):
global delayables
global client, delayables
settings = utils.get_preferences()
runtime_settings = context.window_manager.session
users = bpy.data.window_managers['WinMan'].online_users
admin_pass = runtime_settings.password
use_extern_update = settings.update_method == 'DEPSGRAPH'
unregister_delayables()
users.clear()
delayables.clear()
logger = logging.getLogger()
if len(logger.handlers) == 1:
formatter = logging.Formatter(
fmt='%(asctime)s CLIENT %(levelname)-8s %(message)s',
datefmt='%H:%M:%S'
)
log_directory = os.path.join(
settings.cache_directory,
"multiuser_client.log")
os.makedirs(settings.cache_directory, exist_ok=True)
handler = logging.FileHandler(log_directory, mode='w')
logger.addHandler(handler)
for handler in logger.handlers:
if isinstance(handler, logging.NullHandler):
continue
handler.setFormatter(formatter)
bpy_factory = ReplicatedDataFactory()
supported_bl_types = []
@@ -178,35 +95,24 @@ class SessionStartOperator(bpy.types.Operator):
supported_bl_types.append(type_module_class.bl_id)
if type_impl_name not in settings.supported_datablocks:
logging.info(f"{type_impl_name} not found, \
regenerating type settings...")
settings.generate_supported_types()
# Retrieve local replicated type settings
type_local_config = settings.supported_datablocks[type_impl_name]
bpy_factory.register_type(
type_module_class.bl_class,
type_module_class,
timer=type_local_config.bl_delay_refresh*1000,
automatic=type_local_config.auto_push,
check_common=type_module_class.bl_check_common)
timer=type_local_config.bl_delay_refresh,
automatic=type_local_config.auto_push)
if settings.update_method == 'DEFAULT':
if type_local_config.bl_delay_apply > 0:
delayables.append(
delayable.ApplyTimer(
timout=type_local_config.bl_delay_apply,
target_type=type_module_class))
if type_local_config.bl_delay_apply > 0:
delayables.append(
delayable.ApplyTimer(
timout=type_local_config.bl_delay_apply,
target_type=type_module_class))
session.configure(
client = Session(
factory=bpy_factory,
python_path=bpy.app.binary_path_python,
external_update_handling=use_extern_update)
if settings.update_method == 'DEPSGRAPH':
delayables.append(delayable.ApplyTimer(
settings.depsgraph_update_rate/1000))
python_path=bpy.app.binary_path_python)
# Host a session
if self.host:
@@ -216,19 +122,16 @@ class SessionStartOperator(bpy.types.Operator):
runtime_settings.is_host = True
runtime_settings.internet_ip = environment.get_ip()
try:
for scene in bpy.data.scenes:
session.add(scene)
for scene in bpy.data.scenes:
client.add(scene)
session.host(
try:
client.host(
id=settings.username,
port=settings.port,
ipc_port=settings.ipc_port,
timeout=settings.connection_timeout,
password=admin_pass,
cache_directory=settings.cache_directory,
server_log_level=logging.getLevelName(
logging.getLogger().level),
password=admin_pass
)
except Exception as e:
self.report({'ERROR'}, repr(e))
@@ -238,11 +141,11 @@ class SessionStartOperator(bpy.types.Operator):
else:
if not runtime_settings.admin:
utils.clean_scene()
# regular session, no password needed
# regular client, no password needed
admin_pass = None
try:
session.connect(
client.connect(
id=settings.username,
address=settings.ip,
port=settings.port,
@@ -255,23 +158,21 @@ class SessionStartOperator(bpy.types.Operator):
logging.error(str(e))
# Background client updates service
#TODO: Refactoring
delayables.append(delayable.ClientUpdate())
delayables.append(delayable.DrawClient())
delayables.append(delayable.DynamicRightSelectTimer())
session_update = delayable.SessionStatusUpdate()
session_user_sync = delayable.SessionUserSync()
session_background_executor = delayable.MainThreadExecutor(
execution_queue=background_execution_queue)
# Launch drawing module
if runtime_settings.enable_presence:
presence.renderer.run()
session_update.register()
session_user_sync.register()
session_background_executor.register()
delayables.append(session_background_executor)
delayables.append(session_update)
delayables.append(session_user_sync)
# Register blender main thread tools
for d in delayables:
d.register()
global modal_executor_queue
modal_executor_queue = queue.Queue()
bpy.ops.session.apply_armature_operator()
self.report(
@@ -308,13 +209,15 @@ class SessionInitOperator(bpy.types.Operator):
return wm.invoke_props_dialog(self)
def execute(self, context):
global client
if self.init_method == 'EMPTY':
utils.clean_scene()
for scene in bpy.data.scenes:
session.add(scene)
client.add(scene)
session.init()
client.init()
return {"FINISHED"}
@@ -330,12 +233,11 @@ class SessionStopOperator(bpy.types.Operator):
return True
def execute(self, context):
global delayables, stop_modal_executor
global client, delayables, stop_modal_executor
if session:
if client:
try:
session.disconnect()
client.disconnect()
except Exception as e:
self.report({'ERROR'}, repr(e))
else:
@@ -357,11 +259,11 @@ class SessionKickOperator(bpy.types.Operator):
return True
def execute(self, context):
global delayables, stop_modal_executor
assert(session)
global client, delayables, stop_modal_executor
assert(client)
try:
session.kick(self.user)
client.kick(self.user)
except Exception as e:
self.report({'ERROR'}, repr(e))
@@ -388,8 +290,9 @@ class SessionPropertyRemoveOperator(bpy.types.Operator):
return True
def execute(self, context):
global client
try:
session.remove(self.property_path)
client.remove(self.property_path)
return {"FINISHED"}
except: # NonAuthorizedOperationError:
@@ -424,9 +327,10 @@ class SessionPropertyRightOperator(bpy.types.Operator):
def execute(self, context):
runtime_settings = context.window_manager.session
global client
if session:
session.change_owner(self.key, runtime_settings.clients)
if client:
client.change_owner(self.key, runtime_settings.clients)
return {"FINISHED"}
@@ -473,9 +377,10 @@ class SessionSnapUserOperator(bpy.types.Operator):
if event.type == 'TIMER':
area, region, rv3d = presence.view3d_find()
global client
if session:
target_ref = session.online_users.get(self.target_client)
if client:
target_ref = client.online_users.get(self.target_client)
if target_ref:
target_scene = target_ref['metadata']['scene_current']
@@ -484,16 +389,14 @@ class SessionSnapUserOperator(bpy.types.Operator):
if target_scene != context.scene.name:
blender_scene = bpy.data.scenes.get(target_scene, None)
if blender_scene is None:
self.report(
{'ERROR'}, f"Scene {target_scene} doesn't exist on the local client.")
self.report({'ERROR'}, f"Scene {target_scene} doesn't exist on the local client.")
session_sessings.time_snap_running = False
return {"CANCELLED"}
bpy.context.window.scene = blender_scene
# Update client viewmatrix
client_vmatrix = target_ref['metadata'].get(
'view_matrix', None)
client_vmatrix = target_ref['metadata'].get('view_matrix', None)
if client_vmatrix:
rv3d.view_matrix = mathutils.Matrix(client_vmatrix)
@ -546,8 +449,10 @@ class SessionSnapTimeOperator(bpy.types.Operator):
return {'CANCELLED'}
if event.type == 'TIMER':
if session:
target_ref = session.online_users.get(self.target_client)
global client
if client:
target_ref = client.online_users.get(self.target_client)
if target_ref:
context.scene.frame_current = target_ref['metadata']['frame_current']
@ -570,7 +475,9 @@ class SessionApply(bpy.types.Operator):
return True
def execute(self, context):
session.apply(self.target)
global client
client.apply(self.target)
return {"FINISHED"}
@ -588,9 +495,10 @@ class SessionCommit(bpy.types.Operator):
return True
def execute(self, context):
# session.get(uuid=target).diff()
session.commit(uuid=self.target)
session.push(self.target)
global client
# client.get(uuid=target).diff()
client.commit(uuid=self.target)
client.push(self.target)
return {"FINISHED"}
@ -608,17 +516,18 @@ class ApplyArmatureOperator(bpy.types.Operator):
return {'CANCELLED'}
if event.type == 'TIMER':
if session and session.state['STATE'] == STATE_ACTIVE:
nodes = session.list(filter=bl_types.bl_armature.BlArmature)
global client
if client and client.state['STATE'] == STATE_ACTIVE:
nodes = client.list(filter=bl_types.bl_armature.BlArmature)
for node in nodes:
node_ref = session.get(uuid=node)
node_ref = client.get(uuid=node)
if node_ref.state == FETCHED:
try:
session.apply(node)
client.apply(node)
except Exception as e:
logging.error("Fail to apply armature: {e}")
logging.error(f"Fail to apply armature: {e}")
return {'PASS_THROUGH'}
@ -637,35 +546,6 @@ class ApplyArmatureOperator(bpy.types.Operator):
stop_modal_executor = False
class ClearCache(bpy.types.Operator):
"Clear local session cache"
bl_idname = "session.clear_cache"
bl_label = "Modal Executor Operator"
@classmethod
def poll(cls, context):
return True
def execute(self, context):
cache_dir = utils.get_preferences().cache_directory
try:
for root, dirs, files in os.walk(cache_dir):
for name in files:
Path(root, name).unlink()
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"FINISHED"}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def draw(self, context):
row = self.layout
row.label(text="Do you really want to remove the local cache?")
classes = (
SessionStartOperator,
SessionStopOperator,
@ -678,7 +558,7 @@ classes = (
ApplyArmatureOperator,
SessionKickOperator,
SessionInitOperator,
ClearCache,
)
@ -690,60 +570,29 @@ def sanitize_deps_graph(dummy):
A future solution should be to avoid storing datablock references...
"""
if session and session.state['STATE'] == STATE_ACTIVE:
for node_key in session.list():
session.get(node_key).resolve()
global client
if client and client.state['STATE'] == STATE_ACTIVE:
for node_key in client.list():
client.get(node_key).resolve()
@persistent
def load_pre_handler(dummy):
if session and session.state['STATE'] in [STATE_ACTIVE, STATE_SYNCING]:
global client
if client and client.state['STATE'] in [STATE_ACTIVE, STATE_SYNCING]:
bpy.ops.session.stop()
@persistent
def update_client_frame(scene):
if session and session.state['STATE'] == STATE_ACTIVE:
session.update_user_metadata({
if client and client.state['STATE'] == STATE_ACTIVE:
client.update_user_metadata({
'frame_current': scene.frame_current
})
@persistent
def depsgraph_evaluation(scene):
if session and session.state['STATE'] == STATE_ACTIVE:
context = bpy.context
blender_depsgraph = bpy.context.view_layer.depsgraph
dependency_updates = [u for u in blender_depsgraph.updates]
settings = utils.get_preferences()
# NOTE: maybe we don't need to check each update but only the first
for update in reversed(dependency_updates):
# Is the object tracked ?
if update.id.uuid:
# Retrieve local version
node = session.get(update.id.uuid)
# Check our rights on this update:
# - if it's ours, or under common ownership and diffed, launch the
# update process
# - if it belongs to someone else, ignore the update (go deeper ?)
if node and node.owner in [session.id, RP_COMMON] and node.state == UP:
# Avoid slow geometry update
if 'EDIT' in context.mode and \
not settings.sync_during_editmode:
break
session.stash(node.uuid)
else:
# Distant update
continue
# else:
# # New items !
# logger.error("UPDATE: ADD")
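The ownership test at the heart of the handler above can be isolated as a pure function. A minimal sketch, assuming illustrative values for `replication.constants.RP_COMMON` and `UP` (the real constants live in the replication package):

```python
RP_COMMON = 'COMMON'  # assumed value from replication.constants
UP = 4                # assumed value from replication.constants

def should_stash(node_owner, node_state, local_id):
    # Stash an update only when the datablock is ours or under common
    # ownership, and its local replica is in the UP (up-to-date) state
    return node_owner in (local_id, RP_COMMON) and node_state == UP
```

Updates owned by another user fall through to the `continue` branch above.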
def register():
from bpy.utils import register_class
for cls in classes:
@ -757,8 +606,11 @@ def register():
def unregister():
if session and session.state['STATE'] == STATE_ACTIVE:
session.disconnect()
global client
if client and client.state['STATE'] == 2:
client.disconnect()
client = None
from bpy.utils import unregister_class
for cls in reversed(classes):
@ -769,3 +621,7 @@ def unregister():
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.frame_change_pre.remove(update_client_frame)
if __name__ == "__main__":
register()

View File

@ -20,14 +20,9 @@ import logging
import bpy
import string
import re
import os
from pathlib import Path
from . import bl_types, environment, addon_updater_ops, presence, ui
from .utils import get_preferences, get_expanded_icon
from replication.constants import RP_COMMON
from replication.interface import session
from . import utils, bl_types, environment, addon_updater_ops, presence, ui
from .libs.replication.replication.constants import RP_COMMON
IP_EXPR = re.compile(r'\d+\.\d+\.\d+\.\d+')
@ -51,7 +46,6 @@ def update_panel_category(self, context):
ui.SESSION_PT_settings.bl_category = self.panel_category
ui.register()
def update_ip(self, context):
ip = IP_EXPR.search(self.ip)
@ -61,35 +55,14 @@ def update_ip(self, context):
logging.error("Wrong IP format")
self['ip'] = "127.0.0.1"
def update_port(self, context):
max_port = self.port + 3
if self.ipc_port < max_port and \
self['ipc_port'] >= self.port:
logging.error(
"IPC Port in conflict with the port, assigning a random value")
self['ipc_port'] >= self.port:
logging.error("IPC Port in conflict with the port, assigning a random value")
self['ipc_port'] = random.randrange(self.port+4, 10000)
def update_directory(self, context):
new_dir = Path(self.cache_directory)
if new_dir.exists() and any(Path(self.cache_directory).iterdir()):
logging.error("The folder is not empty, choose another one.")
self['cache_directory'] = environment.DEFAULT_CACHE_DIR
elif not new_dir.exists():
logging.info("Target cache folder doesn't exist, creating it.")
os.makedirs(self.cache_directory, exist_ok=True)
def set_log_level(self, value):
logging.getLogger().setLevel(value)
def get_log_level(self):
return logging.getLogger().level
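These removed accessors simply proxy the preference to the root logger, and the same pattern works outside Blender (the `self` parameter of the property callbacks is dropped here):

```python
import logging

def set_log_level(value):
    # The property setter pushes the chosen verbosity to the root logger...
    logging.getLogger().setLevel(value)

def get_log_level():
    # ...and the getter reads it back, so the UI always shows the live level
    return logging.getLogger().level
```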
class ReplicatedDatablock(bpy.types.PropertyGroup):
type_name: bpy.props.StringProperty()
bl_name: bpy.props.StringProperty()
@ -100,44 +73,11 @@ class ReplicatedDatablock(bpy.types.PropertyGroup):
icon: bpy.props.StringProperty()
def set_sync_render_settings(self, value):
self['sync_render_settings'] = value
if session and bpy.context.scene.uuid and value:
bpy.ops.session.apply('INVOKE_DEFAULT', target=bpy.context.scene.uuid)
def set_sync_active_camera(self, value):
self['sync_active_camera'] = value
if session and bpy.context.scene.uuid and value:
bpy.ops.session.apply('INVOKE_DEFAULT', target=bpy.context.scene.uuid)
class ReplicationFlags(bpy.types.PropertyGroup):
def get_sync_render_settings(self):
return self.get('sync_render_settings', True)
def get_sync_active_camera(self):
return self.get('sync_active_camera', True)
sync_render_settings: bpy.props.BoolProperty(
name="Synchronize render settings",
description="Synchronize render settings (eevee and cycles only)",
default=True,
set=set_sync_render_settings,
get=get_sync_render_settings)
sync_during_editmode: bpy.props.BoolProperty(
name="Edit mode updates",
description="Enable object updates in edit mode (impacts performance!)",
default=False
)
sync_active_camera: bpy.props.BoolProperty(
name="Synchronize active camera",
description="Synchronize the active camera",
default=True,
get=get_sync_active_camera,
set=set_sync_active_camera
)
default=True)
class SessionPrefs(bpy.types.AddonPreferences):
@ -170,8 +110,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
ipc_port: bpy.props.IntProperty(
name="ipc_port",
description='internal ttl port (only useful for multiple local instances)',
default=random.randrange(5570, 70000),
update=update_port,
default=5561,
update=update_port
)
init_method: bpy.props.EnumProperty(
name='init_method',
@ -183,33 +123,12 @@ class SessionPrefs(bpy.types.AddonPreferences):
cache_directory: bpy.props.StringProperty(
name="cache directory",
subtype="DIR_PATH",
default=environment.DEFAULT_CACHE_DIR,
update=update_directory)
default=environment.DEFAULT_CACHE_DIR)
connection_timeout: bpy.props.IntProperty(
name='connection timeout',
description='connection timeout before disconnection',
default=1000
)
update_method: bpy.props.EnumProperty(
name='update method',
description='replication update method',
items=[
('DEFAULT', "Default", "Default: Use threads to monitor datablock changes"),
('DEPSGRAPH', "Depsgraph",
"Experimental: Use the blender dependency graph to trigger updates"),
],
)
# Replication update settings
depsgraph_update_rate: bpy.props.IntProperty(
name='depsgraph update rate',
description='Dependency graph update rate (milliseconds)',
default=100
)
clear_memory_filecache: bpy.props.BoolProperty(
name="Clear memory filecache",
description="Remove filecache from memory",
default=False
)
# for UI
category: bpy.props.EnumProperty(
name="Category",
@ -220,18 +139,17 @@ class SessionPrefs(bpy.types.AddonPreferences):
],
default='CONFIG'
)
# WIP
logging_level: bpy.props.EnumProperty(
name="Log level",
description="Log verbosity level",
items=[
('ERROR', "error", "show only errors", logging.ERROR),
('WARNING', "warning", "only show warnings and errors", logging.WARNING),
('INFO', "info", "default level", logging.INFO),
('DEBUG', "debug", "show all logs", logging.DEBUG),
('ERROR', "error", "show only errors"),
('WARNING', "warning", "only show warnings and errors"),
('INFO', "info", "default level"),
('DEBUG', "debug", "show all logs"),
],
default='INFO',
set=set_log_level,
get=get_log_level
default='INFO'
)
conf_session_identity_expanded: bpy.props.BoolProperty(
name="Identity",
@ -263,26 +181,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
description="Interface",
default=False
)
sidebar_advanced_rep_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_rep_expanded",
description="sidebar_advanced_rep_expanded",
default=False
)
sidebar_advanced_log_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_log_expanded",
description="sidebar_advanced_log_expanded",
default=False
)
sidebar_advanced_net_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_net_expanded",
description="sidebar_advanced_net_expanded",
default=False
)
sidebar_advanced_cache_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_cache_expanded",
description="sidebar_advanced_cache_expanded",
default=False
)
auto_check_update: bpy.props.BoolProperty(
name="Auto-check for Update",
@ -335,8 +233,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_identity_expanded", text="User information",
icon=get_expanded_icon(self.conf_session_identity_expanded),
emboss=False)
icon='DISCLOSURE_TRI_DOWN' if self.conf_session_identity_expanded
else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_identity_expanded:
box.row().prop(self, "username", text="name")
box.row().prop(self, "client_color", text="color")
@ -345,26 +243,23 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_net_expanded", text="Networking",
icon=get_expanded_icon(self.conf_session_net_expanded),
emboss=False)
icon='DISCLOSURE_TRI_DOWN' if self.conf_session_net_expanded
else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_net_expanded:
box.row().prop(self, "ip", text="Address")
row = box.row()
row.label(text="Port:")
row.prop(self, "port", text="")
row.prop(self, "port", text="Port")
row = box.row()
row.label(text="Init the session from:")
row.prop(self, "init_method", text="")
row = box.row()
row.label(text="Update method:")
row.prop(self, "update_method", text="")
table = box.box()
table.row().prop(
self, "conf_session_timing_expanded", text="Refresh rates",
icon=get_expanded_icon(self.conf_session_timing_expanded),
emboss=False)
icon='DISCLOSURE_TRI_DOWN' if self.conf_session_timing_expanded
else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_timing_expanded:
line = table.row()
@ -382,8 +277,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_hosting_expanded", text="Hosting",
icon=get_expanded_icon(self.conf_session_hosting_expanded),
emboss=False)
icon='DISCLOSURE_TRI_DOWN' if self.conf_session_hosting_expanded
else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_hosting_expanded:
row = box.row()
row.label(text="Init the session from:")
@ -393,24 +288,23 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_cache_expanded", text="Cache",
icon=get_expanded_icon(self.conf_session_cache_expanded),
emboss=False)
icon='DISCLOSURE_TRI_DOWN' if self.conf_session_cache_expanded
else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_cache_expanded:
box.row().prop(self, "cache_directory", text="Cache directory")
box.row().prop(self, "clear_memory_filecache", text="Clear memory filecache")
# INTERFACE SETTINGS
box = grid.box()
box.prop(
self, "conf_session_ui_expanded", text="Interface",
icon=get_expanded_icon(self.conf_session_ui_expanded),
icon='DISCLOSURE_TRI_DOWN' if self.conf_session_ui_expanded else 'DISCLOSURE_TRI_RIGHT',
emboss=False)
if self.conf_session_ui_expanded:
box.row().prop(self, "panel_category", text="Panel category", expand=True)
if self.category == 'UPDATE':
from . import addon_updater_ops
addon_updater_ops.update_settings_ui(self, context)
addon_updater_ops.update_settings_ui_condensed(self, context)
def generate_supported_types(self):
self.supported_datablocks.clear()
@ -437,10 +331,10 @@ def client_list_callback(scene, context):
items = [(RP_COMMON, RP_COMMON, "")]
username = get_preferences().username
if session:
client_ids = session.online_users.keys()
username = utils.get_preferences().username
cli = operators.client
if cli:
client_ids = cli.online_users.keys()
for id in client_ids:
name_desc = id
if id == username:

View File

@ -19,7 +19,6 @@
import copy
import logging
import math
import traceback
import bgl
import blf
@ -61,8 +60,7 @@ def refresh_sidebar_view():
"""
area, region, rv3d = view3d_find()
if area:
area.regions[3].tag_redraw()
area.regions[3].tag_redraw()
def get_target(region, rv3d, coord):
target = [0, 0, 0]
@ -313,10 +311,10 @@ class DrawFactory(object):
self.d2d_items[client_id] = (position[1], client_id, color)
except Exception as e:
logging.debug(f"Draw client exception: {e} \n {traceback.format_exc()}\n pos:{position},ind:{indices}")
logging.error(f"Draw client exception: {e}")
def draw3d_callback(self):
bgl.glLineWidth(2.)
bgl.glLineWidth(1.5)
bgl.glEnable(bgl.GL_DEPTH_TEST)
bgl.glEnable(bgl.GL_BLEND)
bgl.glEnable(bgl.GL_LINE_SMOOTH)

View File

@ -18,8 +18,8 @@
import bpy
from .utils import get_preferences, get_expanded_icon, get_folder_size
from replication.constants import (ADDED, ERROR, FETCHED,
from . import operators, utils
from .libs.replication.replication.constants import (ADDED, ERROR, FETCHED,
MODIFIED, RP_COMMON, UP,
STATE_ACTIVE, STATE_AUTH,
STATE_CONFIG, STATE_SYNCING,
@ -27,16 +27,13 @@ from replication.constants import (ADDED, ERROR, FETCHED,
STATE_WAITING, STATE_QUITTING,
STATE_LOBBY,
STATE_LAUNCHING_SERVICES)
from replication import __version__
from replication.interface import session
ICONS_PROP_STATES = ['TRIA_DOWN', # ADDED
'TRIA_UP', # COMMITED
'KEYTYPE_KEYFRAME_VEC', # PUSHED
'TRIA_DOWN', # FETCHED
'FILE_REFRESH', # UP
'TRIA_UP',
'ERROR'] # CHANGED
'TRIA_UP'] # CHANGED
def printProgressBar(iteration, total, prefix='', suffix='', decimals=1, length=100, fill='', fill_empty=' '):
@ -53,8 +50,6 @@ def printProgressBar(iteration, total, prefix='', suffix='', decimals=1, length=
From here:
https://gist.github.com/greenstick/b23e475d2bfdc3a82e34eaa1f6781ee4
"""
if total == 0:
return ""
filledLength = int(length * iteration // total)
bar = fill * filledLength + fill_empty * (length - filledLength)
return f"{prefix} |{bar}| {iteration}/{total}{suffix}"
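Note that the incoming version drops the `if total == 0` guard, so an empty repository would raise `ZeroDivisionError`. A guarded sketch, assuming the original `'█'` fill character that the compare view shows stripped to `''`:

```python
def print_progress_bar(iteration, total, prefix='', suffix='',
                       length=16, fill='█', fill_empty=' '):
    # Empty jobs render as an empty string instead of dividing by zero
    if total == 0:
        return ""
    filled_length = int(length * iteration // total)
    bar = fill * filled_length + fill_empty * (length - filled_length)
    return f"{prefix} |{bar}| {iteration}/{total}{suffix}"
```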
@ -89,16 +84,16 @@ def get_state_str(state):
class SESSION_PT_settings(bpy.types.Panel):
"""Settings panel"""
bl_idname = "MULTIUSER_SETTINGS_PT_panel"
bl_label = " "
bl_label = ""
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
def draw_header(self, context):
layout = self.layout
if session and session.state['STATE'] != STATE_INITIAL:
cli_state = session.state
state = session.state.get('STATE')
if operators.client and operators.client.state['STATE'] != STATE_INITIAL:
cli_state = operators.client.state
state = operators.client.state.get('STATE')
connection_icon = "KEYTYPE_MOVING_HOLD_VEC"
if state == STATE_ACTIVE:
@ -108,54 +103,72 @@ class SESSION_PT_settings(bpy.types.Panel):
layout.label(text=f"Session - {get_state_str(cli_state['STATE'])}", icon=connection_icon)
else:
layout.label(text=f"Session - v{__version__}",icon="PROP_OFF")
layout.label(text="Session",icon="PROP_OFF")
def draw(self, context):
layout = self.layout
layout.use_property_split = True
row = layout.row()
runtime_settings = context.window_manager.session
settings = get_preferences()
settings = utils.get_preferences()
if hasattr(context.window_manager, 'session'):
# STATE INITIAL
if not session \
or (session and session.state['STATE'] == STATE_INITIAL):
if not operators.client \
or (operators.client and operators.client.state['STATE'] == STATE_INITIAL):
pass
else:
cli_state = session.state
cli_state = operators.client.state
row = layout.row()
current_state = cli_state['STATE']
info_msg = None
if current_state in [STATE_ACTIVE]:
row = row.split(factor=0.3)
row.prop(settings.sync_flags, "sync_render_settings",text="",icon_only=True, icon='SCENE')
row.prop(settings.sync_flags, "sync_during_editmode", text="",icon_only=True, icon='EDITMODE_HLT')
row.prop(settings.sync_flags, "sync_active_camera", text="",icon_only=True, icon='OBJECT_DATAMODE')
row= layout.row()
# STATE ACTIVE
if current_state in [STATE_ACTIVE, STATE_LOBBY]:
row.operator("session.stop", icon='QUIT', text="Exit")
row = layout.row()
if runtime_settings.is_host:
row = row.box()
row.label(text=f"{runtime_settings.internet_ip}:{settings.port}", icon='INFO')
row = layout.row()
if current_state in [STATE_ACTIVE] and runtime_settings.is_host:
info_msg = f"LAN: {runtime_settings.internet_ip}"
if current_state == STATE_LOBBY:
info_msg = "Waiting for the session to start."
# CONNECTION STATE
elif current_state in [STATE_SRV_SYNC,
STATE_SYNCING,
STATE_AUTH,
STATE_CONFIG,
STATE_WAITING]:
if info_msg:
info_box = row.box()
info_box.row().label(text=info_msg,icon='INFO')
if cli_state['STATE'] in [STATE_SYNCING, STATE_SRV_SYNC, STATE_WAITING]:
box = row.box()
box.label(text=printProgressBar(
cli_state['CURRENT'],
cli_state['TOTAL'],
length=16
))
# Progress bar
if current_state in [STATE_SYNCING, STATE_SRV_SYNC, STATE_WAITING]:
info_box = row.box()
info_box.row().label(text=printProgressBar(
cli_state['CURRENT'],
cli_state['TOTAL'],
row = layout.row()
row.operator("session.stop", icon='QUIT', text="CANCEL")
elif current_state == STATE_QUITTING:
row = layout.row()
box = row.box()
num_online_services = 0
for name, state in operators.client.services_state.items():
if state == STATE_ACTIVE:
num_online_services += 1
total_online_services = len(
operators.client.services_state)
box.label(text=printProgressBar(
total_online_services-num_online_services,
total_online_services,
length=16
))
layout.row().operator("session.stop", icon='QUIT', text="Exit")
class SESSION_PT_settings_network(bpy.types.Panel):
bl_idname = "MULTIUSER_SETTINGS_NETWORK_PT_panel"
@ -166,8 +179,8 @@ class SESSION_PT_settings_network(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] == 0)
return not operators.client \
or (operators.client and operators.client.state['STATE'] == 0)
def draw_header(self, context):
self.layout.label(text="", icon='URL')
@ -176,7 +189,7 @@ class SESSION_PT_settings_network(bpy.types.Panel):
layout = self.layout
runtime_settings = context.window_manager.session
settings = get_preferences()
settings = utils.get_preferences()
# USER SETTINGS
row = layout.row()
@ -224,8 +237,8 @@ class SESSION_PT_settings_user(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] == 0)
return not operators.client \
or (operators.client and operators.client.state['STATE'] == 0)
def draw_header(self, context):
self.layout.label(text="", icon='USER')
@ -234,7 +247,7 @@ class SESSION_PT_settings_user(bpy.types.Panel):
layout = self.layout
runtime_settings = context.window_manager.session
settings = get_preferences()
settings = utils.get_preferences()
row = layout.row()
# USER SETTINGS
@ -255,8 +268,8 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] == 0)
return not operators.client \
or (operators.client and operators.client.state['STATE'] == 0)
def draw_header(self, context):
self.layout.label(text="", icon='PREFERENCES')
@ -265,107 +278,44 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
layout = self.layout
runtime_settings = context.window_manager.session
settings = get_preferences()
settings = utils.get_preferences()
net_section = layout.row().box()
net_section.prop(
settings,
"sidebar_advanced_net_expanded",
text="Network",
icon=get_expanded_icon(settings.sidebar_advanced_net_expanded),
emboss=False)
if settings.sidebar_advanced_net_expanded:
net_section_row = net_section.row()
net_section_row.label(text="IPC Port:")
net_section_row.prop(settings, "ipc_port", text="")
net_section_row = net_section.row()
net_section_row.label(text="Timeout (ms):")
net_section_row.prop(settings, "connection_timeout", text="")
net_section.label(text="Network ", icon='TRIA_DOWN')
net_section_row = net_section.row()
net_section_row.label(text="IPC Port:")
net_section_row.prop(settings, "ipc_port", text="")
net_section_row = net_section.row()
net_section_row.label(text="Timeout (ms):")
net_section_row.prop(settings, "connection_timeout", text="")
replication_section = layout.row().box()
replication_section.prop(
settings,
"sidebar_advanced_rep_expanded",
text="Replication",
icon=get_expanded_icon(settings.sidebar_advanced_rep_expanded),
emboss=False)
if settings.sidebar_advanced_rep_expanded:
replication_section_row = replication_section.row()
replication_section_row.label(text="Sync flags", icon='COLLECTION_NEW')
replication_section_row = replication_section.row()
replication_section.label(text="Replication ", icon='TRIA_DOWN')
replication_section_row = replication_section.row()
if runtime_settings.session_mode == 'HOST':
replication_section_row.prop(settings.sync_flags, "sync_render_settings")
replication_section_row = replication_section.row()
replication_section_row.prop(settings.sync_flags, "sync_active_camera")
replication_section_row = replication_section.row()
replication_section_row.prop(settings.sync_flags, "sync_during_editmode")
replication_section_row = replication_section.row()
if settings.sync_flags.sync_during_editmode:
warning = replication_section_row.box()
warning.label(text="Don't use this with heavy meshes!", icon='ERROR')
replication_section_row = replication_section.row()
replication_section_row = replication_section.row()
replication_section_row.label(text="Per data type timers:")
replication_section_row = replication_section.row()
# Replication frequencies
flow = replication_section_row .grid_flow(
row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
line = flow.row(align=True)
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
replication_section_row.label(text="Update method", icon='RECOVER_LAST')
replication_section_row = replication_section.row()
replication_section_row.prop(settings, "update_method", expand=True)
replication_section_row = replication_section.row()
replication_timers = replication_section_row.box()
replication_timers.label(text="Replication timers", icon='TIME')
if settings.update_method == "DEFAULT":
replication_timers = replication_timers.row()
# Replication frequencies
flow = replication_timers.grid_flow(
row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
line = flow.row(align=True)
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
for item in settings.supported_datablocks:
line = flow.row(align=True)
line.prop(item, "auto_push", text="", icon=item.icon)
line.separator()
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
for item in settings.supported_datablocks:
line = flow.row(align=True)
line.prop(item, "auto_push", text="", icon=item.icon)
line.separator()
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
else:
replication_timers = replication_timers.row()
replication_timers.label(text="Update rate (ms):")
replication_timers.prop(settings, "depsgraph_update_rate", text="")
cache_section = layout.row().box()
cache_section.prop(
settings,
"sidebar_advanced_cache_expanded",
text="Cache",
icon=get_expanded_icon(settings.sidebar_advanced_cache_expanded),
emboss=False)
if settings.sidebar_advanced_cache_expanded:
cache_section_row = cache_section.row()
cache_section_row.label(text="Cache directory:")
cache_section_row = cache_section.row()
cache_section_row.prop(settings, "cache_directory", text="")
cache_section_row = cache_section.row()
cache_section_row.label(text="Clear memory filecache:")
cache_section_row.prop(settings, "clear_memory_filecache", text="")
cache_section_row = cache_section.row()
cache_section_row.operator('session.clear_cache', text=f"Clear cache ({get_folder_size(settings.cache_directory)})")
log_section = layout.row().box()
log_section.prop(
settings,
"sidebar_advanced_log_expanded",
text="Logging",
icon=get_expanded_icon(settings.sidebar_advanced_log_expanded),
emboss=False)
if settings.sidebar_advanced_log_expanded:
log_section_row = log_section.row()
log_section_row.label(text="Log level:")
log_section_row.prop(settings, 'logging_level', text="")
class SESSION_PT_user(bpy.types.Panel):
bl_idname = "MULTIUSER_USER_PT_panel"
bl_label = "Online users"
@ -375,7 +325,7 @@ class SESSION_PT_user(bpy.types.Panel):
@classmethod
def poll(cls, context):
return session and session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
return operators.client and operators.client.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
def draw_header(self, context):
self.layout.label(text="", icon='USER')
@ -384,7 +334,7 @@ class SESSION_PT_user(bpy.types.Panel):
layout = self.layout
online_users = context.window_manager.online_users
selected_user = context.window_manager.user_index
settings = get_preferences()
settings = utils.get_preferences()
active_user = online_users[selected_user] if len(
online_users)-1 >= selected_user else 0
runtime_settings = context.window_manager.session
@ -406,21 +356,19 @@ class SESSION_PT_user(bpy.types.Panel):
if active_user != 0 and active_user.username != settings.username:
row = layout.row()
user_operations = row.split()
if session.state['STATE'] == STATE_ACTIVE:
user_operations.alert = context.window_manager.session.time_snap_running
user_operations.operator(
"session.snapview",
text="",
icon='VIEW_CAMERA').target_client = active_user.username
user_operations.alert = context.window_manager.session.time_snap_running
user_operations.operator(
"session.snapview",
text="",
icon='VIEW_CAMERA').target_client = active_user.username
user_operations.alert = context.window_manager.session.user_snap_running
user_operations.operator(
"session.snaptime",
text="",
icon='TIME').target_client = active_user.username
user_operations.alert = context.window_manager.session.user_snap_running
user_operations.operator(
"session.snaptime",
text="",
icon='TIME').target_client = active_user.username
if session.online_users[settings.username]['admin']:
if operators.client.online_users[settings.username]['admin']:
user_operations.operator(
"session.kick",
text="",
@ -429,7 +377,8 @@ class SESSION_PT_user(bpy.types.Panel):
class SESSION_UL_users(bpy.types.UIList):
def draw_item(self, context, layout, data, item, icon, active_data, active_propname, index, flt_flag):
settings = get_preferences()
session = operators.client
settings = utils.get_preferences()
is_local_user = item.username == settings.username
ping = '-'
frame_current = '-'
@ -441,8 +390,8 @@ class SESSION_UL_users(bpy.types.UIList):
ping = str(user['latency'])
metadata = user.get('metadata')
if metadata and 'frame_current' in metadata:
frame_current = str(metadata.get('frame_current','-'))
scene_current = metadata.get('scene_current','-')
frame_current = str(metadata['frame_current'])
scene_current = metadata['scene_current']
if user['admin']:
status_icon = 'FAKE_USER_ON'
split = layout.split(factor=0.35)
@ -463,8 +412,8 @@ class SESSION_PT_presence(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] in [STATE_INITIAL, STATE_ACTIVE])
return not operators.client \
or (operators.client and operators.client.state['STATE'] in [STATE_INITIAL, STATE_ACTIVE])
def draw_header(self, context):
self.layout.prop(context.window_manager.session,
@ -482,18 +431,48 @@ class SESSION_PT_presence(bpy.types.Panel):
row.active = settings.presence_show_user
row.prop(settings, "presence_show_far_user")
def draw_property(context, parent, property_uuid, level=0):
settings = get_preferences()
runtime_settings = context.window_manager.session
item = session.get(uuid=property_uuid)
area_msg = parent.row(align=True)
class SESSION_PT_services(bpy.types.Panel):
bl_idname = "MULTIUSER_SERVICE_PT_panel"
bl_label = "Services"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
return operators.client and operators.client.state['STATE'] == 2
def draw_header(self, context):
self.layout.label(text="", icon='FILE_CACHE')
def draw(self, context):
layout = self.layout
online_users = context.window_manager.online_users
selected_user = context.window_manager.user_index
settings = context.window_manager.session
active_user = online_users[selected_user] if len(online_users)-1 >= selected_user else 0
# Create a simple row.
for name, state in operators.client.services_state.items():
row = layout.row()
row.label(text=name)
row.label(text=get_state_str(state))
def draw_property(context, parent, property_uuid, level=0):
settings = utils.get_preferences()
runtime_settings = context.window_manager.session
item = operators.client.get(uuid=property_uuid)
if item.state == ERROR:
area_msg.alert=True
else:
area_msg.alert=False
return
area_msg = parent.row(align=True)
if level > 0:
for i in range(level):
area_msg.label(text="")
line = area_msg.box()
name = item.data['name'] if item.data else item.uuid
@ -506,8 +485,8 @@ def draw_property(context, parent, property_uuid, level=0):
# Operations
have_right_to_modify = (item.owner == settings.username or \
item.owner == RP_COMMON) and item.state != ERROR
have_right_to_modify = item.owner == settings.username or \
item.owner == RP_COMMON
if have_right_to_modify:
detail_item_box.operator(
@@ -543,6 +522,7 @@ def draw_property(context, parent, property_uuid, level=0):
else:
detail_item_box.label(text="", icon="DECORATE_LOCKED")
class SESSION_PT_repository(bpy.types.Panel):
bl_idname = "MULTIUSER_PROPERTIES_PT_panel"
bl_label = "Repository"
@@ -552,17 +532,9 @@ class SESSION_PT_repository(bpy.types.Panel):
@classmethod
def poll(cls, context):
settings = get_preferences()
admin = False
if session and hasattr(session,'online_users'):
usr = session.online_users.get(settings.username)
if usr:
admin = usr['admin']
return hasattr(context.window_manager, 'session') and \
session and \
(session.state['STATE'] == STATE_ACTIVE or \
session.state['STATE'] == STATE_LOBBY and admin)
operators.client and \
operators.client.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
def draw_header(self, context):
self.layout.label(text="", icon='OUTLINER_OB_GROUP_INSTANCE')
@@ -571,9 +543,10 @@ class SESSION_PT_repository(bpy.types.Panel):
layout = self.layout
# Filters
settings = get_preferences()
settings = utils.get_preferences()
runtime_settings = context.window_manager.session
session = operators.client
usr = session.online_users.get(settings.username)
row = layout.row()
@@ -599,11 +572,11 @@ class SESSION_PT_repository(bpy.types.Panel):
types_filter = [t.type_name for t in settings.supported_datablocks
if t.use_as_filter]
key_to_filter = session.list(
filter_owner=settings.username) if runtime_settings.filter_owned else session.list()
key_to_filter = operators.client.list(
filter_owner=settings.username) if runtime_settings.filter_owned else operators.client.list()
client_keys = [key for key in key_to_filter
if session.get(uuid=key).str_type
if operators.client.get(uuid=key).str_type
in types_filter]
if client_keys:
@@ -619,35 +592,6 @@ class SESSION_PT_repository(bpy.types.Panel):
else:
row.label(text="Waiting to start")
class VIEW3D_PT_overlay_session(bpy.types.Panel):
bl_space_type = 'VIEW_3D'
bl_region_type = 'HEADER'
bl_parent_id = 'VIEW3D_PT_overlay'
bl_label = "Multi-user"
@classmethod
def poll(cls, context):
return True
def draw(self, context):
layout = self.layout
view = context.space_data
overlay = view.overlay
display_all = overlay.show_overlays
col = layout.column()
col.active = display_all
row = col.row(align=True)
settings = context.window_manager.session
layout.active = settings.enable_presence
col = layout.column()
col.prop(settings, "presence_show_selected")
col.prop(settings, "presence_show_user")
row = layout.column()
row.active = settings.presence_show_user
row.prop(settings, "presence_show_far_user")
classes = (
SESSION_UL_users,
@@ -657,8 +601,9 @@ classes = (
SESSION_PT_presence,
SESSION_PT_advanced_settings,
SESSION_PT_user,
SESSION_PT_services,
SESSION_PT_repository,
VIEW3D_PT_overlay_session,
)

View File

@@ -21,10 +21,8 @@ import logging
import os
import sys
import time
from collections.abc import Iterable
from pathlib import Path
from uuid import uuid4
import math
from collections.abc import Iterable
import bpy
import mathutils
@@ -41,7 +39,7 @@ def find_from_attr(attr_name, attr_value, list):
def get_datablock_users(datablock):
users = []
supported_types = get_preferences().supported_datablocks
supported_types = get_preferences().supported_datablocks
if hasattr(datablock, 'users_collection') and datablock.users_collection:
users.extend(list(datablock.users_collection))
if hasattr(datablock, 'users_scene') and datablock.users_scene:
@@ -49,7 +47,7 @@ def get_datablock_users(datablock):
if hasattr(datablock, 'users_group') and datablock.users_scene:
users.extend(list(datablock.users_scene))
for datatype in supported_types:
if datatype.bl_name != 'users' and hasattr(bpy.data, datatype.bl_name):
if datatype.bl_name != 'users':
root = getattr(bpy.data, datatype.bl_name)
for item in root:
if hasattr(item, 'data') and datablock == item.data or \
@@ -79,76 +77,10 @@ def resolve_from_id(id, optionnal_type=None):
if id in root and ((optionnal_type is None) or (optionnal_type.lower() in root[id].__class__.__name__.lower())):
return root[id]
return None
def get_preferences():
return bpy.context.preferences.addons[__package__].preferences
def current_milli_time():
return int(round(time.time() * 1000))
def get_expanded_icon(prop: bpy.types.BoolProperty) -> str:
if prop:
return 'DISCLOSURE_TRI_DOWN'
else:
return 'DISCLOSURE_TRI_RIGHT'
# Taken from here: https://stackoverflow.com/a/55659577
def get_folder_size(folder):
return ByteSize(sum(file.stat().st_size for file in Path(folder).rglob('*')))
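The removed `get_folder_size` helper sums `stat().st_size` over every entry that `Path.rglob('*')` yields. A standalone sketch of the same idea, outside the addon — note that I add an `is_file()` filter (an assumption of mine, not in the original) so directory entries are not counted in the total:

```python
import tempfile
from pathlib import Path

def get_folder_size_bytes(folder) -> int:
    # Recursively sum the size of every regular file below `folder`.
    # The addon's version sums all rglob() entries; filtering to files
    # keeps directory inode sizes out of the result.
    return sum(f.stat().st_size for f in Path(folder).rglob('*') if f.is_file())

with tempfile.TemporaryDirectory() as d:
    Path(d, 'a.bin').write_bytes(b'x' * 10)
    Path(d, 'sub').mkdir()
    Path(d, 'sub', 'b.bin').write_bytes(b'x' * 5)
    print(get_folder_size_bytes(d))  # 15
```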
class ByteSize(int):
_kB = 1024
_suffixes = 'B', 'kB', 'MB', 'GB', 'PB'
def __new__(cls, *args, **kwargs):
return super().__new__(cls, *args, **kwargs)
def __init__(self, *args, **kwargs):
self.bytes = self.B = int(self)
self.kilobytes = self.kB = self / self._kB**1
self.megabytes = self.MB = self / self._kB**2
self.gigabytes = self.GB = self / self._kB**3
self.petabytes = self.PB = self / self._kB**4
*suffixes, last = self._suffixes
suffix = next((
suffix
for suffix in suffixes
if 1 < getattr(self, suffix) < self._kB
), last)
self.readable = suffix, getattr(self, suffix)
super().__init__()
def __str__(self):
return self.__format__('.2f')
def __repr__(self):
return '{}({})'.format(self.__class__.__name__, super().__repr__())
def __format__(self, format_spec):
suffix, val = self.readable
return '{val:{fmt}} {suf}'.format(val=math.ceil(val), fmt=format_spec, suf=suffix)
def __sub__(self, other):
return self.__class__(super().__sub__(other))
def __add__(self, other):
return self.__class__(super().__add__(other))
def __mul__(self, other):
return self.__class__(super().__mul__(other))
def __rsub__(self, other):
return self.__class__(super().__sub__(other))
def __radd__(self, other):
return self.__class__(super().__add__(other))
def __rmul__(self, other):
return self.__class__(super().__rmul__(other))
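The removed `ByteSize` class picks the largest unit whose scaled value falls strictly between 1 and 1024, rounds it up, and formats it with two decimals. A minimal standalone sketch of that suffix-selection logic (the function name `human_size` is mine, not from the addon):

```python
import math

_KB = 1024
_SUFFIXES = ('B', 'kB', 'MB', 'GB', 'PB')

def human_size(n: int) -> str:
    """Format a byte count the way ByteSize.__str__ does."""
    # Scale the count by each power of 1024.
    vals = {s: n / _KB ** i for i, s in enumerate(_SUFFIXES)}
    # First suffix whose value lies in (1, 1024); fall back to the last one.
    suffix = next((s for s in _SUFFIXES[:-1] if 1 < vals[s] < _KB), _SUFFIXES[-1])
    # ByteSize rounds up with math.ceil before applying the '.2f' format.
    return '{:.2f} {}'.format(math.ceil(vals[suffix]), suffix)

print(human_size(2048))  # 2.00 kB
print(human_size(500))   # 500.00 B
```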
return int(round(time.time() * 1000))

View File

@@ -1,24 +0,0 @@
# Download base image debian jessie
FROM python:slim
ARG replication_version=0.0.21a15
ARG version=0.1.0
# Info
LABEL maintainer="Swann Martinez"
LABEL version=$version
LABEL description="Blender multi-user addon \
dedicated server image."
# Argument
ENV password='admin'
ENV port=5555
ENV timeout=3000
ENV log_level=INFO
ENV log_file="multiuser_server.log"
# Install replication
RUN pip install replication==$replication_version
# Run the server with parameters
CMD replication.serve -pwd ${password} -p ${port} -t ${timeout} -l ${log_level} -lf ${log_file}

View File

@@ -1,6 +0,0 @@
import re
init_py = open("multi_user/__init__.py").read()
version = re.search("\d+, \d+, \d+", init_py).group(0)
digits = version.split(',')
print('.'.join(digits).replace(" ",""))
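This deleted script pulls the `bl_info` version tuple out of `multi_user/__init__.py` with a plain regex and joins the digits with dots. A self-contained sketch of the same transformation (the inline `bl_info` string is a stand-in for the real file, whose exact contents I am assuming):

```python
import re

# Stand-in for the contents of multi_user/__init__.py (assumed shape).
init_py = 'bl_info = {"name": "Multi-User", "version": (0, 1, 0)}'

# Grab the "major, minor, patch" digits of the version tuple.
version = re.search(r"\d+, \d+, \d+", init_py).group(0)  # '0, 1, 0'

# '0, 1, 0' -> '0.1.0'
print('.'.join(version.split(',')).replace(" ", ""))
```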

View File

@@ -1,4 +0,0 @@
import re
init_py = open("multi_user/__init__.py").read()
print(re.search("\d+\.\d+\.\d+\w\d+|\d+\.\d+\.\d+", init_py).group(0))
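The replication-version regex uses an ordered alternation: the pre-release branch (`x.y.z` followed by a letter and digits, e.g. `0.0.21a15`) is tried first, so it wins over the bare `x.y.z` form when present. For instance (the sample strings are illustrative):

```python
import re

PATTERN = r"\d+\.\d+\.\d+\w\d+|\d+\.\d+\.\d+"

# The pre-release branch matches first when a letter+digits tag is present.
print(re.search(PATTERN, "replication==0.0.21a15").group(0))  # 0.0.21a15
print(re.search(PATTERN, "replication==0.1.0").group(0))      # 0.1.0
```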

View File

@@ -2,7 +2,7 @@ import os
import pytest
from deepdiff import DeepDiff
from uuid import uuid4
import bpy
import random
from multi_user.bl_types.bl_collection import BlCollection
@@ -10,13 +10,8 @@ from multi_user.bl_types.bl_collection import BlCollection
def test_collection(clear_blend):
# Generate a collection with children and a cube
datablock = bpy.data.collections.new("root")
datablock.uuid = str(uuid4())
s1 = bpy.data.collections.new("child")
s1.uuid = str(uuid4())
s2 = bpy.data.collections.new("child2")
s2.uuid = str(uuid4())
datablock.children.link(s1)
datablock.children.link(s2)
datablock.children.link(bpy.data.collections.new("child"))
datablock.children.link(bpy.data.collections.new("child2"))
bpy.ops.mesh.primitive_cube_add()
datablock.objects.link(bpy.data.objects[0])

View File

@@ -0,0 +1,21 @@
import os
import pytest
from deepdiff import DeepDiff
import bpy
import random
from multi_user.bl_types.bl_image import BlImage
def test_image(clear_blend):
datablock = bpy.data.images.new('asd', 2000, 2000)
implementation = BlImage()
expected = implementation._dump(datablock)
bpy.data.images.remove(datablock)
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
assert not DeepDiff(expected, result)
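The new image test follows the round-trip pattern shared by the `bl_*` tests: dump a datablock, destroy the original, reconstruct a fresh one from the dump, load the dump into it, dump again, and diff the two dumps. Outside Blender the same shape can be sketched with plain dicts (the `Toy*` names are mine; this is an illustration of the pattern, not the addon's actual implementation):

```python
class ToyImplementation:
    """Mimics the _dump/_construct/_load round trip used by the bl_* tests."""

    def _dump(self, datablock):
        # Serialize the datablock's state into a plain dict.
        return dict(datablock)

    def _construct(self, data):
        # Rebuild a fresh datablock shell from the dumped state.
        return {'name': data['name']}

    def _load(self, data, target):
        # Copy the dumped state back into the reconstructed datablock.
        target.update(data)

impl = ToyImplementation()
datablock = {'name': 'asd', 'width': 2000, 'height': 2000}
expected = impl._dump(datablock)
test = impl._construct(expected)
impl._load(expected, test)
result = impl._dump(test)
assert expected == result  # the round trip must be lossless
print("round trip ok")
```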

View File

@@ -30,11 +30,9 @@ CONSTRAINTS_TYPES = [
'COPY_ROTATION', 'COPY_SCALE', 'COPY_TRANSFORMS', 'LIMIT_DISTANCE',
'LIMIT_LOCATION', 'LIMIT_ROTATION', 'LIMIT_SCALE', 'MAINTAIN_VOLUME',
'TRANSFORM', 'TRANSFORM_CACHE', 'CLAMP_TO', 'DAMPED_TRACK', 'IK',
'LOCKED_TRACK', 'STRETCH_TO', 'TRACK_TO', 'ACTION',
'LOCKED_TRACK', 'SPLINE_IK', 'STRETCH_TO', 'TRACK_TO', 'ACTION',
'ARMATURE', 'CHILD_OF', 'FLOOR', 'FOLLOW_PATH', 'PIVOT', 'SHRINKWRAP']
# temporarily disabled 'SPLINE_IK' until it's fixed
def test_object(clear_blend):
bpy.ops.mesh.primitive_cube_add(
enter_editmode=False, align='WORLD', location=(0, 0, 0))