Compare commits


2 Commits

Author SHA1 Message Date
48866b74d3 refactor: remove wrong characters 2020-07-07 22:35:56 +02:00
d9f1031107 feat: initial version 2020-07-07 22:34:40 +02:00
53 changed files with 1709 additions and 2581 deletions

View File

@ -5,3 +5,4 @@ stages:
include: include:
- local: .gitlab/ci/test.gitlab-ci.yml - local: .gitlab/ci/test.gitlab-ci.yml
- local: .gitlab/ci/build.gitlab-ci.yml - local: .gitlab/ci/build.gitlab-ci.yml

View File

@ -1,15 +1,14 @@
build: build:
stage: build stage: build
image: debian:stable-slim image: python:latest
script: script:
- git submodule init
- git submodule update
- cd multi_user/libs/replication
- rm -rf tests .git .gitignore script - rm -rf tests .git .gitignore script
artifacts: artifacts:
name: multi_user name: multi_user
paths: paths:
- multi_user - multi_user
only:
refs:
- master
- develop

View File

@ -1,10 +1,14 @@
test: test:
stage: test stage: test
image: slumber/blender-addon-testing:latest image: python:latest
script: script:
- git submodule init
- git submodule update
- apt update
# install blender to get all required dependencies
# TODO: install only dependencies
- apt install -f -y gcc python-dev python3.7-dev
- apt install -f -y blender
- python3 -m pip install blender-addon-tester
- python3 scripts/test_addon.py - python3 scripts/test_addon.py
only:
refs:
- master
- develop

.gitmodules vendored
View File

@ -0,0 +1,3 @@
[submodule "multi_user/libs/replication"]
path = multi_user/libs/replication
url = https://gitlab.com/slumber/replication.git

View File

@ -37,7 +37,7 @@ All notable changes to this project will be documented in this file.
- Serialization is now based on marshal (2x performance improvements). - Serialization is now based on marshal (2x performance improvements).
- Let pip chose python dependencies install path. - Let pip chose python dependencies install path.
## [0.0.3] - 2020-07-29 ## [0.0.3] - Upcoming
### Added ### Added
@ -60,29 +60,8 @@ All notable changes to this project will be documented in this file.
- user localization - user localization
- repository init - repository init
### Removed ### Removed
- Unused strict right management strategy - Unused strict right management strategy
- Legacy config management system - Legacy config management system
## [0.0.4] - preview
### Added
- Dependency graph driven updates [experimental]
- Optional Edit Mode update
- Late join mechanism
- Sync Axis lock replication
- Sync collection offset
- Sync camera orthographic scale
- Logging basic configuration (file output and level)
### Changed
- Auto updater now handles installation from branches
- use uuid for collection loading
### Fixed
- Prevent unsupported datatypes from crashing the session
- Modifier vertex group assignment

View File

@ -11,7 +11,7 @@ This tool aims to allow multiple users to work on the same scene over the network
## Quick installation ## Quick installation
1. Download latest release [multi_user.zip](https://gitlab.com/slumber/multi-user/-/jobs/artifacts/master/download?job=build). 1. Download latest release [multi_user.zip](/uploads/8aef79c7cf5b1d9606dc58307fd9ad8b/multi_user.zip).
2. Run blender as administrator (dependencies installation). 2. Run blender as administrator (dependencies installation).
3. Install last_version.zip from your addon preferences. 3. Install last_version.zip from your addon preferences.
@ -57,16 +57,14 @@ I'm working on it.
| Dependencies | Version | Needed | | Dependencies | Version | Needed |
| ------------ | :-----: | -----: | | ------------ | :-----: | -----: |
| Replication | latest | yes | | ZeroMQ | latest | yes |
| JsonDiff | latest | yes |
## Contributing ## Contributing
See [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation. See [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation.
Feel free to [join the discord server](https://discord.gg/aBPvGws) to chat, seek help and contribute.
## Licensing ## Licensing
See [license](LICENSE) See [license](LICENSE)

Binary file not shown (image, size before: 25 KiB, after: 21 KiB).

Binary file not shown (image, size before: 2.9 KiB).

Binary file not shown (image, size before: 4.1 KiB).

Binary file not shown (image, size before: 15 KiB).

View File

@ -8,4 +8,5 @@ Getting started
install install
quickstart quickstart
known_problems
glossary glossary

View File

@ -0,0 +1,46 @@
.. _known-problems:
==============
Known problems
==============
.. rubric:: What do you need to do in order to use Multi-User over the internet?
1. Use Hamachi or ZeroTier (I prefer Hamachi) and create a network.
2. All participants need to join this network.
3. Go to Blender and install Multi-User in the preferences.
4. Setup and start the session:
* **Host**: After activating Multi-User as an add-on, press N and go to the Multi-User tab.
Then, put the IP of your network where the IP is asked for.
Leave Port and IPC Port on their defaults (5555 and 5561). Increase the Timeout (ms) if the connection is not stable.
Then press "Host".
* **Guest**: After activating Multi-User as an add-on, press N and go to the Multi-User tab.
Then, put the IP of your network where the IP is asked for.
Leave Port and IPC Port on their defaults (5555 and 5561). The simplest option is to put the same information the host is using,
but the add-on needs 4 ports for communication, so you need to use 5555 plus the count of guests (up to 4).
Increase the Timeout (ms) if the connection is not stable. Then press "Connect".
.. rubric:: What do you need to check if you can't host?
You need to check that the IP and all ports are correct. If it's not loading because you loaded a project before hosting, it's not your fault:
the version is not stable yet (the project contains data whose support is not stable yet).
.. rubric:: What do you need to check if you can't connect?
Check that you are connected to the host's network (VPN). Also, check that your connection settings match the host's.
You may also be running different versions (which shouldn't be the case once the Auto-Updater is introduced).
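If you are unsure whether the problem is on the network side, a quick hedged check like the one below can tell you whether the host's session ports are reachable from a guest machine. It is not part of the add-on; the address is a placeholder, and the port list simply assumes the default 5555 plus the following session ports, with the traffic running over plain TCP.

.. code-block:: python

    import socket

    HOST_IP = "25.0.0.1"  # replace with the host's Hamachi/ZeroTier address
    PORTS = [5555, 5556, 5557, 5558]  # assumed session ports, starting at the default

    def is_reachable(ip, port, timeout=2.0):
        """Return True if a TCP connection to ip:port succeeds within the timeout."""
        try:
            with socket.create_connection((ip, port), timeout=timeout):
                return True
        except OSError:
            return False

    for port in PORTS:
        state = "reachable" if is_reachable(HOST_IP, port) else "unreachable"
        print(f"{HOST_IP}:{port} is {state}")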
.. rubric:: You are connected, but you don't see anything?
After pressing N, go to the presence overlay and check the box.
Also, go down and uncheck the "Show only owned" box (unless you need privacy ( ͡° ͜ʖ ͡°) ).
If it's still not working, ask in the support channel of the "multi-user" Discord server. This little helping text is based on my own experience
(Ultr-X).
To bring attention to other problems, please @ me on the support channel. Every problem brought to me will be documented in order to optimize and update this text.
Thank you and have fun with Multi-User, brought to you by "swann".
Here is the Discord server: https://discord.gg/v5eKgm

View File

@ -299,30 +299,22 @@ Here is a quick list of available actions:
.. _advanced: .. _advanced:
Advanced settings Advanced configuration
================= ======================
This section contains optional settings to configure the session behavior. This section contains optional settings to configure the session behavior.
.. figure:: img/quickstart_advanced.png .. figure:: img/quickstart_advanced.png
:align: center :align: center
Advanced configuration panel Repository panel
------- .. rubric:: Network
Network
-------
.. figure:: img/quickstart_advanced_network.png
:align: center
Advanced network settings
**IPC Port** is the port used for Inter Process Communication. This port is used **IPC Port** is the port used for Inter Process Communication. This port is used
by the multi-user subprocesses to communicate with each other. If different instances by the multi-user subprocesses to communicate with each other. If different instances
of the multi-user are using the same IPC port it will create conflicts! of the multi-user are using the same IPC port it will create conflicts!
.. note::
You only need to modify it if you need to launch multiple clients from the same You only need to modify it if you need to launch multiple clients from the same
computer(or if you try to host and join on the same computer). You should just enter a different computer(or if you try to host and join on the same computer). You should just enter a different
**IPC port** for each blender instance. **IPC port** for each blender instance.
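If you often launch several clients on the same computer, one hedged way to pick an unused value for the extra instance's **IPC Port** is to let the operating system choose one. This is a generic Python snippet, not something the add-on does for you:

.. code-block:: python

    import socket

    def free_port():
        """Ask the OS for an unused TCP port and return its number."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.bind(("127.0.0.1", 0))  # port 0 lets the OS choose a free port
            return sock.getsockname()[1]

    print("Use this value as the IPC Port of the second instance:", free_port())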
@ -330,50 +322,14 @@ of the multi-user are using the same IPC port it will create conflict !
**Timeout (in milliseconds)** is the maximum ping authorized before auto-disconnecting. **Timeout (in milliseconds)** is the maximum ping authorized before auto-disconnecting.
You should only increase it if you have a bad connection. You should only increase it if you have a bad connection.
----------- .. rubric:: Replication
Replication
-----------
.. figure:: img/quickstart_advanced_replication.png
:align: center
Advanced replication settings
**Synchronize render settings** (only host) enable replication of EEVEE and CYCLES render settings to match render between clients. **Synchronize render settings** (only host) enable replication of EEVEE and CYCLES render settings to match render between clients.
**Edit Mode Updates** enable object updates while you are in Edit Mode.
.. warning:: Edit Mode Updates kill performance with complex objects (heavy meshes, gpencil, etc.).
**Update method** allows you to change how replication updates are triggered. For now, two update methods are implemented:
- **Default**: Use external threads to monitor datablock changes, slower and less accurate.
- **Depsgraph ⚠️**: Use the Blender dependency graph to trigger updates. Faster but experimental and unstable!
**Properties frequency grid** allows setting a custom replication frequency for each type of data-block, as sketched below: **Properties frequency grid** allows setting a custom replication frequency for each type of data-block, as sketched below:
- **Refresh**: pushed data update rate (in seconds) - **Refresh**: pushed data update rate (in seconds)
- **Apply**: pulled data update rate (in seconds) - **Apply**: pulled data update rate (in seconds)
--- .. note:: Per-data type settings will soon be revamped for simplification purposes
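To make the two rates concrete, here is a rough, add-on-independent sketch of what a per-data-block frequency setting amounts to. The type names and values are illustrative, not the actual replication code:

.. code-block:: python

    import time

    # Illustrative settings mirroring the "Refresh" (push) and "Apply" (pull) rates, in seconds
    FREQUENCIES = {
        "Mesh":   {"refresh": 1.0, "apply": 1.0},
        "Camera": {"refresh": 2.0, "apply": 2.0},
    }

    last_push = {name: 0.0 for name in FREQUENCIES}
    last_pull = {name: 0.0 for name in FREQUENCIES}

    def tick(now, push, pull):
        """Push/pull each data type no more often than its configured rate."""
        for name, rates in FREQUENCIES.items():
            if now - last_push[name] >= rates["refresh"]:
                push(name)  # send local changes
                last_push[name] = now
            if now - last_pull[name] >= rates["apply"]:
                pull(name)  # apply remote changes
                last_pull[name] = now

    tick(time.time(), push=lambda n: print("push", n), pull=lambda n: print("pull", n))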
Log
---
.. figure:: img/quickstart_advanced_logging.png
:align: center
Advanced log settings
**log level** allows you to set the level of logging detail. Here is the description for each value:
+-----------+-----------------------------------------------+
| Log level | Description |
+===========+===============================================+
| ERROR | Shows only critical error |
+-----------+-----------------------------------------------+
| WARNING | Shows only errors (all kind) |
+-----------+-----------------------------------------------+
| INFO | Shows only status related messages and errors |
+-----------+-----------------------------------------------+
| DEBUG | Shows every possible information. |
+-----------+-----------------------------------------------+
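For context, these values map directly onto Python's standard logging levels; the add-on's register() (visible in the __init__.py diff later on this page) calls logging.basicConfig with a similar format string. A minimal sketch, assuming the selected level name comes from the panel above:

.. code-block:: python

    import logging

    selected_level = "INFO"  # value chosen in the Log panel (assumed name)

    logging.basicConfig(
        format='%(asctime)s CLIENT %(levelname)-8s %(message)s',
        datefmt='%H:%M:%S',
        level=getattr(logging, selected_level))

    logging.debug("only shown when DEBUG is selected")
    logging.info("status message")
    logging.error("critical error")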

View File

@ -48,6 +48,7 @@ Documentation is organized into the following sections:
getting_started/install getting_started/install
getting_started/quickstart getting_started/quickstart
getting_started/known_problems
getting_started/glossary getting_started/glossary
.. toctree:: .. toctree::

View File

@ -186,24 +186,25 @@ Using a regular command line
You can run the dedicated server on any platform by following those steps: You can run the dedicated server on any platform by following those steps:
1. First, download and install python 3 (3.6 or above). 1. First, download and install python 3 (3.6 or above).
2. Install the replication library: 2. Download and extract the dedicated server from `here <https://gitlab.com/slumber/replication/-/archive/develop/replication-develop.zip>`_
3. Open a terminal in the extracted folder and install python dependencies by running:
.. code-block:: bash .. code-block:: bash
python -m pip install replication python -m pip install -r requirements.txt
4. Launch the server with: 4. Launch the server from the same terminal with:
.. code-block:: bash .. code-block:: bash
replication.serve python scripts/server.py
.. hint:: .. hint::
You can also specify a custom **port** (-p), **timeout** (-t), **admin password** (-pwd), **log level (ERROR, WARNING, INFO or DEBUG)** (-l) and **log file** (-lf) with the following optional arguments You can also specify a custom **port** (-p), **timeout** (-t) and **admin password** (-pwd) with the following optional arguments
.. code-block:: bash .. code-block:: bash
replication.serve -p 5555 -pwd toto -t 1000 -l INFO -lf server.log python scripts/server.py -p 5555 -pwd toto -t 1000
As soon as the dedicated server is running, you can connect to it from blender (follow :ref:`how-to-join`). As soon as the dedicated server is running, you can connect to it from blender (follow :ref:`how-to-join`).

View File

@ -21,7 +21,7 @@ bl_info = {
"author": "Swann Martinez", "author": "Swann Martinez",
"version": (0, 0, 3), "version": (0, 0, 3),
"description": "Enable real-time collaborative workflow inside blender", "description": "Enable real-time collaborative workflow inside blender",
"blender": (2, 82, 0), "blender": (2, 80, 0),
"location": "3D View > Sidebar > Multi-User tab", "location": "3D View > Sidebar > Multi-User tab",
"warning": "Unstable addon, use it at your own risks", "warning": "Unstable addon, use it at your own risks",
"category": "Collaboration", "category": "Collaboration",
@ -43,17 +43,23 @@ from bpy.app.handlers import persistent
from . import environment, utils from . import environment, utils
# TODO: remove dependency as soon as replication is installed as a module
DEPENDENCIES = { DEPENDENCIES = {
("replication", '0.0.21a8'), ("zmq","zmq"),
("jsondiff","jsondiff"),
("deepdiff", "deepdiff"),
("psutil","psutil")
} }
libs = os.path.dirname(os.path.abspath(__file__))+"\\libs\\replication\\replication"
def register(): def register():
# Setup logging policy # Setup logging policy
logging.basicConfig( logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)
format='%(asctime)s CLIENT %(levelname)-8s %(message)s',
datefmt='%H:%M:%S', if libs not in sys.path:
level=logging.INFO) sys.path.append(libs)
try: try:
environment.setup(DEPENDENCIES, bpy.app.binary_path_python) environment.setup(DEPENDENCIES, bpy.app.binary_path_python)
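environment.setup() itself does not appear in this diff. As a hedged sketch only: such a bootstrap typically tries to import each entry of DEPENDENCIES and pip-installs the missing ones with the interpreter it is given (here Blender's bundled Python), assuming each entry is a (package, module) pair as on one side of this diff; the real implementation may differ.

    import importlib
    import subprocess

    def setup(dependencies, python_path):
        """Install any missing module with the given Python interpreter (illustrative)."""
        for package_name, module_name in dependencies:
            try:
                importlib.import_module(module_name)
            except ImportError:
                subprocess.run([python_path, "-m", "pip", "install", package_name],
                               check=True)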

View File

@ -23,11 +23,7 @@ https://github.com/CGCookie/blender-addon-updater
""" """
__version__ = "1.0.8"
import errno import errno
import traceback
import platform
import ssl import ssl
import urllib.request import urllib.request
import urllib import urllib
@ -102,7 +98,6 @@ class Singleton_updater(object):
# runtime variables, initial conditions # runtime variables, initial conditions
self._verbose = False self._verbose = False
self._use_print_traces = True
self._fake_install = False self._fake_install = False
self._async_checking = False # only true when async daemon started self._async_checking = False # only true when async daemon started
self._update_ready = None self._update_ready = None
@ -138,13 +133,6 @@ class Singleton_updater(object):
self._select_link = select_link_function self._select_link = select_link_function
# called from except blocks, to print the exception details,
# according to the use_print_traces option
def print_trace():
if self._use_print_traces:
traceback.print_exc()
# ------------------------------------------------------------------------- # -------------------------------------------------------------------------
# Getters and setters # Getters and setters
# ------------------------------------------------------------------------- # -------------------------------------------------------------------------
@ -178,7 +166,7 @@ class Singleton_updater(object):
try: try:
self._auto_reload_post_update = bool(value) self._auto_reload_post_update = bool(value)
except: except:
raise ValueError("auto_reload_post_update must be a boolean value") raise ValueError("Must be a boolean value")
@property @property
def backup_current(self): def backup_current(self):
@ -363,7 +351,7 @@ class Singleton_updater(object):
try: try:
self._repo = str(value) self._repo = str(value)
except: except:
raise ValueError("repo must be a string value") raise ValueError("User must be a string")
@property @property
def select_link(self): def select_link(self):
@ -389,7 +377,6 @@ class Singleton_updater(object):
os.makedirs(value) os.makedirs(value)
except: except:
if self._verbose: print("Error trying to staging path") if self._verbose: print("Error trying to staging path")
self.print_trace()
return return
self._updater_path = value self._updater_path = value
@ -459,16 +446,6 @@ class Singleton_updater(object):
except: except:
raise ValueError("Verbose must be a boolean value") raise ValueError("Verbose must be a boolean value")
@property
def use_print_traces(self):
return self._use_print_traces
@use_print_traces.setter
def use_print_traces(self, value):
try:
self._use_print_traces = bool(value)
except:
raise ValueError("use_print_traces must be a boolean value")
@property @property
def version_max_update(self): def version_max_update(self):
return self._version_max_update return self._version_max_update
@ -660,9 +637,6 @@ class Singleton_updater(object):
else: else:
if self._verbose: print("Tokens not setup for engine yet") if self._verbose: print("Tokens not setup for engine yet")
# Always set user agent
request.add_header('User-Agent', "Python/"+str(platform.python_version()))
# run the request # run the request
try: try:
if context: if context:
@ -678,7 +652,6 @@ class Singleton_updater(object):
self._error = "HTTP error" self._error = "HTTP error"
self._error_msg = str(e.code) self._error_msg = str(e.code)
print(self._error, self._error_msg) print(self._error, self._error_msg)
self.print_trace()
self._update_ready = None self._update_ready = None
except urllib.error.URLError as e: except urllib.error.URLError as e:
reason = str(e.reason) reason = str(e.reason)
@ -690,7 +663,6 @@ class Singleton_updater(object):
self._error = "URL error, check internet connection" self._error = "URL error, check internet connection"
self._error_msg = reason self._error_msg = reason
print(self._error, self._error_msg) print(self._error, self._error_msg)
self.print_trace()
self._update_ready = None self._update_ready = None
return None return None
else: else:
@ -712,7 +684,6 @@ class Singleton_updater(object):
self._error_msg = str(e.reason) self._error_msg = str(e.reason)
self._update_ready = None self._update_ready = None
print(self._error, self._error_msg) print(self._error, self._error_msg)
self.print_trace()
return None return None
else: else:
return None return None
@ -729,17 +700,15 @@ class Singleton_updater(object):
if self._verbose: print("Preparing staging folder for download:\n",local) if self._verbose: print("Preparing staging folder for download:\n",local)
if os.path.isdir(local) == True: if os.path.isdir(local) == True:
try: try:
shutil.rmtree(local, ignore_errors=True) shutil.rmtree(local)
os.makedirs(local) os.makedirs(local)
except: except:
error = "failed to remove existing staging directory" error = "failed to remove existing staging directory"
self.print_trace()
else: else:
try: try:
os.makedirs(local) os.makedirs(local)
except: except:
error = "failed to create staging directory" error = "failed to create staging directory"
self.print_trace()
if error != None: if error != None:
if self._verbose: print("Error: Aborting update, "+error) if self._verbose: print("Error: Aborting update, "+error)
@ -764,10 +733,6 @@ class Singleton_updater(object):
request.add_header('PRIVATE-TOKEN',self._engine.token) request.add_header('PRIVATE-TOKEN',self._engine.token)
else: else:
if self._verbose: print("Tokens not setup for selected engine yet") if self._verbose: print("Tokens not setup for selected engine yet")
# Always set user agent
request.add_header('User-Agent', "Python/"+str(platform.python_version()))
self.urlretrieve(urllib.request.urlopen(request,context=context), self._source_zip) self.urlretrieve(urllib.request.urlopen(request,context=context), self._source_zip)
# add additional checks on file size being non-zero # add additional checks on file size being non-zero
if self._verbose: print("Successfully downloaded update zip") if self._verbose: print("Successfully downloaded update zip")
@ -778,7 +743,6 @@ class Singleton_updater(object):
if self._verbose: if self._verbose:
print("Error retrieving download, bad link?") print("Error retrieving download, bad link?")
print("Error: {}".format(e)) print("Error: {}".format(e))
self.print_trace()
return False return False
@ -793,18 +757,16 @@ class Singleton_updater(object):
if os.path.isdir(local): if os.path.isdir(local):
try: try:
shutil.rmtree(local, ignore_errors=True) shutil.rmtree(local)
except: except:
if self._verbose:print("Failed to removed previous backup folder, contininuing") if self._verbose:print("Failed to removed previous backup folder, contininuing")
self.print_trace()
# remove the temp folder; shouldn't exist but could if previously interrupted # remove the temp folder; shouldn't exist but could if previously interrupted
if os.path.isdir(tempdest): if os.path.isdir(tempdest):
try: try:
shutil.rmtree(tempdest, ignore_errors=True) shutil.rmtree(tempdest)
except: except:
if self._verbose:print("Failed to remove existing temp folder, contininuing") if self._verbose:print("Failed to remove existing temp folder, contininuing")
self.print_trace()
# make the full addon copy, which temporarily places outside the addon folder # make the full addon copy, which temporarily places outside the addon folder
if self._backup_ignore_patterns != None: if self._backup_ignore_patterns != None:
shutil.copytree( shutil.copytree(
@ -832,7 +794,7 @@ class Singleton_updater(object):
# make the copy # make the copy
shutil.move(backuploc,tempdest) shutil.move(backuploc,tempdest)
shutil.rmtree(self._addon_root, ignore_errors=True) shutil.rmtree(self._addon_root)
os.rename(tempdest,self._addon_root) os.rename(tempdest,self._addon_root)
self._json["backup_date"] = "" self._json["backup_date"] = ""
@ -853,7 +815,7 @@ class Singleton_updater(object):
# clear the existing source folder in case previous files remain # clear the existing source folder in case previous files remain
outdir = os.path.join(self._updater_path, "source") outdir = os.path.join(self._updater_path, "source")
try: try:
shutil.rmtree(outdir, ignore_errors=True) shutil.rmtree(outdir)
if self._verbose: if self._verbose:
print("Source folder cleared") print("Source folder cleared")
except: except:
@ -866,7 +828,6 @@ class Singleton_updater(object):
except Exception as err: except Exception as err:
print("Error occurred while making extract dir:") print("Error occurred while making extract dir:")
print(str(err)) print(str(err))
self.print_trace()
self._error = "Install failed" self._error = "Install failed"
self._error_msg = "Failed to make extract directory" self._error_msg = "Failed to make extract directory"
return -1 return -1
@ -908,7 +869,6 @@ class Singleton_updater(object):
if exc.errno != errno.EEXIST: if exc.errno != errno.EEXIST:
self._error = "Install failed" self._error = "Install failed"
self._error_msg = "Could not create folder from zip" self._error_msg = "Could not create folder from zip"
self.print_trace()
return -1 return -1
else: else:
with open(os.path.join(outdir, subpath), "wb") as outfile: with open(os.path.join(outdir, subpath), "wb") as outfile:
@ -1002,13 +962,12 @@ class Singleton_updater(object):
print("Clean removing file {}".format(os.path.join(base,f))) print("Clean removing file {}".format(os.path.join(base,f)))
for f in folders: for f in folders:
if os.path.join(base,f)==self._updater_path: continue if os.path.join(base,f)==self._updater_path: continue
shutil.rmtree(os.path.join(base,f), ignore_errors=True) shutil.rmtree(os.path.join(base,f))
print("Clean removing folder and contents {}".format(os.path.join(base,f))) print("Clean removing folder and contents {}".format(os.path.join(base,f)))
except Exception as err: except Exception as err:
error = "failed to create clean existing addon folder" error = "failed to create clean existing addon folder"
print(error, str(err)) print(error, str(err))
self.print_trace()
# Walk through the base addon folder for rules on pre-removing # Walk through the base addon folder for rules on pre-removing
# but avoid removing/altering backup and updater file # but avoid removing/altering backup and updater file
@ -1024,7 +983,6 @@ class Singleton_updater(object):
if self._verbose: print("Pre-removed file "+file) if self._verbose: print("Pre-removed file "+file)
except OSError: except OSError:
print("Failed to pre-remove "+file) print("Failed to pre-remove "+file)
self.print_trace()
# Walk through the temp addon sub folder for replacements # Walk through the temp addon sub folder for replacements
# this implements the overwrite rules, which apply after # this implements the overwrite rules, which apply after
@ -1048,7 +1006,7 @@ class Singleton_updater(object):
# otherwise, check each file to see if matches an overwrite pattern # otherwise, check each file to see if matches an overwrite pattern
replaced=False replaced=False
for ptrn in self._overwrite_patterns: for ptrn in self._overwrite_patterns:
if fnmatch.filter([file],ptrn): if fnmatch.filter([destFile],ptrn):
replaced=True replaced=True
break break
if replaced: if replaced:
@ -1064,11 +1022,10 @@ class Singleton_updater(object):
# now remove the temp staging folder and downloaded zip # now remove the temp staging folder and downloaded zip
try: try:
shutil.rmtree(staging_path, ignore_errors=True) shutil.rmtree(staging_path)
except: except:
error = "Error: Failed to remove existing staging directory, consider manually removing "+staging_path error = "Error: Failed to remove existing staging directory, consider manually removing "+staging_path
if self._verbose: print(error) if self._verbose: print(error)
self.print_trace()
def reload_addon(self): def reload_addon(self):
@ -1084,16 +1041,9 @@ class Singleton_updater(object):
# not allowed in restricted context, such as register module # not allowed in restricted context, such as register module
# toggle to refresh # toggle to refresh
if "addon_disable" in dir(bpy.ops.wm): # 2.7
bpy.ops.wm.addon_disable(module=self._addon_package) bpy.ops.wm.addon_disable(module=self._addon_package)
bpy.ops.wm.addon_refresh() bpy.ops.wm.addon_refresh()
bpy.ops.wm.addon_enable(module=self._addon_package) bpy.ops.wm.addon_enable(module=self._addon_package)
print("2.7 reload complete")
else: # 2.8
bpy.ops.preferences.addon_disable(module=self._addon_package)
bpy.ops.preferences.addon_refresh()
bpy.ops.preferences.addon_enable(module=self._addon_package)
print("2.8 reload complete")
# ------------------------------------------------------------------------- # -------------------------------------------------------------------------
@ -1425,7 +1375,7 @@ class Singleton_updater(object):
if "last_check" not in self._json or self._json["last_check"] == "": if "last_check" not in self._json or self._json["last_check"] == "":
return True return True
else:
now = datetime.now() now = datetime.now()
last_check = datetime.strptime(self._json["last_check"], last_check = datetime.strptime(self._json["last_check"],
"%Y-%m-%d %H:%M:%S.%f") "%Y-%m-%d %H:%M:%S.%f")
@ -1441,7 +1391,7 @@ class Singleton_updater(object):
if self._verbose: if self._verbose:
print("{} Updater: Time to check for updates!".format(self._addon)) print("{} Updater: Time to check for updates!".format(self._addon))
return True return True
else:
if self._verbose: if self._verbose:
print("{} Updater: Determined it's not yet time to check for updates".format(self._addon)) print("{} Updater: Determined it's not yet time to check for updates".format(self._addon))
return False return False
@ -1463,7 +1413,6 @@ class Singleton_updater(object):
except Exception as err: except Exception as err:
print("Other OS error occurred while trying to rename old JSON") print("Other OS error occurred while trying to rename old JSON")
print(err) print(err)
self.print_trace()
return json_path return json_path
def set_updater_json(self): def set_updater_json(self):
@ -1564,7 +1513,6 @@ class Singleton_updater(object):
except Exception as exception: except Exception as exception:
print("Checking for update error:") print("Checking for update error:")
print(exception) print(exception)
self.print_trace()
if not self._error: if not self._error:
self._update_ready = False self._update_ready = False
self._update_version = None self._update_version = None
@ -1676,6 +1624,9 @@ class GitlabEngine(object):
return "{}{}{}".format(self.api_url,"/api/v4/projects/",updater.repo) return "{}{}{}".format(self.api_url,"/api/v4/projects/",updater.repo)
def form_tags_url(self, updater): def form_tags_url(self, updater):
if updater.use_releases:
return "{}{}".format(self.form_repo_url(updater),"/releases")
else:
return "{}{}".format(self.form_repo_url(updater),"/repository/tags") return "{}{}".format(self.form_repo_url(updater),"/repository/tags")
def form_branch_list_url(self, updater): def form_branch_list_url(self, updater):
@ -1704,9 +1655,14 @@ class GitlabEngine(object):
def parse_tags(self, response, updater): def parse_tags(self, response, updater):
if response == None: if response == None:
return [] return []
# Return asset links from release
if updater.use_releases:
return [{"name": release["name"], "zipball_url": release["assets"]["links"][0]["url"]} for release in response]
else:
return [{"name": tag["name"], "zipball_url": self.get_zip_url(tag["commit"]["id"], updater)} for tag in response] return [{"name": tag["name"], "zipball_url": self.get_zip_url(tag["commit"]["id"], updater)} for tag in response]
# ----------------------------------------------------------------------------- # -----------------------------------------------------------------------------
# The module-shared class instance, # The module-shared class instance,
# should be what's imported to other files # should be what's imported to other files

View File

@ -16,13 +16,7 @@
# #
# ##### END GPL LICENSE BLOCK ##### # ##### END GPL LICENSE BLOCK #####
"""Blender UI integrations for the addon updater.
Implements draw calls, popups, and operators that use the addon_updater.
"""
import os import os
import traceback
import bpy import bpy
from bpy.app.handlers import persistent from bpy.app.handlers import persistent
@ -34,16 +28,16 @@ try:
except Exception as e: except Exception as e:
print("ERROR INITIALIZING UPDATER") print("ERROR INITIALIZING UPDATER")
print(str(e)) print(str(e))
traceback.print_exc()
class Singleton_updater_none(object): class Singleton_updater_none(object):
def __init__(self): def __init__(self):
self.addon = None self.addon = None
self.verbose = False self.verbose = False
self.use_print_traces = True
self.invalidupdater = True # used to distinguish bad install self.invalidupdater = True # used to distinguish bad install
self.error = None self.error = None
self.error_msg = None self.error_msg = None
self.async_checking = None self.async_checking = None
def clear_state(self): def clear_state(self):
self.addon = None self.addon = None
self.verbose = False self.verbose = False
@ -51,6 +45,7 @@ except Exception as e:
self.error = None self.error = None
self.error_msg = None self.error_msg = None
self.async_checking = None self.async_checking = None
def run_update(self): pass def run_update(self): pass
def check_for_update(self): pass def check_for_update(self): pass
updater = Singleton_updater_none() updater = Singleton_updater_none()
@ -155,7 +150,8 @@ class addon_updater_install_popup(bpy.types.Operator):
col.scale_y = 0.7 col.scale_y = 0.7
col.label(text="Update {} ready!".format(str(updater.update_version)), col.label(text="Update {} ready!".format(str(updater.update_version)),
icon="LOOP_FORWARDS") icon="LOOP_FORWARDS")
col.label(text="Choose 'Update Now' & press OK to install, ",icon="BLANK1") col.label(
text="Choose 'Update Now' & press OK to install, ", icon="BLANK1")
col.label(text="or click outside window to defer", icon="BLANK1") col.label(text="or click outside window to defer", icon="BLANK1")
row = col.row() row = col.row()
row.prop(self, "ignore_enum", expand=True) row.prop(self, "ignore_enum", expand=True)
@ -289,12 +285,13 @@ class addon_updater_update_now(bpy.types.Operator):
# should return 0, if not something happened # should return 0, if not something happened
if updater.verbose: if updater.verbose:
if res==0: print("Updater returned successful") if res == 0:
else: print("Updater returned "+str(res)+", error occurred") print("Updater returned successful")
else:
print("Updater returned "+str(res)+", error occurred")
except Exception as e: except Exception as e:
updater._error = "Error trying to run update" updater._error = "Error trying to run update"
updater._error_msg = str(e) updater._error_msg = str(e)
updater.print_trace()
atr = addon_updater_install_manually.bl_idname.split(".") atr = addon_updater_install_manually.bl_idname.split(".")
getattr(getattr(bpy.ops, atr[0]), atr[1])('INVOKE_DEFAULT') getattr(getattr(bpy.ops, atr[0]), atr[1])('INVOKE_DEFAULT')
elif updater.update_ready == None: elif updater.update_ready == None:
@ -305,10 +302,9 @@ class addon_updater_update_now(bpy.types.Operator):
elif updater.update_ready == False: elif updater.update_ready == False:
self.report({'INFO'}, "Nothing to update") self.report({'INFO'}, "Nothing to update")
return {'CANCELLED'}
else: else:
self.report({'ERROR'}, "Encountered problem while trying to update") self.report(
return {'CANCELLED'} {'ERROR'}, "Encountered problem while trying to update")
return {'FINISHED'} return {'FINISHED'}
@ -350,7 +346,8 @@ class addon_updater_update_target(bpy.types.Operator):
@classmethod @classmethod
def poll(cls, context): def poll(cls, context):
if updater.invalidupdater == True: return False if updater.invalidupdater == True:
return False
return updater.update_ready != None and len(updater.tags) > 0 return updater.update_ready != None and len(updater.tags) > 0
def invoke(self, context, event): def invoke(self, context, event):
@ -367,7 +364,6 @@ class addon_updater_update_target(bpy.types.Operator):
subcol = split.column() subcol = split.column()
subcol.prop(self, "target", text="") subcol.prop(self, "target", text="")
def execute(self, context): def execute(self, context):
# in case of error importing updater # in case of error importing updater
@ -419,8 +415,10 @@ class addon_updater_install_manually(bpy.types.Operator):
if self.error != "": if self.error != "":
col = layout.column() col = layout.column()
col.scale_y = 0.7 col.scale_y = 0.7
col.label(text="There was an issue trying to auto-install",icon="ERROR") col.label(
col.label(text="Press the download button below and install",icon="BLANK1") text="There was an issue trying to auto-install", icon="ERROR")
col.label(
text="Press the download button below and install", icon="BLANK1")
col.label(text="the zip file like a normal addon.", icon="BLANK1") col.label(text="the zip file like a normal addon.", icon="BLANK1")
else: else:
col = layout.column() col = layout.column()
@ -451,6 +449,7 @@ class addon_updater_install_manually(bpy.types.Operator):
row.label(text="See source website to download the update") row.label(text="See source website to download the update")
def execute(self, context): def execute(self, context):
return {'FINISHED'} return {'FINISHED'}
@ -498,23 +497,16 @@ class addon_updater_updated_successful(bpy.types.Operator):
# tell user to restart blender # tell user to restart blender
if "just_restored" in saved and saved["just_restored"] == True: if "just_restored" in saved and saved["just_restored"] == True:
col = layout.column() col = layout.column()
col.scale_y = 0.7
col.label(text="Addon restored", icon="RECOVER_LAST") col.label(text="Addon restored", icon="RECOVER_LAST")
alert_row = col.row() col.label(text="Restart blender to reload.", icon="BLANK1")
alert_row.alert = True
alert_row.operator(
"wm.quit_blender",
text="Restart blender to reload",
icon="BLANK1")
updater.json_reset_restore() updater.json_reset_restore()
else: else:
col = layout.column() col = layout.column()
col.label(text="Addon successfully installed", icon="FILE_TICK") col.scale_y = 0.7
alert_row = col.row() col.label(text="Addon successfully installed",
alert_row.alert = True icon="FILE_TICK")
alert_row.operator( col.label(text="Restart blender to reload.", icon="BLANK1")
"wm.quit_blender",
text="Restart blender to reload",
icon="BLANK1")
else: else:
# reload addon, but still recommend they restart blender # reload addon, but still recommend they restart blender
@ -528,7 +520,8 @@ class addon_updater_updated_successful(bpy.types.Operator):
else: else:
col = layout.column() col = layout.column()
col.scale_y = 0.7 col.scale_y = 0.7
col.label(text="Addon successfully installed", icon="FILE_TICK") col.label(text="Addon successfully installed",
icon="FILE_TICK")
col.label(text="Consider restarting blender to fully reload.", col.label(text="Consider restarting blender to fully reload.",
icon="BLANK1") icon="BLANK1")
@ -617,6 +610,7 @@ ran_update_sucess_popup = False
# global var for preventing successive calls # global var for preventing successive calls
ran_background_check = False ran_background_check = False
@persistent @persistent
def updater_run_success_popup_handler(scene): def updater_run_success_popup_handler(scene):
global ran_update_sucess_popup global ran_update_sucess_popup
@ -627,12 +621,8 @@ def updater_run_success_popup_handler(scene):
return return
try: try:
if "scene_update_post" in dir(bpy.app.handlers):
bpy.app.handlers.scene_update_post.remove( bpy.app.handlers.scene_update_post.remove(
updater_run_success_popup_handler) updater_run_success_popup_handler)
else:
bpy.app.handlers.depsgraph_update_post.remove(
updater_run_success_popup_handler)
except: except:
pass pass
@ -650,12 +640,8 @@ def updater_run_install_popup_handler(scene):
return return
try: try:
if "scene_update_post" in dir(bpy.app.handlers):
bpy.app.handlers.scene_update_post.remove( bpy.app.handlers.scene_update_post.remove(
updater_run_install_popup_handler) updater_run_install_popup_handler)
else:
bpy.app.handlers.depsgraph_update_post.remove(
updater_run_install_popup_handler)
except: except:
pass pass
@ -673,7 +659,7 @@ def updater_run_install_popup_handler(scene):
# user probably manually installed to get the up to date addon # user probably manually installed to get the up to date addon
# in here. Clear out the update flag using this function # in here. Clear out the update flag using this function
if updater.verbose: if updater.verbose:
print("{} updater: appears user updated, clearing flag".format(\ print("{} updater: appears user updated, clearing flag".format(
updater.addon)) updater.addon))
updater.json_reset_restore() updater.json_reset_restore()
return return
@ -692,24 +678,11 @@ def background_update_callback(update_ready):
return return
if update_ready != True: if update_ready != True:
return return
if updater_run_install_popup_handler not in \
# see if we need add to the update handler to trigger the popup bpy.app.handlers.scene_update_post and \
handlers = [] ran_autocheck_install_popup == False:
if "scene_update_post" in dir(bpy.app.handlers): # 2.7x
handlers = bpy.app.handlers.scene_update_post
else: # 2.8x
handlers = bpy.app.handlers.depsgraph_update_post
in_handles = updater_run_install_popup_handler in handlers
if in_handles or ran_autocheck_install_popup:
return
if "scene_update_post" in dir(bpy.app.handlers): # 2.7x
bpy.app.handlers.scene_update_post.append( bpy.app.handlers.scene_update_post.append(
updater_run_install_popup_handler) updater_run_install_popup_handler)
else: # 2.8x
bpy.app.handlers.depsgraph_update_post.append(
updater_run_install_popup_handler)
ran_autocheck_install_popup = True ran_autocheck_install_popup = True
@ -733,6 +706,7 @@ def post_update_callback(module_name, res=None):
# ie if "auto_reload_post_update" == True, comment out this code # ie if "auto_reload_post_update" == True, comment out this code
if updater.verbose: if updater.verbose:
print("{} updater: Running post update callback".format(updater.addon)) print("{} updater: Running post update callback".format(updater.addon))
# bpy.app.handlers.scene_update_post.append(updater_run_success_popup_handler)
atr = addon_updater_updated_successful.bl_idname.split(".") atr = addon_updater_updated_successful.bl_idname.split(".")
getattr(getattr(bpy.ops, atr[0]), atr[1])('INVOKE_DEFAULT') getattr(getattr(bpy.ops, atr[0]), atr[1])('INVOKE_DEFAULT')
@ -786,7 +760,7 @@ def check_for_update_background():
# this function should take a bool input, if true: update ready # this function should take a bool input, if true: update ready
# if false, no update ready # if false, no update ready
if updater.verbose: if updater.verbose:
print("{} updater: Running background check for update".format(\ print("{} updater: Running background check for update".format(
updater.addon)) updater.addon))
updater.check_for_update_async(background_update_callback) updater.check_for_update_async(background_update_callback)
ran_background_check = True ran_background_check = True
@ -817,7 +791,8 @@ def check_for_update_nonthreaded(self, context):
atr = addon_updater_install_popup.bl_idname.split(".") atr = addon_updater_install_popup.bl_idname.split(".")
getattr(getattr(bpy.ops, atr[0]), atr[1])('INVOKE_DEFAULT') getattr(getattr(bpy.ops, atr[0]), atr[1])('INVOKE_DEFAULT')
else: else:
if updater.verbose: print("No update ready") if updater.verbose:
print("No update ready")
self.report({'INFO'}, "No update ready") self.report({'INFO'}, "No update ready")
@ -831,36 +806,22 @@ def showReloadPopup():
saved_state = updater.json saved_state = updater.json
global ran_update_sucess_popup global ran_update_sucess_popup
has_state = saved_state != None a = saved_state != None
just_updated = "just_updated" in saved_state b = "just_updated" in saved_state
updated_info = saved_state["just_updated"] c = saved_state["just_updated"]
if not (has_state and just_updated and updated_info):
return
if a and b and c:
updater.json_reset_postupdate() # so this only runs once updater.json_reset_postupdate() # so this only runs once
# no handlers in this case # no handlers in this case
if updater.auto_reload_post_update == False: if updater.auto_reload_post_update == False:
return return
# see if we need add to the update handler to trigger the popup if updater_run_success_popup_handler not in \
handlers = [] bpy.app.handlers.scene_update_post \
if "scene_update_post" in dir(bpy.app.handlers): # 2.7x and ran_update_sucess_popup == False:
handlers = bpy.app.handlers.scene_update_post
else: # 2.8x
handlers = bpy.app.handlers.depsgraph_update_post
in_handles = updater_run_success_popup_handler in handlers
if in_handles or ran_update_sucess_popup is True:
return
if "scene_update_post" in dir(bpy.app.handlers): # 2.7x
bpy.app.handlers.scene_update_post.append( bpy.app.handlers.scene_update_post.append(
updater_run_success_popup_handler) updater_run_success_popup_handler)
else: # 2.8x
bpy.app.handlers.depsgraph_update_post.append(
updater_run_success_popup_handler)
ran_update_sucess_popup = True ran_update_sucess_popup = True
@ -886,14 +847,9 @@ def update_notice_box_ui(self, context):
layout = self.layout layout = self.layout
box = layout.box() box = layout.box()
col = box.column() col = box.column()
alert_row = col.row() col.scale_y = 0.7
alert_row.alert = True col.label(text="Restart blender", icon="ERROR")
alert_row.operator(
"wm.quit_blender",
text="Restart blender",
icon="ERROR")
col.label(text="to complete update") col.label(text="to complete update")
return return
# if user pressed ignore, don't draw the box # if user pressed ignore, don't draw the box
@ -957,14 +913,10 @@ def update_settings_ui(self, context, element=None):
if updater.auto_reload_post_update == False: if updater.auto_reload_post_update == False:
saved_state = updater.json saved_state = updater.json
if "just_updated" in saved_state and saved_state["just_updated"] == True: if "just_updated" in saved_state and saved_state["just_updated"] == True:
row.alert = True row.label(text="Restart blender to complete update", icon="ERROR")
row.operator(
"wm.quit_blender",
text="Restart blender to complete update",
icon="ERROR")
return return
split = layout_split(row, factor=0.4) split = layout_split(row, factor=0.3)
subcol = split.column() subcol = split.column()
subcol.prop(settings, "auto_check_update") subcol.prop(settings, "auto_check_update")
subcol = split.column() subcol = split.column()
@ -979,11 +931,9 @@ def update_settings_ui(self, context, element=None):
checkcol = subrow.column(align=True) checkcol = subrow.column(align=True)
checkcol.prop(settings, "updater_intrval_days") checkcol.prop(settings, "updater_intrval_days")
checkcol = subrow.column(align=True) checkcol = subrow.column(align=True)
checkcol.prop(settings, "updater_intrval_hours")
# Consider un-commenting for local dev (e.g. to set shorter intervals) checkcol = subrow.column(align=True)
# checkcol.prop(settings,"updater_intrval_hours") checkcol.prop(settings, "updater_intrval_minutes")
# checkcol = subrow.column(align=True)
# checkcol.prop(settings,"updater_intrval_minutes")
# checking / managing updates # checking / managing updates
row = box.row() row = box.row()
@ -1123,11 +1073,7 @@ def update_settings_ui_condensed(self, context, element=None):
if updater.auto_reload_post_update == False: if updater.auto_reload_post_update == False:
saved_state = updater.json saved_state = updater.json
if "just_updated" in saved_state and saved_state["just_updated"] == True: if "just_updated" in saved_state and saved_state["just_updated"] == True:
row.alert = True # mark red row.label(text="Restart blender to complete update", icon="ERROR")
row.operator(
"wm.quit_blender",
text="Restart blender to complete update",
icon="ERROR")
return return
col = row.column() col = row.column()
@ -1248,11 +1194,13 @@ def skip_tag_function(self, tag):
if self.include_branches == True: if self.include_branches == True:
for branch in self.include_branch_list: for branch in self.include_branch_list:
if tag["name"].lower() == branch: return False if tag["name"].lower() == branch:
return False
# function converting string to tuple, ignoring e.g. leading 'v' # function converting string to tuple, ignoring e.g. leading 'v'
tupled = self.version_tuple_from_text(tag["name"]) tupled = self.version_tuple_from_text(tag["name"])
if type(tupled) != type( (1,2,3) ): return True if type(tupled) != type((1, 2, 3)):
return True
# select the min tag version - change tuple accordingly # select the min tag version - change tuple accordingly
if self.version_min_update != None: if self.version_min_update != None:
@ -1324,9 +1272,7 @@ def register(bl_info):
updater.clear_state() # clear internal vars, avoids reloading oddities updater.clear_state() # clear internal vars, avoids reloading oddities
# confirm your updater "engine" (Github is default if not specified) # confirm your updater "engine" (Github is default if not specified)
# updater.engine = "Github"
updater.engine = "GitLab" updater.engine = "GitLab"
# updater.engine = "Bitbucket"
# If using private repository, indicate the token here # If using private repository, indicate the token here
# Must be set after assigning the engine. # Must be set after assigning the engine.
@ -1340,6 +1286,7 @@ def register(bl_info):
# choose your own repository, must match git name # choose your own repository, must match git name
updater.repo = "10515801" updater.repo = "10515801"
# updater.addon = # define at top of module, MUST be done first # updater.addon = # define at top of module, MUST be done first
# Website for manual addon download, optional but recommended to set # Website for manual addon download, optional but recommended to set
@ -1348,7 +1295,7 @@ def register(bl_info):
# Addon subfolder path # Addon subfolder path
# "sample/path/to/addon" # "sample/path/to/addon"
# default is "" or None, meaning root # default is "" or None, meaning root
updater.subfolder_path = "multi_user" updater.subfolder_path = "multi-user"
# used to check/compare versions # used to check/compare versions
updater.current_version = bl_info["version"] updater.current_version = bl_info["version"]
@ -1360,7 +1307,7 @@ def register(bl_info):
# Optional, consider turning off for production or allow as an option # Optional, consider turning off for production or allow as an option
# This will print out additional debugging info to the console # This will print out additional debugging info to the console
updater.verbose = False # make False for production default updater.verbose = True # make False for production default
# Optional, customize where the addon updater processing subfolder is, # Optional, customize where the addon updater processing subfolder is,
# essentially a staging folder used by the updater on its own # essentially a staging folder used by the updater on its own
@ -1421,11 +1368,11 @@ def register(bl_info):
# the "install {branch}/older version" operator. # the "install {branch}/older version" operator.
updater.include_branches = True updater.include_branches = True
# (GitHub only) This options allows the user to use releases over tags for data, # (GitHub/Gitlab only) This options allows the user to use releases over tags for data,
# which enables pulling down release logs/notes, as well as specify installs from # which enables pulling down release logs/notes, as well as specify installs from
# release-attached zips (instead of just the auto-packaged code generated with # release-attached zips (instead of just the auto-packaged code generated with
# a release/tag). Setting has no impact on BitBucket or GitLab repos # a release/tag). Setting has no impact on BitBucket or GitLab repos
updater.use_releases = False updater.use_releases = True
# note: Releases always have a tag, but a tag may not always be a release # note: Releases always have a tag, but a tag may not always be a release
# Therefore, setting True above will filter out any non-annoted tags # Therefore, setting True above will filter out any non-annoted tags
# note 2: Using this option will also display the release name instead of # note 2: Using this option will also display the release name instead of
@ -1435,7 +1382,8 @@ def register(bl_info):
# updater.include_branch_list defaults to ['master'] branch if set to none # updater.include_branch_list defaults to ['master'] branch if set to none
# example targeting another multiple branches allowed to pull from # example targeting another multiple branches allowed to pull from
# updater.include_branch_list = ['master', 'dev'] # example with two branches # updater.include_branch_list = ['master', 'dev'] # example with two branches
updater.include_branch_list = ['master','develop'] # None is the equivalent to setting ['master'] # None is the equivalent to setting ['master']
updater.include_branch_list = None
# Only allow manual install, thus prompting the user to open # Only allow manual install, thus prompting the user to open
# the addon's web page to download, specifically: updater.website # the addon's web page to download, specifically: updater.website
@ -1460,7 +1408,7 @@ def register(bl_info):
# Set the min and max versions allowed to install. # Set the min and max versions allowed to install.
# Optional, default None # Optional, default None
# min install (>=) will install this and higher # min install (>=) will install this and higher
updater.version_min_update = (0,0,3) updater.version_min_update = (0, 0, 1)
# updater.version_min_update = None # if not wanting to define a min # updater.version_min_update = None # if not wanting to define a min
# max install (<) will install strictly anything lower # max install (<) will install strictly anything lower
@ -1473,11 +1421,6 @@ def register(bl_info):
# Function defined above, customize as appropriate per repository; not required # Function defined above, customize as appropriate per repository; not required
updater.select_link = select_link_function updater.select_link = select_link_function
# Recommended false to encourage blender restarts on update completion
# Setting this option to True is NOT as stable as false (could cause
# blender crashes)
updater.auto_reload_post_update = False
# The register line items for all operators/panels # The register line items for all operators/panels
# If using bpy.utils.register_module(__name__) to register elsewhere # If using bpy.utils.register_module(__name__) to register elsewhere
# in the addon, delete these lines (also from unregister) # in the addon, delete these lines (also from unregister)

View File

@ -34,13 +34,11 @@ __all__ = [
'bl_metaball', 'bl_metaball',
'bl_lattice', 'bl_lattice',
'bl_lightprobe', 'bl_lightprobe',
'bl_speaker', 'bl_speaker'
'bl_font',
'bl_sound'
] # Order here defines execution order ] # Order here defines execution order
from . import * from . import *
from replication.data import ReplicatedDataFactory from ..libs.replication.replication.data import ReplicatedDataFactory
def types_to_register(): def types_to_register():
return __all__ return __all__

View File

@ -134,7 +134,6 @@ class BlAction(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'ACTION_TWEAK' bl_icon = 'ACTION_TWEAK'
def _construct(self, data): def _construct(self, data):

View File

@ -31,7 +31,6 @@ class BlArmature(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 0 bl_delay_apply = 0
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'ARMATURE_DATA' bl_icon = 'ARMATURE_DATA'
def _construct(self, data): def _construct(self, data):
@ -93,7 +92,6 @@ class BlArmature(BlDatablock):
new_bone.head = bone_data['head_local'] new_bone.head = bone_data['head_local']
new_bone.tail_radius = bone_data['tail_radius'] new_bone.tail_radius = bone_data['tail_radius']
new_bone.head_radius = bone_data['head_radius'] new_bone.head_radius = bone_data['head_radius']
# new_bone.roll = bone_data['roll']
if 'parent' in bone_data: if 'parent' in bone_data:
new_bone.parent = target.edit_bones[data['bones'] new_bone.parent = target.edit_bones[data['bones']
@ -125,8 +123,7 @@ class BlArmature(BlDatablock):
'use_connect', 'use_connect',
'parent', 'parent',
'name', 'name',
'layers', 'layers'
# 'roll',
] ]
data = dumper.dump(instance) data = dumper.dump(instance)

View File

@ -29,7 +29,6 @@ class BlCamera(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'CAMERA_DATA' bl_icon = 'CAMERA_DATA'
def _construct(self, data): def _construct(self, data):
@ -46,22 +45,13 @@ class BlCamera(BlDatablock):
if dof_settings: if dof_settings:
loader.load(target.dof, dof_settings) loader.load(target.dof, dof_settings)
background_images = data.get('background_images')
if background_images:
target.background_images.clear()
for img_name, img_data in background_images.items():
target_img = target.background_images.new()
target_img.image = bpy.data.images[img_name]
loader.load(target_img, img_data)
def _dump_implementation(self, data, instance=None): def _dump_implementation(self, data, instance=None):
assert(instance) assert(instance)
# TODO: background image support # TODO: background image support
dumper = Dumper() dumper = Dumper()
dumper.depth = 3 dumper.depth = 2
dumper.include_filter = [ dumper.include_filter = [
"name", "name",
'type', 'type',
@ -80,7 +70,6 @@ class BlCamera(BlDatablock):
'aperture_fstop', 'aperture_fstop',
'aperture_blades', 'aperture_blades',
'aperture_rotation', 'aperture_rotation',
'ortho_scale',
'aperture_ratio', 'aperture_ratio',
'display_size', 'display_size',
'show_limits', 'show_limits',
@ -90,24 +79,7 @@ class BlCamera(BlDatablock):
'sensor_fit', 'sensor_fit',
'sensor_height', 'sensor_height',
'sensor_width', 'sensor_width',
'show_background_images',
'background_images',
'alpha',
'display_depth',
'frame_method',
'offset',
'rotation',
'scale',
'use_flip_x',
'use_flip_y',
'image'
] ]
return dumper.dump(instance) return dumper.dump(instance)
def _resolve_deps_implementation(self):
deps = []
for background in self.instance.background_images:
if background.image:
deps.append(background.image)
return deps

View File

@ -21,55 +21,6 @@ import mathutils
from .. import utils from .. import utils
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
from .dump_anything import Loader, Dumper
def dump_collection_children(collection):
collection_children = []
for child in collection.children:
if child not in collection_children:
collection_children.append(child.uuid)
return collection_children
def dump_collection_objects(collection):
collection_objects = []
for object in collection.objects:
if object not in collection_objects:
collection_objects.append(object.uuid)
return collection_objects
def load_collection_objects(dumped_objects, collection):
for object in dumped_objects:
object_ref = utils.find_from_attr('uuid', object, bpy.data.objects)
if object_ref is None:
continue
elif object_ref.name not in collection.objects.keys():
collection.objects.link(object_ref)
for object in collection.objects:
if object.uuid not in dumped_objects:
collection.objects.unlink(object)
def load_collection_childrens(dumped_childrens, collection):
for child_collection in dumped_childrens:
collection_ref = utils.find_from_attr(
'uuid',
child_collection,
bpy.data.collections)
if collection_ref is None:
continue
if collection_ref.name not in collection.children.keys():
collection.children.link(collection_ref)
for child_collection in collection.children:
if child_collection.uuid not in dumped_childrens:
collection.children.unlink(child_collection)
class BlCollection(BlDatablock): class BlCollection(BlDatablock):
@ -79,7 +30,6 @@ class BlCollection(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = True
def _construct(self, data): def _construct(self, data):
if self.is_library: if self.is_library:
@ -95,31 +45,56 @@ class BlCollection(BlDatablock):
return instance return instance
def _load_implementation(self, data, target): def _load_implementation(self, data, target):
loader = Loader() # Load other meshes metadata
loader.load(target, data) target.name = data["name"]
# Objects # Objects
load_collection_objects(data['objects'], target) for object in data["objects"]:
object_ref = bpy.data.objects.get(object)
if object_ref is None:
continue
if object not in target.objects.keys():
target.objects.link(object_ref)
for object in target.objects:
if object.name not in data["objects"]:
target.objects.unlink(object)
# Link childrens # Link childrens
load_collection_childrens(data['children'], target) for collection in data["children"]:
collection_ref = bpy.data.collections.get(collection)
if collection_ref is None:
continue
if collection_ref.name not in target.children.keys():
target.children.link(collection_ref)
for collection in target.children:
if collection.name not in data["children"]:
target.children.unlink(collection)
def _dump_implementation(self, data, instance=None): def _dump_implementation(self, data, instance=None):
assert(instance) assert(instance)
data = {}
dumper = Dumper() data['name'] = instance.name
dumper.depth = 1
dumper.include_filter = [
"name",
"instance_offset"
]
data = dumper.dump(instance)
# dump objects # dump objects
data['objects'] = dump_collection_objects(instance) collection_objects = []
for object in instance.objects:
if object not in collection_objects:
collection_objects.append(object.name)
data['objects'] = collection_objects
# dump children collections # dump children collections
data['children'] = dump_collection_children(instance) collection_children = []
for child in instance.children:
if child not in collection_children:
collection_children.append(child.name)
data['children'] = collection_children
return data return data
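
The rewritten collection type above keys its members by the uuid property the session assigns to every replicated datablock, so a rename on one peer no longer breaks linking on the others. A rough sketch of that link/unlink pass for objects, assuming the uuid attribute is already registered on IDs as the addon does elsewhere:

```python
import bpy

def find_by_uuid(uuid, pool):
    """Return the first datablock in pool whose .uuid matches, else None."""
    for item in pool:
        if getattr(item, 'uuid', None) == uuid:
            return item
    return None

def sync_collection_objects(dumped_uuids, collection):
    # Link objects referenced by the dump that the collection is missing...
    for uuid in dumped_uuids:
        obj = find_by_uuid(uuid, bpy.data.objects)
        if obj and obj.name not in collection.objects.keys():
            collection.objects.link(obj)
    # ...and unlink local objects the dump no longer lists.
    for obj in list(collection.objects):
        if getattr(obj, 'uuid', None) not in dumped_uuids:
            collection.objects.unlink(obj)
```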

View File

@ -52,7 +52,6 @@ class BlCurve(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'CURVE_DATA' bl_icon = 'CURVE_DATA'
def _construct(self, data): def _construct(self, data):
@ -62,11 +61,6 @@ class BlCurve(BlDatablock):
loader = Loader() loader = Loader()
loader.load(target, data) loader.load(target, data)
# if isinstance(curve, T.TextCurve):
# curve.font = data['font']
# curve.font_bold = data['font']
# curve.font_bold_italic = data['font']
# curve.font_italic = data['font']
target.splines.clear() target.splines.clear()
# load splines # load splines
for spline in data['splines'].values(): for spline in data['splines'].values():
@ -89,7 +83,6 @@ class BlCurve(BlDatablock):
# new_spline.points[point_index], data['splines'][spline]["points"][point_index]) # new_spline.points[point_index], data['splines'][spline]["points"][point_index])
loader.load(new_spline, spline) loader.load(new_spline, spline)
def _dump_implementation(self, data, instance=None): def _dump_implementation(self, data, instance=None):
assert(instance) assert(instance)
dumper = Dumper() dumper = Dumper()
@ -125,17 +118,3 @@ class BlCurve(BlDatablock):
elif isinstance(instance, T.Curve): elif isinstance(instance, T.Curve):
data['type'] = 'CURVE' data['type'] = 'CURVE'
return data return data
def _resolve_deps_implementation(self):
# TODO: resolve material
deps = []
curve = self.instance
if isinstance(curve, T.TextCurve):
deps.extend([
curve.font,
curve.font_bold,
curve.font_bold_italic,
curve.font_italic])
return deps

View File

@ -18,12 +18,11 @@
import bpy import bpy
import mathutils import mathutils
import logging
from .. import utils from .. import utils
from .dump_anything import Loader, Dumper from .dump_anything import Loader, Dumper
from replication.data import ReplicatedDatablock from ..libs.replication.replication.data import ReplicatedDatablock
from replication.constants import (UP, DIFF_BINARY) from ..libs.replication.replication.constants import (UP, DIFF_BINARY)
def has_action(target): def has_action(target):
@ -96,15 +95,12 @@ class BlDatablock(ReplicatedDatablock):
bl_delay_apply : refresh rate in sec for apply bl_delay_apply : refresh rate in sec for apply
bl_automatic_push : boolean bl_automatic_push : boolean
bl_icon : type icon (blender icon name) bl_icon : type icon (blender icon name)
bl_check_common: enable check even in common rights
""" """
def __init__(self, *args, **kwargs): def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs) super().__init__(*args, **kwargs)
instance = kwargs.get('instance', None) instance = kwargs.get('instance', None)
self.preferences = utils.get_preferences()
# TODO: use is_library_indirect # TODO: use is_library_indirect
self.is_library = (instance and hasattr(instance, 'library') and self.is_library = (instance and hasattr(instance, 'library') and
instance.library) or \ instance.library) or \
@ -121,27 +117,15 @@ class BlDatablock(ReplicatedDatablock):
datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root) datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
if not datablock_ref: if not datablock_ref:
try: datablock_ref = datablock_root.get(
datablock_ref = datablock_root[self.data['name']] self.data['name'], # Resolve by name
except Exception: self._construct(data=self.data)) # If it doesn't exist create it
name = self.data.get('name')
logging.debug(f"Constructing {name}")
datablock_ref = self._construct(data=self.data)
if datablock_ref: if datablock_ref:
setattr(datablock_ref, 'uuid', self.uuid) setattr(datablock_ref, 'uuid', self.uuid)
self.instance = datablock_ref self.instance = datablock_ref
def remove_instance(self):
"""
Remove instance from blender data
"""
assert(self.instance)
datablock_root = getattr(bpy.data, self.bl_id)
datablock_root.remove(self.instance)
def _dump(self, instance=None): def _dump(self, instance=None):
dumper = Dumper() dumper = Dumper()
data = {} data = {}
@ -202,7 +186,6 @@ class BlDatablock(ReplicatedDatablock):
if not self.is_library: if not self.is_library:
dependencies.extend(self._resolve_deps_implementation()) dependencies.extend(self._resolve_deps_implementation())
logging.debug(f"{self.instance.name} dependencies: {dependencies}")
return dependencies return dependencies
def _resolve_deps_implementation(self): def _resolve_deps_implementation(self):

View File

@ -1,166 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import logging
import pathlib
import os
from .. import utils
from .dump_anything import Loader, Dumper
from replication.data import ReplicatedDatablock
from replication.constants import (UP, DIFF_BINARY)
class BlFileDatablock(ReplicatedDatablock):
"""BlDatablock
bl_id : blender internal storage identifier
bl_class : blender internal type
bl_delay_refresh : refresh rate in second for observers
bl_delay_apply : refresh rate in sec for apply
bl_automatic_push : boolean
bl_icon : type icon (blender icon name)
bl_check_common: enable check even in common rights
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
instance = kwargs.get('instance', None)
self.preferences = utils.get_preferences()
if instance and hasattr(instance, 'uuid'):
instance.uuid = self.uuid
self.diff_method = DIFF_BINARY
def resolve(self):
datablock_ref = None
datablock_root = getattr(bpy.data, self.bl_id)
datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
if not datablock_ref:
try:
datablock_ref = datablock_root[self.data['name']]
except Exception:
name = self.data.get('name')
logging.debug(f"Constructing {name}")
datablock_ref = self._construct(data=self.data)
if datablock_ref:
setattr(datablock_ref, 'uuid', self.uuid)
self.instance = datablock_ref
def remove_instance(self):
"""
Remove instance from blender data
"""
assert(self.instance)
datablock_root = getattr(bpy.data, self.bl_id)
datablock_root.remove(self.instance)
def get_filepath(self):
ext = pathlib.Path(self.data['filepath']).suffix
if ext:
name = f"{self.uuid}{ext}"
return os.path.join(self.preferences.cache_directory, name)
else:
return self.data['filepath']
def _construct(self, data):
filepath = self.get_filepath()
# Step 1: load content
if 'file' in data.keys():
self._write_content(data['file'], filepath)
else:
logging.warning("No data to write, skipping.")
# Step 2: construct the file
root = getattr(bpy.data, self.bl_id)
# Step 3: construct the datablock
return root.load(filepath)
def _dump(self, instance=None):
# Step 1: dump related metadata
data = self._dump_metadata(instance=instance)
# Step 2: dump file content
file_content = self._read_content(instance.filepath)
if file_content:
data['file'] = file_content
return data
def _load(self, target, data):
self._load_metadata(target, data)
def _dump_metadata(self, data, target):
"""
Dump datablock metadata
"""
raise NotImplementedError()
def _read_content(self, filepath):
"""
Dump file content
"""
logging.info("Reading file content")
content = None
try:
file = open(bpy.path.abspath(self.instance.filepath), 'rb')
content = file.read()
except IOError:
logging.warning(f"{filepath} doesn't exist, skipping")
else:
file.close()
return content
def _load_metadata(self, target, data):
raise NotImplementedError
def _write_content(self, content, filepath):
"""
Write content on the disk
"""
logging.info("Writing file content")
try:
file = open(filepath, 'wb')
file.write(content)
except IOError:
logging.warning(f"{self.uuid} writing error, skipping.")
else:
file.close()
def resolve_deps(self):
return []
def is_valid(self):
return getattr(bpy.data, self.bl_id).get(self.data['name'])
def diff(self):
return False

View File

@ -1,49 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import os
import logging
import pathlib
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_file_datablock import BlFileDatablock
class BlFont(BlFileDatablock):
bl_id = "fonts"
bl_class = bpy.types.VectorFont
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'FILE_FONT'
def _load_metadata(self, data, target):
# No metadate for fonts
pass
def _dump_metadata(self, instance=None):
return {
'filepath': instance.filepath,
'name': instance.name
}
def diff(self):
return False

View File

@ -218,7 +218,6 @@ class BlGpencil(BlDatablock):
bl_delay_refresh = 2 bl_delay_refresh = 2
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'GREASEPENCIL' bl_icon = 'GREASEPENCIL'
def _construct(self, data): def _construct(self, data):

View File

@ -24,44 +24,16 @@ from .. import utils
from .dump_anything import Loader, Dumper from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
format_to_ext = { def dump_image(image):
'BMP': 'bmp',
'IRIS': 'sgi',
'PNG': 'png',
'JPEG': 'jpg',
'JPEG2000': 'jp2',
'TARGA': 'tga',
'TARGA_RAW': 'tga',
'CINEON': 'cin',
'DPX': 'dpx',
'OPEN_EXR_MULTILAYER': 'exr',
'OPEN_EXR': 'exr',
'HDR': 'hdr',
'TIFF': 'tiff',
'AVI_JPEG': 'avi',
'AVI_RAW': 'avi',
'FFMPEG': 'mpeg',
}
class BlImage(BlDatablock):
bl_id = "images"
bl_class = bpy.types.Image
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'IMAGE_DATA'
def dump_image(self, image):
pixels = None pixels = None
if image.source == "GENERATED" or image.packed_file is not None: if image.source == "GENERATED" or image.packed_file is not None:
prefs = utils.get_preferences() prefs = utils.get_preferences()
img_name = f"{self.uuid}.{format_to_ext[image.file_format]}" img_name = f"{image.name}.png"
# Cache the image on the disk # Cache the image on the disk
image.filepath_raw = os.path.join(prefs.cache_directory, img_name) image.filepath_raw = os.path.join(prefs.cache_directory, img_name)
os.makedirs(prefs.cache_directory, exist_ok=True) os.makedirs(prefs.cache_directory, exist_ok=True)
image.file_format = "PNG"
image.save() image.save()
if image.source == "FILE": if image.source == "FILE":
@ -76,6 +48,14 @@ class BlImage(BlDatablock):
raise ValueError() raise ValueError()
return pixels return pixels
class BlImage(BlDatablock):
bl_id = "images"
bl_class = bpy.types.Image
bl_delay_refresh = 0
bl_delay_apply = 1
bl_automatic_push = False
bl_icon = 'IMAGE_DATA'
def _construct(self, data): def _construct(self, data):
return bpy.data.images.new( return bpy.data.images.new(
name=data['name'], name=data['name'],
@ -86,8 +66,8 @@ class BlImage(BlDatablock):
def _load(self, data, target): def _load(self, data, target):
image = target image = target
prefs = utils.get_preferences() prefs = utils.get_preferences()
img_format = data['file_format']
img_name = f"{self.uuid}.{format_to_ext[img_format]}" img_name = f"{image.name}.png"
img_path = os.path.join(prefs.cache_directory,img_name) img_path = os.path.join(prefs.cache_directory,img_name)
os.makedirs(prefs.cache_directory, exist_ok=True) os.makedirs(prefs.cache_directory, exist_ok=True)
@ -99,13 +79,11 @@ class BlImage(BlDatablock):
image.filepath = img_path image.filepath = img_path
image.colorspace_settings.name = data["colorspace_settings"]["name"] image.colorspace_settings.name = data["colorspace_settings"]["name"]
loader = Loader()
loader.load(data, target)
def _dump(self, instance=None): def _dump(self, instance=None):
assert(instance) assert(instance)
data = {} data = {}
data['pixels'] = self.dump_image(instance) data['pixels'] = dump_image(instance)
dumper = Dumper() dumper = Dumper()
dumper.depth = 2 dumper.depth = 2
dumper.include_filter = [ dumper.include_filter = [
@ -114,8 +92,6 @@ class BlImage(BlDatablock):
'height', 'height',
'alpha', 'alpha',
'float_buffer', 'float_buffer',
'file_format',
'alpha_mode',
'filepath', 'filepath',
'source', 'source',
'colorspace_settings'] 'colorspace_settings']
@ -124,7 +100,6 @@ class BlImage(BlDatablock):
return data return data
def diff(self): def diff(self):
if self.instance and (self.instance.name != self.data['name']):
return True
else:
return False return False
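
BlImage now caches generated or packed images on disk under the node's uuid, with the extension looked up from the image's file_format through format_to_ext, instead of always forcing a PNG re-encode. A small sketch of that save path, runnable inside Blender; the cache_directory argument stands in for the addon preference of the same name:

```python
import os
import bpy

# Excerpt of the format_to_ext table above
format_to_ext = {'PNG': 'png', 'JPEG': 'jpg', 'OPEN_EXR': 'exr', 'TIFF': 'tiff'}

def cache_image(image, uuid, cache_directory):
    ext = format_to_ext.get(image.file_format, 'png')
    os.makedirs(cache_directory, exist_ok=True)
    image.filepath_raw = os.path.join(cache_directory, f"{uuid}.{ext}")
    image.save()            # written in the image's own format, no forced PNG
    return image.filepath_raw
```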

View File

@ -21,7 +21,7 @@ import mathutils
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
from replication.exception import ContextError from ..libs.replication.replication.exception import ContextError
POINT = ['co', 'weight_softbody', 'co_deform'] POINT = ['co', 'weight_softbody', 'co_deform']
@ -32,7 +32,6 @@ class BlLattice(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'LATTICE_DATA' bl_icon = 'LATTICE_DATA'
def _construct(self, data): def _construct(self, data):

View File

@ -29,7 +29,6 @@ class BlLibrary(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIBRARY_DATA_DIRECT' bl_icon = 'LIBRARY_DATA_DIRECT'
def _construct(self, data): def _construct(self, data):

View File

@ -29,7 +29,6 @@ class BlLight(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIGHT_DATA' bl_icon = 'LIGHT_DATA'
def _construct(self, data): def _construct(self, data):

View File

@ -30,7 +30,6 @@ class BlLightprobe(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIGHTPROBE_GRID' bl_icon = 'LIGHTPROBE_GRID'
def _construct(self, data): def _construct(self, data):

View File

@ -19,13 +19,11 @@
import bpy import bpy
import mathutils import mathutils
import logging import logging
import re
from ..utils import get_datablock_from_uuid from .. import utils
from .dump_anything import Loader, Dumper from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
def load_node(node_data, node_tree): def load_node(node_data, node_tree):
""" Load a node into a node_tree from a dict """ Load a node into a node_tree from a dict
@ -39,18 +37,15 @@ def load_node(node_data, node_tree):
target_node = node_tree.nodes.new(type=node_data["bl_idname"]) target_node = node_tree.nodes.new(type=node_data["bl_idname"])
loader.load(target_node, node_data) loader.load(target_node, node_data)
image_uuid = node_data.get('image_uuid', None)
if image_uuid and not target_node.image:
target_node.image = get_datablock_from_uuid(image_uuid,None)
for input in node_data["inputs"]: for input in node_data["inputs"]:
if hasattr(target_node.inputs[input], "default_value"): if hasattr(target_node.inputs[input], "default_value"):
try: try:
target_node.inputs[input].default_value = node_data["inputs"][input]["default_value"] target_node.inputs[input].default_value = node_data["inputs"][input]["default_value"]
except: except:
logging.error( logging.error(f"Material {input} parameter not supported, skipping")
f"Material {input} parameter not supported, skipping")
def load_links(links_data, node_tree): def load_links(links_data, node_tree):
@ -65,6 +60,7 @@ def load_links(links_data, node_tree):
for link in links_data: for link in links_data:
input_socket = node_tree.nodes[link['to_node']].inputs[int(link['to_socket'])] input_socket = node_tree.nodes[link['to_node']].inputs[int(link['to_socket'])]
output_socket = node_tree.nodes[link['from_node']].outputs[int(link['from_socket'])] output_socket = node_tree.nodes[link['from_node']].outputs[int(link['from_socket'])]
node_tree.links.new(input_socket, output_socket) node_tree.links.new(input_socket, output_socket)
@ -79,13 +75,11 @@ def dump_links(links):
links_data = [] links_data = []
for link in links: for link in links:
to_socket = NODE_SOCKET_INDEX.search(link.to_socket.path_from_id()).group(1)
from_socket = NODE_SOCKET_INDEX.search(link.from_socket.path_from_id()).group(1)
links_data.append({ links_data.append({
'to_node':link.to_node.name, 'to_node':link.to_node.name,
'to_socket': to_socket, 'to_socket':link.to_socket.path_from_id()[-2:-1],
'from_node':link.from_node.name, 'from_node':link.from_node.name,
'from_socket': from_socket, 'from_socket':link.from_socket.path_from_id()[-2:-1],
}) })
return links_data return links_data
@ -122,8 +116,7 @@ def dump_node(node):
"show_preview", "show_preview",
"show_texture", "show_texture",
"outputs", "outputs",
"width_hidden", "width_hidden"
"image"
] ]
dumped_node = node_dumper.dump(node) dumped_node = node_dumper.dump(node)
@ -158,8 +151,7 @@ def dump_node(node):
'location' 'location'
] ]
dumped_node['mapping'] = curve_dumper.dump(node.mapping) dumped_node['mapping'] = curve_dumper.dump(node.mapping)
if hasattr(node, 'image') and getattr(node, 'image'):
dumped_node['image_uuid'] = node.image.uuid
return dumped_node return dumped_node
@ -169,7 +161,6 @@ class BlMaterial(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'MATERIAL_DATA' bl_icon = 'MATERIAL_DATA'
def _construct(self, data): def _construct(self, data):
@ -185,6 +176,7 @@ class BlMaterial(BlDatablock):
loader.load( loader.load(
target.grease_pencil, data['grease_pencil']) target.grease_pencil, data['grease_pencil'])
if data["use_nodes"]: if data["use_nodes"]:
if target.node_tree is None: if target.node_tree is None:
target.use_nodes = True target.use_nodes = True
@ -267,9 +259,10 @@ class BlMaterial(BlDatablock):
if self.instance.use_nodes: if self.instance.use_nodes:
for node in self.instance.node_tree.nodes: for node in self.instance.node_tree.nodes:
if node.type in ['TEX_IMAGE','TEX_ENVIRONMENT']: if node.type == 'TEX_IMAGE':
deps.append(node.image) deps.append(node.image)
if self.is_library: if self.is_library:
deps.append(self.instance.library) deps.append(self.instance.library)
return deps return deps
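
dump_links now extracts a socket's index with the NODE_SOCKET_INDEX regular expression instead of slicing the last characters of path_from_id(), which only captures a single character and therefore breaks on nodes with ten or more sockets. A standalone check of the difference (plain Python, no Blender required); the path string mimics what path_from_id() returns for a node socket:

```python
import re

NODE_SOCKET_INDEX = re.compile(r'\[(\d*)\]')

path = 'nodes["Principled BSDF"].inputs[12]'   # example socket path

print(path[-2:-1])                              # '2'  -> only the last digit
print(NODE_SOCKET_INDEX.search(path).group(1))  # '12' -> the full socket index
```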

View File

@ -23,10 +23,11 @@ import logging
import numpy as np import numpy as np
from .dump_anything import Dumper, Loader, np_load_collection_primitives, np_dump_collection_primitive, np_load_collection, np_dump_collection from .dump_anything import Dumper, Loader, np_load_collection_primitives, np_dump_collection_primitive, np_load_collection, np_dump_collection
from replication.constants import DIFF_BINARY from ..libs.replication.replication.constants import DIFF_BINARY
from replication.exception import ContextError from ..libs.replication.replication.exception import ContextError
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
VERTICE = ['co'] VERTICE = ['co']
EDGE = [ EDGE = [
@ -52,7 +53,6 @@ class BlMesh(BlDatablock):
bl_delay_refresh = 2 bl_delay_refresh = 2
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'MESH_DATA' bl_icon = 'MESH_DATA'
def _construct(self, data): def _construct(self, data):
@ -114,7 +114,7 @@ class BlMesh(BlDatablock):
def _dump_implementation(self, data, instance=None): def _dump_implementation(self, data, instance=None):
assert(instance) assert(instance)
if instance.is_editmode and not self.preferences.enable_editmode_updates: if instance.is_editmode:
raise ContextError("Mesh is in edit mode") raise ContextError("Mesh is in edit mode")
mesh = instance mesh = instance

View File

@ -68,7 +68,6 @@ class BlMetaball(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'META_BALL' bl_icon = 'META_BALL'
def _construct(self, data): def _construct(self, data):

View File

@ -16,16 +16,13 @@
# ##### END GPL LICENSE BLOCK ##### # ##### END GPL LICENSE BLOCK #####
import logging
import bpy import bpy
import mathutils import mathutils
from replication.exception import ContextError import logging
from ..utils import get_datablock_from_uuid from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader from ..libs.replication.replication.exception import ContextError
from replication.exception import ReparentException
def load_pose(target_bone, data): def load_pose(target_bone, data):
@ -34,59 +31,12 @@ def load_pose(target_bone, data):
loader.load(target_bone, data) loader.load(target_bone, data)
def find_data_from_name(name=None):
instance = None
if not name:
pass
elif name in bpy.data.meshes.keys():
instance = bpy.data.meshes[name]
elif name in bpy.data.lights.keys():
instance = bpy.data.lights[name]
elif name in bpy.data.cameras.keys():
instance = bpy.data.cameras[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
elif name in bpy.data.metaballs.keys():
instance = bpy.data.metaballs[name]
elif name in bpy.data.armatures.keys():
instance = bpy.data.armatures[name]
elif name in bpy.data.grease_pencils.keys():
instance = bpy.data.grease_pencils[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
elif name in bpy.data.lattices.keys():
instance = bpy.data.lattices[name]
elif name in bpy.data.speakers.keys():
instance = bpy.data.speakers[name]
elif name in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
instance = bpy.data.lightprobes[name]
else:
logging.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
return instance
def load_data(object, name):
logging.info("loading data")
pass
def _is_editmode(object: bpy.types.Object) -> bool:
child_data = getattr(object, 'data', None)
return (child_data and
hasattr(child_data, 'is_editmode') and
child_data.is_editmode)
class BlObject(BlDatablock): class BlObject(BlDatablock):
bl_id = "objects" bl_id = "objects"
bl_class = bpy.types.Object bl_class = bpy.types.Object
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'OBJECT_DATA' bl_icon = 'OBJECT_DATA'
def _construct(self, data): def _construct(self, data):
@ -102,29 +52,73 @@ class BlObject(BlDatablock):
return instance return instance
# TODO: refactoring # TODO: refactoring
object_name = data.get("name") if "data" not in data:
data_uuid = data.get("data_uuid") pass
data_id = data.get("data") elif data["data"] in bpy.data.meshes.keys():
instance = bpy.data.meshes[data["data"]]
object_data = get_datablock_from_uuid( elif data["data"] in bpy.data.lights.keys():
data_uuid, instance = bpy.data.lights[data["data"]]
find_data_from_name(data_id), elif data["data"] in bpy.data.cameras.keys():
ignore=['images']) #TODO: use resolve_from_id instance = bpy.data.cameras[data["data"]]
instance = bpy.data.objects.new(object_name, object_data) elif data["data"] in bpy.data.curves.keys():
instance = bpy.data.curves[data["data"]]
elif data["data"] in bpy.data.metaballs.keys():
instance = bpy.data.metaballs[data["data"]]
elif data["data"] in bpy.data.armatures.keys():
instance = bpy.data.armatures[data["data"]]
elif data["data"] in bpy.data.grease_pencils.keys():
instance = bpy.data.grease_pencils[data["data"]]
elif data["data"] in bpy.data.curves.keys():
instance = bpy.data.curves[data["data"]]
elif data["data"] in bpy.data.lattices.keys():
instance = bpy.data.lattices[data["data"]]
elif data["data"] in bpy.data.speakers.keys():
instance = bpy.data.speakers[data["data"]]
elif data["data"] in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
instance = bpy.data.lightprobes[data["data"]]
else:
logging.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
instance = bpy.data.objects.new(data["name"], instance)
instance.uuid = self.uuid instance.uuid = self.uuid
return instance return instance
def _load_implementation(self, data, target): def _load_implementation(self, data, target):
# Load transformation data
loader = Loader() loader = Loader()
loader.load(target, data)
data_uuid = data.get("data_uuid") # Pose
data_id = data.get("data") if 'pose' in data:
if not target.pose:
raise Exception('No pose data yet (Fixed in a near futur)')
# Bone groups
for bg_name in data['pose']['bone_groups']:
bg_data = data['pose']['bone_groups'].get(bg_name)
bg_target = target.pose.bone_groups.get(bg_name)
if target.type != data['type']: if not bg_target:
raise ReparentException() bg_target = target.pose.bone_groups.new(name=bg_name)
elif target.data and (target.data.name != data_id):
target.data = get_datablock_from_uuid(data_uuid, find_data_from_name(data_id), ignore=['images']) loader.load(bg_target, bg_data)
# target.pose.bone_groups.get
# Bones
for bone in data['pose']['bones']:
target_bone = target.pose.bones.get(bone)
bone_data = data['pose']['bones'].get(bone)
if 'constraints' in bone_data.keys():
loader.load(target_bone, bone_data['constraints'])
load_pose(target_bone, bone_data)
if 'bone_index' in bone_data.keys():
target_bone.bone_group = target.pose.bone_group[bone_data['bone_group_index']]
# vertex groups # vertex groups
if 'vertex_groups' in data: if 'vertex_groups' in data:
@ -158,50 +152,12 @@ class BlObject(BlDatablock):
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference] target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
# Load transformation data
loader.load(target, data)
# Pose
if 'pose' in data:
if not target.pose:
raise Exception('No pose data yet (Fixed in a near futur)')
# Bone groups
for bg_name in data['pose']['bone_groups']:
bg_data = data['pose']['bone_groups'].get(bg_name)
bg_target = target.pose.bone_groups.get(bg_name)
if not bg_target:
bg_target = target.pose.bone_groups.new(name=bg_name)
loader.load(bg_target, bg_data)
# target.pose.bone_groups.get
# Bones
for bone in data['pose']['bones']:
target_bone = target.pose.bones.get(bone)
bone_data = data['pose']['bones'].get(bone)
if 'constraints' in bone_data.keys():
loader.load(target_bone, bone_data['constraints'])
load_pose(target_bone, bone_data)
if 'bone_index' in bone_data.keys():
target_bone.bone_group = target.pose.bone_group[bone_data['bone_group_index']]
# TODO: find another way...
if target.type == 'EMPTY':
img_uuid = data.get('data_uuid')
if target.data is None and img_uuid:
target.data = get_datablock_from_uuid(img_uuid, None)#bpy.data.images.get(img_key, None)
def _dump_implementation(self, data, instance=None): def _dump_implementation(self, data, instance=None):
assert(instance) assert(instance)
if _is_editmode(instance): child_data = getattr(instance, 'data', None)
if self.preferences.enable_editmode_updates:
instance.update_from_editmode() if child_data and hasattr(child_data, 'is_editmode') and child_data.is_editmode:
else:
raise ContextError("Object is in edit-mode.") raise ContextError("Object is in edit-mode.")
dumper = Dumper() dumper = Dumper()
@ -215,39 +171,28 @@ class BlObject(BlDatablock):
"library", "library",
"empty_display_type", "empty_display_type",
"empty_display_size", "empty_display_size",
"empty_image_offset",
"empty_image_depth",
"empty_image_side",
"show_empty_image_orthographic",
"show_empty_image_perspective",
"show_empty_image_only_axis_aligned",
"use_empty_image_alpha",
"color",
"instance_collection", "instance_collection",
"instance_type", "instance_type",
"location", "location",
"scale", "scale",
'lock_location',
'lock_rotation',
'lock_scale',
'type',
'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler', 'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler',
] ]
data = dumper.dump(instance) data = dumper.dump(instance)
data['data_uuid'] = getattr(instance.data, 'uuid', None)
if self.is_library: if self.is_library:
return data return data
# MODIFIERS # MODIFIERS
if hasattr(instance, 'modifiers'): if hasattr(instance, 'modifiers'):
dumper.include_filter = None dumper.include_filter = None
dumper.depth = 1 dumper.depth = 2
data["modifiers"] = {} data["modifiers"] = {}
for index, modifier in enumerate(instance.modifiers): for index, modifier in enumerate(instance.modifiers):
data["modifiers"][modifier.name] = dumper.dump(modifier) data["modifiers"][modifier.name] = dumper.dump(modifier)
# CONSTRAINTS # CONSTRAINTS
# OBJECT
if hasattr(instance, 'constraints'): if hasattr(instance, 'constraints'):
dumper.depth = 3 dumper.depth = 3
data["constraints"] = dumper.dump(instance.constraints) data["constraints"] = dumper.dump(instance.constraints)
@ -300,8 +245,7 @@ class BlObject(BlDatablock):
# VERTEx GROUP # VERTEx GROUP
if len(instance.vertex_groups) > 0: if len(instance.vertex_groups) > 0:
points_attr = 'vertices' if isinstance( points_attr = 'vertices' if isinstance(instance.data, bpy.types.Mesh) else 'points'
instance.data, bpy.types.Mesh) else 'points'
vg_data = [] vg_data = []
for vg in instance.vertex_groups: for vg in instance.vertex_groups:
vg_idx = vg.index vg_idx = vg.index
@ -371,3 +315,4 @@ class BlObject(BlDatablock):
deps.append(self.instance.instance_collection) deps.append(self.instance.instance_collection)
return deps return deps

View File

@ -21,7 +21,7 @@ import mathutils
from .dump_anything import Loader, Dumper from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock from .bl_datablock import BlDatablock
from .bl_collection import dump_collection_children, dump_collection_objects, load_collection_childrens, load_collection_objects
from ..utils import get_preferences from ..utils import get_preferences
class BlScene(BlDatablock): class BlScene(BlDatablock):
@ -30,7 +30,6 @@ class BlScene(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = True
bl_icon = 'SCENE_DATA' bl_icon = 'SCENE_DATA'
def _construct(self, data): def _construct(self, data):
@ -43,8 +42,24 @@ class BlScene(BlDatablock):
loader.load(target, data) loader.load(target, data)
# Load master collection # Load master collection
load_collection_objects(data['collection']['objects'], target.collection) for object in data["collection"]["objects"]:
load_collection_childrens(data['collection']['children'], target.collection) if object not in target.collection.objects.keys():
target.collection.objects.link(bpy.data.objects[object])
for object in target.collection.objects.keys():
if object not in data["collection"]["objects"]:
target.collection.objects.unlink(bpy.data.objects[object])
# load collections
for collection in data["collection"]["children"]:
if collection not in target.collection.children.keys():
target.collection.children.link(
bpy.data.collections[collection])
for collection in target.collection.children.keys():
if collection not in data["collection"]["children"]:
target.collection.children.unlink(
bpy.data.collections[collection])
if 'world' in data.keys(): if 'world' in data.keys():
target.world = bpy.data.worlds[data['world']] target.world = bpy.data.worlds[data['world']]
@ -59,9 +74,6 @@ class BlScene(BlDatablock):
if 'cycles' in data.keys(): if 'cycles' in data.keys():
loader.load(target.eevee, data['cycles']) loader.load(target.eevee, data['cycles'])
if 'render' in data.keys():
loader.load(target.render, data['render'])
if 'view_settings' in data.keys(): if 'view_settings' in data.keys():
loader.load(target.view_settings, data['view_settings']) loader.load(target.view_settings, data['view_settings'])
if target.view_settings.use_curve_mapping: if target.view_settings.use_curve_mapping:
@ -82,18 +94,13 @@ class BlScene(BlDatablock):
'id', 'id',
'camera', 'camera',
'grease_pencil', 'grease_pencil',
'frame_start',
'frame_end',
'frame_step',
] ]
data = scene_dumper.dump(instance) data = scene_dumper.dump(instance)
scene_dumper.depth = 3 scene_dumper.depth = 3
scene_dumper.include_filter = ['children','objects','name'] scene_dumper.include_filter = ['children','objects','name']
data['collection'] = {} data['collection'] = scene_dumper.dump(instance.collection)
data['collection']['children'] = dump_collection_children(instance.collection)
data['collection']['objects'] = dump_collection_objects(instance.collection)
scene_dumper.depth = 1 scene_dumper.depth = 1
scene_dumper.include_filter = None scene_dumper.include_filter = None
@ -119,7 +126,6 @@ class BlScene(BlDatablock):
data['eevee'] = scene_dumper.dump(instance.eevee) data['eevee'] = scene_dumper.dump(instance.eevee)
data['cycles'] = scene_dumper.dump(instance.cycles) data['cycles'] = scene_dumper.dump(instance.cycles)
data['view_settings'] = scene_dumper.dump(instance.view_settings) data['view_settings'] = scene_dumper.dump(instance.view_settings)
data['render'] = scene_dumper.dump(instance.render)
if instance.view_settings.use_curve_mapping: if instance.view_settings.use_curve_mapping:
data['view_settings']['curve_mapping'] = scene_dumper.dump(instance.view_settings.curve_mapping) data['view_settings']['curve_mapping'] = scene_dumper.dump(instance.view_settings.curve_mapping)

View File

@ -1,74 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import os
import logging
import pathlib
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
class BlSound(BlDatablock):
bl_id = "sounds"
bl_class = bpy.types.Sound
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SOUND'
def _construct(self, data):
if 'file' in data.keys():
prefs = utils.get_preferences()
ext = data['filepath'].split(".")[-1]
sound_name = f"{self.uuid}.{ext}"
sound_path = os.path.join(prefs.cache_directory, sound_name)
os.makedirs(prefs.cache_directory, exist_ok=True)
file = open(sound_path, 'wb')
file.write(data["file"])
file.close()
logging.info(f'loading {sound_path}')
return bpy.data.sounds.load(sound_path)
def _load(self, data, target):
loader = Loader()
loader.load(target, data)
def _dump(self, instance=None):
if not instance.packed_file:
# prefs = utils.get_preferences()
# ext = pathlib.Path(instance.filepath).suffix
# sound_name = f"{self.uuid}{ext}"
# sound_path = os.path.join(prefs.cache_directory, sound_name)
# instance.filepath = sound_path
instance.pack()
#TODO:use file locally with unpack(method='USE_ORIGINAL') ?
return {
'filepath':instance.filepath,
'name':instance.name,
'file': instance.packed_file.data
}
def diff(self):
return False

View File

@ -29,7 +29,6 @@ class BlSpeaker(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = False
bl_icon = 'SPEAKER' bl_icon = 'SPEAKER'
def _load_implementation(self, data, target): def _load_implementation(self, data, target):
@ -49,7 +48,6 @@ class BlSpeaker(BlDatablock):
'volume', 'volume',
'name', 'name',
'pitch', 'pitch',
'sound',
'volume_min', 'volume_min',
'volume_max', 'volume_max',
'attenuation', 'attenuation',
@ -62,15 +60,6 @@ class BlSpeaker(BlDatablock):
return dumper.dump(instance) return dumper.dump(instance)
def _resolve_deps_implementation(self):
# TODO: resolve material
deps = []
sound = self.instance.sound
if sound:
deps.append(sound)
return deps

View File

@ -30,7 +30,6 @@ class BlWorld(BlDatablock):
bl_delay_refresh = 1 bl_delay_refresh = 1
bl_delay_apply = 1 bl_delay_apply = 1
bl_automatic_push = True bl_automatic_push = True
bl_check_common = True
bl_icon = 'WORLD_DATA' bl_icon = 'WORLD_DATA'
def _construct(self, data): def _construct(self, data):
@ -56,14 +55,19 @@ class BlWorld(BlDatablock):
assert(instance) assert(instance)
world_dumper = Dumper() world_dumper = Dumper()
world_dumper.depth = 1 world_dumper.depth = 2
world_dumper.include_filter = [ world_dumper.exclude_filter = [
"use_nodes", "preview",
"name", "original",
"uuid",
"color",
"cycles",
"light_settings",
"users",
"view_center"
] ]
data = world_dumper.dump(instance) data = world_dumper.dump(instance)
if instance.use_nodes: if instance.use_nodes:
data['node_tree'] = {}
nodes = {} nodes = {}
for node in instance.node_tree.nodes: for node in instance.node_tree.nodes:
@ -80,7 +84,7 @@ class BlWorld(BlDatablock):
if self.instance.use_nodes: if self.instance.use_nodes:
for node in self.instance.node_tree.nodes: for node in self.instance.node_tree.nodes:
if node.type in ['TEX_IMAGE','TEX_ENVIRONMENT']: if node.type == 'TEX_IMAGE':
deps.append(node.image) deps.append(node.image)
if self.is_library: if self.is_library:
deps.append(self.instance.library) deps.append(self.instance.library)

View File

@ -115,7 +115,7 @@ def np_dump_collection_primitive(collection: bpy.types.CollectionProperty, attri
:return: numpy byte buffer :return: numpy byte buffer
""" """
if len(collection) == 0: if len(collection) == 0:
logging.debug(f'Skipping empty {attribute} attribute') logging.warning(f'Skipping empty {attribute} attribute')
return {} return {}
attr_infos = collection[0].bl_rna.properties.get(attribute) attr_infos = collection[0].bl_rna.properties.get(attribute)
@ -192,7 +192,7 @@ def np_load_collection_primitives(collection: bpy.types.CollectionProperty, attr
:type sequence: strr :type sequence: strr
""" """
if len(collection) == 0 or not sequence: if len(collection) == 0 or not sequence:
logging.debug(f"Skipping loading {attribute}") logging.warning(f"Skipping loadin {attribute}")
return return
attr_infos = collection[0].bl_rna.properties.get(attribute) attr_infos = collection[0].bl_rna.properties.get(attribute)
@ -301,7 +301,7 @@ class Dumper:
self._dump_ID = (lambda x, depth: x.name, self._dump_default_as_branch) self._dump_ID = (lambda x, depth: x.name, self._dump_default_as_branch)
self._dump_collection = ( self._dump_collection = (
self._dump_default_as_leaf, self._dump_collection_as_branch) self._dump_default_as_leaf, self._dump_collection_as_branch)
self._dump_array = (self._dump_array_as_branch, self._dump_array = (self._dump_default_as_leaf,
self._dump_array_as_branch) self._dump_array_as_branch)
self._dump_matrix = (self._dump_matrix_as_leaf, self._dump_matrix = (self._dump_matrix_as_leaf,
self._dump_matrix_as_leaf) self._dump_matrix_as_leaf)
@ -593,10 +593,6 @@ class Loader:
instance.write(bpy.data.materials.get(dump)) instance.write(bpy.data.materials.get(dump))
elif isinstance(rna_property_type, T.Collection): elif isinstance(rna_property_type, T.Collection):
instance.write(bpy.data.collections.get(dump)) instance.write(bpy.data.collections.get(dump))
elif isinstance(rna_property_type, T.VectorFont):
instance.write(bpy.data.fonts.get(dump))
elif isinstance(rna_property_type, T.Sound):
instance.write(bpy.data.sounds.get(dump))
def _load_matrix(self, matrix, dump): def _load_matrix(self, matrix, dump):
matrix.write(mathutils.Matrix(dump)) matrix.write(mathutils.Matrix(dump))

View File

@ -20,16 +20,14 @@ import logging
import bpy import bpy
from . import operators, presence, utils from . import operators, presence, utils
from replication.constants import (FETCHED, from .libs.replication.replication.constants import (FETCHED,
UP,
RP_COMMON, RP_COMMON,
STATE_INITIAL, STATE_INITIAL,
STATE_QUITTING, STATE_QUITTING,
STATE_ACTIVE, STATE_ACTIVE,
STATE_SYNCING, STATE_SYNCING,
STATE_LOBBY, STATE_LOBBY,
STATE_SRV_SYNC, STATE_SRV_SYNC)
REPARENT)
class Delayable(): class Delayable():
@ -89,28 +87,16 @@ class ApplyTimer(Timer):
def execute(self): def execute(self):
client = operators.client client = operators.client
if client and client.state['STATE'] == STATE_ACTIVE: if client and client.state['STATE'] == STATE_ACTIVE:
if self._type:
nodes = client.list(filter=self._type) nodes = client.list(filter=self._type)
else:
nodes = client.list()
for node in nodes: for node in nodes:
node_ref = client.get(uuid=node) node_ref = client.get(uuid=node)
if node_ref.state == FETCHED: if node_ref.state == FETCHED:
try: try:
client.apply(node, force=True) client.apply(node)
except Exception as e: except Exception as e:
logging.error(f"Fail to apply {node_ref.uuid}: {e}") logging.error(f"Fail to apply {node_ref.uuid}: {e}")
elif node_ref.state == REPARENT:
# Reload the node
node_ref.remove_instance()
node_ref.resolve()
client.apply(node, force=True)
for parent in client._graph.find_parents(node):
logging.info(f"Applying parent {parent}")
client.apply(parent, force=True)
node_ref.state = UP
class DynamicRightSelectTimer(Timer): class DynamicRightSelectTimer(Timer):
@ -253,7 +239,7 @@ class DrawClient(Draw):
class ClientUpdate(Timer): class ClientUpdate(Timer):
def __init__(self, timout=.1): def __init__(self, timout=.016):
super().__init__(timout) super().__init__(timout)
self.handle_quit = False self.handle_quit = False
self.users_metadata = {} self.users_metadata = {}
@ -265,16 +251,14 @@ class ClientUpdate(Timer):
if session and renderer: if session and renderer:
if session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]: if session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]:
local_user = operators.client.online_users.get( local_user = operators.client.online_users.get(settings.username)
settings.username)
if not local_user: if not local_user:
return return
else: else:
for username, user_data in operators.client.online_users.items(): for username, user_data in operators.client.online_users.items():
if username != settings.username: if username != settings.username:
cached_user_data = self.users_metadata.get( cached_user_data = self.users_metadata.get(username)
username)
new_user_data = operators.client.online_users[username]['metadata'] new_user_data = operators.client.online_users[username]['metadata']
if cached_user_data is None: if cached_user_data is None:
@ -312,28 +296,8 @@ class ClientUpdate(Timer):
session.update_user_metadata(local_user_metadata) session.update_user_metadata(local_user_metadata)
elif 'view_corners' in local_user_metadata and current_view_corners != local_user_metadata['view_corners']: elif 'view_corners' in local_user_metadata and current_view_corners != local_user_metadata['view_corners']:
local_user_metadata['view_corners'] = current_view_corners local_user_metadata['view_corners'] = current_view_corners
local_user_metadata['view_matrix'] = presence.get_view_matrix( local_user_metadata['view_matrix'] = presence.get_view_matrix()
)
session.update_user_metadata(local_user_metadata) session.update_user_metadata(local_user_metadata)
class SessionStatusUpdate(Timer):
def __init__(self, timout=1):
super().__init__(timout)
def execute(self):
presence.refresh_sidebar_view()
class SessionUserSync(Timer):
def __init__(self, timout=1):
super().__init__(timout)
def execute(self):
session = getattr(operators, 'client', None)
renderer = getattr(presence, 'renderer', None)
if session and renderer:
# sync online users # sync online users
session_users = operators.client.online_users session_users = operators.client.online_users
ui_users = bpy.context.window_manager.online_users ui_users = bpy.context.window_manager.online_users
@ -350,3 +314,15 @@ class SessionUserSync(Timer):
new_key = ui_users.add() new_key = ui_users.add()
new_key.name = user new_key.name = user
new_key.username = user new_key.username = user
elif session.state['STATE'] == STATE_QUITTING:
presence.refresh_sidebar_view()
self.handle_quit = True
elif session.state['STATE'] == STATE_INITIAL and self.handle_quit:
self.handle_quit = False
presence.refresh_sidebar_view()
operators.unregister_delayables()
presence.renderer.stop()
presence.refresh_sidebar_view()
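
ApplyTimer and the other delayables above poll the session on a fixed interval: each tick lists the replicated nodes, applies the ones whose state is FETCHED and, in the newer revision, rebuilds nodes flagged REPARENT. On the Blender side such a repeating poll can be scheduled with bpy.app.timers, where the callback's return value is the delay until its next run; poll_once here is only a placeholder for ApplyTimer.execute:

```python
import bpy

APPLY_INTERVAL = 1.0   # seconds, in the spirit of bl_delay_apply

def poll_once():
    # Placeholder for ApplyTimer.execute(): list the session's nodes and
    # client.apply() the ones whose state is FETCHED.
    pass

def apply_timer():
    poll_once()
    return APPLY_INTERVAL          # reschedule; returning None would stop the timer

bpy.app.timers.register(apply_timer, first_interval=APPLY_INTERVAL)
```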

View File

@ -23,9 +23,6 @@ import subprocess
import sys import sys
from pathlib import Path from pathlib import Path
import socket import socket
import re
VERSION_EXPR = re.compile('\d+\.\d+\.\d+\w\d+')
THIRD_PARTY = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs") THIRD_PARTY = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs")
DEFAULT_CACHE_DIR = os.path.join( DEFAULT_CACHE_DIR = os.path.join(
@ -50,29 +47,10 @@ def install_pip():
subprocess.run([str(PYTHON_PATH), "-m", "ensurepip"]) subprocess.run([str(PYTHON_PATH), "-m", "ensurepip"])
def install_package(name, version): def install_package(name):
logging.info(f"installing {name} version...") logging.debug(f"Using {PYTHON_PATH} for installation")
env = os.environ subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", name])
if "PIP_REQUIRE_VIRTUALENV" in env:
# PIP_REQUIRE_VIRTUALENV is an env var to ensure pip cannot install packages outside a virtual env
# https://docs.python-guide.org/dev/pip-virtualenv/
# But since Blender's pip is outside of a virtual env, it can block our packages installation, so we unset the
# env var for the subprocess.
env = os.environ.copy()
del env["PIP_REQUIRE_VIRTUALENV"]
subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", f"{name}=={version}"], env=env)
def check_package_version(name, required_version):
logging.info(f"Checking {name} version...")
out = subprocess.run(f"{str(PYTHON_PATH)} -m pip show {name}", capture_output=True)
version = VERSION_EXPR.search(out.stdout.decode())
if version and version.group() == required_version:
logging.info(f"{name} is up to date")
return True
else:
logging.info(f"{name} need an update")
return False
def get_ip(): def get_ip():
""" """
@ -100,9 +78,7 @@ def setup(dependencies, python_path):
if not module_can_be_imported("pip"): if not module_can_be_imported("pip"):
install_pip() install_pip()
for package_name, package_version in dependencies: for module_name, package_name in dependencies:
if not module_can_be_imported(package_name): if not module_can_be_imported(module_name):
install_package(package_name, package_version) install_package(package_name)
module_can_be_imported(package_name) module_can_be_imported(package_name)
elif not check_package_version(package_name, package_version):
install_package(package_name, package_version)
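
The newer setup() pins every dependency to an exact version: it reads the installed version back from pip show and reinstalls when it differs, unsetting PIP_REQUIRE_VIRTUALENV for the subprocess so pip will run against Blender's bundled interpreter outside a virtualenv. A trimmed-down sketch of that flow against the current interpreter; the package and version at the bottom are purely illustrative:

```python
import os
import re
import subprocess
import sys

VERSION_EXPR = re.compile(r'\d+\.\d+\.\d+')    # simplified version pattern

def installed_version(name):
    out = subprocess.run([sys.executable, "-m", "pip", "show", name],
                         capture_output=True, text=True)
    match = VERSION_EXPR.search(out.stdout)
    return match.group() if match else None

def ensure_package(name, version):
    if installed_version(name) == version:
        return                                  # already up to date
    env = os.environ.copy()
    env.pop("PIP_REQUIRE_VIRTUALENV", None)     # pip refuses to install outside a venv when set
    subprocess.run([sys.executable, "-m", "pip", "install", f"{name}=={version}"],
                   check=True, env=env)

ensure_package("requests", "2.31.0")            # illustrative package/version only
```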

View File

@ -33,19 +33,31 @@ import mathutils
from bpy.app.handlers import persistent from bpy.app.handlers import persistent
from . import bl_types, delayable, environment, presence, ui, utils from . import bl_types, delayable, environment, presence, ui, utils
from replication.constants import (FETCHED, STATE_ACTIVE, from .libs.replication.replication.constants import (FETCHED, STATE_ACTIVE,
STATE_INITIAL, STATE_INITIAL,
STATE_SYNCING, RP_COMMON, UP) STATE_SYNCING)
from replication.data import ReplicatedDataFactory from .libs.replication.replication.data import ReplicatedDataFactory
from replication.exception import NonAuthorizedOperationError from .libs.replication.replication.exception import NonAuthorizedOperationError
from replication.interface import Session from .libs.replication.replication.interface import Session
client = None client = None
delayables = [] delayables = []
stop_modal_executor = False stop_modal_executor = False
modal_executor_queue = None
def unregister_delayables():
global delayables, stop_modal_executor
for d in delayables:
try:
d.unregister()
except:
continue
stop_modal_executor = True
# OPERATORS # OPERATORS
@ -67,32 +79,11 @@ class SessionStartOperator(bpy.types.Operator):
runtime_settings = context.window_manager.session runtime_settings = context.window_manager.session
users = bpy.data.window_managers['WinMan'].online_users users = bpy.data.window_managers['WinMan'].online_users
admin_pass = runtime_settings.password admin_pass = runtime_settings.password
use_extern_update = settings.update_method == 'DEPSGRAPH'
unregister_delayables()
users.clear() users.clear()
delayables.clear() delayables.clear()
logger = logging.getLogger()
if len(logger.handlers)==1:
formatter = logging.Formatter(
fmt='%(asctime)s CLIENT %(levelname)-8s %(message)s',
datefmt='%H:%M:%S'
)
log_directory = os.path.join(
settings.cache_directory,
"multiuser_client.log")
os.makedirs(settings.cache_directory, exist_ok=True)
handler = logging.FileHandler(log_directory, mode='w')
logger.addHandler(handler)
for handler in logger.handlers:
if isinstance(handler, logging.NullHandler):
continue
handler.setFormatter(formatter)
bpy_factory = ReplicatedDataFactory() bpy_factory = ReplicatedDataFactory()
supported_bl_types = [] supported_bl_types = []
@ -110,11 +101,9 @@ class SessionStartOperator(bpy.types.Operator):
bpy_factory.register_type( bpy_factory.register_type(
type_module_class.bl_class, type_module_class.bl_class,
type_module_class, type_module_class,
timer=type_local_config.bl_delay_refresh*1000, timer=type_local_config.bl_delay_refresh,
automatic=type_local_config.auto_push, automatic=type_local_config.auto_push)
check_common=type_module_class.bl_check_common)
if settings.update_method == 'DEFAULT':
if type_local_config.bl_delay_apply > 0: if type_local_config.bl_delay_apply > 0:
delayables.append( delayables.append(
delayable.ApplyTimer( delayable.ApplyTimer(
@ -123,12 +112,7 @@ class SessionStartOperator(bpy.types.Operator):
client = Session( client = Session(
factory=bpy_factory, factory=bpy_factory,
python_path=bpy.app.binary_path_python, python_path=bpy.app.binary_path_python)
external_update_handling=use_extern_update)
if settings.update_method == 'DEPSGRAPH':
delayables.append(delayable.ApplyTimer(
settings.depsgraph_update_rate/1000))
# Host a session # Host a session
if self.host: if self.host:
@ -147,10 +131,7 @@ class SessionStartOperator(bpy.types.Operator):
port=settings.port, port=settings.port,
ipc_port=settings.ipc_port, ipc_port=settings.ipc_port,
timeout=settings.connection_timeout, timeout=settings.connection_timeout,
password=admin_pass, password=admin_pass
cache_directory=settings.cache_directory,
server_log_level=logging.getLevelName(
logging.getLogger().level),
) )
except Exception as e: except Exception as e:
self.report({'ERROR'}, repr(e)) self.report({'ERROR'}, repr(e))
@ -177,32 +158,11 @@ class SessionStartOperator(bpy.types.Operator):
logging.error(str(e)) logging.error(str(e))
# Background client updates service # Background client updates service
#TODO: Refactoring
delayables.append(delayable.ClientUpdate()) delayables.append(delayable.ClientUpdate())
delayables.append(delayable.DrawClient()) delayables.append(delayable.DrawClient())
delayables.append(delayable.DynamicRightSelectTimer()) delayables.append(delayable.DynamicRightSelectTimer())
session_update = delayable.SessionStatusUpdate()
session_user_sync = delayable.SessionUserSync()
session_update.register()
session_user_sync.register()
delayables.append(session_update)
delayables.append(session_user_sync)
@client.register('on_connection')
def initialize_session():
settings = utils.get_preferences()
for node in client._graph.list_ordered():
node_ref = client.get(node)
if node_ref.state == FETCHED:
node_ref.resolve()
for node in client._graph.list_ordered():
node_ref = client.get(node)
if node_ref.state == FETCHED:
node_ref.apply()
# Launch drawing module # Launch drawing module
if runtime_settings.enable_presence: if runtime_settings.enable_presence:
presence.renderer.run() presence.renderer.run()
@ -211,28 +171,8 @@ class SessionStartOperator(bpy.types.Operator):
for d in delayables: for d in delayables:
d.register() d.register()
if settings.update_method == 'DEPSGRAPH': global modal_executor_queue
bpy.app.handlers.depsgraph_update_post.append( modal_executor_queue = queue.Queue()
depsgraph_evaluation)
@client.register('on_exit')
def desinitialize_session():
global delayables, stop_modal_executor
settings = utils.get_preferences()
for d in delayables:
try:
d.unregister()
except:
continue
stop_modal_executor = True
presence.renderer.stop()
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.remove(
depsgraph_evaluation)
bpy.ops.session.apply_armature_operator()
self.report(
@ -449,16 +389,14 @@ class SessionSnapUserOperator(bpy.types.Operator):
if target_scene != context.scene.name:
blender_scene = bpy.data.scenes.get(target_scene, None)
if blender_scene is None:
self.report( self.report({'ERROR'}, f"Scene {target_scene} doesn't exist on the local client.")
{'ERROR'}, f"Scene {target_scene} doesn't exist on the local client.")
session_sessings.time_snap_running = False
return {"CANCELLED"}
bpy.context.window.scene = blender_scene
# Update client viewmatrix
client_vmatrix = target_ref['metadata'].get( client_vmatrix = target_ref['metadata'].get('view_matrix', None)
'view_matrix', None)
if client_vmatrix:
rv3d.view_matrix = mathutils.Matrix(client_vmatrix)
@ -589,7 +527,7 @@ class ApplyArmatureOperator(bpy.types.Operator):
try:
client.apply(node)
except Exception as e:
logging.error("Fail to apply armature: {e}") logging.error("Dail to apply armature: {e}")
return {'PASS_THROUGH'}
@ -655,41 +593,6 @@ def update_client_frame(scene):
})
@persistent
def depsgraph_evaluation(scene):
if client and client.state['STATE'] == STATE_ACTIVE:
context = bpy.context
blender_depsgraph = bpy.context.view_layer.depsgraph
dependency_updates = [u for u in blender_depsgraph.updates]
settings = utils.get_preferences()
# NOTE: maybe we don't need to check each update but only the first
for update in reversed(dependency_updates):
# Is the object tracked ?
if update.id.uuid:
# Retrieve local version
node = client.get(update.id.uuid)
# Check our right on this update:
# - if its ours or ( under common and diff), launch the
# update process
# - if its to someone else, ignore the update (go deeper ?)
if node and node.owner in [client.id, RP_COMMON] and node.state == UP:
# Avoid slow geometry update
if 'EDIT' in context.mode and \
not settings.enable_editmode_updates:
break
client.stash(node.uuid)
else:
# Distant update
continue
# else:
# # New items !
# logger.error("UPDATE: ADD")
def register():
from bpy.utils import register_class
for cls in classes:
@ -718,3 +621,7 @@ def unregister():
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.frame_change_pre.remove(update_client_frame)
if __name__ == "__main__":
register()
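
The hunks above drop the experimental dependency-graph update path, including the `depsgraph_evaluation` handler. For context, here is a minimal, self-contained sketch of how a Blender depsgraph hook is typically wired up; the function and handler names are illustrative, not the addon's own:

import bpy
from bpy.app.handlers import persistent


@persistent
def on_depsgraph_update(scene):
    # Runs after every dependency graph evaluation of the active view layer.
    depsgraph = bpy.context.view_layer.depsgraph
    for update in depsgraph.updates:
        # update.id is the datablock that changed; the flags describe the change.
        print(update.id.name, update.is_updated_transform, update.is_updated_geometry)


def register_handler():
    bpy.app.handlers.depsgraph_update_post.append(on_depsgraph_update)


def unregister_handler():
    if on_depsgraph_update in bpy.app.handlers.depsgraph_update_post:
        bpy.app.handlers.depsgraph_update_post.remove(on_depsgraph_update)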

View File

@ -21,9 +21,8 @@ import bpy
import string
import re
from . import bl_types, environment, addon_updater_ops, presence, ui from . import utils, bl_types, environment, addon_updater_ops, presence, ui
from .utils import get_preferences, get_expanded_icon from .libs.replication.replication.constants import RP_COMMON
from replication.constants import RP_COMMON
IP_EXPR = re.compile('\d+\.\d+\.\d+\.\d+')
@ -47,7 +46,6 @@ def update_panel_category(self, context):
ui.SESSION_PT_settings.bl_category = self.panel_category
ui.register()
def update_ip(self, context):
ip = IP_EXPR.search(self.ip)
@ -57,25 +55,14 @@ def update_ip(self, context):
logging.error("Wrong IP format")
self['ip'] = "127.0.0.1"
def update_port(self, context):
max_port = self.port + 3
if self.ipc_port < max_port and \
self['ipc_port'] >= self.port:
logging.error( logging.error("IPC Port in conflic with the port, assigning a random value")
"IPC Port in conflic with the port, assigning a random value")
self['ipc_port'] = random.randrange(self.port+4, 10000)
def set_log_level(self, value):
logging.getLogger().setLevel(value)
def get_log_level(self):
return logging.getLogger().level
class ReplicatedDatablock(bpy.types.PropertyGroup):
type_name: bpy.props.StringProperty()
bl_name: bpy.props.StringProperty()
@ -142,26 +129,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
description='connection timeout before disconnection',
default=1000
)
update_method: bpy.props.EnumProperty(
name='update method',
description='replication update method',
items=[
('DEFAULT', "Default", "Default: Use threads to monitor databloc changes"),
('DEPSGRAPH', "Depsgraph",
"Experimental: Use the blender dependency graph to trigger updates"),
],
)
# Replication update settings
depsgraph_update_rate: bpy.props.IntProperty(
name='depsgraph update rate',
description='Dependency graph uppdate rate (milliseconds)',
default=100
)
enable_editmode_updates: bpy.props.BoolProperty(
name="Edit mode updates",
description="Enable objects update in edit mode (! Impact performances !)",
default=False
)
# for UI
category: bpy.props.EnumProperty(
name="Category",
@ -172,18 +139,17 @@ class SessionPrefs(bpy.types.AddonPreferences):
],
default='CONFIG'
)
# WIP
logging_level: bpy.props.EnumProperty(
name="Log level",
description="Log verbosity level",
items=[
('ERROR', "error", "show only errors", logging.ERROR), ('ERROR', "error", "show only errors"),
('WARNING', "warning", "only show warnings and errors", logging.WARNING), ('WARNING', "warning", "only show warnings and errors"),
('INFO', "info", "default level", logging.INFO), ('INFO', "info", "default level"),
('DEBUG', "debug", "show all logs", logging.DEBUG), ('DEBUG', "debug", "show all logs"),
],
default='INFO', default='INFO'
set=set_log_level,
get=get_log_level
)
conf_session_identity_expanded: bpy.props.BoolProperty(
name="Identity",
@ -215,21 +181,7 @@ class SessionPrefs(bpy.types.AddonPreferences):
description="Interface",
default=False
)
sidebar_advanced_rep_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_rep_expanded",
description="sidebar_advanced_rep_expanded",
default=False
)
sidebar_advanced_log_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_log_expanded",
description="sidebar_advanced_log_expanded",
default=False
)
sidebar_advanced_net_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_net_expanded",
description="sidebar_advanced_net_expanded",
default=False
)
auto_check_update: bpy.props.BoolProperty(
name="Auto-check for Update",
description="If enabled, auto-check for updates using an interval",
@ -281,8 +233,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_identity_expanded", text="User informations",
icon=get_expanded_icon(self.conf_session_identity_expanded), icon='DISCLOSURE_TRI_DOWN' if self.conf_session_identity_expanded
emboss=False) else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_identity_expanded:
box.row().prop(self, "username", text="name")
box.row().prop(self, "client_color", text="color")
@ -291,26 +243,23 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_net_expanded", text="Netorking",
icon=get_expanded_icon(self.conf_session_net_expanded), icon='DISCLOSURE_TRI_DOWN' if self.conf_session_net_expanded
emboss=False) else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_net_expanded:
box.row().prop(self, "ip", text="Address")
row = box.row()
row.label(text="Port:")
row.prop(self, "port", text="") row.prop(self, "port", text="Address")
row = box.row()
row.label(text="Init the session from:")
row.prop(self, "init_method", text="")
row = box.row()
row.label(text="Update method:")
row.prop(self, "update_method", text="")
table = box.box()
table.row().prop(
self, "conf_session_timing_expanded", text="Refresh rates",
icon=get_expanded_icon(self.conf_session_timing_expanded), icon='DISCLOSURE_TRI_DOWN' if self.conf_session_timing_expanded
emboss=False) else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_timing_expanded:
line = table.row()
@ -328,8 +277,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_hosting_expanded", text="Hosting",
icon=get_expanded_icon(self.conf_session_hosting_expanded), icon='DISCLOSURE_TRI_DOWN' if self.conf_session_hosting_expanded
emboss=False) else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_hosting_expanded:
row = box.row()
row.label(text="Init the session from:")
@ -339,8 +288,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_cache_expanded", text="Cache",
icon=get_expanded_icon(self.conf_session_cache_expanded), icon='DISCLOSURE_TRI_DOWN' if self.conf_session_cache_expanded
emboss=False) else 'DISCLOSURE_TRI_RIGHT', emboss=False)
if self.conf_session_cache_expanded:
box.row().prop(self, "cache_directory", text="Cache directory")
@ -348,14 +297,14 @@ class SessionPrefs(bpy.types.AddonPreferences):
box = grid.box()
box.prop(
self, "conf_session_ui_expanded", text="Interface",
icon=get_expanded_icon(self.conf_session_ui_expanded), icon='DISCLOSURE_TRI_DOWN' if self.conf_session_ui_expanded else 'DISCLOSURE_TRI_RIGHT',
emboss=False)
if self.conf_session_ui_expanded:
box.row().prop(self, "panel_category", text="Panel category", expand=True)
if self.category == 'UPDATE':
from . import addon_updater_ops
addon_updater_ops.update_settings_ui(self, context) addon_updater_ops.update_settings_ui_condensed(self, context)
def generate_supported_types(self):
self.supported_datablocks.clear()
@ -382,7 +331,7 @@ def client_list_callback(scene, context):
items = [(RP_COMMON, RP_COMMON, "")]
username = get_preferences().username username = utils.get_preferences().username
cli = operators.client
if cli:
client_ids = cli.online_users.keys()
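
The preference changes above wire the new `logging_level` enum straight onto Python's logging module through get/set callbacks and numeric enum values. A standalone sketch of that pattern, with an invented property group name for illustration (register it with bpy.utils.register_class before use):

import logging
import bpy


def set_log_level(self, value):
    # The enum item numbers below are the logging module's own level values.
    logging.getLogger().setLevel(value)


def get_log_level(self):
    return logging.getLogger().level


class LoggingPrefsSketch(bpy.types.PropertyGroup):
    logging_level: bpy.props.EnumProperty(
        name="Log level",
        description="Log verbosity level",
        items=[
            ('ERROR', "error", "show only errors", logging.ERROR),
            ('WARNING', "warning", "only show warnings and errors", logging.WARNING),
            ('INFO', "info", "default level", logging.INFO),
            ('DEBUG', "debug", "show all logs", logging.DEBUG),
        ],
        default='INFO',
        set=set_log_level,
        get=get_log_level,
    )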

View File

@ -19,7 +19,6 @@
import copy
import logging
import math
import traceback
import bgl
import blf
@ -61,7 +60,6 @@ def refresh_sidebar_view():
"""
area, region, rv3d = view3d_find()
if area:
area.regions[3].tag_redraw()
def get_target(region, rv3d, coord):
@ -313,10 +311,10 @@ class DrawFactory(object):
self.d2d_items[client_id] = (position[1], client_id, color)
except Exception as e:
logging.debug(f"Draw client exception: {e} \n {traceback.format_exc()}\n pos:{position},ind:{indices}") logging.error(f"Draw client exception: {e}")
def draw3d_callback(self):
bgl.glLineWidth(2.) bgl.glLineWidth(1.5)
bgl.glEnable(bgl.GL_DEPTH_TEST)
bgl.glEnable(bgl.GL_BLEND)
bgl.glEnable(bgl.GL_LINE_SMOOTH)
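
The presence overlay above issues these `bgl` state changes from a viewport draw callback. As a rough illustration of how such a callback is registered (the handle and callback names are placeholders, not the addon's):

import bpy
import bgl


def draw3d_sketch():
    # Set GL state before issuing any batch draws, as in the callback above.
    bgl.glLineWidth(2.0)
    bgl.glEnable(bgl.GL_BLEND)
    # ... draw user gizmos here ...
    bgl.glDisable(bgl.GL_BLEND)


# Run the callback after the viewport geometry has been drawn.
draw_handle = bpy.types.SpaceView3D.draw_handler_add(
    draw3d_sketch, (), 'WINDOW', 'POST_VIEW')

# Later, remove it again:
# bpy.types.SpaceView3D.draw_handler_remove(draw_handle, 'WINDOW')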

View File

@ -18,9 +18,8 @@
import bpy
from . import operators from . import operators, utils
from .utils import get_preferences, get_expanded_icon from .libs.replication.replication.constants import (ADDED, ERROR, FETCHED,
from replication.constants import (ADDED, ERROR, FETCHED,
MODIFIED, RP_COMMON, UP,
STATE_ACTIVE, STATE_AUTH,
STATE_CONFIG, STATE_SYNCING,
@ -28,7 +27,6 @@ from replication.constants import (ADDED, ERROR, FETCHED,
STATE_WAITING, STATE_QUITTING,
STATE_LOBBY,
STATE_LAUNCHING_SERVICES)
from replication import __version__
ICONS_PROP_STATES = ['TRIA_DOWN', # ADDED
'TRIA_UP', # COMMITED
@ -52,8 +50,6 @@ def printProgressBar(iteration, total, prefix='', suffix='', decimals=1, length=
From here:
https://gist.github.com/greenstick/b23e475d2bfdc3a82e34eaa1f6781ee4
"""
if total == 0:
return ""
filledLength = int(length * iteration // total)
bar = fill * filledLength + fill_empty * (length - filledLength)
return f"{prefix} |{bar}| {iteration}/{total}{suffix}"
@ -107,14 +103,14 @@ class SESSION_PT_settings(bpy.types.Panel):
layout.label(text=f"Session - {get_state_str(cli_state['STATE'])}", icon=connection_icon)
else:
layout.label(text=f"Session - v{__version__}",icon="PROP_OFF") layout.label(text="Session",icon="PROP_OFF")
def draw(self, context):
layout = self.layout
layout.use_property_split = True
row = layout.row()
runtime_settings = context.window_manager.session
settings = get_preferences() settings = utils.get_preferences()
if hasattr(context.window_manager, 'session'):
# STATE INITIAL
@ -130,18 +126,14 @@ class SESSION_PT_settings(bpy.types.Panel):
current_state = cli_state['STATE']
# STATE ACTIVE
if current_state in [STATE_ACTIVE]: if current_state in [STATE_ACTIVE, STATE_LOBBY]:
row.operator("session.stop", icon='QUIT', text="Exit")
row = layout.row()
if runtime_settings.is_host:
row = row.box()
row.label(text=f"LAN: {runtime_settings.internet_ip}", icon='INFO') row.label(text=f"{runtime_settings.internet_ip}:{settings.port}", icon='INFO')
row = layout.row()
if current_state == STATE_LOBBY:
row = row.box()
row.label(text=f"Waiting the session to start", icon='INFO')
row = layout.row()
row.operator("session.stop", icon='QUIT', text="Exit")
# CONNECTION STATE
elif current_state in [STATE_SRV_SYNC,
STATE_SYNCING,
@ -197,7 +189,7 @@ class SESSION_PT_settings_network(bpy.types.Panel):
layout = self.layout
runtime_settings = context.window_manager.session
settings = get_preferences() settings = utils.get_preferences()
# USER SETTINGS
row = layout.row()
@ -255,7 +247,7 @@ class SESSION_PT_settings_user(bpy.types.Panel):
layout = self.layout
runtime_settings = context.window_manager.session
settings = get_preferences() settings = utils.get_preferences()
row = layout.row()
# USER SETTINGS
@ -286,18 +278,11 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
layout = self.layout
runtime_settings = context.window_manager.session
settings = get_preferences() settings = utils.get_preferences()
net_section = layout.row().box()
net_section.prop( net_section.label(text="Network ", icon='TRIA_DOWN')
settings,
"sidebar_advanced_net_expanded",
text="Network",
icon=get_expanded_icon(settings.sidebar_advanced_net_expanded),
emboss=False)
if settings.sidebar_advanced_net_expanded:
net_section_row = net_section.row()
net_section_row.label(text="IPC Port:")
net_section_row.prop(settings, "ipc_port", text="")
@ -306,38 +291,16 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
net_section_row.prop(settings, "connection_timeout", text="")
replication_section = layout.row().box()
replication_section.prop( replication_section.label(text="Replication ", icon='TRIA_DOWN')
settings,
"sidebar_advanced_rep_expanded",
text="Replication",
icon=get_expanded_icon(settings.sidebar_advanced_rep_expanded),
emboss=False)
if settings.sidebar_advanced_rep_expanded:
replication_section_row = replication_section.row()
replication_section_row.label(text="Sync flags", icon='COLLECTION_NEW')
replication_section_row = replication_section.row()
if runtime_settings.session_mode == 'HOST':
replication_section_row.prop(settings.sync_flags, "sync_render_settings")
replication_section_row = replication_section.row()
replication_section_row.prop(settings, "enable_editmode_updates")
replication_section_row = replication_section.row()
if settings.enable_editmode_updates: replication_section_row.label(text="Per data type timers:")
warning = replication_section_row.box()
warning.label(text="Don't use this with heavy meshes !", icon='ERROR')
replication_section_row = replication_section.row()
replication_section_row.label(text="Update method", icon='RECOVER_LAST')
replication_section_row = replication_section.row()
replication_section_row.prop(settings, "update_method", expand=True)
replication_section_row = replication_section.row()
replication_timers = replication_section_row.box()
replication_timers.label(text="Replication timers", icon='TIME')
if settings.update_method == "DEFAULT":
replication_timers = replication_timers.row()
# Replication frequencies
flow = replication_timers.grid_flow( flow = replication_section_row .grid_flow(
row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
line = flow.row(align=True)
line.label(text=" ")
@ -351,23 +314,8 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
line.separator()
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
else:
replication_timers = replication_timers.row()
replication_timers.label(text="Update rate (ms):")
replication_timers.prop(settings, "depsgraph_update_rate", text="")
log_section = layout.row().box()
log_section.prop(
settings,
"sidebar_advanced_log_expanded",
text="Logging",
icon=get_expanded_icon(settings.sidebar_advanced_log_expanded),
emboss=False)
if settings.sidebar_advanced_log_expanded:
log_section_row = log_section.row()
log_section_row.label(text="Log level:")
log_section_row.prop(settings, 'logging_level', text="")
class SESSION_PT_user(bpy.types.Panel):
bl_idname = "MULTIUSER_USER_PT_panel"
bl_label = "Online users"
@ -386,7 +334,7 @@ class SESSION_PT_user(bpy.types.Panel):
layout = self.layout
online_users = context.window_manager.online_users
selected_user = context.window_manager.user_index
settings = get_preferences() settings = utils.get_preferences()
active_user = online_users[selected_user] if len(
online_users)-1 >= selected_user else 0
runtime_settings = context.window_manager.session
@ -408,8 +356,6 @@ class SESSION_PT_user(bpy.types.Panel):
if active_user != 0 and active_user.username != settings.username:
row = layout.row()
user_operations = row.split()
if operators.client.state['STATE'] == STATE_ACTIVE:
user_operations.alert = context.window_manager.session.time_snap_running
user_operations.operator(
"session.snapview",
@ -432,7 +378,7 @@ class SESSION_PT_user(bpy.types.Panel):
class SESSION_UL_users(bpy.types.UIList):
def draw_item(self, context, layout, data, item, icon, active_data, active_propname, index, flt_flag):
session = operators.client
settings = get_preferences() settings = utils.get_preferences()
is_local_user = item.username == settings.username
ping = '-'
frame_current = '-'
@ -444,8 +390,8 @@ class SESSION_UL_users(bpy.types.UIList):
ping = str(user['latency'])
metadata = user.get('metadata')
if metadata and 'frame_current' in metadata:
frame_current = str(metadata.get('frame_current','-')) frame_current = str(metadata['frame_current'])
scene_current = metadata.get('scene_current','-') scene_current = metadata['scene_current']
if user['admin']:
status_icon = 'FAKE_USER_ON'
split = layout.split(factor=0.35)
@ -516,7 +462,7 @@ class SESSION_PT_services(bpy.types.Panel):
def draw_property(context, parent, property_uuid, level=0):
settings = get_preferences() settings = utils.get_preferences()
runtime_settings = context.window_manager.session
item = operators.client.get(uuid=property_uuid)
@ -586,18 +532,9 @@ class SESSION_PT_repository(bpy.types.Panel):
@classmethod
def poll(cls, context):
session = operators.client
settings = get_preferences()
admin = False
if session and hasattr(session,'online_users'):
usr = session.online_users.get(settings.username)
if usr:
admin = usr['admin']
return hasattr(context.window_manager, 'session') and \
operators.client and \
(operators.client.state['STATE'] == STATE_ACTIVE or \ operators.client.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
operators.client.state['STATE'] == STATE_LOBBY and admin)
def draw_header(self, context):
self.layout.label(text="", icon='OUTLINER_OB_GROUP_INSTANCE')
@ -606,7 +543,7 @@ class SESSION_PT_repository(bpy.types.Panel):
layout = self.layout
# Filters
settings = get_preferences() settings = utils.get_preferences()
runtime_settings = context.window_manager.session
session = operators.client
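
The `printProgressBar` helper above, adapted from the linked gist, now guards against a zero total. A self-contained sketch of the same formatting logic (parameter defaults and fill characters here are illustrative choices, not the addon's exact values):

def progress_bar(iteration, total, prefix='', suffix='', length=16, fill='#', fill_empty='-'):
    # Avoid a division by zero when nothing has been synced yet.
    if total == 0:
        return ""
    filled_length = int(length * iteration // total)
    bar = fill * filled_length + fill_empty * (length - filled_length)
    return f"{prefix} |{bar}| {iteration}/{total}{suffix}"


# Example: progress_bar(3, 10, prefix='Syncing') -> "Syncing |####------------| 3/10"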

View File

@ -78,28 +78,9 @@ def resolve_from_id(id, optionnal_type=None):
return root[id]
return None
def get_datablock_from_uuid(uuid, default, ignore=[]):
if not uuid:
return default
for category in dir(bpy.data):
root = getattr(bpy.data, category)
if isinstance(root, Iterable) and category not in ignore:
for item in root:
if getattr(item, 'uuid', None) == uuid:
return item
return default
def get_preferences():
return bpy.context.preferences.addons[__package__].preferences
def current_milli_time():
return int(round(time.time() * 1000))
def get_expanded_icon(prop: bpy.types.BoolProperty) -> str:
if prop:
return 'DISCLOSURE_TRI_DOWN'
else:
return 'DISCLOSURE_TRI_RIGHT'
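
The removed `get_datablock_from_uuid` helper above scans every `bpy.data` collection for a datablock carrying a matching custom `uuid` property. A hedged, standalone re-sketch of that lookup (the function name and default arguments are adjusted for illustration):

from collections.abc import Iterable

import bpy


def find_datablock_by_uuid(uuid, default=None, ignore=()):
    # Walk every bpy.data collection (objects, meshes, materials, ...) and
    # return the first datablock whose 'uuid' property matches.
    if not uuid:
        return default
    for category in dir(bpy.data):
        root = getattr(bpy.data, category)
        if isinstance(root, Iterable) and category not in ignore:
            for item in root:
                if getattr(item, 'uuid', None) == uuid:
                    return item
    return default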

View File

@ -2,7 +2,7 @@ import os
import pytest
from deepdiff import DeepDiff
from uuid import uuid4
import bpy
import random
from multi_user.bl_types.bl_collection import BlCollection
@ -10,13 +10,8 @@ from multi_user.bl_types.bl_collection import BlCollection
def test_collection(clear_blend):
# Generate a collection with childrens and a cube
datablock = bpy.data.collections.new("root") datablock = bpy.data.collections.new("root")
datablock.uuid = str(uuid4()) datablock.children.link(bpy.data.collections.new("child"))
s1 = bpy.data.collections.new("child") datablock.children.link(bpy.data.collections.new("child2"))
s1.uuid = str(uuid4())
s2 = bpy.data.collections.new("child2")
s2.uuid = str(uuid4())
datablock.children.link(s1)
datablock.children.link(s2)
bpy.ops.mesh.primitive_cube_add()
datablock.objects.link(bpy.data.objects[0])
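
The collection test above compares dumped and reloaded data with deepdiff. A minimal usage sketch of that comparison pattern; the dictionaries below only stand in for the real dumped structures:

from deepdiff import DeepDiff

# Two structures that should be identical after a dump/load round trip.
expected = {"name": "root", "children": ["child", "child2"], "objects": ["Cube"]}
result = {"name": "root", "children": ["child", "child2"], "objects": ["Cube"]}

# DeepDiff returns an empty mapping when nothing differs.
assert not DeepDiff(expected, result)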

View File

@ -30,11 +30,9 @@ CONSTRAINTS_TYPES = [
'COPY_ROTATION', 'COPY_SCALE', 'COPY_TRANSFORMS', 'LIMIT_DISTANCE',
'LIMIT_LOCATION', 'LIMIT_ROTATION', 'LIMIT_SCALE', 'MAINTAIN_VOLUME',
'TRANSFORM', 'TRANSFORM_CACHE', 'CLAMP_TO', 'DAMPED_TRACK', 'IK',
'LOCKED_TRACK', 'STRETCH_TO', 'TRACK_TO', 'ACTION', 'LOCKED_TRACK', 'SPLINE_IK', 'STRETCH_TO', 'TRACK_TO', 'ACTION',
'ARMATURE', 'CHILD_OF', 'FLOOR', 'FOLLOW_PATH', 'PIVOT', 'SHRINKWRAP']
# temporarily disabled 'SPLINE_IK' until it's fixed
def test_object(clear_blend):
bpy.ops.mesh.primitive_cube_add(
enter_editmode=False, align='WORLD', location=(0, 0, 0))
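
Presumably the object test iterates over `CONSTRAINTS_TYPES` and adds each constraint to the cube. A hedged sketch of that loop, only meant to illustrate the `constraints.new` API; the subset of types chosen here is arbitrary:

import bpy

bpy.ops.mesh.primitive_cube_add(enter_editmode=False, align='WORLD', location=(0, 0, 0))
obj = bpy.context.active_object

for constraint_type in ('COPY_LOCATION', 'COPY_ROTATION', 'LIMIT_SCALE'):
    # Each call appends a new, default-initialised constraint to the object.
    obj.constraints.new(type=constraint_type)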