Compare commits


438 Commits

Author SHA1 Message Date
e0b56d8990 Merge branch 'develop' into 'master'
v0.1.0

See merge request slumber/multi-user!43
2020-10-05 13:42:16 +00:00
0687090f05 feat: update changelog 2020-10-05 15:18:01 +02:00
920744334c Merge branch '125-autobuild-docker-image' into 'develop'
Resolve "Autobuild docker image"

See merge request slumber/multi-user!53
2020-10-05 09:32:57 +00:00
dfa7f98126 refactor: remove useless script 2020-10-05 11:28:45 +02:00
ea530f0f96 feat: enable test and build back 2020-10-03 00:30:37 +02:00
c3546ff74f fix: var name 2020-10-03 00:28:11 +02:00
83aa9b57ec feat: automatic image version 2020-10-03 00:26:44 +02:00
28a265be68 test: login in script 2020-10-03 00:12:39 +02:00
7dfabb16c7 fix: tls 2020-10-03 00:07:07 +02:00
ea5d9371ca feat: login 2020-10-03 00:00:42 +02:00
3df73a0716 feat: find replication version 2020-10-02 23:58:08 +02:00
ae3c994ff1 feat: dind tests 2020-10-02 23:55:04 +02:00
bd73b385b6 feat: dind 2020-10-02 23:52:19 +02:00
f054b1c5f2 fix: trying to use a standard docker image 2020-10-02 23:38:09 +02:00
d083100a2a fix: image directory path 2020-10-02 23:33:50 +02:00
b813b8df9e feat: docker build and push 2020-10-02 23:32:06 +02:00
d0e966ff1a fix: path 2020-10-02 23:29:48 +02:00
56cbf14fe1 refactor: use custom image 2020-10-02 23:27:45 +02:00
8bf55ebd46 feat: apk update 2020-10-02 23:19:34 +02:00
edbc5ee343 feat: apt install 2020-10-02 23:16:46 +02:00
4a92511582 feat: test install python 2020-10-02 23:14:49 +02:00
b42df2cf4a feat: retrieve version 2020-10-02 23:07:25 +02:00
7549466824 fix: ci deploy name 2020-10-02 18:59:25 +02:00
423e71476d feat: update ci 2020-10-02 18:57:50 +02:00
3bc4b20035 feat: CI file and docker image 2020-10-02 18:56:26 +02:00
9966a24b5e feat: update supported types in README.md 2020-10-02 18:04:32 +02:00
577c01a594 Merge branch '124-use-a-global-session-instance-in-replication' into 'develop'
Resolve "use a global session instance in replication"

See merge request slumber/multi-user!52
2020-10-02 15:51:30 +00:00
3d72796c10 refactor: remove old client ref
feat: update changelog
2020-10-02 17:48:56 +02:00
edcbd7b02a feat: display node in error in the repository view 2020-10-02 17:22:14 +02:00
b368c985b8 refactor: session handler encapsulation 2020-10-02 12:11:53 +02:00
cab1a71eaa fix: version 2020-10-02 09:52:21 +02:00
33cb188509 refactor: use replication session global instance 2020-10-02 00:05:33 +02:00
0a3dd9b5b8 fix: missing get_datablock_from_uuid 2020-10-02 00:00:34 +02:00
7fbdbdcc21 feat: show flag in blender overlays panel 2020-10-01 22:55:06 +02:00
8f9d5aabf9 refactor: moved get_datablock_from_uuid to bl_datablock 2020-10-01 22:51:48 +02:00
824d4d6a83 feat: upgrade replication version to fix duplicate during REPARENT
Related to #113
2020-10-01 15:34:36 +02:00
5f4bccbcd9 feat: POLY curves support
During a mesh->curve conversion, the curve's spline type is changed to POLY. This adds support for POLY curves.

Related to #113
2020-10-01 15:33:10 +02:00
8e8e54fe7d Merge branch '122-crash-on-connection' into 'develop'
Resolve "Crash on connection"

See merge request slumber/multi-user!50
2020-10-01 09:17:59 +00:00
04b13cc0b7 refactor: move connection handlers to the main thread 2020-10-01 10:58:30 +02:00
ba98875560 fix: version check command format 2020-09-29 17:33:39 +02:00
a9fb84a5c6 fix: world viewport color sync 2020-09-29 11:47:48 +02:00
2f139178d3 feat: update replication version 2020-09-28 22:59:43 +02:00
e466f81600 fix: file handler not properly closed 2020-09-28 22:51:07 +02:00
cb836e30f5 fix: empty uv useless update 2020-09-28 22:50:42 +02:00
152e356dad fix: font/sound loading 2020-09-28 10:40:07 +02:00
7b13e8978b fix: close file handler after quitting the session 2020-09-28 10:32:41 +02:00
e0839fe1fb Merge branch '118-optionnal-active-camera-sync-flag' into 'develop'
Resolve "Optionnal active camera sync flag"

See merge request slumber/multi-user!49
2020-09-25 14:09:31 +00:00
aec3e8b8bf doc: update replication flag section 2020-09-25 15:27:01 +02:00
a89564de6b feat: append synchronization flags to the top
refactor: enable sync render settings by default
2020-09-25 14:26:31 +02:00
e301a10456 feat: active camera sync flag 2020-09-25 11:33:35 +02:00
cfc6ce91bc feat: initial live syncflag support 2020-09-25 11:23:36 +02:00
4f731c6640 fix: implementation not found if a new type is added 2020-09-23 17:37:21 +02:00
9b1b8f11fd feat: sync object hide_render 2020-09-23 16:48:17 +02:00
e742c824fc feat: sync all object show flags except hide_viewport. 2020-09-23 16:47:51 +02:00
6757bbbd30 fix: enable DIFF_BINARY by default 2020-09-23 16:04:31 +02:00
f6a39e4290 fix: scene differential error
fix: bl_file loading error
feat: update replication version
2020-09-23 14:24:57 +02:00
410d8d2f1a feat: display sync 2020-09-23 10:00:08 +02:00
bd64c17f05 feat: update version 2020-09-22 16:36:59 +02:00
dc063b5954 fix: handle file not found exception 2020-09-21 18:52:27 +02:00
0ae34d5702 Merge branch 'file_replication' into 'develop'
Basic file replication interface

See merge request slumber/multi-user!48
2020-09-21 16:17:58 +00:00
167b39f15e doc: added a cache section to the quickstart 2020-09-21 18:14:30 +02:00
9adc0d7d6e clean: remove image testing (until the file replication interface is done) 2020-09-21 17:48:07 +02:00
fb622fa098 fix: get_datablock_users attribute error 2020-09-21 17:37:06 +02:00
c533d4b86a ci: run tests on every branch 2020-09-21 17:31:07 +02:00
6c47e095be feat: cache management utility 2020-09-21 16:47:49 +02:00
f992d06b03 feat: handle packed datablock
feat: filecache settings
2020-09-21 12:12:19 +02:00
af3afc1124 feat: use bl_file in bl_image 2020-09-21 00:11:37 +02:00
b77ab2dd05 feat: use bl_file to replicate font and sound files 2020-09-20 23:31:24 +02:00
150054d19c feat: generic file replication ground work 2020-09-20 19:53:51 +02:00
8d2b9e5580 Merge branch '65-sync-speaker-sounds' into 'develop'
Partial support for syncing speaker sound files

See merge request slumber/multi-user!47
2020-09-19 19:37:43 +00:00
6870331c34 feat: notice 2020-09-19 18:59:03 +02:00
6f73b7fc29 feat: ground work for sound sync 2020-09-19 00:47:46 +02:00
6385830f53 fix: prevent world replication conflict with external addons 2020-09-18 23:38:21 +02:00
b705228f4a feat: support all font file extensions 2020-09-18 23:30:50 +02:00
73d2da4c47 fix: ReparentException error
feat: replication protocol version in ui header
2020-09-18 23:25:01 +02:00
b28e7c2149 Merge branch '116-bfon-is-missing' into 'develop'
Resolve "Bfont is missing"

See merge request slumber/multi-user!46
2020-09-18 21:10:13 +00:00
38f06683be fix: bfont is missing
related to #116
2020-09-18 23:09:47 +02:00
62221c9e49 Merge branch '114-support-custom-fonts' into 'develop'
Resolve "Support custom fonts"

See merge request slumber/multi-user!45
2020-09-18 15:05:25 +00:00
e9f416f682 feat: ground work for custom font support 2020-09-18 17:04:24 +02:00
3108a06e89 fix: sync flag missing comma 2020-09-18 16:17:19 +02:00
470df50dc2 fix: bl_image test, disabling texture unload from ram. 2020-09-18 16:02:50 +02:00
d8a94e3f5e fix: image uuid error 2020-09-18 15:58:43 +02:00
47a0efef27 Merge branch '113-support-datablock-conversion' into 'develop'
Resolve "Support datablock conversion"

See merge request slumber/multi-user!44
2020-09-18 13:33:43 +00:00
ca5aebfeff feat: various image formats support
feat: world environment image support
2020-09-18 15:25:52 +02:00
fe6ffd19b4 feat: child data renaming support 2020-09-17 23:45:09 +02:00
b9a6ddafe9 fix: object data load 2020-09-17 23:17:51 +02:00
ae71d7757e feat: reparent ground work 2020-09-17 22:47:11 +02:00
34ed5da6f0 fix: logging 2020-09-15 16:33:49 +02:00
2c16f07ae7 doc: update Changelog 2020-09-15 15:05:09 +02:00
60f25359d1 Merge branch '111-improve-the-logging-process' into 'develop'
Resolve "Improve the logging process"

See merge request slumber/multi-user!42
2020-09-15 11:03:42 +00:00
975b50a988 doc: update log related sections 2020-09-15 13:02:50 +02:00
66417dc84a refactor: minor ui cleanup 2020-09-15 12:40:51 +02:00
514f90d602 feat: logging to files
feat: logging level

Related to #111
2020-09-15 12:31:46 +02:00
086876ad2e feat: update version check to handle experimental ones 2020-09-15 12:29:20 +02:00
71c179f32f fix: python version 2020-09-09 11:58:51 +02:00
2399096b07 feat: experimenting a custom testing image 2020-09-09 11:57:34 +02:00
0c4d1aaa5f feat: update changelog to reflect changes 2020-09-09 11:55:53 +02:00
de8fbb0629 feat: update addon updater to support installation from branches (develop and master) 2020-09-09 10:58:02 +02:00
d7396e578c Merge branch '107-optionnal-flag-to-allow-edit-mesh-updates' into 'develop'
Resolve "Optionnal flag to allow edit mesh updates"

See merge request slumber/multi-user!41
2020-09-08 21:11:09 +00:00
7f5b5866f2 feat: usage warning 2020-09-08 23:09:42 +02:00
3eb1af406b doc: reflect advanced settings changes 2020-09-08 22:56:23 +02:00
79ccac915f feat: experimental edit mode update
Related to #107
2020-09-08 22:37:58 +02:00
f5232ccea0 Merge branch 'master' into develop 2020-09-03 17:23:21 +02:00
c599a4e6ea doc: update advanced section 2020-09-03 16:15:49 +02:00
b3230177d8 Merge branch 'feature/event_driven_updates' into develop 2020-09-03 15:59:19 +02:00
f2da4cb8e9 Merge branch 'develop' of gitlab.com:slumber/multi-user into develop 2020-09-02 16:45:08 +02:00
605bcc7581 refactor: bl_collection lint
feat: late update replication
2020-09-02 16:44:11 +02:00
e31d76a641 Merge branch 'fix-pip-require-virtualenv' into 'develop'
Resolve "'zmq' install (and other pip packages) fails when PIP_REQUIRE_VIRTUALENV env var is set to true"

See merge request slumber/multi-user!40
2020-08-28 17:49:27 +00:00
97c2118b7e doc: add comment to explain why unsetting PIP_REQUIRE_VIRTUALENV is required. 2020-08-28 18:12:01 +02:00
352977e442 fix: unset PIP_REQUIRE_VIRTUALENV if set to ensure multi-user can install its packages 2020-08-28 17:23:25 +02:00
a46d5fa227 fix: missing ui error, missing scene 2020-08-28 15:27:46 +02:00
ade736d8a5 refactor: collection test 2020-08-28 15:01:50 +02:00
d7f7e86015 fix: collection dump 2020-08-28 14:52:56 +02:00
5e7d1e1dda feat: update replication version 2020-08-28 14:20:00 +02:00
fa5f0c7296 fix: replication version 2020-08-28 14:13:20 +02:00
f14d0915c8 feat: same collection management for Scene Master collection 2020-08-28 14:10:09 +02:00
d1e088d229 feat: orthographic_scale sync 2020-08-28 14:09:10 +02:00
aa35da9c56 refactor: move attribute skipping warnings to debug 2020-08-28 11:28:26 +02:00
f26c3b2606 refactor: use uuid for collection loading 2020-08-28 11:27:03 +02:00
00d60be75b feat: change replication to the pre-release version 2020-08-27 11:40:26 +02:00
bb5b9fe4c8 refactor: move deepdiff dependency to replication 2020-08-27 10:45:54 +02:00
c6af49492e Merge branch 'master' of gitlab.com:slumber/multi-user 2020-08-26 11:35:47 +02:00
6158ef5171 feat: discord link in readme 2020-08-26 11:35:06 +02:00
6475b4fc08 feat: collection instance offset support
Related to #105
2020-08-24 17:49:17 +02:00
e4e09d63ff fix: instanced collection replication
Related to #105
2020-08-24 17:48:14 +02:00
4b07ae0cc3 fix: test condition 2020-08-07 15:47:05 +02:00
49a419cbe2 fix: none result while trying to access a node 2020-08-07 15:38:11 +02:00
5d52fb2460 fix: avoid build ci from running on other branch than develop and master 2020-08-07 15:08:08 +02:00
f1e09c1507 Merge branch 'develop' into feature/event_driven_updates 2020-08-07 15:07:17 +02:00
f915c52bd0 fix: loader missing 2020-08-06 15:33:08 +02:00
dee2e77552 fix: modifier assigned vertex groups 2020-08-06 15:26:55 +02:00
7953a2a177 feat: Update CHANGELOG.md 2020-07-31 09:01:01 +00:00
3f0082927e feat: lock movement support 2020-07-29 11:10:35 +02:00
07ffe05a84 feat: enable autoupdater back 2020-07-28 17:26:14 +02:00
09ee1cf826 fix: auto updater download 2020-07-28 17:17:39 +02:00
61bcec98c3 fix: wrong debian version 2020-07-28 13:37:06 +00:00
1c85d436fd feat: missing update 2020-07-28 13:32:33 +00:00
03318026d4 feat: missing zip program 2020-07-28 13:26:54 +00:00
7a0b142d69 feat: zip build artifacts 2020-07-28 13:20:35 +00:00
eb874110f8 Merge branch 'develop' 2020-07-28 14:25:42 +02:00
6e0c7bc332 clean: disable armature missing roll 2020-07-28 12:06:58 +02:00
ee83e61b09 fix: None image 2020-07-28 12:05:26 +02:00
99b2dc0539 refactor: increase use line width 2020-07-28 12:05:04 +02:00
53f1118181 refactor: update download links 2020-07-27 17:44:27 +02:00
2791264a92 feat: empty image support 2020-07-24 21:38:00 +02:00
6c2ee0cad3 refactor: lobby ui 2020-07-24 16:02:19 +02:00
20f8c25f55 refactor: timeout update 2020-07-24 14:58:07 +02:00
0224f55104 refactor: change the timeout 2020-07-24 14:57:39 +02:00
644702ebdf feat: client state updates as soon as a client is in the lobby 2020-07-24 14:56:20 +02:00
9377b2be9b fix: session panel title 2020-07-24 14:55:14 +02:00
29cbf23142 doc: update hosting guide 2020-07-24 14:53:44 +02:00
a645f71d19 fix: material socket index
Related to #101
2020-07-22 16:41:01 +02:00
909d92a7a1 fix: broken materials links
Related to #102
2020-07-21 16:46:16 +02:00
7ee9089087 feat: use callbacks instead of timers to cleanup session states
refactor: move graph initialization to operators.py
2020-07-17 16:33:39 +02:00
6201c82392 fix: loading the wrong scene by default
`get` was giving the wrong result in the scene initialization routine during the resolve process

Related to #100
2020-07-17 14:47:52 +02:00
0faf7d9436 fix: initial test to handle #99 2020-07-15 19:08:53 +02:00
e69e61117a fix: Process "Quitting session" does not finish and gets stuck
Related to #101
2020-07-15 14:46:48 +02:00
25e988d423 fix: Unregistration of users
Related to #97
2020-07-15 13:50:46 +02:00
8a3ab895e0 fix: ci blender install 2020-07-14 12:56:34 +00:00
06a8e3c0ab feat: replication update 2020-07-14 14:54:22 +02:00
c1c1628a38 Update .gitlab/ci/test.gitlab-ci.yml 2020-07-14 10:07:30 +00:00
022e3354d9 feat: psutils test 2020-07-14 11:59:45 +02:00
211cb848b9 feat: update replication version 2020-07-14 11:29:30 +02:00
25e233f328 fix: temporary disabled spline IK test until the python api is fixed 2020-07-13 15:57:19 +02:00
9bc3d9b29d feat: dependencies version check 2020-07-13 15:12:15 +02:00
15debf339d feat: auto-update dependencies 2020-07-10 18:00:44 +02:00
56df7d182d clean: remove libs 2020-07-10 17:05:42 +02:00
26e1579e35 feat: update ci 2020-07-10 16:59:47 +02:00
a0e290ad6d feat: remove submodule 2020-07-10 16:59:32 +02:00
092384b2e4 feat: use replication from pip 2020-07-10 16:50:09 +02:00
2dc3654e6c feat: tests
feat: services heartbeats
clean: remove psutil dependency
2020-07-09 22:10:26 +02:00
f37a9efc60 feat: orthographic correction 2020-07-09 15:52:42 +02:00
0c5d323063 clean: remove old modal operator queue 2020-07-09 15:17:45 +02:00
b9f1b8a871 fix: armature operator is not running 2020-07-08 18:09:00 +02:00
2f6d8e1701 feat: initial camera background image support 2020-07-07 17:09:37 +02:00
9e64584f2d fix: ZeroDivisionError: integer division or modulo by zero 2020-07-07 15:50:05 +02:00
154aaf71c8 fix: disable test 2020-07-07 13:32:30 +00:00
ac24ab69ff fix: revert and disable testing until a definitive fix 2020-07-07 13:29:08 +00:00
ad431378f8 fix: test with a debian based image 2020-07-07 13:18:34 +00:00
784506cd95 fix: update build file 2020-07-07 13:07:55 +00:00
eb7542b1dd fix: update test.gitlab-ci.yml 2020-07-07 12:44:58 +00:00
2bc0d18120 fix: another attempt to fix the tests 2020-07-07 12:24:10 +00:00
27f9b8c659 fix: test errors attempt 2020-07-07 11:24:34 +00:00
ae08b40e8b fix: another attempt to fix psutil install on docker 2020-07-07 09:45:48 +00:00
6ce4cc2d47 fix: python dev version 2020-07-07 09:34:50 +00:00
f96d5e0e2f fix: pipeline test 2020-07-07 11:26:30 +02:00
8ec80b5150 fix: test pipeline 2020-07-07 11:22:05 +02:00
691c45b2c2 feat: missing dependencies
feat: update test to blender 2.90.0
2020-07-07 11:11:36 +02:00
25f3e27b7f fix: Python.exe sometimes doesn't shut down in task manager processes
fix: disconnect attempt when the session is not running

Related to slumber/multi-user#94
2020-07-07 10:58:34 +02:00
e2cdd26b7c feat: update submodule 2020-07-02 14:16:11 +02:00
eb52deade6 fix: initial fix for Snap user: Unknown location error
Related to #95.
2020-07-02 14:14:16 +02:00
0e8bdf7fe5 feat: IPC port is now verified on update.
feat: a wrong IP is replaced with the default one, 127.0.0.1
2020-07-02 10:12:08 +02:00
b2d1cec7f4 feat: verbose commit errors 2020-06-30 11:14:42 +02:00
29e19c7e05 fix: missing function 2020-06-29 18:49:06 +02:00
dda252729d fix: Client get stuck during initial snapshots
Related to #91
2020-06-29 18:25:30 +02:00
4e4d366a57 fix: wrong download link 2020-06-28 23:24:03 +02:00
fca53a6725 feat: better traceback handling 2020-06-28 21:34:49 +02:00
80f37583cc fix: start empty on local host 2020-06-26 22:46:38 +02:00
5f763906c3 clean: remove print 2020-06-25 13:00:15 +02:00
f64d36d7ed feat: ip string verification 2020-06-24 16:58:44 +02:00
dc6975171c feat: update submodule 2020-06-24 09:55:23 +02:00
e5d3c664a7 feat: update submodule to develop 2020-06-24 09:55:01 +02:00
d11035fd6c doc: dedicated server startup 2020-06-24 09:53:45 +02:00
406039aa21 Merge branch 'dedicated_server' into 'develop'
feat: dedicated server

See merge request slumber/multi-user!38
2020-06-23 21:28:30 +00:00
5e8c4c1455 doc: added hyperlink 2020-06-23 23:23:46 +02:00
92efa89e35 feat: dedicated server command list 2020-06-23 19:20:09 +02:00
d806570ee0 doc: added discord invite link 2020-06-23 19:04:07 +02:00
b414ca31b1 feat: submodule update
doc: dedicated server progress
2020-06-23 18:55:31 +02:00
59bfcede97 doc: orthographic pass 2020-06-23 17:59:56 +02:00
9d8e3a3e7b doc: network connection 2020-06-23 17:52:40 +02:00
788477502d doc: hosting guide progress 2020-06-23 16:21:12 +02:00
226c01df8b doc: hosting guide progress 2020-06-23 00:15:25 +02:00
c57fb0ca67 feat: various fixes included in submodule update 2020-06-22 18:40:23 +02:00
745c7ca04a clean: operator
feat: update submodules
2020-06-22 18:23:27 +02:00
8ddb86556d clean: remove unused vars from operator.py
doc: carry on hosting guide
2020-06-22 17:25:44 +02:00
b1ccbf72f3 doc: internet guide update 2020-06-22 14:54:18 +02:00
f85001cb1a feat: update changelog 2020-06-22 11:53:17 +02:00
a7371c0566 doc: quickstart advanced section update 2020-06-22 11:44:37 +02:00
3d9c20cc03 doc: update manage data section 2020-06-22 11:02:58 +02:00
661065e51f doc: orthographic correction 2020-06-22 10:12:55 +02:00
c1fe033ff3 doc: feat links 2020-06-19 12:09:25 +02:00
3ea45b3cf6 doc: feat manage session section 2020-06-19 11:58:12 +02:00
b385a821d4 doc: quickstart refactoring progress 2020-06-19 11:34:15 +02:00
ac1e1f39b7 doc: feat glossary 2020-06-19 10:50:29 +02:00
41140286e1 feat: missing images 2020-06-18 18:42:06 +02:00
c50313f8b1 doc: feat glossary 2020-06-18 18:06:18 +02:00
2ad626197f doc: revamp progress 2020-06-18 17:06:43 +02:00
e927676e3e doc: update 2020-06-18 15:54:48 +02:00
4531b7fd6d doc: carry on quickstart refactoring 2020-06-18 12:26:18 +02:00
5199a810cd feat: show host serving address 2020-06-17 22:11:20 +02:00
2bdbfb082b doc: quickstart revamp
refactor: ttl start on connection
2020-06-17 19:21:57 +02:00
9e6b1a141d refactor: ui cleanup 2020-06-17 12:10:16 +02:00
9c3afdbd81 doc: update capture 2020-06-16 18:54:17 +02:00
32669cef22 fix: wrong orthograph 2020-06-16 18:50:08 +02:00
cc6d1a35bc doc: update captures 2020-06-16 18:45:53 +02:00
63a36ad5d4 refactor: remove ip from host parameters 2020-06-16 18:25:49 +02:00
bcdefca32c refactor: remove unused STRICT right strategy 2020-06-16 18:04:27 +02:00
88e69711ba fix: connected user display in lobby 2020-06-16 17:15:32 +02:00
1ccfd59e65 feat: ui icons
feat: server cmd update
2020-06-16 16:40:00 +02:00
a201ae4ea6 feat: admin client repository init on connection to an empty server 2020-06-16 15:19:38 +02:00
e9029b1414 feat: connection protocol refactoring 2020-06-16 00:02:02 +02:00
19946794f6 refactor: explicit ui label 2020-06-11 18:35:52 +02:00
3f15092b3a clean: image logs 2020-06-11 16:31:25 +02:00
838df92217 refactor: add an icon for non admin-user 2020-06-11 15:25:58 +02:00
54b01e4513 refactor: set the host by default 2020-06-11 15:00:51 +02:00
c065b198d4 feat: added an initialization step 2020-06-10 18:43:21 +02:00
12c0dab881 feat: kick command is back in the ui
clean: unused logs
2020-06-10 14:56:03 +02:00
7759234ea3 refactor: cleanup connection operator 2020-06-10 10:15:39 +02:00
ad99a349f7 feat: show admin status in UI
feat: update submodule
2020-06-09 23:56:20 +02:00
fdc7e4678c feat: update submodule
feat: hide uuid
2020-06-09 18:02:53 +02:00
8d040cc304 feat: update submodule 2020-06-05 18:41:05 +02:00
f1c95d03f8 feat: initial work to run a dedicated server 2020-06-04 18:38:03 +02:00
b55faf2d1c fix: Object location is reset to the origin
Related to #84
2020-06-04 16:53:26 +02:00
258f27d96e fix: get undo handlers back to fix undo crashes
related to #32
2020-05-28 12:59:05 +02:00
8027e541c3 fix: package installation on linux
Related to #74
2020-05-27 15:57:57 +02:00
fc1108ab61 feat: Feature Proposal, Documentation and Refactoring issue templates 2020-05-20 11:32:36 +00:00
54bcd41267 fix: wrong scene initialisation 2020-05-16 21:45:42 +02:00
25c19471bb feat: update submodule 2020-05-15 18:23:51 +02:00
9e4e646bb1 Merge branch 'develop' into feature/event_driven_updates 2020-05-15 16:19:47 +02:00
95524fa3e9 Merge branch '29-differential-revision' into develop 2020-05-09 17:05:49 +02:00
07a646aa18 feat: submodule update 2020-05-09 16:34:14 +02:00
67fc19dae1 fix: get hard reference back
fix: image crash
2020-05-08 15:11:18 +02:00
1070cabf7a Merge branch '29-differential-revision' into 'develop'
fix: spot light shape replication

See merge request slumber/multi-user!36
2020-05-01 21:03:18 +00:00
fcc9292e02 fix: spot light shape replication 2020-05-01 22:58:27 +02:00
f3d8f15ab1 Merge branch '29-differential-revision' into 'develop'
Resolve "Implementation cleanup"

See merge request slumber/multi-user!32
2020-04-22 15:24:11 +00:00
4c44c2f1a0 feat: logging cleanup 2020-04-22 17:04:14 +02:00
e7d948049d feat: test build 2020-04-17 15:49:22 +02:00
0ad0f4d62d feat: remove branch build restriction 2020-04-17 15:46:52 +02:00
7df6ab1852 feat: clean log for library import
feat: color curve map for scene test
feat: update submodules
2020-04-17 15:28:17 +02:00
b4d1e04b87 fix: sidebar refresh 2020-04-14 18:56:20 +02:00
2e60bb985f feat: custom panel category 2020-04-14 17:22:28 +02:00
f8fa407a45 Merge branch '29-differential-revision' into feature/event_driven_updates 2020-04-13 11:48:20 +02:00
b2085e80f8 Merge branch 'develop' into 29-differential-revision 2020-04-09 21:45:44 +02:00
95241d4148 feat: start to test operators 2020-04-08 19:41:57 +02:00
03490af042 feat: additional tests 2020-04-08 19:28:02 +02:00
94aeff7b35 feat: new tests
fix: curve diff errors
fix: metaball diff error
fix: scene error
2020-04-08 18:40:02 +02:00
f5452b8aee Update .gitlab/ci/test.gitlab-ci.yml 2020-04-08 10:20:13 +00:00
4f02134b6c feat: expose connection TTL 2020-04-08 11:15:29 +02:00
52b2c25014 fix: wrong dir 2020-04-08 09:09:09 +00:00
d8f68640b5 refactor: test to move test_script 2020-04-08 09:06:36 +00:00
bb2fc2c32b Update .gitlab/ci/test.gitlab-ci.yml 2020-04-08 09:00:47 +00:00
582c908dc4 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-08 08:47:42 +00:00
506284ad1f another ci test 2020-04-08 08:41:24 +00:00
e7e8782c28 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-08 08:18:42 +00:00
3f614cdcef Update test.gitlab-ci.yml 2020-04-07 19:36:44 +00:00
5f1853bbe3 Update test.gitlab-ci.yml 2020-04-07 19:32:40 +00:00
99d64cd411 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:14:12 +00:00
607a1f25d3 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:12:27 +00:00
b047b48738 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:11:05 +00:00
e1688f7f12 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:09:12 +00:00
5aa7a1f140 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:04:56 +00:00
ca141f9860 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:03:15 +00:00
d4d14d57ff Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 17:00:19 +00:00
1789ae20cf Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:56:24 +00:00
2e0414f9d5 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:55:15 +00:00
52cbf79f39 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:54:14 +00:00
e9e06bbf8f Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:53:12 +00:00
1f16e07a3c Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:52:29 +00:00
e476d44527 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:48:40 +00:00
7a09026b6c Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:47:52 +00:00
09c724ac53 Update .gitlab/ci/test.gitlab-ci.yml 2020-04-07 16:46:59 +00:00
7152bb825d Update .gitlab/ci/test.gitlab-ci.yml, .gitlab/ci/build.gitlab-ci.yml files 2020-04-07 16:45:26 +00:00
e8d23426d7 Update .gitlab/ci/build.gitlab-ci.yml 2020-04-07 16:43:42 +00:00
dcb593db07 refactor: update ci 2020-04-07 18:39:54 +02:00
3b9a722812 refactor: update ci 2020-04-07 18:23:34 +02:00
7cc0c0ccf8 refactor: update ci 2020-04-07 18:22:01 +02:00
1304489748 feat: update submodule 2020-04-07 18:19:50 +02:00
673a9dd669 feat: more tests 2020-04-07 18:19:17 +02:00
fc15478dfa refactor: remove pointer hard reference
feat: light test
2020-04-07 14:49:43 +02:00
c23b5923ab refactor: cleanup 2020-04-07 09:59:04 +02:00
2d700e83bb feat: update submodule 2020-04-06 15:36:28 +02:00
271f210591 fix: light missing parameters 2020-04-06 15:16:57 +02:00
e65dd1939a refactor: update screen only on client update 2020-04-06 14:47:03 +02:00
76be9092c8 Merge branch 'develop' into 29-differential-revision 2020-04-06 11:48:47 +02:00
3394299e8b fix: view_settings error
fix: empty collection error
2020-04-06 10:38:31 +02:00
5e6f6ac460 Update CHANGELOG.md 2020-04-03 16:10:34 +00:00
49d676f4f2 feat: documentation update
Added kick option
Added show flag description
Updated captures
2020-04-03 17:27:56 +02:00
4dd4cea3ed Merge branch '48-kick-client-from-host' into 'develop'
Resolve "Kick client from host"

See merge request slumber/multi-user!34
2020-04-03 13:49:36 +00:00
408a16064c feat: update submodule 2020-04-03 15:43:04 +02:00
cfd80dd426 feat: update submodule 2020-04-03 15:39:56 +02:00
0fde356f4b feat: initial kick 2020-04-03 14:59:33 +02:00
427b36ddaf Merge branch '36-sync-render-settings' into 'develop'
Resolve "Sync Render Settings"

See merge request slumber/multi-user!33
2020-04-03 08:19:31 +00:00
1b94e017e6 feat: ignore wrong parameters 2020-04-02 19:36:28 +02:00
b3b2296b06 fix: black and white level errors 2020-04-02 19:21:55 +02:00
9c897745fd feat: optional setting on host 2020-04-02 18:42:41 +02:00
0783c625d0 feat: initial render settings replication support 2020-04-02 17:39:14 +02:00
a1036956c3 Merge branch '29-differential-revision' into 'develop'
Feat: implementation refactor progress

See merge request slumber/multi-user!31
2020-04-02 12:04:51 +00:00
bfbf2727ea fix: material error 2020-04-02 12:14:12 +02:00
39766739d2 refactor: move dump_anything to the dump modules 2020-04-01 11:37:05 +02:00
776664149e refactor: imports 2020-04-01 11:34:24 +02:00
fef088e39b refactor: cleanup lattice 2020-04-01 10:23:28 +02:00
31feb2439d refactor: cleanup 2020-03-31 17:52:52 +02:00
e041b2cb91 feat: cleanup 2020-03-31 17:42:14 +02:00
379e7cdf71 refactor: cleanup implementation
feat: np_load_collection for collection attr load/unload
2020-03-31 17:28:32 +02:00
928eccfa23 feat: vertex_color support 2020-03-31 10:13:49 +02:00
9c9d7a31bf feat: cleanup metaballs 2020-03-30 18:50:54 +02:00
d46ea3a117 refactor: change default materials refresh rate to 1s 2020-03-30 18:12:43 +02:00
820c6dad7e refactor: feat dump node 2020-03-30 18:05:33 +02:00
acbf897f5d refactor: rename def 2020-03-30 17:21:28 +02:00
981ac03855 Merge branch 'develop' into 29-differential-revision 2020-03-30 17:15:56 +02:00
c20777f860 feat: update submodule 2020-03-30 17:15:38 +02:00
219973930b fix: annotation settings 2020-03-30 15:01:48 +02:00
79ba63ce85 feat: attribute dump function
refactor: cleanup utils
2020-03-30 14:31:35 +02:00
922538dc3a feat: update submodule 2020-03-27 18:52:23 +01:00
5574059b46 Merge branch '29-differential-revision' into 'develop'
various implementation refactoring

See merge request slumber/multi-user!24
2020-03-27 17:50:34 +00:00
289a49251e refactor: cleanup 2020-03-27 18:18:36 +01:00
ef9e9dbae8 feat: action refactoring 2020-03-27 15:50:25 +01:00
98d86c050b feat: update submodules 2020-03-26 21:11:21 +01:00
3f0c31d771 refactor: cleanup implementation 2020-03-26 19:10:26 +01:00
d7e47e5c14 refactor: clean material 2020-03-26 18:51:35 +01:00
cab4a8876b feat: CurveMapping support 2020-03-26 18:51:19 +01:00
d19932cc3b refactor: change default timers for mesh and gpencil 2020-03-26 17:30:45 +01:00
ea9ee4ead1 fix: mesh materials 2020-03-26 17:23:42 +01:00
667c3cd04d fix: mesh update 2020-03-26 17:13:59 +01:00
6334bfdc01 fix: node_tree dump 2020-03-26 17:13:48 +01:00
2016af33b7 fix: missing light shape 2020-03-26 17:13:30 +01:00
f0a2659b43 fix: collection destructor during loading 2020-03-26 16:10:27 +01:00
489502a783 Merge remote-tracking branch 'origin/develop' into 29-differential-revision 2020-03-26 14:38:41 +01:00
cb6e26513c Merge branch '78-jsondiff-import-issue-during-addon-installation' into 'develop'
Resolve "jsondiff import issue during addon installation"

See merge request slumber/multi-user!30
2020-03-26 13:08:10 +00:00
3680a751aa fix: jsondiff import error
Related to #78
2020-03-26 14:04:47 +01:00
4413785903 feat: missing lines 2020-03-26 12:07:12 +01:00
25825f7aeb feat: blender 2.83 api changes 2020-03-26 11:45:41 +01:00
73019fc0b0 feat: grease pencil progress 2020-03-26 11:12:26 +01:00
6a98e749f9 feat: gpencil dump refactoring 2020-03-26 09:20:41 +01:00
a84fccb3ce feat: camera cleanup 2020-03-25 13:24:12 +01:00
f9222d84ea feat: crease and bevel_weight support 2020-03-25 11:36:29 +01:00
d7964b645a fix: generated image path error 2020-03-25 10:35:31 +01:00
c3ae56abd2 refactor: cleanup progress 2020-03-25 09:47:20 +01:00
daff548010 feat: uv_layer loading 2020-03-24 19:40:18 +01:00
757dbfd6ab feat: mesh loading progress 2020-03-24 19:09:02 +01:00
01fdf7b35b feat: mesh implementation cleanup progress 2020-03-23 21:49:28 +01:00
90d4bb0e47 feat: mesh dump refactoring wip 2020-03-23 17:55:10 +01:00
01faa94a9a fix: resolve error 2020-03-23 15:29:34 +01:00
b55700862f fix: loading error 2020-03-23 13:49:47 +01:00
90a44eb5db fix: resolve 2020-03-23 12:09:59 +01:00
fb0760928e feat: verbose errors 2020-03-23 11:04:06 +01:00
8ce53b8413 feat: bl_object clean
Related to #29
2020-03-20 19:31:48 +01:00
2484028b5a feat: cleanup 2020-03-20 16:14:54 +01:00
2fcb4615be feat: GPL headers 2020-03-20 14:56:50 +01:00
653cf7e25a Merge remote-tracking branch 'origin/develop' into 29-differential-revision 2020-03-20 14:28:07 +01:00
aa0b54a054 fix: ui refresh 2020-03-20 14:17:58 +01:00
8d2755060e Merge branch '76-handle-services-timeout-error' into 'develop'
Resolve "Handle services timeout error"

See merge request slumber/multi-user!29
2020-03-20 12:43:41 +00:00
ba9b4ebe70 feat: update submodules 2020-03-20 13:41:42 +01:00
b8f46c2523 fix: service launching error
Related to #76
2020-03-20 11:46:43 +01:00
153ff5b129 Merge branch 'develop' into 29-differential-revision 2020-03-19 18:55:37 +01:00
931301074e refactor: remove badges from README 2020-03-19 17:08:59 +00:00
56e5709a35 refactor: use pickle instead of msgpack 2020-03-19 17:26:30 +01:00
7fa97704bd feat: gitlab-ci 2020-03-19 10:33:19 +00:00
85b3f6e246 feat: bug report template 2020-03-19 10:25:21 +00:00
a0676f4e37 hotfix: wrong download link 2020-03-14 20:30:18 +00:00
5a0be0f6f9 Update README.md 2020-03-14 15:36:48 +00:00
717a2da3de refactor: cleanup progression 2020-03-13 17:13:39 +01:00
4a127e617c feat: cleanup 2020-03-13 15:05:00 +01:00
3f7cb65393 doc: update download links to job builds 2020-03-12 13:39:50 +01:00
cd00813aed Merge branch '71-blender-addon-updater-integration' into 'develop'
Resolve "Blender Addon Updater integration"

See merge request slumber/multi-user!26
2020-03-12 12:28:47 +00:00
511983c7ff feat: remove useless information panel
feat: doc_url field [2.83 update](https://developer.blender.org/D7015)
2020-03-12 13:25:13 +01:00
1e580dbcd6 Update multi_user/__init__.py, .gitlab-ci.yml files 2020-03-12 12:17:30 +00:00
824040660b Update .gitlab-ci.yml 2020-03-12 12:06:27 +00:00
931c683030 Update .gitlab-ci.yml 2020-03-12 12:04:52 +00:00
ff5e56e36c fix: use release 2020-03-12 11:57:54 +01:00
1c6e88ce61 feat: get links from gitlab releases 2020-03-12 11:40:36 +01:00
08b9a35981 Update .gitlab-ci.yml 2020-03-11 22:43:25 +00:00
7db7382c68 Update .gitlab-ci.yml 2020-03-11 22:41:23 +00:00
1e81a2de16 Update .gitlab-ci.yml 2020-03-11 22:38:09 +00:00
624f67a621 Update .gitlab-ci.yml 2020-03-11 22:36:26 +00:00
ffb6c397b8 Test addon build 2020-03-11 22:35:50 +00:00
dbaff5df85 feat: auto_updater script 2020-03-11 22:42:09 +01:00
75839e60f0 refactor: cleanup 2020-03-11 15:17:13 +01:00
cd10dbb04d feat: update submodule 2020-03-11 15:09:52 +01:00
7c3ac6aeed Merge branch '29-differential-revision' into 'develop'
Various cleanup

See merge request slumber/multi-user!23
2020-03-11 14:03:05 +00:00
d30f4452be Merge branch 'develop' into '29-differential-revision'
# Conflicts:
#   multi_user/bl_types/bl_curve.py
2020-03-11 14:02:05 +00:00
61a05dc347 fix: curve commit error
Related to #72
2020-03-10 18:04:06 +01:00
9f59e7b6e8 fix: curve commit error
Related to #72
2020-03-10 18:00:23 +01:00
30f787f507 refactor: progress on bl_data_io clean 2020-03-09 16:12:18 +01:00
a8da01c8ff feat: start to cleanup datablock io api 2020-03-09 15:59:30 +01:00
9df7cd4659 Merge branch '40-multi-scene-workflow' into 'develop'
Resolve "Multi scene workflow"

See merge request slumber/multi-user!22
2020-03-06 12:48:32 +00:00
c281ac4397 feat: update snap user to support snapping across scenes
Related to #40
2020-03-05 17:31:12 +01:00
250cf91032 feat: show flag for user on another scene
feat: disable user drawing on other scene

Related to #40
2020-03-05 17:20:04 +01:00
fe9a096ab2 feat: store current scene in user metadata
Related to #40
2020-03-05 16:19:13 +01:00
a6e1566f89 Merge branch '40-multi-scene-workflow' of gitlab.com:slumber/multi-user into feature/event_driven_updates 2020-03-05 16:17:00 +01:00
adeb694b2d feat: one apply timer for all 2020-03-05 15:38:20 +01:00
50d14e663e feat: update submodules 2020-03-05 10:56:17 +01:00
9b8d69042d feat: update submodule 2020-03-04 22:28:34 +01:00
b2475081b6 feat: id accessor 2020-03-04 18:28:42 +01:00
aef1d8987c Merge branch '61-config-file-prevents-having-the-addon-on-a-shared-network-location' into feature/event_driven_updates 2020-03-04 14:54:42 +01:00
d8f49ff298 Merge branch '61-config-file-prevents-having-the-addon-on-a-shared-network-location' into 'develop'
Resolve "Config file prevents having the addon on a shared network location"

See merge request slumber/multi-user!20
2020-03-04 13:52:54 +00:00
efa243211b feat: remove headers 2020-03-04 14:12:56 +01:00
f03a3aadff feat: cleanup ui 2020-03-04 12:51:56 +01:00
16147ae2ba feat: custom cache directory in userpref 2020-03-02 11:09:45 +01:00
8e600778ab feat: store addon config in user preference (except runtime vars)
Related to #20
2020-02-28 17:34:30 +01:00
292f76aea5 feat: move diff to observer
feat: logs
2020-02-28 15:39:29 +01:00
28c4ccf1f3 Merge branch 'develop' into feature/event_driven_updates 2020-02-28 14:48:09 +01:00
a7641d6fc9 doc: update version 2020-02-28 14:26:24 +01:00
c7584964fe refactor: change download version 2020-02-28 14:08:52 +01:00
549b0b3784 fix: submodule version 2020-02-25 17:40:00 +01:00
fc9ab1a7e6 feat: update submodule 2020-02-25 17:38:43 +01:00
44bffc1850 Merge remote-tracking branch 'origin/develop' into feature/event_driven_updates 2020-02-25 17:37:24 +01:00
a141e9bfe7 feat: stash on deps graph update 2020-02-23 14:08:45 +01:00
111 changed files with 9512 additions and 23538 deletions

6
.gitignore vendored
View File

@ -7,6 +7,10 @@ __pycache__/
cache
config
*.code-workspace
multi_user_updater/
# sphinx build folder
_build
_build
# ignore generated zip generated from blender_addon_tester
*.zip

9
.gitlab-ci.yml Normal file
View File

@ -0,0 +1,9 @@
stages:
- test
- build
- deploy
include:
- local: .gitlab/ci/test.gitlab-ci.yml
- local: .gitlab/ci/build.gitlab-ci.yml
- local: .gitlab/ci/deploy.gitlab-ci.yml

View File

@ -0,0 +1,10 @@
build:
stage: build
image: debian:stable-slim
script:
- rm -rf tests .git .gitignore script
artifacts:
name: multi_user
paths:
- multi_user

View File

@ -0,0 +1,18 @@
deploy:
stage: deploy
image: slumber/docker-python
variables:
DOCKER_DRIVER: overlay2
DOCKER_TLS_CERTDIR: "/certs"
services:
- docker:19.03.12-dind
script:
- RP_VERSION="$(python scripts/get_replication_version.py)"
- VERSION="$(python scripts/get_addon_version.py)"
- echo "Building docker image with replication ${RP_VERSION}"
- docker build --build-arg replication_version=${RP_VERSION} --build-arg version=${VERSION} -t registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION} ./scripts/docker_server
- echo "Pushing to gitlab registry ${VERSION}"
- docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
- docker push registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION}

View File

@ -0,0 +1,5 @@
test:
stage: test
image: slumber/blender-addon-testing:latest
script:
- python3 scripts/test_addon.py

View File

@ -0,0 +1,46 @@
<!---
Please read this!
Before opening a new issue, make sure to search for keywords in the issues
filtered by the "bug" label:
- https://gitlab.com/slumber/multi-user/-/issues?scope=all&utf8=✓&label_name[]=bug
and verify the issue you're about to submit isn't a duplicate.
--->
### Summary
(Summarize the bug encountered concisely)
* Addon version: (your addon-version)
* Blender version: (your blender version)
* OS: (your os windows/linux/mac)
### Steps to reproduce
(How one can reproduce the issue - this is very important)
### Example Project [optional]
(If possible, please create an example project that exhibits the problematic behavior, and link to it here in the bug report)
### What is the current *bug* behavior?
(What actually happens)
### Relevant logs and/or screenshots
(Paste any relevant logs - please use code blocks (```) to format console output,
logs, and code as it's tough to read otherwise.)
### Possible fixes [optional]
(If you can, link to the line of code that might be responsible for the problem)
/label ~type::bug
/cc @project-manager

View File

@ -0,0 +1,30 @@
### Problem to solve
<!-- Include the following detail as necessary:
* What feature(s) affected?
* What docs or doc section affected? Include links or paths.
* Is there a problem with a specific document, or a feature/process that's not addressed sufficiently in docs?
* Any other ideas or requests?
-->
### Further details
<!--
* Any concepts, procedures, reference info we could add to make it easier to successfully use the multi-user addon?
* Include use cases, benefits, and/or goals for this work.
-->
### Proposal
<!-- Further specifics for how we can solve the problem. -->
### Who can address the issue
<!-- What if any special expertise is required to resolve this issue? -->
### Other links/references
<!-- E.g. related GitLab issues/MRs -->
/label ~type::documentation
/cc @project-manager

View File

@ -0,0 +1,18 @@
### Problem to solve
<!-- What problem do we solve? Try to define the who/what/why of the opportunity as a user story. For example, "As a (who), I want (what), so I can (why/value)." -->
### Proposal
<!-- How are we going to solve the problem?-->
### Further details
<!-- Include use cases, benefits, goals, or any other details that will help us understand the problem better. -->
### Links / references
/label ~type::feature request
/cc @project-manager

View File

@ -0,0 +1,34 @@
## Summary
<!--
Please briefly describe what part of the code base needs to be refactored.
-->
## Improvements
<!--
Explain the benefits of refactoring this code.
-->
## Risks
<!--
Please list features that can break because of this refactoring and how you intend to solve that.
-->
## Involved components
<!--
List files or directories that will be changed by the refactoring.
-->
## Optional: Intended side effects
<!--
If the refactoring involves changes apart from the main improvements (such as a better UI), list them here.
It may be a good idea to create separate issues and link them here.
-->
/label ~type::refactoring
/cc @project-manager

3
.gitmodules vendored
View File

@ -1,3 +0,0 @@
[submodule "multi_user/libs/replication"]
path = multi_user/libs/replication
url = https://gitlab.com/slumber/replication.git

View File

@ -35,4 +35,64 @@ All notable changes to this project will be documented in this file.
- Right management takes view-layer in account for object selection.
- Use a basic BFS approach for replication graph pre-load.
- Serialization is now based on marshal (2x performance improvements).
- Let pip choose python dependencies install path.
- Let pip choose python dependencies install path.
## [0.0.3] - 2020-07-29
### Added
- Auto updater support
- Big Performances improvements on Meshes, Gpencils, Actions
- Multi-scene workflow support
- Render setting synchronization
- Kick command
- Dedicated server with a basic command set
- Administrator session status
- Tests
- Blender 2.83-2.90 support
### Changed
- Config is now stored in blender user preference
- Documentation update
- Connection protocol
- UI revamp:
- user localization
- repository init
### Removed
- Unused strict right management strategy
- Legacy config management system
## [0.1.0] - preview
### Added
- Dependency graph driven updates [experimental]
- Edit Mode updates
- Late join mechanism
- Sync Axis lock replication
- Sync collection offset
- Sync camera orthographic scale
- Sync custom fonts
- Sync sound files
- Logging configuration (file output and level)
- Object visibility type replication
- Optional sync for active camera
- Curve->Mesh conversion
- Mesh->gpencil conversion
### Changed
- Auto updater now handles installation from branches
- Use uuid for collection loading
- Moved session instance to replication package
### Fixed
- Prevent unsupported data types from crashing the session
- Modifier vertex group assignation
- World sync
- Snapshot UUID error
- The world is not synchronized

View File

@ -2,7 +2,7 @@
> Enable real-time collaborative workflow inside blender
![demo](https://i.imgur.com/X0B7O1Q.gif)
<img src="https://i.imgur.com/X0B7O1Q.gif" width=600>
:warning: Under development, use it at your own risk. Currently tested on Windows platform. :warning:
@ -11,7 +11,7 @@ This tool aims to allow multiple users to work on the same scene over the networ
## Quick installation
1. Download latest release [multi_user.zip](/uploads/8aef79c7cf5b1d9606dc58307fd9ad8b/multi_user.zip).
1. Download latest release [multi_user.zip](https://gitlab.com/slumber/multi-user/-/jobs/artifacts/master/download?job=build).
2. Run blender as administrator (dependencies installation).
3. Install last_version.zip from your addon preferences.
@ -25,22 +25,33 @@ See the [documentation](https://multi-user.readthedocs.io/en/latest/) for detail
Currently, not all data-blocks are supported for replication over the wire. The following list summarizes the status of each one.
| Name | Status | Comment |
| ----------- | :----------------: | :------------: |
| action | :exclamation: | Not stable |
| armature | :exclamation: | Not stable |
| camera | :white_check_mark: | |
| collection | :white_check_mark: | |
| curve | :white_check_mark: | Not tested |
| gpencil | :white_check_mark: | |
| image | :exclamation: | Not stable yet |
| mesh | :white_check_mark: | |
| material | :white_check_mark: | |
| metaball | :white_check_mark: | |
| object | :white_check_mark: | |
| scene | :white_check_mark: | |
| world | :white_check_mark: | |
| lightprobes | :white_check_mark: | |
| Name | Status | Comment |
| ----------- | :----: | :--------------------------------------------------------------------------: |
| action | ✔️ | |
| armature | ❗ | Not stable |
| camera | ✔️ | |
| collection | ✔️ | |
| curve | ❗ | Nurbs not supported |
| gpencil | ✔️ | [Airbrush not supported](https://gitlab.com/slumber/multi-user/-/issues/123) |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| lightprobes | ✔️ | |
| compositing | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |
| nla | ❌ | |
| volumes | ❌ | |
| particles | ❌ | [On-going](https://gitlab.com/slumber/multi-user/-/issues/24) |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| libraries | ❗ | Partial |
### Performance issues
@ -51,18 +62,17 @@ I'm working on it.
| Dependencies | Version | Needed |
| ------------ | :-----: | -----: |
| ZeroMQ | latest | yes |
| msgpack | latest | yes |
| PyYAML | latest | yes |
| JsonDiff | latest | yes |
| Replication | latest | yes |
## Contributing
See [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation.
Feel free to [join the discord server](https://discord.gg/aBPvGws) to chat, seek help and contribute.
## Licensing
See [license](LICENSE)
[![Documentation Status](https://readthedocs.org/projects/multi-user/badge/?version=latest)](https://multi-user.readthedocs.io/en/latest/?badge=latest)

View File

@ -5,6 +5,11 @@ Introduction
A film is an idea carved throughout the whole production process by many different people. A traditional animation pipeline involves a linear succession of tasks, from storyboard to compositing, and its fundamental workflow is similar to an industrial assembly line. Since each step is almost a department of its own, it's common that someone in department B doesn't know what someone in department A did in a previous step. This lack of visibility and communication can cause problems that hurt the final production result.
.. figure:: img/about_chain.gif
:align: center
The linear workflow problems
Nowadays it's a known fact that real-time rendering technologies allow traditional linear production to be sped up by drastically reducing the iteration time across different steps. All major industrial CG solutions are moving toward real-time horizons to bring innovative interactive workflows. But this is a microscopic, per-task vision of the benefits of real-time rendering for animation production. What if we step back, take a macroscopic picture of an animation movie pipeline and ask ourselves how real-time could change our global workflow? Could it bring better ways of working together by giving departments more visibility into each other during the whole production?
The multi-user addon is an attempt to experiment with real-time parallelism between different production stages. By replicating blender data-blocks over the network, it allows different artists to collaborate on the same scene in real time.

View File

@ -22,7 +22,7 @@ copyright = '2020, Swann Martinez'
author = 'Swann Martinez'
# The full version, including alpha/beta/rc tags
release = '0.0.1'
release = '0.0.2'
# -- General configuration ---------------------------------------------------

View File

@ -0,0 +1,59 @@
========
Glossary
========
.. glossary::
.. _admin:
administrator
*A session administrator can manage users (kick) and has write access to
each datablock. They can also initialize a dedicated server repository.*
.. _session-status:
session status
*Located in the title of the multi-user panel, the session status shows
you the connection state.*
.. figure:: img/quickstart_session_status.png
:align: center
Session status in panel title bar
All possible states are listed here with their meaning:
+--------------------+---------------------------------------------------------------------------------------------+
| State | Description |
+--------------------+---------------------------------------------------------------------------------------------+
| WARMING UP DATA    | Committing local data                                                                       |
+--------------------+---------------------------------------------------------------------------------------------+
| FETCHING           | Downloading snapshot from the server                                                        |
+--------------------+---------------------------------------------------------------------------------------------+
| AUTHENTIFICATION | Initial server authentication |
+--------------------+---------------------------------------------------------------------------------------------+
| ONLINE | Connected to the session |
+--------------------+---------------------------------------------------------------------------------------------+
| PUSHING            | Initializing the server repository by pushing local data                                    |
+--------------------+---------------------------------------------------------------------------------------------+
| INIT | Initial state |
+--------------------+---------------------------------------------------------------------------------------------+
| QUITTING | Exiting the session |
+--------------------+---------------------------------------------------------------------------------------------+
| LAUNCHING SERVICES | Launching local services. Services are specialized daemons running in the background.       |
+--------------------+---------------------------------------------------------------------------------------------+
| LOBBY | The lobby is a waiting state triggered when the server repository hasn't been initiated yet |
| | |
| | Once initialized, the server will automatically launch all client in the **LOBBY**. |
+--------------------+---------------------------------------------------------------------------------------------+
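Taken together, the states in the table above behave like a small client-side state machine. A rough sketch (state names mirror the table; this is illustrative, not the addon's actual implementation):

```python
from enum import Enum, auto

class SessionState(Enum):
    """Illustrative only: names mirror the glossary table above,
    not the addon's real internal constants."""
    INIT = auto()                # initial state
    LAUNCHING_SERVICES = auto()  # starting background daemons
    AUTHENTIFICATION = auto()    # initial server authentication
    LOBBY = auto()               # waiting for an admin to init the repository
    WARMING_UP_DATA = auto()     # committing local data
    PUSHING = auto()             # initializing the server repository
    FETCHING = auto()            # downloading a snapshot from the server
    ONLINE = auto()              # connected to the session
    QUITTING = auto()            # exiting the session

# A host typically goes INIT -> LAUNCHING_SERVICES -> ... -> ONLINE,
# while a client joining an uninitialized dedicated server waits in LOBBY.
```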
.. _common-right:
common right
When a data block is under common right, it is available for everyone to modify.
The rights are given to the first user who selects it.

View File

@ -1,3 +1,4 @@
===============
Getting started
===============
@ -7,3 +8,4 @@ Getting started
install
quickstart
glossary

View File

@ -2,8 +2,12 @@
Installation
============
*The process is the same for linux, mac and windows.*
.. hint::
The process is the same for linux, mac and windows.
1. Download latest release `multi_user.zip <https://gitlab.com/slumber/multi-user/uploads/8aef79c7cf5b1d9606dc58307fd9ad8b/multi_user.zip>`_.
1. Download latest `release <https://gitlab.com/slumber/multi-user/-/jobs/artifacts/master/download?job=build>`_ or `develop (unstable !) <https://gitlab.com/slumber/multi-user/-/jobs/artifacts/develop/download?job=build>`_ build.
2. Run blender as administrator (to allow python dependencies auto-installation).
3. Install last_version.zip from your addon preferences.
3. Install **multi-user.zip** from your addon preferences.
Once the addon is successfully installed, I strongly recommend that you follow the :ref:`quickstart`
tutorial.

View File

@ -1,90 +1,302 @@
.. _quickstart:
===========
Quick start
===========
*All settings are located under: `View3D -> Sidebar -> Multiuser panel`*
.. hint::
*All session related settings are located under: `View3D -> Sidebar -> Multiuser panel`*
Session setup
=============
This section describes how to create or join a collaborative session.
The multi-user is based on a session management system.
In this guide you will quickly learn how to use the collaborative session system in three parts:
---------------------
1. User information's
---------------------
- :ref:`how-to-host`
- :ref:`how-to-join`
- :ref:`how-to-manage`
.. image:: img/quickstart_user_infos.png
.. _how-to-host:
- **name**: username.
- **color**: color used to represent the user into other user workspace.
How to host a session
=====================
----------
2. Network
----------
The multi-user add-on relies on a client-server architecture.
The server is the heart of the collaborative session:
it allows users to communicate with each other.
In simple terms, *hosting a session* means *running a local server and connecting the local client to it*.
By **local server** I mean a server accessible from the LAN (Local Area Network).
.. note:: If you host a session over internet, special network configuration is needed.
However, sometimes you will need to host a session over the internet;
in that case I strongly recommend that you read the :ref:`internet-guide` tutorial.
Hosting and connection are done from this panel.
+-----------------------------------+-------------------------------------+
| Host | Join |
+===================================+=====================================+
|.. image:: img/quickstart_host.png | .. image:: img/quickstart_join.png |
+-----------------------------------+-------------------------------------+
| | Start empty: Cleanup the file | | IP: server ip |
| | before hosting | | Port: server port |
+-----------------------------------+-------------------------------------+
| **HOST**: Host a session | **CONNECT**: Join a session |
+-----------------------------------+-------------------------------------+
.. _user-info:
**Port configuration:**
For now, a session uses 4 consecutive ports.
If 5555 is given in the host settings, the session will use 5555, 5556 (5555+1), 5557 (5555+2) and 5558 (5555+3).
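The port scheme above can be sketched in a couple of lines (`session_ports` is a hypothetical helper for illustration, not part of the addon):

```python
def session_ports(base_port):
    """Return the four consecutive ports a session uses,
    given the port entered in the host settings."""
    return [base_port + offset for offset in range(4)]

print(session_ports(5555))  # [5555, 5556, 5557, 5558]
```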
-----------------------------
1. Fill your user information
-----------------------------
------------
2.1 Advanced
------------
The **User Info** panel (see image below) allows you to customize your online identity.
.. image:: img/quickstart_advanced.png
.. figure:: img/quickstart_user_info.png
:align: center
**Right strategy** (only host) enable you to choose between a strict and a relaxed pattern:
User info panel
- **Strict**: the host is the king: by default the host owns every property, and only the host can grant modification rights.
- **Common**: every property is under common rights by default; once selected, a property is modifiable only by its owner.
With either strategy, an owner can choose to pass their rights to someone else.
**Properties frequency grid** allows setting a custom replication frequency for each type of data-block:
Let's fill those two fields:
- **Refresh**: pushed data update rate (in seconds)
- **Apply**: pulled data update rate (in seconds)
- **name**: your online name.
- **color**: a color used to represent you in other users' workspaces (see image below).
.. note:: Per-data type settings will soon be revamped for simplification purposes
Session Management
==================
During online sessions, other users will see your selected objects and camera highlighted in your profile color.
This section describes tools available during a collaborative session.
.. _user-representation:
---------------
Connected users
.. figure:: img/quickstart_user_representation.png
:align: center
User viewport representation
--------------------
2. Setup the network
--------------------
When the hosting process starts, the multi-user addon launches a local server instance.
In the network panel, select **HOST**.
The **Host sub-panel** (see image below) allows you to configure the server:
* **Port**: port on which the server is listening.
* **Start from**: the session initialisation method.
* **current scenes**: start with the current blendfile data.
* **an empty scene**: clear all data and start over.
.. danger::
By starting from an empty scene, all of the blend data will be removed!
Make sure to save your existing work before launching the session.
* **Admin password**: The session administration password.
.. figure:: img/quickstart_host.png
:align: center
:alt: host menu
Host network panel
.. note:: Additional configuration settings can be found in the :ref:`advanced` section.
Once everything is set up, you can hit the **HOST** button to launch the session!
It will do two things:
* Start a local server
* Connect you to it as an :ref:`admin`
During an online session, various actions are available to you; go to the :ref:`how-to-manage` section to
learn more about them.
.. _how-to-join:
How to join a session
=====================
This section describes how to join a launched session.
Before starting, make sure you have access to the session IP and port.
-----------------------------
1. Fill your user information
-----------------------------
Follow the user-info_ section for this step.
----------------
2. Network setup
----------------
In the network panel, select **JOIN**.
The **Join sub-panel** (see image below) allows you to configure the client to join a
collaborative session.
.. figure:: img/quickstart_join.png
:align: center
:alt: Connect menu
Connection panel
Fill those fields with your information:
- **IP**: the host IP.
- **Port**: the host port.
- **Connect as admin**: connect with **admin rights** (see :ref:`admin`) to the session.
.. Maybe something more explicit here
.. note::
Additional configuration settings can be found in the :ref:`advanced` section.
Once you've set every field, hit the **CONNECT** button to join the session!
When the :ref:`session-status` is **ONLINE**, you are connected and ready to start collaborating.
.. note::
On **dedicated server** startup, the session status will put you in the **LOBBY**, waiting for an admin to start the session.
If the session status is set to **LOBBY** and you are a regular user, you need to wait for an admin to launch it.
If you are an admin, you just need to init the repository to start the session (see image below).
.. figure:: img/quickstart_session_init.png
:align: center
Session initialisation for dedicated server
During an online session, various actions are available to you; go to the :ref:`how-to-manage` section to
learn more about them.
.. _how-to-manage:
How to manage a session
=======================
The quality of the collaboration directly depends on the quality of the communication. This section describes
various tools made in an effort to ease communication between session users.
Feel free to suggest ideas for communication tools `here <https://gitlab.com/slumber/multi-user/-/issues/75>`_.
---------------------------
Change replication behavior
---------------------------
During a session, the multi-user addon replicates your modifications to other instances.
In order to avoid annoying other users while you are experimenting, some of those modifications can be ignored via
various flags present at the top of the panel (see the red area in the image below). Those flags are explained in the :ref:`replication` section.
.. figure:: img/quickstart_replication.png
:align: center
Session replication flags
--------------------
Monitor online users
--------------------
One of the most vital tools is the **Online user panel**. It lists all connected
users' information, including yours, such as:
* **Role**: whether the user is an admin or a regular user.
* **Location**: where the user is actually working.
* **Frame**: when (in frames) the user is working.
* **Ping**: the user's connection delay in milliseconds.
.. figure:: img/quickstart_users.png
:align: center
Online user panel
By selecting a user in the list you'll have access to different user-related **actions**.
Those operators allow you to reach the selected user's state in two different dimensions: **SPACE** and **TIME**.
Snapping in space
-----------------
The **CAMERA button** (also called the **snap view** operator) allows you to snap to
the user's viewpoint. To disable the snap, click the button again. This action
serves different purposes, such as easing the review process or working together on a
wide world.
.. hint::
If the target user is located in another scene, the **snap view** operator will send you to their scene.
.. figure:: img/quickstart_snap_view.gif
:align: center
Snap view in action
Snapping in time
----------------
.. image:: img/quickstart_users.png
The **CLOCK button** (also called the **snap time** operator) allows you to snap to
the user's time (current frame). To disable the snap, click the button again.
This action is built to help various actors work in the same temporality
(for instance, multiple animators).
This panel displays all connected users' information, including yours.
By selecting a user in the list you'll have access to different **actions**:
.. figure:: img/quickstart_snap_time.gif
:align: center
- The **camera button** allows you to snap to the user's viewpoint.
- The **time button** allows you to snap to the user's time.
Snap time in action
---------------------
Replicated properties
---------------------
.. image:: img/quickstart_properties.png
Kick a user
-----------
The **replicated properties** panel shows the status of all replicated properties and their associated actions.
Since the replication architecture is based on commit/push/pull mechanisms, a replicated property can be pushed, pulled or even committed manually from this panel.
.. warning:: Only available for :ref:`admin` !
The **CROSS button** (also called the **kick** operator) allows the admin to kick the selected user. On the target user's side, the session will properly disconnect.
Change users display
--------------------
Presence is the multi-user module responsible for displaying users. During the session,
it draws user-related information in your viewport, such as:
* Username
* User point of view
* User selection
.. figure:: img/quickstart_presence.png
:align: center
Presence show flags
The presence overlay panel (see image above) allows you to enable/disable
various drawn parts via the following flags:
- **Show selected objects**: display other users' current selection
- **Show users**: display users' current viewpoints
- **Show different scenes**: display users working in other scenes
-----------
Manage data
-----------
In order to understand replication data management, a quick introduction to the multi-user data workflow is required.
First thing to know: for now, the addon relies on data-based replication. In simple words, it means that it replicates
the results of users' actions.
To replicate datablocks between clients, the multi-user addon relies on what tends to be a distributed architecture:
- The server stores the "master" version of the work.
- Each client has a local version of the work.
When an artist modifies something in the scene, here is what happens in the background:
1. Modified data are **COMMITTED** to the local repository.
2. Once committed locally, they are **PUSHED** to the server.
3. As soon as the server receives updates, it stores them and pushes them to every other client.
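The three numbered steps above can be sketched with plain Python data structures. This is only an illustrative model of the commit/push/pull cycle; the names and structure are invented and do not reflect the actual replication library API:

```python
# Toy model of the multi-user data flow (NOT the real replication API):
# each client commits changes to a local repository, pushes them to the
# server, and the server relays the update to every other client.

server = {}  # the "master" version of the work


class Client:
    def __init__(self, name):
        self.name = name
        self.staged = {}  # committed locally, not yet pushed
        self.local = {}   # this client's view of the shared work

    def commit(self, datablock, value):
        # 1. modified data are COMMITTED to the local repository
        self.staged[datablock] = value

    def push(self, others):
        # 2. committed data are PUSHED to the server
        server.update(self.staged)
        self.local.update(self.staged)
        self.staged.clear()
        # 3. the server stores the update and pushes it to every other client
        for client in others:
            client.pull()

    def pull(self):
        self.local.update(server)


alice, bob = Client("alice"), Client("bob")
alice.commit("Cube", {"location": (1, 0, 0)})
alice.push(others=[bob])  # bob's local copy now contains alice's change
```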
On top of this data management system, a rights management system prevents
multiple users from modifying the same data at the same time. A datablock may belong to
a connected user or be under :ref:`common-right<**COMMON**>` rights.
.. note::
In the near future, the rights management system will support roles to allow multiple users to
work on different aspects of the same datablock.
The Repository panel (see image below) allows you to monitor datablocks and change their states and rights manually.
.. figure:: img/quickstart_properties.png
:align: center
Repository panel
The **show only owned** flag allows you to see which datablocks you are currently modifying.
.. warning::
If you are editing a datablock not listed with this flag enabled, it means that you have
not been granted the rights to modify it, so it won't be propagated to other clients!
Here is a quick list of available actions:
+---------------------------------------+-------------------+------------------------------------------------------------------------------------+
| icon | Action | Description |
@ -100,12 +312,107 @@ Since the replication architecture is based on commit/push/pull mechanisms, a re
| .. image:: img/quickstart_remove.png | **Delete** | Remove the data-block from network replication |
+---------------------------------------+-------------------+------------------------------------------------------------------------------------+
.. _advanced:
Advanced settings
=================
This section contains optional settings to configure the session behavior.
.. figure:: img/quickstart_advanced.png
:align: center
Advanced configuration panel
-------
Network
-------
.. figure:: img/quickstart_advanced_network.png
:align: center
Advanced network settings
**IPC Port** is the port used for Inter-Process Communication. This port is used
by the multi-user subprocesses to communicate with each other. If different instances
of the multi-user add-on use the same IPC port, it will create conflicts!
.. note::
You only need to modify it if you launch multiple clients from the same
computer (or if you try to host and join from the same computer). Simply enter a different
**IPC port** for each Blender instance.
**Timeout (in milliseconds)** is the maximum ping authorized before auto-disconnecting.
You should only increase it if you have a bad connection.
.. _replication:
-----------
Replication
-----------
.. figure:: img/quickstart_advanced_replication.png
:align: center
Advanced replication settings
**Synchronize render settings** (host only) enables replication of EEVEE and Cycles render settings to match renders between clients.
**Synchronize active camera** syncs the scene's active camera.
**Edit Mode Updates** enables object updates while you are in Edit Mode.
.. warning:: Edit Mode Updates kills performance with complex objects (heavy meshes, grease pencil, etc.).
**Update method** allows you to change how replication updates are triggered. Two update methods are currently implemented:
- **Default**: Use external threads to monitor datablock changes; slower and less accurate.
- **Depsgraph ⚠️**: Use the Blender dependency graph to trigger updates. Faster but experimental and unstable!
**Properties frequency grid** allows you to set a custom replication frequency for each type of data-block:
- **Refresh**: pushed data update rate (in seconds)
- **Apply**: pulled data update rate (in seconds)
-----
Cache
-----
The multi-user add-on allows you to replicate external blend dependencies such as images, movies, and sounds.
On each client, those files are stored in the cache folder.
.. figure:: img/quickstart_advanced_cache.png
:align: center
Advanced cache settings
**cache_directory** allows you to choose where cached files (images, sounds, movies) will be saved.
**Clear memory filecache** saves memory space at runtime by removing file contents from memory as soon as they have been written to disk.
**Clear cache** removes all files from the cache folder.
.. warning:: Clearing the cache could break your scene's images/movies/sounds if they are used in the blend file!
---
Log
---
.. figure:: img/quickstart_advanced_logging.png
:align: center
Advanced log settings
**log level** allows you to set the logging level of detail. Here is what each value shows:
+-----------+-----------------------------------------------+
| Log level | Description                                   |
+===========+===============================================+
| ERROR     | Shows only critical errors                    |
+-----------+-----------------------------------------------+
| WARNING   | Shows warnings and errors                     |
+-----------+-----------------------------------------------+
| INFO      | Shows status-related messages and errors      |
+-----------+-----------------------------------------------+
| DEBUG     | Shows all possible information                |
+-----------+-----------------------------------------------+

View File

@ -18,6 +18,11 @@ Main Features
- Datablock rights management
- Tested under Windows
Community
=========
A `discord server <https://discord.gg/aBPvGws>`_ has been created to provide help for new users and
to organize collaborative creation sessions.
Status
======
@ -43,6 +48,7 @@ Documentation is organized into the following sections:
getting_started/install
getting_started/quickstart
getting_started/glossary
.. toctree::
:maxdepth: 1

View File

@ -1,47 +1,278 @@
================
Advanced hosting
================
.. _internet-guide:
This tutorial aims to guide you through hosting a collaborative session over the internet.
.. note::
This tutorial will change soon with the new dedicated server.
The multi-user network architecture is based on a client-server model. The communication protocol uses four ports to communicate with clients:
* Commands: command transmission (such as **snapshots**, **change_rights**, etc.)
* Subscriber: pull data
* Publisher: push data
* TTL (time to live): used to ping each client
===================
Hosting on internet
===================
.. warning::
For now, these communications are not encrypted, but encryption is planned for the mid-term future (`Status <https://gitlab.com/slumber/multi-user/issues/62>`_).
This tutorial aims to guide you through hosting a collaborative session over the internet.
Hosting a session can be done in several ways:
- :ref:`host-blender`: hosting a session directly from the Blender add-on panel.
- :ref:`host-dedicated`: hosting a session directly from the command-line interface on a computer without Blender.
.. _host-blender:
-------------
From blender
-------------
By default your router doesn't allow anyone from the internet to reach your computer.
In order to grant server access to people over the internet, you have two main options:
* The :ref:`connection-sharing`: the easiest way.
* The :ref:`port-forwarding`: the least secure one; if you have no networking knowledge, you should definitely go with :ref:`connection-sharing`.
.. _connection-sharing:
Using a connection sharing solution
-----------------------------------
Many third-party programs like `ZEROTIER <https://www.zerotier.com/download/>`_ (free) or `HAMACHI <https://vpn.net/>`_ (free for up to 5 users) allow you to share your private network with other people.
For this example we will use ZeroTier because it's free and open source.
1. Installation
^^^^^^^^^^^^^^^
Let's start by downloading and installing ZeroTier:
https://www.zerotier.com/download/
Once installed, launch it.
2. Network creation
^^^^^^^^^^^^^^^^^^^
To create a ZeroTier private network you need to register a ZeroTier account `on my.zerotier.com <https://my.zerotier.com/login>`_
(click on **login** then register at the bottom).
Once your account is activated, you can log in to `my.zerotier.com <https://my.zerotier.com/login>`_.
Head to the **Network** section (highlighted in red in the image below).
.. figure:: img/hosting_guide_head_network.png
:align: center
:width: 450px
ZeroTier user homepage
Hit 'Create a network' (see image below) and go to the network settings.
.. figure:: img/hosting_guide_create_network.png
:align: center
:width: 450px
Network page
Now that the network is created, let's configure it.
In the Settings section (see image below), you can change the network name to whatever you want.
Make sure that the **Access Control** field is set to **PRIVATE**.
.. hint::
If you set the Access Control to PUBLIC, anyone will be able to join without
your confirmation. It is easier to set up but less secure.
.. figure:: img/hosting_guide_network_settings.png
:align: center
:width: 450px
Network settings
That's all for the network setup !
Now let's connect everyone.
.. _network-authorization:
3. Network authorization
^^^^^^^^^^^^^^^^^^^^^^^^
Since your ZeroTier network is private, you will need to authorize each new user
to connect to it.
For each user you want to add, follow these steps:
1. Get the client's **ZeroTier id**: right-click on the ZeroTier tray icon and click on the `Node ID`; this copies it to the clipboard.
.. figure:: img/hosting_guide_get_node.png
:align: center
:width: 450px
Get the ZeroTier client id
2. Go to the Members section of the network settings and paste the Node ID into the Manually Add Member field.
.. figure:: img/hosting_guide_add_node.png
:align: center
:width: 450px
Add the client to network authorized users
4. Network connection
^^^^^^^^^^^^^^^^^^^^^
To connect to the ZeroTier network, get the network id from the network settings (see image below).
.. figure:: img/hosting_guide_get_id.png
:align: center
:width: 450px
Now we are ready to join the network!
Right-click on the ZeroTier tray icon and select **Join Network**.
.. figure:: img/hosting_guide_join_network.png
:align: center
:width: 450px
.. figure:: img/hosting_guide_join.png
:align: center
Joining the network
Paste the network id, check ``Allow Managed``, then click Join!
You should now be connected to the network.
Let's check the connection status. Right click on the tray icon and click on **Show Networks...**.
.. figure:: img/hosting_guide_show_network.png
:align: center
:width: 450px
Show network status
.. figure:: img/hosting_guide_network_status.png
:align: center
Network status.
The network status must be **OK** for each user (like in the picture above); otherwise it means that you are not connected to the network.
If you see something like **ACCESS_DENIED**, it means that you were not authorized to join the network. Please check the :ref:`network-authorization` section.
That's it for the ZeroTier network setup. Everything should now be set up to use the multi-user add-on over the internet! You can now follow the :ref:`quickstart` guide to start using the multi-user add-on!
.. _port-forwarding:
Using port-forwarding
---------------------
The port-forwarding method consists of configuring your network router to allow internet traffic through specific ports.
To know which ports are used by the add-on, check the :ref:`port-setup` section.
To set up port forwarding for each port, you can for example follow this `guide <https://www.wikihow.com/Set-Up-Port-Forwarding-on-a-Router>`_.
Once you have set up the network you can follow the :ref:`quickstart` guide to start using the multi-user add-on !
.. _host-dedicated:
--------------------------
From the dedicated server
--------------------------
.. warning::
The dedicated server is developed to run directly on an internet server (like a VPS). You can also
run it at home on a LAN, but for internet hosting you need to follow the :ref:`port-forwarding` setup first.
The dedicated server allows you to host a session with simplicity from any location.
It was developed to improve internet hosting performance.
The dedicated server can be run in two ways:
- :ref:`cmd-line`
- :ref:`docker`
.. _cmd-line:
Using a regular command line
----------------------------
You can run the dedicated server on any platform by following these steps:
1. First, download and install Python 3 (3.6 or above).
2. Install the replication library:
.. code-block:: bash
python -m pip install replication
3. Launch the server with:
.. code-block:: bash
replication.serve
.. hint::
You can also specify a custom **port** (-p), **timeout** (-t), **admin password** (-pwd), **log level (ERROR, WARNING, INFO or DEBUG)** (-l) and **log file** (-lf) with the following optional arguments:
.. code-block:: bash
replication.serve -p 5555 -pwd toto -t 1000 -l INFO -lf server.log
As soon as the dedicated server is running, you can connect to it from blender (follow :ref:`how-to-join`).
.. hint::
Some commands are available to manage the session. Check :ref:`dedicated-management` to learn more.
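On a VPS, you may also want the dedicated server to start at boot and restart on failure. A minimal systemd unit could look like the sketch below; the unit name, user, and paths are assumptions to adapt to your own setup, and it assumes ``replication.serve`` ended up on the PATH after the pip install:

```ini
# /etc/systemd/system/multi-user-server.service (hypothetical example)
[Unit]
Description=Multi-user dedicated replication server
After=network.target

[Service]
# Adjust User and ExecStart to where pip installed replication.serve
User=multiuser
ExecStart=/usr/local/bin/replication.serve -p 5555 -t 1000 -l INFO -lf /var/log/multi-user.log
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with ``systemctl enable --now multi-user-server``.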
.. _docker:
Using a pre-configured image on docker engine
---------------------------------------------
Launching the dedicated server from a docker engine is as simple as:
.. code-block:: bash
docker run -d \
-p 5555-5560:5555-5560 \
-e port=5555 \
-e password=admin \
-e timeout=1000 \
registry.gitlab.com/slumber/multi-user/multi-user-server:0.0.3
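For reference, the same container can be described declaratively with docker-compose. This is a sketch mirroring the ``docker run`` command above; the image tag and environment variable names are taken directly from it:

```yaml
# docker-compose.yml -- equivalent of the docker run command above (sketch)
version: "3"
services:
  multi-user-server:
    image: registry.gitlab.com/slumber/multi-user/multi-user-server:0.0.3
    restart: unless-stopped
    ports:
      - "5555-5560:5555-5560"
    environment:
      port: "5555"
      password: "admin"
      timeout: "1000"
```

Start it with ``docker-compose up -d``.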
As soon as the dedicated server is running, you can connect to it from blender.
You can check the :ref:`how-to-join` section.
.. hint::
Some commands are available to manage the session. Check :ref:`dedicated-management` to learn more.
.. _dedicated-management:
Dedicated server management
---------------------------
Here is the list of available commands from the dedicated server:
- ``help``: Show all commands.
- ``exit`` or ``Ctrl+C`` : Stop the server.
- ``kick username``: kick the provided user.
- ``users``: list all online users.
.. _port-setup:
----------
Port setup
----------
The multi-user network architecture is based on a client-server model. The communication protocol uses four ports to communicate with clients:
* Commands: command transmission (such as **snapshots**, **change_rights**, etc.) [given port]
* Subscriber: pull data [Commands port + 1]
* Publisher: push data [Commands port + 2]
* TTL (time to live): used to ping each client [Commands port + 3]
To know which ports will be used, just read the port in your preferences.
.. figure:: img/hosting_guide_port.png
:align: center
:alt: Port
:width: 200px
Port in host settings
In the picture above we have set our port to **5555**, so the ports will be:
* Commands: 5555 (**5555**)
* Subscriber: 5556 (**5555** +1)
* Publisher: 5557 (**5555** +2)
* TTL: 5558 (**5555** +3)
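In other words, the four ports are fixed offsets from the configured base port, which can be expressed as a small helper:

```python
# Derive the four multi-user communication ports from the base port.
def session_ports(base_port: int) -> dict:
    return {
        "commands": base_port,        # command transmission
        "subscriber": base_port + 1,  # pull data
        "publisher": base_port + 2,   # push data
        "ttl": base_port + 3,         # client ping (time to live)
    }
```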
Now that we know which ports are needed to communicate, we need to allow other computers to communicate with ours.
By default your router will block those ports. In order to grant server access to people over the internet, you have multiple options:
1. Simple: use third-party software like `HAMACHI <https://vpn.net/>`_ (free for up to 5 users) or `ZEROTIER <https://www.zerotier.com/download/>`_ to handle network sharing.
2. Harder: set up a VPN server and allow remote users to connect to your VPN.
3. **Not secure** but simple: set up port forwarding for each port (for example 5555, 5556, 5557 and 5558 in our case). You can for example follow this `guide <https://www.wikihow.com/Set-Up-Port-Forwarding-on-a-Router>`_.
Once you have set up the network, you can run **HOST** to start the server. Other users can then join your session in the regular way.
Those four ports need to be accessible from the client, otherwise it won't work at all!
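If you use port forwarding, the server machine's own firewall must also accept those four ports. As an illustrative configuration sketch with ``ufw`` on Linux (adjust the range if your base port differs):

```shell
# Open the four multi-user ports (base port 5555) -- adapt to your firewall.
sudo ufw allow 5555:5558/tcp
sudo ufw status
```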

View File

@ -1,12 +1,31 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
bl_info = {
"name": "Multi-User",
"author": "Swann Martinez",
"version": (0, 0, 2),
"version": (0, 1, 0),
"description": "Enable real-time collaborative workflow inside blender",
"blender": (2, 80, 0),
"blender": (2, 82, 0),
"location": "3D View > Sidebar > Multi-User tab",
"warning": "Unstable addon, use it at your own risks",
"category": "Collaboration",
"doc_url": "https://multi-user.readthedocs.io/en/develop/index.html",
"wiki_url": "https://multi-user.readthedocs.io/en/develop/index.html",
"tracker_url": "https://gitlab.com/slumber/multi-user/issues",
"support": "COMMUNITY"
@ -21,282 +40,65 @@ import sys
import bpy
from bpy.app.handlers import persistent
from . import environment, utils, presence
from .libs.replication.replication.constants import RP_COMMON
from . import environment, utils
# TODO: remove dependency as soon as replication will be installed as a module
DEPENDENCIES = {
("zmq","zmq"),
("msgpack","msgpack"),
("yaml","pyyaml"),
("jsondiff","jsondiff")
("replication", '0.0.21a15'),
}
logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)
#TODO: refactor config
# UTILITY FUNCTIONS
def generate_supported_types():
stype_dict = {'supported_types':{}}
for type in bl_types.types_to_register():
type_module = getattr(bl_types, type)
type_impl_name = "Bl{}".format(type.split('_')[1].capitalize())
type_module_class = getattr(type_module, type_impl_name)
props = {}
props['bl_delay_refresh']=type_module_class.bl_delay_refresh
props['bl_delay_apply']=type_module_class.bl_delay_apply
props['use_as_filter'] = False
props['icon'] = type_module_class.bl_icon
props['auto_push']=type_module_class.bl_automatic_push
props['bl_name']=type_module_class.bl_id
stype_dict['supported_types'][type_impl_name] = props
return stype_dict
def client_list_callback(scene, context):
from . import operators
items = [(RP_COMMON, RP_COMMON, "")]
username = bpy.context.window_manager.session.username
cli = operators.client
if cli:
client_ids = cli.online_users.keys()
for id in client_ids:
name_desc = id
if id == username:
name_desc += " (self)"
items.append((id, name_desc, ""))
return items
def randomColor():
r = random.random()
v = random.random()
b = random.random()
return [r, v, b]
class ReplicatedDatablock(bpy.types.PropertyGroup):
'''name = StringProperty() '''
type_name: bpy.props.StringProperty()
bl_name: bpy.props.StringProperty()
bl_delay_refresh: bpy.props.FloatProperty()
bl_delay_apply: bpy.props.FloatProperty()
use_as_filter: bpy.props.BoolProperty(default=True)
auto_push: bpy.props.BoolProperty(default=True)
icon: bpy.props.StringProperty()
class SessionUser(bpy.types.PropertyGroup):
"""Session User
Blender user information property
"""
username: bpy.props.StringProperty(name="username")
current_frame: bpy.props.IntProperty(name="current_frame")
class SessionProps(bpy.types.PropertyGroup):
username: bpy.props.StringProperty(
name="Username",
default="user_{}".format(utils.random_string_digits())
)
ip: bpy.props.StringProperty(
name="ip",
description='Distant host ip',
default="127.0.0.1"
)
user_uuid: bpy.props.StringProperty(
name="user_uuid",
default="None"
)
port: bpy.props.IntProperty(
name="port",
description='Distant host port',
default=5555
)
ipc_port: bpy.props.IntProperty(
name="ipc_port",
description='internal ttl port(only usefull for multiple local instances)',
default=5561
)
is_admin: bpy.props.BoolProperty(
name="is_admin",
default=False
)
start_empty: bpy.props.BoolProperty(
name="start_empty",
default=True
)
session_mode: bpy.props.EnumProperty(
name='session_mode',
description='session mode',
items={
('HOST', 'hosting', 'host a session'),
('CONNECT', 'connexion', 'connect to a session')},
default='HOST')
right_strategy: bpy.props.EnumProperty(
name='right_strategy',
description='right strategy',
items={
('STRICT', 'strict', 'strict right repartition'),
('COMMON', 'common', 'relaxed right repartition')},
default='COMMON')
client_color: bpy.props.FloatVectorProperty(
name="client_instance_color",
subtype='COLOR',
default=randomColor())
clients: bpy.props.EnumProperty(
name="clients",
description="client enum",
items=client_list_callback)
enable_presence: bpy.props.BoolProperty(
name="Presence overlay",
description='Enable overlay drawing module',
default=True,
update=presence.update_presence
)
presence_show_selected: bpy.props.BoolProperty(
name="Show selected objects",
description='Enable selection overlay ',
default=True,
update=presence.update_overlay_settings
)
presence_show_user: bpy.props.BoolProperty(
name="Show users",
description='Enable user overlay ',
default=True,
update=presence.update_overlay_settings
)
supported_datablock: bpy.props.CollectionProperty(
type=ReplicatedDatablock,
)
session_filter: bpy.props.CollectionProperty(
type=ReplicatedDatablock,
)
filter_owned: bpy.props.BoolProperty(
name="filter_owned",
description='Show only owned datablocks',
default=True
)
user_snap_running: bpy.props.BoolProperty(
default=False
)
time_snap_running: bpy.props.BoolProperty(
default=False
)
def load(self):
config = environment.load_config()
if "username" in config.keys():
self.username = config["username"]
self.ip = config["ip"]
self.port = config["port"]
self.start_empty = config["start_empty"]
self.enable_presence = config["enable_presence"]
self.client_color = config["client_color"]
else:
logger.error("Fail to read user config")
if len(self.supported_datablock)>0:
self.supported_datablock.clear()
if "supported_types" not in config:
config = generate_supported_types()
for datablock in config["supported_types"].keys():
rep_value = self.supported_datablock.add()
rep_value.name = datablock
rep_value.type_name = datablock
config_block = config["supported_types"][datablock]
rep_value.bl_delay_refresh = config_block['bl_delay_refresh']
rep_value.bl_delay_apply = config_block['bl_delay_apply']
rep_value.icon = config_block['icon']
rep_value.auto_push = config_block['auto_push']
rep_value.bl_name = config_block['bl_name']
def save(self,context):
config = environment.load_config()
if "supported_types" not in config:
config = generate_supported_types()
config["username"] = self.username
config["ip"] = self.ip
config["port"] = self.port
config["start_empty"] = self.start_empty
config["enable_presence"] = self.enable_presence
config["client_color"] = [self.client_color.r,self.client_color.g,self.client_color.b]
for bloc in self.supported_datablock:
config_block = config["supported_types"][bloc.type_name]
config_block['bl_delay_refresh'] = bloc.bl_delay_refresh
config_block['bl_delay_apply'] = bloc.bl_delay_apply
config_block['use_as_filter'] = bloc.use_as_filter
config_block['icon'] = bloc.icon
config_block['auto_push'] = bloc.auto_push
config_block['bl_name'] = bloc.bl_name
environment.save_config(config)
classes = (
SessionUser,
ReplicatedDatablock,
SessionProps,
)
libs = os.path.dirname(os.path.abspath(__file__))+"\\libs\\replication\\replication"
@persistent
def load_handler(dummy):
import bpy
bpy.context.window_manager.session.load()
module_error_msg = "Insufficient rights to install the multi-user \
dependencies, launch blender with administrator rights."
def register():
if libs not in sys.path:
sys.path.append(libs)
environment.setup(DEPENDENCIES,bpy.app.binary_path_python)
# Setup logging policy
logging.basicConfig(
format='%(asctime)s CLIENT %(levelname)-8s %(message)s',
datefmt='%H:%M:%S',
level=logging.INFO)
from . import presence
from . import operators
from . import ui
try:
environment.setup(DEPENDENCIES, bpy.app.binary_path_python)
for cls in classes:
bpy.utils.register_class(cls)
from . import presence
from . import operators
from . import ui
from . import preferences
from . import addon_updater_ops
preferences.register()
addon_updater_ops.register(bl_info)
presence.register()
operators.register()
ui.register()
except ModuleNotFoundError as e:
raise Exception(module_error_msg)
logging.error(module_error_msg)
bpy.types.WindowManager.session = bpy.props.PointerProperty(
type=SessionProps)
bpy.types.ID.uuid = bpy.props.StringProperty(default="")
type=preferences.SessionProps)
bpy.types.ID.uuid = bpy.props.StringProperty(
default="",
options={'HIDDEN', 'SKIP_SAVE'})
bpy.types.WindowManager.online_users = bpy.props.CollectionProperty(
type=SessionUser
type=preferences.SessionUser
)
bpy.types.WindowManager.user_index = bpy.props.IntProperty()
bpy.context.window_manager.session.load()
presence.register()
operators.register()
ui.register()
bpy.app.handlers.load_post.append(load_handler)
def unregister():
from . import presence
from . import operators
from . import ui
from . import preferences
from . import addon_updater_ops
presence.unregister()
addon_updater_ops.unregister()
ui.unregister()
operators.unregister()
preferences.unregister()
del bpy.types.WindowManager.session
for cls in reversed(classes):
bpy.utils.unregister_class(cls)
del bpy.types.ID.uuid
del bpy.types.WindowManager.online_users
del bpy.types.WindowManager.user_index

1715
multi_user/addon_updater.py Normal file

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

View File

@ -1,3 +1,21 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
__all__ = [
'bl_object',
'bl_mesh',
@ -16,11 +34,14 @@ __all__ = [
'bl_metaball',
'bl_lattice',
'bl_lightprobe',
'bl_speaker'
'bl_speaker',
'bl_font',
'bl_sound',
'bl_file'
] # Order here defines execution order
from . import *
from ..libs.replication.replication.data import ReplicatedDataFactory
from replication.data import ReplicatedDataFactory
def types_to_register():
return __all__

View File

@ -1,11 +1,132 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import copy
import numpy as np
from enum import Enum
from .. import utils
from .dump_anything import (
Dumper, Loader, np_dump_collection, np_load_collection, remove_items_from_dict)
from .bl_datablock import BlDatablock
# WIP
KEYFRAME = [
'amplitude',
'co',
'back',
'handle_left',
'handle_right',
'easing',
'handle_left_type',
'handle_right_type',
'type',
'interpolation',
]
def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy:bool =True) -> dict:
""" Dump a single curve to a dict
:arg fcurve: fcurve to dump
:type fcurve: bpy.types.FCurve
:arg use_numpy: use numpy to accelerate dump
:type use_numpy: bool
:return: dict
"""
fcurve_data = {
"data_path": fcurve.data_path,
"dumped_array_index": fcurve.array_index,
"use_numpy": use_numpy
}
if use_numpy:
points = fcurve.keyframe_points
fcurve_data['keyframes_count'] = len(fcurve.keyframe_points)
fcurve_data['keyframe_points'] = np_dump_collection(points, KEYFRAME)
else: # Legacy method
dumper = Dumper()
fcurve_data["keyframe_points"] = []
for k in fcurve.keyframe_points:
fcurve_data["keyframe_points"].append(
dumper.dump(k)
)
return fcurve_data
def load_fcurve(fcurve_data, fcurve):
""" Load a dumped fcurve
:arg fcurve_data: a dumped fcurve
:type fcurve_data: dict
:arg fcurve: fcurve to dump
:type fcurve: bpy.types.FCurve
"""
use_numpy = fcurve_data.get('use_numpy')
keyframe_points = fcurve.keyframe_points
# Remove all keyframe points
for i in range(len(keyframe_points)):
keyframe_points.remove(keyframe_points[0], fast=True)
if use_numpy:
keyframe_points.add(fcurve_data['keyframes_count'])
np_load_collection(fcurve_data["keyframe_points"], keyframe_points, KEYFRAME)
else:
# paste dumped keyframes
for dumped_keyframe_point in fcurve_data["keyframe_points"]:
if dumped_keyframe_point['type'] == '':
dumped_keyframe_point['type'] = 'KEYFRAME'
new_kf = keyframe_points.insert(
dumped_keyframe_point["co"][0],
dumped_keyframe_point["co"][1],
options={'FAST', 'REPLACE'}
)
keycache = copy.copy(dumped_keyframe_point)
keycache = remove_items_from_dict(
keycache,
["co", "handle_left", "handle_right", 'type']
)
loader = Loader()
loader.load(new_kf, keycache)
new_kf.type = dumped_keyframe_point['type']
new_kf.handle_left = [
dumped_keyframe_point["handle_left"][0],
dumped_keyframe_point["handle_left"][1]
]
new_kf.handle_right = [
dumped_keyframe_point["handle_right"][0],
dumped_keyframe_point["handle_right"][1]
]
fcurve.update()
class BlAction(BlDatablock):
bl_id = "actions"
@@ -13,87 +134,30 @@ class BlAction(BlDatablock):
     bl_delay_refresh = 1
     bl_delay_apply = 1
     bl_automatic_push = True
+    bl_check_common = False
     bl_icon = 'ACTION_TWEAK'

-    def construct(self, data):
+    def _construct(self, data):
         return bpy.data.actions.new(data["name"])

-    def load(self, data, target):
-        begin_frame = 100000
-        end_frame = -100000
-
-        for dumped_fcurve in data["fcurves"]:
-            begin_frame = min(
-                begin_frame,
-                min(
-                    [begin_frame] + [dkp["co"][0] for dkp in dumped_fcurve["keyframe_points"]]
-                )
-            )
-            end_frame = max(
-                end_frame,
-                max(
-                    [end_frame] + [dkp["co"][0] for dkp in dumped_fcurve["keyframe_points"]]
-                )
-            )
-        begin_frame = 0
-
-        loader = utils.dump_anything.Loader()
+    def _load_implementation(self, data, target):
         for dumped_fcurve in data["fcurves"]:
             dumped_data_path = dumped_fcurve["data_path"]
             dumped_array_index = dumped_fcurve["dumped_array_index"]

             # create fcurve if needed
-            fcurve = target.fcurves.find(dumped_data_path, index=dumped_array_index)
+            fcurve = target.fcurves.find(
+                dumped_data_path, index=dumped_array_index)
             if fcurve is None:
-                fcurve = target.fcurves.new(dumped_data_path, index=dumped_array_index)
+                fcurve = target.fcurves.new(
+                    dumped_data_path, index=dumped_array_index)

+            load_fcurve(dumped_fcurve, fcurve)
+        target.id_root = data['id_root']
-            # remove keyframes within dumped_action range
-            for keyframe in reversed(fcurve.keyframe_points):
-                if end_frame >= (keyframe.co[0] + begin_frame ) >= begin_frame:
-                    fcurve.keyframe_points.remove(keyframe, fast=True)
-
-            # paste dumped keyframes
-            for dumped_keyframe_point in dumped_fcurve["keyframe_points"]:
-                if dumped_keyframe_point['type'] == '':
-                    dumped_keyframe_point['type'] = 'KEYFRAME'
-
-                new_kf = fcurve.keyframe_points.insert(
-                    dumped_keyframe_point["co"][0] - begin_frame,
-                    dumped_keyframe_point["co"][1],
-                    options={'FAST', 'REPLACE'}
-                )
-
-                keycache = copy.copy(dumped_keyframe_point)
-                keycache = utils.dump_anything.remove_items_from_dict(
-                    keycache,
-                    ["co", "handle_left", "handle_right",'type']
-                )
-                loader.load(
-                    new_kf,
-                    keycache
-                )
-
-                new_kf.type = dumped_keyframe_point['type']
-                new_kf.handle_left = [
-                    dumped_keyframe_point["handle_left"][0] - begin_frame,
-                    dumped_keyframe_point["handle_left"][1]
-                ]
-                new_kf.handle_right = [
-                    dumped_keyframe_point["handle_right"][0] - begin_frame,
-                    dumped_keyframe_point["handle_right"][1]
-                ]
-
-            # clearing (needed for blender to update well)
-            if len(fcurve.keyframe_points) == 0:
-                target.fcurves.remove(fcurve)
-        target.id_root= data['id_root']

-    def dump(self, pointer=None):
-        assert(pointer)
-
-        dumper = utils.dump_anything.Dumper()
-        dumper.exclude_filter =[
+    def _dump_implementation(self, data, instance=None):
+        dumper = Dumper()
+        dumper.exclude_filter = [
             'name_full',
             'original',
             'use_fake_user',
@@ -106,28 +170,11 @@ class BlAction(BlDatablock):
             'users'
         ]
         dumper.depth = 1
-        data = dumper.dump(pointer)
+        data = dumper.dump(instance)

         data["fcurves"] = []
-        dumper.depth = 2
-
-        for fcurve in self.pointer.fcurves:
-            fc = {
-                "data_path": fcurve.data_path,
-                "dumped_array_index": fcurve.array_index,
-                "keyframe_points": []
-            }
-
-            for k in fcurve.keyframe_points:
-                fc["keyframe_points"].append(
-                    dumper.dump(k)
-                )
-
-            data["fcurves"].append(fc)
+        for fcurve in instance.fcurves:
+            data["fcurves"].append(dump_fcurve(fcurve, use_numpy=True))

         return data
-
-    def is_valid(self):
-        return bpy.data.actions.get(self.data['name'])
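The new `dump_fcurve(fcurve, use_numpy=True)` call replaces the per-keyframe `Dumper` dicts with flat arrays. A rough standalone sketch of that flattening idea (all names here are illustrative, not the addon's API; in Blender the same flat layout comes from `keyframe_points.foreach_get`):

```python
import numpy as np

class KeyframeStub:
    """Stand-in for a Blender keyframe point (illustrative only)."""
    pass

def dump_points(points, attrs=("co", "handle_left", "handle_right")):
    """Pack each 2D per-point attribute into one flat float list,
    mirroring the flattened layout foreach_get() produces."""
    return {a: np.array([getattr(p, a) for p in points], dtype=float).ravel().tolist()
            for a in attrs}

def load_points(dumped, factory=KeyframeStub):
    """Rebuild point objects from the flat lists."""
    count = len(dumped["co"]) // 2
    points = []
    for i in range(count):
        p = factory()
        for attr, flat in dumped.items():
            setattr(p, attr, [flat[2 * i], flat[2 * i + 1]])
        points.append(p)
    return points

# round-trip one keyframe
kf = KeyframeStub()
kf.co = [1.0, 2.0]
kf.handle_left = [0.5, 2.0]
kf.handle_right = [1.5, 2.0]
dumped = dump_points([kf])
restored = load_points(dumped)
```

Flat arrays serialize far more compactly than one nested dict per keyframe, which is presumably the motivation for the numpy path.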


@@ -1,12 +1,28 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from ..libs.overrider import Overrider
from .. import utils
from .. import presence, operators
from .bl_datablock import BlDatablock
# WIP
from .dump_anything import Loader, Dumper
from .. import presence, operators, utils
from .bl_datablock import BlDatablock
class BlArmature(BlDatablock):
@@ -15,12 +31,13 @@ class BlArmature(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 0
bl_automatic_push = True
bl_check_common = False
bl_icon = 'ARMATURE_DATA'
def construct(self, data):
def _construct(self, data):
return bpy.data.armatures.new(data["name"])
def load_implementation(self, data, target):
def _load_implementation(self, data, target):
# Load parent object
parent_object = utils.find_from_attr(
'uuid',
@@ -30,7 +47,7 @@ class BlArmature(BlDatablock):
if parent_object is None:
parent_object = bpy.data.objects.new(
data['user_name'], self.pointer)
data['user_name'], target)
parent_object.uuid = data['user']
is_object_in_master = (
@@ -65,10 +82,10 @@ class BlArmature(BlDatablock):
bpy.ops.object.mode_set(mode='EDIT')
for bone in data['bones']:
if bone not in self.pointer.edit_bones:
new_bone = self.pointer.edit_bones.new(bone)
if bone not in target.edit_bones:
new_bone = target.edit_bones.new(bone)
else:
new_bone = self.pointer.edit_bones[bone]
new_bone = target.edit_bones[bone]
bone_data = data['bones'].get(bone)
@@ -76,13 +93,15 @@ class BlArmature(BlDatablock):
new_bone.head = bone_data['head_local']
new_bone.tail_radius = bone_data['tail_radius']
new_bone.head_radius = bone_data['head_radius']
# new_bone.roll = bone_data['roll']
if 'parent' in bone_data:
new_bone.parent = self.pointer.edit_bones[data['bones']
new_bone.parent = target.edit_bones[data['bones']
[bone]['parent']]
new_bone.use_connect = bone_data['use_connect']
utils.dump_anything.load(new_bone, bone_data)
loader = Loader()
loader.load(new_bone, bone_data)
if bpy.context.mode != 'OBJECT':
bpy.ops.object.mode_set(mode='OBJECT')
@@ -92,10 +111,10 @@ class BlArmature(BlDatablock):
if 'EDIT' in current_mode:
bpy.ops.object.mode_set(mode='EDIT')
def dump_implementation(self, data, pointer=None):
assert(pointer)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = utils.dump_anything.Dumper()
dumper = Dumper()
dumper.depth = 4
dumper.include_filter = [
'bones',
@@ -106,16 +125,17 @@ class BlArmature(BlDatablock):
'use_connect',
'parent',
'name',
'layers'
'layers',
# 'roll',
]
data = dumper.dump(pointer)
data = dumper.dump(instance)
for bone in pointer.bones:
for bone in instance.bones:
if bone.parent:
data['bones'][bone.name]['parent'] = bone.parent.name
# get the parent Object
object_users = utils.get_datablock_users(pointer)[0]
object_users = utils.get_datablock_users(instance)[0]
data['user'] = object_users.uuid
data['user_name'] = object_users.name
@@ -127,5 +147,4 @@ class BlArmature(BlDatablock):
item.name for item in container_users if isinstance(item, bpy.types.Scene)]
return data
def is_valid(self):
return bpy.data.armatures.get(self.data['name'])


@@ -1,7 +1,25 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
@@ -11,25 +29,39 @@ class BlCamera(BlDatablock):
     bl_delay_refresh = 1
     bl_delay_apply = 1
     bl_automatic_push = True
+    bl_check_common = False
     bl_icon = 'CAMERA_DATA'

-    def load(self, data, target):
-        utils.dump_anything.load(target, data)
+    def _construct(self, data):
+        return bpy.data.cameras.new(data["name"])
+
+    def _load_implementation(self, data, target):
+        loader = Loader()
+        loader.load(target, data)

         dof_settings = data.get('dof')

         # DOF settings
         if dof_settings:
-            utils.dump_anything.load(target.dof, dof_settings)
+            loader.load(target.dof, dof_settings)

-    def construct(self, data):
-        return bpy.data.cameras.new(data["name"])
+        background_images = data.get('background_images')

-    def dump_implementation(self, data, pointer=None):
-        assert(pointer)
+        if background_images:
+            target.background_images.clear()
+            for img_name, img_data in background_images.items():
+                target_img = target.background_images.new()
+                target_img.image = bpy.data.images[img_name]
+                loader.load(target_img, img_data)

-        dumper = utils.dump_anything.Dumper()
-        dumper.depth = 2
+    def _dump_implementation(self, data, instance=None):
+        assert(instance)
+        # TODO: background image support
+        dumper = Dumper()
+        dumper.depth = 3
dumper.include_filter = [
"name",
'type',
@@ -48,9 +80,34 @@
'aperture_fstop',
'aperture_blades',
'aperture_rotation',
'ortho_scale',
'aperture_ratio',
'display_size',
'show_limits',
'show_mist',
'show_sensor',
'show_name',
'sensor_fit',
'sensor_height',
'sensor_width',
'show_background_images',
'background_images',
'alpha',
'display_depth',
'frame_method',
'offset',
'rotation',
'scale',
'use_flip_x',
'use_flip_y',
'image'
]
return dumper.dump(pointer)
def is_valid(self):
return bpy.data.cameras.get(self.data['name'])
return dumper.dump(instance)
def _resolve_deps_implementation(self):
deps = []
for background in self.instance.background_images:
if background.image:
deps.append(background.image)
return deps


@@ -1,8 +1,75 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .bl_datablock import BlDatablock
from .dump_anything import Loader, Dumper
def dump_collection_children(collection):
collection_children = []
for child in collection.children:
if child not in collection_children:
collection_children.append(child.uuid)
return collection_children
def dump_collection_objects(collection):
collection_objects = []
for object in collection.objects:
if object not in collection_objects:
collection_objects.append(object.uuid)
return collection_objects
def load_collection_objects(dumped_objects, collection):
for object in dumped_objects:
object_ref = utils.find_from_attr('uuid', object, bpy.data.objects)
if object_ref is None:
continue
elif object_ref.name not in collection.objects.keys():
collection.objects.link(object_ref)
for object in collection.objects:
if object.uuid not in dumped_objects:
collection.objects.unlink(object)
def load_collection_childrens(dumped_childrens, collection):
for child_collection in dumped_childrens:
collection_ref = utils.find_from_attr(
'uuid',
child_collection,
bpy.data.collections)
if collection_ref is None:
continue
if collection_ref.name not in collection.children.keys():
collection.children.link(collection_ref)
for child_collection in collection.children:
if child_collection.uuid not in dumped_childrens:
collection.children.unlink(child_collection)
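`load_collection_objects` and `load_collection_childrens` above implement the same reconciliation pattern: link every dumped uuid that resolves, then unlink anything the dump no longer lists. A standalone sketch of that set logic (plain dicts stand in for `bpy.data` and the collection; all names are illustrative):

```python
def sync_membership(dumped_uuids, linked, registry):
    """Reconcile a collection's membership (`linked`: uuid -> item)
    against the dumped uuid list, resolving new members from
    `registry` (uuid -> item)."""
    for uid in dumped_uuids:
        item = registry.get(uid)
        if item is None:
            continue  # unresolved reference: skip, as the loaders above do
        linked.setdefault(uid, item)   # link missing members
    for uid in list(linked):
        if uid not in dumped_uuids:
            del linked[uid]            # unlink stale members
    return linked

registry = {"uuid-a": "Cube", "uuid-b": "Light"}
result = sync_membership(["uuid-a", "uuid-b", "uuid-missing"],
                         {"uuid-b": "Light", "uuid-stale": "Old"},
                         registry)
```

Keying everything on uuid rather than name is what lets renames replicate without breaking membership.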
class BlCollection(BlDatablock):
@@ -12,81 +79,56 @@ class BlCollection(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
def construct(self, data):
bl_check_common = True
def _construct(self, data):
if self.is_library:
with bpy.data.libraries.load(filepath=bpy.data.libraries[self.data['library']].filepath, link=True) as (sourceData, targetData):
targetData.collections = [
name for name in sourceData.collections if name == self.data['name']]
instance = bpy.data.collections[self.data['name']]
instance.uuid = self.uuid
return instance
instance = bpy.data.collections.new(data["name"])
instance.uuid = self.uuid
return instance
def load(self, data, target):
# Load other meshes metadata
# dump_anything.load(target, data)
target.name = data["name"]
# link objects
for object in data["objects"]:
object_ref = utils.find_from_attr('uuid', object, bpy.data.objects)
if object_ref and object_ref.name not in target.objects.keys():
target.objects.link(object_ref)
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
for object in target.objects:
if object.uuid not in data["objects"]:
target.objects.unlink(object)
# Objects
load_collection_objects(data['objects'], target)
# Link childrens
for collection in data["children"]:
collection_ref = utils.find_from_attr(
'uuid', collection, bpy.data.collections)
if collection_ref and collection_ref.name not in target.children.keys():
target.children.link(collection_ref)
load_collection_childrens(data['children'], target)
for collection in target.children:
if collection.uuid not in data["children"]:
target.children.unlink(collection)
def _dump_implementation(self, data, instance=None):
assert(instance)
def dump_implementation(self, data, pointer=None):
assert(pointer)
data = {}
data['name'] = pointer.name
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"name",
"instance_offset"
]
data = dumper.dump(instance)
# dump objects
collection_objects = []
for object in pointer.objects:
if object not in collection_objects:
collection_objects.append(object.uuid)
data['objects'] = collection_objects
data['objects'] = dump_collection_objects(instance)
# dump children collections
collection_children = []
for child in pointer.children:
if child not in collection_children:
collection_children.append(child.uuid)
data['children'] = collection_children
data['children'] = dump_collection_children(instance)
return data
def resolve_dependencies(self):
def _resolve_deps_implementation(self):
deps = []
for child in self.pointer.children:
for child in self.instance.children:
deps.append(child)
for object in self.pointer.objects:
for object in self.instance.objects:
deps.append(object)
return deps
def is_valid(self):
return bpy.data.collections.get(self.data['name'])


@@ -1,8 +1,142 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import bpy.types as T
import mathutils
import logging
from .. import utils
from .bl_datablock import BlDatablock
from .dump_anything import (Dumper, Loader,
np_load_collection,
np_dump_collection)
SPLINE_BEZIER_POINT = [
# "handle_left_type",
# "handle_right_type",
"handle_left",
"co",
"handle_right",
"tilt",
"weight_softbody",
"radius",
]
SPLINE_POINT = [
"co",
"tilt",
"weight_softbody",
"radius",
]
CURVE_METADATA = [
'align_x',
'align_y',
'bevel_depth',
'bevel_factor_end',
'bevel_factor_mapping_end',
'bevel_factor_mapping_start',
'bevel_factor_start',
'bevel_object',
'bevel_resolution',
'body',
'body_format',
'dimensions',
'eval_time',
'extrude',
'family',
'fill_mode',
'follow_curve',
'font',
'font_bold',
'font_bold_italic',
'font_italic',
'make_local',
'materials',
'name',
'offset',
'offset_x',
'offset_y',
'overflow',
'original',
'override_create',
'override_library',
'path_duration',
'preview',
'render_resolution_u',
'render_resolution_v',
'resolution_u',
'resolution_v',
'shape_keys',
'shear',
'size',
'small_caps_scale',
'space_character',
'space_line',
'space_word',
'type',
'taper_object',
'texspace_location',
'texspace_size',
'transform',
'twist_mode',
'twist_smooth',
'underline_height',
'underline_position',
'use_auto_texspace',
'use_deform_bounds',
'use_fake_user',
'use_fill_caps',
'use_fill_deform',
'use_map_taper',
'use_path',
'use_path_follow',
'use_radius',
'use_stretch',
]
SPLINE_METADATA = [
'hide',
'material_index',
# 'order_u',
# 'order_v',
# 'point_count_u',
# 'point_count_v',
'points',
'radius_interpolation',
'resolution_u',
'resolution_v',
'tilt_interpolation',
'type',
'use_bezier_u',
'use_bezier_v',
'use_cyclic_u',
'use_cyclic_v',
'use_endpoint_u',
'use_endpoint_v',
'use_smooth',
]
class BlCurve(BlDatablock):
bl_id = "curves"
@@ -10,54 +144,97 @@ class BlCurve(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'CURVE_DATA'
def construct(self, data):
return bpy.data.curves.new(data["name"], 'CURVE')
def _construct(self, data):
return bpy.data.curves.new(data["name"], data["type"])
def load(self, data, target):
utils.dump_anything.load(target, data)
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
target.splines.clear()
# load splines
for spline in data['splines']:
new_spline = target.splines.new(data['splines'][spline]['type'])
utils.dump_anything.load(new_spline, data['splines'][spline])
for spline in data['splines'].values():
new_spline = target.splines.new(spline['type'])
# Load curve geometry data
for bezier_point_index in data['splines'][spline]["bezier_points"]:
if bezier_point_index != 0:
new_spline.bezier_points.add(1)
utils.dump_anything.load(
new_spline.bezier_points[bezier_point_index], data['splines'][spline]["bezier_points"][bezier_point_index])
if new_spline.type == 'BEZIER':
bezier_points = new_spline.bezier_points
bezier_points.add(spline['bezier_points_count'])
np_load_collection(spline['bezier_points'], bezier_points, SPLINE_BEZIER_POINT)
if new_spline.type == 'POLY':
points = new_spline.points
points.add(spline['points_count'])
np_load_collection(spline['points'], points, SPLINE_POINT)
# Not working for now...
# See https://blender.stackexchange.com/questions/7020/create-nurbs-surface-with-python
if new_spline.type == 'NURBS':
logging.error("NURBS not supported.")
# new_spline.points.add(len(data['splines'][spline]["points"])-1)
# for point_index in data['splines'][spline]["points"]:
# loader.load(
# new_spline.points[point_index], data['splines'][spline]["points"][point_index])
for point_index in data['splines'][spline]["points"]:
new_spline.points.add(1)
utils.dump_anything.load(
new_spline.points[point_index], data['splines'][spline]["points"][point_index])
loader.load(new_spline, spline)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = Dumper()
# Conflicting attributes
# TODO: remove them with the NURBS support
dumper.include_filter = CURVE_METADATA
def dump_implementation(self, data, pointer=None):
assert(pointer)
data = utils.dump_datablock(pointer, 1)
dumper.exclude_filter = [
'users',
'order_u',
'order_v',
'point_count_v',
'point_count_u',
'active_textbox'
]
if instance.use_auto_texspace:
dumper.exclude_filter.extend([
'texspace_location',
'texspace_size'])
data = dumper.dump(instance)
data['splines'] = {}
dumper = utils.dump_anything.Dumper()
dumper.depth = 3
for index,spline in enumerate(pointer.splines):
spline_data = {}
spline_data['points'] = dumper.dump(spline.points)
spline_data['bezier_points'] = dumper.dump(spline.bezier_points)
spline_data['type'] = dumper.dump(spline.type)
for index, spline in enumerate(instance.splines):
dumper.depth = 2
dumper.include_filter = SPLINE_METADATA
spline_data = dumper.dump(spline)
if spline.type == 'POLY':
spline_data['points_count'] = len(spline.points)-1
spline_data['points'] = np_dump_collection(spline.points, SPLINE_POINT)
spline_data['bezier_points_count'] = len(spline.bezier_points)-1
spline_data['bezier_points'] = np_dump_collection(spline.bezier_points, SPLINE_BEZIER_POINT)
data['splines'][index] = spline_data
if isinstance(pointer,'TextCurve'):
data['type'] = 'TEXT'
if isinstance(pointer,'SurfaceCurve'):
if isinstance(instance, T.SurfaceCurve):
data['type'] = 'SURFACE'
if isinstance(pointer,'TextCurve'):
elif isinstance(instance, T.TextCurve):
data['type'] = 'FONT'
elif isinstance(instance, T.Curve):
data['type'] = 'CURVE'
return data
def is_valid(self):
return bpy.data.curves.get(self.data['name'])
def _resolve_deps_implementation(self):
# TODO: resolve material
deps = []
curve = self.instance
if isinstance(curve, T.TextCurve):
deps.extend([
curve.font,
curve.font_bold,
curve.font_bold_italic,
curve.font_italic])
return deps


@@ -1,13 +1,51 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
from collections.abc import Iterable
import bpy
import mathutils
from replication.constants import DIFF_BINARY, UP
from replication.data import ReplicatedDatablock
from .. import utils
from ..libs.replication.replication.data import ReplicatedDatablock
from ..libs.replication.replication.constants import UP
from ..libs.replication.replication.constants import DIFF_BINARY
from .dump_anything import Dumper, Loader
def has_action(target):
""" Check if the target datablock has actions
"""
return (hasattr(target, 'animation_data')
and target.animation_data
and target.animation_data.action)
def has_driver(target):
""" Check if the target datablock is driven
"""
return (hasattr(target, 'animation_data')
and target.animation_data
and target.animation_data.drivers)
def dump_driver(driver):
dumper = utils.dump_anything.Dumper()
dumper = Dumper()
dumper.depth = 6
data = dumper.dump(driver)
@@ -15,6 +53,7 @@ def dump_driver(driver):
def load_driver(target_datablock, src_driver):
loader = Loader()
drivers = target_datablock.animation_data.drivers
src_driver_data = src_driver['driver']
new_driver = drivers.new(src_driver['data_path'])
@@ -22,7 +61,7 @@ def load_driver(target_datablock, src_driver):
# Settings
new_driver.driver.type = src_driver_data['type']
new_driver.driver.expression = src_driver_data['expression']
utils.dump_anything.load(new_driver, src_driver)
loader.load(new_driver, src_driver)
# Variables
for src_variable in src_driver_data['variables']:
@@ -35,7 +74,7 @@ def load_driver(target_datablock, src_driver):
src_target_data = src_var_data['targets'][src_target]
new_var.targets[src_target].id = utils.resolve_from_id(
src_target_data['id'], src_target_data['id_type'])
utils.dump_anything.load(
loader.load(
new_var.targets[src_target], src_target_data)
# Fcurve
@@ -47,8 +86,20 @@ def load_driver(target_datablock, src_driver):
for index, src_point in enumerate(src_driver['keyframe_points']):
new_point = new_fcurve[index]
utils.dump_anything.load(
new_point, src_driver['keyframe_points'][src_point])
loader.load(new_point, src_driver['keyframe_points'][src_point])
def get_datablock_from_uuid(uuid, default, ignore=[]):
if not uuid:
return default
for category in dir(bpy.data):
root = getattr(bpy.data, category)
if isinstance(root, Iterable) and category not in ignore:
for item in root:
if getattr(item, 'uuid', None) == uuid:
return item
return default
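`get_datablock_from_uuid` and `BlDatablock.resolve` share one fallback chain: look up by uuid first, fall back to name (the uuid can be lost on undo), and construct as a last resort, reassigning the uuid either way. A toy version of that chain, with a name-keyed dict standing in for a `bpy.data` collection (names are illustrative):

```python
def resolve_datablock(uuid, name, root, construct):
    """Resolve by uuid, then by name, then by constructing;
    always reassign the uuid to the resolved item.
    `root` maps name -> dict and stands in for e.g. bpy.data.actions."""
    for item in root.values():
        if item.get("uuid") == uuid:
            return item
    item = root.get(name)
    if item is None:
        item = construct()      # last resort: build a fresh datablock
        root[name] = item
    item["uuid"] = uuid         # heal a uuid lost to undo
    return item

blocks = {"Cube": {"uuid": None}}
found = resolve_datablock("u1", "Cube", blocks, dict)
created = resolve_datablock("u2", "Sphere", blocks, dict)
```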
class BlDatablock(ReplicatedDatablock):
@@ -60,92 +111,80 @@ class BlDatablock(ReplicatedDatablock):
bl_delay_apply : refresh rate in sec for apply
bl_automatic_push : boolean
bl_icon : type icon (blender icon name)
bl_check_common: enable check even in common rights
"""
bl_id = "scenes"
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
pointer = kwargs.get('pointer', None)
instance = kwargs.get('instance', None)
self.preferences = utils.get_preferences()
# TODO: use is_library_indirect
self.is_library = (pointer and hasattr(pointer, 'library') and
pointer.library) or \
self.is_library = (instance and hasattr(instance, 'library') and
instance.library) or \
(self.data and 'library' in self.data)
if self.is_library:
self.load = self.load_library
self.dump = self.dump_library
self.diff = self.diff_library
self.resolve_dependencies = self.resolve_dependencies_library
if instance and hasattr(instance, 'uuid'):
instance.uuid = self.uuid
if self.pointer and hasattr(self.pointer, 'uuid'):
self.pointer.uuid = self.uuid
self.diff_method = DIFF_BINARY
def library_apply(self):
"""Apply stored data
"""
# UP in case we want to reset our pointer data
self.state = UP
def bl_diff(self):
"""Generic datablock diff"""
return self.pointer.name != self.data['name']
def construct_library(self, data):
return None
def load_library(self, data, target):
pass
def dump_library(self, pointer=None):
return utils.dump_datablock(pointer, 1)
def diff_library(self):
return False
def resolve_dependencies_library(self):
return [self.pointer.library]
def resolve(self):
datablock_ref = None
datablock_root = getattr(bpy.data, self.bl_id)
datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
# In case of lost uuid (ex: undo), resolve by name and reassign it
# TODO: avoid reference storing
if not datablock_ref:
datablock_ref = getattr(
bpy.data, self.bl_id).get(self.data['name'])
try:
datablock_ref = datablock_root[self.data['name']]
except Exception:
name = self.data.get('name')
logging.debug(f"Constructing {name}")
datablock_ref = self._construct(data=self.data)
if datablock_ref:
setattr(datablock_ref, 'uuid', self.uuid)
self.pointer = datablock_ref
self.instance = datablock_ref
def dump(self, pointer=None):
def remove_instance(self):
"""
Remove instance from blender data
"""
assert(self.instance)
datablock_root = getattr(bpy.data, self.bl_id)
datablock_root.remove(self.instance)
def _dump(self, instance=None):
dumper = Dumper()
data = {}
if utils.has_action(pointer):
dumper = utils.dump_anything.Dumper()
# Dump animation data
if has_action(instance):
dumper = Dumper()
dumper.include_filter = ['action']
data['animation_data'] = dumper.dump(pointer.animation_data)
data['animation_data'] = dumper.dump(instance.animation_data)
if utils.has_driver(pointer):
if has_driver(instance):
dumped_drivers = {'animation_data': {'drivers': []}}
for driver in pointer.animation_data.drivers:
for driver in instance.animation_data.drivers:
dumped_drivers['animation_data']['drivers'].append(
dump_driver(driver))
data.update(dumped_drivers)
data.update(self.dump_implementation(data, pointer=pointer))
if self.is_library:
data.update(dumper.dump(instance))
else:
data.update(self._dump_implementation(data, instance=instance))
return data
def dump_implementation(self, data, target):
def _dump_implementation(self, data, target):
raise NotImplementedError
def load(self, data, target):
def _load(self, data, target):
# Load animation data
if 'animation_data' in data.keys():
if target.animation_data is None:
@@ -161,18 +200,28 @@ class BlDatablock(ReplicatedDatablock):
if 'action' in data['animation_data']:
target.animation_data.action = bpy.data.actions[data['animation_data']['action']]
self.load_implementation(data, target)
if self.is_library:
return
else:
self._load_implementation(data, target)
def load_implementation(self, data, target):
def _load_implementation(self, data, target):
raise NotImplementedError
def resolve_dependencies(self):
def resolve_deps(self):
dependencies = []
if utils.has_action(self.pointer):
dependencies.append(self.pointer.animation_data.action)
if has_action(self.instance):
dependencies.append(self.instance.animation_data.action)
if not self.is_library:
dependencies.extend(self._resolve_deps_implementation())
logging.debug(f"{self.instance.name} dependencies: {dependencies}")
return dependencies
def _resolve_deps_implementation(self):
return []
def is_valid(self):
raise NotImplementedError
return getattr(bpy.data, self.bl_id).get(self.data['name'])
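The `_dump`/`_dump_implementation` split above is a template method: the base class dumps shared state (animation data, drivers, library handling) and delegates the type-specific part to subclasses. A minimal stand-in showing the shape (names here are illustrative, not the addon's):

```python
from types import SimpleNamespace

class ReplicatedBase:
    """Base handles the shared part of a dump, then delegates."""
    def dump(self, instance):
        data = {}
        action = getattr(instance, "action", None)
        if action:  # shared concern, dumped for every datablock type
            data["animation_data"] = {"action": action}
        data.update(self._dump_implementation(instance))
        return data

    def _dump_implementation(self, instance):
        raise NotImplementedError

class ActionLike(ReplicatedBase):
    """Subclass supplies only the type-specific fields."""
    def _dump_implementation(self, instance):
        return {"name": instance.name}

dumped = ActionLike().dump(SimpleNamespace(name="Walk", action="WalkAction"))
```

The underscore renames in this merge (`dump` → `_dump_implementation`, etc.) make that hook contract explicit.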


@@ -0,0 +1,140 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
import sys
from pathlib import Path
import bpy
import mathutils
from replication.constants import DIFF_BINARY, UP
from replication.data import ReplicatedDatablock
from .. import utils
from .dump_anything import Dumper, Loader
def get_filepath(filename):
"""
Construct the local filepath
"""
return str(Path(
utils.get_preferences().cache_directory,
filename
))
def ensure_unpacked(datablock):
if datablock.packed_file:
logging.info(f"Unpacking {datablock.name}")
filename = Path(bpy.path.abspath(datablock.filepath)).name
datablock.filepath = get_filepath(filename)
datablock.unpack(method="WRITE_ORIGINAL")
class BlFile(ReplicatedDatablock):
bl_id = 'file'
bl_name = "file"
bl_class = Path
bl_delay_refresh = 0
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'FILE'
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.instance = kwargs.get('instance', None)
if self.instance and not self.instance.exists():
raise FileNotFoundError(self.instance)
self.preferences = utils.get_preferences()
self.diff_method = DIFF_BINARY
def resolve(self):
if self.data:
self.instance = Path(get_filepath(self.data['name']))
if not self.instance.exists():
logging.debug("File don't exist, loading it.")
self._load(self.data, self.instance)
def push(self, socket, identity=None):
super().push(socket, identity=None)
if self.preferences.clear_memory_filecache:
del self.data['file']
def _dump(self, instance=None):
"""
Read the file and return a dict as:
{
name : filename
extension :
file: file content
}
"""
logging.info(f"Extracting file metadata")
data = {
'name': self.instance.name,
}
logging.info(
f"Reading {self.instance.name} content: {self.instance.stat().st_size} bytes")
try:
file = open(self.instance, "rb")
data['file'] = file.read()
file.close()
except IOError:
logging.warning(f"{self.instance} doesn't exist, skipping")
else:
file.close()
return data
def _load(self, data, target):
"""
Writing the file
"""
# TODO: check for empty data
if target.exists() and not self.diff():
logging.info(f"{data['name']} already on the disk, skipping.")
return
try:
file = open(target, "wb")
file.write(data['file'])
if self.preferences.clear_memory_filecache:
del self.data['file']
except IOError:
logging.warning(f"{target} doesn't exist, skipping")
else:
file.close()
def diff(self):
memory_size = sys.getsizeof(self.data['file'])-33
disk_size = self.instance.stat().st_size
return memory_size == disk_size
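`BlFile.diff` compares `sys.getsizeof(self.data['file']) - 33` with the on-disk size; 33 matches the header of an empty `bytes` object on 64-bit CPython, so the comparison is effectively against `len()` of the payload. A minimal sketch of the same size-based check using `len()` directly (illustrative names; note this sketch uses the changed-when-different convention):

```python
import tempfile
from pathlib import Path

def file_changed(cached: bytes, path: Path) -> bool:
    """Cheap change check: compare the cached payload size with the
    on-disk size. len() sidesteps the interpreter-dependent
    sys.getsizeof() object header (hence the -33 in the original)."""
    return len(cached) != path.stat().st_size

payload = b"multi-user cache"
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
cached_path = Path(f.name)

unchanged = file_changed(payload, cached_path)
changed = file_changed(payload + b"!", cached_path)
```

A size-only check is fast but will miss same-length edits; it trades accuracy for avoiding a full read of potentially large cached files.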


@@ -0,0 +1,74 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
from pathlib import Path
import bpy
from .bl_datablock import BlDatablock
from .bl_file import get_filepath, ensure_unpacked
from .dump_anything import Dumper, Loader
class BlFont(BlDatablock):
bl_id = "fonts"
bl_class = bpy.types.VectorFont
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'FILE_FONT'
def _construct(self, data):
filename = data.get('filename')
if filename == '<builtin>':
return bpy.data.fonts.load(filename)
else:
return bpy.data.fonts.load(get_filepath(filename))
def _load(self, data, target):
pass
def _dump(self, instance=None):
if instance.filepath == '<builtin>':
filename = '<builtin>'
else:
filename = Path(instance.filepath).name
if not filename:
raise FileExistsError(instance.filepath)
return {
'filename': filename,
'name': instance.name
}
def diff(self):
return False
def _resolve_deps_implementation(self):
deps = []
if self.instance.filepath and self.instance.filepath != '<builtin>':
ensure_unpacked(self.instance)
deps.append(Path(bpy.path.abspath(self.instance.filepath)))
return deps

View File

@@ -1,83 +1,283 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import numpy as np
from .. import utils
from .dump_anything import (Dumper,
Loader,
np_dump_collection,
np_load_collection)
from .bl_datablock import BlDatablock
# GPencil data API is structured as follows:
# GP-Object --> GP-Layers --> GP-Frames --> GP-Strokes --> GP-Stroke-Points
def load_gpencil_layer(target=None, data=None, create=False):
STROKE_POINT = [
'co',
'pressure',
'strength',
'uv_factor',
'uv_rotation'
utils.dump_anything.load(target, data)
for k,v in target.frames.items():
target.frames.remove(v)
for frame in data["frames"]:
tframe = target.frames.new(data["frames"][frame]['frame_number'])
]
# utils.dump_anything.load(tframe, data["frames"][frame])
for stroke in data["frames"][frame]["strokes"]:
try:
tstroke = tframe.strokes[stroke]
except:
tstroke = tframe.strokes.new()
utils.dump_anything.load(
tstroke, data["frames"][frame]["strokes"][stroke])
if bpy.app.version[1] >= 83:
STROKE_POINT.append('vertex_color')
for point in data["frames"][frame]["strokes"][stroke]["points"]:
p = data["frames"][frame]["strokes"][stroke]["points"][point]
def dump_stroke(stroke):
""" Dump a grease pencil stroke to a dict
tstroke.points.add(1)
tpoint = tstroke.points[len(tstroke.points)-1]
:param stroke: target grease pencil stroke
:type stroke: bpy.types.GPencilStroke
:return: dict
"""
assert(stroke)
dumper = Dumper()
dumper.include_filter = [
"aspect",
"display_mode",
"draw_cyclic",
"end_cap_mode",
"hardness",
"line_width",
"material_index",
"start_cap_mode",
"uv_rotation",
"uv_scale",
"uv_translation",
"vertex_color_fill",
]
dumped_stroke = dumper.dump(stroke)
# Stroke points
p_count = len(stroke.points)
dumped_stroke['p_count'] = p_count
dumped_stroke['points'] = np_dump_collection(stroke.points, STROKE_POINT)
# TODO: uv_factor, uv_rotation
return dumped_stroke
def load_stroke(stroke_data, stroke):
""" Load a grease pencil stroke from a dict
:param stroke_data: dumped grease pencil stroke
:type stroke_data: dict
:param stroke: target grease pencil stroke
:type stroke: bpy.types.GPencilStroke
"""
assert(stroke and stroke_data)
loader = Loader()
loader.load(stroke, stroke_data)
stroke.points.add(stroke_data["p_count"])
np_load_collection(stroke_data['points'], stroke.points, STROKE_POINT)
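`dump_stroke` / `load_stroke` above move point data through `np_dump_collection` / `np_load_collection`, which flatten a collection's named attributes into NumPy buffers (the `bpy` fast path uses `foreach_get` / `foreach_set`). A hedged pure-NumPy sketch of that round-trip, with a stand-in `Point` class instead of `bpy.types.GPencilStrokePoint`:

```python
import numpy as np

class Point:
    """Stand-in for a stroke point: a 3-float 'co' and a scalar 'pressure'."""
    def __init__(self, co, pressure):
        self.co = list(co)
        self.pressure = pressure

def dump_points(points):
    # Flatten each attribute across the whole collection, like foreach_get.
    return {
        'co': np.array([p.co for p in points], dtype=np.float32).tobytes(),
        'pressure': np.array([p.pressure for p in points], dtype=np.float32).tobytes(),
    }

def load_points(data, points):
    # Rebuild per-point values from the flat buffers, like foreach_set.
    co = np.frombuffer(data['co'], dtype=np.float32).reshape(-1, 3)
    pressure = np.frombuffer(data['pressure'], dtype=np.float32)
    for p, c, pr in zip(points, co, pressure):
        p.co = c.tolist()
        p.pressure = float(pr)

src = [Point((0, 0, 0), 1.0), Point((1, 2, 3), 0.5)]
dst = [Point((9, 9, 9), 0.0), Point((9, 9, 9), 0.0)]
load_points(dump_points(src), dst)
assert dst[1].co == [1.0, 2.0, 3.0] and dst[1].pressure == 0.5
```

Serializing one contiguous buffer per attribute is what makes replicating thousands of stroke points cheap compared to a per-point dict.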
def dump_frame(frame):
""" Dump a grease pencil frame to a dict
:param frame: target grease pencil frame
:type frame: bpy.types.GPencilFrame
:return: dict
"""
assert(frame)
dumped_frame = dict()
dumped_frame['frame_number'] = frame.frame_number
dumped_frame['strokes'] = []
# TODO: take existing strokes into account
for stroke in frame.strokes:
dumped_frame['strokes'].append(dump_stroke(stroke))
return dumped_frame
def load_frame(frame_data, frame):
""" Load a grease pencil frame from a dict
:param frame_data: source grease pencil frame
:type frame_data: dict
:param frame: target grease pencil frame
:type frame: bpy.types.GPencilFrame
"""
assert(frame and frame_data)
# frame.frame_number = frame_data['frame_number']
# TODO: take existing strokes into account
for stroke_data in frame_data['strokes']:
target_stroke = frame.strokes.new()
load_stroke(stroke_data, target_stroke)
def dump_layer(layer):
""" Dump a grease pencil layer
:param layer: target grease pencil layer
:type layer: bpy.types.GPencilLayer
"""
assert(layer)
dumper = Dumper()
dumper.include_filter = [
'info',
'opacity',
'channel_color',
'color',
# 'thickness', #TODO: enabling only for annotation
'tint_color',
'tint_factor',
'vertex_paint_opacity',
'line_change',
'use_onion_skinning',
'use_annotation_onion_skinning',
'annotation_onion_before_range',
'annotation_onion_after_range',
'annotation_onion_before_color',
'annotation_onion_after_color',
'pass_index',
# 'viewlayer_render',
'blend_mode',
'hide',
'annotation_hide',
'lock',
# 'lock_frame',
# 'lock_material',
# 'use_mask_layer',
'use_lights',
'use_solo_mode',
'select',
'show_points',
'show_in_front',
# 'parent',
# 'parent_type',
# 'parent_bone',
# 'matrix_inverse',
]
dumped_layer = dumper.dump(layer)
dumped_layer['frames'] = []
for frame in layer.frames:
dumped_layer['frames'].append(dump_frame(frame))
return dumped_layer
def load_layer(layer_data, layer):
""" Load a grease pencil layer from a dict
:param layer_data: source grease pencil layer data
:type layer_data: dict
:param layer: target grease pencil layer
:type layer: bpy.types.GPencilLayer
"""
# TODO: take existing data into account
loader = Loader()
loader.load(layer, layer_data)
for frame_data in layer_data["frames"]:
target_frame = layer.frames.new(frame_data['frame_number'])
load_frame(frame_data, target_frame)
utils.dump_anything.load(tpoint, p)
class BlGpencil(BlDatablock):
bl_id = "grease_pencils"
bl_class = bpy.types.GreasePencil
bl_delay_refresh = 5
bl_delay_apply = 5
bl_delay_refresh = 2
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'GREASEPENCIL'
def construct(self, data):
def _construct(self, data):
return bpy.data.grease_pencils.new(data["name"])
def load(self, data, target):
for layer in target.layers:
target.layers.remove(layer)
if "layers" in data.keys():
for layer in data["layers"]:
if layer not in target.layers.keys():
gp_layer = target.layers.new(data["layers"][layer]["info"])
else:
gp_layer = target.layers[layer]
load_gpencil_layer(
target=gp_layer, data=data["layers"][layer], create=True)
utils.dump_anything.load(target, data)
def _load_implementation(self, data, target):
target.materials.clear()
if "materials" in data.keys():
for mat in data['materials']:
target.materials.append(bpy.data.materials[mat])
def dump_implementation(self, data, pointer=None):
assert(pointer)
data = utils.dump_datablock(pointer, 2)
utils.dump_datablock_attibutes(
pointer, ['layers'], 9, data)
loader = Loader()
loader.load(target, data)
# TODO: reuse existing layer
for layer in target.layers:
target.layers.remove(layer)
if "layers" in data.keys():
for layer in data["layers"]:
layer_data = data["layers"].get(layer)
# if layer not in target.layers.keys():
target_layer = target.layers.new(data["layers"][layer]["info"])
# else:
# target_layer = target.layers[layer]
# target_layer.clear()
load_layer(layer_data, target_layer)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
'materials',
'name',
'zdepth_offset',
'stroke_thickness_space',
'pixel_factor',
'stroke_depth_order'
]
data = dumper.dump(instance)
data['layers'] = {}
for layer in instance.layers:
data['layers'][layer.info] = dump_layer(layer)
return data
def resolve_dependencies(self):
def _resolve_deps_implementation(self):
deps = []
for material in self.pointer.materials:
for material in self.instance.materials:
deps.append(material)
return deps
def is_valid(self):
return bpy.data.grease_pencils.get(self.data['name'])

View File

@@ -1,83 +1,123 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
from pathlib import Path
import bpy
import mathutils
import os
from .. import utils, environment
from .. import utils
from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader
from .bl_file import get_filepath, ensure_unpacked
def dump_image(image):
pixels = None
if image.source == "GENERATED":
img_name = "{}.png".format(image.name)
format_to_ext = {
'BMP': 'bmp',
'IRIS': 'sgi',
'PNG': 'png',
'JPEG': 'jpg',
'JPEG2000': 'jp2',
'TARGA': 'tga',
'TARGA_RAW': 'tga',
'CINEON': 'cin',
'DPX': 'dpx',
'OPEN_EXR_MULTILAYER': 'exr',
'OPEN_EXR': 'exr',
'HDR': 'hdr',
'TIFF': 'tiff',
'AVI_JPEG': 'avi',
'AVI_RAW': 'avi',
'FFMPEG': 'mpeg',
}
image.filepath_raw = os.path.join(environment.CACHE_DIR, img_name)
image.file_format = "PNG"
image.save()
if image.source == "FILE":
image_path = bpy.path.abspath(image.filepath_raw)
image_directory = os.path.dirname(image_path)
os.makedirs(image_directory, exist_ok=True)
image.save()
file = open(image_path, "rb")
pixels = file.read()
file.close()
else:
raise ValueError()
return pixels
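The `format_to_ext` table above maps Blender's `image.file_format` enum values to on-disk file extensions. A small lookup sketch; the `.png` fallback is an assumption mirroring the GENERATED branch (which always saves as PNG), not something the original states:

```python
format_to_ext = {
    'BMP': 'bmp', 'IRIS': 'sgi', 'PNG': 'png', 'JPEG': 'jpg',
    'JPEG2000': 'jp2', 'TARGA': 'tga', 'TARGA_RAW': 'tga',
    'CINEON': 'cin', 'DPX': 'dpx', 'OPEN_EXR_MULTILAYER': 'exr',
    'OPEN_EXR': 'exr', 'HDR': 'hdr', 'TIFF': 'tiff',
    'AVI_JPEG': 'avi', 'AVI_RAW': 'avi', 'FFMPEG': 'mpeg',
}

def image_filename(name: str, file_format: str) -> str:
    # Unknown formats fall back to PNG, like the GENERATED-image path.
    return f"{name}.{format_to_ext.get(file_format, 'png')}"

assert image_filename("render", 'OPEN_EXR') == "render.exr"
```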
class BlImage(BlDatablock):
bl_id = "images"
bl_class = bpy.types.Image
bl_delay_refresh = 0
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = False
bl_automatic_push = True
bl_check_common = False
bl_icon = 'IMAGE_DATA'
def construct(self, data):
def _construct(self, data):
return bpy.data.images.new(
name=data['name'],
width=data['size'][0],
height=data['size'][1]
)
name=data['name'],
width=data['size'][0],
height=data['size'][1]
)
def load(self, data, target):
image = target
def _load(self, data, target):
loader = Loader()
loader.load(data, target)
img_name = "{}.png".format(image.name)
target.source = 'FILE'
target.filepath_raw = get_filepath(data['filename'])
target.colorspace_settings.name = data["colorspace_settings"]["name"]
img_path = os.path.join(environment.CACHE_DIR, img_name)
def _dump(self, instance=None):
assert(instance)
file = open(img_path, 'wb')
file.write(data["pixels"])
file.close()
filename = Path(instance.filepath).name
image.source = 'FILE'
image.filepath = img_path
image.colorspace_settings.name = data["colorspace_settings"]["name"]
data = {
"filename": filename
}
def dump_implementation(self, data, pointer=None):
assert(pointer)
data = {}
data['pixels'] = dump_image(pointer)
dumper = utils.dump_anything.Dumper()
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
"name",
'size',
'height',
'alpha',
'float_buffer',
'filepath',
'source',
'colorspace_settings']
data.update(dumper.dump(pointer))
dumper.include_filter = [
"name",
'size',
'height',
'alpha',
'float_buffer',
'alpha_mode',
'colorspace_settings']
data.update(dumper.dump(instance))
return data
def diff(self):
return False
def is_valid(self):
return bpy.data.images.get(self.data['name'])
if self.instance and (self.instance.name != self.data['name']):
return True
else:
return False
def _resolve_deps_implementation(self):
deps = []
if self.instance.filepath:
if self.instance.packed_file:
filename = Path(bpy.path.abspath(self.instance.filepath)).name
self.instance.filepath = get_filepath(filename)
self.instance.save()
# An image can't be unpacked to the modified path
# TODO: make a bug report
self.instance.unpack(method="REMOVE")
elif self.instance.source == "GENERATED":
filename = f"{self.instance.name}.png"
self.instance.filepath = get_filepath(filename)
self.instance.save()
deps.append(Path(bpy.path.abspath(self.instance.filepath)))
return deps
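`_resolve_deps_implementation` above rewrites the image's `filepath` into a shared cache location before saving, so every peer resolves the same file name through `bl_file`. A sketch of that rewrite with a hypothetical `get_filepath`; the cache directory and the helper body here are assumptions, the real helper lives in `bl_file`:

```python
from pathlib import Path

CACHE_DIR = Path("/tmp/multiuser_cache")  # assumed cache location

def get_filepath(filename: str) -> str:
    # Assumed to mirror bl_file.get_filepath: one flat cache
    # directory, keyed by bare file name.
    return str(CACHE_DIR / filename)

# bpy.path.abspath() stand-in for a Blender-relative path:
original = "//textures/wood_diffuse.png"
filename = Path(original.lstrip('/')).name
assert filename == "wood_diffuse.png"
assert get_filepath(filename) == "/tmp/multiuser_cache/wood_diffuse.png"
```

Keying by bare file name is what lets the dependency graph treat the image file itself as a replicable node.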

View File

@@ -1,8 +1,29 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
from .bl_datablock import BlDatablock
from replication.exception import ContextError
POINT = ['co', 'weight_softbody', 'co_deform']
class BlLattice(BlDatablock):
@@ -11,21 +32,27 @@ class BlLattice(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LATTICE_DATA'
def load(self, data, target):
utils.dump_anything.load(target, data)
for point in data['points']:
utils.dump_anything.load(target.points[point], data["points"][point])
def construct(self, data):
def _construct(self, data):
return bpy.data.lattices.new(data["name"])
def dump(self, pointer=None):
assert(pointer)
def _load_implementation(self, data, target):
if target.is_editmode:
raise ContextError("lattice is in edit mode")
dumper = utils.dump_anything.Dumper()
dumper.depth = 3
loader = Loader()
loader.load(target, data)
np_load_collection(data['points'], target.points, POINT)
def _dump_implementation(self, data, instance=None):
if instance.is_editmode:
raise ContextError("lattice is in edit mode")
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"name",
'type',
@@ -35,18 +62,11 @@ class BlLattice(BlDatablock):
'interpolation_type_u',
'interpolation_type_v',
'interpolation_type_w',
'use_outside',
'points',
'co',
'weight_softbody',
'co_deform'
'use_outside'
]
data = dumper.dump(pointer)
data = dumper.dump(instance)
data['points'] = np_dump_collection(instance.points, POINT)
return data
def is_valid(self):
return bpy.data.lattices.get(self.data['name'])

View File

@@ -1,7 +1,25 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
@@ -11,18 +29,19 @@ class BlLibrary(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIBRARY_DATA_DIRECT'
def construct(self, data):
def _construct(self, data):
with bpy.data.libraries.load(filepath=data["filepath"], link=True) as (sourceData, targetData):
targetData = sourceData
return sourceData
def load(self, data, target):
def _load(self, data, target):
pass
def dump(self, pointer=None):
assert(pointer)
return utils.dump_datablock(pointer, 1)
def _dump(self, instance=None):
assert(instance)
dumper = Dumper()
return dumper.dump(instance)
def is_valid(self):
return bpy.data.libraries.get(self.data['name'])

View File

@@ -1,7 +1,25 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
@@ -11,17 +29,19 @@ class BlLight(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIGHT_DATA'
def construct(self, data):
def _construct(self, data):
return bpy.data.lights.new(data["name"], data["type"])
def load(self, data, target):
utils.dump_anything.load(target, data)
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
def dump_implementation(self, data, pointer=None):
assert(pointer)
dumper = utils.dump_anything.Dumper()
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = Dumper()
dumper.depth = 3
dumper.include_filter = [
"name",
@@ -41,11 +61,17 @@ class BlLight(BlDatablock):
"contact_shadow_distance",
"contact_shadow_soft_size",
"contact_shadow_bias",
"contact_shadow_thickness"
"contact_shadow_thickness",
"shape",
"size_y",
"size",
"angle",
'spot_size',
'spot_blend'
]
data = dumper.dump(pointer)
data = dumper.dump(instance)
return data
def is_valid(self):
return bpy.data.lights.get(self.data['name'])

View File

@@ -1,11 +1,28 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import logging
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
logger = logging.getLogger(__name__)
class BlLightprobe(BlDatablock):
bl_id = "lightprobes"
@@ -13,28 +30,27 @@ class BlLightprobe(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'LIGHTPROBE_GRID'
def load(self, data, target):
utils.dump_anything.load(target, data)
def construct(self, data):
def _construct(self, data):
type = 'CUBE' if data['type'] == 'CUBEMAP' else data['type']
# See https://developer.blender.org/D6396
if bpy.app.version[1] >= 83:
return bpy.data.lightprobes.new(data["name"], type)
else:
logger.warning("Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
logging.warning("Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
def dump(self, pointer=None):
assert(pointer)
def _dump_implementation(self, data, instance=None):
assert(instance)
if bpy.app.version[1] < 83:
logger.warning("Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
logging.warning("Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
dumper = utils.dump_anything.Dumper()
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"name",
@@ -57,7 +73,7 @@ class BlLightprobe(BlDatablock):
'visibility_blur'
]
return dumper.dump(pointer)
return dumper.dump(instance)
def is_valid(self):
return bpy.data.lightprobes.get(self.data['name'])

View File

@@ -1,117 +1,218 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
import logging
import re
from .. import utils
from .bl_datablock import BlDatablock
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock, get_datablock_from_uuid
logger = logging.getLogger(__name__)
def clean_color_ramp(target_ramp):
# clear existing
try:
for key in target_ramp.elements:
target_ramp.elements.remove(key)
except:
pass
def load_mapping(target_apping, source_mapping):
# clear existing curves
for curve in target_apping.curves:
for point in curve.points:
try:
curve.remove(point)
except:
continue
# Load curves
for curve in source_mapping['curves']:
for point in source_mapping['curves'][curve]['points']:
pos = source_mapping['curves'][curve]['points'][point]['location']
target_apping.curves[curve].points.new(pos[0],pos[1])
NODE_SOCKET_INDEX = re.compile(r'\[(\d*)\]')
def load_node(node_data, node_tree):
""" Load a node into a node_tree from a dict
def load_node(target_node_tree, source):
target_node = target_node_tree.nodes.get(source["name"])
:arg node_data: dumped node data
:type node_data: dict
:arg node_tree: target node_tree
:type node_tree: bpy.types.NodeTree
"""
loader = Loader()
target_node = node_tree.nodes.new(type=node_data["bl_idname"])
if target_node is None:
node_type = source["bl_idname"]
loader.load(target_node, node_data)
image_uuid = node_data.get('image_uuid', None)
target_node = target_node_tree.nodes.new(type=node_type)
if image_uuid and not target_node.image:
target_node.image = get_datablock_from_uuid(image_uuid,None)
# Clean color ramp before loading it
if source['type'] == 'VALTORGB':
clean_color_ramp(target_node.color_ramp)
if source['type'] == 'CURVE_RGB':
load_mapping(target_node.mapping, source['mapping'])
utils.dump_anything.load(
target_node,
source)
if source['type'] == 'TEX_IMAGE':
target_node.image = bpy.data.images[source['image']]
for input in source["inputs"]:
for input in node_data["inputs"]:
if hasattr(target_node.inputs[input], "default_value"):
try:
target_node.inputs[input].default_value = source["inputs"][input]["default_value"]
target_node.inputs[input].default_value = node_data["inputs"][input]["default_value"]
except:
logger.error("{} not supported, skipping".format(input))
logging.error(
f"Material {input} parameter not supported, skipping")
def load_link(target_node_tree, source):
input_socket = target_node_tree.nodes[source['to_node']
['name']].inputs[source['to_socket']['name']]
output_socket = target_node_tree.nodes[source['from_node']
['name']].outputs[source['from_socket']['name']]
target_node_tree.links.new(input_socket, output_socket)
def load_links(links_data, node_tree):
""" Load node_tree links from a list
:arg links_data: dumped node links
:type links_data: list
:arg node_tree: node links collection
:type node_tree: bpy.types.NodeTree
"""
for link in links_data:
input_socket = node_tree.nodes[link['to_node']].inputs[int(link['to_socket'])]
output_socket = node_tree.nodes[link['from_node']].outputs[int(link['from_socket'])]
node_tree.links.new(input_socket, output_socket)
def dump_links(links):
""" Dump node_tree links collection to a list
:arg links: node links collection
:type links: bpy.types.NodeLinks
:return: list
"""
links_data = []
for link in links:
to_socket = NODE_SOCKET_INDEX.search(link.to_socket.path_from_id()).group(1)
from_socket = NODE_SOCKET_INDEX.search(link.from_socket.path_from_id()).group(1)
links_data.append({
'to_node': link.to_node.name,
'to_socket': to_socket,
'from_node': link.from_node.name,
'from_socket': from_socket,
})
return links_data
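`dump_links` above stores sockets by positional index rather than by name, since socket names can repeat on a node; the index is recovered from the socket's `path_from_id()` string with the `NODE_SOCKET_INDEX` regex. A standalone demo (the path string is a representative example of what `path_from_id()` returns):

```python
import re

NODE_SOCKET_INDEX = re.compile(r'\[(\d*)\]')

# path_from_id() on a node socket yields something like this:
path = 'node_tree.nodes["Principled BSDF"].inputs[7]'

# The quoted node name cannot satisfy the digits-only group, so the
# first successful match is the trailing socket index.
match = NODE_SOCKET_INDEX.search(path)
assert match.group(1) == '7'
```

Indexing sockets positionally is what lets `load_links` reconnect links with `inputs[int(link['to_socket'])]` even when two sockets share a display name.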
def dump_node(node):
""" Dump a single node to a dict
:arg node: target node
:type node: bpy.types.Node
:return: dict
"""
node_dumper = Dumper()
node_dumper.depth = 1
node_dumper.exclude_filter = [
"dimensions",
"show_expanded",
"name_full",
"select",
"bl_height_min",
"bl_height_max",
"bl_height_default",
"bl_width_min",
"bl_width_max",
"type",
"bl_icon",
"bl_width_default",
"bl_static_type",
"show_tetxure",
"is_active_output",
"hide",
"show_options",
"show_preview",
"show_texture",
"outputs",
"width_hidden",
"image"
]
dumped_node = node_dumper.dump(node)
if hasattr(node, 'inputs'):
dumped_node['inputs'] = {}
for i in node.inputs:
input_dumper = Dumper()
input_dumper.depth = 2
input_dumper.include_filter = ["default_value"]
if hasattr(i, 'default_value'):
dumped_node['inputs'][i.name] = input_dumper.dump(
i)
if hasattr(node, 'color_ramp'):
ramp_dumper = Dumper()
ramp_dumper.depth = 4
ramp_dumper.include_filter = [
'elements',
'alpha',
'color',
'position'
]
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
if hasattr(node, 'mapping'):
curve_dumper = Dumper()
curve_dumper.depth = 5
curve_dumper.include_filter = [
'curves',
'points',
'location'
]
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
if hasattr(node, 'image') and getattr(node, 'image'):
dumped_node['image_uuid'] = node.image.uuid
return dumped_node
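`dump_node` above drives a `Dumper` with `include_filter` / `exclude_filter` lists to control which properties are serialized. A minimal pure-Python sketch of that filtering idea with a stand-in class; the real `Dumper` in `dump_anything` also handles recursion depth and `bpy` property types:

```python
class MiniDumper:
    """Dumps an object's attributes to a dict, honoring the two filters."""
    def __init__(self):
        self.include_filter = []
        self.exclude_filter = []

    def dump(self, obj):
        out = {}
        for name, value in vars(obj).items():
            # A non-empty include_filter is a whitelist...
            if self.include_filter and name not in self.include_filter:
                continue
            # ...and exclude_filter always wins.
            if name in self.exclude_filter:
                continue
            out[name] = value
        return out

class Light:
    def __init__(self):
        self.name = "Sun"
        self.energy = 5.0
        self.is_evaluated = False

dumper = MiniDumper()
dumper.include_filter = ['name', 'energy']
assert dumper.dump(Light()) == {'name': 'Sun', 'energy': 5.0}
```

The whitelist style used for most datablocks keeps the replicated payload small and avoids pushing read-only or UI-only properties.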
class BlMaterial(BlDatablock):
bl_id = "materials"
bl_class = bpy.types.Material
bl_delay_refresh = 10
bl_delay_apply = 10
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'MATERIAL_DATA'
def construct(self, data):
def _construct(self, data):
return bpy.data.materials.new(data["name"])
def load_implementation(self, data, target):
def _load_implementation(self, data, target):
loader = Loader()
target.name = data['name']
if data['is_grease_pencil']:
if not target.is_grease_pencil:
bpy.data.materials.create_gpencil_data(target)
utils.dump_anything.load(
loader.load(
target.grease_pencil, data['grease_pencil'])
utils.load_dict(data['grease_pencil'], target.grease_pencil)
elif data["use_nodes"]:
if data["use_nodes"]:
if target.node_tree is None:
target.use_nodes = True
target.node_tree.nodes.clear()
utils.dump_anything.load(target,data)
loader.load(target, data)
# Load nodes
for node in data["node_tree"]["nodes"]:
load_node(target.node_tree, data["node_tree"]["nodes"][node])
load_node(data["node_tree"]["nodes"][node], target.node_tree)
# Load nodes links
target.node_tree.links.clear()
for link in data["node_tree"]["links"]:
load_link(target.node_tree, data["node_tree"]["links"][link])
load_links(data["node_tree"]["links"], target.node_tree)
def dump_implementation(self, data, pointer=None):
assert(pointer)
mat_dumper = utils.dump_anything.Dumper()
def _dump_implementation(self, data, instance=None):
assert(instance)
mat_dumper = Dumper()
mat_dumper.depth = 2
mat_dumper.exclude_filter = [
"is_embed_data",
"is_evaluated",
"name_full",
"bl_description",
"bl_icon",
"bl_idname",
"bl_label",
"preview",
"original",
"uuid",
@@ -120,89 +221,54 @@ class BlMaterial(BlDatablock):
"line_color",
"view_center",
]
node_dumper = utils.dump_anything.Dumper()
node_dumper.depth = 1
node_dumper.exclude_filter = [
"dimensions",
"show_expanded"
"select",
"bl_height_min",
"bl_height_max",
"bl_width_min",
"bl_width_max",
"bl_width_default",
"hide",
"show_options",
"show_tetxures",
"show_preview",
"outputs",
"width_hidden"
]
input_dumper = utils.dump_anything.Dumper()
input_dumper.depth = 2
input_dumper.include_filter = ["default_value"]
links_dumper = utils.dump_anything.Dumper()
links_dumper.depth = 3
links_dumper.include_filter = [
"name",
"to_node",
"from_node",
"from_socket",
"to_socket"]
data = mat_dumper.dump(pointer)
data = mat_dumper.dump(instance)
if pointer.use_nodes:
if instance.use_nodes:
nodes = {}
for node in pointer.node_tree.nodes:
nodes[node.name] = node_dumper.dump(node)
if hasattr(node, 'inputs'):
nodes[node.name]['inputs'] = {}
for i in node.inputs:
if hasattr(i, 'default_value'):
nodes[node.name]['inputs'][i.name] = input_dumper.dump(
i)
if hasattr(node, 'color_ramp'):
ramp_dumper = utils.dump_anything.Dumper()
ramp_dumper.depth = 4
ramp_dumper.include_filter = [
'elements',
'alpha',
'color',
'position'
]
nodes[node.name]['color_ramp'] = ramp_dumper.dump(node.color_ramp)
if hasattr(node, 'mapping'):
curve_dumper = utils.dump_anything.Dumper()
curve_dumper.depth = 5
curve_dumper.include_filter = [
'curves',
'points',
'location'
]
nodes[node.name]['mapping'] = curve_dumper.dump(node.mapping)
for node in instance.node_tree.nodes:
nodes[node.name] = dump_node(node)
data["node_tree"]['nodes'] = nodes
data["node_tree"]["links"] = links_dumper.dump(pointer.node_tree.links)
elif pointer.is_grease_pencil:
utils.dump_datablock_attibutes(pointer, ["grease_pencil"], 3, data)
data["node_tree"]["links"] = dump_links(instance.node_tree.links)
if instance.is_grease_pencil:
gp_mat_dumper = Dumper()
gp_mat_dumper.depth = 3
gp_mat_dumper.include_filter = [
'show_stroke',
'mode',
'stroke_style',
'color',
'use_overlap_strokes',
'show_fill',
'fill_style',
'fill_color',
'pass_index',
'alignment_mode',
# 'fill_image',
'texture_opacity',
'mix_factor',
'texture_offset',
'texture_angle',
'texture_scale',
'texture_clamp',
'gradient_type',
'mix_color',
'flip'
]
data['grease_pencil'] = gp_mat_dumper.dump(instance.grease_pencil)
return data
def resolve_dependencies(self):
def _resolve_deps_implementation(self):
# TODO: resolve node group deps
deps = []
if self.pointer.use_nodes:
for node in self.pointer.node_tree.nodes:
if node.type == 'TEX_IMAGE':
if self.instance.use_nodes:
for node in self.instance.node_tree.nodes:
if node.type in ['TEX_IMAGE','TEX_ENVIRONMENT']:
deps.append(node.image)
if self.is_library:
deps.append(self.pointer.library)
deps.append(self.instance.library)
return deps
def is_valid(self):
return bpy.data.materials.get(self.data['name'])
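`_resolve_deps_implementation` above now also picks up `TEX_ENVIRONMENT` nodes, so environment textures replicate alongside regular image textures. A sketch of that node-type filter with hypothetical stand-in node objects (not `bpy` types):

```python
class FakeNode:
    """Stand-in for a shader node with just the fields the filter reads."""
    def __init__(self, type, image=None):
        self.type = type
        self.image = image

nodes = [
    FakeNode('BSDF_PRINCIPLED'),
    FakeNode('TEX_IMAGE', image='wood.png'),
    FakeNode('TEX_ENVIRONMENT', image='sky.hdr'),
]

# Same membership test as the diff: both texture node types count as deps.
deps = [n.image for n in nodes if n.type in ['TEX_IMAGE', 'TEX_ENVIRONMENT']]
assert deps == ['wood.png', 'sky.hdr']
```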

View File

@@ -1,166 +1,170 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import bmesh
import mathutils
import logging
import numpy as np
from .. import utils
from ..libs.replication.replication.constants import DIFF_BINARY
from .dump_anything import Dumper, Loader, np_load_collection_primitives, np_dump_collection_primitive, np_load_collection, np_dump_collection
from replication.constants import DIFF_BINARY
from replication.exception import ContextError
from .bl_datablock import BlDatablock
VERTICE = ['co']
def dump_mesh(mesh, data={}):
import bmesh
mesh_data = data
mesh_buffer = bmesh.new()
# https://blog.michelanders.nl/2016/02/copying-vertices-to-numpy-arrays-in_4.html
mesh_buffer.from_mesh(mesh)
uv_layer = mesh_buffer.loops.layers.uv.verify()
bevel_layer = mesh_buffer.verts.layers.bevel_weight.verify()
skin_layer = mesh_buffer.verts.layers.skin.verify()
verts = {}
for vert in mesh_buffer.verts:
v = {}
v["co"] = list(vert.co)
# vert metadata
v['bevel'] = vert[bevel_layer]
v['normal'] = list(vert.normal)
# v['skin'] = list(vert[skin_layer])
verts[str(vert.index)] = v
mesh_data["verts"] = verts
edges = {}
for edge in mesh_buffer.edges:
e = {}
e["verts"] = [edge.verts[0].index, edge.verts[1].index]
# Edge metadata
e["smooth"] = edge.smooth
edges[edge.index] = e
mesh_data["edges"] = edges
faces = {}
for face in mesh_buffer.faces:
f = {}
fverts = []
for vert in face.verts:
fverts.append(vert.index)
f["verts"] = fverts
f["material_index"] = face.material_index
f["smooth"] = face.smooth
f["normal"] = list(face.normal)
f["index"] = face.index
uvs = []
# Face metadata
for loop in face.loops:
loop_uv = loop[uv_layer]
uvs.append(list(loop_uv.uv))
f["uv"] = uvs
faces[face.index] = f
mesh_data["faces"] = faces
uv_layers = []
for uv_layer in mesh.uv_layers:
uv_layers.append(uv_layer.name)
mesh_data["uv_layers"] = uv_layers
# return mesh_data
EDGE = [
'vertices',
'crease',
'bevel_weight',
]
LOOP = [
'vertex_index',
'normal',
]
POLYGON = [
'loop_total',
'loop_start',
'use_smooth',
'material_index',
]
class BlMesh(BlDatablock):
bl_id = "meshes"
bl_class = bpy.types.Mesh
bl_delay_refresh = 10
bl_delay_apply = 10
bl_delay_refresh = 2
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'MESH_DATA'
def construct(self, data):
def _construct(self, data):
instance = bpy.data.meshes.new(data["name"])
instance.uuid = self.uuid
return instance
def load_implementation(self, data, target):
if not target or not target.is_editmode:
# 1 - LOAD MATERIAL SLOTS
# Slots
i = 0
def _load_implementation(self, data, target):
if not target or target.is_editmode:
raise ContextError
else:
loader = Loader()
loader.load(target, data)
# MATERIAL SLOTS
target.materials.clear()
for m in data["material_list"]:
target.materials.append(bpy.data.materials[m])
# 2 - LOAD GEOMETRY
mesh_buffer = bmesh.new()
# CLEAR GEOMETRY
if target.vertices:
target.clear_geometry()
for i in data["verts"]:
v = mesh_buffer.verts.new(data["verts"][i]["co"])
v.normal = data["verts"][i]["normal"]
mesh_buffer.verts.ensure_lookup_table()
target.vertices.add(data["vertex_count"])
target.edges.add(data["egdes_count"])
target.loops.add(data["loop_count"])
target.polygons.add(data["poly_count"])
for i in data["edges"]:
verts = mesh_buffer.verts
v1 = data["edges"][i]["verts"][0]
v2 = data["edges"][i]["verts"][1]
edge = mesh_buffer.edges.new([verts[v1], verts[v2]])
edge.smooth = data["edges"][i]["smooth"]
# LOADING
np_load_collection(data['vertices'], target.vertices, VERTICE)
np_load_collection(data['edges'], target.edges, EDGE)
np_load_collection(data['loops'], target.loops, LOOP)
np_load_collection(data["polygons"],target.polygons, POLYGON)
# UV Layers
if 'uv_layers' in data.keys():
for layer in data['uv_layers']:
if layer not in target.uv_layers:
target.uv_layers.new(name=layer)
np_load_collection_primitives(
target.uv_layers[layer].data,
'uv',
data["uv_layers"][layer]['data'])
mesh_buffer.edges.ensure_lookup_table()
for p in data["faces"]:
verts = []
for v in data["faces"][p]["verts"]:
verts.append(mesh_buffer.verts[v])
# Vertex color
if 'vertex_colors' in data.keys():
for color_layer in data['vertex_colors']:
if color_layer not in target.vertex_colors:
target.vertex_colors.new(name=color_layer)
if len(verts) > 0:
f = mesh_buffer.faces.new(verts)
np_load_collection_primitives(
target.vertex_colors[color_layer].data,
'color',
data["vertex_colors"][color_layer]['data'])
uv_layer = mesh_buffer.loops.layers.uv.verify()
target.validate()
target.update()
f.smooth = data["faces"][p]["smooth"]
f.normal = data["faces"][p]["normal"]
f.index = data["faces"][p]["index"]
f.material_index = data["faces"][p]['material_index']
# UV loading
for i, loop in enumerate(f.loops):
loop_uv = loop[uv_layer]
loop_uv.uv = data["faces"][p]["uv"][i]
mesh_buffer.faces.ensure_lookup_table()
mesh_buffer.to_mesh(target)
def _dump_implementation(self, data, instance=None):
assert(instance)
# 3 - LOAD METADATA
# uv's
utils.dump_anything.load(target.uv_layers, data['uv_layers'])
if instance.is_editmode and not self.preferences.sync_flags.sync_during_editmode:
raise ContextError("Mesh is in edit mode")
mesh = instance
bevel_layer = mesh_buffer.verts.layers.bevel_weight.verify()
skin_layer = mesh_buffer.verts.layers.skin.verify()
utils.dump_anything.load(target, data)
def dump_implementation(self, data, pointer=None):
assert(pointer)
dumper = utils.dump_anything.Dumper()
dumper.depth = 2
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
'name',
'use_auto_smooth',
'auto_smooth_angle'
'auto_smooth_angle',
'use_customdata_edge_bevel',
'use_customdata_edge_crease'
]
data = dumper.dump(pointer)
dump_mesh(pointer, data)
data = dumper.dump(mesh)
# VERTICES
data["vertex_count"] = len(mesh.vertices)
data["vertices"] = np_dump_collection(mesh.vertices, VERTICE)
# EDGES
data["egdes_count"] = len(mesh.edges)
data["edges"] = np_dump_collection(mesh.edges, EDGE)
# POLYGONS
data["poly_count"] = len(mesh.polygons)
data["polygons"] = np_dump_collection(mesh.polygons, POLYGON)
# LOOPS
data["loop_count"] = len(mesh.loops)
data["loops"] = np_dump_collection(mesh.loops, LOOP)
# UV Layers
if mesh.uv_layers:
data['uv_layers'] = {}
for layer in mesh.uv_layers:
data['uv_layers'][layer.name] = {}
data['uv_layers'][layer.name]['data'] = np_dump_collection_primitive(layer.data, 'uv')
# Vertex color
if mesh.vertex_colors:
data['vertex_colors'] = {}
for color_map in mesh.vertex_colors:
data['vertex_colors'][color_map.name] = {}
data['vertex_colors'][color_map.name]['data'] = np_dump_collection_primitive(color_map.data, 'color')
# Fix material index
m_list = []
for material in pointer.materials:
for material in instance.materials:
if material:
m_list.append(material.name)
@@ -168,14 +172,11 @@ class BlMesh(BlDatablock):
return data
def resolve_dependencies(self):
def _resolve_deps_implementation(self):
deps = []
for material in self.pointer.materials:
for material in self.instance.materials:
if material:
deps.append(material)
return deps
def is_valid(self):
return bpy.data.meshes.get(self.data['name'])
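The mesh refactor above replaces per-vertex dicts with flat attribute buffers filled by helpers like np_dump_collection / np_load_collection. A minimal standalone sketch of that flat-buffer idea, using a hypothetical `Item` class and the stdlib `array` module instead of Blender's collections and NumPy (not the addon's real API):

```python
from array import array

class Item:
    """Stand-in for a Blender collection element (e.g. a MeshVertex)."""
    def __init__(self, co=(0.0, 0.0, 0.0)):
        self.co = list(co)

def dump_flat(collection, attr):
    """Concatenate one attribute of every item into a contiguous buffer,
    mimicking what np_dump_collection does with numpy arrays."""
    buf = array('d')
    for item in collection:
        buf.extend(getattr(item, attr))
    return buf

def load_flat(buf, collection, attr, width):
    """Write the flat buffer back onto the items, index-aligned."""
    for i, item in enumerate(collection):
        setattr(item, attr, list(buf[i * width:(i + 1) * width]))

verts = [Item((0, 0, 0)), Item((1, 0, 0)), Item((0, 1, 0))]
dumped = dump_flat(verts, 'co')        # 9 floats, no per-vertex dicts
clones = [Item() for _ in verts]
load_flat(dumped, clones, 'co', 3)
```

The payoff is the same as in the diff: one contiguous buffer per attribute instead of thousands of tiny dicts, which is far cheaper to serialize and diff.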

View File

@@ -1,37 +1,106 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import (
Dumper, Loader, np_dump_collection_primitive, np_load_collection_primitives,
np_dump_collection, np_load_collection)
from .bl_datablock import BlDatablock
ELEMENT = [
'co',
'hide',
'radius',
'rotation',
'size_x',
'size_y',
'size_z',
'stiffness',
'type'
]
def dump_metaball_elements(elements):
""" Dump a metaball element
:arg element: metaball element
:type bpy.types.MetaElement
:return: dict
"""
dumped_elements = np_dump_collection(elements, ELEMENT)
return dumped_elements
def load_metaball_elements(elements_data, elements):
""" Dump a metaball element
:arg element: metaball element
:type bpy.types.MetaElement
:return: dict
"""
np_load_collection(elements_data, elements, ELEMENT)
class BlMetaball(BlDatablock):
bl_id = "metaballs"
bl_class = bpy.types.MetaBall
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'META_BALL'
def construct(self, data):
def _construct(self, data):
return bpy.data.metaballs.new(data["name"])
def load(self, data, target):
utils.dump_anything.load(target, data)
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
target.elements.clear()
for element in data["elements"]:
new_element = target.elements.new(type=data["elements"][element]['type'])
utils.dump_anything.load(new_element, data["elements"][element])
def dump_implementation(self, data, pointer=None):
assert(pointer)
dumper = utils.dump_anything.Dumper()
dumper.depth = 3
dumper.exclude_filter = ["is_editmode"]
for mtype in data["elements"]['type']:
new_element = target.elements.new()
load_metaball_elements(data['elements'], target.elements)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
'name',
'resolution',
'render_resolution',
'threshold',
'update_method',
'use_auto_texspace',
'texspace_location',
'texspace_size'
]
data = dumper.dump(instance)
data['elements'] = dump_metaball_elements(instance.elements)
data = dumper.dump(pointer)
return data
def is_valid(self):
return bpy.data.metaballs.get(self.data['name'])

View File

@@ -1,33 +1,82 @@
import bpy
import mathutils
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
from .. import utils
from .bl_datablock import BlDatablock
import bpy
import mathutils
from replication.exception import ContextError
logger = logging.getLogger(__name__)
def load_constraints(target, data):
for local_constraint in target.constraints:
if local_constraint.name not in data:
target.constraints.remove(local_constraint)
for constraint in data:
target_constraint = target.constraints.get(constraint)
if not target_constraint:
target_constraint = target.constraints.new(
data[constraint]['type'])
utils.dump_anything.load(
target_constraint, data[constraint])
from .bl_datablock import BlDatablock, get_datablock_from_uuid
from .dump_anything import Dumper, Loader
from replication.exception import ReparentException
def load_pose(target_bone, data):
target_bone.rotation_mode = data['rotation_mode']
loader = Loader()
loader.load(target_bone, data)
utils.dump_anything.load(target_bone, data)
def find_data_from_name(name=None):
instance = None
if not name:
pass
elif name in bpy.data.meshes.keys():
instance = bpy.data.meshes[name]
elif name in bpy.data.lights.keys():
instance = bpy.data.lights[name]
elif name in bpy.data.cameras.keys():
instance = bpy.data.cameras[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
elif name in bpy.data.metaballs.keys():
instance = bpy.data.metaballs[name]
elif name in bpy.data.armatures.keys():
instance = bpy.data.armatures[name]
elif name in bpy.data.grease_pencils.keys():
instance = bpy.data.grease_pencils[name]
elif name in bpy.data.curves.keys():
instance = bpy.data.curves[name]
elif name in bpy.data.lattices.keys():
instance = bpy.data.lattices[name]
elif name in bpy.data.speakers.keys():
instance = bpy.data.speakers[name]
elif name in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
instance = bpy.data.lightprobes[name]
else:
logging.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
return instance
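The elif chain in find_data_from_name scans the bpy.data collections in a fixed order and returns the first match. The same lookup can be written table-driven; in this sketch `bpy_data` is a hypothetical stand-in dict, not Blender's API:

```python
# Hypothetical stand-in for bpy.data: each entry maps names to datablocks.
bpy_data = {
    'meshes':  {'Cube': 'mesh:Cube'},
    'lights':  {'Sun': 'light:Sun'},
    'cameras': {},
}

# Search order matters: the first collection containing the name wins,
# exactly like the elif chain.
SEARCH_ORDER = ['meshes', 'lights', 'cameras']

def find_data_from_name(name=None):
    if not name:
        return None
    for coll in SEARCH_ORDER:
        block = bpy_data[coll].get(name)
        if block is not None:
            return block
    return None
```

A table also makes version-gated collections (like lightprobes in 2.83+) a one-line conditional append to SEARCH_ORDER instead of a nested branch.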
def load_data(object, name):
logging.info("loading data")
pass
def _is_editmode(object: bpy.types.Object) -> bool:
child_data = getattr(object, 'data', None)
return (child_data and
hasattr(child_data, 'is_editmode') and
child_data.is_editmode)
class BlObject(BlDatablock):
@@ -36,10 +85,11 @@ class BlObject(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'OBJECT_DATA'
def construct(self, data):
pointer = None
def _construct(self, data):
instance = None
if self.is_library:
with bpy.data.libraries.load(filepath=bpy.data.libraries[self.data['library']].filepath, link=True) as (sourceData, targetData):
@@ -50,69 +100,67 @@ class BlObject(BlDatablock):
instance.uuid = self.uuid
return instance
# Object specific constructor...
if "data" not in data:
pass
elif data["data"] in bpy.data.meshes.keys():
pointer = bpy.data.meshes[data["data"]]
elif data["data"] in bpy.data.lights.keys():
pointer = bpy.data.lights[data["data"]]
elif data["data"] in bpy.data.cameras.keys():
pointer = bpy.data.cameras[data["data"]]
elif data["data"] in bpy.data.curves.keys():
pointer = bpy.data.curves[data["data"]]
elif data["data"] in bpy.data.metaballs.keys():
pointer = bpy.data.metaballs[data["data"]]
elif data["data"] in bpy.data.armatures.keys():
pointer = bpy.data.armatures[data["data"]]
elif data["data"] in bpy.data.grease_pencils.keys():
pointer = bpy.data.grease_pencils[data["data"]]
elif data["data"] in bpy.data.curves.keys():
pointer = bpy.data.curves[data["data"]]
elif data["data"] in bpy.data.lattices.keys():
pointer = bpy.data.lattices[data["data"]]
elif data["data"] in bpy.data.speakers.keys():
pointer = bpy.data.speakers[data["data"]]
elif data["data"] in bpy.data.lightprobes.keys():
# Only supported since 2.83
if bpy.app.version[1] >= 83:
pointer = bpy.data.lightprobes[data["data"]]
else:
logger.warning(
"Lightprobe replication only supported since 2.83. See https://developer.blender.org/D6396")
instance = bpy.data.objects.new(data["name"], pointer)
# TODO: refactoring
object_name = data.get("name")
data_uuid = data.get("data_uuid")
data_id = data.get("data")
object_data = get_datablock_from_uuid(
data_uuid,
find_data_from_name(data_id),
ignore=['images']) #TODO: use resolve_from_id
instance = bpy.data.objects.new(object_name, object_data)
instance.uuid = self.uuid
return instance
def load_implementation(self, data, target):
def _load_implementation(self, data, target):
loader = Loader()
data_uuid = data.get("data_uuid")
data_id = data.get("data")
if target.type != data['type']:
raise ReparentException()
elif target.data and (target.data.name != data_id):
target.data = get_datablock_from_uuid(data_uuid, find_data_from_name(data_id), ignore=['images'])
# vertex groups
if 'vertex_groups' in data:
target.vertex_groups.clear()
for vg in data['vertex_groups']:
vertex_group=target.vertex_groups.new(name = vg['name'])
point_attr='vertices' if 'vertices' in vg else 'points'
for vert in vg[point_attr]:
vertex_group.add(
[vert['index']], vert['weight'], 'REPLACE')
# SHAPE KEYS
if 'shape_keys' in data:
target.shape_key_clear()
object_data=target.data
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data=data['shape_keys']['key_blocks'][key_block]
target.shape_key_add(name = key_block)
loader.load(
target.data.shape_keys.key_blocks[key_block], key_data)
for vert in key_data['data']:
target.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
# Load relative key after all
for key_block in data['shape_keys']['key_blocks']:
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
# Load transformation data
rot_mode = 'rotation_quaternion' if data['rotation_mode'] == 'QUATERNION' else 'rotation_euler'
target.rotation_mode = data['rotation_mode']
target.location = data['location']
setattr(target, rot_mode, data[rot_mode])
target.scale = data['scale']
loader.load(target, data)
target.name = data["name"]
# Load modifiers
if hasattr(target, 'modifiers'):
# TODO: smarter selective update
target.modifiers.clear()
for modifier in data['modifiers']:
target_modifier = target.modifiers.get(modifier)
if not target_modifier:
target_modifier = target.modifiers.new(
data['modifiers'][modifier]['name'], data['modifiers'][modifier]['type'])
utils.dump_anything.load(
target_modifier, data['modifiers'][modifier])
# Load constraints
# Object
if hasattr(target, 'constraints') and 'constraints' in data:
load_constraints(target, data['constraints'])
loader.load(target.display, data['display'])
# Pose
if 'pose' in data:
@@ -126,7 +174,7 @@ class BlObject(BlDatablock):
if not bg_target:
bg_target = target.pose.bone_groups.new(name=bg_name)
utils.dump_anything.load(bg_target, bg_data)
loader.load(bg_target, bg_data)
# target.pose.bone_groups.get
# Bones
@@ -135,62 +183,29 @@ class BlObject(BlDatablock):
bone_data = data['pose']['bones'].get(bone)
if 'constraints' in bone_data.keys():
load_constraints(
target_bone, bone_data['constraints'])
loader.load(target_bone, bone_data['constraints'])
load_pose(target_bone, bone_data)
if 'bone_group_index' in bone_data.keys():
target_bone.bone_group = target.pose.bone_groups[bone_data['bone_group_index']]
# Load relations
if 'children' in data.keys():
for child in data['children']:
bpy.data.objects[child].parent = self.pointer
# TODO: find another way...
if target.type == 'EMPTY':
img_uuid = data.get('data_uuid')
if target.data is None and img_uuid:
target.data = get_datablock_from_uuid(img_uuid, None)#bpy.data.images.get(img_key, None)
# Load empty representation
target.empty_display_size = data['empty_display_size']
target.empty_display_type = data['empty_display_type']
def _dump_implementation(self, data, instance=None):
assert(instance)
# Instancing
target.instance_type = data['instance_type']
if data['instance_type'] == 'COLLECTION':
target.instance_collection = bpy.data.collections[data['instance_collection']]
if _is_editmode(instance):
if self.preferences.sync_flags.sync_during_editmode:
instance.update_from_editmode()
else:
raise ContextError("Object is in edit-mode.")
# vertex groups
if 'vertex_groups' in data:
target.vertex_groups.clear()
for vg in data['vertex_groups']:
vertex_group = target.vertex_groups.new(name=vg['name'])
for vert in vg['vertices']:
vertex_group.add(
[vert['index']], vert['weight'], 'REPLACE')
# SHAPE KEYS
if 'shape_keys' in data:
target.shape_key_clear()
object_data = target.data
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data = data['shape_keys']['key_blocks'][key_block]
target.shape_key_add(name=key_block)
utils.dump_anything.load(
target.data.shape_keys.key_blocks[key_block], key_data)
for vert in key_data['data']:
target.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
# Load relative key after all
for key_block in data['shape_keys']['key_blocks']:
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
def dump_implementation(self, data, pointer=None):
assert(pointer)
dumper = utils.dump_anything.Dumper()
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"name",
@@ -201,38 +216,64 @@ class BlObject(BlDatablock):
"library",
"empty_display_type",
"empty_display_size",
"empty_image_offset",
"empty_image_depth",
"empty_image_side",
"show_empty_image_orthographic",
"show_empty_image_perspective",
"show_empty_image_only_axis_aligned",
"use_empty_image_alpha",
"color",
"instance_collection",
"instance_type",
"location",
"scale",
'rotation_quaternion' if pointer.rotation_mode == 'QUATERNION' else 'rotation_euler',
'lock_location',
'lock_rotation',
'lock_scale',
'hide_render',
'display_type',
'display_bounds_type',
'show_bounds',
'show_name',
'show_axis',
'show_wire',
'show_all_edges',
'show_texture_space',
'show_in_front',
'type',
'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler',
]
data = dumper.dump(pointer)
data = dumper.dump(instance)
dumper.include_filter = [
'show_shadows',
]
data['display'] = dumper.dump(instance.display)
data['data_uuid'] = getattr(instance.data, 'uuid', None)
if self.is_library:
return data
# MODIFIERS
if hasattr(pointer, 'modifiers'):
if hasattr(instance, 'modifiers'):
dumper.include_filter = None
dumper.depth = 2
dumper.depth = 1
data["modifiers"] = {}
for index, modifier in enumerate(pointer.modifiers):
for index, modifier in enumerate(instance.modifiers):
data["modifiers"][modifier.name] = dumper.dump(modifier)
data["modifiers"][modifier.name]['m_index'] = index
# CONSTRAINTS
# OBJECT
if hasattr(pointer, 'constraints'):
if hasattr(instance, 'constraints'):
dumper.depth = 3
data["constraints"] = dumper.dump(pointer.constraints)
data["constraints"] = dumper.dump(instance.constraints)
# POSE
if hasattr(pointer, 'pose') and pointer.pose:
if hasattr(instance, 'pose') and instance.pose:
# BONES
bones = {}
for bone in pointer.pose.bones:
for bone in instance.pose.bones:
bones[bone.name] = {}
dumper.depth = 1
rotation = 'rotation_quaternion' if bone.rotation_mode == 'QUATERNION' else 'rotation_euler'
@@ -257,7 +298,7 @@ class BlObject(BlDatablock):
# GROUPS
bone_groups = {}
for group in pointer.pose.bone_groups:
for group in instance.pose.bone_groups:
dumper.depth = 3
dumper.include_filter = [
'name',
@@ -267,28 +308,30 @@ class BlObject(BlDatablock):
data['pose']['bone_groups'] = bone_groups
# CHILDREN
if len(pointer.children) > 0:
if len(instance.children) > 0:
childs = []
for child in pointer.children:
for child in instance.children:
childs.append(child.name)
data["children"] = childs
# VERTEX GROUPS
if len(pointer.vertex_groups) > 0:
if len(instance.vertex_groups) > 0:
points_attr = 'vertices' if isinstance(
instance.data, bpy.types.Mesh) else 'points'
vg_data = []
for vg in pointer.vertex_groups:
for vg in instance.vertex_groups:
vg_idx = vg.index
dumped_vg = {}
dumped_vg['name'] = vg.name
vertices = []
for v in pointer.data.vertices:
for i, v in enumerate(getattr(instance.data, points_attr)):
for vg in v.groups:
if vg.group == vg_idx:
vertices.append({
'index': v.index,
'index': i,
'weight': vg.weight
})
@@ -299,18 +342,18 @@ class BlObject(BlDatablock):
data['vertex_groups'] = vg_data
# SHAPE KEYS
pointer_data = pointer.data
if hasattr(pointer_data, 'shape_keys') and pointer_data.shape_keys:
dumper = utils.dump_anything.Dumper()
object_data = instance.data
if hasattr(object_data, 'shape_keys') and object_data.shape_keys:
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
'reference_key',
'use_relative'
]
data['shape_keys'] = dumper.dump(pointer_data.shape_keys)
data['shape_keys']['reference_key'] = pointer_data.shape_keys.reference_key.name
data['shape_keys'] = dumper.dump(object_data.shape_keys)
data['shape_keys']['reference_key'] = object_data.shape_keys.reference_key.name
key_blocks = {}
for key in pointer_data.shape_keys.key_blocks:
for key in object_data.shape_keys.key_blocks:
dumper.depth = 3
dumper.include_filter = [
'name',
@@ -328,23 +371,20 @@ class BlObject(BlDatablock):
return data
def resolve_dependencies(self):
deps = super().resolve_dependencies()
def _resolve_deps_implementation(self):
deps = []
# Avoid Empty case
if self.pointer.data:
deps.append(self.pointer.data)
if len(self.pointer.children) > 0:
deps.extend(list(self.pointer.children))
if self.instance.data:
deps.append(self.instance.data)
if len(self.instance.children) > 0:
deps.extend(list(self.instance.children))
if self.is_library:
deps.append(self.pointer.library)
deps.append(self.instance.library)
if self.pointer.instance_type == 'COLLECTION':
if self.instance.instance_type == 'COLLECTION':
# TODO: uuid based
deps.append(self.pointer.instance_collection)
deps.append(self.instance.instance_collection)
return deps
def is_valid(self):
return bpy.data.objects.get(self.data['name'])

View File

@@ -1,8 +1,30 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from .bl_collection import dump_collection_children, dump_collection_objects, load_collection_childrens, load_collection_objects
from replication.constants import (DIFF_JSON, MODIFIED)
from deepdiff import DeepDiff
import logging
class BlScene(BlDatablock):
bl_id = "scenes"
@@ -10,37 +32,26 @@ class BlScene(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'SCENE_DATA'
def construct(self, data):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.diff_method = DIFF_JSON
def _construct(self, data):
instance = bpy.data.scenes.new(data["name"])
instance.uuid = self.uuid
return instance
def load(self, data, target):
target = self.pointer
def _load_implementation(self, data, target):
# Load other meshes metadata
utils.dump_anything.load(target, data)
loader = Loader()
loader.load(target, data)
# Load master collection
for object in data["collection"]["objects"]:
if object not in target.collection.objects.keys():
target.collection.objects.link(bpy.data.objects[object])
for object in target.collection.objects.keys():
if object not in data["collection"]["objects"]:
target.collection.objects.unlink(bpy.data.objects[object])
# load collections
for collection in data["collection"]["children"]:
if collection not in target.collection.children.keys():
target.collection.children.link(
bpy.data.collections[collection])
for collection in target.collection.children.keys():
if collection not in data["collection"]["children"]:
target.collection.children.unlink(
bpy.data.collections[collection])
load_collection_objects(data['collection']['objects'], target.collection)
load_collection_childrens(data['collection']['children'], target.collection)
if 'world' in data.keys():
target.world = bpy.data.worlds[data['world']]
@@ -49,42 +60,121 @@ class BlScene(BlDatablock):
if 'grease_pencil' in data.keys():
target.grease_pencil = bpy.data.grease_pencils[data['grease_pencil']]
def dump_implementation(self, data, pointer=None):
assert(pointer)
if self.preferences.sync_flags.sync_render_settings:
if 'eevee' in data.keys():
loader.load(target.eevee, data['eevee'])
if 'cycles' in data.keys():
loader.load(target.cycles, data['cycles'])
if 'render' in data.keys():
loader.load(target.render, data['render'])
if 'view_settings' in data.keys():
loader.load(target.view_settings, data['view_settings'])
if target.view_settings.use_curve_mapping:
#TODO: change this ugly fix
target.view_settings.curve_mapping.white_level = data['view_settings']['curve_mapping']['white_level']
target.view_settings.curve_mapping.black_level = data['view_settings']['curve_mapping']['black_level']
target.view_settings.curve_mapping.update()
def _dump_implementation(self, data, instance=None):
assert(instance)
data = {}
scene_dumper = utils.dump_anything.Dumper()
scene_dumper = Dumper()
scene_dumper.depth = 1
scene_dumper.include_filter = ['name','world', 'id', 'camera', 'grease_pencil']
data = scene_dumper.dump(pointer)
scene_dumper.include_filter = [
'name',
'world',
'id',
'grease_pencil',
'frame_start',
'frame_end',
'frame_step',
]
if self.preferences.sync_flags.sync_active_camera:
scene_dumper.include_filter.append('camera')
data = scene_dumper.dump(instance)
scene_dumper.depth = 3
scene_dumper.include_filter = ['children','objects','name']
data['collection'] = scene_dumper.dump(pointer.collection)
scene_dumper.include_filter = ['children','objects','name']
data['collection'] = {}
data['collection']['children'] = dump_collection_children(instance.collection)
data['collection']['objects'] = dump_collection_objects(instance.collection)
scene_dumper.depth = 1
scene_dumper.include_filter = None
if self.preferences.sync_flags.sync_render_settings:
scene_dumper.exclude_filter = [
'gi_cache_info',
'feature_set',
'debug_use_hair_bvh',
'aa_samples',
'blur_glossy',
'glossy_bounces',
'device',
'max_bounces',
'preview_aa_samples',
'preview_samples',
'sample_clamp_indirect',
'samples',
'volume_bounces',
'file_extension',
'use_denoising'
]
data['eevee'] = scene_dumper.dump(instance.eevee)
data['cycles'] = scene_dumper.dump(instance.cycles)
data['view_settings'] = scene_dumper.dump(instance.view_settings)
data['render'] = scene_dumper.dump(instance.render)
if instance.view_settings.use_curve_mapping:
data['view_settings']['curve_mapping'] = scene_dumper.dump(instance.view_settings.curve_mapping)
scene_dumper.depth = 5
scene_dumper.include_filter = [
'curves',
'points',
'location'
]
data['view_settings']['curve_mapping']['curves'] = scene_dumper.dump(instance.view_settings.curve_mapping.curves)
return data
def resolve_dependencies(self):
def _resolve_deps_implementation(self):
deps = []
# child collections
for child in self.pointer.collection.children:
for child in self.instance.collection.children:
deps.append(child)
# child objects
for object in self.pointer.objects:
for object in self.instance.objects:
deps.append(object)
# world
if self.pointer.world:
deps.append(self.pointer.world)
if self.instance.world:
deps.append(self.instance.world)
# annotations
if self.pointer.grease_pencil:
deps.append(self.pointer.grease_pencil)
if self.instance.grease_pencil:
deps.append(self.instance.grease_pencil)
return deps
def is_valid(self):
return bpy.data.scenes.get(self.data['name'])
def diff(self):
exclude_path = []
if not self.preferences.sync_flags.sync_render_settings:
exclude_path.append("root['eevee']")
exclude_path.append("root['cycles']")
exclude_path.append("root['view_settings']")
exclude_path.append("root['render']")
if not self.preferences.sync_flags.sync_active_camera:
exclude_path.append("root['camera']")
return DeepDiff(self.data, self._dump(instance=self.instance),exclude_paths=exclude_path, cache_size=5000)
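The diff() override above hands DeepDiff a list of root paths to skip depending on the sync flags. The same filtering idea in a minimal pure-Python sketch, dict-level only and not the real DeepDiff API:

```python
def changed_keys(old, new, exclude=()):
    """Top-level dict diff: keys whose values differ, minus excluded ones.
    A toy version of what DeepDiff's exclude_paths achieves above."""
    keys = set(old) | set(new)
    return sorted(k for k in keys
                  if k not in exclude and old.get(k) != new.get(k))

old = {'name': 'Scene', 'camera': 'CamA', 'eevee': {'taa_samples': 16}}
new = {'name': 'Scene', 'camera': 'CamB', 'eevee': {'taa_samples': 64}}

# With render settings excluded, only the camera change is reported;
# excluding 'camera' as well would mirror sync_active_camera being off.
diff = changed_keys(old, new, exclude=('eevee',))
```

Excluding paths at diff time (rather than at dump time) means the full state is still captured; the flags only decide which changes trigger a push.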

View File

@@ -0,0 +1,69 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import os
from pathlib import Path
import bpy
from .bl_file import get_filepath, ensure_unpacked
from .bl_datablock import BlDatablock
from .dump_anything import Dumper, Loader
class BlSound(BlDatablock):
bl_id = "sounds"
bl_class = bpy.types.Sound
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SOUND'
def _construct(self, data):
filename = data.get('filename')
return bpy.data.sounds.load(get_filepath(filename))
def _load(self, data, target):
loader = Loader()
loader.load(target, data)
def diff(self):
return False
def _dump(self, instance=None):
filename = Path(instance.filepath).name
if not filename:
raise FileExistsError(instance.filepath)
return {
'filename': filename,
'name': instance.name
}
def _resolve_deps_implementation(self):
deps = []
if self.instance.filepath and self.instance.filepath != '<builtin>':
ensure_unpacked(self.instance)
deps.append(Path(bpy.path.abspath(self.instance.filepath)))
return deps
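BlSound._dump above deliberately strips the directory and keeps only the file name, so the receiving side can rebuild a local path with its own get_filepath(). A self-contained sketch of that logic (the ValueError stands in for the FileExistsError the source raises):

```python
from pathlib import Path

def sound_dump(filepath, name):
    """Minimal sketch of BlSound._dump: keep only the file name so the
    peer resolves it against its own cache directory."""
    filename = Path(filepath).name
    if not filename:
        # No usable file name (e.g. bare '/'); the addon raises here too.
        raise ValueError(filepath)
    return {'filename': filename, 'name': name}
```

Shipping only the basename keeps the dumped dict machine-independent; the actual bytes travel separately through the bl_file unpack path listed in _resolve_deps_implementation.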

View File

@ -1,7 +1,25 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
@ -11,24 +29,27 @@ class BlSpeaker(BlDatablock):
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SPEAKER'
def load(self, data, target):
utils.dump_anything.load(target, data)
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
def construct(self, data):
def _construct(self, data):
return bpy.data.speakers.new(data["name"])
def dump(self, pointer=None):
assert(pointer)
def _dump_implementation(self, data, instance=None):
assert(instance)
dumper = utils.dump_anything.Dumper()
dumper = Dumper()
dumper.depth = 1
dumper.include_filter = [
"muted",
'volume',
'name',
'pitch',
'sound',
'volume_min',
'volume_max',
'attenuation',
@ -39,8 +60,17 @@ class BlSpeaker(BlDatablock):
'cone_volume_outer'
]
return dumper.dump(pointer)
return dumper.dump(instance)
def _resolve_deps_implementation(self):
# TODO: resolve material
deps = []
sound = self.instance.sound
if sound:
deps.append(sound)
return deps
def is_valid(self):
return bpy.data.lattices.get(self.data['name'])

View File

@ -1,23 +1,45 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from .. import utils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from .bl_material import load_link, load_node
from .bl_material import load_links, load_node, dump_node, dump_links
class BlWorld(BlDatablock):
bl_id = "worlds"
bl_class = bpy.types.World
bl_delay_refresh = 4
bl_delay_apply = 4
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'WORLD_DATA'
def construct(self, data):
def _construct(self, data):
return bpy.data.worlds.new(data["name"])
def load(self, data, target):
def _load_implementation(self, data, target):
loader = Loader()
loader.load(target, data)
if data["use_nodes"]:
if target.node_tree is None:
target.use_nodes = True
@ -25,82 +47,46 @@ class BlWorld(BlDatablock):
target.node_tree.nodes.clear()
for node in data["node_tree"]["nodes"]:
load_node(target.node_tree, data["node_tree"]["nodes"][node])
load_node(data["node_tree"]["nodes"][node], target.node_tree)
# Load nodes links
target.node_tree.links.clear()
for link in data["node_tree"]["links"]:
load_link(target.node_tree, data["node_tree"]["links"][link])
load_links(data["node_tree"]["links"], target.node_tree)
def dump_implementation(self, data, pointer=None):
assert(pointer)
def _dump_implementation(self, data, instance=None):
assert(instance)
world_dumper = utils.dump_anything.Dumper()
world_dumper.depth = 2
world_dumper.exclude_filter = [
"preview",
"original",
"uuid",
"color",
"cycles",
"light_settings",
"users",
"view_center"
world_dumper = Dumper()
world_dumper.depth = 1
world_dumper.include_filter = [
"use_nodes",
"name",
"color"
]
data = world_dumper.dump(pointer)
if pointer.use_nodes:
data = world_dumper.dump(instance)
if instance.use_nodes:
data['node_tree'] = {}
nodes = {}
dumper = utils.dump_anything.Dumper()
dumper.depth = 2
dumper.exclude_filter = [
"dimensions",
"select",
"bl_height_min",
"bl_height_max",
"bl_width_min",
"bl_width_max",
"bl_width_default",
"hide",
"show_options",
"show_tetxures",
"show_preview",
"outputs",
"preview",
"original",
"width_hidden",
]
for node in pointer.node_tree.nodes:
nodes[node.name] = dumper.dump(node)
for node in instance.node_tree.nodes:
nodes[node.name] = dump_node(node)
if hasattr(node, 'inputs'):
nodes[node.name]['inputs'] = {}
for i in node.inputs:
input_dumper = utils.dump_anything.Dumper()
input_dumper.depth = 2
input_dumper.include_filter = ["default_value"]
if hasattr(i, 'default_value'):
nodes[node.name]['inputs'][i.name] = input_dumper.dump(
i)
data["node_tree"]['nodes'] = nodes
utils.dump_datablock_attibutes(
pointer.node_tree, ["links"], 3, data['node_tree'])
data["node_tree"]['links'] = dump_links(instance.node_tree.links)
return data
def resolve_dependencies(self):
def _resolve_deps_implementation(self):
deps = []
if self.pointer.use_nodes:
for node in self.pointer.node_tree.nodes:
if node.type == 'TEX_IMAGE':
if self.instance.use_nodes:
for node in self.instance.node_tree.nodes:
if node.type in ['TEX_IMAGE','TEX_ENVIRONMENT']:
deps.append(node.image)
if self.is_library:
deps.append(self.pointer.library)
deps.append(self.instance.library)
return deps
def is_valid(self):
return bpy.data.worlds.get(self.data['name'])

View File

@ -0,0 +1,669 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import bpy
import bpy.types as T
import mathutils
import numpy as np
BPY_TO_NUMPY_TYPES = {
'FLOAT': np.float,
'INT': np.int,
'BOOL': np.bool}
PRIMITIVE_TYPES = ['FLOAT', 'INT', 'BOOLEAN']
NP_COMPATIBLE_TYPES = ['FLOAT', 'INT', 'BOOLEAN', 'ENUM']
def np_load_collection(dikt: dict, collection: bpy.types.CollectionProperty, attributes: list = None):
""" Dump a list of attributes from the sane collection
to the target dikt.
Without attribute given, it try to load all entry from dikt.
:arg dikt: target dict
:type dikt: dict
:arg collection: source collection
:type collection: bpy.types.CollectionProperty
:arg attributes: list of attributes name
:type attributes: list
"""
if not dikt or len(collection) == 0:
logging.warning('Skipping empty collection')
return
if attributes is None:
attributes = dikt.keys()
for attr in attributes:
attr_type = collection[0].bl_rna.properties.get(attr).type
if attr_type in PRIMITIVE_TYPES:
np_load_collection_primitives(collection, attr, dikt[attr])
elif attr_type == 'ENUM':
np_load_collection_enum(collection, attr, dikt[attr])
else:
logging.error(f"{attr} of type {attr_type} not supported.")
def np_dump_collection(collection: bpy.types.CollectionProperty, attributes: list = None) -> dict:
""" Dump a list of attributes from the sane collection
to the target dikt
Without attributes given, it try to dump all properties
that matches NP_COMPATIBLE_TYPES.
:arg collection: source collection
:type collection: bpy.types.CollectionProperty
:arg attributes: list of attributes name
:type attributes: list
:retrun: dict
"""
dumped_collection = {}
if len(collection) == 0:
return dumped_collection
# TODO: find a way without getting the first item
properties = collection[0].bl_rna.properties
if attributes is None:
attributes = [p.identifier for p in properties if p.type in NP_COMPATIBLE_TYPES and not p.is_readonly]
for attr in attributes:
attr_type = properties[attr].type
if attr_type in PRIMITIVE_TYPES:
dumped_collection[attr] = np_dump_collection_primitive(
collection, attr)
elif attr_type == 'ENUM':
dumped_collection[attr] = np_dump_collection_enum(collection, attr)
else:
logging.error(f"{attr} of type {attr_type} not supported. Only {PRIMITIVE_TYPES} and ENUM supported. Skipping it.")
return dumped_collection
def np_dump_collection_primitive(collection: bpy.types.CollectionProperty, attribute: str) -> str:
""" Dump a collection attribute as a sequence
!!! warning
Only works with int, float and bool attributes
:arg collection: target collection
:type collection: bpy.types.CollectionProperty
:arg attribute: target attribute
:type attribute: str
:return: numpy byte buffer
"""
if len(collection) == 0:
logging.debug(f'Skipping empty {attribute} attribute')
return {}
attr_infos = collection[0].bl_rna.properties.get(attribute)
assert(attr_infos.type in ['FLOAT', 'INT', 'BOOLEAN'])
size = sum(attr_infos.array_dimensions) if attr_infos.is_array else 1
dumped_sequence = np.zeros(
len(collection)*size,
dtype=BPY_TO_NUMPY_TYPES.get(attr_infos.type))
collection.foreach_get(attribute, dumped_sequence)
return dumped_sequence.tobytes()
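The byte-buffer round trip behind these helpers can be shown without bpy; a minimal sketch in plain numpy (the sample values are invented, and a flat array stands in for the collection that `foreach_get` would flatten):

```python
import numpy as np

# Three items with a 2-component float attribute (e.g. UV coordinates):
# foreach_get flattens the collection into a 1D buffer of
# len(collection) * size values before tobytes().
values = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5], dtype=float)
dumped = values.tobytes()  # what np_dump_collection_primitive returns

# np_load_collection_primitives reverses this with np.frombuffer
# before feeding the buffer back through foreach_set.
restored = np.frombuffer(dumped, dtype=float)
```

Storing the buffer as raw bytes keeps the dump compact and lets loading skip per-item Python attribute access.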
def np_dump_collection_enum(collection: bpy.types.CollectionProperty, attribute: str) -> list:
""" Dump a collection enum attribute to an index list
:arg collection: target collection
:type collection: bpy.types.CollectionProperty
:arg attribute: target attribute
:type attribute: str
:return: list of int
"""
attr_infos = collection[0].bl_rna.properties.get(attribute)
assert(attr_infos.type == 'ENUM')
enum_items = attr_infos.enum_items
return [enum_items[getattr(i, attribute)].value for i in collection]
def np_load_collection_enum(collection: bpy.types.CollectionProperty, attribute: str, sequence: list):
""" Load a collection enum attribute from a list sequence
!!! warning
Only works with ENUM attributes
:arg collection: target collection
:type collection: bpy.types.CollectionProperty
:arg attribute: target attribute
:type attribute: str
:arg sequence: enum data buffer
:type sequence: list
"""
attr_infos = collection[0].bl_rna.properties.get(attribute)
assert(attr_infos.type == 'ENUM')
enum_items = attr_infos.enum_items
enum_idx = [i.value for i in enum_items]
for index, item in enumerate(sequence):
setattr(collection[index], attribute,
enum_items[enum_idx.index(item)].identifier)
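The enum pair follows the same idea; a bpy-free sketch of the mapping, where a plain dict stands in for `bl_rna` `enum_items` (the SMOOTH/VECTOR/AUTO identifiers are made up):

```python
# Dumping stores each item's int value; loading maps the ints back to
# string identifiers before setattr, as np_load_collection_enum does.
enum_values = {'SMOOTH': 0, 'VECTOR': 1, 'AUTO': 2}  # identifier -> value

def dump_enum(states):
    return [enum_values[s] for s in states]

def load_enum(values):
    by_value = {v: k for k, v in enum_values.items()}
    return [by_value[v] for v in values]

round_trip = load_enum(dump_enum(['AUTO', 'SMOOTH', 'AUTO']))
```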
def np_load_collection_primitives(collection: bpy.types.CollectionProperty, attribute: str, sequence: str):
""" Load a collection attribute from a str bytes sequence
!!! warning
Only work with int, float and bool attributes
:arg collection: target collection
:type collection: bpy.types.CollectionProperty
:arg attribute: target attribute
:type attribute: str
:arg sequence: data buffer
:type sequence: bytes
"""
if len(collection) == 0 or not sequence:
logging.debug(f"Skipping loading {attribute}")
return
attr_infos = collection[0].bl_rna.properties.get(attribute)
assert(attr_infos.type in ['FLOAT', 'INT', 'BOOLEAN'])
collection.foreach_set(
attribute,
np.frombuffer(sequence, dtype=BPY_TO_NUMPY_TYPES.get(attr_infos.type)))
def remove_items_from_dict(d, keys, recursive=False):
copy = dict(d)
for k in keys:
copy.pop(k, None)
if recursive:
for k in [k for k in copy.keys() if isinstance(copy[k], dict)]:
copy[k] = remove_items_from_dict(copy[k], keys, recursive)
return copy
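This helper is pure Python and can be exercised outside Blender; a small standalone check (the sample dict is invented):

```python
def remove_items_from_dict(d, keys, recursive=False):
    # Return a copy of d with the given keys removed, optionally
    # recursing into nested dictionaries.
    copy = dict(d)
    for k in keys:
        copy.pop(k, None)
    if recursive:
        for k in [k for k in copy.keys() if isinstance(copy[k], dict)]:
            copy[k] = remove_items_from_dict(copy[k], keys, recursive)
    return copy

data = {'name': 'World', 'uuid': '1234',
        'node_tree': {'uuid': '5678', 'nodes': {}}}
cleaned = remove_items_from_dict(data, ['uuid'], recursive=True)
# 'uuid' is stripped at every nesting level; the input dict is untouched
```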
def _is_dictionnary(v):
return hasattr(v, "items") and callable(v.items)
def _dump_filter_type(t):
return lambda x: isinstance(x, t)
def _dump_filter_type_by_name(t_name):
return lambda x: t_name == x.__class__.__name__
def _dump_filter_array(array):
# only primitive type array
if not isinstance(array, T.bpy_prop_array):
return False
if len(array) > 0 and type(array[0]) not in [bool, float, int]:
return False
return True
def _dump_filter_default(default):
if default is None:
return False
if type(default) is list:
return False
return True
def _load_filter_type(t, use_bl_rna=True):
def filter_function(x):
if use_bl_rna and x.bl_rna_property:
return isinstance(x.bl_rna_property, t)
else:
return isinstance(x.read(), t)
return filter_function
def _load_filter_array(array):
# only primitive type array
if not isinstance(array.read(), T.bpy_prop_array):
return False
if len(array.read()) > 0 and type(array.read()[0]) not in [bool, float, int]:
return False
return True
def _load_filter_color(color):
return color.__class__.__name__ == 'Color'
def _load_filter_default(default):
if default.read() is None:
return False
if type(default.read()) is list:
return False
return True
class Dumper:
# TODO: support occlude readonly
# TODO: use foreach_set/get on collection compatible properties
def __init__(self):
self.verbose = True
self.depth = 1
self.keep_compounds_as_leaves = False
self.accept_read_only = True
self._build_inline_dump_functions()
self._build_match_elements()
self.type_subset = self.match_subset_all
self.include_filter = []
self.exclude_filter = []
def dump(self, any):
return self._dump_any(any, 0)
def _dump_any(self, any, depth):
for filter_function, dump_function in self.type_subset:
if filter_function(any):
return dump_function[not (depth >= self.depth)](any, depth + 1)
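The `dump_function[not (depth >= self.depth)]` indexing picks between the leaf and branch variants of each dump function: index 0 (the leaf) once the depth budget is spent, index 1 (the branch) while it is not. A bpy-free sketch of that dispatch, using nested dicts in place of Blender data (names and sample values are invented):

```python
def dump_any(value, depth, limit):
    # Each matcher pairs a predicate with a (leaf, branch) tuple; the bool
    # index selects branch recursion below the limit, leaf truncation above.
    if isinstance(value, dict):
        leaf = lambda v, d: None  # compound value truncated to a leaf
        branch = lambda v, d: {k: dump_any(x, d + 1, limit)
                               for k, x in v.items()}
        return (leaf, branch)[depth < limit](value, depth)
    return value  # primitives pass through unchanged

nested = {'a': {'b': {'c': 1}}}
truncated = dump_any(nested, 0, 2)
```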
def _build_inline_dump_functions(self):
self._dump_identity = (lambda x, depth: x, lambda x, depth: x)
self._dump_ref = (lambda x, depth: x.name, self._dump_object_as_branch)
self._dump_ID = (lambda x, depth: x.name, self._dump_default_as_branch)
self._dump_collection = (
self._dump_default_as_leaf, self._dump_collection_as_branch)
self._dump_array = (self._dump_array_as_branch,
self._dump_array_as_branch)
self._dump_matrix = (self._dump_matrix_as_leaf,
self._dump_matrix_as_leaf)
self._dump_vector = (self._dump_vector_as_leaf,
self._dump_vector_as_leaf)
self._dump_quaternion = (
self._dump_quaternion_as_leaf, self._dump_quaternion_as_leaf)
self._dump_default = (self._dump_default_as_leaf,
self._dump_default_as_branch)
self._dump_color = (self._dump_color_as_leaf, self._dump_color_as_leaf)
def _build_match_elements(self):
self._match_type_bool = (_dump_filter_type(bool), self._dump_identity)
self._match_type_int = (_dump_filter_type(int), self._dump_identity)
self._match_type_float = (
_dump_filter_type(float), self._dump_identity)
self._match_type_string = (_dump_filter_type(str), self._dump_identity)
self._match_type_ref = (_dump_filter_type(T.Object), self._dump_ref)
self._match_type_ID = (_dump_filter_type(T.ID), self._dump_ID)
self._match_type_bpy_prop_collection = (
_dump_filter_type(T.bpy_prop_collection), self._dump_collection)
self._match_type_array = (_dump_filter_array, self._dump_array)
self._match_type_matrix = (_dump_filter_type(
mathutils.Matrix), self._dump_matrix)
self._match_type_vector = (_dump_filter_type(
mathutils.Vector), self._dump_vector)
self._match_type_quaternion = (_dump_filter_type(
mathutils.Quaternion), self._dump_quaternion)
self._match_type_euler = (_dump_filter_type(
mathutils.Euler), self._dump_quaternion)
self._match_type_color = (
_dump_filter_type_by_name("Color"), self._dump_color)
self._match_default = (_dump_filter_default, self._dump_default)
def _dump_collection_as_branch(self, collection, depth):
dump = {}
for i in collection.items():
dv = self._dump_any(i[1], depth)
if not (dv is None):
dump[i[0]] = dv
return dump
def _dump_default_as_leaf(self, default, depth):
if self.keep_compounds_as_leaves:
return str(type(default))
else:
return None
def _dump_array_as_branch(self, array, depth):
return [i for i in array]
def _dump_matrix_as_leaf(self, matrix, depth):
return [list(v) for v in matrix]
def _dump_vector_as_leaf(self, vector, depth):
return list(vector)
def _dump_quaternion_as_leaf(self, quaternion, depth):
return list(quaternion)
def _dump_color_as_leaf(self, color, depth):
return list(color)
def _dump_object_as_branch(self, default, depth):
if depth == 1:
return self._dump_default_as_branch(default, depth)
else:
return default.name
def _dump_default_as_branch(self, default, depth):
def is_valid_property(p):
try:
if (self.include_filter and p not in self.include_filter):
return False
getattr(default, p)
except AttributeError as err:
logging.debug(err)
return False
if p.startswith("__"):
return False
if callable(getattr(default, p)):
return False
if p in ["bl_rna", "rna_type"]:
return False
return True
all_property_names = [p for p in dir(default) if is_valid_property(
p) and p != '' and p not in self.exclude_filter]
dump = {}
for p in all_property_names:
if (self.exclude_filter and p in self.exclude_filter) or\
(self.include_filter and p not in self.include_filter):
return False
dp = self._dump_any(getattr(default, p), depth)
if not (dp is None):
dump[p] = dp
return dump
@property
def match_subset_all(self):
return [
self._match_type_bool,
self._match_type_int,
self._match_type_float,
self._match_type_string,
self._match_type_ref,
self._match_type_ID,
self._match_type_bpy_prop_collection,
self._match_type_array,
self._match_type_matrix,
self._match_type_vector,
self._match_type_quaternion,
self._match_type_euler,
self._match_type_color,
self._match_default
]
@property
def match_subset_primitives(self):
return [
self._match_type_bool,
self._match_type_int,
self._match_type_float,
self._match_type_string,
self._match_default
]
class BlenderAPIElement:
def __init__(self, api_element, sub_element_name="", occlude_read_only=True):
self.api_element = api_element
self.sub_element_name = sub_element_name
self.occlude_read_only = occlude_read_only
def read(self):
return getattr(self.api_element, self.sub_element_name) if self.sub_element_name else self.api_element
def write(self, value):
# take precaution if property is read-only
if self.sub_element_name and \
not self.api_element.is_property_readonly(self.sub_element_name):
setattr(self.api_element, self.sub_element_name, value)
else:
self.api_element = value
def extend(self, element_name):
return BlenderAPIElement(self.read(), element_name)
@property
def bl_rna_property(self):
if not hasattr(self.api_element, "bl_rna"):
return False
if not self.sub_element_name:
return False
return self.api_element.bl_rna.properties[self.sub_element_name]
class Loader:
def __init__(self):
self.type_subset = self.match_subset_all
self.occlude_read_only = False
self.order = ['*']
def load(self, dst_data, src_dumped_data):
self._load_any(
BlenderAPIElement(
dst_data, occlude_read_only=self.occlude_read_only),
src_dumped_data
)
def _load_any(self, any, dump):
for filter_function, load_function in self.type_subset:
if filter_function(any):
load_function(any, dump)
return
def _load_identity(self, element, dump):
element.write(dump)
def _load_array(self, element, dump):
# supports only primitive types currently
try:
for i in range(len(dump)):
element.read()[i] = dump[i]
except AttributeError as err:
logging.debug(err)
if not self.occlude_read_only:
raise err
def _load_collection(self, element, dump):
if not element.bl_rna_property:
return
# local enum
CONSTRUCTOR_NEW = "new"
CONSTRUCTOR_ADD = "add"
DESTRUCTOR_REMOVE = "remove"
DESTRUCTOR_CLEAR = "clear"
_constructors = {
T.ColorRampElement: (CONSTRUCTOR_NEW, ["position"]),
T.ParticleSettingsTextureSlot: (CONSTRUCTOR_ADD, []),
T.Modifier: (CONSTRUCTOR_NEW, ["name", "type"]),
T.Constraint: (CONSTRUCTOR_NEW, ["type"]),
}
destructors = {
T.ColorRampElement: DESTRUCTOR_REMOVE,
T.Modifier: DESTRUCTOR_CLEAR,
T.Constraint: CONSTRUCTOR_NEW,
}
element_type = element.bl_rna_property.fixed_type
_constructor = _constructors.get(type(element_type))
if _constructor is None: # collection type not supported
return
destructor = destructors.get(type(element_type))
# Try to clear existing
if destructor:
if destructor == DESTRUCTOR_REMOVE:
collection = element.read()
for i in range(len(collection)-1):
collection.remove(collection[0])
else:
getattr(element.read(), DESTRUCTOR_CLEAR)()
for dump_idx, dumped_element in enumerate(dump.values()):
if dump_idx == 0 and len(element.read()) > 0:
new_element = element.read()[0]
else:
try:
_constructor_parameters = [dumped_element[name]
for name in _constructor[1]]
except KeyError:
logging.debug("Collection load error, missing parameters.")
continue # TODO handle error
new_element = getattr(element.read(), _constructor[0])(
*_constructor_parameters)
self._load_any(
BlenderAPIElement(
new_element, occlude_read_only=self.occlude_read_only),
dumped_element
)
def _load_curve_mapping(self, element, dump):
mapping = element.read()
curves = mapping.curves
for curve_index, curve in dump['curves'].items():
dst_curve = curves[curve_index]
# cleanup existing curve
for idx in range(len(dst_curve.points), 0, -1):
try:
dst_curve.points.remove(dst_curve.points[0])
except Exception:
break
default_point_count = len(dst_curve.points)
for point_idx, point in curve['points'].items():
pos = point['location']
if point_idx < default_point_count:
dst_curve.points[int(point_idx)].location = pos
else:
dst_curve.points.new(pos[0], pos[1])
def _load_pointer(self, instance, dump):
rna_property_type = instance.bl_rna_property.fixed_type
if not rna_property_type:
return
if isinstance(rna_property_type, T.Image):
instance.write(bpy.data.images.get(dump))
elif isinstance(rna_property_type, T.Texture):
instance.write(bpy.data.textures.get(dump))
elif isinstance(rna_property_type, T.ColorRamp):
self._load_default(instance, dump)
elif isinstance(rna_property_type, T.Object):
instance.write(bpy.data.objects.get(dump))
elif isinstance(rna_property_type, T.Mesh):
instance.write(bpy.data.meshes.get(dump))
elif isinstance(rna_property_type, T.Material):
instance.write(bpy.data.materials.get(dump))
elif isinstance(rna_property_type, T.Collection):
instance.write(bpy.data.collections.get(dump))
elif isinstance(rna_property_type, T.VectorFont):
instance.write(bpy.data.fonts.get(dump))
elif isinstance(rna_property_type, T.Sound):
instance.write(bpy.data.sounds.get(dump))
def _load_matrix(self, matrix, dump):
matrix.write(mathutils.Matrix(dump))
def _load_vector(self, vector, dump):
vector.write(mathutils.Vector(dump))
def _load_quaternion(self, quaternion, dump):
quaternion.write(mathutils.Quaternion(dump))
def _load_euler(self, euler, dump):
euler.write(mathutils.Euler(dump))
def _ordered_keys(self, keys):
ordered_keys = []
for order_element in self.order:
if order_element == '*':
ordered_keys += [k for k in keys if not k in self.order]
else:
if order_element in keys:
ordered_keys.append(order_element)
return ordered_keys
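The `'*'` entry in `self.order` acts as a wildcard slot: explicitly ordered keys load at their listed position and every remaining key loads where the wildcard sits (useful when one property, such as `use_nodes`, must be applied before the rest). A standalone sketch of the same logic (sample keys are invented):

```python
def ordered_keys(keys, order):
    # Keys named in `order` keep their listed position; all others are
    # emitted, in their original order, at the '*' slot.
    result = []
    for element in order:
        if element == '*':
            result += [k for k in keys if k not in order]
        elif element in keys:
            result.append(element)
    return result

result = ordered_keys(['volume', 'use_nodes', 'name'], ['use_nodes', '*'])
```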
def _load_default(self, default, dump):
if not _is_dictionnary(dump):
return # TODO error handling
for k in self._ordered_keys(dump.keys()):
v = dump[k]
if not hasattr(default.read(), k):
logging.debug(f"Load default, skipping {default} : {k}")
try:
self._load_any(default.extend(k), v)
except Exception as err:
logging.debug(f"Cannot load {k}: {err}")
@property
def match_subset_all(self):
return [
(_load_filter_type(T.BoolProperty), self._load_identity),
(_load_filter_type(T.IntProperty), self._load_identity),
# before float because the bl_rna type of a matrix is FloatProperty
(_load_filter_type(mathutils.Matrix, use_bl_rna=False), self._load_matrix),
# before float because the bl_rna type of a vector is FloatProperty
(_load_filter_type(mathutils.Vector, use_bl_rna=False), self._load_vector),
(_load_filter_type(mathutils.Quaternion,
use_bl_rna=False), self._load_quaternion),
(_load_filter_type(mathutils.Euler, use_bl_rna=False), self._load_euler),
(_load_filter_type(T.CurveMapping, use_bl_rna=False),
self._load_curve_mapping),
(_load_filter_type(T.FloatProperty), self._load_identity),
(_load_filter_type(T.StringProperty), self._load_identity),
(_load_filter_type(T.EnumProperty), self._load_identity),
(_load_filter_type(T.PointerProperty), self._load_pointer),
(_load_filter_array, self._load_array),
(_load_filter_type(T.CollectionProperty), self._load_collection),
(_load_filter_default, self._load_default),
(_load_filter_color, self._load_identity),
]
# Utility functions
def dump(any, depth=1):
dumper = Dumper()
dumper.depth = depth
return dumper.dump(any)
def load(dst, src):
loader = Loader()
loader.load(dst, src)

View File

@ -1,17 +1,43 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import logging
import bpy
from . import operators, presence, utils
from .libs.replication.replication.constants import FETCHED, RP_COMMON, STATE_INITIAL,STATE_QUITTING, STATE_ACTIVE, STATE_SYNCING, STATE_SRV_SYNC
logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)
from . import presence, utils
from replication.constants import (FETCHED,
UP,
RP_COMMON,
STATE_INITIAL,
STATE_QUITTING,
STATE_ACTIVE,
STATE_SYNCING,
STATE_LOBBY,
STATE_SRV_SYNC,
REPARENT)
from replication.interface import session
class Delayable():
"""Delayable task interface
"""
def __init__(self):
self.is_registered = False
def register(self):
raise NotImplementedError
@ -30,13 +56,20 @@ class Timer(Delayable):
"""
def __init__(self, duration=1):
super().__init__()
self._timeout = duration
self._running = True
def register(self):
"""Register the timer into the blender timer system
"""
bpy.app.timers.register(self.main)
if not self.is_registered:
bpy.app.timers.register(self.main)
self.is_registered = True
logging.debug(f"Register {self.__class__.__name__}")
else:
logging.debug(f"Timer {self.__class__.__name__} already registered")
def main(self):
self.execute()
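The `is_registered` guard makes `register()` idempotent. Since `bpy.app.timers` is unavailable outside Blender, a list stands in for it in this minimal sketch (class names are suffixed to mark them as stand-ins):

```python
registered = []  # stand-in for bpy.app.timers

class DelayableSketch:
    def __init__(self):
        self.is_registered = False

class TimerSketch(DelayableSketch):
    def register(self):
        # A second register() call must not schedule the callback twice.
        if not self.is_registered:
            registered.append(self.main)
            self.is_registered = True

    def main(self):
        pass

t = TimerSketch()
t.register()
t.register()  # ignored: already registered
```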
@ -64,19 +97,29 @@ class ApplyTimer(Timer):
super().__init__(timout)
def execute(self):
client = operators.client
if client and client.state['STATE'] == STATE_ACTIVE:
nodes = client.list(filter=self._type)
if session and session.state['STATE'] == STATE_ACTIVE:
if self._type:
nodes = session.list(filter=self._type)
else:
nodes = session.list()
for node in nodes:
node_ref = client.get(uuid=node)
node_ref = session.get(uuid=node)
if node_ref.state == FETCHED:
try:
client.apply(node)
session.apply(node, force=True)
except Exception as e:
logger.error(
"fail to apply {}: {}".format(node_ref.uuid, e))
logging.error(f"Fail to apply {node_ref.uuid}: {e}")
elif node_ref.state == REPARENT:
# Reload the node
node_ref.remove_instance()
node_ref.resolve()
session.apply(node, force=True)
for parent in session._graph.find_parents(node):
logging.info(f"Applying parent {parent}")
session.apply(parent, force=True)
node_ref.state = UP
class DynamicRightSelectTimer(Timer):
@ -87,79 +130,73 @@ class DynamicRightSelectTimer(Timer):
self._right_strategy = RP_COMMON
def execute(self):
session = operators.client
settings = bpy.context.window_manager.session
settings = utils.get_preferences()
if session and session.state['STATE'] == STATE_ACTIVE:
# Find user
if self._user is None:
self._user = session.online_users.get(settings.username)
if self._right_strategy is None:
self._right_strategy = session.config[
'right_strategy']
if self._user:
current_selection = utils.get_selected_objects(
bpy.context.scene,
bpy.data.window_managers['WinMan'].windows[0].view_layer
)
if current_selection != self._last_selection:
if self._right_strategy == RP_COMMON:
obj_common = [
o for o in self._last_selection if o not in current_selection]
obj_ours = [
o for o in current_selection if o not in self._last_selection]
obj_common = [
o for o in self._last_selection if o not in current_selection]
obj_ours = [
o for o in current_selection if o not in self._last_selection]
# change old selection right to common
for obj in obj_common:
node = session.get(uuid=obj)
# change old selection right to common
for obj in obj_common:
node = session.get(uuid=obj)
if node and (node.owner == settings.username or node.owner == RP_COMMON):
recursive = True
if node.data and 'instance_type' in node.data.keys():
recursive = node.data['instance_type'] != 'COLLECTION'
session.change_owner(
node.uuid,
RP_COMMON,
recursive=recursive)
if node and (node.owner == settings.username or node.owner == RP_COMMON):
recursive = True
if node.data and 'instance_type' in node.data.keys():
recursive = node.data['instance_type'] != 'COLLECTION'
session.change_owner(
node.uuid,
RP_COMMON,
recursive=recursive)
# change new selection to our
for obj in obj_ours:
node = session.get(uuid=obj)
# change new selection to our
for obj in obj_ours:
node = session.get(uuid=obj)
if node and node.owner == RP_COMMON:
recursive = True
if node.data and 'instance_type' in node.data.keys():
recursive = node.data['instance_type'] != 'COLLECTION'
if node and node.owner == RP_COMMON:
recursive = True
if node.data and 'instance_type' in node.data.keys():
recursive = node.data['instance_type'] != 'COLLECTION'
session.change_owner(
node.uuid,
settings.username,
recursive=recursive)
else:
return
session.change_owner(
node.uuid,
settings.username,
recursive=recursive)
else:
return
self._last_selection = current_selection
self._last_selection = current_selection
user_metadata = {
'selected_objects': current_selection
}
user_metadata = {
'selected_objects': current_selection
}
session.update_user_metadata(user_metadata)
logger.info("Update selection")
session.update_user_metadata(user_metadata)
logging.debug("Update selection")
# Fix deselection until rights management refactoring (with Roles concepts)
if len(current_selection) == 0 and self._right_strategy == RP_COMMON:
owned_keys = session.list(
filter_owner=settings.username)
for key in owned_keys:
node = session.get(uuid=key)
# Fix deselection until rights management refactoring (with Roles concepts)
if len(current_selection) == 0 and self._right_strategy == RP_COMMON:
owned_keys = session.list(
filter_owner=settings.username)
for key in owned_keys:
node = session.get(uuid=key)
session.change_owner(
key,
RP_COMMON,
recursive=recursive)
session.change_owner(
key,
RP_COMMON,
recursive=recursive)
for user, user_info in session.online_users.items():
if user != settings.username:
@ -176,11 +213,16 @@ class DynamicRightSelectTimer(Timer):
class Draw(Delayable):
def __init__(self):
super().__init__()
self._handler = None
def register(self):
self._handler = bpy.types.SpaceView3D.draw_handler_add(
self.execute, (), 'WINDOW', 'POST_VIEW')
if not self.is_registered:
self._handler = bpy.types.SpaceView3D.draw_handler_add(
self.execute, (), 'WINDOW', 'POST_VIEW')
logging.debug(f"Register {self.__class__.__name__}")
else:
logging.debug(f"Drow {self.__class__.__name__} already registered")
def execute(self):
raise NotImplementedError()
@ -195,78 +237,122 @@ class Draw(Delayable):
class DrawClient(Draw):
def execute(self):
session = getattr(operators, 'client', None)
renderer = getattr(presence, 'renderer', None)
prefs = utils.get_preferences()
if session and renderer and session.state['STATE'] == STATE_ACTIVE:
settings = bpy.context.window_manager.session
users = session.online_users
# Update users
for user in users.values():
metadata = user.get('metadata')
if 'color' in metadata:
color = metadata.get('color')
scene_current = metadata.get('scene_current')
user_showable = scene_current == bpy.context.scene.name or settings.presence_show_far_user
if color and scene_current and user_showable:
if settings.presence_show_selected and 'selected_objects' in metadata.keys():
renderer.draw_client_selection(
user['id'], color, metadata['selected_objects'])
if settings.presence_show_user and 'view_corners' in metadata:
renderer.draw_client_camera(
user['id'], metadata['view_corners'], color)
if not user_showable:
# TODO: remove this when the user event-driven update is ready
renderer.flush_selection()
renderer.flush_users()
class ClientUpdate(Timer):
def __init__(self, timout=.1):
super().__init__(timout)
self.handle_quit = False
self.users_metadata = {}
def execute(self):
settings = bpy.context.window_manager.session
session_info = bpy.context.window_manager.session
session = getattr(operators, 'client', None)
settings = utils.get_preferences()
renderer = getattr(presence, 'renderer', None)
if session and renderer and session.state['STATE'] == STATE_ACTIVE:
# Check if session has been closed prematurely
if session.state['STATE'] == 0:
bpy.ops.session.stop()
local_user = operators.client.online_users.get(
session_info.username)
if not local_user:
return
if session and renderer:
if session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]:
local_user = session.online_users.get(
settings.username)
local_user_metadata = local_user.get('metadata')
current_view_corners = presence.get_view_corners()
if not local_user:
return
else:
for username, user_data in session.online_users.items():
if username != settings.username:
cached_user_data = self.users_metadata.get(
username)
new_user_data = session.online_users[username]['metadata']
if not local_user_metadata or 'color' not in local_user_metadata.keys():
metadata = {
'view_corners': current_view_corners,
'view_matrix': presence.get_view_matrix(),
'color': (settings.client_color.r,
settings.client_color.g,
settings.client_color.b,
1),
'frame_current':bpy.context.scene.frame_current
}
session.update_user_metadata(metadata)
elif current_view_corners != local_user_metadata['view_corners']:
logger.info('update user metadata')
local_user_metadata['view_corners'] = current_view_corners
local_user_metadata['view_matrix'] = presence.get_view_matrix()
session.update_user_metadata(local_user_metadata)
if cached_user_data is None:
self.users_metadata[username] = user_data['metadata']
elif 'view_matrix' in cached_user_data and 'view_matrix' in new_user_data and cached_user_data['view_matrix'] != new_user_data['view_matrix']:
presence.refresh_3d_view()
self.users_metadata[username] = user_data['metadata']
break
else:
self.users_metadata[username] = user_data['metadata']
local_user_metadata = local_user.get('metadata')
scene_current = bpy.context.scene.name
local_user = session.online_users.get(settings.username)
current_view_corners = presence.get_view_corners()
# Init client metadata
if not local_user_metadata or 'color' not in local_user_metadata.keys():
metadata = {
'view_corners': current_view_corners,
'view_matrix': presence.get_view_matrix(),
'color': (settings.client_color.r,
settings.client_color.g,
settings.client_color.b,
1),
'frame_current': bpy.context.scene.frame_current,
'scene_current': scene_current
}
session.update_user_metadata(metadata)
# Update client representation
# Update client current scene
elif scene_current != local_user_metadata['scene_current']:
local_user_metadata['scene_current'] = scene_current
session.update_user_metadata(local_user_metadata)
elif 'view_corners' in local_user_metadata and current_view_corners != local_user_metadata['view_corners']:
local_user_metadata['view_corners'] = current_view_corners
local_user_metadata['view_matrix'] = presence.get_view_matrix(
)
session.update_user_metadata(local_user_metadata)
class SessionStatusUpdate(Timer):
def __init__(self, timout=1):
super().__init__(timout)
def execute(self):
presence.refresh_sidebar_view()
class SessionUserSync(Timer):
def __init__(self, timout=1):
super().__init__(timout)
def execute(self):
renderer = getattr(presence, 'renderer', None)
if session and renderer:
# sync online users
session_users = session.online_users
ui_users = bpy.context.window_manager.online_users
for index, user in enumerate(ui_users):
if user.username not in session_users.keys():
ui_users.remove(index)
renderer.flush_selection()
renderer.flush_users()
break
for user in session_users:
@@ -275,18 +361,14 @@ class ClientUpdate(Timer):
new_key.name = user
new_key.username = user
# TODO: event-driven 3d view refresh
presence.refresh_3d_view()
elif session.state['STATE'] == STATE_QUITTING:
presence.refresh_3d_view()
self.handle_quit = True
elif session.state['STATE'] == STATE_INITIAL and self.handle_quit:
self.handle_quit = False
presence.refresh_3d_view()
operators.unregister_delayables()
presence.renderer.stop()
# # ui update
elif session:
presence.refresh_3d_view()
class MainThreadExecutor(Timer):
def __init__(self, timout=1, execution_queue=None):
super().__init__(timout)
self.execution_queue = execution_queue
def execute(self):
while not self.execution_queue.empty():
function = self.execution_queue.get()
logging.debug(f"Executing {function.__name__}")
function()
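MainThreadExecutor above drains a queue of callables from a Blender timer, so that work enqueued by networking threads runs on the main thread, where bpy access is safe. A minimal sketch of that pattern outside Blender (names are illustrative, not from the addon):

```python
from queue import Queue

execution_queue = Queue()

def run_main_thread_tasks(queue):
    """Drain the queue, calling each function on the caller's thread."""
    results = []
    while not queue.empty():
        func = queue.get()
        results.append(func())
    return results

# Background code enqueues work instead of touching shared state directly:
execution_queue.put(lambda: "task-1 done")
execution_queue.put(lambda: "task-2 done")
```

In the addon, `run_main_thread_tasks` corresponds to `MainThreadExecutor.execute`, scheduled by a timer on the main thread.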


@@ -1,18 +1,35 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import collections
import logging
import os
import subprocess
import sys
from pathlib import Path
import socket
import re
logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)
CONFIG_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "config")
CONFIG = os.path.join(CONFIG_DIR, "app.yaml")
VERSION_EXPR = re.compile(r'\d+\.\d+\.\d+\w\d+')
THIRD_PARTY = os.path.join(os.path.dirname(os.path.abspath(__file__)), "libs")
CACHE_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "cache")
DEFAULT_CACHE_DIR = os.path.join(
os.path.dirname(os.path.abspath(__file__)), "cache")
PYTHON_PATH = None
SUBPROCESS_DIR = None
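VERSION_EXPR is used later to pull the installed version out of `pip show` output. A small sketch of how that pattern behaves on a typical pre-release version string (the sample output below is illustrative):

```python
import re

# Matches versions with an alphanumeric pre-release suffix, e.g. "0.0.21a15".
# Note the raw string: '\d' in a plain string is an invalid escape in newer Pythons.
VERSION_EXPR = re.compile(r'\d+\.\d+\.\d+\w\d+')

sample = "Name: replication\nVersion: 0.0.21a15\nSummary: ..."
match = VERSION_EXPR.search(sample)
```

A plain release like "1.2.3" would not match this pattern; it requires the letter-plus-digits suffix.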
@@ -20,26 +37,6 @@ SUBPROCESS_DIR = None
rtypes = []
def load_config():
import yaml
try:
with open(CONFIG, 'r') as config_file:
return yaml.safe_load(config_file)
except FileNotFoundError:
logger.info("no config")
return {}
def save_config(config):
import yaml
logger.info("saving config")
with open(CONFIG, 'w') as outfile:
yaml.dump(config, outfile, default_flow_style=False)
def module_can_be_imported(name):
try:
__import__(name)
@@ -50,30 +47,62 @@ def module_can_be_imported(name):
def install_pip():
# pip can not necessarily be imported into Blender after this
subprocess.run([str(PYTHON_PATH), "-m", "ensurepip"])
def install_package(name, version):
logging.info(f"installing {name} {version}...")
env = os.environ
if "PIP_REQUIRE_VIRTUALENV" in env:
# PIP_REQUIRE_VIRTUALENV is an env var to ensure pip cannot install packages outside a virtual env
# https://docs.python-guide.org/dev/pip-virtualenv/
# But since Blender's pip is outside of a virtual env, it can block our packages installation, so we unset the
# env var for the subprocess.
env = os.environ.copy()
del env["PIP_REQUIRE_VIRTUALENV"]
subprocess.run([str(PYTHON_PATH), "-m", "pip", "install", f"{name}=={version}"], env=env)
def check_package_version(name, required_version):
logging.info(f"Checking {name} version...")
out = subprocess.run([str(PYTHON_PATH), "-m", "pip", "show", name], capture_output=True)
version = VERSION_EXPR.search(out.stdout.decode())
if version and version.group() == required_version:
logging.info(f"{name} is up to date")
return True
else:
logging.info(f"{name} needs an update")
return False
def get_ip():
"""
Retrieve the main network interface IP.
"""
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))
ip = s.getsockname()[0]
s.close()
return ip
def check_dir(dir):
if not os.path.exists(dir):
os.makedirs(dir)
def setup(dependencies, python_path):
global PYTHON_PATH, SUBPROCESS_DIR
PYTHON_PATH = Path(python_path)
SUBPROCESS_DIR = PYTHON_PATH.parent
check_dir(CACHE_DIR)
check_dir(CONFIG_DIR)
if not module_can_be_imported("pip"):
install_pip()
for package_name, package_version in dependencies:
if not module_can_be_imported(package_name):
install_package(package_name, package_version)
module_can_be_imported(package_name)
elif not check_package_version(package_name, package_version):
install_package(package_name, package_version)


@@ -1,397 +0,0 @@
import bpy
import bpy.types as T
import mathutils
def remove_items_from_dict(d, keys, recursive=False):
copy = dict(d)
for k in keys:
copy.pop(k, None)
if recursive:
for k in [k for k in copy.keys() if isinstance(copy[k], dict)]:
copy[k] = remove_items_from_dict(copy[k], keys, recursive)
return copy
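remove_items_from_dict prunes the given keys out of a shallow copy, optionally recursing into nested dicts. A runnable example (the function is restated verbatim so the snippet is self-contained; the sample data is illustrative):

```python
def remove_items_from_dict(d, keys, recursive=False):
    # Return a copy of d without the given keys; with recursive=True,
    # nested dicts are pruned the same way. The input dict is untouched.
    copy = dict(d)
    for k in keys:
        copy.pop(k, None)
    if recursive:
        for k in [k for k in copy.keys() if isinstance(copy[k], dict)]:
            copy[k] = remove_items_from_dict(copy[k], keys, recursive)
    return copy

node = {"uuid": "1234", "name": "Cube", "data": {"uuid": "5678", "verts": 8}}
pruned = remove_items_from_dict(node, ["uuid"], recursive=True)
```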
def _is_dictionnary(v):
return hasattr(v, "items") and callable(v.items)
def _dump_filter_type(t):
return lambda x: isinstance(x, t)
def _dump_filter_type_by_name(t_name):
return lambda x: t_name == x.__class__.__name__
def _dump_filter_array(array):
# only primitive type array
if not isinstance(array, T.bpy_prop_array):
return False
if len(array) > 0 and type(array[0]) not in [bool, float, int]:
return False
return True
def _dump_filter_default(default):
if default is None:
return False
if type(default) is list:
return False
return True
def _load_filter_type(t, use_bl_rna=True):
def filter_function(x):
if use_bl_rna and x.bl_rna_property:
return isinstance(x.bl_rna_property, t)
else:
return isinstance(x.read(), t)
return filter_function
def _load_filter_array(array):
# only primitive type array
if not isinstance(array.read(), T.bpy_prop_array):
return False
if len(array.read()) > 0 and type(array.read()[0]) not in [bool, float, int]:
return False
return True
def _load_filter_color(color):
return color.__class__.__name__ == 'Color'
def _load_filter_default(default):
if default.read() is None:
return False
if type(default.read()) is list:
return False
return True
class Dumper:
def __init__(self):
self.verbose = False
self.depth = 1
self.keep_compounds_as_leaves = False
self.accept_read_only = True
self._build_inline_dump_functions()
self._build_match_elements()
self.type_subset = self.match_subset_all
self.include_filter = []
self.exclude_filter = []
# self._atomic_types = [] # TODO future option?
def dump(self, any):
return self._dump_any(any, 0)
def _dump_any(self, any, depth):
for filter_function, dump_function in self.type_subset:
if filter_function(any):
return dump_function[not (depth >= self.depth)](any, depth + 1)
def _build_inline_dump_functions(self):
self._dump_identity = (lambda x, depth: x, lambda x, depth: x)
self._dump_ref = (lambda x, depth: x.name, self._dump_object_as_branch)
self._dump_ID = (lambda x, depth: x.name, self._dump_default_as_branch)
self._dump_collection = (self._dump_default_as_leaf, self._dump_collection_as_branch)
self._dump_array = (self._dump_default_as_leaf, self._dump_array_as_branch)
self._dump_matrix = (self._dump_matrix_as_leaf, self._dump_matrix_as_leaf)
self._dump_vector = (self._dump_vector_as_leaf, self._dump_vector_as_leaf)
self._dump_quaternion = (self._dump_quaternion_as_leaf, self._dump_quaternion_as_leaf)
self._dump_default = (self._dump_default_as_leaf, self._dump_default_as_branch)
self._dump_color = (self._dump_color_as_leaf, self._dump_color_as_leaf)
def _build_match_elements(self):
self._match_type_bool = (_dump_filter_type(bool), self._dump_identity)
self._match_type_int = (_dump_filter_type(int), self._dump_identity)
self._match_type_float = (_dump_filter_type(float), self._dump_identity)
self._match_type_string = (_dump_filter_type(str), self._dump_identity)
self._match_type_ref = (_dump_filter_type(T.Object), self._dump_ref)
self._match_type_ID = (_dump_filter_type(T.ID), self._dump_ID)
self._match_type_bpy_prop_collection = (_dump_filter_type(T.bpy_prop_collection), self._dump_collection)
self._match_type_array = (_dump_filter_array, self._dump_array)
self._match_type_matrix = (_dump_filter_type(mathutils.Matrix), self._dump_matrix)
self._match_type_vector = (_dump_filter_type(mathutils.Vector), self._dump_vector)
self._match_type_quaternion = (_dump_filter_type(mathutils.Quaternion), self._dump_quaternion)
self._match_type_euler = (_dump_filter_type(mathutils.Euler), self._dump_quaternion)
self._match_type_color = (_dump_filter_type_by_name("Color"), self._dump_color)
self._match_default = (_dump_filter_default, self._dump_default)
def _dump_collection_as_branch(self, collection, depth):
dump = {}
for i in collection.items():
dv = self._dump_any(i[1], depth)
if not (dv is None):
dump[i[0]] = dv
return dump
def _dump_default_as_leaf(self, default, depth):
if self.keep_compounds_as_leaves:
return str(type(default))
else:
return None
def _dump_array_as_branch(self, array, depth):
return [i for i in array]
def _dump_matrix_as_leaf(self, matrix, depth):
return [list(v) for v in matrix]
def _dump_vector_as_leaf(self, vector, depth):
return list(vector)
def _dump_quaternion_as_leaf(self, quaternion, depth):
return list(quaternion)
def _dump_color_as_leaf(self, color, depth):
return list(color)
def _dump_object_as_branch(self, default, depth):
if depth == 1:
return self._dump_default_as_branch(default, depth)
else:
return default.name
def _dump_default_as_branch(self, default, depth):
def is_valid_property(p):
try:
if (self.include_filter and p not in self.include_filter):
return False
getattr(default, p)
except AttributeError:
return False
if p.startswith("__"):
return False
if callable(getattr(default, p)):
return False
if p in ["bl_rna", "rna_type"]:
return False
return True
all_property_names = [p for p in dir(default) if is_valid_property(p) and p != '' and p not in self.exclude_filter]
dump = {}
for p in all_property_names:
if (self.exclude_filter and p in self.exclude_filter) or\
(self.include_filter and p not in self.include_filter):
continue
dp = self._dump_any(getattr(default, p), depth)
if not (dp is None):
dump[p] = dp
return dump
@property
def match_subset_all(self):
return [
self._match_type_bool,
self._match_type_int,
self._match_type_float,
self._match_type_string,
self._match_type_ref,
self._match_type_ID,
self._match_type_bpy_prop_collection,
self._match_type_array,
self._match_type_matrix,
self._match_type_vector,
self._match_type_quaternion,
self._match_type_euler,
self._match_type_color,
self._match_default
]
@property
def match_subset_primitives(self):
return [
self._match_type_bool,
self._match_type_int,
self._match_type_float,
self._match_type_string,
self._match_default
]
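Dumper._dump_any above pairs each type filter with a (leaf, branch) function tuple and indexes it with `not (depth >= self.depth)`: values are expanded as branches while the depth budget lasts and collapse to leaves once it is spent. A minimal standalone sketch of that dispatch, simplified to two filters (everything here is illustrative, not the addon's API):

```python
def make_dumper(max_depth):
    def dump_any(value, depth=0):
        # Each entry: (filter, (leaf_fn, branch_fn)). Indexing the tuple
        # with not (depth >= max_depth) picks branch_fn while under the
        # budget and leaf_fn once it is exhausted, as in Dumper._dump_any.
        subset = [
            (lambda x: isinstance(x, (bool, int, float, str)),
             (lambda x, d: x, lambda x, d: x)),
            (lambda x: isinstance(x, dict),
             (lambda x, d: str(type(x)),
              lambda x, d: {k: dump_any(v, d) for k, v in x.items()})),
        ]
        for filter_fn, dump_fns in subset:
            if filter_fn(value):
                return dump_fns[not (depth >= max_depth)](value, depth + 1)
    return dump_any
```

With `max_depth=1`, nested dicts degrade to their type name; a larger budget keeps the full structure.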
class BlenderAPIElement:
def __init__(self, api_element, sub_element_name="", occlude_read_only=True):
self.api_element = api_element
self.sub_element_name = sub_element_name
self.occlude_read_only = occlude_read_only
def read(self):
return getattr(self.api_element, self.sub_element_name) if self.sub_element_name else self.api_element
def write(self, value):
# take precaution if property is read-only
try:
if self.sub_element_name:
setattr(self.api_element, self.sub_element_name, value)
else:
self.api_element = value
except AttributeError as err:
if not self.occlude_read_only:
raise err
def extend(self, element_name):
return BlenderAPIElement(self.read(), element_name)
@property
def bl_rna_property(self):
if not hasattr(self.api_element, "bl_rna"):
return False
if not self.sub_element_name:
return False
return self.api_element.bl_rna.properties[self.sub_element_name]
class Loader:
def __init__(self):
self.type_subset = self.match_subset_all
self.occlude_read_only = True
self.order = ['*']
def load(self, dst_data, src_dumped_data):
self._load_any(
BlenderAPIElement(dst_data, occlude_read_only=self.occlude_read_only),
src_dumped_data
)
def _load_any(self, any, dump):
for filter_function, load_function in self.type_subset:
if filter_function(any):
load_function(any, dump)
return
def _load_identity(self, element, dump):
element.write(dump)
def _load_array(self, element, dump):
# supports only primitive types currently
try:
for i in range(len(dump)):
element.read()[i] = dump[i]
except AttributeError as err:
if not self.occlude_read_only:
raise err
def _load_collection(self, element, dump):
if not element.bl_rna_property:
return
# local enum
CONSTRUCTOR_NEW = "new"
CONSTRUCTOR_ADD = "add"
constructors = {
T.ColorRampElement: (CONSTRUCTOR_NEW, ["position"]),
T.ParticleSettingsTextureSlot: (CONSTRUCTOR_ADD, [])
}
element_type = element.bl_rna_property.fixed_type
constructor = constructors.get(type(element_type))
if constructor is None: # collection type not supported
return
for dumped_element in dump.values():
try:
constructor_parameters = [dumped_element[name] for name in constructor[1]]
except KeyError:
print("Collection load error, missing parameters.")
continue # TODO handle error
new_element = getattr(element.read(), constructor[0])(*constructor_parameters)
self._load_any(
BlenderAPIElement(new_element, occlude_read_only=self.occlude_read_only),
dumped_element
)
def _load_pointer(self, pointer, dump):
rna_property_type = pointer.bl_rna_property.fixed_type
if not rna_property_type:
return
if isinstance(rna_property_type, T.Image):
pointer.write(bpy.data.images.get(dump))
elif isinstance(rna_property_type, T.Texture):
pointer.write(bpy.data.textures.get(dump))
elif isinstance(rna_property_type, T.ColorRamp):
self._load_default(pointer, dump)
elif isinstance(rna_property_type, T.Object):
pointer.write(bpy.data.objects.get(dump))
elif isinstance(rna_property_type, T.Mesh):
pointer.write(bpy.data.meshes.get(dump))
elif isinstance(rna_property_type, T.Material):
pointer.write(bpy.data.materials.get(dump))
def _load_matrix(self, matrix, dump):
matrix.write(mathutils.Matrix(dump))
def _load_vector(self, vector, dump):
vector.write(mathutils.Vector(dump))
def _load_quaternion(self, quaternion, dump):
quaternion.write(mathutils.Quaternion(dump))
def _load_euler(self, euler, dump):
euler.write(mathutils.Euler(dump))
def _ordered_keys(self, keys):
ordered_keys = []
for order_element in self.order:
if order_element == '*':
ordered_keys += [k for k in keys if not k in self.order]
else:
if order_element in keys:
ordered_keys.append(order_element)
return ordered_keys
def _load_default(self, default, dump):
if not _is_dictionnary(dump):
return # TODO error handling
for k in self._ordered_keys(dump.keys()):
v = dump[k]
if not hasattr(default.read(), k):
continue # TODO error handling
try:
self._load_any(default.extend(k), v)
except:
pass
@property
def match_subset_all(self):
return [
(_load_filter_type(T.BoolProperty), self._load_identity),
(_load_filter_type(T.IntProperty), self._load_identity),
(_load_filter_type(mathutils.Matrix, use_bl_rna=False), self._load_matrix), # before float because bl_rna type of matrix is FloatProperty
(_load_filter_type(mathutils.Vector, use_bl_rna=False), self._load_vector), # before float because bl_rna type of vector is FloatProperty
(_load_filter_type(mathutils.Quaternion, use_bl_rna=False), self._load_quaternion),
(_load_filter_type(mathutils.Euler, use_bl_rna=False), self._load_euler),
(_load_filter_type(T.FloatProperty), self._load_identity),
(_load_filter_type(T.StringProperty), self._load_identity),
(_load_filter_type(T.EnumProperty), self._load_identity),
(_load_filter_type(T.PointerProperty), self._load_pointer),
(_load_filter_array, self._load_array),
(_load_filter_type(T.CollectionProperty), self._load_collection),
(_load_filter_default, self._load_default),
(_load_filter_color, self._load_identity),
]
# Utility functions
def dump(any, depth=1):
dumper = Dumper()
dumper.depth = depth
return dumper.dump(any)
def dump_datablock(datablock, depth):
if datablock:
dumper = Dumper()
dumper.type_subset = dumper.match_subset_all
dumper.depth = depth
datablock_type = datablock.bl_rna.name
key = "{}/{}".format(datablock_type, datablock.name)
data = dumper.dump(datablock)
return data
def load(dst, src):
loader = Loader()
# loader.match_subset_all = loader.match_subset_all
loader.load(dst, src)
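Loader._ordered_keys lets `self.order` pin certain keys to load first, with '*' standing for every key not explicitly listed; that matters when one property (say a pointer target) must exist before another is written. A standalone sketch of that rule, assuming the same semantics:

```python
def ordered_keys(keys, order):
    # Explicitly listed keys come out in the order given; '*' expands
    # in place to all remaining keys, keeping their original order.
    result = []
    for element in order:
        if element == '*':
            result += [k for k in keys if k not in order]
        elif element in keys:
            result.append(element)
    return result
```

For example, `order=['type', '*']` guarantees 'type' is loaded before everything else.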

File diff suppressed because it is too large


@@ -1,219 +0,0 @@
"""
Context Manager allowing temporary override of attributes
````python
import bpy
from overrider import Overrider
with Overrider(name='bpy_', parent=bpy) as bpy_:
# set preview render settings
bpy_.context.scene.render.use_file_extension = False
bpy_.context.scene.render.resolution_x = 512
bpy_.context.scene.render.resolution_y = 512
bpy_.context.scene.render.use_file_extension = False
bpy_.context.scene.render.image_settings.file_format = "JPEG"
bpy_.context.scene.layers[10] = False
frame_start = action.frame_range[0]
frame_end = action.frame_range[1]
if begin_frame is not None:
frame_start = begin_frame
if end_frame is not None:
frame_end = end_frame
# render
window = bpy_.data.window_managers[0].windows[0]
screen = bpy_.data.window_managers[0].windows[0].screen
area = next(area for area in screen.areas if area.type == 'VIEW_3D')
space = next(space for space in area.spaces if space.type == 'VIEW_3D')
space.viewport_shade = 'MATERIAL'
space.region_3d.view_perspective = 'CAMERA'
override_context = {
"window": window._real_value_(),
"screen": screen._real_value_()
}
if frame_start == frame_end:
bpy.context.scene.frame_set(int(frame_start))
bpy_.context.scene.render.filepath = os.path.join(directory, "icon.jpg")
bpy.ops.render.opengl(override_context, write_still=True)
else:
for icon_index, frame_number in enumerate(range(int(frame_start), int(frame_end) + 1)):
bpy.context.scene.frame_set(frame_number)
bpy.context.scene.render.filepath = os.path.join(directory, "icon", "{:04d}.jpg".format(icon_index))
bpy.ops.render.opengl(override_context, write_still=True)
````
"""
from collections import OrderedDict
class OverrideIter:
def __init__(self, parent):
self.parent = parent
self.index = -1
def __next__(self):
self.index += 1
try:
return self.parent[self.index]
except IndexError as e:
raise StopIteration
class OverrideBase:
def __init__(self, context_manager, name=None, parent=None):
self._name__ = name
self._context_manager_ = context_manager
self._parent_ = parent
self._changed_attributes_ = OrderedDict()
self._changed_items_ = OrderedDict()
self._children_ = list()
self._original_value_ = self._real_value_()
def __repr__(self):
return "<{}({})>".format(self.__class__.__name__, self._path_)
@property
def _name_(self):
raise NotImplementedError()
@property
def _path_(self):
if isinstance(self._parent_, OverrideBase):
return self._parent_._path_ + self._name_
return self._name_
def _real_value_(self):
raise NotImplementedError()
def _restore_(self):
for attribute, original_value in reversed(self._changed_attributes_.items()):
setattr(self._real_value_(), attribute, original_value)
for item, original_value in reversed(self._changed_items_.items()):
self._real_value_()[item] = original_value
def __getattr__(self, attr):
new_attribute = OverrideAttribute(self._context_manager_, name=attr, parent=self)
self._children_.append(new_attribute)
return new_attribute
def __getitem__(self, item):
new_item = OverrideItem(self._context_manager_, name=item, parent=self)
self._children_.append(new_item)
return new_item
def __iter__(self):
return OverrideIter(self)
def __setattr__(self, attr, value):
if attr in (
'_name__',
'_context_manager_',
'_parent_',
'_children_',
'_original_value_',
'_changed_attributes_',
'_changed_items_'
):
self.__dict__[attr] = value
return
if attr not in self._changed_attributes_.keys():
self._changed_attributes_[attr] = getattr(self._real_value_(), attr)
self._context_manager_.register_as_changed(self)
setattr(self._real_value_(), attr, value)
def __setitem__(self, item, value):
if item not in self._changed_items_.keys():
self._changed_items_[item] = self._real_value_()[item]
self._context_manager_.register_as_changed(self)
self._real_value_()[item] = value
def __eq__(self, other):
return self._real_value_() == other
def __gt__(self, other):
return self._real_value_() > other
def __lt__(self, other):
return self._real_value_() < other
def __ge__(self, other):
return self._real_value_() >= other
def __le__(self, other):
return self._real_value_() <= other
def __call__(self, *args, **kwargs):
# TODO : surround str value with quotes
arguments = list([str(arg) for arg in args]) + ['{}={}'.format(key, value) for key, value in kwargs.items()]
arguments = ', '.join(arguments)
raise RuntimeError('Overrider does not allow call to {}({})'.format(self._path_, arguments))
class OverrideRoot(OverrideBase):
@property
def _name_(self):
return self._name__
def _real_value_(self):
return self._parent_
class OverrideAttribute(OverrideBase):
@property
def _name_(self):
return '.{}'.format(self._name__)
def _real_value_(self):
return getattr(self._parent_._real_value_(), self._name__)
class OverrideItem(OverrideBase):
@property
def _name_(self):
if isinstance(self._name__, str):
return '["{}"]'.format(self._name__)
return '[{}]'.format(self._name__)
def _real_value_(self):
return self._parent_._real_value_()[self._name__]
class Overrider:
def __init__(self, name, parent):
self.name = name
self.parent = parent
self.override = None
self.registered_overrides = list()
def __enter__(self):
self.override = OverrideRoot(
context_manager=self,
parent=self.parent,
name=self.name
)
return self.override
def __exit__(self, exc_type, exc_val, exc_tb):
self.restore()
def register_as_changed(self, override):
self.registered_overrides.append(override)
def restore(self):
for override in reversed(self.registered_overrides):
override._restore_()
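The Overrider above records every attribute it changes and restores the originals in reverse order on exit. A much-simplified sketch of that same restore contract, without the attribute-proxy machinery (names are illustrative):

```python
from contextlib import contextmanager

@contextmanager
def override_attrs(obj, **changes):
    # Record original values, apply the overrides, then restore in
    # reverse order on exit, mirroring Overrider's _restore_ logic.
    originals = [(name, getattr(obj, name)) for name in changes]
    try:
        for name, value in changes.items():
            setattr(obj, name, value)
        yield obj
    finally:
        for name, value in reversed(originals):
            setattr(obj, name, value)

class Render:
    resolution_x = 1920

with override_attrs(Render, resolution_x=512) as r:
    inside = r.resolution_x
```

Reverse-order restoration matters when one override depends on another having been applied first.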


@@ -1,3 +1,21 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import asyncio
import logging
import os
@@ -7,44 +25,104 @@ import string
import time
from operator import itemgetter
from pathlib import Path
from subprocess import PIPE, Popen, TimeoutExpired
import shutil
from pathlib import Path
from queue import Queue
import bpy
import mathutils
from bpy.app.handlers import persistent
from . import bl_types, delayable, environment, presence, ui, utils
from .libs.replication.replication.constants import (FETCHED, STATE_ACTIVE,
STATE_INITIAL,
STATE_SYNCING)
from .libs.replication.replication.data import ReplicatedDataFactory
from .libs.replication.replication.exception import NonAuthorizedOperationError
from .libs.replication.replication.interface import Session
from replication.constants import (FETCHED, STATE_ACTIVE,
STATE_INITIAL,
STATE_SYNCING, RP_COMMON, UP)
from replication.data import ReplicatedDataFactory
from replication.exception import NonAuthorizedOperationError
from replication.interface import session
logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)
client = None
background_execution_queue = Queue()
delayables = []
ui_context = None
stop_modal_executor = False
modal_executor_queue = None
server_process = None
def unregister_delayables():
global delayables, stop_modal_executor
def session_callback(name):
""" Session callback wrapper
This allow to encapsulate session callbacks to background_execution_queue.
By doing this way callback are executed from the main thread.
"""
def func_wrapper(func):
@session.register(name)
def add_background_task():
background_execution_queue.put(func)
return add_background_task
return func_wrapper
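session_callback wraps a function so that the callback registered with the session only enqueues it; MainThreadExecutor later drains the queue on the main thread. A standalone sketch of that double-decorator pattern, with a stand-in for `session.register` (the `register` helper and `_callbacks` dict here are hypothetical):

```python
from queue import Queue

background_execution_queue = Queue()
_callbacks = {}

def register(name):
    # Stand-in for session.register: store the callback under a name.
    def decorator(func):
        _callbacks[name] = func
        return func
    return decorator

def session_callback(name):
    # Wrap func so the networking thread only enqueues it; the main
    # thread drains the queue later and runs it safely.
    def func_wrapper(func):
        @register(name)
        def add_background_task():
            background_execution_queue.put(func)
        return add_background_task
    return func_wrapper

@session_callback('on_connection')
def initialize_session():
    return "initialized"

_callbacks['on_connection']()            # fired from the networking side
task = background_execution_queue.get()  # drained on the main thread
```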
@session_callback('on_connection')
def initialize_session():
"""Session connection init hander
"""
settings = utils.get_preferences()
runtime_settings = bpy.context.window_manager.session
# Step 1: Construct nodes
for node in session._graph.list_ordered():
node_ref = session.get(node)
if node_ref.state == FETCHED:
node_ref.resolve()
# Step 2: Load nodes
for node in session._graph.list_ordered():
node_ref = session.get(node)
if node_ref.state == FETCHED:
node_ref.apply()
# Step 3: Launch presence overlay
if runtime_settings.enable_presence:
presence.renderer.run()
# Step 4: Register blender timers
for d in delayables:
try:
d.unregister()
except:
continue
d.register()
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
@session_callback('on_exit')
def on_connection_end():
"""Session connection finished handler
"""
global delayables, stop_modal_executor
settings = utils.get_preferences()
# Step 1: Unregister blender timers
for d in delayables:
try:
d.unregister()
except:
continue
stop_modal_executor = True
# Step 2: Unregister presence renderer
presence.renderer.stop()
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.remove(
depsgraph_evaluation)
# Step 3: remove file handled
logger = logging.getLogger()
for handler in logger.handlers:
if isinstance(handler, logging.FileHandler):
logger.removeHandler(handler)
# OPERATORS
class SessionStartOperator(bpy.types.Operator):
bl_idname = "session.start"
bl_label = "start"
@@ -57,107 +135,187 @@ class SessionStartOperator(bpy.types.Operator):
return True
def execute(self, context):
global client, delayables, ui_context, server_process
settings = context.window_manager.session
users = bpy.data.window_managers['WinMan'].online_users
global delayables
# TODO: Sync server clients
settings = utils.get_preferences()
runtime_settings = context.window_manager.session
users = bpy.data.window_managers['WinMan'].online_users
admin_pass = runtime_settings.password
use_extern_update = settings.update_method == 'DEPSGRAPH'
users.clear()
delayables.clear()
# save config
settings.save(context)
logger = logging.getLogger()
if len(logger.handlers) == 1:
formatter = logging.Formatter(
fmt='%(asctime)s CLIENT %(levelname)-8s %(message)s',
datefmt='%H:%M:%S'
)
log_directory = os.path.join(
settings.cache_directory,
"multiuser_client.log")
os.makedirs(settings.cache_directory, exist_ok=True)
handler = logging.FileHandler(log_directory, mode='w')
logger.addHandler(handler)
for handler in logger.handlers:
if isinstance(handler, logging.NullHandler):
continue
handler.setFormatter(formatter)
bpy_factory = ReplicatedDataFactory()
supported_bl_types = []
ui_context = context.copy()
# init the factory with supported types
for type in bl_types.types_to_register():
type_module = getattr(bl_types, type)
type_impl_name = "Bl{}".format(type.split('_')[1].capitalize())
type_impl_name = f"Bl{type.split('_')[1].capitalize()}"
type_module_class = getattr(type_module, type_impl_name)
supported_bl_types.append(type_module_class.bl_id)
# Retrieve local replicated types settings
type_local_config = settings.supported_datablock[type_impl_name]
if type_impl_name not in settings.supported_datablocks:
logging.info(f"{type_impl_name} not found, \
regenerating type settings...")
settings.generate_supported_types()
type_local_config = settings.supported_datablocks[type_impl_name]
bpy_factory.register_type(
type_module_class.bl_class,
type_module_class,
timer=type_local_config.bl_delay_refresh,
automatic=type_local_config.auto_push)
timer=type_local_config.bl_delay_refresh*1000,
automatic=type_local_config.auto_push,
check_common=type_module_class.bl_check_common)
if type_local_config.bl_delay_apply > 0:
delayables.append(delayable.ApplyTimer(
timout=type_local_config.bl_delay_apply,
target_type=type_module_class))
if settings.update_method == 'DEFAULT':
if type_local_config.bl_delay_apply > 0:
delayables.append(
delayable.ApplyTimer(
timout=type_local_config.bl_delay_apply,
target_type=type_module_class))
client = Session(
session.configure(
factory=bpy_factory,
python_path=bpy.app.binary_path_python,
default_strategy=settings.right_strategy)
external_update_handling=use_extern_update)
if settings.update_method == 'DEPSGRAPH':
delayables.append(delayable.ApplyTimer(
settings.depsgraph_update_rate/1000))
# Host a session
if self.host:
# Scene setup
if settings.start_empty:
if settings.init_method == 'EMPTY':
utils.clean_scene()
runtime_settings.is_host = True
runtime_settings.internet_ip = environment.get_ip()
try:
for scene in bpy.data.scenes:
scene_uuid = client.add(scene)
client.commit(scene_uuid)
session.add(scene)
client.host(
session.host(
id=settings.username,
address=settings.ip,
port=settings.port,
ipc_port=settings.ipc_port)
except Exception as e:
self.report({'ERROR'}, repr(e))
logger.error(f"Error: {e}")
finally:
settings.is_admin = True
# Join a session
else:
utils.clean_scene()
try:
client.connect(
id=settings.username,
address=settings.ip,
port=settings.port,
ipc_port=settings.ipc_port
ipc_port=settings.ipc_port,
timeout=settings.connection_timeout,
password=admin_pass,
cache_directory=settings.cache_directory,
server_log_level=logging.getLevelName(
logging.getLogger().level),
)
except Exception as e:
self.report({'ERROR'}, repr(e))
logger.error(f"Error: {e}")
finally:
settings.is_admin = False
logging.error(f"Error: {e}")
# Join a session
else:
if not runtime_settings.admin:
utils.clean_scene()
# regular session, no password needed
admin_pass = None
try:
session.connect(
id=settings.username,
address=settings.ip,
port=settings.port,
ipc_port=settings.ipc_port,
timeout=settings.connection_timeout,
password=admin_pass
)
except Exception as e:
self.report({'ERROR'}, str(e))
logging.error(str(e))
# Background client updates service
#TODO: Refactoring
delayables.append(delayable.ClientUpdate())
delayables.append(delayable.DrawClient())
delayables.append(delayable.DynamicRightSelectTimer())
# Launch drawing module
if settings.enable_presence:
presence.renderer.run()
session_update = delayable.SessionStatusUpdate()
session_user_sync = delayable.SessionUserSync()
session_background_executor = delayable.MainThreadExecutor(
execution_queue=background_execution_queue)
# Register blender main thread tools
for d in delayables:
d.register()
session_update.register()
session_user_sync.register()
session_background_executor.register()
delayables.append(session_background_executor)
delayables.append(session_update)
delayables.append(session_user_sync)
global modal_executor_queue
modal_executor_queue = queue.Queue()
bpy.ops.session.apply_armature_operator()
self.report(
{'INFO'},
"connexion on tcp://{}:{}".format(settings.ip, settings.port))
f"connecting to tcp://{settings.ip}:{settings.port}")
return {"FINISHED"}
class SessionInitOperator(bpy.types.Operator):
bl_idname = "session.init"
bl_label = "Init session repository from"
bl_description = "Init the current session"
bl_options = {"REGISTER"}
init_method: bpy.props.EnumProperty(
name='init_method',
description='Init repo',
items={
('EMPTY', 'an empty scene', 'start empty'),
('BLEND', 'current scenes', 'use current scenes')},
default='BLEND')
@classmethod
def poll(cls, context):
return True
def draw(self, context):
layout = self.layout
col = layout.column()
col.prop(self, 'init_method', text="")
def invoke(self, context, event):
wm = context.window_manager
return wm.invoke_props_dialog(self)
def execute(self, context):
if self.init_method == 'EMPTY':
utils.clean_scene()
for scene in bpy.data.scenes:
session.add(scene)
session.init()
return {"FINISHED"}
@@ -172,16 +330,50 @@ class SessionStopOperator(bpy.types.Operator):
return True
def execute(self, context):
global client, delayables, stop_modal_executor
assert(client)
global delayables, stop_modal_executor
if session:
try:
session.disconnect()
except Exception as e:
self.report({'ERROR'}, repr(e))
else:
self.report({'WARNING'}, "No session to quit.")
return {"FINISHED"}
return {"FINISHED"}
class SessionKickOperator(bpy.types.Operator):
bl_idname = "session.kick"
bl_label = "Kick"
bl_description = "Kick the user"
bl_options = {"REGISTER"}
user: bpy.props.StringProperty()
@classmethod
def poll(cls, context):
return True
def execute(self, context):
global delayables, stop_modal_executor
assert(session)
try:
client.disconnect()
session.kick(self.user)
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"FINISHED"}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def draw(self, context):
row = self.layout
row.label(text=f"Do you really want to kick {self.user}?")
class SessionPropertyRemoveOperator(bpy.types.Operator):
bl_idname = "session.remove_prop"
@@ -196,9 +388,8 @@ class SessionPropertyRemoveOperator(bpy.types.Operator):
return True
def execute(self, context):
global client
try:
client.remove(self.property_path)
session.remove(self.property_path)
return {"FINISHED"}
except: # NonAuthorizedOperationError:
@@ -226,17 +417,16 @@ class SessionPropertyRightOperator(bpy.types.Operator):
def draw(self, context):
layout = self.layout
settings = context.window_manager.session
runtime_settings = context.window_manager.session
col = layout.column()
col.prop(settings, "clients")
col.prop(runtime_settings, "clients")
def execute(self, context):
settings = context.window_manager.session
global client
runtime_settings = context.window_manager.session
if client:
client.change_owner(self.key, settings.clients)
if session:
session.change_owner(self.key, runtime_settings.clients)
return {"FINISHED"}
@@ -257,13 +447,13 @@ class SessionSnapUserOperator(bpy.types.Operator):
def execute(self, context):
wm = context.window_manager
settings = context.window_manager.session
runtime_settings = context.window_manager.session
if settings.time_snap_running:
settings.time_snap_running = False
if runtime_settings.time_snap_running:
runtime_settings.time_snap_running = False
return {'CANCELLED'}
else:
settings.time_snap_running = True
runtime_settings.time_snap_running = True
self._timer = wm.event_timer_add(0.1, window=context.window)
wm.modal_handler_add(self)
@@ -274,7 +464,8 @@ class SessionSnapUserOperator(bpy.types.Operator):
wm.event_timer_remove(self._timer)
def modal(self, context, event):
is_running = context.window_manager.session.time_snap_running
session_sessings = context.window_manager.session
is_running = session_sessings.time_snap_running
if event.type in {'RIGHTMOUSE', 'ESC'} or not is_running:
self.cancel(context)
@@ -282,14 +473,34 @@ class SessionSnapUserOperator(bpy.types.Operator):
if event.type == 'TIMER':
area, region, rv3d = presence.view3d_find()
global client
if client:
target_ref = client.online_users.get(self.target_client)
if session:
target_ref = session.online_users.get(self.target_client)
if target_ref:
rv3d.view_matrix = mathutils.Matrix(
target_ref['metadata']['view_matrix'])
target_scene = target_ref['metadata']['scene_current']
# Handle client on other scenes
if target_scene != context.scene.name:
blender_scene = bpy.data.scenes.get(target_scene, None)
if blender_scene is None:
self.report(
{'ERROR'}, f"Scene {target_scene} doesn't exist on the local client.")
session_sessings.time_snap_running = False
return {"CANCELLED"}
bpy.context.window.scene = blender_scene
# Update client viewmatrix
client_vmatrix = target_ref['metadata'].get(
'view_matrix', None)
if client_vmatrix:
rv3d.view_matrix = mathutils.Matrix(client_vmatrix)
else:
self.report({'ERROR'}, f"Client viewport not ready.")
session_sessings.time_snap_running = False
return {"CANCELLED"}
else:
return {"CANCELLED"}
@@ -311,13 +522,13 @@ class SessionSnapTimeOperator(bpy.types.Operator):
return True
def execute(self, context):
settings = context.window_manager.session
runtime_settings = context.window_manager.session
if settings.user_snap_running:
settings.user_snap_running = False
if runtime_settings.user_snap_running:
runtime_settings.user_snap_running = False
return {'CANCELLED'}
else:
settings.user_snap_running = True
runtime_settings.user_snap_running = True
wm = context.window_manager
self._timer = wm.event_timer_add(0.05, window=context.window)
@@ -335,10 +546,8 @@ class SessionSnapTimeOperator(bpy.types.Operator):
return {'CANCELLED'}
if event.type == 'TIMER':
global client
if client:
target_ref = client.online_users.get(self.target_client)
if session:
target_ref = session.online_users.get(self.target_client)
if target_ref:
context.scene.frame_current = target_ref['metadata']['frame_current']
@@ -361,9 +570,7 @@ class SessionApply(bpy.types.Operator):
return True
def execute(self, context):
global client
client.apply(self.target)
session.apply(self.target)
return {"FINISHED"}
@@ -381,10 +588,9 @@ class SessionCommit(bpy.types.Operator):
return True
def execute(self, context):
global client
# client.get(uuid=target).diff()
client.commit(uuid=self.target)
client.push(self.target)
# session.get(uuid=target).diff()
session.commit(uuid=self.target)
session.push(self.target)
return {"FINISHED"}
@@ -402,19 +608,17 @@ class ApplyArmatureOperator(bpy.types.Operator):
return {'CANCELLED'}
if event.type == 'TIMER':
global client
if client and client.state['STATE'] == STATE_ACTIVE:
nodes = client.list(filter=bl_types.bl_armature.BlArmature)
if session and session.state['STATE'] == STATE_ACTIVE:
nodes = session.list(filter=bl_types.bl_armature.BlArmature)
for node in nodes:
node_ref = client.get(uuid=node)
node_ref = session.get(uuid=node)
if node_ref.state == FETCHED:
try:
client.apply(node)
session.apply(node)
except Exception as e:
logger.error(
"fail to apply {}: {}".format(node_ref.uuid, e))
logging.error(f"Failed to apply armature: {e}")
return {'PASS_THROUGH'}
@@ -433,6 +637,35 @@ class ApplyArmatureOperator(bpy.types.Operator):
stop_modal_executor = False
class ClearCache(bpy.types.Operator):
"Clear local session cache"
bl_idname = "session.clear_cache"
bl_label = "Modal Executor Operator"
@classmethod
def poll(cls, context):
return True
def execute(self, context):
cache_dir = utils.get_preferences().cache_directory
try:
for root, dirs, files in os.walk(cache_dir):
for name in files:
Path(root, name).unlink()
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"FINISHED"}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def draw(self, context):
row = self.layout
row.label(text="Do you really want to remove the local cache?")
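The walk-and-unlink pattern in `ClearCache.execute` can be sketched standalone; the directory used here is a throwaway temp path (an assumption for the demo), not the add-on's real cache directory:

```python
import os
import tempfile
from pathlib import Path

def clear_cache(cache_dir: str) -> int:
    """Delete every file under cache_dir (keeping subdirectories); return the count removed."""
    removed = 0
    for root, dirs, files in os.walk(cache_dir):
        for name in files:
            Path(root, name).unlink()  # remove one cached file
            removed += 1
    return removed

# Demo against a throwaway directory, not the real add-on cache
demo = tempfile.mkdtemp()
Path(demo, "a.blob").write_bytes(b"x")
sub = Path(demo, "sub")
sub.mkdir()
(sub / "b.blob").write_bytes(b"y")
print(clear_cache(demo))  # → 2
```

Unlike `shutil.rmtree`, this leaves the directory tree in place, which matches what the operator does.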
classes = (
SessionStartOperator,
SessionStopOperator,
@@ -443,18 +676,12 @@ classes = (
SessionApply,
SessionCommit,
ApplyArmatureOperator,
SessionKickOperator,
SessionInitOperator,
ClearCache,
)
@persistent
def load_pre_handler(dummy):
global client
if client and client.state['STATE'] in [STATE_ACTIVE, STATE_SYNCING]:
bpy.ops.session.stop()
@persistent
def sanitize_deps_graph(dummy):
"""sanitize deps graph
@@ -463,27 +690,32 @@ def sanitize_deps_graph(dummy):
A future solution should be to avoid storing dataclock reference...
"""
global client
if session and session.state['STATE'] == STATE_ACTIVE:
for node_key in session.list():
session.get(node_key).resolve()
if client and client.state['STATE'] in [STATE_ACTIVE]:
for node_key in client.list():
client.get(node_key).resolve()
@persistent
def load_pre_handler(dummy):
if session and session.state['STATE'] in [STATE_ACTIVE, STATE_SYNCING]:
bpy.ops.session.stop()
@persistent
def update_client_frame(scene):
if client and client.state['STATE'] == STATE_ACTIVE:
client.update_user_metadata({
if session and session.state['STATE'] == STATE_ACTIVE:
session.update_user_metadata({
'frame_current': scene.frame_current
})
@persistent
def depsgraph_evaluation(scene):
if client and client.state['STATE'] == STATE_ACTIVE:
if session and session.state['STATE'] == STATE_ACTIVE:
context = bpy.context
blender_depsgraph = bpy.context.view_layer.depsgraph
dependency_updates = [u for u in blender_depsgraph.updates]
session_infos = bpy.context.window_manager.session
settings = utils.get_preferences()
# NOTE: maybe we don't need to check each update but only the first
@@ -491,25 +723,25 @@ def depsgraph_evaluation(scene):
# Is the object tracked ?
if update.id.uuid:
# Retrieve local version
node = client.get(update.id.uuid)
node = session.get(update.id.uuid)
# Check our right on this update:
# - if its ours or ( under common and diff), launch the
# update process
# - if its to someone else, ignore the update (go deeper ?)
if node.owner == session_infos.username:
if node and node.owner in [session.id, RP_COMMON] and node.state == UP:
# Avoid slow geometry update
if 'EDIT' in context.mode:
if 'EDIT' in context.mode and \
not settings.sync_during_editmode:
break
logger.error("UPDATE: MODIFIFY {}".format(type(update.id)))
# client.commit(node.uuid)
# client.push(node.uuid)
session.stash(node.uuid)
else:
# Distant update
continue
# else:
# # New items !
# logger.error("UPDATE: ADD")C.obj
# logger.error("UPDATE: ADD")
def register():
@@ -517,36 +749,23 @@ def register():
for cls in classes:
register_class(cls)
bpy.app.handlers.load_pre.append(load_pre_handler)
bpy.app.handlers.undo_post.append(sanitize_deps_graph)
bpy.app.handlers.redo_post.append(sanitize_deps_graph)
bpy.app.handlers.load_pre.append(load_pre_handler)
bpy.app.handlers.frame_change_pre.append(update_client_frame)
# bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
def unregister():
global client
if client and client.state['STATE'] == 2:
client.disconnect()
client = None
if session and session.state['STATE'] == STATE_ACTIVE:
session.disconnect()
from bpy.utils import unregister_class
for cls in reversed(classes):
unregister_class(cls)
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.undo_post.remove(sanitize_deps_graph)
bpy.app.handlers.redo_post.remove(sanitize_deps_graph)
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.frame_change_pre.remove(update_client_frame)
# bpy.app.handlers.depsgraph_update_post.remove(depsgraph_evaluation)
if __name__ == "__main__":
register()

multi_user/preferences.py Normal file

@@ -0,0 +1,556 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import random
import logging
import bpy
import string
import re
import os
from pathlib import Path
from . import bl_types, environment, addon_updater_ops, presence, ui
from .utils import get_preferences, get_expanded_icon
from replication.constants import RP_COMMON
from replication.interface import session
IP_EXPR = re.compile(r'\d+\.\d+\.\d+\.\d+')
def randomColor():
"""Generate a random color """
r = random.random()
v = random.random()
b = random.random()
return [r, v, b]
def random_string_digits(stringLength=6):
"""Generate a random string of letters and digits """
lettersAndDigits = string.ascii_letters + string.digits
return ''.join(random.choices(lettersAndDigits, k=stringLength))
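`random_string_digits` (used for the default username and session password below) draws from ASCII letters and digits; a quick standalone check of its behaviour:

```python
import random
import string

def random_string_digits(stringLength=6):
    """Generate a random string of letters and digits."""
    lettersAndDigits = string.ascii_letters + string.digits
    return ''.join(random.choices(lettersAndDigits, k=stringLength))

token = random_string_digits()
print(len(token))  # → 6
print(all(c in string.ascii_letters + string.digits for c in token))  # → True
```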
def update_panel_category(self, context):
ui.unregister()
ui.SESSION_PT_settings.bl_category = self.panel_category
ui.register()
def update_ip(self, context):
ip = IP_EXPR.search(self.ip)
if ip:
self['ip'] = ip.group()
else:
logging.error("Wrong IP format")
self['ip'] = "127.0.0.1"
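`update_ip` keeps only the first dotted-quad found in the field and falls back to localhost otherwise; the same logic in isolation (the helper name `sanitize_ip` is invented for this sketch, and a raw-string pattern is used to avoid Python's invalid-escape warning):

```python
import re

IP_EXPR = re.compile(r'\d+\.\d+\.\d+\.\d+')

def sanitize_ip(value: str, fallback: str = "127.0.0.1") -> str:
    """Extract the first dotted-quad found in value, else return fallback."""
    match = IP_EXPR.search(value)
    return match.group() if match else fallback

print(sanitize_ip("tcp://192.168.1.42:5555"))  # → 192.168.1.42
print(sanitize_ip("not-an-ip"))                # → 127.0.0.1
```

Note the pattern accepts any digit runs (e.g. `999.999.999.999`), so this is a format check, not a validity check.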
def update_port(self, context):
max_port = self.port + 3
if self.ipc_port < max_port and \
self['ipc_port'] >= self.port:
logging.error(
"IPC port in conflict with the session port, assigning a random value")
self['ipc_port'] = random.randrange(self.port+4, 10000)
def update_directory(self, context):
new_dir = Path(self.cache_directory)
if new_dir.exists() and any(Path(self.cache_directory).iterdir()):
logging.error("The folder is not empty, choose another one.")
self['cache_directory'] = environment.DEFAULT_CACHE_DIR
elif not new_dir.exists():
logging.info("Target cache folder doesn't exist, creating it.")
os.makedirs(self.cache_directory, exist_ok=True)
def set_log_level(self, value):
logging.getLogger().setLevel(value)
def get_log_level(self):
return logging.getLogger().level
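`set_log_level`/`get_log_level` back the `logging_level` enum with the root logger itself rather than a stored property, so changing the preference immediately affects every module-level `logging` call; the mechanism in isolation:

```python
import logging

def set_log_level(value):
    # Route the preference straight to the root logger
    logging.getLogger().setLevel(value)

def get_log_level():
    return logging.getLogger().level

set_log_level(logging.DEBUG)
print(get_log_level() == logging.DEBUG)       # → True
print(logging.getLevelName(get_log_level()))  # → DEBUG
```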
class ReplicatedDatablock(bpy.types.PropertyGroup):
type_name: bpy.props.StringProperty()
bl_name: bpy.props.StringProperty()
bl_delay_refresh: bpy.props.FloatProperty()
bl_delay_apply: bpy.props.FloatProperty()
use_as_filter: bpy.props.BoolProperty(default=True)
auto_push: bpy.props.BoolProperty(default=True)
icon: bpy.props.StringProperty()
def set_sync_render_settings(self, value):
self['sync_render_settings'] = value
if session and bpy.context.scene.uuid and value:
bpy.ops.session.apply('INVOKE_DEFAULT', target=bpy.context.scene.uuid)
def set_sync_active_camera(self, value):
self['sync_active_camera'] = value
if session and bpy.context.scene.uuid and value:
bpy.ops.session.apply('INVOKE_DEFAULT', target=bpy.context.scene.uuid)
class ReplicationFlags(bpy.types.PropertyGroup):
def get_sync_render_settings(self):
return self.get('sync_render_settings', True)
def get_sync_active_camera(self):
return self.get('sync_active_camera', True)
sync_render_settings: bpy.props.BoolProperty(
name="Synchronize render settings",
description="Synchronize render settings (eevee and cycles only)",
default=True,
set=set_sync_render_settings,
get=get_sync_render_settings)
sync_during_editmode: bpy.props.BoolProperty(
name="Edit mode updates",
description="Enable objects update in edit mode (! Impact performances !)",
default=False
)
sync_active_camera: bpy.props.BoolProperty(
name="Synchronize active camera",
description="Synchronize the active camera",
default=True,
get=get_sync_active_camera,
set=set_sync_active_camera
)
class SessionPrefs(bpy.types.AddonPreferences):
bl_idname = __package__
ip: bpy.props.StringProperty(
name="ip",
description='Distant host ip',
default="127.0.0.1",
update=update_ip)
username: bpy.props.StringProperty(
name="Username",
default=f"user_{random_string_digits()}"
)
client_color: bpy.props.FloatVectorProperty(
name="client_instance_color",
subtype='COLOR',
default=randomColor())
port: bpy.props.IntProperty(
name="port",
description='Distant host port',
default=5555
)
sync_flags: bpy.props.PointerProperty(
type=ReplicationFlags
)
supported_datablocks: bpy.props.CollectionProperty(
type=ReplicatedDatablock,
)
ipc_port: bpy.props.IntProperty(
name="ipc_port",
description='internal ttl port (only useful for multiple local instances)',
default=random.randrange(5570, 70000),
update=update_port,
)
init_method: bpy.props.EnumProperty(
name='init_method',
description='Init repo',
items={
('EMPTY', 'an empty scene', 'start empty'),
('BLEND', 'current scenes', 'use current scenes')},
default='BLEND')
cache_directory: bpy.props.StringProperty(
name="cache directory",
subtype="DIR_PATH",
default=environment.DEFAULT_CACHE_DIR,
update=update_directory)
connection_timeout: bpy.props.IntProperty(
name='connection timeout',
description='connection timeout before disconnection',
default=1000
)
update_method: bpy.props.EnumProperty(
name='update method',
description='replication update method',
items=[
('DEFAULT', "Default", "Default: Use threads to monitor datablock changes"),
('DEPSGRAPH', "Depsgraph",
"Experimental: Use the blender dependency graph to trigger updates"),
],
)
# Replication update settings
depsgraph_update_rate: bpy.props.IntProperty(
name='depsgraph update rate',
description='Dependency graph update rate (milliseconds)'
default=100
)
clear_memory_filecache: bpy.props.BoolProperty(
name="Clear memory filecache",
description="Remove filecache from memory",
default=False
)
# for UI
category: bpy.props.EnumProperty(
name="Category",
description="Preferences Category",
items=[
('CONFIG', "Configuration", "Configuration of this add-on"),
('UPDATE', "Update", "Update this add-on"),
],
default='CONFIG'
)
logging_level: bpy.props.EnumProperty(
name="Log level",
description="Log verbosity level",
items=[
('ERROR', "error", "show only errors", logging.ERROR),
('WARNING', "warning", "only show warnings and errors", logging.WARNING),
('INFO', "info", "default level", logging.INFO),
('DEBUG', "debug", "show all logs", logging.DEBUG),
],
default='INFO',
set=set_log_level,
get=get_log_level
)
conf_session_identity_expanded: bpy.props.BoolProperty(
name="Identity",
description="Identity",
default=True
)
conf_session_net_expanded: bpy.props.BoolProperty(
name="Net",
description="net",
default=True
)
conf_session_hosting_expanded: bpy.props.BoolProperty(
name="Rights",
description="Rights",
default=False
)
conf_session_timing_expanded: bpy.props.BoolProperty(
name="timings",
description="timings",
default=False
)
conf_session_cache_expanded: bpy.props.BoolProperty(
name="Cache",
description="cache",
default=False
)
conf_session_ui_expanded: bpy.props.BoolProperty(
name="Interface",
description="Interface",
default=False
)
sidebar_advanced_rep_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_rep_expanded",
description="sidebar_advanced_rep_expanded",
default=False
)
sidebar_advanced_log_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_log_expanded",
description="sidebar_advanced_log_expanded",
default=False
)
sidebar_advanced_net_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_net_expanded",
description="sidebar_advanced_net_expanded",
default=False
)
sidebar_advanced_cache_expanded: bpy.props.BoolProperty(
name="sidebar_advanced_cache_expanded",
description="sidebar_advanced_cache_expanded",
default=False
)
auto_check_update: bpy.props.BoolProperty(
name="Auto-check for Update",
description="If enabled, auto-check for updates using an interval",
default=False,
)
updater_intrval_months: bpy.props.IntProperty(
name='Months',
description="Number of months between checking for updates",
default=0,
min=0
)
updater_intrval_days: bpy.props.IntProperty(
name='Days',
description="Number of days between checking for updates",
default=7,
min=0,
max=31
)
updater_intrval_hours: bpy.props.IntProperty(
name='Hours',
description="Number of hours between checking for updates",
default=0,
min=0,
max=23
)
updater_intrval_minutes: bpy.props.IntProperty(
name='Minutes',
description="Number of minutes between checking for updates",
default=0,
min=0,
max=59
)
# Custom panel
panel_category: bpy.props.StringProperty(
description="Choose a name for the category of the panel",
default="Multiuser",
update=update_panel_category)
def draw(self, context):
layout = self.layout
layout.row().prop(self, "category", expand=True)
if self.category == 'CONFIG':
grid = layout.column()
# USER INFORMATION
box = grid.box()
box.prop(
self, "conf_session_identity_expanded", text="User information",
icon=get_expanded_icon(self.conf_session_identity_expanded),
emboss=False)
if self.conf_session_identity_expanded:
box.row().prop(self, "username", text="name")
box.row().prop(self, "client_color", text="color")
# NETWORK SETTINGS
box = grid.box()
box.prop(
self, "conf_session_net_expanded", text="Networking",
icon=get_expanded_icon(self.conf_session_net_expanded),
emboss=False)
if self.conf_session_net_expanded:
box.row().prop(self, "ip", text="Address")
row = box.row()
row.label(text="Port:")
row.prop(self, "port", text="")
row = box.row()
row.label(text="Init the session from:")
row.prop(self, "init_method", text="")
row = box.row()
row.label(text="Update method:")
row.prop(self, "update_method", text="")
table = box.box()
table.row().prop(
self, "conf_session_timing_expanded", text="Refresh rates",
icon=get_expanded_icon(self.conf_session_timing_expanded),
emboss=False)
if self.conf_session_timing_expanded:
line = table.row()
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
for item in self.supported_datablocks:
line = table.row(align=True)
line.label(text="", icon=item.icon)
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
# HOST SETTINGS
box = grid.box()
box.prop(
self, "conf_session_hosting_expanded", text="Hosting",
icon=get_expanded_icon(self.conf_session_hosting_expanded),
emboss=False)
if self.conf_session_hosting_expanded:
row = box.row()
row.label(text="Init the session from:")
row.prop(self, "init_method", text="")
# CACHE SETTINGS
box = grid.box()
box.prop(
self, "conf_session_cache_expanded", text="Cache",
icon=get_expanded_icon(self.conf_session_cache_expanded),
emboss=False)
if self.conf_session_cache_expanded:
box.row().prop(self, "cache_directory", text="Cache directory")
box.row().prop(self, "clear_memory_filecache", text="Clear memory filecache")
# INTERFACE SETTINGS
box = grid.box()
box.prop(
self, "conf_session_ui_expanded", text="Interface",
icon=get_expanded_icon(self.conf_session_ui_expanded),
emboss=False)
if self.conf_session_ui_expanded:
box.row().prop(self, "panel_category", text="Panel category", expand=True)
if self.category == 'UPDATE':
from . import addon_updater_ops
addon_updater_ops.update_settings_ui(self, context)
def generate_supported_types(self):
self.supported_datablocks.clear()
for type in bl_types.types_to_register():
new_db = self.supported_datablocks.add()
type_module = getattr(bl_types, type)
type_impl_name = f"Bl{type.split('_')[1].capitalize()}"
type_module_class = getattr(type_module, type_impl_name)
new_db.name = type_impl_name
new_db.type_name = type_impl_name
new_db.bl_delay_refresh = type_module_class.bl_delay_refresh
new_db.bl_delay_apply = type_module_class.bl_delay_apply
new_db.use_as_filter = True
new_db.icon = type_module_class.bl_icon
new_db.auto_push = type_module_class.bl_automatic_push
new_db.bl_name = type_module_class.bl_id
def client_list_callback(scene, context):
from . import operators
items = [(RP_COMMON, RP_COMMON, "")]
username = get_preferences().username
if session:
client_ids = session.online_users.keys()
for id in client_ids:
name_desc = id
if id == username:
name_desc += " (self)"
items.append((id, name_desc, ""))
return items
class SessionUser(bpy.types.PropertyGroup):
"""Session User
Blender user information property
"""
username: bpy.props.StringProperty(name="username")
current_frame: bpy.props.IntProperty(name="current_frame")
class SessionProps(bpy.types.PropertyGroup):
session_mode: bpy.props.EnumProperty(
name='session_mode',
description='session mode',
items={
('HOST', 'HOST', 'host a session'),
('CONNECT', 'JOIN', 'connect to a session')},
default='CONNECT')
clients: bpy.props.EnumProperty(
name="clients",
description="client enum",
items=client_list_callback)
enable_presence: bpy.props.BoolProperty(
name="Presence overlay",
description='Enable overlay drawing module',
default=True,
update=presence.update_presence
)
presence_show_selected: bpy.props.BoolProperty(
name="Show selected objects",
description='Enable selection overlay ',
default=True,
update=presence.update_overlay_settings
)
presence_show_user: bpy.props.BoolProperty(
name="Show users",
description='Enable user overlay ',
default=True,
update=presence.update_overlay_settings
)
presence_show_far_user: bpy.props.BoolProperty(
name="Show users on different scenes",
description="Show user on different scenes",
default=False,
update=presence.update_overlay_settings
)
filter_owned: bpy.props.BoolProperty(
name="filter_owned",
description='Show only owned datablocks',
default=True
)
admin: bpy.props.BoolProperty(
name="admin",
description='Connect as admin',
default=False
)
password: bpy.props.StringProperty(
name="password",
default=random_string_digits(),
description='Session password',
subtype='PASSWORD'
)
internet_ip: bpy.props.StringProperty(
name="internet ip",
default="not found",
description='Internet interface ip',
)
user_snap_running: bpy.props.BoolProperty(
default=False
)
time_snap_running: bpy.props.BoolProperty(
default=False
)
is_host: bpy.props.BoolProperty(
default=False
)
classes = (
SessionUser,
SessionProps,
ReplicationFlags,
ReplicatedDatablock,
SessionPrefs,
)
def register():
from bpy.utils import register_class
for cls in classes:
register_class(cls)
prefs = bpy.context.preferences.addons[__package__].preferences
if len(prefs.supported_datablocks) == 0:
logging.debug('Generating bl_types preferences')
prefs.generate_supported_types()
def unregister():
from bpy.utils import unregister_class
for cls in reversed(classes):
unregister_class(cls)


@@ -1,6 +1,25 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import copy
import logging
import math
import traceback
import bgl
import blf
@@ -14,10 +33,12 @@ from . import utils
renderer = None
logger = logging.getLogger(__name__)
def view3d_find():
""" Find the first 'VIEW_3D' window in the screen areas
:return: tuple(Area, Region, RegionView3D)
"""
for area in bpy.data.window_managers[0].windows[0].screen.areas:
if area.type == 'VIEW_3D':
v3d = area.spaces[0]
@@ -25,15 +46,23 @@ def view3d_find():
for region in area.regions:
if region.type == 'WINDOW':
return area, region, rv3d
return None, None, None
def refresh_3d_view():
""" Refresh the viewport
"""
area, region, rv3d = view3d_find()
if area and region and rv3d:
area.tag_redraw()
def refresh_sidebar_view():
""" Refresh the blender sidebar
"""
area, region, rv3d = view3d_find()
if area:
area.regions[3].tag_redraw()
def get_target(region, rv3d, coord):
target = [0, 0, 0]
@@ -117,10 +146,8 @@ def get_bb_coords_from_obj(object, parent=None):
def get_view_matrix():
area, region, rv3d = view3d_find()
if area and region and rv3d:
matrix_dumper = utils.dump_anything.Dumper()
return matrix_dumper.dump(rv3d.view_matrix)
if area and region and rv3d:
return [list(v) for v in rv3d.view_matrix]
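The rewritten `get_view_matrix` serializes the 4×4 `view_matrix` as plain nested lists (JSON-friendly) instead of going through `dump_anything.Dumper`; the conversion works on any row-iterable matrix, sketched here with a tuple stand-in since `mathutils.Matrix` is unavailable outside Blender:

```python
# Stand-in for mathutils.Matrix: any iterable of row iterables works
identity = ((1.0, 0.0, 0.0, 0.0),
            (0.0, 1.0, 0.0, 0.0),
            (0.0, 0.0, 1.0, 0.0),
            (0.0, 0.0, 0.0, 1.0))

dumped = [list(v) for v in identity]  # same expression as get_view_matrix
print(dumped[0])  # → [1.0, 0.0, 0.0, 0.0]
```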
def update_presence(self, context):
global renderer
@@ -183,7 +210,7 @@ class DrawFactory(object):
def flush_selection(self, user=None):
key_to_remove = []
select_key = "{}_select".format(user) if user else "select"
select_key = f"{user}_select" if user else "select"
for k in self.d3d_items.keys():
if select_key in k:
@@ -204,13 +231,13 @@ class DrawFactory(object):
self.d2d_items.clear()
def draw_client_selection(self, client_id, client_color, client_selection):
local_user = bpy.context.window_manager.session.username
local_user = utils.get_preferences().username
if local_user != client_id:
self.flush_selection(client_id)
for select_ob in client_selection:
drawable_key = "{}_select_{}".format(client_id, select_ob)
drawable_key = f"{client_id}_select_{select_ob}"
ob = utils.find_from_attr("uuid", select_ob, bpy.data.objects)
if not ob:
@@ -219,6 +246,10 @@ class DrawFactory(object):
if ob.type == 'EMPTY':
# TODO: Child case
# Collection instance case
indices = (
(0, 1), (1, 2), (2, 3), (0, 3),
(4, 5), (5, 6), (6, 7), (4, 7),
(0, 4), (1, 5), (2, 6), (3, 7))
if ob.instance_collection:
for obj in ob.instance_collection.objects:
if obj.type == 'MESH':
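The `indices` tuple added above enumerates the 12 edges of Blender's 8-corner `bound_box`. A standalone sketch (a unit cube stands in for the real bounding box, using Blender's corner ordering) that verifies each pair really is a cube edge:

```python
# Blender's Object.bound_box lists 8 corners; a unit cube stands in here,
# ordered -X face (indices 0-3) then +X face (4-7), matching Blender's convention.
corners = [
    (0, 0, 0), (0, 0, 1), (0, 1, 1), (0, 1, 0),
    (1, 0, 0), (1, 0, 1), (1, 1, 1), (1, 1, 0),
]
# Edge list from the diff above: 4 edges per X face, 4 edges connecting them.
indices = (
    (0, 1), (1, 2), (2, 3), (0, 3),
    (4, 5), (5, 6), (6, 7), (4, 7),
    (0, 4), (1, 5), (2, 6), (3, 7))

# A genuine cube edge joins two corners that differ in exactly one coordinate.
for a, b in indices:
    assert sum(ca != cb for ca, cb in zip(corners[a], corners[b])) == 1

assert len(set(map(frozenset, indices))) == 12  # all 12 edges, no duplicates
```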
@@ -261,7 +292,7 @@ class DrawFactory(object):
def draw_client_camera(self, client_id, client_location, client_color):
if client_location:
local_user = bpy.context.window_manager.session.username
local_user = utils.get_preferences().username
if local_user != client_id:
try:
@@ -282,10 +313,10 @@ class DrawFactory(object):
self.d2d_items[client_id] = (position[1], client_id, color)
except Exception as e:
logger.error("Draw client exception {}".format(e))
logging.debug(f"Draw client exception: {e} \n {traceback.format_exc()}\n pos:{position},ind:{indices}")
def draw3d_callback(self):
bgl.glLineWidth(1.5)
bgl.glLineWidth(2.)
bgl.glEnable(bgl.GL_DEPTH_TEST)
bgl.glEnable(bgl.GL_BLEND)
bgl.glEnable(bgl.GL_LINE_SMOOTH)
@@ -296,7 +327,7 @@ class DrawFactory(object):
shader.uniform_float("color", color)
batch.draw(shader)
except Exception:
logger.error("3D Exception")
logging.error("3D Exception")
def draw2d_callback(self):
for position, font, color in self.d2d_items.values():
@@ -310,7 +341,7 @@ class DrawFactory(object):
blf.draw(0, font)
except Exception:
logger.error("2D EXCEPTION")
logging.error("2D EXCEPTION")
def register():


@@ -1,21 +1,45 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
from . import operators
from .libs.replication.replication.constants import (ADDED, ERROR, FETCHED,
from .utils import get_preferences, get_expanded_icon, get_folder_size
from replication.constants import (ADDED, ERROR, FETCHED,
MODIFIED, RP_COMMON, UP,
STATE_ACTIVE, STATE_AUTH,
STATE_CONFIG, STATE_SYNCING,
STATE_INITIAL, STATE_SRV_SYNC,
STATE_WAITING, STATE_QUITTING)
STATE_WAITING, STATE_QUITTING,
STATE_LOBBY,
STATE_LAUNCHING_SERVICES)
from replication import __version__
from replication.interface import session
ICONS_PROP_STATES = ['TRIA_DOWN', # ADDED
'TRIA_UP', # COMMITED
'KEYTYPE_KEYFRAME_VEC', # PUSHED
'TRIA_DOWN', # FETCHED
'FILE_REFRESH', # UP
'TRIA_UP'] # CHANGED
'TRIA_UP',
'ERROR'] # CHANGED
def printProgressBar (iteration, total, prefix = '', suffix = '', decimals = 1, length = 100, fill = '', fill_empty=' '):
def printProgressBar(iteration, total, prefix='', suffix='', decimals=1, length=100, fill='', fill_empty=' '):
"""
Call in a loop to create terminal progress bar
@params:
@@ -29,17 +53,19 @@ def printProgressBar (iteration, total, prefix = '', suffix = '', decimals = 1,
From here:
https://gist.github.com/greenstick/b23e475d2bfdc3a82e34eaa1f6781ee4
"""
percent = ("{0:." + str(decimals) + "f}").format(100 * (iteration / float(total)))
if total == 0:
return ""
filledLength = int(length * iteration // total)
bar = fill * filledLength + fill_empty * (length - filledLength)
return '{} |{}| {}/{}{}'.format(prefix, bar, iteration,total, suffix)
return f"{prefix} |{bar}| {iteration}/{total}{suffix}"
def get_state_str(state):
state_str = 'None'
state_str = 'UNKNOWN'
if state == STATE_WAITING:
state_str = 'WARMING UP DATA'
elif state == STATE_SYNCING:
state_str = 'FETCHING FROM SERVER'
state_str = 'FETCHING'
elif state == STATE_AUTH:
state_str = 'AUTHENTIFICATION'
elif state == STATE_CONFIG:
@@ -47,144 +73,168 @@ def get_state_str(state):
elif state == STATE_ACTIVE:
state_str = 'ONLINE'
elif state == STATE_SRV_SYNC:
state_str = 'PUSHING TO SERVER'
state_str = 'PUSHING'
elif state == STATE_INITIAL:
state_str = 'INIT'
elif state == STATE_QUITTING:
state_str = 'QUITTING SESSION'
state_str = 'QUITTING'
elif state == STATE_LAUNCHING_SERVICES:
state_str = 'LAUNCHING SERVICES'
elif state == STATE_LOBBY:
state_str = 'LOBBY'
return state_str
class SESSION_PT_settings(bpy.types.Panel):
"""Settings panel"""
bl_idname = "MULTIUSER_SETTINGS_PT_panel"
bl_label = "Session"
bl_label = " "
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
def draw_header(self, context):
self.layout.label(text="", icon='TOOL_SETTINGS')
layout = self.layout
if session and session.state['STATE'] != STATE_INITIAL:
cli_state = session.state
state = session.state.get('STATE')
connection_icon = "KEYTYPE_MOVING_HOLD_VEC"
if state == STATE_ACTIVE:
connection_icon = 'PROP_ON'
else:
connection_icon = 'PROP_CON'
layout.label(text=f"Session - {get_state_str(cli_state['STATE'])}", icon=connection_icon)
else:
layout.label(text=f"Session - v{__version__}",icon="PROP_OFF")
def draw(self, context):
layout = self.layout
layout.use_property_split = True
row = layout.row()
runtime_settings = context.window_manager.session
settings = get_preferences()
if hasattr(context.window_manager, 'session'):
# STATE INITIAL
if not operators.client \
or (operators.client and operators.client.state['STATE'] == STATE_INITIAL):
if not session \
or (session and session.state['STATE'] == STATE_INITIAL):
pass
else:
cli_state = operators.client.state
row.label(text=f"Status : {get_state_str(cli_state['STATE'])}")
cli_state = session.state
row = layout.row()
current_state = cli_state['STATE']
info_msg = None
# STATE ACTIVE
if current_state == STATE_ACTIVE:
row.operator("session.stop", icon='QUIT', text="Exit")
row = layout.row()
if current_state in [STATE_ACTIVE]:
row = row.split(factor=0.3)
row.prop(settings.sync_flags, "sync_render_settings",text="",icon_only=True, icon='SCENE')
row.prop(settings.sync_flags, "sync_during_editmode", text="",icon_only=True, icon='EDITMODE_HLT')
row.prop(settings.sync_flags, "sync_active_camera", text="",icon_only=True, icon='OBJECT_DATAMODE')
row= layout.row()
# CONNECTION STATE
elif current_state in [
STATE_SRV_SYNC,
STATE_SYNCING,
STATE_AUTH,
STATE_CONFIG,
STATE_WAITING]:
if cli_state['STATE'] in [STATE_SYNCING,STATE_SRV_SYNC,STATE_WAITING]:
box = row.box()
box.label(text=printProgressBar(
cli_state['CURRENT'],
cli_state['TOTAL'],
length=16
))
if current_state in [STATE_ACTIVE] and runtime_settings.is_host:
info_msg = f"LAN: {runtime_settings.internet_ip}"
if current_state == STATE_LOBBY:
info_msg = "Waiting the session to start."
row = layout.row()
row.operator("session.stop", icon='QUIT', text="CANCEL")
elif current_state == STATE_QUITTING:
row = layout.row()
box = row.box()
if info_msg:
info_box = row.box()
info_box.row().label(text=info_msg,icon='INFO')
num_online_services = 0
for name, state in operators.client.services_state.items():
if state == STATE_ACTIVE:
num_online_services += 1
# Progress bar
if current_state in [STATE_SYNCING, STATE_SRV_SYNC, STATE_WAITING]:
info_box = row.box()
info_box.row().label(text=printProgressBar(
cli_state['CURRENT'],
cli_state['TOTAL'],
length=16
))
total_online_services = len(operators.client.services_state)
box.label(text=printProgressBar(
total_online_services-num_online_services,
total_online_services,
length=16
))
layout.row().operator("session.stop", icon='QUIT', text="Exit")
class SESSION_PT_settings_network(bpy.types.Panel):
bl_idname = "MULTIUSER_SETTINGS_NETWORK_PT_panel"
bl_label = "Network"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
@classmethod
def poll(cls, context):
return not operators.client \
or (operators.client and operators.client.state['STATE'] == 0)
return not session \
or (session and session.state['STATE'] == 0)
def draw_header(self, context):
self.layout.label(text="", icon='URL')
def draw(self, context):
layout = self.layout
settings = context.window_manager.session
runtime_settings = context.window_manager.session
settings = get_preferences()
# USER SETTINGS
row = layout.row()
row.prop(settings, "session_mode", expand=True)
row.prop(runtime_settings, "session_mode", expand=True)
row = layout.row()
box = row.box()
row = box.row()
row.prop(settings, "ip", text="IP")
row = box.row()
row.label(text="Port:")
row.prop(settings, "port", text="")
row = box.row()
row.label(text="IPC Port:")
row.prop(settings, "ipc_port", text="")
if settings.session_mode == 'HOST':
if runtime_settings.session_mode == 'HOST':
row = box.row()
row.label(text="Start empty:")
row.prop(settings, "start_empty", text="")
row.label(text="Port:")
row.prop(settings, "port", text="")
row = box.row()
row.label(text="Start from:")
row.prop(settings, "init_method", text="")
row = box.row()
row.label(text="Admin password:")
row.prop(runtime_settings, "password", text="")
row = box.row()
row.operator("session.start", text="HOST").host = True
else:
row = box.row()
row.prop(settings, "ip", text="IP")
row = box.row()
row.label(text="Port:")
row.prop(settings, "port", text="")
row = box.row()
row.prop(runtime_settings, "admin", text='Connect as admin', icon='DISCLOSURE_TRI_DOWN' if runtime_settings.admin
else 'DISCLOSURE_TRI_RIGHT')
if runtime_settings.admin:
row = box.row()
row.label(text="Password:")
row.prop(runtime_settings, "password", text="")
row = box.row()
row.operator("session.start", text="CONNECT").host = False
class SESSION_PT_settings_user(bpy.types.Panel):
bl_idname = "MULTIUSER_SETTINGS_USER_PT_panel"
bl_label = "User"
bl_label = "User info"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
@classmethod
def poll(cls, context):
return not operators.client \
or (operators.client and operators.client.state['STATE'] == 0)
return not session \
or (session and session.state['STATE'] == 0)
def draw_header(self, context):
self.layout.label(text="", icon='USER')
def draw(self, context):
layout = self.layout
settings = context.window_manager.session
runtime_settings = context.window_manager.session
settings = get_preferences()
row = layout.row()
# USER SETTINGS
@@ -195,134 +245,230 @@ class SESSION_PT_settings_user(bpy.types.Panel):
row = layout.row()
class SESSION_PT_settings_replication(bpy.types.Panel):
class SESSION_PT_advanced_settings(bpy.types.Panel):
bl_idname = "MULTIUSER_SETTINGS_REPLICATION_PT_panel"
bl_label = "Advanced"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
return not operators.client \
or (operators.client and operators.client.state['STATE'] == 0)
return not session \
or (session and session.state['STATE'] == 0)
def draw_header(self, context):
self.layout.label(text="", icon='PREFERENCES')
def draw(self, context):
layout = self.layout
settings = context.window_manager.session
# Rights management
if settings.session_mode == 'HOST':
row = layout.row(align=True)
row.label(text="Right strategy:")
row.prop(settings,"right_strategy",text="")
runtime_settings = context.window_manager.session
settings = get_preferences()
row = layout.row()
net_section = layout.row().box()
net_section.prop(
settings,
"sidebar_advanced_net_expanded",
text="Network",
icon=get_expanded_icon(settings.sidebar_advanced_net_expanded),
emboss=False)
if settings.sidebar_advanced_net_expanded:
net_section_row = net_section.row()
net_section_row.label(text="IPC Port:")
net_section_row.prop(settings, "ipc_port", text="")
net_section_row = net_section.row()
net_section_row.label(text="Timeout (ms):")
net_section_row.prop(settings, "connection_timeout", text="")
row = layout.row()
# Replication frequencies
flow = row.grid_flow(
row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
line = flow.row(align=True)
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
replication_section = layout.row().box()
replication_section.prop(
settings,
"sidebar_advanced_rep_expanded",
text="Replication",
icon=get_expanded_icon(settings.sidebar_advanced_rep_expanded),
emboss=False)
for item in settings.supported_datablock:
line = flow.row(align=True)
line.prop(item, "auto_push", text="", icon=item.icon)
line.separator()
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
if settings.sidebar_advanced_rep_expanded:
replication_section_row = replication_section.row()
replication_section_row.label(text="Sync flags", icon='COLLECTION_NEW')
replication_section_row = replication_section.row()
replication_section_row.prop(settings.sync_flags, "sync_render_settings")
replication_section_row = replication_section.row()
replication_section_row.prop(settings.sync_flags, "sync_active_camera")
replication_section_row = replication_section.row()
replication_section_row.prop(settings.sync_flags, "sync_during_editmode")
replication_section_row = replication_section.row()
if settings.sync_flags.sync_during_editmode:
warning = replication_section_row.box()
warning.label(text="Don't use this with heavy meshes !", icon='ERROR')
replication_section_row = replication_section.row()
replication_section_row.label(text="Update method", icon='RECOVER_LAST')
replication_section_row = replication_section.row()
replication_section_row.prop(settings, "update_method", expand=True)
replication_section_row = replication_section.row()
replication_timers = replication_section_row.box()
replication_timers.label(text="Replication timers", icon='TIME')
if settings.update_method == "DEFAULT":
replication_timers = replication_timers.row()
# Replication frequencies
flow = replication_timers.grid_flow(
row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
line = flow.row(align=True)
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
for item in settings.supported_datablocks:
line = flow.row(align=True)
line.prop(item, "auto_push", text="", icon=item.icon)
line.separator()
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
else:
replication_timers = replication_timers.row()
replication_timers.label(text="Update rate (ms):")
replication_timers.prop(settings, "depsgraph_update_rate", text="")
cache_section = layout.row().box()
cache_section.prop(
settings,
"sidebar_advanced_cache_expanded",
text="Cache",
icon=get_expanded_icon(settings.sidebar_advanced_cache_expanded),
emboss=False)
if settings.sidebar_advanced_cache_expanded:
cache_section_row = cache_section.row()
cache_section_row.label(text="Cache directory:")
cache_section_row = cache_section.row()
cache_section_row.prop(settings, "cache_directory", text="")
cache_section_row = cache_section.row()
cache_section_row.label(text="Clear memory filecache:")
cache_section_row.prop(settings, "clear_memory_filecache", text="")
cache_section_row = cache_section.row()
cache_section_row.operator('session.clear_cache', text=f"Clear cache ({get_folder_size(settings.cache_directory)})")
log_section = layout.row().box()
log_section.prop(
settings,
"sidebar_advanced_log_expanded",
text="Logging",
icon=get_expanded_icon(settings.sidebar_advanced_log_expanded),
emboss=False)
if settings.sidebar_advanced_log_expanded:
log_section_row = log_section.row()
log_section_row.label(text="Log level:")
log_section_row.prop(settings, 'logging_level', text="")
class SESSION_PT_user(bpy.types.Panel):
bl_idname = "MULTIUSER_USER_PT_panel"
bl_label = "Online users"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
@classmethod
def poll(cls, context):
return operators.client and operators.client.state['STATE'] == 2
return session and session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
def draw_header(self, context):
self.layout.label(text="", icon='USER')
def draw(self, context):
layout = self.layout
online_users = context.window_manager.online_users
selected_user = context.window_manager.user_index
settings = context.window_manager.session
active_user = online_users[selected_user] if len(online_users)-1>=selected_user else 0
settings = get_preferences()
active_user = online_users[selected_user] if len(
online_users)-1 >= selected_user else 0
runtime_settings = context.window_manager.session
# Create a simple row.
row = layout.row()
box = row.box()
split = box.split(factor=0.5)
split = box.split(factor=0.35)
split.label(text="user")
split = split.split(factor=0.5)
split.label(text="location")
split.label(text="frame")
split.label(text="ping")
row = layout.row()
layout.template_list("SESSION_UL_users", "", context.window_manager, "online_users", context.window_manager, "user_index")
layout.template_list("SESSION_UL_users", "", context.window_manager,
"online_users", context.window_manager, "user_index")
if active_user != 0 and active_user.username != settings.username:
row = layout.row()
user_operations = row.split()
user_operations.alert = context.window_manager.session.time_snap_running
user_operations.operator(
"session.snapview",
text="",
icon='VIEW_CAMERA').target_client = active_user.username
user_operations.alert = context.window_manager.session.user_snap_running
user_operations.operator(
"session.snaptime",
text="",
icon='TIME').target_client = active_user.username
if session.state['STATE'] == STATE_ACTIVE:
user_operations.alert = context.window_manager.session.time_snap_running
user_operations.operator(
"session.snapview",
text="",
icon='VIEW_CAMERA').target_client = active_user.username
user_operations.alert = context.window_manager.session.user_snap_running
user_operations.operator(
"session.snaptime",
text="",
icon='TIME').target_client = active_user.username
if session.online_users[settings.username]['admin']:
user_operations.operator(
"session.kick",
text="",
icon='CANCEL').user = active_user.username
class SESSION_UL_users(bpy.types.UIList):
def draw_item(self, context, layout, data, item, icon, active_data, active_propname, index, flt_flag):
session = operators.client
settings = context.window_manager.session
settings = get_preferences()
is_local_user = item.username == settings.username
ping = '-'
frame_current = '-'
scene_current = '-'
status_icon = 'BLANK1'
if session:
user = session.online_users.get(item.username)
if user:
ping = str(user['latency'])
metadata = user.get('metadata')
if metadata and 'frame_current' in metadata:
frame_current = str(metadata['frame_current'])
split = layout.split(factor=0.5)
split.label(text=item.username)
frame_current = str(metadata.get('frame_current','-'))
scene_current = metadata.get('scene_current','-')
if user['admin']:
status_icon = 'FAKE_USER_ON'
split = layout.split(factor=0.35)
split.label(text=item.username, icon=status_icon)
split = split.split(factor=0.5)
split.label(text=scene_current)
split.label(text=frame_current)
split.label(text=ping)
class SESSION_PT_presence(bpy.types.Panel):
bl_idname = "MULTIUSER_MODULE_PT_panel"
bl_label = "Presence overlay"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
return not operators.client \
or (operators.client and operators.client.state['STATE'] in [STATE_INITIAL, STATE_ACTIVE])
return not session \
or (session and session.state['STATE'] in [STATE_INITIAL, STATE_ACTIVE])
def draw_header(self, context):
self.layout.prop(context.window_manager.session, "enable_presence", text="")
self.layout.prop(context.window_manager.session,
"enable_presence", text="",icon='OVERLAY')
def draw(self, context):
layout = self.layout
@@ -330,50 +476,24 @@ class SESSION_PT_presence(bpy.types.Panel):
settings = context.window_manager.session
layout.active = settings.enable_presence
col = layout.column()
col.prop(settings,"presence_show_selected")
col.prop(settings,"presence_show_user")
row = layout.row()
class SESSION_PT_services(bpy.types.Panel):
bl_idname = "MULTIUSER_SERVICE_PT_panel"
bl_label = "Services"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
return operators.client and operators.client.state['STATE'] == 2
def draw(self, context):
layout = self.layout
online_users = context.window_manager.online_users
selected_user = context.window_manager.user_index
settings = context.window_manager.session
active_user = online_users[selected_user] if len(online_users)-1>=selected_user else 0
# Create a simple row.
for name, state in operators.client.services_state.items():
row = layout.row()
row.label(text=name)
row.label(text=get_state_str(state))
col.prop(settings, "presence_show_selected")
col.prop(settings, "presence_show_user")
row = layout.column()
row.active = settings.presence_show_user
row.prop(settings, "presence_show_far_user")
def draw_property(context, parent, property_uuid, level=0):
settings = context.window_manager.session
item = operators.client.get(uuid=property_uuid)
if item.state == ERROR:
return
settings = get_preferences()
runtime_settings = context.window_manager.session
item = session.get(uuid=property_uuid)
area_msg = parent.row(align=True)
if level > 0:
for i in range(level):
area_msg.label(text="")
if item.state == ERROR:
area_msg.alert=True
else:
area_msg.alert=False
line = area_msg.box()
name = item.data['name'] if item.data else item.uuid
@@ -381,22 +501,21 @@ def draw_property(context, parent, property_uuid, level=0):
detail_item_box = line.row(align=True)
detail_item_box.label(text="",
icon=settings.supported_datablock[item.str_type].icon)
detail_item_box.label(text="{} ".format(name))
icon=settings.supported_datablocks[item.str_type].icon)
detail_item_box.label(text=f"{name}")
# Operations
have_right_to_modify = settings.is_admin or \
item.owner == settings.username or \
item.owner == RP_COMMON
have_right_to_modify = (item.owner == settings.username or \
item.owner == RP_COMMON) and item.state != ERROR
if have_right_to_modify:
detail_item_box.operator(
"session.commit",
text="",
icon='TRIA_UP').target = item.uuid
detail_item_box.separator()
if item.state in [FETCHED, UP]:
detail_item_box.operator(
"session.apply",
@@ -424,17 +543,26 @@ def draw_property(context, parent, property_uuid, level=0):
else:
detail_item_box.label(text="", icon="DECORATE_LOCKED")
class SESSION_PT_outliner(bpy.types.Panel):
class SESSION_PT_repository(bpy.types.Panel):
bl_idname = "MULTIUSER_PROPERTIES_PT_panel"
bl_label = "Properties"
bl_label = "Repository"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = "Multiuser"
bl_parent_id = 'MULTIUSER_SETTINGS_PT_panel'
@classmethod
def poll(cls, context):
return operators.client and operators.client.state['STATE'] == 2
settings = get_preferences()
admin = False
if session and hasattr(session,'online_users'):
usr = session.online_users.get(settings.username)
if usr:
admin = usr['admin']
return hasattr(context.window_manager, 'session') and \
session and \
(session.state['STATE'] == STATE_ACTIVE or \
session.state['STATE'] == STATE_LOBBY and admin)
def draw_header(self, context):
self.layout.label(text="", icon='OUTLINER_OB_GROUP_INSTANCE')
@@ -442,9 +570,15 @@ class SESSION_PT_outliner(bpy.types.Panel):
def draw(self, context):
layout = self.layout
if hasattr(context.window_manager, 'session'):
# Filters
settings = context.window_manager.session
# Filters
settings = get_preferences()
runtime_settings = context.window_manager.session
usr = session.online_users.get(settings.username)
row = layout.row()
if session.state['STATE'] == STATE_ACTIVE:
flow = layout.grid_flow(
row_major=True,
columns=0,
@@ -452,27 +586,27 @@
even_rows=False,
align=True)
for item in settings.supported_datablock:
for item in settings.supported_datablocks:
col = flow.column(align=True)
col.prop(item, "use_as_filter", text="", icon=item.icon)
row = layout.row(align=True)
row.prop(settings, "filter_owned", text="Show only owned")
row.prop(runtime_settings, "filter_owned", text="Show only owned")
row = layout.row(align=True)
# Properties
types_filter = [t.type_name for t in settings.supported_datablock
types_filter = [t.type_name for t in settings.supported_datablocks
if t.use_as_filter]
key_to_filter = operators.client.list(
filter_owner=settings.username) if settings.filter_owned else operators.client.list()
key_to_filter = session.list(
filter_owner=settings.username) if runtime_settings.filter_owned else session.list()
client_keys = [key for key in key_to_filter
if operators.client.get(uuid=key).str_type
if session.get(uuid=key).str_type
in types_filter]
if client_keys and len(client_keys) > 0:
if client_keys:
col = layout.column(align=True)
for key in client_keys:
draw_property(context, col, key)
@@ -480,6 +614,40 @@
else:
row.label(text="Empty")
elif session.state['STATE'] == STATE_LOBBY and usr and usr['admin']:
row.operator("session.init", icon='TOOL_SETTINGS', text="Init")
else:
row.label(text="Waiting to start")
class VIEW3D_PT_overlay_session(bpy.types.Panel):
bl_space_type = 'VIEW_3D'
bl_region_type = 'HEADER'
bl_parent_id = 'VIEW3D_PT_overlay'
bl_label = "Multi-user"
@classmethod
def poll(cls, context):
return True
def draw(self, context):
layout = self.layout
view = context.space_data
overlay = view.overlay
display_all = overlay.show_overlays
col = layout.column()
col.active = display_all
row = col.row(align=True)
settings = context.window_manager.session
layout.active = settings.enable_presence
col = layout.column()
col.prop(settings, "presence_show_selected")
col.prop(settings, "presence_show_user")
row = layout.column()
row.active = settings.presence_show_user
row.prop(settings, "presence_show_far_user")
classes = (
SESSION_UL_users,
@@ -487,10 +655,10 @@ classes = (
SESSION_PT_settings_user,
SESSION_PT_settings_network,
SESSION_PT_presence,
SESSION_PT_settings_replication,
SESSION_PT_advanced_settings,
SESSION_PT_user,
SESSION_PT_outliner,
SESSION_PT_services
SESSION_PT_repository,
VIEW3D_PT_overlay_session,
)


@@ -1,31 +1,35 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import json
import logging
import os
import random
import string
import sys
from uuid import uuid4
import time
from collections.abc import Iterable
from pathlib import Path
from uuid import uuid4
import math
import bpy
import mathutils
from . import environment, presence
from .libs import dump_anything
logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)
def has_action(target):
return (hasattr(target, 'animation_data')
and target.animation_data
and target.animation_data.action)
def has_driver(target):
return (hasattr(target, 'animation_data')
and target.animation_data
and target.animation_data.drivers)
def find_from_attr(attr_name, attr_value, list):
@@ -37,7 +41,7 @@ def find_from_attr(attr_name, attr_value, list):
def get_datablock_users(datablock):
users = []
supported_types = bpy.context.window_manager.session.supported_datablock
supported_types = get_preferences().supported_datablocks
if hasattr(datablock, 'users_collection') and datablock.users_collection:
users.extend(list(datablock.users_collection))
if hasattr(datablock, 'users_scene') and datablock.users_scene:
@@ -45,7 +49,7 @@ def get_datablock_users(datablock):
if hasattr(datablock, 'users_group') and datablock.users_scene:
users.extend(list(datablock.users_scene))
for datatype in supported_types:
if datatype.bl_name != 'users':
if datatype.bl_name != 'users' and hasattr(bpy.data, datatype.bl_name):
root = getattr(bpy.data, datatype.bl_name)
for item in root:
if hasattr(item, 'data') and datablock == item.data or \
@@ -54,12 +58,6 @@
return users
def random_string_digits(stringLength=6):
"""Generate a random string of letters and digits """
lettersAndDigits = string.ascii_letters + string.digits
return ''.join(random.choices(lettersAndDigits, k=stringLength))
def clean_scene():
for type_name in dir(bpy.data):
try:
@@ -70,89 +68,10 @@ def clean_scene():
continue
def revers(d):
l = []
for i in d:
l.append(i)
return l[::-1]
def get_armature_edition_context(armature):
override = {}
# Set correct area
for area in bpy.data.window_managers[0].windows[0].screen.areas:
if area.type == 'VIEW_3D':
override = bpy.context.copy()
override['area'] = area
break
# Set correct armature settings
override['window'] = bpy.data.window_managers[0].windows[0]
override['screen'] = bpy.data.window_managers[0].windows[0].screen
override['mode'] = 'EDIT_ARMATURE'
override['active_object'] = armature
override['selected_objects'] = [armature]
for o in bpy.data.objects:
if o.data == armature:
override['edit_object'] = o
break
return override
def get_selected_objects(scene, active_view_layer):
return [obj.uuid for obj in scene.objects if obj.select_get(view_layer=active_view_layer)]
def load_dict(src_dict, target):
try:
for item in src_dict:
# attr =
setattr(target, item, src_dict[item])
except Exception as e:
logger.error(e)
pass
def dump_datablock(datablock, depth):
if datablock:
dumper = dump_anything.Dumper()
dumper.type_subset = dumper.match_subset_all
dumper.depth = depth
datablock_type = datablock.bl_rna.name
key = "{}/{}".format(datablock_type, datablock.name)
data = dumper.dump(datablock)
return data
def dump_datablock_attibutes(datablock=None, attributes=[], depth=1, dickt=None):
if datablock:
dumper = dump_anything.Dumper()
dumper.type_subset = dumper.match_subset_all
dumper.depth = depth
datablock_type = datablock.bl_rna.name
data = {}
if dickt:
data = dickt
for attr in attributes:
try:
data[attr] = dumper.dump(getattr(datablock, attr))
except:
pass
return data
def resolve_from_id(id, optionnal_type=None):
for category in dir(bpy.data):
root = getattr(bpy.data, category)
@@ -160,4 +79,76 @@ def resolve_from_id(id, optionnal_type=None):
if id in root and ((optionnal_type is None) or (optionnal_type.lower() in root[id].__class__.__name__.lower())):
return root[id]
return None
def get_preferences():
return bpy.context.preferences.addons[__package__].preferences
def current_milli_time():
return int(round(time.time() * 1000))
def get_expanded_icon(prop: bpy.types.BoolProperty) -> str:
if prop:
return 'DISCLOSURE_TRI_DOWN'
else:
return 'DISCLOSURE_TRI_RIGHT'
# Taken from here: https://stackoverflow.com/a/55659577
def get_folder_size(folder):
return ByteSize(sum(file.stat().st_size for file in Path(folder).rglob('*')))
class ByteSize(int):

    _kB = 1024
    # note: the original snippet skipped 'TB', so _kB**4 was mislabeled petabytes
    _suffixes = 'B', 'kB', 'MB', 'GB', 'TB', 'PB'

    def __new__(cls, *args, **kwargs):
        return super().__new__(cls, *args, **kwargs)

    def __init__(self, *args, **kwargs):
        self.bytes = self.B = int(self)
        self.kilobytes = self.kB = self / self._kB**1
        self.megabytes = self.MB = self / self._kB**2
        self.gigabytes = self.GB = self / self._kB**3
        self.terabytes = self.TB = self / self._kB**4
        self.petabytes = self.PB = self / self._kB**5
        *suffixes, last = self._suffixes
        suffix = next((
            suffix
            for suffix in suffixes
            if 1 < getattr(self, suffix) < self._kB
        ), last)
        self.readable = suffix, getattr(self, suffix)

        super().__init__()

    def __str__(self):
        return self.__format__('.2f')

    def __repr__(self):
        return '{}({})'.format(self.__class__.__name__, super().__repr__())

    def __format__(self, format_spec):
        suffix, val = self.readable
        return '{val:{fmt}} {suf}'.format(val=math.ceil(val), fmt=format_spec, suf=suffix)

    def __sub__(self, other):
        return self.__class__(super().__sub__(other))

    def __add__(self, other):
        return self.__class__(super().__add__(other))

    def __mul__(self, other):
        return self.__class__(super().__mul__(other))

    def __rsub__(self, other):
        return self.__class__(super().__rsub__(other))

    def __radd__(self, other):
        return self.__class__(super().__radd__(other))

    def __rmul__(self, other):
        return self.__class__(super().__rmul__(other))
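The suffix-selection walk in `__init__` above can be sketched as a standalone function (a hypothetical helper for illustration, not part of the addon; it skips the `math.ceil` rounding that `__format__` applies):

```python
def human_readable(num_bytes, base=1024):
    """Pick the first suffix for which the scaled value lands in (1, base)."""
    suffixes = ('B', 'kB', 'MB', 'GB', 'TB', 'PB')
    for i, suffix in enumerate(suffixes[:-1]):
        value = num_bytes / base ** i
        if 1 < value < base:
            return f"{value:.2f} {suffix}"
    # fall through to the last suffix, as ByteSize does
    return f"{num_bytes / base ** (len(suffixes) - 1):.2f} {suffixes[-1]}"

print(human_readable(1536))          # 1.50 kB
print(human_readable(3 * 1024**2))   # 3.00 MB
```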

@@ -0,0 +1,24 @@
# Base image
FROM python:slim

ARG replication_version=0.0.21a15
ARG version=0.1.0

# Info
LABEL maintainer="Swann Martinez"
LABEL version=$version
LABEL description="Blender multi-user addon \
dedicated server image."

# Runtime parameters
ENV password='admin'
ENV port=5555
ENV timeout=3000
ENV log_level=INFO
ENV log_file="multiuser_server.log"

# Install replication
RUN pip install replication==$replication_version

# Run the server with parameters
CMD replication.serve -pwd ${password} -p ${port} -t ${timeout} -l ${log_level} -lf ${log_file}
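A typical build-and-run of this image might look like the following; the tag name, port mapping, and overrides are illustrative (not taken from the repository CI), and the replication server may listen on ports beyond the one set by `port`:

```shell
# Build, pinning the replication dependency declared via ARG
docker build -t multi-user-server --build-arg replication_version=0.0.21a15 .

# Run, overriding the ENV defaults baked into the image
docker run -d -p 5555:5555 -e password=secret -e log_level=DEBUG multi-user-server
```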

@@ -0,0 +1,6 @@
import re

init_py = open("multi_user/__init__.py").read()
version = re.search(r"\d+, \d+, \d+", init_py).group(0)
digits = version.split(',')
print('.'.join(digits).replace(" ", ""))
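Applied to a line shaped like the addon's `bl_info` version tuple (the sample line below is illustrative, not the real file contents), the extraction behaves like this:

```python
import re

# a sample line shaped like the bl_info "version" entry
sample = '"version": (0, 1, 0),'

version = re.search(r"\d+, \d+, \d+", sample).group(0)   # "0, 1, 0"
print('.'.join(version.split(',')).replace(" ", ""))     # 0.1.0
```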

@@ -0,0 +1,4 @@
import re

init_py = open("multi_user/__init__.py").read()
print(re.search(r"\d+\.\d+\.\d+\w\d+|\d+\.\d+\.\d+", init_py).group(0))

scripts/test_addon.py Normal file
@@ -0,0 +1,25 @@
import sys

try:
    import blender_addon_tester as BAT
except Exception as e:
    print(e)
    sys.exit(1)

def main():
    if len(sys.argv) > 1:
        addon = sys.argv[1]
    else:
        addon = "multi_user"
    if len(sys.argv) > 2:
        blender_rev = sys.argv[2]
    else:
        blender_rev = "2.90.0"

    try:
        exit_val = BAT.test_blender_addon(addon_path=addon, blender_revision=blender_rev)
    except Exception as e:
        print(e)
        exit_val = 1
    sys.exit(exit_val)

if __name__ == "__main__":
    main()

@@ -0,0 +1,25 @@
import os

import pytest

import bpy

@pytest.fixture
def clear_blend():
    """ Remove all datablocks of a blend
    """
    for type_name in dir(bpy.data):
        try:
            type_collection = getattr(bpy.data, type_name)
            # iterate over a copy: removing while iterating invalidates the collection
            for item in list(type_collection):
                type_collection.remove(item)
        except Exception:
            continue

@pytest.fixture
def load_blendfile(blendname):
    print(f"loading {blendname}")
    dir_path = os.path.dirname(os.path.realpath(__file__))
    bpy.ops.wm.open_mainfile(filepath=os.path.join(dir_path, blendname))

@@ -0,0 +1,38 @@
import os
import random

import bpy
import pytest
from deepdiff import DeepDiff

from multi_user.bl_types.bl_action import BlAction

INTERPOLATION = ['CONSTANT', 'LINEAR', 'BEZIER', 'SINE', 'QUAD', 'CUBIC', 'QUART', 'QUINT', 'EXPO', 'CIRC', 'BACK', 'BOUNCE', 'ELASTIC']

# @pytest.mark.parametrize('blendname', ['test_action.blend'])
def test_action(clear_blend):
    # Generate a random action
    datablock = bpy.data.actions.new("sdsad")
    fcurve_sample = datablock.fcurves.new('location')
    fcurve_sample.keyframe_points.add(100)
    datablock.id_root = 'MESH'

    for i, point in enumerate(fcurve_sample.keyframe_points):
        point.co[0] = i
        point.co[1] = random.randint(-10, 10)
        point.interpolation = INTERPOLATION[random.randint(0, len(INTERPOLATION) - 1)]

    bpy.ops.mesh.primitive_plane_add()
    bpy.data.objects[0].animation_data_create()
    bpy.data.objects[0].animation_data.action = datablock

    # Test
    implementation = BlAction()
    expected = implementation._dump(datablock)
    bpy.data.actions.remove(datablock)

    test = implementation._construct(expected)
    implementation._load(expected, test)
    result = implementation._dump(test)

    assert not DeepDiff(expected, result)
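Each bl_types test follows the same dump → construct → load → dump round-trip contract. A toy stand-in (illustrative only; plain dicts instead of Blender datablocks, equality instead of DeepDiff) shows the shape of that contract:

```python
# Toy stand-in for the _dump/_construct/_load protocol the bl_types tests exercise.
class ToyImplementation:
    def _dump(self, datablock):
        # serialize live state to plain data
        return dict(datablock)

    def _construct(self, data):
        # build an empty target to load into
        return {}

    def _load(self, data, target):
        # apply the serialized state onto the target
        target.update(data)

impl = ToyImplementation()
expected = impl._dump({'name': 'sdsad', 'id_root': 'MESH'})
test = impl._construct(expected)
impl._load(expected, test)
result = impl._dump(test)
assert result == expected  # round-trip preserved every field
```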

@@ -0,0 +1,22 @@
import os

import bpy
import pytest
from deepdiff import DeepDiff

from multi_user.bl_types.bl_armature import BlArmature

def test_armature(clear_blend):
    bpy.ops.object.armature_add()
    datablock = bpy.data.armatures[0]

    implementation = BlArmature()
    expected = implementation._dump(datablock)
    bpy.data.armatures.remove(datablock)

    test = implementation._construct(expected)
    implementation._load(expected, test)
    result = implementation._dump(test)

    assert not DeepDiff(expected, result)

@@ -0,0 +1,25 @@
import os

import bpy
import pytest
from deepdiff import DeepDiff

from multi_user.bl_types.bl_camera import BlCamera

@pytest.mark.parametrize('camera_type', ['PANO', 'PERSP', 'ORTHO'])
def test_camera(clear_blend, camera_type):
    bpy.ops.object.camera_add()
    datablock = bpy.data.cameras[0]
    datablock.type = camera_type

    camera_dumper = BlCamera()
    expected = camera_dumper._dump(datablock)
    bpy.data.cameras.remove(datablock)

    test = camera_dumper._construct(expected)
    camera_dumper._load(expected, test)
    result = camera_dumper._dump(test)

    assert not DeepDiff(expected, result)

@@ -0,0 +1,33 @@
import os
from uuid import uuid4

import bpy
import pytest
from deepdiff import DeepDiff

from multi_user.bl_types.bl_collection import BlCollection

def test_collection(clear_blend):
    # Generate a collection with children and a cube
    datablock = bpy.data.collections.new("root")
    datablock.uuid = str(uuid4())

    s1 = bpy.data.collections.new("child")
    s1.uuid = str(uuid4())
    s2 = bpy.data.collections.new("child2")
    s2.uuid = str(uuid4())
    datablock.children.link(s1)
    datablock.children.link(s2)

    bpy.ops.mesh.primitive_cube_add()
    datablock.objects.link(bpy.data.objects[0])

    # Test
    implementation = BlCollection()
    expected = implementation._dump(datablock)
    bpy.data.collections.remove(datablock)

    test = implementation._construct(expected)
    implementation._load(expected, test)
    result = implementation._dump(test)

    assert not DeepDiff(expected, result)

@@ -0,0 +1,29 @@
import os

import bpy
import pytest
from deepdiff import DeepDiff

from multi_user.bl_types.bl_curve import BlCurve

@pytest.mark.parametrize('curve_type', ['TEXT', 'BEZIER'])
def test_curve(clear_blend, curve_type):
    if curve_type == 'TEXT':
        bpy.ops.object.text_add(enter_editmode=False, align='WORLD', location=(0, 0, 0))
    elif curve_type == 'BEZIER':
        bpy.ops.curve.primitive_bezier_curve_add(enter_editmode=False, align='WORLD', location=(0, 0, 0))
    else:  # TODO: NURBS support
        bpy.ops.surface.primitive_nurbs_surface_curve_add(radius=1, enter_editmode=False, align='WORLD', location=(0, 0, 0))

    datablock = bpy.data.curves[0]

    implementation = BlCurve()
    expected = implementation._dump(datablock)
    bpy.data.curves.remove(datablock)

    test = implementation._construct(expected)
    implementation._load(expected, test)
    result = implementation._dump(test)

    assert not DeepDiff(expected, result)

@@ -0,0 +1,23 @@
import os

import bpy
import pytest
from deepdiff import DeepDiff

from multi_user.bl_types.bl_gpencil import BlGpencil

def test_gpencil(clear_blend):
    bpy.ops.object.gpencil_add(type='MONKEY')
    datablock = bpy.data.grease_pencils[0]

    implementation = BlGpencil()
    expected = implementation._dump(datablock)
    bpy.data.grease_pencils.remove(datablock)

    test = implementation._construct(expected)
    implementation._load(expected, test)
    result = implementation._dump(test)

    assert not DeepDiff(expected, result)

Some files were not shown because too many files have changed in this diff.