Compare commits

...

221 commits

Author SHA1 Message Date
Bart Schuurmans 5fcdfbc9c6 WIP: Add Series model 2024-04-26 12:53:04 +02:00
Mouse Reeve c4b21ee258
Merge pull request #3114 from SMillerDev/feat/api/oauth
feat: add OAuth authentication
2024-04-24 15:45:54 -07:00
Mouse Reeve ad830dd885
Merge pull request #3350 from Minnozz/custom-port
Correctly handle serving BookWyrm on custom port
2024-04-24 15:27:01 -07:00
Mouse Reeve 366c647585
Merge pull request #3359 from bookwyrm-social/dependabot/pip/aiohttp-3.9.4
Bump aiohttp from 3.9.2 to 3.9.4
2024-04-24 15:13:30 -07:00
Bart Schuurmans 4f58b11330 Include the correct protocol and port in remote IDs 2024-04-24 15:35:19 +02:00
Bart Schuurmans 609bc15406 Support http:// protocol in BookWyrm connector 2024-04-24 15:30:47 +02:00
Bart Schuurmans c42db40a63 Construct absolute URLs with the correct protocol and port 2024-04-24 15:30:47 +02:00
Bart Schuurmans 3aefbb548e Allow serving BookWyrm on a non-standard port 2024-04-24 15:30:47 +02:00
Bart Schuurmans baea105c18 pytest.ini env values should be unquoted
Otherwise the quotes end up in the strings.
2024-04-24 15:30:47 +02:00
Bart Schuurmans c73d1fff6a Remove unnecessary exceptions from validate_url_domain 2024-04-24 15:30:47 +02:00
Bart Schuurmans 3d183a393f
Merge pull request #3360 from hughrun/move-fix
refactor Move for more redundancy
2024-04-24 15:30:19 +02:00
Bart Schuurmans f24fdf73b5 Update to match newer code style 2024-04-24 15:08:48 +02:00
Bart Schuurmans 839ab2fafd
Merge branch 'main' into move-fix 2024-04-24 14:56:32 +02:00
Bart Schuurmans 637f19b208
Merge pull request #3336 from Minnozz/s3-url-protocol
Support AWS_S3_URL_PROTOCOL
2024-04-24 14:53:55 +02:00
Bart Schuurmans 031223104f Clarify AWS_S3_URL_PROTOCOL in .env.example 2024-04-24 14:46:57 +02:00
Hugh Rundle 6684d60526
refactor Move for more redundancy
As outlined in #3354, a user `Move` fails if the user is moving from one BookWyrm server to another BookWyrm server.
This is because:

1. the original code did not announce changes to alsoKnownAs;
2. the original code always checked the locally saved profile rather than refetching the remote data;

This commit fixes both these problems by forcing `MoveUser` to always perform a "refresh" of the local data from the remote, and by saving the user with broadcast=True when updating alsoKnownAs ids.
2024-04-22 13:35:08 +10:00
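
A loose sketch of the two fixes described in this commit message, using assumed helper and field names rather than the actual BookWyrm code:

```python
# Loose sketch with assumed helper/field names -- not the actual BookWyrm code.
def validate_move(old_account, new_account_remote_id):
    # Re-fetch the new account from its home server instead of relying on a
    # stale local copy, so its alsoKnownAs list is current.
    new_account = fetch_remote_user(new_account_remote_id, refresh=True)  # assumed helper
    # A Move is only valid if the new account lists the old one as an alias.
    if old_account.remote_id not in new_account.also_known_as_ids:  # assumed field
        raise PermissionError("new account does not list the old one in alsoKnownAs")
    return new_account

def update_also_known_as(user, alias_user):
    # Broadcast the alsoKnownAs change so other servers see it,
    # instead of saving it only locally.
    user.also_known_as.add(alias_user)
    user.save(broadcast=True)  # assumed federated-save signature
```
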
dependabot[bot] cca58023ed
Bump aiohttp from 3.9.2 to 3.9.4
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.9.2 to 3.9.4.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.9.2...v3.9.4)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-18 15:51:34 +00:00
Bart Schuurmans bf5c08dbf3 Add docker-compose.override.yml to .gitignore 2024-04-15 13:17:00 +02:00
Bart Schuurmans be872ed672 Support AWS_S3_URL_PROTOCOL
- Allow setting in .env
- Default to PROTOCOL (same as before)
- Propagate to django-storages so it generates the correct URLs in sass_src
2024-04-15 13:16:51 +02:00
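
A rough settings.py sketch of how such a setting could be wired up, assuming the environs-based configuration style; django-storages reads an AWS_S3_URL_PROTOCOL setting when building object URLs:

```python
# Hypothetical settings.py excerpt; BookWyrm's actual settings module may differ.
from environs import Env

env = Env()
env.read_env()

USE_HTTPS = env.bool("USE_HTTPS", True)
PROTOCOL = "https" if USE_HTTPS else "http"

# Default to the instance protocol (same behaviour as before); allow overriding
# for S3-compatible services served over a different scheme. django-storages
# picks this setting up when it builds object URLs.
AWS_S3_URL_PROTOCOL = env("AWS_S3_URL_PROTOCOL", f"{PROTOCOL}:")
```
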
Bart Schuurmans 70f803a1f6
Merge pull request #3353 from dato/fix_quotation_str_pagenum
Fix creation of quotations with no end position
2024-04-15 13:11:55 +02:00
Adeodato Simó 4304cd4a79
use re.escape 2024-04-13 21:26:41 -03:00
Adeodato Simó 8733369605
test_quotation_page_serialization: add test with no position 2024-04-13 21:26:41 -03:00
Adeodato Simó df78cc64a6
Quotation._format_position: do not treat page numbers as integers
Fixes: #3352
2024-04-13 21:26:41 -03:00
Adeodato Simó f844abcad9
test_quotation_page_serialization: use strings for page numbers
This follows from #3273, "Allow page numbers to be text, instead of
integers".
2024-04-13 21:26:39 -03:00
Bart Schuurmans 21a39f8170
Merge pull request #3228 from hughrun/user-export
Fix user exports to deal with s3 storage
2024-04-13 22:53:58 +02:00
Hugh Rundle c3c46144fe
add merge migration 2024-04-13 12:39:40 +10:00
Hugh Rundle d48d312c0a
Merge branch 'main' into user-export 2024-04-13 12:26:13 +10:00
Hugh Rundle 501fb45528
export avatars to own directory
Saving avatars to /images is problematic because it changes the original filepath from avatars/filename to images/avatars/filename.
In this PR, prior to this commit, imports failed because they were looking for a file path beginning with "avatar"
2024-04-13 12:03:35 +10:00
Bart Schuurmans 7d581759da
Merge pull request #3342 from hbrunn/main-pilkit
[FIX] make sure to get Pillow>=10 compatible pilkit
2024-04-11 14:52:22 +02:00
Bart Schuurmans d5a536ae36 Change pilkit constraint to the version that does work 2024-04-11 14:45:13 +02:00
Bart Schuurmans 26f92db5d8 Merge branch 'main' into main-pilkit 2024-04-11 14:43:10 +02:00
Bart Schuurmans 5686c5ae5d
Merge pull request #3356 from Minnozz/quick-fix-frontend-ci
Install same version of eslint in CI as in dev-tools
2024-04-10 22:10:07 +02:00
Bart Schuurmans 9d9e64399c Install same version of eslint in CI as in dev-tools 2024-04-10 21:26:34 +02:00
Mouse Reeve b6aba44e42
Merge pull request #3355 from bookwyrm-social/merge-migration
Adds merge migration
2024-04-09 06:04:15 -05:00
Mouse Reeve 3ffbb242a4 Black 2024-04-09 05:59:01 -05:00
Mouse Reeve af0bd90c15 Adds merge migration 2024-04-09 05:57:27 -05:00
Mouse Reeve 73630331d1
Merge pull request #3299 from Minnozz/absorb
Track which Author/Work/Edition a duplicate has been merged into
2024-04-09 05:55:44 -05:00
Mouse Reeve ca6dbcb483
Merge pull request #3348 from Minnozz/more-indexes
Define more indexes for slow queries
2024-04-04 15:18:07 -07:00
Bart Schuurmans e1c54b2933 Remove optimizations with adverse effects
`if not audience` actually causes the entire query to be evaluated before .values_list() is called.
2024-04-04 13:47:56 +02:00
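
A generic Django illustration of the pitfall noted above (not BookWyrm code): a truthiness check forces the whole queryset to be evaluated, while staying lazy keeps it to one query.

```python
from django.contrib.auth.models import User  # stand-in model for illustration

def audience_ids_eager():
    audience = User.objects.filter(is_active=True)
    if not audience:  # bool() evaluates the queryset and caches every full row
        return []
    # values_list() builds a new queryset, so a second query runs here
    return list(audience.values_list("id", flat=True))

def audience_ids_lazy():
    # No truthiness check: a single query, and an empty result is just an empty list
    return list(User.objects.filter(is_active=True).values_list("id", flat=True))
```
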
Bart Schuurmans 439cb3ccaa Remove unnecessary conversions between list and set 2024-04-04 13:15:31 +02:00
Bart Schuurmans 321397a349 Specify which column DISTINCT should apply to 2024-04-03 21:28:22 +02:00
Bart Schuurmans 464a0298c6 Add index for finding active (and local) users 2024-04-03 21:27:52 +02:00
Bart Schuurmans 0501ce39cd Add index for looking up User by username 2024-04-03 21:15:24 +02:00
Bart Schuurmans 4d5a30d953 Add index for looking up KeyPair by remote id 2024-04-03 21:11:27 +02:00
Bart Schuurmans 5cfe7eca6f Add index for finding all statuses in a thread 2024-04-03 21:11:09 +02:00
Bart Schuurmans 5082806b82
Merge pull request #3338 from Minnozz/fix-nginx-location
Make nginx config safer
2024-04-03 19:22:16 +02:00
Mouse Reeve d1d91f0c2b
Merge pull request #3347 from bookwyrm-social/dependabot/pip/pillow-10.3.0
Bump pillow from 10.2.0 to 10.3.0
2024-04-03 10:01:59 -07:00
dependabot[bot] ea0ade955b
Bump pillow from 10.2.0 to 10.3.0
Bumps [pillow](https://github.com/python-pillow/Pillow) from 10.2.0 to 10.3.0.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](https://github.com/python-pillow/Pillow/compare/10.2.0...10.3.0)

---
updated-dependencies:
- dependency-name: pillow
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-03 16:45:11 +00:00
Mouse Reeve f085d3d0fe
Merge pull request #3346 from Minnozz/status-remote-id-index
Add index on Status.remote_id
2024-04-02 13:02:35 -07:00
Bart Schuurmans 4bbdd0b2d0 Add index on Status.remote_id
This field is often used in WHERE-clauses in queries that are very slow on bookwyrm.social.
2024-04-02 21:54:30 +02:00
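
Declaring an index like this on a Django model typically goes through Meta.indexes; a trimmed-down sketch of the pattern (not the exact BookWyrm model):

```python
from django.db import models

class Status(models.Model):
    # Trimmed-down sketch; the real model has many more fields.
    remote_id = models.CharField(max_length=255, null=True)

    class Meta:
        indexes = [
            # Speeds up WHERE remote_id = ... lookups used during federation.
            models.Index(fields=["remote_id"], name="status_remote_id_idx"),
        ]
```
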
Sean Molenaar d5fb21f330
Merge branch 'main' into feat/api/oauth 2024-04-01 22:35:19 +02:00
Mouse Reeve f28800af7f
Merge pull request #3339 from Minnozz/fix-file-leaks
Fix resource leaks
2024-03-31 12:43:19 -07:00
Mouse Reeve cb3fd0cfc1
Merge branch 'main' into feat/api/oauth 2024-03-31 12:41:12 -07:00
Bart Schuurmans 72ed878eeb
Merge pull request #3343 from Minnozz/update-codeql
Update CodeQL workflows to v3
2024-03-30 22:01:49 +01:00
Bart Schuurmans f666951934 Update CodeQL workflows to v3
https://github.blog/changelog/2024-01-12-code-scanning-deprecation-of-codeql-action-v2/
2024-03-30 21:56:44 +01:00
Holger Brunn fcd0087589 [FIX] make sure to get Pillow>=10 compatible pilkit 2024-03-30 01:58:41 +01:00
Bart Schuurmans ffee29d8e2 Fix resource leaks
Rewrite places where files (or other resources) are opened but not closed to use "with" blocks, which
automatically call close() at the end of the scope.

Also simplify some tests where images need to be saved to a model field: an opened file can be
passed directly to FileField.save().
2024-03-29 20:14:10 +01:00
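
The pattern in question, sketched with illustrative names (an assumed edition model with a cover field):

```python
from django.core.files import File

def read_cover_bytes(path):
    # The "with" block closes the handle when the block ends, even if read()
    # raises, instead of leaking it until garbage collection.
    with open(path, "rb") as infile:
        return infile.read()

def attach_cover(edition, path):
    # An opened file can be handed straight to an ImageField/FileField save();
    # `edition` is assumed to be a model instance with a `cover` field.
    with open(path, "rb") as infile:
        edition.cover.save("cover.jpg", File(infile))
```
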
Bart Schuurmans 75bc4f8cb0 Make nginx config safer
Instead of allowing all image files anywhere and disallowing non-image files under /images/, only
allow image files under /images/ and don't match non-image files elsewhere. They get proxied to web
instead and result in a 404 there.

For example, the old config allowed /exports/foo.jpg to be served, while the new config does not.
2024-03-29 15:04:38 +01:00
Bart Schuurmans e7ae0fdf93
Merge pull request #3337 from prolibre/apport-perso
flower 2.0.1 fixes a few link bugs (particularly for favicon)
2024-03-29 14:45:59 +01:00
Bart Schuurmans 5d597f1ca9 Use new "with ()" style 2024-03-29 14:25:08 +01:00
Bart Schuurmans 0ac9d12d1c Merge branch 'main' into user-export 2024-03-29 14:23:10 +01:00
Bart Schuurmans e74de94640
Merge pull request #3334 from ccamara/patch-1
Remove twitter from README.md
2024-03-29 14:21:49 +01:00
Bart Schuurmans 1464d09a43
Merge pull request #3320 from dato/better-fmt-patch-calls
bulk-fmt: bracket-wrap calls to patch() for better readability
2024-03-29 14:19:16 +01:00
Anthony 2272e7a326 flower 2.0.1 fixes a few link bugs (particularly for favicon) 2024-03-29 12:07:52 +01:00
Bart Schuurmans 2bbe3d4c32 Test user export archive contents 2024-03-28 13:50:55 +01:00
Bart Schuurmans bb5d8152f1 Fix mypy error 2024-03-28 13:21:30 +01:00
Bart Schuurmans dabf7c6e10 User export testing fixes 2024-03-28 13:09:21 +01:00
Bart Schuurmans cdbc1d172c Fix double exports subdir in S3 user export 2024-03-27 23:28:24 +01:00
Adeodato Simó 3133a47b7c
Merge from main into 'better-fmt-patch-calls'
Conflicts:
	bookwyrm/tests/test_book_search.py
2024-03-27 17:13:08 -03:00
Bart Schuurmans c6ca547d58 Fix migration formatting 2024-03-27 20:41:59 +01:00
Bart Schuurmans 797d5cb508 Update BookwyrmExportJob tests 2024-03-27 20:39:57 +01:00
Adeodato Simó 699d637bae
Fix detection of unlisted posts (#3258)
Merged from dato/fix_unlisted_set_from_activity.
2024-03-27 16:29:09 -03:00
Bart Schuurmans 9afd0ebb54 Update migrations 2024-03-27 20:15:06 +01:00
Bart Schuurmans 9685ae5a0a Consolidate BookwyrmExportJob into two tasks
Creating the export JSON and creating the export TAR are now the only two tasks.
2024-03-27 20:13:49 +01:00
Carlos Cámara 98600440d8
Remove twitter from README.md
The Twitter/X account doesn't seem to exist, so removing the badge
2024-03-26 17:14:09 +00:00
Bart Schuurmans ed2e9e5ea8 Merge migration 2024-03-26 13:41:39 +01:00
Bart Schuurmans ef57c0bc8b Check last user export too in post handler 2024-03-26 13:41:39 +01:00
Bart Schuurmans 145c67dd21 Merge BookwyrmExportJob export_data field back into one with dynamic storage backend 2024-03-26 13:41:39 +01:00
Bart Schuurmans 6a67943408
Merge branch 'main' into user-export 2024-03-26 13:15:40 +01:00
Mouse Reeve 9dfa218ba5
Merge pull request #3333 from bookwyrm-social/locales
Updates locales and version number
2024-03-25 16:36:51 -07:00
Mouse Reeve bf52eeaa9e Bump version to 0.7.3. 2024-03-25 16:15:02 -07:00
Mouse Reeve 011e4a27a6 Updates locales and adds missing trimmed on blocktrans 2024-03-25 16:13:00 -07:00
Mouse Reeve 7192449b21
Merge pull request #3325 from Minnozz/author-search-vector
Rework author search
2024-03-25 14:41:25 -07:00
Bart Schuurmans d9bf848cfa Fix pylint warnings 2024-03-25 18:25:43 +01:00
Bart Schuurmans bd95bcd50b Add test for special character in cover filename 2024-03-25 18:14:45 +01:00
Bart Schuurmans f721289b1d Simplify logic for rendering user exports 2024-03-25 18:14:45 +01:00
Bart Schuurmans a51402241b Refactor creation of user export archive 2024-03-25 18:14:45 +01:00
Bart Schuurmans e0decbfd1d Fix urlescaped relative path to cover image in export
Fixes #3292
2024-03-25 17:59:39 +01:00
Bart Schuurmans aee8dc16af Fix pylint warning 2024-03-24 13:27:01 +01:00
Bart Schuurmans 5bd66cb3f7 Only generate signed S3 link to user export when user clicks download 2024-03-24 13:08:33 +01:00
Bart Schuurmans ab7b0893e0 User exports: handle files that no longer exist on file storage 2024-03-24 12:47:26 +01:00
Bart Schuurmans 471233c1dc Use different export job fields for the different storage backends
This way, the database definition is not dependent on the runtime configuration.
2024-03-24 12:46:42 +01:00
Bart Schuurmans 073f62d5bb Add exports_volume to docker-compose.yml
Exports should be written to a Docker volume instead of to the bind mount (= source directory). This
way they are shared between different containers even when they run on different machines.
2024-03-24 12:08:29 +01:00
Bart Schuurmans a770689245 Merge branch 'main' into user-export 2024-03-24 12:07:14 +01:00
Bart Schuurmans 69f464418d Remove problematic migration
This migration is dependent on the runtime configuration (.env); a structural fix will follow.
2024-03-24 12:06:44 +01:00
Bart Schuurmans f11c80162a
Merge pull request #3331 from Minnozz/revert-docker-mount-ro
Revert "docker-compose.yml: make all bind mounts read only"
2024-03-24 11:30:56 +01:00
Bart Schuurmans 7c2fa746ae Revert "docker-compose.yml: make all bind mounts read only"
This reverts commit 864304f128.
2024-03-24 11:23:23 +01:00
Hugh Rundle 03587dfdc7
migrations 2024-03-24 20:56:20 +11:00
Hugh Rundle dd27684d4b
set signed s3 url expiry with env value
Adds S3_SIGNED_URL_EXPIRY val to .env and settings (defaults to 15 mins)
Note that this is reset every time the user loads the exports page
and is independent of the _creation_ of export files.
2024-03-24 20:53:49 +11:00
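
With plain boto3, a presigned download URL with a configurable expiry looks roughly like this; bucket and key names are placeholders, and BookWyrm's actual storage wiring may differ:

```python
import os
import boto3

# Mirrors the S3_SIGNED_URL_EXPIRY setting described above (seconds, default 15 minutes).
expiry = int(os.environ.get("S3_SIGNED_URL_EXPIRY", "900"))

s3 = boto3.client("s3")
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-bucket-name", "Key": "exports/export.tar.gz"},
    ExpiresIn=expiry,
)
# `url` is valid for `expiry` seconds; regenerating it on page load resets the window.
```
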
Bart Schuurmans caebebeb37
Merge pull request #3261 from bSolt/book-series-3256
Add book series by title in feed posts
2024-03-23 20:01:03 +01:00
Bart Schuurmans 592914dc91 Render series number with comma and outside of link on book page 2024-03-23 19:51:20 +01:00
Bart Schuurmans 2915133223
Merge branch 'main' into book-series-3256 2024-03-23 19:37:07 +01:00
Bart Schuurmans 2d2ccd51df Factor out book series info into separate template 2024-03-23 19:35:24 +01:00
Bart Schuurmans 4a690e675a BookDataModel: add dry_run argument to merge_into 2024-03-23 19:28:57 +01:00
Bart Schuurmans fb82c7a579 Add test for merging authors 2024-03-23 19:28:57 +01:00
Bart Schuurmans 6f191acb27 BookDataModel: fix absorbing data from array and partial date fields 2024-03-23 19:28:57 +01:00
Bart Schuurmans 7fb079cb43 PartialDate: fix __eq__ method 2024-03-23 19:28:57 +01:00
Bart Schuurmans 7066e2815b BookDataModel.merge_into: return and log absorbed fields 2024-03-23 19:28:57 +01:00
Bart Schuurmans e04cd79ff8 Redirect to new URL when a merged object is requested 2024-03-23 19:28:57 +01:00
Bart Schuurmans 5e123972e8 BookDataModel: implement merge_into method 2024-03-23 19:28:57 +01:00
Bart Schuurmans b3753ab6da Add MergedBookDataModel 2024-03-23 19:28:57 +01:00
Bart Schuurmans b8995bd4b1 Add tests for author search 2024-03-23 19:26:51 +01:00
Bart Schuurmans 769d9726e5 Add book search test cases for author aliases 2024-03-23 19:26:51 +01:00
Bart Schuurmans 36222afa79 Switch author search from TrigramSimilarity to SearchQuery 2024-03-23 19:26:51 +01:00
Bart Schuurmans 0795b4d171 Include Author aliases in Book search vector 2024-03-23 19:26:51 +01:00
Bart Schuurmans 2de35f3fc7 Calculate Author search vector with name and aliases 2024-03-23 19:26:51 +01:00
Mouse Reeve bac52eef3e
Merge pull request #3275 from ccamara/wikidata
Add wikidata field for authors
2024-03-23 08:12:09 -07:00
Mouse Reeve 8bbac458a6
Merge pull request #3217 from dato/switch_edition_invalidate_active_shelves
Invalidate `active_shelf` when switching editions
2024-03-23 07:59:40 -07:00
Mouse Reeve 5b71e94888
Merge branch 'main' into user-export 2024-03-23 07:55:46 -07:00
Mouse Reeve a914a44fba
Removes unnecessary redeclaration of wikidata model field in Author 2024-03-23 07:54:54 -07:00
Mouse Reeve 8e088a6d53
Merge branch 'main' into switch_edition_invalidate_active_shelves 2024-03-23 07:53:24 -07:00
Mouse Reeve b508b4cd33
Merge pull request #3323 from Minnozz/docker-bind-ro
Docker: make bind mounts of source code read only
2024-03-23 07:51:00 -07:00
Mouse Reeve 886d6ec9f7
Merge branch 'main' into docker-bind-ro 2024-03-23 07:48:27 -07:00
Mouse Reeve 21f75da75e
Merge pull request #3328 from Minnozz/escape-query-in-link
Escape search query in generated URLs
2024-03-23 07:46:04 -07:00
Mouse Reeve 20db968315
Merge pull request #3322 from Minnozz/fix-font-download
Fix font download
2024-03-23 07:36:43 -07:00
Bart Schuurmans c3d25c59c5 Escape search query in generated URLs
Otherwise, a query containing '&' or other special characters results in a broken URL.
2024-03-21 16:48:34 +01:00
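
The general fix is to URL-encode the query before building the link; a minimal standard-library sketch:

```python
from urllib.parse import quote_plus

query = "tolkien & friends"
url = f"/search?q={quote_plus(query)}"
# -> "/search?q=tolkien+%26+friends": the "&" no longer terminates the parameter
```
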
Bart Schuurmans 3cde6dbe5a
Merge pull request #3326 from Minnozz/black-required-version
black: specify major version 22 only
2024-03-21 16:30:56 +01:00
Bart Schuurmans 682bb3b62f dev-tools: relax black version constraint 2024-03-21 16:25:29 +01:00
Bart Schuurmans b5b9eddaf0 CI: relax black version constraints 2024-03-20 12:46:37 +01:00
Bart Schuurmans ab430e0208 requirements.txt: add black
This way, IDEs can be set up to use the black version from the environment instead of a globally
available/bundled black version.
2024-03-20 12:43:17 +01:00
Bart Schuurmans e13e4237f4 black: specify required-version
This ensures consistent formatting among different contributors / development setups.

https://black.readthedocs.io/en/stable/usage_and_configuration/the_basics.html#required-version
2024-03-20 12:26:21 +01:00
Bart Schuurmans 762786839c
Merge pull request #3134 from dato/trigger_migrations
Support trigger migrations
2024-03-20 12:11:34 +01:00
Bart Schuurmans 4ca52c0b38
Merge branch 'main' into trigger_migrations 2024-03-20 11:47:54 +01:00
Bart Schuurmans 6a87713f9f Recalculate all book search vectors after fixing the author trigger 2024-03-20 11:45:12 +01:00
Mouse Reeve d08147c6d9
Merge pull request #3244 from bookwyrm-social/dependabot/pip/pillow-10.2.0
Bump pillow from 10.0.1 to 10.2.0
2024-03-19 15:10:30 -07:00
Bart Schuurmans f423834bd0 Catch the correct exception type from Pillow 2024-03-19 12:42:52 +01:00
Mouse Reeve d304ceb437
Merge pull request #3324 from bookwyrm-social/dependabot/pip/django-3.2.25
Bump django from 3.2.24 to 3.2.25
2024-03-18 15:05:30 -07:00
dependabot[bot] 47afe34d97
Bump django from 3.2.24 to 3.2.25
Bumps [django](https://github.com/django/django) from 3.2.24 to 3.2.25.
- [Commits](https://github.com/django/django/compare/3.2.24...3.2.25)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-18 21:48:21 +00:00
Bart Schuurmans 4d23edddca Make sure /images/ and /static/ exist now that the bind mount is read only
Otherwise the static_volume and media_volume can't be mounted there.
2024-03-18 21:35:12 +01:00
Bart Schuurmans 68cb94daf2 docker-compose.yml: don't automatically start dev-tools by assigning profile 2024-03-18 21:34:51 +01:00
Bart Schuurmans 864304f128 docker-compose.yml: make all bind mounts read only
Except dev-tools, since it needs to be able to change the source.
2024-03-18 21:34:09 +01:00
Bart Schuurmans 7690247ab4 Font download: log the exact error 2024-03-18 20:34:47 +01:00
Bart Schuurmans 3367b20965 Font download: destination dir is allowed to exist
Without this argument, an existing directory (but not the file) causes an error.
2024-03-18 20:23:31 +01:00
Bart Schuurmans 748418590f docker-compose.yml: mount static_volume for flower
Because flower also uses BookwyrmConfig, it wants to download fonts, and will download them to an
incorrect location if the static_volume is not mounted.
2024-03-18 20:22:19 +01:00
Bart Schuurmans ccf2b16d73 requirements.txt: make typing-Pillow match Pillow 2024-03-18 19:52:40 +01:00
dependabot[bot] 3be227fc86 Bump pillow from 10.0.1 to 10.2.0
Bumps [pillow](https://github.com/python-pillow/Pillow) from 10.0.1 to 10.2.0.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](https://github.com/python-pillow/Pillow/compare/10.0.1...10.2.0)

---
updated-dependencies:
- dependency-name: pillow
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-18 19:51:24 +01:00
Adeodato Simó a6dc5bd13f
Make get_file_size robust against typing errors 2024-03-18 15:03:07 -03:00
Adeodato Simó 518da3b9cf Merge from main into 'user-export'
Conflicts:
	bookwyrm/models/bookwyrm_export_job.py
	requirements.txt
2024-03-18 14:47:34 -03:00
Adeodato Simó 2cf7ed477d Consolidate test_postgres.py into test_book_search.py
These are tests I missed when first writing trigger tests in
test_book_search.py.
2024-03-17 22:38:44 -03:00
Adeodato Simó cceccd1ecf
Merge from main into 'trigger_migrations'
Conflicts:
	requirements.txt
2024-03-17 21:54:15 -03:00
Adeodato Simó beb49af514
Upgrade django-pgtrigger to 4.11 2024-03-17 21:46:34 -03:00
Adeodato Simó 90bd893568 Fix remaining instances of bad-classmethod-argument 2024-03-17 21:28:55 -03:00
Adeodato Simó e2c9ea3cd2 Fix instances of bad-classmethod-argument in recently edited files 2024-03-17 21:28:55 -03:00
Adeodato Simó 4b9fe0af0c Remove nesting in several with..patch calls 2024-03-17 20:57:39 -03:00
Adeodato Simó 1b9e0546e6 Bracket-wrap calls to patch() for better readability 2024-03-17 20:34:12 -03:00
Bart Schuurmans 8cf52e0a77
Merge pull request #3318 from Minnozz/ci-annotations
CI: update pytest setup and show annotations on PRs
2024-03-17 11:24:01 +01:00
Bart Schuurmans 0282e20b89
Merge branch 'main' into book-series-3256 2024-03-16 11:23:40 +01:00
Bart Schuurmans 4e20e43037 CI: merge all Python actions into one file 2024-03-13 23:36:26 +01:00
Bart Schuurmans 383e6533e1 CI: use pytest-github-actions-annotate-failures 2024-03-13 23:35:05 +01:00
Bart Schuurmans 74fdd9a85a CI: simplify pytest setup 2024-03-13 23:35:05 +01:00
Bart Schuurmans 6af0a08838 CI: use actions/setup-python@v5 and cache pip 2024-03-13 23:35:03 +01:00
Bart Schuurmans 12b469a0d6 CI: use actions/checkout@v4 2024-03-13 23:33:40 +01:00
Mouse Reeve 288743b686
Merge pull request #3315 from Minnozz/fix-pytest-env
pytest.ini: define ALLOWED_HOSTS
2024-03-13 15:29:15 -07:00
Mouse Reeve a3465e6154
Merge pull request #3303 from MaggieFero/main
Upgrade Python Version and Several Other Packages for Security
2024-03-13 15:28:54 -07:00
Bart Schuurmans 3ba528ecdd pytest.ini: define ALLOWED_HOSTS
This fixes running `./bw-dev pytest` locally when a different value is defined for
`ALLOWED_HOSTS` in `.env`.
2024-03-11 20:12:46 +01:00
Margaret Fero d138395c75 Add linter exclusion for TBookWyrmModel 2024-03-02 17:43:49 -08:00
Margaret Fero 91fe4ad535 Fix spacing for linter 2024-03-02 17:31:16 -08:00
Margaret Fero 9fa09d5ebe Add extra space required by linter 2024-03-02 17:30:37 -08:00
Margaret Fero eadb0e640f Fix typo in operator 2024-03-02 17:29:42 -08:00
Margaret Fero be140d5e5a Pin setuptools at 65.5.1 2024-03-02 17:20:48 -08:00
Margaret Fero 22c4155c7c Upgrade pytest to 6.2.5 2024-03-02 16:09:34 -08:00
Margaret Fero 498dc35d99 Upgrade Pylint to 2.15.0 2024-03-02 16:09:06 -08:00
Margaret Fero 0f5a3e9163 Pin Tornado at 6.3.3 2024-03-02 16:08:41 -08:00
Margaret Fero da2636fa29 Add grpcio pin @ 1.57.0 2024-03-02 16:07:50 -08:00
Margaret Fero c1520da56d Upgrade flower to 2.0.0 2024-03-02 16:05:11 -08:00
Margaret Fero fee3fdd5a8 Upgrade django-compressor to 4.4 2024-03-02 16:04:37 -08:00
Margaret Fero c944824ac7 Upgrade django-celery-beat to 2.5.0 2024-03-02 16:04:06 -08:00
Margaret Fero 4312e9bba0 Upgrade Celery to 5.3.1 2024-03-02 16:03:19 -08:00
Margaret Fero 39da471f79 Disable Pylint Failure for imghdr deprecation for now 2024-03-02 15:59:17 -08:00
Margaret Fero 570017d3b0 Upgrade Python Version from 3.9 to 3.11 2024-03-02 15:57:06 -08:00
Margaret Fero 3652ac8100
Alphabetize requirements.txt
Alphabetize requirements.txt for developer convenience; this helps to find duplicates and unnecessarily-pinned subdependencies, as well as making the file easier to read and use.
2024-03-02 15:41:06 -08:00
Margaret Fero f8fd76cff0
Remove duplicate types-requests==2.31.0.2
The types-requests==2.31.0.2 dependency was listed twice on adjacent lines; this commit removes one.
2024-03-02 13:57:09 -08:00
Margaret Fero 206ed9f7fb
Merge pull request #2 from bookwyrm-social/main
No Actual Changes
2024-03-02 13:55:24 -08:00
Carlos Camara 89d8537e1b Add wikidata field to author's template 2024-02-05 22:08:34 +00:00
Carlos Cámara 71f527eb1b
Merge branch 'main' into wikidata 2024-02-04 20:34:51 +01:00
Carlos Cámara 6ac38564e2 Add wikidata field for authors 2024-02-03 22:55:33 +00:00
Hugh Rundle 3675a4cf3f
disable user exports if using azure 2024-01-29 14:28:30 +11:00
Hugh Rundle 5f7be848fc
subclass boto3 session instead of adding new env value
Thanks Dato!
2024-01-29 14:10:36 +11:00
Hugh Rundle f96ddaa3e1
Merge pull request #3 from dato/export_job_inject_aws_endpoint_setting
Subclass boto3.Session to use AWS_S3_ENDPOINT_URL
2024-01-29 13:49:45 +11:00
Hugh Rundle adff3c4251
allow user exports with s3
also undoes a line space change in settings.py to make the PR cleaner
2024-01-29 13:45:35 +11:00
Hugh Rundle 765fc1e43d
fix tests 2024-01-29 12:28:37 +11:00
Adeodato Simó c106b2a988
Subclass boto3.Session to use AWS_S3_ENDPOINT_URL
As of 0.1.13, the s3-tar library uses an environment variable
(`S3_ENDPOINT_URL`) to determine the AWS endpoint. See:
https://github.com/xtream1101/s3-tar/blob/0.1.13/s3_tar/utils.py#L25-L29.

To save BookWyrm admins from having to set it (e.g., through `.env`)
when they are already setting `AWS_S3_ENDPOINT_URL`, we create a Session
class that unconditionally uses that URL, and feed it to S3Tar.
2024-01-28 22:21:44 -03:00
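
A sketch of that approach, assuming S3Tar accepts a boto3 session as described at the link above; class and setting names are illustrative:

```python
import boto3
from django.conf import settings
from s3_tar import S3Tar

class EndpointSession(boto3.Session):  # illustrative name
    """Always pass the configured endpoint to client(), so s3-tar talks to the
    same S3-compatible service as django-storages without extra env vars."""

    def client(self, *args, **kwargs):
        kwargs["endpoint_url"] = settings.AWS_S3_ENDPOINT_URL
        return super().client(*args, **kwargs)

tar_job = S3Tar(
    settings.AWS_STORAGE_BUCKET_NAME,
    "exports/export.tar.gz",
    session=EndpointSession(),
)
```
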
Hugh Rundle 2c231acebe
linting and tests 2024-01-28 20:35:47 +11:00
Hugh Rundle a3e05254b5
fix avatar import path 2024-01-28 15:56:44 +11:00
Hugh Rundle 582e97e4a5
Merge branch 'image-serialize' into user-export
pulls Mouse's fix for imagefile serialization
2024-01-28 15:12:15 +11:00
Hugh Rundle 0d619f7eb4
Merge branch 'main' into user-export 2024-01-28 15:11:02 +11:00
Hugh Rundle 2bb9a85591
various fixes
- use signed url for s3 downloads
- re-arrange tar.gz file to match original
- delete all working files after tarring
- import from s3 export

TODO

- check local export and import
- fix error when avatar missing
- deal with multiple s3 storage options (e.g. Azure)
2024-01-28 15:07:55 +11:00
Braden Solt 6add81cf15 move outside of authors "if" 2024-01-27 11:02:42 -07:00
Braden Solt 629acbaa19 add series number on posts in the feed 2024-01-27 10:58:57 -07:00
Adeodato Simó accb3273f1
When determining privacy, check for unlisted early
If `followers_url` is found in `to`, the post may still be _unlisted_
if `"https://www.w3.org/ns/activitystreams#Public"` appears in `cc`.
Hence this should be checked earlier.
2024-01-26 06:45:54 -03:00
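
In other words, the public collection appearing only in `cc` marks a post as unlisted even when the followers collection is in `to`; a generic sketch of that decision order (not the exact BookWyrm code):

```python
PUBLIC = "https://www.w3.org/ns/activitystreams#Public"

def privacy_from_addressing(to, cc, followers_url):
    # Generic sketch of the ordering described above.
    if PUBLIC in to:
        return "public"
    if PUBLIC in cc:
        # Public only in cc: visible, but kept out of public timelines.
        return "unlisted"
    if followers_url in to:
        return "followers"
    return "direct"
```
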
Hugh Rundle 26c37de2d4
linting 2024-01-20 07:16:42 +11:00
Hugh Rundle 469172947b
cleanup and linting 2024-01-18 18:43:45 +11:00
Hugh Rundle 833f26fd0e
Merge branch 'main' into user-export 2024-01-18 18:24:56 +11:00
Hugh Rundle d4d2734dab
ignore exports dir 2024-01-14 14:14:20 +11:00
Hugh Rundle 62cc6c298f
oops
- remove test export files
- check in emblackened files
2024-01-14 12:19:59 +11:00
Hugh Rundle cbd08127ef
initial work on fixing user exports with s3
- custom storages
- tar.gz within bucket using s3_tar
- slightly changes export directory structure
- major problems still outstanding re delivering s3 files to end users
2024-01-14 12:14:44 +11:00
Adeodato Simó eb13eb9882
Invalidate active_shelf when switching editions 2024-01-13 19:00:57 +01:00
Sean Molenaar 5d09c54e57
Merge branch 'main' into feat/api/oauth 2023-12-07 15:38:19 +01:00
Sean Molenaar b7ba6f1a36
urls.py: fix style 2023-11-30 11:25:51 +01:00
Adeodato Simó d6eb390cee
Add test that forces book_authors_search_vector_trigger to execute 2023-11-26 15:59:17 -03:00
Adeodato Simó b5805accac
Minor improvements to bookwyrm_book trigger code
- do not COALESCE columns that cannot be NULL
- do not bring bookwyrm_book to author names JOIN
- add comments documenting the four steps
2023-11-25 21:49:15 -03:00
Adeodato Simó bbfbd1e97a
Add tests for trigger code (i.e. how search_vector is computed) 2023-11-25 20:54:49 -03:00
Adeodato Simó 9bcb5b80ea
Further simplify bookwyrm_author trigger 2023-11-25 18:13:40 -03:00
Adeodato Simó 8df408e07e
Define search_vector_trigger via Book.Meta.triggers 2023-11-25 17:02:54 -03:00
Adeodato Simó bcb3a343d4
Fix JOIN in author_search_vector_trigger, add missing WHERE clause 2023-11-25 16:23:21 -03:00
Adeodato Simó 416a6caf2d
Define author_search_vector_trigger via Author.Meta.triggers
Previously, triggers lived only in a particular migration file. With
this change, code for the triggers resides in the model, and their
lifecycle is managed through normal Django migrations.
2023-11-25 16:17:51 -03:00
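
With django-pgtrigger, a trigger declared on the model looks roughly like this; the SQL body here is abbreviated, and the full search-vector expressions appear in the migrations later in this diff:

```python
import pgtrigger
from django.contrib.postgres.search import SearchVectorField
from django.db import models

class Author(models.Model):
    # Trimmed-down sketch; the real model has many more fields.
    name = models.CharField(max_length=255)
    search_vector = SearchVectorField(null=True)

    class Meta:
        triggers = [
            pgtrigger.Trigger(
                name="update_search_vector_on_author_edit",
                when=pgtrigger.Before,
                operation=pgtrigger.Insert | pgtrigger.UpdateOf("name"),
                # Abbreviated body; the full expression is in the migration below.
                func="new.search_vector := to_tsvector('simple', new.name); RETURN NEW;",
            ),
        ]
```
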
Adeodato Simó 44ef928c3c
Alter object row IDs to force test failure in original code 2023-11-25 16:11:01 -03:00
Adeodato Simó e4d688665c
Remove index for author.search_vector, which is never used 2023-11-24 22:43:12 -03:00
Adeodato Simó 0299f2e235
Add functional tests for search_vector triggers
As metadata changes, search continues to work.
2023-11-24 22:28:41 -03:00
Sean Molenaar e144ce19fa
fix: add include import from django.urls 2023-11-16 10:48:06 +01:00
Sean Molenaar da4214ad61 feat: add OAuth authentication
Issue GH-2292
2023-11-14 14:18:35 +01:00
308 changed files with 12026 additions and 4573 deletions

View file

@ -16,6 +16,11 @@ DEFAULT_LANGUAGE="English"
## Leave unset to allow all hosts
# ALLOWED_HOSTS="localhost,127.0.0.1,[::1]"
# Specify when the site is served from a port that is not the default
# for the protocol (80 for HTTP or 443 for HTTPS).
# Probably only necessary in development.
# PORT=1333
MEDIA_ROOT=images/
# Database configuration
@ -71,14 +76,20 @@ ENABLE_THUMBNAIL_GENERATION=true
USE_S3=false
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
# seconds for signed S3 urls to expire
# this is currently only used for user export files
S3_SIGNED_URL_EXPIRY=900
# Commented are example values if you use a non-AWS, S3-compatible service
# AWS S3 should work with only AWS_STORAGE_BUCKET_NAME and AWS_S3_REGION_NAME
# non-AWS S3-compatible services will need AWS_STORAGE_BUCKET_NAME,
# along with both AWS_S3_CUSTOM_DOMAIN and AWS_S3_ENDPOINT_URL
# along with both AWS_S3_CUSTOM_DOMAIN and AWS_S3_ENDPOINT_URL.
# AWS_S3_URL_PROTOCOL must end in ":" and defaults to the same protocol as
# the BookWyrm instance ("http:" or "https:", based on USE_SSL).
# AWS_STORAGE_BUCKET_NAME= # "example-bucket-name"
# AWS_S3_CUSTOM_DOMAIN=None # "example-bucket-name.s3.fr-par.scw.cloud"
# AWS_S3_URL_PROTOCOL=None # "http:"
# AWS_S3_REGION_NAME=None # "fr-par"
# AWS_S3_ENDPOINT_URL=None # "https://s3.fr-par.scw.cloud"
@ -133,9 +144,9 @@ HTTP_X_FORWARDED_PROTO=false
TWO_FACTOR_LOGIN_VALIDITY_WINDOW=2
TWO_FACTOR_LOGIN_MAX_SECONDS=60
# Additional hosts to allow in the Content-Security-Policy, "self" (should be DOMAIN)
# and AWS_S3_CUSTOM_DOMAIN (if used) are added by default.
# Value should be a comma-separated list of host names.
# Additional hosts to allow in the Content-Security-Policy, "self" (should be
# DOMAIN with optionally ":" + PORT) and AWS_S3_CUSTOM_DOMAIN (if used) are
# added by default. Value should be a comma-separated list of host names.
CSP_ADDITIONAL_HOSTS=
# Time before being logged out (in seconds)

View file

@ -1,17 +0,0 @@
name: Python Formatting (run ./bw-dev black to fix)
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: psf/black@22.12.0
with:
version: 22.12.0

View file

@ -36,11 +36,11 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
@ -51,7 +51,7 @@ jobs:
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
@ -65,4 +65,4 @@ jobs:
# make release
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
uses: github/codeql-action/analyze@v3

View file

@ -10,7 +10,7 @@ jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install curlylint
run: pip install curlylint

View file

@ -1,70 +0,0 @@
name: Run Python Tests
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-20.04
services:
postgres:
image: postgres:13
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: hunter2
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Check migrations up-to-date
run: |
python ./manage.py makemigrations --check
env:
SECRET_KEY: beepbeep
DOMAIN: your.domain.here
EMAIL_HOST: ""
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
- name: Run Tests
env:
SECRET_KEY: beepbeep
DEBUG: false
USE_HTTPS: true
DOMAIN: your.domain.here
BOOKWYRM_DATABASE_BACKEND: postgres
MEDIA_ROOT: images/
POSTGRES_PASSWORD: hunter2
POSTGRES_USER: postgres
POSTGRES_DB: github_actions
POSTGRES_HOST: 127.0.0.1
CELERY_BROKER: ""
REDIS_BROKER_PORT: 6379
REDIS_BROKER_PASSWORD: beep
USE_DUMMY_CACHE: true
FLOWER_PORT: 8888
EMAIL_HOST: "smtp.mailgun.org"
EMAIL_PORT: 587
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
EMAIL_USE_TLS: true
ENABLE_PREVIEW_IMAGES: false
ENABLE_THUMBNAIL_GENERATION: true
HTTP_X_FORWARDED_PROTO: false
run: |
pytest -n 3

View file

@ -19,10 +19,11 @@ jobs:
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it.
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install modules
run: npm install stylelint stylelint-config-recommended stylelint-config-standard stylelint-order eslint
# run: npm install stylelint stylelint-config-recommended stylelint-config-standard stylelint-order eslint
run: npm install eslint@^8.9.0
# See .stylelintignore for files that are not linted.
# - name: Run stylelint

View file

@ -1,50 +0,0 @@
name: Mypy
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.9
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Analysing the code with mypy
env:
SECRET_KEY: beepbeep
DEBUG: false
USE_HTTPS: true
DOMAIN: your.domain.here
BOOKWYRM_DATABASE_BACKEND: postgres
MEDIA_ROOT: images/
POSTGRES_PASSWORD: hunter2
POSTGRES_USER: postgres
POSTGRES_DB: github_actions
POSTGRES_HOST: 127.0.0.1
CELERY_BROKER: ""
REDIS_BROKER_PORT: 6379
REDIS_BROKER_PASSWORD: beep
USE_DUMMY_CACHE: true
FLOWER_PORT: 8888
EMAIL_HOST: "smtp.mailgun.org"
EMAIL_PORT: 587
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
EMAIL_USE_TLS: true
ENABLE_PREVIEW_IMAGES: false
ENABLE_THUMBNAIL_GENERATION: true
HTTP_X_FORWARDED_PROTO: false
run: |
mypy bookwyrm celerywyrm

View file

@ -14,7 +14,7 @@ jobs:
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it.
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install modules
run: npm install prettier@2.5.1

View file

@ -1,27 +0,0 @@
name: Pylint
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.9
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Analysing the code with pylint
run: |
pylint bookwyrm/

.github/workflows/python.yml vendored Normal file (99 changed lines)
View file

@ -0,0 +1,99 @@
name: Python
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
# overrides for .env.example
env:
POSTGRES_HOST: 127.0.0.1
PGPORT: 5432
POSTGRES_USER: postgres
POSTGRES_PASSWORD: hunter2
POSTGRES_DB: github_actions
SECRET_KEY: beepbeep
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
jobs:
pytest:
name: Tests (pytest)
runs-on: ubuntu-latest
services:
postgres:
image: postgres:13
env: # does not inherit from jobs.build.env
POSTGRES_USER: postgres
POSTGRES_PASSWORD: hunter2
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: 3.11
cache: pip
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pytest-github-actions-annotate-failures
- name: Set up .env
run: cp .env.example .env
- name: Check migrations up-to-date
run: python ./manage.py makemigrations --check
- name: Run Tests
run: pytest -n 3
pylint:
name: Linting (pylint)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: 3.11
cache: pip
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Analyse code with pylint
run: pylint bookwyrm/
mypy:
name: Typing (mypy)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: 3.11
cache: pip
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Set up .env
run: cp .env.example .env
- name: Analyse code with mypy
run: mypy bookwyrm celerywyrm
black:
name: Formatting (black; run ./bw-dev black to fix)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- uses: psf/black@stable
with:
version: "22.*"

.gitignore vendored (5 changed lines)
View file

@ -16,6 +16,8 @@
# BookWyrm
.env
/images/
/exports/
/static/
bookwyrm/static/css/bookwyrm.css
bookwyrm/static/css/themes/
!bookwyrm/static/css/themes/bookwyrm-*.scss
@ -36,3 +38,6 @@ nginx/default.conf
#macOS
**/.DS_Store
# Docker
docker-compose.override.yml

View file

@ -1,4 +1,4 @@
FROM python:3.9
FROM python:3.11
ENV PYTHONUNBUFFERED 1

View file

@ -10,7 +10,6 @@ BookWyrm is a social network for tracking your reading, talking about books, wri
## Links
[![Mastodon Follow](https://img.shields.io/mastodon/follow/000146121?domain=https%3A%2F%2Ftech.lgbt&style=social)](https://tech.lgbt/@bookwyrm)
[![Twitter Follow](https://img.shields.io/twitter/follow/BookWyrmSocial?style=social)](https://twitter.com/BookWyrmSocial)
- [Project homepage](https://joinbookwyrm.com/)
- [Support](https://patreon.com/bookwyrm)

View file

@ -1 +1 @@
0.7.2
0.7.3

View file

@ -20,6 +20,7 @@ from bookwyrm.tasks import app, MISC
logger = logging.getLogger(__name__)
# pylint: disable=invalid-name
TBookWyrmModel = TypeVar("TBookWyrmModel", bound=base_model.BookWyrmModel)

View file

@ -139,14 +139,14 @@ class ActivityStream(RedisStore):
| (
Q(following=status.user) & Q(following=status.reply_parent.user)
) # if the user is following both authors
).distinct()
)
# only visible to the poster's followers and tagged users
elif status.privacy == "followers":
audience = audience.filter(
Q(following=status.user) # if the user is following the author
)
return audience.distinct()
return audience.distinct("id")
@tracer.start_as_current_span("ActivityStream.get_audience")
def get_audience(self, status):
@ -156,7 +156,7 @@ class ActivityStream(RedisStore):
status_author = models.User.objects.filter(
is_active=True, local=True, id=status.user.id
).values_list("id", flat=True)
return list(set(list(audience) + list(status_author)))
return list(set(audience) | set(status_author))
def get_stores_for_users(self, user_ids):
"""convert a list of user ids into redis store ids"""
@ -183,15 +183,13 @@ class HomeStream(ActivityStream):
def get_audience(self, status):
trace.get_current_span().set_attribute("stream_id", self.key)
audience = super()._get_audience(status)
if not audience:
return []
# if the user is following the author
audience = audience.filter(following=status.user).values_list("id", flat=True)
# if the user is the post's author
status_author = models.User.objects.filter(
is_active=True, local=True, id=status.user.id
).values_list("id", flat=True)
return list(set(list(audience) + list(status_author)))
return list(set(audience) | set(status_author))
def get_statuses_for_user(self, user):
return models.Status.privacy_filter(
@ -239,9 +237,7 @@ class BooksStream(ActivityStream):
)
audience = super()._get_audience(status)
if not audience:
return models.User.objects.none()
return audience.filter(shelfbook__book__parent_work=work).distinct()
return audience.filter(shelfbook__book__parent_work=work)
def get_audience(self, status):
# only show public statuses on the books feed,

View file

@ -1,4 +1,5 @@
"""Do further startup configuration and initialization"""
import os
import urllib
import logging
@ -14,16 +15,16 @@ def download_file(url, destination):
"""Downloads a file to the given path"""
try:
# Ensure our destination directory exists
os.makedirs(os.path.dirname(destination))
os.makedirs(os.path.dirname(destination), exist_ok=True)
with urllib.request.urlopen(url) as stream:
with open(destination, "b+w") as outfile:
outfile.write(stream.read())
except (urllib.error.HTTPError, urllib.error.URLError):
logger.info("Failed to download file %s", url)
except OSError:
logger.info("Couldn't open font file %s for writing", destination)
except: # pylint: disable=bare-except
logger.info("Unknown error in file download")
except (urllib.error.HTTPError, urllib.error.URLError) as err:
logger.error("Failed to download file %s: %s", url, err)
except OSError as err:
logger.error("Couldn't open font file %s for writing: %s", destination, err)
except Exception as err: # pylint:disable=broad-except
logger.error("Unknown error in file download: %s", err)
class BookwyrmConfig(AppConfig):

View file

@ -3,7 +3,9 @@ from __future__ import annotations
from abc import ABC, abstractmethod
from typing import Optional, TypedDict, Any, Callable, Union, Iterator
from urllib.parse import quote_plus
import imghdr
# pylint: disable-next=deprecated-module
import imghdr # Deprecated in 3.11 for removal in 3.13; no good alternative yet
import logging
import re
import asyncio

View file

@ -118,9 +118,11 @@ def get_connectors() -> Iterator[abstract_connector.AbstractConnector]:
def get_or_create_connector(remote_id: str) -> abstract_connector.AbstractConnector:
"""get the connector related to the object's server"""
url = urlparse(remote_id)
identifier = url.netloc
identifier = url.hostname
if not identifier:
raise ValueError("Invalid remote id")
raise ValueError(f"Invalid remote id: {remote_id}")
base_url = f"{url.scheme}://{url.netloc}"
try:
connector_info = models.Connector.objects.get(identifier=identifier)
@ -128,10 +130,10 @@ def get_or_create_connector(remote_id: str) -> abstract_connector.AbstractConnec
connector_info = models.Connector.objects.create(
identifier=identifier,
connector_file="bookwyrm_connector",
base_url=f"https://{identifier}",
books_url=f"https://{identifier}/book",
covers_url=f"https://{identifier}/images/covers",
search_url=f"https://{identifier}/search?q=",
base_url=base_url,
books_url=f"{base_url}/book",
covers_url=f"{base_url}/images/covers",
search_url=f"{base_url}/search?q=",
priority=2,
)
@ -188,8 +190,11 @@ def raise_not_valid_url(url: str) -> None:
if not parsed.scheme in ["http", "https"]:
raise ConnectorException("Invalid scheme: ", url)
if not parsed.hostname:
raise ConnectorException("Hostname missing: ", url)
try:
ipaddress.ip_address(parsed.netloc)
ipaddress.ip_address(parsed.hostname)
raise ConnectorException("Provided url is an IP address: ", url)
except ValueError:
# it's not an IP address, which is good

View file

@ -4,7 +4,7 @@ from django.template.loader import get_template
from bookwyrm import models, settings
from bookwyrm.tasks import app, EMAIL
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import DOMAIN, BASE_URL
def email_data():
@ -14,6 +14,7 @@ def email_data():
"site_name": site.name,
"logo": site.logo_small_url,
"domain": DOMAIN,
"base_url": BASE_URL,
"user": None,
}

View file

@ -15,6 +15,7 @@ class AuthorForm(CustomForm):
"aliases",
"bio",
"wikipedia_link",
"wikidata",
"website",
"born",
"died",
@ -32,6 +33,7 @@ class AuthorForm(CustomForm):
"wikipedia_link": forms.TextInput(
attrs={"aria-describedby": "desc_wikipedia_link"}
),
"wikidata": forms.TextInput(attrs={"aria-describedby": "desc_wikidata"}),
"website": forms.TextInput(attrs={"aria-describedby": "desc_website"}),
"born": forms.SelectDateWidget(attrs={"aria-describedby": "desc_born"}),
"died": forms.SelectDateWidget(attrs={"aria-describedby": "desc_died"}),

View file

@ -1,4 +1,5 @@
""" using django model forms """
from django import forms
from file_resubmit.widgets import ResubmitImageWidget
@ -25,7 +26,6 @@ class EditionForm(CustomForm):
"subtitle",
"description",
"series",
"series_number",
"languages",
"subjects",
"publishers",
@ -55,9 +55,6 @@ class EditionForm(CustomForm):
attrs={"aria-describedby": "desc_description"}
),
"series": forms.TextInput(attrs={"aria-describedby": "desc_series"}),
"series_number": forms.TextInput(
attrs={"aria-describedby": "desc_series_number"}
),
"subjects": ArrayWidget(),
"languages": forms.TextInput(
attrs={"aria-describedby": "desc_languages_help desc_languages"}
@ -116,7 +113,6 @@ class EditionFromWorkForm(CustomForm):
"description",
"languages",
"series",
"series_number",
"subjects",
"subject_places",
"cover",

View file

@ -26,7 +26,7 @@ class FileLinkForm(CustomForm):
url = cleaned_data.get("url")
filetype = cleaned_data.get("filetype")
book = cleaned_data.get("book")
domain = urlparse(url).netloc
domain = urlparse(url).hostname
if models.LinkDomain.objects.filter(domain=domain).exists():
status = models.LinkDomain.objects.get(domain=domain).status
if status == "blocked":

View file

@ -1,13 +1,14 @@
""" PROCEED WITH CAUTION: uses deduplication fields to permanently
merge book data objects """
from django.core.management.base import BaseCommand
from django.db.models import Count
from bookwyrm import models
from bookwyrm.management.merge import merge_objects
def dedupe_model(model):
def dedupe_model(model, dry_run=False):
"""combine duplicate editions and update related models"""
print(f"deduplicating {model.__name__}:")
fields = model._meta.get_fields()
dedupe_fields = [
f for f in fields if hasattr(f, "deduplication_field") and f.deduplication_field
@ -16,30 +17,42 @@ def dedupe_model(model):
dupes = (
model.objects.values(field.name)
.annotate(Count(field.name))
.filter(**{"%s__count__gt" % field.name: 1})
.filter(**{f"{field.name}__count__gt": 1})
.exclude(**{field.name: ""})
.exclude(**{f"{field.name}__isnull": True})
)
for dupe in dupes:
value = dupe[field.name]
if not value or value == "":
continue
print("----------")
print(dupe)
objs = model.objects.filter(**{field.name: value}).order_by("id")
canonical = objs.first()
print("keeping", canonical.remote_id)
action = "would merge" if dry_run else "merging"
print(
f"{action} into {model.__name__} {canonical.remote_id} based on {field.name} {value}:"
)
for obj in objs[1:]:
print(obj.remote_id)
merge_objects(canonical, obj)
print(f"- {obj.remote_id}")
absorbed_fields = obj.merge_into(canonical, dry_run=dry_run)
print(f" absorbed fields: {absorbed_fields}")
class Command(BaseCommand):
"""deduplicate allllll the book data models"""
help = "merges duplicate book data"
def add_arguments(self, parser):
"""add the arguments for this command"""
parser.add_argument(
"--dry_run",
action="store_true",
help="don't actually merge, only print what would happen",
)
# pylint: disable=no-self-use,unused-argument
def handle(self, *args, **options):
"""run deduplications"""
dedupe_model(models.Edition)
dedupe_model(models.Work)
dedupe_model(models.Author)
dedupe_model(models.Edition, dry_run=options["dry_run"])
dedupe_model(models.Work, dry_run=options["dry_run"])
dedupe_model(models.Author, dry_run=options["dry_run"])

View file

@ -1,50 +0,0 @@
from django.db.models import ManyToManyField
def update_related(canonical, obj):
"""update all the models with fk to the object being removed"""
# move related models to canonical
related_models = [
(r.remote_field.name, r.related_model) for r in canonical._meta.related_objects
]
for (related_field, related_model) in related_models:
# Skip the ManyToMany fields that arent auto-created. These
# should have a corresponding OneToMany field in the model for
# the linking table anyway. If we update it through that model
# instead then we wont lose the extra fields in the linking
# table.
related_field_obj = related_model._meta.get_field(related_field)
if isinstance(related_field_obj, ManyToManyField):
through = related_field_obj.remote_field.through
if not through._meta.auto_created:
continue
related_objs = related_model.objects.filter(**{related_field: obj})
for related_obj in related_objs:
print("replacing in", related_model.__name__, related_field, related_obj.id)
try:
setattr(related_obj, related_field, canonical)
related_obj.save()
except TypeError:
getattr(related_obj, related_field).add(canonical)
getattr(related_obj, related_field).remove(obj)
def copy_data(canonical, obj):
"""try to get the most data possible"""
for data_field in obj._meta.get_fields():
if not hasattr(data_field, "activitypub_field"):
continue
data_value = getattr(obj, data_field.name)
if not data_value:
continue
if not getattr(canonical, data_field.name):
print("setting data field", data_field.name, data_value)
setattr(canonical, data_field.name, data_value)
canonical.save()
def merge_objects(canonical, obj):
copy_data(canonical, obj)
update_related(canonical, obj)
# remove the outdated entry
obj.delete()

View file

@ -1,4 +1,3 @@
from bookwyrm.management.merge import merge_objects
from django.core.management.base import BaseCommand
@ -9,6 +8,11 @@ class MergeCommand(BaseCommand):
"""add the arguments for this command"""
parser.add_argument("--canonical", type=int, required=True)
parser.add_argument("--other", type=int, required=True)
parser.add_argument(
"--dry_run",
action="store_true",
help="don't actually merge, only print what would happen",
)
# pylint: disable=no-self-use,unused-argument
def handle(self, *args, **options):
@ -26,4 +30,8 @@ class MergeCommand(BaseCommand):
print("other book doesnt exist!")
return
merge_objects(canonical, other)
absorbed_fields = other.merge_into(canonical, dry_run=options["dry_run"])
action = "would be" if options["dry_run"] else "has been"
print(f"{other.remote_id} {action} merged into {canonical.remote_id}")
print(f"absorbed fields: {absorbed_fields}")

View file

@ -0,0 +1,16 @@
# Generated by Django 3.2.20 on 2023-11-24 17:11
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0188_theme_loads"),
]
operations = [
migrations.RemoveIndex(
model_name="author",
name="bookwyrm_au_search__b050a8_gin",
),
]

View file

@ -0,0 +1,76 @@
# Generated by Django 3.2.20 on 2023-11-25 00:47
from importlib import import_module
import re
from django.db import migrations
import pgtrigger.compiler
import pgtrigger.migrations
trigger_migration = import_module("bookwyrm.migrations.0077_auto_20210623_2155")
# it's _very_ convenient for development that this migration be reversible
search_vector_trigger = trigger_migration.Migration.operations[4]
author_search_vector_trigger = trigger_migration.Migration.operations[5]
assert re.search(r"\bCREATE TRIGGER search_vector_trigger\b", search_vector_trigger.sql)
assert re.search(
r"\bCREATE TRIGGER author_search_vector_trigger\b",
author_search_vector_trigger.sql,
)
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0190_book_search_updates"),
]
operations = [
pgtrigger.migrations.AddTrigger(
model_name="book",
trigger=pgtrigger.compiler.Trigger(
name="update_search_vector_on_book_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="new.search_vector := setweight(coalesce(nullif(to_tsvector('english', new.title), ''), to_tsvector('simple', new.title)), 'A') || setweight(to_tsvector('english', coalesce(new.subtitle, '')), 'B') || (SELECT setweight(to_tsvector('simple', coalesce(array_to_string(array_agg(bookwyrm_author.name), ' '), '')), 'C') FROM bookwyrm_author LEFT JOIN bookwyrm_book_authors ON bookwyrm_author.id = bookwyrm_book_authors.author_id WHERE bookwyrm_book_authors.book_id = new.id ) || setweight(to_tsvector('english', coalesce(new.series, '')), 'D');RETURN NEW;",
hash="77d6399497c0a89b0bf09d296e33c396da63705c",
operation='INSERT OR UPDATE OF "title", "subtitle", "series", "search_vector"',
pgid="pgtrigger_update_search_vector_on_book_edit_bec58",
table="bookwyrm_book",
when="BEFORE",
),
),
),
pgtrigger.migrations.AddTrigger(
model_name="author",
trigger=pgtrigger.compiler.Trigger(
name="reset_search_vector_on_author_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="WITH updated_books AS (SELECT book_id FROM bookwyrm_book_authors WHERE author_id = new.id ) UPDATE bookwyrm_book SET search_vector = '' FROM updated_books WHERE id = updated_books.book_id;RETURN NEW;",
hash="e7bbf08711ff3724c58f4d92fb7a082ffb3d7826",
operation='UPDATE OF "name"',
pgid="pgtrigger_reset_search_vector_on_author_edit_a447c",
table="bookwyrm_author",
when="AFTER",
),
),
),
migrations.RunSQL(
sql="""DROP TRIGGER IF EXISTS search_vector_trigger ON bookwyrm_book;
DROP FUNCTION IF EXISTS book_trigger;
""",
reverse_sql=search_vector_trigger.sql,
),
migrations.RunSQL(
sql="""DROP TRIGGER IF EXISTS author_search_vector_trigger ON bookwyrm_author;
DROP FUNCTION IF EXISTS author_trigger;
""",
reverse_sql=author_search_vector_trigger.sql,
),
migrations.RunSQL(
# Recalculate book search vector for any missed author name changes
# due to bug in JOIN in the old trigger.
sql="UPDATE bookwyrm_book SET search_vector = NULL;",
reverse_sql=migrations.RunSQL.noop,
),
]

View file

@ -0,0 +1,92 @@
# Generated by Django 3.2.23 on 2024-01-28 02:49
import bookwyrm.storage_backends
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0192_sitesettings_user_exports_enabled"),
]
operations = [
migrations.AddField(
model_name="bookwyrmexportjob",
name="export_json",
field=models.JSONField(
encoder=django.core.serializers.json.DjangoJSONEncoder, null=True
),
),
migrations.AddField(
model_name="bookwyrmexportjob",
name="json_completed",
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name="bookwyrmexportjob",
name="export_data",
field=models.FileField(
null=True,
storage=bookwyrm.storage_backends.ExportsFileStorage,
upload_to="",
),
),
migrations.CreateModel(
name="AddFileToTar",
fields=[
(
"childjob_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="bookwyrm.childjob",
),
),
(
"parent_export_job",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="child_edition_export_jobs",
to="bookwyrm.bookwyrmexportjob",
),
),
],
options={
"abstract": False,
},
bases=("bookwyrm.childjob",),
),
migrations.CreateModel(
name="AddBookToUserExportJob",
fields=[
(
"childjob_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="bookwyrm.childjob",
),
),
(
"edition",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="bookwyrm.edition",
),
),
],
options={
"abstract": False,
},
bases=("bookwyrm.childjob",),
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.23 on 2024-03-18 17:37
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0193_auto_20240128_0249"),
("bookwyrm", "0195_alter_user_preferred_language"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.23 on 2024-03-18 00:48
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0191_migrate_search_vec_triggers_to_pgtriggers"),
("bookwyrm", "0195_alter_user_preferred_language"),
]
operations = []

View file

@ -0,0 +1,41 @@
# Generated by Django 3.2.25 on 2024-03-20 15:15
import django.contrib.postgres.indexes
from django.db import migrations
import pgtrigger.compiler
import pgtrigger.migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0196_merge_pr3134_into_main"),
]
operations = [
migrations.AddIndex(
model_name="author",
index=django.contrib.postgres.indexes.GinIndex(
fields=["search_vector"], name="bookwyrm_au_search__b050a8_gin"
),
),
pgtrigger.migrations.AddTrigger(
model_name="author",
trigger=pgtrigger.compiler.Trigger(
name="update_search_vector_on_author_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="new.search_vector := setweight(to_tsvector('simple', new.name), 'A') || setweight(to_tsvector('simple', coalesce(array_to_string(new.aliases, ' '), '')), 'B');RETURN NEW;",
hash="b97919016236d74d0ade51a0769a173ea269da64",
operation='INSERT OR UPDATE OF "name", "aliases", "search_vector"',
pgid="pgtrigger_update_search_vector_on_author_edit_c61cb",
table="bookwyrm_author",
when="BEFORE",
),
),
),
migrations.RunSQL(
# Calculate search vector for all Authors.
sql="UPDATE bookwyrm_author SET search_vector = NULL;",
reverse_sql="UPDATE bookwyrm_author SET search_vector = NULL;",
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-03-24 02:35
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0196_merge_20240318_1737"),
("bookwyrm", "0196_merge_pr3134_into_main"),
]
operations = []

View file

@ -0,0 +1,48 @@
# Generated by Django 3.2.24 on 2024-02-28 21:30
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0196_merge_pr3134_into_main"),
]
operations = [
migrations.CreateModel(
name="MergedBook",
fields=[
("deleted_id", models.IntegerField(primary_key=True, serialize=False)),
(
"merged_into",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="absorbed",
to="bookwyrm.book",
),
),
],
options={
"abstract": False,
},
),
migrations.CreateModel(
name="MergedAuthor",
fields=[
("deleted_id", models.IntegerField(primary_key=True, serialize=False)),
(
"merged_into",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="absorbed",
to="bookwyrm.author",
),
),
],
options={
"abstract": False,
},
),
]

View file

@ -0,0 +1,23 @@
# Generated by Django 3.2.25 on 2024-03-26 11:37
import bookwyrm.models.bookwyrm_export_job
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0197_merge_20240324_0235"),
]
operations = [
migrations.AlterField(
model_name="bookwyrmexportjob",
name="export_data",
field=models.FileField(
null=True,
storage=bookwyrm.models.bookwyrm_export_job.select_exports_storage,
upload_to="",
),
),
]

View file

@ -0,0 +1,57 @@
# Generated by Django 3.2.25 on 2024-03-20 15:52
from django.db import migrations
import pgtrigger.compiler
import pgtrigger.migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0197_author_search_vector"),
]
operations = [
pgtrigger.migrations.RemoveTrigger(
model_name="author",
name="reset_search_vector_on_author_edit",
),
pgtrigger.migrations.RemoveTrigger(
model_name="book",
name="update_search_vector_on_book_edit",
),
pgtrigger.migrations.AddTrigger(
model_name="author",
trigger=pgtrigger.compiler.Trigger(
name="reset_book_search_vector_on_author_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="WITH updated_books AS (SELECT book_id FROM bookwyrm_book_authors WHERE author_id = new.id ) UPDATE bookwyrm_book SET search_vector = '' FROM updated_books WHERE id = updated_books.book_id;RETURN NEW;",
hash="68422c0f29879c5802b82159dde45297eff53e73",
operation='UPDATE OF "name", "aliases"',
pgid="pgtrigger_reset_book_search_vector_on_author_edit_a50c7",
table="bookwyrm_author",
when="AFTER",
),
),
),
pgtrigger.migrations.AddTrigger(
model_name="book",
trigger=pgtrigger.compiler.Trigger(
name="update_search_vector_on_book_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="WITH author_names AS (SELECT array_to_string(bookwyrm_author.name || bookwyrm_author.aliases, ' ') AS name_and_aliases FROM bookwyrm_author LEFT JOIN bookwyrm_book_authors ON bookwyrm_author.id = bookwyrm_book_authors.author_id WHERE bookwyrm_book_authors.book_id = new.id ) SELECT setweight(coalesce(nullif(to_tsvector('english', new.title), ''), to_tsvector('simple', new.title)), 'A') || setweight(to_tsvector('english', coalesce(new.subtitle, '')), 'B') || (SELECT setweight(to_tsvector('simple', coalesce(array_to_string(array_agg(name_and_aliases), ' '), '')), 'C') FROM author_names) || setweight(to_tsvector('english', coalesce(new.series, '')), 'D') INTO new.search_vector;RETURN NEW;",
hash="9324f5ca76a6f5e63931881d62d11da11f595b2c",
operation='INSERT OR UPDATE OF "title", "subtitle", "series", "search_vector"',
pgid="pgtrigger_update_search_vector_on_book_edit_bec58",
table="bookwyrm_book",
when="BEFORE",
),
),
),
migrations.RunSQL(
# Recalculate search vector for all Books because it now includes
# Author aliases.
sql="UPDATE bookwyrm_book SET search_vector = NULL;",
reverse_sql="UPDATE bookwyrm_book SET search_vector = NULL;",
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-03-26 12:17
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0198_alter_bookwyrmexportjob_export_data"),
("bookwyrm", "0198_book_search_vector_author_aliases"),
]
operations = []

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-02 19:53
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0198_book_search_vector_author_aliases"),
]
operations = [
migrations.AddIndex(
model_name="status",
index=models.Index(
fields=["remote_id"], name="bookwyrm_st_remote__06aeba_idx"
),
),
]

View file

@ -0,0 +1,27 @@
# Generated by Django 3.2.25 on 2024-03-27 19:14
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0199_merge_20240326_1217"),
]
operations = [
migrations.RemoveField(
model_name="addfiletotar",
name="childjob_ptr",
),
migrations.RemoveField(
model_name="addfiletotar",
name="parent_export_job",
),
migrations.DeleteModel(
name="AddBookToUserExportJob",
),
migrations.DeleteModel(
name="AddFileToTar",
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0199_status_bookwyrm_st_remote__06aeba_idx"),
]
operations = [
migrations.AddIndex(
model_name="status",
index=models.Index(
fields=["thread_id"], name="bookwyrm_st_thread__cf064f_idx"
),
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:10
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0200_status_bookwyrm_st_thread__cf064f_idx"),
]
operations = [
migrations.AddIndex(
model_name="keypair",
index=models.Index(
fields=["remote_id"], name="bookwyrm_ke_remote__472927_idx"
),
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:14
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0201_keypair_bookwyrm_ke_remote__472927_idx"),
]
operations = [
migrations.AddIndex(
model_name="user",
index=models.Index(
fields=["username"], name="bookwyrm_us_usernam_b2546d_idx"
),
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:22
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0202_user_bookwyrm_us_usernam_b2546d_idx"),
]
operations = [
migrations.AddIndex(
model_name="user",
index=models.Index(
fields=["is_active", "local"], name="bookwyrm_us_is_acti_972dc4_idx"
),
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-04-09 10:42
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0197_mergedauthor_mergedbook"),
("bookwyrm", "0203_user_bookwyrm_us_is_acti_972dc4_idx"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-04-13 02:32
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0200_auto_20240327_1914"),
("bookwyrm", "0204_merge_20240409_1042"),
]
operations = []

View file

@ -0,0 +1,148 @@
# Generated by Django 3.2.23 on 2024-02-04 20:27
import bookwyrm.models.fields
from django.db import migrations, models
import django.db.models.deletion
def make_series(apps, schema_editor):
Edition = apps.get_model("bookwyrm", "Edition")
Series = apps.get_model("bookwyrm", "Series")
SeriesBook = apps.get_model("bookwyrm", "SeriesBook")
db_alias = schema_editor.connection.alias
with_series = (
Edition.objects.using(db_alias)
.exclude(series_name__isnull=True)
.exclude(series_name__exact="")
.order_by("series_name", "series_number")
)
for edition in with_series:
# TODO: Try to parse number from series_name if series_number is empty?
series, _ = Series.objects.using(db_alias).get_or_create(
name=edition.series_name,
authors=edition.authors.all(),
)
SeriesBook.objects.using(db_alias).create(
book=edition, series=series, number=edition.series_number
)
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0205_merge_20240413_0232.py"),
]
operations = [
migrations.RenameField(
model_name="book",
old_name="series",
new_name="series_name",
),
migrations.CreateModel(
name="Series",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("created_date", models.DateTimeField(auto_now_add=True)),
("updated_date", models.DateTimeField(auto_now=True)),
(
"remote_id",
bookwyrm.models.fields.RemoteIdField(
max_length=255,
null=True,
validators=[bookwyrm.models.fields.validate_remote_id],
),
),
("name", bookwyrm.models.fields.CharField(max_length=100)),
(
"authors",
bookwyrm.models.fields.ManyToManyField(to="bookwyrm.Author"),
),
],
options={
"abstract": False,
},
),
migrations.CreateModel(
name="SeriesBook",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("created_date", models.DateTimeField(auto_now_add=True)),
("updated_date", models.DateTimeField(auto_now=True)),
(
"remote_id",
bookwyrm.models.fields.RemoteIdField(
max_length=255,
null=True,
validators=[bookwyrm.models.fields.validate_remote_id],
),
),
(
"number",
bookwyrm.models.fields.CharField(
blank=True, max_length=255, null=True
),
),
(
"book",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT, to="bookwyrm.book"
),
),
(
"series",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to="bookwyrm.series",
),
),
],
options={
"ordering": ["-number"],
"unique_together": {("book", "series")},
},
),
migrations.AddField(
model_name="series",
name="books",
field=bookwyrm.models.fields.ManyToManyField(
related_name="series_books",
through="bookwyrm.SeriesBook",
to="bookwyrm.Book",
),
),
migrations.AddField(
model_name="book",
name="series",
field=models.ManyToManyField(
through="bookwyrm.SeriesBook", to="bookwyrm.Series"
),
),
migrations.RunPython(make_series), # TODO: reverse_code
migrations.RemoveField(
model_name="book",
name="series_number",
),
migrations.RemoveField(
model_name="book",
name="series_name",
),
]
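The RunPython step above is still marked "TODO: reverse_code". One possible reverse, sketched here as an assumption rather than upstream code, simply deletes the rows the forward migration created; a lossless rollback would also copy series names and numbers back onto the editions before deleting them:

def unmake_series(apps, schema_editor):
    """drop the Series/SeriesBook rows created by make_series"""
    Series = apps.get_model("bookwyrm", "Series")
    SeriesBook = apps.get_model("bookwyrm", "SeriesBook")
    db_alias = schema_editor.connection.alias
    SeriesBook.objects.using(db_alias).all().delete()
    Series.objects.using(db_alias).all().delete()

# would then be registered as migrations.RunPython(make_series, unmake_series)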

View file

@ -1,4 +1,5 @@
""" bring all the models into the app namespace """
import inspect
import sys
@ -7,6 +8,8 @@ from .author import Author
from .link import Link, FileLink, LinkDomain
from .connector import Connector
from .series import Series, SeriesBook
from .shelf import Shelf, ShelfBook
from .list import List, ListItem

View file

@ -1,20 +1,25 @@
""" database schema for info about authors """
import re
from typing import Tuple, Any
from django.contrib.postgres.indexes import GinIndex
from django.db import models
from django.contrib.postgres.indexes import GinIndex
import pgtrigger
from bookwyrm import activitypub
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from bookwyrm.utils.db import format_trigger
from .book import BookDataModel
from .book import BookDataModel, MergedAuthor
from . import fields
class Author(BookDataModel):
"""basic biographic info"""
merged_model = MergedAuthor
wikipedia_link = fields.CharField(
max_length=255, blank=True, null=True, deduplication_field=True
)
@ -65,11 +70,48 @@ class Author(BookDataModel):
def get_remote_id(self):
"""editions and works both use "book" instead of model_name"""
return f"https://{DOMAIN}/author/{self.id}"
activity_serializer = activitypub.Author
return f"{BASE_URL}/author/{self.id}"
class Meta:
"""sets up postgres GIN index field"""
"""sets up indexes and triggers"""
# pylint: disable=line-too-long
indexes = (GinIndex(fields=["search_vector"]),)
triggers = [
pgtrigger.Trigger(
name="update_search_vector_on_author_edit",
when=pgtrigger.Before,
operation=pgtrigger.Insert
| pgtrigger.UpdateOf("name", "aliases", "search_vector"),
func=format_trigger(
"""new.search_vector :=
-- author name, with priority A
setweight(to_tsvector('simple', new.name), 'A') ||
-- author aliases, with priority B
setweight(to_tsvector('simple', coalesce(array_to_string(new.aliases, ' '), '')), 'B');
RETURN new;
"""
),
),
pgtrigger.Trigger(
name="reset_book_search_vector_on_author_edit",
when=pgtrigger.After,
operation=pgtrigger.UpdateOf("name", "aliases"),
func=format_trigger(
"""WITH updated_books AS (
SELECT book_id
FROM bookwyrm_book_authors
WHERE author_id = new.id
)
UPDATE bookwyrm_book
SET search_vector = ''
FROM updated_books
WHERE id = updated_books.book_id;
RETURN new;
"""
),
),
]
activity_serializer = activitypub.Author
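With the trigger above keeping Author.search_vector current, author search can query the stored vector directly. A hedged sketch using Django's full-text helpers; the 'simple' config matches the trigger, but this is not the project's actual search view:

from django.contrib.postgres.search import SearchQuery, SearchRank
from django.db.models import F

from bookwyrm.models import Author

def search_authors(text):
    # match against the trigger-maintained vector and rank by relevance
    query = SearchQuery(text, config="simple")
    return (
        Author.objects.filter(search_vector=query)
        .annotate(rank=SearchRank(F("search_vector"), query))
        .order_by("-rank")
    )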

View file

@ -10,7 +10,7 @@ from django.http import Http404
from django.utils.translation import gettext_lazy as _
from django.utils.text import slugify
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .fields import RemoteIdField
@ -38,7 +38,7 @@ class BookWyrmModel(models.Model):
def get_remote_id(self):
"""generate the url that resolves to the local object, without a slug"""
base_path = f"https://{DOMAIN}"
base_path = BASE_URL
if hasattr(self, "user"):
base_path = f"{base_path}{self.user.local_path}"
@ -53,7 +53,7 @@ class BookWyrmModel(models.Model):
@property
def local_path(self):
"""how to link to this object in the local app, with a slug"""
local = self.get_remote_id().replace(f"https://{DOMAIN}", "")
local = self.get_remote_id().replace(BASE_URL, "")
name = None
if hasattr(self, "name_field"):

View file

@ -1,29 +1,33 @@
""" database schema for books and shelves """
from itertools import chain
import re
from typing import Any
from typing import Any, Dict
from typing_extensions import Self
from django.contrib.postgres.search import SearchVectorField
from django.contrib.postgres.indexes import GinIndex
from django.core.cache import cache
from django.db import models, transaction
from django.db.models import Prefetch
from django.db.models import Prefetch, ManyToManyField
from django.dispatch import receiver
from django.utils.translation import gettext_lazy as _
from model_utils import FieldTracker
from model_utils.managers import InheritanceManager
from imagekit.models import ImageSpecField
import pgtrigger
from bookwyrm import activitypub
from bookwyrm.isbn.isbn import hyphenator_singleton as hyphenator
from bookwyrm.preview_images import generate_edition_preview_image_task
from bookwyrm.settings import (
DOMAIN,
BASE_URL,
DEFAULT_LANGUAGE,
LANGUAGE_ARTICLES,
ENABLE_PREVIEW_IMAGES,
ENABLE_THUMBNAIL_GENERATION,
)
from bookwyrm.utils.db import format_trigger
from .activitypub_mixin import OrderedCollectionPageMixin, ObjectMixin
from .base_model import BookWyrmModel
@ -106,10 +110,115 @@ class BookDataModel(ObjectMixin, BookWyrmModel):
"""only send book data updates to other bookwyrm instances"""
super().broadcast(activity, sender, software=software, **kwargs)
def merge_into(self, canonical: Self, dry_run=False) -> Dict[str, Any]:
"""merge this entity into another entity"""
if canonical.id == self.id:
raise ValueError(f"Cannot merge {self} into itself")
absorbed_fields = canonical.absorb_data_from(self, dry_run=dry_run)
if dry_run:
return absorbed_fields
canonical.save()
self.merged_model.objects.create(deleted_id=self.id, merged_into=canonical)
# move related models to canonical
related_models = [
(r.remote_field.name, r.related_model) for r in self._meta.related_objects
]
# pylint: disable=protected-access
for related_field, related_model in related_models:
# Skip the ManyToMany fields that aren't auto-created. These
# should have a corresponding OneToMany field in the model for
# the linking table anyway. If we update it through that model
# instead, we won't lose the extra fields in the linking
# table.
# pylint: disable=protected-access
related_field_obj = related_model._meta.get_field(related_field)
if isinstance(related_field_obj, ManyToManyField):
through = related_field_obj.remote_field.through
if not through._meta.auto_created:
continue
related_objs = related_model.objects.filter(**{related_field: self})
for related_obj in related_objs:
try:
setattr(related_obj, related_field, canonical)
related_obj.save()
except TypeError:
getattr(related_obj, related_field).add(canonical)
getattr(related_obj, related_field).remove(self)
self.delete()
return absorbed_fields
def absorb_data_from(self, other: Self, dry_run=False) -> Dict[str, Any]:
"""fill empty fields with values from another entity"""
absorbed_fields = {}
for data_field in self._meta.get_fields():
if not hasattr(data_field, "activitypub_field"):
continue
canonical_value = getattr(self, data_field.name)
other_value = getattr(other, data_field.name)
if not other_value:
continue
if isinstance(data_field, fields.ArrayField):
if new_values := list(set(other_value) - set(canonical_value)):
# append at the end (in no particular order)
if not dry_run:
setattr(self, data_field.name, canonical_value + new_values)
absorbed_fields[data_field.name] = new_values
elif isinstance(data_field, fields.PartialDateField):
if (
(not canonical_value)
or (other_value.has_day and not canonical_value.has_day)
or (other_value.has_month and not canonical_value.has_month)
):
if not dry_run:
setattr(self, data_field.name, other_value)
absorbed_fields[data_field.name] = other_value
else:
if not canonical_value:
if not dry_run:
setattr(self, data_field.name, other_value)
absorbed_fields[data_field.name] = other_value
return absorbed_fields
class MergedBookDataModel(models.Model):
"""a BookDataModel instance that has been merged into another instance. kept
to be able to redirect old URLs"""
deleted_id = models.IntegerField(primary_key=True)
class Meta:
"""abstract just like BookDataModel"""
abstract = True
class MergedBook(MergedBookDataModel):
"""an Book that has been merged into another one"""
merged_into = models.ForeignKey(
"Book", on_delete=models.PROTECT, related_name="absorbed"
)
class MergedAuthor(MergedBookDataModel):
"""an Author that has been merged into another one"""
merged_into = models.ForeignKey(
"Author", on_delete=models.PROTECT, related_name="absorbed"
)
class Book(BookDataModel):
"""a generic book, which can mean either an edition or a work"""
merged_model = MergedBook
connector = models.ForeignKey("Connector", on_delete=models.PROTECT, null=True)
# book/work metadata
@ -120,8 +229,11 @@ class Book(BookDataModel):
languages = fields.ArrayField(
models.CharField(max_length=255), blank=True, default=list
)
series = fields.TextField(max_length=255, blank=True, null=True)
series_number = fields.CharField(max_length=255, blank=True, null=True)
series = models.ManyToManyField(
"Series",
through="SeriesBook",
through_fields=("book", "series"),
)
subjects = fields.ArrayField(
models.CharField(max_length=255), blank=True, null=True, default=list
)
@ -190,9 +302,13 @@ class Book(BookDataModel):
"""properties of this edition, as a string"""
items = [
self.physical_format if hasattr(self, "physical_format") else None,
f"{self.languages[0]} language"
if self.languages and self.languages[0] and self.languages[0] != "English"
else None,
(
f"{self.languages[0]} language"
if self.languages
and self.languages[0]
and self.languages[0] != "English"
else None
),
str(self.published_date.year) if self.published_date else None,
", ".join(self.publishers) if hasattr(self, "publishers") else None,
]
@ -214,7 +330,7 @@ class Book(BookDataModel):
def get_remote_id(self):
"""editions and works both use "book" instead of model_name"""
return f"https://{DOMAIN}/book/{self.id}"
return f"{BASE_URL}/book/{self.id}"
def guess_sort_title(self):
"""Get a best-guess sort title for the current book"""
@ -232,9 +348,49 @@ class Book(BookDataModel):
)
class Meta:
"""sets up postgres GIN index field"""
"""set up indexes and triggers"""
# pylint: disable=line-too-long
indexes = (GinIndex(fields=["search_vector"]),)
triggers = [
pgtrigger.Trigger(
name="update_search_vector_on_book_edit",
when=pgtrigger.Before,
operation=pgtrigger.Insert
| pgtrigger.UpdateOf("title", "subtitle", "series", "search_vector"),
func=format_trigger(
"""
WITH author_names AS (
SELECT array_to_string(bookwyrm_author.name || bookwyrm_author.aliases, ' ') AS name_and_aliases
FROM bookwyrm_author
LEFT JOIN bookwyrm_book_authors
ON bookwyrm_author.id = bookwyrm_book_authors.author_id
WHERE bookwyrm_book_authors.book_id = new.id
)
SELECT
-- title, with priority A (parse in English, default to simple if empty)
setweight(COALESCE(nullif(
to_tsvector('english', new.title), ''),
to_tsvector('simple', new.title)), 'A') ||
-- subtitle, with priority B (always in English?)
setweight(to_tsvector('english', COALESCE(new.subtitle, '')), 'B') ||
-- list of authors names and aliases (with priority C)
(SELECT setweight(to_tsvector('simple', COALESCE(array_to_string(ARRAY_AGG(name_and_aliases), ' '), '')), 'C')
FROM author_names
) ||
--- last: series name, with lowest priority
setweight(to_tsvector('english', COALESCE(new.series, '')), 'D')
INTO new.search_vector;
RETURN new;
"""
),
)
]
class Work(OrderedCollectionPageMixin, Book):
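Usage of the new merge helpers, as a sketch only; the ids are made up, and merge_into/absorb_data_from are the methods added above:

from bookwyrm.models import Edition

duplicate = Edition.objects.get(id=200)  # hypothetical duplicate record
canonical = Edition.objects.get(id=100)  # hypothetical canonical record

# dry run: report which empty fields on the canonical edition would be filled
preview = duplicate.merge_into(canonical, dry_run=True)

# real merge: related rows are re-pointed at the canonical edition, the
# duplicate is deleted, and a MergedBook row keeps the old id for redirects
duplicate.merge_into(canonical)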

View file

@ -1,213 +1,318 @@
"""Export user account to tar.gz file for import into another Bookwyrm instance"""
import dataclasses
import logging
from uuid import uuid4
import os
from django.db.models import FileField
from boto3.session import Session as BotoSession
from s3_tar import S3Tar
from django.db.models import BooleanField, FileField, JSONField
from django.db.models import Q
from django.core.serializers.json import DjangoJSONEncoder
from django.core.files.base import ContentFile
from django.utils.module_loading import import_string
from bookwyrm.models import AnnualGoal, ReadThrough, ShelfBook, List, ListItem
from bookwyrm import settings, storage_backends
from bookwyrm.models import AnnualGoal, ReadThrough, ShelfBook, ListItem
from bookwyrm.models import Review, Comment, Quotation
from bookwyrm.models import Edition
from bookwyrm.models import UserFollows, User, UserBlocks
from bookwyrm.models.job import ParentJob, ParentTask
from bookwyrm.models.job import ParentJob
from bookwyrm.tasks import app, IMPORTS
from bookwyrm.utils.tar import BookwyrmTarFile
logger = logging.getLogger(__name__)
class BookwyrmAwsSession(BotoSession):
"""a boto session that always uses settings.AWS_S3_ENDPOINT_URL"""
def client(self, *args, **kwargs): # pylint: disable=arguments-differ
kwargs["endpoint_url"] = settings.AWS_S3_ENDPOINT_URL
return super().client("s3", *args, **kwargs)
def select_exports_storage():
"""callable to allow for dependency on runtime configuration"""
cls = import_string(settings.EXPORTS_STORAGE)
return cls()
class BookwyrmExportJob(ParentJob):
"""entry for a specific request to export a bookwyrm user"""
export_data = FileField(null=True)
export_data = FileField(null=True, storage=select_exports_storage)
export_json = JSONField(null=True, encoder=DjangoJSONEncoder)
json_completed = BooleanField(default=False)
def start_job(self):
"""Start the job"""
start_export_task.delay(job_id=self.id, no_children=True)
"""schedule the first task"""
return self
task = create_export_json_task.delay(job_id=self.id)
self.task_id = task.id
self.save(update_fields=["task_id"])
@app.task(queue=IMPORTS, base=ParentTask)
def start_export_task(**kwargs):
"""trigger the child tasks for each row"""
job = BookwyrmExportJob.objects.get(id=kwargs["job_id"])
@app.task(queue=IMPORTS)
def create_export_json_task(job_id):
"""create the JSON data for the export"""
job = BookwyrmExportJob.objects.get(id=job_id)
# don't start the job if it was stopped from the UI
if job.complete:
return
try:
# This is where ChildJobs get made
job.export_data = ContentFile(b"", str(uuid4()))
json_data = json_export(job.user)
tar_export(json_data, job.user, job.export_data)
job.save(update_fields=["export_data"])
job.set_status("active")
# generate JSON structure
job.export_json = export_json(job.user)
job.save(update_fields=["export_json"])
# create archive in separate task
create_archive_task.delay(job_id=job.id)
except Exception as err: # pylint: disable=broad-except
logger.exception("User Export Job %s Failed with error: %s", job.id, err)
logger.exception(
"create_export_json_task for %s failed with error: %s", job, err
)
job.set_status("failed")
job.set_status("complete")
def archive_file_location(file, directory="") -> str:
"""get the relative location of a file inside the archive"""
return os.path.join(directory, file.name)
def tar_export(json_data: str, user, file):
"""wrap the export information in a tar file"""
file.open("wb")
with BookwyrmTarFile.open(mode="w:gz", fileobj=file) as tar:
tar.write_bytes(json_data.encode("utf-8"))
def add_file_to_s3_tar(s3_tar: S3Tar, storage, file, directory=""):
"""
add file to S3Tar inside directory, keeping any directories under its
storage location
"""
s3_tar.add_file(
os.path.join(storage.location, file.name),
folder=os.path.dirname(archive_file_location(file, directory=directory)),
)
# Add avatar image if present
if getattr(user, "avatar", False):
tar.add_image(user.avatar, filename="avatar")
@app.task(queue=IMPORTS)
def create_archive_task(job_id):
"""create the archive containing the JSON file and additional files"""
job = BookwyrmExportJob.objects.get(id=job_id)
# don't start the job if it was stopped from the UI
if job.complete:
return
try:
export_task_id = str(job.task_id)
archive_filename = f"{export_task_id}.tar.gz"
export_json_bytes = DjangoJSONEncoder().encode(job.export_json).encode("utf-8")
user = job.user
editions = get_books_for_user(user)
for book in editions:
if getattr(book, "cover", False):
tar.add_image(book.cover)
file.close()
if settings.USE_S3:
# Storage for writing temporary files
exports_storage = storage_backends.ExportsS3Storage()
# Handle for creating the final archive
s3_tar = S3Tar(
exports_storage.bucket_name,
os.path.join(exports_storage.location, archive_filename),
session=BookwyrmAwsSession(),
)
# Save JSON file to a temporary location
export_json_tmp_file = os.path.join(export_task_id, "archive.json")
exports_storage.save(
export_json_tmp_file,
ContentFile(export_json_bytes),
)
s3_tar.add_file(
os.path.join(exports_storage.location, export_json_tmp_file)
)
# Add images to TAR
images_storage = storage_backends.ImagesStorage()
if user.avatar:
add_file_to_s3_tar(s3_tar, images_storage, user.avatar)
for edition in editions:
if edition.cover:
add_file_to_s3_tar(
s3_tar, images_storage, edition.cover, directory="images"
)
# Create archive and store file name
s3_tar.tar()
job.export_data = archive_filename
job.save(update_fields=["export_data"])
# Delete temporary files
exports_storage.delete(export_json_tmp_file)
else:
job.export_data = archive_filename
with job.export_data.open("wb") as tar_file:
with BookwyrmTarFile.open(mode="w:gz", fileobj=tar_file) as tar:
# save json file
tar.write_bytes(export_json_bytes)
# Add avatar image if present
if user.avatar:
tar.add_image(user.avatar)
for edition in editions:
if edition.cover:
tar.add_image(edition.cover, directory="images")
job.save(update_fields=["export_data"])
job.set_status("completed")
except Exception as err: # pylint: disable=broad-except
logger.exception("create_archive_task for %s failed with error: %s", job, err)
job.set_status("failed")
def json_export(
user,
): # pylint: disable=too-many-locals, too-many-statements, too-many-branches
"""Generate an export for a user"""
def export_json(user: User):
"""create export JSON"""
data = export_user(user) # in the root of the JSON structure
data["settings"] = export_settings(user)
data["goals"] = export_goals(user)
data["books"] = export_books(user)
data["saved_lists"] = export_saved_lists(user)
data["follows"] = export_follows(user)
data["blocks"] = export_blocks(user)
return data
# User as AP object
exported_user = user.to_activity()
# I don't love this but it prevents a JSON encoding error
# when there is no user image
if exported_user.get("icon") in (None, dataclasses.MISSING):
exported_user["icon"] = {}
def export_user(user: User):
"""export user data"""
data = user.to_activity()
if user.avatar:
data["icon"]["url"] = archive_file_location(user.avatar)
else:
# change the URL to be relative to the JSON file
file_type = exported_user["icon"]["url"].rsplit(".", maxsplit=1)[-1]
filename = f"avatar.{file_type}"
exported_user["icon"]["url"] = filename
data["icon"] = {}
return data
# Additional settings - can't be serialized as AP
def export_settings(user: User):
"""Additional settings - can't be serialized as AP"""
vals = [
"show_goal",
"preferred_timezone",
"default_post_privacy",
"show_suggested_users",
]
exported_user["settings"] = {}
for k in vals:
exported_user["settings"][k] = getattr(user, k)
return {k: getattr(user, k) for k in vals}
# Reading goals - can't be serialized as AP
reading_goals = AnnualGoal.objects.filter(user=user).distinct()
exported_user["goals"] = []
for goal in reading_goals:
exported_user["goals"].append(
{"goal": goal.goal, "year": goal.year, "privacy": goal.privacy}
)
# Reading history - can't be serialized as AP
readthroughs = ReadThrough.objects.filter(user=user).distinct().values()
readthroughs = list(readthroughs)
def export_saved_lists(user: User):
"""add user saved lists to export JSON"""
return [l.remote_id for l in user.saved_lists.all()]
# Books
editions = get_books_for_user(user)
exported_user["books"] = []
for edition in editions:
book = {}
book["work"] = edition.parent_work.to_activity()
book["edition"] = edition.to_activity()
if book["edition"].get("cover"):
# change the URL to be relative to the JSON file
filename = book["edition"]["cover"]["url"].rsplit("/", maxsplit=1)[-1]
book["edition"]["cover"]["url"] = f"covers/{filename}"
# authors
book["authors"] = []
for author in edition.authors.all():
book["authors"].append(author.to_activity())
# Shelves this book is on
# Every ShelfItem is this book so we don't bother serializing it
book["shelves"] = []
shelf_books = (
ShelfBook.objects.select_related("shelf")
.filter(user=user, book=edition)
.distinct()
)
for shelfbook in shelf_books:
book["shelves"].append(shelfbook.shelf.to_activity())
# Lists and ListItems
# ListItems include "notes" and "approved" so we need them
# even though we know it's this book
book["lists"] = []
list_items = ListItem.objects.filter(book=edition, user=user).distinct()
for item in list_items:
list_info = item.book_list.to_activity()
list_info[
"privacy"
] = item.book_list.privacy # this isn't serialized so we add it
list_info["list_item"] = item.to_activity()
book["lists"].append(list_info)
# Statuses
# Can't use select_subclasses here because
# we need to filter on the "book" value,
# which is not available on an ordinary Status
for status in ["comments", "quotations", "reviews"]:
book[status] = []
comments = Comment.objects.filter(user=user, book=edition).all()
for status in comments:
obj = status.to_activity()
obj["progress"] = status.progress
obj["progress_mode"] = status.progress_mode
book["comments"].append(obj)
quotes = Quotation.objects.filter(user=user, book=edition).all()
for status in quotes:
obj = status.to_activity()
obj["position"] = status.position
obj["endposition"] = status.endposition
obj["position_mode"] = status.position_mode
book["quotations"].append(obj)
reviews = Review.objects.filter(user=user, book=edition).all()
for status in reviews:
obj = status.to_activity()
book["reviews"].append(obj)
# readthroughs can't be serialized to activity
book_readthroughs = (
ReadThrough.objects.filter(user=user, book=edition).distinct().values()
)
book["readthroughs"] = list(book_readthroughs)
# append everything
exported_user["books"].append(book)
# saved book lists - just the remote id
saved_lists = List.objects.filter(id__in=user.saved_lists.all()).distinct()
exported_user["saved_lists"] = [l.remote_id for l in saved_lists]
# follows - just the remote id
def export_follows(user: User):
"""add user follows to export JSON"""
follows = UserFollows.objects.filter(user_subject=user).distinct()
following = User.objects.filter(userfollows_user_object__in=follows).distinct()
exported_user["follows"] = [f.remote_id for f in following]
return [f.remote_id for f in following]
# blocks - just the remote id
def export_blocks(user: User):
"""add user blocks to export JSON"""
blocks = UserBlocks.objects.filter(user_subject=user).distinct()
blocking = User.objects.filter(userblocks_user_object__in=blocks).distinct()
return [b.remote_id for b in blocking]
exported_user["blocks"] = [b.remote_id for b in blocking]
return DjangoJSONEncoder().encode(exported_user)
def export_goals(user: User):
"""add user reading goals to export JSON"""
reading_goals = AnnualGoal.objects.filter(user=user).distinct()
return [
{"goal": goal.goal, "year": goal.year, "privacy": goal.privacy}
for goal in reading_goals
]
def export_books(user: User):
"""add books to export JSON"""
editions = get_books_for_user(user)
return [export_book(user, edition) for edition in editions]
def export_book(user: User, edition: Edition):
"""add book to export JSON"""
data = {}
data["work"] = edition.parent_work.to_activity()
data["edition"] = edition.to_activity()
if edition.cover:
data["edition"]["cover"]["url"] = archive_file_location(
edition.cover, directory="images"
)
# authors
data["authors"] = [author.to_activity() for author in edition.authors.all()]
# Shelves this book is on
# Every ShelfItem is this book so we don't bother serializing it
shelf_books = (
ShelfBook.objects.select_related("shelf")
.filter(user=user, book=edition)
.distinct()
)
data["shelves"] = [shelfbook.shelf.to_activity() for shelfbook in shelf_books]
# Lists and ListItems
# ListItems include "notes" and "approved" so we need them
# even though we know it's this book
list_items = ListItem.objects.filter(book=edition, user=user).distinct()
data["lists"] = []
for item in list_items:
list_info = item.book_list.to_activity()
list_info[
"privacy"
] = item.book_list.privacy # this isn't serialized so we add it
list_info["list_item"] = item.to_activity()
data["lists"].append(list_info)
# Statuses
# Can't use select_subclasses here because
# we need to filter on the "book" value,
# which is not available on an ordinary Status
for status in ["comments", "quotations", "reviews"]:
data[status] = []
comments = Comment.objects.filter(user=user, book=edition).all()
for status in comments:
obj = status.to_activity()
obj["progress"] = status.progress
obj["progress_mode"] = status.progress_mode
data["comments"].append(obj)
quotes = Quotation.objects.filter(user=user, book=edition).all()
for status in quotes:
obj = status.to_activity()
obj["position"] = status.position
obj["endposition"] = status.endposition
obj["position_mode"] = status.position_mode
data["quotations"].append(obj)
reviews = Review.objects.filter(user=user, book=edition).all()
data["reviews"] = [status.to_activity() for status in reviews]
# readthroughs can't be serialized to activity
book_readthroughs = (
ReadThrough.objects.filter(user=user, book=edition).distinct().values()
)
data["readthroughs"] = list(book_readthroughs)
return data
def get_books_for_user(user):
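A sketch of how the reworked export is kicked off; it assumes the job is created with a user, as elsewhere in the job framework, and is not code from this diff:

from bookwyrm.models.bookwyrm_export_job import BookwyrmExportJob

def start_user_export(user):
    job = BookwyrmExportJob.objects.create(user=user)
    # start_job schedules create_export_json_task; once the JSON payload is
    # saved on the job, that task schedules create_archive_task
    job.start_job()
    return job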

View file

@ -42,20 +42,23 @@ def start_import_task(**kwargs):
try:
archive_file.open("rb")
with BookwyrmTarFile.open(mode="r:gz", fileobj=archive_file) as tar:
job.import_data = json.loads(tar.read("archive.json").decode("utf-8"))
json_filename = next(
filter(lambda n: n.startswith("archive"), tar.getnames())
)
job.import_data = json.loads(tar.read(json_filename).decode("utf-8"))
if "include_user_profile" in job.required:
update_user_profile(job.user, tar, job.import_data)
if "include_user_settings" in job.required:
update_user_settings(job.user, job.import_data)
if "include_goals" in job.required:
update_goals(job.user, job.import_data.get("goals"))
update_goals(job.user, job.import_data.get("goals", []))
if "include_saved_lists" in job.required:
upsert_saved_lists(job.user, job.import_data.get("saved_lists"))
upsert_saved_lists(job.user, job.import_data.get("saved_lists", []))
if "include_follows" in job.required:
upsert_follows(job.user, job.import_data.get("follows"))
upsert_follows(job.user, job.import_data.get("follows", []))
if "include_blocks" in job.required:
upsert_user_blocks(job.user, job.import_data.get("blocks"))
upsert_user_blocks(job.user, job.import_data.get("blocks", []))
process_books(job, tar)
@ -212,7 +215,7 @@ def upsert_statuses(user, cls, data, book_remote_id):
instance.save() # save and broadcast
else:
logger.info("User does not have permission to import statuses")
logger.warning("User does not have permission to import statuses")
def upsert_lists(user, lists, book_id):

View file

@ -11,7 +11,7 @@ ConnectorFiles = models.TextChoices("ConnectorFiles", CONNECTORS)
class Connector(BookWyrmModel):
"""book data source connectors"""
identifier = models.CharField(max_length=255, unique=True)
identifier = models.CharField(max_length=255, unique=True) # domain
priority = models.IntegerField(default=2)
name = models.CharField(max_length=255, null=True, blank=True)
connector_file = models.CharField(max_length=255, choices=ConnectorFiles.choices)

View file

@ -16,7 +16,7 @@ FederationStatus = [
class FederatedServer(BookWyrmModel):
"""store which servers we federate with"""
server_name = models.CharField(max_length=255, unique=True)
server_name = models.CharField(max_length=255, unique=True) # domain
status = models.CharField(
max_length=255, default="federated", choices=FederationStatus
)
@ -64,5 +64,4 @@ class FederatedServer(BookWyrmModel):
def is_blocked(cls, url: str) -> bool:
"""look up if a domain is blocked"""
url = urlparse(url)
domain = url.netloc
return cls.objects.filter(server_name=domain, status="blocked").exists()
return cls.objects.filter(server_name=url.hostname, status="blocked").exists()
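hostname is used here instead of netloc because netloc keeps the port (and any userinfo), which would never match a stored server_name. A quick illustration:

from urllib.parse import urlparse

url = urlparse("https://books.example.com:8443/user/mouse")
print(url.netloc)    # books.example.com:8443
print(url.hostname)  # books.example.com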

View file

@ -260,12 +260,12 @@ class PrivacyField(ActivitypubFieldMixin, models.CharField):
if to == [self.public]:
setattr(instance, self.name, "public")
elif self.public in cc:
setattr(instance, self.name, "unlisted")
elif to == [user.followers_url]:
setattr(instance, self.name, "followers")
elif cc == []:
setattr(instance, self.name, "direct")
elif self.public in cc:
setattr(instance, self.name, "unlisted")
else:
setattr(instance, self.name, "followers")
return original == getattr(instance, self.name)
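Moving the "public in cc" check above the followers/direct branches changes how a status addressed to followers but cc'ing the public collection is read: it is now unlisted. A standalone illustration of the branch order (not upstream code; the follower collection URL is made up):

PUBLIC = "https://www.w3.org/ns/activitystreams#Public"
FOLLOWERS = "https://example.com/user/mouse/followers"  # hypothetical

def privacy_of(to, cc):
    if to == [PUBLIC]:
        return "public"
    if PUBLIC in cc:
        return "unlisted"
    if to == [FOLLOWERS]:
        return "followers"
    if cc == []:
        return "direct"
    return "followers"

assert privacy_of([FOLLOWERS], [PUBLIC]) == "unlisted"  # previously "followers"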

View file

@ -1,7 +1,7 @@
""" do book related things with other users """
from django.db import models, IntegrityError, transaction
from django.db.models import Q
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .base_model import BookWyrmModel
from . import fields
from .relationship import UserBlocks
@ -17,7 +17,7 @@ class Group(BookWyrmModel):
def get_remote_id(self):
"""don't want the user to be in there in this case"""
return f"https://{DOMAIN}/group/{self.id}"
return f"{BASE_URL}/group/{self.id}"
@classmethod
def followers_filter(cls, queryset, viewer):

View file

@ -135,8 +135,7 @@ class ParentJob(Job):
)
app.control.revoke(list(tasks))
for task in self.pending_child_jobs:
task.update(status=self.Status.STOPPED)
self.pending_child_jobs.update(status=self.Status.STOPPED)
@property
def has_completed(self):
@ -248,7 +247,7 @@ class SubTask(app.Task):
"""
def before_start(
self, task_id, args, kwargs
self, task_id, *args, **kwargs
): # pylint: disable=no-self-use, unused-argument
"""Handler called before the task starts. Override.
@ -272,7 +271,7 @@ class SubTask(app.Task):
child_job.set_status(ChildJob.Status.ACTIVE)
def on_success(
self, retval, task_id, args, kwargs
self, retval, task_id, *args, **kwargs
): # pylint: disable=no-self-use, unused-argument
"""Run by the worker if the task executes successfully. Override.

View file

@ -38,7 +38,7 @@ class Link(ActivitypubMixin, BookWyrmModel):
"""create a link"""
# get or create the associated domain
if not self.domain:
domain = urlparse(self.url).netloc
domain = urlparse(self.url).hostname
self.domain, _ = LinkDomain.objects.get_or_create(domain=domain)
# this is never broadcast, the owning model broadcasts an update

View file

@ -7,7 +7,7 @@ from django.db.models import Q
from django.utils import timezone
from bookwyrm import activitypub
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .activitypub_mixin import CollectionItemMixin, OrderedCollectionMixin
from .base_model import BookWyrmModel
@ -50,7 +50,7 @@ class List(OrderedCollectionMixin, BookWyrmModel):
def get_remote_id(self):
"""don't want the user to be in there in this case"""
return f"https://{DOMAIN}/list/{self.id}"
return f"{BASE_URL}/list/{self.id}"
@property
def collection_queryset(self):

View file

@ -10,7 +10,7 @@ from .notification import Notification, NotificationType
class Move(ActivityMixin, BookWyrmModel):
"""migrating an activitypub user account"""
"""migrating an activitypub object"""
user = fields.ForeignKey(
"User", on_delete=models.PROTECT, activitypub_field="actor"

View file

@ -3,7 +3,7 @@ from django.core.exceptions import PermissionDenied
from django.db import models
from django.utils.translation import gettext_lazy as _
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .base_model import BookWyrmModel
@ -46,7 +46,7 @@ class Report(BookWyrmModel):
raise PermissionDenied()
def get_remote_id(self):
return f"https://{DOMAIN}/settings/reports/{self.id}"
return f"{BASE_URL}/settings/reports/{self.id}"
def comment(self, user, note):
"""comment on a report"""

bookwyrm/models/series.py (new file, 35 lines)
View file

@ -0,0 +1,35 @@
"""series of books"""
from django.db import models
from .base_model import BookWyrmModel
from . import fields
class Series(BookWyrmModel):
"""a named series of books"""
name = fields.CharField(max_length=100)
authors = fields.ManyToManyField("Author") # TODO: add on Author model
books = fields.ManyToManyField(
"Book",
through="SeriesBook",
through_fields=("series", "book"),
related_name="series_books",
)
class SeriesBook(BookWyrmModel):
"""membership of a series"""
book = models.ForeignKey("Book", on_delete=models.PROTECT)
series = models.ForeignKey("Series", on_delete=models.PROTECT)
number = fields.CharField(max_length=255, blank=True, null=True)
collection_field = "series"
class Meta:
"""a series can't contain the same book twice"""
unique_together = ("book", "series")
ordering = ["-number"]

View file

@ -6,7 +6,7 @@ from django.db import models
from django.utils import timezone
from bookwyrm import activitypub
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from bookwyrm.tasks import BROADCAST
from .activitypub_mixin import CollectionItemMixin, OrderedCollectionMixin
from .base_model import BookWyrmModel
@ -71,7 +71,7 @@ class Shelf(OrderedCollectionMixin, BookWyrmModel):
@property
def local_path(self):
"""No slugs"""
return self.get_remote_id().replace(f"https://{DOMAIN}", "")
return self.get_remote_id().replace(BASE_URL, "")
def raise_not_deletable(self, viewer):
"""don't let anyone delete a default shelf"""

View file

@ -12,7 +12,7 @@ from model_utils import FieldTracker
from bookwyrm.connectors.abstract_connector import get_data
from bookwyrm.preview_images import generate_site_preview_image_task
from bookwyrm.settings import DOMAIN, ENABLE_PREVIEW_IMAGES, STATIC_FULL_URL
from bookwyrm.settings import BASE_URL, ENABLE_PREVIEW_IMAGES, STATIC_FULL_URL
from bookwyrm.settings import RELEASE_API
from bookwyrm.tasks import app, MISC
from .base_model import BookWyrmModel, new_access_code
@ -188,7 +188,7 @@ class SiteInvite(models.Model):
@property
def link(self):
"""formats the invite link"""
return f"https://{DOMAIN}/invite/{self.code}"
return f"{BASE_URL}/invite/{self.code}"
class InviteRequest(BookWyrmModel):
@ -235,7 +235,7 @@ class PasswordReset(models.Model):
@property
def link(self):
"""formats the invite link"""
return f"https://{DOMAIN}/password-reset/{self.code}"
return f"{BASE_URL}/password-reset/{self.code}"
# pylint: disable=unused-argument

View file

@ -80,6 +80,10 @@ class Status(OrderedCollectionPageMixin, BookWyrmModel):
"""default sorting"""
ordering = ("-published_date",)
indexes = [
models.Index(fields=["remote_id"]),
models.Index(fields=["thread_id"]),
]
def save(self, *args, **kwargs):
"""save and notify"""
@ -388,10 +392,10 @@ class Quotation(BookStatus):
def _format_position(self) -> Optional[str]:
"""serialize page position"""
beg = self.position
end = self.endposition or 0
end = self.endposition
if self.position_mode != "PG" or not beg:
return None
return f"pp. {beg}-{end}" if end > beg else f"p. {beg}"
return f"pp. {beg}-{end}" if end else f"p. {beg}"
@property
def pure_content(self):
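The endposition fix above changes how quote positions are rendered. A small sketch mirroring _format_position, for illustration only:

def format_position(position, endposition, position_mode="PG"):
    if position_mode != "PG" or not position:
        return None
    return f"pp. {position}-{endposition}" if endposition else f"p. {position}"

assert format_position(32, None) == "p. 32"    # no more bogus "pp. 32-0"
assert format_position(32, 40) == "pp. 32-40"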

View file

@ -19,7 +19,7 @@ from bookwyrm.connectors import get_data, ConnectorException
from bookwyrm.models.shelf import Shelf
from bookwyrm.models.status import Status
from bookwyrm.preview_images import generate_user_preview_image_task
from bookwyrm.settings import DOMAIN, ENABLE_PREVIEW_IMAGES, USE_HTTPS, LANGUAGES
from bookwyrm.settings import BASE_URL, ENABLE_PREVIEW_IMAGES, LANGUAGES
from bookwyrm.signatures import create_key_pair
from bookwyrm.tasks import app, MISC
from bookwyrm.utils import regex
@ -42,12 +42,6 @@ def get_feed_filter_choices():
return [f[0] for f in FeedFilterChoices]
def site_link():
"""helper for generating links to the site"""
protocol = "https" if USE_HTTPS else "http"
return f"{protocol}://{DOMAIN}"
# pylint: disable=too-many-public-methods
class User(OrderedCollectionPageMixin, AbstractUser):
"""a user who wants to read books"""
@ -198,6 +192,14 @@ class User(OrderedCollectionPageMixin, AbstractUser):
hotp_secret = models.CharField(max_length=32, default=None, blank=True, null=True)
hotp_count = models.IntegerField(default=0, blank=True, null=True)
class Meta(AbstractUser.Meta):
"""indexes"""
indexes = [
models.Index(fields=["username"]),
models.Index(fields=["is_active", "local"]),
]
@property
def active_follower_requests(self):
"""Follow requests from active users"""
@ -206,8 +208,7 @@ class User(OrderedCollectionPageMixin, AbstractUser):
@property
def confirmation_link(self):
"""helper for generating confirmation links"""
link = site_link()
return f"{link}/confirm-email/{self.confirmation_code}"
return f"{BASE_URL}/confirm-email/{self.confirmation_code}"
@property
def following_link(self):
@ -341,7 +342,7 @@ class User(OrderedCollectionPageMixin, AbstractUser):
if not self.local and not re.match(regex.FULL_USERNAME, self.username):
# generate a username that uses the domain (webfinger format)
actor_parts = urlparse(self.remote_id)
self.username = f"{self.username}@{actor_parts.netloc}"
self.username = f"{self.username}@{actor_parts.hostname}"
# this user already exists, no need to populate fields
if not created:
@ -361,11 +362,10 @@ class User(OrderedCollectionPageMixin, AbstractUser):
with transaction.atomic():
# populate fields for local users
link = site_link()
self.remote_id = f"{link}/user/{self.localname}"
self.remote_id = f"{BASE_URL}/user/{self.localname}"
self.followers_url = f"{self.remote_id}/followers"
self.inbox = f"{self.remote_id}/inbox"
self.shared_inbox = f"{link}/inbox"
self.shared_inbox = f"{BASE_URL}/inbox"
self.outbox = f"{self.remote_id}/outbox"
# an id needs to be set before we can proceed with related models
@ -509,6 +509,13 @@ class KeyPair(ActivitypubMixin, BookWyrmModel):
activity_serializer = activitypub.PublicKey
serialize_reverse_fields = [("owner", "owner", "id")]
class Meta:
"""indexes"""
indexes = [
models.Index(fields=["remote_id"]),
]
def get_remote_id(self):
# self.owner is set by the OneToOneField on User
return f"{self.owner.remote_id}/#main-key"
@ -543,7 +550,7 @@ def set_remote_server(user_id, allow_external_connections=False):
user = User.objects.get(id=user_id)
actor_parts = urlparse(user.remote_id)
federated_server = get_or_create_remote_server(
actor_parts.netloc, allow_external_connections=allow_external_connections
actor_parts.hostname, allow_external_connections=allow_external_connections
)
# if we were unable to find the server, we need to create a new entry for it
if not federated_server:

View file

@ -1,4 +1,5 @@
""" Generate social media preview images for twitter/mastodon/etc """
import math
import os
import textwrap
@ -42,8 +43,8 @@ def get_imagefont(name, size):
return ImageFont.truetype(path, size)
except KeyError:
logger.error("Font %s not found in config", name)
except OSError:
logger.error("Could not load font %s from file", name)
except OSError as err:
logger.error("Could not load font %s from file: %s", name, err)
return ImageFont.load_default()
@ -59,7 +60,7 @@ def get_font(weight, size=28):
font.set_variation_by_name("Bold")
if weight == "regular":
font.set_variation_by_name("Regular")
except AttributeError:
except OSError:
pass
return font
@ -174,11 +175,13 @@ def generate_instance_layer(content_width):
site = models.SiteSettings.objects.get()
if site.logo_small:
logo_img = Image.open(site.logo_small)
with Image.open(site.logo_small) as logo_img:
logo_img.load()
else:
try:
static_path = os.path.join(settings.STATIC_ROOT, "images/logo-small.png")
logo_img = Image.open(static_path)
with Image.open(static_path) as logo_img:
logo_img.load()
except FileNotFoundError:
logo_img = None
@ -210,18 +213,9 @@ def generate_instance_layer(content_width):
def generate_rating_layer(rating, content_width):
"""Places components for rating preview"""
try:
icon_star_full = Image.open(
os.path.join(settings.STATIC_ROOT, "images/icons/star-full.png")
)
icon_star_empty = Image.open(
os.path.join(settings.STATIC_ROOT, "images/icons/star-empty.png")
)
icon_star_half = Image.open(
os.path.join(settings.STATIC_ROOT, "images/icons/star-half.png")
)
except FileNotFoundError:
return None
path_star_full = os.path.join(settings.STATIC_ROOT, "images/icons/star-full.png")
path_star_empty = os.path.join(settings.STATIC_ROOT, "images/icons/star-empty.png")
path_star_half = os.path.join(settings.STATIC_ROOT, "images/icons/star-half.png")
icon_size = 64
icon_margin = 10
@ -236,17 +230,23 @@ def generate_rating_layer(rating, content_width):
position_x = 0
for _ in range(math.floor(rating)):
rating_layer_mask.alpha_composite(icon_star_full, (position_x, 0))
position_x = position_x + icon_size + icon_margin
try:
with Image.open(path_star_full) as icon_star_full:
for _ in range(math.floor(rating)):
rating_layer_mask.alpha_composite(icon_star_full, (position_x, 0))
position_x = position_x + icon_size + icon_margin
if math.floor(rating) != math.ceil(rating):
rating_layer_mask.alpha_composite(icon_star_half, (position_x, 0))
position_x = position_x + icon_size + icon_margin
if math.floor(rating) != math.ceil(rating):
with Image.open(path_star_half) as icon_star_half:
rating_layer_mask.alpha_composite(icon_star_half, (position_x, 0))
position_x = position_x + icon_size + icon_margin
for _ in range(5 - math.ceil(rating)):
rating_layer_mask.alpha_composite(icon_star_empty, (position_x, 0))
position_x = position_x + icon_size + icon_margin
with Image.open(path_star_empty) as icon_star_empty:
for _ in range(5 - math.ceil(rating)):
rating_layer_mask.alpha_composite(icon_star_empty, (position_x, 0))
position_x = position_x + icon_size + icon_margin
except FileNotFoundError:
return None
rating_layer_mask = rating_layer_mask.getchannel("A")
rating_layer_mask = ImageOps.invert(rating_layer_mask)
@ -289,7 +289,8 @@ def generate_preview_image(
texts = texts or {}
# Cover
try:
inner_img_layer = Image.open(picture)
with Image.open(picture) as inner_img_layer:
inner_img_layer.load()
inner_img_layer.thumbnail(
(inner_img_width, inner_img_height), Image.Resampling.LANCZOS
)

View file

@ -19,7 +19,6 @@ DOMAIN = env("DOMAIN")
with open("VERSION", encoding="utf-8") as f:
version = f.read()
version = version.replace("\n", "")
f.close()
VERSION = version
@ -102,12 +101,14 @@ INSTALLED_APPS = [
"django.contrib.messages",
"django.contrib.staticfiles",
"django.contrib.humanize",
"oauth2_provider",
"file_resubmit",
"sass_processor",
"bookwyrm",
"celery",
"django_celery_beat",
"imagekit",
"pgtrigger",
"storages",
]
@ -350,30 +351,34 @@ USE_L10N = True
USE_TZ = True
USER_AGENT = f"BookWyrm (BookWyrm/{VERSION}; +https://{DOMAIN}/)"
# Imagekit generated thumbnails
ENABLE_THUMBNAIL_GENERATION = env.bool("ENABLE_THUMBNAIL_GENERATION", False)
IMAGEKIT_CACHEFILE_DIR = "thumbnails"
IMAGEKIT_DEFAULT_CACHEFILE_STRATEGY = "bookwyrm.thumbnail_generation.Strategy"
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.2/howto/static-files/
PROJECT_DIR = os.path.dirname(os.path.abspath(__file__))
CSP_ADDITIONAL_HOSTS = env.list("CSP_ADDITIONAL_HOSTS", [])
# Storage
PROTOCOL = "http"
if USE_HTTPS:
PROTOCOL = "https"
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
PORT = env.int("PORT", 443 if USE_HTTPS else 80)
if (USE_HTTPS and PORT == 443) or (not USE_HTTPS and PORT == 80):
NETLOC = DOMAIN
else:
NETLOC = f"{DOMAIN}:{PORT}"
BASE_URL = f"{PROTOCOL}://{NETLOC}"
USER_AGENT = f"BookWyrm (BookWyrm/{VERSION}; +{BASE_URL})"
# Storage
USE_S3 = env.bool("USE_S3", False)
USE_AZURE = env.bool("USE_AZURE", False)
S3_SIGNED_URL_EXPIRY = env.int("S3_SIGNED_URL_EXPIRY", 900)
if USE_S3:
# AWS settings
@ -385,19 +390,34 @@ if USE_S3:
AWS_S3_ENDPOINT_URL = env("AWS_S3_ENDPOINT_URL", None)
AWS_DEFAULT_ACL = "public-read"
AWS_S3_OBJECT_PARAMETERS = {"CacheControl": "max-age=86400"}
AWS_S3_URL_PROTOCOL = env("AWS_S3_URL_PROTOCOL", f"{PROTOCOL}:")
# S3 Static settings
STATIC_LOCATION = "static"
STATIC_URL = f"{PROTOCOL}://{AWS_S3_CUSTOM_DOMAIN}/{STATIC_LOCATION}/"
STATIC_URL = f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}/{STATIC_LOCATION}/"
STATIC_FULL_URL = STATIC_URL
STATICFILES_STORAGE = "bookwyrm.storage_backends.StaticStorage"
# S3 Media settings
MEDIA_LOCATION = "images"
MEDIA_URL = f"{PROTOCOL}://{AWS_S3_CUSTOM_DOMAIN}/{MEDIA_LOCATION}/"
MEDIA_URL = f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}/{MEDIA_LOCATION}/"
MEDIA_FULL_URL = MEDIA_URL
STATIC_FULL_URL = STATIC_URL
DEFAULT_FILE_STORAGE = "bookwyrm.storage_backends.ImagesStorage"
CSP_DEFAULT_SRC = ["'self'", AWS_S3_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = ["'self'", AWS_S3_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
# S3 Exports settings
EXPORTS_STORAGE = "bookwyrm.storage_backends.ExportsS3Storage"
# Content Security Policy
CSP_DEFAULT_SRC = [
"'self'",
f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}"
if AWS_S3_CUSTOM_DOMAIN
else None,
] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = [
"'self'",
f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}"
if AWS_S3_CUSTOM_DOMAIN
else None,
] + CSP_ADDITIONAL_HOSTS
elif USE_AZURE:
# Azure settings
AZURE_ACCOUNT_NAME = env("AZURE_ACCOUNT_NAME")
AZURE_ACCOUNT_KEY = env("AZURE_ACCOUNT_KEY")
AZURE_CONTAINER = env("AZURE_CONTAINER")
@ -407,6 +427,7 @@ elif USE_AZURE:
STATIC_URL = (
f"{PROTOCOL}://{AZURE_CUSTOM_DOMAIN}/{AZURE_CONTAINER}/{STATIC_LOCATION}/"
)
STATIC_FULL_URL = STATIC_URL
STATICFILES_STORAGE = "bookwyrm.storage_backends.AzureStaticStorage"
# Azure Media settings
MEDIA_LOCATION = "images"
@ -414,15 +435,24 @@ elif USE_AZURE:
f"{PROTOCOL}://{AZURE_CUSTOM_DOMAIN}/{AZURE_CONTAINER}/{MEDIA_LOCATION}/"
)
MEDIA_FULL_URL = MEDIA_URL
STATIC_FULL_URL = STATIC_URL
DEFAULT_FILE_STORAGE = "bookwyrm.storage_backends.AzureImagesStorage"
# Azure Exports settings
EXPORTS_STORAGE = None # not implemented yet
# Content Security Policy
CSP_DEFAULT_SRC = ["'self'", AZURE_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = ["'self'", AZURE_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
else:
# Static settings
STATIC_URL = "/static/"
STATIC_FULL_URL = BASE_URL + STATIC_URL
STATICFILES_STORAGE = "django.contrib.staticfiles.storage.StaticFilesStorage"
# Media settings
MEDIA_URL = "/images/"
MEDIA_FULL_URL = f"{PROTOCOL}://{DOMAIN}{MEDIA_URL}"
STATIC_FULL_URL = f"{PROTOCOL}://{DOMAIN}{STATIC_URL}"
MEDIA_FULL_URL = BASE_URL + MEDIA_URL
DEFAULT_FILE_STORAGE = "django.core.files.storage.FileSystemStorage"
# Exports settings
EXPORTS_STORAGE = "bookwyrm.storage_backends.ExportsFileStorage"
# Content Security Policy
CSP_DEFAULT_SRC = ["'self'"] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = ["'self'"] + CSP_ADDITIONAL_HOSTS

View file

@ -1,6 +1,7 @@
"""Handles backends for storages"""
import os
from tempfile import SpooledTemporaryFile
from django.core.files.storage import FileSystemStorage
from storages.backends.s3boto3 import S3Boto3Storage
from storages.backends.azure_storage import AzureStorage
@ -61,3 +62,18 @@ class AzureImagesStorage(AzureStorage): # pylint: disable=abstract-method
location = "images"
overwrite_files = False
class ExportsFileStorage(FileSystemStorage): # pylint: disable=abstract-method
"""Storage class for exports contents with local files"""
location = "exports"
overwrite_files = False
class ExportsS3Storage(S3Boto3Storage): # pylint: disable=abstract-method
"""Storage class for exports contents with S3"""
location = "exports"
default_acl = None
overwrite_files = False
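The new ExportsFileStorage and ExportsS3Storage classes are the targets of the EXPORTS_STORAGE setting above. A hedged sketch of how such a dotted path can be resolved at runtime (illustrative only; not necessarily how BookWyrm wires it up):
# Illustrative: resolve the configured exports backend from its dotted path.
from django.conf import settings
from django.utils.module_loading import import_string
storage_class = import_string(settings.EXPORTS_STORAGE)  # e.g. "bookwyrm.storage_backends.ExportsFileStorage"
exports_storage = storage_class()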

View file

@ -8,7 +8,7 @@
<h1 class="title">{% trans "File too large" %}</h1>
<p class="content">{% trans "The file you are uploading is too large." %}</p>
<p class="content">
{% blocktrans %}
{% blocktrans trimmed %}
You can try using a smaller file, or ask your BookWyrm server administrator to increase the <code>DATA_UPLOAD_MAX_MEMORY_SIZE</code> setting.
{% endblocktrans %}
</p>

View file

@ -55,6 +55,8 @@
<p class="field"><label class="label" for="id_wikipedia_link">{% trans "Wikipedia link:" %}</label> {{ form.wikipedia_link }}</p>
<p class="field"><label class="label" for="id_wikidata">{% trans "Wikidata:" %}</label> {{ form.wikidata }}</p>
{% include 'snippets/form_errors.html' with errors_list=form.wikipedia_link.errors id="desc_wikipedia_link" %}
<p class="field"><label class="label" for="id_website">{% trans "Website:" %}</label> {{ form.website }}</p>

View file

@ -31,7 +31,7 @@
{{ book.title }}
</h1>
{% if book.subtitle or book.series %}
{% if book.subtitle or book.series.exists %}
<p class="subtitle title is-5">
{% if book.subtitle %}
<meta
@ -44,20 +44,19 @@
</span>
{% endif %}
{% if book.series %}
<meta itemprop="position" content="{{ book.series_number }}">
{% for book_series in book.series.all %}
{% spaceless %}
<span itemprop="isPartOf" itemscope itemtype="https://schema.org/BookSeries">
{% if book.authors.exists %}
<a href="{% url 'book-series-by' book.authors.first.id %}?series_name={{ book.series | urlencode }}"
itemprop="url">
{% endif %}
<span itemprop="name">{{ book.series }}</span>
{% if book.series_number %} #{{ book.series_number }}{% endif %}
{% if book.authors.exists %}
<a href="{{ book_series.local_path }}" itemprop="url">
<span itemprop="name">{{ book_series.series.name }}</span>
</a>
{% endif %}
{% if book_series.number %}
<span>, #</span>
<span itemprop="position">{{ book.series_number }}</span>
{% endif %}
</span>
{% endif %}
{% endspaceless %}
{% endfor %}
</p>
{% endif %}

View file

@ -2,10 +2,10 @@
<div style="font-family: BlinkMacSystemFont,-apple-system,'Segoe UI',Roboto,Oxygen,Ubuntu,Cantarell,'Fira Sans','Droid Sans','Helvetica Neue',Helvetica,Arial,sans-serif; border-radius: 6px; background-color: #efefef; max-width: 632px;">
<div style="padding: 1rem; overflow: auto;">
<div style="float: left; margin-right: 1rem;">
<a style="color: #3273dc;" href="https://{{ domain }}" style="text-decoration: none;"><img src="{{ logo }}" alt="logo" loading="lazy" decoding="async"></a>
<a style="color: #3273dc;" href="{{ base_url }}" style="text-decoration: none;"><img src="{{ logo }}" alt="logo" loading="lazy" decoding="async"></a>
</div>
<div>
<a style="color: black; text-decoration: none" href="https://{{ domain }}" style="text-decoration: none;"><strong>{{ site_name }}</strong><br>
<a style="color: black; text-decoration: none" href="{{ base_url }}" style="text-decoration: none;"><strong>{{ site_name }}</strong><br>
{{ domain }}</a>
</div>
</div>
@ -18,9 +18,9 @@
</div>
<div style="padding: 1rem; font-size: 0.8rem;">
<p style="margin: 0; color: #333;">{% blocktrans %}BookWyrm hosted on <a style="color: #3273dc;" href="https://{{ domain }}">{{ site_name }}</a>{% endblocktrans %}</p>
<p style="margin: 0; color: #333;">{% blocktrans %}BookWyrm hosted on <a style="color: #3273dc;" href="{{ base_url }}">{{ site_name }}</a>{% endblocktrans %}</p>
{% if user %}
<p style="margin: 0; color: #333;"><a style="color: #3273dc;" href="https://{{ domain }}{% url 'prefs-profile' %}">{% trans "Email preference" %}</a></p>
<p style="margin: 0; color: #333;"><a style="color: #3273dc;" href="{{ base_url }}{% url 'prefs-profile' %}">{% trans "Email preference" %}</a></p>
{% endif %}
</div>
</div>

View file

@ -12,6 +12,6 @@
<p>
{% url 'code-of-conduct' as coc_path %}
{% url 'about' as about_path %}
{% blocktrans %}Learn more <a href="https://{{ domain }}{{ about_path }}">about {{ site_name }}</a>.{% endblocktrans %}
{% blocktrans %}Learn more <a href="{{ base_url }}{{ about_path }}">about {{ site_name }}</a>.{% endblocktrans %}
</p>
{% endblock %}

View file

@ -5,6 +5,6 @@
{{ invite_link }}
{% blocktrans %}Learn more about {{ site_name }}:{% endblocktrans %} https://{{ domain }}{% url 'about' %}
{% blocktrans %}Learn more about {{ site_name }}:{% endblocktrans %} {{ base_url }}{% url 'about' %}
{% endblock %}

View file

@ -10,6 +10,6 @@
<Image width="16" height="16" type="image/x-icon">{{ image }}</Image>
<Url
type="text/html"
template="https://{{ DOMAIN }}{% url 'search' %}?q={searchTerms}"
template="{{ BASE_URL }}{% url 'search' %}?q={searchTerms}"
/>
</OpenSearchDescription>

View file

@ -97,25 +97,25 @@
</td>
</tr>
{% endif %}
{% for job in jobs %}
{% for export in jobs %}
<tr>
<td>{{ job.updated_date }}</td>
<td>{{ export.job.updated_date }}</td>
<td>
<span
{% if job.status == "stopped" or job.status == "failed" %}
{% if export.job.status == "stopped" or export.job.status == "failed" %}
class="tag is-danger"
{% elif job.status == "pending" %}
{% elif export.job.status == "pending" %}
class="tag is-warning"
{% elif job.complete %}
{% elif export.job.complete %}
class="tag"
{% else %}
class="tag is-success"
{% endif %}
>
{% if job.status %}
{{ job.status }}
{{ job.status_display }}
{% elif job.complete %}
{% if export.job.status %}
{{ export.job.status }}
{{ export.job.status_display }}
{% elif export.job.complete %}
{% trans "Complete" %}
{% else %}
{% trans "Active" %}
@ -123,18 +123,20 @@
</span>
</td>
<td>
<span>{{ job.export_data|get_file_size }}</span>
{% if export.size %}
<span>{{ export.size|get_file_size }}</span>
{% endif %}
</td>
<td>
{% if job.complete and not job.status == "stopped" and not job.status == "failed" %}
<p>
<a download="" href="/preferences/user-export/{{ job.task_id }}">
<span class="icon icon-download" aria-hidden="true"></span>
<span class="is-hidden-mobile">
{% trans "Download your export" %}
</span>
</a>
</p>
{% if export.url %}
<a href="{{ export.url }}">
<span class="icon icon-download" aria-hidden="true"></span>
<span class="is-hidden-mobile">
{% trans "Download your export" %}
</span>
</a>
{% elif export.unavailable %}
{% trans "Archive is no longer available" %}
{% endif %}
</td>
</tr>

View file

@ -109,7 +109,7 @@
<p class="block">
{% if request.user.is_authenticated %}
{% if not remote %}
<a href="{{ request.path }}?q={{ query }}&type=book&remote=true" id="tour-load-from-other-catalogues">
<a href="{{ request.path }}?q={{ query|urlencode }}&type=book&remote=true" id="tour-load-from-other-catalogues">
{% trans "Load results from other catalogues" %}
</a>
{% else %}

View file

@ -41,18 +41,18 @@
<nav class="tabs">
<ul>
<li{% if type == "book" %} class="is-active"{% endif %}>
<a href="{% url 'search' %}?q={{ query }}&type=book">{% trans "Books" %}</a>
<a href="{% url 'search' %}?q={{ query|urlencode }}&type=book">{% trans "Books" %}</a>
</li>
<li{% if type == "author" %} class="is-active"{% endif %}>
<a href="{% url 'search' %}?q={{ query }}&type=author">{% trans "Authors" %}</a>
<a href="{% url 'search' %}?q={{ query|urlencode }}&type=author">{% trans "Authors" %}</a>
</li>
{% if request.user.is_authenticated %}
<li{% if type == "user" %} class="is-active"{% endif %}>
<a href="{% url 'search' %}?q={{ query }}&type=user">{% trans "Users" %}</a>
<a href="{% url 'search' %}?q={{ query|urlencode }}&type=user">{% trans "Users" %}</a>
</li>
{% endif %}
<li{% if type == "list" %} class="is-active"{% endif %}>
<a href="{% url 'search' %}?q={{ query }}&type=list">{% trans "Lists" %}</a>
<a href="{% url 'search' %}?q={{ query|urlencode }}&type=list">{% trans "Lists" %}</a>
</li>
</ul>
</nav>
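Passing the query through urlencode above keeps reserved characters from breaking the query string; the filter behaves roughly like urllib.parse.quote, as this small sketch shows:
# Sketch: why the search query needs escaping before being placed in a URL.
from urllib.parse import quote
query = "cats & dogs?"
print(f"/search?q={quote(query)}&type=book")
# -> /search?q=cats%20%26%20dogs%3F&type=book (the literal & and ? no longer split the URL)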

View file

@ -157,13 +157,13 @@
>
<div class="notification is-danger is-light">
<p class="my-2">{% trans "Users are currently unable to start new user exports. This is the default setting." %}</p>
{% if use_s3 %}
<p>{% trans "It is not currently possible to provide user exports when using s3 storage. The BookWyrm development team are working on a fix for this." %}</p>
{% if use_azure %}
<p>{% trans "It is not currently possible to provide user exports when using Azure storage." %}</p>
{% endif %}
</div>
{% csrf_token %}
<div class="control">
<button type="submit" class="button is-success" {% if use_s3 %}disabled{% endif %}>
<button type="submit" class="button is-success" {% if use_azure %}disabled{% endif %}>
{% trans "Enable user exports" %}
</button>
</div>

View file

@ -0,0 +1,3 @@
{% if book.series %}
({{ book.series }}{% if book.series_number %}, #{{ book.series_number }}{% endif %})
{% endif %}
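As a hedged example of this snippet's output: for a book whose series field is "Earthsea Cycle" and whose series_number is "2" (illustrative values), it renders as "(Earthsea Cycle, #2)".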

View file

@ -9,12 +9,15 @@
{% if book.authors.exists %}
{% blocktrans trimmed with path=book.local_path title=book|book_title %}
<a href="{{ path }}">{{ title }}</a> by
<a href="{{ path }}">{{ title }}</a>
by
{% endblocktrans %}&nbsp;{% include 'snippets/authors.html' with book=book limit=3 %}
{% else %}
<a href="{{ book.local_path }}">{{ book|book_title }}</a>
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endcache %}
{% endspaceless %}

View file

@ -17,4 +17,7 @@ commented on <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endwith %}

View file

@ -17,4 +17,7 @@ quoted <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endwith %}

View file

@ -19,4 +19,7 @@ finished reading <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endspaceless %}

View file

@ -19,4 +19,7 @@ started reading <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endspaceless %}

View file

@ -17,4 +17,7 @@ reviewed <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endwith %}

View file

@ -19,5 +19,8 @@ stopped reading <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endspaceless %}

View file

@ -19,4 +19,7 @@ wants to read <a href="{{ book_path }}">{{ book }}</a>
{% endblocktrans %}
{% endif %}
{% include 'snippets/book_series.html' with book=book %}
{% endspaceless %}

View file

@ -120,7 +120,7 @@ def id_to_username(user_id):
"""given an arbitrary remote id, return the username"""
if user_id:
url = urlparse(user_id)
domain = url.netloc
domain = url.hostname
parts = url.path.split("/")
name = parts[-1]
value = f"{name}@{domain}"
@ -130,11 +130,14 @@ def id_to_username(user_id):
@register.filter(name="get_file_size")
def get_file_size(file):
def get_file_size(nbytes):
"""display the size of a file in human readable terms"""
try:
raw_size = os.stat(file.path).st_size
raw_size = float(nbytes)
except (ValueError, TypeError):
return repr(nbytes)
else:
if raw_size < 1024:
return f"{raw_size} bytes"
if raw_size < 1024**2:
@ -142,8 +145,6 @@ def get_file_size(file):
if raw_size < 1024**3:
return f"{raw_size/1024**2:.2f} MB"
return f"{raw_size/1024**3:.2f} GB"
except Exception: # pylint: disable=broad-except
return ""
@register.filter(name="get_user_permission")
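Since the filter now takes a byte count rather than a file object, its behaviour is easy to check in isolation. A sketch that mirrors the thresholds above (the KB branch is elided in the hunk and assumed here):
# Sketch of the reworked filter, for illustration.
def get_file_size(nbytes):
    try:
        raw_size = float(nbytes)
    except (ValueError, TypeError):
        return repr(nbytes)
    if raw_size < 1024:
        return f"{raw_size} bytes"
    if raw_size < 1024**2:
        return f"{raw_size/1024:.2f} KB"  # assumed branch, not shown in the hunk
    if raw_size < 1024**3:
        return f"{raw_size/1024**2:.2f} MB"
    return f"{raw_size/1024**3:.2f} GB"
print(get_file_size(2048))         # 2.00 KB
print(get_file_size(5 * 1024**2))  # 5.00 MB
print(get_file_size(None))         # None (repr of the unusable input)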

View file

@ -7,13 +7,13 @@ class Author(TestCase):
"""serialize author tests"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""initial data"""
self.book = models.Edition.objects.create(
cls.book = models.Edition.objects.create(
title="Example Edition",
remote_id="https://example.com/book/1",
)
self.author = models.Author.objects.create(
cls.author = models.Author.objects.create(
name="Author fullname",
aliases=["One", "Two"],
bio="bio bio bio",

View file

@ -1,12 +1,10 @@
""" tests the base functionality for activitypub dataclasses """
from io import BytesIO
import json
import pathlib
from unittest.mock import patch
from dataclasses import dataclass
from django.test import TestCase
from PIL import Image
import responses
from bookwyrm import activitypub
@ -29,16 +27,18 @@ class BaseActivity(TestCase):
"""the super class for model-linked activitypub dataclasses"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""we're probably going to re-use this so why copy/paste"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "mouseword", local=True, localname="mouse"
)
self.user.remote_id = "http://example.com/a/b"
self.user.save(broadcast=False, update_fields=["remote_id"])
cls.user.remote_id = "http://example.com/a/b"
cls.user.save(broadcast=False, update_fields=["remote_id"])
def setUp(self):
datafile = pathlib.Path(__file__).parent.joinpath("../data/ap_user.json")
@ -46,13 +46,11 @@ class BaseActivity(TestCase):
# don't try to load the user icon
del self.userdata["icon"]
image_file = pathlib.Path(__file__).parent.joinpath(
image_path = pathlib.Path(__file__).parent.joinpath(
"../../static/images/default_avi.jpg"
)
image = Image.open(image_file)
output = BytesIO()
image.save(output, format=image.format)
self.image_data = output.getvalue()
with open(image_path, "rb") as image_file:
self.image_data = image_file.read()
def test_get_representative_not_existing(self, *_):
"""test that an instance representative actor is created if it does not exist"""
@ -232,10 +230,12 @@ class BaseActivity(TestCase):
)
# sets the celery task call to the function call
with patch("bookwyrm.activitypub.base_activity.set_related_field.delay"):
with patch("bookwyrm.models.status.Status.ignore_activity") as discarder:
discarder.return_value = False
update_data.to_model(model=models.Status, instance=status)
with (
patch("bookwyrm.activitypub.base_activity.set_related_field.delay"),
patch("bookwyrm.models.status.Status.ignore_activity") as discarder,
):
discarder.return_value = False
update_data.to_model(model=models.Status, instance=status)
self.assertIsNone(status.attachments.first())
@responses.activate
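The stacked patch() calls above now use a parenthesized with-statement, a syntax formalised in Python 3.10. The shape of the pattern in isolation, with generic targets for the sake of a runnable sketch:
# Sketch: grouping several context managers in one parenthesized with-statement.
import os
import os.path
from unittest.mock import patch
with (
    patch("os.getcwd", return_value="/tmp"),
    patch("os.path.exists") as exists_mock,
):
    exists_mock.return_value = True
    assert os.getcwd() == "/tmp"
    assert os.path.exists("/anywhere")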

View file

@ -11,18 +11,20 @@ class Note(TestCase):
"""the model-linked ActivityPub dataclass for Note-based types"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""create a shared user"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "mouseword", local=True, localname="mouse"
)
self.user.remote_id = "https://test-instance.org/user/critic"
self.user.save(broadcast=False, update_fields=["remote_id"])
cls.user.remote_id = "https://test-instance.org/user/critic"
cls.user.save(broadcast=False, update_fields=["remote_id"])
self.book = models.Edition.objects.create(
cls.book = models.Edition.objects.create(
title="Test Edition", remote_id="http://book.com/book"
)

View file

@ -11,10 +11,10 @@ class Quotation(TestCase):
"""we have hecka ways to create statuses"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""model objects we'll need"""
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.user = models.User.objects.create_user(
cls.user = models.User.objects.create_user(
"mouse",
"mouse@mouse.mouse",
"mouseword",
@ -23,7 +23,7 @@ class Quotation(TestCase):
outbox="https://example.com/user/mouse/outbox",
remote_id="https://example.com/user/mouse",
)
self.book = models.Edition.objects.create(
cls.book = models.Edition.objects.create(
title="Example Edition",
remote_id="https://example.com/book/1",
)

View file

@ -16,15 +16,17 @@ class Activitystreams(TestCase):
"""using redis to build activity streams"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""use a test csv"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.local_user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.local_user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
)
self.another_user = models.User.objects.create_user(
cls.another_user = models.User.objects.create_user(
"nutria",
"nutria@nutria.nutria",
"password",
@ -32,7 +34,7 @@ class Activitystreams(TestCase):
localname="nutria",
)
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.remote_user = models.User.objects.create_user(
cls.remote_user = models.User.objects.create_user(
"rat",
"rat@rat.com",
"ratword",
@ -42,7 +44,7 @@ class Activitystreams(TestCase):
outbox="https://example.com/users/rat/outbox",
)
work = models.Work.objects.create(title="test work")
self.book = models.Edition.objects.create(title="test book", parent_work=work)
cls.book = models.Edition.objects.create(title="test book", parent_work=work)
def setUp(self):
"""per-test setUp"""
@ -105,9 +107,11 @@ class Activitystreams(TestCase):
privacy="direct",
book=self.book,
)
with patch("bookwyrm.activitystreams.r.set"), patch(
"bookwyrm.activitystreams.r.delete"
), patch("bookwyrm.activitystreams.ActivityStream.get_store") as redis_mock:
with (
patch("bookwyrm.activitystreams.r.set"),
patch("bookwyrm.activitystreams.r.delete"),
patch("bookwyrm.activitystreams.ActivityStream.get_store") as redis_mock,
):
redis_mock.return_value = [status.id, status2.id]
result = self.test_stream.get_activity_stream(self.local_user)
self.assertEqual(result.count(), 2)

View file

@ -15,16 +15,18 @@ class Activitystreams(TestCase):
"""using redis to build activity streams"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""use a test csv"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.local_user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.local_user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
)
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.remote_user = models.User.objects.create_user(
cls.remote_user = models.User.objects.create_user(
"rat",
"rat@rat.com",
"ratword",
@ -34,7 +36,7 @@ class Activitystreams(TestCase):
outbox="https://example.com/users/rat/outbox",
)
work = models.Work.objects.create(title="test work")
self.book = models.Edition.objects.create(title="test book", parent_work=work)
cls.book = models.Edition.objects.create(title="test book", parent_work=work)
def test_get_statuses_for_user_books(self, *_):
"""create a stream for a user"""

View file

@ -13,15 +13,17 @@ class Activitystreams(TestCase):
"""using redis to build activity streams"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""use a test csv"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.local_user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.local_user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
)
self.another_user = models.User.objects.create_user(
cls.another_user = models.User.objects.create_user(
"nutria",
"nutria@nutria.nutria",
"password",
@ -29,7 +31,7 @@ class Activitystreams(TestCase):
localname="nutria",
)
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.remote_user = models.User.objects.create_user(
cls.remote_user = models.User.objects.create_user(
"rat",
"rat@rat.com",
"ratword",

View file

@ -13,15 +13,17 @@ class Activitystreams(TestCase):
"""using redis to build activity streams"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""use a test csv"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.local_user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.local_user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
)
self.another_user = models.User.objects.create_user(
cls.another_user = models.User.objects.create_user(
"nutria",
"nutria@nutria.nutria",
"password",
@ -29,7 +31,7 @@ class Activitystreams(TestCase):
localname="nutria",
)
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.remote_user = models.User.objects.create_user(
cls.remote_user = models.User.objects.create_user(
"rat",
"rat@rat.com",
"ratword",
@ -39,7 +41,7 @@ class Activitystreams(TestCase):
outbox="https://example.com/users/rat/outbox",
)
work = models.Work.objects.create(title="test work")
self.book = models.Edition.objects.create(title="test book", parent_work=work)
cls.book = models.Edition.objects.create(title="test book", parent_work=work)
def test_localstream_get_audience_remote_status(self, *_):
"""get a list of users that should see a status"""

View file

@ -15,16 +15,18 @@ class ActivitystreamsSignals(TestCase):
"""using redis to build activity streams"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""use a test csv"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.local_user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.local_user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
)
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.remote_user = models.User.objects.create_user(
cls.remote_user = models.User.objects.create_user(
"rat",
"rat@rat.com",
"ratword",

View file

@ -8,15 +8,17 @@ class Activitystreams(TestCase):
"""using redis to build activity streams"""
@classmethod
def setUpTestData(self): # pylint: disable=bad-classmethod-argument
def setUpTestData(cls):
"""use a test csv"""
with patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"), patch(
"bookwyrm.activitystreams.populate_stream_task.delay"
), patch("bookwyrm.lists_stream.populate_lists_task.delay"):
self.local_user = models.User.objects.create_user(
with (
patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
patch("bookwyrm.activitystreams.populate_stream_task.delay"),
patch("bookwyrm.lists_stream.populate_lists_task.delay"),
):
cls.local_user = models.User.objects.create_user(
"mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
)
self.another_user = models.User.objects.create_user(
cls.another_user = models.User.objects.create_user(
"nutria",
"nutria@nutria.nutria",
"password",
@ -24,7 +26,7 @@ class Activitystreams(TestCase):
localname="nutria",
)
with patch("bookwyrm.models.user.set_remote_server.delay"):
self.remote_user = models.User.objects.create_user(
cls.remote_user = models.User.objects.create_user(
"rat",
"rat@rat.com",
"ratword",
@ -34,11 +36,9 @@ class Activitystreams(TestCase):
outbox="https://example.com/users/rat/outbox",
)
work = models.Work.objects.create(title="test work")
self.book = models.Edition.objects.create(title="test book", parent_work=work)
cls.book = models.Edition.objects.create(title="test book", parent_work=work)
with patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"):
self.status = models.Status.objects.create(
content="hi", user=self.local_user
)
cls.status = models.Status.objects.create(content="hi", user=cls.local_user)
def test_add_book_statuses_task(self):
"""statuses related to a book"""

Some files were not shown because too many files have changed in this diff.