Compare commits

198 commits
0.8.0 ... main

Author SHA1 Message Date
Jamie Bliss 024b01a144
Add myself to a few docs (#713)
* Add myself to a few docs

* Update conf.py
2024-05-21 14:11:53 -04:00
Henri Dickson 8f17b81912
fix 500 when saving edited announcements (#708) 2024-05-21 13:55:17 -04:00
Andrew Godwin 7c34ac78ed Write a release checklist and do a couple things on it 2024-02-06 14:49:35 -07:00
Henri Dickson 72eb6a6271
add application/activity+json to accept header to improve compatibility (#694) 2024-02-05 21:40:04 -05:00
Jamie Bliss b2223ddf42
Back out push notification changes 2024-02-05 21:18:59 -05:00
Jamie Bliss 045a499ddf
Fix docs 2024-02-05 20:59:22 -05:00
Jamie Bliss 0fa48578f2
Write release notes for 0.11.0 2024-02-05 20:53:09 -05:00
Henri Dickson f86f3a49e4
Fix when report ap message does not have content field (#689) 2024-01-08 19:48:21 -07:00
Henri Dickson 2f4daa02bd
Add missing validator to initial migration (#687) 2024-01-04 08:59:26 -07:00
Henri Dickson 798222dcdb
Post update/delete also fanout to those who liked/boosted it but not following the author (#684) 2023-12-31 11:06:30 -07:00
Henri Dickson 74b3ac551a
Fix accept/reject follow request (#683) 2023-12-27 11:48:09 -07:00
Henri Dickson 4a09379e09
Fix federating with GoToSocial (#682) 2023-12-26 10:26:03 -07:00
Henri Dickson 448092d6d9
Improve identity deletion (#678) 2023-12-16 23:49:59 +00:00
Henri Dickson 5d508a17ec
Basic protection against invalid domain names (#680) 2023-12-13 09:04:41 +00:00
Jamie Bliss d07482f5a8
Allow statusless posts (#677) 2023-12-07 16:32:18 -07:00
Henri Dickson 123c20efb1
When remote follows local, hold off sending Accept if remote identity is not fully fetched (#676) 2023-12-06 11:08:41 -07:00
Karthik Balakrishnan 83607779cd
Fix README: 0.10.1 is latest release (#675) 2023-12-01 09:11:18 -07:00
Andrew Godwin 837320f461 Invert pruning exit codes 2023-12-01 00:03:09 -07:00
Rob 5f28d702f8
Make max_media_attachments configurable by admin (#669) 2023-11-28 09:52:04 -07:00
Henri Dickson ac7fef4b28
Do not fetch webfinger when querying identity on local domain (#668) 2023-11-26 21:00:58 -07:00
Henri Dickson 6855e74c6f
Do not retry unmute if mute never expires 2023-11-26 12:46:31 -07:00
Henri Dickson a58d7ccd8f
Do not make local identities outdated (#667) 2023-11-26 11:19:18 -07:00
Rob 1a728ea023
Add s3-insecure to pydantic checker (#665) 2023-11-26 11:13:55 -07:00
Humberto Rocha b031880e41
Extract json parser to core and use in fetch_actor (#663) 2023-11-20 11:46:51 -07:00
Humberto Rocha 81d019ad0d
Improve search api json parsing (#662) 2023-11-19 11:32:35 -07:00
Henri Dickson 5267e4108c
Allow unicode characters in hashtag (#659) 2023-11-19 09:58:20 -07:00
Henri Dickson b122e2beda
Fix fetching post from another takahe by searching its url (#661) 2023-11-18 21:03:51 -07:00
Rob ae1bfc49a7
Add s3-insecure for S3 backend (#658) 2023-11-17 21:49:06 -07:00
Osma Ahvenlampi 1ceef59bec
Module-specific loggers and minor reformatting (#657) 2023-11-16 10:27:20 -07:00
Humberto Rocha 2f546dfa74
Do not canonicalise non json content in the search endpoint (#654) 2023-11-15 15:00:56 -07:00
Andrew Godwin cc9e397f60 Ensure post pruning has a random selection element 2023-11-14 00:04:18 -07:00
Andrew Godwin dc397903b2 Fix release date formatting 2023-11-13 12:18:30 -07:00
Andrew Godwin debf4670e8 Releasing 0.10.1 2023-11-13 12:16:40 -07:00
Andrew Godwin e49bfc4775 Add Stator tuning notes 2023-11-13 10:52:22 -07:00
Andrew Godwin 308dd033e1 Significantly drop the default settings for stator 2023-11-13 10:39:21 -07:00
Andrew Godwin 460d1d7e1c Don't prune replies to local, add docs 2023-11-12 18:32:38 -07:00
Andrew Godwin eb0b0d775c Don't delete mentioned people 2023-11-12 18:06:29 -07:00
Andrew Godwin 74f69a3813 Add identity pruning, improve post pruning 2023-11-12 18:01:01 -07:00
Andrew Godwin 9fc497f826 Mention that the final number includes dependencies 2023-11-12 17:12:05 -07:00
Andrew Godwin ab3648e05d Add console logging back to Stator 2023-11-12 16:49:01 -07:00
Andrew Godwin 476f817464 Only consider local replies 2023-11-12 16:31:20 -07:00
Andrew Godwin 99e7fb8639 Fix prune issues when multiple replies 2023-11-12 16:30:49 -07:00
Andrew Godwin 87344b47b5 Add manual post pruning command 2023-11-12 16:23:43 -07:00
Andrew Godwin aa39ef0571 Move remote pruning note over to 0.11 2023-11-12 14:43:45 -07:00
Andrew Godwin 110a5e64dc "a" to "our" is important meaning 2023-11-12 14:42:59 -07:00
Andrew Godwin bae76c3063 Add 0.10 to release index with date 2023-11-12 14:39:24 -07:00
Andrew Godwin 9bb40ca7f6 Releasing 0.10 2023-11-12 14:12:06 -07:00
Andrew Godwin af7f1173fc Disable remote post pruning via Stator for now 2023-11-12 12:37:04 -07:00
Andrew Godwin 30e9b1f62d Ignore more Lemmy things 2023-11-12 12:35:11 -07:00
Andrew Godwin 95089c0c61 Ignore some messages at inbox view time 2023-11-12 12:09:09 -07:00
Andrew Godwin d815aa53e1 Ignore lemmy-flavour likes and dislikes 2023-11-12 11:21:23 -07:00
Andrew Godwin e6e64f1000 Don't use other server URIs in our IDs (Fixes #323) 2023-11-12 10:21:07 -07:00
Andrew Godwin c3bf7563b4
Fix memcached testing error on GH Actions 2023-11-09 12:47:08 -07:00
Andrew Godwin e577d020ee Bump to Python 3.11, as 3.10 is in security-only now 2023-11-09 12:19:56 -07:00
Andrew Godwin 57cefa967c Prepping 0.10 release notes 2023-11-09 12:10:31 -07:00
Andrew Godwin 6fdfdca442 Update all the pre-commit hooks 2023-11-09 12:07:21 -07:00
Andrew Godwin e17f17385a Add setting to keep migration off by default for now 2023-11-09 11:58:40 -07:00
Jamie Bliss 5cc74900b1
Update client app compatibility, add links (#649)
* Tuba advertises compatibility
* Phanpy seems to work for me
2023-11-08 13:33:08 -07:00
Osma Ahvenlampi 24577761ed
focalpoints are floats between -1..1, not int (#648) 2023-11-04 11:24:09 -06:00
Osma Ahvenlampi 039adae797
Refactoring inbox processing to smaller tasks (#647) 2023-10-26 10:01:03 -06:00
Osma Ahvenlampi 9368996a5b
use logging instead of sentry.capture_* (#646) 2023-10-23 10:33:55 -06:00
Andrew Godwin 84ded2f3a5 Turn off remote prune for now 2023-10-19 08:42:01 -06:00
Andrew Godwin 07d187309e Pruning docs and ability to turn off 2023-10-01 10:49:10 -06:00
Andrew Godwin 8cc1691857 Delete remote posts after a set horizon time 2023-10-01 10:43:22 -06:00
Osma Ahvenlampi b60e807b91
Separate out timeouts from other remote server issues (#645) 2023-10-01 09:27:23 -06:00
Osma Ahvenlampi 1e8a392e57
Deal with unknown json-ld schemas (#644)
Rather than raising an error, returns an empty schema.
2023-09-20 14:58:38 -04:00
Humberto Rocha 8c832383e0
Update ld schema to support instances that implement multikey and wytchspace (#643) 2023-09-16 19:09:13 -06:00
Andrew Godwin 6c83d7b67b Fix #642: Race condition searching for unseen users 2023-09-15 10:21:33 -06:00
Andrew Godwin dd532e4425 Fix tests for profile redirect and add to notes 2023-09-07 22:06:50 -06:00
Andrew Godwin 1e76430f74 Don't show identity pages for remote identities 2023-09-07 21:54:42 -06:00
Andrew Godwin ddf24d376e Fix initial identity choices 2023-09-04 11:21:04 -06:00
Osma Ahvenlampi 2a0bbf0d5d
One more try to get the fetch_account/sync_pins/post relationship and parallelism fixed (#634) 2023-08-26 15:16:14 -06:00
Henri Dickson 555046ac4d
Ignore unknown tag type in incoming post, rather than raise exception (#639) 2023-08-25 16:35:57 -06:00
Henri Dickson b003af64cc
Do not print "Scheduling 0 handled" unless settings.DEBUG is on (#636) 2023-08-23 22:12:21 +10:00
Osma Ahvenlampi 671807beb8
Misc lemmy compat (#635) 2023-08-21 11:55:48 +09:30
Osma Ahvenlampi 2a50928f27
Signatures need to use UTF-8 in order to represent all URLs (#633) 2023-08-21 11:54:47 +09:30
Henri Dickson 70b9e3b900
Support follow requests (#625) 2023-08-18 15:49:45 +09:30
TAKAHASHI Shuuji faa181807c
Fix Accept object id for follow activity for Misskey and Firefish (#632) 2023-08-18 15:42:53 +09:30
TAKAHASHI Shuuji 679f0def99
Add stub API endpoint for user suggestion (api/v2/suggestions) (#631) 2023-08-17 17:41:06 +09:30
Henri Dickson 1262c619bb
Make nodeinfo do metadata based on domain requested (#628) 2023-08-11 09:34:25 -06:00
Andrew Godwin 0c72327ab7 Fix state graph 2023-08-08 09:04:21 -06:00
Andrew Godwin 84703bbc45 Lay groundwork for moved identity state 2023-08-08 08:55:16 -06:00
TAKAHASHI Shuuji 93dfc85cf7
Fix small syntax errors (#627) 2023-08-07 09:18:18 -06:00
TAKAHASHI Shuuji 67d755e6d3
Support to export blocks/mutes as CSV files (#626) 2023-08-07 09:16:52 -06:00
Henri Dickson 4a9109271d
Fix like/boost remote post (#629) 2023-08-07 09:15:13 -06:00
Humberto Rocha a69499c742
Add 'domain' to the blocklist supported headers (#623) 2023-08-03 10:41:47 -06:00
Humberto Rocha c4a2b62016
Allow updated to updated transition on Domain model (#621) 2023-07-30 11:22:35 -07:00
Henri Dickson 1b7bb8c501
Add Idempotency-Key to allowed CORS header (#618)
It's used by other web clients, so should improve compatibility.
2023-07-24 18:54:58 -06:00
Humberto Rocha f3bab95827
Add support to import blocklists (#617) 2023-07-24 17:59:50 -06:00
Andrew Godwin 4a8bdec90c Implement inbound account migration 2023-07-22 11:46:35 -06:00
Andrew Godwin cc6355f60b Refs #613: Also block subdomains 2023-07-22 10:54:36 -06:00
Andrew Godwin 83b57a0998 Never put blocked domains into outdated either 2023-07-22 10:44:01 -06:00
Andrew Godwin aac75dd4c3 Fixed #613: Don't pull nodeinfo from blocked servers! 2023-07-22 10:41:58 -06:00
Andrew Godwin 759d5ac052 Fixed #616: Do followers-only properly 2023-07-22 10:38:22 -06:00
Andrew Godwin 1dd076ff7d Fixed #615: Nicely reject malformatted http signatures 2023-07-20 09:55:36 -06:00
Humberto Rocha d6cdcb1d83
Wait setup to complete before starting web and stator containers (#611) 2023-07-17 09:31:21 -06:00
Andrew Godwin 188e5a2446 Remove all remaining async code for now 2023-07-17 00:37:47 -06:00
Andrew Godwin 0915b17c4b Prune some unnecessary async usage 2023-07-17 00:18:00 -06:00
Andrew Godwin 31c743319e Require hatchway 0.5.2 2023-07-15 12:43:45 -06:00
Andrew Godwin 11e3ca12d4 Start on push notification work 2023-07-15 12:37:47 -06:00
Deborah Pickett 824f5b289c
Permit SMTP to mail relay without authentication (#600) 2023-07-14 13:57:58 -06:00
Osma Ahvenlampi 2d140f2e97
remove duplicate attachment url check (#608) 2023-07-14 13:52:04 -06:00
Osma Ahvenlampi b2a9b334be
Resubmit: Be quieter about remote hosts with invalid SSL certs (#595) 2023-07-12 09:51:08 -06:00
Osma Ahvenlampi 5549d21528
Fix inbox processing errors from pinned posts and non-Mastodon servers (#596)
If a post (interaction) comes in from AP inbox but no local author profile exists,
fetch_actor will pull in both the identity AND its pinned posts, which the incoming
post might have been. This would cause a database integrity violation. We check
for the post existing again after syncing the actor.

Post processing also barfed on posts where content didn't follow Mastodon specs.
For example, Kbin sets tag names in 'tag' attribute, instead of 'name' attribute.
2023-07-12 09:49:30 -06:00
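The race described in the commit message above can be sketched outside Django; all names below are illustrative stand-ins, not Takahē's real functions or models:

```python
# Hedged sketch of the "check again after fetch_actor" pattern described
# above. `posts` stands in for the Post table; fetch_actor() may itself
# create the incoming post as a side effect of syncing pinned posts.
def upsert_incoming_post(object_uri, author_uri, posts, fetch_actor):
    if object_uri in posts:
        return posts[object_uri]
    fetch_actor(author_uri)  # may pull in the identity AND its pinned posts
    if object_uri in posts:  # re-check to avoid an integrity violation
        return posts[object_uri]
    posts[object_uri] = {"uri": object_uri, "author": author_uri}
    return posts[object_uri]
```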
Humberto Rocha 5f49f9b2bb
Add support to dismiss notifications (#605) 2023-07-11 16:37:03 -06:00
Osma Ahvenlampi 1cc9c16b8c
Use 400 and 401 error codes as OAuth2 documents, accept 400 as webfinger error code (#597) 2023-07-10 10:19:20 -06:00
Humberto Rocha 91cf2f3a30
Add missing SignUp link to header (#606) 2023-07-10 10:13:57 -06:00
Andrew Godwin 68eea142b1 Fix domain index issue 2023-07-10 10:11:48 -06:00
Andrew Godwin 3f8213f54a Syncify another handler 2023-07-09 00:43:16 -06:00
Andrew Godwin 2523de4249 Prevent race condition between threads and locking 2023-07-09 00:42:56 -06:00
Andrew Godwin 933f6660d5 Catch all the subtypes too 2023-07-07 16:39:02 -06:00
Andrew Godwin 2fda9ad2b4 Also capture unknown message types 2023-07-07 16:33:55 -06:00
Andrew Godwin 4458594f04 Also capture JSON-LD errors 2023-07-07 16:32:57 -06:00
Andrew Godwin c93a27e418 Capture and don't thrash on badly formatted AP messages 2023-07-07 16:29:12 -06:00
Andrew Godwin 709f2527ac Refresh identities half as frequently 2023-07-07 15:52:12 -06:00
Andrew Godwin 7f483af8d3 Rework Stator to use a next field and no async 2023-07-07 15:14:06 -06:00
Andrew Godwin e34e4c0c77 Fixed #599: Interaction state not present on notifications 2023-07-05 07:58:54 -06:00
Humberto Rocha 542e3836af
Add endpoint to get notification by id (#594) 2023-07-04 08:06:31 -06:00
Andrew Godwin 82a9c18205 Fixed #593: Add some docs for TAKAHE_CSRF_HOSTS 2023-07-02 20:41:38 +01:00
Andrew Godwin a8b31e9f6a Releasing 0.9 2023-06-24 11:57:56 -06:00
Andrew Godwin d6e891426c Modify 500 error page to not need config 2023-06-24 11:56:57 -06:00
Humberto Rocha 226a60bec7
Fix canonicalize (#590) 2023-06-24 08:53:42 -06:00
Humberto Rocha 9038e498d5
Fix identity metadata not properly propagating through AP (#589) 2023-06-22 17:09:19 -06:00
mj2p bb8f589da7
Bugfix admin redirect fixes (#587) 2023-06-14 11:15:29 -06:00
TAKAHASHI Shuuji f88ad38294
Prepend invisible URL protocol prefix (#586) 2023-05-26 09:02:49 -06:00
TAKAHASHI Shuuji 2040124147
Prevent dropping ellipsis URL (#585) 2023-05-24 11:41:56 -06:00
Karthik Balakrishnan 68dc2dc9ed
Use post id to generate summary class (#583)
Removes the "expand linked CWs" feature for now.
2023-05-20 01:02:40 -06:00
Karthik Balakrishnan 568b87dadb
Customize gunicorn worker count (#578) 2023-05-19 10:31:44 -06:00
Andrew Godwin 79e1f0da14 Don't even try to progress post attachments 2023-05-15 16:59:52 -06:00
Christof Dorner cec04e8ddb
Fixes various issues with pinned posts - continued (#581) 2023-05-15 11:36:33 -06:00
Andrew Godwin b2768e7f2e Improve stator's performance in larger installs 2023-05-15 11:34:21 -06:00
Christof Dorner 9bc18a1190
Fixes various issues with pinned posts (#580) 2023-05-15 10:54:32 -06:00
Andrew Godwin 5297b98273 Include psql in the docker image 2023-05-13 11:38:32 -06:00
Andrew Godwin 888f4ad36c Move from index_together to indexes 2023-05-13 11:30:42 -06:00
Andrew Godwin 46679a5c73 Bump up django-upgrade 2023-05-13 11:20:47 -06:00
Andrew Godwin f4bbe78bd5 Fix tests after mentions change 2023-05-13 11:19:23 -06:00
Andrew Godwin f5a3971ef8 Implement replies profile tab and boosts option 2023-05-13 11:07:57 -06:00
Andrew Godwin 31c4f89124 Make mentions look EXACTLY like Mastodon 2023-05-13 10:40:00 -06:00
Andrew Godwin 6a94dcfcc6 Add three more apps to home 2023-05-13 10:32:48 -06:00
Andrew Godwin 67f64a4313 Fix mention formatting on Sengi 2023-05-13 10:32:36 -06:00
Andrew Godwin 1fb02b06e1 Fixed #577: Send bot property down API right 2023-05-13 06:00:48 -06:00
Christof Dorner d6c9ba0819
Pinned posts (#561) 2023-05-13 10:01:27 -06:00
Christof Dorner 744c2825d9
Show posts and boosts on an identity's profile view (#574) 2023-05-12 17:43:26 -06:00
Christof Dorner b3b58df2b1
Fix hashtag search results (#576)
We mistakenly wrote to the key "hashtag" instead of "hashtags", resulting
in no results in the API response. Additionally, the type of the Tag's `history`
needs to be a list, not a dict.

This fixes hashtag search in Elk.
2023-05-10 10:17:00 -06:00
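The fix above concerns the shape of the Mastodon-style search payload; a minimal sketch of the corrected shape (the instance URL is hypothetical):

```python
# Results must go under the key "hashtags" (plural), and each tag's
# "history" must be a list, not a dict — the two bugs the commit fixes.
search_response = {
    "accounts": [],
    "statuses": [],
    "hashtags": [
        {
            "name": "takahe",
            "url": "https://example.com/tags/takahe/",  # hypothetical instance URL
            "history": [],  # a list, not a dict
        }
    ],
}
assert isinstance(search_response["hashtags"][0]["history"], list)
```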
Christof Dorner 9775fa8991
Add mention class to user mention links (#575)
This should fix mention links in Elk to keep linking inside Elk, not to the instance of the mentioned user.
2023-05-09 09:55:19 -06:00
Christof Dorner 51ffcc6192
Fix default storage setting and use new STORAGES (#573) 2023-05-08 10:40:33 -06:00
Karthik Balakrishnan dcc4a5723e
Migrate to new staticfiles storage setting (#570) 2023-05-07 23:08:33 -06:00
Karthik Balakrishnan f256217d1b
Show domain setup to admins (#568)
Prompts admin users to set up domains on identity creation
2023-05-07 23:06:10 -06:00
Andrew Godwin b6d9f1dc95 Add search v1 emulation (for Sengi) 2023-05-06 09:30:21 -07:00
Andrew Godwin 930aab384e Debug, search fixes 2023-05-05 23:30:59 -07:00
Andrew Godwin eeee385a61 Fix a few visual bugs 2023-05-05 23:04:20 -07:00
Christof Dorner a4e6033a0b
Ignore unknown create/update types (#569) 2023-05-04 16:56:50 -06:00
Karthik Balakrishnan dbc25f538d
Fix bug in middleware when domain config is None (#567) 2023-05-04 11:59:11 -06:00
Andrew Godwin 7862795993 A few small fixes post-UI merge 2023-05-04 11:50:31 -06:00
Michael Manfre 8ff6100e94
Django 4.2 with psycopg 3 (#555) 2023-05-03 23:49:17 -06:00
Andrew Godwin 709dc86162 Fixed #559: Trim hashtags to 100 chars or less 2023-05-03 23:12:28 -06:00
Andrew Godwin 8f57aa5f37
UI/Domains Refactor
Redoes the UI to remove timelines, promote domains, and a lot of other things to support the refactor.
2023-05-03 22:42:37 -06:00
Christof Dorner 7331591432
fix order of next/prev header links (#564) 2023-05-02 10:57:58 -06:00
Christof Dorner ac54c7ff81
Fix post attachment author check on editing (#563) 2023-05-02 09:58:32 -06:00
Christof Dorner 5759e1d5c1
Expose Link header via CORS (#562) 2023-05-02 09:57:12 -06:00
Christof Dorner 7d1558a2ab
Support editing media description when editing statuses (#556) 2023-04-11 09:35:36 -06:00
Christof Dorner b31c5156ff
Improve hashtag case handling and implement /api/v1/tags/<hashtag> endpoint (#554)
* Lowercase hashtag before loading its timeline

* Implement /api/v1/tags/<hashtag> endpoint

* Lower hashtag before un-/following

* Fix field name for hashtag following/followed boolean
2023-04-06 15:14:21 -06:00
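The lowercasing described in the bullet points above can be sketched as a tiny helper; `canonical_hashtag` is a hypothetical name for illustration, not Takahē's actual code:

```python
def canonical_hashtag(tag: str) -> str:
    # Hashtags are matched case-insensitively, so store and query them
    # lowercase (before timeline lookup and before un-/following).
    return tag.lstrip("#").lower()

assert canonical_hashtag("#Python") == "python"
```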
Christof Dorner 216915ddb8
set post attachment type correctly for videos (#553) 2023-04-01 18:07:38 -06:00
Christof Dorner 96bc64fd01
Implement Mastodon's v1/statuses/<id>/reblogged_by API endpoint (#551) 2023-03-31 13:40:15 -06:00
Christof Dorner ba4414dbce
fix /v1/statuses/<id>/favourited_by pagination header error (#550) 2023-03-23 19:09:03 -06:00
Christof Dorner e45195bb02
Handle posts with only contentMap as post instead of interaction (#549) 2023-03-23 12:27:32 -06:00
Andrew Godwin ea7d5f307c Allow single IDs in familiar_followers
Fixes #547
2023-03-22 16:21:30 -06:00
Andrew Godwin 1994671541 Correctly handle GTS mentions of same username
Fixes #546
2023-03-22 10:44:15 -06:00
Humberto Rocha c702b1b24d
Always return voters_count (#543) 2023-03-15 18:46:12 -06:00
Christof Dorner c94b54dde8
Add mention and hashtag classes to hashtag anchors (#542) 2023-03-15 13:49:23 -06:00
Andrew Godwin 74c4819ee2 Fix tuning EOF 2023-03-14 15:39:42 -06:00
Christof Dorner 79c1be03a6
Add ability to follow hashtags 2023-03-14 15:35:40 -06:00
Daniel Vydra 902891ff9e
Simple docs for setting up a sentry.io installation (#540) 2023-03-13 10:49:07 -06:00
Andrew Godwin 542678cab5 Fix author checks on post attachments
Fixes #538
2023-03-12 16:19:40 -06:00
Dan Watson cedcc8fa7c
Bookmarks (#537) 2023-03-11 11:17:20 -07:00
Andrew Godwin 758e6633c4 Fix image proxy headers
Fixes #536
2023-03-11 11:06:44 -07:00
Dan Watson 61830a9a9c
Fix page ordering (#535) 2023-03-10 09:10:34 -07:00
Dan Watson 6e8149675c
Fix home timeline pagination 2023-03-09 19:46:57 -07:00
Dan Watson 3bd01b2b3d
Stub out more API, implement preferences and peers (#533) 2023-03-09 14:36:24 -07:00
Dan Watson 6f4abd5aae
Initial support for IceCubes (#532) 2023-03-09 10:47:33 -07:00
Christof Dorner 56da914340
Allow to set default reply visibility (#531) 2023-03-08 11:01:21 -07:00
Andrew Godwin 1b9cf24d09 Move back to canonicalising public as "as:Public" 2023-03-08 10:11:56 -07:00
Christof Dorner 85b4910829
Added admin notes field to domains (#530) 2023-03-06 16:37:05 -07:00
Andrew Godwin 5ea3d5d143 Implement a client_credentials process for read 2023-03-06 15:48:43 -07:00
Andrew Godwin 05992d6553 Fix linking to things without URL attrs 2023-03-05 11:40:50 -07:00
Andrew Godwin afc94f6313 Add in_reply_to index 2023-03-05 10:34:58 -07:00
Andrew Godwin bd6d1ae8de Fix fallback for instance API host
Fixes #522
2023-03-03 15:04:34 -07:00
Kelson Vibber 78eacf165e
Accept hs2019 in signatures (#529)
Fixes part of federation with GoToSocial - this is just a different name for the same algorithm.
2023-03-03 09:18:11 -07:00
Andrew Godwin 552a150e57 Stop over-recursion in post contexts 2023-03-02 10:28:27 -07:00
Andrew Godwin 6411a375ba Allow API access with cookies again 2023-03-02 10:22:37 -07:00
TAKAHASHI Shuuji 026e1be357
Put ALT badge on attached images with alt-text 2023-02-28 17:02:20 -07:00
Humberto Rocha 9e016aaa5f
Fix navigation menu in mobile (#525) 2023-02-26 00:10:54 -07:00
Humberto Rocha d9cab99859
Fix not fetching images properly when url is redirected (#526) 2023-02-26 00:07:25 -07:00
Humberto Rocha 9aff13118a
Fix crash when fetching emoji without mimetype and extension (#524) 2023-02-25 14:47:43 -07:00
TAKAHASHI Shuuji 2b56b33e38
Show profile image on the image viewer (#520)
Fixes #519
2023-02-24 03:54:58 -07:00
Colin Schlueter 6fb9a5ea96
Fix nameserver substitution for IPv6 resolvers (#516) 2023-02-20 09:49:53 -07:00
Andrew Godwin 42d6eb6000 Preserve ellipsis class on links when re-rendering 2023-02-19 23:01:50 -07:00
Andrew Godwin c3f5cf8d05 Fix 0.8 release doc styling 2023-02-19 22:50:25 -07:00
297 changed files with 8408 additions and 3883 deletions

@@ -6,6 +6,9 @@
.pre-commit-config.yaml
.venv
/fly.*
/static-collected
/takahe/local_settings.py
__pycache__/
media
notes.md
venv

@@ -8,7 +7,7 @@ jobs:
timeout-minutes: 5
strategy:
matrix:
python-version: ["3.10"]
python-version: ["3.11"]
steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}

@@ -4,6 +4,8 @@ on:
push:
paths-ignore:
- 'docs/**'
branches:
- main
pull_request:
paths-ignore:
- 'docs/**'
@@ -15,17 +17,13 @@ jobs:
timeout-minutes: 8
strategy:
matrix:
python-version: ["3.10", "3.11"]
python-version: ["3.11", "3.12"]
db:
- "postgres://postgres:postgres@localhost/postgres"
- "sqlite:///takahe.db"
include:
- db: "postgres://postgres:postgres@localhost/postgres"
db_name: postgres
search: true
- db: "sqlite:///takahe.db"
db_name: sqlite
search: false
services:
postgres:
image: postgres:15
@@ -48,6 +46,7 @@ jobs:
cache: pip
- name: Install dependencies
run: |
sudo apt-get install -y libmemcached-dev libwebp-dev libjpeg-dev
python -m pip install -r requirements-dev.txt
- name: Run pytest
env:

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
rev: v4.5.0
hooks:
- id: check-case-conflict
- id: check-merge-conflict
@@ -15,19 +15,19 @@ repos:
- id: trailing-whitespace
- repo: https://github.com/asottile/pyupgrade
rev: "v3.3.0"
rev: "v3.15.0"
hooks:
- id: pyupgrade
args: [--py310-plus]
args: [--py311-plus]
- repo: https://github.com/adamchainz/django-upgrade
rev: "1.12.0"
rev: "1.15.0"
hooks:
- id: django-upgrade
args: [--target-version, "4.1"]
args: [--target-version, "4.2"]
- repo: https://github.com/psf/black
rev: 22.10.0
- repo: https://github.com/psf/black-pre-commit-mirror
rev: 23.11.0
hooks:
- id: black
@@ -38,12 +38,12 @@ repos:
args: ["--profile=black"]
- repo: https://github.com/pycqa/flake8
rev: 6.0.0
rev: 6.1.0
hooks:
- id: flake8
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.991
rev: v1.6.1
hooks:
- id: mypy
exclude: "^tests/"
@@ -51,8 +51,6 @@ repos:
[types-pyopenssl, types-mock, types-cachetools, types-python-dateutil]
- repo: https://github.com/rtts/djhtml
rev: v1.5.2
rev: 3.0.6
hooks:
- id: djhtml
- id: djcss
- id: djjs

@@ -4,7 +4,7 @@ version: 2
build:
os: ubuntu-22.04
tools:
python: "3.10"
python: "3.11"
# Build documentation in the docs/ directory with Sphinx
sphinx:

@@ -1,9 +1,9 @@
![takahē](static/img/logo-128.png)
A *beta* Fediverse server for microblogging/"toots". Not fully polished yet -
A *beta* Fediverse server for microblogging. Not fully polished yet -
we're still working towards a 1.0!
**Current version: [0.8](https://docs.jointakahe.org/en/latest/releases/0.8/)**
**Current version: [0.11.0](https://docs.jointakahe.org/en/latest/releases/0.11/)**
Key features:

@@ -210,8 +210,8 @@ class TimelineEventAdmin(admin.ModelAdmin):
@admin.register(FanOut)
class FanOutAdmin(admin.ModelAdmin):
list_display = ["id", "state", "created", "state_attempted", "type", "identity"]
list_filter = (IdentityLocalFilter, "type", "state", "state_attempted")
list_display = ["id", "state", "created", "state_next_attempt", "type", "identity"]
list_filter = (IdentityLocalFilter, "type", "state")
raw_id_fields = ["subject_post", "subject_post_interaction"]
autocomplete_fields = ["identity"]
readonly_fields = ["created", "updated", "state_changed"]
@@ -229,7 +229,7 @@ class FanOutAdmin(admin.ModelAdmin):
@admin.register(PostInteraction)
class PostInteractionAdmin(admin.ModelAdmin):
list_display = ["id", "state", "state_attempted", "type", "identity", "post"]
list_display = ["id", "state", "state_next_attempt", "type", "identity", "post"]
list_filter = (IdentityLocalFilter, "type", "state")
raw_id_fields = ["post"]
autocomplete_fields = ["identity"]

@@ -0,0 +1,83 @@
import datetime
import sys

from django.conf import settings
from django.core.management.base import BaseCommand
from django.db.models import Q
from django.utils import timezone

from activities.models import Post


class Command(BaseCommand):
    help = "Prunes posts that are old, not local and have no local interaction"

    def add_arguments(self, parser):
        parser.add_argument(
            "--number",
            "-n",
            type=int,
            default=500,
            help="The maximum number of posts to prune at once",
        )

    def handle(self, number: int, *args, **options):
        if not settings.SETUP.REMOTE_PRUNE_HORIZON:
            print("Pruning has been disabled as REMOTE_PRUNE_HORIZON=0")
            sys.exit(2)
        # Find a set of posts that match the initial criteria
        print(f"Running query to find up to {number} old posts...")
        posts = (
            Post.objects.filter(
                local=False,
                created__lt=timezone.now()
                - datetime.timedelta(days=settings.SETUP.REMOTE_PRUNE_HORIZON),
            )
            .exclude(
                Q(interactions__identity__local=True)
                | Q(visibility=Post.Visibilities.mentioned)
            )
            .order_by("?")[:number]
        )
        post_ids_and_uris = dict(posts.values_list("object_uri", "id"))
        print(f" found {len(post_ids_and_uris)}")
        # Fetch all of their replies and exclude any that have local replies
        print("Excluding ones with local replies...")
        replies = Post.objects.filter(
            local=True,
            in_reply_to__in=post_ids_and_uris.keys(),
        ).values_list("in_reply_to", flat=True)
        for reply in replies:
            if reply and reply in post_ids_and_uris:
                del post_ids_and_uris[reply]
        print(f" narrowed down to {len(post_ids_and_uris)}")
        # Fetch all the posts that they are replies to, and don't delete ones
        # that are replies to local posts
        print("Excluding ones that are replies to local posts...")
        in_reply_tos = (
            Post.objects.filter(id__in=post_ids_and_uris.values())
            .values_list("in_reply_to", flat=True)
            .distinct()
        )
        local_object_uris = Post.objects.filter(
            local=True, object_uri__in=in_reply_tos
        ).values_list("object_uri", flat=True)
        final_post_ids = list(
            Post.objects.filter(id__in=post_ids_and_uris.values())
            .exclude(in_reply_to__in=local_object_uris)
            .values_list("id", flat=True)
        )
        print(f" narrowed down to {len(final_post_ids)}")
        # Delete them
        if not final_post_ids:
            sys.exit(0)
        print("Deleting...")
        _, deleted = Post.objects.filter(id__in=final_post_ids).delete()
        print("Deleted:")
        for model, model_deleted in deleted.items():
            print(f" {model}: {model_deleted}")
        sys.exit(1)
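The `dict(values_list(...))` step in the command above builds a mapping from each post's `object_uri` to its database `id`; a self-contained illustration with plain tuples (the URIs are hypothetical):

```python
# Each values_list("object_uri", "id") row is a (uri, id) tuple;
# dict() turns them into a uri -> id mapping, mirroring the command above.
rows = [
    ("https://example.com/posts/1", 101),
    ("https://example.com/posts/2", 102),
]
post_ids_and_uris = dict(rows)

# Replies arrive as in_reply_to URIs; drop any pruned candidate that
# has a local reply, exactly as the loop in the command does.
replies = ["https://example.com/posts/2", None]
for reply in replies:
    if reply and reply in post_ids_and_uris:
        del post_ids_and_uris[reply]

print(post_ids_and_uris)  # {'https://example.com/posts/1': 101}
```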

@@ -16,7 +16,6 @@ import stator.models
class Migration(migrations.Migration):
initial = True
dependencies = [
@@ -264,6 +263,7 @@ class Migration(migrations.Migration):
("identity_edited", "Identity Edited"),
("identity_deleted", "Identity Deleted"),
("identity_created", "Identity Created"),
("identity_moved", "Identity Moved"),
],
max_length=100,
),
@@ -324,6 +324,7 @@ class Migration(migrations.Migration):
("mentioned", "Mentioned"),
("liked", "Liked"),
("followed", "Followed"),
("follow_requested", "Follow Requested"),
("boosted", "Boosted"),
("announcement", "Announcement"),
("identity_created", "Identity Created"),

@@ -8,7 +8,6 @@ import stator.models
class Migration(migrations.Migration):
dependencies = [
("activities", "0001_initial"),
]

@@ -10,7 +10,6 @@ import core.uploads
class Migration(migrations.Migration):
dependencies = [
("activities", "0002_hashtag"),
]

@@ -11,7 +11,6 @@ import stator.models
class Migration(migrations.Migration):
dependencies = [
("users", "0003_identity_followers_etc"),
("activities", "0003_postattachment_null_thumb"),

@@ -14,7 +14,6 @@ def timelineevent_populate_published(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
("activities", "0004_emoji_post_emojis"),
]

@@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0005_report"),
("activities", "0005_post_type_timeline_urls"),

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0006_fanout_subject_identity_alter_fanout_type"),
]

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0007_post_stats"),
]

@@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("users", "0011_announcement"),
("activities", "0008_state_and_post_indexes"),

@@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("users", "0013_stator_indexes"),
("activities", "0009_alter_timelineevent_index_together"),

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0010_stator_indexes"),
]

@@ -0,0 +1,19 @@
# Generated by Django 4.1.4 on 2023-03-05 17:33

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("activities", "0011_postinteraction_value_alter_postinteraction_type"),
    ]

    operations = [
        migrations.AlterField(
            model_name="post",
            name="in_reply_to",
            field=models.CharField(
                blank=True, db_index=True, max_length=500, null=True
            ),
        ),
    ]

@@ -0,0 +1,25 @@
# Generated by Django 4.1.4 on 2023-03-12 22:14

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("users", "0015_bookmark"),
        ("activities", "0012_in_reply_to_index"),
    ]

    operations = [
        migrations.AddField(
            model_name="postattachment",
            name="author",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="attachments",
                to="users.identity",
            ),
        ),
    ]

@@ -0,0 +1,23 @@
# Generated by Django 4.2 on 2023-04-29 18:49

import django.contrib.postgres.indexes
import django.contrib.postgres.search
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ("activities", "0013_postattachment_author"),
    ]

    operations = [
        migrations.AddIndex(
            model_name="post",
            index=django.contrib.postgres.indexes.GinIndex(
                django.contrib.postgres.search.SearchVector(
                    "content", config="english"
                ),
                name="content_vector_gin",
            ),
        ),
    ]

@@ -0,0 +1,25 @@
# Generated by Django 4.1.7 on 2023-04-24 08:04

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("activities", "0014_post_content_vector_gin"),
    ]

    operations = [
        migrations.AlterField(
            model_name="postinteraction",
            name="type",
            field=models.CharField(
                choices=[
                    ("like", "Like"),
                    ("boost", "Boost"),
                    ("vote", "Vote"),
                    ("pin", "Pin"),
                ],
                max_length=100,
            ),
        ),
    ]

@@ -0,0 +1,91 @@
# Generated by Django 4.2.1 on 2023-05-13 17:29

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("activities", "0015_alter_postinteraction_type"),
    ]

    operations = [
        migrations.RenameIndex(
            model_name="emoji",
            new_name="activities__state_r_aa72ec_idx",
            old_fields=("state_ready", "state_locked_until", "state"),
        ),
        migrations.RenameIndex(
            model_name="fanout",
            new_name="activities__state_r_aae3b4_idx",
            old_fields=("state_ready", "state_locked_until", "state"),
        ),
        migrations.RenameIndex(
            model_name="hashtag",
            new_name="activities__state_r_5703be_idx",
            old_fields=("state_ready", "state_locked_until", "state"),
        ),
        migrations.RenameIndex(
            model_name="post",
            new_name="activities__state_r_b8f1ff_idx",
            old_fields=("state_ready", "state_locked_until", "state"),
        ),
        migrations.RenameIndex(
            model_name="postattachment",
            new_name="activities__state_r_4e981c_idx",
            old_fields=("state_ready", "state_locked_until", "state"),
        ),
        migrations.RenameIndex(
            model_name="postinteraction",
            new_name="activities__state_r_981d8c_idx",
            old_fields=("state_ready", "state_locked_until", "state"),
        ),
        migrations.RenameIndex(
            model_name="postinteraction",
            new_name="activities__type_75d2e4_idx",
            old_fields=("type", "identity", "post"),
        ),
        migrations.RenameIndex(
            model_name="timelineevent",
            new_name="activities__identit_0b93c3_idx",
            old_fields=("identity", "type", "subject_post", "subject_identity"),
        ),
        migrations.RenameIndex(
            model_name="timelineevent",
            new_name="activities__identit_cc2290_idx",
            old_fields=("identity", "type", "subject_identity"),
        ),
        migrations.RenameIndex(
            model_name="timelineevent",
            new_name="activities__identit_872fbb_idx",
            old_fields=("identity", "created"),
        ),
        migrations.AddIndex(
            model_name="emoji",
            index=models.Index(
                fields=["state", "state_attempted"], name="ix_emoji_state_attempted"
            ),
        ),
        migrations.AddIndex(
            model_name="emoji",
            index=models.Index(
                condition=models.Q(("state_locked_until__isnull", False)),
                fields=["state_locked_until", "state"],
                name="ix_emoji_state_locked",
            ),
        ),
        migrations.AddIndex(
            model_name="postinteraction",
            index=models.Index(
                fields=["state", "state_attempted"],
                name="ix_postinterac_state_attempted",
            ),
        ),
        migrations.AddIndex(
            model_name="postinteraction",
            index=models.Index(
                condition=models.Q(("state_locked_until__isnull", False)),
                fields=["state_locked_until", "state"],
                name="ix_postinterac_state_locked",
            ),
        ),
    ]


@ -0,0 +1,234 @@
# Generated by Django 4.2.1 on 2023-07-05 22:18
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0016_index_together_migration"),
]
operations = [
migrations.RemoveIndex(
model_name="emoji",
name="activities__state_r_aa72ec_idx",
),
migrations.RemoveIndex(
model_name="emoji",
name="ix_emoji_state_attempted",
),
migrations.RemoveIndex(
model_name="emoji",
name="ix_emoji_state_locked",
),
migrations.RemoveIndex(
model_name="fanout",
name="ix_fanout_state_attempted",
),
migrations.RemoveIndex(
model_name="fanout",
name="ix_fanout_state_locked",
),
migrations.RemoveIndex(
model_name="fanout",
name="activities__state_r_aae3b4_idx",
),
migrations.RemoveIndex(
model_name="hashtag",
name="ix_hashtag_state_attempted",
),
migrations.RemoveIndex(
model_name="hashtag",
name="ix_hashtag_state_locked",
),
migrations.RemoveIndex(
model_name="hashtag",
name="activities__state_r_5703be_idx",
),
migrations.RemoveIndex(
model_name="post",
name="ix_post_state_attempted",
),
migrations.RemoveIndex(
model_name="post",
name="ix_post_state_locked",
),
migrations.RemoveIndex(
model_name="post",
name="activities__state_r_b8f1ff_idx",
),
migrations.RemoveIndex(
model_name="postattachment",
name="ix_postattachm_state_attempted",
),
migrations.RemoveIndex(
model_name="postattachment",
name="ix_postattachm_state_locked",
),
migrations.RemoveIndex(
model_name="postattachment",
name="activities__state_r_4e981c_idx",
),
migrations.RemoveIndex(
model_name="postinteraction",
name="activities__state_r_981d8c_idx",
),
migrations.RemoveIndex(
model_name="postinteraction",
name="ix_postinterac_state_attempted",
),
migrations.RemoveIndex(
model_name="postinteraction",
name="ix_postinterac_state_locked",
),
migrations.RemoveField(
model_name="emoji",
name="state_attempted",
),
migrations.RemoveField(
model_name="emoji",
name="state_ready",
),
migrations.RemoveField(
model_name="fanout",
name="state_attempted",
),
migrations.RemoveField(
model_name="fanout",
name="state_ready",
),
migrations.RemoveField(
model_name="hashtag",
name="state_attempted",
),
migrations.RemoveField(
model_name="hashtag",
name="state_ready",
),
migrations.RemoveField(
model_name="post",
name="state_attempted",
),
migrations.RemoveField(
model_name="post",
name="state_ready",
),
migrations.RemoveField(
model_name="postattachment",
name="state_attempted",
),
migrations.RemoveField(
model_name="postattachment",
name="state_ready",
),
migrations.RemoveField(
model_name="postinteraction",
name="state_attempted",
),
migrations.RemoveField(
model_name="postinteraction",
name="state_ready",
),
migrations.AddField(
model_name="emoji",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="fanout",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="hashtag",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="post",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="postattachment",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="postinteraction",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AlterField(
model_name="emoji",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="fanout",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="hashtag",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="post",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="postattachment",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="postinteraction",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AddIndex(
model_name="emoji",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_emoji_state_next",
),
),
migrations.AddIndex(
model_name="fanout",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_fanout_state_next",
),
),
migrations.AddIndex(
model_name="hashtag",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_hashtag_state_next",
),
),
migrations.AddIndex(
model_name="post",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_post_state_next",
),
),
migrations.AddIndex(
model_name="postattachment",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_postattachm_state_next",
),
),
migrations.AddIndex(
model_name="postinteraction",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_postinterac_state_next",
),
),
]
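The migration above replaces the `state_ready`/`state_attempted` columns with a single nullable `state_next_attempt` timestamp and indexes `(state, state_next_attempt, state_locked_until)`. A rough standalone sketch of the selection predicate that index serves — the field names come from the migration, but the exact query logic here is an assumption, not Stator's real implementation:

```python
from datetime import datetime

def due_for_attempt(rows: list[dict], now: datetime) -> list[dict]:
    # Pick rows whose next attempt is due and whose lock (if any) has
    # expired - the shape of WHERE clause the composite index supports.
    return [
        row
        for row in rows
        if row["state_next_attempt"] is not None
        and row["state_next_attempt"] <= now
        and (row["state_locked_until"] is None or row["state_locked_until"] <= now)
    ]
```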


@ -0,0 +1,17 @@
# Generated by Django 4.2.2 on 2023-07-09 17:25
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0017_stator_next_change"),
]
operations = [
migrations.AddField(
model_name="timelineevent",
name="dismissed",
field=models.BooleanField(default=False),
),
]


@ -0,0 +1,22 @@
# Generated by Django 4.2.3 on 2023-10-30 07:44
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0018_timelineevent_dismissed"),
]
operations = [
migrations.AlterField(
model_name="postattachment",
name="focal_x",
field=models.FloatField(blank=True, null=True),
),
migrations.AlterField(
model_name="postattachment",
name="focal_y",
field=models.FloatField(blank=True, null=True),
),
]


@ -4,13 +4,13 @@ from typing import ClassVar
import httpx
import urlman
from asgiref.sync import sync_to_async
from cachetools import TTLCache, cached
from django.conf import settings
from django.core.exceptions import ValidationError
from django.core.files.base import ContentFile
from django.db import models
from django.utils.safestring import mark_safe
from PIL import Image
from core.files import get_remote_file
from core.html import FediverseHtmlParser
@ -34,23 +34,27 @@ class EmojiStates(StateGraph):
outdated.transitions_to(updated)
@classmethod
async def handle_outdated(cls, instance: "Emoji"):
def handle_outdated(cls, instance: "Emoji"):
"""
Fetches remote emoji and uploads to file for local caching
"""
if instance.remote_url and not instance.file:
try:
file, mimetype = await get_remote_file(
file, mimetype = get_remote_file(
instance.remote_url,
timeout=settings.SETUP.REMOTE_TIMEOUT,
max_size=settings.SETUP.EMOJI_MAX_IMAGE_FILESIZE_KB * 1024,
)
except httpx.RequestError:
return
if file:
if mimetype == "application/octet-stream":
mimetype = Image.open(file).get_format_mimetype()
instance.file = file
instance.mimetype = mimetype
await sync_to_async(instance.save)()
instance.save()
return cls.updated
@ -81,7 +85,6 @@ class EmojiManager(models.Manager):
class Emoji(StatorModel):
# Normalized Emoji without the ':'
shortcode = models.SlugField(max_length=100, db_index=True)
@ -123,7 +126,7 @@ class Emoji(StatorModel):
class Meta:
unique_together = ("domain", "shortcode")
index_together = StatorModel.Meta.index_together
indexes: list = [] # We need this so Stator can add its own
class urls(urlman.Urls):
admin = "/admin/emoji/"
@ -278,7 +281,7 @@ class Emoji(StatorModel):
# Name could be a direct property, or in a language'd value
if "name" in data:
name = data["name"]
elif "nameMap" in data:
elif "nameMap" in data and "und" in data["nameMap"]:
name = data["nameMap"]["und"]
else:
raise ValueError("No name on emoji JSON")
@ -288,8 +291,6 @@ class Emoji(StatorModel):
mimetype = icon.get("mediaType")
if not mimetype:
mimetype, _ = mimetypes.guess_type(icon["url"])
if mimetype is None:
raise ValueError("No mimetype on emoji JSON")
# create
shortcode = name.strip(":")
@ -301,17 +302,22 @@ class Emoji(StatorModel):
except cls.DoesNotExist:
pass
else:
# default to previously discovered mimetype if not provided
# by the instance to avoid infinite outdated state
if mimetype is None:
mimetype = emoji.mimetype
# Domain previously provided this shortcode. Trample in the new emoji
if emoji.remote_url != icon["url"] or emoji.mimetype != mimetype:
emoji.object_uri = data["id"]
emoji.remote_url = icon["url"]
emoji.mimetype = mimetype
emoji.category = category
emoji.transition_set_state("outdated")
if emoji.file:
emoji.file.delete(save=True)
else:
emoji.save()
emoji.transition_perform("outdated")
return emoji
emoji = cls.objects.create(
@ -319,7 +325,7 @@ class Emoji(StatorModel):
domain=None if domain.local else domain,
local=domain.local,
object_uri=data["id"],
mimetype=mimetype,
mimetype=mimetype or "application/octet-stream",
category=category,
remote_url=icon["url"],
)


@ -1,5 +1,4 @@
import httpx
from asgiref.sync import sync_to_async
from django.db import models
from activities.models.timeline_event import TimelineEvent
@ -19,26 +18,24 @@ class FanOutStates(StateGraph):
new.times_out_to(failed, seconds=86400 * 3)
@classmethod
async def handle_new(cls, instance: "FanOut"):
def handle_new(cls, instance: "FanOut"):
"""
Sends the fan-out to the right inbox.
"""
fan_out = await instance.afetch_full()
# Don't try to fan out to identities that are not fetched yet
if not (fan_out.identity.local or fan_out.identity.inbox_uri):
if not (instance.identity.local or instance.identity.inbox_uri):
return
match (fan_out.type, fan_out.identity.local):
match (instance.type, instance.identity.local):
# Handle creating/updating local posts
case ((FanOut.Types.post | FanOut.Types.post_edited), True):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# If the author of the post is blocked or muted, skip out
if (
await Block.objects.active()
.filter(source=fan_out.identity, target=post.author)
.aexists()
Block.objects.active()
.filter(source=instance.identity, target=post.author)
.exists()
):
return cls.skipped
# Make a timeline event directly
@ -48,42 +45,42 @@ class FanOutStates(StateGraph):
add = True
mentioned = {identity.id for identity in post.mentions.all()}
if post.in_reply_to:
followed = await sync_to_async(set)(
fan_out.identity.outbound_follows.filter(
followed = set(
instance.identity.outbound_follows.filter(
state__in=FollowStates.group_active()
).values_list("target_id", flat=True)
)
interested_in = followed.union(
{post.author_id, fan_out.identity_id}
{post.author_id, instance.identity_id}
)
add = (post.author_id in followed) and (
bool(mentioned.intersection(interested_in))
)
if add:
await sync_to_async(TimelineEvent.add_post)(
identity=fan_out.identity,
TimelineEvent.add_post(
identity=instance.identity,
post=post,
)
# We might have been mentioned
if (
fan_out.identity.id in mentioned
and fan_out.identity_id != post.author_id
instance.identity.id in mentioned
and instance.identity_id != post.author_id
):
await sync_to_async(TimelineEvent.add_mentioned)(
identity=fan_out.identity,
TimelineEvent.add_mentioned(
identity=instance.identity,
post=post,
)
# Handle sending remote posts create
case (FanOut.Types.post, False):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# Sign it and send it
try:
await post.author.signed_request(
post.author.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(post.to_create_ap()),
)
@ -92,14 +89,14 @@ class FanOutStates(StateGraph):
# Handle sending remote posts update
case (FanOut.Types.post_edited, False):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# Sign it and send it
try:
await post.author.signed_request(
post.author.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(post.to_update_ap()),
)
@ -108,24 +105,24 @@ class FanOutStates(StateGraph):
# Handle deleting local posts
case (FanOut.Types.post_deleted, True):
post = await fan_out.subject_post.afetch_full()
if fan_out.identity.local:
post = instance.subject_post
if instance.identity.local:
# Remove all timeline events mentioning it
await TimelineEvent.objects.filter(
identity=fan_out.identity,
TimelineEvent.objects.filter(
identity=instance.identity,
subject_post=post,
).adelete()
).delete()
# Handle sending remote post deletes
case (FanOut.Types.post_deleted, False):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# Send it to the remote inbox
try:
await post.author.signed_request(
post.author.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(post.to_delete_ap()),
)
@ -134,113 +131,121 @@ class FanOutStates(StateGraph):
# Handle local boosts/likes
case (FanOut.Types.interaction, True):
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# If the author of the interaction is blocked or their notifications
# are muted, skip out
if (
await Block.objects.active()
Block.objects.active()
.filter(
models.Q(mute=False) | models.Q(include_notifications=True),
source=fan_out.identity,
source=instance.identity,
target=interaction.identity,
)
.aexists()
.exists()
):
return cls.skipped
# If blocked/muted the underlying post author, skip out
if (
await Block.objects.active()
Block.objects.active()
.filter(
source=fan_out.identity,
source=instance.identity,
target_id=interaction.post.author_id,
)
.aexists()
.exists()
):
return cls.skipped
# Make a timeline event directly
await sync_to_async(TimelineEvent.add_post_interaction)(
identity=fan_out.identity,
TimelineEvent.add_post_interaction(
identity=instance.identity,
interaction=interaction,
)
# Handle sending remote boosts/likes/votes
# Handle sending remote boosts/likes/votes/pins
case (FanOut.Types.interaction, False):
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# Send it to the remote inbox
try:
await interaction.identity.signed_request(
if interaction.type == interaction.Types.vote:
body = interaction.to_create_ap()
elif interaction.type == interaction.Types.pin:
body = interaction.to_add_ap()
else:
body = interaction.to_ap()
interaction.identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
),
body=canonicalise(
interaction.to_create_ap()
if interaction.type == interaction.Types.vote
else interaction.to_ap()
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(body),
)
except httpx.RequestError:
return
# Handle undoing local boosts/likes
case (FanOut.Types.undo_interaction, True): # noqa:F841
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# Delete any local timeline events
await sync_to_async(TimelineEvent.delete_post_interaction)(
identity=fan_out.identity,
TimelineEvent.delete_post_interaction(
identity=instance.identity,
interaction=interaction,
)
# Handle sending remote undoing boosts/likes
# Handle sending remote undoing boosts/likes/pins
case (FanOut.Types.undo_interaction, False): # noqa:F841
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# Send an undo to the remote inbox
try:
await interaction.identity.signed_request(
if interaction.type == interaction.Types.pin:
body = interaction.to_remove_ap()
else:
body = interaction.to_undo_ap()
interaction.identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(interaction.to_undo_ap()),
body=canonicalise(body),
)
except httpx.RequestError:
return
# Handle sending identity edited to remote
case (FanOut.Types.identity_edited, False):
identity = await fan_out.subject_identity.afetch_full()
identity = instance.subject_identity
try:
await identity.signed_request(
identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
),
body=canonicalise(
await sync_to_async(fan_out.subject_identity.to_update_ap)()
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(instance.subject_identity.to_update_ap()),
)
except httpx.RequestError:
return
# Handle sending identity deleted to remote
case (FanOut.Types.identity_deleted, False):
identity = await fan_out.subject_identity.afetch_full()
identity = instance.subject_identity
try:
await identity.signed_request(
identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(fan_out.subject_identity.to_delete_ap()),
body=canonicalise(instance.subject_identity.to_delete_ap()),
)
except httpx.RequestError:
return
# Handle sending identity moved to remote
case (FanOut.Types.identity_moved, False):
raise NotImplementedError()
# Sending identity edited/deleted to local is a no-op
case (FanOut.Types.identity_edited, True):
pass
@ -249,14 +254,14 @@ class FanOutStates(StateGraph):
# Created identities make a timeline event
case (FanOut.Types.identity_created, True):
await sync_to_async(TimelineEvent.add_identity_created)(
identity=fan_out.identity,
new_identity=fan_out.subject_identity,
TimelineEvent.add_identity_created(
identity=instance.identity,
new_identity=instance.subject_identity,
)
case _:
raise ValueError(
f"Cannot fan out with type {fan_out.type} local={fan_out.identity.local}"
f"Cannot fan out with type {instance.type} local={instance.identity.local}"
)
return cls.sent
@ -276,6 +281,7 @@ class FanOut(StatorModel):
identity_edited = "identity_edited"
identity_deleted = "identity_deleted"
identity_created = "identity_created"
identity_moved = "identity_moved"
state = StateField(FanOutStates)
@ -317,23 +323,3 @@ class FanOut(StatorModel):
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
### Async helpers ###
async def afetch_full(self):
"""
Returns a version of the object with all relations pre-loaded
"""
return (
await FanOut.objects.select_related(
"identity",
"subject_post",
"subject_post_interaction",
"subject_identity",
"subject_identity__domain",
)
.prefetch_related(
"subject_post__emojis",
)
.aget(pk=self.pk)
)


@ -2,7 +2,6 @@ import re
from datetime import date, timedelta
import urlman
from asgiref.sync import sync_to_async
from django.db import models
from django.utils import timezone
@ -18,31 +17,27 @@ class HashtagStates(StateGraph):
updated.transitions_to(outdated)
@classmethod
async def handle_outdated(cls, instance: "Hashtag"):
def handle_outdated(cls, instance: "Hashtag"):
"""
Computes the stats and other things for a Hashtag
"""
from time import time
from .post import Post
start = time()
posts_query = Post.objects.local_public().tagged_with(instance)
total = await posts_query.acount()
total = posts_query.count()
today = timezone.now().date()
total_today = await posts_query.filter(
total_today = posts_query.filter(
created__gte=today,
created__lte=today + timedelta(days=1),
).acount()
total_month = await posts_query.filter(
).count()
total_month = posts_query.filter(
created__year=today.year,
created__month=today.month,
).acount()
total_year = await posts_query.filter(
).count()
total_year = posts_query.filter(
created__year=today.year,
).acount()
).count()
if total:
if not instance.stats:
instance.stats = {}
@ -55,9 +50,8 @@ class HashtagStates(StateGraph):
}
)
instance.stats_updated = timezone.now()
await sync_to_async(instance.save)()
instance.save()
print(f"Updated hashtag {instance.hashtag} in {time() - start:.5f} seconds")
return cls.updated
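The now-synchronous `handle_outdated` issues four `.count()` queries (all-time, today, this month, this year). The same bucketing can be sketched over plain dates — key names here only approximate the stats JSON:

```python
from datetime import date

def hashtag_stats(post_dates: list[date], today: date) -> dict:
    # Count posts overall and within today / this month / this year,
    # as handle_outdated does with filtered count() queries.
    return {
        "total": len(post_dates),
        "today": sum(1 for d in post_dates if d == today),
        "month": sum(
            1 for d in post_dates if (d.year, d.month) == (today.year, today.month)
        ),
        "year": sum(1 for d in post_dates if d.year == today.year),
    }
```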
@ -86,6 +80,7 @@ class HashtagManager(models.Manager):
class Hashtag(StatorModel):
MAXIMUM_LENGTH = 100
# Normalized hashtag without the '#'
hashtag = models.SlugField(primary_key=True, max_length=100)
@ -114,6 +109,8 @@ class Hashtag(StatorModel):
class urls(urlman.Urls):
view = "/tags/{self.hashtag}/"
follow = "/tags/{self.hashtag}/follow/"
unfollow = "/tags/{self.hashtag}/unfollow/"
admin = "/admin/hashtags/"
admin_edit = "{admin}{self.hashtag}/"
admin_enable = "{admin_edit}enable/"
@ -166,9 +163,14 @@ class Hashtag(StatorModel):
results[date(year, month, day)] = val
return dict(sorted(results.items(), reverse=True)[:num])
def to_mastodon_json(self):
return {
def to_mastodon_json(self, following: bool | None = None):
value = {
"name": self.hashtag,
"url": self.urls.view.full(),
"url": self.urls.view.full(), # type: ignore
"history": [],
}
if following is not None:
value["following"] = following
return value


@ -1,6 +1,6 @@
import datetime
import hashlib
import json
import logging
import mimetypes
import ssl
from collections.abc import Iterable
@ -9,12 +9,15 @@ from urllib.parse import urlparse
import httpx
import urlman
from asgiref.sync import async_to_sync, sync_to_async
from django.conf import settings
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector
from django.db import models, transaction
from django.db.utils import IntegrityError
from django.template import loader
from django.template.defaultfilters import linebreaks_filter
from django.utils import timezone
from pyld.jsonld import JsonLdError
from activities.models.emoji import Emoji
from activities.models.fan_out import FanOut
@ -25,7 +28,7 @@ from activities.models.post_types import (
PostTypeDataEncoder,
QuestionData,
)
from core.exceptions import capture_message
from core.exceptions import ActivityPubFormatError
from core.html import ContentRenderer, FediverseHtmlParser
from core.ld import (
canonicalise,
@ -38,21 +41,25 @@ from core.snowflake import Snowflake
from stator.exceptions import TryAgainLater
from stator.models import State, StateField, StateGraph, StatorModel
from users.models.follow import FollowStates
from users.models.hashtag_follow import HashtagFollow
from users.models.identity import Identity, IdentityStates
from users.models.inbox_message import InboxMessage
from users.models.system_actor import SystemActor
logger = logging.getLogger(__name__)
class PostStates(StateGraph):
new = State(try_interval=300)
fanned_out = State(externally_progressed=True)
deleted = State(try_interval=300)
deleted_fanned_out = State(delete_after=24 * 60 * 60)
deleted_fanned_out = State(delete_after=86400)
edited = State(try_interval=300)
edited_fanned_out = State(externally_progressed=True)
new.transitions_to(fanned_out)
fanned_out.transitions_to(deleted_fanned_out)
fanned_out.transitions_to(deleted)
fanned_out.transitions_to(edited)
@ -62,45 +69,66 @@ class PostStates(StateGraph):
edited_fanned_out.transitions_to(deleted)
@classmethod
async def targets_fan_out(cls, post: "Post", type_: str) -> None:
def targets_fan_out(cls, post: "Post", type_: str) -> None:
# Fan out to each target
for follow in await post.aget_targets():
await FanOut.objects.acreate(
for follow in post.get_targets():
FanOut.objects.create(
identity=follow,
type=type_,
subject_post=post,
)
@classmethod
async def handle_new(cls, instance: "Post"):
def handle_new(cls, instance: "Post"):
"""
Creates all needed fan-out objects for a new Post.
"""
post = await instance.afetch_full()
# Only fan out if the post was published in the last day or it's local
# (we don't want to fan out anything older than that which is remote)
if post.local or (timezone.now() - post.published) < datetime.timedelta(days=1):
await cls.targets_fan_out(post, FanOut.Types.post)
await post.ensure_hashtags()
if instance.local or (timezone.now() - instance.published) < datetime.timedelta(
days=1
):
cls.targets_fan_out(instance, FanOut.Types.post)
instance.ensure_hashtags()
return cls.fanned_out
@classmethod
async def handle_deleted(cls, instance: "Post"):
def handle_fanned_out(cls, instance: "Post"):
"""
Creates all needed fan-out objects needed to delete a Post.
For remote posts, sees if we can delete them every so often.
"""
post = await instance.afetch_full()
await cls.targets_fan_out(post, FanOut.Types.post_deleted)
# Skip all of this if the horizon is zero
if settings.SETUP.REMOTE_PRUNE_HORIZON <= 0:
return
# To be a candidate for deletion, a post must be remote and old enough
if instance.local:
return
if instance.created > timezone.now() - datetime.timedelta(
days=settings.SETUP.REMOTE_PRUNE_HORIZON
):
return
# It must have no local interactions
if instance.interactions.filter(identity__local=True).exists():
return
# OK, delete it!
instance.delete()
return cls.deleted_fanned_out
@classmethod
async def handle_edited(cls, instance: "Post"):
def handle_deleted(cls, instance: "Post"):
"""
Creates all needed fan-out objects needed to delete a Post.
"""
cls.targets_fan_out(instance, FanOut.Types.post_deleted)
return cls.deleted_fanned_out
@classmethod
def handle_edited(cls, instance: "Post"):
"""
Creates all needed fan-out objects for an edited Post.
"""
post = await instance.afetch_full()
await cls.targets_fan_out(post, FanOut.Types.post_edited)
await post.ensure_hashtags()
cls.targets_fan_out(instance, FanOut.Types.post_edited)
instance.ensure_hashtags()
return cls.edited_fanned_out
@ -269,7 +297,7 @@ class Post(StatorModel):
# The Post it is replying to as an AP ID URI
# (as otherwise we'd have to pull entire threads to use IDs)
in_reply_to = models.CharField(max_length=500, blank=True, null=True)
in_reply_to = models.CharField(max_length=500, blank=True, null=True, db_index=True)
# The identities the post is directly to (who can see it if not public)
to = models.ManyToManyField(
@ -311,6 +339,10 @@ class Post(StatorModel):
class Meta:
indexes = [
GinIndex(fields=["hashtags"], name="hashtags_gin"),
GinIndex(
SearchVector("content", config="english"),
name="content_vector_gin",
),
models.Index(
fields=["visibility", "local", "published"],
name="ix_post_local_public_published",
@ -320,7 +352,6 @@ class Post(StatorModel):
name="ix_post_local_public_created",
),
]
index_together = StatorModel.Meta.index_together
class urls(urlman.Urls):
view = "{self.author.urls.view}posts/{self.id}/"
@ -329,6 +360,8 @@ class Post(StatorModel):
action_unlike = "{view}unlike/"
action_boost = "{view}boost/"
action_unboost = "{view}unboost/"
action_bookmark = "{view}bookmark/"
action_unbookmark = "{view}unbookmark/"
action_delete = "{view}delete/"
action_edit = "{view}edit/"
action_report = "{view}report/"
@ -369,8 +402,6 @@ class Post(StatorModel):
.first()
)
ain_reply_to_post = sync_to_async(in_reply_to_post)
### Content cleanup and extraction ###
def clean_type_data(self, value):
PostTypeData.parse_obj(value)
@ -429,7 +460,7 @@ class Post(StatorModel):
"""
if not self.summary:
return ""
return "summary-" + hashlib.md5(self.summary.encode("utf8")).hexdigest()
return f"summary-{self.id}"
@property
def stats_with_defaults(self):
@ -442,18 +473,6 @@ class Post(StatorModel):
"replies": self.stats.get("replies", 0) if self.stats else 0,
}
### Async helpers ###
async def afetch_full(self) -> "Post":
"""
Returns a version of the object with all relations pre-loaded
"""
return (
await Post.objects.select_related("author", "author__domain")
.prefetch_related("mentions", "mentions__domain", "attachments", "emojis")
.aget(pk=self.pk)
)
### Local creation/editing ###
@classmethod
@ -481,7 +500,10 @@ class Post(StatorModel):
# Strip all unwanted HTML and apply linebreaks filter, grabbing hashtags on the way
parser = FediverseHtmlParser(linebreaks_filter(content), find_hashtags=True)
content = parser.html
hashtags = sorted(parser.hashtags) or None
hashtags = (
sorted([tag[: Hashtag.MAXIMUM_LENGTH] for tag in parser.hashtags])
or None
)
# Make the Post object
post = cls.objects.create(
author=author,
@ -515,12 +537,16 @@ class Post(StatorModel):
sensitive: bool | None = None,
visibility: int = Visibilities.public,
attachments: list | None = None,
attachment_attributes: list | None = None,
):
with transaction.atomic():
# Strip all HTML and apply linebreaks filter
parser = FediverseHtmlParser(linebreaks_filter(content), find_hashtags=True)
self.content = parser.html
self.hashtags = sorted(parser.hashtags) or None
self.hashtags = (
sorted([tag[: Hashtag.MAXIMUM_LENGTH] for tag in parser.hashtags])
or None
)
self.summary = summary or None
self.sensitive = bool(summary) if sensitive is None else sensitive
self.visibility = visibility
@ -530,6 +556,17 @@ class Post(StatorModel):
self.attachments.set(attachments or [])
self.save()
for attrs in attachment_attributes or []:
attachment = next(
(a for a in attachments or [] if str(a.id) == attrs.id), None
)
if attachment is None:
continue
attachment.name = attrs.description
attachment.save()
self.transition_perform(PostStates.edited)
@classmethod
def mentions_from_content(cls, content, author) -> set[Identity]:
mention_hits = FediverseHtmlParser(content, find_mentions=True).mentions
@ -546,11 +583,11 @@ class Post(StatorModel):
domain=domain,
fetch=True,
)
if identity is not None:
if identity is not None and not identity.deleted:
mentions.add(identity)
return mentions
async def ensure_hashtags(self) -> None:
def ensure_hashtags(self) -> None:
"""
Ensure any of the already parsed hashtags from this Post
have a corresponding Hashtag record.
@ -558,10 +595,10 @@ class Post(StatorModel):
# Ensure hashtags
if self.hashtags:
for hashtag in self.hashtags:
tag, _ = await Hashtag.objects.aget_or_create(
hashtag=hashtag,
tag, _ = Hashtag.objects.get_or_create(
hashtag=hashtag[: Hashtag.MAXIMUM_LENGTH],
)
await tag.atransition_perform(HashtagStates.outdated)
tag.transition_perform(HashtagStates.outdated)
def calculate_stats(self, save=True):
"""
@@ -613,6 +650,7 @@ class Post(StatorModel):
"""
Returns the AP JSON for this object
"""
self.author.ensure_uris()
value = {
"to": [],
"cc": [],
@@ -645,11 +683,14 @@ class Post(StatorModel):
if self.edited:
value["updated"] = format_ld_date(self.edited)
# Targeting
# TODO: Add followers object
if self.visibility == self.Visibilities.public:
value["to"].append("Public")
value["to"].append("as:Public")
elif self.visibility == self.Visibilities.unlisted:
value["cc"].append("Public")
value["cc"].append("as:Public")
elif (
self.visibility == self.Visibilities.followers and self.author.followers_uri
):
value["to"].append(self.author.followers_uri)
# Mentions
for mention in self.mentions.all():
value["tag"].append(mention.to_ap_tag())
@@ -717,27 +758,36 @@ class Post(StatorModel):
"object": object,
}
async def aget_targets(self) -> Iterable[Identity]:
def get_targets(self) -> Iterable[Identity]:
"""
Returns a list of Identities that need to see posts and their changes
"""
targets = set()
async for mention in self.mentions.all():
for mention in self.mentions.all():
targets.add(mention)
# Then, if it's not mentions only, also deliver to followers
if self.visibility in [Post.Visibilities.public, Post.Visibilities.unlisted]:
for interaction in self.interactions.all():
targets.add(interaction.identity)
# Then, if it's not mentions only, also deliver to followers and all hashtag followers
if self.visibility != Post.Visibilities.mentioned:
async for follower in self.author.inbound_follows.filter(
for follower in self.author.inbound_follows.filter(
state__in=FollowStates.group_active()
).select_related("source"):
targets.add(follower.source)
if self.hashtags:
for follow in HashtagFollow.objects.by_hashtags(
self.hashtags
).prefetch_related("identity"):
targets.add(follow.identity)
# If it's a reply, always include the original author if we know them
reply_post = await self.ain_reply_to_post()
reply_post = self.in_reply_to_post()
if reply_post:
targets.add(reply_post.author)
# And if it's a reply to one of our own, we have to re-fan-out to
# the original author's followers
if reply_post.author.local:
async for follower in reply_post.author.inbound_follows.filter(
for follower in reply_post.author.inbound_follows.filter(
state__in=FollowStates.group_active()
).select_related("source"):
targets.add(follower.source)
@@ -754,7 +804,7 @@ class Post(StatorModel):
.filter(mute=False)
.select_related("target")
)
async for block in blocks:
for block in blocks:
try:
targets.remove(block.target)
except KeyError:
@@ -814,32 +864,52 @@ class Post(StatorModel):
# If the author is not fetched yet, try again later
if author.domain is None:
if fetch_author:
async_to_sync(author.fetch_actor)()
if author.domain is None:
if not author.fetch_actor() or author.domain is None:
raise TryAgainLater()
else:
raise TryAgainLater()
# If the post is from a blocked domain, stop and drop
if author.domain.blocked:
if author.domain.recursively_blocked():
raise cls.DoesNotExist("Post is from a blocked domain")
post = cls.objects.create(
object_uri=data["id"],
author=author,
content="",
local=False,
type=data["type"],
)
created = True
# Parallelism may cause another worker thread to create the
# same post simultaneously - watch for that and avoid
# failing the entire transaction
try:
# wrapped in a transaction to avoid breaking the outer
# transaction
with transaction.atomic():
post = cls.objects.create(
object_uri=data["id"],
author=author,
content="",
local=False,
type=data["type"],
)
created = True
except IntegrityError:
# despite previous checks, a parallel thread managed
# to create the same object already
raise TryAgainLater()
else:
raise cls.DoesNotExist(f"No post with ID {data['id']}", data)
if update or created:
post.type = data["type"]
post.url = data.get("url", data["id"])
if post.type in (cls.Types.article, cls.Types.question):
post.type_data = PostTypeData(__root__=data).__root__
post.content = get_value_or_map(data, "content", "contentMap")
post.summary = data.get("summary")
try:
# apparently sometimes posts (Pages?) in the fediverse
# don't have content, but this shouldn't be a total failure
post.content = get_value_or_map(data, "content", "contentMap")
except ActivityPubFormatError as err:
logger.warning("%s on %s", err, post.url)
post.content = None
# Document types have names, not summaries
post.summary = data.get("summary") or data.get("name")
if not post.content and post.summary:
post.content = post.summary
post.summary = None
post.sensitive = data.get("sensitive", False)
post.url = data.get("url", data["id"])
post.published = parse_ld_date(data.get("published"))
post.edited = parse_ld_date(data.get("updated"))
post.in_reply_to = data.get("inReplyTo")
@@ -851,19 +921,22 @@ class Post(StatorModel):
mention_identity = Identity.by_actor_uri(tag["href"], create=True)
post.mentions.add(mention_identity)
elif tag_type in ["_:hashtag", "hashtag"]:
# kbin produces tags with 'tag' instead of 'name'
if "tag" in tag and "name" not in tag:
name = get_value_or_map(tag, "tag", "tagMap")
else:
name = get_value_or_map(tag, "name", "nameMap")
post.hashtags.append(
get_value_or_map(tag, "name", "nameMap").lower().lstrip("#")
name.lower().lstrip("#")[: Hashtag.MAXIMUM_LENGTH]
)
elif tag_type in ["toot:emoji", "emoji"]:
emoji = Emoji.by_ap_tag(post.author.domain, tag, create=True)
post.emojis.add(emoji)
elif tag_type == "edition":
# Bookwyrm Edition is similar to hashtags. There should be a link to
# the book in the Note's content and a post attachment of the cover
# image. No special processing should be needed for ingest.
pass
else:
raise ValueError(f"Unknown tag type {tag['type']}")
# Various ActivityPub implementations and proposals have introduced
# tag types, e.g. Edition in Bookwyrm and Link in fep-e232 Object
# Links; it should be safe to ignore (and log) them until they are
# fully supported
pass
# Visibility and to
# (a post is public if it's to:public, otherwise it's unlisted if
# it's cc:public, otherwise it's more limited)
@@ -874,10 +947,15 @@ class Post(StatorModel):
post.visibility = Post.Visibilities.public
elif "public" in cc or "as:public" in cc:
post.visibility = Post.Visibilities.unlisted
elif post.author.followers_uri in to:
post.visibility = Post.Visibilities.followers
# Attachments
# These have no IDs, so we have to wipe them each time
post.attachments.all().delete()
for attachment in get_list(data, "attachment"):
if "url" not in attachment and "href" in attachment:
# Links have hrefs, while other Objects have urls
attachment["url"] = attachment["href"]
if "focalPoint" in attachment:
try:
focal_x, focal_y = attachment["focalPoint"]
@@ -887,6 +965,10 @@ class Post(StatorModel):
focal_x, focal_y = None, None
mimetype = attachment.get("mediaType")
if not mimetype or not isinstance(mimetype, str):
if "url" not in attachment:
raise ActivityPubFormatError(
f"No URL present on attachment in {post.url}"
)
mimetype, _ = mimetypes.guess_type(attachment["url"])
if not mimetype:
mimetype = "application/octet-stream"
@@ -902,7 +984,11 @@ class Post(StatorModel):
)
# Calculate stats in case we have existing replies
post.calculate_stats(save=False)
post.save()
with transaction.atomic():
# if we don't commit the transaction here, there's a chance
# the parent fetch below goes into an infinite loop
post.save()
# Potentially schedule a fetch of the reply parent, and recalculate
# its stats if it's here already.
if post.in_reply_to:
@@ -912,8 +998,10 @@ class Post(StatorModel):
try:
cls.ensure_object_uri(post.in_reply_to, reason=post.object_uri)
except ValueError:
capture_message(
f"Cannot fetch ancestor of Post={post.pk}, ancestor_uri={post.in_reply_to}"
logger.warning(
"Cannot fetch ancestor of Post=%s, ancestor_uri=%s",
post.pk,
post.in_reply_to,
)
else:
parent.calculate_stats()
@@ -930,10 +1018,10 @@ class Post(StatorModel):
except cls.DoesNotExist:
if fetch:
try:
response = async_to_sync(SystemActor().signed_request)(
response = SystemActor().signed_request(
method="get", uri=object_uri
)
except (httpx.HTTPError, ssl.SSLCertVerificationError):
except (httpx.HTTPError, ssl.SSLCertVerificationError, ValueError):
raise cls.DoesNotExist(f"Could not fetch {object_uri}")
if response.status_code in [404, 410]:
raise cls.DoesNotExist(f"No post at {object_uri}")
@@ -951,11 +1039,13 @@ class Post(StatorModel):
update=True,
fetch_author=True,
)
except (json.JSONDecodeError, ValueError):
raise cls.DoesNotExist(f"Invalid ld+json response for {object_uri}")
except (json.JSONDecodeError, ValueError, JsonLdError) as err:
raise cls.DoesNotExist(
f"Invalid ld+json response for {object_uri}"
) from err
# We may need to fetch the author too
if post.author.state == IdentityStates.outdated:
async_to_sync(post.author.fetch_actor)()
post.author.fetch_actor()
return post
else:
raise cls.DoesNotExist(f"Cannot find Post with URI {object_uri}")
@@ -989,7 +1079,7 @@ class Post(StatorModel):
if data["actor"] != data["object"]["attributedTo"]:
raise ValueError("Create actor does not match its Post object", data)
# Create it, stator will fan it out locally
cls.by_ap(data["object"], create=True, update=True)
cls.by_ap(data["object"], create=True, update=True, fetch_author=True)
@classmethod
def handle_update_ap(cls, data):
@@ -1059,7 +1149,7 @@ class Post(StatorModel):
### Mastodon API ###
def to_mastodon_json(self, interactions=None, identity=None):
def to_mastodon_json(self, interactions=None, bookmarks=None, identity=None):
reply_parent = None
if self.in_reply_to:
# Load the PK and author.id explicitly to prevent a SELECT on the entire author Identity
@@ -1128,4 +1218,7 @@ class Post(StatorModel):
if interactions:
value["favourited"] = self.pk in interactions.get("like", [])
value["reblogged"] = self.pk in interactions.get("boost", [])
value["pinned"] = self.pk in interactions.get("pin", [])
if bookmarks:
value["bookmarked"] = self.pk in bookmarks
return value
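The `favourited`/`reblogged`/`pinned`/`bookmarked` flags above are looked up against pre-gathered sets of post IDs. A sketch of how such an interactions map can be folded from `(post_id, type)` rows, as produced by a `values_list` query; the helper name is illustrative, not the project's actual function:

```python
from collections import defaultdict

def fold_interactions(pairs):
    # Turn (post_id, interaction_type) rows into {type: {post_id, ...}}
    # so membership checks per post are O(1)
    result: defaultdict[str, set] = defaultdict(set)
    for post_id, interaction_type in pairs:
        result[interaction_type].add(post_id)
    return result
```

A lookup then becomes e.g. `post_pk in fold_interactions(rows).get("like", set())`.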


@@ -8,16 +8,11 @@ from stator.models import State, StateField, StateGraph, StatorModel
class PostAttachmentStates(StateGraph):
new = State(try_interval=30000)
new = State(externally_progressed=True)
fetched = State()
new.transitions_to(fetched)
@classmethod
async def handle_new(cls, instance):
# TODO: Fetch images to our own media storage
pass
class PostAttachment(StatorModel):
"""
@@ -31,6 +26,13 @@ class PostAttachment(StatorModel):
blank=True,
null=True,
)
author = models.ForeignKey(
"users.Identity",
on_delete=models.CASCADE,
related_name="attachments",
blank=True,
null=True,
)
state = StateField(graph=PostAttachmentStates)
@@ -55,8 +57,8 @@ class PostAttachment(StatorModel):
width = models.IntegerField(null=True, blank=True)
height = models.IntegerField(null=True, blank=True)
focal_x = models.IntegerField(null=True, blank=True)
focal_y = models.IntegerField(null=True, blank=True)
focal_x = models.FloatField(null=True, blank=True)
focal_y = models.FloatField(null=True, blank=True)
blurhash = models.TextField(null=True, blank=True)
created = models.DateTimeField(auto_now_add=True)
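The hunk above widens `focal_x`/`focal_y` from integers to floats, matching the fractional `focalPoint` pairs that Mastodon-style servers send. A defensive parse of the incoming pair, mirroring the try/except used during ingest; the function name is illustrative:

```python
def parse_focal_point(attachment: dict):
    # focalPoint is a two-element pair of floats; fall back to no
    # focal point on missing or malformed data
    try:
        focal_x, focal_y = attachment["focalPoint"]
        return float(focal_x), float(focal_y)
    except (KeyError, TypeError, ValueError):
        return None, None
```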
@@ -111,7 +113,7 @@ class PostAttachment(StatorModel):
### ActivityPub ###
def to_ap(self):
return {
ap = {
"url": self.file.url,
"name": self.name,
"type": "Document",
@@ -120,13 +122,22 @@ class PostAttachment(StatorModel):
"mediaType": self.mimetype,
"blurhash": self.blurhash,
}
if self.is_image() and self.focal_x and self.focal_y:
ap["type"] = "Image"
ap["focalPoint"] = [self.focal_x, self.focal_y]
return ap
### Mastodon Client API ###
def to_mastodon_json(self):
type_ = "unknown"
if self.is_image():
type_ = "image"
elif self.is_video():
type_ = "video"
value = {
"id": self.pk,
"type": "image" if self.is_image() else "unknown",
"type": type_,
"url": self.full_url().absolute,
"preview_url": self.thumbnail_url().absolute,
"remote_url": None,


@@ -27,95 +27,89 @@ class PostInteractionStates(StateGraph):
return [cls.new, cls.fanned_out]
@classmethod
async def handle_new(cls, instance: "PostInteraction"):
def handle_new(cls, instance: "PostInteraction"):
"""
Creates all needed fan-out objects for a new PostInteraction.
"""
interaction = await instance.afetch_full()
# Boost: send a copy to all people who follow this user (limiting
# to just local follows if it's a remote boost)
if interaction.type == interaction.Types.boost:
for target in await interaction.aget_boost_targets():
await FanOut.objects.acreate(
# Pin: send Add activity to all people who follow this user
if instance.type == instance.Types.boost or instance.type == instance.Types.pin:
for target in instance.get_targets():
FanOut.objects.create(
type=FanOut.Types.interaction,
identity=target,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
# Like: send a copy to the original post author only,
# if the liker is local or they are
elif interaction.type == interaction.Types.like:
if interaction.identity.local or interaction.post.local:
await FanOut.objects.acreate(
elif instance.type == instance.Types.like:
if instance.identity.local or instance.post.local:
FanOut.objects.create(
type=FanOut.Types.interaction,
identity_id=interaction.post.author_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
identity_id=instance.post.author_id,
subject_post=instance.post,
subject_post_interaction=instance,
)
# Vote: send a copy of the vote to the original
# post author only if it's a local interaction
# to a non local post
elif interaction.type == interaction.Types.vote:
if interaction.identity.local and not interaction.post.local:
await FanOut.objects.acreate(
elif instance.type == instance.Types.vote:
if instance.identity.local and not instance.post.local:
FanOut.objects.create(
type=FanOut.Types.interaction,
identity_id=interaction.post.author_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
identity_id=instance.post.author_id,
subject_post=instance.post,
subject_post_interaction=instance,
)
else:
raise ValueError("Cannot fan out unknown type")
# And one for themselves if they're local and it's a boost
if (
interaction.type == PostInteraction.Types.boost
and interaction.identity.local
):
await FanOut.objects.acreate(
identity_id=interaction.identity_id,
if instance.type == PostInteraction.Types.boost and instance.identity.local:
FanOut.objects.create(
identity_id=instance.identity_id,
type=FanOut.Types.interaction,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
return cls.fanned_out
@classmethod
async def handle_undone(cls, instance: "PostInteraction"):
def handle_undone(cls, instance: "PostInteraction"):
"""
Creates all needed fan-out objects to undo a PostInteraction.
"""
interaction = await instance.afetch_full()
# Undo Boost: send a copy to all people who follow this user
if interaction.type == interaction.Types.boost:
async for follow in interaction.identity.inbound_follows.select_related(
# Undo Pin: send a Remove activity to all people who follow this user
if instance.type == instance.Types.boost or instance.type == instance.Types.pin:
for follow in instance.identity.inbound_follows.select_related(
"source", "target"
):
if follow.source.local or follow.target.local:
await FanOut.objects.acreate(
FanOut.objects.create(
type=FanOut.Types.undo_interaction,
identity_id=follow.source_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
# Undo Like: send a copy to the original post author only
elif interaction.type == interaction.Types.like:
await FanOut.objects.acreate(
elif instance.type == instance.Types.like:
FanOut.objects.create(
type=FanOut.Types.undo_interaction,
identity_id=interaction.post.author_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
identity_id=instance.post.author_id,
subject_post=instance.post,
subject_post_interaction=instance,
)
else:
raise ValueError("Cannot fan out unknown type")
# And one for themselves if they're local and it's a boost
if (
interaction.type == PostInteraction.Types.boost
and interaction.identity.local
):
await FanOut.objects.acreate(
identity_id=interaction.identity_id,
if instance.type == PostInteraction.Types.boost and instance.identity.local:
FanOut.objects.create(
identity_id=instance.identity_id,
type=FanOut.Types.undo_interaction,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
return cls.undone_fanned_out
@@ -129,6 +123,7 @@ class PostInteraction(StatorModel):
like = "like"
boost = "boost"
vote = "vote"
pin = "pin"
id = models.BigIntegerField(
primary_key=True,
@@ -170,9 +165,7 @@ class PostInteraction(StatorModel):
updated = models.DateTimeField(auto_now=True)
class Meta:
index_together = [
["type", "identity", "post"]
] + StatorModel.Meta.index_together
indexes = [models.Index(fields=["type", "identity", "post"])]
### Display helpers ###
@@ -186,7 +179,7 @@ class PostInteraction(StatorModel):
ids_with_interaction_type = cls.objects.filter(
identity=identity,
post_id__in=[post.pk for post in posts],
type__in=[cls.Types.like, cls.Types.boost],
type__in=[cls.Types.like, cls.Types.boost, cls.Types.pin],
state__in=[PostInteractionStates.new, PostInteractionStates.fanned_out],
).values_list("post_id", "type")
# Make it into the return dict
@@ -205,34 +198,30 @@ class PostInteraction(StatorModel):
[e.subject_post for e in events if e.subject_post], identity
)
### Async helpers ###
async def afetch_full(self):
"""
Returns a version of the object with all relations pre-loaded
"""
return await PostInteraction.objects.select_related(
"identity", "post", "post__author"
).aget(pk=self.pk)
async def aget_boost_targets(self) -> Iterable[Identity]:
def get_targets(self) -> Iterable[Identity]:
"""
Returns an iterable with Identities of followers that have unique
shared_inbox among each other to be used as target to the boost
shared_inbox among each other to be used as targets.
When the interaction is a boost, only follows with boosts enabled
are considered; for pins, all followers are considered.
"""
# Start including the post author
targets = {self.post.author}
query = self.identity.inbound_follows.active()
# Include all followers that are following the boosts
async for follow in self.identity.inbound_follows.active().filter(
boosts=True
).select_related("source"):
if self.type == self.Types.boost:
query = query.filter(boosts=True)
for follow in query.select_related("source"):
targets.add(follow.source)
# Fetch the full blocks and remove them as targets
async for block in self.identity.outbound_blocks.active().filter(
mute=False
).select_related("target"):
for block in (
self.identity.outbound_blocks.active()
.filter(mute=False)
.select_related("target")
):
try:
targets.remove(block.target)
except KeyError:
@@ -326,7 +315,7 @@ class PostInteraction(StatorModel):
"inReplyTo": self.post.object_uri,
"attributedTo": self.identity.actor_uri,
}
else:
elif self.type == self.Types.pin:
raise ValueError("Cannot turn into AP")
return value
@@ -356,6 +345,28 @@ class PostInteraction(StatorModel):
"object": object,
}
def to_add_ap(self):
"""
Returns the AP JSON to add a pin interaction to the featured collection
"""
return {
"type": "Add",
"actor": self.identity.actor_uri,
"object": self.post.object_uri,
"target": self.identity.actor_uri + "collections/featured/",
}
def to_remove_ap(self):
"""
Returns the AP JSON to remove a pin interaction from the featured collection
"""
return {
"type": "Remove",
"actor": self.identity.actor_uri,
"object": self.post.object_uri,
"target": self.identity.actor_uri + "collections/featured/",
}
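As the two new methods above show, pins federate as `Add`/`Remove` activities targeting the actor's featured collection. A condensed sketch of the same payload construction, with example URIs that are purely illustrative:

```python
def featured_collection_uri(actor_uri: str) -> str:
    # The featured (pinned) collection hangs off the actor URI
    return actor_uri + "collections/featured/"

def pin_activity(activity_type: str, actor_uri: str, object_uri: str) -> dict:
    # activity_type is "Add" to pin a post, "Remove" to unpin it
    return {
        "type": activity_type,
        "actor": actor_uri,
        "object": object_uri,
        "target": featured_collection_uri(actor_uri),
    }
```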
### ActivityPub (inbound) ###
@classmethod
@@ -438,8 +449,9 @@ class PostInteraction(StatorModel):
# TODO: Limited retry state?
return
interaction.post.calculate_stats()
interaction.post.calculate_type_data()
if interaction and interaction.post:
interaction.post.calculate_stats()
interaction.post.calculate_type_data()
@classmethod
def handle_undo_ap(cls, data):
@@ -464,6 +476,76 @@ class PostInteraction(StatorModel):
interaction.post.calculate_stats()
interaction.post.calculate_type_data()
@classmethod
def handle_add_ap(cls, data):
"""
Handles an incoming Add activity which is a pin
"""
target = data.get("target", None)
if not target:
return
# we only care about pinned posts, not hashtags
object = data.get("object", {})
if isinstance(object, dict) and object.get("type") == "Hashtag":
return
with transaction.atomic():
identity = Identity.by_actor_uri(data["actor"], create=True)
# it's only a pin if the target is the identity's featured collection URI
if identity.featured_collection_uri != target:
return
object_uri = get_str_or_id(object)
if not object_uri:
return
post = Post.by_object_uri(object_uri, fetch=True)
return PostInteraction.objects.get_or_create(
type=cls.Types.pin,
identity=identity,
post=post,
state__in=PostInteractionStates.group_active(),
)[0]
@classmethod
def handle_remove_ap(cls, data):
"""
Handles an incoming Remove activity which is an unpin
"""
target = data.get("target", None)
if not target:
return
# we only care about pinned posts, not hashtags
object = data.get("object", {})
if isinstance(object, dict) and object.get("type") == "Hashtag":
return
with transaction.atomic():
identity = Identity.by_actor_uri(data["actor"], create=True)
# it's only an unpin if the target is the identity's featured collection URI
if identity.featured_collection_uri != target:
return
try:
object_uri = get_str_or_id(object)
if not object_uri:
return
post = Post.by_object_uri(object_uri, fetch=False)
for interaction in cls.objects.filter(
type=cls.Types.pin,
identity=identity,
post=post,
state__in=PostInteractionStates.group_active(),
):
# Force it into undone_fanned_out as it's not ours
interaction.transition_perform(
PostInteractionStates.undone_fanned_out
)
except (cls.DoesNotExist, Post.DoesNotExist):
return
### Mastodon API ###
def to_mastodon_status_json(self, interactions=None, identity=None):


@@ -58,7 +58,7 @@ class QuestionData(BasePostDataType):
"expired": False,
"multiple": multiple,
"votes_count": 0,
"voters_count": self.voter_count if multiple else None,
"voters_count": self.voter_count,
"voted": False,
"own_votes": [],
"options": [],


@@ -16,6 +16,7 @@ class TimelineEvent(models.Model):
mentioned = "mentioned"
liked = "liked" # Someone liking one of our posts
followed = "followed"
follow_requested = "follow_requested"
boosted = "boosted" # Someone boosting one of our posts
announcement = "announcement" # Server announcement
identity_created = "identity_created" # New identity created
@@ -55,15 +56,18 @@ class TimelineEvent(models.Model):
published = models.DateTimeField(default=timezone.now)
seen = models.BooleanField(default=False)
dismissed = models.BooleanField(default=False)
created = models.DateTimeField(auto_now_add=True)
class Meta:
index_together = [
indexes = [
# This relies on a DB that can use left subsets of indexes
("identity", "type", "subject_post", "subject_identity"),
("identity", "type", "subject_identity"),
("identity", "created"),
models.Index(
fields=["identity", "type", "subject_post", "subject_identity"]
),
models.Index(fields=["identity", "type", "subject_identity"]),
models.Index(fields=["identity", "created"]),
]
### Alternate constructors ###
@@ -71,14 +75,30 @@ class TimelineEvent(models.Model):
@classmethod
def add_follow(cls, identity, source_identity):
"""
Adds a follow to the timeline if it's not there already
Adds a follow to the timeline if it's not there already, removing any pending follow request
"""
cls.objects.filter(
type=cls.Types.follow_requested,
identity=identity,
subject_identity=source_identity,
).delete()
return cls.objects.get_or_create(
identity=identity,
type=cls.Types.followed,
subject_identity=source_identity,
)[0]
@classmethod
def add_follow_request(cls, identity, source_identity):
"""
Adds a follow request to the timeline if it's not there already
"""
return cls.objects.get_or_create(
identity=identity,
type=cls.Types.follow_requested,
subject_identity=source_identity,
)[0]
@classmethod
def add_post(cls, identity, post):
"""
@@ -166,6 +186,14 @@ class TimelineEvent(models.Model):
subject_identity_id=interaction.identity_id,
).delete()
@classmethod
def delete_follow(cls, target, source):
TimelineEvent.objects.filter(
type__in=[cls.Types.followed, cls.Types.follow_requested],
identity=target,
subject_identity=source,
).delete()
### Background tasks ###
@classmethod
@@ -215,16 +243,18 @@ class TimelineEvent(models.Model):
)
elif self.type == self.Types.followed:
result["type"] = "follow"
elif self.type == self.Types.follow_requested:
result["type"] = "follow_request"
elif self.type == self.Types.identity_created:
result["type"] = "admin.sign_up"
else:
raise ValueError(f"Cannot convert {self.type} to notification JSON")
return result
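The branch above maps timeline event types onto Mastodon notification type strings, with the new `follow_requested` event becoming a `follow_request` notification. A table-driven sketch of the same mapping, covering only the types visible in this hunk:

```python
# Sketch of the event-type -> Mastodon notification-type mapping above;
# only the cases shown in this hunk are included.
NOTIFICATION_TYPES = {
    "followed": "follow",
    "follow_requested": "follow_request",
    "identity_created": "admin.sign_up",
}

def notification_type(event_type: str) -> str:
    try:
        return NOTIFICATION_TYPES[event_type]
    except KeyError:
        raise ValueError(f"Cannot convert {event_type} to notification JSON")
```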
def to_mastodon_status_json(self, interactions=None, identity=None):
def to_mastodon_status_json(self, interactions=None, bookmarks=None, identity=None):
if self.type == self.Types.post:
return self.subject_post.to_mastodon_json(
interactions=interactions, identity=identity
interactions=interactions, bookmarks=bookmarks, identity=identity
)
elif self.type == self.Types.boost:
return self.subject_post_interaction.to_mastodon_status_json(


@@ -1,3 +1,5 @@
import logging
from activities.models import (
Post,
PostInteraction,
@@ -5,9 +7,10 @@ from activities.models (
PostStates,
TimelineEvent,
)
from core.exceptions import capture_message
from users.models import Identity
logger = logging.getLogger(__name__)
class PostService:
"""
@@ -72,7 +75,12 @@ class PostService:
def unboost_as(self, identity: Identity):
self.uninteract_as(identity, PostInteraction.Types.boost)
def context(self, identity: Identity | None) -> tuple[list[Post], list[Post]]:
def context(
self,
identity: Identity | None,
num_ancestors: int = 10,
num_descendants: int = 50,
) -> tuple[list[Post], list[Post]]:
"""
Returns ancestor/descendant information.
@@ -82,8 +90,6 @@ class PostService:
If identity is provided, includes mentions/followers-only posts they
can see. Otherwise, shows unlisted and above only.
"""
num_ancestors = 10
num_descendants = 50
# Retrieve ancestors via parent walk
ancestors: list[Post] = []
ancestor = self.post
@@ -95,7 +101,7 @@ class PostService:
try:
Post.ensure_object_uri(object_uri, reason=reason)
except ValueError:
capture_message(
logger.error(
f"Cannot fetch ancestor Post={self.post.pk}, ancestor_uri={object_uri}"
)
break
@@ -105,6 +111,7 @@ class PostService:
# Retrieve descendants via breadth-first-search
descendants: list[Post] = []
queue = [self.post]
seen: set[str] = set()
while queue and len(descendants) < num_descendants:
node = queue.pop()
child_queryset = (
@@ -119,8 +126,10 @@ class PostService:
else:
child_queryset = child_queryset.unlisted(include_replies=True)
for child in child_queryset:
descendants.append(child)
queue.append(child)
if child.pk not in seen:
descendants.append(child)
queue.append(child)
seen.add(child.pk)
return ancestors, descendants
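The descendant walk above now tracks seen post IDs so the same reply is never appended or re-queued twice. A generic sketch of that deduplicated traversal over an adjacency map; the reply-tree shape here is an assumption for illustration, and like the original, the size limit is only checked between nodes:

```python
from collections import deque

def collect_descendants(replies: dict[str, list[str]], root: str, limit: int) -> list[str]:
    # replies maps a post ID to the IDs of its direct replies
    descendants: list[str] = []
    queue = deque([root])
    seen: set[str] = set()
    while queue and len(descendants) < limit:
        node = queue.popleft()
        for child in replies.get(node, []):
            # Skip children already collected via another path
            if child not in seen:
                descendants.append(child)
                queue.append(child)
                seen.add(child)
    return descendants
```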
def delete(self):
@@ -136,3 +145,22 @@ class PostService:
),
PostInteractionStates.undone,
)
def pin_as(self, identity: Identity):
if identity != self.post.author:
raise ValueError("Not the author of this post")
if self.post.visibility == Post.Visibilities.mentioned:
raise ValueError("Cannot pin a mentioned-only post")
if (
PostInteraction.objects.filter(
type=PostInteraction.Types.pin,
identity=identity,
).count()
>= 5
):
raise ValueError("Maximum number of pins already reached")
self.interact_as(identity, PostInteraction.Types.pin)
def unpin_as(self, identity: Identity):
self.uninteract_as(identity, PostInteraction.Types.pin)


@@ -1,7 +1,7 @@
import httpx
from asgiref.sync import async_to_sync
from activities.models import Hashtag, Post
from core.json import json_from_response
from core.ld import canonicalise
from users.models import Domain, Identity, IdentityStates
from users.models.system_actor import SystemActor
@@ -49,7 +49,7 @@ class SearchService:
username, domain_instance or domain, fetch=True
)
if identity and identity.state == IdentityStates.outdated:
async_to_sync(identity.fetch_actor)()
identity.fetch_actor()
except ValueError:
pass
@@ -74,7 +74,7 @@ class SearchService:
# Fetch the provided URL as the system actor to retrieve the AP JSON
try:
response = async_to_sync(SystemActor().signed_request)(
response = SystemActor().signed_request(
method="get",
uri=self.query,
)
@@ -82,7 +82,12 @@ class SearchService:
return None
if response.status_code >= 400:
return None
document = canonicalise(response.json(), include_security=True)
json_data = json_from_response(response)
if not json_data:
return None
document = canonicalise(json_data, include_security=True)
type = document.get("type", "unknown").lower()
# Is it an identity?
@@ -90,7 +95,7 @@ class SearchService:
# Try and retrieve the profile by actor URI
identity = Identity.by_actor_uri(document["id"], create=True)
if identity and identity.state == IdentityStates.outdated:
async_to_sync(identity.fetch_actor)()
identity.fetch_actor()
return identity
# Is it a post?
@@ -123,6 +128,14 @@ class SearchService:
results.add(hashtag)
return results
def search_post_content(self):
"""
Searches for posts on an identity via full text search
"""
return self.identity.posts.unlisted(include_replies=True).filter(
content__search=self.query
)[:50]
def search_all(self):
"""
Returns all possible results for a search


@@ -47,12 +47,15 @@ class TimelineService:
)
def local(self) -> models.QuerySet[Post]:
return (
queryset = (
PostService.queryset()
.local_public()
.filter(author__restriction=Identity.Restriction.none)
.order_by("-id")
)
if self.identity is not None:
queryset = queryset.filter(author__domain=self.identity.domain)
return queryset
def federated(self) -> models.QuerySet[Post]:
return (
@@ -74,19 +77,56 @@ class TimelineService:
def notifications(self, types: list[str]) -> models.QuerySet[TimelineEvent]:
return (
self.event_queryset()
.filter(identity=self.identity, type__in=types)
.filter(identity=self.identity, type__in=types, dismissed=False)
.order_by("-created")
)
def identity_public(self, identity: Identity):
def identity_public(
self,
identity: Identity,
include_boosts: bool = True,
include_replies: bool = True,
):
"""
Returns all publically visible posts for an identity
Returns timeline events with all of an identity's publicly visible posts
and their boosts
"""
filter = models.Q(
type=TimelineEvent.Types.post,
subject_post__author=identity,
subject_post__visibility__in=[
Post.Visibilities.public,
Post.Visibilities.local_only,
Post.Visibilities.unlisted,
],
)
if include_boosts:
filter = filter | models.Q(
type=TimelineEvent.Types.boost, subject_identity=identity
)
if not include_replies:
filter = filter & models.Q(subject_post__in_reply_to__isnull=True)
return (
self.event_queryset()
.filter(
filter,
identity=identity,
)
.order_by("-created")
)
def identity_pinned(self) -> models.QuerySet[Post]:
"""
Return all pinned posts that are publicly visible for an identity
"""
return (
PostService.queryset()
.filter(author=identity)
.unlisted(include_replies=True)
.order_by("-id")
.public()
.filter(
interactions__identity=self.identity,
interactions__type=PostInteraction.Types.pin,
interactions__state__in=PostInteractionStates.group_active(),
)
)
def likes(self) -> models.QuerySet[Post]:
@@ -102,3 +142,13 @@ class TimelineService:
)
.order_by("-id")
)
def bookmarks(self) -> models.QuerySet[Post]:
"""
Return all bookmarked posts for an identity
"""
return (
PostService.queryset()
.filter(bookmarks__identity=self.identity)
.order_by("-id")
)


@@ -1,27 +1,17 @@
from django import forms
from django.conf import settings
from django.core.exceptions import PermissionDenied
from django.shortcuts import get_object_or_404, redirect, render
from django.contrib import messages
from django.shortcuts import redirect
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.generic import FormView
from activities.models import (
Post,
PostAttachment,
PostAttachmentStates,
PostStates,
TimelineEvent,
)
from activities.models import Post, PostAttachment, PostAttachmentStates, TimelineEvent
from core.files import blurhash_image, resize_image
from core.html import FediverseHtmlParser
from core.models import Config
from users.decorators import identity_required
from users.views.base import IdentityViewMixin
@method_decorator(identity_required, name="dispatch")
class Compose(FormView):
class Compose(IdentityViewMixin, FormView):
template_name = "activities/compose.html"
class form_class(forms.Form):
@ -33,6 +23,7 @@ class Compose(FormView):
},
)
)
visibility = forms.ChoiceField(
choices=[
(Post.Visibilities.public, "Public"),
@ -42,6 +33,7 @@ class Compose(FormView):
(Post.Visibilities.mentioned, "Mentioned Only"),
],
)
content_warning = forms.CharField(
required=False,
label=Config.lazy_system_value("content_warning_text"),
@ -52,11 +44,42 @@ class Compose(FormView):
),
help_text="Optional - Post will be hidden behind this text until clicked",
)
reply_to = forms.CharField(widget=forms.HiddenInput(), required=False)
def __init__(self, request, *args, **kwargs):
image = forms.ImageField(
required=False,
help_text="Optional - For multiple image uploads and cropping, please use an app",
widget=forms.FileInput(
attrs={
"_": f"""
on change
if me.files[0].size > {settings.SETUP.MEDIA_MAX_IMAGE_FILESIZE_MB * 1024 ** 2}
add [@disabled=] to #upload
remove <ul.errorlist/>
make <ul.errorlist/> called errorlist
make <li/> called error
set size_in_mb to (me.files[0].size / 1024 / 1024).toFixed(2)
put 'File must be {settings.SETUP.MEDIA_MAX_IMAGE_FILESIZE_MB}MB or less (actual: ' + size_in_mb + 'MB)' into error
put error into errorlist
put errorlist before me
else
remove @disabled from #upload
remove <ul.errorlist/>
end
end
"""
}
),
)
image_caption = forms.CharField(
required=False,
help_text="Provide an image caption for the visually impaired",
)
def __init__(self, identity, *args, **kwargs):
super().__init__(*args, **kwargs)
self.request = request
self.identity = identity
self.fields["text"].widget.attrs[
"_"
] = rf"""
@ -83,7 +106,7 @@ class Compose(FormView):
def clean_text(self):
text = self.cleaned_data.get("text")
# Check minimum interval
last_post = self.request.identity.posts.order_by("-created").first()
last_post = self.identity.posts.order_by("-created").first()
if (
last_post
and (timezone.now() - last_post.created).total_seconds()
@ -102,181 +125,75 @@ class Compose(FormView):
)
return text
def clean_image(self):
value = self.cleaned_data.get("image")
if value:
max_mb = settings.SETUP.MEDIA_MAX_IMAGE_FILESIZE_MB
max_bytes = max_mb * 1024 * 1024
if value.size > max_bytes:
# Erase the file from our data to stop trying to show it again
self.files = {}
raise forms.ValidationError(
f"File must be {max_mb}MB or less (actual: {value.size / 1024 ** 2:.2f})"
)
return value
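The MB-to-bytes conversion in `clean_image` above is easy to get wrong by a factor of 1000 vs 1024. A standalone sketch of the same check (the 10 MB default here is an assumed example, not the project's configured limit):

```python
def exceeds_limit(size_bytes: int, max_mb: int = 10) -> tuple[bool, str]:
    """Same MB-to-bytes conversion as clean_image above; max_mb default is illustrative."""
    max_bytes = max_mb * 1024 * 1024
    if size_bytes > max_bytes:
        return True, f"File must be {max_mb}MB or less (actual: {size_bytes / 1024 ** 2:.2f}MB)"
    return False, ""
```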
def get_form(self, form_class=None):
return self.form_class(request=self.request, **self.get_form_kwargs())
return self.form_class(identity=self.identity, **self.get_form_kwargs())
def get_initial(self):
initial = super().get_initial()
if self.post_obj:
initial.update(
{
"reply_to": self.reply_to.pk if self.reply_to else "",
"visibility": self.post_obj.visibility,
"text": FediverseHtmlParser(self.post_obj.content).plain_text,
"content_warning": self.post_obj.summary,
}
)
else:
initial[
"visibility"
] = self.request.identity.config_identity.default_post_visibility
if self.reply_to:
initial["reply_to"] = self.reply_to.pk
if self.reply_to.visibility == Post.Visibilities.public:
initial["visibility"] = Post.Visibilities.unlisted
else:
initial["visibility"] = self.reply_to.visibility
initial["content_warning"] = self.reply_to.summary
# Build a set of mentions for the content to start as
mentioned = {self.reply_to.author}
mentioned.update(self.reply_to.mentions.all())
mentioned.discard(self.request.identity)
initial["text"] = "".join(
f"@{identity.handle} "
for identity in mentioned
if identity.username
)
initial["visibility"] = self.identity.config_identity.default_post_visibility
return initial
def form_valid(self, form):
# Gather any attachment objects now, they're not in the form proper
# See if we need to make an image attachment
attachments = []
if "attachment" in self.request.POST:
attachments = PostAttachment.objects.filter(
pk__in=self.request.POST.getlist("attachment", [])
if form.cleaned_data.get("image"):
main_file = resize_image(
form.cleaned_data["image"],
size=(2000, 2000),
cover=False,
)
# Dispatch based on edit or not
if self.post_obj:
self.post_obj.edit_local(
content=form.cleaned_data["text"],
summary=form.cleaned_data.get("content_warning"),
visibility=form.cleaned_data["visibility"],
attachments=attachments,
thumbnail_file = resize_image(
form.cleaned_data["image"],
size=(400, 225),
cover=True,
)
self.post_obj.transition_perform(PostStates.edited)
else:
post = Post.create_local(
author=self.request.identity,
content=form.cleaned_data["text"],
summary=form.cleaned_data.get("content_warning"),
visibility=form.cleaned_data["visibility"],
reply_to=self.reply_to,
attachments=attachments,
attachment = PostAttachment.objects.create(
blurhash=blurhash_image(thumbnail_file),
mimetype="image/webp",
width=main_file.image.width,
height=main_file.image.height,
name=form.cleaned_data.get("image_caption"),
state=PostAttachmentStates.fetched,
author=self.identity,
)
# Add their own timeline event for immediate visibility
TimelineEvent.add_post(self.request.identity, post)
return redirect("/")
def dispatch(self, request, handle=None, post_id=None, *args, **kwargs):
self.post_obj = None
if handle and post_id:
# Make sure the request identity owns the post!
if handle != request.identity.handle:
raise PermissionDenied("Post author is not requestor")
self.post_obj = get_object_or_404(request.identity.posts, pk=post_id)
# Grab the reply-to post info now
self.reply_to = None
reply_to_id = request.POST.get("reply_to") or request.GET.get("reply_to")
if reply_to_id:
try:
self.reply_to = Post.objects.get(pk=reply_to_id)
except Post.DoesNotExist:
pass
# Keep going with normal rendering
return super().dispatch(request, *args, **kwargs)
attachment.file.save(
main_file.name,
main_file,
)
attachment.thumbnail.save(
thumbnail_file.name,
thumbnail_file,
)
attachment.save()
attachments.append(attachment)
# Create the post
post = Post.create_local(
author=self.identity,
content=form.cleaned_data["text"],
summary=form.cleaned_data.get("content_warning"),
visibility=form.cleaned_data["visibility"],
attachments=attachments,
)
# Add their own timeline event for immediate visibility
TimelineEvent.add_post(self.identity, post)
messages.success(self.request, "Your post was created.")
return redirect(".")
def get_context_data(self, **kwargs):
context = super().get_context_data(**kwargs)
context["reply_to"] = self.reply_to
if self.post_obj:
context["post"] = self.post_obj
context["identity"] = self.identity
context["section"] = "compose"
return context
@method_decorator(identity_required, name="dispatch")
class ImageUpload(FormView):
"""
Handles image upload - returns a new input type hidden to embed in
the main form that references an orphaned PostAttachment
"""
template_name = "activities/_image_upload.html"
class form_class(forms.Form):
image = forms.ImageField(
widget=forms.FileInput(
attrs={
"_": f"""
on change
if me.files[0].size > {settings.SETUP.MEDIA_MAX_IMAGE_FILESIZE_MB * 1024 ** 2}
add [@disabled=] to #upload
remove <ul.errorlist/>
make <ul.errorlist/> called errorlist
make <li/> called error
set size_in_mb to (me.files[0].size / 1024 / 1024).toFixed(2)
put 'File must be {settings.SETUP.MEDIA_MAX_IMAGE_FILESIZE_MB}MB or less (actual: ' + size_in_mb + 'MB)' into error
put error into errorlist
put errorlist before me
else
remove @disabled from #upload
remove <ul.errorlist/>
end
end
"""
}
)
)
description = forms.CharField(required=False)
def clean_image(self):
value = self.cleaned_data["image"]
max_mb = settings.SETUP.MEDIA_MAX_IMAGE_FILESIZE_MB
max_bytes = max_mb * 1024 * 1024
if value.size > max_bytes:
# Erase the file from our data to stop trying to show it again
self.files = {}
raise forms.ValidationError(
f"File must be {max_mb}MB or less (actual: {value.size / 1024 ** 2:.2f})"
)
return value
def form_invalid(self, form):
return super().form_invalid(form)
def form_valid(self, form):
# Make a PostAttachment
main_file = resize_image(
form.cleaned_data["image"],
size=(2000, 2000),
cover=False,
)
thumbnail_file = resize_image(
form.cleaned_data["image"],
size=(400, 225),
cover=True,
)
attachment = PostAttachment.objects.create(
blurhash=blurhash_image(thumbnail_file),
mimetype="image/webp",
width=main_file.image.width,
height=main_file.image.height,
name=form.cleaned_data.get("description"),
state=PostAttachmentStates.fetched,
)
attachment.file.save(
main_file.name,
main_file,
)
attachment.thumbnail.save(
thumbnail_file.name,
thumbnail_file,
)
attachment.save()
# Return the response, with a hidden input plus a note
return render(
self.request, "activities/_image_uploaded.html", {"attachment": attachment}
)


@ -1,7 +1,6 @@
import json
import httpx
from asgiref.sync import async_to_sync
from django import forms
from django.utils.decorators import method_decorator
from django.views.generic import FormView, TemplateView
@ -13,7 +12,6 @@ from users.models import SystemActor
@method_decorator(admin_required, name="dispatch")
class JsonViewer(FormView):
template_name = "activities/debug_json.html"
class form_class(forms.Form):
@ -31,7 +29,7 @@ class JsonViewer(FormView):
context = self.get_context_data(form=form)
try:
response = async_to_sync(SystemActor().signed_request)(
response = SystemActor().signed_request(
method="get",
uri=uri,
)
@ -56,25 +54,23 @@ class JsonViewer(FormView):
except json.JSONDecodeError as ex:
result = str(ex)
else:
result = json.dumps(document, indent=4, sort_keys=True)
context["raw_result"] = json.dumps(response.json(), indent=2)
result = json.dumps(document, indent=2, sort_keys=True)
# result = pprint.pformat(document)
context["result"] = result
return self.render_to_response(context)
class NotFound(TemplateView):
template_name = "404.html"
class ServerError(TemplateView):
template_name = "500.html"
@method_decorator(admin_required, name="dispatch")
class OauthAuthorize(TemplateView):
template_name = "api/oauth_authorize.html"
def get_context_data(self):


@ -1,26 +0,0 @@
from django.views.generic import ListView
from activities.models import Hashtag
class ExploreTag(ListView):
template_name = "activities/explore_tag.html"
extra_context = {
"current_page": "explore",
"allows_refresh": True,
}
paginate_by = 20
def get_queryset(self):
return (
Hashtag.objects.public()
.filter(
stats__total__gt=0,
)
.order_by("-stats__total")
)[:20]
class Explore(ExploreTag):
pass


@ -1,15 +1,13 @@
from django.core.exceptions import PermissionDenied
from django.http import Http404, JsonResponse
from django.shortcuts import get_object_or_404, redirect, render
from django.shortcuts import get_object_or_404, redirect
from django.utils.decorators import method_decorator
from django.views.decorators.vary import vary_on_headers
from django.views.generic import TemplateView, View
from django.views.generic import TemplateView
from activities.models import Post, PostInteraction, PostStates
from activities.models import Post, PostStates
from activities.services import PostService
from core.decorators import cache_page_by_ap_json
from core.ld import canonicalise
from users.decorators import identity_required
from users.models import Identity
from users.shortcuts import by_handle_or_404
@ -19,7 +17,6 @@ from users.shortcuts import by_handle_or_404
)
@method_decorator(vary_on_headers("Accept"), name="dispatch")
class Individual(TemplateView):
template_name = "activities/post.html"
identity: Identity
@ -32,7 +29,7 @@ class Individual(TemplateView):
self.post_obj = get_object_or_404(
PostService.queryset()
.filter(author=self.identity)
.visible_to(request.identity, include_replies=True),
.unlisted(include_replies=True),
pk=post_id,
)
if self.post_obj.state in [PostStates.deleted, PostStates.deleted_fanned_out]:
@ -49,20 +46,17 @@ class Individual(TemplateView):
context = super().get_context_data(**kwargs)
ancestors, descendants = PostService(self.post_obj).context(
self.request.identity
identity=None, num_ancestors=2
)
context.update(
{
"identity": self.identity,
"post": self.post_obj,
"interactions": PostInteraction.get_post_interactions(
[self.post_obj] + ancestors + descendants,
self.request.identity,
),
"link_original": True,
"ancestors": ancestors,
"descendants": descendants,
"public_styling": True,
}
)
@ -76,95 +70,3 @@ class Individual(TemplateView):
canonicalise(self.post_obj.to_ap(), include_security=True),
content_type="application/activity+json",
)
@method_decorator(identity_required, name="dispatch")
class Like(View):
"""
Adds/removes a like from the current identity to the post
"""
undo = False
def post(self, request, handle, post_id):
identity = by_handle_or_404(self.request, handle, local=False)
post = get_object_or_404(
PostService.queryset()
.filter(author=identity)
.visible_to(request.identity, include_replies=True),
pk=post_id,
)
service = PostService(post)
if self.undo:
service.unlike_as(request.identity)
else:
service.like_as(request.identity)
# Return either a redirect or a HTMX snippet
if request.htmx:
return render(
request,
"activities/_like.html",
{
"post": post,
"interactions": {"like": set() if self.undo else {post.pk}},
},
)
return redirect(post.urls.view)
@method_decorator(identity_required, name="dispatch")
class Boost(View):
"""
Adds/removes a boost from the current identity to the post
"""
undo = False
def post(self, request, handle, post_id):
identity = by_handle_or_404(self.request, handle, local=False)
post = get_object_or_404(
PostService.queryset()
.filter(author=identity)
.visible_to(request.identity, include_replies=True),
pk=post_id,
)
service = PostService(post)
if self.undo:
service.unboost_as(request.identity)
else:
service.boost_as(request.identity)
# Return either a redirect or a HTMX snippet
if request.htmx:
return render(
request,
"activities/_boost.html",
{
"post": post,
"interactions": {"boost": set() if self.undo else {post.pk}},
},
)
return redirect(post.urls.view)
@method_decorator(identity_required, name="dispatch")
class Delete(TemplateView):
"""
Deletes a post
"""
template_name = "activities/post_delete.html"
def dispatch(self, request, handle, post_id):
# Make sure the request identity owns the post!
if handle != request.identity.handle:
raise PermissionDenied("Post author is not requestor")
self.identity = by_handle_or_404(self.request, handle, local=False)
self.post_obj = get_object_or_404(self.identity.posts, pk=post_id)
return super().dispatch(request)
def get_context_data(self):
return {"post": self.post_obj}
def post(self, request):
PostService(self.post_obj).delete()
return redirect("/")


@ -1,22 +0,0 @@
from django import forms
from django.views.generic import FormView
from activities.services import SearchService
class Search(FormView):
template_name = "activities/search.html"
class form_class(forms.Form):
query = forms.CharField(
help_text="Search for:\nA user by @username@domain or their profile URL\nA hashtag by #tagname\nA post by its URL",
widget=forms.TextInput(attrs={"type": "search", "autofocus": "autofocus"}),
)
def form_valid(self, form):
searcher = SearchService(form.cleaned_data["query"], self.request.identity)
# Render results
context = self.get_context_data(form=form)
context["results"] = searcher.search_all()
return self.render_to_response(context)


@ -1,49 +1,35 @@
from django.core.paginator import Paginator
from django.contrib.auth.decorators import login_required
from django.shortcuts import get_object_or_404, redirect
from django.utils.decorators import method_decorator
from django.views.generic import ListView, TemplateView
from activities.models import Hashtag, PostInteraction, TimelineEvent
from activities.models import Hashtag, TimelineEvent
from activities.services import TimelineService
from core.decorators import cache_page
from users.decorators import identity_required
from .compose import Compose
from users.models import Identity
from users.views.base import IdentityViewMixin
@method_decorator(identity_required, name="dispatch")
@method_decorator(login_required, name="dispatch")
class Home(TemplateView):
"""
Homepage for logged-in users - shows identities primarily.
"""
template_name = "activities/home.html"
form_class = Compose.form_class
def get_form(self, form_class=None):
return self.form_class(request=self.request, **self.get_form_kwargs())
def get_context_data(self):
events = TimelineService(self.request.identity).home()
paginator = Paginator(events, 25)
page_number = self.request.GET.get("page")
event_page = paginator.get_page(page_number)
context = {
"interactions": PostInteraction.get_event_interactions(
event_page,
self.request.identity,
),
"current_page": "home",
"allows_refresh": True,
"page_obj": event_page,
"form": self.form_class(request=self.request),
return {
"identities": Identity.objects.filter(
users__pk=self.request.user.pk
).order_by("created"),
}
return context
@method_decorator(
cache_page("cache_timeout_page_timeline", public_only=True), name="dispatch"
)
class Tag(ListView):
template_name = "activities/tag.html"
extra_context = {
"current_page": "tag",
@ -60,64 +46,15 @@ class Tag(ListView):
return super().get(request, *args, **kwargs)
def get_queryset(self):
return TimelineService(self.request.identity).hashtag(self.hashtag)
return TimelineService(None).hashtag(self.hashtag)
def get_context_data(self):
context = super().get_context_data()
context["hashtag"] = self.hashtag
context["interactions"] = PostInteraction.get_post_interactions(
context["page_obj"], self.request.identity
)
return context
@method_decorator(
cache_page("cache_timeout_page_timeline", public_only=True), name="dispatch"
)
class Local(ListView):
template_name = "activities/local.html"
extra_context = {
"current_page": "local",
"allows_refresh": True,
}
paginate_by = 25
def get_queryset(self):
return TimelineService(self.request.identity).local()
def get_context_data(self):
context = super().get_context_data()
context["interactions"] = PostInteraction.get_post_interactions(
context["page_obj"], self.request.identity
)
return context
@method_decorator(identity_required, name="dispatch")
class Federated(ListView):
template_name = "activities/federated.html"
extra_context = {
"current_page": "federated",
"allows_refresh": True,
}
paginate_by = 25
def get_queryset(self):
return TimelineService(self.request.identity).federated()
def get_context_data(self):
context = super().get_context_data()
context["interactions"] = PostInteraction.get_post_interactions(
context["page_obj"], self.request.identity
)
return context
@method_decorator(identity_required, name="dispatch")
class Notifications(ListView):
class Notifications(IdentityViewMixin, ListView):
template_name = "activities/notifications.html"
extra_context = {
"current_page": "notifications",
@ -129,7 +66,6 @@ class Notifications(ListView):
"boosted": TimelineEvent.Types.boosted,
"mentioned": TimelineEvent.Types.mentioned,
"liked": TimelineEvent.Types.liked,
"identity_created": TimelineEvent.Types.identity_created,
}
def get_queryset(self):
@ -147,7 +83,7 @@ class Notifications(ListView):
for type_name, type in self.notification_types.items():
if notification_options.get(type_name, True):
types.append(type)
return TimelineService(self.request.identity).notifications(types)
return TimelineService(self.identity).notifications(types)
def get_context_data(self, **kwargs):
context = super().get_context_data(**kwargs)
@ -168,9 +104,6 @@ class Notifications(ListView):
events.append(event)
# Retrieve what kinds of things to show
context["events"] = events
context["identity"] = self.identity
context["notification_options"] = self.request.session["notification_options"]
context["interactions"] = PostInteraction.get_event_interactions(
context["page_obj"],
self.request.identity,
)
return context


@ -6,8 +6,7 @@ from django.http import JsonResponse
def identity_required(function):
"""
API version of the identity_required decorator that just makes sure the
token is tied to one, not an app only.
Makes sure the token is tied to an identity, not an app only.
"""
@wraps(function)
@ -32,12 +31,18 @@ def scope_required(scope: str, requires_identity=True):
@wraps(function)
def inner(request, *args, **kwargs):
if not request.token:
return JsonResponse({"error": "identity_token_required"}, status=401)
if request.identity:
# They're just logged in via cookie - give full access
pass
else:
return JsonResponse(
{"error": "identity_token_required"}, status=401
)
elif not request.token.has_scope(scope):
return JsonResponse({"error": "out_of_scope_for_token"}, status=403)
# They need an identity
if not request.identity and requires_identity:
return JsonResponse({"error": "identity_token_required"}, status=401)
if not request.token.has_scope(scope):
return JsonResponse({"error": "out_of_scope_for_token"}, status=403)
return function(request, *args, **kwargs)
inner.csrf_exempt = True # type:ignore


@ -15,15 +15,20 @@ class ApiTokenMiddleware:
def __call__(self, request):
auth_header = request.headers.get("authorization", None)
request.token = None
request.identity = None
if auth_header and auth_header.startswith("Bearer "):
token_value = auth_header[7:]
try:
token = Token.objects.get(token=token_value, revoked=None)
except Token.DoesNotExist:
return HttpResponse("Invalid Bearer token", status=400)
request.user = token.user
request.identity = token.identity
request.token = token
if token_value == "__app__":
# Special client app token value
pass
else:
try:
token = Token.objects.get(token=token_value, revoked=None)
except Token.DoesNotExist:
return HttpResponse("Invalid Bearer token", status=400)
request.user = token.user
request.identity = token.identity
request.token = token
request.session = None
response = self.get_response(request)
return response


@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0008_follow_boosts"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),


@ -0,0 +1,17 @@
# Generated by Django 4.2.1 on 2023-07-15 17:40
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0002_remove_token_code_token_revoked_alter_token_token_and_more"),
]
operations = [
migrations.AddField(
model_name="token",
name="push_subscription",
field=models.JSONField(blank=True, null=True),
),
]


@ -1,3 +1,5 @@
import secrets
from django.db import models
@ -17,3 +19,23 @@ class Application(models.Model):
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
@classmethod
def create(
cls,
client_name: str,
redirect_uris: str,
website: str | None,
scopes: str | None = None,
):
client_id = "tk-" + secrets.token_urlsafe(16)
client_secret = secrets.token_urlsafe(40)
return cls.objects.create(
name=client_name,
website=website,
client_id=client_id,
client_secret=client_secret,
redirect_uris=redirect_uris,
scopes=scopes or "read",
)
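`Application.create` above builds OAuth client credentials from the standard-library `secrets` module. The credential generation in isolation, as a sketch with a hypothetical helper name:

```python
import secrets

def make_client_credentials() -> tuple[str, str]:
    # Same scheme as Application.create above: a "tk-" prefixed client id
    # (16 random bytes) and a longer client secret (40 random bytes).
    client_id = "tk-" + secrets.token_urlsafe(16)
    client_secret = secrets.token_urlsafe(40)
    return client_id, client_secret
```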


@ -1,4 +1,21 @@
import urlman
from django.db import models
from pydantic import BaseModel
class PushSubscriptionSchema(BaseModel):
"""
Basic validating schema for push data
"""
class Keys(BaseModel):
p256dh: str
auth: str
endpoint: str
keys: Keys
alerts: dict[str, bool]
policy: str
class Token(models.Model):
@ -37,6 +54,11 @@ class Token(models.Model):
updated = models.DateTimeField(auto_now=True)
revoked = models.DateTimeField(blank=True, null=True)
push_subscription = models.JSONField(blank=True, null=True)
class urls(urlman.Urls):
edit = "/@{self.identity.handle}/settings/tokens/{self.id}/"
def has_scope(self, scope: str):
"""
Returns if this token has the given scope.
@ -45,3 +67,8 @@ class Token(models.Model):
# TODO: Support granular scopes the other way?
scope_prefix = scope.split(":")[0]
return (scope in self.scopes) or (scope_prefix in self.scopes)
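The prefix rule in `Token.has_scope` above (a token holding the broad `read` scope satisfies the granular `read:statuses`) can be sketched as a free function over a plain scope list:

```python
def has_scope(token_scopes: list[str], scope: str) -> bool:
    # Mirrors Token.has_scope above: a granular scope like "read:statuses"
    # is satisfied by either the exact scope or its "read" prefix.
    scope_prefix = scope.split(":")[0]
    return (scope in token_scopes) or (scope_prefix in token_scopes)
```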
def set_push_subscription(self, data: dict):
# Validate schema and assign
self.push_subscription = PushSubscriptionSchema(**data).dict()
self.save()
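`set_push_subscription` above leans on the pydantic `PushSubscriptionSchema` to reject malformed payloads before saving. A dependency-free sketch of the same shape check, with a hypothetical function name (the real code validates via pydantic, not hand-rolled checks):

```python
def validate_push_subscription(data: dict) -> dict:
    """Plain-dict sketch of PushSubscriptionSchema: require the same fields it declares."""
    keys = data["keys"]
    if not isinstance(keys.get("p256dh"), str) or not isinstance(keys.get("auth"), str):
        raise ValueError("keys.p256dh and keys.auth must be strings")
    if not isinstance(data.get("endpoint"), str):
        raise ValueError("endpoint must be a string")
    alerts = data.get("alerts")
    if not isinstance(alerts, dict) or not all(isinstance(v, bool) for v in alerts.values()):
        raise ValueError("alerts must map names to booleans")
    if not isinstance(data.get("policy"), str):
        raise ValueError("policy must be a string")
    return {"endpoint": data["endpoint"], "keys": keys, "alerts": alerts, "policy": data["policy"]}
```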


@ -4,10 +4,11 @@ from collections.abc import Callable
from typing import Any, Generic, Protocol, TypeVar
from django.db import models
from django.db.models.expressions import Case, F, When
from django.http import HttpRequest
from hatchway.http import ApiResponse
from activities.models import PostInteraction
from activities.models import PostInteraction, TimelineEvent
T = TypeVar("T")
@ -75,8 +76,8 @@ class PaginatingApiResponse(ApiResponse[list[TI]]):
parts = [
entry
for entry in [
self.get_part(0, "min_id", "prev"),
self.get_part(-1, "max_id", "next"),
self.get_part(0, "min_id", "prev"),
]
if entry
]
@ -215,62 +216,45 @@ class MastodonPaginator:
max_id: str | None,
since_id: str | None,
limit: int | None,
home: bool = False,
) -> PaginationResult[TM]:
limit = min(limit or self.default_limit, self.max_limit)
filters = {}
id_field = "id"
reverse = False
if home:
# The home timeline interleaves Post IDs and PostInteraction IDs in an
# annotated field called "subject_id".
id_field = "subject_id"
queryset = queryset.annotate(
subject_id=Case(
When(type=TimelineEvent.Types.post, then=F("subject_post_id")),
default=F("subject_post_interaction"),
)
)
# These "does not start with interaction" checks can be removed after a
# couple months, when clients have flushed them out.
if max_id and not max_id.startswith("interaction"):
queryset = queryset.filter(id__lt=max_id)
filters[f"{id_field}__lt"] = max_id
if since_id and not since_id.startswith("interaction"):
queryset = queryset.filter(id__gt=since_id)
filters[f"{id_field}__gt"] = since_id
if min_id and not min_id.startswith("interaction"):
# Min ID requires items _immediately_ newer than specified, so we
# invert the ordering to accommodate
queryset = queryset.filter(id__gt=min_id).order_by("id")
else:
queryset = queryset.order_by("-id")
filters[f"{id_field}__gt"] = min_id
reverse = True
# Default is to order by ID descending (newest first), except for min_id
# queries, which should order by ID for limiting, then reverse the results to be
# consistent. The clearest explanation of this I've found so far is this:
# https://mastodon.social/@Gargron/100846335353411164
ordering = id_field if reverse else f"-{id_field}"
results = list(queryset.filter(**filters).order_by(ordering)[:limit])
if reverse:
results.reverse()
limit = min(limit or self.default_limit, self.max_limit)
return PaginationResult(
results=list(queryset[:limit]),
limit=limit,
)
def paginate_home(
self,
queryset,
min_id: str | None,
max_id: str | None,
since_id: str | None,
limit: int | None,
) -> PaginationResult:
"""
The home timeline requires special handling where we mix Posts and
PostInteractions together.
"""
if max_id and not max_id.startswith("interaction"):
queryset = queryset.filter(
models.Q(subject_post_id__lt=max_id)
| models.Q(subject_post_interaction_id__lt=max_id)
)
if since_id and not since_id.startswith("interaction"):
queryset = queryset.filter(
models.Q(subject_post_id__gt=since_id)
| models.Q(subject_post_interaction_id__gt=since_id)
)
if min_id and not min_id.startswith("interaction"):
# Min ID requires items _immediately_ newer than specified, so we
# invert the ordering to accommodate
queryset = queryset.filter(
models.Q(subject_post_id__gt=min_id)
| models.Q(subject_post_interaction_id__gt=min_id)
).order_by("id")
else:
queryset = queryset.order_by("-id")
limit = min(limit or self.default_limit, self.max_limit)
return PaginationResult(
results=list(queryset[:limit]),
results=results,
limit=limit,
)
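The `min_id` inversion the paginator's comments describe (take the items *immediately* newer than `min_id` in ascending order, then flip back to newest-first; see the linked Gargron post) can be modeled with plain integer IDs as a toy stand-in for the queryset:

```python
def paginate_ids(ids, min_id=None, max_id=None, limit=20):
    # Toy model of the paginator above: integer IDs, larger = newer.
    if max_id is not None:
        ids = [i for i in ids if i < max_id]
    if min_id is not None:
        # min_id wants the items immediately newer than min_id, so order
        # ascending for the limit slice, then reverse back to newest-first.
        page = sorted(i for i in ids if i > min_id)[:limit]
        page.reverse()
        return page
    return sorted(ids, reverse=True)[:limit]
```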


@ -1,8 +1,10 @@
from typing import Literal, Optional, Union
from django.conf import settings
from hatchway import Field, Schema
from activities import models as activities_models
from api import models as api_models
from core.html import FediverseHtmlParser
from users import models as users_models
from users.services import IdentityService
@ -15,6 +17,23 @@ class Application(Schema):
client_id: str
client_secret: str
redirect_uri: str = Field(alias="redirect_uris")
vapid_key: str | None
@classmethod
def from_application(cls, application: api_models.Application) -> "Application":
instance = cls.from_orm(application)
instance.vapid_key = settings.SETUP.VAPID_PUBLIC_KEY
return instance
@classmethod
def from_application_no_keys(
cls, application: api_models.Application
) -> "Application":
instance = cls.from_orm(application)
instance.vapid_key = settings.SETUP.VAPID_PUBLIC_KEY
instance.client_id = ""
instance.client_secret = ""
return instance
class CustomEmoji(Schema):
@ -165,10 +184,15 @@ class Status(Schema):
cls,
post: activities_models.Post,
interactions: dict[str, set[str]] | None = None,
bookmarks: set[str] | None = None,
identity: users_models.Identity | None = None,
) -> "Status":
return cls(
**post.to_mastodon_json(interactions=interactions, identity=identity)
**post.to_mastodon_json(
interactions=interactions,
bookmarks=bookmarks,
identity=identity,
)
)
@classmethod
@ -180,8 +204,14 @@ class Status(Schema):
interactions = activities_models.PostInteraction.get_post_interactions(
posts, identity
)
bookmarks = users_models.Bookmark.for_identity(identity, posts)
return [
cls.from_post(post, interactions=interactions, identity=identity)
cls.from_post(
post,
interactions=interactions,
bookmarks=bookmarks,
identity=identity,
)
for post in posts
]
@ -190,11 +220,12 @@ class Status(Schema):
cls,
timeline_event: activities_models.TimelineEvent,
interactions: dict[str, set[str]] | None = None,
bookmarks: set[str] | None = None,
identity: users_models.Identity | None = None,
) -> "Status":
return cls(
**timeline_event.to_mastodon_status_json(
interactions=interactions, identity=identity
interactions=interactions, bookmarks=bookmarks, identity=identity
)
)
@ -207,8 +238,13 @@ class Status(Schema):
interactions = activities_models.PostInteraction.get_event_interactions(
events, identity
)
bookmarks = users_models.Bookmark.for_identity(
identity, events, "subject_post_id"
)
return [
cls.from_timeline_event(event, interactions=interactions, identity=identity)
cls.from_timeline_event(
event, interactions=interactions, bookmarks=bookmarks, identity=identity
)
for event in events
]
@ -256,21 +292,50 @@ class Notification(Schema):
def from_timeline_event(
cls,
event: activities_models.TimelineEvent,
interactions=None,
) -> "Notification":
return cls(**event.to_mastodon_notification_json())
return cls(**event.to_mastodon_notification_json(interactions=interactions))
class Tag(Schema):
name: str
url: str
history: dict
history: list
following: bool | None
@classmethod
def from_hashtag(
cls,
hashtag: activities_models.Hashtag,
following: bool | None = None,
) -> "Tag":
return cls(**hashtag.to_mastodon_json())
return cls(**hashtag.to_mastodon_json(following=following))
class FollowedTag(Tag):
id: str
@classmethod
def from_follow(
cls,
follow: users_models.HashtagFollow,
) -> "FollowedTag":
return cls(id=follow.id, **follow.hashtag.to_mastodon_json(following=True))
@classmethod
def map_from_follows(
cls,
hashtag_follows: list[users_models.HashtagFollow],
) -> list["Tag"]:
return [cls.from_follow(follow) for follow in hashtag_follows]
class FeaturedTag(Schema):
id: str
name: str
url: str
statuses_count: int
last_status_at: str
class Search(Schema):
@ -337,3 +402,104 @@ class Announcement(Schema):
user: users_models.User,
) -> "Announcement":
return cls(**announcement.to_mastodon_json(user=user))
class List(Schema):
id: str
title: str
replies_policy: Literal[
"followed",
"list",
"none",
]
class Preferences(Schema):
posting_default_visibility: Literal[
"public",
"unlisted",
"private",
"direct",
] = Field(alias="posting:default:visibility")
posting_default_sensitive: bool = Field(alias="posting:default:sensitive")
posting_default_language: str | None = Field(alias="posting:default:language")
reading_expand_media: Literal[
"default",
"show_all",
"hide_all",
] = Field(alias="reading:expand:media")
reading_expand_spoilers: bool = Field(alias="reading:expand:spoilers")
@classmethod
def from_identity(
cls,
identity: users_models.Identity,
) -> "Preferences":
visibility_mapping = {
activities_models.Post.Visibilities.public: "public",
activities_models.Post.Visibilities.unlisted: "unlisted",
activities_models.Post.Visibilities.followers: "private",
activities_models.Post.Visibilities.mentioned: "direct",
activities_models.Post.Visibilities.local_only: "public",
}
return cls.parse_obj(
{
"posting:default:visibility": visibility_mapping[
identity.config_identity.default_post_visibility
],
"posting:default:sensitive": False,
"posting:default:language": None,
"reading:expand:media": "default",
"reading:expand:spoilers": identity.config_identity.expand_content_warnings,
}
)
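The `from_identity` conversion above boils down to a dictionary translation keyed by Mastodon's colon-separated preference aliases. A standalone sketch of that mapping (plain Python, visibility names copied from the diff; no Django involved):

```python
# The mapping collapses Takahē's five post visibilities onto Mastodon's
# four preference values, as in Preferences.from_identity above.
VISIBILITY_MAPPING = {
    "public": "public",
    "unlisted": "unlisted",
    "followers": "private",
    "mentioned": "direct",
    "local_only": "public",  # Mastodon's vocabulary has no local-only value
}


def preferences_json(default_visibility: str, expand_content_warnings: bool) -> dict:
    """Build the alias-keyed payload that Mastodon clients expect."""
    return {
        "posting:default:visibility": VISIBILITY_MAPPING[default_visibility],
        "posting:default:sensitive": False,
        "posting:default:language": None,
        "reading:expand:media": "default",
        "reading:expand:spoilers": expand_content_warnings,
    }
```

Note that `local_only` deliberately serializes as `public`, since there is no closer equivalent in the Mastodon preferences vocabulary.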
class PushSubscriptionKeys(Schema):
p256dh: str
auth: str
class PushSubscriptionCreation(Schema):
endpoint: str
keys: PushSubscriptionKeys
class PushDataAlerts(Schema):
mention: bool = False
status: bool = False
reblog: bool = False
follow: bool = False
follow_request: bool = False
favourite: bool = False
poll: bool = False
update: bool = False
admin_sign_up: bool = Field(False, alias="admin.sign_up")
admin_report: bool = Field(False, alias="admin.report")
class PushData(Schema):
alerts: PushDataAlerts
policy: Literal["all", "followed", "follower", "none"] = "all"
class PushSubscription(Schema):
id: str
endpoint: str
alerts: PushDataAlerts
policy: str
server_key: str
@classmethod
def from_token(
cls,
token: api_models.Token,
) -> Optional["PushSubscription"]:
value = token.push_subscription
if value:
value["id"] = "1"
value["server_key"] = settings.SETUP.VAPID_PUBLIC_KEY
del value["keys"]
return value
else:
return None


@@ -5,14 +5,21 @@ from api.views import (
accounts,
announcements,
apps,
bookmarks,
emoji,
filters,
follow_requests,
instance,
lists,
media,
notifications,
polls,
preferences,
push,
search,
statuses,
suggestions,
tags,
timelines,
trends,
)
@@ -35,19 +42,31 @@ urlpatterns = [
path("v1/accounts/<id>/unmute", accounts.account_unmute),
path("v1/accounts/<id>/following", accounts.account_following),
path("v1/accounts/<id>/followers", accounts.account_followers),
path("v1/accounts/<id>/featured_tags", accounts.account_featured_tags),
# Announcements
path("v1/announcements", announcements.announcement_list),
path("v1/announcements/<pk>/dismiss", announcements.announcement_dismiss),
# Apps
path("v1/apps", apps.add_app),
path("v1/apps/verify_credentials", apps.verify_credentials),
# Bookmarks
path("v1/bookmarks", bookmarks.bookmarks),
# Emoji
path("v1/custom_emojis", emoji.emojis),
# Filters
path("v2/filters", filters.list_filters),
path("v1/filters", filters.list_filters),
# Follow requests
path("v1/follow_requests", follow_requests.follow_requests),
path("v1/follow_requests/<id>/authorize", follow_requests.accept_follow_request),
path("v1/follow_requests/<id>/reject", follow_requests.reject_follow_request),
# Instance
path("v1/instance", instance.instance_info_v1),
path("v1/instance/activity", instance.activity),
path("v1/instance/peers", instance.peers),
path("v2/instance", instance.instance_info_v2),
# Lists
path("v1/lists", lists.get_lists),
# Media
path("v1/media", media.upload_media),
path("v2/media", media.upload_media),
@@ -63,10 +82,26 @@ urlpatterns = [
path("v1/statuses/<id>/source", statuses.status_source),
# Notifications
path("v1/notifications", notifications.notifications),
path("v1/notifications/clear", notifications.dismiss_notifications),
path("v1/notifications/<id>", notifications.get_notification),
path("v1/notifications/<id>/dismiss", notifications.dismiss_notification),
# Polls
path("v1/polls/<id>", polls.get_poll),
path("v1/polls/<id>/votes", polls.vote_poll),
# Preferences
path("v1/preferences", preferences.preferences),
# Push
path(
"v1/push/subscription",
methods(
get=push.get_subscription,
post=push.create_subscription,
put=push.update_subscription,
delete=push.delete_subscription,
),
),
# Search
path("v1/search", search.search),
path("v2/search", search.search),
# Statuses
path("v1/statuses", statuses.post_status),
@@ -76,6 +111,16 @@ urlpatterns = [
path("v1/statuses/<id>/favourited_by", statuses.favourited_by),
path("v1/statuses/<id>/reblog", statuses.reblog_status),
path("v1/statuses/<id>/unreblog", statuses.unreblog_status),
path("v1/statuses/<id>/reblogged_by", statuses.reblogged_by),
path("v1/statuses/<id>/bookmark", statuses.bookmark_status),
path("v1/statuses/<id>/unbookmark", statuses.unbookmark_status),
path("v1/statuses/<id>/pin", statuses.pin_status),
path("v1/statuses/<id>/unpin", statuses.unpin_status),
# Tags
path("v1/followed_tags", tags.followed_tags),
path("v1/tags/<hashtag>", tags.hashtag),
path("v1/tags/<id>/follow", tags.follow),
path("v1/tags/<id>/unfollow", tags.unfollow),
# Timelines
path("v1/timelines/home", timelines.home),
path("v1/timelines/public", timelines.public),
@@ -86,4 +131,6 @@ urlpatterns = [
path("v1/trends/tags", trends.trends_tags),
path("v1/trends/statuses", trends.trends_statuses),
path("v1/trends/links", trends.trends_links),
# Suggestions
path("v2/suggestions", suggestions.suggested_users),
]


@@ -5,13 +5,13 @@ from django.http import HttpRequest
from django.shortcuts import get_object_or_404
from hatchway import ApiResponse, QueryOrBody, api_view
-from activities.models import Post
+from activities.models import Post, PostInteraction, PostInteractionStates
from activities.services import SearchService
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
from core.models import Config
-from users.models import Identity
+from users.models import Identity, IdentityStates
from users.services import IdentityService
from users.shortcuts import by_handle_or_404
@@ -29,6 +29,7 @@ def update_credentials(
display_name: QueryOrBody[str | None] = None,
note: QueryOrBody[str | None] = None,
discoverable: QueryOrBody[bool | None] = None,
locked: QueryOrBody[bool | None] = None,
source: QueryOrBody[dict[str, Any] | None] = None,
fields_attributes: QueryOrBody[dict[str, dict[str, str]] | None] = None,
avatar: File | None = None,
@@ -42,6 +43,8 @@ def update_credentials(
service.set_summary(note)
if discoverable is not None:
identity.discoverable = discoverable
if locked is not None:
identity.manually_approves_followers = locked
if source:
if "privacy" in source:
privacy_map = {
@@ -70,15 +73,22 @@ def update_credentials(
if header:
service.set_image(header)
identity.save()
identity.transition_perform(IdentityStates.edited)
return schemas.Account.from_identity(identity, source=True)
@scope_required("read")
@api_view.get
-def account_relationships(request, id: list[str] | None) -> list[schemas.Relationship]:
+def account_relationships(
+    request, id: list[str] | str | None
+) -> list[schemas.Relationship]:
result = []
# ID is actually a list. Thanks Mastodon!
-ids = id or []
+if isinstance(id, str):
+    ids = [id]
+elif id is None:
+    ids = []
+else:
+    ids = id
for actual_id in ids:
identity = get_object_or_404(Identity, pk=actual_id)
result.append(
@@ -90,12 +100,17 @@ def account_relationships(request, id: list[str] | None) -> list[schemas.Relationship]:
@scope_required("read")
@api_view.get
def familiar_followers(
-request, id: list[str] | None
+request, id: list[str] | str | None
) -> list[schemas.FamiliarFollowers]:
"""
Returns people you follow that also follow given account IDs
"""
-ids = id or []
+if isinstance(id, str):
+    ids = [id]
+elif id is None:
+    ids = []
+else:
+    ids = id
result = []
for actual_id in ids:
target_identity = get_object_or_404(Identity, pk=actual_id)
@@ -189,7 +204,10 @@ def account_statuses(
.order_by("-created")
)
if pinned:
-return ApiResponse([])
+queryset = queryset.filter(
+    interactions__type=PostInteraction.Types.pin,
+    interactions__state__in=PostInteractionStates.group_active(),
+)
if only_media:
queryset = queryset.filter(attachments__pk__isnull=False)
if tagged:
@@ -349,3 +367,9 @@ def account_followers(
request=request,
include_params=["limit"],
)
@api_view.get
def account_featured_tags(request: HttpRequest, id: str) -> list[schemas.FeaturedTag]:
# Not implemented yet
return []


@@ -1,28 +1,30 @@
-import secrets
+from hatchway import QueryOrBody, api_view

-from hatchway import Schema, api_view
-
-from .. import schemas
-from ..models import Application
-
-
-class CreateApplicationSchema(Schema):
-    client_name: str
-    redirect_uris: str
-    scopes: None | str = None
-    website: None | str = None
+from api import schemas
+from api.decorators import scope_required
+from api.models import Application


 @api_view.post
-def add_app(request, details: CreateApplicationSchema) -> schemas.Application:
-    client_id = "tk-" + secrets.token_urlsafe(16)
-    client_secret = secrets.token_urlsafe(40)
-    application = Application.objects.create(
-        name=details.client_name,
-        website=details.website,
-        client_id=client_id,
-        client_secret=client_secret,
-        redirect_uris=details.redirect_uris,
-        scopes=details.scopes or "read",
+def add_app(
+    request,
+    client_name: QueryOrBody[str],
+    redirect_uris: QueryOrBody[str],
+    scopes: QueryOrBody[None | str] = None,
+    website: QueryOrBody[None | str] = None,
+) -> schemas.Application:
+    application = Application.create(
+        client_name=client_name,
+        website=website,
+        redirect_uris=redirect_uris,
+        scopes=scopes,
     )
-    return schemas.Application.from_orm(application)
+    return schemas.Application.from_application(application)
+
+
+@scope_required("read")
+@api_view.get
+def verify_credentials(
+    request,
+) -> schemas.Application:
+    return schemas.Application.from_application_no_keys(request.token.application)

api/views/bookmarks.py (new file, 33 lines)

@@ -0,0 +1,33 @@
from django.http import HttpRequest
from hatchway import api_view
from activities.models import Post
from activities.services import TimelineService
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
@scope_required("read:bookmarks")
@api_view.get
def bookmarks(
request: HttpRequest,
max_id: str | None = None,
since_id: str | None = None,
min_id: str | None = None,
limit: int = 20,
) -> list[schemas.Status]:
queryset = TimelineService(request.identity).bookmarks()
paginator = MastodonPaginator()
pager: PaginationResult[Post] = paginator.paginate(
queryset,
min_id=min_id,
max_id=max_id,
since_id=since_id,
limit=limit,
)
return PaginatingApiResponse(
schemas.Status.map_from_post(pager.results, request.identity),
request=request,
include_params=["limit"],
)
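The `max_id`/`since_id`/`min_id` trio that `MastodonPaginator` accepts follows Mastodon's keyset-pagination convention. A toy model of the semantics over integer ids (a sketch, not Takahē's implementation):

```python
def paginate(ids, max_id=None, since_id=None, min_id=None, limit=20):
    """Keyset pagination over ids sorted newest-first.

    max_id and since_id are exclusive bounds on the page; min_id asks for
    the page of results immediately newer than the given id.
    """
    items = sorted(ids, reverse=True)
    if max_id is not None:
        items = [i for i in items if i < max_id]
    if since_id is not None:
        items = [i for i in items if i > since_id]
    if min_id is not None:
        # Oldest `limit` items newer than min_id, still returned newest-first
        oldest_newer = sorted(i for i in items if i > min_id)[:limit]
        return sorted(oldest_newer, reverse=True)
    return items[:limit]
```

The distinction between `since_id` and `min_id` is that `since_id` returns the newest page within the bound, while `min_id` returns the page adjacent to the bound.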

api/views/follow_requests.py (new file)

@@ -0,0 +1,60 @@
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
from hatchway import api_view
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
from users.models.identity import Identity
from users.services.identity import IdentityService
@scope_required("read:follows")
@api_view.get
def follow_requests(
request: HttpRequest,
max_id: str | None = None,
since_id: str | None = None,
min_id: str | None = None,
limit: int = 40,
) -> list[schemas.Account]:
service = IdentityService(request.identity)
paginator = MastodonPaginator(max_limit=80)
pager: PaginationResult[Identity] = paginator.paginate(
service.follow_requests(),
min_id=min_id,
max_id=max_id,
since_id=since_id,
limit=limit,
)
return PaginatingApiResponse(
[schemas.Account.from_identity(i) for i in pager.results],
request=request,
include_params=["limit"],
)
@scope_required("write:follows")
@api_view.post
def accept_follow_request(
request: HttpRequest,
id: str | None = None,
) -> schemas.Relationship:
source_identity = get_object_or_404(
Identity.objects.exclude(restriction=Identity.Restriction.blocked), pk=id
)
IdentityService(request.identity).accept_follow_request(source_identity)
return IdentityService(source_identity).mastodon_json_relationship(request.identity)
@scope_required("write:follows")
@api_view.post
def reject_follow_request(
request: HttpRequest,
id: str | None = None,
) -> schemas.Relationship:
source_identity = get_object_or_404(
Identity.objects.exclude(restriction=Identity.Restriction.blocked), pk=id
)
IdentityService(request.identity).reject_follow_request(source_identity)
return IdentityService(source_identity).mastodon_json_relationship(request.identity)


@@ -1,4 +1,8 @@
import datetime
from django.conf import settings
from django.core.cache import cache
from django.utils import timezone
from hatchway import api_view
from activities.models import Post
@@ -10,6 +14,15 @@ from users.models import Domain, Identity
@api_view.get
def instance_info_v1(request):
# The stats are expensive to calculate, so don't do it very often
stats = cache.get("instance_info_stats")
if stats is None:
stats = {
"user_count": Identity.objects.filter(local=True).count(),
"status_count": Post.objects.filter(local=True).not_hidden().count(),
"domain_count": Domain.objects.count(),
}
cache.set("instance_info_stats", stats, timeout=300)
return {
"uri": request.headers.get("host", settings.SETUP.MAIN_DOMAIN),
"title": Config.system.site_name,
@@ -18,11 +31,7 @@ def instance_info_v1(request):
"email": "",
"version": f"takahe/{__version__}",
"urls": {},
"stats": {
"user_count": Identity.objects.filter(local=True).count(),
"status_count": Post.objects.filter(local=True).not_hidden().count(),
"domain_count": Domain.objects.count(),
},
"stats": stats,
"thumbnail": Config.system.site_banner,
"languages": ["en"],
"registrations": (Config.system.signup_allowed),
@@ -32,7 +41,7 @@
"accounts": {},
"statuses": {
"max_characters": Config.system.post_length,
"max_media_attachments": 4,
"max_media_attachments": Config.system.max_media_attachments,
"characters_reserved_per_url": 23,
},
"media_attachments": {
@@ -47,6 +56,12 @@
"image_size_limit": (1024**2) * 10,
"image_matrix_limit": 2000 * 2000,
},
"polls": {
"max_options": 4,
"max_characters_per_option": 50,
"min_expiration": 300,
"max_expiration": 2629746,
},
},
"contact_account": None,
"rules": [],
@@ -59,9 +74,7 @@ def instance_info_v2(request) -> dict:
request.headers.get("host", settings.SETUP.MAIN_DOMAIN)
)
if current_domain is None or not current_domain.local:
-current_domain = Domain.get_domain(
-    request.headers.get(settings.SETUP.MAIN_DOMAIN)
-)
+current_domain = Domain.get_domain(settings.SETUP.MAIN_DOMAIN)
if current_domain is None:
raise ValueError("No domain set up for MAIN_DOMAIN")
admin_identity = (
@@ -89,7 +102,7 @@ def instance_info_v2(request) -> dict:
"accounts": {"max_featured_tags": 0},
"statuses": {
"max_characters": Config.system.post_length,
"max_media_attachments": 4,
"max_media_attachments": Config.system.max_media_attachments,
"characters_reserved_per_url": 23,
},
"media_attachments": {
@@ -126,3 +139,46 @@
},
"rules": [],
}
@api_view.get
def peers(request) -> list[str]:
return list(
Domain.objects.filter(local=False, blocked=False).values_list(
"domain", flat=True
)
)
@api_view.get
def activity(request) -> list:
"""
Weekly activity endpoint
"""
# The stats are expensive to calculate, so don't do it very often
stats = cache.get("instance_activity_stats")
if stats is None:
stats = []
# Work out our most recent week start
now = timezone.now()
week_start = now.replace(
hour=0, minute=0, second=0, microsecond=0
) - datetime.timedelta(now.weekday())
for i in range(12):
week_end = week_start + datetime.timedelta(days=7)
stats.append(
{
"week": int(week_start.timestamp()),
"statuses": Post.objects.filter(
local=True, created__gte=week_start, created__lt=week_end
).count(),
# TODO: Populate when we have identity activity tracking
"logins": 0,
"registrations": Identity.objects.filter(
local=True, created__gte=week_start, created__lt=week_end
).count(),
}
)
week_start -= datetime.timedelta(days=7)
cache.set("instance_activity_stats", stats, timeout=300)
return stats
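The week bucketing above relies on `datetime.timedelta(now.weekday())`, which passes the weekday count positionally as `days`. Extracted as a small helper for clarity:

```python
import datetime


def week_start(now: datetime.datetime) -> datetime.datetime:
    """Most recent Monday at midnight, matching the loop's starting point."""
    midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
    # Equivalent to datetime.timedelta(now.weekday()) in the view,
    # with the days argument spelled out explicitly.
    return midnight - datetime.timedelta(days=now.weekday())
```

Each of the twelve iterations then steps `week_start` back by seven days and counts posts and registrations created within `[week_start, week_end)`.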

api/views/lists.py (new file, 12 lines)

@@ -0,0 +1,12 @@
from django.http import HttpRequest
from hatchway import api_view
from api import schemas
from api.decorators import scope_required
@scope_required("read:lists")
@api_view.get
def get_lists(request: HttpRequest) -> list[schemas.List]:
# We don't implement this yet
return []


@@ -34,6 +34,7 @@ def upload_media(
height=main_file.image.height,
name=description or None,
state=PostAttachmentStates.fetched,
author=request.identity,
)
attachment.file.save(
main_file.name,
@@ -54,7 +55,10 @@ def get_media(
id: str,
) -> schemas.MediaAttachment:
attachment = get_object_or_404(PostAttachment, pk=id)
-if attachment.post.author != request.identity:
+if attachment.post:
+    if attachment.post.author != request.identity:
+        raise ApiError(401, "Not the author of this attachment")
+elif attachment.author and attachment.author != request.identity:
raise ApiError(401, "Not the author of this attachment")
return schemas.MediaAttachment.from_post_attachment(attachment)
@@ -68,7 +72,10 @@ def update_media(
focus: QueryOrBody[str] = "0,0",
) -> schemas.MediaAttachment:
attachment = get_object_or_404(PostAttachment, pk=id)
-if attachment.post.author != request.identity:
+if attachment.post:
+    if attachment.post.author != request.identity:
+        raise ApiError(401, "Not the author of this attachment")
+elif attachment.author != request.identity:
raise ApiError(401, "Not the author of this attachment")
attachment.name = description or None
attachment.save()


@@ -1,12 +1,22 @@
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
from hatchway import ApiResponse, api_view
-from activities.models import TimelineEvent
+from activities.models import PostInteraction, TimelineEvent
from activities.services import TimelineService
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
+# Types/exclude_types use weird syntax so we have to handle them manually
+NOTIFICATION_TYPES = {
+    "favourite": TimelineEvent.Types.liked,
+    "reblog": TimelineEvent.Types.boosted,
+    "mention": TimelineEvent.Types.mentioned,
+    "follow": TimelineEvent.Types.followed,
+    "admin.sign_up": TimelineEvent.Types.identity_created,
+}
@scope_required("read:notifications")
@api_view.get
@@ -18,22 +28,14 @@ def notifications(
limit: int = 20,
account_id: str | None = None,
) -> ApiResponse[list[schemas.Notification]]:
-# Types/exclude_types use weird syntax so we have to handle them manually
-base_types = {
-    "favourite": TimelineEvent.Types.liked,
-    "reblog": TimelineEvent.Types.boosted,
-    "mention": TimelineEvent.Types.mentioned,
-    "follow": TimelineEvent.Types.followed,
-    "admin.sign_up": TimelineEvent.Types.identity_created,
-}
requested_types = set(request.GET.getlist("types[]"))
excluded_types = set(request.GET.getlist("exclude_types[]"))
if not requested_types:
-requested_types = set(base_types.keys())
+requested_types = set(NOTIFICATION_TYPES.keys())
requested_types.difference_update(excluded_types)
# Use that to pull relevant events
queryset = TimelineService(request.identity).notifications(
-[base_types[r] for r in requested_types if r in base_types]
+[NOTIFICATION_TYPES[r] for r in requested_types if r in NOTIFICATION_TYPES]
)
paginator = MastodonPaginator()
pager: PaginationResult[TimelineEvent] = paginator.paginate(
@@ -43,8 +45,56 @@ def notifications(
since_id=since_id,
limit=limit,
)
interactions = PostInteraction.get_event_interactions(
pager.results,
request.identity,
)
return PaginatingApiResponse(
-[schemas.Notification.from_timeline_event(event) for event in pager.results],
+[
+    schemas.Notification.from_timeline_event(event, interactions=interactions)
+    for event in pager.results
+],
request=request,
include_params=["limit", "account_id"],
)
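The `types[]`/`exclude_types[]` handling above reduces to set arithmetic: an empty `types[]` means every known type, `exclude_types[]` is then subtracted, and unknown names are silently dropped. A standalone sketch (string stand-ins replace the real `TimelineEvent.Types` values):

```python
# String stand-ins for TimelineEvent.Types members, for illustration only.
NOTIFICATION_TYPES = {
    "favourite": "liked",
    "reblog": "boosted",
    "mention": "mentioned",
    "follow": "followed",
    "admin.sign_up": "identity_created",
}


def resolve_types(requested: set[str], excluded: set[str]) -> list[str]:
    """Empty types[] means everything; exclude_types[] is subtracted;
    unknown names are ignored. Sorted output for determinism."""
    if not requested:
        requested = set(NOTIFICATION_TYPES.keys())
    requested = requested - excluded
    return sorted(NOTIFICATION_TYPES[r] for r in requested if r in NOTIFICATION_TYPES)
```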
@scope_required("read:notifications")
@api_view.get
def get_notification(
request: HttpRequest,
id: str,
) -> schemas.Notification:
notification = get_object_or_404(
TimelineService(request.identity).notifications(
list(NOTIFICATION_TYPES.values())
),
id=id,
)
return schemas.Notification.from_timeline_event(notification)
@scope_required("write:notifications")
@api_view.post
def dismiss_notifications(request: HttpRequest) -> dict:
TimelineService(request.identity).notifications(
list(NOTIFICATION_TYPES.values())
).update(dismissed=True)
return {}
@scope_required("write:notifications")
@api_view.post
def dismiss_notification(request: HttpRequest, id: str) -> dict:
notification = get_object_or_404(
TimelineService(request.identity).notifications(
list(NOTIFICATION_TYPES.values())
),
id=id,
)
notification.dismissed = True
notification.save()
return {}


@@ -1,6 +1,7 @@
import base64
import json
import secrets
import time
from urllib.parse import urlparse, urlunparse
from django.contrib.auth.mixins import LoginRequiredMixin
@@ -72,6 +73,7 @@ class AuthorizationView(LoginRequiredMixin, View):
request,
"api/oauth_error.html",
{"error": f"Invalid response type '{response_type}'"},
status=400,
)
application = Application.objects.filter(
@@ -80,7 +82,10 @@ class AuthorizationView(LoginRequiredMixin, View):
if application is None:
return render(
request, "api/oauth_error.html", {"error": "Invalid client_id"}
request,
"api/oauth_error.html",
{"error": "Invalid client_id"},
status=400,
)
if application.redirect_uris and redirect_uri not in application.redirect_uris:
@@ -88,6 +93,7 @@ class AuthorizationView(LoginRequiredMixin, View):
request,
"api/oauth_error.html",
{"error": "Invalid application redirect URI"},
status=401,
)
context = {
@@ -169,13 +175,16 @@ class TokenView(View):
return JsonResponse({"error": "invalid_grant_type"}, status=400)
if grant_type == "client_credentials":
-# TODO: Implement client credentials flow
+# We don't support individual client credential tokens, but instead
+# just have a fixed one (since anyone can register an app at any
+# time anyway)
 return JsonResponse(
     {
-        "error": "invalid_grant_type",
-        "error_description": "client credential flow not implemented",
-    },
-    status=400,
+        "access_token": "__app__",
+        "token_type": "Bearer",
+        "scope": "read",
+        "created_at": int(time.time()),
+    }
)
elif grant_type == "authorization_code":
code = post_data.get("code")

api/views/preferences.py (new file, 13 lines)

@@ -0,0 +1,13 @@
from django.http import HttpRequest
from hatchway import api_view
from api import schemas
from api.decorators import scope_required
@scope_required("read:accounts")
@api_view.get
def preferences(request: HttpRequest) -> dict:
# Ideally this should just return Preferences; maybe hatchway needs a way to
# indicate response models should be serialized by alias?
return schemas.Preferences.from_identity(request.identity).dict(by_alias=True)

api/views/push.py (new file, 70 lines)

@@ -0,0 +1,70 @@
from django.conf import settings
from django.http import Http404
from hatchway import ApiError, QueryOrBody, api_view
from api import schemas
from api.decorators import scope_required
@scope_required("push")
@api_view.post
def create_subscription(
request,
subscription: QueryOrBody[schemas.PushSubscriptionCreation],
data: QueryOrBody[schemas.PushData],
) -> schemas.PushSubscription:
# First, check the server is set up to do push notifications
if not settings.SETUP.VAPID_PRIVATE_KEY:
raise Http404("Push not available")
# Then, register this with our token
request.token.set_push_subscription(
{
"endpoint": subscription.endpoint,
"keys": subscription.keys,
"alerts": data.alerts,
"policy": data.policy,
}
)
# Then return the subscription
return schemas.PushSubscription.from_token(request.token) # type:ignore
@scope_required("push")
@api_view.get
def get_subscription(request) -> schemas.PushSubscription:
# First, check the server is set up to do push notifications
if not settings.SETUP.VAPID_PRIVATE_KEY:
raise Http404("Push not available")
# Get the subscription if it exists
subscription = schemas.PushSubscription.from_token(request.token)
if not subscription:
raise ApiError(404, "Not Found")
return subscription
@scope_required("push")
@api_view.put
def update_subscription(
request, data: QueryOrBody[schemas.PushData]
) -> schemas.PushSubscription:
# First, check the server is set up to do push notifications
if not settings.SETUP.VAPID_PRIVATE_KEY:
raise Http404("Push not available")
# Get the subscription if it exists
subscription = schemas.PushSubscription.from_token(request.token)
if not subscription:
raise ApiError(404, "Not Found")
# Update the subscription
subscription.alerts = data.alerts
subscription.policy = data.policy
request.token.set_push_subscription(subscription)
# Then return the subscription
return schemas.PushSubscription.from_token(request.token) # type:ignore
@scope_required("push")
@api_view.delete
def delete_subscription(request) -> dict:
# Unset the subscription
request.token.push_subscription = None
return {}


@@ -13,7 +13,7 @@ from api.decorators import scope_required
def search(
request,
q: str,
type: Literal["accounts", "hashtags", "statuses"] | None = None,
type: Literal["accounts", "hashtags", "statuses", ""] | None = None,
fetch_identities: bool = Field(False, alias="resolve"),
following: bool = False,
exclude_unreviewed: bool = False,
@@ -33,13 +33,15 @@ def search(
# Run search
searcher = SearchService(q, request.identity)
search_result = searcher.search_all()
if type == "":
type = None
if type is None or type == "accounts":
result["accounts"] = [
schemas.Account.from_identity(i, include_counts=False)
for i in search_result["identities"]
]
if type is None or type == "hashtag":
result["hashtag"] = [
result["hashtags"] = [
schemas.Tag.from_hashtag(h) for h in search_result["hashtags"]
]
if type is None or type == "statuses":


@@ -16,7 +16,7 @@ from activities.models import (
from activities.services import PostService
from api import schemas
from api.decorators import scope_required
-from api.pagination import MastodonPaginator, PaginationResult
+from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
from core.models import Config
@@ -39,7 +39,7 @@ class PostPollSchema(Schema):
class PostStatusSchema(Schema):
-status: str
+status: str | None
in_reply_to_id: str | None = None
sensitive: bool = False
spoiler_text: str | None = None
@@ -50,12 +50,18 @@
poll: PostPollSchema | None = None
class MediaAttributesSchema(Schema):
id: str
description: str
class EditStatusSchema(Schema):
status: str
sensitive: bool = False
spoiler_text: str | None = None
language: str | None = None
media_ids: list[str] = []
media_attributes: list[MediaAttributesSchema] = []
def post_for_id(request: HttpRequest, id: str) -> Post:
@@ -76,9 +82,9 @@ def post_for_id(request: HttpRequest, id: str) -> Post:
@api_view.post
def post_status(request, details: PostStatusSchema) -> schemas.Status:
# Check text length
-if len(details.status) > Config.system.post_length:
+if details.status and len(details.status) > Config.system.post_length:
     raise ApiError(400, "Status is too long")
-if len(details.status) == 0 and not details.media_ids:
+if not details.status and not details.media_ids:
raise ApiError(400, "Status is empty")
# Grab attachments
attachments = [get_object_or_404(PostAttachment, pk=id) for id in details.media_ids]
@@ -97,7 +103,7 @@ def post_status(request, details: PostStatusSchema) -> schemas.Status:
pass
post = Post.create_local(
author=request.identity,
-content=details.status,
+content=details.status or "",
summary=details.spoiler_text,
sensitive=details.sensitive,
visibility=visibility_map[details.visibility],
@@ -134,6 +140,7 @@ def edit_status(request, id: str, details: EditStatusSchema) -> schemas.Status:
summary=details.spoiler_text,
sensitive=details.sensitive,
attachments=attachments,
attachment_attributes=details.media_attributes,
)
return schemas.Status.from_post(post)
@@ -230,10 +237,7 @@ def favourited_by(
limit=limit,
)
-headers = {}
-if pager.results:
-    headers = {"link": pager.link_header(request, ["limit"])}
-return ApiResponse(
+return PaginatingApiResponse(
[
schemas.Account.from_identity(
interaction.identity,
@@ -241,7 +245,53 @@
)
for interaction in pager.results
],
-headers=headers,
+request=request,
+include_params=[
+    "limit",
+    "id",
+],
)
@api_view.get
def reblogged_by(
request: HttpRequest,
id: str,
max_id: str | None = None,
since_id: str | None = None,
min_id: str | None = None,
limit: int = 20,
) -> ApiResponse[list[schemas.Account]]:
"""
View who reblogged a given status.
"""
post = post_for_id(request, id)
paginator = MastodonPaginator()
pager: PaginationResult[PostInteraction] = paginator.paginate(
post.interactions.filter(
type=PostInteraction.Types.boost,
state__in=PostInteractionStates.group_active(),
).select_related("identity"),
min_id=min_id,
max_id=max_id,
since_id=since_id,
limit=limit,
)
return PaginatingApiResponse(
[
schemas.Account.from_identity(
interaction.identity,
include_counts=False,
)
for interaction in pager.results
],
request=request,
include_params=[
"limit",
"id",
],
)
@@ -267,3 +317,50 @@ def unreblog_status(request, id: str) -> schemas.Status:
return schemas.Status.from_post(
post, interactions=interactions, identity=request.identity
)
@scope_required("write:bookmarks")
@api_view.post
def bookmark_status(request, id: str) -> schemas.Status:
post = post_for_id(request, id)
request.identity.bookmarks.get_or_create(post=post)
interactions = PostInteraction.get_post_interactions([post], request.identity)
return schemas.Status.from_post(
post, interactions=interactions, bookmarks={post.pk}, identity=request.identity
)
@scope_required("write:bookmarks")
@api_view.post
def unbookmark_status(request, id: str) -> schemas.Status:
post = post_for_id(request, id)
request.identity.bookmarks.filter(post=post).delete()
interactions = PostInteraction.get_post_interactions([post], request.identity)
return schemas.Status.from_post(
post, interactions=interactions, identity=request.identity
)
@scope_required("write:accounts")
@api_view.post
def pin_status(request, id: str) -> schemas.Status:
post = post_for_id(request, id)
try:
PostService(post).pin_as(request.identity)
interactions = PostInteraction.get_post_interactions([post], request.identity)
return schemas.Status.from_post(
post, identity=request.identity, interactions=interactions
)
except ValueError as e:
raise ApiError(422, str(e))
@scope_required("write:accounts")
@api_view.post
def unpin_status(request, id: str) -> schemas.Status:
post = post_for_id(request, id)
PostService(post).unpin_as(request.identity)
interactions = PostInteraction.get_post_interactions([post], request.identity)
return schemas.Status.from_post(
post, identity=request.identity, interactions=interactions
)

api/views/suggestions.py (new file, 16 lines)

@@ -0,0 +1,16 @@
from django.http import HttpRequest
from hatchway import api_view
from api import schemas
from api.decorators import scope_required
@scope_required("read")
@api_view.get
def suggested_users(
request: HttpRequest,
limit: int = 10,
offset: int | None = None,
) -> list[schemas.Account]:
# We don't implement this yet
return []

api/views/tags.py (new file, 84 lines)

@@ -0,0 +1,84 @@
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
from hatchway import api_view
from activities.models import Hashtag
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
from users.models import HashtagFollow
@api_view.get
def hashtag(request: HttpRequest, hashtag: str) -> schemas.Tag:
tag = get_object_or_404(
Hashtag,
pk=hashtag.lower(),
)
following = None
if request.identity:
following = tag.followers.filter(identity=request.identity).exists()
return schemas.Tag.from_hashtag(
tag,
following=following,
)
@scope_required("read:follows")
@api_view.get
def followed_tags(
request: HttpRequest,
max_id: str | None = None,
since_id: str | None = None,
min_id: str | None = None,
limit: int = 100,
) -> list[schemas.Tag]:
queryset = HashtagFollow.objects.by_identity(request.identity)
paginator = MastodonPaginator()
pager: PaginationResult[HashtagFollow] = paginator.paginate(
queryset,
min_id=min_id,
max_id=max_id,
since_id=since_id,
limit=limit,
)
return PaginatingApiResponse(
schemas.FollowedTag.map_from_follows(pager.results),
request=request,
include_params=["limit"],
)
@scope_required("write:follows")
@api_view.post
def follow(
request: HttpRequest,
id: str,
) -> schemas.Tag:
hashtag = get_object_or_404(
Hashtag,
pk=id.lower(),
)
request.identity.hashtag_follows.get_or_create(hashtag=hashtag)
return schemas.Tag.from_hashtag(
hashtag,
following=True,
)
@scope_required("write:follows")
@api_view.post
def unfollow(
request: HttpRequest,
id: str,
) -> schemas.Tag:
hashtag = get_object_or_404(
Hashtag,
pk=id.lower(),
)
request.identity.hashtag_follows.filter(hashtag=hashtag).delete()
return schemas.Tag.from_hashtag(
hashtag,
following=False,
)


@ -1,7 +1,7 @@
from django.http import HttpRequest
from hatchway import ApiError, ApiResponse, api_view
from activities.models import Post
from activities.models import Post, TimelineEvent
from activities.services import TimelineService
from api import schemas
from api.decorators import scope_required
@ -34,12 +34,13 @@ def home(
"subject_post_interaction__post__mentions__domain",
"subject_post_interaction__post__author__posts",
)
pager = paginator.paginate_home(
pager: PaginationResult[TimelineEvent] = paginator.paginate(
queryset,
min_id=min_id,
max_id=max_id,
since_id=since_id,
limit=limit,
home=True,
)
return PaginatingApiResponse(
schemas.Status.map_from_timeline_event(pager.results, request.identity),
@ -100,7 +101,7 @@ def hashtag(
) -> ApiResponse[list[schemas.Status]]:
if limit > 40:
limit = 40
queryset = TimelineService(request.identity).hashtag(hashtag)
queryset = TimelineService(request.identity).hashtag(hashtag.lower())
if local:
queryset = queryset.filter(local=True)
if only_media:


@ -1,12 +1,12 @@
from django.conf import settings
from core.models import Config
def config_context(request):
return {
"config": Config.system,
"config_identity": (
request.identity.config_identity if request.identity else None
),
"allow_migration": settings.SETUP.ALLOW_USER_MIGRATION,
"top_section": request.path.strip("/").split("/")[0],
"opengraph_defaults": {
"og:site_name": Config.system.site_name,


@ -20,16 +20,6 @@ def vary_by_ap_json(request, *args, **kwargs) -> str:
return "not_ap"
def vary_by_identity(request, *args, **kwargs) -> str:
"""
Return a cache usable string token that is different based upon the
request.identity
"""
if request.identity:
return f"ident{request.identity.pk}"
return "identNone"
def cache_page(
timeout: int | str = "cache_timeout_page_default",
*,


@ -1,45 +1,16 @@
import traceback
from asgiref.sync import sync_to_async
from django.conf import settings
class ActivityPubError(BaseException):
"""
A problem with an ActivityPub message
"""
class ActivityPubFormatError(ActivityPubError):
"""
A problem with an ActivityPub message's format/keys
"""
class ActorMismatchError(ActivityPubError):
"""
The actor is not authorised to do the action we saw
"""
def capture_message(message: str, level: str | None = None, scope=None, **scope_args):
"""
Sends the informational message to Sentry if it's configured
"""
if settings.SETUP.SENTRY_DSN and settings.SETUP.SENTRY_CAPTURE_MESSAGES:
from sentry_sdk import capture_message
capture_message(message, level, scope, **scope_args)
elif settings.DEBUG:
if scope or scope_args:
message += f"; {scope=}, {scope_args=}"
print(message)
def capture_exception(exception: BaseException, scope=None, **scope_args):
"""
Sends the exception to Sentry if it's configured
"""
if settings.SETUP.SENTRY_DSN:
from sentry_sdk import capture_exception
capture_exception(exception, scope, **scope_args)
elif settings.DEBUG:
traceback.print_exc()
acapture_exception = sync_to_async(capture_exception, thread_sensitive=False)


@ -57,7 +57,7 @@ def blurhash_image(file) -> str:
return blurhash.encode(file, 4, 4)
async def get_remote_file(
def get_remote_file(
url: str,
*,
timeout: float = settings.SETUP.REMOTE_TIMEOUT,
@ -70,8 +70,10 @@ async def get_remote_file(
"User-Agent": settings.TAKAHE_USER_AGENT,
}
async with httpx.AsyncClient(headers=headers) as client:
async with client.stream("GET", url, timeout=timeout) as stream:
with httpx.Client(headers=headers) as client:
with client.stream(
"GET", url, timeout=timeout, follow_redirects=True
) as stream:
allow_download = max_size is None
if max_size:
try:
@ -80,7 +82,7 @@ async def get_remote_file(
except (KeyError, TypeError):
pass
if allow_download:
file = ContentFile(await stream.aread(), name=url)
file = ContentFile(stream.read(), name=url)
return file, stream.headers.get(
"content-type", "application/octet-stream"
)


@ -38,7 +38,7 @@ class FediverseHtmlParser(HTMLParser):
r"(^|[^\w\d\-_/])@([\w\d\-_]+(?:@[\w\d\-_\.]+[\w\d\-_]+)?)"
)
HASHTAG_REGEX = re.compile(r"\B#([a-zA-Z0-9(_)]+\b)(?!;)")
HASHTAG_REGEX = re.compile(r"\B#([\w()]+\b)(?!;)")
EMOJI_REGEX = re.compile(r"\B:([a-zA-Z0-9(_)-]+):\B")
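The hashtag regex change above swaps the ASCII class `[a-zA-Z0-9(_)]` for `[\w()]`. Since `\w` is Unicode-aware on `str` patterns, accented and non-Latin hashtags now match; a quick demonstration with the two patterns from the diff:

```python
import re

OLD_HASHTAG_REGEX = re.compile(r"\B#([a-zA-Z0-9(_)]+\b)(?!;)")
NEW_HASHTAG_REGEX = re.compile(r"\B#([\w()]+\b)(?!;)")

# ASCII hashtags match under both patterns
print(OLD_HASHTAG_REGEX.search("#hello").group(1))  # hello
print(NEW_HASHTAG_REGEX.search("#hello").group(1))  # hello

# The old pattern cannot place \b between "f" and the word char "é",
# so "#café" fails entirely; the new one captures the full tag
print(OLD_HASHTAG_REGEX.search("#café"))            # None
print(NEW_HASHTAG_REGEX.search("#café").group(1))   # café
```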
@ -91,6 +91,8 @@ class FediverseHtmlParser(HTMLParser):
for mention in mentions or []:
if self.uri_domain:
url = mention.absolute_profile_uri()
elif not mention.local:
url = mention.profile_uri
else:
url = str(mention.urls.view)
if mention.username:
@ -98,6 +100,7 @@ class FediverseHtmlParser(HTMLParser):
domain = mention.domain_id.lower()
self.mention_matches[f"{username}"] = url
self.mention_matches[f"{username}@{domain}"] = url
self.mention_matches[mention.absolute_profile_uri()] = url
def handle_starttag(self, tag: str, attrs: list[tuple[str, str | None]]) -> None:
if tag in self.REWRITE_TO_P:
@ -123,9 +126,10 @@ class FediverseHtmlParser(HTMLParser):
if self._pending_a:
href = self._pending_a["attrs"].get("href")
content = self._pending_a["content"].strip()
has_ellipsis = "ellipsis" in self._pending_a["attrs"].get("class", "")
# Is it a mention?
if content.lower().lstrip("@") in self.mention_matches:
self.html_output += self.create_mention(content)
self.html_output += self.create_mention(content, href)
self.text_output += content
# Is it a hashtag?
elif self.HASHTAG_REGEX.match(content):
@ -133,7 +137,11 @@ class FediverseHtmlParser(HTMLParser):
self.text_output += content
elif content:
# Shorten the link if we need to
self.html_output += self.create_link(href, content)
self.html_output += self.create_link(
href,
content,
has_ellipsis=has_ellipsis,
)
self.text_output += href
self._pending_a = None
@ -153,7 +161,7 @@ class FediverseHtmlParser(HTMLParser):
self.html_output += self.linkify(self._data_buffer)
self._data_buffer = ""
def create_link(self, href, content):
def create_link(self, href, content, has_ellipsis=False):
"""
Generates a link, doing optional shortening.
@ -161,13 +169,17 @@ class FediverseHtmlParser(HTMLParser):
"""
looks_like_link = bool(self.URL_REGEX.match(content))
if looks_like_link:
content = content.split("://", 1)[1]
if looks_like_link and len(content) > 30:
return f'<a href="{html.escape(href)}" rel="nofollow" class="ellipsis" title="{html.escape(content)}">{html.escape(content[:30])}</a>'
protocol, content = content.split("://", 1)
else:
protocol = ""
if (looks_like_link and len(content) > 30) or has_ellipsis:
return f'<a href="{html.escape(href)}" rel="nofollow" class="ellipsis" title="{html.escape(content)}"><span class="invisible">{html.escape(protocol)}://</span><span class="ellipsis">{html.escape(content[:30])}</span><span class="invisible">{html.escape(content[30:])}</span></a>'
elif looks_like_link:
return f'<a href="{html.escape(href)}" rel="nofollow"><span class="invisible">{html.escape(protocol)}://</span>{html.escape(content)}</a>'
else:
return f'<a href="{html.escape(href)}" rel="nofollow">{html.escape(content)}</a>'
def create_mention(self, handle) -> str:
def create_mention(self, handle, href: str | None = None) -> str:
"""
Generates a mention link. Handle should have a leading @.
@ -182,12 +194,15 @@ class FediverseHtmlParser(HTMLParser):
short_hash = short_handle.lower()
self.mentions.add(handle_hash)
url = self.mention_matches.get(handle_hash)
# If we have a captured link out, use that as the actual resolver
if href and href in self.mention_matches:
url = self.mention_matches[href]
if url:
if short_hash not in self.mention_aliases:
self.mention_aliases[short_hash] = handle_hash
elif self.mention_aliases.get(short_hash) != handle_hash:
short_handle = handle
return f'<a href="{html.escape(url)}">@{html.escape(short_handle)}</a>'
return f'<span class="h-card"><a href="{html.escape(url)}" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>{html.escape(short_handle)}</span></a></span>'
else:
return "@" + html.escape(handle)
@ -200,7 +215,7 @@ class FediverseHtmlParser(HTMLParser):
hashtag = hashtag.lstrip("#")
self.hashtags.add(hashtag.lower())
if self.uri_domain:
return f'<a href="https://{self.uri_domain}/tags/{hashtag.lower()}/" rel="tag">#{hashtag}</a>'
return f'<a href="https://{self.uri_domain}/tags/{hashtag.lower()}/" class="mention hashtag" rel="tag">#{hashtag}</a>'
else:
return f'<a href="/tags/{hashtag.lower()}/" rel="tag">#{hashtag}</a>'
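The `create_link` rewrite earlier in this file splits the protocol into an `invisible` span and truncates long display text into `ellipsis`/`invisible` spans, Mastodon-style. A self-contained sketch of that markup logic (function name is mine; the real method also handles the `has_ellipsis` flag from incoming HTML):

```python
import html

def shorten_link(href: str, content: str, limit: int = 30) -> str:
    """Render a link with an invisible protocol span, shortening the
    visible text past `limit` characters (simplified sketch)."""
    if "://" not in content:
        # Display text isn't a URL: plain escaped link
        return f'<a href="{html.escape(href)}" rel="nofollow">{html.escape(content)}</a>'
    protocol, rest = content.split("://", 1)
    if len(rest) > limit:
        return (
            f'<a href="{html.escape(href)}" rel="nofollow" class="ellipsis" '
            f'title="{html.escape(rest)}">'
            f'<span class="invisible">{html.escape(protocol)}://</span>'
            f'<span class="ellipsis">{html.escape(rest[:limit])}</span>'
            f'<span class="invisible">{html.escape(rest[limit:])}</span></a>'
        )
    return (
        f'<a href="{html.escape(href)}" rel="nofollow">'
        f'<span class="invisible">{html.escape(protocol)}://</span>{html.escape(rest)}</a>'
    )

url = "https://example.com/some/very/long/path/to/a/page"
print('class="ellipsis"' in shorten_link(url, url))  # True
```

Hiding the overflow in an `invisible` span (rather than dropping it) means clients that strip CSS still see the full URL text.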

core/json.py (new file)
@@ -0,0 +1,32 @@
import json
from httpx import Response
JSON_CONTENT_TYPES = [
"application/json",
"application/ld+json",
"application/activity+json",
]
def json_from_response(response: Response) -> dict | None:
content_type, *parameters = (
response.headers.get("Content-Type", "invalid").lower().split(";")
)
if content_type not in JSON_CONTENT_TYPES:
return None
charset = None
for parameter in parameters:
key, value = parameter.split("=")
if key.strip() == "charset":
charset = value.strip()
if charset:
return json.loads(response.content.decode(charset))
else:
# if no charset is specified, fall back to
# httpx's JSON decoding for encoding inference
return response.json()
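The `Content-Type` handling in `json_from_response` above can be shown without `httpx`. A simplified stdlib sketch (the utf-8 default when no charset is given is my assumption here; the real code delegates that case to httpx's encoding inference):

```python
import json

JSON_CONTENT_TYPES = {
    "application/json",
    "application/ld+json",
    "application/activity+json",
}

def decode_json_body(content_type_header: str, body: bytes):
    """Parse a body as JSON only when the Content-Type says so,
    honouring an explicit charset parameter (simplified sketch)."""
    content_type, *parameters = content_type_header.lower().split(";")
    if content_type.strip() not in JSON_CONTENT_TYPES:
        return None  # not a JSON-ish media type
    charset = "utf-8"  # assumed default; real code defers to httpx
    for parameter in parameters:
        key, _, value = parameter.partition("=")
        if key.strip() == "charset":
            charset = value.strip()
    return json.loads(body.decode(charset))

print(decode_json_body("application/activity+json; charset=utf-8", b'{"a": 1}'))  # {'a': 1}
print(decode_json_body("text/html", b"<p>hi</p>"))  # None
```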


@ -1,12 +1,24 @@
import datetime
import logging
import os
import urllib.parse as urllib_parse
from dateutil import parser
from pyld import jsonld
from pyld.jsonld import JsonLdError
from core.exceptions import ActivityPubFormatError
logger = logging.getLogger(__name__)
schemas = {
"unknown": {
"contentType": "application/ld+json",
"documentUrl": "unknown",
"contextUrl": None,
"document": {
"@context": {},
},
},
"www.w3.org/ns/activitystreams": {
"contentType": "application/ld+json",
"documentUrl": "http://www.w3.org/ns/activitystreams",
@ -456,6 +468,46 @@ schemas = {
}
},
},
"w3id.org/security/multikey/v1": {
"contentType": "application/ld+json",
"documentUrl": "https://w3id.org/security/multikey/v1",
"contextUrl": None,
"document": {
"@context": {
"id": "@id",
"type": "@type",
"@protected": True,
"Multikey": {
"@id": "https://w3id.org/security#Multikey",
"@context": {
"@protected": True,
"id": "@id",
"type": "@type",
"controller": {
"@id": "https://w3id.org/security#controller",
"@type": "@id",
},
"revoked": {
"@id": "https://w3id.org/security#revoked",
"@type": "http://www.w3.org/2001/XMLSchema#dateTime",
},
"expires": {
"@id": "https://w3id.org/security#expiration",
"@type": "http://www.w3.org/2001/XMLSchema#dateTime",
},
"publicKeyMultibase": {
"@id": "https://w3id.org/security#publicKeyMultibase",
"@type": "https://w3id.org/security#multibase",
},
"secretKeyMultibase": {
"@id": "https://w3id.org/security#secretKeyMultibase",
"@type": "https://w3id.org/security#multibase",
},
},
},
},
},
},
"*/schemas/litepub-0.1.jsonld": {
"contentType": "application/ld+json",
"documentUrl": "http://w3id.org/security/v1",
@ -547,6 +599,28 @@ schemas = {
}
},
},
"schema.org": {
"contentType": "application/ld+json",
"documentUrl": "https://schema.org/docs/jsonldcontext.json",
"contextUrl": None,
"document": {
"@context": {
"schema": "http://schema.org/",
"PropertyValue": {"@id": "schema:PropertyValue"},
"value": {"@id": "schema:value"},
},
},
},
"purl.org/wytchspace/ns/ap/1.0": {
"contentType": "application/ld+json",
"documentUrl": "https://purl.org/wytchspace/ns/ap/1.0",
"contextUrl": None,
"document": {
"@context": {
"wytch": "https://ns.wytch.space/ap/1.0.jsonld",
},
},
},
}
DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%S.Z"
@ -558,12 +632,8 @@ def builtin_document_loader(url: str, options={}):
# Get URL without scheme
pieces = urllib_parse.urlparse(url)
if pieces.hostname is None:
raise JsonLdError(
f"No schema built-in for {url!r}",
"jsonld.LoadDocumentError",
code="loading document failed",
cause="NoHostnameError",
)
logger.info(f"No host name for json-ld schema: {url!r}")
return schemas["unknown"]
key = pieces.hostname + pieces.path.rstrip("/")
try:
return schemas[key]
@ -572,12 +642,9 @@ def builtin_document_loader(url: str, options={}):
key = "*" + pieces.path.rstrip("/")
return schemas[key]
except KeyError:
raise JsonLdError(
f"No schema built-in for {key!r}",
"jsonld.LoadDocumentError",
code="loading document failed",
cause="KeyError",
)
# return an empty context instead of throwing an error
logger.info(f"Ignoring unknown json-ld schema: {url!r}")
return schemas["unknown"]
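The loader change above replaces hard failures with a fallback to the empty `"unknown"` context. The lookup-and-fallback logic can be sketched with the stdlib alone (schema table trimmed to two entries; the wildcard-by-path lookup is elided):

```python
from urllib.parse import urlparse

SCHEMAS = {
    "unknown": {"document": {"@context": {}}},
    "www.w3.org/ns/activitystreams": {
        "document": {"@context": {"as": "https://www.w3.org/ns/activitystreams#"}},
    },
}

def load_builtin_schema(url: str) -> dict:
    """Resolve a JSON-LD context URL against the built-in table,
    falling back to the empty 'unknown' context instead of raising."""
    pieces = urlparse(url)
    if pieces.hostname is None:
        return SCHEMAS["unknown"]  # bare or malformed URL
    key = pieces.hostname + pieces.path.rstrip("/")
    # Wildcard "*/path" lookup from the real loader elided for brevity
    return SCHEMAS.get(key, SCHEMAS["unknown"])

print(load_builtin_schema("https://www.w3.org/ns/activitystreams/")
      is SCHEMAS["www.w3.org/ns/activitystreams"])  # True
print(load_builtin_schema("https://random.example/context")
      is SCHEMAS["unknown"])                        # True
```

Returning an empty context keeps federation working with servers that reference obscure JSON-LD vocabularies, at the cost of silently dropping those terms.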
def canonicalise(json_data: dict, include_security: bool = False) -> dict:
@ -592,24 +659,32 @@ def canonicalise(json_data: dict, include_security: bool = False) -> dict:
"""
if not isinstance(json_data, dict):
raise ValueError("Pass decoded JSON data into LDDocument")
context = [
"https://www.w3.org/ns/activitystreams",
{
"blurhash": "toot:blurhash",
"Emoji": "toot:Emoji",
"focalPoint": {"@container": "@list", "@id": "toot:focalPoint"},
"Hashtag": "as:Hashtag",
"manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
"Public": "as:Public",
"sensitive": "as:sensitive",
"toot": "http://joinmastodon.org/ns#",
"votersCount": "toot:votersCount",
},
]
context = json_data.get("@context", [])
if not isinstance(context, list):
context = [context]
if not context:
context.append("https://www.w3.org/ns/activitystreams")
context.append(
{
"blurhash": "toot:blurhash",
"Emoji": "toot:Emoji",
"focalPoint": {"@container": "@list", "@id": "toot:focalPoint"},
"Hashtag": "as:Hashtag",
"manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
"sensitive": "as:sensitive",
"toot": "http://joinmastodon.org/ns#",
"votersCount": "toot:votersCount",
"featured": {"@id": "toot:featured", "@type": "@id"},
}
)
if include_security:
context.append("https://w3id.org/security/v1")
if "@context" not in json_data:
json_data["@context"] = context
json_data["@context"] = context
return jsonld.compact(jsonld.expand(json_data), context)
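The `canonicalise` change above stops replacing the incoming `@context` wholesale and instead normalises it to a list and appends the Mastodon-flavoured term definitions. Just that normalisation step, as a stdlib sketch (the term-definition dict is trimmed; function name is mine):

```python
def merged_context(json_data: dict) -> list:
    """Normalise a document's @context to a list and append extra
    Mastodon-flavoured term definitions (simplified sketch)."""
    context = json_data.get("@context", [])
    if not isinstance(context, list):
        context = [context]  # a lone string or dict becomes a one-item list
    if not context:
        context.append("https://www.w3.org/ns/activitystreams")
    context.append({
        "toot": "http://joinmastodon.org/ns#",
        "Hashtag": "as:Hashtag",
        "sensitive": "as:sensitive",
        # ... remaining term definitions as in the diff
    })
    return context

doc = {"@context": "https://www.w3.org/ns/activitystreams", "type": "Note"}
ctx = merged_context(doc)
print(ctx[0])                      # https://www.w3.org/ns/activitystreams
print(isinstance(ctx[-1], dict))   # True
```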
@ -675,7 +750,7 @@ def get_value_or_map(data, key, map_key):
if "und" in map_key:
return data[map_key]["und"]
return list(data[map_key].values())[0]
raise KeyError(f"Cannot find {key} or {map_key}")
raise ActivityPubFormatError(f"Cannot find {key} or {map_key}")
def media_type_from_filename(filename):


@ -10,7 +10,6 @@ import core.uploads
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -0,0 +1,35 @@
# Generated by Django 4.2 on 2023-04-29 18:49
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0016_hashtagfollow"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("core", "0001_initial"),
]
operations = [
migrations.AlterUniqueTogether(
name="config",
unique_together=set(),
),
migrations.AddField(
model_name="config",
name="domain",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="configs",
to="users.domain",
),
),
migrations.AlterUniqueTogether(
name="config",
unique_together={("key", "user", "identity", "domain")},
),
]


@ -2,7 +2,6 @@ from functools import partial
from typing import ClassVar
import pydantic
from asgiref.sync import sync_to_async
from django.core.files import File
from django.db import models
from django.utils.functional import lazy
@ -43,6 +42,14 @@ class Config(models.Model):
on_delete=models.CASCADE,
)
domain = models.ForeignKey(
"users.domain",
blank=True,
null=True,
related_name="configs",
on_delete=models.CASCADE,
)
json = models.JSONField(blank=True, null=True)
image = models.ImageField(
blank=True,
@ -52,7 +59,7 @@ class Config(models.Model):
class Meta:
unique_together = [
("key", "user", "identity"),
("key", "user", "identity", "domain"),
]
system: ClassVar["Config.ConfigOptions"] # type: ignore
@ -86,17 +93,7 @@ class Config(models.Model):
"""
return cls.load_values(
cls.SystemOptions,
{"identity__isnull": True, "user__isnull": True},
)
@classmethod
async def aload_system(cls):
"""
Async loads the system config options object
"""
return await sync_to_async(cls.load_values)(
cls.SystemOptions,
{"identity__isnull": True, "user__isnull": True},
{"identity__isnull": True, "user__isnull": True, "domain__isnull": True},
)
@classmethod
@ -106,17 +103,7 @@ class Config(models.Model):
"""
return cls.load_values(
cls.UserOptions,
{"identity__isnull": True, "user": user},
)
@classmethod
async def aload_user(cls, user):
"""
Async loads the user config options object
"""
return await sync_to_async(cls.load_values)(
cls.UserOptions,
{"identity__isnull": True, "user": user},
{"identity__isnull": True, "user": user, "domain__isnull": True},
)
@classmethod
@ -126,17 +113,17 @@ class Config(models.Model):
"""
return cls.load_values(
cls.IdentityOptions,
{"identity": identity, "user__isnull": True},
{"identity": identity, "user__isnull": True, "domain__isnull": True},
)
@classmethod
async def aload_identity(cls, identity):
def load_domain(cls, domain):
"""
Async loads an identity config options object
Loads a domain config options object
"""
return await sync_to_async(cls.load_values)(
cls.IdentityOptions,
{"identity": identity, "user__isnull": True},
return cls.load_values(
cls.DomainOptions,
{"domain": domain, "user__isnull": True, "identity__isnull": True},
)
@classmethod
@ -170,7 +157,7 @@ class Config(models.Model):
key,
value,
cls.SystemOptions,
{"identity__isnull": True, "user__isnull": True},
{"identity__isnull": True, "user__isnull": True, "domain__isnull": True},
)
@classmethod
@ -179,7 +166,7 @@ class Config(models.Model):
key,
value,
cls.UserOptions,
{"identity__isnull": True, "user": user},
{"identity__isnull": True, "user": user, "domain__isnull": True},
)
@classmethod
@ -188,11 +175,19 @@ class Config(models.Model):
key,
value,
cls.IdentityOptions,
{"identity": identity, "user__isnull": True},
{"identity": identity, "user__isnull": True, "domain__isnull": True},
)
@classmethod
def set_domain(cls, domain, key, value):
cls.set_value(
key,
value,
cls.DomainOptions,
{"domain": domain, "user__isnull": True, "identity__isnull": True},
)
class SystemOptions(pydantic.BaseModel):
version: str = __version__
system_actor_public_key: str = ""
@ -210,6 +205,7 @@ class Config(models.Model):
policy_terms: str = ""
policy_privacy: str = ""
policy_rules: str = ""
policy_issues: str = ""
signup_allowed: bool = True
signup_text: str = ""
@ -218,6 +214,7 @@ class Config(models.Model):
content_warning_text: str = "Content Warning"
post_length: int = 500
max_media_attachments: int = 4
post_minimum_interval: int = 3 # seconds
identity_min_length: int = 2
identity_max_per_user: int = 5
@ -239,19 +236,20 @@ class Config(models.Model):
custom_head: str | None
class UserOptions(pydantic.BaseModel):
pass
light_theme: bool = False
class IdentityOptions(pydantic.BaseModel):
toot_mode: bool = False
default_post_visibility: int = 0 # Post.Visibilities.public
visible_follows: bool = True
light_theme: bool = False
# wellness Options
search_enabled: bool = True
visible_reaction_counts: bool = True
expand_linked_cws: bool = True
infinite_scroll: bool = True
expand_content_warnings: bool = False
boosts_on_profile: bool = True
custom_css: str | None
class DomainOptions(pydantic.BaseModel):
site_name: str = ""
site_icon: UploadedImage | None = None
hide_login: bool = False
custom_css: str = ""
single_user: str = ""


@ -27,12 +27,14 @@ if SENTRY_ENABLED:
set_context = sentry_sdk.set_context
set_tag = sentry_sdk.set_tag
start_transaction = sentry_sdk.start_transaction
start_span = sentry_sdk.start_span
else:
configure_scope = noop_context
push_scope = noop_context
set_context = noop
set_tag = noop
start_transaction = noop_context
start_span = noop_context
def set_takahe_app(name: str):


@ -1,5 +1,7 @@
import base64
import json
import logging
from ssl import SSLCertVerificationError, SSLError
from typing import Literal, TypedDict, cast
from urllib.parse import urlparse
@ -17,6 +19,8 @@ from pyld import jsonld
from core.ld import format_ld_date
logger = logging.getLogger(__name__)
class VerificationError(BaseException):
"""
@ -87,9 +91,9 @@ class HttpSignature:
if header_name == "(request-target)":
value = f"{request.method.lower()} {request.path}"
elif header_name == "content-type":
value = request.META["CONTENT_TYPE"]
value = request.headers["content-type"]
elif header_name == "content-length":
value = request.META["CONTENT_LENGTH"]
value = request.headers["content-length"]
else:
value = request.META["HTTP_%s" % header_name.upper().replace("-", "_")]
headers[header_name] = value
@ -102,12 +106,18 @@ class HttpSignature:
name, value = item.split("=", 1)
value = value.strip('"')
bits[name.lower()] = value
signature_details: HttpSignatureDetails = {
"headers": bits["headers"].split(),
"signature": base64.b64decode(bits["signature"]),
"algorithm": bits["algorithm"],
"keyid": bits["keyid"],
}
try:
signature_details: HttpSignatureDetails = {
"headers": bits["headers"].split(),
"signature": base64.b64decode(bits["signature"]),
"algorithm": bits["algorithm"],
"keyid": bits["keyid"],
}
except KeyError as e:
key_names = " ".join(bits.keys())
raise VerificationError(
f"Missing item from details (have: {key_names}, error: {e})"
)
return signature_details
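The hardened `parse_signature` above raises a descriptive error when a required item is missing instead of a bare `KeyError`. A self-contained sketch of that header parsing (simplified: it assumes values contain no quoted commas, and raises `ValueError` where Takahē raises its own `VerificationError`):

```python
import base64

def parse_signature_header(header: str) -> dict:
    """Split an HTTP Signature header into its parts, raising a clear
    error when a required item is missing (simplified sketch)."""
    bits = {}
    for item in header.split(","):
        name, value = item.split("=", 1)
        bits[name.lower().strip()] = value.strip('"')
    try:
        return {
            "headers": bits["headers"].split(),
            "signature": base64.b64decode(bits["signature"]),
            "algorithm": bits["algorithm"],
            "keyid": bits["keyid"],
        }
    except KeyError as e:
        key_names = " ".join(bits.keys())
        raise ValueError(f"Missing item from details (have: {key_names}, error: {e})")

header = (
    'keyId="https://example.com/actor#main-key",'
    'algorithm="rsa-sha256",'
    'headers="(request-target) host date",'
    'signature="' + base64.b64encode(b"fake").decode() + '"'
)
details = parse_signature_header(header)
print(details["algorithm"])  # rsa-sha256
print(details["headers"])    # ['(request-target)', 'host', 'date']
```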
@classmethod
@ -133,7 +143,7 @@ class HttpSignature:
try:
public_key_instance.verify(
signature,
cleartext.encode("ascii"),
cleartext.encode("utf8"),
padding.PKCS1v15(),
hashes.SHA256(),
)
@ -160,7 +170,12 @@ class HttpSignature:
raise VerificationFormatError("No signature header present")
signature_details = cls.parse_signature(request.headers["signature"])
# Reject unknown algorithms
if signature_details["algorithm"] != "rsa-sha256":
# hs2019 is used by some libraries to obfuscate the real algorithm per the spec
# https://datatracker.ietf.org/doc/html/draft-cavage-http-signatures-12
if (
signature_details["algorithm"] != "rsa-sha256"
and signature_details["algorithm"] != "hs2019"
):
raise VerificationFormatError("Unknown signature algorithm")
# Create the signature payload
headers_string = cls.headers_from_request(request, signature_details["headers"])
@ -171,13 +186,13 @@ class HttpSignature:
)
@classmethod
async def signed_request(
def signed_request(
cls,
uri: str,
body: dict | None,
private_key: str,
key_id: str,
content_type: str = "application/json",
content_type: str = "application/activity+json",
method: Literal["get", "post"] = "post",
timeout: TimeoutTypes = settings.SETUP.REMOTE_TIMEOUT,
):
@ -204,7 +219,7 @@ class HttpSignature:
body_bytes = b""
# GET requests get implicit accept headers added
if method == "get":
headers["Accept"] = "application/ld+json"
headers["Accept"] = "application/activity+json,application/ld+json"
# Sign the headers
signed_string = "\n".join(
f"{name.lower()}: {value}" for name, value in headers.items()
@ -217,7 +232,7 @@ class HttpSignature:
),
)
signature = private_key_instance.sign(
signed_string.encode("ascii"),
signed_string.encode("utf8"),
padding.PKCS1v15(),
hashes.SHA256(),
)
@ -235,15 +250,19 @@ class HttpSignature:
# Send the request with all those headers except the pseudo one
del headers["(request-target)"]
async with httpx.AsyncClient(timeout=timeout) as client:
with httpx.Client(timeout=timeout) as client:
try:
response = await client.request(
response = client.request(
method,
uri,
headers=headers,
content=body_bytes,
follow_redirects=method == "get",
)
except SSLError as invalid_cert:
# Not our problem if the other end doesn't have proper SSL
logger.info("Invalid cert on %s %s", uri, invalid_cert)
raise SSLCertVerificationError(invalid_cert) from invalid_cert
except InvalidCodepoint as ex:
# Convert to a more generic error we handle
raise httpx.HTTPError(f"InvalidCodepoint: {str(ex)}") from None
@ -278,6 +297,8 @@ class LDSignature:
Verifies a document
"""
try:
# causing side effects to the original document is bad form
document = document.copy()
# Strip out the signature from the incoming document
signature = document.pop("signature")
# Create the options document
@ -305,7 +326,7 @@ class LDSignature:
hashes.SHA256(),
)
except InvalidSignature:
raise VerificationError("Signature mismatch")
raise VerificationError("LDSignature mismatch")
@classmethod
def create_signature(

View file

@ -1,14 +1,11 @@
import json
from typing import ClassVar
import markdown_it
from django.conf import settings
from django.http import HttpResponse
from django.shortcuts import redirect
from django.templatetags.static import static
from django.utils.decorators import method_decorator
from django.utils.safestring import mark_safe
from django.views.decorators.cache import cache_control
from django.views.generic import TemplateView, View
from django.views.static import serve
@ -21,17 +18,18 @@ from core.models import Config
def homepage(request):
if request.user.is_authenticated:
return Home.as_view()(request)
elif request.domain and request.domain.config_domain.single_user:
return redirect(f"/@{request.domain.config_domain.single_user}/")
else:
return About.as_view()(request)
@method_decorator(cache_page(public_only=True), name="dispatch")
class About(TemplateView):
template_name = "about.html"
def get_context_data(self):
service = TimelineService(self.request.identity)
service = TimelineService(None)
return {
"current_page": "about",
"content": mark_safe(
@ -87,46 +85,6 @@ class RobotsTxt(TemplateView):
}
@method_decorator(cache_control(max_age=60 * 15), name="dispatch")
class AppManifest(StaticContentView):
"""
Serves a PWA manifest file. This is a view as we want to drive some
items from settings.
NOTE: If this view changes to need runtime Config, it should change from
StaticContentView to View, otherwise the settings will only get
picked up during boot time.
"""
content_type = "application/json"
def get_static_content(self) -> str | bytes:
return json.dumps(
{
"$schema": "https://json.schemastore.org/web-manifest-combined.json",
"name": "Takahē",
"short_name": "Takahē",
"start_url": "/",
"display": "standalone",
"background_color": "#26323c",
"theme_color": "#26323c",
"description": "An ActivityPub server",
"icons": [
{
"src": static("img/icon-128.png"),
"sizes": "128x128",
"type": "image/png",
},
{
"src": static("img/icon-1024.png"),
"sizes": "1024x1024",
"type": "image/png",
},
],
}
)
class FlatPage(TemplateView):
"""
Serves a "flat page" from a config option,


@ -5,18 +5,14 @@ FROM ${IMAGE_HOST}:${IMAGE_LABEL}
ENV PYTHONUNBUFFERED=1
COPY requirements.txt requirements.txt
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
libpq5 \
libxslt1.1 \
nginx \
busybox \
&& rm -rf /var/lib/apt/lists/*
COPY requirements.txt requirements.txt
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
netcat \
gcc \
libc6-dev \
@ -24,6 +20,7 @@ RUN apt-get update \
# Required to build lxml on arm64.
libxslt1-dev \
zlib1g-dev \
postgresql-client \
&& python3 -m pip install --no-cache-dir --upgrade -r requirements.txt \
&& apt-get purge -y --auto-remove \
gcc \

View file

@ -15,7 +15,7 @@ x-takahe-common:
TAKAHE_DATABASE_SERVER: "postgres://postgres:insecure_password@db/takahe"
TAKAHE_DEBUG: "true"
TAKAHE_SECRET_KEY: "insecure_secret"
TAKAHE_CSRF_TRUSTED_ORIGINS: '["http://127.0.0.1:8000", "https://127.0.0.1:8000"]'
TAKAHE_CSRF_HOSTS: '["http://127.0.0.1:8000", "https://127.0.0.1:8000"]'
TAKAHE_USE_PROXY_HEADERS: "true"
TAKAHE_EMAIL_BACKEND: "console://console"
TAKAHE_MAIN_DOMAIN: "example.com"
@ -56,10 +56,16 @@ services:
start_period: 15s
ports:
- "8000:8000"
depends_on:
setup:
condition: service_completed_successfully
stator:
<<: *takahe-common
command: ["/takahe/manage.py", "runstator"]
depends_on:
setup:
condition: service_completed_successfully
setup:
<<: *takahe-common

View file

@ -79,6 +79,12 @@ server {
# Unset Authorization and Cookie for security reasons.
proxy_set_header Authorization '';
proxy_set_header Cookie '';
proxy_set_header User-Agent 'takahe/nginx';
proxy_set_header Host $proxy_host;
proxy_set_header X-Forwarded-For '';
proxy_set_header X-Forwarded-Host '';
proxy_set_header X-Forwarded-Server '';
proxy_set_header X-Real-Ip '';
# Stops the local disk from being written to (just forwards data through)
proxy_max_temp_file_size 0;


@ -1,8 +1,9 @@
#!/bin/bash
# Set up cache size and nameserver subs
# Nameservers are taken from /etc/resolv.conf - if the IP contains ":", it's IPv6 and must be enclosed in [] for nginx
CACHE_SIZE="${TAKAHE_NGINX_CACHE_SIZE:-1g}"
NAMESERVER=`cat /etc/resolv.conf | grep "nameserver" | awk '{print $2}' | tr '\n' ' '`
NAMESERVER=`cat /etc/resolv.conf | grep "nameserver" | awk '{print ($2 ~ ":") ? "["$2"]" : $2}' | tr '\n' ' '`
if [ -z "$NAMESERVER" ]; then
NAMESERVER="9.9.9.9 149.112.112.112"
fi


@ -13,7 +13,7 @@ sys.path.insert(0, str(pathlib.Path(__file__).parent / "extensions"))
project = "Takahē"
copyright = "2022, Andrew Godwin"
author = "Andrew Godwin"
author = "Andrew Godwin, Jamie Bliss"
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration


@ -79,7 +79,7 @@ local installation, though.
Direct Installation
~~~~~~~~~~~~~~~~~~~
Takahē requires Python 3.10 or above, so you'll need that first. Clone the repo::
Takahē requires Python 3.11 or above, so you'll need that first. Clone the repo::
git clone https://github.com/jointakahe/takahe/
@ -172,3 +172,37 @@ We use `HTMX <https://htmx.org/>`_ for dynamically loading content, and
`Hyperscript <https://hyperscript.org/>`_ for most interactions rather than raw
JavaScript. If you can accomplish what you need with these tools, please use them
rather than adding JS.
Cutting a release
-----------------
In order to make a release of Takahē, follow these steps:
* Create or update the release document (in ``/docs/releases``) for the
release; major versions get their own document, minor releases get a
subheading in the document for their major release.
* Go through the git commit history since the last release in order to write
a reasonable summary of features.
* Be sure to include the closing paragraphs about contributing and the
docker tag, plus an Upgrade Notes section that at minimum mentions
migrations and whether they're routine or unusual (even if there aren't
any, it's nice to call that out).
* If it's a new doc, make sure you include it in ``docs/releases/index.rst``!
* Update the version number in ``/takahe/__init__.py``
* Update the version number in ``README.md``
* Make a commit containing these changes called ``Releasing 1.23.45``.
* Tag that commit with a tag in the format ``1.23.45``.
* Wait for the GitHub Actions to run and publish the docker images (around 20
minutes as the ARM build is a bit slow)
* Post on the official account announcing the release and linking to the
now-published release notes.


@ -9,7 +9,8 @@ page!
Creator & Main Developer
------------------------
* `Andrew Godwin <https://aeracode.org>`_
* `Andrew Godwin <https://aeracode.org>`_ (Original creator)
* `Jamie Bliss <https://tacobelllabs.net/@astraluma>`_ (Current maintainer)
Core Contributors


@@ -119,9 +119,26 @@ be provided to the containers from the first boot.
``["andrew@aeracode.org"]`` (if you're doing this via shell, be careful
about escaping!)
* If you want to support push notifications, set ``TAKAHE_VAPID_PUBLIC_KEY``
and ``TAKAHE_VAPID_PRIVATE_KEY`` to a valid VAPID keypair (note that if you
ever change these, push notifications will stop working). You can generate
a keypair at `<https://web-push-codelab.glitch.me/>`_.
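As an alternative to the web generator, you can also generate a keypair locally. This is a sketch using the third-party ``cryptography`` package (the variable names and output format are just for illustration; VAPID uses a P-256 keypair encoded as unpadded base64url):

```python
import base64

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as VAPID keys are usually written."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


# VAPID (RFC 8292) uses a P-256 (SECP256R1) elliptic-curve keypair
private_key = ec.generate_private_key(ec.SECP256R1())

# Private key: the raw 32-byte scalar
priv_b64 = b64url(
    private_key.private_numbers().private_value.to_bytes(32, "big")
)

# Public key: the 65-byte uncompressed EC point (0x04 || X || Y)
pub_b64 = b64url(
    private_key.public_key().public_bytes(
        serialization.Encoding.X962,
        serialization.PublicFormat.UncompressedPoint,
    )
)

print("TAKAHE_VAPID_PUBLIC_KEY=" + pub_b64)
print("TAKAHE_VAPID_PRIVATE_KEY=" + priv_b64)
```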
There are some other, optional variables you can tweak once the
system is up and working - see :doc:`tuning` for more.
If you are behind a caching proxy, such as Cloudflare, you may need to update
your CSRF host settings to match. Takahē validates that requests have an
Origin header that matches their Referer header by default, and these services
can break that relationship.
Takahē lets you set this up via the ``TAKAHE_CSRF_HOSTS`` environment variable, which takes
a Python-list-formatted list of additional protocols/domains to allow, with wildcards. It feeds
directly into Django's `CSRF_TRUSTED_ORIGINS <https://docs.djangoproject.com/en/4.2/ref/settings/#csrf-trusted-origins>`_
setting, so see the Django documentation for details on how to use it. Generally,
you'd want to set it to your website's public address; for our server, that would be
``TAKAHE_CSRF_HOSTS='["https://takahe.social"]'``.
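A Python-list-formatted value like this also parses cleanly as JSON, so the shape can be sketched like so (this is an illustration of the value's format, not Takahē's actual settings code, and the domains are placeholders):

```python
import json
import os

# Hypothetical example value with a wildcard entry
os.environ["TAKAHE_CSRF_HOSTS"] = '["https://takahe.social", "https://*.example.com"]'

# The value is a list of origin strings, as Django's
# CSRF_TRUSTED_ORIGINS setting expects
csrf_hosts = json.loads(os.environ["TAKAHE_CSRF_HOSTS"])
print(csrf_hosts)
```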
.. _media_configuration:
@@ -150,6 +167,11 @@ If you omit the keys or the endpoint URL, then Takahē will try to use implicit
authentication for them. The keys, if included, should be urlencoded, as AWS
secret keys commonly contain characters such as ``+``.
With the above examples, Takahē connects to an S3 bucket using **HTTPS**. If
you wish to connect to an S3 bucket using **HTTP** (for example, to connect to
an S3 API endpoint on a private network), replace `s3` in the examples above
with `s3-insecure`.
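For illustration, the two schemes differ only in the URL prefix. The endpoint, bucket, and credentials below are placeholders; the exact URL shape is described in the examples above:

```shell
# HTTPS (the default behaviour):
TAKAHE_MEDIA_BACKEND="s3://access-key:secret-key@s3.example.com/my-bucket"

# Plain HTTP, e.g. an S3-compatible endpoint on a private network:
TAKAHE_MEDIA_BACKEND="s3-insecure://access-key:secret-key@minio.internal:9000/my-bucket"
```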
Your S3 bucket *must* be set to allow publicly readable files, as Takahē will
set all files it uploads to be ``public-read``. We randomise uploaded file
names to prevent enumeration attacks.
@@ -206,6 +228,9 @@ with the password ``my:password``, it would be represented as::
smtp://someone%40example.com:my%3Apassword@smtp.example.com:25/
The username and password can be omitted, with a URL in the form
``smtp://host:port/``, if your mail server is a (properly firewalled!)
unauthenticated relay.
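The percent-encoding in the example above can be produced with any standard urlencoding helper; for instance, in Python (using the example credentials from above):

```python
from urllib.parse import quote

username = "someone@example.com"
password = "my:password"

# safe="" ensures reserved characters like '@' and ':' are
# percent-encoded, so they can't be confused with URL delimiters
smtp_url = "smtp://{}:{}@smtp.example.com:25/".format(
    quote(username, safe=""), quote(password, safe="")
)
print(smtp_url)
# smtp://someone%40example.com:my%3Apassword@smtp.example.com:25/
```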
SendGrid
########


@@ -15,14 +15,15 @@ Client Apps
These apps are known to fully work as far as Takahē's
:doc:`own featureset <features>` allows:
* `Tusky <https://tusky.app/>`_
* `Elk <https://elk.zone/>`_
* `Pinafore <https://pinafore.social/>`_
* `Tuba <https://tuba.geopjr.dev/>`_
These apps have had initial testing and work at a basic level:
* `Ivory <https://tapbots.com/ivory/>`_
* `Phanpy <https://phanpy.social/>`_
Fediverse Servers

docs/releases/0.10.rst

@@ -0,0 +1,98 @@
0.10
====
*0.10.0 Released: 2023/11/12*
*0.10.1 Released: 2023/11/13*
This release is a polish release that mostly focuses on performance, stability
and federation compatibility.
This release's major changes:
* Stator, the background task system, has been significantly reworked to require
smaller indexes, spend less time scheduling, and has had most of its async
nature removed, as this both reduces deadlocks and improves performance in
most situations (the context switching was costing more than the gains from
talking to other servers asynchronously).
Minor changes also include:
* Followers-only mode now works correctly inbound and outbound (though outbound
may need the other server to refresh the profile first).
* Profile pages are no longer shown for remote identities; instead, users are
linked or redirected directly to the remote profile page.
* Inbound migration has been implemented, but is disabled by default as outbound
migration is not yet complete, and we don't want to release a system that
captures users with no outward path. If you *really* want to enable it, set
``TAKAHE_ALLOW_USER_MIGRATION=true`` in your environment.
* Federation compatibility has been improved with several other servers.
* Blocked domains now receive absolutely zero fetches from Takahē; previously,
they were still pinged occasionally to see if they were online.
* SMTP servers that don't require authentication are now supported.
* Python 3.11 is now the minimum version required; this will not affect you at
all if you run Takahē via our docker image, as is recommended.
An automatic remote post pruning system, to shrink the database of old data
that was no longer needed, was in the development version but has been switched
to a set of manual commands as of 0.10.1 - you can read more below or in
:doc:`/tuning`.
If you'd like to help with code, design, or other areas, see
:doc:`/contributing` to see how to get in touch.
You can download images from `Docker Hub <https://hub.docker.com/r/jointakahe/takahe>`_,
or use the image name ``jointakahe/takahe:0.10``.
0.10.1
------
*Released: 2023/11/13*
This is a bugfix and small feature addition release:
* The ``runstator`` command now logs its output to the terminal again
* Two new commands, ``pruneposts`` and ``pruneidentities`` are added, to enable
pruning (deletion of old content) of Posts and Identities respectively.
You can read more about them in :doc:`/tuning`.
* Stator's default concurrency levels have been significantly reduced as it's
now way more efficient at using individual database connections, but as a
result it places way more load on them. You can read more about tuning this
in :doc:`/tuning`.
Upgrade Notes
-------------
Migrations
~~~~~~~~~~
There are new database migrations; they are backwards-compatible, but contain
very significant index changes to all of the main tables that may cause the
PostgreSQL deadlock detector to trigger if you attempt to apply them while your
site is live.
We recommend:
* Temporarily stopping all instances of the webserver and Stator
* Applying the migration (should be less than a few minutes on most installs)
* Restarting the instances of webserver and Stator
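As a sketch, with a Docker Compose deployment whose services are named ``web`` and ``stator`` (both names are assumptions; adjust them to your own setup), the steps above might look like:

```shell
# Stop the webserver and Stator so no queries run during the migration
docker compose stop web stator

# Apply the migrations in a one-off container
docker compose run --rm web python manage.py migrate

# Bring everything back up
docker compose start web stator
```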
Stator
~~~~~~
Stator's new internal architecture allocates a worker thread and a database
connection up to its concurrency value; this means it is a *lot* more efficient
for a given "concurrency" number than the old system and also uses a lot more
database connections. We recommend you reduce your configuration values for
these by 5-10x; if you didn't set them manually, then don't worry, we've
reduced the default values by a similar amount.

docs/releases/0.11.rst

@@ -0,0 +1,54 @@
0.11
====
*Released: 2024-02-05*
This is largely a bugfix and catch up release.
Some highlights:
* Python 3.10 has been dropped. The new minimum Python version is 3.11
* Jamie (`@astraluma@tacobelllabs.net <https://tacobelllabs.net/@astraluma>`_)
has officially joined the project
* If your S3 does not use TLS, you must use ``s3-insecure`` in your
configuration
* Takahē now supports unicode hashtags
* Add a Maximum Media Attachments setting
* Inverted the pruning command exit codes
* Posts are no longer required to have text content
And some interoperability fixes:
* Fixed a bug with GoToSocial
* Attempted to fix follows from Misskey family
* Correctly handle when a federated report doesn't have content
In addition, there are many bugfixes and minor changes, including:
* Several JSON handling improvements
* Post pruning now has a random element to it
* More specific loggers
* Don't make local identities stale
* Don't try to unmute when there's no expiration
* Don't try to WebFinger local users
* Synchronize follow accepting and profile fetching
* Perform some basic domain validity checks
* Correctly reject more operations when the identity is deleted
* Post edit fanouts for likers/boosters
If you'd like to help with code, design, or other areas, see
:doc:`/contributing` to see how to get in touch.
You can download images from `Docker Hub <https://hub.docker.com/r/jointakahe/takahe>`_,
or use the image name ``jointakahe/takahe:0.11``.
Upgrade Notes
-------------
Migrations
~~~~~~~~~~
There are new database migrations; they are backwards-compatible and should
not present any major database load.


@@ -20,7 +20,7 @@ Features:
* Following CSV import and export (Mastodon-compatible format)
* You can also export your followers as a CSV, but this cannot be imported
* User assignment in domain create/edit screen
@@ -52,6 +52,8 @@ There are new database migrations; they are backwards-compatible, so please
apply them before restarting your webservers and stator processes.
Two of the migrations involve adding large indexes and may take some time to
process (on the order of minutes) if you have a large database.
You may wish to bring your site down into
a maintenance mode before applying these to reduce the chance of lock conflicts
slowing things down, or causing request timeouts.

docs/releases/0.9.rst

@@ -0,0 +1,103 @@
0.9
===
*Released: 2023/06/24*
This release is a large overhaul of Takahē that removes all timeline UI elements
in the web interface in favour of apps, while reworking the remaining pages
to be a pleasant profile viewing, post viewing, and settings experience.
We've also started on our path of making individual domains much more
customisable; you can now theme them individually, the Local timeline is now
domain-specific, and domains can be set to serve single user profiles.
This release's major changes:
* The Home, Notifications, Local and Federated timelines have been removed
from the web UI. They still function for apps.
* The ability to like, boost, bookmark and reply to posts has been removed from
the web UI. They still function for apps.
* The web Compose tool has been considerably simplified and relegated to a new
"tools" section; most users should now use an app for composing posts.
* The Follows page is now in settings and is view-only.
* Identity profiles and individual post pages are now considerably simplified
and have no sidebar.
* A Search feature is now available for posts from a single identity on its
profile page; users can turn this on or off in their identity's profile
settings.
* Domains can now have their own site name, site icon, and custom CSS
* Domains can be set to a "single user mode" where they redirect to a user
profile, rather than showing their own homepage.
* Added an Authorized Apps identity settings screen that allows seeing what
  apps you've authorized, revoking app access, and generating your own
  personal API tokens.
* Added a Delete Profile settings screen that allows self-serve identity deletion.
* The logged-in homepage now shows a list of identities to select from as well
as a set of recommended apps to use for timeline interfaces.
* We have totally dropped our alpha-quality SQLite support; it just doesn't have
sufficient full-text-search and JSON operator support, unfortunately.
There are many minor changes to support the new direction; important ones include:
* The dark/light mode toggle is now a User (login) setting, not an Identity setting
* Identity selection is no longer part of a session - now, multiple identity
settings pages can be opened at once.
* The ability for users to add their own custom CSS has been removed, as it
was potentially confusing with our upcoming profile customization work (it
only ever applied to your own session, and with timelines gone, it no longer
makes much sense!)
* API pagination has been further improved, specifically for Elk compatibility
* Server admins can now add a "Report a Problem" footer link with either
hosted content or an external link.
This is a large change in direction, and we hope that it will match the way
that people use Takahē and its multi-domain support far better. For more
discussion and rationale on the change, see `Andrew's blog post about it <https://aeracode.org/2023/04/29/refactor-treat/>`_.
Our future plans include stability and polish in order to get us to a 1.0 release,
as well as allowing users to customize their profiles more, account import
support, and protocol enhancements like automatic fetching of replies for
non-local posts. If you're curious about what we're up to, or have an idea,
we're very happy to chat about it in our Discord!
If you'd like to help with code, design, or other areas, see
:doc:`/contributing` to see how to get in touch.
You can download images from `Docker Hub <https://hub.docker.com/r/jointakahe/takahe>`_,
or use the image name ``jointakahe/takahe:0.9``.
Upgrade Notes
-------------
Despite the large refactor to the UI, Takahē's internals are not significantly
changed, and this upgrade is operationally like any other minor release.
Migrations
~~~~~~~~~~
There are new database migrations; they are backwards-compatible, so please
apply them before restarting your webservers and stator processes.
One of the migrations involves adding a large search index for opt-in post
searching, and may take some time to
process (on the order of minutes) if you have a large database.
You may wish to bring your site down into
a maintenance mode before applying it to reduce the chance of lock conflicts
slowing things down, or causing request timeouts.


@@ -7,6 +7,9 @@ Versions
.. toctree::
:maxdepth: 1
0.11
0.10
0.9
0.8
0.7
0.6

docs/releases/next.rst

@@ -0,0 +1,15 @@
Upgrade Notes
-------------
VAPID keys and Push notifications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Takahē now supports push notifications if you supply a valid VAPID keypair as
the ``TAKAHE_VAPID_PUBLIC_KEY`` and ``TAKAHE_VAPID_PRIVATE_KEY`` environment
variables. You can generate a keypair at `<https://web-push-codelab.glitch.me/>`_.
Note that users of apps may need to sign out of and back into their accounts for
the app to notice that it can now do push notifications. Some apps, like Elk,
may cache the fact your server didn't support it for a while.
