Compare commits

...

775 commits
0.7.0 ... main

Author SHA1 Message Date
trinity-1686a 304fb740d8 Merge pull request 'Support for storing media on S3' (#1149) from lx/Plume:s3 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1149
Reviewed-by: trinity-1686a <trinity-1686a@noreply@joinplu.me>
2023-06-21 18:18:37 +00:00
Alex Auvolat 61e65a55ad improve formatting 2023-05-15 12:35:39 +02:00
Alex Auvolat 3f93212424 fix plume-cli 2023-05-12 18:25:19 +02:00
Alex Auvolat 20d77c22df try (and fail) to build with Nix 2023-05-12 17:20:45 +02:00
Alex Auvolat 24d3b289da Properly handle Content-Type 2023-05-12 16:11:29 +02:00
Alex Auvolat 20fa2cacf4 Store replicated remote media on S3 if available 2023-05-12 15:54:44 +02:00
Alex Auvolat 4e67eb8317 Uniformize media path/URL handling and implement direct download from S3 backend 2023-05-12 15:40:36 +02:00
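Note: the S3 series in this range centers on handling local and S3-backed media through one code path. A minimal plain-Rust sketch of that idea (hypothetical types, not Plume's actual media handling):

```rust
// Hypothetical sketch: serve media from either local storage or an S3 bucket
// behind one interface, so callers build URLs the same way in both cases.
enum MediaBackend {
    Local { base_url: String },
    S3 { public_url: String },
}

struct Media {
    path: String, // e.g. "static/media/abc.png"
}

impl MediaBackend {
    fn url_for(&self, media: &Media) -> String {
        match self {
            MediaBackend::Local { base_url } => format!("{}/{}", base_url, media.path),
            MediaBackend::S3 { public_url } => format!("{}/{}", public_url, media.path),
        }
    }
}

fn main() {
    let media = Media { path: "static/media/abc.png".into() };
    let local = MediaBackend::Local { base_url: "https://blog.example".into() };
    let s3 = MediaBackend::S3 { public_url: "https://media.example".into() };
    println!("{}", local.url_for(&media));
    println!("{}", s3.url_for(&media));
}
```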
Alex Auvolat 24c008b0de Add support for uploading media files to S3 2023-05-12 13:24:36 +02:00
Alex Auvolat 1cb9459a23 Update S3 features and make S3 support optional 2023-05-12 12:35:11 +02:00
Alex Auvolat 10e06737cf Update rust-s3 dependency and move Cargo.toml dependencies 2023-05-12 12:12:32 +02:00
Alex Auvolat 30a3cec87e Add Nix development shell 2023-05-12 12:12:09 +02:00
trinity-1686a 54af93d8ff initial s3 support
probably incomplete
2023-05-12 11:30:31 +02:00
KitaitiMakoto 9425b44d08 Merge pull request 'FIX: #1145 Fix SCSS errors' (#1146) from scss-errors into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1146
2023-04-16 07:13:55 +00:00
Kitaiti Makoto 487f296db5 Fix Clippy warnings 2023-04-16 16:12:09 +09:00
Kitaiti Makoto 8bdd481e0d Fix SCSS errors 2023-04-16 16:10:54 +09:00
KitaitiMakoto 19f18421bc Merge pull request 'delete comments properly when deleting users' (#1144) from fix-delete-user into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1144
Reviewed-by: KitaitiMakoto <kitaitimakoto@noreply@joinplu.me>
2023-04-16 06:59:53 +00:00
trinity-1686a e1777e9071 delete comments properly when deleting users 2023-04-09 12:54:29 +02:00
KitaitiMakoto 613ccbcd94 Merge pull request 'Add user search form to admin panel' (#1143) from moderation-improvement into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1143
2023-03-21 10:29:33 +00:00
Kitaiti Makoto b9a09a2511 Follow pagination user list page change 2023-03-21 19:15:22 +09:00
Kitaiti Makoto 213628e400 Don't use LIKE query when username is empty for user search 2023-03-21 19:07:09 +09:00
Kitaiti Makoto d6bb2bfb72 Use unwrap_or_default() instead of unwrap_or("") 2023-03-21 18:49:52 +09:00
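Note: "Use unwrap_or_default() instead of unwrap_or("")" is a small Clippy-style cleanup; a standalone illustration (not Plume's actual code):

```rust
fn main() {
    let query: Option<&str> = None;
    // Clippy prefers unwrap_or_default() when the fallback is the type's
    // Default value; for &str that default is the empty string.
    assert_eq!(query.unwrap_or(""), "");
    assert_eq!(query.unwrap_or_default(), "");
}
```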
Kitaiti Makoto 33bd290679 Use DbConn in model tests 2023-03-20 01:21:39 +09:00
Kitaiti Makoto 85ab5393fd Set style to user search form 2023-03-20 01:03:09 +09:00
Kitaiti Makoto 98c73bb6df Add search form to user list page 2023-03-20 01:00:17 +09:00
Kitaiti Makoto 3e9d9a459f Enable admin_search_user route 2023-03-20 01:00:02 +09:00
Kitaiti Makoto a394c3f210 Define admin_search_user route 2023-03-20 00:59:46 +09:00
Kitaiti Makoto a1a19e091a Define User::search_local_by_name() method 2023-03-20 00:59:16 +09:00
Kitaiti Makoto ec030d500d Exclude instance user when counting local users 2023-03-20 00:19:42 +09:00
Kitaiti Makoto cfa74f84e7 Remove instance users from user list to show 2023-03-06 19:30:18 +09:00
trinity-1686a 97cbe7f446 Merge pull request 'allow timeline manipulation from plm' (#1113) from timeline-cli into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1113
Reviewed-by: KitaitiMakoto <kitaitimakoto@noreply@joinplu.me>
2023-02-26 15:56:16 +00:00
trinity-1686a 7e4d081027 Merge branch 'main' into timeline-cli 2023-02-26 16:48:40 +01:00
KitaitiMakoto 1e5ae92135 Merge pull request 'Update crates' (#1138) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1138
2023-01-10 15:53:07 +00:00
Kitaiti Makoto 036ee6fac4 Merge remote-tracking branch 'origin/main' into update-crates 2023-01-11 00:42:34 +09:00
dependabot[bot] 6028295748 Bump glob from 0.3.0 to 0.3.1
Bumps [glob](https://github.com/rust-lang/glob) from 0.3.0 to 0.3.1.
- [Release notes](https://github.com/rust-lang/glob/releases)
- [Commits](https://github.com/rust-lang/glob/compare/0.3.0...0.3.1)

---
updated-dependencies:
- dependency-name: glob
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-11 00:41:35 +09:00
dependabot[bot] aa4cfd374d Bump atom_syndication from 0.11.0 to 0.12.0
Bumps [atom_syndication](https://github.com/rust-syndication/atom) from 0.11.0 to 0.12.0.
- [Release notes](https://github.com/rust-syndication/atom/releases)
- [Changelog](https://github.com/rust-syndication/atom/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-syndication/atom/compare/0.11.0...0.12.0)

---
updated-dependencies:
- dependency-name: atom_syndication
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-11 00:41:35 +09:00
dependabot[bot] 3303a4af84 Bump ructe from 0.14.2 to 0.15.0
Bumps [ructe](https://github.com/kaj/ructe) from 0.14.2 to 0.15.0.
- [Release notes](https://github.com/kaj/ructe/releases)
- [Changelog](https://github.com/kaj/ructe/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/ructe/compare/v0.14.2...v0.15.0)

---
updated-dependencies:
- dependency-name: ructe
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Change blog title specification

Revert "Change blog title specification"

This reverts commit a362b2474fa32b0e937f59acb9edb68d462c0719.
2023-01-11 00:41:22 +09:00
KitaitiMakoto 37a136787b Merge pull request 'Update crates' (#1136) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1136
2023-01-08 20:07:16 +00:00
Kitaiti Makoto 300ff37694 Merge remote-tracking branch 'github/dependabot/cargo/rsass-0.26.0' into update-crates 2023-01-09 04:50:04 +09:00
dependabot[bot] c1d9d39dc1
Bump rsass from 0.25.2 to 0.26.0
Bumps [rsass](https://github.com/kaj/rsass) from 0.25.2 to 0.26.0.
- [Release notes](https://github.com/kaj/rsass/releases)
- [Changelog](https://github.com/kaj/rsass/blob/main/CHANGELOG.md)
- [Commits](https://github.com/kaj/rsass/compare/v0.25.2...v0.26.0)

---
updated-dependencies:
- dependency-name: rsass
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-06 19:06:33 +00:00
dependabot[bot] 93d6ee04d4
Bump ructe from 0.14.2 to 0.15.0
Bumps [ructe](https://github.com/kaj/ructe) from 0.14.2 to 0.15.0.
- [Release notes](https://github.com/kaj/ructe/releases)
- [Changelog](https://github.com/kaj/ructe/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/ructe/compare/v0.14.2...v0.15.0)

---
updated-dependencies:
- dependency-name: ructe
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-06 19:06:09 +00:00
KitaitiMakoto ae7bf2e132 Merge pull request 'Update crates' (#1134) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1134
2023-01-06 14:57:17 +00:00
Kitaiti Makoto 0020242571 Format 2023-01-06 23:40:05 +09:00
Kitaiti Makoto 4f796e788c Clippy 2023-01-06 23:38:38 +09:00
Kitaiti Makoto 3d192c1179 Merge remote-tracking branches 'github/dependabot/cargo/whatlang-0.16.2', 'github/dependabot/cargo/ldap3-0.11.1' and 'github/dependabot/cargo/futures-0.3.25' into update-crates 2023-01-06 23:12:47 +09:00
dependabot[bot] 2f8d188d59
Bump futures from 0.3.21 to 0.3.25
Bumps [futures](https://github.com/rust-lang/futures-rs) from 0.3.21 to 0.3.25.
- [Release notes](https://github.com/rust-lang/futures-rs/releases)
- [Changelog](https://github.com/rust-lang/futures-rs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/futures-rs/compare/0.3.21...0.3.25)

---
updated-dependencies:
- dependency-name: futures
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-05 19:07:12 +00:00
dependabot[bot] 19766662f1
Bump ldap3 from 0.10.6 to 0.11.1
Bumps [ldap3](https://github.com/inejge/ldap3) from 0.10.6 to 0.11.1.
- [Release notes](https://github.com/inejge/ldap3/releases)
- [Changelog](https://github.com/inejge/ldap3/blob/master/CHANGELOG.md)
- [Commits](https://github.com/inejge/ldap3/compare/v0.10.6...v0.11.1)

---
updated-dependencies:
- dependency-name: ldap3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-05 19:06:52 +00:00
dependabot[bot] 301aad3f73
Bump whatlang from 0.16.0 to 0.16.2
Bumps [whatlang](https://github.com/greyblake/whatlang-rs) from 0.16.0 to 0.16.2.
- [Release notes](https://github.com/greyblake/whatlang-rs/releases)
- [Changelog](https://github.com/greyblake/whatlang-rs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/greyblake/whatlang-rs/compare/v0.16.0...v0.16.2)

---
updated-dependencies:
- dependency-name: whatlang
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-05 19:06:39 +00:00
KitaitiMakoto 92a8f8aa4c Merge pull request 'Update crates' (#1131) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1131
2023-01-05 16:35:11 +00:00
Kitaiti Makoto 0c856a5252 Update Ructe 2023-01-06 01:11:46 +09:00
Kitaiti Makoto 2df6138ff1 Fix caddy run option 2023-01-06 00:59:00 +09:00
Kitaiti Makoto b2942f3f47 Update crates 2023-01-06 00:42:29 +09:00
Kitaiti Makoto 94f20c8fc2 Update rustfmt and clippy on CI 2023-01-06 00:42:20 +09:00
Kitaiti Makoto 5d48b93c8b Update Docker image for CI 2023-01-06 00:27:57 +09:00
Kitaiti Makoto bbf2e00920 Install updated crates 2023-01-06 00:04:53 +09:00
Kitaiti Makoto c97361f5f4 Update crates 2023-01-06 00:04:45 +09:00
KitaitiMakoto 7c799e8abf Merge pull request 'Add changelogs' (#1130) from changelog into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1130
2023-01-05 14:47:52 +00:00
Kitaiti Makoto d196e1dbd0 Add changelogs [skip ci] 2023-01-05 23:46:50 +09:00
Kitaiti Makoto 1679315322 Clippy 2023-01-05 05:08:53 +09:00
KitaitiMakoto dd3a5f4a5b Merge pull request 'Fixes #1128 Some ActivityPub related fixes' (#1129) from ap-fixes into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1129
2023-01-04 19:27:50 +00:00
Kitaiti Makoto 3580fb04fa Format 2023-01-05 04:14:25 +09:00
Kitaiti Makoto 699fdc30d9 Make preferredUsername of blogs valid 2023-01-05 04:13:52 +09:00
Kitaiti Makoto 704e9aa47f Make blogs.fqn valid 2023-01-05 04:13:40 +09:00
Kitaiti Makoto d741238ccb Revert "Rejected illegal characters from blog name"
This reverts commit 9776374d17.
2023-01-05 03:11:20 +09:00
Kitaiti Makoto 9776374d17 Rejected illegal characters from blog name 2023-01-05 02:49:48 +09:00
Kitaiti Makoto 2d10ddb9fa Clippy 2023-01-05 02:33:29 +09:00
Kitaiti Makoto e746a0b03f Add error log for invalid preferredUsername 2023-01-05 02:25:58 +09:00
Kitaiti Makoto 85cacf4239 Format 2023-01-05 02:23:50 +09:00
Kitaiti Makoto f138ae6ed9 Allow empty avatar for remote users 2023-01-05 02:20:57 +09:00
Kitaiti Makoto 399af4004a Build CustomPerson from source string at once 2023-01-05 02:20:25 +09:00
Kitaiti Makoto d36f13e984 Add test for deserializing CustomGroup 2023-01-05 02:19:10 +09:00
Kitaiti Makoto 9a3699160d Fix test name 2023-01-05 01:16:13 +09:00
KitaitiMakoto 4103e7513d Merge pull request 'Fix #966: Don't retrieve user info from blocked instances' (#1120) from block-instance-user into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1120
2023-01-04 16:11:34 +00:00
KitaitiMakoto ed9970b102 Merge pull request 'Percent encode to create ActivityPub URI' (#1127) from blog-slug into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1127
2023-01-04 09:55:34 +00:00
Kitaiti Makoto afa875366e Percent encode to create ActivityPub URI 2023-01-04 18:54:38 +09:00
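Note: a minimal sketch of percent-encoding a slug before building an ActivityPub ID, assuming the percent-encoding crate; the character set and URL layout used here are illustrative assumptions, not necessarily what Plume does:

```rust
use percent_encoding::{utf8_percent_encode, NON_ALPHANUMERIC};

fn main() {
    // Escape the blog slug before embedding it in an ActivityPub ID so that
    // spaces and non-ASCII characters still yield a valid URI.
    let slug = "Mon blog à moi";
    let encoded = utf8_percent_encode(slug, NON_ALPHANUMERIC).to_string();
    let ap_id = format!("https://plume.example/~/{}/", encoded);
    println!("{}", ap_id);
}
```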
KitaitiMakoto 9696f04c64 Merge pull request 'Fix #1125 Fix a bug about blog title and AP URL' (#1126) from blog-slug into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1126
2023-01-04 09:02:06 +00:00
Kitaiti Makoto 40e1a1fc2c Don't encode whole AP ID 2023-01-04 17:49:16 +09:00
Kitaiti Makoto ee1e553460 Fix a bug about blog title and AP URL 2023-01-03 22:09:43 +09:00
KitaitiMakoto d20ce6dd0b Merge pull request 'Downgrade Docker image to run on CI' (#1124) from downgrade-ci-docker into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1124
2023-01-03 10:16:01 +00:00
Kitaiti Makoto 72f7909a42 Downgrade Docker image to run on CI 2023-01-03 19:15:32 +09:00
KitaitiMakoto 4e1fb64868 Merge pull request 'Update CI environment' (#1123) from clippy-on-ci into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1123
2023-01-03 09:30:54 +00:00
KitaitiMakoto 85c1bfa300 Merge pull request 'Fix #1121: Check email block list when email sign-up' (#1122) from block-on-email-signup into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1122
2023-01-03 09:11:20 +00:00
Kitaiti Makoto 172c78c41d Update Dockerfile for CI 2023-01-03 18:10:18 +09:00
Kitaiti Makoto 3b08d5b485 Update .dockerignore 2023-01-03 18:05:25 +09:00
Kitaiti Makoto 832479a706 Extract EmailSingup::ensure_email_not_blocked() 2023-01-03 17:49:15 +09:00
Kitaiti Makoto 3b3148fa6b Clippy 2023-01-03 17:42:26 +09:00
Kitaiti Makoto b38d55f486 Check email block list when email sign-up 2023-01-03 17:41:25 +09:00
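Note: a minimal sketch of the e-mail block-list check referenced here; the list format and wildcard handling are assumptions, not Plume's actual implementation:

```rust
// Hypothetical block-list check: reject a sign-up when the address matches an
// exact entry or a "*@domain" wildcard. Plume's real list format may differ.
fn email_blocked(blocklist: &[&str], email: &str) -> bool {
    let domain = email.rsplit('@').next().unwrap_or("");
    blocklist.iter().any(|blocked| {
        *blocked == email || blocked.strip_prefix("*@") == Some(domain)
    })
}

fn main() {
    let blocklist = ["spammer@mail.example", "*@spam.example"];
    assert!(email_blocked(&blocklist, "spammer@mail.example"));
    assert!(email_blocked(&blocklist, "anyone@spam.example"));
    assert!(!email_blocked(&blocklist, "alice@ok.example"));
}
```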
Kitaiti Makoto 2804a490ed Don't retrieve user info from blocked instances 2023-01-03 03:23:50 +09:00
KitaitiMakoto 8c098def61 Merge pull request 'Update Rust' (#1119) from update-rust into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1119
2023-01-02 18:22:29 +00:00
Kitaiti Makoto e10ddb50c0 Ignore sum doc tests 2023-01-03 03:22:05 +09:00
Kitaiti Makoto 4df2c3e6f6 Clippy 2023-01-03 02:53:12 +09:00
Kitaiti Makoto 2f53fc78b6 Remove unnecessary trick 2023-01-03 02:41:51 +09:00
Kitaiti Makoto fded87654d Update Rust 2023-01-03 02:34:28 +09:00
Kitaiti Makoto 08cd777f81 Update pear_codegen 2023-01-03 02:34:22 +09:00
KitaitiMakoto 96b88353c5 Merge pull request 'docker-management' (#1118) from docker-management into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1118
2023-01-02 17:31:36 +00:00
Kitaiti Makoto 302026feb9 Update Rust on building Docker image [skip ci] 2023-01-03 02:30:43 +09:00
Kitaiti Makoto ba6d322da7 Set wasm-opt = false 2023-01-03 02:29:48 +09:00
KitaitiMakoto 488563e9c1 Merge pull request 'Update GitHub actions' (#1117) from docker-management into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1117
2023-01-02 16:13:28 +00:00
Kitaiti Makoto 130bb4c102 Update GitHub actions 2023-01-03 01:10:11 +09:00
KitaitiMakoto 9368aebe70 Merge pull request 'Fix #924 Update rocket_csrf' (#1116) from update-rocket_csrf into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1116
2023-01-02 15:29:34 +00:00
Kitaiti Makoto ca2843822e Install rocket_csrf 0.1.2 2023-01-03 00:27:05 +09:00
Kitaiti Makoto bd91b4a346 Update rocket_csrf to 0.1.2 2023-01-03 00:26:31 +09:00
trinity-1686a 35b951967d add a few help messages to the cli 2023-01-01 11:03:50 +01:00
KitaitiMakoto 63d2cf91e9 Merge pull request 'Update crates' (#1115) from update-rocket_csrf into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1115
2023-01-01 00:10:29 +00:00
Kitaiti Makoto 263cf9e04f Stick activitystreams for all packages 2023-01-01 08:52:08 +09:00
Kitaiti Makoto 22ebecba67 Stick serde for all packages 2023-01-01 08:11:14 +09:00
Kitaiti Makoto 903b48ed12 Stick activitystreams version 2023-01-01 07:54:08 +09:00
Kitaiti Makoto a550291c85 Stick activitystreams version 2023-01-01 07:41:54 +09:00
Kitaiti Makoto 47394fc620 Stick serde 2023-01-01 07:24:32 +09:00
Kitaiti Makoto b180089b1b Update ldap3 2023-01-01 07:10:31 +09:00
Kitaiti Makoto a275aa5965 Stick serde version 2023-01-01 07:10:09 +09:00
Kitaiti Makoto 87edb2486c Update openssl 2023-01-01 06:36:30 +09:00
Kitaiti Makoto 10617f3144 Install rocket_csrf 2023-01-01 06:23:25 +09:00
Kitaiti Makoto 6654ad28b7 Update rocket_csrf to 0.1.1 2023-01-01 06:23:18 +09:00
trinity-1686a 771d4325c2 add plm command for list management 2022-12-17 17:51:51 +01:00
trinity-1686a 1536a6d3f3 allow timeline manipulation from plm 2022-12-16 22:51:14 +01:00
KitaitiMakoto 620726cc25 Merge pull request 'Update crates' (#1107) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1107
2022-06-16 09:12:07 +00:00
Kitaiti Makoto 0eef7c0b89 Merge remote-tracking branch 'github/dependabot/cargo/web-sys-0.3.58' into update-crates 2022-06-16 15:58:57 +09:00
dependabot[bot] 321e40ea3f
Bump web-sys from 0.3.57 to 0.3.58
Bumps [web-sys](https://github.com/rustwasm/wasm-bindgen) from 0.3.57 to 0.3.58.
- [Release notes](https://github.com/rustwasm/wasm-bindgen/releases)
- [Changelog](https://github.com/rustwasm/wasm-bindgen/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustwasm/wasm-bindgen/commits)

---
updated-dependencies:
- dependency-name: web-sys
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-15 19:27:28 +00:00
dependabot[bot] a218b4ea4f
Bump reqwest from 0.11.10 to 0.11.11
Bumps [reqwest](https://github.com/seanmonstar/reqwest) from 0.11.10 to 0.11.11.
- [Release notes](https://github.com/seanmonstar/reqwest/releases)
- [Changelog](https://github.com/seanmonstar/reqwest/blob/master/CHANGELOG.md)
- [Commits](https://github.com/seanmonstar/reqwest/compare/v0.11.10...v0.11.11)

---
updated-dependencies:
- dependency-name: reqwest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-15 19:26:39 +00:00
KitaitiMakoto 9613ccd0c3 Merge pull request 'Fix Cargo.toml' (#1106) from fix-cargo into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1106
2022-06-15 04:52:21 +00:00
Kitaiti Makoto 9493c1ad06 Fix Cargo.toml 2022-06-15 13:51:30 +09:00
KitaitiMakoto e92ac1a13f Merge pull request 'Update crates' (#1105) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1105
2022-06-15 04:47:53 +00:00
Kitaiti Makoto 1517b4d91e Merge remote-tracking branch 'github/dependabot/cargo/js-sys-0.3.58' into update-crates 2022-06-15 13:46:55 +09:00
dependabot[bot] 38cc4c043d
Bump js-sys from 0.3.57 to 0.3.58
Bumps [js-sys](https://github.com/rustwasm/wasm-bindgen) from 0.3.57 to 0.3.58.
- [Release notes](https://github.com/rustwasm/wasm-bindgen/releases)
- [Changelog](https://github.com/rustwasm/wasm-bindgen/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustwasm/wasm-bindgen/commits)

---
updated-dependencies:
- dependency-name: js-sys
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-14 19:21:44 +00:00
dependabot[bot] 05c1d727dc
Bump wasm-bindgen from 0.2.80 to 0.2.81
Bumps [wasm-bindgen](https://github.com/rustwasm/wasm-bindgen) from 0.2.80 to 0.2.81.
- [Release notes](https://github.com/rustwasm/wasm-bindgen/releases)
- [Changelog](https://github.com/rustwasm/wasm-bindgen/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustwasm/wasm-bindgen/commits)

---
updated-dependencies:
- dependency-name: wasm-bindgen
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-14 19:21:16 +00:00
KitaitiMakoto 84645c7ed9 Merge pull request 'Update crates' (#1103) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1103
2022-06-12 11:54:55 +00:00
KitaitiMakoto 7c505bde7f Merge pull request 'Blog's header buttons margin fix in RTL' (#1093) from mskf1383/Plume:main into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1093
2022-06-12 11:30:08 +00:00
Kitaiti Makoto 9f543f1b6b Merge remote-tracking branch 'github/dependabot/cargo/flume-0.10.13' into update-crates 2022-06-12 20:26:19 +09:00
Kitaiti Makoto 0f7b882749 Merge remote-tracking branch 'github/dependabot/cargo/tracing-0.1.35' into update-crates 2022-06-12 20:25:10 +09:00
Kitaiti Makoto f9f4375a40 Merge remote-tracking branch 'github/dependabot/cargo/tokio-1.19.2' into update-crates 2022-06-12 20:24:58 +09:00
dependabot[bot] 12c2848cc7
Bump flume from 0.10.12 to 0.10.13
Bumps [flume](https://github.com/zesterer/flume) from 0.10.12 to 0.10.13.
- [Release notes](https://github.com/zesterer/flume/releases)
- [Changelog](https://github.com/zesterer/flume/blob/master/CHANGELOG.md)
- [Commits](https://github.com/zesterer/flume/commits)

---
updated-dependencies:
- dependency-name: flume
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-10 19:31:34 +00:00
dependabot[bot] 4cfb3e2494
Bump tracing from 0.1.34 to 0.1.35
Bumps [tracing](https://github.com/tokio-rs/tracing) from 0.1.34 to 0.1.35.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-0.1.34...tracing-0.1.35)

---
updated-dependencies:
- dependency-name: tracing
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-08 19:30:58 +00:00
dependabot[bot] 090b0a6f0d
Bump tokio from 1.18.2 to 1.19.2
Bumps [tokio](https://github.com/tokio-rs/tokio) from 1.18.2 to 1.19.2.
- [Release notes](https://github.com/tokio-rs/tokio/releases)
- [Commits](https://github.com/tokio-rs/tokio/commits)

---
updated-dependencies:
- dependency-name: tokio
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-06 19:47:34 +00:00
MohammadSaleh Kamyab 4502b77094 Failsafe 2022-06-06 05:39:21 +00:00
MohammadSaleh Kamyab 8f5a86206a Merge branch 'main' into main 2022-06-01 09:26:34 +00:00
dependabot[bot] 340157f80d
Bump scheduled-thread-pool from 0.2.5 to 0.2.6
Bumps [scheduled-thread-pool](https://github.com/sfackler/scheduled-thread-pool) from 0.2.5 to 0.2.6.
- [Release notes](https://github.com/sfackler/scheduled-thread-pool/releases)
- [Commits](https://github.com/sfackler/scheduled-thread-pool/compare/v0.2.5...v0.2.6)

---
updated-dependencies:
- dependency-name: scheduled-thread-pool
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-30 19:35:18 +00:00
KitaitiMakoto 16b10695df Merge pull request 'Bump rocket_contrib from 0.4.10 to 0.4.11' (#1101) from rocket_contrib-0.4.11 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1101
2022-05-27 02:08:27 +00:00
Kitaiti Makoto b8eb631aa3 Merge remote-tracking branch 'origin/main' into rocket_contrib-0.4.11 2022-05-27 11:07:59 +09:00
KitaitiMakoto 5c9094fede Merge pull request 'Bump rocket from 0.4.10 to 0.4.11' (#1100) from rocket-0.4.11 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1100
2022-05-27 02:03:58 +00:00
dependabot[bot] 4e2ca515ce
Bump rocket_contrib from 0.4.10 to 0.4.11
Bumps [rocket_contrib](https://github.com/SergioBenitez/Rocket) from 0.4.10 to 0.4.11.
- [Release notes](https://github.com/SergioBenitez/Rocket/releases)
- [Changelog](https://github.com/SergioBenitez/Rocket/blob/master/CHANGELOG.md)
- [Commits](https://github.com/SergioBenitez/Rocket/commits)

---
updated-dependencies:
- dependency-name: rocket_contrib
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-26 19:33:42 +00:00
dependabot[bot] eccfbd3fbc
Bump rocket from 0.4.10 to 0.4.11
Bumps [rocket](https://github.com/SergioBenitez/Rocket) from 0.4.10 to 0.4.11.
- [Release notes](https://github.com/SergioBenitez/Rocket/releases)
- [Changelog](https://github.com/SergioBenitez/Rocket/blob/master/CHANGELOG.md)
- [Commits](https://github.com/SergioBenitez/Rocket/commits)

---
updated-dependencies:
- dependency-name: rocket
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-26 19:33:24 +00:00
KitaitiMakoto 8408342b5d Merge pull request 'Bump once_cell from 1.11.0 to 1.12.0' (#1099) from once_cell-1.12.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1099
2022-05-24 05:48:38 +00:00
dependabot[bot] c47921bb25
Bump once_cell from 1.11.0 to 1.12.0
Bumps [once_cell](https://github.com/matklad/once_cell) from 1.11.0 to 1.12.0.
- [Release notes](https://github.com/matklad/once_cell/releases)
- [Changelog](https://github.com/matklad/once_cell/blob/master/CHANGELOG.md)
- [Commits](https://github.com/matklad/once_cell/compare/v1.11.0...v1.12.0)

---
updated-dependencies:
- dependency-name: once_cell
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-23 19:24:07 +00:00
KitaitiMakoto 03f470f04c Merge pull request 'Bump regex-syntax from 0.6.25 to 0.6.26' (#1096) from regex-syntax-0.6.26 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1096
2022-05-21 06:35:34 +00:00
KitaitiMakoto 5770c3b85b Merge branch 'main' into regex-syntax-0.6.26 2022-05-21 06:35:19 +00:00
KitaitiMakoto c92f46b2c9 Merge pull request 'Bump once_cell from 1.10.0 to 1.11.0' (#1095) from once_cell-1.11.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1095
2022-05-21 06:34:41 +00:00
dependabot[bot] 69eba69528
Bump regex-syntax from 0.6.25 to 0.6.26
Bumps [regex-syntax](https://github.com/rust-lang/regex) from 0.6.25 to 0.6.26.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/commits)

---
updated-dependencies:
- dependency-name: regex-syntax
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-20 19:26:36 +00:00
dependabot[bot] c302d842e0
Bump once_cell from 1.10.0 to 1.11.0
Bumps [once_cell](https://github.com/matklad/once_cell) from 1.10.0 to 1.11.0.
- [Release notes](https://github.com/matklad/once_cell/releases)
- [Changelog](https://github.com/matklad/once_cell/blob/master/CHANGELOG.md)
- [Commits](https://github.com/matklad/once_cell/compare/v1.10.0...v1.11.0)

---
updated-dependencies:
- dependency-name: once_cell
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-19 19:36:52 +00:00
KitaitiMakoto f660220495 Merge pull request 'Fix blog slug' (#1094) from fix-blog-slug into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1094
2022-05-19 17:29:34 +00:00
KitaitiMakoto 155df7bdf0 Merge branch 'main' into fix-blog-slug 2022-05-19 17:28:56 +00:00
Kitaiti Makoto 29055d1957 Follow clippy 2022-05-20 02:13:23 +09:00
Kitaiti Makoto d6ee49b880 Update Cargo.lock 2022-05-20 01:55:12 +09:00
Kitaiti Makoto 248ed265c4 Remove unused heck from dependencies 2022-05-20 01:53:16 +09:00
Kitaiti Makoto a1f958ee7a Remove unused import 2022-05-20 01:52:26 +09:00
Kitaiti Makoto abf352b957 Remove unused function 2022-05-20 01:51:53 +09:00
Kitaiti Makoto 393f8e5e0c Use Blog::slug() to determine blog's slug 2022-05-20 01:51:16 +09:00
Kitaiti Makoto 4dfe300ee3 Define Blog::slug() 2022-05-20 01:50:51 +09:00
Kitaiti Makoto 65829094c9 Add test for blog slug validation 2022-05-20 01:40:51 +09:00
MohammadSaleh Kamyab d5c3e6d6f0 Blog's header buttons margin fix in RTL 2022-05-19 15:47:56 +00:00
KitaitiMakoto d702dd2fae Merge pull request 'Bidirectional support for user page header' (#1092) from mskf1383/Plume:main into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1092
2022-05-19 15:38:36 +00:00
MohammadSaleh Kamyab 0d855823c9 Bidirectional support for user page header 2022-05-19 15:26:44 +00:00
KitaitiMakoto 2dd33769d4 Merge pull request 'Bump rsass from 0.24.0 to 0.25.0' (#1091) from rsass-0.25.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1091
2022-05-19 01:29:12 +00:00
dependabot[bot] 4ea9f6ecf1
Bump rsass from 0.24.0 to 0.25.0
Bumps [rsass](https://github.com/kaj/rsass) from 0.24.0 to 0.25.0.
- [Release notes](https://github.com/kaj/rsass/releases)
- [Changelog](https://github.com/kaj/rsass/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/rsass/compare/v0.24.0...v0.25.0)

---
updated-dependencies:
- dependency-name: rsass
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-18 19:31:26 +00:00
KitaitiMakoto 3b0b6c4b0b Merge pull request 'Fix .venv path in buildenv' (#1090) from venv-path into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1090
2022-05-17 17:39:10 +00:00
Kitaiti Makoto 4328fad5a3 Don't load venv 2022-05-18 02:15:15 +09:00
Kitaiti Makoto b26822c045 Update buildenv image 2022-05-16 12:59:05 +09:00
Kitaiti Makoto f372282b04 Use apt package for setuptools instead of pyenv 2022-05-16 12:20:06 +09:00
Kitaiti Makoto 145253ccbf Fix .venv path in buildenv 2022-05-15 11:41:50 +09:00
KitaitiMakoto 485223a3dd Merge pull request 'Add fmt and clippy on CI' (#1089) from add-toolchain into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1089
2022-05-15 02:22:29 +00:00
Kitaiti Makoto 7f75fa74e7 Add fmt and clippy on CI 2022-05-15 11:21:07 +09:00
KitaitiMakoto 821fce1903 Merge pull request 'Use rust-toolchain in buildenv' (#1088) from buildenv-rust-toolchain into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1088
2022-05-15 02:16:39 +00:00
Kitaiti Makoto ce484de61e Bump buildenv image 2022-05-15 11:15:05 +09:00
Kitaiti Makoto 35fb57718d Add rust-toolchain into buildenv 2022-05-15 06:46:18 +09:00
KitaitiMakoto 35d12d7cae Merge pull request 'Activate venv on integration test' (#1087) from fix-test-env into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1087
2022-05-14 21:34:37 +00:00
Kitaiti Makoto 846154efe1 Activate venv on integration test 2022-05-15 06:34:09 +09:00
KitaitiMakoto 1ec7acbdfe Merge pull request 'Update Crowdin environment' (#1086) from update-crowdin into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1086
2022-05-14 21:14:15 +00:00
Kitaiti Makoto e384fdfcff Update buildenv image to v0.5.0 2022-05-15 05:56:31 +09:00
Kitaiti Makoto ed58e44d2e Use Python 3 to install Selenium 2022-05-15 05:45:25 +09:00
Kitaiti Makoto f151dee339 Don't strip in buildenv 2022-05-15 05:13:09 +09:00
Kitaiti Makoto 61f25941e8 Install crowdin CLI using apt in buildenv 2022-05-15 05:13:09 +09:00
Kitaiti Makoto 0628a14be6 Use Rust image for buildenv 2022-05-15 05:13:03 +09:00
KitaitiMakoto b46ae83377 Merge pull request 'Change default branch to main' (#1085) from default-branch into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1085
2022-05-14 18:57:21 +00:00
Kitaiti Makoto 70bc7f8edf Change default branch to main 2022-05-15 03:55:33 +09:00
KitaitiMakoto aff481b947 Merge pull request 'Add 'My feed' to i18n timeline name' (#1084) from myfeed-translation into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1084
2022-05-14 18:35:35 +00:00
Kitaiti Makoto 29ef73d307 Add 'My feed' to i18n timeline name 2022-05-15 03:32:45 +09:00
KitaitiMakoto db205d0d9d Merge pull request 'Bump ldap3 from 0.10.4 to 0.10.5' (#1083) from ldap3-0.10.5 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1083
2022-05-12 23:19:47 +00:00
dependabot[bot] 57ab7edf23
Bump ldap3 from 0.10.4 to 0.10.5
Bumps [ldap3](https://github.com/inejge/ldap3) from 0.10.4 to 0.10.5.
- [Release notes](https://github.com/inejge/ldap3/releases)
- [Changelog](https://github.com/inejge/ldap3/blob/master/CHANGELOG.md)
- [Commits](https://github.com/inejge/ldap3/compare/v0.10.4...v0.10.5)

---
updated-dependencies:
- dependency-name: ldap3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-12 19:26:54 +00:00
KitaitiMakoto e4bd9d65cf Merge pull request 'Bump diesel-derive-newtype from 0.1.2 to 1.0.0' (#1081) from diesel-derive-newtype-1.0.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1081
2022-05-11 21:56:27 +00:00
KitaitiMakoto 17e4ddb32d Merge branch 'main' into diesel-derive-newtype-1.0.0 2022-05-11 21:56:09 +00:00
KitaitiMakoto 4fd85b30f1 Merge pull request '(cargo-release) version v0.7.3-dev' (#1080) from v0.7.3-dev into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1080
2022-05-11 21:33:43 +00:00
Kitaiti Makoto 6148f29c66 (cargo-release) version {{version}} 2022-05-12 06:30:25 +09:00
KitaitiMakoto 1a3fad2d6a Merge pull request 'Release v0.7.2' (#1079) from v0.7.2 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1079
2022-05-11 19:50:40 +00:00
Kitaiti Makoto 0945d3bc53 Set release to false for sub crates [skip ci] 2022-05-12 04:35:05 +09:00
Kitaiti Makoto 9a824f06c3 (cargo-release) version {{version}} 2022-05-12 04:29:46 +09:00
Kitaiti Makoto eec09d79fe Fix release.toml 2022-05-12 04:27:48 +09:00
dependabot[bot] 7f63d2a129
Bump diesel-derive-newtype from 0.1.2 to 1.0.0
Bumps [diesel-derive-newtype](https://github.com/quodlibetor/diesel-derive-newtype) from 0.1.2 to 1.0.0.
- [Release notes](https://github.com/quodlibetor/diesel-derive-newtype/releases)
- [Changelog](https://github.com/quodlibetor/diesel-derive-newtype/blob/main/CHANGELOG.md)
- [Commits](https://github.com/quodlibetor/diesel-derive-newtype/commits)

---
updated-dependencies:
- dependency-name: diesel-derive-newtype
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-11 19:24:14 +00:00
Kitaiti Makoto 79b639c3e6 Update PO files 2022-05-11 03:29:33 +09:00
Kitaiti Makoto efef208f53 Update translations 2022-05-11 03:26:01 +09:00
KitaitiMakoto 27e0f755f6 Merge pull request 'Bump whatlang from 0.15.0 to 0.16.0' (#1078) from whatlang-0.16.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1078
2022-05-10 17:53:35 +00:00
KitaitiMakoto a9d7aae5d6 Merge branch 'main' into whatlang-0.16.0 2022-05-10 17:52:46 +00:00
KitaitiMakoto 42e584a363 Merge pull request 'Add blank line' (#1077) from tiny-change into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1077
2022-05-10 17:50:07 +00:00
Kitaiti Makoto 8c37ea3ec3 Add blank line 2022-05-11 02:35:30 +09:00
dependabot[bot] 3c14fa0058 Bump whatlang from 0.15.0 to 0.16.0
Bumps [whatlang](https://github.com/greyblake/whatlang-rs) from 0.15.0 to 0.16.0.
- [Release notes](https://github.com/greyblake/whatlang-rs/releases)
- [Changelog](https://github.com/greyblake/whatlang-rs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/greyblake/whatlang-rs/compare/v0.15.0...v0.16.0)

---
updated-dependencies:
- dependency-name: whatlang
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-11 02:33:23 +09:00
dependabot[bot] ab94cca210
Bump tokio from 1.18.1 to 1.18.2
Bumps [tokio](https://github.com/tokio-rs/tokio) from 1.18.1 to 1.18.2.
- [Release notes](https://github.com/tokio-rs/tokio/releases)
- [Commits](https://github.com/tokio-rs/tokio/compare/tokio-1.18.1...tokio-1.18.2)

---
updated-dependencies:
- dependency-name: tokio
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-09 19:31:15 +00:00
Kitaiti Makoto ea62388985 Update PO files 2022-05-09 23:34:32 +09:00
KitaitiMakoto a9219efee4 Merge pull request 'Move to action area after liking/boosting/commenting' (#1074) from action-id into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1074
2022-05-08 10:24:15 +00:00
Kitaiti Makoto 776ed058c7 [skip ci]Add changelog 2022-05-08 18:53:57 +09:00
Kitaiti Makoto aa3e196b8f Make comment content required 2022-05-08 18:51:09 +09:00
Kitaiti Makoto 52cb7270a9 Set id attributes to action forms in post details page 2022-05-08 18:44:42 +09:00
KitaitiMakoto 66376afb36 Merge pull request 'Upgrade activitystreams to 0.7, again' (#1022) from ap07 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1022
2022-05-08 09:14:53 +00:00
Kitaiti Makoto 96860be1be Fix Follow::accept_follow() 2022-05-08 18:00:42 +09:00
KitaitiMakoto 3bf61efc34 Merge pull request '[skip ci]Update changelog' (#1073) from changelog into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1073
2022-05-07 16:45:23 +00:00
KitaitiMakoto d95549f58b Merge pull request 'Move local feed before federated feed for non-logged-in users' (#1072) from timeline-order into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1072
2022-05-07 16:35:28 +00:00
Kitaiti Makoto bf24e4878a Merge remote-tracking branch 'origin/main' into ap07 2022-05-07 13:04:47 +09:00
Kitaiti Makoto 9ae231fcef [skip ci]Update changelog 2022-05-07 12:58:47 +09:00
KitaitiMakoto c32acb2fcf Merge pull request 'Sleep between broadcasting' (#1071) from sleep-broadcasting into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1071
2022-05-07 03:50:42 +00:00
Kitaiti Makoto 770c77ee81 Move local feed before federated feed for non-logged-in users 2022-05-07 12:48:34 +09:00
KitaitiMakoto aa3e4d7cf8 Merge branch 'main' into sleep-broadcasting 2022-05-07 03:27:53 +00:00
KitaitiMakoto 156a875f02 Merge pull request 'Move local timeline before federated timeline' (#1070) from timeline-order into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1070
2022-05-07 03:25:01 +00:00
Kitaiti Makoto cfed02bbcf Merge remote-tracking branch 'origin/main' into ap07 2022-05-07 11:15:04 +09:00
Kitaiti Makoto 4d3db9af73 Sleep between broadcasting 2022-05-07 04:50:58 +09:00
Kitaiti Makoto c1c606bc86 [skip ci]Add changelog about timeline order change 2022-05-07 04:46:28 +09:00
Kitaiti Makoto f401949037 Move local timeline before federated timeline 2022-05-07 04:43:00 +09:00
Kitaiti Makoto e8dc0942e5 Merge remote-tracking branch 'origin/main' into ap07 2022-05-06 12:38:47 +09:00
Kitaiti Makoto 9fbafd8e79 Fix Follow object in accepting follow 2022-05-06 12:06:37 +09:00
KitaitiMakoto b9ea83a602 Merge pull request 'More personal timelines' (#1069) from timeline-order into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1069
2022-05-05 19:11:47 +00:00
KitaitiMakoto 2ada5a83af Merge branch 'main' into timeline-order 2022-05-05 19:11:30 +00:00
KitaitiMakoto ec25599d1f Merge pull request 'Fixes #949 Fix time out error on broadcasting' (#1068) from fix-timeout into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1068
2022-05-05 17:15:28 +00:00
KitaitiMakoto 57551610e2 Merge pull request 'Update CircleCI image' (#1066) from update-circleci-image into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1066
2022-05-05 11:29:37 +00:00
Kitaiti Makoto 8948b7acc1 Center timeline tabs 2022-05-05 19:11:23 +09:00
Kitaiti Makoto ccf7ff2bc9 Remove Latest articles from timeline tabs 2022-05-05 19:05:54 +09:00
Kitaiti Makoto 39de967141 Show first timeline at home 2022-05-05 19:03:33 +09:00
Kitaiti Makoto 118cfd7166 Replace hard tabs with soft tabs 2022-05-05 18:14:55 +09:00
Kitaiti Makoto c2fd4ab3a5 Merge remote-tracking branch 'origin/main' into fix-timeout 2022-05-05 17:14:52 +09:00
Kitaiti Makoto 70b5bee00f Move My feed first in timelines 2022-05-05 16:48:51 +09:00
Kitaiti Makoto de605deb1e Don't unwrap 2022-05-05 16:35:03 +09:00
Kitaiti Makoto 116974f811 Add comment about broadcast capacity 2022-05-05 16:29:52 +09:00
Kitaiti Makoto c57f36ccca Merge branch 'fix-timeout' into ap07 2022-05-05 15:50:51 +09:00
Kitaiti Makoto 9def0355aa Reduce broadcast request connections 2022-05-05 15:50:44 +09:00
Kitaiti Makoto 4e833c2061 Follow clippy 2022-05-05 13:14:11 +09:00
Kitaiti Makoto 5871ed7301 Merge branch 'fix-timeout' into ap07 2022-05-05 13:12:04 +09:00
Kitaiti Makoto 97632fdbfe Broadcast asynchronously 2022-05-05 13:03:41 +09:00
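Note: the fix-timeout series replaces one-at-a-time blocking delivery with bounded-concurrency async delivery plus a short pause between requests; a sketch with futures and tokio (the limit and pause duration are assumptions, and the delivery stub stands in for the real signed POST):

```rust
use std::time::Duration;

use futures::stream::{self, StreamExt};

const MAX_IN_FLIGHT: usize = 16; // assumed limit, not Plume's actual value

async fn deliver(inbox: &str) -> Result<(), String> {
    // Stand-in for the real signed HTTP POST to a remote inbox.
    println!("delivering to {inbox}");
    Ok(())
}

#[tokio::main]
async fn main() {
    let inboxes = vec![
        "https://remote-a.example/inbox".to_string(),
        "https://remote-b.example/inbox".to_string(),
    ];
    // Bounded concurrency instead of sequential blocking requests, with a
    // short pause after each delivery so remote servers are not hammered.
    stream::iter(inboxes)
        .for_each_concurrent(MAX_IN_FLIGHT, |inbox| async move {
            if let Err(e) = deliver(&inbox).await {
                eprintln!("delivery to {inbox} failed: {e}");
            }
            tokio::time::sleep(Duration::from_millis(100)).await;
        })
        .await;
}
```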
Kitaiti Makoto 1f8da7e63d Install futures 2022-05-05 12:41:16 +09:00
Kitaiti Makoto 76ca7c1462 Add futures to plume-common's dependencies 2022-05-05 12:40:37 +09:00
Kitaiti Makoto f06f444a13 Update CircleCI image
See https://discuss.circleci.com/t/legacy-convenience-image-deprecation/41034
2022-05-05 11:32:26 +09:00
Kitaiti Makoto 10dfecf45c Merge remote-tracking branch 'origin/fix-timeout' into ap07 2022-05-05 10:42:03 +09:00
Kitaiti Makoto a7b899817a Run HTTP request in broadcast() on tokio runtime 2022-05-05 09:04:54 +09:00
Kitaiti Makoto e0258003b9 Install tokio and flume 2022-05-05 09:04:34 +09:00
Kitaiti Makoto 2326eb77cd Add tokio to plume-common's dependencies 2022-05-05 08:42:26 +09:00
Kitaiti Makoto 504d41d887 Add flume to plume-common's dependencies 2022-05-05 08:38:57 +09:00
KitaitiMakoto 5a7d5e8099 Merge pull request 'Bump serde_json from 1.0.80 to 1.0.81' (#1064) from serde_json-1.0.81 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1064
2022-05-04 19:56:28 +00:00
KitaitiMakoto 74a1daac8c Merge branch 'main' into serde_json-1.0.81 2022-05-04 19:56:18 +00:00
KitaitiMakoto 1f855601ea Merge pull request 'Bump openssl from 0.10.38 to 0.10.40' (#1063) from openssl-0.10.40 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1063
2022-05-04 19:55:39 +00:00
Kitaiti Makoto f22c4d5c78 Await in consumer 2022-05-05 04:51:42 +09:00
Kitaiti Makoto ce4b216722 Broadcast asynchronously 2022-05-05 04:34:04 +09:00
Kitaiti Makoto 9016995d92 Install tokio 2022-05-05 04:33:52 +09:00
dependabot[bot] 853a1db028
Bump serde_json from 1.0.80 to 1.0.81
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.80 to 1.0.81.
- [Release notes](https://github.com/serde-rs/json/releases)
- [Commits](https://github.com/serde-rs/json/compare/v1.0.80...v1.0.81)

---
updated-dependencies:
- dependency-name: serde_json
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-04 19:25:50 +00:00
dependabot[bot] 712ee30a1f
Bump openssl from 0.10.38 to 0.10.40
Bumps [openssl](https://github.com/sfackler/rust-openssl) from 0.10.38 to 0.10.40.
- [Release notes](https://github.com/sfackler/rust-openssl/releases)
- [Commits](https://github.com/sfackler/rust-openssl/compare/openssl-v0.10.38...openssl-v0.10.40)

---
updated-dependencies:
- dependency-name: openssl
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-04 19:25:29 +00:00
Kitaiti Makoto 9e5f9255d1 Add tokio to plume-common's dependencies 2022-05-05 03:34:11 +09:00
Kitaiti Makoto 2e35441483 Follow reqwest change 2022-05-05 01:21:25 +09:00
Kitaiti Makoto 5c74f598d8 Update Cargo.lock 2022-05-05 01:21:12 +09:00
Kitaiti Makoto 5d711dc47c Upgrade reqwest to 0.11 2022-05-05 01:20:18 +09:00
KitaitiMakoto 9ae3057106 Merge pull request 'Fixes #1061 Render 404 when page not found' (#1062) from render-404 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1062
2022-05-04 12:23:04 +00:00
Kitaiti Makoto b7ea154e51 Render 404 when page not found 2022-05-04 21:21:58 +09:00
Kitaiti Makoto 692e6b1c82 Uninstall tokio 2022-05-04 21:13:30 +09:00
Kitaiti Makoto 528f1bac48 Remove tokio from dependencies 2022-05-04 21:12:52 +09:00
Kitaiti Makoto 35aa2374c4 Execute broadcast synchronously 2022-05-04 21:12:35 +09:00
Kitaiti Makoto 3eb7662aef Log inbox URI when broadcast() failed 2022-05-04 21:04:49 +09:00
Kitaiti Makoto de4fcaee93 Merge remote-tracking branch 'origin/main' into ap07 2022-05-04 05:04:57 +09:00
KitaitiMakoto 812fd3d956 Merge pull request 'Reuse reqwest client on broadcasting' (#1059) from fix-timeout into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1059
2022-05-03 19:59:05 +00:00
KitaitiMakoto 5d3b480790 Merge branch 'main' into fix-timeout 2022-05-03 19:58:38 +00:00
KitaitiMakoto 2f1801acae Merge pull request 'Bump validator from 0.14.0 to 0.15.0' (#1060) from validator-0.15.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1060
2022-05-03 19:57:56 +00:00
Kitaiti Makoto 0404528908 Remove unnecessary clone of config 2022-05-04 04:40:16 +09:00
Kitaiti Makoto 4529b929d8 [skip ci]Add changelog 2022-05-04 04:26:47 +09:00
dependabot[bot] 889decc720
Bump validator from 0.14.0 to 0.15.0
Bumps [validator](https://github.com/Keats/validator) from 0.14.0 to 0.15.0.
- [Release notes](https://github.com/Keats/validator/releases)
- [Changelog](https://github.com/Keats/validator/blob/master/CHANGELOG.md)
- [Commits](https://github.com/Keats/validator/compare/v0.14.0...v0.15.0)

---
updated-dependencies:
- dependency-name: validator
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-03 19:24:10 +00:00
Kitaiti Makoto db0f1a3c46 Reuse reqwest client on broadcasting
See https://users.rust-lang.org/t/reqwest-http-client-fails-when-too-much-concurrency/55644/2
2022-05-04 04:23:10 +09:00
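Note: a sketch of the "reuse reqwest client on broadcasting" idea: build one shared Client (and its connection pool) instead of a new one per request. The once_cell wiring and header value here are illustrative, not Plume's exact code:

```rust
use once_cell::sync::Lazy;
use reqwest::Client;

// One shared Client keeps a connection pool alive across broadcasts instead
// of building a fresh client (and pool) for every request.
static HTTP_CLIENT: Lazy<Client> = Lazy::new(Client::new);

async fn deliver(inbox_url: &str, activity_json: String) -> Result<(), reqwest::Error> {
    HTTP_CLIENT
        .post(inbox_url)
        .header("Content-Type", "application/activity+json")
        .body(activity_json)
        .send()
        .await?
        .error_for_status()?;
    Ok(())
}
```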
KitaitiMakoto ef57ef91f0 Merge pull request 'Fixes #1051 Fix accept header' (#1058) from activitystreams-content-type into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1058
2022-05-03 17:21:25 +00:00
Kitaiti Makoto 073b72c9ed Add more fixes 2022-05-04 02:01:58 +09:00
Kitaiti Makoto 45a6744d4d [skip ci]Add changelog 2022-05-04 01:58:44 +09:00
Kitaiti Makoto 4d19861a25 Fix accept header 2022-05-04 01:56:49 +09:00
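Note: a sketch of fetching a remote object with an ActivityStreams Accept header; the exact header value Plume sends is an assumption:

```rust
use reqwest::header::ACCEPT;

// ActivityPub media types a server typically asks for when dereferencing a
// remote actor or object.
const AP_ACCEPT: &str = r#"application/ld+json; profile="https://www.w3.org/ns/activitystreams", application/activity+json"#;

async fn fetch_remote_object(url: &str) -> Result<String, reqwest::Error> {
    reqwest::Client::new()
        .get(url)
        .header(ACCEPT, AP_ACCEPT)
        .send()
        .await?
        .text()
        .await
}
```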
Kitaiti Makoto f5906cacf3 Restore missing logic for Media 2022-05-03 22:25:43 +09:00
Kitaiti Makoto 03ba77a577 Restore filter 2022-05-03 12:36:31 +09:00
Kitaiti Makoto 0fc7372781 Restore order of decl of boundary of broadcast() 2022-05-03 12:34:56 +09:00
Kitaiti Makoto 6c2846980a Merge remote-tracking branch 'origin/main' into ap07 2022-05-03 12:24:04 +09:00
Kitaiti Makoto 0685c59bf3 Add changelog 2022-05-03 12:18:52 +09:00
KitaitiMakoto 5f629195f8 Merge pull request 'Bump whatlang from 0.13.0 to 0.15.0' (#1057) from whatlang-0.15.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1057
2022-05-03 03:08:09 +00:00
KitaitiMakoto 13eeedb620 Merge branch 'main' into whatlang-0.15.0 2022-05-03 02:49:51 +00:00
KitaitiMakoto a076c132ca Merge pull request 'Bump serde from 1.0.136 to 1.0.137' (#1056) from serde-1.0.137 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1056
2022-05-03 02:49:09 +00:00
KitaitiMakoto a76e0dfe5b Merge branch 'main' into serde-1.0.137 2022-05-03 02:27:46 +00:00
KitaitiMakoto 8faac20977 Merge pull request 'Bump serde_json from 1.0.79 to 1.0.80' (#1055) from serde_json-1.0.80 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1055
2022-05-03 02:22:54 +00:00
Kitaiti Makoto b04edfa05e Follow clippy 2022-05-03 10:53:16 +09:00
Kitaiti Makoto 1f62bf27f8 Fix nest of source property for Post 2022-05-03 10:47:21 +09:00
Kitaiti Makoto 1e0d1fb97a Add test for CustomGroup 2022-05-03 10:16:57 +09:00
Kitaiti Makoto 19d30c12d1 Parse source property properly 2022-05-03 10:04:22 +09:00
Kitaiti Makoto 384474930c Add test for Create activity with licensed article 2022-05-03 08:24:22 +09:00
dependabot[bot] 4cb64e0a8c
Bump whatlang from 0.13.0 to 0.15.0
Bumps [whatlang](https://github.com/greyblake/whatlang-rs) from 0.13.0 to 0.15.0.
- [Release notes](https://github.com/greyblake/whatlang-rs/releases)
- [Changelog](https://github.com/greyblake/whatlang-rs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/greyblake/whatlang-rs/compare/v0.13.0...v0.15.0)

---
updated-dependencies:
- dependency-name: whatlang
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-02 19:35:35 +00:00
dependabot[bot] cd2a2df48d
Bump serde from 1.0.136 to 1.0.137
Bumps [serde](https://github.com/serde-rs/serde) from 1.0.136 to 1.0.137.
- [Release notes](https://github.com/serde-rs/serde/releases)
- [Commits](https://github.com/serde-rs/serde/compare/v1.0.136...v1.0.137)

---
updated-dependencies:
- dependency-name: serde
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-02 19:35:19 +00:00
dependabot[bot] cbf960500b
Bump serde_json from 1.0.79 to 1.0.80
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.79 to 1.0.80.
- [Release notes](https://github.com/serde-rs/json/releases)
- [Commits](https://github.com/serde-rs/json/compare/v1.0.79...v1.0.80)

---
updated-dependencies:
- dependency-name: serde_json
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-05-02 19:35:06 +00:00
Kitaiti Makoto 3d434f1923 Remove old activitypub related crates 2022-05-03 03:05:49 +09:00
Kitaiti Makoto 52022fb597 Remove activitypub crate from plume-models 2022-05-03 02:54:08 +09:00
Kitaiti Makoto d96940c848 Don't implement activitypub::Link for Id 2022-05-03 02:51:36 +09:00
Kitaiti Makoto b17884681d Implement ap_followers using activitystreams 2022-05-03 02:50:14 +09:00
Kitaiti Makoto 78a001ac89 Remove trailing 07 in routes/posts.rs 2022-05-03 02:14:23 +09:00
Kitaiti Makoto 9b04fb96e6 Remove trailing 07 in inbox.rs 2022-05-03 02:12:59 +09:00
Kitaiti Makoto d23002b817 Remove trailing 07 from method name 2022-05-03 02:11:46 +09:00
Kitaiti Makoto 6282b98b03 Fix doc test 2022-05-03 02:05:46 +09:00
Kitaiti Makoto d75600ba14 Remove trailing 07 in activity_pub/inbox.rs 2022-05-03 01:32:47 +09:00
Kitaiti Makoto e6ea302319 Remove activitypub crate from Inbox test 2022-05-03 01:31:39 +09:00
Kitaiti Makoto df005a28f8 Rename: activity07() -> activity() 2022-05-03 01:26:15 +09:00
Kitaiti Makoto 15134eed60 Rename: get_sender07() -> get_sender() 2022-05-03 01:24:22 +09:00
Kitaiti Makoto 7dd56a71e3 Rename: from_db07() -> from_db() 2022-05-03 01:23:37 +09:00
Kitaiti Makoto 06d2f68ecd Rename: from_activity07() -> from_activity() 2022-05-03 01:22:55 +09:00
Kitaiti Makoto 9a640b3438 Rename: deref07() -> deref() 2022-05-03 01:22:02 +09:00
Kitaiti Makoto 9791607793 Rename: with07() -> with() 2022-05-03 01:20:40 +09:00
Kitaiti Makoto ccd3c8a3f2 Don't implement activitypub's Object for Source 2022-05-03 01:17:27 +09:00
Kitaiti Makoto 6bbadc78b0 Rename: Licensed07 -> Licensed 2022-05-03 01:15:13 +09:00
Kitaiti Makoto e1673787b4 Remove unused Licensed struct 2022-05-03 01:14:29 +09:00
Kitaiti Makoto 1dd176dd80 Rename: broadcast07() -> broadcast() 2022-05-03 01:12:39 +09:00
Kitaiti Makoto 5c59687cb8 Remove unused broadcast() 2022-05-03 01:12:02 +09:00
Kitaiti Makoto a24e3c46e6 Remove trailing 07 in posts.rs 2022-05-03 01:07:52 +09:00
Kitaiti Makoto 267fecba66 Remove unused posts::LicensedArticle 2022-05-03 01:03:54 +09:00
Kitaiti Makoto fc99d2b7a0 Remove trailing 07 in remote_fetch_actor.rs 2022-05-03 00:57:19 +09:00
Kitaiti Makoto 1b32fa1e34 Remove unused Media::from_activity() 2022-05-03 00:54:37 +09:00
Kitaiti Makoto ce42524273 Remove trailing 07 from Note in comments.rs 2022-05-03 00:50:43 +09:00
Kitaiti Makoto 595fa05660 Use Follow::to_activity07() instead of to_activity() 2022-05-03 00:47:46 +09:00
Kitaiti Makoto f8a0dff526 Remove unused Follow::build_accept() 2022-05-03 00:45:22 +09:00
Kitaiti Makoto 06d216c7ed Remove unused Follow::accept_follow() 2022-05-03 00:43:47 +09:00
Kitaiti Makoto f44bca30f4 Use Follow::build_undo07() instead of build_undo() 2022-05-03 00:41:05 +09:00
Kitaiti Makoto e4180b3b38 Rename: ApSignature07 -> ApSignature 2022-05-03 00:29:44 +09:00
Kitaiti Makoto 992a482b96 Remove unused ApSignature type 2022-05-03 00:26:37 +09:00
Kitaiti Makoto 0ad845e0f7 Remove unused User::fetch_outbox() 2022-05-03 00:22:22 +09:00
Kitaiti Makoto ee97213c90 Remove trailing 07 from import of OrderedCollectionPage 2022-05-03 00:18:42 +09:00
Kitaiti Makoto 6d919da049 Remove duplicate import 2022-05-03 00:17:19 +09:00
Kitaiti Makoto 6c615d01ad Remove users::CustomPerson 2022-05-03 00:15:46 +09:00
Kitaiti Makoto f8870af9fe Remove trailing 07 from Hashtag 2022-05-03 00:10:51 +09:00
Kitaiti Makoto 2fe2505a01 Remove unused Hashtag 2022-05-03 00:09:15 +09:00
Kitaiti Makoto e41fa353e4 Use User::to_activity07() instead of to_activity() 2022-05-03 00:06:18 +09:00
Kitaiti Makoto effdc44943 Use User::delete_activity07() instead of delete_activity() 2022-05-03 00:03:11 +09:00
Kitaiti Makoto fd341bdb22 Use User::outbox07() instead of outbox() 2022-05-02 23:59:38 +09:00
Kitaiti Makoto 68c794c54b Use User::outbox_page07() instead of outbox_page() 2022-05-02 23:55:18 +09:00
Kitaiti Makoto 7b3b00be23 Remove unused Tag::from_activity() and to_activity() 2022-05-02 23:51:32 +09:00
Kitaiti Makoto 41ccacc5d3 Remove unused Mention::from_activity() 2022-05-02 23:47:36 +09:00
Kitaiti Makoto 4ef9350ce7 Remove unused Mention::to_activity() 2022-05-02 23:45:06 +09:00
Kitaiti Makoto 5d08ff6c3b Use Mention::build_activity07() instead of build_activity() 2022-05-02 23:43:24 +09:00
Kitaiti Makoto 01dca62ce5 Rename: Like07 -> LikeAct 2022-05-02 23:38:32 +09:00
Kitaiti Makoto 6ab1ecd57b Use Like::to_activity07() instead of to_activity() 2022-05-02 23:37:58 +09:00
Kitaiti Makoto b13444895f Use Like::build_undo07() instead of build_undo() 2022-05-02 23:34:19 +09:00
Kitaiti Makoto 771c157fe5 Use Comment::to_activity07() instead of to_activity() 2022-05-02 23:30:44 +09:00
Kitaiti Makoto b5e1076b0e Use Comment::build_delete07() instead of build_delete() 2022-05-02 23:26:25 +09:00
Kitaiti Makoto 6cc43c2420 Use Comment::create_activity07() instead of create_activity() 2022-05-02 23:23:28 +09:00
Kitaiti Makoto f365041a45 Use Reshare::to_activity07() instead of to_activity() 2022-05-02 23:19:29 +09:00
Kitaiti Makoto ae9c9262f7 Use Reshare::build_undo07() instead of build_undo() 2022-05-02 23:16:55 +09:00
Kitaiti Makoto 40ce515e6c Don't rename activitystreams' tokens to 07 2022-05-02 23:11:48 +09:00
Kitaiti Makoto bc96af7f5f Remove unused blogs::CustomGroup 2022-05-02 23:08:41 +09:00
Kitaiti Makoto 811c20c8fb Use Blog::to_activity07() instead of to_activity() 2022-05-02 23:07:05 +09:00
Kitaiti Makoto 4b4c22cf8a Use Blog::outbox_page07() instead of outbox_page() 2022-05-02 23:01:26 +09:00
Kitaiti Makoto 0524b0b153 Add Blog::outbox_page07() 2022-05-02 22:56:54 +09:00
Kitaiti Makoto cd6c57b9c5 Use Blog::outbox07() instead of outbox() 2022-05-02 22:55:06 +09:00
Kitaiti Makoto f608f7a4d6 Install activitystreams 2022-05-02 22:55:06 +09:00
Kitaiti Makoto 803680186b Add Blog::outbox07() 2022-05-02 22:55:06 +09:00
Kitaiti Makoto 68a01d5f9b Add activitystreams to Plume's dependencies 2022-05-02 22:55:06 +09:00
Kitaiti Makoto 5b3a472b66 Use Post::to_activity07() instead of to_activity() 2022-05-02 22:55:06 +09:00
Kitaiti Makoto 2a85f775e9 Use Post::build_delete07() instead of build_delete() 2022-05-02 22:55:06 +09:00
Kitaiti Makoto a958300a58 Use Post::update_hashtags07() instead of update_hashtags() 2022-05-02 22:55:03 +09:00
Kitaiti Makoto a8be31b177 Use Post::update_tags07() instead of update_tags() 2022-05-02 22:26:59 +09:00
Kitaiti Makoto 6cd68ab8b0 Use Post::update_activity07() instead of update_activity() 2022-05-02 22:23:12 +09:00
Kitaiti Makoto a589435f4f Use Post::create_activity07() instead of create_activity() 2022-05-02 22:19:00 +09:00
Kitaiti Makoto 39b49c707e Use Post::update_mentions07() instead of update_mentions() 2022-05-02 21:49:00 +09:00
Kitaiti Makoto c4bb1f771b Add test for Tag::from_activity07() 2022-05-02 21:00:44 +09:00
Kitaiti Makoto 28440271bb Rename: FromId::from_id07 -> from_id 2022-05-02 19:24:36 +09:00
Kitaiti Makoto 0ab7774e29 Rename: AsObject07 -> AsObject 2022-05-02 17:43:03 +09:00
Kitaiti Makoto 33afe9111e Remove AsObject 2022-05-02 17:38:08 +09:00
Kitaiti Makoto d8a2e1925f Rename FromId07 -> FromId 2022-05-02 16:07:08 +09:00
Kitaiti Makoto 2804f44a06 Remove FromId 2022-05-02 12:58:01 +09:00
Kitaiti Makoto 2165c286ae Remove with() 2022-05-01 19:49:12 +09:00
Kitaiti Makoto 8cbf410faf Remove execute permission from plume-common/src/lib.rs 2022-05-01 19:15:16 +09:00
Kitaiti Makoto c521a81373 Make test follow LicensedArticle change 2022-05-01 19:13:38 +09:00
Kitaiti Makoto 7ade0550c9 Remove unused import 2022-05-01 18:54:11 +09:00
Kitaiti Makoto 41bc2d6949 Make LicensedArticle's license field optional 2022-05-01 18:53:51 +09:00
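Note: a minimal serde sketch of making a license field optional so articles without one still deserialize; a hypothetical shape, not Plume's actual LicensedArticle type:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical article shape: an optional license lets documents without that
// property deserialize instead of failing.
#[derive(Serialize, Deserialize, Debug)]
struct Article {
    content: String,
    #[serde(default, skip_serializing_if = "Option::is_none")]
    license: Option<String>,
}

fn main() {
    let bare: Article = serde_json::from_str(r#"{"content":"hello"}"#).unwrap();
    assert!(bare.license.is_none());
    let licensed: Article =
        serde_json::from_str(r#"{"content":"hello","license":"CC-BY-SA"}"#).unwrap();
    assert_eq!(licensed.license.as_deref(), Some("CC-BY-SA"));
}
```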
Kitaiti Makoto de6e9c0e2e Fix Post::from_activity07() 2022-05-01 13:00:04 +09:00
Kitaiti Makoto 38ebc9ea41 Modify test data for Post 2022-05-01 12:59:48 +09:00
Kitaiti Makoto 8f976be998 Implement AsObject07 for PostUpdate 2022-05-01 09:56:16 +09:00
Kitaiti Makoto e5a2850105 Implement FromId07 for PostUpdate 2022-05-01 09:56:03 +09:00
KitaitiMakoto 85727c6d4c Merge pull request 'Bump ldap3 from 0.10.3 to 0.10.4' (#1054) from ldap3-0.10.4 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1054
2022-04-30 23:35:00 +00:00
KitaitiMakoto 87247a23b3 Merge branch 'main' into ldap3-0.10.4 2022-04-30 23:08:53 +00:00
KitaitiMakoto 61785364e3 Merge pull request 'Bump ctrlc from 3.2.1 to 3.2.2' (#1053) from ctrlc-3.2.2 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1053
2022-04-30 23:08:30 +00:00
Kitaiti Makoto 76f688c967 Replace some Inbox::with with with07 2022-05-01 07:45:42 +09:00
Kitaiti Makoto 05df3b89a1 Fix Follow::activity07() 2022-05-01 07:45:05 +09:00
Kitaiti Makoto 4e42a34337 Replace some with() with with07() 2022-05-01 06:05:20 +09:00
Kitaiti Makoto 62372201e0 Fix inbox::tests::create_post() 2022-05-01 06:04:58 +09:00
Kitaiti Makoto 036913a828 Use id() for reply_tos 2022-05-01 04:45:56 +09:00
dependabot[bot] b2a889b9e4
Bump ldap3 from 0.10.3 to 0.10.4
Bumps [ldap3](https://github.com/inejge/ldap3) from 0.10.3 to 0.10.4.
- [Release notes](https://github.com/inejge/ldap3/releases)
- [Changelog](https://github.com/inejge/ldap3/blob/master/CHANGELOG.md)
- [Commits](https://github.com/inejge/ldap3/compare/v0.10.3...v0.10.4)

---
updated-dependencies:
- dependency-name: ldap3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-26 19:28:07 +00:00
dependabot[bot] fae8338772
Bump ctrlc from 3.2.1 to 3.2.2
Bumps [ctrlc](https://github.com/Detegr/rust-ctrlc) from 3.2.1 to 3.2.2.
- [Release notes](https://github.com/Detegr/rust-ctrlc/releases)
- [Commits](https://github.com/Detegr/rust-ctrlc/compare/3.2.1...3.2.2)

---
updated-dependencies:
- dependency-name: ctrlc
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-25 19:25:52 +00:00
Kitaiti Makoto 79b5d9a690 Replace Inbox::with() with with07() 2022-04-24 07:25:31 +09:00
Kitaiti Makoto 3e54d10981 Implement AsObject07<User, Undo07, &DbConn> for Like 2022-04-24 07:11:46 +09:00
Kitaiti Makoto a1c3bfb646 Implement FromId07 for Like 2022-04-24 07:10:56 +09:00
Kitaiti Makoto b2528c21ff Implement AsObject07<User, Like07, &DbConn> for Post 2022-04-24 07:04:30 +09:00
Kitaiti Makoto fcc9e1d81b Implement Like::build_undo07() 2022-04-24 07:03:13 +09:00
Kitaiti Makoto 3093f713ef Add test for Like::build_undo07() 2022-04-24 07:03:06 +09:00
Kitaiti Makoto 4ea29d29a0 Implement Like::to_activity07() 2022-04-24 06:58:33 +09:00
Kitaiti Makoto 6b8d90d8b6 Add test for Like::to_activity07() 2022-04-24 06:58:17 +09:00
Kitaiti Makoto bd3e6a5a91 Replace some Inbox::with with with07 2022-04-24 06:54:07 +09:00
Kitaiti Makoto 46f4676efb Implement Reshare::build_undo07() 2022-04-24 06:48:31 +09:00
Kitaiti Makoto c814ac5681 Add test for Reshare::build_undo07() 2022-04-24 06:48:16 +09:00
Kitaiti Makoto 0887399048 Implement AsObject07<User, Undo07, &DbConn> for Reshare 2022-04-24 06:42:48 +09:00
Kitaiti Makoto f2a2bf2b23 Implement FromId07 for Reshare 2022-04-24 06:41:21 +09:00
Kitaiti Makoto e2702a187b Implement AsObject07<User, Announce07, &DbConn> for Post 2022-04-24 06:35:50 +09:00
Kitaiti Makoto d78a57ce47 Implement Reshare::to_activity07() 2022-04-24 06:34:00 +09:00
Kitaiti Makoto 10acbdd41f Add test for Reshare::to_activity07() 2022-04-24 06:33:39 +09:00
Kitaiti Makoto 73009818f2 Implement AsObject07<User, Undo07, &DbConn> for Follow 2022-04-24 06:11:54 +09:00
Kitaiti Makoto fb5027becd Implement FromId07 for Follow 2022-04-24 06:09:00 +09:00
Kitaiti Makoto 86609b51fa Implement AsObject07<User, FollowAct07, &DbConn> for User 2022-04-24 05:51:24 +09:00
Kitaiti Makoto 44799e94fd Implement Follow::accept_follow07() 2022-04-24 05:50:45 +09:00
Kitaiti Makoto f14c307786 Remove unused type parameter from broadcast07() 2022-04-24 05:47:11 +09:00
Kitaiti Makoto 174624f5c1 Implement Follow::build_undo07() 2022-04-24 03:45:22 +09:00
Kitaiti Makoto 5f91345d69 Add test for Follow::build_undo07() 2022-04-24 03:45:11 +09:00
Kitaiti Makoto 9ca975113c Implement Follow::build_accept07() 2022-04-24 03:39:08 +09:00
Kitaiti Makoto 38a55857c6 Add test for Follow::build_accept07() 2022-04-24 03:38:54 +09:00
Kitaiti Makoto 9343d3a120 Implement Follow::to_activity07() 2022-04-24 03:38:36 +09:00
Kitaiti Makoto c5656971c9 Add test for Follow::to_activity07() 2022-04-24 03:38:24 +09:00
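The Follow-related commits just above build a Follow activity, an Accept that wraps it, and an Undo for unfollowing. Shape-wise (independent of the activitystreams version), the Follow/Accept exchange looks roughly like the following; all URLs and IDs here are invented for illustration:

    use serde_json::json;

    fn main() {
        // Invented URLs; only the overall ActivityStreams shape is the point.
        let follow = json!({
            "@context": "https://www.w3.org/ns/activitystreams",
            "type": "Follow",
            "id": "https://local.example/follows/1",
            "actor": "https://local.example/@/alice/",
            "object": "https://remote.example/@/bob/"
        });
        let accept = json!({
            "@context": "https://www.w3.org/ns/activitystreams",
            "type": "Accept",
            "id": "https://remote.example/follows/1/accept",
            "actor": "https://remote.example/@/bob/",
            "object": follow
        });
        println!("{}", serde_json::to_string_pretty(&accept).unwrap());
    }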
Kitaiti Makoto ed55b66253 Implement AsObject07 for Comment 2022-04-24 01:49:27 +09:00
Kitaiti Makoto 713ffb9506 Fix Comment::create_activity07() 2022-04-24 01:37:30 +09:00
Kitaiti Makoto 9969e844ca Add test for Comment self federation 2022-04-24 01:37:03 +09:00
Kitaiti Makoto 0c61dca9ca Follow clippy 2022-04-23 22:49:13 +09:00
Kitaiti Makoto 957725fbf8 impl FromId07<DbConn> for Comment 2022-04-23 22:46:49 +09:00
Kitaiti Makoto 1f6361a9a2 Fix Cargo.toml 2022-04-18 00:21:17 +09:00
Kitaiti Makoto cf870971d1 Add test for Comment::build_delete07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 08ac7227b5 Implement Comment::build_delete07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 88eb61c320 Implement Comment::create_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 1c1dbd481a Add test for Comment::to_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 86b4f622ea Implement Comment::to_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto f854bc5838 Add test for LicensedArticle deserialization 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 489156f4a3 Add test for Post's self federation 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 01e8b0bce8 Implement AsObject for Post 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 9183d04e66 Fix Post::from_activity07() for borrow checker 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 5e463e2cc9 Implement FromId07 for Post 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 6e2bff10f7 Add test for Post::build_delete07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 3e9d9a81b7 Implement Mention::build_delete07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 98e0754976 Add test for Post::build_delete() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto da7870eeba Implement Post::update_hashtags07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto c1562f3868 Implement Post::update_tags07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto e0390cb105 Implement Post::update_mentions07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 32cd91cfb9 Implement Mention::from_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 991dfccf3b Add test for Post::update_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 16e012ba00 Implement Post::update_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 871618f45d Add test for Post::create_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 680d321a2e Implement Post::create_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto c37ff54857 Fix Post::to_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto d4018d61d4 Add test for Post::to_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 21a0059755 Follow clippy 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 05f4c186f4 Fix test for LicensedArticle serialization 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 53512a6167 Fix SourceProperty property 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 7cf7700ef7 Implement Post::to_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 216855d3a7 Add SourceProperty to LicensedArticle 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 23f273e5e8 Readd assert-json-diff 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 70949fad02 Rename: ActorSource -> SourceProperty 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 1f5ce8e504 Add test for Mention::build_activity07() and to_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 2316d36e03 Add test for Tag::from_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 2b1ddc71ac Implement Tag::to_activity07() and Tag::build_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto b9dac1a21a Define Hashtag07 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 95fb5a3c71 Implement Media::from_activity07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 75b43a738f Follow clippy 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 5bd467c4c1 Remove unnecessary records 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 74d6dc5089 Implement Blog::to_activity07(), outbox_collection07() and outbox_collection_page07() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 994a4dbb2d Add source property to CustomGroup 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 67996cc938 Add test for Blog::outbox_collection_page() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto f5e776c4d7 Fix first and last link in Blog::outbox_collection() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto e27fc47287 Extract Blog::outbox_collection_page() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 0ed91b89ff Add test for Blog::outbox_collection() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 00862790a1 Extract Blog::outbox_collection() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto ab6f39c192 Add test for Blog::to_activity() 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 4edc201c14 Implement FromId07 for Blog 2022-04-18 00:19:33 +09:00
Kitaiti Makoto a1a7acfe94 Use new activitystreams APIs 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 6b5a1d2130 Use Base::retract() instead of into_any_base() on creating activity 2022-04-18 00:19:33 +09:00
Kitaiti Makoto 85e35fdb5d Update activitystreams 2022-04-18 00:19:33 +09:00
Kitaiti Makoto da9e13622c Use Inbox::with07() for User, Delete, User 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 8f4dd8a57b Implement User::fetch_outbox07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 78b0535063 Implement User::fetch_outbox_page07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 6323c7aef8 Add test for User::outbox_collection_page07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 7f0ad56d07 Implement User::outbox_collection_page07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto e7eea3901f Implement User::outbox_page07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 0979471e54 Add test for User::delete_activity07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 8d69051a61 Implement User::delete_activity07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 55ca1345e1 Add test self_federation07() for User 2022-04-18 00:11:12 +09:00
Kitaiti Makoto ad951ca842 Add test for User::to_activity07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto cb8e2e9294 Implement User::to_activity07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 038d65acaa Implement User::outbox07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto e392a89526 Add test for User::outbox_collection07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto d62f51665b Implement User::outbox_collection07() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto e42aa6fe8e Implement From<iri_string::validate::Error> for Error 2022-04-18 00:11:12 +09:00
Kitaiti Makoto ab126563f3 Implement AsObject07 for User 2022-04-18 00:11:12 +09:00
Kitaiti Makoto c1b9ebdae6 [REFACTORING]Reduce duplicated closure 2022-04-18 00:11:12 +09:00
Kitaiti Makoto d3e11c78d7 [REFACTORING]Use method chain instead of if clauses 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 4ccfec8019 Use OneOrMany<AnyBase>::to_as_uri() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto bb5157637d Implement OneOrMany<AnyBase>::to_as_uri() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 456df3e535 Use OneOrMany<&AnyString>::as_as_str() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto f0112850fa Implement OneOrMany<&AnyString>::as_as_str() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto a6a21d5dfa Rewrite to_as_string() using method chain instead of if expressions 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 249fbbe891 Remove unused import 2022-04-18 00:11:12 +09:00
Kitaiti Makoto e925865767 Use &AnyString::as_as_str() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 28643fc2c2 Implement &AnyString::as_as_str() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 3db10a09bb Use OneOrMany<&AnyString>::to_as_string() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto a80a95d471 Implement OneOrMany<&AnyString>::to_as_string() 2022-04-18 00:11:12 +09:00
Kitaiti Makoto e407d58ee9 Implement FromId07 for User 2022-04-18 00:11:12 +09:00
Kitaiti Makoto a6d839a766 Make fields of ApSignature07 and PublicKey07 public 2022-04-18 00:11:12 +09:00
Kitaiti Makoto f3b67ab6c9 WIP 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 66f5628a27 Add suffix 07 to activitystreams 0.7 related methods 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 4b3b5c1f40 Implement From<activitystreams::checked::CheckError> for Error 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 3e687f3af0 Reduce type parameter from broadcast07 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 119d3e4f6a [plume-common]Add tests for new ActivityPub functions 2022-04-18 00:11:12 +09:00
Kitaiti Makoto a21d66178e [plume-common]Implement ActivityPub related function using activitystreams 0.7 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 52967f3e47 [plume-common]Implement ActivityPub-related code using activitystreams 0.7 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 29439f9d02 Add tests for newly added ActivityPub-related structs 2022-04-18 00:11:12 +09:00
Kitaiti Makoto bc72a4c2d1 Install assert-json-diff 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 3ded0e2166 Add assert-json-diff to plume-common's dependencies 2022-04-18 00:11:12 +09:00
Kitaiti Makoto 27c10e5e5c Install activitystreams-ext 2022-04-18 00:10:41 +09:00
Kitaiti Makoto 816aefe72a Add ActivityStreams Ext to plume-common dependencies 2022-04-18 00:10:41 +09:00
Kitaiti Makoto bff50f8e4c Add activitystreams 0.7 to plume-models dependencies 2022-04-18 00:10:41 +09:00
Kitaiti Makoto be1c22815b Install activitystreams 0.7 2022-04-18 00:10:19 +09:00
Kitaiti Makoto 08ab7ffd08 Add activitystreams 0.7 to plume-common dependencies 2022-04-17 23:47:13 +09:00
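Several commits in this series also rebuild outboxes (outbox07, outbox_collection07, outbox_collection_page07) on top of activitystreams 0.7. What these methods are expected to produce is a paged ActivityStreams OrderedCollection, roughly of the following shape; the URLs and counts are invented:

    use serde_json::json;

    fn main() {
        let outbox = json!({
            "@context": "https://www.w3.org/ns/activitystreams",
            "type": "OrderedCollection",
            "id": "https://local.example/@/alice/outbox",
            "totalItems": 12,
            "first": "https://local.example/@/alice/outbox?page=1",
            "last": "https://local.example/@/alice/outbox?page=2"
        });
        println!("{}", serde_json::to_string_pretty(&outbox).unwrap());
    }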
KitaitiMakoto 8709f6cf9f Merge pull request 'Bump tracing from 0.1.32 to 0.1.34' (#1050) from tracing-0.1.34 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1050
2022-04-17 14:44:46 +00:00
KitaitiMakoto 04cae95635 Merge branch 'main' into tracing-0.1.34 2022-04-17 14:44:27 +00:00
KitaitiMakoto 5d37b2534a Merge pull request 'Bump wasm-bindgen from 0.2.78 to 0.2.80' (#1049) from wasm-bindgen-0.2.80 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1049
2022-04-17 14:44:00 +00:00
dependabot[bot] eafafdaadf Bump wasm-bindgen from 0.2.78 to 0.2.80
Bumps [wasm-bindgen](https://github.com/rustwasm/wasm-bindgen) from 0.2.78 to 0.2.80.
- [Release notes](https://github.com/rustwasm/wasm-bindgen/releases)
- [Changelog](https://github.com/rustwasm/wasm-bindgen/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustwasm/wasm-bindgen/compare/0.2.78...0.2.80)

---
updated-dependencies:
- dependency-name: wasm-bindgen
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-17 23:43:25 +09:00
KitaitiMakoto 6897b8fa58 Merge pull request 'Bump js-sys from 0.3.55 to 0.3.57' (#1048) from js-sys-0.3.57 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1048
2022-04-17 14:40:47 +00:00
dependabot[bot] 16d3279d72 Bump js-sys from 0.3.55 to 0.3.57
Bumps [js-sys](https://github.com/rustwasm/wasm-bindgen) from 0.3.55 to 0.3.57.
- [Release notes](https://github.com/rustwasm/wasm-bindgen/releases)
- [Changelog](https://github.com/rustwasm/wasm-bindgen/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustwasm/wasm-bindgen/commits)

---
updated-dependencies:
- dependency-name: js-sys
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-17 23:40:01 +09:00
KitaitiMakoto 36c76c534d Merge pull request 'Bump web-sys from 0.3.55 to 0.3.57' (#1047) from web-sys-0.3.57 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1047
2022-04-17 14:39:18 +00:00
KitaitiMakoto 26f460be89 Merge branch 'main' into web-sys-0.3.57 2022-04-17 14:39:05 +00:00
KitaitiMakoto b9fb13104a Merge pull request 'Bump ammonia from 3.1.4 to 3.2.0' (#1046) from ammonia-3.2.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1046
2022-04-17 14:38:08 +00:00
dependabot[bot] 5cc411158f
Bump tracing from 0.1.32 to 0.1.34
Bumps [tracing](https://github.com/tokio-rs/tracing) from 0.1.32 to 0.1.34.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-0.1.32...tracing-0.1.34)

---
updated-dependencies:
- dependency-name: tracing
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-15 19:29:35 +00:00
dependabot[bot] ca69c93531
Bump web-sys from 0.3.55 to 0.3.57
Bumps [web-sys](https://github.com/rustwasm/wasm-bindgen) from 0.3.55 to 0.3.57.
- [Release notes](https://github.com/rustwasm/wasm-bindgen/releases)
- [Changelog](https://github.com/rustwasm/wasm-bindgen/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustwasm/wasm-bindgen/commits)

---
updated-dependencies:
- dependency-name: web-sys
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-08 19:48:13 +00:00
dependabot[bot] 95cb7cc904
Bump ammonia from 3.1.4 to 3.2.0
Bumps [ammonia](https://github.com/rust-ammonia/ammonia) from 3.1.4 to 3.2.0.
- [Release notes](https://github.com/rust-ammonia/ammonia/releases)
- [Changelog](https://github.com/rust-ammonia/ammonia/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-ammonia/ammonia/compare/v3.1.4...v3.2.0)

---
updated-dependencies:
- dependency-name: ammonia
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-08 19:48:04 +00:00
KitaitiMakoto 0a62fa46aa Merge pull request 'Bump tracing-subscriber from 0.3.9 to 0.3.10' (#1045) from tracing-subscriber-0.3.10 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1045
2022-04-03 13:28:51 +00:00
KitaitiMakoto 2e60410969 Merge branch 'main' into tracing-subscriber-0.3.10 2022-04-03 13:28:09 +00:00
KitaitiMakoto a12d3a591b Merge pull request 'Bump ldap3 from 0.10.2 to 0.10.3' (#1044) from ldap3-0.10.3 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1044
2022-04-03 13:27:42 +00:00
dependabot[bot] 3cf7c67b6d
Bump tracing-subscriber from 0.3.9 to 0.3.10
Bumps [tracing-subscriber](https://github.com/tokio-rs/tracing) from 0.3.9 to 0.3.10.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-subscriber-0.3.9...tracing-subscriber-0.3.10)

---
updated-dependencies:
- dependency-name: tracing-subscriber
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-04-01 19:26:25 +00:00
dependabot[bot] 2cf79f31b7
Bump ldap3 from 0.10.2 to 0.10.3
Bumps [ldap3](https://github.com/inejge/ldap3) from 0.10.2 to 0.10.3.
- [Release notes](https://github.com/inejge/ldap3/releases)
- [Changelog](https://github.com/inejge/ldap3/blob/master/CHANGELOG.md)
- [Commits](https://github.com/inejge/ldap3/compare/v0.10.2...v0.10.3)

---
updated-dependencies:
- dependency-name: ldap3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-30 19:30:55 +00:00
KitaitiMakoto 24cf941303 Merge pull request 'Bump native-tls from 0.2.8 to 0.2.10' (#1043) from native-tls-0.2.10 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1043
2022-03-30 19:04:42 +00:00
dependabot[bot] 38cf9b5496
Bump native-tls from 0.2.8 to 0.2.10
Bumps [native-tls](https://github.com/sfackler/rust-native-tls) from 0.2.8 to 0.2.10.
- [Release notes](https://github.com/sfackler/rust-native-tls/releases)
- [Changelog](https://github.com/sfackler/rust-native-tls/blob/master/CHANGELOG.md)
- [Commits](https://github.com/sfackler/rust-native-tls/compare/v0.2.8...v0.2.10)

---
updated-dependencies:
- dependency-name: native-tls
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-29 19:29:35 +00:00
KitaitiMakoto b9607b32ac Merge pull request 'Bump rsass from 0.23.4 to 0.24.0' (#1042) from rsass-0.24.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1042
2022-03-26 15:10:16 +00:00
KitaitiMakoto 3393da2560 Merge pull request 'Bump bcrypt from 0.12.0 to 0.12.1' (#1041) from bcrypt-0.12.1 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1041
2022-03-26 15:09:54 +00:00
dependabot[bot] 7566f94690
Bump rsass from 0.23.4 to 0.24.0
Bumps [rsass](https://github.com/kaj/rsass) from 0.23.4 to 0.24.0.
- [Release notes](https://github.com/kaj/rsass/releases)
- [Changelog](https://github.com/kaj/rsass/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/rsass/compare/v0.23.4...v0.24.0)

---
updated-dependencies:
- dependency-name: rsass
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-24 19:29:35 +00:00
dependabot[bot] ee8312fb57
Bump bcrypt from 0.12.0 to 0.12.1
Bumps [bcrypt](https://github.com/Keats/rust-bcrypt) from 0.12.0 to 0.12.1.
- [Release notes](https://github.com/Keats/rust-bcrypt/releases)
- [Commits](https://github.com/Keats/rust-bcrypt/compare/v0.12.0...v0.12.1)

---
updated-dependencies:
- dependency-name: bcrypt
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-21 19:32:31 +00:00
KitaitiMakoto 76219704f3 Merge pull request 'Bump rpassword from 5.0.1 to 6.0.1' (#1040) from rpassword-6.0.1 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1040
2022-03-20 20:29:35 +00:00
dependabot[bot] 2a43d4e88a
Bump rpassword from 5.0.1 to 6.0.1
Bumps [rpassword](https://github.com/conradkleinespel/rpassword) from 5.0.1 to 6.0.1.
- [Release notes](https://github.com/conradkleinespel/rpassword/releases)
- [Commits](https://github.com/conradkleinespel/rpassword/compare/v5.0.1...v6.0.1)

---
updated-dependencies:
- dependency-name: rpassword
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-14 19:29:46 +00:00
KitaitiMakoto f0ce073a37 Merge pull request 'Bump tracing from 0.1.31 to 0.1.32' (#1039) from tracing-0.1.32 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1039
2022-03-12 13:46:19 +00:00
dependabot[bot] 8047196394
Bump tracing from 0.1.31 to 0.1.32
Bumps [tracing](https://github.com/tokio-rs/tracing) from 0.1.31 to 0.1.32.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-0.1.31...tracing-0.1.32)

---
updated-dependencies:
- dependency-name: tracing
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-09 19:30:07 +00:00
KitaitiMakoto 3cf52b3985 Merge pull request 'Add test for Tag::from_activity()' (#1037) from tag-test into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1037
2022-03-06 11:44:02 +00:00
KitaitiMakoto ac378e448b Merge pull request 'Bump once_cell from 1.9.0 to 1.10.0' (#1038) from once_cell-1.10.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1038
2022-03-06 11:43:40 +00:00
Kitaiti Makoto daae2038f8 Add test for Tag::from_activity() 2022-03-06 20:22:36 +09:00
dependabot[bot] a2356c6e59
Bump once_cell from 1.9.0 to 1.10.0
Bumps [once_cell](https://github.com/matklad/once_cell) from 1.9.0 to 1.10.0.
- [Release notes](https://github.com/matklad/once_cell/releases)
- [Changelog](https://github.com/matklad/once_cell/blob/master/CHANGELOG.md)
- [Commits](https://github.com/matklad/once_cell/compare/v1.9.0...v1.10.0)

---
updated-dependencies:
- dependency-name: once_cell
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-03 19:31:09 +00:00
KitaitiMakoto 44b91c6f07 Merge pull request 'Update crates' (#1036) from ldap3-0.10.2 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1036
2022-03-01 13:30:25 +00:00
Kitaiti Makoto 144565d13e Merge remote-tracking branch 'github/dependabot/cargo/bcrypt-0.12.0' into ldap3-0.10.2 2022-03-01 22:06:38 +09:00
dependabot[bot] 07fd66863d
Bump bcrypt from 0.11.0 to 0.12.0
Bumps [bcrypt](https://github.com/Keats/rust-bcrypt) from 0.11.0 to 0.12.0.
- [Release notes](https://github.com/Keats/rust-bcrypt/releases)
- [Commits](https://github.com/Keats/rust-bcrypt/compare/v0.11.0...v0.12.0)

---
updated-dependencies:
- dependency-name: bcrypt
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-28 19:46:44 +00:00
dependabot[bot] 385a5f7c33
Bump ldap3 from 0.9.3 to 0.10.2
Bumps [ldap3](https://github.com/inejge/ldap3) from 0.9.3 to 0.10.2.
- [Release notes](https://github.com/inejge/ldap3/releases)
- [Changelog](https://github.com/inejge/ldap3/blob/master/CHANGELOG.md)
- [Commits](https://github.com/inejge/ldap3/compare/v0.9.3...v0.10.2)

---
updated-dependencies:
- dependency-name: ldap3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-28 19:46:38 +00:00
KitaitiMakoto 8c8c2edc66 Merge pull request 'Add tests for Tag' (#1035) from ap-tests into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1035
2022-02-27 12:50:36 +00:00
Kitaiti Makoto f7e393bded Add tests for Tag 2022-02-27 21:47:40 +09:00
KitaitiMakoto 1df25e34b0 Merge pull request 'Update gettext-macros to 0.6.1' (#1033) from update-gettext-macros into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1033
2022-02-26 15:47:44 +00:00
Kitaiti Makoto ed491bad21 Update gettext-macros to 0.6.1 2022-02-26 11:20:45 +09:00
KitaitiMakoto 306f2d5738 Merge pull request 'Bump bcrypt from 0.10.1 to 0.11.0' (#1032) from bcrypt-0.11.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1032
2022-02-23 20:43:41 +00:00
dependabot[bot] 2196cb95c0
Bump bcrypt from 0.10.1 to 0.11.0
Bumps [bcrypt](https://github.com/Keats/rust-bcrypt) from 0.10.1 to 0.11.0.
- [Release notes](https://github.com/Keats/rust-bcrypt/releases)
- [Commits](https://github.com/Keats/rust-bcrypt/compare/v0.10.1...v0.11.0)

---
updated-dependencies:
- dependency-name: bcrypt
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-23 19:27:58 +00:00
KitaitiMakoto b4d494a5c7 Merge pull request 'Bump tracing from 0.1.30 to 0.1.31' (#1030) from tracing-0.1.31 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1030
2022-02-19 18:17:15 +00:00
Kitaiti Makoto e8432f575e Merge remote-tracking branch 'origin/main' into tracing-0.1.31 2022-02-20 02:55:50 +09:00
KitaitiMakoto eb48723c08 Merge pull request 'Bump tracing-subscriber from 0.3.8 to 0.3.9' (#1028) from tracing-subscriber-0.3.9 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1028
2022-02-19 17:28:02 +00:00
dependabot[bot] 65168202b4
Bump tracing from 0.1.30 to 0.1.31
Bumps [tracing](https://github.com/tokio-rs/tracing) from 0.1.30 to 0.1.31.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-0.1.30...tracing-0.1.31)

---
updated-dependencies:
- dependency-name: tracing
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-18 19:38:05 +00:00
dependabot[bot] dced3cf881
Bump tracing-subscriber from 0.3.8 to 0.3.9
Bumps [tracing-subscriber](https://github.com/tokio-rs/tracing) from 0.3.8 to 0.3.9.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-subscriber-0.3.8...tracing-subscriber-0.3.9)

---
updated-dependencies:
- dependency-name: tracing-subscriber
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-18 19:37:58 +00:00
KitaitiMakoto 170fd6026c Merge pull request 'Bump ammonia from 3.1.3 to 3.1.4' (#1027) from update-crate into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1027
2022-02-18 10:49:25 +00:00
dependabot[bot] 1ccaa817b3
Bump ammonia from 3.1.3 to 3.1.4
Bumps [ammonia](https://github.com/rust-ammonia/ammonia) from 3.1.3 to 3.1.4.
- [Release notes](https://github.com/rust-ammonia/ammonia/releases)
- [Changelog](https://github.com/rust-ammonia/ammonia/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-ammonia/ammonia/compare/v3.1.3...v3.1.4)

---
updated-dependencies:
- dependency-name: ammonia
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-17 19:29:15 +00:00
KitaitiMakoto 65ba083720 Merge pull request 'Switch gettext crate from GitHub to crates.io' (#1018) from gettext-cratesio into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1018
2022-02-17 14:32:47 +00:00
Kitaiti Makoto dba902d262 Merge remote-tracking branch 'origin/main' into gettext-cratesio 2022-02-17 23:09:50 +09:00
Kitaiti Makoto d52c7a3afa Update gettext-macros and gettext-utils 2022-02-17 23:08:16 +09:00
KitaitiMakoto c63f88fb7f Merge pull request 'Update crates' (#1026) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1026
2022-02-17 13:58:30 +00:00
Kitaiti Makoto 4412e0598f Follow API change of rocket_i18n 2022-02-17 22:36:13 +09:00
Kitaiti Makoto eb22c1168e Install rocket_i18n from crates.io 2022-02-17 22:36:13 +09:00
Kitaiti Makoto 917eda356d Use rocket_i18n on crates.io 2022-02-17 22:36:13 +09:00
Kitaiti Makoto bc6580bbdc Switch gettext crate from GitHub to crates.io 2022-02-17 22:36:13 +09:00
Kitaiti Makoto 920cf622c5 Merge remote-tracking branch 'github/dependabot/cargo/askama_escape-0.10.3' into update-crates 2022-02-17 22:34:41 +09:00
dependabot[bot] 13dcb193dc
Bump askama_escape from 0.10.2 to 0.10.3
Bumps [askama_escape](https://github.com/djc/askama) from 0.10.2 to 0.10.3.
- [Release notes](https://github.com/djc/askama/releases)
- [Commits](https://github.com/djc/askama/commits)

---
updated-dependencies:
- dependency-name: askama_escape
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 19:34:15 +00:00
dependabot[bot] 3afb724fed
Bump serde_json from 1.0.78 to 1.0.79
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.78 to 1.0.79.
- [Release notes](https://github.com/serde-rs/json/releases)
- [Commits](https://github.com/serde-rs/json/compare/v1.0.78...v1.0.79)

---
updated-dependencies:
- dependency-name: serde_json
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-14 19:32:32 +00:00
KitaitiMakoto 9662936b44 Merge pull request 'Update crates' (#1023) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1023
2022-02-08 08:13:37 +00:00
Kitaiti Makoto 4780472d48 Make Circle CI follow ructe change 2022-02-08 16:54:03 +09:00
Kitaiti Makoto 6f68c4504b Update Cargo.lock 2022-02-08 16:53:35 +09:00
Kitaiti Makoto 28e0cdfe63 Remove activitystreams from dependencies 2022-02-08 16:50:00 +09:00
Kitaiti Makoto a5003526c8 Follow clippy 2022-02-08 16:47:48 +09:00
Kitaiti Makoto ec3d78b509 Merge remote-tracking branches 'github/dependabot/cargo/ructe-0.14.0', 'github/dependabot/cargo/rsass-0.23.4' and 'github/dependabot/cargo/tracing-subscriber-0.3.8' into update-crates 2022-02-08 16:37:12 +09:00
dependabot[bot] 4205e38605
Bump rsass from 0.23.2 to 0.23.4
Bumps [rsass](https://github.com/kaj/rsass) from 0.23.2 to 0.23.4.
- [Release notes](https://github.com/kaj/rsass/releases)
- [Changelog](https://github.com/kaj/rsass/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/rsass/compare/v0.23.2...v0.23.4)

---
updated-dependencies:
- dependency-name: rsass
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-07 19:38:02 +00:00
dependabot[bot] 8438d48c71
Bump ructe from 0.13.4 to 0.14.0
Bumps [ructe](https://github.com/kaj/ructe) from 0.13.4 to 0.14.0.
- [Release notes](https://github.com/kaj/ructe/releases)
- [Changelog](https://github.com/kaj/ructe/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/ructe/compare/v0.13.4...v0.14.0)

---
updated-dependencies:
- dependency-name: ructe
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-07 19:37:50 +00:00
dependabot[bot] 52faf5996b
Bump tracing-subscriber from 0.3.7 to 0.3.8
Bumps [tracing-subscriber](https://github.com/tokio-rs/tracing) from 0.3.7 to 0.3.8.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-subscriber-0.3.7...tracing-subscriber-0.3.8)

---
updated-dependencies:
- dependency-name: tracing-subscriber
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-07 19:37:34 +00:00
KitaitiMakoto 69eccc50a3 Merge pull request 'Add ActivityPub tests and a little fixes' (#1021) from ap-tests into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1021
2022-02-05 09:17:27 +00:00
Kitaiti Makoto 54cbdb236f Add tests for Mention activity 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 34c374de1a Attach icon field to User activity only when it has an avatar 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 113722e4ba Add more ActivityPub tests for User 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 3b429909f1 Extract User::outbox_collection_page() from outbox_collection() for testability 2022-02-05 17:58:00 +09:00
Kitaiti Makoto f1cdf4552f Extract User::outbox_collection() from outbox() for testability 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 7d320e57da Don't make medias::tests::clean() panic when file not found 2022-02-05 17:58:00 +09:00
Kitaiti Makoto e1a598a459 Attach avatar to sample user 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 6107842303 Add tests for Comment::to_activity() and build_delete() 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 65372d2018 Extract comments::tests::prepare_activity() 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 4842385ca6 Add test about reply 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 05f55fc1ca Add https scheme to mention URI in contents 2022-02-05 17:58:00 +09:00
Kitaiti Makoto e8153d4b42 Fix Comment::to_activity() 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 2087a659f9 Add test to validate comment json 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 1770336c11 Make format_datetime() crate public 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 9c177f6286 Change format_datetime implementation according to feature 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 93a2c6d99f Add tests for Follow::build_accept() and build_undo() 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 5ef76873b7 Fix tests 2022-02-05 17:58:00 +09:00
Kitaiti Makoto b97c3fdb87 Extract Follow::build_accept 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 64838ad864 Add test for Follow::to_activity() 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 4df2ce5744 Add mention to test suite for Post activities 2022-02-05 17:58:00 +09:00
Kitaiti Makoto c1f42836d9 Fix variable names 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 0cbc9438d4 Complete a slash to Post Create activity's ID 2022-02-05 17:58:00 +09:00
Kitaiti Makoto ca6cd534d8 Add tests for Post::to_activity(), create_activity() and update_activity() 2022-02-05 17:58:00 +09:00
Kitaiti Makoto f529e803ef Fix ap_url of Reshare 2022-02-05 17:58:00 +09:00
Kitaiti Makoto e5bc84badf Add tests for Reshare::to_activity and build_undo 2022-02-05 17:58:00 +09:00
Kitaiti Makoto e2077bed59 Add test for Like::build_undo 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 9ab9d29efb Remove double slashes 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 5373a674e1 Add test for Like::to_activity 2022-02-05 17:58:00 +09:00
Kitaiti Makoto bfaa2fafaf Install assert-json-diff 2022-02-05 17:58:00 +09:00
Kitaiti Makoto d4a13a13d4 Add assert-json-diff to dev dependencies of plume-common
% cargo add assert-json-diff -p plume-models --dev
2022-02-05 17:58:00 +09:00
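assert-json-diff, added above as a dev dependency, compares two serde_json values and panics with a structural diff on mismatch, which is what the to_activity()/build_undo() tests in this series rely on. A minimal usage sketch; the JSON document is invented, not one of Plume's fixtures:

    use assert_json_diff::assert_json_eq;
    use serde_json::json;

    #[test]
    fn like_activity_has_expected_shape() {
        let actual = json!({
            "type": "Like",
            "actor": "https://local.example/@/alice/",
            "object": "https://local.example/~/blog/a-post"
        });
        let expected = actual.clone();
        // Fails with a readable JSON diff when the two documents differ.
        assert_json_eq!(actual, expected);
    }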
Kitaiti Makoto 92c0368dd8 Install activitystreams 0.7.0 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 80c0426768 Add activitystreams 0.7.0 to plume-models dependencies 2022-02-05 17:58:00 +09:00
Kitaiti Makoto 71b21289ab Install activitystreams 0.7.0 2022-02-05 17:58:00 +09:00
Kitaiti Makoto fa861ff314 Add activitystreams 0.7.0 to plume-common dependencies 2022-02-05 17:58:00 +09:00
KitaitiMakoto 400d2dee32 Merge pull request 'Update crates' (#1020) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1020
2022-02-05 05:48:47 +00:00
Kitaiti Makoto 3993dda17d Merge remote-tracking branch 'github/dependabot/cargo/tracing-0.1.30' into update-crates 2022-02-05 14:31:20 +09:00
dependabot[bot] b1255efdcd
Bump tracing from 0.1.29 to 0.1.30
Bumps [tracing](https://github.com/tokio-rs/tracing) from 0.1.29 to 0.1.30.
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-0.1.29...tracing-0.1.30)

---
updated-dependencies:
- dependency-name: tracing
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-04 19:30:06 +00:00
dependabot[bot] 22036c6a94
Bump rsass from 0.23.0 to 0.23.2
Bumps [rsass](https://github.com/kaj/rsass) from 0.23.0 to 0.23.2.
- [Release notes](https://github.com/kaj/rsass/releases)
- [Changelog](https://github.com/kaj/rsass/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kaj/rsass/compare/v0.23.0...v0.23.2)

---
updated-dependencies:
- dependency-name: rsass
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-04 19:29:57 +00:00
KitaitiMakoto f2df4b7d7d Merge pull request 'Don't fill empty content when switching rich editor' (#1017) from content-placeholder into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1017
2022-01-29 17:29:41 +00:00
Kitaiti Makoto 7c57bf78a1 [skip ci]Add changelog 2022-01-30 02:29:16 +09:00
Kitaiti Makoto 8d898ff477 Don't fill empty content when switching rich editor 2022-01-30 02:26:23 +09:00
KitaitiMakoto a1045dbce9 Merge pull request 'Fixes #988 Fix email_blocklist schema' (#1016) from block_list-schema into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1016
2022-01-29 16:38:33 +00:00
Kitaiti Makoto 23a07f3f7b [skip ci]Add changelog 2022-01-30 01:17:33 +09:00
Kitaiti Makoto 458d87fef1 Run migration 2022-01-30 01:16:51 +09:00
Kitaiti Makoto 82df86d09e Set null to email_blocklist table fields for SQLite 2022-01-30 01:16:03 +09:00
Kitaiti Makoto 858cad2995 Set null to email_blocklist table fields 2022-01-30 00:54:10 +09:00
Kitaiti Makoto c0483cf12e Generate migration files for adding NOT NULL constraints to email_blocklist table fields
% diesel migration generate add_not_null_constraint_to_email_blocklist
2022-01-30 00:45:18 +09:00
KitaitiMakoto 57a54cf016 Merge pull request 'Update Rust' (#1015) from bump-rust into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1015
2022-01-27 03:14:47 +00:00
Kitaiti Makoto 325d8cde08 [skip ci]Add changelog about Rust bump 2022-01-27 12:01:54 +09:00
Kitaiti Makoto 9e2c76c3bc Satisfy clippy 2022-01-27 11:54:41 +09:00
Kitaiti Makoto 996b161c1e Satisfy clippy 2022-01-27 11:45:35 +09:00
Kitaiti Makoto 831ef88431 Update Rust 2022-01-27 11:45:21 +09:00
KitaitiMakoto 89517e5988 Merge pull request 'Update crates' (#1014) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1014
2022-01-27 01:22:16 +00:00
Kitaiti Makoto 48dbcf75a9 Update crates 2022-01-27 10:02:47 +09:00
Kitaiti Makoto a56a9bc9c5 Add changelogs 2022-01-27 10:02:30 +09:00
KitaitiMakoto 918103fa29 Merge pull request 'Fix #1011 Add Basque' (#1013) from langs into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1013
2022-01-26 13:37:53 +00:00
KitaitiMakoto c9b8f5a739 Merge pull request 'Fix #1009 Email Sign-up Explanation' (#1012) from email-signup-explanation into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1012
2022-01-26 13:37:38 +00:00
Kitaiti Makoto d58ff36d80 Update po files 2022-01-26 22:17:11 +09:00
Kitaiti Makoto 00d647c0ad Add Basque po files 2022-01-26 22:16:55 +09:00
Kitaiti Makoto a27f196578 Add Basque 2022-01-26 22:15:35 +09:00
Kitaiti Makoto abe82b79ce Update pot file 2022-01-26 22:10:47 +09:00
Kitaiti Makoto 95230c3a23 Add explanation for email signup 2022-01-26 22:10:14 +09:00
Kitaiti Makoto eade69a12c Fix indentation 2022-01-26 22:09:45 +09:00
KitaitiMakoto 4f89e214ef Merge pull request 'Update crates' (#1010) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1010
2022-01-25 00:46:41 +00:00
Kitaiti Makoto 2936679326 Update crates 2022-01-25 09:26:23 +09:00
KitaitiMakoto 18a67fe1b5 Merge pull request 'Update crates' (#1008) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1008
2022-01-16 03:23:13 +00:00
Kitaiti Makoto ba29c8ef6f Follow atom_syndication API change 2022-01-16 12:02:12 +09:00
Kitaiti Makoto d253f1a020 Upgrade atom_syndication 2022-01-16 12:02:03 +09:00
Kitaiti Makoto 03060d6ee2 Update crates 2022-01-16 11:31:18 +09:00
KitaitiMakoto ac8ad3aae2 Merge pull request 'Add dependabot.yml' (#1007) from dependabot into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1007
2022-01-15 02:01:40 +00:00
Kitaiti Makoto 14e294efed Add dependabot.yml 2022-01-15 06:02:16 +09:00
KitaitiMakoto ec3205b372 Merge pull request 'v0.7.1' (#1006) from v0.7.1 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1006
2022-01-12 02:10:48 +00:00
Kitaiti Makoto 45119d9a8c (cargo-release) version {{version}} 2022-01-12 10:42:45 +09:00
Kitaiti Makoto 1065078f75 Update translation files 2022-01-12 10:29:50 +09:00
Kitaiti Makoto 0ce904a985 Update translation files 2022-01-12 10:24:14 +09:00
Kitaiti Makoto 254eec8e6a Follow cargo-release update 2022-01-12 10:19:58 +09:00
KitaitiMakoto 0e4cb4f6e1 Merge pull request '[skip ci]Fix # of pull request in changelog' (#1004) from fix-changelog into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1004
2022-01-12 00:37:19 +00:00
Kitaiti Makoto 9b05ac90df [skip ci]Fix # of pull request in changelog 2022-01-12 09:36:32 +09:00
KitaitiMakoto f28a7fa508 Merge pull request 'Add changelogs' (#1003) from changelog into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1003
2022-01-12 00:35:26 +00:00
Kitaiti Makoto 65e95d8998 Add changelogs 2022-01-12 09:34:39 +09:00
KitaitiMakoto 808b8f8e98 Merge pull request 'Fix #1001 Deny access to disabled sign-up strategy' (#1002) from restrict-signup into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/1002
2022-01-12 00:33:04 +00:00
Kitaiti Makoto 43b46a8be4 Make email_signups::create return ErrorPage on error 2022-01-12 09:20:20 +09:00
Kitaiti Makoto 9bbfc71fc8 Fix registration openness condition mistake 2022-01-12 09:19:48 +09:00
Kitaiti Makoto 5d58b31f1c Remove unreachable code 2022-01-12 09:00:04 +09:00
Kitaiti Makoto e31a2238fb Respond with error status code when error 2022-01-12 08:58:42 +09:00
Kitaiti Makoto 7de37bc9b7 Hide password sign-up routings when it's disabled 2022-01-12 08:46:11 +09:00
Kitaiti Makoto 13f7734751 Hide email sign-up routings when it's disabled 2022-01-12 08:40:20 +09:00
Kitaiti Makoto b4395bce99 Implement request guard to detect enabled sign-up strategy 2022-01-12 08:39:33 +09:00
Kitaiti Makoto 7c82b08615 Use into() instead of explicitly wrapping return values 2022-01-12 04:36:01 +09:00
Kitaiti Makoto 6498dbfbb7 Reuse form values 2022-01-12 04:28:02 +09:00
Kitaiti Makoto 74254aed4a Move require_logins from plume-common to plume 2022-01-12 04:18:13 +09:00
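The request guard mentioned in #1002 is the Rocket mechanism that lets a route refuse to match when the corresponding sign-up strategy is disabled. A hedged sketch against the Rocket 0.4 API; the guard name and the configuration lookup are invented for illustration:

    use rocket::request::{self, FromRequest, Request};
    use rocket::{http::Status, Outcome};

    pub struct PasswordSignupEnabled;

    impl<'a, 'r> FromRequest<'a, 'r> for PasswordSignupEnabled {
        type Error = ();

        fn from_request(_request: &'a Request<'r>) -> request::Outcome<Self, Self::Error> {
            // The real code would consult the instance configuration (e.g. a
            // signup-strategy setting); hard-coded here to keep the sketch short.
            let password_signup_enabled = true;
            if password_signup_enabled {
                Outcome::Success(PasswordSignupEnabled)
            } else {
                Outcome::Failure((Status::NotFound, ()))
            }
        }
    }

Routes that take PasswordSignupEnabled as a parameter then answer 404 while that strategy is switched off.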
KitaitiMakoto 8c48abf48e Merge pull request 'Update crates' (#997) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/997
2022-01-07 18:58:18 +00:00
Kitaiti Makoto 8958226604 Upgrade shrinkwraprs 2022-01-08 03:54:38 +09:00
Kitaiti Makoto 005a6db230 Update crates 2022-01-08 03:20:22 +09:00
KitaitiMakoto 4397abd8ab Merge pull request 'Add plume-front.pot' (#994) from front-po into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/994
2022-01-07 15:40:31 +00:00
Kitaiti Makoto e53882f555 Add plume-front.pot 2022-01-08 00:39:17 +09:00
KitaitiMakoto 5d5e61dfa1 Merge pull request 'Update crates' (#993) from update-crates into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/993
2022-01-06 21:45:53 +00:00
Kitaiti Makoto c5c6b70a89 Upgrade ldap3 2022-01-07 06:17:58 +09:00
Kitaiti Makoto 6778a0e943 Remove hyper from plume-common 2022-01-07 06:16:00 +09:00
Kitaiti Makoto 677e238c6d Follow API change of heck 2022-01-07 06:12:15 +09:00
Kitaiti Makoto b0bc2372fa Upgrade heck 2022-01-07 06:12:03 +09:00
Kitaiti Makoto 6a808c7cc5 Upgrade hex 2022-01-07 06:07:07 +09:00
Kitaiti Makoto d53543ccb1 Upgrade base64 2022-01-07 06:06:49 +09:00
Kitaiti Makoto 88d7d54601 Upgrade whatlang 2022-01-07 06:05:20 +09:00
Kitaiti Makoto 0f0c896887 Upgrade itertools 2022-01-07 06:05:05 +09:00
Kitaiti Makoto 65233c0a9a Upgrade ammonia 2022-01-07 05:53:09 +09:00
Kitaiti Makoto 32e1e4788f Upgrade shrinkwraprs 2022-01-07 05:49:06 +09:00
Kitaiti Makoto 181a78876b Remove askama_escape from dependencies of plume-models 2022-01-07 05:47:46 +09:00
Kitaiti Makoto 61d5446113 Use plume_common::escape() instead of askama_escape::escape() directly 2022-01-07 05:47:08 +09:00
Kitaiti Makoto c786569171 Define plume_common::escape() 2022-01-07 05:36:39 +09:00
Kitaiti Makoto d83a75e3f4 Add askama_escape to plume-common 2022-01-07 05:26:32 +09:00
Kitaiti Makoto a6f06559ea Remove rspassword from dependencies 2022-01-07 05:15:07 +09:00
Kitaiti Makoto 2084145dd3 Upgrade multipart 2022-01-07 05:12:29 +09:00
Kitaiti Makoto dd54058516 Upgrade guid-create 2022-01-07 05:09:08 +09:00
Kitaiti Makoto 4056a54d44 Upgrade tracing-subscriber 2022-01-07 05:03:07 +09:00
Kitaiti Makoto 191cd11741 Upgrade dotenv 2022-01-07 04:59:03 +09:00
Kitaiti Makoto 800e74da67 Follow API change of validator 2022-01-07 04:55:49 +09:00
Kitaiti Makoto 237da47950 Upgrade validator 2022-01-07 04:55:41 +09:00
Kitaiti Makoto ec12539fd0 Follow rsass API change 2022-01-07 04:41:20 +09:00
Kitaiti Makoto a537db559b Upgrade rsass 2022-01-07 04:41:11 +09:00
Kitaiti Makoto 2ba158df67 Update crates 2022-01-07 01:41:22 +09:00
KitaitiMakoto c0c066547f Merge pull request 'Update po files' (#991) from po into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/991
2022-01-06 13:24:09 +00:00
Kitaiti Makoto c3f59b14b9 Update po files 2022-01-06 22:14:24 +09:00
KitaitiMakoto 1d06a8f1ad Merge pull request 'Fixes #636 Email sign up feature' (#990) from mail-confirmation into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/990
2022-01-06 12:04:24 +00:00
Kitaiti Makoto efaf1295e9 Suppress clippy 2022-01-06 20:43:40 +09:00
Kitaiti Makoto 4bc9cf3ad1 [skip ci]Complete changelogs 2022-01-06 20:35:34 +09:00
Kitaiti Makoto 1e3851ea69 Execute SQLs for email_signups in transaction 2022-01-06 20:27:55 +09:00
Kitaiti Makoto b6d38536e3 Add email signup feature 2022-01-06 20:18:20 +09:00
Kitaiti Makoto 9b4c678aa9 Make signup token transparent 2022-01-06 18:40:24 +09:00
Kitaiti Makoto a65775d85b Implement EmailSignup 2022-01-05 03:17:58 +09:00
Kitaiti Makoto 192c7677c3 Run migration
% diesel migration run
2022-01-05 00:01:36 +09:00
Kitaiti Makoto 2a31a7b601 Define email_signups table 2022-01-05 00:01:36 +09:00
Kitaiti Makoto 355fd7cb1d Generate create_email_signups_table migration
% diesel migration generate create_email_signups_table
2022-01-05 00:01:36 +09:00
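diesel migration generate only creates empty up.sql/down.sql files; the table definition itself and the matching schema entry come afterwards. A hypothetical schema.rs entry for the table created here; the column names are guesses for illustration, not Plume's actual schema:

    diesel::table! {
        email_signups (id) {
            id -> Int4,
            email -> Varchar,
            token -> Varchar,
            expiration_date -> Timestamp,
        }
    }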
Kitaiti Makoto 40efd73dfc Add config for sign up strategy 2022-01-05 00:01:32 +09:00
KitaitiMakoto 31b144c76d Merge pull request 'Remove unnecessary prefix' (#986) from fix-tag into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/986
2022-01-04 10:17:53 +00:00
Kitaiti Makoto 31a46514cb Remove unnecessary prefix 2022-01-04 19:17:04 +09:00
KitaitiMakoto c8d906eb99 Merge pull request 'Quote version tag' (#985) from fix-action into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/985
2022-01-04 10:11:19 +00:00
Kitaiti Makoto 2895a1c819 Quote version tag 2022-01-04 19:10:37 +09:00
KitaitiMakoto a4a5d08662 Merge pull request 'deploy-tags' (#984) from deploy-tags into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/984
2022-01-04 10:08:31 +00:00
Kitaiti Makoto b97c9d2165 Deploy Docker images with tags 2022-01-04 19:07:47 +09:00
Kitaiti Makoto 5b7e8a69a5 Revert "Deploy tags"
This reverts commit d9a59f1b07.
2022-01-04 19:04:58 +09:00
KitaitiMakoto 9601e99e33 Merge pull request 'Deploy tags' (#983) from deploy-tags into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/983
2022-01-04 09:42:19 +00:00
Kitaiti Makoto d9a59f1b07 Deploy tags 2022-01-04 18:41:06 +09:00
KitaitiMakoto b6a6af906a Merge pull request '[skip ci]Add changelog about MAIL_PORT' (#982) from changelog into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/982
2022-01-03 16:07:13 +00:00
Kitaiti Makoto 2c4799ce27 [skip ci]Add changelog about MAIL_PORT 2022-01-04 01:06:12 +09:00
KitaitiMakoto b33b19849c Merge pull request 'Fix notification page error' (#981) from fix-notification-page into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/981
2022-01-03 14:29:43 +00:00
Kitaiti Makoto e398f36c57 Add changelog 2022-01-03 23:28:56 +09:00
Kitaiti Makoto ee6064eee8 Don't unwrap() 2022-01-03 23:28:05 +09:00
KitaitiMakoto 9d012c8f3c Merge pull request 'Closes #944 Mail server port configuration' (#980) from mail-server-port into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/980
2022-01-03 10:37:53 +00:00
Kitaiti Makoto 8888dbba0a Initialize SMTP client with port number 2022-01-03 18:09:26 +09:00
Kitaiti Makoto 6f8d5c1eb4 Add SmtpClient::new_with_addr() method 2022-01-03 18:09:13 +09:00
Kitaiti Makoto 62da4a3d5c Add native-tls to plume-models' dependencies 2022-01-03 18:04:49 +09:00
Kitaiti Makoto 5cfc8e71a5 Remove Lettre from plume module dependencies 2022-01-03 17:36:37 +09:00
Kitaiti Makoto a599760891 Use smtp module from plume_models instead of lettre directly 2022-01-03 17:36:11 +09:00
Kitaiti Makoto 00324f668f Add port field to MailConfig 2022-01-03 17:25:46 +09:00
Kitaiti Makoto d4549704b9 Install Lettre 2022-01-03 17:11:35 +09:00
Kitaiti Makoto 0836e3d693 Add Lettre to plume-models' dependencies 2022-01-03 17:09:47 +09:00
Kitaiti Makoto 0058c3053d Move mail config from plume::mail::mailer to plume_models::CONFIG 2022-01-03 15:50:04 +09:00
KitaitiMakoto 2a1a0a23a5 Merge pull request 'Move bottombar styles to _article.scss' (#978) from bottombar-styling into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/978
2022-01-02 18:03:46 +00:00
Kitaiti Makoto 5614e3bd59 Move bottombar styles to _article.scss 2022-01-03 03:01:59 +09:00
KitaitiMakoto acbda3cde1 Merge pull request 'Make bottom bar smaller in narrow window' (#977) from layout-post-control into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/977
2022-01-02 17:50:20 +00:00
Kitaiti Makoto 0755436458 Make bottom bar smaller in narrow window 2022-01-03 02:49:45 +09:00
KitaitiMakoto 3daf405ae2 Merge pull request 'Make blog cover clickable' (#976) from clickable-blog-image into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/976
2022-01-02 17:40:30 +00:00
Kitaiti Makoto 53dc3b0c03 Fix cover size of posts 2022-01-03 02:39:17 +09:00
Kitaiti Makoto 371dcc5091 Address blog title position in dashboard 2022-01-03 02:37:45 +09:00
Kitaiti Makoto 905fe54fa3 Make blog cover a link 2022-01-03 02:35:07 +09:00
Kitaiti Makoto 62c0827ff5 Remove needless whitespaces 2022-01-03 02:32:17 +09:00
KitaitiMakoto d6c65ce81a Merge pull request 'Fix #927 Ensure Post ap_url' (#975) from ensure-ap-url into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/975
2022-01-02 17:24:04 +00:00
Kitaiti Makoto 5532b4a4d7 Ensure Post ap_url 2022-01-03 02:21:34 +09:00
KitaitiMakoto 637bd3347b Merge pull request 'Fix #967 Fix comment link' (#974) from fix-comment-link into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/974
2022-01-02 17:08:10 +00:00
Kitaiti Makoto bac373a818 Fix comment link 2022-01-03 02:06:59 +09:00
Kitaiti Makoto f0e7ea5640 (cargo-release) version {{version}} 2022-01-03 01:17:07 +09:00
Kitaiti Makoto 4b981e0fad (cargo-release) version {{version}} 2022-01-03 00:56:29 +09:00
KitaitiMakoto 8fb9d861de Merge pull request '[skip ci]Give up to deploy tags' (#973) from giveup-tag into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/973
2022-01-02 15:43:05 +00:00
Kitaiti Makoto 199269ba3c [skip ci]Give up to deploy tags 2022-01-03 00:42:36 +09:00
KitaitiMakoto 0b9ec4c52c Merge pull request 'Use Git tag for Docker image tag' (#972) from deploy-tag into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/972
2022-01-02 15:14:38 +00:00
Kitaiti Makoto 0e51565cc8 Use Git tag for Docker image tag 2022-01-03 00:14:00 +09:00
KitaitiMakoto 0418d35b67 Merge pull request 'Fix a typo' (#971) from deploy-tag into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/971
2022-01-02 15:06:26 +00:00
Kitaiti Makoto bf9d25363b Fix a typo 2022-01-03 00:04:57 +09:00
KitaitiMakoto 84f00c57d1 Merge pull request 'Fix tag calculation' (#970) from deploy-tag into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/970
2022-01-02 15:02:15 +00:00
Kitaiti Makoto 7c1a5421fa Fix tag calculation 2022-01-03 00:01:18 +09:00
KitaitiMakoto 3815bfe980 Merge pull request '[skip ci]Deploy tags to Docker Hub' (#969) from deploy-tag into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/969
2022-01-02 14:51:03 +00:00
Kitaiti Makoto de448c3192 [skip ci]Deploy tags to Docker Hub 2022-01-02 23:49:42 +09:00
KitaitiMakoto 967e2dfde6 Merge pull request 'Add GitHub Action to deploy Docker image' (#968) from gh-action into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/968
2022-01-02 14:28:27 +00:00
Kitaiti Makoto dd3c1eac5f Add GitHub Action to deploy Docker image 2022-01-02 23:26:39 +09:00
KitaitiMakoto abc0a794c1 Merge pull request 'v0.7.0' (#923) from v0.7.0 into main
Reviewed-on: https://git.joinplu.me/Plume/Plume/pulls/923
2022-01-02 11:48:46 +00:00
202 changed files with 13508 additions and 7077 deletions


@@ -10,8 +10,8 @@ executors:
type: boolean
default: false
docker:
- image: plumeorg/plume-buildenv:v0.4.0
- image: <<#parameters.postgres>>circleci/postgres:9.6-alpine<</parameters.postgres>><<^parameters.postgres>>alpine:latest<</parameters.postgres>>
- image: plumeorg/plume-buildenv:v0.8.0
- image: <<#parameters.postgres>>cimg/postgres:14.2<</parameters.postgres>><<^parameters.postgres>>alpine:latest<</parameters.postgres>>
environment:
POSTGRES_USER: postgres
POSTGRES_DB: plume
@@ -38,7 +38,7 @@ commands:
- restore_cache:
keys:
- v0-<< parameters.cache >>-{{ checksum "Cargo.lock" }}-{{ .Branch }}
- v0-<< parameters.cache >>-{{ checksum "Cargo.lock" }}-master
- v0-<< parameters.cache >>-{{ checksum "Cargo.lock" }}-main
cache:
description: push cache
@@ -63,7 +63,8 @@ commands:
type: boolean
default: false
steps:
- run: cargo clippy <<^parameters.no_feature>>--no-default-features --features="${FEATURES}"<</parameters.no_feature>> --release -p <<parameters.package>> -- -D warnings -A clippy::needless_borrow
- run: rustup component add clippy --toolchain nightly-2022-07-19-x86_64-unknown-linux-gnu
- run: cargo clippy <<^parameters.no_feature>>--no-default-features --features="${FEATURES}"<</parameters.no_feature>> --release -p <<parameters.package>> -- -D warnings
run_with_coverage:
description: run command with environment for coverage
@@ -111,6 +112,7 @@ jobs:
name: default
steps:
- restore_env
- run: rustup component add rustfmt --toolchain nightly-2022-07-19-x86_64-unknown-linux-gnu
- run: cargo fmt --all -- --check
clippy:
@@ -258,4 +260,4 @@ workflows:
filters:
branches:
only:
- /^master/
- /^main/


@@ -1,4 +1,4 @@
FROM debian:buster-20210208
FROM rust:1
ENV PATH="/root/.cargo/bin:${PATH}"
#install native/circleci/build dependancies
@@ -6,19 +6,19 @@ RUN apt update &&\
apt install -y --no-install-recommends git ssh tar gzip ca-certificates default-jre&&\
echo "deb [trusted=yes] https://apt.fury.io/caddy/ /" \
| tee -a /etc/apt/sources.list.d/caddy-fury.list &&\
wget -qO - https://artifacts.crowdin.com/repo/GPG-KEY-crowdin | apt-key add - &&\
echo "deb https://artifacts.crowdin.com/repo/deb/ /" > /etc/apt/sources.list.d/crowdin.list &&\
apt update &&\
apt install -y --no-install-recommends binutils-dev build-essential cmake curl gcc gettext git libcurl4-openssl-dev libdw-dev libelf-dev libiberty-dev libpq-dev libsqlite3-dev libssl-dev make openssl pkg-config postgresql postgresql-contrib python zlib1g-dev python3-pip zip unzip libclang-dev clang caddy&&\
apt install -y --no-install-recommends binutils-dev build-essential cmake curl gcc gettext git libcurl4-openssl-dev libdw-dev libelf-dev libiberty-dev libpq-dev libsqlite3-dev libssl-dev make openssl pkg-config postgresql postgresql-contrib python zlib1g-dev python3-dev python3-pip python3-setuptools zip unzip libclang-dev clang caddy crowdin3 &&\
rm -rf /var/lib/apt/lists/*
#install and configure rust
RUN curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain nightly-2021-11-27 -y &&\
rustup component add rustfmt clippy &&\
rustup component add rust-std --target wasm32-unknown-unknown
#stick rust environment
COPY rust-toolchain ./
RUN rustup component add rustfmt clippy
#compile some deps
RUN cargo install wasm-pack &&\
cargo install grcov &&\
strip /root/.cargo/bin/* &&\
rm -fr ~/.cargo/registry
#set some compilation parameters
@ -29,11 +29,3 @@ RUN pip3 install selenium
#configure caddy
COPY Caddyfile /Caddyfile
#install crowdin
RUN mkdir /crowdin && cd /crowdin &&\
curl -O https://downloads.crowdin.com/cli/v2/crowdin-cli.zip &&\
unzip crowdin-cli.zip && rm crowdin-cli.zip &&\
cd * && mv crowdin-cli.jar /usr/local/bin && cd && rm -rf /crowdin &&\
/bin/echo -e '#!/bin/sh\njava -jar /usr/local/bin/crowdin-cli.jar $@' > /usr/local/bin/crowdin &&\
chmod +x /usr/local/bin/crowdin


@ -0,0 +1 @@
nightly-2022-07-19


@ -3,3 +3,5 @@ data
Dockerfile
docker-compose.yml
.env
target
data

.envrc Normal file

@ -0,0 +1 @@
use flake

.github/dependabot.yml vendored Normal file

@ -0,0 +1,6 @@
version: 2
updates:
- package-ecosystem: cargo
directory: /
schedule:
interval: daily


@ -0,0 +1,30 @@
name: cd
on:
push:
branches:
- 'main'
jobs:
docker:
runs-on: ubuntu-latest
steps:
-
name: Set up QEMU
uses: docker/setup-qemu-action@v2
-
name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
-
name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
-
name: Build and push
id: docker_build
uses: docker/build-push-action@v3
with:
push: true
tags: plumeorg/plume:latest


@ -0,0 +1,36 @@
name: cd
on:
push:
tags:
- '*.*.*'
jobs:
docker:
runs-on: ubuntu-latest
steps:
-
name: Set up QEMU
uses: docker/setup-qemu-action@v2
-
name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
-
name: Docker meta
id: meta
uses: docker/metadata-action@v3
with:
images: plumeorg/plume
-
name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
-
name: Build and push
id: docker_build
uses: docker/build-push-action@v3
with:
push: true
tags: ${{ steps.meta.outputs.tags }}

.gitignore vendored

@ -20,3 +20,4 @@ search_index
__pycache__
.vscode/
*-journal
.direnv/


@ -4,6 +4,69 @@
## [Unreleased] - ReleaseDate
### Added
- Add 'My feed' to i18n timeline name (#1084)
- Bidirectional support for user page header (#1092)
### Changed
- Use blog title as slug (#1094, #1126, #1127)
- Bump Rust to nightly 2022-07-19 (#1119)
### Fixed
- Malfunction while creating a blog post in Persian (#1116)
- Email block list is ignored during email sign-up (#1122)
- Bug that some ActivityStreams properties are not parsed properly (#1129)
- Allow empty avatar for remote users (#1129)
- Percent encode blog FQN for federation interoperability (#1129)
- The same for `preferredUsername` (#1129)
## [[0.7.2]] - 2022-05-11
### Added
- Basque language (#1013)
- Unit tests for ActivityPub (#1021)
- Move to action area after liking/boosting/commenting (#1074)
### Changed
- Bump Rust to nightly 2022-01-26 (#1015)
- Remove "Latest articles" timeline (#1069)
- Change order of timeline tabs (#1069, #1070, #1072)
- Migrate ActivityPub-related crates from activitypub 0.1 to activitystreams 0.7 (#1022)
### Fixed
- Add explanation of the sign-up step on the sign-up page in email sign-up mode (#1012)
- Add NOT NULL constraint to email_blocklist table fields (#1016)
- Don't fill empty content when switching rich editor (#1017)
- Fix accept header (#1058)
- Render 404 page instead of 500 when data is not found (#1062)
- Reuse reqwest client when broadcasting (#1059)
- Reduce the number of broadcast HTTP requests sent at once to prevent them from timing out (#1068, #1071)
- Some ActivityPub data (#1021)
## [[0.7.1]] - 2022-01-12
### Added
- Introduce environment variable `MAIL_PORT` (#980)
- Introduce email sign-up feature (#636, #1002)
### Changed
- Some styling improvements (#976, #977, #978)
- Respond with error status code on error (#1002)
### Fixed
- Fix comment link (#974)
- Fix a bug that prevents posting articles (#975)
- Fix a bug that notification page doesn't show (#981)
## [[0.7.0]] - 2022-01-02
### Added
@ -218,7 +281,9 @@
- Ability to create multiple blogs
<!-- next-url -->
[Unreleased]: https://github.com/Plume-org/Plume/compare/0.7.0...HEAD
[Unreleased]: https://github.com/Plume-org/Plume/compare/0.7.2...HEAD
[[0.7.2]]: https://github.com/Plume-org/Plume/compare/0.7.1...0.7.2
[[0.7.1]]: https://github.com/Plume-org/Plume/compare/0.7.0...0.7.1
[[0.7.0]]: https://github.com/Plume-org/Plume/compare/0.6.0...0.7.0
[[0.6.0]]: https://github.com/Plume-org/Plume/compare/0.5.0...0.6.0
[0.5.0]: https://github.com/Plume-org/Plume/compare/0.4.0-alpha-4...0.5.0

Cargo.lock generated

File diff suppressed because it is too large


@ -1,37 +1,33 @@
[package]
authors = ["Plume contributors"]
name = "plume"
version = "0.7.0"
version = "0.7.3-dev"
repository = "https://github.com/Plume-org/Plume"
edition = "2018"
[dependencies]
activitypub = "0.1.3"
askama_escape = "0.1"
atom_syndication = "0.6"
atom_syndication = "0.12.0"
clap = "2.33"
dotenv = "0.15.0"
gettext = { git = "https://github.com/Plume-org/gettext/", rev = "294c54d74c699fbc66502b480a37cc66c1daa7f3" }
gettext-macros = { git = "https://github.com/Plume-org/gettext-macros/", rev = "a7c605f7edd6bfbfbfe7778026bfefd88d82db10" }
gettext-utils = { git = "https://github.com/Plume-org/gettext-macros/", rev = "a7c605f7edd6bfbfbfe7778026bfefd88d82db10" }
guid-create = "0.1"
lettre = "0.9.2"
gettext = "0.4.0"
gettext-macros = "0.6.1"
gettext-utils = "0.1.0"
guid-create = "0.2"
lettre_email = "0.9.2"
num_cpus = "1.10"
rocket = "0.4.6"
rocket_contrib = { version = "0.4.5", features = ["json"] }
rocket_i18n = { git = "https://github.com/Plume-org/rocket_i18n", rev = "e922afa7c366038b3433278c03b1456b346074f2" }
rpassword = "4.0"
scheduled-thread-pool = "0.2.2"
serde = "1.0"
serde_json = "1.0.70"
shrinkwraprs = "0.2.1"
validator = "0.8"
validator_derive = "0.8"
rocket = "0.4.11"
rocket_contrib = { version = "0.4.11", features = ["json"] }
rocket_i18n = "0.4.1"
scheduled-thread-pool = "0.2.6"
serde = "1.0.137"
serde_json = "1.0.81"
shrinkwraprs = "0.3.0"
validator = { version = "0.15", features = ["derive"] }
webfinger = "0.4.1"
tracing = "0.1.22"
tracing-subscriber = "0.2.15"
tracing = "0.1.35"
tracing-subscriber = "0.3.10"
riker = "0.4.2"
activitystreams = "=0.7.0-alpha.20"
[[bin]]
name = "plume"
@ -43,7 +39,7 @@ version = "0.4"
[dependencies.ctrlc]
features = ["termination"]
version = "3.1.2"
version = "3.2.2"
[dependencies.diesel]
features = ["r2d2", "chrono"]
@ -52,7 +48,7 @@ version = "1.4.5"
[dependencies.multipart]
default-features = false
features = ["server"]
version = "0.16"
version = "0.18"
[dependencies.plume-api]
path = "plume-api"
@ -64,20 +60,21 @@ path = "plume-common"
path = "plume-models"
[dependencies.rocket_csrf]
git = "https://github.com/fdb-hiroshima/rocket_csrf"
rev = "29910f2829e7e590a540da3804336577b48c7b31"
git = "https://git.joinplu.me/plume/rocket_csrf"
rev = "0.1.2"
[build-dependencies]
ructe = "0.13.0"
rsass = "0.9"
ructe = "0.15.0"
rsass = "0.26"
[features]
default = ["postgres"]
default = ["postgres", "s3"]
postgres = ["plume-models/postgres", "diesel/postgres"]
sqlite = ["plume-models/sqlite", "diesel/sqlite"]
debug-mailer = []
test = []
search-lindera = ["plume-models/search-lindera"]
s3 = ["plume-models/s3"]
[workspace]
members = ["plume-api", "plume-cli", "plume-models", "plume-common", "plume-front", "plume-macro"]


@ -1,4 +1,4 @@
FROM rust:1-buster as builder
FROM rust:1 as builder
RUN apt-get update && apt-get install -y --no-install-recommends \
ca-certificates \
@ -18,17 +18,15 @@ COPY script/wasm-deps.sh .
RUN chmod a+x ./wasm-deps.sh && sleep 1 && ./wasm-deps.sh
WORKDIR /app
COPY Cargo.toml Cargo.lock rust-toolchain ./
RUN cargo install wasm-pack
COPY . .
RUN cargo install wasm-pack
RUN chmod a+x ./script/plume-front.sh && sleep 1 && ./script/plume-front.sh
RUN cargo install --path ./ --force --no-default-features --features postgres
RUN cargo install --path plume-cli --force --no-default-features --features postgres
RUN cargo clean
FROM debian:buster-slim
FROM debian:stable-slim
RUN apt-get update && apt-get install -y --no-install-recommends \
ca-certificates \


@ -1,10 +1,10 @@
<h1 align="center">
<img src="https://raw.githubusercontent.com/Plume-org/Plume/master/assets/icons/trwnh/feather/plumeFeather64.png" alt="Plume's logo">
<img src="https://raw.githubusercontent.com/Plume-org/Plume/main/assets/icons/trwnh/feather/plumeFeather64.png" alt="Plume's logo">
Plume
</h1>
<p align="center">
<a href="https://github.com/Plume-org/Plume/"><img alt="CircleCI" src="https://img.shields.io/circleci/build/gh/Plume-org/Plume.svg"></a>
<a href="https://codecov.io/gh/Plume-org/Plume"><img src="https://codecov.io/gh/Plume-org/Plume/branch/master/graph/badge.svg" alt="Code coverage"></a>
<a href="https://codecov.io/gh/Plume-org/Plume"><img src="https://codecov.io/gh/Plume-org/Plume/branch/main/graph/badge.svg" alt="Code coverage"></a>
<a title="Crowdin" target="_blank" href="https://crowdin.com/project/plume"><img src="https://d322cqt584bo4o.cloudfront.net/plume/localized.svg"></a>
<a href="https://hub.docker.com/r/plumeorg/plume"><img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/plumeorg/plume.svg"></a>
<a href="https://liberapay.com/Plume"><img alt="Liberapay patrons" src="https://img.shields.io/liberapay/patrons/Plume.svg"></a>
@ -53,3 +53,4 @@ As we want the various spaces related to the project (GitHub, Matrix, Loomio, et
We provide various ways to install Plume: from source, with pre-built binaries, with Docker or with YunoHost.
For detailed explanations, please refer to [the documentation](https://docs.joinplu.me/installation/).


@ -228,7 +228,7 @@ main .article-meta {
fill: currentColor;
}
.action.liked:hover svg.feather {
background: transparentize($red, 0.75)
background: transparentize($red, 0.75);
color: $red;
}
}
@ -252,7 +252,7 @@ main .article-meta {
background: $primary;
}
.action.reshared:hover svg.feather {
background: transparentize($primary, 0.75)
background: transparentize($primary, 0.75);
color: $primary;
}
}
@ -516,4 +516,11 @@ input:checked ~ .cw-container > .cw-text {
main .article-meta > *, main .article-meta .comments, main .article-meta > .banner > * {
margin: 0 5%;
}
.bottom-bar {
align-items: center;
& > div:nth-child(2) {
margin: 0;
}
}
}


@ -135,6 +135,7 @@ form.new-post {
.button + .button {
margin-left: 1em;
margin-inline-start: 1em;
}
.split {


@ -219,17 +219,21 @@ p.error {
margin: 20px;
}
.cover-link {
margin: 0;
&:hover {
opacity: 0.9;
}
}
.cover {
min-height: 10em;
background-position: center;
background-size: cover;
margin: 0px;
&:hover {
opacity: 0.9;
}
}
header {
display: flex;
}
@ -245,6 +249,9 @@ p.error {
position: relative;
a {
display: block;
width: 100%;
height: 100%;
padding-block-start: 0.5em;
transition: color 0.1s ease-in;
color: $text-color;
@ -500,6 +507,7 @@ figure {
margin: auto $horizontal-margin 2em;
overflow: auto;
display: flex;
justify-content: center;
a {
display: inline-block;
@ -569,14 +577,6 @@ figure {
}
}
.bottom-bar {
flex-direction: column;
align-items: center;
& > div {
margin: 0;
}
}
main .article-meta .comments .comment {
header {
flex-direction: column;
@ -611,4 +611,4 @@ code {
.function{
color:inherit;
}
}
}


@ -120,8 +120,14 @@ fn compile_theme(path: &Path, out_dir: &Path) -> std::io::Result<()> {
// compile the .scss/.sass file
let mut out = File::create(out.join("theme.css"))?;
out.write_all(
&rsass::compile_scss_file(path, rsass::OutputStyle::Compressed)
.expect("SCSS compilation error"),
&rsass::compile_scss_path(
path,
rsass::output::Format {
style: rsass::output::Style::Compressed,
..rsass::output::Format::default()
},
)
.expect("SCSS compilation error"),
)?;
Ok(())
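As a side note on the rsass upgrade: the compile_theme hunk above swaps the old compile_scss_file/OutputStyle call for compile_scss_path with an output::Format. A minimal stand-alone sketch of the new call, assuming rsass 0.26 as pinned in Cargo.toml; the path argument is a placeholder:

use std::path::Path;

// Compile an SCSS file to compressed CSS bytes with the rsass 0.26 API.
fn compile_theme_css(path: &Path) -> Result<Vec<u8>, rsass::Error> {
    rsass::compile_scss_path(
        path,
        rsass::output::Format {
            style: rsass::output::Style::Compressed,
            ..rsass::output::Format::default()
        },
    )
}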

flake.lock Normal file

@ -0,0 +1,116 @@
{
"nodes": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1681202837,
"narHash": "sha256-H+Rh19JDwRtpVPAWp64F+rlEtxUWBAQW28eAi3SRSzg=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "cfacdce06f30d2b68473a46042957675eebb3401",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"flake-utils_2": {
"inputs": {
"systems": "systems_2"
},
"locked": {
"lastModified": 1681202837,
"narHash": "sha256-H+Rh19JDwRtpVPAWp64F+rlEtxUWBAQW28eAi3SRSzg=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "cfacdce06f30d2b68473a46042957675eebb3401",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1683408522,
"narHash": "sha256-9kcPh6Uxo17a3kK3XCHhcWiV1Yu1kYj22RHiymUhMkU=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "897876e4c484f1e8f92009fd11b7d988a121a4e7",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs",
"rust-overlay": "rust-overlay"
}
},
"rust-overlay": {
"inputs": {
"flake-utils": "flake-utils_2",
"nixpkgs": [
"nixpkgs"
]
},
"locked": {
"lastModified": 1683857898,
"narHash": "sha256-pyVY4UxM6zUX97g6bk6UyCbZGCWZb2Zykrne8YxacRA=",
"owner": "oxalica",
"repo": "rust-overlay",
"rev": "4e7fba3f37f5e184ada0ef3cf1e4d8ef450f240b",
"type": "github"
},
"original": {
"owner": "oxalica",
"repo": "rust-overlay",
"type": "github"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
},
"systems_2": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

flake.nix Normal file

@ -0,0 +1,60 @@
{
description = "Development shell for Plume including nightly Rust compiler";
inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
inputs.rust-overlay = {
url = "github:oxalica/rust-overlay";
inputs.nixpkgs.follows = "nixpkgs";
};
inputs.flake-utils.url = "github:numtide/flake-utils";
outputs = { self, nixpkgs, flake-utils, rust-overlay, ... }:
flake-utils.lib.eachDefaultSystem (system:
let
overlays = [ (import rust-overlay) ];
pkgs = import nixpkgs { inherit system overlays; };
inputs = with pkgs; [
(rust-bin.nightly.latest.default.override {
targets = [ "wasm32-unknown-unknown" ];
})
wasm-pack
openssl
pkg-config
gettext
postgresql
sqlite
];
in {
packages.default = pkgs.rustPlatform.buildRustPackage {
pname = "plume";
version = "0.7.3-dev";
src = ./.;
cargoLock = {
lockFile = ./Cargo.lock;
outputHashes = {
"pulldown-cmark-0.8.0" = "sha256-lpfoRDuY3zJ3QmUqJ5k9OL0MEdGDpwmpJ+u5BCj2kIA=";
"rocket_csrf-0.1.2" = "sha256-WywZfMiwZqTPfSDcAE7ivTSYSaFX+N9fjnRsLSLb9wE=";
};
};
buildNoDefaultFeatures = true;
buildFeatures = ["postgresql" "s3"];
nativeBuildInputs = inputs;
buildPhase = ''
wasm-pack build --target web --release plume-front
cargo build --no-default-features --features postgresql,s3 --path .
cargo build --no-default-features --features postgresql,s3 --path plume-cli
'';
installPhase = ''
cargo install --no-default-features --features postgresql,s3 --path . --target-dir $out
cargo install --no-default-features --features postgresql,s3 --path plume-cli --target-dir $out
'';
};
devShells.default = pkgs.mkShell {
packages = inputs;
};
});
}


@ -0,0 +1 @@
DROP TABLE email_signups;


@ -0,0 +1,9 @@
CREATE TABLE email_signups (
id SERIAL PRIMARY KEY,
email VARCHAR NOT NULL,
token VARCHAR NOT NULL,
expiration_date TIMESTAMP NOT NULL
);
CREATE INDEX email_signups_token ON email_signups (token);
CREATE UNIQUE INDEX email_signups_token_requests_email ON email_signups (email);


@ -0,0 +1,4 @@
ALTER TABLE email_blocklist ALTER COLUMN notification_text DROP NOT NULL;
ALTER TABLE email_blocklist ALTER COLUMN notify_user DROP NOT NULL;
ALTER TABLE email_blocklist ALTER COLUMN note DROP NOT NULL;
ALTER TABLE email_blocklist ALTER COLUMN email_address DROP NOT NULL;


@ -0,0 +1,4 @@
ALTER TABLE email_blocklist ALTER COLUMN email_address SET NOT NULL;
ALTER TABLE email_blocklist ALTER COLUMN note SET NOT NULL;
ALTER TABLE email_blocklist ALTER COLUMN notify_user SET NOT NULL;
ALTER TABLE email_blocklist ALTER COLUMN notification_text SET NOT NULL;


@ -0,0 +1 @@
DROP TABLE email_signups;


@ -0,0 +1,9 @@
CREATE TABLE email_signups (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
email VARCHAR NOT NULL,
token VARCHAR NOT NULL,
expiration_date TIMESTAMP NOT NULL
);
CREATE INDEX email_signups_token ON email_signups (token);
CREATE UNIQUE INDEX email_signups_token_requests_email ON email_signups (email);


@ -0,0 +1,9 @@
CREATE TABLE email_blocklist2(id INTEGER PRIMARY KEY,
email_address TEXT UNIQUE,
note TEXT,
notify_user BOOLEAN DEFAULT FALSE,
notification_text TEXT);
INSERT INTO email_blocklist2 SELECT * FROM email_blocklist;
DROP TABLE email_blocklist;
ALTER TABLE email_blocklist2 RENAME TO email_blocklist;


@ -0,0 +1,9 @@
CREATE TABLE email_blocklist2(id INTEGER PRIMARY KEY,
email_address TEXT UNIQUE NOT NULL,
note TEXT NOT NULL,
notify_user BOOLEAN DEFAULT FALSE NOT NULL,
notification_text TEXT NOT NULL);
INSERT INTO email_blocklist2 SELECT * FROM email_blocklist;
DROP TABLE email_blocklist;
ALTER TABLE email_blocklist2 RENAME TO email_blocklist;


@ -1,9 +1,9 @@
[package]
name = "plume-api"
version = "0.7.0"
version = "0.7.2"
authors = ["Plume contributors"]
edition = "2018"
[dependencies]
serde = "1.0"
serde = "1.0.137"
serde_derive = "1.0"


@ -1,2 +1,3 @@
pre-release-hook = ["cargo", "fmt"]
pre-release-replacements = []
release = false


@ -1,6 +1,6 @@
[package]
name = "plume-cli"
version = "0.7.0"
version = "0.7.2"
authors = ["Plume contributors"]
edition = "2018"
@ -10,8 +10,8 @@ path = "src/main.rs"
[dependencies]
clap = "2.33"
dotenv = "0.14"
rpassword = "5.0.0"
dotenv = "0.15"
rpassword = "6.0.1"
[dependencies.diesel]
features = ["r2d2", "chrono"]
@ -24,3 +24,4 @@ path = "../plume-models"
postgres = ["plume-models/postgres", "diesel/postgres"]
sqlite = ["plume-models/sqlite", "diesel/sqlite"]
search-lindera = ["plume-models/search-lindera"]
s3 = ["plume-models/s3"]


@ -1,2 +1,3 @@
pre-release-hook = ["cargo", "fmt"]
pre-release-replacements = []
release = false

plume-cli/src/list.rs Normal file

@ -0,0 +1,262 @@
use clap::{App, Arg, ArgMatches, SubCommand};
use plume_models::{blogs::Blog, instance::Instance, lists::*, users::User, Connection};
pub fn command<'a, 'b>() -> App<'a, 'b> {
SubCommand::with_name("lists")
.about("Manage lists")
.subcommand(
SubCommand::with_name("new")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of this list"),
)
.arg(
Arg::with_name("type")
.short("t")
.long("type")
.takes_value(true)
.help(
r#"The type of this list (one of "user", "blog", "word" or "prefix")"#,
),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help("Username of whom this list is for. Empty for an instance list"),
)
.about("Create a new list"),
)
.subcommand(
SubCommand::with_name("delete")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of the list to delete"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help("Username of whom this list was for. Empty for instance list"),
)
.arg(
Arg::with_name("yes")
.short("y")
.long("yes")
.help("Confirm the deletion"),
)
.about("Delete a list"),
)
.subcommand(
SubCommand::with_name("add")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of the list to add an element to"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help("Username of whom this list is for. Empty for instance list"),
)
.arg(
Arg::with_name("value")
.short("v")
.long("value")
.takes_value(true)
.help("The value to add"),
)
.about("Add element to a list"),
)
.subcommand(
SubCommand::with_name("rm")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of the list to remove an element from"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help("Username of whom this list is for. Empty for instance list"),
)
.arg(
Arg::with_name("value")
.short("v")
.long("value")
.takes_value(true)
.help("The value to remove"),
)
.about("Remove element from list"),
)
}
pub fn run<'a>(args: &ArgMatches<'a>, conn: &Connection) {
let conn = conn;
match args.subcommand() {
("new", Some(x)) => new(x, conn),
("delete", Some(x)) => delete(x, conn),
("add", Some(x)) => add(x, conn),
("rm", Some(x)) => rm(x, conn),
("", None) => command().print_help().unwrap(),
_ => println!("Unknown subcommand"),
}
}
fn get_list_identifier(args: &ArgMatches<'_>) -> (String, Option<String>) {
let name = args
.value_of("name")
.map(String::from)
.expect("No name provided for the list");
let user = args.value_of("user").map(String::from);
(name, user)
}
fn get_list_type(args: &ArgMatches<'_>) -> ListType {
let typ = args
.value_of("type")
.map(String::from)
.expect("No name type for the list");
match typ.as_str() {
"user" => ListType::User,
"blog" => ListType::Blog,
"word" => ListType::Word,
"prefix" => ListType::Prefix,
_ => panic!("Invalid list type: {}", typ),
}
}
fn get_value(args: &ArgMatches<'_>) -> String {
args.value_of("value")
.map(String::from)
.expect("No query provided")
}
fn resolve_user(username: &str, conn: &Connection) -> User {
let instance = Instance::get_local_uncached(conn).expect("Failed to load local instance");
User::find_by_name(conn, username, instance.id).expect("User not found")
}
fn new(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_list_identifier(args);
let typ = get_list_type(args);
let user = user.map(|user| resolve_user(&user, conn));
List::new(conn, &name, user.as_ref(), typ).expect("failed to create list");
}
fn delete(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_list_identifier(args);
if !args.is_present("yes") {
panic!("Warning, this operation is destructive. Add --yes to confirm you want to do it.")
}
let user = user.map(|user| resolve_user(&user, conn));
let list =
List::find_for_user_by_name(conn, user.map(|u| u.id), &name).expect("list not found");
list.delete(conn).expect("Failed to update list");
}
fn add(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_list_identifier(args);
let value = get_value(args);
let user = user.map(|user| resolve_user(&user, conn));
let list =
List::find_for_user_by_name(conn, user.map(|u| u.id), &name).expect("list not found");
match list.kind() {
ListType::Blog => {
let blog_id = Blog::find_by_fqn(conn, &value).expect("unknown blog").id;
if !list.contains_blog(conn, blog_id).unwrap() {
list.add_blogs(conn, &[blog_id]).unwrap();
}
}
ListType::User => {
let user_id = User::find_by_fqn(conn, &value).expect("unknown user").id;
if !list.contains_user(conn, user_id).unwrap() {
list.add_users(conn, &[user_id]).unwrap();
}
}
ListType::Word => {
if !list.contains_word(conn, &value).unwrap() {
list.add_words(conn, &[&value]).unwrap();
}
}
ListType::Prefix => {
if !list.contains_prefix(conn, &value).unwrap() {
list.add_prefixes(conn, &[&value]).unwrap();
}
}
}
}
fn rm(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_list_identifier(args);
let value = get_value(args);
let user = user.map(|user| resolve_user(&user, conn));
let list =
List::find_for_user_by_name(conn, user.map(|u| u.id), &name).expect("list not found");
match list.kind() {
ListType::Blog => {
let blog_id = Blog::find_by_fqn(conn, &value).expect("unknown blog").id;
let mut blogs = list.list_blogs(conn).unwrap();
if let Some(index) = blogs.iter().position(|b| b.id == blog_id) {
blogs.swap_remove(index);
let blogs = blogs.iter().map(|b| b.id).collect::<Vec<_>>();
list.set_blogs(conn, &blogs).unwrap();
}
}
ListType::User => {
let user_id = User::find_by_fqn(conn, &value).expect("unknown user").id;
let mut users = list.list_users(conn).unwrap();
if let Some(index) = users.iter().position(|u| u.id == user_id) {
users.swap_remove(index);
let users = users.iter().map(|u| u.id).collect::<Vec<_>>();
list.set_users(conn, &users).unwrap();
}
}
ListType::Word => {
let mut words = list.list_words(conn).unwrap();
if let Some(index) = words.iter().position(|w| *w == value) {
words.swap_remove(index);
let words = words.iter().map(String::as_str).collect::<Vec<_>>();
list.set_words(conn, &words).unwrap();
}
}
ListType::Prefix => {
let mut prefixes = list.list_prefixes(conn).unwrap();
if let Some(index) = prefixes.iter().position(|p| *p == value) {
prefixes.swap_remove(index);
let prefixes = prefixes.iter().map(String::as_str).collect::<Vec<_>>();
list.set_prefixes(conn, &prefixes).unwrap();
}
}
}
}


@ -4,8 +4,10 @@ use plume_models::{instance::Instance, Connection as Conn, CONFIG};
use std::io::{self, prelude::*};
mod instance;
mod list;
mod migration;
mod search;
mod timeline;
mod users;
fn main() {
@ -16,6 +18,8 @@ fn main() {
.subcommand(instance::command())
.subcommand(migration::command())
.subcommand(search::command())
.subcommand(timeline::command())
.subcommand(list::command())
.subcommand(users::command());
let matches = app.clone().get_matches();
@ -37,6 +41,10 @@ fn main() {
("search", Some(args)) => {
search::run(args, &conn.expect("Couldn't connect to the database."))
}
("timeline", Some(args)) => {
timeline::run(args, &conn.expect("Couldn't connect to the database."))
}
("lists", Some(args)) => list::run(args, &conn.expect("Couldn't connect to the database.")),
("users", Some(args)) => {
users::run(args, &conn.expect("Couldn't connect to the database."))
}

plume-cli/src/timeline.rs Normal file

@ -0,0 +1,257 @@
use clap::{App, Arg, ArgMatches, SubCommand};
use plume_models::{instance::Instance, posts::Post, timeline::*, users::*, Connection};
pub fn command<'a, 'b>() -> App<'a, 'b> {
SubCommand::with_name("timeline")
.about("Manage public timeline")
.subcommand(
SubCommand::with_name("new")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of this timeline"),
)
.arg(
Arg::with_name("query")
.short("q")
.long("query")
.takes_value(true)
.help("The query posts in this timelines have to match"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help(
"Username of whom this timeline is for. Empty for an instance timeline",
),
)
.arg(
Arg::with_name("preload-count")
.short("p")
.long("preload-count")
.takes_value(true)
.help("Number of posts to try to preload in this timeline at its creation"),
)
.about("Create a new timeline"),
)
.subcommand(
SubCommand::with_name("delete")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of the timeline to delete"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help(
"Username of whom this timeline was for. Empty for instance timeline",
),
)
.arg(
Arg::with_name("yes")
.short("y")
.long("yes")
.help("Confirm the deletion"),
)
.about("Delete a timeline"),
)
.subcommand(
SubCommand::with_name("edit")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of the timeline to edit"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help("Username of whom this timeline is for. Empty for instance timeline"),
)
.arg(
Arg::with_name("query")
.short("q")
.long("query")
.takes_value(true)
.help("The query posts in this timelines have to match"),
)
.about("Edit the query of a timeline"),
)
.subcommand(
SubCommand::with_name("repopulate")
.arg(
Arg::with_name("name")
.short("n")
.long("name")
.takes_value(true)
.help("The name of the timeline to repopulate"),
)
.arg(
Arg::with_name("user")
.short("u")
.long("user")
.takes_value(true)
.help(
"Username of whom this timeline was for. Empty for instance timeline",
),
)
.arg(
Arg::with_name("preload-count")
.short("p")
.long("preload-count")
.takes_value(true)
.help("Number of posts to try to preload in this timeline at its creation"),
)
.about("Repopulate a timeline. Run this after modifying a list the timeline depends on."),
)
}
pub fn run<'a>(args: &ArgMatches<'a>, conn: &Connection) {
let conn = conn;
match args.subcommand() {
("new", Some(x)) => new(x, conn),
("edit", Some(x)) => edit(x, conn),
("delete", Some(x)) => delete(x, conn),
("repopulate", Some(x)) => repopulate(x, conn),
("", None) => command().print_help().unwrap(),
_ => println!("Unknown subcommand"),
}
}
fn get_timeline_identifier(args: &ArgMatches<'_>) -> (String, Option<String>) {
let name = args
.value_of("name")
.map(String::from)
.expect("No name provided for the timeline");
let user = args.value_of("user").map(String::from);
(name, user)
}
fn get_query(args: &ArgMatches<'_>) -> String {
let query = args
.value_of("query")
.map(String::from)
.expect("No query provided");
match TimelineQuery::parse(&query) {
Ok(_) => (),
Err(QueryError::SyntaxError(start, end, message)) => panic!(
"Query parsing error between {} and {}: {}",
start, end, message
),
Err(QueryError::UnexpectedEndOfQuery) => {
panic!("Query parsing error: unexpected end of query")
}
Err(QueryError::RuntimeError(message)) => panic!("Query parsing error: {}", message),
}
query
}
fn get_preload_count(args: &ArgMatches<'_>) -> usize {
args.value_of("preload-count")
.map(|arg| arg.parse().expect("invalid preload-count"))
.unwrap_or(plume_models::ITEMS_PER_PAGE as usize)
}
fn resolve_user(username: &str, conn: &Connection) -> User {
let instance = Instance::get_local_uncached(conn).expect("Failed to load local instance");
User::find_by_name(conn, username, instance.id).expect("User not found")
}
fn preload(timeline: Timeline, count: usize, conn: &Connection) {
timeline.remove_all_posts(conn).unwrap();
if count == 0 {
return;
}
let mut posts = Vec::with_capacity(count as usize);
for post in Post::list_filtered(conn, None, None, None)
.unwrap()
.into_iter()
.rev()
{
if timeline.matches(conn, &post, Kind::Original).unwrap() {
posts.push(post);
if posts.len() >= count {
break;
}
}
}
for post in posts.iter().rev() {
timeline.add_post(conn, post).unwrap();
}
}
fn new(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_timeline_identifier(args);
let query = get_query(args);
let preload_count = get_preload_count(args);
let user = user.map(|user| resolve_user(&user, conn));
let timeline = if let Some(user) = user {
Timeline::new_for_user(conn, user.id, name, query)
} else {
Timeline::new_for_instance(conn, name, query)
}
.expect("Failed to create new timeline");
preload(timeline, preload_count, conn);
}
fn edit(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_timeline_identifier(args);
let query = get_query(args);
let user = user.map(|user| resolve_user(&user, conn));
let mut timeline = Timeline::find_for_user_by_name(conn, user.map(|u| u.id), &name)
.expect("timeline not found");
timeline.query = query;
timeline.update(conn).expect("Failed to update timeline");
}
fn delete(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_timeline_identifier(args);
if !args.is_present("yes") {
panic!("Warning, this operation is destructive. Add --yes to confirm you want to do it.")
}
let user = user.map(|user| resolve_user(&user, conn));
let timeline = Timeline::find_for_user_by_name(conn, user.map(|u| u.id), &name)
.expect("timeline not found");
timeline.delete(conn).expect("Failed to update timeline");
}
fn repopulate(args: &ArgMatches<'_>, conn: &Connection) {
let (name, user) = get_timeline_identifier(args);
let preload_count = get_preload_count(args);
let user = user.map(|user| resolve_user(&user, conn));
let timeline = Timeline::find_for_user_by_name(conn, user.map(|u| u.id), &name)
.expect("timeline not found");
preload(timeline, preload_count, conn);
}


@ -1,29 +1,30 @@
[package]
name = "plume-common"
version = "0.7.0"
version = "0.7.2"
authors = ["Plume contributors"]
edition = "2018"
[dependencies]
activitypub = "0.1.1"
activitystreams-derive = "0.1.1"
activitystreams-traits = "0.1.0"
array_tool = "1.0"
base64 = "0.10"
heck = "0.3.0"
hex = "0.3"
hyper = "0.12.33"
openssl = "0.10.22"
rocket = "0.4.6"
reqwest = { version = "0.9", features = ["socks"] }
serde = "1.0"
base64 = "0.13"
hex = "0.4"
openssl = "0.10.40"
rocket = "0.4.11"
reqwest = { version = "0.11.11", features = ["blocking", "json", "socks"] }
serde = "1.0.137"
serde_derive = "1.0"
serde_json = "1.0.70"
serde_json = "1.0.81"
shrinkwraprs = "0.3.0"
syntect = "4.5.0"
tokio = "0.1.22"
regex-syntax = { version = "0.6.17", default-features = false, features = ["unicode-perl"] }
tracing = "0.1.22"
regex-syntax = { version = "0.6.26", default-features = false, features = ["unicode-perl"] }
tracing = "0.1.35"
askama_escape = "0.10.3"
activitystreams = "=0.7.0-alpha.20"
activitystreams-ext = "0.1.0-alpha.2"
url = "2.2.2"
flume = "0.10.13"
tokio = { version = "1.19.2", features = ["full"] }
futures = "0.3.25"
[dependencies.chrono]
features = ["serde"]
@ -35,4 +36,7 @@ git = "https://git.joinplu.me/Plume/pulldown-cmark"
branch = "bidi-plume"
[dev-dependencies]
once_cell = "1.5.2"
assert-json-diff = "2.0.1"
once_cell = "1.12.0"
[features]


@ -1,2 +1,3 @@
pre-release-hook = ["cargo", "fmt"]
pre-release-replacements = []
release = false


@ -10,8 +10,7 @@ use super::{request, sign::Signer};
/// # Example
///
/// ```rust
/// # extern crate activitypub;
/// # use activitypub::{actor::Person, activity::{Announce, Create}, object::Note};
/// # use activitystreams::{prelude::*, base::Base, actor::Person, activity::{Announce, Create}, object::Note, iri_string::types::IriString};
/// # use openssl::{hash::MessageDigest, pkey::PKey, rsa::Rsa};
/// # use once_cell::sync::Lazy;
/// # use plume_common::activity_pub::inbox::*;
@ -113,12 +112,13 @@ use super::{request, sign::Signer};
/// # }
/// # }
/// #
/// # let mut act = Create::default();
/// # act.object_props.set_id_string(String::from("https://test.ap/activity")).unwrap();
/// # let mut person = Person::default();
/// # person.object_props.set_id_string(String::from("https://test.ap/actor")).unwrap();
/// # act.create_props.set_actor_object(person).unwrap();
/// # act.create_props.set_object_object(Note::default()).unwrap();
/// # let mut person = Person::new();
/// # person.set_id("https://test.ap/actor".parse::<IriString>().unwrap());
/// # let mut act = Create::new(
/// # Base::retract(person).unwrap().into_generic().unwrap(),
/// # Base::retract(Note::new()).unwrap().into_generic().unwrap()
/// # );
/// # act.set_id("https://test.ap/activity".parse::<IriString>().unwrap());
/// # let activity_json = serde_json::to_value(act).unwrap();
/// #
/// # let conn = ();
@ -197,29 +197,29 @@ where
}
/// Registers an handler on this Inbox.
pub fn with<A, V, M>(self, proxy: Option<&reqwest::Proxy>) -> Inbox<'a, C, E, R>
pub fn with<A, V, M>(self, proxy: Option<&reqwest::Proxy>) -> Self
where
A: AsActor<&'a C> + FromId<C, Error = E>,
V: activitypub::Activity,
V: activitystreams::markers::Activity + serde::de::DeserializeOwned,
M: AsObject<A, V, &'a C, Error = E> + FromId<C, Error = E>,
M::Output: Into<R>,
{
if let Inbox::NotHandled(ctx, mut act, e) = self {
if let Self::NotHandled(ctx, mut act, e) = self {
if serde_json::from_value::<V>(act.clone()).is_ok() {
let act_clone = act.clone();
let act_id = match act_clone["id"].as_str() {
Some(x) => x,
None => return Inbox::NotHandled(ctx, act, InboxError::InvalidID),
None => return Self::NotHandled(ctx, act, InboxError::InvalidID),
};
// Get the actor ID
let actor_id = match get_id(act["actor"].clone()) {
Some(x) => x,
None => return Inbox::NotHandled(ctx, act, InboxError::InvalidActor(None)),
None => return Self::NotHandled(ctx, act, InboxError::InvalidActor(None)),
};
if Self::is_spoofed_activity(&actor_id, &act) {
return Inbox::NotHandled(ctx, act, InboxError::InvalidObject(None));
return Self::NotHandled(ctx, act, InboxError::InvalidObject(None));
}
// Transform this actor to a model (see FromId for details about the from_id function)
@ -235,14 +235,14 @@ where
if let Some(json) = json {
act["actor"] = json;
}
return Inbox::NotHandled(ctx, act, InboxError::InvalidActor(Some(e)));
return Self::NotHandled(ctx, act, InboxError::InvalidActor(Some(e)));
}
};
// Same logic for "object"
let obj_id = match get_id(act["object"].clone()) {
Some(x) => x,
None => return Inbox::NotHandled(ctx, act, InboxError::InvalidObject(None)),
None => return Self::NotHandled(ctx, act, InboxError::InvalidObject(None)),
};
let obj = match M::from_id(
ctx,
@ -255,19 +255,19 @@ where
if let Some(json) = json {
act["object"] = json;
}
return Inbox::NotHandled(ctx, act, InboxError::InvalidObject(Some(e)));
return Self::NotHandled(ctx, act, InboxError::InvalidObject(Some(e)));
}
};
// Handle the activity
match obj.activity(ctx, actor, act_id) {
Ok(res) => Inbox::Handled(res.into()),
Err(e) => Inbox::Failed(e),
Ok(res) => Self::Handled(res.into()),
Err(e) => Self::Failed(e),
}
} else {
// If the Activity type is not matching the expected one for
// this handler, try with the next one.
Inbox::NotHandled(ctx, act, e)
Self::NotHandled(ctx, act, e)
}
} else {
self
@ -333,7 +333,7 @@ pub trait FromId<C>: Sized {
type Error: From<InboxError<Self::Error>> + Debug;
/// The ActivityPub object type representing Self
type Object: activitypub::Object;
type Object: activitystreams::markers::Object + serde::de::DeserializeOwned;
/// Tries to get an instance of `Self` from an ActivityPub ID.
///
@ -366,7 +366,7 @@ pub trait FromId<C>: Sized {
) -> Result<Self::Object, (Option<serde_json::Value>, Self::Error)> {
request::get(id, Self::get_sender(), proxy)
.map_err(|_| (None, InboxError::DerefError))
.and_then(|mut r| {
.and_then(|r| {
let json: serde_json::Value = r
.json()
.map_err(|_| (None, InboxError::InvalidObject(None)))?;
@ -418,8 +418,7 @@ pub trait AsActor<C> {
/// representing the Note by a Message type, without any specific context.
///
/// ```rust
/// # extern crate activitypub;
/// # use activitypub::{activity::Create, actor::Person, object::Note};
/// # use activitystreams::{prelude::*, activity::Create, actor::Person, object::Note};
/// # use plume_common::activity_pub::inbox::{AsActor, AsObject, FromId};
/// # use plume_common::activity_pub::sign::{gen_keypair, Error as SignError, Result as SignResult, Signer};
/// # use openssl::{hash::MessageDigest, pkey::PKey, rsa::Rsa};
@ -501,7 +500,10 @@ pub trait AsActor<C> {
/// }
///
/// fn from_activity(_: &(), obj: Note) -> Result<Self, Self::Error> {
/// Ok(Message { text: obj.object_props.content_string().map_err(|_| ())? })
/// Ok(Message {
/// text: obj.content()
/// .and_then(|content| content.to_owned().single_xsd_string()).ok_or(())?
/// })
/// }
///
/// fn get_sender() -> &'static dyn Signer {
@ -521,7 +523,7 @@ pub trait AsActor<C> {
/// ```
pub trait AsObject<A, V, C>
where
V: activitypub::Activity,
V: activitystreams::markers::Activity,
{
/// What kind of error is returned when something fails
type Error;
@ -549,11 +551,17 @@ mod tests {
use crate::activity_pub::sign::{
gen_keypair, Error as SignError, Result as SignResult, Signer,
};
use activitypub::{activity::*, actor::Person, object::Note};
use activitystreams::{
activity::{Announce, Create, Delete, Like},
actor::Person,
base::Base,
object::Note,
prelude::*,
};
use once_cell::sync::Lazy;
use openssl::{hash::MessageDigest, pkey::PKey, rsa::Rsa};
static MY_SIGNER: Lazy<MySigner> = Lazy::new(|| MySigner::new());
static MY_SIGNER: Lazy<MySigner> = Lazy::new(MySigner::new);
struct MySigner {
public_key: String,
@ -588,7 +596,7 @@ mod tests {
.unwrap();
let mut verifier = openssl::sign::Verifier::new(MessageDigest::sha256(), &key).unwrap();
verifier.update(data.as_bytes()).unwrap();
verifier.verify(&signature).map_err(|_| SignError())
verifier.verify(signature).map_err(|_| SignError())
}
}
@ -598,11 +606,11 @@ mod tests {
type Object = Person;
fn from_db(_: &(), _id: &str) -> Result<Self, Self::Error> {
Ok(MyActor)
Ok(Self)
}
fn from_activity(_: &(), _obj: Person) -> Result<Self, Self::Error> {
Ok(MyActor)
Ok(Self)
}
fn get_sender() -> &'static dyn Signer {
@ -626,11 +634,11 @@ mod tests {
type Object = Note;
fn from_db(_: &(), _id: &str) -> Result<Self, Self::Error> {
Ok(MyObject)
Ok(Self)
}
fn from_activity(_: &(), _obj: Note) -> Result<Self, Self::Error> {
Ok(MyObject)
Ok(Self)
}
fn get_sender() -> &'static dyn Signer {
@ -678,21 +686,15 @@ mod tests {
}
fn build_create() -> Create {
let mut act = Create::default();
act.object_props
.set_id_string(String::from("https://test.ap/activity"))
.unwrap();
let mut person = Person::default();
person
.object_props
.set_id_string(String::from("https://test.ap/actor"))
.unwrap();
act.create_props.set_actor_object(person).unwrap();
let mut note = Note::default();
note.object_props
.set_id_string(String::from("https://test.ap/note"))
.unwrap();
act.create_props.set_object_object(note).unwrap();
let mut person = Person::new();
person.set_id("https://test.ap/actor".parse().unwrap());
let mut note = Note::new();
note.set_id("https://test.ap/note".parse().unwrap());
let mut act = Create::new(
Base::retract(person).unwrap().into_generic().unwrap(),
Base::retract(note).unwrap().into_generic().unwrap(),
);
act.set_id("https://test.ap/activity".parse().unwrap());
act
}
@ -729,6 +731,16 @@ mod tests {
}
struct FailingActor;
impl AsActor<&()> for FailingActor {
fn get_inbox_url(&self) -> String {
String::from("https://test.ap/failing-actor/inbox")
}
fn is_local(&self) -> bool {
false
}
}
impl FromId<()> for FailingActor {
type Error = ();
type Object = Person;
@ -737,7 +749,7 @@ mod tests {
Err(())
}
fn from_activity(_: &(), _obj: Person) -> Result<Self, Self::Error> {
fn from_activity(_: &(), _obj: Self::Object) -> Result<Self, Self::Error> {
Err(())
}
@ -745,15 +757,6 @@ mod tests {
&*MY_SIGNER
}
}
impl AsActor<&()> for FailingActor {
fn get_inbox_url(&self) -> String {
String::from("https://test.ap/failing-actor/inbox")
}
fn is_local(&self) -> bool {
false
}
}
impl AsObject<FailingActor, Create, &()> for MyObject {
type Error = ();
@ -779,7 +782,7 @@ mod tests {
.done();
assert!(res.is_err());
let res: Result<(), ()> = Inbox::handle(&(), act.clone())
let res: Result<(), ()> = Inbox::handle(&(), act)
.with::<FailingActor, Create, MyObject>(None)
.with::<MyActor, Create, MyObject>(None)
.done();


@ -1,13 +1,27 @@
use activitypub::{Activity, Link, Object};
use activitystreams::{
actor::{ApActor, Group, Person},
base::{AnyBase, Base, Extends},
iri_string::types::IriString,
kind,
markers::{self, Activity},
object::{ApObject, Article, Object},
primitives::{AnyString, OneOrMany},
unparsed::UnparsedMutExt,
};
use activitystreams_ext::{Ext1, Ext2, UnparsedExtension};
use array_tool::vec::Uniq;
use reqwest::{header::HeaderValue, r#async::ClientBuilder, Url};
use futures::future::join_all;
use reqwest::{header::HeaderValue, ClientBuilder, RequestBuilder, Url};
use rocket::{
http::Status,
request::{FromRequest, Request},
response::{Responder, Response},
Outcome,
};
use tokio::prelude::*;
use tokio::{
runtime,
time::{sleep, Duration},
};
use tracing::{debug, warn};
use self::sign::Signable;
@ -24,8 +38,8 @@ pub const AP_CONTENT_TYPE: &str =
pub fn ap_accept_header() -> Vec<&'static str> {
vec![
"application/ld+json; profile=\"https://w3.org/ns/activitystreams\"",
"application/ld+json;profile=\"https://w3.org/ns/activitystreams\"",
"application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\"",
"application/ld+json;profile=\"https://www.w3.org/ns/activitystreams\"",
"application/activity+json",
"application/ld+json",
]
@ -63,7 +77,7 @@ impl<T> ActivityStream<T> {
}
}
impl<'r, O: Object> Responder<'r> for ActivityStream<O> {
impl<'r, O: serde::Serialize> Responder<'r> for ActivityStream<O> {
fn respond_to(self, request: &Request<'_>) -> Result<Response<'r>, Status> {
let mut json = serde_json::to_value(&self.0).map_err(|_| Status::InternalServerError)?;
json["@context"] = context();
@ -87,14 +101,16 @@ impl<'a, 'r> FromRequest<'a, 'r> for ApRequest {
.map(|header| {
header
.split(',')
.map(|ct| match ct.trim() {
.map(|ct| {
match ct.trim() {
// bool for Forward: true if found a valid Content-Type for Plume first (HTML), false otherwise
"application/ld+json; profile=\"https://w3.org/ns/activitystreams\""
| "application/ld+json;profile=\"https://w3.org/ns/activitystreams\""
"application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\""
| "application/ld+json;profile=\"https://www.w3.org/ns/activitystreams\""
| "application/activity+json"
| "application/ld+json" => Outcome::Success(ApRequest),
"text/html" => Outcome::Forward(true),
_ => Outcome::Forward(false),
}
})
.fold(Outcome::Forward(false), |out, ct| {
if out.clone().forwarded().unwrap_or_else(|| out.is_success()) {
@ -108,10 +124,11 @@ impl<'a, 'r> FromRequest<'a, 'r> for ApRequest {
.unwrap_or(Outcome::Forward(()))
}
}
pub fn broadcast<S, A, T, C>(sender: &S, act: A, to: Vec<T>, proxy: Option<reqwest::Proxy>)
where
S: sign::Signer,
A: Activity,
A: Activity + serde::Serialize,
T: inbox::AsActor<C>,
{
let boxes = to
@ -130,59 +147,79 @@ where
.sign(sender)
.expect("activity_pub::broadcast: signature error");
let mut rt = tokio::runtime::current_thread::Runtime::new()
let client = if let Some(proxy) = proxy {
ClientBuilder::new().proxy(proxy)
} else {
ClientBuilder::new()
}
.connect_timeout(std::time::Duration::from_secs(5))
.build()
.expect("Can't build client");
let rt = runtime::Builder::new_current_thread()
.enable_all()
.build()
.expect("Error while initializing tokio runtime for federation");
for inbox in boxes {
let body = signed.to_string();
let mut headers = request::headers();
let url = Url::parse(&inbox);
if url.is_err() {
warn!("Inbox is invalid URL: {:?}", &inbox);
continue;
rt.block_on(async {
// TODO: should be determined dependent on database connections because
// after broadcasting, target instance sends request to this instance,
// and Plume accesses database at that time.
let capacity = 6;
let (tx, rx) = flume::bounded::<RequestBuilder>(capacity);
let mut handles = Vec::with_capacity(capacity);
for _ in 0..capacity {
let rx = rx.clone();
let handle = rt.spawn(async move {
while let Ok(request_builder) = rx.recv_async().await {
// After broadcasting, target instance sends request to this instance.
// Sleep here in order to reduce requests at once
sleep(Duration::from_millis(500)).await;
let _ = request_builder
.send()
.await
.map(move |r| {
if r.status().is_success() {
debug!("Successfully sent activity to inbox ({})", &r.url());
} else {
warn!("Error while sending to inbox ({:?})", &r)
}
debug!("Response: \"{:?}\"\n", r);
})
.map_err(|e| warn!("Error while sending to inbox ({:?})", e));
}
});
handles.push(handle);
}
let url = url.unwrap();
if !url.has_host() {
warn!("Inbox doesn't have host: {:?}", &inbox);
continue;
};
let host_header_value = HeaderValue::from_str(url.host_str().expect("Unreachable"));
if host_header_value.is_err() {
warn!("Header value is invalid: {:?}", url.host_str());
continue;
}
headers.insert("Host", host_header_value.unwrap());
headers.insert("Digest", request::Digest::digest(&body));
rt.spawn(
if let Some(proxy) = proxy.clone() {
ClientBuilder::new().proxy(proxy)
} else {
ClientBuilder::new()
for inbox in boxes {
let body = signed.to_string();
let mut headers = request::headers();
let url = Url::parse(&inbox);
if url.is_err() {
warn!("Inbox is invalid URL: {:?}", &inbox);
continue;
}
.connect_timeout(std::time::Duration::from_secs(5))
.build()
.expect("Can't build client")
.post(&inbox)
.headers(headers.clone())
.header(
let url = url.unwrap();
if !url.has_host() {
warn!("Inbox doesn't have host: {:?}", &inbox);
continue;
};
let host_header_value = HeaderValue::from_str(url.host_str().expect("Unreachable"));
if host_header_value.is_err() {
warn!("Header value is invalid: {:?}", url.host_str());
continue;
}
headers.insert("Host", host_header_value.unwrap());
headers.insert("Digest", request::Digest::digest(&body));
headers.insert(
"Signature",
request::signature(sender, &headers, ("post", url.path(), url.query()))
.expect("activity_pub::broadcast: request signature error"),
)
.body(body)
.send()
.and_then(move |r| {
if r.status().is_success() {
debug!("Successfully sent activity to inbox ({})", &inbox);
} else {
warn!("Error while sending to inbox ({:?})", &r)
}
r.into_body().concat2()
})
.map(move |response| debug!("Response: \"{:?}\"\n", response))
.map_err(|e| warn!("Error while sending to inbox ({:?})", e)),
);
}
rt.run().unwrap();
);
let request_builder = client.post(&inbox).headers(headers.clone()).body(body);
let _ = tx.send_async(request_builder).await;
}
drop(tx);
join_all(handles).await;
});
}
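To read the rewritten broadcast outside the interleaved diff: it builds a current-thread tokio runtime, opens a bounded flume channel, spawns a fixed pool of worker tasks that each sleep 500 ms between deliveries, queues one signed request per inbox, drops the sender, and joins the workers. Below is a self-contained sketch of that fan-out pattern only, with a plain String standing in for the signed reqwest::RequestBuilder; it assumes flume, tokio and futures as dependencies and uses a multi-threaded #[tokio::main] runtime for brevity.

use futures::future::join_all;
use tokio::time::{sleep, Duration};

#[tokio::main]
async fn main() {
    // Cap the number of in-flight deliveries (the code above uses 6).
    let capacity = 6;
    let (tx, rx) = flume::bounded::<String>(capacity);

    // Fixed pool of workers, all pulling from the same bounded channel.
    let mut handles = Vec::with_capacity(capacity);
    for worker in 0..capacity {
        let rx = rx.clone();
        handles.push(tokio::spawn(async move {
            while let Ok(inbox) = rx.recv_async().await {
                // Space deliveries out so the receiving instance is not flooded.
                sleep(Duration::from_millis(500)).await;
                println!("worker {} delivered to {}", worker, inbox);
            }
        }));
    }

    // Producer: queue one job per inbox, then drop the sender so the
    // workers' recv_async loops terminate once the queue is drained.
    for inbox in ["https://a.example/inbox", "https://b.example/inbox"] {
        let _ = tx.send_async(inbox.to_string()).await;
    }
    drop(tx);

    // Wait for every worker to finish.
    join_all(handles).await;
}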
#[derive(Shrinkwrap, Clone, Serialize, Deserialize)]
@ -204,46 +241,193 @@ pub trait IntoId {
fn into_id(self) -> Id;
}
impl Link for Id {}
#[derive(Clone, Debug, Default, Deserialize, Serialize, Properties)]
#[derive(Clone, Debug, Deserialize, Serialize, PartialEq, Eq)]
#[serde(rename_all = "camelCase")]
pub struct ApSignature {
#[activitystreams(concrete(PublicKey), functional)]
pub public_key: Option<serde_json::Value>,
pub public_key: PublicKey,
}
#[derive(Clone, Debug, Default, Deserialize, Serialize, Properties)]
#[derive(Clone, Debug, Deserialize, Serialize, PartialEq, Eq)]
#[serde(rename_all = "camelCase")]
pub struct PublicKey {
#[activitystreams(concrete(String), functional)]
pub id: Option<serde_json::Value>,
#[activitystreams(concrete(String), functional)]
pub owner: Option<serde_json::Value>,
#[activitystreams(concrete(String), functional)]
pub public_key_pem: Option<serde_json::Value>,
pub id: IriString,
pub owner: IriString,
pub public_key_pem: String,
}
#[derive(Clone, Debug, Default, UnitString)]
#[activitystreams(Hashtag)]
pub struct HashtagType;
impl<U> UnparsedExtension<U> for ApSignature
where
U: UnparsedMutExt,
{
type Error = serde_json::Error;
#[derive(Clone, Debug, Default, Deserialize, Serialize, Properties)]
fn try_from_unparsed(unparsed_mut: &mut U) -> Result<Self, Self::Error> {
Ok(ApSignature {
public_key: unparsed_mut.remove("publicKey")?,
})
}
fn try_into_unparsed(self, unparsed_mut: &mut U) -> Result<(), Self::Error> {
unparsed_mut.insert("publicKey", self.public_key)?;
Ok(())
}
}
#[derive(Clone, Debug, Deserialize, Serialize, PartialEq, Eq)]
#[serde(rename_all = "camelCase")]
pub struct Hashtag {
#[serde(rename = "type")]
kind: HashtagType,
#[activitystreams(concrete(String), functional)]
pub href: Option<serde_json::Value>,
#[activitystreams(concrete(String), functional)]
pub name: Option<serde_json::Value>,
pub struct SourceProperty {
pub source: Source,
}
#[derive(Clone, Debug, Default, Deserialize, Serialize)]
impl<U> UnparsedExtension<U> for SourceProperty
where
U: UnparsedMutExt,
{
type Error = serde_json::Error;
fn try_from_unparsed(unparsed_mut: &mut U) -> Result<Self, Self::Error> {
Ok(SourceProperty {
source: unparsed_mut.remove("source")?,
})
}
fn try_into_unparsed(self, unparsed_mut: &mut U) -> Result<(), Self::Error> {
unparsed_mut.insert("source", self.source)?;
Ok(())
}
}
pub type CustomPerson = Ext1<ApActor<Person>, ApSignature>;
pub type CustomGroup = Ext2<ApActor<Group>, ApSignature, SourceProperty>;
kind!(HashtagType, Hashtag);
#[derive(Clone, Debug, serde::Deserialize, serde::Serialize)]
pub struct Hashtag {
#[serde(skip_serializing_if = "Option::is_none")]
pub href: Option<IriString>,
#[serde(skip_serializing_if = "Option::is_none")]
pub name: Option<AnyString>,
#[serde(flatten)]
inner: Object<HashtagType>,
}
impl Hashtag {
pub fn new() -> Self {
Self {
href: None,
name: None,
inner: Object::new(),
}
}
pub fn extending(mut inner: Object<HashtagType>) -> Result<Self, serde_json::Error> {
let href = inner.remove("href")?;
let name = inner.remove("name")?;
Ok(Self { href, name, inner })
}
pub fn retracting(self) -> Result<Object<HashtagType>, serde_json::Error> {
let Self {
href,
name,
mut inner,
} = self;
inner.insert("href", href)?;
inner.insert("name", name)?;
Ok(inner)
}
}
pub trait AsHashtag: markers::Object {
fn hashtag_ref(&self) -> &Hashtag;
fn hashtag_mut(&mut self) -> &mut Hashtag;
}
pub trait HashtagExt: AsHashtag {
fn href(&self) -> Option<&IriString> {
self.hashtag_ref().href.as_ref()
}
fn set_href<T>(&mut self, href: T) -> &mut Self
where
T: Into<IriString>,
{
self.hashtag_mut().href = Some(href.into());
self
}
fn take_href(&mut self) -> Option<IriString> {
self.hashtag_mut().href.take()
}
fn delete_href(&mut self) -> &mut Self {
self.hashtag_mut().href = None;
self
}
fn name(&self) -> Option<&AnyString> {
self.hashtag_ref().name.as_ref()
}
fn set_name<T>(&mut self, name: T) -> &mut Self
where
T: Into<AnyString>,
{
self.hashtag_mut().name = Some(name.into());
self
}
fn take_name(&mut self) -> Option<AnyString> {
self.hashtag_mut().name.take()
}
fn delete_name(&mut self) -> &mut Self {
self.hashtag_mut().name = None;
self
}
}
impl Default for Hashtag {
fn default() -> Self {
Self::new()
}
}
impl AsHashtag for Hashtag {
fn hashtag_ref(&self) -> &Self {
self
}
fn hashtag_mut(&mut self) -> &mut Self {
self
}
}
impl Extends<HashtagType> for Hashtag {
type Error = serde_json::Error;
fn extends(base: Base<HashtagType>) -> Result<Self, Self::Error> {
let inner = Object::extends(base)?;
Self::extending(inner)
}
fn retracts(self) -> Result<Base<HashtagType>, Self::Error> {
let inner = self.retracting()?;
inner.retracts()
}
}
impl markers::Base for Hashtag {}
impl markers::Object for Hashtag {}
impl<T> HashtagExt for T where T: AsHashtag {}
#[derive(Clone, Debug, Default, Deserialize, Serialize, PartialEq, Eq)]
#[serde(rename_all = "camelCase")]
pub struct Source {
pub media_type: String,
@ -251,13 +435,366 @@ pub struct Source {
pub content: String,
}
impl Object for Source {}
impl<U> UnparsedExtension<U> for Source
where
U: UnparsedMutExt,
{
type Error = serde_json::Error;
#[derive(Clone, Debug, Default, Deserialize, Serialize, Properties)]
#[serde(rename_all = "camelCase")]
pub struct Licensed {
#[activitystreams(concrete(String), functional)]
pub license: Option<serde_json::Value>,
fn try_from_unparsed(unparsed_mut: &mut U) -> Result<Self, Self::Error> {
Ok(Source {
content: unparsed_mut.remove("content")?,
media_type: unparsed_mut.remove("mediaType")?,
})
}
fn try_into_unparsed(self, unparsed_mut: &mut U) -> Result<(), Self::Error> {
unparsed_mut.insert("content", self.content)?;
unparsed_mut.insert("mediaType", self.media_type)?;
Ok(())
}
}
impl Object for Licensed {}
#[derive(Clone, Debug, Deserialize, Serialize, PartialEq, Eq)]
#[serde(rename_all = "camelCase")]
pub struct Licensed {
pub license: Option<String>,
}
impl<U> UnparsedExtension<U> for Licensed
where
U: UnparsedMutExt,
{
type Error = serde_json::Error;
fn try_from_unparsed(unparsed_mut: &mut U) -> Result<Self, Self::Error> {
Ok(Licensed {
license: unparsed_mut.remove("license")?,
})
}
fn try_into_unparsed(self, unparsed_mut: &mut U) -> Result<(), Self::Error> {
unparsed_mut.insert("license", self.license)?;
Ok(())
}
}
pub type LicensedArticle = Ext1<ApObject<Article>, Licensed>;
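Similarly, a hedged sketch of building the LicensedArticle alias, written as if it sat next to the definitions above; Ext1's two-argument constructor is the same one the CustomPerson test further down uses, and the license string is only an example value.

use activitystreams::object::{ApObject, Article};

fn licensed_article() -> LicensedArticle {
    // Wrap a plain Article in ApObject, then attach the Licensed extension.
    LicensedArticle::new(
        ApObject::new(Article::new()),
        Licensed {
            license: Some("CC-BY-SA-4.0".to_string()),
        },
    )
}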
pub trait ToAsString {
fn to_as_string(&self) -> Option<String>;
}
impl ToAsString for OneOrMany<&AnyString> {
fn to_as_string(&self) -> Option<String> {
self.as_as_str().map(|s| s.to_string())
}
}
trait AsAsStr {
fn as_as_str(&self) -> Option<&str>;
}
impl AsAsStr for OneOrMany<&AnyString> {
fn as_as_str(&self) -> Option<&str> {
self.iter().next().map(|prop| prop.as_str())
}
}
pub trait ToAsUri {
fn to_as_uri(&self) -> Option<String>;
}
impl ToAsUri for OneOrMany<AnyBase> {
fn to_as_uri(&self) -> Option<String> {
self.iter()
.next()
.and_then(|prop| prop.as_xsd_any_uri().map(|uri| uri.to_string()))
}
}
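// Sketch of the intended call pattern for ToAsString and ToAsUri above. It
// mirrors the usage in plume-models; `note` is a hypothetical value, and the
// activitystreams prelude (ObjectExt) is assumed to be in scope as in the tests below.
fn note_strings(note: &activitystreams::object::Note) -> (Option<String>, Option<String>) {
    let name = note.name().and_then(|name| name.to_as_string());
    let author = note
        .attributed_to()
        .and_then(|attributed| attributed.to_as_uri());
    (name, author)
}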
#[cfg(test)]
mod tests {
use super::*;
use activitystreams::{
activity::{ActorAndObjectRef, Create},
object::{kind::ArticleType, Image},
prelude::{ApActorExt, BaseExt, ExtendsExt, ObjectExt},
};
use assert_json_diff::assert_json_eq;
use serde_json::{from_str, json, to_value};
#[test]
fn se_ap_signature() {
let ap_signature = ApSignature {
public_key: PublicKey {
id: "https://example.com/pubkey".parse().unwrap(),
owner: "https://example.com/owner".parse().unwrap(),
public_key_pem: "pubKeyPem".into(),
},
};
let expected = json!({
"publicKey": {
"id": "https://example.com/pubkey",
"owner": "https://example.com/owner",
"publicKeyPem": "pubKeyPem"
}
});
assert_json_eq!(to_value(ap_signature).unwrap(), expected);
}
#[test]
fn de_ap_signature() {
let value: ApSignature = from_str(
r#"
{
"publicKey": {
"id": "https://example.com/",
"owner": "https://example.com/",
"publicKeyPem": ""
}
}
"#,
)
.unwrap();
let expected = ApSignature {
public_key: PublicKey {
id: "https://example.com/".parse().unwrap(),
owner: "https://example.com/".parse().unwrap(),
public_key_pem: "".into(),
},
};
assert_eq!(value, expected);
}
#[test]
fn se_custom_person() {
let actor = ApActor::new("https://example.com/inbox".parse().unwrap(), Person::new());
let person = CustomPerson::new(
actor,
ApSignature {
public_key: PublicKey {
id: "https://example.com/pubkey".parse().unwrap(),
owner: "https://example.com/owner".parse().unwrap(),
public_key_pem: "pubKeyPem".into(),
},
},
);
let expected = json!({
"inbox": "https://example.com/inbox",
"type": "Person",
"publicKey": {
"id": "https://example.com/pubkey",
"owner": "https://example.com/owner",
"publicKeyPem": "pubKeyPem"
}
});
assert_eq!(to_value(person).unwrap(), expected);
}
#[test]
fn se_custom_group() {
let group = CustomGroup::new(
ApActor::new("https://example.com/inbox".parse().unwrap(), Group::new()),
ApSignature {
public_key: PublicKey {
id: "https://example.com/pubkey".parse().unwrap(),
owner: "https://example.com/owner".parse().unwrap(),
public_key_pem: "pubKeyPem".into(),
},
},
SourceProperty {
source: Source {
content: String::from("This is a *custom* group."),
media_type: String::from("text/markdown"),
},
},
);
let expected = json!({
"inbox": "https://example.com/inbox",
"type": "Group",
"publicKey": {
"id": "https://example.com/pubkey",
"owner": "https://example.com/owner",
"publicKeyPem": "pubKeyPem"
},
"source": {
"content": "This is a *custom* group.",
"mediaType": "text/markdown"
}
});
assert_eq!(to_value(group).unwrap(), expected);
}
#[test]
fn de_custom_group() {
let value: CustomGroup = from_str(
r#"
{
"icon": {
"type": "Image"
},
"id": "https://plume01.localhost/~/Plume01%20Blog%202/",
"image": {
"type": "Image"
},
"inbox": "https://plume01.localhost/~/Plume01%20Blog%202/inbox",
"name": "Plume01 Blog 2",
"outbox": "https://plume01.localhost/~/Plume01%20Blog%202/outbox",
"preferredUsername": "Plume01 Blog 2",
"publicKey": {
"id": "https://plume01.localhost/~/Plume01%20Blog%202/#main-key",
"owner": "https://plume01.localhost/~/Plume01%20Blog%202/",
"publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwPGtKkl/iMsNAyeVaJGz\noEz5PoNkjRnKK7G97MFvb4zw9zs5SpzWW7b/pKHa4dODcGDJXmkCJ1H5JWyguzN8\n2GNoFjtEOJHxEGwBHSYDsTmhuLNB0DKxMU2iu55g8iIiXhZiIW1FBNGs/Geaymvr\nh/TEtzdReN8wzloRR55kOVcU49xBkqx8cfDSk/lrrDLlpveHdqgaFnIvuw2vycK0\nxFzS3xlEUpzJk9kHxoR1uEAfZ+gCv26Sgo/HqOAhqSD5IU3QZC3kdkr/hwVqtr8U\nXGkGG6Mo1rgzhkYiCFkWrV2WoKkcEHD4nEzbgoZZ5MyuSoloxnyF3NiScqmqW+Yx\nkQIDAQAB\n-----END PUBLIC KEY-----\n"
},
"source": {
"content": "",
"mediaType": "text/markdown"
},
"summary": "",
"type": "Group"
}
"#
).unwrap();
let mut expected = CustomGroup::new(
ApActor::new("https://plume01.localhost/~/Plume01%20Blog%202/inbox".parse().unwrap(), Group::new()),
ApSignature {
public_key: PublicKey {
id: "https://plume01.localhost/~/Plume01%20Blog%202/#main-key".parse().unwrap(),
owner: "https://plume01.localhost/~/Plume01%20Blog%202/".parse().unwrap(),
public_key_pem: "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwPGtKkl/iMsNAyeVaJGz\noEz5PoNkjRnKK7G97MFvb4zw9zs5SpzWW7b/pKHa4dODcGDJXmkCJ1H5JWyguzN8\n2GNoFjtEOJHxEGwBHSYDsTmhuLNB0DKxMU2iu55g8iIiXhZiIW1FBNGs/Geaymvr\nh/TEtzdReN8wzloRR55kOVcU49xBkqx8cfDSk/lrrDLlpveHdqgaFnIvuw2vycK0\nxFzS3xlEUpzJk9kHxoR1uEAfZ+gCv26Sgo/HqOAhqSD5IU3QZC3kdkr/hwVqtr8U\nXGkGG6Mo1rgzhkYiCFkWrV2WoKkcEHD4nEzbgoZZ5MyuSoloxnyF3NiScqmqW+Yx\nkQIDAQAB\n-----END PUBLIC KEY-----\n".into(),
}
},
SourceProperty {
source: Source {
content: String::from(""),
media_type: String::from("text/markdown")
}
}
);
expected.set_icon(Image::new().into_any_base().unwrap());
expected.set_id(
"https://plume01.localhost/~/Plume01%20Blog%202/"
.parse()
.unwrap(),
);
expected.set_image(Image::new().into_any_base().unwrap());
expected.set_name("Plume01 Blog 2");
expected.set_outbox(
"https://plume01.localhost/~/Plume01%20Blog%202/outbox"
.parse()
.unwrap(),
);
expected.set_preferred_username("Plume01 Blog 2");
expected.set_summary("");
assert_json_eq!(value, expected);
}
#[test]
fn se_licensed_article() {
let object = ApObject::new(Article::new());
let licensed_article = LicensedArticle::new(
object,
Licensed {
license: Some("CC-0".into()),
},
);
let expected = json!({
"type": "Article",
"license": "CC-0",
});
assert_json_eq!(to_value(licensed_article).unwrap(), expected);
}
#[test]
fn de_licensed_article() {
let value: LicensedArticle = from_str(
r#"
{
"type": "Article",
"id": "https://plu.me/~/Blog/my-article",
"attributedTo": ["https://plu.me/@/Admin", "https://plu.me/~/Blog"],
"content": "Hello.",
"name": "My Article",
"summary": "Bye.",
"source": {
"content": "Hello.",
"mediaType": "text/markdown"
},
"published": "2014-12-12T12:12:12Z",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"license": "CC-0"
}
"#,
)
.unwrap();
let expected = json!({
"type": "Article",
"id": "https://plu.me/~/Blog/my-article",
"attributedTo": ["https://plu.me/@/Admin", "https://plu.me/~/Blog"],
"content": "Hello.",
"name": "My Article",
"summary": "Bye.",
"source": {
"content": "Hello.",
"mediaType": "text/markdown"
},
"published": "2014-12-12T12:12:12Z",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"license": "CC-0"
});
assert_eq!(to_value(value).unwrap(), expected);
}
#[test]
fn de_create_with_licensed_article() {
let create: Create = from_str(
r#"
{
"id": "https://plu.me/~/Blog/my-article",
"type": "Create",
"actor": "https://plu.me/@/Admin",
"to": "https://www.w3.org/ns/activitystreams#Public",
"object": {
"type": "Article",
"id": "https://plu.me/~/Blog/my-article",
"attributedTo": ["https://plu.me/@/Admin", "https://plu.me/~/Blog"],
"content": "Hello.",
"name": "My Article",
"summary": "Bye.",
"source": {
"content": "Hello.",
"mediaType": "text/markdown"
},
"published": "2014-12-12T12:12:12Z",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"license": "CC-0"
}
}
"#,
)
.unwrap();
let base = create.object_field_ref().as_single_base().unwrap();
let any_base = AnyBase::from_base(base.clone());
let value = any_base.extend::<LicensedArticle, ArticleType>().unwrap();
let expected = json!({
"type": "Article",
"id": "https://plu.me/~/Blog/my-article",
"attributedTo": ["https://plu.me/@/Admin", "https://plu.me/~/Blog"],
"content": "Hello.",
"name": "My Article",
"summary": "Bye.",
"source": {
"content": "Hello.",
"mediaType": "text/markdown"
},
"published": "2014-12-12T12:12:12Z",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"license": "CC-0"
});
assert_eq!(to_value(value).unwrap(), expected);
}
}


@ -1,10 +1,11 @@
use chrono::{offset::Utc, DateTime};
use openssl::hash::{Hasher, MessageDigest};
use reqwest::{
blocking::{ClientBuilder, Response},
header::{
HeaderMap, HeaderValue, InvalidHeaderValue, ACCEPT, CONTENT_TYPE, DATE, HOST, USER_AGENT,
},
ClientBuilder, Proxy, Response, Url, UrlError,
Proxy, Url,
};
use std::ops::Deref;
use std::time::SystemTime;
@ -18,8 +19,8 @@ const PLUME_USER_AGENT: &str = concat!("Plume/", env!("CARGO_PKG_VERSION"));
#[derive(Debug)]
pub struct Error();
impl From<UrlError> for Error {
fn from(_err: UrlError) -> Self {
impl From<url::ParseError> for Error {
fn from(_err: url::ParseError) -> Self {
Error()
}
}
@ -252,7 +253,7 @@ mod tests {
.unwrap();
let mut verifier = openssl::sign::Verifier::new(MessageDigest::sha256(), &key).unwrap();
verifier.update(data.as_bytes()).unwrap();
verifier.verify(&signature).map_err(|_| Error())
verifier.verify(signature).map_err(|_| Error())
}
}
@ -261,7 +262,7 @@ mod tests {
let signer = MySigner::new();
let headers = HeaderMap::new();
let result = signature(&signer, &headers, ("post", "/inbox", None)).unwrap();
let fields: Vec<&str> = result.to_str().unwrap().split(",").collect();
let fields: Vec<&str> = result.to_str().unwrap().split(',').collect();
assert_eq!(r#"headers="(request-target)""#, fields[2]);
let sign = &fields[3][11..(fields[3].len() - 1)];
assert!(signer.verify("post /inbox", sign.as_bytes()).is_ok());


@ -119,7 +119,7 @@ impl Signable for serde_json::Value {
}
}
#[derive(Debug, Copy, Clone, PartialEq)]
#[derive(Debug, Copy, Clone, PartialEq, Eq)]
pub enum SignatureValidity {
Invalid,
ValidNoDigest,

plume-common/src/lib.rs Executable file → Normal file

@ -1,7 +1,5 @@
#![feature(associated_type_defaults)]
#[macro_use]
extern crate activitystreams_derive;
#[macro_use]
extern crate shrinkwraprs;
#[macro_use]


@ -1,11 +1,7 @@
use heck::CamelCase;
use openssl::rand::rand_bytes;
use pulldown_cmark::{html, CodeBlockKind, CowStr, Event, LinkType, Options, Parser, Tag};
use regex_syntax::is_word_character;
use rocket::{
http::uri::Uri,
response::{Flash, Redirect},
};
use rocket::http::uri::Uri;
use std::collections::HashSet;
use syntect::html::{ClassStyle, ClassedHTMLGenerator};
use syntect::parsing::SyntaxSet;
@ -19,14 +15,6 @@ pub fn random_hex() -> String {
.fold(String::new(), |res, byte| format!("{}{:x}", res, byte))
}
/// Remove non alphanumeric characters and CamelCase a string
pub fn make_actor_id(name: &str) -> String {
name.to_camel_case()
.chars()
.filter(|c| c.is_alphanumeric())
.collect()
}
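// Hedged illustration of the removed helper above, inferred from its doc
// comment (the exact output depends on heck's CamelCase rules):
//   make_actor_id("My blog!") -> "MyBlog"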
/**
* Percent-encode characters which are not allowed in IRI path segments.
*
@ -80,19 +68,6 @@ pub fn iri_percent_encode_seg_char(c: char) -> String {
}
}
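// Hedged illustration of the percent-encoding helper documented above, based on
// how it is used for preferred usernames elsewhere in this changeset (space is
// not allowed in an IRI path segment):
//   iri_percent_encode_seg("Plume01 Blog 2") -> "Plume01%20Blog%202"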
/**
* Redirects to the login page with a given message.
*
* Note that the message should be translated before passed to this function.
*/
pub fn requires_login<T: Into<Uri<'static>>>(message: &str, url: T) -> Flash<Redirect> {
Flash::new(
Redirect::to(format!("/login?m={}", Uri::percent_encode(message))),
"callback",
url.into().to_string(),
)
}
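// Rough sketch of the removed helper above (the message is hypothetical):
//   requires_login("You need to be logged in", uri)
// redirects to /login?m=You%20need%20to%20be%20logged%20in and stores the
// original uri under the "callback" flash key.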
#[derive(Debug)]
enum State {
Mention,
@ -287,7 +262,7 @@ pub fn md_to_html<'a>(
media_processor: Option<MediaProcessor<'a>>,
) -> (String, HashSet<String>, HashSet<String>) {
let base_url = if let Some(base_url) = base_url {
format!("//{}/", base_url)
format!("https://{}/", base_url)
} else {
"/".to_owned()
};
@ -466,6 +441,10 @@ pub fn md_to_html<'a>(
(buf, mentions.collect(), hashtags.collect())
}
pub fn escape(string: &str) -> askama_escape::Escaped<askama_escape::Html> {
askama_escape::escape(string, askama_escape::Html)
}
#[cfg(test)]
mod tests {
use super::*;


@ -1,26 +1,29 @@
[package]
name = "plume-front"
version = "0.7.0"
version = "0.7.2"
authors = ["Plume contributors"]
edition = "2018"
[package.metadata.wasm-pack.profile.release]
wasm-opt = false
[lib]
crate-type = ["cdylib"]
[dependencies]
gettext = { git = "https://github.com/Plume-org/gettext/", rev = "294c54d74c699fbc66502b480a37cc66c1daa7f3" }
gettext-macros = { git = "https://github.com/Plume-org/gettext-macros/", rev = "a7c605f7edd6bfbfbfe7778026bfefd88d82db10" }
gettext-utils = { git = "https://github.com/Plume-org/gettext-macros/", rev = "a7c605f7edd6bfbfbfe7778026bfefd88d82db10" }
gettext = "0.4.0"
gettext-macros = "0.6.1"
gettext-utils = "0.1.0"
lazy_static = "1.3"
serde = "1.0"
serde = "1.0.137"
serde_json = "1.0"
wasm-bindgen = "0.2.70"
js-sys = "0.3.47"
wasm-bindgen = "0.2.81"
js-sys = "0.3.58"
serde_derive = "1.0.123"
console_error_panic_hook = "0.1.6"
[dependencies.web-sys]
version = "0.3.47"
version = "0.3.58"
features = [
'console',
'ClipboardEvent',


@ -1,2 +1,3 @@
pre-release-hook = ["cargo", "fmt"]
pre-release-replacements = []
release = false


@ -397,7 +397,9 @@ fn init_editor() -> Result<(), EditorError> {
content_val.clone(),
false,
)?;
content.set_inner_html(&content_val);
if !content_val.is_empty() {
content.set_inner_html(&content_val);
}
// character counter
let character_counter = Closure::wrap(Box::new(mv!(content => move |_| {


@ -23,6 +23,7 @@ init_i18n!(
en,
eo,
es,
eu,
fa,
fi,
fr,


@ -1,6 +1,6 @@
[package]
name = "plume-macro"
version = "0.7.0"
version = "0.7.2"
authors = ["Trinity Pointard <trinity.pointard@insa-rennes.fr>"]
edition = "2018"
description = "Plume procedural macros"


@ -1,2 +1,3 @@
pre-release-hook = ["cargo", "fmt"]
pre-release-replacements = []
release = false


@ -1,39 +1,41 @@
[package]
name = "plume-models"
version = "0.7.0"
version = "0.7.2"
authors = ["Plume contributors"]
edition = "2018"
[dependencies]
activitypub = "0.1.1"
ammonia = "2.1.1"
askama_escape = "0.1"
bcrypt = "0.10.1"
guid-create = "0.1"
itertools = "0.8.0"
ammonia = "3.2.0"
bcrypt = "0.12.1"
guid-create = "0.2"
itertools = "0.10.3"
lazy_static = "1.0"
ldap3 = "0.7.1"
ldap3 = "0.11.1"
migrations_internals= "1.4.0"
openssl = "0.10.22"
rocket = "0.4.6"
rocket_i18n = { git = "https://github.com/Plume-org/rocket_i18n", rev = "e922afa7c366038b3433278c03b1456b346074f2" }
reqwest = "0.9"
scheduled-thread-pool = "0.2.2"
serde = "1.0"
openssl = "0.10.40"
rocket = "0.4.11"
rocket_i18n = "0.4.1"
reqwest = "0.11.11"
scheduled-thread-pool = "0.2.6"
serde = "1.0.137"
rust-s3 = { version = "0.33.0", optional = true, features = ["blocking"] }
serde_derive = "1.0"
serde_json = "1.0.70"
serde_json = "1.0.81"
tantivy = "0.13.3"
url = "2.1"
walkdir = "2.2"
webfinger = "0.4.1"
whatlang = "0.11.1"
shrinkwraprs = "0.2.1"
diesel-derive-newtype = "0.1.2"
glob = "0.3.0"
whatlang = "0.16.2"
shrinkwraprs = "0.3.0"
diesel-derive-newtype = "1.0.0"
glob = "0.3.1"
lindera-tantivy = { version = "0.7.1", optional = true }
tracing = "0.1.22"
tracing = "0.1.35"
riker = "0.4.2"
once_cell = "1.5.2"
once_cell = "1.12.0"
lettre = "0.9.6"
native-tls = "0.2.10"
activitystreams = "=0.7.0-alpha.20"
[dependencies.chrono]
features = ["serde"]
@ -53,9 +55,11 @@ path = "../plume-common"
path = "../plume-macro"
[dev-dependencies]
assert-json-diff = "2.0.1"
diesel_migrations = "1.3.0"
[features]
postgres = ["diesel/postgres", "plume-macro/postgres" ]
sqlite = ["diesel/sqlite", "plume-macro/sqlite" ]
search-lindera = ["lindera-tantivy"]
s3 = ["rust-s3"]


@ -1,2 +1,3 @@
pre-release-hook = ["cargo", "fmt"]
pre-release-replacements = []
release = false


@ -103,7 +103,7 @@ impl<'a, 'r> FromRequest<'a, 'r> for ApiToken {
let conn = request
.guard::<DbConn>()
.map_failure(|_| (Status::InternalServerError, TokenError::DbError))?;
if let Ok(token) = ApiToken::find_by_value(&*conn, val) {
if let Ok(token) = ApiToken::find_by_value(&conn, val) {
return Outcome::Success(token);
}
}


@ -126,12 +126,9 @@ pub(crate) mod tests {
.id,
various[1].id
);
assert_eq!(
BlocklistedEmail::matches_blocklist(&conn, no_match)
.unwrap()
.is_none(),
true
);
assert!(BlocklistedEmail::matches_blocklist(&conn, no_match)
.unwrap()
.is_none());
Ok(())
});
}


@ -1,12 +1,14 @@
use crate::{
ap_url, db_conn::DbConn, instance::*, medias::Media, posts::Post, safe_string::SafeString,
schema::blogs, users::User, Connection, Error, PlumeRocket, Result, CONFIG, ITEMS_PER_PAGE,
instance::*, medias::Media, posts::Post, safe_string::SafeString, schema::blogs, users::User,
Connection, Error, PlumeRocket, Result, CONFIG, ITEMS_PER_PAGE,
};
use activitypub::{
actor::Group,
use activitystreams::{
actor::{ApActor, ApActorExt, AsApActor, Group},
base::AnyBase,
collection::{OrderedCollection, OrderedCollectionPage},
object::Image,
CustomObject,
iri_string::types::IriString,
object::{kind::ImageType, ApObject, Image, ObjectExt},
prelude::*,
};
use chrono::NaiveDateTime;
use diesel::{self, ExpressionMethods, OptionalExtension, QueryDsl, RunQueryDsl, SaveChangesDsl};
@ -16,16 +18,17 @@ use openssl::{
rsa::Rsa,
sign::{Signer, Verifier},
};
use plume_common::activity_pub::{
inbox::{AsActor, FromId},
sign, ActivityStream, ApSignature, Id, IntoId, PublicKey, Source,
use plume_common::{
activity_pub::{
inbox::{AsActor, FromId},
sign, ActivityStream, ApSignature, CustomGroup, Id, IntoId, PublicKey, Source,
SourceProperty, ToAsString, ToAsUri,
},
utils::iri_percent_encode_seg,
};
use url::Url;
use webfinger::*;
pub type CustomGroup = CustomObject<ApSignature, Group>;
#[derive(Queryable, Identifiable, Clone, AsChangeset)]
#[derive(Queryable, Identifiable, Clone, AsChangeset, Debug)]
#[changeset_options(treat_none_as_null = "true")]
pub struct Blog {
pub id: i32,
@ -83,9 +86,13 @@ impl Blog {
if inserted.fqn.is_empty() {
if instance.local {
inserted.fqn = inserted.actor_id.clone();
inserted.fqn = iri_percent_encode_seg(&inserted.actor_id);
} else {
inserted.fqn = format!("{}@{}", inserted.actor_id, instance.public_domain);
inserted.fqn = format!(
"{}@{}",
iri_percent_encode_seg(&inserted.actor_id),
instance.public_domain
);
}
}
@ -95,6 +102,10 @@ impl Blog {
find_by!(blogs, find_by_ap_url, ap_url as &str);
find_by!(blogs, find_by_name, actor_id as &str, instance_id as i32);
pub fn slug(title: &str) -> &str {
title
}
pub fn get_instance(&self, conn: &Connection) -> Result<Instance> {
Instance::get(conn, self.instance_id)
}
@ -131,10 +142,10 @@ impl Blog {
.map_err(Error::from)
}
pub fn find_by_fqn(conn: &DbConn, fqn: &str) -> Result<Blog> {
pub fn find_by_fqn(conn: &Connection, fqn: &str) -> Result<Blog> {
let from_db = blogs::table
.filter(blogs::fqn.eq(fqn))
.first(&**conn)
.first(conn)
.optional()?;
if let Some(from_db) = from_db {
Ok(from_db)
@ -143,7 +154,7 @@ impl Blog {
}
}
fn fetch_from_webfinger(conn: &DbConn, acct: &str) -> Result<Blog> {
fn fetch_from_webfinger(conn: &Connection, acct: &str) -> Result<Blog> {
resolve_with_prefix(Prefix::Group, acct.to_owned(), true)?
.links
.into_iter()
@ -161,104 +172,120 @@ impl Blog {
}
pub fn to_activity(&self, conn: &Connection) -> Result<CustomGroup> {
let mut blog = Group::default();
blog.ap_actor_props
.set_preferred_username_string(self.actor_id.clone())?;
blog.object_props.set_name_string(self.title.clone())?;
blog.ap_actor_props
.set_outbox_string(self.outbox_url.clone())?;
blog.ap_actor_props
.set_inbox_string(self.inbox_url.clone())?;
blog.object_props
.set_summary_string(self.summary_html.to_string())?;
blog.ap_object_props.set_source_object(Source {
content: self.summary.clone(),
media_type: String::from("text/markdown"),
})?;
let mut blog = ApActor::new(self.inbox_url.parse()?, Group::new());
blog.set_preferred_username(iri_percent_encode_seg(&self.actor_id));
blog.set_name(self.title.clone());
blog.set_outbox(self.outbox_url.parse()?);
blog.set_summary(self.summary_html.to_string());
let source = SourceProperty {
source: Source {
content: self.summary.clone(),
media_type: String::from("text/markdown"),
},
};
let mut icon = Image::default();
icon.object_props.set_url_string(
self.icon_id
.and_then(|id| Media::get(conn, id).and_then(|m| m.url()).ok())
.unwrap_or_default(),
)?;
icon.object_props.set_attributed_to_link(
self.icon_id
.and_then(|id| {
Media::get(conn, id)
.and_then(|m| Ok(User::get(conn, m.owner_id)?.into_id()))
.ok()
})
.unwrap_or_else(|| Id::new(String::new())),
)?;
blog.object_props.set_icon_object(icon)?;
let mut icon = Image::new();
let _ = self.icon_id.map(|id| {
Media::get(conn, id).and_then(|m| {
let _ = m
.url()
.and_then(|url| url.parse::<IriString>().map_err(|_| Error::Url))
.map(|url| icon.set_url(url));
icon.set_attributed_to(
User::get(conn, m.owner_id)?
.into_id()
.parse::<IriString>()?,
);
Ok(())
})
});
blog.set_icon(icon.into_any_base()?);
let mut banner = Image::default();
banner.object_props.set_url_string(
self.banner_id
.and_then(|id| Media::get(conn, id).and_then(|m| m.url()).ok())
.unwrap_or_default(),
)?;
banner.object_props.set_attributed_to_link(
self.banner_id
.and_then(|id| {
Media::get(conn, id)
.and_then(|m| Ok(User::get(conn, m.owner_id)?.into_id()))
.ok()
})
.unwrap_or_else(|| Id::new(String::new())),
)?;
blog.object_props.set_image_object(banner)?;
let mut banner = Image::new();
let _ = self.banner_id.map(|id| {
Media::get(conn, id).and_then(|m| {
let _ = m
.url()
.and_then(|url| url.parse::<IriString>().map_err(|_| Error::Url))
.map(|url| banner.set_url(url));
banner.set_attributed_to(
User::get(conn, m.owner_id)?
.into_id()
.parse::<IriString>()?,
);
Ok(())
})
});
blog.set_image(banner.into_any_base()?);
blog.object_props.set_id_string(self.ap_url.clone())?;
blog.set_id(self.ap_url.parse()?);
let mut public_key = PublicKey::default();
public_key.set_id_string(format!("{}#main-key", self.ap_url))?;
public_key.set_owner_string(self.ap_url.clone())?;
public_key.set_public_key_pem_string(self.public_key.clone())?;
let mut ap_signature = ApSignature::default();
ap_signature.set_public_key_publickey(public_key)?;
let pub_key = PublicKey {
id: format!("{}#main-key", self.ap_url).parse()?,
owner: self.ap_url.parse()?,
public_key_pem: self.public_key.clone(),
};
let ap_signature = ApSignature {
public_key: pub_key,
};
Ok(CustomGroup::new(blog, ap_signature))
Ok(CustomGroup::new(blog, ap_signature, source))
}
pub fn outbox(&self, conn: &Connection) -> Result<ActivityStream<OrderedCollection>> {
let mut coll = OrderedCollection::default();
coll.collection_props.items = serde_json::to_value(self.get_activities(conn))?;
coll.collection_props
.set_total_items_u64(self.get_activities(conn).len() as u64)?;
coll.collection_props
.set_first_link(Id::new(ap_url(&format!("{}?page=1", &self.outbox_url))))?;
coll.collection_props
.set_last_link(Id::new(ap_url(&format!(
self.outbox_collection(conn).map(ActivityStream::new)
}
pub fn outbox_collection(&self, conn: &Connection) -> Result<OrderedCollection> {
let acts = self.get_activities(conn);
let acts = acts
.iter()
.filter_map(|value| AnyBase::from_arbitrary_json(value).ok())
.collect::<Vec<AnyBase>>();
let n_acts = acts.len();
let mut coll = OrderedCollection::new();
coll.set_many_items(acts);
coll.set_total_items(n_acts as u64);
coll.set_first(format!("{}?page=1", &self.outbox_url).parse::<IriString>()?);
coll.set_last(
format!(
"{}?page={}",
&self.outbox_url,
(self.get_activities(conn).len() as u64 + ITEMS_PER_PAGE as u64 - 1) as u64
/ ITEMS_PER_PAGE as u64
))))?;
Ok(ActivityStream::new(coll))
(n_acts as u64 + ITEMS_PER_PAGE as u64 - 1) as u64 / ITEMS_PER_PAGE as u64
)
.parse::<IriString>()?,
);
Ok(coll)
}
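// Worked example of the "last" page arithmetic above, i.e. a ceiling division
// (sketch only, not part of this changeset; ITEMS_PER_PAGE = 12 is an assumed
// value, used purely for illustration):
//   n_acts = 25 -> (25 + 12 - 1) / 12 = 3, so the last page is ?page=3
//   n_acts = 0  -> (0 + 12 - 1) / 12 = 0, matching the outbox_collection test below
fn pages_needed(n_acts: u64, items_per_page: u64) -> u64 {
    (n_acts + items_per_page - 1) / items_per_page
}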
pub fn outbox_page(
&self,
conn: &Connection,
(min, max): (i32, i32),
) -> Result<ActivityStream<OrderedCollectionPage>> {
let mut coll = OrderedCollectionPage::default();
self.outbox_collection_page(conn, (min, max))
.map(ActivityStream::new)
}
pub fn outbox_collection_page(
&self,
conn: &Connection,
(min, max): (i32, i32),
) -> Result<OrderedCollectionPage> {
let mut coll = OrderedCollectionPage::new();
let acts = self.get_activity_page(conn, (min, max));
// This still doesn't do anything because the outbox
// doesn't do anything yet
coll.collection_page_props.set_next_link(Id::new(&format!(
"{}?page={}",
&self.outbox_url,
min / ITEMS_PER_PAGE + 1
)))?;
coll.collection_page_props.set_prev_link(Id::new(&format!(
"{}?page={}",
&self.outbox_url,
min / ITEMS_PER_PAGE - 1
)))?;
coll.collection_props.items = serde_json::to_value(acts)?;
Ok(ActivityStream::new(coll))
coll.set_next(
format!("{}?page={}", &self.outbox_url, min / ITEMS_PER_PAGE + 1)
.parse::<IriString>()?,
);
coll.set_prev(
format!("{}?page={}", &self.outbox_url, min / ITEMS_PER_PAGE - 1)
.parse::<IriString>()?,
);
coll.set_many_items(
acts.iter()
.filter_map(|value| AnyBase::from_arbitrary_json(value).ok()),
);
Ok(coll)
}
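// Worked example of the next/prev links above, matching the
// outbox_collection_page test further down, which calls this with
// (min, max) = (33, 36); ITEMS_PER_PAGE = 12 is assumed here purely for
// illustration, and min / ITEMS_PER_PAGE is integer division:
//   next: 33 / 12 + 1 = 3 -> ".../outbox?page=3"
//   prev: 33 / 12 - 1 = 1 -> ".../outbox?page=1"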
fn get_activities(&self, _conn: &Connection) -> Vec<serde_json::Value> {
vec![]
@ -345,18 +372,100 @@ impl IntoId for Blog {
}
}
impl FromId<DbConn> for Blog {
impl FromId<Connection> for Blog {
type Error = Error;
type Object = CustomGroup;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Self::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, acct: CustomGroup) -> Result<Self> {
let url = Url::parse(&acct.object.object_props.id_string()?)?;
let inst = url.host_str().ok_or(Error::Url)?;
let instance = Instance::find_by_domain(conn, inst).or_else(|_| {
fn from_activity(conn: &Connection, acct: CustomGroup) -> Result<Self> {
let (name, outbox_url, inbox_url) = {
let actor = acct.ap_actor_ref();
let name = actor
.preferred_username()
.ok_or(Error::MissingApProperty)?
.to_string();
if name.contains(&['<', '>', '&', '@', '\'', '"', ' ', '\t'][..]) {
tracing::error!("preferredUsername includes invalid character(s): {}", &name);
return Err(Error::InvalidValue);
}
(
name,
actor.outbox()?.ok_or(Error::MissingApProperty)?.to_string(),
actor.inbox()?.to_string(),
)
};
let mut new_blog = NewBlog {
actor_id: name.to_string(),
outbox_url,
inbox_url,
public_key: acct.ext_one.public_key.public_key_pem.to_string(),
private_key: None,
theme: None,
..NewBlog::default()
};
let object = ApObject::new(acct.inner);
new_blog.title = object
.name()
.and_then(|name| name.to_as_string())
.unwrap_or(name);
new_blog.summary_html = SafeString::new(
&object
.summary()
.and_then(|summary| summary.to_as_string())
.unwrap_or_default(),
);
let icon_id = object
.icon()
.and_then(|icons| {
icons.iter().next().and_then(|icon| {
let icon = icon.to_owned().extend::<Image, ImageType>().ok()??;
let owner = icon.attributed_to()?.to_as_uri()?;
Media::save_remote(
conn,
icon.url()?.to_as_uri()?,
&User::from_id(conn, &owner, None, CONFIG.proxy()).ok()?,
)
.ok()
})
})
.map(|m| m.id);
new_blog.icon_id = icon_id;
let banner_id = object
.image()
.and_then(|banners| {
banners.iter().next().and_then(|banner| {
let banner = banner.to_owned().extend::<Image, ImageType>().ok()??;
let owner = banner.attributed_to()?.to_as_uri()?;
Media::save_remote(
conn,
banner.url()?.to_as_uri()?,
&User::from_id(conn, &owner, None, CONFIG.proxy()).ok()?,
)
.ok()
})
})
.map(|m| m.id);
new_blog.banner_id = banner_id;
new_blog.summary = acct.ext_two.source.content;
let any_base = AnyBase::from_extended(object)?;
let id = any_base.id().ok_or(Error::MissingApProperty)?;
new_blog.ap_url = id.to_string();
let inst = id
.authority_components()
.ok_or(Error::Url)?
.host()
.to_string();
let instance = Instance::find_by_domain(conn, &inst).or_else(|_| {
Instance::insert(
conn,
NewInstance {
@ -373,75 +482,9 @@ impl FromId<DbConn> for Blog {
},
)
})?;
let icon_id = acct
.object
.object_props
.icon_image()
.ok()
.and_then(|icon| {
let owner = icon.object_props.attributed_to_link::<Id>().ok()?;
Media::save_remote(
conn,
icon.object_props.url_string().ok()?,
&User::from_id(conn, &owner, None, CONFIG.proxy()).ok()?,
)
.ok()
})
.map(|m| m.id);
new_blog.instance_id = instance.id;
let banner_id = acct
.object
.object_props
.image_image()
.ok()
.and_then(|banner| {
let owner = banner.object_props.attributed_to_link::<Id>().ok()?;
Media::save_remote(
conn,
banner.object_props.url_string().ok()?,
&User::from_id(conn, &owner, None, CONFIG.proxy()).ok()?,
)
.ok()
})
.map(|m| m.id);
let name = acct.object.ap_actor_props.preferred_username_string()?;
if name.contains(&['<', '>', '&', '@', '\'', '"', ' ', '\t'][..]) {
return Err(Error::InvalidValue);
}
Blog::insert(
conn,
NewBlog {
actor_id: name.clone(),
title: acct.object.object_props.name_string().unwrap_or(name),
outbox_url: acct.object.ap_actor_props.outbox_string()?,
inbox_url: acct.object.ap_actor_props.inbox_string()?,
summary: acct
.object
.ap_object_props
.source_object::<Source>()
.map(|s| s.content)
.unwrap_or_default(),
instance_id: instance.id,
ap_url: acct.object.object_props.id_string()?,
public_key: acct
.custom_props
.public_key_publickey()?
.public_key_pem_string()?,
private_key: None,
banner_id,
icon_id,
summary_html: SafeString::new(
&acct
.object
.object_props
.summary_string()
.unwrap_or_default(),
),
theme: None,
},
)
Blog::insert(conn, new_blog)
}
fn get_sender() -> &'static dyn sign::Signer {
@ -512,12 +555,14 @@ pub(crate) mod tests {
blog_authors::*, instance::tests as instance_tests, medias::NewMedia, tests::db,
users::tests as usersTests, Connection as Conn,
};
use assert_json_diff::assert_json_eq;
use diesel::Connection;
use serde_json::to_value;
pub(crate) fn fill_database(conn: &Conn) -> (Vec<User>, Vec<Blog>) {
instance_tests::fill_database(conn);
let users = usersTests::fill_database(conn);
let blog1 = Blog::insert(
let mut blog1 = Blog::insert(
conn,
NewBlog::new_local(
"BlogName".to_owned(),
@ -590,6 +635,41 @@ pub(crate) mod tests {
},
)
.unwrap();
blog1.icon_id = Some(
Media::insert(
conn,
NewMedia {
file_path: "aaa.png".into(),
alt_text: String::new(),
is_remote: false,
remote_url: None,
sensitive: false,
content_warning: None,
owner_id: users[0].id,
},
)
.unwrap()
.id,
);
blog1.banner_id = Some(
Media::insert(
conn,
NewMedia {
file_path: "bbb.png".into(),
alt_text: String::new(),
is_remote: false,
remote_url: None,
sensitive: false,
content_warning: None,
owner_id: users[0].id,
},
)
.unwrap()
.id,
);
let _: Blog = blog1.save_changes(conn).unwrap();
(users, vec![blog1, blog2, blog3])
}
@ -597,10 +677,10 @@ pub(crate) mod tests {
fn get_instance() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
fill_database(&conn);
fill_database(conn);
let blog = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"SomeName".to_owned(),
"Some name".to_owned(),
@ -612,7 +692,7 @@ pub(crate) mod tests {
.unwrap();
assert_eq!(
blog.get_instance(&conn).unwrap().id,
blog.get_instance(conn).unwrap().id,
Instance::get_local().unwrap().id
);
// TODO add tests for remote instance
@ -624,10 +704,10 @@ pub(crate) mod tests {
fn authors() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (user, _) = fill_database(&conn);
let (user, _) = fill_database(conn);
let b1 = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"SomeName".to_owned(),
"Some name".to_owned(),
@ -638,7 +718,7 @@ pub(crate) mod tests {
)
.unwrap();
let b2 = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"Blog".to_owned(),
"Blog".to_owned(),
@ -651,7 +731,7 @@ pub(crate) mod tests {
let blog = vec![b1, b2];
BlogAuthor::insert(
&conn,
conn,
NewBlogAuthor {
blog_id: blog[0].id,
author_id: user[0].id,
@ -661,7 +741,7 @@ pub(crate) mod tests {
.unwrap();
BlogAuthor::insert(
&conn,
conn,
NewBlogAuthor {
blog_id: blog[0].id,
author_id: user[1].id,
@ -671,7 +751,7 @@ pub(crate) mod tests {
.unwrap();
BlogAuthor::insert(
&conn,
conn,
NewBlogAuthor {
blog_id: blog[1].id,
author_id: user[0].id,
@ -681,39 +761,39 @@ pub(crate) mod tests {
.unwrap();
assert!(blog[0]
.list_authors(&conn)
.list_authors(conn)
.unwrap()
.iter()
.any(|a| a.id == user[0].id));
assert!(blog[0]
.list_authors(&conn)
.list_authors(conn)
.unwrap()
.iter()
.any(|a| a.id == user[1].id));
assert!(blog[1]
.list_authors(&conn)
.list_authors(conn)
.unwrap()
.iter()
.any(|a| a.id == user[0].id));
assert!(!blog[1]
.list_authors(&conn)
.list_authors(conn)
.unwrap()
.iter()
.any(|a| a.id == user[1].id));
assert!(Blog::find_for_author(&conn, &user[0])
assert!(Blog::find_for_author(conn, &user[0])
.unwrap()
.iter()
.any(|b| b.id == blog[0].id));
assert!(Blog::find_for_author(&conn, &user[1])
assert!(Blog::find_for_author(conn, &user[1])
.unwrap()
.iter()
.any(|b| b.id == blog[0].id));
assert!(Blog::find_for_author(&conn, &user[0])
assert!(Blog::find_for_author(conn, &user[0])
.unwrap()
.iter()
.any(|b| b.id == blog[1].id));
assert!(!Blog::find_for_author(&conn, &user[1])
assert!(!Blog::find_for_author(conn, &user[1])
.unwrap()
.iter()
.any(|b| b.id == blog[1].id));
@ -725,10 +805,10 @@ pub(crate) mod tests {
fn find_local() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
fill_database(&conn);
fill_database(conn);
let blog = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"SomeName".to_owned(),
"Some name".to_owned(),
@ -739,7 +819,7 @@ pub(crate) mod tests {
)
.unwrap();
assert_eq!(Blog::find_by_fqn(&conn, "SomeName").unwrap().id, blog.id);
assert_eq!(Blog::find_by_fqn(conn, "SomeName").unwrap().id, blog.id);
Ok(())
})
}
@ -748,10 +828,10 @@ pub(crate) mod tests {
fn get_fqn() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
fill_database(&conn);
fill_database(conn);
let blog = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"SomeName".to_owned(),
"Some name".to_owned(),
@ -771,10 +851,10 @@ pub(crate) mod tests {
fn delete() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (_, blogs) = fill_database(&conn);
let (_, blogs) = fill_database(conn);
blogs[0].delete(&conn).unwrap();
assert!(Blog::get(&conn, blogs[0].id).is_err());
blogs[0].delete(conn).unwrap();
assert!(Blog::get(conn, blogs[0].id).is_err());
Ok(())
})
}
@ -783,10 +863,10 @@ pub(crate) mod tests {
fn delete_via_user() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (user, _) = fill_database(&conn);
let (user, _) = fill_database(conn);
let b1 = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"SomeName".to_owned(),
"Some name".to_owned(),
@ -797,7 +877,7 @@ pub(crate) mod tests {
)
.unwrap();
let b2 = Blog::insert(
&conn,
conn,
NewBlog::new_local(
"Blog".to_owned(),
"Blog".to_owned(),
@ -810,7 +890,7 @@ pub(crate) mod tests {
let blog = vec![b1, b2];
BlogAuthor::insert(
&conn,
conn,
NewBlogAuthor {
blog_id: blog[0].id,
author_id: user[0].id,
@ -820,7 +900,7 @@ pub(crate) mod tests {
.unwrap();
BlogAuthor::insert(
&conn,
conn,
NewBlogAuthor {
blog_id: blog[0].id,
author_id: user[1].id,
@ -830,7 +910,7 @@ pub(crate) mod tests {
.unwrap();
BlogAuthor::insert(
&conn,
conn,
NewBlogAuthor {
blog_id: blog[1].id,
author_id: user[0].id,
@ -839,11 +919,11 @@ pub(crate) mod tests {
)
.unwrap();
user[0].delete(&conn).unwrap();
assert!(Blog::get(&conn, blog[0].id).is_ok());
assert!(Blog::get(&conn, blog[1].id).is_err());
user[1].delete(&conn).unwrap();
assert!(Blog::get(&conn, blog[0].id).is_err());
user[0].delete(conn).unwrap();
assert!(Blog::get(conn, blog[0].id).is_ok());
assert!(Blog::get(conn, blog[1].id).is_err());
user[1].delete(conn).unwrap();
assert!(Blog::get(conn, blog[0].id).is_err());
Ok(())
})
}
@ -852,10 +932,10 @@ pub(crate) mod tests {
fn self_federation() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (users, mut blogs) = fill_database(&conn);
let (users, mut blogs) = fill_database(conn);
blogs[0].icon_id = Some(
Media::insert(
&conn,
conn,
NewMedia {
file_path: "aaa.png".into(),
alt_text: String::new(),
@ -871,7 +951,7 @@ pub(crate) mod tests {
);
blogs[0].banner_id = Some(
Media::insert(
&conn,
conn,
NewMedia {
file_path: "bbb.png".into(),
alt_text: String::new(),
@ -886,10 +966,9 @@ pub(crate) mod tests {
.id,
);
let _: Blog = blogs[0].save_changes(&**conn).unwrap();
let ap_repr = blogs[0].to_activity(&conn).unwrap();
blogs[0].delete(&conn).unwrap();
let blog = Blog::from_activity(&conn, ap_repr).unwrap();
let ap_repr = blogs[0].to_activity(conn).unwrap();
blogs[0].delete(conn).unwrap();
let blog = Blog::from_activity(conn, ap_repr).unwrap();
assert_eq!(blog.actor_id, blogs[0].actor_id);
assert_eq!(blog.title, blogs[0].title);
@ -901,10 +980,96 @@ pub(crate) mod tests {
assert_eq!(blog.public_key, blogs[0].public_key);
assert_eq!(blog.fqn, blogs[0].fqn);
assert_eq!(blog.summary_html, blogs[0].summary_html);
assert_eq!(blog.icon_url(&conn), blogs[0].icon_url(&conn));
assert_eq!(blog.banner_url(&conn), blogs[0].banner_url(&conn));
assert_eq!(blog.icon_url(conn), blogs[0].icon_url(conn));
assert_eq!(blog.banner_url(conn), blogs[0].banner_url(conn));
Ok(())
})
}
#[test]
fn to_activity() {
let conn = &db();
conn.test_transaction::<_, Error, _>(|| {
let (_users, blogs) = fill_database(conn);
let blog = &blogs[0];
let act = blog.to_activity(conn)?;
let expected = json!({
"icon": {
"attributedTo": "https://plu.me/@/admin/",
"type": "Image",
"url": "https://plu.me/aaa.png"
},
"id": "https://plu.me/~/BlogName/",
"image": {
"attributedTo": "https://plu.me/@/admin/",
"type": "Image",
"url": "https://plu.me/bbb.png"
},
"inbox": "https://plu.me/~/BlogName/inbox",
"name": "Blog name",
"outbox": "https://plu.me/~/BlogName/outbox",
"preferredUsername": "BlogName",
"publicKey": {
"id": "https://plu.me/~/BlogName/#main-key",
"owner": "https://plu.me/~/BlogName/",
"publicKeyPem": blog.public_key
},
"source": {
"content": "This is a small blog",
"mediaType": "text/markdown"
},
"summary": "",
"type": "Group"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn outbox_collection() {
let conn = &db();
conn.test_transaction::<_, Error, _>(|| {
let (_users, blogs) = fill_database(conn);
let blog = &blogs[0];
let act = blog.outbox_collection(conn)?;
let expected = json!({
"items": [],
"totalItems": 0,
"first": "https://plu.me/~/BlogName/outbox?page=1",
"last": "https://plu.me/~/BlogName/outbox?page=0",
"type": "OrderedCollection"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn outbox_collection_page() {
let conn = &db();
conn.test_transaction::<_, Error, _>(|| {
let (_users, blogs) = fill_database(conn);
let blog = &blogs[0];
let act = blog.outbox_collection_page(conn, (33, 36))?;
let expected = json!({
"next": "https://plu.me/~/BlogName/outbox?page=3",
"prev": "https://plu.me/~/BlogName/outbox?page=1",
"items": [],
"type": "OrderedCollectionPage"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}


@ -1,6 +1,5 @@
use crate::{
comment_seers::{CommentSeers, NewCommentSeers},
db_conn::DbConn,
instance::Instance,
medias::Media,
mentions::Mention,
@ -11,10 +10,15 @@ use crate::{
users::User,
Connection, Error, Result, CONFIG,
};
use activitypub::{
use activitystreams::{
activity::{Create, Delete},
link,
base::{AnyBase, Base},
iri_string::types::IriString,
link::{self, kind::MentionType},
object::{Note, Tombstone},
prelude::*,
primitives::OneOrMany,
time::OffsetDateTime,
};
use chrono::{self, NaiveDateTime};
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl, SaveChangesDsl};
@ -22,7 +26,7 @@ use plume_common::{
activity_pub::{
inbox::{AsActor, AsObject, FromId},
sign::Signer,
Id, IntoId, PUBLIC_VISIBILITY,
IntoId, ToAsString, ToAsUri, PUBLIC_VISIBILITY,
},
utils,
};
@ -59,7 +63,7 @@ impl Comment {
insert!(comments, NewComment, |inserted, conn| {
if inserted.ap_url.is_none() {
inserted.ap_url = Some(format!(
"{}comment/{}",
"{}/comment/{}",
inserted.get_post(conn)?.ap_url,
inserted.id
));
@ -69,6 +73,7 @@ impl Comment {
});
get!(comments);
list_by!(comments, list_by_post, post_id as i32);
list_by!(comments, list_by_author, author_id as i32);
find_by!(comments, find_by_ap_url, ap_url as &str);
pub fn get_author(&self, conn: &Connection) -> Result<User> {
@ -106,7 +111,7 @@ impl Comment {
.unwrap_or(false)
}
pub fn to_activity(&self, conn: &DbConn) -> Result<Note> {
pub fn to_activity(&self, conn: &Connection) -> Result<Note> {
let author = User::get(conn, self.author_id)?;
let (html, mentions, _hashtags) = utils::md_to_html(
self.content.get().as_ref(),
@ -115,47 +120,59 @@ impl Comment {
Some(Media::get_media_processor(conn, vec![&author])),
);
let mut note = Note::default();
let to = vec![Id::new(PUBLIC_VISIBILITY.to_string())];
let mut note = Note::new();
let to = vec![PUBLIC_VISIBILITY.parse::<IriString>()?];
note.object_props
.set_id_string(self.ap_url.clone().unwrap_or_default())?;
note.object_props
.set_summary_string(self.spoiler_text.clone())?;
note.object_props.set_content_string(html)?;
note.object_props
.set_in_reply_to_link(Id::new(self.in_response_to_id.map_or_else(
|| Ok(Post::get(conn, self.post_id)?.ap_url),
|id| Ok(Comment::get(conn, id)?.ap_url.unwrap_or_default()) as Result<String>,
)?))?;
note.object_props
.set_published_string(chrono::Utc::now().to_rfc3339())?;
note.object_props.set_attributed_to_link(author.into_id())?;
note.object_props.set_to_link_vec(to)?;
note.object_props.set_tag_link_vec(
mentions
.into_iter()
.filter_map(|m| Mention::build_activity(conn, &m).ok())
.collect::<Vec<link::Mention>>(),
)?;
note.set_id(
self.ap_url
.clone()
.unwrap_or_default()
.parse::<IriString>()?,
);
note.set_summary(self.spoiler_text.clone());
note.set_content(html);
note.set_in_reply_to(self.in_response_to_id.map_or_else(
|| Post::get(conn, self.post_id).map(|post| post.ap_url),
|id| Comment::get(conn, id).map(|comment| comment.ap_url.unwrap_or_default()),
)?);
note.set_published(
OffsetDateTime::from_unix_timestamp_nanos(self.creation_date.timestamp_nanos().into())
.expect("OffsetDateTime"),
);
note.set_attributed_to(author.into_id().parse::<IriString>()?);
note.set_many_tos(to);
note.set_many_tags(mentions.into_iter().filter_map(|m| {
Mention::build_activity(conn, &m)
.map(|mention| mention.into_any_base().expect("Can convert"))
.ok()
}));
Ok(note)
}
pub fn create_activity(&self, conn: &DbConn) -> Result<Create> {
pub fn create_activity(&self, conn: &Connection) -> Result<Create> {
let author = User::get(conn, self.author_id)?;
let note = self.to_activity(conn)?;
let mut act = Create::default();
act.create_props.set_actor_link(author.into_id())?;
act.create_props.set_object_object(note.clone())?;
act.object_props.set_id_string(format!(
"{}/activity",
self.ap_url.clone().ok_or(Error::MissingApProperty)?,
))?;
act.object_props
.set_to_link_vec(note.object_props.to_link_vec::<Id>()?)?;
act.object_props
.set_cc_link_vec(vec![Id::new(self.get_author(conn)?.followers_endpoint)])?;
let note_clone = note.clone();
let mut act = Create::new(
author.into_id().parse::<IriString>()?,
Base::retract(note)?.into_generic()?,
);
act.set_id(
format!(
"{}/activity",
self.ap_url.clone().ok_or(Error::MissingApProperty)?,
)
.parse::<IriString>()?,
);
act.set_many_tos(
note_clone
.to()
.iter()
.flat_map(|tos| tos.iter().map(|to| to.to_owned())),
);
act.set_many_ccs(vec![self.get_author(conn)?.followers_endpoint]);
Ok(act)
}
@ -180,138 +197,140 @@ impl Comment {
}
pub fn build_delete(&self, conn: &Connection) -> Result<Delete> {
let mut act = Delete::default();
act.delete_props
.set_actor_link(self.get_author(conn)?.into_id())?;
let mut tombstone = Tombstone::new();
tombstone.set_id(
self.ap_url
.as_ref()
.ok_or(Error::MissingApProperty)?
.parse::<IriString>()?,
);
let mut tombstone = Tombstone::default();
tombstone
.object_props
.set_id_string(self.ap_url.clone().ok_or(Error::MissingApProperty)?)?;
act.delete_props.set_object_object(tombstone)?;
let mut act = Delete::new(
self.get_author(conn)?.into_id().parse::<IriString>()?,
Base::retract(tombstone)?.into_generic()?,
);
act.object_props
.set_id_string(format!("{}#delete", self.ap_url.clone().unwrap()))?;
act.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY)])?;
act.set_id(format!("{}#delete", self.ap_url.clone().unwrap()).parse::<IriString>()?);
act.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
Ok(act)
}
}
impl FromId<DbConn> for Comment {
impl FromId<Connection> for Comment {
type Error = Error;
type Object = Note;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Self::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, note: Note) -> Result<Self> {
fn from_activity(conn: &Connection, note: Note) -> Result<Self> {
let comm = {
let previous_url = note
.object_props
.in_reply_to
.as_ref()
.in_reply_to()
.ok_or(Error::MissingApProperty)?
.as_str()
.iter()
.next()
.ok_or(Error::MissingApProperty)?
.id()
.ok_or(Error::MissingApProperty)?;
let previous_comment = Comment::find_by_ap_url(conn, previous_url);
let previous_comment = Comment::find_by_ap_url(conn, previous_url.as_str());
let is_public = |v: &Option<serde_json::Value>| match v
.as_ref()
.unwrap_or(&serde_json::Value::Null)
{
serde_json::Value::Array(v) => v
.iter()
.filter_map(serde_json::Value::as_str)
.any(|s| s == PUBLIC_VISIBILITY),
serde_json::Value::String(s) => s == PUBLIC_VISIBILITY,
_ => false,
let is_public = |v: &Option<&OneOrMany<AnyBase>>| match v {
Some(one_or_many) => one_or_many.iter().any(|any_base| {
let id = any_base.id();
id.is_some() && id.unwrap() == PUBLIC_VISIBILITY
}),
None => false,
};
let public_visibility = is_public(&note.object_props.to)
|| is_public(&note.object_props.bto)
|| is_public(&note.object_props.cc)
|| is_public(&note.object_props.bcc);
let public_visibility = is_public(&note.to())
|| is_public(&note.bto())
|| is_public(&note.cc())
|| is_public(&note.bcc());
let summary = note.summary().and_then(|summary| summary.to_as_string());
let sensitive = summary.is_some();
let comm = Comment::insert(
conn,
NewComment {
content: SafeString::new(&note.object_props.content_string()?),
spoiler_text: note.object_props.summary_string().unwrap_or_default(),
ap_url: note.object_props.id_string().ok(),
content: SafeString::new(
&note
.content()
.ok_or(Error::MissingApProperty)?
.to_as_string()
.ok_or(Error::InvalidValue)?,
),
spoiler_text: summary.unwrap_or_default(),
ap_url: Some(
note.id_unchecked()
.ok_or(Error::MissingApProperty)?
.to_string(),
),
in_response_to_id: previous_comment.iter().map(|c| c.id).next(),
post_id: previous_comment.map(|c| c.post_id).or_else(|_| {
Ok(Post::find_by_ap_url(conn, previous_url)?.id) as Result<i32>
Ok(Post::find_by_ap_url(conn, previous_url.as_str())?.id) as Result<i32>
})?,
author_id: User::from_id(
conn,
&note.object_props.attributed_to_link::<Id>()?,
&note
.attributed_to()
.ok_or(Error::MissingApProperty)?
.to_as_uri()
.ok_or(Error::MissingApProperty)?,
None,
CONFIG.proxy(),
)
.map_err(|(_, e)| e)?
.id,
sensitive: note.object_props.summary_string().is_ok(),
sensitive,
public_visibility,
},
)?;
// save mentions
if let Some(serde_json::Value::Array(tags)) = note.object_props.tag.clone() {
for tag in tags {
serde_json::from_value::<link::Mention>(tag)
.map_err(Error::from)
.and_then(|m| {
let author = &Post::get(conn, comm.post_id)?.get_authors(conn)?[0];
let not_author = m.link_props.href_string()? != author.ap_url.clone();
Mention::from_activity(conn, &m, comm.id, false, not_author)
})
.ok();
if let Some(tags) = note.tag() {
let author_url = &Post::get(conn, comm.post_id)?.get_authors(conn)?[0].ap_url;
for tag in tags.iter() {
let m = tag.clone().extend::<link::Mention, MentionType>()?; // FIXME: Don't clone
if m.is_none() {
continue;
}
let m = m.unwrap();
let not_author = m.href().ok_or(Error::MissingApProperty)? != author_url;
let _ = Mention::from_activity(conn, &m, comm.id, false, not_author);
}
}
comm
};
if !comm.public_visibility {
let receivers_ap_url = |v: Option<serde_json::Value>| {
let filter = |e: serde_json::Value| {
if let serde_json::Value::String(s) = e {
Some(s)
} else {
None
let mut receiver_ids = HashSet::new();
let mut receivers_id = |v: Option<&'_ OneOrMany<AnyBase>>| {
if let Some(one_or_many) = v {
for any_base in one_or_many.iter() {
if let Some(id) = any_base.id() {
receiver_ids.insert(id.to_string());
}
}
};
match v.unwrap_or(serde_json::Value::Null) {
serde_json::Value::Array(v) => v,
v => vec![v],
}
.into_iter()
.filter_map(filter)
};
let mut note = note;
receivers_id(note.to());
receivers_id(note.cc());
receivers_id(note.bto());
receivers_id(note.bcc());
let to = receivers_ap_url(note.object_props.to.take());
let cc = receivers_ap_url(note.object_props.cc.take());
let bto = receivers_ap_url(note.object_props.bto.take());
let bcc = receivers_ap_url(note.object_props.bcc.take());
let receivers_ap_url = to
.chain(cc)
.chain(bto)
.chain(bcc)
.collect::<HashSet<_>>() // remove duplicates (don't do a query more than once)
let receivers_ap_url = receiver_ids
.into_iter()
.map(|v| {
if let Ok(user) = User::from_id(conn, &v, None, CONFIG.proxy()) {
.flat_map(|v| {
if let Ok(user) = User::from_id(conn, v.as_ref(), None, CONFIG.proxy()) {
vec![user]
} else {
vec![] // TODO try to fetch collection
}
})
.flatten()
.filter(|u| u.get_instance(conn).map(|i| i.local).unwrap_or(false))
.collect::<HashSet<User>>(); //remove duplicates (prevent db error)
@ -335,21 +354,21 @@ impl FromId<DbConn> for Comment {
}
}
impl AsObject<User, Create, &DbConn> for Comment {
impl AsObject<User, Create, &Connection> for Comment {
type Error = Error;
type Output = Self;
fn activity(self, _conn: &DbConn, _actor: User, _id: &str) -> Result<Self> {
fn activity(self, _conn: &Connection, _actor: User, _id: &str) -> Result<Self> {
// The actual creation takes place in the FromId impl
Ok(self)
}
}
impl AsObject<User, Delete, &DbConn> for Comment {
impl AsObject<User, Delete, &Connection> for Comment {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<()> {
if self.author_id != actor.id {
return Err(Error::Unauthorized);
}
@ -362,14 +381,14 @@ impl AsObject<User, Delete, &DbConn> for Comment {
}
for n in Notification::find_for_comment(conn, &self)? {
n.delete(&**conn)?;
n.delete(conn)?;
}
diesel::update(comments::table)
.filter(comments::in_response_to_id.eq(self.id))
.set(comments::in_response_to_id.eq(self.in_response_to_id))
.execute(&**conn)?;
diesel::delete(&self).execute(&**conn)?;
.execute(conn)?;
diesel::delete(&self).execute(conn)?;
Ok(())
}
}
@ -403,10 +422,35 @@ impl CommentTree {
#[cfg(test)]
mod tests {
use super::*;
use crate::blogs::Blog;
use crate::db_conn::DbConn;
use crate::inbox::{inbox, tests::fill_database, InboxResult};
use crate::safe_string::SafeString;
use crate::tests::db;
use crate::tests::{db, format_datetime};
use assert_json_diff::assert_json_eq;
use diesel::Connection;
use serde_json::{json, to_value};
fn prepare_activity(conn: &DbConn) -> (Comment, Vec<Post>, Vec<User>, Vec<Blog>) {
let (posts, users, blogs) = fill_database(conn);
let comment = Comment::insert(
conn,
NewComment {
content: SafeString::new("My comment, mentioning to @user"),
in_response_to_id: None,
post_id: posts[0].id,
author_id: users[0].id,
ap_url: None,
sensitive: true,
spoiler_text: "My CW".into(),
public_visibility: true,
},
)
.unwrap();
(comment, posts, users, blogs)
}
// creates a comment, gets its Create activity, deletes the comment,
// "sends" the Create to the inbox, and checks that it works
@ -414,30 +458,77 @@ mod tests {
fn self_federation() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (posts, users, _) = fill_database(&conn);
let (original_comm, posts, users, _blogs) = prepare_activity(conn);
let act = original_comm.create_activity(conn).unwrap();
let original_comm = Comment::insert(
assert_json_eq!(to_value(&act).unwrap(), json!({
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": format!("https://plu.me/~/BlogName/testing/comment/{}/activity", original_comm.id),
"object": {
"attributedTo": "https://plu.me/@/admin/",
"content": r###"<p dir="auto">My comment, mentioning to <a href="https://plu.me/@/user/" title="user">@user</a></p>
"###,
"id": format!("https://plu.me/~/BlogName/testing/comment/{}", original_comm.id),
"inReplyTo": "https://plu.me/~/BlogName/testing",
"published": format_datetime(&original_comm.creation_date),
"summary": "My CW",
"tag": [
{
"href": "https://plu.me/@/user/",
"name": "@user",
"type": "Mention"
}
],
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Note"
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Create",
}));
let reply = Comment::insert(
conn,
NewComment {
content: SafeString::new("My comment"),
in_response_to_id: None,
content: SafeString::new(""),
in_response_to_id: Some(original_comm.id),
post_id: posts[0].id,
author_id: users[0].id,
author_id: users[1].id,
ap_url: None,
sensitive: true,
spoiler_text: "My CW".into(),
sensitive: false,
spoiler_text: "".into(),
public_visibility: true,
},
)
.unwrap();
let act = original_comm.create_activity(&conn).unwrap();
let reply_act = reply.create_activity(conn).unwrap();
assert_json_eq!(to_value(&reply_act).unwrap(), json!({
"actor": "https://plu.me/@/user/",
"cc": ["https://plu.me/@/user/followers"],
"id": format!("https://plu.me/~/BlogName/testing/comment/{}/activity", reply.id),
"object": {
"attributedTo": "https://plu.me/@/user/",
"content": "",
"id": format!("https://plu.me/~/BlogName/testing/comment/{}", reply.id),
"inReplyTo": format!("https://plu.me/~/BlogName/testing/comment/{}", original_comm.id),
"published": format_datetime(&reply.creation_date),
"summary": "",
"tag": [],
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Note"
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Create"
}));
inbox(
&conn,
serde_json::to_value(original_comm.build_delete(&conn).unwrap()).unwrap(),
conn,
serde_json::to_value(original_comm.build_delete(conn).unwrap()).unwrap(),
)
.unwrap();
match inbox(&conn, serde_json::to_value(act).unwrap()).unwrap() {
match inbox(conn, to_value(act).unwrap()).unwrap() {
InboxResult::Commented(c) => {
// TODO: one is HTML, the other markdown: assert_eq!(c.content, original_comm.content);
assert_eq!(c.in_response_to_id, original_comm.in_response_to_id);
@ -452,4 +543,60 @@ mod tests {
Ok(())
})
}
#[test]
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (comment, _posts, _users, _blogs) = prepare_activity(&conn);
let act = comment.to_activity(&conn)?;
let expected = json!({
"attributedTo": "https://plu.me/@/admin/",
"content": r###"<p dir="auto">My comment, mentioning to <a href="https://plu.me/@/user/" title="user">@user</a></p>
"###,
"id": format!("https://plu.me/~/BlogName/testing/comment/{}", comment.id),
"inReplyTo": "https://plu.me/~/BlogName/testing",
"published": format_datetime(&comment.creation_date),
"summary": "My CW",
"tag": [
{
"href": "https://plu.me/@/user/",
"name": "@user",
"type": "Mention"
}
],
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Note"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn build_delete() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (comment, _posts, _users, _blogs) = prepare_activity(&conn);
let act = comment.build_delete(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/admin/",
"id": format!("https://plu.me/~/BlogName/testing/comment/{}#delete", comment.id),
"object": {
"id": format!("https://plu.me/~/BlogName/testing/comment/{}", comment.id),
"type": "Tombstone"
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Delete"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}


@ -1,9 +1,14 @@
use crate::search::TokenizerKind as SearchTokenizer;
use crate::signups::Strategy as SignupStrategy;
use crate::smtp::{SMTP_PORT, SUBMISSIONS_PORT, SUBMISSION_PORT};
use rocket::config::Limits;
use rocket::Config as RocketConfig;
use std::collections::HashSet;
use std::env::{self, var};
#[cfg(feature = "s3")]
use s3::{Bucket, Region, creds::Credentials};
#[cfg(not(test))]
const DB_NAME: &str = "plume";
#[cfg(test)]
@ -15,21 +20,33 @@ pub struct Config {
pub db_name: &'static str,
pub db_max_size: Option<u32>,
pub db_min_idle: Option<u32>,
pub signup: SignupStrategy,
pub search_index: String,
pub search_tokenizers: SearchTokenizerConfig,
pub rocket: Result<RocketConfig, InvalidRocketConfig>,
pub logo: LogoConfig,
pub default_theme: String,
pub media_directory: String,
pub mail: Option<MailConfig>,
pub ldap: Option<LdapConfig>,
pub proxy: Option<ProxyConfig>,
pub s3: Option<S3Config>,
}
impl Config {
pub fn proxy(&self) -> Option<&reqwest::Proxy> {
self.proxy.as_ref().map(|p| &p.proxy)
}
}
fn string_to_bool(val: &str, name: &str) -> bool {
match val {
"1" | "true" | "TRUE" => true,
"0" | "false" | "FALSE" => false,
_ => panic!("Invalid configuration: {} is not boolean", name),
}
}
#[derive(Debug, Clone)]
pub enum InvalidRocketConfig {
Env,
@ -245,6 +262,31 @@ impl SearchTokenizerConfig {
}
}
pub struct MailConfig {
pub server: String,
pub port: u16,
pub helo_name: String,
pub username: String,
pub password: String,
}
fn get_mail_config() -> Option<MailConfig> {
Some(MailConfig {
server: env::var("MAIL_SERVER").ok()?,
port: env::var("MAIL_PORT").map_or(SUBMISSIONS_PORT, |port| match port.as_str() {
"smtp" => SMTP_PORT,
"submissions" => SUBMISSIONS_PORT,
"submission" => SUBMISSION_PORT,
number => number
.parse()
.expect(r#"MAIL_PORT must be "smtp", "submissions", "submission" or an integer."#),
}),
helo_name: env::var("MAIL_HELO_NAME").unwrap_or_else(|_| "localhost".to_owned()),
username: env::var("MAIL_USER").ok()?,
password: env::var("MAIL_PASSWORD").ok()?,
})
}
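// Example environment for the mail configuration above (values are
// illustrative, not defaults):
//   MAIL_SERVER=smtp.example.com
//   MAIL_PORT=submissions        (or "smtp", "submission", or a port number)
//   MAIL_HELO_NAME=plu.me        (falls back to "localhost" when unset)
//   MAIL_USER=plume
//   MAIL_PASSWORD=secret
// MAIL_SERVER, MAIL_USER and MAIL_PASSWORD are required: if any of them is
// missing, get_mail_config() returns None and no MailConfig is set.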
pub struct LdapConfig {
pub addr: String,
pub base_dn: String,
@ -259,11 +301,7 @@ fn get_ldap_config() -> Option<LdapConfig> {
match (addr, base_dn) {
(Some(addr), Some(base_dn)) => {
let tls = var("LDAP_TLS").unwrap_or_else(|_| "false".to_owned());
let tls = match tls.as_ref() {
"1" | "true" | "TRUE" => true,
"0" | "false" | "FALSE" => false,
_ => panic!("Invalid LDAP configuration : tls"),
};
let tls = string_to_bool(&tls, "LDAP_TLS");
let user_name_attr = var("LDAP_USER_NAME_ATTR").unwrap_or_else(|_| "cn".to_owned());
let mail_attr = var("LDAP_USER_MAIL_ATTR").unwrap_or_else(|_| "mail".to_owned());
Some(LdapConfig {
@ -320,6 +358,104 @@ fn get_proxy_config() -> Option<ProxyConfig> {
})
}
pub struct S3Config {
pub bucket: String,
pub access_key_id: String,
pub access_key_secret: String,
// Region; if not set, defaults to us-east-1
pub region: String,
// Hostname for S3; if not set, defaults to $region.amazonaws.com
pub hostname: String,
// May be useful with self-hosted S3; won't work with recent AWS buckets
pub path_style: bool,
// http or https
pub protocol: String,
// Download directly from S3 to the user, without going through Plume. Requires public read on the bucket
pub direct_download: bool,
// Use this hostname for downloads; can be used with a caching proxy in front of S3 (expected
// to be reachable through HTTPS)
pub alias: Option<String>,
}
impl S3Config {
#[cfg(feature = "s3")]
pub fn get_bucket(&self) -> Bucket {
let region = Region::Custom {
region: self.region.clone(),
endpoint: format!("{}://{}", self.protocol, self.hostname),
};
let credentials = Credentials {
access_key: Some(self.access_key_id.clone()),
secret_key: Some(self.access_key_secret.clone()),
security_token: None,
session_token: None,
expiration: None,
};
let bucket = Bucket::new(&self.bucket, region, credentials).unwrap();
if self.path_style {
bucket.with_path_style()
} else {
bucket
}
}
}
fn get_s3_config() -> Option<S3Config> {
let bucket = var("S3_BUCKET").ok();
let access_key_id = var("AWS_ACCESS_KEY_ID").ok();
let access_key_secret = var("AWS_SECRET_ACCESS_KEY").ok();
if bucket.is_none() && access_key_id.is_none() && access_key_secret.is_none() {
return None;
}
#[cfg(not(feature = "s3"))]
panic!("S3 support is not enabled in this build");
#[cfg(feature = "s3")]
{
if bucket.is_none() || access_key_id.is_none() || access_key_secret.is_none() {
panic!("Invalid S3 configuration: some required values are set, but not others");
}
let bucket = bucket.unwrap();
let access_key_id = access_key_id.unwrap();
let access_key_secret = access_key_secret.unwrap();
let region = var("S3_REGION").unwrap_or_else(|_| "us-east-1".to_owned());
let hostname = var("S3_HOSTNAME").unwrap_or_else(|_| format!("{}.amazonaws.com", region));
let protocol = var("S3_PROTOCOL").unwrap_or_else(|_| "https".to_owned());
if protocol != "http" && protocol != "https" {
panic!("Invalid S3 configuration: invalid protocol {}", protocol);
}
let path_style = var("S3_PATH_STYLE").unwrap_or_else(|_| "false".to_owned());
let path_style = string_to_bool(&path_style, "S3_PATH_STYLE");
let direct_download = var("S3_DIRECT_DOWNLOAD").unwrap_or_else(|_| "false".to_owned());
let direct_download = string_to_bool(&direct_download, "S3_DIRECT_DOWNLOAD");
let alias = var("S3_ALIAS_HOST").ok();
if direct_download && protocol == "http" && alias.is_none() {
panic!("S3 direct download is disabled because bucket is accessed through plain HTTP. Use HTTPS or set an alias hostname (S3_ALIAS_HOST).");
}
Some(S3Config {
bucket,
access_key_id,
access_key_secret,
region,
hostname,
protocol,
path_style,
direct_download,
alias,
})
}
}
lazy_static! {
pub static ref CONFIG: Config = Config {
base_url: var("BASE_URL").unwrap_or_else(|_| format!(
@ -335,6 +471,7 @@ lazy_static! {
s.parse::<u32>()
.expect("Couldn't parse DB_MIN_IDLE into u32")
)),
signup: var("SIGNUP").map_or(SignupStrategy::default(), |s| s.parse().unwrap()),
#[cfg(feature = "postgres")]
database_url: var("DATABASE_URL")
.unwrap_or_else(|_| format!("postgres://plume:plume@localhost/{}", DB_NAME)),
@ -347,7 +484,9 @@ lazy_static! {
default_theme: var("DEFAULT_THEME").unwrap_or_else(|_| "default-light".to_owned()),
media_directory: var("MEDIA_UPLOAD_DIRECTORY")
.unwrap_or_else(|_| "static/media".to_owned()),
mail: get_mail_config(),
ldap: get_ldap_config(),
proxy: get_proxy_config(),
s3: get_s3_config(),
};
}
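// Editor's sketch, not part of the diff: how other code can consume the new
// optional S3 configuration. The key and content type below are placeholders;
// CONFIG.s3 is only Some(_) when S3_BUCKET, AWS_ACCESS_KEY_ID and
// AWS_SECRET_ACCESS_KEY are all set and the crate is built with the "s3"
// feature.
#[cfg(feature = "s3")]
fn store_bytes_on_s3(key: &str, bytes: &[u8]) -> Result<(), s3::error::S3Error> {
    if let Some(s3) = CONFIG.s3.as_ref() {
        // Reuses the same blocking rust-s3 call the media mirroring code uses.
        s3.get_bucket()
            .put_object_with_content_type_blocking(key, bytes, "application/octet-stream")?;
    }
    Ok(())
}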

View file

@ -69,7 +69,8 @@ pub(crate) mod tests {
impl CustomizeConnection<Connection, ConnError> for TestConnectionCustomizer {
fn on_acquire(&self, conn: &mut Connection) -> Result<(), ConnError> {
PragmaForeignKey.on_acquire(conn)?;
Ok(conn.begin_test_transaction().unwrap())
conn.begin_test_transaction().unwrap();
Ok(())
}
}
}

View file

@ -0,0 +1,158 @@
use crate::{
blocklisted_emails::BlocklistedEmail,
db_conn::DbConn,
schema::email_signups,
users::{NewUser, Role, User},
Error, Result,
};
use chrono::{offset::Utc, Duration, NaiveDateTime};
use diesel::{
Connection as _, ExpressionMethods, Identifiable, Insertable, QueryDsl, Queryable, RunQueryDsl,
};
use plume_common::utils::random_hex;
use std::ops::Deref;
const TOKEN_VALIDITY_HOURS: i64 = 2;
#[repr(transparent)]
pub struct Token(String);
impl From<String> for Token {
fn from(string: String) -> Self {
Token(string)
}
}
impl From<Token> for String {
fn from(token: Token) -> Self {
token.0
}
}
impl Deref for Token {
type Target = String;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl Token {
fn generate() -> Self {
Self(random_hex())
}
}
#[derive(Identifiable, Queryable)]
pub struct EmailSignup {
pub id: i32,
pub email: String,
pub token: String,
pub expiration_date: NaiveDateTime,
}
#[derive(Insertable)]
#[table_name = "email_signups"]
pub struct NewEmailSignup<'a> {
pub email: &'a str,
pub token: &'a str,
pub expiration_date: NaiveDateTime,
}
impl EmailSignup {
pub fn start(conn: &DbConn, email: &str) -> Result<Token> {
Self::ensure_email_not_blocked(conn, email)?;
conn.transaction(|| {
Self::ensure_user_not_exist_by_email(conn, email)?;
let _rows = Self::delete_existings_by_email(conn, email)?;
let token = Token::generate();
let expiration_date = Utc::now()
.naive_utc()
.checked_add_signed(Duration::hours(TOKEN_VALIDITY_HOURS))
.expect("could not calculate expiration date");
let new_signup = NewEmailSignup {
email,
token: &token,
expiration_date,
};
let _rows = diesel::insert_into(email_signups::table)
.values(new_signup)
.execute(&**conn)?;
Ok(token)
})
}
pub fn find_by_token(conn: &DbConn, token: Token) -> Result<Self> {
let signup = email_signups::table
.filter(email_signups::token.eq(token.as_str()))
.first::<Self>(&**conn)
.map_err(Error::from)?;
Ok(signup)
}
pub fn confirm(&self, conn: &DbConn) -> Result<()> {
Self::ensure_email_not_blocked(conn, &self.email)?;
conn.transaction(|| {
Self::ensure_user_not_exist_by_email(conn, &self.email)?;
if self.expired() {
Self::delete_existings_by_email(conn, &self.email)?;
return Err(Error::Expired);
}
Ok(())
})
}
pub fn complete(&self, conn: &DbConn, username: String, password: String) -> Result<User> {
Self::ensure_email_not_blocked(conn, &self.email)?;
conn.transaction(|| {
Self::ensure_user_not_exist_by_email(conn, &self.email)?;
let user = NewUser::new_local(
conn,
username,
"".to_string(),
Role::Normal,
"",
self.email.clone(),
Some(User::hash_pass(&password)?),
)?;
self.delete(conn)?;
Ok(user)
})
}
fn delete(&self, conn: &DbConn) -> Result<()> {
let _rows = diesel::delete(self).execute(&**conn).map_err(Error::from)?;
Ok(())
}
fn ensure_email_not_blocked(conn: &DbConn, email: &str) -> Result<()> {
if let Some(x) = BlocklistedEmail::matches_blocklist(conn, email)? {
Err(Error::Blocklisted(x.notify_user, x.notification_text))
} else {
Ok(())
}
}
fn ensure_user_not_exist_by_email(conn: &DbConn, email: &str) -> Result<()> {
if User::email_used(conn, email)? {
let _rows = Self::delete_existings_by_email(conn, email)?;
return Err(Error::UserAlreadyExists);
}
Ok(())
}
fn delete_existings_by_email(conn: &DbConn, email: &str) -> Result<usize> {
let existing_signups = email_signups::table.filter(email_signups::email.eq(email));
diesel::delete(existing_signups)
.execute(&**conn)
.map_err(Error::from)
}
fn expired(&self) -> bool {
self.expiration_date < Utc::now().naive_utc()
}
}
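// Editor's sketch, not part of the diff: the intended three-step flow for the
// email-signup API above. The address, username and password are placeholders,
// and in the real application each step happens in a separate request.
#[allow(dead_code)]
fn signup_flow_example(conn: &DbConn) -> Result<User> {
    // 1. The visitor submits an address; a token is generated and stored.
    let token = EmailSignup::start(conn, "someone@example.com")?;
    // 2. The token is mailed out; when the confirmation link is opened we look
    //    the signup up again and make sure it has not expired.
    let signup = EmailSignup::find_by_token(conn, token)?;
    signup.confirm(conn)?;
    // 3. The visitor picks a username and password; the account is created and
    //    the pending signup row is deleted.
    signup.complete(conn, "someone".to_string(), "hunter2".to_string())
}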

View file

@ -1,8 +1,13 @@
use crate::{
ap_url, db_conn::DbConn, instance::Instance, notifications::*, schema::follows, users::User,
Connection, Error, Result, CONFIG,
ap_url, instance::Instance, notifications::*, schema::follows, users::User, Connection, Error,
Result, CONFIG,
};
use activitystreams::{
activity::{Accept, ActorAndObjectRef, Follow as FollowAct, Undo},
base::AnyBase,
iri_string::types::IriString,
prelude::*,
};
use activitypub::activity::{Accept, Follow as FollowAct, Undo};
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl, SaveChangesDsl};
use plume_common::activity_pub::{
broadcast,
@ -53,15 +58,13 @@ impl Follow {
pub fn to_activity(&self, conn: &Connection) -> Result<FollowAct> {
let user = User::get(conn, self.follower_id)?;
let target = User::get(conn, self.following_id)?;
let target_id = target.ap_url.parse::<IriString>()?;
let mut act = FollowAct::new(user.ap_url.parse::<IriString>()?, target_id.clone());
act.set_id(self.ap_url.parse::<IriString>()?);
act.set_many_tos(vec![target_id]);
act.set_many_ccs(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
let mut act = FollowAct::default();
act.follow_props.set_actor_link::<Id>(user.into_id())?;
act.follow_props
.set_object_link::<Id>(target.clone().into_id())?;
act.object_props.set_id_string(self.ap_url.clone())?;
act.object_props.set_to_link_vec(vec![target.into_id()])?;
act.object_props
.set_cc_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
Ok(act)
}
@ -94,81 +97,87 @@ impl Follow {
NewFollow {
follower_id: from_id,
following_id: target_id,
ap_url: follow.object_props.id_string()?,
ap_url: follow
.object_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.to_string(),
},
)?;
res.notify(conn)?;
let mut accept = Accept::default();
let accept_id = ap_url(&format!(
"{}/follow/{}/accept",
CONFIG.base_url.as_str(),
&res.id
));
accept.object_props.set_id_string(accept_id)?;
accept
.object_props
.set_to_link_vec(vec![from.clone().into_id()])?;
accept
.object_props
.set_cc_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
accept
.accept_props
.set_actor_link::<Id>(target.clone().into_id())?;
accept.accept_props.set_object_object(follow)?;
broadcast(
&*target,
accept,
vec![from.clone()],
CONFIG.proxy().cloned(),
);
let accept = res.build_accept(from, target, follow)?;
broadcast(target, accept, vec![from.clone()], CONFIG.proxy().cloned());
Ok(res)
}
pub fn build_accept<A: Signer + IntoId + Clone, B: Clone + AsActor<T> + IntoId, T>(
&self,
from: &B,
target: &A,
follow: FollowAct,
) -> Result<Accept> {
let mut accept = Accept::new(
target.clone().into_id().parse::<IriString>()?,
AnyBase::from_extended(follow)?,
);
let accept_id = ap_url(&format!(
"{}/follows/{}/accept",
CONFIG.base_url.as_str(),
self.id
));
accept.set_id(accept_id.parse::<IriString>()?);
accept.set_many_tos(vec![from.clone().into_id().parse::<IriString>()?]);
accept.set_many_ccs(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
Ok(accept)
}
pub fn build_undo(&self, conn: &Connection) -> Result<Undo> {
let mut undo = Undo::default();
undo.undo_props
.set_actor_link(User::get(conn, self.follower_id)?.into_id())?;
undo.object_props
.set_id_string(format!("{}/undo", self.ap_url))?;
undo.undo_props
.set_object_link::<Id>(self.clone().into_id())?;
undo.object_props
.set_to_link_vec(vec![User::get(conn, self.following_id)?.into_id()])?;
undo.object_props
.set_cc_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
let mut undo = Undo::new(
User::get(conn, self.follower_id)?
.ap_url
.parse::<IriString>()?,
self.ap_url.parse::<IriString>()?,
);
undo.set_id(format!("{}/undo", self.ap_url).parse::<IriString>()?);
undo.set_many_tos(vec![User::get(conn, self.following_id)?
.ap_url
.parse::<IriString>()?]);
undo.set_many_ccs(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
Ok(undo)
}
}
impl AsObject<User, FollowAct, &DbConn> for User {
impl AsObject<User, FollowAct, &Connection> for User {
type Error = Error;
type Output = Follow;
fn activity(self, conn: &DbConn, actor: User, id: &str) -> Result<Follow> {
fn activity(self, conn: &Connection, actor: User, id: &str) -> Result<Follow> {
// Mastodon (at least) requires the full Follow object when accepting it,
// so we rebuild it here
let mut follow = FollowAct::default();
follow.object_props.set_id_string(id.to_string())?;
follow
.follow_props
.set_actor_link::<Id>(actor.clone().into_id())?;
let follow = FollowAct::new(actor.ap_url.parse::<IriString>()?, id.parse::<IriString>()?);
Follow::accept_follow(conn, &actor, &self, follow, actor.id, self.id)
}
}
impl FromId<DbConn> for Follow {
impl FromId<Connection> for Follow {
type Error = Error;
type Object = FollowAct;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Follow::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, follow: FollowAct) -> Result<Self> {
fn from_activity(conn: &Connection, follow: FollowAct) -> Result<Self> {
let actor = User::from_id(
conn,
&follow.follow_props.actor_link::<Id>()?,
follow
.actor_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.as_str(),
None,
CONFIG.proxy(),
)
@ -176,7 +185,11 @@ impl FromId<DbConn> for Follow {
let target = User::from_id(
conn,
&follow.follow_props.object_link::<Id>()?,
follow
.object_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.as_str(),
None,
CONFIG.proxy(),
)
@ -189,18 +202,18 @@ impl FromId<DbConn> for Follow {
}
}
impl AsObject<User, Undo, &DbConn> for Follow {
impl AsObject<User, Undo, &Connection> for Follow {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<()> {
let conn = conn;
if self.follower_id == actor.id {
diesel::delete(&self).execute(&**conn)?;
diesel::delete(&self).execute(conn)?;
// delete associated notification if any
if let Ok(notif) = Notification::find(conn, notification_kind::FOLLOW, self.id) {
diesel::delete(&notif).execute(&**conn)?;
diesel::delete(&notif).execute(conn)?;
}
Ok(())
@ -219,8 +232,31 @@ impl IntoId for Follow {
#[cfg(test)]
mod tests {
use super::*;
use crate::{tests::db, users::tests as user_tests};
use crate::{
db_conn::DbConn, tests::db, users::tests as user_tests, users::tests::fill_database,
};
use assert_json_diff::assert_json_eq;
use diesel::Connection;
use serde_json::{json, to_value};
fn prepare_activity(conn: &DbConn) -> (Follow, User, User, Vec<User>) {
let users = fill_database(conn);
let following = &users[1];
let follower = &users[2];
let mut follow = Follow::insert(
conn,
NewFollow {
follower_id: follower.id,
following_id: following.id,
ap_url: "".into(),
},
)
.unwrap();
// following.ap_url = format!("https://plu.me/follows/{}", follow.id);
follow.ap_url = format!("https://plu.me/follows/{}", follow.id);
(follow, following.to_owned(), follower.to_owned(), users)
}
#[test]
fn test_id() {
@ -255,4 +291,77 @@ mod tests {
Ok(())
})
}
#[test]
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (follow, _following, _follower, _users) = prepare_activity(&conn);
let act = follow.to_activity(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/other/",
"cc": ["https://www.w3.org/ns/activitystreams#Public"],
"id": format!("https://plu.me/follows/{}", follow.id),
"object": "https://plu.me/@/user/",
"to": ["https://plu.me/@/user/"],
"type": "Follow"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn build_accept() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (follow, following, follower, _users) = prepare_activity(&conn);
let act = follow.build_accept(&follower, &following, follow.to_activity(&conn)?)?;
let expected = json!({
"actor": "https://plu.me/@/user/",
"cc": ["https://www.w3.org/ns/activitystreams#Public"],
"id": format!("https://127.0.0.1:7878/follows/{}/accept", follow.id),
"object": {
"actor": "https://plu.me/@/other/",
"cc": ["https://www.w3.org/ns/activitystreams#Public"],
"id": format!("https://plu.me/follows/{}", follow.id),
"object": "https://plu.me/@/user/",
"to": ["https://plu.me/@/user/"],
"type": "Follow"
},
"to": ["https://plu.me/@/other/"],
"type": "Accept"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn build_undo() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (follow, _following, _follower, _users) = prepare_activity(&conn);
let act = follow.build_undo(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/other/",
"cc": ["https://www.w3.org/ns/activitystreams#Public"],
"id": format!("https://plu.me/follows/{}/undo", follow.id),
"object": format!("https://plu.me/follows/{}", follow.id),
"to": ["https://plu.me/@/user/"],
"type": "Undo"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}

View file

@ -1,13 +1,12 @@
use activitypub::activity::*;
use activitystreams::activity::{Announce, Create, Delete, Follow, Like, Undo, Update};
use crate::{
comments::Comment,
db_conn::DbConn,
follows, likes,
posts::{Post, PostUpdate},
reshares::Reshare,
users::User,
Error, CONFIG,
Connection, Error, CONFIG,
};
use plume_common::activity_pub::inbox::Inbox;
@ -46,7 +45,7 @@ impl_into_inbox_result! {
Reshare => Reshared
}
pub fn inbox(conn: &DbConn, act: serde_json::Value) -> Result<InboxResult, Error> {
pub fn inbox(conn: &Connection, act: serde_json::Value) -> Result<InboxResult, Error> {
Inbox::handle(conn, act)
.with::<User, Announce, Post>(CONFIG.proxy())
.with::<User, Create, Comment>(CONFIG.proxy())
@ -82,9 +81,9 @@ pub(crate) mod tests {
use crate::post_authors::*;
use crate::posts::*;
let (users, blogs) = blog_fill_db(&conn);
let (users, blogs) = blog_fill_db(conn);
let post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "testing".to_owned(),
@ -94,15 +93,15 @@ pub(crate) mod tests {
license: "WTFPL".to_owned(),
creation_date: None,
ap_url: format!("https://plu.me/~/{}/testing", blogs[0].actor_id),
subtitle: String::new(),
source: String::new(),
subtitle: "Bye".to_string(),
source: "Hello".to_string(),
cover_id: None,
},
)
.unwrap();
PostAuthor::insert(
&conn,
conn,
NewPostAuthor {
post_id: post.id,
author_id: users[0].id,
@ -190,7 +189,7 @@ pub(crate) mod tests {
});
assert!(matches!(
super::inbox(&conn, act.clone()),
super::inbox(&conn, act),
Err(super::Error::Inbox(
box plume_common::activity_pub::inbox::InboxError::InvalidObject(_),
))
@ -221,7 +220,7 @@ pub(crate) mod tests {
});
assert!(matches!(
super::inbox(&conn, act.clone()),
super::inbox(&conn, act),
Err(super::Error::Inbox(
box plume_common::activity_pub::inbox::InboxError::InvalidObject(_),
))
@ -249,7 +248,7 @@ pub(crate) mod tests {
});
assert!(matches!(
super::inbox(&conn, act.clone()),
super::inbox(&conn, act),
Err(super::Error::Inbox(
box plume_common::activity_pub::inbox::InboxError::InvalidObject(_),
))
@ -268,7 +267,7 @@ pub(crate) mod tests {
"actor": users[0].ap_url,
"object": {
"type": "Article",
"id": "https://plu.me/~/Blog/my-article",
"id": "https://plu.me/~/BlogName/testing",
"attributedTo": [users[0].ap_url, blogs[0].ap_url],
"content": "Hello.",
"name": "My Article",
@ -286,11 +285,11 @@ pub(crate) mod tests {
match super::inbox(&conn, act).unwrap() {
super::InboxResult::Post(p) => {
assert!(p.is_author(&conn, users[0].id).unwrap());
assert_eq!(p.source, "Hello.".to_owned());
assert_eq!(p.source, "Hello".to_owned());
assert_eq!(p.blog_id, blogs[0].id);
assert_eq!(p.content, SafeString::new("Hello."));
assert_eq!(p.subtitle, "Bye.".to_owned());
assert_eq!(p.title, "My Article".to_owned());
assert_eq!(p.content, SafeString::new("Hello"));
assert_eq!(p.subtitle, "Bye".to_owned());
assert_eq!(p.title, "Testing".to_owned());
}
_ => panic!("Unexpected result"),
};
@ -324,7 +323,7 @@ pub(crate) mod tests {
});
assert!(matches!(
super::inbox(&conn, act.clone()),
super::inbox(&conn, act),
Err(super::Error::Inbox(
box plume_common::activity_pub::inbox::InboxError::InvalidObject(_),
))
@ -362,7 +361,7 @@ pub(crate) mod tests {
});
assert!(matches!(
super::inbox(&conn, act.clone()),
super::inbox(&conn, act),
Err(super::Error::Inbox(
box plume_common::activity_pub::inbox::InboxError::InvalidObject(_),
))
@ -397,7 +396,7 @@ pub(crate) mod tests {
});
assert!(matches!(
super::inbox(&conn, act.clone()),
super::inbox(&conn, act),
Err(super::Error::Inbox(
box plume_common::activity_pub::inbox::InboxError::InvalidObject(_),
))
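// Editor's sketch, not part of the diff: feeding a raw ActivityPub payload
// (already deserialized to serde_json::Value) through the shared inbox and
// reacting to the result. The logging is illustrative only.
#[allow(dead_code)]
fn handle_incoming(conn: &Connection, payload: serde_json::Value) -> Result<(), Error> {
    match inbox(conn, payload)? {
        InboxResult::Post(post) => tracing::info!("new or updated article: {}", post.slug),
        _ => tracing::info!("activity handled"),
    }
    Ok(())
}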

View file

@ -9,7 +9,7 @@ use crate::{
use chrono::NaiveDateTime;
use diesel::{self, result::Error::NotFound, ExpressionMethods, QueryDsl, RunQueryDsl};
use once_cell::sync::OnceCell;
use plume_common::utils::md_to_html;
use plume_common::utils::{iri_percent_encode_seg, md_to_html};
use std::sync::RwLock;
#[derive(Clone, Identifiable, Queryable)]
@ -173,8 +173,8 @@ impl Instance {
"{instance}/{prefix}/{name}/{box_name}",
instance = self.public_domain,
prefix = prefix,
name = name,
box_name = box_name
name = iri_percent_encode_seg(name),
box_name = iri_percent_encode_seg(box_name)
))
}
@ -523,7 +523,7 @@ pub(crate) mod tests {
.unwrap();
let inst = Instance::get(conn, inst.id).unwrap();
assert_eq!(inst.name, "NewName".to_owned());
assert_eq!(inst.open_registrations, false);
assert!(!inst.open_registrations);
assert_eq!(
inst.long_description.get(),
"[long_description](/with_link)"

plume-models/src/lib.rs Executable file → Normal file
View file

@ -16,6 +16,9 @@ extern crate serde_json;
#[macro_use]
extern crate tantivy;
use activitystreams::iri_string;
pub use lettre;
pub use lettre::smtp;
use once_cell::sync::Lazy;
use plume_common::activity_pub::{inbox::InboxError, request, sign};
use posts::PostEvent;
@ -65,6 +68,9 @@ pub enum Error {
Url,
Webfinger,
Expired,
UserAlreadyExists,
#[cfg(feature = "s3")]
S3(s3::error::S3Error),
}
impl From<bcrypt::BcryptError> for Error {
@ -97,6 +103,12 @@ impl From<url::ParseError> for Error {
}
}
impl From<iri_string::validate::Error> for Error {
fn from(_: iri_string::validate::Error) -> Self {
Error::Url
}
}
impl From<serde_json::Error> for Error {
fn from(_: serde_json::Error) -> Self {
Error::SerDe
@ -115,12 +127,9 @@ impl From<reqwest::header::InvalidHeaderValue> for Error {
}
}
impl From<activitypub::Error> for Error {
fn from(err: activitypub::Error) -> Self {
match err {
activitypub::Error::NotFound => Error::MissingApProperty,
_ => Error::SerDe,
}
impl From<activitystreams::checked::CheckError> for Error {
fn from(_: activitystreams::checked::CheckError) -> Error {
Error::MissingApProperty
}
}
@ -163,6 +172,13 @@ impl From<request::Error> for Error {
}
}
#[cfg(feature = "s3")]
impl From<s3::error::S3Error> for Error {
fn from(err: s3::error::S3Error) -> Error {
Error::S3(err)
}
}
pub type Result<T> = std::result::Result<T, Error>;
/// Adds a function to a model, that returns the first
@ -170,7 +186,7 @@ pub type Result<T> = std::result::Result<T, Error>;
///
/// Usage:
///
/// ```rust
/// ```ignore
/// impl Model {
/// find_by!(model_table, name_of_the_function, field1 as String, field2 as i32);
/// }
@ -194,7 +210,7 @@ macro_rules! find_by {
///
/// Usage:
///
/// ```rust
/// ```ignore
/// impl Model {
/// list_by!(model_table, name_of_the_function, field1 as String);
/// }
@ -218,7 +234,7 @@ macro_rules! list_by {
///
/// # Usage
///
/// ```rust
/// ```ignore
/// impl Model {
/// get!(model_table);
/// }
@ -241,7 +257,7 @@ macro_rules! get {
///
/// # Usage
///
/// ```rust
/// ```ignore
/// impl Model {
/// insert!(model_table, NewModelType);
/// }
@ -273,7 +289,7 @@ macro_rules! insert {
///
/// # Usage
///
/// ```rust
/// ```ignore
/// impl Model {
/// last!(model_table);
/// }
@ -300,10 +316,38 @@ pub fn ap_url(url: &str) -> String {
format!("https://{}", url)
}
pub trait SmtpNewWithAddr {
fn new_with_addr(
addr: (&str, u16),
) -> std::result::Result<smtp::SmtpClient, smtp::error::Error>;
}
impl SmtpNewWithAddr for smtp::SmtpClient {
// Stolen from lettre::smtp::SmtpClient::new_simple()
fn new_with_addr(addr: (&str, u16)) -> std::result::Result<Self, smtp::error::Error> {
use native_tls::TlsConnector;
use smtp::{
client::net::{ClientTlsParameters, DEFAULT_TLS_PROTOCOLS},
ClientSecurity, SmtpClient,
};
let (domain, port) = addr;
let mut tls_builder = TlsConnector::builder();
tls_builder.min_protocol_version(Some(DEFAULT_TLS_PROTOCOLS[0]));
let tls_parameters =
ClientTlsParameters::new(domain.to_string(), tls_builder.build().unwrap());
SmtpClient::new((domain, port), ClientSecurity::Wrapper(tls_parameters))
}
}
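// Editor's sketch, not part of the diff: using the helper above to reach an
// SMTP server that only speaks implicit TLS on a non-standard port. Host and
// port are placeholders; credentials would normally be added before calling
// .transport().
#[allow(dead_code)]
fn connect_smtp_example() -> std::result::Result<smtp::SmtpTransport, smtp::error::Error> {
    let client = smtp::SmtpClient::new_with_addr(("mail.example.com", 465))?;
    Ok(client.transport())
}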
#[cfg(test)]
#[macro_use]
mod tests {
use crate::{db_conn, migrations::IMPORTED_MIGRATIONS, Connection as Conn, CONFIG};
use chrono::{naive::NaiveDateTime, Datelike, Timelike};
use diesel::r2d2::ConnectionManager;
use plume_common::utils::random_hex;
use std::env::temp_dir;
@ -319,7 +363,7 @@ mod tests {
};
}
pub fn db<'a>() -> db_conn::DbConn {
pub fn db() -> db_conn::DbConn {
db_conn::DbConn((*DB_POOL).get().unwrap())
}
@ -336,6 +380,33 @@ mod tests {
pool
};
}
#[cfg(feature = "postgres")]
pub(crate) fn format_datetime(dt: &NaiveDateTime) -> String {
format!(
"{:04}-{:02}-{:02}T{:02}:{:02}:{:02}.{:06}Z",
dt.year(),
dt.month(),
dt.day(),
dt.hour(),
dt.minute(),
dt.second(),
dt.timestamp_subsec_micros()
)
}
#[cfg(feature = "sqlite")]
pub(crate) fn format_datetime(dt: &NaiveDateTime) -> String {
format!(
"{:04}-{:02}-{:02}T{:02}:{:02}:{:02}Z",
dt.year(),
dt.month(),
dt.day(),
dt.hour(),
dt.minute(),
dt.second()
)
}
}
pub mod admin;
@ -347,6 +418,7 @@ pub mod blogs;
pub mod comment_seers;
pub mod comments;
pub mod db_conn;
pub mod email_signups;
pub mod follows;
pub mod headers;
pub mod inbox;
@ -367,6 +439,7 @@ pub mod safe_string;
#[allow(unused_imports)]
pub mod schema;
pub mod search;
pub mod signups;
pub mod tags;
pub mod timeline;
pub mod users;

View file

@ -1,14 +1,19 @@
use crate::{
db_conn::DbConn, instance::Instance, notifications::*, posts::Post, schema::likes, timeline::*,
users::User, Connection, Error, Result, CONFIG,
instance::Instance, notifications::*, posts::Post, schema::likes, timeline::*, users::User,
Connection, Error, Result, CONFIG,
};
use activitystreams::{
activity::{ActorAndObjectRef, Like as LikeAct, Undo},
base::AnyBase,
iri_string::types::IriString,
prelude::*,
};
use activitypub::activity;
use chrono::NaiveDateTime;
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl};
use plume_common::activity_pub::{
inbox::{AsActor, AsObject, FromId},
sign::Signer,
Id, IntoId, PUBLIC_VISIBILITY,
PUBLIC_VISIBILITY,
};
#[derive(Clone, Queryable, Identifiable)]
@ -34,18 +39,16 @@ impl Like {
find_by!(likes, find_by_ap_url, ap_url as &str);
find_by!(likes, find_by_user_on_post, user_id as i32, post_id as i32);
pub fn to_activity(&self, conn: &Connection) -> Result<activity::Like> {
let mut act = activity::Like::default();
act.like_props
.set_actor_link(User::get(conn, self.user_id)?.into_id())?;
act.like_props
.set_object_link(Post::get(conn, self.post_id)?.into_id())?;
act.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
act.object_props.set_cc_link_vec(vec![Id::new(
User::get(conn, self.user_id)?.followers_endpoint,
)])?;
act.object_props.set_id_string(self.ap_url.clone())?;
pub fn to_activity(&self, conn: &Connection) -> Result<LikeAct> {
let mut act = LikeAct::new(
User::get(conn, self.user_id)?.ap_url.parse::<IriString>()?,
Post::get(conn, self.post_id)?.ap_url.parse::<IriString>()?,
);
act.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
act.set_many_ccs(vec![User::get(conn, self.user_id)?
.followers_endpoint
.parse::<IriString>()?]);
act.set_id(self.ap_url.parse::<IriString>()?);
Ok(act)
}
@ -67,28 +70,26 @@ impl Like {
Ok(())
}
pub fn build_undo(&self, conn: &Connection) -> Result<activity::Undo> {
let mut act = activity::Undo::default();
act.undo_props
.set_actor_link(User::get(conn, self.user_id)?.into_id())?;
act.undo_props.set_object_object(self.to_activity(conn)?)?;
act.object_props
.set_id_string(format!("{}#delete", self.ap_url))?;
act.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
act.object_props.set_cc_link_vec(vec![Id::new(
User::get(conn, self.user_id)?.followers_endpoint,
)])?;
pub fn build_undo(&self, conn: &Connection) -> Result<Undo> {
let mut act = Undo::new(
User::get(conn, self.user_id)?.ap_url.parse::<IriString>()?,
AnyBase::from_extended(self.to_activity(conn)?)?,
);
act.set_id(format!("{}#delete", self.ap_url).parse::<IriString>()?);
act.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
act.set_many_ccs(vec![User::get(conn, self.user_id)?
.followers_endpoint
.parse::<IriString>()?]);
Ok(act)
}
}
impl AsObject<User, activity::Like, &DbConn> for Post {
impl AsObject<User, LikeAct, &Connection> for Post {
type Error = Error;
type Output = Like;
fn activity(self, conn: &DbConn, actor: User, id: &str) -> Result<Like> {
fn activity(self, conn: &Connection, actor: User, id: &str) -> Result<Like> {
let res = Like::insert(
conn,
NewLike {
@ -104,21 +105,24 @@ impl AsObject<User, activity::Like, &DbConn> for Post {
}
}
impl FromId<DbConn> for Like {
impl FromId<Connection> for Like {
type Error = Error;
type Object = activity::Like;
type Object = LikeAct;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Like::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, act: activity::Like) -> Result<Self> {
fn from_activity(conn: &Connection, act: LikeAct) -> Result<Self> {
let res = Like::insert(
conn,
NewLike {
post_id: Post::from_id(
conn,
&act.like_props.object_link::<Id>()?,
act.object_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.as_str(),
None,
CONFIG.proxy(),
)
@ -126,13 +130,19 @@ impl FromId<DbConn> for Like {
.id,
user_id: User::from_id(
conn,
&act.like_props.actor_link::<Id>()?,
act.actor_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.as_str(),
None,
CONFIG.proxy(),
)
.map_err(|(_, e)| e)?
.id,
ap_url: act.object_props.id_string()?,
ap_url: act
.id_unchecked()
.ok_or(Error::MissingApProperty)?
.to_string(),
},
)?;
res.notify(conn)?;
@ -144,17 +154,17 @@ impl FromId<DbConn> for Like {
}
}
impl AsObject<User, activity::Undo, &DbConn> for Like {
impl AsObject<User, Undo, &Connection> for Like {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<()> {
if actor.id == self.user_id {
diesel::delete(&self).execute(&**conn)?;
diesel::delete(&self).execute(conn)?;
// delete associated notification if any
if let Ok(notif) = Notification::find(conn, notification_kind::LIKE, self.id) {
diesel::delete(&notif).execute(&**conn)?;
diesel::delete(&notif).execute(conn)?;
}
Ok(())
} else {
@ -165,8 +175,7 @@ impl AsObject<User, activity::Undo, &DbConn> for Like {
impl NewLike {
pub fn new(p: &Post, u: &User) -> Self {
// TODO: this URL is not valid
let ap_url = format!("{}/like/{}", u.ap_url, p.ap_url);
let ap_url = format!("{}like/{}", u.ap_url, p.ap_url);
NewLike {
post_id: p.id,
user_id: u.id,
@ -174,3 +183,67 @@ impl NewLike {
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use crate::diesel::Connection;
use crate::{inbox::tests::fill_database, tests::db};
use assert_json_diff::assert_json_eq;
use serde_json::{json, to_value};
#[test]
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (posts, _users, _blogs) = fill_database(&conn);
let post = &posts[0];
let user = &post.get_authors(&conn)?[0];
let like = Like::insert(&conn, NewLike::new(post, user))?;
let act = like.to_activity(&conn).unwrap();
let expected = json!({
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": "https://plu.me/@/admin/like/https://plu.me/~/BlogName/testing",
"object": "https://plu.me/~/BlogName/testing",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Like",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn build_undo() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (posts, _users, _blogs) = fill_database(&conn);
let post = &posts[0];
let user = &post.get_authors(&conn)?[0];
let like = Like::insert(&conn, NewLike::new(post, user))?;
let act = like.build_undo(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": "https://plu.me/@/admin/like/https://plu.me/~/BlogName/testing#delete",
"object": {
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": "https://plu.me/@/admin/like/https://plu.me/~/BlogName/testing",
"object": "https://plu.me/~/BlogName/testing",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Like",
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Undo",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}

View file

@ -297,6 +297,28 @@ impl List {
.map_err(Error::from)
}
pub fn delete(&self, conn: &Connection) -> Result<()> {
if let Some(user_id) = self.user_id {
diesel::delete(
lists::table
.filter(lists::user_id.eq(user_id))
.filter(lists::name.eq(&self.name)),
)
.execute(conn)
.map(|_| ())
.map_err(Error::from)
} else {
diesel::delete(
lists::table
.filter(lists::user_id.is_null())
.filter(lists::name.eq(&self.name)),
)
.execute(conn)
.map(|_| ())
.map_err(Error::from)
}
}
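// Editor's sketch, not part of the diff: removing a personal list by loading
// it first, the same way the tests below look lists up. The list name is a
// placeholder.
#[allow(dead_code)]
fn delete_list_example(conn: &Connection, user: &User) -> Result<()> {
    let list = List::find_for_user_by_name(conn, Some(user.id), "some_list")?;
    list.delete(conn)
}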
func! {set: set_users, User, add_users}
func! {set: set_blogs, Blog, add_blogs}
func! {set: set_words, Word, add_words}
@ -413,7 +435,7 @@ mod tests {
&List::find_for_user_by_name(conn, l1.user_id, &l1.name).unwrap(),
);
l_eq(
&&l1u,
&l1u,
&List::find_for_user_by_name(conn, l1u.user_id, &l1u.name).unwrap(),
);
Ok(())

View file

@ -1,14 +1,13 @@
use crate::{
ap_url, db_conn::DbConn, instance::Instance, safe_string::SafeString, schema::medias,
users::User, Connection, Error, Result, CONFIG,
ap_url, instance::Instance, safe_string::SafeString, schema::medias, users::User, Connection,
Error, Result, CONFIG,
};
use activitypub::object::Image;
use askama_escape::escape;
use activitystreams::{object::Image, prelude::*};
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl};
use guid_create::GUID;
use plume_common::{
activity_pub::{inbox::FromId, request, Id},
utils::MediaProcessor,
activity_pub::{inbox::FromId, request, ToAsString, ToAsUri},
utils::{escape, MediaProcessor},
};
use std::{
fs::{self, DirBuilder},
@ -17,6 +16,9 @@ use std::{
use tracing::warn;
use url::Url;
#[cfg(feature = "s3")]
use crate::config::S3Config;
const REMOTE_MEDIA_DIRECTORY: &str = "remote";
#[derive(Clone, Identifiable, Queryable, AsChangeset)]
@ -43,7 +45,7 @@ pub struct NewMedia {
pub owner_id: i32,
}
#[derive(PartialEq)]
#[derive(PartialEq, Eq)]
pub enum MediaCategory {
Image,
Audio,
@ -106,7 +108,7 @@ impl Media {
.file_path
.rsplit_once('.')
.map(|x| x.1)
.expect("Media::category: extension error")
.unwrap_or("")
.to_lowercase()
{
"png" | "jpg" | "jpeg" | "gif" | "svg" => MediaCategory::Image,
@ -152,26 +154,99 @@ impl Media {
})
}
/// Returns full file path for medias stored in the local media directory.
pub fn local_path(&self) -> Option<PathBuf> {
if self.file_path.is_empty() {
return None;
}
if CONFIG.s3.is_some() {
#[cfg(feature="s3")]
unreachable!("Called Media::local_path() but media are stored on S3");
#[cfg(not(feature="s3"))]
unreachable!();
}
let relative_path = self
.file_path
.trim_start_matches(&CONFIG.media_directory)
.trim_start_matches(path::MAIN_SEPARATOR)
.trim_start_matches("static/media/");
Some(Path::new(&CONFIG.media_directory).join(relative_path))
}
/// Returns the relative URL to access this file, which is also the key at which
/// it is stored in the S3 bucket if we are using S3 storage.
/// Does not start with a '/', it is of the form "static/media/<...>"
pub fn relative_url(&self) -> Option<String> {
if self.file_path.is_empty() {
return None;
}
let relative_path = self
.file_path
.trim_start_matches(&CONFIG.media_directory)
.replace(path::MAIN_SEPARATOR, "/");
let relative_path = relative_path
.trim_start_matches('/')
.trim_start_matches("static/media/");
Some(format!("static/media/{}", relative_path))
}
/// Returns a public URL through which this media file can be accessed
pub fn url(&self) -> Result<String> {
if self.is_remote {
Ok(self.remote_url.clone().unwrap_or_default())
} else {
let file_path = self.file_path.replace(path::MAIN_SEPARATOR, "/").replacen(
&CONFIG.media_directory,
"static/media",
1,
); // "static/media" from plume::routs::plume_media_files()
let relative_url = self.relative_url().unwrap_or_default();
#[cfg(feature="s3")]
if CONFIG.s3.as_ref().map(|x| x.direct_download).unwrap_or(false) {
let s3_url = match CONFIG.s3.as_ref().unwrap() {
S3Config { alias: Some(alias), .. } => {
format!("https://{}/{}", alias, relative_url)
}
S3Config { path_style: true, hostname, bucket, .. } => {
format!("https://{}/{}/{}",
hostname,
bucket,
relative_url
)
}
S3Config { path_style: false, hostname, bucket, .. } => {
format!("https://{}.{}/{}",
bucket,
hostname,
relative_url
)
}
};
return Ok(s3_url);
}
Ok(ap_url(&format!(
"{}/{}",
Instance::get_local()?.public_domain,
&file_path
relative_url
)))
}
}
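// Editor's sketch, not part of the diff: what the branches above produce for a
// media stored at "static/media/example.png" (hostnames and bucket name are
// placeholders):
//   S3_ALIAS_HOST set           -> https://media.example.org/static/media/example.png
//   path-style S3               -> https://s3.example.org/my-bucket/static/media/example.png
//   virtual-hosted-style S3     -> https://my-bucket.s3.example.org/static/media/example.png
//   no S3 / no direct download  -> https://<instance domain>/static/media/example.png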
pub fn delete(&self, conn: &Connection) -> Result<()> {
if !self.is_remote {
fs::remove_file(self.file_path.as_str())?;
if CONFIG.s3.is_some() {
#[cfg(not(feature="s3"))]
unreachable!();
#[cfg(feature = "s3")]
CONFIG.s3.as_ref().unwrap().get_bucket()
.delete_object_blocking(&self.relative_url().ok_or(Error::NotFound)?)?;
} else {
fs::remove_file(self.local_path().ok_or(Error::NotFound)?)?;
}
}
diesel::delete(self)
.execute(conn)
@ -207,36 +282,75 @@ impl Media {
}
// TODO: merge with save_remote?
pub fn from_activity(conn: &DbConn, image: &Image) -> Result<Media> {
pub fn from_activity(conn: &Connection, image: &Image) -> Result<Media> {
let remote_url = image
.object_props
.url_string()
.or(Err(Error::MissingApProperty))?;
let path = determine_mirror_file_path(&remote_url);
let parent = path.parent().ok_or(Error::InvalidValue)?;
if !parent.is_dir() {
DirBuilder::new().recursive(true).create(parent)?;
}
.url()
.and_then(|url| url.to_as_uri())
.ok_or(Error::MissingApProperty)?;
let mut dest = fs::File::create(path.clone())?;
// TODO: conditional GET
request::get(
remote_url.as_str(),
User::get_sender(),
CONFIG.proxy().cloned(),
)?
.copy_to(&mut dest)?;
let file_path = if CONFIG.s3.is_some() {
#[cfg(not(feature="s3"))]
unreachable!();
Media::find_by_file_path(conn, path.to_str().ok_or(Error::InvalidValue)?)
#[cfg(feature = "s3")]
{
use rocket::http::ContentType;
let dest = determine_mirror_s3_path(&remote_url);
let media = request::get(
remote_url.as_str(),
User::get_sender(),
CONFIG.proxy().cloned(),
)?;
let content_type = media
.headers()
.get(reqwest::header::CONTENT_TYPE)
.and_then(|x| x.to_str().ok())
.and_then(ContentType::parse_flexible)
.unwrap_or(ContentType::Binary);
let bytes = media.bytes()?;
let bucket = CONFIG.s3.as_ref().unwrap().get_bucket();
bucket.put_object_with_content_type_blocking(
&dest,
&bytes,
&content_type.to_string()
)?;
dest
}
} else {
let path = determine_mirror_file_path(&remote_url);
let parent = path.parent().ok_or(Error::InvalidValue)?;
if !parent.is_dir() {
DirBuilder::new().recursive(true).create(parent)?;
}
let mut dest = fs::File::create(path.clone())?;
// TODO: conditional GET
request::get(
remote_url.as_str(),
User::get_sender(),
CONFIG.proxy().cloned(),
)?
.copy_to(&mut dest)?;
path.to_str().ok_or(Error::InvalidValue)?.to_string()
};
Media::find_by_file_path(conn, &file_path)
.and_then(|mut media| {
let mut updated = false;
let alt_text = image
.object_props
.content_string()
.or(Err(Error::NotFound))?;
let sensitive = image.object_props.summary_string().is_ok();
let content_warning = image.object_props.summary_string().ok();
.content()
.and_then(|content| content.to_as_string())
.ok_or(Error::NotFound)?;
let summary = image.summary().and_then(|summary| summary.to_as_string());
let sensitive = summary.is_some();
let content_warning = summary;
if media.alt_text != alt_text {
media.alt_text = alt_text;
updated = true;
@ -258,33 +372,30 @@ impl Media {
updated = true;
}
if updated {
diesel::update(&media).set(&media).execute(&**conn)?;
diesel::update(&media).set(&media).execute(conn)?;
}
Ok(media)
})
.or_else(|_| {
let summary = image.summary().and_then(|summary| summary.to_as_string());
Media::insert(
conn,
NewMedia {
file_path: path.to_str().ok_or(Error::InvalidValue)?.to_string(),
file_path,
alt_text: image
.object_props
.content_string()
.or(Err(Error::NotFound))?,
.content()
.and_then(|content| content.to_as_string())
.ok_or(Error::NotFound)?,
is_remote: false,
remote_url: None,
sensitive: image.object_props.summary_string().is_ok(),
content_warning: image.object_props.summary_string().ok(),
sensitive: summary.is_some(),
content_warning: summary,
owner_id: User::from_id(
conn,
image
.object_props
.attributed_to_link_vec::<Id>()
.or(Err(Error::NotFound))?
.into_iter()
.next()
.ok_or(Error::NotFound)?
.as_ref(),
&image
.attributed_to()
.and_then(|attributed_to| attributed_to.to_as_uri())
.ok_or(Error::MissingApProperty)?,
None,
CONFIG.proxy(),
)
@ -310,12 +421,10 @@ impl Media {
}
fn determine_mirror_file_path(url: &str) -> PathBuf {
let mut file_path = Path::new(&super::CONFIG.media_directory).join(REMOTE_MEDIA_DIRECTORY);
Url::parse(url)
.map(|url| {
if !url.has_host() {
return;
}
let mut file_path = Path::new(&CONFIG.media_directory).join(REMOTE_MEDIA_DIRECTORY);
match Url::parse(url) {
Ok(url) if url.has_host() => {
file_path.push(url.host_str().unwrap());
for segment in url.path_segments().expect("FIXME") {
file_path.push(segment);
@ -323,19 +432,54 @@ fn determine_mirror_file_path(url: &str) -> PathBuf {
// TODO: handle query
// HINT: Use characters which must be percent-encoded in path as separator between path and query
// HINT: handle extension
})
.unwrap_or_else(|err| {
warn!("Failed to parse url: {} {}", &url, err);
}
other => {
if let Err(err) = other {
warn!("Failed to parse url: {} {}", &url, err);
} else {
warn!("Error without a host: {}", &url);
}
let ext = url
.rsplit('.')
.next()
.map(ToOwned::to_owned)
.unwrap_or_else(|| String::from("png"));
file_path.push(format!("{}.{}", GUID::rand(), ext));
});
}
}
file_path
}
#[cfg(feature="s3")]
fn determine_mirror_s3_path(url: &str) -> String {
match Url::parse(url) {
Ok(url) if url.has_host() => {
format!("static/media/{}/{}/{}",
REMOTE_MEDIA_DIRECTORY,
url.host_str().unwrap(),
url.path().trim_start_matches('/'),
)
}
other => {
if let Err(err) = other {
warn!("Failed to parse url: {} {}", &url, err);
} else {
warn!("Error without a host: {}", &url);
}
let ext = url
.rsplit('.')
.next()
.map(ToOwned::to_owned)
.unwrap_or_else(|| String::from("png"));
format!("static/media/{}/{}.{}",
REMOTE_MEDIA_DIRECTORY,
GUID::rand(),
ext,
)
}
}
}
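// Editor's sketch, not part of the diff: for a remote URL such as
// "https://other.example/media/pic.jpg" the function above yields the key
// "static/media/remote/other.example/media/pic.jpg"; URLs that cannot be
// parsed or have no host fall back to "static/media/remote/<random-guid>.<ext>".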
#[cfg(test)]
pub(crate) mod tests {
use super::*;
@ -346,7 +490,7 @@ pub(crate) mod tests {
use std::path::Path;
pub(crate) fn fill_database(conn: &Conn) -> (Vec<User>, Vec<Media>) {
let mut wd = current_dir().unwrap().to_path_buf();
let mut wd = current_dir().unwrap();
while wd.pop() {
if wd.join(".git").exists() {
set_current_dir(wd).unwrap();
@ -401,7 +545,15 @@ pub(crate) mod tests {
pub(crate) fn clean(conn: &Conn) {
//used to remove files generated by tests
for media in Media::list_all_medias(conn).unwrap() {
media.delete(conn).unwrap();
if let Some(err) = media.delete(conn).err() {
match &err {
Error::Io(e) => match e.kind() {
std::io::ErrorKind::NotFound => (),
_ => panic!("{:?}", err),
},
_ => panic!("{:?}", err),
}
}
}
}
@ -451,7 +603,7 @@ pub(crate) mod tests {
let media = Media::insert(
conn,
NewMedia {
file_path: path.clone(),
file_path: path,
alt_text: "alt message".to_owned(),
is_remote: false,
remote_url: None,

View file

@ -1,8 +1,12 @@
use crate::{
comments::Comment, db_conn::DbConn, notifications::*, posts::Post, schema::mentions,
users::User, Connection, Error, Result,
comments::Comment, notifications::*, posts::Post, schema::mentions, users::User, Connection,
Error, Result,
};
use activitystreams::{
base::BaseExt,
iri_string::types::IriString,
link::{self, LinkExt},
};
use activitypub::link;
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl};
use plume_common::activity_pub::inbox::AsActor;
@ -56,21 +60,19 @@ impl Mention {
}
}
pub fn build_activity(conn: &DbConn, ment: &str) -> Result<link::Mention> {
pub fn build_activity(conn: &Connection, ment: &str) -> Result<link::Mention> {
let user = User::find_by_fqn(conn, ment)?;
let mut mention = link::Mention::default();
mention.link_props.set_href_string(user.ap_url)?;
mention.link_props.set_name_string(format!("@{}", ment))?;
let mut mention = link::Mention::new();
mention.set_href(user.ap_url.parse::<IriString>()?);
mention.set_name(format!("@{}", ment));
Ok(mention)
}
pub fn to_activity(&self, conn: &Connection) -> Result<link::Mention> {
let user = self.get_mentioned(conn)?;
let mut mention = link::Mention::default();
mention.link_props.set_href_string(user.ap_url.clone())?;
mention
.link_props
.set_name_string(format!("@{}", user.fqn))?;
let mut mention = link::Mention::new();
mention.set_href(user.ap_url.parse::<IriString>()?);
mention.set_name(format!("@{}", user.fqn));
Ok(mention)
}
@ -81,8 +83,8 @@ impl Mention {
in_post: bool,
notify: bool,
) -> Result<Self> {
let ap_url = ment.link_props.href_string().or(Err(Error::NotFound))?;
let mentioned = User::find_by_ap_url(conn, &ap_url)?;
let ap_url = ment.href().ok_or(Error::NotFound)?.as_str();
let mentioned = User::find_by_ap_url(conn, ap_url)?;
if in_post {
Post::get(conn, inside).and_then(|post| {
@ -145,3 +147,62 @@ impl Mention {
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use crate::{inbox::tests::fill_database, tests::db, Error};
use assert_json_diff::assert_json_eq;
use diesel::Connection;
use serde_json::{json, to_value};
#[test]
fn build_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (_posts, users, _blogs) = fill_database(&conn);
let user = &users[0];
let name = &user.username;
let act = Mention::build_activity(&conn, name)?;
let expected = json!({
"href": "https://plu.me/@/admin/",
"name": "@admin",
"type": "Mention",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (posts, users, _blogs) = fill_database(&conn);
let post = &posts[0];
let user = &users[0];
let mention = Mention::insert(
&conn,
NewMention {
mentioned_id: user.id,
post_id: Some(post.id),
comment_id: None,
},
)?;
let act = mention.to_activity(&conn)?;
let expected = json!({
"href": "https://plu.me/@/admin/",
"name": "@admin",
"type": "Mention",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}

View file

@ -89,7 +89,7 @@ mod tests {
let request = PasswordResetRequest::find_by_token(&conn, &token)
.expect("couldn't retrieve request");
assert!(&token.len() > &32);
assert!(token.len() > 32);
assert_eq!(&request.email, &admin_email);
Ok(())
@ -103,8 +103,8 @@ mod tests {
user_tests::fill_database(&conn);
let admin_email = "admin@example.com";
PasswordResetRequest::insert(&conn, &admin_email).expect("couldn't insert new request");
PasswordResetRequest::insert(&conn, &admin_email)
PasswordResetRequest::insert(&conn, admin_email).expect("couldn't insert new request");
PasswordResetRequest::insert(&conn, admin_email)
.expect("couldn't insert second request");
let count = password_reset_requests::table.count().get_result(&*conn);
@ -132,7 +132,7 @@ mod tests {
.execute(&*conn)
.expect("could not insert request");
match PasswordResetRequest::find_by_token(&conn, &token) {
match PasswordResetRequest::find_by_token(&conn, token) {
Err(Error::Expired) => (),
_ => panic!("Received unexpected result finding expired token"),
}
@ -148,7 +148,7 @@ mod tests {
user_tests::fill_database(&conn);
let admin_email = "admin@example.com";
let token = PasswordResetRequest::insert(&conn, &admin_email)
let token = PasswordResetRequest::insert(&conn, admin_email)
.expect("couldn't insert new request");
PasswordResetRequest::find_and_delete_by_token(&conn, &token)
.expect("couldn't find and delete request");

View file

@ -1,22 +1,26 @@
use crate::{
ap_url, blogs::Blog, db_conn::DbConn, instance::Instance, medias::Media, mentions::Mention,
post_authors::*, safe_string::SafeString, schema::posts, tags::*, timeline::*, users::User,
Connection, Error, PostEvent::*, Result, CONFIG, POST_CHAN,
ap_url, blogs::Blog, instance::Instance, medias::Media, mentions::Mention, post_authors::*,
safe_string::SafeString, schema::posts, tags::*, timeline::*, users::User, Connection, Error,
PostEvent::*, Result, CONFIG, POST_CHAN,
};
use activitypub::{
use activitystreams::{
activity::{Create, Delete, Update},
link,
object::{Article, Image, Tombstone},
CustomObject,
base::{AnyBase, Base},
iri_string::types::IriString,
link::{self, kind::MentionType},
object::{kind::ImageType, ApObject, Article, AsApObject, Image, ObjectExt, Tombstone},
prelude::*,
time::OffsetDateTime,
};
use chrono::{NaiveDateTime, TimeZone, Utc};
use diesel::{self, BelongingToDsl, ExpressionMethods, QueryDsl, RunQueryDsl, SaveChangesDsl};
use chrono::{NaiveDateTime, Utc};
use diesel::{self, BelongingToDsl, ExpressionMethods, QueryDsl, RunQueryDsl};
use once_cell::sync::Lazy;
use plume_common::{
activity_pub::{
inbox::{AsActor, AsObject, FromId},
sign::Signer,
Hashtag, Id, IntoId, Licensed, Source, PUBLIC_VISIBILITY,
Hashtag, HashtagType, Id, IntoId, Licensed, LicensedArticle, ToAsString, ToAsUri,
PUBLIC_VISIBILITY,
},
utils::{iri_percent_encode_seg, md_to_html},
};
@ -24,8 +28,6 @@ use riker::actors::{Publish, Tell};
use std::collections::{HashMap, HashSet};
use std::sync::{Arc, Mutex};
pub type LicensedArticle = CustomObject<Licensed, Article>;
static BLOG_FQN_CACHE: Lazy<Mutex<HashMap<i32, String>>> = Lazy::new(|| Mutex::new(HashMap::new()));
#[derive(Queryable, Identifiable, Clone, AsChangeset, Debug)]
@ -67,15 +69,15 @@ impl Post {
find_by!(posts, find_by_ap_url, ap_url as &str);
last!(posts);
pub fn insert(conn: &Connection, new: NewPost) -> Result<Self> {
pub fn insert(conn: &Connection, mut new: NewPost) -> Result<Self> {
if new.ap_url.is_empty() {
let blog = Blog::get(conn, new.blog_id)?;
new.ap_url = Self::ap_url(blog, &new.slug);
}
diesel::insert_into(posts::table)
.values(new)
.execute(conn)?;
let mut post = Self::last(conn)?;
if post.ap_url.is_empty() {
post.ap_url = Self::ap_url(post.get_blog(conn)?, &post.slug);
let _: Post = post.save_changes(conn)?;
}
let post = Self::last(conn)?;
if post.published {
post.publish_published();
@ -132,7 +134,7 @@ impl Post {
.filter(posts::published.eq(true))
.count()
.load(conn)?
.get(0)
.first()
.cloned()
.ok_or(Error::NotFound)
}
@ -253,7 +255,7 @@ impl Post {
ap_url(&format!(
"{}/~/{}/{}/",
CONFIG.base_url,
blog.fqn,
iri_percent_encode_seg(&blog.fqn),
iri_percent_encode_seg(slug)
))
}
@ -353,92 +355,92 @@ impl Post {
.collect::<Vec<serde_json::Value>>();
mentions_json.append(&mut tags_json);
let mut article = Article::default();
article.object_props.set_name_string(self.title.clone())?;
article.object_props.set_id_string(self.ap_url.clone())?;
let mut article = ApObject::new(Article::new());
article.set_name(self.title.clone());
article.set_id(self.ap_url.parse::<IriString>()?);
let mut authors = self
.get_authors(conn)?
.into_iter()
.map(|x| Id::new(x.ap_url))
.collect::<Vec<Id>>();
authors.push(self.get_blog(conn)?.into_id()); // add the blog URL here too
article
.object_props
.set_attributed_to_link_vec::<Id>(authors)?;
article
.object_props
.set_content_string(self.content.get().clone())?;
article.ap_object_props.set_source_object(Source {
content: self.source.clone(),
media_type: String::from("text/markdown"),
})?;
article
.object_props
.set_published_utctime(Utc.from_utc_datetime(&self.creation_date))?;
article
.object_props
.set_summary_string(self.subtitle.clone())?;
article.object_props.tag = Some(json!(mentions_json));
.filter_map(|x| x.ap_url.parse::<IriString>().ok())
.collect::<Vec<IriString>>();
authors.push(self.get_blog(conn)?.ap_url.parse::<IriString>()?); // add the blog URL here too
article.set_many_attributed_tos(authors);
article.set_content(self.content.get().clone());
let source = AnyBase::from_arbitrary_json(serde_json::json!({
"content": self.source,
"mediaType": "text/markdown",
}))?;
article.set_source(source);
article.set_published(
OffsetDateTime::from_unix_timestamp_nanos(self.creation_date.timestamp_nanos().into())
.expect("OffsetDateTime"),
);
article.set_summary(&*self.subtitle);
article.set_many_tags(
mentions_json
.iter()
.filter_map(|mention_json| AnyBase::from_arbitrary_json(mention_json).ok()),
);
if let Some(media_id) = self.cover_id {
let media = Media::get(conn, media_id)?;
let mut cover = Image::default();
cover.object_props.set_url_string(media.url()?)?;
let mut cover = Image::new();
cover.set_url(media.url()?);
if media.sensitive {
cover
.object_props
.set_summary_string(media.content_warning.unwrap_or_default())?;
cover.set_summary(media.content_warning.unwrap_or_default());
}
cover.object_props.set_content_string(media.alt_text)?;
cover
.object_props
.set_attributed_to_link_vec(vec![User::get(conn, media.owner_id)?.into_id()])?;
article.object_props.set_icon_object(cover)?;
cover.set_content(media.alt_text);
cover.set_many_attributed_tos(vec![User::get(conn, media.owner_id)?
.ap_url
.parse::<IriString>()?]);
article.set_icon(cover.into_any_base()?);
}
article.object_props.set_url_string(self.ap_url.clone())?;
article
.object_props
.set_to_link_vec::<Id>(to.into_iter().map(Id::new).collect())?;
article
.object_props
.set_cc_link_vec::<Id>(cc.into_iter().map(Id::new).collect())?;
let mut license = Licensed::default();
license.set_license_string(self.license.clone())?;
article.set_url(self.ap_url.parse::<IriString>()?);
article.set_many_tos(
to.into_iter()
.filter_map(|to| to.parse::<IriString>().ok())
.collect::<Vec<IriString>>(),
);
article.set_many_ccs(
cc.into_iter()
.filter_map(|cc| cc.parse::<IriString>().ok())
.collect::<Vec<IriString>>(),
);
let license = Licensed {
license: Some(self.license.clone()),
};
Ok(LicensedArticle::new(article, license))
}
pub fn create_activity(&self, conn: &Connection) -> Result<Create> {
let article = self.to_activity(conn)?;
let mut act = Create::default();
act.object_props
.set_id_string(format!("{}activity", self.ap_url))?;
act.object_props
.set_to_link_vec::<Id>(article.object.object_props.to_link_vec()?)?;
act.object_props
.set_cc_link_vec::<Id>(article.object.object_props.cc_link_vec()?)?;
act.create_props
.set_actor_link(Id::new(self.get_authors(conn)?[0].clone().ap_url))?;
act.create_props.set_object_object(article)?;
let to = article.to().ok_or(Error::MissingApProperty)?.clone();
let cc = article.cc().ok_or(Error::MissingApProperty)?.clone();
let mut act = Create::new(
self.get_authors(conn)?[0].ap_url.parse::<IriString>()?,
Base::retract(article)?.into_generic()?,
);
act.set_id(format!("{}/activity", self.ap_url).parse::<IriString>()?);
act.set_many_tos(to);
act.set_many_ccs(cc);
Ok(act)
}
pub fn update_activity(&self, conn: &Connection) -> Result<Update> {
let article = self.to_activity(conn)?;
let mut act = Update::default();
act.object_props.set_id_string(format!(
"{}/update-{}",
self.ap_url,
Utc::now().timestamp()
))?;
act.object_props
.set_to_link_vec::<Id>(article.object.object_props.to_link_vec()?)?;
act.object_props
.set_cc_link_vec::<Id>(article.object.object_props.cc_link_vec()?)?;
act.update_props
.set_actor_link(Id::new(self.get_authors(conn)?[0].clone().ap_url))?;
act.update_props.set_object_object(article)?;
let to = article.to().ok_or(Error::MissingApProperty)?.clone();
let cc = article.cc().ok_or(Error::MissingApProperty)?.clone();
let mut act = Update::new(
self.get_authors(conn)?[0].ap_url.parse::<IriString>()?,
Base::retract(article)?.into_generic()?,
);
act.set_id(
format!("{}/update-{}", self.ap_url, Utc::now().timestamp()).parse::<IriString>()?,
);
act.set_many_tos(to);
act.set_many_ccs(cc);
Ok(act)
}
@ -447,10 +449,8 @@ impl Post {
.into_iter()
.map(|m| {
(
m.link_props
.href_string()
.ok()
.and_then(|ap_url| User::find_by_ap_url(conn, &ap_url).ok())
m.href()
.and_then(|ap_url| User::find_by_ap_url(conn, ap_url.as_ref()).ok())
.map(|u| u.id),
m,
)
@ -465,7 +465,7 @@ impl Post {
.collect::<HashSet<_>>();
for (m, id) in &mentions {
if !old_user_mentioned.contains(id) {
Mention::from_activity(&*conn, m, self.id, true, true)?;
Mention::from_activity(conn, m, self.id, true, true)?;
}
}
@ -485,10 +485,10 @@ impl Post {
pub fn update_tags(&self, conn: &Connection, tags: Vec<Hashtag>) -> Result<()> {
let tags_name = tags
.iter()
.filter_map(|t| t.name_string().ok())
.filter_map(|t| t.name.as_ref().map(|name| name.as_str().to_string()))
.collect::<HashSet<_>>();
let old_tags = Tag::for_post(&*conn, self.id)?;
let old_tags = Tag::for_post(conn, self.id)?;
let old_tags_name = old_tags
.iter()
.filter_map(|tag| {
@ -502,8 +502,9 @@ impl Post {
for t in tags {
if !t
.name_string()
.map(|n| old_tags_name.contains(&n))
.name
.as_ref()
.map(|n| old_tags_name.contains(n.as_str()))
.unwrap_or(true)
{
Tag::from_activity(conn, &t, self.id, false)?;
@ -521,10 +522,10 @@ impl Post {
pub fn update_hashtags(&self, conn: &Connection, tags: Vec<Hashtag>) -> Result<()> {
let tags_name = tags
.iter()
.filter_map(|t| t.name_string().ok())
.filter_map(|t| t.name.as_ref().map(|name| name.as_str().to_string()))
.collect::<HashSet<_>>();
let old_tags = Tag::for_post(&*conn, self.id)?;
let old_tags = Tag::for_post(conn, self.id)?;
let old_tags_name = old_tags
.iter()
.filter_map(|tag| {
@ -538,8 +539,9 @@ impl Post {
for t in tags {
if !t
.name_string()
.map(|n| old_tags_name.contains(&n))
.name
.as_ref()
.map(|n| old_tags_name.contains(n.as_str()))
.unwrap_or(true)
{
Tag::from_activity(conn, &t, self.id, true)?;
@ -566,18 +568,19 @@ impl Post {
}
pub fn build_delete(&self, conn: &Connection) -> Result<Delete> {
let mut act = Delete::default();
act.delete_props
.set_actor_link(self.get_authors(conn)?[0].clone().into_id())?;
let mut tombstone = Tombstone::new();
tombstone.set_id(self.ap_url.parse()?);
let mut tombstone = Tombstone::default();
tombstone.object_props.set_id_string(self.ap_url.clone())?;
act.delete_props.set_object_object(tombstone)?;
let mut act = Delete::new(
self.get_authors(conn)?[0]
.clone()
.into_id()
.parse::<IriString>()?,
Base::retract(tombstone)?.into_generic()?,
);
act.object_props
.set_id_string(format!("{}#delete", self.ap_url))?;
act.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY)])?;
act.set_id(format!("{}#delete", self.ap_url).parse()?);
act.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
Ok(act)
}
@ -612,56 +615,91 @@ impl Post {
}
}
impl FromId<DbConn> for Post {
impl FromId<Connection> for Post {
type Error = Error;
type Object = LicensedArticle;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Self::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, article: LicensedArticle) -> Result<Self> {
let conn = conn;
let license = article.custom_props.license_string().unwrap_or_default();
let article = article.object;
fn from_activity(conn: &Connection, article: LicensedArticle) -> Result<Self> {
let license = article.ext_one.license.unwrap_or_default();
let article = article.inner;
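        // attributedTo mixes blog and authors: IRIs that resolve as Users are collected
        // as authors, and the first one that instead resolves as a Blog becomes the blog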
let (blog, authors) = article
.object_props
.attributed_to_link_vec::<Id>()?
.into_iter()
.ap_object_ref()
.attributed_to()
.ok_or(Error::MissingApProperty)?
.iter()
.fold((None, vec![]), |(blog, mut authors), link| {
let url = link;
match User::from_id(conn, &url, None, CONFIG.proxy()) {
Ok(u) => {
authors.push(u);
(blog, authors)
if let Some(url) = link.id() {
match User::from_id(conn, url.as_str(), None, CONFIG.proxy()) {
Ok(u) => {
authors.push(u);
(blog, authors)
}
Err(_) => (
blog.or_else(|| {
Blog::from_id(conn, url.as_str(), None, CONFIG.proxy()).ok()
}),
authors,
),
}
Err(_) => (
blog.or_else(|| Blog::from_id(conn, &url, None, CONFIG.proxy()).ok()),
authors,
),
} else {
                        // logically, the URL could be an object without an id property, like {"type":"Person", "name":"Sally"}, but we ignore that case
(blog, authors)
}
});
let cover = article
.object_props
.icon_object::<Image>()
.ok()
.and_then(|img| Media::from_activity(conn, &img).ok().map(|m| m.id));
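        // the cover is the first icon, if it can be interpreted as an Image and saved as local Media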
let cover = article.icon().and_then(|icon| {
icon.iter().next().and_then(|img| {
let image = img.to_owned().extend::<Image, ImageType>().ok()??;
Media::from_activity(conn, &image).ok().map(|m| m.id)
})
});
let title = article.object_props.name_string()?;
let title = article
.name()
.and_then(|name| name.to_as_string())
.ok_or(Error::MissingApProperty)?;
let id = AnyBase::from_extended(article.clone()) // FIXME: Don't clone
.ok()
.ok_or(Error::MissingApProperty)?
.id()
.map(|id| id.to_string());
let ap_url = article
.object_props
.url_string()
.or_else(|_| article.object_props.id_string())?;
.url()
.and_then(|url| url.to_as_uri().or(id))
.ok_or(Error::MissingApProperty)?;
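        // pull the Markdown source out of the AP `source` object's `content` field
        // by round-tripping through serde_json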
let source = article
.source()
.and_then(|s| {
serde_json::to_value(s).ok().and_then(|obj| {
if !obj.is_object() {
return None;
}
obj.get("content")
.and_then(|content| content.as_str().map(|c| c.to_string()))
})
})
.unwrap_or_default();
let post = Post::from_db(conn, &ap_url)
.and_then(|mut post| {
let mut updated = false;
let slug = Self::slug(&title);
let content = SafeString::new(&article.object_props.content_string()?);
let subtitle = article.object_props.summary_string()?;
let source = article.ap_object_props.source_object::<Source>()?.content;
let content = SafeString::new(
&article
.content()
.and_then(|content| content.to_as_string())
.ok_or(Error::MissingApProperty)?,
);
let subtitle = article
.summary()
.and_then(|summary| summary.to_as_string())
.ok_or(Error::MissingApProperty)?;
if post.slug != slug {
post.slug = slug.to_string();
updated = true;
@ -683,7 +721,7 @@ impl FromId<DbConn> for Post {
updated = true;
}
if post.source != source {
post.source = source;
post.source = source.clone();
updated = true;
}
if post.cover_id != cover {
@ -704,14 +742,31 @@ impl FromId<DbConn> for Post {
blog_id: blog.ok_or(Error::NotFound)?.id,
slug: Self::slug(&title).to_string(),
title,
content: SafeString::new(&article.object_props.content_string()?),
content: SafeString::new(
&article
.content()
.and_then(|content| content.to_as_string())
.ok_or(Error::MissingApProperty)?,
),
published: true,
license,
// FIXME: This is wrong: with this logic, we may use the display URL as the AP ID. We need two different fields
ap_url,
creation_date: Some(article.object_props.published_utctime()?.naive_utc()),
subtitle: article.object_props.summary_string()?,
source: article.ap_object_props.source_object::<Source>()?.content,
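                    // split `published` into whole seconds and the sub-second nanosecond
                    // remainder expected by NaiveDateTime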
creation_date: article.published().map(|published| {
let timestamp_secs = published.unix_timestamp();
let timestamp_nanos = published.unix_timestamp_nanos()
- (timestamp_secs as i128) * 1000i128 * 1000i128 * 1000i128;
NaiveDateTime::from_timestamp_opt(
timestamp_secs,
timestamp_nanos as u32,
)
.unwrap()
}),
subtitle: article
.summary()
.and_then(|summary| summary.to_as_string())
.ok_or(Error::MissingApProperty)?,
source,
cover_id: cover,
},
)
@ -735,22 +790,22 @@ impl FromId<DbConn> for Post {
.2
.into_iter()
.collect::<HashSet<_>>();
if let Some(serde_json::Value::Array(tags)) = article.object_props.tag {
for tag in tags {
serde_json::from_value::<link::Mention>(tag.clone())
.map(|m| Mention::from_activity(conn, &m, post.id, true, true))
if let Some(tags) = article.tag() {
for tag in tags.iter() {
tag.clone()
.extend::<link::Mention, MentionType>() // FIXME: Don't clone
.map(|mention| {
mention.map(|m| Mention::from_activity(conn, &m, post.id, true, true))
})
.ok();
serde_json::from_value::<Hashtag>(tag.clone())
.map_err(Error::from)
.and_then(|t| {
let tag_name = t.name_string()?;
Ok(Tag::from_activity(
conn,
&t,
post.id,
hashtags.remove(&tag_name),
))
tag.clone()
.extend::<Hashtag, HashtagType>() // FIXME: Don't clone
.map(|hashtag| {
hashtag.and_then(|t| {
let tag_name = t.name.clone()?.as_str().to_string();
Tag::from_activity(conn, &t, post.id, hashtags.remove(&tag_name)).ok()
})
})
.ok();
}
@ -762,25 +817,25 @@ impl FromId<DbConn> for Post {
}
fn get_sender() -> &'static dyn Signer {
Instance::get_local_instance_user().expect("Failed to local instance user")
Instance::get_local_instance_user().expect("Failed to get local instance user")
}
}
impl AsObject<User, Create, &DbConn> for Post {
impl AsObject<User, Create, &Connection> for Post {
type Error = Error;
type Output = Post;
type Output = Self;
fn activity(self, _conn: &DbConn, _actor: User, _id: &str) -> Result<Post> {
fn activity(self, _conn: &Connection, _actor: User, _id: &str) -> Result<Self::Output> {
        // TODO: check that _actor is actually one of the authors?
Ok(self)
}
}
impl AsObject<User, Delete, &DbConn> for Post {
impl AsObject<User, Delete, &Connection> for Post {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<Self::Output> {
let can_delete = self
.get_authors(conn)?
.into_iter()
@ -804,36 +859,63 @@ pub struct PostUpdate {
pub tags: Option<serde_json::Value>,
}
impl FromId<DbConn> for PostUpdate {
impl FromId<Connection> for PostUpdate {
type Error = Error;
type Object = LicensedArticle;
fn from_db(_: &DbConn, _: &str) -> Result<Self> {
fn from_db(_: &Connection, _: &str) -> Result<Self> {
// Always fail because we always want to deserialize the AP object
Err(Error::NotFound)
}
fn from_activity(conn: &DbConn, updated: LicensedArticle) -> Result<Self> {
Ok(PostUpdate {
ap_url: updated.object.object_props.id_string()?,
title: updated.object.object_props.name_string().ok(),
subtitle: updated.object.object_props.summary_string().ok(),
content: updated.object.object_props.content_string().ok(),
cover: updated
.object
.object_props
.icon_object::<Image>()
.ok()
.and_then(|img| Media::from_activity(conn, &img).ok().map(|m| m.id)),
source: updated
.object
.ap_object_props
.source_object::<Source>()
.ok()
.map(|x| x.content),
license: updated.custom_props.license_string().ok(),
tags: updated.object.object_props.tag,
})
fn from_activity(conn: &Connection, updated: Self::Object) -> Result<Self> {
let mut post_update = PostUpdate {
ap_url: updated
.ap_object_ref()
.id_unchecked()
.ok_or(Error::MissingApProperty)?
.to_string(),
title: updated
.ap_object_ref()
.name()
.and_then(|name| name.to_as_string()),
subtitle: updated
.ap_object_ref()
.summary()
.and_then(|summary| summary.to_as_string()),
content: updated
.ap_object_ref()
.content()
.and_then(|content| content.to_as_string()),
cover: None,
source: updated.source().and_then(|s| {
serde_json::to_value(s).ok().and_then(|obj| {
if !obj.is_object() {
return None;
}
obj.get("content")
.and_then(|content| content.as_str().map(|c| c.to_string()))
})
}),
license: None,
tags: updated
.tag()
.and_then(|tags| serde_json::to_value(tags).ok()),
};
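        // cover and license are filled in afterwards: the first icon is turned into an
        // Image and stored as local Media, and the license is read from the `license` extension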
post_update.cover = updated.ap_object_ref().icon().and_then(|img| {
img.iter()
.next()
.and_then(|img| {
img.clone()
.extend::<Image, ImageType>()
.map(|img| img.and_then(|img| Media::from_activity(conn, &img).ok()))
.ok()
})
.and_then(|m| m.map(|m| m.id))
});
post_update.license = updated.ext_one.license;
Ok(post_update)
}
fn get_sender() -> &'static dyn Signer {
@ -841,11 +923,11 @@ impl FromId<DbConn> for PostUpdate {
}
}
impl AsObject<User, Update, &DbConn> for PostUpdate {
impl AsObject<User, Update, &Connection> for PostUpdate {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<()> {
let mut post =
Post::from_id(conn, &self.ap_url, None, CONFIG.proxy()).map_err(|(_, e)| e)?;
@ -893,8 +975,12 @@ impl AsObject<User, Update, &DbConn> for PostUpdate {
serde_json::from_value::<Hashtag>(tag.clone())
.map_err(Error::from)
.and_then(|t| {
let tag_name = t.name_string()?;
if txt_hashtags.remove(&tag_name) {
let tag_name = t.name.as_ref().ok_or(Error::MissingApProperty)?;
let tag_name_str = tag_name
.as_xsd_string()
.or_else(|| tag_name.as_rdf_lang_string().map(|rls| &*rls.value))
.ok_or(Error::MissingApProperty)?;
if txt_hashtags.remove(tag_name_str) {
hashtags.push(t);
} else {
tags.push(t);
@ -941,10 +1027,30 @@ impl From<PostEvent> for Arc<Post> {
#[cfg(test)]
mod tests {
use super::*;
use crate::db_conn::DbConn;
use crate::inbox::{inbox, tests::fill_database, InboxResult};
use crate::mentions::{Mention, NewMention};
use crate::safe_string::SafeString;
use crate::tests::db;
use crate::tests::{db, format_datetime};
use assert_json_diff::assert_json_eq;
use diesel::Connection;
use serde_json::{json, to_value};
fn prepare_activity(conn: &DbConn) -> (Post, Mention, Vec<Post>, Vec<User>, Vec<Blog>) {
let (posts, users, blogs) = fill_database(conn);
let post = &posts[0];
let mentioned = &users[1];
let mention = Mention::insert(
conn,
NewMention {
mentioned_id: mentioned.id,
post_id: Some(post.id),
comment_id: None,
},
)
.unwrap();
(post.to_owned(), mention, posts, users, blogs)
}
    // creates a post, gets its Create activity, deletes the post,
    // "sends" the Create to the inbox, and checks that it works
@ -952,9 +1058,9 @@ mod tests {
fn self_federation() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (_, users, blogs) = fill_database(&conn);
let (_, users, blogs) = fill_database(conn);
let post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "yo".into(),
@ -971,19 +1077,19 @@ mod tests {
)
.unwrap();
PostAuthor::insert(
&conn,
conn,
NewPostAuthor {
post_id: post.id,
author_id: users[0].id,
},
)
.unwrap();
let create = post.create_activity(&conn).unwrap();
post.delete(&conn).unwrap();
let create = post.create_activity(conn).unwrap();
post.delete(conn).unwrap();
match inbox(&conn, serde_json::to_value(create).unwrap()).unwrap() {
match inbox(conn, serde_json::to_value(create).unwrap()).unwrap() {
InboxResult::Post(p) => {
assert!(p.is_author(&conn, users[0].id).unwrap());
assert!(p.is_author(conn, users[0].id).unwrap());
assert_eq!(p.source, "Hello".to_owned());
assert_eq!(p.blog_id, blogs[0].id);
assert_eq!(p.content, SafeString::new("Hello"));
@ -997,45 +1103,177 @@ mod tests {
}
#[test]
fn licensed_article_serde() {
let mut article = Article::default();
article.object_props.set_id_string("Yo".into()).unwrap();
let mut license = Licensed::default();
license.set_license_string("WTFPL".into()).unwrap();
let full_article = LicensedArticle::new(article, license);
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (post, _mention, _posts, _users, _blogs) = prepare_activity(&conn);
let act = post.to_activity(&conn)?;
let json = serde_json::to_value(full_article).unwrap();
let article_from_json: LicensedArticle = serde_json::from_value(json).unwrap();
assert_eq!(
"Yo",
&article_from_json.object.object_props.id_string().unwrap()
);
assert_eq!(
"WTFPL",
&article_from_json.custom_props.license_string().unwrap()
);
let expected = json!({
"attributedTo": ["https://plu.me/@/admin/", "https://plu.me/~/BlogName/"],
"cc": [],
"content": "Hello",
"id": "https://plu.me/~/BlogName/testing",
"license": "WTFPL",
"name": "Testing",
"published": format_datetime(&post.creation_date),
"source": {
"content": "Hello",
"mediaType": "text/markdown"
},
"summary": "Bye",
"tag": [
{
"href": "https://plu.me/@/user/",
"name": "@user",
"type": "Mention"
}
],
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Article",
"url": "https://plu.me/~/BlogName/testing"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn licensed_article_deserialization() {
let json = json!({
"type": "Article",
"id": "https://plu.me/~/Blog/my-article",
"attributedTo": ["https://plu.me/@/Admin", "https://plu.me/~/Blog"],
"content": "Hello.",
"name": "My Article",
"summary": "Bye.",
"source": {
"content": "Hello.",
"mediaType": "text/markdown"
},
"published": "2014-12-12T12:12:12Z",
"to": [plume_common::activity_pub::PUBLIC_VISIBILITY]
fn create_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (post, _mention, _posts, _users, _blogs) = prepare_activity(&conn);
let act = post.create_activity(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/admin/",
"cc": [],
"id": "https://plu.me/~/BlogName/testing/activity",
"object": {
"attributedTo": ["https://plu.me/@/admin/", "https://plu.me/~/BlogName/"],
"cc": [],
"content": "Hello",
"id": "https://plu.me/~/BlogName/testing",
"license": "WTFPL",
"name": "Testing",
"published": format_datetime(&post.creation_date),
"source": {
"content": "Hello",
"mediaType": "text/markdown"
},
"summary": "Bye",
"tag": [
{
"href": "https://plu.me/@/user/",
"name": "@user",
"type": "Mention"
}
],
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Article",
"url": "https://plu.me/~/BlogName/testing"
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Create"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn update_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (post, _mention, _posts, _users, _blogs) = prepare_activity(&conn);
let act = post.update_activity(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/admin/",
"cc": [],
"id": "https://plu.me/~/BlogName/testing/update-",
"object": {
"attributedTo": ["https://plu.me/@/admin/", "https://plu.me/~/BlogName/"],
"cc": [],
"content": "Hello",
"id": "https://plu.me/~/BlogName/testing",
"license": "WTFPL",
"name": "Testing",
"published": format_datetime(&post.creation_date),
"source": {
"content": "Hello",
"mediaType": "text/markdown"
},
"summary": "Bye",
"tag": [
{
"href": "https://plu.me/@/user/",
"name": "@user",
"type": "Mention"
}
],
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Article",
"url": "https://plu.me/~/BlogName/testing"
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Update"
});
let actual = to_value(act)?;
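            // `to_string()` on the JSON value keeps the surrounding quotes, so after splitting
            // on the last '-' the suffix is the 10-digit timestamp plus the closing quote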
let id = actual["id"].to_string();
let (id_pre, id_post) = id.rsplit_once('-').unwrap();
assert_eq!(post.ap_url, "https://plu.me/~/BlogName/testing");
assert_eq!(
id_pre,
to_value("\"https://plu.me/~/BlogName/testing/update")
.unwrap()
.as_str()
.unwrap()
);
assert_eq!(id_post.len(), 11);
assert_eq!(
id_post.matches(char::is_numeric).collect::<String>().len(),
10
);
for (key, value) in actual.as_object().unwrap().into_iter() {
if key == "id" {
continue;
}
assert_json_eq!(value, expected.get(key).unwrap());
}
Ok(())
});
}
#[test]
fn build_delete() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (post, _mention, _posts, _users, _blogs) = prepare_activity(&conn);
let act = post.build_delete(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/admin/",
"id": "https://plu.me/~/BlogName/testing#delete",
"object": {
"id": "https://plu.me/~/BlogName/testing",
"type": "Tombstone"
},
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"type": "Delete"
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
let article: LicensedArticle = serde_json::from_value(json).unwrap();
assert_eq!(
"https://plu.me/~/Blog/my-article",
&article.object.object_props.id_string().unwrap()
);
}
}


@ -1,12 +1,16 @@
use crate::{
db_conn::{DbConn, DbPool},
follows,
posts::{LicensedArticle, Post},
posts::Post,
users::{User, UserEvent},
ACTOR_SYS, CONFIG, USER_CHAN,
};
use activitypub::activity::Create;
use plume_common::activity_pub::inbox::FromId;
use activitystreams::{
activity::{ActorAndObjectRef, Create},
base::AnyBase,
object::kind::ArticleType,
};
use plume_common::activity_pub::{inbox::FromId, LicensedArticle};
use riker::actors::{Actor, ActorFactoryArgs, ActorRefFactory, Context, Sender, Subscribe, Tell};
use std::sync::Arc;
use tracing::{error, info, warn};
@ -41,6 +45,12 @@ impl Actor for RemoteFetchActor {
RemoteUserFound(user) => match self.conn.get() {
Ok(conn) => {
let conn = DbConn(conn);
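                    // skip users from blocked instances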
if user
.get_instance(&conn)
.map_or(false, |instance| instance.blocked)
{
return;
}
                    // Don't call these functions in parallel,
                    // in case the database connection limit is too small
fetch_and_cache_articles(&user, &conn);
@ -68,13 +78,17 @@ fn fetch_and_cache_articles(user: &Arc<User>, conn: &DbConn) {
match create_acts {
Ok(create_acts) => {
for create_act in create_acts {
match create_act.create_props.object_object::<LicensedArticle>() {
Ok(article) => {
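                    // the Create's object must be a single base object that extends cleanly
                    // into a LicensedArticle; anything else is only logged below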
match create_act.object_field_ref().as_single_base().map(|base| {
let any_base = AnyBase::from_base(base.clone()); // FIXME: Don't clone()
any_base.extend::<LicensedArticle, ArticleType>()
}) {
Some(Ok(Some(article))) => {
Post::from_activity(conn, article)
.expect("Article from remote user couldn't be saved");
info!("Fetched article from remote user");
}
Err(e) => warn!("Error while fetching articles in background: {:?}", e),
Some(Err(e)) => warn!("Error while fetching articles in background: {:?}", e),
_ => warn!("Error while fetching articles in background"),
}
}
}


@ -1,14 +1,19 @@
use crate::{
db_conn::DbConn, instance::Instance, notifications::*, posts::Post, schema::reshares,
timeline::*, users::User, Connection, Error, Result, CONFIG,
instance::Instance, notifications::*, posts::Post, schema::reshares, timeline::*, users::User,
Connection, Error, Result, CONFIG,
};
use activitystreams::{
activity::{ActorAndObjectRef, Announce, Undo},
base::AnyBase,
iri_string::types::IriString,
prelude::*,
};
use activitypub::activity::{Announce, Undo};
use chrono::NaiveDateTime;
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl};
use plume_common::activity_pub::{
inbox::{AsActor, AsObject, FromId},
sign::Signer,
Id, IntoId, PUBLIC_VISIBILITY,
PUBLIC_VISIBILITY,
};
#[derive(Clone, Queryable, Identifiable)]
@ -61,16 +66,16 @@ impl Reshare {
}
pub fn to_activity(&self, conn: &Connection) -> Result<Announce> {
let mut act = Announce::default();
act.announce_props
.set_actor_link(User::get(conn, self.user_id)?.into_id())?;
act.announce_props
.set_object_link(Post::get(conn, self.post_id)?.into_id())?;
act.object_props.set_id_string(self.ap_url.clone())?;
act.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
act.object_props
.set_cc_link_vec(vec![Id::new(self.get_user(conn)?.followers_endpoint)])?;
let mut act = Announce::new(
User::get(conn, self.user_id)?.ap_url.parse::<IriString>()?,
Post::get(conn, self.post_id)?.ap_url.parse::<IriString>()?,
);
act.set_id(self.ap_url.parse::<IriString>()?);
act.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
act.set_many_ccs(vec![self
.get_user(conn)?
.followers_endpoint
.parse::<IriString>()?]);
Ok(act)
}
@ -93,26 +98,26 @@ impl Reshare {
}
pub fn build_undo(&self, conn: &Connection) -> Result<Undo> {
let mut act = Undo::default();
act.undo_props
.set_actor_link(User::get(conn, self.user_id)?.into_id())?;
act.undo_props.set_object_object(self.to_activity(conn)?)?;
act.object_props
.set_id_string(format!("{}#delete", self.ap_url))?;
act.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY.to_string())])?;
act.object_props
.set_cc_link_vec(vec![Id::new(self.get_user(conn)?.followers_endpoint)])?;
let mut act = Undo::new(
User::get(conn, self.user_id)?.ap_url.parse::<IriString>()?,
AnyBase::from_extended(self.to_activity(conn)?)?,
);
act.set_id(format!("{}#delete", self.ap_url).parse::<IriString>()?);
act.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
act.set_many_ccs(vec![self
.get_user(conn)?
.followers_endpoint
.parse::<IriString>()?]);
Ok(act)
}
}
impl AsObject<User, Announce, &DbConn> for Post {
impl AsObject<User, Announce, &Connection> for Post {
type Error = Error;
type Output = Reshare;
fn activity(self, conn: &DbConn, actor: User, id: &str) -> Result<Reshare> {
fn activity(self, conn: &Connection, actor: User, id: &str) -> Result<Reshare> {
let conn = conn;
let reshare = Reshare::insert(
conn,
@ -129,21 +134,24 @@ impl AsObject<User, Announce, &DbConn> for Post {
}
}
impl FromId<DbConn> for Reshare {
impl FromId<Connection> for Reshare {
type Error = Error;
type Object = Announce;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Reshare::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, act: Announce) -> Result<Self> {
fn from_activity(conn: &Connection, act: Announce) -> Result<Self> {
let res = Reshare::insert(
conn,
NewReshare {
post_id: Post::from_id(
conn,
&act.announce_props.object_link::<Id>()?,
act.object_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.as_str(),
None,
CONFIG.proxy(),
)
@ -151,13 +159,19 @@ impl FromId<DbConn> for Reshare {
.id,
user_id: User::from_id(
conn,
&act.announce_props.actor_link::<Id>()?,
act.actor_field_ref()
.as_single_id()
.ok_or(Error::MissingApProperty)?
.as_str(),
None,
CONFIG.proxy(),
)
.map_err(|(_, e)| e)?
.id,
ap_url: act.object_props.id_string()?,
ap_url: act
.id_unchecked()
.ok_or(Error::MissingApProperty)?
.to_string(),
},
)?;
res.notify(conn)?;
@ -169,17 +183,17 @@ impl FromId<DbConn> for Reshare {
}
}
impl AsObject<User, Undo, &DbConn> for Reshare {
impl AsObject<User, Undo, &Connection> for Reshare {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<()> {
if actor.id == self.user_id {
diesel::delete(&self).execute(&**conn)?;
diesel::delete(&self).execute(conn)?;
// delete associated notification if any
if let Ok(notif) = Notification::find(conn, notification_kind::RESHARE, self.id) {
diesel::delete(&notif).execute(&**conn)?;
diesel::delete(&notif).execute(conn)?;
}
Ok(())
@ -191,7 +205,7 @@ impl AsObject<User, Undo, &DbConn> for Reshare {
impl NewReshare {
pub fn new(p: &Post, u: &User) -> Self {
let ap_url = format!("{}/reshare/{}", u.ap_url, p.ap_url);
let ap_url = format!("{}reshare/{}", u.ap_url, p.ap_url);
NewReshare {
post_id: p.id,
user_id: u.id,
@ -199,3 +213,67 @@ impl NewReshare {
}
}
}
#[cfg(test)]
mod test {
use super::*;
use crate::diesel::Connection;
use crate::{inbox::tests::fill_database, tests::db};
use assert_json_diff::assert_json_eq;
use serde_json::{json, to_value};
#[test]
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (posts, _users, _blogs) = fill_database(&conn);
let post = &posts[0];
let user = &post.get_authors(&conn)?[0];
let reshare = Reshare::insert(&conn, NewReshare::new(post, user))?;
let act = reshare.to_activity(&conn).unwrap();
let expected = json!({
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": "https://plu.me/@/admin/reshare/https://plu.me/~/BlogName/testing",
"object": "https://plu.me/~/BlogName/testing",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Announce",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn build_undo() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (posts, _users, _blogs) = fill_database(&conn);
let post = &posts[0];
let user = &post.get_authors(&conn)?[0];
let reshare = Reshare::insert(&conn, NewReshare::new(post, user))?;
let act = reshare.build_undo(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": "https://plu.me/@/admin/reshare/https://plu.me/~/BlogName/testing#delete",
"object": {
"actor": "https://plu.me/@/admin/",
"cc": ["https://plu.me/@/admin/followers"],
"id": "https://plu.me/@/admin/reshare/https://plu.me/~/BlogName/testing",
"object": "https://plu.me/~/BlogName/testing",
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Announce"
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Undo",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}


@ -93,7 +93,7 @@ fn url_add_prefix(url: &str) -> Option<Cow<'_, str>> {
}
}
#[derive(Debug, Clone, PartialEq, AsExpression, FromSqlRow, Default)]
#[derive(Debug, Clone, PartialEq, Eq, AsExpression, FromSqlRow, Default)]
#[sql_type = "Text"]
pub struct SafeString {
value: String,


@ -73,16 +73,26 @@ table! {
user_id -> Int4,
}
}
table! {
email_blocklist(id){
email_blocklist (id) {
id -> Int4,
email_address -> VarChar,
email_address -> Text,
note -> Text,
notify_user -> Bool,
notification_text -> Text,
}
}
table! {
email_signups (id) {
id -> Int4,
email -> Varchar,
token -> Varchar,
expiration_date -> Timestamp,
}
}
table! {
follows (id) {
id -> Int4,
@ -306,6 +316,8 @@ allow_tables_to_appear_in_same_query!(
blogs,
comments,
comment_seers,
email_blocklist,
email_signups,
follows,
instances,
likes,


@ -108,7 +108,7 @@ mod tests {
let searcher = Arc::new(get_searcher(&CONFIG.search_tokenizers));
SearchActor::init(searcher.clone(), db_pool.clone());
let conn = db_pool.clone().get().unwrap();
let conn = db_pool.get().unwrap();
let title = random_hex()[..8].to_owned();
let (_instance, _user, blog) = fill_database(&conn);
@ -161,41 +161,43 @@ mod tests {
long_description_html: "<p>Good morning</p>".to_string(),
short_description: SafeString::new("Hello"),
short_description_html: "<p>Hello</p>".to_string(),
name: random_hex().to_string(),
name: random_hex(),
open_registrations: true,
public_domain: random_hex().to_string(),
public_domain: random_hex(),
},
)
.unwrap();
let user = User::insert(
conn,
NewUser {
username: random_hex().to_string(),
display_name: random_hex().to_string(),
outbox_url: random_hex().to_string(),
inbox_url: random_hex().to_string(),
username: random_hex(),
display_name: random_hex(),
outbox_url: random_hex(),
inbox_url: random_hex(),
summary: "".to_string(),
email: None,
hashed_password: None,
instance_id: instance.id,
ap_url: random_hex().to_string(),
ap_url: random_hex(),
private_key: None,
public_key: "".to_string(),
shared_inbox_url: None,
followers_endpoint: random_hex().to_string(),
followers_endpoint: random_hex(),
avatar_id: None,
summary_html: SafeString::new(""),
role: 0,
fqn: random_hex().to_string(),
fqn: random_hex(),
},
)
.unwrap();
let mut blog = NewBlog::default();
blog.instance_id = instance.id;
blog.actor_id = random_hex().to_string();
blog.ap_url = random_hex().to_string();
blog.inbox_url = random_hex().to_string();
blog.outbox_url = random_hex().to_string();
let blog = NewBlog {
instance_id: instance.id,
actor_id: random_hex(),
ap_url: random_hex(),
inbox_url: random_hex(),
outbox_url: random_hex(),
..Default::default()
};
let blog = Blog::insert(conn, blog).unwrap();
BlogAuthor::insert(
conn,


@ -154,7 +154,7 @@ pub(crate) mod tests {
},
)
.unwrap();
searcher.add_document(&conn, &post).unwrap();
searcher.add_document(conn, &post).unwrap();
searcher.commit();
assert_eq!(
searcher.search_document(conn, Query::from_str(&title).unwrap(), (0, 1))[0].id,


@ -94,7 +94,7 @@ macro_rules! gen_to_string {
)*
$(
for val in &$self.$date {
$result.push_str(&format!("{}:{} ", stringify!($date), NaiveDate::from_num_days_from_ce(*val as i32).format("%Y-%m-%d")));
$result.push_str(&format!("{}:{} ", stringify!($date), NaiveDate::from_num_days_from_ce_opt(*val as i32).unwrap().format("%Y-%m-%d")));
}
)*
}
@ -180,12 +180,16 @@ impl PlumeQuery {
if self.before.is_some() || self.after.is_some() {
// if at least one range bound is provided
let after = self
.after
.unwrap_or_else(|| i64::from(NaiveDate::from_ymd(2000, 1, 1).num_days_from_ce()));
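            // when only one bound is set, the other defaults to 2000-01-01 or today,
            // expressed as day counts since the Common Era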
let after = self.after.unwrap_or_else(|| {
i64::from(
NaiveDate::from_ymd_opt(2000, 1, 1)
.unwrap()
.num_days_from_ce(),
)
});
let before = self
.before
.unwrap_or_else(|| i64::from(Utc::today().num_days_from_ce()));
.unwrap_or_else(|| i64::from(Utc::now().date_naive().num_days_from_ce()));
let field = Searcher::schema().get_field("creation_date").unwrap();
let range =
RangeQuery::new_i64_bounds(field, Bound::Included(after), Bound::Included(before));
@ -202,16 +206,20 @@ impl PlumeQuery {
pub fn before<D: Datelike>(&mut self, date: &D) -> &mut Self {
let before = self
.before
.unwrap_or_else(|| i64::from(Utc::today().num_days_from_ce()));
.unwrap_or_else(|| i64::from(Utc::now().date_naive().num_days_from_ce()));
self.before = Some(cmp::min(before, i64::from(date.num_days_from_ce())));
self
}
// documents older than the provided date will be ignored
pub fn after<D: Datelike>(&mut self, date: &D) -> &mut Self {
let after = self
.after
.unwrap_or_else(|| i64::from(NaiveDate::from_ymd(2000, 1, 1).num_days_from_ce()));
let after = self.after.unwrap_or_else(|| {
i64::from(
NaiveDate::from_ymd_opt(2000, 1, 1)
.unwrap()
.num_days_from_ce(),
)
});
self.after = Some(cmp::max(after, i64::from(date.num_days_from_ce())));
self
}


@ -57,7 +57,7 @@ impl<'a> WhitespaceTokenStream<'a> {
.filter(|&(_, ref c)| c.is_whitespace())
.map(|(offset, _)| offset)
.next()
.unwrap_or_else(|| self.text.len())
.unwrap_or(self.text.len())
}
}


@ -0,0 +1,72 @@
use crate::CONFIG;
use rocket::request::{FromRequest, Outcome, Request};
use std::fmt;
use std::str::FromStr;
pub enum Strategy {
Password,
Email,
}
impl Default for Strategy {
fn default() -> Self {
Self::Password
}
}
impl FromStr for Strategy {
type Err = StrategyError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use self::Strategy::*;
match s {
"password" => Ok(Password),
"email" => Ok(Email),
s => Err(StrategyError::Unsupported(s.to_string())),
}
}
}
#[derive(Debug)]
pub enum StrategyError {
Unsupported(String),
}
impl fmt::Display for StrategyError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
use self::StrategyError::*;
match self {
            // FIXME: Compute the option strings from the enum variants
Unsupported(s) => write!(f, "Unsupported strategy: {}. Choose password or email", s),
}
}
}
impl std::error::Error for StrategyError {}
pub struct Password();
pub struct Email();
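// request guards for the configured signup strategy: they succeed when the strategy
// matches and forward otherwise, so another route can handle the request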
impl<'a, 'r> FromRequest<'a, 'r> for Password {
type Error = ();
fn from_request(_request: &'a Request<'r>) -> Outcome<Self, ()> {
match matches!(CONFIG.signup, Strategy::Password) {
true => Outcome::Success(Self()),
false => Outcome::Forward(()),
}
}
}
impl<'a, 'r> FromRequest<'a, 'r> for Email {
type Error = ();
fn from_request(_request: &'a Request<'r>) -> Outcome<Self, ()> {
match matches!(CONFIG.signup, Strategy::Email) {
true => Outcome::Success(Self()),
false => Outcome::Forward(()),
}
}
}


@ -1,6 +1,7 @@
use crate::{ap_url, instance::Instance, schema::tags, Connection, Error, Result};
use activitystreams::iri_string::types::IriString;
use diesel::{self, ExpressionMethods, QueryDsl, RunQueryDsl};
use plume_common::activity_pub::Hashtag;
use plume_common::activity_pub::{Hashtag, HashtagExt};
#[derive(Clone, Identifiable, Queryable)]
pub struct Tag {
@ -25,13 +26,16 @@ impl Tag {
list_by!(tags, for_post, post_id as i32);
pub fn to_activity(&self) -> Result<Hashtag> {
let mut ht = Hashtag::default();
ht.set_href_string(ap_url(&format!(
"{}/tag/{}",
Instance::get_local()?.public_domain,
self.tag
)))?;
ht.set_name_string(self.tag.clone())?;
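        // the hashtag links back to this instance's /tag/<name> page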
let mut ht = Hashtag::new();
ht.set_href(
ap_url(&format!(
"{}/tag/{}",
Instance::get_local()?.public_domain,
self.tag
))
.parse::<IriString>()?,
);
ht.set_name(self.tag.clone());
Ok(ht)
}
@ -44,7 +48,7 @@ impl Tag {
Tag::insert(
conn,
NewTag {
tag: tag.name_string()?,
tag: tag.name().ok_or(Error::MissingApProperty)?.as_str().into(),
is_hashtag,
post_id: post,
},
@ -52,13 +56,16 @@ impl Tag {
}
pub fn build_activity(tag: String) -> Result<Hashtag> {
let mut ht = Hashtag::default();
ht.set_href_string(ap_url(&format!(
"{}/tag/{}",
Instance::get_local()?.public_domain,
tag
)))?;
ht.set_name_string(tag)?;
let mut ht = Hashtag::new();
ht.set_href(
ap_url(&format!(
"{}/tag/{}",
Instance::get_local()?.public_domain,
tag
))
.parse::<IriString>()?,
);
ht.set_name(tag);
Ok(ht)
}
@ -69,3 +76,72 @@ impl Tag {
.map_err(Error::from)
}
}
#[cfg(test)]
mod tests {
use super::*;
use crate::tests::db;
use crate::{diesel::Connection, inbox::tests::fill_database};
use assert_json_diff::assert_json_eq;
use serde_json::to_value;
#[test]
fn from_activity() {
let conn = &db();
conn.test_transaction::<_, Error, _>(|| {
let (posts, _users, _blogs) = fill_database(conn);
let post_id = posts[0].id;
let mut ht = Hashtag::new();
ht.set_href(ap_url("https://plu.me/tag/a_tag").parse::<IriString>()?);
ht.set_name("a_tag".to_string());
let tag = Tag::from_activity(conn, &ht, post_id, true)?;
assert_eq!(&tag.tag, "a_tag");
assert!(tag.is_hashtag);
Ok(())
});
}
#[test]
fn to_activity() {
let conn = &db();
conn.test_transaction::<_, Error, _>(|| {
fill_database(conn);
let tag = Tag {
id: 0,
tag: "a_tag".into(),
is_hashtag: false,
post_id: 0,
};
let act = tag.to_activity()?;
let expected = json!({
"href": "https://plu.me/tag/a_tag",
"name": "a_tag",
"type": "Hashtag"
});
assert_json_eq!(to_value(&act)?, expected);
Ok(())
})
}
#[test]
fn build_activity() {
let conn = &db();
conn.test_transaction::<_, Error, _>(|| {
fill_database(conn);
let act = Tag::build_activity("a_tag".into())?;
let expected = json!({
"href": "https://plu.me/tag/a_tag",
"name": "a_tag",
"type": "Hashtag"
});
assert_json_eq!(to_value(&act)?, expected);
Ok(())
});
}
}


@ -1,19 +1,19 @@
use crate::{
db_conn::DbConn,
lists::List,
posts::Post,
schema::{posts, timeline, timeline_definition},
Connection, Error, Result,
};
use diesel::{self, BoolExpressionMethods, ExpressionMethods, QueryDsl, RunQueryDsl};
use std::cmp::Ordering;
use std::ops::Deref;
pub(crate) mod query;
pub use self::query::Kind;
use self::query::{QueryError, TimelineQuery};
pub use self::query::{QueryError, TimelineQuery};
#[derive(Clone, Debug, PartialEq, Queryable, Identifiable, AsChangeset)]
#[derive(Clone, Debug, PartialEq, Eq, Queryable, Identifiable, AsChangeset)]
#[table_name = "timeline_definition"]
pub struct Timeline {
pub id: i32,
@ -92,6 +92,16 @@ impl Timeline {
.load::<Self>(conn)
.map_err(Error::from)
}
.map(|mut timelines| {
timelines.sort_by(|t1, t2| {
if t1.user_id.is_some() && t2.user_id.is_none() {
Ordering::Less
} else {
t1.id.cmp(&t2.id)
}
});
timelines
})
}
pub fn new_for_user(
@ -209,7 +219,7 @@ impl Timeline {
.map_err(Error::from)
}
pub fn add_to_all_timelines(conn: &DbConn, post: &Post, kind: Kind<'_>) -> Result<()> {
pub fn add_to_all_timelines(conn: &Connection, post: &Post, kind: Kind<'_>) -> Result<()> {
let timelines = timeline_definition::table
.load::<Self>(conn.deref())
.map_err(Error::from)?;
@ -235,7 +245,26 @@ impl Timeline {
Ok(())
}
pub fn matches(&self, conn: &DbConn, post: &Post, kind: Kind<'_>) -> Result<bool> {
pub fn remove_post(&self, conn: &Connection, post: &Post) -> Result<bool> {
if self.includes_post(conn, post)? {
return Ok(false);
}
diesel::delete(
timeline::table
.filter(timeline::timeline_id.eq(self.id))
.filter(timeline::post_id.eq(post.id)),
)
.execute(conn)?;
Ok(true)
}
pub fn remove_all_posts(&self, conn: &Connection) -> Result<u64> {
let count = diesel::delete(timeline::table.filter(timeline::timeline_id.eq(self.id)))
.execute(conn)?;
Ok(count as u64)
}
pub fn matches(&self, conn: &Connection, post: &Post, kind: Kind<'_>) -> Result<bool> {
let query = TimelineQuery::parse(&self.query)?;
query.matches(conn, self, post, kind)
}
@ -271,73 +300,63 @@ mod tests {
fn test_timeline() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let users = userTests::fill_database(&conn);
let users = userTests::fill_database(conn);
let mut tl1_u1 = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"my timeline".to_owned(),
"all".to_owned(),
)
.unwrap();
List::new(
&conn,
"languages I speak",
Some(&users[1]),
ListType::Prefix,
)
.unwrap();
List::new(conn, "languages I speak", Some(&users[1]), ListType::Prefix).unwrap();
let tl2_u1 = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"another timeline".to_owned(),
"followed".to_owned(),
)
.unwrap();
let tl1_u2 = Timeline::new_for_user(
&conn,
conn,
users[1].id,
"english posts".to_owned(),
"lang in \"languages I speak\"".to_owned(),
)
.unwrap();
let tl1_instance = Timeline::new_for_instance(
&conn,
conn,
"english posts".to_owned(),
"license in [cc]".to_owned(),
)
.unwrap();
assert_eq!(tl1_u1, Timeline::get(&conn, tl1_u1.id).unwrap());
assert_eq!(tl1_u1, Timeline::get(conn, tl1_u1.id).unwrap());
assert_eq!(
tl2_u1,
Timeline::find_for_user_by_name(&conn, Some(users[0].id), "another timeline")
Timeline::find_for_user_by_name(conn, Some(users[0].id), "another timeline")
.unwrap()
);
assert_eq!(
tl1_instance,
Timeline::find_for_user_by_name(&conn, None, "english posts").unwrap()
Timeline::find_for_user_by_name(conn, None, "english posts").unwrap()
);
let tl_u1 = Timeline::list_for_user(&conn, Some(users[0].id)).unwrap();
let tl_u1 = Timeline::list_for_user(conn, Some(users[0].id)).unwrap();
assert_eq!(3, tl_u1.len()); // it is not 2 because there is a "Your feed" tl created for each user automatically
assert!(tl_u1.iter().fold(false, |res, tl| { res || *tl == tl1_u1 }));
assert!(tl_u1.iter().fold(false, |res, tl| { res || *tl == tl2_u1 }));
assert!(tl_u1.iter().any(|tl| *tl == tl1_u1));
assert!(tl_u1.iter().any(|tl| *tl == tl2_u1));
let tl_instance = Timeline::list_for_user(&conn, None).unwrap();
let tl_instance = Timeline::list_for_user(conn, None).unwrap();
            assert_eq!(3, tl_instance.len()); // there are also the local and federated feeds by default
assert!(tl_instance
.iter()
.fold(false, |res, tl| { res || *tl == tl1_instance }));
assert!(tl_instance.iter().any(|tl| *tl == tl1_instance));
tl1_u1.name = "My Super TL".to_owned();
let new_tl1_u2 = tl1_u2.update(&conn).unwrap();
let new_tl1_u2 = tl1_u2.update(conn).unwrap();
let tl_u2 = Timeline::list_for_user(&conn, Some(users[1].id)).unwrap();
let tl_u2 = Timeline::list_for_user(conn, Some(users[1].id)).unwrap();
assert_eq!(2, tl_u2.len()); // same here
assert!(tl_u2
.iter()
.fold(false, |res, tl| { res || *tl == new_tl1_u2 }));
assert!(tl_u2.iter().any(|tl| *tl == new_tl1_u2));
Ok(())
});
@ -347,48 +366,48 @@ mod tests {
fn test_timeline_creation_error() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let users = userTests::fill_database(&conn);
let users = userTests::fill_database(conn);
assert!(Timeline::new_for_user(
&conn,
conn,
users[0].id,
"my timeline".to_owned(),
"invalid keyword".to_owned(),
)
.is_err());
assert!(Timeline::new_for_instance(
&conn,
conn,
"my timeline".to_owned(),
"invalid keyword".to_owned(),
)
.is_err());
assert!(Timeline::new_for_user(
&conn,
conn,
users[0].id,
"my timeline".to_owned(),
"author in non_existant_list".to_owned(),
)
.is_err());
assert!(Timeline::new_for_instance(
&conn,
conn,
"my timeline".to_owned(),
"lang in dont-exist".to_owned(),
)
.is_err());
List::new(&conn, "friends", Some(&users[0]), ListType::User).unwrap();
List::new(&conn, "idk", None, ListType::Blog).unwrap();
List::new(conn, "friends", Some(&users[0]), ListType::User).unwrap();
List::new(conn, "idk", None, ListType::Blog).unwrap();
assert!(Timeline::new_for_user(
&conn,
conn,
users[0].id,
"my timeline".to_owned(),
"blog in friends".to_owned(),
)
.is_err());
assert!(Timeline::new_for_instance(
&conn,
conn,
"my timeline".to_owned(),
"not author in idk".to_owned(),
)
@ -402,10 +421,10 @@ mod tests {
fn test_simple_match() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (users, blogs) = blogTests::fill_database(&conn);
let (users, blogs) = blogTests::fill_database(conn);
let gnu_tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"GNU timeline".to_owned(),
"license in [AGPL, LGPL, GPL]".to_owned(),
@ -413,7 +432,7 @@ mod tests {
.unwrap();
let gnu_post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "slug".to_string(),
@ -429,10 +448,10 @@ mod tests {
},
)
.unwrap();
assert!(gnu_tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
assert!(gnu_tl.matches(conn, &gnu_post, Kind::Original).unwrap());
let non_free_post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "slug2".to_string(),
@ -449,7 +468,7 @@ mod tests {
)
.unwrap();
assert!(!gnu_tl
.matches(&conn, &non_free_post, Kind::Original)
.matches(conn, &non_free_post, Kind::Original)
.unwrap());
Ok(())
@ -460,9 +479,9 @@ mod tests {
fn test_complex_match() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (users, blogs) = blogTests::fill_database(&conn);
let (users, blogs) = blogTests::fill_database(conn);
Follow::insert(
&conn,
conn,
NewFollow {
follower_id: users[0].id,
following_id: users[1].id,
@ -472,11 +491,11 @@ mod tests {
.unwrap();
let fav_blogs_list =
List::new(&conn, "fav_blogs", Some(&users[0]), ListType::Blog).unwrap();
fav_blogs_list.add_blogs(&conn, &[blogs[0].id]).unwrap();
List::new(conn, "fav_blogs", Some(&users[0]), ListType::Blog).unwrap();
fav_blogs_list.add_blogs(conn, &[blogs[0].id]).unwrap();
let my_tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"My timeline".to_owned(),
"blog in fav_blogs and not has_cover or local and followed exclude likes"
@ -485,7 +504,7 @@ mod tests {
.unwrap();
let post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "about-linux".to_string(),
@ -501,10 +520,10 @@ mod tests {
},
)
.unwrap();
assert!(my_tl.matches(&conn, &post, Kind::Original).unwrap()); // matches because of "blog in fav_blogs" (and there is no cover)
assert!(my_tl.matches(conn, &post, Kind::Original).unwrap()); // matches because of "blog in fav_blogs" (and there is no cover)
let post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[1].id,
slug: "about-linux-2".to_string(),
@ -522,7 +541,7 @@ mod tests {
},
)
.unwrap();
assert!(!my_tl.matches(&conn, &post, Kind::Like(&users[1])).unwrap());
assert!(!my_tl.matches(conn, &post, Kind::Like(&users[1])).unwrap());
Ok(())
});
@ -532,17 +551,17 @@ mod tests {
fn test_add_to_all_timelines() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (users, blogs) = blogTests::fill_database(&conn);
let (users, blogs) = blogTests::fill_database(conn);
let gnu_tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"GNU timeline".to_owned(),
"license in [AGPL, LGPL, GPL]".to_owned(),
)
.unwrap();
let non_gnu_tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Stallman disapproved timeline".to_owned(),
"not license in [AGPL, LGPL, GPL]".to_owned(),
@ -550,7 +569,7 @@ mod tests {
.unwrap();
let gnu_post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "slug".to_string(),
@ -568,7 +587,7 @@ mod tests {
.unwrap();
let non_free_post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "slug2".to_string(),
@ -585,13 +604,13 @@ mod tests {
)
.unwrap();
Timeline::add_to_all_timelines(&conn, &gnu_post, Kind::Original).unwrap();
Timeline::add_to_all_timelines(&conn, &non_free_post, Kind::Original).unwrap();
Timeline::add_to_all_timelines(conn, &gnu_post, Kind::Original).unwrap();
Timeline::add_to_all_timelines(conn, &non_free_post, Kind::Original).unwrap();
let res = gnu_tl.get_latest(&conn, 2).unwrap();
let res = gnu_tl.get_latest(conn, 2).unwrap();
assert_eq!(res.len(), 1);
assert_eq!(res[0].id, gnu_post.id);
let res = non_gnu_tl.get_latest(&conn, 2).unwrap();
let res = non_gnu_tl.get_latest(conn, 2).unwrap();
assert_eq!(res.len(), 1);
assert_eq!(res[0].id, non_free_post.id);
@ -603,10 +622,10 @@ mod tests {
fn test_matches_lists_direct() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (users, blogs) = blogTests::fill_database(&conn);
let (users, blogs) = blogTests::fill_database(conn);
let gnu_post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "slug".to_string(),
@ -623,63 +642,63 @@ mod tests {
)
.unwrap();
gnu_post
.update_tags(&conn, vec![Tag::build_activity("free".to_owned()).unwrap()])
.update_tags(conn, vec![Tag::build_activity("free".to_owned()).unwrap()])
.unwrap();
PostAuthor::insert(
&conn,
conn,
NewPostAuthor {
post_id: gnu_post.id,
author_id: blogs[0].list_authors(&conn).unwrap()[0].id,
author_id: blogs[0].list_authors(conn).unwrap()[0].id,
},
)
.unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"blog timeline".to_owned(),
format!("blog in [{}]", blogs[0].fqn),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"blog timeline".to_owned(),
"blog in [no_one@nowhere]".to_owned(),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"author timeline".to_owned(),
format!(
"author in [{}]",
blogs[0].list_authors(&conn).unwrap()[0].fqn
blogs[0].list_authors(conn).unwrap()[0].fqn
),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"author timeline".to_owned(),
format!("author in [{}]", users[2].fqn),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
assert!(tl
.matches(&conn, &gnu_post, Kind::Reshare(&users[2]))
.matches(conn, &gnu_post, Kind::Reshare(&users[2]))
.unwrap());
assert!(!tl.matches(&conn, &gnu_post, Kind::Like(&users[2])).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Like(&users[2])).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"author timeline".to_owned(),
format!(
@ -688,50 +707,50 @@ mod tests {
),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
assert!(!tl
.matches(&conn, &gnu_post, Kind::Reshare(&users[2]))
.matches(conn, &gnu_post, Kind::Reshare(&users[2]))
.unwrap());
assert!(tl.matches(&conn, &gnu_post, Kind::Like(&users[2])).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Like(&users[2])).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"tag timeline".to_owned(),
"tags in [free]".to_owned(),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"tag timeline".to_owned(),
"tags in [private]".to_owned(),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"english timeline".to_owned(),
"lang in [en]".to_owned(),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"franco-italian timeline".to_owned(),
"lang in [fr, it]".to_owned(),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
Ok(())
});
@ -775,10 +794,10 @@ mod tests {
fn test_matches_keyword() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let (users, blogs) = blogTests::fill_database(&conn);
let (users, blogs) = blogTests::fill_database(conn);
let gnu_post = Post::insert(
&conn,
conn,
NewPost {
blog_id: blogs[0].id,
slug: "slug".to_string(),
@ -796,61 +815,61 @@ mod tests {
.unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Linux title".to_owned(),
"title contains Linux".to_owned(),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Microsoft title".to_owned(),
"title contains Microsoft".to_owned(),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Linux subtitle".to_owned(),
"subtitle contains Stallman".to_owned(),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Microsoft subtitle".to_owned(),
"subtitle contains Nadella".to_owned(),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Linux content".to_owned(),
"content contains Linux".to_owned(),
)
.unwrap();
assert!(tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
let tl = Timeline::new_for_user(
&conn,
conn,
users[0].id,
"Microsoft content".to_owned(),
"subtitle contains Windows".to_owned(),
)
.unwrap();
assert!(!tl.matches(&conn, &gnu_post, Kind::Original).unwrap());
tl.delete(&conn).unwrap();
assert!(!tl.matches(conn, &gnu_post, Kind::Original).unwrap());
tl.delete(conn).unwrap();
Ok(())
});


@ -1,17 +1,16 @@
use crate::{
blogs::Blog,
db_conn::DbConn,
lists::{self, ListType},
posts::Post,
tags::Tag,
timeline::Timeline,
users::User,
Result,
Connection, Result,
};
use plume_common::activity_pub::inbox::AsActor;
use whatlang::{self, Lang};
#[derive(Debug, Clone, PartialEq)]
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum QueryError {
SyntaxError(usize, usize, String),
UnexpectedEndOfQuery,
@ -20,7 +19,7 @@ pub enum QueryError {
pub type QueryResult<T> = std::result::Result<T, QueryError>;
#[derive(Debug, Clone, Copy, PartialEq)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Kind<'a> {
Original,
Reshare(&'a User),
@ -155,7 +154,7 @@ enum TQ<'a> {
impl<'a> TQ<'a> {
fn matches(
&self,
conn: &DbConn,
conn: &Connection,
timeline: &Timeline,
post: &Post,
kind: Kind<'_>,
@ -200,7 +199,7 @@ enum Arg<'a> {
impl<'a> Arg<'a> {
pub fn matches(
&self,
conn: &DbConn,
conn: &Connection,
timeline: &Timeline,
post: &Post,
kind: Kind<'_>,
@ -225,7 +224,7 @@ enum WithList {
impl WithList {
pub fn matches(
&self,
conn: &DbConn,
conn: &Connection,
timeline: &Timeline,
post: &Post,
list: &List<'_>,
@ -292,7 +291,7 @@ impl WithList {
WithList::Author { boosts, likes } => match kind {
Kind::Original => Ok(list
.iter()
.filter_map(|a| User::find_by_fqn(&*conn, a).ok())
.filter_map(|a| User::find_by_fqn(conn, a).ok())
.any(|a| post.is_author(conn, a.id).unwrap_or(false))),
Kind::Reshare(u) => {
if *boosts {
@ -361,7 +360,7 @@ enum Bool {
impl Bool {
pub fn matches(
&self,
conn: &DbConn,
conn: &Connection,
timeline: &Timeline,
post: &Post,
kind: Kind<'_>,
@ -654,7 +653,7 @@ impl<'a> TimelineQuery<'a> {
pub fn matches(
&self,
conn: &DbConn,
conn: &Connection,
timeline: &Timeline,
post: &Post,
kind: Kind<'_>,


@ -1,18 +1,24 @@
use crate::{
ap_url, blocklisted_emails::BlocklistedEmail, blogs::Blog, db_conn::DbConn, follows::Follow,
instance::*, medias::Media, notifications::Notification, post_authors::PostAuthor, posts::Post,
safe_string::SafeString, schema::users, timeline::Timeline, Connection, Error, Result,
UserEvent::*, CONFIG, ITEMS_PER_PAGE, USER_CHAN,
ap_url, blocklisted_emails::BlocklistedEmail, blogs::Blog, comments::Comment, db_conn::DbConn,
follows::Follow, instance::*, medias::Media, notifications::Notification,
post_authors::PostAuthor, posts::Post, safe_string::SafeString, schema::users,
timeline::Timeline, Connection, Error, Result, UserEvent::*, CONFIG, ITEMS_PER_PAGE, USER_CHAN,
};
use activitypub::{
use activitystreams::{
activity::Delete,
actor::Person,
actor::{ApActor, AsApActor, Endpoints, Person},
base::{AnyBase, Base},
collection::{OrderedCollection, OrderedCollectionPage},
object::{Image, Tombstone},
Activity, CustomObject, Endpoint,
iri_string::types::IriString,
markers::Activity,
object::{kind::ImageType, AsObject as _, Image, Tombstone},
prelude::*,
};
use chrono::{NaiveDateTime, Utc};
use diesel::{self, BelongingToDsl, ExpressionMethods, OptionalExtension, QueryDsl, RunQueryDsl};
use diesel::{
self, BelongingToDsl, BoolExpressionMethods, ExpressionMethods, OptionalExtension, QueryDsl,
RunQueryDsl, TextExpressionMethods,
};
use ldap3::{LdapConn, Scope, SearchEntry};
use openssl::{
hash::MessageDigest,
@ -25,7 +31,8 @@ use plume_common::{
inbox::{AsActor, AsObject, FromId},
request::get,
sign::{gen_keypair, Error as SignError, Result as SignResult, Signer},
ActivityStream, ApSignature, Id, IntoId, PublicKey, PUBLIC_VISIBILITY,
ActivityStream, ApSignature, CustomPerson, Id, IntoId, PublicKey, ToAsString, ToAsUri,
PUBLIC_VISIBILITY,
},
utils,
};
@ -39,11 +46,8 @@ use std::{
hash::{Hash, Hasher},
sync::Arc,
};
use url::Url;
use webfinger::*;
pub type CustomPerson = CustomObject<ApSignature, Person>;
pub enum Role {
Admin = 0,
Moderator = 1,
@ -164,6 +168,14 @@ impl User {
notif.delete(conn)?
}
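        // run a Delete activity through the local inbox for each of the user's comments,
        // so the comments are cleaned up along with the account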
for comment in Comment::list_by_author(conn, self.id)? {
let delete_activity = comment.build_delete(conn)?;
crate::inbox::inbox(
conn,
serde_json::to_value(&delete_activity).map_err(Error::from)?,
)?;
}
diesel::delete(self)
.execute(conn)
.map(|_| ())
@ -185,15 +197,16 @@ impl User {
pub fn count_local(conn: &Connection) -> Result<i64> {
users::table
.filter(users::instance_id.eq(Instance::get_local()?.id))
.filter(users::role.ne(Role::Instance as i32))
.count()
.get_result(&*conn)
.get_result(conn)
.map_err(Error::from)
}
pub fn find_by_fqn(conn: &DbConn, fqn: &str) -> Result<User> {
pub fn find_by_fqn(conn: &Connection, fqn: &str) -> Result<User> {
let from_db = users::table
.filter(users::fqn.eq(fqn))
.first(&**conn)
.first(conn)
.optional()?;
if let Some(from_db) = from_db {
Ok(from_db)
@ -202,7 +215,44 @@ impl User {
}
}
fn fetch_from_webfinger(conn: &DbConn, acct: &str) -> Result<User> {
pub fn search_local_by_name(
conn: &Connection,
name: &str,
(min, max): (i32, i32),
) -> Result<Vec<User>> {
users::table
.filter(users::instance_id.eq(Instance::get_local()?.id))
.filter(users::role.ne(Role::Instance as i32))
// TODO: use `ilike` instead of `like` for PostgreSQL
.filter(
users::username
.like(format!("%{}%", name))
.or(users::display_name.like(format!("%{}%", name))),
)
.order(users::username.asc())
.offset(min.into())
.limit((max - min).into())
.load::<User>(conn)
.map_err(Error::from)
}
/**
* TODO: Should create user record with normalized(lowercased) email
*/
pub fn email_used(conn: &DbConn, email: &str) -> Result<bool> {
use diesel::dsl::{exists, select};
select(exists(
users::table
.filter(users::instance_id.eq(Instance::get_local()?.id))
.filter(users::email.eq(email))
.or_filter(users::email.eq(email.to_ascii_lowercase())),
))
.get_result(&**conn)
.map_err(Error::from)
}
fn fetch_from_webfinger(conn: &Connection, acct: &str) -> Result<User> {
let link = resolve(acct.to_owned(), true)?
.links
.into_iter()
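The hunk above introduces two query helpers: User::search_local_by_name() backing the new admin user search, and User::email_used() for registration checks. A minimal usage sketch; the demo_search wrapper, the "ali" query and the sample e-mail address are illustrative assumptions, not code from this change:

use plume_models::{db_conn::DbConn, users::User, Connection, Error, ITEMS_PER_PAGE};

// Hypothetical wrapper, only to show the call shapes of the two new helpers.
fn demo_search(conn: &Connection, pooled: &DbConn) -> Result<(), Error> {
    // Paginated LIKE search over local users (instance accounts excluded),
    // matching either the username or the display name.
    for user in User::search_local_by_name(conn, "ali", (0, ITEMS_PER_PAGE))? {
        println!("{} ({})", user.username, user.display_name);
    }
    // email_used() still takes the pooled DbConn and also matches the lowercased address.
    if User::email_used(pooled, "Someone@example.com")? {
        println!("address already registered on this instance");
    }
    Ok(())
}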
@ -227,12 +277,9 @@ impl User {
}
fn fetch(url: &str) -> Result<CustomPerson> {
let mut res = get(url, Self::get_sender(), CONFIG.proxy().cloned())?;
let res = get(url, Self::get_sender(), CONFIG.proxy().cloned())?;
let text = &res.text()?;
// without this workaround, publicKey is not correctly deserialized
let ap_sign = serde_json::from_str::<ApSignature>(text)?;
let mut json = serde_json::from_str::<CustomPerson>(text)?;
json.custom_props = ap_sign;
let json = serde_json::from_str::<CustomPerson>(text)?;
Ok(json)
}
@ -242,37 +289,48 @@ impl User {
pub fn refetch(&self, conn: &Connection) -> Result<()> {
User::fetch(&self.ap_url.clone()).and_then(|json| {
let avatar = Media::save_remote(
conn,
json.object
.object_props
.icon_image()? // FIXME: Fails when icon is not set
.object_props
.url_string()?,
self,
)
.ok();
let avatar = json
.icon()
.and_then(|icon| icon.iter().next())
.and_then(|i| i.clone().extend::<Image, ImageType>().ok())
.and_then(|image| image)
.and_then(|image| image.id_unchecked().map(|url| url.to_string()))
.and_then(|url| Media::save_remote(conn, url, self).ok());
let pub_key = &json.ext_one.public_key.public_key_pem;
diesel::update(self)
.set((
users::username.eq(json.object.ap_actor_props.preferred_username_string()?),
users::display_name.eq(json.object.object_props.name_string()?),
users::outbox_url.eq(json.object.ap_actor_props.outbox_string()?),
users::inbox_url.eq(json.object.ap_actor_props.inbox_string()?),
users::username.eq(json
.ap_actor_ref()
.preferred_username()
.ok_or(Error::MissingApProperty)?),
users::display_name.eq(json
.ap_actor_ref()
.name()
.ok_or(Error::MissingApProperty)?
.to_as_string()
.ok_or(Error::MissingApProperty)?),
users::outbox_url.eq(json
.ap_actor_ref()
.outbox()?
.ok_or(Error::MissingApProperty)?
.as_str()),
users::inbox_url.eq(json.ap_actor_ref().inbox()?.as_str()),
users::summary.eq(SafeString::new(
&json
.object
.object_props
.summary_string()
.ap_actor_ref()
.summary()
.and_then(|summary| summary.to_as_string())
.unwrap_or_default(),
)),
users::followers_endpoint.eq(json.object.ap_actor_props.followers_string()?),
users::followers_endpoint.eq(json
.ap_actor_ref()
.followers()?
.ok_or(Error::MissingApProperty)?
.as_str()),
users::avatar_id.eq(avatar.map(|a| a.id)),
users::last_fetched_date.eq(Utc::now().naive_utc()),
users::public_key.eq(json
.custom_props
.public_key_publickey()?
.public_key_pem_string()?),
users::public_key.eq(pub_key),
))
.execute(conn)
.map(|_| ())
@ -387,7 +445,7 @@ impl User {
}
// if no user was found, and we were unable to auto-register from ldap
// fake-verify a password, and return an error.
let other = User::get(&*conn, 1)
let other = User::get(conn, 1)
.expect("No user is registered")
.hashed_password;
other.map(|pass| bcrypt::verify(password, &pass));
@ -406,6 +464,7 @@ impl User {
pub fn get_local_page(conn: &Connection, (min, max): (i32, i32)) -> Result<Vec<User>> {
users::table
.filter(users::instance_id.eq(Instance::get_local()?.id))
.filter(users::role.ne(Role::Instance as i32))
.order(users::username.asc())
.offset(min.into())
.limit((max - min).into())
@ -413,48 +472,63 @@ impl User {
.map_err(Error::from)
}
pub fn outbox(&self, conn: &Connection) -> Result<ActivityStream<OrderedCollection>> {
let mut coll = OrderedCollection::default();
Ok(ActivityStream::new(self.outbox_collection(conn)?))
}
pub fn outbox_collection(&self, conn: &Connection) -> Result<OrderedCollection> {
let mut coll = OrderedCollection::new();
let first = &format!("{}?page=1", &self.outbox_url);
let last = &format!(
"{}?page={}",
&self.outbox_url,
self.get_activities_count(conn) / i64::from(ITEMS_PER_PAGE) + 1
);
coll.collection_props.set_first_link(Id::new(first))?;
coll.collection_props.set_last_link(Id::new(last))?;
coll.collection_props
.set_total_items_u64(self.get_activities_count(conn) as u64)?;
Ok(ActivityStream::new(coll))
coll.set_first(first.parse::<IriString>()?);
coll.set_last(last.parse::<IriString>()?);
coll.set_total_items(self.get_activities_count(conn) as u64);
Ok(coll)
}
pub fn outbox_page(
&self,
conn: &Connection,
(min, max): (i32, i32),
) -> Result<ActivityStream<OrderedCollectionPage>> {
Ok(ActivityStream::new(
self.outbox_collection_page(conn, (min, max))?,
))
}
pub fn outbox_collection_page(
&self,
conn: &Connection,
(min, max): (i32, i32),
) -> Result<OrderedCollectionPage> {
let acts = self.get_activities_page(conn, (min, max))?;
let n_acts = self.get_activities_count(conn);
let mut coll = OrderedCollectionPage::default();
let mut coll = OrderedCollectionPage::new();
if n_acts - i64::from(min) >= i64::from(ITEMS_PER_PAGE) {
coll.collection_page_props.set_next_link(Id::new(&format!(
"{}?page={}",
&self.outbox_url,
min / ITEMS_PER_PAGE + 2
)))?;
coll.set_next(
format!("{}?page={}", &self.outbox_url, min / ITEMS_PER_PAGE + 2)
.parse::<IriString>()?,
);
}
if min > 0 {
coll.collection_page_props.set_prev_link(Id::new(&format!(
"{}?page={}",
&self.outbox_url,
min / ITEMS_PER_PAGE
)))?;
coll.set_prev(
format!("{}?page={}", &self.outbox_url, min / ITEMS_PER_PAGE)
.parse::<IriString>()?,
);
}
coll.collection_props.items = serde_json::to_value(acts)?;
coll.collection_page_props
.set_part_of_link(Id::new(&self.outbox_url))?;
Ok(ActivityStream::new(coll))
coll.set_many_items(
acts.iter()
.filter_map(|value| AnyBase::from_arbitrary_json(value).ok()),
);
coll.set_part_of(self.outbox_url.parse::<IriString>()?);
Ok(coll)
}
fn fetch_outbox_page<T: Activity>(&self, url: &str) -> Result<(Vec<T>, Option<String>)> {
let mut res = get(url, Self::get_sender(), CONFIG.proxy().cloned())?;
pub fn fetch_outbox_page<T: Activity + serde::de::DeserializeOwned>(
&self,
url: &str,
) -> Result<(Vec<T>, Option<String>)> {
let res = get(url, Self::get_sender(), CONFIG.proxy().cloned())?;
let text = &res.text()?;
let json: serde_json::Value = serde_json::from_str(text)?;
let items = json["items"]
@ -467,8 +541,9 @@ impl User {
let next = json.get("next").map(|x| x.as_str().unwrap().to_owned());
Ok((items, next))
}
pub fn fetch_outbox<T: Activity>(&self) -> Result<Vec<T>> {
let mut res = get(
pub fn fetch_outbox<T: Activity + serde::de::DeserializeOwned>(&self) -> Result<Vec<T>> {
let res = get(
&self.outbox_url[..],
Self::get_sender(),
CONFIG.proxy().cloned(),
@ -504,7 +579,7 @@ impl User {
}
pub fn fetch_followers_ids(&self) -> Result<Vec<String>> {
let mut res = get(
let res = get(
&self.followers_endpoint[..],
Self::get_sender(),
CONFIG.proxy().cloned(),
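With the hunks above, the outbox is assembled by outbox_collection() and outbox_collection_page(), which emit first, last, next, prev and partOf as IriString links. A small sketch of how a 1-based ?page=N query could map onto the (min, max) window these methods take; page_to_window and demo_outbox are illustrative names, not part of the change:

use plume_models::{users::User, Connection, Error, ITEMS_PER_PAGE};

// Assumed helper: turn a 1-based page number into the (offset, upper bound)
// pair used throughout the pagination code above.
fn page_to_window(page: i32) -> (i32, i32) {
    ((page - 1) * ITEMS_PER_PAGE, page * ITEMS_PER_PAGE)
}

fn demo_outbox(user: &User, conn: &Connection) -> Result<(), Error> {
    // Top-level collection: first/last links and totalItems, no embedded items.
    let collection = user.outbox_collection(conn)?;
    // Page 2: items plus partOf and prev, and next while more activities remain.
    let page = user.outbox_collection_page(conn, page_to_window(2))?;
    println!("{}", serde_json::to_string_pretty(&collection)?);
    println!("{}", serde_json::to_string_pretty(&page)?);
    Ok(())
}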
@ -712,71 +787,58 @@ impl User {
}
pub fn to_activity(&self, conn: &Connection) -> Result<CustomPerson> {
let mut actor = Person::default();
actor.object_props.set_id_string(self.ap_url.clone())?;
actor
.object_props
.set_name_string(self.display_name.clone())?;
actor
.object_props
.set_summary_string(self.summary_html.get().clone())?;
actor.object_props.set_url_string(self.ap_url.clone())?;
actor
.ap_actor_props
.set_inbox_string(self.inbox_url.clone())?;
actor
.ap_actor_props
.set_outbox_string(self.outbox_url.clone())?;
actor
.ap_actor_props
.set_preferred_username_string(self.username.clone())?;
actor
.ap_actor_props
.set_followers_string(self.followers_endpoint.clone())?;
let mut actor = ApActor::new(self.inbox_url.parse()?, Person::new());
let ap_url = self.ap_url.parse::<IriString>()?;
actor.set_id(ap_url.clone());
actor.set_name(self.display_name.clone());
actor.set_summary(self.summary_html.get().clone());
actor.set_url(ap_url.clone());
actor.set_inbox(self.inbox_url.parse()?);
actor.set_outbox(self.outbox_url.parse()?);
actor.set_preferred_username(self.username.clone());
actor.set_followers(self.followers_endpoint.parse()?);
if let Some(shared_inbox_url) = self.shared_inbox_url.clone() {
let mut endpoints = Endpoint::default();
endpoints.set_shared_inbox_string(shared_inbox_url)?;
actor.ap_actor_props.set_endpoints_endpoint(endpoints)?;
let endpoints = Endpoints {
shared_inbox: Some(shared_inbox_url.parse::<IriString>()?),
..Endpoints::default()
};
actor.set_endpoints(endpoints);
}
let mut public_key = PublicKey::default();
public_key.set_id_string(format!("{}#main-key", self.ap_url))?;
public_key.set_owner_string(self.ap_url.clone())?;
public_key.set_public_key_pem_string(self.public_key.clone())?;
let mut ap_signature = ApSignature::default();
ap_signature.set_public_key_publickey(public_key)?;
let pub_key = PublicKey {
id: format!("{}#main-key", self.ap_url).parse()?,
owner: ap_url,
public_key_pem: self.public_key.clone(),
};
let ap_signature = ApSignature {
public_key: pub_key,
};
let mut avatar = Image::default();
avatar.object_props.set_url_string(
self.avatar_id
.and_then(|id| Media::get(conn, id).and_then(|m| m.url()).ok())
.unwrap_or_default(),
)?;
actor.object_props.set_icon_object(avatar)?;
if let Some(avatar_id) = self.avatar_id {
let mut avatar = Image::new();
avatar.set_url(Media::get(conn, avatar_id)?.url()?.parse::<IriString>()?);
actor.set_icon(avatar.into_any_base()?);
}
Ok(CustomPerson::new(actor, ap_signature))
}
pub fn delete_activity(&self, conn: &Connection) -> Result<Delete> {
let mut del = Delete::default();
let mut tombstone = Tombstone::new();
tombstone.set_id(self.ap_url.parse()?);
let mut tombstone = Tombstone::default();
tombstone.object_props.set_id_string(self.ap_url.clone())?;
del.delete_props
.set_actor_link(Id::new(self.ap_url.clone()))?;
del.delete_props.set_object_object(tombstone)?;
del.object_props
.set_id_string(format!("{}#delete", self.ap_url))?;
del.object_props
.set_to_link_vec(vec![Id::new(PUBLIC_VISIBILITY)])?;
del.object_props.set_cc_link_vec(
let mut del = Delete::new(
self.ap_url.parse::<IriString>()?,
Base::retract(tombstone)?.into_generic()?,
);
del.set_id(format!("{}#delete", self.ap_url).parse()?);
del.set_many_tos(vec![PUBLIC_VISIBILITY.parse::<IriString>()?]);
del.set_many_ccs(
self.get_followers(conn)?
.into_iter()
.map(|f| Id::new(f.ap_url))
.collect(),
)?;
.filter_map(|f| f.ap_url.parse::<IriString>().ok()),
);
Ok(del)
}
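to_activity() and delete_activity() now build their documents with the activitystreams builders shown above; the new tests further down pin the resulting JSON. A quick illustrative check against that expected shape (demo_actor_json is an assumed name):

use plume_models::{users::User, Connection, Error};

fn demo_actor_json(user: &User, conn: &Connection) -> Result<(), Error> {
    // The actor document serializes to an AS2 Person whose publicKey.owner is its own id.
    let person = serde_json::to_value(user.to_activity(conn)?)?;
    assert_eq!(person["type"], "Person");
    assert_eq!(person["publicKey"]["owner"], person["id"]);

    // The deletion activity wraps a Tombstone for the actor and is addressed to the public.
    let delete = serde_json::to_value(user.delete_activity(conn)?)?;
    assert_eq!(delete["type"], "Delete");
    assert_eq!(delete["object"]["type"], "Tombstone");
    Ok(())
}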
@ -880,7 +942,7 @@ impl<'a, 'r> FromRequest<'a, 'r> for User {
.cookies()
.get_private(AUTH_COOKIE)
.and_then(|cookie| cookie.value().parse().ok())
.and_then(|id| User::get(&*conn, id).ok())
.and_then(|id| User::get(&conn, id).ok())
.or_forward(())
}
}
@ -893,18 +955,73 @@ impl IntoId for User {
impl Eq for User {}
impl FromId<DbConn> for User {
impl FromId<Connection> for User {
type Error = Error;
type Object = CustomPerson;
fn from_db(conn: &DbConn, id: &str) -> Result<Self> {
fn from_db(conn: &Connection, id: &str) -> Result<Self> {
Self::find_by_ap_url(conn, id)
}
fn from_activity(conn: &DbConn, acct: CustomPerson) -> Result<Self> {
let url = Url::parse(&acct.object.object_props.id_string()?)?;
let inst = url.host_str().ok_or(Error::Url)?;
let instance = Instance::find_by_domain(conn, inst).or_else(|_| {
fn from_activity(conn: &Connection, acct: CustomPerson) -> Result<Self> {
let actor = acct.ap_actor_ref();
let username = actor
.preferred_username()
.ok_or(Error::MissingApProperty)?
.to_string();
if username.contains(&['<', '>', '&', '@', '\'', '"', ' ', '\t'][..]) {
tracing::error!(
"preferredUsername includes invalid character(s): {}",
&username
);
return Err(Error::InvalidValue);
}
let summary = acct
.object_ref()
.summary()
.and_then(|prop| prop.to_as_string())
.unwrap_or_default();
let mut new_user = NewUser {
display_name: acct
.object_ref()
.name()
.and_then(|prop| prop.to_as_string())
.unwrap_or_else(|| username.clone()),
username: username.clone(),
outbox_url: actor.outbox()?.ok_or(Error::MissingApProperty)?.to_string(),
inbox_url: actor.inbox()?.to_string(),
role: 2,
summary_html: SafeString::new(&summary),
summary,
public_key: acct.ext_one.public_key.public_key_pem.to_string(),
shared_inbox_url: actor
.endpoints()?
.and_then(|e| e.shared_inbox.map(|inbox| inbox.to_string())),
followers_endpoint: actor
.followers()?
.ok_or(Error::MissingApProperty)?
.to_string(),
..NewUser::default()
};
let avatar_id = acct.object_ref().icon().and_then(|icon| icon.to_as_uri());
let (ap_url, inst) = {
let any_base = acct.into_any_base()?;
let id = any_base.id().ok_or(Error::MissingApProperty)?;
(
id.to_string(),
id.authority_components()
.ok_or(Error::Url)?
.host()
.to_string(),
)
};
new_user.ap_url = ap_url;
let instance = Instance::find_by_domain(conn, &inst).or_else(|_| {
Instance::insert(
conn,
NewInstance {
@ -921,70 +1038,20 @@ impl FromId<DbConn> for User {
},
)
})?;
let username = acct.object.ap_actor_props.preferred_username_string()?;
if username.contains(&['<', '>', '&', '@', '\'', '"', ' ', '\t'][..]) {
return Err(Error::InvalidValue);
}
let fqn = if instance.local {
username.clone()
new_user.instance_id = instance.id;
new_user.fqn = if instance.local {
username
} else {
format!("{}@{}", username, instance.public_domain)
};
let user = User::insert(
conn,
NewUser {
display_name: acct
.object
.object_props
.name_string()
.unwrap_or_else(|_| username.clone()),
username,
outbox_url: acct.object.ap_actor_props.outbox_string()?,
inbox_url: acct.object.ap_actor_props.inbox_string()?,
role: 2,
summary: acct
.object
.object_props
.summary_string()
.unwrap_or_default(),
summary_html: SafeString::new(
&acct
.object
.object_props
.summary_string()
.unwrap_or_default(),
),
email: None,
hashed_password: None,
instance_id: instance.id,
ap_url: acct.object.object_props.id_string()?,
public_key: acct
.custom_props
.public_key_publickey()?
.public_key_pem_string()?,
private_key: None,
shared_inbox_url: acct
.object
.ap_actor_props
.endpoints_endpoint()
.and_then(|e| e.shared_inbox_string())
.ok(),
followers_endpoint: acct.object.ap_actor_props.followers_string()?,
fqn,
avatar_id: None,
},
)?;
let user = User::insert(conn, new_user)?;
if let Some(avatar_id) = avatar_id {
let avatar = Media::save_remote(conn, avatar_id, &user);
if let Ok(icon) = acct.object.object_props.icon_image() {
if let Ok(url) = icon.object_props.url_string() {
let avatar = Media::save_remote(conn, url, &user);
if let Ok(avatar) = avatar {
user.set_avatar(conn, avatar.id)?;
if let Ok(avatar) = avatar {
if let Err(e) = user.set_avatar(conn, avatar.id) {
tracing::error!("{:?}", e);
}
}
}
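from_activity() now rejects invalid preferredUsername values before inserting anything and derives the fqn from the instance, as the hunks above show. The same two rules restated as a self-contained sketch; every name here is illustrative:

// Characters that make from_activity() bail out with Error::InvalidValue.
fn acceptable_username(username: &str) -> bool {
    !username.contains(&['<', '>', '&', '@', '\'', '"', ' ', '\t'][..])
}

// Local users keep the bare username as fqn; remote users get user@domain.
fn fqn_for(username: &str, local: bool, public_domain: &str) -> String {
    if local {
        username.to_string()
    } else {
        format!("{}@{}", username, public_domain)
    }
}

fn main() {
    assert!(acceptable_username("alice"));
    assert!(!acceptable_username("al ice"));
    assert_eq!(fqn_for("alice", false, "example.org"), "alice@example.org");
    assert_eq!(fqn_for("alice", true, "example.org"), "alice");
}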
@ -997,7 +1064,7 @@ impl FromId<DbConn> for User {
}
}
impl AsActor<&DbConn> for User {
impl AsActor<&Connection> for User {
fn get_inbox_url(&self) -> String {
self.inbox_url.clone()
}
@ -1013,11 +1080,11 @@ impl AsActor<&DbConn> for User {
}
}
impl AsObject<User, Delete, &DbConn> for User {
impl AsObject<User, Delete, &Connection> for User {
type Error = Error;
type Output = ();
fn activity(self, conn: &DbConn, actor: User, _id: &str) -> Result<()> {
fn activity(self, conn: &Connection, actor: User, _id: &str) -> Result<()> {
if self.id == actor.id {
self.delete(conn).map(|_| ())
} else {
@ -1126,10 +1193,13 @@ pub(crate) mod tests {
use super::*;
use crate::{
instance::{tests as instance_tests, Instance},
medias::{Media, NewMedia},
tests::db,
Connection as Conn,
Connection as Conn, ITEMS_PER_PAGE,
};
use diesel::Connection;
use assert_json_diff::assert_json_eq;
use diesel::{Connection, SaveChangesDsl};
use serde_json::to_value;
pub(crate) fn fill_database(conn: &Conn) -> Vec<User> {
instance_tests::fill_database(conn);
@ -1153,7 +1223,7 @@ pub(crate) mod tests {
Some("invalid_user_password".to_owned()),
)
.unwrap();
let other = NewUser::new_local(
let mut other = NewUser::new_local(
conn,
"other".to_owned(),
"Another user".to_owned(),
@ -1163,9 +1233,73 @@ pub(crate) mod tests {
Some("invalid_other_password".to_owned()),
)
.unwrap();
let avatar = Media::insert(
conn,
NewMedia {
file_path: "static/media/example.png".into(),
alt_text: "Another user".into(),
is_remote: false,
remote_url: None,
sensitive: false,
content_warning: None,
owner_id: other.id,
},
)
.unwrap();
other.avatar_id = Some(avatar.id);
let other = other.save_changes::<User>(conn).unwrap();
vec![admin, user, other]
}
fn fill_pages(
conn: &DbConn,
) -> (
Vec<crate::posts::Post>,
Vec<crate::users::User>,
Vec<crate::blogs::Blog>,
) {
use crate::post_authors::NewPostAuthor;
use crate::posts::NewPost;
let (mut posts, users, blogs) = crate::inbox::tests::fill_database(conn);
let user = &users[0];
let blog = &blogs[0];
for i in 1..(ITEMS_PER_PAGE * 4 + 3) {
let title = format!("Post {}", i);
let content = format!("Content for post {}.", i);
let post = Post::insert(
conn,
NewPost {
blog_id: blog.id,
slug: title.clone(),
title: title.clone(),
content: SafeString::new(&content),
published: true,
license: "CC-0".into(),
creation_date: None,
ap_url: format!("{}/{}", blog.ap_url, title),
subtitle: "".into(),
source: content,
cover_id: None,
},
)
.unwrap();
PostAuthor::insert(
conn,
NewPostAuthor {
post_id: post.id,
author_id: user.id,
},
)
.unwrap();
posts.push(post);
}
(posts, users, blogs)
}
#[test]
fn find_by() {
let conn = db();
@ -1216,11 +1350,11 @@ pub(crate) mod tests {
fn delete() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let inserted = fill_database(&conn);
let inserted = fill_database(conn);
assert!(User::get(&conn, inserted[0].id).is_ok());
inserted[0].delete(&conn).unwrap();
assert!(User::get(&conn, inserted[0].id).is_err());
assert!(User::get(conn, inserted[0].id).is_ok());
inserted[0].delete(conn).unwrap();
assert!(User::get(conn, inserted[0].id).is_err());
Ok(())
});
}
@ -1229,20 +1363,20 @@ pub(crate) mod tests {
fn admin() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
let inserted = fill_database(&conn);
let inserted = fill_database(conn);
let local_inst = Instance::get_local().unwrap();
let mut i = 0;
while local_inst.has_admin(&conn).unwrap() {
while local_inst.has_admin(conn).unwrap() {
assert!(i < 100); // prevent looping indefinitely
local_inst
.main_admin(&conn)
.main_admin(conn)
.unwrap()
.set_role(&conn, Role::Normal)
.set_role(conn, Role::Normal)
.unwrap();
i += 1;
}
inserted[0].set_role(&conn, Role::Admin).unwrap();
assert_eq!(inserted[0].id, local_inst.main_admin(&conn).unwrap().id);
inserted[0].set_role(conn, Role::Admin).unwrap();
assert_eq!(inserted[0].id, local_inst.main_admin(conn).unwrap().id);
Ok(())
});
}
@ -1251,9 +1385,9 @@ pub(crate) mod tests {
fn auth() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
fill_database(&conn);
fill_database(conn);
let test_user = NewUser::new_local(
&conn,
conn,
"test".to_owned(),
"test user".to_owned(),
Role::Normal,
@ -1264,10 +1398,10 @@ pub(crate) mod tests {
.unwrap();
assert_eq!(
User::login(&conn, "test", "test_password").unwrap().id,
User::login(conn, "test", "test_password").unwrap().id,
test_user.id
);
assert!(User::login(&conn, "test", "other_password").is_err());
assert!(User::login(conn, "test", "other_password").is_err());
Ok(())
});
}
@ -1276,26 +1410,26 @@ pub(crate) mod tests {
fn get_local_page() {
let conn = &db();
conn.test_transaction::<_, (), _>(|| {
fill_database(&conn);
fill_database(conn);
let page = User::get_local_page(&conn, (0, 2)).unwrap();
let page = User::get_local_page(conn, (0, 2)).unwrap();
assert_eq!(page.len(), 2);
assert!(page[0].username <= page[1].username);
let mut last_username = User::get_local_page(&conn, (0, 1)).unwrap()[0]
let mut last_username = User::get_local_page(conn, (0, 1)).unwrap()[0]
.username
.clone();
for i in 1..User::count_local(&conn).unwrap() as i32 {
let page = User::get_local_page(&conn, (i, i + 1)).unwrap();
for i in 1..User::count_local(conn).unwrap() as i32 {
let page = User::get_local_page(conn, (i, i + 1)).unwrap();
assert_eq!(page.len(), 1);
assert!(last_username <= page[0].username);
last_username = page[0].username.clone();
}
assert_eq!(
User::get_local_page(&conn, (0, User::count_local(&conn).unwrap() as i32 + 10))
User::get_local_page(conn, (0, User::count_local(conn).unwrap() as i32 + 10))
.unwrap()
.len() as i64,
User::count_local(&conn).unwrap()
User::count_local(conn).unwrap()
);
Ok(())
});
@ -1326,4 +1460,134 @@ pub(crate) mod tests {
Ok(())
});
}
#[test]
fn to_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let users = fill_database(&conn);
let user = &users[0];
let act = user.to_activity(&conn)?;
let expected = json!({
"endpoints": {
"sharedInbox": "https://plu.me/inbox"
},
"followers": "https://plu.me/@/admin/followers",
"id": "https://plu.me/@/admin/",
"inbox": "https://plu.me/@/admin/inbox",
"name": "The admin",
"outbox": "https://plu.me/@/admin/outbox",
"preferredUsername": "admin",
"publicKey": {
"id": "https://plu.me/@/admin/#main-key",
"owner": "https://plu.me/@/admin/",
"publicKeyPem": user.public_key,
},
"summary": "<p dir=\"auto\">Hello there, Im the admin</p>\n",
"type": "Person",
"url": "https://plu.me/@/admin/"
});
assert_json_eq!(to_value(act)?, expected);
let other = &users[2];
let other_act = other.to_activity(&conn)?;
let expected_other = json!({
"endpoints": {
"sharedInbox": "https://plu.me/inbox"
},
"followers": "https://plu.me/@/other/followers",
"icon": {
"url": "https://plu.me/static/media/example.png",
"type": "Image",
},
"id": "https://plu.me/@/other/",
"inbox": "https://plu.me/@/other/inbox",
"name": "Another user",
"outbox": "https://plu.me/@/other/outbox",
"preferredUsername": "other",
"publicKey": {
"id": "https://plu.me/@/other/#main-key",
"owner": "https://plu.me/@/other/",
"publicKeyPem": other.public_key,
},
"summary": "<p dir=\"auto\">Hello there, Im someone else</p>\n",
"type": "Person",
"url": "https://plu.me/@/other/"
});
assert_json_eq!(to_value(other_act)?, expected_other);
Ok(())
});
}
#[test]
fn delete_activity() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let users = fill_database(&conn);
let user = &users[1];
let act = user.delete_activity(&conn)?;
let expected = json!({
"actor": "https://plu.me/@/user/",
"cc": [],
"id": "https://plu.me/@/user/#delete",
"object": {
"id": "https://plu.me/@/user/",
"type": "Tombstone",
},
"to": ["https://www.w3.org/ns/activitystreams#Public"],
"type": "Delete",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn outbox_collection() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let (_pages, users, _blogs) = fill_pages(&conn);
let user = &users[0];
let act = user.outbox_collection(&conn)?;
let expected = json!({
"first": "https://plu.me/@/admin/outbox?page=1",
"last": "https://plu.me/@/admin/outbox?page=5",
"totalItems": 51,
"type": "OrderedCollection",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
#[test]
fn outbox_collection_page() {
let conn = db();
conn.test_transaction::<_, Error, _>(|| {
let users = fill_database(&conn);
let user = &users[0];
let act = user.outbox_collection_page(&conn, (33, 36))?;
let expected = json!({
"items": [],
"partOf": "https://plu.me/@/admin/outbox",
"prev": "https://plu.me/@/admin/outbox?page=2",
"type": "OrderedCollectionPage",
});
assert_json_eq!(to_value(act)?, expected);
Ok(())
});
}
}
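The values asserted in the outbox_collection() test follow from the fixture: fill_pages() inserts ITEMS_PER_PAGE * 4 + 2 posts on top of what the inbox test fixture already provides, and the last page uses the formula from outbox_collection(). Assuming ITEMS_PER_PAGE is 12 and the inbox fixture contributes a single post by this author (which is what the expected JSON implies), the arithmetic checks out:

// Worked check of the expected totalItems/last values, under the stated assumptions.
const ITEMS_PER_PAGE: i64 = 12; // assumption: Plume's page size
fn main() {
    let generated = ITEMS_PER_PAGE * 4 + 2; // posts inserted by fill_pages()
    let total_items = generated + 1; // plus the post from the inbox fixture (assumed)
    let last_page = total_items / ITEMS_PER_PAGE + 1; // formula from outbox_collection()
    assert_eq!(total_items, 51);
    assert_eq!(last_page, 5);
}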


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Afrikaans\n"
"Language: af_ZA\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr ""
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr ""
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr ""
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr ""
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr ""
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr ""


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Arabic\n"
"Language: ar_SA\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "فتح محرر النصوص الغني"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "العنوان"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "العنوان الثانوي أو الملخص"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "اكتب مقالك هنا. ماركداون مُدَعَّم."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "يتبقا {} حرفا تقريبا"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "الوسوم"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "الرخصة"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "الغلاف"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "هذه مسودة"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "نشر كتابا"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Bulgarian\n"
"Language: bg_BG\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "Искате ли да активирате локално автоматично запаметяване, последно редактирано в {}?"
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Отворете редактора с богат текст"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Заглавие"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Подзаглавие или резюме"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Напишете статията си тук. Поддържа се Markdown."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Остават {} знака вляво"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Етикети"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Лиценз"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Основно изображение"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Това е проект"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Публикувай"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Catalan\n"
"Language: ca_ES\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Obre leditor de text enriquit"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Títol"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Subtítol o resum"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Escriviu el vostre article ací. Podeu fer servir el Markdown."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Queden uns {} caràcters"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Etiquetes"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Llicència"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Coberta"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Açò és un esborrany"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Publica"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2022-01-02 11:39\n"
"PO-Revision-Date: 2022-05-09 09:58\n"
"Last-Translator: \n"
"Language-Team: Czech\n"
"Language: cs_CZ\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Otevřít editor formátovaného textu"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Nadpis"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Podnadpis, nebo shrnutí"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Sem napište svůj článek. Markdown je podporován."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Zbývá kolem {} znaků"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Tagy"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Licence"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Titulka"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Tohle je koncept"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Zveřejnit"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Danish\n"
"Language: da_DK\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr ""
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr ""
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr ""
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr ""
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr ""
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr ""


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-12-11 15:00\n"
"PO-Revision-Date: 2022-01-26 13:16\n"
"Last-Translator: \n"
"Language-Team: German\n"
"Language: de_DE\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "Möchten Sie die lokale automatische Speicherung laden, die zuletzt um {} bearbeitet wurde?"
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr " Rich Text Editor (RTE) öffnen"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Titel"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Untertitel oder Zusammenfassung"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Schreiben deinen Artikel hier. Markdown wird unterstützt."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Ungefähr {} Zeichen übrig"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Schlagwörter"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Lizenz"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Einband"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Dies ist ein Entwurf"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Veröffentlichen"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Greek\n"
"Language: el_GR\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr ""
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr ""
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr ""
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr ""
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr ""
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr ""


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: English\n"
"Language: en_US\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr ""
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr ""
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr ""
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr ""
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr ""
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr ""


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Esperanto\n"
"Language: eo_UY\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Malfermi la riĉan redaktilon"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Titolo"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Verku vian artikolon ĉi tie. Markdown estas subtenita."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Proksimume {} signoj restantaj"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Etikedoj"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Permesilo"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Kovro"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Malfinias"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Eldoni"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2022-01-02 11:39\n"
"PO-Revision-Date: 2022-01-26 13:16\n"
"Last-Translator: \n"
"Language-Team: Spanish\n"
"Language: es_ES\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "¿Quieres cargar el guardado automático local editado por última vez en {}?"
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Abrir el editor de texto enriquecido"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Título"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Subtítulo, o resumen"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Escriba su artículo aquí. Puede utilizar Markdown."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Quedan unos {} caracteres"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Etiquetas"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Licencia"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Cubierta"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Esto es un borrador"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Publicar"

po/plume-front/eu.po (new file, 63 lines added)

@ -0,0 +1,63 @@
msgid ""
msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2022-05-09 09:58\n"
"Last-Translator: \n"
"Language-Team: Basque\n"
"Language: eu_ES\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"X-Crowdin-Project: plume\n"
"X-Crowdin-Project-ID: 352097\n"
"X-Crowdin-Language: eu\n"
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "{}(t)an automatikoki gordetako azken kopia lokala kargatu nahi al duzu?"
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Ireki testu-formatutzaile aberatsa"
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Izenburua"
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Azpititulua edo laburpena"
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Idatzi hemen testua. Markdown erabil dezakezu."
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "%{count} karaktere geratzen dira"
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Etiketak"
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Lizentzia"
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Azala"
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Zirriborro bat da"
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Argitaratu"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-12-11 15:00\n"
"PO-Revision-Date: 2022-05-10 17:54\n"
"Last-Translator: \n"
"Language-Team: Persian\n"
"Language: fa_IR\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "آیا می‌خواهید نسخهٔ ذخیره شدهٔ خودکار محلّی از آخرین ویرایش در {} را بار کنید؟"
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "باز کردن ویرایش‌گر غنی"
msgstr "گشودن ویرایشگر غنی"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "عنوان"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "زیرعنوان، یا چکیده"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "مقاله‌تان را این‌جا بنویسید. از مارک‌داون پشتیبانی می‌شود."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "نزدیک به {} حرف باقی مانده است"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "برچسب‌ها"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "پروانه"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "جلد"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "این، یک پیش‌نویس است"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "انتشار"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Finnish\n"
"Language: fi_FI\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Avaa edistynyt tekstieditori"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Otsikko"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Alaotsikko tai tiivistelmä"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Kirjoita artikkelisi tähän. Markdown -kuvauskieli on tuettu."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "%{count} merkkiä jäljellä"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Tagit"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Lisenssi"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Kansi"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Tämä on luonnos"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Julkaise"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-12-11 15:00\n"
"PO-Revision-Date: 2022-05-09 09:59\n"
"Last-Translator: \n"
"Language-Team: French\n"
"Language: fr_FR\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "Voulez vous charger la sauvegarde automatique locale, éditée la dernière fois à {}?"
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Ouvrir l'éditeur de texte avancé"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Titre"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Sous-titre ou résumé"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Écrivez votre article ici. Vous pouvez utiliser du Markdown."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Environ {} caractères restant"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Étiquettes"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Licence"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Illustration"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Ceci est un brouillon"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Publier"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-26 13:16\n"
"Last-Translator: \n"
"Language-Team: Galician\n"
"Language: gl_ES\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr "Queres cargar a última copia gardada editada o {}?"
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr "Abre o editor de texto enriquecido"
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Título"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr "Subtítulo, ou resumo"
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "Escribe aquí o teu artigo: podes utilizar Markdown."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "Dispós de {} caracteres"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Etiquetas"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Licenza"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr "Portada"
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr "Este é un borrador"
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Publicar"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Hebrew\n"
"Language: he_IL\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr ""
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr ""
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr ""
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr ""
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr ""
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr ""


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Hindi\n"
"Language: hi_IN\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "शीर्षक"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr "अपना आर्टिकल या लेख यहाँ लिखें. Markdown उपलब्ध है."
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr "लगभग {} अक्षर बाकी हैं"
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "टैग्स"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "लाइसेंस"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "पब्लिश करें"


@ -3,7 +3,7 @@ msgstr ""
"Project-Id-Version: plume\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-15 16:33-0700\n"
"PO-Revision-Date: 2021-05-05 18:31\n"
"PO-Revision-Date: 2022-01-12 01:20\n"
"Last-Translator: \n"
"Language-Team: Croatian\n"
"Language: hr_HR\n"
@ -17,47 +17,47 @@ msgstr ""
"X-Crowdin-File: /master/po/plume-front/plume-front.pot\n"
"X-Crowdin-File-ID: 12\n"
# plume-front/src/editor.rs:188
# plume-front/src/editor.rs:172
msgid "Do you want to load the local autosave last edited at {}?"
msgstr ""
# plume-front/src/editor.rs:281
# plume-front/src/editor.rs:326
msgid "Open the rich text editor"
msgstr ""
# plume-front/src/editor.rs:314
# plume-front/src/editor.rs:385
msgid "Title"
msgstr "Naslov"
# plume-front/src/editor.rs:318
# plume-front/src/editor.rs:389
msgid "Subtitle, or summary"
msgstr ""
# plume-front/src/editor.rs:325
# plume-front/src/editor.rs:396
msgid "Write your article here. Markdown is supported."
msgstr ""
# plume-front/src/editor.rs:336
# plume-front/src/editor.rs:407
msgid "Around {} characters left"
msgstr ""
# plume-front/src/editor.rs:413
# plume-front/src/editor.rs:517
msgid "Tags"
msgstr "Tagovi"
# plume-front/src/editor.rs:414
# plume-front/src/editor.rs:518
msgid "License"
msgstr "Licenca"
# plume-front/src/editor.rs:417
# plume-front/src/editor.rs:524
msgid "Cover"
msgstr ""
# plume-front/src/editor.rs:437
# plume-front/src/editor.rs:564
msgid "This is a draft"
msgstr ""
# plume-front/src/editor.rs:444
# plume-front/src/editor.rs:575
msgid "Publish"
msgstr "Objavi"

Some files were not shown because too many files have changed in this diff.