Compare commits

228 commits

Author SHA1 Message Date
FloatingGhost ccdf55acff Fix instance name in email test 2022-11-04 18:42:12 +00:00
floatingghost cc6d760814 Merge pull request 'Fix typo in CSP Report-To header name' (#250) from tcit/akkoma:fix-typo-in-csp-report-to-header-name into develop
Reviewed-on: AkkomaGang/akkoma#250
2022-11-04 18:41:26 +00:00
Thomas Citharel 4d0a51221a
Fix typo in CSP Report-To header name
The header name was Report-To, not Reply-To.

In any case, that's now being changed to the Reporting-Endpoints HTTP
Response Header.
https://w3c.github.io/reporting/#header
https://github.com/w3c/reporting/issues/177

CanIUse says the Report-To header is still supported by current Chrome
and friends.
https://caniuse.com/mdn-http_headers_report-to

It doesn't have any data for the Reporting-Endpoints HTTP header, but
this article says Chrome 96 supports it.
https://web.dev/reporting-api/

(Even though that came out a year ago, it's not compatible with Network
Error Logging, which still uses the Report-To version of the API.)
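For illustration, here is a minimal sketch (not Akkoma's actual CSP plug; the module name and endpoint URL are made up) of how the legacy Report-To header and the newer Reporting-Endpoints header could be emitted from a Plug pipeline:

    defmodule ExampleReportingHeaders do
      # Illustrative only; assumes Plug is available, as in any Phoenix app.
      import Plug.Conn

      # Legacy form: Report-To carries a JSON endpoint-group definition.
      def put_report_to(conn, endpoint_url) do
        put_resp_header(
          conn,
          "report-to",
          ~s({"group":"csp-endpoint","max_age":10886400,"endpoints":[{"url":"#{endpoint_url}"}]})
        )
      end

      # Newer form: Reporting-Endpoints is a simple structured-field header.
      def put_reporting_endpoints(conn, endpoint_url) do
        put_resp_header(conn, "reporting-endpoints", ~s(csp-endpoint="#{endpoint_url}"))
      end
    end

In both cases the CSP directive would then reference the endpoint group by name, e.g. `report-to csp-endpoint`.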

Signed-off-by: Thomas Citharel <tcit@tcit.fr>
2022-11-04 15:02:13 +01:00
FloatingGhost 7cfce562a9 Add default favicon
Fixes pleroma-fe#185
2022-11-02 22:38:02 +00:00
floatingghost 346f72b471 Merge pull request 'Change default instance name to "Akkoma"' (#248) from norm/akkoma:default-instance-name into develop
Reviewed-on: AkkomaGang/akkoma#248
2022-11-02 01:29:42 +00:00
Norm 9682ec4c5f Change default instance name to "Akkoma"
This was left at "Pleroma" for some reason.
2022-11-01 20:52:17 +00:00
floatingghost 9038da01cc Merge pull request 'Push.Impl: support edits' (#244) from norm/akkoma:push-support-edits into develop
Reviewed-on: AkkomaGang/akkoma#244
2022-11-01 15:14:08 +00:00
floatingghost e44e147b54 Merge pull request 'fix flaky test_user_relationship_test.exs:81' (#240) from ilja/akkoma:fix_flaky_test_user_relationship_test.exs_81 into develop
Reviewed-on: AkkomaGang/akkoma#240
2022-11-01 14:44:23 +00:00
floatingghost d5bbc3eeb2 Merge pull request 'fix flaky test filter_controller_test.exs:200' (#239) from ilja/akkoma:fix_flaky_filter_controller_test.exs_200 into develop
Reviewed-on: AkkomaGang/akkoma#239
2022-11-01 14:42:43 +00:00
floatingghost 479542c692 Merge pull request 'fix flaky participation_test.exs' (#238) from ilja/akkoma:fix_erratic_participation_test into develop
Reviewed-on: AkkomaGang/akkoma#238
2022-11-01 14:37:06 +00:00
ilja be5044f785 fix_flaky_transfer_task_test.exs (#237)
There were async calls happening, so they weren't always finished when assert happened.

I also fixed some bugs in the erratic tests that were introduced when removing :shout. :shout is a key that requires a restart, and it was changed in the test to use :rate_limit (which also requires a restart). But there was a syntax bug that didn't get caught because the test was tagged as erratic and therefore didn't fail. That is fixed here.

During compilation, we had a warning `:logger is used by the current application but the current application does not depend on :logger` which is now fixed as well (see commit message for complete stacktrace).
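As a generic illustration of that kind of race (not the actual TransferTask test): an asynchronous cast followed immediately by an assertion can be made deterministic by forcing a synchronous round-trip through the same mailbox first.

    # The Agent and the :sys.get_state/1 trick are illustrative; any synchronous call
    # that goes through the same process mailbox drains the earlier cast before returning.
    {:ok, pid} = Agent.start_link(fn -> 0 end)

    Agent.cast(pid, fn n -> n + 1 end)   # async: may not have run yet
    _ = :sys.get_state(pid)              # sync round-trip: the cast is now processed

    1 = Agent.get(pid, & &1)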

Co-authored-by: Ilja <ilja@ilja.space>
Reviewed-on: AkkomaGang/akkoma#237
Co-authored-by: ilja <akkoma.dev@ilja.space>
Co-committed-by: ilja <akkoma.dev@ilja.space>
2022-11-01 14:31:29 +00:00
ilja f1dfd76b98 Fix rate_limiter_test.exs test "it restricts based on config values" (#233)
Fixes one of the 'erratic' tests

It used a timer to sleep.
But time also goes on when doing other things, so depending on hardware, the timings could be off.
I slightly changed the tests so we still test what we functionally want.
Instead of waiting until the cache expires, I now have a function to expire the cache entry and use that.

That means we're no longer testing whether the cache really expires after a certain amount of time,
but that's the responsibility of the dependency imo, so it shouldn't be a problem.

I also changed `Pleroma.Web.Endpoint, :http, :ip` in the tests to `127.0.0.1`.
Previously it was set to 8.8.8.8; I see no reason for that and, while I assume that no calls
are made to it, it may come across as weird or suspicious to people.
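A toy, self-contained version of that idea (nothing here is the real rate limiter; the names and TTL are made up): the test ages the stored timestamp instead of sleeping for the TTL.

    defmodule ToyLimiter do
      @ttl_ms 60_000

      def allow?(table, key, now \\ System.monotonic_time(:millisecond)) do
        case :ets.lookup(table, key) do
          [{^key, ts}] when now - ts < @ttl_ms -> false
          _ -> :ets.insert(table, {key, now})
        end
      end

      # Test helper: pretend the TTL elapsed by moving the timestamp into the past.
      def expire(table, key) do
        case :ets.lookup(table, key) do
          [{^key, ts}] -> :ets.insert(table, {key, ts - @ttl_ms})
          [] -> true
        end
      end
    end

    table = :ets.new(:toy_limiter, [:set, :public])
    true = ToyLimiter.allow?(table, "1.2.3.4")
    false = ToyLimiter.allow?(table, "1.2.3.4")
    ToyLimiter.expire(table, "1.2.3.4")
    true = ToyLimiter.allow?(table, "1.2.3.4")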

Co-authored-by: Ilja <ilja@ilja.space>
Reviewed-on: AkkomaGang/akkoma#233
Co-authored-by: ilja <akkoma.dev@ilja.space>
Co-committed-by: ilja <akkoma.dev@ilja.space>
2022-11-01 14:25:54 +00:00
FloatingGhost 1bb8b76311 Fix tests in ldap registration 2022-11-01 14:21:35 +00:00
nullobsi cbc693f832 Fix LDAP user registration (#229)
Simple fix for LDAP user registration. I'm not sure what changed but I managed to get Akkoma running in a debug session and figured out it was missing a match for an extra value at the end. I don't know Elixir all that well so I'm not sure if this was the correct way to do it... but it works. :)

Reviewed-on: AkkomaGang/akkoma#229
Co-authored-by: nullobsi <me@nullob.si>
Co-committed-by: nullobsi <me@nullob.si>
2022-11-01 14:17:55 +00:00
FloatingGhost d782140e2b Reword stop gifs 2022-10-29 22:08:18 +01:00
FloatingGhost 4d9ca8909d Add StopGifs to description 2022-10-29 21:57:50 +01:00
marcin mikołajczak 6486211064
Push.Impl: support edits
Signed-off-by: marcin mikołajczak <git@mkljczk.pl>
2022-10-28 01:20:19 -04:00
ilja 3562eaeedc fix flaky test_user_relationship_test.exs:81
The problem was twofold. On the one hand, the function didn't actually return what was in the DB.
On the other hand, the test was flaky because it used NaiveDateTime.utc_now(), so the test could fail or pass depending on a difference of microseconds.

Both are fixed now.
2022-10-23 13:31:01 +02:00
Ilja a59d310982 fix flaky test filter_controller_test.exs:200 2022-10-23 13:07:02 +02:00
ilja e6ceea3553 fix flaky participation_test.exs
The test checked whether the updated_at after marking as "read" was equal to the updated_at at insertion, but that seems wrong.
Firstly, if a record is updated, you expect the updated_at to also update.
Secondly, the insert and update happen almost at the same time, so it's flaky regardless.

Here I make sure it has a much older updated_at during insert so we can clearly see the effect after the update.
I also check that the updated_at is actually updated, because I expect that this is the intended behaviour and it's also the current behaviour.
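In code, the approach sketched above looks roughly like this (illustrative; the insert and mark-as-read steps are left as comments because they depend on the test factories):

    # Give the row an updated_at far in the past, so any later update is clearly visible.
    old_updated_at =
      NaiveDateTime.utc_now()
      |> NaiveDateTime.add(-86_400)          # one day ago, in seconds
      |> NaiveDateTime.truncate(:second)

    # ...insert the participation with `updated_at: old_updated_at`, mark it as read,
    # reload it, and then assert that the timestamp really moved forward:
    #
    #   assert NaiveDateTime.compare(participation.updated_at, old_updated_at) == :gt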
2022-10-23 12:33:31 +02:00
FloatingGhost 16a31872fe document local_bubble 2022-10-21 10:23:07 +01:00
FloatingGhost 7bb6df2d5b Remove unused DATA arg 2022-10-20 13:26:00 +01:00
floatingghost f36d14818d Unilateral remove from followers (#232)
from https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3647/

Co-authored-by: marcin mikołajczak <git@mkljczk.pl>
Co-authored-by: Tusooa Zhu <tusooa@kazv.moe>
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#232
2022-10-19 10:01:14 +00:00
FloatingGhost 5231d436d1 Add docker migration guide 2022-10-18 16:16:55 +01:00
FloatingGhost deba1d25f5 add DB restart to docker file 2022-10-17 16:29:36 +01:00
floatingghost 66f913355a Docker builds (#231)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#231
2022-10-16 19:25:54 +00:00
FloatingGhost 60b3c8d17b bump version 2022-10-14 12:49:35 +01:00
floatingghost edf7d5089f Merge pull request 'Check that the signature matches the creator' (#230) from domain-blocks into develop
Reviewed-on: AkkomaGang/akkoma#230
2022-10-14 11:41:34 +00:00
floatingghost d4bdd3ddb7 Merge pull request 'SQL optimisations' (#227) from i-hate-sql into develop
Reviewed-on: AkkomaGang/akkoma#227
2022-10-14 10:49:02 +00:00
FloatingGhost 03662501c3 Check that the signature matches the creator 2022-10-14 11:48:32 +01:00
FloatingGhost 856c57208b Ensure deletes are handled after everything else 2022-10-11 14:30:08 +01:00
FloatingGhost cb9b0d3720 optimise notifications query 2022-10-11 11:40:43 +01:00
FloatingGhost 8af50dea36 format 2022-10-10 17:13:42 +01:00
FloatingGhost ca9e6ffc55 Use inner lateral join to not get dropped in :total 2022-10-10 16:45:02 +01:00
FloatingGhost 574f010bc8 Extract deactivated users query to a join 2022-10-10 15:55:58 +01:00
floatingghost c6e63aaf6b Backend settings sync (#226)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#226
2022-10-06 16:22:15 +00:00
floatingghost 07295f7c8c Merge pull request 'include requirement to enable HTTP tunnel in tor' (#224) from tor-docs into develop
Reviewed-on: AkkomaGang/akkoma#224
2022-09-20 13:43:14 +00:00
FloatingGhost 47a793f33e include requirement to enable HTTP tunnel in tor 2022-09-20 14:40:32 +01:00
floatingghost 7775cefd73 Merge pull request 'ensure we use the same OTP for all releases' (#223) from update-otp-version into develop
Reviewed-on: AkkomaGang/akkoma#223
2022-09-20 12:33:16 +00:00
FloatingGhost 69099d6f44 ensure we use the same OTP for all releases 2022-09-20 12:20:54 +01:00
floatingghost 5827f7781f Add installation note about flavour, support special cases (#222)
Fixes #210

Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#222
2022-09-20 11:04:26 +00:00
floatingghost b2aa82cee5 Fix false error in meilisearch index (#221)
the schema changed

https://docs.meilisearch.com/reference/api/documents.html#add-or-update-documents

this wasn't breaking anything, it would just report errors that were actually successes

Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#221
2022-09-20 10:36:21 +00:00
floatingghost 9b2c169cef Merge pull request 'Move remote user interaction changelog entry to correct version' (#219) from norm/akkoma:changelog-remote-user-interaction into develop
Reviewed-on: AkkomaGang/akkoma#219
2022-09-19 17:33:32 +00:00
Norm 561e1f2470 Make backups require its own scope (#218)
Pulled from https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3721.

This makes backups require its own scope (`read:backups`) instead of the `read:accounts` scope.

Co-authored-by: Tusooa Zhu <tusooa@kazv.moe>
Reviewed-on: AkkomaGang/akkoma#218
Co-authored-by: Norm <normandy@biribiri.dev>
Co-committed-by: Norm <normandy@biribiri.dev>
2022-09-19 17:31:35 +00:00
floatingghost 0aabe4d0c3 Merge pull request 'Update soapbox-fe base url' (#220) from lou_de_sel/akkoma:lou_de_sel-patch-1 into develop
Reviewed-on: AkkomaGang/akkoma#220
2022-09-19 17:30:04 +00:00
lou_de_sel 8fe59d495d Update soapbox base url
At some point 'soapbox-pub/soapbox-fe' was moved to 'soapbox-pub/soapbox' and the build url is now updated.
2022-09-18 07:45:30 +00:00
Norm 84f8f32ef9 Move remote user interaction changelog entry to correct version
That feature was added in 2022.09, not 2022.08.
2022-09-18 03:21:05 +00:00
FloatingGhost ad1a6d3dc2 ensure queue_target can't be silly low 2022-09-16 14:23:31 +01:00
FloatingGhost ee2eb7752d Ensure rollback succeeds 2022-09-16 13:00:40 +01:00
floatingghost 4e01e1bf72 Merge pull request 'User: search: exclude deactivated users from user search' (#214) from norm/akkoma:exclude-deactivated-search into develop
Reviewed-on: AkkomaGang/akkoma#214
2022-09-16 11:56:00 +00:00
floatingghost 911f8feb0a Ensure migrations succeed (#216)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#216
2022-09-16 11:53:11 +00:00
a1batross 77596a3021
User: search: exclude deactivated users from user search
This way we don't pollute search results with deactivated and deleted users
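A hedged sketch of the idea (the schema source and column name here are illustrative, not necessarily Akkoma's exact ones): add a filter so inactive accounts never enter the search ranking at all.

    import Ecto.Query

    term = "alice"   # example search term

    query =
      from(u in "users",
        where: ilike(u.nickname, ^"%#{term}%"),
        where: u.is_active == true,   # assumed column; excludes deactivated/deleted users
        select: u.nickname
      )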
2022-09-15 21:21:06 -04:00
Norm 00f840fd44 Update styles.json path in frontend config doc (#212)
Co-authored-by: Francis Dinh <normandy@biribiri.dev>
Reviewed-on: AkkomaGang/akkoma#212
Co-authored-by: Norm <normandy@biribiri.dev>
Co-committed-by: Norm <normandy@biribiri.dev>
2022-09-14 10:20:07 +00:00
Tusooa Zhu 4c06c4ecb1 Add margin to forms and make inputs fill whole width 2022-09-11 20:30:03 +01:00
Tusooa Zhu 2aa8e66527 Fix User.get_or_fetch/1 with usernames starting with http 2022-09-11 20:29:05 +01:00
floatingghost dbe678cb06 Merge pull request 'pleroma-cherry-picks' (#209) from pleroma-cherry-picks into develop
Reviewed-on: AkkomaGang/akkoma#209
2022-09-11 19:28:06 +00:00
FloatingGhost b4261b0335 Use set of pregenerated RSA keys
Randomness is a huge resource sink, so let's just use
some that we made earlier
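A minimal sketch of that idea for a test suite (illustrative only, not Akkoma's actual helper): generate a few keys once in test_helper.exs and reuse them from the user factory.

    # In test_helper.exs (hypothetical): pay the keygen cost once.
    keys = for _ <- 1..4, do: :public_key.generate_key({:rsa, 2048, 65_537})
    :persistent_term.put(:test_rsa_keys, keys)

    # In the user factory (hypothetical): hand out one of the pregenerated keys.
    key = Enum.random(:persistent_term.get(:test_rsa_keys))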
2022-09-11 20:14:58 +01:00
Hélène 1acd38fe7f OAuthPlug: use user cache instead of joining
As this plug is called on every request, this should reduce load on the
database by not requiring to select on the users table every single
time, and to instead use the by-ID user cache whenever possible.
2022-09-11 19:55:55 +01:00
Hélène 3e2d15c71d emoji-test: update to latest 15.0 draft 2022-09-11 19:55:45 +01:00
Hélène 8683252fc5 Metadata/Utils: use summary as description if set
When generating OpenGraph and TwitterCard metadata for a post, the
summary field will be used first if it is set to generate the post
description.
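The precedence described above boils down to something like this (an illustrative pattern only, not the actual Metadata/Utils code):

    object = %{"summary" => "CW: example", "content" => "<p>post body</p>"}   # example data

    description =
      case object do
        %{"summary" => summary} when is_binary(summary) and summary != "" -> summary
        %{"content" => content} -> content
        _ -> ""
      end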
2022-09-11 19:55:38 +01:00
Hélène 0b14f02ed2 User: generate private keys on user creation
This fixes a race condition bug where keys could be regenerated
post-federation, causing activities and HTTP signatures from a user to
be dropped due to key differences.
2022-09-11 19:54:37 +01:00
Hélène b6891fe190 Migrations: generate unset user keys
User keys are now generated on user creation instead of "when needed",
to prevent race conditions in federation and a few other issues. This
migration will generate keys missing for local users.
2022-09-11 19:53:31 +01:00
Hélène e88f36f72b ObjectView: do not fetch an object for its ID
Non-Create/Listen activities had their associated object field
normalized and fetched, but only to use their `id` field, which is both
slow and redundant. This also failed on Undo activities, which delete
the associated object/activity in the database.

Undo activities will now render properly and database loads should
improve ever so slightly.
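Conceptually the change is just this (an illustrative pattern, not the actual ObjectView code): read the id off the activity data instead of fetching and re-rendering the whole object.

    activity_data = %{"type" => "Announce", "object" => "https://example.com/objects/1"}

    object_id =
      case activity_data["object"] do
        %{"id" => id} -> id            # embedded object
        id when is_binary(id) -> id    # bare IRI: no fetch needed just to get the id
      end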
2022-09-11 19:52:59 +01:00
FloatingGhost dfba26a09c Revert "use prebuilt image for docs"
This reverts commit ef4282b348.
2022-09-10 21:08:22 +01:00
FloatingGhost f376eb7106 Revert "tmp: use akkoma build image"
This reverts commit cad2745734.
2022-09-10 21:08:06 +01:00
FloatingGhost ef4282b348 use prebuilt image for docs 2022-09-10 17:13:25 +01:00
FloatingGhost cad2745734 tmp: use akkoma build image 2022-09-10 16:48:46 +01:00
floatingghost b8190f19dc 2022.09 stable release chores (#206)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#206
2022-09-10 14:44:17 +00:00
Norm a6d85003fe Remote interaction with posts (#198)
Grabbed from https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3587

Co-authored-by: Tusooa Zhu <tusooa@kazv.moe>
Reviewed-on: AkkomaGang/akkoma#198
Co-authored-by: Norm <normandy@biribiri.dev>
Co-committed-by: Norm <normandy@biribiri.dev>
2022-09-08 10:19:22 +00:00
Norm 7af32634be Remove gitlab files (#203)
These don't really serve a purpose now and aren't even recognized by
Gitea.

Co-authored-by: Francis Dinh <normandy@biribiri.dev>
Reviewed-on: AkkomaGang/akkoma#203
Co-authored-by: Norm <normandy@biribiri.dev>
Co-committed-by: Norm <normandy@biribiri.dev>
2022-09-08 09:54:02 +00:00
floatingghost 2641dcdd15 Post editing (#202)
Rebased from #103

Co-authored-by: Tusooa Zhu <tusooa@kazv.moe>
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#202
2022-09-06 19:24:02 +00:00
FloatingGhost 6c80977b06 turn inlineQuotePolicy on by default 2022-09-05 17:22:33 +01:00
FloatingGhost f6304cfd78 add extra tests for builder 2022-09-05 01:24:40 +01:00
FloatingGhost 1c7d7845c3 fix compilation warnings 2022-09-05 00:39:32 +01:00
floatingghost 1b826eea54 Allow reacting with remote emoji when they exist on the post (#200)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#200
2022-09-04 23:31:41 +00:00
floatingghost 7a90d71e8d ensure .exs config is used before default (#197)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#197
2022-09-02 22:05:39 +00:00
floatingghost 8e4de118c1 Don't persist local undone follow (#194)
same deal but backwards this time

Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#194
2022-08-31 18:00:36 +00:00
floatingghost decbca0c91 add separate source and dest entries in language listing (#193)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#193
2022-08-30 16:59:33 +00:00
floatingghost c3fde9577d Allow listing languages, setting source language (#192)
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#192
2022-08-30 14:58:54 +00:00
FloatingGhost 25111bb407 include frontend installation document on all install guides 2022-08-30 10:56:33 +01:00
FloatingGhost 9cb41b6d7b add extra instructions to placeholder page 2022-08-30 10:39:36 +01:00
FloatingGhost 7759187de9 ensure default value is sane 2022-08-29 22:20:47 +01:00
floatingghost df39cab9c1 Automatic status translation (#187)
Fixes #115

Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#187
2022-08-29 19:42:22 +00:00
FloatingGhost 722e56b308 add changelog entry 2022-08-27 19:12:15 +01:00
Tusooa Zhu 95e4018c1a Disconnect streaming sessions when token is revoked
Use Websockex to replace websocket_client

Test that server will disconnect websocket upon token revocation

Lint

Execute session disconnect in background

Refactor streamer test

allow multi-streams

rebase websocket change
2022-08-27 19:07:48 +01:00
floatingghost 772c209914 GTS: cherry-picks and collection usage (#186)
https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3725?commit_id=61254111e59f02118cad15de49d1e0704c07030e

what is this, a yoink of a yoink? good times

Co-authored-by: Hélène <pleroma-dev@helene.moe>
Co-authored-by: FloatingGhost <hannah@coffee-and-dreams.uk>
Reviewed-on: AkkomaGang/akkoma#186
2022-08-27 18:05:48 +00:00
floatingghost f32e288711 Merge pull request 'Add ability to obfuscate domains in MRF transparency' (#185) from obfuscate-domain-blocks into develop
Reviewed-on: AkkomaGang/akkoma#185
2022-08-27 11:11:56 +00:00
FloatingGhost 85137f591f Add ability to obfuscate domains in MRF transparency 2022-08-27 11:57:57 +01:00
floatingghost f11a6eb8dd Merge pull request 'Update min elixir version in mix.exs to 1.12' (#184) from norm/akkoma:mix-exs-elixir into develop
Reviewed-on: AkkomaGang/akkoma#184
2022-08-25 19:39:56 +00:00
Norm db7ad08d1e Update min elixir version in mix.exs to 1.12
The install docs already mention 1.12 as the minimum supported version, so this should also be reflected in `mix.exs`.
2022-08-25 18:30:19 +00:00
floatingghost e4f2251e0f Add support for setting language in instance metadata (#183)
Reviewed-on: AkkomaGang/akkoma#183
2022-08-25 16:11:21 +00:00
floatingghost 618cf7ff7f reuse valid oauth tokens (#182)
Reviewed-on: AkkomaGang/akkoma#182
2022-08-25 14:37:51 +00:00
FloatingGhost 017b50550b add changelog entry 2022-08-24 15:38:02 +01:00
floatingghost 92ba2802fb generate-keys-at-registration-time (#181)
Reviewed-on: AkkomaGang/akkoma#181
2022-08-24 14:36:33 +00:00
FloatingGhost fd7f4874ba allow new mfm classes 2022-08-24 10:06:48 +01:00
floatingghost c40b45e675 Merge pull request 'add maintained language tags' (#180) from add-language-tags into develop
Reviewed-on: AkkomaGang/akkoma#180
2022-08-23 15:11:14 +00:00
FloatingGhost 9b6feb6657 add language tags 2022-08-23 16:10:19 +01:00
FloatingGhost 3cf8c1eb31 use public temple dep 2022-08-23 12:13:35 +01:00
FloatingGhost 152c43ac9e update mfm_parser 2022-08-23 12:09:01 +01:00
FloatingGhost 8d7b63a766 Revert "Fix oauth2 (for real) (#179)"
This reverts commit aa681d7e15.
2022-08-21 17:52:02 +01:00
floatingghost aa681d7e15 Fix oauth2 (for real) (#179)
Reviewed-on: AkkomaGang/akkoma#179
2022-08-21 16:24:37 +00:00
FloatingGhost b0130bfa7b Revert "oauth2 fixes (#177)"
This reverts commit 429e2ac832.
2022-08-21 16:22:15 +01:00
floatingghost d72f9e39d9 add visibility check on quote (#178)
Reviewed-on: AkkomaGang/akkoma#178
2022-08-21 15:17:01 +00:00
floatingghost 429e2ac832 oauth2 fixes (#177)
Reviewed-on: AkkomaGang/akkoma#177
2022-08-21 14:46:52 +00:00
floatingghost f8dffa6126 Merge pull request 'Update OTP OpenRC service' (#174) from norm/akkoma:otp-openrc into develop
Reviewed-on: AkkomaGang/akkoma#174
2022-08-19 09:25:12 +00:00
Norm ffbf8304e0 Update OTP OpenRC service
This makes the paths match that of the OTP install guide on OpenRC distros.
2022-08-18 23:13:09 +00:00
floatingghost 59b886e86e Merge pull request 'Update OTP systemd service' (#173) from kawen/akkoma:develop into develop
Reviewed-on: AkkomaGang/akkoma#173
2022-08-18 10:51:46 +00:00
Karen Konou 22333f13e8 Update OTP systemd service 2022-08-18 12:43:20 +02:00
FloatingGhost a8f8ecce31 add changelog entry 2022-08-18 04:23:07 +01:00
floatingghost e9f1897cfd parser MFM server-side (#172)
Reviewed-on: AkkomaGang/akkoma#172
2022-08-18 03:14:48 +00:00
floatingghost aaf78e2b52 only put linked mfm in source (#171)
Reviewed-on: AkkomaGang/akkoma#171
2022-08-17 09:35:11 +00:00
floatingghost 11ec9daa5b API compatibility with fedibird, frontend config (#163)
Reviewed-on: AkkomaGang/akkoma#163
2022-08-17 00:22:59 +00:00
floatingghost 89ffc01c23 only return create objects for ES search (#165)
Reviewed-on: AkkomaGang/akkoma#165
2022-08-16 23:24:19 +00:00
floatingghost 61641957cb fix compatibility with meilisearch (#164)
Reviewed-on: AkkomaGang/akkoma#164
2022-08-16 22:56:49 +00:00
floatingghost 37a1001b97 add finch outbound proxy support (#158)
Reviewed-on: AkkomaGang/akkoma#158
2022-08-14 23:13:49 +00:00
FloatingGhost 5796d81d98 Merge branch 'stable' into develop 2022-08-12 16:06:30 +01:00
floatingghost 7544939c83 Merge pull request 'stable tag' (#159) from stable-tag into develop
Reviewed-on: AkkomaGang/akkoma#159
2022-08-12 15:04:05 +00:00
FloatingGhost 5192e21e53 bump version to 3.1.0 2022-08-12 16:00:40 +01:00
FloatingGhost 19ccdc8762 mix format 2022-08-11 19:21:50 +01:00
FloatingGhost 967c325b0d fix tests 2022-08-11 19:21:43 +01:00
FloatingGhost d3b9cfb03f use :discard instead of cancel 2022-08-11 19:17:50 +01:00
FloatingGhost ceeeefc707 we don't need to purge emoji.txt now 2022-08-11 19:06:49 +01:00
FloatingGhost 366889f97c remove default emoji file 2022-08-11 19:05:41 +01:00
FloatingGhost 74dbea4cf8 add masto-fe docs 2022-08-11 17:43:34 +01:00
Weblate 8bca9a7dbe Update translation files
Updated by "Squash Git commits" hook in Weblate.

Co-authored-by: Weblate <noreply@weblate.org>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-11 09:29:28 +00:00
Weblate fcb5e4a48d Translated using Weblate (Dutch)
Currently translated at 100.0% (83 of 83 strings)

Added translation using Weblate (Dutch)

Co-authored-by: Fristi <fristi@subcon.town>
Co-authored-by: Weblate <noreply@weblate.org>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/nl/
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-11 09:29:28 +00:00
Weblate b1e2f3f646 Update translation files
Updated by "Squash Git commits" hook in Weblate.

Co-authored-by: Weblate <noreply@weblate.org>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-11 09:29:28 +00:00
Weblate 2f074a6966 Translated using Weblate (Catalan)
Currently translated at 2.9% (30 of 1002 strings)

Update translation files

Updated by "Squash Git commits" hook in Weblate.

Co-authored-by: Weblate <noreply@weblate.org>
Co-authored-by: sola <spla@mastodont.cat>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-config-descriptions/ca/
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/
Translation: Pleroma fe/Akkoma Backend (Config Descriptions)
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-11 09:29:28 +00:00
Weblate fd35a66312 Update translation files
Updated by "Squash Git commits" hook in Weblate.

Co-authored-by: Weblate <noreply@weblate.org>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-11 09:29:28 +00:00
Weblate 5022ecd766 Update translation files
Updated by "Squash Git commits" hook in Weblate.

Co-authored-by: Weblate <noreply@weblate.org>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-11 09:29:28 +00:00
FloatingGhost d16eff1c0f describe color keys
fixes #126
2022-08-11 10:28:59 +01:00
FloatingGhost 55179d4214 set soapbox-fe v2 by default
fixes #157
2022-08-11 10:25:03 +01:00
FloatingGhost e5a2548521 remove warning that breaks update 2022-08-09 12:57:39 +01:00
floatingghost 1245141779 treat rejections in MRF as a reject in federator (#155)
Reviewed-on: AkkomaGang/akkoma#155
2022-08-08 15:47:57 +00:00
FloatingGhost 5d23df84c9 Mix format 2022-08-07 20:49:56 +01:00
Hélène b3e4d81362 StatusView: implement pleroma.context field
This field replaces the now deprecated conversation_id field, and now
exposes the ActivityPub object `context` directly via the MastoAPI
instead of relying on StatusNet-era data concepts.
2022-08-07 20:48:08 +01:00
Hélène b9bb093600 StatusView: clear MSB on calculated conversation_id
This field seems to be a left-over from the StatusNet era.
If your application uses `pleroma.conversation_id`: this field is
deprecated.

It is currently stubbed instead by doing a CRC32 of the context, and
clearing the MSB to avoid overflow exceptions with signed integers on
the different clients using this field (Java/Kotlin code, mostly; see
Husky and probably other mobile clients.)

This should be removed in a future version of Pleroma. Pleroma-FE
currently depends on this field, as well.
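As a concrete sketch of the stub described above (the context value is made up; the field itself is deprecated):

    import Bitwise

    context = "https://example.com/contexts/0f5dad44"   # example context IRI

    conversation_id =
      context
      |> :erlang.crc32()
      |> band(0x7FFF_FFFF)   # clear the most significant bit of the 32-bit CRC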
2022-08-07 20:47:59 +01:00
floatingghost 62e179f446 make conversation-id deterministic (#154)
Reviewed-on: AkkomaGang/akkoma#154
2022-08-06 20:59:15 +00:00
floatingghost 21ec1edbb6 Merge pull request 'allow quote-inline span class' (#152) from allow-quote-inline into develop
Reviewed-on: AkkomaGang/akkoma#152
2022-08-05 19:37:35 +00:00
FloatingGhost e8806fdc42 allow quote-inline span class 2022-08-05 20:35:00 +01:00
floatingghost ec162b496b /notice signing checks on redirect (#150)
Reviewed-on: AkkomaGang/akkoma#150
2022-08-05 19:31:32 +00:00
floatingghost 3b973d0627 Merge pull request 'Update 'docs/docs/installation/migrating_to_akkoma.md'' (#151) from ShariVegas/akkoma:sharivegas-docpatch-migration into develop
Reviewed-on: AkkomaGang/akkoma#151
2022-08-05 16:40:12 +00:00
Shari Vegas 273e51cb4a Update 'docs/docs/installation/migrating_to_akkoma.md'
I ran into an issue after migrating: admin-fe wouldn't function properly. Running the above command for my build got that functionality back.
2022-08-05 16:30:33 +00:00
floatingghost 0ec3a11895 don't persist undo of follows (#149)
Reviewed-on: AkkomaGang/akkoma#149
2022-08-05 13:28:56 +00:00
floatingghost 2781faaa7b Merge pull request 'Fix postgres install and setup for fedora guide' (#147) from norm/akkoma:fedora-install into develop
Reviewed-on: AkkomaGang/akkoma#147
2022-08-05 11:43:14 +00:00
floatingghost a82fb2acc1 Merge pull request 'Update default paths' (#141) from norm/akkoma:update-default-paths into develop
Reviewed-on: AkkomaGang/akkoma#141
2022-08-05 11:42:14 +00:00
Norm 499d8a1056 Merge branch 'develop' into fedora-install 2022-08-05 05:03:00 +00:00
Norm 6b85b36e3a Fix postgres install and setup for fedora guide
Fedora requires some additional setup to work with Pleroma compared to Ubuntu/Debian.
2022-08-05 05:02:42 +00:00
floatingghost 5fe2c61029 Merge pull request 'Transmogrifier: fix reply context fixing' (#145) from misskey-thread-breakiness into develop
Reviewed-on: AkkomaGang/akkoma#145
2022-08-04 12:07:15 +00:00
Hélène c1e15ff6f8 Transmogrifier: fix reply context fixing
Incoming Pleroma replies to a Misskey thread were rejected due to a
broken context fix, which caused them to not be visible until a
non-Pleroma user interacted with the replies.

This fix properly sets the post-fix object context to its parent Create
activity as well, if it was changed.
2022-08-04 12:57:48 +01:00
FloatingGhost 9df732c42b Merge branch 'translations' into develop 2022-08-03 22:05:23 +01:00
floatingghost 80f444fb52 Merge pull request 'docs/installation: Update required Elixir version to 1.12' (#144) from norm/akkoma:norm-patch-1 into develop
Reviewed-on: AkkomaGang/akkoma#144
2022-08-03 12:02:41 +00:00
Norm b5d06a3db8 docs/installation: Update required Elixir version to 1.12
Some dependencies will refuse to work on Elixir 1.10 (and presumably 1.9). One dependency states 1.13 as a requirement but will still work on 1.12 just fine.
2022-08-03 12:01:13 +00:00
floatingghost 456c97fda9 Merge pull request 'remove unneeded function' (#143) from compile-fix into develop
Reviewed-on: AkkomaGang/akkoma#143
2022-08-03 11:12:05 +00:00
floatingghost 842ab82ef0 Merge pull request 'Allow users to create backups without providing email address' (#140) from norm/akkoma:backup-without-email into develop
Reviewed-on: AkkomaGang/akkoma#140
2022-08-03 11:11:53 +00:00
sfr 6e9126a794 add code of conduct (#129)
Reviewed-on: AkkomaGang/akkoma#129
Co-authored-by: sfr <sol@solfisher.com>
Co-committed-by: sfr <sol@solfisher.com>
2022-08-03 10:55:11 +00:00
Norm 2c40d565fa Fix config path lookup (#139)
Reviewed-on: AkkomaGang/akkoma#139
Co-authored-by: Norm <normandy@biribiri.dev>
Co-committed-by: Norm <normandy@biribiri.dev>
2022-08-03 10:52:21 +00:00
FloatingGhost 359510eebc remove unneeded function 2022-08-03 11:50:48 +01:00
Norm 8bfd01b9c7
Update default paths 2022-08-03 01:05:53 -04:00
Tusooa Zhu f08241c8ab
Allow users to create backups without providing email address
Ref: backup-without-email
2022-08-02 22:16:54 -04:00
floatingghost c9600dbbbf local-only-fixed (#138)
Reviewed-on: AkkomaGang/akkoma#138
2022-08-02 14:46:46 +00:00
floatingghost ca000f8301 Merge mrf_simple-reject with quarantine (#137)
Reviewed-on: AkkomaGang/akkoma#137
2022-08-02 14:19:24 +00:00
Joel Beckmeyer e26388a01c Support reaching user@sub.domain.tld at user@domain.tld (#134)
Reviewed-on: AkkomaGang/akkoma#134
Co-authored-by: Joel Beckmeyer <joel@beckmeyer.us>
Co-committed-by: Joel Beckmeyer <joel@beckmeyer.us>
2022-08-02 13:54:22 +00:00
floatingghost c3eea8dc7d expose bubble instances via nodeinfo (#136)
Reviewed-on: AkkomaGang/akkoma#136
2022-08-02 09:11:22 +00:00
Weblate 8b14b65e39 Merge branch 'origin/develop' into Weblate. 2022-08-01 12:48:29 +00:00
floatingghost 55b86e45ec Merge pull request 'translations' (#133) from translations into develop
Reviewed-on: AkkomaGang/akkoma#133
2022-08-01 12:48:26 +00:00
Weblate dbb80c79d5 Translated using Weblate (Catalan)
Currently translated at 100.0% (103 of 103 strings)

Translated using Weblate (Catalan)

Currently translated at 0.3% (4 of 1002 strings)

Translated using Weblate (Catalan)

Currently translated at 100.0% (83 of 83 strings)

Translated using Weblate (Catalan)

Currently translated at 13.5% (14 of 103 strings)

Added translation using Weblate (Catalan)

Merge branch 'origin/develop' into Weblate.

Merge branch 'origin/develop' into Weblate.

Merge branch 'origin/develop' into Weblate.

Translated using Weblate (Catalan)

Currently translated at 100.0% (0 of 0 strings)

Added translation using Weblate (Catalan)

Added translation using Weblate (Catalan)

Added translation using Weblate (Catalan)

Co-authored-by: Anonymous <noreply@weblate.org>
Co-authored-by: Weblate <noreply@weblate.org>
Co-authored-by: sola <spla@mastodont.cat>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-config-descriptions/ca/
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-errors/ca/
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-static-pages/ca/
Translation: Pleroma fe/Akkoma Backend (Config Descriptions)
Translation: Pleroma fe/Akkoma Backend (Errors)
Translation: Pleroma fe/Akkoma Backend (Static pages)
2022-08-01 12:47:27 +00:00
floatingghost 19a27ff006 allow small/center tags in misskeymarkdown (#132)
Reviewed-on: AkkomaGang/akkoma#132
2022-08-01 12:46:52 +00:00
Yukkuri 38659e5610 Use uppercase HTTP HEAD method for media preview proxy request (#128)
Reviewed-on: AkkomaGang/akkoma#128
Co-authored-by: Yukkuri <iamtakingiteasy@eientei.org>
Co-committed-by: Yukkuri <iamtakingiteasy@eientei.org>
2022-07-30 21:58:14 +00:00
FloatingGhost 2033d7d4fc ensure extra info in fix_follow_state prints 2022-07-29 19:50:26 +01:00
Weblate bbb9dbc4d4 Merge branch 'origin/develop' into Weblate. 2022-07-29 11:42:45 +00:00
Weblate c0965ed24a Added translation using Weblate (Catalan)
Merge branch 'origin/develop' into Weblate.

Merge branch 'origin/develop' into Weblate.

Merge branch 'origin/develop' into Weblate.

Translated using Weblate (Catalan)

Currently translated at 100.0% (0 of 0 strings)

Added translation using Weblate (Catalan)

Added translation using Weblate (Catalan)

Added translation using Weblate (Catalan)

Co-authored-by: Anonymous <noreply@weblate.org>
Co-authored-by: Weblate <noreply@weblate.org>
Co-authored-by: sola <spla@mastodont.cat>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-config-descriptions/ca/
Translation: Pleroma fe/Akkoma Backend (Config Descriptions)
2022-07-29 11:37:51 +00:00
Weblate c53c967aa7 Merge branch 'origin/develop' into Weblate. 2022-07-29 09:10:28 +00:00
FloatingGhost db99edacfe do the same for soapbox 2022-07-29 10:10:12 +01:00
Weblate e216b275fe Merge branch 'origin/develop' into Weblate. 2022-07-29 09:08:55 +00:00
FloatingGhost 4f6caae209 ensure we can't run the same clause of fix_quote_url more than once 2022-07-29 10:08:40 +01:00
Weblate fc4dc83bba Merge branch 'origin/develop' into Weblate. 2022-07-29 09:04:18 +00:00
FloatingGhost bf3f934275 add guards around fix misskey content 2022-07-29 10:04:04 +01:00
Weblate f4b507f3a2 Translated using Weblate (Catalan)
Currently translated at 100.0% (0 of 0 strings)

Added translation using Weblate (Catalan)

Added translation using Weblate (Catalan)

Added translation using Weblate (Catalan)

Co-authored-by: Anonymous <noreply@weblate.org>
Co-authored-by: Weblate <noreply@weblate.org>
Co-authored-by: sola <spla@mastodont.cat>
Translate-URL: http://translate.akkoma.dev/projects/akkoma/akkoma-backend-config-descriptions/ca/
Translation: Pleroma fe/Akkoma Backend (Config Descriptions)
2022-07-28 12:12:55 +00:00
floatingghost 405406601f Fix emoji qualification (#124)
Reviewed-on: AkkomaGang/akkoma#124
2022-07-28 12:02:36 +00:00
FloatingGhost 52095ff4de fix release tag 2022-07-28 11:44:17 +01:00
floatingghost 2c8f57db98 rename-flavours (#123)
Reviewed-on: AkkomaGang/akkoma#123
2022-07-28 10:36:51 +00:00
Fristi 7380dc0256 Added installation guides for redhat linux distributions, includes OTP build guide for fedora. (#122)
Reviewed-on: AkkomaGang/akkoma#122
Co-authored-by: Fristi <fristi@noreply.akkoma>
Co-committed-by: Fristi <fristi@noreply.akkoma>
2022-07-28 10:19:32 +00:00
FloatingGhost 38cefaebd9 ensure mix clean 2022-07-27 23:29:34 +01:00
floatingghost 2796a9acaf backend-i18n (#121)
Reviewed-on: AkkomaGang/akkoma#121
2022-07-27 21:56:59 +00:00
floatingghost 645f0390bc Prepare for ubuntu22 murdering openssl (#120)
Reviewed-on: AkkomaGang/akkoma#120
2022-07-27 21:48:13 +00:00
floatingghost a3501cab86 ensure quote fetching obeys max thread distance (#119)
Reviewed-on: AkkomaGang/akkoma#119
2022-07-26 17:28:47 +00:00
floatingghost 2cde2052b8 Merge pull request 'don't error out if the featured collection has a string ID' (#118) from fix/collections-with-deleted-items into develop
Reviewed-on: AkkomaGang/akkoma#118
2022-07-26 14:20:42 +00:00
FloatingGhost 0a55c37182 don't error out if the featured collection has a string ID 2022-07-26 15:08:35 +01:00
floatingghost 1f6deb0ef4 include local instance in bubble timeline (#117)
Reviewed-on: AkkomaGang/akkoma#117
2022-07-26 12:22:49 +00:00
floatingghost 90c4785b89 remove public post quarantine exception (#114)
Reviewed-on: AkkomaGang/akkoma#114
2022-07-26 11:09:13 +00:00
floatingghost 1f8e5be051 Merge pull request 'add authorized_fetch_mode to description.exs' (#116) from document-secure-fetch into develop
Reviewed-on: AkkomaGang/akkoma#116
2022-07-26 10:04:24 +00:00
FloatingGhost 36eec89946 add authorized_fetch_mode to description.exs 2022-07-26 10:51:40 +01:00
floatingghost 1419eee5df Quote posting (#113)
Reviewed-on: AkkomaGang/akkoma#113
2022-07-25 16:30:06 +00:00
FloatingGhost 516d155558 open up functions in user 2022-07-24 17:56:48 +01:00
floatingghost c4e9c4bc95 extend custom runtime system (#108)
Reviewed-on: AkkomaGang/akkoma#108
2022-07-24 16:42:43 +00:00
FloatingGhost d0b7d37cd8 bump version for release 2022-07-23 20:01:48 +01:00
floatingghost 6ff6f12fec bugfix/follow-state (#104)
Reviewed-on: AkkomaGang/akkoma#104
2022-07-23 20:01:48 +01:00
floatingghost f9a7b456eb Merge pull request 'bump version for release' (#105) from version into develop
Reviewed-on: AkkomaGang/akkoma#105
2022-07-23 19:01:04 +00:00
FloatingGhost 8e94cbcac7 bump version for release 2022-07-23 20:00:38 +01:00
floatingghost 4c47992686 bugfix/follow-state (#104)
Reviewed-on: AkkomaGang/akkoma#104
2022-07-23 18:58:45 +00:00
floatingghost cb6e7359af add bubble timeline (#100)
Reviewed-on: AkkomaGang/akkoma#100
2022-07-22 14:55:38 +00:00
floatingghost 4571d372b8 Merge pull request 'prune lockfile' (#99) from chore/dep-prune into develop
Reviewed-on: AkkomaGang/akkoma#99
2022-07-21 11:50:11 +00:00
FloatingGhost 26830387ac prune lockfile 2022-07-21 12:39:53 +01:00
floatingghost 0c542e58aa Remove instrumentors (#98)
Reviewed-on: AkkomaGang/akkoma#98
2022-07-21 11:32:17 +00:00
floatingghost d109bbf71c Merge pull request 'purge chat and shout endpoints' (#97) from purge-chat into develop
Reviewed-on: AkkomaGang/akkoma#97
2022-07-21 10:53:48 +00:00
FloatingGhost 0f132b802d purge chat and shout endpoints 2022-07-21 11:29:28 +01:00
floatingghost 07ea4d73e1 update mastofe paths (#95)
Reviewed-on: AkkomaGang/akkoma#95
2022-07-20 20:13:50 +00:00
floatingghost ab5bf7c020 Merge pull request 'update feature set' (#94) from feature-set into develop
Reviewed-on: AkkomaGang/akkoma#94
2022-07-20 14:47:40 +00:00
FloatingGhost e35dced9c8 remove chat enabled feature 2022-07-20 15:46:41 +01:00
FloatingGhost 3b8bf8464f update features array 2022-07-20 15:43:41 +01:00
floatingghost 729f45ccd2 purge ldap authenticator (#92)
Reviewed-on: AkkomaGang/akkoma#92
2022-07-20 12:49:13 +00:00
floatingghost dc9f66749c remove all endpoints marked as deprecated (#91)
Reviewed-on: AkkomaGang/akkoma#91
2022-07-20 12:00:58 +00:00
floatingghost ffc5944334 Merge pull request 'purge scrobbling' (#90) from purge/scrobbling into develop
Reviewed-on: AkkomaGang/akkoma#90
2022-07-20 10:32:03 +00:00
floatingghost f7f4220a18 Merge pull request 'Change Pleroma references to Akkoma in README' (#88) from norm/akkoma:readme into develop
Reviewed-on: AkkomaGang/akkoma#88
2022-07-19 19:00:20 +00:00
Norm 8887788adb
Change Pleroma references to Akkoma in README 2022-07-19 13:29:33 -04:00
FloatingGhost a2b384d572 document scrobbling purge 2022-07-19 17:22:02 +01:00
FloatingGhost cf0ad02ea9 Remove scrobbling support 2022-07-19 15:07:45 +01:00
floatingghost d177715a04 Merge pull request 'doc: update repo link from docs to akkoma' (#87) from sfr/akkoma:doc/link into develop
Reviewed-on: AkkomaGang/akkoma#87
2022-07-19 10:03:47 +00:00
Sol Fisher Romanoff f95f35a1ab
doc: update repo link from docs to akkoma 2022-07-19 12:36:09 +03:00
floatingghost 3897bb825a Merge pull request 'fix resolution of GTS AP IDs from key IDs' (#86) from gts-user-resolution into develop
Reviewed-on: AkkomaGang/akkoma#86
2022-07-18 20:47:51 +00:00
FloatingGhost 85e2e64c82 fix resolution of GTS user keys 2022-07-18 15:21:27 +01:00
floatingghost 54ed8760ff Merge branch 'from/upstream-develop/tusooa/server-announcements' into 'develop' (#85)
Reviewed-on: AkkomaGang/akkoma#85
2022-07-18 13:08:36 +00:00
floatingghost 3cbc401fe0 upgrade oban to v11 (#84)
Reviewed-on: AkkomaGang/akkoma#84
2022-07-18 10:48:49 +00:00
floatingghost ba8e0dff23 Merge pull request 'Add avatar image file' (#83) from default-avatar into develop
Reviewed-on: AkkomaGang/akkoma#83
2022-07-18 10:26:04 +00:00
FloatingGhost 17ea24838b Add avatar image file 2022-07-18 11:25:09 +01:00
floatingghost ff16840cc8 Refactor CI build (#80)
Reviewed-on: AkkomaGang/akkoma#80
2022-07-18 10:17:24 +00:00
floatingghost 5b4d77eaa7 maintenance: dependency upgrade (#81)
Reviewed-on: AkkomaGang/akkoma#81
2022-07-18 00:56:35 +00:00
423 changed files with 41898 additions and 9761 deletions


@@ -6,6 +6,12 @@ COPYING
*file
elixir_buildpack.config
test/
test
benchmarks
docs/site
docker-db
uploads
instance
# Required to get version
!.git

10
.gitignore vendored

@@ -17,6 +17,13 @@ secret
/instance
/priv/ssh_keys
vm.args
.cache/
.hex/
.mix/
.psql_history
docker-resources/Dockerfile
docker-resources/Caddyfile
pgdata
# Prevent committing custom emojis
/priv/static/emoji/custom/*
@@ -65,3 +72,6 @@ pleroma.iml
# Generated documentation
docs/site
# docker stuff
docker-db


@@ -1,18 +0,0 @@
<!--
### Precheck
* For support use https://git.pleroma.social/pleroma/pleroma-support or [community channels](https://git.pleroma.social/pleroma/pleroma#community-channels).
* Please do a quick search to ensure no similar bug has been reported before. If the bug has not been addressed after 2 weeks, it's fine to bump it.
* Try to ensure that the bug is actually related to the Pleroma backend. For example, if a bug happens in Pleroma-FE but not in Mastodon-FE or mobile clients, it's likely that the bug should be filed in [Pleroma-FE](https://git.pleroma.social/pleroma/pleroma-fe/issues/new) repository.
-->
### Environment
* Installation type (OTP or From Source):
* Pleroma version (could be found in the "Version" tab of settings in Pleroma-FE):
* Elixir version (`elixir -v` for from source installations, N/A for OTP):
* Operating system:
* PostgreSQL version (`psql -V`):
### Bug description


@@ -1,6 +0,0 @@
### Release checklist
* [ ] Bump version in `mix.exs`
* [ ] Compile a changelog
* [ ] Create an MR with an announcement to pleroma.social
* [ ] Tag the release
* [ ] Merge `stable` into `develop` (in case the fixes are already in develop, use `git merge -s ours --no-commit` and manually merge the changelogs)

197
.woodpecker.yml Normal file

@@ -0,0 +1,197 @@
variables:
- &scw-secrets
- SCW_ACCESS_KEY
- SCW_SECRET_KEY
- SCW_DEFAULT_ORGANIZATION_ID
- &setup-hex "mix local.hex --force && mix local.rebar --force"
- &on-release
when:
event:
- push
- tag
branch:
- develop
- stable
- refs/tags/v*
- refs/tags/stable-*
- &on-stable
when:
event:
- push
- tag
branch:
- stable
- refs/tags/stable-*
- &on-point-release
when:
event:
- push
branch:
- develop
- stable
- &on-pr-open
when:
event:
- pull_request
- &tag-build "export BUILD_TAG=$${CI_COMMIT_TAG:-\"$CI_COMMIT_BRANCH\"} && export PLEROMA_BUILD_BRANCH=$BUILD_TAG"
- &clean "(rm -rf release || true) && (rm -rf _build || true) && (rm -rf /root/.mix)"
- &mix-clean "mix deps.clean --all && mix clean"
services:
postgres:
image: postgres:13
when:
event:
- pull_request
environment:
POSTGRES_DB: pleroma_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
pipeline:
lint:
<<: *on-pr-open
image: akkoma/ci-base:latest
commands:
- mix local.hex --force
- mix local.rebar --force
- mix format --check-formatted
build:
image: akkoma/ci-base:latest
<<: *on-pr-open
environment:
MIX_ENV: test
POSTGRES_DB: pleroma_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
DB_HOST: postgres
commands:
- mix local.hex --force
- mix local.rebar --force
- mix deps.get
- mix compile
test:
image: akkoma/ci-base:latest
<<: *on-pr-open
environment:
MIX_ENV: test
POSTGRES_DB: pleroma_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
DB_HOST: postgres
commands:
- mix local.hex --force
- mix local.rebar --force
- mix deps.get
- mix compile
- mix ecto.drop -f -q
- mix ecto.create
- mix ecto.migrate
- mix test --preload-modules --exclude erratic --exclude federated --max-cases 4
# Canonical amd64
ubuntu22:
image: hexpm/elixir:1.13.4-erlang-24.3.4.5-ubuntu-jammy-20220428
<<: *on-release
environment:
MIX_ENV: prod
DEBIAN_FRONTEND: noninteractive
commands:
- apt-get update && apt-get install -y cmake libmagic-dev rclone zip imagemagick libmagic-dev git build-essential g++ wget
- *clean
- echo "import Config" > config/prod.secret.exs
- *setup-hex
- *tag-build
- mix deps.get --only prod
- mix release --path release
- zip akkoma-ubuntu-jammy.zip -r release
release-ubuntu22:
image: akkoma/releaser
<<: *on-release
secrets: *scw-secrets
commands:
- export SOURCE=akkoma-ubuntu-jammy.zip
- export DEST=scaleway:akkoma-updates/$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}/akkoma-ubuntu-jammy.zip
- /bin/sh /entrypoint.sh
- export DEST=scaleway:akkoma-updates/$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}/akkoma-amd64-ubuntu-jammy.zip
- /bin/sh /entrypoint.sh
debian-bullseye:
image: hexpm/elixir:1.13.4-erlang-24.3.4.5-debian-bullseye-20220801
<<: *on-release
environment:
MIX_ENV: prod
DEBIAN_FRONTEND: noninteractive
commands:
- apt-get update && apt-get install -y cmake libmagic-dev rclone zip imagemagick libmagic-dev git build-essential gcc make g++ wget
- *clean
- echo "import Config" > config/prod.secret.exs
- *setup-hex
- *tag-build
- *mix-clean
- mix deps.get --only prod
- mix release --path release
- zip akkoma-amd64.zip -r release
release-debian:
image: akkoma/releaser
<<: *on-release
secrets: *scw-secrets
commands:
- export SOURCE=akkoma-amd64.zip
- export DEST=scaleway:akkoma-updates/$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}/akkoma-amd64.zip
- /bin/sh /entrypoint.sh
- export DEST=scaleway:akkoma-updates/$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}/akkoma-debian-stable.zip
- /bin/sh /entrypoint.sh
# Canonical amd64-musl
musl:
image: hexpm/elixir:1.13.4-erlang-24.3.4.5-alpine-3.15.6
<<: *on-stable
environment:
MIX_ENV: prod
commands:
- apk add git gcc g++ musl-dev make cmake file-dev rclone wget zip imagemagick
- *clean
- *setup-hex
- *mix-clean
- *tag-build
- mix deps.get --only prod
- mix release --path release
- zip akkoma-amd64-musl.zip -r release
release-musl:
image: akkoma/releaser
<<: *on-stable
secrets: *scw-secrets
commands:
- export SOURCE=akkoma-amd64-musl.zip
- export DEST=scaleway:akkoma-updates/$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}/akkoma-amd64-musl.zip
- /bin/sh /entrypoint.sh
docs:
<<: *on-point-release
secrets:
- SCW_ACCESS_KEY
- SCW_SECRET_KEY
- SCW_DEFAULT_ORGANIZATION_ID
environment:
CI: "true"
image: python:3.10-slim
commands:
- apt-get update && apt-get install -y rclone wget git zip
- wget https://github.com/scaleway/scaleway-cli/releases/download/v2.5.1/scaleway-cli_2.5.1_linux_amd64
- mv scaleway-cli_2.5.1_linux_amd64 scaleway-cli
- chmod +x scaleway-cli
- ./scaleway-cli object config install type=rclone
- cd docs
- pip install -r requirements.txt
- mkdocs build
- zip -r docs.zip site/*
- cd site
- rclone copy . scaleway:akkoma-docs/$CI_COMMIT_BRANCH/


@@ -1,27 +0,0 @@
pipeline:
build:
when:
event:
- push
branch:
- develop
- stable
secrets:
- SCW_ACCESS_KEY
- SCW_SECRET_KEY
- SCW_DEFAULT_ORGANIZATION_ID
environment:
CI: "true"
image: python:3.10-slim
commands:
- apt-get update && apt-get install -y rclone wget git zip
- wget https://github.com/scaleway/scaleway-cli/releases/download/v2.5.1/scaleway-cli_2.5.1_linux_amd64
- mv scaleway-cli_2.5.1_linux_amd64 scaleway-cli
- chmod +x scaleway-cli
- ./scaleway-cli object config install type=rclone
- cd docs
- pip install -r requirements.txt
- mkdocs build
- zip -r docs.zip site/*
- cd site
- rclone copy . scaleway:akkoma-docs/$CI_COMMIT_BRANCH/


@@ -1,127 +0,0 @@
matrix:
docker_prefix:
- ""
- arm64v8/
- arm32v7/
tag:
- amd64
- arm64
- arm
include:
- tag: amd64
docker_prefix: ""
pipeline:
glibc:
when:
event:
- push
- tag
branch:
- develop
- stable
- refs/tags/v*
- refs/tags/stable-*
secrets:
- SCW_ACCESS_KEY
- SCW_SECRET_KEY
- SCW_DEFAULT_ORGANIZATION_ID
image: ${docker_prefix}elixir:1.13
environment:
MIX_ENV: prod
commands:
- apt-get update && apt-get install -y cmake libmagic-dev rclone zip imagemagick libmagic-dev
- wget https://github.com/scaleway/scaleway-cli/releases/download/v2.5.1/scaleway-cli_2.5.1_linux_amd64
- mv scaleway-cli_2.5.1_linux_amd64 scaleway-cli
- chmod +x scaleway-cli
- ./scaleway-cli object config install type=rclone
- echo "import Mix.Config" > config/prod.secret.exs
- mix local.hex --force
- mix local.rebar --force
- export BUILD_TAG=$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}
- export PLEROMA_BUILD_BRANCH=$BUILD_TAG
- mix deps.clean --all
- mix deps.get --only prod
- mkdir release
- mix release --path release
- zip akkoma-${tag}.zip -r release
- rclone copyto akkoma-${tag}.zip scaleway:akkoma-updates/$BUILD_TAG/akkoma-${tag}.zip
musl:
when:
event:
- push
- tag
branch:
- develop
- stable
- refs/tags/v*
- refs/tags/stable-*
secrets:
- SCW_ACCESS_KEY
- SCW_SECRET_KEY
- SCW_DEFAULT_ORGANIZATION_ID
image: ${docker_prefix}elixir:1.13-alpine
environment:
MIX_ENV: prod
commands:
- apk add git gcc g++ musl-dev make cmake file-dev rclone wget zip imagemagick
- rm -rf release || true
- rm -rf _build || true
- rm -rf /root/.mix
- rm scaleway-cli || true
- wget https://github.com/scaleway/scaleway-cli/releases/download/v2.5.1/scaleway-cli_2.5.1_linux_amd64
- mv scaleway-cli_2.5.1_linux_amd64 scaleway-cli
- chmod +x scaleway-cli
- ./scaleway-cli object config install type=rclone
- mix local.hex --force
- mix local.rebar --force
- export BUILD_TAG=$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}
- export PLEROMA_BUILD_BRANCH=$BUILD_TAG
- mix deps.clean --all
- mix deps.get --only prod
- mix release --path release
- zip akkoma-${tag}.zip -r release
- rclone copyto akkoma-${tag}.zip scaleway:akkoma-updates/$BUILD_TAG/akkoma-${tag}-musl.zip
musl1.1:
when:
event:
- push
- tag
branch:
- develop
- stable
- refs/tags/v*
- refs/tags/stable-*
secrets:
- SCW_ACCESS_KEY
- SCW_SECRET_KEY
- SCW_DEFAULT_ORGANIZATION_ID
image: voidlinux/voidlinux-musl
environment:
MIX_ENV: prod
commands:
- xbps-install -Suy || xbps-install -uy xbps
- xbps-install -Suy
- xbps-install -y git gcc musl-devel make cmake file-devel rclone wget zip libmagic elixir
- rm -rf release || true
- rm -rf _build || true
- rm -rf /root/.mix
- rm scaleway-cli || true
- wget https://github.com/scaleway/scaleway-cli/releases/download/v2.5.1/scaleway-cli_2.5.1_linux_amd64
- mv scaleway-cli_2.5.1_linux_amd64 scaleway-cli
- chmod +x scaleway-cli
- ./scaleway-cli object config install type=rclone
- mix local.hex --force
- mix local.rebar --force
- export BUILD_TAG=$${CI_COMMIT_TAG:-"$CI_COMMIT_BRANCH"}
- export PLEROMA_BUILD_BRANCH=$BUILD_TAG
- mix deps.clean --all
- mix deps.get --only prod
- mix release --path release
- zip akkoma-${tag}.zip -r release
- rclone copyto akkoma-${tag}.zip scaleway:akkoma-updates/$BUILD_TAG/akkoma-${tag}-musl11.zip


@@ -1,59 +0,0 @@
matrix:
ELIXIR_VERSION:
- 1.13
pipeline:
lint:
when:
event:
- pull_request
image: pleromaforkci/ci-base:1.13
commands:
- mix local.hex --force
- mix local.rebar --force
- mix format --check-formatted
build:
image: pleromaforkci/ci-base:${ELIXIR_VERSION}
when:
event:
- pull_request
environment:
MIX_ENV: test
commands:
- mix local.hex --force
- mix local.rebar --force
- mix deps.get
- mix compile
test:
group: test
image: pleromaforkci/ci-base:${ELIXIR_VERSION}
when:
event:
- pull_request
environment:
MIX_ENV: test
POSTGRES_DB: pleroma_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
DB_HOST: postgres
commands:
- mix local.hex --force
- mix local.rebar --force
- mix deps.get
- mix ecto.drop -f -q
- mix ecto.create
- mix ecto.migrate
- mix test --preload-modules --exclude erratic --exclude federated --max-cases 4
services:
postgres:
image: postgres:13
when:
event:
- pull_request
environment:
POSTGRES_DB: pleroma_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres


@@ -4,6 +4,92 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## Unreleased
## Added
- Officially supported docker release
- Ability to remove followers unilaterally without a block
## Changes
- Follows no longer override domain blocks, a domain block is final
- Deletes are now the lowest priority to publish and will be handled after creates
## Fixed
- Registrations via ldap are now compatible with the latest OTP24
## 2022.10
### Added
- Ability to sync frontend profiles between clients, with a name attached
- Status card generation will now use the media summary if it is available
### Changed
- Emoji updated to latest 15.0 draft
- **Breaking**: `/api/v1/pleroma/backups` endpoints now requires `read:backups` scope instead of `read:accounts`
- Verify that the signature on posts is not domain blocked, and belongs to the correct user
### Fixed
- OAuthPlug no longer joins with the database every call and uses the user cache
- Undo activities no longer try to look up by ID, and render correctly
- prevent false-errors from meilisearch
## 2022.09
### Added
- support for fedibird-fe, and non-breaking API parity for it to function
- support for setting instance languages in metadata
- support for reusing oauth tokens, and not requiring new authorizations
- the ability to obfuscate domains in your MRF descriptions
- automatic translation of statuses via DeepL or LibreTranslate
- ability to edit posts
- ability to react with remote emoji
### Changed
- MFM parsing is now done on the backend by a modified version of ilja's parser -> https://akkoma.dev/AkkomaGang/mfm-parser
- InlineQuotePolicy is now on by default
- Enable remote users to interact with posts
### Fixed
- Compatibility with latest meilisearch
- Resolution of nested mix tasks (i.e search.meilisearch) in OTP releases
- Elasticsearch returning likes and repeats, displaying as posts
- Ensure key generation happens at registration-time to prevent potential race-conditions
- Ensured websockets get closed on logout
- Allowed GoToSocial-style `?query_string` signatures
### Removed
- Non-finch HTTP adapters. `:tesla, :adapter` is now highly recommended to be set to the default.
## 2022.08
### Added
- extended runtime module support, see config cheatsheet
- quote posting; quotes are limited to public posts
### Changed
- quarantining is now considered absolutely; public activities are no longer
an exception.
- also merged quarantine and mrf reject - quarantine is now deprecated
- flavours:
- amd64 is built for debian stable. Compatible with ubuntu 20.
- ubuntu-jammy is built for... well, ubuntu 22 (LTS)
- amd64-musl is built for alpine 3.16
### Fixed
- Updated mastoFE path, for the newer version
### Removed
- Scrobbling support
- `/api/v1/pleroma/scrobble`
- `/api/v1/pleroma/accounts/{id}/scrobbles`
- Deprecated endpoints
- `/api/v1/pleroma/chats`
- `/api/v1/notifications/dismiss`
- `/api/v1/search`
- `/api/v1/statuses/{id}/card`
- Chats, they were half-baked. Just use PMs.
- Prometheus, it causes massive slowdown
## 2022.07
### Added
@@ -144,6 +230,7 @@ you might end up in a situation where you don't have an ability to get it.
- Attachment dimensions and blurhashes are federated when available.
- Mastodon API: support `poll` notification.
- Pinned posts federation
- Possibility to discover users like `user@example.org`, while Akkoma is working on `akkoma.example.org`. Additional configuration required.
### Fixed
- Don't crash so hard when email settings are invalid.

24
CODE_OF_CONDUCT.md Normal file

@ -0,0 +1,24 @@
# Akkoma Code of Conduct
The Akkoma project aims to be **enjoyable** for anyone to participate in, regardless of their identity or level of expertise. To achieve this, the community must create an environment which is **safe** and **equitable**; the following guidelines have been created with these goals in mind.
1. **Treat individuals with respect.** Differing experiences and viewpoints deserve to be respected, and bigotry and harassment are not tolerated under any circumstances.
- Individuals should at all times be treated as equals, regardless of their age, gender, sexuality, race, ethnicity, _or any other characteristic_, intrinsic or otherwise.
- Behaviour that is harmful in nature should be addressed and corrected *regardless of intent*.
- Respect personal boundaries and ask for clarification whenever they are unclear.
- (Obviously, hate does not count as merely a "differing viewpoint", because it is harmful in nature.)
2. **Be understanding of differences in communication.** Not everyone is aware of unspoken social cues, and speech that is not intended to be offensive should not be treated as such simply due to an atypical manner of communication.
- Somebody who speaks bluntly is not necessarily rude, and somebody who swears a lot is not necessarily volatile.
- Try to confirm your interpretation of their intent rather than assuming bad faith.
- Someone may not communicate as, or come across as, a picture of "professionalism", but this should not be seen as a reason to dismiss them. This is a **casual** space, and communication styles can reflect that.
3. **"Uncomfortable" does not mean "unsafe".** In an ideal world, the community would be safe, equitable, enjoyable, *and* comfortable for all members at all times. Unfortunately, this is not always possible in reality.
- Safety and equity will be prioritized over comfort whenever it is necessary to do so.
- Weaponizing one's own discomfort to deflect accountability or censor an individual (e.g. "white fragility") is a form of discriminatory conduct.
4. **Let people grow from their mistakes.** Nobody is perfect; even the most well-meaning individual can do something hurtful. Everyone should be given a fair opportunity to explain themselves and correct their behaviour. Portraying someone as inherently malicious prevents improvement and shifts focus away from the *action* that was problematic.
- Avoid bringing up past events that do not accurately reflect an individual's current actions or beliefs. (This is, of course, different from providing evidence of a recurring pattern of behaviour.)
---
This document was adapted from one created by ~keith as part of punks default repository template, and is licensed under CC-BY-SA 4.0. The original template is here: <https://bytes.keithhacks.cyou/keith/default-template>

View file

@ -1,52 +1,33 @@
FROM elixir:1.9-alpine as build
COPY . .
FROM hexpm/elixir:1.13.4-erlang-24.3.4.5-alpine-3.15.6
ENV MIX_ENV=prod
RUN apk add git gcc g++ musl-dev make cmake file-dev &&\
echo "import Mix.Config" > config/prod.secret.exs &&\
mix local.hex --force &&\
mix local.rebar --force &&\
mix deps.get --only prod &&\
mkdir release &&\
mix release --path release
ARG HOME=/opt/akkoma
FROM alpine:3.14
ARG BUILD_DATE
ARG VCS_REF
LABEL maintainer="ops@pleroma.social" \
org.opencontainers.image.title="pleroma" \
org.opencontainers.image.description="Pleroma for Docker" \
org.opencontainers.image.authors="ops@pleroma.social" \
org.opencontainers.image.vendor="pleroma.social" \
org.opencontainers.image.documentation="https://git.pleroma.social/pleroma/pleroma" \
LABEL org.opencontainers.image.title="akkoma" \
org.opencontainers.image.description="Akkoma for Docker" \
org.opencontainers.image.vendor="akkoma.dev" \
org.opencontainers.image.documentation="https://docs.akkoma.dev/stable/" \
org.opencontainers.image.licenses="AGPL-3.0" \
org.opencontainers.image.url="https://pleroma.social" \
org.opencontainers.image.url="https://akkoma.dev" \
org.opencontainers.image.revision=$VCS_REF \
org.opencontainers.image.created=$BUILD_DATE
ARG HOME=/opt/pleroma
ARG DATA=/var/lib/pleroma
RUN apk update &&\
apk add exiftool ffmpeg imagemagick libmagic ncurses postgresql-client &&\
adduser --system --shell /bin/false --home ${HOME} pleroma &&\
mkdir -p ${DATA}/uploads &&\
mkdir -p ${DATA}/static &&\
chown -R pleroma ${DATA} &&\
mkdir -p /etc/pleroma &&\
chown -R pleroma /etc/pleroma
USER pleroma
COPY --from=build --chown=pleroma:0 /release ${HOME}
COPY ./config/docker.exs /etc/pleroma/config.exs
COPY ./docker-entrypoint.sh ${HOME}
RUN apk add git gcc g++ musl-dev make cmake file-dev exiftool ffmpeg imagemagick libmagic ncurses postgresql-client
EXPOSE 4000
ENTRYPOINT ["/opt/pleroma/docker-entrypoint.sh"]
ARG UID=1000
ARG GID=1000
ARG UNAME=akkoma
RUN addgroup -g $GID $UNAME
RUN adduser -u $UID -G $UNAME -D -h $HOME $UNAME
WORKDIR /opt/akkoma
USER $UNAME
RUN mix local.hex --force &&\
mix local.rebar --force
CMD ["/opt/akkoma/docker-entrypoint.sh"]

View file

@ -2,23 +2,25 @@
*a smallish microblogging platform, aka the cooler pleroma*
![English OK](https://img.shields.io/badge/English-OK-blueviolet) ![日本語OK](https://img.shields.io/badge/%E6%97%A5%E6%9C%AC%E8%AA%9E-OK-blueviolet)
## About
This is a fork of Pleroma, which is a microblogging server software that can federate (= exchange messages with) other servers that support ActivityPub. What that means is that you can host a server for yourself or your friends and stay in control of your online identity, but still exchange messages with people on larger servers. Pleroma will federate with all servers that implement ActivityPub, like Friendica, GNU Social, Hubzilla, Mastodon, Misskey, Peertube, and Pixelfed.
This is a fork of Pleroma, which is a microblogging server software that can federate (= exchange messages with) other servers that support ActivityPub. What that means is that you can host a server for yourself or your friends and stay in control of your online identity, but still exchange messages with people on larger servers. Akkoma will federate with all servers that implement ActivityPub, like Friendica, GNU Social, Hubzilla, Mastodon, Misskey, Peertube, and Pixelfed.
Akkoma is written in Elixir and uses PostgreSQL for data storage.
For clients it supports the [Mastodon client API](https://docs.joinmastodon.org/api/guidelines/) with Pleroma extensions (see the API section on <https://docs.akkoma.dev/stable/>).
- [Client Applications for Pleroma](https://docs.akkoma.dev/stable/clients/)
- [Client Applications for Akkoma](https://docs.akkoma.dev/stable/clients/)
## Installation
### OTP releases (Recommended)
If you are running Linux (glibc or musl) on x86, the recommended way to install Pleroma is by using OTP releases. OTP releases are as close as you can get to binary releases with Erlang/Elixir. The release is self-contained, and provides everything needed to boot it. The installation instructions are available [here](https://docs.akkoma.dev/stable/installation/otp_en/).
If you are running Linux (glibc or musl) on x86, the recommended way to install Akkoma is by using OTP releases. OTP releases are as close as you can get to binary releases with Erlang/Elixir. The release is self-contained, and provides everything needed to boot it. The installation instructions are available [here](https://docs.akkoma.dev/stable/installation/otp_en/).
### From Source
If your platform is not supported, or you just want to be able to edit the source code easily, you may install Pleroma from source.
If your platform is not supported, or you just want to be able to edit the source code easily, you may install Akkoma from source.
- [Alpine Linux](https://docs.akkoma.dev/stable/installation/alpine_linux_en/)
- [Arch Linux](https://docs.akkoma.dev/stable/installation/arch_linux_en/)
@ -34,7 +36,7 @@ If your platform is not supported, or you just want to be able to edit the sourc
While we don't provide docker files, other people have written very good ones. Take a look at <https://github.com/angristan/docker-pleroma> or <https://glitch.sh/sn0w/pleroma-docker>.
### Compilation Troubleshooting
If you ever encounter compilation issues during the updating of Pleroma, you can try these commands and see if they fix things:
If you ever encounter compilation issues during the updating of Akkoma, you can try these commands and see if they fix things:
- `mix deps.clean --all`
- `mix local.rebar`

SIGNING_KEY.pub Normal file
View file

@ -0,0 +1,2 @@
untrusted comment: Akkoma Signing Key public key
RWQRlw8Ex/uTbvo1wB1yK75tQ5nXKilB/vrKdkL41bgZHL9aKP+7fSS5

View file

@ -48,6 +48,7 @@ config :pleroma, ecto_repos: [Pleroma.Repo]
config :pleroma, Pleroma.Repo,
telemetry_event: [Pleroma.Repo.Instrumenter],
queue_target: 20_000,
migration_lock: nil
config :pleroma, Pleroma.Captcha,
@ -184,7 +185,7 @@ config :pleroma, :http,
adapter: []
config :pleroma, :instance,
name: "Pleroma",
name: "Akkoma",
email: "example@example.com",
notify_email: "noreply@example.com",
description: "Akkoma: The cooler fediverse server",
@ -197,6 +198,7 @@ config :pleroma, :instance,
avatar_upload_limit: 2_000_000,
background_upload_limit: 4_000_000,
banner_upload_limit: 4_000_000,
languages: ["en"],
poll_limits: %{
max_options: 20,
max_option_chars: 200,
@ -215,7 +217,6 @@ config :pleroma, :instance,
],
allow_relay: true,
public: true,
quarantined_instances: [],
static_dir: "instance/static/",
allowed_post_formats: [
"text/plain",
@ -259,7 +260,9 @@ config :pleroma, :instance,
show_reactions: true,
password_reset_token_validity: 60 * 60 * 24,
profile_directory: true,
privileged_staff: false
privileged_staff: false,
local_bubble: [],
max_frontend_settings_json_chars: 100_000
config :pleroma, :welcome,
direct_message: [
@ -267,11 +270,6 @@ config :pleroma, :welcome,
sender_nickname: nil,
message: nil
],
chat_message: [
enabled: false,
sender_nickname: nil,
message: nil
],
email: [
enabled: false,
sender: nil,
@ -411,6 +409,8 @@ config :pleroma, :mrf_vocabulary,
accept: [],
reject: []
config :pleroma, :mrf_inline_quote, prefix: "RE"
# threshold of 7 days
config :pleroma, :mrf_object_age,
threshold: 604_800,
@ -569,7 +569,10 @@ config :pleroma, Oban,
mute_expire: 5,
search_indexing: 10
],
plugins: [Oban.Plugins.Pruner],
plugins: [
Oban.Plugins.Pruner,
{Oban.Plugins.Reindexer, schedule: "@weekly"}
],
crontab: [
{"0 0 * * 0", Pleroma.Workers.Cron.DigestEmailsWorker},
{"0 0 * * *", Pleroma.Workers.Cron.NewUsersDigestWorker}
@ -638,13 +641,6 @@ config :pleroma, Pleroma.Emails.UserEmail,
config :pleroma, Pleroma.Emails.NewUsersDigestEmail, enabled: false
config :prometheus, Pleroma.Web.Endpoint.MetricsExporter,
enabled: false,
auth: false,
ip_whitelist: [],
path: "/api/pleroma/app_metrics",
format: :text
config :pleroma, Pleroma.ScheduledActivity,
daily_user_limit: 25,
total_user_limit: 300,
@ -720,6 +716,7 @@ config :pleroma, :static_fe, enabled: false
config :pleroma, :frontends,
primary: %{"name" => "pleroma-fe", "ref" => "stable"},
admin: %{"name" => "admin-fe", "ref" => "stable"},
mastodon: %{"name" => "mastodon-fe", "ref" => "akkoma"},
swagger: %{
"name" => "swagger-ui",
"ref" => "stable",
@ -738,9 +735,18 @@ config :pleroma, :frontends,
"mastodon-fe" => %{
"name" => "mastodon-fe",
"git" => "https://akkoma.dev/AkkomaGang/masto-fe",
"build_url" => "https://akkoma-updates.s3-website.fr-par.scw.cloud/frontend/masto-fe.zip",
"build_url" =>
"https://akkoma-updates.s3-website.fr-par.scw.cloud/frontend/${ref}/masto-fe.zip",
"build_dir" => "distribution",
"ref" => "develop"
"ref" => "akkoma"
},
"fedibird-fe" => %{
"name" => "fedibird-fe",
"git" => "https://akkoma.dev/AkkomaGang/fedibird-fe",
"build_url" =>
"https://akkoma-updates.s3-website.fr-par.scw.cloud/frontend/${ref}/fedibird-fe.zip",
"build_dir" => "distribution",
"ref" => "akkoma"
},
"admin-fe" => %{
"name" => "admin-fe",
@ -751,10 +757,10 @@ config :pleroma, :frontends,
},
"soapbox-fe" => %{
"name" => "soapbox-fe",
"git" => "https://gitlab.com/soapbox-pub/soapbox-fe",
"git" => "https://gitlab.com/soapbox-pub/soapbox",
"build_url" =>
"https://gitlab.com/soapbox-pub/soapbox-fe/-/jobs/artifacts/${ref}/download?job=build-production",
"ref" => "v1.0.0",
"https://gitlab.com/soapbox-pub/soapbox/-/jobs/artifacts/${ref}/download?job=build-production",
"ref" => "v2.0.0",
"build_dir" => "static"
},
# For developers - enables a swagger frontend to view the openapi spec
@ -793,7 +799,8 @@ config :pleroma, Pleroma.Web.ApiSpec.CastAndValidate, strict: false
config :pleroma, :mrf,
policies: [Pleroma.Web.ActivityPub.MRF.ObjectAgePolicy, Pleroma.Web.ActivityPub.MRF.TagPolicy],
transparency: true,
transparency_exclusions: []
transparency_exclusions: [],
transparency_obfuscate_domains: []
config :ex_aws, http_client: Pleroma.HTTP.ExAws
@ -816,6 +823,8 @@ config :pleroma, ConcurrentLimiter, [
{Pleroma.Search, [max_running: 30, max_waiting: 50]}
]
config :pleroma, Pleroma.Web.WebFinger, domain: nil, update_nickname_on_user_fetch: true
config :pleroma, Pleroma.Search, module: Pleroma.Search.DatabaseSearch
config :pleroma, Pleroma.Search.Meilisearch,
@ -839,6 +848,19 @@ config :pleroma, Pleroma.Search.Elasticsearch.Cluster,
}
}
config :pleroma, :translator,
enabled: false,
module: Pleroma.Akkoma.Translators.DeepL
config :pleroma, :deepl,
# either :free or :pro
tier: :free,
api_key: ""
config :pleroma, :libre_translate,
url: "http://127.0.0.1:5000",
api_key: nil
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"

View file

@ -509,6 +509,16 @@ config :pleroma, :config_description, [
"Pleroma"
]
},
%{
key: :languages,
type: {:list, :string},
description: "Languages the instance uses",
suggestions: [
"en",
"ja",
"fr"
]
},
%{
key: :email,
label: "Admin Email Address",
@ -691,7 +701,7 @@ config :pleroma, :config_description, [
key_placeholder: "instance",
value_placeholder: "reason",
description:
"List of ActivityPub instances where private (DMs, followers-only) activities will not be sent and the reason for doing so",
"(Deprecated, will be removed in next release) List of ActivityPub instances where activities will not be sent, and the reason for doing so",
suggestions: [
{"quarantined.com", "Reason"},
{"*.quarantined.com", "Reason"}
@ -946,7 +956,13 @@ config :pleroma, :config_description, [
key: :privileged_staff,
type: :boolean,
description:
"Let moderators access sensitive data (e.g. updating user credentials, get password reset token, delete users, index and read private statuses and chats)"
"Let moderators access sensitive data (e.g. updating user credentials, get password reset token, delete users, index and read private statuses)"
},
%{
key: :local_bubble,
type: {:list, :string},
description:
"List of instances that make up your local bubble (closely-related instances). Used to populate the 'bubble' timeline (domain only)."
}
]
},
@ -984,35 +1000,6 @@ config :pleroma, :config_description, [
}
]
},
%{
key: :chat_message,
type: :keyword,
descpiption: "Chat message settings",
children: [
%{
key: :enabled,
type: :boolean,
description: "Enables sending a chat message to newly registered users"
},
%{
key: :message,
type: :string,
description:
"A message that will be sent to newly registered users as a chat message",
suggestions: [
"Hello, welcome on board!"
]
},
%{
key: :sender_nickname,
type: :string,
description: "The nickname of the local user that sends a welcome chat message",
suggestions: [
"lain"
]
}
]
},
%{
key: :email,
type: :keyword,
@ -1192,7 +1179,6 @@ config :pleroma, :config_description, [
hideFilteredStatuses: false,
hideMutedPosts: false,
hidePostStats: false,
hideSitename: false,
hideUserStats: false,
loginMethod: "password",
logo: "/static/logo.svg",
@ -1258,12 +1244,6 @@ config :pleroma, :config_description, [
type: :boolean,
description: "Hide notices statistics (repeats, favorites, ...)"
},
%{
key: :hideSitename,
label: "Hide Sitename",
type: :boolean,
description: "Hides instance name from PleromaFE banner"
},
%{
key: :hideUserStats,
label: "Hide user stats",
@ -1373,6 +1353,48 @@ config :pleroma, :config_description, [
type: :string,
description: "Which theme to use. Available themes are defined in styles.json",
suggestions: ["pleroma-dark"]
},
%{
key: :showPanelNavShortcuts,
label: "Show timeline panel nav shortcuts",
type: :boolean,
description: "Whether to put timeline nav tabs on the top of the panel"
},
%{
key: :showNavShortcuts,
label: "Show navbar shortcuts",
type: :boolean,
description: "Whether to put extra navigation options on the navbar"
},
%{
key: :showWiderShortcuts,
label: "Increase navbar shortcut spacing",
type: :boolean,
description: "Whether to add extra space between navbar icons"
},
%{
key: :hideSiteFavicon,
label: "Hide site favicon",
type: :boolean,
description: "Whether to hide the instance favicon from the navbar"
},
%{
key: :hideSiteName,
label: "Hide site name",
type: :boolean,
description: "Whether to hide the site name from the navbar"
},
%{
key: :renderMisskeyMarkdown,
label: "Render misskey markdown",
type: :boolean,
description: "Whether to render Misskey-flavoured markdown"
},
%{
key: :stopGifs,
label: "Stop Gifs",
type: :boolean,
description: "Whether to pause animated images until they're hovered on"
}
]
},
@ -1465,13 +1487,14 @@ config :pleroma, :config_description, [
%{
key: :theme_color,
type: :string,
description: "Describe the theme color of the app",
description: "Describe the theme color of the app - this is only used for mastodon-fe",
suggestions: ["#282c37", "mediumpurple"]
},
%{
key: :background_color,
type: :string,
description: "Describe the background color of the app",
description:
"Describe the background color of the app - this is only used for mastodon-fe",
suggestions: ["#191b22", "aliceblue"]
}
]
@ -1678,6 +1701,11 @@ config :pleroma, :config_description, [
type: :boolean,
description: "Sign object fetches with HTTP signatures"
},
%{
key: :authorized_fetch_mode,
type: :boolean,
description: "Require HTTP signatures on AP fetches"
},
%{
key: :note_replies_output_limit,
type: :integer,
@ -2605,27 +2633,6 @@ config :pleroma, :config_description, [
}
]
},
%{
group: :pleroma,
key: :shout,
type: :group,
description: "Pleroma shout settings",
children: [
%{
key: :enabled,
type: :boolean,
description: "Enables the backend Shoutbox chat feature."
},
%{
key: :limit,
type: :integer,
description: "Shout message character limit.",
suggestions: [
5_000
]
}
]
},
%{
group: :pleroma,
key: :http,
@ -2636,9 +2643,10 @@ config :pleroma, :config_description, [
%{
key: :proxy_url,
label: "Proxy URL",
type: [:string, :tuple],
description: "Proxy URL",
suggestions: ["localhost:9020", {:socks5, :localhost, 3090}]
type: :string,
description:
"Proxy URL - of the format http://host:port. Advise setting in .exs instead of admin-fe due to this being set at boot-time.",
suggestions: ["http://localhost:3128"]
},
%{
key: :user_agent,
@ -3089,6 +3097,12 @@ config :pleroma, :config_description, [
description: "Admin frontend",
children: installed_frontend_options
},
%{
key: :mastodon,
type: :map,
description: "Mastodon frontend",
children: installed_frontend_options
},
%{
key: :swagger,
type: :map,
@ -3166,43 +3180,6 @@ config :pleroma, :config_description, [
}
]
},
%{
group: :prometheus,
key: Pleroma.Web.Endpoint.MetricsExporter,
type: :group,
description: "Prometheus app metrics endpoint configuration",
children: [
%{
key: :enabled,
type: :boolean,
description: "[Pleroma extension] Enables app metrics endpoint."
},
%{
key: :ip_whitelist,
label: "IP Whitelist",
type: [{:list, :string}, {:list, :charlist}, {:list, :tuple}],
description: "Restrict access of app metrics endpoint to the specified IP addresses."
},
%{
key: :auth,
type: [:boolean, :tuple],
description: "Enables HTTP Basic Auth for app metrics endpoint.",
suggestion: [false, {:basic, "myusername", "mypassword"}]
},
%{
key: :path,
type: :string,
description: "App metrics endpoint URI path.",
suggestions: ["/api/pleroma/app_metrics"]
},
%{
key: :format,
type: :atom,
description: "App metrics endpoint output format.",
suggestions: [:text, :protobuf]
}
]
},
%{
group: :pleroma,
key: ConcurrentLimiter,
@ -3255,13 +3232,14 @@ config :pleroma, :config_description, [
group: :pleroma,
key: Pleroma.Search,
type: :group,
label: "Search",
description: "General search settings.",
children: [
%{
key: :module,
type: :keyword,
type: :module,
description: "Selected search module.",
suggestion: [Pleroma.Search.DatabaseSearch, Pleroma.Search.Meilisearch]
suggestions: {:list_behaviour_implementations, Pleroma.Search.SearchBackend}
}
]
},
@ -3286,7 +3264,7 @@ config :pleroma, :config_description, [
},
%{
key: :initial_indexing_chunk_size,
type: :int,
type: :integer,
description:
"Amount of posts in a batch when running the initial indexing operation. Should probably not be more than 100000" <>
" since there's a limit on maximum insert size",
@ -3297,6 +3275,7 @@ config :pleroma, :config_description, [
%{
group: :pleroma,
key: Pleroma.Search.Elasticsearch.Cluster,
label: "Elasticsearch",
type: :group,
description: "Elasticsearch settings.",
children: [
@ -3363,13 +3342,13 @@ config :pleroma, :config_description, [
},
%{
key: :bulk_page_size,
type: :int,
type: :integer,
description: "Size for bulk put requests, mostly used on building the index",
suggestion: [5000]
},
%{
key: :bulk_wait_interval,
type: :int,
type: :integer,
description: "Time to wait between bulk put requests (in ms)",
suggestion: [15_000]
}
@ -3378,5 +3357,66 @@ config :pleroma, :config_description, [
]
}
]
},
%{
group: :pleroma,
key: :translator,
type: :group,
description: "Translation Settings",
children: [
%{
key: :enabled,
type: :boolean,
description: "Is translation enabled?",
suggestion: [true, false]
},
%{
key: :module,
type: :module,
description: "Translation module.",
suggestions: {:list_behaviour_implementations, Pleroma.Akkoma.Translator}
}
]
},
%{
group: :pleroma,
key: :deepl,
label: "DeepL",
type: :group,
description: "DeepL Settings.",
children: [
%{
key: :tier,
type: {:dropdown, :atom},
description: "API Tier",
suggestions: [:free, :pro]
},
%{
key: :api_key,
type: :string,
description: "API key for DeepL",
suggestions: [nil]
}
]
},
%{
group: :pleroma,
key: :libre_translate,
type: :group,
description: "LibreTranslate Settings.",
children: [
%{
key: :url,
type: :string,
description: "URL for libretranslate",
suggestion: [nil]
},
%{
key: :api_key,
type: :string,
description: "API key for libretranslate",
suggestion: [nil]
}
]
}
]

View file

@ -24,11 +24,11 @@ config :pleroma, Pleroma.Repo,
config :web_push_encryption, :vapid_details, subject: "mailto:#{System.get_env("NOTIFY_EMAIL")}"
config :pleroma, :database, rum_enabled: false
config :pleroma, :instance, static_dir: "/var/lib/pleroma/static"
config :pleroma, Pleroma.Uploaders.Local, uploads: "/var/lib/pleroma/uploads"
config :pleroma, :instance, static_dir: "/var/lib/akkoma/static"
config :pleroma, Pleroma.Uploaders.Local, uploads: "/var/lib/akkoma/uploads"
# We can't store the secrets in this file, since this is baked into the docker image
if not File.exists?("/var/lib/pleroma/secret.exs") do
if not File.exists?("/var/lib/akkoma/secret.exs") do
secret = :crypto.strong_rand_bytes(64) |> Base.encode64() |> binary_part(0, 64)
signing_salt = :crypto.strong_rand_bytes(8) |> Base.encode64() |> binary_part(0, 8)
{web_push_public_key, web_push_private_key} = :crypto.generate_key(:ecdh, :prime256v1)
@ -52,16 +52,16 @@ if not File.exists?("/var/lib/pleroma/secret.exs") do
web_push_private_key: Base.url_encode64(web_push_private_key, padding: false)
)
File.write("/var/lib/pleroma/secret.exs", secret_file)
File.write("/var/lib/akkoma/secret.exs", secret_file)
end
import_config("/var/lib/pleroma/secret.exs")
import_config("/var/lib/akkoma/secret.exs")
# For additional user config
if File.exists?("/var/lib/pleroma/config.exs"),
do: import_config("/var/lib/pleroma/config.exs"),
if File.exists?("/var/lib/akkoma/config.exs"),
do: import_config("/var/lib/akkoma/config.exs"),
else:
File.write("/var/lib/pleroma/config.exs", """
File.write("/var/lib/akkoma/config.exs", """
import Config
# For additional configuration outside of environmental variables

View file

@ -1,4 +0,0 @@
firefox, /emoji/Firefox.gif, Gif,Fun
blank, /emoji/blank.png, Fun
dinosaur, /emoji/dino walking.gif, Gif
100a, /emoji/100a.png, Fun

View file

@ -126,6 +126,8 @@ config :pleroma, :pipeline,
config :pleroma, :cachex, provider: Pleroma.CachexMock
config :pleroma, Pleroma.Web.WebFinger, update_nickname_on_user_fetch: false
config :pleroma, :side_effects,
ap_streamer: Pleroma.Web.ActivityPub.ActivityPubMock,
logger: Pleroma.LoggerMock

docker-compose.yml Normal file
View file

@ -0,0 +1,61 @@
version: "3.7"
services:
db:
image: akkoma-db:latest
build: ./docker-resources/database
restart: unless-stopped
user: ${DOCKER_USER}
environment: {
# This might seem insecure but is usually not a problem.
# You should leave this at the "akkoma" default.
# The DB is only reachable by containers in the same docker network,
# and is not exposed to the open internet.
#
# If you do change this, remember to update "config.exs".
POSTGRES_DB: akkoma,
POSTGRES_USER: akkoma,
POSTGRES_PASSWORD: akkoma,
}
env_file:
- .env
volumes:
- type: bind
source: ./pgdata
target: /var/lib/postgresql/data
akkoma:
image: akkoma:latest
build: .
restart: unless-stopped
env_file:
- .env
links:
- db
ports: [
# Uncomment/Change port mappings below as needed.
# The left side is your host machine, the right one is the akkoma container.
# You can prefix the left side with an ip.
# Webserver (for reverse-proxies outside of docker)
# If you use a dockerized proxy, you can leave this commented
# and use a container link instead.
"127.0.0.1:4000:4000",
]
volumes:
- .:/opt/akkoma
# Uncomment the following if you want to use a reverse proxy
#proxy:
# image: caddy:2-alpine
# restart: unless-stopped
# links:
# - akkoma
# ports: [
# "443:443",
# "80:80"
# ]
# volumes:
# - ./docker-resources/Caddyfile:/etc/caddy/Caddyfile
# - ./caddy-data:/data
# - ./caddy-config:/config

View file

@ -8,7 +8,7 @@ while ! pg_isready -U ${DB_USER:-pleroma} -d postgres://${DB_HOST:-db}:5432/${DB
done
echo "-- Running migrations..."
$HOME/bin/pleroma_ctl migrate
mix ecto.migrate
echo "-- Starting!"
exec $HOME/bin/pleroma start
mix phx.server

View file

@ -0,0 +1,14 @@
# default docker Caddyfile config for Akkoma
#
# Simple installation instructions:
# 1. Replace 'example.tld' with your instance's domain wherever it appears.
example.tld {
log {
output file /var/log/caddy/akkoma.log
}
encode gzip
reverse_proxy akkoma:4000
}

docker-resources/build.sh Executable file
View file

@ -0,0 +1,4 @@
#!/bin/sh
docker-compose build --build-arg UID=$(id -u) --build-arg GID=$(id -g) akkoma
docker-compose build --build-arg UID=$(id -u) --build-arg GID=$(id -g) db

View file

@ -0,0 +1,10 @@
FROM postgres:14-alpine
ARG UID=1000
ARG GID=1000
ARG UNAME=akkoma
RUN addgroup -g $GID $UNAME
RUN adduser -u $UID -G $UNAME -D -h $HOME $UNAME
USER akkoma

View file

@ -0,0 +1,4 @@
MIX_ENV=prod
DB_NAME=akkoma
DB_USER=akkoma
DB_PASS=akkoma

docker-resources/manage.sh Executable file
View file

@ -0,0 +1,3 @@
#!/bin/sh
docker-compose run --rm akkoma $@

View file

@ -5,3 +5,5 @@ install:
pipenv install
clean:
rm -rf site
serve:
pipenv run python3 -m http.server -d site

View file

@ -300,3 +300,28 @@
```sh
mix pleroma.user unconfirm_all
```
## Fix following state
Sometimes the system can get into a situation where
it thinks you're already following someone and won't send a request
to the remote instance, or won't let you unfollow someone. This
bug was fixed, but in case you encounter these weird states:
=== "OTP"
```sh
./bin/pleroma_ctl user fix_follow_state localuser remoteuser@example.com
```
=== "From Source"
```sh
mix pleroma.user fix_follow_state localuser remoteuser@example.com
```
The first argument is the local user's nickname - if you are `myuser@myinstance`, this should be `myuser`.
The second is the remote user, consisting of both nickname AND domain.
If you are in a weird follow-state situation and cannot resolve it with the above, you may need to co-operate with the remote admin to clear the state on their side too - they should provide the arguments *backwards*, i.e. `fix_follow_state remote local`.

View file

@ -14,6 +14,10 @@ su akkoma -s $SHELL -lc "./bin/pleroma_ctl update"
su akkoma -s $SHELL -lc "./bin/pleroma_ctl migrate"
```
If you selected an alternate flavour on installation,
you _may_ need to specify `--flavour`, in the same way as
[when installing](../../installation/otp_en#detecting-flavour).
## For from source installations (using git)
1. Go to the working directory of Akkoma (default is `/opt/akkoma`)

View file

@ -8,11 +8,6 @@ For from source installations Akkoma configuration works by first importing the
To add configuration to your config file, you can copy it from the base config. The latest version of it can be viewed [here](https://akkoma.dev/AkkomaGang/akkoma/src/branch/develop/config/config.exs). You can also use this file if you don't know how an option is supposed to be formatted.
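As a quick illustration - a minimal sketch, assuming a from source install whose config file is `config/prod.secret.exs`; the instance name and description below are placeholders - overriding a base-config option looks like this:

```elixir
# config/prod.secret.exs -- only override the keys you need;
# everything else falls back to the defaults in config.exs
import Config

config :pleroma, :instance,
  name: "My Akkoma",
  description: "A small instance for friends"
```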
## :shout
* `enabled` - Enables the backend Shoutbox chat feature. Defaults to `true`.
* `limit` - Shout character limit. Defaults to `5_000`
## :instance
* `name`: The instances name.
* `email`: Email used to reach an Administrator/Moderator of the instance.
@ -39,7 +34,7 @@ To add configuration to your config file, you can copy it from the base config.
* `federation_reachability_timeout_days`: Timeout (in days) of each external federation target being unreachable prior to pausing federating to it.
* `allow_relay`: Permits remote instances to subscribe to all public posts of your instance. This may increase the visibility of your instance.
* `public`: Makes the client API authenticated-mode-only, except for user profiles. Useful for disabling the Local Timeline and The Whole Known Network. Note that there is a dependent setting restricting or allowing unauthenticated access to specific resources; see `restrict_unauthenticated` for more details.
* `quarantined_instances`: ActivityPub instances where private (DMs, followers-only) activities will not be send.
* `quarantined_instances`: *DEPRECATED* ActivityPub instances where activities will not be sent. Activities can still reach those instances via other means; we just won't send them.
* `allowed_post_formats`: MIME-type list of formats allowed to be posted (transformed into HTML).
* `extended_nickname_format`: Set to `true` to use extended local nicknames format (allows underscores/dashes). This will break federation with
older software for these nicknames.
@ -64,6 +59,7 @@ To add configuration to your config file, you can copy it from the base config.
* `cleanup_attachments`: Remove attachments along with statuses. Does not affect duplicate files and attachments without status. Enabling this will increase the load on the database when deleting statuses on larger instances.
* `show_reactions`: Let favourites and emoji reactions be viewed through the API (default: `true`).
* `password_reset_token_validity`: The time after which reset tokens aren't accepted anymore, in seconds (default: one day).
* `local_bubble`: Array of domains representing instances closely related to yours. Used to populate the `bubble` timeline, e.g. `["example.com"]` (default: `[]`)
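For example, a minimal `local_bubble` sketch (the domains are placeholders):

```elixir
config :pleroma, :instance,
  # Domain-only entries; these instances populate the "bubble" timeline
  local_bubble: ["example.com", "friends.example.net"]
```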
## :database
* `improved_hashtag_timeline`: Setting to force toggle / force disable improved hashtags timeline. `:enabled` forces hashtags to be fetched from `hashtags` table for hashtags timeline. `:disabled` forces object-embedded hashtags to be used (slower). Keep it `:auto` for automatic behaviour (it is auto-set to `:enabled` [unless overridden] when HashtagsTableMigrator completes).
@ -77,10 +73,6 @@ To add configuration to your config file, you can copy it from the base config.
* `enabled`: Enables sending a direct message to newly registered users. Defaults to `false`; see the sketch after this list.
* `sender_nickname`: The nickname of the local user that sends the welcome message.
* `message`: The message that will be sent to newly registered users as a direct message.
* `chat_message`: - welcome message sent as a chat message.
* `enabled`: Enables the send a chat message to a newly registered user. Defaults to `false`.
* `sender_nickname`: The nickname of the local user that sends the welcome message.
* `message`: A message that will be send to a newly registered users as a chat message.
* `email`: welcome message sent as an email.
* `enabled`: Enables sending a welcome email to newly registered users. Defaults to `false`.
* `sender`: The email address, or tuple with `{nickname, email}`, that will be used as the sender of the welcome email.
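As a sketch, a welcome direct message could be configured like this (the nickname and message are placeholders):

```elixir
config :pleroma, :welcome,
  direct_message: [
    enabled: true,
    sender_nickname: "admin",
    message: "Welcome aboard! Have a look at the local timeline to get started."
  ]
```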
@ -129,6 +121,7 @@ To add configuration to your config file, you can copy it from the base config.
* `Pleroma.Web.ActivityPub.MRF.KeywordPolicy`: Rejects posts, removes them from the federated timeline, or replaces keywords, depending on configuration. (See [`:mrf_keyword`](#mrf_keyword)).
* `transparency`: Make the content of your Message Rewrite Facility settings public (via nodeinfo).
* `transparency_exclusions`: Exclude specific instance names from MRF transparency. The use of the exclusions feature will be disclosed in nodeinfo as a boolean value.
* `transparency_obfuscate_domains`: Show domains with `*` in the middle, to censor them if needed. For example, `ridingho.me` will show as `rid*****.me`
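A minimal sketch of these transparency options (the domain is a placeholder, reusing the example above):

```elixir
config :pleroma, :mrf,
  transparency: true,
  transparency_exclusions: [],
  # shown in nodeinfo as "rid*****.me" instead of the full domain
  transparency_obfuscate_domains: ["ridingho.me"]
```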
## Federation
### MRF policies
@ -140,7 +133,7 @@ To add configuration to your config file, you can copy it from the base config.
* `media_removal`: List of instances to strip media attachments from and the reason for doing so.
* `media_nsfw`: List of instances to tag all media as NSFW (sensitive) from and the reason for doing so.
* `federated_timeline_removal`: List of instances to remove from the Federated Timeline (aka The Whole Known Network) and the reason for doing so.
* `reject`: List of instances to reject activities (except deletes) from and the reason for doing so.
* `reject`: List of instances to reject activities (except deletes) from and the reason for doing so. Additionally prevents activities from being sent to that instance.
* `accept`: List of instances to only accept activities (except deletes) from and the reason for doing so.
* `followers_only`: Force posts from the given instances to be visible by followers only and the reason for doing so.
* `report_removal`: List of instances to reject reports from and the reason for doing so.
@ -292,14 +285,19 @@ config :pleroma, :frontends,
"name" => "swagger-ui",
"ref" => "stable",
"enabled" => true
}
},
mastodon: %{
"name" => "mastodon-fe",
"ref" => "akkoma"
}
```
* `:primary` - The frontend that will be served at `/`
* `:admin` - The frontend that will be served at `/pleroma/admin`
* `:swagger` - Config for developers to act as an API reference to be served at `/akkoma/swaggerui/` (trailing slash _needed_). Disabled by default.
* `:mastodon` - The mastodon-fe configuration. This shouldn't need to be changed. This is served at `/web` when installed.
### :static_fe
### :static\_fe
Render profiles and posts using server-generated HTML that is viewable without using JavaScript.
@ -525,7 +523,7 @@ Available caches:
### :http
* `proxy_url`: an upstream proxy to fetch posts and/or media with, (default: `nil`)
* `proxy_url`: an upstream proxy to fetch posts and/or media with (default: `nil`); for example `http://127.0.0.1:3192`. Does not support SOCKS5 proxies, only HTTP(S); see the sketch after this list.
* `send_user_agent`: should we include a user agent with HTTP requests? (default: `true`)
* `user_agent`: what user agent should we use? (default: `:default`), must be string or `:default`
* `adapter`: array of adapter options
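For example, a sketch of routing fetches through a local HTTP proxy (the address is a placeholder; as noted above, SOCKS5 is not supported):

```elixir
config :pleroma, :http,
  # plain HTTP(S) proxy only
  proxy_url: "http://127.0.0.1:3192",
  send_user_agent: true,
  user_agent: :default
```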
@ -1043,7 +1041,22 @@ config :pleroma, Pleroma.Formatter,
## Custom Runtime Modules (`:modules`)
* `runtime_dir`: A path to custom Elixir modules (such as MRF policies).
* `runtime_dir`: A path to custom Elixir modules, such as MRF policies or
custom authenticators. These modules will be loaded on boot, and can be
contained in subdirectories. It is advised to use version-controlled
subdirectories to make management of them a bit easier. Note that only
files with the extension `.ex` will be loaded.
```elixir
config :pleroma, :modules, runtime_dir: "instance/modules"
```
### Adding a module
```bash
cd instance/modules/
git clone <MY MODULE>
```
## :configurable_from_database
@ -1147,3 +1160,28 @@ Each job has these settings:
* `:max_running` - max concurrently runnings jobs
* `:max_waiting` - max waiting jobs
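A sketch of what this looks like in config, using the default limits for `Pleroma.Search`:

```elixir
config :pleroma, ConcurrentLimiter, [
  # at most 30 search jobs running at once, with up to 50 queued
  {Pleroma.Search, [max_running: 30, max_waiting: 50]}
]
```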
### Translation Settings
Settings to automatically translate statuses for end users. Currently supported
translation services are DeepL and LibreTranslate.
Translations are available at `/api/v1/statuses/:id/translations/:language`, where
`language` is the target language code (e.g. `en`).
### `:translator`
- `:enabled` - enables translation
- `:module` - Sets module to be used
- Either `Pleroma.Akkoma.Translators.DeepL` or `Pleroma.Akkoma.Translators.LibreTranslate`
### `:deepl`
- `:api_key` - API key for DeepL
- `:tier` - API tier
- either `:free` or `:pro`
### `:libre_translate`
- `:url` - URL of LibreTranslate instance
- `:api_key` - API key for LibreTranslate
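Putting these together, a minimal sketch for DeepL-backed translation (the API key is a placeholder):

```elixir
config :pleroma, :translator,
  enabled: true,
  module: Pleroma.Akkoma.Translators.DeepL

config :pleroma, :deepl,
  # either :free or :pro
  tier: :free,
  api_key: "your-deepl-api-key"

# Or, to use LibreTranslate instead:
# config :pleroma, :translator, enabled: true, module: Pleroma.Akkoma.Translators.LibreTranslate
# config :pleroma, :libre_translate, url: "http://127.0.0.1:5000", api_key: nil
```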

View file

@ -19,6 +19,10 @@ config :pleroma, :frontends,
admin: %{
"name" => "admin-fe",
"ref" => "stable"
},
mastodon: %{
"name" => "mastodon-fe",
"ref" => "akkoma"
}
```
@ -26,12 +30,18 @@ This would serve the frontend from the folder at `$instance_static/frontends
Refer to [the frontend CLI task](../../administration/CLI_tasks/frontend) for how to install the frontend's files
If you wish masto-fe to be enabled as well, you will also need to run the install task for `mastodon-fe`; not doing this will lead to the frontend not working.
If you choose not to install a frontend for whatever reason, it is recommended that you enable [`:static_fe`](#static_fe) to allow remote users to click "view remote source". Don't bother with this if you've got no unauthenticated access though.
You can also replace the default "no frontend" page by placing an `index.html` file under your `instance/static/` directory.
## Mastodon-FE
Akkoma supports both [glitchsoc](https://github.com/glitch-soc/mastodon)'s more "vanilla" mastodon frontend,
as well as [fedibird](https://github.com/fedibird/mastodon)'s extended frontend which has near-feature-parity with akkoma (with quoting and reactions).
To enable either one, you must run the `frontend.install` task for either `mastodon-fe` or `fedibird-fe` (both `--ref akkoma`), then make sure
`:pleroma, :frontends, :mastodon` references the one you want.
## Swagger (openAPI) documentation viewer
If you're a developer and you'd like a human-readable rendering of the

View file

@ -0,0 +1,62 @@
# How to use a different domain name for Akkoma and the users it serves
Akkoma users are primarily identified by a `user@example.org` handle, and you might want this identifier to be the same as your email or jabber account, for instance.
However, in this case, you are almost certainly serving some web content on `https://example.org` already, and you might want to use another domain (say `akkoma.example.org`) for Akkoma itself.
Akkoma supports that, but it might be tricky to set up, and any error might prevent you from federating with other instances.
*If you are already running Akkoma on `example.org`, it is no longer possible to move it to `akkoma.example.org`.*
## Account identifiers
It is important to understand that for federation purposes, a user in Akkoma has two unique identifiers associated:
- A webfinger `acct:` URI, used for discovery and as a verifiable global name for the user across Akkoma instances. In our example, our account's acct: URI is `acct:user@example.org`
- An author/actor URI, used in every other aspect of federation. This is the way in which users are identified in ActivityPub, the underlying protocol used for federation with other Akkoma instances.
In our case, it is `https://akkoma.example.org/users/user`.
Both account identifiers are unique and required for Akkoma. An important risk if you set up your Akkoma instance incorrectly is to create two users (with different acct: URIs) with conflicting author/actor URIs.
## WebFinger
As said earlier, each Akkoma user has an `acct`: URI, which is used for discovery and authentication. When you add @user@example.org, a webfinger query is performed. This is done in two steps:
1. Querying `https://example.org/.well-known/host-meta` (where the domain of the URL matches the domain part of the `acct`: URI) to get information on how to perform the query.
This file will indeed contain a URL template of the form `https://example.org/.well-known/webfinger?resource={uri}` that will be used in the second step.
2. Fill the returned template with the `acct`: URI to be queried and perform the query: `https://example.org/.well-known/webfinger?resource=acct:user@example.org`
## Configuring your Akkoma instance
**_DO NOT ATTEMPT TO CONFIGURE YOUR INSTANCE THIS WAY IF YOU DID NOT UNDERSTAND THE ABOVE_**
### Configuring Akkoma
Akkoma has two configuration settings to enable using different domains for your users and Akkoma itself: `host` in `Pleroma.Web.Endpoint` and `domain` in `Pleroma.Web.WebFinger`. When the latter is not set, it defaults to the value of `host`.
*Be extra careful when configuring your Akkoma instance, as changing `host` may cause remote instances to register different accounts with the same author/actor URI, which will result in federation issues!*
```elixir
config :pleroma, Pleroma.Web.Endpoint,
url: [host: "akkoma.example.org"]
config :pleroma, Pleroma.Web.WebFinger, domain: "example.org"
```
- `domain` - the domain for which your Akkoma instance has authority; it's the domain used in `acct:` URIs. In our example, `domain` would be set to `example.org`.
- `host` - the domain used for any URL generated for your instance, including the author/actor URLs. In our case, that would be `akkoma.example.org`.
### Configuring WebFinger domain
Now, you have Akkoma running at `https://akkoma.example.org` as well as a website at `https://example.org`. If you recall how webfinger queries work, the first step is to query `https://example.org/.well-known/host-meta`, which will contain a URL template.
Therefore, the easiest way to configure `example.org` is to redirect `/.well-known/host-meta` to `akkoma.example.org`.
With nginx, it would be as simple as adding:
```nginx
location = /.well-known/host-meta {
return 301 https://akkoma.example.org$request_uri;
}
```
in example.org's server block.

View file

@ -21,7 +21,7 @@ This will only save the theme for you personally. To make it available to the wh
### Upload the theme to the server
Themes can be found in the [static directory](static_dir.md). Create `STATIC-DIR/static/themes/` if needed and copy your theme there. Next you need to add an entry for your theme to `STATIC-DIR/static/styles.json`. If you use a from source installation, you'll first need to copy the file from `priv/static/static/styles.json`.
Themes can be found in the [static directory](static_dir.md). Create `STATIC-DIR/static/themes/` if needed and copy your theme there. Next you need to add an entry for your theme to `STATIC-DIR/static/styles.json`. If you use a from source installation, you'll first need to copy the file from `STATIC-DIR/frontends/pleroma-fe/REF/static/styles.json` (where `REF` is `stable` or `develop` depending on which ref you decided to install).
Example of `styles.json` where we add our own `my-awesome-theme.json`
```json

View file

@ -14,11 +14,12 @@ apt -yq install tor
**WARNING:** Onion instances not using a Tor version supporting V3 addresses will not be able to federate with you.
Create the hidden service for your Akkoma instance in `/etc/tor/torrc`:
Create the hidden service for your Akkoma instance in `/etc/tor/torrc`, with an HTTP tunnel:
```
HiddenServiceDir /var/lib/tor/akkoma_hidden_service/
HiddenServicePort 80 127.0.0.1:8099
HiddenServiceVersion 3 # Remove if Tor version is below 0.3 ( tor --version )
HTTPTunnelPort 9080
```
Restart Tor to generate an address:
```
@ -35,7 +36,7 @@ Next, edit your Akkoma config.
If running in prod, navigate to your Akkoma directory, edit `config/prod.secret.exs`
and append this line:
```
config :pleroma, :http, proxy_url: {:socks5, :localhost, 9050}
config :pleroma, :http, proxy_url: "http://localhost:9080"
```
In your Akkoma directory, assuming you're running prod,
run the following:

View file

@ -141,8 +141,7 @@ You then need to set the URL and authentication credentials if relevant.
### Initial indexing
After setting up the configuration, you'll want to index all of your already existsing posts. Only public posts are indexed. You'll only
have to do it one time, but it might take a while, depending on the amount of posts your instance has seen.
After setting up the configuration, you'll want to index all of your already existing posts. You'll only have to do it one time, but it might take a while, depending on the amount of posts your instance has seen.
The sequence of actions is as follows:

View file

@ -1031,7 +1031,6 @@ Most of the settings will be applied in `runtime`, this means that you don't nee
- `:hackney_pools`
- `:connections_pool`
- `:pools`
- `:chat`
- partially settings inside these keys:
- `:seconds_valid` in `Pleroma.Captcha`
- `:proxy_remote` in `Pleroma.Upload`
@ -1411,127 +1410,6 @@ Loads json generated from `config/descriptions.exs`.
```
## GET /api/v1/pleroma/admin/users/:nickname/chats
### List a user's chats
- Params: None
- Response:
```json
[
{
"sender": {
"id": "someflakeid",
"username": "somenick",
...
},
"receiver": {
"id": "someflakeid",
"username": "somenick",
...
},
"id" : "1",
"unread" : 2,
"last_message" : {...}, // The last message in that chat
"updated_at": "2020-04-21T15:11:46.000Z"
}
]
```
## GET /api/v1/pleroma/admin/chats/:chat_id
### View a single chat
- Params: None
- Response:
```json
{
"sender": {
"id": "someflakeid",
"username": "somenick",
...
},
"receiver": {
"id": "someflakeid",
"username": "somenick",
...
},
"id" : "1",
"unread" : 2,
"last_message" : {...}, // The last message in that chat
"updated_at": "2020-04-21T15:11:46.000Z"
}
```
## GET /api/v1/pleroma/admin/chats/:chat_id/messages
### List the messages in a chat
- Params: `max_id`, `min_id`
- Response:
```json
[
{
"account_id": "someflakeid",
"chat_id": "1",
"content": "Check this out :firefox:",
"created_at": "2020-04-21T15:11:46.000Z",
"emojis": [
{
"shortcode": "firefox",
"static_url": "https://dontbulling.me/emoji/Firefox.gif",
"url": "https://dontbulling.me/emoji/Firefox.gif",
"visible_in_picker": false
}
],
"id": "13",
"unread": true
},
{
"account_id": "someflakeid",
"chat_id": "1",
"content": "Whats' up?",
"created_at": "2020-04-21T15:06:45.000Z",
"emojis": [],
"id": "12",
"unread": false
}
]
```
## DELETE /api/v1/pleroma/admin/chats/:chat_id/messages/:message_id
### Delete a single message
- Params: None
- Response:
```json
{
"account_id": "someflakeid",
"chat_id": "1",
"content": "Check this out :firefox:",
"created_at": "2020-04-21T15:11:46.000Z",
"emojis": [
{
"shortcode": "firefox",
"static_url": "https://dontbulling.me/emoji/Firefox.gif",
"url": "https://dontbulling.me/emoji/Firefox.gif",
"visible_in_picker": false
}
],
"id": "13",
"unread": false
}
```
## `GET /api/v1/pleroma/admin/instance_document/:document_name`
### Get an instance document
@ -1636,3 +1514,117 @@ Returns the content of the document
"error": "Could not install frontend"
}
```
## `GET /api/v1/pleroma/admin/announcements`
### List announcements
- Params: `offset`, `limit`
- Response: JSON, list of announcements
```json
[
{
"id": "AHDp0GBdRn1EPN5HN2",
"content": "some content",
"starts_at": null,
"ends_at": null,
"all_day": false,
"published_at": "2022-03-09T02:13:05",
"reactions": [],
"statuses": [],
"tags": [],
"emojis": [],
"updated_at": "2022-03-09T02:13:05"
}
]
```
Note that this differs from the Mastodon API variant: Mastodon API only returns *active* announcements, while this returns all.
## `GET /api/v1/pleroma/admin/announcements/:id`
### Display one announcement
- Response: JSON, one announcement
```json
{
"id": "AHDp0GBdRn1EPN5HN2",
"content": "some content",
"starts_at": null,
"ends_at": null,
"all_day": false,
"published_at": "2022-03-09T02:13:05",
"reactions": [],
"statuses": [],
"tags": [],
"emojis": [],
"updated_at": "2022-03-09T02:13:05"
}
```
## `POST /api/v1/pleroma/admin/announcements`
### Create an announcement
- Params:
- `content`: string, required, announcement content
- `starts_at`: datetime, optional, default to null, the time when the announcement will become active (displayed to users); if it is null, the announcement will be active immediately
- `ends_at`: datetime, optional, default to null, the time when the announcement will become inactive (no longer displayed to users); if it is null, the announcement will be active until an admin deletes it
- `all_day`: boolean, optional, default to false, tells the client whether to only display dates for `starts_at` and `ends_at`
- Response: JSON, created announcement
```json
{
"id": "AHDp0GBdRn1EPN5HN2",
"content": "some content",
"starts_at": null,
"ends_at": null,
"all_day": false,
"published_at": "2022-03-09T02:13:05",
"reactions": [],
"statuses": [],
"tags": [],
"emojis": [],
"updated_at": "2022-03-09T02:13:05"
}
```
## `PATCH /api/v1/pleroma/admin/announcements/:id`
### Change an announcement
- Params: same as `POST /api/v1/pleroma/admin/announcements`, except no param is required.
- Updates the announcement according to params. Missing params are kept as-is.
- Response: JSON, updated announcement
```json
{
"id": "AHDp0GBdRn1EPN5HN2",
"content": "some content",
"starts_at": null,
"ends_at": null,
"all_day": false,
"published_at": "2022-03-09T02:13:05",
"reactions": [],
"statuses": [],
"tags": [],
"emojis": [],
"updated_at": "2022-03-09T02:13:05"
}
```
## `DELETE /api/v1/pleroma/admin/announcements/:id`
### Delete an announcement
- Response: JSON, empty object
```json
{}
```

View file

@ -1,255 +0,0 @@
# Chats
Chats are a way to represent an IM-style conversation between two actors. They are not the same as direct messages and they are not `Status`es, even though they have a lot in common.
## Why Chats?
There are no 'visibility levels' in ActivityPub, their definition is purely a Mastodon convention. Direct Messaging between users on the fediverse has mostly been modeled by using ActivityPub addressing following Mastodon conventions on normal `Note` objects. In this case, a 'direct message' would be a message that has no followers addressed and also does not address the special public actor, but just the recipients in the `to` field. It would still be a `Note` and is presented with other `Note`s as a `Status` in the API.
This is an awkward setup for a few reasons:
- As DMs generally still follow the usual `Status` conventions, it is easy to accidentally pull somebody into a DM thread by mentioning them. (e.g. "I hate @badguy so much")
- It is possible to go from a publicly addressed `Status` to a DM reply, back to public, then to a 'followers only' reply, and so on. This can become very confusing, as it is unclear which user can see which part of the conversation.
- The standard `Status` format of implicit addressing also leads to rather ugly results if you try to display the messages as a chat, because all the recipients are always mentioned by name in the message.
- As direct messages are posted with the same api call (and usually same frontend component) as public messages, accidentally making a public message private or vice versa can happen easily. Client bugs can also lead to this, accidentally making private messages public.
As a measure to improve this situation, the `Conversation` concept and related Akkoma extensions were introduced. While it made it possible to work around a few of the issues, many of the problems remained and it didn't see much adoption because it was too complicated to use correctly.
## Chats explained
For these reasons, Chats are a new and different entity, both in the API as well as in ActivityPub. A quick overview:
- Chats are meant to represent an instant message conversation between two actors. For now these are only 1-on-1 conversations, but the other actor can be a group in the future.
- Chat messages have the ActivityPub type `ChatMessage`. They are not `Note`s. Servers that don't understand them will just drop them.
- The only addressing allowed in `ChatMessage`s is one single ActivityPub actor in the `to` field.
- There's always only one Chat between two actors. If you start chatting with someone and later start a 'new' Chat, the old Chat will be continued.
- `ChatMessage`s are posted with a different api, making it very hard to accidentally send a message to the wrong person.
- `ChatMessage`s don't show up in the existing timelines.
- Chats can never go from private to public. They are always private between the two actors.
## Caveats
- Chats are NOT E2E encrypted (yet). Security is still the same as email.
## API
In general, the way to send a `ChatMessage` is to first create a `Chat`, then post a message to that `Chat`. `Group`s will later be supported by making them a sub-type of `Account`.
This is the overview of using the API. The API is also documented via OpenAPI, so you can view it and play with it by pointing SwaggerUI or a similar OpenAPI tool to `https://yourinstance.tld/api/openapi`.
### Creating or getting a chat.
To create or get an existing Chat for a certain recipient (identified by Account ID)
you can call:
`POST /api/v1/pleroma/chats/by-account-id/:account_id`
The account id is the normal FlakeId of the user
```
POST /api/v1/pleroma/chats/by-account-id/someflakeid
```
If you already have the id of a chat, you can also use
```
GET /api/v1/pleroma/chats/:id
```
There will only ever be ONE Chat for you and a given recipient, so this call
will return the same Chat if you already have one with that user.
Returned data:
```json
{
"account": {
"id": "someflakeid",
"username": "somenick",
...
},
"id" : "1",
"unread" : 2,
"last_message" : {...}, // The last message in that chat
"updated_at": "2020-04-21T15:11:46.000Z"
}
```
### Marking a chat as read
To mark a number of messages in a chat up to a certain message as read, you can use
`POST /api/v1/pleroma/chats/:id/read`
Parameters:
- last_read_id: Given this id, all chat messages until this one will be marked as read. Required.
Returned data:
```json
{
"account": {
"id": "someflakeid",
"username": "somenick",
...
},
"id" : "1",
"unread" : 0,
"updated_at": "2020-04-21T15:11:46.000Z"
}
```
### Marking a single chat message as read
To set the `unread` property of a message to `false`
`POST /api/v1/pleroma/chats/:id/messages/:message_id/read`
Returned data:
The modified chat message
### Getting a list of Chats
`GET /api/v1/pleroma/chats`
This will return a list of chats that you have been involved in, sorted by their
last update (so new chats will be at the top).
Parameters:
- with_muted: Include chats from muted users (boolean).
Returned data:
```json
[
{
"account": {
"id": "someflakeid",
"username": "somenick",
...
},
"id" : "1",
"unread" : 2,
"last_message" : {...}, // The last message in that chat
"updated_at": "2020-04-21T15:11:46.000Z"
}
]
```
The recipient of messages that are sent to this chat is given by their AP ID.
No pagination is implemented for now.
### Getting the messages for a Chat
For a given Chat id, you can get the associated messages with
`GET /api/v1/pleroma/chats/:id/messages`
This will return all messages, sorted by most recent to least recent. The usual
pagination options are implemented.
Returned data:
```json
[
{
"account_id": "someflakeid",
"chat_id": "1",
"content": "Check this out :firefox:",
"created_at": "2020-04-21T15:11:46.000Z",
"emojis": [
{
"shortcode": "firefox",
"static_url": "https://dontbulling.me/emoji/Firefox.gif",
"url": "https://dontbulling.me/emoji/Firefox.gif",
"visible_in_picker": false
}
],
"id": "13",
"unread": true
},
{
"account_id": "someflakeid",
"chat_id": "1",
"content": "Whats' up?",
"created_at": "2020-04-21T15:06:45.000Z",
"emojis": [],
"id": "12",
"unread": false,
"idempotency_key": "75442486-0874-440c-9db1-a7006c25a31f"
}
]
```
- idempotency_key: The copy of the `idempotency-key` HTTP request header that can be used for optimistic message sending. Included only during the first few minutes after the message creation.
### Posting a chat message
Posting a chat message for given Chat id works like this:
`POST /api/v1/pleroma/chats/:id/messages`
Parameters:
- content: The text content of the message. Optional if media is attached.
- media_id: The id of an upload that will be attached to the message.
Currently, no formatting beyond basic escaping and emoji is implemented.
Returned data:
```json
{
"account_id": "someflakeid",
"chat_id": "1",
"content": "Check this out :firefox:",
"created_at": "2020-04-21T15:11:46.000Z",
"emojis": [
{
"shortcode": "firefox",
"static_url": "https://dontbulling.me/emoji/Firefox.gif",
"url": "https://dontbulling.me/emoji/Firefox.gif",
"visible_in_picker": false
}
],
"id": "13",
"unread": false
}
```
### Deleting a chat message
Deleting a chat message for given Chat id works like this:
`DELETE /api/v1/pleroma/chats/:chat_id/messages/:message_id`
Returned data is the deleted message.
### Notifications
There's a new `pleroma:chat_mention` notification, which has this form. It is not given out in the notifications endpoint by default, you need to explicitly request it with `include_types[]=pleroma:chat_mention`:
```json
{
"id": "someid",
"type": "pleroma:chat_mention",
"account": { ... } // User account of the sender,
"chat_message": {
"chat_id": "1",
"id": "10",
"content": "Hello",
"account_id": "someflakeid",
"unread": false
},
"created_at": "somedate"
}
```
### Streaming
There is an additional `user:pleroma_chat` stream. Incoming chat messages will make the current chat be sent to this `user` stream. The `event` of an incoming chat message is `pleroma:chat_update`. The payload is the updated chat with the incoming chat message in the `last_message` field.
### Web Push
If you want to receive push messages for this type, you'll need to add the `pleroma:chat_mention` type to your alerts in the push subscription.

View file

@ -40,6 +40,10 @@ Has these additional fields under the `pleroma` object:
- `parent_visible`: If the parent of this post is visible to the user or not.
- `pinned_at`: a datetime (iso8601) when status was pinned, `null` otherwise.
The `GET /api/v1/statuses/:id/source` endpoint additionally has the following attributes:
- `content_type`: The content type of the status source.
## Scheduled statuses
Has these additional fields in `params`:
@ -99,13 +103,11 @@ Has these additional fields under the `pleroma` object:
- `hide_followers_count`: boolean, true when the user has follower stat hiding enabled
- `hide_follows_count`: boolean, true when the user has follow stat hiding enabled
- `settings_store`: A generic map of settings for frontends. Opaque to the backend. Only returned in `/api/v1/accounts/verify_credentials` and `/api/v1/accounts/update_credentials`
- `chat_token`: The token needed for Akkoma shoutbox. Only returned in `/api/v1/accounts/verify_credentials`
- `deactivated`: boolean, true when the user is deactivated
- `allow_following_move`: boolean, true when the user allows their follows to be transferred automatically when a followed account moves
- `unread_conversation_count`: The count of unread conversations. Only returned to the account owner.
- `unread_notifications_count`: The count of unread notifications. Only returned to the account owner.
- `notification_settings`: object, can be absent. See `/api/v1/pleroma/notification_settings` for the parameters/keys returned.
- `accepts_chat_messages`: boolean, but can be null if we don't have that information about a user
- `favicon`: nullable URL string, Favicon image of the user's instance
### Source
@ -159,15 +161,6 @@ The `type` value is `pleroma:emoji_reaction`. Has these fields:
- `account`: The account of the user who reacted
- `status`: The status that was reacted on
### ChatMention Notification (not default)
This notification has to be requested explicitly.
The `type` value is `pleroma:chat_mention`
- `account`: The account who sent the message
- `chat_message`: The chat message
### Report Notification (not default)
This notification has to be requested explicitly.
@ -182,7 +175,7 @@ The `type` value is `pleroma:report`
Accepts additional parameters:
- `exclude_visibilities`: will exclude the notifications for activities with the given visibilities. The parameter accepts an array of visibility types (`public`, `unlisted`, `private`, `direct`). Usage example: `GET /api/v1/notifications?exclude_visibilities[]=direct&exclude_visibilities[]=private`.
- `include_types`: will include the notifications for activities with the given types. The parameter accepts an array of types (`mention`, `follow`, `reblog`, `favourite`, `move`, `pleroma:emoji_reaction`, `pleroma:chat_mention`, `pleroma:report`). Usage example: `GET /api/v1/notifications?include_types[]=mention&include_types[]=reblog`.
- `include_types`: will include the notifications for activities with the given types. The parameter accepts an array of types (`mention`, `follow`, `reblog`, `favourite`, `move`, `pleroma:emoji_reaction`, `pleroma:report`). Usage example: `GET /api/v1/notifications?include_types[]=mention&include_types[]=reblog`.
## DELETE `/api/v1/notifications/destroy_multiple`
@ -240,7 +233,6 @@ Additional parameters can be added to the JSON body/Form data:
- `pleroma_background_image` - sets the background image of the user. Can be set to "" (an empty string) to reset.
- `discoverable` - if true, external services (search bots) etc. are allowed to index / list the account (regardless of this setting, user will still appear in regular search results).
- `actor_type` - the type of this account.
- `accepts_chat_messages` - if false, this account will reject all chat messages.
- `language` - user's preferred language for receiving emails (digest, confirmation, etc.)
All images (avatar, banner and background) can be reset to the default by sending an empty string ("") instead of a file.
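For example, resetting the background image might look like this (a sketch, assuming the standard `update_credentials` endpoint, with `$TOKEN` and `example.com` as placeholders):
```bash
# Send an empty string to reset the background image to the default
curl -X PATCH \
  -H "Authorization: Bearer $TOKEN" \
  --data "pleroma_background_image=" \
  "https://example.com/api/v1/accounts/update_credentials"
```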
@ -300,7 +292,6 @@ Has these additional parameters (which are the same as in Akkoma-API):
`GET /api/v1/instance` has additional fields
- `max_toot_chars`: The maximum characters per post
- `chat_limit`: The maximum characters per chat message
- `description_limit`: The maximum characters per image description
- `poll_limits`: The limits of polls
- `upload_limit`: The maximum upload file size
@ -321,7 +312,6 @@ Has these additional parameters (which are the same as in Akkoma-API):
Permits these additional alert types:
- pleroma:chat_mention
- pleroma:emoji_reaction
## Markers
@ -332,10 +322,6 @@ Has these additional fields under the `pleroma` object:
## Streaming
### Chats
There is an additional `user:pleroma_chat` stream. Incoming chat messages will make the current chat be sent to this `user` stream. The `event` of an incoming chat message is `pleroma:chat_update`. The payload is the updated chat with the incoming chat message in the `last_message` field.
### Remote timelines
For viewing remote server timelines, there are `public:remote` and `public:remote:media` streams. Each of these accept a parameter like `?instance=lain.com`.

View file

@ -44,11 +44,8 @@ See also [the Nodeinfo standard](https://nodeinfo.diaspora.software/).
"shareable_emoji_packs",
"multifetch",
"pleroma:api/v1/notifications:include_types_filter",
"chat",
"shout",
"relay",
"pleroma_emoji_reactions",
"pleroma_chat_messages"
"pleroma_emoji_reactions"
],
"federation":{
"enabled":true,
@ -204,11 +201,8 @@ See also [the Nodeinfo standard](https://nodeinfo.diaspora.software/).
"shareable_emoji_packs",
"multifetch",
"pleroma:api/v1/notifications:include_types_filter",
"chat",
"shout",
"relay",
"pleroma_emoji_reactions",
"pleroma_chat_messages"
"pleroma_emoji_reactions"
],
"federation":{
"enabled":true,

View file

@ -576,38 +576,6 @@ The status posting endpoint takes an additional parameter, `in_reply_to_conversa
* Response: the archive of the pack with a 200 status code, 403 if the pack is not set as shared,
404 if the pack does not exist
## `GET /api/v1/pleroma/accounts/:id/scrobbles`
### Requests a list of current and recent Listen activities for an account
* Method `GET`
* Authentication: not required
* Params: None
* Response: An array of media metadata entities.
* Example response:
```json
[
{
"account": {...},
"id": "1234",
"title": "Some Title",
"artist": "Some Artist",
"album": "Some Album",
"length": 180000,
"created_at": "2019-09-28T12:40:45.000Z"
}
]
```
## `POST /api/v1/pleroma/scrobble`
### Creates a new Listen activity for an account
* Method `POST`
* Authentication: required
* Params:
* `title`: the title of the media playing
* `album`: the album of the media playing [optional]
* `artist`: the artist of the media playing [optional]
* `length`: the length of the media playing [optional]
* Response: the newly created media metadata entity representing the Listen activity
# Emoji Reactions
Emoji reactions work a lot like favourites do. They make it possible to react to a post with a single emoji character. To detect the presence of this feature, you can check `pleroma_emoji_reactions` entry in the features list of nodeinfo.
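For example, a quick check from the command line (a sketch, assuming the usual Pleroma nodeinfo location and the `jq` tool; `example.com` is a placeholder):
```bash
# Prints "true" if the instance advertises emoji reactions
curl -s https://example.com/nodeinfo/2.0.json | jq '.metadata.features | index("pleroma_emoji_reactions") != null'
```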

View file

@ -26,40 +26,3 @@ Response: HTTP 201 Created with the object into the body, no `Location` header p
The object given in the response should then be inserted into an Object's `attachment` field.
## ChatMessages
`ChatMessage`s are the messages sent in 1-on-1 chats. They are similar to
`Note`s, but the addressing is done by having a single AP actor in the `to`
field. Addressing multiple actors is not allowed. These messages are always
private, there is no public version of them. They are created with a `Create`
activity.
They are part of the `litepub` namespace as `http://litepub.social/ns#ChatMessage`.
Example:
```json
{
"actor": "http://2hu.gensokyo/users/raymoo",
"id": "http://2hu.gensokyo/objects/1",
"object": {
"attributedTo": "http://2hu.gensokyo/users/raymoo",
"content": "You expected a cute girl? Too bad.",
"id": "http://2hu.gensokyo/objects/2",
"published": "2020-02-12T14:08:20Z",
"to": [
"http://2hu.gensokyo/users/marisa"
],
"type": "ChatMessage"
},
"published": "2018-02-12T14:08:20Z",
"to": [
"http://2hu.gensokyo/users/marisa"
],
"type": "Create"
}
```
This setup does not prevent multi-user chats, but these will have to go through
a `Group`, which will be the recipient of the messages and then `Announce` them
to the users in the `Group`.

View file

@ -7,6 +7,20 @@ It actually consists of two components: a backend, named simply Akkoma, and a us
It's part of what we call the fediverse, a federated network of instances which speak common protocols and can communicate with each other.
One account on an instance is enough to talk to the entire fediverse!
## Community Channels
### IRC
For support or general questions, pop over to #akkoma and #akkoma-dev at [irc.akkoma.dev](https://irc.akkoma.dev) (port 6697, SSL)
### Discourse
For more general meta-discussion, for example discussion of potential future features, head on over to [meta.akkoma.dev](https://meta.akkoma.dev)
### Dev diaries and release notifications
will be posted via [@akkoma@ihba](https://ihatebeinga.live/users/akkoma)
## How can I use it?
Akkoma instances are already widely deployed; a list can be found at <https://the-federation.info/pleroma> and <https://fediverse.network/pleroma>.
@ -26,3 +40,4 @@ Just add a "/web" after your instance url (e.g. <https://pleroma.soykaf.com/web>
The Mastodon interface is from the Glitch-soc fork. For more information on the Mastodon interface you can check the [Mastodon](https://docs.joinmastodon.org/) and [Glitch-soc](https://glitch-soc.github.io/docs/) documentation.
Remember, what you see is only the frontend part of Mastodon, the backend is still Akkoma.

View file

@ -221,6 +221,8 @@ If your instance is up and running, you can create your first user with administ
doas -u akkoma env MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -212,6 +212,8 @@ If your instance is up and running, you can create your first user with administ
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -175,6 +175,8 @@ If your instance is up and running, you can create your first user with administ
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -1,188 +0,0 @@
# How to install Akkoma
## About this Japanese translation
This article is a Japanese translation of [Installing on Debian based distributions](Installing on Debian based distributions). If something seems off, please refer to the original.
## Installation
This guide assumes you are using Debian Stretch. It will probably also work on Ubuntu 16.04 or 18.04. It also assumes that you have administrative rights, either as root or via sudo. If you run the following steps as root, ignore `sudo`; however, where a user is specified, as in `sudo -Hu akkoma`, use `su <username> -s $SHELL -c 'command'` instead.
### Required software
- PostgreSQL 9.6 or newer (Ubuntu 16.04 only provides 9.5, so get a newer version from [here](https://www.postgresql.org/download/linux/ubuntu/))
- `postgresql-contrib` 9.6 or newer (same as above)
- Elixir 1.8 or newer ([do not install it from the Debian repositories; install it from here!](https://elixir-lang.org/install.html#unix-and-unix-like), or install [asdf](https://github.com/asdf-vm/asdf) as the akkoma user)
- `erlang-dev`
- `erlang-nox`
- `git`
- `build-essential`
- `cmake`
- `libmagic-dev`
#### Additional packages used in this guide
- `nginx` (recommended; if you use another reverse proxy, look for a reference configuration in this repository)
- `certbot` (or another ACME client for Let's Encrypt)
- `ImageMagick`
- `ffmpeg`
- `exiftool`
### Prepare the system
* First, update your system.
```
sudo apt update
sudo apt full-upgrade
```
* Install the packages listed above.
```
sudo apt install git build-essential postgresql postgresql-contrib cmake ffmpeg imagemagick libmagic-dev
```
### Install Elixir and Erlang
* Download and install the Erlang repository.
```
wget -P /tmp/ https://packages.erlang-solutions.com/erlang-solutions_2.0_all.deb
sudo dpkg -i /tmp/erlang-solutions_2.0_all.deb
```
* Install Elixir and Erlang.
```
sudo apt update
sudo apt install elixir erlang-dev erlang-nox
```
### Optional packages: [`docs/installation/optional/media_graphics_packages.md`](../installation/optional/media_graphics_packages.md)
```shell
sudo apt install imagemagick ffmpeg libimage-exiftool-perl
```
### Install Akkoma BE (the backend)
* Create a new user for Akkoma.
```
sudo useradd -r -s /bin/false -m -d /var/lib/akkoma -U akkoma
```
**Note**: To run a single command as the Akkoma user, use `sudo -Hu akkoma command`. If you want a shell, use `sudo -Hu akkoma $SHELL`. If you do not use `sudo`, you can run a command as root with `su -l akkoma -s $SHELL -c 'command'` and start a shell with `su -l akkoma -s $SHELL`.
* Clone the Git repository.
```
sudo mkdir -p /opt/akkoma
sudo chown -R akkoma:akkoma /opt/akkoma
sudo -Hu akkoma git clone https://akkoma.dev/AkkomaGang/akkoma.git /opt/akkoma
```
* Change to the new directory.
```
cd /opt/akkoma
```
* Install the packages Akkoma depends on. If you are asked whether to install Hex, answer yes.
```
sudo -Hu akkoma mix deps.get
```
* Generate the configuration.
```
sudo -Hu akkoma MIX_ENV=prod mix pleroma.instance gen
```
* If you are asked whether to install rebar3, answer yes.
* This step takes some time, because parts of akkoma are compiled here.
* You will be asked a few questions about your instance; the answers are used to generate the configuration file `config/generated_config.exs`.
* Check the configuration and, if everything looks fine, rename the file.
```
sudo -Hu akkoma mv config/{generated_config.exs,prod.secret.exs}
```
* The previous command also created the file `config/setup_db.psql`, which you can use to create the database.
```
sudo -Hu postgres psql -f config/setup_db.psql
```
* Then run the database migrations.
```
sudo -Hu akkoma MIX_ENV=prod mix ecto.migrate
```
* You can now start Akkoma.
```
sudo -Hu akkoma MIX_ENV=prod mix phx.server
```
### Finalize the installation
To open your new instance to the world, you need to put a web server or proxy such as nginx in front of Akkoma. You also need to create a system service file for Akkoma.
#### Nginx
* Install nginx, if you have not already.
```
sudo apt install nginx
```
* Set up SSL. Other methods work too, but this guide covers certbot.
If you use certbot, install it first:
```
sudo apt install certbot
```
Then set it up:
```
sudo mkdir -p /var/lib/letsencrypt/
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
```
If this does not work, nginx may not be set up correctly yet. Configure nginx first, change ssl "on" to "off", and try again.
---
* Copy the sample nginx configuration into the nginx folder.
```
sudo cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/sites-available/akkoma.nginx
sudo ln -s /etc/nginx/sites-available/akkoma.nginx /etc/nginx/sites-enabled/akkoma.nginx
```
* Edit the configuration before starting nginx; for example, you need to change the server name and the certificate paths.
* Enable and start nginx.
```
sudo systemctl enable --now nginx.service
```
If you later need to renew the certificate, uncomment the relevant location block in the nginx configuration and run:
```
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --webroot -w /var/lib/letsencrypt/
```
#### Other web servers and proxies
Example configurations for these can be found in `/opt/akkoma/installation/`.
#### Systemd service
* Copy the example service file.
```
sudo cp /opt/akkoma/installation/akkoma.service /etc/systemd/system/akkoma.service
```
* Edit the service file and make sure all paths are correct.
* Enable and start `akkoma.service`.
```
sudo systemctl enable --now akkoma.service
```
#### Create your first user
Once your instance is up and running, you can create your first user with administrative rights with the following command:
```
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
#### Other configuration and customization
{! installation/further_reading.include !}

View file

@ -0,0 +1,161 @@
# Installing in Docker
## Installation
This guide will show you how to get akkoma working in a docker container, whether you want isolation or you run a distribution not supported by the OTP releases.
If you want to migrate from a source or OTP install to docker, check out [the migration guide](./migrating_to_docker_en.md).
### Prepare the system
* Install docker and docker-compose
* [Docker](https://docs.docker.com/engine/install/)
* [Docker-compose](https://docs.docker.com/compose/install/)
* This will usually just be a repository installation and a package manager invocation.
* Clone the akkoma repository
* `git clone https://akkoma.dev/AkkomaGang/akkoma.git -b stable`
* `cd akkoma`
### Set up basic configuration
```bash
cp docker-resources/env.example .env
echo "DOCKER_USER=$(id -u):$(id -g)" >> .env
```
This probably won't need to be changed; it's only there to set basic environment variables for the docker-compose file.
### Building the container
The container provided is a thin wrapper around akkoma's dependencies; it does not contain the code itself. This is to allow for easy updates and debugging if required.
```bash
./docker-resources/build.sh
```
This will generate a container called `akkoma` which we can use
in our compose environment.
### Generating your instance
```bash
mkdir pgdata
./docker-resources/manage.sh mix deps.get
./docker-resources/manage.sh mix compile
./docker-resources/manage.sh mix pleroma.instance gen
```
This will ask you a few questions - the defaults are fine for most things,
the database hostname is `db`, and you will want to set the ip to `0.0.0.0`.
Now we'll want to copy over the config it just created
```bash
cp config/generated_config.exs config/prod.secret.exs
```
### Setting up the database
We need to run a few commands on the database container; this isn't too bad.
```bash
docker-compose run --rm --user akkoma -d db
# Note down the name it gives here, it will be something like akkoma_db_run
docker-compose run --rm akkoma psql -h db -U akkoma -f config/setup_db.psql
docker stop akkoma_db_run # Replace with the name you noted down
```
Now we can actually run our migrations
```bash
./docker-resources/manage.sh mix ecto.migrate
# this will recompile your files at the same time, since we changed the config
```
### Start the server
We're going to run it in the foreground on the first run, just to make sure everything starts up.
```bash
docker-compose up
```
If everything went well, you should be able to access your instance at http://localhost:4000
You can `ctrl-c` out of the docker-compose now to shut down the server.
### Running in the background
```bash
docker-compose up -d
```
### Create your first user
If your instance is up and running, you can create your first user with administrative rights with the following task:
```shell
./docker-resources/manage.sh mix pleroma.user new MY_USERNAME MY_EMAIL@SOMEWHERE --admin
```
And follow the prompts
### Reverse proxies
This is a tad more complex in docker than on the host itself.
You've got two options.
#### Running caddy in a container
This is by far the easiest option. It'll handle HTTPS and all that for you.
```bash
mkdir caddy-data
mkdir caddy-config
cp docker-resources/Caddyfile.example docker-resources/Caddyfile
```
Then edit the TLD in your caddyfile to the domain you're serving on.
Uncomment the `caddy` section in the docker-compose file,
then run `docker-compose up -d` again.
#### Running a reverse proxy on the host
If you want, you can also run the reverse proxy on the host. This is a bit more complex, but it's also more flexible.
Follow the guides for source install for your distribution of choice, or adapt
as needed. Your standard setup can be found in the [Debian Guide](../debian_based_en/#nginx)
### You're done!
All that's left is to set up your frontends.
The standard from-source commands will apply to you; just make sure you prefix them with `./docker-resources/manage.sh`!
{! installation/frontends.include !}
### Updating Docker Installs
```bash
git pull
./docker-resources/build.sh
./docker-resources/manage.sh mix deps.get
./docker-resources/manage.sh mix compile
./docker-resources/manage.sh mix ecto.migrate
docker-compose restart akkoma db
```
#### Further reading
{! installation/further_reading.include !}
{! support.include !}

View file

@ -0,0 +1,208 @@
# Installing on Fedora
## OTP releases and RedHat-distributions
While the OTP releases of Akkoma work on most Linux distributions, they do not work correctly with RedHat-distributions. Therefore from-source installations are the recommended way to go when trying to install Akkoma on Fedora, Centos Stream or RedHat.
However, it is possible to compile your own OTP release of Akkoma for RedHat. Keep in mind that this has a few drawbacks, and has no particular advantage over a from-source installation, since you'll need to install Erlang and Elixir anyway.
This guide will cover a from-source installation. For instructions on how to build your own OTP release, please check out [the OTP for RedHat guide](./otp_redhat_en.md).
## Installation
This guide will assume you are on Fedora 36. This guide should also work with current releases of Centos Stream and RedHat, although it has not been tested yet. It also assumes that you have administrative rights, either as root or a user with [sudo permissions](https://docs.fedoraproject.org/en-US/quick-docs/adding_user_to_sudoers_file/). If you want to run this guide with root, ignore the `sudo` at the beginning of the lines, unless it calls a user like `sudo -Hu akkoma`; in this case, use `su <username> -s $SHELL -c 'command'` instead.
{! installation/generic_dependencies.include !}
### Prepare the system
* First update the system, if not already done:
```shell
sudo dnf upgrade --refresh
```
* Install some of the above mentioned programs:
```shell
sudo dnf install git gcc g++ make cmake file-devel postgresql-server postgresql-contrib
```
* Enable and initialize Postgres:
```shell
sudo systemctl enable postgresql.service
sudo postgresql-setup --initdb --unit postgresql
# Allow password auth for postgres
sudo sed -E -i 's|(host +all +all +127.0.0.1/32 +)ident|\1md5|' /var/lib/pgsql/data/pg_hba.conf
sudo systemctl start postgresql.service
```
### Install Elixir and Erlang
* Install Elixir and Erlang:
```shell
sudo dnf install elixir erlang-os_mon erlang-eldap erlang-xmerl erlang-erl_interface erlang-syntax_tools
```
### Optional packages: [`docs/installation/optional/media_graphics_packages.md`](../installation/optional/media_graphics_packages.md)
* Install ffmpeg (requires setting up the RPM-fusion repositories):
```shell
sudo dnf -y install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
sudo dnf -y install https://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
sudo dnf install ffmpeg
```
* Install ImageMagick and ExifTool for image manipulation:
```shell
sudo dnf install ImageMagick perl-Image-ExifTool
```
### Install AkkomaBE
* Add a new system user for the Akkoma service:
```shell
sudo useradd -r -s /bin/false -m -d /var/lib/akkoma -U akkoma
```
**Note**: To execute a single command as the Akkoma system user, use `sudo -Hu akkoma command`. You can also switch to a shell by using `sudo -Hu akkoma $SHELL`. If you don't have or want `sudo` on your system, you can use `su` as root user (UID 0) for a single command by using `su -l akkoma -s $SHELL -c 'command'` and `su -l akkoma -s $SHELL` for starting a shell.
* Git clone the AkkomaBE repository and make the Akkoma user the owner of the directory:
```shell
sudo mkdir -p /opt/akkoma
sudo chown -R akkoma:akkoma /opt/akkoma
sudo -Hu akkoma git clone https://akkoma.dev/AkkomaGang/akkoma.git /opt/akkoma
```
* Change to the new directory:
```shell
cd /opt/akkoma
```
* Install the dependencies for Akkoma and answer with `yes` if it asks you to install `Hex`:
```shell
sudo -Hu akkoma mix deps.get
```
* Generate the configuration: `sudo -Hu akkoma MIX_ENV=prod mix pleroma.instance gen`
* Answer with `yes` if it asks you to install `rebar3`.
* This may take some time, because parts of akkoma get compiled first.
* After that it will ask you a few questions about your instance and generates a configuration file in `config/generated_config.exs`.
* Check the configuration and, if all looks right, rename it so Akkoma will load it (`prod.secret.exs` for production instances, `dev.secret.exs` for development instances):
```shell
sudo -Hu akkoma mv config/{generated_config.exs,prod.secret.exs}
```
* The previous command also creates the file `config/setup_db.psql`, with which you can create the database:
```shell
sudo -Hu postgres psql -f config/setup_db.psql
```
* Now run the database migration:
```shell
sudo -Hu akkoma MIX_ENV=prod mix ecto.migrate
```
* Now you can start Akkoma:
```shell
sudo -Hu akkoma MIX_ENV=prod mix phx.server
```
### Finalize installation
If you want to open your newly installed instance to the world, you should run nginx or some other webserver/proxy in front of Akkoma, and you should consider creating a systemd service file for Akkoma.
#### Nginx
* Install nginx, if not already done:
```shell
sudo dnf install nginx
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
sudo dnf install certbot
```
and then set it up:
```shell
sudo mkdir -p /var/lib/letsencrypt/
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
```
If that doesn't work, make sure that nginx is not already running. If it still doesn't work, try setting up nginx first (change ssl "on" to "off" and try again).
---
* Copy the example nginx configuration and activate it:
```shell
sudo cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/conf.d/akkoma.conf
```
* Before starting nginx, edit the configuration and change it to your needs (e.g. change the server name and cert paths)
* Enable and start nginx:
```shell
sudo systemctl enable --now nginx.service
```
If you need to renew the certificate in the future, uncomment the relevant location block in the nginx config and run:
```shell
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --webroot -w /var/lib/letsencrypt/
```
#### Other webserver/proxies
You can find example configurations for them in `/opt/akkoma/installation/`.
#### Systemd service
* Copy example service file
```shell
sudo cp /opt/akkoma/installation/akkoma.service /etc/systemd/system/akkoma.service
```
* Edit the service file and make sure that all paths fit your installation
* Enable and start `akkoma.service`:
```shell
sudo systemctl enable --now akkoma.service
```
#### Create your first user
If your instance is up and running, you can create your first user with administrative rights with the following task:
```shell
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}
{! support.include !}

View file

@ -206,6 +206,9 @@ If your instance is up and running, you can create your first user with administ
```shell
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
## Conclusion
Restart nginx with `# service nginx restart` and you should be up and running.

View file

@ -0,0 +1,31 @@
#### Installing Frontends
Once your backend server is functional, you'll probably also want to install frontends.
These are no longer bundled with the distribution and need an extra
command to install.
For most installations, the following will suffice:
=== "OTP"
```sh
./bin/pleroma_ctl frontend install pleroma-fe --ref stable
# and also, if desired
./bin/pleroma_ctl frontend install admin-fe --ref stable
```
=== "From Source"
```sh
mix pleroma.frontend install pleroma-fe --ref stable
mix pleroma.frontend install admin-fe --ref stable
```
=== "Docker"
```sh
./docker-resources/manage.sh mix pleroma.frontend install pleroma-fe --ref stable
./docker-resources/manage.sh mix pleroma.frontend install admin-fe --ref stable
```
For more customised installations, refer to [Frontend Management](../../configuration/frontend_management)

View file

@ -1,7 +1,7 @@
## Required dependencies
* PostgreSQL 9.6+
* Elixir 1.9+
* Elixir 1.12+ (1.13+ recommended)
* Erlang OTP 22.2+
* git
* file / libmagic

View file

@ -293,6 +293,8 @@ akkoma$ MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
If you opted to allow sudo for the `akkoma` user but would like to remove the ability for greater security, now might be a good time to edit `/etc/sudoers` and/or change the groups the `akkoma` user belongs to. Be sure to restart the akkoma service afterwards to ensure it picks up on the changes.
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -1,7 +1,5 @@
# Migrating to Akkoma
**Akkoma does not currently have a stable release, until 3.0, all builds should be considered "develop"**
## Why should you migrate?
aside from actually responsive maintainer(s)? let's lookie here, we've got:
@ -11,6 +9,8 @@ aside from actually responsive maintainer(s)? let's lookie here, we've got:
- elasticsearch support (because pleroma search is GARBAGE)
- latest develop pleroma-fe additions
- local-only posting
- automatic post translation
- the mastodon frontend back in all its glory
- probably more, this is like 3.5 years of IHBA additions finally compiled
## Actually migrating
@ -30,31 +30,27 @@ upstream git URL then just rebuild - that'll be:
git remote set-url origin https://akkoma.dev/AkkomaGang/akkoma.git/
git fetch origin
git pull -r
# or, if you're on an instance-specific branch, you may want
# to run "git merge stable" instead (or develop if you want)
```
Then compile, migrate and restart as usual.
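For reference, those usual steps might look like this on a from-source install (a sketch, assuming the `akkoma` user and systemd unit from the installation guides):
```bash
# Compile, migrate and restart after switching the remote
# (run from your akkoma source directory, e.g. /opt/akkoma)
sudo -Hu akkoma mix deps.get
sudo -Hu akkoma MIX_ENV=prod mix compile
sudo -Hu akkoma MIX_ENV=prod mix ecto.migrate
sudo systemctl restart akkoma.service
```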
## From OTP
**IMPORTANT: if you are using musl1.1 (void linux musl edition),
you will need to override the FLAVOUR to amd64-musl11,
also pls go shout at your maintainers to actually upgrade from EOL software.**
the flavour to be
This will just be setting the update URL -
This will just be setting the update URL - find your flavour from the [mapping on the install guide](../otp_en/#detecting-flavour) first.
```bash
export FLAVOUR=$(arch="$(uname -m)";if [ "$arch" = "x86_64" ];then arch="amd64";elif [ "$arch" = "armv7l" ];then arch="arm";elif [ "$arch" = "aarch64" ];then arch="arm64";else echo "Unsupported arch: $arch">&2;fi;if getconf GNU_LIBC_VERSION>/dev/null;then libc_postfix="";elif [ "$(ldd 2>&1|head -c 9)" = "musl libc" ];then libc_postfix="-musl";elif [ "$(find /lib/libc.musl*|wc -l)" ];then libc_postfix="-musl";else echo "Unsupported libc">&2;fi;echo "$arch$libc_postfix")
export FLAVOUR=[the flavour you found above]
./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/develop/akkoma-$FLAVOUR.zip
./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-$FLAVOUR.zip
./bin/pleroma_ctl migrate
```
Then restart. When updating in the future, you can just use
```bash
./bin/pleroma_ctl update --branch develop
./bin/pleroma_ctl update --branch stable
```
## Frontend changes
@ -64,14 +60,20 @@ your upgrade path here depends on your setup
### I just run with the built-in frontend
You'll need to run a single command,
You'll need to run a couple of commands,
```bash
# From source
mix pleroma.frontend install pleroma-fe
# OTP
./bin/pleroma_ctl frontend install pleroma-fe
```
=== "OTP"
```sh
./bin/pleroma_ctl frontend install pleroma-fe --ref stable
# and also, if desired
./bin/pleroma_ctl frontend install admin-fe --ref stable
```
=== "From Source"
```sh
mix pleroma.frontend install pleroma-fe --ref stable
mix pleroma.frontend install admin-fe --ref stable
```
### I've run the mix task to install a frontend

View file

@ -0,0 +1,158 @@
# Migrating to a Docker Installation
If you for any reason wish to migrate a source or OTP install to a docker one,
this guide is for you.
You have a few options - your major one will be whether you want to keep your
reverse-proxy setup from before.
You probably should, in the first instance.
### Prepare the system
* Install docker and docker-compose
* [Docker](https://docs.docker.com/engine/install/)
* [Docker-compose](https://docs.docker.com/compose/install/)
* This will usually just be a repository installation and a package manager invocation.
=== "Source"
```bash
git pull
```
=== "OTP"
Clone the akkoma repository
```bash
git clone https://akkoma.dev/AkkomaGang/akkoma.git -b stable
cd akkoma
```
### Back up your old database
Change the database name as needed
```bash
pg_dump -d akkoma_prod --format c > akkoma_backup.sql
```
### Getting your static files in the right place
This will vary by every installation. Copy your `instance` directory to `instance/` in
the akkoma source directory - this is where the docker container will look for it.
For *most* from-source installs it'll already be there.
The same goes for `uploads`: make sure your uploads (if you have them on disk) are located at `uploads/` in the akkoma source directory.
If you have them on a different disk, you will need to mount that disk into the docker-compose file,
with an entry that looks like this:
```yaml
akkoma:
volumes:
- .:/opt/akkoma # This should already be there
- type: bind
source: /path/to/your/uploads
target: /opt/akkoma/uploads
```
### Set up basic configuration
```bash
cp docker-resources/env.example .env
echo "DOCKER_USER=$(id -u):$(id -g)" >> .env
```
This probably won't need to be changed; it's only there to set basic environment variables for the docker-compose file.
=== "From source"
You probably won't need to change your config. Provided your `config/prod.secret.exs` file
is still there, you're all good.
=== "OTP"
```bash
cp /etc/akkoma/config.exs config/prod.secret.exs
```
**BOTH**
Set the following config in `config/prod.secret.exs`:
```elixir
config :pleroma, Pleroma.Web.Endpoint,
...,
http: [ip: {0, 0, 0, 0}, port: 4000]
config :pleroma, Pleroma.Repo,
...,
username: "akkoma",
password: "akkoma",
database: "akkoma",
hostname: "db"
```
### Building the container
The container provided is a thin wrapper around akkoma's dependencies; it does not contain the code itself. This is to allow for easy updates and debugging if required.
```bash
./docker-resources/build.sh
```
This will generate a container called `akkoma` which we can use
in our compose environment.
### Setting up the docker resources
```bash
# These won't exist if you're migrating from OTP
rm -rf deps
rm -rf _build
```
```bash
mkdir pgdata
./docker-resources/manage.sh mix deps.get
./docker-resources/manage.sh mix compile
```
### Setting up the database
Now we can import our database to the container.
```bash
docker-compose run --rm --user akkoma -d db
docker-compose run --rm akkoma pg_restore -v -U akkoma -j $(grep -c ^processor /proc/cpuinfo) -d akkoma -h db akkoma_backup.sql
```
### Reverse proxies
If you're just reusing your old proxy, you may have to uncomment the line in
the docker-compose file under `ports`. You'll find it.
Otherwise, you can use the same setup as the [docker installation guide](./docker_en.md#reverse-proxies).
### Let's go
```bash
docker-compose up -d
```
You should now be at the same point as you were before, but with a docker install.
{! installation/frontends.include !}
See the [docker installation guide](./docker_en.md) for more information on how to
update.
#### Further reading
{! installation/further_reading.include !}
{! support.include !}

View file

@ -202,6 +202,8 @@ incorrect timestamps. You should have ntpd running.
* <https://catgirl.science>
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -250,6 +250,8 @@ If your instance is up and running, you can create your first user with administ
LC_ALL=en_US.UTF-8 MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -1,121 +0,0 @@
# Installing Akkoma on OpenBSD
You will need:
* Your own domain
* An OpenBSD 6.3 server
* A working understanding of Unix systems
Commands prefixed with '#' must be run as `root`. It is recommended to do this with `doas`; see `doas (1)` and `doas.conf (5)`.
From here on, it is assumed that the domain "esimerkki.com" points to the server's IP address.
If you run into problems with the installation, the IRC channel #pleroma on Libera.chat or the Matrix channel #pleroma:libera.chat are good places to find help (in English); `/msg eal kukkuu` if you really must speak Finnish.
Install the required software:
`# pkg_add git elixir gmake postgresql-server-10.3 postgresql-contrib-10.3 cmake ffmpeg ImageMagick`
#### Optional software
[`docs/installation/optional/media_graphics_packages.md`](../installation/optional/media_graphics_packages.md):
* ImageMagick
* ffmpeg
* exiftool
Install the optional software:
`# pkg_add ImageMagick ffmpeg p5-Image-ExifTool`
Create the postgresql database:
`# su - _postgresql`
`$ mkdir /var/postgresql/data`
`$ initdb -D /var/postgresql/data -E UTF8`
`$ createdb`
Start the database and set it to start automatically.
`# rcctl start postgresql`
`# rcctl enable postgresql`
Create a user for akkoma (this asks a few questions):
`# adduser akkoma`
Switch to the akkoma user and go to your home directory:
`# su - akkoma`
Download the akkoma source code:
`$ git clone https://akkoma.dev/AkkomaGang/akkoma.git`
`$ cd akkoma`
Install the required elixir libraries:
`$ mix deps.get`
`$ mix deps.compile`
Generate the required configuration:
`$ mix generate_config`
`$ cp config/generated_config.exs config/prod.secret.exs`
Run the generated database commands:
`# su _postgres -c 'psql -f config/setup_db.psql'`
`$ MIX_ENV=prod mix ecto.migrate`
Start the akkoma process:
`$ MIX_ENV=prod mix compile`
`$ MIX_ENV=prod mix phx.server`
At this point it is a good idea to check that the settings are correct. Open `esimerkki.com:4000/api/v1/instance` with a browser, curl, or a similar tool and check that the "uri" field is "https://esimerkki.com".
Note! Remember to make sure the MIX_ENV variable is "prod" when running mix commands; mix reads the right configuration file accordingly.
Below is an rc.d script for starting akkoma that has been found to mostly work. Write it to /etc/rc.d/akkoma. After that, run `# chmod +x /etc/rc.d/akkoma`, and you can start akkoma with `# /etc/rc.d/akkoma start`.
```
#!/bin/ksh
#/etc/rc.d/akkoma
daemon="cd /home/akkoma/akkoma;MIX_ENV=prod /usr/local/bin/elixir"
daemon_flags="--detached /usr/local/bin/mix phx.server"
daemon_user="akkoma"
rc_reload="NO"
rc_bg="YES"
pexp="beam"
. /etc/rc.d/rc.subr
rc_cmd $1
```
After this, all you still need is an HTTP server to forward requests to the akkoma process. An example configuration can be found in `install/akkoma.nginx`, and you can get TLS certificates for free from, for example, [Let's Encrypt](https://certbot.eff.org/lets-encrypt/opbsd-nginx.html). Nginx can be installed simply with `# pkg_add nginx`.
When you are done, open https://esimerkki.com in your browser. Create a user and follow interesting people on other servers!

View file

@ -6,6 +6,7 @@ This guide covers a installation using an OTP release. To install Akkoma from so
## Pre-requisites
* A machine running Linux with GNU (e.g. Debian, Ubuntu) or musl (e.g. Alpine) libc and `x86_64`, `aarch64` or `armv7l` CPU, you have root access to. If you are not sure if it's compatible see [Detecting flavour section](#detecting-flavour) below
* For installing OTP releases on RedHat-based distros like Fedora and Centos Stream, please follow [this guide](./otp_redhat_en.md) instead.
* A (sub)domain pointed to the machine
You will be running commands as root. If you aren't root already, please elevate your privileges by executing `sudo su`/`su`.
@ -14,12 +15,19 @@ While in theory OTP releases are possible to install on any compatible machine,
### Detecting flavour
Paste the following into the shell:
```sh
arch="$(uname -m)";if [ "$arch" = "x86_64" ];then arch="amd64";elif [ "$arch" = "armv7l" ];then arch="arm";elif [ "$arch" = "aarch64" ];then arch="arm64";else echo "Unsupported arch: $arch">&2;fi;if getconf GNU_LIBC_VERSION>/dev/null;then libc_postfix="";elif [ "$(ldd 2>&1|head -c 9)" = "musl libc" ];then libc_postfix="-musl";elif [ "$(find /lib/libc.musl*|wc -l)" ];then libc_postfix="-musl";else echo "Unsupported libc">&2;fi;echo "$arch$libc_postfix"
```
This is a little more complex than it used to be (thanks ubuntu)
If your platform is supported, the output will contain the flavour string; you will need it later. If not, this just means that we don't build releases for your platform; you can still try installing from source.
Use the following mapping to figure out your flavour:
| distribution | flavour | available branches |
| ------------- | ------------------ | ------------------- |
| debian stable | amd64 | develop, stable |
| ubuntu focal | amd64 | develop, stable |
| ubuntu jammy | amd64-ubuntu-jammy | develop, stable |
| alpine | amd64-musl | stable |
Other similar distributions will _probably_ work, but if it is not listed above, there is no official
support.
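Once you know your flavour, it simply slots into the release URL; for example (the flavour value here is just for illustration):
```bash
# The stable release zip for a given flavour
export FLAVOUR=amd64-ubuntu-jammy
echo "https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-$FLAVOUR.zip"
```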
### Installing the required packages
@ -298,6 +306,8 @@ su akkoma -s $SHELL -lc "./bin/pleroma_ctl user new joeuser joeuser@sld.tld --ad
```
This will create an account with the username of 'joeuser' with the email address of joeuser@sld.tld, and set that user's account as an admin. This will result in a link that you can paste into the browser, which logs you in and enables you to set the password.
{! installation/frontends.include !}
## Further reading
{! installation/further_reading.include !}

View file

@ -0,0 +1,286 @@
# Installing on RedHat using OTP releases
## OTP releases and Fedora/RedHat
The current OTP builds available for Linux are unfortunately incompatible with RedHat Linux distributions, like Fedora and Centos Stream. This is due to RedHat maintaining patched versions of certain Erlang libraries, making them incompatible with other Linux distributions.
However, you may compile your own OTP release from scratch. This is particularly useful if you wish to quickly distribute your OTP build onto multiple systems, without having to worry about compiling code on every system. However, if your goal is to simply set up a single instance for yourself, installing from-source might be a simpler option. To install from-source, please follow [this guide](./fedora_based_en.md).
## Pre-requisites
In order to compile a RedHat-compatible OTP release, you will need to run a RedHat Linux distribution. This guide will assume you run Fedora 36, though it should also work on older Fedora releases and other RedHat distributions. It also assumes that you have administrative rights and sufficient knowledge on how to perform common CLI tasks in Linux. If you want to run this guide with root, ignore the `sudo` at the beginning of the lines.
Important: keep in mind that you must build your OTP release for the specific RedHat distribution you wish to use it on. A build on Fedora will only be compatible with a specific Fedora release version.
## Building an OTP release for Fedora 36
### Installing required packages
* First, update your system, if not already done:
```shell
sudo dnf upgrade --refresh
```
* Then install the required packages to build your OTP release:
```shell
sudo dnf install git gcc g++ erlang elixir erlang-os_mon erlang-eldap erlang-xmerl erlang-erl_interface erlang-syntax_tools make cmake file-devel
```
### Preparing the project files
* Git clone the AkkomaBE repository. This can be done anywhere:
```shell
cd ~
git clone https://akkoma.dev/AkkomaGang/akkoma.git
```
* Change to the new directory:
```shell
cd ./akkoma
```
### Building the OTP release
* Run the following commands:
```shell
export MIX_ENV=prod
echo "import Config" > config/prod.secret.exs
mix local.hex --force
mix local.rebar --force
mix deps.get --only prod
mkdir release
mix release --path release
```
Note that compiling the OTP release will take some time. Once it completes, you will find the OTP files in the directory `release`.
If all went well, you will have built your very own Fedora-compatible OTP release! You can now pack up the files in the `release` directory and deploy them to your other Fedora servers.
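How you pack and ship the release is up to you; one possible way (file names and hostnames are just examples):
```bash
# Bundle the release directory and copy it to another Fedora 36 host
tar -czf akkoma-fedora36-otp.tar.gz -C release .
scp akkoma-fedora36-otp.tar.gz admin@other-server.example:/tmp/
```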
## Installing the OTP release
Installing the OTP release from this point onward will be very similar to the regular OTP release. This guide assumes you will want to install your OTP package on other systems, so additional pre-requisites will be listed below.
Please note that running your own OTP release has some minor caveats that you should be aware of. They will be listed below as well.
### Installing required packages
Other than things bundled in the OTP release Akkoma depends on:
* curl (to download the release build)
* ncurses (ERTS won't run without it)
* PostgreSQL (also utilizes extensions in postgresql-contrib)
* nginx (could be swapped with another reverse proxy but this guide covers only it)
* certbot (for Let's Encrypt certificates, could be swapped with another ACME client, but this guide covers only it)
* libmagic/file
First, update your system, if not already done:
```shell
sudo dnf upgrade --refresh
```
Then install the required packages:
```shell
sudo dnf install curl ncurses postgresql postgresql-contrib nginx certbot file-devel
```
### Optional packages: [`docs/installation/optional/media_graphics_packages.md`](../installation/optional/media_graphics_packages.md)
* Install ffmpeg (requires setting up the RPM-fusion repositories):
```shell
sudo dnf -y install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
sudo dnf -y install https://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
sudo dnf install ffmpeg
```
* Install ImageMagick and ExifTool for image manipulation:
```shell
sudo dnf install ImageMagick perl-Image-ExifTool
```
### Configuring PostgreSQL
#### (Optional) Performance configuration
It is encouraged to check [Optimizing your PostgreSQL performance](../configuration/postgresql.md) document, for tips on PostgreSQL tuning.
Restart PostgreSQL to apply configuration changes:
```shell
sudo systemctl restart postgresql
```
### Installing Akkoma
```sh
# Create an Akkoma user
adduser --system --shell /bin/false --home /opt/akkoma akkoma
# Move your custom OTP release to the home directory
sudo -Hu akkoma mv /your/custom/otp/release /opt/akkoma
# Create uploads directory and set proper permissions (skip if planning to use a remote uploader)
# Note: It does not have to be `/var/lib/akkoma/uploads`, the config generator will ask about the upload directory later
sudo mkdir -p /var/lib/akkoma/uploads
sudo chown -R akkoma /var/lib/akkoma
# Create custom public files directory (custom emojis, frontend bundle overrides, robots.txt, etc.)
# Note: It does not have to be `/var/lib/akkoma/static`, the config generator will ask about the custom public files directory later
sudo mkdir -p /var/lib/akkoma/static
sudo chown -R akkoma /var/lib/akkoma
# Create a config directory
sudo mkdir -p /etc/akkoma
sudo chown -R akkoma /etc/akkoma
# Run the config generator
sudo -Hu akkoma ./bin/pleroma_ctl instance gen --output /etc/akkoma/config.exs --output-psql /tmp/setup_db.psql
# Create the postgres database
sudo -Hu postgres psql -f /tmp/setup_db.psql
# Create the database schema
sudo -Hu akkoma ./bin/pleroma_ctl migrate
# Start the instance to verify that everything is working as expected
sudo -Hu akkoma ./bin/pleroma daemon
# Wait for about 20 seconds and query the instance endpoint, if it shows your uri, name and email correctly, you are configured correctly
sleep 20 && curl http://localhost:4000/api/v1/instance
# Stop the instance
sudo -Hu akkoma ./bin/pleroma stop
```
### Setting up nginx and getting Let's Encrypt SSL certificaties
#### Get a Let's Encrypt certificate
```shell
certbot certonly --standalone --preferred-challenges http -d yourinstance.tld
```
#### Copy Akkoma nginx configuration to the nginx folder
```shell
cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/conf.d/akkoma.conf
```
#### Edit the nginx config
```shell
# Replace example.tld with your (sub)domain (replace $EDITOR with your editor of choice)
sudo $EDITOR /etc/nginx/conf.d/akkoma.conf
# Verify that the config is valid
sudo nginx -t
```
#### Start nginx
```shell
sudo systemctl start nginx
```
At this point, if you open your (sub)domain in a browser, you should see a 502 error; that's because Akkoma is not started yet.
### Setting up a system service
```shell
# Copy the service into a proper directory
cp /opt/akkoma/installation/akkoma.service /etc/systemd/system/akkoma.service
# Edit the service file and make any necessary changes
sudo $EDITOR /etc/systemd/system/akkoma.service
# If you use SELinux, set the correct file context on the pleroma binary
sudo semanage fcontext -a -t init_t /opt/akkoma/bin/pleroma
sudo restorecon -v /opt/akkoma/bin/pleroma
# Start akkoma and enable it on boot
sudo systemctl start akkoma
sudo systemctl enable akkoma
```
If everything worked, you should see a response from Akkoma-BE when visiting your domain. You may need to install frontends like Akkoma-FE and Admin-FE; refer to [this guide](../administration/CLI_tasks/frontend.md) on how to install them.
If that didn't happen, try reviewing the installation steps, starting Akkoma in the foreground and seeing if there are any errors.
{! support.include !}
## Post installation
### Setting up auto-renew of the Let's Encrypt certificate
```shell
# Create the directory for webroot challenges
sudo mkdir -p /var/lib/letsencrypt
# Uncomment the webroot method
sudo $EDITOR /etc/nginx/conf.d/akkoma.conf
# Verify that the config is valid
sudo nginx -t
# Restart nginx
sudo systemctl restart nginx
# Ensure the webroot method and post hook are working
sudo certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --dry-run --post-hook 'systemctl reload nginx'
# Add it to the daily cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --post-hook "systemctl reload nginx"
' > /etc/cron.daily/renew-akkoma-cert
sudo chmod +x /etc/cron.daily/renew-akkoma-cert
# If everything worked the output should contain /etc/cron.daily/renew-akkoma-cert
sudo run-parts --test /etc/cron.daily
```
## Create your first user and set as admin
```shell
cd /opt/akkoma
sudo -Hu akkoma ./bin/pleroma_ctl user new joeuser joeuser@sld.tld --admin
```
This will create an account with the username of 'joeuser' with the email address of joeuser@sld.tld, and set that user's account as an admin. This will result in a link that you can paste into the browser, which logs you in and enables you to set the password.
## Further reading
### Caveats of building your own OTP release
There are some things to take note of when you are running your own OTP builds.
#### Updating your OTP builds
Using your custom OTP build, you will not be able to update the installation using the `pleroma_ctl update` command. Running this command would overwrite your install with an OTP release from the main Akkoma repository, which will break your install.
Instead, you will have to rebuild your OTP release every time there are updates, then manually move it to where your Akkoma installation is running, overwriting the old OTP release files. Make sure to stop the Akkoma-BE server before overwriting any files!
After that, run the `pleroma_ctl migrate` command as usual to perform database migrations.
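Put together, a manual update of a self-built release might look roughly like this (a sketch, assuming the `/opt/akkoma` layout and systemd unit from this guide; the source path is an example):
```bash
# Stop the service before touching the release files
sudo systemctl stop akkoma
# Overwrite the old release with the freshly built one
sudo cp -r /your/custom/otp/release/. /opt/akkoma/
sudo chown -R akkoma /opt/akkoma
# Run pending migrations, then start the service again
cd /opt/akkoma
sudo -Hu akkoma ./bin/pleroma_ctl migrate
sudo systemctl start akkoma
```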
#### Cross-compatibility between RedHat distributions
As it currently stands, your OTP build will only be compatible for the specific RedHat distribution you've built it on. Fedora builds only work on Fedora, Centos builds only on Centos, RedHat builds only on RedHat. Secondly, for Fedora, they will also be bound to the specific Fedora release. This is because different releases of Fedora may have significant changes made in some of the required packages and libraries.
{! installation/frontends.include !}
{! installation/further_reading.include !}
{! support.include !}

View file

@ -0,0 +1,66 @@
# Verifying OTP release integrity
All stable OTP releases are cryptographically signed, to allow
you to verify the integrity if you choose to.
Releases are signed with [Signify](https://man.openbsd.org/signify.1),
with [the public key in the main repository](https://akkoma.dev/AkkomaGang/akkoma/src/branch/develop/SIGNING_KEY.pub)
Release URLs will always be of the form
```
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip
```
Where branch is usually `stable` or `develop`, and `flavour` is
the one [that you detect on install](../otp_en/#detecting-flavour).
So, for an AMD64 stable install, your update URL will be
```
https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-amd64.zip
```
To verify the integrity of this file, we have two helper files
```
# Checksums
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip.sha256
# Signify signature of the hashes
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip.sha256.sig
```
Thus, to upgrade manually, with integrity checking, consider the following script:
```bash
#!/bin/bash
set -eo pipefail
export FLAVOUR=amd64
export BRANCH=stable
# Fetch signing key
curl --silent https://akkoma.dev/AkkomaGang/akkoma/raw/branch/$BRANCH/SIGNING_KEY.pub -o AKKOMA_SIGNING_KEY.pub
# Download zip file and sig files
wget -q https://akkoma-updates.s3-website.fr-par.scw.cloud/$BRANCH/akkoma-$FLAVOUR{.zip,.zip.sha256,.zip.sha256.sig}
# Verify zip file's sha256 integrity
sha256sum --check akkoma-$FLAVOUR.zip.sha256
# Verify hash file's integrity
# Signify might be under the `signify` command, depending on your distribution
signify-openbsd -V -p AKKOMA_SIGNING_KEY.pub -m akkoma-$FLAVOUR.zip.sha256
# We're good, use that URL
echo "Update URL contents verified"
echo "use"
echo "./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/$BRANCH/akkoma-$FLAVOUR"
echo "to update your instance"
# Clean up
rm akkoma-$FLAVOUR.zip
rm akkoma-$FLAVOUR.zip.sha256
rm akkoma-$FLAVOUR.zip.sha256.sig
```

View file

@ -14,8 +14,8 @@ theme:
extra_css:
- css/extra.css
repo_name: 'AkkomaGang/docs'
repo_url: 'https://akkoma.dev/AkkomaGang/docs'
repo_name: 'AkkomaGang/akkoma'
repo_url: 'https://akkoma.dev/AkkomaGang/akkoma'
extra:
repo_icon: gitea

View file

@ -23,7 +23,15 @@ defmodule Mix.Pleroma do
Pleroma.Config.Oban.warn()
Pleroma.Application.limiters_setup()
Application.put_env(:phoenix, :serve_endpoints, false, persistent: true)
Finch.start_link(name: MyFinch)
proxy_url = Pleroma.Config.get([:http, :proxy_url])
proxy = Pleroma.HTTP.AdapterHelper.format_proxy(proxy_url)
finch_config =
[:http, :adapter]
|> Pleroma.Config.get([])
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy_pool(proxy)
|> Keyword.put(:name, MyFinch)
unless System.get_env("DEBUG") do
Logger.remove_backend(:console)
@ -45,6 +53,7 @@ defmodule Mix.Pleroma do
Pleroma.Emoji,
{Pleroma.Config.TransferTask, false},
Pleroma.Web.Endpoint,
{Finch, finch_config},
{Oban, oban_config},
{Majic.Pool,
[name: Pleroma.MajicPool, pool_size: Pleroma.Config.get([:majic_pool, :size], 2)]}

View file

@ -9,7 +9,7 @@ defmodule Mix.Tasks.Pleroma.Search.Meilisearch do
import Ecto.Query
import Pleroma.Search.Meilisearch,
only: [meili_post: 2, meili_put: 2, meili_get: 1, meili_delete!: 1]
only: [meili_put: 2, meili_get: 1, meili_delete!: 1]
def run(["index"]) do
start_pleroma()
@ -27,7 +27,7 @@ defmodule Mix.Tasks.Pleroma.Search.Meilisearch do
end
{:ok, _} =
meili_post(
meili_put(
"/indexes/objects/settings/ranking-rules",
[
"published:desc",
@ -41,7 +41,7 @@ defmodule Mix.Tasks.Pleroma.Search.Meilisearch do
)
{:ok, _} =
meili_post(
meili_put(
"/indexes/objects/settings/searchable-attributes",
[
"content"
@ -91,7 +91,7 @@ defmodule Mix.Tasks.Pleroma.Search.Meilisearch do
)
with {:ok, res} <- result do
if not Map.has_key?(res, "uid") do
if not Map.has_key?(res, "indexUid") do
IO.puts("\nFailed to index: #{inspect(result)}")
end
else

View file

@ -258,6 +258,25 @@ defmodule Mix.Tasks.Pleroma.User do
end
end
def run(["refetch_public_keys"]) do
start_pleroma()
Pleroma.User.Query.build(%{
external: true,
is_active: true
})
|> refetch_public_keys()
end
def run(["refetch_public_keys" | rest]) do
start_pleroma()
Pleroma.User.Query.build(%{
ap_id: rest
})
|> refetch_public_keys()
end
def run(["invite" | rest]) do
{options, [], []} =
OptionParser.parse(rest,
@ -487,6 +506,64 @@ defmodule Mix.Tasks.Pleroma.User do
|> Stream.run()
end
def run(["fix_follow_state", local_user, remote_user]) do
start_pleroma()
with {:local, %User{} = local} <- {:local, User.get_by_nickname(local_user)},
{:remote, %User{} = remote} <- {:remote, User.get_by_nickname(remote_user)},
{:follow_data, %{data: %{"state" => request_state}}} <-
{:follow_data, Pleroma.Web.ActivityPub.Utils.fetch_latest_follow(local, remote)} do
calculated_state = User.following?(local, remote)
IO.puts(
"Request state is #{request_state}, vs calculated state of following=#{calculated_state}"
)
if calculated_state == false && request_state == "accept" do
IO.puts("Discrepancy found, fixing")
Pleroma.Web.CommonAPI.reject_follow_request(local, remote)
shell_info("Relationship fixed")
else
shell_info("No discrepancy found")
end
else
{:local, _} ->
shell_error("No local user #{local_user}")
{:remote, _} ->
shell_error("No remote user #{remote_user}")
{:follow_data, _} ->
shell_error("No follow data for #{local_user} and #{remote_user}")
end
end
def run(["convert_id", id]) do
{:ok, uuid} = FlakeId.Ecto.Type.dump(id)
{:ok, raw_id} = Ecto.UUID.load(uuid)
shell_info(raw_id)
end
defp refetch_public_keys(query) do
query
|> Pleroma.Repo.chunk_stream(50, :batches)
|> Stream.each(fn users ->
users
|> Enum.each(fn user ->
IO.puts("Re-Resolving: #{user.ap_id}")
with {:ok, user} <- Pleroma.User.fetch_by_ap_id(user.ap_id),
changeset <- Pleroma.User.update_changeset(user),
{:ok, _user} <- Pleroma.User.update_and_set_cache(changeset) do
:ok
else
error -> IO.puts("Could not resolve: #{user.ap_id}, #{inspect(error)}")
end
end)
end)
|> Stream.run()
end
defp set_moderator(user, value) do
{:ok, user} =
user
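
The new `refetch_public_keys` clauses can either sweep every active remote user or target specific actors; any AP id given after the task name is matched against `ap_id`. A sketch of the targeted form, assuming a hypothetical actor URL:

```elixir
# Equivalent to: mix pleroma.user refetch_public_keys <ap_id> [<ap_id> ...]
Mix.Tasks.Pleroma.User.run([
  "refetch_public_keys",
  "https://remote.example/users/alice"
])
```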

View file

@ -292,6 +292,12 @@ defmodule Pleroma.Activity do
get_in_reply_to_activity_from_object(Object.normalize(activity, fetch: false))
end
def get_quoted_activity_from_object(%Object{data: %{"quoteUri" => ap_id}}) do
get_create_by_object_ap_id_with_object(ap_id)
end
def get_quoted_activity_from_object(_), do: nil
def normalize(%Activity{data: %{"id" => ap_id}}), do: get_by_ap_id_with_object(ap_id)
def normalize(%{"id" => ap_id}), do: get_by_ap_id_with_object(ap_id)
def normalize(ap_id) when is_binary(ap_id), do: get_by_ap_id_with_object(ap_id)
@ -362,9 +368,15 @@ defmodule Pleroma.Activity do
end
def restrict_deactivated_users(query) do
deactivated_users_query = from(u in User.Query.build(%{deactivated: true}), select: u.ap_id)
from(activity in query, where: activity.actor not in subquery(deactivated_users_query))
query
|> join(
:inner_lateral,
[activity],
active in fragment(
"SELECT is_active from users WHERE ap_id = ? AND is_active = TRUE",
activity.actor
)
)
end
defdelegate search(user, query, options \\ []), to: Pleroma.Search.DatabaseSearch

View file

@ -8,6 +8,40 @@ defmodule Pleroma.Activity.HTML do
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
# We store a list of cache keys related to an activity in a
# separate cache, scrubber_management_cache. It has the same
# size as scrubber_cache (see application.ex). Every time we add
# a cache to scrubber_cache, we update scrubber_management_cache.
#
# The most recent write of a certain key in the management cache
# is the same as the most recent write of any record related to that
# key in the main cache.
# Assuming LRW ( https://hexdocs.pm/cachex/Cachex.Policy.LRW.html ),
# this means when the management cache is evicted by cachex, all
# related records in the main cache will also have been evicted.
defp get_cache_keys_for(activity_id) do
with {:ok, list} when is_list(list) <- @cachex.get(:scrubber_management_cache, activity_id) do
list
else
_ -> []
end
end
defp add_cache_key_for(activity_id, additional_key) do
current = get_cache_keys_for(activity_id)
unless additional_key in current do
@cachex.put(:scrubber_management_cache, activity_id, [additional_key | current])
end
end
def invalidate_cache_for(activity_id) do
keys = get_cache_keys_for(activity_id)
Enum.map(keys, &@cachex.del(:scrubber_cache, &1))
@cachex.del(:scrubber_management_cache, activity_id)
end
def get_cached_scrubbed_html_for_activity(
content,
scrubbers,
@ -19,6 +53,8 @@ defmodule Pleroma.Activity.HTML do
@cachex.fetch!(:scrubber_cache, key, fn _key ->
object = Object.normalize(activity, fetch: false)
add_cache_key_for(activity.id, key)
HTML.ensure_scrubbed_html(content, scrubbers, object.data["fake"] || false, callback)
end)
end
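
The comment block above explains the bookkeeping: every scrubbed rendering cached for an activity is also recorded under the activity id, so the whole set can be dropped in one call. A usage sketch, assuming `activity` is an already-loaded `%Pleroma.Activity{}`:

```elixir
# After the activity's content changes (e.g. a remote Update), drop every
# scrubbed rendering cached for it, plus the bookkeeping entry itself:
Pleroma.Activity.HTML.invalidate_cache_for(activity.id)
```

The next call to `get_cached_scrubbed_html_for_activity` repopulates both caches via `add_cache_key_for/2`.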

View file

@ -0,0 +1,100 @@
defmodule Pleroma.Akkoma.FrontendSettingsProfile do
use Ecto.Schema
import Ecto.Changeset
import Ecto.Query
alias Pleroma.Repo
alias Pleroma.Config
alias Pleroma.User
@primary_key false
schema "user_frontend_setting_profiles" do
belongs_to(:user, Pleroma.User, primary_key: true, type: FlakeId.Ecto.CompatType)
field(:frontend_name, :string, primary_key: true)
field(:profile_name, :string, primary_key: true)
field(:settings, :map)
field(:version, :integer)
timestamps()
end
def changeset(%__MODULE__{} = struct, attrs) do
struct
|> cast(attrs, [:user_id, :frontend_name, :profile_name, :settings, :version])
|> validate_required([:user_id, :frontend_name, :profile_name, :settings, :version])
|> validate_length(:frontend_name, min: 1, max: 255)
|> validate_length(:profile_name, min: 1, max: 255)
|> validate_version(struct)
|> validate_number(:version, greater_than: 0)
|> validate_settings_length(Config.get([:instance, :max_frontend_settings_json_chars]))
end
def create_or_update(%User{} = user, frontend_name, profile_name, settings, version) do
struct =
case get_by_user_and_frontend_name_and_profile_name(user, frontend_name, profile_name) do
nil ->
%__MODULE__{}
%__MODULE__{} = profile ->
profile
end
struct
|> changeset(%{
user_id: user.id,
frontend_name: frontend_name,
profile_name: profile_name,
settings: settings,
version: version
})
|> Repo.insert_or_update()
end
def get_all_by_user_and_frontend_name(%User{id: user_id}, frontend_name) do
Repo.all(
from(p in __MODULE__, where: p.user_id == ^user_id and p.frontend_name == ^frontend_name)
)
end
def get_by_user_and_frontend_name_and_profile_name(
%User{id: user_id},
frontend_name,
profile_name
) do
Repo.one(
from(p in __MODULE__,
where:
p.user_id == ^user_id and p.frontend_name == ^frontend_name and
p.profile_name == ^profile_name
)
)
end
def delete_profile(profile) do
Repo.delete(profile)
end
defp validate_settings_length(
%Ecto.Changeset{changes: %{settings: settings}} = changeset,
max_length
) do
settings_json = Jason.encode!(settings)
if String.length(settings_json) > max_length do
add_error(changeset, :settings, "is too long")
else
changeset
end
end
defp validate_version(changeset, %{version: nil}), do: changeset
defp validate_version(%Ecto.Changeset{changes: %{version: version}} = changeset, %{
version: prev_version
}) do
if version != prev_version + 1 do
add_error(changeset, :version, "must be incremented by 1")
else
changeset
end
end
end
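
A usage sketch of the versioning rules enforced above, assuming `user` is a loaded `%Pleroma.User{}`; the frontend, profile, and settings values are placeholders:

```elixir
alias Pleroma.Akkoma.FrontendSettingsProfile

# First write for this frontend/profile pair: any positive version is accepted.
{:ok, _profile} =
  FrontendSettingsProfile.create_or_update(
    user, "pleroma-fe", "default", %{"theme" => "dark"}, 1
  )

# Later writes must bump the version by exactly 1; a stale client re-sending
# version 1 gets a changeset error from validate_version/2.
{:error, _changeset} =
  FrontendSettingsProfile.create_or_update(
    user, "pleroma-fe", "default", %{"theme" => "light"}, 1
  )
```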

View file

@ -0,0 +1,100 @@
defmodule Pleroma.Akkoma.Translators.DeepL do
@behaviour Pleroma.Akkoma.Translator
alias Pleroma.HTTP
alias Pleroma.Config
require Logger
defp base_url(:free) do
"https://api-free.deepl.com/v2/"
end
defp base_url(:pro) do
"https://api.deepl.com/v2/"
end
defp api_key do
Config.get([:deepl, :api_key])
end
defp tier do
Config.get([:deepl, :tier])
end
@impl Pleroma.Akkoma.Translator
def languages do
with {:ok, %{status: 200} = source_response} <- do_languages("source"),
{:ok, %{status: 200} = dest_response} <- do_languages("target"),
{:ok, source_body} <- Jason.decode(source_response.body),
{:ok, dest_body} <- Jason.decode(dest_response.body) do
source_resp =
Enum.map(source_body, fn %{"language" => code, "name" => name} ->
%{code: code, name: name}
end)
dest_resp =
Enum.map(dest_body, fn %{"language" => code, "name" => name} ->
%{code: code, name: name}
end)
{:ok, source_resp, dest_resp}
else
{:ok, %{status: status} = response} ->
Logger.warning("DeepL: Request rejected: #{inspect(response)}")
{:error, "DeepL request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
@impl Pleroma.Akkoma.Translator
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <-
do_request(api_key(), tier(), string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
%{"translations" => [%{"text" => translated, "detected_source_language" => detected}]} =
body
{:ok, detected, translated}
else
{:ok, %{status: status} = response} ->
Logger.warning("DeepL: Request rejected: #{inspect(response)}")
{:error, "DeepL request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
defp do_request(api_key, tier, string, from_language, to_language) do
HTTP.post(
base_url(tier) <> "translate",
URI.encode_query(
%{
text: string,
target_lang: to_language,
tag_handling: "html"
}
|> maybe_add_source(from_language),
:rfc3986
),
[
{"authorization", "DeepL-Auth-Key #{api_key}"},
{"content-type", "application/x-www-form-urlencoded"}
]
)
end
defp maybe_add_source(opts, nil), do: opts
defp maybe_add_source(opts, lang), do: Map.put(opts, :source_lang, lang)
defp do_languages(type) do
HTTP.get(
base_url(tier()) <> "languages?type=#{type}",
[
{"authorization", "DeepL-Auth-Key #{api_key()}"}
]
)
end
end
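
The module reads everything it needs from the `:deepl` config group; a hedged sketch of that configuration (the key is a placeholder, and the tier atoms mirror the `base_url/1` clauses above):

```elixir
config :pleroma, :deepl,
  # :free selects api-free.deepl.com, :pro selects api.deepl.com
  tier: :free,
  api_key: "deepl-api-key-placeholder"
```

With that in place, `translate/3` can be called with a `nil` source language to let DeepL detect it, e.g. `Pleroma.Akkoma.Translators.DeepL.translate("Guten Tag", nil, "EN")`.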

View file

@ -0,0 +1,82 @@
defmodule Pleroma.Akkoma.Translators.LibreTranslate do
@behaviour Pleroma.Akkoma.Translator
alias Pleroma.Config
alias Pleroma.HTTP
require Logger
defp api_key do
Config.get([:libre_translate, :api_key])
end
defp url do
Config.get([:libre_translate, :url])
end
@impl Pleroma.Akkoma.Translator
def languages do
with {:ok, %{status: 200} = response} <- do_languages(),
{:ok, body} <- Jason.decode(response.body) do
resp = Enum.map(body, fn %{"code" => code, "name" => name} -> %{code: code, name: name} end)
# No separate source/dest
{:ok, resp, resp}
else
{:ok, %{status: status} = response} ->
Logger.warning("LibreTranslate: Request rejected: #{inspect(response)}")
{:error, "LibreTranslate request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
@impl Pleroma.Akkoma.Translator
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <- do_request(string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
%{"translatedText" => translated} = body
detected =
if Map.has_key?(body, "detectedLanguage") do
get_in(body, ["detectedLanguage", "language"])
else
from_language
end
{:ok, detected, translated}
else
{:ok, %{status: status} = response} ->
Logger.warning("libre_translate: request failed, #{inspect(response)}")
{:error, "libre_translate: request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
defp do_request(string, from_language, to_language) do
url = URI.parse(url())
url = %{url | path: "/translate"}
HTTP.post(
to_string(url),
Jason.encode!(%{
q: string,
source: if(is_nil(from_language), do: "auto", else: from_language),
target: to_language,
format: "html",
api_key: api_key()
}),
[
{"content-type", "application/json"}
]
)
end
defp do_languages() do
url = URI.parse(url())
url = %{url | path: "/languages"}
HTTP.get(to_string(url))
end
end
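
Its configuration lives under `:libre_translate`; a sketch with placeholder values (the URL should point at your own LibreTranslate deployment, and `api_key` can stay `nil` if the instance does not require one):

```elixir
config :pleroma, :libre_translate,
  url: "http://127.0.0.1:5000",
  api_key: nil
```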

View file

@ -0,0 +1,8 @@
defmodule Pleroma.Akkoma.Translator do
@callback translate(String.t(), String.t() | nil, String.t()) ::
{:ok, String.t(), String.t()} | {:error, any()}
@callback languages() ::
{:ok, [%{name: String.t(), code: String.t()}],
[%{name: String.t(), code: String.t()}]}
| {:error, any()}
end
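
Both translators above implement this behaviour; a minimal sketch of another, hypothetical implementation that satisfies the callbacks without calling any external service:

```elixir
defmodule Pleroma.Akkoma.Translators.NullTranslator do
  @behaviour Pleroma.Akkoma.Translator

  @impl Pleroma.Akkoma.Translator
  # Pretend to translate: report "und" (undetermined) as the detected source
  # language and hand the text back unchanged.
  def translate(string, _from_language, _to_language), do: {:ok, "und", string}

  @impl Pleroma.Akkoma.Translator
  # No source or target languages on offer.
  def languages, do: {:ok, [], []}
end
```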

lib/pleroma/announcement.ex (new file, 160 lines)
View file

@ -0,0 +1,160 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Announcement do
use Ecto.Schema
import Ecto.Changeset, only: [cast: 3, validate_required: 2]
import Ecto.Query
alias Pleroma.AnnouncementReadRelationship
alias Pleroma.Repo
@type t :: %__MODULE__{}
@primary_key {:id, FlakeId.Ecto.CompatType, autogenerate: true}
schema "announcements" do
field(:data, :map)
field(:starts_at, :utc_datetime)
field(:ends_at, :utc_datetime)
field(:rendered, :map)
timestamps(type: :utc_datetime)
end
def change(struct, params \\ %{}) do
struct
|> cast(validate_params(struct, params), [:data, :starts_at, :ends_at, :rendered])
|> validate_required([:data])
end
defp validate_params(struct, params) do
base_data =
%{
"content" => "",
"all_day" => false
}
|> Map.merge((struct && struct.data) || %{})
merged_data =
Map.merge(base_data, params.data)
|> Map.take(["content", "all_day"])
params
|> Map.merge(%{data: merged_data})
|> add_rendered_properties()
end
def add_rendered_properties(params) do
{content_html, _, _} =
Pleroma.Web.CommonAPI.Utils.format_input(params.data["content"], "text/plain",
mentions_format: :full
)
rendered = %{
"content" => content_html
}
params
|> Map.put(:rendered, rendered)
end
def add(params) do
changeset = change(%__MODULE__{}, params)
Repo.insert(changeset)
end
def update(announcement, params) do
changeset = change(announcement, params)
Repo.update(changeset)
end
def list_all do
__MODULE__
|> Repo.all()
end
def list_paginated(%{limit: limited_number, offset: offset_number}) do
__MODULE__
|> limit(^limited_number)
|> offset(^offset_number)
|> Repo.all()
end
def get_by_id(id) do
Repo.get_by(__MODULE__, id: id)
end
def delete_by_id(id) do
with announcement when not is_nil(announcement) <- get_by_id(id),
{:ok, _} <- Repo.delete(announcement) do
:ok
else
_ ->
:error
end
end
def read_by?(announcement, user) do
AnnouncementReadRelationship.exists?(user, announcement)
end
def mark_read_by(announcement, user) do
AnnouncementReadRelationship.mark_read(user, announcement)
end
def render_json(announcement, opts \\ []) do
extra_params =
case Keyword.fetch(opts, :for) do
{:ok, user} when not is_nil(user) ->
%{read: read_by?(announcement, user)}
_ ->
%{}
end
admin_extra_params =
case Keyword.fetch(opts, :admin) do
{:ok, true} ->
%{pleroma: %{raw_content: announcement.data["content"]}}
_ ->
%{}
end
base = %{
id: announcement.id,
content: announcement.rendered["content"],
starts_at: announcement.starts_at,
ends_at: announcement.ends_at,
all_day: announcement.data["all_day"],
published_at: announcement.inserted_at,
updated_at: announcement.updated_at,
mentions: [],
statuses: [],
tags: [],
emojis: [],
reactions: []
}
base
|> Map.merge(extra_params)
|> Map.merge(admin_extra_params)
end
# "visible" means:
# starts_at < time < ends_at
def list_all_visible_when(time) do
__MODULE__
|> where([a], is_nil(a.starts_at) or a.starts_at < ^time)
|> where([a], is_nil(a.ends_at) or a.ends_at > ^time)
|> Repo.all()
end
def list_all_visible do
list_all_visible_when(DateTime.now("Etc/UTC") |> elem(1))
end
end
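
A usage sketch of the lifecycle this module provides; the content and time window are placeholders, and `user` is assumed to be a loaded `%Pleroma.User{}`:

```elixir
{:ok, announcement} =
  Pleroma.Announcement.add(%{
    data: %{"content" => "Maintenance window tonight", "all_day" => false},
    starts_at: ~U[2022-11-05 20:00:00Z],
    ends_at: ~U[2022-11-05 22:00:00Z]
  })

# Only announcements whose window includes "now" (or with open-ended bounds):
_visible = Pleroma.Announcement.list_all_visible()

# API representation, including per-user read state:
Pleroma.Announcement.render_json(announcement, for: user)
```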

View file

@ -0,0 +1,55 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.AnnouncementReadRelationship do
use Ecto.Schema
import Ecto.Changeset
alias FlakeId.Ecto.CompatType
alias Pleroma.Announcement
alias Pleroma.Repo
alias Pleroma.User
@type t :: %__MODULE__{}
schema "announcement_read_relationships" do
belongs_to(:user, User, type: CompatType)
belongs_to(:announcement, Announcement, type: CompatType)
timestamps(updated_at: false)
end
def mark_read(user, announcement) do
%__MODULE__{}
|> cast(%{user_id: user.id, announcement_id: announcement.id}, [:user_id, :announcement_id])
|> validate_required([:user_id, :announcement_id])
|> foreign_key_constraint(:user_id)
|> foreign_key_constraint(:announcement_id)
|> unique_constraint([:user_id, :announcement_id])
|> Repo.insert()
end
def mark_unread(user, announcement) do
with relationship <- get(user, announcement),
{:exists, true} <- {:exists, not is_nil(relationship)},
{:ok, _} <- Repo.delete(relationship) do
:ok
else
{:exists, false} ->
:ok
_ ->
:error
end
end
def get(user, announcement) do
Repo.get_by(__MODULE__, user_id: user.id, announcement_id: announcement.id)
end
def exists?(user, announcement) do
not is_nil(get(user, announcement))
end
end

View file

@ -53,7 +53,6 @@ defmodule Pleroma.Application do
Config.DeprecationWarnings.warn()
Pleroma.Web.Plugs.HTTPSecurityPlug.warn_if_disabled()
Pleroma.ApplicationRequirements.verify!()
setup_instrumenters()
load_custom_modules()
Pleroma.Docs.JSON.compile()
limiters_setup()
@ -64,7 +63,8 @@ defmodule Pleroma.Application do
Pleroma.Repo,
Config.TransferTask,
Pleroma.Emoji,
Pleroma.Web.Plugs.RateLimiter.Supervisor
Pleroma.Web.Plugs.RateLimiter.Supervisor,
{Task.Supervisor, name: Pleroma.TaskSupervisor}
] ++
cachex_children() ++
http_children() ++
@ -77,8 +77,7 @@ defmodule Pleroma.Application do
] ++
elasticsearch_children() ++
task_children(@mix_env) ++
dont_run_in_test(@mix_env) ++
shout_child(shout_enabled?())
dont_run_in_test(@mix_env)
# See http://elixir-lang.org/docs/stable/elixir/Supervisor.html
# for other strategies and supported options
@ -93,11 +92,16 @@ defmodule Pleroma.Application do
end
opts = [strategy: :one_for_one, name: Pleroma.Supervisor, max_restarts: max_restarts]
result = Supervisor.start_link(children, opts)
set_postgres_server_version()
result
with {:ok, data} <- Supervisor.start_link(children, opts) do
set_postgres_server_version()
{:ok, data}
else
e ->
Logger.error("Failed to start!")
Logger.error("#{inspect(e)}")
e
end
end
defp set_postgres_server_version do
@ -139,29 +143,6 @@ defmodule Pleroma.Application do
end
end
defp setup_instrumenters do
require Prometheus.Registry
if Application.get_env(:prometheus, Pleroma.Repo.Instrumenter) do
:ok =
:telemetry.attach(
"prometheus-ecto",
[:pleroma, :repo, :query],
&Pleroma.Repo.Instrumenter.handle_event/4,
%{}
)
Pleroma.Repo.Instrumenter.setup()
end
Pleroma.Web.Endpoint.MetricsExporter.setup()
Pleroma.Web.Endpoint.PipelineInstrumenter.setup()
# Note: disabled until prometheus-phx is integrated into prometheus-phoenix:
# Pleroma.Web.Endpoint.Instrumenter.setup()
PrometheusPhx.setup()
end
defp cachex_children do
[
build_cachex("used_captcha", ttl_interval: seconds_valid_interval()),
@ -169,15 +150,13 @@ defmodule Pleroma.Application do
build_cachex("object", default_ttl: 25_000, ttl_interval: 1000, limit: 2500),
build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000),
build_cachex("scrubber", limit: 2500),
build_cachex("scrubber_management", limit: 2500),
build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500),
build_cachex("web_resp", limit: 2500),
build_cachex("emoji_packs", expiration: emoji_packs_expiration(), limit: 10),
build_cachex("failed_proxy_url", limit: 2500),
build_cachex("banned_urls", default_ttl: :timer.hours(24 * 30), limit: 5_000),
build_cachex("chat_message_id_idempotency_key",
expiration: chat_message_id_idempotency_key_expiration(),
limit: 500_000
)
build_cachex("translations", default_ttl: :timer.hours(24 * 30), limit: 2500)
]
end
@ -187,9 +166,6 @@ defmodule Pleroma.Application do
defp idempotency_expiration,
do: expiration(default: :timer.seconds(6 * 60 * 60), interval: :timer.seconds(60))
defp chat_message_id_idempotency_key_expiration,
do: expiration(default: :timer.minutes(2), interval: :timer.seconds(60))
defp seconds_valid_interval,
do: :timer.seconds(Config.get!([Pleroma.Captcha, :seconds_valid]))
@ -201,8 +177,6 @@ defmodule Pleroma.Application do
type: :worker
}
defp shout_enabled?, do: Config.get([:shout, :enabled])
defp dont_run_in_test(env) when env in [:test, :benchmark], do: []
defp dont_run_in_test(_) do
@ -222,15 +196,6 @@ defmodule Pleroma.Application do
]
end
defp shout_child(true) do
[
Pleroma.Web.ShoutChannel.ShoutChannelState,
{Phoenix.PubSub, [name: Pleroma.PubSub, adapter: Phoenix.PubSub.PG2]}
]
end
defp shout_child(_), do: []
defp task_children(:test) do
[
%{
@ -286,9 +251,13 @@ defmodule Pleroma.Application do
end
defp http_children do
proxy_url = Config.get([:http, :proxy_url])
proxy = Pleroma.HTTP.AdapterHelper.format_proxy(proxy_url)
config =
[:http, :adapter]
|> Config.get([])
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy_pool(proxy)
|> Keyword.put(:name, MyFinch)
[{Finch, config}]

View file

@ -1,97 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Chat do
use Ecto.Schema
import Ecto.Changeset
import Ecto.Query
alias Pleroma.Chat
alias Pleroma.Repo
alias Pleroma.User
@moduledoc """
Chat keeps a reference to ChatMessage conversations between a user and a recipient. The recipient can be a user (for now) or a group (not implemented yet).
It is a helper only, to make it easy to display a list of chats with other people, ordered by last bump. The actual messages are retrieved by querying the recipients of the ChatMessages.
"""
@type t :: %__MODULE__{}
@primary_key {:id, FlakeId.Ecto.CompatType, autogenerate: true}
schema "chats" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:recipient, :string)
timestamps()
end
def changeset(struct, params) do
struct
|> cast(params, [:user_id, :recipient])
|> validate_change(:recipient, fn
:recipient, recipient ->
case User.get_cached_by_ap_id(recipient) do
nil -> [recipient: "must be an existing user"]
_ -> []
end
end)
|> validate_required([:user_id, :recipient])
|> unique_constraint(:user_id, name: :chats_user_id_recipient_index)
end
@spec get_by_user_and_id(User.t(), FlakeId.Ecto.CompatType.t()) ::
{:ok, t()} | {:error, :not_found}
def get_by_user_and_id(%User{id: user_id}, id) do
from(c in __MODULE__,
where: c.id == ^id,
where: c.user_id == ^user_id
)
|> Repo.find_resource()
end
@spec get_by_id(FlakeId.Ecto.CompatType.t()) :: t() | nil
def get_by_id(id) do
Repo.get(__MODULE__, id)
end
@spec get(FlakeId.Ecto.CompatType.t(), String.t()) :: t() | nil
def get(user_id, recipient) do
Repo.get_by(__MODULE__, user_id: user_id, recipient: recipient)
end
@spec get_or_create(FlakeId.Ecto.CompatType.t(), String.t()) ::
{:ok, t()} | {:error, Ecto.Changeset.t()}
def get_or_create(user_id, recipient) do
%__MODULE__{}
|> changeset(%{user_id: user_id, recipient: recipient})
|> Repo.insert(
# Need to set something, otherwise we get nothing back at all
on_conflict: [set: [recipient: recipient]],
returning: true,
conflict_target: [:user_id, :recipient]
)
end
@spec bump_or_create(FlakeId.Ecto.CompatType.t(), String.t()) ::
{:ok, t()} | {:error, Ecto.Changeset.t()}
def bump_or_create(user_id, recipient) do
%__MODULE__{}
|> changeset(%{user_id: user_id, recipient: recipient})
|> Repo.insert(
on_conflict: [set: [updated_at: NaiveDateTime.utc_now()]],
returning: true,
conflict_target: [:user_id, :recipient]
)
end
@spec for_user_query(FlakeId.Ecto.CompatType.t()) :: Ecto.Query.t()
def for_user_query(user_id) do
from(c in Chat,
where: c.user_id == ^user_id,
order_by: [desc: c.updated_at]
)
end
end

View file

@ -1,117 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Chat.MessageReference do
@moduledoc """
A reference that builds a relation between an AP chat message that a user can see and whether it has been seen
by them, or should be displayed to them. Used to build the chat view that is presented to the user.
"""
use Ecto.Schema
alias Pleroma.Chat
alias Pleroma.Object
alias Pleroma.Repo
import Ecto.Changeset
import Ecto.Query
@primary_key {:id, FlakeId.Ecto.Type, autogenerate: true}
schema "chat_message_references" do
belongs_to(:object, Object)
belongs_to(:chat, Chat, type: FlakeId.Ecto.CompatType)
field(:unread, :boolean, default: true)
timestamps()
end
def changeset(struct, params) do
struct
|> cast(params, [:object_id, :chat_id, :unread])
|> validate_required([:object_id, :chat_id, :unread])
end
def get_by_id(id) do
__MODULE__
|> Repo.get(id)
|> Repo.preload(:object)
end
def delete(cm_ref) do
cm_ref
|> Repo.delete()
end
def delete_for_object(%{id: object_id}) do
from(cr in __MODULE__,
where: cr.object_id == ^object_id
)
|> Repo.delete_all()
end
def for_chat_and_object(%{id: chat_id}, %{id: object_id}) do
__MODULE__
|> Repo.get_by(chat_id: chat_id, object_id: object_id)
|> Repo.preload(:object)
end
def for_chat_query(chat) do
from(cr in __MODULE__,
where: cr.chat_id == ^chat.id,
order_by: [desc: :id],
preload: [:object]
)
end
def last_message_for_chat(chat) do
chat
|> for_chat_query()
|> limit(1)
|> Repo.one()
end
def create(chat, object, unread) do
params = %{
chat_id: chat.id,
object_id: object.id,
unread: unread
}
%__MODULE__{}
|> changeset(params)
|> Repo.insert()
end
def unread_count_for_chat(chat) do
chat
|> for_chat_query()
|> where([cmr], cmr.unread == true)
|> Repo.aggregate(:count)
end
def mark_as_read(cm_ref) do
cm_ref
|> changeset(%{unread: false})
|> Repo.update()
end
def set_all_seen_for_chat(chat, last_read_id \\ nil) do
query =
chat
|> for_chat_query()
|> exclude(:order_by)
|> exclude(:preload)
|> where([cmr], cmr.unread == true)
if last_read_id do
query
|> where([cmr], cmr.id <= ^last_read_id)
else
query
end
|> Repo.update_all(set: [unread: false])
end
end

View file

@ -11,10 +11,7 @@ defmodule Akkoma.Collections.Fetcher do
alias Pleroma.Config
require Logger
def fetch_collection_by_ap_id(ap_id) when is_binary(ap_id) do
fetch_collection(ap_id)
end
@spec fetch_collection(String.t() | map()) :: {:ok, [Pleroma.Object.t()]} | {:error, any()}
def fetch_collection(ap_id) when is_binary(ap_id) do
with {:ok, page} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id) do
{:ok, objects_from_collection(page)}
@ -26,7 +23,7 @@ defmodule Akkoma.Collections.Fetcher do
end
def fetch_collection(%{"type" => type} = page)
when type in ["Collection", "OrderedCollection"] do
when type in ["Collection", "OrderedCollection", "CollectionPage", "OrderedCollectionPage"] do
{:ok, objects_from_collection(page)}
end
@ -38,12 +35,13 @@ defmodule Akkoma.Collections.Fetcher do
when is_list(items) and type in ["Collection", "CollectionPage"],
do: items
defp objects_from_collection(%{"type" => "OrderedCollection", "orderedItems" => items})
when is_list(items),
do: items
defp objects_from_collection(%{"type" => type, "orderedItems" => items} = page)
when is_list(items) and type in ["OrderedCollection", "OrderedCollectionPage"],
do: maybe_next_page(page, items)
defp objects_from_collection(%{"type" => "Collection", "items" => items}) when is_list(items),
do: items
defp objects_from_collection(%{"type" => type, "items" => items} = page)
when is_list(items) and type in ["Collection", "CollectionPage"],
do: maybe_next_page(page, items)
defp objects_from_collection(%{"type" => type, "first" => first})
when is_binary(first) and type in ["Collection", "OrderedCollection"] do
@ -55,17 +53,27 @@ defmodule Akkoma.Collections.Fetcher do
fetch_page_items(id)
end
defp objects_from_collection(_page), do: []
defp fetch_page_items(id, items \\ []) do
if Enum.count(items) >= Config.get([:activitypub, :max_collection_objects]) do
items
else
{:ok, page} = Fetcher.fetch_and_contain_remote_object_from_id(id)
objects = items_in_page(page)
with {:ok, page} <- Fetcher.fetch_and_contain_remote_object_from_id(id) do
objects = items_in_page(page)
if Enum.count(objects) > 0 do
maybe_next_page(page, items ++ objects)
if Enum.count(objects) > 0 do
maybe_next_page(page, items ++ objects)
else
items
end
else
items
{:error, "Object has been deleted"} ->
items
{:error, error} ->
Logger.error("Could not fetch page #{id} - #{inspect(error)}")
{:error, error}
end
end
end
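
With the changes above the fetcher follows `first`/`next` page links, tolerates pages that have been deleted remotely, and caps the walk at `[:activitypub, :max_collection_objects]`. A usage sketch with a hypothetical collection id:

```elixir
with {:ok, items} <-
       Akkoma.Collections.Fetcher.fetch_collection(
         "https://remote.example/users/alice/outbox"
       ) do
  # items is a list of objects, at most :max_collection_objects long
  Enum.count(items)
end
```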

View file

@ -17,7 +17,9 @@ defmodule Pleroma.Config.DeprecationWarnings do
{[:instance, :mrf_transparency], [:mrf, :transparency],
"\n* `config :pleroma, :instance, mrf_transparency` is now `config :pleroma, :mrf, transparency`"},
{[:instance, :mrf_transparency_exclusions], [:mrf, :transparency_exclusions],
"\n* `config :pleroma, :instance, mrf_transparency_exclusions` is now `config :pleroma, :mrf, transparency_exclusions`"}
"\n* `config :pleroma, :instance, mrf_transparency_exclusions` is now `config :pleroma, :mrf, transparency_exclusions`"},
{[:instance, :quarantined_instances], [:mrf_simple, :reject],
"\n* `config :pleroma, :instance, :quarantined_instances` is now covered by `:pleroma, :mrf_simple, :reject`"}
]
def check_simple_policy_tuples do
@ -81,7 +83,7 @@ defmodule Pleroma.Config.DeprecationWarnings do
end
def check_quarantined_instances_tuples do
has_strings = Config.get([:instance, :quarantined_instances]) |> Enum.any?(&is_binary/1)
has_strings = Config.get([:instance, :quarantined_instances], []) |> Enum.any?(&is_binary/1)
if has_strings do
Logger.warn("""
@ -176,7 +178,6 @@ defmodule Pleroma.Config.DeprecationWarnings do
check_activity_expiration_config(),
check_remote_ip_plug_name(),
check_uploders_s3_public_endpoint(),
check_old_chat_shoutbox(),
check_quarantined_instances_tuples(),
check_transparency_exclusions_tuples(),
check_simple_policy_tuples()
@ -308,27 +309,4 @@ defmodule Pleroma.Config.DeprecationWarnings do
:ok
end
end
@spec check_old_chat_shoutbox() :: :ok | nil
def check_old_chat_shoutbox do
instance_config = Pleroma.Config.get([:instance])
chat_config = Pleroma.Config.get([:chat]) || []
use_old_config =
Keyword.has_key?(instance_config, :chat_limit) or
Keyword.has_key?(chat_config, :enabled)
if use_old_config do
Logger.error("""
!!!DEPRECATION WARNING!!!
Your config is using the old namespace for the Shoutbox configuration. You need to convert to the new namespace. e.g.,
\n* `config :pleroma, :chat, enabled` and `config :pleroma, :instance, chat_limit` are now equal to:
\n* `config :pleroma, :shout, enabled` and `config :pleroma, :shout, limit`
""")
:error
else
:ok
end
end
end
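
For admins following this warning chain, a hedged sketch of the migration the two checks above point at (host names and reasons are placeholders):

```elixir
# Deprecated: entries under :instance, whether bare strings or tuples
# config :pleroma, :instance,
#   quarantined_instances: [{"quarantined.example", "reason"}]

# Now covered by the SimplePolicy reject list, using {host, reason} tuples:
config :pleroma, :mrf_simple,
  reject: [{"quarantined.example", "reason"}]
```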

View file

@ -32,9 +32,9 @@ defmodule Pleroma.Config.Holder do
def release_defaults do
[
pleroma: [
{:instance, [static_dir: "/var/lib/pleroma/static"]},
{Pleroma.Uploaders.Local, [uploads: "/var/lib/pleroma/uploads"]},
{:modules, [runtime_dir: "/var/lib/pleroma/modules"]},
{:instance, [static_dir: "/var/lib/akkoma/static"]},
{Pleroma.Uploaders.Local, [uploads: "/var/lib/akkoma/uploads"]},
{:modules, [runtime_dir: "/var/lib/akkoma/modules"]},
{:release, true}
]
]

View file

@ -14,10 +14,10 @@ defmodule Pleroma.Config.ReleaseRuntimeProvider do
config_path =
cond do
opts[:config_path] -> opts[:config_path]
System.get_env("PLEROMA_CONFIG_PATH") -> System.get_env("PLEROMA_CONFIG_PATH")
System.get_env("AKKOMA_CONFIG_PATH") -> System.get_env("AKKOMA_CONFIG_PATH")
File.exists?("/etc/akkoma/config.exs") -> "/etc/akkoma/config.exs"
true -> "/etc/pleroma/config.exs"
System.get_env("PLEROMA_CONFIG_PATH") -> System.get_env("PLEROMA_CONFIG_PATH")
File.exists?("/etc/pleroma/config.exs") -> "/etc/pleroma/config.exs"
true -> "/etc/akkoma/config.exs"
end
with_runtime_config =
@ -31,7 +31,7 @@ defmodule Pleroma.Config.ReleaseRuntimeProvider do
warning = [
IO.ANSI.red(),
IO.ANSI.bright(),
"!!! Config path is not declared! Please ensure it exists and that PLEROMA_CONFIG_PATH is unset or points to an existing file",
"!!! Config path is not declared! Please ensure it exists and that AKKOMA_CONFIG_PATH and/or PLEROMA_CONFIG_PATH is unset or points to an existing file",
IO.ANSI.reset()
]

View file

@ -15,7 +15,6 @@ defmodule Pleroma.Config.TransferTask do
defp reboot_time_keys,
do: [
{:pleroma, :shout},
{:pleroma, Oban},
{:pleroma, :rate_limit},
{:pleroma, :markup},
@ -39,7 +38,6 @@ defmodule Pleroma.Config.TransferTask do
def load_and_update_env(deleted_settings \\ [], restart_pleroma? \\ true) do
with {_, true} <- {:configurable, Config.get(:configurable_from_database)} do
# We need to restart applications for loaded settings take effect
{logger, other} =
(Repo.all(ConfigDB) ++ deleted_settings)
|> Enum.map(&merge_with_default/1)
@ -51,8 +49,7 @@ defmodule Pleroma.Config.TransferTask do
started_applications = Application.started_applications()
# TODO: some problem with prometheus after restart!
reject = [nil, :prometheus, :postgrex]
reject = [nil, :postgrex]
reject =
if restart_pleroma? do
@ -87,7 +84,12 @@ defmodule Pleroma.Config.TransferTask do
end
defp merge_with_default(%{group: group, key: key, value: value} = setting) do
default = Config.Holder.default_config(group, key)
default =
if group == :pleroma do
Config.get([key], Config.Holder.default_config(group, key))
else
Config.Holder.default_config(group, key)
end
merged =
cond do

View file

@ -27,4 +27,40 @@ defmodule Pleroma.Constants do
do:
~w(index.html robots.txt static static-fe finmoji emoji packs sounds images instance sw.js sw-pleroma.js favicon.png schemas doc embed.js embed.css)
)
const(status_updatable_fields,
do: [
"source",
"tag",
"updated",
"emoji",
"content",
"summary",
"sensitive",
"attachment",
"generator"
]
)
const(updatable_object_types,
do: [
"Note",
"Question",
"Audio",
"Video",
"Event",
"Article",
"Page"
]
)
const(actor_types,
do: [
"Application",
"Group",
"Organization",
"Person",
"Service"
]
)
end

View file

@ -0,0 +1,10 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Docs.Translator do
require Pleroma.Docs.Translator.Compiler
require Pleroma.Web.Gettext
@before_compile Pleroma.Docs.Translator.Compiler
end

View file

@ -0,0 +1,119 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Docs.Translator.Compiler do
@external_resource "config/description.exs"
@raw_config Pleroma.Config.Loader.read("config/description.exs")
@raw_descriptions @raw_config[:pleroma][:config_description]
defmacro __before_compile__(_env) do
strings =
__MODULE__.descriptions()
|> __MODULE__.extract_strings()
quote do
def placeholder do
unquote do
Enum.map(
strings,
fn {path, type, string} ->
ctxt = msgctxt_for(path, type)
quote do
Pleroma.Web.Gettext.dpgettext_noop(
"config_descriptions",
unquote(ctxt),
unquote(string)
)
end
end
)
end
end
end
end
def descriptions do
Pleroma.Web.ActivityPub.MRF.config_descriptions()
|> Enum.reduce(@raw_descriptions, fn description, acc -> [description | acc] end)
|> Pleroma.Docs.Generator.convert_to_strings()
end
def extract_strings(descriptions) do
descriptions
|> Enum.reduce(%{strings: [], path: []}, &process_item/2)
|> Map.get(:strings)
end
defp process_item(entity, acc) do
current_level =
acc
|> process_desc(entity)
|> process_label(entity)
process_children(entity, current_level)
end
defp process_desc(acc, %{description: desc} = item) do
%{
strings: [{acc.path ++ [key_for(item)], "description", desc} | acc.strings],
path: acc.path
}
end
defp process_desc(acc, _) do
acc
end
defp process_label(acc, %{label: label} = item) do
%{
strings: [{acc.path ++ [key_for(item)], "label", label} | acc.strings],
path: acc.path
}
end
defp process_label(acc, _) do
acc
end
defp process_children(%{children: children} = item, acc) do
current_level = Map.put(acc, :path, acc.path ++ [key_for(item)])
children
|> Enum.reduce(current_level, &process_item/2)
|> Map.put(:path, acc.path)
end
defp process_children(_, acc) do
acc
end
def msgctxt_for(path, type) do
"config #{type} at #{Enum.join(path, " > ")}"
end
defp convert_group({_, group}) do
group
end
defp convert_group(group) do
group
end
def key_for(%{group: group, key: key}) do
"#{convert_group(group)}-#{key}"
end
def key_for(%{group: group}) do
convert_group(group)
end
def key_for(%{key: key}) do
key
end
def key_for(_) do
nil
end
end

View file

@ -1,13 +1,13 @@
# emoji-test.txt
# Date: 2021-08-26, 17:22:23 GMT
# © 2021 Unicode®, Inc.
# Date: 2022-08-12, 20:24:39 GMT
# © 2022 Unicode®, Inc.
# Unicode and the Unicode Logo are registered trademarks of Unicode, Inc. in the U.S. and other countries.
# For terms of use, see http://www.unicode.org/terms_of_use.html
# For terms of use, see https://www.unicode.org/terms_of_use.html
#
# Emoji Keyboard/Display Test Data for UTS #51
# Version: 14.0
# Version: 15.0
#
# For documentation and usage, see http://www.unicode.org/reports/tr51
# For documentation and usage, see https://www.unicode.org/reports/tr51
#
# This file provides data for testing which emoji forms should be in keyboards and which should also be displayed/processed.
# Format: code points; status # emoji name
@ -92,6 +92,7 @@
1F62C ; fully-qualified # 😬 E1.0 grimacing face
1F62E 200D 1F4A8 ; fully-qualified # 😮‍💨 E13.1 face exhaling
1F925 ; fully-qualified # 🤥 E3.0 lying face
1FAE8 ; fully-qualified # 🫨 E15.0 shaking face
# subgroup: face-sleepy
1F60C ; fully-qualified # 😌 E0.6 relieved face
@ -155,7 +156,7 @@
# subgroup: face-negative
1F624 ; fully-qualified # 😤 E0.6 face with steam from nose
1F621 ; fully-qualified # 😡 E0.6 pouting face
1F621 ; fully-qualified # 😡 E0.6 enraged face
1F620 ; fully-qualified # 😠 E0.6 angry face
1F92C ; fully-qualified # 🤬 E5.0 face with symbols on mouth
1F608 ; fully-qualified # 😈 E1.0 smiling face with horns
@ -190,8 +191,7 @@
1F649 ; fully-qualified # 🙉 E0.6 hear-no-evil monkey
1F64A ; fully-qualified # 🙊 E0.6 speak-no-evil monkey
# subgroup: emotion
1F48B ; fully-qualified # 💋 E0.6 kiss mark
# subgroup: heart
1F48C ; fully-qualified # 💌 E0.6 love letter
1F498 ; fully-qualified # 💘 E0.6 heart with arrow
1F49D ; fully-qualified # 💝 E0.6 heart with ribbon
@ -210,14 +210,20 @@
2764 200D 1FA79 ; unqualified # ❤‍🩹 E13.1 mending heart
2764 FE0F ; fully-qualified # ❤️ E0.6 red heart
2764 ; unqualified # ❤ E0.6 red heart
1FA77 ; fully-qualified # 🩷 E15.0 pink heart
1F9E1 ; fully-qualified # 🧡 E5.0 orange heart
1F49B ; fully-qualified # 💛 E0.6 yellow heart
1F49A ; fully-qualified # 💚 E0.6 green heart
1F499 ; fully-qualified # 💙 E0.6 blue heart
1FA75 ; fully-qualified # 🩵 E15.0 light blue heart
1F49C ; fully-qualified # 💜 E0.6 purple heart
1F90E ; fully-qualified # 🤎 E12.0 brown heart
1F5A4 ; fully-qualified # 🖤 E3.0 black heart
1FA76 ; fully-qualified # 🩶 E15.0 grey heart
1F90D ; fully-qualified # 🤍 E12.0 white heart
# subgroup: emotion
1F48B ; fully-qualified # 💋 E0.6 kiss mark
1F4AF ; fully-qualified # 💯 E0.6 hundred points
1F4A2 ; fully-qualified # 💢 E0.6 anger symbol
1F4A5 ; fully-qualified # 💥 E0.6 collision
@ -226,21 +232,20 @@
1F4A8 ; fully-qualified # 💨 E0.6 dashing away
1F573 FE0F ; fully-qualified # 🕳️ E0.7 hole
1F573 ; unqualified # 🕳 E0.7 hole
1F4A3 ; fully-qualified # 💣 E0.6 bomb
1F4AC ; fully-qualified # 💬 E0.6 speech balloon
1F441 FE0F 200D 1F5E8 FE0F ; fully-qualified # 👁️‍🗨️ E2.0 eye in speech bubble
1F441 200D 1F5E8 FE0F ; unqualified # 👁‍🗨️ E2.0 eye in speech bubble
1F441 FE0F 200D 1F5E8 ; unqualified # 👁️‍🗨 E2.0 eye in speech bubble
1F441 FE0F 200D 1F5E8 ; minimally-qualified # 👁️‍🗨 E2.0 eye in speech bubble
1F441 200D 1F5E8 ; unqualified # 👁‍🗨 E2.0 eye in speech bubble
1F5E8 FE0F ; fully-qualified # 🗨️ E2.0 left speech bubble
1F5E8 ; unqualified # 🗨 E2.0 left speech bubble
1F5EF FE0F ; fully-qualified # 🗯️ E0.7 right anger bubble
1F5EF ; unqualified # 🗯 E0.7 right anger bubble
1F4AD ; fully-qualified # 💭 E1.0 thought balloon
1F4A4 ; fully-qualified # 💤 E0.6 zzz
1F4A4 ; fully-qualified # 💤 E0.6 ZZZ
# Smileys & Emotion subtotal: 177
# Smileys & Emotion subtotal: 177 w/o modifiers
# Smileys & Emotion subtotal: 180
# Smileys & Emotion subtotal: 180 w/o modifiers
# group: People & Body
@ -300,6 +305,18 @@
1FAF4 1F3FD ; fully-qualified # 🫴🏽 E14.0 palm up hand: medium skin tone
1FAF4 1F3FE ; fully-qualified # 🫴🏾 E14.0 palm up hand: medium-dark skin tone
1FAF4 1F3FF ; fully-qualified # 🫴🏿 E14.0 palm up hand: dark skin tone
1FAF7 ; fully-qualified # 🫷 E15.0 leftwards pushing hand
1FAF7 1F3FB ; fully-qualified # 🫷🏻 E15.0 leftwards pushing hand: light skin tone
1FAF7 1F3FC ; fully-qualified # 🫷🏼 E15.0 leftwards pushing hand: medium-light skin tone
1FAF7 1F3FD ; fully-qualified # 🫷🏽 E15.0 leftwards pushing hand: medium skin tone
1FAF7 1F3FE ; fully-qualified # 🫷🏾 E15.0 leftwards pushing hand: medium-dark skin tone
1FAF7 1F3FF ; fully-qualified # 🫷🏿 E15.0 leftwards pushing hand: dark skin tone
1FAF8 ; fully-qualified # 🫸 E15.0 rightwards pushing hand
1FAF8 1F3FB ; fully-qualified # 🫸🏻 E15.0 rightwards pushing hand: light skin tone
1FAF8 1F3FC ; fully-qualified # 🫸🏼 E15.0 rightwards pushing hand: medium-light skin tone
1FAF8 1F3FD ; fully-qualified # 🫸🏽 E15.0 rightwards pushing hand: medium skin tone
1FAF8 1F3FE ; fully-qualified # 🫸🏾 E15.0 rightwards pushing hand: medium-dark skin tone
1FAF8 1F3FF ; fully-qualified # 🫸🏿 E15.0 rightwards pushing hand: dark skin tone
# subgroup: hand-fingers-partial
1F44C ; fully-qualified # 👌 E0.6 OK hand
@ -473,11 +490,11 @@
1F932 1F3FE ; fully-qualified # 🤲🏾 E5.0 palms up together: medium-dark skin tone
1F932 1F3FF ; fully-qualified # 🤲🏿 E5.0 palms up together: dark skin tone
1F91D ; fully-qualified # 🤝 E3.0 handshake
1F91D 1F3FB ; fully-qualified # 🤝🏻 E3.0 handshake: light skin tone
1F91D 1F3FC ; fully-qualified # 🤝🏼 E3.0 handshake: medium-light skin tone
1F91D 1F3FD ; fully-qualified # 🤝🏽 E3.0 handshake: medium skin tone
1F91D 1F3FE ; fully-qualified # 🤝🏾 E3.0 handshake: medium-dark skin tone
1F91D 1F3FF ; fully-qualified # 🤝🏿 E3.0 handshake: dark skin tone
1F91D 1F3FB ; fully-qualified # 🤝🏻 E14.0 handshake: light skin tone
1F91D 1F3FC ; fully-qualified # 🤝🏼 E14.0 handshake: medium-light skin tone
1F91D 1F3FD ; fully-qualified # 🤝🏽 E14.0 handshake: medium skin tone
1F91D 1F3FE ; fully-qualified # 🤝🏾 E14.0 handshake: medium-dark skin tone
1F91D 1F3FF ; fully-qualified # 🤝🏿 E14.0 handshake: dark skin tone
1FAF1 1F3FB 200D 1FAF2 1F3FC ; fully-qualified # 🫱🏻‍🫲🏼 E14.0 handshake: light skin tone, medium-light skin tone
1FAF1 1F3FB 200D 1FAF2 1F3FD ; fully-qualified # 🫱🏻‍🫲🏽 E14.0 handshake: light skin tone, medium skin tone
1FAF1 1F3FB 200D 1FAF2 1F3FE ; fully-qualified # 🫱🏻‍🫲🏾 E14.0 handshake: light skin tone, medium-dark skin tone
@ -1455,7 +1472,7 @@
1F575 1F3FF ; fully-qualified # 🕵🏿 E2.0 detective: dark skin tone
1F575 FE0F 200D 2642 FE0F ; fully-qualified # 🕵️‍♂️ E4.0 man detective
1F575 200D 2642 FE0F ; unqualified # 🕵‍♂️ E4.0 man detective
1F575 FE0F 200D 2642 ; unqualified # 🕵️‍♂ E4.0 man detective
1F575 FE0F 200D 2642 ; minimally-qualified # 🕵️‍♂ E4.0 man detective
1F575 200D 2642 ; unqualified # 🕵‍♂ E4.0 man detective
1F575 1F3FB 200D 2642 FE0F ; fully-qualified # 🕵🏻‍♂️ E4.0 man detective: light skin tone
1F575 1F3FB 200D 2642 ; minimally-qualified # 🕵🏻‍♂ E4.0 man detective: light skin tone
@ -1469,7 +1486,7 @@
1F575 1F3FF 200D 2642 ; minimally-qualified # 🕵🏿‍♂ E4.0 man detective: dark skin tone
1F575 FE0F 200D 2640 FE0F ; fully-qualified # 🕵️‍♀️ E4.0 woman detective
1F575 200D 2640 FE0F ; unqualified # 🕵‍♀️ E4.0 woman detective
1F575 FE0F 200D 2640 ; unqualified # 🕵️‍♀ E4.0 woman detective
1F575 FE0F 200D 2640 ; minimally-qualified # 🕵️‍♀ E4.0 woman detective
1F575 200D 2640 ; unqualified # 🕵‍♀ E4.0 woman detective
1F575 1F3FB 200D 2640 FE0F ; fully-qualified # 🕵🏻‍♀️ E4.0 woman detective: light skin tone
1F575 1F3FB 200D 2640 ; minimally-qualified # 🕵🏻‍♀ E4.0 woman detective: light skin tone
@ -2302,7 +2319,7 @@
1F3CC 1F3FF ; fully-qualified # 🏌🏿 E4.0 person golfing: dark skin tone
1F3CC FE0F 200D 2642 FE0F ; fully-qualified # 🏌️‍♂️ E4.0 man golfing
1F3CC 200D 2642 FE0F ; unqualified # 🏌‍♂️ E4.0 man golfing
1F3CC FE0F 200D 2642 ; unqualified # 🏌️‍♂ E4.0 man golfing
1F3CC FE0F 200D 2642 ; minimally-qualified # 🏌️‍♂ E4.0 man golfing
1F3CC 200D 2642 ; unqualified # 🏌‍♂ E4.0 man golfing
1F3CC 1F3FB 200D 2642 FE0F ; fully-qualified # 🏌🏻‍♂️ E4.0 man golfing: light skin tone
1F3CC 1F3FB 200D 2642 ; minimally-qualified # 🏌🏻‍♂ E4.0 man golfing: light skin tone
@ -2316,7 +2333,7 @@
1F3CC 1F3FF 200D 2642 ; minimally-qualified # 🏌🏿‍♂ E4.0 man golfing: dark skin tone
1F3CC FE0F 200D 2640 FE0F ; fully-qualified # 🏌️‍♀️ E4.0 woman golfing
1F3CC 200D 2640 FE0F ; unqualified # 🏌‍♀️ E4.0 woman golfing
1F3CC FE0F 200D 2640 ; unqualified # 🏌️‍♀ E4.0 woman golfing
1F3CC FE0F 200D 2640 ; minimally-qualified # 🏌️‍♀ E4.0 woman golfing
1F3CC 200D 2640 ; unqualified # 🏌‍♀ E4.0 woman golfing
1F3CC 1F3FB 200D 2640 FE0F ; fully-qualified # 🏌🏻‍♀️ E4.0 woman golfing: light skin tone
1F3CC 1F3FB 200D 2640 ; minimally-qualified # 🏌🏻‍♀ E4.0 woman golfing: light skin tone
@ -2427,7 +2444,7 @@
26F9 1F3FF ; fully-qualified # ⛹🏿 E2.0 person bouncing ball: dark skin tone
26F9 FE0F 200D 2642 FE0F ; fully-qualified # ⛹️‍♂️ E4.0 man bouncing ball
26F9 200D 2642 FE0F ; unqualified # ⛹‍♂️ E4.0 man bouncing ball
26F9 FE0F 200D 2642 ; unqualified # ⛹️‍♂ E4.0 man bouncing ball
26F9 FE0F 200D 2642 ; minimally-qualified # ⛹️‍♂ E4.0 man bouncing ball
26F9 200D 2642 ; unqualified # ⛹‍♂ E4.0 man bouncing ball
26F9 1F3FB 200D 2642 FE0F ; fully-qualified # ⛹🏻‍♂️ E4.0 man bouncing ball: light skin tone
26F9 1F3FB 200D 2642 ; minimally-qualified # ⛹🏻‍♂ E4.0 man bouncing ball: light skin tone
@ -2441,7 +2458,7 @@
26F9 1F3FF 200D 2642 ; minimally-qualified # ⛹🏿‍♂ E4.0 man bouncing ball: dark skin tone
26F9 FE0F 200D 2640 FE0F ; fully-qualified # ⛹️‍♀️ E4.0 woman bouncing ball
26F9 200D 2640 FE0F ; unqualified # ⛹‍♀️ E4.0 woman bouncing ball
26F9 FE0F 200D 2640 ; unqualified # ⛹️‍♀ E4.0 woman bouncing ball
26F9 FE0F 200D 2640 ; minimally-qualified # ⛹️‍♀ E4.0 woman bouncing ball
26F9 200D 2640 ; unqualified # ⛹‍♀ E4.0 woman bouncing ball
26F9 1F3FB 200D 2640 FE0F ; fully-qualified # ⛹🏻‍♀️ E4.0 woman bouncing ball: light skin tone
26F9 1F3FB 200D 2640 ; minimally-qualified # ⛹🏻‍♀ E4.0 woman bouncing ball: light skin tone
@ -2462,7 +2479,7 @@
1F3CB 1F3FF ; fully-qualified # 🏋🏿 E2.0 person lifting weights: dark skin tone
1F3CB FE0F 200D 2642 FE0F ; fully-qualified # 🏋️‍♂️ E4.0 man lifting weights
1F3CB 200D 2642 FE0F ; unqualified # 🏋‍♂️ E4.0 man lifting weights
1F3CB FE0F 200D 2642 ; unqualified # 🏋️‍♂ E4.0 man lifting weights
1F3CB FE0F 200D 2642 ; minimally-qualified # 🏋️‍♂ E4.0 man lifting weights
1F3CB 200D 2642 ; unqualified # 🏋‍♂ E4.0 man lifting weights
1F3CB 1F3FB 200D 2642 FE0F ; fully-qualified # 🏋🏻‍♂️ E4.0 man lifting weights: light skin tone
1F3CB 1F3FB 200D 2642 ; minimally-qualified # 🏋🏻‍♂ E4.0 man lifting weights: light skin tone
@ -2476,7 +2493,7 @@
1F3CB 1F3FF 200D 2642 ; minimally-qualified # 🏋🏿‍♂ E4.0 man lifting weights: dark skin tone
1F3CB FE0F 200D 2640 FE0F ; fully-qualified # 🏋️‍♀️ E4.0 woman lifting weights
1F3CB 200D 2640 FE0F ; unqualified # 🏋‍♀️ E4.0 woman lifting weights
1F3CB FE0F 200D 2640 ; unqualified # 🏋️‍♀ E4.0 woman lifting weights
1F3CB FE0F 200D 2640 ; minimally-qualified # 🏋️‍♀ E4.0 woman lifting weights
1F3CB 200D 2640 ; unqualified # 🏋‍♀ E4.0 woman lifting weights
1F3CB 1F3FB 200D 2640 FE0F ; fully-qualified # 🏋🏻‍♀️ E4.0 woman lifting weights: light skin tone
1F3CB 1F3FB 200D 2640 ; minimally-qualified # 🏋🏻‍♀ E4.0 woman lifting weights: light skin tone
@ -3262,8 +3279,8 @@
1FAC2 ; fully-qualified # 🫂 E13.0 people hugging
1F463 ; fully-qualified # 👣 E0.6 footprints
# People & Body subtotal: 2986
# People & Body subtotal: 506 w/o modifiers
# People & Body subtotal: 2998
# People & Body subtotal: 508 w/o modifiers
# group: Component
@ -3306,6 +3323,8 @@
1F405 ; fully-qualified # 🐅 E1.0 tiger
1F406 ; fully-qualified # 🐆 E1.0 leopard
1F434 ; fully-qualified # 🐴 E0.6 horse face
1FACE ; fully-qualified # 🫎 E15.0 moose
1FACF ; fully-qualified # 🫏 E15.0 donkey
1F40E ; fully-qualified # 🐎 E0.6 horse
1F984 ; fully-qualified # 🦄 E1.0 unicorn
1F993 ; fully-qualified # 🦓 E5.0 zebra
@ -3373,6 +3392,9 @@
1F9A9 ; fully-qualified # 🦩 E12.0 flamingo
1F99A ; fully-qualified # 🦚 E11.0 peacock
1F99C ; fully-qualified # 🦜 E11.0 parrot
1FABD ; fully-qualified # 🪽 E15.0 wing
1F426 200D 2B1B ; fully-qualified # 🐦‍⬛ E15.0 black bird
1FABF ; fully-qualified # 🪿 E15.0 goose
# subgroup: animal-amphibian
1F438 ; fully-qualified # 🐸 E0.6 frog
@ -3399,6 +3421,7 @@
1F419 ; fully-qualified # 🐙 E0.6 octopus
1F41A ; fully-qualified # 🐚 E0.6 spiral shell
1FAB8 ; fully-qualified # 🪸 E14.0 coral
1FABC ; fully-qualified # 🪼 E15.0 jellyfish
# subgroup: animal-bug
1F40C ; fully-qualified # 🐌 E0.6 snail
@ -3433,6 +3456,7 @@
1F33B ; fully-qualified # 🌻 E0.6 sunflower
1F33C ; fully-qualified # 🌼 E0.6 blossom
1F337 ; fully-qualified # 🌷 E0.6 tulip
1FABB ; fully-qualified # 🪻 E15.0 hyacinth
# subgroup: plant-other
1F331 ; fully-qualified # 🌱 E0.6 seedling
@ -3451,9 +3475,10 @@
1F343 ; fully-qualified # 🍃 E0.6 leaf fluttering in wind
1FAB9 ; fully-qualified # 🪹 E14.0 empty nest
1FABA ; fully-qualified # 🪺 E14.0 nest with eggs
1F344 ; fully-qualified # 🍄 E0.6 mushroom
# Animals & Nature subtotal: 151
# Animals & Nature subtotal: 151 w/o modifiers
# Animals & Nature subtotal: 159
# Animals & Nature subtotal: 159 w/o modifiers
# group: Food & Drink
@ -3492,10 +3517,11 @@
1F966 ; fully-qualified # 🥦 E5.0 broccoli
1F9C4 ; fully-qualified # 🧄 E12.0 garlic
1F9C5 ; fully-qualified # 🧅 E12.0 onion
1F344 ; fully-qualified # 🍄 E0.6 mushroom
1F95C ; fully-qualified # 🥜 E3.0 peanuts
1FAD8 ; fully-qualified # 🫘 E14.0 beans
1F330 ; fully-qualified # 🌰 E0.6 chestnut
1FADA ; fully-qualified # 🫚 E15.0 ginger root
1FADB ; fully-qualified # 🫛 E15.0 pea pod
# subgroup: food-prepared
1F35E ; fully-qualified # 🍞 E0.6 bread
@ -3607,8 +3633,8 @@
1FAD9 ; fully-qualified # 🫙 E14.0 jar
1F3FA ; fully-qualified # 🏺 E1.0 amphora
# Food & Drink subtotal: 134
# Food & Drink subtotal: 134 w/o modifiers
# Food & Drink subtotal: 135
# Food & Drink subtotal: 135 w/o modifiers
# group: Travel & Places
@ -3974,11 +4000,10 @@
1F3AF ; fully-qualified # 🎯 E0.6 bullseye
1FA80 ; fully-qualified # 🪀 E12.0 yo-yo
1FA81 ; fully-qualified # 🪁 E12.0 kite
1F52B ; fully-qualified # 🔫 E0.6 water pistol
1F3B1 ; fully-qualified # 🎱 E0.6 pool 8 ball
1F52E ; fully-qualified # 🔮 E0.6 crystal ball
1FA84 ; fully-qualified # 🪄 E13.0 magic wand
1F9FF ; fully-qualified # 🧿 E11.0 nazar amulet
1FAAC ; fully-qualified # 🪬 E14.0 hamsa
1F3AE ; fully-qualified # 🎮 E0.6 video game
1F579 FE0F ; fully-qualified # 🕹️ E0.7 joystick
1F579 ; unqualified # 🕹 E0.7 joystick
@ -4013,8 +4038,8 @@
1F9F6 ; fully-qualified # 🧶 E11.0 yarn
1FAA2 ; fully-qualified # 🪢 E13.0 knot
# Activities subtotal: 97
# Activities subtotal: 97 w/o modifiers
# Activities subtotal: 96
# Activities subtotal: 96 w/o modifiers
# group: Objects
@ -4040,6 +4065,7 @@
1FA73 ; fully-qualified # 🩳 E12.0 shorts
1F459 ; fully-qualified # 👙 E0.6 bikini
1F45A ; fully-qualified # 👚 E0.6 womans clothes
1FAAD ; fully-qualified # 🪭 E15.0 folding hand fan
1F45B ; fully-qualified # 👛 E0.6 purse
1F45C ; fully-qualified # 👜 E0.6 handbag
1F45D ; fully-qualified # 👝 E0.6 clutch bag
@ -4055,6 +4081,7 @@
1F461 ; fully-qualified # 👡 E0.6 womans sandal
1FA70 ; fully-qualified # 🩰 E12.0 ballet shoes
1F462 ; fully-qualified # 👢 E0.6 womans boot
1FAAE ; fully-qualified # 🪮 E15.0 hair pick
1F451 ; fully-qualified # 👑 E0.6 crown
1F452 ; fully-qualified # 👒 E0.6 womans hat
1F3A9 ; fully-qualified # 🎩 E0.6 top hat
@ -4103,6 +4130,8 @@
1FA95 ; fully-qualified # 🪕 E12.0 banjo
1F941 ; fully-qualified # 🥁 E3.0 drum
1FA98 ; fully-qualified # 🪘 E13.0 long drum
1FA87 ; fully-qualified # 🪇 E15.0 maracas
1FA88 ; fully-qualified # 🪈 E15.0 flute
# subgroup: phone
1F4F1 ; fully-qualified # 📱 E0.6 mobile phone
@ -4275,7 +4304,7 @@
1F5E1 ; unqualified # 🗡 E0.7 dagger
2694 FE0F ; fully-qualified # ⚔️ E1.0 crossed swords
2694 ; unqualified # ⚔ E1.0 crossed swords
1F52B ; fully-qualified # 🔫 E0.6 water pistol
1F4A3 ; fully-qualified # 💣 E0.6 bomb
1FA83 ; fully-qualified # 🪃 E13.0 boomerang
1F3F9 ; fully-qualified # 🏹 E1.0 bow and arrow
1F6E1 FE0F ; fully-qualified # 🛡️ E0.7 shield
@ -4354,12 +4383,14 @@
1FAA6 ; fully-qualified # 🪦 E13.0 headstone
26B1 FE0F ; fully-qualified # ⚱️ E1.0 funeral urn
26B1 ; unqualified # ⚱ E1.0 funeral urn
1F9FF ; fully-qualified # 🧿 E11.0 nazar amulet
1FAAC ; fully-qualified # 🪬 E14.0 hamsa
1F5FF ; fully-qualified # 🗿 E0.6 moai
1FAA7 ; fully-qualified # 🪧 E13.0 placard
1FAAA ; fully-qualified # 🪪 E14.0 identification card
# Objects subtotal: 304
# Objects subtotal: 304 w/o modifiers
# Objects subtotal: 310
# Objects subtotal: 310 w/o modifiers
# group: Symbols
@ -4455,6 +4486,7 @@
262E ; unqualified # ☮ E1.0 peace symbol
1F54E ; fully-qualified # 🕎 E1.0 menorah
1F52F ; fully-qualified # 🔯 E0.6 dotted six-pointed star
1FAAF ; fully-qualified # 🪯 E15.0 khanda
# subgroup: zodiac
2648 ; fully-qualified # ♈ E0.6 Aries
@ -4503,6 +4535,7 @@
1F505 ; fully-qualified # 🔅 E1.0 dim button
1F506 ; fully-qualified # 🔆 E1.0 bright button
1F4F6 ; fully-qualified # 📶 E0.6 antenna bars
1F6DC ; fully-qualified # 🛜 E15.0 wireless
1F4F3 ; fully-qualified # 📳 E0.6 vibration mode
1F4F4 ; fully-qualified # 📴 E0.6 mobile phone off
@ -4693,8 +4726,8 @@
1F533 ; fully-qualified # 🔳 E0.6 white square button
1F532 ; fully-qualified # 🔲 E0.6 black square button
# Symbols subtotal: 302
# Symbols subtotal: 302 w/o modifiers
# Symbols subtotal: 304
# Symbols subtotal: 304 w/o modifiers
# group: Flags
@ -4709,7 +4742,7 @@
1F3F3 200D 1F308 ; unqualified # 🏳‍🌈 E4.0 rainbow flag
1F3F3 FE0F 200D 26A7 FE0F ; fully-qualified # 🏳️‍⚧️ E13.0 transgender flag
1F3F3 200D 26A7 FE0F ; unqualified # 🏳‍⚧️ E13.0 transgender flag
1F3F3 FE0F 200D 26A7 ; unqualified # 🏳️‍⚧ E13.0 transgender flag
1F3F3 FE0F 200D 26A7 ; minimally-qualified # 🏳️‍⚧ E13.0 transgender flag
1F3F3 200D 26A7 ; unqualified # 🏳‍⚧ E13.0 transgender flag
1F3F4 200D 2620 FE0F ; fully-qualified # 🏴‍☠️ E11.0 pirate flag
1F3F4 200D 2620 ; minimally-qualified # 🏴‍☠ E11.0 pirate flag
@ -4983,9 +5016,9 @@
# Flags subtotal: 275 w/o modifiers
# Status Counts
# fully-qualified : 3624
# minimally-qualified : 817
# unqualified : 252
# fully-qualified : 3655
# minimally-qualified : 827
# unqualified : 242
# component : 9
#EOF

View file

@ -9,6 +9,7 @@ defmodule Pleroma.Emoji do
"""
use GenServer
alias Pleroma.Emoji.Combinations
alias Pleroma.Emoji.Loader
require Logger
@ -124,7 +125,7 @@ defmodule Pleroma.Emoji do
|> String.split("\n")
|> Enum.filter(fn line ->
line != "" and not String.starts_with?(line, "#") and
String.contains?(line, "qualified")
String.contains?(line, "fully-qualified")
end)
|> Enum.map(fn line ->
line
@ -186,4 +187,22 @@ defmodule Pleroma.Emoji do
end
def emoji_url(_), do: nil
def emoji_name_with_instance(name, url) do
url = url |> URI.parse() |> Map.get(:host)
"#{name}@#{url}"
end
emoji_qualification_map =
emojis
|> Enum.filter(&String.contains?(&1, "\uFE0F"))
|> Combinations.variate_emoji_qualification()
for {qualified, unqualified_list} <- emoji_qualification_map do
for unqualified <- unqualified_list do
def fully_qualify_emoji(unquote(unqualified)), do: unquote(qualified)
end
end
def fully_qualify_emoji(emoji), do: emoji
end

View file

@ -0,0 +1,45 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Emoji.Combinations do
# FE0F is the emoji variation sequence. It is used for fully-qualifying
# emoji, and that includes emoji combinations.
# This code generates combinations per emoji: for each FE0F, all possible
# combinations of the character being removed or staying will be generated.
# This is made as an attempt to find all partially-qualified and unqualified
# versions of a fully-qualified emoji.
# I have found *no cases* for which this would be a problem, after browsing
# the entire emoji list in emoji-test.txt. This is safe, and, sadly, most
# likely sane too.
defp qualification_combinations(codepoints) do
qualification_combinations([[]], codepoints)
end
defp qualification_combinations(acc, []), do: acc
defp qualification_combinations(acc, ["\uFE0F" | tail]) do
acc
|> Enum.flat_map(fn x -> [x, x ++ ["\uFE0F"]] end)
|> qualification_combinations(tail)
end
defp qualification_combinations(acc, [codepoint | tail]) do
acc
|> Enum.map(&Kernel.++(&1, [codepoint]))
|> qualification_combinations(tail)
end
def variate_emoji_qualification(emoji) when is_binary(emoji) do
emoji
|> String.codepoints()
|> qualification_combinations()
|> Enum.map(&List.to_string/1)
end
def variate_emoji_qualification(emoji) when is_list(emoji) do
emoji
|> Enum.map(fn emoji -> {emoji, variate_emoji_qualification(emoji)} end)
end
end
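
The effect of the combination generator, using the red heart (U+2764 U+FE0F) as a convenient example: the output contains the fully-qualified form plus every variant with some `FE0F` removed.

```elixir
iex> Pleroma.Emoji.Combinations.variate_emoji_qualification("\u2764\uFE0F")
["\u2764", "\u2764\uFE0F"]
```

`Pleroma.Emoji` consumes these pairs at compile time to generate `fully_qualify_emoji/1` clauses, so an unqualified "\u2764" in incoming content resolves to the fully-qualified "\u2764\uFE0F".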

View file

@ -240,30 +240,6 @@ defmodule Pleroma.FollowingRelationship do
end)
end
@doc """
For a query with joined activity,
keeps rows where activity's actor is followed by user -or- is NOT domain-blocked by user.
"""
def keep_following_or_not_domain_blocked(query, user) do
where(
query,
[_, activity],
fragment(
# "(actor's domain NOT in domain_blocks) OR (actor IS in followed AP IDs)"
"""
NOT (substring(? from '.*://([^/]*)') = ANY(?)) OR
? = ANY(SELECT ap_id FROM users AS u INNER JOIN following_relationships AS fr
ON u.id = fr.following_id WHERE fr.follower_id = ? AND fr.state = ?)
""",
activity.actor,
^user.domain_blocks,
activity.actor,
^User.binary_id(user.id),
^accept_state_code()
)
)
end
defp validate_not_self_relationship(%Changeset{} = changeset) do
changeset
|> validate_follower_id_following_id_inequality()

View file

@ -136,7 +136,12 @@ defmodule Pleroma.Formatter do
HTML.filter_tags(text)
end
def html_escape(text, format) when format in ["text/plain", "text/x.misskeymarkdown"] do
def html_escape(text, "text/x.misskeymarkdown") do
text
|> HTML.filter_tags()
end
def html_escape(text, "text/plain") do
Regex.split(@link_regex, text, include_captures: true)
|> Enum.map_every(2, fn chunk ->
{:safe, part} = Phoenix.HTML.html_escape(chunk)

View file

@ -9,6 +9,7 @@ defmodule Pleroma.Helpers.AuthHelper do
import Plug.Conn
@oauth_token_session_key :oauth_token
@oauth_user_session_key :oauth_user
@doc """
Skips OAuth permissions (scopes) checks, assigns nil `:token`.
@ -43,4 +44,16 @@ defmodule Pleroma.Helpers.AuthHelper do
def delete_session_token(%Conn{} = conn) do
delete_session(conn, @oauth_token_session_key)
end
def put_session_user(%Conn{} = conn, user) do
put_session(conn, @oauth_user_session_key, user)
end
def delete_session_user(%Conn{} = conn) do
delete_session(conn, @oauth_user_session_key)
end
def get_session_user(%Conn{} = conn) do
get_session(conn, @oauth_user_session_key)
end
end

View file
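
The new helpers above are thin wrappers around Plug.Conn session access under @oauth_user_session_key. A hypothetical round trip (assumes the session has already been fetched; whether a user id or a full struct is stored is up to the caller):

conn
|> Pleroma.Helpers.AuthHelper.put_session_user(user.id)
|> Pleroma.Helpers.AuthHelper.get_session_user()
#=> user.id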

@ -6,7 +6,7 @@ defmodule Pleroma.HTTP.AdapterHelper do
@moduledoc """
Configure Tesla.Client with default and customized adapter options.
"""
@defaults [name: MyFinch, connect_timeout: 5_000, recv_timeout: 5_000]
@defaults [name: MyFinch, pool_timeout: 5_000, receive_timeout: 5_000]
@type proxy_type() :: :socks4 | :socks5
@type host() :: charlist() | :inet.ip_address()
@ -25,15 +25,58 @@ defmodule Pleroma.HTTP.AdapterHelper do
def format_proxy(proxy_url) do
case parse_proxy(proxy_url) do
{:ok, host, port} -> {host, port}
{:ok, type, host, port} -> {type, host, port}
{:ok, host, port} -> {:http, host, port, []}
{:ok, type, host, port} -> {type, host, port, []}
_ -> nil
end
end
@spec maybe_add_proxy(keyword(), proxy() | nil) :: keyword()
def maybe_add_proxy(opts, nil), do: opts
def maybe_add_proxy(opts, proxy), do: Keyword.put_new(opts, :proxy, proxy)
def maybe_add_proxy(opts, proxy) do
Keyword.put(opts, :proxy, proxy)
end
def maybe_add_proxy_pool(opts, nil), do: opts
def maybe_add_proxy_pool(opts, proxy) do
Logger.info("Using HTTP Proxy: #{inspect(proxy)}")
opts
|> maybe_add_pools()
|> maybe_add_default_pool()
|> maybe_add_conn_opts()
|> put_in([:pools, :default, :conn_opts, :proxy], proxy)
end
defp maybe_add_pools(opts) do
if Keyword.has_key?(opts, :pools) do
opts
else
Keyword.put(opts, :pools, %{})
end
end
defp maybe_add_default_pool(opts) do
pools = Keyword.get(opts, :pools)
if Map.has_key?(pools, :default) do
opts
else
put_in(opts, [:pools, :default], [])
end
end
defp maybe_add_conn_opts(opts) do
conn_opts = get_in(opts, [:pools, :default, :conn_opts])
unless is_nil(conn_opts) do
opts
else
put_in(opts, [:pools, :default, :conn_opts], [])
end
end
@doc """
Merge default connection & adapter options with received ones.
@ -46,36 +89,31 @@ defmodule Pleroma.HTTP.AdapterHelper do
|> AdapterHelper.Default.options(uri)
end
defp proxy_type("http"), do: {:ok, :http}
defp proxy_type("https"), do: {:ok, :https}
defp proxy_type(_), do: {:error, :unknown}
@spec parse_proxy(String.t() | tuple() | nil) ::
{:ok, host(), pos_integer()}
| {:ok, proxy_type(), host(), pos_integer()}
| {:error, atom()}
| nil
def parse_proxy(nil), do: nil
def parse_proxy(proxy) when is_binary(proxy) do
with [host, port] <- String.split(proxy, ":"),
{port, ""} <- Integer.parse(port) do
{:ok, parse_host(host), port}
with %URI{} = uri <- URI.parse(proxy),
{:ok, type} <- proxy_type(uri.scheme) do
{:ok, type, uri.host, uri.port}
else
{_, _} ->
Logger.warn("Parsing port failed #{inspect(proxy)}")
{:error, :invalid_proxy_port}
:error ->
Logger.warn("Parsing port failed #{inspect(proxy)}")
{:error, :invalid_proxy_port}
_ ->
Logger.warn("Parsing proxy failed #{inspect(proxy)}")
e ->
Logger.warn("Parsing proxy failed #{inspect(proxy)}, #{inspect(e)}")
{:error, :invalid_proxy}
end
end
def parse_proxy(proxy) when is_tuple(proxy) do
with {type, host, port} <- proxy do
{:ok, type, parse_host(host), port}
{:ok, type, host, port}
else
_ ->
Logger.warn("Parsing proxy failed #{inspect(proxy)}")

View file
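
With the URI-based parsing above, a proxy URL is expected to carry an explicit scheme, and format_proxy/1 returns the 4-tuple that maybe_add_proxy_pool/2 nests under conn_opts[:proxy]. Rough expectations, assuming the new clauses shown in the diff (values are illustrative):

Pleroma.HTTP.AdapterHelper.parse_proxy("http://127.0.0.1:8080")
#=> {:ok, :http, "127.0.0.1", 8080}

Pleroma.HTTP.AdapterHelper.format_proxy("http://127.0.0.1:8080")
#=> {:http, "127.0.0.1", 8080, []}

# maybe_add_proxy_pool/2 then builds roughly:
# [pools: %{default: [conn_opts: [proxy: {:http, "127.0.0.1", 8080, []}]]}, ...]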

@ -9,7 +9,7 @@ defmodule Pleroma.HTTP.AdapterHelper.Default do
@spec options(keyword(), URI.t()) :: keyword()
def options(opts, _uri) do
proxy = Pleroma.Config.get([:http, :proxy_url], nil)
proxy = Pleroma.Config.get([:http, :proxy_url])
AdapterHelper.maybe_add_proxy(opts, AdapterHelper.format_proxy(proxy))
end

View file

@ -1,82 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.HTTP.AdapterHelper.Gun do
@behaviour Pleroma.HTTP.AdapterHelper
alias Pleroma.Config
alias Pleroma.HTTP.AdapterHelper
require Logger
@defaults [
retry: 1,
retry_timeout: 1_000
]
@type pool() :: :federation | :upload | :media | :default
@spec options(keyword(), URI.t()) :: keyword()
def options(incoming_opts \\ [], %URI{} = uri) do
proxy =
[:http, :proxy_url]
|> Config.get()
|> AdapterHelper.format_proxy()
config_opts = Config.get([:http, :adapter], [])
@defaults
|> Keyword.merge(config_opts)
|> add_scheme_opts(uri)
|> AdapterHelper.maybe_add_proxy(proxy)
|> Keyword.merge(incoming_opts)
|> put_timeout()
end
defp add_scheme_opts(opts, %{scheme: "http"}), do: opts
defp add_scheme_opts(opts, %{scheme: "https"}) do
Keyword.put(opts, :certificates_verification, true)
end
defp put_timeout(opts) do
{recv_timeout, opts} = Keyword.pop(opts, :recv_timeout, pool_timeout(opts[:pool]))
# this is the timeout to receive a message from Gun
# `:timeout` key is used in Tesla
Keyword.put(opts, :timeout, recv_timeout)
end
@spec pool_timeout(pool()) :: non_neg_integer()
def pool_timeout(pool) do
default = Config.get([:pools, :default, :recv_timeout], 5_000)
Config.get([:pools, pool, :recv_timeout], default)
end
def limiter_setup do
prefix = Pleroma.Gun.ConnectionPool
wait = Config.get([:connections_pool, :connection_acquisition_wait])
retries = Config.get([:connections_pool, :connection_acquisition_retries])
:pools
|> Config.get([])
|> Enum.each(fn {name, opts} ->
max_running = Keyword.get(opts, :size, 50)
max_waiting = Keyword.get(opts, :max_waiting, 10)
result =
ConcurrentLimiter.new(:"#{prefix}.#{name}", max_running, max_waiting,
wait: wait,
max_retries: retries
)
case result do
:ok -> :ok
{:error, :existing} -> :ok
end
end)
:ok
end
end

View file

@ -1,40 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.HTTP.AdapterHelper.Hackney do
@behaviour Pleroma.HTTP.AdapterHelper
@defaults [
follow_redirect: true,
force_redirect: true
]
@spec options(keyword(), URI.t()) :: keyword()
def options(connection_opts \\ [], %URI{} = uri) do
proxy = Pleroma.Config.get([:http, :proxy_url])
config_opts = Pleroma.Config.get([:http, :adapter], [])
@defaults
|> Keyword.merge(config_opts)
|> Keyword.merge(connection_opts)
|> add_scheme_opts(uri)
|> maybe_add_with_body()
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy(proxy)
end
defp add_scheme_opts(opts, %URI{scheme: "https"}) do
Keyword.put(opts, :ssl_options, versions: [:"tlsv1.3", :"tlsv1.2", :"tlsv1.1", :tlsv1])
end
defp add_scheme_opts(opts, _), do: opts
defp maybe_add_with_body(opts) do
if opts[:max_body] do
Keyword.put(opts, :with_body, true)
else
opts
end
end
end

View file

@ -3,7 +3,6 @@
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.MigrationHelper.NotificationBackfill do
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User
@ -79,14 +78,5 @@ defmodule Pleroma.MigrationHelper.NotificationBackfill do
end
end
defp type_from_activity_object(%{data: %{"type" => "Create", "object" => %{}}}), do: "mention"
defp type_from_activity_object(%{data: %{"type" => "Create"}} = activity) do
object = Object.get_by_ap_id(activity.data["object"])
case object && object.data["type"] do
"ChatMessage" -> "pleroma:chat_mention"
_ -> "mention"
end
end
defp type_from_activity_object(%{data: %{"type" => "Create"}}), do: "mention"
end

View file

@ -237,17 +237,6 @@ defmodule Pleroma.ModerationLog do
insert_log_entry_with_message(%ModerationLog{data: data})
end
def insert_log(%{actor: %User{} = actor, action: "chat_message_delete", subject_id: subject_id}) do
%ModerationLog{
data: %{
"actor" => %{"nickname" => actor.nickname},
"action" => "chat_message_delete",
"subject_id" => subject_id
}
}
|> insert_log_entry_with_message()
end
@spec insert_log_entry_with_message(ModerationLog) :: {:ok, ModerationLog} | {:error, any}
defp insert_log_entry_with_message(entry) do
entry.data["message"]
@ -554,16 +543,6 @@ defmodule Pleroma.ModerationLog do
"@#{actor_nickname} updated users: #{users_to_nicknames_string(subjects)}"
end
def get_log_entry_message(%ModerationLog{
data: %{
"actor" => %{"nickname" => actor_nickname},
"action" => "chat_message_delete",
"subject_id" => subject_id
}
}) do
"@#{actor_nickname} deleted chat message ##{subject_id}"
end
def get_log_entry_message(%ModerationLog{
data: %{
"actor" => %{"nickname" => actor_nickname},

View file

@ -68,7 +68,6 @@ defmodule Pleroma.Notification do
follow_request
mention
move
pleroma:chat_mention
pleroma:emoji_reaction
pleroma:report
reblog
@ -139,7 +138,24 @@ defmodule Pleroma.Notification do
query
|> where([n, a], a.actor not in ^blocked_ap_ids)
|> FollowingRelationship.keep_following_or_not_domain_blocked(user)
|> restrict_domain_blocked(user)
end
defp restrict_domain_blocked(query, user) do
where(
query,
[_, activity],
fragment(
# "(actor's domain NOT in domain_blocks)"
"""
NOT (
substring(? from '.*://([^/]*)') = ANY(?)
)
""",
activity.actor,
^user.domain_blocks
)
)
end
defp exclude_blockers(query, user) do
@ -385,7 +401,7 @@ defmodule Pleroma.Notification do
end
def create_notifications(%Activity{data: %{"type" => type}} = activity, options)
when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag"] do
when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag", "Update"] do
do_create_notifications(activity, options)
end
@ -439,21 +455,15 @@ defmodule Pleroma.Notification do
activity
|> type_from_activity_object()
"Update" ->
"update"
t ->
raise "No notification type for activity type #{t}"
end
end
defp type_from_activity_object(%{data: %{"type" => "Create", "object" => %{}}}), do: "mention"
defp type_from_activity_object(%{data: %{"type" => "Create"}} = activity) do
object = Object.get_by_ap_id(activity.data["object"])
case object && object.data["type"] do
"ChatMessage" -> "pleroma:chat_mention"
_ -> "mention"
end
end
defp type_from_activity_object(%{data: %{"type" => "Create"}}), do: "mention"
# TODO move to sql, too.
def create_notification(%Activity{} = activity, %User{} = user, opts \\ []) do
@ -513,7 +523,16 @@ defmodule Pleroma.Notification do
def get_notified_from_activity(activity, local_only \\ true)
def get_notified_from_activity(%Activity{data: %{"type" => type}} = activity, local_only)
when type in ["Create", "Like", "Announce", "Follow", "Move", "EmojiReact", "Flag"] do
when type in [
"Create",
"Like",
"Announce",
"Follow",
"Move",
"EmojiReact",
"Flag",
"Update"
] do
potential_receiver_ap_ids = get_potential_receiver_ap_ids(activity)
potential_receivers =
@ -553,6 +572,21 @@ defmodule Pleroma.Notification do
(User.all_superusers() |> Enum.map(fn user -> user.ap_id end)) -- [actor]
end
# Update activity: notify all who repeated this
def get_potential_receiver_ap_ids(%{data: %{"type" => "Update", "actor" => actor}} = activity) do
with %Object{data: %{"id" => object_id}} <- Object.normalize(activity, fetch: false) do
repeaters =
Activity.Queries.by_type("Announce")
|> Activity.Queries.by_object_id(object_id)
|> Activity.with_joined_user_actor()
|> where([a, u], u.local)
|> select([a, u], u.ap_id)
|> Repo.all()
repeaters -- [actor]
end
end
def get_potential_receiver_ap_ids(activity) do
[]
|> Utils.maybe_notify_to_recipients(activity)

View file
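
The SQL fragment in restrict_domain_blocked/2 above extracts the actor's host with a regex and checks it against user.domain_blocks; the same extraction expressed in Elixir, for illustration only:

Regex.run(~r|.*://([^/]*)|, "https://blocked.example/users/alice")
#=> ["https://blocked.example", "blocked.example"]
# The notification is kept only when the captured host is NOT in user.domain_blocks.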

@ -145,7 +145,7 @@ defmodule Pleroma.Object do
Logger.debug("Backtrace: #{inspect(Process.info(:erlang.self(), :current_stacktrace))}")
end
def normalize(_, options \\ [fetch: false])
def normalize(_, options \\ [fetch: false, id_only: false])
# If we pass an Activity to Object.normalize(), we can try to use the preloaded object.
# Use this whenever possible, especially when walking graphs in an O(N) loop!
@ -173,10 +173,15 @@ defmodule Pleroma.Object do
def normalize(%{"id" => ap_id}, options), do: normalize(ap_id, options)
def normalize(ap_id, options) when is_binary(ap_id) do
if Keyword.get(options, :fetch) do
Fetcher.fetch_object_from_id!(ap_id, options)
else
get_cached_by_ap_id(ap_id)
cond do
Keyword.get(options, :id_only) ->
ap_id
Keyword.get(options, :fetch) ->
Fetcher.fetch_object_from_id!(ap_id, options)
true ->
get_cached_by_ap_id(ap_id)
end
end
@ -208,10 +213,6 @@ defmodule Pleroma.Object do
end
end
def context_mapping(context) do
Object.change(%Object{}, %{data: %{"id" => context}})
end
def make_tombstone(%Object{data: %{"id" => id, "type" => type}}, deleted \\ DateTime.utc_now()) do
%ObjectTombstone{
id: id,

View file
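
The new :id_only option above lets callers skip both the cache lookup and the remote fetch when only the AP id is needed; for a bare id it short-circuits (sketch, id is illustrative):

Pleroma.Object.normalize("https://example.com/objects/1", id_only: true)
#=> "https://example.com/objects/1"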

@ -26,8 +26,42 @@ defmodule Pleroma.Object.Fetcher do
end
defp maybe_reinject_internal_fields(%{data: %{} = old_data}, new_data) do
has_history? = fn
%{"formerRepresentations" => %{"orderedItems" => list}} when is_list(list) -> true
_ -> false
end
internal_fields = Map.take(old_data, Pleroma.Constants.object_internal_fields())
remote_history_exists? = has_history?.(new_data)
# If the remote history exists, we treat that as the only source of truth.
new_data =
if has_history?.(old_data) and not remote_history_exists? do
Map.put(new_data, "formerRepresentations", old_data["formerRepresentations"])
else
new_data
end
# If the remote does not have history information, we need to manage it ourselves
new_data =
if not remote_history_exists? do
changed? =
Pleroma.Constants.status_updatable_fields()
|> Enum.any?(fn field -> Map.get(old_data, field) != Map.get(new_data, field) end)
%{updated_object: updated_object} =
new_data
|> Object.Updater.maybe_update_history(old_data,
updated: changed?,
use_history_in_new_object?: false
)
updated_object
else
new_data
end
Map.merge(new_data, internal_fields)
end

View file
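
For reference, the "formerRepresentations" history that has_history?/1 above (and Object.Updater.maybe_history/1 below) looks for is an OrderedCollection of previous object versions; an illustrative shape:

%{
  "type" => "Note",
  "content" => "current content",
  "formerRepresentations" => %{
    "type" => "OrderedCollection",
    "totalItems" => 1,
    "orderedItems" => [%{"type" => "Note", "content" => "previous content"}]
  }
}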

@ -0,0 +1,240 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Object.Updater do
require Pleroma.Constants
def update_content_fields(orig_object_data, updated_object) do
Pleroma.Constants.status_updatable_fields()
|> Enum.reduce(
%{data: orig_object_data, updated: false},
fn field, %{data: data, updated: updated} ->
updated =
updated or
(field != "updated" and
Map.get(updated_object, field) != Map.get(orig_object_data, field))
data =
if Map.has_key?(updated_object, field) do
Map.put(data, field, updated_object[field])
else
Map.drop(data, [field])
end
%{data: data, updated: updated}
end
)
end
def maybe_history(object) do
with history <- Map.get(object, "formerRepresentations"),
true <- is_map(history),
"OrderedCollection" <- Map.get(history, "type"),
true <- is_list(Map.get(history, "orderedItems")),
true <- is_integer(Map.get(history, "totalItems")) do
history
else
_ -> nil
end
end
def history_for(object) do
with history when not is_nil(history) <- maybe_history(object) do
history
else
_ -> history_skeleton()
end
end
defp history_skeleton do
%{
"type" => "OrderedCollection",
"totalItems" => 0,
"orderedItems" => []
}
end
def maybe_update_history(
updated_object,
orig_object_data,
opts
) do
updated = opts[:updated]
use_history_in_new_object? = opts[:use_history_in_new_object?]
if not updated do
%{updated_object: updated_object, used_history_in_new_object?: false}
else
# Put edit history
# Note that we may have got the edit history by first fetching the object
{new_history, used_history_in_new_object?} =
with true <- use_history_in_new_object?,
updated_history when not is_nil(updated_history) <- maybe_history(opts[:new_data]) do
{updated_history, true}
else
_ ->
history = history_for(orig_object_data)
latest_history_item =
orig_object_data
|> Map.drop(["id", "formerRepresentations"])
updated_history =
history
|> Map.put("orderedItems", [latest_history_item | history["orderedItems"]])
|> Map.put("totalItems", history["totalItems"] + 1)
{updated_history, false}
end
updated_object =
updated_object
|> Map.put("formerRepresentations", new_history)
%{updated_object: updated_object, used_history_in_new_object?: used_history_in_new_object?}
end
end
defp maybe_update_poll(to_be_updated, updated_object) do
choice_key = fn data ->
if Map.has_key?(data, "anyOf"), do: "anyOf", else: "oneOf"
end
with true <- to_be_updated["type"] == "Question",
key <- choice_key.(updated_object),
true <- key == choice_key.(to_be_updated),
orig_choices <- to_be_updated[key] |> Enum.map(&Map.drop(&1, ["replies"])),
new_choices <- updated_object[key] |> Enum.map(&Map.drop(&1, ["replies"])),
true <- orig_choices == new_choices do
# Choices are the same, but counts are different
to_be_updated
|> Map.put(key, updated_object[key])
else
# Choices (or vote type) have changed, do not allow this
_ -> to_be_updated
end
end
# This calculates the data to be sent as the object of an Update.
# new_data's formerRepresentations is not considered.
# formerRepresentations is added to the returned data.
def make_update_object_data(original_data, new_data, date) do
%{data: updated_data, updated: updated} =
original_data
|> update_content_fields(new_data)
if not updated do
updated_data
else
%{updated_object: updated_data} =
updated_data
|> maybe_update_history(original_data, updated: updated, use_history_in_new_object?: false)
updated_data
|> Map.put("updated", date)
end
end
# This calculates the data of the new Object from an Update.
# new_data's formerRepresentations is considered.
def make_new_object_data_from_update_object(original_data, new_data) do
update_is_reasonable =
with {_, updated} when not is_nil(updated) <- {:cur_updated, new_data["updated"]},
{_, {:ok, updated_time, _}} <- {:cur_updated, DateTime.from_iso8601(updated)},
{_, last_updated} when not is_nil(last_updated) <-
{:last_updated, original_data["updated"] || original_data["published"]},
{_, {:ok, last_updated_time, _}} <-
{:last_updated, DateTime.from_iso8601(last_updated)},
:gt <- DateTime.compare(updated_time, last_updated_time) do
:update_everything
else
# only allow poll updates
{:cur_updated, _} -> :no_content_update
:eq -> :no_content_update
# allow all updates
{:last_updated, _} -> :update_everything
# allow no updates
_ -> false
end
%{
updated_object: updated_data,
used_history_in_new_object?: used_history_in_new_object?,
updated: updated
} =
if update_is_reasonable == :update_everything do
%{data: updated_data, updated: updated} =
original_data
|> update_content_fields(new_data)
updated_data
|> maybe_update_history(original_data,
updated: updated,
use_history_in_new_object?: true,
new_data: new_data
)
|> Map.put(:updated, updated)
else
%{
updated_object: original_data,
used_history_in_new_object?: false,
updated: false
}
end
updated_data =
if update_is_reasonable != false do
updated_data
|> maybe_update_poll(new_data)
else
updated_data
end
%{
updated_data: updated_data,
updated: updated,
used_history_in_new_object?: used_history_in_new_object?
}
end
def for_each_history_item(%{"orderedItems" => items} = history, _object, fun) do
new_items =
Enum.map(items, fun)
|> Enum.reduce_while(
{:ok, []},
fn
{:ok, item}, {:ok, acc} -> {:cont, {:ok, acc ++ [item]}}
e, _acc -> {:halt, e}
end
)
case new_items do
{:ok, items} -> {:ok, Map.put(history, "orderedItems", items)}
e -> e
end
end
def for_each_history_item(history, _, _) do
{:ok, history}
end
def do_with_history(object, fun) do
with history <- object["formerRepresentations"],
object <- Map.drop(object, ["formerRepresentations"]),
{_, {:ok, object}} <- {:main_body, fun.(object)},
{_, {:ok, history}} <- {:history_items, for_each_history_item(history, object, fun)} do
object =
if history do
Map.put(object, "formerRepresentations", history)
else
object
end
{:ok, object}
else
{:main_body, e} -> e
{:history_items, e} -> e
end
end
end

View file
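
From the Updater code above, history_for/1 falls back to an empty OrderedCollection skeleton whenever the object carries no usable history, which maybe_update_history/3 then appends to; a quick check (object map is illustrative):

Pleroma.Object.Updater.history_for(%{"type" => "Note", "content" => "hi"})
#=> %{"type" => "OrderedCollection", "totalItems" => 0, "orderedItems" => []}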

@ -25,7 +25,7 @@ defmodule Pleroma.ReleaseTasks do
module = Module.split(module)
match?(["Mix", "Tasks", "Pleroma" | _], module) and
String.downcase(List.last(module)) == task
task_match?(module, task)
end)
if module do
@ -35,6 +35,13 @@ defmodule Pleroma.ReleaseTasks do
end
end
defp task_match?(["Mix", "Tasks", "Pleroma" | module_path], task) do
module_path
|> Enum.join(".")
|> String.downcase()
|> String.equivalent?(String.downcase(task))
end
def migrate(args) do
Mix.Tasks.Pleroma.Ecto.Migrate.run(args)
end

View file
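
The refactored matching above compares the full dotted module path after Mix.Tasks.Pleroma with the task name, case-insensitively, instead of only the last segment; e.g. (illustrative):

task_match?(["Mix", "Tasks", "Pleroma", "Ecto", "Migrate"], "ecto.migrate")
#=> true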

@ -11,8 +11,6 @@ defmodule Pleroma.Repo do
import Ecto.Query
require Logger
defmodule Instrumenter, do: use(Prometheus.EctoInstrumenter)
@doc """
Dynamically loads the repository url from the
DATABASE_URL environment variable.

Some files were not shown because too many files have changed in this diff.