Compare commits

...

100 commits

Author SHA1 Message Date
74182abb5b bump version
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-03-11 20:48:27 +00:00
0a9cf8fa8b Merge pull request 'Test lowest and highest language versions, elixir 1.18 support' (#875) from ci-testing-all-versions into develop
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline failed
ci/woodpecker/push/build-arm64 Pipeline failed
ci/woodpecker/push/docs unknown status
Reviewed-on: #875
2025-03-11 20:47:54 +00:00
066d5b48ed Fix Content-Type sanitisation for emoji and local uploads
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline failed
ci/woodpecker/push/build-amd64 Pipeline failed
ci/woodpecker/push/docs unknown status
This was accidentally broken in c8e0f7848b
due to a one-letter mistake in the plug option name and an absence of
tests. Therefore it was once again possible to serve e.g. JavaScript or
CSS payloads via uploads and emoji.
However, due to other protections it was still NOT possible for anyone to
serve any payload with an ActivityPub Content-Type. With the CSP policy
hardening from previous JS payload exploits predating the Content-Type
sanitisation, there is currently no known way of abusing this weakened
Content-Type sanitisation, but it should be fixed regardless.

This commit fixes the option name and adds tests to ensure
such a regression doesn't occur again in the future.

Reported-by: Lain Soykaf <lain@lain.com>
2025-03-10 19:45:26 +01:00
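For illustration, a minimal sketch of why a one-letter typo in a plug option name fails silently (module and option names here are hypothetical, not Akkoma's actual plug): plug options are plain keyword lists, so an unrecognised key is simply ignored and the default applies.

defmodule MyApp.SanitiseUploadsPlug do
  import Plug.Conn

  def init(opts), do: opts

  def call(conn, opts) do
    # A caller writing e.g. `sanitize:` instead of `sanitise:` still compiles,
    # but Keyword.get/3 then silently falls back to the default list.
    safe_types = Keyword.get(opts, :sanitise, ["application/octet-stream"])

    case get_resp_header(conn, "content-type") do
      [ct | _] ->
        if Enum.any?(safe_types, &String.starts_with?(ct, &1)) do
          conn
        else
          # Force a harmless type for anything not explicitly allowed.
          put_resp_content_type(conn, "application/octet-stream")
        end

      [] ->
        conn
    end
  end
end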
4a05b2d643 we do actually want to start oban-met...
All checks were successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/lint Pipeline was successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/pr/test/2 Pipeline was successful
ci/woodpecker/pr/test/1 Pipeline was successful
ci/woodpecker/pr/build-arm64 Pipeline was successful
ci/woodpecker/pr/build-amd64 Pipeline was successful
ci/woodpecker/pr/docs Pipeline was successful
ci/woodpecker/pull_request_closed/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/lint Pipeline was successful
ci/woodpecker/pull_request_closed/test/1 Pipeline was successful
ci/woodpecker/pull_request_closed/test/2 Pipeline was successful
ci/woodpecker/pull_request_closed/build-arm64 Pipeline was successful
ci/woodpecker/pull_request_closed/docs Pipeline was successful
2025-03-02 13:36:52 +00:00
93200a8073 use latest ASDF instructions
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/test/2 unknown status
ci/woodpecker/pr/build-arm64 unknown status
ci/woodpecker/pr/lint Pipeline failed
ci/woodpecker/pr/test/1 unknown status
ci/woodpecker/pr/build-amd64 unknown status
ci/woodpecker/pr/docs unknown status
2025-03-02 13:36:14 +00:00
41a4ed1db5 specify correct version
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/test/1 unknown status
ci/woodpecker/pr/test/2 unknown status
ci/woodpecker/pr/build-arm64 unknown status
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/pr/lint Pipeline failed
ci/woodpecker/pr/build-amd64 unknown status
ci/woodpecker/pr/docs unknown status
2025-03-02 13:17:52 +00:00
8e789c6236 1.14.1 min version
Some checks are pending
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test/1 Pipeline is pending approval
ci/woodpecker/pr/test/2 Pipeline is pending approval
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
2025-03-02 13:07:03 +00:00
184c62359f drop back to 1.14/OTP25
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/build-amd64 unknown status
ci/woodpecker/pr/docs unknown status
ci/woodpecker/pr/lint Pipeline failed
ci/woodpecker/pr/test/1 unknown status
ci/woodpecker/pr/test/2 unknown status
ci/woodpecker/pr/build-arm64 unknown status
2025-03-02 13:04:10 +00:00
829af03042 we don't support otp24, bump to 25
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/lint Pipeline was successful
ci/woodpecker/pr/test/2 Pipeline was successful
ci/woodpecker/pr/test/1 Pipeline was successful
ci/woodpecker/pr/build-arm64 Pipeline was successful
ci/woodpecker/pr/build-amd64 Pipeline was successful
ci/woodpecker/pr/docs Pipeline was successful
2025-03-02 12:19:14 +00:00
842414b927 run the lint task on the latest version
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/lint Pipeline was successful
ci/woodpecker/pr/test/1 Pipeline failed
ci/woodpecker/pr/test/2 Pipeline was successful
ci/woodpecker/pr/build-arm64 unknown status
ci/woodpecker/pr/build-amd64 unknown status
ci/woodpecker/pr/docs unknown status
2025-03-02 11:56:15 +00:00
f176294d6d elixir 1.18 formatting
Some checks failed
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/pr/lint Pipeline failed
ci/woodpecker/pr/test/1 unknown status
ci/woodpecker/pr/test/2 unknown status
ci/woodpecker/pr/build-amd64 unknown status
ci/woodpecker/pr/docs unknown status
ci/woodpecker/pr/build-arm64 unknown status
2025-03-02 11:54:00 +00:00
b1c0b9e01a test lowest and highest supported versions on PR
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test/1 Pipeline was successful
ci/woodpecker/push/test/2 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-03-02 11:49:41 +00:00
fc2c740008 dependency upgrade
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-03-02 11:34:09 +00:00
9da2cb881e upgrade oban migrations to v12
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/docs unknown status
ci/woodpecker/push/build-arm64 Pipeline failed
ci/woodpecker/push/build-amd64 Pipeline failed
2025-03-02 11:32:40 +00:00
522a168af6 force signatures for pinned posts
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-03-01 17:27:45 +00:00
59ea358e52 bump version
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-03-01 16:36:04 +00:00
d62808e4b6 move /outbox to signed pipeline 2025-03-01 16:28:12 +00:00
7ccc560e4d prepare 2025.03 release
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-03-01 12:19:43 +00:00
a47b02cb69 Merge remote-tracking branch 'oneric-sec/sec-2024-12' into develop 2025-03-01 12:13:17 +00:00
6222936673 use akk.dev mfm parser
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/docs unknown status
ci/woodpecker/push/build-amd64 Pipeline failed
2025-03-01 12:10:23 +00:00
d65cd1b141 Merge pull request 'Add oban web dashboard' (#871) from oban_web into develop
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
Reviewed-on: #871
2025-02-27 12:07:36 +00:00
d7dd34f263 Merge pull request 'Use FEP-c16b: Formatting MFM functions' (#823) from ilja/akkoma:use_fep-c16b_formatting_mfm_functions into develop
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline failed
ci/woodpecker/push/build-amd64 Pipeline failed
ci/woodpecker/push/docs unknown status
Reviewed-on: #823
2025-02-27 12:03:22 +00:00
c2f60c9228 add a snapshot test for api prefixes
Some checks are pending
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline is running
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/lint Pipeline was successful
ci/woodpecker/pr/test Pipeline was successful
ci/woodpecker/pr/build-arm64 Pipeline was successful
ci/woodpecker/pr/build-amd64 Pipeline was successful
ci/woodpecker/pr/docs Pipeline was successful
ci/woodpecker/pull_request_closed/lint Pipeline was successful
ci/woodpecker/pull_request_closed/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/docs Pipeline was successful
ci/woodpecker/pull_request_closed/test Pipeline was successful
ci/woodpecker/pull_request_closed/build-arm64 Pipeline was successful
2025-02-23 16:51:48 +00:00
13d650602b update deps
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-02-23 16:32:55 +00:00
a49f04bb4e Merge branch 'develop' into oban_web 2025-02-23 16:16:48 +00:00
da7998e89e put oban route under a known prefix 2025-02-23 16:16:17 +00:00
ilja
dce07f05d9 Merge branch 'develop' of https://akkoma.dev/AkkomaGang/akkoma into use_fep-c16b_formatting_mfm_functions
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
ci/woodpecker/pull_request_closed/lint Pipeline was successful
ci/woodpecker/pull_request_closed/test Pipeline was successful
ci/woodpecker/pull_request_closed/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/build-arm64 Pipeline was successful
ci/woodpecker/pull_request_closed/docs Pipeline was successful
2025-02-23 10:13:44 +01:00
7c23793e55 changelog: add entries for preceding commits 2025-02-21 19:37:27 +01:00
8243fc0ef4 federation: strip internal fields from incoming updates and history
When note editing support was added, it was omitted to strip internal
fields from edited notes and their history.

This was uncovered due to Mastodon inlining the like count as a "likes"
collection conflicting with our internal "likes" list causing validation
failures. In a spot check with likes/like_count it was not possible to
inject those internal fields into the local db via Update, but this
was not extensively tested for all fields and avenues.

Similarly address normalisation did not normalise addressing in the
object history, although this was never at risk of being exploitable.

The revision history of the Pleroma MR adding edit support reveals
recursive stripping was intentionally avoided, since it will end up
removing e.g. emoji from outgoing activities. This appears to still
be true. However, all current internal fields ("pleroma_interal"
appears to be unused) contain data already publicised otherwise anyway.
In the interest of quickly fixing a federation bug (and, at worst,
potential data injection), outgoing stripping is left non-recursive for now.

Of course the ultimate fix here is to not mix remote and internal data
into the same map in the first place, but unfortunately having a single
map of all truth is a core assumption of *oma's AP doc processing.
Changing this is a massive undertaking and not suitable for providing
a short-term fix.
2025-02-21 19:37:27 +01:00
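A minimal sketch of the stripping described above (the field list and module layout are assumed for illustration):

defmodule StripSketch do
  # Assumed internal fields; the real list in *oma differs in detail.
  @internal_fields ~w(likes like_count announcements announcement_count pleroma_internal)

  # Deliberately non-recursive: recursing would also strip e.g. emoji data
  # from outgoing activities, as noted above.
  def strip_internal_fields(%{} = object), do: Map.drop(object, @internal_fields)

  # Edited objects carry their revision history inline;
  # each revision needs the same treatment.
  def strip_history(%{"formerRepresentations" => %{"orderedItems" => items} = fr} = object) do
    stripped = Enum.map(items, &strip_internal_fields/1)
    Map.put(object, "formerRepresentations", %{fr | "orderedItems" => stripped})
  end

  def strip_history(object), do: object
end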
11ad4711eb signing_key: don't retrieve superfluous fields when loading ap_id 2025-02-21 19:37:27 +01:00
d8e40173bf http_signatures: tweak order of route aliases
We expect most requests to be made for the actual canonical ID,
so check this one first (starting without query headers matching the
predominant albeit spec-breaking version).

Also avoid unnecessary rewrites of the digest header on each route
alias by just setting it once before iterating through aliases.
2025-02-21 19:37:27 +01:00
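As a sketch of both tweaks (variable and helper names assumed; HTTPSignatures.validate_conn/1 is the verification entry point of the http_signatures lib):

# Set the digest header once instead of rewriting it per alias ...
digest = "SHA-256=" <> Base.encode64(:crypto.hash(:sha256, body))
conn = Plug.Conn.put_req_header(conn, "digest", digest)

# ... then try the aliases most likely to match first, so the common case
# (a request for the canonical ID) needs only one verification attempt.
Enum.any?(likely_aliases ++ rare_aliases, fn target ->
  HTTPSignatures.validate_conn(%{conn | request_path: target})
end)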
9cc5fe9a5f signature: refetch key upon verification failure
This matches behaviour prior to the SigningKey migration
and the expected semantics of the http_signatures lib.
Additionally add a min interval parameter, to avoid
refetch floods from bugs causing incompatible signatures
(like e.g. currently with Bridgy).
2025-02-21 19:37:27 +01:00
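A sketch of the rate limit (surrounding names assumed; the `min_key_refetch_interval` option itself appears in the config diff further below):

def maybe_refetch_key(%SigningKey{updated_at: updated_at} = key) do
  min_interval = Pleroma.Config.get([:activitypub, :min_key_refetch_interval], 86_400)

  # Only refetch if the stored key is older than the minimum interval,
  # so broken remote signatures can't trigger refetch floods.
  if NaiveDateTime.diff(NaiveDateTime.utc_now(), updated_at) >= min_interval do
    refetch_key(key)
  else
    {:error, :refetch_too_soon}
  end
end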
355263858c Merge pull request 'Expose Port IO stats via Prometheus' (#869) from Oneric/akkoma:io-telemetry into develop
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
Reviewed-on: #869
2025-02-21 15:28:09 +00:00
a7b4e4bfd9 signature: distinguish error sources and log fetch issues 2025-02-14 22:10:25 +01:00
51642a90c5 signature: drop unnecessary round trip over user
We already got the key.
2025-02-14 22:10:25 +01:00
bc79bd0edf cosmetic/test/user: replace deprecated clear_config syntax 2025-02-14 22:10:25 +01:00
ee61ce61a7 changelog: summarise preceding changes 2025-02-14 22:10:25 +01:00
8a0d130976 Add tests for SigningKey module 2025-02-14 22:10:25 +01:00
898b98e5dd db: drop legacy key fields in users table 2025-02-14 22:10:25 +01:00
ea2de1f28a signing_key: ensure only one key per user exists
Fixes: AkkomaGang/akkoma issue 858
2025-02-14 22:10:25 +01:00
2a4587f201 Fix SigningKey db schema 2025-02-14 22:10:25 +01:00
3460f41776 Fix user updates
User updates broke with the migration to separate signing keys
since user data carries signing keys but we didn't allow the
association data to be updated.
2025-02-14 22:10:25 +01:00
cc5c1bb10c signing_key: cleanup code
In particular this avoids an unnecessary round trip
over user_id when searching a key via its primary key_id
2025-02-14 22:10:25 +01:00
70fe99d196 Prevent key-actor mapping poisoning and key take overs
Previously there were mainly two attack vectors:
 - for raw keys the owner <-> key mapping wasn't verified at all
 - keys were retrieved with refetching allowed
   and only the top-level ID was sanitised while
   usually keys are but a subobject

This reintroduces public key checks in the user actor,
previously removed in 9728e2f8f7
but now adapted to account for the new mapping mechanism.
2025-02-14 22:10:25 +01:00
366065c0f6 fetcher: split out core object fetch validation
To allow reuse for adapted key validation logic
2025-02-14 22:10:25 +01:00
b5fa8c6d09 readme: drop mention of YunoHost package
It’s no longer listed in the catalogue and
the git repo hasn't been updated in over a year
2025-02-14 22:10:25 +01:00
d68a5f6c56 Protected against counterfeit local docs being posted
This is only possible if actor keys leaked first,
thus log at alert level
2025-02-14 22:10:25 +01:00
4231345f4e cosmetic/emoji/pack: fix spelling
There might be further debate about "emoji" vs "emojis" for the plural
but a grep shows the latter is already widely used in our codebase.
2025-02-14 22:10:25 +01:00
96fe080e6e Convert all raw :zip usage to SafeZip
Notably at least two instances were not properly guarded from path
traversal attacks before and are only now fixed by using SafeZip:

 - frontend installation never checked for malicious paths.
   But given a malicious frontend could already, e.g., steal
   all user tokens even without this, in the real world
   admins should only use frontends from trusted sources
   and the practical implications are minimal

 - the emoji pack update/upload API taking a ZIP file
   did not protect against path traversal. While atm
   only admins can use these emoji endpoints, emoji
   packs are typically considered "harmless" and used
   without prior verification from various sources.
   Thus this appears more concerning.
2025-02-14 22:10:25 +01:00
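The core of the protection, as used by the SafeZip module added below, is Elixir's Path.safe_relative/1, which rejects absolute paths and any ".." traversal before an entry is extracted:

Path.safe_relative("emoji/blob.png")   # => {:ok, "emoji/blob.png"}
Path.safe_relative("../../etc/passwd") # => :error
Path.safe_relative("/etc/passwd")      # => :error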
7151ef4718 Add SafeZip module
This will replace all the slightly different safety workarounds at
different ZIP handling sites and ensure safety is actually consistently
enforced everywhere while also making code cleaner and easier to
follow.
2025-02-14 22:10:25 +01:00
c8e0f7848b Migrate back to upstream Plug.Static
Commit a924e117fd bumped the
plug package to 1.16.0 which includes our upstream patch.

Resolves: #734
2025-02-14 22:10:25 +01:00
98c7e9534e Drop obsolete APNG mime override
Commit 9d2c558f64 bumped
to a mime package version including the upstream fix.
2025-02-14 22:10:25 +01:00
1c2eb4d799 cosmetic/object: drop is_ prefix from is_tombstone_object?
The question mark suffix already implies that it is an indicator function
2025-02-14 22:10:25 +01:00
7998a00346 cosmetic/rich_media/parser: fix typo 2025-02-14 22:10:25 +01:00
4c41f8c286 Merge pull request 'Improve stat queries and ReceiverWorker logic' (#862) from Oneric/akkoma:perf_tweaks_stats+jobs into develop
All checks were successful
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
Reviewed-on: #862
2025-02-14 19:22:35 +00:00
f0a99b4595 article_note_validator: fix handling of Mastodon-style replies collections
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
ci/woodpecker/pull_request_closed/lint Pipeline was successful
ci/woodpecker/pull_request_closed/test Pipeline was successful
ci/woodpecker/pull_request_closed/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/build-arm64 Pipeline was successful
ci/woodpecker/pull_request_closed/docs Pipeline was successful
The first collection page is (sometimes?) inlined,
which caused crashes when attempting to log the fetch failure.
But there’s no need to fetch it, and we can treat it like the other inline collections.
2025-02-14 18:49:51 +01:00
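A sketch of the fix (clause shapes assumed): accept an inlined first page as-is and only fetch when given a bare ID, which is exactly what the collection fetcher in the diff below supports:

# An inlined page is already a map - no fetch needed;
# only a bare id string still requires fetching.
defp fetch_replies(%{"first" => %{} = page}), do: Akkoma.Collections.Fetcher.fetch_collection(page)
defp fetch_replies(%{"first" => id}) when is_binary(id), do: Akkoma.Collections.Fetcher.fetch_collection(id)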
a1c841a122 federation.md: list FEP-dc88 formatting mathematics
Implemented by #642
2025-02-14 18:49:51 +01:00
1b09b9fc22 static_fe: fix HTML quotation for upload alt text
Reported by riley on IRC
2025-02-14 18:49:51 +01:00
46148c0825 Don't return garbage on failed collection fetches
And for now treat partial fetches as a success, since for all
current users partial collection data is better than no data at all.

If an error occurred while fetching a page, this previously
returned a bogus {:ok, {:error, _}} success, causing the error
to be attached to the object as a reply list, subsequently
leading to the whole post getting rejected during validation.

Also the pinned collection caller did not actually handle
the preexisting error case resulting in process crashes.
2025-02-14 18:49:51 +01:00
4701aa2a38 receiver_worker: log process crashes
Oban catches crashes to handle job failure and retry,
thus it never bubbles up all the way and nothing is logged by default.
For better debugging, catch and log any crashes.
2025-02-14 18:46:19 +01:00
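A sketch of the wrapper (structure assumed, and assuming `require Logger` in the module):

def perform(%Oban.Job{} = job) do
  do_perform(job)
rescue
  e ->
    # Log before re-raising so the crash is visible even though Oban
    # catches it for its own retry accounting.
    Logger.error("Receiver worker crashed: " <> Exception.format(:error, e, __STACKTRACE__))
    reraise e, __STACKTRACE__
end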
fb3de8045a Expose Port IO stats via Prometheus
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
ci/woodpecker/pull_request_closed/lint Pipeline was successful
ci/woodpecker/pull_request_closed/test Pipeline was successful
ci/woodpecker/pull_request_closed/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/build-arm64 Pipeline was successful
ci/woodpecker/pull_request_closed/docs Pipeline was successful
2025-01-27 20:00:30 +01:00
b2b63ad89f add oban web
Some checks failed
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/pr/lint Pipeline failed
ci/woodpecker/pr/test unknown status
ci/woodpecker/pr/build-amd64 unknown status
ci/woodpecker/pr/build-arm64 unknown status
ci/woodpecker/pr/docs unknown status
2025-01-17 09:38:09 +00:00
ilja
d56165c71e Merge branch 'develop' of https://akkoma.dev/AkkomaGang/akkoma into use_fep-c16b_formatting_mfm_functions
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
2025-01-12 07:59:40 +01:00
8fa51700d4 changelog: summarise preceding perf tweaks 2025-01-07 20:27:28 +01:00
2ddff7e386 transmogrifier: gracefully ignore Delete of unknown objects
It's quite common to receive spurious Deletes,
so we neither want to waste resources on retrying
nor spam "invalid AP" logs
2025-01-07 20:27:28 +01:00
cd8e6a4235 transmogrifier: gracefully ignore duplicated object deletes
The object lookup is later repeated in the validator, but due to
caching shouldn't incur any noticeable performance impact.
It’s actually preferable to check here, since it avoids the otherwise
occurring user lookup and the overhead of starting and aborting a
transaction
2025-01-07 20:27:28 +01:00
ac2327c8fc transmogrifier: be more selective about Delete retry
If something else renders the Delete invalid,
there’s no point in retrying anyway
2025-01-07 20:27:28 +01:00
92bf93a4f7 transmogrifier: avoid crashes on non-validation Delete errors
Happens e.g. for duplicated Deletes.
The remaining tombstone object no longer has an actor,
leading to an error response during side-effect handling.
2025-01-07 20:27:28 +01:00
7ad5f8d3c0 object_validators: only query relevant table for object
Most of them actually only accept either activities or a
non-activity object later; querying both is then a waste
of resources and may create false positives.
2025-01-07 20:27:28 +01:00
b0387dee14 Gracefully ignore Undo activities referring to unknown objects 2025-01-07 20:27:28 +01:00
caa4fbe326 user: avoid database work on superfluous pin
The only thing this does is changing the updated_at field of the user.
Afaict this came to be because, prior to pins federating, this was split
into two functions, one of which created a changeset and the other applied
a given changeset. When this was merged, the bits were just copied into
place.
2025-01-07 20:27:28 +01:00
09736431e0 Don't spam logs about deleted users
User.get_or_fetch_by_(apid|nickname) are the only external users of fetch_and_prepare_user_from_ap_id,
thus there’s no point in duplicating logging, especially not at error level.
Currently (duplicated) _not_found errors for users make up the bulk of my logs
and are created almost every second. Deleted users are a common occurrence and not
worth logging outside of debug.
2025-01-07 20:27:28 +01:00
bcf3e101f6 rich_media: lower log level of update 2025-01-07 20:27:28 +01:00
05bbdbf388 nodeinfo: lower log level of regular actions to debug 2025-01-07 20:27:28 +01:00
2c75600532 federation/incoming: improve link_resolve retry decision
To facilitate this, ObjectValidator.fetch_actor_and_object is adapted to
return an informative error. Otherwise we’d be unable to make an
informed decision on retrying or not later. There’s no point in
retrying to fetch MRF-blocked stuff or private posts for example.
2025-01-07 20:27:28 +01:00
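A sketch of what the informative error enables (error atoms assumed): the caller can now discard jobs that can never succeed instead of retrying them:

case ObjectValidator.fetch_actor_and_object(data) do
  :ok ->
    process(data)

  # Retrying can't help for e.g. MRF-blocked or private objects.
  {:error, reason} when reason in [:mrf_reject, :forbidden, :not_found] ->
    {:cancel, reason}

  {:error, _} = e ->
    e
end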
0cd4040db6 Error out earlier on missing mandatory reference
This is the only user of fetch_actor_and_object, which previously just
always pretended to be successful. For all the activity types handled
here, we absolutely need the referenced object to be able to process it
(other than Announce; whether or not processing those activity types for
unknown remote objects is desirable in the first place is up for debate).

All other users of the similar fetch_actor already properly check success.

Note, this currently lumps all resolve failure reasons together,
so even e.g. boosts of MRF rejected posts will still exhaust all
retries. The following commit improves on this.
2025-01-07 20:27:28 +01:00
0ba5c3649d federator: don't nest {:error, _} tuples
It makes decisions based on error sources harder since all possible
nesting levels need to be checked for. As shown by the return values
handled in the receiver worker, something else still nests those,
but this is a first start.
2025-01-07 20:27:28 +01:00
8e5defe6ca stats: estimate remote user count
This value is currently only used by Prometheus metrics
but is (after optimising the peer query in the preceding commit)
the most costly part of instance stats.
2025-01-07 20:27:28 +01:00
138b1aea2f stats: use cheaper peers query
This query is one of the top cost offenders during an instance's
lifetime. For small instances it was shown to take up 30-50 percent of
the total database query time, while for bigger instances it still held
a spot in the top 3 — almost as or even more expensive overall than
timeline queries!

The good news is, there’s a cheaper way using the instance table:
no need to process each entry, no need to filter NULLs
and no need to dedupe. EXPLAIN estimates the cost of the
old query as 13272.39 and the cost of the new query as 395.74
for me; i.e. a 33-fold reduction.

Results can slightly differ. E.g. we might have an old user
predating the instance table's existence with no interaction since,
or no instance table entry due to failure to query nodeinfo.
Conversely, we might have an instance entry but all known users got
deleted since.
However, this seems unproblematic in practice
and well worth the perf improvement.

Given the previous query didn’t exclude unreachable instances,
neither does the new query.
2025-01-07 20:27:28 +01:00
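A sketch of the before/after queries in Ecto (field names assumed):

import Ecto.Query

# Old: derive each peer host from every remote user's ap_id -
# a scan over the whole users table with per-row string processing.
from(u in Pleroma.User,
  where: u.local == false and not is_nil(u.ap_id),
  distinct: true,
  select: fragment("split_part(?, '/', 3)", u.ap_id)
)

# New: the instances table already holds exactly one row per known host.
from(i in Pleroma.Instances.Instance, select: i.host)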
8b5183cb74 stats: fix stat spec 2025-01-07 20:27:28 +01:00
cbb0d4b0a8 receiver_worker: log unexpected errors
This can't handle process crash errors,
but I hope those get a stacktrace logged by default.
2025-01-07 20:27:28 +01:00
be2c857845 receiver_worker: don't reattempt invalid documents
Ideally we’d like to split this up more and count most invalid documents
as an error, but silently drop e.g. Deletes for unknown objects.
However, this is hard to extract from the changeset and jobs canceled
with :discard don’t count as exceptions, and I’m not aware of an idiomatic
way to cancel further retries while retaining the exception status.

Thus at least keep a log, but since superfluous "Delete"s
seem kinda frequent, don't log at error, only info level.
2025-01-07 20:27:28 +01:00
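A sketch of the worker-side decision (return shapes assumed):

case Pleroma.Web.Federator.incoming_ap_doc(params) do
  {:error, {:validate, reason}} ->
    # Superfluous "Delete"s are frequent, so log at info rather than error
    # level, and cancel instead of erroring so Oban won't reattempt.
    Logger.info("Received invalid AP document: #{inspect(reason)}")
    {:cancel, :invalid_document}

  other ->
    other
end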
9f4d3a936f cosmetic/receiver_worker: reformat error cases
The next commit adds a multi-statement case
and then mix format will enforce this anyway
2025-01-07 20:27:28 +01:00
f9724b5879 Don’t reattempt insertion of already known objects
Might happen if we receive e.g. a Like before the Note arrives
in our inbox and we thus already queried the Note ourselves.
2025-01-07 20:27:27 +01:00
041dedb86e Don't reattempt RichMediaBackfill by default
Retrying seems unlikely to be helpful:
 - if it timed out, chances are the short delay before reattempting
   won't give the remote enough time to recover from its outage and
   a longer delay makes the job pointless as users likely scrolled
   further already. (Likely this is already the case after the
   first 20s timeout)
 - if the remote data is so borked we couldn't even parse it far enough
   for an "invalid metadata" error, chances are it will remain borked
   upon reattempt
2025-01-07 20:27:27 +01:00
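The matching config change appears in the diff further below; assuming, as in *oma's `:workers` settings, that the number is the worker's maximum attempt count, a single attempt means no retries:

config :pleroma, :workers,
  retries: [
    federator_incoming: 5,
    federator_outgoing: 5,
    search_indexing: 2,
    # one attempt only: per the reasoning above, retries are pointless here
    rich_media_backfill: 1
  ]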
280652651c rich_media: don't reattempt parsing on rejected URLs 2025-01-07 20:27:27 +01:00
92544e8f99 Don't enqueue a plethora of unnecessary NodeInfoFetcher jobs
There were two issues leading to needless effort:
Most importantly, the use of AP IDs as "source_url" meant multiple
simultaneous jobs got scheduled for the same instance even with the
default unique settings.
Also, jobs were scheduled unconditionally for each processed AP object,
meaning we incurred overhead from managing Oban jobs even if we knew it
wasn't necessary. By comparison, the single query to check if an update
is needed should be cheaper overall.
2025-01-07 20:27:27 +01:00
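A sketch of the two fixes (worker invocation assumed; `Instance.needs_update/1` for a %URI{} is added in the diff further below):

uri = URI.parse(actor_ap_id)

# Gate enqueueing on a single cheap staleness check, and key the job on the
# host alone so Oban's uniqueness settings can actually deduplicate.
if Pleroma.Instances.Instance.needs_update(uri) do
  %{"source_url" => "#{uri.scheme}://#{uri.host}"}
  |> NodeInfoFetcherWorker.new(unique: [period: :infinity])
  |> Oban.insert()
end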
d283ac52c3 Don't create noop SearchIndexingWorker jobs for passive index 2025-01-07 20:27:27 +01:00
ed4019e7a3 workers: make custom filtering ahead of enqueue possible 2025-01-07 20:27:27 +01:00
25d24cc5f6 validators/add_remove: don't crash on failure to resolve reference
It allows for informed error handling and retry/discard job
decisions later on, which a future commit will add.
2025-01-07 20:27:27 +01:00
ead44c6671 federator: don't fetch the user for no reason
The return value is never used here; later stages which actually need it
fetch the user themselves and it doesn't matter whether we wait for the
fetch here or later (if needed at all).

Even more, this early fetch always fails if the user was already deleted
or never known to begin with, but we get something referencing it; e.g.
the very Delete action carrying out the user deletion.
This prevents processing of the Delete, but before that it will be
reattempted several times, each time attempting to fetch the
non-existing profile, wasting resources.
2025-01-07 20:27:27 +01:00
4859f38624 add_remove_validator: limit refetch rate to 1 per 5s
This matches the maximum_age used when processing Move activities
2025-01-07 20:27:27 +01:00
0f4a7a185f Drop ap_enabled indicator from atom feeds 2025-01-07 20:27:27 +01:00
Haelwenn (lanodan) Monnier
c17681ae1e Purge obsolete ap_enabled indicator
It was used to migrate OStatus connections to ActivityPub if possible,
but support for OStatus was long since dropped, all new actors are always AP,
and if anything wasn't migrated before, their instance is already marked
as unreachable anyway.

The associated logic was also buggy in several ways, and deleted users
got set to ap_enabled=false, also causing some issues.

This patch is a pretty direct port of the original Pleroma MR;
follow-up commits will further fix and clean up remaining issues.
Changes made (other than trivial merge conflict resolutions):
  - converted CHANGELOG format
  - adapted migration id for Akkoma’s timeline
  - removed ap_enabled from additional tests

Ported-from: https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3880
2025-01-07 20:27:26 +01:00
ad92e504d7 Update mix.exs
Some checks are pending
ci/woodpecker/push/lint Pipeline was successful
ci/woodpecker/push/test Pipeline was successful
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pull_request_closed/test Pipeline was successful
ci/woodpecker/pull_request_closed/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/lint Pipeline was successful
ci/woodpecker/pull_request_closed/build-arm64 Pipeline was successful
ci/woodpecker/push/build-arm64 Pipeline was successful
ci/woodpecker/push/build-amd64 Pipeline was successful
ci/woodpecker/pull_request_closed/docs Pipeline was successful
ci/woodpecker/push/docs Pipeline was successful
2025-01-06 12:01:53 +00:00
ilja
2624258cfd Add "FEP-c16b: Formatting MFM functions" to CHANGELOG.md
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
2024-12-02 12:35:48 +01:00
ilja
f646e78b48 Add "FEP-c16b: Formatting MFM functions" to FEDERATION.md
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending approval
ci/woodpecker/pr/build-arm64 Pipeline is pending approval
ci/woodpecker/pr/docs Pipeline is pending approval
ci/woodpecker/pr/lint Pipeline is pending approval
ci/woodpecker/pr/test Pipeline is pending approval
2024-12-01 11:51:34 +01:00
ilja
d5e9f6be47 Merge branch 'develop' of https://akkoma.dev/AkkomaGang/akkoma into use_fep-c16b_formatting_mfm_functions 2024-12-01 11:16:59 +01:00
ilja
90adb3cff5 Fix tests
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending
ci/woodpecker/pr/build-arm64 Pipeline is pending
ci/woodpecker/pr/docs Pipeline is pending
ci/woodpecker/pr/lint Pipeline is pending
ci/woodpecker/pr/test Pipeline is pending
There was one test that used MFM and now failed due to the new representation. This is now adapted so it doesn't fail any more.

There was another test failing, but I don't see how this could have been affected by the MFM changes...
But I did pull in newer dependencies, so I thought maybe a newer Earmark dependency was now failing, and indeed it was.
By explicitly asking for 1.4.46 (according to mix.lock, the version it was before), it now works again.

This is what was failing. It seems that Earmark 1.4.47 removed everything before the comment, which it should not do.

  1) test format_input/3 with markdown raw HTML (Pleroma.Web.CommonAPI.UtilsTest)
     test/pleroma/web/common_api/utils_test.exs:213
     Assertion with == failed
     code:  assert result == ~s"<a href=\"http://example.org/\">OwO</a>"
     left:  ""
     right: "<a href=\"http://example.org/\">OwO</a>"
     stacktrace:
       test/pleroma/web/common_api/utils_test.exs:216: (test)
2024-08-11 14:59:10 +02:00
ilja
f6422cb370 Use FEP-c16b: formatting mfm functions
Some checks are pending
ci/woodpecker/pr/build-amd64 Pipeline is pending
ci/woodpecker/pr/build-arm64 Pipeline is pending
ci/woodpecker/pr/docs Pipeline is pending
ci/woodpecker/pr/lint Pipeline is pending
ci/woodpecker/pr/test Pipeline is pending
See <#381>

We can't use the HTML content as-is.
[FEP-c16b](https://codeberg.org/fediverse/fep/src/branch/main/fep/c16b/fep-c16b.md) was
written to have a "scrubber friendly" way of representing MFM functions in HTML. Here
we add support in the backend for the functions Foundkey supports. The front-ends then
need to add support to make sure the HTML representation is turned into a correct view
(i.e. with the help of CSS and JavaScript).

FEP-c16b also has a discovery mechanism to indicate to receiving servers that they can
use the `content` directly. This is not implemented in this commit.
2024-08-10 20:21:05 +02:00
103 changed files with 2144 additions and 1133 deletions

View file

@ -1,6 +1,5 @@
[
import_deps: [:ecto, :ecto_sql, :phoenix],
subdirectories: ["priv/*/migrations"],
import_deps: [:mneme, :ecto, :ecto_sql, :phoenix],
plugins: [Phoenix.LiveView.HTMLFormatter],
inputs: [
"mix.exs",

View file

@ -40,7 +40,7 @@ variables:
steps:
lint:
image: akkoma/ci-base:1.16-otp26
image: akkoma/ci-base:1.18-otp27
<<: *on-pr-open
environment:
MIX_ENV: test

View file

@ -5,16 +5,18 @@ depends_on:
- lint
matrix:
# test the lowest and highest versions
ELIXIR_VERSION:
- 1.14
- 1.15
- 1.16
- 1.18
OTP_VERSION:
- 25
- 26
- 27
include:
- ELIXIR_VERSION: 1.16
OTP_VERSION: 26
- ELIXIR_VERSION: 1.14
OTP_VERSION: 25
- ELIXIR_VERSION: 1.18
OTP_VERSION: 27
variables:
- &setup-hex "mix local.hex --force && mix local.rebar --force"

View file

@ -6,6 +6,25 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## Unreleased
## 2025.03
## Added
- Oban (worker) dashboard at `/akkoma/oban`
## Fixed
- fixed some holes in SigningKey verification potentially allowing the key-user mapping to be poisoned
- frontend ZIP files can no longer traverse to paths outside their install dir
- fixed user updates trying but failing to renew signing key information
- fixed signing key refresh on key rotation
## Changed
- Dropped obsolete `ap_enabled` indicator from user table and associated buggy logic
- The remote user count in prometheus metrics is now an estimate instead of an exact number
since the latter proved unreasonably costly to obtain for a merely nice-to-have statistic
- Various other tweaks improving stat query performance and avoiding unnecessary work on received AP documents
- The HTML content for new posts (both Client-to-Server as well as Server-to-Server communication) will now use a different formatting to represent MFM. See [FEP-c16b](https://codeberg.org/fediverse/fep/src/branch/main/fep/c16b/fep-c16b.md) for more details.
- HTTP signatures now test the most likely request-target alias first cutting down on overhead
## 2025.01.01
Hotfix: Federation could break if a null value found its way into `should_federate?/1`

View file

@ -10,8 +10,10 @@
## Supported FEPs
- [FEP-67ff: FEDERATION](https://codeberg.org/fediverse/fep/src/branch/main/fep/67ff/fep-67ff.md)
- [FEP-dc88: Formatting Mathematics](https://codeberg.org/fediverse/fep/src/branch/main/fep/dc88/fep-dc88.md)
- [FEP-f1d5: NodeInfo in Fediverse Software](https://codeberg.org/fediverse/fep/src/branch/main/fep/f1d5/fep-f1d5.md)
- [FEP-fffd: Proxy Objects](https://codeberg.org/fediverse/fep/src/branch/main/fep/fffd/fep-fffd.md)
- [FEP-c16b: Formatting MFM functions](https://codeberg.org/fediverse/fep/src/branch/main/fep/c16b/fep-c16b.md)
## ActivityPub
@ -30,6 +32,10 @@ Akkoma does not perform JSON-LD processing.
All AP S2S POST requests to Akkoma instances MUST be signed.
Depending on instance configuration the same may be true for GET requests.
### FEP-c16b: Formatting MFM functions
The optional extension term `htmlMfm` is currently not used.
## Nodeinfo
Akkoma provides many additional entries in its nodeinfo response,

View file

@ -54,9 +54,6 @@ If your platform is not supported, or you just want to be able to edit the sourc
### Docker
Docker installation is supported via [this setup](https://docs.akkoma.dev/stable/installation/docker_en/)
### Packages
Akkoma is packaged for [YunoHost](https://yunohost.org) and can be found and installed from the [YunoHost app catalogue](https://yunohost.org/#/apps).
### Compilation Troubleshooting
If you ever encounter compilation issues during the updating of Akkoma, you can try these commands and see if they fix things:

View file

@ -1,7 +1,6 @@
FROM elixir:1.9.4
RUN apt-get update &&\
apt-get install -y libmagic-dev cmake libimage-exiftool-perl ffmpeg &&\
mix local.hex --force &&\
mix local.rebar --force
ARG TAG
FROM docker.io/hexpm/elixir:${TAG}
RUN apk update
RUN apk add git gcc g++ musl-dev make cmake file-dev rclone wget zip imagemagick ffmpeg perl-image-exiftool exiftool
RUN mkdir /src
WORKDIR /src

3
ci/build-all.sh Executable file
View file

@ -0,0 +1,3 @@
./build.sh 1.14-otp25 1.14.3-erlang-25.3.2-alpine-3.18.0
./build.sh 1.15-otp25 1.15.8-erlang-25.3.2.18-alpine-3.19.7
./build.sh 1.18-otp27 1.18.2-erlang-27.2.4-alpine-3.19.7

3
ci/build.sh Executable file
View file

@ -0,0 +1,3 @@
echo "Building $1 using image tag $2"
docker build -t docker.io/akkoma/ci-base:$1 --build-arg=version=$1 --build-arg=TAG=$2 .
docker push docker.io/akkoma/ci-base:$1

View file

@ -1 +0,0 @@
docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 -t git.pleroma.social:5050/pleroma/pleroma/ci-base:latest --push .

View file

@ -166,10 +166,7 @@
"application/xrd+xml" => ["xrd+xml"],
"application/jrd+json" => ["jrd+json"],
"application/activity+json" => ["activity+json"],
"application/ld+json" => ["activity+json"],
# Can be removed when bumping MIME past 2.0.5
# see https://akkoma.dev/AkkomaGang/akkoma/issues/657
"image/apng" => ["apng"]
"application/ld+json" => ["activity+json"]
}
config :mime, :extensions, %{
@ -373,6 +370,7 @@
note_replies_output_limit: 5,
sign_object_fetches: true,
authorized_fetch_mode: false,
min_key_refetch_interval: 86_400,
max_collection_objects: 50
config :pleroma, :streamer,
@ -602,7 +600,7 @@
federator_incoming: 5,
federator_outgoing: 5,
search_indexing: 2,
rich_media_backfill: 3
rich_media_backfill: 1
],
timeout: [
activity_expiration: :timer.seconds(5),

View file

@ -61,15 +61,15 @@ Next install Erlang:
```shell
asdf plugin add erlang https://github.com/asdf-vm/asdf-erlang.git
export KERL_CONFIGURE_OPTIONS="--disable-debug --without-javac"
asdf install erlang 26.2.5.4
asdf global erlang 26.2.5.4
asdf install erlang 27.2.4
asdf set erlang 27.2.4
```
Now install Elixir:
```shell
asdf plugin-add elixir https://github.com/asdf-vm/asdf-elixir.git
asdf install elixir 1.17.3-otp-26
asdf global elixir 1.17.3-otp-26
asdf plugin add elixir https://github.com/asdf-vm/asdf-elixir.git
asdf install elixir 1.18.2-otp-27
asdf set elixir 1.18.2-otp-27
```
Confirm that Elixir is installed correctly by checking the version:

View file

@ -1,7 +1,7 @@
## Required dependencies
* PostgreSQL 12+
* Elixir 1.14+ (currently tested up to 1.17)
* Elixir 1.14.1+ (currently tested up to 1.18)
* Erlang OTP 25+ (currently tested up to OTP27)
* git
* file / libmagic

View file

@ -1,9 +0,0 @@
# Installing on Yunohost
[YunoHost](https://yunohost.org) is a server operating system aimed at self-hosting. The YunoHost community maintains a package of Akkoma which allows you to install Akkoma on YunoHost. You can install it via the normal way through the admin web interface, or through the CLI. More information can be found at [the repo of the package](https://github.com/YunoHost-Apps/akkoma_ynh).
## Questions
Questions and problems related to the YunoHost parts can be done through the [YunoHost channels](https://yunohost.org/en/help).
For questions about Akkoma, check out the [Akkoma community channels](../../#community-channels).

View file

@ -93,6 +93,7 @@ def run(["get-packs" | args]) do
)
files = fetch_and_decode!(files_loc)
files_to_unzip = for({_, f} <- files, do: f)
shell_info(IO.ANSI.format(["Unpacking ", :bright, pack_name]))
@ -103,17 +104,7 @@ def run(["get-packs" | args]) do
pack_name
])
files_to_unzip =
Enum.map(
files,
fn {_, f} -> to_charlist(f) end
)
{:ok, _} =
:zip.unzip(binary_archive,
cwd: to_charlist(pack_path),
file_list: files_to_unzip
)
{:ok, _} = Pleroma.SafeZip.unzip_data(binary_archive, pack_path, files_to_unzip)
shell_info(IO.ANSI.format(["Writing pack.json for ", :bright, pack_name]))
@ -202,7 +193,7 @@ def run(["gen-pack" | args]) do
tmp_pack_dir = Path.join(System.tmp_dir!(), "emoji-pack-#{name}")
{:ok, _} = :zip.unzip(binary_archive, cwd: String.to_charlist(tmp_pack_dir))
{:ok, _} = Pleroma.SafeZip.unzip_data(binary_archive, tmp_pack_dir)
emoji_map = Pleroma.Emoji.Loader.make_shortcode_to_file_map(tmp_pack_dir, exts)

View file

@ -14,7 +14,7 @@ defmodule Akkoma.Collections.Fetcher do
@spec fetch_collection(String.t() | map()) :: {:ok, [Pleroma.Object.t()]} | {:error, any()}
def fetch_collection(ap_id) when is_binary(ap_id) do
with {:ok, page} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id) do
{:ok, objects_from_collection(page)}
partial_as_success(objects_from_collection(page))
else
e ->
Logger.error("Could not fetch collection #{ap_id} - #{inspect(e)}")
@ -24,9 +24,12 @@ def fetch_collection(ap_id) when is_binary(ap_id) do
def fetch_collection(%{"type" => type} = page)
when type in ["Collection", "OrderedCollection", "CollectionPage", "OrderedCollectionPage"] do
{:ok, objects_from_collection(page)}
partial_as_success(objects_from_collection(page))
end
defp partial_as_success({:partial, items}), do: {:ok, items}
defp partial_as_success(res), do: res
defp items_in_page(%{"type" => type, "orderedItems" => items})
when is_list(items) and type in ["OrderedCollection", "OrderedCollectionPage"],
do: items
@ -53,11 +56,11 @@ defp objects_from_collection(%{"type" => type, "first" => %{"id" => id}})
fetch_page_items(id)
end
defp objects_from_collection(_page), do: []
defp objects_from_collection(_page), do: {:ok, []}
defp fetch_page_items(id, items \\ []) do
if Enum.count(items) >= Config.get([:activitypub, :max_collection_objects]) do
items
{:ok, items}
else
with {:ok, page} <- Fetcher.fetch_and_contain_remote_object_from_id(id) do
objects = items_in_page(page)
@ -65,18 +68,22 @@ defp fetch_page_items(id, items \\ []) do
if Enum.count(objects) > 0 do
maybe_next_page(page, items ++ objects)
else
items
{:ok, items}
end
else
{:error, :not_found} ->
items
{:ok, items}
{:error, :forbidden} ->
items
{:ok, items}
{:error, error} ->
Logger.error("Could not fetch page #{id} - #{inspect(error)}")
{:error, error}
case items do
[] -> {:error, error}
_ -> {:partial, items}
end
end
end
end
@ -85,5 +92,5 @@ defp maybe_next_page(%{"next" => id}, items) when is_binary(id) do
fetch_page_items(id, items)
end
defp maybe_next_page(_, items), do: items
defp maybe_next_page(_, items), do: {:ok, items}
end

View file

@ -25,6 +25,7 @@ defmodule Pleroma.Emoji.Pack do
alias Pleroma.Emoji
alias Pleroma.Emoji.Pack
alias Pleroma.Utils
alias Pleroma.SafeZip
# Invalid/Malicious names are supposed to be filtered out before path joining,
# but there are many entrypoints to affected functions so as the code changes
@ -95,22 +96,20 @@ def delete(name) do
end
end
@spec unpack_zip_emojies(list(tuple())) :: list(map())
defp unpack_zip_emojies(zip_files) do
Enum.reduce(zip_files, [], fn
{_, path, s, _, _, _}, acc when elem(s, 2) == :regular ->
with(
filename <- Path.basename(path),
shortcode <- Path.basename(filename, Path.extname(filename)),
false <- Emoji.exist?(shortcode)
) do
[%{path: path, filename: path, shortcode: shortcode} | acc]
else
_ -> acc
end
_, acc ->
acc
@spec map_zip_emojis(list(String.t())) :: list(map())
defp map_zip_emojis(zip_files) do
Enum.reduce(zip_files, [], fn path, acc ->
with(
filename <- Path.basename(path),
shortcode <- Path.basename(filename, Path.extname(filename)),
# note: this only checks the shortcode, if an emoji already exists on the same path, but
# with a different shortcode, the existing one will be degraded to an alias of the new
false <- Emoji.exist?(shortcode)
) do
[%{path: path, filename: path, shortcode: shortcode} | acc]
else
_ -> acc
end
end)
end
@ -118,18 +117,15 @@ defp unpack_zip_emojies(zip_files) do
{:ok, t()}
| {:error, File.posix() | atom()}
def add_file(%Pack{} = pack, _, _, %Plug.Upload{content_type: "application/zip"} = file) do
with {:ok, zip_files} <- :zip.table(to_charlist(file.path)),
[_ | _] = emojies <- unpack_zip_emojies(zip_files),
with {:ok, zip_files} <- SafeZip.list_dir_file(file.path),
[_ | _] = emojis <- map_zip_emojis(zip_files),
{:ok, tmp_dir} <- Utils.tmp_dir("emoji") do
try do
{:ok, _emoji_files} =
:zip.unzip(
to_charlist(file.path),
[{:file_list, Enum.map(emojies, & &1[:path])}, {:cwd, to_charlist(tmp_dir)}]
)
SafeZip.unzip_file(file.path, tmp_dir, Enum.map(emojis, & &1[:path]))
{_, updated_pack} =
Enum.map_reduce(emojies, pack, fn item, emoji_pack ->
Enum.map_reduce(emojis, pack, fn item, emoji_pack ->
emoji_file = %Plug.Upload{
filename: item[:filename],
path: path_join_safe(tmp_dir, item[:path])
@ -446,16 +442,9 @@ defp downloadable?(pack) do
end
defp create_archive_and_cache(pack, hash) do
files = [
~c"pack.json"
| Enum.map(pack.files, fn {_, file} ->
{:ok, file} = Path.safe_relative(file)
to_charlist(file)
end)
]
{:ok, {_, result}} =
:zip.zip(~c"#{pack.name}.zip", files, [:memory, cwd: to_charlist(pack.path)])
pack_file_list = Enum.into(pack.files, [], fn {_, f} -> f end)
files = ["pack.json" | pack_file_list]
{:ok, {_, result}} = SafeZip.zip("#{pack.name}.zip", files, pack.path, true)
ttl_per_file = Pleroma.Config.get!([:emoji, :shared_pack_cache_seconds_per_file])
overall_ttl = :timer.seconds(ttl_per_file * Enum.count(files))
@ -626,11 +615,10 @@ defp copy_as(remote_pack, local_name) do
defp unzip(archive, pack_info, remote_pack, local_pack) do
with :ok <- File.mkdir_p!(local_pack.path) do
files = Enum.map(remote_pack["files"], fn {_, path} -> to_charlist(path) end)
files = Enum.map(remote_pack["files"], fn {_, path} -> path end)
# Fallback cannot contain a pack.json file
files = if pack_info[:fallback], do: files, else: [~c"pack.json" | files]
:zip.unzip(archive, cwd: to_charlist(local_pack.path), file_list: files)
files = if pack_info[:fallback], do: files, else: ["pack.json" | files]
SafeZip.unzip_data(archive, local_pack.path, files)
end
end
@ -693,13 +681,14 @@ defp update_sha_and_save_metadata(pack, data) do
end
defp validate_has_all_files(pack, zip) do
with {:ok, f_list} <- :zip.unzip(zip, [:memory]) do
# Check if all files from the pack.json are in the archive
pack.files
|> Enum.all?(fn {_, from_manifest} ->
List.keyfind(f_list, to_charlist(from_manifest), 0)
# Check if all files from the pack.json are in the archive
eset =
Enum.reduce(pack.files, MapSet.new(), fn
{_, file}, s -> MapSet.put(s, to_charlist(file))
end)
|> if(do: :ok, else: {:error, :incomplete})
end
if SafeZip.contains_all_data?(zip, eset),
do: :ok,
else: {:error, :incomplete}
end
end

View file

@ -70,25 +70,12 @@ defp download_or_unzip(_frontend_info, temp_dir, file) do
end
def unzip(zip, dest) do
with {:ok, unzipped} <- :zip.unzip(zip, [:memory]) do
File.rm_rf!(dest)
File.mkdir_p!(dest)
File.rm_rf!(dest)
File.mkdir_p!(dest)
Enum.each(unzipped, fn {filename, data} ->
path = filename
new_file_path = Path.join(dest, path)
new_file_path
|> Path.dirname()
|> File.rm()
new_file_path
|> Path.dirname()
|> File.mkdir_p!()
File.write!(new_file_path, data)
end)
case Pleroma.SafeZip.unzip_data(zip, dest) do
{:ok, _} -> :ok
error -> error
end
end

View file

@ -158,6 +158,14 @@ def needs_update(%Instance{metadata_updated_at: metadata_updated_at}) do
NaiveDateTime.diff(now, metadata_updated_at) > 86_400
end
def needs_update(%URI{host: host}) do
with %Instance{} = instance <- Repo.get_by(Instance, %{host: host}) do
needs_update(instance)
else
_ -> true
end
end
def local do
%Instance{
host: Pleroma.Web.Endpoint.host(),
@ -180,7 +188,7 @@ def update_metadata(%URI{host: host} = uri) do
defp do_update_metadata(%URI{host: host} = uri, existing_record) do
if existing_record do
if needs_update(existing_record) do
Logger.info("Updating metadata for #{host}")
Logger.debug("Updating metadata for #{host}")
favicon = scrape_favicon(uri)
nodeinfo = scrape_nodeinfo(uri)
@ -199,7 +207,7 @@ defp do_update_metadata(%URI{host: host} = uri, existing_record) do
favicon = scrape_favicon(uri)
nodeinfo = scrape_nodeinfo(uri)
Logger.info("Creating metadata for #{host}")
Logger.debug("Creating metadata for #{host}")
%Instance{}
|> changeset(%{

View file

@ -215,6 +215,11 @@ def get_cached_by_ap_id(ap_id) do
end
end
# Intentionally accepts non-Object arguments!
@spec tombstone_object?(term()) :: boolean()
def tombstone_object?(%Object{data: %{"type" => "Tombstone"}}), do: true
def tombstone_object?(_), do: false
def make_tombstone(%Object{data: %{"id" => id, "type" => type}}, deleted \\ DateTime.utc_now()) do
%ObjectTombstone{
id: id,

View file

@ -157,4 +157,16 @@ def same_origin(id1, id2) do
compare_uris(uri1, uri2)
end
@doc """
Checks whether a key_id - owner_id pair are acceptable.
While in theory keys and actors on different domain could be verified
by fetching both and checking the links on both ends (if different at all),
this requires extra fetches and there are no known implementations with split
actor and key domains, thus atm this simply requires same domain.
"""
def contain_key_user(key_id, user_id) do
same_origin(key_id, user_id)
end
end

View file

@ -265,6 +265,64 @@ def fetch_and_contain_remote_object_from_id(%{"id" => id}, is_ap_id),
def fetch_and_contain_remote_object_from_id(id, is_ap_id) when is_binary(id) do
Logger.debug("Fetching object #{id} via AP [ap_id=#{is_ap_id}]")
fetch_and_contain_remote_ap_doc(
id,
is_ap_id,
fn final_uri, data -> {Containment.contain_id_to_fetch(final_uri, data), data["id"]} end
)
end
def fetch_and_contain_remote_object_from_id(_id, _is_ap_id),
do: {:error, :invalid_id}
def fetch_and_contain_remote_key(id) do
Logger.debug("Fetching remote actor key #{id}")
fetch_and_contain_remote_ap_doc(
id,
# key IDs are always AP IDs which should resolve directly and exactly
true,
fn
final_uri, %{"id" => user_id, "publicKey" => %{"id" => key_id}} ->
# For non-fragment keys as used e.g. by GTS, the "id" won't match the fetch URI,
# but the key ID will. Thus do NOT strict check the top-level id, but byte-exact
# check the key ID (since for later lookups we need byte-exact matches).
# This relies on fetching already enforcing a domain match for toplevel id and host
with :ok <- Containment.contain_key_user(key_id, user_id),
true <- key_id == final_uri do
{:ok, key_id}
else
_ -> {:error, key_id}
end
final_uri,
%{"type" => "CryptographicKey", "id" => key_id, "owner" => user_id, "publicKeyPem" => _} ->
# XXX: refactor into separate function instead of duplicating
with :ok <- Containment.contain_key_user(key_id, user_id),
true <- key_id == final_uri do
{:ok, key_id}
else
_ -> {:error, nil}
end
_, _ ->
{:error, nil}
end
)
end
# Fetches an AP document and performs variable security checks on it.
#
# Note that the received documents "id" matching the final host domain
# is always enforced before the custom ID check runs.
@spec fetch_and_contain_remote_ap_doc(
String.t(),
boolean(),
(String.t(), Map.t() -> {:ok | :error, String.t() | term()})
) :: {:ok, Map.t()} | {:reject, term()} | {:error, term()}
defp fetch_and_contain_remote_ap_doc(id, is_ap_id, verify_id) do
Logger.debug("Dereferencing AP doc #{}")
with {:valid_uri_scheme, true} <- {:valid_uri_scheme, String.starts_with?(id, "http")},
%URI{} = uri <- URI.parse(id),
{:mrf_reject_check, {:ok, nil}} <-
@ -277,7 +335,7 @@ def fetch_and_contain_remote_object_from_id(id, is_ap_id) when is_binary(id) do
true <- !is_ap_id || final_id == id,
{:ok, data} <- safe_json_decode(body),
{_, :ok} <- {:containment, Containment.contain_origin(final_id, data)},
{_, _, :ok} <- {:strict_id, data["id"], Containment.contain_id_to_fetch(final_id, data)} do
{_, {:ok, _}} <- {:strict_id, verify_id.(final_id, data)} do
unless Instances.reachable?(final_id) do
Instances.set_reachable(final_id)
end
@ -289,14 +347,12 @@ def fetch_and_contain_remote_object_from_id(id, is_ap_id) when is_binary(id) do
# Similarly, keys either use a fragment ID and are subobjects, or have a distinct ID
# but for compatibility are still a subobject presenting their owning actor's ID at the toplevel.
# Refetching _once_ from the listed id should yield a strict match afterwards.
{:strict_id, ap_id, _} = e ->
case is_ap_id do
false ->
fetch_and_contain_remote_object_from_id(ap_id, true)
true ->
log_fetch_error(id, e)
{:error, :id_mismatch}
{:strict_id, {_error, ap_id}} = e ->
if !is_ap_id and is_binary(ap_id) do
fetch_and_contain_remote_ap_doc(ap_id, true, verify_id)
else
log_fetch_error(id, e)
{:error, :id_mismatch}
end
{:mrf_reject_check, _} = e ->
@ -327,9 +383,6 @@ def fetch_and_contain_remote_object_from_id(id, is_ap_id) when is_binary(id) do
end
end
def fetch_and_contain_remote_object_from_id(_id, _is_ap_id),
do: {:error, :invalid_id}
# HOPEFULLY TEMPORARY
# Basically none of our Tesla mocks in tests set the (supposed to
# exist for Tesla proper) url parameter for their responses

216
lib/pleroma/safe_zip.ex Normal file
View file

@ -0,0 +1,216 @@
# Akkoma: Magically expressive social media
# Copyright © 2024 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.SafeZip do
@moduledoc """
Wraps the subset of Erlang's zip module we'd like to use,
but enforces path-traversal safety everywhere along with other checks.
For convenience almost all functions accept both Elixir strings and charlists,
but always output Elixir strings themselves. However, this means the input parameter type
can no longer be used to distinguish archive file paths from archive binary data in memory,
thus where needed both a _data and a _file variant are provided.
"""
@type text() :: String.t() | [char()]
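# Illustrative sketch, not part of the diff: since both strings and charlists
# are accepted as input, on-disk archives and in-memory archive data need
# distinct entry points (paths made up):
#
#   SafeZip.unzip_file("/tmp/backup.zip", "/tmp/out")
#   SafeZip.unzip_data(File.read!("/tmp/backup.zip"), "/tmp/out")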
defp is_safe_path?(path) do
# Path accepts Elixir's chardata()
case Path.safe_relative(path) do
{:ok, _} -> true
_ -> false
end
end
defp is_safe_type?(file_type) do
file_type in [:regular, :directory]
end
defp maybe_add_file(_type, _path_charlist, nil), do: nil
defp maybe_add_file(:regular, path_charlist, file_list),
do: [to_string(path_charlist) | file_list]
defp maybe_add_file(_type, _path_charlist, file_list), do: file_list
@spec check_safe_archive_and_maybe_list_files(binary() | [char()], [term()], boolean()) ::
{:ok, [String.t()]} | {:error, reason :: term()}
defp check_safe_archive_and_maybe_list_files(archive, opts, list) do
acc = if list, do: [], else: nil
with {:ok, table} <- :zip.table(archive, opts) do
Enum.reduce_while(table, {:ok, acc}, fn
# ZIP comment
{:zip_comment, _}, acc ->
{:cont, acc}
# File entry
{:zip_file, path, info, _comment, _offset, _comp_size}, {:ok, fl} ->
with {_, type} <- {:get_type, elem(info, 2)},
{_, true} <- {:type, is_safe_type?(type)},
{_, true} <- {:safe_path, is_safe_path?(path)} do
{:cont, {:ok, maybe_add_file(type, path, fl)}}
else
{:get_type, e} ->
{:halt,
{:error, "Couldn't determine file type of ZIP entry at #{path} (#{inspect(e)})"}}
{:type, _} ->
{:halt, {:error, "Potentially unsafe file type in ZIP at: #{path}"}}
{:safe_path, _} ->
{:halt, {:error, "Unsafe path in ZIP: #{path}"}}
end
# unknown record type (new OTP version?)
_, _acc ->
{:halt, {:error, "Unknown ZIP record type"}}
end)
end
end
@spec check_safe_archive_and_list_files(binary() | [char()], [term()]) ::
{:ok, [String.t()]} | {:error, reason :: term()}
defp check_safe_archive_and_list_files(archive, opts \\ []) do
check_safe_archive_and_maybe_list_files(archive, opts, true)
end
@spec check_safe_archive(binary() | [char()], [term()]) :: :ok | {:error, reason :: term()}
defp check_safe_archive(archive, opts \\ []) do
case check_safe_archive_and_maybe_list_files(archive, opts, false) do
{:ok, _} -> :ok
error -> error
end
end
@spec check_safe_file_list([text()], text()) :: :ok | {:error, term()}
defp check_safe_file_list([], _), do: :ok
defp check_safe_file_list([path | tail], cwd) do
with {_, true} <- {:path, is_safe_path?(path)},
{_, {:ok, fstat}} <- {:stat, File.stat(Path.expand(path, cwd))},
{_, true} <- {:type, is_safe_type?(fstat.type)} do
check_safe_file_list(tail, cwd)
else
{:path, _} ->
{:error, "Unsafe path escaping cwd: #{path}"}
{:stat, e} ->
{:error, "Unable to check file type of #{path}: #{inspect(e)}"}
{:type, _} ->
{:error, "Unsafe type at #{path}"}
end
end
defp check_safe_file_list(_, _), do: {:error, "Malformed file_list"}
@doc """
Checks whether the archive data contains file entries for all paths from fset.
Note this really only accepts entries corresponding to regular _files_;
if a path is present as e.g. a directory, this does not count as a match.
"""
@spec contains_all_data?(binary(), MapSet.t()) :: boolean()
def contains_all_data?(archive_data, fset) do
with {:ok, table} <- :zip.table(archive_data) do
remaining =
Enum.reduce(table, fset, fn
{:zip_file, path, info, _comment, _offset, _comp_size}, fset ->
if elem(info, 2) == :regular do
MapSet.delete(fset, path)
else
fset
end
_, _ ->
fset
end)
|> MapSet.size()
remaining == 0
else
_ -> false
end
end
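# Illustrative sketch, not part of the diff: paths in :zip.table/1 records
# are charlists, so the expected set should contain charlists as well:
#
#   fset = MapSet.new([~c"actor.json", ~c"outbox.json"])
#   SafeZip.contains_all_data?(archive_data, fset)
#   #=> true only if both paths exist in the archive as regular files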
@doc """
List all file entries in ZIP, or error if invalid or unsafe.
Note this really only lists regular files, no directories, ZIP comments or other types!
"""
@spec list_dir_file(text()) :: {:ok, [String.t()]} | {:error, reason :: term()}
def list_dir_file(archive) do
path = to_charlist(archive)
check_safe_archive_and_list_files(path)
end
defp stringify_zip({:ok, {fname, data}}), do: {:ok, {to_string(fname), data}}
defp stringify_zip({:ok, fname}), do: {:ok, to_string(fname)}
defp stringify_zip(ret), do: ret
@spec zip(text(), text(), [text()], boolean()) ::
{:ok, file_name :: String.t()}
| {:ok, {file_name :: String.t(), file_data :: binary()}}
| {:error, reason :: term()}
def zip(name, file_list, cwd, memory \\ false) do
opts = [{:cwd, to_charlist(cwd)}]
opts = if memory, do: [:memory | opts], else: opts
with :ok <- check_safe_file_list(file_list, cwd) do
file_list = for f <- file_list, do: to_charlist(f)
name = to_charlist(name)
stringify_zip(:zip.zip(name, file_list, opts))
end
end
@spec unzip_file(text(), text(), [text()] | nil) ::
{:ok, [String.t()]}
| {:error, reason :: term()}
| {:error, {name :: text(), reason :: term()}}
def unzip_file(archive, target_dir, file_list \\ nil) do
do_unzip(to_charlist(archive), to_charlist(target_dir), file_list)
end
@spec unzip_data(binary(), text(), [text()] | nil) ::
{:ok, [String.t()]}
| {:error, reason :: term()}
| {:error, {name :: text(), reason :: term()}}
def unzip_data(archive, target_dir, file_list \\ nil) do
do_unzip(archive, to_charlist(target_dir), file_list)
end
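# Illustrative sketch, not part of the diff: zip/4 validates the input file
# list and unzip_file/3 validates the archive before extraction (paths made up):
#
#   {:ok, path} = SafeZip.zip("backup.zip", ["actor.json"], "/tmp/backup")
#   {:ok, files} = SafeZip.unzip_file(path, "/tmp/extracted")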
defp stringify_unzip({:ok, [{_fname, _data} | _] = filebinlist}),
do: {:ok, Enum.map(filebinlist, fn {fname, data} -> {to_string(fname), data} end)}
defp stringify_unzip({:ok, [_fname | _] = filelist}),
do: {:ok, Enum.map(filelist, fn fname -> to_string(fname) end)}
defp stringify_unzip({:error, {fname, term}}), do: {:error, {to_string(fname), term}}
defp stringify_unzip(ret), do: ret
@spec do_unzip(binary() | [char()], text(), [text()] | nil) ::
{:ok, [String.t()]}
| {:error, reason :: term()}
| {:error, {name :: text(), reason :: term()}}
defp do_unzip(archive, target_dir, file_list) do
opts =
if file_list != nil do
[
file_list: for(f <- file_list, do: to_charlist(f)),
cwd: target_dir
]
else
[cwd: target_dir]
end
with :ok <- check_safe_archive(archive) do
stringify_unzip(:zip.unzip(archive, opts))
end
end
end

View file

@ -40,12 +40,6 @@ def search(user, search_query, options \\ []) do
end
end
@impl true
def add_to_index(_activity), do: nil
@impl true
def remove_from_index(_object), do: nil
def maybe_restrict_author(query, %User{} = author) do
Activity.Queries.by_author(query, author)
end

View file

@ -14,4 +14,6 @@ defmodule Pleroma.Search.SearchBackend do
from index.
"""
@callback remove_from_index(object :: Pleroma.Object.t()) :: {:ok, any()} | {:error, any()}
@optional_callbacks add_to_index: 1, remove_from_index: 1
end

View file

@ -9,37 +9,40 @@ defmodule Pleroma.Signature do
alias Pleroma.User.SigningKey
require Logger
def key_id_to_actor_id(key_id) do
case SigningKey.key_id_to_ap_id(key_id) do
nil ->
# hm, we SHOULD have gotten this in the pipeline before we hit here!
Logger.error("Could not figure out who owns the key #{key_id}")
{:error, :key_owner_not_found}
key ->
{:ok, key}
end
end
def fetch_public_key(conn) do
with %{"keyId" => kid} <- HTTPSignatures.signature_for_conn(conn),
{:ok, %SigningKey{}} <- SigningKey.get_or_fetch_by_key_id(kid),
{:ok, actor_id} <- key_id_to_actor_id(kid),
{:ok, public_key} <- User.get_public_key_for_ap_id(actor_id) do
{:ok, public_key}
with {_, %{"keyId" => kid}} <- {:keyid, HTTPSignatures.signature_for_conn(conn)},
{_, {:ok, %SigningKey{} = sk}, _} <-
{:fetch, SigningKey.get_or_fetch_by_key_id(kid), kid},
{_, {:ok, decoded_key}} <- {:decode, SigningKey.public_key_decoded(sk)} do
{:ok, decoded_key}
else
{:fetch, error, kid} ->
Logger.error("Failed to acquire key from signature: #{kid} #{inspect(error)}")
{:error, {:fetch, error}}
e ->
{:error, e}
end
end
def refetch_public_key(conn) do
with %{"keyId" => kid} <- HTTPSignatures.signature_for_conn(conn),
{:ok, %SigningKey{}} <- SigningKey.get_or_fetch_by_key_id(kid),
{:ok, actor_id} <- key_id_to_actor_id(kid),
{:ok, public_key} <- User.get_public_key_for_ap_id(actor_id) do
{:ok, public_key}
with {_, %{"keyId" => kid}} <- {:keyid, HTTPSignatures.signature_for_conn(conn)},
{_, {:ok, %SigningKey{} = sk}, _} <- {:fetch, SigningKey.refresh_by_key_id(kid), kid},
{_, {:ok, decoded_key}} <- {:decode, SigningKey.public_key_decoded(sk)} do
{:ok, decoded_key}
else
{:fetch, {:error, :too_young}, kid} ->
Logger.debug("Refusing to refetch recently updated key: #{kid}")
{:error, {:fetch, :too_young}}
{:fetch, {:error, :unknown}, kid} ->
Logger.warning("Attempted to refresh unknown key; this should not happen: #{kid}")
{:error, {:fetch, :unknown}}
{:fetch, error, kid} ->
Logger.error("Failed to refresh stale key from signature: #{kid} #{inspect(error)}")
{:error, {:fetch, error}}
e ->
{:error, e}
end

View file

@ -10,6 +10,7 @@ defmodule Pleroma.Stats do
alias Pleroma.CounterCache
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Instances.Instance
@interval :timer.seconds(300)
@ -39,7 +40,8 @@ def force_update do
@spec get_stats() :: %{
domain_count: non_neg_integer(),
status_count: non_neg_integer(),
user_count: non_neg_integer()
user_count: non_neg_integer(),
remote_user_count: non_neg_integer()
}
def get_stats do
%{stats: stats} = GenServer.call(__MODULE__, :get_state)
@ -60,41 +62,39 @@ def get_peers do
stats: %{
domain_count: non_neg_integer(),
status_count: non_neg_integer(),
user_count: non_neg_integer()
user_count: non_neg_integer(),
remote_user_count: non_neg_integer()
}
}
def calculate_stat_data do
# the instances table has a unique constraint on the host column
peers =
from(
u in User,
select: fragment("distinct split_part(?, '@', 2)", u.nickname),
where: u.local != ^true
i in Instance,
select: i.host
)
|> Repo.all()
|> Enum.filter(& &1)
domain_count = Enum.count(peers)
status_count = Repo.aggregate(User.Query.build(%{local: true}), :sum, :note_count)
users_query =
# there are few enough local users for postgres to use an index scan
# (also here an exact count is a bit more important)
user_count =
from(u in User,
where: u.is_active == true,
where: u.local == true,
where: not is_nil(u.nickname),
where: not u.invisible
)
|> Repo.aggregate(:count, :id)
remote_users_query =
from(u in User,
where: u.is_active == true,
where: u.local == false,
where: not is_nil(u.nickname),
where: not u.invisible
)
user_count = Repo.aggregate(users_query, :count, :id)
remote_user_count = Repo.aggregate(remote_users_query, :count, :id)
# but remote users are numerous, leading to a full table scan
# (ecto currently doesn't allow building queries without explicit table)
%{rows: [[remote_user_count]]} =
"SELECT estimate_remote_user_count();"
|> Pleroma.Repo.query!()
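# Hypothetical sketch, not part of this diff: estimate_remote_user_count()
# is a SQL helper defined elsewhere (in a migration); one common way to get
# a cheap estimate is reading the planner's row estimate instead of
# counting, roughly:
#
#   CREATE FUNCTION estimate_remote_user_count() RETURNS bigint AS $$
#   DECLARE plan jsonb;
#   BEGIN
#     EXECUTE 'EXPLAIN (FORMAT JSON) SELECT 1 FROM users WHERE NOT local'
#       INTO plan;
#     RETURN (plan -> 0 -> 'Plan' ->> 'Plan Rows')::bigint;
#   END $$ LANGUAGE plpgsql;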
%{
peers: peers,

View file

@ -100,7 +100,6 @@ defmodule Pleroma.User do
field(:password_hash, :string)
field(:password, :string, virtual: true)
field(:password_confirmation, :string, virtual: true)
field(:keys, :string)
field(:ap_id, :string)
field(:avatar, :map, default: %{})
field(:local, :boolean, default: true)
@ -127,7 +126,6 @@ defmodule Pleroma.User do
field(:domain_blocks, {:array, :string}, default: [])
field(:is_active, :boolean, default: true)
field(:no_rich_text, :boolean, default: false)
field(:ap_enabled, :boolean, default: false)
field(:is_moderator, :boolean, default: false)
field(:is_admin, :boolean, default: false)
field(:show_role, :boolean, default: true)
@ -223,7 +221,9 @@ defmodule Pleroma.User do
# FOR THE FUTURE: We might want to make this a one-to-many relationship
# it's entirely possible right now, but we don't have a use case for it
has_one(:signing_key, SigningKey, foreign_key: :user_id)
# XXX: in the future wed also like to parse and honour key expiration times
# instead of blindly accepting any change in signing keys
has_one(:signing_key, SigningKey, foreign_key: :user_id, on_replace: :update)
timestamps()
end
@ -473,7 +473,6 @@ def remote_user_changeset(struct \\ %User{local: false}, params) do
:shared_inbox,
:nickname,
:avatar,
:ap_enabled,
:banner,
:background,
:is_locked,
@ -1006,11 +1005,7 @@ def maybe_direct_follow(%User{} = follower, %User{local: true} = followed) do
end
def maybe_direct_follow(%User{} = follower, %User{} = followed) do
if not ap_enabled?(followed) do
follow(follower, followed)
else
{:ok, follower, followed}
end
{:ok, follower, followed}
end
@doc "A mass follow for local users. Respects blocks in both directions but does not create activities."
@ -1826,7 +1821,6 @@ def purge_user_changeset(user) do
confirmation_token: nil,
domain_blocks: [],
is_active: false,
ap_enabled: false,
is_moderator: false,
is_admin: false,
mastofe_settings: nil,
@ -2006,8 +2000,20 @@ def get_or_fetch_by_ap_id(ap_id, options \\ []) do
{%User{} = user, _} ->
{:ok, user}
e ->
{_, {:error, {:reject, :mrf}}} ->
Logger.debug("Rejected to fetch user due to MRF: #{ap_id}")
{:error, {:reject, :mrf}}
{_, {:error, :not_found}} ->
Logger.debug("User doesn't exist (anymore): #{ap_id}")
{:error, :not_found}
{_, {:error, e}} ->
Logger.error("Could not fetch user #{ap_id}, #{inspect(e)}")
{:error, e}
e ->
Logger.error("Unexpected error condition while fetching user #{ap_id}, #{inspect(e)}")
{:error, :not_found}
end
end
@ -2062,21 +2068,6 @@ defp create_service_actor(uri, nickname) do
defdelegate public_key(user), to: SigningKey
def get_public_key_for_ap_id(ap_id) do
with {:ok, %User{} = user} <- get_or_fetch_by_ap_id(ap_id),
{:ok, public_key} <- SigningKey.public_key(user) do
{:ok, public_key}
else
e ->
Logger.error("Could not get public key for #{ap_id}.\n#{inspect(e)}")
{:error, e}
end
end
def ap_enabled?(%User{local: true}), do: true
def ap_enabled?(%User{ap_enabled: ap_enabled}), do: ap_enabled
def ap_enabled?(_), do: false
@doc "Gets or fetch a user by uri or nickname."
@spec get_or_fetch(String.t()) :: {:ok, User.t()} | {:error, String.t()}
def get_or_fetch("http://" <> _host = uri), do: get_or_fetch_by_ap_id(uri)
@ -2580,10 +2571,10 @@ def add_pinned_object_id(%User{} = user, object_id) do
[pinned_objects: "You have already pinned the maximum number of statuses"]
end
end)
|> update_and_set_cache()
else
change(user)
{:ok, user}
end
|> update_and_set_cache()
end
@spec remove_pinned_object_id(User.t(), String.t()) :: {:ok, t()} | {:error, term()}

View file

@ -119,7 +119,7 @@ def process(%__MODULE__{} = backup) do
end
end
@files [~c"actor.json", ~c"outbox.json", ~c"likes.json", ~c"bookmarks.json"]
@files ["actor.json", "outbox.json", "likes.json", "bookmarks.json"]
def export(%__MODULE__{} = backup) do
backup = Repo.preload(backup, :user)
name = String.trim_trailing(backup.file_name, ".zip")
@ -130,10 +130,9 @@ def export(%__MODULE__{} = backup) do
:ok <- statuses(dir, backup.user),
:ok <- likes(dir, backup.user),
:ok <- bookmarks(dir, backup.user),
{:ok, zip_path} <-
:zip.create(String.to_charlist(dir <> ".zip"), @files, cwd: String.to_charlist(dir)),
{:ok, zip_path} <- Pleroma.SafeZip.zip(dir <> ".zip", @files, dir),
{:ok, _} <- File.rm_rf(dir) do
{:ok, to_string(zip_path)}
{:ok, zip_path}
end
end

View file

@ -56,16 +56,10 @@ def key_id_to_user_id(key_id) do
def key_id_to_ap_id(key_id) do
Logger.debug("Looking up key ID: #{key_id}")
result =
from(sk in __MODULE__, where: sk.key_id == ^key_id)
|> join(:inner, [sk], u in User, on: sk.user_id == u.id)
|> select([sk, u], %{user: u})
|> Repo.one()
case result do
%{user: %User{ap_id: ap_id}} -> ap_id
_ -> nil
end
from(sk in __MODULE__, where: sk.key_id == ^key_id)
|> join(:inner, [sk], u in User, on: sk.user_id == u.id)
|> select([sk, u], u.ap_id)
|> Repo.one()
end
@spec generate_rsa_pem() :: {:ok, binary()}
@ -115,24 +109,18 @@ def private_pem_to_public_pem(private_pem) do
{:ok, :public_key.pem_encode([public_key])}
end
@spec public_key(User.t()) :: {:ok, binary()} | {:error, String.t()}
@spec public_key(__MODULE__) :: {:ok, binary()} | {:error, String.t()}
@doc """
Given a user, return the public key for that user in binary format.
Return public key data in binary format.
"""
def public_key(%User{} = user) do
case Repo.preload(user, :signing_key) do
%User{signing_key: %__MODULE__{public_key: public_key_pem}} ->
key =
public_key_pem
|> :public_key.pem_decode()
|> hd()
|> :public_key.pem_entry_decode()
def public_key_decoded(%__MODULE__{public_key: public_key_pem}) do
decoded =
public_key_pem
|> :public_key.pem_decode()
|> hd()
|> :public_key.pem_entry_decode()
{:ok, key}
_ ->
{:error, "key not found"}
end
{:ok, decoded}
end
def public_key(_), do: {:error, "key not found"}
@ -174,12 +162,12 @@ def private_key(%User{} = user) do
Will either return the key if it exists locally, or fetch it from the remote instance.
"""
def get_or_fetch_by_key_id(key_id) do
case key_id_to_user_id(key_id) do
case Repo.get_by(__MODULE__, key_id: key_id) do
nil ->
fetch_remote_key(key_id)
user_id ->
{:ok, Repo.get_by(__MODULE__, user_id: user_id)}
key ->
{:ok, key}
end
end
@ -195,20 +183,28 @@ def get_or_fetch_by_key_id(key_id) do
def fetch_remote_key(key_id) do
Logger.debug("Fetching remote key: #{key_id}")
with {:ok, _body} = resp <-
Pleroma.Object.Fetcher.fetch_and_contain_remote_object_from_id(key_id),
{:ok, ap_id, public_key_pem} <- handle_signature_response(resp) do
with {:ok, resp_body} <-
Pleroma.Object.Fetcher.fetch_and_contain_remote_key(key_id),
{:ok, ap_id, public_key_pem} <- handle_signature_response(resp_body),
{:ok, user} <- User.get_or_fetch_by_ap_id(ap_id) do
Logger.debug("Fetched remote key: #{ap_id}")
# fetch the user
{:ok, user} = User.get_or_fetch_by_ap_id(ap_id)
# store the key
key = %__MODULE__{
key = %{
user_id: user.id,
public_key: public_key_pem,
key_id: key_id
}
Repo.insert(key, on_conflict: :replace_all, conflict_target: :key_id)
key_cs =
cast(%__MODULE__{}, key, [:user_id, :public_key, :key_id])
|> unique_constraint(:user_id)
Repo.insert(key_cs,
# while this should never run for local users anyway, make sure we really never lose privkey info!
on_conflict: {:replace_all_except, [:inserted_at, :private_key]},
# if the key owner overlaps with a distinct existing key entry, this intentionally still errors
conflict_target: :key_id
)
else
e ->
Logger.debug("Failed to fetch remote key: #{inspect(e)}")
@ -216,6 +212,23 @@ def fetch_remote_key(key_id) do
end
end
defp refresh_key(%__MODULE__{} = key) do
min_backoff = Pleroma.Config.get!([:activitypub, :min_key_refetch_interval])
if Timex.diff(Timex.now(), key.updated_at, :seconds) >= min_backoff do
fetch_remote_key(key.key_id)
else
{:error, :too_young}
end
end
def refresh_by_key_id(key_id) do
case Repo.get_by(__MODULE__, key_id: key_id) do
nil -> {:error, :unknown}
key -> refresh_key(key)
end
end
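# Illustrative sketch, not part of the diff: the refetch backoff is driven by
# config (key name taken from the lookup above; the value shown is an assumption):
#
#   config :pleroma, :activitypub, min_key_refetch_interval: 86_400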
# Take the response from the remote instance and extract the key details
# will check if the key ID matches the owner of the key, if not, error
defp extract_key_details(%{"id" => ap_id, "publicKey" => public_key}) do
@ -227,7 +240,7 @@ defp extract_key_details(%{"id" => ap_id, "publicKey" => public_key}) do
end
end
defp handle_signature_response({:ok, body}) do
defp handle_signature_response(body) do
case body do
%{
"type" => "CryptographicKey",
@ -247,9 +260,9 @@ defp handle_signature_response({:ok, body}) do
%{"error" => error} ->
{:error, error}
other ->
{:error, "Could not process key: #{inspect(other)}"}
end
end
defp handle_signature_response({:error, e}), do: {:error, e}
defp handle_signature_response(other), do: {:error, "Could not fetch key: #{inspect(other)}"}
end

View file

@ -1626,7 +1626,6 @@ defp object_to_user_data(data, additional) do
%{
ap_id: data["id"],
uri: get_actor_url(data["url"]),
ap_enabled: true,
banner: normalize_image(data["image"]),
background: normalize_image(data["backgroundUrl"]),
fields: fields,
@ -1743,7 +1742,7 @@ def user_data_from_user_object(data, additional \\ []) do
end
end
def fetch_and_prepare_user_from_ap_id(ap_id, additional \\ []) do
defp fetch_and_prepare_user_from_ap_id(ap_id, additional) do
with {:ok, data} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id),
{:valid, {:ok, _, _}} <- {:valid, UserValidator.validate(data, [])},
{:ok, data} <- user_data_from_user_object(data, additional) do
@ -1751,19 +1750,16 @@ def fetch_and_prepare_user_from_ap_id(ap_id, additional \\ []) do
else
# If this has been deleted, only log a debug and not an error
{:error, {"Object has been deleted", _, _} = e} ->
Logger.debug("Could not decode user at fetch #{ap_id}, #{inspect(e)}")
{:error, e}
Logger.debug("User was explicitly deleted #{ap_id}, #{inspect(e)}")
{:error, :not_found}
{:reject, reason} = e ->
Logger.debug("Rejected user #{ap_id}: #{inspect(reason)}")
{:reject, _reason} = e ->
{:error, e}
{:valid, reason} ->
Logger.debug("Data is not a valid user #{ap_id}: #{inspect(reason)}")
{:error, "Not a user"}
{:error, {:validate, reason}}
{:error, e} ->
Logger.error("Could not decode user at fetch #{ap_id}, #{inspect(e)}")
{:error, e}
end
end
@ -1801,7 +1797,7 @@ def pin_data_from_featured_collection(%{
else
e ->
Logger.error("Could not decode featured collection at fetch #{first}, #{inspect(e)}")
{:ok, %{}}
%{}
end
end
@ -1811,14 +1807,18 @@ def pin_data_from_featured_collection(
} = collection
)
when type in ["OrderedCollection", "Collection"] do
{:ok, objects} = Collections.Fetcher.fetch_collection(collection)
# Items can either be a map _or_ a string
objects
|> Map.new(fn
ap_id when is_binary(ap_id) -> {ap_id, NaiveDateTime.utc_now()}
%{"id" => object_ap_id} -> {object_ap_id, NaiveDateTime.utc_now()}
end)
with {:ok, objects} <- Collections.Fetcher.fetch_collection(collection) do
# Items can either be a map _or_ a string
objects
|> Map.new(fn
ap_id when is_binary(ap_id) -> {ap_id, NaiveDateTime.utc_now()}
%{"id" => object_ap_id} -> {object_ap_id, NaiveDateTime.utc_now()}
end)
else
e ->
Logger.warning("Failed to fetch featured collection #{collection}, #{inspect(e)}")
%{}
end
end
def pin_data_from_featured_collection(obj) do
@ -1857,31 +1857,27 @@ def enqueue_pin_fetches(_), do: nil
def make_user_from_ap_id(ap_id, additional \\ []) do
user = User.get_cached_by_ap_id(ap_id)
if user && !User.ap_enabled?(user) do
Transmogrifier.upgrade_user_from_ap_id(ap_id)
else
with {:ok, data} <- fetch_and_prepare_user_from_ap_id(ap_id, additional) do
user =
if data.ap_id != ap_id do
User.get_cached_by_ap_id(data.ap_id)
else
user
end
if user do
user
|> User.remote_user_changeset(data)
|> User.update_and_set_cache()
|> tap(fn _ -> enqueue_pin_fetches(data) end)
with {:ok, data} <- fetch_and_prepare_user_from_ap_id(ap_id, additional) do
user =
if data.ap_id != ap_id do
User.get_cached_by_ap_id(data.ap_id)
else
maybe_handle_clashing_nickname(data)
data
|> User.remote_user_changeset()
|> Repo.insert()
|> User.set_cache()
|> tap(fn _ -> enqueue_pin_fetches(data) end)
user
end
if user do
user
|> User.remote_user_changeset(data)
|> User.update_and_set_cache()
|> tap(fn _ -> enqueue_pin_fetches(data) end)
else
maybe_handle_clashing_nickname(data)
data
|> User.remote_user_changeset()
|> Repo.insert()
|> User.set_cache()
|> tap(fn _ -> enqueue_pin_fetches(data) end)
end
end
end

View file

@ -24,7 +24,11 @@ defp score_displayname("federationbot"), do: 1.0
defp score_displayname("fedibot"), do: 1.0
defp score_displayname(_), do: 0.0
defp determine_if_followbot(%User{nickname: nickname, name: displayname, actor_type: actor_type}) do
defp determine_if_followbot(%User{
nickname: nickname,
name: displayname,
actor_type: actor_type
}) do
# nickname will be a binary string except when following a relay
nick_score =
if is_binary(nickname) do

View file

@ -108,7 +108,7 @@ defp check_ftl_removal(%{host: actor_host} = _actor_info, object) do
end
defp intersection(list1, list2) do
list1 -- list1 -- list2
list1 -- (list1 -- list2)
end
defp check_followers_only(%{host: actor_host} = _actor_info, object) do

View file

@ -15,6 +15,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidator do
alias Pleroma.EctoType.ActivityPub.ObjectValidators
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.Object.Fetcher
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ObjectValidators.AcceptRejectValidator
alias Pleroma.Web.ActivityPub.ObjectValidators.AddRemoveValidator
@ -253,9 +254,28 @@ def fetch_actor(object) do
end
def fetch_actor_and_object(object) do
fetch_actor(object)
Object.normalize(object["object"], fetch: true)
:ok
# Fetcher.fetch_object_from_id already does a local db lookup first
with {:ok, %User{}} <- fetch_actor(object),
{:ap_id, id} when is_binary(id) <-
{:ap_id, Pleroma.Web.ActivityPub.Utils.get_ap_id(object["object"])},
{:ok, %Object{}} <- Fetcher.fetch_object_from_id(id) do
:ok
else
{:ap_id, id} ->
{:error, {:validate, "Invalid AP id: #{inspect(id)}"}}
# if actor: late post from a previously unknown, deleted profile
# if object: private post we're not allowed to access
# (other HTTP replies might just indicate a temporary network failure though!)
{:error, e} when e in [:not_found, :forbidden] ->
{:error, :ignore}
{:error, _} = e ->
e
e ->
{:error, e}
end
end
defp for_each_history_item(

View file

@ -9,6 +9,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AddRemoveValidator do
import Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations
require Pleroma.Constants
require Logger
alias Pleroma.User
@ -27,14 +28,21 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AddRemoveValidator do
end
def cast_and_validate(data) do
{:ok, actor} = User.get_or_fetch_by_ap_id(data["actor"])
with {_, {:ok, actor}} <- {:user, User.get_or_fetch_by_ap_id(data["actor"])},
{_, {:ok, actor}} <- {:feataddr, maybe_refetch_user(actor)} do
data
|> maybe_fix_data_for_mastodon(actor)
|> cast_data()
|> validate_data(actor)
else
{:feataddr, _} ->
{:error,
{:validate,
"Actor doesn't provide featured collection address to verify against: #{data["id"]}"}}
{:ok, actor} = maybe_refetch_user(actor)
data
|> maybe_fix_data_for_mastodon(actor)
|> cast_data()
|> validate_data(actor)
{:user, _} ->
{:error, :link_resolve_failed}
end
end
defp maybe_fix_data_for_mastodon(data, actor) do
@ -73,6 +81,9 @@ defp maybe_refetch_user(%User{featured_address: address} = user) when is_binary(
end
defp maybe_refetch_user(%User{ap_id: ap_id}) do
Pleroma.Web.ActivityPub.Transmogrifier.upgrade_user_from_ap_id(ap_id)
# If the user didn't expose a featured collection before,
# recheck now so we can verify perms for add/remove.
# But wait at least 5s to avoid rapid refetches in edge cases
User.get_or_fetch_by_ap_id(ap_id, maximum_age: 5)
end
end

View file

@ -71,7 +71,7 @@ defp fix_tag(data), do: Map.drop(data, ["tag"])
defp fix_replies(%{"replies" => replies} = data) when is_list(replies), do: data
defp fix_replies(%{"replies" => %{"first" => first}} = data) do
defp fix_replies(%{"replies" => %{"first" => first}} = data) when is_binary(first) do
with {:ok, replies} <- Akkoma.Collections.Fetcher.fetch_collection(first) do
Map.put(data, "replies", replies)
else
@ -81,6 +81,10 @@ defp fix_replies(%{"replies" => %{"first" => first}} = data) do
end
end
defp fix_replies(%{"replies" => %{"first" => %{"items" => replies}}} = data)
when is_list(replies),
do: Map.put(data, "replies", replies)
defp fix_replies(%{"replies" => %{"items" => replies}} = data) when is_list(replies),
do: Map.put(data, "replies", replies)
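# Illustrative sketch, not part of the diff: the "replies" shapes handled by
# the clauses above (values made up):
#
#   %{"replies" => [...]}                                      # already a list, kept as-is
#   %{"replies" => %{"first" => "https://ex.org/r?page=1"}}    # paged; first page is fetched
#   %{"replies" => %{"first" => %{"items" => [...]}}}          # inlined first page
#   %{"replies" => %{"items" => [...]}}                        # inlined collection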

View file

@ -54,10 +54,14 @@ def validate_actor_presence(cng, options \\ []) do
def validate_object_presence(cng, options \\ []) do
field_name = Keyword.get(options, :field_name, :object)
allowed_types = Keyword.get(options, :allowed_types, false)
allowed_categories = Keyword.get(options, :allowed_object_categores, [:object, :activity])
cng
|> validate_change(field_name, fn field_name, object_id ->
object = Object.get_cached_by_ap_id(object_id) || Activity.get_by_ap_id(object_id)
object =
(:object in allowed_categories && Object.get_cached_by_ap_id(object_id)) ||
(:activity in allowed_categories && Activity.get_by_ap_id(object_id)) ||
nil
cond do
!object ->

View file

@ -61,7 +61,10 @@ defp validate_data(cng) do
|> validate_inclusion(:type, ["Delete"])
|> validate_delete_actor(:actor)
|> validate_modification_rights()
|> validate_object_or_user_presence(allowed_types: @deletable_types)
|> validate_object_or_user_presence(
allowed_types: @deletable_types,
allowed_object_categories: [:object]
)
|> add_deleted_activity_id()
end

View file

@ -129,7 +129,7 @@ defp validate_data(data_cng) do
|> validate_inclusion(:type, ["EmojiReact"])
|> validate_required([:id, :type, :object, :actor, :context, :to, :cc, :content])
|> validate_actor_presence()
|> validate_object_presence()
|> validate_object_presence(allowed_object_categories: [:object])
|> validate_emoji()
|> maybe_validate_tag_presence()
end

View file

@ -66,7 +66,7 @@ defp validate_data(data_cng) do
|> validate_inclusion(:type, ["Like"])
|> validate_required([:id, :type, :object, :actor, :context, :to, :cc])
|> validate_actor_presence()
|> validate_object_presence()
|> validate_object_presence(allowed_object_categories: [:object])
|> validate_existing_like()
end

View file

@ -44,7 +44,7 @@ defp validate_data(data_cng) do
|> validate_inclusion(:type, ["Undo"])
|> validate_required([:id, :type, :object, :actor, :to, :cc])
|> validate_undo_actor(:actor)
|> validate_object_presence()
|> validate_object_presence(allowed_object_categories: [:activity])
|> validate_undo_rights()
end

View file

@ -22,7 +22,8 @@ def validate(object, meta)
def validate(%{"type" => type, "id" => _id} = data, meta)
when type in Pleroma.Constants.actor_types() do
with :ok <- validate_inbox(data),
with :ok <- validate_pubkey(data),
:ok <- validate_inbox(data),
:ok <- contain_collection_origin(data) do
{:ok, data, meta}
else
@ -33,6 +34,24 @@ def validate(%{"type" => type, "id" => _id} = data, meta)
def validate(_, _), do: {:error, "Not a user object"}
defp validate_pubkey(%{
"id" => user_id,
"publicKey" => %{"id" => pk_id, "publicKeyPem" => _key}
}) do
with {_, true} <- {:user, is_binary(user_id)},
{_, true} <- {:key, is_binary(pk_id)},
:ok <- Containment.contain_key_user(pk_id, user_id) do
:ok
else
{:user, _} -> {:error, "Invalid user id: #{inspect(user_id)}"}
{:key, _} -> {:error, "Invalid key id: #{inspect(pk_id)}"}
:error -> {:error, "Problematic actor-key pairing: #{user_id} - #{pk_id}"}
end
end
# pubkey is optional atm
defp validate_pubkey(_data), do: :ok
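# Illustrative sketch, not part of the diff: an actor document passing
# validate_pubkey/1 (values made up); key and actor share a domain:
#
#   %{"type" => "Person", "id" => "https://ex.org/users/a",
#     "publicKey" => %{"id" => "https://ex.org/users/a#main-key",
#                      "publicKeyPem" => "-----BEGIN PUBLIC KEY-----\n..."}}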
defp validate_inbox(%{"id" => id, "inbox" => inbox}) do
case Containment.same_origin(id, inbox) do
:ok -> :ok

View file

@ -219,7 +219,6 @@ def publish(%User{} = actor, %{data: %{"bcc" => bcc}} = activity)
inboxes =
recipients
|> Enum.filter(&User.ap_enabled?/1)
|> Enum.map(fn actor -> actor.inbox end)
|> Enum.filter(fn inbox -> should_federate?(inbox) end)
|> Instances.filter_reachable()
@ -261,7 +260,6 @@ def publish(%User{} = actor, %Activity{} = activity) do
json = Jason.encode!(data)
recipients(actor, activity)
|> Enum.filter(fn user -> User.ap_enabled?(user) end)
|> Enum.map(fn %User{} = user ->
determine_inbox(activity, user)
end)

View file

@ -21,7 +21,6 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes
alias Pleroma.Web.Federator
alias Pleroma.Workers.TransmogrifierWorker
import Ecto.Query
@ -30,6 +29,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
@doc """
Modifies an incoming AP object (mastodon format) to our internal format.
(This only deals with non-activity AP objects)
"""
def fix_object(object, options \\ []) do
object
@ -45,6 +45,38 @@ def fix_object(object, options \\ []) do
|> fix_content_map()
|> fix_addressing()
|> fix_summary()
|> fix_history(&fix_object/1)
end
defp maybe_fix_object(%{"attributedTo" => _} = object), do: fix_object(object)
defp maybe_fix_object(object), do: object
defp fix_history(%{"formerRepresentations" => %{"orderedItems" => list}} = obj, fix_fun)
when is_list(list) do
update_in(obj["formerRepresentations"]["orderedItems"], fn h -> Enum.map(h, fix_fun) end)
end
defp fix_history(obj, _), do: obj
defp fix_recursive(obj, fun) do
# unlike Erlang, Elixir does not support recursive inline functions
# which would allow us to avoid reconstructing this on every recursion
rec_fun = fn
obj when is_map(obj) -> fix_recursive(obj, fun)
# there may be simple AP IDs in history (or object field)
obj -> obj
end
obj
|> fun.()
|> fix_history(rec_fun)
|> then(fn
%{"object" => object} = doc when is_map(object) ->
update_in(doc["object"], rec_fun)
apdoc ->
apdoc
end)
end
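# Illustrative sketch, not part of the diff: fix_recursive/2 applies fun to
# the toplevel document, to each formerRepresentations history item, and to a
# nested "object" map, so e.g. a Create wrapping an edited Note is normalised
# at every level:
#
#   %{"type" => "Create",
#     "object" => %{"type" => "Note",
#                   "formerRepresentations" => %{"orderedItems" => [%{"type" => "Note"}]}}}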
def fix_summary(%{"summary" => nil} = object) do
@ -415,17 +447,10 @@ defp get_reported(objects) do
end
def handle_incoming(data, options \\ []) do
data = normalise_addressing_public(data)
data =
if data["object"] != nil do
object = normalise_addressing_public(data["object"])
Map.put(data, "object", object)
else
data
end
handle_incoming_normalised(data, options)
data
|> fix_recursive(&normalise_addressing_public/1)
|> fix_recursive(&strip_internal_fields/1)
|> handle_incoming_normalised(options)
end
defp handle_incoming_normalised(data, options)
@ -491,7 +516,6 @@ defp handle_incoming_normalised(
object =
data["object"]
|> strip_internal_fields()
|> fix_type(fetch_options)
|> fix_in_reply_to(fetch_options)
|> fix_quote_url(fetch_options)
@ -520,10 +544,22 @@ defp handle_incoming_normalised(
defp handle_incoming_normalised(%{"type" => type} = data, _options)
when type in ~w{Like EmojiReact Announce Add Remove} do
with :ok <- ObjectValidator.fetch_actor_and_object(data),
with {_, :ok} <- {:link, ObjectValidator.fetch_actor_and_object(data)},
{:ok, activity, _meta} <- Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
else
{:link, {:error, :ignore}} ->
{:error, :ignore}
{:link, {:error, {:validate, _}} = e} ->
e
{:link, {:error, {:reject, _}} = e} ->
e
{:link, _} ->
{:error, :link_resolve_failed}
e ->
{:error, e}
end
@ -534,6 +570,9 @@ defp handle_incoming_normalised(
_options
)
when type in ~w{Update Block Follow Accept Reject} do
fixed_obj = maybe_fix_object(data["object"])
data = if fixed_obj != nil, do: %{data | "object" => fixed_obj}, else: data
with {:ok, %User{}} <- ObjectValidator.fetch_actor(data),
{:ok, activity, _} <-
Pipeline.common_pipeline(data, local: false) do
@ -545,22 +584,45 @@ defp handle_incoming_normalised(
%{"type" => "Delete"} = data,
_options
) do
with {:ok, activity, _} <-
Pipeline.common_pipeline(data, local: false) do
oid_result = ObjectValidators.ObjectID.cast(data["object"])
with {_, {:ok, object_id}} <- {:object_id, oid_result},
object <- Object.get_cached_by_ap_id(object_id),
{_, false} <- {:tombstone, Object.tombstone_object?(object) && !data["actor"]},
{:ok, activity, _} <- Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
else
{:error, {:validate, _}} = e ->
# Check if we have a create activity for this
with {:ok, object_id} <- ObjectValidators.ObjectID.cast(data["object"]),
%Activity{data: %{"actor" => actor}} <-
Activity.create_by_object_ap_id(object_id) |> Repo.one(),
# We have one, insert a tombstone and retry
{:ok, tombstone_data, _} <- Builder.tombstone(actor, object_id),
{:ok, _tombstone} <- Object.create(tombstone_data) do
handle_incoming(data)
{:object_id, _} ->
{:error, {:validate, "Invalid object id: #{data["object"]}"}}
{:tombstone, true} ->
{:error, :ignore}
{:error, {:validate, {:error, %Ecto.Changeset{errors: errors}}}} = e ->
if errors[:object] == {"can't find object", []} do
# Check if we have a create activity for this
# (e.g. from a db prune without --prune-activities)
# We'd still like to process side effects so insert a fake tombstone and retry
# (real tombstones from Object.delete do not have an actor field)
with {:ok, object_id} <- ObjectValidators.ObjectID.cast(data["object"]),
{_, %Activity{data: %{"actor" => actor}}} <-
{:create, Activity.create_by_object_ap_id(object_id) |> Repo.one()},
{:ok, tombstone_data, _} <- Builder.tombstone(actor, object_id),
{:ok, _tombstone} <- Object.create(tombstone_data) do
handle_incoming(data)
else
{:create, _} -> {:error, :ignore}
_ -> e
end
else
_ -> e
e
end
{:error, _} = e ->
e
e ->
{:error, e}
end
end
@ -593,6 +655,20 @@ defp handle_incoming_normalised(
when type in ["Like", "EmojiReact", "Announce", "Block"] do
with {:ok, activity, _} <- Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
else
{:error, {:validate, {:error, %Ecto.Changeset{errors: errors}}}} = e ->
# If we never saw the activity being undone, no need to do anything.
# Inspecting the validation error content is a bit awkward, but looking up the Activity
# ahead of time here would be too costly since Activity queries are not cached
# and there's no way atm to pass the retrieved result along
if errors[:object] == {"can't find object", []} do
{:error, :ignore}
else
e
end
e ->
e
end
end
@ -995,6 +1071,8 @@ def prepare_attachments(object) do
Map.put(object, "attachment", attachments)
end
# for outgoing docs, immediately stripping internal fields recursively breaks later emoji transformations
# (XXX: it would be better to reorder operations so we can always use recursive stripping)
def strip_internal_fields(object) do
Map.drop(object, Pleroma.Constants.object_internal_fields())
end
@ -1007,47 +1085,6 @@ defp strip_internal_tags(%{"tag" => tags} = object) do
defp strip_internal_tags(object), do: object
def perform(:user_upgrade, user) do
# we pass a fake user so that the followers collection is stripped away
old_follower_address = User.ap_followers(%User{nickname: user.nickname})
from(
a in Activity,
where: ^old_follower_address in a.recipients,
update: [
set: [
recipients:
fragment(
"array_replace(?,?,?)",
a.recipients,
^old_follower_address,
^user.follower_address
)
]
]
)
|> Repo.update_all([])
end
def upgrade_user_from_ap_id(ap_id) do
with %User{local: false} = user <- User.get_cached_by_ap_id(ap_id),
{:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id),
{:ok, user} <- update_user(user, data) do
ActivityPub.enqueue_pin_fetches(user)
TransmogrifierWorker.enqueue("user_upgrade", %{"user_id" => user.id})
{:ok, user}
else
%User{} = user -> {:ok, user}
e -> e
end
end
defp update_user(user, data) do
user
|> User.remote_user_changeset(data)
|> User.update_and_set_cache()
end
def maybe_fix_user_url(%{"url" => url} = data) when is_map(url) do
Map.put(data, "url", url["href"])
end

View file

@ -6,7 +6,6 @@ defmodule Pleroma.Web.Federator do
alias Pleroma.Activity
alias Pleroma.Object.Containment
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.Federator.Publisher
@ -92,10 +91,10 @@ def perform(:incoming_ap_doc, params) do
# NOTE: we use the actor ID to do the containment, this is fine because an
# actor shouldn't be acting on objects outside their own AP server.
with {_, {:ok, _user}} <- {:actor, ap_enabled_actor(actor)},
nil <- Activity.normalize(params["id"]),
with nil <- Activity.normalize(params["id"]),
{_, :ok} <-
{:correct_origin?, Containment.contain_origin_from_id(actor, params)},
{_, :ok, _} <- {:local, Containment.contain_local_fetch(actor), actor},
{:ok, activity} <- Transmogrifier.handle_incoming(params) do
{:ok, activity}
else
@ -103,6 +102,13 @@ def perform(:incoming_ap_doc, params) do
Logger.debug("Origin containment failure for #{params["id"]}")
{:error, :origin_containment_failed}
{:local, _, actor} ->
Logger.alert(
"Received incoming AP doc with valid signature for local actor #{actor}! Likely key leak!\n#{inspect(params)}"
)
{:error, :origin_containment_failed}
%Activity{} ->
Logger.debug("Already had #{params["id"]}")
{:error, :already_present}
@ -119,17 +125,11 @@ def perform(:incoming_ap_doc, params) do
e ->
# Just drop those for now
Logger.debug(fn -> "Unhandled activity\n" <> Jason.encode!(params, pretty: true) end)
{:error, e}
end
end
def ap_enabled_actor(id) do
user = User.get_cached_by_ap_id(id)
if User.ap_enabled?(user) do
{:ok, user}
else
ActivityPub.make_user_from_ap_id(id)
case e do
{:error, _} -> e
_ -> {:error, e}
end
end
end
end

View file

@ -617,7 +617,9 @@ def render("context.json", %{activity: activity, activities: activities, user: u
%{ancestors: ancestors, descendants: descendants} =
activities
|> Enum.reverse()
|> Enum.group_by(fn %{id: id} -> if id < activity.id, do: :ancestors, else: :descendants end)
|> Enum.group_by(fn %{id: id} ->
if id < activity.id, do: :ancestors, else: :descendants
end)
|> Map.put_new(:ancestors, [])
|> Map.put_new(:descendants, [])

View file

@ -8,8 +8,8 @@ defmodule Pleroma.Web.Plugs.HTTPSignaturePlug do
use Pleroma.Web, :verified_routes
alias Pleroma.Activity
alias Pleroma.Signature
alias Pleroma.Instances
alias Pleroma.User.SigningKey
require Logger
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
@ -77,10 +77,6 @@ defp assign_valid_signature_on_route_aliases(conn, [path | rest]) do
|> put_req_header("(request-target)", request_target)
|> maybe_put_created_psudoheader()
|> maybe_put_expires_psudoheader()
|> case do
%{assigns: %{digest: digest}} = conn -> put_req_header(conn, "digest", digest)
conn -> conn
end
conn
|> assign(:valid_signature, HTTPSignatures.validate_conn(conn))
@ -93,7 +89,13 @@ defp maybe_assign_valid_signature(conn) do
# set (request-target) header to the appropriate value
# we also replace the digest header with the one we computed
possible_paths =
route_aliases(conn) ++ [conn.request_path, conn.request_path <> "?#{conn.query_string}"]
[conn.request_path, conn.request_path <> "?#{conn.query_string}" | route_aliases(conn)]
conn =
case conn do
%{assigns: %{digest: digest}} = conn -> put_req_header(conn, "digest", digest)
conn -> conn
end
assign_valid_signature_on_route_aliases(conn, possible_paths)
else
@ -140,15 +142,17 @@ defp maybe_require_signature(conn), do: conn
defp signature_host(conn) do
with {:key_id, %{"keyId" => kid}} <- {:key_id, HTTPSignatures.signature_for_conn(conn)},
{:actor_id, {:ok, actor_id}} <- {:actor_id, Signature.key_id_to_actor_id(kid)} do
{:actor_id, actor_id, _} when actor_id != nil <-
{:actor_id, SigningKey.key_id_to_ap_id(kid), kid} do
actor_id
else
{:key_id, e} ->
Logger.error("Failed to extract key_id from signature: #{inspect(e)}")
nil
{:actor_id, e} ->
Logger.error("Failed to extract actor_id from signature: #{inspect(e)}")
{:actor_id, _, kid} ->
# SigningKeys SHOULD have been fetched before this gets called!
Logger.error("Failed to extract actor_id from signature: signing key #{kid} not known")
nil
end
end

View file

@ -62,10 +62,10 @@ defp call_static(%{request_path: request_path} = conn, opts, from) do
opts =
opts
|> Map.put(:from, from)
|> Map.put(:set_content_type, false)
|> Map.put(:content_types, false)
conn
|> set_static_content_type(request_path)
|> Pleroma.Web.Plugs.StaticNoCT.call(opts)
|> Plug.Static.call(opts)
end
end

View file

@ -1,469 +0,0 @@
# This is almost identical to Plug.Static from Plug 1.15.3 (2024-01-16)
# It being copied is a temporary measure to fix an urgent bug without
# needing to wait for merge of a suitable patch upstream
# The differences are:
# - this leading comment
# - renaming of the module from 'Plug.Static' to 'Pleroma.Web.Plugs.StaticNoCT'
# - addition of the set_content_type option
defmodule Pleroma.Web.Plugs.StaticNoCT do
@moduledoc """
A plug for serving static assets.
It requires two options:
* `:at` - the request path to reach for static assets.
It must be a string.
* `:from` - the file system path to read static assets from.
It can be either: a string containing a file system path, an
atom representing the application name (where assets will
be served from `priv/static`), a tuple containing the
application name and the directory to serve assets from (besides
`priv/static`), or an MFA tuple.
The preferred form is to use `:from` with an atom or tuple, since
it will make your application independent from the starting directory.
For example, if you pass:
plug Plug.Static, from: "priv/app/path"
Plug.Static will be unable to serve assets if you build releases
or if you change the current directory. Instead do:
plug Plug.Static, from: {:app_name, "priv/app/path"}
If a static asset cannot be found, `Plug.Static` simply forwards
the connection to the rest of the pipeline.
## Cache mechanisms
`Plug.Static` uses etags for HTTP caching. This means browsers/clients
should cache assets on the first request and validate the cache on
following requests, not downloading the static asset once again if it
has not changed. The cache-control for etags is specified by the
`cache_control_for_etags` option and defaults to `"public"`.
However, `Plug.Static` also supports direct cache control by using
versioned query strings. If the request query string starts with
"?vsn=", `Plug.Static` assumes the application is versioning assets
and does not set the `ETag` header, meaning the cache behaviour will
be specified solely by the `cache_control_for_vsn_requests` config,
which defaults to `"public, max-age=31536000"`.
## Options
* `:encodings` - list of 2-ary tuples where first value is value of
the `Accept-Encoding` header and second is extension of the file to
be served if given encoding is accepted by client. Entries will be tested
in order in list, so entries higher in list will be preferred. Defaults
to: `[]`.
In addition to setting this value directly it supports 2 additional
options for compatibility reasons:
+ `:brotli` - will append `{"br", ".br"}` to the encodings list.
+ `:gzip` - will append `{"gzip", ".gz"}` to the encodings list.
Additional options will be added in the above order (Brotli takes
preference over Gzip) to reflect older behaviour which was set due
to the fact that Brotli in general provides a better compression ratio than
Gzip.
* `:cache_control_for_etags` - sets the cache header for requests
that use etags. Defaults to `"public"`.
* `:etag_generation` - specify a `{module, function, args}` to be used
to generate an etag. The `path` of the resource will be passed to
the function, as well as the `args`. If this option is not supplied,
etags will be generated based off of file size and modification time.
Note it is [recommended for the etag value to be quoted](https://tools.ietf.org/html/rfc7232#section-2.3),
which Plug won't do automatically.
* `:cache_control_for_vsn_requests` - sets the cache header for
requests starting with "?vsn=" in the query string. Defaults to
`"public, max-age=31536000"`.
* `:only` - filters which requests to serve. This is useful to avoid
file system access on every request when this plug is mounted
at `"/"`. For example, if `only: ["images", "favicon.ico"]` is
specified, only files in the "images" directory and the
"favicon.ico" file will be served by `Plug.Static`.
Note that `Plug.Static` matches these filters against request
uri and not against the filesystem. When requesting
a file with name containing non-ascii or special characters,
you should use urlencoded form. For example, you should write
`only: ["file%20name"]` instead of `only: ["file name"]`.
Defaults to `nil` (no filtering).
* `:only_matching` - a relaxed version of `:only` that will
serve any request as long as one of the given values matches the
given path. For example, `only_matching: ["images", "favicon"]`
will match any request that starts at "images" or "favicon",
be it "/images/foo.png", "/images-high/foo.png", "/favicon.ico"
or "/favicon-high.ico". Such matches are useful when serving
digested files at the root. Defaults to `nil` (no filtering).
* `:headers` - other headers to be set when serving static assets. Specify either
an enum of key-value pairs or a `{module, function, args}` to return an enum. The
`conn` will be passed to the function, as well as the `args`.
* `:content_types` - custom MIME type mapping. As a map with filename as key
and content type as value. For example:
`content_types: %{"apple-app-site-association" => "application/json"}`.
* `:set_content_type` - by default Plug.Static (re)sets the content type header
using auto-detection and the `:content_types` map. But when set to `false`
no content-type header will be inserted instead retaining the original
value or lack thereof. This can be useful when custom logic for appropriate
content types is needed which cannot be reasonably expressed as a static
filename map.
## Examples
This plug can be mounted in a `Plug.Builder` pipeline as follows:
defmodule MyPlug do
use Plug.Builder
plug Plug.Static,
at: "/public",
from: :my_app,
only: ~w(images robots.txt)
plug :not_found
def not_found(conn, _) do
send_resp(conn, 404, "not found")
end
end
"""
@behaviour Plug
@allowed_methods ~w(GET HEAD)
import Plug.Conn
alias Plug.Conn
# In this module, the `:prim_file` Erlang module along with the `:file_info`
# record are used instead of the more common and Elixir-y `File` module and
# `File.Stat` struct, respectively. The reason behind this is performance: all
# the `File` operations pass through a single process in order to support node
# operations that we simply don't need when serving assets.
require Record
Record.defrecordp(:file_info, Record.extract(:file_info, from_lib: "kernel/include/file.hrl"))
defmodule InvalidPathError do
defexception message: "invalid path for static asset", plug_status: 400
end
@impl true
def init(opts) do
from =
case Keyword.fetch!(opts, :from) do
{_, _} = from -> from
{_, _, _} = from -> from
from when is_atom(from) -> {from, "priv/static"}
from when is_binary(from) -> from
_ -> raise ArgumentError, ":from must be an atom, a binary or a tuple"
end
encodings =
opts
|> Keyword.get(:encodings, [])
|> maybe_add("br", ".br", Keyword.get(opts, :brotli, false))
|> maybe_add("gzip", ".gz", Keyword.get(opts, :gzip, false))
%{
encodings: encodings,
only_rules: {Keyword.get(opts, :only, []), Keyword.get(opts, :only_matching, [])},
qs_cache: Keyword.get(opts, :cache_control_for_vsn_requests, "public, max-age=31536000"),
et_cache: Keyword.get(opts, :cache_control_for_etags, "public"),
et_generation: Keyword.get(opts, :etag_generation, nil),
headers: Keyword.get(opts, :headers, %{}),
content_types: Keyword.get(opts, :content_types, %{}),
set_content_type: Keyword.get(opts, :set_content_type, true),
from: from,
at: opts |> Keyword.fetch!(:at) |> Plug.Router.Utils.split()
}
end
@impl true
def call(
conn = %Conn{method: meth},
%{at: at, only_rules: only_rules, from: from, encodings: encodings} = options
)
when meth in @allowed_methods do
segments = subset(at, conn.path_info)
if allowed?(only_rules, segments) do
segments = Enum.map(segments, &uri_decode/1)
if invalid_path?(segments) do
raise InvalidPathError, "invalid path for static asset: #{conn.request_path}"
end
path = path(from, segments)
range = get_req_header(conn, "range")
encoding = file_encoding(conn, path, range, encodings)
serve_static(encoding, conn, segments, range, options)
else
conn
end
end
def call(conn, _options) do
conn
end
defp uri_decode(path) do
# TODO: Remove rescue as this can't fail from Elixir v1.13
try do
URI.decode(path)
rescue
ArgumentError ->
raise InvalidPathError
end
end
defp allowed?(_only_rules, []), do: false
defp allowed?({[], []}, _list), do: true
defp allowed?({full, prefix}, [h | _]) do
h in full or (prefix != [] and match?({0, _}, :binary.match(h, prefix)))
end
defp maybe_put_content_type(conn, false, _, _), do: conn
defp maybe_put_content_type(conn, _, types, filename) do
content_type = Map.get(types, filename) || MIME.from_path(filename)
conn
|> put_resp_header("content-type", content_type)
end
defp serve_static({content_encoding, file_info, path}, conn, segments, range, options) do
%{
qs_cache: qs_cache,
et_cache: et_cache,
et_generation: et_generation,
headers: headers,
content_types: types,
set_content_type: set_content_type
} = options
case put_cache_header(conn, qs_cache, et_cache, et_generation, file_info, path) do
{:stale, conn} ->
filename = List.last(segments)
conn
|> maybe_put_content_type(set_content_type, types, filename)
|> put_resp_header("accept-ranges", "bytes")
|> maybe_add_encoding(content_encoding)
|> merge_headers(headers)
|> serve_range(file_info, path, range, options)
{:fresh, conn} ->
conn
|> maybe_add_vary(options)
|> send_resp(304, "")
|> halt()
end
end
defp serve_static(:error, conn, _segments, _range, _options) do
conn
end
defp serve_range(conn, file_info, path, [range], options) do
file_info(size: file_size) = file_info
with %{"bytes" => bytes} <- Plug.Conn.Utils.params(range),
{range_start, range_end} <- start_and_end(bytes, file_size) do
send_range(conn, path, range_start, range_end, file_size, options)
else
_ -> send_entire_file(conn, path, options)
end
end
defp serve_range(conn, _file_info, path, _range, options) do
send_entire_file(conn, path, options)
end
defp start_and_end("-" <> rest, file_size) do
case Integer.parse(rest) do
{last, ""} when last > 0 and last <= file_size -> {file_size - last, file_size - 1}
_ -> :error
end
end
defp start_and_end(range, file_size) do
case Integer.parse(range) do
{first, "-"} when first >= 0 ->
{first, file_size - 1}
{first, "-" <> rest} when first >= 0 ->
case Integer.parse(rest) do
{last, ""} when last >= first -> {first, min(last, file_size - 1)}
_ -> :error
end
_ ->
:error
end
end
defp send_range(conn, path, 0, range_end, file_size, options) when range_end == file_size - 1 do
send_entire_file(conn, path, options)
end
defp send_range(conn, path, range_start, range_end, file_size, _options) do
length = range_end - range_start + 1
conn
|> put_resp_header("content-range", "bytes #{range_start}-#{range_end}/#{file_size}")
|> send_file(206, path, range_start, length)
|> halt()
end
defp send_entire_file(conn, path, options) do
conn
|> maybe_add_vary(options)
|> send_file(200, path)
|> halt()
end
defp maybe_add_encoding(conn, nil), do: conn
defp maybe_add_encoding(conn, ce), do: put_resp_header(conn, "content-encoding", ce)
defp maybe_add_vary(conn, %{encodings: encodings}) do
# If we serve gzip or brotli at any moment, we need to set the proper vary
# header regardless of whether we are serving gzip content right now.
# See: http://www.fastly.com/blog/best-practices-for-using-the-vary-header/
if encodings != [] do
update_in(conn.resp_headers, &[{"vary", "Accept-Encoding"} | &1])
else
conn
end
end
defp put_cache_header(
%Conn{query_string: "vsn=" <> _} = conn,
qs_cache,
_et_cache,
_et_generation,
_file_info,
_path
)
when is_binary(qs_cache) do
{:stale, put_resp_header(conn, "cache-control", qs_cache)}
end
defp put_cache_header(conn, _qs_cache, et_cache, et_generation, file_info, path)
when is_binary(et_cache) do
etag = etag_for_path(file_info, et_generation, path)
conn =
conn
|> put_resp_header("cache-control", et_cache)
|> put_resp_header("etag", etag)
if etag in get_req_header(conn, "if-none-match") do
{:fresh, conn}
else
{:stale, conn}
end
end
defp put_cache_header(conn, _, _, _, _, _) do
{:stale, conn}
end
defp etag_for_path(file_info, et_generation, path) do
case et_generation do
{module, function, args} ->
apply(module, function, [path | args])
nil ->
file_info(size: size, mtime: mtime) = file_info
<<?", {size, mtime} |> :erlang.phash2() |> Integer.to_string(16)::binary, ?">>
end
end
defp file_encoding(conn, path, [_range], _encodings) do
# We do not support compression for range queries.
file_encoding(conn, path, nil, [])
end
defp file_encoding(conn, path, _range, encodings) do
encoded =
Enum.find_value(encodings, fn {encoding, ext} ->
if file_info = accept_encoding?(conn, encoding) && regular_file_info(path <> ext) do
{encoding, file_info, path <> ext}
end
end)
cond do
not is_nil(encoded) ->
encoded
file_info = regular_file_info(path) ->
{nil, file_info, path}
true ->
:error
end
end
defp regular_file_info(path) do
case :prim_file.read_file_info(path) do
{:ok, file_info(type: :regular) = file_info} ->
file_info
_ ->
nil
end
end
defp accept_encoding?(conn, encoding) do
encoding? = &String.contains?(&1, [encoding, "*"])
Enum.any?(get_req_header(conn, "accept-encoding"), fn accept ->
accept |> Plug.Conn.Utils.list() |> Enum.any?(encoding?)
end)
end
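A quick sketch of the negotiation above: each accept-encoding header is split with Plug.Conn.Utils.list/1, and an entry matches if it contains either the literal encoding name or "*":
Plug.Conn.Utils.list("gzip, br;q=1.0")    #=> ["gzip", "br;q=1.0"]
String.contains?("br;q=1.0", ["br", "*"]) #=> true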
defp maybe_add(list, key, value, true), do: list ++ [{key, value}]
defp maybe_add(list, _key, _value, false), do: list
defp path({module, function, arguments}, segments)
when is_atom(module) and is_atom(function) and is_list(arguments),
do: Enum.join([apply(module, function, arguments) | segments], "/")
defp path({app, from}, segments) when is_atom(app) and is_binary(from),
do: Enum.join([Application.app_dir(app), from | segments], "/")
defp path(from, segments),
do: Enum.join([from | segments], "/")
defp subset([h | expected], [h | actual]), do: subset(expected, actual)
defp subset([], actual), do: actual
defp subset(_, _), do: []
defp invalid_path?(list) do
invalid_path?(list, :binary.compile_pattern(["/", "\\", ":", "\0"]))
end
defp invalid_path?([h | _], _match) when h in [".", "..", ""], do: true
defp invalid_path?([h | t], match), do: String.contains?(h, match) or invalid_path?(t, match)
defp invalid_path?([], _match), do: false
defp merge_headers(conn, {module, function, args}) do
merge_headers(conn, apply(module, function, [conn | args]))
end
defp merge_headers(conn, headers) do
merge_resp_headers(conn, headers)
end
end

View file

@ -88,12 +88,12 @@ defp get_media(conn, {:static_dir, directory}, opts) do
Map.get(opts, :static_plug_opts)
|> Map.put(:at, [@path])
|> Map.put(:from, directory)
|> Map.put(:set_content_type, false)
|> Map.put(:content_types, false)
conn =
conn
|> set_content_type(opts, conn.request_path)
|> Pleroma.Web.Plugs.StaticNoCT.call(static_opts)
|> Plug.Static.call(static_opts)
if conn.halted do
conn
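The one-letter rename above (:set_content_type → :content_types) is what re-enables Content-Type sanitisation for uploads; a hypothetical regression test (names and the exact fallback type are assumptions, not from this diff) could look like:
test "does not serve uploads with an active Content-Type" do
  # payload.js is a hypothetical upload; the plug must not echo back
  # its scripting MIME type
  conn = get(build_conn(), "/media/payload.js")
  refute get_resp_header(conn, "content-type") == ["application/javascript"]
end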

View file

@ -57,6 +57,10 @@ def run(%{"url" => url, "url_hash" => url_hash} = args) do
Logger.debug("Rich media error for #{url}: :content_type is #{type}")
negative_cache(url_hash, :timer.minutes(30))
{:error, {:url, reason}} ->
Logger.debug("Rich media error for #{url}: refusing URL #{inspect(reason)}")
negative_cache(url_hash, :timer.minutes(180))
e ->
Logger.debug("Rich media error for #{url}: #{inspect(e)}")
{:error, e}
@ -82,7 +86,7 @@ defp maybe_schedule_expiration(url, fields) do
end
defp stream_update(%{"activity_id" => activity_id}) do
Logger.info("Rich media backfill: streaming update for activity #{activity_id}")
Logger.debug("Rich media backfill: streaming update for activity #{activity_id}")
Pleroma.Activity.get_by_id(activity_id)
|> Pleroma.Activity.normalize()
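With the clause added above, refusals from URL validation are negative-cached for 180 minutes, versus 30 minutes for bad content types. Roughly, from a caller's perspective (module name assumed; handle_refusal/1 is a placeholder):
case Pleroma.Web.RichMedia.Parser.parse(url) do
  {:ok, data} ->
    data
  # refused by validate_page_url/1 before any fetch happens; the worker
  # above caches these failures much longer instead of refetching
  {:error, {:url, reason}} ->
    handle_refusal(reason)
end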

View file

@ -16,12 +16,13 @@ def parse(nil), do: nil
@spec parse(String.t()) :: {:ok, map()} | {:error, any()}
def parse(url) do
with {_, true} <- {:config, @config_impl.get([:rich_media, :enabled])},
:ok <- validate_page_url(url),
{_, :ok} <- {:url, validate_page_url(url)},
{:ok, data} <- parse_url(url) do
data = Map.put(data, "url", url)
{:ok, data}
else
{:config, _} -> {:error, :rich_media_disabled}
{:url, {:error, reason}} -> {:error, {:url, reason}}
e -> e
end
end
@ -62,7 +63,7 @@ defp clean_parsed_data(data) do
|> Map.new()
end
@spec validate_page_url(URI.t() | binary()) :: :ok | :error
@spec validate_page_url(URI.t() | binary()) :: :ok | {:error, term()}
defp validate_page_url(page_url) when is_binary(page_url) do
validate_tld = @config_impl.get([Pleroma.Formatter, :validate_tld])
@ -74,20 +75,20 @@ defp validate_page_url(page_url) when is_binary(page_url) do
defp validate_page_url(%URI{host: host, scheme: "https"}) do
cond do
Linkify.Parser.ip?(host) ->
:error
{:error, :ip}
host in @config_impl.get([:rich_media, :ignore_hosts], []) ->
:error
{:error, :ignore_hosts}
get_tld(host) in @config_impl.get([:rich_media, :ignore_tld], []) ->
:error
{:error, :ignore_tld}
true ->
:ok
end
end
defp validate_page_url(_), do: :error
defp validate_page_url(_), do: {:error, "scheme mismatch"}
defp parse_uri(true, url) do
url
@ -95,7 +96,7 @@ defp parse_uri(true, url) do
|> validate_page_url
end
defp parse_uri(_, _), do: :error
defp parse_uri(_, _), do: {:error, "not a URL"}
defp get_tld(host) do
host
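The tagged tuples let callers distinguish why a URL was refused; sketching the clauses shown above:
validate_page_url(%URI{scheme: "https", host: "127.0.0.1"})
#=> {:error, :ip}
validate_page_url(%URI{scheme: "https", host: "blocked.example"})
#=> {:error, :ignore_hosts} (when the host is in :ignore_hosts)
validate_page_url(%URI{scheme: "http", host: "example.com"})
#=> {:error, "scheme mismatch"} (only https passes)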

View file

@ -5,6 +5,7 @@
defmodule Pleroma.Web.Router do
use Pleroma.Web, :router
import Phoenix.LiveDashboard.Router
import Oban.Web.Router
pipeline :accepts_html do
plug(:accepts, ["html"])
@ -808,9 +809,6 @@ defmodule Pleroma.Web.Router do
pipe_through([:activitypub_client])
get("/users/:nickname/inbox", ActivityPubController, :read_inbox)
get("/users/:nickname/outbox", ActivityPubController, :outbox)
get("/users/:nickname/collections/featured", ActivityPubController, :pinned)
end
scope "/", Pleroma.Web.ActivityPub do
@ -825,7 +823,9 @@ defmodule Pleroma.Web.Router do
scope "/", Pleroma.Web.ActivityPub do
pipe_through(:activitypub)
post("/inbox", ActivityPubController, :inbox)
get("/users/:nickname/outbox", ActivityPubController, :outbox)
post("/users/:nickname/inbox", ActivityPubController, :inbox)
get("/users/:nickname/collections/featured", ActivityPubController, :pinned)
end
scope "/relay", Pleroma.Web.ActivityPub do
@ -903,6 +903,8 @@ defmodule Pleroma.Web.Router do
metrics: {Pleroma.Web.Telemetry, :live_dashboard_metrics},
csp_nonce_assign_key: :csp_nonce
)
oban_dashboard("/akkoma/oban", csp_nonce_assign_key: :csp_nonce)
end
# Test-only routes needed to test action dispatching and plug chain execution
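For context, the new import Oban.Web.Router is what provides the oban_dashboard/2 macro used above; a minimal standalone mount looks like this (auth pipeline omitted; the real router guards this scope):
import Oban.Web.Router

scope "/" do
  pipe_through(:browser) # plus whatever admin-only checks the instance uses
  oban_dashboard("/akkoma/oban", csp_nonce_assign_key: :csp_nonce)
end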

View file

@ -208,8 +208,10 @@ defp summary_fallback_metrics(byte_unit \\ :byte) do
dist_metrics ++ vm_metrics
end
defp common_metrics do
defp common_metrics(byte_unit \\ :byte) do
[
last_value("vm.portio.in.total", unit: {:byte, byte_unit}),
last_value("vm.portio.out.total", unit: {:byte, byte_unit}),
last_value("pleroma.local_users.total"),
last_value("pleroma.domains.total"),
last_value("pleroma.local_statuses.total"),
@ -220,14 +222,22 @@ defp common_metrics do
def prometheus_metrics,
do: common_metrics() ++ distribution_metrics() ++ summary_fallback_metrics()
def live_dashboard_metrics, do: common_metrics() ++ summary_metrics(:megabyte)
def live_dashboard_metrics, do: common_metrics(:megabyte) ++ summary_metrics(:megabyte)
defp periodic_measurements do
[
{__MODULE__, :io_stats, []},
{__MODULE__, :instance_stats, []}
]
end
def io_stats do
# All IO done via erlang ports, i.e. mostly network but also e.g. fasthtml_workers. NOT disk IO!
{{:input, input}, {:output, output}} = :erlang.statistics(:io)
:telemetry.execute([:vm, :portio, :in], %{total: input}, %{})
:telemetry.execute([:vm, :portio, :out], %{total: output}, %{})
end
def instance_stats do
stats = Stats.get_stats()
:telemetry.execute([:pleroma, :local_users], %{total: stats.user_count}, %{})
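The two new counters come straight from the VM: :erlang.statistics(:io) returns cumulative byte totals through all Erlang ports since node start, which io_stats/0 then re-emits as telemetry:
{{:input, in_bytes}, {:output, out_bytes}} = :erlang.statistics(:io)
# both totals are monotonically increasing; the {:byte, byte_unit} option
# on last_value/2 only rescales them for display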

View file

@ -11,7 +11,4 @@
<%= if User.banner_url(@user) do %>
<link rel="header" href="<%= User.banner_url(@user) %>"/>
<% end %>
<%= if @user.local do %>
<ap_enabled>true</ap_enabled>
<% end %>
</author>

View file

@ -11,7 +11,4 @@
<%= if User.banner_url(@user) do %>
<link rel="header"><%= User.banner_url(@user) %></link>
<% end %>
<%= if @user.local do %>
<ap_enabled>true</ap_enabled>
<% end %>
</managingEditor>

View file

@ -8,9 +8,6 @@
<%= if User.banner_url(@actor) do %>
<link rel="header" href="<%= User.banner_url(@actor) %>"/>
<% end %>
<%= if @actor.local do %>
<ap_enabled>true</ap_enabled>
<% end %>
<poco:preferredUsername><%= @actor.nickname %></poco:preferredUsername>
<poco:displayName><%= @actor.name %></poco:displayName>

View file

@ -5,7 +5,7 @@
<meta charset="utf-8" />
<meta content="width=device-width, initial-scale=1" name="viewport" />
<title>
<%= Config.get([:instance, :name]) %>
{Config.get([:instance, :name])}
</title>
<link rel="icon" type="image/png" href="/favicon.png" />
<link rel="manifest" type="applicaton/manifest+json" {%{href: ~p"/web/manifest.json"}} />

View file

@ -5,7 +5,7 @@
<meta charset="utf-8" />
<meta content="width=device-width, initial-scale=1" name="viewport" />
<title>
<%= Config.get([:instance, :name]) %>
{Config.get([:instance, :name])}
</title>
<link rel="icon" type="image/png" href="/favicon.png" />
<link rel="manifest" type="applicaton/manifest+json" {%{href: ~p"/web/manifest.json"}} />

View file

@ -1,4 +1,4 @@
<a class="attachment" href="<%= @url %>" alt=<%= @name %>" title="<%= @name %>">
<a class="attachment" href="<%= @url %>" alt="<%= @name %>" title="<%= @name %>">
<%= if @nsfw do %>
<div class="nsfw-banner">
<div><%= gettext("Hover to show content") %></div>

View file

@ -6,7 +6,9 @@ defmodule Pleroma.Workers.MailerWorker do
use Pleroma.Workers.WorkerHelper, queue: "mailer"
@impl Oban.Worker
def perform(%Job{args: %{"op" => "email", "encoded_email" => encoded_email, "config" => config}}) do
def perform(%Job{
args: %{"op" => "email", "encoded_email" => encoded_email, "config" => config}
}) do
encoded_email
|> Base.decode64!()
|> :erlang.binary_to_term()

View file

@ -6,7 +6,9 @@ defmodule Pleroma.Workers.MuteExpireWorker do
use Pleroma.Workers.WorkerHelper, queue: "mute_expire"
@impl Oban.Worker
def perform(%Job{args: %{"op" => "unmute_user", "muter_id" => muter_id, "mutee_id" => mutee_id}}) do
def perform(%Job{
args: %{"op" => "unmute_user", "muter_id" => muter_id, "mutee_id" => mutee_id}
}) do
Pleroma.User.unmute(muter_id, mutee_id)
:ok
end

View file

@ -1,9 +1,30 @@
defmodule Pleroma.Workers.NodeInfoFetcherWorker do
use Pleroma.Workers.WorkerHelper, queue: "nodeinfo_fetcher"
use Pleroma.Workers.WorkerHelper,
queue: "nodeinfo_fetcher",
unique: [
keys: [:op, :source_url],
# old jobs still get pruned after a short while
period: :infinity,
states: Oban.Job.states()
]
alias Oban.Job
alias Pleroma.Instances.Instance
def enqueue(op, %{"source_url" => ap_id} = params, worker_args) do
# reduce to base url to avoid enqueueing unnecessary duplicates
domain =
ap_id
|> URI.parse()
|> URI.merge("/")
if Instance.needs_update(domain) do
do_enqueue(op, %{params | "source_url" => URI.to_string(domain)}, worker_args)
else
:ok
end
end
@impl Oban.Worker
def perform(%Job{
args: %{"op" => "process", "source_url" => domain}

View file

@ -13,7 +13,9 @@ def backoff(%Job{attempt: attempt}) when is_integer(attempt) do
end
@impl Oban.Worker
def perform(%Job{args: %{"op" => "publish", "activity_id" => activity_id, "object_data" => nil}}) do
def perform(%Job{
args: %{"op" => "publish", "activity_id" => activity_id, "object_data" => nil}
}) do
activity = Activity.get_by_id(activity_id)
Federator.perform(:publish, activity)
end

View file

@ -3,6 +3,8 @@
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.ReceiverWorker do
require Logger
alias Pleroma.Web.Federator
use Pleroma.Workers.WorkerHelper, queue: "federator_incoming"
@ -12,10 +14,49 @@ def perform(%Job{args: %{"op" => "incoming_ap_doc", "params" => params}}) do
with {:ok, res} <- Federator.perform(:incoming_ap_doc, params) do
{:ok, res}
else
{:error, :origin_containment_failed} -> {:discard, :origin_containment_failed}
{:error, {:reject, reason}} -> {:discard, reason}
{:error, _} = e -> e
e -> {:error, e}
{:error, :origin_containment_failed} ->
{:discard, :origin_containment_failed}
{:error, {:reject, reason}} ->
{:discard, reason}
{:error, :already_present} ->
{:discard, :already_present}
{:error, :ignore} ->
{:discard, :ignore}
# invalid data or e.g. deleting an object we don't know about anyway
{:error, {:validate, issue}} ->
Logger.info("Received invalid AP document: #{inspect(issue)}")
{:discard, :invalid}
# rarer, but sometimes there's an additional :error in front
{:error, {:error, {:validate, issue}}} ->
Logger.info("Received invalid AP document: (2e) #{inspect(issue)}")
{:discard, :invalid}
# failed to resolve a necessary referenced remote AP object;
# might be temporary server/network trouble thus reattempt
{:error, :link_resolve_failed} = e ->
Logger.info("Failed to resolve AP link; may retry: #{inspect(params)}")
e
{:error, _} = e ->
Logger.error("Unexpected AP doc error: #{inspect(e)} from #{inspect(params)}")
e
e ->
Logger.error("Unexpected AP doc error: (raw) #{inspect(e)} from #{inspect(params)}")
{:error, e}
end
rescue
err ->
Logger.error(
"Receiver worker CRASH on #{inspect(params)} with: #{Exception.format(:error, err, __STACKTRACE__)}"
)
# reraise to let Oban handle transaction conflicts without deducting an attempt
reraise err, __STACKTRACE__
end
end
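The reworked clauses lean on Oban's return conventions, summarised:
# :ok / {:ok, _}     -> job completed
# {:discard, reason} -> job finalised as discarded, never retried
# {:error, reason}   -> job retried with backoff until max_attempts
# raising            -> also a failed attempt; the rescue above logs the
#                       crash, then reraises so Oban's own handling (e.g.
#                       of transaction conflicts) still applies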

View file

@ -1,23 +1,38 @@
defmodule Pleroma.Workers.SearchIndexingWorker do
use Pleroma.Workers.WorkerHelper, queue: "search_indexing"
@impl Oban.Worker
defp search_module(), do: Pleroma.Config.get!([Pleroma.Search, :module])
def enqueue("add_to_index", params, worker_args) do
if Kernel.function_exported?(search_module(), :add_to_index, 1) do
do_enqueue("add_to_index", params, worker_args)
else
# XXX: or {:ok, nil} to more closely match Oban.insert()'s {:ok, job}?
# or, similar to a unique conflict: %Oban.Job{conflict?: true} (but omitting all other fields...)
:ok
end
end
def enqueue("remove_from_index", params, worker_args) do
if Kernel.function_exported?(search_module(), :remove_from_index, 1) do
do_enqueue("remove_from_index", params, worker_args)
else
:ok
end
end
@impl Oban.Worker
def perform(%Job{args: %{"op" => "add_to_index", "activity" => activity_id}}) do
activity = Pleroma.Activity.get_by_id_with_object(activity_id)
search_module = Pleroma.Config.get([Pleroma.Search, :module])
search_module.add_to_index(activity)
search_module().add_to_index(activity)
:ok
end
def perform(%Job{args: %{"op" => "remove_from_index", "object" => object_id}}) do
search_module = Pleroma.Config.get([Pleroma.Search, :module])
# Fake the object so we can remove it from the index without having to keep it in the DB
search_module.remove_from_index(%Pleroma.Object{id: object_id})
search_module().remove_from_index(%Pleroma.Object{id: object_id})
:ok
end
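One subtlety with the guard above: Kernel.function_exported?/3 consults only modules that are already loaded, so a standalone check may need Code.ensure_loaded/1 first outside of a release:
mod = Pleroma.Config.get!([Pleroma.Search, :module])
Code.ensure_loaded(mod)
Kernel.function_exported?(mod, :add_to_index, 1)
#=> true for backends that implement indexing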

View file

@ -1,15 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.TransmogrifierWorker do
alias Pleroma.User
use Pleroma.Workers.WorkerHelper, queue: "transmogrifier"
@impl Oban.Worker
def perform(%Job{args: %{"op" => "user_upgrade", "user_id" => user_id}}) do
user = User.get_cached_by_id(user_id)
Pleroma.Web.ActivityPub.Transmogrifier.perform(:user_upgrade, user)
end
end

View file

@ -38,7 +38,7 @@ defmacro __using__(opts) do
alias Oban.Job
def enqueue(op, params, worker_args \\ []) do
defp do_enqueue(op, params, worker_args \\ []) do
params = Map.merge(%{"op" => op}, params)
queue_atom = String.to_atom(unquote(queue))
worker_args = worker_args ++ WorkerHelper.worker_args(queue_atom)
@ -48,11 +48,16 @@ def enqueue(op, params, worker_args \\ []) do
|> Oban.insert()
end
def enqueue(op, params, worker_args \\ []),
do: do_enqueue(op, params, worker_args)
@impl Oban.Worker
def timeout(_job) do
queue_atom = String.to_atom(unquote(queue))
Config.get([:workers, :timeout, queue_atom], :timer.minutes(1))
end
defoverridable enqueue: 3
end
end
end
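Splitting the former enqueue/3 into an overridable wrapper around do_enqueue/3 is what lets workers like the two above add pre-enqueue checks; a minimal sketch (should_skip?/1 is hypothetical):
def enqueue(op, params, worker_args \\ []) do
  # skip cheaply instead of inserting a job that would no-op anyway
  if should_skip?(params), do: :ok, else: do_enqueue(op, params, worker_args)
end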

mix.exs (24 changes)
View file

@ -4,8 +4,8 @@ defmodule Pleroma.Mixfile do
def project do
[
app: :pleroma,
version: version("3.14.0"),
elixir: "~> 1.14",
version: version("3.15.2"),
elixir: "~> 1.14.1 or ~> 1.15",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: Mix.compilers(),
elixirc_options: [warnings_as_errors: warnings_as_errors()],
@ -14,7 +14,7 @@ def project do
aliases: aliases(),
deps: deps(),
test_coverage: [tool: ExCoveralls],
preferred_cli_env: ["coveralls.html": :test],
preferred_cli_env: ["coveralls.html": :test, "mneme.test": :test, "mneme.watch": :test],
# Docs
name: "Akkoma",
homepage_url: "https://akkoma.dev/",
@ -117,16 +117,17 @@ defp deps do
[
{:phoenix, "~> 1.7.0"},
{:phoenix_view, "~> 2.0"},
{:phoenix_live_dashboard, "~> 0.7.2"},
{:phoenix_live_dashboard, "~> 0.8.6"},
{:tzdata, "~> 1.1.1"},
{:plug_cowboy, "~> 2.6"},
{:phoenix_pubsub, "~> 2.1"},
{:phoenix_ecto, "~> 4.4"},
{:phoenix_ecto, "~> 4.6"},
{:inet_cidr, "~> 1.0.0"},
{:ecto_enum, "~> 1.4"},
{:ecto_sql, "~> 3.10.0"},
{:postgrex, "~> 0.17.2"},
{:oban, "~> 2.17.8"},
{:ecto_sql, "~> 3.12.0"},
{:postgrex, "~> 0.20.0"},
{:oban, "~> 2.19.0"},
{:oban_web, "~> 2.11.0"},
{:gettext, "~> 0.22.3"},
{:bcrypt_elixir, "~> 3.0.1"},
{:fast_sanitize, "~> 0.2.3"},
@ -191,12 +192,12 @@ defp deps do
git: "https://github.com/FloatingGhost/pleroma-contrib-search-parser.git",
ref: "08971a81e68686f9ac465cfb6661d51c5e4e1e7f"},
{:nimble_parsec, "~> 1.3", override: true},
{:ecto_psql_extras, "~> 0.7"},
{:ecto_psql_extras, "~> 0.8"},
{:elasticsearch,
git: "https://akkoma.dev/AkkomaGang/elasticsearch-elixir.git", ref: "main"},
{:mfm_parser,
git: "https://akkoma.dev/AkkomaGang/mfm-parser.git",
ref: "b21ab7754024af096f2d14247574f55f0063295b"},
ref: "360a30267a847810a63ab48f606ba227b2ca05f0"},
## dev & test
{:ex_doc, "~> 0.30", only: :dev, runtime: false},
@ -209,7 +210,8 @@ defp deps do
{:dialyxir, "~> 1.3", only: [:dev], runtime: false},
{:elixir_xml_to_map, "~> 3.0", only: :test},
{:mint, "~> 1.5.1", override: true},
{:nimble_pool, "~> 1.0", override: true}
{:nimble_pool, "~> 1.0", override: true},
{:mneme, "~> 0.10.2", only: [:dev, :test]}
] ++ oauth_deps()
end
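The tightened Elixir requirement reads as two alternatives; per Version.match?/2 semantics:
Version.match?("1.14.0", "~> 1.14.1 or ~> 1.15") #=> false (below 1.14.1)
Version.match?("1.14.5", "~> 1.14.1 or ~> 1.15") #=> true
Version.match?("1.18.2", "~> 1.14.1 or ~> 1.15") #=> true (covers 1.18 support)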

View file

@ -8,120 +8,132 @@
"cachex": {:hex, :cachex, "3.6.0", "14a1bfbeee060dd9bec25a5b6f4e4691e3670ebda28c8ba2884b12fe30b36bf8", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:jumper, "~> 1.0", [hex: :jumper, repo: "hexpm", optional: false]}, {:sleeplocks, "~> 1.1", [hex: :sleeplocks, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm", "ebf24e373883bc8e0c8d894a63bbe102ae13d918f790121f5cfe6e485cc8e2e2"},
"calendar": {:hex, :calendar, "1.0.0", "f52073a708528482ec33d0a171954ca610fe2bd28f1e871f247dc7f1565fa807", [:mix], [{:tzdata, "~> 0.1.201603 or ~> 0.5.20 or ~> 1.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "990e9581920c82912a5ee50e62ff5ef96da6b15949a2ee4734f935fdef0f0a6f"},
"captcha": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/elixir-captcha.git", "6630c42aaaab124e697b4e513190c89d8b64e410", [ref: "6630c42aaaab124e697b4e513190c89d8b64e410"]},
"castore": {:hex, :castore, "1.0.9", "5cc77474afadf02c7c017823f460a17daa7908e991b0cc917febc90e466a375c", [:mix], [], "hexpm", "5ea956504f1ba6f2b4eb707061d8e17870de2bee95fb59d512872c2ef06925e7"},
"certifi": {:hex, :certifi, "2.12.0", "2d1cca2ec95f59643862af91f001478c9863c2ac9cb6e2f89780bfd8de987329", [:rebar3], [], "hexpm", "ee68d85df22e554040cdb4be100f33873ac6051387baf6a8f6ce82272340ff1c"},
"castore": {:hex, :castore, "1.0.12", "053f0e32700cbec356280c0e835df425a3be4bc1e0627b714330ad9d0f05497f", [:mix], [], "hexpm", "3dca286b2186055ba0c9449b4e95b97bf1b57b47c1f2644555879e659960c224"},
"certifi": {:hex, :certifi, "2.14.0", "ed3bef654e69cde5e6c022df8070a579a79e8ba2368a00acf3d75b82d9aceeed", [:rebar3], [], "hexpm", "ea59d87ef89da429b8e905264fdec3419f84f2215bb3d81e07a18aac919026c3"},
"combine": {:hex, :combine, "0.10.0", "eff8224eeb56498a2af13011d142c5e7997a80c8f5b97c499f84c841032e429f", [:mix], [], "hexpm", "1b1dbc1790073076580d0d1d64e42eae2366583e7aecd455d1215b0d16f2451b"},
"comeonin": {:hex, :comeonin, "5.5.0", "364d00df52545c44a139bad919d7eacb55abf39e86565878e17cebb787977368", [:mix], [], "hexpm", "6287fc3ba0aad34883cbe3f7949fc1d1e738e5ccdce77165bc99490aa69f47fb"},
"comeonin": {:hex, :comeonin, "5.5.1", "5113e5f3800799787de08a6e0db307133850e635d34e9fab23c70b6501669510", [:mix], [], "hexpm", "65aac8f19938145377cee73973f192c5645873dcf550a8a6b18187d17c13ccdb"},
"concurrent_limiter": {:git, "https://akkoma.dev/AkkomaGang/concurrent-limiter.git", "a9e0b3d64574bdba761f429bb4fba0cf687b3338", [ref: "a9e0b3d64574bdba761f429bb4fba0cf687b3338"]},
"connection": {:hex, :connection, "1.1.0", "ff2a49c4b75b6fb3e674bfc5536451607270aac754ffd1bdfe175abe4a6d7a68", [:mix], [], "hexpm", "722c1eb0a418fbe91ba7bd59a47e28008a189d47e37e0e7bb85585a016b2869c"},
"cors_plug": {:hex, :cors_plug, "3.0.3", "7c3ac52b39624bc616db2e937c282f3f623f25f8d550068b6710e58d04a0e330", [:mix], [{:plug, "~> 1.13", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "3f2d759e8c272ed3835fab2ef11b46bddab8c1ab9528167bd463b6452edf830d"},
"cowboy": {:hex, :cowboy, "2.12.0", "f276d521a1ff88b2b9b4c54d0e753da6c66dd7be6c9fca3d9418b561828a3731", [:make, :rebar3], [{:cowlib, "2.13.0", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, "1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "8a7abe6d183372ceb21caa2709bec928ab2b72e18a3911aa1771639bef82651e"},
"cowboy": {:hex, :cowboy, "2.13.0", "09d770dd5f6a22cc60c071f432cd7cb87776164527f205c5a6b0f24ff6b38990", [:make, :rebar3], [{:cowlib, ">= 2.14.0 and < 3.0.0", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, ">= 1.8.0 and < 3.0.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "e724d3a70995025d654c1992c7b11dbfea95205c047d86ff9bf1cda92ddc5614"},
"cowboy_telemetry": {:hex, :cowboy_telemetry, "0.4.0", "f239f68b588efa7707abce16a84d0d2acf3a0f50571f8bb7f56a15865aae820c", [:rebar3], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7d98bac1ee4565d31b62d59f8823dfd8356a169e7fcbb83831b8a5397404c9de"},
"cowlib": {:hex, :cowlib, "2.13.0", "db8f7505d8332d98ef50a3ef34b34c1afddec7506e4ee4dd4a3a266285d282ca", [:make, :rebar3], [], "hexpm", "e1e1284dc3fc030a64b1ad0d8382ae7e99da46c3246b815318a4b848873800a4"},
"credo": {:hex, :credo, "1.7.8", "9722ba1681e973025908d542ec3d95db5f9c549251ba5b028e251ad8c24ab8c5", [:mix], [{:bunt, "~> 0.2.1 or ~> 1.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "cb9e87cc64f152f3ed1c6e325e7b894dea8f5ef2e41123bd864e3cd5ceb44968"},
"cowlib": {:hex, :cowlib, "2.14.0", "623791c56c1cc9df54a71a9c55147a401549917f00a2e48a6ae12b812c586ced", [:make, :rebar3], [], "hexpm", "0af652d1550c8411c3b58eed7a035a7fb088c0b86aff6bc504b0bc3b7f791aa2"},
"credo": {:hex, :credo, "1.7.11", "d3e805f7ddf6c9c854fd36f089649d7cf6ba74c42bc3795d587814e3c9847102", [:mix], [{:bunt, "~> 0.2.1 or ~> 1.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "56826b4306843253a66e47ae45e98e7d284ee1f95d53d1612bb483f88a8cf219"},
"custom_base": {:hex, :custom_base, "0.2.1", "4a832a42ea0552299d81652aa0b1f775d462175293e99dfbe4d7dbaab785a706", [:mix], [], "hexpm", "8df019facc5ec9603e94f7270f1ac73ddf339f56ade76a721eaa57c1493ba463"},
"db_connection": {:hex, :db_connection, "2.7.0", "b99faa9291bb09892c7da373bb82cba59aefa9b36300f6145c5f201c7adf48ec", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "dcf08f31b2701f857dfc787fbad78223d61a32204f217f15e881dd93e4bdd3ff"},
"decimal": {:hex, :decimal, "2.1.1", "5611dca5d4b2c3dd497dec8f68751f1f1a54755e8ed2a966c2633cf885973ad6", [:mix], [], "hexpm", "53cfe5f497ed0e7771ae1a475575603d77425099ba5faef9394932b35020ffcc"},
"decimal": {:hex, :decimal, "2.3.0", "3ad6255aa77b4a3c4f818171b12d237500e63525c2fd056699967a3e7ea20f62", [:mix], [], "hexpm", "a4d66355cb29cb47c3cf30e71329e58361cfcb37c34235ef3bf1d7bf3773aeac"},
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm", "ce708e5f094b9cd4e8f2be4f00d2f4250c4095be93f8cd6d018c753894885430"},
"dialyxir": {:hex, :dialyxir, "1.4.4", "fb3ce8741edeaea59c9ae84d5cec75da00fa89fe401c72d6e047d11a61f65f70", [:mix], [{:erlex, ">= 0.2.7", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "cd6111e8017ccd563e65621a4d9a4a1c5cd333df30cebc7face8029cacb4eff6"},
"dialyxir": {:hex, :dialyxir, "1.4.5", "ca1571ac18e0f88d4ab245f0b60fa31ff1b12cbae2b11bd25d207f865e8ae78a", [:mix], [{:erlex, ">= 0.2.7", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "b0fb08bb8107c750db5c0b324fa2df5ceaa0f9307690ee3c1f6ba5b9eb5d35c3"},
"earmark": {:hex, :earmark, "1.4.46", "8c7287bd3137e99d26ae4643e5b7ef2129a260e3dcf41f251750cb4563c8fb81", [:mix], [], "hexpm", "798d86db3d79964e759ddc0c077d5eb254968ed426399fbf5a62de2b5ff8910a"},
"earmark_parser": {:hex, :earmark_parser, "1.4.41", "ab34711c9dc6212dda44fcd20ecb87ac3f3fce6f0ca2f28d4a00e4154f8cd599", [:mix], [], "hexpm", "a81a04c7e34b6617c2792e291b5a2e57ab316365c2644ddc553bb9ed863ebefa"},
"earmark_parser": {:hex, :earmark_parser, "1.4.43", "34b2f401fe473080e39ff2b90feb8ddfeef7639f8ee0bbf71bb41911831d77c5", [:mix], [], "hexpm", "970a3cd19503f5e8e527a190662be2cee5d98eed1ff72ed9b3d1a3d466692de8"},
"eblurhash": {:hex, :eblurhash, "1.2.2", "7da4255aaea984b31bb71155f673257353b0e0554d0d30dcf859547e74602582", [:rebar3], [], "hexpm", "8c20ca00904de023a835a9dcb7b7762fed32264c85a80c3cafa85288e405044c"},
"ecto": {:hex, :ecto, "3.10.3", "eb2ae2eecd210b4eb8bece1217b297ad4ff824b4384c0e3fdd28aaf96edd6135", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "44bec74e2364d491d70f7e42cd0d690922659d329f6465e89feb8a34e8cd3433"},
"ecto": {:hex, :ecto, "3.12.5", "4a312960ce612e17337e7cefcf9be45b95a3be6b36b6f94dfb3d8c361d631866", [:mix], [{:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "6eb18e80bef8bb57e17f5a7f068a1719fbda384d40fc37acb8eb8aeca493b6ea"},
"ecto_enum": {:hex, :ecto_enum, "1.4.0", "d14b00e04b974afc69c251632d1e49594d899067ee2b376277efd8233027aec8", [:mix], [{:ecto, ">= 3.0.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:ecto_sql, "> 3.0.0", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:mariaex, ">= 0.0.0", [hex: :mariaex, repo: "hexpm", optional: true]}, {:postgrex, ">= 0.0.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "8fb55c087181c2b15eee406519dc22578fa60dd82c088be376d0010172764ee4"},
"ecto_psql_extras": {:hex, :ecto_psql_extras, "0.8.2", "79350a53246ac5ec27326d208496aebceb77fa82a91744f66a9154560f0759d3", [:mix], [{:ecto_sql, "~> 3.7", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:postgrex, "> 0.16.0 and < 0.20.0", [hex: :postgrex, repo: "hexpm", optional: false]}, {:table_rex, "~> 3.1.1 or ~> 4.0.0", [hex: :table_rex, repo: "hexpm", optional: false]}], "hexpm", "6149c1c4a5ba6602a76cb09ee7a269eb60dab9694a1dbbb797f032555212de75"},
"ecto_sql": {:hex, :ecto_sql, "3.10.2", "6b98b46534b5c2f8b8b5f03f126e75e2a73c64f3c071149d32987a5378b0fdbd", [:mix], [{:db_connection, "~> 2.4.1 or ~> 2.5", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.10.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.6.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.16.0 or ~> 0.17.0 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.1 or ~> 2.2", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "68c018debca57cb9235e3889affdaec7a10616a4e3a80c99fa1d01fdafaa9007"},
"ecto_psql_extras": {:hex, :ecto_psql_extras, "0.8.7", "49943fe6bce07281fe3adfc2a23d3794e2acc644dfe98411cb5712ffecb6ad1a", [:mix], [{:ecto_sql, "~> 3.7", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:postgrex, "> 0.16.0", [hex: :postgrex, repo: "hexpm", optional: false]}, {:table_rex, "~> 3.1.1 or ~> 4.0", [hex: :table_rex, repo: "hexpm", optional: false]}], "hexpm", "ac0a0bce57ffe36b30fac2a2d0d427b04de016e6af5db6f4b41afa1241f39cda"},
"ecto_sql": {:hex, :ecto_sql, "3.12.1", "c0d0d60e85d9ff4631f12bafa454bc392ce8b9ec83531a412c12a0d415a3a4d0", [:mix], [{:db_connection, "~> 2.4.1 or ~> 2.5", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.12", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.7", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.19 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.1 or ~> 2.2", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "aff5b958a899762c5f09028c847569f7dfb9cc9d63bdb8133bff8a5546de6bf5"},
"elasticsearch": {:git, "https://akkoma.dev/AkkomaGang/elasticsearch-elixir.git", "6cd946f75f6ab9042521a009d1d32d29a90113ca", [ref: "main"]},
"elixir_make": {:hex, :elixir_make, "0.8.4", "4960a03ce79081dee8fe119d80ad372c4e7badb84c493cc75983f9d3bc8bde0f", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:certifi, "~> 2.0", [hex: :certifi, repo: "hexpm", optional: true]}], "hexpm", "6e7f1d619b5f61dfabd0a20aa268e575572b542ac31723293a4c1a567d5ef040"},
"elixir_make": {:hex, :elixir_make, "0.9.0", "6484b3cd8c0cee58f09f05ecaf1a140a8c97670671a6a0e7ab4dc326c3109726", [:mix], [], "hexpm", "db23d4fd8b757462ad02f8aa73431a426fe6671c80b200d9710caf3d1dd0ffdb"},
"elixir_xml_to_map": {:hex, :elixir_xml_to_map, "3.1.0", "4d6260486a8cce59e4bf3575fe2dd2a24766546ceeef9f93fcec6f7c62a2827a", [:mix], [{:erlsom, "~> 1.4", [hex: :erlsom, repo: "hexpm", optional: false]}], "hexpm", "8fe5f2e75f90bab07ee2161120c2dc038ebcae8135554f5582990f1c8c21f911"},
"erlex": {:hex, :erlex, "0.2.7", "810e8725f96ab74d17aac676e748627a07bc87eb950d2b83acd29dc047a30595", [:mix], [], "hexpm", "3ed95f79d1a844c3f6bf0cea61e0d5612a42ce56da9c03f01df538685365efb0"},
"erlsom": {:hex, :erlsom, "1.5.1", "c8fe2babd33ff0846403f6522328b8ab676f896b793634cfe7ef181c05316c03", [:rebar3], [], "hexpm", "7965485494c5844dd127656ac40f141aadfa174839ec1be1074e7edf5b4239eb"},
"erlsom": {:hex, :erlsom, "1.5.2", "3e47c53a199136fb4d20d5479edb7c9229f31624534c062633951c8d14dcd276", [:rebar3], [], "hexpm", "4e765cc677fb30509f7b628ff2914e124cf4dcc0fac1c0a62ee4dcee24215b5d"},
"eternal": {:hex, :eternal, "1.2.2", "d1641c86368de99375b98d183042dd6c2b234262b8d08dfd72b9eeaafc2a1abd", [:mix], [], "hexpm", "2c9fe32b9c3726703ba5e1d43a1d255a4f3f2d8f8f9bc19f094c7cb1a7a9e782"},
"ex_aws": {:hex, :ex_aws, "2.5.6", "6f642e0f82eff10a9b470044f084b81a791cf15b393d647ea5f3e65da2794e3d", [:mix], [{:configparser_ex, "~> 4.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:jsx, "~> 2.8 or ~> 3.0", [hex: :jsx, repo: "hexpm", optional: true]}, {:mime, "~> 1.2 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:req, "~> 0.3", [hex: :req, repo: "hexpm", optional: true]}, {:sweet_xml, "~> 0.7", [hex: :sweet_xml, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "c69eec59e31fdd89d0beeb1d97e16518dd1b23ad95b3d5c9f1dcfec23d97f960"},
"ex_aws_s3": {:hex, :ex_aws_s3, "2.5.4", "87aaf4a2f24a48f516d7f5aaced9d128dd5d0f655c4431f9037a11a85c71109c", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm", "c06e7f68b33f7c0acba1361dbd951c79661a28f85aa2e0582990fccca4425355"},
"ex_aws": {:hex, :ex_aws, "2.5.8", "0393cfbc5e4a9e7017845451a015d836a670397100aa4c86901980e2a2c5f7d4", [:mix], [{:configparser_ex, "~> 4.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:jsx, "~> 2.8 or ~> 3.0", [hex: :jsx, repo: "hexpm", optional: true]}, {:mime, "~> 1.2 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:req, "~> 0.3", [hex: :req, repo: "hexpm", optional: true]}, {:sweet_xml, "~> 0.7", [hex: :sweet_xml, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "8f79777b7932168956c8cc3a6db41f5783aa816eb50de356aed3165a71e5f8c3"},
"ex_aws_s3": {:hex, :ex_aws_s3, "2.5.6", "d135983bbd8b6df6350dfd83999437725527c1bea151e5055760bfc9b2d17c20", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm", "9874e12847e469ca2f13a5689be04e546c16f63caf6380870b7f25bf7cb98875"},
"ex_const": {:hex, :ex_const, "0.3.0", "9d79516679991baf540ef445438eef1455ca91cf1a3c2680d8fb9e5bea2fe4de", [:mix], [], "hexpm", "76546322abb9e40ee4a2f454cf1c8a5b25c3672fa79bed1ea52c31e0d2428ca9"},
"ex_doc": {:hex, :ex_doc, "0.34.2", "13eedf3844ccdce25cfd837b99bea9ad92c4e511233199440488d217c92571e8", [:mix], [{:earmark_parser, "~> 1.4.39", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_c, ">= 0.1.0", [hex: :makeup_c, repo: "hexpm", optional: true]}, {:makeup_elixir, "~> 0.14 or ~> 1.0", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1 or ~> 1.0", [hex: :makeup_erlang, repo: "hexpm", optional: false]}, {:makeup_html, ">= 0.1.0", [hex: :makeup_html, repo: "hexpm", optional: true]}], "hexpm", "5ce5f16b41208a50106afed3de6a2ed34f4acfd65715b82a0b84b49d995f95c1"},
"ex_doc": {:hex, :ex_doc, "0.37.2", "2a3aa7014094f0e4e286a82aa5194a34dd17057160988b8509b15aa6c292720c", [:mix], [{:earmark_parser, "~> 1.4.42", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_c, ">= 0.1.0", [hex: :makeup_c, repo: "hexpm", optional: true]}, {:makeup_elixir, "~> 0.14 or ~> 1.0", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1 or ~> 1.0", [hex: :makeup_erlang, repo: "hexpm", optional: false]}, {:makeup_html, ">= 0.1.0", [hex: :makeup_html, repo: "hexpm", optional: true]}], "hexpm", "4dfa56075ce4887e4e8b1dcc121cd5fcb0f02b00391fd367ff5336d98fa49049"},
"ex_machina": {:hex, :ex_machina, "2.8.0", "a0e847b5712065055ec3255840e2c78ef9366634d62390839d4880483be38abe", [:mix], [{:ecto, "~> 2.2 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_sql, "~> 3.0", [hex: :ecto_sql, repo: "hexpm", optional: true]}], "hexpm", "79fe1a9c64c0c1c1fab6c4fa5d871682cb90de5885320c187d117004627a7729"},
"ex_syslogger": {:hex, :ex_syslogger, "2.0.0", "de6de5c5472a9c4fdafb28fa6610e381ae79ebc17da6490b81d785d68bd124c9", [:mix], [{:jason, "~> 1.2", [hex: :jason, repo: "hexpm", optional: true]}, {:syslog, "~> 1.1.0", [hex: :syslog, repo: "hexpm", optional: false]}], "hexpm", "a52b2fe71764e9e6ecd149ab66635812f68e39279cbeee27c52c0e35e8b8019e"},
"excoveralls": {:hex, :excoveralls, "0.16.1", "0bd42ed05c7d2f4d180331a20113ec537be509da31fed5c8f7047ce59ee5a7c5", [:mix], [{:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "dae763468e2008cf7075a64cb1249c97cb4bc71e236c5c2b5e5cdf1cfa2bf138"},
"expo": {:hex, :expo, "0.4.1", "1c61d18a5df197dfda38861673d392e642649a9cef7694d2f97a587b2cfb319b", [:mix], [], "hexpm", "2ff7ba7a798c8c543c12550fa0e2cbc81b95d4974c65855d8d15ba7b37a1ce47"},
"fast_html": {:hex, :fast_html, "2.3.0", "08c1d8ead840dd3060ba02c761bed9f37f456a1ddfe30bcdcfee8f651cec06a6", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "f18e3c7668f82d3ae0b15f48d48feeb257e28aa5ab1b0dbf781c7312e5da029d"},
"fast_html": {:hex, :fast_html, "2.4.1", "73142526cee294b0ec8cf122483f32c861e4a9d988c60cd04f63ce7fd9e5a620", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 1.1", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "767a63ecc941d3fc0e0e9609ded1a5e798398e5b1bf4d2f47bcb5992a86b32cf"},
"fast_sanitize": {:hex, :fast_sanitize, "0.2.3", "67b93dfb34e302bef49fec3aaab74951e0f0602fd9fa99085987af05bd91c7a5", [:mix], [{:fast_html, "~> 2.0", [hex: :fast_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "e8ad286d10d0386e15d67d0ee125245ebcfbc7d7290b08712ba9013c8c5e56e2"},
"file_ex": {:git, "https://akkoma.dev/AkkomaGang/file_ex.git", "cc7067c7d446c2526e9ecf91d40896b088851569", [ref: "cc7067c7d446c2526e9ecf91d40896b088851569"]},
"file_system": {:hex, :file_system, "1.0.1", "79e8ceaddb0416f8b8cd02a0127bdbababe7bf4a23d2a395b983c1f8b3f73edd", [:mix], [], "hexpm", "4414d1f38863ddf9120720cd976fce5bdde8e91d8283353f0e31850fa89feb9e"},
"file_system": {:hex, :file_system, "1.1.0", "08d232062284546c6c34426997dd7ef6ec9f8bbd090eb91780283c9016840e8f", [:mix], [], "hexpm", "bfcf81244f416871f2a2e15c1b515287faa5db9c6bcf290222206d120b3d43f6"},
"finch": {:hex, :finch, "0.18.0", "944ac7d34d0bd2ac8998f79f7a811b21d87d911e77a786bc5810adb75632ada4", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6 or ~> 1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "69f5045b042e531e53edc2574f15e25e735b522c37e2ddb766e15b979e03aa65"},
"flake_id": {:hex, :flake_id, "0.1.0", "7716b086d2e405d09b647121a166498a0d93d1a623bead243e1f74216079ccb3", [:mix], [{:base62, "~> 1.2", [hex: :base62, repo: "hexpm", optional: false]}, {:ecto, ">= 2.0.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm", "31fc8090fde1acd267c07c36ea7365b8604055f897d3a53dd967658c691bd827"},
"floki": {:hex, :floki, "0.36.3", "1102f93b16a55bc5383b85ae3ec470f82dee056eaeff9195e8afdf0ef2a43c30", [:mix], [], "hexpm", "fe0158bff509e407735f6d40b3ee0d7deb47f3f3ee7c6c182ad28599f9f6b27a"},
"floki": {:hex, :floki, "0.37.0", "b83e0280bbc6372f2a403b2848013650b16640cd2470aea6701f0632223d719e", [:mix], [], "hexpm", "516a0c15a69f78c47dc8e0b9b3724b29608aa6619379f91b1ffa47109b5d0dd3"},
"gen_smtp": {:hex, :gen_smtp, "1.2.0", "9cfc75c72a8821588b9b9fe947ae5ab2aed95a052b81237e0928633a13276fd3", [:rebar3], [{:ranch, ">= 1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "5ee0375680bca8f20c4d85f58c2894441443a743355430ff33a783fe03296779"},
"gettext": {:hex, :gettext, "0.22.3", "c8273e78db4a0bb6fba7e9f0fd881112f349a3117f7f7c598fa18c66c888e524", [:mix], [{:expo, "~> 0.4.0", [hex: :expo, repo: "hexpm", optional: false]}], "hexpm", "935f23447713954a6866f1bb28c3a878c4c011e802bcd68a726f5e558e4b64bd"},
"hackney": {:hex, :hackney, "1.20.1", "8d97aec62ddddd757d128bfd1df6c5861093419f8f7a4223823537bad5d064e2", [:rebar3], [{:certifi, "~> 2.12.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~> 6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~> 1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~> 1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "fe9094e5f1a2a2c0a7d10918fee36bfec0ec2a979994cff8cfe8058cd9af38e3"},
"glob_ex": {:hex, :glob_ex, "0.1.11", "cb50d3f1ef53f6ca04d6252c7fde09fd7a1cf63387714fe96f340a1349e62c93", [:mix], [], "hexpm", "342729363056e3145e61766b416769984c329e4378f1d558b63e341020525de4"},
"hackney": {:hex, :hackney, "1.23.0", "55cc09077112bcb4a69e54be46ed9bc55537763a96cd4a80a221663a7eafd767", [:rebar3], [{:certifi, "~> 2.14.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~> 6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~> 1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~> 1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "6cd1c04cd15c81e5a493f167b226a15f0938a84fc8f0736ebe4ddcab65c0b44e"},
"hpax": {:hex, :hpax, "0.1.2", "09a75600d9d8bbd064cdd741f21fc06fc1f4cf3d0fcc335e5aa19be1a7235c84", [:mix], [], "hexpm", "2c87843d5a23f5f16748ebe77969880e29809580efdaccd615cd3bed628a8c13"},
"html_entities": {:hex, :html_entities, "0.5.2", "9e47e70598da7de2a9ff6af8758399251db6dbb7eebe2b013f2bbd2515895c3c", [:mix], [], "hexpm", "c53ba390403485615623b9531e97696f076ed415e8d8058b1dbaa28181f4fdcc"},
"http_signatures": {:git, "https://akkoma.dev/AkkomaGang/http_signatures.git", "d44c43d66758c6a73eaa4da9cffdbee0c5da44ae", [ref: "d44c43d66758c6a73eaa4da9cffdbee0c5da44ae"]},
"httpoison": {:hex, :httpoison, "1.8.2", "9eb9c63ae289296a544842ef816a85d881d4a31f518a0fec089aaa744beae290", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "2bb350d26972e30c96e2ca74a1aaf8293d61d0742ff17f01e0279fef11599921"},
"idna": {:hex, :idna, "6.1.1", "8a63070e9f7d0c62eb9d9fcb360a7de382448200fbbd1b106cc96d3d8099df8d", [:rebar3], [{:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "92376eb7894412ed19ac475e4a86f7b413c1b9fbb5bd16dccd57934157944cea"},
"igniter": {:hex, :igniter, "0.5.29", "6bf7ddaf15e88ae75f6dad514329530ec8f4721ba14782f6386a7345c1be99fd", [:mix], [{:glob_ex, "~> 0.1.7", [hex: :glob_ex, repo: "hexpm", optional: false]}, {:inflex, "~> 2.0", [hex: :inflex, repo: "hexpm", optional: false]}, {:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: false]}, {:owl, "~> 0.11", [hex: :owl, repo: "hexpm", optional: false]}, {:phx_new, "~> 1.7", [hex: :phx_new, repo: "hexpm", optional: true]}, {:req, "~> 0.5", [hex: :req, repo: "hexpm", optional: false]}, {:rewrite, ">= 1.1.1 and < 2.0.0-0", [hex: :rewrite, repo: "hexpm", optional: false]}, {:sourceror, "~> 1.4", [hex: :sourceror, repo: "hexpm", optional: false]}, {:spitfire, ">= 0.1.3 and < 1.0.0-0", [hex: :spitfire, repo: "hexpm", optional: false]}], "hexpm", "beb6e0f69fc6d4e3975ffa26c5459fc63fd96f85cfaeba984c2dfd3d7333b6ad"},
"inet_cidr": {:hex, :inet_cidr, "1.0.8", "d26bb7bdbdf21ae401ead2092bf2bb4bf57fe44a62f5eaa5025280720ace8a40", [:mix], [], "hexpm", "d5b26da66603bb56c933c65214c72152f0de9a6ea53618b56d63302a68f6a90e"},
"inflex": {:hex, :inflex, "2.1.0", "a365cf0821a9dacb65067abd95008ca1b0bb7dcdd85ae59965deef2aa062924c", [:mix], [], "hexpm", "14c17d05db4ee9b6d319b0bff1bdf22aa389a25398d1952c7a0b5f3d93162dd8"},
"jason": {:hex, :jason, "1.4.4", "b9226785a9aa77b6857ca22832cffa5d5011a667207eb2a0ad56adb5db443b8a", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "c5eb0cab91f094599f94d55bc63409236a8ec69a21a67814529e8d5f6cc90b3b"},
"joken": {:hex, :joken, "2.6.2", "5daaf82259ca603af4f0b065475099ada1b2b849ff140ccd37f4b6828ca6892a", [:mix], [{:jose, "~> 1.11.10", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "5134b5b0a6e37494e46dbf9e4dad53808e5e787904b7c73972651b51cce3d72b"},
"jose": {:hex, :jose, "1.11.10", "a903f5227417bd2a08c8a00a0cbcc458118be84480955e8d251297a425723f83", [:mix, :rebar3], [], "hexpm", "0d6cd36ff8ba174db29148fc112b5842186b68a90ce9fc2b3ec3afe76593e614"},
"jumper": {:hex, :jumper, "1.0.2", "68cdcd84472a00ac596b4e6459a41b3062d4427cbd4f1e8c8793c5b54f1406a7", [:mix], [], "hexpm", "9b7782409021e01ab3c08270e26f36eb62976a38c1aa64b2eaf6348422f165e1"},
"linkify": {:hex, :linkify, "0.5.3", "5f8143d8f61f5ff08d3aeeff47ef6509492b4948d8f08007fbf66e4d2246a7f2", [:mix], [], "hexpm", "3ef35a1377d47c25506e07c1c005ea9d38d700699d92ee92825f024434258177"},
"mail": {:hex, :mail, "0.4.2", "959229676ef11dfdfb1269ce4a611622cb70415a1d6f925d79e547848bafc14d", [:mix], [], "hexpm", "08e5b70c72b8d1605cb88ef2df2c7e41d002210a621503ea1c13f1a7916b6bd3"},
"mail": {:hex, :mail, "0.4.3", "16df84500780980826d9b52059d24c08fdd75c98f178a7a7ea809ea83fb70542", [:mix], [], "hexpm", "164975550b977e47cab431c403b0e90c8ce542036d32c7189b83839d8d7d391b"},
"majic": {:git, "https://akkoma.dev/AkkomaGang/majic.git", "80540b36939ec83f48e76c61e5000e0fd67706f0", [ref: "80540b36939ec83f48e76c61e5000e0fd67706f0"]},
"makeup": {:hex, :makeup, "1.1.2", "9ba8837913bdf757787e71c1581c21f9d2455f4dd04cfca785c70bbfff1a76a3", [:mix], [{:nimble_parsec, "~> 1.2.2 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "cce1566b81fbcbd21eca8ffe808f33b221f9eee2cbc7a1706fc3da9ff18e6cac"},
"makeup_elixir": {:hex, :makeup_elixir, "0.16.2", "627e84b8e8bf22e60a2579dad15067c755531fea049ae26ef1020cad58fe9578", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.2.3 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "41193978704763f6bbe6cc2758b84909e62984c7752b3784bd3c218bb341706b"},
"makeup_erlang": {:hex, :makeup_erlang, "1.0.1", "c7f58c120b2b5aa5fd80d540a89fdf866ed42f1f3994e4fe189abebeab610839", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "8a89a1eeccc2d798d6ea15496a6e4870b75e014d1af514b1b71fa33134f57814"},
"makeup": {:hex, :makeup, "1.2.1", "e90ac1c65589ef354378def3ba19d401e739ee7ee06fb47f94c687016e3713d1", [:mix], [{:nimble_parsec, "~> 1.4", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "d36484867b0bae0fea568d10131197a4c2e47056a6fbe84922bf6ba71c8d17ce"},
"makeup_elixir": {:hex, :makeup_elixir, "1.0.1", "e928a4f984e795e41e3abd27bfc09f51db16ab8ba1aebdba2b3a575437efafc2", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.2.3 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "7284900d412a3e5cfd97fdaed4f5ed389b8f2b4cb49efc0eb3bd10e2febf9507"},
"makeup_erlang": {:hex, :makeup_erlang, "1.0.2", "03e1804074b3aa64d5fad7aa64601ed0fb395337b982d9bcf04029d68d51b6a7", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "af33ff7ef368d5893e4a267933e7744e46ce3cf1f61e2dccf53a111ed3aa3727"},
"meck": {:hex, :meck, "0.9.2", "85ccbab053f1db86c7ca240e9fc718170ee5bda03810a6292b5306bf31bae5f5", [:rebar3], [], "hexpm", "81344f561357dc40a8344afa53767c32669153355b626ea9fcbc8da6b3045826"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mfm_parser": {:git, "https://akkoma.dev/AkkomaGang/mfm-parser.git", "b21ab7754024af096f2d14247574f55f0063295b", [ref: "b21ab7754024af096f2d14247574f55f0063295b"]},
"mfm_parser": {:git, "https://akkoma.dev/AkkomaGang/mfm-parser.git", "360a30267a847810a63ab48f606ba227b2ca05f0", [ref: "360a30267a847810a63ab48f606ba227b2ca05f0"]},
"mime": {:hex, :mime, "2.0.6", "8f18486773d9b15f95f4f4f1e39b710045fa1de891fada4516559967276e4dc2", [:mix], [], "hexpm", "c9945363a6b26d747389aac3643f8e0e09d30499a138ad64fe8fd1d13d9b153e"},
"mimerl": {:hex, :mimerl, "1.3.0", "d0cd9fc04b9061f82490f6581e0128379830e78535e017f7780f37fea7545726", [:rebar3], [], "hexpm", "a1e15a50d1887217de95f0b9b0793e32853f7c258a5cd227650889b38839fe9d"},
"mint": {:hex, :mint, "1.5.2", "4805e059f96028948870d23d7783613b7e6b0e2fb4e98d720383852a760067fd", [:mix], [{:castore, "~> 0.1.0 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:hpax, "~> 0.1.1", [hex: :hpax, repo: "hexpm", optional: false]}], "hexpm", "d77d9e9ce4eb35941907f1d3df38d8f750c357865353e21d335bdcdf6d892a02"},
"mock": {:hex, :mock, "0.3.8", "7046a306b71db2488ef54395eeb74df0a7f335a7caca4a3d3875d1fc81c884dd", [:mix], [{:meck, "~> 0.9.2", [hex: :meck, repo: "hexpm", optional: false]}], "hexpm", "7fa82364c97617d79bb7d15571193fc0c4fe5afd0c932cef09426b3ee6fe2022"},
"mneme": {:hex, :mneme, "0.10.2", "f263ff74e993ef9eb160e04b608a149881e601ed6f5508c9afcafb578a184b52", [:mix], [{:file_system, "~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:igniter, "~> 0.3.76 or ~> 0.4.0 or ~> 0.5.0", [hex: :igniter, repo: "hexpm", optional: false]}, {:nimble_options, "~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:owl, "~> 0.9", [hex: :owl, repo: "hexpm", optional: false]}, {:rewrite, "~> 1.0", [hex: :rewrite, repo: "hexpm", optional: false]}, {:sourceror, "~> 1.0", [hex: :sourceror, repo: "hexpm", optional: false]}, {:text_diff, "~> 0.1", [hex: :text_diff, repo: "hexpm", optional: false]}], "hexpm", "3b9493fc114c4bb0f6232e021620ffd7944819b9b9105a5b286b6dc907f7720a"},
"mock": {:hex, :mock, "0.3.9", "10e44ad1f5962480c5c9b9fa779c6c63de9bd31997c8e04a853ec990a9d841af", [:mix], [{:meck, "~> 0.9.2", [hex: :meck, repo: "hexpm", optional: false]}], "hexpm", "9e1b244c4ca2551bb17bb8415eed89e40ee1308e0fbaed0a4fdfe3ec8a4adbd3"},
"mogrify": {:hex, :mogrify, "0.9.3", "238c782f00271dace01369ad35ae2e9dd020feee3443b9299ea5ea6bed559841", [:mix], [], "hexpm", "0189b1e1de27455f2b9ae8cf88239cefd23d38de9276eb5add7159aea51731e6"},
"mox": {:hex, :mox, "1.2.0", "a2cd96b4b80a3883e3100a221e8adc1b98e4c3a332a8fc434c39526babafd5b3", [:mix], [{:nimble_ownership, "~> 1.0", [hex: :nimble_ownership, repo: "hexpm", optional: false]}], "hexpm", "c7b92b3cc69ee24a7eeeaf944cd7be22013c52fcb580c1f33f50845ec821089a"},
"nimble_options": {:hex, :nimble_options, "1.1.1", "e3a492d54d85fc3fd7c5baf411d9d2852922f66e69476317787a7b2bb000a61b", [:mix], [], "hexpm", "821b2470ca9442c4b6984882fe9bb0389371b8ddec4d45a9504f00a66f650b44"},
"nimble_ownership": {:hex, :nimble_ownership, "1.0.0", "3f87744d42c21b2042a0aa1d48c83c77e6dd9dd357e425a038dd4b49ba8b79a1", [:mix], [], "hexpm", "7c16cc74f4e952464220a73055b557a273e8b1b7ace8489ec9d86e9ad56cb2cc"},
"nimble_parsec": {:hex, :nimble_parsec, "1.4.0", "51f9b613ea62cfa97b25ccc2c1b4216e81df970acd8e16e8d1bdc58fef21370d", [:mix], [], "hexpm", "9c565862810fb383e9838c1dd2d7d2c437b3d13b267414ba6af33e50d2d1cf28"},
"nimble_ownership": {:hex, :nimble_ownership, "1.0.1", "f69fae0cdd451b1614364013544e66e4f5d25f36a2056a9698b793305c5aa3a6", [:mix], [], "hexpm", "3825e461025464f519f3f3e4a1f9b68c47dc151369611629ad08b636b73bb22d"},
"nimble_parsec": {:hex, :nimble_parsec, "1.4.2", "8efba0122db06df95bfaa78f791344a89352ba04baedd3849593bfce4d0dc1c6", [:mix], [], "hexpm", "4b21398942dda052b403bbe1da991ccd03a053668d147d53fb8c4e0efe09c973"},
"nimble_pool": {:hex, :nimble_pool, "1.1.0", "bf9c29fbdcba3564a8b800d1eeb5a3c58f36e1e11d7b7fb2e084a643f645f06b", [:mix], [], "hexpm", "af2e4e6b34197db81f7aad230c1118eac993acc0dae6bc83bac0126d4ae0813a"},
"oban": {:hex, :oban, "2.17.12", "33fb0cbfb92b910d48dd91a908590fe3698bb85eacec8cd0d9bc6aa13dddd6d6", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7a647d6cd6bb300073db17faabce22d80ae135da3baf3180a064fa7c4fa046e3"},
"oban": {:hex, :oban, "2.19.2", "11e635c49b4f422814eb96a4d78974c9e67d62a20a969e5fd50db040d32c08ba", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:igniter, "~> 0.5", [hex: :igniter, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:myxql, "~> 0.7", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "de8314b00b31d17f98fd2c76781f80c1cfc8621122b41830c0834486c44e1087"},
"oban_met": {:hex, :oban_met, "1.0.1", "737db0064567b923d3f35efd1d3009dd1435d60ee6f98dbb55dbb83db8f4f4fa", [:mix], [{:oban, "~> 2.18", [hex: :oban, repo: "hexpm", optional: false]}], "hexpm", "0492d841f880b76c5b73081bc70ebea20ebacc08e871345f72c2270513f09957"},
"oban_web": {:hex, :oban_web, "2.11.1", "646492d92177a43ff869535f0c92380e17148f45bc2c8dc8f74c69f8a2874bad", [:mix], [{:jason, "~> 1.2", [hex: :jason, repo: "hexpm", optional: false]}, {:oban, "~> 2.19", [hex: :oban, repo: "hexpm", optional: false]}, {:oban_met, "~> 1.0", [hex: :oban_met, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.7", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 1.0", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}], "hexpm", "d853c6af3f7c20d03a2bf7b1baad71835d50fbb98af05004e9b51da558b90b01"},
"open_api_spex": {:hex, :open_api_spex, "3.21.2", "6a704f3777761feeb5657340250d6d7332c545755116ca98f33d4b875777e1e5", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:plug, "~> 1.7", [hex: :plug, repo: "hexpm", optional: false]}, {:poison, "~> 3.0 or ~> 4.0 or ~> 5.0 or ~> 6.0", [hex: :poison, repo: "hexpm", optional: true]}, {:ymlr, "~> 2.0 or ~> 3.0 or ~> 4.0 or ~> 5.0", [hex: :ymlr, repo: "hexpm", optional: true]}], "hexpm", "f42ae6ed668b895ebba3e02773cfb4b41050df26f803f2ef634c72a7687dc387"},
"owl": {:hex, :owl, "0.12.2", "65906b525e5c3ef51bab6cba7687152be017aebe1da077bb719a5ee9f7e60762", [:mix], [{:ucwidth, "~> 0.2", [hex: :ucwidth, repo: "hexpm", optional: true]}], "hexpm", "6398efa9e1fea70a04d24231e10dcd66c1ac1aa2da418d20ef5357ec61de2880"},
"parse_trans": {:hex, :parse_trans, "3.4.1", "6e6aa8167cb44cc8f39441d05193be6e6f4e7c2946cb2759f015f8c56b76e5ff", [:rebar3], [], "hexpm", "620a406ce75dada827b82e453c19cf06776be266f5a67cff34e1ef2cbb60e49a"},
"phoenix": {:hex, :phoenix, "1.7.14", "a7d0b3f1bc95987044ddada111e77bd7f75646a08518942c72a8440278ae7825", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.7", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "c7859bc56cc5dfef19ecfc240775dae358cbaa530231118a9e014df392ace61a"},
"phoenix": {:hex, :phoenix, "1.7.20", "6bababaf27d59f5628f9b608de902a021be2cecefb8231e1dbdc0a2e2e480e9b", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.7", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "6be2ab98302e8784a31829e0d50d8bdfa81a23cd912c395bafd8b8bfb5a086c2"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.6.3", "f686701b0499a07f2e3b122d84d52ff8a31f5def386e03706c916f6feddf69ef", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.1", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "909502956916a657a197f94cc1206d9a65247538de8a5e186f7537c895d95764"},
"phoenix_html": {:hex, :phoenix_html, "3.3.4", "42a09fc443bbc1da37e372a5c8e6755d046f22b9b11343bf885067357da21cb3", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "0249d3abec3714aff3415e7ee3d9786cb325be3151e6c4b3021502c585bf53fb"},
"phoenix_live_dashboard": {:hex, :phoenix_live_dashboard, "0.7.2", "97cc4ff2dba1ebe504db72cb45098cb8e91f11160528b980bd282cc45c73b29c", [:mix], [{:ecto, "~> 3.6.2 or ~> 3.7", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_mysql_extras, "~> 0.5", [hex: :ecto_mysql_extras, repo: "hexpm", optional: true]}, {:ecto_psql_extras, "~> 0.7", [hex: :ecto_psql_extras, repo: "hexpm", optional: true]}, {:mime, "~> 1.6 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 0.18.3", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "0e5fdf063c7a3b620c566a30fcf68b7ee02e5e46fe48ee46a6ec3ba382dc05b7"},
"phoenix_live_view": {:hex, :phoenix_live_view, "0.18.18", "1f38fbd7c363723f19aad1a04b5490ff3a178e37daaf6999594d5f34796c47fc", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6.15 or ~> 1.7.0", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a5810d0472f3189ede6d2a95bda7f31c6113156b91784a3426cb0ab6a6d85214"},
"phoenix_live_dashboard": {:hex, :phoenix_live_dashboard, "0.8.6", "7b1f0327f54c9eb69845fd09a77accf922f488c549a7e7b8618775eb603a62c7", [:mix], [{:ecto, "~> 3.6.2 or ~> 3.7", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_mysql_extras, "~> 0.5", [hex: :ecto_mysql_extras, repo: "hexpm", optional: true]}, {:ecto_psql_extras, "~> 0.7", [hex: :ecto_psql_extras, repo: "hexpm", optional: true]}, {:ecto_sqlite3_extras, "~> 1.1.7 or ~> 1.2.0", [hex: :ecto_sqlite3_extras, repo: "hexpm", optional: true]}, {:mime, "~> 1.6 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 0.19 or ~> 1.0", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "1681ab813ec26ca6915beb3414aa138f298e17721dc6a2bde9e6eb8a62360ff6"},
"phoenix_live_view": {:hex, :phoenix_live_view, "1.0.5", "f072166f87c44ffaf2b47b65c5ced8c375797830e517bfcf0a006fe7eb113911", [:mix], [{:floki, "~> 0.36", [hex: :floki, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6.15 or ~> 1.7.0", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.15", [hex: :plug, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "94abbc84df8a93a64514fc41528695d7326b6f3095e906b32f264ec4280811f3"},
"phoenix_pubsub": {:hex, :phoenix_pubsub, "2.1.3", "3168d78ba41835aecad272d5e8cd51aa87a7ac9eb836eabc42f6e57538e3731d", [:mix], [], "hexpm", "bba06bc1dcfd8cb086759f0edc94a8ba2bc8896d5331a1e2c2902bf8e36ee502"},
"phoenix_swoosh": {:hex, :phoenix_swoosh, "1.2.1", "b74ccaa8046fbc388a62134360ee7d9742d5a8ae74063f34eb050279de7a99e1", [:mix], [{:finch, "~> 0.8", [hex: :finch, repo: "hexpm", optional: true]}, {:hackney, "~> 1.10", [hex: :hackney, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_view, "~> 1.0 or ~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: false]}, {:swoosh, "~> 1.5", [hex: :swoosh, repo: "hexpm", optional: false]}], "hexpm", "4000eeba3f9d7d1a6bf56d2bd56733d5cadf41a7f0d8ffe5bb67e7d667e204a2"},
"phoenix_template": {:hex, :phoenix_template, "1.0.4", "e2092c132f3b5e5b2d49c96695342eb36d0ed514c5b252a77048d5969330d639", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "2c0c81f0e5c6753faf5cca2f229c9709919aba34fab866d3bc05060c9c444206"},
"phoenix_view": {:hex, :phoenix_view, "2.0.4", "b45c9d9cf15b3a1af5fb555c674b525391b6a1fe975f040fb4d913397b31abf4", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}], "hexpm", "4e992022ce14f31fe57335db27a28154afcc94e9983266835bb3040243eb620b"},
"plug": {:hex, :plug, "1.16.1", "40c74619c12f82736d2214557dedec2e9762029b2438d6d175c5074c933edc9d", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a13ff6b9006b03d7e33874945b2755253841b238c34071ed85b0e86057f8cddc"},
"plug_cowboy": {:hex, :plug_cowboy, "2.7.2", "fdadb973799ae691bf9ecad99125b16625b1c6039999da5fe544d99218e662e4", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "245d8a11ee2306094840c000e8816f0cbed69a23fc0ac2bcf8d7835ae019bb2f"},
"plug_cowboy": {:hex, :plug_cowboy, "2.7.3", "1304d36752e8bdde213cea59ef424ca932910a91a07ef9f3874be709c4ddb94b", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "77c95524b2aa5364b247fa17089029e73b951ebc1adeef429361eab0bb55819d"},
"plug_crypto": {:hex, :plug_crypto, "2.1.0", "f44309c2b06d249c27c8d3f65cfe08158ade08418cf540fd4f72d4d6863abb7b", [:mix], [], "hexpm", "131216a4b030b8f8ce0f26038bc4421ae60e4bb95c5cf5395e1421437824c4fa"},
"plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "79fd4fcf34d110605c26560cbae8f23c603ec4158c08298bd4360fdea90bb5cf"},
"poison": {:hex, :poison, "5.0.0", "d2b54589ab4157bbb82ec2050757779bfed724463a544b6e20d79855a9e43b24", [:mix], [{:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "11dc6117c501b80c62a7594f941d043982a1bd05a1184280c0d9166eb4d8d3fc"},
"poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm", "dad79704ce5440f3d5a3681c8590b9dc25d1a561e8f5a9c995281012860901e3"},
"postgrex": {:hex, :postgrex, "0.17.5", "0483d054938a8dc069b21bdd636bf56c487404c241ce6c319c1f43588246b281", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "50b8b11afbb2c4095a3ba675b4f055c416d0f3d7de6633a595fc131a828a67eb"},
"postgrex": {:hex, :postgrex, "0.20.0", "363ed03ab4757f6bc47942eff7720640795eb557e1935951c1626f0d303a3aed", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "d36ef8b36f323d29505314f704e21a1a038e2dc387c6409ee0cd24144e187c0f"},
"pot": {:hex, :pot, "1.0.2", "13abb849139fdc04ab8154986abbcb63bdee5de6ed2ba7e1713527e33df923dd", [:rebar3], [], "hexpm", "78fe127f5a4f5f919d6ea5a2a671827bd53eb9d37e5b4128c0ad3df99856c2e0"},
"ranch": {:hex, :ranch, "1.8.0", "8c7a100a139fd57f17327b6413e4167ac559fbc04ca7448e9be9057311597a1d", [:make, :rebar3], [], "hexpm", "49fbcfd3682fab1f5d109351b61257676da1a2fdbe295904176d5e521a2ddfe5"},
"ranch": {:hex, :ranch, "2.2.0", "25528f82bc8d7c6152c57666ca99ec716510fe0925cb188172f41ce93117b1b0", [:make, :rebar3], [], "hexpm", "fa0b99a1780c80218a4197a59ea8d3bdae32fbff7e88527d7d8a4787eff4f8e7"},
"recon": {:hex, :recon, "2.5.6", "9052588e83bfedfd9b72e1034532aee2a5369d9d9343b61aeb7fbce761010741", [:mix, :rebar3], [], "hexpm", "96c6799792d735cc0f0fd0f86267e9d351e63339cbe03df9d162010cefc26bb0"},
"remote_ip": {:hex, :remote_ip, "1.1.0", "cb308841595d15df3f9073b7c39243a1dd6ca56e5020295cb012c76fbec50f2d", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "616ffdf66aaad6a72fc546dabf42eed87e2a99e97b09cbd92b10cc180d02ed74"},
"req": {:hex, :req, "0.5.8", "50d8d65279d6e343a5e46980ac2a70e97136182950833a1968b371e753f6a662", [:mix], [{:brotli, "~> 0.3.1", [hex: :brotli, repo: "hexpm", optional: true]}, {:ezstd, "~> 1.0", [hex: :ezstd, repo: "hexpm", optional: true]}, {:finch, "~> 0.17", [hex: :finch, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mime, "~> 2.0.6 or ~> 2.1", [hex: :mime, repo: "hexpm", optional: false]}, {:nimble_csv, "~> 1.0", [hex: :nimble_csv, repo: "hexpm", optional: true]}, {:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "d7fc5898a566477e174f26887821a3c5082b243885520ee4b45555f5d53f40ef"},
"rewrite": {:hex, :rewrite, "1.1.2", "f5a5d10f5fed1491a6ff48e078d4585882695962ccc9e6c779bae025d1f92eda", [:mix], [{:glob_ex, "~> 0.1", [hex: :glob_ex, repo: "hexpm", optional: false]}, {:sourceror, "~> 1.0", [hex: :sourceror, repo: "hexpm", optional: false]}, {:text_diff, "~> 0.1", [hex: :text_diff, repo: "hexpm", optional: false]}], "hexpm", "7f8b94b1e3528d0a47b3e8b7bfeca559d2948a65fa7418a9ad7d7712703d39d4"},
"search_parser": {:git, "https://github.com/FloatingGhost/pleroma-contrib-search-parser.git", "08971a81e68686f9ac465cfb6661d51c5e4e1e7f", [ref: "08971a81e68686f9ac465cfb6661d51c5e4e1e7f"]},
"sleeplocks": {:hex, :sleeplocks, "1.1.3", "96a86460cc33b435c7310dbd27ec82ca2c1f24ae38e34f8edde97f756503441a", [:rebar3], [], "hexpm", "d3b3958552e6eb16f463921e70ae7c767519ef8f5be46d7696cc1ed649421321"},
"sourceror": {:hex, :sourceror, "1.7.1", "599d78f4cc2be7d55c9c4fd0a8d772fd0478e3a50e726697c20d13d02aa056d4", [:mix], [], "hexpm", "cd6f268fe29fa00afbc535e215158680a0662b357dc784646d7dff28ac65a0fc"},
"spitfire": {:hex, :spitfire, "0.1.5", "10b041e781bff9544d2fdf00893e1a325758408c5366a9bfa4333072568659b1", [:mix], [], "hexpm", "866a55d21fe827934ff38200111335c9dd311df13cbf2580ed71d84b0a783150"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.7", "354c321cf377240c7b8716899e182ce4890c5938111a1296add3ec74cf1715df", [:make, :mix, :rebar3], [], "hexpm", "fe4c190e8f37401d30167c8c405eda19469f34577987c76dde613e838bbc67f8"},
"statistex": {:hex, :statistex, "1.0.0", "f3dc93f3c0c6c92e5f291704cf62b99b553253d7969e9a5fa713e5481cd858a5", [:mix], [], "hexpm", "ff9d8bee7035028ab4742ff52fc80a2aa35cece833cf5319009b52f1b5a86c27"},
"sweet_xml": {:hex, :sweet_xml, "0.7.4", "a8b7e1ce7ecd775c7e8a65d501bc2cd933bff3a9c41ab763f5105688ef485d08", [:mix], [], "hexpm", "e7c4b0bdbf460c928234951def54fe87edf1a170f6896675443279e2dbeba167"},
"sweet_xml": {:hex, :sweet_xml, "0.7.5", "803a563113981aaac202a1dbd39771562d0ad31004ddbfc9b5090bdcd5605277", [:mix], [], "hexpm", "193b28a9b12891cae351d81a0cead165ffe67df1b73fe5866d10629f4faefb12"},
"swoosh": {:hex, :swoosh, "1.14.4", "94e9dba91f7695a10f49b0172c4a4cb658ef24abef7e8140394521b7f3bbb2d4", [:mix], [{:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.4 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "081c5a590e4ba85cc89baddf7b2beecf6c13f7f84a958f1cd969290815f0f026"},
"syslog": {:hex, :syslog, "1.1.0", "6419a232bea84f07b56dc575225007ffe34d9fdc91abe6f1b2f254fd71d8efc2", [:rebar3], [], "hexpm", "4c6a41373c7e20587be33ef841d3de6f3beba08519809329ecc4d27b15b659e1"},
"table_rex": {:hex, :table_rex, "4.0.0", "3c613a68ebdc6d4d1e731bc973c233500974ec3993c99fcdabb210407b90959b", [:mix], [], "hexpm", "c35c4d5612ca49ebb0344ea10387da4d2afe278387d4019e4d8111e815df8f55"},
"table_rex": {:hex, :table_rex, "4.1.0", "fbaa8b1ce154c9772012bf445bfb86b587430fb96f3b12022d3f35ee4a68c918", [:mix], [], "hexpm", "95932701df195d43bc2d1c6531178fc8338aa8f38c80f098504d529c43bc2601"},
"telemetry": {:hex, :telemetry, "1.3.0", "fedebbae410d715cf8e7062c96a1ef32ec22e764197f70cda73d82778d61e7a2", [:rebar3], [], "hexpm", "7015fc8919dbe63764f4b4b87a95b7c0996bd539e0d499be6ec9d7f3875b79e6"},
"telemetry_metrics": {:hex, :telemetry_metrics, "0.6.2", "2caabe9344ec17eafe5403304771c3539f3b6e2f7fb6a6f602558c825d0d0bfb", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9b43db0dc33863930b9ef9d27137e78974756f5f198cae18409970ed6fa5b561"},
"telemetry_metrics_prometheus": {:hex, :telemetry_metrics_prometheus, "1.1.0", "1cc23e932c1ef9aa3b91db257ead31ea58d53229d407e059b29bb962c1505a13", [:mix], [{:plug_cowboy, "~> 2.1", [hex: :plug_cowboy, repo: "hexpm", optional: false]}, {:telemetry_metrics_prometheus_core, "~> 1.0", [hex: :telemetry_metrics_prometheus_core, repo: "hexpm", optional: false]}], "hexpm", "d43b3659b3244da44fe0275b717701542365d4519b79d9ce895b9719c1ce4d26"},
"telemetry_metrics_prometheus_core": {:hex, :telemetry_metrics_prometheus_core, "1.1.0", "4e15f6d7dbedb3a4e3aed2262b7e1407f166fcb9c30ca3f96635dfbbef99965c", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "0dd10e7fe8070095df063798f82709b0a1224c31b8baf6278b423898d591a069"},
"telemetry_poller": {:hex, :telemetry_poller, "1.1.0", "58fa7c216257291caaf8d05678c8d01bd45f4bdbc1286838a28c4bb62ef32999", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9eb9d9cbfd81cbd7cdd24682f8711b6e2b691289a0de6826e58452f28c103c8f"},
"temple": {:git, "https://akkoma.dev/AkkomaGang/temple.git", "066a699ade472d8fa42a9d730b29a61af9bc8b59", [ref: "066a699ade472d8fa42a9d730b29a61af9bc8b59"]},
"tesla": {:hex, :tesla, "1.13.0", "24a068a48d107080dd7c943a593997eee265977a38020eb2ab657cca78a12502", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:mox, "~> 1.0", [hex: :mox, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "7b8fc8f6b0640fa0d090af7889d12eb396460e044b6f8688a8e55e30406a2200"},
"tesla": {:hex, :tesla, "1.14.1", "71c5b031b4e089c0fbfb2b362e24b4478465773ae4ef569760a8c2899ad1e73c", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.21", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:mox, "~> 1.0", [hex: :mox, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "c1dde8140a49a3bef5bb622356e77ac5a24ad0c8091f12c3b7fc1077ce797155"},
"text_diff": {:hex, :text_diff, "0.1.0", "1caf3175e11a53a9a139bc9339bd607c47b9e376b073d4571c031913317fecaa", [:mix], [], "hexpm", "d1ffaaecab338e49357b6daa82e435f877e0649041ace7755583a0ea3362dbd7"},
"timex": {:hex, :timex, "3.7.11", "bb95cb4eb1d06e27346325de506bcc6c30f9c6dea40d1ebe390b262fad1862d1", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.20", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 1.1", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "8b9024f7efbabaf9bd7aa04f65cf8dcd7c9818ca5737677c7b76acbc6a94d1aa"},
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "bd4fde4c15f3e993a999e019d64347489b91b7a9096af68b2bdadd192afa693f"},
"tzdata": {:hex, :tzdata, "1.1.2", "45e5f1fcf8729525ec27c65e163be5b3d247ab1702581a94674e008413eef50b", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "cec7b286e608371602318c414f344941d5eb0375e14cfdab605cca2fe66cba8b"},
@ -131,6 +143,6 @@
"vex": {:hex, :vex, "0.9.2", "fe061acc9e0907d983d46b51bf35d58176f0fe6eb7ba3b33c9336401bf42b6d1", [:mix], [], "hexpm", "76e709a9762e98c6b462dfce92e9b5dfbf712839227f2da8add6dd11549b12cb"},
"web_push_encryption": {:hex, :web_push_encryption, "0.3.1", "76d0e7375142dfee67391e7690e89f92578889cbcf2879377900b5620ee4708d", [:mix], [{:httpoison, "~> 1.0", [hex: :httpoison, repo: "hexpm", optional: false]}, {:jose, "~> 1.11.1", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "4f82b2e57622fb9337559058e8797cb0df7e7c9790793bdc4e40bc895f70e2a2"},
"websock": {:hex, :websock, "0.5.3", "2f69a6ebe810328555b6fe5c831a851f485e303a7c8ce6c5f675abeb20ebdadc", [:mix], [], "hexpm", "6105453d7fac22c712ad66fab1d45abdf049868f253cf719b625151460b8b453"},
"websock_adapter": {:hex, :websock_adapter, "0.5.7", "65fa74042530064ef0570b75b43f5c49bb8b235d6515671b3d250022cb8a1f9e", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "d0f478ee64deddfec64b800673fd6e0c8888b079d9f3444dd96d2a98383bdbd1"},
"websock_adapter": {:hex, :websock_adapter, "0.5.8", "3b97dc94e407e2d1fc666b2fb9acf6be81a1798a2602294aac000260a7c4a47d", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "315b9a1865552212b5f35140ad194e67ce31af45bcee443d4ecb96b5fd3f3782"},
"websockex": {:hex, :websockex, "0.4.3", "92b7905769c79c6480c02daacaca2ddd49de936d912976a4d3c923723b647bf0", [:mix], [], "hexpm", "95f2e7072b85a3a4cc385602d42115b73ce0b74a9121d0d6dbbf557645ac53e4"},
}

View file

@ -11,7 +11,7 @@ def up do
# Also this MUST use select, else the migration will fail in future installs with new user fields!
from(u in Pleroma.User,
where: u.local == true,
select: {u.id, u.keys, u.ap_id}
select: {u.id, fragment("?.keys", u), u.ap_id}
)
|> Repo.stream(timeout: :infinity)
|> Enum.each(fn
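# A sketch of why the fragment matters (illustrative, not part of the
# migration): a later migration in this set drops `keys` from the
# Pleroma.User schema, after which Ecto rejects `u.keys` in queries,
# while a raw column reference keeps working:
#
#   select: {u.id, u.keys, u.ap_id}                 # breaks once the schema field is gone
#   select: {u.id, fragment("?.keys", u), u.ap_id}  # stays valid
#
# Selecting whole structs instead would make the SELECT list track every
# schema field and fail on installs whose database hasn't caught up yet.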

View file

@ -0,0 +1,38 @@
# Akkoma: Magically expressive social media
# Copyright © 2024 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Repo.Migrations.RemoteUserCountEstimateFunction do
use Ecto.Migration
@function_name "estimate_remote_user_count"
def up() do
# yep, this EXPLAIN (ab)use is blessed by the PostgreSQL wiki:
# https://wiki.postgresql.org/wiki/Count_estimate
"""
CREATE OR REPLACE FUNCTION #{@function_name}()
RETURNS integer
LANGUAGE plpgsql AS $$
DECLARE plan jsonb;
BEGIN
EXECUTE '
EXPLAIN (FORMAT JSON)
SELECT *
FROM public.users
WHERE local = false AND
is_active = true AND
invisible = false AND
nickname IS NOT NULL;
' INTO plan;
RETURN plan->0->'Plan'->'Plan Rows';
END;
$$;
"""
|> execute()
end
def down() do
execute("DROP FUNCTION IF EXISTS #{@function_name}()")
end
end
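Once applied, the estimate can be read with a plain query; a quick sanity check from iex might look like this (illustrative):
# Returns the planner's row estimate rather than a slow exact COUNT(*);
# accuracy depends on how fresh the table statistics are (autovacuum/ANALYZE).
{:ok, %{rows: [[estimate]]}} = Pleroma.Repo.query("SELECT estimate_remote_user_count()")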

View file

@ -0,0 +1,13 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2023 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Repo.Migrations.RemoveUserApEnabled do
use Ecto.Migration
def change do
alter table(:users) do
remove(:ap_enabled, :boolean, default: false, null: false)
end
end
end

View file

@ -0,0 +1,24 @@
defmodule Pleroma.Repo.Migrations.SigningKeyNullability do
use Ecto.Migration
import Ecto.Query
def up() do
# Delete existing NULL entries; they are useless
Pleroma.User.SigningKey
|> where([s], is_nil(s.user_id) or is_nil(s.public_key))
|> Pleroma.Repo.delete_all()
alter table(:signing_keys) do
modify :user_id, :uuid, null: false
modify :public_key, :text, null: false
end
end
def down() do
alter table(:signing_keys) do
modify :user_id, :uuid, null: true
modify :public_key, :text, null: true
end
end
end
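The modify/3 calls compile down to ordinary column alterations, roughly the following SQL (sketch; Ecto also re-states the column type):
ALTER TABLE signing_keys
  ALTER COLUMN user_id SET NOT NULL,
  ALTER COLUMN public_key SET NOT NULL;
This only succeeds because the NULL rows were deleted first.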

View file

@ -0,0 +1,36 @@
defmodule Pleroma.Repo.Migrations.SigningKeyUniqueUserId do
use Ecto.Migration
import Ecto.Query
def up() do
# If dupes exist for any local user we do NOT want to delete the genuine privkey alongside the fake.
# Instead just filter out anything pertaining to local users; if dupes exist, manual intervention
# is required anyway and index creation will just fail later (check against the legacy field in the users table)
dupes =
Pleroma.User.SigningKey
|> join(:inner, [s], u in Pleroma.User, on: s.user_id == u.id)
|> group_by([s], s.user_id)
|> having([], count() > 1)
|> having([_s, u], not fragment("bool_or(?)", u.local))
|> select([s], s.user_id)
# Delete existing remote duplicates
# they'll be reinserted on the next user update
# or proactively fetched when receiving a signature from them
Pleroma.User.SigningKey
|> where([s], s.user_id in subquery(dupes))
|> Pleroma.Repo.delete_all()
drop_if_exists(index(:signing_keys, [:user_id]))
create_if_not_exists(
index(:signing_keys, [:user_id], name: :signing_keys_user_id_index, unique: true)
)
end
def down() do
drop_if_exists(index(:signing_keys, [:user_id]))
create_if_not_exists(index(:signing_keys, [:user_id], name: :signing_keys_user_id_index))
end
end
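For reference, the dupes query corresponds to roughly this SQL (illustrative; the query Ecto generates differs in detail):
SELECT s.user_id
FROM signing_keys AS s
INNER JOIN users AS u ON s.user_id = u.id
GROUP BY s.user_id
HAVING count(*) > 1 AND NOT bool_or(u.local);
bool_or(u.local) turns true as soon as any duplicate row belongs to a local user, which is exactly the case the comment above leaves to manual intervention.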

View file

@ -0,0 +1,31 @@
defmodule Pleroma.Repo.Migrations.UsersDropLegacyKeyFields do
use Ecto.Migration
def up() do
alter table(:users) do
remove :keys, :text
remove :public_key, :text
end
end
def down() do
# Use a raw query since the "keys" field may no longer exist in the Elixir Ecto schema,
# which would cause issues when migrating data back; the column adds have to be raw queries too
"""
ALTER TABLE public.users
ADD COLUMN keys text,
ADD COLUMN public_key text;
"""
|> Pleroma.Repo.query!([], timeout: :infinity)
"""
UPDATE public.users AS u
SET keys = s.private_key
FROM public.signing_keys AS s
WHERE s.user_id = u.id AND
u.local AND
s.private_key IS NOT NULL;
"""
|> Pleroma.Repo.query!([], timeout: :infinity)
end
end

View file

@ -0,0 +1,7 @@
defmodule Pleroma.Repo.Migrations.UpgradeObanToV12 do
use Ecto.Migration
def up, do: Oban.Migrations.up(version: 12)
def down, do: Oban.Migrations.down(version: 12)
end
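Nothing Akkoma-specific here: Oban ships its own migrations and this merely pins the jobs schema to version 12. Assuming the standard Ecto migrator, it can be applied from a release console like any other pending migration:
Ecto.Migrator.run(Pleroma.Repo, :up, all: true)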

View file

@ -61,6 +61,34 @@ defmodule Pleroma.HTML.Scrubber.Default do
Meta.allow_tag_with_this_attribute_values(:span, "class", [
"h-card",
"quote-inline",
# "FEP-c16b: Formatting MFM functions" tags that Akkoma supports
# NOTE: Maybe it would be better to have something like "allow `mfm-*`",
# but at the moment of writing this is not a thing in the HTML parser we use
# The following are the non-animated MFM
"mfm-center",
"mfm-flip",
"mfm-font",
"mfm-blur",
"mfm-rotate",
"mfm-x2",
"mfm-x3",
"mfm-x4",
"mfm-position",
"mfm-scale",
"mfm-fg",
"mfm-bg",
# The following are the animated MFM
"mfm-jelly",
"mfm-twitch",
"mfm-shake",
"mfm-spin",
"mfm-jump",
"mfm-bounce",
"mfm-rainbow",
"mfm-tada",
"mfm-sparkle",
# MFM legacy
# This is for backwards compatibility with posts formatted on Akkoma before support for FEP-c16b
"mfm",
"mfm _mfm_tada_",
"mfm _mfm_jelly_",
@ -79,6 +107,26 @@ defmodule Pleroma.HTML.Scrubber.Default do
])
Meta.allow_tag_with_these_attributes(:span, [
# "FEP-c16b: Formatting MFM functions" attributes that Akkoma supports
# NOTE: Maybe it would be better to have something like "allow `data-mfm-*`",
# but at the moment of writing this is not a thing in the HTML parser we use
"data-mfm-h",
"data-mfm-v",
"data-mfm-x",
"data-mfm-y",
"data-mfm-alternate",
"data-mfm-speed",
"data-mfm-deg",
"data-mfm-left",
"data-mfm-serif",
"data-mfm-monospace",
"data-mfm-cursive",
"data-mfm-fantasy",
"data-mfm-emoji",
"data-mfm-math",
"data-mfm-color",
# MFM legacy
# This is for backwards compatibility with posts formatted on Akkoma before support for FEP-c16b
"data-x",
"data-y",
"data-h",

View file

@ -0,0 +1,54 @@
{
"@context": [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
{
"manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
"sensitive": "as:sensitive",
"movedTo": "as:movedTo",
"Hashtag": "as:Hashtag",
"ostatus": "http://ostatus.org#",
"atomUri": "ostatus:atomUri",
"inReplyToAtomUri": "ostatus:inReplyToAtomUri",
"conversation": "ostatus:conversation",
"toot": "http://joinmastodon.org/ns#",
"Emoji": "toot:Emoji",
"alsoKnownAs": {
"@id": "as:alsoKnownAs",
"@type": "@id"
}
}
],
"id": "http://remote.example/users/with_key_id_of_admin-mastodon.example.org",
"type": "Person",
"following": "http://remote.example/users/evil/following",
"followers": "http://remote.example/users/evil/followers",
"inbox": "http://remote.example/users/evil/inbox",
"outbox": "http://remote.example/users/evil/outbox",
"preferredUsername": "evil",
"name": null,
"discoverable": "true",
"summary": "hii",
"url": "http://remote.example/@evil",
"manuallyApprovesFollowers": false,
"publicKey": {
"id": "http://mastodon.example.org/users/admin#main-key",
"owner": "http://remote.example/users/evil",
"publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAtc4Tir+3ADhSNF6VKrtW\nOU32T01w7V0yshmQei38YyiVwVvFu8XOP6ACchkdxbJ+C9mZud8qWaRJKVbFTMUG\nNX4+6Q+FobyuKrwN7CEwhDALZtaN2IPbaPd6uG1B7QhWorrY+yFa8f2TBM3BxnUy\nI4T+bMIZIEYG7KtljCBoQXuTQmGtuffO0UwJksidg2ffCF5Q+K//JfQagJ3UzrR+\nZXbKMJdAw4bCVJYs4Z5EhHYBwQWiXCyMGTd7BGlmMkY6Av7ZqHKC/owp3/0EWDNz\nNqF09Wcpr3y3e8nA10X40MJqp/wR+1xtxp+YGbq/Cj5hZGBG7etFOmIpVBrDOhry\nBwIDAQAB\n-----END PUBLIC KEY-----\n"
},
"attachment": [],
"endpoints": {
"sharedInbox": "http://remote.example/inbox"
},
"icon": {
"type": "Image",
"mediaType": "image/jpeg",
"url": "https://cdn.niu.moe/accounts/avatars/000/033/323/original/fd7f8ae0b3ffedc9.jpeg"
},
"image": {
"type": "Image",
"mediaType": "image/png",
"url": "https://cdn.niu.moe/accounts/headers/000/033/323/original/850b3448fa5fd477.png"
},
"alsoKnownAs": []
}

View file

@ -56,8 +56,19 @@ test "it returns error if public key is nil" do
describe "refetch_public_key/1" do
test "it returns key" do
clear_config([:activitypub, :min_key_refetch_interval], 0)
ap_id = "https://mastodon.social/users/lambadalambda"
%Pleroma.User{signing_key: sk} =
Pleroma.User.get_or_fetch_by_ap_id(ap_id)
|> then(fn {:ok, u} -> u end)
|> Pleroma.User.SigningKey.load_key()
{:ok, _} =
%{sk | public_key: "-----BEGIN PUBLIC KEY-----\nasdfghjkl"}
|> Ecto.Changeset.change()
|> Pleroma.Repo.update()
assert Signature.refetch_public_key(make_fake_conn(ap_id)) == {:ok, @rsa_public_key}
end
end
@ -114,16 +125,6 @@ test "it returns signature headers" do
end
end
describe "key_id_to_actor_id/1" do
test "it reverses the key id to actor id" do
user =
insert(:user)
|> with_signing_key()
assert Signature.key_id_to_actor_id(user.signing_key.key_id) == {:ok, user.ap_id}
end
end
describe "signed_date" do
test "it returns formatted current date" do
with_mock(NaiveDateTime, utc_now: fn -> ~N[2019-08-23 18:11:24.822233] end) do

View file

@ -0,0 +1,305 @@
# Akkoma: Magically expressive social media
# Copyright © 2024 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.User.SigningKeyTests do
alias Pleroma.User
alias Pleroma.User.SigningKey
alias Pleroma.Repo
use Pleroma.DataCase
use Oban.Testing, repo: Pleroma.Repo
import Pleroma.Factory
defp maybe_put(map, _, nil), do: map
defp maybe_put(map, key, val), do: Kernel.put_in(map, key, val)
defp get_body_actor(key_id \\ nil, user_id \\ nil, owner_id \\ nil) do
owner_id = owner_id || user_id
File.read!("test/fixtures/tesla_mock/admin@mastdon.example.org.json")
|> Jason.decode!()
|> maybe_put(["id"], user_id)
|> maybe_put(["publicKey", "id"], key_id)
|> maybe_put(["publicKey", "owner"], owner_id)
|> Jason.encode!()
end
defp get_body_rawkey(key_id, owner, pem \\ "RSA begin buplic key") do
%{
"type" => "CryptographicKey",
"id" => key_id,
"owner" => owner,
"publicKeyPem" => pem
}
|> Jason.encode!()
end
defmacro mock_tesla(
url,
get_body,
status \\ 200,
headers \\ []
) do
quote do
Tesla.Mock.mock(fn
%{method: :get, url: unquote(url)} ->
%Tesla.Env{
status: unquote(status),
body: unquote(get_body),
url: unquote(url),
headers: [
{"content-type",
"application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\""}
| unquote(headers)
]
}
end)
end
end
describe "succesfully" do
test "inserts key and new user on fetch" do
ap_id_actor = "https://mastodon.example.org/signing-key-test/actor"
ap_id_key = ap_id_actor <> "#main-key"
ap_doc = get_body_actor(ap_id_key, ap_id_actor)
mock_tesla(ap_id_actor, ap_doc)
{:ok, %SigningKey{} = key} = SigningKey.fetch_remote_key(ap_id_key)
user = User.get_by_id(key.user_id)
assert match?(%User{}, user)
user = SigningKey.load_key(user)
assert user.ap_id == ap_id_actor
assert user.signing_key.key_id == ap_id_key
assert user.signing_key.key_id == key.key_id
assert user.signing_key.private_key == nil
end
test "updates existing key" do
user =
insert(:user, local: false, domain: "mastodon.example.org")
|> with_signing_key()
ap_id_actor = user.ap_id
ap_doc = get_body_actor(user.signing_key.key_id, ap_id_actor)
mock_tesla(ap_id_actor, ap_doc)
old_pem = user.signing_key.public_key
old_priv = user.signing_key.private_key
# note: the returned value does not fully match the value stored in the database
# since inserted_at isn't changed on upserts
{:ok, %SigningKey{} = key} = SigningKey.fetch_remote_key(user.signing_key.key_id)
refreshed_key = Repo.get_by(SigningKey, key_id: key.key_id)
assert match?(%SigningKey{}, refreshed_key)
refute refreshed_key.public_key == old_pem
assert refreshed_key.private_key == old_priv
assert refreshed_key.user_id == user.id
assert key.public_key == refreshed_key.public_key
end
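# Illustrative note: this is the normal outcome of an Ecto upsert along the
# lines of
#   Repo.insert(changeset,
#     on_conflict: {:replace, [:public_key, :updated_at]},
#     conflict_target: :key_id
#   )
# (hypothetical options, not necessarily the exact ones used here). Since
# :inserted_at is absent from the replace list, the stored timestamp survives
# while the struct returned by insert/2 carries the freshly built value.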
test "finds known key by key_id" do
sk = insert(:signing_key, key_id: "https://remote.example/signing-key-test/some-kown-key")
{:ok, key} = SigningKey.get_or_fetch_by_key_id(sk.key_id)
assert sk == key
end
test "finds key for remote user" do
user_with_preload =
insert(:user, local: false)
|> with_signing_key()
user = User.get_by_id(user_with_preload.id)
assert !match?(%SigningKey{}, user.signing_key)
user = SigningKey.load_key(user)
assert match?(%SigningKey{}, user.signing_key)
# the initial "with_signing_key" doesn't set timestamps, and meta differs (loaded vs built)
# thus clear affected fields before comparison
found_sk = %{user.signing_key | inserted_at: nil, updated_at: nil, __meta__: nil}
ref_sk = %{user_with_preload.signing_key | __meta__: nil}
assert found_sk == ref_sk
end
test "finds remote user id by key id" do
user =
insert(:user, local: false)
|> with_signing_key()
uid = SigningKey.key_id_to_user_id(user.signing_key.key_id)
assert uid == user.id
end
test "finds remote user ap id by key id" do
user =
insert(:user, local: false)
|> with_signing_key()
uapid = SigningKey.key_id_to_ap_id(user.signing_key.key_id)
assert uapid == user.ap_id
end
end
test "won't fetch keys for local users" do
user =
insert(:user, local: true)
|> with_signing_key()
{:error, _} = SigningKey.fetch_remote_key(user.signing_key.key_id)
end
test "fails insert with overlapping key owner" do
user =
insert(:user, local: false)
|> with_signing_key()
second_key_id =
user.signing_key.key_id
|> URI.parse()
|> Map.put(:fragment, nil)
|> Map.put(:query, nil)
|> URI.to_string()
|> then(fn id -> id <> "/second_key" end)
ap_doc = get_body_rawkey(second_key_id, user.ap_id)
mock_tesla(second_key_id, ap_doc)
res = SigningKey.fetch_remote_key(second_key_id)
assert match?({:error, %{errors: _}}, res)
{:error, cs} = res
assert Keyword.has_key?(cs.errors, :user_id)
end
test "Fetched raw SigningKeys cannot take over arbitrary users" do
# in theory cross-domain key and actor are fine, IF and ONLY IF
# the actor also links back to this key, but this isn't supported atm anyway
user =
insert(:user, local: false)
|> with_signing_key()
remote_key_id = "https://remote.example/keys/for_local"
keydoc = get_body_rawkey(remote_key_id, user.ap_id)
mock_tesla(remote_key_id, keydoc)
{:error, _} = SigningKey.fetch_remote_key(remote_key_id)
refreshed_org_key = Repo.get_by(SigningKey, key_id: user.signing_key.key_id)
refreshed_user_key = Repo.get_by(SigningKey, user_id: user.id)
assert match?(%SigningKey{}, refreshed_org_key)
assert match?(%SigningKey{}, refreshed_user_key)
actor_host = URI.parse(user.ap_id).host
org_key_host = URI.parse(refreshed_org_key.key_id).host
usr_key_host = URI.parse(refreshed_user_key.key_id).host
assert actor_host == org_key_host
assert actor_host == usr_key_host
refute usr_key_host == "remote.example"
assert refreshed_user_key == refreshed_org_key
assert user.signing_key.key_id == refreshed_org_key.key_id
end
test "Fetched non-raw SigningKey cannot take over arbitrary users" do
# this actually comes free with our fetch ID checks, but let's verify it here too for good measure
user =
insert(:user, local: false)
|> with_signing_key()
remote_key_id = "https://remote.example/keys#for_local"
keydoc = get_body_actor(remote_key_id, user.ap_id, user.ap_id)
mock_tesla(remote_key_id, keydoc)
{:error, _} = SigningKey.fetch_remote_key(remote_key_id)
refreshed_org_key = Repo.get_by(SigningKey, key_id: user.signing_key.key_id)
refreshed_user_key = Repo.get_by(SigningKey, user_id: user.id)
assert match?(%SigningKey{}, refreshed_org_key)
assert match?(%SigningKey{}, refreshed_user_key)
actor_host = URI.parse(user.ap_id).host
org_key_host = URI.parse(refreshed_org_key.key_id).host
usr_key_host = URI.parse(refreshed_user_key.key_id).host
assert actor_host == org_key_host
assert actor_host == usr_key_host
refute usr_key_host == "remote.example"
assert refreshed_user_key == refreshed_org_key
assert user.signing_key.key_id == refreshed_org_key.key_id
end
test "remote users sharing signing key ID don't break our database" do
# in principle a valid setup using this can be constructed,
# but so far it has not been observed in practice and our db schema cannot handle it.
# Thus make sure it doesn't break anything in our db but simply gets rejected
key_id = "https://mastodon.example.org/the_one_key"
user1 =
insert(:user, local: false, domain: "mastodon.example.org")
|> with_signing_key(%{key_id: key_id})
key_owner = "https://mastodon.example.org/#"
user2_ap_id = user1.ap_id <> "22"
user2_doc = get_body_actor(user1.signing_key.key_id, user2_ap_id, key_owner)
user3_ap_id = user1.ap_id <> "333"
user3_doc = get_body_actor(user1.signing_key.key_id, user3_ap_id)
standalone_key_doc =
get_body_rawkey(key_id, "https://mastodon.example.org/#", user1.signing_key.public_key)
ap_headers = [
{"content-type", "application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\""}
]
Tesla.Mock.mock(fn
%{method: :get, url: ^key_id} ->
%Tesla.Env{
status: 200,
body: standalone_key_doc,
url: key_id,
headers: ap_headers
}
%{method: :get, url: ^user2_ap_id} ->
%Tesla.Env{
status: 200,
body: user2_doc,
url: user2_ap_id,
headers: ap_headers
}
%{method: :get, url: ^user3_ap_id} ->
%Tesla.Env{
status: 200,
body: user3_doc,
url: user3_ap_id,
headers: ap_headers
}
end)
{:error, _} = SigningKey.fetch_remote_key(key_id)
{:ok, user2} = User.get_or_fetch_by_ap_id(user2_ap_id)
{:ok, user3} = User.get_or_fetch_by_ap_id(user3_ap_id)
{:ok, db_key} = SigningKey.get_or_fetch_by_key_id(key_id)
keys =
from(s in SigningKey, where: s.key_id == ^key_id)
|> Repo.all()
assert match?([%SigningKey{}], keys)
assert [db_key] == keys
assert db_key.user_id == user1.id
assert match?({:ok, _}, SigningKey.public_key(user1))
assert {:error, "key not found"} == SigningKey.public_key(user2)
assert {:error, "key not found"} == SigningKey.public_key(user3)
end
end

View file

@ -10,6 +10,7 @@ defmodule Pleroma.UserTest do
alias Pleroma.Repo
alias Pleroma.Tests.ObanHelpers
alias Pleroma.User
alias Pleroma.User.SigningKey
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.CommonAPI
@ -460,12 +461,7 @@ test "it sends a welcome message if it is set" do
}
)
setup do:
clear_config(:mrf,
policies: [
Pleroma.Web.ActivityPub.MRF.SimplePolicy
]
)
setup do: clear_config([:mrf, :policies], [Pleroma.Web.ActivityPub.MRF.SimplePolicy])
test "it sends a welcome email message if it is set" do
welcome_user = insert(:user)
@ -908,6 +904,35 @@ test "it doesn't fail on invalid alsoKnownAs entries" do
assert {:ok, %User{also_known_as: []}} =
User.get_or_fetch_by_ap_id("https://mbp.example.com/")
end
test "doesn't allow key_id poisoning" do
{:error, {:validate, {:error, "Problematic actor-key pairing:" <> _}}} =
User.fetch_by_ap_id(
"http://remote.example/users/with_key_id_of_admin-mastodon.example.org"
)
used_key_id = "http://mastodon.example.org/users/admin#main-key"
refute Repo.get_by(SigningKey, key_id: used_key_id)
{:ok, user} = User.fetch_by_ap_id("http://mastodon.example.org/users/admin")
user = SigningKey.load_key(user)
# ensure we checked for the right key before
assert user.signing_key.key_id == used_key_id
end
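# Context for the test above: the remote.example fixture added in this
# changeset declares publicKey.id "http://mastodon.example.org/users/admin#main-key"
# on an actor living on a different host. Accepting it would poison the
# key_id -> actor mapping, attributing requests signed with that key id to the
# evil actor, so the fetch must fail validation and leave no SigningKey row behind.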
test "doesn't allow key_id takeovers" do
{:ok, user} = User.fetch_by_ap_id("http://mastodon.example.org/users/admin")
user = SigningKey.load_key(user)
{:error, {:validate, {:error, "Problematic actor-key pairing:" <> _}}} =
User.fetch_by_ap_id(
"http://remote.example/users/with_key_id_of_admin-mastodon.example.org"
)
refreshed_sk = Repo.get_by(SigningKey, key_id: user.signing_key.key_id)
assert refreshed_sk.user_id == user.id
end
end
test "returns an ap_id for a user" do
@ -1678,7 +1703,6 @@ test "delete/1 purges a user when they wouldn't be fully deleted" do
bio: "eyy lmao",
name: "qqqqqqq",
password_hash: "pdfk2$1b3n159001",
keys: "RSA begin buplic key",
avatar: %{"a" => "b"},
tags: ["qqqqq"],
banner: %{"a" => "b"},
@ -1694,7 +1718,6 @@ test "delete/1 purges a user when they wouldn't be fully deleted" do
confirmation_token: "qqqq",
domain_blocks: ["lain.com"],
is_active: false,
ap_enabled: true,
is_moderator: true,
is_admin: true,
mastofe_settings: %{"a" => "b"},
@ -1734,7 +1757,6 @@ test "delete/1 purges a user when they wouldn't be fully deleted" do
confirmation_token: nil,
domain_blocks: [],
is_active: false,
ap_enabled: false,
is_moderator: false,
is_admin: false,
mastofe_settings: nil,
@ -1798,10 +1820,6 @@ test "unsuggests a user" do
end
end
test "get_public_key_for_ap_id fetches a user that's not in the db" do
assert {:ok, _key} = User.get_public_key_for_ap_id("http://mastodon.example.org/users/admin")
end
describe "per-user rich-text filtering" do
test "html_filter_policy returns default policies, when rich-text is enabled" do
user = insert(:user)
@ -2204,6 +2222,56 @@ test "removes report notifs when user isn't superuser any more" do
# is not a superuser any more
assert [] = Notification.for_user(user)
end
test "updates public key pem" do
# note: in the future key updates might be limited to announced expirations
user_org =
insert(:user, local: false)
|> with_signing_key()
old_pub = user_org.signing_key.public_key
new_pub = "BEGIN RSA public key"
refute old_pub == new_pub
{:ok, %User{} = user} =
User.update_and_set_cache(user_org, %{signing_key: %{public_key: new_pub}})
user = SigningKey.load_key(user)
assert user.signing_key.public_key == new_pub
end
test "updates public key id if valid" do
# note: in the future key updates might be limited to announced expirations
user_org =
insert(:user, local: false)
|> with_signing_key()
old_kid = user_org.signing_key.key_id
new_kid = old_kid <> "_v2"
{:ok, %User{} = user} =
User.update_and_set_cache(user_org, %{signing_key: %{key_id: new_kid}})
user = SigningKey.load_key(user)
assert user.signing_key.key_id == new_kid
refute Repo.get_by(SigningKey, key_id: old_kid)
end
test "refuses to sever existing key-user mappings" do
user1 =
insert(:user, local: false)
|> with_signing_key()
user2 =
insert(:user, local: false)
|> with_signing_key()
assert_raise Ecto.ConstraintError, fn ->
User.update_and_set_cache(user2, %{signing_key: %{key_id: user1.signing_key.key_id}})
end
end
end
describe "following/followers synchronization" do
@ -2217,8 +2285,7 @@ test "updates the counters normally on following/getting a follow when disabled"
insert(:user,
local: false,
follower_address: "http://remote.org/users/masto_closed/followers",
following_address: "http://remote.org/users/masto_closed/following",
ap_enabled: true
following_address: "http://remote.org/users/masto_closed/following"
)
assert other_user.following_count == 0
@ -2239,8 +2306,7 @@ test "synchronizes the counters with the remote instance for the followed when e
insert(:user,
local: false,
follower_address: "http://remote.org/users/masto_closed/followers",
following_address: "http://remote.org/users/masto_closed/following",
ap_enabled: true
following_address: "http://remote.org/users/masto_closed/following"
)
assert other_user.following_count == 0
@ -2261,8 +2327,7 @@ test "synchronizes the counters with the remote instance for the follower when e
insert(:user,
local: false,
follower_address: "http://remote.org/users/masto_closed/followers",
following_address: "http://remote.org/users/masto_closed/following",
ap_enabled: true
following_address: "http://remote.org/users/masto_closed/following"
)
assert other_user.following_count == 0

View file

@ -16,7 +16,6 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubControllerTest do
alias Pleroma.Web.ActivityPub.ObjectView
alias Pleroma.Web.ActivityPub.Relay
alias Pleroma.Web.ActivityPub.UserView
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.Endpoint
alias Pleroma.Workers.ReceiverWorker
@ -562,7 +561,7 @@ test "it inserts an incoming activity into the database", %{conn: conn} do
|> assign(:valid_signature, true)
|> put_req_header(
"signature",
"keyId=\"http://mastodon.example.org/users/admin/main-key\""
"keyId=\"http://mastodon.example.org/users/admin#main-key\""
)
|> put_req_header("content-type", "application/activity+json")
|> post("/inbox", data)
@ -579,7 +578,6 @@ test "it inserts an incoming activity into the database" <>
user =
insert(:user,
ap_id: "https://mastodon.example.org/users/raymoo",
ap_enabled: true,
local: false,
last_refreshed_at: nil
)
@ -681,7 +679,7 @@ test "accepts Add/Remove activities", %{conn: conn} do
|> String.replace("{{nickname}}", "lain")
actor = "https://example.com/users/lain"
key_id = "#{actor}/main-key"
key_id = "#{actor}#main-key"
insert(:user,
ap_id: actor,
@ -743,7 +741,7 @@ test "accepts Add/Remove activities", %{conn: conn} do
assert "ok" ==
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{actor}/main-key\"")
|> put_req_header("signature", "keyId=\"#{actor}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/inbox", data)
|> json_response(200)
@ -766,7 +764,7 @@ test "accepts Add/Remove activities", %{conn: conn} do
assert "ok" ==
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{actor}/main-key\"")
|> put_req_header("signature", "keyId=\"#{actor}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/inbox", data)
|> json_response(200)
@ -792,7 +790,7 @@ test "mastodon pin/unpin", %{conn: conn} do
|> String.replace("{{nickname}}", "lain")
actor = "https://example.com/users/lain"
key_id = "#{actor}/main-key"
key_id = "#{actor}#main-key"
sender =
insert(:user,
@ -885,7 +883,7 @@ test "mastodon pin/unpin", %{conn: conn} do
assert "ok" ==
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{actor}/main-key\"")
|> put_req_header("signature", "keyId=\"#{actor}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/inbox", data)
|> json_response(200)
@ -917,7 +915,7 @@ test "it inserts an incoming activity into the database", %{conn: conn, data: da
conn =
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{data["actor"]}/main-key\"")
|> put_req_header("signature", "keyId=\"#{data["actor"]}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/inbox", data)
@ -941,7 +939,7 @@ test "it accepts messages with to as string instead of array", %{conn: conn, dat
conn =
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{data["actor"]}/main-key\"")
|> put_req_header("signature", "keyId=\"#{data["actor"]}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/inbox", data)
@ -963,7 +961,7 @@ test "it accepts messages with cc as string instead of array", %{conn: conn, dat
conn =
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{data["actor"]}/main-key\"")
|> put_req_header("signature", "keyId=\"#{data["actor"]}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/inbox", data)
@ -990,7 +988,7 @@ test "it accepts messages with bcc as string instead of array", %{conn: conn, da
conn =
conn
|> assign(:valid_signature, true)
|> put_req_header("signature", "keyId=\"#{data["actor"]}/main-key\"")
|> put_req_header("signature", "keyId=\"#{data["actor"]}#main-key\"")
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/inbox", data)
@ -1114,7 +1112,8 @@ test "it clears `unreachable` federation status of the sender", %{conn: conn, da
end
test "it removes all follower collections but actor's", %{conn: conn} do
[actor, recipient] = insert_pair(:user)
actor = insert(:user, local: false)
recipient = insert(:user, local: true)
actor = with_signing_key(actor)
to = [
@ -1128,7 +1127,7 @@ test "it removes all follower collections but actor's", %{conn: conn} do
data = %{
"@context" => ["https://www.w3.org/ns/activitystreams"],
"type" => "Create",
"id" => Utils.generate_activity_id(),
"id" => actor.ap_id <> "/create/12345",
"to" => to,
"cc" => cc,
"actor" => actor.ap_id,
@ -1138,7 +1137,7 @@ test "it removes all follower collections but actor's", %{conn: conn} do
"cc" => cc,
"content" => "It's a note",
"attributedTo" => actor.ap_id,
"id" => Utils.generate_object_id()
"id" => actor.ap_id <> "/note/12345"
}
}
@ -1413,7 +1412,7 @@ test "it does not return a local note activity when unauthenticated", %{conn: co
|> get("/users/#{user.nickname}/outbox?page=true")
|> json_response(200)
assert %{"orderedItems" => []} = resp
refute Map.has_key?(resp, "orderedItems")
end
test "it returns a note activity in a collection", %{conn: conn} do

View file

@ -178,7 +178,6 @@ test "it returns a user" do
{:ok, user} = ActivityPub.make_user_from_ap_id(user_id)
assert user.ap_id == user_id
assert user.nickname == "admin@mastodon.example.org"
assert user.ap_enabled
assert user.follower_address == "http://mastodon.example.org/users/admin/followers"
end

View file

@ -241,11 +241,11 @@ test "it rejects posts without links" do
assert capture_log(fn ->
{:reject, _} = AntiLinkSpamPolicy.filter(message)
end) =~ "[error] Could not decode user at fetch http://invalid.actor"
end) =~ "[error] Could not fetch user http://invalid.actor,"
assert capture_log(fn ->
{:reject, _} = AntiLinkSpamPolicy.filter(update_message)
end) =~ "[error] Could not decode user at fetch http://invalid.actor"
end) =~ "[error] Could not fetch user http://invalid.actor,"
end
test "it rejects posts with links" do
@ -259,11 +259,11 @@ test "it rejects posts with links" do
assert capture_log(fn ->
{:reject, _} = AntiLinkSpamPolicy.filter(message)
end) =~ "[error] Could not decode user at fetch http://invalid.actor"
end) =~ "[error] Could not fetch user http://invalid.actor,"
assert capture_log(fn ->
{:reject, _} = AntiLinkSpamPolicy.filter(update_message)
end) =~ "[error] Could not decode user at fetch http://invalid.actor"
end) =~ "[error] Could not fetch user http://invalid.actor,"
end
end

View file

@ -157,7 +157,7 @@ test "a misskey MFM status with a content field should work and be linked", _ do
assert content =~ "@oops_not_a_mention"
assert content =~
"<span class=\"mfm _mfm_jelly_\">mfm goes here</span> </p>aaa"
"<span class=\"mfm-jelly\">mfm goes here</span> </p>aaa"
assert content =~ "some text<br/>newline"
end

View file

@ -306,15 +306,13 @@ test "publish to url with with different ports" do
follower =
insert(:user, %{
local: false,
inbox: "https://domain.com/users/nick1/inbox",
ap_enabled: true
inbox: "https://domain.com/users/nick1/inbox"
})
another_follower =
insert(:user, %{
local: false,
inbox: "https://rejected.com/users/nick2/inbox",
ap_enabled: true
inbox: "https://rejected.com/users/nick2/inbox"
})
actor =
@ -386,8 +384,7 @@ test "publish to url with with different ports" do
follower =
insert(:user, %{
local: false,
inbox: "https://domain.com/users/nick1/inbox",
ap_enabled: true
inbox: "https://domain.com/users/nick1/inbox"
})
actor =
@ -425,8 +422,7 @@ test "publish to url with with different ports" do
follower =
insert(:user, %{
local: false,
inbox: "https://domain.com/users/nick1/inbox",
ap_enabled: true
inbox: "https://domain.com/users/nick1/inbox"
})
actor = insert(:user, follower_address: follower.ap_id)
@ -461,15 +457,13 @@ test "publish to url with with different ports" do
fetcher =
insert(:user,
local: false,
inbox: "https://domain.com/users/nick1/inbox",
ap_enabled: true
inbox: "https://domain.com/users/nick1/inbox"
)
another_fetcher =
insert(:user,
local: false,
inbox: "https://domain2.com/users/nick1/inbox",
ap_enabled: true
inbox: "https://domain2.com/users/nick1/inbox"
)
actor = insert(:user)

View file

@ -29,7 +29,7 @@ test "relay actor is invisible" do
test "returns errors when user not found" do
assert capture_log(fn ->
{:error, _} = Relay.follow("test-ap-id")
end) =~ "Could not decode user at fetch"
end) =~ "Could not fetch user test-ap-id,"
end
test "returns activity" do
@ -48,7 +48,7 @@ test "returns activity" do
test "returns errors when user not found" do
assert capture_log(fn ->
{:error, _} = Relay.unfollow("test-ap-id")
end) =~ "Could not decode user at fetch"
end) =~ "Could not fetch user test-ap-id,"
end
test "returns activity" do

View file

@ -46,7 +46,7 @@ test "it queues a fetch of instance information" do
assert_enqueued(
worker: Pleroma.Workers.NodeInfoFetcherWorker,
args: %{"op" => "process", "source_url" => "https://wowee.example.com/users/1"}
args: %{"op" => "process", "source_url" => "https://wowee.example.com/"}
)
end
end
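The expected args switch from the actor's URL to the instance root, which suggests the worker now normalises source_url before enqueueing; conceptually (hypothetical sketch, not the actual implementation):
uri = URI.parse("https://wowee.example.com/users/1")
source_url = "#{uri.scheme}://#{uri.host}/"
# => "https://wowee.example.com/"
Normalising to the origin would also let Oban's uniqueness constraints deduplicate nodeinfo fetches per instance rather than per actor.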

View file

@ -102,6 +102,7 @@ test "Add/Remove activities for remote users without featured address" do
user =
user
|> Ecto.Changeset.change(featured_address: nil)
|> Ecto.Changeset.change(last_refreshed_at: ~N[2013-03-14 11:50:00.000000])
|> Repo.update!()
%{host: host} = URI.parse(user.ap_id)

View file

@ -8,7 +8,6 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
@moduletag :mocked
alias Pleroma.Activity
alias Pleroma.Object
alias Pleroma.Tests.ObanHelpers
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
@ -53,6 +52,25 @@ test "it works for incoming unfollows with an existing follow" do
refute User.following?(User.get_cached_by_ap_id(data["actor"]), user)
end
test "it ignores Undo activities for unknown objects" do
undo_data = %{
"id" => "https://remote.com/undo",
"type" => "Undo",
"actor" => "https:://remote.com/users/unknown",
"object" => %{
"id" => "https://remote.com/undone_activity/unknown",
"type" => "Like"
}
}
assert {:error, :ignore} == Transmogrifier.handle_incoming(undo_data)
user = insert(:user, local: false, ap_id: "https://remote.com/users/known")
undo_data = %{undo_data | "actor" => user.ap_id}
assert {:error, :ignore} == Transmogrifier.handle_incoming(undo_data)
end
test "it accepts Flag activities" do
user = insert(:user)
other_user = insert(:user)
@ -146,6 +164,183 @@ test "it accepts quote posts" do
# It fetched the quoted post
assert Object.normalize("https://misskey.io/notes/8vs6wxufd0")
end
test "doesn't allow remote edits to fake local likes" do
# as a spot check that no internal fields get injected
now = DateTime.utc_now()
pub_date = DateTime.to_iso8601(Timex.subtract(now, Timex.Duration.from_minutes(3)))
edit_date = DateTime.to_iso8601(now)
local_user = insert(:user)
create_data = %{
"type" => "Create",
"id" => "http://mastodon.example.org/users/admin/statuses/2619539638/activity",
"actor" => "http://mastodon.example.org/users/admin",
"to" => ["as:Public"],
"cc" => [],
"object" => %{
"type" => "Note",
"id" => "http://mastodon.example.org/users/admin/statuses/2619539638",
"attributedTo" => "http://mastodon.example.org/users/admin",
"to" => ["as:Public"],
"cc" => [],
"published" => pub_date,
"content" => "miaow",
"likes" => [local_user.ap_id]
}
}
update_data =
create_data
|> Map.put("type", "Update")
|> Map.put("id", create_data["object"]["id"] <> "/update/1")
|> put_in(["object", "content"], "miaow :3")
|> put_in(["object", "updated"], edit_date)
|> put_in(["object", "formerRepresentations"], %{
"type" => "OrderedCollection",
"totalItems" => 1,
"orderedItems" => [create_data["object"]]
})
{:ok, %Pleroma.Activity{} = activity} = Transmogrifier.handle_incoming(create_data)
%Pleroma.Object{} = object = Object.get_by_ap_id(activity.data["object"])
assert object.data["content"] == "miaow"
assert object.data["likes"] == []
assert object.data["like_count"] == 0
{:ok, %Pleroma.Activity{} = activity} = Transmogrifier.handle_incoming(update_data)
%Pleroma.Object{} = object = Object.get_by_ap_id(activity.data["object"]["id"])
assert object.data["content"] == "miaow :3"
assert object.data["likes"] == []
assert object.data["like_count"] == 0
end
test "doesn't trip over remote likes in notes" do
now = DateTime.utc_now()
pub_date = DateTime.to_iso8601(Timex.subtract(now, Timex.Duration.from_minutes(3)))
edit_date = DateTime.to_iso8601(now)
create_data = %{
"type" => "Create",
"id" => "http://mastodon.example.org/users/admin/statuses/3409297097/activity",
"actor" => "http://mastodon.example.org/users/admin",
"to" => ["as:Public"],
"cc" => [],
"object" => %{
"type" => "Note",
"id" => "http://mastodon.example.org/users/admin/statuses/3409297097",
"attributedTo" => "http://mastodon.example.org/users/admin",
"to" => ["as:Public"],
"cc" => [],
"published" => pub_date,
"content" => "miaow",
"likes" => %{
"id" => "http://mastodon.example.org/users/admin/statuses/3409297097/likes",
"totalItems" => 0,
"type" => "Collection"
}
}
}
update_data =
create_data
|> Map.put("type", "Update")
|> Map.put("id", create_data["object"]["id"] <> "/update/1")
|> put_in(["object", "content"], "miaow :3")
|> put_in(["object", "updated"], edit_date)
|> put_in(["object", "likes", "totalItems"], 666)
|> put_in(["object", "formerRepresentations"], %{
"type" => "OrderedCollection",
"totalItems" => 1,
"orderedItems" => [create_data["object"]]
})
{:ok, %Pleroma.Activity{} = activity} = Transmogrifier.handle_incoming(create_data)
%Pleroma.Object{} = object = Object.get_by_ap_id(activity.data["object"])
assert object.data["content"] == "miaow"
assert object.data["likes"] == []
assert object.data["like_count"] == 0
{:ok, %Pleroma.Activity{} = activity} = Transmogrifier.handle_incoming(update_data)
%Pleroma.Object{} = object = Object.get_by_ap_id(activity.data["object"]["id"])
assert object.data["content"] == "miaow :3"
assert object.data["likes"] == []
# in the future this should retain remote likes, but for now:
assert object.data["like_count"] == 0
end
test "doesn't trip over remote likes in polls" do
now = DateTime.utc_now()
pub_date = DateTime.to_iso8601(Timex.subtract(now, Timex.Duration.from_minutes(3)))
edit_date = DateTime.to_iso8601(now)
create_data = %{
"type" => "Create",
"id" => "http://mastodon.example.org/users/admin/statuses/2471790073/activity",
"actor" => "http://mastodon.example.org/users/admin",
"to" => ["as:Public"],
"cc" => [],
"object" => %{
"type" => "Question",
"id" => "http://mastodon.example.org/users/admin/statuses/2471790073",
"attributedTo" => "http://mastodon.example.org/users/admin",
"to" => ["as:Public"],
"cc" => [],
"published" => pub_date,
"content" => "vote!",
"anyOf" => [
%{
"type" => "Note",
"name" => "a",
"replies" => %{
"type" => "Collection",
"totalItems" => 3
}
},
%{
"type" => "Note",
"name" => "b",
"replies" => %{
"type" => "Collection",
"totalItems" => 1
}
}
],
"likes" => %{
"id" => "http://mastodon.example.org/users/admin/statuses/2471790073/likes",
"totalItems" => 0,
"type" => "Collection"
}
}
}
update_data =
create_data
|> Map.put("type", "Update")
|> Map.put("id", create_data["object"]["id"] <> "/update/1")
|> put_in(["object", "content"], "vote now!")
|> put_in(["object", "updated"], edit_date)
|> put_in(["object", "likes", "totalItems"], 666)
|> put_in(["object", "formerRepresentations"], %{
"type" => "OrderedCollection",
"totalItems" => 1,
"orderedItems" => [create_data["object"]]
})
{:ok, %Pleroma.Activity{} = activity} = Transmogrifier.handle_incoming(create_data)
%Pleroma.Object{} = object = Object.get_by_ap_id(activity.data["object"])
assert object.data["content"] == "vote!"
assert object.data["likes"] == []
assert object.data["like_count"] == 0
{:ok, %Pleroma.Activity{} = activity} = Transmogrifier.handle_incoming(update_data)
%Pleroma.Object{} = object = Object.get_by_ap_id(activity.data["object"]["id"])
assert object.data["content"] == "vote now!"
assert object.data["likes"] == []
# in the future this should retain remote likes, but for now:
assert object.data["like_count"] == 0
end
end
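# A sketch of the invariant the three tests above pin down (illustrative
# only; the real logic lives in the object validators, and the helper name
# here is hypothetical): any remote-supplied "likes" state, whether a plain
# list or a Collection, is replaced by empty local counters on ingestion.
defp strip_remote_like_state(object) when is_map(object) do
  object
  |> Map.put("likes", [])
  |> Map.put("like_count", 0)
end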
describe "prepare outgoing" do
@@ -348,69 +543,6 @@ test "Updates of Notes are handled" do
end
end
describe "user upgrade" do
test "it upgrades a user to activitypub" do
user =
insert(:user, %{
nickname: "rye@niu.moe",
local: false,
ap_id: "https://niu.moe/users/rye",
follower_address: User.ap_followers(%User{nickname: "rye@niu.moe"})
})
user_two = insert(:user)
Pleroma.FollowingRelationship.follow(user_two, user, :follow_accept)
{:ok, activity} = CommonAPI.post(user, %{status: "test"})
{:ok, unrelated_activity} = CommonAPI.post(user_two, %{status: "test"})
assert "http://localhost:4001/users/rye@niu.moe/followers" in activity.recipients
user = User.get_cached_by_id(user.id)
assert user.note_count == 1
{:ok, user} = Transmogrifier.upgrade_user_from_ap_id("https://niu.moe/users/rye")
ObanHelpers.perform_all()
assert user.ap_enabled
assert user.note_count == 1
assert user.follower_address == "https://niu.moe/users/rye/followers"
assert user.following_address == "https://niu.moe/users/rye/following"
user = User.get_cached_by_id(user.id)
assert user.note_count == 1
activity = Activity.get_by_id(activity.id)
assert user.follower_address in activity.recipients
assert %{
"url" => [
%{
"href" =>
"https://cdn.niu.moe/accounts/avatars/000/033/323/original/fd7f8ae0b3ffedc9.jpeg"
}
]
} = user.avatar
assert %{
"url" => [
%{
"href" =>
"https://cdn.niu.moe/accounts/headers/000/033/323/original/850b3448fa5fd477.png"
}
]
} = user.banner
refute "..." in activity.recipients
unrelated_activity = Activity.get_by_id(unrelated_activity.id)
refute user.follower_address in unrelated_activity.recipients
user_two = User.get_cached_by_id(user_two.id)
assert User.following?(user_two, user)
refute "..." in User.following(user_two)
end
end
describe "actor rewriting" do
test "it fixes the actor URL property to be a proper URI" do
data = %{

View file

@@ -1129,7 +1129,7 @@ test "removes a pending follow for a local user" do
test "cancels a pending follow for a remote user" do
follower = insert(:user)
followed = insert(:user, is_locked: true, local: false, ap_enabled: true)
followed = insert(:user, is_locked: true, local: false)
assert {:ok, follower, followed, %{id: _activity_id, data: %{"state" => "pending"}}} =
CommonAPI.follow(follower, followed)

View file

@@ -0,0 +1,66 @@
# Akkoma: Magically expressive social media
# Copyright © 2025 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ContentTypeSanitisationTest do
use Pleroma.Web.ConnCase, async: true
alias Pleroma.Web.ContentTypeSanitisationTemplate, as: Template
require Template
defp create_file(path, body) do
File.write!(path, body)
on_exit(fn -> File.rm(path) end)
end
defp upload_dir(),
do: Path.join(Pleroma.Uploaders.Local.upload_path(), "test_StaticPlugSanitisationTest")
defp create_upload(subpath, body) do
Path.join(upload_dir(), subpath)
|> create_file(body)
"/media/test_StaticPlugSanitisationTest/#{subpath}"
end
defp emoji_dir(),
do:
Path.join(
Pleroma.Config.get!([:instance, :static_dir]),
"emoji/test_StaticPlugSanitisationTest"
)
defp create_emoji(subpath, body) do
Path.join(emoji_dir(), subpath)
|> create_file(body)
"/emoji/test_StaticPlugSanitisationTest/#{subpath}"
end
setup_all do
File.mkdir_p(upload_dir())
File.mkdir_p(emoji_dir())
on_exit(fn ->
File.rm_rf!(upload_dir())
File.rm_rf!(emoji_dir())
end)
end
describe "sanitises Content-Type of local uploads" do
Template.do_all_common_tests(&create_upload/2)
test "while preserving audio types" do
Template.do_audio_test(&create_upload/2, false)
end
end
describe "sanitises Content-Type of emoji" do
Template.do_all_common_tests(&create_emoji/2)
test "if audio type" do
Template.do_audio_test(&create_emoji/2, true)
end
end
end
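# The shared ContentTypeSanitisationTemplate is not shown in this diff, so
# the following is a hypothetical sketch of the kind of test it presumably
# generates; the payload path and the neutralised fallback type are assumptions.
test "does not serve an .html upload as text/html" do
  path = create_upload("payload.html", "<p>hi</p>")
  conn = get(build_conn(), path)
  [content_type] = Plug.Conn.get_resp_header(conn, "content-type")
  refute content_type =~ "text/html"
end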

View file

@@ -79,16 +79,14 @@ test "it federates only to reachable instances via AP" do
local: false,
nickname: "nick1@domain.com",
ap_id: "https://domain.com/users/nick1",
inbox: inbox1,
ap_enabled: true
inbox: inbox1
})
insert(:user, %{
local: false,
nickname: "nick2@domain2.com",
ap_id: "https://domain2.com/users/nick2",
inbox: inbox2,
ap_enabled: true
inbox: inbox2
})
dt = NaiveDateTime.utc_now()
@@ -134,7 +132,7 @@ test "successfully processes incoming AP docs with correct origin" do
assert {:ok, _activity} = ObanHelpers.perform(job)
assert {:ok, job} = Federator.incoming_ap_doc(params)
assert {:error, :already_present} = ObanHelpers.perform(job)
assert {:discard, :already_present} = ObanHelpers.perform(job)
end
test "successfully normalises public scope descriptors" do

View file

@@ -61,7 +61,9 @@ test "get instance stats", %{conn: conn} do
{:ok, _user2} = User.set_activation(user2, false)
insert(:user, %{local: false, nickname: "u@peer1.com"})
insert(:instance, %{domain: "peer1.com"})
insert(:user, %{local: false, nickname: "u@peer2.com"})
insert(:instance, %{domain: "peer2.com"})
{:ok, _} = Pleroma.Web.CommonAPI.post(user, %{status: "cofe"})
@@ -81,7 +83,9 @@ test "get instance stats", %{conn: conn} do
test "get peers", %{conn: conn} do
insert(:user, %{local: false, nickname: "u@peer1.com"})
insert(:instance, %{domain: "peer1.com"})
insert(:user, %{local: false, nickname: "u@peer2.com"})
insert(:instance, %{domain: "peer2.com"})
Pleroma.Stats.force_update()

View file

@@ -109,25 +109,40 @@ test "does a HEAD request to check if the body is html" do
test "refuses to crawl incomplete URLs" do
url = "example.com/ogp"
assert :error == Parser.parse(url)
assert {:error, {:url, "scheme mismatch"}} == Parser.parse(url)
end
test "refuses to crawl plain HTTP and other scheme URL" do
[
"http://example.com/ogp",
"ftp://example.org/dist/"
]
|> Enum.each(fn url ->
res = Parser.parse(url)
assert {:error, {:url, "scheme mismatch"}} == res or
{:error, {:url, "not a URL"}} == res
end)
end
test "refuses to crawl malformed URLs" do
url = "example.com[]/ogp"
assert :error == Parser.parse(url)
assert {:error, {:url, "not a URL"}} == Parser.parse(url)
end
test "refuses to crawl URLs of private network from posts" do
[
"http://127.0.0.1:4000/notice/9kCP7VNyPJXFOXDrgO",
"https://127.0.0.1:4000/notice/9kCP7VNyPJXFOXDrgO",
"https://10.111.10.1/notice/9kCP7V",
"https://172.16.32.40/notice/9kCP7V",
"https://192.168.10.40/notice/9kCP7V",
"https://pleroma.local/notice/9kCP7V"
"https://192.168.10.40/notice/9kCP7V"
]
|> Enum.each(fn url ->
assert :error == Parser.parse(url)
assert {:error, {:url, :ip}} == Parser.parse(url)
end)
url = "https://pleroma.local/notice/9kCP7V"
assert {:error, {:url, :ignore_tld}} == Parser.parse(url)
end
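# For reference, a hedged sketch of branching on the structured errors
# asserted above (the helper itself is hypothetical; the tuple shapes come
# straight from the tests in this diff):
defp describe_parse_error({:url, "scheme mismatch"}), do: "only allowed schemes are crawled"
defp describe_parse_error({:url, "not a URL"}), do: "malformed URL"
defp describe_parse_error({:url, :ip}), do: "target is a private or literal IP"
defp describe_parse_error({:url, :ignore_tld}), do: "local/ignored TLD is never crawled"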
test "returns error when disabled" do

View file

@@ -0,0 +1,38 @@
defmodule Pleroma.Web.RouterTest do
use Pleroma.DataCase
use Mneme
test "route prefix stability" do
auto_assert(
[
"api",
"main",
"ostatus_subscribe",
"oauth",
"akkoma",
"objects",
"activities",
"notice",
"@:nickname",
":nickname",
"users",
"tags",
"mailer",
"inbox",
"relay",
"internal",
".well-known",
"nodeinfo",
"manifest.json",
"web",
"auth",
"embed",
"proxy",
"phoenix",
"test",
"user_exists",
"check_password"
] <- Pleroma.Web.Router.get_api_routes()
)
end
end

View file

@@ -132,7 +132,7 @@ test "show follow page with error when user can not be fetched by `acct` link",
|> html_response(200)
assert response =~ "Error fetching user"
end) =~ ":not_found"
end) =~ "User doesn't exist (anymore): https://mastodon.social/users/not_found"
end
end

View file

@@ -7,13 +7,16 @@ defmodule Pleroma.Workers.ReceiverWorkerTest do
use Oban.Testing, repo: Pleroma.Repo
@moduletag :mocked
import ExUnit.CaptureLog
import Mock
import Pleroma.Factory
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Workers.ReceiverWorker
test "it ignores MRF reject" do
params = insert(:note).data
user = insert(:user, local: false)
params = insert(:note, user: user, data: %{"id" => user.ap_id <> "/note/1"}).data
with_mock Pleroma.Web.ActivityPub.Transmogrifier,
handle_incoming: fn _ -> {:reject, "MRF"} end do
@@ -23,4 +26,36 @@ test "it ignores MRF reject" do
})
end
end
test "it errors on receiving local documents" do
actor = insert(:user, local: true)
recipient = insert(:user, local: true)
to = [recipient.ap_id]
cc = []
params = %{
"@context" => ["https://www.w3.org/ns/activitystreams"],
"type" => "Create",
"id" => Utils.generate_activity_id(),
"to" => to,
"cc" => cc,
"actor" => actor.ap_id,
"object" => %{
"type" => "Note",
"to" => to,
"cc" => cc,
"content" => "It's a note",
"attributedTo" => actor.ap_id,
"id" => Utils.generate_object_id()
}
}
assert capture_log(fn ->
assert {:discard, :origin_containment_failed} ==
ReceiverWorker.perform(%Oban.Job{
args: %{"op" => "incoming_ap_doc", "params" => params}
})
end) =~ "[alert] Received incoming AP doc with valid signature for local actor"
end
end
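# A sketch of the origin-containment rule the new test exercises
# (illustrative only; the actual check lives in the worker pipeline, and
# this helper is hypothetical): documents whose actor is a local user must
# never be accepted via the remote inbox path.
defp check_origin(%{"actor" => actor_ap_id}) do
  case Pleroma.User.get_cached_by_ap_id(actor_ap_id) do
    %Pleroma.User{local: true} -> {:discard, :origin_containment_failed}
    _ -> :ok
  end
end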

View file

@@ -62,8 +62,7 @@ def user_factory(attrs \\ %{}) do
last_digest_emailed_at: NaiveDateTime.utc_now(),
last_refreshed_at: NaiveDateTime.utc_now(),
notification_settings: %Pleroma.User.NotificationSetting{},
multi_factor_authentication_settings: %Pleroma.MFA.Settings{},
ap_enabled: true
multi_factor_authentication_settings: %Pleroma.MFA.Settings{}
}
urls =

Some files were not shown because too many files have changed in this diff