Compare commits

...

134 commits

Author SHA1 Message Date
a913c11230 bump version
All checks were successful
ci/woodpecker/push/docs Pipeline was successful
ci/woodpecker/push/publish/4 Pipeline was successful
ci/woodpecker/push/publish/1 Pipeline was successful
ci/woodpecker/push/publish/2 Pipeline was successful
2026-03-14 13:20:10 +00:00
Oneric
9d1e169472 webfinger/finger: allow WebFinger endpoint delegation with FEP-2c59
The ban on redirects was based on a misreading of FEP-2c59’s
requirements. It is only meant to forbid addresses other than
the canonical ActivityPub ID being advertised as such in the
returned WebFinger data.
This does not meaningfully lessen security and verification still
remains stricter than without FEP-2c59.

Notably this allows Mastodon with its backwards WebFinger redirect
(redirecting from the canonical WebFinger domain to the AP domain)
to adopt FEP-2c59 without causing issues or extra effort for existing
deployments which already adopted the Mastodon-recommended setup.
2026-03-13 00:00:00 +00:00
Oneric
4f03a3b709 federation.md: document our WebFinger behaviour 2026-03-13 00:00:00 +00:00
Oneric
e8d6d054d0 test/webfinger/finger: fix tests
One asserted the response format which finger_actor mistakenly returned
on a finger_mention call in a previous iteration of the implementation.
The other didn’t actually test anything WebFinger-related, but fundamental
id containment and verification for generic AP fetches. Now it does.
2026-03-12 00:00:00 +00:00
Oneric
ab2210f02d test/webfinger/finger: add more validation tests 2026-03-12 00:00:00 +00:00
Oneric
5256785b9a test/webfinger/finger: mock url in all new tests
It is used in security checks and only due to an abundance of
existing tests lacking it, _only_ the test env is allowed to
fall back to the query URL for these tests as a temporary
(well, ... it’s been a while now) measure.
We really shouldn’t be adding more deficient tests like that.
2026-03-12 00:00:00 +00:00
Oneric
4460c9c26d test/webfinger/finger: improve new test names and comments 2026-03-12 00:00:00 +00:00
f87a2f52e1 add extra happy and unhappy path tests for webfingers 2026-03-12 00:00:00 +00:00
627ac3645a add some more webfinger tests 2026-03-12 00:00:00 +00:00
2020a2395a add baseline webfinger FEP-2c59 tests 2026-03-12 00:00:00 +00:00
Oneric
fd734c5a7b webfinger/finger: normalise mention resources to more common format 2026-03-12 00:00:00 +00:00
Oneric
d86c290c25 user: drop unused function variants and parameters 2026-03-12 00:00:00 +00:00
Oneric
838d0b0a74 ap/views: use consistent structure for root collections
Notably user follow* collections faked a zero totalItems count
rather than simply omitting the field, and included a link to a first
page even when knowing this page cannot be fetched, while most others
omitted it. Omitting it will spare us receiving requests doomed to
failure for this page and matches the format used by GtS and Mastodon.

Such requests occur e.g. when other *oma servers try to determine
whether follow* relationships should be publicly shown. Other
implementations like Mastodon¹ simply treat the presence of a (link to)
a first page already as an indicator for a public collection. By
omitting the link, Mastodon servers will now too honour our users’
request to hide follow* details.

The "featured" user collections are typically quite small and thus
the sole occurrence of the alternative form where all items are directly
inlined into the root without an intermediate page. Thus it is not
converted, but also no helper for this format is created.

1: eb848d082a/app/services/activitypub/process_account_service.rb (L303)
2026-03-12 00:00:00 +00:00
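The collection-shape change described in the commit above can be sketched roughly as follows. This is a purely illustrative Python sketch (the project itself is Elixir, and the function and field names here are hypothetical, not Akkoma's actual code); the ActivityStreams field names `totalItems` and `first` are from the spec:

```python
def render_root_collection(collection_id, total, hide_details):
    """Build the root of an ActivityPub OrderedCollection.

    When follow* details are hidden, omit both totalItems and the
    link to a first page instead of faking a zero count; consumers
    like Mastodon treat a present `first` link as an indicator of
    a public collection.
    """
    doc = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": collection_id,
        "type": "OrderedCollection",
    }
    if not hide_details:
        doc["totalItems"] = total
        doc["first"] = collection_id + "?page=1"
    return doc
```

With details hidden, neither field appears, so remote servers have no first page to request in vain.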
Oneric
ddcc1626f8 ap/user_view: optimise total count of follow* collections
The count is precomputed as a user property.
Masto API already relies on this cached value.
This lets us skip actually querying follow* details unless
follow(ing/ed) users are publicly exposed and thus will be served.

In fact this could now be optimised further by using keyset pagination
to only fetch what’s actually needed for the current page. This would
also completely obsolete the need for the _offset collection page
helpers. However, for this pagination to be efficient it needs to happen
on the follow relation table, not users. This is left to a future commit.

Due to an ambiguity with PhoenixHtmlHelpers the Ecto.Query
select import was unusable without extra qualification,
therefore it is converted to a require expression.
2026-03-12 00:00:00 +00:00
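The keyset pagination idea the commit above defers to a future change can be sketched in a few lines. Illustrative Python only (the project is Elixir, and the function name is hypothetical); the point is that filtering by "id past the cursor" replaces skipping over an OFFSET:

```python
def keyset_page(rows, after_id=None, limit=20):
    """Keyset (cursor) pagination over rows sorted by ascending id.

    Unlike OFFSET pagination, only rows past the cursor are touched,
    which an index on the ordering key can satisfy directly.
    """
    if after_id is not None:
        rows = [r for r in rows if r["id"] > after_id]
    return rows[:limit]
```

In the database this becomes a `WHERE id > cursor ORDER BY id LIMIT n` query against the follow relation table, fetching only what the current page needs.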
Oneric
fbf02025e0 user/query: drop unused legacy parameter
There has been no deactivated db column since 860b5c7804,
and users of this legacy kludge introduced in 10ff01acd9
all migrated to an equivalent newer parameter.
2026-03-12 00:00:00 +00:00
Oneric
a899663ddc Fix malformed is_active qualifiers in User.Query usages
While "is_active" is a property of users, it is not a recognised keyword
for the User.Query helper; instead "deactivated" with negated value must
be used.
This arose because originally the user property was also called
"deactivated" until it was renamed and negated five years ago
in 860b5c7804. This included renaming
the parameter in most but not all User.Query usages.
Notably the parameter in User.get_followers_query was renamed
but not in User.get_friends_query (friends == following).
The accepted query parameter in User.Query however was not changed.
This led to the former mistakenly including deleted users, causing
too large values to be reported in the ActivityPub follower, but not
following collection as reported in #1078.

In Masto API responses filtering by `User.is_visible` weeded out
the extra accounts before they got displayed to API users.

On the surface it might seem logical to align the name of the User.Query
parameter with the actual property name. However, User.Query already
accepts an "active" parameter which is an alias for limiting to accounts
which are neither deleted nor deactivated by moderators (both indicated
by is_active) and also are not newly created account requests still
pending an admin approval (is_approved) or necessary email confirmation
(is_confirmed); in short as the alias suggests whether the account is
active. Two highly similar parameter names like this would be much too
confusing.

The renamed "is_active" on the other hand does not suffice to
say whether an account is actually active, only whether it has (not yet)
ceased to be active (by its own volition or moderator action).
Meaning its "new" name is actively misleading. Arguably the rename
made things worse for no reason whatsoever and should not ever have
happened.

For now, we’ll just revert the incorrect query helper parameter renames.

Fixes: https://akkoma.dev/AkkomaGang/akkoma/issues/1078
2026-03-12 00:00:00 +00:00
Oneric
d2fda68afd user/fetch: properly flag remote, hidden follow* counts
Instead of treating them like a public zero count.
2026-03-12 00:00:00 +00:00
Oneric
9fb6993e1b cosmetic/user/fetch: reorder functions
Such that helper functions are near their sole caller
instead of being interspersed with other public functions.
2026-03-12 00:00:00 +00:00
Oneric
4bae78d419 user/fetcher: assume collection to be private on fetch errors
With the follow info update now actually running after being fixed,
a bunch of errors about :not_found and :forbidden collection fetches
started to be logged.
2026-03-12 00:00:00 +00:00
Oneric
f3821628e3 test: fix module name of GettextCompanion tests
Oversight in 54fd8379ad
2026-03-12 00:00:00 +00:00
Oneric
b4c6a90fe8 webfinger/finger: allow leading @ in handles
At least for FEP-2c59 this shouldn’t be the case
but in theory some WebFinger implementations may
serve their subject with an extra @
2026-03-12 00:00:00 +00:00
Oneric
a37d60d741 changelog: add missing changes 2026-03-12 00:00:00 +00:00
Oneric
71757f6562 user/fetcher: utilise existing verified nick on user refetch
Unless the force-revalidation config is enabled (currently the default).
Also avoids an unnecessarily duplicated db query for the old user.
2026-03-12 00:00:00 +00:00
Oneric
892628d16d user/fetcher: always detect nickname changes on Update activities
Even when the "always force revalidation" option is not enabled
while avoiding unnecessary revalidations if nothing changed.

With this heuristic we should be able to change the default to "false"
soon, but for now keep it enabled to help amend recent bugs.
2026-03-12 00:00:00 +00:00
Oneric
2a1b6e2873 user/fetcher: drop nonsense type-based follow update skip
The real intent behind the commit introducing this seems to have been
avoiding running this when the actor does not expose follow collection ids:
ec24c70db8.
This is already taken care of with the :collections_available check.
Some implementations use other actor types like Group etc. for visible,
followable actors, making skipping undesirable.

Notably though, this actually has _always_ skipped counter updates
as even when this check was introduced, the user changeset data and
struct used the :actor_type key not :type.

In some situations fetch_follow_information_for_user is called directly
from other modules thus occasionally counters still got updated
for accounts with closer federation relationships masking the issue.
2026-03-12 00:00:00 +00:00
Oneric
756cfb6798 user/fetcher: fix follow count update
Users have not had an info substruct for over 6 years,
since e8843974cb.
Instead the counters are now directly part of the user struct itself.
2026-03-12 00:00:00 +00:00
Oneric
698ee181b4 webfinger/finger: fix error return format for invalid XML
By default just a plain :error atom is returned,
differing from the return spec of the JSON version
2026-03-12 00:00:00 +00:00
Oneric
c80aec05de webfinger: rewrite finger query and validation from and to actors
Resolves all security issues discussed in 5a120417fd86bbd8d1dd1ab720b24ba02c879f09
and thus reactivates skipped tests.
Since the necessary logic significantly differs between WebFinger handle discovery/validation
and fetching of actors from just the WebFinger handle, the relevant public function was split,
necessitating also a partial rewrite of the user fetch logic.

This works with all of the following:
  - ActivityPub domain is identical to WebFinger handle domain
  - AP domain set up host-meta LRDD link to WebFinger domain
  - AP domain set up a HTTP redirect on /.well-known/webfinger
    to the WebFinger domain
  - Mastodon style: WebFinger domain set up a HTTP redirect
    on its well-known path to AP backend (only for discovery
    from nickname though until Mastodon supports FEP-2c59)

This intentionally does not work for setups where FEP-2c59 is not
supported and the initially queried domain simply directly responds with
data containing a nickname from another domain’s authority without any
redirects. (When enriching an AP actor, this includes the setup currently
recommended by Mastodon. Once Mastodon supports FEP-2c59 though, its
setup will automatically start to work again too.)
While technically possible to cross-verify the data with the nickname
domain, the existing validation logic is already complex enough and
such cross-validation needs extra safety measures to not get trapped
in infinite loops. Such setups are considered broken.
2026-03-12 00:00:00 +00:00
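The supported setups listed above all reduce to one invariant: however the WebFinger document was reached, it must advertise the actor's canonical ActivityPub id. A rough sketch of that single check (illustrative Python; the real Elixir validation in the commit is considerably more involved). The JRD `links`/`rel="self"` structure is from the WebFinger spec; the function name is hypothetical:

```python
def webfinger_confirms_actor(webfinger_doc, ap_id):
    """True iff the WebFinger response advertises ap_id as the
    canonical ActivityPub actor via a rel="self" link.

    Redirects while *fetching* the document are acceptable; only
    the advertised canonical id must match the actor in question.
    """
    for link in webfinger_doc.get("links", []):
        if link.get("rel") == "self" and link.get("href") == ap_id:
            return True
    return False
```

A document advertising a different actor id, however it was obtained, fails the check, which is what blocks the rogue-assignment attacks described in the earlier commits.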
Oneric
69622b3321 Drop obsolete kludge for a specific, dead instance
It doesn’t make sense in general (many implementations use ids not nicks in ap_id)
and just wastes time by making additional, unnecessary, failing network requests.
This arguably should have never been committed.
2026-03-12 00:00:00 +00:00
Oneric
1e6332039f user/fetcher: also validate user data from Update
And fixup sloppy test data
2026-03-12 00:00:00 +00:00
Oneric
eb15e04d24 Split user fetching out of general ActivityPub module
The ActivityPub module is already overloaded and way too large.
Logic for fetching users and user information is isolated from
all other parts of the ActivityPub module, so let’s split it out.
2026-03-12 00:00:00 +00:00
Oneric
25461b75f7 webfinger: split remote queries and local data generation
They do not share any logic and the lack of separation makes it easy
to end up in the wrong section with ensuing confusion.
2026-03-12 00:00:00 +00:00
Oneric
4a35cbd23d fed/out: expose webfinger property in local actors (FEP-2c59)
It makes discovery and validation of the desired webfinger address
much easier. Future commits will actually use it for validation
and nick change discovery.
2026-03-12 00:00:00 +00:00
Oneric
1cdc247c63 Temporarily disable customised webfinger hosts
Proper validation of nicknames must consider whether both the domain
the nickname is associated with _and_ the actor to be assigned this
nickname consent to this.
Prior attempts at securing this were made in
a953b1d927 and
0d66237205 but they do not suffice.

The existing code attempted to validate webfinger responses independent
of the actual ActivityPub actor data and only ever consider the former
(yet failed to properly validate even this).

When resolving a user via a user-provided nickname the assignment
done by the provided URL was simply trusted regardless of the actor’s
AP host or data. When then resolving the AP id, the nickname from this
original WebFinger query was passed along as pre-trusted data overriding
any discovery or validation from the actual actor’s side.
This allowed malicious actors to serve rogue WebFinger data associating
arbitrary actors with any nicknames they wished. Prompting servers to
resolve this rogue nickname then creates this nonconsensual assignment.

Notably, the existing code’s attempt at verification (only for domain
consent) used the originally requested URL for comparison against the
domain in the nickname handle. This effectively disabled custom
WebFinger domains for honest servers unless using a host-meta LRDD link.
(While LRDD links were recommended in the past by both *oma and Mastodon,
 today most implementations other than *oma instead
 recommend setups employing HTTP redirects.)
Still, this strictness did not prevent spoofing by malicious servers.
It did however mean that rogue nickname assignments from an initial
nickname-based discovery were at least undone on the next user refresh
provided :pleroma, Pleroma.Web.WebFinger, :update_nickname_on_user_fetch
was not changed from its default truthy value.
(A renewed fetch via the rogue nickname would re-establish it though)

When enriching an already resolved ActivityPub actor to discover its
nickname the WebFinger query was not done with the unique AP id as a
resource, but a guessed nickname handle.
Furthermore, the received WebFinger response was not validated
to ensure the ActivityPub ID the WebFinger server pointed to
for the final nickname matched the actual ID in the considered
AP actor data.
While the faulty request URI check described above provides some
friction for malicious actors, it is still possible for mischievous
AP instances to set up a rogue LRDD link pointing to a third-party
domain’s WebFinger and use the freedom provided by the LRDD link
to overwrite the resource value we provide in the lookup, thus usurping
existing nicknames in another domain’s authority.

Proposed tweaks to the existing, faulty checks to work with
HTTP-redirect-based custom WebFinger domains would have made it
even easier to usurp nicknames from foreign domains.

For now simply disable custom WebFinger domains as a quick hotfix.
Subsequent commits will partially de-spaghettify the relevant code
and completely overhaul webfinger and nickname handling and validation.
2026-03-12 00:00:00 +00:00
Oneric
4c63243692 emoji/pack: fix in-place changes
Deleting a whole pack at once didn’t remove its emoji from memory
and newly updated or added emoji used a wrong path. While the pack
also contains a path, this is the filesystem path, while in-memory
emoji need the URL path constructed from a constant prefix,
pack name and just the filename component of the filesystem file.
Tests used to not check for this at all.

Fixes oversight in 318ee6ee17
2026-03-12 00:00:00 +00:00
Oneric
7b9a0e6d71 twitter_api/remote_follow: allow leading @ in nicknames
And never attempt to fetch nicknames as URLs
2026-03-02 00:00:00 +00:00
Oneric
5873e40484 grafana: update reference dashboard
2026-02-19 00:00:00 +00:00
Oneric
f8abae1f58 docs/admin/monitoring: document reference dashboard requires VM
As reported on IRC.
What exactly Prometheus takes offense at isn’t clear yet.
2026-02-19 00:00:00 +00:00
Oneric
4912e1782d docs/admin/monitoring: add instructions to setup outlier statistics 2026-02-19 00:00:00 +00:00
Oneric
6ed678dfa6 mix/uploads: fix rewrite_media_domain for user images
Fixes: #1064
2026-02-19 00:00:00 +00:00
7c0deab8c5 Merge pull request 'Fetcher: Only check SimplePolicy rules when policy is enabled' (#1044) from mkljczk/akkoma:fetcher-simple-policy into develop
Reviewed-on: #1044
Reviewed-by: Oneric <oneric@noreply.akkoma>
2026-02-18 13:37:27 +00:00
00fcffe5b9 fix test
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-02-17 14:32:59 +01:00
246e864ce4 Merge pull request 'Mastodon-flavour (read) quotes API compat' (#1059) from Oneric/akkoma:masto-quotes-api into develop
Reviewed-on: #1059
2026-02-07 22:39:47 +00:00
c4bcfb70df Merge pull request 'Use local elixir-captcha clone' (#1060) from use-local-captcha-clone into develop
Reviewed-on: #1060
Reviewed-by: Oneric <oneric@noreply.akkoma>
2026-02-07 20:11:07 +00:00
cf8010a33e Merge pull request 'ensure utf-8 nicknames on nickname GETs and user validator' (#1057) from user-utf8 into develop
Reviewed-on: #1057
Reviewed-by: Oneric <oneric@noreply.akkoma>
2026-02-07 19:41:26 +00:00
4c657591a7 use version with git history
2026-02-07 19:40:09 +00:00
6ae0635da7 mix format
2026-02-07 19:28:13 +00:00
11dbfe75b9 pleroma git OBLITERATED
2026-02-07 19:16:32 +00:00
58ee25bfbb correct typings, duplicated check
2026-02-07 19:09:02 +00:00
Oneric
fd87664b9e api/statuses: allow quoting local statuses locally
2026-02-07 00:00:00 +00:00
Oneric
731863af9c api/statuses: allow quoting own private posts
Provided the quote is private too.

Ideally we’d inline the quoted, private status since not all
remotes may already know the old post and some implementations
(ourselves included) have trouble fetching private posts.
In practice, at least, we cannot yet make use of such an inlined post
anyway, defeating the point. Implementing the inlining and the ability
to make use of the inlined copy is thus deferred to a future patch.

Resolves: #952
2026-02-07 00:00:00 +00:00
Oneric
5b72099802 api/statuses: provide polyglot masto-and-*oma quote object
However, we cannot provide Masto-style shallow quotes this way.

Inspired-by: https://issues.iceshrimp.dev/issue/ISH-871#comment-019c24ed-c841-7de2-9c69-85e2951135ca
Resolves: #1009
2026-02-07 00:00:00 +00:00
Oneric
c67848d473 api/statuses: accept and prefer masto-flavour quoted_status_id
The quote creation interface still isn’t exactly drop-in compatible for
masto-only clients since we do not provide or otherwise deal with
quote-authorization objects, which clients are encouraged to check before
even offering the possibility of attempting a quote. Still, having a
consistent parameter name will be easier on clients.

Also dropped unused quote_id parameter from ActivityDraft struct
2026-02-07 00:00:00 +00:00
Oneric
a454af32f5 view/nodeinfo: use string keys
This makes embedded nodeinfo data
consistent between local and remote users
2026-02-07 00:00:00 +00:00
Oneric
e557bbcd9d api/masto/account: filter embedded nodeinfo
The only known user is akkoma-fe and it only ever
accesses the software information. For *oma instances
the full, unfiltered nodeinfo data can be quite large,
adding unneeded bloat to API responses.
This would have become worse with the duplication of
account data needed for Masto quote post interop.

In case a client we’re not aware of actually uses more fields from
nodeinfo, a new but temporary config setting is provided as a workaround.

Fixes: #827
2026-02-07 00:00:00 +00:00
b20576da2e Merge pull request 'http: allow compressed responses, use system CA certs instead of CAStore fallback' (#1058) from Oneric/akkoma:http-lib-updates into develop
Reviewed-on: #1058
2026-01-30 20:14:53 +00:00
dee0e01af9 object/fetcher: only check SimplePolicy rules when policy is enabled
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-30 00:00:00 +00:00
Oneric
e488cc0a42 http/adapter_helper: explicitly enable IPv4
Mint was upgraded in b1397e1670
2026-01-27 00:00:00 +00:00
Oneric
be21f914f4 mix: bump finch and use system cacerts
This upgrade pulls in a fix to better avoid killing re-activated pools,
obsoletes the need for our own HTTP2 server push workaround and allows
us to use system CA certs without breaking plain HTTP connections.

We tried to do the latter before on a per-request basis, but this didn’t
actually do anything and we relied on the CAStore package
fallback the entire time. The broken attempt was removed in
ed5d609ba4.

Resolves: #880
2026-01-27 00:00:00 +00:00
Oneric
b9eeebdfd7 http: accept compressed responses
Resolves: #755
2026-01-27 00:00:00 +00:00
Oneric
c79e8fb086 mix: update Tesla to >= 1.16.0
This is the first release containing fixes making DecompressResponse
stable enough and suitable to be used by default, allowing us to
profit from transport compression in obtained responses.

(Note: no compression is used in bodies we send out, i.e.
 ActivityPub documents federated to remote inboxes, since
 this will likely break signatures depending on whether
 the checksum is generated and checked before or after compression)

Ref: 5bc9b82823
Ref: 288699e8f5
2026-01-27 00:00:00 +00:00
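The parenthetical note above, about compression likely breaking signatures, can be demonstrated concretely: a digest computed over the identity body differs from one computed over the compressed bytes, so the verification outcome hinges on which representation the receiver hashes. A small illustrative Python sketch (not Akkoma code):

```python
import gzip
import hashlib

body = b'{"type": "Note", "content": "hi"}'

# HTTP signatures cover a digest computed over the identity
# (uncompressed) body; compressing after signing changes the bytes
# on the wire, so whether verification succeeds depends on whether
# the receiver checks the digest before or after decompression.
digest_plain = hashlib.sha256(body).digest()
digest_compressed = hashlib.sha256(gzip.compress(body)).digest()
assert digest_plain != digest_compressed

# Decompressing first recovers the original digest:
assert hashlib.sha256(gzip.decompress(gzip.compress(body))).digest() == digest_plain
```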
8da6785c46 mix format
2026-01-25 01:31:26 +00:00
3deb267333 if we don't have a preferredUsername, accept standard fallback 2026-01-25 01:30:25 +00:00
0d7bbab384 ensure utf-8 nicknames on nickname gets and user validator 2026-01-25 01:29:10 +00:00
aafe0f8a81 Merge pull request 'scrubbers/default: Allow "mention hashtag" classes used by Mastodon' (#1056) from mkljczk/akkoma:allow-mention-hashtag into develop
Reviewed-on: #1056
2026-01-24 14:39:56 +00:00
24faec8de2 scrubbers/default: Allow "mention hashtag" classes used by Mastodon
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-24 14:17:33 +01:00
816d2332ab Merge pull request 'Update docs/docs/administration/backup.md' (#1050) from patatas/akkoma:develop into develop
Reviewed-on: #1050
Reviewed-by: Oneric <oneric@noreply.akkoma>
2026-01-18 17:28:49 +00:00
a4a547e76e Update docs/docs/administration/backup.md
separate commands with semicolon (consistent with previous step in restore instructions)
2026-01-17 20:08:57 +00:00
Oneric
6cec7d39d6 db/migration/20251227000002: improve performance with older PostgreSQL
On fedi.absturztau.be the planner did not utilise the context
index for the original version, leading to a query plan
100× worse than with this tweaked version.

With PostgreSQL 18 the relative difference is much smaller
but also in favour of the new version, with the best observed
instance resulting in nearly half the estimated cost.
2026-01-13 00:00:00 +00:00
Oneric
3fbf7e03cf db/migration/20251227000000: add analyze statement
The migration after next greatly profits from this index,
but sometimes the planner may not pick it up immediately
without an explicit ANALYZE call.
2026-01-12 00:00:00 +00:00
Oneric
31d277ae34 db: (re)add activity type index
Before 32a2a0e5fa the context index
acted as a (judging by the name, surprising) type index too.
Type-based queries are used in the daily pruning of old e.g. Delete
activities by PruneDatabaseWorker, when querying reports for the admin
API and, inferring from the significant change in average execution time,
by a mysterious COUNT query we couldn’t associate with any code so far:
  SELECT count(a0."id") FROM "activities" AS a0 WHERE (a0."data"->>$2 = $1);

Having this as a separate index without pre-sorting results in
an overall smaller index size than merging this into the context index
again, and since it was not sorted before, non-context queries
appear to not significantly profit from presorting.
2026-01-11 00:00:00 +00:00
Oneric
3487e93128 api/v1/custom_emoji: improve performance
Metrics showed this endpoint taking unexpectedly long processing times
for a simple readout of an ETS table. Further profiling with
fprof revealed almost all time was spent in URI.merge.

Endpoint.url() is per its documented API contract guaranteed to have a
blank path and all our emoji file paths start with a slash.
Thus, a simple binary concat is equivalent to the result of URI.merge.

This cuts down the profiled time of just the fetching and
rendering of all emoji to a tenth for me. For the overall API request,
with overhead for handling of the incoming request as well as encoding
and sending out of the data, the overall time as reported by phoenix
metrics dropped by a factor of about 2.5.
With a higher count of installed emoji the overall relative time
reduction is assumed to get closer to the relative time reduction of
the actual processing in the controller + view alone.

For reference, this was measured with 4196 installed emoji:
(keep in mind enabling fprof slows down the overall execution)

          fprof'ed controller   Phoenix stop duration
 BEFORE:     (10 ± 0.3)s             ~250 ms
  AFTER:    (0.9 ± 0.06)s            ~100 ms

Note: akkoma-fe does not use this endpoint, but /api/v1/pleroma/emoji
defined in Pleroma.Web.TwitterAPI.UtilController:emoji/2 which only
emits absolute-path references and thus had no use for URI.merge anyway.
2026-01-11 00:00:00 +00:00
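The equivalence the commit above relies on, that plain concatenation matches a full URL merge when the base URL has a blank path and the appended path is absolute, can be sketched outside Elixir too. Illustrative Python only (the actual change uses Elixir's URI.merge and Endpoint.url(); the helper name here is hypothetical):

```python
from urllib.parse import urljoin

def emoji_url(base_url, file_path):
    """Fast path: when base_url has a blank path and file_path is
    absolute ("/..."), plain concatenation equals a full URL merge,
    skipping the comparatively expensive merge machinery."""
    assert file_path.startswith("/")
    return base_url + file_path

base = "https://example.social"    # like Endpoint.url(): blank path
path = "/emoji/blobs/blobcat.png"  # emoji file paths start with "/"
# Both approaches agree under these preconditions:
assert emoji_url(base, path) == urljoin(base, path)
```

The precondition matters: with a non-blank base path the two would diverge, which is why the documented API contract of the base URL is what justifies the shortcut.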
93b513d09c Merge pull request 'Fix conversations API' (#1039) from Oneric/akkoma:fix-conv-api into develop
Reviewed-on: #1039
2026-01-11 15:54:49 +00:00
Oneric
6443db213a conversation: remove unused users relationship
It is never used outside tests,
and even there its correctness and/or
worth was questionable. The participations’ recipients,
or just testing over the participations’ user_id, seem
like a better fit.
In particular this was always preloaded for API responses,
needlessly slowing them down.
Oneric
263c915d40 Create and bump conversations on new remote posts
The handling of remote and local posts should really be unified.
Until now, new remote posts did not bump conversations
(notably though, until the previous commit, remote
 edits (and only edits) did in fact bump conversations due to being
 the sole caller of notify_and_stream outside its parent module).
Oneric
388d67f5b3 Don't mark conversations as unread on post edits
Without any indication of which post was edited this is only confusing and annoying.
2026-01-11 00:00:00 +00:00
Oneric
6adf0be349 notifications: always defer sending
And consistently treat muted notifications.

Before, when notifications were immediately sent out, it correctly
skipped streaming and web pushes for muted notifications, but the
distinction was lost when notification sending was deferred.
Furthermore, for users which are muted, but are still allowed to
create notifications, it previously (sometimes) automatically marked
them as read. Now notifications are either always silent, meaning
 - will not be streamed or create web pushes
 - will automatically be marked as read on creation
 - should not show up unless passing with_muted=true
or active, meaning
 - will be streamed to active websockets and cause web pushes if any
 - must be marked as read manually
 - show up without with_muted=true

Deferring sending out of notifications is desirable to avoid duplicate
sending and ghost notifications when processing of the activity fails
later in the pipeline, and avoids stalling the ongoing db transaction.

Inspired by https://git.pleroma.social/pleroma/pleroma/-/merge_requests/4032
but the actual implementation differs to not break with_muted=true.
2026-01-11 00:00:00 +00:00
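The silent-vs-active split described above can be sketched as follows. This is a hedged illustration in Python with invented names; the real implementation is Elixir code inside Akkoma's notification handling.

```python
# Rough sketch of the silent-vs-active notification split.
# Names are illustrative only, not Akkoma's actual API.
def classify_notification(actor: str, muted_actors: set) -> dict:
    silent = actor in muted_actors
    return {
        # silent notifications are never streamed or web-pushed
        "stream": not silent,
        # silent notifications are auto-marked as read on creation
        "mark_read": silent,
        # silent notifications only show up with with_muted=true
        "requires_with_muted": silent,
    }
```

The point of the commit is that this classification is now applied once, at the deferred sending step, instead of diverging between the immediate and deferred code paths.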
Oneric
2516206a31 conversation: include owner in recipients upon creation
Participation.set_recipients already does something equivalent,
but it was forgotten here.
2026-01-11 00:00:00 +00:00
Oneric
9311d603fb conversation_controller: skip superfluous second order clause
This might also have prevented utilising the pre-sorted index.
2026-01-11 00:00:00 +00:00
Oneric
34df23e468 api/masto/conversation: fix pagination over filtered blocks
Previously the view received pre-filtered results and therefore
pagination headers were also only generated with pre-filtered
results in mind. Thus the next page would again traverse over
previously seen but filtered out entries.
Crucially, since this last_activity filtering is happening _after_
the SQL query, it can lead to fewer results being returned than
requested and in particular even zero results remaining.
Sufficiently large blocks of filtered participations thus caused
pagination to get stuck and abort early.

Ideally we would like for this filtering to occur as part of the SQL
query, ensuring we will always return as many results as are allowed as
long as there are more to be found. This would obviate the issue.
However, for the reasons discussed in the previous commit’s message
this is currently not feasible.

The view is the only caller inside the actual application of the
for_user_with_pagination* functions, thus we can simply move filtering
inside the view, allowing the full result set to be used when generating
pagination headers.
This still means there might be empty result pages, but now with
correct pagination headers one can follow to eventually get the full set.
2026-01-11 00:00:00 +00:00
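The failure mode described above can be reproduced with a toy model. This is a hedged sketch in Python (the real code paginates SQL results by participation); it shows why deriving the cursor from post-filtered results stalls pagination, while deriving it from the raw page does not.

```python
# Toy model of the bug: a keyset-paginated query followed by
# post-query filtering of blocked entries.
def fetch_page(items, after, limit):
    """Simulates the SQL query: keyset pagination, no filtering."""
    return [x for x in items if x > after][:limit]

def paginate(items, blocked, limit=2, cursor_from_filtered=True):
    seen, cursor = [], 0
    while True:
        page = fetch_page(items, cursor, limit)
        if not page:
            break
        visible = [x for x in page if x not in blocked]  # post-query filter
        basis = visible if cursor_from_filtered else page
        if not basis:
            # buggy variant: a fully filtered page yields no new cursor,
            # so pagination aborts before reaching later results
            break
        cursor = basis[-1]
        seen += visible
    return seen

# buggy: stops at the block of filtered entries
assert paginate([1, 2, 3, 4, 5, 6], {3, 4}) == [1, 2]
# fixed: cursor advances past filtered pages
assert paginate([1, 2, 3, 4, 5, 6], {3, 4}, cursor_from_filtered=False) == [1, 2, 5, 6]
```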
Oneric
1029aa97d2 api/masto/conversations: paginate by last status id
The old pagination logic was inconsistent and thus broken.
It used to order conversations based on updated_at but generated
pagination HTTP Link headers based on participation IDs.
Thus entries could repeat or be left out entirely.

Notably, using updated_at also led to bumps when
merely marking an individual conversation as read,
though not when marking _all_ conversations as read in bulk
since Repo.update_all does not touch date fields automatically.

For consistent and sensible "last active" ordering this is replaced
by using the flake ID (which contains the date) of the last status
which bumped the conversation for both ordering and pagination
parameters. This takes into account whether the status was
visible to the participation owner at the time of posting.

Notably however, it does not care about whether the status continues to
exist or continues to be visible to the owner. Thus this is not marked
as a foreign key and MUST NOT be blindly used as such!
For the most part, this should be considered as just a funny timestamp,
which is more resilient against spurious bumps than updated_at, offers
deterministic sorting for simultaneously bumped conversations and
better usability in pagination HTTP headers and requests.

Implementing this as a proper foreign key with continuously enforced
visibility restrictions was considered. This would allow loading
the "last activity" by joining on this foreign key, immediately
deleting participations once they become empty and obsoleting the
pre-existing post-query result filtering.
However, maintaining this such that visibility restrictions are
respected at all times is challenging and incurs overhead.
E.g. unfollows may retroactively change whether the participation owner
is allowed to access activities in the context.

This may be reconsidered in the future once we are clearer
on how we want to (not) extend conversation features.

This also improves on query performance by now using a multi-row,
pre-sorted index such that one user querying their latest conversations
(a frequent operation) only needs to perform an index lookup (and
loading full details from the heap).
Previously, queries from rarely active users needed to traverse newer
conversations of all other users to collect results.
2026-01-11 00:00:00 +00:00
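Why a flake ID can double as a timestamp for ordering: in the common flake layout the most significant 64 bits hold a Unix-millisecond timestamp, so plain ordering of the IDs is chronological. The sketch below is a hedged Python illustration under that assumption; the exact field widths in Akkoma's flake implementation may differ.

```python
# Hedged sketch: extracting the timestamp from a 128-bit flake ID,
# assuming the common layout of a 64-bit Unix-ms timestamp in the
# top bits followed by worker/sequence fields.
from datetime import datetime, timezone

def flake_timestamp(flake: int) -> datetime:
    ms = flake >> 64  # top 64 bits hold the creation time in milliseconds
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

# Because the timestamp occupies the most significant bits, integer
# comparison of flake IDs is chronological, which is what makes them
# usable both as a sort key and as a pagination cursor.
a = (1_700_000_000_000 << 64) | 99   # earlier ms, high sequence
b = (1_700_000_000_001 << 64)        # later ms
assert a < b
```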
Oneric
ebd22c07d1 test/factory: take override parameters at more factories 2026-01-11 00:00:00 +00:00
Oneric
97b2dffcb9 pagination: allow custom pagination ids
While we already use wrapped return lists for
HTML pagination headers, currently SQL queries
from the SQL pagination helper use the primary key "id"
of some given table binding. However, for participations
we would like to be able to sort by a field which is not
a primary key. Thus allow custom field names.
2026-01-11 00:00:00 +00:00
Oneric
613135a402 ap: fix streamer crashes on new, locally initiated DM threads
Since side effect handling of local notes currently can only immediately
stream out changes and notifications (even though it really shouldn’t for
many more reasons than this), the transaction inserting the various
created objects into the database is not finished yet when StreamerView
tries to render the conversation.

This would be fine were it using the same db connection as the
inserting transaction. However, the streamer is a different process
and gets sent the in-memory version of the participation.

In the case of newly created threads, the streamer process
will not be able to preload it itself since it uses a different db
connection and thus cannot see any effects of the unfinished transaction
yet. Thus it must be preloaded before passing it to Streamer.

Notably, the same reasoning applies to the new status activity itself.
Though fetching the activity takes more time with several preparatory
queries, and in practice it appears rare for the actual activity query to
occur before the transaction finishes.
Nonetheless, this too should and will be fixed in a future commit.

Fixes: #887
2026-01-11 00:00:00 +00:00
Oneric
120a86953e conversation: don't create participations for remote users
They can never query the participation via API nor edit the recipients,
thus such participations would just forever remain unused and are
a waste of space.
2026-01-11 00:00:00 +00:00
Oneric
8d0bf2d2de conversation/participation: fix restrict_recipients
For unfathomable reasons, this did not actually use recipient(ship)
information in any way, but compared against all users owning a
participation within the same context.
This is obviously contradicting *oma’s manually managed recipient data
and even without this extension breaks if there are multiple distinct
subthreads under the same (possibly public) root post.
2026-01-11 00:00:00 +00:00
Oneric
32ec7a3446 cosmetic/conversation/participation: mark eligible functions as private 2026-01-11 00:00:00 +00:00
Oneric
9fffc49aaa conversation/participation: delete unused function
Redundant with unread_count/1 and only the latter is actually used
2026-01-11 00:00:00 +00:00
Oneric
32a2a0e5fa db: tweak activity context index
Presorting the index will speed up all context queries
and avoids having to load the full context before sorting
and limiting can even start.
This is relevant when getting the context of a status
and for dm conversation timelines.

Including the id column adds to the index size,
but dropping non-Creates also brings it down again
for an overall only moderate size increase.
The speedup when querying large threads/conversations
is worth it either way.

Otherwise, the context attribute is only used as a condition in
queries related to notifications, but only to filter an otherwise
pre-selected set of notifications, where this index is not relevant.
2026-01-11 00:00:00 +00:00
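A hypothetical shape of such an index (illustrative only; column names, index name and the exact predicate may differ from the actual Akkoma migration):

```sql
-- Presorted partial index: context queries become a single ordered
-- index range scan, and excluding non-Creates keeps the index small.
CREATE INDEX activities_context_index
    ON activities ((data ->> 'context'), id DESC)
    WHERE (data ->> 'type') = 'Create';
```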
Oneric
5608f974a3 api: support with_muted in pleroma/conversation/:id/statuses
It does not make sense to check for thread mutes here though.
Even if this thread was muted, it being explicitly fetched
indicates it is desired to be displayed.
While this used to load thread mutes, it didn’t actually apply them
until now, so in this regard it does not change behaviour either and we
can optimise the query by not loading thread mutes at all.

This does change behavior of conversation timelines however
by now omitting posts of muted users by default, aligning it
with all other timelines.
2026-01-11 00:00:00 +00:00
Oneric
f280dfa26f docs/monitoring: note reference dashboard testing 2026-01-11 00:00:00 +00:00
Oneric
0326330d66 telemetry: log which activities failed to be delivered 2026-01-11 00:00:00 +00:00
d35705912f Merge pull request 'webfinger: accept canonical AP type in XML and don’t serve response for remote users' (#1045) from Oneric/akkoma:fix-webfinger-type into develop
Reviewed-on: #1045
2026-01-10 20:23:53 +00:00
Oneric
74fa8f5581 webfinger: don’t serve response for remote users’ AP id
2026-01-10 00:00:00 +00:00
Oneric
967e2d0e71 webfinger: mark represent_user as private 2026-01-10 00:00:00 +00:00
Oneric
ee7e6d87f2 fed/in: accept canonical AP type in XML webfinger data
This was supposed to be already handled for both XML and JSON
with d1f6ecf607 though the code
failed to consider variable scopes and thus was broken and
actually just a noop for XML.

For inexplicable reasons 1a250d65af
just outright removed both the failed attempt to parse the canonical
type in XML documents and also serving of the canonical type in our own
XML (and only XML) response.

With the refactor in 28f7f4c6de
the canonical type was again served in both our own JSON and XML
responses, but parsing of the canonical type remained missing
from XML until now.
2026-01-10 00:00:00 +00:00
e326285085 Merge pull request 'Various fixes' (#1043) from Oneric/akkoma:varfixes into develop
Reviewed-on: #1043
2026-01-05 14:38:31 +00:00
Oneric
80817ac65e fed/out: also represent emoji as anonymous objects in reactions
It did not use the same Emoji object template as other occurrences.
This also fixes an issue with the icon URL not being properly encoded
as well as an inconsistency regarding the domain part of remote
reactions in retractions. All places use the image URL domain
except the query to find the activity to retract, which relied on the id.
Even before this change, this made it impossible to retract remote
emoji reactions if the remote either doesn't send emoji IDs or
doesn't store images on the ActivityPub domain.

Addresses omission in 4ff5293093

Fixes: #1042
2026-01-05 00:00:00 +00:00
Oneric
5f4083888d api/stream: don’t leak hidden follow* counts in relationship updates
Based on the Pleroma patch linked below, but not needlessly
hiding the count if only the listing of the specific follow* accounts is
hidden while counts are still public.
https://git.pleroma.social/pleroma/pleroma/-/merge_requests/4205

Co-authored-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-05 00:00:00 +00:00
Oneric
eb08a3fff2 api/pleroma/conversation: accept JSON body to update conversation 2026-01-05 00:00:00 +00:00
Oneric
d6209837b3 api/v1/filters: escape regex sequence in user-provided phrases 2026-01-05 00:00:00 +00:00
Oneric
59b524741d web/activity_pub: drop duplicated restrict_filtered 2026-01-05 00:00:00 +00:00
e941f8c7c1 Merge pull request 'Support Mastodon-compatible translations API' (#1024) from mkljczk/akkoma:akkoma-mastoapi-translations into develop
Reviewed-on: #1024
2026-01-04 16:11:27 +00:00
b147d2b19d Add /api/v1/instance/translation_languages
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-04 00:00:00 +00:00
d65758d8f7 Deduplicate translations code
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-04 00:00:00 +00:00
f5ed0e2e66 Inline translation provider names in function
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-04 00:00:00 +00:00
3b74ab8623 Support Mastodon-compatible translations API
Signed-off-by: nicole mikołajczyk <git@mkljczk.pl>
2026-01-04 00:00:00 +00:00
c971f297a5 Merge pull request 'Deal with elixir 1.19 warnings and test failures' (#1029) from Oneric/akkoma:elixir-1.19-warnings into develop
Reviewed-on: #1029
2025-12-29 00:09:37 +00:00
720b51d08e Merge pull request 'Update ci build scripts for 1.19' (#1038) from ci-builds-otp28 into develop
Reviewed-on: #1038
2025-12-28 21:57:15 +00:00
27b725e382 Update ci/build-all.sh
2025-12-28 21:52:01 +00:00
Oneric
86d62173ff test: fix regex compare for OTP28
This was technically already incorrect before and pointed out as such in
documentation, but in practice worked well until OTP28’s regex changes.
2025-12-28 00:00:00 +00:00
Oneric
cbae0760d0 ci: adjust elixir and OTP versions
2025-12-28 00:00:00 +00:00
Oneric
1fed47d0e0 user/signing_key: fix public_key functions and test
2025-12-25 00:00:00 +00:00
Oneric
712a629d84 Fix test in- and exclusion
We had a few files matching neither inclusion nor exclusion rules.
With elixir 1.19 this creates a warning.
Most were intended to be ignored and thus we now override the default
rules in mix.exs to be explicit about this. However, two test files were
simply misnamed and intended to run, but until now were skipped.

Notably, the signing key test for shared key ids currently fails due to
a missing mock. For now this faulty test is simply disabled; the next
commit will fix it.
2025-12-25 00:00:00 +00:00
Oneric
84ad11452e test: fix elixir 1.19 warnings in tests
Except for the struct comparisons, which were a real bug,
the changes are (in current versions) just cosmetic.
2025-12-25 00:00:00 +00:00
Oneric
ae17ad49ff utils: comply with elixir 1.19 soft-requirement for parallel compiles
Elixir 1.19 now requires (with a deprecation warning) return_diagnostics
to be set to true for parallel compiles. However, this parameter was only
added in Elixir 1.15, so this raises our base requirement.
Since Elixir 1.14 is EOL anyway now, this should be fine though.
2025-12-25 00:00:00 +00:00
Oneric
e2f9315c07 cosmetic: adjust for elixir 1.19 struct update requirements
When feasible actually enforce the to-be-updated data being the desired struct type.
For ActivityDraft this would add too much clutter and isn't necessary
since all affected functions are private functions we can ensure only
get correct data, thus just use simple map-update syntax here.
2025-12-25 00:00:00 +00:00
Oneric
fdd6bb5f1a mix: define preferred env in cli()
Defining inside project() is deprecated
2025-12-25 00:00:00 +00:00
Oneric
7936c01316 test/config/deprecations: fix warning comparison for elixir 1.19
It can include a terminal-colour sequence before the final newline
2025-12-25 00:00:00 +00:00
Oneric
d92f246c56 web/ap/object_validators/tag: fix hashtag name normalisation
When federating out, we add the leading hashtag back in though
2025-12-25 00:00:00 +00:00
Oneric
8f166ed705 cosmetic: adjust for elixir 1.19 mix format 2025-12-25 00:00:00 +00:00
Oneric
b44292650e web/telemetry: fix HTTP error mapping for Prometheus
Fixes omission in commit 2b4b68eba7
which introduced a more concise response format for HTTP errors.
2025-12-24 00:00:00 +00:00
68c79595fd Merge pull request 'Fix more interactions with invisible posts and corresponding data leaks' (#1036) from Oneric/akkoma:fix-interacting-nonvisible-posts into develop
Reviewed-on: #1036
2025-12-24 02:43:00 +00:00
Oneric
be7ce02295 test/mastodon_api/status: insert mute before testing unmute
Currently the test is still correctly sensitive to the visibility issue
it wants to test, but it is easy to see how this could change in the future
if it starts considering whether a mute existed in the first place.
Inserting a mute first ensures the test will keep working as intended.
2025-12-24 00:00:00 +00:00
Oneric
b50028cf73 changelog: add entries for recent fixes
2025-12-24 00:00:00 +00:00
Oneric
82dd0b290a api/statuses/unfav: do not leak post when access to post was lost
If a user successfully favourited a post in the past (implying they once
had access), but is now no longer allowed to see the (potentially
since edited) post, the request would still process and leak the current
status data in the response.

As a compromise to still allow retracting past favourites (if IDs are
still cached), the unfavouriting operation will still be processed, but
at the end lie to the user and return a "not found" error instead of
a success with forbidden data.

This was originally found by Phantasm and fixed in Pleroma as part of
https://git.pleroma.social/pleroma/pleroma/-/merge_requests/4400
but by completely preventing the favourite retraction.
2025-12-24 00:00:00 +00:00
Oneric
981997a621 api/statuses/bookmark: improve response for hidden or bogus targets
Also fixes Bookmark.destroy crashing when called with
parameters not mapping to any existing bookmark.

Partially-based-on: fe7108cbc2
Co-authored-by: Phantasm <phantasm@centrum.cz>
2025-12-24 00:00:00 +00:00
Lain Soykaf
126ac6e0e7 Transmogrifier: Handle user updates.
Cherry-picked-from: 98f300c5ae
2025-12-24 00:00:00 +00:00
Lain Soykaf
3e3baa089b TransmogrifierTest: Add failing test for Update.
Cherry-picked-from: ed538603fb
2025-12-24 00:00:00 +00:00
Oneric
25d27edddb ap/transmogrifier: ensure attempts to update non-updateable data are logged
Often raised errors get logged automatically,
but not always and here it doesn't seem to happen.
I’m not sure what the criteria for it being logged or not are tbh.
2025-12-24 00:00:00 +00:00
Phantasm
300744b577 CommonAPI: Forbid disallowed status (un)muting and unpinning
When a user tried to unpin a status not belonging to them, a full
MastoAPI response was sent back even if the status was not visible to them.

Ditto for (un)muting, except for the ownership check.

Cherry-picked-from: 2b76243ec8
2025-12-24 00:00:00 +00:00
Phantasm
ac94214ee6 Transmogrifier: update internal fields list according to constant
Adjusted original patch to drop fields not present in Akkoma

Cherry-picked-from: 3f16965178
2025-12-24 00:00:00 +00:00
Oneric
a2d156aa22 cosmetic/common_api: simplify check_statuses_visibility 2025-12-24 00:00:00 +00:00
Phantasm
31d5f556f0 CommonAPI: Fail when user sends report with posts not visible to them
Cherry-picked-from: 2b76243ec8
2025-12-24 00:00:00 +00:00
144 changed files with 5709 additions and 2509 deletions


@ -4,12 +4,12 @@ when:
 matrix:
 # test the lowest and highest versions
 include:
-- ELIXIR_VERSION: 1.14
+- ELIXIR_VERSION: 1.15
   OTP_VERSION: 25
   LINT: NO
   PLATFORM: linux/amd64
-- ELIXIR_VERSION: 1.18
-  OTP_VERSION: 27
+- ELIXIR_VERSION: 1.19
+  OTP_VERSION: 28
   LINT: YES
   PLATFORM: linux/arm64


@ -4,7 +4,55 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## Unreleased
## 2026.03
### BREAKING
- Elixir 1.14 is no longer supported, and it's EOL! Upgrade to Elixir 1.15+
- `account` entities in API responses now only contain a cut-down version of their server's nodeinfo.
TEMPORARILY a config option is provided to serve the full nodeinfo data again.
HOWEVER this option WILL be removed soon. If you encounter any issues with third-party clients fixed
by using this setting, tell us so we can include all actually needed keys by default.
### REMOVED
### Added
- Mastodon-compatible translation endpoints are now supported too;
the older Akkoma endpoints are deprecated, but there are no immediate plans for removal
- `GET pleroma/conversation/:id/statuses` now supports `with_muted`
- `POST /api/v1/statuses` accepts and now prefers the Mastodon-compatible `quoted_status_id` parameter for quoting a post
- `status` API entities now expose non-shallow quotes in a manner also compatible with Mastodon clients
- support for WebFinger backlinks in ActivityPub actors (FEP-2c59)
### Fixed
- pinning, muting or unmuting a status one is not allowed to access no longer leaks its content
- revoking a favourite on a post one lost access to no longer leaks its content
- user info updates are again actively federated to other servers;
this was accidentally broken in the previous release
- it is no longer possible to reference posts one cannot access when reporting another user
- streamed relationship updates no longer leak follow* counts for users who chose to hide their counts
- WebFinger data and user nicknames no longer allow non-consensual associations
- Correctly set up custom WebFinger domains work again
- fix paths of emoji added or updated at runtime, and remove a pack's emoji from the runtime set when deleting the entire pack, without requiring a full emoji reload
- fix retraction of remote emoji reaction when id is not present or its domain differs from image host
- fix AP ids declared with the canonical type being ignored in XML WebFinger responses
- fix many, many bugs in the conversations API family
- notifications about muted entities are no longer streamed out
- non-UTF-8 usernames no longer lead to internal server errors in API endpoints
- when SimplePolicy rules are configured but the MRF is not enabled, its rules no longer interfere with fetching
- fixed remote follow counter refresh on user (re)fetch
- remote users whose follow* counts are private are now actually shown as such in the API instead of representing them with public zero counters
- fix local follow* collections counting and including AP IDs of deleted users
### Changed
- `PATCH /api/v1/pleroma/conversations/:id` now accepts update parameters via JSON body too
- it is now possible to quote local and one's own private posts provided a compatible scope is used
- on final activity failures the error log now includes the affected activity
- improved performance of `GET api/v1/custom_emoji`
- outgoing HTTP requests now accept compressed responses
- the system CA certificate store is now used by default
- when refreshing remote follow* stats all fetch-related errors are now treated as stats being private;
this avoids spurious error logs and better matches the intent of implementations serving fallback HTML responses on the AP collection endpoints
## 2025.12


@ -10,6 +10,7 @@
## Supported FEPs
- [FEP-67ff: FEDERATION](https://codeberg.org/fediverse/fep/src/branch/main/fep/67ff/fep-67ff.md)
- [FEP-2c59: Discovery of a Webfinger address from an ActivityPub actor](https://codeberg.org/fediverse/fep/src/branch/main/fep/2c59/fep-2c59.md)
- [FEP-dc88: Formatting Mathematics](https://codeberg.org/fediverse/fep/src/branch/main/fep/dc88/fep-dc88.md)
- [FEP-f1d5: NodeInfo in Fediverse Software](https://codeberg.org/fediverse/fep/src/branch/main/fep/f1d5/fep-f1d5.md)
- [FEP-fffd: Proxy Objects](https://codeberg.org/fediverse/fep/src/branch/main/fep/fffd/fep-fffd.md)
@ -37,6 +38,21 @@ Depending on instance configuration the same may be true for GET requests.
We set the optional extension term `htmlMfm: true` when using content type "text/x.misskeymarkdown".
Incoming messages containing `htmlMfm: true` will not have their content re-parsed.
## WebFinger
Akkoma requires WebFinger implementations to respond to queries about a given user both when
`acct:user@domain` and when the canonical ActivityPub id of the actor is passed as the `resource`.
Akkoma strongly encourages ActivityPub implementations to include
a FEP-2c59-compliant WebFinger backlink in their actor documents.
Without FEP-2c59, and if different domains are used for ActivityPub and the WebFinger subject,
Akkoma relies on the presence of a host-meta LRDD template on the ActivityPub domain
or a HTTP redirect from the ActivityPub domain's `/.well-known/webfinger` to an equivalent endpoint
on the domain used in the `subject` to discover and validate the domain association.
Without FEP-2c59 Akkoma may not become aware of changes to the
preferred WebFinger `subject` domain for already discovered users.
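For the host-meta route, the ActivityPub domain would serve an RFC 6415 document along these lines (the domains here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- served at https://ap.example/.well-known/host-meta -->
<XRD xmlns="http://docs.oasis-open.org/ns/xri/xrd-1.0">
  <Link rel="lrdd"
        template="https://account.example/.well-known/webfinger?resource={uri}"/>
</XRD>
```

A client following the template substitutes the queried resource into the `{uri}` placeholder and continues WebFinger discovery on the indicated domain.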
## Nodeinfo
Akkoma provides many additional entries in its nodeinfo response,


@ -1,3 +1,2 @@
-./build.sh 1.14-otp25 1.14.3-erlang-25.3.2-alpine-3.18.0
-./build.sh 1.15-otp25 1.15.8-erlang-25.3.2.18-alpine-3.19.7
-./build.sh 1.18-otp27 1.18.2-erlang-27.2.4-alpine-3.19.7
+./build.sh 1.15-otp25 1.15.8-erlang-25.3.2.18-alpine-3.22.2
+./build.sh 1.19-otp28 1.19-erlang-28.0-alpine-3.23.2


@ -249,6 +249,7 @@ config :pleroma, :instance,
remote_post_retention_days: 90,
skip_thread_containment: true,
limit_to_local_content: :unauthenticated,
filter_embedded_nodeinfo: true,
user_bio_length: 5000,
user_name_length: 100,
max_account_fields: 10,
@ -903,7 +904,11 @@ config :pleroma, ConcurrentLimiter, [
{Pleroma.Search, [max_running: 30, max_waiting: 50]}
]
-config :pleroma, Pleroma.Web.WebFinger, domain: nil, update_nickname_on_user_fetch: true
+config :pleroma, Pleroma.Web.WebFinger,
+  domain: nil,
+  # this _forces_ a nickname rediscovery and validation, otherwise only updates when detecting a change
+  # TODO: default this to false after the fallout from recent WebFinger bugs is healed
+  update_nickname_on_user_fetch: true
config :pleroma, Pleroma.Search, module: Pleroma.Search.DatabaseSearch


@ -3495,7 +3495,7 @@ config :pleroma, :config_description, [
key: :module,
type: :module,
description: "Translation module.",
-  suggestions: {:list_behaviour_implementations, Pleroma.Akkoma.Translator}
+  suggestions: {:list_behaviour_implementations, Pleroma.Akkoma.Translator.Provider}
}
]
},


@ -18,7 +18,7 @@
3. Go to the working directory of Akkoma (default is `/opt/akkoma`)
4. Copy the above-mentioned files back to their original position.
5. Drop the existing database and user[¹]. `sudo -Hu postgres psql -c 'DROP DATABASE akkoma;';` `sudo -Hu postgres psql -c 'DROP USER akkoma;'`
-6. Restore the database schema and akkoma role[¹] (replace the password with the one you find in the configuration file), `sudo -Hu postgres psql -c "CREATE USER akkoma WITH ENCRYPTED PASSWORD '<database-password-wich-you-can-find-in-your-configuration-file>';"` `sudo -Hu postgres psql -c "CREATE DATABASE akkoma OWNER akkoma;"`.
+6. Restore the database schema and akkoma role[¹] (replace the password with the one you find in the configuration file), `sudo -Hu postgres psql -c "CREATE USER akkoma WITH ENCRYPTED PASSWORD '<database-password-wich-you-can-find-in-your-configuration-file>';";` `sudo -Hu postgres psql -c "CREATE DATABASE akkoma OWNER akkoma;"`.
7. Now restore the Akkoma instance's data into the empty database schema[¹]: `sudo -Hu postgres pg_restore -d akkoma -v -1 </path/to/backup_location/akkoma.pgdump>`
8. If you installed a newer Akkoma version, you should run the database migrations `./bin/pleroma_ctl migrate`[²].
9. Restart the Akkoma service.


@ -53,6 +53,10 @@ curl -i -H 'Authorization: Bearer $ACCESS_TOKEN' https://myinstance.example/api/
You may use the eponymous [Prometheus](https://prometheus.io/)
or anything compatible with it like e.g. [VictoriaMetrics](https://victoriametrics.com/).
The latter claims better performance and storage efficiency.
However, at the moment our reference dashboard only works with VictoriaMetrics,
thus if you wish to use the reference as an easy drop-in template you must
use VictoriaMetrics.
Patches to allow the dashboard to work with plain Prometheus are welcome though.
Both of them can usually be easily installed via distro-packages or docker.
Depending on your distro or installation method the preferred way to change the CLI arguments and the location of config files may differ; consult the documentation of your chosen method to find out.
@ -254,6 +258,44 @@ as well as database diagnostics.
BEAM VM stats include detailed memory consumption breakdowns
and a full list of running processes for example.
## Postgres Statements Statistics
The built-in dashboard can list the queries your instance spends the
most cumulative time on, giving insight into potential bottlenecks
and what might be worth optimising.
This is the “Outliers” tab in “Ecto Stats”.
However, for this to work you first need to enable a PostgreSQL extension
as follows:
Add the following two lines to your `postgresql.conf` (typically placed in your data dir):
```
shared_preload_libraries = 'pg_stat_statements'
pg_stat_statements.track = all
```
Now restart PostgreSQL. Then connect to your akkoma database using `psql` and run:
```sql
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
```
Execution time statistics will now start to be gathered.
To get a representative sample of your instance's workload you should wait a week, or at least a day.
These statistics are never reset automatically, but with new Akkoma releases and
changes in the servers your instance federates with, the workload will evolve.
Thus it's a good idea to reset them occasionally using:
```sql
-- get user oid: SELECT oid FROM pg_roles WHERE rolname = 'akkoma';
-- get db oid: SELECT oid FROM pg_database WHERE datname = 'akkoma';
SELECT pg_stat_statements_reset(<akkoma user oid>, <akkoma database oid>);
-- or alternatively, to just reset stats for all users and databases:
-- SELECT pg_stat_statements_reset();
```
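For reference, the "Outliers" tab is built on the same `pg_stat_statements` view you can also query by hand. A minimal manual query might look like this (on PostgreSQL 12 the column is named `total_time` rather than `total_exec_time`):
```sql
-- Top 10 queries by accumulated execution time (PostgreSQL 13+)
SELECT left(query, 60) AS query_start,
       calls,
       round(total_exec_time::numeric, 2) AS total_ms
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;
```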
## Oban Web
This too requires administrator rights to access and can be found under `/akkoma/oban` if enabled.

View file

@ -1,8 +1,8 @@
## Required dependencies
* PostgreSQL 12+
* Elixir 1.14.1+ (currently tested up to 1.18)
* Erlang OTP 25+ (currently tested up to OTP27)
* Elixir 1.15+ (currently tested up to 1.19)
* Erlang OTP 25+ (currently tested up to OTP28)
* git
* file / libmagic
* gcc (clang might also work)

View file

@ -723,6 +723,8 @@
},
"displayName": "Run Queue",
"mappings": [],
"max": 1.5,
"min": 0,
"thresholds": {
"mode": "absolute",
"steps": [
@ -732,11 +734,11 @@
},
{
"color": "yellow",
"value": 15
"value": 0.2
},
{
"color": "red",
"value": 25
"value": 1
}
]
},
@ -784,6 +786,12 @@
{
"id": "displayName",
"value": "Memory"
},
{
"id": "min"
},
{
"id": "max"
}
]
},
@ -836,7 +844,7 @@
"disableTextWrap": false,
"editorMode": "builder",
"exemplar": false,
"expr": "rate(vm_memory_total_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"expr": "increase(vm_memory_total_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -854,7 +862,7 @@
"disableTextWrap": false,
"editorMode": "builder",
"exemplar": false,
"expr": "rate(vm_memory_total_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"expr": "increase(vm_memory_total_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -882,7 +890,7 @@
"disableTextWrap": false,
"editorMode": "builder",
"exemplar": false,
"expr": "rate(vm_total_run_queue_lengths_cpu_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__rate_interval])",
"expr": "increase(vm_total_run_queue_lengths_cpu_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -900,7 +908,7 @@
"disableTextWrap": false,
"editorMode": "builder",
"exemplar": false,
"expr": "rate(vm_total_run_queue_lengths_cpu_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__rate_interval])",
"expr": "increase(vm_total_run_queue_lengths_cpu_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -928,7 +936,7 @@
"disableTextWrap": false,
"editorMode": "builder",
"exemplar": false,
"expr": "rate(vm_total_run_queue_lengths_io_fsum_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__rate_interval])",
"expr": "increase(vm_total_run_queue_lengths_io_fsum_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -946,7 +954,7 @@
"disableTextWrap": false,
"editorMode": "builder",
"exemplar": false,
"expr": "rate(vm_total_run_queue_lengths_io_fsum_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__rate_interval])",
"expr": "increase(vm_total_run_queue_lengths_io_fsum_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -1974,6 +1982,7 @@
"type": "prometheus",
"uid": "${DATASOURCE}"
},
"description": "Times are counted upon job completion/failure and may contain IO or network wait times.",
"fieldConfig": {
"defaults": {
"color": {
@ -2356,7 +2365,7 @@
"type": "prometheus",
"uid": "${DATASOURCE}"
},
"description": "Jobs intentionally held back until a later start data",
"description": "Jobs intentionally held back until a later start date. This also (but not only) includes retries of previously failed jobs since there's a cooldown between re-attempts.",
"fieldConfig": {
"defaults": {
"color": {
@ -2681,7 +2690,7 @@
},
"disableTextWrap": false,
"editorMode": "builder",
"expr": "rate(vm_memory_total_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"expr": "increase(vm_memory_total_psum{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -2698,7 +2707,7 @@
},
"disableTextWrap": false,
"editorMode": "builder",
"expr": "rate(vm_memory_total_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"expr": "increase(vm_memory_total_pcount{instance=\"${INSTANCE}\", job=\"${SCRAPE_JOB}\"}[$__interval])",
"fullMetaSearch": false,
"hide": true,
"includeNullMetadata": true,
@ -3598,6 +3607,6 @@
"timezone": "utc",
"title": "Akkoma Dashboard",
"uid": "edzowz85niznkc",
"version": 29,
"version": 54,
"weekStart": ""
}

View file

@ -33,7 +33,7 @@ defmodule Mix.Tasks.Pleroma.Email do
Pleroma.User.Query.build(%{
local: true,
is_active: true,
deactivated: false,
is_confirmed: false,
invisible: false
})

View file

@ -162,8 +162,9 @@ defmodule Mix.Tasks.Pleroma.Uploads do
Map.put(link, "href", rewrite_url(id, href, from_url, to_url))
end
defp rewrite_url_object(id, %{"type" => "Document", "url" => urls} = object, from_url, to_url) do
# Document will contain url field, which will be an array of links
defp rewrite_url_object(id, %{"type" => type, "url" => urls} = object, from_url, to_url)
when type in ["Document", "Image"] do
# Document and Image contain url field, which will always be an array of links
Map.put(
object,
"url",

View file

@ -262,7 +262,7 @@ defmodule Mix.Tasks.Pleroma.User do
Pleroma.User.Query.build(%{
external: true,
is_active: true
deactivated: false
})
|> refetch_public_keys()
end
@ -408,7 +408,7 @@ defmodule Mix.Tasks.Pleroma.User do
Pleroma.User.Query.build(%{
local: true,
is_active: true,
deactivated: false,
is_moderator: false,
is_admin: false,
invisible: false
@ -426,7 +426,7 @@ defmodule Mix.Tasks.Pleroma.User do
Pleroma.User.Query.build(%{
local: true,
is_active: true,
deactivated: false,
is_moderator: false,
is_admin: false,
invisible: false

View file

@ -1,8 +1,15 @@
defmodule Pleroma.Akkoma.Translator do
@callback translate(String.t(), String.t() | nil, String.t()) ::
{:ok, String.t(), String.t()} | {:error, any()}
@callback languages() ::
{:ok, [%{name: String.t(), code: String.t()}],
[%{name: String.t(), code: String.t()}]}
| {:error, any()}
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
def languages do
module = Pleroma.Config.get([:translator, :module])
@cachex.fetch!(:translations_cache, "languages:#{module}", fn _ ->
with {:ok, source_languages, dest_languages} <- module.languages() do
{:commit, {:ok, source_languages, dest_languages}}
else
{:error, err} -> {:ignore, {:error, err}}
end
end)
end
end

View file

@ -1,5 +1,5 @@
defmodule Pleroma.Akkoma.Translators.ArgosTranslate do
@behaviour Pleroma.Akkoma.Translator
@behaviour Pleroma.Akkoma.Translator.Provider
alias Pleroma.Config
@ -23,7 +23,7 @@ defmodule Pleroma.Akkoma.Translators.ArgosTranslate do
end
end
@impl Pleroma.Akkoma.Translator
@impl Pleroma.Akkoma.Translator.Provider
def languages do
with {response, 0} <- safe_languages() do
langs =
@ -83,7 +83,7 @@ defmodule Pleroma.Akkoma.Translators.ArgosTranslate do
defp htmlify_response(string, _), do: string
@impl Pleroma.Akkoma.Translator
@impl Pleroma.Akkoma.Translator.Provider
def translate(string, nil, to_language) do
# Akkoma's Pleroma-fe expects us to detect the source language automatically.
# Argos-translate doesn't have that option (yet?)
@ -106,4 +106,7 @@ defmodule Pleroma.Akkoma.Translators.ArgosTranslate do
{response, _} -> {:error, "ArgosTranslate failed to translate (#{response})"}
end
end
@impl Pleroma.Akkoma.Translator.Provider
def name, do: "Argos Translate"
end

View file

@ -1,5 +1,5 @@
defmodule Pleroma.Akkoma.Translators.DeepL do
@behaviour Pleroma.Akkoma.Translator
@behaviour Pleroma.Akkoma.Translator.Provider
alias Pleroma.HTTP
alias Pleroma.Config
@ -21,7 +21,7 @@ defmodule Pleroma.Akkoma.Translators.DeepL do
Config.get([:deepl, :tier])
end
@impl Pleroma.Akkoma.Translator
@impl Pleroma.Akkoma.Translator.Provider
def languages do
with {:ok, %{status: 200} = source_response} <- do_languages("source"),
{:ok, %{status: 200} = dest_response} <- do_languages("target"),
@ -48,7 +48,7 @@ defmodule Pleroma.Akkoma.Translators.DeepL do
end
end
@impl Pleroma.Akkoma.Translator
@impl Pleroma.Akkoma.Translator.Provider
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <-
do_request(api_key(), tier(), string, from_language, to_language),
@ -97,4 +97,7 @@ defmodule Pleroma.Akkoma.Translators.DeepL do
]
)
end
@impl Pleroma.Akkoma.Translator.Provider
def name, do: "DeepL"
end

View file

@ -1,5 +1,5 @@
defmodule Pleroma.Akkoma.Translators.LibreTranslate do
@behaviour Pleroma.Akkoma.Translator
@behaviour Pleroma.Akkoma.Translator.Provider
alias Pleroma.Config
alias Pleroma.HTTP
@ -13,7 +13,7 @@ defmodule Pleroma.Akkoma.Translators.LibreTranslate do
Config.get([:libre_translate, :url])
end
@impl Pleroma.Akkoma.Translator
@impl Pleroma.Akkoma.Translator.Provider
def languages do
with {:ok, %{status: 200} = response} <- do_languages(),
{:ok, body} <- Jason.decode(response.body) do
@ -30,7 +30,7 @@ defmodule Pleroma.Akkoma.Translators.LibreTranslate do
end
end
@impl Pleroma.Akkoma.Translator
@impl Pleroma.Akkoma.Translator.Provider
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <- do_request(string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
@ -79,4 +79,7 @@ defmodule Pleroma.Akkoma.Translators.LibreTranslate do
HTTP.get(to_string(url))
end
@impl Pleroma.Akkoma.Translator.Provider
def name, do: "LibreTranslate"
end

View file

@ -0,0 +1,9 @@
defmodule Pleroma.Akkoma.Translator.Provider do
@callback translate(String.t(), String.t() | nil, String.t()) ::
{:ok, String.t(), String.t()} | {:error, any()}
@callback languages() ::
{:ok, [%{name: String.t(), code: String.t()}],
[%{name: String.t(), code: String.t()}]}
| {:error, any()}
@callback name() :: String.t()
end

View file

@ -53,13 +53,15 @@ defmodule Pleroma.Bookmark do
end
@spec destroy(FlakeId.Ecto.CompatType.t(), FlakeId.Ecto.CompatType.t()) ::
{:ok, Bookmark.t()} | {:error, Changeset.t()}
:ok | {:error, any()}
def destroy(user_id, activity_id) do
from(b in Bookmark,
where: b.user_id == ^user_id,
where: b.activity_id == ^activity_id
)
|> Repo.one()
|> Repo.delete()
{cnt, _} =
from(b in Bookmark,
where: b.user_id == ^user_id,
where: b.activity_id == ^activity_id
)
|> Repo.delete_all()
if cnt >= 1, do: :ok, else: {:error, :not_found}
end
end

View file

@ -43,6 +43,22 @@ defmodule Pleroma.Config.DeprecationWarnings do
end
end
def check_truncated_nodeinfo_in_accounts do
if !Config.get!([:instance, :filter_embedded_nodeinfo]) do
Logger.warning("""
!!!BUG WORKAROUND DETECTED!!!
Your config is explicitly disabling filtering of nodeinfo data embedded in other Masto API responses
config :pleroma, :instance, filter_embedded_nodeinfo: false
This setting will soon be removed. Any usage of it merely serves as a temporary workaround.
Make sure to file a bug telling us which problems you encountered and circumvented by setting this!
https://akkoma.dev/AkkomaGang/akkoma/issues
We can't fix bugs we don't know about.
""")
end
end
def check_exiftool_filter do
filters = Config.get([Pleroma.Upload]) |> Keyword.get(:filters, [])

View file

@ -15,7 +15,6 @@ defmodule Pleroma.Conversation do
# This is the context ap id.
field(:ap_id, :string)
has_many(:participations, Participation)
has_many(:users, through: [:participations, :user])
timestamps()
end
@ -45,7 +44,11 @@ defmodule Pleroma.Conversation do
participation = Repo.preload(participation, :recipients)
if Enum.empty?(participation.recipients) do
recipients = User.get_all_by_ap_id(activity.recipients)
recipients =
[activity.actor | activity.recipients]
|> Enum.uniq()
|> User.get_all_by_ap_id()
RecipientShip.create(recipients, participation)
end
end
@ -64,15 +67,16 @@ defmodule Pleroma.Conversation do
ap_id when is_binary(ap_id) and byte_size(ap_id) > 0 <- object.data["context"],
{:ok, conversation} <- create_for_ap_id(ap_id) do
users = User.get_users_from_set(activity.recipients, local_only: false)
local_users = Enum.filter(users, & &1.local)
participations =
Enum.map(users, fn user ->
Enum.map(local_users, fn user ->
invisible_conversation = Enum.any?(users, &User.blocks?(user, &1))
opts = Keyword.put(opts, :invisible_conversation, invisible_conversation)
{:ok, participation} =
Participation.create_for_user_and_conversation(user, conversation, opts)
Participation.create_or_bump(user, conversation, activity.id, opts)
maybe_create_recipientships(participation, activity)
participation

View file

@ -12,9 +12,12 @@ defmodule Pleroma.Conversation.Participation do
import Ecto.Changeset
import Ecto.Query
@type t() :: %__MODULE__{}
schema "conversation_participations" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:conversation, Conversation)
field(:last_bump, FlakeId.Ecto.CompatType)
field(:read, :boolean, default: false)
field(:last_activity_id, FlakeId.Ecto.CompatType, virtual: true)
@ -24,24 +27,26 @@ defmodule Pleroma.Conversation.Participation do
timestamps()
end
def creation_cng(struct, params) do
defp creation_cng(struct, params) do
struct
|> cast(params, [:user_id, :conversation_id, :read])
|> validate_required([:user_id, :conversation_id])
|> cast(params, [:user_id, :conversation_id, :last_bump, :read])
|> validate_required([:user_id, :conversation_id, :last_bump])
end
def create_for_user_and_conversation(user, conversation, opts \\ []) do
def create_or_bump(user, conversation, status_id, opts \\ []) do
read = !!opts[:read]
invisible_conversation = !!opts[:invisible_conversation]
update_on_conflict =
if(invisible_conversation, do: [], else: [read: read])
|> Keyword.put(:updated_at, NaiveDateTime.utc_now())
|> Keyword.put(:last_bump, status_id)
%__MODULE__{}
|> creation_cng(%{
user_id: user.id,
conversation_id: conversation.id,
last_bump: status_id,
read: invisible_conversation || read
})
|> Repo.insert(
@ -51,7 +56,7 @@ defmodule Pleroma.Conversation.Participation do
)
end
def read_cng(struct, params) do
defp read_cng(struct, params) do
struct
|> cast(params, [:read])
|> validate_required([:read])
@ -99,43 +104,90 @@ defmodule Pleroma.Conversation.Participation do
{:ok, user, participations}
end
# used for tests
def mark_as_unread(participation) do
participation
|> read_cng(%{read: false})
|> Repo.update()
end
def for_user(user, params \\ %{}) do
def for_user_with_pagination(user, params \\ %{}) do
from(p in __MODULE__,
where: p.user_id == ^user.id,
order_by: [desc: p.updated_at],
preload: [conversation: [:users]]
preload: [:conversation]
)
|> restrict_recipients(user, params)
|> Pleroma.Pagination.fetch_paginated(params)
|> select([p], %{id: p.last_bump, entry: p})
|> Pleroma.Pagination.fetch_paginated(Map.put(params, :pagination_field, :last_bump))
end
def restrict_recipients(query, user, %{recipients: user_ids}) do
def preload_last_activity_id_and_filter(participations) when is_list(participations) do
participations
|> Enum.map(fn p -> load_last_activity_id(p) end)
|> Enum.filter(fn p -> p.last_activity_id end)
end
defp load_last_activity_id(%__MODULE__{} = participation) do
%{
participation
| last_activity_id: last_activity_id(participation)
}
end
@spec last_activity_id(t(), User.t() | nil) :: Flake.t()
def last_activity_id(participation, user \\ nil)
def last_activity_id(
%__MODULE__{conversation: %Conversation{}} = participation,
user
) do
user =
if user && user.id == participation.user_id do
user
else
case participation.user do
%User{} -> participation.user
_ -> User.get_cached_by_id(participation.user_id)
end
end
ActivityPub.fetch_latest_direct_activity_id_for_context(
participation.conversation.ap_id,
%{
user: user,
blocking_user: user
}
)
end
def last_activity_id(%__MODULE__{} = participation, user) do
case Repo.preload(participation, :conversation) do
%{conversation: %Conversation{}} = p -> last_activity_id(p, user)
_ -> nil
end
end
defp restrict_recipients(query, user, %{recipients: user_ids}) do
user_binary_ids =
[user.id | user_ids]
|> Enum.uniq()
|> User.binary_id()
conversation_subquery =
__MODULE__
|> group_by([p], p.conversation_id)
recipient_subquery =
RecipientShip
|> group_by([r], r.participation_id)
|> having(
[p],
count(p.user_id) == ^length(user_binary_ids) and
fragment("array_agg(?) @> ?", p.user_id, ^user_binary_ids)
[r],
count(r.user_id) == ^length(user_binary_ids) and
fragment("array_agg(?) @> ?", r.user_id, ^user_binary_ids)
)
|> select([p], %{id: p.conversation_id})
|> select([r], %{pid: r.participation_id})
query
|> join(:inner, [p], c in subquery(conversation_subquery), on: p.conversation_id == c.id)
|> join(:inner, [p], r in subquery(recipient_subquery), on: p.id == r.pid)
end
def restrict_recipients(query, _, _), do: query
defp restrict_recipients(query, _, _), do: query
def for_user_and_conversation(user, conversation) do
from(p in __MODULE__,
@ -145,26 +197,6 @@ defmodule Pleroma.Conversation.Participation do
|> Repo.one()
end
def for_user_with_last_activity_id(user, params \\ %{}) do
for_user(user, params)
|> Enum.map(fn participation ->
activity_id =
ActivityPub.fetch_latest_direct_activity_id_for_context(
participation.conversation.ap_id,
%{
user: user,
blocking_user: user
}
)
%{
participation
| last_activity_id: activity_id
}
end)
|> Enum.reject(&is_nil(&1.last_activity_id))
end
def get(_, _ \\ [])
def get(nil, _), do: nil
@ -213,14 +245,6 @@ defmodule Pleroma.Conversation.Participation do
|> Repo.aggregate(:count, :id)
end
def unread_conversation_count_for_user(user) do
from(p in __MODULE__,
where: p.user_id == ^user.id,
where: not p.read,
select: %{count: count(p.id)}
)
end
def delete(%__MODULE__{} = participation) do
Repo.delete(participation)
end

View file

@ -93,9 +93,13 @@ defmodule Pleroma.Emoji.Pack do
@spec delete(String.t()) ::
{:ok, [binary()]} | {:error, File.posix(), binary()} | {:error, :empty_values}
def delete(name) do
with :ok <- validate_not_empty([name]),
pack_path <- path_join_name_safe(emoji_path(), name) do
File.rm_rf(pack_path)
with {_, :ok} <- {:empty, validate_not_empty([name])},
{:ok, pack} <- load_pack(name) do
Enum.each(pack.files, fn {shortcode, _} -> Emoji.delete(shortcode) end)
File.rm_rf(pack.path)
else
{:empty, error} -> error
_ -> {:ok, []}
end
end
@ -176,12 +180,7 @@ defmodule Pleroma.Emoji.Pack do
defp do_add_file(pack, shortcode, filename, file) do
with :ok <- save_file(file, pack, filename),
pack <- put_emoji(pack, shortcode, filename),
{:ok, pack} <- save_pack(pack) do
{shortcode, filename, tags(pack)}
|> Emoji.build()
|> Emoji.add_or_update()
{:ok, pack} <- put_emoji(pack, shortcode, filename) do
{:ok, pack}
end
end
@ -207,14 +206,8 @@ defmodule Pleroma.Emoji.Pack do
{:ok, updated_pack} <-
pack
|> delete_emoji(shortcode)
|> put_emoji(new_shortcode, new_filename)
|> save_pack() do
Emoji.delete(shortcode)
{new_shortcode, new_filename, tags(pack)}
|> Emoji.build()
|> Emoji.add_or_update()
|> put_emoji(new_shortcode, new_filename) do
if shortcode != new_shortcode, do: Emoji.delete(shortcode)
{:ok, updated_pack}
end
end
@ -528,7 +521,17 @@ defmodule Pleroma.Emoji.Pack do
defp put_emoji(pack, shortcode, filename) do
files = Map.put(pack.files, shortcode, filename)
%{pack | files: files, files_count: length(Map.keys(files))}
pack = %{pack | files: files, files_count: length(Map.keys(files))}
url_path = path_join_name_safe("/emoji/", pack.name) |> path_join_safe(filename)
with {:ok, pack} <- save_pack(pack) do
{shortcode, url_path, tags(pack)}
|> Emoji.build()
|> Emoji.add_or_update()
{:ok, pack}
end
end
defp delete_emoji(pack, shortcode) do

View file

@ -193,6 +193,12 @@ defmodule Pleroma.Filter do
end
end
defp escape_for_regex(plain_phrase) do
# Escape all active characters:
# .^$*+?()[{\|
Regex.replace(~r/[.^$*+?()\[{\\|]/, plain_phrase, fn m -> "\\" <> m end)
end
@spec compose_regex(User.t() | [t()], format()) :: String.t() | Regex.t() | nil
def compose_regex(user_or_filters, format \\ :postgres)
@ -207,7 +213,7 @@ defmodule Pleroma.Filter do
def compose_regex([_ | _] = filters, format) do
phrases =
filters
|> Enum.map(& &1.phrase)
|> Enum.map(&escape_for_regex(&1.phrase))
|> Enum.join("|")
case format do

View file

@ -61,12 +61,7 @@ defmodule Pleroma.HTTP do
options = options |> Keyword.delete(:params)
headers = maybe_add_user_agent(headers)
client =
Tesla.client([
Tesla.Middleware.FollowRedirects,
Pleroma.HTTP.Middleware.HTTPSignature,
Tesla.Middleware.Telemetry
])
client = build_client(method)
Logger.debug("Outbound: #{method} #{url}")
@ -84,6 +79,37 @@ defmodule Pleroma.HTTP do
{:error, :fetch_error}
end
defp build_client(method) do
# Orders of middlewares matters!
# We start construction with the middlewares _last_ to run
# on outgoing requests (and first on incoming responses).
# This allows using more efficient list prepending.
middlewares = [Tesla.Middleware.Telemetry]
# XXX: just like the user-agent header below, our current mocks can't handle extra headers
# and would break if we used the decompression middleware during tests.
# The :test condition can and should be removed once mocks are fixed.
#
# HEAD responses won't contain a body to compress anyway and we sometimes use
# HEAD requests to determine whether a remote resource is within size limits before fetching it.
# If the server would send a compressed response however, Content-Length will be the size of
# the _compressed_ response body skewing results.
middlewares =
if method != :head and @mix_env != :test do
[Tesla.Middleware.DecompressResponse | middlewares]
else
middlewares
end
middlewares = [
Tesla.Middleware.FollowRedirects,
Pleroma.HTTP.Middleware.HTTPSignature | middlewares
]
Tesla.client(middlewares)
end
# XXX: our test mocks are (too) strict about headers and cannot handle user-agent atm
if @mix_env == :test do
defp maybe_add_user_agent(headers) do
with true <- Pleroma.Config.get([:http, :send_user_agent]) do

View file

@ -29,13 +29,11 @@ defmodule Pleroma.HTTP.AdapterHelper do
conn_max_idle_time: Config.get!([:http, :receive_timeout]),
protocols: Config.get!([:http, :protocols]),
conn_opts: [
# Do NOT add cacerts here as this will cause issues for plain HTTP connections!
# (when we upgrade our deps to Mint >= 1.6.0 we can also explicitly enable "inet4: true")
transport_opts: [inet6: true],
# up to at least version 0.20.0, Finch leaves server_push enabled by default for HTTP2,
# but will actually raise an exception when receiving such a response. Tell servers we don't want it.
# see: https://github.com/sneako/finch/issues/325
client_settings: [enable_push: false]
transport_opts: [
inet6: true,
inet4: true,
cacerts: :public_key.cacerts_get()
]
]
]
}

View file

@ -78,7 +78,7 @@ defmodule Pleroma.Marker do
defp get_marker(user, timeline) do
case Repo.find_resource(get_query(user, timeline)) do
{:ok, marker} -> %__MODULE__{marker | user: user}
{:ok, %__MODULE__{} = marker} -> %__MODULE__{marker | user: user}
_ -> %__MODULE__{timeline: timeline, user_id: user.id}
end
end

View file

@ -54,6 +54,7 @@ defmodule Pleroma.MFA do
end
@doc false
@spec fetch_settings(User.t()) :: Settings.t()
def fetch_settings(%User{} = user) do
user.multi_factor_authentication_settings || %Settings{}
end

View file

@ -8,7 +8,8 @@ defmodule Pleroma.MFA.Changeset do
alias Pleroma.User
def disable(%Ecto.Changeset{} = changeset, force \\ false) do
settings =
%Settings{} =
settings =
changeset
|> Ecto.Changeset.apply_changes()
|> MFA.fetch_settings()
@ -22,18 +23,18 @@ defmodule Pleroma.MFA.Changeset do
def disable_totp(%User{multi_factor_authentication_settings: settings} = user) do
user
|> put_change(%Settings{settings | totp: %Settings.TOTP{}})
|> put_change(%{settings | totp: %Settings.TOTP{}})
end
def confirm_totp(%User{multi_factor_authentication_settings: settings} = user) do
totp_settings = %Settings.TOTP{settings.totp | confirmed: true}
totp_settings = %{settings.totp | confirmed: true}
user
|> put_change(%Settings{settings | totp: totp_settings, enabled: true})
|> put_change(%{settings | totp: totp_settings, enabled: true})
end
def setup_totp(%User{} = user, attrs) do
mfa_settings = MFA.fetch_settings(user)
%Settings{} = mfa_settings = MFA.fetch_settings(user)
totp_settings =
%Settings.TOTP{}
@ -45,7 +46,7 @@ defmodule Pleroma.MFA.Changeset do
def cast_backup_codes(%User{} = user, codes) do
user
|> put_change(%Settings{
|> put_change(%{
user.multi_factor_authentication_settings
| backup_codes: codes
})

View file

@ -15,7 +15,6 @@ defmodule Pleroma.Notification do
alias Pleroma.Repo
alias Pleroma.ThreadMute
alias Pleroma.User
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.Push
alias Pleroma.Web.Streamer
@ -388,40 +387,46 @@ defmodule Pleroma.Notification do
end
end
@spec create_notifications(Activity.t(), keyword()) :: {:ok, [Notification.t()] | []}
def create_notifications(activity, options \\ [])
@doc """
Creates notifications for the given Activity in the database, but does NOT send them to streams or webpush.
On success returns an :ok triple with non-muted notifications in the second position and
muted (i.e. likely not supposed to be pro-actively sent) notifications in the third position.
"""
@spec create_notifications(Activity.t()) ::
{:ok, [Notification.t()] | [], [Notification.t()] | []}
def create_notifications(activity)
def create_notifications(%Activity{data: %{"to" => _, "type" => "Create"}} = activity, options) do
def create_notifications(%Activity{data: %{"to" => _, "type" => "Create"}} = activity) do
object = Object.normalize(activity, fetch: false)
if object && object.data["type"] == "Answer" do
{:ok, []}
{:ok, [], []}
else
do_create_notifications(activity, options)
do_create_notifications(activity)
end
end
def create_notifications(%Activity{data: %{"type" => type}} = activity, options)
def create_notifications(%Activity{data: %{"type" => type}} = activity)
when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag", "Update"] do
do_create_notifications(activity, options)
do_create_notifications(activity)
end
def create_notifications(_, _), do: {:ok, []}
defp do_create_notifications(%Activity{} = activity, options) do
do_send = Keyword.get(options, :do_send, true)
def create_notifications(_), do: {:ok, [], []}
defp do_create_notifications(%Activity{} = activity) do
{enabled_receivers, disabled_receivers} = get_notified_from_activity(activity)
potential_receivers = enabled_receivers ++ disabled_receivers
notifications =
Enum.map(potential_receivers, fn user ->
do_send = do_send && user in enabled_receivers
create_notification(activity, user, do_send: do_send)
end)
notifications_active =
enabled_receivers
|> Enum.map(&create_notification(activity, &1))
|> Enum.reject(&is_nil/1)
{:ok, notifications}
notifications_silent =
disabled_receivers
|> Enum.map(&create_notification(activity, &1, seen: true))
|> Enum.reject(&is_nil/1)
{:ok, notifications_active, notifications_silent}
end
defp type_from_activity(%{data: %{"type" => type}} = activity) do
@ -467,9 +472,9 @@ defmodule Pleroma.Notification do
defp type_from_activity_object(%{data: %{"type" => "Create"}}), do: "mention"
# TODO move to sql, too.
def create_notification(%Activity{} = activity, %User{} = user, opts \\ []) do
do_send = Keyword.get(opts, :do_send, true)
defp create_notification(%Activity{} = activity, %User{} = user, opts \\ []) do
type = Keyword.get(opts, :type, type_from_activity(activity))
seen = Keyword.get(opts, :seen, false)
unless skip?(activity, user, opts) do
{:ok, %{notification: notification}} =
@ -477,17 +482,12 @@ defmodule Pleroma.Notification do
|> Multi.insert(:notification, %Notification{
user_id: user.id,
activity: activity,
seen: mark_as_read?(activity, user),
seen: seen,
type: type
})
|> Marker.multi_set_last_read_id(user, "notifications")
|> Repo.transaction()
if do_send do
Streamer.stream(["user", "user:notification"], notification)
Push.send(notification)
end
notification
end
end
@ -678,6 +678,12 @@ defmodule Pleroma.Notification do
end
end
def skip?(:internal, %Activity{} = activity, _user, _opts) do
actor = activity.data["actor"]
user = User.get_cached_by_ap_id(actor)
User.is_internal_user?(user)
end
def skip?(:invisible, %Activity{} = activity, _user, _opts) do
actor = activity.data["actor"]
user = User.get_cached_by_ap_id(actor)
@ -740,11 +746,6 @@ defmodule Pleroma.Notification do
def skip?(_type, _activity, _user, _opts), do: false
def mark_as_read?(activity, target_user) do
user = Activity.user_actor(activity)
User.mutes_user?(target_user, user) || CommonAPI.thread_muted?(target_user, activity)
end
def for_user_and_activity(user, activity) do
from(n in __MODULE__,
where: n.user_id == ^user.id,
@ -764,4 +765,12 @@ defmodule Pleroma.Notification do
)
|> Repo.update_all(set: [seen: true])
end
@spec send(list(Notification.t())) :: :ok
def send(notifications) do
Enum.each(notifications, fn notification ->
Streamer.stream(["user", "user:notification"], notification)
Push.send(notification)
end)
end
end

View file

@ -10,6 +10,7 @@ defmodule Pleroma.Object.Fetcher do
alias Pleroma.Object.Containment
alias Pleroma.Repo
alias Pleroma.Web.ActivityPub.InternalFetchActor
alias Pleroma.Web.ActivityPub.MRF
alias Pleroma.Web.ActivityPub.ObjectValidator
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.Federator
@ -138,10 +139,7 @@ defmodule Pleroma.Object.Fetcher do
{:valid_uri_scheme, true} <-
{:valid_uri_scheme, uri.scheme == "http" or uri.scheme == "https"},
# If we have instance restrictions, apply them here to prevent fetching from unwanted instances
{:mrf_reject_check, {:ok, nil}} <-
{:mrf_reject_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_reject(uri)},
{:mrf_accept_check, {:ok, _}} <-
{:mrf_accept_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_accept(uri)},
{_, {:ok, _}} <- {:mrf_check, maybe_restrict_uri_mrf(uri)},
{_, nil} <- {:fetch_object, Object.get_cached_by_ap_id(id)},
{_, true} <- {:allowed_depth, Federator.allowed_thread_distance?(options[:depth])},
{_, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)},
@ -161,11 +159,7 @@ defmodule Pleroma.Object.Fetcher do
log_fetch_error(id, e)
{:error, :invalid_uri_scheme}
{:mrf_reject_check, _} = e ->
log_fetch_error(id, e)
{:reject, :mrf}
{:mrf_accept_check, _} = e ->
{:mrf_check, _} = e ->
log_fetch_error(id, e)
{:reject, :mrf}
@ -213,6 +207,17 @@ defmodule Pleroma.Object.Fetcher do
Logger.error("Object rejected while fetching #{id} #{inspect(error)}")
end
defp maybe_restrict_uri_mrf(uri) do
with {:enabled, true} <- {:enabled, MRF.SimplePolicy in MRF.get_policies()},
{:ok, _} <- MRF.SimplePolicy.check_reject(uri),
{:ok, _} <- MRF.SimplePolicy.check_accept(uri) do
{:ok, nil}
else
{:enabled, false} -> {:ok, nil}
{:reject, reason} -> {:reject, reason}
end
end
defp prepare_activity_params(data) do
%{
"type" => "Create",
@ -298,10 +303,7 @@ defmodule Pleroma.Object.Fetcher do
with {:valid_uri_scheme, true} <- {:valid_uri_scheme, String.starts_with?(id, "http")},
%URI{} = uri <- URI.parse(id),
{:mrf_reject_check, {:ok, nil}} <-
{:mrf_reject_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_reject(uri)},
{:mrf_accept_check, {:ok, _}} <-
{:mrf_accept_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_accept(uri)},
{_, {:ok, _}} <- {:mrf_check, maybe_restrict_uri_mrf(uri)},
{:local_fetch, :ok} <- {:local_fetch, Containment.contain_local_fetch(id)},
{:ok, final_id, body} <- get_object(id),
# a canonical ID shouldn't be a redirect


@@ -97,6 +97,9 @@ defmodule Pleroma.Pagination do
defp do_unwrap([], acc), do: Enum.reverse(acc)
defp cast_params(params) do
# Ecto doesn't support atom types
pfield = params[:pagination_field] || :id
param_types = %{
min_id: params[:id_type] || :string,
since_id: params[:id_type] || :string,
@@ -108,54 +111,54 @@
order_asc: :boolean
}
params = Map.delete(params, :id_type)
params = Map.drop(params, [:id_type, :pagination_field])
changeset = cast({%{}, param_types}, params, Map.keys(param_types))
changeset.changes
Map.put(changeset.changes, :pagination_field, pfield)
end
defp order_statement(query, table_binding, :asc) do
defp order_statement(query, table_binding, :asc, %{pagination_field: fname}) do
order_by(
query,
[{u, table_position(query, table_binding)}],
fragment("? asc nulls last", u.id)
fragment("? asc nulls last", field(u, ^fname))
)
end
defp order_statement(query, table_binding, :desc) do
defp order_statement(query, table_binding, :desc, %{pagination_field: fname}) do
order_by(
query,
[{u, table_position(query, table_binding)}],
fragment("? desc nulls last", u.id)
fragment("? desc nulls last", field(u, ^fname))
)
end
defp restrict(query, :min_id, %{min_id: min_id}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], q.id > ^min_id)
defp restrict(query, :min_id, %{min_id: min_id, pagination_field: fname}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], field(q, ^fname) > ^min_id)
end
defp restrict(query, :since_id, %{since_id: since_id}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], q.id > ^since_id)
defp restrict(query, :since_id, %{since_id: since_id, pagination_field: fname}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], field(q, ^fname) > ^since_id)
end
defp restrict(query, :max_id, %{max_id: max_id}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], q.id < ^max_id)
defp restrict(query, :max_id, %{max_id: max_id, pagination_field: fname}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], field(q, ^fname) < ^max_id)
end
defp restrict(query, :order, %{skip_order: true}, _), do: query
defp restrict(%{order_bys: [_ | _]} = query, :order, %{skip_extra_order: true}, _), do: query
defp restrict(query, :order, %{min_id: _}, table_binding) do
order_statement(query, table_binding, :asc)
defp restrict(query, :order, %{min_id: _} = options, table_binding) do
order_statement(query, table_binding, :asc, options)
end
defp restrict(query, :order, %{max_id: _}, table_binding) do
order_statement(query, table_binding, :desc)
defp restrict(query, :order, %{max_id: _} = options, table_binding) do
order_statement(query, table_binding, :desc, options)
end
defp restrict(query, :order, options, table_binding) do
dir = if options[:order_asc], do: :asc, else: :desc
order_statement(query, table_binding, dir)
order_statement(query, table_binding, dir, options)
end
defp restrict(query, :offset, %{offset: offset}, _table_binding) do
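The pagination change above threads a configurable `pagination_field` through `cast_params/1`, `order_statement/4`, and the `restrict/4` clauses instead of hardcoding the `id` column. A language-agnostic sketch of the same keyset-cursor semantics (Python over an in-memory list; all names are illustrative, not part of the Akkoma API):

```python
# Keyset (cursor) pagination over a configurable field, mirroring the
# min_id / max_id / since_id restrictions in the Ecto queries above.
def paginate(rows, field="id", min_id=None, max_id=None, since_id=None, limit=20):
    """Return one page of `rows` (list of dicts) keyed on `field`."""
    selected = rows
    if min_id is not None:  # items strictly newer than the cursor ...
        selected = [r for r in selected if r[field] > min_id]
        selected.sort(key=lambda r: r[field])  # ... ordered ascending
    else:
        if since_id is not None:
            selected = [r for r in selected if r[field] > since_id]
        if max_id is not None:
            selected = [r for r in selected if r[field] < max_id]
        selected.sort(key=lambda r: r[field], reverse=True)  # descending
    return selected[:limit]

rows = [{"id": i} for i in range(1, 8)]
print(paginate(rows, max_id=5, limit=3))  # three newest entries below the cursor
print(paginate(rows, min_id=2, limit=3))  # three oldest entries above the cursor
```

As in the diff, `min_id` pages forward in ascending order while `max_id`/`since_id` page backward in descending order.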


@@ -82,7 +82,7 @@ defmodule Pleroma.Upload do
def store(upload, opts \\ []) do
opts = get_opts(opts)
with {:ok, upload} <- prepare_upload(upload, opts),
with {:ok, %__MODULE__{} = upload} <- prepare_upload(upload, opts),
upload = %__MODULE__{upload | path: upload.path || "#{upload.id}/#{upload.name}"},
{:ok, upload} <- Pleroma.Upload.Filter.filter(opts.filters, upload),
description = Map.get(upload, :description) || "",


@@ -31,6 +31,7 @@ defmodule Pleroma.User do
alias Pleroma.Registration
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.User.Fetcher
alias Pleroma.UserRelationship
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Builder
@@ -834,7 +835,7 @@ defmodule Pleroma.User do
candidates = Config.get([:instance, :autofollowed_nicknames])
autofollowed_users =
User.Query.build(%{nickname: candidates, local: true, is_active: true})
User.Query.build(%{nickname: candidates, local: true, deactivated: false})
|> Repo.all()
follow_all(user, autofollowed_users)
@@ -1103,16 +1104,6 @@
|> Repo.all()
end
# This is mostly an SPC migration fix. This guesses the user nickname by taking the last part
# of the ap_id and the domain and tries to get that user
def get_by_guessed_nickname(ap_id) do
domain = URI.parse(ap_id).host
name = List.last(String.split(ap_id, "/"))
nickname = "#{name}@#{domain}"
get_cached_by_nickname(nickname)
end
@spec set_cache(
{:error, any}
| {:ok, User.t()}
@@ -1211,14 +1202,18 @@
end
def get_cached_by_nickname(nickname) do
key = "nickname:#{nickname}"
if String.valid?(nickname) do
key = "nickname:#{nickname}"
@cachex.fetch!(:user_cache, key, fn _ ->
case get_or_fetch_by_nickname(nickname) do
{:ok, user} -> {:commit, user}
{:error, _error} -> {:ignore, nil}
end
end)
@cachex.fetch!(:user_cache, key, fn _ ->
case get_or_fetch_by_nickname(nickname) do
{:ok, user} -> {:commit, user}
{:error, _error} -> {:ignore, nil}
end
end)
else
nil
end
end
def get_cached_by_nickname_or_id(nickname_or_id, opts \\ []) do
@@ -1241,10 +1236,14 @@
@spec get_by_nickname(String.t()) :: User.t() | nil
def get_by_nickname(nickname) do
Repo.get_by(User, nickname: nickname) ||
if Regex.match?(~r(@#{Pleroma.Web.Endpoint.host()})i, nickname) do
Repo.get_by(User, nickname: local_nickname(nickname))
end
if String.valid?(nickname) do
Repo.get_by(User, nickname: nickname) ||
if Regex.match?(~r(@#{Pleroma.Web.Endpoint.host()})i, nickname) do
Repo.get_by(User, nickname: local_nickname(nickname))
end
else
nil
end
end
def get_by_email(email), do: Repo.get_by(User, email: email)
@@ -1253,7 +1252,7 @@
get_by_nickname(nickname_or_email) || get_by_email(nickname_or_email)
end
def fetch_by_nickname(nickname), do: ActivityPub.make_user_from_nickname(nickname)
def fetch_by_nickname(nickname), do: Fetcher.make_user_from_nickname(nickname)
def get_or_fetch_by_nickname(nickname) do
with %User{} = user <- get_by_nickname(nickname) do
@@ -1269,72 +1268,54 @@
end
end
@spec get_followers_query(User.t(), pos_integer() | nil) :: Ecto.Query.t()
def get_followers_query(%User{} = user, nil) do
User.Query.build(%{followers: user, is_active: true})
end
def get_followers_query(%User{} = user, page) do
user
|> get_followers_query(nil)
|> User.Query.paginate(page, 20)
end
@spec get_followers_query(User.t()) :: Ecto.Query.t()
def get_followers_query(%User{} = user), do: get_followers_query(user, nil)
def get_followers_query(%User{} = user) do
User.Query.build(%{followers: user, deactivated: false})
end
@spec get_followers(User.t(), pos_integer() | nil) :: {:ok, list(User.t())}
def get_followers(%User{} = user, page \\ nil) do
@spec get_followers(User.t()) :: {:ok, list(User.t())}
def get_followers(%User{} = user) do
user
|> get_followers_query(page)
|> get_followers_query()
|> Repo.all()
end
@spec get_external_followers(User.t(), pos_integer() | nil) :: {:ok, list(User.t())}
def get_external_followers(%User{} = user, page \\ nil) do
@spec get_external_followers(User.t()) :: {:ok, list(User.t())}
def get_external_followers(%User{} = user) do
user
|> get_followers_query(page)
|> get_followers_query()
|> User.Query.build(%{external: true})
|> Repo.all()
end
def get_followers_ids(%User{} = user, page \\ nil) do
def get_followers_ids(%User{} = user) do
user
|> get_followers_query(page)
|> get_followers_query()
|> select([u], u.id)
|> Repo.all()
end
@spec get_friends_query(User.t(), pos_integer() | nil) :: Ecto.Query.t()
def get_friends_query(%User{} = user, nil) do
@spec get_friends_query(User.t()) :: Ecto.Query.t()
def get_friends_query(%User{} = user) do
User.Query.build(%{friends: user, deactivated: false})
end
def get_friends_query(%User{} = user, page) do
def get_friends(%User{} = user) do
user
|> get_friends_query(nil)
|> User.Query.paginate(page, 20)
end
@spec get_friends_query(User.t()) :: Ecto.Query.t()
def get_friends_query(%User{} = user), do: get_friends_query(user, nil)
def get_friends(%User{} = user, page \\ nil) do
user
|> get_friends_query(page)
|> get_friends_query()
|> Repo.all()
end
def get_friends_ap_ids(%User{} = user) do
user
|> get_friends_query(nil)
|> get_friends_query()
|> select([u], u.ap_id)
|> Repo.all()
end
def get_friends_ids(%User{} = user, page \\ nil) do
def get_friends_ids(%User{} = user) do
user
|> get_friends_query(page)
|> get_friends_query()
|> select([u], u.id)
|> Repo.all()
end
@@ -1402,7 +1383,7 @@
end
def fetch_follow_information(user) do
with {:ok, info} <- ActivityPub.fetch_follow_information_for_user(user) do
with {:ok, info} <- Fetcher.fetch_follow_information_for_user(user) do
user
|> follow_information_changeset(info)
|> update_and_set_cache()
@@ -1454,7 +1435,7 @@
@spec get_users_from_set([String.t()], keyword()) :: [User.t()]
def get_users_from_set(ap_ids, opts \\ []) do
local_only = Keyword.get(opts, :local_only, true)
criteria = %{ap_id: ap_ids, is_active: true}
criteria = %{ap_id: ap_ids, deactivated: false}
criteria = if local_only, do: Map.put(criteria, :local, true), else: criteria
User.Query.build(criteria)
@@ -1465,7 +1446,7 @@
def get_recipients_from_activity(%Activity{recipients: to, actor: actor}) do
to = [actor | to]
query = User.Query.build(%{recipients_from_activity: to, local: true, is_active: true})
query = User.Query.build(%{recipients_from_activity: to, local: true, deactivated: false})
query
|> Repo.all()
@@ -1977,12 +1958,16 @@
def html_filter_policy(_), do: Config.get([:markup, :scrub_policy])
def fetch_by_ap_id(ap_id), do: ActivityPub.make_user_from_ap_id(ap_id)
def fetch_by_ap_id(ap_id), do: Fetcher.make_user_from_ap_id(ap_id)
defp refetch_or_fetch_by_ap_id(%User{} = user, _), do: Fetcher.refetch_user(user)
defp refetch_or_fetch_by_ap_id(_, ap_id), do: Fetcher.make_user_from_ap_id(ap_id)
def get_or_fetch_by_ap_id(ap_id, options \\ []) do
cached_user = get_cached_by_ap_id(ap_id)
maybe_fetched_user = needs_update?(cached_user, options) && fetch_by_ap_id(ap_id)
maybe_fetched_user =
needs_update?(cached_user, options) && refetch_or_fetch_by_ap_id(cached_user, ap_id)
case {cached_user, maybe_fetched_user} do
{_, {:ok, %User{} = user}} ->
@@ -2070,7 +2055,7 @@
|> set_cache()
end
defdelegate public_key(user), to: SigningKey
defdelegate public_key(user), to: SigningKey, as: :public_key_pem
@doc "Gets or fetches a user by URI or nickname."
@spec get_or_fetch(String.t()) :: {:ok, User.t()} | {:error, String.t()}
@@ -2203,7 +2188,7 @@
@spec all_superusers() :: [User.t()]
def all_superusers do
User.Query.build(%{super_users: true, local: true, is_active: true})
User.Query.build(%{super_users: true, local: true, deactivated: false})
|> Repo.all()
end

lib/pleroma/user/fetcher.ex (new file, 443 lines)

@@ -0,0 +1,443 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# Copyright © 2026 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.User.Fetcher do
alias Akkoma.Collections
alias Pleroma.Config
alias Pleroma.Object
alias Pleroma.Object.Fetcher, as: APFetcher
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.ActivityPub.MRF
alias Pleroma.Web.ActivityPub.ObjectValidators.UserValidator
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.WebFinger
import Pleroma.Web.ActivityPub.Utils
require Logger
@spec get_actor_url(any()) :: binary() | nil
defp get_actor_url(url) when is_binary(url), do: url
defp get_actor_url(%{"href" => href}) when is_binary(href), do: href
defp get_actor_url(url) when is_list(url) do
url
|> List.first()
|> get_actor_url()
end
defp get_actor_url(_url), do: nil
defp normalize_image(%{"url" => url}) do
%{
"type" => "Image",
"url" => [%{"href" => url}]
}
end
defp normalize_image(urls) when is_list(urls), do: urls |> List.first() |> normalize_image()
defp normalize_image(_), do: nil
defp normalize_also_known_as(aka) when is_list(aka), do: aka
defp normalize_also_known_as(aka) when is_binary(aka), do: [aka]
defp normalize_also_known_as(nil), do: []
defp normalize_attachment(%{} = attachment), do: [attachment]
defp normalize_attachment(attachment) when is_list(attachment), do: attachment
defp normalize_attachment(_), do: []
defp maybe_make_public_key_object(data) do
if is_map(data["publicKey"]) && is_binary(data["publicKey"]["publicKeyPem"]) do
%{
public_key: data["publicKey"]["publicKeyPem"],
key_id: data["publicKey"]["id"]
}
else
nil
end
end
defp try_fallback_nick(%{"id" => ap_id, "preferredUsername" => name})
when is_binary(name) and is_binary(ap_id) do
with true <- name != "",
domain when domain != nil and domain != "" <- URI.parse(ap_id).host do
"#{name}@#{domain}"
else
_ -> nil
end
end
defp try_fallback_nick(_), do: nil
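`try_fallback_nick/1` derives a `name@domain` handle from the actor's `preferredUsername` and the host of its ActivityPub id, used only when no verified WebFinger handle is available. A small sketch of the same derivation (Python; illustrative, not the actual Akkoma API):

```python
from urllib.parse import urlparse

def try_fallback_nick(data):
    """Build a fallback `name@domain` handle from AP actor data.

    Mirrors the Elixir clauses above: returns None unless both a
    non-empty preferredUsername and a host in the actor id exist.
    """
    name = data.get("preferredUsername")
    ap_id = data.get("id")
    if not isinstance(name, str) or not isinstance(ap_id, str) or name == "":
        return None
    domain = urlparse(ap_id).hostname
    if not domain:
        return None
    return f"{name}@{domain}"

print(try_fallback_nick({"id": "https://example.com/users/alice",
                         "preferredUsername": "alice"}))  # alice@example.com
```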
defp object_to_user_data(data, verified_nick) do
fields =
data
|> Map.get("attachment", [])
|> normalize_attachment()
|> Enum.filter(fn
%{"type" => t} -> t == "PropertyValue"
_ -> false
end)
|> Enum.map(fn fields -> Map.take(fields, ["name", "value"]) end)
emojis =
data
|> Map.get("tag", [])
|> Enum.filter(fn
%{"type" => "Emoji"} -> true
_ -> false
end)
|> Map.new(fn %{"icon" => %{"url" => url}, "name" => name} ->
{String.trim(name, ":"), url}
end)
is_locked = data["manuallyApprovesFollowers"] || false
data = Transmogrifier.maybe_fix_user_object(data)
is_discoverable = data["discoverable"] || false
invisible = data["invisible"] || false
actor_type = data["type"] || "Person"
{featured_address, pinned_objects} =
case process_featured_collection(data["featured"]) do
{:ok, featured_address, pinned_objects} -> {featured_address, pinned_objects}
_ -> {nil, %{}}
end
# first, check that the owner is correct
signing_key =
if data["id"] !== data["publicKey"]["owner"] do
Logger.error(
"Owner of the public key is not the same as the actor - not saving the public key."
)
nil
else
maybe_make_public_key_object(data)
end
shared_inbox =
if is_map(data["endpoints"]) && is_binary(data["endpoints"]["sharedInbox"]) do
data["endpoints"]["sharedInbox"]
end
# can still be nil if no name was indicated in AP data
nickname = verified_nick || try_fallback_nick(data)
# also_known_as must be a URL
also_known_as =
data
|> Map.get("alsoKnownAs", [])
|> normalize_also_known_as()
|> Enum.filter(fn url ->
case URI.parse(url) do
%URI{scheme: "http"} -> true
%URI{scheme: "https"} -> true
_ -> false
end
end)
%{
ap_id: data["id"],
uri: get_actor_url(data["url"]),
banner: normalize_image(data["image"]),
background: normalize_image(data["backgroundUrl"]),
fields: fields,
emoji: emojis,
is_locked: is_locked,
is_discoverable: is_discoverable,
invisible: invisible,
avatar: normalize_image(data["icon"]),
name: data["name"],
follower_address: data["followers"],
following_address: data["following"],
featured_address: featured_address,
bio: data["summary"] || "",
actor_type: actor_type,
also_known_as: also_known_as,
signing_key: signing_key,
inbox: data["inbox"],
shared_inbox: shared_inbox,
pinned_objects: pinned_objects,
nickname: nickname
}
end
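Within `object_to_user_data/2`, `alsoKnownAs` is first coerced to a list and then filtered down to http(s) URLs. The combined normalization can be sketched as (Python; illustrative only):

```python
from urllib.parse import urlparse

def normalize_also_known_as(aka):
    """Coerce an actor's alsoKnownAs value to a list and keep only
    http(s) URLs, mirroring normalize_also_known_as/1 plus the scheme
    filter in object_to_user_data/2 above."""
    if isinstance(aka, str):
        values = [aka]
    elif isinstance(aka, list):
        values = aka
    else:  # nil / unexpected shapes normalize to an empty list
        values = []
    return [u for u in values
            if isinstance(u, str) and urlparse(u).scheme in ("http", "https")]
```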
defp collection_private(%{"first" => %{"type" => type}})
when type in ["CollectionPage", "OrderedCollectionPage"],
do: false
defp collection_private(%{"first" => first}) do
with {:ok, %{"type" => type}} when type in ["CollectionPage", "OrderedCollectionPage"] <-
APFetcher.fetch_and_contain_remote_object_from_id(first) do
false
else
_ -> true
end
end
defp collection_private(_data), do: true
defp counter_private(%{"totalItems" => _}), do: false
defp counter_private(_), do: true
defp normalize_counter(counter) when is_integer(counter), do: counter
defp normalize_counter(_), do: 0
defp eval_collection_counter(apid) when is_binary(apid) do
case APFetcher.fetch_and_contain_remote_object_from_id(apid) do
{:ok, data} ->
{collection_private(data), counter_private(data), normalize_counter(data["totalItems"])}
_ ->
Logger.debug("Failed to fetch follower/following collection #{apid}; assuming private")
{true, true, 0}
end
end
defp eval_collection_counter(_), do: {true, true, 0}
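`eval_collection_counter/1` derives three facts from a follower/following collection: whether the item list is hidden, whether the count is hidden, and the count itself. A sketch of that decision logic (Python; it handles only the inlined `first`-page case, whereas the Elixir helper also fetches `first` when it is a bare URL, and a failed fetch is modeled here as `None`):

```python
def eval_collection(data):
    """Classify an ActivityPub follower/following collection.

    Returns (items_private, count_private, count): items are public only
    when a (Ordered)CollectionPage is present as `first`; the count is
    public only when totalItems is present; a missing or non-integer
    totalItems normalizes to 0.
    """
    if data is None:  # fetch failed: assume fully private
        return (True, True, 0)
    first = data.get("first")
    items_private = not (
        isinstance(first, dict)
        and first.get("type") in ("CollectionPage", "OrderedCollectionPage")
    )
    count_private = "totalItems" not in data
    total = data.get("totalItems")
    count = total if isinstance(total, int) else 0
    return (items_private, count_private, count)
```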
def fetch_follow_information_for_user(user) do
{hide_follows, hide_follows_count, following_count} =
eval_collection_counter(user.following_address)
{hide_followers, hide_followers_count, follower_count} =
eval_collection_counter(user.follower_address)
{:ok,
%{
hide_follows: hide_follows,
hide_follows_count: hide_follows_count,
following_count: following_count,
hide_followers: hide_followers,
hide_followers_count: hide_followers_count,
follower_count: follower_count
}}
end
def maybe_update_follow_information(user_data) do
with {:enabled, true} <- {:enabled, Config.get([:instance, :external_user_synchronization])},
{_, true} <-
{:collections_available,
!!(user_data[:following_address] && user_data[:follower_address])},
{:ok, follow_info} <-
fetch_follow_information_for_user(user_data) do
Map.merge(user_data, follow_info)
else
{:user_type_check, false} ->
user_data
{:collections_available, false} ->
user_data
{:enabled, false} ->
user_data
e ->
Logger.error(
"Follower/Following counter update for #{user_data.ap_id} failed.\n" <> inspect(e)
)
user_data
end
end
def maybe_handle_clashing_nickname(data) do
with nickname when is_binary(nickname) <- data[:nickname],
%User{} = old_user <- User.get_by_nickname(nickname),
{_, false} <- {:ap_id_comparison, data[:ap_id] == old_user.ap_id} do
Logger.info(
"Found an old user for #{nickname}, the old ap id is #{old_user.ap_id}, new one is #{data[:ap_id]}, renaming."
)
old_user
|> User.remote_user_changeset(%{nickname: "#{old_user.id}.#{old_user.nickname}"})
|> User.update_and_set_cache()
else
{:ap_id_comparison, true} ->
Logger.info(
"Found an old user for #{data[:nickname]}, but the ap id #{data[:ap_id]} is the same as the new user. Race condition? Not changing anything."
)
_ ->
nil
end
end
def process_featured_collection(nil), do: {:ok, nil, %{}}
def process_featured_collection(""), do: {:ok, nil, %{}}
def process_featured_collection(featured_collection) do
featured_address =
case get_ap_id(featured_collection) do
id when is_binary(id) -> id
_ -> nil
end
# TODO: allow passing item/page limit as function opt and use here
case Collections.Fetcher.fetch_collection(featured_collection) do
{:ok, items} ->
now = NaiveDateTime.utc_now()
dated_obj_ids = Map.new(items, fn obj -> {get_ap_id(obj), now} end)
{:ok, featured_address, dated_obj_ids}
error ->
Logger.error(
"Could not decode featured collection at fetch #{inspect(featured_collection)}: #{inspect(error)}"
)
error =
case error do
{:error, e} -> e
e -> e
end
{:error, error}
end
end
def enqueue_pin_fetches(%{pinned_objects: pins}) do
# enqueue a task to fetch all pinned objects
Enum.each(pins, fn {ap_id, _} ->
if is_nil(Object.get_cached_by_ap_id(ap_id)) do
Pleroma.Workers.RemoteFetcherWorker.enqueue("fetch_remote", %{
"id" => ap_id,
"depth" => 1
})
end
end)
end
def enqueue_pin_fetches(_), do: nil
def validate_and_cast(data, verified_nick) do
with {:ok, data} <- MRF.filter(data),
{:valid, {:ok, _, _}} <- {:valid, UserValidator.validate(data, [])} do
{:ok, object_to_user_data(data, verified_nick)}
else
{:valid, reason} ->
{:error, {:validate, reason}}
e ->
{:error, e}
end
end
defp insert_or_update(%User{} = olduser, newdata) do
olduser
|> User.remote_user_changeset(newdata)
|> User.update_and_set_cache()
end
defp insert_or_update(nil, newdata) do
newdata
|> User.remote_user_changeset()
|> Repo.insert()
|> User.set_cache()
end
defp make_user_from_apdata_and_nick(ap_data, verified_nick, olduser \\ nil) do
with {:ok, data} <- validate_and_cast(ap_data, verified_nick) do
olduser = olduser || User.get_cached_by_ap_id(data.ap_id)
if !olduser || olduser.nickname != data.nickname do
maybe_handle_clashing_nickname(data)
end
data = maybe_update_follow_information(data)
with {:ok, newuser} <- insert_or_update(olduser, data) do
enqueue_pin_fetches(data)
{:ok, newuser}
end
end
end
defp discover_nick_from_actor_data(data) do
case WebFinger.Finger.finger_actor(data) do
{:ok, nil} ->
Logger.debug("No WebFinger found for #{data["id"]}; using fallback")
nil
{:ok, nick} ->
nick
{:error, error} ->
Logger.error(
"Invalid WebFinger for #{data["id"]}; spoof attempt or just misconfiguration? Using safe fallback: #{inspect(error)}"
)
nil
end
end
defp needs_nick_update(%{"webfinger" => "acct:" <> nick}, nick), do: false
defp needs_nick_update(%{"webfinger" => nick}, nick), do: false
defp needs_nick_update(%{"preferredUsername" => name}, oldnick) when is_binary(name) do
String.starts_with?(oldnick, name <> "@")
end
defp needs_nick_update(ap_data, oldnick) do
ap_nick = ap_data["webfinger"] || ap_data["preferredUsername"]
(!oldnick && ap_nick) || (oldnick && !ap_nick)
end
defp refreshed_nick(ap_data, olduser) do
if Config.get!([Pleroma.Web.WebFinger, :update_nickname_on_user_fetch]) ||
!olduser || needs_nick_update(ap_data, olduser.nickname) do
discover_nick_from_actor_data(ap_data)
else
olduser.nickname
end
end
defp refresh_or_fetch_from_ap_id(ap_id, olduser) do
with {:ok, data} <- APFetcher.fetch_and_contain_remote_object_from_id(ap_id),
# if AP id somehow changed on refetch, discard old info
verified_olduser <- (olduser && olduser.ap_id == data["id"] && olduser) || nil,
verified_nick <- refreshed_nick(data, verified_olduser) do
make_user_from_apdata_and_nick(data, verified_nick, verified_olduser)
else
# If this has been deleted, only log a debug and not an error
{:error, {"Object has been deleted", _, _} = e} ->
Logger.debug("User was explicitly deleted #{ap_id}, #{inspect(e)}")
{:error, :not_found}
{:reject, _reason} = e ->
{:error, e}
{:error, e} ->
{:error, e}
end
end
def make_user_from_ap_id(ap_id), do: refresh_or_fetch_from_ap_id(ap_id, nil)
def refetch_user(%User{ap_id: ap_id} = u), do: refresh_or_fetch_from_ap_id(ap_id, u)
def make_user_from_nickname(nickname) do
case WebFinger.Finger.finger_mention(nickname) do
{:ok, handle, actor_data} ->
make_user_from_apdata_and_nick(actor_data, handle)
error ->
error
end
end
def update_user_with_apdata(%{"id" => ap_id} = new_ap_data) do
with %User{} = old_user <- User.get_cached_by_ap_id(ap_id) do
new_nick = refreshed_nick(new_ap_data, old_user)
make_user_from_apdata_and_nick(new_ap_data, new_nick, old_user)
else
nil ->
Logger.warning("Cannot update unknown user #{ap_id}")
{:error, :not_found}
end
end
end


@@ -144,11 +144,6 @@ defmodule Pleroma.User.Query do
|> where([u], u.is_confirmed == true)
end
defp compose_query({:legacy_active, _}, query) do
query
|> where([u], fragment("not (?->'deactivated' @> 'true')", u.info))
end
defp compose_query({:deactivated, false}, query) do
where(query, [u], u.is_active == true)
end


@@ -110,7 +110,7 @@ defmodule Pleroma.User.SigningKey do
{:ok, :public_key.pem_encode([public_key])}
end
@spec public_key(__MODULE__) :: {:ok, binary()} | {:error, String.t()}
@spec public_key_decoded(__MODULE__) :: {:ok, binary()} | {:error, String.t()}
@doc """
Return public key data in binary format.
"""
@@ -124,8 +124,12 @@
{:ok, decoded}
end
def public_key(_), do: {:error, "key not found"}
def public_key_decoded(_), do: {:error, "key not found"}
@spec public_key_pem(__MODULE__) :: {:ok, binary()} | {:error, String.t()}
@doc """
Return public key data for user in PEM format.
"""
def public_key_pem(%User{} = user) do
case Repo.preload(user, :signing_key) do
%User{signing_key: %__MODULE__{public_key: public_key_pem}} -> {:ok, public_key_pem}


@@ -16,7 +16,7 @@ defmodule Pleroma.Utils do
def compile_dir(dir) when is_binary(dir) do
dir
|> elixir_files()
|> Kernel.ParallelCompiler.compile()
|> Kernel.ParallelCompiler.compile(return_diagnostics: true)
end
defp elixir_files(dir) when is_binary(dir) do


@@ -3,7 +3,6 @@
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.ActivityPub do
alias Akkoma.Collections
alias Pleroma.Activity
alias Pleroma.Activity.Ir.Topics
alias Pleroma.Config
@@ -16,16 +15,13 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
alias Pleroma.Notification
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.Object.Fetcher
alias Pleroma.Pagination
alias Pleroma.Repo
alias Pleroma.Upload
alias Pleroma.User
alias Pleroma.Web.ActivityPub.MRF
alias Pleroma.Web.ActivityPub.ObjectValidators.UserValidator
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Streamer
alias Pleroma.Web.WebFinger
alias Pleroma.Workers.BackgroundWorker
alias Pleroma.Workers.PollWorker
@@ -208,21 +204,19 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
end
def notify_and_stream(activity) do
Notification.create_notifications(activity)
original_activity =
case activity do
%{data: %{"type" => "Update"}, object: %{data: %{"id" => id}}} ->
Activity.get_create_by_object_ap_id_with_object(id)
_ ->
activity
end
conversation = create_or_bump_conversation(original_activity, original_activity.actor)
participations = get_participations(conversation)
# XXX: all callers of this should be moved to side_effect handling, such that
# notifications can be collected and only be sent out _after_ the transaction succeeds
{:ok, notifications, _} = Notification.create_notifications(activity)
Notification.send(notifications)
stream_out(activity)
stream_out_participations(participations)
end
defp maybe_bump_conversation(activity) do
if Visibility.is_direct?(activity) do
conversation = create_or_bump_conversation(activity, activity.actor)
participations = get_participations(conversation)
stream_out_participations(participations)
end
end
defp maybe_create_activity_expiration(
@@ -239,7 +233,7 @@
defp maybe_create_activity_expiration(activity), do: {:ok, activity}
defp create_or_bump_conversation(activity, actor) do
def create_or_bump_conversation(activity, actor) do
with {:ok, conversation} <- Conversation.create_or_bump_for(activity),
%User{} = user <- User.get_cached_by_ap_id(actor) do
Participation.mark_as_read(user, conversation)
@@ -258,7 +252,7 @@
def stream_out_participations(participations) do
participations =
participations
|> Repo.preload(:user)
|> Repo.preload([:user, :conversation])
Streamer.stream("participation", participations)
end
@@ -323,6 +317,7 @@
{:ok, _actor} <- increase_note_count_if_public(actor, activity),
{:ok, _actor} <- update_last_status_at_if_public(actor, activity),
_ <- notify_and_stream(activity),
_ <- maybe_bump_conversation(activity),
:ok <- maybe_schedule_poll_notifications(activity),
:ok <- maybe_federate(activity) do
{:ok, activity}
@@ -482,9 +477,9 @@
from(activity in Activity)
|> maybe_preload_objects(opts)
|> maybe_preload_bookmarks(opts)
|> maybe_set_thread_muted_field(opts)
|> restrict_blocked(opts)
|> restrict_blockers_visibility(opts)
|> restrict_muted_users(opts)
|> restrict_recipients(recipients, opts[:user])
|> restrict_filtered(opts)
|> where(
@@ -1096,24 +1091,35 @@
defp restrict_reblogs(query, _), do: query
defp restrict_muted(query, %{with_muted: true}), do: query
defp restrict_muted(query, opts) do
query
|> restrict_muted_users(opts)
|> restrict_muted_threads(opts)
end
defp restrict_muted(query, %{muting_user: %User{} = user} = opts) do
defp restrict_muted_users(query, %{with_muted: true}), do: query
defp restrict_muted_users(query, %{muting_user: %User{} = user} = opts) do
mutes = opts[:muted_users_ap_ids] || User.muted_users_ap_ids(user)
query =
from([activity] in query,
where: fragment("not (? = ANY(?))", activity.actor, ^mutes),
where:
fragment(
"not (?->'to' \\?| ?) or ? = ?",
activity.data,
^mutes,
activity.actor,
^user.ap_id
)
)
from([activity] in query,
where: fragment("not (? = ANY(?))", activity.actor, ^mutes),
where:
fragment(
"not (?->'to' \\?| ?) or ? = ?",
activity.data,
^mutes,
activity.actor,
^user.ap_id
)
)
end
defp restrict_muted_users(query, _), do: query
defp restrict_muted_threads(query, %{with_muted: true}), do: query
defp restrict_muted_threads(query, %{muting_user: %User{} = _user} = opts) do
unless opts[:skip_preload] do
from([thread_mute: tm] in query, where: is_nil(tm.user_id))
else
@@ -1121,7 +1127,7 @@
end
end
defp restrict_muted(query, _), do: query
defp restrict_muted_threads(query, _), do: query
defp restrict_blocked(query, %{blocking_user: %User{} = user} = opts) do
blocked_ap_ids = opts[:blocked_users_ap_ids] || User.blocked_users_ap_ids(user)
@@ -1447,7 +1453,6 @@
|> restrict_muted_reblogs(restrict_muted_reblogs_opts)
|> restrict_instance(opts)
|> restrict_announce_object_actor(opts)
|> restrict_filtered(opts)
|> maybe_restrict_deactivated_users(opts)
|> exclude_poll_votes(opts)
|> exclude_invisible_actors(opts)
@@ -1536,361 +1541,6 @@
defp sanitize_upload_file(upload), do: upload
@spec get_actor_url(any()) :: binary() | nil
defp get_actor_url(url) when is_binary(url), do: url
defp get_actor_url(%{"href" => href}) when is_binary(href), do: href
defp get_actor_url(url) when is_list(url) do
url
|> List.first()
|> get_actor_url()
end
defp get_actor_url(_url), do: nil
defp normalize_image(%{"url" => url}) do
%{
"type" => "Image",
"url" => [%{"href" => url}]
}
end
defp normalize_image(urls) when is_list(urls), do: urls |> List.first() |> normalize_image()
defp normalize_image(_), do: nil
defp normalize_also_known_as(aka) when is_list(aka), do: aka
defp normalize_also_known_as(aka) when is_binary(aka), do: [aka]
defp normalize_also_known_as(nil), do: []
defp normalize_attachment(%{} = attachment), do: [attachment]
defp normalize_attachment(attachment) when is_list(attachment), do: attachment
defp normalize_attachment(_), do: []
defp maybe_make_public_key_object(data) do
if is_map(data["publicKey"]) && is_binary(data["publicKey"]["publicKeyPem"]) do
%{
public_key: data["publicKey"]["publicKeyPem"],
key_id: data["publicKey"]["id"]
}
else
nil
end
end
defp object_to_user_data(data, additional) do
fields =
data
|> Map.get("attachment", [])
|> normalize_attachment()
|> Enum.filter(fn
%{"type" => t} -> t == "PropertyValue"
_ -> false
end)
|> Enum.map(fn fields -> Map.take(fields, ["name", "value"]) end)
emojis =
data
|> Map.get("tag", [])
|> Enum.filter(fn
%{"type" => "Emoji"} -> true
_ -> false
end)
|> Map.new(fn %{"icon" => %{"url" => url}, "name" => name} ->
{String.trim(name, ":"), url}
end)
is_locked = data["manuallyApprovesFollowers"] || false
data = Transmogrifier.maybe_fix_user_object(data)
is_discoverable = data["discoverable"] || false
invisible = data["invisible"] || false
actor_type = data["type"] || "Person"
{featured_address, pinned_objects} =
case process_featured_collection(data["featured"]) do
{:ok, featured_address, pinned_objects} -> {featured_address, pinned_objects}
_ -> {nil, %{}}
end
# first, check that the owner is correct
signing_key =
if data["id"] !== data["publicKey"]["owner"] do
Logger.error(
"Owner of the public key is not the same as the actor - not saving the public key."
)
nil
else
maybe_make_public_key_object(data)
end
shared_inbox =
if is_map(data["endpoints"]) && is_binary(data["endpoints"]["sharedInbox"]) do
data["endpoints"]["sharedInbox"]
end
# if WebFinger request was already done, we probably have acct, otherwise
# we request WebFinger here
nickname = additional[:nickname_from_acct] || generate_nickname(data)
# also_known_as must be a URL
also_known_as =
data
|> Map.get("alsoKnownAs", [])
|> normalize_also_known_as()
|> Enum.filter(fn url ->
case URI.parse(url) do
%URI{scheme: "http"} -> true
%URI{scheme: "https"} -> true
_ -> false
end
end)
%{
ap_id: data["id"],
uri: get_actor_url(data["url"]),
banner: normalize_image(data["image"]),
background: normalize_image(data["backgroundUrl"]),
fields: fields,
emoji: emojis,
is_locked: is_locked,
is_discoverable: is_discoverable,
invisible: invisible,
avatar: normalize_image(data["icon"]),
name: data["name"],
follower_address: data["followers"],
following_address: data["following"],
featured_address: featured_address,
bio: data["summary"] || "",
actor_type: actor_type,
also_known_as: also_known_as,
signing_key: signing_key,
inbox: data["inbox"],
shared_inbox: shared_inbox,
pinned_objects: pinned_objects,
nickname: nickname
}
end
defp generate_nickname(%{"preferredUsername" => username} = data) when is_binary(username) do
generated = "#{username}@#{URI.parse(data["id"]).host}"
if Config.get([WebFinger, :update_nickname_on_user_fetch]) do
case WebFinger.finger(generated) do
{:ok, %{"subject" => "acct:" <> acct}} -> acct
_ -> generated
end
else
generated
end
end
# nickname can be nil because of virtual actors
defp generate_nickname(_), do: nil
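For illustration, the fallback nickname format produced above can be sketched as a standalone function. The module name is made up for this sketch, and the optional WebFinger round-trip from the real implementation is omitted:

```elixir
defmodule NicknameSketch do
  # Mirrors the fallback branch of generate_nickname/1 above:
  # "preferredUsername@host-of-ap-id". Illustrative only.
  def generate(%{"preferredUsername" => username, "id" => id}) when is_binary(username) do
    "#{username}@#{URI.parse(id).host}"
  end

  # nickname can be nil, e.g. for virtual actors
  def generate(_), do: nil
end
```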
def fetch_follow_information_for_user(user) do
with {:ok, following_data} <-
Fetcher.fetch_and_contain_remote_object_from_id(user.following_address),
{:ok, hide_follows} <- collection_private(following_data),
{:ok, followers_data} <-
Fetcher.fetch_and_contain_remote_object_from_id(user.follower_address),
{:ok, hide_followers} <- collection_private(followers_data) do
{:ok,
%{
hide_follows: hide_follows,
follower_count: normalize_counter(followers_data["totalItems"]),
following_count: normalize_counter(following_data["totalItems"]),
hide_followers: hide_followers
}}
else
{:error, _} = e -> e
e -> {:error, e}
end
end
defp normalize_counter(counter) when is_integer(counter), do: counter
defp normalize_counter(_), do: 0
def maybe_update_follow_information(user_data) do
with {:enabled, true} <- {:enabled, Config.get([:instance, :external_user_synchronization])},
{_, true} <- {:user_type_check, user_data[:type] in ["Person", "Service"]},
{_, true} <-
{:collections_available,
!!(user_data[:following_address] && user_data[:follower_address])},
{:ok, info} <-
fetch_follow_information_for_user(user_data) do
info = Map.merge(user_data[:info] || %{}, info)
user_data
|> Map.put(:info, info)
else
{:user_type_check, false} ->
user_data
{:collections_available, false} ->
user_data
{:enabled, false} ->
user_data
e ->
Logger.error(
"Follower/Following counter update for #{user_data.ap_id} failed.\n" <> inspect(e)
)
user_data
end
end
defp collection_private(%{"first" => %{"type" => type}})
when type in ["CollectionPage", "OrderedCollectionPage"],
do: {:ok, false}
defp collection_private(%{"first" => first}) do
with {:ok, %{"type" => type}} when type in ["CollectionPage", "OrderedCollectionPage"] <-
Fetcher.fetch_and_contain_remote_object_from_id(first) do
{:ok, false}
else
{:error, _} -> {:ok, true}
end
end
defp collection_private(_data), do: {:ok, true}
def user_data_from_user_object(data, additional \\ []) do
with {:ok, data} <- MRF.filter(data) do
{:ok, object_to_user_data(data, additional)}
else
e -> {:error, e}
end
end
defp fetch_and_prepare_user_from_ap_id(ap_id, additional) do
with {:ok, data} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id),
{:valid, {:ok, _, _}} <- {:valid, UserValidator.validate(data, [])},
{:ok, data} <- user_data_from_user_object(data, additional) do
{:ok, maybe_update_follow_information(data)}
else
# If this has been deleted, only log a debug and not an error
{:error, {"Object has been deleted", _, _} = e} ->
Logger.debug("User was explicitly deleted #{ap_id}, #{inspect(e)}")
{:error, :not_found}
{:reject, _reason} = e ->
{:error, e}
{:valid, reason} ->
{:error, {:validate, reason}}
{:error, e} ->
{:error, e}
end
end
def maybe_handle_clashing_nickname(data) do
with nickname when is_binary(nickname) <- data[:nickname],
%User{} = old_user <- User.get_by_nickname(nickname),
{_, false} <- {:ap_id_comparison, data[:ap_id] == old_user.ap_id} do
Logger.info(
"Found an old user for #{nickname}, the old ap id is #{old_user.ap_id}, new one is #{data[:ap_id]}, renaming."
)
old_user
|> User.remote_user_changeset(%{nickname: "#{old_user.id}.#{old_user.nickname}"})
|> User.update_and_set_cache()
else
{:ap_id_comparison, true} ->
Logger.info(
"Found an old user for #{data[:nickname]}, but the ap id #{data[:ap_id]} is the same as the new user. Race condition? Not changing anything."
)
_ ->
nil
end
end
def process_featured_collection(nil), do: {:ok, nil, %{}}
def process_featured_collection(""), do: {:ok, nil, %{}}
def process_featured_collection(featured_collection) do
featured_address =
case get_ap_id(featured_collection) do
id when is_binary(id) -> id
_ -> nil
end
# TODO: allow passing item/page limit as function opt and use here
case Collections.Fetcher.fetch_collection(featured_collection) do
{:ok, items} ->
now = NaiveDateTime.utc_now()
dated_obj_ids = Map.new(items, fn obj -> {get_ap_id(obj), now} end)
{:ok, featured_address, dated_obj_ids}
error ->
Logger.error(
"Could not fetch featured collection #{inspect(featured_collection)}: #{inspect(error)}"
)
error =
case error do
{:error, e} -> e
e -> e
end
{:error, error}
end
end
def enqueue_pin_fetches(%{pinned_objects: pins}) do
# enqueue tasks to fetch any pinned objects not already known locally
Enum.each(pins, fn {ap_id, _} ->
if is_nil(Object.get_cached_by_ap_id(ap_id)) do
Pleroma.Workers.RemoteFetcherWorker.enqueue("fetch_remote", %{
"id" => ap_id,
"depth" => 1
})
end
end)
end
def enqueue_pin_fetches(_), do: nil
def make_user_from_ap_id(ap_id, additional \\ []) do
user = User.get_cached_by_ap_id(ap_id)
with {:ok, data} <- fetch_and_prepare_user_from_ap_id(ap_id, additional) do
user =
if data.ap_id != ap_id do
User.get_cached_by_ap_id(data.ap_id)
else
user
end
if user do
user
|> User.remote_user_changeset(data)
|> User.update_and_set_cache()
|> tap(fn _ -> enqueue_pin_fetches(data) end)
else
maybe_handle_clashing_nickname(data)
data
|> User.remote_user_changeset()
|> Repo.insert()
|> User.set_cache()
|> tap(fn _ -> enqueue_pin_fetches(data) end)
end
end
end
def make_user_from_nickname(nickname) do
with {:ok, %{"ap_id" => ap_id, "subject" => "acct:" <> acct}} when not is_nil(ap_id) <-
WebFinger.finger(nickname) do
make_user_from_ap_id(ap_id, nickname_from_acct: acct)
else
_e -> {:error, "No AP id in WebFinger"}
end
end
# filter out broken threads
defp contain_broken_threads(%Activity{} = activity, %User{} = user) do
entire_thread_visible_for_user?(activity, user)


@ -57,6 +57,17 @@ defmodule Pleroma.Web.ActivityPub.Builder do
{:ok, data, []}
end
@spec emoji_object!({String.t(), String.t()}) :: map()
def emoji_object!({name, url}) do
# TODO: we should probably send mtime instead of unix epoch time for updated
%{
"icon" => %{"url" => "#{URI.encode(url)}", "type" => "Image"},
"name" => Emoji.maybe_quote(name),
"type" => "Emoji",
"updated" => "1970-01-01T00:00:00Z"
}
end
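As a rough sketch, the emoji object built above has the following shape. The module name is hypothetical, and plain colon-quoting stands in for `Emoji.maybe_quote/1`:

```elixir
defmodule EmojiObjectSketch do
  # Mirrors Builder.emoji_object!/1 above; ":name:" quoting stands in
  # for Emoji.maybe_quote/1. Illustrative only.
  def emoji_object!({name, url}) do
    %{
      "icon" => %{"url" => URI.encode(url), "type" => "Image"},
      "name" => ":#{String.trim(name, ":")}:",
      "type" => "Emoji",
      "updated" => "1970-01-01T00:00:00Z"
    }
  end
end
```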
defp unicode_emoji_react(_object, data, emoji) do
data
|> Map.put("content", emoji)
@ -67,18 +78,7 @@ defmodule Pleroma.Web.ActivityPub.Builder do
data
|> Map.put("content", Emoji.maybe_quote(emoji))
|> Map.put("type", "EmojiReact")
|> Map.put("tag", [
%{}
|> Map.put("id", url)
|> Map.put("type", "Emoji")
|> Map.put("name", Emoji.maybe_quote(emoji))
|> Map.put(
"icon",
%{}
|> Map.put("type", "Image")
|> Map.put("url", url)
)
])
|> Map.put("tag", [emoji_object!({emoji, url})])
end
defp remote_custom_emoji_react(


@ -44,9 +44,9 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.TagValidator do
|> validate_required([:type, :href])
end
def changeset(struct, %{"type" => "Hashtag", "name" => name} = data) do
def changeset(struct, %{"type" => "Hashtag", "name" => full_name} = data) do
name =
cond do
case full_name do
"#" <> name -> name
name -> name
end


@ -25,6 +25,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.UserValidator do
when type in Pleroma.Constants.actor_types() do
with :ok <- validate_pubkey(data),
:ok <- validate_inbox(data),
:ok <- validate_nickname(data),
:ok <- contain_collection_origin(data) do
{:ok, data, meta}
else
@ -83,4 +84,18 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.UserValidator do
_, error -> error
end)
end
defp validate_nickname(%{"preferredUsername" => nick}) when is_binary(nick) do
if String.valid?(nick) do
:ok
else
{:error, "Nickname is not valid UTF-8"}
end
end
defp validate_nickname(%{"preferredUsername" => _nick}) do
{:error, "Nickname is not a valid string"}
end
defp validate_nickname(_), do: :ok
end


@ -15,12 +15,12 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.User.Fetcher, as: UserFetcher
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Builder
alias Pleroma.Web.ActivityPub.Pipeline
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Push
alias Pleroma.Web.Streamer
alias Pleroma.Workers.PollWorker
@ -121,7 +121,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
nil
end
{:ok, notifications} = Notification.create_notifications(object, do_send: false)
{:ok, notifications, _} = Notification.create_notifications(object)
meta =
meta
@ -180,7 +180,8 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
liked_object = Object.get_by_ap_id(object.data["object"])
Utils.add_like_to_object(object, liked_object)
Notification.create_notifications(object)
{:ok, notifications, _} = Notification.create_notifications(object)
meta = add_notifications(meta, notifications)
{:ok, object, meta}
end
@ -199,7 +200,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
def handle(%{data: %{"type" => "Create"}} = activity, meta) do
with {:ok, object, meta} <- handle_object_creation(meta[:object_data], activity, meta),
%User{} = user <- User.get_cached_by_ap_id(activity.data["actor"]) do
{:ok, notifications} = Notification.create_notifications(activity, do_send: false)
{:ok, notifications, _} = Notification.create_notifications(activity)
{:ok, _user} = ActivityPub.increase_note_count_if_public(user, object)
{:ok, _user} = ActivityPub.update_last_status_at_if_public(user, object)
@ -211,6 +212,18 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
reply_depth = (meta[:depth] || 0) + 1
participations =
with true <- Visibility.is_direct?(activity),
{:ok, conversation} <-
ActivityPub.create_or_bump_conversation(activity, activity.actor) do
conversation
|> Repo.preload(:participations)
|> Map.get(:participations)
|> Repo.preload(:user)
else
_ -> []
end
Pleroma.Workers.NodeInfoFetcherWorker.enqueue("process", %{
"source_url" => activity.data["actor"]
})
@ -233,6 +246,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
meta =
meta
|> add_notifications(notifications)
|> add_streamables([{"participation", participations}])
ap_streamer().stream_out(activity)
@ -255,9 +269,11 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
Utils.add_announce_to_object(object, announced_object)
if !User.is_internal_user?(user) do
Notification.create_notifications(object)
{:ok, notifications, _} = Notification.create_notifications(object)
meta = add_notifications(meta, notifications)
if !User.is_internal_user?(user) do
# XXX: this too should be added to meta and only done after transaction
ap_streamer().stream_out(object)
end
@ -280,7 +296,8 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
reacted_object = Object.get_by_ap_id(object.data["object"])
Utils.add_emoji_reaction_to_object(object, reacted_object)
Notification.create_notifications(object)
{:ok, notifications, _} = Notification.create_notifications(object)
meta = add_notifications(meta, notifications)
{:ok, object, meta}
end
@ -411,11 +428,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
changeset
|> User.update_and_set_cache()
else
{:ok, new_user_data} = ActivityPub.user_data_from_user_object(updated_object)
User.get_by_ap_id(updated_object["id"])
|> User.remote_user_changeset(new_user_data)
|> User.update_and_set_cache()
UserFetcher.update_user_with_apdata(updated_object)
end
{:ok, object, meta}
@ -557,10 +570,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
defp send_notifications(meta) do
Keyword.get(meta, :notifications, [])
|> Enum.each(fn notification ->
Streamer.stream(["user", "user:notification"], notification)
Push.send(notification)
end)
|> Notification.send()
meta
end
@ -574,13 +584,17 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
meta
end
defp add_notifications(meta, notifications) do
existing = Keyword.get(meta, :notifications, [])
meta
|> Keyword.put(:notifications, notifications ++ existing)
defp add_to_list(meta, key, entries) do
existing = Keyword.get(meta, key, [])
Keyword.put(meta, key, entries ++ existing)
end
defp add_notifications(meta, notifications),
do: add_to_list(meta, :notifications, notifications)
defp add_streamables(meta, streamables),
do: add_to_list(meta, :streamables, streamables)
@impl true
def handle_after_transaction(meta) do
meta


@ -879,8 +879,27 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
{:ok, data}
end
def prepare_outgoing(%{"type" => "Update", "object" => %{"type" => objtype} = object} = data)
when objtype in Pleroma.Constants.actor_types() do
object =
object
|> maybe_fix_user_object()
|> strip_internal_fields()
data =
data
|> Map.put("object", object)
|> strip_internal_fields()
|> Map.merge(Utils.make_json_ld_header())
|> Map.delete("bcc")
{:ok, data}
end
def prepare_outgoing(%{"type" => "Update", "object" => %{}} = data) do
raise "Requested to serve an Update for non-updateable object type: #{inspect(data)}"
err_msg = "Requested to serve an Update for non-updateable object type: #{inspect(data)}"
Logger.error(err_msg)
raise err_msg
end
def prepare_outgoing(%{"type" => "Announce", "actor" => ap_id, "object" => object_id} = data) do
@ -1009,29 +1028,19 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
def take_emoji_tags(%User{emoji: emoji}) do
emoji
|> Map.to_list()
|> Enum.map(&build_emoji_tag/1)
|> Enum.map(&Builder.emoji_object!/1)
end
# TODO: we should probably send mtime instead of unix epoch time for updated
def add_emoji_tags(%{"emoji" => emoji} = object) do
tags = object["tag"] || []
out = Enum.map(emoji, &build_emoji_tag/1)
out = Enum.map(emoji, &Builder.emoji_object!/1)
Map.put(object, "tag", tags ++ out)
end
def add_emoji_tags(object), do: object
defp build_emoji_tag({name, url}) do
%{
"icon" => %{"url" => "#{URI.encode(url)}", "type" => "Image"},
"name" => ":" <> name <> ":",
"type" => "Emoji",
"updated" => "1970-01-01T00:00:00Z"
}
end
def set_conversation(object) do
Map.put(object, "conversation", object["context"])
end


@ -101,6 +101,8 @@ defmodule Pleroma.Web.ActivityPub.Utils do
"@context" => [
"https://www.w3.org/ns/activitystreams",
"#{Endpoint.url()}/schemas/litepub-0.1.jsonld",
# FEP-2c59
"https://purl.archive.org/socialweb/webfinger",
%{
"@language" => "und",
"htmlMfm" => "https://w3id.org/fep/c16b#htmlMfm"
@ -516,7 +518,7 @@ defmodule Pleroma.Web.ActivityPub.Utils do
|> where([activity], fragment("?->>'content' = ?
AND EXISTS (
SELECT FROM jsonb_array_elements(?->'tag') elem
WHERE elem->>'id' ILIKE ?
WHERE COALESCE(elem->'icon'->>'url', '') ILIKE ?
)", activity.data, ^emoji_pattern, activity.data, ^domain_pattern))
else
query


@ -6,10 +6,49 @@
defmodule Pleroma.Web.ActivityPub.CollectionViewHelper do
alias Pleroma.Web.ActivityPub.Utils
@doc """
Renders the root of a larger (or private) OrderedCollection,
possibly with a link to, or an inlined, first page.
(For small public collections served all at once,
a version with orderedItems may be preferable for simplicity.)
"""
@spec collection_root_ordered(
String.t(),
integer() | nil | false,
String.t() | map() | nil | false
) :: map()
def collection_root_ordered(iri, total_count \\ nil, first \\ nil) do
collection_root(iri, true, total_count, first)
end
@spec collection_root(
String.t(),
boolean(),
integer() | nil | false,
String.t() | map() | nil | false
) :: map()
defp collection_root(iri, ordered, total_count, first) do
type = if ordered, do: "OrderedCollection", else: "Collection"
%{
"type" => type,
"id" => iri
}
|> put_truthy("totalItems", total_count)
|> put_truthy("first", first)
end
defp put_truthy(map, key, val) do
if val do
Map.put(map, key, val)
else
map
end
end
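To illustrate the helpers above: falsy `totalItems`/`first` values (`nil` or `false`) are dropped from the rendered root entirely, which is why callers can pass `false` for a hidden first page. A standalone mirror (module name assumed) behaves like this:

```elixir
defmodule CollectionRootSketch do
  # Standalone mirror of collection_root_ordered/3 and put_truthy/3 above;
  # nil or false values are omitted from the rendered root. Illustrative only.
  def collection_root_ordered(iri, total_count \\ nil, first \\ nil) do
    %{"type" => "OrderedCollection", "id" => iri}
    |> put_truthy("totalItems", total_count)
    |> put_truthy("first", first)
  end

  defp put_truthy(map, key, val), do: if(val, do: Map.put(map, key, val), else: map)
end
```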
def collection_page_offset(collection, iri, page, show_items \\ true, total \\ nil) do
offset = (page - 1) * 10
items = Enum.slice(collection, offset, 10)
items = Enum.map(items, fn user -> user.ap_id end)
total = total || length(collection)
map = %{


@ -56,28 +56,25 @@ defmodule Pleroma.Web.ActivityPub.ObjectView do
first_pagination = reply_collection_first_pagination(items, opts)
col_ap =
%{
"id" => object_ap_id <> "/replies",
"type" => "OrderedCollection",
"totalItems" => total
}
col_ap =
first_page =
if total > 0 do
first_page =
CollectionViewHelper.collection_page_keyset(
display_items,
first_pagination,
params[:limit],
true
)
Map.put(col_ap, "first", first_page)
CollectionViewHelper.collection_page_keyset(
display_items,
first_pagination,
params[:limit],
true
)
else
col_ap
false
end
col_ap =
CollectionViewHelper.collection_root_ordered(
object_ap_id <> "/replies",
total,
first_page
)
if params[:skip_ap_ctx] do
col_ap
else


@ -12,11 +12,11 @@ defmodule Pleroma.Web.ActivityPub.UserView do
alias Pleroma.Web.ActivityPub.ObjectView
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.WebFinger
require Ecto.Query
require Pleroma.Web.ActivityPub.Transmogrifier
import Ecto.Query
defp maybe_put(map, _, nil), do: map
defp maybe_put(map, k, v), do: Map.put(map, k, v)
@ -57,6 +57,7 @@ defmodule Pleroma.Web.ActivityPub.UserView do
|> maybe_put("following", user.following_address)
|> maybe_put("followers", user.follower_address)
|> maybe_put("preferredUsername", user.nickname)
|> maybe_put_webfinger(user)
|> Map.merge(Utils.make_json_ld_header())
end
@ -106,6 +107,7 @@ defmodule Pleroma.Web.ActivityPub.UserView do
"capabilities" => capabilities,
"alsoKnownAs" => user.also_known_as
}
|> maybe_put_webfinger(user)
|> Map.merge(maybe_make_image(&User.avatar_url/2, "icon", user))
|> Map.merge(maybe_make_image(&User.banner_url/2, "image", user))
# Yes, the key is named ...Url even though it is a whole 'Image' object
@ -134,6 +136,7 @@ defmodule Pleroma.Web.ActivityPub.UserView do
# since Mastodon requires a WebFinger address for all users, this seems like a good idea
"preferredUsername" => user.nickname
}
|> maybe_put_webfinger(user)
|> Map.merge(Utils.make_json_ld_header())
end
@ -141,17 +144,22 @@ defmodule Pleroma.Web.ActivityPub.UserView do
showing_items = (opts[:for] && opts[:for] == user) || !user.hide_follows
showing_count = showing_items || !user.hide_follows_count
query = User.get_friends_query(user)
query = from(user in query, select: [:ap_id])
following = Repo.all(query)
total =
if showing_count do
length(following)
user.following_count
else
0
end
following =
if showing_items and total > 0 do
User.get_friends_query(user)
|> Ecto.Query.select([u], u.ap_id)
|> Repo.all()
else
[]
end
CollectionViewHelper.collection_page_offset(
following,
"#{user.ap_id}/following",
@ -166,33 +174,31 @@ defmodule Pleroma.Web.ActivityPub.UserView do
showing_items = (opts[:for] && opts[:for] == user) || !user.hide_follows
showing_count = showing_items || !user.hide_follows_count
query = User.get_friends_query(user)
query = from(user in query, select: [:ap_id])
following = Repo.all(query)
total = showing_count && user.following_count
total =
if showing_count do
length(following)
following =
if showing_items && total > 0 do
User.get_friends_query(user)
|> Ecto.Query.select([u], u.ap_id)
|> Repo.all()
else
0
[]
end
%{
"id" => "#{user.ap_id}/following",
"type" => "OrderedCollection",
"totalItems" => total,
"first" =>
if showing_items do
CollectionViewHelper.collection_page_offset(
following,
"#{user.ap_id}/following",
1,
!user.hide_follows
)
else
"#{user.ap_id}/following?page=1"
end
}
first_page =
showing_items &&
CollectionViewHelper.collection_page_offset(
following,
"#{user.ap_id}/following",
1,
!user.hide_follows
)
CollectionViewHelper.collection_root_ordered(
"#{user.ap_id}/following",
total,
first_page
)
|> Map.merge(Utils.make_json_ld_header())
end
@ -200,17 +206,22 @@ defmodule Pleroma.Web.ActivityPub.UserView do
showing_items = (opts[:for] && opts[:for] == user) || !user.hide_followers
showing_count = showing_items || !user.hide_followers_count
query = User.get_followers_query(user)
query = from(user in query, select: [:ap_id])
followers = Repo.all(query)
total =
if showing_count do
length(followers)
user.follower_count
else
0
end
followers =
if showing_items and total > 0 do
User.get_followers_query(user)
|> Ecto.Query.select([u], u.ap_id)
|> Repo.all()
else
[]
end
CollectionViewHelper.collection_page_offset(
followers,
"#{user.ap_id}/followers",
@ -225,43 +236,41 @@ defmodule Pleroma.Web.ActivityPub.UserView do
showing_items = (opts[:for] && opts[:for] == user) || !user.hide_followers
showing_count = showing_items || !user.hide_followers_count
query = User.get_followers_query(user)
query = from(user in query, select: [:ap_id])
followers = Repo.all(query)
total = showing_count && user.follower_count
total =
if showing_count do
length(followers)
followers =
if showing_items and total > 0 do
User.get_followers_query(user)
|> Ecto.Query.select([u], u.ap_id)
|> Repo.all()
else
0
[]
end
%{
"id" => "#{user.ap_id}/followers",
"type" => "OrderedCollection",
"first" =>
if showing_items do
CollectionViewHelper.collection_page_offset(
followers,
"#{user.ap_id}/followers",
1,
showing_items,
total
)
else
"#{user.ap_id}/followers?page=1"
end
}
|> maybe_put_total_items(showing_count, total)
first_page =
showing_items &&
CollectionViewHelper.collection_page_offset(
followers,
"#{user.ap_id}/followers",
1,
showing_items,
total
)
CollectionViewHelper.collection_root_ordered(
"#{user.ap_id}/followers",
total,
first_page
)
|> Map.merge(Utils.make_json_ld_header())
end
def render("activity_collection.json", %{iri: iri}) do
%{
"id" => iri,
"type" => "OrderedCollection",
"first" => "#{iri}?page=true"
}
CollectionViewHelper.collection_root_ordered(
iri,
false,
"#{iri}?page=true"
)
|> Map.merge(Utils.make_json_ld_header())
end
@ -297,12 +306,14 @@ defmodule Pleroma.Web.ActivityPub.UserView do
|> Map.merge(Utils.make_json_ld_header())
end
defp maybe_put_total_items(map, false, _total), do: map
defp maybe_put_total_items(map, true, total) do
Map.put(map, "totalItems", total)
defp maybe_put_webfinger(%{"preferredUsername" => username} = data, %{local: true}) do
# FEP-2c59 entry for local users
webfinger_domain = WebFinger.Schema.domain()
Map.put(data, "webfinger", "#{username}@#{webfinger_domain}")
end
defp maybe_put_webfinger(data, _), do: data
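The FEP-2c59 field added above can be sketched standalone, with the WebFinger domain passed in explicitly instead of read from `WebFinger.Schema.domain()` (module name assumed):

```elixir
defmodule WebfingerFieldSketch do
  # Mirrors maybe_put_webfinger/2 above, but takes the WebFinger domain
  # as an argument instead of reading WebFinger.Schema.domain().
  def maybe_put_webfinger(%{"preferredUsername" => username} = data, %{local: true}, domain) do
    # FEP-2c59 entry for local users
    Map.put(data, "webfinger", "#{username}@#{domain}")
  end

  def maybe_put_webfinger(data, _user, _domain), do: data
end
```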
defp maybe_make_image(func, key, user) do
image = func.(user, no_default: true)
maybe_insert_image(key, image)


@ -1,12 +1,11 @@
defmodule Pleroma.Web.AkkomaAPI.TranslationController do
use Pleroma.Web, :controller
alias Pleroma.Akkoma.Translator
alias Pleroma.Web.Plugs.OAuthScopesPlug
require Logger
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
@unauthenticated_access %{fallback: :proceed_unauthenticated, scopes: []}
plug(
OAuthScopesPlug,
@ -24,7 +23,7 @@ defmodule Pleroma.Web.AkkomaAPI.TranslationController do
@doc "GET /api/v1/akkoma/translation/languages"
def languages(conn, _params) do
with {:enabled, true} <- {:enabled, Pleroma.Config.get([:translator, :enabled])},
{:ok, source_languages, dest_languages} <- get_languages() do
{:ok, source_languages, dest_languages} <- Translator.languages() do
conn
|> json(%{source: source_languages, target: dest_languages})
else
@ -36,16 +35,4 @@ defmodule Pleroma.Web.AkkomaAPI.TranslationController do
{:error, e}
end
end
defp get_languages do
module = Pleroma.Config.get([:translator, :module])
@cachex.fetch!(:translations_cache, "languages:#{module}}", fn _ ->
with {:ok, source_languages, dest_languages} <- module.languages() do
{:commit, {:ok, source_languages, dest_languages}}
else
{:error, err} -> {:ignore, {:error, err}}
end
end)
end
end


@ -34,6 +34,29 @@ defmodule Pleroma.Web.ApiSpec.InstanceOperation do
}
end
def translation_languages_operation do
%Operation{
tags: ["Instance"],
summary: "Retrieve supported languages matrix",
operationId: "InstanceController.translation_languages",
responses: %{
200 =>
Operation.response(
"Translation languages matrix",
"application/json",
%Schema{
type: :object,
additionalProperties: %Schema{
type: :array,
items: %Schema{type: :string},
description: "Supported target languages for a source language"
}
}
)
}
}
end
defp instance do
%Schema{
type: :object,


@ -5,6 +5,7 @@
defmodule Pleroma.Web.ApiSpec.PleromaConversationOperation do
alias OpenApiSpex.Operation
alias OpenApiSpex.Schema
alias Pleroma.Web.ApiSpec.Schemas.BooleanLike
alias Pleroma.Web.ApiSpec.Schemas.Conversation
alias Pleroma.Web.ApiSpec.Schemas.FlakeID
alias Pleroma.Web.ApiSpec.StatusOperation
@ -42,7 +43,8 @@ defmodule Pleroma.Web.ApiSpec.PleromaConversationOperation do
Operation.parameter(:id, :path, :string, "Conversation ID",
example: "123",
required: true
)
),
Operation.parameter(:with_muted, :query, BooleanLike, "Include activities by muted users")
| pagination_params()
],
security: [%{"oAuth" => ["read:statuses"]}],
@ -59,6 +61,9 @@ defmodule Pleroma.Web.ApiSpec.PleromaConversationOperation do
end
def update_operation do
recipients_description =
"A list of ids of users that should receive posts to this conversation. This will replace the current list of recipients, so submit the full list. The owner of the conversation will always be part of the set of recipients, though."
%Operation{
tags: ["Conversations"],
summary: "Update conversation",
@ -72,10 +77,21 @@ defmodule Pleroma.Web.ApiSpec.PleromaConversationOperation do
:recipients,
:query,
%Schema{type: :array, items: FlakeID},
"A list of ids of users that should receive posts to this conversation. This will replace the current list of recipients, so submit the full list. The owner of owner of the conversation will always be part of the set of recipients, though.",
required: true
recipients_description
)
],
requestBody:
request_body("Parameters", %Schema{
type: :object,
properties: %{
recipients: %Schema{
type: :array,
items: FlakeID,
nullable: false,
description: recipients_description
}
}
}),
security: [%{"oAuth" => ["write:conversations"]}],
operationId: "PleromaAPI.ConversationController.update",
responses: %{


@ -24,7 +24,17 @@ defmodule Pleroma.Web.ApiSpec.ReportOperation do
requestBody: Helpers.request_body("Parameters", create_request(), required: true),
responses: %{
200 => Operation.response("Report", "application/json", create_response()),
400 => Operation.response("Report", "application/json", ApiError)
400 => Operation.response("Report", "application/json", ApiError),
404 =>
Operation.response(
"Report",
"application/json",
%Schema{
allOf: [ApiError],
title: "Report",
example: %{"error" => "Record not found"}
}
)
}
}
end


@ -244,7 +244,19 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
example: %{
"error" => "Record not found"
}
})
}),
422 =>
Operation.response(
"Unprocessable Entity",
"application/json",
%Schema{
allOf: [ApiError],
title: "Unprocessable Entity",
example: %{
"error" => "Someone else's status cannot be unpinned"
}
}
)
}
}
end
@ -258,7 +270,8 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
operationId: "StatusController.bookmark",
parameters: [id_param()],
responses: %{
200 => status_response()
200 => status_response(),
404 => Operation.response("Not found", "application/json", ApiError)
}
}
end
@ -272,7 +285,8 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
operationId: "StatusController.unbookmark",
parameters: [id_param()],
responses: %{
200 => status_response()
200 => status_response(),
404 => Operation.response("Not found", "application/json", ApiError)
}
}
end
@ -307,7 +321,17 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
],
responses: %{
200 => status_response(),
400 => Operation.response("Error", "application/json", ApiError)
400 => Operation.response("Error", "application/json", ApiError),
404 =>
Operation.response(
"Unprocessable Entity",
"application/json",
%Schema{
allOf: [ApiError],
title: "Error",
example: %{"error" => "Record not found"}
}
)
}
}
end
@ -323,7 +347,17 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
parameters: [id_param()],
responses: %{
200 => status_response(),
400 => Operation.response("Error", "application/json", ApiError)
400 => Operation.response("Error", "application/json", ApiError),
404 =>
Operation.response(
"Error",
"application/json",
%Schema{
allOf: [ApiError],
title: "Error",
example: %{"error" => "Record not found"}
}
)
}
}
end
@ -417,11 +451,48 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
tags: ["Retrieve status translation"],
summary: "Translate status",
description: "View the translation of a given status",
operationId: "StatusController.translation",
operationId: "StatusController.translate",
parameters: [id_param()],
security: [%{"oAuth" => ["read:statuses"]}],
requestBody:
request_body(
"Parameters",
%Schema{
type: :object,
properties: %{
lang: %Schema{
type: :string,
nullable: true,
description: "Translation target language."
},
source_lang: %Schema{
type: :string,
nullable: true,
description: "Translation source language."
}
}
},
required: false
),
responses: %{
200 => Operation.response("Translation", "application/json", translation()),
400 => Operation.response("Error", "application/json", ApiError),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def translate_legacy_operation do
%Operation{
tags: ["Retrieve status translation"],
summary: "Translate status",
description: "View the translation of a given status",
operationId: "StatusController.translate_legacy",
deprecated: true,
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [id_param(), language_param(), source_language_param()],
responses: %{
200 => Operation.response("Translation", "application/json", translation()),
200 => Operation.response("Translation", "application/json", translation_legacy()),
400 => Operation.response("Error", "application/json", ApiError),
404 => Operation.response("Not Found", "application/json", ApiError)
}
@ -565,10 +636,16 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
description:
"The number of seconds the posted activity should expire in. When a posted activity expires it will be deleted from the server, and a delete request for it will be federated. This needs to be longer than an hour."
},
quote_id: %Schema{
quoted_status_id: %Schema{
nullable: true,
type: :string,
description: "Will quote a given status."
},
quote_id: %Schema{
deprecated: true,
nullable: true,
type: :string,
description: "Deprecated alias for quoted_status_id."
}
},
example: %{
@ -788,6 +865,29 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
defp translation do
%Schema{
title: "StatusTranslation",
description: "Represents status translation with related information.",
type: :object,
required: [:content, :detected_source_language, :provider],
properties: %{
content: %Schema{
type: :string,
description: "Translated status content"
},
detected_source_language: %Schema{
type: :string,
description: "Detected source language"
},
provider: %Schema{
type: :string,
description: "Translation provider service name"
}
}
}
end
defp translation_legacy do
%Schema{
title: "AkkomaStatusTranslation",
description: "The translation of a status.",
type: :object,
required: [:detected_language, :text],


@ -19,10 +19,10 @@ defmodule Pleroma.Web.ApiSpec.RenderError do
def call(conn, errors) do
errors =
Enum.map(errors, fn
%{name: nil, reason: :invalid_enum} = err ->
%OpenApiSpex.Cast.Error{name: nil, reason: :invalid_enum} = err ->
%OpenApiSpex.Cast.Error{err | name: err.value}
%{name: nil} = err ->
%OpenApiSpex.Cast.Error{name: nil} = err ->
%OpenApiSpex.Cast.Error{err | name: List.last(err.path)}
err ->


@ -199,10 +199,15 @@ defmodule Pleroma.Web.CommonAPI do
%Object{} = note <- Object.normalize(activity, fetch: false),
%Activity{} = like <- Utils.get_existing_like(user.ap_id, note),
{:ok, undo, _} <- Builder.undo(user, like),
{:ok, activity, _} <- Pipeline.common_pipeline(undo, local: true) do
{:ok, activity, _} <- Pipeline.common_pipeline(undo, local: true),
# to avoid exposing post data in API response, lie to user and
# claim the operation failed if they aren't (anymore) allowed to access it.
# But only check at end to allow retracting the fav if ID still available
{_, true} <- {:visibility, Visibility.visible_for_user?(note, user)} do
{:ok, activity}
else
{:find_activity, _} -> {:error, :not_found}
{:visibility, _} -> {:error, :not_found}
_ -> {:error, dgettext("errors", "Could not unfavorite")}
end
end
@@ -425,6 +430,7 @@ defmodule Pleroma.Web.CommonAPI do
@spec unpin(String.t(), User.t()) :: {:ok, User.t()} | {:error, term()}
def unpin(id, user) do
with %Activity{} = activity <- create_activity_by_id(id),
true <- activity_belongs_to_actor(activity, user.ap_id),
{:ok, unpin_data, _} <- Builder.unpin(user, activity.object),
{:ok, _unpin, _} <-
Pipeline.common_pipeline(unpin_data,
@@ -440,7 +446,8 @@ defmodule Pleroma.Web.CommonAPI do
def add_mute(user, activity, params \\ %{}) do
expires_in = Map.get(params, :expires_in, 0)
with {:ok, _} <- ThreadMute.add_mute(user.id, activity.data["context"]),
with true <- Visibility.visible_for_user?(activity, user),
{:ok, _} <- ThreadMute.add_mute(user.id, activity.data["context"]),
_ <- Pleroma.Notification.mark_context_as_read(user, activity.data["context"]) do
if expires_in > 0 do
Pleroma.Workers.MuteExpireWorker.enqueue(
@@ -453,12 +460,17 @@ defmodule Pleroma.Web.CommonAPI do
{:ok, activity}
else
{:error, _} -> {:error, dgettext("errors", "conversation is already muted")}
false -> {:error, :visibility_error}
end
end
def remove_mute(%User{} = user, %Activity{} = activity) do
ThreadMute.remove_mute(user.id, activity.data["context"])
{:ok, activity}
if Visibility.visible_for_user?(activity, user) do
ThreadMute.remove_mute(user.id, activity.data["context"])
{:ok, activity}
else
{:error, :visibility_error}
end
end
def remove_mute(user_id, activity_id) do
@@ -485,7 +497,8 @@ defmodule Pleroma.Web.CommonAPI do
def report(user, data) do
with {:ok, account} <- get_reported_account(data.account_id),
{:ok, {content_html, _, _}} <- make_report_content_html(data[:comment]),
{:ok, statuses} <- get_report_statuses(account, data) do
{:ok, statuses} <- get_report_statuses(account, data),
{_, true} <- {:visibility, check_statuses_visibility(user, statuses)} do
ActivityPub.flag(%{
context: Utils.generate_context_id(),
actor: user,
@@ -494,9 +507,22 @@ defmodule Pleroma.Web.CommonAPI do
content: content_html,
forward: Map.get(data, :forward, false)
})
else
{:visibility, _} ->
{:error, :visibility}
error ->
error
end
end
defp check_statuses_visibility(user, statuses) when is_list(statuses) do
Enum.all?(statuses, fn status -> Visibility.visible_for_user?(status, user) end)
end
# There are no statuses associated with the report, pass!
defp check_statuses_visibility(_, status) when status == nil, do: true
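The visibility gate above relies on `Enum.all?/2` being vacuously true for an empty list, so reports with no attached statuses always pass. A minimal standalone sketch (not part of the diff; `visible?` is a hypothetical stand-in for `Visibility.visible_for_user?/2`):

```elixir
# A report with no attached statuses passes the check vacuously:
true = Enum.all?([], fn _ -> false end)

# With statuses attached, every single one must be visible to the reporter:
visible? = fn status -> status.public end
statuses = [%{public: true}, %{public: false}]
false = Enum.all?(statuses, visible?)
```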
defp get_reported_account(account_id) do
case User.get_cached_by_id(account_id) do
%User{} = account -> {:ok, account}
@@ -567,9 +593,6 @@ defmodule Pleroma.Web.CommonAPI do
user = User.get_cached_by_ap_id(ap_id) ->
user
user = User.get_by_guessed_nickname(ap_id) ->
user
fake_record_fallback ->
# TODO: refactor (fake records are never a good idea)
User.error_user(ap_id)


@@ -5,6 +5,7 @@
defmodule Pleroma.Web.CommonAPI.ActivityDraft do
alias Pleroma.Activity
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Builder
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.CommonAPI
@@ -24,7 +25,6 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
in_reply_to: nil,
language: nil,
content_map: %{},
quote_id: nil,
quote: nil,
visibility: nil,
expires_at: nil,
@@ -69,23 +69,27 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
end
defp put_params(draft, params) do
params = Map.put_new(params, :in_reply_to_status_id, params[:in_reply_to_id])
%__MODULE__{draft | params: params}
params =
params
|> Map.put_new(:in_reply_to_status_id, params[:in_reply_to_id])
|> Map.put_new(:quoted_status_id, params[:quote_id])
%{draft | params: params}
end
defp status(%{params: %{status: status}} = draft) do
%__MODULE__{draft | status: String.trim(status)}
%{draft | status: String.trim(status)}
end
defp summary(%{params: params} = draft) do
%__MODULE__{draft | summary: Map.get(params, :spoiler_text, "")}
%{draft | summary: Map.get(params, :spoiler_text, "")}
end
defp full_payload(%{status: status, summary: summary} = draft) do
full_payload = String.trim(status <> summary)
case Utils.validate_character_limit(full_payload, draft.attachments) do
:ok -> %__MODULE__{draft | full_payload: full_payload}
:ok -> %{draft | full_payload: full_payload}
{:error, message} -> add_error(draft, message)
end
end
@@ -93,7 +97,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
defp attachments(%{params: params, user: user} = draft) do
case Utils.attachments_from_ids(user, params) do
attachments when is_list(attachments) ->
%__MODULE__{draft | attachments: attachments}
%{draft | attachments: attachments}
{:error, reason} ->
add_error(draft, reason)
@@ -111,7 +115,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
with %Activity{} = activity <- Activity.get_by_id(id),
true <- Visibility.visible_for_user?(activity, draft.user),
{_, type} when type in ["Create", "Announce"] <- {:type, activity.data["type"]} do
%__MODULE__{draft | in_reply_to: activity}
%{draft | in_reply_to: activity}
else
nil ->
add_error(draft, dgettext("errors", "Parent post does not exist or was deleted"))
@@ -130,29 +134,52 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
end
defp in_reply_to(%{params: %{in_reply_to_status_id: %Activity{} = in_reply_to}} = draft) do
%__MODULE__{draft | in_reply_to: in_reply_to}
%{draft | in_reply_to: in_reply_to}
end
defp in_reply_to(draft), do: draft
defp quote_id(%{params: %{quote_id: ""}} = draft), do: draft
defp can_quote(
%User{ap_id: actor},
%Activity{actor: quoted_author, data: %{"type" => "Create"}} = quoting,
quote_visibility
) do
quoting_visibility = CommonAPI.get_quoted_visibility(quoting)
defp quote_id(%{params: %{quote_id: id}} = draft) when is_binary(id) do
quoting_visibility in ["public", "unlisted"] or
(quoting_visibility == "local" && quote_visibility == quoting_visibility) or
(quoting_visibility == "private" && quote_visibility == quoting_visibility &&
actor == quoted_author)
end
defp can_quote(_, _, _), do: false
defp quote_id(%{params: %{quoted_status_id: ""}} = draft), do: draft
defp quote_id(
%{user: actor, visibility: quote_visibility, params: %{quoted_status_id: id}} = draft
)
when is_binary(id) do
with {:activity, %Activity{} = quote} <- {:activity, Activity.get_by_id(id)},
visibility <- CommonAPI.get_quoted_visibility(quote),
{:visibility, true} <- {:visibility, visibility in ["public", "unlisted"]} do
%__MODULE__{draft | quote: Activity.get_by_id(id)}
{:visibility, true} <- {:visibility, can_quote(actor, quote, quote_visibility)} do
%{draft | quote: Activity.get_by_id(id)}
else
{:activity, _} ->
add_error(draft, dgettext("errors", "You can't quote a status that doesn't exist"))
{:visibility, false} ->
add_error(draft, dgettext("errors", "You can only quote public or unlisted statuses"))
add_error(
draft,
dgettext(
"errors",
"You cannot quote this status at all or not with the intended visibility"
)
)
end
end
defp quote_id(%{params: %{quote_id: %Activity{} = quote}} = draft) do
%__MODULE__{draft | quote: quote}
defp quote_id(%{params: %{quoted_status_id: %Activity{} = quote}} = draft) do
%{draft | quote: quote}
end
defp quote_id(draft), do: draft
@@ -160,7 +187,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
defp language(%{params: %{language: language}, content_html: content} = draft)
when is_binary(language) do
if Pleroma.ISO639.valid_alpha2?(language) do
%__MODULE__{draft | content_map: %{language => content}}
%{draft | content_map: %{language => content}}
else
add_error(draft, dgettext("errors", "Invalid language"))
end
@@ -168,7 +195,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
defp language(%{content_html: content} = draft) do
# Use a default language if no language is specified
%__MODULE__{draft | content_map: %{"en" => content}}
%{draft | content_map: %{"en" => content}}
end
defp visibility(%{params: params} = draft) do
@@ -177,13 +204,13 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
add_error(draft, dgettext("errors", "The message visibility must be direct"))
{visibility, _} ->
%__MODULE__{draft | visibility: visibility}
%{draft | visibility: visibility}
end
end
defp expires_at(draft) do
case CommonAPI.check_expiry_date(draft.params[:expires_in]) do
{:ok, expires_at} -> %__MODULE__{draft | expires_at: expires_at}
{:ok, expires_at} -> %{draft | expires_at: expires_at}
{:error, message} -> add_error(draft, message)
end
end
@@ -191,7 +218,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
defp poll(draft) do
case Utils.make_poll_data(draft.params) do
{:ok, {poll, poll_emoji}} ->
%__MODULE__{draft | extra: poll, emoji: Map.merge(draft.emoji, poll_emoji)}
%{draft | extra: poll, emoji: Map.merge(draft.emoji, poll_emoji)}
{:error, message} ->
add_error(draft, message)
@@ -206,22 +233,22 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
|> Enum.map(fn {_, mentioned_user} -> mentioned_user.ap_id end)
|> Utils.get_addressed_users(draft.params[:to])
%__MODULE__{draft | content_html: content_html, mentions: mentions, tags: tags}
%{draft | content_html: content_html, mentions: mentions, tags: tags}
end
defp to_and_cc(draft) do
{to, cc} = Utils.get_to_and_cc(draft)
%__MODULE__{draft | to: to, cc: cc}
%{draft | to: to, cc: cc}
end
defp context(draft) do
context = Utils.make_context(draft)
%__MODULE__{draft | context: context}
%{draft | context: context}
end
defp sensitive(draft) do
sensitive = draft.params[:sensitive]
%__MODULE__{draft | sensitive: sensitive}
%{draft | sensitive: sensitive}
end
defp object(draft) do
@@ -266,7 +293,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
|> Map.put("generator", draft.params[:generator])
|> Map.put("contentMap", draft.content_map)
%__MODULE__{draft | object: object}
%{draft | object: object}
end
defp maybe_put(map, key, value, true), do: map |> Map.put(key, value)
@@ -274,7 +301,7 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
defp preview?(draft) do
preview? = Pleroma.Web.Utils.Params.truthy_param?(draft.params[:preview])
%__MODULE__{draft | preview?: preview?}
%{draft | preview?: preview?}
end
defp changes(draft) do
@@ -296,14 +323,14 @@ defmodule Pleroma.Web.CommonAPI.ActivityDraft do
additional: additional
}
%__MODULE__{draft | changes: changes}
%{draft | changes: changes}
end
defp with_valid(%{valid?: true} = draft, func), do: func.(draft)
defp with_valid(draft, _func), do: draft
defp add_error(draft, message) do
%__MODULE__{draft | valid?: false, errors: [message | draft.errors]}
%{draft | valid?: false, errors: [message | draft.errors]}
end
defp validate(%{valid?: true} = draft), do: {:ok, draft}


@@ -21,10 +21,14 @@ defmodule Pleroma.Web.MastodonAPI.ConversationController do
@doc "GET /api/v1/conversations"
def index(%{assigns: %{user: user}} = conn, params) do
participations = Participation.for_user_with_last_activity_id(user, params)
participations_keyed = Participation.for_user_with_pagination(user, params)
participations =
Pleroma.Pagination.unwrap(participations_keyed)
|> Participation.preload_last_activity_id_and_filter()
conn
|> add_link_headers(participations)
|> add_link_headers(participations_keyed)
|> render("participations.json", participations: participations, for: user)
end


@@ -20,4 +20,18 @@ defmodule Pleroma.Web.MastodonAPI.InstanceController do
def peers(conn, _params) do
json(conn, Pleroma.Stats.get_peers())
end
@doc "GET /api/v1/instance/translation_languages"
def translation_languages(conn, _params) do
with {:ok, source_languages, destination_languages} <- Pleroma.Akkoma.Translator.languages() do
conn
|> render("translation_languages.json", %{
source_languages: source_languages,
destination_languages: destination_languages
})
else
{:enabled, false} -> json(conn, %{})
e -> {:error, e}
end
end
end


@@ -16,6 +16,12 @@ defmodule Pleroma.Web.MastodonAPI.ReportController do
def create(%{assigns: %{user: user}, body_params: params} = conn, _) do
with {:ok, activity} <- Pleroma.Web.CommonAPI.report(user, params) do
render(conn, "show.json", activity: activity)
else
{:error, :visibility} ->
{:error, :not_found, "Record not found"}
error ->
error
end
end
end


@@ -41,6 +41,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
:show,
:context,
:translate,
:translate_legacy,
:show_history,
:show_source
]
@@ -134,7 +135,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
"""
# Creates a scheduled status when `scheduled_at` param is present and it's far enough in the future
def create(
%{
%Plug.Conn{
assigns: %{user: user},
body_params: %{status: _, scheduled_at: scheduled_at} = params
} = conn,
@@ -202,7 +203,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
end
end
def create(%{assigns: %{user: _user}, body_params: %{media_ids: _} = params} = conn, _) do
def create(%Plug.Conn{assigns: %{user: _user}, body_params: %{media_ids: _} = params} = conn, _) do
params = Map.put(params, :status, "")
create(%Plug.Conn{conn | body_params: params}, %{})
end
@@ -351,8 +352,15 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
@doc "POST /api/v1/statuses/:id/unpin"
def unpin(%{assigns: %{user: user}} = conn, %{id: ap_id_or_id}) do
# CommonAPI already checks whether user can unpin
with {:ok, activity} <- CommonAPI.unpin(ap_id_or_id, user) do
try_render(conn, "show.json", activity: activity, for: user, as: :activity)
else
{:error, :ownership_error} ->
{:error, :unprocessable_entity, "Someone else's status cannot be unpinned"}
error ->
error
end
end
@@ -363,6 +371,12 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
true <- Visibility.visible_for_user?(activity, user),
{:ok, _bookmark} <- Bookmark.create(user.id, activity.id) do
try_render(conn, "show.json", activity: activity, for: user, as: :activity)
else
none when none in [nil, false] ->
{:error, :not_found, "Record not found"}
error ->
error
end
end
@@ -370,25 +384,48 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
def unbookmark(%{assigns: %{user: user}} = conn, %{id: id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id),
%User{} = user <- User.get_cached_by_nickname(user.nickname),
true <- Visibility.visible_for_user?(activity, user),
{:ok, _bookmark} <- Bookmark.destroy(user.id, activity.id) do
# order matters: if a user bookmarked a post but later lost access rights via unfollow
# we want to allow cleaning up the now useless entry (if it was still cached locally)
# but never return a success response which contains the current status content
:ok <- Bookmark.destroy(user.id, activity.id),
true <- Visibility.visible_for_user?(activity, user) do
try_render(conn, "show.json", activity: activity, for: user, as: :activity)
else
none when none in [nil, false] ->
{:error, :not_found, "Record not found"}
error ->
error
end
end
@doc "POST /api/v1/statuses/:id/mute"
def mute_conversation(%{assigns: %{user: user}, body_params: params} = conn, %{id: id}) do
with %Activity{} = activity <- Activity.get_by_id(id),
# CommonAPI already checks whether user is allowed to mute
{:ok, activity} <- CommonAPI.add_mute(user, activity, params) do
try_render(conn, "show.json", activity: activity, for: user, as: :activity)
else
{:error, :visibility_error} ->
{:error, :not_found, "Record not found"}
error ->
error
end
end
@doc "POST /api/v1/statuses/:id/unmute"
def unmute_conversation(%{assigns: %{user: user}} = conn, %{id: id}) do
with %Activity{} = activity <- Activity.get_by_id(id),
# CommonAPI already checks whether user is allowed to unmute
{:ok, activity} <- CommonAPI.remove_mute(user, activity) do
try_render(conn, "show.json", activity: activity, for: user, as: :activity)
else
{:error, :visibility_error} ->
{:error, :not_found, "Record not found"}
error ->
error
end
end
@@ -453,6 +490,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
|> ActivityPub.fetch_activities_for_context(%{
blocking_user: user,
user: user,
with_muted: true,
exclude_id: activity.id
})
|> Enum.filter(fn activity -> Visibility.visible_for_user?(activity, user) end)
@@ -497,20 +535,34 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
)
end
@doc "GET /api/v1/statuses/:id/translations/:language"
def translate(%{assigns: %{user: user}} = conn, %{id: id, language: language} = params) do
with {:enabled, true} <- {:enabled, Config.get([:translator, :enabled])},
%Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
translation_module <- Config.get([:translator, :module]),
{:ok, detected, translation} <-
fetch_or_translate(
activity.id,
activity.object.data["content"],
Map.get(params, :from, nil),
language,
translation_module
@doc "POST /api/v1/statuses/:id/translate"
def translate(%{assigns: %{user: user}, body_params: params} = conn, %{id: id}) do
with {:ok, translation} <-
do_translate(
id,
user,
Map.get(params, :source_lang, nil),
Map.get(params, :lang, nil)
) do
json(conn, translation)
else
{:enabled, false} ->
conn
|> put_status(:forbidden)
|> json(%{"error" => "Translation is not enabled"})
{:visible, false} ->
{:error, :not_found}
e ->
e
end
end
@doc "GET /api/v1/statuses/:id/translations/:language"
def translate_legacy(%{assigns: %{user: user}} = conn, %{id: id, language: language} = params) do
with {:ok, %{content: translation, detected_source_language: detected}} <-
do_translate(id, user, Map.get(params, :from, nil), language) do
json(conn, %{detected_language: detected, text: translation})
else
{:enabled, false} ->
@@ -526,6 +578,28 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
end
end
defp do_translate(id, user, source_language, target_language) do
with {:enabled, true} <- {:enabled, Config.get([:translator, :enabled])},
%Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
translation_module <- Config.get([:translator, :module]),
{:ok, detected, translation} <-
fetch_or_translate(
activity.id,
activity.object.data["content"],
source_language,
target_language,
translation_module
) do
{:ok,
%{
content: translation,
detected_source_language: detected,
provider: translation_module.name()
}}
end
end
defp fetch_or_translate(status_id, text, source_language, target_language, translation_module) do
@cachex.fetch!(
:translations_cache,


@@ -181,10 +181,19 @@ defmodule Pleroma.Web.MastodonAPI.AccountView do
end
def render("instance.json", %{instance: %Pleroma.Instances.Instance{} = instance}) do
nodeinfo =
if Pleroma.Config.get!([:instance, :filter_embedded_nodeinfo]) and instance.nodeinfo do
%{}
|> maybe_put_nodeinfo(instance.nodeinfo, "version")
|> maybe_put_nodeinfo(instance.nodeinfo, "software")
else
instance.nodeinfo
end
%{
name: instance.host,
favicon: instance.favicon |> MediaProxy.url(),
nodeinfo: instance.nodeinfo
nodeinfo: nodeinfo
}
end
@@ -442,6 +451,16 @@ defmodule Pleroma.Web.MastodonAPI.AccountView do
defp maybe_put_email_address(data, _, _), do: data
defp maybe_put_nodeinfo(map, nodeinfo, key) do
val = nodeinfo[key]
if val do
Map.put(map, key, val)
else
map
end
end
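How the new nodeinfo filter behaves can be sketched standalone (the real helper is a private `defp`; here it is rebuilt as an anonymous function over a hypothetical nodeinfo map):

```elixir
# Rebuilt copy of maybe_put_nodeinfo/3 for illustration:
maybe_put_nodeinfo = fn map, nodeinfo, key ->
  val = nodeinfo[key]
  if val, do: Map.put(map, key, val), else: map
end

nodeinfo = %{
  "version" => "2.0",
  "software" => %{"name" => "akkoma"},
  "metadata" => %{"nodeName" => "example"}
}

# With filter_embedded_nodeinfo enabled, only "version" and "software" survive:
filtered =
  %{}
  |> maybe_put_nodeinfo.(nodeinfo, "version")
  |> maybe_put_nodeinfo.(nodeinfo, "software")

["software", "version"] = filtered |> Map.keys() |> Enum.sort()
```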
defp image_url(%{"url" => [%{"href" => href} | _]}), do: href
defp image_url(_), do: nil
end


@@ -6,8 +6,8 @@ defmodule Pleroma.Web.MastodonAPI.ConversationView do
use Pleroma.Web, :view
alias Pleroma.Activity
alias Pleroma.Conversation.Participation
alias Pleroma.Repo
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.MastodonAPI.AccountView
alias Pleroma.Web.MastodonAPI.StatusView
@@ -21,18 +21,10 @@ defmodule Pleroma.Web.MastodonAPI.ConversationView do
def render("participation.json", %{participation: participation, for: user}) do
participation = Repo.preload(participation, conversation: [], recipients: [])
last_activity_id =
with nil <- participation.last_activity_id do
ActivityPub.fetch_latest_direct_activity_id_for_context(
participation.conversation.ap_id,
%{
user: user,
blocking_user: user
}
)
end
activity_id =
participation.last_activity_id || Participation.last_activity_id(participation, user)
activity = Activity.get_by_id_with_object(last_activity_id)
activity = Activity.get_by_id_with_object(activity_id)
# Conversations return all users except the current user,
# except when the current user is the only participant


@@ -13,7 +13,7 @@ defmodule Pleroma.Web.MastodonAPI.CustomEmojiView do
end
def render("show.json", %{custom_emoji: {shortcode, %Emoji{file: relative_url, tags: tags}}}) do
url = Endpoint.url() |> URI.merge(relative_url) |> to_string()
url = Endpoint.url() <> relative_url
%{
"shortcode" => shortcode,


@@ -14,7 +14,7 @@ defmodule Pleroma.Web.MastodonAPI.InstanceView do
instance = Config.get(:instance)
%{
uri: Pleroma.Web.WebFinger.domain(),
uri: Pleroma.Web.WebFinger.Schema.domain(),
title: Keyword.get(instance, :name),
description: Keyword.get(instance, :description),
short_description:
@@ -55,6 +55,18 @@ defmodule Pleroma.Web.MastodonAPI.InstanceView do
}
end
def render("translation_languages.json", %{
source_languages: source_languages,
destination_languages: destination_languages
}) do
source_language_codes = Enum.map(source_languages, fn lang -> lang.code end)
dest_language_codes = Enum.map(destination_languages, fn lang -> lang.code end)
Map.new(source_language_codes, fn language ->
{language, dest_language_codes -- [language]}
end)
end
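The rendered structure maps each source language to every destination language except itself; a quick standalone sketch with hypothetical language lists:

```elixir
source_language_codes = ["en", "de"]
dest_language_codes = ["en", "de", "fr"]

# Same construction as the view: each source maps to all destinations but itself
pairs =
  Map.new(source_language_codes, fn language ->
    {language, dest_language_codes -- [language]}
  end)

%{"en" => ["de", "fr"], "de" => ["en", "fr"]} = pairs
```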
def features do
[
"pleroma_api",

View file

@@ -792,8 +792,8 @@ defmodule Pleroma.Web.MastodonAPI.StatusView do
defp maybe_render_quote(nil, _), do: nil
defp maybe_render_quote(quote, opts) do
with %User{} = quoted_user <- User.get_cached_by_ap_id(quote.actor),
false <- Map.get(opts, :do_not_recurse, false),
with false <- Map.get(opts, :do_not_recurse, false),
%User{} = quoted_user <- User.get_cached_by_ap_id(quote.actor),
true <- visible_for_user?(quote, opts[:for]),
false <- User.blocks?(opts[:for], quoted_user),
false <- User.mutes?(opts[:for], quoted_user) do
@@ -802,7 +802,14 @@ defmodule Pleroma.Web.MastodonAPI.StatusView do
|> Map.put(:activity, quote)
|> Map.put(:do_not_recurse, true)
render("show.json", opts)
qdata = render("show.json", opts)
# For Masto-API compat we need to stuff the quote into itself
# such that the "quote" object meets both the old *oma convention
# of being directly a status itself and the new Masto flavour with a sub-object
qdata
|> Map.put(:state, "accepted")
|> Map.put(:quoted_status, qdata)
else
_ -> nil
end
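The self-stuffing above yields a payload readable by both client flavours; a sketch of the resulting shape (hypothetical status fields, standalone):

```elixir
qdata = %{id: "1", content: "quoted post"}

quote_payload =
  qdata
  |> Map.put(:state, "accepted")
  |> Map.put(:quoted_status, qdata)

# old *oma clients read the quote's fields directly ...
"quoted post" = quote_payload.content
# ... while Masto-style clients read the nested :quoted_status sub-object
"quoted post" = quote_payload.quoted_status.content
```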


@@ -64,7 +64,7 @@ defmodule Pleroma.Web.Metadata.Utils do
def user_name_string(user) do
"#{user.name} " <>
if user.local do
"(@#{user.nickname}@#{Pleroma.Web.WebFinger.domain()})"
"(@#{user.nickname}@#{Pleroma.Web.WebFinger.Schema.domain()})"
else
"(@#{user.nickname})"
end


@@ -27,65 +27,69 @@ defmodule Pleroma.Web.Nodeinfo.Nodeinfo do
federation = InstanceView.federation()
features = InstanceView.features()
# (Unlike most of our views)
# This uses string keys for consistency with remote nodeinfo data
%{
version: "2.0",
software: %{
name: Pleroma.Application.name() |> String.downcase(),
version: Pleroma.Application.version()
"version" => "2.0",
"software" => %{
"name" => Pleroma.Application.name() |> String.downcase(),
"version" => Pleroma.Application.version()
},
protocols: Publisher.gather_nodeinfo_protocol_names(),
services: %{
inbound: [],
outbound: []
"protocols" => Publisher.gather_nodeinfo_protocol_names(),
"services" => %{
"inbound" => [],
"outbound" => []
},
openRegistrations: Config.get([:instance, :registrations_open]),
usage: %{
users: %{
total: Map.get(stats, :user_count, 0),
activeMonth: Pleroma.User.active_user_count(30),
activeHalfyear: Pleroma.User.active_user_count(180)
"openRegistrations" => Config.get([:instance, :registrations_open]),
"usage" => %{
"users" => %{
"total" => Map.get(stats, :user_count, 0),
"activeMonth" => Pleroma.User.active_user_count(30),
"activeHalfyear" => Pleroma.User.active_user_count(180)
},
localPosts: Map.get(stats, :status_count, 0)
"localPosts" => Map.get(stats, :status_count, 0)
},
metadata: %{
nodeName: Config.get([:instance, :name]),
nodeDescription: description(),
private: !Config.get([:instance, :public], true),
suggestions: %{
enabled: false
"metadata" => %{
"nodeName" => Config.get([:instance, :name]),
"nodeDescription" => description(),
"private" => !Config.get([:instance, :public], true),
"suggestions" => %{
"enabled" => false
},
staffAccounts: staff_accounts,
federation: federation,
pollLimits: Config.get([:instance, :poll_limits]),
postFormats: Config.get([:instance, :allowed_post_formats]),
uploadLimits: %{
general: Config.get([:instance, :upload_limit]),
avatar: Config.get([:instance, :avatar_upload_limit]),
banner: Config.get([:instance, :banner_upload_limit]),
background: Config.get([:instance, :background_upload_limit])
"staffAccounts" => staff_accounts,
"federation" => federation,
"pollLimits" => Config.get([:instance, :poll_limits]),
"postFormats" => Config.get([:instance, :allowed_post_formats]),
"uploadLimits" => %{
"general" => Config.get([:instance, :upload_limit]),
"avatar" => Config.get([:instance, :avatar_upload_limit]),
"banner" => Config.get([:instance, :banner_upload_limit]),
"background" => Config.get([:instance, :background_upload_limit])
},
fieldsLimits: %{
maxFields: Config.get([:instance, :max_account_fields]),
maxRemoteFields: Config.get([:instance, :max_remote_account_fields]),
nameLength: Config.get([:instance, :account_field_name_length]),
valueLength: Config.get([:instance, :account_field_value_length])
"fieldsLimits" => %{
"maxFields" => Config.get([:instance, :max_account_fields]),
"maxRemoteFields" => Config.get([:instance, :max_remote_account_fields]),
"nameLength" => Config.get([:instance, :account_field_name_length]),
"valueLength" => Config.get([:instance, :account_field_value_length])
},
accountActivationRequired: Config.get([:instance, :account_activation_required], false),
invitesEnabled: Config.get([:instance, :invites_enabled], false),
mailerEnabled: Config.get([Pleroma.Emails.Mailer, :enabled], false),
features: features,
restrictedNicknames: Config.get([Pleroma.User, :restricted_nicknames]),
skipThreadContainment: Config.get([:instance, :skip_thread_containment], false),
privilegedStaff: Config.get([:instance, :privileged_staff]),
localBubbleInstances: Config.get([:instance, :local_bubble], []),
publicTimelineVisibility: %{
federated:
"accountActivationRequired" =>
Config.get([:instance, :account_activation_required], false),
"invitesEnabled" => Config.get([:instance, :invites_enabled], false),
"mailerEnabled" => Config.get([Pleroma.Emails.Mailer, :enabled], false),
"features" => features,
"restrictedNicknames" => Config.get([Pleroma.User, :restricted_nicknames]),
"skipThreadContainment" => Config.get([:instance, :skip_thread_containment], false),
"privilegedStaff" => Config.get([:instance, :privileged_staff]),
"localBubbleInstances" => Config.get([:instance, :local_bubble], []),
"publicTimelineVisibility" => %{
"federated" =>
!Config.restrict_unauthenticated_access?(:timelines, :federated) &&
Config.get([:instance, :federated_timeline_available], true),
local: !Config.restrict_unauthenticated_access?(:timelines, :local),
bubble: !Config.restrict_unauthenticated_access?(:timelines, :bubble)
"local" => !Config.restrict_unauthenticated_access?(:timelines, :local),
"bubble" => !Config.restrict_unauthenticated_access?(:timelines, :bubble)
},
federatedTimelineAvailable: Config.get([:instance, :federated_timeline_available], true)
"federatedTimelineAvailable" =>
Config.get([:instance, :federated_timeline_available], true)
}
}
end
@@ -95,12 +99,12 @@ defmodule Pleroma.Web.Nodeinfo.Nodeinfo do
updated_software =
raw_response
|> Map.get(:software)
|> Map.put(:repository, Pleroma.Application.repository())
|> Map.get("software")
|> Map.put("repository", Pleroma.Application.repository())
raw_response
|> Map.put(:software, updated_software)
|> Map.put(:version, "2.1")
|> Map.put("software", updated_software)
|> Map.put("version", "2.1")
end
def get_nodeinfo(_version) do


@@ -39,16 +39,22 @@ defmodule Pleroma.Web.PleromaAPI.ConversationController do
) do
with %Participation{user_id: ^user_id} = participation <-
Participation.get(participation_id, preload: [:conversation]) do
params =
qparams =
params
|> Map.put(:blocking_user, user)
|> Map.put(:muting_user, user)
|> Map.put(:user, user)
pparams =
params
|> Map.put(:total, false)
# Already sorted using a plain "DESC", matching our index instead of Pagination's "DESC NULLS LAST"
|> Map.put(:skip_extra_order, true)
activities =
participation.conversation.ap_id
|> ActivityPub.fetch_activities_for_context_query(params)
|> Pleroma.Pagination.fetch_paginated(Map.put(params, :total, false))
|> ActivityPub.fetch_activities_for_context_query(qparams)
|> Pleroma.Pagination.fetch_paginated(pparams)
|> Enum.reverse()
conn
@@ -64,13 +70,22 @@ defmodule Pleroma.Web.PleromaAPI.ConversationController do
end
def update(
%{assigns: %{user: %{id: user_id} = user}} = conn,
%{id: participation_id, recipients: recipients}
%{assigns: %{user: %{id: user_id} = user}, body_params: body_params} = conn,
%{id: participation_id} = params
) do
with %Participation{user_id: ^user_id} = participation <- Participation.get(participation_id),
# OpenApiSpex 3.x prevents Plug's usual parameter premerging
params = Map.merge(body_params, params)
with {_, recipients} when recipients != nil <- {:params, params[:recipients]},
%Participation{user_id: ^user_id} = participation <- Participation.get(participation_id),
{:ok, participation} <- Participation.set_recipients(participation, recipients) do
render(conn, "participation.json", participation: participation, for: user)
else
{:params, _} ->
conn
|> put_status(:bad_request)
|> json(%{"error" => "No parameters passed to update!"})
{:error, message} ->
conn
|> put_status(:bad_request)


@@ -18,7 +18,7 @@ defmodule Pleroma.Web.PleromaAPI.UserImportController do
plug(Pleroma.Web.ApiSpec.CastAndValidate)
defdelegate open_api_operation(action), to: ApiSpec.UserImportOperation
def follow(%{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do
def follow(%Plug.Conn{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do
follow(%Plug.Conn{conn | body_params: %{list: File.read!(path)}}, %{})
end
@@ -35,20 +35,20 @@ defmodule Pleroma.Web.PleromaAPI.UserImportController do
json(conn, "job started")
end
def blocks(%{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do
def blocks(%Plug.Conn{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do
blocks(%Plug.Conn{conn | body_params: %{list: File.read!(path)}}, %{})
end
def blocks(%{assigns: %{user: blocker}, body_params: %{list: list}} = conn, _) do
def blocks(%Plug.Conn{assigns: %{user: blocker}, body_params: %{list: list}} = conn, _) do
User.Import.blocks_import(blocker, prepare_user_identifiers(list))
json(conn, "job started")
end
def mutes(%{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do
def mutes(%Plug.Conn{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do
mutes(%Plug.Conn{conn | body_params: %{list: File.read!(path)}}, %{})
end
def mutes(%{assigns: %{user: user}, body_params: %{list: list}} = conn, _) do
def mutes(%Plug.Conn{assigns: %{user: user}, body_params: %{list: list}} = conn, _) do
User.Import.mutes_import(user, prepare_user_identifiers(list))
json(conn, "job started")
end


@@ -614,7 +614,8 @@ defmodule Pleroma.Web.Router do
post("/statuses/:id/unbookmark", StatusController, :unbookmark)
post("/statuses/:id/mute", StatusController, :mute_conversation)
post("/statuses/:id/unmute", StatusController, :unmute_conversation)
get("/statuses/:id/translations/:language", StatusController, :translate)
post("/statuses/:id/translate", StatusController, :translate)
get("/statuses/:id/translations/:language", StatusController, :translate_legacy)
post("/push/subscription", SubscriptionController, :create)
get("/push/subscription", SubscriptionController, :show)
@ -667,6 +668,7 @@ defmodule Pleroma.Web.Router do
post("/accounts", AccountController, :create)
get("/instance", InstanceController, :show)
get("/instance/translation_languages", InstanceController, :translation_languages)
get("/instance/peers", InstanceController, :peers)
get("/statuses", StatusController, :index)


@ -67,8 +67,10 @@ defmodule Pleroma.Web.Telemetry do
{meta.args["activity_id"], nil, "inbox_collection"}
"publish_one" ->
full_target = get_in(meta.args, ["params", "inbox"])
%{host: simple_target} = URI.parse(full_target || "")
full_inbox = get_in(meta.args, ["params", "inbox"]) || ""
%{host: simple_target} = URI.parse(full_inbox)
activity_apid = get_in(meta.args, ["params", "id"]) || "(no AP id)"
full_target = activity_apid <> " to " <> full_inbox
error = collect_apdelivery_error(event, meta)
{full_target, simple_target, error}
end
@ -112,6 +114,9 @@ defmodule Pleroma.Web.Telemetry do
error when is_atom(error) ->
"#{error}"
{:http_error, reason, _} when is_number(reason) or is_atom(reason) or is_binary(reason) ->
"http_#{reason}"
%{status: code} when is_number(code) ->
"http_#{code}"


@ -29,9 +29,16 @@ defmodule Pleroma.Web.TwitterAPI.RemoteFollowController do
# GET /ostatus_subscribe
#
def follow(%{assigns: %{user: user}} = conn, %{"acct" => acct}) do
case is_status?(acct) do
true -> follow_status(conn, user, acct)
_ -> follow_account(conn, user, acct)
cond do
String.starts_with?(acct, "@") ->
follow_account(conn, user, String.slice(acct, 1..-1//1))
String.starts_with?(acct, "http://") ||
(String.starts_with?(acct, "https://") && is_status?(acct)) ->
follow_status(conn, user, acct)
true ->
follow_account(conn, user, acct)
end
end


@ -16,7 +16,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.Plugs.OAuthScopesPlug
alias Pleroma.Web.WebFinger
alias Pleroma.Web.WebFinger.Finger
plug(
Pleroma.Web.ApiSpec.CastAndValidate
@ -111,7 +111,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
end
def remote_subscribe(conn, %{"user" => %{"nickname" => nick, "profile" => profile}}) do
with {:ok, %{"subscribe_address" => template}} <- WebFinger.finger(profile),
with {:ok, %{"subscribe_address" => template}} <- Finger.finger_raw_data(profile),
%User{ap_id: ap_id} <- User.get_cached_by_nickname(nick) do
conn
|> Phoenix.Controller.redirect(external: String.replace(template, "{uri}", ap_id))
@ -131,7 +131,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
end
def remote_subscribe(conn, %{"status" => %{"status_id" => id, "profile" => profile}}) do
with {:ok, %{"subscribe_address" => template}} <- WebFinger.finger(profile),
with {:ok, %{"subscribe_address" => template}} <- Finger.finger_raw_data(profile),
%Activity{} = activity <- Activity.get_by_id(id),
{:ok, ap_id} <- get_ap_id(activity) do
conn
@ -155,7 +155,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
%Plug.Conn{body_params: %{ap_id: ap_id, profile: profile}} = conn,
_params
) do
with {:ok, %{"subscribe_address" => template}} <- WebFinger.finger(profile) do
with {:ok, %{"subscribe_address" => template}} <- Finger.finger_raw_data(profile) do
conn
|> json(%{url: String.replace(template, "{uri}", ap_id)})
else


@ -87,22 +87,32 @@ defmodule Pleroma.Web.StreamerView do
|> Jason.encode!()
end
def render("follow_relationships_update.json", item, topic) do
def render(
"follow_relationships_update.json",
%{follower: follower, following: following, state: state},
topic
) do
# This is streamed out to the _follower_
# Thus the full details of the follower should be sent out unchecked,
# but details of the following user must obey user-indicated preferences
following_followers = if following.hide_followers_count, do: 0, else: following.follower_count
following_following = if following.hide_follows_count, do: 0, else: following.following_count
%{
stream: [topic],
event: "pleroma:follow_relationships_update",
payload:
%{
state: item.state,
state: state,
follower: %{
id: item.follower.id,
follower_count: item.follower.follower_count,
following_count: item.follower.following_count
id: follower.id,
follower_count: follower.follower_count,
following_count: follower.following_count
},
following: %{
id: item.following.id,
follower_count: item.following.follower_count,
following_count: item.following.following_count
id: following.id,
follower_count: following_followers,
following_count: following_following
}
}
|> Jason.encode!()
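The rewritten clause above zeroes out the followed user's counts according to their privacy settings instead of streaming them unconditionally. A minimal sketch of the resulting `following` sub-payload (the user values are made up for illustration):

```elixir
# Made-up user data illustrating the privacy handling above:
# counts of the followed user are zeroed when hidden, not omitted.
following = %{
  id: "B",
  follower_count: 42,
  following_count: 7,
  hide_followers_count: true,
  hide_follows_count: false
}

follower_count = if following.hide_followers_count, do: 0, else: following.follower_count
following_count = if following.hide_follows_count, do: 0, else: following.following_count

payload = %{id: following.id, follower_count: follower_count, following_count: following_count}
# => %{id: "B", follower_count: 0, following_count: 7}
```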


@ -1,257 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.WebFinger do
alias Pleroma.HTTP
alias Pleroma.User
alias Pleroma.Web.Endpoint
alias Pleroma.Web.Federator.Publisher
alias Pleroma.Web.XML
alias Pleroma.XmlBuilder
require Jason
require Logger
def host_meta do
base_url = Endpoint.url()
{
:XRD,
%{xmlns: "http://docs.oasis-open.org/ns/xri/xrd-1.0"},
{
:Link,
%{
rel: "lrdd",
type: "application/xrd+xml",
template: "#{base_url}/.well-known/webfinger?resource={uri}"
}
}
}
|> XmlBuilder.to_doc()
end
def webfinger(resource, fmt) when fmt in ["XML", "JSON"] do
host = Pleroma.Web.Endpoint.host()
regex =
if webfinger_domain = Pleroma.Config.get([__MODULE__, :domain]) do
~r/(acct:)?(?<username>[a-z0-9A-Z_\.-]+)@(#{host}|#{webfinger_domain})/
else
~r/(acct:)?(?<username>[a-z0-9A-Z_\.-]+)@#{host}/
end
with %{"username" => username} <- Regex.named_captures(regex, resource),
%User{} = user <- User.get_cached_by_nickname(username) do
{:ok, represent_user(user, fmt)}
else
_e ->
with %User{} = user <- User.get_cached_by_ap_id(resource) do
{:ok, represent_user(user, fmt)}
else
_e ->
{:error, "Couldn't find user"}
end
end
end
defp gather_links(%User{} = user) do
[
%{
"rel" => "http://webfinger.net/rel/profile-page",
"type" => "text/html",
"href" => user.ap_id
}
] ++ Publisher.gather_webfinger_links(user)
end
defp gather_aliases(%User{} = user) do
[user.ap_id]
end
def represent_user(user, "JSON") do
%{
"subject" => "acct:#{user.nickname}@#{domain()}",
"aliases" => gather_aliases(user),
"links" => gather_links(user)
}
end
def represent_user(user, "XML") do
aliases =
user
|> gather_aliases()
|> Enum.map(&{:Alias, &1})
links =
gather_links(user)
|> Enum.map(fn link -> {:Link, link} end)
{
:XRD,
%{xmlns: "http://docs.oasis-open.org/ns/xri/xrd-1.0"},
[
{:Subject, "acct:#{user.nickname}@#{domain()}"}
] ++ aliases ++ links
}
|> XmlBuilder.to_doc()
end
def domain do
Pleroma.Config.get([__MODULE__, :domain]) || Pleroma.Web.Endpoint.host()
end
@spec webfinger_from_xml(binary()) :: {:ok, map()} | nil
defp webfinger_from_xml(body) do
with {:ok, doc} <- XML.parse_document(body) do
subject = XML.string_from_xpath("//Subject", doc)
subscribe_address =
~s{//Link[@rel="http://ostatus.org/schema/1.0/subscribe"]/@template}
|> XML.string_from_xpath(doc)
ap_id =
~s{//Link[@rel="self" and @type="application/activity+json"]/@href}
|> XML.string_from_xpath(doc)
data = %{
"subject" => subject,
"subscribe_address" => subscribe_address,
"ap_id" => ap_id
}
{:ok, data}
end
end
defp webfinger_from_json(body) do
with {:ok, doc} <- Jason.decode(body) do
data =
Enum.reduce(doc["links"], %{"subject" => doc["subject"]}, fn link, data ->
case {link["type"], link["rel"]} do
{"application/activity+json", "self"} ->
Map.put(data, "ap_id", link["href"])
{"application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\"", "self"} ->
Map.put(data, "ap_id", link["href"])
{nil, "http://ostatus.org/schema/1.0/subscribe"} ->
Map.put(data, "subscribe_address", link["template"])
_ ->
Logger.debug("Unhandled type: #{inspect(link["type"])}")
data
end
end)
{:ok, data}
end
end
def get_template_from_xml(body) do
xpath = "//Link[@rel='lrdd']/@template"
with {:ok, doc} <- XML.parse_document(body),
template when template != nil <- XML.string_from_xpath(xpath, doc) do
{:ok, template}
end
end
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
def find_lrdd_template(domain) do
@cachex.fetch!(:host_meta_cache, domain, fn _ ->
{:commit, fetch_lrdd_template(domain)}
end)
rescue
e -> {:error, "Cachex error: #{inspect(e)}"}
end
defp fetch_lrdd_template(domain) do
# WebFinger is restricted to HTTPS - https://tools.ietf.org/html/rfc7033#section-9.1
meta_url = "https://#{domain}/.well-known/host-meta"
with {:ok, %{status: status, body: body}} when status in 200..299 <-
HTTP.Backoff.get(meta_url) do
get_template_from_xml(body)
else
error ->
Logger.warning("Can't find LRDD template in #{inspect(meta_url)}: #{inspect(error)}")
{:error, :lrdd_not_found}
end
end
defp get_address_from_domain(domain, "acct:" <> _ = encoded_account) when is_binary(domain) do
case find_lrdd_template(domain) do
{:ok, template} ->
String.replace(template, "{uri}", encoded_account)
_ ->
"https://#{domain}/.well-known/webfinger?resource=#{encoded_account}"
end
end
defp get_address_from_domain(domain, account) when is_binary(domain) do
encoded_account = URI.encode("acct:#{account}")
get_address_from_domain(domain, encoded_account)
end
defp get_address_from_domain(_, _), do: {:error, :webfinger_no_domain}
@spec finger(String.t()) :: {:ok, map()} | {:error, any()}
def finger(account) do
account = String.trim_leading(account, "@")
domain =
with [_name, domain] <- String.split(account, "@") do
domain
else
_e ->
URI.parse(account).host
end
with address when is_binary(address) <- get_address_from_domain(domain, account),
{:ok, %{status: status, body: body, headers: headers}} when status in 200..299 <-
HTTP.Backoff.get(
address,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
case List.keyfind(headers, "content-type", 0) do
{_, content_type} ->
case Plug.Conn.Utils.media_type(content_type) do
{:ok, "application", subtype, _} when subtype in ~w(xrd+xml xml) ->
webfinger_from_xml(body)
{:ok, "application", subtype, _} when subtype in ~w(jrd+json json) ->
webfinger_from_json(body)
_ ->
{:error, {:content_type, content_type}}
end
_ ->
{:error, {:content_type, nil}}
end
|> case do
{:ok, data} -> validate_webfinger(address, data)
error -> error
end
else
error ->
Logger.debug("Couldn't finger #{account}: #{inspect(error)}")
error
end
end
defp validate_webfinger(request_url, %{"subject" => "acct:" <> acct = subject} = data) do
with [_name, acct_host] <- String.split(acct, "@"),
{_, url} <- {:address, get_address_from_domain(acct_host, subject)},
%URI{host: request_host} <- URI.parse(request_url),
%URI{host: acct_host} <- URI.parse(url),
{_, true} <- {:hosts_match, acct_host == request_host} do
{:ok, data}
else
_ -> {:error, {:webfinger_invalid, request_url, data}}
end
end
defp validate_webfinger(url, data), do: {:error, {:webfinger_invalid, url, data}}
end


@ -0,0 +1,349 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# Copyright © 2026 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.WebFinger.Finger do
@moduledoc """
Used to query validated WebFinger data from remote hosts.
Notably this validation includes making sure BOTH the
domain used in the WebFinger handle AND the ActivityPub actor agree
on the final username and domain being associated with the particular actor.
"""
alias Pleroma.HTTP
alias Pleroma.Object.Fetcher
alias Pleroma.Web.XML
require Jason
require Logger
@spec webfinger_from_xml(binary()) :: {:ok, map()} | nil
defp webfinger_from_xml(body) do
with {:ok, doc} <- XML.parse_document(body) do
subject = XML.string_from_xpath("//Subject", doc)
subscribe_address =
~s{//Link[@rel="http://ostatus.org/schema/1.0/subscribe"]/@template}
|> XML.string_from_xpath(doc)
ap_id_compat =
~s{//Link[@rel="self" and @type="application/activity+json"]/@href}
|> XML.string_from_xpath(doc)
ap_id_spec =
~s{//Link[@rel="self" and @type='application/ld+json; profile="https://www.w3.org/ns/activitystreams"']/@href}
|> XML.string_from_xpath(doc)
data = %{
"subject" => subject,
"subscribe_address" => subscribe_address,
"ap_id" => ap_id_spec || ap_id_compat
}
{:ok, data}
else
_ -> {:error, :invalid_xml}
end
end
defp webfinger_from_json(body) do
with {:ok, doc} <- Jason.decode(body) do
data =
Enum.reduce(doc["links"], %{"subject" => doc["subject"]}, fn link, data ->
case {link["type"], link["rel"]} do
{"application/activity+json", "self"} ->
Map.put(data, "ap_id", link["href"])
{"application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\"", "self"} ->
Map.put(data, "ap_id", link["href"])
{nil, "http://ostatus.org/schema/1.0/subscribe"} ->
Map.put(data, "subscribe_address", link["template"])
_ ->
Logger.debug("Unhandled type: #{inspect(link["type"])}")
data
end
end)
{:ok, data}
end
end
# discover webfinger domain delegation
# (does NOT imply the delegated-to domain agrees; only consent of the domain doing the delegation!)
defp get_template_from_xml(body) do
xpath = "//Link[@rel='lrdd']/@template"
with {:ok, doc} <- XML.parse_document(body),
template when template != nil <- XML.string_from_xpath(xpath, doc) do
{:ok, template}
end
end
defp fetch_lrdd_template(domain) do
# WebFinger is restricted to HTTPS - https://tools.ietf.org/html/rfc7033#section-9.1
meta_url = "https://#{domain}/.well-known/host-meta"
with {:ok, %{status: status, body: body}} when status in 200..299 <-
HTTP.Backoff.get(meta_url) do
get_template_from_xml(body)
else
error ->
Logger.warning("Can't find LRDD template in #{inspect(meta_url)}: #{inspect(error)}")
{:error, :lrdd_not_found}
end
end
# public for tests
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
def find_lrdd_template(domain) do
@cachex.fetch!(:host_meta_cache, domain, fn _ ->
{:commit, fetch_lrdd_template(domain)}
end)
rescue
e -> {:error, "Cachex error: #{inspect(e)}"}
end
defp make_finger_uri(domain, resource) do
encoded_resource = URI.encode(resource)
discovered_template = find_lrdd_template(domain)
case discovered_template do
{:ok, template} ->
# RFC 6415 LRDD (Link-based Resource Descriptor Documents) query endpoint
String.replace(template, "{uri}", encoded_resource)
_ ->
# Canonical WebFinger endpoint from its own RFC 7033
"https://#{domain}/.well-known/webfinger?resource=#{encoded_resource}"
end
end
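The two branches above build the final query URI differently; a small sketch with made-up domains shows both paths (the template and domains are hypothetical examples, not values from the source):

```elixir
# Made-up domains showing how the two discovery paths build the query URI.
resource = URI.encode("acct:alice@example.org")

# With a discovered LRDD template (RFC 6415), the placeholder is substituted ...
template = "https://wf.example.org/.well-known/webfinger?resource={uri}"
lrdd_uri = String.replace(template, "{uri}", resource)

# ... otherwise the fixed RFC 7033 endpoint on the handle's own domain is used:
fallback_uri = "https://example.org/.well-known/webfinger?resource=#{resource}"
```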
defp parse_finger_response(%{body: body, headers: headers}) do
case List.keyfind(headers, "content-type", 0) do
{_, content_type} ->
case Plug.Conn.Utils.media_type(content_type) do
{:ok, "application", subtype, _} when subtype in ~w(xrd+xml xml) ->
webfinger_from_xml(body)
{:ok, "application", subtype, _} when subtype in ~w(jrd+json json) ->
webfinger_from_json(body)
_ ->
{:error, {:content_type, content_type}}
end
_ ->
{:error, {:content_type, nil}}
end
end
defp map_fetch_error_reason(%{status: code}) when code in [401, 403], do: :forbidden
defp map_fetch_error_reason(%{status: 404}), do: :not_found
defp map_fetch_error_reason(%{status: 410}), do: :deleted
defp map_fetch_error_reason(%{status: code, headers: headers}) when is_integer(code),
do: {:http_error, code, headers}
defp map_fetch_error_reason(%Tesla.Env{} = env), do: {:http_error, :connect, env}
defp finger_unverified_data(domain, resource) do
query_uri = make_finger_uri(domain, resource)
resp = HTTP.Backoff.get(query_uri, [{"accept", "application/xrd+xml,application/jrd+json"}])
with {:ok, %{url: resolved_uri, status: status} = resp_data} when status in 200..299 <- resp,
{_, {:ok, parsed_data}} <- {:parse, parse_finger_response(resp_data)} do
resolved_domain = URI.parse(resolved_uri).host
{:ok, resolved_domain, parsed_data}
else
{:ok, %Tesla.Env{} = env} -> {:error, map_fetch_error_reason(env)}
{:parse, {:error, _} = error} -> error
{:error, _reason} = e -> e
end
end
defp normalise_webfinger_handle("acct:@" <> handle), do: handle
defp normalise_webfinger_handle("acct:" <> handle), do: handle
defp normalise_webfinger_handle("@" <> handle), do: handle
defp normalise_webfinger_handle(handle) when is_binary(handle), do: handle
defp parse_handle(handle) do
case String.split(handle, "@", parts: 3) do
["", name, domain] -> {name, domain}
[name, domain] -> {name, domain}
[name] -> {name, nil}
_ -> {nil, nil}
end
end
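The normalisation and parsing helpers above are pure string handling; this sketch mirrors their behaviour with standalone anonymous functions (the handles are made-up examples):

```elixir
# Mirrors normalise_webfinger_handle/1: strip optional "acct:" and/or "@" prefixes.
normalise = fn
  "acct:@" <> h -> h
  "acct:" <> h -> h
  "@" <> h -> h
  h when is_binary(h) -> h
end

# Mirrors parse_handle/1: split into {name, domain}, tolerating a leading "@".
parse = fn handle ->
  case String.split(handle, "@", parts: 3) do
    ["", name, domain] -> {name, domain}
    [name, domain] -> {name, domain}
    [name] -> {name, nil}
    _ -> {nil, nil}
  end
end

normalise.("acct:@alice@example.org")  # => "alice@example.org"
parse.("@alice@example.org")           # => {"alice", "example.org"}
parse.("alice")                        # => {"alice", nil}
```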
@doc """
Discovers and verifies the WebFinger handle of an ActivityPub actor for use as a nickname.
If the actor or instance does not use WebFinger, or it is just temporarily unavailable, no value
is returned and it is up to callers to decide on an appropriate fallback or to stop processing.
Returns {:ok, handle} if discovered and successfully verified,
{:ok, nil} if no WebFinger could be discovered but was also not required, and
{:error, reason} if validation failed or a required WebFinger link is missing.
"""
@spec finger_actor(map()) :: {:ok, String.t() | nil} | {:error, any()}
def finger_actor(%{"webfinger" => preferred_handle, "id" => ap_id})
when is_binary(preferred_handle) and is_binary(ap_id) do
# As per FEP-2c59 an "acct:" prefix is discouraged but allowed in the actor property
preferred_handle = normalise_webfinger_handle(preferred_handle)
{_, domain} = parse_handle(preferred_handle)
ap_domain = URI.parse(ap_id).host
with {_, false} <- {:no_domain, domain == nil || ap_domain == nil},
{_, false} <- {:matching_domain, domain == ap_domain},
# We check for an exact match to the preferred handle which will ALWAYS
# belong to the initial query domain, thus we do not need to consider the final domain here.
# If the query domain delegates to another domain via host-meta or HTTP redirects on
# /.well-known/ paths (which ought to be directly controlled by the operator),
# this clearly indicates consent of the query domain to allow the final domain to manage this data
{_, {:ok, _, %{"ap_id" => fingered_ap_id, "subject" => finger_subject}}} <-
{:query, finger_unverified_data(domain, ap_id)},
{_, false} <- {:fingered_data_mismatch, ap_id != fingered_ap_id},
finger_handle <- normalise_webfinger_handle(finger_subject),
{_, false} <- {:fingered_data_mismatch, preferred_handle != finger_handle} do
{:ok, preferred_handle}
else
{:matching_domain, true} ->
{:ok, preferred_handle}
{:query, error} ->
error
{reason, _} ->
{:error, reason}
end
end
def finger_actor(%{"id" => ap_id} = actor_data) when is_binary(ap_id) do
ap_domain = URI.parse(ap_id).host
with {_, false} <- {:no_domain, ap_domain == nil},
{_, {:ok, finger_domain, %{"ap_id" => fingered_ap_id, "subject" => finger_subject}}} <-
{:query, finger_unverified_data(ap_domain, ap_id)},
{_, false} <- {:fingered_data_mismatch, fingered_ap_id != ap_id},
handle <- normalise_webfinger_handle(finger_subject),
{nick_user, nick_domain} <- parse_handle(handle),
# Mastodon in its infinite wisdom encourages setups for custom WebFinger domains,
# such that the actual WebFinger response is _never_ served directly from the domain used in handles.
# Unlike in domain authority checks for AP IDs, here only fixed /.well-known URLs are queried,
# thus a redirect on this endpoint can be considered an approval from the redirecting domain
# (but not the redirected-to domain!) and it should be safe to accept both domain authorities here.
{_, false} <-
{:finger_domain_spoof, nick_domain != finger_domain && nick_domain != ap_domain},
ap_name <- actor_data["preferredUsername"],
{_, false} <- {:fingered_data_mismatch, ap_name != nil && ap_name != nick_user} do
{:ok, handle}
else
{:query, _} ->
# Instance either doesn't use WebFinger or its WebFinger setup is temporarily unreachable.
# This is no error (WebFinger isn't mandatory for AP); we just have no WebFinger handle to report.
{:ok, nil}
{reason, _} ->
{:error, reason}
end
end
# Mastodon does not respond to requests with a leading "@" regardless of whether an additional "acct:" prefix is used.
# While all tested implementations also respond without a leading "acct:",
# including it seems more robust since Mastodon always does so in its own queries.
defp resource_from_mention("@" <> nick), do: "acct:" <> nick
defp resource_from_mention(nick), do: "acct:" <> nick
defp verify_ap_data_from_finger(%{"webfinger" => preferred_handle} = data, finger_handle, _, _) do
if normalise_webfinger_handle(preferred_handle) == finger_handle do
{:ok, finger_handle, data}
else
{:error, :finger_data_mismatch}
end
end
defp verify_ap_data_from_finger(%{"id" => ap_id} = data, handle, finger_domain, finger_name) do
ap_domain = URI.parse(ap_id).host
with {_, false} <- {:domain_mismatch, ap_domain != finger_domain},
ap_name <- data["preferredUsername"],
{_, false} <- {:fingered_nick_mismatch, ap_name != nil && ap_name != finger_name} do
{:ok, handle, data}
else
{:domain_mismatch, true} ->
# Actor has no WebFinger backlink and is from a different domain. We
# need to make sure the actor agrees to be associated with this domain.
# Thus restart querying from the actor data.
case finger_actor(data) do
{:ok, verified_nick} -> {:ok, verified_nick, data}
error -> error
end
{reason, _} ->
{:error, reason}
end
end
@doc """
Resolve a mention handle to unparsed ActivityPub data
whose consistency with the resolved WebFinger handle has been verified.
The final handle may differ from the initially queried handle.
Callers thus MUST use the returned handle for further processing, NOT the initially queried one!
E.g. the AP actor may live on social.example.org and employ a redirect or LRDD template
to allow discovering the preferred handle on example.org. Or the actor changed their
username and queries to the old handle now respond with the updated information to gracefully
transition old references.
"""
@spec finger_mention(String.t()) :: {:ok, String.t(), map()} | {:error, any()}
def finger_mention(mention_handle) when is_binary(mention_handle) do
{qname, qdomain} = parse_handle(mention_handle)
resource = resource_from_mention(mention_handle)
with {_, false} <- {:invalid_handle, qname == nil || qdomain == nil},
{_, {:ok, finger_domain, %{"ap_id" => fingered_ap_id, "subject" => finger_subject}}} <-
{:query, finger_unverified_data(qdomain, resource)},
handle <- normalise_webfinger_handle(finger_subject),
{nick_user, nick_domain} <- parse_handle(handle),
# see comment in finger_actor for why both domains can and need to be accepted
{_, false} <-
{:finger_domain_spoof, nick_domain != finger_domain && nick_domain != qdomain},
{_, {:ok, data}} <-
{:fetch, Fetcher.fetch_and_contain_remote_object_from_id(fingered_ap_id)} do
verify_ap_data_from_finger(data, handle, finger_domain, nick_user)
else
{:query, error} -> error
{:fetch, error} -> error
{reason, _} -> {:error, reason}
end
end
@doc """
Retrieve raw, UNVERIFIED WebFinger data for a resource,
guessing the WebFinger domain from the resource itself.
Only use this when no verification is needed! (E.g. to discover subscription addresses)
"""
@spec finger_raw_data(String.t()) :: {:ok, map()} | {:error, any()}
def finger_raw_data(resource) do
{domain, resource} =
if Regex.match?(~r/^https?:\/\//, resource) do
{URI.parse(resource).host, resource}
else
{_, domain} = parse_handle(resource)
{domain, resource_from_mention(resource)}
end
with {_, domain} when is_binary(domain) <- {:domain, domain},
{:ok, _, data} <- finger_unverified_data(domain, resource) do
{:ok, data}
else
{:domain, _} -> {:error, :no_domain}
error -> error
end
end
end
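A rough usage sketch of the new module's main entry point (the handle and domains are made up; both calls perform network requests, so no concrete results are shown):

```elixir
# Sketch only, assuming a reachable remote instance at example.org.
case Pleroma.Web.WebFinger.Finger.finger_mention("@alice@example.org") do
  {:ok, verified_handle, ap_data} ->
    # verified_handle may differ from the queried one (e.g. after a rename);
    # it, not the input, must be used for further processing.
    {verified_handle, ap_data["id"]}

  {:error, reason} ->
    {:error, reason}
end
```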


@ -0,0 +1,104 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.WebFinger.Schema do
@moduledoc """
Generates WebFinger-related response data for local resources.
"""
alias Pleroma.User
alias Pleroma.Web.Endpoint
alias Pleroma.Web.Federator.Publisher
alias Pleroma.XmlBuilder
require Jason
require Logger
def host_meta do
base_url = Endpoint.url()
{
:XRD,
%{xmlns: "http://docs.oasis-open.org/ns/xri/xrd-1.0"},
{
:Link,
%{
rel: "lrdd",
type: "application/xrd+xml",
template: "#{base_url}/.well-known/webfinger?resource={uri}"
}
}
}
|> XmlBuilder.to_doc()
end
def webfinger(resource, fmt) when fmt in ["XML", "JSON"] do
host = Pleroma.Web.Endpoint.host()
regex =
if webfinger_domain = Pleroma.Config.get([Pleroma.Web.WebFinger, :domain]) do
~r/(acct:)?(?<username>[a-z0-9A-Z_\.-]+)@(#{host}|#{webfinger_domain})/
else
~r/(acct:)?(?<username>[a-z0-9A-Z_\.-]+)@#{host}/
end
with %{"username" => username} <- Regex.named_captures(regex, resource),
%User{} = user <- User.get_cached_by_nickname(username) do
{:ok, represent_user(user, fmt)}
else
_e ->
with %User{} = user <- User.get_cached_by_ap_id(resource),
true <- user.local do
{:ok, represent_user(user, fmt)}
else
_e ->
{:error, "Couldn't find user"}
end
end
end
defp gather_links(%User{} = user) do
[
%{
"rel" => "http://webfinger.net/rel/profile-page",
"type" => "text/html",
"href" => user.ap_id
}
] ++ Publisher.gather_webfinger_links(user)
end
defp gather_aliases(%User{} = user) do
[user.ap_id]
end
defp represent_user(user, "JSON") do
%{
"subject" => "acct:#{user.nickname}@#{domain()}",
"aliases" => gather_aliases(user),
"links" => gather_links(user)
}
end
defp represent_user(user, "XML") do
aliases =
user
|> gather_aliases()
|> Enum.map(&{:Alias, &1})
links =
gather_links(user)
|> Enum.map(fn link -> {:Link, link} end)
{
:XRD,
%{xmlns: "http://docs.oasis-open.org/ns/xri/xrd-1.0"},
[
{:Subject, "acct:#{user.nickname}@#{domain()}"}
] ++ aliases ++ links
}
|> XmlBuilder.to_doc()
end
def domain do
Pleroma.Config.get([Pleroma.Web.WebFinger, :domain]) || Pleroma.Web.Endpoint.host()
end
end


@ -5,13 +5,13 @@
defmodule Pleroma.Web.WebFinger.WebFingerController do
use Pleroma.Web, :controller
alias Pleroma.Web.WebFinger
alias Pleroma.Web.WebFinger.Schema
plug(Pleroma.Web.Plugs.SetFormatPlug)
plug(Pleroma.Web.Plugs.FederatingPlug)
def host_meta(conn, _params) do
xml = WebFinger.host_meta()
xml = Schema.host_meta()
conn
|> put_resp_content_type("application/xrd+xml")
@ -20,7 +20,7 @@ defmodule Pleroma.Web.WebFinger.WebFingerController do
def webfinger(%{assigns: %{format: format}} = conn, %{"resource" => resource})
when format in ["xml", "xrd+xml"] do
with {:ok, response} <- WebFinger.webfinger(resource, "XML") do
with {:ok, response} <- Schema.webfinger(resource, "XML") do
conn
|> put_resp_content_type("application/xrd+xml")
|> send_resp(200, response)
@ -31,7 +31,7 @@ defmodule Pleroma.Web.WebFinger.WebFingerController do
def webfinger(%{assigns: %{format: format}} = conn, %{"resource" => resource})
when format in ["json", "jrd+json"] do
with {:ok, response} <- WebFinger.webfinger(resource, "JSON") do
with {:ok, response} <- Schema.webfinger(resource, "JSON") do
json(conn, response)
else
_e ->

mix.exs

@ -4,8 +4,8 @@ defmodule Pleroma.Mixfile do
def project do
[
app: :pleroma,
version: version("3.17.0"),
elixir: "~> 1.14.1 or ~> 1.15",
version: version("3.18.0"),
elixir: "~> 1.15",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: Mix.compilers(),
xref: [exclude: [:eldap]],
@ -13,7 +13,7 @@ defmodule Pleroma.Mixfile do
aliases: aliases(),
deps: deps(),
test_coverage: [tool: ExCoveralls],
preferred_cli_env: ["coveralls.html": :test, "mneme.test": :test, "mneme.watch": :test],
test_ignore_filters: [~r/_helper.exs$/, ~r/^test\/fixtures\//, ~r/^test\/credo\//],
# Docs
name: "Akkoma",
homepage_url: "https://akkoma.dev/",
@ -43,6 +43,12 @@ defmodule Pleroma.Mixfile do
|> add_listeners(Mix.env())
end
def cli() do
[
preferred_cli_env: ["coveralls.html": :test, "mneme.test": :test, "mneme.watch": :test]
]
end
defp add_listeners(project, :dev), do: Keyword.put(project, :listeners, [Phoenix.CodeReloader])
defp add_listeners(project, _), do: project
@ -137,10 +143,10 @@ defmodule Pleroma.Mixfile do
{:phoenix_html_helpers, "~> 1.0"},
{:calendar, "~> 1.0"},
{:cachex, "~> 4.1"},
{:tesla, "~> 1.7"},
{:tesla, "~> 1.16.0"},
{:castore, "~> 1.0"},
{:cowlib, "~> 2.12"},
{:finch, "~> 0.20.0"},
{:finch, "~> 0.21.0"},
{:jason, "~> 1.4"},
{:trailing_format_plug, "~> 0.0.7"},
{:mogrify, "~> 0.9"},
@ -179,9 +185,7 @@ defmodule Pleroma.Mixfile do
{:concurrent_limiter,
git: "https://akkoma.dev/AkkomaGang/concurrent-limiter.git", branch: "main"},
{:remote_ip, "~> 1.2.0"},
{:captcha,
git: "https://git.pleroma.social/pleroma/elixir-libraries/elixir-captcha.git",
branch: "master"},
{:captcha, git: "https://akkoma.dev/AkkomaGang/elixir-captcha.git", branch: "main"},
{:restarter, path: "./restarter"},
{:majic, git: "https://akkoma.dev/AkkomaGang/majic.git", branch: "main"},
{:eblurhash, "~> 1.2.2"},


@ -7,8 +7,8 @@
"bunt": {:hex, :bunt, "1.0.0", "081c2c665f086849e6d57900292b3a161727ab40431219529f13c4ddcf3e7a44", [:mix], [], "hexpm", "dc5f86aa08a5f6fa6b8096f0735c4e76d54ae5c9fa2c143e5a1fc7c1cd9bb6b5"},
"cachex": {:hex, :cachex, "4.1.1", "574c5cd28473db313a0a76aac8c945fe44191659538ca6a1e8946ec300b1a19f", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:ex_hash_ring, "~> 6.0", [hex: :ex_hash_ring, repo: "hexpm", optional: false]}, {:jumper, "~> 1.0", [hex: :jumper, repo: "hexpm", optional: false]}, {:sleeplocks, "~> 1.1", [hex: :sleeplocks, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm", "d6b7449ff98d6bb92dda58bd4fc3189cae9f99e7042054d669596f56dc503cd8"},
"calendar": {:hex, :calendar, "1.0.0", "f52073a708528482ec33d0a171954ca610fe2bd28f1e871f247dc7f1565fa807", [:mix], [{:tzdata, "~> 0.1.201603 or ~> 0.5.20 or ~> 1.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "990e9581920c82912a5ee50e62ff5ef96da6b15949a2ee4734f935fdef0f0a6f"},
"captcha": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/elixir-captcha.git", "784605815756bbc1d7e313ff552840afb62e2c41", [branch: "master"]},
"castore": {:hex, :castore, "1.0.16", "8a4f9a7c8b81cda88231a08fe69e3254f16833053b23fa63274b05cbc61d2a1e", [:mix], [], "hexpm", "33689203a0eaaf02fcd0e86eadfbcf1bd636100455350592e7e2628564022aaf"},
"captcha": {:git, "https://akkoma.dev/AkkomaGang/elixir-captcha.git", "784605815756bbc1d7e313ff552840afb62e2c41", [branch: "main"]},
"castore": {:hex, :castore, "1.0.17", "4f9770d2d45fbd91dcf6bd404cf64e7e58fed04fadda0923dc32acca0badffa2", [:mix], [], "hexpm", "12d24b9d80b910dd3953e165636d68f147a31db945d2dcb9365e441f8b5351e5"},
"certifi": {:hex, :certifi, "2.15.0", "0e6e882fcdaaa0a5a9f2b3db55b1394dba07e8d6d9bcad08318fb604c6839712", [:rebar3], [], "hexpm", "b147ed22ce71d72eafdad94f055165c1c182f61a2ff49df28bcc71d1d5b94a60"},
"combine": {:hex, :combine, "0.10.0", "eff8224eeb56498a2af13011d142c5e7997a80c8f5b97c499f84c841032e429f", [:mix], [], "hexpm", "1b1dbc1790073076580d0d1d64e42eae2366583e7aecd455d1215b0d16f2451b"},
"comeonin": {:hex, :comeonin, "5.5.1", "5113e5f3800799787de08a6e0db307133850e635d34e9fab23c70b6501669510", [:mix], [], "hexpm", "65aac8f19938145377cee73973f192c5645873dcf550a8a6b18187d17c13ccdb"},
@ -50,7 +50,7 @@
"fast_sanitize": {:hex, :fast_sanitize, "0.2.3", "67b93dfb34e302bef49fec3aaab74951e0f0602fd9fa99085987af05bd91c7a5", [:mix], [{:fast_html, "~> 2.0", [hex: :fast_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "e8ad286d10d0386e15d67d0ee125245ebcfbc7d7290b08712ba9013c8c5e56e2"},
"file_ex": {:git, "https://akkoma.dev/AkkomaGang/file_ex.git", "cc7067c7d446c2526e9ecf91d40896b088851569", [ref: "cc7067c7d446c2526e9ecf91d40896b088851569"]},
"file_system": {:hex, :file_system, "1.1.1", "31864f4685b0148f25bd3fbef2b1228457c0c89024ad67f7a81a3ffbc0bbad3a", [:mix], [], "hexpm", "7a15ff97dfe526aeefb090a7a9d3d03aa907e100e262a0f8f7746b78f8f87a5d"},
"finch": {:hex, :finch, "0.20.0", "5330aefb6b010f424dcbbc4615d914e9e3deae40095e73ab0c1bb0968933cadf", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.6.2 or ~> 1.7", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 1.1", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "2658131a74d051aabfcba936093c903b8e89da9a1b63e430bee62045fa9b2ee2"},
"finch": {:hex, :finch, "0.21.0", "b1c3b2d48af02d0c66d2a9ebfb5622be5c5ecd62937cf79a88a7f98d48a8290c", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.6.2 or ~> 1.7", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 1.1", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "87dc6e169794cb2570f75841a19da99cfde834249568f2a5b121b809588a4377"},
"flake_id": {:git, "https://akkoma.dev/AkkomaGang/flake_id.git", "5a68513f7e7353706e788781eff6e56bf00bb41b", [branch: "main"]},
"floki": {:hex, :floki, "0.38.0", "62b642386fa3f2f90713f6e231da0fa3256e41ef1089f83b6ceac7a3fd3abf33", [:mix], [], "hexpm", "a5943ee91e93fb2d635b612caf5508e36d37548e84928463ef9dd986f0d1abd9"},
"gen_smtp": {:hex, :gen_smtp, "1.3.0", "62c3d91f0dcf6ce9db71bcb6881d7ad0d1d834c7f38c13fa8e952f4104a8442e", [:rebar3], [{:ranch, ">= 1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "0b73fbf069864ecbce02fe653b16d3f35fd889d0fdd4e14527675565c39d84e6"},
@@ -134,7 +134,7 @@
"telemetry_metrics_prometheus_core": {:hex, :telemetry_metrics_prometheus_core, "1.2.1", "c9755987d7b959b557084e6990990cb96a50d6482c683fb9622a63837f3cd3d8", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "5e2c599da4983c4f88a33e9571f1458bf98b0cf6ba930f1dc3a6e8cf45d5afb6"},
"telemetry_poller": {:hex, :telemetry_poller, "1.3.0", "d5c46420126b5ac2d72bc6580fb4f537d35e851cc0f8dbd571acf6d6e10f5ec7", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "51f18bed7128544a50f75897db9974436ea9bfba560420b646af27a9a9b35211"},
"temple": {:git, "https://akkoma.dev/AkkomaGang/temple.git", "066a699ade472d8fa42a9d730b29a61af9bc8b59", [ref: "066a699ade472d8fa42a9d730b29a61af9bc8b59"]},
"tesla": {:hex, :tesla, "1.15.3", "3a2b5c37f09629b8dcf5d028fbafc9143c0099753559d7fe567eaabfbd9b8663", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.21", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:mox, "~> 1.0", [hex: :mox, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "98bb3d4558abc67b92fb7be4cd31bb57ca8d80792de26870d362974b58caeda7"},
"tesla": {:hex, :tesla, "1.16.0", "de77d083aea08ebd1982600693ff5d779d68a4bb835d136a0394b08f69714660", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.21", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:mox, "~> 1.0", [hex: :mox, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "eb3bdfc0c6c8a23b4e3d86558e812e3577acff1cb4acb6cfe2da1985a1035b89"},
"text_diff": {:hex, :text_diff, "0.1.0", "1caf3175e11a53a9a139bc9339bd607c47b9e376b073d4571c031913317fecaa", [:mix], [], "hexpm", "d1ffaaecab338e49357b6daa82e435f877e0649041ace7755583a0ea3362dbd7"},
"timex": {:hex, :timex, "3.7.13", "0688ce11950f5b65e154e42b47bf67b15d3bc0e0c3def62199991b8a8079a1e2", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.26", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 1.1", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "09588e0522669328e973b8b4fd8741246321b3f0d32735b589f78b136e6d4c54"},
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "bd4fde4c15f3e993a999e019d64347489b91b7a9096af68b2bdadd192afa693f"},


@@ -19,7 +19,11 @@ defmodule Pleroma.Repo.Migrations.RestrictEligibleUserIndexes do
# Just used to quickly retrieve all suggested users
# (this perhaps should have been a separate table to begin with).
# This MUST use BTREE, a HASH index will not be used when querying all suggested users!
index(:users, [:id], where: "is_suggested", using: :btree, name: :users_where_suggested_index),
index(:users, [:id],
where: "is_suggested",
using: :btree,
name: :users_where_suggested_index
),
# Only _local_ users can be admins or moderators and in practice
# this criterion is only used to query for a "true" setting.
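
The partial-index hunk above translates to roughly the following SQL; this is an illustrative sketch of what the Ecto `index/3` call generates, not part of the diff:

```sql
-- Rough SQL equivalent of the migration's index definition
-- (table, column, and index names taken from the hunk above).
-- A HASH index cannot serve a "scan all matching rows" query,
-- which is why the comment insists on BTREE for the partial index.
CREATE INDEX users_where_suggested_index
    ON users USING btree (id)
    WHERE is_suggested;
```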


@@ -0,0 +1,34 @@
defmodule Pleroma.Repo.Migrations.SortActivityContextIndex do
use Ecto.Migration
# defined in 20170912114248_add_context_index.exs
@old_idx index(
:activities,
["(data->>'type')", "(data->>'context')"],
name: :activities_context_index
)
# The index is only used in fetch_activities_for_context_query which
# is always restricted to Creates and sorted by id (rev. chronologically)
@new_idx index(
:activities,
["(data->>'context')", "id DESC"],
where: "(data->>'type') = 'Create'",
name: :activities_context_index
)
def up() do
drop_if_exists(@old_idx)
create_if_not_exists(@new_idx)
flush()
# ensure planner immediately picks up new index for subsequent migrations
execute("ANALYZE activities;")
end
def down() do
drop_if_exists(@new_idx)
create_if_not_exists(@old_idx)
end
end
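
The replacement index is shaped to match the access pattern described in the comment above: `fetch_activities_for_context_query` filters to `Create` activities in one context and sorts by id descending. A hypothetical query of that shape, which the new partial index can serve directly, would look like:

```sql
-- Illustrative query shape only; the $1 placeholder and SELECT list
-- are assumptions, but the filter and sort mirror the migration's
-- WHERE clause and index columns.
SELECT *
FROM activities
WHERE data->>'type' = 'Create'
  AND data->>'context' = $1
ORDER BY id DESC;
```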


@@ -0,0 +1,20 @@
defmodule Pleroma.Repo.Migrations.DeleteeRemoteParticipations do
use Ecto.Migration
def up() do
execute """
DELETE FROM conversation_participations AS p
USING users AS u
WHERE p.user_id = u.id
AND NOT u.local
;
"""
end
def down() do
# not reversible, but these rows never made sense
# and the only thing "relying" on them
# was broken and nonsensical anyway
:ok
end
end
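
The `DELETE ... USING` form above joins the target table against `users` to find rows to remove. A sketch of an equivalent formulation with a correlated subquery, for readers less familiar with PostgreSQL's `USING` extension:

```sql
-- Equivalent to the migration's DELETE ... USING join (sketch):
-- remove participations belonging to non-local users.
DELETE FROM conversation_participations AS p
WHERE EXISTS (
    SELECT 1
    FROM users AS u
    WHERE u.id = p.user_id
      AND NOT u.local
);
```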


@@ -0,0 +1,77 @@
defmodule Pleroma.Repo.Migrations.ConversationParticipationLastStatusId do
use Ecto.Migration
# defined in 20190410152859_add_participation_updated_at_index.exs
@old_sort_idx index(:conversation_participations, ["updated_at desc"])
# requires new column to be created and filled first
# (the column is not nullable, but ordering is done by the pagination helper, which always uses NULLS LAST
# and currently postgres isn't smart enough to use a simple DESC index anyway)
@new_sort_idx index(:conversation_participations, [:user_id, "last_bump DESC NULLS LAST"])
def up() do
drop_if_exists(@old_sort_idx)
# create new col, temporarily nullable
# Do NOT use foreign-key constraint, we don't care if the message is deleted
# nor do we use it for joins. This is just a funny timestamp
alter table(:conversation_participations) do
add :last_bump, :uuid, null: true
end
flush()
# fill in data
execute """
UPDATE conversation_participations AS p
SET last_bump = (
SELECT a.id
FROM activities AS a
WHERE a.data->>'context' = c.ap_id AND
a.data->>'type' = 'Create' AND
(u.ap_id = ANY(a.recipients) OR u.ap_id = a.actor) AND
activity_visibility(a.actor, a.recipients, a.data) = 'direct'
ORDER BY a.id DESC