Compare commits

...

116 Commits

Author SHA1 Message Date
Floatingghost f614bf2725 Merge branch 'develop' into stable
2024-04-27 15:08:35 +01:00
Floatingghost 7038b60ab5 bump version
2024-04-27 15:08:21 +01:00
Floatingghost 828158ef49 Merge remote-tracking branch 'oneric/fedfix-public-ld' into develop
2024-04-26 18:49:31 +01:00
Floatingghost c7276713e0 Merge remote-tracking branch 'oneric/changelog-3.13' into develop
2024-04-26 18:43:39 +01:00
floatingghost 310c1b7e24 Merge pull request 'Change nginx cache size to 1 GiB' (#759) from norm/akkoma:nginx-cache-size into develop
Reviewed-on: #759
2024-04-26 17:40:23 +00:00
floatingghost 7da6f41718 Merge pull request 'Exiftool: Strip all non-essential metadata tags' (#745) from Oneric/akkoma:exiftool-strip-all into develop
Reviewed-on: #745
2024-04-26 17:38:47 +00:00
floatingghost 53c67993bb Merge pull request 'Remove unused top level files' (#760) from norm/akkoma:remove-unused-files into develop
Reviewed-on: #760
2024-04-26 17:24:21 +00:00
Oneric 5bc64c5753 changelog: add note about StripMetadata and ReadDescription order
2024-04-26 18:57:28 +02:00
Oneric 5ee0fb18cb exiftool: make stripped tags configurable 2024-04-26 18:57:24 +02:00
Norm 5b320616ca Remove unused top level files
I don't think anyone really uses the tools that use these files these
days, and they are another thing that needs to be updated every so
often.
2024-04-26 02:33:18 -04:00
Norm 72c2d9f009 Change nginx cache size to 1 GiB
The current 10 GiB cache size is too large to fit into tmpfs for VMs and
other machines with smaller RAM sizes. Most non-Debian distros mount
/tmp on tmpfs.
2024-04-26 01:43:44 -04:00
Oneric 12db5c23f2 Add missing changelog entries
2024-04-26 00:51:45 +02:00
Oneric a95af3ee4c exiftool: strip all non-essential tags
Documentation was already clear on this only stripping GPS tags.
But there are more potentially sensitive metadata tags (e.g. author
and possibly description) and the name alone suggests a broader effect.

Thus change the filter to strip all metadata except for colourspace info
and orientation (technically it strips everything and then re-adds
selected tags).

Explicitly stripping CommonIFD0 is needed since -all does not modify
IFD0 due to TIFF storing some actual image data there. CommonIFD0 then
strips a bunch of commonly used actual metadata tags from IFD0, to my
understanding leaving TIFF image data and custom metadata tags intact.
2024-04-25 23:00:42 +02:00
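The purge/preserve behaviour this introduces is configurable (see the config description in the diff further down). A minimal sketch of such a configuration, using the option values shown there for illustration rather than actual defaults:

```elixir
# Sketch only: strip everything, including the CommonIFD0 group,
# but keep colourspace information and orientation.
config :pleroma, Pleroma.Upload.Filter.Exiftool.StripMetadata,
  purge: ["all", "CommonIFD0"],
  preserve: ["ColorSpaces", "Orientation"]
```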
Oneric 163cb1d5e0 exiftool: strip JXL and HEIC
As of exiftool 12.57 both formats are supported, but EXIF data is
optional for JXL and if exiftool doesn’t find a preexisting metadata
chunk it will create one and treat it as a minor error resulting in
a non-zero exit code.
Setting -ignoreMinorErrors avoids failing on such uploads.
2024-04-25 23:00:42 +02:00
Oneric 24e608ab5b docs: fix typo 2024-04-25 23:00:42 +02:00
Oneric b0a46c1e2e Normalise public addressing to fix federation
Due to JSON-LD compaction the full address of public scope
may also occur in shorter forms and the spec requires us to treat them
all equivalently. To save us the pain of repeatedly checking for all
variants internally, normalise inbound data to just one form.
See note at: https://www.w3.org/TR/activitypub/#public-addressing

This needs to happen very early, even before the other addressing fixes,
else an earlier validator will reject the object. This in turn required
moving the list-type normalisation earlier as well, but since I was
unsure about putting empty lists into the data when no such field
existed before, I excluded this case and thus the later fixing had to be
kept as well.

Fixes: #670
2024-04-25 18:45:16 +02:00
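The spec note lists "Public", "as:Public" and the full IRI as equivalent spellings. A hedged sketch of the normalisation idea; the helper names and data shapes here are assumptions, not Akkoma's actual code:

```elixir
@public "https://www.w3.org/ns/activitystreams#Public"

# JSON-LD compaction may shorten the public IRI; map all variants back.
defp normalize_entry(entry) when entry in ["Public", "as:Public", @public], do: @public
defp normalize_entry(entry), do: entry

def normalize_addressing(%{} = data) do
  Enum.reduce(["to", "cc", "bto", "bcc"], data, fn field, acc ->
    case acc do
      # Only touch fields that already exist; don't inject empty lists
      # (matching the caution described in the commit message).
      %{^field => value} ->
        Map.put(acc, field, value |> List.wrap() |> Enum.map(&normalize_entry/1))

      _ ->
        acc
    end
  end)
end
```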
floatingghost b1c6621e66 Merge pull request 'Read image description from EXIF data' (#744) from timorl/akkoma:elseinspe into develop
Reviewed-on: #744
2024-04-25 12:52:31 +00:00
floatingghost 764dbeded4 Merge pull request 'Accept all standard actor types' (#751) from Oneric/akkoma:all-actor-types into develop
Reviewed-on: #751
2024-04-24 17:09:02 +00:00
floatingghost 06847ca5f8 Merge pull request 'Update nginx config and install docs to use certbot's nginx plugin' (#752) from norm/akkoma:docs-nginx-certbot into develop
Reviewed-on: #752
2024-04-24 17:08:39 +00:00
floatingghost 80e1c094c7 Merge pull request 'Don't strip newlines in pre' (#709) from snan/akkoma:pre into develop
Reviewed-on: #709
2024-04-24 17:00:34 +00:00
floatingghost 4a0e90e8a8 Merge pull request 'ReceiverWorker: Make sure non-{:ok, _} is returned as {:error, …}' (#753) from Oneric/akkoma:receive-worker-return into develop
Reviewed-on: #753
2024-04-24 17:00:18 +00:00
floatingghost 1e48a37545 Merge pull request 'Remove unused AP C2S endpoints' (#749) from who-wants-to-yeet-c2s-i-want-to-yeet-c2s into develop
Reviewed-on: #749
2024-04-24 16:59:58 +00:00
Oneric 83f75c3e93 Accept all standard actor types
2024-04-23 18:14:34 +02:00
floatingghost 7d89dba528 Merge pull request 'Fix flaky expires_at tests' (#754) from Oneric/akkoma:test-flaky-expires_at into develop
Reviewed-on: #754
2024-04-23 15:14:21 +00:00
Floatingghost 92168fa5a1 Merge remote-tracking branch 'origin/develop' into who-wants-to-yeet-c2s-i-want-to-yeet-c2s
2024-04-23 14:37:05 +01:00
Floatingghost 3e199242b0 remove upload_media from AP representation
2024-04-23 14:35:52 +01:00
Norm 0fa3fbf55e Update OTP install docs to use certbot nginx plugin
2024-04-23 00:02:54 -04:00
Norm e5f4282cca Update certbot instructions for Alpine Linux 2024-04-23 00:02:54 -04:00
Norm cdde95ad8b Update gentoo install guide to use certbot-nginx 2024-04-23 00:02:54 -04:00
Norm c493769364 Update Nginx setup docs for Fedora and Red Hat OTP 2024-04-23 00:02:15 -04:00
Norm 39b8e73532 Update docs for Arch Linux nginx setup
Alongside moving to certbot's nginx plugin, also use conf.d instead of
recreating the sites-{available,enabled} setup that Debian/Ubuntu uses.

Furthermore, also request a certificate for the media domain at the same
time since that's now required.
2024-04-21 18:19:07 -04:00
Norm 5405828ab1 Update debian install docs to use certbot nginx plugin 2024-04-21 18:19:07 -04:00
Norm 3e9643b172 Update nginx config for Certbot's nginx plugin 2024-04-21 18:19:01 -04:00
Oneric 20c22eb159 Fix flaky expires_at tests
The API parameter is not a timestamp but an offset.
If enough time passes between the test's expires_at calculation and the
internal calculation during processing of the request, the strict
equality assertion fails (either via a direct assertion or indirectly
via job lookup).

To avoid this, lower the comparison granularity.
2024-04-21 21:08:53 +00:00
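A hedged sketch of what such a granularity change could look like in a test (variable names assumed):

```elixir
# Strict equality races against wall-clock time between the two
# calculations; compare within a tolerance instead.
expected = DateTime.add(DateTime.utc_now(), expires_in, :second)
assert abs(DateTime.diff(job.scheduled_at, expected, :second)) <= 60
```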
Haelwenn (lanodan) Monnier 0c2f200b4d ReceiverWorker: Make sure non-{:ok, _} is returned as {:error, …}
Otherwise an error like `{:signature, {:error, {:error, :not_found}}}`
ends up considered a success.

Cherry-picked-from: a299ddb10e
2024-04-21 20:58:06 +02:00
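A hedged sketch of the wrapping described; `process_incoming/1` stands in for the worker's actual processing:

```elixir
def perform(%Oban.Job{args: args}) do
  case process_incoming(args) do
    {:ok, _} = ok -> ok
    {:error, _} = err -> err
    # e.g. {:signature, {:error, {:error, :not_found}}} must not
    # be reported to Oban as a success.
    other -> {:error, other}
  end
end
```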
timorl 3f54945033
Fix the one test that wasn't just being flaky
2024-04-21 19:43:26 +02:00
timorl 09d3ccf770
Read description before stripping metadata
2024-04-19 20:51:54 +02:00
timorl 9da0fe930e
Format, but this time with a non-ancient version of elixir 2024-04-19 18:07:50 +02:00
timorl 2a9db73b4c
Merge branch 'develop' into elseinspe 2024-04-19 17:11:55 +02:00
floatingghost 0fee71f58f Merge pull request 'Handle failed fetches a bit better' (#743) from failed-fetch-processing into develop
Reviewed-on: #743
2024-04-19 11:25:14 +00:00
Floatingghost 370576474c only consider :op and :id args in duplicate checks
2024-04-19 11:39:27 +01:00
Floatingghost 1ed975636b Keep READ endpoints, purge WRITE
2024-04-19 11:06:01 +01:00
timorl cd7af81896
Rename StripLocation to StripMetadata for temporal-proofing reasons
2024-04-16 20:37:00 +02:00
Floatingghost 2c7e5b2287 changelog entry
2024-04-16 13:57:05 +01:00
Floatingghost ddb8a5ef73 yeet AP C2S support
literally nothing uses C2S AP, and it's another route into core
systems which requires analysis and maintenance. A second API
is just extra surface for potentially bad things so let's take
it out back and obliterate it
2024-04-16 13:55:03 +01:00
Floatingghost 123db1abc4 Merge branch 'develop' into failed-fetch-processing
2024-04-16 12:35:54 +01:00
Floatingghost b2c29527fb make xmerl shut up about markup
2024-04-16 10:19:30 +01:00
timorl 59d32c10d9
Formatting 2024-04-16 08:02:13 +02:00
Floatingghost d2cee15c15 mix format says no
2024-04-16 03:07:28 +01:00
Floatingghost d70fa16383 oban options should be a keyword list
2024-04-16 02:58:50 +01:00
Floatingghost 5043571084 Enable oban job uniqueness
by default just prevent job floods with a 1-second
uniqueness check, but override in RemoteFetcherWorker
for a 5-minute uniqueness check over all states

:infinity is an option we can go for maybe at some point,
but that would prevent any refetches so maybe not idk.
2024-04-16 02:53:24 +01:00
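A hedged sketch using Oban's standard `unique` worker option with the values from the message above; the module and queue names are assumptions:

```elixir
defmodule Pleroma.Workers.RemoteFetcherWorker do
  # 5-minute uniqueness across all job states for remote fetches;
  # other workers would use a short period such as 1 second.
  use Oban.Worker,
    queue: :remote_fetcher,
    unique: [period: 300, states: Oban.Job.states()]

  @impl Oban.Worker
  def perform(%Oban.Job{args: _args}), do: :ok
end
```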
Floatingghost 1896ff1ab0 changelog entry
2024-04-16 02:35:59 +01:00
Floatingghost b7dd739de1 Make sure we return the right format for oban 2024-04-16 02:35:21 +01:00
timorl b144218dce
Merge branch 'develop' into elseinspe
2024-04-14 20:31:33 +02:00
Floatingghost 2fc25980d1 fix pattern matching in fetch errors
2024-04-13 23:55:26 +01:00
floatingghost c1f0b6b875 Merge pull request 'Accept body parameters for /api/pleroma/notification_settings' (#738) from Oneric/akkoma:notif-setting-parameters into develop
Reviewed-on: #738
2024-04-13 22:55:02 +00:00
Floatingghost 18442dcc7e Fix quote test 2024-04-13 23:05:52 +01:00
Floatingghost 33fb74043d Bring our adjustments into line with atom-failure 2024-04-13 22:56:04 +01:00
Floatingghost 49ed27cd96 require logger 2024-04-13 22:25:31 +01:00
Floatingghost 7f6e35ece4 formatting 2024-04-12 20:33:33 +01:00
Mark Felder 2e369aef71 Allow the Remote Fetcher to attempt fetching an unreachable instance 2024-04-12 20:33:21 +01:00
Mark Felder fed7a78c77 Oban jobs should be discarded on permanent errors 2024-04-12 20:33:17 +01:00
Mark Felder c0532bcae0 Handle 401s as I have observed it in the wild 2024-04-12 20:33:11 +01:00
Mark Felder f31b262aec Improve test descriptions 2024-04-12 20:32:38 +01:00
Mark Felder ff515c05c3 Prevent requeuing Remote Fetcher jobs that exceed thread depth 2024-04-12 20:32:31 +01:00
Mark Felder 7e5004b3e2 Leverage existing atoms as return errors for the object fetcher 2024-04-12 20:32:13 +01:00
Mark Felder 53a9413b95 Formatting 2024-04-12 20:31:40 +01:00
Mark Felder d69cba1b93 Remove duplicate log messages from Transmogrifier
Object fetch errors are logged in the fetcher module
2024-04-12 20:31:31 +01:00
Mark Felder 3c54f407c5 Consolidate log messages for object fetcher failures and leverage Logger.metadata 2024-04-12 20:30:38 +01:00
Mark Felder 825ae46bfa Set Logger level to error 2024-04-12 20:29:33 +01:00
Mark Felder 331710b6bb RemoteFetcherWorker Oban job tests 2024-04-12 20:29:28 +01:00
Mark Felder eeed051a0f Fix detection of user follower collection being private
We were overzealous with matching on a raw error from the object fetch that should have never been relied on like this. If we can't fetch successfully we should assume that the collection is private.

Building a more expressive and universal error struct to match on may be something to consider.
2024-04-12 20:29:11 +01:00
Mark Felder 30d63aaa6e Revert "Mark instances as unreachable when returning a 403 from an object fetch"
This reverts commit d472bafec19cee269e7c943bafae7c805785acd7.
2024-04-12 20:28:56 +01:00
Mark Felder e2b04fac5a Skip remote fetch jobs for unreachable instances 2024-04-12 20:28:36 +01:00
Mark Felder 6d368808d3 Remove mistaken duplicate fetch 2024-04-12 20:28:31 +01:00
Mark Felder 160d113b30 Changelogs 2024-04-12 20:28:26 +01:00
Mark Felder 132036f951 Cancel remote fetch jobs for deleted objects 2024-04-12 20:28:21 +01:00
Mark Felder 4ff22a409a Consolidate the HTTP status code checking into the private get_object/1 2024-04-12 20:28:16 +01:00
Mark Felder 4c29366fe5 Mark instances as unreachable when returning a 403 from an object fetch
This is a definite sign the instance is blocked and they are enforcing authorized_fetch
2024-04-12 20:27:33 +01:00
Mark Felder ac4cc619ea Fix Transmogrifier tests
These tests relied on the removed Fetcher.fetch_object_from_id!/2 function injecting the error tuple into a log message with the exact words "Object containment failed."

We will keep this behavior by generating a similar log message, but perhaps this should do a better job of matching on the error tuple returned by Transmogrifier.handle_incoming/1
2024-04-12 20:26:56 +01:00
Mark Felder c241b5b09f Remove Fetcher.fetch_object_from_id!/2
It was only being called once and can be replaced with a case statement.
2024-04-12 20:26:28 +01:00
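A hedged sketch of the case statement that could replace the bang variant; the old function's exact return convention and the logging shape are assumed here:

```elixir
case Fetcher.fetch_object_from_id(id) do
  {:ok, object} ->
    object

  {:error, reason} ->
    Logger.error("Object fetch failed for #{id}: #{inspect(reason)}")
    nil
end
```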
Floatingghost f8a53fbe2f bump dependencies
2024-04-12 19:59:30 +01:00
floatingghost e36c0f96fc Merge pull request 'Add docker override file to docs and gitignore' (#621) from norm/akkoma:docker-compose-override into develop
Reviewed-on: #621
2024-04-12 18:50:25 +00:00
floatingghost 6f3c955aa0 Merge pull request 'elixir1.16 testing' (#742) from elixir1.16 into develop
Reviewed-on: #742
2024-04-12 18:49:33 +00:00
floatingghost 024ffadd80 Merge pull request 'Don't list old accounts as aliases in WebFinger' (#713) from erincandescent/akkoma:no-old-account-alias into develop
Reviewed-on: #713
2024-04-12 18:34:14 +00:00
floatingghost e2e4f53585 Merge pull request 'Use standard-compliant Accept header when fetching' (#740) from Oneric/akkoma:fetch_std-accept-hdr into develop
Reviewed-on: #740
2024-04-12 18:28:26 +00:00
Floatingghost d910e8d7d1 Add test suite for elixir1.16
2024-04-12 19:13:33 +01:00
Floatingghost df25d86999 Cleaned up FEP-fffd commits a bit
2024-04-12 18:50:57 +01:00
floatingghost 4887df12d7 Merge pull request 'Allow for url to be a list' (#718) from helge/akkoma:develop into develop
Reviewed-on: #718
2024-04-12 17:39:38 +00:00
floatingghost e6ca2b4d2a Merge pull request 'Fix array-less EmojiReacts' (#739) from Oneric/akkoma:tag-arrayless into develop
Reviewed-on: #739
2024-04-12 17:26:07 +00:00
floatingghost 6ba80aaff5 Merge pull request 'Check if data is visible before embedding it in OG tags' (#741) from ograph-restrictions into develop
Reviewed-on: #741
2024-04-12 17:22:59 +00:00
floatingghost 8e60177466 Merge pull request 'MRF.InlineQuotePolicy: Add link to post URL, not ID' (#733) from erincandescent/akkoma:quote-url into develop
Reviewed-on: #733
2024-04-12 17:02:52 +00:00
Erin Shepherd 75d9e2b375 MRF.InlineQuotePolicy: Add link to post URL, not ID
"id" is used for the canonical link to the AS2 representation of an object.
"url" is typically used for the canonical link to the HTTP representation.
It is what we use, for example, when following the "external source" link
in the frontend. However, it's not the link we include in the post contents
for quote posts.

Using URL instead means we include a more user-friendly URL for Mastodon,
and a working (in the browser) URL for Threads
2024-04-12 13:23:50 +02:00
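A hedged sketch of the preference order (function name assumed; a compacted `url` may also arrive as a list, per the "Allow for url to be a list" change further down):

```elixir
# Prefer the human-facing "url"; fall back to the AS2 "id".
defp quote_link(%{"url" => url}) when is_binary(url), do: url
defp quote_link(%{"url" => [url | _]}) when is_binary(url), do: url
defp quote_link(%{"id" => id}), do: id
```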
Floatingghost 05f8179d08 check if data is visible before embedding it in OG tags
previously we would uncritically take data and format it into
tags for static-fe and the like - however, instances can be
configured to disallow unauthenticated access to these resources.

this means that OG tags can act as a vector for information leakage.

_technically_ this should only occur if you have both
restrict_unauthenticated *AND* you run static-fe, which makes no
sense since static-fe is for unauthenticated people in particular,
but hey ho.
2024-04-12 05:16:47 +01:00
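A hedged sketch of the guard; `visible_for_user?/2` and `build_tags/1` are illustrative names for the visibility check and tag rendering:

```elixir
# Render OG tags only when the activity is visible without
# authentication, i.e. visible to the nil ("anonymous") user.
defp maybe_build_tags(activity) do
  if Visibility.visible_for_user?(activity, nil) do
    build_tags(activity)
  else
    []
  end
end
```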
Oneric fae0a14ee8 Use standard-compliant Accept header when fetching
Spec says clients MUST use this header and servers MUST respond to it,
while servers merely SHOULD respond to the one we used before.
https://www.w3.org/TR/activitypub/#retrieving-objects

The old value is kept as a fallback since at least two years ago
not every implementation correctly dealt with the spec-compliant
variant, see: https://github.com/owncast/owncast/issues/1827

Fixes: #730
2024-04-12 00:22:37 +02:00
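The media types come straight from the spec; assembling the header might look like this (variable names illustrative):

```elixir
# MUST-level type first, with the older SHOULD-level type kept as a
# fallback for implementations that choke on the profile parameter.
accept =
  ~s(application/ld+json; profile="https://www.w3.org/ns/activitystreams") <>
    ", application/activity+json"

headers = [{"accept", accept}]
```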
Floatingghost 1135935cbe Merge remote-tracking branch 'oneric/ipv6' into develop
2024-04-11 20:59:49 +01:00
floatingghost 090a77d1af Merge pull request 'static-fe: don’t squeeze non-square images' (#705) from Oneric/akkoma:staticfe-nonsquare-img into develop
Reviewed-on: #705
2024-04-11 18:43:03 +00:00
floatingghost 0e066bddae Merge pull request 'Drop base_url special casing in test env' (#737) from Oneric/akkoma:testenv_drop_baseurl_specialcase into develop
Reviewed-on: #737
2024-04-11 18:24:09 +00:00
Oneric bd74ad9ce4 Accept body parameters for /api/pleroma/notification_settings
This brings it in line with its documentation and akkoma-fe’s
expectations. For backwards compatibility, URL parameters are still
accepted with lower priority. Unfortunately this means duplicating
parameters and descriptions in the API spec.

Usually Plug already pre-merges parameters from different sources into
the plain 'params' parameter which then gets forwarded by Phoenix.
However, OpenApiSpex 3.x prevents this; 4.x is set to change this
  https://github.com/open-api-spex/open_api_spex/issues/334
  https://github.com/open-api-spex/open_api_spex/issues/92

Fixes: #691
Fixes: #722
2024-04-09 04:11:28 +02:00
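A hedged sketch of accepting both sources with the body taking priority; the controller and helper names are assumptions:

```elixir
def update_notification_settings(%{body_params: body_params} = conn, url_params) do
  # Body parameters win; URL parameters stay accepted for compatibility.
  params = Map.merge(url_params, body_params)
  do_update_settings(conn, params)
end
```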
Oneric 462225880a Accept EmojiReacts with non-array tag
JSON-LD compaction strips the array since it’s just one object

Fixes: #720
2024-04-09 04:04:16 +02:00
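A hedged sketch of re-wrapping the compacted value (function name assumed):

```elixir
# JSON-LD compaction turns a one-element "tag" array into a bare object;
# normalise it back into a list before validation.
defp fix_tag(%{"tag" => tag} = data) when not is_list(tag), do: Map.put(data, "tag", [tag])
defp fix_tag(data), do: data
```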
Oneric debd686418 Add tests for our own custom emoji format 2024-04-09 03:52:22 +02:00
Oneric 9598137d32 Drop base_url special casing in test env
61621ebdbc already explicitly added
the uploader base url to config/test.exs and it reduces differences
from prod.
2024-04-07 00:20:12 +02:00
floatingghost b8393ad9ed Merge pull request 'context: add featured definition' (#717) from erincandescent/akkoma:context-featured into develop
Reviewed-on: #717
2024-04-03 10:22:09 +00:00
floatingghost 554f19a9ed Merge pull request 'Refresh Users much more aggressively when processing Move activities' (#714) from erincandescent/akkoma:move-bust-cache into develop
Reviewed-on: #714
2024-04-03 10:03:14 +00:00
Erin Shepherd 8fbd771d6e context: add featured & backgroundUrl definitions
These were missing from our context, which caused interoperability issues with
people who do context processing
2024-04-01 13:39:38 +02:00
Erin Shepherd 464db9ea0b Don't list old accounts as aliases in WebFinger
Per the XRD specification:

> 2.4. Element <Alias>
>
> The <Alias> element contains a URI value that is an additional
> identifier for the resource described by the XRD. This value
> MUST be an absolute URI. The <Alias> element does not identify
> additional resources the XRD is describing, **but rather provides
> additional identifiers for the same resource.**

(http://docs.oasis-open.org/xri/xrd/v1.0/os/xrd-1.0-os.html#element.alias, emphasis mine)

In other words, the alias list is expected to link to things which are
not just semantically the same, but exactly the same. Old user accounts
don't do that

This change should not pose a compatibility issue: Mastodon does not
list old accounts here (See e1fcb02867/app/serializers/webfinger_serializer.rb (L12))

The use of as:alsoKnownAs is also not quite semantically right here
(see https://www.w3.org/TR/did-core/#dfn-alsoknownas, which defines
it to be used to refer to identifiers which are interchangable) but
that's what DID get for reusing a property definition that Mastodon
already squatted long before they got to it
2024-04-01 13:34:58 +02:00
Sandra Snan 6116f81546
Don't strip newlines in the Atom feed
2024-03-11 12:50:14 +01:00
Helge 5d89e0c917 Allow for url to be a list
This solves interoperability issues, see:
- https://git.pleroma.social/pleroma/pleroma/-/issues/3253
- https://socialhub.activitypub.rocks/t/fep-fffd-proxy-objects/3172/30?u=helge
- https://data.funfedi.dev/0.1.1/#url-parameter
2024-03-03 09:11:45 +01:00
Erin Shepherd f18e2ba42c Refresh Users much more aggressively when processing Move activities
The default refresh interval of 1 day is woefully inadequate here;
users expect to be able to add the alias to their new account and
press the move button on their old account and have it work.

This allows callers to specify a maximum age before a refetch is
triggered. We set that to 5s for the move code, as a nice compromise
between Making Things Work and ensuring that this can't be used
to hammer a remote server
2024-02-29 21:14:53 +01:00
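A hedged sketch of the maximum-age knob described above; the function and field names are assumptions:

```elixir
def get_or_refresh(ap_id, opts \\ []) do
  # Default keeps the old daily refresh; Move handling passes ~5 seconds.
  maximum_age = Keyword.get(opts, :maximum_age, 86_400)

  with {:ok, user} <- get_cached_by_ap_id(ap_id) do
    age = NaiveDateTime.diff(NaiveDateTime.utc_now(), user.last_refreshed_at)

    if age > maximum_age do
      refresh_user(ap_id)
    else
      {:ok, user}
    end
  end
end
```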
Oneric fc95519dbf Allow fetching over IPv6
Mint/Finch disable IPv6 by default, preventing us from
fetching anything from IPv6-only hosts without this.
2024-02-25 23:50:51 +01:00
Oneric d7c8e9df27 static-fe: don’t squeeze non-square avatars
This will crop them to a square, matching the behaviour of Husky and *key
and allowing us to never worry about consistent alignment.
Note: akkoma-fe instead displays the full image with inserted spacing.
2024-02-23 23:39:44 +00:00
Oneric a0daec6ea1 static-fe: don’t squeeze non-square emoji
Emoji and the navbar items want to blend in with lines of text,
so fix their height and let the width adjust as needed.
2024-02-23 23:39:44 +00:00
Norm 0cb3812ac0
Add docker override file to docs and gitignore
The docker-compose.yml file is likely to be edited quite extensively by
admins when setting up an instance. This would cause problems when
updating Akkoma, as merge conflicts would be likely to occur.

Docker-compose already has the ability to use override files in addition
to the main `docker-compose.yml` file. Admins can instead put any
overrides (additional volumes, container for elasticsearch, etc.) into a
file that won't be tracked by git and thus won't run into merge
conflicts in the future. In particular, docker compose will read
`docker-compose.override.yml` in addition to the main file if it exists,
and definitions in the override file take precedence over those in the
main file.
2023-08-07 13:09:04 -04:00
Ilja 66a04cead3 Descriptions from exif data with only whitespace are considered empty
I noticed that pictures taken with Ubuntu-Touch have whitespace in one of the fields.
This should just be ignored imo.
2022-10-23 14:46:22 +02:00
Ilja f50cffd134 update moduledoc 2022-10-23 14:46:22 +02:00
Ilja 338612d72b Use EXIF data of image to prefill image description
During attachment upload Pleroma returns a "description" field.

* This MR allows Pleroma to read the EXIF data during upload and return the description to the FE using this field.
    * If a description is already present (e.g. because a previous module added it), it will use that
    * Otherwise it will read from the EXIF data. First it will check -ImageDescription; if that's empty, it will check -iptc:Caption-Abstract
    * If no description is found, it will simply return nil, which is the default value
* When people set up a new instance, they will be asked if they want to read metadata and this module will be activated if so

There was an Exiftool module, which has now been renamed to Exiftool.StripLocation
2022-10-23 14:46:16 +02:00
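A hedged sketch of the described behaviour, folding in the later whitespace fix; the filter callback shape and the `read_tag/2` helper are assumptions:

```elixir
def filter(%Upload{description: nil, tempfile: file} = upload) do
  # -ImageDescription first, then iptc:Caption-Abstract; nil if neither is set.
  description = read_tag(file, "ImageDescription") || read_tag(file, "iptc:Caption-Abstract")
  {:ok, :filtered, %Upload{upload | description: description}}
end

# A description set by an earlier filter is left untouched.
def filter(upload), do: {:ok, :noop}

defp read_tag(file, tag) do
  case System.cmd("exiftool", ["-b", "-#{tag}", file]) do
    {output, 0} ->
      # Whitespace-only values (seen in Ubuntu-Touch photos) count as empty.
      case String.trim(output) do
        "" -> nil
        description -> description
      end

    _ ->
      nil
  end
end
```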
93 changed files with 2170 additions and 1176 deletions

.gitignore

@@ -78,3 +78,4 @@ docs/venv
# docker stuff
docker-db
*.iml
docker-compose.override.yml


@@ -7,6 +7,7 @@ matrix:
ELIXIR_VERSION:
- 1.14
- 1.15
- 1.16
OTP_VERSION:
- 25
- 26
@@ -17,6 +18,8 @@ matrix:
OTP_VERSION: 25
- ELIXIR_VERSION: 1.15
OTP_VERSION: 26
- ELIXIR_VERSION: 1.16
OTP_VERSION: 26
variables:
- &scw-secrets


@@ -4,6 +4,44 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## Unreleased
## 2024.04
## Added
- Support for [FEP-fffd](https://codeberg.org/fediverse/fep/src/branch/main/fep/fffd/fep-fffd.md) (proxy objects)
- Verified support for elixir 1.16
- Uploadfilter `Pleroma.Upload.Filter.Exiftool.ReadDescription` returns description values to the FE so they can pre-fill the image description field
NOTE: this filter MUST be placed before `Exiftool.StripMetadata` to work
## Changed
- Inbound pipeline error handling was modified somewhat, which should lead to less incomprehensible log spam. Hopefully.
- Uploadfilter `Pleroma.Upload.Filter.Exiftool` was replaced by `Pleroma.Upload.Filter.Exiftool.StripMetadata`;
the latter strips all non-essential metadata by default but can be configured.
To regain the old behaviour of only stripping GPS data set `purge: ["gps:all"]`.
- Uploadfilter `Pleroma.Upload.Filter.Exiftool` has been renamed to `Pleroma.Upload.Filter.Exiftool.StripMetadata`
- MRF.InlineQuotePolicy now prefers to insert display URLs instead of ActivityPub IDs
- Old accounts are no longer listed in WebFinger as aliases; this was breaking spec
## Fixed
- Issue preventing fetching anything from IPv6-only instances
- Issue allowing post content to leak via opengraph tags despite :restrict\_unauthenticated being set
- Move activities no longer operate on stale user data
- Missing definitions in our JSON-LD context
- Issue mangling newlines in code blocks for RSS/Atom feeds
- static\_fe squeezing non-square avatars and emoji
- Issue leading to properly JSON-LD compacted emoji reactions being rejected
- We now use a standard-compliant Accept header when fetching ActivityPub objects
- /api/pleroma/notification\_settings was rejecting body parameters;
this also broke changing this setting via akkoma-fe
- Issue leading to Mastodon bot accounts being rejected
- Scope misdetection of remote posts resulting from not recognising
JSON-LD-compacted forms of public scope; affected e.g. federation with bovine
## Removed
- ActivityPub Client-To-Server write API endpoints have been disabled;
read endpoints are planned to be removed next release unless a clear need is demonstrated
## 2024.03
## Added


@@ -1,2 +0,0 @@
web: mix phx.server
release: mix ecto.migrate


@@ -222,6 +222,26 @@ config :pleroma, :config_description, [
}
]
},
%{
group: :pleroma,
key: Pleroma.Upload.Filter.Exiftool.StripMetadata,
type: :group,
description: "Strip specified metadata from image uploads",
children: [
%{
key: :purge,
description: "Metadata fields or groups to strip",
type: {:list, :string},
suggestions: ["all", "CommonIFD0"]
},
%{
key: :preserve,
description: "Metadata fields or groups to preserve (takes precedence over stripping)",
type: {:list, :string},
suggestions: ["ColorSpaces", "Orientation"]
}
]
},
%{
group: :pleroma,
key: Pleroma.Emails.Mailer,


@@ -26,6 +26,8 @@ config :pleroma, Pleroma.Upload,
filters: [],
link_name: false
config :pleroma, :media_proxy, base_url: "http://localhost:4001"
config :pleroma, Pleroma.Uploaders.Local, uploads: "test/uploads"
config :pleroma, Pleroma.Emails.Mailer, adapter: Swoosh.Adapters.Test, enabled: true


@@ -1,7 +0,0 @@
{
"skip_files": [
"test/support",
"lib/mix/tasks/pleroma/benchmark.ex",
"lib/credo/check/consistency/file_location.ex"
]
}


@@ -46,7 +46,7 @@ services:
volumes:
- .:/opt/akkoma
# Uncomment the following if you want to use a reverse proxy
# Copy this into docker-compose.override.yml and uncomment there if you want to use a reverse proxy
#proxy:
# image: caddy:2-alpine
# restart: unless-stopped


@@ -37,7 +37,8 @@ If any of the options are left unspecified, you will be prompted interactively.
- `--static-dir <path>` - the directory custom public files should be read from (custom emojis, frontend bundle overrides, robots.txt, etc.)
- `--listen-ip <ip>` - the ip the app should listen to, defaults to 127.0.0.1
- `--listen-port <port>` - the port the app should listen to, defaults to 4000
- `--strip-uploads <Y|N>` - use ExifTool to strip uploads of sensitive location data
- `--strip-uploads-metadata <Y|N>` - use ExifTool to strip uploads of metadata when possible
- `--read-uploads-description <Y|N>` - use ExifTool to read image descriptions from uploads
- `--anonymize-uploads <Y|N>` - randomize uploaded filenames
- `--dedupe-uploads <Y|N>` - store files based on their hash to reduce data storage requirements if duplicates are uploaded with different filenames
- `--skip-release-env` - skip generating the release environment file


@@ -654,9 +654,17 @@ This filter replaces the declared filename (not the path) of an upload.
* `text`: Text to replace filenames in links. If empty, `{random}.extension` will be used. You can get the original filename extension by using `{extension}`, for example `custom-file-name.{extension}`.
#### Pleroma.Upload.Filter.Exiftool
#### Pleroma.Upload.Filter.Exiftool.StripMetadata
This filter only strips the GPS and location metadata with Exiftool leaving color profiles and attributes intact.
This filter strips metadata with Exiftool leaving color profiles and orientation intact.
* `purge`: List of Exiftool tag names or tag group names to purge
* `preserve`: List of Exiftool tag names or tag group names to preserve even if they occur in the purge list
#### Pleroma.Upload.Filter.Exiftool.ReadDescription
This filter reads the ImageDescription and iptc:Caption-Abstract fields with Exiftool so clients can prefill the media description field.
No specific configuration.


@@ -145,47 +145,13 @@ If you want to open your newly installed instance to the world, you should run n
doas apk add nginx
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
doas apk add certbot
```
and then set it up:
```shell
doas mkdir -p /var/lib/letsencrypt/
doas certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
```
If that doesnt work, make sure, that nginx is not already running. If it still doesnt work, try setting up nginx first (change ssl “on” to “off” and try again).
* Copy the example nginx configuration to the nginx folder
```shell
doas cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/conf.d/akkoma.conf
```
* Before starting nginx edit the configuration and change it to your needs. You must change change `server_name` and the paths to the certificates. You can use `nano` (install with `apk add nano` if missing).
```
server {
server_name your.domain;
listen 80;
...
}
server {
server_name your.domain;
listen 443 ssl http2;
...
ssl_trusted_certificate /etc/letsencrypt/live/your.domain/chain.pem;
ssl_certificate /etc/letsencrypt/live/your.domain/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/your.domain/privkey.pem;
...
}
```
* Before starting nginx edit the configuration and change it to your needs. You must change change `server_name`. You can use `nano` (install with `apk add nano` if missing).
* Enable and start nginx:
```shell
@@ -193,10 +159,37 @@ doas rc-update add nginx
doas rc-service nginx start
```
If you need to renew the certificate in the future, uncomment the relevant location block in the nginx config and run:
```shell
doas certbot certonly --email <your@emailaddress> -d <yourdomain> --webroot -w /var/lib/letsencrypt/
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
doas apk add certbot certbot-nginx
```
and then set it up:
```shell
doas mkdir -p /var/lib/letsencrypt/
doas certbot --email <your@emailaddress> -d <yourdomain> -d <media_domain> --nginx
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being ratelimited as you identify the issue, and do not remove it until the dry run succeeds. A common source of problems are nginx config syntax errors; this can be checked for by running `nginx -t`.
To automatically renew, set up a cron job like so:
```shell
# Enable the crond service
doas rc-update add crond
doas rc-service crond start
# Test that renewals work
doas certbot renew --cert-name yourinstance.tld --nginx --dry-run
# Add the renewal task to cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --nginx
' | doas tee /etc/periodic/daily/renew-akkoma-cert
doas chmod +x /etc/periodic/daily/renew-akkoma-cert
```
#### OpenRC service


@@ -136,16 +136,17 @@ If you want to open your newly installed instance to the world, you should run n
sudo pacman -S nginx
```
* Create directories for available and enabled sites:
```shell
sudo mkdir -p /etc/nginx/sites-{available,enabled}
```
* Append the following line at the end of the `http` block in `/etc/nginx/nginx.conf`:
```Nginx
include sites-enabled/*;
```
* Copy the example nginx configuration:
```shell
sudo cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/conf.d/akkoma.conf
```
* Before starting nginx edit the configuration and change it to your needs (e.g. change servername, change cert paths)
* Enable and start nginx:
```shell
sudo systemctl enable --now nginx.service
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, first install it:
@@ -158,32 +159,18 @@ and then set it up:
```shell
sudo mkdir -p /var/lib/letsencrypt/
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
sudo certbot --email <your@emailaddress> -d <yourdomain> -d <media_domain> --nginx
```
If that doesnt work, make sure, that nginx is not already running. If it still doesnt work, try setting up nginx first (change ssl “on” to “off” and try again).
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being ratelimited as you identify the issue, and do not remove it until the dry run succeeds. A common source of problems are nginx config syntax errors; this can be checked for by running `nginx -t`.
---
* Copy the example nginx configuration and activate it:
```shell
sudo cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/sites-available/akkoma.nginx
sudo ln -s /etc/nginx/sites-available/akkoma.nginx /etc/nginx/sites-enabled/akkoma.nginx
```
To make sure renewals work, enable the appropriate systemd timer:
```shell
sudo systemctl enable --now certbot-renew.timer
```
* Before starting nginx edit the configuration and change it to your needs (e.g. change servername, change cert paths)
* Enable and start nginx:
```shell
sudo systemctl enable --now nginx.service
```
If you need to renew the certificate in the future, uncomment the relevant location block in the nginx config and run:
```shell
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --webroot -w /var/lib/letsencrypt/
```
Certificate renewal should be handled automatically by Certbot from now on.
#### Other webserver/proxies


@@ -155,23 +155,6 @@ If you want to open your newly installed instance to the world, you should run n
sudo apt install nginx
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
sudo apt install certbot
```
and then set it up:
```shell
sudo mkdir -p /var/lib/letsencrypt/
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
```
If that doesnt work, make sure, that nginx is not already running. If it still doesnt work, try setting up nginx first (change ssl “on” to “off” and try again).
---
* Copy the example nginx configuration and activate it:
```shell
@ -186,12 +169,23 @@ sudo ln -s /etc/nginx/sites-available/akkoma.nginx /etc/nginx/sites-enabled/akko
sudo systemctl enable --now nginx.service
```
If you need to renew the certificate in the future, uncomment the relevant location block in the nginx config and run:
* Set up your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --webroot -w /var/lib/letsencrypt/
sudo apt install certbot python3-certbot-nginx
```
and then set it up:
```shell
sudo mkdir -p /var/lib/letsencrypt/
sudo certbot --email <your@emailaddress> -d <yourdomain> -d <media_domain> --nginx
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being rate-limited while you identify the issue, and do not remove it until a dry run succeeds. A common source of problems is a syntax error in the nginx config; you can check for this by running `nginx -t`.
Certificate renewal should be handled automatically by Certbot from now on.
#### Other webserver/proxies
You can find example configurations for them in `/opt/akkoma/installation/`.

View File

@ -125,7 +125,26 @@ cp docker-resources/Caddyfile.example docker-resources/Caddyfile
Then edit the TLD in your caddyfile to the domain you're serving on.
Uncomment the `caddy` section in the docker compose file,
Copy the commented-out `caddy` section in `docker-compose.yml` into a new file called `docker-compose.override.yml` like so:
```yaml
version: "3.7"
services:
proxy:
image: caddy:2-alpine
restart: unless-stopped
links:
- akkoma
ports: [
"443:443",
"80:80"
]
volumes:
- ./docker-resources/Caddyfile:/etc/caddy/Caddyfile
- ./caddy-data:/data
- ./caddy-config:/config
```
then run `docker compose up -d` again.
#### Running a reverse proxy on the host
@ -155,6 +174,12 @@ git pull
docker compose restart akkoma db
```
### Modifying the Docker services
If you want to modify the services defined in the docker compose file, you can
create a new file called `docker-compose.override.yml`. There you can add any
overrides or additional services without worrying about git conflicts when a
new release comes out.
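As a sketch (the service name matches the compose file above, but the value shown is purely illustrative), an override might look like:
```yaml
version: "3.7"
services:
  akkoma:
    # illustrative override; scalar keys replace the base file's value
    restart: always
```
You can inspect the merged result of both files with `docker compose config`.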
#### Further reading
{! installation/further_reading.include !}

View File

@ -135,23 +135,6 @@ If you want to open your newly installed instance to the world, you should run n
sudo dnf install nginx
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
sudo dnf install certbot
```
and then set it up:
```shell
sudo mkdir -p /var/lib/letsencrypt/
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
```
If that doesnt work, make sure, that nginx is not already running. If it still doesnt work, try setting up nginx first (change ssl “on” to “off” and try again).
---
* Copy the example nginx configuration and activate it:
```shell
@ -165,12 +148,23 @@ sudo cp /opt/akkoma/installation/nginx/akkoma.nginx /etc/nginx/conf.d/akkoma.con
sudo systemctl enable --now nginx.service
```
If you need to renew the certificate in the future, uncomment the relevant location block in the nginx config and run:
* Set up your SSL cert, using your method of choice or certbot. If using certbot, first install it:
```shell
sudo certbot certonly --email <your@emailaddress> -d <yourdomain> --webroot -w /var/lib/letsencrypt/
sudo dnf install certbot python3-certbot-nginx
```
and then set it up:
```shell
sudo certbot --email <your@emailaddress> -d <yourdomain> -d <media_domain> --nginx
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being rate-limited while you identify the issue, and do not remove it until a dry run succeeds. A common source of problems is a syntax error in the nginx config; you can check for this by running `nginx -t`.
Certificate renewal should be handled automatically by Certbot from now on.
#### Other webserver/proxies
You can find example configurations for them in `/opt/akkoma/installation/`.

View File

@ -1,8 +1,8 @@
## Required dependencies
* PostgreSQL 9.6+
* Elixir 1.14+
* Erlang OTP 25+
* Elixir 1.14+ (currently tested up to 1.16)
* Erlang OTP 25+ (currently tested up to OTP26)
* git
* file / libmagic
* gcc (clang might also work)
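If you are unsure which versions you have installed, `elixir --version` reports both the Elixir version and the OTP release it runs on, e.g. (output will vary):
```shell
elixir --version
# Erlang/OTP 26 [erts-14.2] [source] [64-bit] ...
# Elixir 1.16.2 (compiled with Erlang/OTP 26)
```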

View File

@ -201,25 +201,6 @@ Assuming you want to open your newly installed federated social network to, well
include sites-enabled/*;
```
* Setup your SSL cert, using your method of choice or certbot. If using certbot, install it if you haven't already:
```shell
# emerge --ask app-crypt/certbot app-crypt/certbot-nginx
```
and then set it up:
```shell
# mkdir -p /var/lib/letsencrypt/
# certbot certonly --email <your@emailaddress> -d <yourdomain> --standalone
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being ratelimited as you identify the issue, and do not remove it until the dry run succeeds. If that doesnt work, make sure, that nginx is not already running. If it still doesnt work, try setting up nginx first (change ssl “on” to “off” and try again). Often the answer to issues with certbot is to use the `--nginx` flag once you have nginx up and running.
If you are using any additional subdomains, such as for a media proxy, you can re-run the same command with the subdomain in question. When it comes time to renew later, you will not need to run multiple times for each domain, one renew will handle it.
---
* Copy the example nginx configuration and activate it:
```shell
@ -237,9 +218,24 @@ Pay special attention to the line that begins with `ssl_ecdh_curve`. It is stong
```shell
# rc-update add nginx default
# /etc/init.d/nginx start
# rc-service nginx start
```
* Set up your SSL cert, using your method of choice or certbot. If using certbot, install it if you haven't already:
```shell
# emerge --ask app-crypt/certbot app-crypt/certbot-nginx
```
and then set it up:
```shell
# mkdir -p /var/lib/letsencrypt/
# certbot --email <your@emailaddress> -d <yourdomain> -d <media_domain> --nginx
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being rate-limited while you identify the issue, and do not remove it until a dry run succeeds. A common source of problems is a syntax error in the nginx config; you can check for this by running `nginx -t`.
If you are using certbot, it is HIGHLY recommended that you set up a cron job to renew your certificate, and that you install the suggested `certbot-nginx` plugin. If you don't do these things, you only have yourself to blame when your instance breaks suddenly because you forgot about it.
First, ensure that the command you will be installing into your crontab works.
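For example (a sketch; the schedule is arbitrary, pick your own):
```shell
# certbot renew --cert-name yourinstance.tld --nginx --dry-run
```
and, once the dry run passes, add an entry via `crontab -e` such as:
```shell
15 3 * * * certbot renew --cert-name yourinstance.tld --nginx
```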

View File

@ -14,7 +14,7 @@ Note: the packages are not required with the current default settings of Akkoma.
`ImageMagick` is a set of tools to create, edit, compose, or convert bitmap images.
It is required for the following Akkoma features:
* `Pleroma.Upload.Filters.Mogrify`, `Pleroma.Upload.Filters.Mogrifun` upload filters (related config: `Plaroma.Upload/filters` in `config/config.exs`)
* `Pleroma.Upload.Filters.Mogrify`, `Pleroma.Upload.Filters.Mogrifun` upload filters (related config: `Pleroma.Upload/filters` in `config/config.exs`)
* Media preview proxy for still images (related config: `media_preview_proxy/enabled` in `config/config.exs`)
## `ffmpeg`
@ -29,4 +29,5 @@ It is required for the following Akkoma features:
`exiftool` is a media file metadata reader/writer.
It is required for the following Akkoma features:
* `Pleroma.Upload.Filters.Exiftool` upload filter (related config: `Plaroma.Upload/filters` in `config/config.exs`)
* `Pleroma.Upload.Filters.Exiftool.StripMetadata` upload filter (related config: `Pleroma.Upload/filters` in `config/config.exs`)
* `Pleroma.Upload.Filters.Exiftool.ReadDescription` upload filter (related config: `Pleroma.Upload/filters` in `config/config.exs`)

View File

@ -9,7 +9,7 @@ This guide covers an installation using an OTP release. To install Akkoma from so
* For installing OTP releases on RedHat-based distros like Fedora and Centos Stream, please follow [this guide](./otp_redhat_en.md) instead.
* A (sub)domain pointed to the machine
You will be running commands as root. If you aren't root already, please elevate your priviledges by executing `sudo su`/`su`.
You will be running commands as root. If you aren't root already, please elevate your privileges by executing `sudo -i`/`su`.
While in theory OTP releases are possible to install on any compatible machine, for the sake of simplicity this guide focuses only on Debian/Ubuntu and Alpine.
@ -176,11 +176,6 @@ su akkoma -s $SHELL -lc "./bin/pleroma stop"
### Setting up nginx and getting Let's Encrypt SSL certificates
#### Get a Let's Encrypt certificate
```sh
certbot certonly --standalone --preferred-challenges http -d yourinstance.tld
```
#### Copy Akkoma nginx configuration to the nginx folder
The location of nginx configs is dependent on the distro
@ -209,6 +204,14 @@ $EDITOR path-to-nginx-config
# Verify that the config is valid
nginx -t
```
#### Get a Let's Encrypt certificate
```sh
certbot --nginx -d yourinstance.tld -d media.yourinstance.tld
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being rate-limited while you identify the issue, and do not remove it until a dry run succeeds. A common source of problems is a syntax error in the nginx config; you can check for this by running `nginx -t`.
#### Start nginx
=== "Alpine"
@ -252,32 +255,19 @@ If everything worked, you should see Akkoma-FE when visiting your domain. If tha
## Post installation
### Setting up auto-renew of the Let's Encrypt certificate
```sh
# Create the directory for webroot challenges
mkdir -p /var/lib/letsencrypt
# Uncomment the webroot method
$EDITOR path-to-nginx-config
# Verify that the config is valid
nginx -t
```
=== "Alpine"
```
# Restart nginx
rc-service nginx restart
# Start the cron daemon and make it start on boot
rc-service crond start
rc-update add crond
# Ensure the webroot method and post hook are working
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --dry-run --post-hook 'rc-service nginx reload'
certbot renew --cert-name yourinstance.tld --nginx --dry-run
# Add it to the daily cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --post-hook "rc-service nginx reload"
certbot renew --cert-name yourinstance.tld --nginx
' > /etc/periodic/daily/renew-akkoma-cert
chmod +x /etc/periodic/daily/renew-akkoma-cert
@ -286,22 +276,7 @@ nginx -t
```
=== "Debian/Ubuntu"
```
# Restart nginx
systemctl restart nginx
# Ensure the webroot menthod and post hook is working
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --dry-run --post-hook 'systemctl reload nginx'
# Add it to the daily cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --post-hook "systemctl reload nginx"
' > /etc/cron.daily/renew-akkoma-cert
chmod +x /etc/cron.daily/renew-akkoma-cert
# If everything worked the output should contain /etc/cron.daily/renew-akkoma-cert
run-parts --test /etc/cron.daily
```
This should be automatically enabled with the `certbot-renew.timer` systemd unit.
## Create your first user and set as admin
```sh

View File

@ -82,6 +82,7 @@ Other than things bundled in the OTP release Akkoma depends on:
* PostgreSQL (also utilizes extensions in postgresql-contrib)
* nginx (could be swapped with another reverse proxy but this guide covers only it)
* certbot (for Let's Encrypt certificates, could be swapped with another ACME client, but this guide covers only it)
* If you are using certbot, also install the `python3-certbot-nginx` package for the nginx plugin
* libmagic/file
First, update your system, if not already done:
@ -169,12 +170,6 @@ sudo -Hu akkoma ./bin/pleroma stop
### Setting up nginx and getting Let's Encrypt SSL certificates
#### Get a Let's Encrypt certificate
```shell
certbot certonly --standalone --preferred-challenges http -d yourinstance.tld
```
#### Copy Akkoma nginx configuration to the nginx folder
```shell
@ -195,8 +190,15 @@ sudo nginx -t
sudo systemctl start nginx
```
At this point if you open your (sub)domain in a browser you should see a 502 error, that's because Akkoma is not started yet.
#### Get a Let's Encrypt certificate
```shell
sudo certbot --email <your@emailaddress> -d <yourdomain> -d <media_domain> --nginx
```
If that doesn't work the first time, add `--dry-run` to further attempts to avoid being rate-limited while you identify the issue, and do not remove it until a dry run succeeds. A common source of problems is a syntax error in the nginx config; you can check for this by running `nginx -t`.
If you're successful with obtaining the certificates, opening your (sub)domain in a browser will result in a 502 error, since Akkoma hasn't been started yet.
### Setting up a system service
@ -239,19 +241,11 @@ sudo nginx -t
# Restart nginx
sudo systemctl restart nginx
# Ensure the webroot menthod and post hook is working
sudo certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --dry-run --post-hook 'systemctl reload nginx'
# Add it to the daily cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --post-hook "systemctl reload nginx"
' > /etc/cron.daily/renew-akkoma-cert
sudo chmod +x /etc/cron.daily/renew-akkoma-cert
# If everything worked the output should contain /etc/cron.daily/renew-akkoma-cert
sudo run-parts --test /etc/cron.daily
# Test that renewals work properly
sudo certbot renew --cert-name yourinstance.tld --nginx --dry-run
```
Assuming the commands were run successfully, certbot should be able to renew your certificates automatically via the `certbot-renew.timer` systemd unit.
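You can verify this yourself (the unit name may differ by distro, e.g. plain `certbot.timer` on some Debian/Ubuntu versions):
```shell
# Confirm the timer is enabled and see its next scheduled run
sudo systemctl status certbot-renew.timer
sudo systemctl list-timers 'certbot*'
```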
## Create your first user and set as admin
```shell

View File

@ -1,2 +0,0 @@
elixir_version=1.14.3
erlang_version=25.3

View File

@ -1,12 +1,9 @@
# default nginx site config for Akkoma
#
# Simple installation instructions:
# 1. Install your TLS certificate, possibly using Let's Encrypt.
# 2. Replace 'example.tld' with your instance's domain wherever it appears.
# 3. Copy this file to /etc/nginx/sites-available/ and then add a symlink to it
# in /etc/nginx/sites-enabled/ and run 'nginx -s reload' or restart nginx.
# See the documentation at docs.akkoma.dev for your particular distro/OS for
# installation instructions.
proxy_cache_path /tmp/akkoma-media-cache levels=1:2 keys_zone=akkoma_media_cache:10m max_size=10g
proxy_cache_path /tmp/akkoma-media-cache levels=1:2 keys_zone=akkoma_media_cache:10m max_size=1g
inactive=720m use_temp_path=off;
# this is explicitly IPv4 since Pleroma.Web.Endpoint binds on IPv4 only
@ -15,25 +12,19 @@ upstream phoenix {
server 127.0.0.1:4000 max_fails=5 fail_timeout=60s;
}
server {
server_name example.tld;
listen 80;
listen [::]:80;
# Uncomment this if you need to use the 'webroot' method with certbot. Make sure
# that the directory exists and that it is accessible by the webserver. If you followed
# the guide, you already ran 'mkdir -p /var/lib/letsencrypt' to create the folder.
# You may need to load this file with the ssl server block commented out, run certbot
# to get the certificate, and then uncomment it.
#
# location ~ /\.well-known/acme-challenge {
# root /var/lib/letsencrypt/;
# }
location / {
return 301 https://$server_name$request_uri;
}
}
# If you are setting up TLS certificates without certbot, uncomment the
# following to enable HTTP -> HTTPS redirects. Certbot users don't need to do
# this, as certbot will configure it automatically.
# server {
# server_name example.tld media.example.tld;
#
# listen 80;
# listen [::]:80;
#
# location / {
# return 301 https://$server_name$request_uri;
# }
# }
# Enable SSL session caching for improved performance
ssl_session_cache shared:ssl_session_cache:10m;
@ -41,22 +32,29 @@ ssl_session_cache shared:ssl_session_cache:10m;
server {
server_name example.tld;
listen 443 ssl http2;
listen [::]:443 ssl http2;
ssl_session_timeout 1d;
ssl_session_cache shared:MozSSL:10m; # about 40000 sessions
ssl_session_tickets off;
# Once certbot is set up, this will automatically be updated to listen to
# port 443 with TLS alongside a redirect from plaintext HTTP.
listen 80;
listen [::]:80;
ssl_trusted_certificate /etc/letsencrypt/live/example.tld/chain.pem;
ssl_certificate /etc/letsencrypt/live/example.tld/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/example.tld/privkey.pem;
# If you are not using Certbot, comment out the above and uncomment/edit the following
# listen 443 ssl http2;
# listen [::]:443 ssl http2;
# ssl_session_timeout 1d;
# ssl_session_cache shared:MozSSL:10m; # about 40000 sessions
# ssl_session_tickets off;
#
# ssl_trusted_certificate /etc/letsencrypt/live/example.tld/chain.pem;
# ssl_certificate /etc/letsencrypt/live/example.tld/fullchain.pem;
# ssl_certificate_key /etc/letsencrypt/live/example.tld/privkey.pem;
#
# ssl_protocols TLSv1.2 TLSv1.3;
# ssl_ciphers "ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:!aNULL:!eNULL:!EXPORT:!DES:!MD5:!PSK:!RC4";
# ssl_prefer_server_ciphers off;
# ssl_ecdh_curve X25519:prime256v1:secp384r1:secp521r1;
# ssl_stapling on;
# ssl_stapling_verify on;
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers "ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:!aNULL:!eNULL:!EXPORT:!DES:!MD5:!PSK:!RC4";
ssl_prefer_server_ciphers off;
ssl_ecdh_curve X25519:prime256v1:secp384r1:secp521r1;
ssl_stapling on;
ssl_stapling_verify on;
gzip_vary on;
gzip_proxied any;
@ -86,27 +84,22 @@ server {
# Upload and MediaProxy Subdomain
# (see main domain setup for more details)
server {
server_name media.example.tld;
listen 80;
listen [::]:80;
location / {
return 301 https://$server_name$request_uri;
}
}
server {
server_name media.example.tld;
listen 443 ssl http2;
listen [::]:443 ssl http2;
# Same as above, will be updated to HTTPS once certbot is set up.
listen 80;
listen [::]:80;
ssl_trusted_certificate /etc/letsencrypt/live/media.example.tld/chain.pem;
ssl_certificate /etc/letsencrypt/live/media.example.tld/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/media.example.tld/privkey.pem;
# .. copy all other the ssl_* and gzip_* stuff from main domain
# If you are not using certbot, comment out the above and copy all the ssl
# stuff from the main server block in here.
gzip_vary on;
gzip_proxied any;
gzip_comp_level 6;
gzip_buffers 16 8k;
gzip_http_version 1.1;
gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript application/activity+json application/atom+xml;
# the nginx default is 1m, not enough for large media uploads
client_max_body_size 16m;

View File

@ -35,7 +35,8 @@ defmodule Mix.Tasks.Pleroma.Instance do
static_dir: :string,
listen_ip: :string,
listen_port: :string,
strip_uploads: :string,
strip_uploads_metadata: :string,
read_uploads_description: :string,
anonymize_uploads: :string
],
aliases: [
@ -169,21 +170,38 @@ defmodule Mix.Tasks.Pleroma.Instance do
)
|> Path.expand()
{strip_uploads_message, strip_uploads_default} =
{strip_uploads_metadata_message, strip_uploads_metadata_default} =
if Pleroma.Utils.command_available?("exiftool") do
{"Do you want to strip location (GPS) data from uploaded images? This requires exiftool, it was detected as installed. (y/n)",
{"Do you want to strip metadata from uploaded images? This requires exiftool, it was detected as installed. (y/n)",
"y"}
else
{"Do you want to strip location (GPS) data from uploaded images? This requires exiftool, it was detected as not installed, please install it if you answer yes. (y/n)",
{"Do you want to strip metadata from uploaded images? This requires exiftool, it was detected as not installed, please install it if you answer yes. (y/n)",
"n"}
end
strip_uploads =
strip_uploads_metadata =
get_option(
options,
:strip_uploads,
strip_uploads_message,
strip_uploads_default
:strip_uploads_metadata,
strip_uploads_metadata_message,
strip_uploads_metadata_default
) === "y"
{read_uploads_description_message, read_uploads_description_default} =
if Pleroma.Utils.command_available?("exiftool") do
{"Do you want to read data from uploaded files so clients can use it to prefill fields like image description? This requires exiftool, it was detected as installed. (y/n)",
"y"}
else
{"Do you want to read data from uploaded files so clients can use it to prefill fields like image description? This requires exiftool, it was detected as not installed, please install it if you answer yes. (y/n)",
"n"}
end
read_uploads_description =
get_option(
options,
:read_uploads_description,
read_uploads_description_message,
read_uploads_description_default
) === "y"
anonymize_uploads =
@ -230,7 +248,8 @@ defmodule Mix.Tasks.Pleroma.Instance do
listen_port: listen_port,
upload_filters:
upload_filters(%{
strip: strip_uploads,
strip_metadata: strip_uploads_metadata,
read_description: read_uploads_description,
anonymize: anonymize_uploads
})
)
@ -305,11 +324,20 @@ defmodule Mix.Tasks.Pleroma.Instance do
end
defp upload_filters(filters) when is_map(filters) do
enabled_filters = []
enabled_filters =
if filters.strip do
[Pleroma.Upload.Filter.Exiftool]
if filters.read_description do
enabled_filters ++ [Pleroma.Upload.Filter.Exiftool.ReadDescription]
else
[]
enabled_filters
end
enabled_filters =
if filters.strip_metadata do
enabled_filters ++ [Pleroma.Upload.Filter.Exiftool.StripMetadata]
else
enabled_filters
end
enabled_filters =
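Assuming OptionParser's usual underscore-to-hyphen mapping for the switches defined above, the new prompts can also be answered non-interactively; a sketch:
```shell
mix pleroma.instance gen \
  --strip-uploads-metadata y \
  --read-uploads-description y \
  --anonymize-uploads n
```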

View File

@ -288,6 +288,7 @@ defmodule Pleroma.Application do
|> Config.get([])
|> Pleroma.HTTP.AdapterHelper.add_pool_size(pool_size)
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy_pool(proxy)
|> Pleroma.HTTP.AdapterHelper.ensure_ipv6()
|> Keyword.put(:name, MyFinch)
[{Finch, config}]

View File

@ -164,7 +164,8 @@ defmodule Pleroma.ApplicationRequirements do
defp check_system_commands!(:ok) do
filter_commands_statuses = [
check_filter(Pleroma.Upload.Filter.Exiftool, "exiftool"),
check_filter(Pleroma.Upload.Filter.Exiftool.StripMetadata, "exiftool"),
check_filter(Pleroma.Upload.Filter.Exiftool.ReadDescription, "exiftool"),
check_filter(Pleroma.Upload.Filter.Mogrify, "mogrify"),
check_filter(Pleroma.Upload.Filter.Mogrifun, "mogrify"),
check_filter(Pleroma.Upload.Filter.AnalyzeMetadata, "mogrify"),

View File

@ -68,7 +68,10 @@ defmodule Akkoma.Collections.Fetcher do
items
end
else
{:error, {"Object has been deleted", _, _}} ->
{:error, :not_found} ->
items
{:error, :forbidden} ->
items
{:error, error} ->

View File

@ -22,6 +22,43 @@ defmodule Pleroma.Config.DeprecationWarnings do
"\n* `config :pleroma, :instance, :quarantined_instances` is now covered by `:pleroma, :mrf_simple, :reject`"}
]
def check_exiftool_filter do
filters = Config.get([Pleroma.Upload]) |> Keyword.get(:filters, [])
if Pleroma.Upload.Filter.Exiftool in filters do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using Exiftool as a filter instead of Exiftool.StripMetadata. This should work for now, but you are advised to change to the new configuration to prevent possible issues later:
```
config :pleroma, Pleroma.Upload,
filters: [Pleroma.Upload.Filter.Exiftool]
```
Is now
```
config :pleroma, Pleroma.Upload,
filters: [Pleroma.Upload.Filter.Exiftool.StripMetadata]
```
""")
new_config =
filters
|> Enum.map(fn
Pleroma.Upload.Filter.Exiftool -> Pleroma.Upload.Filter.Exiftool.StripMetadata
filter -> filter
end)
Config.put([Pleroma.Upload, :filters], new_config)
:error
else
:ok
end
end
def check_simple_policy_tuples do
has_strings =
Config.get([:mrf_simple])
@ -184,7 +221,8 @@ defmodule Pleroma.Config.DeprecationWarnings do
check_simple_policy_tuples(),
check_http_adapter(),
check_uploader_base_url_set(),
check_uploader_base_url_is_not_base_domain()
check_uploader_base_url_is_not_base_domain(),
check_exiftool_filter()
]
|> Enum.reduce(:ok, fn
:ok, :ok -> :ok

View File

@ -65,6 +65,15 @@ defmodule Pleroma.HTTP.AdapterHelper do
|> put_in([:pools, :default, :size], pool_size)
end
def ensure_ipv6(opts) do
# Default transport opts already enable IPv6, so just ensure they're loaded
opts
|> maybe_add_pools()
|> maybe_add_default_pool()
|> maybe_add_conn_opts()
|> maybe_add_transport_opts()
end
defp maybe_add_pools(opts) do
if Keyword.has_key?(opts, :pools) do
opts
@ -96,11 +105,15 @@ defmodule Pleroma.HTTP.AdapterHelper do
defp maybe_add_transport_opts(opts) do
transport_opts = get_in(opts, [:pools, :default, :conn_opts, :transport_opts])
unless is_nil(transport_opts) do
opts
else
put_in(opts, [:pools, :default, :conn_opts, :transport_opts], [])
end
opts =
unless is_nil(transport_opts) do
opts
else
put_in(opts, [:pools, :default, :conn_opts, :transport_opts], [])
end
# IPv6 is disabled and IPv4 enabled by default; ensure we can use both
put_in(opts, [:pools, :default, :conn_opts, :transport_opts, :inet6], true)
end
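The resulting default pool therefore always carries `inet6: true` in its transport options, roughly (a sketch of the final keyword shape, not repository code):
```elixir
[
  pools: [
    default: [
      conn_opts: [
        # any pre-existing transport_opts are preserved; :inet6 is merged in
        transport_opts: [inet6: true]
      ]
    ]
  ]
]
```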
@doc """

View File

@ -178,7 +178,10 @@ defmodule Pleroma.Object do
ap_id
Keyword.get(options, :fetch) ->
Fetcher.fetch_object_from_id!(ap_id, options)
case Fetcher.fetch_object_from_id(ap_id, options) do
{:ok, object} -> object
_ -> nil
end
true ->
get_cached_by_ap_id(ap_id)

View File

@ -122,7 +122,7 @@ defmodule Pleroma.Object.Fetcher do
{:ok, object}
else
{:local, true} -> {:ok, object}
{:id, false} -> {:error, "Object id changed on refetch"}
{:id, false} -> {:error, :id_mismatch}
e -> {:error, e}
end
end
@ -136,10 +136,13 @@ defmodule Pleroma.Object.Fetcher do
def fetch_object_from_id(id, options \\ []) do
with %URI{} = uri <- URI.parse(id),
# let's check the URI is even vaguely valid first
{:scheme, true} <- {:scheme, uri.scheme == "http" or uri.scheme == "https"},
{:valid_uri_scheme, true} <-
{:valid_uri_scheme, uri.scheme == "http" or uri.scheme == "https"},
# If we have instance restrictions, apply them here to prevent fetching from unwanted instances
{:ok, nil} <- Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_reject(uri),
{:ok, _} <- Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_accept(uri),
{:mrf_reject_check, {:ok, nil}} <-
{:mrf_reject_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_reject(uri)},
{:mrf_accept_check, {:ok, _}} <-
{:mrf_accept_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_accept(uri)},
{_, nil} <- {:fetch_object, Object.get_cached_by_ap_id(id)},
{_, true} <- {:allowed_depth, Federator.allowed_thread_distance?(options[:depth])},
{_, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)},
@ -151,20 +154,37 @@ defmodule Pleroma.Object.Fetcher do
{:object, data, Object.normalize(activity, fetch: false)} do
{:ok, object}
else
{:allowed_depth, false} ->
{:error, "Max thread distance exceeded."}
{:allowed_depth, false} = e ->
log_fetch_error(id, e)
{:error, :allowed_depth}
{:scheme, false} ->
{:error, "URI Scheme Invalid"}
{:valid_uri_scheme, _} = e ->
log_fetch_error(id, e)
{:error, :invalid_uri_scheme}
{:transmogrifier, {:error, {:reject, e}}} ->
{:reject, e}
{:mrf_reject_check, _} = e ->
log_fetch_error(id, e)
{:reject, :mrf}
{:transmogrifier, {:reject, e}} ->
{:reject, e}
{:mrf_accept_check, _} = e ->
log_fetch_error(id, e)
{:reject, :mrf}
{:transmogrifier, _} = e ->
{:error, e}
{:containment, reason} = e ->
log_fetch_error(id, e)
{:error, reason}
{:transmogrifier, {:error, {:reject, reason}}} = e ->
log_fetch_error(id, e)
{:reject, reason}
{:transmogrifier, {:reject, reason}} = e ->
log_fetch_error(id, e)
{:reject, reason}
{:transmogrifier, reason} = e ->
log_fetch_error(id, e)
{:error, reason}
{:object, data, nil} ->
reinject_object(%Object{}, data)
@ -175,17 +195,21 @@ defmodule Pleroma.Object.Fetcher do
{:fetch_object, %Object{} = object} ->
{:ok, object}
{:fetch, {:error, error}} ->
{:error, error}
{:reject, reason} ->
{:reject, reason}
{:fetch, {:error, reason}} = e ->
log_fetch_error(id, e)
{:error, reason}
e ->
e
log_fetch_error(id, e)
{:error, e}
end
end
defp log_fetch_error(id, error) do
Logger.metadata(object: id)
Logger.error("Object rejected while fetching #{id} #{inspect(error)}")
end
defp prepare_activity_params(data) do
%{
"type" => "Create",
@ -199,27 +223,6 @@ defmodule Pleroma.Object.Fetcher do
|> Maps.put_if_present("bcc", data["bcc"])
end
@doc "Identical to `fetch_object_from_id/2` but just directly returns the object or on error `nil`"
def fetch_object_from_id!(id, options \\ []) do
with {:ok, object} <- fetch_object_from_id(id, options) do
object
else
{:error, %Tesla.Mock.Error{}} ->
nil
{:error, {"Object has been deleted", _id, _code}} ->
nil
{:reject, reason} ->
Logger.debug("Rejected #{id} while fetching: #{inspect(reason)}")
nil
e ->
Logger.error("Error while fetching #{id}: #{inspect(e)}")
nil
end
end
defp make_signature(id, date) do
uri = URI.parse(id)
@ -259,8 +262,13 @@ defmodule Pleroma.Object.Fetcher do
def fetch_and_contain_remote_object_from_id(id) when is_binary(id) do
Logger.debug("Fetching object #{id} via AP")
with {:scheme, true} <- {:scheme, String.starts_with?(id, "http")},
{_, :ok} <- {:local_fetch, Containment.contain_local_fetch(id)},
with {:valid_uri_scheme, true} <- {:valid_uri_scheme, String.starts_with?(id, "http")},
%URI{} = uri <- URI.parse(id),
{:mrf_reject_check, {:ok, nil}} <-
{:mrf_reject_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_reject(uri)},
{:mrf_accept_check, {:ok, _}} <-
{:mrf_accept_check, Pleroma.Web.ActivityPub.MRF.SimplePolicy.check_accept(uri)},
{:local_fetch, :ok} <- {:local_fetch, Containment.contain_local_fetch(id)},
{:ok, final_id, body} <- get_object(id),
{:ok, data} <- safe_json_decode(body),
{_, :ok} <- {:strict_id, Containment.contain_id_to_fetch(final_id, data)},
@ -271,17 +279,29 @@ defmodule Pleroma.Object.Fetcher do
{:ok, data}
else
{:strict_id, _} ->
{:error, "Object's ActivityPub id/url does not match final fetch URL"}
{:strict_id, _} = e ->
log_fetch_error(id, e)
{:error, :id_mismatch}
{:scheme, _} ->
{:error, "Unsupported URI scheme"}
{:mrf_reject_check, _} = e ->
log_fetch_error(id, e)
{:reject, :mrf}
{:local_fetch, _} ->
{:error, "Trying to fetch local resource"}
{:mrf_accept_check, _} = e ->
log_fetch_error(id, e)
{:reject, :mrf}
{:containment, _} ->
{:error, "Object containment failed."}
{:valid_uri_scheme, _} = e ->
log_fetch_error(id, e)
{:error, :invalid_uri_scheme}
{:local_fetch, _} = e ->
log_fetch_error(id, e)
{:error, :local_resource}
{:containment, reason} ->
log_fetch_error(id, reason)
{:error, reason}
{:error, e} ->
{:error, e}
@ -292,7 +312,7 @@ defmodule Pleroma.Object.Fetcher do
end
def fetch_and_contain_remote_object_from_id(_id),
do: {:error, "id must be a string"}
do: {:error, :invalid_id}
defp check_crossdomain_redirect(final_host, original_url)
@ -324,7 +344,11 @@ defmodule Pleroma.Object.Fetcher do
date = Pleroma.Signature.signed_date()
headers =
[{"accept", "application/activity+json"}]
[
# The first is required by spec, the second provided as a fallback for buggy implementations
{"accept", "application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\""},
{"accept", "application/activity+json"}
]
|> maybe_date_fetch(date)
|> sign_fetch(id, date)
@ -352,8 +376,11 @@ defmodule Pleroma.Object.Fetcher do
{:error, {:content_type, content_type}}
end
else
{:ok, %{status: code}} when code in [401, 403] ->
{:error, :forbidden}
{:ok, %{status: code}} when code in [404, 410] ->
{:error, {"Object has been deleted", id, code}}
{:error, :not_found}
{:error, e} ->
{:error, e}

View File

@ -39,8 +39,6 @@ defmodule Pleroma.Upload do
alias Pleroma.Web.ActivityPub.Utils
require Logger
@mix_env Mix.env()
@type source ::
Plug.Upload.t()
| (data_uri_string :: String.t())
@ -63,12 +61,23 @@ defmodule Pleroma.Upload do
width: integer(),
height: integer(),
blurhash: String.t(),
description: String.t(),
path: String.t()
}
@always_enabled_filters [Pleroma.Upload.Filter.Dedupe]
defstruct [:id, :name, :tempfile, :content_type, :width, :height, :blurhash, :path]
defstruct [
:id,
:name,
:tempfile,
:content_type,
:width,
:height,
:blurhash,
:description,
:path
]
@spec store(source, options :: [option()]) :: {:ok, Map.t()} | {:error, any()}
@doc "Store a file. If using a `Plug.Upload{}` as the source, be sure to use `Majic.Plug` to ensure its content_type and filename is correct."
@ -78,7 +87,7 @@ defmodule Pleroma.Upload do
with {:ok, upload} <- prepare_upload(upload, opts),
upload = %__MODULE__{upload | path: upload.path || "#{upload.id}/#{upload.name}"},
{:ok, upload} <- Pleroma.Upload.Filter.filter(opts.filters, upload),
description = Map.get(opts, :description) || "",
description = Map.get(upload, :description) || "",
{_, true} <-
{:description_limit,
String.length(description) <= Pleroma.Config.get([:instance, :description_limit])},
@ -154,7 +163,8 @@ defmodule Pleroma.Upload do
id: UUID.generate(),
name: file.filename,
tempfile: file.path,
content_type: file.content_type
content_type: file.content_type,
description: opts.description
}}
end
end
@ -174,7 +184,8 @@ defmodule Pleroma.Upload do
id: UUID.generate(),
name: hash <> "." <> ext,
tempfile: tmp_path,
content_type: content_type
content_type: content_type,
description: opts.description
}}
end
end
@ -230,13 +241,6 @@ defmodule Pleroma.Upload do
defp url_from_spec(_upload, _base_url, {:url, url}), do: url
if @mix_env == :test do
defp choose_base_url(prim, sec \\ nil),
do: prim || sec || Pleroma.Web.Endpoint.url() <> "/media/"
else
defp choose_base_url(prim, sec \\ nil), do: prim || sec
end
def base_url do
uploader = Config.get([Pleroma.Upload, :uploader])
upload_base_url = Config.get([Pleroma.Upload, :base_url])
@ -244,7 +248,7 @@ defmodule Pleroma.Upload do
case uploader do
Pleroma.Uploaders.Local ->
choose_base_url(upload_base_url)
upload_base_url
Pleroma.Uploaders.S3 ->
bucket = Config.get([Pleroma.Uploaders.S3, :bucket])
@ -270,7 +274,7 @@ defmodule Pleroma.Upload do
end
_ ->
choose_base_url(public_endpoint, upload_base_url)
public_endpoint || upload_base_url
end
end
end

View File

@ -0,0 +1,52 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Upload.Filter.Exiftool.ReadDescription do
@moduledoc """
Gets a valid description from the related EXIF tags and provides it in the response if no description is provided yet.
It will first check ImageDescription; when that doesn't provide a valid description, it will check iptc:Caption-Abstract.
A valid description means the fields are filled in and not too long (see `:instance, :description_limit`).
"""
@behaviour Pleroma.Upload.Filter
@spec filter(Pleroma.Upload.t()) :: {:ok, any()} | {:error, String.t()}
def filter(%Pleroma.Upload{description: description})
when is_binary(description),
do: {:ok, :noop}
def filter(%Pleroma.Upload{tempfile: file} = upload),
do: {:ok, :filtered, upload |> Map.put(:description, read_description_from_exif_data(file))}
def filter(_, _), do: {:ok, :noop}
defp read_description_from_exif_data(file) do
nil
|> read_when_empty(file, "-ImageDescription")
|> read_when_empty(file, "-iptc:Caption-Abstract")
end
defp read_when_empty(current_description, _, _) when is_binary(current_description),
do: current_description
defp read_when_empty(_, file, tag) do
try do
{tag_content, 0} =
System.cmd("exiftool", ["-b", "-s3", tag, file],
stderr_to_stdout: true,
parallelism: true
)
tag_content = String.trim(tag_content)
if tag_content != "" and
String.length(tag_content) <=
Pleroma.Config.get([:instance, :description_limit]),
do: tag_content,
else: nil
rescue
_ in ErlangError -> nil
end
end
end
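To use the new filter it has to be listed in the upload filters; if you also strip metadata, you likely want `ReadDescription` to run before `StripMetadata` so the tags still exist when they are read. A sketch:
```elixir
config :pleroma, Pleroma.Upload,
  filters: [
    Pleroma.Upload.Filter.Exiftool.ReadDescription,
    Pleroma.Upload.Filter.Exiftool.StripMetadata
  ]
```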

View File

@ -2,24 +2,42 @@
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Upload.Filter.Exiftool do
defmodule Pleroma.Upload.Filter.Exiftool.StripMetadata do
@moduledoc """
Strips GPS related EXIF tags and overwrites the file in place.
Tries to strip all image metadata except colorspace and orientation, overwriting the file in place.
Also strips or replaces filesystem metadata, e.g. timestamps.
"""
@behaviour Pleroma.Upload.Filter
alias Pleroma.Config
@purge_default ["all", "CommonIFD0"]
@preserve_default ["ColorSpaceTags", "Orientation"]
@spec filter(Pleroma.Upload.t()) :: {:ok, :noop} | {:ok, :filtered} | {:error, String.t()}
# Formats not compatible with exiftool at this time
def filter(%Pleroma.Upload{content_type: "image/heic"}), do: {:ok, :noop}
def filter(%Pleroma.Upload{content_type: "image/webp"}), do: {:ok, :noop}
def filter(%Pleroma.Upload{content_type: "image/svg+xml"}), do: {:ok, :noop}
def filter(%Pleroma.Upload{content_type: "image/jxl"}), do: {:ok, :noop}
def filter(%Pleroma.Upload{tempfile: file, content_type: "image" <> _}) do
purge_args =
Config.get([__MODULE__, :purge], @purge_default)
|> Enum.map(fn mgroup -> "-" <> mgroup <> "=" end)
preserve_args =
Config.get([__MODULE__, :preserve], @preserve_default)
|> Enum.map(fn mgroup -> "-" <> mgroup end)
|> then(fn
# If -TagsFromFile is not followed by tag selectors, it will copy most available tags
[] -> []
args -> ["-TagsFromFile", "@" | args]
end)
args = ["-ignoreMinorErrors", "-overwrite_original" | purge_args] ++ preserve_args ++ [file]
try do
case System.cmd("exiftool", ["-overwrite_original", "-gps:all=", file], parallelism: true) do
case System.cmd("exiftool", args, parallelism: true) do
{_response, 0} -> {:ok, :filtered}
{error, 1} -> {:error, error}
end
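Based on the `Config.get([__MODULE__, :purge])` and `Config.get([__MODULE__, :preserve])` lookups above, the tag groups should be overridable in config; a sketch that mirrors the defaults:
```elixir
config :pleroma, Pleroma.Upload.Filter.Exiftool.StripMetadata,
  purge: ["all", "CommonIFD0"],
  preserve: ["ColorSpaceTags", "Orientation"]
```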

View File

@ -969,15 +969,16 @@ defmodule Pleroma.User do
defp maybe_send_registration_email(_), do: {:ok, :noop}
def needs_update?(%User{local: true}), do: false
def needs_update?(user, options \\ [])
def needs_update?(%User{local: true}, _options), do: false
def needs_update?(%User{local: false, last_refreshed_at: nil}, _options), do: true
def needs_update?(%User{local: false, last_refreshed_at: nil}), do: true
def needs_update?(%User{local: false} = user) do
NaiveDateTime.diff(NaiveDateTime.utc_now(), user.last_refreshed_at) >= 86_400
def needs_update?(%User{local: false} = user, options) do
NaiveDateTime.diff(NaiveDateTime.utc_now(), user.last_refreshed_at) >=
Keyword.get(options, :maximum_age, 86_400)
end
def needs_update?(_), do: true
def needs_update?(_, _options), do: true
# "Locked" (self-locked) users demand explicit authorization of follow requests
@spec can_direct_follow_local(User.t(), User.t()) :: true | false
@ -1980,10 +1981,10 @@ defmodule Pleroma.User do
def fetch_by_ap_id(ap_id), do: ActivityPub.make_user_from_ap_id(ap_id)
def get_or_fetch_by_ap_id(ap_id) do
def get_or_fetch_by_ap_id(ap_id, options \\ []) do
cached_user = get_cached_by_ap_id(ap_id)
maybe_fetched_user = needs_update?(cached_user) && fetch_by_ap_id(ap_id)
maybe_fetched_user = needs_update?(cached_user, options) && fetch_by_ap_id(ap_id)
case {cached_user, maybe_fetched_user} do
{_, {:ok, %User{} = user}} ->

View File

@ -1705,9 +1705,7 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
Fetcher.fetch_and_contain_remote_object_from_id(first) do
{:ok, false}
else
{:error, {:ok, %{status: code}}} when code in [401, 403] -> {:ok, true}
{:error, _} = e -> e
e -> {:error, e}
{:error, _} -> {:ok, true}
end
end
@ -1732,7 +1730,7 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
Logger.debug("Could not decode user at fetch #{ap_id}, #{inspect(e)}")
{:error, e}
{:error, {:reject, reason} = e} ->
{:reject, reason} = e ->
Logger.debug("Rejected user #{ap_id}: #{inspect(reason)}")
{:error, e}

View File

@ -12,9 +12,7 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.InternalFetchActor
alias Pleroma.Web.ActivityPub.ObjectView
alias Pleroma.Web.ActivityPub.Pipeline
alias Pleroma.Web.ActivityPub.Relay
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.UserView
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
@ -40,11 +38,9 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
# Note: :following and :followers must be served even without authentication (as via :api)
plug(
EnsureAuthenticatedPlug
when action in [:read_inbox, :update_outbox, :whoami, :upload_media]
when action in [:read_inbox]
)
plug(Majic.Plug, [pool: Pleroma.MajicPool] when action in [:upload_media])
plug(
Pleroma.Web.Plugs.Cache,
[query_params: false, tracking_fun: &__MODULE__.track_object_fetch/2]
@ -160,7 +156,9 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
end
end
# GET /relay/following
@doc """
GET /relay/following
"""
def relay_following(conn, _params) do
with %{halted: false} = conn <- FederatingPlug.call(conn, []) do
conn
@ -197,7 +195,9 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
end
end
# GET /relay/followers
@doc """
GET /relay/followers
"""
def relay_followers(conn, _params) do
with %{halted: false} = conn <- FederatingPlug.call(conn, []) do
conn
@ -317,14 +317,6 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
|> represent_service_actor(conn)
end
@doc "Returns the authenticated user's ActivityPub User object or a 404 Not Found if non-authenticated"
def whoami(%{assigns: %{user: %User{} = user}} = conn, _params) do
conn
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("user.json", %{user: user})
end
def read_inbox(
%{assigns: %{user: %User{nickname: nickname} = user}} = conn,
%{"nickname" => nickname, "page" => page?} = params
@ -375,105 +367,6 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
|> json(err)
end
defp fix_user_message(%User{ap_id: actor}, %{"type" => "Create", "object" => object} = activity)
when is_map(object) do
length =
[object["content"], object["summary"], object["name"]]
|> Enum.filter(&is_binary(&1))
|> Enum.join("")
|> String.length()
limit = Pleroma.Config.get([:instance, :limit])
if length < limit do
object =
object
|> Transmogrifier.strip_internal_fields()
|> Map.put("attributedTo", actor)
|> Map.put("actor", actor)
|> Map.put("id", Utils.generate_object_id())
{:ok, Map.put(activity, "object", object)}
else
{:error,
dgettext(
"errors",
"Character limit (%{limit} characters) exceeded, contains %{length} characters",
limit: limit,
length: length
)}
end
end
defp fix_user_message(
%User{ap_id: actor} = user,
%{"type" => "Delete", "object" => object} = activity
) do
with {_, %Object{data: object_data}} <- {:normalize, Object.normalize(object, fetch: false)},
{_, true} <- {:permission, user.is_moderator || actor == object_data["actor"]} do
{:ok, activity}
else
{:normalize, _} ->
{:error, "No such object found"}
{:permission, _} ->
{:forbidden, "You can't delete this object"}
end
end
defp fix_user_message(%User{}, activity) do
{:ok, activity}
end
def update_outbox(
%{assigns: %{user: %User{nickname: nickname, ap_id: actor} = user}} = conn,
%{"nickname" => nickname} = params
) do
params =
params
|> Map.drop(["nickname"])
|> Map.put("id", Utils.generate_activity_id())
|> Map.put("actor", actor)
with {:ok, params} <- fix_user_message(user, params),
{:ok, activity, _} <- Pipeline.common_pipeline(params, local: true),
%Activity{data: activity_data} <- Activity.normalize(activity) do
conn
|> put_status(:created)
|> put_resp_header("location", activity_data["id"])
|> json(activity_data)
else
{:forbidden, message} ->
conn
|> put_status(:forbidden)
|> json(message)
{:error, message} ->
conn
|> put_status(:bad_request)
|> json(message)
e ->
Logger.warning(fn -> "AP C2S: #{inspect(e)}" end)
conn
|> put_status(:bad_request)
|> json("Bad Request")
end
end
def update_outbox(%{assigns: %{user: %User{} = user}} = conn, %{"nickname" => nickname}) do
err =
dgettext("errors", "can't update outbox of %{nickname} as %{as_nickname}",
nickname: nickname,
as_nickname: user.nickname
)
conn
|> put_status(:forbidden)
|> json(err)
end
defp errors(conn, {:error, :not_found}) do
conn
|> put_status(:not_found)
@ -495,21 +388,6 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
conn
end
def upload_media(%{assigns: %{user: %User{} = user}} = conn, %{"file" => file} = data) do
with {:ok, object} <-
ActivityPub.upload(
file,
actor: User.ap_id(user),
description: Map.get(data, "description")
) do
Logger.debug(inspect(object))
conn
|> put_status(:created)
|> json(object.data)
end
end
def pinned(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname) do
conn

View File

@ -6,14 +6,29 @@ defmodule Pleroma.Web.ActivityPub.MRF.InlineQuotePolicy do
@moduledoc "Force a quote line into the message content."
@behaviour Pleroma.Web.ActivityPub.MRF.Policy
alias Pleroma.Object
defp build_inline_quote(prefix, url) do
"<span class=\"quote-inline\"><br/><br/>#{prefix}: <a href=\"#{url}\">#{url}</a></span>"
end
defp has_inline_quote?(content, quote_url) do
defp resolve_urls(quote_url) do
# Fetching here can cause infinite recursion as we run this logic on inbound objects too
# This is probably not a problem - it's an exceptional corner case for a local user to quote
# a post which doesn't exist
with %Object{} = obj <- Object.normalize(quote_url, fetch: false) do
id = obj.data["id"]
url = Map.get(obj.data, "url", id)
{id, url, [id, url, quote_url]}
else
_ -> {quote_url, quote_url, [quote_url]}
end
end
defp has_inline_quote?(content, urls) do
cond do
# Does the quote URL exist in the content?
content =~ quote_url -> true
Enum.any?(urls, fn url -> content =~ url end) -> true
# Does the content already have a .quote-inline span?
content =~ "<span class=\"quote-inline\">" -> true
# No inline quote found
@ -22,18 +37,22 @@ defmodule Pleroma.Web.ActivityPub.MRF.InlineQuotePolicy do
end
defp filter_object(%{"quoteUri" => quote_url} = object) do
{id, preferred_url, all_urls} = resolve_urls(quote_url)
object = Map.put(object, "quoteUri", id)
content = object["content"] || ""
if has_inline_quote?(content, quote_url) do
if has_inline_quote?(content, all_urls) do
object
else
prefix = Pleroma.Config.get([:mrf_inline_quote, :prefix])
content =
if String.ends_with?(content, "</p>") do
String.trim_trailing(content, "</p>") <> build_inline_quote(prefix, quote_url) <> "</p>"
String.trim_trailing(content, "</p>") <>
build_inline_quote(prefix, preferred_url) <> "</p>"
else
content <> build_inline_quote(prefix, quote_url)
content <> build_inline_quote(prefix, preferred_url)
end
Map.put(object, "content", content)

View File

@ -53,6 +53,13 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.ArticleNotePageValidator do
defp fix_url(%{"url" => url} = data) when is_bitstring(url), do: data
defp fix_url(%{"url" => url} = data) when is_map(url), do: Map.put(data, "url", url["href"])
defp fix_url(%{"url" => url} = data) when is_list(url) do
data
|> Map.put("url", List.first(url))
|> fix_url()
end
defp fix_url(data), do: data
defp fix_tag(%{"tag" => tag} = data) when is_list(tag) do

View File

@ -8,6 +8,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.EmojiReactValidator do
alias Pleroma.Emoji
alias Pleroma.Object
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes
alias Pleroma.Web.ActivityPub.Transmogrifier
import Ecto.Changeset
import Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations
@ -52,6 +53,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.EmojiReactValidator do
defp fix(data) do
data =
data
|> Transmogrifier.fix_tag()
|> fix_emoji_qualification()
|> CommonFixes.fix_actor()
|> CommonFixes.fix_activity_addressing()

View File

@ -16,11 +16,13 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.UserValidator do
alias Pleroma.Object.Containment
alias Pleroma.Signature
require Pleroma.Constants
@impl true
def validate(object, meta)
def validate(%{"type" => type, "id" => _id} = data, meta)
when type in ["Person", "Organization", "Group", "Application"] do
when type in Pleroma.Constants.actor_types() do
with :ok <- validate_pubkey(data),
:ok <- validate_inbox(data),
:ok <- contain_collection_origin(data) do

View File

@ -25,8 +25,8 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
import Ecto.Query
require Logger
require Pleroma.Constants
require Logger
@doc """
Modifies an incoming AP object (mastodon format) to our internal format.
@ -58,21 +58,48 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
def fix_summary(object), do: Map.put(object, "summary", "")
def fix_addressing_list(map, field) do
addrs = map[field]
defp fix_addressing_list(addrs) do
cond do
is_list(addrs) ->
Map.put(map, field, Enum.filter(addrs, &is_binary/1))
is_binary(addrs) ->
Map.put(map, field, [addrs])
true ->
Map.put(map, field, [])
is_list(addrs) -> Enum.filter(addrs, &is_binary/1)
is_binary(addrs) -> [addrs]
true -> []
end
end
# Per JSON-LD, plain "Public" and "as:Public" are equivalent to the full URI,
# but to simplify later checks we only want to deal with one representation internally
defp normalise_addressing_public_list(map, all_fields)
defp normalise_addressing_public_list(%{} = map, [field | fields]) do
full_uri = Pleroma.Constants.as_public()
map =
if map[field] != nil do
new_fval =
map[field]
|> fix_addressing_list()
|> Enum.map(fn
"Public" -> full_uri
"as:Public" -> full_uri
x -> x
end)
Map.put(map, field, new_fval)
else
map
end
normalise_addressing_public_list(map, fields)
end
defp normalise_addressing_public_list(map, _) do
map
end
defp normalise_addressing_public(map) do
normalise_addressing_public_list(map, ["to", "cc", "bto", "bcc"])
end
# if directMessage flag is set to true, leave the addressing alone
def fix_explicit_addressing(%{"directMessage" => true} = object, _follower_collection),
do: object
@ -96,6 +123,10 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
|> Map.put("cc", final_cc)
end
def fix_addressing_list_key(map, field) do
Map.put(map, field, fix_addressing_list(map[field]))
end
def fix_addressing(object) do
{:ok, %User{follower_address: follower_collection}} =
object
@ -103,10 +134,10 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
|> User.get_or_fetch_by_ap_id()
object
|> fix_addressing_list("to")
|> fix_addressing_list("cc")
|> fix_addressing_list("bto")
|> fix_addressing_list("bcc")
|> fix_addressing_list_key("to")
|> fix_addressing_list_key("cc")
|> fix_addressing_list_key("bto")
|> fix_addressing_list_key("bcc")
|> fix_explicit_addressing(follower_collection)
|> CommonFixes.fix_implicit_addressing(follower_collection)
end
@ -135,8 +166,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
|> Map.put("context", replied_object.data["context"] || object["conversation"])
|> Map.drop(["conversation", "inReplyToAtomUri"])
else
e ->
Logger.warning("Couldn't fetch reply@#{inspect(in_reply_to_id)}, error: #{inspect(e)}")
_ ->
object
end
else
@ -384,11 +414,28 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end)
end
def handle_incoming(data, options \\ [])
def handle_incoming(data, options \\ []) do
data = normalise_addressing_public(data)
data =
if data["object"] != nil do
object = normalise_addressing_public(data["object"])
Map.put(data, "object", object)
else
data
end
handle_incoming_normalised(data, options)
end
defp handle_incoming_normalised(data, options)
# Flag objects are placed ahead of the ID check because Mastodon 2.8 and earlier send them
# with nil ID.
def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} = data, _options) do
defp handle_incoming_normalised(
%{"type" => "Flag", "object" => objects, "actor" => actor} = data,
_options
) do
with context <- data["context"] || Utils.generate_context_id(),
content <- data["content"] || "",
%User{} = actor <- User.get_cached_by_ap_id(actor),
@ -409,20 +456,21 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
# disallow objects with bogus IDs
def handle_incoming(%{"id" => nil}, _options), do: :error
def handle_incoming(%{"id" => ""}, _options), do: :error
defp handle_incoming_normalised(%{"id" => nil}, _options), do: :error
defp handle_incoming_normalised(%{"id" => ""}, _options), do: :error
# length of https:// = 8, should validate better, but good enough for now.
def handle_incoming(%{"id" => id}, _options) when is_binary(id) and byte_size(id) < 8,
do: :error
defp handle_incoming_normalised(%{"id" => id}, _options)
when is_binary(id) and byte_size(id) < 8,
do: :error
@doc "Rewrite misskey likes into EmojiReacts"
def handle_incoming(
%{
"type" => "Like",
"content" => reaction
} = data,
options
) do
# Rewrite misskey likes into EmojiReacts
defp handle_incoming_normalised(
%{
"type" => "Like",
"content" => reaction
} = data,
options
) do
if Pleroma.Emoji.is_unicode_emoji?(reaction) || Pleroma.Emoji.matches_shortcode?(reaction) do
data
|> Map.put("type", "EmojiReact")
@ -434,11 +482,11 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(
%{"type" => "Create", "object" => %{"type" => objtype, "id" => obj_id}} = data,
options
)
when objtype in ~w{Question Answer Audio Video Event Article Note Page} do
defp handle_incoming_normalised(
%{"type" => "Create", "object" => %{"type" => objtype, "id" => obj_id}} = data,
options
)
when objtype in ~w{Question Answer Audio Video Event Article Note Page} do
fetch_options = Keyword.put(options, :depth, (options[:depth] || 0) + 1)
object =
@ -470,8 +518,8 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(%{"type" => type} = data, _options)
when type in ~w{Like EmojiReact Announce Add Remove} do
defp handle_incoming_normalised(%{"type" => type} = data, _options)
when type in ~w{Like EmojiReact Announce Add Remove} do
with :ok <- ObjectValidator.fetch_actor_and_object(data),
{:ok, activity, _meta} <- Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
@ -481,11 +529,11 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(
%{"type" => type} = data,
_options
)
when type in ~w{Update Block Follow Accept Reject} do
defp handle_incoming_normalised(
%{"type" => type} = data,
_options
)
when type in ~w{Update Block Follow Accept Reject} do
with {:ok, %User{}} <- ObjectValidator.fetch_actor(data),
{:ok, activity, _} <-
Pipeline.common_pipeline(data, local: false) do
@ -493,10 +541,10 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(
%{"type" => "Delete"} = data,
_options
) do
defp handle_incoming_normalised(
%{"type" => "Delete"} = data,
_options
) do
with {:ok, activity, _} <-
Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
@ -516,15 +564,15 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(
%{
"type" => "Undo",
"object" => %{"type" => "Follow", "object" => followed},
"actor" => follower,
"id" => id
} = _data,
_options
) do
defp handle_incoming_normalised(
%{
"type" => "Undo",
"object" => %{"type" => "Follow", "object" => followed},
"actor" => follower,
"id" => id
} = _data,
_options
) do
with %User{local: true} = followed <- User.get_cached_by_ap_id(followed),
{:ok, %User{} = follower} <- User.get_or_fetch_by_ap_id(follower),
{:ok, activity} <- ActivityPub.unfollow(follower, followed, id, false) do
@ -535,28 +583,28 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(
%{
"type" => "Undo",
"object" => %{"type" => type}
} = data,
_options
)
when type in ["Like", "EmojiReact", "Announce", "Block"] do
defp handle_incoming_normalised(
%{
"type" => "Undo",
"object" => %{"type" => type}
} = data,
_options
)
when type in ["Like", "EmojiReact", "Announce", "Block"] do
with {:ok, activity, _} <- Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
end
end
# For Undos that don't have the complete object attached, try to find it in our database.
def handle_incoming(
%{
"type" => "Undo",
"object" => object
} = activity,
options
)
when is_binary(object) do
defp handle_incoming_normalised(
%{
"type" => "Undo",
"object" => object
} = activity,
options
)
when is_binary(object) do
with %Activity{data: data} <- Activity.get_by_ap_id(object) do
activity
|> Map.put("object", data)
@ -566,17 +614,22 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(
%{
"type" => "Move",
"actor" => origin_actor,
"object" => origin_actor,
"target" => target_actor
},
_options
) do
defp handle_incoming_normalised(
%{
"type" => "Move",
"actor" => origin_actor,
"object" => origin_actor,
"target" => target_actor
},
_options
) do
with %User{} = origin_user <- User.get_cached_by_ap_id(origin_actor),
{:ok, %User{} = target_user} <- User.get_or_fetch_by_ap_id(target_actor),
# Use a dramatically shortened maximum age before refresh here because it is reasonable
# for a user to
# 1. Add the alias to their new account and then
# 2. Press the button on their new account
# within a very short period of time and expect it to work
{:ok, %User{} = target_user} <- User.get_or_fetch_by_ap_id(target_actor, maximum_age: 5),
true <- origin_actor in target_user.also_known_as do
ActivityPub.move(origin_user, target_user, false)
else
@ -584,7 +637,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
end
end
def handle_incoming(_, _), do: :error
defp handle_incoming_normalised(_, _), do: :error
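
Throughout this file, the former handle_incoming/2 clauses become private handle_incoming_normalised/2 clauses. The new public entry point is not shown in this excerpt; a minimal sketch of what such a wrapper plausibly looks like (normalise_addressing/1 is a hypothetical name for whatever normalisation step the rename implies):

# sketch only: the real wrapper lives outside this excerpt
def handle_incoming(data, options \\ []) do
  data
  |> normalise_addressing()
  |> handle_incoming_normalised(options)
end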
@spec get_obj_helper(String.t(), Keyword.t()) :: {:ok, Object.t()} | nil
def get_obj_helper(id, options \\ []) do
@ -828,8 +881,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
relative_object do
Map.put(data, "object", external_url)
else
{:fetch, e} ->
Logger.error("Couldn't fetch fixed_object@#{object} #{inspect(e)}")
{:fetch, _} ->
data
_ ->

View File

@ -26,8 +26,7 @@ defmodule Pleroma.Web.ActivityPub.UserView do
"oauthAuthorizationEndpoint" => url(~p"/oauth/authorize"),
"oauthRegistrationEndpoint" => url(~p"/api/v1/apps"),
"oauthTokenEndpoint" => url(~p"/oauth/token"),
"sharedInbox" => url(~p"/inbox"),
"uploadMedia" => url(~p"/api/ap/upload_media")
"sharedInbox" => url(~p"/inbox")
}
end

View File

@ -150,7 +150,7 @@ defmodule Pleroma.Web.ApiSpec.TwitterUtilOperation do
"removes the contents of a message from the push notification"
)
],
requestBody: nil,
requestBody: request_body("Parameters", update_notification_settings_request()),
responses: %{
200 =>
Operation.response("Success", "application/json", %Schema{
@ -432,4 +432,22 @@ defmodule Pleroma.Web.ApiSpec.TwitterUtilOperation do
}
}
end
defp update_notification_settings_request do
%Schema{
title: "UpdateNotificationSettings",
description: "PUT paramenters (query, form or JSON) for updating notification settings",
type: :object,
properties: %{
block_from_strangers: %Schema{
type: :boolean,
description: "blocks notifications from accounts you do not follow"
},
hide_notification_contents: %Schema{
type: :boolean,
description: "removes the contents of a message from the push notification"
}
}
}
end
end

View File

@ -78,9 +78,7 @@ defmodule Pleroma.Web.Feed.FeedView do
end
def activity_content(%{"content" => content}) do
content
|> String.replace(~r/[\n\r]/, "")
|> escape()
escape(content)
end
def activity_content(_), do: ""

View File

@ -14,8 +14,6 @@ defmodule Pleroma.Web.MediaProxy do
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
@mix_env Mix.env()
def cache_table, do: @cache_table
@spec in_banned_urls(String.t()) :: boolean()
@ -146,14 +144,8 @@ defmodule Pleroma.Web.MediaProxy do
if path = URI.parse(url_or_path).path, do: Path.basename(path)
end
if @mix_env == :test do
def base_url do
Config.get([:media_proxy, :base_url], Endpoint.url())
end
else
def base_url do
Config.get!([:media_proxy, :base_url])
end
def base_url do
Config.get!([:media_proxy, :base_url])
end
defp proxy_url(path, sig_base64, url_base64, filename) do
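
With the test-only fallback removed, base_url must now be configured explicitly in every environment; Config.get!/1 raises when the key is absent. A minimal configuration sketch (the hostname is a placeholder):

# config/config.exs; test configs now need this key too
config :pleroma, :media_proxy,
  enabled: true,
  base_url: "https://media.example.com"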

View File

@ -12,14 +12,38 @@ defmodule Pleroma.Web.Metadata.Providers.OpenGraph do
@behaviour Provider
@media_types ["image", "audio", "video"]
defp user_avatar_tags(user) do
if Utils.visible?(user) do
[
{:meta, [property: "og:image", content: MediaProxy.preview_url(User.avatar_url(user))],
[]},
{:meta, [property: "og:image:width", content: 150], []},
{:meta, [property: "og:image:height", content: 150], []}
]
else
[]
end
end
@impl Provider
def build_tags(%{
object: object,
url: url,
user: user
}) do
attachments = build_attachments(object)
scrubbed_content = Utils.scrub_html_and_truncate(object)
attachments =
if Utils.visible?(object) do
build_attachments(object)
else
[]
end
scrubbed_content =
if Utils.visible?(object) do
Utils.scrub_html_and_truncate(object)
else
"Content cannot be displayed."
end
[
{:meta,
@ -36,12 +60,7 @@ defmodule Pleroma.Web.Metadata.Providers.OpenGraph do
{:meta, [property: "og:type", content: "article"], []}
] ++
if attachments == [] or Metadata.activity_nsfw?(object) do
[
{:meta, [property: "og:image", content: MediaProxy.preview_url(User.avatar_url(user))],
[]},
{:meta, [property: "og:image:width", content: 150], []},
{:meta, [property: "og:image:height", content: 150], []}
]
user_avatar_tags(user)
else
attachments
end
@ -49,7 +68,9 @@ defmodule Pleroma.Web.Metadata.Providers.OpenGraph do
@impl Provider
def build_tags(%{user: user}) do
with truncated_bio = Utils.scrub_html_and_truncate(user.bio) do
if Utils.visible?(user) do
truncated_bio = Utils.scrub_html_and_truncate(user.bio)
[
{:meta,
[
@ -58,12 +79,10 @@ defmodule Pleroma.Web.Metadata.Providers.OpenGraph do
], []},
{:meta, [property: "og:url", content: user.uri || user.ap_id], []},
{:meta, [property: "og:description", content: truncated_bio], []},
{:meta, [property: "og:type", content: "article"], []},
{:meta, [property: "og:image", content: MediaProxy.preview_url(User.avatar_url(user))],
[]},
{:meta, [property: "og:image:width", content: 150], []},
{:meta, [property: "og:image:height", content: 150], []}
]
{:meta, [property: "og:type", content: "article"], []}
] ++ user_avatar_tags(user)
else
[]
end
end

View File

@ -17,8 +17,19 @@ defmodule Pleroma.Web.Metadata.Providers.TwitterCard do
@impl Provider
def build_tags(%{activity_id: id, object: object, user: user}) do
attachments = build_attachments(id, object)
scrubbed_content = Utils.scrub_html_and_truncate(object)
attachments =
if Utils.visible?(object) do
build_attachments(id, object)
else
[]
end
scrubbed_content =
if Utils.visible?(object) do
Utils.scrub_html_and_truncate(object)
else
"Content cannot be displayed."
end
[
title_tag(user),
@ -36,13 +47,17 @@ defmodule Pleroma.Web.Metadata.Providers.TwitterCard do
@impl Provider
def build_tags(%{user: user}) do
with truncated_bio = Utils.scrub_html_and_truncate(user.bio) do
[
title_tag(user),
{:meta, [name: "twitter:description", content: truncated_bio], []},
image_tag(user),
{:meta, [name: "twitter:card", content: "summary"], []}
]
if Utils.visible?(user) do
with truncated_bio = Utils.scrub_html_and_truncate(user.bio) do
[
title_tag(user),
{:meta, [name: "twitter:description", content: truncated_bio], []},
image_tag(user),
{:meta, [name: "twitter:card", content: "summary"], []}
]
end
else
[]
end
end
@ -51,7 +66,11 @@ defmodule Pleroma.Web.Metadata.Providers.TwitterCard do
end
def image_tag(user) do
{:meta, [name: "twitter:image", content: MediaProxy.preview_url(User.avatar_url(user))], []}
if Utils.visible?(user) do
{:meta, [name: "twitter:image", content: MediaProxy.preview_url(User.avatar_url(user))], []}
else
{:meta, [name: "twitter:image", content: ""], []}
end
end
defp build_attachments(id, %{data: %{"attachment" => attachments}}) do

View File

@ -7,6 +7,15 @@ defmodule Pleroma.Web.Metadata.Utils do
alias Pleroma.Emoji
alias Pleroma.Formatter
alias Pleroma.HTML
alias Pleroma.Web.ActivityPub.Visibility
def visible?(%Pleroma.User{} = object) do
Visibility.restrict_unauthenticated_access?(object) == :visible
end
def visible?(object) do
Visibility.visible_for_user?(object, nil)
end
defp scrub_html_and_truncate_object_field(field, object) do
field

View File

@ -800,13 +800,9 @@ defmodule Pleroma.Web.Router do
scope "/", Pleroma.Web.ActivityPub do
pipe_through([:activitypub_client])
get("/api/ap/whoami", ActivityPubController, :whoami)
get("/users/:nickname/inbox", ActivityPubController, :read_inbox)
get("/users/:nickname/outbox", ActivityPubController, :outbox)
post("/users/:nickname/outbox", ActivityPubController, :update_outbox)
post("/api/ap/upload_media", ActivityPubController, :upload_media)
get("/users/:nickname/collections/featured", ActivityPubController, :pinned)
end

View File

@ -151,41 +151,40 @@ defmodule Pleroma.Web.Telemetry do
# phoenix.router_dispatch.stop.duration
# pleroma.repo.query.total_time
# pleroma.repo.query.queue_time
dist_metrics =
[
distribution("phoenix.endpoint.stop.duration.fdist",
event_name: [:phoenix, :endpoint, :stop],
measurement: :duration,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets
]
),
distribution("pleroma.repo.query.decode_time.fdist",
event_name: [:pleroma, :repo, :query],
measurement: :decode_time,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets_quick
]
),
distribution("pleroma.repo.query.query_time.fdist",
event_name: [:pleroma, :repo, :query],
measurement: :query_time,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets
]
),
distribution("pleroma.repo.query.idle_time.fdist",
event_name: [:pleroma, :repo, :query],
measurement: :idle_time,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets
]
)
]
dist_metrics = [
distribution("phoenix.endpoint.stop.duration.fdist",
event_name: [:phoenix, :endpoint, :stop],
measurement: :duration,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets
]
),
distribution("pleroma.repo.query.decode_time.fdist",
event_name: [:pleroma, :repo, :query],
measurement: :decode_time,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets_quick
]
),
distribution("pleroma.repo.query.query_time.fdist",
event_name: [:pleroma, :repo, :query],
measurement: :query_time,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets
]
),
distribution("pleroma.repo.query.idle_time.fdist",
event_name: [:pleroma, :repo, :query],
measurement: :idle_time,
unit: {:native, :millisecond},
reporter_options: [
buckets: simple_buckets
]
)
]
vm_metrics =
sum_counter_pair("vm.memory.total",

View File

@ -184,7 +184,13 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
json(conn, emoji)
end
def update_notificaton_settings(%{assigns: %{user: user}} = conn, params) do
def update_notificaton_settings(
%{assigns: %{user: user}, body_params: body_params} = conn,
params
) do
# OpenApiSpex 3.x prevents Plug's usual parameter premerging
params = Map.merge(params, body_params)
with {:ok, _} <- User.update_notification_settings(user, params) do
json(conn, %{status: "success"})
end
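
With OpenApiSpex 3.x the validated body arrives separately in conn.body_params rather than being premerged into params, hence the explicit Map.merge/2. A hypothetical request against this endpoint, with route and values assumed purely for illustration:

# PUT /api/pleroma/notification_settings?block_from_strangers=true
# with JSON body: {"hide_notification_contents": true}
params = %{"block_from_strangers" => true}
body_params = %{"hide_notification_contents" => true}
Map.merge(params, body_params)
#=> %{"block_from_strangers" => true, "hide_notification_contents" => true}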

View File

@ -65,7 +65,7 @@ defmodule Pleroma.Web.WebFinger do
end
defp gather_aliases(%User{} = user) do
[user.ap_id | user.also_known_as]
[user.ap_id]
end
def represent_user(user, "JSON") do

View File

@ -26,7 +26,7 @@ defmodule Pleroma.Web.XML do
def parse_document(text) do
try do
doc = SweetXml.parse(text, dtd: :none)
doc = SweetXml.parse(text, dtd: :none, quiet: true)
{:ok, doc}
rescue

View File

@ -14,7 +14,8 @@ defmodule Pleroma.Workers.ReceiverWorker do
else
{:error, :origin_containment_failed} -> {:discard, :origin_containment_failed}
{:error, {:reject, reason}} -> {:discard, reason}
e -> e
{:error, _} = e -> e
e -> {:error, e}
end
end
end

View File

@ -5,10 +5,42 @@
defmodule Pleroma.Workers.RemoteFetcherWorker do
alias Pleroma.Object.Fetcher
use Pleroma.Workers.WorkerHelper, queue: "remote_fetcher"
use Pleroma.Workers.WorkerHelper,
queue: "remote_fetcher",
unique: [period: 300, states: Oban.Job.states(), keys: [:op, :id]]
@impl Oban.Worker
def perform(%Job{args: %{"op" => "fetch_remote", "id" => id} = args}) do
{:ok, _object} = Fetcher.fetch_object_from_id(id, depth: args["depth"])
case Fetcher.fetch_object_from_id(id, depth: args["depth"]) do
{:ok, _object} ->
:ok
{:error, :forbidden} ->
{:discard, :forbidden}
{:error, :not_found} ->
{:discard, :not_found}
{:error, :allowed_depth} ->
{:discard, :allowed_depth}
{:error, :invalid_uri_scheme} ->
{:discard, :invalid_uri_scheme}
{:error, :local_resource} ->
{:discard, :local_resource}
{:reject, _} ->
{:discard, :reject}
{:error, :id_mismatch} ->
{:discard, :id_mismatch}
{:error, _} = e ->
e
e ->
{:error, e}
end
end
end
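
Two behavioural notes, assuming Oban 2.x semantics: {:discard, reason} completes a job without retries, so permanently failing fetches (:not_found, :forbidden, :allowed_depth and so on) no longer requeue, while {:error, _} still schedules a retry. The unique option above additionally deduplicates identical fetch jobs; a rough sketch of that effect, assuming the enqueue/2 helper that WorkerHelper generates:

alias Pleroma.Workers.RemoteFetcherWorker

ap_id = "https://remote.example/objects/1"
{:ok, _job} = RemoteFetcherWorker.enqueue("fetch_remote", %{"id" => ap_id})
{:ok, dup} = RemoteFetcherWorker.enqueue("fetch_remote", %{"id" => ap_id})
# the second insert within the 300s window is flagged as a duplicate
dup.conflict? #=> true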

View File

@ -25,12 +25,16 @@ defmodule Pleroma.Workers.WorkerHelper do
defmacro __using__(opts) do
caller_module = __CALLER__.module
queue = Keyword.fetch!(opts, :queue)
# by default just stop unintended duplicates - this can and should be overridden
# if you want to have a more complex uniqueness constraint
uniqueness = Keyword.get(opts, :unique, period: 1)
quote do
# Note: `max_attempts` is intended to be overridden in `new/2` call
use Oban.Worker,
queue: unquote(queue),
max_attempts: 1
max_attempts: 1,
unique: unquote(uniqueness)
alias Oban.Job

View File

@ -4,7 +4,7 @@ defmodule Pleroma.Mixfile do
def project do
[
app: :pleroma,
version: version("3.12.2"),
version: version("3.13.0"),
elixir: "~> 1.14",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: Mix.compilers(),
@ -125,7 +125,7 @@ defmodule Pleroma.Mixfile do
{:ecto_enum, "~> 1.4"},
{:ecto_sql, "~> 3.10.0"},
{:postgrex, "~> 0.17.2"},
{:oban, "~> 2.15.2"},
{:oban, "~> 2.17.8"},
{:gettext, "~> 0.22.3"},
{:bcrypt_elixir, "~> 3.0.1"},
{:fast_sanitize, "~> 0.2.3"},

View File

@ -3,55 +3,55 @@
"base62": {:hex, :base62, "1.2.2", "85c6627eb609317b70f555294045895ffaaeb1758666ab9ef9ca38865b11e629", [:mix], [{:custom_base, "~> 0.2.1", [hex: :custom_base, repo: "hexpm", optional: false]}], "hexpm", "d41336bda8eaa5be197f1e4592400513ee60518e5b9f4dcf38f4b4dae6f377bb"},
"bbcode_pleroma": {:hex, :bbcode_pleroma, "0.2.0", "d36f5bca6e2f62261c45be30fa9b92725c0655ad45c99025cb1c3e28e25803ef", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "19851074419a5fedb4ef49e1f01b30df504bb5dbb6d6adfc135238063bebd1c3"},
"bcrypt_elixir": {:hex, :bcrypt_elixir, "3.0.1", "9be815469e6bfefec40fa74658ecbbe6897acfb57614df1416eeccd4903f602c", [:make, :mix], [{:comeonin, "~> 5.3", [hex: :comeonin, repo: "hexpm", optional: false]}, {:elixir_make, "~> 0.6", [hex: :elixir_make, repo: "hexpm", optional: false]}], "hexpm", "486bb95efb645d1efc6794c1ddd776a186a9a713abf06f45708a6ce324fb96cf"},
"benchee": {:hex, :benchee, "1.2.0", "afd2f0caec06ce3a70d9c91c514c0b58114636db9d83c2dc6bfd416656618353", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}, {:statistex, "~> 1.0", [hex: :statistex, repo: "hexpm", optional: false]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "ee729e53217898b8fd30aaad3cce61973dab61574ae6f48229fe7ff42d5e4457"},
"bunt": {:hex, :bunt, "0.2.1", "e2d4792f7bc0ced7583ab54922808919518d0e57ee162901a16a1b6664ef3b14", [:mix], [], "hexpm", "a330bfb4245239787b15005e66ae6845c9cd524a288f0d141c148b02603777a5"},
"benchee": {:hex, :benchee, "1.3.0", "f64e3b64ad3563fa9838146ddefb2d2f94cf5b473bdfd63f5ca4d0657bf96694", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}, {:statistex, "~> 1.0", [hex: :statistex, repo: "hexpm", optional: false]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "34f4294068c11b2bd2ebf2c59aac9c7da26ffa0068afdf3419f1b176e16c5f81"},
"bunt": {:hex, :bunt, "1.0.0", "081c2c665f086849e6d57900292b3a161727ab40431219529f13c4ddcf3e7a44", [:mix], [], "hexpm", "dc5f86aa08a5f6fa6b8096f0735c4e76d54ae5c9fa2c143e5a1fc7c1cd9bb6b5"},
"cachex": {:hex, :cachex, "3.6.0", "14a1bfbeee060dd9bec25a5b6f4e4691e3670ebda28c8ba2884b12fe30b36bf8", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:jumper, "~> 1.0", [hex: :jumper, repo: "hexpm", optional: false]}, {:sleeplocks, "~> 1.1", [hex: :sleeplocks, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm", "ebf24e373883bc8e0c8d894a63bbe102ae13d918f790121f5cfe6e485cc8e2e2"},
"calendar": {:hex, :calendar, "1.0.0", "f52073a708528482ec33d0a171954ca610fe2bd28f1e871f247dc7f1565fa807", [:mix], [{:tzdata, "~> 0.1.201603 or ~> 0.5.20 or ~> 1.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "990e9581920c82912a5ee50e62ff5ef96da6b15949a2ee4734f935fdef0f0a6f"},
"captcha": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/elixir-captcha.git", "90f6ce7672f70f56708792a98d98bd05176c9176", [ref: "90f6ce7672f70f56708792a98d98bd05176c9176"]},
"castore": {:hex, :castore, "1.0.5", "9eeebb394cc9a0f3ae56b813459f990abb0a3dedee1be6b27fdb50301930502f", [:mix], [], "hexpm", "8d7c597c3e4a64c395980882d4bca3cebb8d74197c590dc272cfd3b6a6310578"},
"castore": {:hex, :castore, "1.0.6", "ffc42f110ebfdafab0ea159cd43d31365fa0af0ce4a02ecebf1707ae619ee727", [:mix], [], "hexpm", "374c6e7ca752296be3d6780a6d5b922854ffcc74123da90f2f328996b962d33a"},
"certifi": {:hex, :certifi, "2.12.0", "2d1cca2ec95f59643862af91f001478c9863c2ac9cb6e2f89780bfd8de987329", [:rebar3], [], "hexpm", "ee68d85df22e554040cdb4be100f33873ac6051387baf6a8f6ce82272340ff1c"},
"combine": {:hex, :combine, "0.10.0", "eff8224eeb56498a2af13011d142c5e7997a80c8f5b97c499f84c841032e429f", [:mix], [], "hexpm", "1b1dbc1790073076580d0d1d64e42eae2366583e7aecd455d1215b0d16f2451b"},
"comeonin": {:hex, :comeonin, "5.4.0", "246a56ca3f41d404380fc6465650ddaa532c7f98be4bda1b4656b3a37cc13abe", [:mix], [], "hexpm", "796393a9e50d01999d56b7b8420ab0481a7538d0caf80919da493b4a6e51faf1"},
"concurrent_limiter": {:git, "https://akkoma.dev/AkkomaGang/concurrent-limiter.git", "a9e0b3d64574bdba761f429bb4fba0cf687b3338", [ref: "a9e0b3d64574bdba761f429bb4fba0cf687b3338"]},
"connection": {:hex, :connection, "1.1.0", "ff2a49c4b75b6fb3e674bfc5536451607270aac754ffd1bdfe175abe4a6d7a68", [:mix], [], "hexpm", "722c1eb0a418fbe91ba7bd59a47e28008a189d47e37e0e7bb85585a016b2869c"},
"cors_plug": {:hex, :cors_plug, "3.0.3", "7c3ac52b39624bc616db2e937c282f3f623f25f8d550068b6710e58d04a0e330", [:mix], [{:plug, "~> 1.13", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "3f2d759e8c272ed3835fab2ef11b46bddab8c1ab9528167bd463b6452edf830d"},
"cowboy": {:hex, :cowboy, "2.10.0", "ff9ffeff91dae4ae270dd975642997afe2a1179d94b1887863e43f681a203e26", [:make, :rebar3], [{:cowlib, "2.12.1", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, "1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "3afdccb7183cc6f143cb14d3cf51fa00e53db9ec80cdcd525482f5e99bc41d6b"},
"cowboy": {:hex, :cowboy, "2.12.0", "f276d521a1ff88b2b9b4c54d0e753da6c66dd7be6c9fca3d9418b561828a3731", [:make, :rebar3], [{:cowlib, "2.13.0", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, "1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "8a7abe6d183372ceb21caa2709bec928ab2b72e18a3911aa1771639bef82651e"},
"cowboy_telemetry": {:hex, :cowboy_telemetry, "0.4.0", "f239f68b588efa7707abce16a84d0d2acf3a0f50571f8bb7f56a15865aae820c", [:rebar3], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7d98bac1ee4565d31b62d59f8823dfd8356a169e7fcbb83831b8a5397404c9de"},
"cowlib": {:hex, :cowlib, "2.12.1", "a9fa9a625f1d2025fe6b462cb865881329b5caff8f1854d1cbc9f9533f00e1e1", [:make, :rebar3], [], "hexpm", "163b73f6367a7341b33c794c4e88e7dbfe6498ac42dcd69ef44c5bc5507c8db0"},
"credo": {:hex, :credo, "1.7.1", "6e26bbcc9e22eefbff7e43188e69924e78818e2fe6282487d0703652bc20fd62", [:mix], [{:bunt, "~> 0.2.1", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2.8", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "e9871c6095a4c0381c89b6aa98bc6260a8ba6addccf7f6a53da8849c748a58a2"},
"cowlib": {:hex, :cowlib, "2.13.0", "db8f7505d8332d98ef50a3ef34b34c1afddec7506e4ee4dd4a3a266285d282ca", [:make, :rebar3], [], "hexpm", "e1e1284dc3fc030a64b1ad0d8382ae7e99da46c3246b815318a4b848873800a4"},
"credo": {:hex, :credo, "1.7.5", "643213503b1c766ec0496d828c90c424471ea54da77c8a168c725686377b9545", [:mix], [{:bunt, "~> 0.2.1 or ~> 1.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "f799e9b5cd1891577d8c773d245668aa74a2fcd15eb277f51a0131690ebfb3fd"},
"custom_base": {:hex, :custom_base, "0.2.1", "4a832a42ea0552299d81652aa0b1f775d462175293e99dfbe4d7dbaab785a706", [:mix], [], "hexpm", "8df019facc5ec9603e94f7270f1ac73ddf339f56ade76a721eaa57c1493ba463"},
"db_connection": {:hex, :db_connection, "2.6.0", "77d835c472b5b67fc4f29556dee74bf511bbafecdcaf98c27d27fa5918152086", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "c2f992d15725e721ec7fbc1189d4ecdb8afef76648c746a8e1cad35e3b8a35f3"},
"decimal": {:hex, :decimal, "2.1.1", "5611dca5d4b2c3dd497dec8f68751f1f1a54755e8ed2a966c2633cf885973ad6", [:mix], [], "hexpm", "53cfe5f497ed0e7771ae1a475575603d77425099ba5faef9394932b35020ffcc"},
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm", "ce708e5f094b9cd4e8f2be4f00d2f4250c4095be93f8cd6d018c753894885430"},
"dialyxir": {:hex, :dialyxir, "1.4.2", "764a6e8e7a354f0ba95d58418178d486065ead1f69ad89782817c296d0d746a5", [:mix], [{:erlex, ">= 0.2.6", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "516603d8067b2fd585319e4b13d3674ad4f314a5902ba8130cd97dc902ce6bbd"},
"dialyxir": {:hex, :dialyxir, "1.4.3", "edd0124f358f0b9e95bfe53a9fcf806d615d8f838e2202a9f430d59566b6b53b", [:mix], [{:erlex, ">= 0.2.6", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "bf2cfb75cd5c5006bec30141b131663299c661a864ec7fbbc72dfa557487a986"},
"earmark": {:hex, :earmark, "1.4.46", "8c7287bd3137e99d26ae4643e5b7ef2129a260e3dcf41f251750cb4563c8fb81", [:mix], [], "hexpm", "798d86db3d79964e759ddc0c077d5eb254968ed426399fbf5a62de2b5ff8910a"},
"earmark_parser": {:hex, :earmark_parser, "1.4.39", "424642f8335b05bb9eb611aa1564c148a8ee35c9c8a8bba6e129d51a3e3c6769", [:mix], [], "hexpm", "06553a88d1f1846da9ef066b87b57c6f605552cfbe40d20bd8d59cc6bde41944"},
"eblurhash": {:hex, :eblurhash, "1.2.2", "7da4255aaea984b31bb71155f673257353b0e0554d0d30dcf859547e74602582", [:rebar3], [], "hexpm", "8c20ca00904de023a835a9dcb7b7762fed32264c85a80c3cafa85288e405044c"},
"ecto": {:hex, :ecto, "3.10.3", "eb2ae2eecd210b4eb8bece1217b297ad4ff824b4384c0e3fdd28aaf96edd6135", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "44bec74e2364d491d70f7e42cd0d690922659d329f6465e89feb8a34e8cd3433"},
"ecto_enum": {:hex, :ecto_enum, "1.4.0", "d14b00e04b974afc69c251632d1e49594d899067ee2b376277efd8233027aec8", [:mix], [{:ecto, ">= 3.0.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:ecto_sql, "> 3.0.0", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:mariaex, ">= 0.0.0", [hex: :mariaex, repo: "hexpm", optional: true]}, {:postgrex, ">= 0.0.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "8fb55c087181c2b15eee406519dc22578fa60dd82c088be376d0010172764ee4"},
"ecto_psql_extras": {:hex, :ecto_psql_extras, "0.7.14", "7a20cfe913b0476542b43870e67386461258734896035e3f284039fd18bd4c4c", [:mix], [{:ecto_sql, "~> 3.7", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16.0 or ~> 0.17.0", [hex: :postgrex, repo: "hexpm", optional: false]}, {:table_rex, "~> 3.1.1", [hex: :table_rex, repo: "hexpm", optional: false]}], "hexpm", "22f5f98592dd597db9416fcef00effae0787669fdcb6faf447e982b553798e98"},
"ecto_psql_extras": {:hex, :ecto_psql_extras, "0.7.15", "0fc29dbae0e444a29bd6abeee4cf3c4c037e692a272478a234a1cc765077dbb1", [:mix], [{:ecto_sql, "~> 3.7", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16.0 or ~> 0.17.0", [hex: :postgrex, repo: "hexpm", optional: false]}, {:table_rex, "~> 3.1.1 or ~> 4.0.0", [hex: :table_rex, repo: "hexpm", optional: false]}], "hexpm", "b6127f3a5c6fc3d84895e4768cc7c199f22b48b67d6c99b13fbf4a374e73f039"},
"ecto_sql": {:hex, :ecto_sql, "3.10.2", "6b98b46534b5c2f8b8b5f03f126e75e2a73c64f3c071149d32987a5378b0fdbd", [:mix], [{:db_connection, "~> 2.4.1 or ~> 2.5", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.10.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.6.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.16.0 or ~> 0.17.0 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.1 or ~> 2.2", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "68c018debca57cb9235e3889affdaec7a10616a4e3a80c99fa1d01fdafaa9007"},
"elasticsearch": {:git, "https://akkoma.dev/AkkomaGang/elasticsearch-elixir.git", "6cd946f75f6ab9042521a009d1d32d29a90113ca", [ref: "main"]},
"elixir_make": {:hex, :elixir_make, "0.7.7", "7128c60c2476019ed978210c245badf08b03dbec4f24d05790ef791da11aa17c", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}], "hexpm", "5bc19fff950fad52bbe5f211b12db9ec82c6b34a9647da0c2224b8b8464c7e6c"},
"elixir_xml_to_map": {:hex, :elixir_xml_to_map, "3.0.0", "67dcff30ecf72aed37ab08525133e4420717a749436e22bfece431e7dddeea7e", [:mix], [{:erlsom, "~> 1.4", [hex: :erlsom, repo: "hexpm", optional: false]}], "hexpm", "11222dd7f029f8db7a6662b41c992dbdb0e1c6e4fdea6a42056f9d27c847efbb"},
"elixir_make": {:hex, :elixir_make, "0.8.3", "d38d7ee1578d722d89b4d452a3e36bcfdc644c618f0d063b874661876e708683", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:certifi, "~> 2.0", [hex: :certifi, repo: "hexpm", optional: true]}], "hexpm", "5c99a18571a756d4af7a4d89ca75c28ac899e6103af6f223982f09ce44942cc9"},
"elixir_xml_to_map": {:hex, :elixir_xml_to_map, "3.1.0", "4d6260486a8cce59e4bf3575fe2dd2a24766546ceeef9f93fcec6f7c62a2827a", [:mix], [{:erlsom, "~> 1.4", [hex: :erlsom, repo: "hexpm", optional: false]}], "hexpm", "8fe5f2e75f90bab07ee2161120c2dc038ebcae8135554f5582990f1c8c21f911"},
"erlex": {:hex, :erlex, "0.2.6", "c7987d15e899c7a2f34f5420d2a2ea0d659682c06ac607572df55a43753aa12e", [:mix], [], "hexpm", "2ed2e25711feb44d52b17d2780eabf998452f6efda104877a3881c2f8c0c0c75"},
"erlsom": {:hex, :erlsom, "1.5.1", "c8fe2babd33ff0846403f6522328b8ab676f896b793634cfe7ef181c05316c03", [:rebar3], [], "hexpm", "7965485494c5844dd127656ac40f141aadfa174839ec1be1074e7edf5b4239eb"},
"eternal": {:hex, :eternal, "1.2.2", "d1641c86368de99375b98d183042dd6c2b234262b8d08dfd72b9eeaafc2a1abd", [:mix], [], "hexpm", "2c9fe32b9c3726703ba5e1d43a1d255a4f3f2d8f8f9bc19f094c7cb1a7a9e782"},
"ex_aws": {:hex, :ex_aws, "2.5.0", "1785e69350b16514c1049330537c7da10039b1a53e1d253bbd703b135174aec3", [:mix], [{:configparser_ex, "~> 4.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:jsx, "~> 2.8 or ~> 3.0", [hex: :jsx, repo: "hexpm", optional: true]}, {:mime, "~> 1.2 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:sweet_xml, "~> 0.7", [hex: :sweet_xml, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "971b86e5495fc0ae1c318e35e23f389e74cf322f2c02d34037c6fc6d405006f1"},
"ex_aws_s3": {:hex, :ex_aws_s3, "2.5.2", "cee302b8e9ee198cc0d89f1de2a7d6a8921e1a556574476cf5590d2156590fe3", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm", "cc5bd945a22a99eece4721d734ae2452d3717e81c357a781c8574663254df4a1"},
"ex_aws": {:hex, :ex_aws, "2.5.3", "9c2d05ba0c057395b12c7b5ca6267d14cdaec1d8e65bdf6481fe1fd245accfb4", [:mix], [{:configparser_ex, "~> 4.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:jsx, "~> 2.8 or ~> 3.0", [hex: :jsx, repo: "hexpm", optional: true]}, {:mime, "~> 1.2 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:sweet_xml, "~> 0.7", [hex: :sweet_xml, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "67115f1d399d7ec4d191812ee565c6106cb4b1bbf19a9d4db06f265fd87da97e"},
"ex_aws_s3": {:hex, :ex_aws_s3, "2.5.3", "422468e5c3e1a4da5298e66c3468b465cfd354b842e512cb1f6fbbe4e2f5bdaf", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm", "4f09dd372cc386550e484808c5ac5027766c8d0cd8271ccc578b82ee6ef4f3b8"},
"ex_const": {:hex, :ex_const, "0.2.4", "d06e540c9d834865b012a17407761455efa71d0ce91e5831e86881b9c9d82448", [:mix], [], "hexpm", "96fd346610cc992b8f896ed26a98be82ac4efb065a0578f334a32d60a3ba9767"},
"ex_doc": {:hex, :ex_doc, "0.31.0", "06eb1dfd787445d9cab9a45088405593dd3bb7fe99e097eaa71f37ba80c7a676", [:mix], [{:earmark_parser, "~> 1.4.39", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_c, ">= 0.1.1", [hex: :makeup_c, repo: "hexpm", optional: true]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1", [hex: :makeup_erlang, repo: "hexpm", optional: false]}], "hexpm", "5350cafa6b7f77bdd107aa2199fe277acf29d739aba5aee7e865fc680c62a110"},
"ex_doc": {:hex, :ex_doc, "0.32.0", "896afb57b1e00030f6ec8b2e19d3ca99a197afb23858d49d94aea673dc222f12", [:mix], [{:earmark_parser, "~> 1.4.39", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_c, ">= 0.1.1", [hex: :makeup_c, repo: "hexpm", optional: true]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1", [hex: :makeup_erlang, repo: "hexpm", optional: false]}], "hexpm", "ed2c3e42c558f49bda3ff37e05713432006e1719a6c4a3320c7e4735787374e7"},
"ex_machina": {:hex, :ex_machina, "2.7.0", "b792cc3127fd0680fecdb6299235b4727a4944a09ff0fa904cc639272cd92dc7", [:mix], [{:ecto, "~> 2.2 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_sql, "~> 3.0", [hex: :ecto_sql, repo: "hexpm", optional: true]}], "hexpm", "419aa7a39bde11894c87a615c4ecaa52d8f107bbdd81d810465186f783245bf8"},
"ex_syslogger": {:hex, :ex_syslogger, "2.0.0", "de6de5c5472a9c4fdafb28fa6610e381ae79ebc17da6490b81d785d68bd124c9", [:mix], [{:jason, "~> 1.2", [hex: :jason, repo: "hexpm", optional: true]}, {:syslog, "~> 1.1.0", [hex: :syslog, repo: "hexpm", optional: false]}], "hexpm", "a52b2fe71764e9e6ecd149ab66635812f68e39279cbeee27c52c0e35e8b8019e"},
"excoveralls": {:hex, :excoveralls, "0.16.1", "0bd42ed05c7d2f4d180331a20113ec537be509da31fed5c8f7047ce59ee5a7c5", [:mix], [{:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "dae763468e2008cf7075a64cb1249c97cb4bc71e236c5c2b5e5cdf1cfa2bf138"},
"expo": {:hex, :expo, "0.4.1", "1c61d18a5df197dfda38861673d392e642649a9cef7694d2f97a587b2cfb319b", [:mix], [], "hexpm", "2ff7ba7a798c8c543c12550fa0e2cbc81b95d4974c65855d8d15ba7b37a1ce47"},
"fast_html": {:hex, :fast_html, "2.2.0", "6c5ef1be087a4ed613b0379c13f815c4d11742b36b67bb52cee7859847c84520", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "064c4f23b4a6168f9187dac8984b056f2c531bb0787f559fd6a8b34b38aefbae"},
"fast_html": {:hex, :fast_html, "2.3.0", "08c1d8ead840dd3060ba02c761bed9f37f456a1ddfe30bcdcfee8f651cec06a6", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "f18e3c7668f82d3ae0b15f48d48feeb257e28aa5ab1b0dbf781c7312e5da029d"},
"fast_sanitize": {:hex, :fast_sanitize, "0.2.3", "67b93dfb34e302bef49fec3aaab74951e0f0602fd9fa99085987af05bd91c7a5", [:mix], [{:fast_html, "~> 2.0", [hex: :fast_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "e8ad286d10d0386e15d67d0ee125245ebcfbc7d7290b08712ba9013c8c5e56e2"},
"file_ex": {:git, "https://akkoma.dev/AkkomaGang/file_ex.git", "cc7067c7d446c2526e9ecf91d40896b088851569", [ref: "cc7067c7d446c2526e9ecf91d40896b088851569"]},
"file_system": {:hex, :file_system, "0.2.10", "fb082005a9cd1711c05b5248710f8826b02d7d1784e7c3451f9c1231d4fc162d", [:mix], [], "hexpm", "41195edbfb562a593726eda3b3e8b103a309b733ad25f3d642ba49696bf715dc"},
"file_system": {:hex, :file_system, "1.0.0", "b689cc7dcee665f774de94b5a832e578bd7963c8e637ef940cd44327db7de2cd", [:mix], [], "hexpm", "6752092d66aec5a10e662aefeed8ddb9531d79db0bc145bb8c40325ca1d8536d"},
"finch": {:hex, :finch, "0.16.0", "40733f02c89f94a112518071c0a91fe86069560f5dbdb39f9150042f44dcfb1a", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6 or ~> 1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "f660174c4d519e5fec629016054d60edd822cdfe2b7270836739ac2f97735ec5"},
"flake_id": {:hex, :flake_id, "0.1.0", "7716b086d2e405d09b647121a166498a0d93d1a623bead243e1f74216079ccb3", [:mix], [{:base62, "~> 1.2", [hex: :base62, repo: "hexpm", optional: false]}, {:ecto, ">= 2.0.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm", "31fc8090fde1acd267c07c36ea7365b8604055f897d3a53dd967658c691bd827"},
"floki": {:hex, :floki, "0.35.2", "87f8c75ed8654b9635b311774308b2760b47e9a579dabf2e4d5f1e1d42c39e0b", [:mix], [], "hexpm", "6b05289a8e9eac475f644f09c2e4ba7e19201fd002b89c28c1293e7bd16773d9"},
"floki": {:hex, :floki, "0.36.1", "712b7f2ba19a4d5a47dfe3e74d81876c95bbcbee44fe551f0af3d2a388abb3da", [:mix], [], "hexpm", "21ba57abb8204bcc70c439b423fc0dd9f0286de67dc82773a14b0200ada0995f"},
"gen_smtp": {:hex, :gen_smtp, "1.2.0", "9cfc75c72a8821588b9b9fe947ae5ab2aed95a052b81237e0928633a13276fd3", [:rebar3], [{:ranch, ">= 1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "5ee0375680bca8f20c4d85f58c2894441443a743355430ff33a783fe03296779"},
"gettext": {:hex, :gettext, "0.22.3", "c8273e78db4a0bb6fba7e9f0fd881112f349a3117f7f7c598fa18c66c888e524", [:mix], [{:expo, "~> 0.4.0", [hex: :expo, repo: "hexpm", optional: false]}], "hexpm", "935f23447713954a6866f1bb28c3a878c4c011e802bcd68a726f5e558e4b64bd"},
"hackney": {:hex, :hackney, "1.20.1", "8d97aec62ddddd757d128bfd1df6c5861093419f8f7a4223823537bad5d064e2", [:rebar3], [{:certifi, "~> 2.12.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~> 6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~> 1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~> 1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "fe9094e5f1a2a2c0a7d10918fee36bfec0ec2a979994cff8cfe8058cd9af38e3"},
@ -60,17 +60,17 @@
"http_signatures": {:git, "https://akkoma.dev/AkkomaGang/http_signatures.git", "6640ce7d24c783ac2ef56e27d00d12e8dc85f396", [ref: "6640ce7d24c783ac2ef56e27d00d12e8dc85f396"]},
"httpoison": {:hex, :httpoison, "1.8.2", "9eb9c63ae289296a544842ef816a85d881d4a31f518a0fec089aaa744beae290", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "2bb350d26972e30c96e2ca74a1aaf8293d61d0742ff17f01e0279fef11599921"},
"idna": {:hex, :idna, "6.1.1", "8a63070e9f7d0c62eb9d9fcb360a7de382448200fbbd1b106cc96d3d8099df8d", [:rebar3], [{:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "92376eb7894412ed19ac475e4a86f7b413c1b9fbb5bd16dccd57934157944cea"},
"inet_cidr": {:hex, :inet_cidr, "1.0.4", "a05744ab7c221ca8e395c926c3919a821eb512e8f36547c062f62c4ca0cf3d6e", [:mix], [], "hexpm", "64a2d30189704ae41ca7dbdd587f5291db5d1dda1414e0774c29ffc81088c1bc"},
"inet_cidr": {:hex, :inet_cidr, "1.0.8", "d26bb7bdbdf21ae401ead2092bf2bb4bf57fe44a62f5eaa5025280720ace8a40", [:mix], [], "hexpm", "d5b26da66603bb56c933c65214c72152f0de9a6ea53618b56d63302a68f6a90e"},
"jason": {:hex, :jason, "1.4.1", "af1504e35f629ddcdd6addb3513c3853991f694921b1b9368b0bd32beb9f1b63", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "fbb01ecdfd565b56261302f7e1fcc27c4fb8f32d56eab74db621fc154604a7a1"},
"joken": {:hex, :joken, "2.6.0", "b9dd9b6d52e3e6fcb6c65e151ad38bf4bc286382b5b6f97079c47ade6b1bcc6a", [:mix], [{:jose, "~> 1.11.5", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "5a95b05a71cd0b54abd35378aeb1d487a23a52c324fa7efdffc512b655b5aaa7"},
"jose": {:hex, :jose, "1.11.6", "613fda82552128aa6fb804682e3a616f4bc15565a048dabd05b1ebd5827ed965", [:mix, :rebar3], [], "hexpm", "6275cb75504f9c1e60eeacb771adfeee4905a9e182103aa59b53fed651ff9738"},
"joken": {:hex, :joken, "2.6.1", "2ca3d8d7f83bf7196296a3d9b2ecda421a404634bfc618159981a960020480a1", [:mix], [{:jose, "~> 1.11.9", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "ab26122c400b3d254ce7d86ed066d6afad27e70416df947cdcb01e13a7382e68"},
"jose": {:hex, :jose, "1.11.9", "c861eb99d9e9f62acd071dc5a49ffbeab9014e44490cd85ea3e49e3d36184777", [:mix, :rebar3], [], "hexpm", "b5ccc3749d2e1638c26bed806259df5bc9e438797fe60dc71e9fa0716133899b"},
"jumper": {:hex, :jumper, "1.0.2", "68cdcd84472a00ac596b4e6459a41b3062d4427cbd4f1e8c8793c5b54f1406a7", [:mix], [], "hexpm", "9b7782409021e01ab3c08270e26f36eb62976a38c1aa64b2eaf6348422f165e1"},
"linkify": {:git, "https://akkoma.dev/AkkomaGang/linkify.git", "2567e2c1073fa371fd26fd66dfa5bc77b6919c16", []},
"mail": {:hex, :mail, "0.3.1", "cb0a14e4ed8904e4e5a08214e686ccf6f9099346885db17d8c309381f865cc5c", [:mix], [], "hexpm", "1db701e89865c1d5fa296b2b57b1cd587587cca8d8a1a22892b35ef5a8e352a6"},
"majic": {:git, "https://akkoma.dev/AkkomaGang/majic.git", "80540b36939ec83f48e76c61e5000e0fd67706f0", [ref: "80540b36939ec83f48e76c61e5000e0fd67706f0"]},
"makeup": {:hex, :makeup, "1.1.1", "fa0bc768698053b2b3869fa8a62616501ff9d11a562f3ce39580d60860c3a55e", [:mix], [{:nimble_parsec, "~> 1.2.2 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "5dc62fbdd0de44de194898b6710692490be74baa02d9d108bc29f007783b0b48"},
"makeup_elixir": {:hex, :makeup_elixir, "0.16.1", "cc9e3ca312f1cfeccc572b37a09980287e243648108384b97ff2b76e505c3555", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.2.3 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "e127a341ad1b209bd80f7bd1620a15693a9908ed780c3b763bccf7d200c767c6"},
"makeup_erlang": {:hex, :makeup_erlang, "0.1.3", "d684f4bac8690e70b06eb52dad65d26de2eefa44cd19d64a8095e1417df7c8fd", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "b78dc853d2e670ff6390b605d807263bf606da3c82be37f9d7f68635bd886fc9"},
"makeup_elixir": {:hex, :makeup_elixir, "0.16.2", "627e84b8e8bf22e60a2579dad15067c755531fea049ae26ef1020cad58fe9578", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.2.3 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "41193978704763f6bbe6cc2758b84909e62984c7752b3784bd3c218bb341706b"},
"makeup_erlang": {:hex, :makeup_erlang, "0.1.5", "e0ff5a7c708dda34311f7522a8758e23bfcd7d8d8068dc312b5eb41c6fd76eba", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "94d2e986428585a21516d7d7149781480013c56e30c6a233534bedf38867a59a"},
"meck": {:hex, :meck, "0.9.2", "85ccbab053f1db86c7ca240e9fc718170ee5bda03810a6292b5306bf31bae5f5", [:rebar3], [], "hexpm", "81344f561357dc40a8344afa53767c32669153355b626ea9fcbc8da6b3045826"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mfm_parser": {:git, "https://akkoma.dev/AkkomaGang/mfm-parser.git", "b21ab7754024af096f2d14247574f55f0063295b", [ref: "b21ab7754024af096f2d14247574f55f0063295b"]},
@ -82,54 +82,54 @@
"mox": {:hex, :mox, "1.1.0", "0f5e399649ce9ab7602f72e718305c0f9cdc351190f72844599545e4996af73c", [:mix], [], "hexpm", "d44474c50be02d5b72131070281a5d3895c0e7a95c780e90bc0cfe712f633a13"},
"nimble_options": {:hex, :nimble_options, "1.1.0", "3b31a57ede9cb1502071fade751ab0c7b8dbe75a9a4c2b5bbb0943a690b63172", [:mix], [], "hexpm", "8bbbb3941af3ca9acc7835f5655ea062111c9c27bcac53e004460dfd19008a99"},
"nimble_parsec": {:hex, :nimble_parsec, "1.4.0", "51f9b613ea62cfa97b25ccc2c1b4216e81df970acd8e16e8d1bdc58fef21370d", [:mix], [], "hexpm", "9c565862810fb383e9838c1dd2d7d2c437b3d13b267414ba6af33e50d2d1cf28"},
"nimble_pool": {:hex, :nimble_pool, "1.0.0", "5eb82705d138f4dd4423f69ceb19ac667b3b492ae570c9f5c900bb3d2f50a847", [:mix], [], "hexpm", "80be3b882d2d351882256087078e1b1952a28bf98d0a287be87e4a24a710b67a"},
"oban": {:hex, :oban, "2.15.4", "d49ab4ffb7153010e32f80fe9e56f592706238149ec579eb50f8a4e41d218856", [:mix], [{:ecto_sql, "~> 3.6", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "5fce611fdfffb13e9148df883116e5201adf1e731eb302cc88cde0588510079c"},
"open_api_spex": {:hex, :open_api_spex, "3.18.0", "f9952b6bc8a1bf14168f3754981b7c8d72d015112bfedf2588471dd602e1e715", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:plug, "~> 1.7", [hex: :plug, repo: "hexpm", optional: false]}, {:poison, "~> 3.0 or ~> 4.0 or ~> 5.0", [hex: :poison, repo: "hexpm", optional: true]}, {:ymlr, "~> 2.0 or ~> 3.0 or ~> 4.0", [hex: :ymlr, repo: "hexpm", optional: true]}], "hexpm", "37849887ab67efab052376401fac28c0974b273ffaecd98f4532455ca0886464"},
"nimble_pool": {:hex, :nimble_pool, "1.1.0", "bf9c29fbdcba3564a8b800d1eeb5a3c58f36e1e11d7b7fb2e084a643f645f06b", [:mix], [], "hexpm", "af2e4e6b34197db81f7aad230c1118eac993acc0dae6bc83bac0126d4ae0813a"},
"oban": {:hex, :oban, "2.17.8", "7fd7c8e82c7819afc1b5b5ed8d6d92bf0ecdd7ba170328fb043301eb06d32521", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a2165bf93843b7bcb68182c82725ddd4cb43c0c3719f114e7aa3b6c99c4b6129"},
"open_api_spex": {:hex, :open_api_spex, "3.18.3", "fefb84fe323cacfc92afdd0ecb9e89bc0261ae00b7e3167ffc2028ce3944de42", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:plug, "~> 1.7", [hex: :plug, repo: "hexpm", optional: false]}, {:poison, "~> 3.0 or ~> 4.0 or ~> 5.0", [hex: :poison, repo: "hexpm", optional: true]}, {:ymlr, "~> 2.0 or ~> 3.0 or ~> 4.0 or ~> 5.0", [hex: :ymlr, repo: "hexpm", optional: true]}], "hexpm", "c0cfc31570199ce7e7520b494a591027da609af45f6bf9adce51e2469b1609fb"},
"parse_trans": {:hex, :parse_trans, "3.4.1", "6e6aa8167cb44cc8f39441d05193be6e6f4e7c2946cb2759f015f8c56b76e5ff", [:rebar3], [], "hexpm", "620a406ce75dada827b82e453c19cf06776be266f5a67cff34e1ef2cbb60e49a"},
"phoenix": {:hex, :phoenix, "1.7.10", "02189140a61b2ce85bb633a9b6fd02dff705a5f1596869547aeb2b2b95edd729", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "cf784932e010fd736d656d7fead6a584a4498efefe5b8227e9f383bf15bb79d0"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.4.3", "86e9878f833829c3f66da03d75254c155d91d72a201eb56ae83482328dc7ca93", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "d36c401206f3011fefd63d04e8ef626ec8791975d9d107f9a0817d426f61ac07"},
"phoenix": {:hex, :phoenix, "1.7.12", "1cc589e0eab99f593a8aa38ec45f15d25297dd6187ee801c8de8947090b5a9d3", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.7", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "d646192fbade9f485b01bc9920c139bfdd19d0f8df3d73fd8eaf2dfbe0d2837c"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.5.1", "6fdbc334ea53620e71655664df6f33f670747b3a7a6c4041cdda3e2c32df6257", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.1", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "ebe43aa580db129e54408e719fb9659b7f9e0d52b965c5be26cdca416ecead28"},
"phoenix_html": {:hex, :phoenix_html, "3.3.3", "380b8fb45912b5638d2f1d925a3771b4516b9a78587249cabe394e0a5d579dc9", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "923ebe6fec6e2e3b3e569dfbdc6560de932cd54b000ada0208b5f45024bdd76c"},
"phoenix_live_dashboard": {:hex, :phoenix_live_dashboard, "0.7.2", "97cc4ff2dba1ebe504db72cb45098cb8e91f11160528b980bd282cc45c73b29c", [:mix], [{:ecto, "~> 3.6.2 or ~> 3.7", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_mysql_extras, "~> 0.5", [hex: :ecto_mysql_extras, repo: "hexpm", optional: true]}, {:ecto_psql_extras, "~> 0.7", [hex: :ecto_psql_extras, repo: "hexpm", optional: true]}, {:mime, "~> 1.6 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 0.18.3", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "0e5fdf063c7a3b620c566a30fcf68b7ee02e5e46fe48ee46a6ec3ba382dc05b7"},
"phoenix_live_view": {:hex, :phoenix_live_view, "0.18.18", "1f38fbd7c363723f19aad1a04b5490ff3a178e37daaf6999594d5f34796c47fc", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6.15 or ~> 1.7.0", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a5810d0472f3189ede6d2a95bda7f31c6113156b91784a3426cb0ab6a6d85214"},
"phoenix_pubsub": {:hex, :phoenix_pubsub, "2.1.3", "3168d78ba41835aecad272d5e8cd51aa87a7ac9eb836eabc42f6e57538e3731d", [:mix], [], "hexpm", "bba06bc1dcfd8cb086759f0edc94a8ba2bc8896d5331a1e2c2902bf8e36ee502"},
"phoenix_swoosh": {:hex, :phoenix_swoosh, "1.2.0", "a544d83fde4a767efb78f45404a74c9e37b2a9c5ea3339692e65a6966731f935", [:mix], [{:finch, "~> 0.8", [hex: :finch, repo: "hexpm", optional: true]}, {:hackney, "~> 1.10", [hex: :hackney, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 3.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_view, "~> 1.0 or ~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: false]}, {:swoosh, "~> 1.5", [hex: :swoosh, repo: "hexpm", optional: false]}], "hexpm", "e88d117251e89a16b92222415a6d87b99a96747ddf674fc5c7631de734811dba"},
"phoenix_template": {:hex, :phoenix_template, "1.0.3", "32de561eefcefa951aead30a1f94f1b5f0379bc9e340bb5c667f65f1edfa4326", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "16f4b6588a4152f3cc057b9d0c0ba7e82ee23afa65543da535313ad8d25d8e2c"},
"phoenix_swoosh": {:hex, :phoenix_swoosh, "1.2.1", "b74ccaa8046fbc388a62134360ee7d9742d5a8ae74063f34eb050279de7a99e1", [:mix], [{:finch, "~> 0.8", [hex: :finch, repo: "hexpm", optional: true]}, {:hackney, "~> 1.10", [hex: :hackney, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_view, "~> 1.0 or ~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: false]}, {:swoosh, "~> 1.5", [hex: :swoosh, repo: "hexpm", optional: false]}], "hexpm", "4000eeba3f9d7d1a6bf56d2bd56733d5cadf41a7f0d8ffe5bb67e7d667e204a2"},
"phoenix_template": {:hex, :phoenix_template, "1.0.4", "e2092c132f3b5e5b2d49c96695342eb36d0ed514c5b252a77048d5969330d639", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "2c0c81f0e5c6753faf5cca2f229c9709919aba34fab866d3bc05060c9c444206"},
"phoenix_view": {:hex, :phoenix_view, "2.0.3", "4d32c4817fce933693741deeb99ef1392619f942633dde834a5163124813aad3", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}], "hexpm", "cd34049af41be2c627df99cd4eaa71fc52a328c0c3d8e7d4aa28f880c30e7f64"},
"plug": {:hex, :plug, "1.15.2", "94cf1fa375526f30ff8770837cb804798e0045fd97185f0bb9e5fcd858c792a3", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "02731fa0c2dcb03d8d21a1d941bdbbe99c2946c0db098eee31008e04c6283615"},
"plug_cowboy": {:hex, :plug_cowboy, "2.6.1", "9a3bbfceeb65eff5f39dab529e5cd79137ac36e913c02067dba3963a26efe9b2", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "de36e1a21f451a18b790f37765db198075c25875c64834bcc82d90b309eb6613"},
"plug": {:hex, :plug, "1.15.3", "712976f504418f6dff0a3e554c40d705a9bcf89a7ccef92fc6a5ef8f16a30a97", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "cc4365a3c010a56af402e0809208873d113e9c38c401cabd88027ef4f5c01fd2"},
"plug_cowboy": {:hex, :plug_cowboy, "2.7.1", "87677ffe3b765bc96a89be7960f81703223fe2e21efa42c125fcd0127dd9d6b2", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "02dbd5f9ab571b864ae39418db7811618506256f6d13b4a45037e5fe78dc5de3"},
"plug_crypto": {:hex, :plug_crypto, "2.0.0", "77515cc10af06645abbfb5e6ad7a3e9714f805ae118fa1a70205f80d2d70fe73", [:mix], [], "hexpm", "53695bae57cc4e54566d993eb01074e4d894b65a3766f1c43e2c61a1b0f45ea9"},
"plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "79fd4fcf34d110605c26560cbae8f23c603ec4158c08298bd4360fdea90bb5cf"},
"poison": {:hex, :poison, "5.0.0", "d2b54589ab4157bbb82ec2050757779bfed724463a544b6e20d79855a9e43b24", [:mix], [{:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "11dc6117c501b80c62a7594f941d043982a1bd05a1184280c0d9166eb4d8d3fc"},
"poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm", "dad79704ce5440f3d5a3681c8590b9dc25d1a561e8f5a9c995281012860901e3"},
"postgrex": {:hex, :postgrex, "0.17.4", "5777781f80f53b7c431a001c8dad83ee167bcebcf3a793e3906efff680ab62b3", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "6458f7d5b70652bc81c3ea759f91736c16a31be000f306d3c64bcdfe9a18b3cc"},
"postgrex": {:hex, :postgrex, "0.17.5", "0483d054938a8dc069b21bdd636bf56c487404c241ce6c319c1f43588246b281", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "50b8b11afbb2c4095a3ba675b4f055c416d0f3d7de6633a595fc131a828a67eb"},
"pot": {:hex, :pot, "1.0.2", "13abb849139fdc04ab8154986abbcb63bdee5de6ed2ba7e1713527e33df923dd", [:rebar3], [], "hexpm", "78fe127f5a4f5f919d6ea5a2a671827bd53eb9d37e5b4128c0ad3df99856c2e0"},
"ranch": {:hex, :ranch, "1.8.0", "8c7a100a139fd57f17327b6413e4167ac559fbc04ca7448e9be9057311597a1d", [:make, :rebar3], [], "hexpm", "49fbcfd3682fab1f5d109351b61257676da1a2fdbe295904176d5e521a2ddfe5"},
"recon": {:hex, :recon, "2.5.4", "05dd52a119ee4059fa9daa1ab7ce81bc7a8161a2f12e9d42e9d551ffd2ba901c", [:mix, :rebar3], [], "hexpm", "e9ab01ac7fc8572e41eb59385efeb3fb0ff5bf02103816535bacaedf327d0263"},
"recon": {:hex, :recon, "2.5.5", "c108a4c406fa301a529151a3bb53158cadc4064ec0c5f99b03ddb8c0e4281bdf", [:mix, :rebar3], [], "hexpm", "632a6f447df7ccc1a4a10bdcfce71514412b16660fe59deca0fcf0aa3c054404"},
"remote_ip": {:hex, :remote_ip, "1.1.0", "cb308841595d15df3f9073b7c39243a1dd6ca56e5020295cb012c76fbec50f2d", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "616ffdf66aaad6a72fc546dabf42eed87e2a99e97b09cbd92b10cc180d02ed74"},
"search_parser": {:git, "https://github.com/FloatingGhost/pleroma-contrib-search-parser.git", "08971a81e68686f9ac465cfb6661d51c5e4e1e7f", [ref: "08971a81e68686f9ac465cfb6661d51c5e4e1e7f"]},
"sleeplocks": {:hex, :sleeplocks, "1.1.2", "d45aa1c5513da48c888715e3381211c859af34bee9b8290490e10c90bb6ff0ca", [:rebar3], [], "hexpm", "9fe5d048c5b781d6305c1a3a0f40bb3dfc06f49bf40571f3d2d0c57eaa7f59a5"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.7", "354c321cf377240c7b8716899e182ce4890c5938111a1296add3ec74cf1715df", [:make, :mix, :rebar3], [], "hexpm", "fe4c190e8f37401d30167c8c405eda19469f34577987c76dde613e838bbc67f8"},
"statistex": {:hex, :statistex, "1.0.0", "f3dc93f3c0c6c92e5f291704cf62b99b553253d7969e9a5fa713e5481cd858a5", [:mix], [], "hexpm", "ff9d8bee7035028ab4742ff52fc80a2aa35cece833cf5319009b52f1b5a86c27"},
"sweet_xml": {:hex, :sweet_xml, "0.7.4", "a8b7e1ce7ecd775c7e8a65d501bc2cd933bff3a9c41ab763f5105688ef485d08", [:mix], [], "hexpm", "e7c4b0bdbf460c928234951def54fe87edf1a170f6896675443279e2dbeba167"},
"swoosh": {:hex, :swoosh, "1.14.2", "cf686f92ad3b21e6651b20c50eeb1781f581dc7097ef6251b4d322a9f1d19339", [:mix], [{:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.4 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "01d8fae72930a0b5c1bb9725df0408602ed8c5c3d59dc6e7a39c57b723cd1065"},
"swoosh": {:hex, :swoosh, "1.14.4", "94e9dba91f7695a10f49b0172c4a4cb658ef24abef7e8140394521b7f3bbb2d4", [:mix], [{:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.4 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "081c5a590e4ba85cc89baddf7b2beecf6c13f7f84a958f1cd969290815f0f026"},
"syslog": {:hex, :syslog, "1.1.0", "6419a232bea84f07b56dc575225007ffe34d9fdc91abe6f1b2f254fd71d8efc2", [:rebar3], [], "hexpm", "4c6a41373c7e20587be33ef841d3de6f3beba08519809329ecc4d27b15b659e1"},
"table_rex": {:hex, :table_rex, "3.1.1", "0c67164d1714b5e806d5067c1e96ff098ba7ae79413cc075973e17c38a587caa", [:mix], [], "hexpm", "678a23aba4d670419c23c17790f9dcd635a4a89022040df7d5d772cb21012490"},
"table_rex": {:hex, :table_rex, "4.0.0", "3c613a68ebdc6d4d1e731bc973c233500974ec3993c99fcdabb210407b90959b", [:mix], [], "hexpm", "c35c4d5612ca49ebb0344ea10387da4d2afe278387d4019e4d8111e815df8f55"},
"telemetry": {:hex, :telemetry, "1.2.1", "68fdfe8d8f05a8428483a97d7aab2f268aaff24b49e0f599faa091f1d4e7f61c", [:rebar3], [], "hexpm", "dad9ce9d8effc621708f99eac538ef1cbe05d6a874dd741de2e689c47feafed5"},
"telemetry_metrics": {:hex, :telemetry_metrics, "0.6.1", "315d9163a1d4660aedc3fee73f33f1d355dcc76c5c3ab3d59e76e3edf80eef1f", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7be9e0871c41732c233be71e4be11b96e56177bf15dde64a8ac9ce72ac9834c6"},
"telemetry_metrics": {:hex, :telemetry_metrics, "0.6.2", "2caabe9344ec17eafe5403304771c3539f3b6e2f7fb6a6f602558c825d0d0bfb", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9b43db0dc33863930b9ef9d27137e78974756f5f198cae18409970ed6fa5b561"},
"telemetry_metrics_prometheus": {:hex, :telemetry_metrics_prometheus, "1.1.0", "1cc23e932c1ef9aa3b91db257ead31ea58d53229d407e059b29bb962c1505a13", [:mix], [{:plug_cowboy, "~> 2.1", [hex: :plug_cowboy, repo: "hexpm", optional: false]}, {:telemetry_metrics_prometheus_core, "~> 1.0", [hex: :telemetry_metrics_prometheus_core, repo: "hexpm", optional: false]}], "hexpm", "d43b3659b3244da44fe0275b717701542365d4519b79d9ce895b9719c1ce4d26"},
"telemetry_metrics_prometheus_core": {:hex, :telemetry_metrics_prometheus_core, "1.1.0", "4e15f6d7dbedb3a4e3aed2262b7e1407f166fcb9c30ca3f96635dfbbef99965c", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "0dd10e7fe8070095df063798f82709b0a1224c31b8baf6278b423898d591a069"},
"telemetry_poller": {:hex, :telemetry_poller, "1.0.0", "db91bb424e07f2bb6e73926fcafbfcbcb295f0193e0a00e825e589a0a47e8453", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "b3a24eafd66c3f42da30fc3ca7dda1e9d546c12250a2d60d7b81d264fbec4f6e"},
"telemetry_poller": {:hex, :telemetry_poller, "1.1.0", "58fa7c216257291caaf8d05678c8d01bd45f4bdbc1286838a28c4bb62ef32999", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9eb9d9cbfd81cbd7cdd24682f8711b6e2b691289a0de6826e58452f28c103c8f"},
"temple": {:git, "https://akkoma.dev/AkkomaGang/temple.git", "066a699ade472d8fa42a9d730b29a61af9bc8b59", [ref: "066a699ade472d8fa42a9d730b29a61af9bc8b59"]},
"tesla": {:hex, :tesla, "1.8.0", "d511a4f5c5e42538d97eef7c40ec4f3e44effdc5068206f42ed859e09e51d1fd", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "10501f360cd926a309501287470372af1a6e1cbed0f43949203a4c13300bc79f"},
"tesla": {:hex, :tesla, "1.9.0", "8c22db6a826e56a087eeb8cdef56889731287f53feeb3f361dec5d4c8efb6f14", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "7c240c67e855f7e63e795bf16d6b3f5115a81d1f44b7fe4eadbf656bae0fef8a"},
"timex": {:hex, :timex, "3.7.11", "bb95cb4eb1d06e27346325de506bcc6c30f9c6dea40d1ebe390b262fad1862d1", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.20", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 1.1", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "8b9024f7efbabaf9bd7aa04f65cf8dcd7c9818ca5737677c7b76acbc6a94d1aa"},
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "bd4fde4c15f3e993a999e019d64347489b91b7a9096af68b2bdadd192afa693f"},
"tzdata": {:hex, :tzdata, "1.1.1", "20c8043476dfda8504952d00adac41c6eda23912278add38edc140ae0c5bcc46", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "a69cec8352eafcd2e198dea28a34113b60fdc6cb57eb5ad65c10292a6ba89787"},
"ueberauth": {:hex, :ueberauth, "0.10.5", "806adb703df87e55b5615cf365e809f84c20c68aa8c08ff8a416a5a6644c4b02", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "3efd1f31d490a125c7ed453b926f7c31d78b97b8a854c755f5c40064bf3ac9e1"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.7.0", "bc84380c9ab48177092f43ac89e4dfa2c6d62b40b8bd132b1059ecc7232f9a78", [:rebar3], [], "hexpm", "25eee6d67df61960cf6a794239566599b09e17e668d3700247bc498638152521"},
"unsafe": {:hex, :unsafe, "1.0.2", "23c6be12f6c1605364801f4b47007c0c159497d0446ad378b5cf05f1855c0581", [:mix], [], "hexpm", "b485231683c3ab01a9cd44cb4a79f152c6f3bb87358439c6f68791b85c2df675"},
"vex": {:hex, :vex, "0.9.1", "cb65348ebd1c4002861b65bef36e524c29d9a879c90119b2d0e674e323124277", [:mix], [], "hexpm", "a0f9f3959d127ad6a6a617c3f607ecfb1bc6f3c59f9c3614a901a46d1765bafe"},
"vex": {:hex, :vex, "0.9.2", "fe061acc9e0907d983d46b51bf35d58176f0fe6eb7ba3b33c9336401bf42b6d1", [:mix], [], "hexpm", "76e709a9762e98c6b462dfce92e9b5dfbf712839227f2da8add6dd11549b12cb"},
"web_push_encryption": {:hex, :web_push_encryption, "0.3.1", "76d0e7375142dfee67391e7690e89f92578889cbcf2879377900b5620ee4708d", [:mix], [{:httpoison, "~> 1.0", [hex: :httpoison, repo: "hexpm", optional: false]}, {:jose, "~> 1.11.1", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "4f82b2e57622fb9337559058e8797cb0df7e7c9790793bdc4e40bc895f70e2a2"},
"websock": {:hex, :websock, "0.5.3", "2f69a6ebe810328555b6fe5c831a851f485e303a7c8ce6c5f675abeb20ebdadc", [:mix], [], "hexpm", "6105453d7fac22c712ad66fab1d45abdf049868f253cf719b625151460b8b453"},
"websock_adapter": {:hex, :websock_adapter, "0.5.5", "9dfeee8269b27e958a65b3e235b7e447769f66b5b5925385f5a569269164a210", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "4b977ba4a01918acbf77045ff88de7f6972c2a009213c515a445c48f224ffce9"},
"websock_adapter": {:hex, :websock_adapter, "0.5.6", "0437fe56e093fd4ac422de33bf8fc89f7bc1416a3f2d732d8b2c8fd54792fe60", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "e04378d26b0af627817ae84c92083b7e97aca3121196679b73c73b99d0d133ea"},
"websockex": {:hex, :websockex, "0.4.3", "92b7905769c79c6480c02daacaca2ddd49de936d912976a4d3c923723b647bf0", [:mix], [], "hexpm", "95f2e7072b85a3a4cc385602d42115b73ce0b74a9121d0d6dbbf557645ac53e4"},
}

View File

@ -0,0 +1,37 @@
defmodule Pleroma.Repo.Migrations.UploadFilterExiftoolToExiftoolStripMetadata do
use Ecto.Migration
alias Pleroma.ConfigDB
def up,
do:
ConfigDB.get_by_params(%{group: :pleroma, key: Pleroma.Upload})
|> update_filtername(
Pleroma.Upload.Filter.Exiftool,
Pleroma.Upload.Filter.Exiftool.StripMetadata
)
def down,
do:
ConfigDB.get_by_params(%{group: :pleroma, key: Pleroma.Upload})
|> update_filtername(
Pleroma.Upload.Filter.Exiftool.StripMetadata,
Pleroma.Upload.Filter.Exiftool
)
defp update_filtername(%{value: value}, from_filtername, to_filtername) do
new_value =
value
|> Keyword.update(:filters, [], fn filters ->
filters
|> Enum.map(fn
^from_filtername -> to_filtername
filter -> filter
end)
end)
ConfigDB.update_or_create(%{group: :pleroma, key: Pleroma.Upload, value: new_value})
end
defp update_filtername(_, _, _), do: nil
end
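The core of `update_filtername/3` is a plain list rewrite. A minimal sketch of the transformation in isolation (the `Dedupe` entry is only an illustrative second filter, not something this migration adds):

```elixir
filters = [Pleroma.Upload.Filter.Exiftool, Pleroma.Upload.Filter.Dedupe]

# up/0 swaps every occurrence of the old module name and leaves other filters alone
Enum.map(filters, fn
  Pleroma.Upload.Filter.Exiftool -> Pleroma.Upload.Filter.Exiftool.StripMetadata
  filter -> filter
end)
# => [Pleroma.Upload.Filter.Exiftool.StripMetadata, Pleroma.Upload.Filter.Dedupe]
```

`down/0` is the same rewrite with the module names swapped, so the migration is fully reversible.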

View File

@ -19,6 +19,7 @@
"toot": "http://joinmastodon.org/ns#",
"misskey": "https://misskey-hub.net/ns#",
"fedibird": "http://fedibird.com/ns#",
"sharkey": "https://joinsharkey.org/ns#",
"value": "schema:value",
"sensitive": "as:sensitive",
"litepub": "http://litepub.social/ns#",
@ -45,6 +46,14 @@
"contentMap": {
"@id": "as:content",
"@container": "@language"
},
"featured": {
"@id": "toot:featured",
"@type": "@id"
},
"backgroundUrl": {
"@id": "sharkey:backgroundUrl",
"@type": "@id"
}
}
]
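Declaring `featured` and `backgroundUrl` with `"@type": "@id"` marks their values as IRIs rather than literal strings. A hedged sketch of what a decoded Sharkey actor could then carry (all values here are illustrative assumptions):

```elixir
# Hypothetical decoded actor document using the terms added above
actor = %{
  "type" => "Person",
  "backgroundUrl" => "https://shark.example/files/banner.webp",
  "featured" => "https://shark.example/users/alice/collections/featured"
}

Map.get(actor, "backgroundUrl")
# => "https://shark.example/files/banner.webp"
```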

View File

@ -117,6 +117,7 @@ nav {
.inner-nav img {
height: 28px;
width: auto;
vertical-align: middle;
padding-right: 5px
}
@ -440,6 +441,7 @@ summary {
.avatar img {
border-radius: 3px;
box-shadow: var(--avatarShadow);
object-fit: cover;
}
.user-summary {
@ -642,13 +644,13 @@ summary {
}
img:not(.u-photo, .fa-icon) {
width: 32px;
width: auto;
height: 32px;
padding: 0;
vertical-align: middle;
}
.username img:not(.u-photo) {
width: 16px;
width: auto;
height: 16px;
}

Binary files not shown (four image files added; sizes: 697 B, 823 B, 785 B, 631 B).

View File

@ -0,0 +1,117 @@
{
"@context": [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
{
"manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
"toot": "http://joinmastodon.org/ns#",
"featured": {
"@id": "toot:featured",
"@type": "@id"
},
"featuredTags": {
"@id": "toot:featuredTags",
"@type": "@id"
},
"alsoKnownAs": {
"@id": "as:alsoKnownAs",
"@type": "@id"
},
"movedTo": {
"@id": "as:movedTo",
"@type": "@id"
},
"schema": "http://schema.org#",
"PropertyValue": "schema:PropertyValue",
"value": "schema:value",
"discoverable": "toot:discoverable",
"Device": "toot:Device",
"Ed25519Signature": "toot:Ed25519Signature",
"Ed25519Key": "toot:Ed25519Key",
"Curve25519Key": "toot:Curve25519Key",
"EncryptedMessage": "toot:EncryptedMessage",
"publicKeyBase64": "toot:publicKeyBase64",
"deviceId": "toot:deviceId",
"claim": {
"@type": "@id",
"@id": "toot:claim"
},
"fingerprintKey": {
"@type": "@id",
"@id": "toot:fingerprintKey"
},
"identityKey": {
"@type": "@id",
"@id": "toot:identityKey"
},
"devices": {
"@type": "@id",
"@id": "toot:devices"
},
"messageFranking": "toot:messageFranking",
"messageType": "toot:messageType",
"cipherText": "toot:cipherText",
"suspended": "toot:suspended",
"memorial": "toot:memorial",
"indexable": "toot:indexable",
"focalPoint": {
"@container": "@list",
"@id": "toot:focalPoint"
}
}
],
"id": "https://mastodont.cat/users/fediverse",
"type": "Service",
"following": "https://mastodont.cat/users/fediverse/following",
"followers": "https://mastodont.cat/users/fediverse/followers",
"inbox": "https://mastodont.cat/users/fediverse/inbox",
"outbox": "https://mastodont.cat/users/fediverse/outbox",
"featured": "https://mastodont.cat/users/fediverse/collections/featured",
"featuredTags": "https://mastodont.cat/users/fediverse/collections/tags",
"preferredUsername": "fediverse",
"name": "fediverse's stats",
"summary": "<p>All fediverse alive servers stats. New refactored code!</p><p>Ask server info:</p><p><span class=\"h-card\" translate=\"no\"><a href=\"https://mastodont.cat/@fediverse\" class=\"u-url mention\">@<span>fediverse</span></a></span> server example.server</p><p>Ask software info:</p><p><span class=\"h-card\" translate=\"no\"><a href=\"https://mastodont.cat/@fediverse\" class=\"u-url mention\">@<span>fediverse</span></a></span> soft mastodon</p>",
"url": "https://mastodont.cat/@fediverse",
"manuallyApprovesFollowers": false,
"discoverable": true,
"indexable": false,
"published": "2020-05-13T00:00:00Z",
"memorial": false,
"devices": "https://mastodont.cat/users/fediverse/collections/devices",
"publicKey": {
"id": "https://mastodont.cat/users/fediverse#main-key",
"owner": "https://mastodont.cat/users/fediverse",
"publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAu2X8LqAR/6j95UUTG02T\nWG+PmNRWnfOl+zjDts3OctyJK7at5AwA+T0be1faHpf+oLREl/dkWXc8VQY2UJzY\n8QTuXXnIkwHAeA7WADB6kPvQhVpfGPgKD0dpAgBz9WHFquMSXcnuyt7q1CDn5wId\nRoUtkCAcg1rOX+lIAoeic5hT0O0sXLJdtaSCTZmGqkF2Cf+/16q8XhRevMRh73vP\nX2PefCr63Iy/Zh5rnVhPluQMyQ6FGxXgd5dEKJRa2kxrhIsrm0TzMX892Ev45AwI\ndppYQOQ+nLOgMYrpFNYdOmizJsn635l18K1r/tyDDAegPp6Kfa8v+BaZdOmNTFKr\n/wIDAQAB\n-----END PUBLIC KEY-----\n"
},
"tag": [],
"attachment": [
{
"type": "PropertyValue",
"name": "code",
"value": "<a href=\"https://codeberg.org/spla/stats\" target=\"_blank\" rel=\"nofollow noopener noreferrer me\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"\">codeberg.org/spla/stats</span><span class=\"invisible\"></span></a>"
},
{
"type": "PropertyValue",
"name": "my user-agent",
"value": "&quot;fediverse&#39;s stats (fediverse@mastodont.cat)&quot;"
},
{
"type": "PropertyValue",
"name": "coded by",
"value": "<span class=\"h-card\" translate=\"no\"><a href=\"https://mastodont.cat/@spla\" class=\"u-url mention\">@<span>spla</span></a></span>"
}
],
"endpoints": {
"sharedInbox": "https://mastodont.cat/inbox"
},
"icon": {
"type": "Image",
"mediaType": "image/png",
"url": "https://mastodont.cat/system/accounts/avatars/000/149/323/original/33201dbeb139a24a.png"
},
"image": {
"type": "Image",
"mediaType": "image/jpeg",
"url": "https://mastodont.cat/system/accounts/headers/000/149/323/original/75c861d59e5a8860.jpeg"
}
}
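This looks like the `Service` actor fixture exercised by the new `UserValidatorTest` later in this diff; a sketch of how it is consumed there (the fixture path is an assumption inferred from that test):

```elixir
alias Pleroma.Web.ActivityPub.ObjectValidators.UserValidator

user_data =
  "test/fixtures/mastodon/service_actor.json"
  |> File.read!()
  |> Jason.decode!()

# a Service actor must validate just like a Person
{:ok, _validated_data, _meta} = UserValidator.validate(user_data, [])
```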

View File

@ -0,0 +1,72 @@
{
"@context": [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
{
"RsaSignature2017": "https://w3id.org/security#RsaSignature2017"
},
{
"pt": "https://joinpeertube.org/ns#",
"sc": "http://schema.org/",
"playlists": {
"@id": "pt:playlists",
"@type": "@id"
},
"support": {
"@type": "sc:Text",
"@id": "pt:support"
},
"icons": "as:icon"
}
],
"type": "Group",
"id": "https://spectra.video/video-channels/fediforum_demos",
"following": "https://spectra.video/video-channels/fediforum_demos/following",
"followers": "https://spectra.video/video-channels/fediforum_demos/followers",
"playlists": "https://spectra.video/video-channels/fediforum_demos/playlists",
"inbox": "https://spectra.video/video-channels/fediforum_demos/inbox",
"outbox": "https://spectra.video/video-channels/fediforum_demos/outbox",
"preferredUsername": "fediforum_demos",
"url": "https://spectra.video/video-channels/fediforum_demos",
"name": "FediForum Demos",
"endpoints": {
"sharedInbox": "https://spectra.video/inbox"
},
"publicKey": {
"id": "https://spectra.video/video-channels/fediforum_demos#main-key",
"owner": "https://spectra.video/video-channels/fediforum_demos",
"publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAxFVESAz0Z28zhXVJafzg\nKXVWS6yuZdQ4vOuA+k//ioSpNls53pI9vwQyixNa+QLdnXxm51dy//Py49wZbzAV\n2nC2FEnzcCM/EZvA4gzy7wekcjnGIz3equbdLOj3IAJJTSwCvZpW2f0poAa1CUmQ\nDRV5p3t3bjtUX5B9RnhiuDitN8qCzEeEbD9SHoyMDIACl8wXer8eyi5v98CMTHwh\nJYUJZJmS7/SSlJO2aqThEBaAYCUzVxlcXOecF1N1RWjjtwqi9xXxmlJ+teivYyST\nYfCeLmY/zZPY7OjoBxoVcVa/Yj3Wg6Nt+A5co9NATpsXmud7GWx4CvQ00uH/fa7e\nvQIDAQAB\n-----END PUBLIC KEY-----\n"
},
"published": "2024-03-26T19:34:06.073Z",
"icon": [
{
"type": "Image",
"mediaType": "image/png",
"height": 48,
"width": 48,
"url": "https://spectra.video/lazy-static/avatars/b13e5038-0169-420e-a6bc-4f5e0666fae6.png"
},
{
"type": "Image",
"mediaType": "image/png",
"height": 120,
"width": 120,
"url": "https://spectra.video/lazy-static/avatars/559b141a-96ec-4161-8889-1111b71abca0.png"
}
],
"image": {
"type": "Image",
"mediaType": "image/png",
"height": 317,
"width": 1920,
"url": "https://spectra.video/lazy-static/banners/bbe18e2c-79ef-4640-9193-cdd743c964dd.png"
},
"summary": "Demos from the the FediForum Unconference. For the sake of simplicity, demos are [broken out into playlists](https://spectra.video/c/fediforum_demos/video-playlists) representing each FediForum Event.",
"support": "Check out our site: https://fediforum.org/\nFollow us on Mastodon: https://mastodon.social/@fediforum",
"attributedTo": [
{
"type": "Person",
"id": "https://spectra.video/accounts/fediforum"
}
]
}

View File

@ -0,0 +1,64 @@
{
"@context": [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
{
"manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
"sensitive": "as:sensitive",
"Hashtag": "as:Hashtag",
"quoteUrl": "as:quoteUrl",
"toot": "http://joinmastodon.org/ns#",
"Emoji": "toot:Emoji",
"featured": "toot:featured",
"discoverable": "toot:discoverable",
"schema": "http://schema.org#",
"PropertyValue": "schema:PropertyValue",
"value": "schema:value",
"misskey": "https://misskey.io/ns#",
"_misskey_content": "misskey:_misskey_content",
"_misskey_quote": "misskey:_misskey_quote",
"_misskey_reaction": "misskey:_misskey_reaction",
"_misskey_votes": "misskey:_misskey_votes",
"_misskey_talk": "misskey:_misskey_talk",
"isCat": "misskey:isCat",
"vcard": "http://www.w3.org/2006/vcard/ns#"
}
],
"type": "Person",
"id": "https://misskey.io/users/83ssedkv53",
"inbox": "https://misskey.io/users/83ssedkv53/inbox",
"outbox": "https://misskey.io/users/83ssedkv53/outbox",
"followers": "https://misskey.io/users/83ssedkv53/followers",
"following": "https://misskey.io/users/83ssedkv53/following",
"sharedInbox": "https://misskey.io/inbox",
"endpoints": {
"sharedInbox": "https://misskey.io/inbox"
},
"url": "https://misskey.io/@aimu",
"preferredUsername": "aimu",
"name": "あいむ",
"summary": "<p><span>わずかな作曲要素 巣穴で独り言<br>Twitter </span><a href=\"https://twitter.com/aimu_53\">https://twitter.com/aimu_53</a><span><br>Soundcloud </span><a href=\"https://soundcloud.com/aimu-53\">https://soundcloud.com/aimu-53</a></p>",
"icon": {
"type": "Image",
"url": "https://s3.arkjp.net/misskey/webpublic-3f7e93c0-34f5-443c-acc0-f415cb2342b4.jpg",
"sensitive": false,
"name": null
},
"image": {
"type": "Image",
"url": "https://s3.arkjp.net/misskey/webpublic-2db63d1d-490b-488b-ab62-c93c285f26b6.png",
"sensitive": false,
"name": null
},
"tag": [],
"manuallyApprovesFollowers": false,
"discoverable": true,
"publicKey": {
"id": "https://misskey.io/users/83ssedkv53#main-key",
"type": "Key",
"owner": "https://misskey.io/users/83ssedkv53",
"publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEA1ylhePJ6qGHmwHSBP17b\nIosxGaiFKvgDBgZdm8vzvKeRSqJV9uLHfZL3pO/Zt02EwaZd2GohZAtBZEF8DbMA\n3s93WAesvyGF9mjGrYYKlhp/glwyrrrbf+RdD0DLtyDwRRlrxp3pS2lLmv5Tp1Zl\npH+UKpOnNrpQqjHI5P+lEc9bnflzbRrX+UiyLNsVAP80v4wt7SZfT/telrU6mDru\n998UdfhUo7bDKeDsHG1PfLpyhhtfdoZub4kBpkyacHiwAd+CdCjR54Eu7FDwVK3p\nY3JcrT2q5stgMqN1m4QgSL4XAADIotWwDYttTJejM1n9dr+6VWv5bs0F2Q/6gxOp\nu5DQZLk4Q+64U4LWNox6jCMOq3fYe0g7QalJIHnanYQQo+XjoH6S1Aw64gQ3Ip2Y\nZBmZREAOR7GMFVDPFnVnsbCHnIAv16TdgtLgQBAihkWEUuPqITLi8PMu6kMr3uyq\nYkObEfH0TNTcqaiVpoXv791GZLEUV5ROl0FSUANLNkHZZv29xZ5JDOBOR1rNBLyH\ngVtW8rpszYqOXwzX23hh4WsVXfB7YgNvIijwjiaWbzsecleaENGEnLNMiVKVumTj\nmtyTeFJpH0+OaSrUYpemRRJizmqIjklKsNwUEwUb2WcUUg92o56T2obrBkooabZe\nwgSXSKTOcjsR/ju7+AuIyvkCAwEAAQ==\n-----END PUBLIC KEY-----\n"
},
"isCat": true,
"vcard:bday": "5353-05-03"
}

View File

@ -0,0 +1,44 @@
{
"@context": [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
{
"manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
"sensitive": "as:sensitive",
"Hashtag": "as:Hashtag",
"quoteUrl": "as:quoteUrl",
"toot": "http://joinmastodon.org/ns#",
"Emoji": "toot:Emoji",
"featured": "toot:featured",
"discoverable": "toot:discoverable",
"schema": "http://schema.org#",
"PropertyValue": "schema:PropertyValue",
"value": "schema:value",
"misskey": "https://misskey.io/ns#",
"_misskey_content": "misskey:_misskey_content",
"_misskey_quote": "misskey:_misskey_quote",
"_misskey_reaction": "misskey:_misskey_reaction",
"_misskey_votes": "misskey:_misskey_votes",
"_misskey_talk": "misskey:_misskey_talk",
"isCat": "misskey:isCat",
"vcard": "http://www.w3.org/2006/vcard/ns#"
}
],
"id": "https://misskey.io/notes/8vs6wxufd0",
"type": "Note",
"attributedTo": "https://misskey.io/users/83ssedkv53",
"summary": null,
"content": "<p><span>Fantiaこれできないように過去のやつは従量課金だった気がする</span></p>",
"_misskey_content": "Fantiaこれできないように過去のやつは従量課金だった気がする",
"published": "2022-01-21T16:37:12.663Z",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://misskey.io/users/83ssedkv53/followers"
],
"inReplyTo": null,
"attachment": [],
"sensitive": false,
"tag": []
}

View File

@ -69,7 +69,9 @@ defmodule Mix.Tasks.Pleroma.InstanceTest do
"test/uploads",
"--static-dir",
"./test/../test/instance/static/",
"--strip-uploads",
"--strip-uploads-metadata",
"y",
"--read-uploads-description",
"y",
"--anonymize-uploads",
"n"
@ -91,7 +93,10 @@ defmodule Mix.Tasks.Pleroma.InstanceTest do
assert generated_config =~ "password: \"dbpass\""
assert generated_config =~ "configurable_from_database: true"
assert generated_config =~ "http: [ip: {127, 0, 0, 1}, port: 4000]"
assert generated_config =~ "filters: [Pleroma.Upload.Filter.Exiftool]"
assert generated_config =~
"filters: [Pleroma.Upload.Filter.Exiftool.ReadDescription, Pleroma.Upload.Filter.Exiftool.StripMetadata]"
assert generated_config =~ "base_url: \"https://media.pleroma.social/media\""
assert File.read!(tmp_path() <> "setup.psql") == generated_setup_psql()
assert File.exists?(Path.expand("./test/instance/static/robots.txt"))
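The old `--strip-uploads` prompt splits into `--strip-uploads-metadata` and `--read-uploads-description`, mirroring the two new filter modules. A sketch of driving the generator non-interactively with the renamed flags (options not passed here would be prompted for; the invocation shape follows the test above):

```elixir
Mix.Task.run("pleroma.instance", [
  "gen",
  "--strip-uploads-metadata", "y",
  "--read-uploads-description", "y",
  "--anonymize-uploads", "n"
])
```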

View File

@ -11,6 +11,62 @@ defmodule Pleroma.Config.DeprecationWarningsTest do
alias Pleroma.Config
alias Pleroma.Config.DeprecationWarnings
describe "filter exiftool" do
test "gives warning when still used" do
clear_config(
[Pleroma.Upload, :filters],
[Pleroma.Upload.Filter.Exiftool]
)
assert capture_log(fn -> DeprecationWarnings.check_exiftool_filter() end) =~
"""
!!!DEPRECATION WARNING!!!
Your config is using Exiftool as a filter instead of Exiftool.StripMetadata. This should work for now, but you are advised to change to the new configuration to prevent possible issues later:
```
config :pleroma, Pleroma.Upload,
filters: [Pleroma.Upload.Filter.Exiftool]
```
Is now
```
config :pleroma, Pleroma.Upload,
filters: [Pleroma.Upload.Filter.Exiftool.StripMetadata]
```
"""
end
test "changes setting to exiftool strip metadata" do
clear_config(
[Pleroma.Upload, :filters],
[Pleroma.Upload.Filter.Exiftool, Pleroma.Upload.Filter.Exiftool.ReadDescription]
)
expected_config = [
Pleroma.Upload.Filter.Exiftool.StripMetadata,
Pleroma.Upload.Filter.Exiftool.ReadDescription
]
capture_log(fn -> DeprecationWarnings.warn() end)
assert Config.get([Pleroma.Upload]) |> Keyword.get(:filters, []) == expected_config
end
test "doesn't give a warning with correct config" do
clear_config(
[Pleroma.Upload, :filters],
[
Pleroma.Upload.Filter.Exiftool.StripMetadata,
Pleroma.Upload.Filter.Exiftool.ReadDescription
]
)
assert capture_log(fn -> DeprecationWarnings.check_exiftool_filter() end) == ""
end
end
describe "simple policy tuples" do
test "gives warning when there are still strings" do
clear_config([:mrf_simple],
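Together the three tests pin down the expected behaviour: warn on the legacy module, rewrite the in-memory setting, stay silent on a correct config. A hedged sketch of that shape (the shipped check lives in `Pleroma.Config.DeprecationWarnings` and may differ in detail):

```elixir
defmodule DeprecationSketch do
  require Logger

  @old Pleroma.Upload.Filter.Exiftool
  @new Pleroma.Upload.Filter.Exiftool.StripMetadata

  def check_exiftool_filter do
    filters = Pleroma.Config.get([Pleroma.Upload]) |> Keyword.get(:filters, [])

    if @old in filters do
      Logger.warning("Exiftool filter is deprecated; use Exiftool.StripMetadata instead")

      # same one-for-one module rename as the ConfigDB migration earlier in this diff
      new_filters = Enum.map(filters, fn f -> if f == @old, do: @new, else: f end)
      Pleroma.Config.put([Pleroma.Upload, :filters], new_filters)
    end

    :ok
  end
end
```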

View File

@ -57,6 +57,9 @@ defmodule Pleroma.Object.FetcherTest do
body: spoofed_object_with_ids("https://patch.cx/objects/spoof_content_type")
}
%{method: :get, url: "https://octodon.social/users/cwebber/statuses/111647596861000656"} ->
%Tesla.Env{status: 403}
# Spoof: mismatching ids
# Variant 1: Non-existing fake id
%{
@ -203,8 +206,7 @@ defmodule Pleroma.Object.FetcherTest do
test "it returns thread depth exceeded error if thread depth is exceeded" do
clear_config([:instance, :federation_incoming_replies_max_depth], 0)
assert {:error, "Max thread distance exceeded."} =
Fetcher.fetch_object_from_id(@ap_id, depth: 1)
assert {:error, :allowed_depth} = Fetcher.fetch_object_from_id(@ap_id, depth: 1)
end
test "it fetches object if max thread depth is restricted to 0 and depth is not specified" do
@ -250,12 +252,12 @@ defmodule Pleroma.Object.FetcherTest do
end
test "it does not fetch a spoofed object with id different from URL" do
assert {:error, "Object's ActivityPub id/url does not match final fetch URL"} =
assert {:error, :id_mismatch} =
Fetcher.fetch_and_contain_remote_object_from_id(
"https://patch.cx/media/03ca3c8b4ac3ddd08bf0f84be7885f2f88de0f709112131a22d83650819e36c2.json"
)
assert {:error, "Object's ActivityPub id/url does not match final fetch URL"} =
assert {:error, :id_mismatch} =
Fetcher.fetch_and_contain_remote_object_from_id(
"https://patch.cx/media/spoof_stage1.json"
)
@ -285,14 +287,14 @@ defmodule Pleroma.Object.FetcherTest do
end
test "it does not fetch a spoofed object with a foreign actor" do
assert {:error, "Object containment failed."} =
assert {:error, _} =
Fetcher.fetch_and_contain_remote_object_from_id(
"https://patch.cx/objects/spoof_foreign_actor"
)
end
test "it does not fetch from localhost" do
assert {:error, "Trying to fetch local resource"} =
assert {:error, :local_resource} =
Fetcher.fetch_and_contain_remote_object_from_id(
Pleroma.Web.Endpoint.url() <> "/spoof_local"
)
@ -402,16 +404,14 @@ defmodule Pleroma.Object.FetcherTest do
end
test "handle HTTP 410 Gone response" do
assert {:error,
{"Object has been deleted", "https://mastodon.example.org/users/userisgone", 410}} ==
assert {:error, :not_found} ==
Fetcher.fetch_and_contain_remote_object_from_id(
"https://mastodon.example.org/users/userisgone"
)
end
test "handle HTTP 404 response" do
assert {:error,
{"Object has been deleted", "https://mastodon.example.org/users/userisgone404", 404}} ==
assert {:error, :not_found} ==
Fetcher.fetch_and_contain_remote_object_from_id(
"https://mastodon.example.org/users/userisgone404"
)
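The fetcher's error returns move from free-form strings and tuples to atoms (`:id_mismatch`, `:not_found`, `:local_resource`, `:allowed_depth`), so callers can pattern-match instead of comparing strings. A sketch of a call site under the new contract (the result tuples here are hypothetical):

```elixir
case Fetcher.fetch_and_contain_remote_object_from_id(id) do
  {:ok, object} -> {:processed, object}
  {:error, :id_mismatch} -> {:rejected, :spoofed_id}
  {:error, :not_found} -> :skip
  {:error, :local_resource} -> :skip
  {:error, reason} -> {:failed, reason}
end
```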

View File

@ -0,0 +1,117 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Upload.Filter.Exiftool.ReadDescriptionTest do
use Pleroma.DataCase, async: true
alias Pleroma.Upload.Filter
@uploads %Pleroma.Upload{
name: "image_with_imagedescription_and_caption-abstract.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image_with_imagedescription_and_caption-abstract.jpg"),
tempfile: Path.absname("test/fixtures/image_with_imagedescription_and_caption-abstract.jpg"),
description: nil
}
test "keeps description when not empty" do
uploads = %Pleroma.Upload{
name: "image_with_imagedescription_and_caption-abstract.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image_with_imagedescription_and_caption-abstract.jpg"),
tempfile:
Path.absname("test/fixtures/image_with_imagedescription_and_caption-abstract.jpg"),
description: "Some description"
}
assert Filter.Exiftool.ReadDescription.filter(uploads) ==
{:ok, :noop}
end
test "otherwise returns ImageDescription when present" do
uploads_after = %Pleroma.Upload{
name: "image_with_imagedescription_and_caption-abstract.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image_with_imagedescription_and_caption-abstract.jpg"),
tempfile:
Path.absname("test/fixtures/image_with_imagedescription_and_caption-abstract.jpg"),
description: "a descriptive white pixel"
}
assert Filter.Exiftool.ReadDescription.filter(@uploads) ==
{:ok, :filtered, uploads_after}
end
test "otherwise returns iptc:Caption-Abstract when present" do
upload = %Pleroma.Upload{
name: "image_with_caption-abstract.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image_with_caption-abstract.jpg"),
tempfile: Path.absname("test/fixtures/image_with_caption-abstract.jpg"),
description: nil
}
upload_after = %Pleroma.Upload{
name: "image_with_caption-abstract.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image_with_caption-abstract.jpg"),
tempfile: Path.absname("test/fixtures/image_with_caption-abstract.jpg"),
description: "an abstract white pixel"
}
assert Filter.Exiftool.ReadDescription.filter(upload) ==
{:ok, :filtered, upload_after}
end
test "otherwise returns nil" do
uploads = %Pleroma.Upload{
name: "image_with_no_description.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image_with_no_description.jpg"),
tempfile: Path.absname("test/fixtures/image_with_no_description.jpg"),
description: nil
}
assert Filter.Exiftool.ReadDescription.filter(uploads) ==
{:ok, :filtered, uploads}
end
test "Return nil when image description from EXIF data exceeds the maximum length" do
clear_config([:instance, :description_limit], 5)
assert Filter.Exiftool.ReadDescription.filter(@uploads) ==
{:ok, :filtered, @uploads}
end
test "Ignores content with only whitespace" do
uploads = %Pleroma.Upload{
name: "non-existant.jpg",
content_type: "image/jpeg",
path:
Path.absname(
"test/fixtures/image_with_imagedescription_and_caption-abstract_whitespaces.jpg"
),
tempfile:
Path.absname(
"test/fixtures/image_with_imagedescription_and_caption-abstract_whitespaces.jpg"
),
description: nil
}
assert Filter.Exiftool.ReadDescription.filter(uploads) ==
{:ok, :filtered, uploads}
end
test "Return nil when image description from EXIF data can't be read" do
uploads = %Pleroma.Upload{
name: "non-existant.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/non-existant.jpg"),
tempfile: Path.absname("test/fixtures/non-existant_tmp.jpg"),
description: nil
}
assert Filter.Exiftool.ReadDescription.filter(uploads) ==
{:ok, :filtered, uploads}
end
end
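Condensing the contract these tests pin down (return shapes taken from the assertions above):

```elixir
case Filter.Exiftool.ReadDescription.filter(upload) do
  # the upload already carried a description: nothing to do
  {:ok, :noop} ->
    upload

  # description filled from EXIF ImageDescription, then IPTC Caption-Abstract;
  # stays nil if absent, whitespace-only, unreadable, or over :description_limit
  {:ok, :filtered, upload} ->
    upload
end
```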

View File

@ -0,0 +1,148 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Upload.Filter.Exiftool.StripMetadataTest do
use Pleroma.DataCase
alias Pleroma.Upload.Filter
@tag :tmp_dir
test "exiftool strip metadata strips GPS etc but preserves Orientation and ColorSpace by default",
%{tmp_dir: tmp_dir} do
assert Pleroma.Utils.command_available?("exiftool")
tmpfile = Path.join(tmp_dir, "tmp.jpg")
File.cp!(
"test/fixtures/DSCN0010.jpg",
tmpfile
)
upload = %Pleroma.Upload{
name: "image_with_GPS_data.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/DSCN0010.jpg"),
tempfile: Path.absname(tmpfile)
}
assert Filter.Exiftool.StripMetadata.filter(upload) == {:ok, :filtered}
exif_original = read_exif("test/fixtures/DSCN0010.jpg")
exif_filtered = read_exif(tmpfile)
refute exif_original == exif_filtered
assert String.match?(exif_original, ~r/GPS/)
refute String.match?(exif_filtered, ~r/GPS/)
assert String.match?(exif_original, ~r/Camera Model Name/)
refute String.match?(exif_filtered, ~r/Camera Model Name/)
assert String.match?(exif_original, ~r/Orientation/)
assert String.match?(exif_filtered, ~r/Orientation/)
assert String.match?(exif_original, ~r/Color Space/)
assert String.match?(exif_filtered, ~r/Color Space/)
end
# this is a nonsensical configuration, but it shouldn't explode
@tag :tmp_dir
test "exiftool strip metadata is a noop with empty purge list", %{tmp_dir: tmp_dir} do
assert Pleroma.Utils.command_available?("exiftool")
clear_config([Pleroma.Upload.Filter.Exiftool.StripMetadata, :purge], [])
tmpfile = Path.join(tmp_dir, "tmp.jpg")
File.cp!(
"test/fixtures/DSCN0010.jpg",
tmpfile
)
upload = %Pleroma.Upload{
name: "image_with_GPS_data.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/DSCN0010.jpg"),
tempfile: Path.absname(tmpfile)
}
assert Filter.Exiftool.StripMetadata.filter(upload) == {:ok, :filtered}
exif_original = read_exif("test/fixtures/DSCN0010.jpg")
exif_filtered = read_exif(tmpfile)
assert exif_original == exif_filtered
end
@tag :tmp_dir
test "exiftool strip metadata works with empty preserve list", %{tmp_dir: tmp_dir} do
assert Pleroma.Utils.command_available?("exiftool")
clear_config([Pleroma.Upload.Filter.Exiftool.StripMetadata, :preserve], [])
tmpfile = Path.join(tmp_dir, "tmp.jpg")
File.cp!(
"test/fixtures/DSCN0010.jpg",
tmpfile
)
upload = %Pleroma.Upload{
name: "image_with_GPS_data.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/DSCN0010.jpg"),
tempfile: Path.absname(tmpfile)
}
write_exif(["-ImageDescription=Trees and Houses", "-Orientation=1", tmpfile])
exif_extended = read_exif(tmpfile)
assert String.match?(exif_extended, ~r/Image Description[ \t]*:[ \t]*Trees and Houses/)
assert String.match?(exif_extended, ~r/Orientation/)
assert Filter.Exiftool.StripMetadata.filter(upload) == {:ok, :filtered}
exif_original = read_exif("test/fixtures/DSCN0010.jpg")
exif_filtered = read_exif(tmpfile)
refute exif_original == exif_filtered
refute exif_extended == exif_filtered
assert String.match?(exif_original, ~r/GPS/)
refute String.match?(exif_filtered, ~r/GPS/)
refute String.match?(exif_filtered, ~r/Image Description/)
refute String.match?(exif_filtered, ~r/Orientation/)
end
test "verify webp files are skipped" do
upload = %Pleroma.Upload{
name: "sample.webp",
content_type: "image/webp"
}
assert Filter.Exiftool.StripMetadata.filter(upload) == {:ok, :noop}
end
test "verify svg files are skipped" do
upload = %Pleroma.Upload{
name: "sample.svg",
content_type: "image/svg+xml"
}
assert Filter.Exiftool.StripMetadata.filter(upload) == {:ok, :noop}
end
defp read_exif(file) do
# time and file path tags cause mismatches even for byte-identical files
{exif_data, 0} =
System.cmd("exiftool", [
"-x",
"Time:All",
"-x",
"Directory",
"-x",
"FileName",
"-x",
"FileSize",
file
])
exif_data
end
defp write_exif(args) do
{_response, 0} = System.cmd("exiftool", ["-ignoreMinorErrors", "-overwrite_original" | args])
end
end
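The `:purge` and `:preserve` keys toggled via `clear_config/2` above imply a config stanza along these lines (key names come from the tests; the values shown are illustrative assumptions, not necessarily the shipped defaults):

```elixir
# in config/config.exs
config :pleroma, Pleroma.Upload.Filter.Exiftool.StripMetadata,
  # exiftool tags or tag groups to delete from uploads
  purge: [:all],
  # tags to keep even when purging, e.g. rotation and colour information
  preserve: [:colorspaces, :orientation]
```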

View File

@ -1,42 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Upload.Filter.ExiftoolTest do
use Pleroma.DataCase
alias Pleroma.Upload.Filter
test "apply exiftool filter" do
assert Pleroma.Utils.command_available?("exiftool")
File.cp!(
"test/fixtures/DSCN0010.jpg",
"test/fixtures/DSCN0010_tmp.jpg"
)
upload = %Pleroma.Upload{
name: "image_with_GPS_data.jpg",
content_type: "image/jpeg",
path: Path.absname("test/fixtures/DSCN0010.jpg"),
tempfile: Path.absname("test/fixtures/DSCN0010_tmp.jpg")
}
assert Filter.Exiftool.filter(upload) == {:ok, :filtered}
{exif_original, 0} = System.cmd("exiftool", ["test/fixtures/DSCN0010.jpg"])
{exif_filtered, 0} = System.cmd("exiftool", ["test/fixtures/DSCN0010_tmp.jpg"])
refute exif_original == exif_filtered
assert String.match?(exif_original, ~r/GPS/)
refute String.match?(exif_filtered, ~r/GPS/)
end
test "verify webp files are skipped" do
upload = %Pleroma.Upload{
name: "sample.webp",
content_type: "image/webp"
}
assert Filter.Exiftool.filter(upload) == {:ok, :noop}
end
end

View File

@ -1418,244 +1418,6 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubControllerTest do
end
end
describe "POST /users/:nickname/outbox (C2S)" do
setup do: clear_config([:instance, :limit])
setup do
[
activity: %{
"@context" => "https://www.w3.org/ns/activitystreams",
"type" => "Create",
"object" => %{
"type" => "Note",
"content" => "AP C2S test",
"to" => "https://www.w3.org/ns/activitystreams#Public",
"cc" => []
}
}
]
end
test "it rejects posts from other users / unauthenticated users", %{
conn: conn,
activity: activity
} do
user = insert(:user)
other_user = insert(:user)
conn = put_req_header(conn, "content-type", "application/activity+json")
conn
|> post("/users/#{user.nickname}/outbox", activity)
|> json_response(403)
conn
|> assign(:user, other_user)
|> post("/users/#{user.nickname}/outbox", activity)
|> json_response(403)
end
test "it inserts an incoming create activity into the database", %{
conn: conn,
activity: activity
} do
user = insert(:user)
result =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", activity)
|> json_response(201)
assert Activity.get_by_ap_id(result["id"])
assert result["object"]
assert %Object{data: object} = Object.normalize(result["object"], fetch: false)
assert object["content"] == activity["object"]["content"]
end
test "it rejects anything beyond 'Note' creations", %{conn: conn, activity: activity} do
user = insert(:user)
activity =
activity
|> put_in(["object", "type"], "Benis")
_result =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", activity)
|> json_response(400)
end
test "it inserts an incoming sensitive activity into the database", %{
conn: conn,
activity: activity
} do
user = insert(:user)
conn = assign(conn, :user, user)
object = Map.put(activity["object"], "sensitive", true)
activity = Map.put(activity, "object", object)
response =
conn
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", activity)
|> json_response(201)
assert Activity.get_by_ap_id(response["id"])
assert response["object"]
assert %Object{data: response_object} = Object.normalize(response["object"], fetch: false)
assert response_object["sensitive"] == true
assert response_object["content"] == activity["object"]["content"]
representation =
conn
|> put_req_header("accept", "application/activity+json")
|> get(response["id"])
|> json_response(200)
assert representation["object"]["sensitive"] == true
end
test "it rejects an incoming activity with bogus type", %{conn: conn, activity: activity} do
user = insert(:user)
activity = Map.put(activity, "type", "BadType")
conn =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", activity)
assert json_response(conn, 400)
end
test "it erects a tombstone when receiving a delete activity", %{conn: conn} do
note_activity = insert(:note_activity)
note_object = Object.normalize(note_activity, fetch: false)
user = User.get_cached_by_ap_id(note_activity.data["actor"])
data = %{
"type" => "Delete",
"object" => %{
"id" => note_object.data["id"]
}
}
result =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", data)
|> json_response(201)
assert Activity.get_by_ap_id(result["id"])
assert object = Object.get_by_ap_id(note_object.data["id"])
assert object.data["type"] == "Tombstone"
end
test "it rejects delete activity of object from other actor", %{conn: conn} do
note_activity = insert(:note_activity)
note_object = Object.normalize(note_activity, fetch: false)
user = insert(:user)
data = %{
type: "Delete",
object: %{
id: note_object.data["id"]
}
}
conn =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", data)
assert json_response(conn, 403)
end
test "it increases like count when receiving a like action", %{conn: conn} do
note_activity = insert(:note_activity)
note_object = Object.normalize(note_activity, fetch: false)
user = User.get_cached_by_ap_id(note_activity.data["actor"])
data = %{
type: "Like",
object: %{
id: note_object.data["id"]
}
}
conn =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", data)
result = json_response(conn, 201)
assert Activity.get_by_ap_id(result["id"])
assert object = Object.get_by_ap_id(note_object.data["id"])
assert object.data["like_count"] == 1
end
test "it doesn't spreads faulty attributedTo or actor fields", %{
conn: conn,
activity: activity
} do
reimu = insert(:user, nickname: "reimu")
cirno = insert(:user, nickname: "cirno")
assert reimu.ap_id
assert cirno.ap_id
activity =
activity
|> put_in(["object", "actor"], reimu.ap_id)
|> put_in(["object", "attributedTo"], reimu.ap_id)
|> put_in(["actor"], reimu.ap_id)
|> put_in(["attributedTo"], reimu.ap_id)
_reimu_outbox =
conn
|> assign(:user, cirno)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{reimu.nickname}/outbox", activity)
|> json_response(403)
cirno_outbox =
conn
|> assign(:user, cirno)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{cirno.nickname}/outbox", activity)
|> json_response(201)
assert cirno_outbox["attributedTo"] == nil
assert cirno_outbox["actor"] == cirno.ap_id
assert cirno_object = Object.normalize(cirno_outbox["object"], fetch: false)
assert cirno_object.data["actor"] == cirno.ap_id
assert cirno_object.data["attributedTo"] == cirno.ap_id
end
test "Character limitation", %{conn: conn, activity: activity} do
clear_config([:instance, :limit], 5)
user = insert(:user)
result =
conn
|> assign(:user, user)
|> put_req_header("content-type", "application/activity+json")
|> post("/users/#{user.nickname}/outbox", activity)
|> json_response(400)
assert result == "Character limit (5 characters) exceeded, contains 11 characters"
end
end
describe "/relay/followers" do
test "it returns relay followers", %{conn: conn} do
relay_actor = Relay.get_actor()
@ -1977,95 +1739,6 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubControllerTest do
end
end
describe "Additional ActivityPub C2S endpoints" do
test "GET /api/ap/whoami", %{conn: conn} do
user = insert(:user)
conn =
conn
|> assign(:user, user)
|> get("/api/ap/whoami")
user = User.get_cached_by_id(user.id)
assert UserView.render("user.json", %{user: user}) == json_response(conn, 200)
conn
|> get("/api/ap/whoami")
|> json_response(403)
end
setup do: clear_config([:media_proxy])
setup do: clear_config([Pleroma.Upload])
test "POST /api/ap/upload_media", %{conn: conn} do
user = insert(:user)
desc = "Description of the image"
image = %Plug.Upload{
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image.jpg"),
filename: "an_image.jpg"
}
object =
conn
|> assign(:user, user)
|> post("/api/ap/upload_media", %{"file" => image, "description" => desc})
|> json_response(:created)
assert object["name"] == desc
assert object["type"] == "Document"
assert object["actor"] == user.ap_id
assert [%{"href" => object_href, "mediaType" => object_mediatype}] = object["url"]
assert is_binary(object_href)
assert object_mediatype == "image/jpeg"
assert String.ends_with?(object_href, ".jpg")
activity_request = %{
"@context" => "https://www.w3.org/ns/activitystreams",
"type" => "Create",
"object" => %{
"type" => "Note",
"content" => "AP C2S test, attachment",
"attachment" => [object],
"to" => "https://www.w3.org/ns/activitystreams#Public",
"cc" => []
}
}
activity_response =
conn
|> assign(:user, user)
|> post("/users/#{user.nickname}/outbox", activity_request)
|> json_response(:created)
assert activity_response["id"]
assert activity_response["object"]
assert activity_response["actor"] == user.ap_id
assert %Object{data: %{"attachment" => [attachment]}} =
Object.normalize(activity_response["object"], fetch: false)
assert attachment["type"] == "Document"
assert attachment["name"] == desc
assert [
%{
"href" => ^object_href,
"type" => "Link",
"mediaType" => ^object_mediatype
}
] = attachment["url"]
# Fails if unauthenticated
conn
|> post("/api/ap/upload_media", %{"file" => image, "description" => desc})
|> json_response(403)
end
end
test "pinned collection", %{conn: conn} do
clear_config([:instance, :max_pinned_statuses], 2)
user = insert(:user)

View File

@ -4,10 +4,16 @@
defmodule Pleroma.Web.ActivityPub.MRF.InlineQuotePolicyTest do
alias Pleroma.Web.ActivityPub.MRF.InlineQuotePolicy
alias Pleroma.Object
use Pleroma.DataCase
setup_all do
Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
:ok
end
test "adds quote URL to post content" do
quote_url = "https://example.com/objects/1234"
quote_url = "https://mastodon.social/users/emelie/statuses/101849165031453009"
activity = %{
"type" => "Create",
@ -19,10 +25,13 @@ defmodule Pleroma.Web.ActivityPub.MRF.InlineQuotePolicyTest do
}
}
# Prefetch the quoted post
%Object{} = Object.normalize(quote_url, fetch: true)
{:ok, %{"object" => %{"content" => filtered}}} = InlineQuotePolicy.filter(activity)
assert filtered ==
"<p>Nice post<span class=\"quote-inline\"><br/><br/>RE: <a href=\"https://example.com/objects/1234\">https://example.com/objects/1234</a></span></p>"
"<p>Nice post<span class=\"quote-inline\"><br/><br/>RE: <a href=\"https://mastodon.social/@emelie/101849165031453009\">https://mastodon.social/@emelie/101849165031453009</a></span></p>"
end
test "ignores Misskey quote posts" do

View File

@ -39,6 +39,28 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.ArticleNotePageValidatorTest
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "note with url validates", %{note: note} do
note = Map.put(note, "url", "https://remote.example/link")
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "note with url array validates", %{note: note} do
note = Map.put(note, "url", ["https://remote.example/link"])
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "note with url array validates if contains a link object", %{note: note} do
note =
Map.put(note, "url", [
%{
"type" => "Link",
"href" => "https://remote.example/link"
}
])
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "a note with a language validates" do
insert(:user, %{ap_id: "https://mastodon.social/users/akkoma_ap_integration_tester"})
note = File.read!("test/fixtures/mastodon/note_with_language.json") |> Jason.decode!()

View File

@ -12,14 +12,13 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidatorTest do
describe "attachments" do
test "works with apng" do
attachment =
%{
"mediaType" => "image/apng",
"name" => "",
"type" => "Document",
"url" =>
"https://media.misskeyusercontent.com/io/2859c26e-cd43-4550-848b-b6243bc3fe28.apng"
}
attachment = %{
"mediaType" => "image/apng",
"name" => "",
"type" => "Document",
"url" =>
"https://media.misskeyusercontent.com/io/2859c26e-cd43-4550-848b-b6243bc3fe28.apng"
}
assert {:ok, attachment} =
AttachmentValidator.cast_and_validate(attachment)

View File

@ -0,0 +1,38 @@
# Akkoma: Magically expressive social media
# Copyright © 2024 Akkoma Authors <https://akkoma.dev/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.ObjectValidators.UserValidatorTest do
use Pleroma.DataCase, async: true
alias Pleroma.Web.ActivityPub.ObjectValidators.UserValidator
# all standard actor types are listed here:
# https://www.w3.org/TR/activitystreams-vocabulary/#actor-types
describe "accepts standard type" do
test "Application" do
validates_file!("test/fixtures/mastodon/application_actor.json")
end
test "Group" do
validates_file!("test/fixtures/peertube/actor-videochannel.json")
end
test "Organization" do
validates_file!("test/fixtures/tesla_mock/wedistribute-user.json")
end
test "Person" do
validates_file!("test/fixtures/bridgy/actor.json")
end
test "Service" do
validates_file!("test/fixtures/mastodon/service_actor.json")
end
end
defp validates_file!(path) do
user_data = Jason.decode!(File.read!(path))
{:ok, _validated_data, _meta} = UserValidator.validate(user_data, [])
end
end

View File

@ -37,7 +37,80 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier.EmojiReactHandlingTest do
assert match?([["👌", _, nil]], object.data["reactions"])
end
test "it works for incoming custom emoji reactions" do
test "it works for incoming custom emoji with nil id" do
user = insert(:user)
other_user = insert(:user, local: false)
{:ok, activity} = CommonAPI.post(user, %{status: "hello"})
shortcode = "blobcatgoogly"
emoji = emoji_object(shortcode)
data = react_with_custom(activity.data["object"], other_user.ap_id, emoji)
{:ok, %Activity{data: data, local: false}} = Transmogrifier.handle_incoming(data)
assert data["actor"] == other_user.ap_id
assert data["type"] == "EmojiReact"
assert data["object"] == activity.data["object"]
assert data["content"] == ":" <> shortcode <> ":"
[%{}] = data["tag"]
object = Object.get_by_ap_id(data["object"])
assert object.data["reaction_count"] == 1
assert match?([[^shortcode, _, _]], object.data["reactions"])
end
test "it works for incoming custom emoji with image url as id" do
user = insert(:user)
other_user = insert(:user, local: false)
{:ok, activity} = CommonAPI.post(user, %{status: "hello"})
shortcode = "blobcatgoogly"
imgurl = "https://example.org/emoji/a.png"
emoji = emoji_object(shortcode, imgurl, imgurl)
data = react_with_custom(activity.data["object"], other_user.ap_id, emoji)
{:ok, %Activity{data: data, local: false}} = Transmogrifier.handle_incoming(data)
assert data["actor"] == other_user.ap_id
assert data["type"] == "EmojiReact"
assert data["object"] == activity.data["object"]
assert data["content"] == ":" <> shortcode <> ":"
assert [%{}] = data["tag"]
object = Object.get_by_ap_id(data["object"])
assert object.data["reaction_count"] == 1
assert match?([[^shortcode, _, ^imgurl]], object.data["reactions"])
end
test "it works for incoming custom emoji without tag array" do
user = insert(:user)
other_user = insert(:user, local: false)
{:ok, activity} = CommonAPI.post(user, %{status: "hello"})
shortcode = "blobcatgoogly"
imgurl = "https://example.org/emoji/b.png"
emoji = emoji_object(shortcode, imgurl, imgurl)
data = react_with_custom(activity.data["object"], other_user.ap_id, emoji, false)
assert %{} = data["tag"]
{:ok, %Activity{data: data, local: false}} = Transmogrifier.handle_incoming(data)
assert data["actor"] == other_user.ap_id
assert data["type"] == "EmojiReact"
assert data["object"] == activity.data["object"]
assert data["content"] == ":" <> shortcode <> ":"
assert [%{}] = data["tag"]
object = Object.get_by_ap_id(data["object"])
assert object.data["reaction_count"] == 1
assert match?([[^shortcode, _, _]], object.data["reactions"])
end
test "it works for incoming custom emoji reactions from Misskey" do
user = insert(:user)
other_user = insert(:user, local: false)
{:ok, activity} = CommonAPI.post(user, %{status: "hello"})
@ -138,4 +211,27 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier.EmojiReactHandlingTest do
assert {:error, _} = Transmogrifier.handle_incoming(data)
end
defp emoji_object(shortcode, id \\ nil, url \\ "https://example.org/emoji.png") do
%{
"type" => "Emoji",
"id" => id,
"name" => shortcode |> String.replace_prefix(":", "") |> String.replace_suffix(":", ""),
"icon" => %{
"type" => "Image",
"url" => url
}
}
end
defp react_with_custom(object_id, as_actor, emoji, tag_array \\ true) do
tag = if tag_array, do: [emoji], else: emoji
File.read!("test/fixtures/emoji-reaction.json")
|> Jason.decode!()
|> Map.put("object", object_id)
|> Map.put("actor", as_actor)
|> Map.put("content", ":" <> emoji["name"] <> ":")
|> Map.put("tag", tag)
end
end

View File

@ -124,6 +124,28 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
assert activity.data["context"] == object.data["context"]
end
test "it accepts quote posts" do
insert(:user, ap_id: "https://misskey.io/users/7rkrarq81i")
object = File.read!("test/fixtures/quote_post/misskey_quote_post.json") |> Jason.decode!()
message = %{
"@context" => "https://www.w3.org/ns/activitystreams",
"type" => "Create",
"actor" => "https://misskey.io/users/7rkrarq81i",
"object" => object
}
assert {:ok, activity} = Transmogrifier.handle_incoming(message)
# Object was created in the database
object = Object.normalize(activity)
assert object.data["quoteUri"] == "https://misskey.io/notes/8vs6wxufd0"
# It fetched the quoted post
assert Object.normalize("https://misskey.io/notes/8vs6wxufd0")
end
end
describe "prepare outgoing" do
@ -413,7 +435,7 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
assert capture_log(fn ->
{:error, _} = Transmogrifier.handle_incoming(data)
end) =~ "Object containment failed"
end) =~ "Object rejected while fetching"
end
test "it rejects activities which reference objects that have an incorrect attribution (variant 1)" do
@ -428,7 +450,7 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
assert capture_log(fn ->
{:error, _} = Transmogrifier.handle_incoming(data)
end) =~ "Object containment failed"
end) =~ "Object rejected while fetching"
end
test "it rejects activities which reference objects that have an incorrect attribution (variant 2)" do
@ -443,7 +465,7 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
assert capture_log(fn ->
{:error, _} = Transmogrifier.handle_incoming(data)
end) =~ "Object containment failed"
end) =~ "Object rejected while fetching"
end
end
@ -536,7 +558,7 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
test "returns nil when cannot normalize object" do
assert capture_log(fn ->
refute Transmogrifier.get_obj_helper("test-obj-id")
end) =~ "URI Scheme Invalid"
end) =~ ":valid_uri_scheme"
end
@tag capture_log: true

View File

@ -137,6 +137,37 @@ defmodule Pleroma.Web.FederatorTest do
assert {:error, :already_present} = ObanHelpers.perform(job)
end
test "successfully normalises public scope descriptors" do
params = %{
"@context" => "https://www.w3.org/ns/activitystreams",
"actor" => "http://mastodon.example.org/users/admin",
"type" => "Create",
"id" => "http://mastodon.example.org/users/admin/activities/1",
"object" => %{
"type" => "Note",
"content" => "hi world!",
"id" => "http://mastodon.example.org/users/admin/objects/1",
"attributedTo" => "http://mastodon.example.org/users/admin",
"to" => ["Public"]
},
"to" => ["as:Public"]
}
assert {:ok, job} = Federator.incoming_ap_doc(params)
assert {:ok, activity} = ObanHelpers.perform(job)
assert activity.data["to"] == ["https://www.w3.org/ns/activitystreams#Public"]
object =
from(
object in Pleroma.Object,
where: fragment("(?)->>'id' = ?", object.data, ^activity.data["object"]),
limit: 1
)
|> Repo.one()
assert object.data["to"] == ["https://www.w3.org/ns/activitystreams#Public"]
end
test "rejects incoming AP docs with incorrect origin" do
params = %{
"@context" => "https://www.w3.org/ns/activitystreams",

View File

@ -111,7 +111,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
# 2 hours
expires_in = 2 * 60 * 60
expires_at = DateTime.add(DateTime.utc_now(), expires_in)
expires_at1 = DateTime.add(DateTime.utc_now(), expires_in)
conn_four =
conn
@ -123,12 +123,16 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
assert %{"id" => fourth_id} = json_response_and_validate_schema(conn_four, 200)
assert Activity.get_by_id(fourth_id)
activity = Activity.get_by_id(fourth_id)
assert activity
{:ok, expires_at2, _} = DateTime.from_iso8601(activity.data["expires_at"])
assert Timex.compare(expires_at1, expires_at2, :minutes) == 0
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: fourth_id},
scheduled_at: expires_at
scheduled_at: expires_at2
)
end
@ -148,16 +152,13 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
activity = Activity.get_by_id_with_object(id)
{:ok, expires_at, _} = DateTime.from_iso8601(activity.data["expires_at"])
assert Timex.diff(
expires_at,
DateTime.utc_now(),
:hours
) == 23
expiry_delay = Timex.diff(expires_at, DateTime.utc_now(), :hours)
assert(expiry_delay in [23, 24])
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: DateTime.add(DateTime.utc_now(), 1 * 60 * 60 * 24)
scheduled_at: expires_at
)
end
@ -1405,7 +1406,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
%{conn: conn} = oauth_access(["write:accounts", "write:statuses"])
expires_in = 2 * 60 * 60
expires_at = DateTime.add(DateTime.utc_now(), expires_in)
expires_at1 = DateTime.add(DateTime.utc_now(), expires_in)
assert %{"id" => id} =
conn
@ -1416,10 +1417,15 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
})
|> json_response_and_validate_schema(200)
activity = Activity.get_by_id(id)
{:ok, expires_at2, _} = DateTime.from_iso8601(activity.data["expires_at"])
assert Timex.compare(expires_at1, expires_at2, :minutes) == 0
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: expires_at
scheduled_at: expires_at2
)
assert %{"id" => ^id, "pinned" => true} =
@ -1431,7 +1437,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
refute_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: expires_at
scheduled_at: expires_at2
)
assert %{"id" => ^id, "pinned" => false} =
@ -1443,7 +1449,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: expires_at
scheduled_at: expires_at2
)
end
end
@ -1944,7 +1950,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
|> json_response_and_validate_schema(:ok)
{:ok, a_expires_at, 0} = DateTime.from_iso8601(a_expires_at)
assert DateTime.diff(expires_at, a_expires_at) == 0
assert Timex.compare(expires_at, a_expires_at, :minutes) == 0
%{conn: conn} = oauth_access(["read:statuses"])
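The recurring change in this file is the switch from fixed-offset equality to Timex.compare/3 at :minutes granularity, which de-flakes the assertions: the seconds that elapse between computing the expected timestamp and the server stamping the activity no longer matter. Illustration (values made up):

a = ~U[2024-04-27 12:00:00Z]
b = DateTime.add(a, 3, :second)
DateTime.diff(a, b)           #=> -3, so a plain equality check fails
Timex.compare(a, b, :minutes) #=> 0, both fall within the same minute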


@ -9,6 +9,8 @@ defmodule Pleroma.Web.Metadata.Providers.OpenGraphTest do
setup do: clear_config([Pleroma.Web.Metadata, :unfurl_nsfw])
setup do: clear_config([Pleroma.Upload, :uploader], Pleroma.Uploaders.Local)
setup do: clear_config([:restrict_unauthenticated, :profiles, :local])
setup do: clear_config([:restrict_unauthenticated, :activities, :local])
test "it renders all supported types of attachments and skips unknown types" do
user = insert(:user)
@ -188,4 +190,24 @@ defmodule Pleroma.Web.Metadata.Providers.OpenGraphTest do
"http://localhost:4001/proxy/preview/LzAnlke-l5oZbNzWsrHfprX1rGw/aHR0cHM6Ly9wbGVyb21hLmdvdi9hYm91dC9qdWNoZS53ZWJt/juche.webm"
], []} in result
end
test "it does not render users if profiles are marked as restricted" do
clear_config([:restrict_unauthenticated, :profiles, :local], true)
user = insert(:user)
result = OpenGraph.build_tags(%{user: user})
assert Enum.empty?(result)
end
test "it does not activities users if they are marked as restricted" do
clear_config([:restrict_unauthenticated, :activities, :local], true)
user = insert(:user)
note = insert(:note, data: %{"actor" => user.ap_id})
result = OpenGraph.build_tags(%{object: note, url: note.data["id"], user: user})
assert {:meta, [property: "og:description", content: "Content cannot be displayed."], []} in result
end
end
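Both new tests toggle subkeys of the restrict_unauthenticated setting. For reference, a sketch of the relevant config shape (defaults assumed; config/config.exs is authoritative):

config :pleroma, :restrict_unauthenticated,
  profiles: %{local: false, remote: false},
  activities: %{local: false, remote: false}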


@ -14,6 +14,8 @@ defmodule Pleroma.Web.Metadata.Providers.TwitterCardTest do
alias Pleroma.Web.Metadata.Utils
setup do: clear_config([Pleroma.Web.Metadata, :unfurl_nsfw])
setup do: clear_config([:restrict_unauthenticated, :profiles, :local])
setup do: clear_config([:restrict_unauthenticated, :activities, :local])
test "it renders twitter card for user info" do
user = insert(:user, name: "Jimmy Hendriks", bio: "born 19 March 1994")
@ -28,6 +30,14 @@ defmodule Pleroma.Web.Metadata.Providers.TwitterCardTest do
]
end
test "it does not render twitter card for user info if it is restricted" do
clear_config([:restrict_unauthenticated, :profiles, :local], true)
user = insert(:user, name: "Jimmy Hendriks", bio: "born 19 March 1994")
res = TwitterCard.build_tags(%{user: user})
assert Enum.empty?(res)
end
test "it uses summary twittercard if post has no attachment" do
user = insert(:user, name: "Jimmy Hendriks", bio: "born 19 March 1994")
{:ok, activity} = CommonAPI.post(user, %{status: "HI"})
@ -54,6 +64,16 @@ defmodule Pleroma.Web.Metadata.Providers.TwitterCardTest do
] == result
end
test "it does not summarise activities if they are marked as restricted" do
clear_config([:restrict_unauthenticated, :activities, :local], true)
user = insert(:user)
note = insert(:note, data: %{"actor" => user.ap_id})
result = TwitterCard.build_tags(%{object: note, activity_id: note.data["id"], user: user})
assert {:meta, [name: "twitter:description", content: "Content cannot be displayed."], []} in result
end
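The {:meta, attrs, []} tuples asserted on here are the providers' intermediate representation; the metadata layer later renders them into HTML. A sketch of that final step (assuming a Phoenix.HTML.Tag-based renderer):

attrs = [name: "twitter:description", content: "Content cannot be displayed."]
Phoenix.HTML.Tag.tag(:meta, attrs)
#=> safe iodata for <meta name="twitter:description" content="Content cannot be displayed.">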
test "it uses summary as description if post has one" do
user = insert(:user, name: "Jimmy Hendriks", bio: "born 19 March 1994")
{:ok, activity} = CommonAPI.post(user, %{status: "HI"})


@ -7,8 +7,6 @@ defmodule Pleroma.Web.Plugs.HTTPSecurityPlugTest do
alias Plug.Conn
setup_all do: clear_config([Pleroma.Upload, :base_url], nil)
describe "http security enabled" do
setup do: clear_config([:http_security, :enabled], true)
@ -98,51 +96,68 @@ defmodule Pleroma.Web.Plugs.HTTPSecurityPlugTest do
test "media_proxy with base_url", %{conn: conn} do
url = "https://example.com"
clear_config([:media_proxy, :base_url], url)
assert_media_img_src(conn, url)
assert_media_img_src(conn, proxy: url)
assert_connect_src(conn, url)
end
test "upload with base url", %{conn: conn} do
url = "https://example2.com"
clear_config([Pleroma.Upload, :base_url], url)
assert_media_img_src(conn, url)
assert_media_img_src(conn, upload: url)
assert_connect_src(conn, url)
end
test "with S3 public endpoint", %{conn: conn} do
url = "https://example3.com"
clear_config([Pleroma.Uploaders.S3, :public_endpoint], url)
assert_media_img_src(conn, url)
assert_media_img_src(conn, s3: url)
end
test "with captcha endpoint", %{conn: conn} do
clear_config([Pleroma.Captcha.Mock, :endpoint], "https://captcha.com")
assert_media_img_src(conn, "https://captcha.com")
assert_media_img_src(conn, captcha: "https://captcha.com")
end
test "with media_proxy whitelist", %{conn: conn} do
clear_config([:media_proxy, :whitelist], ["https://example6.com", "https://example7.com"])
assert_media_img_src(conn, "https://example7.com https://example6.com")
assert_media_img_src(conn, proxy_whitelist: "https://example7.com https://example6.com")
end
# TODO: delete after removing support for bare domains in the media proxy whitelist
test "with media_proxy bare domains whitelist (deprecated)", %{conn: conn} do
clear_config([:media_proxy, :whitelist], ["example4.com", "example5.com"])
assert_media_img_src(conn, "example5.com example4.com")
assert_media_img_src(conn, proxy_whitelist: "example5.com example4.com")
end
test "with media_proxy blocklist", %{conn: conn} do
clear_config([:media_proxy, :whitelist], ["https://example6.com", "https://example7.com"])
clear_config([:media_proxy, :blocklist], ["https://example8.com"])
assert_media_img_src(conn, "https://example7.com https://example6.com")
assert_media_img_src(conn, proxy_whitelist: "https://example7.com https://example6.com")
end
end
defp assert_media_img_src(conn, url) do
defp maybe_concat(nil, b), do: b
defp maybe_concat(a, nil), do: a
defp maybe_concat(a, b), do: a <> " " <> b
defp build_src_str(urls) do
urls[:proxy_whitelist]
|> maybe_concat(urls[:s3])
|> maybe_concat(urls[:upload])
|> maybe_concat(urls[:proxy])
|> maybe_concat(urls[:captcha])
end
defp assert_media_img_src(conn, urls) do
urlstr =
[upload: "http://localhost", proxy: "http://localhost"]
|> Keyword.merge(urls)
|> build_src_str()
conn = get(conn, "/api/v1/instance")
[csp] = Conn.get_resp_header(conn, "content-security-policy")
assert csp =~ "media-src 'self' #{url};"
assert csp =~ "img-src 'self' data: blob: #{url};"
assert csp =~ "media-src 'self' #{urlstr};"
assert csp =~ "img-src 'self' data: blob: #{urlstr};"
end
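With the defaults merged in, build_src_str/1 composes the expected source list in a fixed order (proxy whitelist, S3, upload, proxy, captcha), skipping absent entries. For example, assert_media_img_src(conn, proxy: "https://example.com") checks for:

[upload: "http://localhost", proxy: "https://example.com"] |> build_src_str()
#=> "http://localhost https://example.com"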
defp assert_connect_src(conn, url) do


@ -132,7 +132,7 @@ defmodule Pleroma.Web.TwitterAPI.RemoteFollowControllerTest do
|> html_response(200)
assert response =~ "Error fetching user"
end) =~ "Object has been deleted"
end) =~ ":not_found"
end
end


@ -24,7 +24,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilControllerTest do
describe "PUT /api/pleroma/notification_settings" do
setup do: oauth_access(["write:accounts"])
test "it updates notification settings", %{user: user, conn: conn} do
test "it updates notification settings via url paramters", %{user: user, conn: conn} do
conn
|> put(
"/api/pleroma/notification_settings?#{URI.encode_query(%{block_from_strangers: true})}"
@ -39,6 +39,57 @@ defmodule Pleroma.Web.TwitterAPI.UtilControllerTest do
} == user.notification_settings
end
test "it updates notification settings via JSON body params", %{user: user, conn: conn} do
conn
|> put_req_header("content-type", "application/json")
|> put(
"/api/pleroma/notification_settings",
%{"block_from_strangers" => true}
)
|> json_response_and_validate_schema(:ok)
user = refresh_record(user)
assert %Pleroma.User.NotificationSetting{
block_from_strangers: true,
hide_notification_contents: false
} == user.notification_settings
end
test "it updates notification settings via form data", %{user: user, conn: conn} do
conn
|> put_req_header("content-type", "multipart/form-data")
|> put(
"/api/pleroma/notification_settings",
%{:block_from_strangers => true}
)
|> json_response_and_validate_schema(:ok)
user = refresh_record(user)
assert %Pleroma.User.NotificationSetting{
block_from_strangers: true,
hide_notification_contents: false
} == user.notification_settings
end
test "it updates notification settings via urlencoded body", %{user: user, conn: conn} do
conn
|> put_req_header("content-type", "application/x-www-form-urlencoded")
|> put(
"/api/pleroma/notification_settings",
"block_from_strangers=true"
)
|> json_response_and_validate_schema(:ok)
user = refresh_record(user)
assert %Pleroma.User.NotificationSetting{
block_from_strangers: true,
hide_notification_contents: false
} == user.notification_settings
end
test "it updates notification settings to enable hiding contents", %{user: user, conn: conn} do
conn
|> put(
@ -53,6 +104,27 @@ defmodule Pleroma.Web.TwitterAPI.UtilControllerTest do
hide_notification_contents: true
} == user.notification_settings
end
# we already test all body variants for block_from_strangers, so just one should suffice here
test "it updates notification settings to enable hiding contents via JSON body params", %{
user: user,
conn: conn
} do
conn
|> put_req_header("content-type", "application/json")
|> put(
"/api/pleroma/notification_settings",
%{"hide_notification_contents" => true}
)
|> json_response_and_validate_schema(:ok)
user = refresh_record(user)
assert %Pleroma.User.NotificationSetting{
block_from_strangers: false,
hide_notification_contents: true
} == user.notification_settings
end
end
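Taken together, these tests pin down that the endpoint accepts the same update as query-string, JSON, multipart and urlencoded input. For illustration, two of the equivalent raw requests (a sketch, assuming an authenticated Tesla client bound to the instance):

Tesla.put(client, "/api/pleroma/notification_settings?block_from_strangers=true", "")

Tesla.put(client, "/api/pleroma/notification_settings",
  Jason.encode!(%{block_from_strangers: true}),
  headers: [{"content-type", "application/json"}]
)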
describe "GET /api/pleroma/frontend_configurations" do


@ -46,8 +46,7 @@ defmodule Pleroma.Web.WebFinger.WebFingerControllerTest do
assert response["subject"] == "acct:#{user.nickname}@localhost"
assert response["aliases"] == [
"https://hyrule.world/users/zelda",
"https://mushroom.kingdom/users/toad"
"https://hyrule.world/users/zelda"
]
end
@ -104,7 +103,6 @@ defmodule Pleroma.Web.WebFinger.WebFingerControllerTest do
|> response(200)
assert response =~ "<Alias>https://hyrule.world/users/zelda</Alias>"
assert response =~ "<Alias>https://mushroom.kingdom/users/toad</Alias>"
end
test "it returns 404 when user isn't found (XML)" do


@ -0,0 +1,69 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2023 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.RemoteFetcherWorkerTest do
use Pleroma.DataCase
use Oban.Testing, repo: Pleroma.Repo
alias Pleroma.Workers.RemoteFetcherWorker
@deleted_object_one "https://deleted-404.example.com/"
@deleted_object_two "https://deleted-410.example.com/"
@unauthorized_object "https://unauthorized.example.com/"
@depth_object "https://depth.example.com/"
describe "RemoteFetcherWorker" do
setup do
Tesla.Mock.mock(fn
%{method: :get, url: @deleted_object_one} ->
%Tesla.Env{
status: 404
}
%{method: :get, url: @deleted_object_two} ->
%Tesla.Env{
status: 410
}
%{method: :get, url: @unauthorized_object} ->
%Tesla.Env{
status: 403
}
%{method: :get, url: @depth_object} ->
%Tesla.Env{
status: 200
}
end)
end
test "does not requeue a deleted object" do
assert {:discard, _} =
RemoteFetcherWorker.perform(%Oban.Job{
args: %{"op" => "fetch_remote", "id" => @deleted_object_one}
})
assert {:discard, _} =
RemoteFetcherWorker.perform(%Oban.Job{
args: %{"op" => "fetch_remote", "id" => @deleted_object_two}
})
end
test "does not requeue an unauthorized object" do
assert {:discard, _} =
RemoteFetcherWorker.perform(%Oban.Job{
args: %{"op" => "fetch_remote", "id" => @unauthorized_object}
})
end
test "does not requeue an object that exceeded depth" do
clear_config([:instance, :federation_incoming_replies_max_depth], 0)
assert {:discard, _} =
RemoteFetcherWorker.perform(%Oban.Job{
args: %{"op" => "fetch_remote", "id" => @depth_object, "depth" => 1}
})
end
end
end
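What these tests assert is the Oban contract: returning {:discard, reason} from perform/1 drops the job for good instead of scheduling a retry, which is the right outcome for 404/410/403 responses and depth-limit hits. A simplified sketch of that mapping (the error atoms are illustrative, not the fetcher's exact return values):

def perform(%Oban.Job{args: %{"op" => "fetch_remote", "id" => id} = args}) do
  case Pleroma.Object.Fetcher.fetch_object_from_id(id, depth: args["depth"]) do
    {:ok, _object} -> :ok
    {:error, :not_found} -> {:discard, :not_found}
    {:error, :forbidden} -> {:discard, :forbidden}
    {:error, :allowed_depth} -> {:discard, :allowed_depth}
    {:error, _} = other -> other
  end
end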


@ -559,16 +559,15 @@ defmodule Pleroma.Factory do
like_activity = attrs[:like_activity] || insert(:like_activity)
attrs = Map.drop(attrs, [:like_activity])
data =
%{
"id" => Pleroma.Web.ActivityPub.Utils.generate_activity_id(),
"type" => "Undo",
"actor" => like_activity.data["actor"],
"to" => like_activity.data["to"],
"object" => like_activity.data["id"],
"published" => DateTime.utc_now() |> DateTime.to_iso8601(),
"context" => like_activity.data["context"]
}
data = %{
"id" => Pleroma.Web.ActivityPub.Utils.generate_activity_id(),
"type" => "Undo",
"actor" => like_activity.data["actor"],
"to" => like_activity.data["to"],
"object" => like_activity.data["id"],
"published" => DateTime.utc_now() |> DateTime.to_iso8601(),
"context" => like_activity.data["context"]
}
%Pleroma.Activity{
data: data,


@ -5,7 +5,16 @@
defmodule HttpRequestMock do
require Logger
def activitypub_object_headers, do: [{"content-type", "application/activity+json"}]
def activitypub_object_headers,
do: [
{"content-type", "application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\""}
]
# The exact Accept headers we generate; the AP spec only requires the first one to appear somewhere
@activitypub_accept_headers [
{"accept", "application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\""},
{"accept", "application/activity+json"}
]
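Because Tesla.Mock dispatches on these function heads, the header list must match what the fetcher sends exactly, including order. Sketch of the client side being mirrored (illustrative):

Tesla.request(
  method: :get,
  url: "https://mstdn.io/users/mayuutann",
  headers: @activitypub_accept_headers
)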
def request(
%Tesla.Env{
@ -97,7 +106,7 @@ defmodule HttpRequestMock do
File.read!("test/fixtures/users_mock/masto_featured.json")
|> String.replace("{{domain}}", "mastodon.sdf.org")
|> String.replace("{{nickname}}", "rinpatch"),
headers: [{"content-type", "application/activity+json"}]
headers: activitypub_object_headers()
}}
end
@ -208,7 +217,7 @@ defmodule HttpRequestMock do
"https://mst3k.interlinked.me/users/luciferMysticus",
_,
_,
[{"accept", "application/activity+json"}]
@activitypub_accept_headers
) do
{:ok,
%Tesla.Env{
@ -231,7 +240,7 @@ defmodule HttpRequestMock do
"https://hubzilla.example.org/channel/kaniini",
_,
_,
[{"accept", "application/activity+json"}]
@activitypub_accept_headers
) do
{:ok,
%Tesla.Env{
@ -241,7 +250,7 @@ defmodule HttpRequestMock do
}}
end
def get("https://niu.moe/users/rye", _, _, [{"accept", "application/activity+json"}]) do
def get("https://niu.moe/users/rye", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -250,7 +259,7 @@ defmodule HttpRequestMock do
}}
end
def get("https://n1u.moe/users/rye", _, _, [{"accept", "application/activity+json"}]) do
def get("https://n1u.moe/users/rye", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -270,7 +279,7 @@ defmodule HttpRequestMock do
}}
end
def get("https://puckipedia.com/", _, _, [{"accept", "application/activity+json"}]) do
def get("https://puckipedia.com/", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -342,9 +351,12 @@ defmodule HttpRequestMock do
}}
end
def get("https://mobilizon.org/events/252d5816-00a3-4a89-a66f-15bf65c33e39", _, _, [
{"accept", "application/activity+json"}
]) do
def get(
"https://mobilizon.org/events/252d5816-00a3-4a89-a66f-15bf65c33e39",
_,
_,
@activitypub_accept_headers
) do
{:ok,
%Tesla.Env{
status: 200,
@ -353,7 +365,7 @@ defmodule HttpRequestMock do
}}
end
def get("https://mobilizon.org/@tcit", _, _, [{"accept", "application/activity+json"}]) do
def get("https://mobilizon.org/@tcit", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -416,9 +428,7 @@ defmodule HttpRequestMock do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("http://mastodon.example.org/users/relay", _, _, [
{"accept", "application/activity+json"}
]) do
def get("http://mastodon.example.org/users/relay", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -427,9 +437,7 @@ defmodule HttpRequestMock do
}}
end
def get("http://mastodon.example.org/users/gargron", _, _, [
{"accept", "application/activity+json"}
]) do
def get("http://mastodon.example.org/users/gargron", _, _, @activitypub_accept_headers) do
{:error, :nxdomain}
end
@ -620,7 +628,7 @@ defmodule HttpRequestMock do
}}
end
def get("https://mstdn.io/users/mayuutann", _, _, [{"accept", "application/activity+json"}]) do
def get("https://mstdn.io/users/mayuutann", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -633,7 +641,7 @@ defmodule HttpRequestMock do
"https://mstdn.io/users/mayuutann/statuses/99568293732299394",
_,
_,
[{"accept", "application/activity+json"}]
@activitypub_accept_headers
) do
{:ok,
%Tesla.Env{
@ -779,7 +787,7 @@ defmodule HttpRequestMock do
"http://gs.example.org:4040/index.php/user/1",
_,
_,
[{"accept", "application/activity+json"}]
@activitypub_accept_headers
) do
{:ok, %Tesla.Env{status: 406, body: ""}}
end
@ -966,7 +974,7 @@ defmodule HttpRequestMock do
}}
end
def get("https://social.heldscal.la/user/23211", _, _, [{"accept", "application/activity+json"}]) do
def get("https://social.heldscal.la/user/23211", _, _, @activitypub_accept_headers) do
{:ok, Tesla.Mock.json(%{"id" => "https://social.heldscal.la/user/23211"}, status: 200)}
end
@ -1207,13 +1215,11 @@ defmodule HttpRequestMock do
File.read!("test/fixtures/users_mock/masto_featured.json")
|> String.replace("{{domain}}", "lm.kazv.moe")
|> String.replace("{{nickname}}", "mewmew"),
headers: [{"content-type", "application/activity+json"}]
headers: activitypub_object_headers()
}}
end
def get("https://info.pleroma.site/activity.json", _, _, [
{"accept", "application/activity+json"}
]) do
def get("https://info.pleroma.site/activity.json", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -1226,9 +1232,7 @@ defmodule HttpRequestMock do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://info.pleroma.site/activity2.json", _, _, [
{"accept", "application/activity+json"}
]) do
def get("https://info.pleroma.site/activity2.json", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -1241,9 +1245,7 @@ defmodule HttpRequestMock do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://info.pleroma.site/activity3.json", _, _, [
{"accept", "application/activity+json"}
]) do
def get("https://info.pleroma.site/activity3.json", _, _, @activitypub_accept_headers) do
{:ok,
%Tesla.Env{
status: 200,
@ -1319,6 +1321,25 @@ defmodule HttpRequestMock do
}}
end
# A Misskey quote
def get("https://misskey.io/notes/8vs6wxufd0", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/misskey.io_8vs6wxufd0.json"),
headers: activitypub_object_headers()
}}
end
def get("https://misskey.io/users/83ssedkv53", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/aimu@misskey.io.json"),
headers: activitypub_object_headers()
}}
end
def get("https://example.org/emoji/firedfox.png", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")}}
end