2022.09 stable #208

Merged
floatingghost merged 49 commits from develop into stable 2022-09-10 16:06:19 +00:00
164 changed files with 6073 additions and 780 deletions

View file

@ -1,18 +0,0 @@
<!--
### Precheck
* For support use https://git.pleroma.social/pleroma/pleroma-support or [community channels](https://git.pleroma.social/pleroma/pleroma#community-channels).
* Please do a quick search to ensure no similar bug has been reported before. If the bug has not been addressed after 2 weeks, it's fine to bump it.
* Try to ensure that the bug is actually related to the Pleroma backend. For example, if a bug happens in Pleroma-FE but not in Mastodon-FE or mobile clients, it's likely that the bug should be filed in [Pleroma-FE](https://git.pleroma.social/pleroma/pleroma-fe/issues/new) repository.
-->
### Environment
* Installation type (OTP or From Source):
* Pleroma version (could be found in the "Version" tab of settings in Pleroma-FE):
* Elixir version (`elixir -v` for from source installations, N/A for OTP):
* Operating system:
* PostgreSQL version (`psql -V`):
### Bug description

View file

@ -1,6 +0,0 @@
### Release checklist
* [ ] Bump version in `mix.exs`
* [ ] Compile a changelog
* [ ] Create an MR with an announcement to pleroma.social
* [ ] Tag the release
* [ ] Merge `stable` into `develop` (in case the fixes are already in develop, use `git merge -s ours --no-commit` and manually merge the changelogs)

View file

@ -112,7 +112,7 @@ pipeline:
- /bin/sh /entrypoint.sh
debian-bullseye:
image: elixir:1.13.4
image: akkoma/debian
<<: *on-release
environment:
MIX_ENV: prod

View file

@ -4,7 +4,33 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## [Unreleased]
## 2022.09
### Added
- support for fedibird-fe, and non-breaking API parity for it to function
- support for setting instance languages in metadata
- support for reusing oauth tokens, and not requiring new authorizations
- the ability to obfuscate domains in your MRF descriptions
- automatic translation of statuses via DeepL or LibreTranslate
- ability to edit posts
- ability to react with remote emoji
### Changed
- MFM parsing is now done on the backend by a modified version of ilja's parser -> https://akkoma.dev/AkkomaGang/mfm-parser
- InlineQuotePolicy is now on by default
### Fixed
- Compatibility with latest meilisearch
- Resolution of nested mix tasks (e.g. search.meilisearch) in OTP releases
- Elasticsearch returning likes and repeats, displaying as posts
- Ensured key generation happens at registration time to prevent potential race conditions
- Ensured websockets get closed on logout
- Allowed GoToSocial-style `?query_string` signatures
### Removed
- Non-Finch HTTP adapters. It is now highly recommended to leave `:tesla, :adapter` set to the default.
## 2022.08
### Added
- extended runtime module support, see config cheatsheet
@ -18,6 +44,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- amd64 is built for debian stable. Compatible with ubuntu 20.
- ubuntu-jammy is built for... well, ubuntu 22 (LTS)
- amd64-musl is built for alpine 3.16
- Enable remote users to interact with posts
### Fixed
- Updated mastoFE path, for the newer version

View file

@ -2,6 +2,8 @@
*a smallish microblogging platform, aka the cooler pleroma*
![English OK](https://img.shields.io/badge/English-OK-blueviolet) ![日本語OK](https://img.shields.io/badge/%E6%97%A5%E6%9C%AC%E8%AA%9E-OK-blueviolet)
## About
This is a fork of Pleroma, which is a microblogging server software that can federate (= exchange messages with) other servers that support ActivityPub. What that means is that you can host a server for yourself or your friends and stay in control of your online identity, but still exchange messages with people on larger servers. Akkoma will federate with all servers that implement ActivityPub, like Friendica, GNU Social, Hubzilla, Mastodon, Misskey, Peertube, and Pixelfed.

SIGNING_KEY.pub Normal file
View file

@ -0,0 +1,2 @@
untrusted comment: Akkoma Signing Key public key
RWQRlw8Ex/uTbvo1wB1yK75tQ5nXKilB/vrKdkL41bgZHL9aKP+7fSS5

View file

@ -197,6 +197,7 @@
avatar_upload_limit: 2_000_000,
background_upload_limit: 4_000_000,
banner_upload_limit: 4_000_000,
languages: ["en"],
poll_limits: %{
max_options: 20,
max_option_chars: 200,
@ -734,6 +735,14 @@
"build_dir" => "distribution",
"ref" => "akkoma"
},
"fedibird-fe" => %{
"name" => "fedibird-fe",
"git" => "https://akkoma.dev/AkkomaGang/fedibird-fe",
"build_url" =>
"https://akkoma-updates.s3-website.fr-par.scw.cloud/frontend/${ref}/fedibird-fe.zip",
"build_dir" => "distribution",
"ref" => "akkoma"
},
"admin-fe" => %{
"name" => "admin-fe",
"git" => "https://akkoma.dev/AkkomaGang/admin-fe",
@ -785,7 +794,8 @@
config :pleroma, :mrf,
policies: [Pleroma.Web.ActivityPub.MRF.ObjectAgePolicy, Pleroma.Web.ActivityPub.MRF.TagPolicy],
transparency: true,
transparency_exclusions: []
transparency_exclusions: [],
transparency_obfuscate_domains: []
config :ex_aws, http_client: Pleroma.HTTP.ExAws
@ -833,6 +843,19 @@
}
}
config :pleroma, :translator,
enabled: false,
module: Pleroma.Akkoma.Translators.DeepL
config :pleroma, :deepl,
# either :free or :pro
tier: :free,
api_key: ""
config :pleroma, :libre_translate,
url: "http://127.0.0.1:5000",
api_key: nil
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"

View file

@ -509,6 +509,16 @@
"Pleroma"
]
},
%{
key: :languages,
type: {:list, :string},
description: "Languages the instance uses",
suggestions: [
"en",
"ja",
"fr"
]
},
%{
key: :email,
label: "Admin Email Address",
@ -1169,7 +1179,6 @@
hideFilteredStatuses: false,
hideMutedPosts: false,
hidePostStats: false,
hideSitename: false,
hideUserStats: false,
loginMethod: "password",
logo: "/static/logo.svg",
@ -1235,12 +1244,6 @@
type: :boolean,
description: "Hide notices statistics (repeats, favorites, ...)"
},
%{
key: :hideSitename,
label: "Hide Sitename",
type: :boolean,
description: "Hides instance name from PleromaFE banner"
},
%{
key: :hideUserStats,
label: "Hide user stats",
@ -1350,6 +1353,42 @@
type: :string,
description: "Which theme to use. Available themes are defined in styles.json",
suggestions: ["pleroma-dark"]
},
%{
key: :showPanelNavShortcuts,
label: "Show timeline panel nav shortcuts",
type: :boolean,
description: "Whether to put timeline nav tabs on the top of the panel"
},
%{
key: :showNavShortcuts,
label: "Show navbar shortcuts",
type: :boolean,
description: "Whether to put extra navigation options on the navbar"
},
%{
key: :showWiderShortcuts,
label: "Increase navbar shortcut spacing",
type: :boolean,
description: "Whether to add extra space between navbar icons"
},
%{
key: :hideSiteFavicon,
label: "Hide site favicon",
type: :boolean,
description: "Whether to hide the instance favicon from the navbar"
},
%{
key: :hideSiteName,
label: "Hide site name",
type: :boolean,
description: "Whether to hide the site name from the navbar"
},
%{
key: :renderMisskeyMarkdown,
label: "Render misskey markdown",
type: :boolean,
description: "Whether to render Misskey-flavoured markdown"
}
]
},
@ -2598,9 +2637,10 @@
%{
key: :proxy_url,
label: "Proxy URL",
type: [:string, :tuple],
description: "Proxy URL",
suggestions: ["localhost:9020", {:socks5, :localhost, 3090}]
type: :string,
description:
"Proxy URL - of the format http://host:port. Advise setting in .exs instead of admin-fe due to this being set at boot-time.",
suggestions: ["http://localhost:3128"]
},
%{
key: :user_agent,
@ -3186,13 +3226,14 @@
group: :pleroma,
key: Pleroma.Search,
type: :group,
label: "Search",
description: "General search settings.",
children: [
%{
key: :module,
type: :keyword,
type: :module,
description: "Selected search module.",
suggestion: [Pleroma.Search.DatabaseSearch, Pleroma.Search.Meilisearch]
suggestions: {:list_behaviour_implementations, Pleroma.Search.SearchBackend}
}
]
},
@ -3217,7 +3258,7 @@
},
%{
key: :initial_indexing_chunk_size,
type: :int,
type: :integer,
description:
"Amount of posts in a batch when running the initial indexing operation. Should probably not be more than 100000" <>
" since there's a limit on maximum insert size",
@ -3228,6 +3269,7 @@
%{
group: :pleroma,
key: Pleroma.Search.Elasticsearch.Cluster,
label: "Elasticsearch",
type: :group,
description: "Elasticsearch settings.",
children: [
@ -3294,13 +3336,13 @@
},
%{
key: :bulk_page_size,
type: :int,
type: :integer,
description: "Size for bulk put requests, mostly used on building the index",
suggestion: [5000]
},
%{
key: :bulk_wait_interval,
type: :int,
type: :integer,
description: "Time to wait between bulk put requests (in ms)",
suggestion: [15_000]
}
@ -3309,5 +3351,66 @@
]
}
]
},
%{
group: :pleroma,
key: :translator,
type: :group,
description: "Translation Settings",
children: [
%{
key: :enabled,
type: :boolean,
description: "Is translation enabled?",
suggestion: [true, false]
},
%{
key: :module,
type: :module,
description: "Translation module.",
suggestions: {:list_behaviour_implementations, Pleroma.Akkoma.Translator}
}
]
},
%{
group: :pleroma,
key: :deepl,
label: "DeepL",
type: :group,
description: "DeepL Settings.",
children: [
%{
key: :tier,
type: {:dropdown, :atom},
description: "API Tier",
suggestions: [:free, :pro]
},
%{
key: :api_key,
type: :string,
description: "API key for DeepL",
suggestions: [nil]
}
]
},
%{
group: :pleroma,
key: :libre_translate,
type: :group,
description: "LibreTranslate Settings.",
children: [
%{
key: :url,
type: :string,
description: "URL for libretranslate",
suggestion: [nil]
},
%{
key: :api_key,
type: :string,
description: "API key for libretranslate",
suggestion: [nil]
}
]
}
]

View file

@ -120,6 +120,7 @@ To add configuration to your config file, you can copy it from the base config.
* `Pleroma.Web.ActivityPub.MRF.KeywordPolicy`: Rejects or removes from the federated timeline or replaces keywords. (See [`:mrf_keyword`](#mrf_keyword)).
* `transparency`: Make the content of your Message Rewrite Facility settings public (via nodeinfo).
* `transparency_exclusions`: Exclude specific instance names from MRF transparency. The use of the exclusions feature will be disclosed in nodeinfo as a boolean value.
* `transparency_obfuscate_domains`: Show domains with `*` in the middle, to censor them if needed. For example, `ridingho.me` will show as `rid*****.me`
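A minimal sketch of what this looks like in config, using the example domain above:

```elixir
# Obfuscate this domain in the MRF transparency section of nodeinfo
config :pleroma, :mrf,
  transparency: true,
  transparency_obfuscate_domains: ["ridingho.me"]
```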
## Federation
### MRF policies
@ -521,7 +522,7 @@ Available caches:
### :http
* `proxy_url`: an upstream proxy to fetch posts and/or media with, (default: `nil`)
* `proxy_url`: an upstream proxy to fetch posts and/or media with (default: `nil`); for example `http://127.0.0.1:3192`. Does not support SOCKS5 proxies, only HTTP(S); see the example after this list.
* `send_user_agent`: should we include a user agent with HTTP requests? (default: `true`)
* `user_agent`: what user agent should we use? (default: `:default`), must be string or `:default`
* `adapter`: array of adapter options
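A hypothetical `:http` proxy configuration in your `.exs` file, using the example address above, might look like:

```elixir
# Upstream HTTP proxy for fetching posts and media (address is illustrative)
config :pleroma, :http,
  proxy_url: "http://127.0.0.1:3192"
```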
@ -1158,3 +1159,28 @@ Each job has these settings:
* `:max_running` - max concurrently runnings jobs
* `:max_waiting` - max waiting jobs
### Translation Settings
Settings to automatically translate statuses for end users. Currently supported
translation services are DeepL and LibreTranslate.
Translations are available at `/api/v1/statuses/:id/translations/:language`, where
`language` is the target language code (e.g. `en`).
### `:translator`
- `:enabled` - enables translation
- `:module` - Sets module to be used
- Either `Pleroma.Akkoma.Translators.DeepL` or `Pleroma.Akkoma.Translators.LibreTranslate`
### `:deepl`
- `:api_key` - API key for DeepL
- `:tier` - API tier
- either `:free` or `:pro`
### `:libre_translate`
- `:url` - URL of LibreTranslate instance
- `:api_key` - API key for LibreTranslate
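As a sketch, enabling translation against a local LibreTranslate instance (values mirror the defaults shown in `config.exs`; adjust the URL and API key for your setup) could look like:

```elixir
# Use LibreTranslate as the translation backend
config :pleroma, :translator,
  enabled: true,
  module: Pleroma.Akkoma.Translators.LibreTranslate

# Defaults from config.exs; point at your own LibreTranslate instance
config :pleroma, :libre_translate,
  url: "http://127.0.0.1:5000",
  api_key: nil
```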

View file

@ -19,6 +19,10 @@ config :pleroma, :frontends,
admin: %{
"name" => "admin-fe",
"ref" => "stable"
},
mastodon: %{
"name" => "mastodon-fe",
"ref" => "akkoma"
}
```
@ -26,12 +30,18 @@ This would serve the frontend from the folder at `$instance_static/frontends
Refer to [the frontend CLI task](../../administration/CLI_tasks/frontend) for how to install the frontend's files
If you wish masto-fe to be enabled as well, you will also need to run the install task for `mastodon-fe`. Not doing this will lead to the frontend not working.
If you choose not to install a frontend for whatever reason, it is recommended that you enable [`:static_fe`](#static_fe) to allow remote users to click "view remote source". Don't bother with this if you've got no unauthenticated access though.
You can also replace the default "no frontend" page by placing an `index.html` file under your `instance/static/` directory.
## Mastodon-FE
Akkoma supports both [glitchsoc](https://github.com/glitch-soc/mastodon)'s more "vanilla" mastodon frontend
and [fedibird](https://github.com/fedibird/mastodon)'s extended frontend, which has near-feature-parity with akkoma (with quoting and reactions).
To enable either one, you must run the `frontend.install` task for either `mastodon-fe` or `fedibird-fe` (both `--ref akkoma`), then make sure
`:pleroma, :frontends, :mastodon` references the one you want.
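For example, a sketch of pointing the mastodon frontend slot at fedibird-fe (assuming it has already been installed via the frontend task):

```elixir
# Serve fedibird-fe in place of the default mastodon-fe
config :pleroma, :frontends,
  mastodon: %{
    "name" => "fedibird-fe",
    "ref" => "akkoma"
  }
```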
## Swagger (openAPI) documentation viewer
If you're a developer and you'd like a human-readable rendering of the

View file

@ -40,6 +40,10 @@ Has these additional fields under the `pleroma` object:
- `parent_visible`: If the parent of this post is visible to the user or not.
- `pinned_at`: a datetime (iso8601) when status was pinned, `null` otherwise.
The `GET /api/v1/statuses/:id/source` endpoint additionally has the following attributes:
- `content_type`: The content type of the status source.
## Scheduled statuses
Has these additional fields in `params`:

View file

@ -221,6 +221,8 @@ If your instance is up and running, you can create your first user with administ
doas -u akkoma env MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -212,6 +212,8 @@ If your instance is up and running, you can create your first user with administ
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -175,6 +175,8 @@ If your instance is up and running, you can create your first user with administ
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -199,6 +199,8 @@ If your instance is up and running, you can create your first user with administ
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -206,6 +206,9 @@ If your instance is up and running, you can create your first user with administ
```shell
sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
## Conclusion
Restart nginx with `# service nginx restart` and you should be up and running.

View file

@ -0,0 +1,25 @@
#### Installing Frontends
Once your backend server is functional, you'll probably also want to
install frontends.
These are no longer bundled with the distribution and need an extra
command to install.
For most installations, the following will suffice:
=== "OTP"
```sh
./bin/pleroma_ctl frontend install pleroma-fe --ref stable
# and also, if desired
./bin/pleroma_ctl frontend install admin-fe --ref stable
```
=== "From Source"
```sh
mix pleroma.frontend install pleroma-fe --ref stable
mix pleroma.frontend install admin-fe --ref stable
```
For more customised installations, refer to [Frontend Management](../../configuration/frontend_management)

View file

@ -293,6 +293,8 @@ akkoma$ MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
If you opted to allow sudo for the `akkoma` user but would like to remove the ability for greater security, now might be a good time to edit `/etc/sudoers` and/or change the groups the `akkoma` user belongs to. Be sure to restart the akkoma service afterwards to ensure it picks up on the changes.
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -1,7 +1,5 @@
# Migrating to Akkoma
**Akkoma does not currently have a stable release, until 3.0, all builds should be considered "develop"**
## Why should you migrate?
aside from actually responsive maintainer(s)? let's lookie here, we've got:
@ -11,6 +9,8 @@ aside from actually responsive maintainer(s)? let's lookie here, we've got:
- elasticsearch support (because pleroma search is GARBAGE)
- latest develop pleroma-fe additions
- local-only posting
- automatic post translation
- the mastodon frontend back in all its glory
- probably more, this is like 3.5 years of IHBA additions finally compiled
## Actually migrating
@ -43,14 +43,14 @@ This will just be setting the update URL - find your flavour from the [mapping o
```bash
export FLAVOUR=[the flavour you found above]
./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/develop/akkoma-$FLAVOUR.zip
./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-$FLAVOUR.zip
./bin/pleroma_ctl migrate
```
Then restart. When updating in the future, you can just use
```bash
./bin/pleroma_ctl update --branch develop
./bin/pleroma_ctl update --branch stable
```
## Frontend changes
@ -62,16 +62,17 @@ your upgrade path here depends on your setup
You'll need to run a couple of commands,
```bash
# From source
mix pleroma.frontend install pleroma-fe
# you'll probably want this too
mix pleroma.frontend install admin-fe
=== "OTP"
```sh
./bin/pleroma_ctl frontend install pleroma-fe --ref stable
# and also, if desired
./bin/pleroma_ctl frontend install admin-fe --ref stable
```
# OTP
./bin/pleroma_ctl frontend install pleroma-fe
# you'll probably want this too
./bin/pleroma_ctl frontend install admin-fe
=== "From Source"
```sh
mix pleroma.frontend install pleroma-fe --ref stable
mix pleroma.frontend install admin-fe --ref stable
```
### I've run the mix task to install a frontend

View file

@ -202,6 +202,8 @@ incorrect timestamps. You should have ntpd running.
* <https://catgirl.science>
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -250,6 +250,8 @@ If your instance is up and running, you can create your first user with administ
LC_ALL=en_US.UTF-8 MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
```
{! installation/frontends.include !}
#### Further reading
{! installation/further_reading.include !}

View file

@ -306,6 +306,8 @@ su akkoma -s $SHELL -lc "./bin/pleroma_ctl user new joeuser joeuser@sld.tld --ad
```
This will create an account with the username of 'joeuser' with the email address of joeuser@sld.tld, and set that user's account as an admin. This will result in a link that you can paste into the browser, which logs you in and enables you to set the password.
{! installation/frontends.include !}
## Further reading
{! installation/further_reading.include !}

View file

@ -279,6 +279,7 @@ After that, run the `pleroma_ctl migrate` command as usual to perform database m
As it currently stands, your OTP build will only be compatible with the specific RedHat distribution you've built it on. Fedora builds only work on Fedora, CentOS builds only on CentOS, RedHat builds only on RedHat. Secondly, for Fedora, they will also be bound to the specific Fedora release. This is because different releases of Fedora may have significant changes made in some of the required packages and libraries.
{! installation/frontends.include !}
{! installation/further_reading.include !}

View file

@ -0,0 +1,66 @@
# Verifying OTP release integrity
All stable OTP releases are cryptographically signed, allowing you
to verify their integrity if you choose to.
Releases are signed with [Signify](https://man.openbsd.org/signify.1),
with [the public key in the main repository](https://akkoma.dev/AkkomaGang/akkoma/src/branch/develop/SIGNING_KEY.pub)
Release URLs will always be of the form
```
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip
```
Where `branch` is usually `stable` or `develop`, and `flavour` is
the one [that you detect on install](../otp_en/#detecting-flavour).
So, for an AMD64 stable install, your update URL will be
```
https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-amd64.zip
```
To verify the integrity of this file, we have two helper files
```
# Checksums
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip.sha256
# Signify signature of the hashes
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip.sha256.sig
```
Thus, to upgrade manually with integrity checking, consider the following script:
```bash
#!/bin/bash
set -eo pipefail
export FLAVOUR=amd64
export BRANCH=stable
# Fetch signing key
curl --silent https://akkoma.dev/AkkomaGang/akkoma/raw/branch/$BRANCH/SIGNING_KEY.pub -o AKKOMA_SIGNING_KEY.pub
# Download zip file and sig files
wget -q https://akkoma-updates.s3-website.fr-par.scw.cloud/$BRANCH/akkoma-$FLAVOUR{.zip,.zip.sha256,.zip.sha256.sig}
# Verify zip file's sha256 integrity
sha256sum --check akkoma-$FLAVOUR.zip.sha256
# Verify hash file's integrity
# Signify might be under the `signify` command, depending on your distribution
signify-openbsd -V -p AKKOMA_SIGNING_KEY.pub -m akkoma-$FLAVOUR.zip.sha256
# We're good, use that URL
echo "Update URL contents verified"
echo "use"
echo "./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/$BRANCH/akkoma-$FLAVOUR"
echo "to update your instance"
# Clean up
rm akkoma-$FLAVOUR.zip
rm akkoma-$FLAVOUR.zip.sha256
rm akkoma-$FLAVOUR.zip.sha256.sig
```

View file

@ -23,7 +23,15 @@ def start_pleroma do
Pleroma.Config.Oban.warn()
Pleroma.Application.limiters_setup()
Application.put_env(:phoenix, :serve_endpoints, false, persistent: true)
Finch.start_link(name: MyFinch)
proxy_url = Pleroma.Config.get([:http, :proxy_url])
proxy = Pleroma.HTTP.AdapterHelper.format_proxy(proxy_url)
finch_config =
[:http, :adapter]
|> Pleroma.Config.get([])
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy_pool(proxy)
|> Keyword.put(:name, MyFinch)
unless System.get_env("DEBUG") do
Logger.remove_backend(:console)
@ -45,6 +53,7 @@ def start_pleroma do
Pleroma.Emoji,
{Pleroma.Config.TransferTask, false},
Pleroma.Web.Endpoint,
{Finch, finch_config},
{Oban, oban_config},
{Majic.Pool,
[name: Pleroma.MajicPool, pool_size: Pleroma.Config.get([:majic_pool, :size], 2)]}

View file

@ -9,7 +9,7 @@ defmodule Mix.Tasks.Pleroma.Search.Meilisearch do
import Ecto.Query
import Pleroma.Search.Meilisearch,
only: [meili_post: 2, meili_put: 2, meili_get: 1, meili_delete!: 1]
only: [meili_put: 2, meili_get: 1, meili_delete!: 1]
def run(["index"]) do
start_pleroma()
@ -27,7 +27,7 @@ def run(["index"]) do
end
{:ok, _} =
meili_post(
meili_put(
"/indexes/objects/settings/ranking-rules",
[
"published:desc",
@ -41,7 +41,7 @@ def run(["index"]) do
)
{:ok, _} =
meili_post(
meili_put(
"/indexes/objects/settings/searchable-attributes",
[
"content"
@ -91,7 +91,7 @@ def run(["index"]) do
)
with {:ok, res} <- result do
if not Map.has_key?(res, "uid") do
if not Map.has_key?(res, "indexUid") do
IO.puts("\nFailed to index: #{inspect(result)}")
end
else

View file

@ -258,6 +258,25 @@ def run(["untag", nickname | tags]) do
end
end
def run(["refetch_public_keys"]) do
start_pleroma()
Pleroma.User.Query.build(%{
external: true,
is_active: true
})
|> refetch_public_keys()
end
def run(["refetch_public_keys" | rest]) do
start_pleroma()
Pleroma.User.Query.build(%{
ap_id: rest
})
|> refetch_public_keys()
end
def run(["invite" | rest]) do
{options, [], []} =
OptionParser.parse(rest,
@ -519,6 +538,26 @@ def run(["fix_follow_state", local_user, remote_user]) do
end
end
defp refetch_public_keys(query) do
query
|> Pleroma.Repo.chunk_stream(50, :batches)
|> Stream.each(fn users ->
users
|> Enum.each(fn user ->
IO.puts("Re-Resolving: #{user.ap_id}")
with {:ok, user} <- Pleroma.User.fetch_by_ap_id(user.ap_id),
changeset <- Pleroma.User.update_changeset(user),
{:ok, _user} <- Pleroma.User.update_and_set_cache(changeset) do
:ok
else
error -> IO.puts("Could not resolve: #{user.ap_id}, #{inspect(error)}")
end
end)
end)
|> Stream.run()
end
defp set_moderator(user, value) do
{:ok, user} =
user

View file

@ -8,6 +8,40 @@ defmodule Pleroma.Activity.HTML do
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
# We store a list of cache keys related to an activity in a
# separate cache, scrubber_management_cache. It has the same
# size as scrubber_cache (see application.ex). Every time we add
# a cache to scrubber_cache, we update scrubber_management_cache.
#
# The most recent write of a certain key in the management cache
# is the same as the most recent write of any record related to that
# key in the main cache.
# Assuming LRW ( https://hexdocs.pm/cachex/Cachex.Policy.LRW.html ),
# this means when the management cache is evicted by cachex, all
# related records in the main cache will also have been evicted.
defp get_cache_keys_for(activity_id) do
with {:ok, list} when is_list(list) <- @cachex.get(:scrubber_management_cache, activity_id) do
list
else
_ -> []
end
end
defp add_cache_key_for(activity_id, additional_key) do
current = get_cache_keys_for(activity_id)
unless additional_key in current do
@cachex.put(:scrubber_management_cache, activity_id, [additional_key | current])
end
end
def invalidate_cache_for(activity_id) do
keys = get_cache_keys_for(activity_id)
Enum.map(keys, &@cachex.del(:scrubber_cache, &1))
@cachex.del(:scrubber_management_cache, activity_id)
end
def get_cached_scrubbed_html_for_activity(
content,
scrubbers,
@ -19,6 +53,8 @@ def get_cached_scrubbed_html_for_activity(
@cachex.fetch!(:scrubber_cache, key, fn _key ->
object = Object.normalize(activity, fetch: false)
add_cache_key_for(activity.id, key)
HTML.ensure_scrubbed_html(content, scrubbers, object.data["fake"] || false, callback)
end)
end

View file

@ -0,0 +1,100 @@
defmodule Pleroma.Akkoma.Translators.DeepL do
@behaviour Pleroma.Akkoma.Translator
alias Pleroma.HTTP
alias Pleroma.Config
require Logger
defp base_url(:free) do
"https://api-free.deepl.com/v2/"
end
defp base_url(:pro) do
"https://api.deepl.com/v2/"
end
defp api_key do
Config.get([:deepl, :api_key])
end
defp tier do
Config.get([:deepl, :tier])
end
@impl Pleroma.Akkoma.Translator
def languages do
with {:ok, %{status: 200} = source_response} <- do_languages("source"),
{:ok, %{status: 200} = dest_response} <- do_languages("target"),
{:ok, source_body} <- Jason.decode(source_response.body),
{:ok, dest_body} <- Jason.decode(dest_response.body) do
source_resp =
Enum.map(source_body, fn %{"language" => code, "name" => name} ->
%{code: code, name: name}
end)
dest_resp =
Enum.map(dest_body, fn %{"language" => code, "name" => name} ->
%{code: code, name: name}
end)
{:ok, source_resp, dest_resp}
else
{:ok, %{status: status} = response} ->
Logger.warning("DeepL: Request rejected: #{inspect(response)}")
{:error, "DeepL request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
@impl Pleroma.Akkoma.Translator
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <-
do_request(api_key(), tier(), string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
%{"translations" => [%{"text" => translated, "detected_source_language" => detected}]} =
body
{:ok, detected, translated}
else
{:ok, %{status: status} = response} ->
Logger.warning("DeepL: Request rejected: #{inspect(response)}")
{:error, "DeepL request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
defp do_request(api_key, tier, string, from_language, to_language) do
HTTP.post(
base_url(tier) <> "translate",
URI.encode_query(
%{
text: string,
target_lang: to_language,
tag_handling: "html"
}
|> maybe_add_source(from_language),
:rfc3986
),
[
{"authorization", "DeepL-Auth-Key #{api_key}"},
{"content-type", "application/x-www-form-urlencoded"}
]
)
end
defp maybe_add_source(opts, nil), do: opts
defp maybe_add_source(opts, lang), do: Map.put(opts, :source_lang, lang)
defp do_languages(type) do
HTTP.get(
base_url(tier()) <> "languages?type=#{type}",
[
{"authorization", "DeepL-Auth-Key #{api_key()}"}
]
)
end
end

View file

@ -0,0 +1,82 @@
defmodule Pleroma.Akkoma.Translators.LibreTranslate do
@behaviour Pleroma.Akkoma.Translator
alias Pleroma.Config
alias Pleroma.HTTP
require Logger
defp api_key do
Config.get([:libre_translate, :api_key])
end
defp url do
Config.get([:libre_translate, :url])
end
@impl Pleroma.Akkoma.Translator
def languages do
with {:ok, %{status: 200} = response} <- do_languages(),
{:ok, body} <- Jason.decode(response.body) do
resp = Enum.map(body, fn %{"code" => code, "name" => name} -> %{code: code, name: name} end)
# No separate source/dest
{:ok, resp, resp}
else
{:ok, %{status: status} = response} ->
Logger.warning("LibreTranslate: Request rejected: #{inspect(response)}")
{:error, "LibreTranslate request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
@impl Pleroma.Akkoma.Translator
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <- do_request(string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
%{"translatedText" => translated} = body
detected =
if Map.has_key?(body, "detectedLanguage") do
get_in(body, ["detectedLanguage", "language"])
else
from_language
end
{:ok, detected, translated}
else
{:ok, %{status: status} = response} ->
Logger.warning("libre_translate: request failed, #{inspect(response)}")
{:error, "libre_translate: request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
defp do_request(string, from_language, to_language) do
url = URI.parse(url())
url = %{url | path: "/translate"}
HTTP.post(
to_string(url),
Jason.encode!(%{
q: string,
source: if(is_nil(from_language), do: "auto", else: from_language),
target: to_language,
format: "html",
api_key: api_key()
}),
[
{"content-type", "application/json"}
]
)
end
defp do_languages() do
url = URI.parse(url())
url = %{url | path: "/languages"}
HTTP.get(to_string(url))
end
end

View file

@ -0,0 +1,8 @@
defmodule Pleroma.Akkoma.Translator do
@callback translate(String.t(), String.t() | nil, String.t()) ::
{:ok, String.t(), String.t()} | {:error, any()}
@callback languages() ::
{:ok, [%{name: String.t(), code: String.t()}],
[%{name: String.t(), code: String.t()}]}
| {:error, any()}
end

View file

@ -63,7 +63,8 @@ def start(_type, _args) do
Pleroma.Repo,
Config.TransferTask,
Pleroma.Emoji,
Pleroma.Web.Plugs.RateLimiter.Supervisor
Pleroma.Web.Plugs.RateLimiter.Supervisor,
{Task.Supervisor, name: Pleroma.TaskSupervisor}
] ++
cachex_children() ++
http_children() ++
@ -149,11 +150,13 @@ defp cachex_children do
build_cachex("object", default_ttl: 25_000, ttl_interval: 1000, limit: 2500),
build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000),
build_cachex("scrubber", limit: 2500),
build_cachex("scrubber_management", limit: 2500),
build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500),
build_cachex("web_resp", limit: 2500),
build_cachex("emoji_packs", expiration: emoji_packs_expiration(), limit: 10),
build_cachex("failed_proxy_url", limit: 2500),
build_cachex("banned_urls", default_ttl: :timer.hours(24 * 30), limit: 5_000)
build_cachex("banned_urls", default_ttl: :timer.hours(24 * 30), limit: 5_000),
build_cachex("translations", default_ttl: :timer.hours(24 * 30), limit: 2500)
]
end
@ -248,9 +251,13 @@ def limiters_setup do
end
defp http_children do
proxy_url = Config.get([:http, :proxy_url])
proxy = Pleroma.HTTP.AdapterHelper.format_proxy(proxy_url)
config =
[:http, :adapter]
|> Config.get([])
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy_pool(proxy)
|> Keyword.put(:name, MyFinch)
[{Finch, config}]

View file

@ -11,10 +11,7 @@ defmodule Akkoma.Collections.Fetcher do
alias Pleroma.Config
require Logger
def fetch_collection_by_ap_id(ap_id) when is_binary(ap_id) do
fetch_collection(ap_id)
end
@spec fetch_collection(String.t() | map()) :: {:ok, [Pleroma.Object.t()]} | {:error, any()}
def fetch_collection(ap_id) when is_binary(ap_id) do
with {:ok, page} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id) do
{:ok, objects_from_collection(page)}
@ -26,7 +23,7 @@ def fetch_collection(ap_id) when is_binary(ap_id) do
end
def fetch_collection(%{"type" => type} = page)
when type in ["Collection", "OrderedCollection"] do
when type in ["Collection", "OrderedCollection", "CollectionPage", "OrderedCollectionPage"] do
{:ok, objects_from_collection(page)}
end
@ -38,12 +35,13 @@ defp items_in_page(%{"type" => type, "items" => items})
when is_list(items) and type in ["Collection", "CollectionPage"],
do: items
defp objects_from_collection(%{"type" => "OrderedCollection", "orderedItems" => items})
when is_list(items),
do: items
defp objects_from_collection(%{"type" => type, "orderedItems" => items} = page)
when is_list(items) and type in ["OrderedCollection", "OrderedCollectionPage"],
do: maybe_next_page(page, items)
defp objects_from_collection(%{"type" => "Collection", "items" => items}) when is_list(items),
do: items
defp objects_from_collection(%{"type" => type, "items" => items} = page)
when is_list(items) and type in ["Collection", "CollectionPage"],
do: maybe_next_page(page, items)
defp objects_from_collection(%{"type" => type, "first" => first})
when is_binary(first) and type in ["Collection", "OrderedCollection"] do
@ -55,11 +53,13 @@ defp objects_from_collection(%{"type" => type, "first" => %{"id" => id}})
fetch_page_items(id)
end
defp objects_from_collection(_page), do: []
defp fetch_page_items(id, items \\ []) do
if Enum.count(items) >= Config.get([:activitypub, :max_collection_objects]) do
items
else
{:ok, page} = Fetcher.fetch_and_contain_remote_object_from_id(id)
with {:ok, page} <- Fetcher.fetch_and_contain_remote_object_from_id(id) do
objects = items_in_page(page)
if Enum.count(objects) > 0 do
@ -67,6 +67,14 @@ defp fetch_page_items(id, items \\ []) do
else
items
end
else
{:error, "Object has been deleted"} ->
items
{:error, error} ->
Logger.error("Could not fetch page #{id} - #{inspect(error)}")
{:error, error}
end
end
end

View file

@ -38,7 +38,6 @@ def start_link(restart_pleroma? \\ true) do
def load_and_update_env(deleted_settings \\ [], restart_pleroma? \\ true) do
with {_, true} <- {:configurable, Config.get(:configurable_from_database)} do
# We need to restart applications for loaded settings take effect
{logger, other} =
(Repo.all(ConfigDB) ++ deleted_settings)
|> Enum.map(&merge_with_default/1)
@ -85,7 +84,12 @@ defp maybe_set_pleroma_last(apps) do
end
defp merge_with_default(%{group: group, key: key, value: value} = setting) do
default = Config.Holder.default_config(group, key)
default =
if group == :pleroma do
Config.get([key], Config.Holder.default_config(group, key))
else
Config.Holder.default_config(group, key)
end
merged =
cond do

View file

@ -27,4 +27,40 @@ defmodule Pleroma.Constants do
do:
~w(index.html robots.txt static static-fe finmoji emoji packs sounds images instance sw.js sw-pleroma.js favicon.png schemas doc embed.js embed.css)
)
const(status_updatable_fields,
do: [
"source",
"tag",
"updated",
"emoji",
"content",
"summary",
"sensitive",
"attachment",
"generator"
]
)
const(updatable_object_types,
do: [
"Note",
"Question",
"Audio",
"Video",
"Event",
"Article",
"Page"
]
)
const(actor_types,
do: [
"Application",
"Group",
"Organization",
"Person",
"Service"
]
)
end

View file

@ -188,6 +188,11 @@ def emoji_url(%{"type" => "EmojiReact", "content" => emoji, "tag" => tags}) do
def emoji_url(_), do: nil
def emoji_name_with_instance(name, url) do
url = url |> URI.parse() |> Map.get(:host)
"#{name}@#{url}"
end
emoji_qualification_map =
emojis
|> Enum.filter(&String.contains?(&1, "\uFE0F"))

View file

@ -9,6 +9,7 @@ defmodule Pleroma.Helpers.AuthHelper do
import Plug.Conn
@oauth_token_session_key :oauth_token
@oauth_user_session_key :oauth_user
@doc """
Skips OAuth permissions (scopes) checks, assigns nil `:token`.
@ -43,4 +44,16 @@ def put_session_token(%Conn{} = conn, token) when is_binary(token) do
def delete_session_token(%Conn{} = conn) do
delete_session(conn, @oauth_token_session_key)
end
def put_session_user(%Conn{} = conn, user) do
put_session(conn, @oauth_user_session_key, user)
end
def delete_session_user(%Conn{} = conn) do
delete_session(conn, @oauth_user_session_key)
end
def get_session_user(%Conn{} = conn) do
get_session(conn, @oauth_user_session_key)
end
end

View file

@ -6,7 +6,7 @@ defmodule Pleroma.HTTP.AdapterHelper do
@moduledoc """
Configure Tesla.Client with default and customized adapter options.
"""
@defaults [name: MyFinch, connect_timeout: 5_000, recv_timeout: 5_000]
@defaults [name: MyFinch, pool_timeout: 5_000, receive_timeout: 5_000]
@type proxy_type() :: :socks4 | :socks5
@type host() :: charlist() | :inet.ip_address()
@ -25,15 +25,58 @@ def format_proxy(nil), do: nil
def format_proxy(proxy_url) do
case parse_proxy(proxy_url) do
{:ok, host, port} -> {host, port}
{:ok, type, host, port} -> {type, host, port}
{:ok, host, port} -> {:http, host, port, []}
{:ok, type, host, port} -> {type, host, port, []}
_ -> nil
end
end
@spec maybe_add_proxy(keyword(), proxy() | nil) :: keyword()
def maybe_add_proxy(opts, nil), do: opts
def maybe_add_proxy(opts, proxy), do: Keyword.put_new(opts, :proxy, proxy)
def maybe_add_proxy(opts, proxy) do
Keyword.put(opts, :proxy, proxy)
end
def maybe_add_proxy_pool(opts, nil), do: opts
def maybe_add_proxy_pool(opts, proxy) do
Logger.info("Using HTTP Proxy: #{inspect(proxy)}")
opts
|> maybe_add_pools()
|> maybe_add_default_pool()
|> maybe_add_conn_opts()
|> put_in([:pools, :default, :conn_opts, :proxy], proxy)
end
defp maybe_add_pools(opts) do
if Keyword.has_key?(opts, :pools) do
opts
else
Keyword.put(opts, :pools, %{})
end
end
defp maybe_add_default_pool(opts) do
pools = Keyword.get(opts, :pools)
if Map.has_key?(pools, :default) do
opts
else
put_in(opts, [:pools, :default], [])
end
end
defp maybe_add_conn_opts(opts) do
conn_opts = get_in(opts, [:pools, :default, :conn_opts])
unless is_nil(conn_opts) do
opts
else
put_in(opts, [:pools, :default, :conn_opts], [])
end
end
@doc """
Merge default connection & adapter options with received ones.
@ -46,36 +89,31 @@ def options(%URI{} = uri, opts \\ []) do
|> AdapterHelper.Default.options(uri)
end
defp proxy_type("http"), do: {:ok, :http}
defp proxy_type("https"), do: {:ok, :https}
defp proxy_type(_), do: {:error, :unknown}
@spec parse_proxy(String.t() | tuple() | nil) ::
{:ok, host(), pos_integer()}
| {:ok, proxy_type(), host(), pos_integer()}
| {:error, atom()}
| nil
def parse_proxy(nil), do: nil
def parse_proxy(proxy) when is_binary(proxy) do
with [host, port] <- String.split(proxy, ":"),
{port, ""} <- Integer.parse(port) do
{:ok, parse_host(host), port}
with %URI{} = uri <- URI.parse(proxy),
{:ok, type} <- proxy_type(uri.scheme) do
{:ok, type, uri.host, uri.port}
else
{_, _} ->
Logger.warn("Parsing port failed #{inspect(proxy)}")
{:error, :invalid_proxy_port}
:error ->
Logger.warn("Parsing port failed #{inspect(proxy)}")
{:error, :invalid_proxy_port}
_ ->
Logger.warn("Parsing proxy failed #{inspect(proxy)}")
e ->
Logger.warn("Parsing proxy failed #{inspect(proxy)}, #{inspect(e)}")
{:error, :invalid_proxy}
end
end
def parse_proxy(proxy) when is_tuple(proxy) do
with {type, host, port} <- proxy do
{:ok, type, parse_host(host), port}
{:ok, type, host, port}
else
_ ->
Logger.warn("Parsing proxy failed #{inspect(proxy)}")

View file

@ -9,7 +9,7 @@ defmodule Pleroma.HTTP.AdapterHelper.Default do
@spec options(keyword(), URI.t()) :: keyword()
def options(opts, _uri) do
proxy = Pleroma.Config.get([:http, :proxy_url], nil)
proxy = Pleroma.Config.get([:http, :proxy_url])
AdapterHelper.maybe_add_proxy(opts, AdapterHelper.format_proxy(proxy))
end

View file

@ -1,82 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.HTTP.AdapterHelper.Gun do
@behaviour Pleroma.HTTP.AdapterHelper
alias Pleroma.Config
alias Pleroma.HTTP.AdapterHelper
require Logger
@defaults [
retry: 1,
retry_timeout: 1_000
]
@type pool() :: :federation | :upload | :media | :default
@spec options(keyword(), URI.t()) :: keyword()
def options(incoming_opts \\ [], %URI{} = uri) do
proxy =
[:http, :proxy_url]
|> Config.get()
|> AdapterHelper.format_proxy()
config_opts = Config.get([:http, :adapter], [])
@defaults
|> Keyword.merge(config_opts)
|> add_scheme_opts(uri)
|> AdapterHelper.maybe_add_proxy(proxy)
|> Keyword.merge(incoming_opts)
|> put_timeout()
end
defp add_scheme_opts(opts, %{scheme: "http"}), do: opts
defp add_scheme_opts(opts, %{scheme: "https"}) do
Keyword.put(opts, :certificates_verification, true)
end
defp put_timeout(opts) do
{recv_timeout, opts} = Keyword.pop(opts, :recv_timeout, pool_timeout(opts[:pool]))
# this is the timeout to receive a message from Gun
# `:timeout` key is used in Tesla
Keyword.put(opts, :timeout, recv_timeout)
end
@spec pool_timeout(pool()) :: non_neg_integer()
def pool_timeout(pool) do
default = Config.get([:pools, :default, :recv_timeout], 5_000)
Config.get([:pools, pool, :recv_timeout], default)
end
def limiter_setup do
prefix = Pleroma.Gun.ConnectionPool
wait = Config.get([:connections_pool, :connection_acquisition_wait])
retries = Config.get([:connections_pool, :connection_acquisition_retries])
:pools
|> Config.get([])
|> Enum.each(fn {name, opts} ->
max_running = Keyword.get(opts, :size, 50)
max_waiting = Keyword.get(opts, :max_waiting, 10)
result =
ConcurrentLimiter.new(:"#{prefix}.#{name}", max_running, max_waiting,
wait: wait,
max_retries: retries
)
case result do
:ok -> :ok
{:error, :existing} -> :ok
end
end)
:ok
end
end

View file

@ -1,40 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2021 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.HTTP.AdapterHelper.Hackney do
@behaviour Pleroma.HTTP.AdapterHelper
@defaults [
follow_redirect: true,
force_redirect: true
]
@spec options(keyword(), URI.t()) :: keyword()
def options(connection_opts \\ [], %URI{} = uri) do
proxy = Pleroma.Config.get([:http, :proxy_url])
config_opts = Pleroma.Config.get([:http, :adapter], [])
@defaults
|> Keyword.merge(config_opts)
|> Keyword.merge(connection_opts)
|> add_scheme_opts(uri)
|> maybe_add_with_body()
|> Pleroma.HTTP.AdapterHelper.maybe_add_proxy(proxy)
end
defp add_scheme_opts(opts, %URI{scheme: "https"}) do
Keyword.put(opts, :ssl_options, versions: [:"tlsv1.3", :"tlsv1.2", :"tlsv1.1", :tlsv1])
end
defp add_scheme_opts(opts, _), do: opts
defp maybe_add_with_body(opts) do
if opts[:max_body] do
Keyword.put(opts, :with_body, true)
else
opts
end
end
end

View file

@ -384,7 +384,7 @@ def create_notifications(%Activity{data: %{"to" => _, "type" => "Create"}} = act
end
def create_notifications(%Activity{data: %{"type" => type}} = activity, options)
when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag"] do
when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag", "Update"] do
do_create_notifications(activity, options)
end
@ -438,6 +438,9 @@ defp type_from_activity(%{data: %{"type" => type}} = activity) do
activity
|> type_from_activity_object()
"Update" ->
"update"
t ->
raise "No notification type for activity type #{t}"
end
@ -503,7 +506,16 @@ def create_poll_notifications(%Activity{} = activity) do
def get_notified_from_activity(activity, local_only \\ true)
def get_notified_from_activity(%Activity{data: %{"type" => type}} = activity, local_only)
when type in ["Create", "Like", "Announce", "Follow", "Move", "EmojiReact", "Flag"] do
when type in [
"Create",
"Like",
"Announce",
"Follow",
"Move",
"EmojiReact",
"Flag",
"Update"
] do
potential_receiver_ap_ids = get_potential_receiver_ap_ids(activity)
potential_receivers =
@ -543,6 +555,21 @@ def get_potential_receiver_ap_ids(%{data: %{"type" => "Flag", "actor" => actor}}
(User.all_superusers() |> Enum.map(fn user -> user.ap_id end)) -- [actor]
end
# Update activity: notify all who repeated this
def get_potential_receiver_ap_ids(%{data: %{"type" => "Update", "actor" => actor}} = activity) do
with %Object{data: %{"id" => object_id}} <- Object.normalize(activity, fetch: false) do
repeaters =
Activity.Queries.by_type("Announce")
|> Activity.Queries.by_object_id(object_id)
|> Activity.with_joined_user_actor()
|> where([a, u], u.local)
|> select([a, u], u.ap_id)
|> Repo.all()
repeaters -- [actor]
end
end
def get_potential_receiver_ap_ids(activity) do
[]
|> Utils.maybe_notify_to_recipients(activity)

View file

@ -26,8 +26,42 @@ defp touch_changeset(changeset) do
end
defp maybe_reinject_internal_fields(%{data: %{} = old_data}, new_data) do
has_history? = fn
%{"formerRepresentations" => %{"orderedItems" => list}} when is_list(list) -> true
_ -> false
end
internal_fields = Map.take(old_data, Pleroma.Constants.object_internal_fields())
remote_history_exists? = has_history?.(new_data)
# If the remote history exists, we treat that as the only source of truth.
new_data =
if has_history?.(old_data) and not remote_history_exists? do
Map.put(new_data, "formerRepresentations", old_data["formerRepresentations"])
else
new_data
end
# If the remote does not have history information, we need to manage it ourselves
new_data =
if not remote_history_exists? do
changed? =
Pleroma.Constants.status_updatable_fields()
|> Enum.any?(fn field -> Map.get(old_data, field) != Map.get(new_data, field) end)
%{updated_object: updated_object} =
new_data
|> Object.Updater.maybe_update_history(old_data,
updated: changed?,
use_history_in_new_object?: false
)
updated_object
else
new_data
end
Map.merge(new_data, internal_fields)
end

View file

@ -0,0 +1,240 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Object.Updater do
require Pleroma.Constants
def update_content_fields(orig_object_data, updated_object) do
Pleroma.Constants.status_updatable_fields()
|> Enum.reduce(
%{data: orig_object_data, updated: false},
fn field, %{data: data, updated: updated} ->
updated =
updated or
(field != "updated" and
Map.get(updated_object, field) != Map.get(orig_object_data, field))
data =
if Map.has_key?(updated_object, field) do
Map.put(data, field, updated_object[field])
else
Map.drop(data, [field])
end
%{data: data, updated: updated}
end
)
end
def maybe_history(object) do
with history <- Map.get(object, "formerRepresentations"),
true <- is_map(history),
"OrderedCollection" <- Map.get(history, "type"),
true <- is_list(Map.get(history, "orderedItems")),
true <- is_integer(Map.get(history, "totalItems")) do
history
else
_ -> nil
end
end
def history_for(object) do
with history when not is_nil(history) <- maybe_history(object) do
history
else
_ -> history_skeleton()
end
end
defp history_skeleton do
%{
"type" => "OrderedCollection",
"totalItems" => 0,
"orderedItems" => []
}
end
def maybe_update_history(
updated_object,
orig_object_data,
opts
) do
updated = opts[:updated]
use_history_in_new_object? = opts[:use_history_in_new_object?]
if not updated do
%{updated_object: updated_object, used_history_in_new_object?: false}
else
# Put edit history
# Note that we may have got the edit history by first fetching the object
{new_history, used_history_in_new_object?} =
with true <- use_history_in_new_object?,
updated_history when not is_nil(updated_history) <- maybe_history(opts[:new_data]) do
{updated_history, true}
else
_ ->
history = history_for(orig_object_data)
latest_history_item =
orig_object_data
|> Map.drop(["id", "formerRepresentations"])
updated_history =
history
|> Map.put("orderedItems", [latest_history_item | history["orderedItems"]])
|> Map.put("totalItems", history["totalItems"] + 1)
{updated_history, false}
end
updated_object =
updated_object
|> Map.put("formerRepresentations", new_history)
%{updated_object: updated_object, used_history_in_new_object?: used_history_in_new_object?}
end
end
defp maybe_update_poll(to_be_updated, updated_object) do
choice_key = fn data ->
if Map.has_key?(data, "anyOf"), do: "anyOf", else: "oneOf"
end
with true <- to_be_updated["type"] == "Question",
key <- choice_key.(updated_object),
true <- key == choice_key.(to_be_updated),
orig_choices <- to_be_updated[key] |> Enum.map(&Map.drop(&1, ["replies"])),
new_choices <- updated_object[key] |> Enum.map(&Map.drop(&1, ["replies"])),
true <- orig_choices == new_choices do
# Choices are the same, but counts are different
to_be_updated
|> Map.put(key, updated_object[key])
else
# Choices (or vote type) have changed, do not allow this
_ -> to_be_updated
end
end
# This calculates the data to be sent as the object of an Update.
# new_data's formerRepresentations is not considered.
# formerRepresentations is added to the returned data.
def make_update_object_data(original_data, new_data, date) do
%{data: updated_data, updated: updated} =
original_data
|> update_content_fields(new_data)
if not updated do
updated_data
else
%{updated_object: updated_data} =
updated_data
|> maybe_update_history(original_data, updated: updated, use_history_in_new_object?: false)
updated_data
|> Map.put("updated", date)
end
end
# This calculates the data of the new Object from an Update.
# new_data's formerRepresentations is considered.
def make_new_object_data_from_update_object(original_data, new_data) do
update_is_reasonable =
with {_, updated} when not is_nil(updated) <- {:cur_updated, new_data["updated"]},
{_, {:ok, updated_time, _}} <- {:cur_updated, DateTime.from_iso8601(updated)},
{_, last_updated} when not is_nil(last_updated) <-
{:last_updated, original_data["updated"] || original_data["published"]},
{_, {:ok, last_updated_time, _}} <-
{:last_updated, DateTime.from_iso8601(last_updated)},
:gt <- DateTime.compare(updated_time, last_updated_time) do
:update_everything
else
# only allow poll updates
{:cur_updated, _} -> :no_content_update
:eq -> :no_content_update
# allow all updates
{:last_updated, _} -> :update_everything
# allow no updates
_ -> false
end
%{
updated_object: updated_data,
used_history_in_new_object?: used_history_in_new_object?,
updated: updated
} =
if update_is_reasonable == :update_everything do
%{data: updated_data, updated: updated} =
original_data
|> update_content_fields(new_data)
updated_data
|> maybe_update_history(original_data,
updated: updated,
use_history_in_new_object?: true,
new_data: new_data
)
|> Map.put(:updated, updated)
else
%{
updated_object: original_data,
used_history_in_new_object?: false,
updated: false
}
end
updated_data =
if update_is_reasonable != false do
updated_data
|> maybe_update_poll(new_data)
else
updated_data
end
%{
updated_data: updated_data,
updated: updated,
used_history_in_new_object?: used_history_in_new_object?
}
end
def for_each_history_item(%{"orderedItems" => items} = history, _object, fun) do
new_items =
Enum.map(items, fun)
|> Enum.reduce_while(
{:ok, []},
fn
{:ok, item}, {:ok, acc} -> {:cont, {:ok, acc ++ [item]}}
e, _acc -> {:halt, e}
end
)
case new_items do
{:ok, items} -> {:ok, Map.put(history, "orderedItems", items)}
e -> e
end
end
def for_each_history_item(history, _, _) do
{:ok, history}
end
def do_with_history(object, fun) do
with history <- object["formerRepresentations"],
object <- Map.drop(object, ["formerRepresentations"]),
{_, {:ok, object}} <- {:main_body, fun.(object)},
{_, {:ok, history}} <- {:history_items, for_each_history_item(history, object, fun)} do
object =
if history do
Map.put(object, "formerRepresentations", history)
else
object
end
{:ok, object}
else
{:main_body, e} -> e
{:history_items, e} -> e
end
end
end

View file

@ -25,7 +25,7 @@ defp mix_task(task, args) do
module = Module.split(module)
match?(["Mix", "Tasks", "Pleroma" | _], module) and
String.downcase(List.last(module)) == task
task_match?(module, task)
end)
if module do
@ -35,6 +35,13 @@ defp mix_task(task, args) do
end
end
defp task_match?(["Mix", "Tasks", "Pleroma" | module_path], task) do
module_path
|> Enum.join(".")
|> String.downcase()
|> String.equivalent?(String.downcase(task))
end
def migrate(args) do
Mix.Tasks.Pleroma.Ecto.Migrate.run(args)
end

View file

@ -23,7 +23,7 @@ def es_query(:activity, query, offset, limit) do
timeout: "5s",
sort: [
"_score",
%{_timestamp: %{order: "desc", format: "basic_date_time"}}
%{"_timestamp" => %{order: "desc", format: "basic_date_time"}}
],
query: %{
bool: %{
@ -62,8 +62,12 @@ def search(user, query, options) do
Task.async(fn ->
q = es_query(:activity, parsed_query, offset, limit)
Pleroma.Search.Elasticsearch.Store.search(:activities, q)
|> Enum.filter(fn x -> Visibility.visible_for_user?(x, user) end)
:activities
|> Pleroma.Search.Elasticsearch.Store.search(q)
|> Enum.filter(fn x ->
x.data["type"] == "Create" && x.object.data["type"] == "Note" &&
Visibility.visible_for_user?(x, user)
end)
end)
activity_results = Task.await(activity_task)

View file

@ -42,7 +42,6 @@ def search(:activities, q) do
results
|> Enum.map(fn result -> result["_id"] end)
|> Pleroma.Activity.all_by_ids_with_object()
|> Enum.sort(&(&1.inserted_at >= &2.inserted_at))
else
e ->
Logger.error(e)

View file

@ -36,6 +36,7 @@ defmodule Pleroma.Upload do
alias Ecto.UUID
alias Pleroma.Config
alias Pleroma.Maps
alias Pleroma.Web.ActivityPub.Utils
require Logger
@type source ::
@ -88,6 +89,7 @@ def store(upload, opts \\ []) do
{:ok, url_spec} <- Pleroma.Uploaders.Uploader.put_file(opts.uploader, upload) do
{:ok,
%{
"id" => Utils.generate_object_id(),
"type" => opts.activity_type,
"mediaType" => upload.content_type,
"url" => [

View file

@ -681,6 +681,7 @@ def register_changeset_ldap(struct, params = %{password: password})
|> validate_exclusion(:nickname, Config.get([User, :restricted_nicknames]))
|> validate_format(:nickname, local_nickname_regex())
|> put_ap_id()
|> put_keys()
|> unique_constraint(:ap_id)
|> put_following_and_follower_and_featured_address()
end
@ -740,6 +741,7 @@ def register_changeset(struct, params \\ %{}, opts \\ []) do
|> validate_length(:registration_reason, max: reason_limit)
|> maybe_validate_required_email(opts[:external])
|> put_password_hash
|> put_keys()
|> put_ap_id()
|> unique_constraint(:ap_id)
|> put_following_and_follower_and_featured_address()
@ -755,6 +757,11 @@ def maybe_validate_required_email(changeset, _) do
end
end
def put_keys(changeset) do
{:ok, pem} = Keys.generate_rsa_pem()
put_change(changeset, :keys, pem)
end
def put_ap_id(changeset) do
ap_id = ap_id(%User{nickname: get_field(changeset, :nickname)})
put_change(changeset, :ap_id, ap_id)

View file

@ -194,7 +194,16 @@ defp insert_activity_with_expiration(data, local, recipients) do
def notify_and_stream(activity) do
Notification.create_notifications(activity)
conversation = create_or_bump_conversation(activity, activity.actor)
original_activity =
case activity do
%{data: %{"type" => "Update"}, object: %{data: %{"id" => id}}} ->
Activity.get_create_by_object_ap_id_with_object(id)
_ ->
activity
end
conversation = create_or_bump_conversation(original_activity, original_activity.actor)
participations = get_participations(conversation)
stream_out(activity)
stream_out_participations(participations)
@ -260,7 +269,7 @@ def stream_out_participations(_, _), do: :noop
@impl true
def stream_out(%Activity{data: %{"type" => data_type}} = activity)
when data_type in ["Create", "Announce", "Delete"] do
when data_type in ["Create", "Announce", "Delete", "Update"] do
activity
|> Topics.get_activity_topics()
|> Streamer.stream(activity)
@ -331,9 +340,9 @@ defp do_unfollow(follower, followed, activity_id, local)
defp do_unfollow(follower, followed, activity_id, local) when local == true do
with %Activity{} = follow_activity <- fetch_latest_follow(follower, followed),
{:ok, follow_activity} <- update_follow_state(follow_activity, "cancelled"),
unfollow_data <- make_unfollow_data(follower, followed, follow_activity, activity_id),
{:ok, activity} <- insert(unfollow_data, local),
{:ok, _activity} <- Repo.delete(follow_activity),
_ <- notify_and_stream(activity),
:ok <- maybe_federate(activity) do
{:ok, activity}
@ -349,7 +358,7 @@ defp do_unfollow(follower, followed, activity_id, false) do
with %Activity{} = follow_activity <- fetch_latest_follow(follower, followed),
{:ok, _activity} <- Repo.delete(follow_activity),
unfollow_data <- make_unfollow_data(follower, followed, follow_activity, activity_id),
unfollow_activity <- remote_unfollow_data(unfollow_data),
unfollow_activity <- make_unfollow_activity(unfollow_data, false),
_ <- notify_and_stream(unfollow_activity) do
{:ok, unfollow_activity}
else
@ -358,12 +367,12 @@ defp do_unfollow(follower, followed, activity_id, false) do
end
end
defp remote_unfollow_data(data) do
defp make_unfollow_activity(data, local) do
{recipients, _, _} = get_recipients(data)
%Activity{
data: data,
local: false,
local: local,
actor: data["actor"],
recipients: recipients
}

View file

@ -55,27 +55,21 @@ def follow(follower, followed) do
{:ok, data, []}
end
@spec emoji_react(User.t(), Object.t(), String.t()) :: {:ok, map(), keyword()}
def emoji_react(actor, object, emoji) do
with {:ok, data, meta} <- object_action(actor, object) do
data =
if Emoji.is_unicode_emoji?(emoji) do
defp unicode_emoji_react(_object, data, emoji) do
data
|> Map.put("content", emoji)
|> Map.put("type", "EmojiReact")
else
with %{} = emojo <- Emoji.get(emoji) do
path = emojo |> Map.get(:file)
url = "#{Endpoint.url()}#{path}"
end
defp add_emoji_content(data, emoji, url) do
data
|> Map.put("content", emoji)
|> Map.put("content", Emoji.maybe_quote(emoji))
|> Map.put("type", "EmojiReact")
|> Map.put("tag", [
%{}
|> Map.put("id", url)
|> Map.put("type", "Emoji")
|> Map.put("name", emojo.code)
|> Map.put("name", Emoji.maybe_quote(emoji))
|> Map.put(
"icon",
%{}
@ -83,11 +77,64 @@ def emoji_react(actor, object, emoji) do
|> Map.put("url", url)
)
])
end
defp remote_custom_emoji_react(
%{data: %{"reactions" => existing_reactions}},
data,
emoji
) do
[emoji_code, instance] = String.split(Emoji.stripped_name(emoji), "@")
matching_reaction =
Enum.find(
existing_reactions,
fn [name, _, url] ->
url = URI.parse(url)
url.host == instance && name == emoji_code
end
)
if matching_reaction do
[name, _, url] = matching_reaction
add_emoji_content(data, name, url)
else
{:error, "Could not react"}
end
end
defp remote_custom_emoji_react(_object, _data, _emoji) do
{:error, "Could not react"}
end
defp local_custom_emoji_react(data, emoji) do
with %{} = emojo <- Emoji.get(emoji) do
path = emojo |> Map.get(:file)
url = "#{Endpoint.url()}#{path}"
add_emoji_content(data, emojo.code, url)
else
_ -> {:error, "Emoji does not exist"}
end
end
defp custom_emoji_react(object, data, emoji) do
if String.contains?(emoji, "@") do
remote_custom_emoji_react(object, data, emoji)
else
local_custom_emoji_react(data, emoji)
end
end
@spec emoji_react(User.t(), Object.t(), String.t()) :: {:ok, map(), keyword()}
def emoji_react(actor, object, emoji) do
with {:ok, data, meta} <- object_action(actor, object) do
data =
if Emoji.is_unicode_emoji?(emoji) do
unicode_emoji_react(object, data, emoji)
else
custom_emoji_react(object, data, emoji)
end
{:ok, data, meta}
end
end
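
A rough, illustrative sketch of the reaction data the remote-emoji branch above is expected to build when `:blobcat@other.example:` matches an existing reaction on the object; all ids, hostnames and the emoji name are placeholders, not taken from this changeset:

```elixir
# Approximate shape of `data` after add_emoji_content/3 for a matched remote
# custom emoji; values are invented for illustration.
%{
  "type" => "EmojiReact",
  "content" => ":blobcat:",
  "tag" => [
    %{
      "type" => "Emoji",
      "id" => "https://other.example/emoji/blobcat.png",
      "name" => ":blobcat:",
      "icon" => %{"type" => "Image", "url" => "https://other.example/emoji/blobcat.png"}
    }
  ]
}
```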
@ -231,10 +278,16 @@ def like(actor, object) do
end
end
# Restricted to user updates for now, always public
@spec update(User.t(), Object.t()) :: {:ok, map(), keyword()}
def update(actor, object) do
to = [Pleroma.Constants.as_public(), actor.follower_address]
{to, cc} =
if object["type"] in Pleroma.Constants.actor_types() do
# User updates, always public
{[Pleroma.Constants.as_public(), actor.follower_address], []}
else
# Status updates, follow the recipients in the object
{object["to"] || [], object["cc"] || []}
end
{:ok,
%{
@ -242,7 +295,8 @@ def update(actor, object) do
"type" => "Update",
"actor" => actor.ap_id,
"object" => object,
"to" => to
"to" => to,
"cc" => cc
}, []}
end

View file

@ -41,6 +41,16 @@ defmodule Pleroma.Web.ActivityPub.MRF do
suggestions: [
"exclusion.com"
]
},
%{
key: :transparency_obfuscate_domains,
label: "MRF domain obfuscation",
type: {:list, :string},
description:
"Obfuscate domains in MRF transparency. This is useful if the domain you're blocking contains words you don't want displayed, but still want to disclose the MRF settings.",
suggestions: [
"badword.com"
]
}
]
}
@ -53,10 +63,53 @@ defmodule Pleroma.Web.ActivityPub.MRF do
@required_description_keys [:key, :related_policy]
def filter_one(policy, message) do
should_plug_history? =
if function_exported?(policy, :history_awareness, 0) do
policy.history_awareness()
else
:manual
end
|> Kernel.==(:auto)
if not should_plug_history? do
policy.filter(message)
else
main_result = policy.filter(message)
with {_, {:ok, main_message}} <- {:main, main_result},
{_,
%{
"formerRepresentations" => %{
"orderedItems" => [_ | _]
}
}} = {_, object} <- {:object, message["object"]},
{_, {:ok, new_history}} <-
{:history,
Pleroma.Object.Updater.for_each_history_item(
object["formerRepresentations"],
object,
fn item ->
with {:ok, filtered} <- policy.filter(Map.put(message, "object", item)) do
{:ok, filtered["object"]}
else
e -> e
end
end
)} do
{:ok, put_in(main_message, ["object", "formerRepresentations"], new_history)}
else
{:main, _} -> main_result
{:object, _} -> main_result
{:history, e} -> e
end
end
end
def filter(policies, %{} = message) do
policies
|> Enum.reduce({:ok, message}, fn
policy, {:ok, message} -> policy.filter(message)
policy, {:ok, message} -> filter_one(policy, message)
_, error -> error
end)
end
@ -85,7 +138,11 @@ def pipeline_filter(%{} = message, meta) do
def get_policies do
Pleroma.Config.get([:mrf, :policies], [])
|> get_policies()
|> Enum.concat([Pleroma.Web.ActivityPub.MRF.HashtagPolicy])
|> Enum.concat([
Pleroma.Web.ActivityPub.MRF.HashtagPolicy,
Pleroma.Web.ActivityPub.MRF.InlineQuotePolicy
])
|> Enum.uniq()
end
defp get_policies(policy) when is_atom(policy), do: [policy]
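
A hypothetical third-party policy opting into the new callback could look like the sketch below (module name and replacement rule are invented); with `history_awareness/0` returning `:auto`, `filter_one/2` also runs `filter/1` over every `formerRepresentations` item, so edit history gets filtered the same way as the live object:

```elixir
# Hypothetical policy, for illustration only.
defmodule MyInstance.MRF.RedactExamplePolicy do
  @behaviour Pleroma.Web.ActivityPub.MRF.Policy

  @impl true
  def history_awareness, do: :auto

  @impl true
  def filter(%{"object" => %{"content" => content}} = message) when is_binary(content) do
    {:ok, put_in(message, ["object", "content"], String.replace(content, "badword", "*******"))}
  end

  def filter(message), do: {:ok, message}

  @impl true
  def describe, do: {:ok, %{}}
end
```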

View file

@ -9,6 +9,9 @@ defmodule Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy do
require Logger
@impl true
def history_awareness, do: :auto
# has the user successfully posted before?
defp old_user?(%User{} = u) do
u.note_count > 0 || u.follower_count > 0

View file

@ -10,6 +10,8 @@ defmodule Pleroma.Web.ActivityPub.MRF.EnsureRePrepended do
@reply_prefix Regex.compile!("^re:[[:space:]]*", [:caseless])
def history_awareness, do: :auto
def filter_by_summary(
%{data: %{"summary" => parent_summary}} = _in_reply_to,
%{"summary" => child_summary} = child
@ -27,8 +29,8 @@ def filter_by_summary(
def filter_by_summary(_in_reply_to, child), do: child
def filter(%{"type" => "Create", "object" => child_object} = object)
when is_map(child_object) do
def filter(%{"type" => type, "object" => child_object} = object)
when type in ["Create", "Update"] and is_map(child_object) do
child =
child_object["inReplyTo"]
|> Object.normalize(fetch: false)

View file

@ -16,6 +16,9 @@ defmodule Pleroma.Web.ActivityPub.MRF.HashtagPolicy do
@behaviour Pleroma.Web.ActivityPub.MRF.Policy
@impl true
def history_awareness, do: :manual
defp check_reject(message, hashtags) do
if Enum.any?(Config.get([:mrf_hashtag, :reject]), fn match -> match in hashtags end) do
{:reject, "[HashtagPolicy] Matches with rejected keyword"}
@ -47,22 +50,46 @@ defp check_ftl_removal(%{"to" => to} = message, hashtags) do
defp check_ftl_removal(message, _hashtags), do: {:ok, message}
defp check_sensitive(message, hashtags) do
defp check_sensitive(message) do
{:ok, new_object} =
Object.Updater.do_with_history(message["object"], fn object ->
hashtags = Object.hashtags(%Object{data: object})
if Enum.any?(Config.get([:mrf_hashtag, :sensitive]), fn match -> match in hashtags end) do
{:ok, Kernel.put_in(message, ["object", "sensitive"], true)}
{:ok, Map.put(object, "sensitive", true)}
else
{:ok, message}
{:ok, object}
end
end)
{:ok, Map.put(message, "object", new_object)}
end
@impl true
def filter(%{"type" => "Create", "object" => object} = message) do
hashtags = Object.hashtags(%Object{data: object})
def filter(%{"type" => type, "object" => object} = message) when type in ["Create", "Update"] do
history_items =
with %{"formerRepresentations" => %{"orderedItems" => items}} <- object do
items
else
_ -> []
end
historical_hashtags =
Enum.reduce(history_items, [], fn item, acc ->
acc ++ Object.hashtags(%Object{data: item})
end)
hashtags = Object.hashtags(%Object{data: object}) ++ historical_hashtags
if hashtags != [] do
with {:ok, message} <- check_reject(message, hashtags),
{:ok, message} <- check_ftl_removal(message, hashtags),
{:ok, message} <- check_sensitive(message, hashtags) do
{:ok, message} <-
(if "type" == "Create" do
check_ftl_removal(message, hashtags)
else
{:ok, message}
end),
{:ok, message} <- check_sensitive(message) do
{:ok, message}
end
else

View file

@ -27,6 +27,8 @@ defp object_payload(%{} = object) do
end
defp check_reject(%{"object" => %{} = object} = message) do
with {:ok, _new_object} <-
Pleroma.Object.Updater.do_with_history(object, fn object ->
payload = object_payload(object)
if Enum.any?(Pleroma.Config.get([:mrf_keyword, :reject]), fn pattern ->
@ -36,15 +38,35 @@ defp check_reject(%{"object" => %{} = object} = message) do
else
{:ok, message}
end
end) do
{:ok, message}
else
e -> e
end
end
defp check_ftl_removal(%{"to" => to, "object" => %{} = object} = message) do
defp check_ftl_removal(%{"type" => "Create", "to" => to, "object" => %{} = object} = message) do
check_keyword = fn object ->
payload = object_payload(object)
if Pleroma.Constants.as_public() in to and
Enum.any?(Pleroma.Config.get([:mrf_keyword, :federated_timeline_removal]), fn pattern ->
if Enum.any?(Pleroma.Config.get([:mrf_keyword, :federated_timeline_removal]), fn pattern ->
string_matches?(payload, pattern)
end) do
{:should_delist, nil}
else
{:ok, %{}}
end
end
should_delist? = fn object ->
with {:ok, _} <- Pleroma.Object.Updater.do_with_history(object, check_keyword) do
false
else
_ -> true
end
end
if Pleroma.Constants.as_public() in to and should_delist?.(object) do
to = List.delete(to, Pleroma.Constants.as_public())
cc = [Pleroma.Constants.as_public() | message["cc"] || []]
@ -59,8 +81,12 @@ defp check_ftl_removal(%{"to" => to, "object" => %{} = object} = message) do
end
end
defp check_ftl_removal(message) do
{:ok, message}
end
defp check_replace(%{"object" => %{} = object} = message) do
object =
replace_kw = fn object ->
["content", "name", "summary"]
|> Enum.filter(fn field -> Map.has_key?(object, field) && object[field] end)
|> Enum.reduce(object, fn field, object ->
@ -73,6 +99,10 @@ defp check_replace(%{"object" => %{} = object} = message) do
Map.put(object, field, data)
end)
|> (fn object -> {:ok, object} end).()
end
{:ok, object} = Pleroma.Object.Updater.do_with_history(object, replace_kw)
message = Map.put(message, "object", object)
@ -80,7 +110,8 @@ defp check_replace(%{"object" => %{} = object} = message) do
end
@impl true
def filter(%{"type" => "Create", "object" => %{"content" => _content}} = message) do
def filter(%{"type" => type, "object" => %{"content" => _content}} = message)
when type in ["Create", "Update"] do
with {:ok, message} <- check_reject(message),
{:ok, message} <- check_ftl_removal(message),
{:ok, message} <- check_replace(message) do

View file

@ -15,6 +15,9 @@ defmodule Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy do
recv_timeout: 10_000
]
@impl true
def history_awareness, do: :auto
defp prefetch(url) do
# Fetching only proxiable resources
if MediaProxy.enabled?() and MediaProxy.url_proxiable?(url) do
@ -53,10 +56,8 @@ defp preload(%{"object" => %{"attachment" => attachments}} = _message) do
end
@impl true
def filter(
%{"type" => "Create", "object" => %{"attachment" => attachments} = _object} = message
)
when is_list(attachments) and length(attachments) > 0 do
def filter(%{"type" => type, "object" => %{"attachment" => attachments} = _object} = message)
when type in ["Create", "Update"] and is_list(attachments) and length(attachments) > 0 do
preload(message)
{:ok, message}

View file

@ -11,6 +11,7 @@ defmodule Pleroma.Web.ActivityPub.MRF.NoEmptyPolicy do
@impl true
def filter(%{"actor" => actor} = object) do
with true <- is_local?(actor),
true <- is_eligible_type?(object),
true <- is_note?(object),
false <- has_attachment?(object),
true <- only_mentions?(object) do
@ -32,7 +33,6 @@ defp is_local?(actor) do
end
defp has_attachment?(%{
"type" => "Create",
"object" => %{"type" => "Note", "attachment" => attachments}
})
when length(attachments) > 0,
@ -40,23 +40,13 @@ defp has_attachment?(%{
defp has_attachment?(_), do: false
defp only_mentions?(%{"type" => "Create", "object" => %{"type" => "Note", "source" => source}})
when is_binary(source) do
non_mentions =
source |> String.split() |> Enum.filter(&(not String.starts_with?(&1, "@"))) |> length
if non_mentions > 0 do
false
else
true
end
defp only_mentions?(%{"object" => %{"type" => "Note", "source" => source}}) do
source =
case source do
%{"content" => text} -> text
_ -> source
end
defp only_mentions?(%{
"type" => "Create",
"object" => %{"type" => "Note", "source" => %{"content" => source}}
})
when is_binary(source) do
non_mentions =
source |> String.split() |> Enum.filter(&(not String.starts_with?(&1, "@"))) |> length
@ -69,9 +59,12 @@ defp only_mentions?(%{
defp only_mentions?(_), do: false
defp is_note?(%{"type" => "Create", "object" => %{"type" => "Note"}}), do: true
defp is_note?(%{"object" => %{"type" => "Note"}}), do: true
defp is_note?(_), do: false
defp is_eligible_type?(%{"type" => type}) when type in ["Create", "Update"], do: true
defp is_eligible_type?(_), do: false
@impl true
def describe, do: {:ok, %{}}
end

View file

@ -6,14 +6,17 @@ defmodule Pleroma.Web.ActivityPub.MRF.NoPlaceholderTextPolicy do
@moduledoc "Ensure no content placeholder is present (such as the dot from mastodon)"
@behaviour Pleroma.Web.ActivityPub.MRF.Policy
@impl true
def history_awareness, do: :auto
@impl true
def filter(
%{
"type" => "Create",
"type" => type,
"object" => %{"content" => content, "attachment" => _} = _child_object
} = object
)
when content in [".", "<p>.</p>"] do
when type in ["Create", "Update"] and content in [".", "<p>.</p>"] do
{:ok, put_in(object, ["object", "content"], "")}
end

View file

@ -9,7 +9,11 @@ defmodule Pleroma.Web.ActivityPub.MRF.NormalizeMarkup do
@behaviour Pleroma.Web.ActivityPub.MRF.Policy
@impl true
def filter(%{"type" => "Create", "object" => child_object} = object) do
def history_awareness, do: :auto
@impl true
def filter(%{"type" => type, "object" => child_object} = object)
when type in ["Create", "Update"] do
scrub_policy = Pleroma.Config.get([:mrf_normalize_markup, :scrub_policy])
content =

View file

@ -12,5 +12,6 @@ defmodule Pleroma.Web.ActivityPub.MRF.Policy do
label: String.t(),
description: String.t()
}
@optional_callbacks config_description: 0
@callback history_awareness() :: :auto | :manual
@optional_callbacks config_description: 0, history_awareness: 0
end

View file

@ -256,10 +256,35 @@ def filter(object) when is_binary(object) do
def filter(object), do: {:ok, object}
defp obfuscate(string) when is_binary(string) do
string
|> to_charlist()
|> Enum.with_index()
|> Enum.map(fn
{?., _index} ->
?.
{char, index} ->
if 3 <= index && index < String.length(string) - 3, do: ?*, else: char
end)
|> to_string()
end
defp maybe_obfuscate(host, obfuscations) do
if MRF.subdomain_match?(obfuscations, host) do
obfuscate(host)
else
host
end
end
@impl true
def describe do
exclusions = Config.get([:mrf, :transparency_exclusions]) |> MRF.instance_list_from_tuples()
obfuscations =
Config.get([:mrf, :transparency_obfuscate_domains], []) |> MRF.subdomains_regex()
mrf_simple_excluded =
Config.get(:mrf_simple)
|> Enum.map(fn {rule, instances} ->
@ -269,7 +294,7 @@ def describe do
mrf_simple =
mrf_simple_excluded
|> Enum.map(fn {rule, instances} ->
{rule, Enum.map(instances, fn {host, _} -> host end)}
{rule, Enum.map(instances, fn {host, _} -> maybe_obfuscate(host, obfuscations) end)}
end)
|> Map.new()
@ -286,7 +311,9 @@ def describe do
|> Enum.map(fn {rule, instances} ->
instances =
instances
|> Enum.map(fn {host, reason} -> {host, %{"reason" => reason}} end)
|> Enum.map(fn {host, reason} ->
{maybe_obfuscate(host, obfuscations), %{"reason" => reason}}
end)
|> Map.new()
{rule, instances}
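
As a sanity check of the masking rule, a standalone sketch mirroring the private `obfuscate/1` above:

```elixir
# Mirrors the obfuscate/1 logic: keep the first and last three characters and
# any dots, mask everything in between.
defmodule ObfuscationSketch do
  def obfuscate(string) when is_binary(string) do
    string
    |> to_charlist()
    |> Enum.with_index()
    |> Enum.map(fn
      {?., _index} ->
        ?.

      {char, index} ->
        if 3 <= index && index < String.length(string) - 3, do: ?*, else: char
    end)
    |> to_string()
  end
end

ObfuscationSketch.obfuscate("badword.com")
#=> "bad****.com"
```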

View file

@ -86,8 +86,8 @@ def validate(
meta
)
when objtype in ~w[Question Answer Audio Video Event Article Note Page] do
with {:ok, object_data} <- cast_and_apply(object),
meta = Keyword.put(meta, :object_data, object_data |> stringify_keys),
with {:ok, object_data} <- cast_and_apply_and_stringify_with_history(object),
meta = Keyword.put(meta, :object_data, object_data),
{:ok, create_activity} <-
create_activity
|> CreateGenericValidator.cast_and_validate(meta)
@ -110,6 +110,8 @@ def validate(%{"type" => type} = object, meta)
"Page" -> ArticleNotePageValidator
end
with {:ok, object} <-
do_separate_with_history(object, fn object ->
with {:ok, object} <-
object
|> validator.cast_and_validate()
@ -120,10 +122,42 @@ def validate(%{"type" => type} = object, meta)
tag = (object["tag"] || []) ++ Object.hashtags(%Object{data: object})
object = Map.put(object, "tag", tag)
{:ok, object}
end
end) do
{:ok, object, meta}
end
end
def validate(
%{"type" => "Update", "object" => %{"type" => objtype} = object} = update_activity,
meta
)
when objtype in ~w[Question Answer Audio Video Event Article Note Page] do
with {_, false} <- {:local, Access.get(meta, :local, false)},
{_, {:ok, object_data, _}} <- {:object_validation, validate(object, meta)},
meta = Keyword.put(meta, :object_data, object_data),
{:ok, update_activity} <-
update_activity
|> UpdateValidator.cast_and_validate()
|> Ecto.Changeset.apply_action(:insert) do
update_activity = stringify_keys(update_activity)
{:ok, update_activity, meta}
else
{:local, _} ->
with {:ok, object} <-
update_activity
|> UpdateValidator.cast_and_validate()
|> Ecto.Changeset.apply_action(:insert) do
object = stringify_keys(object)
{:ok, object, meta}
end
{:object_validation, e} ->
e
end
end
def validate(%{"type" => type} = object, meta)
when type in ~w[Accept Reject Follow Update Like EmojiReact Announce
Answer] do
@ -160,6 +194,15 @@ def validate(%{"type" => type} = object, meta) when type in ~w(Add Remove) do
def validate(o, m), do: {:error, {:validator_not_set, {o, m}}}
def cast_and_apply_and_stringify_with_history(object) do
do_separate_with_history(object, fn object ->
with {:ok, object_data} <- cast_and_apply(object),
object_data <- object_data |> stringify_keys() do
{:ok, object_data}
end
end)
end
def cast_and_apply(%{"type" => "Question"} = object) do
QuestionValidator.cast_and_apply(object)
end
@ -214,4 +257,54 @@ def fetch_actor_and_object(object) do
Object.normalize(object["object"], fetch: true)
:ok
end
defp for_each_history_item(
%{"type" => "OrderedCollection", "orderedItems" => items} = history,
object,
fun
) do
processed_items =
Enum.map(items, fn item ->
with item <- Map.put(item, "id", object["id"]),
{:ok, item} <- fun.(item) do
item
else
_ -> nil
end
end)
if Enum.all?(processed_items, &(not is_nil(&1))) do
{:ok, Map.put(history, "orderedItems", processed_items)}
else
{:error, :invalid_history}
end
end
defp for_each_history_item(nil, _object, _fun) do
{:ok, nil}
end
defp for_each_history_item(_, _object, _fun) do
{:error, :invalid_history}
end
# fun is (object -> {:ok, validated_object_with_string_keys})
defp do_separate_with_history(object, fun) do
with history <- object["formerRepresentations"],
object <- Map.drop(object, ["formerRepresentations"]),
{_, {:ok, object}} <- {:main_body, fun.(object)},
{_, {:ok, history}} <- {:history_items, for_each_history_item(history, object, fun)} do
object =
if history do
Map.put(object, "formerRepresentations", history)
else
object
end
{:ok, object}
else
{:main_body, e} -> e
{:history_items, e} -> e
end
end
end
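
For orientation, the kind of edited object `do_separate_with_history/2` deals with (illustrative data only): `fun` is applied to the top-level object, then once per history item, and the validated history is re-attached under `formerRepresentations`:

```elixir
# Example input shape; id and content values are placeholders.
%{
  "id" => "https://remote.example/objects/1",
  "type" => "Note",
  "content" => "current revision",
  "formerRepresentations" => %{
    "type" => "OrderedCollection",
    "orderedItems" => [
      %{"type" => "Note", "content" => "previous revision"}
    ]
  }
}
```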

View file

@ -6,7 +6,6 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.ArticleNotePageValidator do
use Ecto.Schema
alias Pleroma.User
alias Pleroma.EctoType.ActivityPub.ObjectValidators
alias Pleroma.Object.Fetcher
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations
@ -54,23 +53,17 @@ defp fix_url(%{"url" => url} = data) when is_bitstring(url), do: data
defp fix_url(%{"url" => url} = data) when is_map(url), do: Map.put(data, "url", url["href"])
defp fix_url(data), do: data
defp fix_tag(%{"tag" => tag} = data) when is_list(tag), do: data
defp fix_tag(%{"tag" => tag} = data) when is_list(tag) do
Map.put(data, "tag", Enum.filter(tag, &is_map/1))
end
defp fix_tag(%{"tag" => tag} = data) when is_map(tag), do: Map.put(data, "tag", [tag])
defp fix_tag(data), do: Map.drop(data, ["tag"])
defp fix_replies(%{"replies" => %{"first" => %{"items" => replies}}} = data)
when is_list(replies),
do: Map.put(data, "replies", replies)
defp fix_replies(%{"replies" => %{"items" => replies}} = data) when is_list(replies),
do: Map.put(data, "replies", replies)
defp fix_replies(%{"replies" => replies} = data) when is_bitstring(replies),
do: Map.drop(data, ["replies"])
defp fix_replies(%{"replies" => replies} = data) when is_list(replies), do: data
defp fix_replies(%{"replies" => %{"first" => first}} = data) do
with {:ok, %{"orderedItems" => replies}} <-
Fetcher.fetch_and_contain_remote_object_from_id(first) do
with {:ok, replies} <- Akkoma.Collections.Fetcher.fetch_collection(first) do
Map.put(data, "replies", replies)
else
{:error, _} ->
@ -79,7 +72,10 @@ defp fix_replies(%{"replies" => %{"first" => first}} = data) do
end
end
defp fix_replies(data), do: data
defp fix_replies(%{"replies" => %{"items" => replies}} = data) when is_list(replies),
do: Map.put(data, "replies", replies)
defp fix_replies(data), do: Map.delete(data, "replies")
defp remote_mention_resolver(
%{"id" => ap_id, "tag" => tags},
@ -108,6 +104,8 @@ defp remote_mention_resolver(
end
# https://github.com/misskey-dev/misskey/pull/8787
# Misskey has an awful tendency to drop all custom formatting when it sends remotely
# So this basically reprocesses their MFM source
defp fix_misskey_content(
%{"source" => %{"mediaType" => "text/x.misskeymarkdown", "content" => content}} = object
)

View file

@ -11,6 +11,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator do
@primary_key false
embedded_schema do
field(:id, :string)
field(:type, :string)
field(:mediaType, :string, default: "application/octet-stream")
field(:name, :string)
@ -43,7 +44,7 @@ def changeset(struct, data) do
|> fix_url()
struct
|> cast(data, [:type, :mediaType, :name, :blurhash])
|> cast(data, [:id, :type, :mediaType, :name, :blurhash])
|> cast_embed(:url, with: &url_changeset/2, required: true)
|> validate_inclusion(:type, ~w[Link Document Audio Image Video])
|> validate_required([:type, :mediaType])

View file

@ -33,6 +33,7 @@ defmacro object_fields do
field(:content, :string)
field(:published, ObjectValidators.DateTime)
field(:updated, ObjectValidators.DateTime)
field(:emoji, ObjectValidators.Emoji, default: %{})
embeds_many(:attachment, AttachmentValidator)
end

View file

@ -7,8 +7,8 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes do
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
require Pleroma.Constants
def cast_and_filter_recipients(message, field, follower_collection, field_fallback \\ []) do
{:ok, data} = ObjectValidators.Recipients.cast(message[field] || field_fallback)
@ -32,7 +32,7 @@ def fix_object_defaults(data) do
|> cast_and_filter_recipients("cc", follower_collection)
|> cast_and_filter_recipients("bto", follower_collection)
|> cast_and_filter_recipients("bcc", follower_collection)
|> Transmogrifier.fix_implicit_addressing(follower_collection)
|> fix_implicit_addressing(follower_collection)
end
def fix_activity_addressing(activity) do
@ -43,7 +43,7 @@ def fix_activity_addressing(activity) do
|> cast_and_filter_recipients("cc", follower_collection)
|> cast_and_filter_recipients("bto", follower_collection)
|> cast_and_filter_recipients("bcc", follower_collection)
|> Transmogrifier.fix_implicit_addressing(follower_collection)
|> fix_implicit_addressing(follower_collection)
end
def fix_actor(data) do
@ -73,4 +73,27 @@ def fix_object_action_recipients(data, %Object{data: %{"actor" => actor}}) do
Map.put(data, "to", to)
end
# if as:Public is addressed, then make sure the followers collection is also addressed
# so that the activities will be delivered to local users.
def fix_implicit_addressing(%{"to" => to, "cc" => cc} = object, followers_collection) do
recipients = to ++ cc
if followers_collection not in recipients do
cond do
Pleroma.Constants.as_public() in cc ->
to = to ++ [followers_collection]
Map.put(object, "to", to)
Pleroma.Constants.as_public() in to ->
cc = cc ++ [followers_collection]
Map.put(object, "cc", cc)
true ->
object
end
else
object
end
end
end
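
A worked example of the relocated `fix_implicit_addressing/2` (addresses and collection URL are placeholders): when as:Public is only in `to`, the author's follower collection is appended to `cc`, so local followers still receive the activity:

```elixir
object = %{
  "to" => ["https://www.w3.org/ns/activitystreams#Public"],
  "cc" => []
}

Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes.fix_implicit_addressing(
  object,
  "https://my.instance/users/alice/followers"
)
#=> %{
#     "to" => ["https://www.w3.org/ns/activitystreams#Public"],
#     "cc" => ["https://my.instance/users/alice/followers"]
#   }
```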

View file

@ -13,7 +13,6 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.CreateGenericValidator do
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations
alias Pleroma.Web.ActivityPub.Transmogrifier
import Ecto.Changeset
@ -67,7 +66,7 @@ defp fix_addressing(data, object) do
|> CommonFixes.cast_and_filter_recipients("cc", follower_collection, object["cc"])
|> CommonFixes.cast_and_filter_recipients("bto", follower_collection, object["bto"])
|> CommonFixes.cast_and_filter_recipients("bcc", follower_collection, object["bcc"])
|> Transmogrifier.fix_implicit_addressing(follower_collection)
|> CommonFixes.fix_implicit_addressing(follower_collection)
end
def fix(data, meta) do

View file

@ -13,7 +13,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.EmojiReactValidator do
import Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations
@primary_key false
@emoji_regex ~r/:[A-Za-z0-9_-]+:/
@emoji_regex ~r/:[A-Za-z0-9_-]+(@.+)?:/
embedded_schema do
quote do
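
A quick, illustrative check of the widened regex: remote custom emoji of the form `:name@domain:` now validate alongside plain `:name:`:

```elixir
emoji_regex = ~r/:[A-Za-z0-9_-]+(@.+)?:/

Regex.match?(emoji_regex, ":blobcat:")               #=> true
Regex.match?(emoji_regex, ":blobcat@other.example:") #=> true
Regex.match?(emoji_regex, "plain text")              #=> false
```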

View file

@ -51,7 +51,9 @@ def validate_updating_rights(cng) do
with actor = get_field(cng, :actor),
object = get_field(cng, :object),
{:ok, object_id} <- ObjectValidators.ObjectID.cast(object),
true <- actor == object_id do
actor_uri <- URI.parse(actor),
object_uri <- URI.parse(object_id),
true <- actor_uri.host == object_uri.host do
cng
else
_e ->

View file

@ -23,6 +23,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
alias Pleroma.Web.Streamer
alias Pleroma.Workers.PollWorker
require Pleroma.Constants
require Logger
@logger Pleroma.Config.get([:side_effects, :logger], Logger)
@ -150,24 +151,27 @@ def handle(
# Tasks this handles:
# - Update the user
# - Update a non-user object (Note, Question, etc.)
#
# For a local user, we also get a changeset with the full information, so we
# can update non-federating, non-activitypub settings as well.
@impl true
def handle(%{data: %{"type" => "Update", "object" => updated_object}} = object, meta) do
if changeset = Keyword.get(meta, :user_update_changeset) do
changeset
|> User.update_and_set_cache()
updated_object_id = updated_object["id"]
with {_, true} <- {:has_id, is_binary(updated_object_id)},
%{"type" => type} <- updated_object,
{_, is_user} <- {:is_user, type in Pleroma.Constants.actor_types()} do
if is_user do
handle_update_user(object, meta)
else
{:ok, new_user_data} = ActivityPub.user_data_from_user_object(updated_object)
User.get_by_ap_id(updated_object["id"])
|> User.remote_user_changeset(new_user_data)
|> User.update_and_set_cache()
handle_update_object(object, meta)
end
else
_ ->
{:ok, object, meta}
end
end
# Tasks this handles:
# - Add like to object
@ -395,6 +399,79 @@ def handle(object, meta) do
{:ok, object, meta}
end
defp handle_update_user(
%{data: %{"type" => "Update", "object" => updated_object}} = object,
meta
) do
if changeset = Keyword.get(meta, :user_update_changeset) do
changeset
|> User.update_and_set_cache()
else
{:ok, new_user_data} = ActivityPub.user_data_from_user_object(updated_object)
User.get_by_ap_id(updated_object["id"])
|> User.remote_user_changeset(new_user_data)
|> User.update_and_set_cache()
end
{:ok, object, meta}
end
defp handle_update_object(
%{data: %{"type" => "Update", "object" => updated_object}} = object,
meta
) do
orig_object_ap_id = updated_object["id"]
orig_object = Object.get_by_ap_id(orig_object_ap_id)
orig_object_data = orig_object.data
updated_object =
if meta[:local] do
# If this is a local Update, we don't process it by transmogrifier,
# so we use the embedded object as-is.
updated_object
else
meta[:object_data]
end
if orig_object_data["type"] in Pleroma.Constants.updatable_object_types() do
%{
updated_data: updated_object_data,
updated: updated,
used_history_in_new_object?: used_history_in_new_object?
} = Object.Updater.make_new_object_data_from_update_object(orig_object_data, updated_object)
changeset =
orig_object
|> Repo.preload(:hashtags)
|> Object.change(%{data: updated_object_data})
with {:ok, new_object} <- Repo.update(changeset),
{:ok, _} <- Object.invalid_object_cache(new_object),
{:ok, _} <- Object.set_cache(new_object),
# The metadata/utils.ex uses the object id for the cache.
{:ok, _} <- Pleroma.Activity.HTML.invalidate_cache_for(new_object.id) do
if used_history_in_new_object? do
with create_activity when not is_nil(create_activity) <-
Pleroma.Activity.get_create_by_object_ap_id(orig_object_ap_id),
{:ok, _} <- Pleroma.Activity.HTML.invalidate_cache_for(create_activity.id) do
nil
else
_ -> nil
end
end
if updated do
object
|> Activity.normalize()
|> ActivityPub.notify_and_stream()
end
end
end
{:ok, object, meta}
end
def handle_object_creation(%{"type" => "Question"} = object, activity, meta) do
with {:ok, object, meta} <- Pipeline.common_pipeline(object, meta) do
PollWorker.schedule_poll_end(activity)

View file

@ -19,6 +19,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
alias Pleroma.Web.ActivityPub.Pipeline
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes
alias Pleroma.Web.Federator
alias Pleroma.Workers.TransmogrifierWorker
@ -95,29 +96,6 @@ def fix_explicit_addressing(%{"to" => to, "cc" => cc} = object, follower_collect
|> Map.put("cc", final_cc)
end
# if as:Public is addressed, then make sure the followers collection is also addressed
# so that the activities will be delivered to local users.
def fix_implicit_addressing(%{"to" => to, "cc" => cc} = object, followers_collection) do
recipients = to ++ cc
if followers_collection not in recipients do
cond do
Pleroma.Constants.as_public() in cc ->
to = to ++ [followers_collection]
Map.put(object, "to", to)
Pleroma.Constants.as_public() in to ->
cc = cc ++ [followers_collection]
Map.put(object, "cc", cc)
true ->
object
end
else
object
end
end
def fix_addressing(object) do
{:ok, %User{follower_address: follower_collection}} =
object
@ -130,7 +108,7 @@ def fix_addressing(object) do
|> fix_addressing_list("bto")
|> fix_addressing_list("bcc")
|> fix_explicit_addressing(follower_collection)
|> fix_implicit_addressing(follower_collection)
|> CommonFixes.fix_implicit_addressing(follower_collection)
end
def fix_actor(%{"attributedTo" => actor} = object) do
@ -721,6 +699,24 @@ def prepare_object(object) do
|> strip_internal_fields
|> strip_internal_tags
|> set_type
|> maybe_process_history
end
defp maybe_process_history(%{"formerRepresentations" => %{"orderedItems" => history}} = object) do
processed_history =
Enum.map(
history,
fn
item when is_map(item) -> prepare_object(item)
item -> item
end
)
put_in(object, ["formerRepresentations", "orderedItems"], processed_history)
end
defp maybe_process_history(object) do
object
end
# @doc
@ -745,6 +741,21 @@ def prepare_outgoing(%{"type" => activity_type, "object" => object_id} = data)
{:ok, data}
end
def prepare_outgoing(%{"type" => "Update", "object" => %{"type" => objtype} = object} = data)
when objtype in Pleroma.Constants.updatable_object_types() do
object =
object
|> prepare_object
data =
data
|> Map.put("object", object)
|> Map.merge(Utils.make_json_ld_header())
|> Map.delete("bcc")
{:ok, data}
end
def prepare_outgoing(%{"type" => "Announce", "actor" => ap_id, "object" => object_id} = data) do
object =
object_id

View file

@ -329,7 +329,7 @@ def add_emoji_reaction_to_object(
object
) do
reactions = get_cached_emoji_reactions(object)
emoji = stripped_emoji_name(emoji)
emoji = Pleroma.Emoji.stripped_name(emoji)
url = emoji_url(emoji, activity)
new_reactions =
@ -356,12 +356,6 @@ def add_emoji_reaction_to_object(
update_element_in_object("reaction", new_reactions, object, count)
end
defp stripped_emoji_name(name) do
name
|> String.replace_leading(":", "")
|> String.replace_trailing(":", "")
end
defp emoji_url(
name,
%Activity{
@ -384,7 +378,7 @@ def remove_emoji_reaction_from_object(
%Activity{data: %{"content" => emoji, "actor" => actor}} = activity,
object
) do
emoji = stripped_emoji_name(emoji)
emoji = Pleroma.Emoji.stripped_name(emoji)
reactions = get_cached_emoji_reactions(object)
url = emoji_url(emoji, activity)
@ -472,18 +466,6 @@ def update_follow_state_for_all(
{:ok, activity}
end
def update_follow_state(
%Activity{} = activity,
state
) do
new_data = Map.put(activity.data, "state", state)
changeset = Changeset.change(activity, data: new_data)
with {:ok, activity} <- Repo.update(changeset) do
{:ok, activity}
end
end
@doc """
Makes a follow activity data for the given follower and followed
"""
@ -525,19 +507,37 @@ def fetch_latest_undo(%User{ap_id: ap_id}) do
def get_latest_reaction(internal_activity_id, %{ap_id: ap_id}, emoji) do
%{data: %{"object" => object_ap_id}} = Activity.get_by_id(internal_activity_id)
emoji = Pleroma.Emoji.maybe_quote(emoji)
"EmojiReact"
|> Activity.Queries.by_type()
|> where(actor: ^ap_id)
|> where([activity], fragment("?->>'content' = ?", activity.data, ^emoji))
|> custom_emoji_discriminator(emoji)
|> Activity.Queries.by_object_id(object_ap_id)
|> order_by([activity], fragment("? desc nulls last", activity.id))
|> limit(1)
|> Repo.one()
end
defp custom_emoji_discriminator(query, emoji) do
if String.contains?(emoji, "@") do
stripped = Pleroma.Emoji.stripped_name(emoji)
[name, domain] = String.split(stripped, "@")
domain_pattern = "%" <> domain <> "%"
emoji_pattern = Pleroma.Emoji.maybe_quote(name)
query
|> where([activity], fragment("?->>'content' = ?
AND EXISTS (
SELECT FROM jsonb_array_elements(?->'tag') elem
WHERE elem->>'id' ILIKE ?
)", activity.data, ^emoji_pattern, activity.data, ^domain_pattern))
else
query
|> where([activity], fragment("?->>'content' = ?", activity.data, ^emoji))
end
end
#### Announce-related helpers
@doc """

View file

@ -0,0 +1,43 @@
defmodule Pleroma.Web.AkkomaAPI.TranslationController do
use Pleroma.Web, :controller
alias Pleroma.Web.Plugs.OAuthScopesPlug
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
@unauthenticated_access %{fallback: :proceed_unauthenticated, scopes: []}
plug(
OAuthScopesPlug,
%{@unauthenticated_access | scopes: ["read:statuses"]}
when action in [
:languages
]
)
plug(Pleroma.Web.ApiSpec.CastAndValidate)
defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.TranslationOperation
action_fallback(Pleroma.Web.MastodonAPI.FallbackController)
@doc "GET /api/v1/akkoma/translation/languages"
def languages(conn, _params) do
with {:ok, source_languages, dest_languages} <- get_languages() do
conn
|> json(%{source: source_languages, target: dest_languages})
else
e -> IO.inspect(e)
end
end
defp get_languages do
module = Pleroma.Config.get([:translator, :module])
@cachex.fetch!(:translations_cache, "languages:#{module}", fn _ ->
with {:ok, source_languages, dest_languages} <- module.languages() do
{:ok, source_languages, dest_languages}
else
{:error, err} -> {:ignore, {:error, err}}
end
end)
end
end

View file

@ -6,9 +6,13 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
alias OpenApiSpex.Operation
alias OpenApiSpex.Schema
alias Pleroma.Web.ApiSpec.AccountOperation
alias Pleroma.Web.ApiSpec.Schemas.Account
alias Pleroma.Web.ApiSpec.Schemas.ApiError
alias Pleroma.Web.ApiSpec.Schemas.Attachment
alias Pleroma.Web.ApiSpec.Schemas.BooleanLike
alias Pleroma.Web.ApiSpec.Schemas.Emoji
alias Pleroma.Web.ApiSpec.Schemas.FlakeID
alias Pleroma.Web.ApiSpec.Schemas.Poll
alias Pleroma.Web.ApiSpec.Schemas.ScheduledStatus
alias Pleroma.Web.ApiSpec.Schemas.Status
alias Pleroma.Web.ApiSpec.Schemas.VisibilityScope
@ -406,6 +410,75 @@ def bookmarks_operation do
}
end
def translate_operation do
%Operation{
tags: ["Retrieve status translation"],
summary: "Translate status",
description: "View the translation of a given status",
operationId: "StatusController.translation",
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [id_param(), language_param(), source_language_param()],
responses: %{
200 => Operation.response("Translation", "application/json", translation()),
400 => Operation.response("Error", "application/json", ApiError),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def show_history_operation do
%Operation{
tags: ["Retrieve status history"],
summary: "Status history",
description: "View history of a status",
operationId: "StatusController.show_history",
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [
id_param()
],
responses: %{
200 => status_history_response(),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def show_source_operation do
%Operation{
tags: ["Retrieve status source"],
summary: "Status source",
description: "View source of a status",
operationId: "StatusController.show_source",
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [
id_param()
],
responses: %{
200 => status_source_response(),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def update_operation do
%Operation{
tags: ["Update status"],
summary: "Update status",
description: "Change the content of a status",
operationId: "StatusController.update",
security: [%{"oAuth" => ["write:statuses"]}],
parameters: [
id_param()
],
requestBody: request_body("Parameters", update_request(), required: true),
responses: %{
200 => status_response(),
403 => Operation.response("Forbidden", "application/json", ApiError),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def array_of_statuses do
%Schema{type: :array, items: Status, example: [Status.schema().example]}
end
@ -514,6 +587,60 @@ defp create_request do
}
end
defp update_request do
%Schema{
title: "StatusUpdateRequest",
type: :object,
properties: %{
status: %Schema{
type: :string,
nullable: true,
description:
"Text content of the status. If `media_ids` is provided, this becomes optional. Attaching a `poll` is optional while `status` is provided."
},
media_ids: %Schema{
nullable: true,
type: :array,
items: %Schema{type: :string},
description: "Array of Attachment ids to be attached as media."
},
poll: poll_params(),
sensitive: %Schema{
allOf: [BooleanLike],
nullable: true,
description: "Mark status and attached media as sensitive?"
},
spoiler_text: %Schema{
type: :string,
nullable: true,
description:
"Text to be shown as a warning or subject before the actual content. Statuses are generally collapsed behind this field."
},
content_type: %Schema{
type: :string,
nullable: true,
description:
"The MIME type of the status, it is transformed into HTML by the backend. You can get the list of the supported MIME types with the nodeinfo endpoint."
},
to: %Schema{
type: :array,
nullable: true,
items: %Schema{type: :string},
description:
"A list of nicknames (like `lain@soykaf.club` or `lain` on the local server) that will be used to determine who is going to be addressed by this post. Using this will disable the implicit addressing by mentioned names in the `status` body, only the people in the `to` list will be addressed. The normal rules for for post visibility are not affected by this and will still apply"
}
},
example: %{
"status" => "What time is it?",
"sensitive" => "false",
"poll" => %{
"options" => ["Cofe", "Adventure"],
"expires_in" => 420
}
}
}
end
def poll_params do
%Schema{
nullable: true,
@ -552,10 +679,99 @@ def id_param do
)
end
defp language_param do
Operation.parameter(:language, :path, :string, "ISO 639 language code", example: "en")
end
defp source_language_param do
Operation.parameter(:from, :query, :string, "ISO 639 language code", example: "en")
end
defp status_response do
Operation.response("Status", "application/json", Status)
end
defp status_history_response do
Operation.response(
"Status History",
"application/json",
%Schema{
title: "Status history",
description: "Response schema for history of a status",
type: :array,
items: %Schema{
type: :object,
properties: %{
account: %Schema{
allOf: [Account],
description: "The account that authored this status"
},
content: %Schema{
type: :string,
format: :html,
description: "HTML-encoded status content"
},
sensitive: %Schema{
type: :boolean,
description: "Is this status marked as sensitive content?"
},
spoiler_text: %Schema{
type: :string,
description:
"Subject or summary line, below which status content is collapsed until expanded"
},
created_at: %Schema{
type: :string,
format: "date-time",
description: "The date when this status was created"
},
media_attachments: %Schema{
type: :array,
items: Attachment,
description: "Media that is attached to this status"
},
emojis: %Schema{
type: :array,
items: Emoji,
description: "Custom emoji to be used when rendering status content"
},
poll: %Schema{
allOf: [Poll],
nullable: true,
description: "The poll attached to the status"
}
}
}
}
)
end
defp status_source_response do
Operation.response(
"Status Source",
"application/json",
%Schema{
type: :object,
properties: %{
id: FlakeID,
text: %Schema{
type: :string,
description: "Raw source of status content"
},
spoiler_text: %Schema{
type: :string,
description:
"Subject or summary line, below which status content is collapsed until expanded"
},
content_type: %Schema{
type: :string,
description: "The content type of the source"
}
}
}
)
end
defp context do
%Schema{
title: "StatusContext",
@ -573,4 +789,20 @@ defp context do
}
}
end
defp translation do
%Schema{
title: "StatusTranslation",
description: "The translation of a status.",
type: :object,
required: [:detected_language, :text],
properties: %{
detected_language: %Schema{
type: :string,
description: "The detected language of the text"
},
text: %Schema{type: :string, description: "The translated text"}
}
}
end
end

View file

@ -0,0 +1,53 @@
defmodule Pleroma.Web.ApiSpec.TranslationOperation do
alias OpenApiSpex.Operation
alias OpenApiSpex.Schema
@spec open_api_operation(atom) :: Operation.t()
def open_api_operation(action) do
operation = String.to_existing_atom("#{action}_operation")
apply(__MODULE__, operation, [])
end
@spec languages_operation() :: Operation.t()
def languages_operation() do
%Operation{
tags: ["Retrieve status translation"],
summary: "Translate status",
description: "View the translation of a given status",
operationId: "AkkomaAPI.TranslationController.languages",
security: [%{"oAuth" => ["read:statuses"]}],
responses: %{
200 =>
Operation.response("Translation", "application/json", source_dest_languages_schema())
}
}
end
defp source_dest_languages_schema do
%Schema{
type: :object,
required: [:source, :target],
properties: %{
source: languages_schema(),
target: languages_schema()
}
}
end
defp languages_schema do
%Schema{
type: :array,
items: %Schema{
type: :object,
properties: %{
code: %Schema{
type: :string
},
name: %Schema{
type: :string
}
}
}
}
end
end

View file

@ -405,6 +405,16 @@ defp remote_interaction_request do
}
end
def show_subscribe_form_operation do
%Operation{
tags: ["Accounts"],
summary: "Show remote subscribe form",
operationId: "UtilController.show_subscribe_form",
parameters: [],
responses: %{200 => Operation.response("Web Page", "text/html", %Schema{type: :string})}
}
end
defp delete_account_request do
%Schema{
title: "AccountDeleteRequest",

View file

@ -73,6 +73,12 @@ defmodule Pleroma.Web.ApiSpec.Schemas.Status do
format: "date-time",
description: "The date when this status was created"
},
edited_at: %Schema{
type: :string,
format: "date-time",
nullable: true,
description: "The date when this status was last edited"
},
emojis: %Schema{
type: :array,
items: Emoji,

View file

@ -209,7 +209,8 @@ def react_with_emoji(id, user, emoji) do
{:ok, activity, _} <- Pipeline.common_pipeline(emoji_react, local: true) do
{:ok, activity}
else
_ -> {:error, dgettext("errors", "Could not add reaction emoji")}
_ ->
{:error, dgettext("errors", "Could not add reaction emoji")}
end
end
@ -346,6 +347,41 @@ def post(user, %{status: _} = data) do
end
end
def update(user, orig_activity, changes) do
with orig_object <- Object.normalize(orig_activity),
{:ok, new_object} <- make_update_data(user, orig_object, changes),
{:ok, update_data, _} <- Builder.update(user, new_object),
{:ok, update, _} <- Pipeline.common_pipeline(update_data, local: true) do
{:ok, update}
else
_ -> {:error, nil}
end
end
defp make_update_data(user, orig_object, changes) do
kept_params = %{
visibility: Visibility.get_visibility(orig_object),
in_reply_to_id:
with replied_id when is_binary(replied_id) <- orig_object.data["inReplyTo"],
%Activity{id: activity_id} <- Activity.get_create_by_object_ap_id(replied_id) do
activity_id
else
_ -> nil
end
}
params = Map.merge(changes, kept_params)
with {:ok, draft} <- ActivityDraft.create(user, params) do
change =
Object.Updater.make_update_object_data(orig_object.data, draft.object, Utils.make_date())
{:ok, change}
else
_ -> {:error, nil}
end
end
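
A rough IEx sketch of the new edit path, assuming `user` authored the status; the parameter names follow the ActivityDraft params used above, and the id and text are placeholders:

```elixir
user = Pleroma.User.get_cached_by_nickname("alice")
activity = Pleroma.Activity.get_by_id_with_object("AM7aBCDeFgHiJ1Kl2m")

{:ok, _update_activity} =
  Pleroma.Web.CommonAPI.update(user, activity, %{
    status: "what time is it? (edited)",
    sensitive: false
  })
```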
@spec pin(String.t(), User.t()) :: {:ok, Activity.t()} | {:error, term()}
def pin(id, %User{} = user) do
with %Activity{} = activity <- create_activity_by_id(id),

View file

@ -221,7 +221,7 @@ defp object(draft) do
|> Map.put("emoji", emoji)
|> Map.put("source", %{
"content" => draft.status,
"mediaType" => draft.params[:content_type]
"mediaType" => Utils.get_content_type(draft.params[:content_type])
})
|> Map.put("generator", draft.params[:generator])

View file

@ -37,7 +37,7 @@ def attachments_from_ids_no_descs([]), do: []
def attachments_from_ids_no_descs(ids) do
Enum.map(ids, fn media_id ->
case Repo.get(Object, media_id) do
case get_attachment(media_id) do
%Object{data: data} -> data
_ -> nil
end
@ -51,13 +51,17 @@ def attachments_from_ids_descs(ids, descs_str) do
{_, descs} = Jason.decode(descs_str)
Enum.map(ids, fn media_id ->
with %Object{data: data} <- Repo.get(Object, media_id) do
with %Object{data: data} <- get_attachment(media_id) do
Map.put(data, "name", descs[media_id])
end
end)
|> Enum.reject(&is_nil/1)
end
defp get_attachment(media_id) do
Repo.get(Object, media_id)
end
@spec get_to_and_cc(ActivityDraft.t()) :: {list(String.t()), list(String.t())}
def get_to_and_cc(%{in_reply_to_conversation: %Participation{} = participation}) do
@ -219,7 +223,7 @@ def make_content_html(%ActivityDraft{} = draft) do
|> maybe_add_attachments(draft.attachments, attachment_links)
end
defp get_content_type(content_type) do
def get_content_type(content_type) do
if Enum.member?(Config.get([:instance, :allowed_post_formats]), content_type) do
content_type
else
@ -285,11 +289,11 @@ def format_input(text, "text/html", options) do
def format_input(text, "text/x.misskeymarkdown", options) do
text
|> Formatter.markdown_to_html()
|> MfmParser.Parser.parse()
|> MfmParser.Encoder.to_html()
|> Formatter.linkify(options)
|> Formatter.html_escape("text/x.misskeymarkdown")
|> (fn {text, mentions, tags} ->
{String.replace(text, ~r/\r?\n/, "<br>"), mentions, tags}
end).()
|> Formatter.html_escape("text/html")
end
def format_input(text, "text/markdown", options) do

View file

@ -27,9 +27,21 @@ defmodule Pleroma.Web.MastoFEController do
def index(conn, _params) do
with %{assigns: %{user: %User{} = user, token: %Token{app_id: token_app_id} = token}} <- conn,
{:ok, %{id: ^token_app_id}} <- AuthController.local_mastofe_app() do
flavour =
[:frontends, :mastodon]
|> Pleroma.Config.get()
|> Map.get("name", "mastodon-fe")
index =
if flavour == "fedibird-fe" do
"fedibird.index.html"
else
"glitchsoc.index.html"
end
conn
|> put_layout(false)
|> render("index.html",
|> render(index,
token: token.token,
user: user,
custom_emojis: Pleroma.Emoji.get_all()
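
The flavour comes from the `:frontends` config; a hedged example of switching the `/web` frontend to fedibird, assuming the corresponding bundle is installed under that name (the `"ref"` value is a guess and depends on the build you install):

```elixir
# e.g. in prod.secret.exs (illustrative)
config :pleroma, :frontends,
  mastodon: %{"name" => "fedibird-fe", "ref" => "akkoma"}
```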

View file

@ -27,7 +27,8 @@ defmodule Pleroma.Web.MastodonAPI.AuthController do
def login(conn, %{"code" => auth_token} = params) do
with {:ok, app} <- local_mastofe_app(),
{:ok, auth} <- Authorization.get_by_token(app, auth_token),
{:ok, oauth_token} <- Token.exchange_token(app, auth) do
%User{} = user <- User.get_cached_by_id(auth.user_id),
{:ok, oauth_token} <- Token.get_or_exchange_token(auth, app, user) do
redirect_to =
conn
|> local_mastodon_post_login_path()

View file

@ -51,6 +51,7 @@ def index(conn, %{account_id: account_id} = params) do
move
pleroma:emoji_reaction
poll
update
}
def index(%{assigns: %{user: user}} = conn, params) do
params =

View file

@ -14,6 +14,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
alias Pleroma.Bookmark
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.Config
alias Pleroma.ScheduledActivity
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
@ -30,6 +31,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
plug(:skip_public_check when action in [:index, :show])
@unauthenticated_access %{fallback: :proceed_unauthenticated, scopes: []}
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
plug(
OAuthScopesPlug,
@ -37,7 +39,10 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
when action in [
:index,
:show,
:context
:context,
:translate,
:show_history,
:show_source
]
)
@ -48,7 +53,8 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
:create,
:delete,
:reblog,
:unreblog
:unreblog,
:update
]
)
@ -190,6 +196,59 @@ def create(%{assigns: %{user: _user}, body_params: %{media_ids: _} = params} = c
create(%Plug.Conn{conn | body_params: params}, %{})
end
@doc "GET /api/v1/statuses/:id/history"
def show_history(%{assigns: assigns} = conn, %{id: id} = params) do
with user = assigns[:user],
%Activity{} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.visible_for_user?(activity, user) do
try_render(conn, "history.json",
activity: activity,
for: user,
with_direct_conversation_id: true,
with_muted: Map.get(params, :with_muted, false)
)
else
_ -> {:error, :not_found}
end
end
@doc "GET /api/v1/statuses/:id/source"
def show_source(%{assigns: assigns} = conn, %{id: id} = _params) do
with user = assigns[:user],
%Activity{} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.visible_for_user?(activity, user) do
try_render(conn, "source.json",
activity: activity,
for: user
)
else
_ -> {:error, :not_found}
end
end
@doc "PUT /api/v1/statuses/:id"
def update(%{assigns: %{user: user}, body_params: body_params} = conn, %{id: id} = params) do
with {_, %Activity{}} = {_, activity} <- {:activity, Activity.get_by_id_with_object(id)},
{_, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
{_, true} <- {:is_create, activity.data["type"] == "Create"},
actor <- Activity.user_actor(activity),
{_, true} <- {:own_status, actor.id == user.id},
changes <- body_params |> put_application(conn),
{_, {:ok, _update_activity}} <- {:pipeline, CommonAPI.update(user, activity, changes)},
{_, %Activity{}} = {_, activity} <- {:refetched, Activity.get_by_id_with_object(id)} do
try_render(conn, "show.json",
activity: activity,
for: user,
with_direct_conversation_id: true,
with_muted: Map.get(params, :with_muted, false)
)
else
{:own_status, _} -> {:error, :forbidden}
{:pipeline, _} -> {:error, :internal_server_error}
_ -> {:error, :not_found}
end
end
@doc "GET /api/v1/statuses/:id"
def show(%{assigns: %{user: user}} = conn, %{id: id} = params) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id),
@ -418,6 +477,51 @@ def bookmarks(%{assigns: %{user: user}} = conn, params) do
)
end
@doc "GET /api/v1/statuses/:id/translations/:language"
def translate(%{assigns: %{user: user}} = conn, %{id: id, language: language} = params) do
with {:enabled, true} <- {:enabled, Config.get([:translator, :enabled])},
%Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
translation_module <- Config.get([:translator, :module]),
{:ok, detected, translation} <-
fetch_or_translate(
activity.id,
activity.object.data["content"],
Map.get(params, :from, nil),
language,
translation_module
) do
json(conn, %{detected_language: detected, text: translation})
else
{:enabled, false} ->
conn
|> put_status(:bad_request)
|> json(%{"error" => "Translation is not enabled"})
{:visible, false} ->
{:error, :not_found}
e ->
e
end
end
defp fetch_or_translate(status_id, text, source_language, target_language, translation_module) do
@cachex.fetch!(
:translations_cache,
"translations:#{status_id}:#{source_language}:#{target_language}",
fn _ ->
value = translation_module.translate(text, source_language, target_language)
with {:ok, _, _} <- value do
value
else
_ -> {:ignore, value}
end
end
)
end
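
The controller only reads `[:translator, :enabled]` and `[:translator, :module]`; a minimal sketch of turning translation on (the module name and any provider-specific settings such as API keys are assumptions, not taken from this changeset):

```elixir
# Illustrative config only.
config :pleroma, :translator,
  enabled: true,
  module: Pleroma.Akkoma.Translators.DeepL
```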
defp put_application(params, %{assigns: %{token: %Token{user: %User{} = user} = token}} = _conn) do
if user.disclose_client do
%{client_name: client_name, website: website} = Repo.preload(token, :app).app

View file

@ -26,7 +26,7 @@ def render("show.json", _) do
thumbnail:
URI.merge(Pleroma.Web.Endpoint.url(), Keyword.get(instance, :instance_thumbnail))
|> to_string,
languages: ["en"],
languages: Keyword.get(instance, :languages, ["en"]),
registrations: Keyword.get(instance, :registrations_open),
approval_required: Keyword.get(instance, :account_approval_required),
# Extra (not present in Mastodon):
@ -65,6 +65,7 @@ def features do
"shareable_emoji_packs",
"multifetch",
"pleroma:api/v1/notifications:include_types_filter",
"editing",
if Config.get([:media_proxy, :enabled]) do
"media_proxy"
end,
@ -81,7 +82,11 @@ def features do
if Config.get([:instance, :profile_directory]) do
"profile_directory"
end,
"custom_emoji_reactions"
if Config.get([:translator, :enabled], false) do
"akkoma:machine_translation"
end,
"custom_emoji_reactions",
"pleroma:get:main/ostatus"
]
|> Enum.filter(& &1)
end

View file

@ -17,7 +17,11 @@ defmodule Pleroma.Web.MastodonAPI.NotificationView do
alias Pleroma.Web.MastodonAPI.NotificationView
alias Pleroma.Web.MastodonAPI.StatusView
@parent_types ~w{Like Announce EmojiReact}
defp object_id_for(%{data: %{"object" => %{"id" => id}}}) when is_binary(id), do: id
defp object_id_for(%{data: %{"object" => id}}) when is_binary(id), do: id
@parent_types ~w{Like Announce EmojiReact Update}
def render("index.json", %{notifications: notifications, for: reading_user} = opts) do
activities = Enum.map(notifications, & &1.activity)
@ -28,7 +32,7 @@ def render("index.json", %{notifications: notifications, for: reading_user} = op
%{data: %{"type" => type}} ->
type in @parent_types
end)
|> Enum.map(& &1.data["object"])
|> Enum.map(&object_id_for/1)
|> Activity.create_by_object_ap_id()
|> Activity.with_preloaded_object(:left)
|> Pleroma.Repo.all()
@ -76,9 +80,9 @@ def render(
parent_activity_fn = fn ->
if opts[:parent_activities] do
Activity.Queries.find_by_object_ap_id(opts[:parent_activities], activity.data["object"])
Activity.Queries.find_by_object_ap_id(opts[:parent_activities], object_id_for(activity))
else
Activity.get_create_by_object_ap_id(activity.data["object"])
Activity.get_create_by_object_ap_id(object_id_for(activity))
end
end
@ -107,6 +111,9 @@ def render(
"reblog" ->
put_status(response, parent_activity_fn.(), reading_user, status_render_opts)
"update" ->
put_status(response, parent_activity_fn.(), reading_user, status_render_opts)
"move" ->
put_target(response, activity, reading_user, %{})

View file

@ -265,10 +265,30 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
created_at = Utils.to_masto_date(object.data["published"])
edited_at =
with %{"updated" => updated} <- object.data,
date <- Utils.to_masto_date(updated),
true <- date != "" do
date
else
_ ->
nil
end
reply_to = get_reply_to(activity, opts)
reply_to_user = reply_to && CommonAPI.get_user(reply_to.data["actor"])
history_len =
1 +
(Object.Updater.history_for(object.data)
|> Map.get("orderedItems")
|> length())
# See render("history.json", ...) for more details
# Here the implicit index of the current content is 0
chrono_order = history_len - 1
content =
object
|> render_content()
@ -278,14 +298,14 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
|> Activity.HTML.get_cached_scrubbed_html_for_activity(
User.html_filter_policy(opts[:for]),
activity,
"mastoapi:content"
"mastoapi:content:#{chrono_order}"
)
content_plaintext =
content
|> Activity.HTML.get_cached_stripped_html_for_activity(
activity,
"mastoapi:content"
"mastoapi:content:#{chrono_order}"
)
summary = object.data["summary"] || ""
@ -353,8 +373,9 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
reblog: nil,
card: card,
content: content_html,
text: opts[:with_source] && object.data["source"],
text: opts[:with_source] && get_source_text(object.data["source"]),
created_at: created_at,
edited_at: edited_at,
reblogs_count: announcement_count,
replies_count: object.data["repliesCount"] || 0,
favourites_count: like_count,
@ -375,6 +396,7 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
emojis: build_emojis(object.data["emoji"]),
quote_id: if(quote, do: quote.id, else: nil),
quote: maybe_render_quote(quote, opts),
emoji_reactions: emoji_reactions,
pleroma: %{
local: activity.local,
conversation_id: get_context_id(activity),
@ -399,6 +421,100 @@ def render("show.json", _) do
nil
end
def render("history.json", %{activity: %{data: %{"object" => _object}} = activity} = opts) do
object = Object.normalize(activity, fetch: false)
hashtags = Object.hashtags(object)
user = CommonAPI.get_user(activity.data["actor"])
past_history =
Object.Updater.history_for(object.data)
|> Map.get("orderedItems")
|> Enum.map(&Map.put(&1, "id", object.data["id"]))
|> Enum.map(&%Object{data: &1, id: object.id})
history =
[object | past_history]
# Mastodon expects the original revision to be the first item
|> Enum.reverse()
|> Enum.with_index()
|> Enum.map(fn {object, chrono_order} ->
%{
# The history is prepended every time there is a new edit.
# In chrono_order, the oldest item is always at 0, and so on.
# The chrono_order is an invariant kept between edits.
chrono_order: chrono_order,
object: object
}
end)
individual_opts =
opts
|> Map.put(:as, :item)
|> Map.put(:user, user)
|> Map.put(:hashtags, hashtags)
render_many(history, StatusView, "history_item.json", individual_opts)
end
def render(
"history_item.json",
%{
activity: activity,
user: user,
item: %{object: object, chrono_order: chrono_order},
hashtags: hashtags
} = opts
) do
sensitive = object.data["sensitive"] || Enum.member?(hashtags, "nsfw")
attachment_data = object.data["attachment"] || []
attachments = render_many(attachment_data, StatusView, "attachment.json", as: :attachment)
created_at = Utils.to_masto_date(object.data["updated"] || object.data["published"])
content =
object
|> render_content()
content_html =
content
|> Activity.HTML.get_cached_scrubbed_html_for_activity(
User.html_filter_policy(opts[:for]),
activity,
"mastoapi:content:#{chrono_order}"
)
summary = object.data["summary"] || ""
%{
account:
AccountView.render("show.json", %{
user: user,
for: opts[:for]
}),
content: content_html,
sensitive: sensitive,
spoiler_text: summary,
created_at: created_at,
media_attachments: attachments,
emojis: build_emojis(object.data["emoji"]),
poll: render(PollView, "show.json", object: object, for: opts[:for])
}
end
def render("source.json", %{activity: %{data: %{"object" => _object}} = activity} = _opts) do
object = Object.normalize(activity, fetch: false)
%{
id: activity.id,
text: get_source_text(Map.get(object.data, "source", "")),
spoiler_text: Map.get(object.data, "summary", ""),
content_type: get_source_content_type(object.data["source"])
}
end
def render("card.json", %{rich_media: rich_media, page_url: page_url}) do
page_url_data = URI.parse(page_url)
@ -451,10 +567,19 @@ def render("attachment.json", %{attachment: attachment}) do
true -> "unknown"
end
attachment_id =
with {_, ap_id} when is_binary(ap_id) <- {:ap_id, attachment["id"]},
{_, %Object{data: _object_data, id: object_id}} <-
{:object, Object.get_by_ap_id(ap_id)} do
to_string(object_id)
else
_ ->
<<hash_id::signed-32, _rest::binary>> = :crypto.hash(:md5, href)
to_string(attachment["id"] || hash_id)
end
%{
id: to_string(attachment["id"] || hash_id),
id: attachment_id,
url: href,
remote_url: href,
preview_url: href_preview,
@ -586,10 +711,11 @@ defp pin_data(%Object{data: %{"id" => object_id}}, %User{pinned_objects: pinned_
defp build_emoji_map(emoji, users, url, current_user) do
%{
name: emoji,
name: Pleroma.Web.PleromaAPI.EmojiReactionView.emoji_name(emoji, url),
count: length(users),
url: MediaProxy.url(url),
me: !!(current_user && current_user.ap_id in users)
me: !!(current_user && current_user.ap_id in users),
account_ids: Enum.map(users, fn user -> User.get_cached_by_ap_id(user).id end)
}
end
@ -621,15 +747,39 @@ defp build_image_url(_, _), do: nil
defp maybe_render_quote(nil, _), do: nil
defp maybe_render_quote(quote, opts) do
if opts[:do_not_recurse] || !visible_for_user?(quote, opts[:for]) do
nil
else
with %User{} = quoted_user <- User.get_cached_by_ap_id(quote.actor),
false <- Map.get(opts, :do_not_recurse, false),
true <- visible_for_user?(quote, opts[:for]),
false <- User.blocks?(opts[:for], quoted_user),
false <- User.mutes?(opts[:for], quoted_user) do
opts =
opts
|> Map.put(:activity, quote)
|> Map.put(:do_not_recurse, true)
render("show.json", opts)
else
_ -> nil
end
end
defp get_source_text(%{"content" => content} = _source) do
content
end
defp get_source_text(source) when is_binary(source) do
source
end
defp get_source_text(_) do
""
end
defp get_source_content_type(%{"mediaType" => type} = _source) do
type
end
defp get_source_content_type(_source) do
Utils.get_content_type(nil)
end
end
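Most of the edit-support changes in this view hinge on `chrono_order`: past revisions are stored newest-first (prepended on each edit), the rendered history is reversed so the original sits at index 0, and the current revision always gets the highest index. That index is also folded into the scrubbed-HTML cache key (`"mastoapi:content:#{chrono_order}"`) so editing a post can never serve stale cached HTML. A standalone sketch of the ordering and key derivation, with placeholder revisions:

```elixir
# Past revisions as stored on the object: newest first, prepended on each edit.
past_history = ["rev 2", "rev 1", "rev 0 (original)"]
current = "rev 3 (current)"

history =
  [current | past_history]
  # Mastodon expects the original revision first
  |> Enum.reverse()
  |> Enum.with_index()

# => [{"rev 0 (original)", 0}, {"rev 1", 1}, {"rev 2", 2}, {"rev 3 (current)", 3}]

# The current revision therefore has chrono_order = history length - 1,
# which is what the per-revision HTML cache key is built from:
chrono_order = length(history) - 1
cache_key = "mastoapi:content:#{chrono_order}"
# => "mastoapi:content:3"
```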

View file

@ -32,8 +32,15 @@ def init(%{qs: qs} = req, state) do
req
end
{:cowboy_websocket, req, %{user: user, topic: topic, count: 0, timer: nil},
%{idle_timeout: @timeout}}
{:cowboy_websocket, req,
%{
user: user,
topic: topic,
count: 0,
timer: nil,
subscriptions: [],
oauth_token: oauth_token
}, %{idle_timeout: @timeout}}
else
{:error, :bad_topic} ->
Logger.debug("#{__MODULE__} bad topic #{inspect(req)}")
@ -52,7 +59,7 @@ def websocket_init(state) do
"#{__MODULE__} accepted websocket connection for user #{(state.user || %{id: "anonymous"}).id}, topic #{state.topic}"
)
Streamer.add_socket(state.topic, state.user)
Streamer.add_socket(state.topic, state.oauth_token)
{:ok, %{state | timer: timer()}}
end
@ -65,21 +72,50 @@ def websocket_handle(:pong, state) do
# We only receive pings for now
def websocket_handle(:ping, state), do: {:ok, state}
def websocket_handle({:text, "ping"}, state) do
def websocket_handle({:text, ping}, state) when ping in ~w[ping PING] do
if state.timer, do: Process.cancel_timer(state.timer)
{:reply, {:text, "pong"}, %{state | timer: timer()}}
end
def websocket_handle({:text, text}, state) do
with {:ok, json} <- Jason.decode(text) do
websocket_handle({:json, json}, state)
else
_ ->
Logger.error("#{__MODULE__} received text frame: #{text}")
{:ok, state}
end
end
def websocket_handle(
{:json, %{"type" => "subscribe", "stream" => stream_name}},
%{user: user, oauth_token: token} = state
) do
with {:ok, topic} <- Streamer.get_topic(stream_name, user, token, %{}) do
new_subscriptions =
[topic | Map.get(state, :subscriptions, [])]
|> Enum.uniq()
{:ok, _topic} = Streamer.add_socket(topic, user)
{:ok, Map.put(state, :subscriptions, new_subscriptions)}
else
_ ->
Logger.error("#{__MODULE__} received invalid topic: #{stream_name}")
{:ok, state}
end
end
def websocket_handle(frame, state) do
Logger.error("#{__MODULE__} received frame: #{inspect(frame)}")
{:ok, state}
end
def websocket_info({:render_with_user, view, template, item}, state) do
def websocket_info({:render_with_user, view, template, item, topic}, state) do
user = %User{} = User.get_cached_by_ap_id(state.user.ap_id)
unless Streamer.filtered_by_user?(user, item) do
websocket_info({:text, view.render(template, item, user)}, %{state | user: user})
websocket_info({:text, view.render(template, item, user, topic)}, %{state | user: user})
else
{:ok, state}
end
@ -103,6 +139,10 @@ def websocket_info(:tick, state) do
{:reply, :ping, %{state | timer: nil, count: 0}, :hibernate}
end
def websocket_info(:close, state) do
{:stop, state}
end
# State can be `[]` only in case we terminate before switching to websocket,
# we already log errors for these cases in `init/1`, so just do nothing here
def terminate(_reason, _req, []), do: :ok

View file

@ -94,4 +94,9 @@ def get_by_token(%App{id: app_id} = _app, token) do
from(t in __MODULE__, where: t.app_id == ^app_id and t.token == ^token)
|> Repo.find_resource()
end
def get_preeexisting_by_app_and_user(%App{id: app_id} = _app, %User{id: user_id} = _user) do
from(t in __MODULE__, where: t.app_id == ^app_id and t.user_id == ^user_id, limit: 1)
|> Repo.find_resource()
end
end

View file

@ -59,18 +59,39 @@ def authorize(%Plug.Conn{assigns: %{token: %Token{}}} = conn, %{"force_login" =>
# after user already authorized to MastodonFE.
# So we have to check client and token.
def authorize(
%Plug.Conn{assigns: %{token: %Token{} = token}} = conn,
%Plug.Conn{assigns: %{token: %Token{} = token, user: %User{} = user}} = conn,
%{"client_id" => client_id} = params
) do
with %Token{} = t <- Repo.get_by(Token, token: token.token) |> Repo.preload(:app),
^client_id <- t.app.client_id do
handle_existing_authorization(conn, params)
else
_ ->
maybe_reuse_token(conn, params, user.id)
end
end
def authorize(%Plug.Conn{} = conn, params) do
# if we have a user in the session, attempt to authenticate as them
# otherwise show the login form
maybe_reuse_token(conn, params, AuthHelper.get_session_user(conn))
end
defp maybe_reuse_token(conn, params, user_id) when is_binary(user_id) do
with %User{} = user <- User.get_cached_by_id(user_id),
%App{} = app <- Repo.get_by(App, client_id: params["client_id"]),
{:ok, %Token{} = token} <- Token.get_preeexisting_by_app_and_user(app, user),
{:ok, %Authorization{} = auth} <-
Authorization.get_preeexisting_by_app_and_user(app, user) do
conn
|> assign(:token, token)
|> after_create_authorization(auth, %{"authorization" => params})
else
_ -> do_authorize(conn, params)
end
end
def authorize(%Plug.Conn{} = conn, params), do: do_authorize(conn, params)
defp maybe_reuse_token(conn, params, _user), do: do_authorize(conn, params)
defp do_authorize(%Plug.Conn{} = conn, params) do
app = Repo.get_by(App, client_id: params["client_id"])
@ -148,7 +169,9 @@ def create_authorization(%Plug.Conn{assigns: %{user: %User{} = user}} = conn, pa
def create_authorization(%Plug.Conn{} = conn, %{"authorization" => _} = params, opts) do
with {:ok, auth, user} <- do_create_authorization(conn, params, opts[:user]),
{:mfa_required, _, _, false} <- {:mfa_required, user, auth, MFA.require?(user)} do
after_create_authorization(conn, auth, params)
conn
|> AuthHelper.put_session_user(user.id)
|> after_create_authorization(auth, params)
else
error ->
handle_create_authorization_error(conn, error, params)
@ -269,7 +292,7 @@ def token_exchange(%Plug.Conn{} = conn, %{"grant_type" => "authorization_code"}
fixed_token = Token.Utils.fix_padding(params["code"]),
{:ok, auth} <- Authorization.get_by_token(app, fixed_token),
%User{} = user <- User.get_cached_by_id(auth.user_id),
{:ok, token} <- Token.exchange_token(app, auth) do
{:ok, token} <- Token.get_or_exchange_token(auth, app, user) do
after_token_exchange(conn, %{user: user, token: token})
else
error ->
@ -321,6 +344,7 @@ def token_exchange(%Plug.Conn{} = conn, params), do: bad_request(conn, params)
def after_token_exchange(%Plug.Conn{} = conn, %{token: token} = view_params) do
conn
|> AuthHelper.put_session_token(token.token)
|> AuthHelper.put_session_user(token.user_id)
|> json(OAuthView.render("token.json", view_params))
end
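This is the controller half of the "reusing oauth tokens" changelog entry: if the session already identifies the user and the app has a pre-existing authorization plus an unexpired token, `authorize` short-circuits to it, and `token_exchange` goes through `get_or_exchange_token/3` so an already-used authorization hands back the existing token instead of minting a new one. A condensed sketch of that branching, with stand-in bodies rather than the real Repo lookups:

```elixir
defmodule TokenReuseSketch do
  # Stand-in bodies only: mirrors the branching, not the real Repo queries.
  def get_or_exchange_token(%{used: true}, app, user), do: get_preexisting(app, user)
  def get_or_exchange_token(auth, app, _user), do: exchange_token(app, auth)

  defp get_preexisting(_app, _user), do: {:ok, :existing_token}
  defp exchange_token(_app, _auth), do: {:ok, :fresh_token}
end

TokenReuseSketch.get_or_exchange_token(%{used: true}, :app, :user)  # => {:ok, :existing_token}
TokenReuseSketch.get_or_exchange_token(%{used: false}, :app, :user) # => {:ok, :fresh_token}
```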

View file

@ -70,6 +70,16 @@ def exchange_token(app, auth) do
end
end
def get_preeexisting_by_app_and_user(app, user) do
Query.get_by_app(app.id)
|> Query.get_by_user(user.id)
|> Query.get_unexpired()
|> Query.preload([:user])
|> Query.sort_by_inserted_at()
|> Query.limit(1)
|> Repo.find_resource()
end
defp put_token(changeset) do
changeset
|> change(%{token: Token.Utils.generate_token()})
@ -86,6 +96,14 @@ defp put_refresh_token(changeset, attrs) do
|> unique_constraint(:refresh_token)
end
def get_or_exchange_token(%Authorization{} = auth, %App{} = app, %User{} = user) do
if auth.used do
get_preeexisting_by_app_and_user(app, user)
else
exchange_token(app, auth)
end
end
defp put_valid_until(changeset, attrs) do
valid_until =
Map.get(attrs, :valid_until, NaiveDateTime.add(NaiveDateTime.utc_now(), lifespan()))

View file

@ -38,6 +38,19 @@ def get_by_user(query \\ Token, user_id) do
from(q in query, where: q.user_id == ^user_id)
end
def get_unexpired(query) do
now = NaiveDateTime.utc_now()
from(q in query, where: q.valid_until > ^now)
end
def limit(query, limit) do
from(q in query, limit: ^limit)
end
def sort_by_inserted_at(query) do
from(q in query, order_by: [desc: :updated_at])
end
@spec preload(query, any) :: query
def preload(query \\ Token, assoc_preload \\ [])

View file

@ -21,6 +21,18 @@ def revoke(%App{} = app, %{"token" => token} = _attrs) do
@doc "Revokes access token"
@spec revoke(Token.t()) :: {:ok, Token.t()} | {:error, Ecto.Changeset.t()}
def revoke(%Token{} = token) do
Repo.delete(token)
with {:ok, token} <- Repo.delete(token) do
Task.Supervisor.start_child(
Pleroma.TaskSupervisor,
Pleroma.Web.Streamer,
:close_streams_by_oauth_token,
[token],
restart: :transient
)
{:ok, token}
else
result -> result
end
end
end

View file

@ -74,6 +74,11 @@ defp filter(reactions, %{emoji: emoji}) when is_binary(emoji) do
defp filter(reactions, _), do: reactions
def create(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) do
emoji =
emoji
|> Pleroma.Emoji.fully_qualify_emoji()
|> Pleroma.Emoji.maybe_quote()
with {:ok, _activity} <- CommonAPI.react_with_emoji(activity_id, user, emoji) do
activity = Activity.get_by_id(activity_id)
@ -84,6 +89,11 @@ def create(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) d
end
def delete(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) do
emoji =
emoji
|> Pleroma.Emoji.fully_qualify_emoji()
|> Pleroma.Emoji.maybe_quote()
with {:ok, _activity} <- CommonAPI.unreact_with_emoji(activity_id, user, emoji) do
activity = Activity.get_by_id(activity_id)

View file

@ -8,6 +8,18 @@ defmodule Pleroma.Web.PleromaAPI.EmojiReactionView do
alias Pleroma.Web.MastodonAPI.AccountView
alias Pleroma.Web.MediaProxy
def emoji_name(emoji, nil), do: emoji
def emoji_name(emoji, url) do
url = URI.parse(url)
if url.host == Pleroma.Web.Endpoint.host() do
emoji
else
"#{emoji}@#{url.host}"
end
end
def render("index.json", %{emoji_reactions: emoji_reactions} = opts) do
render_many(emoji_reactions, __MODULE__, "show.json", opts)
end
@ -16,7 +28,7 @@ def render("show.json", %{emoji_reaction: {emoji, user_ap_ids, url}, user: user}
users = fetch_users(user_ap_ids)
%{
name: emoji,
name: emoji_name(emoji, url),
count: length(users),
accounts: render(AccountView, "index.json", users: users, for: user),
url: MediaProxy.url(url),

View file

@ -27,11 +27,11 @@ def call(conn, _opts) do
end
end
def route_aliases(%{path_info: ["objects", id]}) do
def route_aliases(%{path_info: ["objects", id], query_string: query_string}) do
ap_id = Router.Helpers.o_status_url(Pleroma.Web.Endpoint, :object, id)
with %Activity{} = activity <- Activity.get_by_object_ap_id_with_object(ap_id) do
["/notice/#{activity.id}"]
["/notice/#{activity.id}", "/notice/#{activity.id}?#{query_string}"]
else
_ -> []
end
@ -64,7 +64,9 @@ defp maybe_assign_valid_signature(conn) do
if has_signature_header?(conn) do
# set (request-target) header to the appropriate value
# we also replace the digest header with the one we computed
possible_paths = route_aliases(conn) ++ [conn.request_path]
possible_paths =
route_aliases(conn) ++ [conn.request_path, conn.request_path <> "?#{conn.query_string}"]
assign_valid_signature_on_route_aliases(conn, possible_paths)
else
Logger.debug("No signature header!")

View file

@ -337,6 +337,7 @@ defmodule Pleroma.Web.Router do
pipe_through(:pleroma_html)
post("/main/ostatus", UtilController, :remote_subscribe)
get("/main/ostatus", UtilController, :show_subscribe_form)
get("/ostatus_subscribe", RemoteFollowController, :follow)
post("/ostatus_subscribe", RemoteFollowController, :do_follow)
end
@ -457,6 +458,16 @@ defmodule Pleroma.Web.Router do
get("/federation_status", InstancesController, :show)
end
scope "/api/v1", Pleroma.Web.PleromaAPI do
pipe_through(:authenticated_api)
put("/statuses/:id/emoji_reactions/:emoji", EmojiReactionController, :create)
end
scope "/api/v1/akkoma", Pleroma.Web.AkkomaAPI do
pipe_through(:authenticated_api)
get("/translation/languages", TranslationController, :languages)
end
scope "/api/v1", Pleroma.Web.MastodonAPI do
pipe_through(:authenticated_api)
@ -537,6 +548,7 @@ defmodule Pleroma.Web.Router do
get("/bookmarks", StatusController, :bookmarks)
post("/statuses", StatusController, :create)
put("/statuses/:id", StatusController, :update)
delete("/statuses/:id", StatusController, :delete)
post("/statuses/:id/reblog", StatusController, :reblog)
post("/statuses/:id/unreblog", StatusController, :unreblog)
@ -548,6 +560,7 @@ defmodule Pleroma.Web.Router do
post("/statuses/:id/unbookmark", StatusController, :unbookmark)
post("/statuses/:id/mute", StatusController, :mute_conversation)
post("/statuses/:id/unmute", StatusController, :unmute_conversation)
get("/statuses/:id/translations/:language", StatusController, :translate)
post("/push/subscription", SubscriptionController, :create)
get("/push/subscription", SubscriptionController, :show)
@ -601,6 +614,8 @@ defmodule Pleroma.Web.Router do
get("/statuses/:id/context", StatusController, :context)
get("/statuses/:id/favourited_by", StatusController, :favourited_by)
get("/statuses/:id/reblogged_by", StatusController, :reblogged_by)
get("/statuses/:id/history", StatusController, :show_history)
get("/statuses/:id/source", StatusController, :show_source)
get("/custom_emojis", CustomEmojiController, :index)

View file

@ -36,7 +36,7 @@ def registry, do: @registry
{:ok, topic :: String.t()} | {:error, :bad_topic} | {:error, :unauthorized}
def get_topic_and_add_socket(stream, user, oauth_token, params \\ %{}) do
with {:ok, topic} <- get_topic(stream, user, oauth_token, params) do
add_socket(topic, user)
add_socket(topic, oauth_token)
end
end
@ -114,15 +114,20 @@ def get_topic("list", _user, _oauth_token, _params) do
{:error, :unauthorized}
end
# mastodon multi-topic WS
def get_topic(nil, _user, _oauth_token, _params) do
{:ok, :multi}
end
def get_topic(_stream, _user, _oauth_token, _params) do
{:error, :bad_topic}
end
@doc "Registers the process for streaming. Use `get_topic/3` to get the full authorized topic."
def add_socket(topic, user) do
def add_socket(topic, oauth_token) do
if should_env_send?() do
auth? = if user, do: true
Registry.register(@registry, topic, auth?)
oauth_token_id = if oauth_token, do: oauth_token.id, else: false
Registry.register(@registry, topic, oauth_token_id)
end
{:ok, topic}
@ -186,8 +191,8 @@ defp do_stream("direct", item) do
end
defp do_stream("follow_relationship", item) do
text = StreamerView.render("follow_relationships_update.json", item)
user_topic = "user:#{item.follower.id}"
text = StreamerView.render("follow_relationships_update.json", item, user_topic)
Logger.debug("Trying to push follow relationship update to #{user_topic}\n\n")
@ -235,7 +240,7 @@ defp do_stream(topic, %Notification{} = item)
when topic in ["user", "user:notification"] do
Registry.dispatch(@registry, "#{topic}:#{item.user_id}", fn list ->
Enum.each(list, fn {pid, _auth} ->
send(pid, {:render_with_user, StreamerView, "notification.json", item})
send(pid, {:render_with_user, StreamerView, "notification.json", item, topic})
end)
end)
end
@ -259,7 +264,7 @@ defp do_stream(topic, item) do
end
defp push_to_socket(topic, %Participation{} = participation) do
rendered = StreamerView.render("conversation.json", participation)
rendered = StreamerView.render("conversation.json", participation, topic)
Registry.dispatch(@registry, topic, fn list ->
Enum.each(list, fn {pid, _} ->
@ -282,13 +287,34 @@ defp push_to_socket(topic, %Activity{
defp push_to_socket(_topic, %Activity{data: %{"type" => "Delete"}}), do: :noop
defp push_to_socket(topic, item) do
anon_render = StreamerView.render("update.json", item)
defp push_to_socket(topic, %Activity{data: %{"type" => "Update"}} = item) do
create_activity =
Pleroma.Activity.get_create_by_object_ap_id(item.object.data["id"])
|> Map.put(:object, item.object)
anon_render = StreamerView.render("status_update.json", create_activity, topic)
Registry.dispatch(@registry, topic, fn list ->
Enum.each(list, fn {pid, auth?} ->
if auth? do
send(pid, {:render_with_user, StreamerView, "update.json", item})
send(
pid,
{:render_with_user, StreamerView, "status_update.json", create_activity, topic}
)
else
send(pid, {:text, anon_render})
end
end)
end)
end
defp push_to_socket(topic, item) do
anon_render = StreamerView.render("update.json", item, topic)
Registry.dispatch(@registry, topic, fn list ->
Enum.each(list, fn {pid, auth?} ->
if auth? do
send(pid, {:render_with_user, StreamerView, "update.json", item, topic})
else
send(pid, {:text, anon_render})
end
@ -306,6 +332,22 @@ defp thread_containment(activity, user) do
end
end
def close_streams_by_oauth_token(oauth_token) do
if should_env_send?() do
Registry.select(
@registry,
[
{
{:"$1", :"$2", :"$3"},
[{:==, :"$3", oauth_token.id}],
[:"$2"]
}
]
)
|> Enum.each(fn pid -> send(pid, :close) end)
end
end
# In test environment, only return true if the registry is started.
# In benchmark environment, returns false.
# In any other environment, always returns true.
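`close_streams_by_oauth_token/1` completes the "websockets get closed on logout" fix: since `add_socket/2` now registers each connection under its oauth token id, revoking that token can select every registered pid via an Erlang match spec and send it `:close`. A standalone sketch of the same selection against a throwaway registry (registry name, topics and token id are illustrative):

```elixir
{:ok, _} = Registry.start_link(keys: :duplicate, name: SketchRegistry)

token_id = 42
# Each streaming connection registers its topic with the token id as the value.
Registry.register(SketchRegistry, "user:1", token_id)
Registry.register(SketchRegistry, "user:notification:1", token_id)

pids =
  Registry.select(SketchRegistry, [
    # match {key, pid, value} ...
    {{:"$1", :"$2", :"$3"},
     # ... where value == token_id ...
     [{:==, :"$3", token_id}],
     # ... and return the pid.
     [:"$2"]}
  ])

# Every matching connection process is then told to close:
Enum.each(pids, fn pid -> send(pid, :close) end)
```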

View file

@ -0,0 +1,34 @@
<!DOCTYPE html>
<html lang='en'>
<head>
<meta charset='utf-8'>
<meta content='width=device-width, initial-scale=1' name='viewport'>
<title>
<%= Config.get([:instance, :name]) %>
</title>
<link rel="icon" type="image/png" href="/favicon.png"/>
<link rel="manifest" type="applicaton/manifest+json" href="<%= Routes.masto_fe_path(Pleroma.Web.Endpoint, :manifest) %>" />
<meta name="theme-color" content="<%= Config.get([:manifest, :theme_color]) %>" />
<script id='initial-state' type='application/json'><%= initial_state(@token, @user, @custom_emojis) %></script>
<script crossorigin='anonymous' src="/packs/js/common.js"></script>
<script crossorigin='anonymous' src="/packs/js/locale_en.js"></script>
<link rel='preload' as='script' crossorigin='anonymous' href='/packs/js/features/getting_started.js'>
<link rel='preload' as='script' crossorigin='anonymous' href='/packs/js/features/compose.js'>
<link rel='preload' as='script' crossorigin='anonymous' href='/packs/js/features/home_timeline.js'>
<link rel='preload' as='script' crossorigin='anonymous' href='/packs/js/features/notifications.js'>
<script crossorigin='anonymous' src="/packs/js/application.js"></script>
<link rel="stylesheet" media="all" href="/packs/css/common.css" />
<link rel="stylesheet" media="all" href="/packs/css/default.css" />
</head>
<body class='app-body no-reduce-motion system-font'>
<div class='app-holder' data-props='{&quot;locale&quot;:&quot;en&quot;}' id='mastodon'>
</div>
</body>
</html>

Some files were not shown because too many files have changed in this diff.