Merge branch 'develop' into fedi-absturztau-be

Puniko 2022-09-17 11:14:46 +02:00
commit 4c7dc359ed
140 changed files with 5285 additions and 455 deletions


@ -1,18 +0,0 @@
<!--
### Precheck
* For support use https://git.pleroma.social/pleroma/pleroma-support or [community channels](https://git.pleroma.social/pleroma/pleroma#community-channels).
* Please do a quick search to ensure no similar bug has been reported before. If the bug has not been addressed after 2 weeks, it's fine to bump it.
* Try to ensure that the bug is actually related to the Pleroma backend. For example, if a bug happens in Pleroma-FE but not in Mastodon-FE or mobile clients, it's likely that the bug should be filed in [Pleroma-FE](https://git.pleroma.social/pleroma/pleroma-fe/issues/new) repository.
-->
### Environment
* Installation type (OTP or From Source):
* Pleroma version (could be found in the "Version" tab of settings in Pleroma-FE):
* Elixir version (`elixir -v` for from source installations, N/A for OTP):
* Operating system:
* PostgreSQL version (`psql -V`):
### Bug description


@ -1,6 +0,0 @@
### Release checklist
* [ ] Bump version in `mix.exs`
* [ ] Compile a changelog
* [ ] Create an MR with an announcement to pleroma.social
* [ ] Tag the release
* [ ] Merge `stable` into `develop` (in case the fixes are already in develop, use `git merge -s ours --no-commit` and manually merge the changelogs)


@ -4,16 +4,20 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
-## [Unreleased]
+## 2022.09
 ### Added
 - support for fedibird-fe, and non-breaking API parity for it to function
 - support for setting instance languages in metadata
 - support for reusing oauth tokens, and not requiring new authorizations
 - the ability to obfuscate domains in your MRF descriptions
+- automatic translation of statuses via DeepL or LibreTranslate
+- ability to edit posts
+- ability to react with remote emoji
 ### Changed
 - MFM parsing is now done on the backend by a modified version of ilja's parser -> https://akkoma.dev/AkkomaGang/mfm-parser
+- InlineQuotePolicy is now on by default
 ### Fixed
 - Compatibility with latest meilisearch
@ -40,6 +44,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - amd64 is built for debian stable. Compatible with ubuntu 20.
 - ubuntu-jammy is built for... well, ubuntu 22 (LTS)
 - amd64-musl is built for alpine 3.16
+- Enable remote users to interact with posts
 ### Fixed
 - Updated mastoFE path, for the newer version

SIGNING_KEY.pub Normal file

@ -0,0 +1,2 @@
untrusted comment: Akkoma Signing Key public key
RWQRlw8Ex/uTbvo1wB1yK75tQ5nXKilB/vrKdkL41bgZHL9aKP+7fSS5


@ -48,6 +48,7 @@
 config :pleroma, Pleroma.Repo,
 telemetry_event: [Pleroma.Repo.Instrumenter],
+queue_target: 20_000,
 migration_lock: nil
 config :pleroma, Pleroma.Captcha,
@ -843,6 +844,19 @@
 }
 }
+config :pleroma, :translator,
+enabled: false,
+module: Pleroma.Akkoma.Translators.DeepL
+config :pleroma, :deepl,
+# either :free or :pro
+tier: :free,
+api_key: ""
+config :pleroma, :libre_translate,
+url: "http://127.0.0.1:5000",
+api_key: nil
 # Import environment specific config. This must remain at the bottom
 # of this file so it overrides the configuration defined above.
 import_config "#{Mix.env()}.exs"


@ -3226,13 +3226,14 @@
 group: :pleroma,
 key: Pleroma.Search,
 type: :group,
+label: "Search",
 description: "General search settings.",
 children: [
 %{
 key: :module,
-type: :keyword,
+type: :module,
 description: "Selected search module.",
-suggestion: [Pleroma.Search.DatabaseSearch, Pleroma.Search.Meilisearch]
+suggestions: {:list_behaviour_implementations, Pleroma.Search.SearchBackend}
 }
 ]
 },
@ -3257,7 +3258,7 @@
 },
 %{
 key: :initial_indexing_chunk_size,
-type: :int,
+type: :integer,
 description:
 "Amount of posts in a batch when running the initial indexing operation. Should probably not be more than 100000" <>
 " since there's a limit on maximum insert size",
@ -3268,6 +3269,7 @@
 %{
 group: :pleroma,
 key: Pleroma.Search.Elasticsearch.Cluster,
+label: "Elasticsearch",
 type: :group,
 description: "Elasticsearch settings.",
 children: [
@ -3334,13 +3336,13 @@
 },
 %{
 key: :bulk_page_size,
-type: :int,
+type: :integer,
 description: "Size for bulk put requests, mostly used on building the index",
 suggestion: [5000]
 },
 %{
 key: :bulk_wait_interval,
-type: :int,
+type: :integer,
 description: "Time to wait between bulk put requests (in ms)",
 suggestion: [15_000]
 }
@ -3349,5 +3351,66 @@
 ]
 }
 ]
+},
+%{
+group: :pleroma,
+key: :translator,
+type: :group,
+description: "Translation Settings",
+children: [
+%{
+key: :enabled,
+type: :boolean,
+description: "Is translation enabled?",
+suggestion: [true, false]
+},
+%{
+key: :module,
+type: :module,
+description: "Translation module.",
+suggestions: {:list_behaviour_implementations, Pleroma.Akkoma.Translator}
+}
+]
+},
+%{
+group: :pleroma,
+key: :deepl,
+label: "DeepL",
+type: :group,
+description: "DeepL Settings.",
+children: [
+%{
+key: :tier,
+type: {:dropdown, :atom},
+description: "API Tier",
+suggestions: [:free, :pro]
+},
+%{
+key: :api_key,
+type: :string,
+description: "API key for DeepL",
+suggestions: [nil]
+}
+]
+},
+%{
+group: :pleroma,
+key: :libre_translate,
+type: :group,
+description: "LibreTranslate Settings.",
+children: [
+%{
+key: :url,
+type: :string,
+description: "URL for libretranslate",
+suggestion: [nil]
+},
+%{
+key: :api_key,
+type: :string,
+description: "API key for libretranslate",
+suggestion: [nil]
+}
+]
 }
 ]


@ -1159,3 +1159,28 @@ Each job has these settings:
 * `:max_running` - max concurrently running jobs
 * `:max_waiting` - max waiting jobs
+### Translation Settings
+Settings to automatically translate statuses for end users. Currently supported
+translation services are DeepL and LibreTranslate.
+Translations are available at `/api/v1/statuses/:id/translations/:language`, where
+`language` is the target language code (e.g. `en`).
+### `:translator`
+- `:enabled` - enables translation
+- `:module` - sets the module to be used
+  - Either `Pleroma.Akkoma.Translators.DeepL` or `Pleroma.Akkoma.Translators.LibreTranslate`
+### `:deepl`
+- `:api_key` - API key for DeepL
+- `:tier` - API tier
+  - either `:free` or `:pro`
+### `:libre_translate`
+- `:url` - URL of the LibreTranslate instance
+- `:api_key` - API key for LibreTranslate
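For reference, a minimal sketch of what this configuration could look like (illustrative values only; the choice of LibreTranslate and the local URL are assumptions, not defaults beyond what ships in `config.exs` above):

```elixir
# Minimal sketch, e.g. in config/prod.secret.exs (or set via AdminFE).
# Assumes a LibreTranslate instance running locally; use the DeepL module
# and a :deepl block instead if you use that service.
import Config

config :pleroma, :translator,
  enabled: true,
  module: Pleroma.Akkoma.Translators.LibreTranslate

config :pleroma, :libre_translate,
  url: "http://127.0.0.1:5000",
  api_key: nil
```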


@ -21,7 +21,7 @@ This will only save the theme for you personally. To make it available to the wh
 ### Upload the theme to the server
-Themes can be found in the [static directory](static_dir.md). Create `STATIC-DIR/static/themes/` if needed and copy your theme there. Next you need to add an entry for your theme to `STATIC-DIR/static/styles.json`. If you use a from source installation, you'll first need to copy the file from `priv/static/static/styles.json`.
+Themes can be found in the [static directory](static_dir.md). Create `STATIC-DIR/static/themes/` if needed and copy your theme there. Next you need to add an entry for your theme to `STATIC-DIR/static/styles.json`. If you use a from source installation, you'll first need to copy the file from `STATIC-DIR/frontends/pleroma-fe/REF/static/styles.json` (where `REF` is `stable` or `develop` depending on which ref you decided to install).
 Example of `styles.json` where we add our own `my-awesome-theme.json`
 ```json


@ -40,6 +40,10 @@ Has these additional fields under the `pleroma` object:
 - `parent_visible`: If the parent of this post is visible to the user or not.
 - `pinned_at`: a datetime (iso8601) when status was pinned, `null` otherwise.
+The `GET /api/v1/statuses/:id/source` endpoint additionally has the following attributes:
+- `content_type`: The content type of the status source.
 ## Scheduled statuses
 Has these additional fields in `params`:


@ -221,6 +221,8 @@ If your instance is up and running, you can create your first user with administ
 doas -u akkoma env MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 ```
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -212,6 +212,8 @@ If your instance is up and running, you can create your first user with administ
 sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 ```
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -175,6 +175,8 @@ If your instance is up and running, you can create your first user with administ
 sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 ```
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -199,6 +199,8 @@ If your instance is up and running, you can create your first user with administ
 sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 ```
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -206,6 +206,9 @@ If your instance is up and running, you can create your first user with administ
 ```shell
 sudo -Hu akkoma MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 ```
+{! installation/frontends.include !}
 ## Conclusion
 Restart nginx with `# service nginx restart` and you should be up and running.


@ -0,0 +1,25 @@
#### Installing Frontends
Once your backend server is functional, you'll probably also want to install frontends.
These are no longer bundled with the distribution and need an extra
command to install.
For most installations, the following will suffice:
=== "OTP"
```sh
./bin/pleroma_ctl frontend install pleroma-fe --ref stable
# and also, if desired
./bin/pleroma_ctl frontend install admin-fe --ref stable
```
=== "From Source"
```sh
mix pleroma.frontend install pleroma-fe --ref stable
mix pleroma.frontend install admin-fe --ref stable
```
For more customised installations, refer to [Frontend Management](../../configuration/frontend_management)


@ -293,6 +293,8 @@ akkoma$ MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 If you opted to allow sudo for the `akkoma` user but would like to remove the ability for greater security, now might be a good time to edit `/etc/sudoers` and/or change the groups the `akkoma` user belongs to. Be sure to restart the akkoma service afterwards to ensure it picks up on the changes.
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -1,7 +1,5 @@
 # Migrating to Akkoma
-**Akkoma does not currently have a stable release, until 3.0, all builds should be considered "develop"**
 ## Why should you migrate?
 aside from actually responsive maintainer(s)? let's lookie here, we've got:
@ -11,6 +9,8 @@ aside from actually responsive maintainer(s)? let's lookie here, we've got:
 - elasticsearch support (because pleroma search is GARBAGE)
 - latest develop pleroma-fe additions
 - local-only posting
+- automatic post translation
+- the mastodon frontend back in all its glory
 - probably more, this is like 3.5 years of IHBA additions finally compiled
 ## Actually migrating
@ -43,14 +43,14 @@ This will just be setting the update URL - find your flavour from the [mapping o
 ```bash
 export FLAVOUR=[the flavour you found above]
-./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/develop/akkoma-$FLAVOUR.zip
+./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-$FLAVOUR.zip
 ./bin/pleroma_ctl migrate
 ```
 Then restart. When updating in the future, you can just use
 ```bash
-./bin/pleroma_ctl update --branch develop
+./bin/pleroma_ctl update --branch stable
 ```
 ## Frontend changes
@ -62,17 +62,18 @@ your upgrade path here depends on your setup
 You'll need to run a couple of commands,
-```bash
-# From source
-mix pleroma.frontend install pleroma-fe
-# you'll probably want this too
-mix pleroma.frontend install admin-fe
-# OTP
-./bin/pleroma_ctl frontend install pleroma-fe
-# you'll probably want this too
-./bin/pleroma_ctl frontend install admin-fe
+=== "OTP"
+```sh
+./bin/pleroma_ctl frontend install pleroma-fe --ref stable
+# and also, if desired
+./bin/pleroma_ctl frontend install admin-fe --ref stable
+```
+=== "From Source"
+```sh
+mix pleroma.frontend install pleroma-fe --ref stable
+mix pleroma.frontend install admin-fe --ref stable
 ```
 ### I've run the mix task to install a frontend


@ -202,6 +202,8 @@ incorrect timestamps. You should have ntpd running.
 * <https://catgirl.science>
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -250,6 +250,8 @@ If your instance is up and running, you can create your first user with administ
 LC_ALL=en_US.UTF-8 MIX_ENV=prod mix pleroma.user new <username> <your@emailaddress> --admin
 ```
+{! installation/frontends.include !}
 #### Further reading
 {! installation/further_reading.include !}


@ -306,6 +306,8 @@ su akkoma -s $SHELL -lc "./bin/pleroma_ctl user new joeuser joeuser@sld.tld --ad
 ```
 This will create an account with the username of 'joeuser' and the email address of joeuser@sld.tld, and set that user's account as an admin. This will result in a link that you can paste into the browser, which logs you in and enables you to set the password.
+{! installation/frontends.include !}
 ## Further reading
 {! installation/further_reading.include !}


@ -279,6 +279,7 @@ After that, run the `pleroma_ctl migrate` command as usual to perform database m
 As it currently stands, your OTP build will only be compatible with the specific RedHat distribution you've built it on. Fedora builds only work on Fedora, Centos builds only on Centos, RedHat builds only on RedHat. Secondly, for Fedora, they will also be bound to the specific Fedora release. This is because different releases of Fedora may have significant changes made in some of the required packages and libraries.
+{! installation/frontends.include !}
 {! installation/further_reading.include !}


@ -0,0 +1,66 @@
# Verifying OTP release integrity
All stable OTP releases are cryptographically signed, to allow
you to verify their integrity if you choose to.
Releases are signed with [Signify](https://man.openbsd.org/signify.1),
with [the public key in the main repository](https://akkoma.dev/AkkomaGang/akkoma/src/branch/develop/SIGNING_KEY.pub).
Release URLs will always be of the form
```
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip
```
Where branch is usually `stable` or `develop`, and `flavour` is
the one [that you detect on install](../otp_en/#detecting-flavour).
So, for an AMD64 stable install, your update URL will be
```
https://akkoma-updates.s3-website.fr-par.scw.cloud/stable/akkoma-amd64.zip
```
To verify the integrity of this file, we have two helper files
```
# Checksums
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip.sha256
# Signify signature of the hashes
https://akkoma-updates.s3-website.fr-par.scw.cloud/{branch}/akkoma-{flavour}.zip.sha256.sig
```
Thus, to upgrade manually, with integrity checking, consider the following script:
```bash
#!/bin/bash
set -eo pipefail
export FLAVOUR=amd64
export BRANCH=stable
# Fetch signing key
curl --silent https://akkoma.dev/AkkomaGang/akkoma/raw/branch/$BRANCH/SIGNING_KEY.pub -o AKKOMA_SIGNING_KEY.pub
# Download zip file and sig files
wget -q https://akkoma-updates.s3-website.fr-par.scw.cloud/$BRANCH/akkoma-$FLAVOUR{.zip,.zip.sha256,.zip.sha256.sig}
# Verify zip file's sha256 integrity
sha256sum --check akkoma-$FLAVOUR.zip.sha256
# Verify hash file's integrity
# Signify might be under the `signify` command, depending on your distribution
signify-openbsd -V -p AKKOMA_SIGNING_KEY.pub -m akkoma-$FLAVOUR.zip.sha256
# We're good, use that URL
echo "Update URL contents verified"
echo "use"
echo "./bin/pleroma_ctl update --zip-url https://akkoma-updates.s3-website.fr-par.scw.cloud/$BRANCH/akkoma-$FLAVOUR.zip"
echo "to update your instance"
# Clean up
rm akkoma-$FLAVOUR.zip
rm akkoma-$FLAVOUR.zip.sha256
rm akkoma-$FLAVOUR.zip.sha256.sig
```


@ -8,6 +8,40 @@ defmodule Pleroma.Activity.HTML do
 @cachex Pleroma.Config.get([:cachex, :provider], Cachex)
+# We store a list of cache keys related to an activity in a
+# separate cache, scrubber_management_cache. It has the same
+# size as scrubber_cache (see application.ex). Every time we add
+# a cache to scrubber_cache, we update scrubber_management_cache.
+#
+# The most recent write of a certain key in the management cache
+# is the same as the most recent write of any record related to that
+# key in the main cache.
+# Assuming LRW ( https://hexdocs.pm/cachex/Cachex.Policy.LRW.html ),
+# this means when the management cache is evicted by cachex, all
+# related records in the main cache will also have been evicted.
+defp get_cache_keys_for(activity_id) do
+with {:ok, list} when is_list(list) <- @cachex.get(:scrubber_management_cache, activity_id) do
+list
+else
+_ -> []
+end
+end
+defp add_cache_key_for(activity_id, additional_key) do
+current = get_cache_keys_for(activity_id)
+unless additional_key in current do
+@cachex.put(:scrubber_management_cache, activity_id, [additional_key | current])
+end
+end
+def invalidate_cache_for(activity_id) do
+keys = get_cache_keys_for(activity_id)
+Enum.map(keys, &@cachex.del(:scrubber_cache, &1))
+@cachex.del(:scrubber_management_cache, activity_id)
+end
 def get_cached_scrubbed_html_for_activity(
 content,
 scrubbers,
@ -19,6 +53,8 @@ def get_cached_scrubbed_html_for_activity(
 @cachex.fetch!(:scrubber_cache, key, fn _key ->
 object = Object.normalize(activity, fetch: false)
+add_cache_key_for(activity.id, key)
 HTML.ensure_scrubbed_html(content, scrubbers, object.data["fake"] || false, callback)
 end)
 end
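As a usage sketch of the cache-management functions above (the helper below is hypothetical and not part of this diff), an edit to a post can invalidate all of its cached scrubbed HTML so the next render re-scrubs the updated content:

```elixir
defmodule MyApp.ScrubberCacheExample do
  # Hypothetical helper: after an activity's object is updated, drop its
  # cached scrubbed HTML via the public function added in this commit.
  def refresh_scrubbed_html(%Pleroma.Activity{id: id}) do
    Pleroma.Activity.HTML.invalidate_cache_for(id)
  end
end
```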


@ -0,0 +1,100 @@
defmodule Pleroma.Akkoma.Translators.DeepL do
@behaviour Pleroma.Akkoma.Translator
alias Pleroma.HTTP
alias Pleroma.Config
require Logger
defp base_url(:free) do
"https://api-free.deepl.com/v2/"
end
defp base_url(:pro) do
"https://api.deepl.com/v2/"
end
defp api_key do
Config.get([:deepl, :api_key])
end
defp tier do
Config.get([:deepl, :tier])
end
@impl Pleroma.Akkoma.Translator
def languages do
with {:ok, %{status: 200} = source_response} <- do_languages("source"),
{:ok, %{status: 200} = dest_response} <- do_languages("target"),
{:ok, source_body} <- Jason.decode(source_response.body),
{:ok, dest_body} <- Jason.decode(dest_response.body) do
source_resp =
Enum.map(source_body, fn %{"language" => code, "name" => name} ->
%{code: code, name: name}
end)
dest_resp =
Enum.map(dest_body, fn %{"language" => code, "name" => name} ->
%{code: code, name: name}
end)
{:ok, source_resp, dest_resp}
else
{:ok, %{status: status} = response} ->
Logger.warning("DeepL: Request rejected: #{inspect(response)}")
{:error, "DeepL request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
@impl Pleroma.Akkoma.Translator
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <-
do_request(api_key(), tier(), string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
%{"translations" => [%{"text" => translated, "detected_source_language" => detected}]} =
body
{:ok, detected, translated}
else
{:ok, %{status: status} = response} ->
Logger.warning("DeepL: Request rejected: #{inspect(response)}")
{:error, "DeepL request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
defp do_request(api_key, tier, string, from_language, to_language) do
HTTP.post(
base_url(tier) <> "translate",
URI.encode_query(
%{
text: string,
target_lang: to_language,
tag_handling: "html"
}
|> maybe_add_source(from_language),
:rfc3986
),
[
{"authorization", "DeepL-Auth-Key #{api_key}"},
{"content-type", "application/x-www-form-urlencoded"}
]
)
end
defp maybe_add_source(opts, nil), do: opts
defp maybe_add_source(opts, lang), do: Map.put(opts, :source_lang, lang)
defp do_languages(type) do
HTTP.get(
base_url(tier()) <> "languages?type=#{type}",
[
{"authorization", "DeepL-Auth-Key #{api_key()}"}
]
)
end
end
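As a quick illustration of how this translator is exercised (not part of this diff; it assumes a configured DeepL API key), a caller pattern-matches the `{:ok, detected, translated}` / `{:error, reason}` shapes returned by `translate/3`:

```elixir
# Illustrative only -- passing nil as the source language lets DeepL auto-detect it.
case Pleroma.Akkoma.Translators.DeepL.translate("Hallo Welt", nil, "EN") do
  {:ok, detected_language, translated_text} ->
    IO.puts("detected #{detected_language}: #{translated_text}")

  {:error, reason} ->
    IO.puts("translation failed: #{inspect(reason)}")
end
```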


@ -0,0 +1,82 @@
defmodule Pleroma.Akkoma.Translators.LibreTranslate do
@behaviour Pleroma.Akkoma.Translator
alias Pleroma.Config
alias Pleroma.HTTP
require Logger
defp api_key do
Config.get([:libre_translate, :api_key])
end
defp url do
Config.get([:libre_translate, :url])
end
@impl Pleroma.Akkoma.Translator
def languages do
with {:ok, %{status: 200} = response} <- do_languages(),
{:ok, body} <- Jason.decode(response.body) do
resp = Enum.map(body, fn %{"code" => code, "name" => name} -> %{code: code, name: name} end)
# No separate source/dest
{:ok, resp, resp}
else
{:ok, %{status: status} = response} ->
Logger.warning("LibreTranslate: Request rejected: #{inspect(response)}")
{:error, "LibreTranslate request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
@impl Pleroma.Akkoma.Translator
def translate(string, from_language, to_language) do
with {:ok, %{status: 200} = response} <- do_request(string, from_language, to_language),
{:ok, body} <- Jason.decode(response.body) do
%{"translatedText" => translated} = body
detected =
if Map.has_key?(body, "detectedLanguage") do
get_in(body, ["detectedLanguage", "language"])
else
from_language
end
{:ok, detected, translated}
else
{:ok, %{status: status} = response} ->
Logger.warning("libre_translate: request failed, #{inspect(response)}")
{:error, "libre_translate: request failed (code #{status})"}
{:error, reason} ->
{:error, reason}
end
end
defp do_request(string, from_language, to_language) do
url = URI.parse(url())
url = %{url | path: "/translate"}
HTTP.post(
to_string(url),
Jason.encode!(%{
q: string,
source: if(is_nil(from_language), do: "auto", else: from_language),
target: to_language,
format: "html",
api_key: api_key()
}),
[
{"content-type", "application/json"}
]
)
end
defp do_languages() do
url = URI.parse(url())
url = %{url | path: "/languages"}
HTTP.get(to_string(url))
end
end


@ -0,0 +1,8 @@
defmodule Pleroma.Akkoma.Translator do
@callback translate(String.t(), String.t() | nil, String.t()) ::
{:ok, String.t(), String.t()} | {:error, any()}
@callback languages() ::
{:ok, [%{name: String.t(), code: String.t()}],
[%{name: String.t(), code: String.t()}]}
| {:error, any()}
end
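To make the contract concrete, a minimal (hypothetical) implementation of this behaviour might look like the following; `NullTranslator` is not part of this diff and simply echoes its input:

```elixir
defmodule Pleroma.Akkoma.Translators.NullTranslator do
  @behaviour Pleroma.Akkoma.Translator

  # "Translates" by returning the text unchanged; useful only as an illustration.
  @impl Pleroma.Akkoma.Translator
  def translate(string, from_language, _to_language) do
    {:ok, from_language || "und", string}
  end

  @impl Pleroma.Akkoma.Translator
  def languages do
    langs = [%{code: "en", name: "English"}]
    {:ok, langs, langs}
  end
end
```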


@ -150,11 +150,13 @@ defp cachex_children do
build_cachex("object", default_ttl: 25_000, ttl_interval: 1000, limit: 2500), build_cachex("object", default_ttl: 25_000, ttl_interval: 1000, limit: 2500),
build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000), build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000),
build_cachex("scrubber", limit: 2500), build_cachex("scrubber", limit: 2500),
build_cachex("scrubber_management", limit: 2500),
build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500), build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500),
build_cachex("web_resp", limit: 2500), build_cachex("web_resp", limit: 2500),
build_cachex("emoji_packs", expiration: emoji_packs_expiration(), limit: 10), build_cachex("emoji_packs", expiration: emoji_packs_expiration(), limit: 10),
build_cachex("failed_proxy_url", limit: 2500), build_cachex("failed_proxy_url", limit: 2500),
build_cachex("banned_urls", default_ttl: :timer.hours(24 * 30), limit: 5_000) build_cachex("banned_urls", default_ttl: :timer.hours(24 * 30), limit: 5_000),
build_cachex("translations", default_ttl: :timer.hours(24 * 30), limit: 2500)
] ]
end end


@ -38,7 +38,6 @@ def start_link(restart_pleroma? \\ true) do
 def load_and_update_env(deleted_settings \\ [], restart_pleroma? \\ true) do
 with {_, true} <- {:configurable, Config.get(:configurable_from_database)} do
-# We need to restart applications for loaded settings take effect
 {logger, other} =
 (Repo.all(ConfigDB) ++ deleted_settings)
 |> Enum.map(&merge_with_default/1)
@ -85,7 +84,12 @@ defp maybe_set_pleroma_last(apps) do
 end
 defp merge_with_default(%{group: group, key: key, value: value} = setting) do
-default = Config.Holder.default_config(group, key)
+default =
+if group == :pleroma do
+Config.get([key], Config.Holder.default_config(group, key))
+else
+Config.Holder.default_config(group, key)
+end
 merged =
 cond do


@ -27,4 +27,40 @@ defmodule Pleroma.Constants do
 do:
 ~w(index.html robots.txt static static-fe finmoji emoji packs sounds images instance sw.js sw-pleroma.js favicon.png schemas doc embed.js embed.css)
 )
+const(status_updatable_fields,
+do: [
+"source",
+"tag",
+"updated",
+"emoji",
+"content",
+"summary",
+"sensitive",
+"attachment",
+"generator"
+]
+)
+const(updatable_object_types,
+do: [
+"Note",
+"Question",
+"Audio",
+"Video",
+"Event",
+"Article",
+"Page"
+]
+)
+const(actor_types,
+do: [
+"Application",
+"Group",
+"Organization",
+"Person",
+"Service"
+]
+)
 end
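The `const` macro exposes each of these as a zero-arity call, so consumers can do simple membership checks; a small sketch (the surrounding module is hypothetical, not part of this diff):

```elixir
defmodule MyApp.UpdateChecks do
  # Constants are generated as macros, so the module must be required.
  require Pleroma.Constants

  # True if an incoming Update targets an object type that supports edits.
  def updatable?(%{"type" => type}) do
    type in Pleroma.Constants.updatable_object_types()
  end
end
```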


@ -1,13 +1,13 @@
 # emoji-test.txt
-# Date: 2021-08-26, 17:22:23 GMT
+# Date: 2022-08-12, 20:24:39 GMT
-# © 2021 Unicode®, Inc.
+# © 2022 Unicode®, Inc.
 # Unicode and the Unicode Logo are registered trademarks of Unicode, Inc. in the U.S. and other countries.
-# For terms of use, see http://www.unicode.org/terms_of_use.html
+# For terms of use, see https://www.unicode.org/terms_of_use.html
 #
 # Emoji Keyboard/Display Test Data for UTS #51
-# Version: 14.0
+# Version: 15.0
 #
-# For documentation and usage, see http://www.unicode.org/reports/tr51
+# For documentation and usage, see https://www.unicode.org/reports/tr51
 #
 # This file provides data for testing which emoji forms should be in keyboards and which should also be displayed/processed.
 # Format: code points; status # emoji name
@ -92,6 +92,7 @@
 1F62C ; fully-qualified # 😬 E1.0 grimacing face
 1F62E 200D 1F4A8 ; fully-qualified # 😮‍💨 E13.1 face exhaling
 1F925 ; fully-qualified # 🤥 E3.0 lying face
+1FAE8 ; fully-qualified # 🫨 E15.0 shaking face
 # subgroup: face-sleepy
 1F60C ; fully-qualified # 😌 E0.6 relieved face
@ -155,7 +156,7 @@
 # subgroup: face-negative
 1F624 ; fully-qualified # 😤 E0.6 face with steam from nose
-1F621 ; fully-qualified # 😡 E0.6 pouting face
+1F621 ; fully-qualified # 😡 E0.6 enraged face
 1F620 ; fully-qualified # 😠 E0.6 angry face
 1F92C ; fully-qualified # 🤬 E5.0 face with symbols on mouth
 1F608 ; fully-qualified # 😈 E1.0 smiling face with horns
@ -190,8 +191,7 @@
 1F649 ; fully-qualified # 🙉 E0.6 hear-no-evil monkey
 1F64A ; fully-qualified # 🙊 E0.6 speak-no-evil monkey
-# subgroup: emotion
+# subgroup: heart
-1F48B ; fully-qualified # 💋 E0.6 kiss mark
 1F48C ; fully-qualified # 💌 E0.6 love letter
 1F498 ; fully-qualified # 💘 E0.6 heart with arrow
 1F49D ; fully-qualified # 💝 E0.6 heart with ribbon
@ -210,14 +210,20 @@
 2764 200D 1FA79 ; unqualified # ❤‍🩹 E13.1 mending heart
 2764 FE0F ; fully-qualified # ❤️ E0.6 red heart
 2764 ; unqualified # ❤ E0.6 red heart
+1FA77 ; fully-qualified # 🩷 E15.0 pink heart
 1F9E1 ; fully-qualified # 🧡 E5.0 orange heart
 1F49B ; fully-qualified # 💛 E0.6 yellow heart
 1F49A ; fully-qualified # 💚 E0.6 green heart
 1F499 ; fully-qualified # 💙 E0.6 blue heart
+1FA75 ; fully-qualified # 🩵 E15.0 light blue heart
 1F49C ; fully-qualified # 💜 E0.6 purple heart
 1F90E ; fully-qualified # 🤎 E12.0 brown heart
 1F5A4 ; fully-qualified # 🖤 E3.0 black heart
+1FA76 ; fully-qualified # 🩶 E15.0 grey heart
 1F90D ; fully-qualified # 🤍 E12.0 white heart
+# subgroup: emotion
+1F48B ; fully-qualified # 💋 E0.6 kiss mark
 1F4AF ; fully-qualified # 💯 E0.6 hundred points
 1F4A2 ; fully-qualified # 💢 E0.6 anger symbol
 1F4A5 ; fully-qualified # 💥 E0.6 collision
@ -226,21 +232,20 @@
 1F4A8 ; fully-qualified # 💨 E0.6 dashing away
 1F573 FE0F ; fully-qualified # 🕳️ E0.7 hole
 1F573 ; unqualified # 🕳 E0.7 hole
-1F4A3 ; fully-qualified # 💣 E0.6 bomb
 1F4AC ; fully-qualified # 💬 E0.6 speech balloon
 1F441 FE0F 200D 1F5E8 FE0F ; fully-qualified # 👁️‍🗨️ E2.0 eye in speech bubble
 1F441 200D 1F5E8 FE0F ; unqualified # 👁‍🗨️ E2.0 eye in speech bubble
-1F441 FE0F 200D 1F5E8 ; unqualified # 👁️‍🗨 E2.0 eye in speech bubble
+1F441 FE0F 200D 1F5E8 ; minimally-qualified # 👁️‍🗨 E2.0 eye in speech bubble
 1F441 200D 1F5E8 ; unqualified # 👁‍🗨 E2.0 eye in speech bubble
 1F5E8 FE0F ; fully-qualified # 🗨️ E2.0 left speech bubble
 1F5E8 ; unqualified # 🗨 E2.0 left speech bubble
 1F5EF FE0F ; fully-qualified # 🗯️ E0.7 right anger bubble
 1F5EF ; unqualified # 🗯 E0.7 right anger bubble
 1F4AD ; fully-qualified # 💭 E1.0 thought balloon
-1F4A4 ; fully-qualified # 💤 E0.6 zzz
+1F4A4 ; fully-qualified # 💤 E0.6 ZZZ
-# Smileys & Emotion subtotal: 177
+# Smileys & Emotion subtotal: 180
-# Smileys & Emotion subtotal: 177 w/o modifiers
+# Smileys & Emotion subtotal: 180 w/o modifiers
 # group: People & Body
@ -300,6 +305,18 @@
 1FAF4 1F3FD ; fully-qualified # 🫴🏽 E14.0 palm up hand: medium skin tone
 1FAF4 1F3FE ; fully-qualified # 🫴🏾 E14.0 palm up hand: medium-dark skin tone
 1FAF4 1F3FF ; fully-qualified # 🫴🏿 E14.0 palm up hand: dark skin tone
+1FAF7 ; fully-qualified # 🫷 E15.0 leftwards pushing hand
+1FAF7 1F3FB ; fully-qualified # 🫷🏻 E15.0 leftwards pushing hand: light skin tone
+1FAF7 1F3FC ; fully-qualified # 🫷🏼 E15.0 leftwards pushing hand: medium-light skin tone
+1FAF7 1F3FD ; fully-qualified # 🫷🏽 E15.0 leftwards pushing hand: medium skin tone
+1FAF7 1F3FE ; fully-qualified # 🫷🏾 E15.0 leftwards pushing hand: medium-dark skin tone
+1FAF7 1F3FF ; fully-qualified # 🫷🏿 E15.0 leftwards pushing hand: dark skin tone
+1FAF8 ; fully-qualified # 🫸 E15.0 rightwards pushing hand
+1FAF8 1F3FB ; fully-qualified # 🫸🏻 E15.0 rightwards pushing hand: light skin tone
+1FAF8 1F3FC ; fully-qualified # 🫸🏼 E15.0 rightwards pushing hand: medium-light skin tone
+1FAF8 1F3FD ; fully-qualified # 🫸🏽 E15.0 rightwards pushing hand: medium skin tone
+1FAF8 1F3FE ; fully-qualified # 🫸🏾 E15.0 rightwards pushing hand: medium-dark skin tone
+1FAF8 1F3FF ; fully-qualified # 🫸🏿 E15.0 rightwards pushing hand: dark skin tone
 # subgroup: hand-fingers-partial
 1F44C ; fully-qualified # 👌 E0.6 OK hand
@ -473,11 +490,11 @@
 1F932 1F3FE ; fully-qualified # 🤲🏾 E5.0 palms up together: medium-dark skin tone
 1F932 1F3FF ; fully-qualified # 🤲🏿 E5.0 palms up together: dark skin tone
 1F91D ; fully-qualified # 🤝 E3.0 handshake
-1F91D 1F3FB ; fully-qualified # 🤝🏻 E3.0 handshake: light skin tone
+1F91D 1F3FB ; fully-qualified # 🤝🏻 E14.0 handshake: light skin tone
-1F91D 1F3FC ; fully-qualified # 🤝🏼 E3.0 handshake: medium-light skin tone
+1F91D 1F3FC ; fully-qualified # 🤝🏼 E14.0 handshake: medium-light skin tone
-1F91D 1F3FD ; fully-qualified # 🤝🏽 E3.0 handshake: medium skin tone
+1F91D 1F3FD ; fully-qualified # 🤝🏽 E14.0 handshake: medium skin tone
-1F91D 1F3FE ; fully-qualified # 🤝🏾 E3.0 handshake: medium-dark skin tone
+1F91D 1F3FE ; fully-qualified # 🤝🏾 E14.0 handshake: medium-dark skin tone
-1F91D 1F3FF ; fully-qualified # 🤝🏿 E3.0 handshake: dark skin tone
+1F91D 1F3FF ; fully-qualified # 🤝🏿 E14.0 handshake: dark skin tone
 1FAF1 1F3FB 200D 1FAF2 1F3FC ; fully-qualified # 🫱🏻‍🫲🏼 E14.0 handshake: light skin tone, medium-light skin tone
 1FAF1 1F3FB 200D 1FAF2 1F3FD ; fully-qualified # 🫱🏻‍🫲🏽 E14.0 handshake: light skin tone, medium skin tone
 1FAF1 1F3FB 200D 1FAF2 1F3FE ; fully-qualified # 🫱🏻‍🫲🏾 E14.0 handshake: light skin tone, medium-dark skin tone
@ -1455,7 +1472,7 @@
 1F575 1F3FF ; fully-qualified # 🕵🏿 E2.0 detective: dark skin tone
 1F575 FE0F 200D 2642 FE0F ; fully-qualified # 🕵️‍♂️ E4.0 man detective
 1F575 200D 2642 FE0F ; unqualified # 🕵‍♂️ E4.0 man detective
-1F575 FE0F 200D 2642 ; unqualified # 🕵️‍♂ E4.0 man detective
+1F575 FE0F 200D 2642 ; minimally-qualified # 🕵️‍♂ E4.0 man detective
 1F575 200D 2642 ; unqualified # 🕵‍♂ E4.0 man detective
 1F575 1F3FB 200D 2642 FE0F ; fully-qualified # 🕵🏻‍♂️ E4.0 man detective: light skin tone
 1F575 1F3FB 200D 2642 ; minimally-qualified # 🕵🏻‍♂ E4.0 man detective: light skin tone
@ -1469,7 +1486,7 @@
 1F575 1F3FF 200D 2642 ; minimally-qualified # 🕵🏿‍♂ E4.0 man detective: dark skin tone
 1F575 FE0F 200D 2640 FE0F ; fully-qualified # 🕵️‍♀️ E4.0 woman detective
 1F575 200D 2640 FE0F ; unqualified # 🕵‍♀️ E4.0 woman detective
-1F575 FE0F 200D 2640 ; unqualified # 🕵️‍♀ E4.0 woman detective
+1F575 FE0F 200D 2640 ; minimally-qualified # 🕵️‍♀ E4.0 woman detective
 1F575 200D 2640 ; unqualified # 🕵‍♀ E4.0 woman detective
 1F575 1F3FB 200D 2640 FE0F ; fully-qualified # 🕵🏻‍♀️ E4.0 woman detective: light skin tone
 1F575 1F3FB 200D 2640 ; minimally-qualified # 🕵🏻‍♀ E4.0 woman detective: light skin tone
@ -2302,7 +2319,7 @@
 1F3CC 1F3FF ; fully-qualified # 🏌🏿 E4.0 person golfing: dark skin tone
 1F3CC FE0F 200D 2642 FE0F ; fully-qualified # 🏌️‍♂️ E4.0 man golfing
 1F3CC 200D 2642 FE0F ; unqualified # 🏌‍♂️ E4.0 man golfing
-1F3CC FE0F 200D 2642 ; unqualified # 🏌️‍♂ E4.0 man golfing
+1F3CC FE0F 200D 2642 ; minimally-qualified # 🏌️‍♂ E4.0 man golfing
 1F3CC 200D 2642 ; unqualified # 🏌‍♂ E4.0 man golfing
 1F3CC 1F3FB 200D 2642 FE0F ; fully-qualified # 🏌🏻‍♂️ E4.0 man golfing: light skin tone
 1F3CC 1F3FB 200D 2642 ; minimally-qualified # 🏌🏻‍♂ E4.0 man golfing: light skin tone
@ -2316,7 +2333,7 @@
 1F3CC 1F3FF 200D 2642 ; minimally-qualified # 🏌🏿‍♂ E4.0 man golfing: dark skin tone
 1F3CC FE0F 200D 2640 FE0F ; fully-qualified # 🏌️‍♀️ E4.0 woman golfing
 1F3CC 200D 2640 FE0F ; unqualified # 🏌‍♀️ E4.0 woman golfing
-1F3CC FE0F 200D 2640 ; unqualified # 🏌️‍♀ E4.0 woman golfing
+1F3CC FE0F 200D 2640 ; minimally-qualified # 🏌️‍♀ E4.0 woman golfing
 1F3CC 200D 2640 ; unqualified # 🏌‍♀ E4.0 woman golfing
 1F3CC 1F3FB 200D 2640 FE0F ; fully-qualified # 🏌🏻‍♀️ E4.0 woman golfing: light skin tone
 1F3CC 1F3FB 200D 2640 ; minimally-qualified # 🏌🏻‍♀ E4.0 woman golfing: light skin tone
@ -2427,7 +2444,7 @@
 26F9 1F3FF ; fully-qualified # ⛹🏿 E2.0 person bouncing ball: dark skin tone
 26F9 FE0F 200D 2642 FE0F ; fully-qualified # ⛹️‍♂️ E4.0 man bouncing ball
 26F9 200D 2642 FE0F ; unqualified # ⛹‍♂️ E4.0 man bouncing ball
-26F9 FE0F 200D 2642 ; unqualified # ⛹️‍♂ E4.0 man bouncing ball
+26F9 FE0F 200D 2642 ; minimally-qualified # ⛹️‍♂ E4.0 man bouncing ball
 26F9 200D 2642 ; unqualified # ⛹‍♂ E4.0 man bouncing ball
 26F9 1F3FB 200D 2642 FE0F ; fully-qualified # ⛹🏻‍♂️ E4.0 man bouncing ball: light skin tone
 26F9 1F3FB 200D 2642 ; minimally-qualified # ⛹🏻‍♂ E4.0 man bouncing ball: light skin tone
@ -2441,7 +2458,7 @@
 26F9 1F3FF 200D 2642 ; minimally-qualified # ⛹🏿‍♂ E4.0 man bouncing ball: dark skin tone
 26F9 FE0F 200D 2640 FE0F ; fully-qualified # ⛹️‍♀️ E4.0 woman bouncing ball
 26F9 200D 2640 FE0F ; unqualified # ⛹‍♀️ E4.0 woman bouncing ball
-26F9 FE0F 200D 2640 ; unqualified # ⛹️‍♀ E4.0 woman bouncing ball
+26F9 FE0F 200D 2640 ; minimally-qualified # ⛹️‍♀ E4.0 woman bouncing ball
 26F9 200D 2640 ; unqualified # ⛹‍♀ E4.0 woman bouncing ball
 26F9 1F3FB 200D 2640 FE0F ; fully-qualified # ⛹🏻‍♀️ E4.0 woman bouncing ball: light skin tone
 26F9 1F3FB 200D 2640 ; minimally-qualified # ⛹🏻‍♀ E4.0 woman bouncing ball: light skin tone
@ -2462,7 +2479,7 @@
 1F3CB 1F3FF ; fully-qualified # 🏋🏿 E2.0 person lifting weights: dark skin tone
 1F3CB FE0F 200D 2642 FE0F ; fully-qualified # 🏋️‍♂️ E4.0 man lifting weights
 1F3CB 200D 2642 FE0F ; unqualified # 🏋‍♂️ E4.0 man lifting weights
-1F3CB FE0F 200D 2642 ; unqualified # 🏋️‍♂ E4.0 man lifting weights
+1F3CB FE0F 200D 2642 ; minimally-qualified # 🏋️‍♂ E4.0 man lifting weights
 1F3CB 200D 2642 ; unqualified # 🏋‍♂ E4.0 man lifting weights
 1F3CB 1F3FB 200D 2642 FE0F ; fully-qualified # 🏋🏻‍♂️ E4.0 man lifting weights: light skin tone
 1F3CB 1F3FB 200D 2642 ; minimally-qualified # 🏋🏻‍♂ E4.0 man lifting weights: light skin tone
@ -2476,7 +2493,7 @@
 1F3CB 1F3FF 200D 2642 ; minimally-qualified # 🏋🏿‍♂ E4.0 man lifting weights: dark skin tone
 1F3CB FE0F 200D 2640 FE0F ; fully-qualified # 🏋️‍♀️ E4.0 woman lifting weights
 1F3CB 200D 2640 FE0F ; unqualified # 🏋‍♀️ E4.0 woman lifting weights
-1F3CB FE0F 200D 2640 ; unqualified # 🏋️‍♀ E4.0 woman lifting weights
+1F3CB FE0F 200D 2640 ; minimally-qualified # 🏋️‍♀ E4.0 woman lifting weights
 1F3CB 200D 2640 ; unqualified # 🏋‍♀ E4.0 woman lifting weights
 1F3CB 1F3FB 200D 2640 FE0F ; fully-qualified # 🏋🏻‍♀️ E4.0 woman lifting weights: light skin tone
 1F3CB 1F3FB 200D 2640 ; minimally-qualified # 🏋🏻‍♀ E4.0 woman lifting weights: light skin tone
@ -3262,8 +3279,8 @@
 1FAC2 ; fully-qualified # 🫂 E13.0 people hugging
 1F463 ; fully-qualified # 👣 E0.6 footprints
-# People & Body subtotal: 2986
+# People & Body subtotal: 2998
-# People & Body subtotal: 506 w/o modifiers
+# People & Body subtotal: 508 w/o modifiers
 # group: Component
@ -3306,6 +3323,8 @@
 1F405 ; fully-qualified # 🐅 E1.0 tiger
 1F406 ; fully-qualified # 🐆 E1.0 leopard
 1F434 ; fully-qualified # 🐴 E0.6 horse face
+1FACE ; fully-qualified # 🫎 E15.0 moose
+1FACF ; fully-qualified # 🫏 E15.0 donkey
 1F40E ; fully-qualified # 🐎 E0.6 horse
 1F984 ; fully-qualified # 🦄 E1.0 unicorn
 1F993 ; fully-qualified # 🦓 E5.0 zebra
@ -3373,6 +3392,9 @@
 1F9A9 ; fully-qualified # 🦩 E12.0 flamingo
 1F99A ; fully-qualified # 🦚 E11.0 peacock
 1F99C ; fully-qualified # 🦜 E11.0 parrot
+1FABD ; fully-qualified # 🪽 E15.0 wing
+1F426 200D 2B1B ; fully-qualified # 🐦‍⬛ E15.0 black bird
+1FABF ; fully-qualified # 🪿 E15.0 goose
 # subgroup: animal-amphibian
 1F438 ; fully-qualified # 🐸 E0.6 frog
@ -3399,6 +3421,7 @@
 1F419 ; fully-qualified # 🐙 E0.6 octopus
 1F41A ; fully-qualified # 🐚 E0.6 spiral shell
 1FAB8 ; fully-qualified # 🪸 E14.0 coral
+1FABC ; fully-qualified # 🪼 E15.0 jellyfish
 # subgroup: animal-bug
 1F40C ; fully-qualified # 🐌 E0.6 snail
@ -3433,6 +3456,7 @@
 1F33B ; fully-qualified # 🌻 E0.6 sunflower
 1F33C ; fully-qualified # 🌼 E0.6 blossom
 1F337 ; fully-qualified # 🌷 E0.6 tulip
+1FABB ; fully-qualified # 🪻 E15.0 hyacinth
 # subgroup: plant-other
 1F331 ; fully-qualified # 🌱 E0.6 seedling
@ -3451,9 +3475,10 @@
 1F343 ; fully-qualified # 🍃 E0.6 leaf fluttering in wind
 1FAB9 ; fully-qualified # 🪹 E14.0 empty nest
 1FABA ; fully-qualified # 🪺 E14.0 nest with eggs
+1F344 ; fully-qualified # 🍄 E0.6 mushroom
-# Animals & Nature subtotal: 151
+# Animals & Nature subtotal: 159
-# Animals & Nature subtotal: 151 w/o modifiers
+# Animals & Nature subtotal: 159 w/o modifiers
 # group: Food & Drink
@ -3492,10 +3517,11 @@
 1F966 ; fully-qualified # 🥦 E5.0 broccoli
 1F9C4 ; fully-qualified # 🧄 E12.0 garlic
 1F9C5 ; fully-qualified # 🧅 E12.0 onion
-1F344 ; fully-qualified # 🍄 E0.6 mushroom
 1F95C ; fully-qualified # 🥜 E3.0 peanuts
 1FAD8 ; fully-qualified # 🫘 E14.0 beans
 1F330 ; fully-qualified # 🌰 E0.6 chestnut
+1FADA ; fully-qualified # 🫚 E15.0 ginger root
+1FADB ; fully-qualified # 🫛 E15.0 pea pod
 # subgroup: food-prepared
 1F35E ; fully-qualified # 🍞 E0.6 bread
@ -3607,8 +3633,8 @@
 1FAD9 ; fully-qualified # 🫙 E14.0 jar
 1F3FA ; fully-qualified # 🏺 E1.0 amphora
-# Food & Drink subtotal: 134
+# Food & Drink subtotal: 135
-# Food & Drink subtotal: 134 w/o modifiers
+# Food & Drink subtotal: 135 w/o modifiers
 # group: Travel & Places
@ -3974,11 +4000,10 @@
 1F3AF ; fully-qualified # 🎯 E0.6 bullseye
 1FA80 ; fully-qualified # 🪀 E12.0 yo-yo
 1FA81 ; fully-qualified # 🪁 E12.0 kite
+1F52B ; fully-qualified # 🔫 E0.6 water pistol
 1F3B1 ; fully-qualified # 🎱 E0.6 pool 8 ball
 1F52E ; fully-qualified # 🔮 E0.6 crystal ball
 1FA84 ; fully-qualified # 🪄 E13.0 magic wand
-1F9FF ; fully-qualified # 🧿 E11.0 nazar amulet
-1FAAC ; fully-qualified # 🪬 E14.0 hamsa
 1F3AE ; fully-qualified # 🎮 E0.6 video game
 1F579 FE0F ; fully-qualified # 🕹️ E0.7 joystick
 1F579 ; unqualified # 🕹 E0.7 joystick
@ -4013,8 +4038,8 @@
 1F9F6 ; fully-qualified # 🧶 E11.0 yarn
 1FAA2 ; fully-qualified # 🪢 E13.0 knot
-# Activities subtotal: 97
+# Activities subtotal: 96
-# Activities subtotal: 97 w/o modifiers
+# Activities subtotal: 96 w/o modifiers
 # group: Objects
@ -4040,6 +4065,7 @@
 1FA73 ; fully-qualified # 🩳 E12.0 shorts
 1F459 ; fully-qualified # 👙 E0.6 bikini
 1F45A ; fully-qualified # 👚 E0.6 womans clothes
+1FAAD ; fully-qualified # 🪭 E15.0 folding hand fan
 1F45B ; fully-qualified # 👛 E0.6 purse
 1F45C ; fully-qualified # 👜 E0.6 handbag
 1F45D ; fully-qualified # 👝 E0.6 clutch bag
@ -4055,6 +4081,7 @@
 1F461 ; fully-qualified # 👡 E0.6 womans sandal
 1FA70 ; fully-qualified # 🩰 E12.0 ballet shoes
 1F462 ; fully-qualified # 👢 E0.6 womans boot
+1FAAE ; fully-qualified # 🪮 E15.0 hair pick
 1F451 ; fully-qualified # 👑 E0.6 crown
 1F452 ; fully-qualified # 👒 E0.6 womans hat
 1F3A9 ; fully-qualified # 🎩 E0.6 top hat
@ -4103,6 +4130,8 @@
 1FA95 ; fully-qualified # 🪕 E12.0 banjo
 1F941 ; fully-qualified # 🥁 E3.0 drum
 1FA98 ; fully-qualified # 🪘 E13.0 long drum
+1FA87 ; fully-qualified # 🪇 E15.0 maracas
+1FA88 ; fully-qualified # 🪈 E15.0 flute
 # subgroup: phone
 1F4F1 ; fully-qualified # 📱 E0.6 mobile phone
@ -4275,7 +4304,7 @@
 1F5E1 ; unqualified # 🗡 E0.7 dagger
 2694 FE0F ; fully-qualified # ⚔️ E1.0 crossed swords
 2694 ; unqualified # ⚔ E1.0 crossed swords
-1F52B ; fully-qualified # 🔫 E0.6 water pistol
+1F4A3 ; fully-qualified # 💣 E0.6 bomb
 1FA83 ; fully-qualified # 🪃 E13.0 boomerang
 1F3F9 ; fully-qualified # 🏹 E1.0 bow and arrow
 1F6E1 FE0F ; fully-qualified # 🛡️ E0.7 shield
@ -4354,12 +4383,14 @@
 1FAA6 ; fully-qualified # 🪦 E13.0 headstone
 26B1 FE0F ; fully-qualified # ⚱️ E1.0 funeral urn
 26B1 ; unqualified # ⚱ E1.0 funeral urn
+1F9FF ; fully-qualified # 🧿 E11.0 nazar amulet
+1FAAC ; fully-qualified # 🪬 E14.0 hamsa
 1F5FF ; fully-qualified # 🗿 E0.6 moai
 1FAA7 ; fully-qualified # 🪧 E13.0 placard
 1FAAA ; fully-qualified # 🪪 E14.0 identification card
-# Objects subtotal: 304
+# Objects subtotal: 310
-# Objects subtotal: 304 w/o modifiers
+# Objects subtotal: 310 w/o modifiers
 # group: Symbols
@ -4455,6 +4486,7 @@
 262E ; unqualified # ☮ E1.0 peace symbol
 1F54E ; fully-qualified # 🕎 E1.0 menorah
 1F52F ; fully-qualified # 🔯 E0.6 dotted six-pointed star
+1FAAF ; fully-qualified # 🪯 E15.0 khanda
 # subgroup: zodiac
 2648 ; fully-qualified # ♈ E0.6 Aries
@ -4503,6 +4535,7 @@
 1F505 ; fully-qualified # 🔅 E1.0 dim button
 1F506 ; fully-qualified # 🔆 E1.0 bright button
 1F4F6 ; fully-qualified # 📶 E0.6 antenna bars
+1F6DC ; fully-qualified # 🛜 E15.0 wireless
 1F4F3 ; fully-qualified # 📳 E0.6 vibration mode
 1F4F4 ; fully-qualified # 📴 E0.6 mobile phone off
@ -4693,8 +4726,8 @@
 1F533 ; fully-qualified # 🔳 E0.6 white square button
 1F532 ; fully-qualified # 🔲 E0.6 black square button
-# Symbols subtotal: 302
+# Symbols subtotal: 304
-# Symbols subtotal: 302 w/o modifiers
+# Symbols subtotal: 304 w/o modifiers
 # group: Flags
@ -4709,7 +4742,7 @@
 1F3F3 200D 1F308 ; unqualified # 🏳‍🌈 E4.0 rainbow flag
 1F3F3 FE0F 200D 26A7 FE0F ; fully-qualified # 🏳️‍⚧️ E13.0 transgender flag
 1F3F3 200D 26A7 FE0F ; unqualified # 🏳‍⚧️ E13.0 transgender flag
-1F3F3 FE0F 200D 26A7 ; unqualified # 🏳️‍⚧ E13.0 transgender flag
+1F3F3 FE0F 200D 26A7 ; minimally-qualified # 🏳️‍⚧ E13.0 transgender flag
 1F3F3 200D 26A7 ; unqualified # 🏳‍⚧ E13.0 transgender flag
 1F3F4 200D 2620 FE0F ; fully-qualified # 🏴‍☠️ E11.0 pirate flag
 1F3F4 200D 2620 ; minimally-qualified # 🏴‍☠ E11.0 pirate flag
@ -4983,9 +5016,9 @@
 # Flags subtotal: 275 w/o modifiers
 # Status Counts
-# fully-qualified : 3624
+# fully-qualified : 3655
-# minimally-qualified : 817
+# minimally-qualified : 827
-# unqualified : 252
+# unqualified : 242
 # component : 9
 #EOF

@ -188,6 +188,11 @@ def emoji_url(%{"type" => "EmojiReact", "content" => emoji, "tag" => tags}) do
def emoji_url(_), do: nil def emoji_url(_), do: nil
def emoji_name_with_instance(name, url) do
url = url |> URI.parse() |> Map.get(:host)
"#{name}@#{url}"
end
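# For illustration (name and URL invented): the instance suffix produced by
# emoji_name_with_instance/2 is simply the host of the emoji's URL, e.g.
#
#     emoji_name_with_instance("blobfox", "https://example.org/emoji/blobfox.png")
#     #=> "blobfox@example.org"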
emoji_qualification_map = emoji_qualification_map =
emojis emojis
|> Enum.filter(&String.contains?(&1, "\uFE0F")) |> Enum.filter(&String.contains?(&1, "\uFE0F"))

@ -384,7 +384,7 @@ def create_notifications(%Activity{data: %{"to" => _, "type" => "Create"}} = act
end end
def create_notifications(%Activity{data: %{"type" => type}} = activity, options) def create_notifications(%Activity{data: %{"type" => type}} = activity, options)
when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag"] do when type in ["Follow", "Like", "Announce", "Move", "EmojiReact", "Flag", "Update"] do
do_create_notifications(activity, options) do_create_notifications(activity, options)
end end
@ -438,6 +438,9 @@ defp type_from_activity(%{data: %{"type" => type}} = activity) do
activity activity
|> type_from_activity_object() |> type_from_activity_object()
"Update" ->
"update"
t -> t ->
raise "No notification type for activity type #{t}" raise "No notification type for activity type #{t}"
end end
@ -503,7 +506,16 @@ def create_poll_notifications(%Activity{} = activity) do
def get_notified_from_activity(activity, local_only \\ true) def get_notified_from_activity(activity, local_only \\ true)
def get_notified_from_activity(%Activity{data: %{"type" => type}} = activity, local_only) def get_notified_from_activity(%Activity{data: %{"type" => type}} = activity, local_only)
when type in ["Create", "Like", "Announce", "Follow", "Move", "EmojiReact", "Flag"] do when type in [
"Create",
"Like",
"Announce",
"Follow",
"Move",
"EmojiReact",
"Flag",
"Update"
] do
potential_receiver_ap_ids = get_potential_receiver_ap_ids(activity) potential_receiver_ap_ids = get_potential_receiver_ap_ids(activity)
potential_receivers = potential_receivers =
@ -543,6 +555,21 @@ def get_potential_receiver_ap_ids(%{data: %{"type" => "Flag", "actor" => actor}}
(User.all_superusers() |> Enum.map(fn user -> user.ap_id end)) -- [actor] (User.all_superusers() |> Enum.map(fn user -> user.ap_id end)) -- [actor]
end end
# Update activity: notify all who repeated this
def get_potential_receiver_ap_ids(%{data: %{"type" => "Update", "actor" => actor}} = activity) do
with %Object{data: %{"id" => object_id}} <- Object.normalize(activity, fetch: false) do
repeaters =
Activity.Queries.by_type("Announce")
|> Activity.Queries.by_object_id(object_id)
|> Activity.with_joined_user_actor()
|> where([a, u], u.local)
|> select([a, u], u.ap_id)
|> Repo.all()
repeaters -- [actor]
end
end
def get_potential_receiver_ap_ids(activity) do def get_potential_receiver_ap_ids(activity) do
[] []
|> Utils.maybe_notify_to_recipients(activity) |> Utils.maybe_notify_to_recipients(activity)

@ -145,7 +145,7 @@ defp warn_on_no_object_preloaded(ap_id) do
Logger.debug("Backtrace: #{inspect(Process.info(:erlang.self(), :current_stacktrace))}") Logger.debug("Backtrace: #{inspect(Process.info(:erlang.self(), :current_stacktrace))}")
end end
def normalize(_, options \\ [fetch: false]) def normalize(_, options \\ [fetch: false, id_only: false])
# If we pass an Activity to Object.normalize(), we can try to use the preloaded object. # If we pass an Activity to Object.normalize(), we can try to use the preloaded object.
# Use this whenever possible, especially when walking graphs in an O(N) loop! # Use this whenever possible, especially when walking graphs in an O(N) loop!
@ -173,10 +173,15 @@ def normalize(%Activity{data: %{"object" => ap_id}}, options) do
def normalize(%{"id" => ap_id}, options), do: normalize(ap_id, options) def normalize(%{"id" => ap_id}, options), do: normalize(ap_id, options)
def normalize(ap_id, options) when is_binary(ap_id) do def normalize(ap_id, options) when is_binary(ap_id) do
if Keyword.get(options, :fetch) do cond do
Fetcher.fetch_object_from_id!(ap_id, options) Keyword.get(options, :id_only) ->
else ap_id
get_cached_by_ap_id(ap_id)
Keyword.get(options, :fetch) ->
Fetcher.fetch_object_from_id!(ap_id, options)
true ->
get_cached_by_ap_id(ap_id)
end end
end end
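# Usage sketch (AP id invented): with the new `id_only: true` option a bare id
# is handed back as-is, without touching the cache or fetching anything.
#
#     normalize("https://example.com/objects/1", id_only: true)
#     #=> "https://example.com/objects/1"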

@ -26,8 +26,42 @@ defp touch_changeset(changeset) do
end end
defp maybe_reinject_internal_fields(%{data: %{} = old_data}, new_data) do defp maybe_reinject_internal_fields(%{data: %{} = old_data}, new_data) do
has_history? = fn
%{"formerRepresentations" => %{"orderedItems" => list}} when is_list(list) -> true
_ -> false
end
internal_fields = Map.take(old_data, Pleroma.Constants.object_internal_fields()) internal_fields = Map.take(old_data, Pleroma.Constants.object_internal_fields())
remote_history_exists? = has_history?.(new_data)
# If the remote history exists, we treat that as the only source of truth.
new_data =
if has_history?.(old_data) and not remote_history_exists? do
Map.put(new_data, "formerRepresentations", old_data["formerRepresentations"])
else
new_data
end
# If the remote does not have history information, we need to manage it ourselves
new_data =
if not remote_history_exists? do
changed? =
Pleroma.Constants.status_updatable_fields()
|> Enum.any?(fn field -> Map.get(old_data, field) != Map.get(new_data, field) end)
%{updated_object: updated_object} =
new_data
|> Object.Updater.maybe_update_history(old_data,
updated: changed?,
use_history_in_new_object?: false
)
updated_object
else
new_data
end
Map.merge(new_data, internal_fields) Map.merge(new_data, internal_fields)
end end

@ -0,0 +1,240 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Object.Updater do
require Pleroma.Constants
def update_content_fields(orig_object_data, updated_object) do
Pleroma.Constants.status_updatable_fields()
|> Enum.reduce(
%{data: orig_object_data, updated: false},
fn field, %{data: data, updated: updated} ->
updated =
updated or
(field != "updated" and
Map.get(updated_object, field) != Map.get(orig_object_data, field))
data =
if Map.has_key?(updated_object, field) do
Map.put(data, field, updated_object[field])
else
Map.drop(data, [field])
end
%{data: data, updated: updated}
end
)
end
def maybe_history(object) do
with history <- Map.get(object, "formerRepresentations"),
true <- is_map(history),
"OrderedCollection" <- Map.get(history, "type"),
true <- is_list(Map.get(history, "orderedItems")),
true <- is_integer(Map.get(history, "totalItems")) do
history
else
_ -> nil
end
end
def history_for(object) do
with history when not is_nil(history) <- maybe_history(object) do
history
else
_ -> history_skeleton()
end
end
defp history_skeleton do
%{
"type" => "OrderedCollection",
"totalItems" => 0,
"orderedItems" => []
}
end
def maybe_update_history(
updated_object,
orig_object_data,
opts
) do
updated = opts[:updated]
use_history_in_new_object? = opts[:use_history_in_new_object?]
if not updated do
%{updated_object: updated_object, used_history_in_new_object?: false}
else
# Put edit history
# Note that we may have got the edit history by first fetching the object
{new_history, used_history_in_new_object?} =
with true <- use_history_in_new_object?,
updated_history when not is_nil(updated_history) <- maybe_history(opts[:new_data]) do
{updated_history, true}
else
_ ->
history = history_for(orig_object_data)
latest_history_item =
orig_object_data
|> Map.drop(["id", "formerRepresentations"])
updated_history =
history
|> Map.put("orderedItems", [latest_history_item | history["orderedItems"]])
|> Map.put("totalItems", history["totalItems"] + 1)
{updated_history, false}
end
updated_object =
updated_object
|> Map.put("formerRepresentations", new_history)
%{updated_object: updated_object, used_history_in_new_object?: used_history_in_new_object?}
end
end
defp maybe_update_poll(to_be_updated, updated_object) do
choice_key = fn data ->
if Map.has_key?(data, "anyOf"), do: "anyOf", else: "oneOf"
end
with true <- to_be_updated["type"] == "Question",
key <- choice_key.(updated_object),
true <- key == choice_key.(to_be_updated),
orig_choices <- to_be_updated[key] |> Enum.map(&Map.drop(&1, ["replies"])),
new_choices <- updated_object[key] |> Enum.map(&Map.drop(&1, ["replies"])),
true <- orig_choices == new_choices do
# Choices are the same, but counts are different
to_be_updated
|> Map.put(key, updated_object[key])
else
# Choices (or vote type) have changed, do not allow this
_ -> to_be_updated
end
end
# This calculates the data to be sent as the object of an Update.
# new_data's formerRepresentations is not considered.
# formerRepresentations is added to the returned data.
def make_update_object_data(original_data, new_data, date) do
%{data: updated_data, updated: updated} =
original_data
|> update_content_fields(new_data)
if not updated do
updated_data
else
%{updated_object: updated_data} =
updated_data
|> maybe_update_history(original_data, updated: updated, use_history_in_new_object?: false)
updated_data
|> Map.put("updated", date)
end
end
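# Usage sketch (values illustrative): build the object to embed in an outgoing
# Update from the stored data plus the edited fields.
#
#     make_update_object_data(
#       original_object.data,
#       Map.put(original_object.data, "content", "edited text"),
#       DateTime.utc_now() |> DateTime.to_iso8601()
#     )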
# This calculates the data of the new Object from an Update.
# new_data's formerRepresentations is considered.
def make_new_object_data_from_update_object(original_data, new_data) do
update_is_reasonable =
with {_, updated} when not is_nil(updated) <- {:cur_updated, new_data["updated"]},
{_, {:ok, updated_time, _}} <- {:cur_updated, DateTime.from_iso8601(updated)},
{_, last_updated} when not is_nil(last_updated) <-
{:last_updated, original_data["updated"] || original_data["published"]},
{_, {:ok, last_updated_time, _}} <-
{:last_updated, DateTime.from_iso8601(last_updated)},
:gt <- DateTime.compare(updated_time, last_updated_time) do
:update_everything
else
# only allow poll updates
{:cur_updated, _} -> :no_content_update
:eq -> :no_content_update
# allow all updates
{:last_updated, _} -> :update_everything
# allow no updates
_ -> false
end
%{
updated_object: updated_data,
used_history_in_new_object?: used_history_in_new_object?,
updated: updated
} =
if update_is_reasonable == :update_everything do
%{data: updated_data, updated: updated} =
original_data
|> update_content_fields(new_data)
updated_data
|> maybe_update_history(original_data,
updated: updated,
use_history_in_new_object?: true,
new_data: new_data
)
|> Map.put(:updated, updated)
else
%{
updated_object: original_data,
used_history_in_new_object?: false,
updated: false
}
end
updated_data =
if update_is_reasonable != false do
updated_data
|> maybe_update_poll(new_data)
else
updated_data
end
%{
updated_data: updated_data,
updated: updated,
used_history_in_new_object?: used_history_in_new_object?
}
end
def for_each_history_item(%{"orderedItems" => items} = history, _object, fun) do
new_items =
Enum.map(items, fun)
|> Enum.reduce_while(
{:ok, []},
fn
{:ok, item}, {:ok, acc} -> {:cont, {:ok, acc ++ [item]}}
e, _acc -> {:halt, e}
end
)
case new_items do
{:ok, items} -> {:ok, Map.put(history, "orderedItems", items)}
e -> e
end
end
def for_each_history_item(history, _, _) do
{:ok, history}
end
def do_with_history(object, fun) do
with history <- object["formerRepresentations"],
object <- Map.drop(object, ["formerRepresentations"]),
{_, {:ok, object}} <- {:main_body, fun.(object)},
{_, {:ok, history}} <- {:history_items, for_each_history_item(history, object, fun)} do
object =
if history do
Map.put(object, "formerRepresentations", history)
else
object
end
{:ok, object}
else
{:main_body, e} -> e
{:history_items, e} -> e
end
end
end
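# A minimal usage sketch of do_with_history/2 (object contents invented): the
# same transformation is applied to the current revision and to every entry in
# "formerRepresentations", which is how the history-aware MRF plumbing uses it.
example_note = %{
  "type" => "Note",
  "content" => "current revision",
  "formerRepresentations" => %{
    "type" => "OrderedCollection",
    "totalItems" => 1,
    "orderedItems" => [%{"type" => "Note", "content" => "older revision"}]
  }
}

{:ok, _marked_sensitive} =
  Pleroma.Object.Updater.do_with_history(example_note, fn object ->
    {:ok, Map.put(object, "sensitive", true)}
  end)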

@ -66,9 +66,8 @@ def refetch_public_key(conn) do
end end
end end
def sign(%User{} = user, headers) do def sign(%User{keys: keys} = user, headers) do
with {:ok, %{keys: keys}} <- User.ensure_keys_present(user), with {:ok, private_key, _} <- Keys.keys_from_pem(keys) do
{:ok, private_key, _} <- Keys.keys_from_pem(keys) do
HTTPSignatures.sign(private_key, user.ap_id <> "#main-key", headers) HTTPSignatures.sign(private_key, user.ap_id <> "#main-key", headers)
end end
end end

@ -36,6 +36,7 @@ defmodule Pleroma.Upload do
alias Ecto.UUID alias Ecto.UUID
alias Pleroma.Config alias Pleroma.Config
alias Pleroma.Maps alias Pleroma.Maps
alias Pleroma.Web.ActivityPub.Utils
require Logger require Logger
@type source :: @type source ::
@ -88,6 +89,7 @@ def store(upload, opts \\ []) do
{:ok, url_spec} <- Pleroma.Uploaders.Uploader.put_file(opts.uploader, upload) do {:ok, url_spec} <- Pleroma.Uploaders.Uploader.put_file(opts.uploader, upload) do
{:ok, {:ok,
%{ %{
"id" => Utils.generate_object_id(),
"type" => opts.activity_type, "type" => opts.activity_type,
"mediaType" => upload.content_type, "mediaType" => upload.content_type,
"url" => [ "url" => [

@ -681,9 +681,9 @@ def register_changeset_ldap(struct, params = %{password: password})
|> validate_exclusion(:nickname, Config.get([User, :restricted_nicknames])) |> validate_exclusion(:nickname, Config.get([User, :restricted_nicknames]))
|> validate_format(:nickname, local_nickname_regex()) |> validate_format(:nickname, local_nickname_regex())
|> put_ap_id() |> put_ap_id()
|> put_keys()
|> unique_constraint(:ap_id) |> unique_constraint(:ap_id)
|> put_following_and_follower_and_featured_address() |> put_following_and_follower_and_featured_address()
|> put_private_key()
end end
def register_changeset(struct, params \\ %{}, opts \\ []) do def register_changeset(struct, params \\ %{}, opts \\ []) do
@ -741,10 +741,10 @@ def register_changeset(struct, params \\ %{}, opts \\ []) do
|> validate_length(:registration_reason, max: reason_limit) |> validate_length(:registration_reason, max: reason_limit)
|> maybe_validate_required_email(opts[:external]) |> maybe_validate_required_email(opts[:external])
|> put_password_hash |> put_password_hash
|> put_keys()
|> put_ap_id() |> put_ap_id()
|> unique_constraint(:ap_id) |> unique_constraint(:ap_id)
|> put_following_and_follower_and_featured_address() |> put_following_and_follower_and_featured_address()
|> put_private_key()
end end
def maybe_validate_required_email(changeset, true), do: changeset def maybe_validate_required_email(changeset, true), do: changeset
@ -757,11 +757,6 @@ def maybe_validate_required_email(changeset, _) do
end end
end end
def put_keys(changeset) do
{:ok, pem} = Keys.generate_rsa_pem()
put_change(changeset, :keys, pem)
end
def put_ap_id(changeset) do def put_ap_id(changeset) do
ap_id = ap_id(%User{nickname: get_field(changeset, :nickname)}) ap_id = ap_id(%User{nickname: get_field(changeset, :nickname)})
put_change(changeset, :ap_id, ap_id) put_change(changeset, :ap_id, ap_id)
@ -779,6 +774,11 @@ def put_following_and_follower_and_featured_address(changeset) do
|> put_change(:featured_address, featured) |> put_change(:featured_address, featured)
end end
defp put_private_key(changeset) do
{:ok, pem} = Keys.generate_rsa_pem()
put_change(changeset, :keys, pem)
end
defp autofollow_users(user) do defp autofollow_users(user) do
candidates = Config.get([:instance, :autofollowed_nicknames]) candidates = Config.get([:instance, :autofollowed_nicknames])
@ -1955,6 +1955,7 @@ defp create_service_actor(uri, nickname) do
follower_address: uri <> "/followers" follower_address: uri <> "/followers"
} }
|> change |> change
|> put_private_key()
|> unique_constraint(:nickname) |> unique_constraint(:nickname)
|> Repo.insert() |> Repo.insert()
|> set_cache() |> set_cache()
@ -1987,7 +1988,8 @@ def ap_enabled?(_), do: false
@doc "Gets or fetch a user by uri or nickname." @doc "Gets or fetch a user by uri or nickname."
@spec get_or_fetch(String.t()) :: {:ok, User.t()} | {:error, String.t()} @spec get_or_fetch(String.t()) :: {:ok, User.t()} | {:error, String.t()}
def get_or_fetch("http" <> _host = uri), do: get_or_fetch_by_ap_id(uri) def get_or_fetch("http://" <> _host = uri), do: get_or_fetch_by_ap_id(uri)
def get_or_fetch("https://" <> _host = uri), do: get_or_fetch_by_ap_id(uri)
def get_or_fetch(nickname), do: get_or_fetch_by_nickname(nickname) def get_or_fetch(nickname), do: get_or_fetch_by_nickname(nickname)
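# For illustration (addresses invented): full http(s) ids go through
# get_or_fetch_by_ap_id/1, everything else is treated as a nickname.
#
#     get_or_fetch("https://example.com/users/alice")
#     get_or_fetch("alice@example.com")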
# wait a period of time and return newest version of the User structs # wait a period of time and return newest version of the User structs
@ -2220,17 +2222,6 @@ def get_mascot(%{mascot: mascot}) when is_nil(mascot) do
} }
end end
def ensure_keys_present(%{keys: keys} = user) when not is_nil(keys), do: {:ok, user}
def ensure_keys_present(%User{} = user) do
with {:ok, pem} <- Keys.generate_rsa_pem() do
user
|> cast(%{keys: pem}, [:keys])
|> validate_required([:keys])
|> update_and_set_cache()
end
end
def get_ap_ids_by_nicknames(nicknames) do def get_ap_ids_by_nicknames(nicknames) do
from(u in User, from(u in User,
where: u.nickname in ^nicknames, where: u.nickname in ^nicknames,

@ -94,6 +94,7 @@ defp search_query(query_string, for_user, following, top_user_ids) do
|> subquery() |> subquery()
|> order_by(desc: :search_rank) |> order_by(desc: :search_rank)
|> maybe_restrict_local(for_user) |> maybe_restrict_local(for_user)
|> filter_deactivated_users()
end end
defp select_top_users(query, top_user_ids) do defp select_top_users(query, top_user_ids) do
@ -166,6 +167,10 @@ defp filter_internal_users(query) do
from(q in query, where: q.actor_type != "Application") from(q in query, where: q.actor_type != "Application")
end end
defp filter_deactivated_users(query) do
from(q in query, where: q.is_active == true)
end
defp filter_blocked_user(query, %User{} = blocker) do defp filter_blocked_user(query, %User{} = blocker) do
query query
|> join(:left, [u], b in Pleroma.UserRelationship, |> join(:left, [u], b in Pleroma.UserRelationship,

@ -194,7 +194,16 @@ defp insert_activity_with_expiration(data, local, recipients) do
def notify_and_stream(activity) do def notify_and_stream(activity) do
Notification.create_notifications(activity) Notification.create_notifications(activity)
conversation = create_or_bump_conversation(activity, activity.actor) original_activity =
case activity do
%{data: %{"type" => "Update"}, object: %{data: %{"id" => id}}} ->
Activity.get_create_by_object_ap_id_with_object(id)
_ ->
activity
end
conversation = create_or_bump_conversation(original_activity, original_activity.actor)
participations = get_participations(conversation) participations = get_participations(conversation)
stream_out(activity) stream_out(activity)
stream_out_participations(participations) stream_out_participations(participations)
@ -260,7 +269,7 @@ def stream_out_participations(_, _), do: :noop
@impl true @impl true
def stream_out(%Activity{data: %{"type" => data_type}} = activity) def stream_out(%Activity{data: %{"type" => data_type}} = activity)
when data_type in ["Create", "Announce", "Delete"] do when data_type in ["Create", "Announce", "Delete", "Update"] do
activity activity
|> Topics.get_activity_topics() |> Topics.get_activity_topics()
|> Streamer.stream(activity) |> Streamer.stream(activity)
@ -331,9 +340,9 @@ defp do_unfollow(follower, followed, activity_id, local)
defp do_unfollow(follower, followed, activity_id, local) when local == true do defp do_unfollow(follower, followed, activity_id, local) when local == true do
with %Activity{} = follow_activity <- fetch_latest_follow(follower, followed), with %Activity{} = follow_activity <- fetch_latest_follow(follower, followed),
{:ok, follow_activity} <- update_follow_state(follow_activity, "cancelled"),
unfollow_data <- make_unfollow_data(follower, followed, follow_activity, activity_id), unfollow_data <- make_unfollow_data(follower, followed, follow_activity, activity_id),
{:ok, activity} <- insert(unfollow_data, local), {:ok, activity} <- insert(unfollow_data, local),
{:ok, _activity} <- Repo.delete(follow_activity),
_ <- notify_and_stream(activity), _ <- notify_and_stream(activity),
:ok <- maybe_federate(activity) do :ok <- maybe_federate(activity) do
{:ok, activity} {:ok, activity}
@ -349,7 +358,7 @@ defp do_unfollow(follower, followed, activity_id, false) do
with %Activity{} = follow_activity <- fetch_latest_follow(follower, followed), with %Activity{} = follow_activity <- fetch_latest_follow(follower, followed),
{:ok, _activity} <- Repo.delete(follow_activity), {:ok, _activity} <- Repo.delete(follow_activity),
unfollow_data <- make_unfollow_data(follower, followed, follow_activity, activity_id), unfollow_data <- make_unfollow_data(follower, followed, follow_activity, activity_id),
unfollow_activity <- remote_unfollow_data(unfollow_data), unfollow_activity <- make_unfollow_activity(unfollow_data, false),
_ <- notify_and_stream(unfollow_activity) do _ <- notify_and_stream(unfollow_activity) do
{:ok, unfollow_activity} {:ok, unfollow_activity}
else else
@ -358,12 +367,12 @@ defp do_unfollow(follower, followed, activity_id, false) do
end end
end end
defp remote_unfollow_data(data) do defp make_unfollow_activity(data, local) do
{recipients, _, _} = get_recipients(data) {recipients, _, _} = get_recipients(data)
%Activity{ %Activity{
data: data, data: data,
local: false, local: local,
actor: data["actor"], actor: data["actor"],
recipients: recipients recipients: recipients
} }

@ -66,8 +66,7 @@ defp relay_active?(conn, _) do
end end
def user(conn, %{"nickname" => nickname}) do def user(conn, %{"nickname" => nickname}) do
with %User{local: true} = user <- User.get_cached_by_nickname(nickname), with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do
{:ok, user} <- User.ensure_keys_present(user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> put_view(UserView) |> put_view(UserView)
@ -174,7 +173,6 @@ def relay_following(conn, _params) do
def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname),
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user),
{:show_follows, true} <- {:show_follows, true} <-
{:show_follows, (for_user && for_user == user) || !user.hide_follows} do {:show_follows, (for_user && for_user == user) || !user.hide_follows} do
{page, _} = Integer.parse(page) {page, _} = Integer.parse(page)
@ -192,8 +190,7 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "p
end end
def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) do def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname) do
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> put_view(UserView) |> put_view(UserView)
@ -213,7 +210,6 @@ def relay_followers(conn, _params) do
def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname),
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user),
{:show_followers, true} <- {:show_followers, true} <-
{:show_followers, (for_user && for_user == user) || !user.hide_followers} do {:show_followers, (for_user && for_user == user) || !user.hide_followers} do
{page, _} = Integer.parse(page) {page, _} = Integer.parse(page)
@ -231,8 +227,7 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "p
end end
def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) do def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname) do
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> put_view(UserView) |> put_view(UserView)
@ -245,8 +240,7 @@ def outbox(
%{"nickname" => nickname, "page" => page?} = params %{"nickname" => nickname, "page" => page?} = params
) )
when page? in [true, "true"] do when page? in [true, "true"] do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname) do
{:ok, user} <- User.ensure_keys_present(user) do
# "include_poll_votes" is a hack because postgres generates inefficient # "include_poll_votes" is a hack because postgres generates inefficient
# queries when filtering by 'Answer', poll votes will be hidden by the # queries when filtering by 'Answer', poll votes will be hidden by the
# visibility filter in this case anyway # visibility filter in this case anyway
@ -270,8 +264,7 @@ def outbox(
end end
def outbox(conn, %{"nickname" => nickname}) do def outbox(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname) do
{:ok, user} <- User.ensure_keys_present(user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> put_view(UserView) |> put_view(UserView)
@ -328,14 +321,10 @@ defp post_inbox_relayed_create(conn, params) do
end end
defp represent_service_actor(%User{} = user, conn) do defp represent_service_actor(%User{} = user, conn) do
with {:ok, user} <- User.ensure_keys_present(user) do conn
conn |> put_resp_content_type("application/activity+json")
|> put_resp_content_type("application/activity+json") |> put_view(UserView)
|> put_view(UserView) |> render("user.json", %{user: user})
|> render("user.json", %{user: user})
else
nil -> {:error, :not_found}
end
end end
defp represent_service_actor(nil, _), do: {:error, :not_found} defp represent_service_actor(nil, _), do: {:error, :not_found}
@ -388,12 +377,10 @@ def read_inbox(
def read_inbox(%{assigns: %{user: %User{nickname: nickname} = user}} = conn, %{ def read_inbox(%{assigns: %{user: %User{nickname: nickname} = user}} = conn, %{
"nickname" => nickname "nickname" => nickname
}) do }) do
with {:ok, user} <- User.ensure_keys_present(user) do conn
conn |> put_resp_content_type("application/activity+json")
|> put_resp_content_type("application/activity+json") |> put_view(UserView)
|> put_view(UserView) |> render("activity_collection.json", %{iri: "#{user.ap_id}/inbox"})
|> render("activity_collection.json", %{iri: "#{user.ap_id}/inbox"})
end
end end
def read_inbox(%{assigns: %{user: %User{nickname: as_nickname}}} = conn, %{ def read_inbox(%{assigns: %{user: %User{nickname: as_nickname}}} = conn, %{
@ -530,19 +517,6 @@ defp set_requester_reachable(%Plug.Conn{} = conn, _) do
conn conn
end end
defp ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
{:ok, new_user} = User.ensure_keys_present(user)
for_user =
if new_user != user and match?(%User{}, for_user) do
User.get_cached_by_nickname(for_user.nickname)
else
for_user
end
{new_user, for_user}
end
def upload_media(%{assigns: %{user: %User{} = user}} = conn, %{"file" => file} = data) do def upload_media(%{assigns: %{user: %User{} = user}} = conn, %{"file" => file} = data) do
with {:ok, object} <- with {:ok, object} <-
ActivityPub.upload( ActivityPub.upload(

@ -55,37 +55,84 @@ def follow(follower, followed) do
{:ok, data, []} {:ok, data, []}
end end
defp unicode_emoji_react(_object, data, emoji) do
data
|> Map.put("content", emoji)
|> Map.put("type", "EmojiReact")
end
defp add_emoji_content(data, emoji, url) do
data
|> Map.put("content", Emoji.maybe_quote(emoji))
|> Map.put("type", "EmojiReact")
|> Map.put("tag", [
%{}
|> Map.put("id", url)
|> Map.put("type", "Emoji")
|> Map.put("name", Emoji.maybe_quote(emoji))
|> Map.put(
"icon",
%{}
|> Map.put("type", "Image")
|> Map.put("url", url)
)
])
end
defp remote_custom_emoji_react(
%{data: %{"reactions" => existing_reactions}},
data,
emoji
) do
[emoji_code, instance] = String.split(Emoji.stripped_name(emoji), "@")
matching_reaction =
Enum.find(
existing_reactions,
fn [name, _, url] ->
url = URI.parse(url)
url.host == instance && name == emoji_code
end
)
if matching_reaction do
[name, _, url] = matching_reaction
add_emoji_content(data, name, url)
else
{:error, "Could not react"}
end
end
defp remote_custom_emoji_react(_object, _data, _emoji) do
{:error, "Could not react"}
end
defp local_custom_emoji_react(data, emoji) do
with %{} = emojo <- Emoji.get(emoji) do
path = emojo |> Map.get(:file)
url = "#{Endpoint.url()}#{path}"
add_emoji_content(data, emojo.code, url)
else
_ -> {:error, "Emoji does not exist"}
end
end
defp custom_emoji_react(object, data, emoji) do
if String.contains?(emoji, "@") do
remote_custom_emoji_react(object, data, emoji)
else
local_custom_emoji_react(data, emoji)
end
end
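# For illustration (shortcodes invented): a bare name reacts with an emoji from
# the local packs, a name@host form reuses a remote emoji already present in
# the object's reactions.
#
#     custom_emoji_react(object, data, "blobfox")              # local pack emoji
#     custom_emoji_react(object, data, "blobfox@example.org")  # remote emoji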
@spec emoji_react(User.t(), Object.t(), String.t()) :: {:ok, map(), keyword()} @spec emoji_react(User.t(), Object.t(), String.t()) :: {:ok, map(), keyword()}
def emoji_react(actor, object, emoji) do def emoji_react(actor, object, emoji) do
with {:ok, data, meta} <- object_action(actor, object) do with {:ok, data, meta} <- object_action(actor, object) do
data = data =
if Emoji.is_unicode_emoji?(emoji) do if Emoji.is_unicode_emoji?(emoji) do
data unicode_emoji_react(object, data, emoji)
|> Map.put("content", emoji)
|> Map.put("type", "EmojiReact")
else else
with %{} = emojo <- Emoji.get(emoji) do custom_emoji_react(object, data, emoji)
path = emojo |> Map.get(:file)
url = "#{Endpoint.url()}#{path}"
data
|> Map.put("content", emoji)
|> Map.put("type", "EmojiReact")
|> Map.put("tag", [
%{}
|> Map.put("id", url)
|> Map.put("type", "Emoji")
|> Map.put("name", emojo.code)
|> Map.put(
"icon",
%{}
|> Map.put("type", "Image")
|> Map.put("url", url)
)
])
else
_ -> {:error, "Emoji does not exist"}
end
end end
{:ok, data, meta} {:ok, data, meta}
@ -231,10 +278,16 @@ def like(actor, object) do
end end
end end
# Restricted to user updates for now, always public
@spec update(User.t(), Object.t()) :: {:ok, map(), keyword()} @spec update(User.t(), Object.t()) :: {:ok, map(), keyword()}
def update(actor, object) do def update(actor, object) do
to = [Pleroma.Constants.as_public(), actor.follower_address] {to, cc} =
if object["type"] in Pleroma.Constants.actor_types() do
# User updates, always public
{[Pleroma.Constants.as_public(), actor.follower_address], []}
else
# Status updates, follow the recipients in the object
{object["to"] || [], object["cc"] || []}
end
{:ok, {:ok,
%{ %{
@ -242,7 +295,8 @@ def update(actor, object) do
"type" => "Update", "type" => "Update",
"actor" => actor.ap_id, "actor" => actor.ap_id,
"object" => object, "object" => object,
"to" => to "to" => to,
"cc" => cc
}, []} }, []}
end end
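# Sketch (object map invented): a status edit follows the note's own addressing,
# while an actor update keeps the previous always-public behaviour.
#
#     {:ok, data, []} =
#       update(user, %{
#         "type" => "Note",
#         "id" => "https://example.com/objects/1",
#         "to" => ["https://example.com/users/bob"],
#         "cc" => []
#       })
#     data["to"]  #=> ["https://example.com/users/bob"]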

@ -63,10 +63,53 @@ defmodule Pleroma.Web.ActivityPub.MRF do
@required_description_keys [:key, :related_policy] @required_description_keys [:key, :related_policy]
def filter_one(policy, message) do
should_plug_history? =
if function_exported?(policy, :history_awareness, 0) do
policy.history_awareness()
else
:manual
end
|> Kernel.==(:auto)
if not should_plug_history? do
policy.filter(message)
else
main_result = policy.filter(message)
with {_, {:ok, main_message}} <- {:main, main_result},
{_,
%{
"formerRepresentations" => %{
"orderedItems" => [_ | _]
}
}} = {_, object} <- {:object, message["object"]},
{_, {:ok, new_history}} <-
{:history,
Pleroma.Object.Updater.for_each_history_item(
object["formerRepresentations"],
object,
fn item ->
with {:ok, filtered} <- policy.filter(Map.put(message, "object", item)) do
{:ok, filtered["object"]}
else
e -> e
end
end
)} do
{:ok, put_in(main_message, ["object", "formerRepresentations"], new_history)}
else
{:main, _} -> main_result
{:object, _} -> main_result
{:history, e} -> e
end
end
end
def filter(policies, %{} = message) do def filter(policies, %{} = message) do
policies policies
|> Enum.reduce({:ok, message}, fn |> Enum.reduce({:ok, message}, fn
policy, {:ok, message} -> policy.filter(message) policy, {:ok, message} -> filter_one(policy, message)
_, error -> error _, error -> error
end) end)
end end
@ -95,7 +138,11 @@ def pipeline_filter(%{} = message, meta) do
def get_policies do def get_policies do
Pleroma.Config.get([:mrf, :policies], []) Pleroma.Config.get([:mrf, :policies], [])
|> get_policies() |> get_policies()
|> Enum.concat([Pleroma.Web.ActivityPub.MRF.HashtagPolicy]) |> Enum.concat([
Pleroma.Web.ActivityPub.MRF.HashtagPolicy,
Pleroma.Web.ActivityPub.MRF.InlineQuotePolicy
])
|> Enum.uniq()
end end
defp get_policies(policy) when is_atom(policy), do: [policy] defp get_policies(policy) when is_atom(policy), do: [policy]
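# Sketch of a history-aware policy (module name invented): declaring
# history_awareness/0 as :auto lets filter_one/2 run filter/1 over the object
# and over every "formerRepresentations" item, so the policy only ever sees a
# single revision at a time.
#
#     defmodule Pleroma.Web.ActivityPub.MRF.ExamplePolicy do
#       @behaviour Pleroma.Web.ActivityPub.MRF.Policy
#
#       @impl true
#       def history_awareness, do: :auto
#
#       @impl true
#       def filter(message), do: {:ok, message}
#
#       @impl true
#       def describe, do: {:ok, %{}}
#     end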

@ -9,6 +9,9 @@ defmodule Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy do
require Logger require Logger
@impl true
def history_awareness, do: :auto
# has the user successfully posted before? # has the user successfully posted before?
defp old_user?(%User{} = u) do defp old_user?(%User{} = u) do
u.note_count > 0 || u.follower_count > 0 u.note_count > 0 || u.follower_count > 0

@ -10,6 +10,8 @@ defmodule Pleroma.Web.ActivityPub.MRF.EnsureRePrepended do
@reply_prefix Regex.compile!("^re:[[:space:]]*", [:caseless]) @reply_prefix Regex.compile!("^re:[[:space:]]*", [:caseless])
def history_awareness, do: :auto
def filter_by_summary( def filter_by_summary(
%{data: %{"summary" => parent_summary}} = _in_reply_to, %{data: %{"summary" => parent_summary}} = _in_reply_to,
%{"summary" => child_summary} = child %{"summary" => child_summary} = child
@ -27,8 +29,8 @@ def filter_by_summary(
def filter_by_summary(_in_reply_to, child), do: child def filter_by_summary(_in_reply_to, child), do: child
def filter(%{"type" => "Create", "object" => child_object} = object) def filter(%{"type" => type, "object" => child_object} = object)
when is_map(child_object) do when type in ["Create", "Update"] and is_map(child_object) do
child = child =
child_object["inReplyTo"] child_object["inReplyTo"]
|> Object.normalize(fetch: false) |> Object.normalize(fetch: false)

@ -16,6 +16,9 @@ defmodule Pleroma.Web.ActivityPub.MRF.HashtagPolicy do
@behaviour Pleroma.Web.ActivityPub.MRF.Policy @behaviour Pleroma.Web.ActivityPub.MRF.Policy
@impl true
def history_awareness, do: :manual
defp check_reject(message, hashtags) do defp check_reject(message, hashtags) do
if Enum.any?(Config.get([:mrf_hashtag, :reject]), fn match -> match in hashtags end) do if Enum.any?(Config.get([:mrf_hashtag, :reject]), fn match -> match in hashtags end) do
{:reject, "[HashtagPolicy] Matches with rejected keyword"} {:reject, "[HashtagPolicy] Matches with rejected keyword"}
@ -47,22 +50,46 @@ defp check_ftl_removal(%{"to" => to} = message, hashtags) do
defp check_ftl_removal(message, _hashtags), do: {:ok, message} defp check_ftl_removal(message, _hashtags), do: {:ok, message}
defp check_sensitive(message, hashtags) do defp check_sensitive(message) do
if Enum.any?(Config.get([:mrf_hashtag, :sensitive]), fn match -> match in hashtags end) do {:ok, new_object} =
{:ok, Kernel.put_in(message, ["object", "sensitive"], true)} Object.Updater.do_with_history(message["object"], fn object ->
else hashtags = Object.hashtags(%Object{data: object})
{:ok, message}
end if Enum.any?(Config.get([:mrf_hashtag, :sensitive]), fn match -> match in hashtags end) do
{:ok, Map.put(object, "sensitive", true)}
else
{:ok, object}
end
end)
{:ok, Map.put(message, "object", new_object)}
end end
@impl true @impl true
def filter(%{"type" => "Create", "object" => object} = message) do def filter(%{"type" => type, "object" => object} = message) when type in ["Create", "Update"] do
hashtags = Object.hashtags(%Object{data: object}) history_items =
with %{"formerRepresentations" => %{"orderedItems" => items}} <- object do
items
else
_ -> []
end
historical_hashtags =
Enum.reduce(history_items, [], fn item, acc ->
acc ++ Object.hashtags(%Object{data: item})
end)
hashtags = Object.hashtags(%Object{data: object}) ++ historical_hashtags
if hashtags != [] do if hashtags != [] do
with {:ok, message} <- check_reject(message, hashtags), with {:ok, message} <- check_reject(message, hashtags),
{:ok, message} <- check_ftl_removal(message, hashtags), {:ok, message} <-
{:ok, message} <- check_sensitive(message, hashtags) do (if type == "Create" do
check_ftl_removal(message, hashtags)
else
{:ok, message}
end),
{:ok, message} <- check_sensitive(message) do
{:ok, message} {:ok, message}
end end
else else

@ -27,24 +27,46 @@ defp object_payload(%{} = object) do
end end
defp check_reject(%{"object" => %{} = object} = message) do defp check_reject(%{"object" => %{} = object} = message) do
payload = object_payload(object) with {:ok, _new_object} <-
Pleroma.Object.Updater.do_with_history(object, fn object ->
payload = object_payload(object)
if Enum.any?(Pleroma.Config.get([:mrf_keyword, :reject]), fn pattern -> if Enum.any?(Pleroma.Config.get([:mrf_keyword, :reject]), fn pattern ->
string_matches?(payload, pattern) string_matches?(payload, pattern)
end) do end) do
{:reject, "[KeywordPolicy] Matches with rejected keyword"} {:reject, "[KeywordPolicy] Matches with rejected keyword"}
else else
{:ok, message}
end
end) do
{:ok, message} {:ok, message}
else
e -> e
end end
end end
defp check_ftl_removal(%{"to" => to, "object" => %{} = object} = message) do defp check_ftl_removal(%{"type" => "Create", "to" => to, "object" => %{} = object} = message) do
payload = object_payload(object) check_keyword = fn object ->
payload = object_payload(object)
if Pleroma.Constants.as_public() in to and if Enum.any?(Pleroma.Config.get([:mrf_keyword, :federated_timeline_removal]), fn pattern ->
Enum.any?(Pleroma.Config.get([:mrf_keyword, :federated_timeline_removal]), fn pattern ->
string_matches?(payload, pattern) string_matches?(payload, pattern)
end) do end) do
{:should_delist, nil}
else
{:ok, %{}}
end
end
should_delist? = fn object ->
with {:ok, _} <- Pleroma.Object.Updater.do_with_history(object, check_keyword) do
false
else
_ -> true
end
end
if Pleroma.Constants.as_public() in to and should_delist?.(object) do
to = List.delete(to, Pleroma.Constants.as_public()) to = List.delete(to, Pleroma.Constants.as_public())
cc = [Pleroma.Constants.as_public() | message["cc"] || []] cc = [Pleroma.Constants.as_public() | message["cc"] || []]
@ -59,8 +81,12 @@ defp check_ftl_removal(%{"to" => to, "object" => %{} = object} = message) do
end end
end end
defp check_ftl_removal(message) do
{:ok, message}
end
defp check_replace(%{"object" => %{} = object} = message) do defp check_replace(%{"object" => %{} = object} = message) do
object = replace_kw = fn object ->
["content", "name", "summary"] ["content", "name", "summary"]
|> Enum.filter(fn field -> Map.has_key?(object, field) && object[field] end) |> Enum.filter(fn field -> Map.has_key?(object, field) && object[field] end)
|> Enum.reduce(object, fn field, object -> |> Enum.reduce(object, fn field, object ->
@ -73,6 +99,10 @@ defp check_replace(%{"object" => %{} = object} = message) do
Map.put(object, field, data) Map.put(object, field, data)
end) end)
|> (fn object -> {:ok, object} end).()
end
{:ok, object} = Pleroma.Object.Updater.do_with_history(object, replace_kw)
message = Map.put(message, "object", object) message = Map.put(message, "object", object)
@ -80,7 +110,8 @@ defp check_replace(%{"object" => %{} = object} = message) do
end end
@impl true @impl true
def filter(%{"type" => "Create", "object" => %{"content" => _content}} = message) do def filter(%{"type" => type, "object" => %{"content" => _content}} = message)
when type in ["Create", "Update"] do
with {:ok, message} <- check_reject(message), with {:ok, message} <- check_reject(message),
{:ok, message} <- check_ftl_removal(message), {:ok, message} <- check_ftl_removal(message),
{:ok, message} <- check_replace(message) do {:ok, message} <- check_replace(message) do

@ -15,6 +15,9 @@ defmodule Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy do
recv_timeout: 10_000 recv_timeout: 10_000
] ]
@impl true
def history_awareness, do: :auto
defp prefetch(url) do defp prefetch(url) do
# Fetching only proxiable resources # Fetching only proxiable resources
if MediaProxy.enabled?() and MediaProxy.url_proxiable?(url) do if MediaProxy.enabled?() and MediaProxy.url_proxiable?(url) do
@ -53,10 +56,8 @@ defp preload(%{"object" => %{"attachment" => attachments}} = _message) do
end end
@impl true @impl true
def filter( def filter(%{"type" => type, "object" => %{"attachment" => attachments} = _object} = message)
%{"type" => "Create", "object" => %{"attachment" => attachments} = _object} = message when type in ["Create", "Update"] and is_list(attachments) and length(attachments) > 0 do
)
when is_list(attachments) and length(attachments) > 0 do
preload(message) preload(message)
{:ok, message} {:ok, message}

@ -11,6 +11,7 @@ defmodule Pleroma.Web.ActivityPub.MRF.NoEmptyPolicy do
@impl true @impl true
def filter(%{"actor" => actor} = object) do def filter(%{"actor" => actor} = object) do
with true <- is_local?(actor), with true <- is_local?(actor),
true <- is_eligible_type?(object),
true <- is_note?(object), true <- is_note?(object),
false <- has_attachment?(object), false <- has_attachment?(object),
true <- only_mentions?(object) do true <- only_mentions?(object) do
@ -32,7 +33,6 @@ defp is_local?(actor) do
end end
defp has_attachment?(%{ defp has_attachment?(%{
"type" => "Create",
"object" => %{"type" => "Note", "attachment" => attachments} "object" => %{"type" => "Note", "attachment" => attachments}
}) })
when length(attachments) > 0, when length(attachments) > 0,
@ -40,23 +40,13 @@ defp has_attachment?(%{
defp has_attachment?(_), do: false defp has_attachment?(_), do: false
defp only_mentions?(%{"type" => "Create", "object" => %{"type" => "Note", "source" => source}}) defp only_mentions?(%{"object" => %{"type" => "Note", "source" => source}}) do
when is_binary(source) do source =
non_mentions = case source do
source |> String.split() |> Enum.filter(&(not String.starts_with?(&1, "@"))) |> length %{"content" => text} -> text
_ -> source
end
if non_mentions > 0 do
false
else
true
end
end
defp only_mentions?(%{
"type" => "Create",
"object" => %{"type" => "Note", "source" => %{"content" => source}}
})
when is_binary(source) do
non_mentions = non_mentions =
source |> String.split() |> Enum.filter(&(not String.starts_with?(&1, "@"))) |> length source |> String.split() |> Enum.filter(&(not String.starts_with?(&1, "@"))) |> length
@ -69,9 +59,12 @@ defp only_mentions?(%{
defp only_mentions?(_), do: false defp only_mentions?(_), do: false
defp is_note?(%{"type" => "Create", "object" => %{"type" => "Note"}}), do: true defp is_note?(%{"object" => %{"type" => "Note"}}), do: true
defp is_note?(_), do: false defp is_note?(_), do: false
defp is_eligible_type?(%{"type" => type}) when type in ["Create", "Update"], do: true
defp is_eligible_type?(_), do: false
@impl true @impl true
def describe, do: {:ok, %{}} def describe, do: {:ok, %{}}
end end

@ -6,14 +6,17 @@ defmodule Pleroma.Web.ActivityPub.MRF.NoPlaceholderTextPolicy do
@moduledoc "Ensure no content placeholder is present (such as the dot from mastodon)" @moduledoc "Ensure no content placeholder is present (such as the dot from mastodon)"
@behaviour Pleroma.Web.ActivityPub.MRF.Policy @behaviour Pleroma.Web.ActivityPub.MRF.Policy
@impl true
def history_awareness, do: :auto
@impl true @impl true
def filter( def filter(
%{ %{
"type" => "Create", "type" => type,
"object" => %{"content" => content, "attachment" => _} = _child_object "object" => %{"content" => content, "attachment" => _} = _child_object
} = object } = object
) )
when content in [".", "<p>.</p>"] do when type in ["Create", "Update"] and content in [".", "<p>.</p>"] do
{:ok, put_in(object, ["object", "content"], "")} {:ok, put_in(object, ["object", "content"], "")}
end end

@ -9,7 +9,11 @@ defmodule Pleroma.Web.ActivityPub.MRF.NormalizeMarkup do
@behaviour Pleroma.Web.ActivityPub.MRF.Policy @behaviour Pleroma.Web.ActivityPub.MRF.Policy
@impl true @impl true
def filter(%{"type" => "Create", "object" => child_object} = object) do def history_awareness, do: :auto
@impl true
def filter(%{"type" => type, "object" => child_object} = object)
when type in ["Create", "Update"] do
scrub_policy = Pleroma.Config.get([:mrf_normalize_markup, :scrub_policy]) scrub_policy = Pleroma.Config.get([:mrf_normalize_markup, :scrub_policy])
content = content =

@ -12,5 +12,6 @@ defmodule Pleroma.Web.ActivityPub.MRF.Policy do
label: String.t(), label: String.t(),
description: String.t() description: String.t()
} }
@optional_callbacks config_description: 0 @callback history_awareness() :: :auto | :manual
@optional_callbacks config_description: 0, history_awareness: 0
end end

@ -86,8 +86,8 @@ def validate(
meta meta
) )
when objtype in ~w[Question Answer Audio Video Event Article Note Page] do when objtype in ~w[Question Answer Audio Video Event Article Note Page] do
with {:ok, object_data} <- cast_and_apply(object), with {:ok, object_data} <- cast_and_apply_and_stringify_with_history(object),
meta = Keyword.put(meta, :object_data, object_data |> stringify_keys), meta = Keyword.put(meta, :object_data, object_data),
{:ok, create_activity} <- {:ok, create_activity} <-
create_activity create_activity
|> CreateGenericValidator.cast_and_validate(meta) |> CreateGenericValidator.cast_and_validate(meta)
@ -111,19 +111,53 @@ def validate(%{"type" => type} = object, meta)
end end
with {:ok, object} <- with {:ok, object} <-
object do_separate_with_history(object, fn object ->
|> validator.cast_and_validate() with {:ok, object} <-
|> Ecto.Changeset.apply_action(:insert) do object
object = stringify_keys(object) |> validator.cast_and_validate()
|> Ecto.Changeset.apply_action(:insert) do
object = stringify_keys(object)
# Insert copy of hashtags as strings for the non-hashtag table indexing # Insert copy of hashtags as strings for the non-hashtag table indexing
tag = (object["tag"] || []) ++ Object.hashtags(%Object{data: object}) tag = (object["tag"] || []) ++ Object.hashtags(%Object{data: object})
object = Map.put(object, "tag", tag) object = Map.put(object, "tag", tag)
{:ok, object}
end
end) do
{:ok, object, meta} {:ok, object, meta}
end end
end end
def validate(
%{"type" => "Update", "object" => %{"type" => objtype} = object} = update_activity,
meta
)
when objtype in ~w[Question Answer Audio Video Event Article Note Page] do
with {_, false} <- {:local, Access.get(meta, :local, false)},
{_, {:ok, object_data, _}} <- {:object_validation, validate(object, meta)},
meta = Keyword.put(meta, :object_data, object_data),
{:ok, update_activity} <-
update_activity
|> UpdateValidator.cast_and_validate()
|> Ecto.Changeset.apply_action(:insert) do
update_activity = stringify_keys(update_activity)
{:ok, update_activity, meta}
else
{:local, _} ->
with {:ok, object} <-
update_activity
|> UpdateValidator.cast_and_validate()
|> Ecto.Changeset.apply_action(:insert) do
object = stringify_keys(object)
{:ok, object, meta}
end
{:object_validation, e} ->
e
end
end
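# In the Update clause above, a remote activity first validates the embedded
# object (history included) and stashes the result in meta[:object_data], while
# a local Update only validates the activity itself, since its embedded object
# was built locally and is applied later by the side effects.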
def validate(%{"type" => type} = object, meta) def validate(%{"type" => type} = object, meta)
when type in ~w[Accept Reject Follow Update Like EmojiReact Announce when type in ~w[Accept Reject Follow Update Like EmojiReact Announce
Answer] do Answer] do
@ -160,6 +194,15 @@ def validate(%{"type" => type} = object, meta) when type in ~w(Add Remove) do
def validate(o, m), do: {:error, {:validator_not_set, {o, m}}} def validate(o, m), do: {:error, {:validator_not_set, {o, m}}}
def cast_and_apply_and_stringify_with_history(object) do
do_separate_with_history(object, fn object ->
with {:ok, object_data} <- cast_and_apply(object),
object_data <- object_data |> stringify_keys() do
{:ok, object_data}
end
end)
end
def cast_and_apply(%{"type" => "Question"} = object) do def cast_and_apply(%{"type" => "Question"} = object) do
QuestionValidator.cast_and_apply(object) QuestionValidator.cast_and_apply(object)
end end
@ -214,4 +257,54 @@ def fetch_actor_and_object(object) do
Object.normalize(object["object"], fetch: true) Object.normalize(object["object"], fetch: true)
:ok :ok
end end
defp for_each_history_item(
%{"type" => "OrderedCollection", "orderedItems" => items} = history,
object,
fun
) do
processed_items =
Enum.map(items, fn item ->
with item <- Map.put(item, "id", object["id"]),
{:ok, item} <- fun.(item) do
item
else
_ -> nil
end
end)
if Enum.all?(processed_items, &(not is_nil(&1))) do
{:ok, Map.put(history, "orderedItems", processed_items)}
else
{:error, :invalid_history}
end
end
defp for_each_history_item(nil, _object, _fun) do
{:ok, nil}
end
defp for_each_history_item(_, _object, _fun) do
{:error, :invalid_history}
end
# fun is (object -> {:ok, validated_object_with_string_keys})
defp do_separate_with_history(object, fun) do
with history <- object["formerRepresentations"],
object <- Map.drop(object, ["formerRepresentations"]),
{_, {:ok, object}} <- {:main_body, fun.(object)},
{_, {:ok, history}} <- {:history_items, for_each_history_item(history, object, fun)} do
object =
if history do
Map.put(object, "formerRepresentations", history)
else
object
end
{:ok, object}
else
{:main_body, e} -> e
{:history_items, e} -> e
end
end
end end

@ -53,7 +53,10 @@ defp fix_url(%{"url" => url} = data) when is_bitstring(url), do: data
defp fix_url(%{"url" => url} = data) when is_map(url), do: Map.put(data, "url", url["href"]) defp fix_url(%{"url" => url} = data) when is_map(url), do: Map.put(data, "url", url["href"])
defp fix_url(data), do: data defp fix_url(data), do: data
defp fix_tag(%{"tag" => tag} = data) when is_list(tag), do: data defp fix_tag(%{"tag" => tag} = data) when is_list(tag) do
Map.put(data, "tag", Enum.filter(tag, &is_map/1))
end
defp fix_tag(%{"tag" => tag} = data) when is_map(tag), do: Map.put(data, "tag", [tag]) defp fix_tag(%{"tag" => tag} = data) when is_map(tag), do: Map.put(data, "tag", [tag])
defp fix_tag(data), do: Map.drop(data, ["tag"]) defp fix_tag(data), do: Map.drop(data, ["tag"])

@ -11,6 +11,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator do
@primary_key false @primary_key false
embedded_schema do embedded_schema do
field(:id, :string)
field(:type, :string) field(:type, :string)
field(:mediaType, :string, default: "application/octet-stream") field(:mediaType, :string, default: "application/octet-stream")
field(:name, :string) field(:name, :string)
@ -43,7 +44,7 @@ def changeset(struct, data) do
|> fix_url() |> fix_url()
struct struct
|> cast(data, [:type, :mediaType, :name, :blurhash]) |> cast(data, [:id, :type, :mediaType, :name, :blurhash])
|> cast_embed(:url, with: &url_changeset/2, required: true) |> cast_embed(:url, with: &url_changeset/2, required: true)
|> validate_inclusion(:type, ~w[Link Document Audio Image Video]) |> validate_inclusion(:type, ~w[Link Document Audio Image Video])
|> validate_required([:type, :mediaType]) |> validate_required([:type, :mediaType])

@ -33,6 +33,7 @@ defmacro object_fields do
field(:content, :string) field(:content, :string)
field(:published, ObjectValidators.DateTime) field(:published, ObjectValidators.DateTime)
field(:updated, ObjectValidators.DateTime)
field(:emoji, ObjectValidators.Emoji, default: %{}) field(:emoji, ObjectValidators.Emoji, default: %{})
embeds_many(:attachment, AttachmentValidator) embeds_many(:attachment, AttachmentValidator)
end end

@ -13,7 +13,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.EmojiReactValidator do
import Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations import Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations
@primary_key false @primary_key false
@emoji_regex ~r/:[A-Za-z0-9_-]+:/ @emoji_regex ~r/:[A-Za-z0-9_-]+(@.+)?:/
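# For illustration (shortcodes invented): the extended pattern accepts both
# plain and remote-qualified shortcodes.
#
#     Regex.match?(@emoji_regex, ":blobfox:")              #=> true
#     Regex.match?(@emoji_regex, ":blobfox@example.org:")  #=> true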
embedded_schema do embedded_schema do
quote do quote do

@ -51,7 +51,9 @@ def validate_updating_rights(cng) do
with actor = get_field(cng, :actor), with actor = get_field(cng, :actor),
object = get_field(cng, :object), object = get_field(cng, :object),
{:ok, object_id} <- ObjectValidators.ObjectID.cast(object), {:ok, object_id} <- ObjectValidators.ObjectID.cast(object),
true <- actor == object_id do actor_uri <- URI.parse(actor),
object_uri <- URI.parse(object_id),
true <- actor_uri.host == object_uri.host do
cng cng
else else
_e -> _e ->

@ -23,6 +23,7 @@ defmodule Pleroma.Web.ActivityPub.SideEffects do
alias Pleroma.Web.Streamer alias Pleroma.Web.Streamer
alias Pleroma.Workers.PollWorker alias Pleroma.Workers.PollWorker
require Pleroma.Constants
require Logger require Logger
@logger Pleroma.Config.get([:side_effects, :logger], Logger) @logger Pleroma.Config.get([:side_effects, :logger], Logger)
@ -150,23 +151,26 @@ def handle(
# Tasks this handles: # Tasks this handles:
# - Update the user # - Update the user
# - Update a non-user object (Note, Question, etc.)
# #
# For a local user, we also get a changeset with the full information, so we # For a local user, we also get a changeset with the full information, so we
# can update non-federating, non-activitypub settings as well. # can update non-federating, non-activitypub settings as well.
@impl true @impl true
def handle(%{data: %{"type" => "Update", "object" => updated_object}} = object, meta) do def handle(%{data: %{"type" => "Update", "object" => updated_object}} = object, meta) do
-    if changeset = Keyword.get(meta, :user_update_changeset) do
-      changeset
-      |> User.update_and_set_cache()
-    else
-      {:ok, new_user_data} = ActivityPub.user_data_from_user_object(updated_object)
-
-      User.get_by_ap_id(updated_object["id"])
-      |> User.remote_user_changeset(new_user_data)
-      |> User.update_and_set_cache()
-    end
-
-    {:ok, object, meta}
+    updated_object_id = updated_object["id"]
+
+    with {_, true} <- {:has_id, is_binary(updated_object_id)},
+         %{"type" => type} <- updated_object,
+         {_, is_user} <- {:is_user, type in Pleroma.Constants.actor_types()} do
+      if is_user do
+        handle_update_user(object, meta)
+      else
+        handle_update_object(object, meta)
+      end
+    else
+      _ ->
+        {:ok, object, meta}
+    end
end end
# Tasks this handles: # Tasks this handles:
@ -395,6 +399,79 @@ def handle(object, meta) do
{:ok, object, meta} {:ok, object, meta}
end end
defp handle_update_user(
%{data: %{"type" => "Update", "object" => updated_object}} = object,
meta
) do
if changeset = Keyword.get(meta, :user_update_changeset) do
changeset
|> User.update_and_set_cache()
else
{:ok, new_user_data} = ActivityPub.user_data_from_user_object(updated_object)
User.get_by_ap_id(updated_object["id"])
|> User.remote_user_changeset(new_user_data)
|> User.update_and_set_cache()
end
{:ok, object, meta}
end
defp handle_update_object(
%{data: %{"type" => "Update", "object" => updated_object}} = object,
meta
) do
orig_object_ap_id = updated_object["id"]
orig_object = Object.get_by_ap_id(orig_object_ap_id)
orig_object_data = orig_object.data
updated_object =
if meta[:local] do
# If this is a local Update, we don't process it by transmogrifier,
# so we use the embedded object as-is.
updated_object
else
meta[:object_data]
end
if orig_object_data["type"] in Pleroma.Constants.updatable_object_types() do
%{
updated_data: updated_object_data,
updated: updated,
used_history_in_new_object?: used_history_in_new_object?
} = Object.Updater.make_new_object_data_from_update_object(orig_object_data, updated_object)
changeset =
orig_object
|> Repo.preload(:hashtags)
|> Object.change(%{data: updated_object_data})
with {:ok, new_object} <- Repo.update(changeset),
{:ok, _} <- Object.invalid_object_cache(new_object),
{:ok, _} <- Object.set_cache(new_object),
# The metadata/utils.ex uses the object id for the cache.
{:ok, _} <- Pleroma.Activity.HTML.invalidate_cache_for(new_object.id) do
if used_history_in_new_object? do
with create_activity when not is_nil(create_activity) <-
Pleroma.Activity.get_create_by_object_ap_id(orig_object_ap_id),
{:ok, _} <- Pleroma.Activity.HTML.invalidate_cache_for(create_activity.id) do
nil
else
_ -> nil
end
end
if updated do
object
|> Activity.normalize()
|> ActivityPub.notify_and_stream()
end
end
end
{:ok, object, meta}
end
def handle_object_creation(%{"type" => "Question"} = object, activity, meta) do def handle_object_creation(%{"type" => "Question"} = object, activity, meta) do
with {:ok, object, meta} <- Pipeline.common_pipeline(object, meta) do with {:ok, object, meta} <- Pipeline.common_pipeline(object, meta) do
PollWorker.schedule_poll_end(activity) PollWorker.schedule_poll_end(activity)
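A sketch of the dispatch performed by the new `handle/2` clause above; the actor-type list here is an assumption standing in for `Pleroma.Constants.actor_types()`.

```elixir
# Assumed stand-in for Pleroma.Constants.actor_types()
actor_types = ~w(Application Group Organization Person Service)

route = fn
  %{"type" => type} when is_binary(type) ->
    if type in actor_types, do: :handle_update_user, else: :handle_update_object

  _ ->
    :ignore
end

route.(%{"type" => "Person"})  # => :handle_update_user (profile update)
route.(%{"type" => "Note"})    # => :handle_update_object (post edit)
```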

View file

@ -699,6 +699,24 @@ def prepare_object(object) do
|> strip_internal_fields |> strip_internal_fields
|> strip_internal_tags |> strip_internal_tags
|> set_type |> set_type
|> maybe_process_history
end
defp maybe_process_history(%{"formerRepresentations" => %{"orderedItems" => history}} = object) do
processed_history =
Enum.map(
history,
fn
item when is_map(item) -> prepare_object(item)
item -> item
end
)
put_in(object, ["formerRepresentations", "orderedItems"], processed_history)
end
defp maybe_process_history(object) do
object
end end
# @doc # @doc
@ -723,6 +741,21 @@ def prepare_outgoing(%{"type" => activity_type, "object" => object_id} = data)
{:ok, data} {:ok, data}
end end
def prepare_outgoing(%{"type" => "Update", "object" => %{"type" => objtype} = object} = data)
when objtype in Pleroma.Constants.updatable_object_types() do
object =
object
|> prepare_object
data =
data
|> Map.put("object", object)
|> Map.merge(Utils.make_json_ld_header())
|> Map.delete("bcc")
{:ok, data}
end
def prepare_outgoing(%{"type" => "Announce", "actor" => ap_id, "object" => object_id} = data) do def prepare_outgoing(%{"type" => "Announce", "actor" => ap_id, "object" => object_id} = data) do
object = object =
object_id object_id
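For reference, the shape `maybe_process_history/1` walks, sketched with made-up content; each history item is run through `prepare_object/1` and written back in place.

```elixir
object = %{
  "type" => "Note",
  "content" => "current revision",
  "formerRepresentations" => %{
    "type" => "OrderedCollection",
    "orderedItems" => [
      %{"type" => "Note", "content" => "older revision"}
    ]
  }
}

history = get_in(object, ["formerRepresentations", "orderedItems"])
# In the real code each item is prepared first; here it is written back unchanged.
put_in(object, ["formerRepresentations", "orderedItems"], history)
```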

View file

@ -329,7 +329,7 @@ def add_emoji_reaction_to_object(
object object
) do ) do
reactions = get_cached_emoji_reactions(object) reactions = get_cached_emoji_reactions(object)
-    emoji = stripped_emoji_name(emoji)
+    emoji = Pleroma.Emoji.stripped_name(emoji)
url = emoji_url(emoji, activity) url = emoji_url(emoji, activity)
new_reactions = new_reactions =
@ -356,12 +356,6 @@ def add_emoji_reaction_to_object(
update_element_in_object("reaction", new_reactions, object, count) update_element_in_object("reaction", new_reactions, object, count)
end end
defp stripped_emoji_name(name) do
name
|> String.replace_leading(":", "")
|> String.replace_trailing(":", "")
end
defp emoji_url( defp emoji_url(
name, name,
%Activity{ %Activity{
@ -384,7 +378,7 @@ def remove_emoji_reaction_from_object(
%Activity{data: %{"content" => emoji, "actor" => actor}} = activity, %Activity{data: %{"content" => emoji, "actor" => actor}} = activity,
object object
) do ) do
-    emoji = stripped_emoji_name(emoji)
+    emoji = Pleroma.Emoji.stripped_name(emoji)
reactions = get_cached_emoji_reactions(object) reactions = get_cached_emoji_reactions(object)
url = emoji_url(emoji, activity) url = emoji_url(emoji, activity)
@ -472,18 +466,6 @@ def update_follow_state_for_all(
{:ok, activity} {:ok, activity}
end end
def update_follow_state(
%Activity{} = activity,
state
) do
new_data = Map.put(activity.data, "state", state)
changeset = Changeset.change(activity, data: new_data)
with {:ok, activity} <- Repo.update(changeset) do
{:ok, activity}
end
end
@doc """ @doc """
Makes a follow activity data for the given follower and followed Makes a follow activity data for the given follower and followed
""" """
@ -525,19 +507,37 @@ def fetch_latest_undo(%User{ap_id: ap_id}) do
def get_latest_reaction(internal_activity_id, %{ap_id: ap_id}, emoji) do def get_latest_reaction(internal_activity_id, %{ap_id: ap_id}, emoji) do
%{data: %{"object" => object_ap_id}} = Activity.get_by_id(internal_activity_id) %{data: %{"object" => object_ap_id}} = Activity.get_by_id(internal_activity_id)
emoji = Pleroma.Emoji.maybe_quote(emoji) emoji = Pleroma.Emoji.maybe_quote(emoji)
"EmojiReact" "EmojiReact"
|> Activity.Queries.by_type() |> Activity.Queries.by_type()
|> where(actor: ^ap_id) |> where(actor: ^ap_id)
-    |> where([activity], fragment("?->>'content' = ?", activity.data, ^emoji))
+    |> custom_emoji_discriminator(emoji)
|> Activity.Queries.by_object_id(object_ap_id) |> Activity.Queries.by_object_id(object_ap_id)
|> order_by([activity], fragment("? desc nulls last", activity.id)) |> order_by([activity], fragment("? desc nulls last", activity.id))
|> limit(1) |> limit(1)
|> Repo.one() |> Repo.one()
end end
defp custom_emoji_discriminator(query, emoji) do
if String.contains?(emoji, "@") do
stripped = Pleroma.Emoji.stripped_name(emoji)
[name, domain] = String.split(stripped, "@")
domain_pattern = "%" <> domain <> "%"
emoji_pattern = Pleroma.Emoji.maybe_quote(name)
query
|> where([activity], fragment("?->>'content' = ?
AND EXISTS (
SELECT FROM jsonb_array_elements(?->'tag') elem
WHERE elem->>'id' ILIKE ?
)", activity.data, ^emoji_pattern, activity.data, ^domain_pattern))
else
query
|> where([activity], fragment("?->>'content' = ?", activity.data, ^emoji))
end
end
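Roughly what the discriminator does to a remote reaction, with `Pleroma.Emoji.stripped_name/1` and `maybe_quote/1` inlined for illustration; the emoji and domain are made up.

```elixir
emoji = ":blobcat@example.org:"

stripped =
  emoji
  |> String.replace_leading(":", "")
  |> String.replace_trailing(":", "")

[name, domain] = String.split(stripped, "@")

{":" <> name <> ":", "%" <> domain <> "%"}
# => {":blobcat:", "%example.org%"}
# i.e. the content match and the ILIKE pattern bound into the fragment above
```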
#### Announce-related helpers #### Announce-related helpers
@doc """ @doc """

View file

@ -29,11 +29,11 @@ def render("object.json", %{object: %Activity{data: %{"type" => activity_type}}
def render("object.json", %{object: %Activity{} = activity}) do def render("object.json", %{object: %Activity{} = activity}) do
base = Pleroma.Web.ActivityPub.Utils.make_json_ld_header() base = Pleroma.Web.ActivityPub.Utils.make_json_ld_header()
-    object = Object.normalize(activity, fetch: false)
+    object_id = Object.normalize(activity, id_only: true)
additional = additional =
Transmogrifier.prepare_object(activity.data) Transmogrifier.prepare_object(activity.data)
|> Map.put("object", object.data["id"]) |> Map.put("object", object_id)
Map.merge(base, additional) Map.merge(base, additional)
end end

View file

@ -34,7 +34,6 @@ def render("endpoints.json", %{user: %User{local: true} = _user}) do
def render("endpoints.json", _), do: %{} def render("endpoints.json", _), do: %{}
def render("service.json", %{user: user}) do def render("service.json", %{user: user}) do
{:ok, user} = User.ensure_keys_present(user)
{:ok, _, public_key} = Keys.keys_from_pem(user.keys) {:ok, _, public_key} = Keys.keys_from_pem(user.keys)
public_key = :public_key.pem_entry_encode(:SubjectPublicKeyInfo, public_key) public_key = :public_key.pem_entry_encode(:SubjectPublicKeyInfo, public_key)
public_key = :public_key.pem_encode([public_key]) public_key = :public_key.pem_encode([public_key])
@ -71,7 +70,6 @@ def render("user.json", %{user: %User{nickname: "internal." <> _} = user}),
do: render("service.json", %{user: user}) |> Map.put("preferredUsername", user.nickname) do: render("service.json", %{user: user}) |> Map.put("preferredUsername", user.nickname)
def render("user.json", %{user: user}) do def render("user.json", %{user: user}) do
{:ok, user} = User.ensure_keys_present(user)
{:ok, _, public_key} = Keys.keys_from_pem(user.keys) {:ok, _, public_key} = Keys.keys_from_pem(user.keys)
public_key = :public_key.pem_entry_encode(:SubjectPublicKeyInfo, public_key) public_key = :public_key.pem_entry_encode(:SubjectPublicKeyInfo, public_key)
public_key = :public_key.pem_encode([public_key]) public_key = :public_key.pem_encode([public_key])

View file

@ -0,0 +1,43 @@
defmodule Pleroma.Web.AkkomaAPI.TranslationController do
use Pleroma.Web, :controller
alias Pleroma.Web.Plugs.OAuthScopesPlug
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
@unauthenticated_access %{fallback: :proceed_unauthenticated, scopes: []}
plug(
OAuthScopesPlug,
%{@unauthenticated_access | scopes: ["read:statuses"]}
when action in [
:languages
]
)
plug(Pleroma.Web.ApiSpec.CastAndValidate)
defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.TranslationOperation
action_fallback(Pleroma.Web.MastodonAPI.FallbackController)
@doc "GET /api/v1/akkoma/translation/languages"
def languages(conn, _params) do
with {:ok, source_languages, dest_languages} <- get_languages() do
conn
|> json(%{source: source_languages, target: dest_languages})
else
e -> IO.inspect(e)
end
end
defp get_languages do
module = Pleroma.Config.get([:translator, :module])
    @cachex.fetch!(:translations_cache, "languages:#{module}", fn _ ->
with {:ok, source_languages, dest_languages} <- module.languages() do
{:ok, source_languages, dest_languages}
else
{:error, err} -> {:ignore, {:error, err}}
end
end)
end
end
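For context, a sketch of the configuration the translation feature reads (here and in the status controller further down); the `[:translator, :enabled]` and `[:translator, :module]` keys appear in this diff, while the concrete module name is an assumption.

```elixir
# config/config.exs (sketch; the translator module name is assumed)
config :pleroma, :translator,
  enabled: true,
  module: Pleroma.Akkoma.Translators.DeepL
```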

View file

@ -6,9 +6,13 @@ defmodule Pleroma.Web.ApiSpec.StatusOperation do
alias OpenApiSpex.Operation alias OpenApiSpex.Operation
alias OpenApiSpex.Schema alias OpenApiSpex.Schema
alias Pleroma.Web.ApiSpec.AccountOperation alias Pleroma.Web.ApiSpec.AccountOperation
alias Pleroma.Web.ApiSpec.Schemas.Account
alias Pleroma.Web.ApiSpec.Schemas.ApiError alias Pleroma.Web.ApiSpec.Schemas.ApiError
alias Pleroma.Web.ApiSpec.Schemas.Attachment
alias Pleroma.Web.ApiSpec.Schemas.BooleanLike alias Pleroma.Web.ApiSpec.Schemas.BooleanLike
alias Pleroma.Web.ApiSpec.Schemas.Emoji
alias Pleroma.Web.ApiSpec.Schemas.FlakeID alias Pleroma.Web.ApiSpec.Schemas.FlakeID
alias Pleroma.Web.ApiSpec.Schemas.Poll
alias Pleroma.Web.ApiSpec.Schemas.ScheduledStatus alias Pleroma.Web.ApiSpec.Schemas.ScheduledStatus
alias Pleroma.Web.ApiSpec.Schemas.Status alias Pleroma.Web.ApiSpec.Schemas.Status
alias Pleroma.Web.ApiSpec.Schemas.VisibilityScope alias Pleroma.Web.ApiSpec.Schemas.VisibilityScope
@ -406,6 +410,75 @@ def bookmarks_operation do
} }
end end
def translate_operation do
%Operation{
tags: ["Retrieve status translation"],
summary: "Translate status",
description: "View the translation of a given status",
operationId: "StatusController.translation",
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [id_param(), language_param(), source_language_param()],
responses: %{
200 => Operation.response("Translation", "application/json", translation()),
400 => Operation.response("Error", "application/json", ApiError),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def show_history_operation do
%Operation{
tags: ["Retrieve status history"],
summary: "Status history",
description: "View history of a status",
operationId: "StatusController.show_history",
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [
id_param()
],
responses: %{
200 => status_history_response(),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def show_source_operation do
%Operation{
tags: ["Retrieve status source"],
summary: "Status source",
description: "View source of a status",
operationId: "StatusController.show_source",
security: [%{"oAuth" => ["read:statuses"]}],
parameters: [
id_param()
],
responses: %{
200 => status_source_response(),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def update_operation do
%Operation{
tags: ["Update status"],
summary: "Update status",
description: "Change the content of a status",
operationId: "StatusController.update",
security: [%{"oAuth" => ["write:statuses"]}],
parameters: [
id_param()
],
requestBody: request_body("Parameters", update_request(), required: true),
responses: %{
200 => status_response(),
403 => Operation.response("Forbidden", "application/json", ApiError),
404 => Operation.response("Not Found", "application/json", ApiError)
}
}
end
def array_of_statuses do def array_of_statuses do
%Schema{type: :array, items: Status, example: [Status.schema().example]} %Schema{type: :array, items: Status, example: [Status.schema().example]}
end end
@ -514,6 +587,60 @@ defp create_request do
} }
end end
defp update_request do
%Schema{
title: "StatusUpdateRequest",
type: :object,
properties: %{
status: %Schema{
type: :string,
nullable: true,
description:
"Text content of the status. If `media_ids` is provided, this becomes optional. Attaching a `poll` is optional while `status` is provided."
},
media_ids: %Schema{
nullable: true,
type: :array,
items: %Schema{type: :string},
description: "Array of Attachment ids to be attached as media."
},
poll: poll_params(),
sensitive: %Schema{
allOf: [BooleanLike],
nullable: true,
description: "Mark status and attached media as sensitive?"
},
spoiler_text: %Schema{
type: :string,
nullable: true,
description:
"Text to be shown as a warning or subject before the actual content. Statuses are generally collapsed behind this field."
},
content_type: %Schema{
type: :string,
nullable: true,
description:
"The MIME type of the status, it is transformed into HTML by the backend. You can get the list of the supported MIME types with the nodeinfo endpoint."
},
to: %Schema{
type: :array,
nullable: true,
items: %Schema{type: :string},
description:
"A list of nicknames (like `lain@soykaf.club` or `lain` on the local server) that will be used to determine who is going to be addressed by this post. Using this will disable the implicit addressing by mentioned names in the `status` body, only the people in the `to` list will be addressed. The normal rules for for post visibility are not affected by this and will still apply"
}
},
example: %{
"status" => "What time is it?",
"sensitive" => "false",
"poll" => %{
"options" => ["Cofe", "Adventure"],
"expires_in" => 420
}
}
}
end
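An example body matching the `StatusUpdateRequest` schema above, as a client might send it to `PUT /api/v1/statuses/:id`; the values are made up.

```elixir
body = %{
  "status" => "What time is it? (edited)",
  "spoiler_text" => "",
  "sensitive" => "false",
  "media_ids" => []
}

Jason.encode!(body)
```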
def poll_params do def poll_params do
%Schema{ %Schema{
nullable: true, nullable: true,
@ -552,10 +679,99 @@ def id_param do
) )
end end
defp language_param do
Operation.parameter(:language, :path, :string, "ISO 639 language code", example: "en")
end
defp source_language_param do
Operation.parameter(:from, :query, :string, "ISO 639 language code", example: "en")
end
defp status_response do defp status_response do
Operation.response("Status", "application/json", Status) Operation.response("Status", "application/json", Status)
end end
defp status_history_response do
Operation.response(
"Status History",
"application/json",
%Schema{
title: "Status history",
description: "Response schema for history of a status",
type: :array,
items: %Schema{
type: :object,
properties: %{
account: %Schema{
allOf: [Account],
description: "The account that authored this status"
},
content: %Schema{
type: :string,
format: :html,
description: "HTML-encoded status content"
},
sensitive: %Schema{
type: :boolean,
description: "Is this status marked as sensitive content?"
},
spoiler_text: %Schema{
type: :string,
description:
"Subject or summary line, below which status content is collapsed until expanded"
},
created_at: %Schema{
type: :string,
format: "date-time",
description: "The date when this status was created"
},
media_attachments: %Schema{
type: :array,
items: Attachment,
description: "Media that is attached to this status"
},
emojis: %Schema{
type: :array,
items: Emoji,
description: "Custom emoji to be used when rendering status content"
},
poll: %Schema{
allOf: [Poll],
nullable: true,
description: "The poll attached to the status"
}
}
}
}
)
end
defp status_source_response do
Operation.response(
"Status Source",
"application/json",
%Schema{
type: :object,
properties: %{
id: FlakeID,
text: %Schema{
type: :string,
description: "Raw source of status content"
},
spoiler_text: %Schema{
type: :string,
description:
"Subject or summary line, below which status content is collapsed until expanded"
},
content_type: %Schema{
type: :string,
description: "The content type of the source"
}
}
}
)
end
defp context do defp context do
%Schema{ %Schema{
title: "StatusContext", title: "StatusContext",
@ -573,4 +789,20 @@ defp context do
} }
} }
end end
defp translation do
%Schema{
title: "StatusTranslation",
description: "The translation of a status.",
type: :object,
required: [:detected_language, :text],
properties: %{
detected_language: %Schema{
type: :string,
description: "The detected language of the text"
},
text: %Schema{type: :string, description: "The translated text"}
}
}
end
end end

View file

@ -0,0 +1,53 @@
defmodule Pleroma.Web.ApiSpec.TranslationOperation do
alias OpenApiSpex.Operation
alias OpenApiSpex.Schema
@spec open_api_operation(atom) :: Operation.t()
def open_api_operation(action) do
operation = String.to_existing_atom("#{action}_operation")
apply(__MODULE__, operation, [])
end
@spec languages_operation() :: Operation.t()
def languages_operation() do
%Operation{
tags: ["Retrieve status translation"],
summary: "Translate status",
description: "View the translation of a given status",
operationId: "AkkomaAPI.TranslationController.languages",
security: [%{"oAuth" => ["read:statuses"]}],
responses: %{
200 =>
Operation.response("Translation", "application/json", source_dest_languages_schema())
}
}
end
defp source_dest_languages_schema do
%Schema{
type: :object,
required: [:source, :target],
properties: %{
source: languages_schema(),
target: languages_schema()
}
}
end
defp languages_schema do
%Schema{
type: :array,
items: %Schema{
type: :object,
properties: %{
code: %Schema{
type: :string
},
name: %Schema{
type: :string
}
}
}
}
end
end
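The response described by `source_dest_languages_schema/0` would look roughly like this; the language entries are made up.

```elixir
%{
  "source" => [%{"code" => "en", "name" => "English"}, %{"code" => "ja", "name" => "Japanese"}],
  "target" => [%{"code" => "en", "name" => "English"}, %{"code" => "de", "name" => "German"}]
}
```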

View file

@ -405,6 +405,16 @@ defp remote_interaction_request do
} }
end end
def show_subscribe_form_operation do
%Operation{
tags: ["Accounts"],
summary: "Show remote subscribe form",
operationId: "UtilController.show_subscribe_form",
parameters: [],
      responses: %{200 => Operation.response("Web Page", "text/html", %Schema{type: :string})}
}
end
defp delete_account_request do defp delete_account_request do
%Schema{ %Schema{
title: "AccountDeleteRequest", title: "AccountDeleteRequest",

View file

@ -73,6 +73,12 @@ defmodule Pleroma.Web.ApiSpec.Schemas.Status do
format: "date-time", format: "date-time",
description: "The date when this status was created" description: "The date when this status was created"
}, },
edited_at: %Schema{
type: :string,
format: "date-time",
nullable: true,
description: "The date when this status was last edited"
},
emojis: %Schema{ emojis: %Schema{
type: :array, type: :array,
items: Emoji, items: Emoji,

View file

@ -209,7 +209,8 @@ def react_with_emoji(id, user, emoji) do
{:ok, activity, _} <- Pipeline.common_pipeline(emoji_react, local: true) do {:ok, activity, _} <- Pipeline.common_pipeline(emoji_react, local: true) do
{:ok, activity} {:ok, activity}
else else
-      _ -> {:error, dgettext("errors", "Could not add reaction emoji")}
+      _ ->
+        {:error, dgettext("errors", "Could not add reaction emoji")}
end end
end end
@ -346,6 +347,41 @@ def post(user, %{status: _} = data) do
end end
end end
def update(user, orig_activity, changes) do
with orig_object <- Object.normalize(orig_activity),
{:ok, new_object} <- make_update_data(user, orig_object, changes),
{:ok, update_data, _} <- Builder.update(user, new_object),
{:ok, update, _} <- Pipeline.common_pipeline(update_data, local: true) do
{:ok, update}
else
_ -> {:error, nil}
end
end
defp make_update_data(user, orig_object, changes) do
kept_params = %{
visibility: Visibility.get_visibility(orig_object),
in_reply_to_id:
with replied_id when is_binary(replied_id) <- orig_object.data["inReplyTo"],
%Activity{id: activity_id} <- Activity.get_create_by_object_ap_id(replied_id) do
activity_id
else
_ -> nil
end
}
params = Map.merge(changes, kept_params)
with {:ok, draft} <- ActivityDraft.create(user, params) do
change =
Object.Updater.make_update_object_data(orig_object.data, draft.object, Utils.make_date())
{:ok, change}
else
_ -> {:error, nil}
end
end
@spec pin(String.t(), User.t()) :: {:ok, Activity.t()} | {:error, term()} @spec pin(String.t(), User.t()) :: {:ok, Activity.t()} | {:error, term()}
def pin(id, %User{} = user) do def pin(id, %User{} = user) do
with %Activity{} = activity <- create_activity_by_id(id), with %Activity{} = activity <- create_activity_by_id(id),

View file

@ -221,7 +221,7 @@ defp object(draft) do
|> Map.put("emoji", emoji) |> Map.put("emoji", emoji)
|> Map.put("source", %{ |> Map.put("source", %{
"content" => draft.status, "content" => draft.status,
"mediaType" => draft.params[:content_type] "mediaType" => Utils.get_content_type(draft.params[:content_type])
}) })
|> Map.put("generator", draft.params[:generator]) |> Map.put("generator", draft.params[:generator])

View file

@ -37,7 +37,7 @@ def attachments_from_ids_no_descs([]), do: []
def attachments_from_ids_no_descs(ids) do def attachments_from_ids_no_descs(ids) do
Enum.map(ids, fn media_id -> Enum.map(ids, fn media_id ->
-      case Repo.get(Object, media_id) do
+      case get_attachment(media_id) do
%Object{data: data} -> data %Object{data: data} -> data
_ -> nil _ -> nil
end end
@ -51,13 +51,17 @@ def attachments_from_ids_descs(ids, descs_str) do
{_, descs} = Jason.decode(descs_str) {_, descs} = Jason.decode(descs_str)
Enum.map(ids, fn media_id -> Enum.map(ids, fn media_id ->
-      with %Object{data: data} <- Repo.get(Object, media_id) do
+      with %Object{data: data} <- get_attachment(media_id) do
Map.put(data, "name", descs[media_id]) Map.put(data, "name", descs[media_id])
end end
end) end)
|> Enum.reject(&is_nil/1) |> Enum.reject(&is_nil/1)
end end
defp get_attachment(media_id) do
Repo.get(Object, media_id)
end
@spec get_to_and_cc(ActivityDraft.t()) :: {list(String.t()), list(String.t())} @spec get_to_and_cc(ActivityDraft.t()) :: {list(String.t()), list(String.t())}
def get_to_and_cc(%{in_reply_to_conversation: %Participation{} = participation}) do def get_to_and_cc(%{in_reply_to_conversation: %Participation{} = participation}) do
@ -219,7 +223,7 @@ def make_content_html(%ActivityDraft{} = draft) do
|> maybe_add_attachments(draft.attachments, attachment_links) |> maybe_add_attachments(draft.attachments, attachment_links)
end end
-  defp get_content_type(content_type) do
+  def get_content_type(content_type) do
if Enum.member?(Config.get([:instance, :allowed_post_formats]), content_type) do if Enum.member?(Config.get([:instance, :allowed_post_formats]), content_type) do
content_type content_type
else else

View file

@ -69,10 +69,8 @@ def perform(:publish_one, module, params) do
def perform(:publish, activity) do def perform(:publish, activity) do
Logger.debug(fn -> "Running publish for #{activity.data["id"]}" end) Logger.debug(fn -> "Running publish for #{activity.data["id"]}" end)
-    with %User{} = actor <- User.get_cached_by_ap_id(activity.data["actor"]),
-         {:ok, actor} <- User.ensure_keys_present(actor) do
-      Publisher.publish(actor, activity)
-    end
+    %User{} = actor = User.get_cached_by_ap_id(activity.data["actor"])
+    Publisher.publish(actor, activity)
end end
def perform(:incoming_ap_doc, params) do def perform(:incoming_ap_doc, params) do

View file

@ -51,6 +51,7 @@ def index(conn, %{account_id: account_id} = params) do
move move
pleroma:emoji_reaction pleroma:emoji_reaction
poll poll
update
} }
def index(%{assigns: %{user: user}} = conn, params) do def index(%{assigns: %{user: user}} = conn, params) do
params = params =

View file

@ -14,6 +14,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
alias Pleroma.Bookmark alias Pleroma.Bookmark
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.Repo alias Pleroma.Repo
alias Pleroma.Config
alias Pleroma.ScheduledActivity alias Pleroma.ScheduledActivity
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub alias Pleroma.Web.ActivityPub.ActivityPub
@ -30,6 +31,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
plug(:skip_public_check when action in [:index, :show]) plug(:skip_public_check when action in [:index, :show])
@unauthenticated_access %{fallback: :proceed_unauthenticated, scopes: []} @unauthenticated_access %{fallback: :proceed_unauthenticated, scopes: []}
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
plug( plug(
OAuthScopesPlug, OAuthScopesPlug,
@ -37,7 +39,10 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
when action in [ when action in [
:index, :index,
:show, :show,
-         :context
+         :context,
+         :translate,
+         :show_history,
+         :show_source
] ]
) )
@ -48,7 +53,8 @@ defmodule Pleroma.Web.MastodonAPI.StatusController do
:create, :create,
:delete, :delete,
:reblog, :reblog,
-         :unreblog
+         :unreblog,
+         :update
] ]
) )
@ -190,6 +196,59 @@ def create(%{assigns: %{user: _user}, body_params: %{media_ids: _} = params} = c
create(%Plug.Conn{conn | body_params: params}, %{}) create(%Plug.Conn{conn | body_params: params}, %{})
end end
@doc "GET /api/v1/statuses/:id/history"
def show_history(%{assigns: assigns} = conn, %{id: id} = params) do
with user = assigns[:user],
%Activity{} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.visible_for_user?(activity, user) do
try_render(conn, "history.json",
activity: activity,
for: user,
with_direct_conversation_id: true,
with_muted: Map.get(params, :with_muted, false)
)
else
_ -> {:error, :not_found}
end
end
@doc "GET /api/v1/statuses/:id/source"
def show_source(%{assigns: assigns} = conn, %{id: id} = _params) do
with user = assigns[:user],
%Activity{} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.visible_for_user?(activity, user) do
try_render(conn, "source.json",
activity: activity,
for: user
)
else
_ -> {:error, :not_found}
end
end
@doc "PUT /api/v1/statuses/:id"
def update(%{assigns: %{user: user}, body_params: body_params} = conn, %{id: id} = params) do
with {_, %Activity{}} = {_, activity} <- {:activity, Activity.get_by_id_with_object(id)},
{_, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
{_, true} <- {:is_create, activity.data["type"] == "Create"},
actor <- Activity.user_actor(activity),
{_, true} <- {:own_status, actor.id == user.id},
changes <- body_params |> put_application(conn),
{_, {:ok, _update_activity}} <- {:pipeline, CommonAPI.update(user, activity, changes)},
{_, %Activity{}} = {_, activity} <- {:refetched, Activity.get_by_id_with_object(id)} do
try_render(conn, "show.json",
activity: activity,
for: user,
with_direct_conversation_id: true,
with_muted: Map.get(params, :with_muted, false)
)
else
{:own_status, _} -> {:error, :forbidden}
{:pipeline, _} -> {:error, :internal_server_error}
_ -> {:error, :not_found}
end
end
@doc "GET /api/v1/statuses/:id" @doc "GET /api/v1/statuses/:id"
def show(%{assigns: %{user: user}} = conn, %{id: id} = params) do def show(%{assigns: %{user: user}} = conn, %{id: id} = params) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id), with %Activity{} = activity <- Activity.get_by_id_with_object(id),
@ -418,6 +477,51 @@ def bookmarks(%{assigns: %{user: user}} = conn, params) do
) )
end end
@doc "GET /api/v1/statuses/:id/translations/:language"
def translate(%{assigns: %{user: user}} = conn, %{id: id, language: language} = params) do
with {:enabled, true} <- {:enabled, Config.get([:translator, :enabled])},
%Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
translation_module <- Config.get([:translator, :module]),
{:ok, detected, translation} <-
fetch_or_translate(
activity.id,
activity.object.data["content"],
Map.get(params, :from, nil),
language,
translation_module
) do
json(conn, %{detected_language: detected, text: translation})
else
{:enabled, false} ->
conn
|> put_status(:bad_request)
|> json(%{"error" => "Translation is not enabled"})
{:visible, false} ->
{:error, :not_found}
e ->
e
end
end
defp fetch_or_translate(status_id, text, source_language, target_language, translation_module) do
@cachex.fetch!(
:translations_cache,
"translations:#{status_id}:#{source_language}:#{target_language}",
fn _ ->
value = translation_module.translate(text, source_language, target_language)
with {:ok, _, _} <- value do
value
else
_ -> {:ignore, value}
end
end
)
end
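A sketch of the cache key `fetch_or_translate/5` builds, so repeated requests for the same status and language pair are served from `:translations_cache`; the id is made up.

```elixir
status_id = "AM7aBcDeFgHiJkLm"  # hypothetical flake id
source = nil                    # "from" query param omitted
target = "en"

"translations:#{status_id}:#{source}:#{target}"
# => "translations:AM7aBcDeFgHiJkLm::en"
```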
defp put_application(params, %{assigns: %{token: %Token{user: %User{} = user} = token}} = _conn) do defp put_application(params, %{assigns: %{token: %Token{user: %User{} = user} = token}} = _conn) do
if user.disclose_client do if user.disclose_client do
%{client_name: client_name, website: website} = Repo.preload(token, :app).app %{client_name: client_name, website: website} = Repo.preload(token, :app).app

View file

@ -65,6 +65,7 @@ def features do
"shareable_emoji_packs", "shareable_emoji_packs",
"multifetch", "multifetch",
"pleroma:api/v1/notifications:include_types_filter", "pleroma:api/v1/notifications:include_types_filter",
"editing",
if Config.get([:media_proxy, :enabled]) do if Config.get([:media_proxy, :enabled]) do
"media_proxy" "media_proxy"
end, end,
@ -81,7 +82,11 @@ def features do
if Config.get([:instance, :profile_directory]) do if Config.get([:instance, :profile_directory]) do
"profile_directory" "profile_directory"
end, end,
"custom_emoji_reactions" if Config.get([:translator, :enabled], false) do
"akkoma:machine_translation"
end,
"custom_emoji_reactions",
"pleroma:get:main/ostatus"
] ]
|> Enum.filter(& &1) |> Enum.filter(& &1)
end end

View file

@ -17,7 +17,11 @@ defmodule Pleroma.Web.MastodonAPI.NotificationView do
alias Pleroma.Web.MastodonAPI.NotificationView alias Pleroma.Web.MastodonAPI.NotificationView
alias Pleroma.Web.MastodonAPI.StatusView alias Pleroma.Web.MastodonAPI.StatusView
-  @parent_types ~w{Like Announce EmojiReact}
+  defp object_id_for(%{data: %{"object" => %{"id" => id}}}) when is_binary(id), do: id
+  defp object_id_for(%{data: %{"object" => id}}) when is_binary(id), do: id
+
+  @parent_types ~w{Like Announce EmojiReact Update}
def render("index.json", %{notifications: notifications, for: reading_user} = opts) do def render("index.json", %{notifications: notifications, for: reading_user} = opts) do
activities = Enum.map(notifications, & &1.activity) activities = Enum.map(notifications, & &1.activity)
@ -28,7 +32,7 @@ def render("index.json", %{notifications: notifications, for: reading_user} = op
%{data: %{"type" => type}} -> %{data: %{"type" => type}} ->
type in @parent_types type in @parent_types
end) end)
-      |> Enum.map(& &1.data["object"])
+      |> Enum.map(&object_id_for/1)
|> Activity.create_by_object_ap_id() |> Activity.create_by_object_ap_id()
|> Activity.with_preloaded_object(:left) |> Activity.with_preloaded_object(:left)
|> Pleroma.Repo.all() |> Pleroma.Repo.all()
@ -76,9 +80,9 @@ def render(
parent_activity_fn = fn -> parent_activity_fn = fn ->
if opts[:parent_activities] do if opts[:parent_activities] do
-        Activity.Queries.find_by_object_ap_id(opts[:parent_activities], activity.data["object"])
+        Activity.Queries.find_by_object_ap_id(opts[:parent_activities], object_id_for(activity))
       else
-        Activity.get_create_by_object_ap_id(activity.data["object"])
+        Activity.get_create_by_object_ap_id(object_id_for(activity))
end end
end end
@ -107,6 +111,9 @@ def render(
"reblog" -> "reblog" ->
put_status(response, parent_activity_fn.(), reading_user, status_render_opts) put_status(response, parent_activity_fn.(), reading_user, status_render_opts)
"update" ->
put_status(response, parent_activity_fn.(), reading_user, status_render_opts)
"move" -> "move" ->
put_target(response, activity, reading_user, %{}) put_target(response, activity, reading_user, %{})

View file

@ -265,10 +265,30 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
created_at = Utils.to_masto_date(object.data["published"]) created_at = Utils.to_masto_date(object.data["published"])
edited_at =
with %{"updated" => updated} <- object.data,
date <- Utils.to_masto_date(updated),
true <- date != "" do
date
else
_ ->
nil
end
reply_to = get_reply_to(activity, opts) reply_to = get_reply_to(activity, opts)
reply_to_user = reply_to && CommonAPI.get_user(reply_to.data["actor"]) reply_to_user = reply_to && CommonAPI.get_user(reply_to.data["actor"])
history_len =
1 +
(Object.Updater.history_for(object.data)
|> Map.get("orderedItems")
|> length())
# See render("history.json", ...) for more details
# Here the implicit index of the current content is 0
chrono_order = history_len - 1
content = content =
object object
|> render_content() |> render_content()
@ -278,14 +298,14 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
|> Activity.HTML.get_cached_scrubbed_html_for_activity( |> Activity.HTML.get_cached_scrubbed_html_for_activity(
User.html_filter_policy(opts[:for]), User.html_filter_policy(opts[:for]),
activity, activity,
"mastoapi:content" "mastoapi:content:#{chrono_order}"
) )
content_plaintext = content_plaintext =
content content
|> Activity.HTML.get_cached_stripped_html_for_activity( |> Activity.HTML.get_cached_stripped_html_for_activity(
activity, activity,
"mastoapi:content" "mastoapi:content:#{chrono_order}"
) )
summary = object.data["summary"] || "" summary = object.data["summary"] || ""
@ -353,8 +373,9 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity}
reblog: nil, reblog: nil,
card: card, card: card,
content: content_html, content: content_html,
-      text: opts[:with_source] && object.data["source"],
+      text: opts[:with_source] && get_source_text(object.data["source"]),
created_at: created_at, created_at: created_at,
edited_at: edited_at,
reblogs_count: announcement_count, reblogs_count: announcement_count,
replies_count: object.data["repliesCount"] || 0, replies_count: object.data["repliesCount"] || 0,
favourites_count: like_count, favourites_count: like_count,
@ -400,6 +421,100 @@ def render("show.json", _) do
nil nil
end end
def render("history.json", %{activity: %{data: %{"object" => _object}} = activity} = opts) do
object = Object.normalize(activity, fetch: false)
hashtags = Object.hashtags(object)
user = CommonAPI.get_user(activity.data["actor"])
past_history =
Object.Updater.history_for(object.data)
|> Map.get("orderedItems")
|> Enum.map(&Map.put(&1, "id", object.data["id"]))
|> Enum.map(&%Object{data: &1, id: object.id})
history =
[object | past_history]
      # Mastodon expects the original to come first
|> Enum.reverse()
|> Enum.with_index()
|> Enum.map(fn {object, chrono_order} ->
%{
# The history is prepended every time there is a new edit.
# In chrono_order, the oldest item is always at 0, and so on.
# The chrono_order is an invariant kept between edits.
chrono_order: chrono_order,
object: object
}
end)
individual_opts =
opts
|> Map.put(:as, :item)
|> Map.put(:user, user)
|> Map.put(:hashtags, hashtags)
render_many(history, StatusView, "history_item.json", individual_opts)
end
def render(
"history_item.json",
%{
activity: activity,
user: user,
item: %{object: object, chrono_order: chrono_order},
hashtags: hashtags
} = opts
) do
sensitive = object.data["sensitive"] || Enum.member?(hashtags, "nsfw")
attachment_data = object.data["attachment"] || []
attachments = render_many(attachment_data, StatusView, "attachment.json", as: :attachment)
created_at = Utils.to_masto_date(object.data["updated"] || object.data["published"])
content =
object
|> render_content()
content_html =
content
|> Activity.HTML.get_cached_scrubbed_html_for_activity(
User.html_filter_policy(opts[:for]),
activity,
"mastoapi:content:#{chrono_order}"
)
summary = object.data["summary"] || ""
%{
account:
AccountView.render("show.json", %{
user: user,
for: opts[:for]
}),
content: content_html,
sensitive: sensitive,
spoiler_text: summary,
created_at: created_at,
media_attachments: attachments,
emojis: build_emojis(object.data["emoji"]),
poll: render(PollView, "show.json", object: object, for: opts[:for])
}
end
def render("source.json", %{activity: %{data: %{"object" => _object}} = activity} = _opts) do
object = Object.normalize(activity, fetch: false)
%{
id: activity.id,
text: get_source_text(Map.get(object.data, "source", "")),
spoiler_text: Map.get(object.data, "summary", ""),
content_type: get_source_content_type(object.data["source"])
}
end
def render("card.json", %{rich_media: rich_media, page_url: page_url}) do def render("card.json", %{rich_media: rich_media, page_url: page_url}) do
page_url_data = URI.parse(page_url) page_url_data = URI.parse(page_url)
@ -452,10 +567,19 @@ def render("attachment.json", %{attachment: attachment}) do
true -> "unknown" true -> "unknown"
end end
-    <<hash_id::signed-32, _rest::binary>> = :crypto.hash(:md5, href)
+    attachment_id =
+      with {_, ap_id} when is_binary(ap_id) <- {:ap_id, attachment["id"]},
+           {_, %Object{data: _object_data, id: object_id}} <-
+             {:object, Object.get_by_ap_id(ap_id)} do
+        to_string(object_id)
+      else
+        _ ->
+          <<hash_id::signed-32, _rest::binary>> = :crypto.hash(:md5, href)
+          to_string(attachment["id"] || hash_id)
+      end
%{ %{
-      id: to_string(attachment["id"] || hash_id),
+      id: attachment_id,
url: href, url: href,
remote_url: href, remote_url: href,
preview_url: href_preview, preview_url: href_preview,
@ -587,7 +711,7 @@ defp pin_data(%Object{data: %{"id" => object_id}}, %User{pinned_objects: pinned_
defp build_emoji_map(emoji, users, url, current_user) do defp build_emoji_map(emoji, users, url, current_user) do
%{ %{
-      name: emoji,
+      name: Pleroma.Web.PleromaAPI.EmojiReactionView.emoji_name(emoji, url),
count: length(users), count: length(users),
url: MediaProxy.url(url), url: MediaProxy.url(url),
me: !!(current_user && current_user.ap_id in users), me: !!(current_user && current_user.ap_id in users),
@ -638,4 +762,24 @@ defp maybe_render_quote(quote, opts) do
_ -> nil _ -> nil
end end
end end
defp get_source_text(%{"content" => content} = _source) do
content
end
defp get_source_text(source) when is_binary(source) do
source
end
defp get_source_text(_) do
""
end
defp get_source_content_type(%{"mediaType" => type} = _source) do
type
end
defp get_source_content_type(_source) do
Utils.get_content_type(nil)
end
end end

View file

@ -8,8 +8,8 @@ defmodule Pleroma.Web.Metadata.Utils do
alias Pleroma.Formatter alias Pleroma.Formatter
alias Pleroma.HTML alias Pleroma.HTML
-  def scrub_html_and_truncate(%{data: %{"content" => content}} = object) do
-    content
+  defp scrub_html_and_truncate_object_field(field, object) do
+    field
# html content comes from DB already encoded, decode first and scrub after # html content comes from DB already encoded, decode first and scrub after
|> HtmlEntities.decode() |> HtmlEntities.decode()
|> String.replace(~r/<br\s?\/?>/, " ") |> String.replace(~r/<br\s?\/?>/, " ")
@ -19,6 +19,17 @@ def scrub_html_and_truncate(%{data: %{"content" => content}} = object) do
|> Formatter.truncate() |> Formatter.truncate()
end end
def scrub_html_and_truncate(%{data: %{"summary" => summary}} = object)
when is_binary(summary) and summary != "" do
summary
|> scrub_html_and_truncate_object_field(object)
end
def scrub_html_and_truncate(%{data: %{"content" => content}} = object) do
content
|> scrub_html_and_truncate_object_field(object)
end
def scrub_html_and_truncate(content, max_length \\ 200) when is_binary(content) do def scrub_html_and_truncate(content, max_length \\ 200) when is_binary(content) do
content content
|> scrub_html |> scrub_html

View file

@ -74,7 +74,10 @@ defp filter(reactions, %{emoji: emoji}) when is_binary(emoji) do
defp filter(reactions, _), do: reactions defp filter(reactions, _), do: reactions
def create(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) do def create(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) do
-    emoji = Pleroma.Emoji.maybe_quote(emoji)
+    emoji =
+      emoji
+      |> Pleroma.Emoji.fully_qualify_emoji()
+      |> Pleroma.Emoji.maybe_quote()
with {:ok, _activity} <- CommonAPI.react_with_emoji(activity_id, user, emoji) do with {:ok, _activity} <- CommonAPI.react_with_emoji(activity_id, user, emoji) do
activity = Activity.get_by_id(activity_id) activity = Activity.get_by_id(activity_id)
@ -86,6 +89,11 @@ def create(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) d
end end
def delete(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) do def delete(%{assigns: %{user: user}} = conn, %{id: activity_id, emoji: emoji}) do
emoji =
emoji
|> Pleroma.Emoji.fully_qualify_emoji()
|> Pleroma.Emoji.maybe_quote()
with {:ok, _activity} <- CommonAPI.unreact_with_emoji(activity_id, user, emoji) do with {:ok, _activity} <- CommonAPI.unreact_with_emoji(activity_id, user, emoji) do
activity = Activity.get_by_id(activity_id) activity = Activity.get_by_id(activity_id)

View file

@ -8,6 +8,18 @@ defmodule Pleroma.Web.PleromaAPI.EmojiReactionView do
alias Pleroma.Web.MastodonAPI.AccountView alias Pleroma.Web.MastodonAPI.AccountView
alias Pleroma.Web.MediaProxy alias Pleroma.Web.MediaProxy
def emoji_name(emoji, nil), do: emoji
def emoji_name(emoji, url) do
url = URI.parse(url)
if url.host == Pleroma.Web.Endpoint.host() do
emoji
else
"#{emoji}@#{url.host}"
end
end
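How `emoji_name/2` qualifies reactions, assuming a hypothetical local host of `akkoma.local`; hosts and emoji are made up.

```elixir
alias Pleroma.Web.PleromaAPI.EmojiReactionView

EmojiReactionView.emoji_name("blobcat", nil)
# => "blobcat" (unicode or url-less reactions pass through)

EmojiReactionView.emoji_name("blobcat", "https://akkoma.local/emoji/blobcat.png")
# => "blobcat" (local custom emoji keep the bare name)

EmojiReactionView.emoji_name("blobcat", "https://other.example/emoji/blobcat.png")
# => "blobcat@other.example" (remote emoji are qualified with their host)
```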
def render("index.json", %{emoji_reactions: emoji_reactions} = opts) do def render("index.json", %{emoji_reactions: emoji_reactions} = opts) do
render_many(emoji_reactions, __MODULE__, "show.json", opts) render_many(emoji_reactions, __MODULE__, "show.json", opts)
end end
@ -16,7 +28,7 @@ def render("show.json", %{emoji_reaction: {emoji, user_ap_ids, url}, user: user}
users = fetch_users(user_ap_ids) users = fetch_users(user_ap_ids)
%{ %{
-      name: emoji,
+      name: emoji_name(emoji, url),
count: length(users), count: length(users),
accounts: render(AccountView, "index.json", users: users, for: user), accounts: render(AccountView, "index.json", users: users, for: user),
url: MediaProxy.url(url), url: MediaProxy.url(url),

View file

@ -47,15 +47,17 @@ def call(conn, _) do
# #
@spec fetch_user_and_token(String.t()) :: {:ok, User.t(), Token.t()} | nil @spec fetch_user_and_token(String.t()) :: {:ok, User.t(), Token.t()} | nil
defp fetch_user_and_token(token) do defp fetch_user_and_token(token) do
-    query =
-      from(t in Token,
-        where: t.token == ^token,
-        join: user in assoc(t, :user),
-        preload: [user: user]
-      )
-
-    with %Token{user: user} = token_record <- Repo.one(query) do
-      {:ok, user, token_record}
-    end
+    token_query =
+      from(t in Token,
+        where: t.token == ^token
+      )
+
+    with %Token{user_id: user_id} = token_record <- Repo.one(token_query),
+         false <- is_nil(user_id),
+         %User{} = user <- User.get_cached_by_id(user_id) do
+      {:ok, user, token_record}
+    else
+      _ -> nil
+    end
end end

View file

@ -337,6 +337,7 @@ defmodule Pleroma.Web.Router do
pipe_through(:pleroma_html) pipe_through(:pleroma_html)
post("/main/ostatus", UtilController, :remote_subscribe) post("/main/ostatus", UtilController, :remote_subscribe)
get("/main/ostatus", UtilController, :show_subscribe_form)
get("/ostatus_subscribe", RemoteFollowController, :follow) get("/ostatus_subscribe", RemoteFollowController, :follow)
post("/ostatus_subscribe", RemoteFollowController, :do_follow) post("/ostatus_subscribe", RemoteFollowController, :do_follow)
end end
@ -462,6 +463,11 @@ defmodule Pleroma.Web.Router do
put("/statuses/:id/emoji_reactions/:emoji", EmojiReactionController, :create) put("/statuses/:id/emoji_reactions/:emoji", EmojiReactionController, :create)
end end
scope "/api/v1/akkoma", Pleroma.Web.AkkomaAPI do
pipe_through(:authenticated_api)
get("/translation/languages", TranslationController, :languages)
end
scope "/api/v1", Pleroma.Web.MastodonAPI do scope "/api/v1", Pleroma.Web.MastodonAPI do
pipe_through(:authenticated_api) pipe_through(:authenticated_api)
@ -542,6 +548,7 @@ defmodule Pleroma.Web.Router do
get("/bookmarks", StatusController, :bookmarks) get("/bookmarks", StatusController, :bookmarks)
post("/statuses", StatusController, :create) post("/statuses", StatusController, :create)
put("/statuses/:id", StatusController, :update)
delete("/statuses/:id", StatusController, :delete) delete("/statuses/:id", StatusController, :delete)
post("/statuses/:id/reblog", StatusController, :reblog) post("/statuses/:id/reblog", StatusController, :reblog)
post("/statuses/:id/unreblog", StatusController, :unreblog) post("/statuses/:id/unreblog", StatusController, :unreblog)
@ -553,6 +560,7 @@ defmodule Pleroma.Web.Router do
post("/statuses/:id/unbookmark", StatusController, :unbookmark) post("/statuses/:id/unbookmark", StatusController, :unbookmark)
post("/statuses/:id/mute", StatusController, :mute_conversation) post("/statuses/:id/mute", StatusController, :mute_conversation)
post("/statuses/:id/unmute", StatusController, :unmute_conversation) post("/statuses/:id/unmute", StatusController, :unmute_conversation)
get("/statuses/:id/translations/:language", StatusController, :translate)
post("/push/subscription", SubscriptionController, :create) post("/push/subscription", SubscriptionController, :create)
get("/push/subscription", SubscriptionController, :show) get("/push/subscription", SubscriptionController, :show)
@ -606,6 +614,8 @@ defmodule Pleroma.Web.Router do
get("/statuses/:id/context", StatusController, :context) get("/statuses/:id/context", StatusController, :context)
get("/statuses/:id/favourited_by", StatusController, :favourited_by) get("/statuses/:id/favourited_by", StatusController, :favourited_by)
get("/statuses/:id/reblogged_by", StatusController, :reblogged_by) get("/statuses/:id/reblogged_by", StatusController, :reblogged_by)
get("/statuses/:id/history", StatusController, :show_history)
get("/statuses/:id/source", StatusController, :show_source)
get("/custom_emojis", CustomEmojiController, :index) get("/custom_emojis", CustomEmojiController, :index)

View file

@ -287,6 +287,27 @@ defp push_to_socket(topic, %Activity{
defp push_to_socket(_topic, %Activity{data: %{"type" => "Delete"}}), do: :noop defp push_to_socket(_topic, %Activity{data: %{"type" => "Delete"}}), do: :noop
defp push_to_socket(topic, %Activity{data: %{"type" => "Update"}} = item) do
create_activity =
Pleroma.Activity.get_create_by_object_ap_id(item.object.data["id"])
|> Map.put(:object, item.object)
anon_render = StreamerView.render("status_update.json", create_activity, topic)
Registry.dispatch(@registry, topic, fn list ->
Enum.each(list, fn {pid, auth?} ->
if auth? do
send(
pid,
{:render_with_user, StreamerView, "status_update.json", create_activity, topic}
)
else
send(pid, {:text, anon_render})
end
end)
end)
end
defp push_to_socket(topic, item) do defp push_to_socket(topic, item) do
anon_render = StreamerView.render("update.json", item, topic) anon_render = StreamerView.render("update.json", item, topic)

View file

@ -0,0 +1,10 @@
<%= if @error do %>
<h2><%= Gettext.dpgettext("static_pages", "status interact error", "Error: %{error}", error: @error) %></h2>
<% else %>
<h2><%= raw Gettext.dpgettext("static_pages", "status interact header", "Interacting with %{nickname}'s %{status_link}", nickname: safe_to_string(html_escape(@nickname)), status_link: safe_to_string(link(Gettext.dpgettext("static_pages", "status interact header - status link text", "status"), to: @status_link))) %></h2>
<%= form_for @conn, Routes.util_path(@conn, :remote_subscribe), [as: "status"], fn f -> %>
<%= hidden_input f, :status_id, value: @status_id %>
<%= text_input f, :profile, placeholder: Gettext.dpgettext("static_pages", "placeholder text for account id", "Your account ID, e.g. lain@quitter.se") %>
<%= submit Gettext.dpgettext("static_pages", "status interact authorization button", "Interact") %>
<% end %>
<% end %>

View file

@ -7,6 +7,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
require Logger require Logger
alias Pleroma.Activity
alias Pleroma.Config alias Pleroma.Config
alias Pleroma.Emoji alias Pleroma.Emoji
alias Pleroma.Healthcheck alias Pleroma.Healthcheck
@ -16,8 +17,16 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
alias Pleroma.Web.Plugs.OAuthScopesPlug alias Pleroma.Web.Plugs.OAuthScopesPlug
alias Pleroma.Web.WebFinger alias Pleroma.Web.WebFinger
-  plug(Pleroma.Web.ApiSpec.CastAndValidate when action != :remote_subscribe)
-  plug(Pleroma.Web.Plugs.FederatingPlug when action == :remote_subscribe)
+  plug(
+    Pleroma.Web.ApiSpec.CastAndValidate
+    when action != :remote_subscribe and action != :show_subscribe_form
+  )
+
+  plug(
+    Pleroma.Web.Plugs.FederatingPlug
+    when action == :remote_subscribe
+    when action == :show_subscribe_form
+  )
plug( plug(
OAuthScopesPlug, OAuthScopesPlug,
@ -44,7 +53,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.TwitterUtilOperation defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.TwitterUtilOperation
-  def remote_subscribe(conn, %{"nickname" => nick, "profile" => _}) do
+  def show_subscribe_form(conn, %{"nickname" => nick}) do
with %User{} = user <- User.get_cached_by_nickname(nick), with %User{} = user <- User.get_cached_by_nickname(nick),
avatar = User.avatar_url(user) do avatar = User.avatar_url(user) do
conn conn
@ -54,11 +63,52 @@ def remote_subscribe(conn, %{"nickname" => nick, "profile" => _}) do
render(conn, "subscribe.html", %{ render(conn, "subscribe.html", %{
nickname: nick, nickname: nick,
avatar: nil, avatar: nil,
error: "Could not find user" error:
Pleroma.Web.Gettext.dpgettext(
"static_pages",
"remote follow error message - user not found",
"Could not find user"
)
}) })
end end
end end
def show_subscribe_form(conn, %{"status_id" => id}) do
with %Activity{} = activity <- Activity.get_by_id(id),
{:ok, ap_id} <- get_ap_id(activity),
%User{} = user <- User.get_cached_by_ap_id(activity.actor),
avatar = User.avatar_url(user) do
conn
|> render("status_interact.html", %{
status_link: ap_id,
status_id: id,
nickname: user.nickname,
avatar: avatar,
error: false
})
else
_e ->
render(conn, "status_interact.html", %{
status_id: id,
avatar: nil,
error:
Pleroma.Web.Gettext.dpgettext(
"static_pages",
"status interact error message - status not found",
"Could not find status"
)
})
end
end
def remote_subscribe(conn, %{"nickname" => nick, "profile" => _}) do
show_subscribe_form(conn, %{"nickname" => nick})
end
def remote_subscribe(conn, %{"status_id" => id, "profile" => _}) do
show_subscribe_form(conn, %{"status_id" => id})
end
def remote_subscribe(conn, %{"user" => %{"nickname" => nick, "profile" => profile}}) do def remote_subscribe(conn, %{"user" => %{"nickname" => nick, "profile" => profile}}) do
with {:ok, %{"subscribe_address" => template}} <- WebFinger.finger(profile), with {:ok, %{"subscribe_address" => template}} <- WebFinger.finger(profile),
%User{ap_id: ap_id} <- User.get_cached_by_nickname(nick) do %User{ap_id: ap_id} <- User.get_cached_by_nickname(nick) do
@ -69,7 +119,33 @@ def remote_subscribe(conn, %{"user" => %{"nickname" => nick, "profile" => profil
render(conn, "subscribe.html", %{ render(conn, "subscribe.html", %{
nickname: nick, nickname: nick,
avatar: nil, avatar: nil,
error: "Something went wrong." error:
Pleroma.Web.Gettext.dpgettext(
"static_pages",
"remote follow error message - unknown error",
"Something went wrong."
)
})
end
end
def remote_subscribe(conn, %{"status" => %{"status_id" => id, "profile" => profile}}) do
with {:ok, %{"subscribe_address" => template}} <- WebFinger.finger(profile),
%Activity{} = activity <- Activity.get_by_id(id),
{:ok, ap_id} <- get_ap_id(activity) do
conn
|> Phoenix.Controller.redirect(external: String.replace(template, "{uri}", ap_id))
else
_e ->
render(conn, "status_interact.html", %{
status_id: id,
avatar: nil,
error:
Pleroma.Web.Gettext.dpgettext(
"static_pages",
"status interact error message - unknown error",
"Something went wrong."
)
}) })
end end
end end
@ -83,6 +159,15 @@ def remote_interaction(%{body_params: %{ap_id: ap_id, profile: profile}} = conn,
end end
end end
defp get_ap_id(activity) do
object = Pleroma.Object.normalize(activity, fetch: false)
case object do
%{data: %{"id" => ap_id}} -> {:ok, ap_id}
_ -> {:no_ap_id, nil}
end
end
def frontend_configurations(conn, _params) do def frontend_configurations(conn, _params) do
render(conn, "frontend_configurations.json") render(conn, "frontend_configurations.json")
end end

View file

@ -4,7 +4,9 @@
defmodule Pleroma.Web.TwitterAPI.UtilView do defmodule Pleroma.Web.TwitterAPI.UtilView do
use Pleroma.Web, :view use Pleroma.Web, :view
import Phoenix.HTML
import Phoenix.HTML.Form import Phoenix.HTML.Form
import Phoenix.HTML.Link
alias Pleroma.Config alias Pleroma.Config
alias Pleroma.Web.Endpoint alias Pleroma.Web.Endpoint
alias Pleroma.Web.Gettext alias Pleroma.Web.Gettext

View file

@ -26,6 +26,23 @@ def render("update.json", %Activity{} = activity, %User{} = user, topic) do
|> Jason.encode!() |> Jason.encode!()
end end
def render("status_update.json", %Activity{} = activity, %User{} = user, topic) do
activity = Activity.get_create_by_object_ap_id_with_object(activity.object.data["id"])
%{
stream: [topic],
event: "status.update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"show.json",
activity: activity,
for: user
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def render("notification.json", %Notification{} = notify, %User{} = user, topic) do def render("notification.json", %Notification{} = notify, %User{} = user, topic) do
%{ %{
stream: [topic], stream: [topic],
@ -54,6 +71,22 @@ def render("update.json", %Activity{} = activity, topic) do
|> Jason.encode!() |> Jason.encode!()
end end
def render("status_update.json", %Activity{} = activity, topic) do
activity = Activity.get_create_by_object_ap_id_with_object(activity.object.data["id"])
%{
stream: [topic],
event: "status.update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"show.json",
activity: activity
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def render("follow_relationships_update.json", item, topic) do def render("follow_relationships_update.json", item, topic) do
%{ %{
stream: [topic], stream: [topic],

View file

@ -69,8 +69,6 @@ defp gather_aliases(%User{} = user) do
end end
def represent_user(user, "JSON") do def represent_user(user, "JSON") do
{:ok, user} = User.ensure_keys_present(user)
%{ %{
"subject" => "acct:#{user.nickname}@#{domain()}", "subject" => "acct:#{user.nickname}@#{domain()}",
"aliases" => gather_aliases(user), "aliases" => gather_aliases(user),
@ -79,8 +77,6 @@ def represent_user(user, "JSON") do
end end
def represent_user(user, "XML") do def represent_user(user, "XML") do
{:ok, user} = User.ensure_keys_present(user)
aliases = aliases =
user user
|> gather_aliases() |> gather_aliases()

View file

@ -4,7 +4,7 @@ defmodule Pleroma.Mixfile do
def project do def project do
[ [
app: :pleroma, app: :pleroma,
version: version("3.1.0"), version: version("3.2.0"),
elixir: "~> 1.12", elixir: "~> 1.12",
elixirc_paths: elixirc_paths(Mix.env()), elixirc_paths: elixirc_paths(Mix.env()),
compilers: [:phoenix, :gettext] ++ Mix.compilers(), compilers: [:phoenix, :gettext] ++ Mix.compilers(),

View file

@@ -0,0 +1,51 @@
defmodule Pleroma.Repo.Migrations.AddUpdateToNotificationsEnum do
  use Ecto.Migration

  @disable_ddl_transaction true

  def up do
    """
    alter type notification_type add value 'update'
    """
    |> execute()
  end

  # 20210717000000_add_poll_to_notifications_enum.exs
  def down do
    alter table(:notifications) do
      modify(:type, :string)
    end

    """
    delete from notifications where type = 'update'
    """
    |> execute()

    """
    drop type if exists notification_type
    """
    |> execute()

    """
    create type notification_type as enum (
      'follow',
      'follow_request',
      'mention',
      'move',
      'pleroma:emoji_reaction',
      'pleroma:chat_mention',
      'reblog',
      'favourite',
      'pleroma:report',
      'poll'
    )
    """
    |> execute()

    """
    alter table notifications
    alter column type type notification_type using (type::notification_type)
    """
    |> execute()
  end
end

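Once this has run, the enum should carry the new value. A minimal check from `iex -S mix`, assuming the repo module is Pleroma.Repo as elsewhere in this diff:

    # List every value of the notification_type enum; 'update' should now be present.
    %{rows: rows} = Pleroma.Repo.query!("select unnest(enum_range(null::notification_type))")
    List.flatten(rows)
    # => ["follow", "follow_request", "mention", ..., "poll", "update"]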

@@ -1,7 +1,10 @@
defmodule Pleroma.Repo.Migrations.UpgradeObanToV11 do
  use Ecto.Migration

  def up, do: Oban.Migrations.up(version: 11)

  def up do
    execute("UPDATE oban_jobs SET priority = 0 WHERE priority IS NULL;")
    Oban.Migrations.up(version: 11)
  end

  def down, do: Oban.Migrations.down(version: 11)
end

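The added UPDATE backfills any NULL priorities before Oban's own v11 migration runs, presumably because v11 tightens the constraints on that column. A small after-the-fact check, again a sketch assuming Pleroma.Repo:

    # No job should be left with a NULL priority after the backfill.
    %{rows: [[count]]} = Pleroma.Repo.query!("select count(*) from oban_jobs where priority is null")
    count
    # => 0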

@@ -0,0 +1,22 @@
defmodule Pleroma.Repo.Migrations.RemoveLocalCancelledFollows do
  use Ecto.Migration

  def up do
    statement = """
    DELETE FROM
      activities
    WHERE
      (data->>'type') = 'Follow'
    AND
      (data->>'state') = 'cancelled'
    AND
      local = true;
    """

    execute(statement)
  end

  def down do
    :ok
  end
end

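To gauge how many rows this would remove before deploying, the same predicate can be run as a read-only count from `iex -S mix`; a sketch using Pleroma.Repo:

    # Local Follow activities stuck in the 'cancelled' state, i.e. the rows the
    # migration's DELETE targets.
    %{rows: [[count]]} =
      Pleroma.Repo.query!("""
      SELECT count(*) FROM activities
      WHERE (data->>'type') = 'Follow'
        AND (data->>'state') = 'cancelled'
        AND local = true
      """)

    count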

@@ -0,0 +1,28 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only

defmodule Pleroma.Repo.Migrations.GenerateUnsetUserKeys do
  use Ecto.Migration
  import Ecto.Query
  alias Pleroma.Keys
  alias Pleroma.Repo
  alias Pleroma.User

  def change do
    query =
      from(u in User,
        where: u.local == true,
        where: is_nil(u.keys),
        select: u
      )

    Repo.stream(query)
    |> Enum.each(fn user ->
      with {:ok, pem} <- Keys.generate_rsa_pem() do
        Ecto.Changeset.cast(user, %{keys: pem}, [:keys])
        |> Repo.update()
      end
    end)
  end
end

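After this migration every local user should have a keypair, which is presumably what lets the webfinger change above drop its ensure_keys_present calls. A quick follow-up check, sketched with the same modules the migration aliases:

    import Ecto.Query

    # Local users still missing keys; expected to be 0 once the migration has run.
    from(u in Pleroma.User, where: u.local == true, where: is_nil(u.keys))
    |> Pleroma.Repo.aggregate(:count, :id)
    # => 0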

@@ -0,0 +1,15 @@
defmodule Pleroma.Repo.Migrations.EnsureMastofeSettings do
  use Ecto.Migration

  def up do
    alter table(:users) do
      add_if_not_exists(:mastofe_settings, :map)
    end
  end

  def down do
    alter table(:users) do
      remove_if_exists(:mastofe_settings, :map)
    end
  end
end


@@ -6,7 +6,18 @@
  <body>
    <h3>Welcome to Akkoma!</h3>
    <p>If you're seeing this page, your server works!</p>
    <p>In order to get a frontend to show here, you'll need to set up <code>:pleroma, :frontends, primary</code> and install your frontend of choice</p>
    <p>In order to get a frontend to show here, you'll need to set up <code>:pleroma, :frontends, primary</code> and install your frontend of choice, in most cases this will just be:</p>
    <a href="https://docs.akkoma.dev/stable/configuration/cheatsheet/#frontend-management">Documentation</a>
    <pre>
    <code lang="bash">
      # OTP
      ./bin/pleroma_ctl frontend install pleroma-fe --ref stable
      # Source
      mix pleroma.frontend install pleroma-fe --ref stable
      ## you can do the same thing for admin-fe if you so wish
    </code>
    </pre>
    <p><a href="https://docs.akkoma.dev/stable/administration/CLI_tasks/frontend/">Installation Command Documentation</a></p>
    <p><a href="https://docs.akkoma.dev/stable/configuration/cheatsheet/#frontend-management">Config Documentation</a></p>
  </body>
</html>

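The `:pleroma, :frontends, primary` setting the page mentions ends up looking roughly like the sketch below in the instance config; the name and ref values are examples, and the exact config file location depends on the install type:

    # Example values only; adjust to the frontend actually installed.
    config :pleroma, :frontends,
      primary: %{"name" => "pleroma-fe", "ref" => "stable"}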

@@ -51,6 +51,7 @@ .container {
  overflow: hidden;
  margin: 35px auto;
  box-shadow: 0 1px 4px 0px rgba(0, 0, 0, 0.5);
  padding: 0em 1em 0em 1em;
}

.container__content {
@@ -86,7 +87,6 @@ .input {
}

input {
  box-sizing: content-box;
  padding: 10px;
  margin-top: 5px;
  margin-bottom: 10px;
@@ -97,6 +97,8 @@ input {
  transition-duration: 0.35s;
  border-bottom: 2px solid #2a384a;
  font-size: 14px;
  width: inherit;
  box-sizing: border-box;
}

.scopes-input {


@@ -39,7 +39,9 @@
      "alsoKnownAs": {
        "@id": "as:alsoKnownAs",
        "@type": "@id"
      }
      },
      "vcard": "http://www.w3.org/2006/vcard/ns#",
      "formerRepresentations": "litepub:formerRepresentations"
    }
  ]
}

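Purely illustrative: a trimmed object showing where the newly declared "formerRepresentations" term could surface on an edited Note. The shape and values are assumptions, written as an Elixir map to match the rest of this diff:

    # Hypothetical edited Note carrying its previous version(s).
    %{
      "type" => "Note",
      "content" => "edited text",
      "formerRepresentations" => %{
        "type" => "OrderedCollection",
        "totalItems" => 1,
        "orderedItems" => [%{"type" => "Note", "content" => "original text"}]
      }
    }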
test/fixtures/rsa_keys/key_1.pem (new file)

@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEowIBAAKCAQEA2gdPJM5bWarGZ6QujfQ296l1yEQohS5fdtnxYQc+RXuS1gqZ
R/jVGHG25o4tmwyCLClyREU1CBTOCQBsg+BSehXlxNR9fiB4KaVQW9MMNa2vhHuG
f7HLdILiC+SPPTV1Bi8LCpxJowiSpnFPP4BDDeRKib7nOxll9Ln9gEpUueKKabsQ
EQKCmEJYhIz/8g5R0Qz+6VjASdejDjTEdZbr/rwyldRRjIklyeZ3lBzB/c8/51wn
HT2Dt0r9NiapxYC3oNhbE2A+4FU9pZTqS8yc3KqWZAy74snaRO9QQSednKlOJpXP
V3vwWo5CxuSNLttV7zRcrqeYOkIVNF4dQ/bHzQIDAQABAoIBADTCfglnEj4BkF92
IHnjdgW6cTEUJUYNMba+CKY1LYF85Mx85hi/gzmWEu95yllxznJHWUpiAPJCrpUJ
EDldaDf44pAd53xE+S8CvQ5rZNH8hLOnfKWb7aL1JSRBm9PxAq+LZL2dkkgsg+hZ
FRdFv3Q2IT9x/dyUSdLNyyVnV1dfoya/7zOFc7+TwqlofznzrlBgNoAe8Lb4AN/q
itormPxskqATiq11XtP4F6eQ556eRgHCBxmktx/rRDl6f9G9dvjRQOA2qZlHQdFq
kjOZsrvItL46LdVoLPOdCYG+3HFeKoDUR1NNXEkt66eqmEhLY4MgzGUT1wqXWk7N
XowZc9UCgYEA+L5h4PhANiY5Kd+PkRI8zTlJMv8hFqLK17Q0p9eL+mAyOgXjH9so
QutJf4wU+h6ESDxH+1tCjCN307uUqT7YnT2zHf3b6GcmA+t6ewxfxOY2nJ82HENq
hK1aodnPTvRRRqCGfrx9qUHRTarTzi+2u86zH+KoMHSiuzn4VpQhg4MCgYEA4GOL
1tLR9+hyfYuMFo2CtQjp3KpJeGNKEqc33vFD05xJQX+m5THamBv8vzdVlVrMh/7j
iV85mlA7HaaP+r5DGwtonw9bqY76lYRgJJprsS5lHcRnXsDmU4Ne8RdB3dHNsT5P
n4P6v8y4jaT638iJ/qLt4e8itOBlZwS//VIglm8CgYEA7KXD3RKRlHK9A7drkOs2
6VBM8bWEN1LdhGYvilcpFyUZ49XiBVatcS0EGdKdym/qDgc7vElQgJ7ly4y0nGfs
EXy3whrYcrxfkG8hcZuOKXeUEWHvSuhgmKWMilr8PfN2t6jVDBIrwzGY/Tk+lPUT
9o1qITW0KZVtlI5MU6JOWB0CgYAHwwnETZibxbuoIhqfcRezYXKNgop2EqEuUgB5
wsjA2igijuLcDMRt/JHan3RjbTekAKooR1X7w4i39toGJ2y008kzr1lRXTPH1kNp
ILpW767pv7B/s5aEDwhKuK47mRVPa0Nf1jXnSpKbu7g943b6ivJFnXsK3LRFQwHN
JnkgGwKBgGUleQVd2GPr1dkqLVOF/s2aNB/+h2b1WFWwq0YTnW81OLwAcUVE4p58
3GQgz8PCsWbNdTb9yFY5fq0fXgi0+T54FEoZWH09DrOepA433llAwI6sq7egrFdr
kKQttZMzs6ST9q/IOF4wgqSnBjjTC06vKSkNAlXJz+LMvIRMeBr0
-----END RSA PRIVATE KEY-----

test/fixtures/rsa_keys/key_2.pem (new file)

@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEpQIBAAKCAQEAwu0VqVGRVDW09V3zZ0+08K9HMKivIzIInO0xim3jbfVcg8r1
sR7vNLorYAB6TDDlXYAWKx1OxUMZusbOigrpQd+5wy8VdCogDD7qk4bbZ+NjXkuD
ETzrQsGWUXe+IdeH8L0Zh0bGjbarCuA0qAeY1TEteGl+Qwo2dsrBUH7yKmWO6Mz9
XfPshrIDOGo4QNyVfEBNGq2K9eRrQUHeAPcM2/qu4ZAZRK+VCifDZrF8ZNpoAsnS
R2mJDhOBUMvI/ZaxOc2ry4EzwcS4uBaM2wONkGWDaqO6jNAQflaX7vtzOAeJB7Dt
VKXUUcZAGN7uI3c2mG5IKGMhTYUtUdrzmqmtZwIDAQABAoIBAQCHBJfTf3dt4AGn
T9twfSp06MQj9UPS2i5THI0LONCm8qSReX0zoZzJZgbzaYFM0zWczUMNvDA6vR7O
XDTmM2acxW4zv6JZo3Ata0sqwuepDz1eLGnt/8dppxQK/ClL4bH8088h/6k6sgPJ
9cEjfpejXHwFgvT9VM6i/BBpRHVTXWuJqwpDtg+bleQNN3L3RapluDd7BGiKoCwQ
cCTKd+lxTu9gVJkbRTI/Jn3kV+rnedYxHTxVp5cU1qIabsJWBcdDz25mRHupxQsn
JbQR4+ZnRLeAsC6WJZtEJz2KjXgBaYroHbGZY3KcGW95ILqiCJoJJugbW1eABKnN
Q5k8XVspAoGBAPzGJBZuX3c0quorhMIpREmGq2vS6VCQwLhH5qayYYH1LiPDfpdq
69lOROxZodzLxBgTf5z/a5kBF+eNKvOqfZJeRTxmllxxO1MuJQuRLi/b7BHHLuyN
Eea+YwtehA0T0CbD2hydefARNDruor2BLvt/kt6qEoIFiPauTsMfXP39AoGBAMVp
8argtnB+vsk5Z7rpQ4b9gF5QxfNbA0Hpg5wUUdYrUjFr50KWt1iowj6AOVp/EYgr
xRfvOQdYODDH7R5cjgMbwvtpHo39Zwq7ewaiT1sJXnpGmCDVh+pdTHePC5OOXnxN
0USK3M4KjltjVqJo7xPPElgJvCejudD47mtHMaQzAoGBAIFQ/PVc0goyL55NVUXf
xse21cv7wtEsvOuKHT361FegD1LMmN7uHGq32BryYBSNSmzmzMqNAYbtQEV9uxOd
jVBsWg9kjFgOtcMAQIOCapahdExEEoWCRj49+H3AhN4L3Nl4KQWqqs9efdIIc8lv
ZZHU2lZ/u6g5HLDWzASW7wQhAoGAdERPRrqN+HdNWinrA9Q6JxjKL8IWs5rYsksb
biMxh5eAEwdf7oHhfd/2duUB4mCQLMjKjawgxEia33AAIS+VnBMPpQ5mJm4l79Y3
QNL7Nbyw3gcRtdTM9aT5Ujj3MnJZB5C1PU8jeF4TNZOuBH0UwW/ld+BT5myxFXhm
wtvtSq0CgYEA19b0/7il4Em6uiLOmYUuqaUoFhUPqzjaS6OM/lRAw12coWv/8/1P
cwaNZHNMW9Me/bNH3zcOTz0lxnYp2BeRehjFYVPRuS1GU7uwqKtlL2wCPptTfAhN
aJWIplzUCTg786u+sdNZ0umWRuCLoUpsKTgP/yt4RglzEcfxAuBDljk=
-----END RSA PRIVATE KEY-----

test/fixtures/rsa_keys/key_3.pem (new file)

@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEpQIBAAKCAQEA0GvzqZ3r78GLa7guGn+palKRLGru4D4jnriHgfrUAJrdLyZ5
9d0zAA4qnS2L6YAMoPPBhBUtIV5e2sn1+rwTClWU3dm3FyBAeqdeIBKN+04AyrUc
HXYaZtOPJXCTeytzoSQE359Tq6+xwgoHlUWSWxQF51/z/PDQcUvqFjJqAtdiDchd
3CiFRtdjegyxXGnqvPmBix+vEjDytcVydfch+R1Twf6f5EL7a1jFVWNGcratYBEl
nqOWKI2fBu/WA8QlrcVW5zmtZo9aJ6IrFddQgQTxPk/mEHgCzv8tbCRI9TxiXeYH
YqxZFYBW40xbZQwGRjaYHJlIRYp9+TOynW9OZQIDAQABAoIBAQC97cIMDbdVsyAk
N6D70N5H35ofygqJGtdG6o3B6xuKuZVaREvbu4mgQUigF0Nqs5/OhJMSlGGeCOuT
oXug1Abd4gNY7++jCWb43tAtlfsAyaJ7FvPZ/SguEBhgW+hp07z5WWN/jSeoSuFI
G++xHcczbFm88XncRG8O78kQFTz5/DlQYkFXfbqpuS3BqxnrACpDCUfrUwZNYFIp
CUNq21jdifhHwlS0K3PX8A5HdOYeVnVHaE78LGE4oJVHwcokELv+PYqarWZq/a6L
vKU3yn2+4pj2WO490iGQaRKVM35vrtjdVxiWEIUiFc3Jg5fKZA3wuHXoF1N1DpPO
BO6Att55AoGBAP/nC2szmDcnU5Sh8LDeQbL+FpSBwOmFnmel5uqbjKnDzf9emPQu
NFUls1N9OGgyUq08TnmcY/7wLZzcu7Y9XOUURuYtx9nGRs4RmE2VEBhK1r7CkDIx
oOb+NtdqnPtQASAxCHszoGCFxpuV7UVoo2SRgc+M4ceX128arvBUtvdrAoGBANCA
RuO3eelkXaJoCeogEUVWXZ6QmPeYzbMD4vg2DM0ynUbReyuEIIhn+SR7tehlj5ie
4T3ixVdur6k+YUdiFhUYgXaHBJWHoHl1lrU3ZON8n7AeEk9ft6gg4L07ouj78UMZ
sArJIlU5mLnW02zbV9XryU39dIgpQREqC0bIOtVvAoGBAORv1JKq6Rt7ALJy6VCJ
5y4ogfGp7pLHk8NEpuERYDz/rLllMbbwNAk6cV17L8pb+c/pQMhwohcnQiCALxUc
q/tW4X+CqJ+vzu8PZ90Bzu9Qh2iceGpGQTNTBZPA+UeigI7DFqYcTPM9GDE1YiyO
nyUcezvSsI4i7s6gjD+/7+DnAoGABm3+QaV1z/m1XX3B2IN2pOG971bcML54kW2s
QSVBjc5ixT1OhBAGBM7YAwUBnhILtJQptAPbPBAAwMJYs5/VuH7R9zrArG/LRhOX
Oy1jIhTEw+SZgfMcscWZyJwfMPob/Yq8QAjl0yT8jbaPPIsjEUi9I3eOcWh8RjA6
ussP7WcCgYEAm3yvJR9z6QGoQQwtDbwjyZPYOSgK9wFS/65aupi6cm/Qk2N1YaLY
q2amNrzNsIc9vQwYGEHUwogn4MieHk96V7m2f0Hx9EHCMwizU9EiS6oyiLVowTG6
YsBgSzcpnt0Vkgil4CQks5uQoan0tubEUQ5DI79lLnb02n4o46iAYK0=
-----END RSA PRIVATE KEY-----

Some files were not shown because too many files have changed in this diff.