diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml index 29eb8d6b9..e65cae9d8 100644 --- a/.gitlab-ci.yml +++ b/.gitlab-ci.yml @@ -61,7 +61,7 @@ unit-testing: alias: postgres command: ["postgres", "-c", "fsync=off", "-c", "synchronous_commit=off", "-c", "full_page_writes=off"] script: - - apt-get update && apt-get install -y libimage-exiftool-perl + - apt-get update && apt-get install -y libimage-exiftool-perl ffmpeg - mix deps.get - mix ecto.create - mix ecto.migrate @@ -95,7 +95,7 @@ unit-testing-rum: <<: *global_variables RUM_ENABLED: "true" script: - - apt-get update && apt-get install -y libimage-exiftool-perl + - apt-get update && apt-get install -y libimage-exiftool-perl ffmpeg - mix deps.get - mix ecto.create - mix ecto.migrate diff --git a/CHANGELOG.md b/CHANGELOG.md index a5c75bd4f..36a84b1a8 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -5,20 +5,77 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/). ## Unreleased +### Added +- Mix tasks for controlling user account confirmation status in bulk (`mix pleroma.user confirm_all` and `mix pleroma.user unconfirm_all`) +- Mix task for sending confirmation emails to all unconfirmed users (`mix pleroma.email send_confirmation_mails`) +- Mix task option for force-unfollowing relays + ### Changed - **Breaking** Requires `libmagic` (or `file`) to guess file types. +- **Breaking:** Pleroma Admin API: emoji packs and files routes changed. +- **Breaking:** Sensitive/NSFW statuses no longer disable link previews. +- Search: Users are now findable by their urls. - Renamed `:await_up_timeout` in `:connections_pool` namespace to `:connect_timeout`, old name is deprecated. - Renamed `:timeout` in `pools` namespace to `:recv_timeout`, old name is deprecated. +- The `discoverable` field in the `User` struct will now add a NOINDEX metatag to profile pages when false. +- Users with the `discoverable` field set to false will not show up in searches. +- Minimum lifetime for ephmeral activities changed to 10 minutes and made configurable (`:min_lifetime` option). +- Introduced optional dependencies on `ffmpeg`, `ImageMagick`, `exiftool` software packages. Please refer to `docs/installation/optional/media_graphics_packages.md`. + +### Added +- Media preview proxy (requires `ffmpeg` and `ImageMagick` to be installed and media proxy to be enabled; see `:media_preview_proxy` config for more details). +- Pleroma API: Importing the mutes users from CSV files. +- Experimental websocket-based federation between Pleroma instances. + +
+<details>
+  <summary>API Changes</summary>
+
+- Pleroma API: Importing the muted users from CSV files.
+- Admin API: Importing emoji from a zip file.
+- Pleroma API: Pagination for remote/local packs and emoji.
+</details>
+
### Removed - **Breaking:** `Pleroma.Workers.Cron.StatsWorker` setting from Oban `:crontab` (moved to a simpler implementation). - **Breaking:** `Pleroma.Workers.Cron.ClearOauthTokenWorker` setting from Oban `:crontab` (moved to scheduled jobs). - **Breaking:** `Pleroma.Workers.Cron.PurgeExpiredActivitiesWorker` setting from Oban `:crontab` (moved to scheduled jobs). +- Removed `:managed_config` option. In practice, it was accidentally removed with 2.0.0 release when frontends were +switched to a new configuration mechanism, however it was not officially removed until now. + +### Fixed + +- Add documented-but-missing chat pagination. +- Allow sending out emails again. + +## Unreleased (Patch) ### Changed -- Minimum lifetime for ephmeral activities changed to 10 minutes and made configurable (`:min_lifetime` option). +- API: Empty parameter values for integer parameters are now ignored in non-strict validaton mode. + +## [2.1.2] - 2020-09-17 + +### Security + +- Fix most MRF rules either crashing or not being applied to objects passed into the Common Pipeline (ChatMessage, Question, Answer, Audio, Event). + +### Fixed + +- Welcome Chat messages preventing user registration with MRF Simple Policy applied to the local instance. +- Mastodon API: the public timeline returning an error when the `reply_visibility` parameter is set to `self` for an unauthenticated user. +- Mastodon Streaming API: Handler crashes on authentication failures, resulting in error logs. +- Mastodon Streaming API: Error logs on client pings. +- Rich media: Log spam on failures. Now the error is only logged once per attempt. + +### Changed + +- Rich Media: A HEAD request is now done to the url, to ensure it has the appropriate content type and size before proceeding with a GET. + +### Upgrade notes + +1. Restart Pleroma ## [2.1.1] - 2020-09-08 @@ -35,6 +92,12 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/). ### Added - Rich media failure tracking (along with `:failure_backoff` option). +
+<details>
+  <summary>Admin API Changes</summary>
+
+- Add `PATCH /api/pleroma/admin/instance_document/:document_name` to modify the Terms of Service and Instance Panel HTML pages via Admin API
+</details>
+ ### Fixed - Default HTTP adapter not respecting pool setting, leading to possible OOM. - Fixed uploading webp images when the Exiftool Upload Filter is enabled by skipping them diff --git a/README.md b/README.md index 6ca3118fb..7a05b9e48 100644 --- a/README.md +++ b/README.md @@ -18,15 +18,16 @@ If you are running Linux (glibc or musl) on x86/arm, the recommended way to inst ### From Source If your platform is not supported, or you just want to be able to edit the source code easily, you may install Pleroma from source. -- [Debian-based](https://docs-develop.pleroma.social/backend/installation/debian_based_en/) -- [Debian-based (jp)](https://docs-develop.pleroma.social/backend/installation/debian_based_jp/) - [Alpine Linux](https://docs-develop.pleroma.social/backend/installation/alpine_linux_en/) - [Arch Linux](https://docs-develop.pleroma.social/backend/installation/arch_linux_en/) +- [CentOS 7](https://docs-develop.pleroma.social/backend/installation/centos7_en/) +- [Debian-based](https://docs-develop.pleroma.social/backend/installation/debian_based_en/) +- [Debian-based (jp)](https://docs-develop.pleroma.social/backend/installation/debian_based_jp/) +- [FreeBSD](https://docs-develop.pleroma.social/backend/installation/freebsd_en/) - [Gentoo Linux](https://docs-develop.pleroma.social/backend/installation/gentoo_en/) - [NetBSD](https://docs-develop.pleroma.social/backend/installation/netbsd_en/) - [OpenBSD](https://docs-develop.pleroma.social/backend/installation/openbsd_en/) - [OpenBSD (fi)](https://docs-develop.pleroma.social/backend/installation/openbsd_fi/) -- [CentOS 7](https://docs-develop.pleroma.social/backend/installation/centos7_en/) ### OS/Distro packages Currently Pleroma is not packaged by any OS/Distros, but if you want to package it for one, we can guide you through the process on our [community channels](#community-channels). If you want to change default options in your Pleroma package, please **discuss it with us first**. 
diff --git a/config/benchmark.exs b/config/benchmark.exs index e867253eb..5567ff26e 100644 --- a/config/benchmark.exs +++ b/config/benchmark.exs @@ -59,8 +59,6 @@ "BLH1qVhJItRGCfxgTtONfsOKDc9VRAraXw-3NsmjMngWSh7NxOizN6bkuRA7iLTMPS82PjwJAr3UoK9EC1IFrz4", private_key: "_-XZ0iebPrRfZ_o0-IatTdszYa8VCH1yLN-JauK7HHA" -config :web_push_encryption, :http_client, Pleroma.Web.WebPushHttpClientMock - config :pleroma, Pleroma.ScheduledActivity, daily_user_limit: 2, total_user_limit: 3, diff --git a/config/config.exs b/config/config.exs index 46a649b73..d96db7416 100644 --- a/config/config.exs +++ b/config/config.exs @@ -130,6 +130,7 @@ dispatch: [ {:_, [ + {"/api/fedsocket/v1", Pleroma.Web.FedSockets.IncomingHandler, []}, {"/api/v1/streaming", Pleroma.Web.MastodonAPI.WebsocketHandler, []}, {"/websocket", Phoenix.Endpoint.CowboyWebSocket, {Phoenix.Transports.WebSocket, @@ -148,6 +149,16 @@ "SameSite=Lax" ] +config :pleroma, :fed_sockets, + enabled: false, + connection_duration: :timer.hours(8), + rejection_duration: :timer.minutes(15), + fed_socket_fetches: [ + default: 12_000, + interval: 3_000, + lazy: false + ] + # Configures Elixir's Logger config :logger, :console, level: :debug, @@ -216,7 +227,6 @@ allow_relay: true, public: true, quarantined_instances: [], - managed_config: true, static_dir: "instance/static/", allowed_post_formats: [ "text/plain", @@ -424,6 +434,8 @@ proxy_opts: [ redirect_on_failure: false, max_body_length: 25 * 1_048_576, + # Note: max_read_duration defaults to Pleroma.ReverseProxy.max_read_duration_default/1 + max_read_duration: 30_000, http: [ follow_redirect: true, pool: :media @@ -438,6 +450,14 @@ config :pleroma, Pleroma.Web.MediaProxy.Invalidation.Script, script_path: nil +# Note: media preview proxy depends on media proxy to be enabled +config :pleroma, :media_preview_proxy, + enabled: false, + thumbnail_max_width: 600, + thumbnail_max_height: 600, + image_quality: 85, + min_content_length: 100 * 1024 + config :pleroma, :chat, enabled: true config :phoenix, :format_encoders, json: Jason @@ -533,6 +553,7 @@ token_expiration: 5, federator_incoming: 50, federator_outgoing: 50, + ingestion_queue: 50, web_push: 50, mailer: 10, transmogrifier: 20, @@ -656,7 +677,18 @@ config :pleroma, Pleroma.Workers.PurgeExpiredActivity, enabled: true, min_lifetime: 600 -config :pleroma, Pleroma.Plugs.RemoteIp, enabled: true +config :pleroma, Pleroma.Plugs.RemoteIp, + enabled: true, + headers: ["x-forwarded-for"], + proxies: [], + reserved: [ + "127.0.0.0/8", + "::1/128", + "fc00::/7", + "10.0.0.0/8", + "172.16.0.0/12", + "192.168.0.0/16" + ] config :pleroma, :static_fe, enabled: false @@ -742,8 +774,8 @@ ], media: [ size: 50, - max_waiting: 10, - recv_timeout: 10_000 + max_waiting: 20, + recv_timeout: 15_000 ], upload: [ size: 25, @@ -790,6 +822,8 @@ config :ex_aws, http_client: Pleroma.HTTP.ExAws +config :web_push_encryption, http_client: Pleroma.HTTP.WebPush + config :pleroma, :instances_favicons, enabled: false config :floki, :html_parser, Floki.HTMLParser.FastHtml diff --git a/config/description.exs b/config/description.exs index d05adf88b..f068a35de 100644 --- a/config/description.exs +++ b/config/description.exs @@ -44,11 +44,13 @@ }, %{ key: "git", + label: "Git Repository URL", type: :string, description: "URL of the git repository of the frontend" }, %{ key: "build_url", + label: "Build URL", type: :string, description: "Either an url to a zip file containing the frontend or a template to build it by inserting the `ref`. 
The string `${ref}` will be replaced by the configured `ref`.", @@ -56,6 +58,7 @@ }, %{ key: "build_dir", + label: "Build directory", type: :string, description: "The directory inside the zip file " } @@ -270,6 +273,19 @@ } ] }, + %{ + group: :pleroma, + key: :fed_sockets, + type: :group, + description: "Websocket based federation", + children: [ + %{ + key: :enabled, + type: :boolean, + description: "Enable FedSockets" + } + ] + }, %{ group: :pleroma, key: Pleroma.Emails.Mailer, @@ -764,12 +780,6 @@ "*.quarantined.com" ] }, - %{ - key: :managed_config, - type: :boolean, - description: - "Whenether the config for pleroma-fe is configured in this config or in static/config.json" - }, %{ key: :static_dir, type: :string, @@ -1880,6 +1890,7 @@ suggestions: [ redirect_on_failure: false, max_body_length: 25 * 1_048_576, + max_read_duration: 30_000, http: [ follow_redirect: true, pool: :media @@ -1900,6 +1911,11 @@ "Limits the content length to be approximately the " <> "specified length. It is validated with the `content-length` header and also verified when proxying." }, + %{ + key: :max_read_duration, + type: :integer, + description: "Timeout (in milliseconds) of GET request to remote URI." + }, %{ key: :http, label: "HTTP", @@ -1946,6 +1962,43 @@ } ] }, + %{ + group: :pleroma, + key: :media_preview_proxy, + type: :group, + description: "Media preview proxy", + children: [ + %{ + key: :enabled, + type: :boolean, + description: + "Enables proxying of remote media preview to the instance's proxy. Requires enabled media proxy." + }, + %{ + key: :thumbnail_max_width, + type: :integer, + description: + "Max width of preview thumbnail for images (video preview always has original dimensions)." + }, + %{ + key: :thumbnail_max_height, + type: :integer, + description: + "Max height of preview thumbnail for images (video preview always has original dimensions)." + }, + %{ + key: :image_quality, + type: :integer, + description: "Quality of the output. Ranges from 0 (min quality) to 100 (max quality)." + }, + %{ + key: :min_content_length, + type: :integer, + description: + "Min content length to perform preview, in bytes. If greater than 0, media smaller in size will be served as is, without thumbnailing." + } + ] + }, %{ group: :pleroma, key: Pleroma.Web.MediaProxy.Invalidation.Http, @@ -2395,7 +2448,7 @@ %{ group: :pleroma, key: Pleroma.Formatter, - label: "Auto Linker", + label: "Linkify", type: :group, description: "Configuration for Pleroma's link formatter which parses mentions, hashtags, and URLs.", @@ -3212,20 +3265,22 @@ %{ key: :headers, type: {:list, :string}, - description: - "A list of strings naming the `req_headers` to use when deriving the `remote_ip`. Order does not matter. Default: `~w[forwarded x-forwarded-for x-client-ip x-real-ip]`." + description: """ + A list of strings naming the HTTP headers to use when deriving the true client IP. Default: `["x-forwarded-for"]`. + """ }, %{ key: :proxies, type: {:list, :string}, description: - "A list of strings in [CIDR](https://en.wikipedia.org/wiki/CIDR) notation specifying the IPs of known proxies. Default: `[]`." + "A list of upstream proxy IP subnets in CIDR notation from which we will parse the content of `headers`. Defaults to `[]`. IPv4 entries without a bitmask will be assumed to be /32 and IPv6 /128." }, %{ key: :reserved, type: {:list, :string}, - description: - "Defaults to [localhost](https://en.wikipedia.org/wiki/Localhost) and [private network](https://en.wikipedia.org/wiki/Private_network)." 
+ description: """ + A list of reserved IP subnets in CIDR notation which should be ignored if found in `headers`. Defaults to `["127.0.0.0/8", "::1/128", "fc00::/7", "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]` + """ } ] }, @@ -3631,9 +3686,7 @@ type: :map, description: "A map containing available frontends and parameters for their installation.", - children: [ - frontend_options - ] + children: frontend_options } ] }, diff --git a/config/test.exs b/config/test.exs index 0ee6f1b7f..95f860f2f 100644 --- a/config/test.exs +++ b/config/test.exs @@ -19,6 +19,11 @@ level: :warn, format: "\n[$level] $message\n" +config :pleroma, :fed_sockets, + enabled: false, + connection_duration: 5, + rejection_duration: 5 + config :pleroma, :auth, oauth_consumer_strategies: [] config :pleroma, Pleroma.Upload, @@ -78,8 +83,6 @@ "BLH1qVhJItRGCfxgTtONfsOKDc9VRAraXw-3NsmjMngWSh7NxOizN6bkuRA7iLTMPS82PjwJAr3UoK9EC1IFrz4", private_key: "_-XZ0iebPrRfZ_o0-IatTdszYa8VCH1yLN-JauK7HHA" -config :web_push_encryption, :http_client, Pleroma.Web.WebPushHttpClientMock - config :pleroma, Oban, queues: false, crontab: false, diff --git a/docs/API/admin_api.md b/docs/API/admin_api.md index c0ea074f0..7bf13daef 100644 --- a/docs/API/admin_api.md +++ b/docs/API/admin_api.md @@ -349,9 +349,9 @@ Response: ### Unfollow a Relay -Params: - -* `relay_url` +- Params: + - `relay_url` + - *optional* `force`: forcefully unfollow a relay even when the relay is not available. (default is `false`) Response: @@ -1334,3 +1334,166 @@ Loads json generated from `config/descriptions.exs`. { } ``` + +## GET /api/pleroma/admin/users/:nickname/chats + +### List a user's chats + +- Params: None + +- Response: + +```json +[ + { + "sender": { + "id": "someflakeid", + "username": "somenick", + ... + }, + "receiver": { + "id": "someflakeid", + "username": "somenick", + ... + }, + "id" : "1", + "unread" : 2, + "last_message" : {...}, // The last message in that chat + "updated_at": "2020-04-21T15:11:46.000Z" + } +] +``` + +## GET /api/pleroma/admin/chats/:chat_id + +### View a single chat + +- Params: None + +- Response: + +```json +{ + "sender": { + "id": "someflakeid", + "username": "somenick", + ... + }, + "receiver": { + "id": "someflakeid", + "username": "somenick", + ... 
+ }, + "id" : "1", + "unread" : 2, + "last_message" : {...}, // The last message in that chat + "updated_at": "2020-04-21T15:11:46.000Z" +} +``` + +## GET /api/pleroma/admin/chats/:chat_id/messages + +### List the messages in a chat + +- Params: `max_id`, `min_id` + +- Response: + +```json +[ + { + "account_id": "someflakeid", + "chat_id": "1", + "content": "Check this out :firefox:", + "created_at": "2020-04-21T15:11:46.000Z", + "emojis": [ + { + "shortcode": "firefox", + "static_url": "https://dontbulling.me/emoji/Firefox.gif", + "url": "https://dontbulling.me/emoji/Firefox.gif", + "visible_in_picker": false + } + ], + "id": "13", + "unread": true + }, + { + "account_id": "someflakeid", + "chat_id": "1", + "content": "Whats' up?", + "created_at": "2020-04-21T15:06:45.000Z", + "emojis": [], + "id": "12", + "unread": false + } +] +``` + +## DELETE /api/pleroma/admin/chats/:chat_id/messages/:message_id + +### Delete a single message + +- Params: None + +- Response: + +```json +{ + "account_id": "someflakeid", + "chat_id": "1", + "content": "Check this out :firefox:", + "created_at": "2020-04-21T15:11:46.000Z", + "emojis": [ + { + "shortcode": "firefox", + "static_url": "https://dontbulling.me/emoji/Firefox.gif", + "url": "https://dontbulling.me/emoji/Firefox.gif", + "visible_in_picker": false + } + ], + "id": "13", + "unread": false +} +``` + +## `GET /api/pleroma/admin/instance_document/:document_name` + +### Get an instance document + +- Authentication: required + +- Response: + +Returns the content of the document + +```html +

<h1>Instance panel</h1>
+``` + +## `PATCH /api/pleroma/admin/instance_document/:document_name` +- Params: + - `file` (the file to be uploaded, using multipart form data.) + +### Update an instance document + +- Authentication: required + +- Response: + +``` json +{ + "url": "https://example.com/instance/panel.html" +} +``` + +## `DELETE /api/pleroma/admin/instance_document/:document_name` + +### Delete an instance document + +- Response: + +``` json +{ + "url": "https://example.com/instance/panel.html" +} +``` diff --git a/docs/API/pleroma_api.md b/docs/API/pleroma_api.md index 4e97d26c0..3fd141bd2 100644 --- a/docs/API/pleroma_api.md +++ b/docs/API/pleroma_api.md @@ -44,6 +44,22 @@ Request parameters can be passed via [query strings](https://en.wikipedia.org/wi * Response: HTTP 200 on success, 500 on error * Note: Users that can't be followed are silently skipped. +## `/api/pleroma/blocks_import` +### Imports your blocks. +* Method: `POST` +* Authentication: required +* Params: + * `list`: STRING or FILE containing a whitespace-separated list of accounts to block +* Response: HTTP 200 on success, 500 on error + +## `/api/pleroma/mutes_import` +### Imports your mutes. +* Method: `POST` +* Authentication: required +* Params: + * `list`: STRING or FILE containing a whitespace-separated list of accounts to mute +* Response: HTTP 200 on success, 500 on error + ## `/api/pleroma/captcha` ### Get a new captcha * Method: `GET` @@ -362,44 +378,43 @@ The status posting endpoint takes an additional parameter, `in_reply_to_conversa * Params: None * Response: JSON, returns a list of Mastodon Conversation entities that were marked as read (200 - healthy, 503 unhealthy). -## `GET /api/pleroma/emoji/packs/import` -### Imports packs from filesystem +## `GET /api/pleroma/emoji/pack?name=:name` + +### Get pack.json for the pack + * Method `GET` -* Authentication: required -* Params: None -* Response: JSON, returns a list of imported packs. - -## `GET /api/pleroma/emoji/packs/remote` -### Make request to another instance for packs list -* Method `GET` -* Authentication: required +* Authentication: not required * Params: - * `url`: url of the instance to get packs from -* Response: JSON with the pack list, hashmap with pack name and pack contents + * `page`: page number for files (default 1) + * `page_size`: page size for files (default 30) +* Response: JSON, pack json with `files`, `files_count` and `pack` keys with 200 status or 404 if the pack does not exist. 
-## `POST /api/pleroma/emoji/packs/download` -### Download pack from another instance -* Method `POST` -* Authentication: required -* Params: - * `url`: url of the instance to download from - * `name`: pack to download from that instance - * `as`: (*optional*) name how to save pack -* Response: JSON, "ok" with 200 status if the pack was downloaded, or 500 if there were - errors downloading the pack +```json +{ + "files": {...}, + "files_count": 0, // emoji count in pack + "pack": {...} +} +``` + +## `POST /api/pleroma/emoji/pack?name=:name` -## `POST /api/pleroma/emoji/packs/:name` ### Creates an empty pack + * Method `POST` -* Authentication: required -* Params: None +* Authentication: required (admin) +* Params: + * `name`: pack name * Response: JSON, "ok" and 200 status or 409 if the pack with that name already exists -## `PATCH /api/pleroma/emoji/packs/:name` +## `PATCH /api/pleroma/emoji/pack?name=:name` + ### Updates (replaces) pack metadata + * Method `PATCH` -* Authentication: required +* Authentication: required (admin) * Params: + * `name`: pack name * `metadata`: metadata to replace the old one * `license`: Pack license * `homepage`: Pack home page url @@ -410,39 +425,85 @@ The status posting endpoint takes an additional parameter, `in_reply_to_conversa * Response: JSON, updated "metadata" section of the pack and 200 status or 400 if there was a problem with the new metadata (the error is specified in the "error" part of the response JSON) -## `DELETE /api/pleroma/emoji/packs/:name` +## `DELETE /api/pleroma/emoji/pack?name=:name` + ### Delete a custom emoji pack + * Method `DELETE` -* Authentication: required -* Params: None +* Authentication: required (admin) +* Params: + * `name`: pack name * Response: JSON, "ok" and 200 status or 500 if there was an error deleting the pack -## `POST /api/pleroma/emoji/packs/:name/files` -### Add new file to the pack -* Method `POST` -* Authentication: required +## `GET /api/pleroma/emoji/packs/import` + +### Imports packs from filesystem + +* Method `GET` +* Authentication: required (admin) +* Params: None +* Response: JSON, returns a list of imported packs. + +## `GET /api/pleroma/emoji/packs/remote` + +### Make request to another instance for packs list + +* Method `GET` +* Authentication: required (admin) * Params: + * `url`: url of the instance to get packs from + * `page`: page number for packs (default 1) + * `page_size`: page size for packs (default 50) +* Response: JSON with the pack list, hashmap with pack name and pack contents + +## `POST /api/pleroma/emoji/packs/download` + +### Download pack from another instance + +* Method `POST` +* Authentication: required (admin) +* Params: + * `url`: url of the instance to download from + * `name`: pack to download from that instance + * `as`: (*optional*) name how to save pack +* Response: JSON, "ok" with 200 status if the pack was downloaded, or 500 if there were + errors downloading the pack + +## `POST /api/pleroma/emoji/packs/files?name=:name` + +### Add new file to the pack + +* Method `POST` +* Authentication: required (admin) +* Params: + * `name`: pack name * `file`: file needs to be uploaded with the multipart request or link to remote file. * `shortcode`: (*optional*) shortcode for new emoji, must be unique for all emoji. If not sended, shortcode will be taken from original filename. * `filename`: (*optional*) new emoji file name. If not specified will be taken from original filename. 
* Response: JSON, list of files for updated pack (hashmap -> shortcode => filename) with status 200, either error status with error message. -## `PATCH /api/pleroma/emoji/packs/:name/files` +## `PATCH /api/pleroma/emoji/packs/files?name=:name` + ### Update emoji file from pack + * Method `PATCH` -* Authentication: required +* Authentication: required (admin) * Params: + * `name`: pack name * `shortcode`: emoji file shortcode * `new_shortcode`: new emoji file shortcode * `new_filename`: new filename for emoji file * `force`: (*optional*) with true value to overwrite existing emoji with new shortcode * Response: JSON, list with updated files for updated pack (hashmap -> shortcode => filename) with status 200, either error status with error message. -## `DELETE /api/pleroma/emoji/packs/:name/files` +## `DELETE /api/pleroma/emoji/packs/files?name=:name` + ### Delete emoji file from pack + * Method `DELETE` -* Authentication: required +* Authentication: required (admin) * Params: + * `name`: pack name * `shortcode`: emoji file shortcode * Response: JSON, list with updated files for updated pack (hashmap -> shortcode => filename) with status 200, either error status with error message. @@ -467,30 +528,14 @@ The status posting endpoint takes an additional parameter, `in_reply_to_conversa } ``` -## `GET /api/pleroma/emoji/packs/:name` +## `GET /api/pleroma/emoji/packs/archive?name=:name` -### Get pack.json for the pack +### Requests a local pack archive from the instance * Method `GET` * Authentication: not required * Params: - * `page`: page number for files (default 1) - * `page_size`: page size for files (default 30) -* Response: JSON, pack json with `files`, `files_count` and `pack` keys with 200 status or 404 if the pack does not exist. - -```json -{ - "files": {...}, - "files_count": 0, // emoji count in pack - "pack": {...} -} -``` - -## `GET /api/pleroma/emoji/packs/:name/archive` -### Requests a local pack archive from the instance -* Method `GET` -* Authentication: not required -* Params: None + * `name`: pack name * Response: the archive of the pack with a 200 status code, 403 if the pack is not set as shared, 404 if the pack does not exist diff --git a/docs/administration/CLI_tasks/email.md b/docs/administration/CLI_tasks/email.md index 00d2e74f8..d9aa0e71b 100644 --- a/docs/administration/CLI_tasks/email.md +++ b/docs/administration/CLI_tasks/email.md @@ -1,4 +1,4 @@ -# Managing emails +# EMail administration tasks {! 
backend/administration/CLI_tasks/general_cli_task_info.include !} @@ -30,3 +30,17 @@ Example: ```sh mix pleroma.email test --to root@example.org ``` + +## Send confirmation emails to all unconfirmed user accounts + +=== "OTP" + + ```sh + ./bin/pleroma_ctl email send_confirmation_mails + ``` + +=== "From Source" + + ```sh + mix pleroma.email send_confirmation_mails + ``` diff --git a/docs/administration/CLI_tasks/user.md b/docs/administration/CLI_tasks/user.md index 3e7f028ba..c64ed4f22 100644 --- a/docs/administration/CLI_tasks/user.md +++ b/docs/administration/CLI_tasks/user.md @@ -224,9 +224,10 @@ ``` ### Options +- `--admin`/`--no-admin` - whether the user should be an admin +- `--confirmed`/`--no-confirmed` - whether the user account is confirmed - `--locked`/`--no-locked` - whether the user should be locked - `--moderator`/`--no-moderator` - whether the user should be a moderator -- `--admin`/`--no-admin` - whether the user should be an admin ## Add tags to a user @@ -271,3 +272,33 @@ ```sh mix pleroma.user toggle_confirmed ``` + +## Set confirmation status for all regular active users +*Admins and moderators are excluded* + +=== "OTP" + + ```sh + ./bin/pleroma_ctl user confirm_all + ``` + +=== "From Source" + + ```sh + mix pleroma.user confirm_all + ``` + +## Revoke confirmation status for all regular active users +*Admins and moderators are excluded* + +=== "OTP" + + ```sh + ./bin/pleroma_ctl user unconfirm_all + ``` + +=== "From Source" + + ```sh + mix pleroma.user unconfirm_all + ``` diff --git a/docs/administration/backup.md b/docs/administration/backup.md index be57bf74a..b49ff07fb 100644 --- a/docs/administration/backup.md +++ b/docs/administration/backup.md @@ -5,20 +5,24 @@ 1. Stop the Pleroma service. 2. Go to the working directory of Pleroma (default is `/opt/pleroma`) 3. Run `sudo -Hu postgres pg_dump -d --format=custom -f ` (make sure the postgres user has write access to the destination file) -4. Copy `pleroma.pgdump`, `config/prod.secret.exs` and the `uploads` folder to your backup destination. If you have other modifications, copy those changes too. +4. Copy `pleroma.pgdump`, `config/prod.secret.exs`, `config/setup_db.psql` (if still available) and the `uploads` folder to your backup destination. If you have other modifications, copy those changes too. 5. Restart the Pleroma service. ## Restore/Move -1. Optionally reinstall Pleroma (either on the same server or on another server if you want to move servers). Try to use the same database name. +1. Optionally reinstall Pleroma (either on the same server or on another server if you want to move servers). 2. Stop the Pleroma service. 3. Go to the working directory of Pleroma (default is `/opt/pleroma`) 4. Copy the above mentioned files back to their original position. -5. Drop the existing database and recreate an empty one `sudo -Hu postgres psql -c 'DROP DATABASE ;';` `sudo -Hu postgres psql -c 'CREATE DATABASE ;';` -6. Run `sudo -Hu postgres pg_restore -d -v -1 ` -7. If you installed a newer Pleroma version, you should run `mix ecto.migrate`[^1]. This task performs database migrations, if there were any. -8. Restart the Pleroma service. -9. Run `sudo -Hu postgres vacuumdb --all --analyze-in-stages`. This will quickly generate the statistics so that postgres can properly plan queries. +5. Drop the existing database if restoring in-place. `sudo -Hu postgres psql -c 'DROP DATABASE ;'` +6. 
Restore the database schema and the `pleroma` PostgreSQL role with the original `setup_db.psql` if you have it: `sudo -Hu postgres psql -f config/setup_db.psql`.
+
+    Alternatively, run the `mix pleroma.instance gen` task again. You can ignore most of the questions, but make the database user, name, and password the same as found in your backup of `config/prod.secret.exs`. Then run the restoration of the pleroma role and schema with the generated `config/setup_db.psql` as instructed above. You may delete the `config/generated_config.exs` file as it is not needed.
+
+7. Now restore the Pleroma instance's data into the empty database schema: `sudo -Hu postgres pg_restore -d <pleroma_db> -v -1 <backup_file>`
+8. If you installed a newer Pleroma version, you should run `mix ecto.migrate`[^1]. This task performs database migrations, if there were any.
+9. Restart the Pleroma service.
+10. Run `sudo -Hu postgres vacuumdb --all --analyze-in-stages`. This will quickly generate the statistics so that postgres can properly plan queries.
 
 [^1]: Prefix with `MIX_ENV=prod` to run it using the production config file.
diff --git a/docs/configuration/cheatsheet.md b/docs/configuration/cheatsheet.md
index 7cf1d1ce7..ea7dfec98 100644
--- a/docs/configuration/cheatsheet.md
+++ b/docs/configuration/cheatsheet.md
@@ -18,7 +18,7 @@ To add configuration to your config file, you can copy it from the base config.
 * `notify_email`: Email used for notifications.
 * `description`: The instance’s description, can be seen in nodeinfo and ``/api/v1/instance``.
 * `limit`: Posts character limit (CW/Subject included in the counter).
-* `discription_limit`: The character limit for image descriptions.
+* `description_limit`: The character limit for image descriptions.
 * `chat_limit`: Character limit of the instance chat messages.
 * `remote_limit`: Hard character limit beyond which remote posts will be dropped.
 * `upload_limit`: File size limit of uploads (except for avatar, background, banner).
@@ -40,7 +40,6 @@ To add configuration to your config file, you can copy it from the base config.
 * `allow_relay`: Enable Pleroma’s Relay, which makes it possible to follow a whole instance.
 * `public`: Makes the client API in authenticated mode-only except for user-profiles. Useful for disabling the Local Timeline and The Whole Known Network. Note that there is a dependent setting restricting or allowing unauthenticated access to specific resources, see `restrict_unauthenticated` for more details.
 * `quarantined_instances`: List of ActivityPub instances where private (DMs, followers-only) activities will not be send.
-* `managed_config`: Whenether the config for pleroma-fe is configured in [:frontend_configurations](#frontend_configurations) or in ``static/config.json``.
 * `allowed_post_formats`: MIME-type list of formats allowed to be posted (transformed into HTML).
 * `extended_nickname_format`: Set to `true` to use extended local nicknames format (allows underscores/dashes). This will break federation with older software for theses nicknames.
@@ -226,6 +225,16 @@ Enables the worker which processes posts scheduled for deletion. Pinned posts ar
 * `enabled`: whether expired activities will be sent to the job queue to be deleted
 
+## FedSockets
+FedSockets is an experimental feature that lets Pleroma backends federate over a persistent websocket connection instead of opening a separate HTTP connection for each federation request. This feature is currently off by default. It is configurable through the following options.
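+
+As a quick orientation before the option reference below, here is a minimal, purely illustrative sketch of what enabling it in `config/prod.secret.exs` could look like (the timing values simply mirror the defaults shipped in `config/config.exs`):
+
+```elixir
+# Illustrative sketch only: enables the experimental FedSockets transport.
+# The durations below just restate the defaults from config/config.exs.
+config :pleroma, :fed_sockets,
+  enabled: true,
+  connection_duration: :timer.hours(8),
+  rejection_duration: :timer.minutes(15)
+```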
+ +### :fedsockets +* `enabled`: Enables FedSockets for this instance. `false` by default. +* `connection_duration`: Time an idle websocket is kept open. +* `rejection_duration`: Failures to connect via FedSockets will not be retried for this period of time. +* `fed_socket_fetches` and `fed_socket_rejections`: Settings passed to `cachex` for the fetch registry, and rejection stacks. See `Pleroma.Web.FedSockets` for more details. + + ## Frontends ### :frontend_configurations @@ -315,6 +324,14 @@ This section describe PWA manifest instance-specific values. Currently this opti * `enabled`: Enables purge cache * `provider`: Which one of the [purge cache strategy](#purge-cache-strategy) to use. +## :media_preview_proxy + +* `enabled`: Enables proxying of remote media preview to the instance’s proxy. Requires enabled media proxy (`media_proxy/enabled`). +* `thumbnail_max_width`: Max width of preview thumbnail for images (video preview always has original dimensions). +* `thumbnail_max_height`: Max height of preview thumbnail for images (video preview always has original dimensions). +* `image_quality`: Quality of the output. Ranges from 0 (min quality) to 100 (max quality). +* `min_content_length`: Min content length to perform preview, in bytes. If greater than 0, media smaller in size will be served as is, without thumbnailing. + ### Purge cache strategy #### Pleroma.Web.MediaProxy.Invalidation.Script @@ -409,9 +426,9 @@ This will make Pleroma listen on `127.0.0.1` port `8080` and generate urls start Available options: * `enabled` - Enable/disable the plug. Defaults to `false`. -* `headers` - A list of strings naming the `req_headers` to use when deriving the `remote_ip`. Order does not matter. Defaults to `["x-forwarded-for"]`. -* `proxies` - A list of strings in [CIDR](https://en.wikipedia.org/wiki/CIDR) notation specifying the IPs of known proxies. Defaults to `[]`. -* `reserved` - Defaults to [localhost](https://en.wikipedia.org/wiki/Localhost) and [private network](https://en.wikipedia.org/wiki/Private_network). +* `headers` - A list of strings naming the HTTP headers to use when deriving the true client IP address. Defaults to `["x-forwarded-for"]`. +* `proxies` - A list of upstream proxy IP subnets in CIDR notation from which we will parse the content of `headers`. Defaults to `[]`. IPv4 entries without a bitmask will be assumed to be /32 and IPv6 /128. +* `reserved` - A list of reserved IP subnets in CIDR notation which should be ignored if found in `headers`. Defaults to `["127.0.0.0/8", "::1/128", "fc00::/7", "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]`. 
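+
+As an illustration, here is a hedged sketch for an instance sitting behind a single nginx reverse proxy on the same host. The proxy subnet is an assumption made for the example, and the `reserved` list simply restates the defaults:
+
+```elixir
+# Illustrative sketch, not a recommendation: trust X-Forwarded-For only from
+# a local reverse proxy. Adjust `proxies` to match your actual proxy addresses.
+config :pleroma, Pleroma.Plugs.RemoteIp,
+  enabled: true,
+  headers: ["x-forwarded-for"],
+  proxies: ["127.0.0.1/32"],
+  reserved: ["127.0.0.0/8", "::1/128", "fc00::/7", "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]
+```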
### :rate_limit diff --git a/docs/installation/alpine_linux_en.md b/docs/installation/alpine_linux_en.md index f393e4978..62f2fb778 100644 --- a/docs/installation/alpine_linux_en.md +++ b/docs/installation/alpine_linux_en.md @@ -21,6 +21,9 @@ It assumes that you have administrative rights, either as root or a user with [s * `nginx` (preferred, example configs for other reverse proxies can be found in the repo) * `certbot` (or any other ACME client for Let’s Encrypt certificates) +* `ImageMagick` +* `ffmpeg` +* `exiftool` ### Prepare the system @@ -30,7 +33,6 @@ It assumes that you have administrative rights, either as root or a user with [s awk 'NR==2' /etc/apk/repositories | sed 's/main/community/' | tee -a /etc/apk/repositories ``` - * Then update the system, if not already done: ```shell @@ -57,6 +59,7 @@ sudo apk add erlang erlang-runtime-tools erlang-xmerl elixir ```shell sudo apk add erlang-eldap ``` + ### Install PostgreSQL * Install Postgresql server: @@ -77,6 +80,12 @@ sudo /etc/init.d/postgresql start sudo rc-update add postgresql ``` +### Install media / graphics packages (optional, see [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md)) + +```shell +sudo apk add ffmpeg imagemagick exiftool +``` + ### Install PleromaBE * Add a new system user for the Pleroma service: diff --git a/docs/installation/arch_linux_en.md b/docs/installation/arch_linux_en.md index 99eb011ad..0eb6d2d5f 100644 --- a/docs/installation/arch_linux_en.md +++ b/docs/installation/arch_linux_en.md @@ -16,6 +16,9 @@ This guide will assume that you have administrative rights, either as root or a * `nginx` (preferred, example configs for other reverse proxies can be found in the repo) * `certbot` (or any other ACME client for Let’s Encrypt certificates) +* `ImageMagick` +* `ffmpeg` +* `exiftool` ### Prepare the system @@ -53,6 +56,12 @@ sudo -iu postgres initdb -D /var/lib/postgres/data sudo systemctl enable --now postgresql.service ``` +### Install media / graphics packages (optional, see [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md)) + +```shell +sudo pacman -S ffmpeg imagemagick perl-image-exiftool +``` + ### Install PleromaBE * Add a new system user for the Pleroma service: diff --git a/docs/installation/debian_based_en.md b/docs/installation/debian_based_en.md index 58d15ce14..6a9026d94 100644 --- a/docs/installation/debian_based_en.md +++ b/docs/installation/debian_based_en.md @@ -19,6 +19,9 @@ This guide will assume you are on Debian Stretch. 
This guide should also work wi * `nginx` (preferred, example configs for other reverse proxies can be found in the repo) * `certbot` (or any other ACME client for Let’s Encrypt certificates) +* `ImageMagick` +* `ffmpeg` +* `exiftool` ### Prepare the system @@ -51,6 +54,12 @@ sudo apt update sudo apt install elixir erlang-dev erlang-nox ``` +### Optional packages: [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md) + +```shell +sudo apt install imagemagick ffmpeg libimage-exiftool-perl +``` + ### Install PleromaBE * Add a new system user for the Pleroma service: diff --git a/docs/installation/debian_based_jp.md b/docs/installation/debian_based_jp.md index d98162796..94e22325c 100644 --- a/docs/installation/debian_based_jp.md +++ b/docs/installation/debian_based_jp.md @@ -23,6 +23,9 @@ - `nginx` (おすすめです。他のリバースプロキシを使う場合は、参考となる設定をこのリポジトリから探してください) - `certbot` (または何らかのLet's Encrypt向けACMEクライアント) +- `ImageMagick` +- `ffmpeg` +- `exiftool` ### システムを準備する @@ -34,10 +37,9 @@ sudo apt full-upgrade * 上記に挙げたパッケージをインストールしておきます。 ``` -sudo apt install git build-essential postgresql postgresql-contrib cmake libmagic-dev +sudo apt install git build-essential postgresql postgresql-contrib cmake ffmpeg imagemagick libmagic-dev ``` - ### ElixirとErlangをインストールします * Erlangのリポジトリをダウンロードおよびインストールします。 @@ -52,6 +54,12 @@ sudo apt update sudo apt install elixir erlang-dev erlang-nox ``` +### オプションパッケージ: [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md) + +```shell +sudo apt install imagemagick ffmpeg libimage-exiftool-perl +``` + ### Pleroma BE (バックエンド) をインストールします * Pleroma用に新しいユーザーを作ります。 diff --git a/docs/installation/freebsd_en.md b/docs/installation/freebsd_en.md index ca2575d9b..fdcb06c53 100644 --- a/docs/installation/freebsd_en.md +++ b/docs/installation/freebsd_en.md @@ -26,6 +26,12 @@ Setup the required services to automatically start at boot, using `sysrc(8)`. 
# service postgresql start ``` +### Install media / graphics packages (optional, see [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md)) + +```shell +# pkg install imagemagick ffmpeg p5-Image-ExifTool +``` + ## Configuring Pleroma Create a user for Pleroma: diff --git a/docs/installation/gentoo_en.md b/docs/installation/gentoo_en.md index 0f7ed9d47..f2380ab72 100644 --- a/docs/installation/gentoo_en.md +++ b/docs/installation/gentoo_en.md @@ -36,6 +36,9 @@ Gentoo quite pointedly does not come with a cron daemon installed, and as such i * `www-servers/nginx` (preferred, example configs for other reverse proxies can be found in the repo) * `app-crypt/certbot` (or any other ACME client for Let’s Encrypt certificates) * `app-crypt/certbot-nginx` (nginx certbot plugin that allows use of the all-powerful `--nginx` flag on certbot) +* `media-gfx/imagemagick` +* `media-video/ffmpeg` +* `media-libs/exiftool` ### Prepare the system @@ -88,6 +91,12 @@ If you do not plan to make any modifications to your Pleroma instance, cloning d Not only does this make it much easier to deploy changes you make, as you can commit and pull from upstream and all that good stuff from the comfort of your local machine then simply `git pull` on your instance server when you're ready to deploy, it also ensures you are compliant with the Affero General Public Licence that Pleroma is licenced under, which stipulates that all network services provided with modified AGPL code must publish their changes on a publicly available internet service and for free. It also makes it much easier to ask for help from and provide help to your fellow Pleroma admins if your public repo always reflects what you are running because it is part of your deployment procedure. +### Install media / graphics packages (optional, see [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md)) + +```shell +# emerge --ask media-video/ffmpeg media-gfx/imagemagick media-libs/exiftool +``` + ### Install PleromaBE * Add a new system user for the Pleroma service and set up default directories: diff --git a/docs/installation/netbsd_en.md b/docs/installation/netbsd_en.md index 6ad0de2f6..d5fa04fdf 100644 --- a/docs/installation/netbsd_en.md +++ b/docs/installation/netbsd_en.md @@ -10,7 +10,7 @@ Pleroma uses. The `mksh` shell is needed to run the Elixir `mix` script. -`# pkgin install acmesh elixir git-base git-docs mksh nginx postgresql11-server postgresql11-client postgresql11-contrib sudo` +`# pkgin install acmesh elixir git-base git-docs mksh nginx postgresql11-server postgresql11-client postgresql11-contrib sudo ffmpeg4 ImageMagick` You can also build these packages using pkgsrc: ``` @@ -44,6 +44,10 @@ pgsql=YES First, run `# /etc/rc.d/pgsql start`. Then, `$ sudo -Hu pgsql -g pgsql createdb`. 
+### Install media / graphics packages (optional, see [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md)) + +`# pkgin install ImageMagick ffmpeg4 p5-Image-ExifTool` + ## Configuring Pleroma Create a user for Pleroma: diff --git a/docs/installation/openbsd_en.md b/docs/installation/openbsd_en.md index eee452845..8092ac379 100644 --- a/docs/installation/openbsd_en.md +++ b/docs/installation/openbsd_en.md @@ -10,20 +10,34 @@ The following packages need to be installed: * elixir * gmake - * ImageMagick * git * postgresql-server * postgresql-contrib * cmake + * ffmpeg + * ImageMagick To install them, run the following command (with doas or as root): ``` -pkg_add elixir gmake ImageMagick git postgresql-server postgresql-contrib cmake +pkg_add elixir gmake git postgresql-server postgresql-contrib cmake ffmpeg ImageMagick ``` Pleroma requires a reverse proxy, OpenBSD has relayd in base (and is used in this guide) and packages/ports are available for nginx (www/nginx) and apache (www/apache-httpd). Independently of the reverse proxy, [acme-client(1)](https://man.openbsd.org/acme-client) can be used to get a certificate from Let's Encrypt. +#### Optional software + +Per [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md): + * ImageMagick + * ffmpeg + * exiftool + +To install the above: + +``` +pkg_add ImageMagick ffmpeg p5-Image-ExifTool +``` + #### Creating the pleroma user Pleroma will be run by a dedicated user, \_pleroma. Before creating it, insert the following lines in login.conf: ``` diff --git a/docs/installation/openbsd_fi.md b/docs/installation/openbsd_fi.md index b5b5056a9..01cf34ab4 100644 --- a/docs/installation/openbsd_fi.md +++ b/docs/installation/openbsd_fi.md @@ -16,7 +16,18 @@ Matrix-kanava #freenode_#pleroma:matrix.org ovat hyviä paikkoja löytää apua Asenna tarvittava ohjelmisto: -`# pkg_add git elixir gmake postgresql-server-10.3 postgresql-contrib-10.3 cmake` +`# pkg_add git elixir gmake postgresql-server-10.3 postgresql-contrib-10.3 cmake ffmpeg ImageMagick` + +#### Optional software + +[`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md): + * ImageMagick + * ffmpeg + * exiftool + +Asenna tarvittava ohjelmisto: + +`# pkg_add ImageMagick ffmpeg p5-Image-ExifTool` Luo postgresql-tietokanta: diff --git a/docs/installation/optional/media_graphics_packages.md b/docs/installation/optional/media_graphics_packages.md new file mode 100644 index 000000000..cb3d71188 --- /dev/null +++ b/docs/installation/optional/media_graphics_packages.md @@ -0,0 +1,32 @@ +# Optional software packages needed for specific functionality + +For specific Pleroma functionality (which is disabled by default) some or all of the below packages are required: + * `ImageMagic` + * `ffmpeg` + * `exiftool` + +Please refer to documentation in `docs/installation` on how to install them on specific OS. + +Note: the packages are not required with the current default settings of Pleroma. + +## `ImageMagick` + +`ImageMagick` is a set of tools to create, edit, compose, or convert bitmap images. 
+ +It is required for the following Pleroma features: + * `Pleroma.Upload.Filters.Mogrify`, `Pleroma.Upload.Filters.Mogrifun` upload filters (related config: `Plaroma.Upload/filters` in `config/config.exs`) + * Media preview proxy for still images (related config: `media_preview_proxy/enabled` in `config/config.exs`) + +## `ffmpeg` + +`ffmpeg` is software to record, convert and stream audio and video. + +It is required for the following Pleroma features: + * Media preview proxy for videos (related config: `media_preview_proxy/enabled` in `config/config.exs`) + +## `exiftool` + +`exiftool` is media files metadata reader/writer. + +It is required for the following Pleroma features: + * `Pleroma.Upload.Filters.Exiftool` upload filter (related config: `Plaroma.Upload/filters` in `config/config.exs`) diff --git a/docs/installation/otp_en.md b/docs/installation/otp_en.md index 32f04a9c4..62d4c8a72 100644 --- a/docs/installation/otp_en.md +++ b/docs/installation/otp_en.md @@ -41,6 +41,25 @@ Other than things bundled in the OTP release Pleroma depends on: apt install curl unzip libncurses5 postgresql postgresql-contrib nginx certbot libmagic-dev ``` +### Installing optional packages + +Per [`docs/installation/optional/media_graphics_packages.md`](docs/installation/optional/media_graphics_packages.md): + * ImageMagick + * ffmpeg + * exiftool + +=== "Alpine" + ``` + echo "http://nl.alpinelinux.org/alpine/latest-stable/community" >> /etc/apk/repositories + apk update + apk add imagemagick ffmpeg exiftool + ``` + +=== "Debian/Ubuntu" + ``` + apt install imagemagick ffmpeg libimage-exiftool-perl + ``` + ## Setup ### Configuring PostgreSQL #### (Optional) Installing RUM indexes @@ -83,6 +102,8 @@ It is encouraged to check [Optimizing your PostgreSQL performance](../configurat If you are using PostgreSQL 12 or higher, add this to your Ecto database configuration ```elixir +# +config :pleroma, Pleroma.Repo, prepare: :named, parameters: [ plan_cache_mode: "force_custom_plan" diff --git a/installation/pleroma.nginx b/installation/pleroma.nginx index d301ca615..d613befd2 100644 --- a/installation/pleroma.nginx +++ b/installation/pleroma.nginx @@ -9,6 +9,12 @@ proxy_cache_path /tmp/pleroma-media-cache levels=1:2 keys_zone=pleroma_media_cache:10m max_size=10g inactive=720m use_temp_path=off; +# this is explicitly IPv4 since Pleroma.Web.Endpoint binds on IPv4 only +# and `localhost.` resolves to [::0] on some systems: see issue #930 +upstream phoenix { + server 127.0.0.1:4000 max_fails=5 fail_timeout=60s; +} + server { server_name example.tld; @@ -63,19 +69,16 @@ server { # the nginx default is 1m, not enough for large media uploads client_max_body_size 16m; + ignore_invalid_headers off; + + proxy_http_version 1.1; + proxy_set_header Upgrade $http_upgrade; + proxy_set_header Connection "upgrade"; + proxy_set_header Host $http_host; + proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; location / { - proxy_http_version 1.1; - proxy_set_header Upgrade $http_upgrade; - proxy_set_header Connection "upgrade"; - proxy_set_header Host $http_host; - proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; - - # this is explicitly IPv4 since Pleroma.Web.Endpoint binds on IPv4 only - # and `localhost.` resolves to [::0] on some systems: see issue #930 - proxy_pass http://127.0.0.1:4000; - - client_max_body_size 16m; + proxy_pass http://phoenix; } location ~ ^/(media|proxy) { @@ -83,12 +86,16 @@ server { slice 1m; proxy_cache_key $host$uri$is_args$args$slice_range; proxy_set_header Range $slice_range; - 
proxy_http_version 1.1; proxy_cache_valid 200 206 301 304 1h; proxy_cache_lock on; proxy_ignore_client_abort on; proxy_buffering on; chunked_transfer_encoding on; - proxy_pass http://127.0.0.1:4000; + proxy_pass http://phoenix; + } + + location /api/fedsocket/v1 { + proxy_request_buffering off; + proxy_pass http://phoenix/api/fedsocket/v1; } } diff --git a/installation/pleroma.vcl b/installation/pleroma.vcl index 154747aa6..13dad784c 100644 --- a/installation/pleroma.vcl +++ b/installation/pleroma.vcl @@ -1,3 +1,4 @@ +# Recommended varnishncsa logging format: '%h %l %u %t "%m %{X-Forwarded-Proto}i://%{Host}i%U%q %H" %s %b "%{Referer}i" "%{User-agent}i"' vcl 4.1; import std; @@ -14,8 +15,11 @@ acl purge { sub vcl_recv { # Redirect HTTP to HTTPS if (std.port(server.ip) != 443) { + set req.http.X-Forwarded-Proto = "http"; set req.http.x-redir = "https://" + req.http.host + req.url; return (synth(750, "")); + } else { + set req.http.X-Forwarded-Proto = "https"; } # CHUNKED SUPPORT @@ -105,7 +109,7 @@ sub vcl_hash { sub vcl_backend_fetch { # Be more lenient for slow servers on the fediverse - if bereq.url ~ "^/proxy/" { + if (bereq.url ~ "^/proxy/") { set bereq.first_byte_timeout = 300s; } diff --git a/lib/mix/tasks/pleroma/config.ex b/lib/mix/tasks/pleroma/config.ex index 904c5a74b..18f99318d 100644 --- a/lib/mix/tasks/pleroma/config.ex +++ b/lib/mix/tasks/pleroma/config.ex @@ -32,7 +32,8 @@ def run(["migrate_from_db" | options]) do @spec migrate_to_db(Path.t() | nil) :: any() def migrate_to_db(file_path \\ nil) do - if Pleroma.Config.get([:configurable_from_database]) do + with true <- Pleroma.Config.get([:configurable_from_database]), + :ok <- Pleroma.Config.DeprecationWarnings.warn() do config_file = if file_path do file_path @@ -46,7 +47,8 @@ def migrate_to_db(file_path \\ nil) do do_migrate_to_db(config_file) else - migration_error() + :error -> deprecation_error() + _ -> migration_error() end end @@ -120,6 +122,10 @@ defp migration_error do ) end + defp deprecation_error do + shell_error("Migration is not allowed until all deprecation warnings have been resolved.") + end + if Code.ensure_loaded?(Config.Reader) do defp config_header, do: "import Config\r\n\r\n" defp read_file(config_file), do: Config.Reader.read_imports!(config_file) diff --git a/lib/mix/tasks/pleroma/database.ex b/lib/mix/tasks/pleroma/database.ex index 7f1108dcf..a01c36ece 100644 --- a/lib/mix/tasks/pleroma/database.ex +++ b/lib/mix/tasks/pleroma/database.ex @@ -99,7 +99,7 @@ def run(["fix_likes_collections"]) do where: fragment("(?)->>'likes' is not null", object.data), select: %{id: object.id, likes: fragment("(?)->>'likes'", object.data)} ) - |> Pleroma.RepoStreamer.chunk_stream(100) + |> Pleroma.Repo.chunk_stream(100, :batches) |> Stream.each(fn objects -> ids = objects @@ -145,7 +145,7 @@ def run(["ensure_expiration"]) do |> where(local: true) |> where([a], fragment("(? 
->> 'type'::text) = 'Create'", a.data)) |> where([_a, o], fragment("?->>'type' = 'Note'", o.data)) - |> Pleroma.RepoStreamer.chunk_stream(100) + |> Pleroma.Repo.chunk_stream(100, :batches) |> Stream.each(fn activities -> Enum.each(activities, fn activity -> expires_at = diff --git a/lib/mix/tasks/pleroma/email.ex b/lib/mix/tasks/pleroma/email.ex index d3fac6ec8..9972cb988 100644 --- a/lib/mix/tasks/pleroma/email.ex +++ b/lib/mix/tasks/pleroma/email.ex @@ -2,11 +2,11 @@ defmodule Mix.Tasks.Pleroma.Email do use Mix.Task import Mix.Pleroma - @shortdoc "Simple Email test" + @shortdoc "Email administrative tasks" @moduledoc File.read!("docs/administration/CLI_tasks/email.md") def run(["test" | args]) do - Mix.Pleroma.start_pleroma() + start_pleroma() {options, [], []} = OptionParser.parse( @@ -21,4 +21,20 @@ def run(["test" | args]) do shell_info("Test email has been sent to #{inspect(email.to)} from #{inspect(email.from)}") end + + def run(["resend_confirmation_emails"]) do + start_pleroma() + + shell_info("Sending emails to all unconfirmed users") + + Pleroma.User.Query.build(%{ + local: true, + deactivated: false, + confirmation_pending: true, + invisible: false + }) + |> Pleroma.Repo.chunk_stream(500) + |> Stream.each(&Pleroma.User.try_send_confirmation_email(&1)) + |> Stream.run() + end end diff --git a/lib/mix/tasks/pleroma/relay.ex b/lib/mix/tasks/pleroma/relay.ex index a6d8d6c1c..bb808ca47 100644 --- a/lib/mix/tasks/pleroma/relay.ex +++ b/lib/mix/tasks/pleroma/relay.ex @@ -21,10 +21,19 @@ def run(["follow", target]) do end end - def run(["unfollow", target]) do + def run(["unfollow", target | rest]) do start_pleroma() - with {:ok, _activity} <- Relay.unfollow(target) do + {options, [], []} = + OptionParser.parse( + rest, + strict: [force: :boolean], + aliases: [f: :force] + ) + + force = Keyword.get(options, :force, false) + + with {:ok, _activity} <- Relay.unfollow(target, %{force: force}) do # put this task to sleep to allow the genserver to push out the messages :timer.sleep(500) else diff --git a/lib/mix/tasks/pleroma/user.ex b/lib/mix/tasks/pleroma/user.ex index 01824aa18..e06262804 100644 --- a/lib/mix/tasks/pleroma/user.ex +++ b/lib/mix/tasks/pleroma/user.ex @@ -179,7 +179,7 @@ def run(["deactivate_all_from_instance", instance]) do start_pleroma() Pleroma.User.Query.build(%{nickname: "@#{instance}"}) - |> Pleroma.RepoStreamer.chunk_stream(500) + |> Pleroma.Repo.chunk_stream(500, :batches) |> Stream.each(fn users -> users |> Enum.each(fn user -> @@ -196,17 +196,24 @@ def run(["set", nickname | rest]) do OptionParser.parse( rest, strict: [ - moderator: :boolean, admin: :boolean, - locked: :boolean + confirmed: :boolean, + locked: :boolean, + moderator: :boolean ] ) with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do user = - case Keyword.get(options, :moderator) do + case Keyword.get(options, :admin) do nil -> user - value -> set_moderator(user, value) + value -> set_admin(user, value) + end + + user = + case Keyword.get(options, :confirmed) do + nil -> user + value -> set_confirmed(user, value) end user = @@ -216,9 +223,9 @@ def run(["set", nickname | rest]) do end _user = - case Keyword.get(options, :admin) do + case Keyword.get(options, :moderator) do nil -> user - value -> set_admin(user, value) + value -> set_moderator(user, value) end else _ -> @@ -353,6 +360,42 @@ def run(["toggle_confirmed", nickname]) do end end + def run(["confirm_all"]) do + start_pleroma() + + Pleroma.User.Query.build(%{ + local: true, + deactivated: false, + is_moderator: false, 
+ is_admin: false, + invisible: false + }) + |> Pleroma.Repo.chunk_stream(500, :batches) + |> Stream.each(fn users -> + users + |> Enum.each(fn user -> User.need_confirmation(user, false) end) + end) + |> Stream.run() + end + + def run(["unconfirm_all"]) do + start_pleroma() + + Pleroma.User.Query.build(%{ + local: true, + deactivated: false, + is_moderator: false, + is_admin: false, + invisible: false + }) + |> Pleroma.Repo.chunk_stream(500, :batches) + |> Stream.each(fn users -> + users + |> Enum.each(fn user -> User.need_confirmation(user, true) end) + end) + |> Stream.run() + end + def run(["sign_out", nickname]) do start_pleroma() @@ -370,7 +413,7 @@ def run(["list"]) do start_pleroma() Pleroma.User.Query.build(%{local: true}) - |> Pleroma.RepoStreamer.chunk_stream(500) + |> Pleroma.Repo.chunk_stream(500, :batches) |> Stream.each(fn users -> users |> Enum.each(fn user -> @@ -410,4 +453,11 @@ defp set_locked(user, value) do shell_info("Locked status of #{user.nickname}: #{user.locked}") user end + + defp set_confirmed(user, value) do + {:ok, user} = User.need_confirmation(user, !value) + + shell_info("Confirmation pending status of #{user.nickname}: #{user.confirmation_pending}") + user + end end diff --git a/lib/pleroma/application.ex b/lib/pleroma/application.ex index cd7a856d0..4ed8df09c 100644 --- a/lib/pleroma/application.ex +++ b/lib/pleroma/application.ex @@ -56,7 +56,6 @@ def start(_type, _args) do Pleroma.ApplicationRequirements.verify!() setup_instrumenters() load_custom_modules() - check_system_commands() Pleroma.Docs.JSON.compile() adapter = Application.get_env(:tesla, :adapter) @@ -100,7 +99,7 @@ def start(_type, _args) do {Oban, Config.get(Oban)} ] ++ task_children(@env) ++ - streamer_child(@env) ++ + dont_run_in_test(@env) ++ chat_child(@env, chat_enabled?()) ++ [ Pleroma.Web.Endpoint, @@ -189,16 +188,17 @@ def build_cachex(type, opts), defp chat_enabled?, do: Config.get([:chat, :enabled]) - defp streamer_child(env) when env in [:test, :benchmark], do: [] + defp dont_run_in_test(env) when env in [:test, :benchmark], do: [] - defp streamer_child(_) do + defp dont_run_in_test(_) do [ {Registry, [ name: Pleroma.Web.Streamer.registry(), keys: :duplicate, partitions: System.schedulers_online() - ]} + ]}, + Pleroma.Web.FedSockets.Supervisor ] end @@ -260,21 +260,4 @@ defp http_children(Tesla.Adapter.Gun, _) do end defp http_children(_, _), do: [] - - defp check_system_commands do - filters = Config.get([Pleroma.Upload, :filters]) - - check_filter = fn filter, command_required -> - with true <- filter in filters, - false <- Pleroma.Utils.command_available?(command_required) do - Logger.error( - "#{filter} is specified in list of Pleroma.Upload filters, but the #{command_required} command is not found" - ) - end - end - - check_filter.(Pleroma.Upload.Filters.Exiftool, "exiftool") - check_filter.(Pleroma.Upload.Filters.Mogrify, "mogrify") - check_filter.(Pleroma.Upload.Filters.Mogrifun, "mogrify") - end end diff --git a/lib/pleroma/application_requirements.ex b/lib/pleroma/application_requirements.ex index 16f62b6f5..b977257a3 100644 --- a/lib/pleroma/application_requirements.ex +++ b/lib/pleroma/application_requirements.ex @@ -9,6 +9,9 @@ defmodule Pleroma.ApplicationRequirements do defmodule VerifyError, do: defexception([:message]) + alias Pleroma.Config + alias Pleroma.Helpers.MediaHelper + import Ecto.Query require Logger @@ -16,7 +19,8 @@ defmodule VerifyError, do: defexception([:message]) @spec verify!() :: :ok | VerifyError.t() def verify! 
do :ok - |> check_confirmation_accounts! + |> check_system_commands!() + |> check_confirmation_accounts!() |> check_migrations_applied!() |> check_welcome_message_config!() |> check_rum!() @@ -48,7 +52,9 @@ def check_confirmation_accounts!(:ok) do if Pleroma.Config.get([:instance, :account_activation_required]) && not Pleroma.Config.get([Pleroma.Emails.Mailer, :enabled]) do Logger.error( - "Account activation enabled, but no Mailer settings enabled.\nPlease set config :pleroma, :instance, account_activation_required: false\nOtherwise setup and enable Mailer." + "Account activation enabled, but no Mailer settings enabled.\n" <> + "Please set config :pleroma, :instance, account_activation_required: false\n" <> + "Otherwise setup and enable Mailer." ) {:error, @@ -81,7 +87,9 @@ def check_migrations_applied!(:ok) do Enum.map(down_migrations, fn {:down, id, name} -> "- #{name} (#{id})\n" end) Logger.error( - "The following migrations were not applied:\n#{down_migrations_text}If you want to start Pleroma anyway, set\nconfig :pleroma, :i_am_aware_this_may_cause_data_loss, disable_migration_check: true" + "The following migrations were not applied:\n#{down_migrations_text}" <> + "If you want to start Pleroma anyway, set\n" <> + "config :pleroma, :i_am_aware_this_may_cause_data_loss, disable_migration_check: true" ) {:error, "Unapplied Migrations detected"} @@ -124,14 +132,22 @@ defp do_check_rum!(setting, migrate) do case {setting, migrate} do {true, false} -> Logger.error( - "Use `RUM` index is enabled, but were not applied migrations for it.\nIf you want to start Pleroma anyway, set\nconfig :pleroma, :database, rum_enabled: false\nOtherwise apply the following migrations:\n`mix ecto.migrate --migrations-path priv/repo/optional_migrations/rum_indexing/`" + "Use `RUM` index is enabled, but were not applied migrations for it.\n" <> + "If you want to start Pleroma anyway, set\n" <> + "config :pleroma, :database, rum_enabled: false\n" <> + "Otherwise apply the following migrations:\n" <> + "`mix ecto.migrate --migrations-path priv/repo/optional_migrations/rum_indexing/`" ) {:error, "Unapplied RUM Migrations detected"} {false, true} -> Logger.error( - "Detected applied migrations to use `RUM` index, but `RUM` isn't enable in settings.\nIf you want to use `RUM`, set\nconfig :pleroma, :database, rum_enabled: true\nOtherwise roll `RUM` migrations back.\n`mix ecto.rollback --migrations-path priv/repo/optional_migrations/rum_indexing/`" + "Detected applied migrations to use `RUM` index, but `RUM` isn't enable in settings.\n" <> + "If you want to use `RUM`, set\n" <> + "config :pleroma, :database, rum_enabled: true\n" <> + "Otherwise roll `RUM` migrations back.\n" <> + "`mix ecto.rollback --migrations-path priv/repo/optional_migrations/rum_indexing/`" ) {:error, "RUM Migrations detected"} @@ -140,4 +156,50 @@ defp do_check_rum!(setting, migrate) do :ok end end + + defp check_system_commands!(:ok) do + filter_commands_statuses = [ + check_filter(Pleroma.Upload.Filters.Exiftool, "exiftool"), + check_filter(Pleroma.Upload.Filters.Mogrify, "mogrify"), + check_filter(Pleroma.Upload.Filters.Mogrifun, "mogrify") + ] + + preview_proxy_commands_status = + if !Config.get([:media_preview_proxy, :enabled]) or + MediaHelper.missing_dependencies() == [] do + true + else + Logger.error( + "The following dependencies required by Media preview proxy " <> + "(which is currently enabled) are not installed: " <> + inspect(MediaHelper.missing_dependencies()) + ) + + false + end + + if Enum.all?([preview_proxy_commands_status | 
filter_commands_statuses], & &1) do + :ok + else + {:error, + "System commands missing. Check logs and see `docs/installation` for more details."} + end + end + + defp check_system_commands!(result), do: result + + defp check_filter(filter, command_required) do + filters = Config.get([Pleroma.Upload, :filters]) + + if filter in filters and not Pleroma.Utils.command_available?(command_required) do + Logger.error( + "#{filter} is specified in list of Pleroma.Upload filters, but the " <> + "#{command_required} command is not found" + ) + + false + else + true + end + end end diff --git a/lib/pleroma/chat.ex b/lib/pleroma/chat.ex index 24a86371e..28007cd9f 100644 --- a/lib/pleroma/chat.ex +++ b/lib/pleroma/chat.ex @@ -6,7 +6,9 @@ defmodule Pleroma.Chat do use Ecto.Schema import Ecto.Changeset + import Ecto.Query + alias Pleroma.Chat alias Pleroma.Repo alias Pleroma.User @@ -16,6 +18,7 @@ defmodule Pleroma.Chat do It is a helper only, to make it easy to display a list of chats with other people, ordered by last bump. The actual messages are retrieved by querying the recipients of the ChatMessages. """ + @type t :: %__MODULE__{} @primary_key {:id, FlakeId.Ecto.CompatType, autogenerate: true} schema "chats" do @@ -39,16 +42,28 @@ def changeset(struct, params) do |> unique_constraint(:user_id, name: :chats_user_id_recipient_index) end + @spec get_by_user_and_id(User.t(), FlakeId.Ecto.CompatType.t()) :: + {:ok, t()} | {:error, :not_found} + def get_by_user_and_id(%User{id: user_id}, id) do + from(c in __MODULE__, + where: c.id == ^id, + where: c.user_id == ^user_id + ) + |> Repo.find_resource() + end + + @spec get_by_id(FlakeId.Ecto.CompatType.t()) :: t() | nil def get_by_id(id) do - __MODULE__ - |> Repo.get(id) + Repo.get(__MODULE__, id) end + @spec get(FlakeId.Ecto.CompatType.t(), String.t()) :: t() | nil def get(user_id, recipient) do - __MODULE__ - |> Repo.get_by(user_id: user_id, recipient: recipient) + Repo.get_by(__MODULE__, user_id: user_id, recipient: recipient) end + @spec get_or_create(FlakeId.Ecto.CompatType.t(), String.t()) :: + {:ok, t()} | {:error, Ecto.Changeset.t()} def get_or_create(user_id, recipient) do %__MODULE__{} |> changeset(%{user_id: user_id, recipient: recipient}) @@ -60,6 +75,8 @@ def get_or_create(user_id, recipient) do ) end + @spec bump_or_create(FlakeId.Ecto.CompatType.t(), String.t()) :: + {:ok, t()} | {:error, Ecto.Changeset.t()} def bump_or_create(user_id, recipient) do %__MODULE__{} |> changeset(%{user_id: user_id, recipient: recipient}) @@ -69,4 +86,12 @@ def bump_or_create(user_id, recipient) do conflict_target: [:user_id, :recipient] ) end + + @spec for_user_query(FlakeId.Ecto.CompatType.t()) :: Ecto.Query.t() + def for_user_query(user_id) do + from(c in Chat, + where: c.user_id == ^user_id, + order_by: [desc: c.updated_at] + ) + end end diff --git a/lib/pleroma/config/deprecation_warnings.ex b/lib/pleroma/config/deprecation_warnings.ex index 412d55a77..4ba6eaa77 100644 --- a/lib/pleroma/config/deprecation_warnings.ex +++ b/lib/pleroma/config/deprecation_warnings.ex @@ -26,38 +26,25 @@ def check_hellthread_threshold do !!!DEPRECATION WARNING!!! You are using the old configuration mechanism for the hellthread filter. Please check config.md. 
""") - end - end - def mrf_user_allowlist do - config = Config.get(:mrf_user_allowlist) - - if config && Enum.any?(config, fn {k, _} -> is_atom(k) end) do - rewritten = - Enum.reduce(Config.get(:mrf_user_allowlist), Map.new(), fn {k, v}, acc -> - Map.put(acc, to_string(k), v) - end) - - Config.put(:mrf_user_allowlist, rewritten) - - Logger.error(""" - !!!DEPRECATION WARNING!!! - As of Pleroma 2.0.7, the `mrf_user_allowlist` setting changed of format. - Pleroma 2.1 will remove support for the old format. Please change your configuration to match this: - - config :pleroma, :mrf_user_allowlist, #{inspect(rewritten, pretty: true)} - """) + :error + else + :ok end end def warn do - check_hellthread_threshold() - mrf_user_allowlist() - check_old_mrf_config() - check_media_proxy_whitelist_config() - check_welcome_message_config() - check_gun_pool_options() - check_activity_expiration_config() + with :ok <- check_hellthread_threshold(), + :ok <- check_old_mrf_config(), + :ok <- check_media_proxy_whitelist_config(), + :ok <- check_welcome_message_config(), + :ok <- check_gun_pool_options(), + :ok <- check_activity_expiration_config() do + :ok + else + _ -> + :error + end end def check_welcome_message_config do @@ -70,10 +57,14 @@ def check_welcome_message_config do if use_old_config do Logger.error(""" !!!DEPRECATION WARNING!!! - Your config is using the old namespace for Welcome messages configuration. You need to change to the new namespace: - \n* `config :pleroma, :instance, welcome_user_nickname` is now `config :pleroma, :welcome, :direct_message, :sender_nickname` - \n* `config :pleroma, :instance, welcome_message` is now `config :pleroma, :welcome, :direct_message, :message` + Your config is using the old namespace for Welcome messages configuration. You need to convert to the new namespace. e.g., + \n* `config :pleroma, :instance, welcome_user_nickname` and `config :pleroma, :instance, welcome_message` are now equal to: + \n* `config :pleroma, :welcome, direct_message: [enabled: true, sender_nickname: "NICKNAME", message: "Your welcome message"]`" """) + + :error + else + :ok end end @@ -101,8 +92,11 @@ def move_namespace_and_warn(config_map, warning_preface) do end end) - if warning != "" do + if warning == "" do + :ok + else Logger.warn(warning_preface <> warning) + :error end end @@ -115,6 +109,10 @@ def check_media_proxy_whitelist_config do !!!DEPRECATION WARNING!!! Your config is using old format (only domain) for MediaProxy whitelist option. Setting should work for now, but you are advised to change format to scheme with port to prevent possible issues later. """) + + :error + else + :ok end end @@ -124,7 +122,7 @@ def check_gun_pool_options do if timeout = pool_config[:await_up_timeout] do Logger.warn(""" !!!DEPRECATION WARNING!!! - Your config is using old setting name `await_up_timeout` instead of `connect_timeout`. Setting should work for now, but you are advised to change format to scheme with port to prevent possible issues later. + Your config is using old setting `config :pleroma, :connections_pool, await_up_timeout`. Please change to `config :pleroma, :connections_pool, connect_timeout` to ensure compatibility with future releases. 
""") Config.put(:connections_pool, Keyword.put_new(pool_config, :connect_timeout, timeout)) @@ -157,6 +155,9 @@ def check_gun_pool_options do Logger.warn(Enum.join([warning_preface | pool_warnings])) Config.put(:pools, updated_config) + :error + else + :ok end end diff --git a/lib/pleroma/emails/mailer.ex b/lib/pleroma/emails/mailer.ex index 8b1bdef75..5108c71c8 100644 --- a/lib/pleroma/emails/mailer.ex +++ b/lib/pleroma/emails/mailer.ex @@ -35,6 +35,11 @@ def perform(:deliver_async, email, config), do: deliver(email, config) def deliver(email, config \\ []) def deliver(email, config) do + # temporary hackney fix until hackney max_connections bug is fixed + # https://git.pleroma.social/pleroma/pleroma/-/issues/2101 + email = + Swoosh.Email.put_private(email, :hackney_options, ssl_options: [versions: [:"tlsv1.2"]]) + case enabled?() do true -> Swoosh.Mailer.deliver(email, parse_config(config)) false -> {:error, :deliveries_disabled} diff --git a/lib/pleroma/emoji.ex b/lib/pleroma/emoji.ex index f6016d73f..04936155b 100644 --- a/lib/pleroma/emoji.ex +++ b/lib/pleroma/emoji.ex @@ -56,6 +56,9 @@ def get(name) do end end + @spec exist?(String.t()) :: boolean() + def exist?(name), do: not is_nil(get(name)) + @doc "Returns all the emojos!!" @spec get_all() :: list({String.t(), String.t(), String.t()}) def get_all do diff --git a/lib/pleroma/emoji/pack.ex b/lib/pleroma/emoji/pack.ex index d076ae312..8f1989ada 100644 --- a/lib/pleroma/emoji/pack.ex +++ b/lib/pleroma/emoji/pack.ex @@ -17,6 +17,7 @@ defmodule Pleroma.Emoji.Pack do } alias Pleroma.Emoji + alias Pleroma.Emoji.Pack @spec create(String.t()) :: {:ok, t()} | {:error, File.posix()} | {:error, :empty_values} def create(name) do @@ -64,24 +65,93 @@ def delete(name) do end end - @spec add_file(String.t(), String.t(), Path.t(), Plug.Upload.t() | String.t()) :: - {:ok, t()} | {:error, File.posix() | atom()} - def add_file(name, shortcode, filename, file) do - with :ok <- validate_not_empty([name, shortcode, filename]), + @spec unpack_zip_emojies(list(tuple())) :: list(map()) + defp unpack_zip_emojies(zip_files) do + Enum.reduce(zip_files, [], fn + {_, path, s, _, _, _}, acc when elem(s, 2) == :regular -> + with( + filename <- Path.basename(path), + shortcode <- Path.basename(filename, Path.extname(filename)), + false <- Emoji.exist?(shortcode) + ) do + [%{path: path, filename: path, shortcode: shortcode} | acc] + else + _ -> acc + end + + _, acc -> + acc + end) + end + + @spec add_file(t(), String.t(), Path.t(), Plug.Upload.t()) :: + {:ok, t()} + | {:error, File.posix() | atom()} + def add_file(%Pack{} = pack, _, _, %Plug.Upload{content_type: "application/zip"} = file) do + with {:ok, zip_files} <- :zip.table(to_charlist(file.path)), + [_ | _] = emojies <- unpack_zip_emojies(zip_files), + {:ok, tmp_dir} <- Pleroma.Utils.tmp_dir("emoji") do + try do + {:ok, _emoji_files} = + :zip.unzip( + to_charlist(file.path), + [{:file_list, Enum.map(emojies, & &1[:path])}, {:cwd, tmp_dir}] + ) + + {_, updated_pack} = + Enum.map_reduce(emojies, pack, fn item, emoji_pack -> + emoji_file = %Plug.Upload{ + filename: item[:filename], + path: Path.join(tmp_dir, item[:path]) + } + + {:ok, updated_pack} = + do_add_file( + emoji_pack, + item[:shortcode], + to_string(item[:filename]), + emoji_file + ) + + {item, updated_pack} + end) + + Emoji.reload() + + {:ok, updated_pack} + after + File.rm_rf(tmp_dir) + end + else + {:error, _} = error -> + error + + _ -> + {:ok, pack} + end + end + + def add_file(%Pack{} = pack, shortcode, filename, %Plug.Upload{} = file) do + with 
:ok <- validate_not_empty([shortcode, filename]), :ok <- validate_emoji_not_exists(shortcode), - {:ok, pack} <- load_pack(name), - :ok <- save_file(file, pack, filename), - {:ok, updated_pack} <- pack |> put_emoji(shortcode, filename) |> save_pack() do + {:ok, updated_pack} <- do_add_file(pack, shortcode, filename, file) do Emoji.reload() {:ok, updated_pack} end end - @spec delete_file(String.t(), String.t()) :: + defp do_add_file(pack, shortcode, filename, file) do + with :ok <- save_file(file, pack, filename) do + pack + |> put_emoji(shortcode, filename) + |> save_pack() + end + end + + @spec delete_file(t(), String.t()) :: {:ok, t()} | {:error, File.posix() | atom()} - def delete_file(name, shortcode) do - with :ok <- validate_not_empty([name, shortcode]), - {:ok, pack} <- load_pack(name), + def delete_file(%Pack{} = pack, shortcode) do + with :ok <- validate_not_empty([shortcode]), :ok <- remove_file(pack, shortcode), {:ok, updated_pack} <- pack |> delete_emoji(shortcode) |> save_pack() do Emoji.reload() @@ -89,11 +159,10 @@ def delete_file(name, shortcode) do end end - @spec update_file(String.t(), String.t(), String.t(), String.t(), boolean()) :: + @spec update_file(t(), String.t(), String.t(), String.t(), boolean()) :: {:ok, t()} | {:error, File.posix() | atom()} - def update_file(name, shortcode, new_shortcode, new_filename, force) do - with :ok <- validate_not_empty([name, shortcode, new_shortcode, new_filename]), - {:ok, pack} <- load_pack(name), + def update_file(%Pack{} = pack, shortcode, new_shortcode, new_filename, force) do + with :ok <- validate_not_empty([shortcode, new_shortcode, new_filename]), {:ok, filename} <- get_filename(pack, shortcode), :ok <- validate_emoji_not_exists(new_shortcode, force), :ok <- rename_file(pack, filename, new_filename), @@ -129,13 +198,13 @@ def import_from_filesystem do end end - @spec list_remote(String.t()) :: {:ok, map()} | {:error, atom()} - def list_remote(url) do - uri = url |> String.trim() |> URI.parse() + @spec list_remote(keyword()) :: {:ok, map()} | {:error, atom()} + def list_remote(opts) do + uri = opts[:url] |> String.trim() |> URI.parse() with :ok <- validate_shareable_packs_available(uri) do uri - |> URI.merge("/api/pleroma/emoji/packs") + |> URI.merge("/api/pleroma/emoji/packs?page=#{opts[:page]}&page_size=#{opts[:page_size]}") |> http_get() end end @@ -175,7 +244,8 @@ def download(name, url, as) do uri = url |> String.trim() |> URI.parse() with :ok <- validate_shareable_packs_available(uri), - {:ok, remote_pack} <- uri |> URI.merge("/api/pleroma/emoji/packs/#{name}") |> http_get(), + {:ok, remote_pack} <- + uri |> URI.merge("/api/pleroma/emoji/pack?name=#{name}") |> http_get(), {:ok, %{sha: sha, url: url} = pack_info} <- fetch_pack_info(remote_pack, uri, name), {:ok, archive} <- download_archive(url, sha), pack <- copy_as(remote_pack, as || name), @@ -243,9 +313,10 @@ defp validate_emoji_not_exists(shortcode, force \\ false) defp validate_emoji_not_exists(_shortcode, true), do: :ok defp validate_emoji_not_exists(shortcode, _) do - case Emoji.get(shortcode) do - nil -> :ok - _ -> {:error, :already_exists} + if Emoji.exist?(shortcode) do + {:error, :already_exists} + else + :ok end end @@ -386,25 +457,18 @@ defp validate_not_empty(list) do end end - defp save_file(file, pack, filename) do + defp save_file(%Plug.Upload{path: upload_path}, pack, filename) do file_path = Path.join(pack.path, filename) create_subdirs(file_path) - case file do - %Plug.Upload{path: upload_path} -> - # Copy the uploaded file from the temporary 
directory - with {:ok, _} <- File.copy(upload_path, file_path), do: :ok - - url when is_binary(url) -> - # Download and write the file - file_contents = Tesla.get!(url).body - File.write(file_path, file_contents) + with {:ok, _} <- File.copy(upload_path, file_path) do + :ok end end defp put_emoji(pack, shortcode, filename) do files = Map.put(pack.files, shortcode, filename) - %{pack | files: files} + %{pack | files: files, files_count: length(Map.keys(files))} end defp delete_emoji(pack, shortcode) do @@ -460,7 +524,7 @@ defp get_filename(pack, shortcode) do defp http_get(%URI{} = url), do: url |> to_string() |> http_get() defp http_get(url) do - with {:ok, %{body: body}} <- url |> Pleroma.HTTP.get() do + with {:ok, %{body: body}} <- Pleroma.HTTP.get(url, [], pool: :default) do Jason.decode(body) end end @@ -509,7 +573,7 @@ defp fetch_pack_info(remote_pack, uri, name) do {:ok, %{ sha: sha, - url: URI.merge(uri, "/api/pleroma/emoji/packs/#{name}/archive") |> to_string() + url: URI.merge(uri, "/api/pleroma/emoji/packs/archive?name=#{name}") |> to_string() }} %{"fallback-src" => src, "fallback-src-sha256" => sha} when is_binary(src) -> diff --git a/lib/pleroma/gun/conn.ex b/lib/pleroma/gun/conn.ex index 75b1ffc0a..477e19c6e 100644 --- a/lib/pleroma/gun/conn.ex +++ b/lib/pleroma/gun/conn.ex @@ -50,10 +50,10 @@ defp do_open(uri, %{proxy: {proxy_host, proxy_port}} = opts) do with open_opts <- Map.delete(opts, :tls_opts), {:ok, conn} <- Gun.open(proxy_host, proxy_port, open_opts), - {:ok, _} <- Gun.await_up(conn, opts[:connect_timeout]), + {:ok, protocol} <- Gun.await_up(conn, opts[:connect_timeout]), stream <- Gun.connect(conn, connect_opts), {:response, :fin, 200, _} <- Gun.await(conn, stream) do - {:ok, conn} + {:ok, conn, protocol} else error -> Logger.warn( @@ -88,8 +88,8 @@ defp do_open(uri, %{proxy: {proxy_type, proxy_host, proxy_port}} = opts) do |> Map.put(:socks_opts, socks_opts) with {:ok, conn} <- Gun.open(proxy_host, proxy_port, opts), - {:ok, _} <- Gun.await_up(conn, opts[:connect_timeout]) do - {:ok, conn} + {:ok, protocol} <- Gun.await_up(conn, opts[:connect_timeout]) do + {:ok, conn, protocol} else error -> Logger.warn( @@ -106,8 +106,8 @@ defp do_open(%URI{host: host, port: port} = uri, opts) do host = Pleroma.HTTP.AdapterHelper.parse_host(host) with {:ok, conn} <- Gun.open(host, port, opts), - {:ok, _} <- Gun.await_up(conn, opts[:connect_timeout]) do - {:ok, conn} + {:ok, protocol} <- Gun.await_up(conn, opts[:connect_timeout]) do + {:ok, conn, protocol} else error -> Logger.warn( diff --git a/lib/pleroma/gun/connection_pool/worker.ex b/lib/pleroma/gun/connection_pool/worker.ex index c36332817..bf57e9e5f 100644 --- a/lib/pleroma/gun/connection_pool/worker.ex +++ b/lib/pleroma/gun/connection_pool/worker.ex @@ -15,7 +15,7 @@ def init([_key, _uri, _opts, _client_pid] = opts) do @impl true def handle_continue({:connect, [key, uri, opts, client_pid]}, _) do - with {:ok, conn_pid} <- Gun.Conn.open(uri, opts), + with {:ok, conn_pid, protocol} <- Gun.Conn.open(uri, opts), Process.link(conn_pid) do time = :erlang.monotonic_time(:millisecond) @@ -27,8 +27,12 @@ def handle_continue({:connect, [key, uri, opts, client_pid]}, _) do send(client_pid, {:conn_pid, conn_pid}) {:noreply, - %{key: key, timer: nil, client_monitors: %{client_pid => Process.monitor(client_pid)}}, - :hibernate} + %{ + key: key, + timer: nil, + client_monitors: %{client_pid => Process.monitor(client_pid)}, + protocol: protocol + }, :hibernate} else err -> {:stop, {:shutdown, err}, nil} @@ -53,14 +57,20 @@ def 
handle_cast({:remove_client, client_pid}, state) do end @impl true - def handle_call(:add_client, {client_pid, _}, %{key: key} = state) do + def handle_call(:add_client, {client_pid, _}, %{key: key, protocol: protocol} = state) do time = :erlang.monotonic_time(:millisecond) - {{conn_pid, _, _, _}, _} = + {{conn_pid, used_by, _, _}, _} = Registry.update_value(@registry, key, fn {conn_pid, used_by, crf, last_reference} -> {conn_pid, [client_pid | used_by], crf(time - last_reference, crf), time} end) + :telemetry.execute( + [:pleroma, :connection_pool, :client, :add], + %{client_pid: client_pid, clients: used_by}, + %{key: state.key, protocol: protocol} + ) + state = if state.timer != nil do Process.cancel_timer(state[:timer]) @@ -83,25 +93,18 @@ def handle_call(:remove_client, {client_pid, _}, %{key: key} = state) do end) {ref, state} = pop_in(state.client_monitors[client_pid]) - # DOWN message can receive right after `remove_client` call and cause worker to terminate - state = - if is_nil(ref) do - state + + Process.demonitor(ref, [:flush]) + + timer = + if used_by == [] do + max_idle = Pleroma.Config.get([:connections_pool, :max_idle_time], 30_000) + Process.send_after(self(), :idle_close, max_idle) else - Process.demonitor(ref) - - timer = - if used_by == [] do - max_idle = Pleroma.Config.get([:connections_pool, :max_idle_time], 30_000) - Process.send_after(self(), :idle_close, max_idle) - else - nil - end - - %{state | timer: timer} + nil end - {:reply, :ok, state, :hibernate} + {:reply, :ok, %{state | timer: timer}, :hibernate} end @impl true @@ -131,7 +134,7 @@ def handle_info({:gun_down, _pid, _protocol, _reason, _killed_streams}, state) d @impl true def handle_info({:DOWN, _ref, :process, pid, reason}, state) do :telemetry.execute( - [:pleroma, :connection_pool, :client_death], + [:pleroma, :connection_pool, :client, :dead], %{client_pid: pid, reason: reason}, %{key: state.key} ) diff --git a/lib/pleroma/helpers/media_helper.ex b/lib/pleroma/helpers/media_helper.ex new file mode 100644 index 000000000..6b799173e --- /dev/null +++ b/lib/pleroma/helpers/media_helper.ex @@ -0,0 +1,162 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Helpers.MediaHelper do + @moduledoc """ + Handles common media-related operations. 
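+
+  Example (illustrative only: assumes `ffmpeg` and ImageMagick's `convert` are
+  installed — `missing_dependencies/0` is what the start-up check consults when
+  the media preview proxy is enabled — and that `url` points to media reachable
+  over HTTP):
+
+      iex> Pleroma.Helpers.MediaHelper.missing_dependencies()
+      []
+
+      iex> {:ok, _jpeg} = Pleroma.Helpers.MediaHelper.video_framegrab(url)
+
+      iex> {:ok, _png} =
+      ...>   Pleroma.Helpers.MediaHelper.image_resize(url, %{max_width: 600, max_height: 600, format: "png"})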
+ """ + + alias Pleroma.HTTP + + require Logger + + def missing_dependencies do + Enum.reduce([imagemagick: "convert", ffmpeg: "ffmpeg"], [], fn {sym, executable}, acc -> + if Pleroma.Utils.command_available?(executable) do + acc + else + [sym | acc] + end + end) + end + + def image_resize(url, options) do + with executable when is_binary(executable) <- System.find_executable("convert"), + {:ok, args} <- prepare_image_resize_args(options), + {:ok, env} <- HTTP.get(url, [], pool: :media), + {:ok, fifo_path} <- mkfifo() do + args = List.flatten([fifo_path, args]) + run_fifo(fifo_path, env, executable, args) + else + nil -> {:error, {:convert, :command_not_found}} + {:error, _} = error -> error + end + end + + defp prepare_image_resize_args( + %{max_width: max_width, max_height: max_height, format: "png"} = options + ) do + quality = options[:quality] || 85 + resize = Enum.join([max_width, "x", max_height, ">"]) + + args = [ + "-resize", + resize, + "-quality", + to_string(quality), + "png:-" + ] + + {:ok, args} + end + + defp prepare_image_resize_args(%{max_width: max_width, max_height: max_height} = options) do + quality = options[:quality] || 85 + resize = Enum.join([max_width, "x", max_height, ">"]) + + args = [ + "-interlace", + "Plane", + "-resize", + resize, + "-quality", + to_string(quality), + "jpg:-" + ] + + {:ok, args} + end + + defp prepare_image_resize_args(_), do: {:error, :missing_options} + + # Note: video thumbnail is intentionally not resized (always has original dimensions) + def video_framegrab(url) do + with executable when is_binary(executable) <- System.find_executable("ffmpeg"), + {:ok, env} <- HTTP.get(url, [], pool: :media), + {:ok, fifo_path} <- mkfifo(), + args = [ + "-y", + "-i", + fifo_path, + "-vframes", + "1", + "-f", + "mjpeg", + "-loglevel", + "error", + "-" + ] do + run_fifo(fifo_path, env, executable, args) + else + nil -> {:error, {:ffmpeg, :command_not_found}} + {:error, _} = error -> error + end + end + + defp run_fifo(fifo_path, env, executable, args) do + pid = + Port.open({:spawn_executable, executable}, [ + :use_stdio, + :stream, + :exit_status, + :binary, + args: args + ]) + + fifo = Port.open(to_charlist(fifo_path), [:eof, :binary, :stream, :out]) + fix = Pleroma.Helpers.QtFastStart.fix(env.body) + true = Port.command(fifo, fix) + :erlang.port_close(fifo) + loop_recv(pid) + after + File.rm(fifo_path) + end + + defp mkfifo do + path = Path.join(System.tmp_dir!(), "pleroma-media-preview-pipe-#{Ecto.UUID.generate()}") + + case System.cmd("mkfifo", [path]) do + {_, 0} -> + spawn(fifo_guard(path)) + {:ok, path} + + {_, err} -> + {:error, {:fifo_failed, err}} + end + end + + defp fifo_guard(path) do + pid = self() + + fn -> + ref = Process.monitor(pid) + + receive do + {:DOWN, ^ref, :process, ^pid, _} -> + File.rm(path) + end + end + end + + defp loop_recv(pid) do + loop_recv(pid, <<>>) + end + + defp loop_recv(pid, acc) do + receive do + {^pid, {:data, data}} -> + loop_recv(pid, acc <> data) + + {^pid, {:exit_status, 0}} -> + {:ok, acc} + + {^pid, {:exit_status, status}} -> + {:error, status} + after + 5000 -> + :erlang.port_close(pid) + {:error, :timeout} + end + end +end diff --git a/lib/pleroma/helpers/qt_fast_start.ex b/lib/pleroma/helpers/qt_fast_start.ex new file mode 100644 index 000000000..bb93224b5 --- /dev/null +++ b/lib/pleroma/helpers/qt_fast_start.ex @@ -0,0 +1,131 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Helpers.QtFastStart 
do + @moduledoc """ + (WIP) Converts a "slow start" (data before metadatas) mov/mp4 file to a "fast start" one (metadatas before data). + """ + + # TODO: Cleanup and optimizations + # Inspirations: https://www.ffmpeg.org/doxygen/3.4/qt-faststart_8c_source.html + # https://github.com/danielgtaylor/qtfaststart/blob/master/qtfaststart/processor.py + # ISO/IEC 14496-12:2015, ISO/IEC 15444-12:2015 + # Paracetamol + + def fix(<<0x00, 0x00, 0x00, _, 0x66, 0x74, 0x79, 0x70, _::bits>> = binary) do + index = fix(binary, 0, nil, nil, []) + + case index do + :abort -> binary + [{"ftyp", _, _, _, _}, {"mdat", _, _, _, _} | _] -> faststart(index) + [{"ftyp", _, _, _, _}, {"free", _, _, _, _}, {"mdat", _, _, _, _} | _] -> faststart(index) + _ -> binary + end + end + + def fix(binary) do + binary + end + + # MOOV have been seen before MDAT- abort + defp fix(<<_::bits>>, _, true, false, _) do + :abort + end + + defp fix( + <>, + pos, + got_moov, + got_mdat, + acc + ) do + full_size = (size - 8) * 8 + <> = rest + + acc = [ + {fourcc, pos, pos + size, size, + <>} + | acc + ] + + fix(rest, pos + size, got_moov || fourcc == "moov", got_mdat || fourcc == "mdat", acc) + end + + defp fix(<<>>, _pos, _, _, acc) do + :lists.reverse(acc) + end + + defp faststart(index) do + {{_ftyp, _, _, _, ftyp}, index} = List.keytake(index, "ftyp", 0) + + # Skip re-writing the free fourcc as it's kind of useless. + # Why stream useless bytes when you can do without? + {free_size, index} = + case List.keytake(index, "free", 0) do + {{_, _, _, size, _}, index} -> {size, index} + _ -> {0, index} + end + + {{_moov, _, _, moov_size, moov}, index} = List.keytake(index, "moov", 0) + offset = -free_size + moov_size + rest = for {_, _, _, _, data} <- index, do: data, into: [] + <> = moov + [ftyp, moov_head, fix_moov(moov_data, offset, []), rest] + end + + defp fix_moov( + <>, + offset, + acc + ) do + full_size = (size - 8) * 8 + <> = rest + + data = + cond do + fourcc in ["trak", "mdia", "minf", "stbl"] -> + # Theses contains sto or co64 part + [<>, fix_moov(data, offset, [])] + + fourcc in ["stco", "co64"] -> + # fix the damn thing + <> = data + + entry_size = + case fourcc do + "stco" -> 32 + "co64" -> 64 + end + + [ + <>, + rewrite_entries(entry_size, offset, rest, []) + ] + + true -> + [<>, data] + end + + acc = [acc | data] + fix_moov(rest, offset, acc) + end + + defp fix_moov(<<>>, _, acc), do: acc + + for size <- [32, 64] do + defp rewrite_entries( + unquote(size), + offset, + <>, + acc + ) do + rewrite_entries(unquote(size), offset, rest, [ + acc | <> + ]) + end + end + + defp rewrite_entries(_, _, <<>>, acc), do: acc +end diff --git a/lib/pleroma/helpers/uri_helper.ex b/lib/pleroma/helpers/uri_helper.ex index 6d205a636..f1301f055 100644 --- a/lib/pleroma/helpers/uri_helper.ex +++ b/lib/pleroma/helpers/uri_helper.ex @@ -3,18 +3,22 @@ # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.Helpers.UriHelper do - def append_uri_params(uri, appended_params) do + def modify_uri_params(uri, overridden_params, deleted_params \\ []) do uri = URI.parse(uri) - appended_params = for {k, v} <- appended_params, into: %{}, do: {to_string(k), v} - existing_params = URI.query_decoder(uri.query || "") |> Enum.into(%{}) - updated_params_keys = Enum.uniq(Map.keys(existing_params) ++ Map.keys(appended_params)) + + existing_params = URI.query_decoder(uri.query || "") |> Map.new() + overridden_params = Map.new(overridden_params, fn {k, v} -> {to_string(k), v} end) + deleted_params = Enum.map(deleted_params, &to_string/1) updated_params = - for k <- 
updated_params_keys, do: {k, appended_params[k] || existing_params[k]} + existing_params + |> Map.merge(overridden_params) + |> Map.drop(deleted_params) uri |> Map.put(:query, URI.encode_query(updated_params)) |> URI.to_string() + |> String.replace_suffix("?", "") end def maybe_add_base("/" <> uri, base), do: Path.join([base, uri]) diff --git a/lib/pleroma/http/web_push.ex b/lib/pleroma/http/web_push.ex new file mode 100644 index 000000000..78148a12e --- /dev/null +++ b/lib/pleroma/http/web_push.ex @@ -0,0 +1,12 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.HTTP.WebPush do + @moduledoc false + + def post(url, payload, headers) do + list_headers = Map.to_list(headers) + Pleroma.HTTP.post(url, payload, list_headers) + end +end diff --git a/lib/pleroma/instances/instance.ex b/lib/pleroma/instances/instance.ex index 8bf53c090..f0f601469 100644 --- a/lib/pleroma/instances/instance.ex +++ b/lib/pleroma/instances/instance.ex @@ -156,16 +156,12 @@ def get_or_update_favicon(%URI{host: host} = instance_uri) do defp scrape_favicon(%URI{} = instance_uri) do try do with {:ok, %Tesla.Env{body: html}} <- - Pleroma.HTTP.get(to_string(instance_uri), [{"accept", "text/html"}], - adapter: [pool: :media] - ), - favicon_rel <- - html - |> Floki.parse_document!() - |> Floki.attribute("link[rel=icon]", "href") - |> List.first(), - favicon <- URI.merge(instance_uri, favicon_rel) |> to_string(), - true <- is_binary(favicon) do + Pleroma.HTTP.get(to_string(instance_uri), [{"accept", "text/html"}], pool: :media), + {_, [favicon_rel | _]} when is_binary(favicon_rel) <- + {:parse, + html |> Floki.parse_document!() |> Floki.attribute("link[rel=icon]", "href")}, + {_, favicon} when is_binary(favicon) <- + {:merge, URI.merge(instance_uri, favicon_rel) |> to_string()} do favicon else _ -> nil diff --git a/lib/pleroma/migration_helper/notification_backfill.ex b/lib/pleroma/migration_helper/notification_backfill.ex index d260e62ca..24f4733fe 100644 --- a/lib/pleroma/migration_helper/notification_backfill.ex +++ b/lib/pleroma/migration_helper/notification_backfill.ex @@ -19,13 +19,13 @@ def fill_in_notification_types do query |> Repo.chunk_stream(100) |> Enum.each(fn notification -> - type = - notification.activity - |> type_from_activity() + if notification.activity do + type = type_from_activity(notification.activity) - notification - |> Ecto.Changeset.change(%{type: type}) - |> Repo.update() + notification + |> Ecto.Changeset.change(%{type: type}) + |> Repo.update() + end end) end @@ -72,8 +72,7 @@ defp type_from_activity(%{data: %{"type" => type}} = activity) do "pleroma:emoji_reaction" "Create" -> - activity - |> type_from_activity_object() + type_from_activity_object(activity) t -> raise "No notification type for activity type #{t}" diff --git a/lib/pleroma/moderation_log.ex b/lib/pleroma/moderation_log.ex index 31c9afe2a..47036a6f6 100644 --- a/lib/pleroma/moderation_log.ex +++ b/lib/pleroma/moderation_log.ex @@ -320,6 +320,19 @@ def insert_log(%{ |> insert_log_entry_with_message() end + @spec insert_log(%{actor: User, action: String.t(), subject_id: String.t()}) :: + {:ok, ModerationLog} | {:error, any} + def insert_log(%{actor: %User{} = actor, action: "chat_message_delete", subject_id: subject_id}) do + %ModerationLog{ + data: %{ + "actor" => %{"nickname" => actor.nickname}, + "action" => "chat_message_delete", + "subject_id" => subject_id + } + } + |> insert_log_entry_with_message() + end + @spec 
insert_log_entry_with_message(ModerationLog) :: {:ok, ModerationLog} | {:error, any} defp insert_log_entry_with_message(entry) do entry.data["message"] @@ -627,6 +640,17 @@ def get_log_entry_message(%ModerationLog{ "@#{actor_nickname} updated users: #{users_to_nicknames_string(subjects)}" end + @spec get_log_entry_message(ModerationLog) :: String.t() + def get_log_entry_message(%ModerationLog{ + data: %{ + "actor" => %{"nickname" => actor_nickname}, + "action" => "chat_message_delete", + "subject_id" => subject_id + } + }) do + "@#{actor_nickname} deleted chat message ##{subject_id}" + end + defp nicknames_to_string(nicknames) do nicknames |> Enum.map(&"@#{&1}") diff --git a/lib/pleroma/object/fetcher.ex b/lib/pleroma/object/fetcher.ex index 1de2ce6c3..169298b34 100644 --- a/lib/pleroma/object/fetcher.ex +++ b/lib/pleroma/object/fetcher.ex @@ -12,6 +12,7 @@ defmodule Pleroma.Object.Fetcher do alias Pleroma.Web.ActivityPub.ObjectValidator alias Pleroma.Web.ActivityPub.Transmogrifier alias Pleroma.Web.Federator + alias Pleroma.Web.FedSockets require Logger require Pleroma.Constants @@ -98,8 +99,8 @@ def fetch_object_from_id(id, options \\ []) do {:containment, _} -> {:error, "Object containment failed."} - {:transmogrifier, {:error, {:reject, nil}}} -> - {:reject, nil} + {:transmogrifier, {:error, {:reject, e}}} -> + {:reject, e} {:transmogrifier, _} = e -> {:error, e} @@ -182,27 +183,20 @@ defp maybe_date_fetch(headers, date) do end end - def fetch_and_contain_remote_object_from_id(id) when is_binary(id) do + def fetch_and_contain_remote_object_from_id(prm, opts \\ []) + + def fetch_and_contain_remote_object_from_id(%{"id" => id}, opts), + do: fetch_and_contain_remote_object_from_id(id, opts) + + def fetch_and_contain_remote_object_from_id(id, opts) when is_binary(id) do Logger.debug("Fetching object #{id} via AP") - date = Pleroma.Signature.signed_date() - - headers = - [{"accept", "application/activity+json"}] - |> maybe_date_fetch(date) - |> sign_fetch(id, date) - - Logger.debug("Fetch headers: #{inspect(headers)}") - with {:scheme, true} <- {:scheme, String.starts_with?(id, "http")}, - {:ok, %{body: body, status: code}} when code in 200..299 <- HTTP.get(id, headers), - {:ok, data} <- Jason.decode(body), + {:ok, body} <- get_object(id, opts), + {:ok, data} <- safe_json_decode(body), :ok <- Containment.contain_origin_from_id(id, data) do {:ok, data} else - {:ok, %{status: code}} when code in [404, 410] -> - {:error, "Object has been deleted"} - {:scheme, _} -> {:error, "Unsupported URI scheme"} @@ -214,8 +208,44 @@ def fetch_and_contain_remote_object_from_id(id) when is_binary(id) do end end - def fetch_and_contain_remote_object_from_id(%{"id" => id}), - do: fetch_and_contain_remote_object_from_id(id) + def fetch_and_contain_remote_object_from_id(_id, _opts), + do: {:error, "id must be a string"} - def fetch_and_contain_remote_object_from_id(_id), do: {:error, "id must be a string"} + defp get_object(id, opts) do + with false <- Keyword.get(opts, :force_http, false), + {:ok, fedsocket} <- FedSockets.get_or_create_fed_socket(id) do + Logger.debug("fetching via fedsocket - #{inspect(id)}") + FedSockets.fetch(fedsocket, id) + else + _other -> + Logger.debug("fetching via http - #{inspect(id)}") + get_object_http(id) + end + end + + defp get_object_http(id) do + date = Pleroma.Signature.signed_date() + + headers = + [{"accept", "application/activity+json"}] + |> maybe_date_fetch(date) + |> sign_fetch(id, date) + + case HTTP.get(id, headers) do + {:ok, %{body: body, status: code}} when code in 
200..299 -> + {:ok, body} + + {:ok, %{status: code}} when code in [404, 410] -> + {:error, "Object has been deleted"} + + {:error, e} -> + {:error, e} + + e -> + {:error, e} + end + end + + defp safe_json_decode(nil), do: {:ok, nil} + defp safe_json_decode(json), do: Jason.decode(json) end diff --git a/lib/pleroma/plugs/oauth_scopes_plug.ex b/lib/pleroma/plugs/oauth_scopes_plug.ex index efc25b79f..b1a736d78 100644 --- a/lib/pleroma/plugs/oauth_scopes_plug.ex +++ b/lib/pleroma/plugs/oauth_scopes_plug.ex @@ -53,7 +53,7 @@ def drop_auth_info(conn) do |> assign(:token, nil) end - @doc "Filters descendants of supported scopes" + @doc "Keeps those of `scopes` which are descendants of `supported_scopes`" def filter_descendants(scopes, supported_scopes) do Enum.filter( scopes, diff --git a/lib/pleroma/plugs/remote_ip.ex b/lib/pleroma/plugs/remote_ip.ex index 0ac9050d0..987022156 100644 --- a/lib/pleroma/plugs/remote_ip.ex +++ b/lib/pleroma/plugs/remote_ip.ex @@ -7,48 +7,42 @@ defmodule Pleroma.Plugs.RemoteIp do This is a shim to call [`RemoteIp`](https://git.pleroma.social/pleroma/remote_ip) but with runtime configuration. """ + alias Pleroma.Config import Plug.Conn @behaviour Plug - @headers ~w[ - x-forwarded-for - ] - - # https://en.wikipedia.org/wiki/Localhost - # https://en.wikipedia.org/wiki/Private_network - @reserved ~w[ - 127.0.0.0/8 - ::1/128 - fc00::/7 - 10.0.0.0/8 - 172.16.0.0/12 - 192.168.0.0/16 - ] - def init(_), do: nil def call(%{remote_ip: original_remote_ip} = conn, _) do - config = Pleroma.Config.get(__MODULE__, []) - - if Keyword.get(config, :enabled, false) do - %{remote_ip: new_remote_ip} = conn = RemoteIp.call(conn, remote_ip_opts(config)) + if Config.get([__MODULE__, :enabled]) do + %{remote_ip: new_remote_ip} = conn = RemoteIp.call(conn, remote_ip_opts()) assign(conn, :remote_ip_found, original_remote_ip != new_remote_ip) else conn end end - defp remote_ip_opts(config) do - headers = config |> Keyword.get(:headers, @headers) |> MapSet.new() - reserved = Keyword.get(config, :reserved, @reserved) + defp remote_ip_opts do + headers = Config.get([__MODULE__, :headers], []) |> MapSet.new() + reserved = Config.get([__MODULE__, :reserved], []) proxies = - config - |> Keyword.get(:proxies, []) + Config.get([__MODULE__, :proxies], []) |> Enum.concat(reserved) - |> Enum.map(&InetCidr.parse/1) + |> Enum.map(&maybe_add_cidr/1) {headers, proxies} end + + defp maybe_add_cidr(proxy) when is_binary(proxy) do + proxy = + cond do + "/" in String.codepoints(proxy) -> proxy + InetCidr.v4?(InetCidr.parse_address!(proxy)) -> proxy <> "/32" + InetCidr.v6?(InetCidr.parse_address!(proxy)) -> proxy <> "/128" + end + + InetCidr.parse(proxy, true) + end end diff --git a/lib/pleroma/repo.ex b/lib/pleroma/repo.ex index f317e4d58..4524bd5e2 100644 --- a/lib/pleroma/repo.ex +++ b/lib/pleroma/repo.ex @@ -49,7 +49,21 @@ def get_assoc(resource, association) do end end - def chunk_stream(query, chunk_size) do + @doc """ + Returns a lazy enumerable that emits all entries from the data store matching the given query. + + `returns_as` use to group records. use the `batches` option to fetch records in bulk. 
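+
+  With the default `:one`, each element of the returned stream is a single
+  record; with `:batches`, every element is a list of up to `chunk_size`
+  records (the shape the removed `Pleroma.RepoStreamer.chunk_stream/2` used to
+  emit), so callers typically `Enum.each/2` over each chunk inside
+  `Stream.each/2`.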
+ + ## Examples + + # fetch records one-by-one + iex> Pleroma.Repo.chunk_stream(Pleroma.Activity.Queries.by_actor(ap_id), 500) + + # fetch records in bulk + iex> Pleroma.Repo.chunk_stream(Pleroma.Activity.Queries.by_actor(ap_id), 500, :batches) + """ + @spec chunk_stream(Ecto.Query.t(), integer(), atom()) :: Enumerable.t() + def chunk_stream(query, chunk_size, returns_as \\ :one) do # We don't actually need start and end funcitons of resource streaming, # but it seems to be the only way to not fetch records one-by-one and # have individual records be the elements of the stream, instead of @@ -69,7 +83,12 @@ def chunk_stream(query, chunk_size) do records -> last_id = List.last(records).id - {records, last_id} + + if returns_as == :one do + {records, last_id} + else + {[records], last_id} + end end end, fn _ -> :ok end diff --git a/lib/pleroma/repo_streamer.ex b/lib/pleroma/repo_streamer.ex deleted file mode 100644 index cb4d7bb7a..000000000 --- a/lib/pleroma/repo_streamer.ex +++ /dev/null @@ -1,34 +0,0 @@ -# Pleroma: A lightweight social networking server -# Copyright © 2017-2020 Pleroma Authors -# SPDX-License-Identifier: AGPL-3.0-only - -defmodule Pleroma.RepoStreamer do - alias Pleroma.Repo - import Ecto.Query - - def chunk_stream(query, chunk_size) do - Stream.unfold(0, fn - :halt -> - {[], :halt} - - last_id -> - query - |> order_by(asc: :id) - |> where([r], r.id > ^last_id) - |> limit(^chunk_size) - |> Repo.all() - |> case do - [] -> - {[], :halt} - - records -> - last_id = List.last(records).id - {records, last_id} - end - end) - |> Stream.take_while(fn - [] -> false - _ -> true - end) - end -end diff --git a/lib/pleroma/reverse_proxy/reverse_proxy.ex b/lib/pleroma/reverse_proxy/reverse_proxy.ex index 0de4e2309..8ae1157df 100644 --- a/lib/pleroma/reverse_proxy/reverse_proxy.ex +++ b/lib/pleroma/reverse_proxy/reverse_proxy.ex @@ -17,6 +17,9 @@ defmodule Pleroma.ReverseProxy do @failed_request_ttl :timer.seconds(60) @methods ~w(GET HEAD) + def max_read_duration_default, do: @max_read_duration + def default_cache_control_header, do: @default_cache_control_header + @moduledoc """ A reverse proxy. 
@@ -391,6 +394,8 @@ defp body_size_constraint(size, limit) when is_integer(limit) and limit > 0 and defp body_size_constraint(_, _), do: :ok + defp check_read_duration(nil = _duration, max), do: check_read_duration(@max_read_duration, max) + defp check_read_duration(duration, max) when is_integer(duration) and is_integer(max) and max > 0 do if duration > max do diff --git a/lib/pleroma/signature.ex b/lib/pleroma/signature.ex index 3aa6909d2..e388993b7 100644 --- a/lib/pleroma/signature.ex +++ b/lib/pleroma/signature.ex @@ -39,7 +39,7 @@ def key_id_to_actor_id(key_id) do def fetch_public_key(conn) do with %{"keyId" => kid} <- HTTPSignatures.signature_for_conn(conn), {:ok, actor_id} <- key_id_to_actor_id(kid), - {:ok, public_key} <- User.get_public_key_for_ap_id(actor_id) do + {:ok, public_key} <- User.get_public_key_for_ap_id(actor_id, force_http: true) do {:ok, public_key} else e -> @@ -50,8 +50,8 @@ def fetch_public_key(conn) do def refetch_public_key(conn) do with %{"keyId" => kid} <- HTTPSignatures.signature_for_conn(conn), {:ok, actor_id} <- key_id_to_actor_id(kid), - {:ok, _user} <- ActivityPub.make_user_from_ap_id(actor_id), - {:ok, public_key} <- User.get_public_key_for_ap_id(actor_id) do + {:ok, _user} <- ActivityPub.make_user_from_ap_id(actor_id, force_http: true), + {:ok, public_key} <- User.get_public_key_for_ap_id(actor_id, force_http: true) do {:ok, public_key} else e -> diff --git a/lib/pleroma/telemetry/logger.ex b/lib/pleroma/telemetry/logger.ex index 4cacae02f..197b1d091 100644 --- a/lib/pleroma/telemetry/logger.ex +++ b/lib/pleroma/telemetry/logger.ex @@ -7,7 +7,8 @@ defmodule Pleroma.Telemetry.Logger do [:pleroma, :connection_pool, :reclaim, :start], [:pleroma, :connection_pool, :reclaim, :stop], [:pleroma, :connection_pool, :provision_failure], - [:pleroma, :connection_pool, :client_death] + [:pleroma, :connection_pool, :client, :dead], + [:pleroma, :connection_pool, :client, :add] ] def attach do :telemetry.attach_many("pleroma-logger", @events, &handle_event/4, []) @@ -62,7 +63,7 @@ def handle_event( end def handle_event( - [:pleroma, :connection_pool, :client_death], + [:pleroma, :connection_pool, :client, :dead], %{client_pid: client_pid, reason: reason}, %{key: key}, _ @@ -73,4 +74,17 @@ def handle_event( }" end) end + + def handle_event( + [:pleroma, :connection_pool, :client, :add], + %{clients: [_, _ | _] = clients}, + %{key: key, protocol: :http}, + _ + ) do + Logger.info(fn -> + "Pool worker for #{key}: #{length(clients)} clients are using an HTTP1 connection at the same time, head-of-line blocking might occur." 
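+      # Note: this clause only matches when at least two clients ([_, _ | _])
+      # share a connection negotiated as HTTP1 (`protocol: :http` in the event
+      # metadata); any other :client :add event falls through to the no-op
+      # clause below.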
+ end) + end + + def handle_event([:pleroma, :connection_pool, :client, :add], _, _, _), do: :ok end diff --git a/lib/pleroma/user.ex b/lib/pleroma/user.ex index e73d19964..09ea80793 100644 --- a/lib/pleroma/user.ex +++ b/lib/pleroma/user.ex @@ -25,7 +25,6 @@ defmodule Pleroma.User do alias Pleroma.Object alias Pleroma.Registration alias Pleroma.Repo - alias Pleroma.RepoStreamer alias Pleroma.User alias Pleroma.UserRelationship alias Pleroma.Web @@ -276,9 +275,9 @@ def binary_id(%User{} = user), do: binary_id(user.id) @spec account_status(User.t()) :: account_status() def account_status(%User{deactivated: true}), do: :deactivated def account_status(%User{password_reset_pending: true}), do: :password_reset_pending - def account_status(%User{approval_pending: true}), do: :approval_pending + def account_status(%User{local: true, approval_pending: true}), do: :approval_pending - def account_status(%User{confirmation_pending: true}) do + def account_status(%User{local: true, confirmation_pending: true}) do if Config.get([:instance, :account_activation_required]) do :confirmation_pending else @@ -814,7 +813,8 @@ def send_welcome_email(%User{email: email} = user) when is_binary(email) do def send_welcome_email(_), do: {:ok, :noop} @spec try_send_confirmation_email(User.t()) :: {:ok, :enqueued | :noop} - def try_send_confirmation_email(%User{confirmation_pending: true} = user) do + def try_send_confirmation_email(%User{confirmation_pending: true, email: email} = user) + when is_binary(email) do if Config.get([:instance, :account_activation_required]) do send_confirmation_email(user) {:ok, :enqueued} @@ -915,9 +915,7 @@ defp do_unfollow(%User{} = follower, %User{} = followed) do FollowingRelationship.unfollow(follower, followed) {:ok, followed} = update_follower_count(followed) - {:ok, follower} = - follower - |> update_following_count() + {:ok, follower} = update_following_count(follower) {:ok, follower, followed} @@ -1686,42 +1684,6 @@ def perform(:delete, %User{} = user) do def perform(:deactivate_async, user, status), do: deactivate(user, status) - @spec perform(atom(), User.t(), list()) :: list() | {:error, any()} - def perform(:blocks_import, %User{} = blocker, blocked_identifiers) - when is_list(blocked_identifiers) do - Enum.map( - blocked_identifiers, - fn blocked_identifier -> - with {:ok, %User{} = blocked} <- get_or_fetch(blocked_identifier), - {:ok, _block} <- CommonAPI.block(blocker, blocked) do - blocked - else - err -> - Logger.debug("blocks_import failed for #{blocked_identifier} with: #{inspect(err)}") - err - end - end - ) - end - - def perform(:follow_import, %User{} = follower, followed_identifiers) - when is_list(followed_identifiers) do - Enum.map( - followed_identifiers, - fn followed_identifier -> - with {:ok, %User{} = followed} <- get_or_fetch(followed_identifier), - {:ok, follower} <- maybe_direct_follow(follower, followed), - {:ok, _, _, _} <- CommonAPI.follow(follower, followed) do - followed - else - err -> - Logger.debug("follow_import failed for #{followed_identifier} with: #{inspect(err)}") - err - end - end - ) - end - @spec external_users_query() :: Ecto.Query.t() def external_users_query do User.Query.build(%{ @@ -1750,21 +1712,6 @@ def external_users(opts \\ []) do Repo.all(query) end - def blocks_import(%User{} = blocker, blocked_identifiers) when is_list(blocked_identifiers) do - BackgroundWorker.enqueue("blocks_import", %{ - "blocker_id" => blocker.id, - "blocked_identifiers" => blocked_identifiers - }) - end - - def follow_import(%User{} = follower, 
followed_identifiers) - when is_list(followed_identifiers) do - BackgroundWorker.enqueue("follow_import", %{ - "follower_id" => follower.id, - "followed_identifiers" => followed_identifiers - }) - end - def delete_notifications_from_user_activities(%User{ap_id: ap_id}) do Notification |> join(:inner, [n], activity in assoc(n, :activity)) @@ -1775,7 +1722,7 @@ def delete_notifications_from_user_activities(%User{ap_id: ap_id}) do def delete_user_activities(%User{ap_id: ap_id} = user) do ap_id |> Activity.Queries.by_actor() - |> RepoStreamer.chunk_stream(50) + |> Repo.chunk_stream(50, :batches) |> Stream.each(fn activities -> Enum.each(activities, fn activity -> delete_activity(activity, user) end) end) @@ -1821,12 +1768,12 @@ def html_filter_policy(%User{no_rich_text: true}) do def html_filter_policy(_), do: Config.get([:markup, :scrub_policy]) - def fetch_by_ap_id(ap_id), do: ActivityPub.make_user_from_ap_id(ap_id) + def fetch_by_ap_id(ap_id, opts \\ []), do: ActivityPub.make_user_from_ap_id(ap_id, opts) - def get_or_fetch_by_ap_id(ap_id) do + def get_or_fetch_by_ap_id(ap_id, opts \\ []) do cached_user = get_cached_by_ap_id(ap_id) - maybe_fetched_user = needs_update?(cached_user) && fetch_by_ap_id(ap_id) + maybe_fetched_user = needs_update?(cached_user) && fetch_by_ap_id(ap_id, opts) case {cached_user, maybe_fetched_user} do {_, {:ok, %User{} = user}} -> @@ -1899,8 +1846,8 @@ def public_key(%{public_key: public_key_pem}) when is_binary(public_key_pem) do def public_key(_), do: {:error, "key not found"} - def get_public_key_for_ap_id(ap_id) do - with {:ok, %User{} = user} <- get_or_fetch_by_ap_id(ap_id), + def get_public_key_for_ap_id(ap_id, opts \\ []) do + with {:ok, %User{} = user} <- get_or_fetch_by_ap_id(ap_id, opts), {:ok, public_key} <- public_key(user) do {:ok, public_key} else @@ -2123,6 +2070,13 @@ def toggle_confirmation(users) do Enum.map(users, &toggle_confirmation/1) end + @spec need_confirmation(User.t(), boolean()) :: {:ok, User.t()} | {:error, Changeset.t()} + def need_confirmation(%User{} = user, bool) do + user + |> confirmation_changeset(need_confirmation: bool) + |> update_and_set_cache() + end + def get_mascot(%{mascot: %{} = mascot}) when not is_nil(mascot) do mascot end @@ -2337,7 +2291,9 @@ def remove_pinnned_activity(user, %Pleroma.Activity{id: id, data: data}) do # if pinned activity was scheduled for deletion, we reschedule it for deletion if data["expires_at"] do - {:ok, expires_at, _} = DateTime.from_iso8601(data["expires_at"]) + # MRF.ActivityExpirationPolicy used UTC timestamps for expires_at in original implementation + {:ok, expires_at} = + data["expires_at"] |> Pleroma.EctoType.ActivityPub.ObjectValidators.DateTime.cast() Pleroma.Workers.PurgeExpiredActivity.enqueue(%{ activity_id: id, diff --git a/lib/pleroma/user/import.ex b/lib/pleroma/user/import.ex new file mode 100644 index 000000000..e458021c8 --- /dev/null +++ b/lib/pleroma/user/import.ex @@ -0,0 +1,85 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.User.Import do + use Ecto.Schema + + alias Pleroma.User + alias Pleroma.Web.CommonAPI + alias Pleroma.Workers.BackgroundWorker + + require Logger + + @spec perform(atom(), User.t(), list()) :: :ok | list() | {:error, any()} + def perform(:mutes_import, %User{} = user, [_ | _] = identifiers) do + Enum.map( + identifiers, + fn identifier -> + with {:ok, %User{} = muted_user} <- User.get_or_fetch(identifier), + {:ok, _} <- User.mute(user, 
muted_user) do + muted_user + else + error -> handle_error(:mutes_import, identifier, error) + end + end + ) + end + + def perform(:blocks_import, %User{} = blocker, [_ | _] = identifiers) do + Enum.map( + identifiers, + fn identifier -> + with {:ok, %User{} = blocked} <- User.get_or_fetch(identifier), + {:ok, _block} <- CommonAPI.block(blocker, blocked) do + blocked + else + error -> handle_error(:blocks_import, identifier, error) + end + end + ) + end + + def perform(:follow_import, %User{} = follower, [_ | _] = identifiers) do + Enum.map( + identifiers, + fn identifier -> + with {:ok, %User{} = followed} <- User.get_or_fetch(identifier), + {:ok, follower} <- User.maybe_direct_follow(follower, followed), + {:ok, _, _, _} <- CommonAPI.follow(follower, followed) do + followed + else + error -> handle_error(:follow_import, identifier, error) + end + end + ) + end + + def perform(_, _, _), do: :ok + + defp handle_error(op, user_id, error) do + Logger.debug("#{op} failed for #{user_id} with: #{inspect(error)}") + error + end + + def blocks_import(%User{} = blocker, [_ | _] = identifiers) do + BackgroundWorker.enqueue( + "blocks_import", + %{"user_id" => blocker.id, "identifiers" => identifiers} + ) + end + + def follow_import(%User{} = follower, [_ | _] = identifiers) do + BackgroundWorker.enqueue( + "follow_import", + %{"user_id" => follower.id, "identifiers" => identifiers} + ) + end + + def mutes_import(%User{} = user, [_ | _] = identifiers) do + BackgroundWorker.enqueue( + "mutes_import", + %{"user_id" => user.id, "identifiers" => identifiers} + ) + end +end diff --git a/lib/pleroma/user/query.ex b/lib/pleroma/user/query.ex index d618432ff..2440bf890 100644 --- a/lib/pleroma/user/query.ex +++ b/lib/pleroma/user/query.ex @@ -47,6 +47,7 @@ defmodule Pleroma.User.Query do is_moderator: boolean(), super_users: boolean(), invisible: boolean(), + internal: boolean(), followers: User.t(), friends: User.t(), recipients_from_activity: [String.t()], @@ -80,7 +81,9 @@ defp base_query do end defp prepare_query(query, criteria) do - Enum.reduce(criteria, query, &compose_query/2) + criteria + |> Map.put_new(:internal, false) + |> Enum.reduce(query, &compose_query/2) end defp compose_query({key, value}, query) @@ -107,12 +110,12 @@ defp compose_query({:tags, tags}, query) when is_list(tags) and length(tags) > 0 where(query, [u], fragment("? 
&& ?", u.tags, ^tags)) end - defp compose_query({:is_admin, _}, query) do - where(query, [u], u.is_admin) + defp compose_query({:is_admin, bool}, query) do + where(query, [u], u.is_admin == ^bool) end - defp compose_query({:is_moderator, _}, query) do - where(query, [u], u.is_moderator) + defp compose_query({:is_moderator, bool}, query) do + where(query, [u], u.is_moderator == ^bool) end defp compose_query({:super_users, _}, query) do @@ -129,14 +132,12 @@ defp compose_query({:external, _}, query), do: location_query(query, false) defp compose_query({:active, _}, query) do User.restrict_deactivated(query) - |> where([u], not is_nil(u.nickname)) |> where([u], u.approval_pending == false) end defp compose_query({:legacy_active, _}, query) do query |> where([u], fragment("not (?->'deactivated' @> 'true')", u.info)) - |> where([u], not is_nil(u.nickname)) end defp compose_query({:deactivated, false}, query) do @@ -145,7 +146,10 @@ defp compose_query({:deactivated, false}, query) do defp compose_query({:deactivated, true}, query) do where(query, [u], u.deactivated == ^true) - |> where([u], not is_nil(u.nickname)) + end + + defp compose_query({:confirmation_pending, bool}, query) do + where(query, [u], u.confirmation_pending == ^bool) end defp compose_query({:need_approval, _}, query) do @@ -199,10 +203,15 @@ defp compose_query({:limit, limit}, query) do limit(query, ^limit) end + defp compose_query({:internal, false}, query) do + query + |> where([u], not is_nil(u.nickname)) + |> where([u], not like(u.nickname, "internal.%")) + end + defp compose_query(_unsupported_param, query), do: query defp location_query(query, local) do where(query, [u], u.local == ^local) - |> where([u], not is_nil(u.nickname)) end end diff --git a/lib/pleroma/user/search.ex b/lib/pleroma/user/search.ex index 7babd47ea..35a828008 100644 --- a/lib/pleroma/user/search.ex +++ b/lib/pleroma/user/search.ex @@ -3,8 +3,10 @@ # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.User.Search do + alias Pleroma.EctoType.ActivityPub.ObjectValidators.Uri, as: UriType alias Pleroma.Pagination alias Pleroma.User + import Ecto.Query @limit 20 @@ -19,16 +21,47 @@ def search(query_string, opts \\ []) do query_string = format_query(query_string) - maybe_resolve(resolve, for_user, query_string) + # If this returns anything, it should bounce to the top + maybe_resolved = maybe_resolve(resolve, for_user, query_string) + + top_user_ids = + [] + |> maybe_add_resolved(maybe_resolved) + |> maybe_add_ap_id_match(query_string) + |> maybe_add_uri_match(query_string) results = query_string - |> search_query(for_user, following) + |> search_query(for_user, following, top_user_ids) |> Pagination.fetch_paginated(%{"offset" => offset, "limit" => result_limit}, :offset) results end + defp maybe_add_resolved(list, {:ok, %User{} = user}) do + [user.id | list] + end + + defp maybe_add_resolved(list, _), do: list + + defp maybe_add_ap_id_match(list, query) do + if user = User.get_cached_by_ap_id(query) do + [user.id | list] + else + list + end + end + + defp maybe_add_uri_match(list, query) do + with {:ok, query} <- UriType.cast(query), + q = from(u in User, where: u.uri == ^query, select: u.id), + users = Pleroma.Repo.all(q) do + users ++ list + else + _ -> list + end + end + defp format_query(query_string) do # Strip the beginning @ off if there is a query query_string = String.trim_leading(query_string, "@") @@ -47,21 +80,29 @@ defp format_query(query_string) do end end - defp search_query(query_string, for_user, following) do + defp 
search_query(query_string, for_user, following, top_user_ids) do for_user |> base_query(following) |> filter_blocked_user(for_user) |> filter_invisible_users() + |> filter_discoverable_users() |> filter_internal_users() |> filter_blocked_domains(for_user) |> fts_search(query_string) + |> select_top_users(top_user_ids) |> trigram_rank(query_string) - |> boost_search_rank(for_user) + |> boost_search_rank(for_user, top_user_ids) |> subquery() |> order_by(desc: :search_rank) |> maybe_restrict_local(for_user) end + defp select_top_users(query, top_user_ids) do + from(u in query, + or_where: u.id in ^top_user_ids + ) + end + defp fts_search(query, query_string) do query_string = to_tsquery(query_string) @@ -122,6 +163,10 @@ defp filter_invisible_users(query) do from(q in query, where: q.invisible == false) end + defp filter_discoverable_users(query) do + from(q in query, where: q.discoverable == true) + end + defp filter_internal_users(query) do from(q in query, where: q.actor_type != "Application") end @@ -175,7 +220,7 @@ defp restrict_local(q), do: where(q, [u], u.local == true) defp local_domain, do: Pleroma.Config.get([Pleroma.Web.Endpoint, :url, :host]) - defp boost_search_rank(query, %User{} = for_user) do + defp boost_search_rank(query, %User{} = for_user, top_user_ids) do friends_ids = User.get_friends_ids(for_user) followers_ids = User.get_followers_ids(for_user) @@ -187,6 +232,7 @@ defp boost_search_rank(query, %User{} = for_user) do CASE WHEN (?) THEN (?) * 1.5 WHEN (?) THEN (?) * 1.3 WHEN (?) THEN (?) * 1.1 + WHEN (?) THEN 9001 ELSE (?) END """, u.id in ^friends_ids and u.id in ^followers_ids, @@ -195,11 +241,26 @@ defp boost_search_rank(query, %User{} = for_user) do u.search_rank, u.id in ^followers_ids, u.search_rank, + u.id in ^top_user_ids, u.search_rank ) } ) end - defp boost_search_rank(query, _for_user), do: query + defp boost_search_rank(query, _for_user, top_user_ids) do + from(u in subquery(query), + select_merge: %{ + search_rank: + fragment( + """ + CASE WHEN (?) THEN 9001 + ELSE (?) 
END + """, + u.id in ^top_user_ids, + u.search_rank + ) + } + ) + end end diff --git a/lib/pleroma/utils.ex b/lib/pleroma/utils.ex index 21d1159be..e95766223 100644 --- a/lib/pleroma/utils.ex +++ b/lib/pleroma/utils.ex @@ -24,4 +24,24 @@ def compile_dir(dir) when is_binary(dir) do def command_available?(command) do match?({_output, 0}, System.cmd("sh", ["-c", "command -v #{command}"])) end + + @doc "creates the uniq temporary directory" + @spec tmp_dir(String.t()) :: {:ok, String.t()} | {:error, :file.posix()} + def tmp_dir(prefix \\ "") do + sub_dir = + [ + prefix, + Timex.to_unix(Timex.now()), + :os.getpid(), + String.downcase(Integer.to_string(:rand.uniform(0x100000000), 36)) + ] + |> Enum.join("-") + + tmp_dir = Path.join(System.tmp_dir!(), sub_dir) + + case File.mkdir(tmp_dir) do + :ok -> {:ok, tmp_dir} + error -> error + end + end end diff --git a/lib/pleroma/web/activity_pub/activity_pub.ex b/lib/pleroma/web/activity_pub/activity_pub.ex index 66a9f78a3..eb44cffec 100644 --- a/lib/pleroma/web/activity_pub/activity_pub.ex +++ b/lib/pleroma/web/activity_pub/activity_pub.ex @@ -84,7 +84,7 @@ defp increase_replies_count_if_reply(%{ defp increase_replies_count_if_reply(_create_data), do: :noop - @object_types ~w[ChatMessage Question Answer Audio Event] + @object_types ~w[ChatMessage Question Answer Audio Video Event Article] @spec persist(map(), keyword()) :: {:ok, Activity.t() | Object.t()} def persist(%{"type" => type} = object, meta) when type in @object_types do with {:ok, object} <- Object.create(object) do @@ -154,8 +154,8 @@ def insert(map, local \\ true, fake \\ false, bypass_actor_check \\ false) when {:remote_limit_pass, _} -> {:error, :remote_limit} - {:reject, reason} -> - {:error, reason} + {:reject, _} = e -> + {:error, e} end end @@ -767,7 +767,7 @@ defp restrict_replies(query, %{exclude_replies: true}) do end defp restrict_replies(query, %{ - reply_filtering_user: user, + reply_filtering_user: %User{} = user, reply_visibility: "self" }) do from( @@ -783,14 +783,24 @@ defp restrict_replies(query, %{ end defp restrict_replies(query, %{ - reply_filtering_user: user, + reply_filtering_user: %User{} = user, reply_visibility: "following" }) do from( [activity, object] in query, where: fragment( - "?->>'inReplyTo' is null OR ? && array_remove(?, ?) OR ? = ?", + """ + ?->>'type' != 'Create' -- This isn't a Create + OR ?->>'inReplyTo' is null -- this isn't a reply + OR ? && array_remove(?, ?) -- The recipient is us or one of our friends, + -- unless they are the author (because authors + -- are also part of the recipients). This leads + -- to a bug that self-replies by friends won't + -- show up. + OR ? = ? -- The actor is us + """, + activity.data, object.data, ^[user.ap_id | User.get_cached_user_friends_ap_ids(user)], activity.recipients, @@ -841,7 +851,14 @@ defp restrict_blocked(query, %{blocking_user: %User{} = user} = opts) do from( [activity, object: o] in query, where: fragment("not (? = ANY(?))", activity.actor, ^blocked_ap_ids), - where: fragment("not (? && ?)", activity.recipients, ^blocked_ap_ids), + where: + fragment( + "((not (? && ?)) or ? = ?)", + activity.recipients, + ^blocked_ap_ids, + activity.actor, + ^user.ap_id + ), where: fragment( "recipients_contain_blocked_domains(?, ?) 
= false", @@ -1270,10 +1287,12 @@ defp object_to_user_data(data) do def fetch_follow_information_for_user(user) do with {:ok, following_data} <- - Fetcher.fetch_and_contain_remote_object_from_id(user.following_address), + Fetcher.fetch_and_contain_remote_object_from_id(user.following_address, + force_http: true + ), {:ok, hide_follows} <- collection_private(following_data), {:ok, followers_data} <- - Fetcher.fetch_and_contain_remote_object_from_id(user.follower_address), + Fetcher.fetch_and_contain_remote_object_from_id(user.follower_address, force_http: true), {:ok, hide_followers} <- collection_private(followers_data) do {:ok, %{ @@ -1347,8 +1366,8 @@ def user_data_from_user_object(data) do end end - def fetch_and_prepare_user_from_ap_id(ap_id) do - with {:ok, data} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id), + def fetch_and_prepare_user_from_ap_id(ap_id, opts \\ []) do + with {:ok, data} <- Fetcher.fetch_and_contain_remote_object_from_id(ap_id, opts), {:ok, data} <- user_data_from_user_object(data) do {:ok, maybe_update_follow_information(data)} else @@ -1390,13 +1409,13 @@ def maybe_handle_clashing_nickname(data) do end end - def make_user_from_ap_id(ap_id) do + def make_user_from_ap_id(ap_id, opts \\ []) do user = User.get_cached_by_ap_id(ap_id) if user && !User.ap_enabled?(user) do Transmogrifier.upgrade_user_from_ap_id(ap_id) else - with {:ok, data} <- fetch_and_prepare_user_from_ap_id(ap_id) do + with {:ok, data} <- fetch_and_prepare_user_from_ap_id(ap_id, opts) do if user do user |> User.remote_user_changeset(data) diff --git a/lib/pleroma/web/activity_pub/mrf.ex b/lib/pleroma/web/activity_pub/mrf.ex index 206d6af52..5e5361082 100644 --- a/lib/pleroma/web/activity_pub/mrf.ex +++ b/lib/pleroma/web/activity_pub/mrf.ex @@ -5,16 +5,34 @@ defmodule Pleroma.Web.ActivityPub.MRF do @callback filter(Map.t()) :: {:ok | :reject, Map.t()} - def filter(policies, %{} = object) do + def filter(policies, %{} = message) do policies - |> Enum.reduce({:ok, object}, fn - policy, {:ok, object} -> policy.filter(object) + |> Enum.reduce({:ok, message}, fn + policy, {:ok, message} -> policy.filter(message) _, error -> error end) end def filter(%{} = object), do: get_policies() |> filter(object) + def pipeline_filter(%{} = message, meta) do + object = meta[:object_data] + ap_id = message["object"] + + if object && ap_id do + with {:ok, message} <- filter(Map.put(message, "object", object)) do + meta = Keyword.put(meta, :object_data, message["object"]) + {:ok, Map.put(message, "object", ap_id), meta} + else + {err, message} -> {err, message, meta} + end + else + {err, message} = filter(message) + + {err, message, meta} + end + end + def get_policies do Pleroma.Config.get([:mrf, :policies], []) |> get_policies() end diff --git a/lib/pleroma/web/activity_pub/mrf/keyword_policy.ex b/lib/pleroma/web/activity_pub/mrf/keyword_policy.ex index 15e09dcf0..db66cfa3e 100644 --- a/lib/pleroma/web/activity_pub/mrf/keyword_policy.ex +++ b/lib/pleroma/web/activity_pub/mrf/keyword_policy.ex @@ -20,9 +20,17 @@ defp string_matches?(string, pattern) do String.match?(string, pattern) end - defp check_reject(%{"object" => %{"content" => content, "summary" => summary}} = message) do + defp object_payload(%{} = object) do + [object["content"], object["summary"], object["name"]] + |> Enum.filter(& &1) + |> Enum.join("\n") + end + + defp check_reject(%{"object" => %{} = object} = message) do + payload = object_payload(object) + if Enum.any?(Pleroma.Config.get([:mrf_keyword, :reject]), fn pattern -> - 
string_matches?(content, pattern) or string_matches?(summary, pattern) + string_matches?(payload, pattern) end) do {:reject, "[KeywordPolicy] Matches with rejected keyword"} else @@ -30,12 +38,12 @@ defp check_reject(%{"object" => %{"content" => content, "summary" => summary}} = end end - defp check_ftl_removal( - %{"to" => to, "object" => %{"content" => content, "summary" => summary}} = message - ) do + defp check_ftl_removal(%{"to" => to, "object" => %{} = object} = message) do + payload = object_payload(object) + if Pleroma.Constants.as_public() in to and Enum.any?(Pleroma.Config.get([:mrf_keyword, :federated_timeline_removal]), fn pattern -> - string_matches?(content, pattern) or string_matches?(summary, pattern) + string_matches?(payload, pattern) end) do to = List.delete(to, Pleroma.Constants.as_public()) cc = [Pleroma.Constants.as_public() | message["cc"] || []] @@ -51,35 +59,24 @@ defp check_ftl_removal( end end - defp check_replace(%{"object" => %{"content" => content, "summary" => summary}} = message) do - content = - if is_binary(content) do - content - else - "" - end + defp check_replace(%{"object" => %{} = object} = message) do + object = + ["content", "name", "summary"] + |> Enum.filter(fn field -> Map.has_key?(object, field) && object[field] end) + |> Enum.reduce(object, fn field, object -> + data = + Enum.reduce( + Pleroma.Config.get([:mrf_keyword, :replace]), + object[field], + fn {pat, repl}, acc -> String.replace(acc, pat, repl) end + ) - summary = - if is_binary(summary) do - summary - else - "" - end + Map.put(object, field, data) + end) - {content, summary} = - Enum.reduce( - Pleroma.Config.get([:mrf_keyword, :replace]), - {content, summary}, - fn {pattern, replacement}, {content_acc, summary_acc} -> - {String.replace(content_acc, pattern, replacement), - String.replace(summary_acc, pattern, replacement)} - end - ) + message = Map.put(message, "object", object) - {:ok, - message - |> put_in(["object", "content"], content) - |> put_in(["object", "summary"], summary)} + {:ok, message} end @impl true diff --git a/lib/pleroma/web/activity_pub/mrf/media_proxy_warming_policy.ex b/lib/pleroma/web/activity_pub/mrf/media_proxy_warming_policy.ex index 98d595469..0fb05d3c4 100644 --- a/lib/pleroma/web/activity_pub/mrf/media_proxy_warming_policy.ex +++ b/lib/pleroma/web/activity_pub/mrf/media_proxy_warming_policy.ex @@ -12,17 +12,21 @@ defmodule Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy do require Logger - @options [ + @adapter_options [ pool: :media, recv_timeout: 10_000 ] def perform(:prefetch, url) do - Logger.debug("Prefetching #{inspect(url)}") + # Fetching only proxiable resources + if MediaProxy.enabled?() and MediaProxy.url_proxiable?(url) do + # If preview proxy is enabled, it'll also hit media proxy (so we're caching both requests) + prefetch_url = MediaProxy.preview_url(url) - url - |> MediaProxy.url() - |> HTTP.get([], @options) + Logger.debug("Prefetching #{inspect(url)} as #{inspect(prefetch_url)}") + + HTTP.get(prefetch_url, [], @adapter_options) + end end def perform(:preload, %{"object" => %{"attachment" => attachments}} = _message) do diff --git a/lib/pleroma/web/activity_pub/mrf/simple_policy.ex b/lib/pleroma/web/activity_pub/mrf/simple_policy.ex index bb193475a..161177727 100644 --- a/lib/pleroma/web/activity_pub/mrf/simple_policy.ex +++ b/lib/pleroma/web/activity_pub/mrf/simple_policy.ex @@ -66,7 +66,8 @@ defp check_media_nsfw( "type" => "Create", "object" => child_object } = object - ) do + ) + when is_map(child_object) do media_nsfw = 
Config.get([:mrf_simple, :media_nsfw]) |> MRF.subdomains_regex() diff --git a/lib/pleroma/web/activity_pub/mrf/subchain_policy.ex b/lib/pleroma/web/activity_pub/mrf/subchain_policy.ex index c9f20571f..048052da6 100644 --- a/lib/pleroma/web/activity_pub/mrf/subchain_policy.ex +++ b/lib/pleroma/web/activity_pub/mrf/subchain_policy.ex @@ -28,8 +28,7 @@ def filter(%{"actor" => actor} = message) do }" ) - subchain - |> MRF.filter(message) + MRF.filter(subchain, message) else _e -> {:ok, message} end diff --git a/lib/pleroma/web/activity_pub/object_validator.ex b/lib/pleroma/web/activity_pub/object_validator.ex index b77c06395..bd0a2a8dc 100644 --- a/lib/pleroma/web/activity_pub/object_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validator.ex @@ -12,11 +12,13 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidator do alias Pleroma.Activity alias Pleroma.EctoType.ActivityPub.ObjectValidators alias Pleroma.Object + alias Pleroma.Object.Containment alias Pleroma.User alias Pleroma.Web.ActivityPub.ObjectValidators.AcceptRejectValidator alias Pleroma.Web.ActivityPub.ObjectValidators.AnnounceValidator alias Pleroma.Web.ActivityPub.ObjectValidators.AnswerValidator - alias Pleroma.Web.ActivityPub.ObjectValidators.AudioValidator + alias Pleroma.Web.ActivityPub.ObjectValidators.ArticleNoteValidator + alias Pleroma.Web.ActivityPub.ObjectValidators.AudioVideoValidator alias Pleroma.Web.ActivityPub.ObjectValidators.BlockValidator alias Pleroma.Web.ActivityPub.ObjectValidators.ChatMessageValidator alias Pleroma.Web.ActivityPub.ObjectValidators.CreateChatMessageValidator @@ -149,10 +151,20 @@ def validate(%{"type" => "Question"} = object, meta) do end end - def validate(%{"type" => "Audio"} = object, meta) do + def validate(%{"type" => type} = object, meta) when type in ~w[Audio Video] do with {:ok, object} <- object - |> AudioValidator.cast_and_validate() + |> AudioVideoValidator.cast_and_validate() + |> Ecto.Changeset.apply_action(:insert) do + object = stringify_keys(object) + {:ok, object, meta} + end + end + + def validate(%{"type" => "Article"} = object, meta) do + with {:ok, object} <- + object + |> ArticleNoteValidator.cast_and_validate() |> Ecto.Changeset.apply_action(:insert) do object = stringify_keys(object) {:ok, object, meta} @@ -198,7 +210,7 @@ def validate( %{"type" => "Create", "object" => %{"type" => objtype} = object} = create_activity, meta ) - when objtype in ~w[Question Answer Audio Event] do + when objtype in ~w[Question Answer Audio Video Event Article] do with {:ok, object_data} <- cast_and_apply(object), meta = Keyword.put(meta, :object_data, object_data |> stringify_keys), {:ok, create_activity} <- @@ -232,14 +244,18 @@ def cast_and_apply(%{"type" => "Answer"} = object) do AnswerValidator.cast_and_apply(object) end - def cast_and_apply(%{"type" => "Audio"} = object) do - AudioValidator.cast_and_apply(object) + def cast_and_apply(%{"type" => type} = object) when type in ~w[Audio Video] do + AudioVideoValidator.cast_and_apply(object) end def cast_and_apply(%{"type" => "Event"} = object) do EventValidator.cast_and_apply(object) end + def cast_and_apply(%{"type" => "Article"} = object) do + ArticleNoteValidator.cast_and_apply(object) + end + def cast_and_apply(o), do: {:error, {:validator_not_set, o}} # is_struct/1 isn't present in Elixir 1.8.x @@ -262,7 +278,8 @@ def stringify_keys(object) when is_list(object) do def stringify_keys(object), do: object def fetch_actor(object) do - with {:ok, actor} <- ObjectValidators.ObjectID.cast(object["actor"]) do + with actor <- 
Containment.get_actor(object), + {:ok, actor} <- ObjectValidators.ObjectID.cast(actor) do User.get_or_fetch_by_ap_id(actor) end end diff --git a/lib/pleroma/web/activity_pub/object_validators/audio_validator.ex b/lib/pleroma/web/activity_pub/object_validators/article_note_validator.ex similarity index 80% rename from lib/pleroma/web/activity_pub/object_validators/audio_validator.ex rename to lib/pleroma/web/activity_pub/object_validators/article_note_validator.ex index 1a97c504a..5b7dad517 100644 --- a/lib/pleroma/web/activity_pub/object_validators/audio_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/article_note_validator.ex @@ -2,7 +2,7 @@ # Copyright © 2017-2020 Pleroma Authors # SPDX-License-Identifier: AGPL-3.0-only -defmodule Pleroma.Web.ActivityPub.ObjectValidators.AudioValidator do +defmodule Pleroma.Web.ActivityPub.ObjectValidators.ArticleNoteValidator do use Ecto.Schema alias Pleroma.EctoType.ActivityPub.ObjectValidators @@ -25,14 +25,19 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AudioValidator do # TODO: Write type field(:tag, {:array, :map}, default: []) field(:type, :string) + + field(:name, :string) + field(:summary, :string) field(:content, :string) + field(:context, :string) + # short identifier for PleromaFE to group statuses by context + field(:context_id, :integer) # TODO: Remove actor on objects field(:actor, ObjectValidators.ObjectID) field(:attributedTo, ObjectValidators.ObjectID) - field(:summary, :string) field(:published, ObjectValidators.DateTime) field(:emoji, ObjectValidators.Emoji, default: %{}) field(:sensitive, :boolean, default: false) @@ -40,13 +45,11 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AudioValidator do field(:replies_count, :integer, default: 0) field(:like_count, :integer, default: 0) field(:announcement_count, :integer, default: 0) - field(:inReplyTo, :string) + field(:inReplyTo, ObjectValidators.ObjectID) field(:url, ObjectValidators.Uri) - # short identifier for PleromaFE to group statuses by context - field(:context_id, :integer) - field(:likes, {:array, :string}, default: []) - field(:announcements, {:array, :string}, default: []) + field(:likes, {:array, ObjectValidators.ObjectID}, default: []) + field(:announcements, {:array, ObjectValidators.ObjectID}, default: []) end def cast_and_apply(data) do @@ -62,19 +65,14 @@ def cast_and_validate(data) do end def cast_data(data) do + data = fix(data) + %__MODULE__{} |> changeset(data) end - defp fix_url(%{"url" => url} = data) when is_list(url) do - attachment = - Enum.find(url, fn x -> is_map(x) and String.starts_with?(x["mimeType"], "audio/") end) - - link_element = Enum.find(url, fn x -> is_map(x) and x["mimeType"] == "text/html" end) - - data - |> Map.put("attachment", [attachment]) - |> Map.put("url", link_element["href"]) + defp fix_url(%{"url" => url} = data) when is_map(url) do + Map.put(data, "url", url["href"]) end defp fix_url(data), do: data @@ -83,8 +81,9 @@ defp fix(data) do data |> CommonFixes.fix_defaults() |> CommonFixes.fix_attribution() - |> Transmogrifier.fix_emoji() + |> CommonFixes.fix_actor() |> fix_url() + |> Transmogrifier.fix_emoji() end def changeset(struct, data) do @@ -97,8 +96,8 @@ def changeset(struct, data) do def validate_data(data_cng) do data_cng - |> validate_inclusion(:type, ["Audio"]) - |> validate_required([:id, :actor, :attributedTo, :type, :context, :attachment]) + |> validate_inclusion(:type, ["Article", "Note"]) + |> validate_required([:id, :actor, :attributedTo, :type, :context, :context_id]) |> 
CommonValidations.validate_any_presence([:cc, :to]) |> CommonValidations.validate_fields_match([:actor, :attributedTo]) |> CommonValidations.validate_actor_presence() diff --git a/lib/pleroma/web/activity_pub/object_validators/attachment_validator.ex b/lib/pleroma/web/activity_pub/object_validators/attachment_validator.ex index c8b148280..df102a134 100644 --- a/lib/pleroma/web/activity_pub/object_validators/attachment_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/attachment_validator.ex @@ -5,6 +5,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator do use Ecto.Schema + alias Pleroma.EctoType.ActivityPub.ObjectValidators alias Pleroma.Web.ActivityPub.ObjectValidators.UrlObjectValidator import Ecto.Changeset @@ -15,7 +16,11 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator do field(:mediaType, :string, default: "application/octet-stream") field(:name, :string) - embeds_many(:url, UrlObjectValidator) + embeds_many :url, UrlObjectValidator, primary_key: false do + field(:type, :string) + field(:href, ObjectValidators.Uri) + field(:mediaType, :string, default: "application/octet-stream") + end end def cast_and_validate(data) do @@ -37,7 +42,18 @@ def changeset(struct, data) do struct |> cast(data, [:type, :mediaType, :name]) - |> cast_embed(:url, required: true) + |> cast_embed(:url, with: &url_changeset/2) + |> validate_inclusion(:type, ~w[Link Document Audio Image Video]) + |> validate_required([:type, :mediaType, :url]) + end + + def url_changeset(struct, data) do + data = fix_media_type(data) + + struct + |> cast(data, [:type, :href, :mediaType]) + |> validate_inclusion(:type, ["Link"]) + |> validate_required([:type, :href, :mediaType]) end def fix_media_type(data) do @@ -75,6 +91,7 @@ defp fix_url(data) do def validate_data(cng) do cng + |> validate_inclusion(:type, ~w[Document Audio Image Video]) |> validate_required([:mediaType, :url, :type]) end end diff --git a/lib/pleroma/web/activity_pub/object_validators/audio_video_validator.ex b/lib/pleroma/web/activity_pub/object_validators/audio_video_validator.ex new file mode 100644 index 000000000..16973e5db --- /dev/null +++ b/lib/pleroma/web/activity_pub/object_validators/audio_video_validator.ex @@ -0,0 +1,134 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ActivityPub.ObjectValidators.AudioVideoValidator do + use Ecto.Schema + + alias Pleroma.EarmarkRenderer + alias Pleroma.EctoType.ActivityPub.ObjectValidators + alias Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator + alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes + alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations + alias Pleroma.Web.ActivityPub.Transmogrifier + + import Ecto.Changeset + + @primary_key false + @derive Jason.Encoder + + embedded_schema do + field(:id, ObjectValidators.ObjectID, primary_key: true) + field(:to, ObjectValidators.Recipients, default: []) + field(:cc, ObjectValidators.Recipients, default: []) + field(:bto, ObjectValidators.Recipients, default: []) + field(:bcc, ObjectValidators.Recipients, default: []) + # TODO: Write type + field(:tag, {:array, :map}, default: []) + field(:type, :string) + + field(:name, :string) + field(:summary, :string) + field(:content, :string) + + field(:context, :string) + # short identifier for PleromaFE to group statuses by context + field(:context_id, :integer) + + # TODO: Remove actor on objects + field(:actor, 
ObjectValidators.ObjectID) + + field(:attributedTo, ObjectValidators.ObjectID) + field(:published, ObjectValidators.DateTime) + field(:emoji, ObjectValidators.Emoji, default: %{}) + field(:sensitive, :boolean, default: false) + embeds_many(:attachment, AttachmentValidator) + field(:replies_count, :integer, default: 0) + field(:like_count, :integer, default: 0) + field(:announcement_count, :integer, default: 0) + field(:inReplyTo, ObjectValidators.ObjectID) + field(:url, ObjectValidators.Uri) + + field(:likes, {:array, ObjectValidators.ObjectID}, default: []) + field(:announcements, {:array, ObjectValidators.ObjectID}, default: []) + end + + def cast_and_apply(data) do + data + |> cast_data + |> apply_action(:insert) + end + + def cast_and_validate(data) do + data + |> cast_data() + |> validate_data() + end + + def cast_data(data) do + %__MODULE__{} + |> changeset(data) + end + + defp fix_url(%{"url" => url} = data) when is_list(url) do + attachment = + Enum.find(url, fn x -> + mime_type = x["mimeType"] || x["mediaType"] || "" + + is_map(x) and String.starts_with?(mime_type, ["video/", "audio/"]) + end) + + link_element = + Enum.find(url, fn x -> + mime_type = x["mimeType"] || x["mediaType"] || "" + + is_map(x) and mime_type == "text/html" + end) + + data + |> Map.put("attachment", [attachment]) + |> Map.put("url", link_element["href"]) + end + + defp fix_url(data), do: data + + defp fix_content(%{"mediaType" => "text/markdown", "content" => content} = data) + when is_binary(content) do + content = + content + |> Earmark.as_html!(%Earmark.Options{renderer: EarmarkRenderer}) + |> Pleroma.HTML.filter_tags() + + Map.put(data, "content", content) + end + + defp fix_content(data), do: data + + defp fix(data) do + data + |> CommonFixes.fix_defaults() + |> CommonFixes.fix_attribution() + |> CommonFixes.fix_actor() + |> Transmogrifier.fix_emoji() + |> fix_url() + |> fix_content() + end + + def changeset(struct, data) do + data = fix(data) + + struct + |> cast(data, __schema__(:fields) -- [:attachment]) + |> cast_embed(:attachment) + end + + def validate_data(data_cng) do + data_cng + |> validate_inclusion(:type, ["Audio", "Video"]) + |> validate_required([:id, :actor, :attributedTo, :type, :context, :attachment]) + |> CommonValidations.validate_any_presence([:cc, :to]) + |> CommonValidations.validate_fields_match([:actor, :attributedTo]) + |> CommonValidations.validate_actor_presence() + |> CommonValidations.validate_host_match() + end +end diff --git a/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex b/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex index 720213d73..b3638cfc7 100644 --- a/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex +++ b/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex @@ -3,6 +3,7 @@ # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes do + alias Pleroma.Object.Containment alias Pleroma.Web.ActivityPub.Utils # based on Pleroma.Web.ActivityPub.Utils.lazy_put_objects_defaults @@ -19,4 +20,12 @@ def fix_attribution(data) do data |> Map.put_new("actor", data["attributedTo"]) end + + def fix_actor(data) do + actor = Containment.get_actor(data) + + data + |> Map.put("actor", actor) + |> Map.put("attributedTo", actor) + end end diff --git a/lib/pleroma/web/activity_pub/object_validators/create_generic_validator.ex b/lib/pleroma/web/activity_pub/object_validators/create_generic_validator.ex index b3dbeea57..422ee07be 100644 --- 
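For illustration, a minimal sketch of what the new `CommonFixes.fix_actor/1` helper above does when an object carries an embedded actor; the example ids are invented, and it assumes `Containment.get_actor/1` resolves an embedded actor map to its AP id, as it does for incoming activities.

```elixir
# Illustrative only: the AP ids below are made up.
data = %{
  "type" => "Video",
  "actor" => %{"id" => "https://example.com/users/alice", "type" => "Person"}
}

Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes.fix_actor(data)
# => %{
#      "type" => "Video",
#      "actor" => "https://example.com/users/alice",
#      "attributedTo" => "https://example.com/users/alice"
#    }
```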
a/lib/pleroma/web/activity_pub/object_validators/create_generic_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/create_generic_validator.ex @@ -10,9 +10,10 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.CreateGenericValidator do alias Pleroma.EctoType.ActivityPub.ObjectValidators alias Pleroma.Object + alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes + alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations import Ecto.Changeset - import Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations @primary_key false @@ -75,14 +76,15 @@ defp fix(data, meta) do data |> fix_context(meta) |> fix_addressing(meta) + |> CommonFixes.fix_actor() end def validate_data(cng, meta \\ []) do cng |> validate_required([:actor, :type, :object]) |> validate_inclusion(:type, ["Create"]) - |> validate_actor_presence() - |> validate_any_presence([:to, :cc]) + |> CommonValidations.validate_actor_presence() + |> CommonValidations.validate_any_presence([:to, :cc]) |> validate_actors_match(meta) |> validate_context_match(meta) |> validate_object_nonexistence() diff --git a/lib/pleroma/web/activity_pub/object_validators/note_validator.ex b/lib/pleroma/web/activity_pub/object_validators/note_validator.ex deleted file mode 100644 index ab4469a59..000000000 --- a/lib/pleroma/web/activity_pub/object_validators/note_validator.ex +++ /dev/null @@ -1,73 +0,0 @@ -# Pleroma: A lightweight social networking server -# Copyright © 2017-2020 Pleroma Authors -# SPDX-License-Identifier: AGPL-3.0-only - -defmodule Pleroma.Web.ActivityPub.ObjectValidators.NoteValidator do - use Ecto.Schema - - alias Pleroma.EctoType.ActivityPub.ObjectValidators - alias Pleroma.Web.ActivityPub.Transmogrifier - - import Ecto.Changeset - - @primary_key false - - embedded_schema do - field(:id, ObjectValidators.ObjectID, primary_key: true) - field(:to, ObjectValidators.Recipients, default: []) - field(:cc, ObjectValidators.Recipients, default: []) - field(:bto, ObjectValidators.Recipients, default: []) - field(:bcc, ObjectValidators.Recipients, default: []) - # TODO: Write type - field(:tag, {:array, :map}, default: []) - field(:type, :string) - - field(:name, :string) - field(:summary, :string) - field(:content, :string) - - field(:context, :string) - # short identifier for PleromaFE to group statuses by context - field(:context_id, :integer) - - field(:actor, ObjectValidators.ObjectID) - field(:attributedTo, ObjectValidators.ObjectID) - field(:published, ObjectValidators.DateTime) - field(:emoji, ObjectValidators.Emoji, default: %{}) - field(:sensitive, :boolean, default: false) - # TODO: Write type - field(:attachment, {:array, :map}, default: []) - field(:replies_count, :integer, default: 0) - field(:like_count, :integer, default: 0) - field(:announcement_count, :integer, default: 0) - field(:inReplyTo, ObjectValidators.ObjectID) - field(:url, ObjectValidators.Uri) - - field(:likes, {:array, :string}, default: []) - field(:announcements, {:array, :string}, default: []) - end - - def cast_and_validate(data) do - data - |> cast_data() - |> validate_data() - end - - defp fix(data) do - data - |> Transmogrifier.fix_emoji() - end - - def cast_data(data) do - data = fix(data) - - %__MODULE__{} - |> cast(data, __schema__(:fields)) - end - - def validate_data(data_cng) do - data_cng - |> validate_inclusion(:type, ["Note"]) - |> validate_required([:id, :actor, :to, :cc, :type, :content, :context]) - end -end diff --git a/lib/pleroma/web/activity_pub/object_validators/question_validator.ex 
b/lib/pleroma/web/activity_pub/object_validators/question_validator.ex index 934d3c1ea..9310485dc 100644 --- a/lib/pleroma/web/activity_pub/object_validators/question_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/question_validator.ex @@ -47,8 +47,8 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.QuestionValidator do # short identifier for PleromaFE to group statuses by context field(:context_id, :integer) - field(:likes, {:array, :string}, default: []) - field(:announcements, {:array, :string}, default: []) + field(:likes, {:array, ObjectValidators.ObjectID}, default: []) + field(:announcements, {:array, ObjectValidators.ObjectID}, default: []) field(:closed, ObjectValidators.DateTime) field(:voters, {:array, ObjectValidators.ObjectID}, default: []) diff --git a/lib/pleroma/web/activity_pub/object_validators/url_object_validator.ex b/lib/pleroma/web/activity_pub/object_validators/url_object_validator.ex deleted file mode 100644 index 881030f38..000000000 --- a/lib/pleroma/web/activity_pub/object_validators/url_object_validator.ex +++ /dev/null @@ -1,24 +0,0 @@ -# Pleroma: A lightweight social networking server -# Copyright © 2017-2020 Pleroma Authors -# SPDX-License-Identifier: AGPL-3.0-only - -defmodule Pleroma.Web.ActivityPub.ObjectValidators.UrlObjectValidator do - use Ecto.Schema - - alias Pleroma.EctoType.ActivityPub.ObjectValidators - - import Ecto.Changeset - @primary_key false - - embedded_schema do - field(:type, :string) - field(:href, ObjectValidators.Uri) - field(:mediaType, :string, default: "application/octet-stream") - end - - def changeset(struct, data) do - struct - |> cast(data, __schema__(:fields)) - |> validate_required([:type, :href, :mediaType]) - end -end diff --git a/lib/pleroma/web/activity_pub/pipeline.ex b/lib/pleroma/web/activity_pub/pipeline.ex index 36e325c37..2db86f116 100644 --- a/lib/pleroma/web/activity_pub/pipeline.ex +++ b/lib/pleroma/web/activity_pub/pipeline.ex @@ -26,13 +26,17 @@ def common_pipeline(object, meta) do {:error, e} -> {:error, e} + + {:reject, e} -> + {:reject, e} end end def do_common_pipeline(object, meta) do with {_, {:ok, validated_object, meta}} <- {:validate_object, ObjectValidator.validate(object, meta)}, - {_, {:ok, mrfd_object}} <- {:mrf_object, MRF.filter(validated_object)}, + {_, {:ok, mrfd_object, meta}} <- + {:mrf_object, MRF.pipeline_filter(validated_object, meta)}, {_, {:ok, activity, meta}} <- {:persist_object, ActivityPub.persist(mrfd_object, meta)}, {_, {:ok, activity, meta}} <- @@ -40,7 +44,7 @@ def do_common_pipeline(object, meta) do {_, {:ok, _}} <- {:federation, maybe_federate(activity, meta)} do {:ok, activity, meta} else - {:mrf_object, {:reject, _}} -> {:ok, nil, meta} + {:mrf_object, {:reject, message, _}} -> {:reject, message} e -> {:error, e} end end diff --git a/lib/pleroma/web/activity_pub/publisher.ex b/lib/pleroma/web/activity_pub/publisher.ex index d88f7f3ee..9c3956683 100644 --- a/lib/pleroma/web/activity_pub/publisher.ex +++ b/lib/pleroma/web/activity_pub/publisher.ex @@ -13,6 +13,7 @@ defmodule Pleroma.Web.ActivityPub.Publisher do alias Pleroma.User alias Pleroma.Web.ActivityPub.Relay alias Pleroma.Web.ActivityPub.Transmogrifier + alias Pleroma.Web.FedSockets require Pleroma.Constants @@ -50,15 +51,35 @@ def is_representable?(%Activity{} = activity) do def publish_one(%{inbox: inbox, json: json, actor: %User{} = actor, id: id} = params) do Logger.debug("Federating #{id} to #{inbox}") - uri = URI.parse(inbox) + case FedSockets.get_or_create_fed_socket(inbox) do + {:ok, 
fedsocket} -> + Logger.debug("publishing via fedsockets - #{inspect(inbox)}") + FedSockets.publish(fedsocket, json) + _ -> + Logger.debug("publishing via http - #{inspect(inbox)}") + http_publish(inbox, actor, json, params) + end + end + + def publish_one(%{actor_id: actor_id} = params) do + actor = User.get_cached_by_id(actor_id) + + params + |> Map.delete(:actor_id) + |> Map.put(:actor, actor) + |> publish_one() + end + + defp http_publish(inbox, actor, json, params) do + uri = %{path: path} = URI.parse(inbox) digest = "SHA-256=" <> (:crypto.hash(:sha256, json) |> Base.encode64()) date = Pleroma.Signature.signed_date() signature = Pleroma.Signature.sign(actor, %{ - "(request-target)": "post #{uri.path}", + "(request-target)": "post #{path}", host: signature_host(uri), "content-length": byte_size(json), digest: digest, @@ -89,15 +110,6 @@ def publish_one(%{inbox: inbox, json: json, actor: %User{} = actor, id: id} = pa end end - def publish_one(%{actor_id: actor_id} = params) do - actor = User.get_cached_by_id(actor_id) - - params - |> Map.delete(:actor_id) - |> Map.put(:actor, actor) - |> publish_one() - end - defp signature_host(%URI{port: port, scheme: scheme, host: host}) do if port == URI.default_port(scheme) do host diff --git a/lib/pleroma/web/activity_pub/relay.ex b/lib/pleroma/web/activity_pub/relay.ex index b65710a94..6606e1780 100644 --- a/lib/pleroma/web/activity_pub/relay.ex +++ b/lib/pleroma/web/activity_pub/relay.ex @@ -30,12 +30,16 @@ def follow(target_instance) do end end - @spec unfollow(String.t()) :: {:ok, Activity.t()} | {:error, any()} - def unfollow(target_instance) do + @spec unfollow(String.t(), map()) :: {:ok, Activity.t()} | {:error, any()} + def unfollow(target_instance, opts \\ %{}) do with %User{} = local_user <- get_actor(), - {:ok, %User{} = target_user} <- User.get_or_fetch_by_ap_id(target_instance), + {:ok, target_user} <- fetch_target_user(target_instance, opts), {:ok, activity} <- ActivityPub.unfollow(local_user, target_user) do - User.unfollow(local_user, target_user) + case target_user.id do + nil -> User.update_following_count(local_user) + _ -> User.unfollow(local_user, target_user) + end + Logger.info("relay: unfollowed instance: #{target_instance}: id=#{activity.data["id"]}") {:ok, activity} else @@ -43,6 +47,14 @@ def unfollow(target_instance) do end end + defp fetch_target_user(ap_id, opts) do + case {opts[:force], User.get_or_fetch_by_ap_id(ap_id)} do + {_, {:ok, %User{} = user}} -> {:ok, user} + {true, _} -> {:ok, %User{ap_id: ap_id}} + {_, error} -> error + end + end + @spec publish(any()) :: {:ok, Activity.t()} | {:error, any()} def publish(%Activity{data: %{"type" => "Create"}} = activity) do with %User{} = user <- get_actor(), diff --git a/lib/pleroma/web/activity_pub/side_effects.ex b/lib/pleroma/web/activity_pub/side_effects.ex index 46a8be767..b9a83a544 100644 --- a/lib/pleroma/web/activity_pub/side_effects.ex +++ b/lib/pleroma/web/activity_pub/side_effects.ex @@ -336,7 +336,7 @@ def handle_object_creation(%{"type" => "Answer"} = object_map, meta) do end def handle_object_creation(%{"type" => objtype} = object, meta) - when objtype in ~w[Audio Question Event] do + when objtype in ~w[Audio Video Question Event Article] do with {:ok, object, meta} <- Pipeline.common_pipeline(object, meta) do {:ok, object, meta} end diff --git a/lib/pleroma/web/activity_pub/transmogrifier.ex b/lib/pleroma/web/activity_pub/transmogrifier.ex index af4384213..d7dd9fe6b 100644 --- a/lib/pleroma/web/activity_pub/transmogrifier.ex +++ 
b/lib/pleroma/web/activity_pub/transmogrifier.ex @@ -7,7 +7,6 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do A module to handle coding from internal to wire ActivityPub and back. """ alias Pleroma.Activity - alias Pleroma.EarmarkRenderer alias Pleroma.EctoType.ActivityPub.ObjectValidators alias Pleroma.Maps alias Pleroma.Object @@ -45,7 +44,6 @@ def fix_object(object, options \\ []) do |> fix_addressing |> fix_summary |> fix_type(options) - |> fix_content end def fix_summary(%{"summary" => nil} = object) do @@ -274,24 +272,7 @@ def fix_url(%{"url" => url} = object) when is_map(url) do Map.put(object, "url", url["href"]) end - def fix_url(%{"type" => "Video", "url" => url} = object) when is_list(url) do - attachment = - Enum.find(url, fn x -> - media_type = x["mediaType"] || x["mimeType"] || "" - - is_map(x) and String.starts_with?(media_type, "video/") - end) - - link_element = - Enum.find(url, fn x -> is_map(x) and (x["mediaType"] || x["mimeType"]) == "text/html" end) - - object - |> Map.put("attachment", [attachment]) - |> Map.put("url", link_element["href"]) - end - - def fix_url(%{"type" => object_type, "url" => url} = object) - when object_type != "Video" and is_list(url) do + def fix_url(%{"url" => url} = object) when is_list(url) do first_element = Enum.at(url, 0) url_string = @@ -309,7 +290,7 @@ def fix_url(object), do: object def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do emoji = tags - |> Enum.filter(fn data -> data["type"] == "Emoji" and data["icon"] end) + |> Enum.filter(fn data -> is_map(data) and data["type"] == "Emoji" and data["icon"] end) |> Enum.reduce(%{}, fn data, mapping -> name = String.trim(data["name"], ":") @@ -371,18 +352,6 @@ def fix_type(%{"inReplyTo" => reply_id, "name" => _} = object, options) def fix_type(object, _), do: object - defp fix_content(%{"mediaType" => "text/markdown", "content" => content} = object) - when is_binary(content) do - html_content = - content - |> Earmark.as_html!(%Earmark.Options{renderer: EarmarkRenderer}) - |> Pleroma.HTML.filter_tags() - - Map.merge(object, %{"content" => html_content, "mediaType" => "text/html"}) - end - - defp fix_content(object), do: object - # Reduce the object list to find the reported user. 
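A quick sketch of the effect of the `is_map/1` guard added to `fix_emoji/1` above, with made-up data: bare string entries in `tag` (as some implementations send for hashtags) are now skipped rather than raising during emoji extraction.

```elixir
# Made-up object; only the "tag" handling matters here.
object = %{
  "tag" => [
    "somehashtag",
    %{
      "type" => "Emoji",
      "name" => ":blank:",
      "icon" => %{"url" => "https://example.com/blank.png"}
    }
  ]
}

Pleroma.Web.ActivityPub.Transmogrifier.fix_emoji(object)
# => the input map with "emoji" => %{"blank" => "https://example.com/blank.png"} merged in
```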
defp get_reported(objects) do Enum.reduce_while(objects, nil, fn ap_id, _ -> @@ -455,7 +424,7 @@ def handle_incoming( %{"type" => "Create", "object" => %{"type" => objtype} = object} = data, options ) - when objtype in ~w{Article Note Video Page} do + when objtype in ~w{Note Page} do actor = Containment.get_actor(data) with nil <- Activity.get_create_by_object_ap_id(object["id"]), @@ -546,13 +515,19 @@ def handle_incoming( end def handle_incoming( - %{"type" => "Create", "object" => %{"type" => objtype}} = data, + %{"type" => "Create", "object" => %{"type" => objtype, "id" => obj_id}} = data, _options ) - when objtype in ~w{Question Answer ChatMessage Audio Event} do + when objtype in ~w{Question Answer ChatMessage Audio Video Event Article} do + data = Map.put(data, "object", strip_internal_fields(data["object"])) + with {:ok, %User{}} <- ObjectValidator.fetch_actor(data), + nil <- Activity.get_create_by_object_ap_id(obj_id), {:ok, activity, _} <- Pipeline.common_pipeline(data, local: false) do {:ok, activity} + else + %Activity{} = activity -> {:ok, activity} + e -> e end end @@ -1029,7 +1004,7 @@ def perform(:user_upgrade, user) do def upgrade_user_from_ap_id(ap_id) do with %User{local: false} = user <- User.get_cached_by_ap_id(ap_id), - {:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id), + {:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id, force_http: true), {:ok, user} <- update_user(user, data) do TransmogrifierWorker.enqueue("user_upgrade", %{"user_id" => user.id}) {:ok, user} diff --git a/lib/pleroma/web/admin_api/controllers/admin_api_controller.ex b/lib/pleroma/web/admin_api/controllers/admin_api_controller.ex index f5e4d49f9..d5713c3dd 100644 --- a/lib/pleroma/web/admin_api/controllers/admin_api_controller.ex +++ b/lib/pleroma/web/admin_api/controllers/admin_api_controller.ex @@ -23,8 +23,6 @@ defmodule Pleroma.Web.AdminAPI.AdminAPIController do alias Pleroma.Web.Endpoint alias Pleroma.Web.Router - require Logger - @users_page_size 50 plug( @@ -68,6 +66,12 @@ defmodule Pleroma.Web.AdminAPI.AdminAPIController do when action in [:list_user_statuses, :list_instance_statuses] ) + plug( + OAuthScopesPlug, + %{scopes: ["read:chats"], admin: true} + when action in [:list_user_chats] + ) + plug( OAuthScopesPlug, %{scopes: ["read"], admin: true} @@ -256,6 +260,20 @@ def list_user_statuses(%{assigns: %{user: admin}} = conn, %{"nickname" => nickna end end + def list_user_chats(%{assigns: %{user: admin}} = conn, %{"nickname" => nickname} = _params) do + with %User{id: user_id} <- User.get_cached_by_nickname_or_id(nickname, for: admin) do + chats = + Pleroma.Chat.for_user_query(user_id) + |> Pleroma.Repo.all() + + conn + |> put_view(AdminAPI.ChatView) + |> render("index.json", chats: chats) + else + _ -> {:error, :not_found} + end + end + def user_toggle_activation(%{assigns: %{user: admin}} = conn, %{"nickname" => nickname}) do user = User.get_cached_by_nickname(nickname) diff --git a/lib/pleroma/web/admin_api/controllers/chat_controller.ex b/lib/pleroma/web/admin_api/controllers/chat_controller.ex new file mode 100644 index 000000000..967600d69 --- /dev/null +++ b/lib/pleroma/web/admin_api/controllers/chat_controller.ex @@ -0,0 +1,85 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.AdminAPI.ChatController do + use Pleroma.Web, :controller + + alias Pleroma.Activity + alias Pleroma.Chat + alias Pleroma.Chat.MessageReference + alias 
Pleroma.ModerationLog + alias Pleroma.Pagination + alias Pleroma.Plugs.OAuthScopesPlug + alias Pleroma.Web.AdminAPI + alias Pleroma.Web.CommonAPI + alias Pleroma.Web.PleromaAPI.Chat.MessageReferenceView + + require Logger + + plug(Pleroma.Web.ApiSpec.CastAndValidate) + + plug( + OAuthScopesPlug, + %{scopes: ["read:chats"], admin: true} when action in [:show, :messages] + ) + + plug( + OAuthScopesPlug, + %{scopes: ["write:chats"], admin: true} when action in [:delete_message] + ) + + action_fallback(Pleroma.Web.AdminAPI.FallbackController) + + defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.Admin.ChatOperation + + def delete_message(%{assigns: %{user: user}} = conn, %{ + message_id: message_id, + id: chat_id + }) do + with %MessageReference{object: %{data: %{"id" => object_ap_id}}} = cm_ref <- + MessageReference.get_by_id(message_id), + ^chat_id <- to_string(cm_ref.chat_id), + %Activity{id: activity_id} <- Activity.get_create_by_object_ap_id(object_ap_id), + {:ok, _} <- CommonAPI.delete(activity_id, user) do + ModerationLog.insert_log(%{ + action: "chat_message_delete", + actor: user, + subject_id: message_id + }) + + conn + |> put_view(MessageReferenceView) + |> render("show.json", chat_message_reference: cm_ref) + else + _e -> + {:error, :could_not_delete} + end + end + + def messages(conn, %{id: id} = params) do + with %Chat{} = chat <- Chat.get_by_id(id) do + cm_refs = + chat + |> MessageReference.for_chat_query() + |> Pagination.fetch_paginated(params) + + conn + |> put_view(MessageReferenceView) + |> render("index.json", chat_message_references: cm_refs) + else + _ -> + conn + |> put_status(:not_found) + |> json(%{error: "not found"}) + end + end + + def show(conn, %{id: id}) do + with %Chat{} = chat <- Chat.get_by_id(id) do + conn + |> put_view(AdminAPI.ChatView) + |> render("show.json", chat: chat) + end + end +end diff --git a/lib/pleroma/web/admin_api/controllers/instance_document_controller.ex b/lib/pleroma/web/admin_api/controllers/instance_document_controller.ex new file mode 100644 index 000000000..504d9b517 --- /dev/null +++ b/lib/pleroma/web/admin_api/controllers/instance_document_controller.ex @@ -0,0 +1,41 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.AdminAPI.InstanceDocumentController do + use Pleroma.Web, :controller + + alias Pleroma.Plugs.InstanceStatic + alias Pleroma.Plugs.OAuthScopesPlug + alias Pleroma.Web.InstanceDocument + + plug(Pleroma.Web.ApiSpec.CastAndValidate) + + action_fallback(Pleroma.Web.AdminAPI.FallbackController) + + defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.Admin.InstanceDocumentOperation + + plug(OAuthScopesPlug, %{scopes: ["read"], admin: true} when action == :show) + plug(OAuthScopesPlug, %{scopes: ["write"], admin: true} when action in [:update, :delete]) + + def show(conn, %{name: document_name}) do + with {:ok, url} <- InstanceDocument.get(document_name), + {:ok, content} <- File.read(InstanceStatic.file_path(url)) do + conn + |> put_resp_content_type("text/html") + |> send_resp(200, content) + end + end + + def update(%{body_params: %{file: file}} = conn, %{name: document_name}) do + with {:ok, url} <- InstanceDocument.put(document_name, file.path) do + json(conn, %{"url" => url}) + end + end + + def delete(conn, %{name: document_name}) do + with :ok <- InstanceDocument.delete(document_name) do + json(conn, %{}) + end + end +end diff --git 
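For reference, a hedged sketch of what the `show` action in the new InstanceDocumentController above reduces to when called directly; the document name "terms-of-service" is an assumption taken from the example URL used later in this patch.

```elixir
# Assumes a running instance; the document name is an assumption.
with {:ok, url} <- Pleroma.Web.InstanceDocument.get("terms-of-service"),
     {:ok, html} <- File.read(Pleroma.Plugs.InstanceStatic.file_path(url)) do
  IO.puts(html)
end
```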
a/lib/pleroma/web/admin_api/controllers/relay_controller.ex b/lib/pleroma/web/admin_api/controllers/relay_controller.ex index 95d06dde7..6c19f09f7 100644 --- a/lib/pleroma/web/admin_api/controllers/relay_controller.ex +++ b/lib/pleroma/web/admin_api/controllers/relay_controller.ex @@ -33,11 +33,7 @@ def index(conn, _params) do def follow(%{assigns: %{user: admin}, body_params: %{relay_url: target}} = conn, _) do with {:ok, _message} <- Relay.follow(target) do - ModerationLog.insert_log(%{ - action: "relay_follow", - actor: admin, - target: target - }) + ModerationLog.insert_log(%{action: "relay_follow", actor: admin, target: target}) json(conn, %{actor: target, followed_back: target in Relay.following()}) else @@ -48,13 +44,9 @@ def follow(%{assigns: %{user: admin}, body_params: %{relay_url: target}} = conn, end end - def unfollow(%{assigns: %{user: admin}, body_params: %{relay_url: target}} = conn, _) do - with {:ok, _message} <- Relay.unfollow(target) do - ModerationLog.insert_log(%{ - action: "relay_unfollow", - actor: admin, - target: target - }) + def unfollow(%{assigns: %{user: admin}, body_params: %{relay_url: target} = params} = conn, _) do + with {:ok, _message} <- Relay.unfollow(target, %{force: params[:force]}) do + ModerationLog.insert_log(%{action: "relay_unfollow", actor: admin, target: target}) json(conn, target) else diff --git a/lib/pleroma/web/admin_api/views/chat_view.ex b/lib/pleroma/web/admin_api/views/chat_view.ex new file mode 100644 index 000000000..847df1423 --- /dev/null +++ b/lib/pleroma/web/admin_api/views/chat_view.ex @@ -0,0 +1,30 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.AdminAPI.ChatView do + use Pleroma.Web, :view + + alias Pleroma.Chat + alias Pleroma.User + alias Pleroma.Web.MastodonAPI + alias Pleroma.Web.PleromaAPI + + def render("index.json", %{chats: chats} = opts) do + render_many(chats, __MODULE__, "show.json", Map.delete(opts, :chats)) + end + + def render("show.json", %{chat: %Chat{user_id: user_id}} = opts) do + user = User.get_by_id(user_id) + sender = MastodonAPI.AccountView.render("show.json", user: user, skip_visibility_check: true) + + serialized_chat = PleromaAPI.ChatView.render("show.json", opts) + + serialized_chat + |> Map.put(:sender, sender) + |> Map.put(:receiver, serialized_chat[:account]) + |> Map.delete(:account) + end + + def render(view, opts), do: PleromaAPI.ChatView.render(view, opts) +end diff --git a/lib/pleroma/web/admin_api/views/status_view.ex b/lib/pleroma/web/admin_api/views/status_view.ex index 500800be2..6042a22b6 100644 --- a/lib/pleroma/web/admin_api/views/status_view.ex +++ b/lib/pleroma/web/admin_api/views/status_view.ex @@ -8,6 +8,7 @@ defmodule Pleroma.Web.AdminAPI.StatusView do require Pleroma.Constants alias Pleroma.Web.AdminAPI + alias Pleroma.Web.CommonAPI alias Pleroma.Web.MastodonAPI defdelegate merge_account_views(user), to: AdminAPI.AccountView @@ -17,7 +18,7 @@ def render("index.json", opts) do end def render("show.json", %{activity: %{data: %{"object" => _object}} = activity} = opts) do - user = MastodonAPI.StatusView.get_user(activity.data["actor"]) + user = CommonAPI.get_user(activity.data["actor"]) MastodonAPI.StatusView.render("show.json", opts) |> Map.merge(%{account: merge_account_views(user)}) diff --git a/lib/pleroma/web/api_spec.ex b/lib/pleroma/web/api_spec.ex index 79fd5f871..93a5273e3 100644 --- a/lib/pleroma/web/api_spec.ex +++ b/lib/pleroma/web/api_spec.ex @@ -13,10 
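A hedged sketch of the force-unfollow path the controller change above threads through: `Relay.unfollow/2` now takes a `force` option that lets the unfollow proceed even when the remote relay actor can no longer be fetched. The relay URL is illustrative.

```elixir
# Illustrative relay actor URL.
Pleroma.Web.ActivityPub.Relay.unfollow("https://relay.example.com/actor", %{force: true})
```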
+13,15 @@ defmodule Pleroma.Web.ApiSpec do @impl OpenApi def spec do %OpenApi{ - servers: [ - # Populate the Server info from a phoenix endpoint - OpenApiSpex.Server.from_endpoint(Endpoint) - ], + servers: + if Phoenix.Endpoint.server?(:pleroma, Endpoint) do + [ + # Populate the Server info from a phoenix endpoint + OpenApiSpex.Server.from_endpoint(Endpoint) + ] + else + [] + end, info: %OpenApiSpex.Info{ title: "Pleroma", description: Application.spec(:pleroma, :description) |> to_string(), diff --git a/lib/pleroma/web/api_spec/cast_and_validate.ex b/lib/pleroma/web/api_spec/cast_and_validate.ex index fbfc27d6f..6d1a7ebbc 100644 --- a/lib/pleroma/web/api_spec/cast_and_validate.ex +++ b/lib/pleroma/web/api_spec/cast_and_validate.ex @@ -115,6 +115,10 @@ defp cast_and_validate(spec, operation, conn, content_type, false = _strict) do %{reason: :unexpected_field, name: name, path: [name]}, params -> Map.delete(params, name) + # Filter out empty params + %{reason: :invalid_type, path: [name_atom], value: ""}, params -> + Map.delete(params, to_string(name_atom)) + %{reason: :invalid_enum, name: nil, path: path, value: value}, params -> path = path |> Enum.reverse() |> tl() |> Enum.reverse() |> list_items_to_string() update_in(params, path, &List.delete(&1, value)) diff --git a/lib/pleroma/web/api_spec/helpers.ex b/lib/pleroma/web/api_spec/helpers.ex index 2a7f1a706..34de2ed57 100644 --- a/lib/pleroma/web/api_spec/helpers.ex +++ b/lib/pleroma/web/api_spec/helpers.ex @@ -72,7 +72,11 @@ def empty_object_response do end def empty_array_response do - Operation.response("Empty array", "application/json", %Schema{type: :array, example: []}) + Operation.response("Empty array", "application/json", %Schema{ + type: :array, + items: %Schema{type: :object, example: %{}}, + example: [] + }) end def no_content_response do diff --git a/lib/pleroma/web/api_spec/operations/account_operation.ex b/lib/pleroma/web/api_spec/operations/account_operation.ex index aaebc9b5c..d90ddb787 100644 --- a/lib/pleroma/web/api_spec/operations/account_operation.ex +++ b/lib/pleroma/web/api_spec/operations/account_operation.ex @@ -372,6 +372,10 @@ def identity_proofs_operation do tags: ["accounts"], summary: "Identity proofs", operationId: "AccountController.identity_proofs", + # Validators complains about unused path params otherwise + parameters: [ + %Reference{"$ref": "#/components/parameters/accountIdOrNickname"} + ], description: "Not implemented", responses: %{ 200 => empty_array_response() @@ -469,7 +473,6 @@ defp create_response do identifier: %Schema{type: :string}, message: %Schema{type: :string} }, - required: [], # Note: example of successful registration with failed login response: # example: %{ # "identifier" => "missing_confirmed_email", @@ -530,7 +533,7 @@ defp update_credentials_request do nullable: true, oneOf: [ %Schema{type: :array, items: attribute_field()}, - %Schema{type: :object, additionalProperties: %Schema{type: attribute_field()}} + %Schema{type: :object, additionalProperties: attribute_field()} ] }, # NOTE: `source` field is not supported diff --git a/lib/pleroma/web/api_spec/operations/admin/chat_operation.ex b/lib/pleroma/web/api_spec/operations/admin/chat_operation.ex new file mode 100644 index 000000000..d3e5dfc1c --- /dev/null +++ b/lib/pleroma/web/api_spec/operations/admin/chat_operation.ex @@ -0,0 +1,96 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ApiSpec.Admin.ChatOperation do + 
alias OpenApiSpex.Operation + alias Pleroma.Web.ApiSpec.Schemas.Chat + alias Pleroma.Web.ApiSpec.Schemas.ChatMessage + + import Pleroma.Web.ApiSpec.Helpers + + def open_api_operation(action) do + operation = String.to_existing_atom("#{action}_operation") + apply(__MODULE__, operation, []) + end + + def delete_message_operation do + %Operation{ + tags: ["admin", "chat"], + summary: "Delete an individual chat message", + operationId: "AdminAPI.ChatController.delete_message", + parameters: [ + Operation.parameter(:id, :path, :string, "The ID of the Chat"), + Operation.parameter(:message_id, :path, :string, "The ID of the message") + ], + responses: %{ + 200 => + Operation.response( + "The deleted ChatMessage", + "application/json", + ChatMessage + ) + }, + security: [ + %{ + "oAuth" => ["write:chats"] + } + ] + } + end + + def messages_operation do + %Operation{ + tags: ["admin", "chat"], + summary: "Get the most recent messages of the chat", + operationId: "AdminAPI.ChatController.messages", + parameters: + [Operation.parameter(:id, :path, :string, "The ID of the Chat")] ++ + pagination_params(), + responses: %{ + 200 => + Operation.response( + "The messages in the chat", + "application/json", + Pleroma.Web.ApiSpec.ChatOperation.chat_messages_response() + ) + }, + security: [ + %{ + "oAuth" => ["read:chats"] + } + ] + } + end + + def show_operation do + %Operation{ + tags: ["chat"], + summary: "Create a chat", + operationId: "AdminAPI.ChatController.show", + parameters: [ + Operation.parameter( + :id, + :path, + :string, + "The id of the chat", + required: true, + example: "1234" + ) + ], + responses: %{ + 200 => + Operation.response( + "The existing chat", + "application/json", + Chat + ) + }, + security: [ + %{ + "oAuth" => ["read"] + } + ] + } + end +end diff --git a/lib/pleroma/web/api_spec/operations/admin/instance_document_operation.ex b/lib/pleroma/web/api_spec/operations/admin/instance_document_operation.ex new file mode 100644 index 000000000..a120ff4e8 --- /dev/null +++ b/lib/pleroma/web/api_spec/operations/admin/instance_document_operation.ex @@ -0,0 +1,115 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ApiSpec.Admin.InstanceDocumentOperation do + alias OpenApiSpex.Operation + alias OpenApiSpex.Schema + alias Pleroma.Web.ApiSpec.Helpers + alias Pleroma.Web.ApiSpec.Schemas.ApiError + + def open_api_operation(action) do + operation = String.to_existing_atom("#{action}_operation") + apply(__MODULE__, operation, []) + end + + def show_operation do + %Operation{ + tags: ["Admin", "InstanceDocument"], + summary: "Get the instance document", + operationId: "AdminAPI.InstanceDocumentController.show", + security: [%{"oAuth" => ["read"]}], + parameters: [ + Operation.parameter(:name, :path, %Schema{type: :string}, "The document name", + required: true + ) + | Helpers.admin_api_params() + ], + responses: %{ + 200 => document_content(), + 400 => Operation.response("Bad Request", "application/json", ApiError), + 403 => Operation.response("Forbidden", "application/json", ApiError), + 404 => Operation.response("Not Found", "application/json", ApiError) + } + } + end + + def update_operation do + %Operation{ + tags: ["Admin", "InstanceDocument"], + summary: "Update the instance document", + operationId: "AdminAPI.InstanceDocumentController.update", + security: [%{"oAuth" => ["write"]}], + requestBody: Helpers.request_body("Parameters", update_request()), + parameters: [ + 
Operation.parameter(:name, :path, %Schema{type: :string}, "The document name", + required: true + ) + | Helpers.admin_api_params() + ], + responses: %{ + 200 => Operation.response("InstanceDocument", "application/json", instance_document()), + 400 => Operation.response("Bad Request", "application/json", ApiError), + 403 => Operation.response("Forbidden", "application/json", ApiError), + 404 => Operation.response("Not Found", "application/json", ApiError) + } + } + end + + defp update_request do + %Schema{ + title: "UpdateRequest", + description: "POST body for uploading the file", + type: :object, + required: [:file], + properties: %{ + file: %Schema{ + type: :string, + format: :binary, + description: "The file to be uploaded, using multipart form data." + } + } + } + end + + def delete_operation do + %Operation{ + tags: ["Admin", "InstanceDocument"], + summary: "Get the instance document", + operationId: "AdminAPI.InstanceDocumentController.delete", + security: [%{"oAuth" => ["write"]}], + parameters: [ + Operation.parameter(:name, :path, %Schema{type: :string}, "The document name", + required: true + ) + | Helpers.admin_api_params() + ], + responses: %{ + 200 => Operation.response("InstanceDocument", "application/json", instance_document()), + 400 => Operation.response("Bad Request", "application/json", ApiError), + 403 => Operation.response("Forbidden", "application/json", ApiError), + 404 => Operation.response("Not Found", "application/json", ApiError) + } + } + end + + defp instance_document do + %Schema{ + title: "InstanceDocument", + type: :object, + properties: %{ + url: %Schema{type: :string} + }, + example: %{ + "url" => "https://example.com/static/terms-of-service.html" + } + } + end + + defp document_content do + Operation.response("InstanceDocumentContent", "text/html", %Schema{ + type: :string, + example: "
<h1>Instance panel</h1>
" + }) + end +end diff --git a/lib/pleroma/web/api_spec/operations/admin/relay_operation.ex b/lib/pleroma/web/api_spec/operations/admin/relay_operation.ex index e06b2d164..f754bb9f5 100644 --- a/lib/pleroma/web/api_spec/operations/admin/relay_operation.ex +++ b/lib/pleroma/web/api_spec/operations/admin/relay_operation.ex @@ -56,7 +56,7 @@ def unfollow_operation do operationId: "AdminAPI.RelayController.unfollow", security: [%{"oAuth" => ["write:follows"]}], parameters: admin_api_params(), - requestBody: request_body("Parameters", relay_url()), + requestBody: request_body("Parameters", relay_unfollow()), responses: %{ 200 => Operation.response("Status", "application/json", %Schema{ @@ -91,4 +91,14 @@ defp relay_url do } } end + + defp relay_unfollow do + %Schema{ + type: :object, + properties: %{ + relay_url: %Schema{type: :string, format: :uri}, + force: %Schema{type: :boolean, default: false} + } + } + end end diff --git a/lib/pleroma/web/api_spec/operations/chat_operation.ex b/lib/pleroma/web/api_spec/operations/chat_operation.ex index b1a0d26ab..0dcfdb354 100644 --- a/lib/pleroma/web/api_spec/operations/chat_operation.ex +++ b/lib/pleroma/web/api_spec/operations/chat_operation.ex @@ -158,7 +158,8 @@ def messages_operation do "The messages in the chat", "application/json", chat_messages_response() - ) + ), + 404 => Operation.response("Not Found", "application/json", ApiError) }, security: [ %{ @@ -184,7 +185,8 @@ def post_chat_message_operation do "application/json", ChatMessage ), - 400 => Operation.response("Bad Request", "application/json", ApiError) + 400 => Operation.response("Bad Request", "application/json", ApiError), + 422 => Operation.response("MRF Rejection", "application/json", ApiError) }, security: [ %{ diff --git a/lib/pleroma/web/api_spec/operations/custom_emoji_operation.ex b/lib/pleroma/web/api_spec/operations/custom_emoji_operation.ex index 2f812ac77..5ff263ceb 100644 --- a/lib/pleroma/web/api_spec/operations/custom_emoji_operation.ex +++ b/lib/pleroma/web/api_spec/operations/custom_emoji_operation.ex @@ -69,7 +69,7 @@ defp custom_emoji do type: :object, properties: %{ category: %Schema{type: :string}, - tags: %Schema{type: :array} + tags: %Schema{type: :array, items: %Schema{type: :string}} } } ], diff --git a/lib/pleroma/web/api_spec/operations/emoji_reaction_operation.ex b/lib/pleroma/web/api_spec/operations/emoji_reaction_operation.ex index 1a49fece0..745d41f88 100644 --- a/lib/pleroma/web/api_spec/operations/emoji_reaction_operation.ex +++ b/lib/pleroma/web/api_spec/operations/emoji_reaction_operation.ex @@ -23,7 +23,7 @@ def index_operation do parameters: [ Operation.parameter(:id, :path, FlakeID, "Status ID", required: true), Operation.parameter(:emoji, :path, :string, "Filter by a single unicode emoji", - required: false + required: nil ) ], security: [%{"oAuth" => ["read:statuses"]}], diff --git a/lib/pleroma/web/api_spec/operations/list_operation.ex b/lib/pleroma/web/api_spec/operations/list_operation.ex index 15039052e..f6e73968a 100644 --- a/lib/pleroma/web/api_spec/operations/list_operation.ex +++ b/lib/pleroma/web/api_spec/operations/list_operation.ex @@ -187,8 +187,7 @@ defp add_remove_accounts_request(required) when is_boolean(required) do type: :object, properties: %{ account_ids: %Schema{type: :array, description: "Array of account IDs", items: FlakeID} - }, - required: required && [:account_ids] + } }, required: required ) diff --git a/lib/pleroma/web/api_spec/operations/pleroma_emoji_file_operation.ex 
b/lib/pleroma/web/api_spec/operations/pleroma_emoji_file_operation.ex new file mode 100644 index 000000000..a56641426 --- /dev/null +++ b/lib/pleroma/web/api_spec/operations/pleroma_emoji_file_operation.ex @@ -0,0 +1,139 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ApiSpec.PleromaEmojiFileOperation do + alias OpenApiSpex.Operation + alias OpenApiSpex.Schema + alias Pleroma.Web.ApiSpec.Schemas.ApiError + + import Pleroma.Web.ApiSpec.Helpers + + def open_api_operation(action) do + operation = String.to_existing_atom("#{action}_operation") + apply(__MODULE__, operation, []) + end + + def create_operation do + %Operation{ + tags: ["Emoji Packs"], + summary: "Add new file to the pack", + operationId: "PleromaAPI.EmojiPackController.add_file", + security: [%{"oAuth" => ["write"]}], + requestBody: request_body("Parameters", create_request(), required: true), + parameters: [name_param()], + responses: %{ + 200 => Operation.response("Files Object", "application/json", files_object()), + 422 => Operation.response("Unprocessable Entity", "application/json", ApiError), + 404 => Operation.response("Not Found", "application/json", ApiError), + 400 => Operation.response("Bad Request", "application/json", ApiError), + 409 => Operation.response("Conflict", "application/json", ApiError) + } + } + end + + defp create_request do + %Schema{ + type: :object, + required: [:file], + properties: %{ + file: %Schema{ + description: + "File needs to be uploaded with the multipart request or link to remote file", + anyOf: [ + %Schema{type: :string, format: :binary}, + %Schema{type: :string, format: :uri} + ] + }, + shortcode: %Schema{ + type: :string, + description: + "Shortcode for new emoji, must be unique for all emoji. If not sended, shortcode will be taken from original filename." + }, + filename: %Schema{ + type: :string, + description: + "New emoji file name. If not specified will be taken from original filename." 
+ } + } + } + end + + def update_operation do + %Operation{ + tags: ["Emoji Packs"], + summary: "Add new file to the pack", + operationId: "PleromaAPI.EmojiPackController.update_file", + security: [%{"oAuth" => ["write"]}], + requestBody: request_body("Parameters", update_request(), required: true), + parameters: [name_param()], + responses: %{ + 200 => Operation.response("Files Object", "application/json", files_object()), + 404 => Operation.response("Not Found", "application/json", ApiError), + 400 => Operation.response("Bad Request", "application/json", ApiError), + 409 => Operation.response("Conflict", "application/json", ApiError), + 422 => Operation.response("Unprocessable Entity", "application/json", ApiError) + } + } + end + + defp update_request do + %Schema{ + type: :object, + required: [:shortcode, :new_shortcode, :new_filename], + properties: %{ + shortcode: %Schema{ + type: :string, + description: "Emoji file shortcode" + }, + new_shortcode: %Schema{ + type: :string, + description: "New emoji file shortcode" + }, + new_filename: %Schema{ + type: :string, + description: "New filename for emoji file" + }, + force: %Schema{ + type: :boolean, + description: "With true value to overwrite existing emoji with new shortcode", + default: false + } + } + } + end + + def delete_operation do + %Operation{ + tags: ["Emoji Packs"], + summary: "Delete emoji file from pack", + operationId: "PleromaAPI.EmojiPackController.delete_file", + security: [%{"oAuth" => ["write"]}], + parameters: [ + name_param(), + Operation.parameter(:shortcode, :query, :string, "File shortcode", + example: "cofe", + required: true + ) + ], + responses: %{ + 200 => Operation.response("Files Object", "application/json", files_object()), + 400 => Operation.response("Bad Request", "application/json", ApiError), + 404 => Operation.response("Not Found", "application/json", ApiError), + 422 => Operation.response("Unprocessable Entity", "application/json", ApiError) + } + } + end + + defp name_param do + Operation.parameter(:name, :query, :string, "Pack Name", example: "cofe", required: true) + end + + defp files_object do + %Schema{ + type: :object, + additionalProperties: %Schema{type: :string}, + description: "Object with emoji names as keys and filenames as values" + } + end +end diff --git a/lib/pleroma/web/api_spec/operations/pleroma_emoji_pack_operation.ex b/lib/pleroma/web/api_spec/operations/pleroma_emoji_pack_operation.ex index b2b4f8713..79f52dcb3 100644 --- a/lib/pleroma/web/api_spec/operations/pleroma_emoji_pack_operation.ex +++ b/lib/pleroma/web/api_spec/operations/pleroma_emoji_pack_operation.ex @@ -19,7 +19,21 @@ def remote_operation do tags: ["Emoji Packs"], summary: "Make request to another instance for emoji packs list", security: [%{"oAuth" => ["write"]}], - parameters: [url_param()], + parameters: [ + url_param(), + Operation.parameter( + :page, + :query, + %Schema{type: :integer, default: 1}, + "Page" + ), + Operation.parameter( + :page_size, + :query, + %Schema{type: :integer, default: 30}, + "Number of emoji to return" + ) + ], operationId: "PleromaAPI.EmojiPackController.remote", responses: %{ 200 => emoji_packs_response(), @@ -175,111 +189,6 @@ def update_operation do } end - def add_file_operation do - %Operation{ - tags: ["Emoji Packs"], - summary: "Add new file to the pack", - operationId: "PleromaAPI.EmojiPackController.add_file", - security: [%{"oAuth" => ["write"]}], - requestBody: request_body("Parameters", add_file_request(), required: true), - parameters: [name_param()], - responses: %{ - 
200 => Operation.response("Files Object", "application/json", files_object()), - 400 => Operation.response("Bad Request", "application/json", ApiError), - 409 => Operation.response("Conflict", "application/json", ApiError) - } - } - end - - defp add_file_request do - %Schema{ - type: :object, - required: [:file], - properties: %{ - file: %Schema{ - description: - "File needs to be uploaded with the multipart request or link to remote file", - anyOf: [ - %Schema{type: :string, format: :binary}, - %Schema{type: :string, format: :uri} - ] - }, - shortcode: %Schema{ - type: :string, - description: - "Shortcode for new emoji, must be unique for all emoji. If not sended, shortcode will be taken from original filename." - }, - filename: %Schema{ - type: :string, - description: - "New emoji file name. If not specified will be taken from original filename." - } - } - } - end - - def update_file_operation do - %Operation{ - tags: ["Emoji Packs"], - summary: "Add new file to the pack", - operationId: "PleromaAPI.EmojiPackController.update_file", - security: [%{"oAuth" => ["write"]}], - requestBody: request_body("Parameters", update_file_request(), required: true), - parameters: [name_param()], - responses: %{ - 200 => Operation.response("Files Object", "application/json", files_object()), - 400 => Operation.response("Bad Request", "application/json", ApiError), - 409 => Operation.response("Conflict", "application/json", ApiError) - } - } - end - - defp update_file_request do - %Schema{ - type: :object, - required: [:shortcode, :new_shortcode, :new_filename], - properties: %{ - shortcode: %Schema{ - type: :string, - description: "Emoji file shortcode" - }, - new_shortcode: %Schema{ - type: :string, - description: "New emoji file shortcode" - }, - new_filename: %Schema{ - type: :string, - description: "New filename for emoji file" - }, - force: %Schema{ - type: :boolean, - description: "With true value to overwrite existing emoji with new shortcode", - default: false - } - } - } - end - - def delete_file_operation do - %Operation{ - tags: ["Emoji Packs"], - summary: "Delete emoji file from pack", - operationId: "PleromaAPI.EmojiPackController.delete_file", - security: [%{"oAuth" => ["write"]}], - parameters: [ - name_param(), - Operation.parameter(:shortcode, :query, :string, "File shortcode", - example: "cofe", - required: true - ) - ], - responses: %{ - 200 => Operation.response("Files Object", "application/json", files_object()), - 400 => Operation.response("Bad Request", "application/json", ApiError) - } - } - end - def import_from_filesystem_operation do %Operation{ tags: ["Emoji Packs"], @@ -297,7 +206,7 @@ def import_from_filesystem_operation do end defp name_param do - Operation.parameter(:name, :path, :string, "Pack Name", example: "cofe", required: true) + Operation.parameter(:name, :query, :string, "Pack Name", example: "cofe", required: true) end defp url_param do diff --git a/lib/pleroma/web/api_spec/operations/status_operation.ex b/lib/pleroma/web/api_spec/operations/status_operation.ex index 5bd4619d5..d7ebde6f6 100644 --- a/lib/pleroma/web/api_spec/operations/status_operation.ex +++ b/lib/pleroma/web/api_spec/operations/status_operation.ex @@ -55,7 +55,7 @@ def create_operation do "application/json", %Schema{oneOf: [Status, ScheduledStatus]} ), - 422 => Operation.response("Bad Request", "application/json", ApiError) + 422 => Operation.response("Bad Request / MRF Rejection", "application/json", ApiError) } } end diff --git a/lib/pleroma/web/api_spec/operations/user_import_operation.ex 
b/lib/pleroma/web/api_spec/operations/user_import_operation.ex new file mode 100644 index 000000000..a50314fb7 --- /dev/null +++ b/lib/pleroma/web/api_spec/operations/user_import_operation.ex @@ -0,0 +1,80 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ApiSpec.UserImportOperation do + alias OpenApiSpex.Operation + alias OpenApiSpex.Schema + alias Pleroma.Web.ApiSpec.Schemas.ApiError + + import Pleroma.Web.ApiSpec.Helpers + + @spec open_api_operation(atom) :: Operation.t() + def open_api_operation(action) do + operation = String.to_existing_atom("#{action}_operation") + apply(__MODULE__, operation, []) + end + + def follow_operation do + %Operation{ + tags: ["follow_import"], + summary: "Imports your follows.", + operationId: "UserImportController.follow", + requestBody: request_body("Parameters", import_request(), required: true), + responses: %{ + 200 => ok_response(), + 500 => Operation.response("Error", "application/json", ApiError) + }, + security: [%{"oAuth" => ["write:follow"]}] + } + end + + def blocks_operation do + %Operation{ + tags: ["blocks_import"], + summary: "Imports your blocks.", + operationId: "UserImportController.blocks", + requestBody: request_body("Parameters", import_request(), required: true), + responses: %{ + 200 => ok_response(), + 500 => Operation.response("Error", "application/json", ApiError) + }, + security: [%{"oAuth" => ["write:blocks"]}] + } + end + + def mutes_operation do + %Operation{ + tags: ["mutes_import"], + summary: "Imports your mutes.", + operationId: "UserImportController.mutes", + requestBody: request_body("Parameters", import_request(), required: true), + responses: %{ + 200 => ok_response(), + 500 => Operation.response("Error", "application/json", ApiError) + }, + security: [%{"oAuth" => ["write:mutes"]}] + } + end + + defp import_request do + %Schema{ + type: :object, + required: [:list], + properties: %{ + list: %Schema{ + description: + "STRING or FILE containing a whitespace-separated list of accounts to import.", + anyOf: [ + %Schema{type: :string, format: :binary}, + %Schema{type: :string} + ] + } + } + } + end + + defp ok_response do + Operation.response("Ok", "application/json", %Schema{type: :string, example: "ok"}) + end +end diff --git a/lib/pleroma/web/api_spec/schemas/chat_message.ex b/lib/pleroma/web/api_spec/schemas/chat_message.ex index bbf2a4427..9d2799618 100644 --- a/lib/pleroma/web/api_spec/schemas/chat_message.ex +++ b/lib/pleroma/web/api_spec/schemas/chat_message.ex @@ -4,6 +4,7 @@ defmodule Pleroma.Web.ApiSpec.Schemas.ChatMessage do alias OpenApiSpex.Schema + alias Pleroma.Web.ApiSpec.Schemas.Emoji require OpenApiSpex @@ -18,7 +19,7 @@ defmodule Pleroma.Web.ApiSpec.Schemas.ChatMessage do chat_id: %Schema{type: :string}, content: %Schema{type: :string, nullable: true}, created_at: %Schema{type: :string, format: :"date-time"}, - emojis: %Schema{type: :array}, + emojis: %Schema{type: :array, items: Emoji}, attachment: %Schema{type: :object, nullable: true}, card: %Schema{ type: :object, diff --git a/lib/pleroma/web/api_spec/schemas/scheduled_status.ex b/lib/pleroma/web/api_spec/schemas/scheduled_status.ex index 0520d0848..addefa9d3 100644 --- a/lib/pleroma/web/api_spec/schemas/scheduled_status.ex +++ b/lib/pleroma/web/api_spec/schemas/scheduled_status.ex @@ -27,9 +27,9 @@ defmodule Pleroma.Web.ApiSpec.Schemas.ScheduledStatus do media_ids: %Schema{type: :array, nullable: true, items: %Schema{type: :string}}, 
sensitive: %Schema{type: :boolean, nullable: true}, spoiler_text: %Schema{type: :string, nullable: true}, - visibility: %Schema{type: VisibilityScope, nullable: true}, + visibility: %Schema{allOf: [VisibilityScope], nullable: true}, scheduled_at: %Schema{type: :string, format: :"date-time", nullable: true}, - poll: %Schema{type: Poll, nullable: true}, + poll: %Schema{allOf: [Poll], nullable: true}, in_reply_to_id: %Schema{type: :string, nullable: true} } } diff --git a/lib/pleroma/web/common_api/common_api.ex b/lib/pleroma/web/common_api/common_api.ex index 500c3883e..60a50b027 100644 --- a/lib/pleroma/web/common_api/common_api.ex +++ b/lib/pleroma/web/common_api/common_api.ex @@ -48,6 +48,9 @@ def post_chat_message(%User{} = user, %User{} = recipient, content, opts \\ []) local: true )} do {:ok, activity} + else + {:common_pipeline, {:reject, _} = e} -> e + e -> e end end @@ -550,4 +553,21 @@ def hide_reblogs(%User{} = user, %User{} = target) do def show_reblogs(%User{} = user, %User{} = target) do UserRelationship.delete_reblog_mute(user, target) end + + def get_user(ap_id, fake_record_fallback \\ true) do + cond do + user = User.get_cached_by_ap_id(ap_id) -> + user + + user = User.get_by_guessed_nickname(ap_id) -> + user + + fake_record_fallback -> + # TODO: refactor (fake records is never a good idea) + User.error_user(ap_id) + + true -> + nil + end + end end diff --git a/lib/pleroma/web/controller_helper.ex b/lib/pleroma/web/controller_helper.ex index 6445966e0..69188a882 100644 --- a/lib/pleroma/web/controller_helper.ex +++ b/lib/pleroma/web/controller_helper.ex @@ -48,13 +48,13 @@ defp param_to_integer(val, default) when is_binary(val) do defp param_to_integer(_, default), do: default - def add_link_headers(conn, activities, extra_params \\ %{}) + def add_link_headers(conn, entries, extra_params \\ %{}) - def add_link_headers(%{assigns: %{skip_link_headers: true}} = conn, _activities, _extra_params), + def add_link_headers(%{assigns: %{skip_link_headers: true}} = conn, _entries, _extra_params), do: conn - def add_link_headers(conn, activities, extra_params) do - case get_pagination_fields(conn, activities, extra_params) do + def add_link_headers(conn, entries, extra_params) do + case get_pagination_fields(conn, entries, extra_params) do %{"next" => next_url, "prev" => prev_url} -> put_resp_header(conn, "link", "<#{next_url}>; rel=\"next\", <#{prev_url}>; rel=\"prev\"") @@ -78,19 +78,15 @@ defp build_pagination_fields(conn, min_id, max_id, extra_params) do } end - def get_pagination_fields(conn, activities, extra_params \\ %{}) do - case List.last(activities) do + def get_pagination_fields(conn, entries, extra_params \\ %{}) do + case List.last(entries) do %{pagination_id: max_id} when not is_nil(max_id) -> - %{pagination_id: min_id} = - activities - |> List.first() + %{pagination_id: min_id} = List.first(entries) build_pagination_fields(conn, min_id, max_id, extra_params) %{id: max_id} -> - %{id: min_id} = - activities - |> List.first() + %{id: min_id} = List.first(entries) build_pagination_fields(conn, min_id, max_id, extra_params) diff --git a/lib/pleroma/web/fed_sockets/fed_registry.ex b/lib/pleroma/web/fed_sockets/fed_registry.ex new file mode 100644 index 000000000..e00ea69c0 --- /dev/null +++ b/lib/pleroma/web/fed_sockets/fed_registry.ex @@ -0,0 +1,185 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.FedRegistry do + @moduledoc """ + The FedRegistry 
stores the active FedSockets for quick retrieval. + + The storage and retrieval portion of the FedRegistry is done in process through + elixir's `Registry` module for speed and its ability to monitor for terminated processes. + + Dropped connections will be caught by `Registry` and deleted. Since the next + message will initiate a new connection there is no reason to try and reconnect at that point. + + Normally outside modules should have no need to call or use the FedRegistry themselves. + """ + + alias Pleroma.Web.FedSockets.FedSocket + alias Pleroma.Web.FedSockets.SocketInfo + + require Logger + + @default_rejection_duration 15 * 60 * 1000 + @rejections :fed_socket_rejections + + @doc """ + Retrieves a FedSocket from the Registry given it's origin. + + The origin is expected to be a string identifying the endpoint "example.com" or "example2.com:8080" + + Will return: + * {:ok, fed_socket} for working FedSockets + * {:error, :rejected} for origins that have been tried and refused within the rejection duration interval + * {:error, some_reason} usually :missing for unknown origins + """ + def get_fed_socket(origin) do + case get_registry_data(origin) do + {:error, reason} -> + {:error, reason} + + {:ok, %{state: :connected} = socket_info} -> + {:ok, socket_info} + end + end + + @doc """ + Adds a connected FedSocket to the Registry. + + Always returns {:ok, fed_socket} + """ + def add_fed_socket(origin, pid \\ nil) do + origin + |> SocketInfo.build(pid) + |> SocketInfo.connect() + |> add_socket_info + end + + defp add_socket_info(%{origin: origin, state: :connected} = socket_info) do + case Registry.register(FedSockets.Registry, origin, socket_info) do + {:ok, _owner} -> + clear_prior_rejection(origin) + Logger.debug("fedsocket added: #{inspect(origin)}") + + {:ok, socket_info} + + {:error, {:already_registered, _pid}} -> + FedSocket.close(socket_info) + existing_socket_info = Registry.lookup(FedSockets.Registry, origin) + + {:ok, existing_socket_info} + + _ -> + {:error, :error_adding_socket} + end + end + + @doc """ + Mark this origin as having rejected a connection attempt. + This will keep it from getting additional connection attempts + for a period of time specified in the config. + + Always returns {:ok, new_reg_data} + """ + def set_host_rejected(uri) do + new_reg_data = + uri + |> SocketInfo.origin() + |> get_or_create_registry_data() + |> set_to_rejected() + |> save_registry_data() + + {:ok, new_reg_data} + end + + @doc """ + Retrieves the FedRegistryData from the Registry given it's origin. + + The origin is expected to be a string identifying the endpoint "example.com" or "example2.com:8080" + + Will return: + * {:ok, fed_registry_data} for known origins + * {:error, :missing} for uniknown origins + * {:error, :cache_error} indicating some low level runtime issues + """ + def get_registry_data(origin) do + case Registry.lookup(FedSockets.Registry, origin) do + [] -> + if is_rejected?(origin) do + Logger.debug("previously rejected fedsocket requested") + {:error, :rejected} + else + {:error, :missing} + end + + [{_pid, %{state: :connected} = socket_info}] -> + {:ok, socket_info} + + _ -> + {:error, :cache_error} + end + end + + @doc """ + Retrieves a map of all sockets from the Registry. 
The keys are the origins and the values are the corresponding SocketInfo + """ + def list_all do + (list_all_connected() ++ list_all_rejected()) + |> Enum.into(%{}) + end + + defp list_all_connected do + FedSockets.Registry + |> Registry.select([{{:"$1", :_, :"$3"}, [], [{{:"$1", :"$3"}}]}]) + end + + defp list_all_rejected do + {:ok, keys} = Cachex.keys(@rejections) + + {:ok, registry_data} = + Cachex.execute(@rejections, fn worker -> + Enum.map(keys, fn k -> {k, Cachex.get!(worker, k)} end) + end) + + registry_data + end + + defp clear_prior_rejection(origin), + do: Cachex.del(@rejections, origin) + + defp is_rejected?(origin) do + case Cachex.get(@rejections, origin) do + {:ok, nil} -> + false + + {:ok, _} -> + true + end + end + + defp get_or_create_registry_data(origin) do + case get_registry_data(origin) do + {:error, :missing} -> + %SocketInfo{origin: origin} + + {:ok, socket_info} -> + socket_info + end + end + + defp save_registry_data(%SocketInfo{origin: origin, state: :connected} = socket_info) do + {:ok, true} = Registry.update_value(FedSockets.Registry, origin, fn _ -> socket_info end) + socket_info + end + + defp save_registry_data(%SocketInfo{origin: origin, state: :rejected} = socket_info) do + rejection_expiration = + Pleroma.Config.get([:fed_sockets, :rejection_duration], @default_rejection_duration) + + {:ok, true} = Cachex.put(@rejections, origin, socket_info, ttl: rejection_expiration) + socket_info + end + + defp set_to_rejected(%SocketInfo{} = socket_info), + do: %SocketInfo{socket_info | state: :rejected} +end diff --git a/lib/pleroma/web/fed_sockets/fed_socket.ex b/lib/pleroma/web/fed_sockets/fed_socket.ex new file mode 100644 index 000000000..98d64e65a --- /dev/null +++ b/lib/pleroma/web/fed_sockets/fed_socket.ex @@ -0,0 +1,137 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.FedSocket do + @moduledoc """ + The FedSocket module abstracts the actions to be taken taken on connections regardless of + whether the connection started as inbound or outbound. + + + Normally outside modules will have no need to call the FedSocket module directly. 
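+
+  For illustration, a caller would normally go through the `Pleroma.Web.FedSockets`
+  facade rather than this module; a minimal sketch (URL and payload are placeholders):
+
+      json = Jason.encode!(activity_data)
+      {:ok, fed_socket} = Pleroma.Web.FedSockets.get_or_create_fed_socket("https://example.com")
+      Pleroma.Web.FedSockets.publish(fed_socket, json)
+      {:ok, object_json} = Pleroma.Web.FedSockets.fetch(fed_socket, "https://example.com/objects/1")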
+ """ + + alias Pleroma.Object + alias Pleroma.Object.Containment + alias Pleroma.User + alias Pleroma.Web.ActivityPub.ObjectView + alias Pleroma.Web.ActivityPub.UserView + alias Pleroma.Web.ActivityPub.Visibility + alias Pleroma.Web.FedSockets.FetchRegistry + alias Pleroma.Web.FedSockets.IngesterWorker + alias Pleroma.Web.FedSockets.OutgoingHandler + alias Pleroma.Web.FedSockets.SocketInfo + + require Logger + + @shake "61dd18f7-f1e6-49a4-939a-a749fcdc1103" + + def connect_to_host(uri) do + case OutgoingHandler.start_link(uri) do + {:ok, pid} -> + {:ok, pid} + + error -> + {:error, error} + end + end + + def close(%SocketInfo{pid: socket_pid}), + do: Process.send(socket_pid, :close, []) + + def publish(%SocketInfo{pid: socket_pid}, json) do + %{action: :publish, data: json} + |> Jason.encode!() + |> send_packet(socket_pid) + end + + def fetch(%SocketInfo{pid: socket_pid}, id) do + fetch_uuid = FetchRegistry.register_fetch(id) + + %{action: :fetch, data: id, uuid: fetch_uuid} + |> Jason.encode!() + |> send_packet(socket_pid) + + wait_for_fetch_to_return(fetch_uuid, 0) + end + + def receive_package(%SocketInfo{} = fed_socket, json) do + json + |> Jason.decode!() + |> process_package(fed_socket) + end + + defp wait_for_fetch_to_return(uuid, cntr) do + case FetchRegistry.check_fetch(uuid) do + {:error, :waiting} -> + Process.sleep(:math.pow(cntr, 3) |> Kernel.trunc()) + wait_for_fetch_to_return(uuid, cntr + 1) + + {:error, :missing} -> + Logger.error("FedSocket fetch timed out - #{inspect(uuid)}") + {:error, :timeout} + + {:ok, _fr} -> + FetchRegistry.pop_fetch(uuid) + end + end + + defp process_package(%{"action" => "publish", "data" => data}, %{origin: origin} = _fed_socket) do + if Containment.contain_origin(origin, data) do + IngesterWorker.enqueue("ingest", %{"object" => data}) + end + + {:reply, %{"action" => "publish_reply", "status" => "processed"}} + end + + defp process_package(%{"action" => "fetch_reply", "uuid" => uuid, "data" => data}, _fed_socket) do + FetchRegistry.register_fetch_received(uuid, data) + {:noreply, nil} + end + + defp process_package(%{"action" => "fetch", "uuid" => uuid, "data" => ap_id}, _fed_socket) do + {:ok, data} = render_fetched_data(ap_id, uuid) + {:reply, data} + end + + defp process_package(%{"action" => "publish_reply"}, _fed_socket) do + {:noreply, nil} + end + + defp process_package(other, _fed_socket) do + Logger.warn("unknown json packages received #{inspect(other)}") + {:noreply, nil} + end + + defp render_fetched_data(ap_id, uuid) do + {:ok, + %{ + "action" => "fetch_reply", + "status" => "processed", + "uuid" => uuid, + "data" => represent_item(ap_id) + }} + end + + defp represent_item(ap_id) do + case User.get_by_ap_id(ap_id) do + nil -> + object = Object.get_cached_by_ap_id(ap_id) + + if Visibility.is_public?(object) do + Phoenix.View.render_to_string(ObjectView, "object.json", object: object) + else + nil + end + + user -> + Phoenix.View.render_to_string(UserView, "user.json", user: user) + end + end + + defp send_packet(data, socket_pid) do + Process.send(socket_pid, {:send, data}, []) + end + + def shake, do: @shake +end diff --git a/lib/pleroma/web/fed_sockets/fed_sockets.ex b/lib/pleroma/web/fed_sockets/fed_sockets.ex new file mode 100644 index 000000000..1fd5899c8 --- /dev/null +++ b/lib/pleroma/web/fed_sockets/fed_sockets.ex @@ -0,0 +1,185 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets do + @moduledoc """ + This 
documents the FedSockets framework. A framework for federating + ActivityPub objects between servers via persistant WebSocket connections. + + FedSockets allow servers to authenticate on first contact and maintain that + connection, eliminating the need to authenticate every time data needs to be shared. + + ## Protocol + FedSockets currently support 2 types of data transfer: + * `publish` method which doesn't require a response + * `fetch` method requires a response be sent + + ### Publish + The publish operation sends a json encoded map of the shape: + %{action: :publish, data: json} + and accepts (but does not require) a reply of form: + %{"action" => "publish_reply"} + + The outgoing params represent + * data: ActivityPub object encoded into json + + + ### Fetch + The fetch operation sends a json encoded map of the shape: + %{action: :fetch, data: id, uuid: fetch_uuid} + and requires a reply of form: + %{"action" => "fetch_reply", "uuid" => uuid, "data" => data} + + The outgoing params represent + * id: an ActivityPub object URI + * uuid: a unique uuid generated by the sender + + The reply params represent + * data: an ActivityPub object encoded into json + * uuid: the uuid sent along with the fetch request + + ## Examples + Clients of FedSocket transfers shouldn't need to use any of the functions outside of this module. + + A typical publish operation can be performed through the following code, and a fetch operation in a similar manner. + + case FedSockets.get_or_create_fed_socket(inbox) do + {:ok, fedsocket} -> + FedSockets.publish(fedsocket, json) + + _ -> + alternative_publish(inbox, actor, json, params) + end + + ## Configuration + FedSockets have the following config settings + + config :pleroma, :fed_sockets, + enabled: true, + ping_interval: :timer.seconds(15), + connection_duration: :timer.hours(1), + rejection_duration: :timer.hours(1), + fed_socket_fetches: [ + default: 12_000, + interval: 3_000, + lazy: false + ] + * enabled - turn FedSockets on or off with this flag. Can be toggled at runtime. + * connection_duration - How long a FedSocket can sit idle before it's culled. + * rejection_duration - After failing to make a FedSocket connection a host will be excluded + from further connections for this amount of time + * fed_socket_fetches - Use these parameters to pass options to the Cachex queue backing the FetchRegistry + * fed_socket_rejections - Use these parameters to pass options to the Cachex queue backing the FedRegistry + + Cachex options are + * default: the minimum amount of time a fetch can wait before it times out. + * interval: the interval between checks for timed out entries. This plus the default represent the maximum time allowed + * lazy: leave at false for consistant and fast lookups, set to true for stricter timeout enforcement + + """ + require Logger + + alias Pleroma.Web.FedSockets.FedRegistry + alias Pleroma.Web.FedSockets.FedSocket + alias Pleroma.Web.FedSockets.SocketInfo + + @doc """ + returns a FedSocket for the given origin. Will reuse an existing one or create a new one. + + address is expected to be a fully formed URL such as: + "http://www.example.com" or "http://www.example.com:8080" + + It can and usually does include additional path parameters, + but these are ignored as the FedSockets are organized by host and port info alone. 
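+
+  Will return (mirroring the clauses below):
+  * {:ok, fed_socket} for a cached or newly established connection
+  * {:error, :rejected} if the host recently refused a connection attempt
+  * {:error, :disabled} if FedSockets are not enabled in the config
+  * {:error, reason} for any other failure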
+ """ + def get_or_create_fed_socket(address) do + with {:cache, {:error, :missing}} <- {:cache, get_fed_socket(address)}, + {:connect, {:ok, _pid}} <- {:connect, FedSocket.connect_to_host(address)}, + {:cache, {:ok, fed_socket}} <- {:cache, get_fed_socket(address)} do + Logger.debug("fedsocket created for - #{inspect(address)}") + {:ok, fed_socket} + else + {:cache, {:ok, socket}} -> + Logger.debug("fedsocket found in cache - #{inspect(address)}") + {:ok, socket} + + {:cache, {:error, :rejected} = e} -> + e + + {:connect, {:error, _host}} -> + Logger.debug("set host rejected for - #{inspect(address)}") + FedRegistry.set_host_rejected(address) + {:error, :rejected} + + {_, {:error, :disabled}} -> + {:error, :disabled} + + {_, {:error, reason}} -> + Logger.warn("get_or_create_fed_socket error - #{inspect(reason)}") + {:error, reason} + end + end + + @doc """ + returns a FedSocket for the given origin. Will not create a new FedSocket if one does not exist. + + address is expected to be a fully formed URL such as: + "http://www.example.com" or "http://www.example.com:8080" + """ + def get_fed_socket(address) do + origin = SocketInfo.origin(address) + + with {:config, true} <- {:config, Pleroma.Config.get([:fed_sockets, :enabled], false)}, + {:ok, socket} <- FedRegistry.get_fed_socket(origin) do + {:ok, socket} + else + {:config, _} -> + {:error, :disabled} + + {:error, :rejected} -> + Logger.debug("FedSocket previously rejected - #{inspect(origin)}") + {:error, :rejected} + + {:error, reason} -> + {:error, reason} + end + end + + @doc """ + Sends the supplied data via the publish protocol. + It will not block waiting for a reply. + Returns :ok but this is not an indication of a successful transfer. + + the data is expected to be JSON encoded binary data. + """ + def publish(%SocketInfo{} = fed_socket, json) do + FedSocket.publish(fed_socket, json) + end + + @doc """ + Sends the supplied data via the fetch protocol. + It will block waiting for a reply or timeout. + + Returns {:ok, object} where object is the requested object (or nil) + {:error, :timeout} in the event the message was not responded to + + the id is expected to be the URI of an ActivityPub object. + """ + def fetch(%SocketInfo{} = fed_socket, id) do + FedSocket.fetch(fed_socket, id) + end + + @doc """ + Disconnect all and restart FedSockets. + This is mainly used in development and testing but could be useful in production. + """ + def reset do + FedRegistry + |> Process.whereis() + |> Process.exit(:testing) + end + + def uri_for_origin(origin), + do: "ws://#{origin}/api/fedsocket/v1" +end diff --git a/lib/pleroma/web/fed_sockets/fetch_registry.ex b/lib/pleroma/web/fed_sockets/fetch_registry.ex new file mode 100644 index 000000000..7897f0fc6 --- /dev/null +++ b/lib/pleroma/web/fed_sockets/fetch_registry.ex @@ -0,0 +1,151 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.FetchRegistry do + @moduledoc """ + The FetchRegistry acts as a broker for fetch requests and return values. + This allows calling processes to block while waiting for a reply. + It doesn't impose it's own process instead using `Cachex` to handle fetches in process, allowing + multi threaded processes to avoid bottlenecking. + + Normally outside modules will have no need to call or use the FetchRegistry themselves. + + The `Cachex` parameters can be controlled from the config. 
Since exact timeout intervals + aren't necessary the following settings are used by default: + + config :pleroma, :fed_sockets, + fed_socket_fetches: [ + default: 12_000, + interval: 3_000, + lazy: false + ] + + """ + + defmodule FetchRegistryData do + defstruct uuid: nil, + sent_json: nil, + received_json: nil, + sent_at: nil, + received_at: nil + end + + alias Ecto.UUID + + require Logger + + @fetches :fed_socket_fetches + + @doc """ + Registers a json request wth the FetchRegistry and returns the identifying UUID. + """ + def register_fetch(json) do + %FetchRegistryData{uuid: uuid} = + json + |> new_registry_data + |> save_registry_data + + uuid + end + + @doc """ + Reports on the status of a Fetch given the identifying UUID. + + Will return + * {:ok, fetched_object} if a fetch has completed + * {:error, :waiting} if a fetch is still pending + * {:error, other_error} usually :missing to indicate a fetch that has timed out + """ + def check_fetch(uuid) do + case get_registry_data(uuid) do + {:ok, %FetchRegistryData{received_at: nil}} -> + {:error, :waiting} + + {:ok, %FetchRegistryData{} = reg_data} -> + {:ok, reg_data} + + e -> + e + end + end + + @doc """ + Retrieves the response to a fetch given the identifying UUID. + The completed fetch will be deleted from the FetchRegistry + + Will return + * {:ok, fetched_object} if a fetch has completed + * {:error, :waiting} if a fetch is still pending + * {:error, other_error} usually :missing to indicate a fetch that has timed out + """ + def pop_fetch(uuid) do + case check_fetch(uuid) do + {:ok, %FetchRegistryData{received_json: received_json}} -> + delete_registry_data(uuid) + {:ok, received_json} + + e -> + e + end + end + + @doc """ + This is called to register a fetch has returned. + It expects the result data along with the UUID that was sent in the request + + Will return the fetched object or :error + """ + def register_fetch_received(uuid, data) do + case get_registry_data(uuid) do + {:ok, %FetchRegistryData{received_at: nil} = reg_data} -> + reg_data + |> set_fetch_received(data) + |> save_registry_data() + + {:ok, %FetchRegistryData{} = reg_data} -> + Logger.warn("tried to add fetched data twice - #{uuid}") + reg_data + + {:error, _} -> + Logger.warn("Error adding fetch to registry - #{uuid}") + :error + end + end + + defp new_registry_data(json) do + %FetchRegistryData{ + uuid: UUID.generate(), + sent_json: json, + sent_at: :erlang.monotonic_time(:millisecond) + } + end + + defp get_registry_data(origin) do + case Cachex.get(@fetches, origin) do + {:ok, nil} -> + {:error, :missing} + + {:ok, reg_data} -> + {:ok, reg_data} + + _ -> + {:error, :cache_error} + end + end + + defp set_fetch_received(%FetchRegistryData{} = reg_data, data), + do: %FetchRegistryData{ + reg_data + | received_at: :erlang.monotonic_time(:millisecond), + received_json: data + } + + defp save_registry_data(%FetchRegistryData{uuid: uuid} = reg_data) do + {:ok, true} = Cachex.put(@fetches, uuid, reg_data) + reg_data + end + + defp delete_registry_data(origin), + do: {:ok, true} = Cachex.del(@fetches, origin) +end diff --git a/lib/pleroma/web/fed_sockets/incoming_handler.ex b/lib/pleroma/web/fed_sockets/incoming_handler.ex new file mode 100644 index 000000000..49d0d9d84 --- /dev/null +++ b/lib/pleroma/web/fed_sockets/incoming_handler.ex @@ -0,0 +1,88 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.IncomingHandler do + require Logger + + 
alias Pleroma.Web.FedSockets.FedRegistry + alias Pleroma.Web.FedSockets.FedSocket + alias Pleroma.Web.FedSockets.SocketInfo + + import HTTPSignatures, only: [validate_conn: 1, split_signature: 1] + + @behaviour :cowboy_websocket + + def init(req, state) do + shake = FedSocket.shake() + + with true <- Pleroma.Config.get([:fed_sockets, :enabled]), + sec_protocol <- :cowboy_req.header("sec-websocket-protocol", req, nil), + headers = %{"(request-target)" => ^shake} <- :cowboy_req.headers(req), + true <- validate_conn(%{req_headers: headers}), + %{"keyId" => origin} <- split_signature(headers["signature"]) do + req = + if is_nil(sec_protocol) do + req + else + :cowboy_req.set_resp_header("sec-websocket-protocol", sec_protocol, req) + end + + {:cowboy_websocket, req, %{origin: origin}, %{}} + else + _ -> + {:ok, req, state} + end + end + + def websocket_init(%{origin: origin}) do + case FedRegistry.add_fed_socket(origin) do + {:ok, socket_info} -> + {:ok, socket_info} + + e -> + Logger.error("FedSocket websocket_init failed - #{inspect(e)}") + {:error, inspect(e)} + end + end + + # Use the ping to check if the connection should be expired + def websocket_handle(:ping, socket_info) do + if SocketInfo.expired?(socket_info) do + {:stop, socket_info} + else + {:ok, socket_info, :hibernate} + end + end + + def websocket_handle({:text, data}, socket_info) do + socket_info = SocketInfo.touch(socket_info) + + case FedSocket.receive_package(socket_info, data) do + {:noreply, _} -> + {:ok, socket_info} + + {:reply, reply} -> + {:reply, {:text, Jason.encode!(reply)}, socket_info} + + {:error, reason} -> + Logger.error("incoming error - receive_package: #{inspect(reason)}") + {:ok, socket_info} + end + end + + def websocket_info({:send, message}, socket_info) do + socket_info = SocketInfo.touch(socket_info) + + {:reply, {:text, message}, socket_info} + end + + def websocket_info(:close, state) do + {:stop, state} + end + + def websocket_info(message, state) do + Logger.debug("#{__MODULE__} unknown message #{inspect(message)}") + {:ok, state} + end +end diff --git a/lib/pleroma/web/fed_sockets/ingester_worker.ex b/lib/pleroma/web/fed_sockets/ingester_worker.ex new file mode 100644 index 000000000..325f2a4ab --- /dev/null +++ b/lib/pleroma/web/fed_sockets/ingester_worker.ex @@ -0,0 +1,33 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.IngesterWorker do + use Pleroma.Workers.WorkerHelper, queue: "ingestion_queue" + require Logger + + alias Pleroma.Web.Federator + + @impl Oban.Worker + def perform(%Job{args: %{"op" => "ingest", "object" => ingestee}}) do + try do + ingestee + |> Jason.decode!() + |> do_ingestion() + rescue + e -> + Logger.error("IngesterWorker error - #{inspect(e)}") + e + end + end + + defp do_ingestion(params) do + case Federator.incoming_ap_doc(params) do + {:error, reason} -> + {:error, reason} + + {:ok, object} -> + {:ok, object} + end + end +end diff --git a/lib/pleroma/web/fed_sockets/outgoing_handler.ex b/lib/pleroma/web/fed_sockets/outgoing_handler.ex new file mode 100644 index 000000000..e235a7c43 --- /dev/null +++ b/lib/pleroma/web/fed_sockets/outgoing_handler.ex @@ -0,0 +1,151 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.OutgoingHandler do + use GenServer + + require Logger + + alias Pleroma.Application + alias 
Pleroma.Web.ActivityPub.InternalFetchActor + alias Pleroma.Web.FedSockets + alias Pleroma.Web.FedSockets.FedRegistry + alias Pleroma.Web.FedSockets.FedSocket + alias Pleroma.Web.FedSockets.SocketInfo + + def start_link(uri) do + GenServer.start_link(__MODULE__, %{uri: uri}) + end + + def init(%{uri: uri}) do + case initiate_connection(uri) do + {:ok, ws_origin, conn_pid} -> + FedRegistry.add_fed_socket(ws_origin, conn_pid) + + {:error, reason} -> + Logger.debug("Outgoing connection failed - #{inspect(reason)}") + :ignore + end + end + + def handle_info({:gun_ws, conn_pid, _ref, {:text, data}}, socket_info) do + socket_info = SocketInfo.touch(socket_info) + + case FedSocket.receive_package(socket_info, data) do + {:noreply, _} -> + {:noreply, socket_info} + + {:reply, reply} -> + :gun.ws_send(conn_pid, {:text, Jason.encode!(reply)}) + {:noreply, socket_info} + + {:error, reason} -> + Logger.error("incoming error - receive_package: #{inspect(reason)}") + {:noreply, socket_info} + end + end + + def handle_info(:close, state) do + Logger.debug("Sending close frame !!!!!!!") + {:close, state} + end + + def handle_info({:gun_down, _pid, _prot, :closed, _}, state) do + {:stop, :normal, state} + end + + def handle_info({:send, data}, %{conn_pid: conn_pid} = socket_info) do + socket_info = SocketInfo.touch(socket_info) + :gun.ws_send(conn_pid, {:text, data}) + {:noreply, socket_info} + end + + def handle_info({:gun_ws, _, _, :pong}, state) do + {:noreply, state, :hibernate} + end + + def handle_info(msg, state) do + Logger.debug("#{__MODULE__} unhandled event #{inspect(msg)}") + {:noreply, state} + end + + def terminate(reason, state) do + Logger.debug( + "#{__MODULE__} terminating outgoing connection for #{inspect(state)} for #{inspect(reason)}" + ) + + {:ok, state} + end + + def initiate_connection(uri) do + ws_uri = + uri + |> SocketInfo.origin() + |> FedSockets.uri_for_origin() + + %{host: host, port: port, path: path} = URI.parse(ws_uri) + + with {:ok, conn_pid} <- :gun.open(to_charlist(host), port, %{protocols: [:http]}), + {:ok, _} <- :gun.await_up(conn_pid), + reference <- + :gun.get(conn_pid, to_charlist(path), [ + {'user-agent', to_charlist(Application.user_agent())} + ]), + {:response, :fin, 204, _} <- :gun.await(conn_pid, reference), + headers <- build_headers(uri), + ref <- :gun.ws_upgrade(conn_pid, to_charlist(path), headers, %{silence_pings: false}) do + receive do + {:gun_upgrade, ^conn_pid, ^ref, [<<"websocket">>], _} -> + {:ok, ws_uri, conn_pid} + after + 15_000 -> + Logger.debug("Fedsocket timeout connecting to #{inspect(uri)}") + {:error, :timeout} + end + else + {:response, :nofin, 404, _} -> + {:error, :fedsockets_not_supported} + + e -> + Logger.debug("Fedsocket error connecting to #{inspect(uri)}") + {:error, e} + end + end + + defp build_headers(uri) do + host_for_sig = uri |> URI.parse() |> host_signature() + + shake = FedSocket.shake() + digest = "SHA-256=" <> (:crypto.hash(:sha256, shake) |> Base.encode64()) + date = Pleroma.Signature.signed_date() + shake_size = byte_size(shake) + + signature_opts = %{ + "(request-target)": shake, + "content-length": to_charlist("#{shake_size}"), + date: date, + digest: digest, + host: host_for_sig + } + + signature = Pleroma.Signature.sign(InternalFetchActor.get_actor(), signature_opts) + + [ + {'signature', to_charlist(signature)}, + {'date', date}, + {'digest', to_charlist(digest)}, + {'content-length', to_charlist("#{shake_size}")}, + {to_charlist("(request-target)"), to_charlist(shake)}, + {'user-agent', 
to_charlist(Application.user_agent())} + ] + end + + defp host_signature(%{host: host, scheme: scheme, port: port}) do + if port == URI.default_port(scheme) do + host + else + "#{host}:#{port}" + end + end +end diff --git a/lib/pleroma/web/fed_sockets/socket_info.ex b/lib/pleroma/web/fed_sockets/socket_info.ex new file mode 100644 index 000000000..d6fdffe1a --- /dev/null +++ b/lib/pleroma/web/fed_sockets/socket_info.ex @@ -0,0 +1,52 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.SocketInfo do + defstruct origin: nil, + pid: nil, + conn_pid: nil, + state: :default, + connected_until: nil + + alias Pleroma.Web.FedSockets.SocketInfo + @default_connection_duration 15 * 60 * 1000 + + def build(uri, conn_pid \\ nil) do + uri + |> build_origin() + |> build_pids(conn_pid) + |> touch() + end + + def touch(%SocketInfo{} = socket_info), + do: %{socket_info | connected_until: new_ttl()} + + def connect(%SocketInfo{} = socket_info), + do: %{socket_info | state: :connected} + + def expired?(%{connected_until: connected_until}), + do: connected_until < :erlang.monotonic_time(:millisecond) + + def origin(uri), + do: build_origin(uri).origin + + defp build_pids(socket_info, conn_pid), + do: struct(socket_info, pid: self(), conn_pid: conn_pid) + + defp build_origin(uri) when is_binary(uri), + do: uri |> URI.parse() |> build_origin + + defp build_origin(%{host: host, port: nil, scheme: scheme}), + do: build_origin(%{host: host, port: URI.default_port(scheme)}) + + defp build_origin(%{host: host, port: port}), + do: %SocketInfo{origin: "#{host}:#{port}"} + + defp new_ttl do + connection_duration = + Pleroma.Config.get([:fed_sockets, :connection_duration], @default_connection_duration) + + :erlang.monotonic_time(:millisecond) + connection_duration + end +end diff --git a/lib/pleroma/web/fed_sockets/supervisor.ex b/lib/pleroma/web/fed_sockets/supervisor.ex new file mode 100644 index 000000000..a5f4bebfb --- /dev/null +++ b/lib/pleroma/web/fed_sockets/supervisor.ex @@ -0,0 +1,59 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.Supervisor do + use Supervisor + import Cachex.Spec + + def start_link(opts) do + Supervisor.start_link(__MODULE__, opts, name: __MODULE__) + end + + def init(args) do + children = [ + build_cache(:fed_socket_fetches, args), + build_cache(:fed_socket_rejections, args), + {Registry, keys: :unique, name: FedSockets.Registry, meta: [rejected: %{}]} + ] + + opts = [strategy: :one_for_all, name: Pleroma.Web.Streamer.Supervisor] + Supervisor.init(children, opts) + end + + defp build_cache(name, args) do + opts = get_opts(name, args) + + %{ + id: String.to_atom("#{name}_cache"), + start: {Cachex, :start_link, [name, opts]}, + type: :worker + } + end + + defp get_opts(cache_name, args) + when cache_name in [:fed_socket_fetches, :fed_socket_rejections] do + default = get_opts_or_config(args, cache_name, :default, 15_000) + interval = get_opts_or_config(args, cache_name, :interval, 3_000) + lazy = get_opts_or_config(args, cache_name, :lazy, false) + + [expiration: expiration(default: default, interval: interval, lazy: lazy)] + end + + defp get_opts(name, args) do + Keyword.get(args, name, []) + end + + defp get_opts_or_config(args, name, key, default) do + args + |> Keyword.get(name, []) + |> Keyword.get(key) + |> case do + nil -> + 
Pleroma.Config.get([:fed_sockets, name, key], default) + + value -> + value + end + end +end diff --git a/lib/pleroma/web/federator/federator.ex b/lib/pleroma/web/federator/federator.ex index f5803578d..130654145 100644 --- a/lib/pleroma/web/federator/federator.ex +++ b/lib/pleroma/web/federator/federator.ex @@ -66,14 +66,17 @@ def perform(:publish, activity) do def perform(:incoming_ap_doc, params) do Logger.debug("Handling incoming AP activity") - params = Utils.normalize_params(params) + actor = + params + |> Map.get("actor") + |> Utils.get_ap_id() # NOTE: we use the actor ID to do the containment, this is fine because an # actor shouldn't be acting on objects outside their own AP server. - with {:ok, _user} <- ap_enabled_actor(params["actor"]), + with {_, {:ok, _user}} <- {:actor, ap_enabled_actor(actor)}, nil <- Activity.normalize(params["id"]), {_, :ok} <- - {:correct_origin?, Containment.contain_origin_from_id(params["actor"], params)}, + {:correct_origin?, Containment.contain_origin_from_id(actor, params)}, {:ok, activity} <- Transmogrifier.handle_incoming(params) do {:ok, activity} else @@ -85,10 +88,13 @@ def perform(:incoming_ap_doc, params) do Logger.debug("Already had #{params["id"]}") {:error, :already_present} + {:actor, e} -> + Logger.debug("Unhandled actor #{actor}, #{inspect(e)}") + {:error, e} + e -> # Just drop those for now - Logger.debug("Unhandled activity") - Logger.debug(Jason.encode!(params, pretty: true)) + Logger.debug(fn -> "Unhandled activity\n" <> Jason.encode!(params, pretty: true) end) {:error, e} end end diff --git a/lib/pleroma/web/instance_document.ex b/lib/pleroma/web/instance_document.ex new file mode 100644 index 000000000..df5caebf0 --- /dev/null +++ b/lib/pleroma/web/instance_document.ex @@ -0,0 +1,62 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.InstanceDocument do + alias Pleroma.Config + alias Pleroma.Web.Endpoint + + @instance_documents %{ + "terms-of-service" => "/static/terms-of-service.html", + "instance-panel" => "/instance/panel.html" + } + + @spec get(String.t()) :: {:ok, String.t()} | {:error, atom()} + def get(document_name) do + case Map.fetch(@instance_documents, document_name) do + {:ok, path} -> {:ok, path} + _ -> {:error, :not_found} + end + end + + @spec put(String.t(), String.t()) :: {:ok, String.t()} | {:error, atom()} + def put(document_name, origin_path) do + with {_, {:ok, destination_path}} <- + {:instance_document, Map.fetch(@instance_documents, document_name)}, + :ok <- put_file(origin_path, destination_path) do + {:ok, Path.join(Endpoint.url(), destination_path)} + else + {:instance_document, :error} -> {:error, :not_found} + error -> error + end + end + + @spec delete(String.t()) :: :ok | {:error, atom()} + def delete(document_name) do + with {_, {:ok, path}} <- {:instance_document, Map.fetch(@instance_documents, document_name)}, + instance_static_dir_path <- instance_static_dir(path), + :ok <- File.rm(instance_static_dir_path) do + :ok + else + {:instance_document, :error} -> {:error, :not_found} + {:error, :enoent} -> {:error, :not_found} + error -> error + end + end + + defp put_file(origin_path, destination_path) do + with destination <- instance_static_dir(destination_path), + {_, :ok} <- {:mkdir_p, File.mkdir_p(Path.dirname(destination))}, + {_, {:ok, _}} <- {:copy, File.copy(origin_path, destination)} do + :ok + else + {error, _} -> {:error, error} + end + end + + defp instance_static_dir(filename) do 
+ [:instance, :static_dir] + |> Config.get!() + |> Path.join(filename) + end +end diff --git a/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex b/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex index 9f09550e1..57c0be5fe 100644 --- a/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex +++ b/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex @@ -5,6 +5,8 @@ defmodule Pleroma.Web.MastodonAPI.AuthController do use Pleroma.Web, :controller + import Pleroma.Web.ControllerHelper, only: [json_response: 3] + alias Pleroma.User alias Pleroma.Web.OAuth.App alias Pleroma.Web.OAuth.Authorization @@ -61,9 +63,7 @@ def password_reset(conn, params) do TwitterAPI.password_reset(nickname_or_email) - conn - |> put_status(:no_content) - |> json("") + json_response(conn, :no_content, "") end defp local_mastodon_root_path(conn) do diff --git a/lib/pleroma/web/mastodon_api/views/account_view.ex b/lib/pleroma/web/mastodon_api/views/account_view.ex index d2a30a548..121ba1693 100644 --- a/lib/pleroma/web/mastodon_api/views/account_view.ex +++ b/lib/pleroma/web/mastodon_api/views/account_view.ex @@ -181,8 +181,10 @@ defp do_render("show.json", %{user: user} = opts) do user = User.sanitize_html(user, User.html_filter_policy(opts[:for])) display_name = user.name || user.nickname - image = User.avatar_url(user) |> MediaProxy.url() + avatar = User.avatar_url(user) |> MediaProxy.url() + avatar_static = User.avatar_url(user) |> MediaProxy.preview_url(static: true) header = User.banner_url(user) |> MediaProxy.url() + header_static = User.banner_url(user) |> MediaProxy.preview_url(static: true) following_count = if !user.hide_follows_count or !user.hide_follows or opts[:for] == user do @@ -247,10 +249,10 @@ defp do_render("show.json", %{user: user} = opts) do statuses_count: user.note_count, note: user.bio, url: user.uri || user.ap_id, - avatar: image, - avatar_static: image, + avatar: avatar, + avatar_static: avatar_static, header: header, - header_static: header, + header_static: header_static, emojis: emojis, fields: user.fields, bot: bot, diff --git a/lib/pleroma/web/mastodon_api/views/status_view.ex b/lib/pleroma/web/mastodon_api/views/status_view.ex index ca42917fc..435bcde15 100644 --- a/lib/pleroma/web/mastodon_api/views/status_view.ex +++ b/lib/pleroma/web/mastodon_api/views/status_view.ex @@ -55,23 +55,6 @@ defp get_replied_to_activities(activities) do end) end - def get_user(ap_id, fake_record_fallback \\ true) do - cond do - user = User.get_cached_by_ap_id(ap_id) -> - user - - user = User.get_by_guessed_nickname(ap_id) -> - user - - fake_record_fallback -> - # TODO: refactor (fake records is never a good idea) - User.error_user(ap_id) - - true -> - nil - end - end - defp get_context_id(%{data: %{"context_id" => context_id}}) when not is_nil(context_id), do: context_id @@ -119,7 +102,7 @@ def render("index.json", opts) do # Note: unresolved users are filtered out actors = (activities ++ parent_activities) - |> Enum.map(&get_user(&1.data["actor"], false)) + |> Enum.map(&CommonAPI.get_user(&1.data["actor"], false)) |> Enum.filter(& &1) UserRelationship.view_relationships_option(reading_user, actors, subset: :source_mutes) @@ -138,7 +121,7 @@ def render( "show.json", %{activity: %{data: %{"type" => "Announce", "object" => _object}} = activity} = opts ) do - user = get_user(activity.data["actor"]) + user = CommonAPI.get_user(activity.data["actor"]) created_at = Utils.to_masto_date(activity.data["published"]) activity_object = Object.normalize(activity) @@ -211,7 +194,7 @@ def 
render( def render("show.json", %{activity: %{data: %{"object" => _object}} = activity} = opts) do object = Object.normalize(activity) - user = get_user(activity.data["actor"]) + user = CommonAPI.get_user(activity.data["actor"]) user_follower_address = user.follower_address like_count = object.data["like_count"] || 0 @@ -265,7 +248,7 @@ def render("show.json", %{activity: %{data: %{"object" => _object}} = activity} reply_to = get_reply_to(activity, opts) - reply_to_user = reply_to && get_user(reply_to.data["actor"]) + reply_to_user = reply_to && CommonAPI.get_user(reply_to.data["actor"]) content = object @@ -432,6 +415,7 @@ def render("attachment.json", %{attachment: attachment}) do [attachment_url | _] = attachment["url"] media_type = attachment_url["mediaType"] || attachment_url["mimeType"] || "image" href = attachment_url["href"] |> MediaProxy.url() + href_preview = attachment_url["href"] |> MediaProxy.preview_url() type = cond do @@ -447,7 +431,7 @@ def render("attachment.json", %{attachment: attachment}) do id: to_string(attachment["id"] || hash_id), url: href, remote_url: href, - preview_url: href, + preview_url: href_preview, text_url: href, type: type, description: attachment["name"], diff --git a/lib/pleroma/web/mastodon_api/websocket_handler.ex b/lib/pleroma/web/mastodon_api/websocket_handler.ex index cf923ded8..439cdd716 100644 --- a/lib/pleroma/web/mastodon_api/websocket_handler.ex +++ b/lib/pleroma/web/mastodon_api/websocket_handler.ex @@ -23,8 +23,8 @@ def init(%{qs: qs} = req, state) do with params <- Enum.into(:cow_qs.parse_qs(qs), %{}), sec_websocket <- :cowboy_req.header("sec-websocket-protocol", req, nil), access_token <- Map.get(params, "access_token"), - {:ok, user} <- authenticate_request(access_token, sec_websocket), - {:ok, topic} <- Streamer.get_topic(Map.get(params, "stream"), user, params) do + {:ok, user, oauth_token} <- authenticate_request(access_token, sec_websocket), + {:ok, topic} <- Streamer.get_topic(params["stream"], user, oauth_token, params) do req = if sec_websocket do :cowboy_req.set_resp_header("sec-websocket-protocol", sec_websocket, req) @@ -117,7 +117,7 @@ def terminate(reason, _req, state) do # Public streams without authentication. defp authenticate_request(nil, nil) do - {:ok, nil} + {:ok, nil, nil} end # Authenticated streams. 
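A short sketch of the authentication contract the `init/2` clause above now relies on (the token string and scope values below are purely illustrative):

    # anonymous socket: only public streams can be resolved
    authenticate_request(nil, nil)
    #=> {:ok, nil, nil}

    # token taken from ?access_token=... or from the sec-websocket-protocol header
    authenticate_request("valid-token", nil)
    #=> {:ok, %User{}, %Token{user_id: user.id, scopes: ["read"]}}

    authenticate_request("unknown-token", nil)
    #=> {:error, :unauthorized}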
@@ -125,9 +125,9 @@ defp authenticate_request(access_token, sec_websocket) do token = access_token || sec_websocket with true <- is_bitstring(token), - %Token{user_id: user_id} <- Repo.get_by(Token, token: token), + oauth_token = %Token{user_id: user_id} <- Repo.get_by(Token, token: token), user = %User{} <- User.get_cached_by_id(user_id) do - {:ok, user} + {:ok, user, oauth_token} else _ -> {:error, :unauthorized} end diff --git a/lib/pleroma/web/media_proxy/invalidation.ex b/lib/pleroma/web/media_proxy/invalidation.ex index 5808861e6..4f4340478 100644 --- a/lib/pleroma/web/media_proxy/invalidation.ex +++ b/lib/pleroma/web/media_proxy/invalidation.ex @@ -33,6 +33,8 @@ defp do_purge(urls) do def prepare_urls(urls) do urls |> List.wrap() - |> Enum.map(&MediaProxy.url/1) + |> Enum.map(fn url -> [MediaProxy.url(url), MediaProxy.preview_url(url)] end) + |> List.flatten() + |> Enum.uniq() end end diff --git a/lib/pleroma/web/media_proxy/media_proxy.ex b/lib/pleroma/web/media_proxy/media_proxy.ex index e18dd8224..8656b8cad 100644 --- a/lib/pleroma/web/media_proxy/media_proxy.ex +++ b/lib/pleroma/web/media_proxy/media_proxy.ex @@ -4,6 +4,7 @@ defmodule Pleroma.Web.MediaProxy do alias Pleroma.Config + alias Pleroma.Helpers.UriHelper alias Pleroma.Upload alias Pleroma.Web alias Pleroma.Web.MediaProxy.Invalidation @@ -40,27 +41,35 @@ def url(url) when is_nil(url) or url == "", do: nil def url("/" <> _ = url), do: url def url(url) do - if disabled?() or not url_proxiable?(url) do - url - else + if enabled?() and url_proxiable?(url) do encode_url(url) + else + url end end @spec url_proxiable?(String.t()) :: boolean() def url_proxiable?(url) do - if local?(url) or whitelisted?(url) do - false + not local?(url) and not whitelisted?(url) + end + + def preview_url(url, preview_params \\ []) do + if preview_enabled?() do + encode_preview_url(url, preview_params) else - true + url(url) end end - defp disabled?, do: !Config.get([:media_proxy, :enabled], false) + def enabled?, do: Config.get([:media_proxy, :enabled], false) - defp local?(url), do: String.starts_with?(url, Pleroma.Web.base_url()) + # Note: media proxy must be enabled for media preview proxy in order to load all + # non-local non-whitelisted URLs through it and be sure that body size constraint is preserved. 
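  # Illustrative behaviour of `preview_url/2` (the URL and the <sig64>/<url64> placeholders
  # below are examples only):
  #
  #   MediaProxy.preview_url("https://remote.example/image.png")
  #   #=> "<base_url>/proxy/preview/<sig64>/<url64>/image.png"  when the preview proxy is enabled
  #   #=> MediaProxy.url("https://remote.example/image.png")    otherwise, i.e. the plain proxy URL,
  #                                                             or the original URL if the media proxy is off
  #
  #   MediaProxy.preview_url(url, static: true)
  #   # keyword params end up as query params (?static=true), as used for avatar_static/header_static above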
+ def preview_enabled?, do: enabled?() and !!Config.get([:media_preview_proxy, :enabled]) - defp whitelisted?(url) do + def local?(url), do: String.starts_with?(url, Pleroma.Web.base_url()) + + def whitelisted?(url) do %{host: domain} = URI.parse(url) mediaproxy_whitelist_domains = @@ -85,17 +94,29 @@ defp maybe_get_domain_from_url("http" <> _ = url) do defp maybe_get_domain_from_url(domain), do: domain - def encode_url(url) do + defp base64_sig64(url) do base64 = Base.url_encode64(url, @base64_opts) sig64 = base64 - |> signed_url + |> signed_url() |> Base.url_encode64(@base64_opts) + {base64, sig64} + end + + def encode_url(url) do + {base64, sig64} = base64_sig64(url) + build_url(sig64, base64, filename(url)) end + def encode_preview_url(url, preview_params \\ []) do + {base64, sig64} = base64_sig64(url) + + build_preview_url(sig64, base64, filename(url), preview_params) + end + def decode_url(sig, url) do with {:ok, sig} <- Base.url_decode64(sig, @base64_opts), signature when signature == sig <- signed_url(url) do @@ -113,10 +134,14 @@ def filename(url_or_path) do if path = URI.parse(url_or_path).path, do: Path.basename(path) end - def build_url(sig_base64, url_base64, filename \\ nil) do + def base_url do + Config.get([:media_proxy, :base_url], Web.base_url()) + end + + defp proxy_url(path, sig_base64, url_base64, filename) do [ - Config.get([:media_proxy, :base_url], Web.base_url()), - "proxy", + base_url(), + path, sig_base64, url_base64, filename @@ -124,4 +149,38 @@ def build_url(sig_base64, url_base64, filename \\ nil) do |> Enum.filter(& &1) |> Path.join() end + + def build_url(sig_base64, url_base64, filename \\ nil) do + proxy_url("proxy", sig_base64, url_base64, filename) + end + + def build_preview_url(sig_base64, url_base64, filename \\ nil, preview_params \\ []) do + uri = proxy_url("proxy/preview", sig_base64, url_base64, filename) + + UriHelper.modify_uri_params(uri, preview_params) + end + + def verify_request_path_and_url( + %Plug.Conn{params: %{"filename" => _}, request_path: request_path}, + url + ) do + verify_request_path_and_url(request_path, url) + end + + def verify_request_path_and_url(request_path, url) when is_binary(request_path) do + filename = filename(url) + + if filename && not basename_matches?(request_path, filename) do + {:wrong_filename, filename} + else + :ok + end + end + + def verify_request_path_and_url(_, _), do: :ok + + defp basename_matches?(path, filename) do + basename = Path.basename(path) + basename == filename or URI.decode(basename) == filename or URI.encode(basename) == filename + end end diff --git a/lib/pleroma/web/media_proxy/media_proxy_controller.ex b/lib/pleroma/web/media_proxy/media_proxy_controller.ex index 9a64b0ef3..90651ed9b 100644 --- a/lib/pleroma/web/media_proxy/media_proxy_controller.ex +++ b/lib/pleroma/web/media_proxy/media_proxy_controller.ex @@ -5,44 +5,201 @@ defmodule Pleroma.Web.MediaProxy.MediaProxyController do use Pleroma.Web, :controller + alias Pleroma.Config + alias Pleroma.Helpers.MediaHelper + alias Pleroma.Helpers.UriHelper alias Pleroma.ReverseProxy alias Pleroma.Web.MediaProxy + alias Plug.Conn - @default_proxy_opts [max_body_length: 25 * 1_048_576, http: [follow_redirect: true]] - - def remote(conn, %{"sig" => sig64, "url" => url64} = params) do - with config <- Pleroma.Config.get([:media_proxy], []), - true <- Keyword.get(config, :enabled, false), + def remote(conn, %{"sig" => sig64, "url" => url64}) do + with {_, true} <- {:enabled, MediaProxy.enabled?()}, {:ok, url} <- MediaProxy.decode_url(sig64, 
url64), {_, false} <- {:in_banned_urls, MediaProxy.in_banned_urls(url)}, - :ok <- filename_matches(params, conn.request_path, url) do - ReverseProxy.call(conn, url, Keyword.get(config, :proxy_opts, @default_proxy_opts)) + :ok <- MediaProxy.verify_request_path_and_url(conn, url) do + ReverseProxy.call(conn, url, media_proxy_opts()) else - error when error in [false, {:in_banned_urls, true}] -> - send_resp(conn, 404, Plug.Conn.Status.reason_phrase(404)) + {:enabled, false} -> + send_resp(conn, 404, Conn.Status.reason_phrase(404)) + + {:in_banned_urls, true} -> + send_resp(conn, 404, Conn.Status.reason_phrase(404)) {:error, :invalid_signature} -> - send_resp(conn, 403, Plug.Conn.Status.reason_phrase(403)) + send_resp(conn, 403, Conn.Status.reason_phrase(403)) {:wrong_filename, filename} -> redirect(conn, external: MediaProxy.build_url(sig64, url64, filename)) end end - def filename_matches(%{"filename" => _} = _, path, url) do - filename = MediaProxy.filename(url) - - if filename && does_not_match(path, filename) do - {:wrong_filename, filename} + def preview(%Conn{} = conn, %{"sig" => sig64, "url" => url64}) do + with {_, true} <- {:enabled, MediaProxy.preview_enabled?()}, + {:ok, url} <- MediaProxy.decode_url(sig64, url64), + :ok <- MediaProxy.verify_request_path_and_url(conn, url) do + handle_preview(conn, url) else - :ok + {:enabled, false} -> + send_resp(conn, 404, Conn.Status.reason_phrase(404)) + + {:error, :invalid_signature} -> + send_resp(conn, 403, Conn.Status.reason_phrase(403)) + + {:wrong_filename, filename} -> + redirect(conn, external: MediaProxy.build_preview_url(sig64, url64, filename)) end end - def filename_matches(_, _, _), do: :ok + defp handle_preview(conn, url) do + media_proxy_url = MediaProxy.url(url) - defp does_not_match(path, filename) do - basename = Path.basename(path) - basename != filename and URI.decode(basename) != filename and URI.encode(basename) != filename + with {:ok, %{status: status} = head_response} when status in 200..299 <- + Pleroma.HTTP.request("head", media_proxy_url, [], [], pool: :media) do + content_type = Tesla.get_header(head_response, "content-type") + content_length = Tesla.get_header(head_response, "content-length") + content_length = content_length && String.to_integer(content_length) + static = conn.params["static"] in ["true", true] + + cond do + static and content_type == "image/gif" -> + handle_jpeg_preview(conn, media_proxy_url) + + static -> + drop_static_param_and_redirect(conn) + + content_type == "image/gif" -> + redirect(conn, external: media_proxy_url) + + min_content_length_for_preview() > 0 and content_length > 0 and + content_length < min_content_length_for_preview() -> + redirect(conn, external: media_proxy_url) + + true -> + handle_preview(content_type, conn, media_proxy_url) + end + else + # If HEAD failed, redirecting to media proxy URI doesn't make much sense; returning an error + {_, %{status: status}} -> + send_resp(conn, :failed_dependency, "Can't fetch HTTP headers (HTTP #{status}).") + + {:error, :recv_response_timeout} -> + send_resp(conn, :failed_dependency, "HEAD request timeout.") + + _ -> + send_resp(conn, :failed_dependency, "Can't fetch HTTP headers.") + end + end + + defp handle_preview("image/png" <> _ = _content_type, conn, media_proxy_url) do + handle_png_preview(conn, media_proxy_url) + end + + defp handle_preview("image/" <> _ = _content_type, conn, media_proxy_url) do + handle_jpeg_preview(conn, media_proxy_url) + end + + defp handle_preview("video/" <> _ = _content_type, conn, media_proxy_url) do + 
handle_video_preview(conn, media_proxy_url) + end + + defp handle_preview(_unsupported_content_type, conn, media_proxy_url) do + fallback_on_preview_error(conn, media_proxy_url) + end + + defp handle_png_preview(conn, media_proxy_url) do + quality = Config.get!([:media_preview_proxy, :image_quality]) + {thumbnail_max_width, thumbnail_max_height} = thumbnail_max_dimensions() + + with {:ok, thumbnail_binary} <- + MediaHelper.image_resize( + media_proxy_url, + %{ + max_width: thumbnail_max_width, + max_height: thumbnail_max_height, + quality: quality, + format: "png" + } + ) do + conn + |> put_preview_response_headers(["image/png", "preview.png"]) + |> send_resp(200, thumbnail_binary) + else + _ -> + fallback_on_preview_error(conn, media_proxy_url) + end + end + + defp handle_jpeg_preview(conn, media_proxy_url) do + quality = Config.get!([:media_preview_proxy, :image_quality]) + {thumbnail_max_width, thumbnail_max_height} = thumbnail_max_dimensions() + + with {:ok, thumbnail_binary} <- + MediaHelper.image_resize( + media_proxy_url, + %{max_width: thumbnail_max_width, max_height: thumbnail_max_height, quality: quality} + ) do + conn + |> put_preview_response_headers() + |> send_resp(200, thumbnail_binary) + else + _ -> + fallback_on_preview_error(conn, media_proxy_url) + end + end + + defp handle_video_preview(conn, media_proxy_url) do + with {:ok, thumbnail_binary} <- + MediaHelper.video_framegrab(media_proxy_url) do + conn + |> put_preview_response_headers() + |> send_resp(200, thumbnail_binary) + else + _ -> + fallback_on_preview_error(conn, media_proxy_url) + end + end + + defp drop_static_param_and_redirect(conn) do + uri_without_static_param = + conn + |> current_url() + |> UriHelper.modify_uri_params(%{}, ["static"]) + + redirect(conn, external: uri_without_static_param) + end + + defp fallback_on_preview_error(conn, media_proxy_url) do + redirect(conn, external: media_proxy_url) + end + + defp put_preview_response_headers( + conn, + [content_type, filename] = _content_info \\ ["image/jpeg", "preview.jpg"] + ) do + conn + |> put_resp_header("content-type", content_type) + |> put_resp_header("content-disposition", "inline; filename=\"#{filename}\"") + |> put_resp_header("cache-control", ReverseProxy.default_cache_control_header()) + end + + defp thumbnail_max_dimensions do + config = media_preview_proxy_config() + + thumbnail_max_width = Keyword.fetch!(config, :thumbnail_max_width) + thumbnail_max_height = Keyword.fetch!(config, :thumbnail_max_height) + + {thumbnail_max_width, thumbnail_max_height} + end + + defp min_content_length_for_preview do + Keyword.get(media_preview_proxy_config(), :min_content_length, 0) + end + + defp media_preview_proxy_config do + Config.get!([:media_preview_proxy]) + end + + defp media_proxy_opts do + Config.get([:media_proxy, :proxy_opts], []) end end diff --git a/lib/pleroma/web/metadata/restrict_indexing.ex b/lib/pleroma/web/metadata/restrict_indexing.ex index f15607896..a1dcb6e15 100644 --- a/lib/pleroma/web/metadata/restrict_indexing.ex +++ b/lib/pleroma/web/metadata/restrict_indexing.ex @@ -10,7 +10,9 @@ defmodule Pleroma.Web.Metadata.Providers.RestrictIndexing do """ @impl true - def build_tags(%{user: %{local: false}}) do + def build_tags(%{user: %{local: true, discoverable: true}}), do: [] + + def build_tags(_) do [ {:meta, [ @@ -19,7 +21,4 @@ def build_tags(%{user: %{local: false}}) do ], []} ] end - - @impl true - def build_tags(%{user: %{local: true}}), do: [] end diff --git a/lib/pleroma/web/metadata/utils.ex b/lib/pleroma/web/metadata/utils.ex 
index 2f0dfb474..8a206e019 100644 --- a/lib/pleroma/web/metadata/utils.ex +++ b/lib/pleroma/web/metadata/utils.ex @@ -38,7 +38,7 @@ def scrub_html(content) when is_binary(content) do def scrub_html(content), do: content def attachment_url(url) do - MediaProxy.url(url) + MediaProxy.preview_url(url) end def user_name_string(user) do diff --git a/lib/pleroma/web/oauth/oauth_controller.ex b/lib/pleroma/web/oauth/oauth_controller.ex index 26e68be42..a4152e840 100644 --- a/lib/pleroma/web/oauth/oauth_controller.ex +++ b/lib/pleroma/web/oauth/oauth_controller.ex @@ -119,7 +119,7 @@ defp handle_existing_authorization( redirect_uri = redirect_uri(conn, redirect_uri) url_params = %{access_token: token.token} url_params = Maps.put_if_present(url_params, :state, params["state"]) - url = UriHelper.append_uri_params(redirect_uri, url_params) + url = UriHelper.modify_uri_params(redirect_uri, url_params) redirect(conn, external: url) else conn @@ -161,7 +161,7 @@ def after_create_authorization(%Plug.Conn{} = conn, %Authorization{} = auth, %{ redirect_uri = redirect_uri(conn, redirect_uri) url_params = %{code: auth.token} url_params = Maps.put_if_present(url_params, :state, auth_attrs["state"]) - url = UriHelper.append_uri_params(redirect_uri, url_params) + url = UriHelper.modify_uri_params(redirect_uri, url_params) redirect(conn, external: url) else conn diff --git a/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex b/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex index e8a1746d4..e667831c5 100644 --- a/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex +++ b/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex @@ -4,6 +4,8 @@ defmodule Pleroma.Web.PleromaAPI.ChatController do use Pleroma.Web, :controller + import Pleroma.Web.ControllerHelper, only: [add_link_headers: 2] + alias Pleroma.Activity alias Pleroma.Chat alias Pleroma.Chat.MessageReference @@ -47,7 +49,7 @@ def delete_message(%{assigns: %{user: %{id: user_id} = user}} = conn, %{ }) do with %MessageReference{} = cm_ref <- MessageReference.get_by_id(message_id), - ^chat_id <- cm_ref.chat_id |> to_string(), + ^chat_id <- to_string(cm_ref.chat_id), %Chat{user_id: ^user_id} <- Chat.get_by_id(chat_id), {:ok, _} <- remove_or_delete(cm_ref, user) do conn @@ -68,18 +70,13 @@ defp remove_or_delete( end end - defp remove_or_delete(cm_ref, _) do - cm_ref - |> MessageReference.delete() - end + defp remove_or_delete(cm_ref, _), do: MessageReference.delete(cm_ref) def post_chat_message( - %{body_params: params, assigns: %{user: %{id: user_id} = user}} = conn, - %{ - id: id - } + %{body_params: params, assigns: %{user: user}} = conn, + %{id: id} ) do - with %Chat{} = chat <- Repo.get_by(Chat, id: id, user_id: user_id), + with {:ok, chat} <- Chat.get_by_user_and_id(user, id), %User{} = recipient <- User.get_cached_by_ap_id(chat.recipient), {:ok, activity} <- CommonAPI.post_chat_message(user, recipient, params[:content], @@ -90,16 +87,25 @@ def post_chat_message( conn |> put_view(MessageReferenceView) |> render("show.json", chat_message_reference: cm_ref) + else + {:reject, message} -> + conn + |> put_status(:unprocessable_entity) + |> json(%{error: message}) + + {:error, message} -> + conn + |> put_status(:bad_request) + |> json(%{error: message}) end end - def mark_message_as_read(%{assigns: %{user: %{id: user_id}}} = conn, %{ - id: chat_id, - message_id: message_id - }) do - with %MessageReference{} = cm_ref <- - MessageReference.get_by_id(message_id), - ^chat_id <- cm_ref.chat_id |> to_string(), + def mark_message_as_read( + 
%{assigns: %{user: %{id: user_id}}} = conn, + %{id: chat_id, message_id: message_id} + ) do + with %MessageReference{} = cm_ref <- MessageReference.get_by_id(message_id), + ^chat_id <- to_string(cm_ref.chat_id), %Chat{user_id: ^user_id} <- Chat.get_by_id(chat_id), {:ok, cm_ref} <- MessageReference.mark_as_read(cm_ref) do conn @@ -109,36 +115,28 @@ def mark_message_as_read(%{assigns: %{user: %{id: user_id}}} = conn, %{ end def mark_as_read( - %{ - body_params: %{last_read_id: last_read_id}, - assigns: %{user: %{id: user_id}} - } = conn, + %{body_params: %{last_read_id: last_read_id}, assigns: %{user: user}} = conn, %{id: id} ) do - with %Chat{} = chat <- Repo.get_by(Chat, id: id, user_id: user_id), - {_n, _} <- - MessageReference.set_all_seen_for_chat(chat, last_read_id) do + with {:ok, chat} <- Chat.get_by_user_and_id(user, id), + {_n, _} <- MessageReference.set_all_seen_for_chat(chat, last_read_id) do conn |> put_view(ChatView) |> render("show.json", chat: chat) end end - def messages(%{assigns: %{user: %{id: user_id}}} = conn, %{id: id} = params) do - with %Chat{} = chat <- Repo.get_by(Chat, id: id, user_id: user_id) do - cm_refs = + def messages(%{assigns: %{user: user}} = conn, %{id: id} = params) do + with {:ok, chat} <- Chat.get_by_user_and_id(user, id) do + chat_message_refs = chat |> MessageReference.for_chat_query() |> Pagination.fetch_paginated(params) conn + |> add_link_headers(chat_message_refs) |> put_view(MessageReferenceView) - |> render("index.json", chat_message_references: cm_refs) - else - _ -> - conn - |> put_status(:not_found) - |> json(%{error: "not found"}) + |> render("index.json", chat_message_references: chat_message_refs) end end @@ -146,11 +144,8 @@ def index(%{assigns: %{user: %{id: user_id} = user}} = conn, _params) do blocked_ap_ids = User.blocked_users_ap_ids(user) chats = - from(c in Chat, - where: c.user_id == ^user_id, - where: c.recipient not in ^blocked_ap_ids, - order_by: [desc: c.updated_at] - ) + Chat.for_user_query(user_id) + |> where([c], c.recipient not in ^blocked_ap_ids) |> Repo.all() conn @@ -158,8 +153,8 @@ def index(%{assigns: %{user: %{id: user_id} = user}} = conn, _params) do |> render("index.json", chats: chats) end - def create(%{assigns: %{user: user}} = conn, params) do - with %User{ap_id: recipient} <- User.get_by_id(params[:id]), + def create(%{assigns: %{user: user}} = conn, %{id: id}) do + with %User{ap_id: recipient} <- User.get_cached_by_id(id), {:ok, %Chat{} = chat} <- Chat.get_or_create(user.id, recipient) do conn |> put_view(ChatView) @@ -167,8 +162,8 @@ def create(%{assigns: %{user: user}} = conn, params) do end end - def show(%{assigns: %{user: user}} = conn, params) do - with %Chat{} = chat <- Repo.get_by(Chat, user_id: user.id, id: params[:id]) do + def show(%{assigns: %{user: user}} = conn, %{id: id}) do + with {:ok, chat} <- Chat.get_by_user_and_id(user, id) do conn |> put_view(ChatView) |> render("show.json", chat: chat) diff --git a/lib/pleroma/web/pleroma_api/controllers/emoji_file_controller.ex b/lib/pleroma/web/pleroma_api/controllers/emoji_file_controller.ex new file mode 100644 index 000000000..71c53df1d --- /dev/null +++ b/lib/pleroma/web/pleroma_api/controllers/emoji_file_controller.ex @@ -0,0 +1,133 @@ +defmodule Pleroma.Web.PleromaAPI.EmojiFileController do + use Pleroma.Web, :controller + + alias Pleroma.Emoji.Pack + alias Pleroma.Web.ApiSpec + + plug(Pleroma.Web.ApiSpec.CastAndValidate) + + plug( + Pleroma.Plugs.OAuthScopesPlug, + %{scopes: ["write"], admin: true} + when action in [ + :create, + :update, + 
:delete + ] + ) + + defdelegate open_api_operation(action), to: ApiSpec.PleromaEmojiFileOperation + + def create(%{body_params: params} = conn, %{name: pack_name}) do + filename = params[:filename] || get_filename(params[:file]) + shortcode = params[:shortcode] || Path.basename(filename, Path.extname(filename)) + + with {:ok, pack} <- Pack.load_pack(pack_name), + {:ok, file} <- get_file(params[:file]), + {:ok, pack} <- Pack.add_file(pack, shortcode, filename, file) do + json(conn, pack.files) + else + {:error, :already_exists} -> + conn + |> put_status(:conflict) + |> json(%{error: "An emoji with the \"#{shortcode}\" shortcode already exists"}) + + {:error, :empty_values} -> + conn + |> put_status(:unprocessable_entity) + |> json(%{error: "pack name, shortcode or filename cannot be empty"}) + + {:error, _} = error -> + handle_error(conn, error, %{pack_name: pack_name}) + end + end + + def update(%{body_params: %{shortcode: shortcode} = params} = conn, %{name: pack_name}) do + new_shortcode = params[:new_shortcode] + new_filename = params[:new_filename] + force = params[:force] + + with {:ok, pack} <- Pack.load_pack(pack_name), + {:ok, pack} <- Pack.update_file(pack, shortcode, new_shortcode, new_filename, force) do + json(conn, pack.files) + else + {:error, :already_exists} -> + conn + |> put_status(:conflict) + |> json(%{ + error: + "New shortcode \"#{new_shortcode}\" is already used. If you want to override emoji use 'force' option" + }) + + {:error, :empty_values} -> + conn + |> put_status(:unprocessable_entity) + |> json(%{error: "new_shortcode or new_filename cannot be empty"}) + + {:error, _} = error -> + handle_error(conn, error, %{pack_name: pack_name, code: shortcode}) + end + end + + def delete(conn, %{name: pack_name, shortcode: shortcode}) do + with {:ok, pack} <- Pack.load_pack(pack_name), + {:ok, pack} <- Pack.delete_file(pack, shortcode) do + json(conn, pack.files) + else + {:error, :empty_values} -> + conn + |> put_status(:unprocessable_entity) + |> json(%{error: "pack name or shortcode cannot be empty"}) + + {:error, _} = error -> + handle_error(conn, error, %{pack_name: pack_name, code: shortcode}) + end + end + + defp handle_error(conn, {:error, :doesnt_exist}, %{code: emoji_code}) do + conn + |> put_status(:bad_request) + |> json(%{error: "Emoji \"#{emoji_code}\" does not exist"}) + end + + defp handle_error(conn, {:error, :not_found}, %{pack_name: pack_name}) do + conn + |> put_status(:not_found) + |> json(%{error: "pack \"#{pack_name}\" is not found"}) + end + + defp handle_error(conn, {:error, _}, _) do + render_error( + conn, + :internal_server_error, + "Unexpected error occurred while adding file to pack." 
+ ) + end + + defp get_filename(%Plug.Upload{filename: filename}), do: filename + defp get_filename(url) when is_binary(url), do: Path.basename(url) + + def get_file(%Plug.Upload{} = file), do: {:ok, file} + + def get_file(url) when is_binary(url) do + with {:ok, %Tesla.Env{body: body, status: code, headers: headers}} + when code in 200..299 <- Pleroma.HTTP.get(url) do + path = Plug.Upload.random_file!("emoji") + + content_type = + case List.keyfind(headers, "content-type", 0) do + {"content-type", value} -> value + nil -> nil + end + + File.write(path, body) + + {:ok, + %Plug.Upload{ + filename: Path.basename(url), + path: path, + content_type: content_type + }} + end + end +end diff --git a/lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex b/lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex index 657f46324..6696f8b92 100644 --- a/lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex +++ b/lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex @@ -14,10 +14,7 @@ defmodule Pleroma.Web.PleromaAPI.EmojiPackController do :download, :create, :update, - :delete, - :add_file, - :update_file, - :delete_file + :delete ] ) @@ -26,8 +23,9 @@ defmodule Pleroma.Web.PleromaAPI.EmojiPackController do defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.PleromaEmojiPackOperation - def remote(conn, %{url: url}) do - with {:ok, packs} <- Pack.list_remote(url) do + def remote(conn, params) do + with {:ok, packs} <- + Pack.list_remote(url: params.url, page_size: params.page_size, page: params.page) do json(conn, packs) else {:error, :not_shareable} -> @@ -184,105 +182,6 @@ def update(%{body_params: %{metadata: metadata}} = conn, %{name: name}) do end end - def add_file(%{body_params: params} = conn, %{name: name}) do - filename = params[:filename] || get_filename(params[:file]) - shortcode = params[:shortcode] || Path.basename(filename, Path.extname(filename)) - - with {:ok, pack} <- Pack.add_file(name, shortcode, filename, params[:file]) do - json(conn, pack.files) - else - {:error, :already_exists} -> - conn - |> put_status(:conflict) - |> json(%{error: "An emoji with the \"#{shortcode}\" shortcode already exists"}) - - {:error, :not_found} -> - conn - |> put_status(:bad_request) - |> json(%{error: "pack \"#{name}\" is not found"}) - - {:error, :empty_values} -> - conn - |> put_status(:bad_request) - |> json(%{error: "pack name, shortcode or filename cannot be empty"}) - - {:error, _} -> - render_error( - conn, - :internal_server_error, - "Unexpected error occurred while adding file to pack." - ) - end - end - - def update_file(%{body_params: %{shortcode: shortcode} = params} = conn, %{name: name}) do - new_shortcode = params[:new_shortcode] - new_filename = params[:new_filename] - force = params[:force] - - with {:ok, pack} <- Pack.update_file(name, shortcode, new_shortcode, new_filename, force) do - json(conn, pack.files) - else - {:error, :doesnt_exist} -> - conn - |> put_status(:bad_request) - |> json(%{error: "Emoji \"#{shortcode}\" does not exist"}) - - {:error, :already_exists} -> - conn - |> put_status(:conflict) - |> json(%{ - error: - "New shortcode \"#{new_shortcode}\" is already used. 
If you want to override emoji use 'force' option" - }) - - {:error, :not_found} -> - conn - |> put_status(:bad_request) - |> json(%{error: "pack \"#{name}\" is not found"}) - - {:error, :empty_values} -> - conn - |> put_status(:bad_request) - |> json(%{error: "new_shortcode or new_filename cannot be empty"}) - - {:error, _} -> - render_error( - conn, - :internal_server_error, - "Unexpected error occurred while updating file in pack." - ) - end - end - - def delete_file(conn, %{name: name, shortcode: shortcode}) do - with {:ok, pack} <- Pack.delete_file(name, shortcode) do - json(conn, pack.files) - else - {:error, :doesnt_exist} -> - conn - |> put_status(:bad_request) - |> json(%{error: "Emoji \"#{shortcode}\" does not exist"}) - - {:error, :not_found} -> - conn - |> put_status(:bad_request) - |> json(%{error: "pack \"#{name}\" is not found"}) - - {:error, :empty_values} -> - conn - |> put_status(:bad_request) - |> json(%{error: "pack name or shortcode cannot be empty"}) - - {:error, _} -> - render_error( - conn, - :internal_server_error, - "Unexpected error occurred while removing file from pack." - ) - end - end - def import_from_filesystem(conn, _params) do with {:ok, names} <- Pack.import_from_filesystem() do json(conn, names) @@ -298,7 +197,4 @@ def import_from_filesystem(conn, _params) do |> json(%{error: "Error accessing emoji pack directory"}) end end - - defp get_filename(%Plug.Upload{filename: filename}), do: filename - defp get_filename(url) when is_binary(url), do: Path.basename(url) end diff --git a/lib/pleroma/web/pleroma_api/controllers/user_import_controller.ex b/lib/pleroma/web/pleroma_api/controllers/user_import_controller.ex new file mode 100644 index 000000000..f10c45750 --- /dev/null +++ b/lib/pleroma/web/pleroma_api/controllers/user_import_controller.ex @@ -0,0 +1,61 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.PleromaAPI.UserImportController do + use Pleroma.Web, :controller + + require Logger + + alias Pleroma.Plugs.OAuthScopesPlug + alias Pleroma.User + alias Pleroma.Web.ApiSpec + + plug(OAuthScopesPlug, %{scopes: ["follow", "write:follows"]} when action == :follow) + plug(OAuthScopesPlug, %{scopes: ["follow", "write:blocks"]} when action == :blocks) + plug(OAuthScopesPlug, %{scopes: ["follow", "write:mutes"]} when action == :mutes) + + plug(OpenApiSpex.Plug.CastAndValidate) + defdelegate open_api_operation(action), to: ApiSpec.UserImportOperation + + def follow(%{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do + follow(%Plug.Conn{conn | body_params: %{list: File.read!(path)}}, %{}) + end + + def follow(%{assigns: %{user: follower}, body_params: %{list: list}} = conn, _) do + identifiers = + list + |> String.split("\n") + |> Enum.map(&(&1 |> String.split(",") |> List.first())) + |> List.delete("Account address") + |> Enum.map(&(&1 |> String.trim() |> String.trim_leading("@"))) + |> Enum.reject(&(&1 == "")) + + User.Import.follow_import(follower, identifiers) + json(conn, "job started") + end + + def blocks(%{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do + blocks(%Plug.Conn{conn | body_params: %{list: File.read!(path)}}, %{}) + end + + def blocks(%{assigns: %{user: blocker}, body_params: %{list: list}} = conn, _) do + User.Import.blocks_import(blocker, prepare_user_identifiers(list)) + json(conn, "job started") + end + + def mutes(%{body_params: %{list: %Plug.Upload{path: path}}} = conn, _) do + mutes(%Plug.Conn{conn | 
body_params: %{list: File.read!(path)}}, %{}) + end + + def mutes(%{assigns: %{user: user}, body_params: %{list: list}} = conn, _) do + User.Import.mutes_import(user, prepare_user_identifiers(list)) + json(conn, "job started") + end + + defp prepare_user_identifiers(list) do + list + |> String.split() + |> Enum.map(&String.trim_leading(&1, "@")) + end +end diff --git a/lib/pleroma/web/pleroma_api/views/scrobble_view.ex b/lib/pleroma/web/pleroma_api/views/scrobble_view.ex index bbff93abe..95bd4c368 100644 --- a/lib/pleroma/web/pleroma_api/views/scrobble_view.ex +++ b/lib/pleroma/web/pleroma_api/views/scrobble_view.ex @@ -10,14 +10,14 @@ defmodule Pleroma.Web.PleromaAPI.ScrobbleView do alias Pleroma.Activity alias Pleroma.HTML alias Pleroma.Object + alias Pleroma.Web.CommonAPI alias Pleroma.Web.CommonAPI.Utils alias Pleroma.Web.MastodonAPI.AccountView - alias Pleroma.Web.MastodonAPI.StatusView def render("show.json", %{activity: %Activity{data: %{"type" => "Listen"}} = activity} = opts) do object = Object.normalize(activity) - user = StatusView.get_user(activity.data["actor"]) + user = CommonAPI.get_user(activity.data["actor"]) created_at = Utils.to_masto_date(activity.data["published"]) %{ diff --git a/lib/pleroma/web/push/impl.ex b/lib/pleroma/web/push/impl.ex index 16368485e..da535aa68 100644 --- a/lib/pleroma/web/push/impl.ex +++ b/lib/pleroma/web/push/impl.ex @@ -19,7 +19,7 @@ defmodule Pleroma.Web.Push.Impl do @types ["Create", "Follow", "Announce", "Like", "Move"] @doc "Performs sending notifications for user subscriptions" - @spec perform(Notification.t()) :: list(any) | :error + @spec perform(Notification.t()) :: list(any) | :error | {:error, :unknown_type} def perform( %{ activity: %{data: %{"type" => activity_type}} = activity, @@ -64,20 +64,20 @@ def perform(_) do @doc "Push message to web" def push_message(body, sub, api_key, subscription) do case WebPushEncryption.send_web_push(body, sub, api_key) do - {:ok, %{status_code: code}} when 400 <= code and code < 500 -> + {:ok, %{status: code}} when code in 400..499 -> Logger.debug("Removing subscription record") Repo.delete!(subscription) :ok - {:ok, %{status_code: code}} when 200 <= code and code < 300 -> + {:ok, %{status: code}} when code in 200..299 -> :ok - {:ok, %{status_code: code}} -> + {:ok, %{status: code}} -> Logger.error("Web Push Notification failed with code: #{code}") :error - _ -> - Logger.error("Web Push Notification failed with unknown error") + error -> + Logger.error("Web Push Notification failed with #{inspect(error)}") :error end end diff --git a/lib/pleroma/web/rich_media/helpers.ex b/lib/pleroma/web/rich_media/helpers.ex index bd7f03cbe..d67b594b5 100644 --- a/lib/pleroma/web/rich_media/helpers.ex +++ b/lib/pleroma/web/rich_media/helpers.ex @@ -57,7 +57,6 @@ defp get_tld(host) do def fetch_data_for_object(object) do with true <- Config.get([:rich_media, :enabled]), - false <- object.data["sensitive"] || false, {:ok, page_url} <- HTML.extract_first_external_url_from_object(object), :ok <- validate_page_url(page_url), @@ -87,6 +86,50 @@ def perform(:fetch, %Activity{} = activity) do def rich_media_get(url) do headers = [{"user-agent", Pleroma.Application.user_agent() <> "; Bot"}] - Pleroma.HTTP.get(url, headers, @options) + head_check = + case Pleroma.HTTP.head(url, headers, @options) do + # If the HEAD request didn't reach the server for whatever reason, + # we assume the GET that comes right after won't either + {:error, _} = e -> + e + + {:ok, %Tesla.Env{status: 200, headers: headers}} -> + with :ok <- 
check_content_type(headers), + :ok <- check_content_length(headers), + do: :ok + + _ -> + :ok + end + + with :ok <- head_check, do: Pleroma.HTTP.get(url, headers, @options) + end + + defp check_content_type(headers) do + case List.keyfind(headers, "content-type", 0) do + {_, content_type} -> + case Plug.Conn.Utils.media_type(content_type) do + {:ok, "text", "html", _} -> :ok + _ -> {:error, {:content_type, content_type}} + end + + _ -> + :ok + end + end + + @max_body @options[:max_body] + defp check_content_length(headers) do + case List.keyfind(headers, "content-length", 0) do + {_, maybe_content_length} -> + case Integer.parse(maybe_content_length) do + {content_length, ""} when content_length <= @max_body -> :ok + {_, ""} -> {:error, :body_too_large} + _ -> :ok + end + + _ -> + :ok + end end end diff --git a/lib/pleroma/web/rich_media/parser.ex b/lib/pleroma/web/rich_media/parser.ex index 5727fda18..c70d2fdba 100644 --- a/lib/pleroma/web/rich_media/parser.ex +++ b/lib/pleroma/web/rich_media/parser.ex @@ -20,28 +20,61 @@ def parse(url) do with {:ok, data} <- get_cached_or_parse(url), {:ok, _} <- set_ttl_based_on_image(data, url) do {:ok, data} - else - {:error, {:invalid_metadata, data}} = e -> - Logger.debug(fn -> "Incomplete or invalid metadata for #{url}: #{inspect(data)}" end) - e - - error -> - Logger.error(fn -> "Rich media error for #{url}: #{inspect(error)}" end) - error end end defp get_cached_or_parse(url) do - case Cachex.fetch!(:rich_media_cache, url, fn _ -> {:commit, parse_url(url)} end) do - {:ok, _data} = res -> - res + case Cachex.fetch(:rich_media_cache, url, fn -> + case parse_url(url) do + {:ok, _} = res -> + {:commit, res} - {:error, _} = e -> - ttl = Pleroma.Config.get([:rich_media, :failure_backoff], 60_000) - Cachex.expire(:rich_media_cache, url, ttl) - e + {:error, reason} = e -> + # Unfortunately we have to log errors here, instead of doing that + # along with ttl setting at the bottom. Otherwise we can get log spam + # if more than one process was waiting for the rich media card + # while it was generated. Ideally we would set ttl here as well, + # so we don't override it number_of_waiters_on_generation + # times, but one, obviously, can't set ttl for not-yet-created entry + # and Cachex doesn't support returning ttl from the fetch callback. 
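  # For reference: Cachex.fetch/3 returns {:ok, value} when the key was already cached and
  # {:commit, value} when this fallback has just run and stored it; the clause below uses that
  # distinction to set the failure TTL only once, on the initial :commit pass.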
+ log_error(url, reason) + {:commit, e} + end + end) do + {action, res} when action in [:commit, :ok] -> + case res do + {:ok, _data} = res -> + res + + {:error, reason} = e -> + if action == :commit, do: set_error_ttl(url, reason) + e + end + + {:error, e} -> + {:error, {:cachex_error, e}} end end + + defp set_error_ttl(_url, :body_too_large), do: :ok + defp set_error_ttl(_url, {:content_type, _}), do: :ok + + # The TTL is not set for the errors above, since they are unlikely to change + # with time + + defp set_error_ttl(url, _reason) do + ttl = Pleroma.Config.get([:rich_media, :failure_backoff], 60_000) + Cachex.expire(:rich_media_cache, url, ttl) + :ok + end + + defp log_error(url, {:invalid_metadata, data}) do + Logger.debug(fn -> "Incomplete or invalid metadata for #{url}: #{inspect(data)}" end) + end + + defp log_error(url, reason) do + Logger.warn(fn -> "Rich media error for #{url}: #{inspect(reason)}" end) + end end @doc """ diff --git a/lib/pleroma/web/router.ex b/lib/pleroma/web/router.ex index c6433cc53..e22b31b4c 100644 --- a/lib/pleroma/web/router.ex +++ b/lib/pleroma/web/router.ex @@ -178,9 +178,14 @@ defmodule Pleroma.Web.Router do get("/users", AdminAPIController, :list_users) get("/users/:nickname", AdminAPIController, :user_show) get("/users/:nickname/statuses", AdminAPIController, :list_user_statuses) + get("/users/:nickname/chats", AdminAPIController, :list_user_chats) get("/instances/:instance/statuses", AdminAPIController, :list_instance_statuses) + get("/instance_document/:name", InstanceDocumentController, :show) + patch("/instance_document/:name", InstanceDocumentController, :update) + delete("/instance_document/:name", InstanceDocumentController, :delete) + patch("/users/confirm_email", AdminAPIController, :confirm_email) patch("/users/resend_confirmation_email", AdminAPIController, :resend_confirmation_email) @@ -214,9 +219,27 @@ defmodule Pleroma.Web.Router do get("/media_proxy_caches", MediaProxyCacheController, :index) post("/media_proxy_caches/delete", MediaProxyCacheController, :delete) post("/media_proxy_caches/purge", MediaProxyCacheController, :purge) + + get("/chats/:id", ChatController, :show) + get("/chats/:id/messages", ChatController, :messages) + delete("/chats/:id/messages/:message_id", ChatController, :delete_message) end scope "/api/pleroma/emoji", Pleroma.Web.PleromaAPI do + scope "/pack" do + pipe_through(:admin_api) + + post("/", EmojiPackController, :create) + patch("/", EmojiPackController, :update) + delete("/", EmojiPackController, :delete) + end + + scope "/pack" do + pipe_through(:api) + + get("/", EmojiPackController, :show) + end + # Modifying packs scope "/packs" do pipe_through(:admin_api) @@ -225,21 +248,17 @@ defmodule Pleroma.Web.Router do get("/remote", EmojiPackController, :remote) post("/download", EmojiPackController, :download) - post("/:name", EmojiPackController, :create) - patch("/:name", EmojiPackController, :update) - delete("/:name", EmojiPackController, :delete) - - post("/:name/files", EmojiPackController, :add_file) - patch("/:name/files", EmojiPackController, :update_file) - delete("/:name/files", EmojiPackController, :delete_file) + post("/files", EmojiFileController, :create) + patch("/files", EmojiFileController, :update) + delete("/files", EmojiFileController, :delete) end # Pack info / downloading scope "/packs" do pipe_through(:api) + get("/", EmojiPackController, :index) - get("/:name", EmojiPackController, :show) - get("/:name/archive", EmojiPackController, :archive) + get("/archive", EmojiPackController, 
:archive) end end @@ -260,14 +279,15 @@ defmodule Pleroma.Web.Router do post("/delete_account", UtilController, :delete_account) put("/notification_settings", UtilController, :update_notificaton_settings) post("/disable_account", UtilController, :disable_account) - - post("/blocks_import", UtilController, :blocks_import) - post("/follow_import", UtilController, :follow_import) end scope "/api/pleroma", Pleroma.Web.PleromaAPI do pipe_through(:authenticated_api) + post("/mutes_import", UserImportController, :mutes) + post("/blocks_import", UserImportController, :blocks) + post("/follow_import", UserImportController, :follow) + get("/accounts/mfa", TwoFactorAuthenticationController, :settings) get("/accounts/mfa/backup_codes", TwoFactorAuthenticationController, :backup_codes) get("/accounts/mfa/setup/:method", TwoFactorAuthenticationController, :setup) @@ -670,6 +690,8 @@ defmodule Pleroma.Web.Router do end scope "/proxy/", Pleroma.Web.MediaProxy do + get("/preview/:sig/:url", MediaProxyController, :preview) + get("/preview/:sig/:url/:filename", MediaProxyController, :preview) get("/:sig/:url", MediaProxyController, :remote) get("/:sig/:url/:filename", MediaProxyController, :remote) end diff --git a/lib/pleroma/web/streamer/streamer.ex b/lib/pleroma/web/streamer/streamer.ex index d1d70e556..5475f18a6 100644 --- a/lib/pleroma/web/streamer/streamer.ex +++ b/lib/pleroma/web/streamer/streamer.ex @@ -11,10 +11,12 @@ defmodule Pleroma.Web.Streamer do alias Pleroma.Conversation.Participation alias Pleroma.Notification alias Pleroma.Object + alias Pleroma.Plugs.OAuthScopesPlug alias Pleroma.User alias Pleroma.Web.ActivityPub.ActivityPub alias Pleroma.Web.ActivityPub.Visibility alias Pleroma.Web.CommonAPI + alias Pleroma.Web.OAuth.Token alias Pleroma.Web.StreamerView @mix_env Mix.env() @@ -26,53 +28,87 @@ def registry, do: @registry @user_streams ["user", "user:notification", "direct", "user:pleroma_chat"] @doc "Expands and authorizes a stream, and registers the process for streaming." - @spec get_topic_and_add_socket(stream :: String.t(), User.t() | nil, Map.t() | nil) :: + @spec get_topic_and_add_socket( + stream :: String.t(), + User.t() | nil, + Token.t() | nil, + Map.t() | nil + ) :: {:ok, topic :: String.t()} | {:error, :bad_topic} | {:error, :unauthorized} - def get_topic_and_add_socket(stream, user, params \\ %{}) do - case get_topic(stream, user, params) do + def get_topic_and_add_socket(stream, user, oauth_token, params \\ %{}) do + case get_topic(stream, user, oauth_token, params) do {:ok, topic} -> add_socket(topic, user) error -> error end end @doc "Expand and authorizes a stream" - @spec get_topic(stream :: String.t(), User.t() | nil, Map.t()) :: + @spec get_topic(stream :: String.t(), User.t() | nil, Token.t() | nil, Map.t()) :: {:ok, topic :: String.t()} | {:error, :bad_topic} - def get_topic(stream, user, params \\ %{}) + def get_topic(stream, user, oauth_token, params \\ %{}) # Allow all public steams. - def get_topic(stream, _, _) when stream in @public_streams do + def get_topic(stream, _user, _oauth_token, _params) when stream in @public_streams do {:ok, stream} end # Allow all hashtags streams. - def get_topic("hashtag", _, %{"tag" => tag}) do + def get_topic("hashtag", _user, _oauth_token, %{"tag" => tag} = _params) do {:ok, "hashtag:" <> tag} end # Expand user streams. 
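  # The clauses below enforce OAuth scopes per user stream; illustratively (scope values are
  # examples, and the token is assumed to belong to `user`):
  #
  #   get_topic("user", user, %Token{user_id: user.id, scopes: ["read:statuses"]}, %{})
  #   #=> {:ok, "user:" <> to_string(user.id)}
  #
  #   get_topic("user:notification", user, %Token{user_id: user.id, scopes: ["read:statuses"]}, %{})
  #   #=> {:error, :unauthorized}   # needs "read:notifications" or the ancestor "read" scope
  #
  #   get_topic("list", user, %Token{user_id: user.id, scopes: ["read:lists"]}, %{"list" => list.id})
  #   #=> {:ok, "list:" <> to_string(list.id)}   # provided Pleroma.List.get(list.id, user) returns the list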
- def get_topic(stream, %User{} = user, _) when stream in @user_streams do - {:ok, stream <> ":" <> to_string(user.id)} + def get_topic( + stream, + %User{id: user_id} = user, + %Token{user_id: token_user_id} = oauth_token, + _params + ) + when stream in @user_streams and user_id == token_user_id do + # Note: "read" works for all user streams (not mentioning it since it's an ancestor scope) + required_scopes = + if stream == "user:notification" do + ["read:notifications"] + else + ["read:statuses"] + end + + if OAuthScopesPlug.filter_descendants(required_scopes, oauth_token.scopes) == [] do + {:error, :unauthorized} + else + {:ok, stream <> ":" <> to_string(user.id)} + end end - def get_topic(stream, _, _) when stream in @user_streams do + def get_topic(stream, _user, _oauth_token, _params) when stream in @user_streams do {:error, :unauthorized} end # List streams. - def get_topic("list", %User{} = user, %{"list" => id}) do - if Pleroma.List.get(id, user) do - {:ok, "list:" <> to_string(id)} - else - {:error, :bad_topic} + def get_topic( + "list", + %User{id: user_id} = user, + %Token{user_id: token_user_id} = oauth_token, + %{"list" => id} + ) + when user_id == token_user_id do + cond do + OAuthScopesPlug.filter_descendants(["read", "read:lists"], oauth_token.scopes) == [] -> + {:error, :unauthorized} + + Pleroma.List.get(id, user) -> + {:ok, "list:" <> to_string(id)} + + true -> + {:error, :bad_topic} end end - def get_topic("list", _, _) do + def get_topic("list", _user, _oauth_token, _params) do {:error, :unauthorized} end - def get_topic(_, _, _) do + def get_topic(_stream, _user, _oauth_token, _params) do {:error, :bad_topic} end diff --git a/lib/pleroma/web/twitter_api/controllers/util_controller.ex b/lib/pleroma/web/twitter_api/controllers/util_controller.ex index f02c4075c..70b0fbd54 100644 --- a/lib/pleroma/web/twitter_api/controllers/util_controller.ex +++ b/lib/pleroma/web/twitter_api/controllers/util_controller.ex @@ -18,14 +18,6 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do plug(Pleroma.Web.FederatingPlug when action == :remote_subscribe) - plug( - OAuthScopesPlug, - %{scopes: ["follow", "write:follows"]} - when action == :follow_import - ) - - plug(OAuthScopesPlug, %{scopes: ["follow", "write:blocks"]} when action == :blocks_import) - plug( OAuthScopesPlug, %{scopes: ["write:accounts"]} @@ -104,33 +96,6 @@ def update_notificaton_settings(%{assigns: %{user: user}} = conn, params) do end end - def follow_import(conn, %{"list" => %Plug.Upload{} = listfile}) do - follow_import(conn, %{"list" => File.read!(listfile.path)}) - end - - def follow_import(%{assigns: %{user: follower}} = conn, %{"list" => list}) do - followed_identifiers = - list - |> String.split("\n") - |> Enum.map(&(&1 |> String.split(",") |> List.first())) - |> List.delete("Account address") - |> Enum.map(&(&1 |> String.trim() |> String.trim_leading("@"))) - |> Enum.reject(&(&1 == "")) - - User.follow_import(follower, followed_identifiers) - json(conn, "job started") - end - - def blocks_import(conn, %{"list" => %Plug.Upload{} = listfile}) do - blocks_import(conn, %{"list" => File.read!(listfile.path)}) - end - - def blocks_import(%{assigns: %{user: blocker}} = conn, %{"list" => list}) do - blocked_identifiers = list |> String.split() |> Enum.map(&String.trim_leading(&1, "@")) - User.blocks_import(blocker, blocked_identifiers) - json(conn, "job started") - end - def change_password(%{assigns: %{user: user}} = conn, params) do case CommonAPI.Utils.confirm_current_password(user, params["password"]) do {:ok, 
user} -> diff --git a/lib/pleroma/workers/background_worker.ex b/lib/pleroma/workers/background_worker.ex index cec5a7462..55b5a13d9 100644 --- a/lib/pleroma/workers/background_worker.ex +++ b/lib/pleroma/workers/background_worker.ex @@ -26,26 +26,10 @@ def perform(%Job{args: %{"op" => "force_password_reset", "user_id" => user_id}}) User.perform(:force_password_reset, user) end - def perform(%Job{ - args: %{ - "op" => "blocks_import", - "blocker_id" => blocker_id, - "blocked_identifiers" => blocked_identifiers - } - }) do - blocker = User.get_cached_by_id(blocker_id) - {:ok, User.perform(:blocks_import, blocker, blocked_identifiers)} - end - - def perform(%Job{ - args: %{ - "op" => "follow_import", - "follower_id" => follower_id, - "followed_identifiers" => followed_identifiers - } - }) do - follower = User.get_cached_by_id(follower_id) - {:ok, User.perform(:follow_import, follower, followed_identifiers)} + def perform(%Job{args: %{"op" => op, "user_id" => user_id, "identifiers" => identifiers}}) + when op in ["blocks_import", "follow_import", "mutes_import"] do + user = User.get_cached_by_id(user_id) + {:ok, User.Import.perform(String.to_atom(op), user, identifiers)} end def perform(%Job{args: %{"op" => "media_proxy_preload", "message" => message}}) do diff --git a/mix.exs b/mix.exs index d2a40bcc8..c6e94aff7 100644 --- a/mix.exs +++ b/mix.exs @@ -122,7 +122,7 @@ defp deps do {:ecto_enum, "~> 1.4"}, {:ecto_sql, "~> 3.4.4"}, {:postgrex, ">= 0.15.5"}, - {:oban, "~> 2.0.0"}, + {:oban, "~> 2.1.0"}, {:gettext, "~> 0.18"}, {:pbkdf2_elixir, "~> 1.2"}, {:bcrypt_elixir, "~> 2.2"}, diff --git a/mix.lock b/mix.lock index 96d10641f..73d84d7cc 100644 --- a/mix.lock +++ b/mix.lock @@ -24,13 +24,14 @@ "crypt": {:git, "https://github.com/msantos/crypt.git", "f63a705f92c26955977ee62a313012e309a4d77a", [ref: "f63a705f92c26955977ee62a313012e309a4d77a"]}, "custom_base": {:hex, :custom_base, "0.2.1", "4a832a42ea0552299d81652aa0b1f775d462175293e99dfbe4d7dbaab785a706", [:mix], [], "hexpm", "8df019facc5ec9603e94f7270f1ac73ddf339f56ade76a721eaa57c1493ba463"}, "db_connection": {:hex, :db_connection, "2.2.2", "3bbca41b199e1598245b716248964926303b5d4609ff065125ce98bcd368939e", [:mix], [{:connection, "~> 1.0.2", [hex: :connection, repo: "hexpm", optional: false]}], "hexpm", "642af240d8a8affb93b4ba5a6fcd2bbcbdc327e1a524b825d383711536f8070c"}, - "decimal": {:hex, :decimal, "1.8.1", "a4ef3f5f3428bdbc0d35374029ffcf4ede8533536fa79896dd450168d9acdf3c", [:mix], [], "hexpm", "3cb154b00225ac687f6cbd4acc4b7960027c757a5152b369923ead9ddbca7aec"}, + "decimal": {:hex, :decimal, "2.0.0", "a78296e617b0f5dd4c6caf57c714431347912ffb1d0842e998e9792b5642d697", [:mix], [], "hexpm", "34666e9c55dea81013e77d9d87370fe6cb6291d1ef32f46a1600230b1d44f577"}, "deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm", "ce708e5f094b9cd4e8f2be4f00d2f4250c4095be93f8cd6d018c753894885430"}, "earmark": {:hex, :earmark, "1.4.3", "364ca2e9710f6bff494117dbbd53880d84bebb692dafc3a78eb50aa3183f2bfd", [:mix], [], "hexpm", "8cf8a291ebf1c7b9539e3cddb19e9cef066c2441b1640f13c34c1d3cfc825fec"}, "earmark_parser": {:hex, :earmark_parser, "1.4.10", "6603d7a603b9c18d3d20db69921527f82ef09990885ed7525003c7fe7dc86c56", [:mix], [], "hexpm", "8e2d5370b732385db2c9b22215c3f59c84ac7dda7ed7e544d7c459496ae519c0"}, - "ecto": {:hex, :ecto, "3.4.5", "2bcd262f57b2c888b0bd7f7a28c8a48aa11dc1a2c6a858e45dd8f8426d504265", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, 
{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "8c6d1d4d524559e9b7a062f0498e2c206122552d63eacff0a6567ffe7a8e8691"}, + "ecto": {:hex, :ecto, "3.4.6", "08f7afad3257d6eb8613309af31037e16c36808dfda5a3cd0cb4e9738db030e4", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "6f13a9e2a62e75c2dcfc7207bfc65645ab387af8360db4c89fee8b5a4bf3f70b"}, "ecto_enum": {:hex, :ecto_enum, "1.4.0", "d14b00e04b974afc69c251632d1e49594d899067ee2b376277efd8233027aec8", [:mix], [{:ecto, ">= 3.0.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:ecto_sql, "> 3.0.0", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:mariaex, ">= 0.0.0", [hex: :mariaex, repo: "hexpm", optional: true]}, {:postgrex, ">= 0.0.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "8fb55c087181c2b15eee406519dc22578fa60dd82c088be376d0010172764ee4"}, "ecto_sql": {:hex, :ecto_sql, "3.4.5", "30161f81b167d561a9a2df4329c10ae05ff36eca7ccc84628f2c8b9fa1e43323", [:mix], [{:db_connection, "~> 2.2", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.4.3", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.3.0 or ~> 0.4.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.15.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.0", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "31990c6a3579b36a3c0841d34a94c275e727de8b84f58509da5f1b2032c98ac2"}, + "eimp": {:hex, :eimp, "1.0.14", "fc297f0c7e2700457a95a60c7010a5f1dcb768a083b6d53f49cd94ab95a28f22", [:rebar3], [{:p1_utils, "1.0.18", [hex: :p1_utils, repo: "hexpm", optional: false]}], "hexpm", "501133f3112079b92d9e22da8b88bf4f0e13d4d67ae9c15c42c30bd25ceb83b6"}, "elixir_make": {:hex, :elixir_make, "0.6.0", "38349f3e29aff4864352084fc736fa7fa0f2995a819a737554f7ebd28b85aaab", [:mix], [], "hexpm", "d522695b93b7f0b4c0fcb2dfe73a6b905b1c301226a5a55cb42e5b14d509e050"}, "esshd": {:hex, :esshd, "0.1.1", "d4dd4c46698093a40a56afecce8a46e246eb35463c457c246dacba2e056f31b5", [:mix], [], "hexpm", "d73e341e3009d390aa36387dc8862860bf9f874c94d9fd92ade2926376f49981"}, "eternal": {:hex, :eternal, "1.2.1", "d5b6b2499ba876c57be2581b5b999ee9bdf861c647401066d3eeed111d096bc4", [:mix], [], "hexpm", "b14f1dc204321429479c569cfbe8fb287541184ed040956c8862cb7a677b8406"}, @@ -58,7 +59,7 @@ "httpoison": {:hex, :httpoison, "1.6.2", "ace7c8d3a361cebccbed19c283c349b3d26991eff73a1eaaa8abae2e3c8089b6", [:mix], [{:hackney, "~> 1.15 and >= 1.15.2", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "aa2c74bd271af34239a3948779612f87df2422c2fdcfdbcec28d9c105f0773fe"}, "idna": {:hex, :idna, "6.0.0", "689c46cbcdf3524c44d5f3dde8001f364cd7608a99556d8fbd8239a5798d4c10", [:rebar3], [{:unicode_util_compat, "0.4.1", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "4bdd305eb64e18b0273864920695cb18d7a2021f31a11b9c5fbcd9a253f936e2"}, "inet_cidr": {:hex, :inet_cidr, "1.0.4", "a05744ab7c221ca8e395c926c3919a821eb512e8f36547c062f62c4ca0cf3d6e", [:mix], [], "hexpm", "64a2d30189704ae41ca7dbdd587f5291db5d1dda1414e0774c29ffc81088c1bc"}, - "jason": {:hex, :jason, "1.2.1", "12b22825e22f468c02eb3e4b9985f3d0cb8dc40b9bd704730efa11abd2708c44", [:mix], [{:decimal, "~> 1.0", [hex: :decimal, repo: 
"hexpm", optional: true]}], "hexpm", "b659b8571deedf60f79c5a608e15414085fa141344e2716fbd6988a084b5f993"}, + "jason": {:hex, :jason, "1.2.2", "ba43e3f2709fd1aa1dce90aaabfd039d000469c05c56f0b8e31978e03fa39052", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "18a228f5f0058ee183f29f9eae0805c6e59d61c3b006760668d8d18ff0d12179"}, "joken": {:hex, :joken, "2.2.0", "2daa1b12be05184aff7b5ace1d43ca1f81345962285fff3f88db74927c954d3a", [:mix], [{:jose, "~> 1.9", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "b4f92e30388206f869dd25d1af628a1d99d7586e5cf0672f64d4df84c4d2f5e9"}, "jose": {:hex, :jose, "1.10.1", "16d8e460dae7203c6d1efa3f277e25b5af8b659febfc2f2eb4bacf87f128b80a", [:mix, :rebar3], [], "hexpm", "3c7ddc8a9394b92891db7c2771da94bf819834a1a4c92e30857b7d582e2f8257"}, "jumper": {:hex, :jumper, "1.0.1", "3c00542ef1a83532b72269fab9f0f0c82bf23a35e27d278bfd9ed0865cecabff", [:mix], [], "hexpm", "318c59078ac220e966d27af3646026db9b5a5e6703cb2aa3e26bcfaba65b7433"}, @@ -79,8 +80,9 @@ "nimble_parsec": {:hex, :nimble_parsec, "0.6.0", "32111b3bf39137144abd7ba1cce0914533b2d16ef35e8abc5ec8be6122944263", [:mix], [], "hexpm", "27eac315a94909d4dc68bc07a4a83e06c8379237c5ea528a9acff4ca1c873c52"}, "nimble_pool": {:hex, :nimble_pool, "0.1.0", "ffa9d5be27eee2b00b0c634eb649aa27f97b39186fec3c493716c2a33e784ec6", [:mix], [], "hexpm", "343a1eaa620ddcf3430a83f39f2af499fe2370390d4f785cd475b4df5acaf3f9"}, "nodex": {:git, "https://git.pleroma.social/pleroma/nodex", "cb6730f943cfc6aad674c92161be23a8411f15d1", [ref: "cb6730f943cfc6aad674c92161be23a8411f15d1"]}, - "oban": {:hex, :oban, "2.0.0", "e6ce70d94dd46815ec0882a1ffb7356df9a9d5b8a40a64ce5c2536617a447379", [:mix], [{:ecto_sql, ">= 3.4.3", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.14", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "cf574813bd048b98a698aa587c21367d2e06842d4e1b1993dcd6a696e9e633bd"}, + "oban": {:hex, :oban, "2.1.0", "034144686f7e76a102b5d67731f098d98a9e4a52b07c25ad580a01f83a7f1cf5", [:mix], [{:ecto_sql, ">= 3.4.3", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.14", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "c6f067fa3b308ed9e0e6beb2b34277c9c4e48bf95338edabd8f4a757a26e04c2"}, "open_api_spex": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/open_api_spex.git", "f296ac0924ba3cf79c7a588c4c252889df4c2edd", [ref: "f296ac0924ba3cf79c7a588c4c252889df4c2edd"]}, + "p1_utils": {:hex, :p1_utils, "1.0.18", "3fe224de5b2e190d730a3c5da9d6e8540c96484cf4b4692921d1e28f0c32b01c", [:rebar3], [], "hexpm", "1fc8773a71a15553b179c986b22fbeead19b28fe486c332d4929700ffeb71f88"}, "parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm", "17ef63abde837ad30680ea7f857dd9e7ced9476cdd7b0394432af4bfc241b960"}, "pbkdf2_elixir": {:hex, :pbkdf2_elixir, "1.2.1", "9cbe354b58121075bd20eb83076900a3832324b7dd171a6895fab57b6bb2752c", [:mix], [{:comeonin, "~> 5.3", [hex: :comeonin, repo: "hexpm", optional: false]}], "hexpm", "d3b40a4a4630f0b442f19eca891fcfeeee4c40871936fed2f68e1c4faa30481f"}, "phoenix": {:hex, :phoenix, "1.4.17", "1b1bd4cff7cfc87c94deaa7d60dd8c22e04368ab95499483c50640ef3bd838d8", 
[:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "3a8e5d7a3d76d452bb5fb86e8b7bd115f737e4f8efe202a463d4aeb4a5809611"}, @@ -94,7 +96,7 @@ "plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "79fd4fcf34d110605c26560cbae8f23c603ec4158c08298bd4360fdea90bb5cf"}, "poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], [], "hexpm", "fec8660eb7733ee4117b85f55799fd3833eb769a6df71ccf8903e8dc5447cfce"}, "poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm", "dad79704ce5440f3d5a3681c8590b9dc25d1a561e8f5a9c995281012860901e3"}, - "postgrex": {:hex, :postgrex, "0.15.5", "aec40306a622d459b01bff890fa42f1430dac61593b122754144ad9033a2152f", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm", "ed90c81e1525f65a2ba2279dbcebf030d6d13328daa2f8088b9661eb9143af7f"}, + "postgrex": {:hex, :postgrex, "0.15.6", "a464c72010a56e3214fe2b99c1a76faab4c2bb0255cabdef30dea763a3569aa2", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm", "f99268325ac8f66ffd6c4964faab9e70fbf721234ab2ad238c00f9530b8cdd55"}, "pot": {:hex, :pot, "0.11.0", "61bad869a94534739dd4614a25a619bc5c47b9970e9a0ea5bef4628036fc7a16", [:rebar3], [], "hexpm", "57ee6ee6bdeb639661ffafb9acefe3c8f966e45394de6a766813bb9e1be4e54b"}, "prometheus": {:hex, :prometheus, "4.6.0", "20510f381db1ccab818b4cf2fac5fa6ab5cc91bc364a154399901c001465f46f", [:mix, :rebar3], [], "hexpm", "4905fd2992f8038eccd7aa0cd22f40637ed618c0bed1f75c05aacec15b7545de"}, "prometheus_ecto": {:hex, :prometheus_ecto, "1.4.3", "3dd4da1812b8e0dbee81ea58bb3b62ed7588f2eae0c9e97e434c46807ff82311", [:mix], [{:ecto, "~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm", "8d66289f77f913b37eda81fd287340c17e61a447549deb28efc254532b2bed82"}, diff --git a/priv/gettext/es/LC_MESSAGES/errors.po b/priv/gettext/es/LC_MESSAGES/errors.po index ba75936a9..0a6fceaad 100644 --- a/priv/gettext/es/LC_MESSAGES/errors.po +++ b/priv/gettext/es/LC_MESSAGES/errors.po @@ -3,7 +3,7 @@ msgstr "" "Project-Id-Version: PACKAGE VERSION\n" "Report-Msgid-Bugs-To: \n" "POT-Creation-Date: 2020-09-09 09:49+0000\n" -"PO-Revision-Date: 2020-09-09 10:52+0000\n" +"PO-Revision-Date: 2020-09-11 21:26+0000\n" "Last-Translator: tarteka \n" "Language-Team: Spanish \n" @@ -94,52 +94,52 @@ msgid "must be less than %{number}" msgstr "" msgid "must be greater 
than %{number}" -msgstr "" +msgstr "debe ser mayor que %{number}" msgid "must be less than or equal to %{number}" -msgstr "" +msgstr "debe ser menor o igual que %{number}" msgid "must be greater than or equal to %{number}" -msgstr "" +msgstr "deber ser mayor o igual que %{number}" msgid "must be equal to %{number}" -msgstr "" +msgstr "deber ser igual a %{number}" #: lib/pleroma/web/common_api/common_api.ex:505 #, elixir-format msgid "Account not found" -msgstr "" +msgstr "Cuenta no encontrada" #: lib/pleroma/web/common_api/common_api.ex:339 #, elixir-format msgid "Already voted" -msgstr "" +msgstr "Ya has votado" #: lib/pleroma/web/oauth/oauth_controller.ex:359 #, elixir-format msgid "Bad request" -msgstr "" +msgstr "Solicitud incorrecta" #: lib/pleroma/web/activity_pub/activity_pub_controller.ex:426 #, elixir-format msgid "Can't delete object" -msgstr "" +msgstr "No se puede eliminar el objeto" #: lib/pleroma/web/controller_helper.ex:105 #: lib/pleroma/web/controller_helper.ex:111 #, elixir-format msgid "Can't display this activity" -msgstr "" +msgstr "No se puede mostrar esta actividad" #: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:285 #, elixir-format msgid "Can't find user" -msgstr "" +msgstr "No se puede encontrar al usuario" #: lib/pleroma/web/pleroma_api/controllers/account_controller.ex:61 #, elixir-format msgid "Can't get favorites" -msgstr "" +msgstr "No se puede obtener los favoritos" #: lib/pleroma/web/activity_pub/activity_pub_controller.ex:438 #, elixir-format @@ -149,7 +149,7 @@ msgstr "" #: lib/pleroma/web/common_api/utils.ex:563 #, elixir-format msgid "Cannot post an empty status without attachments" -msgstr "" +msgstr "No se puede publicar un estado vacío y sin archivos adjuntos" #: lib/pleroma/web/common_api/utils.ex:511 #, elixir-format diff --git a/priv/gettext/zh_Hans/LC_MESSAGES/errors.po b/priv/gettext/zh_Hans/LC_MESSAGES/errors.po new file mode 100644 index 000000000..4f029d558 --- /dev/null +++ b/priv/gettext/zh_Hans/LC_MESSAGES/errors.po @@ -0,0 +1,580 @@ +msgid "" +msgstr "" +"Project-Id-Version: PACKAGE VERSION\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2020-09-20 13:18+0000\n" +"PO-Revision-Date: 2020-09-20 14:48+0000\n" +"Last-Translator: Kana \n" +"Language-Team: Chinese (Simplified) \n" +"Language: zh_Hans\n" +"MIME-Version: 1.0\n" +"Content-Type: text/plain; charset=UTF-8\n" +"Content-Transfer-Encoding: 8bit\n" +"Plural-Forms: nplurals=1; plural=0;\n" +"X-Generator: Weblate 4.0.4\n" + +## This file is a PO Template file. +## +## `msgid`s here are often extracted from source code. +## Add new translations manually only if they're dynamic +## translations that can't be statically extracted. +## +## Run `mix gettext.extract` to bring this file up to +## date. Leave `msgstr`s empty as changing them here as no +## effect: edit them in PO (`.po`) files instead. 
+## From Ecto.Changeset.cast/4 +msgid "can't be blank" +msgstr "不能为空" + +## From Ecto.Changeset.unique_constraint/3 +msgid "has already been taken" +msgstr "已被占用" + +## From Ecto.Changeset.put_change/3 +msgid "is invalid" +msgstr "不合法" + +## From Ecto.Changeset.validate_format/3 +msgid "has invalid format" +msgstr "的格式不合法" + +## From Ecto.Changeset.validate_subset/3 +msgid "has an invalid entry" +msgstr "中存在不合法的项目" + +## From Ecto.Changeset.validate_exclusion/3 +msgid "is reserved" +msgstr "是被保留的" + +## From Ecto.Changeset.validate_confirmation/3 +msgid "does not match confirmation" +msgstr "" + +## From Ecto.Changeset.no_assoc_constraint/3 +msgid "is still associated with this entry" +msgstr "仍与该项目相关联" + +msgid "are still associated with this entry" +msgstr "仍与该项目相关联" + +## From Ecto.Changeset.validate_length/3 +msgid "should be %{count} character(s)" +msgid_plural "should be %{count} character(s)" +msgstr[0] "应为 %{count} 个字符" + +msgid "should have %{count} item(s)" +msgid_plural "should have %{count} item(s)" +msgstr[0] "应有 %{item} 项" + +msgid "should be at least %{count} character(s)" +msgid_plural "should be at least %{count} character(s)" +msgstr[0] "应至少有 %{count} 个字符" + +msgid "should have at least %{count} item(s)" +msgid_plural "should have at least %{count} item(s)" +msgstr[0] "应至少有 %{count} 项" + +msgid "should be at most %{count} character(s)" +msgid_plural "should be at most %{count} character(s)" +msgstr[0] "应至多有 %{count} 个字符" + +msgid "should have at most %{count} item(s)" +msgid_plural "should have at most %{count} item(s)" +msgstr[0] "应至多有 %{count} 项" + +## From Ecto.Changeset.validate_number/3 +msgid "must be less than %{number}" +msgstr "必须小于 %{number}" + +msgid "must be greater than %{number}" +msgstr "必须大于 %{number}" + +msgid "must be less than or equal to %{number}" +msgstr "必须小于等于 %{number}" + +msgid "must be greater than or equal to %{number}" +msgstr "必须大于等于 %{number}" + +msgid "must be equal to %{number}" +msgstr "必须等于 %{number}" + +#: lib/pleroma/web/common_api/common_api.ex:505 +#, elixir-format +msgid "Account not found" +msgstr "未找到账号" + +#: lib/pleroma/web/common_api/common_api.ex:339 +#, elixir-format +msgid "Already voted" +msgstr "已经进行了投票" + +#: lib/pleroma/web/oauth/oauth_controller.ex:359 +#, elixir-format +msgid "Bad request" +msgstr "不正确的请求" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:426 +#, elixir-format +msgid "Can't delete object" +msgstr "不能删除对象" + +#: lib/pleroma/web/controller_helper.ex:105 +#: lib/pleroma/web/controller_helper.ex:111 +#, elixir-format +msgid "Can't display this activity" +msgstr "不能显示该活动" + +#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:285 +#, elixir-format +msgid "Can't find user" +msgstr "找不到用户" + +#: lib/pleroma/web/pleroma_api/controllers/account_controller.ex:61 +#, elixir-format +msgid "Can't get favorites" +msgstr "不能获取收藏" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:438 +#, elixir-format +msgid "Can't like object" +msgstr "" + +#: lib/pleroma/web/common_api/utils.ex:563 +#, elixir-format +msgid "Cannot post an empty status without attachments" +msgstr "" + +#: lib/pleroma/web/common_api/utils.ex:511 +#, elixir-format +msgid "Comment must be up to %{max_size} characters" +msgstr "" + +#: lib/pleroma/config/config_db.ex:191 +#, elixir-format +msgid "Config with params %{params} not found" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:181 +#: lib/pleroma/web/common_api/common_api.ex:185 +#, elixir-format +msgid "Could not delete" +msgstr "" + +#: 
lib/pleroma/web/common_api/common_api.ex:231 +#, elixir-format +msgid "Could not favorite" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:453 +#, elixir-format +msgid "Could not pin" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:278 +#, elixir-format +msgid "Could not unfavorite" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:463 +#, elixir-format +msgid "Could not unpin" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:216 +#, elixir-format +msgid "Could not unrepeat" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:512 +#: lib/pleroma/web/common_api/common_api.ex:521 +#, elixir-format +msgid "Could not update state" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/timeline_controller.ex:207 +#, elixir-format +msgid "Error." +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:106 +#, elixir-format +msgid "Invalid CAPTCHA" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:116 +#: lib/pleroma/web/oauth/oauth_controller.ex:568 +#, elixir-format +msgid "Invalid credentials" +msgstr "" + +#: lib/pleroma/plugs/ensure_authenticated_plug.ex:38 +#, elixir-format +msgid "Invalid credentials." +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:355 +#, elixir-format +msgid "Invalid indices" +msgstr "" + +#: lib/pleroma/web/admin_api/controllers/fallback_controller.ex:29 +#, elixir-format +msgid "Invalid parameters" +msgstr "" + +#: lib/pleroma/web/common_api/utils.ex:414 +#, elixir-format +msgid "Invalid password." +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:220 +#, elixir-format +msgid "Invalid request" +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:109 +#, elixir-format +msgid "Kocaptcha service unavailable" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:112 +#, elixir-format +msgid "Missing parameters" +msgstr "" + +#: lib/pleroma/web/common_api/utils.ex:547 +#, elixir-format +msgid "No such conversation" +msgstr "" + +#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:388 +#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:414 lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:456 +#, elixir-format +msgid "No such permission_group" +msgstr "" + +#: lib/pleroma/plugs/uploaded_media.ex:84 +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:486 lib/pleroma/web/admin_api/controllers/fallback_controller.ex:11 +#: lib/pleroma/web/feed/user_controller.ex:71 lib/pleroma/web/ostatus/ostatus_controller.ex:143 +#, elixir-format +msgid "Not found" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:331 +#, elixir-format +msgid "Poll's author can't vote" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:20 +#: lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:37 lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:49 +#: lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:50 lib/pleroma/web/mastodon_api/controllers/status_controller.ex:306 +#: lib/pleroma/web/mastodon_api/controllers/subscription_controller.ex:71 +#, elixir-format +msgid "Record not found" +msgstr "" + +#: lib/pleroma/web/admin_api/controllers/fallback_controller.ex:35 +#: lib/pleroma/web/feed/user_controller.ex:77 lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:36 +#: lib/pleroma/web/ostatus/ostatus_controller.ex:149 +#, elixir-format +msgid "Something went wrong" +msgstr "" + +#: lib/pleroma/web/common_api/activity_draft.ex:107 +#, 
elixir-format +msgid "The message visibility must be direct" +msgstr "" + +#: lib/pleroma/web/common_api/utils.ex:573 +#, elixir-format +msgid "The status is over the character limit" +msgstr "" + +#: lib/pleroma/plugs/ensure_public_or_authenticated_plug.ex:31 +#, elixir-format +msgid "This resource requires authentication." +msgstr "" + +#: lib/pleroma/plugs/rate_limiter/rate_limiter.ex:206 +#, elixir-format +msgid "Throttled" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:356 +#, elixir-format +msgid "Too many choices" +msgstr "" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:443 +#, elixir-format +msgid "Unhandled activity type" +msgstr "" + +#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:485 +#, elixir-format +msgid "You can't revoke your own admin status." +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:221 +#: lib/pleroma/web/oauth/oauth_controller.ex:308 +#, elixir-format +msgid "Your account is currently disabled" +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:183 +#: lib/pleroma/web/oauth/oauth_controller.ex:331 +#, elixir-format +msgid "Your login is missing a confirmed e-mail address" +msgstr "" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:390 +#, elixir-format +msgid "can't read inbox of %{nickname} as %{as_nickname}" +msgstr "" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:473 +#, elixir-format +msgid "can't update outbox of %{nickname} as %{as_nickname}" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:471 +#, elixir-format +msgid "conversation is already muted" +msgstr "" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:314 +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:492 +#, elixir-format +msgid "error" +msgstr "" + +#: lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex:32 +#, elixir-format +msgid "mascots can only be images" +msgstr "" + +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:62 +#, elixir-format +msgid "not found" +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:394 +#, elixir-format +msgid "Bad OAuth request." +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:115 +#, elixir-format +msgid "CAPTCHA already used" +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:112 +#, elixir-format +msgid "CAPTCHA expired" +msgstr "" + +#: lib/pleroma/plugs/uploaded_media.ex:57 +#, elixir-format +msgid "Failed" +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:410 +#, elixir-format +msgid "Failed to authenticate: %{message}." +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:441 +#, elixir-format +msgid "Failed to set up user account." +msgstr "" + +#: lib/pleroma/plugs/oauth_scopes_plug.ex:38 +#, elixir-format +msgid "Insufficient permissions: %{permissions}." 
+msgstr "" + +#: lib/pleroma/plugs/uploaded_media.ex:104 +#, elixir-format +msgid "Internal Error" +msgstr "" + +#: lib/pleroma/web/oauth/fallback_controller.ex:22 +#: lib/pleroma/web/oauth/fallback_controller.ex:29 +#, elixir-format +msgid "Invalid Username/Password" +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:118 +#, elixir-format +msgid "Invalid answer data" +msgstr "" + +#: lib/pleroma/web/nodeinfo/nodeinfo_controller.ex:33 +#, elixir-format +msgid "Nodeinfo schema version not handled" +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:172 +#, elixir-format +msgid "This action is outside the authorized scopes" +msgstr "" + +#: lib/pleroma/web/oauth/fallback_controller.ex:14 +#, elixir-format +msgid "Unknown error, please check the details and try again." +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:119 +#: lib/pleroma/web/oauth/oauth_controller.ex:158 +#, elixir-format +msgid "Unlisted redirect_uri." +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:390 +#, elixir-format +msgid "Unsupported OAuth provider: %{provider}." +msgstr "" + +#: lib/pleroma/uploaders/uploader.ex:72 +#, elixir-format +msgid "Uploader callback timeout" +msgstr "" + +#: lib/pleroma/web/uploader_controller.ex:23 +#, elixir-format +msgid "bad request" +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:103 +#, elixir-format +msgid "CAPTCHA Error" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:290 +#, elixir-format +msgid "Could not add reaction emoji" +msgstr "" + +#: lib/pleroma/web/common_api/common_api.ex:301 +#, elixir-format +msgid "Could not remove reaction emoji" +msgstr "" + +#: lib/pleroma/web/twitter_api/twitter_api.ex:129 +#, elixir-format +msgid "Invalid CAPTCHA (Missing parameter: %{name})" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/list_controller.ex:92 +#, elixir-format +msgid "List not found" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:123 +#, elixir-format +msgid "Missing parameter: %{name}" +msgstr "" + +#: lib/pleroma/web/oauth/oauth_controller.ex:210 +#: lib/pleroma/web/oauth/oauth_controller.ex:321 +#, elixir-format +msgid "Password reset is required" +msgstr "" + +#: lib/pleroma/tests/auth_test_controller.ex:9 +#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:6 lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:6 +#: lib/pleroma/web/admin_api/controllers/config_controller.ex:6 lib/pleroma/web/admin_api/controllers/fallback_controller.ex:6 +#: lib/pleroma/web/admin_api/controllers/invite_controller.ex:6 lib/pleroma/web/admin_api/controllers/media_proxy_cache_controller.ex:6 +#: lib/pleroma/web/admin_api/controllers/oauth_app_controller.ex:6 lib/pleroma/web/admin_api/controllers/relay_controller.ex:6 +#: lib/pleroma/web/admin_api/controllers/report_controller.ex:6 lib/pleroma/web/admin_api/controllers/status_controller.ex:6 +#: lib/pleroma/web/controller_helper.ex:6 lib/pleroma/web/embed_controller.ex:6 +#: lib/pleroma/web/fallback_redirect_controller.ex:6 lib/pleroma/web/feed/tag_controller.ex:6 +#: lib/pleroma/web/feed/user_controller.ex:6 lib/pleroma/web/mailer/subscription_controller.ex:2 +#: lib/pleroma/web/masto_fe_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/account_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/app_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/auth_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/conversation_controller.ex:6 
lib/pleroma/web/mastodon_api/controllers/custom_emoji_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/domain_block_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/filter_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/follow_request_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/instance_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/list_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/marker_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/mastodon_api_controller.ex:14 +#: lib/pleroma/web/mastodon_api/controllers/media_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/notification_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/report_controller.ex:8 +#: lib/pleroma/web/mastodon_api/controllers/scheduled_activity_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/search_controller.ex:6 +#: lib/pleroma/web/mastodon_api/controllers/status_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/subscription_controller.ex:7 +#: lib/pleroma/web/mastodon_api/controllers/suggestion_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/timeline_controller.ex:6 +#: lib/pleroma/web/media_proxy/media_proxy_controller.ex:6 lib/pleroma/web/mongooseim/mongoose_im_controller.ex:6 +#: lib/pleroma/web/nodeinfo/nodeinfo_controller.ex:6 lib/pleroma/web/oauth/fallback_controller.ex:6 +#: lib/pleroma/web/oauth/mfa_controller.ex:10 lib/pleroma/web/oauth/oauth_controller.ex:6 +#: lib/pleroma/web/ostatus/ostatus_controller.ex:6 lib/pleroma/web/pleroma_api/controllers/account_controller.ex:6 +#: lib/pleroma/web/pleroma_api/controllers/chat_controller.ex:5 lib/pleroma/web/pleroma_api/controllers/conversation_controller.ex:6 +#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:2 lib/pleroma/web/pleroma_api/controllers/emoji_reaction_controller.ex:6 +#: lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex:6 lib/pleroma/web/pleroma_api/controllers/notification_controller.ex:6 +#: lib/pleroma/web/pleroma_api/controllers/scrobble_controller.ex:6 +#: lib/pleroma/web/pleroma_api/controllers/two_factor_authentication_controller.ex:7 lib/pleroma/web/static_fe/static_fe_controller.ex:6 +#: lib/pleroma/web/twitter_api/controllers/password_controller.ex:10 lib/pleroma/web/twitter_api/controllers/remote_follow_controller.ex:6 +#: lib/pleroma/web/twitter_api/controllers/util_controller.ex:6 lib/pleroma/web/twitter_api/twitter_api_controller.ex:6 +#: lib/pleroma/web/uploader_controller.ex:6 lib/pleroma/web/web_finger/web_finger_controller.ex:6 +#, elixir-format +msgid "Security violation: OAuth scopes check was neither handled nor explicitly skipped." +msgstr "" + +#: lib/pleroma/plugs/ensure_authenticated_plug.ex:28 +#, elixir-format +msgid "Two-factor authentication enabled, you must use a access token." +msgstr "" + +#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:210 +#, elixir-format +msgid "Unexpected error occurred while adding file to pack." +msgstr "" + +#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:138 +#, elixir-format +msgid "Unexpected error occurred while creating pack." +msgstr "" + +#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:278 +#, elixir-format +msgid "Unexpected error occurred while removing file from pack." 
+msgstr "" + +#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:250 +#, elixir-format +msgid "Unexpected error occurred while updating file in pack." +msgstr "" + +#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:179 +#, elixir-format +msgid "Unexpected error occurred while updating pack metadata." +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/subscription_controller.ex:61 +#, elixir-format +msgid "Web push subscription is disabled on this Pleroma instance" +msgstr "" + +#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:451 +#, elixir-format +msgid "You can't revoke your own admin/moderator status." +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/timeline_controller.ex:126 +#, elixir-format +msgid "authorization required for timeline view" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:24 +#, elixir-format +msgid "Access denied" +msgstr "" + +#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:282 +#, elixir-format +msgid "This API requires an authenticated user" +msgstr "" + +#: lib/pleroma/plugs/user_is_admin_plug.ex:21 +#, elixir-format +msgid "User is not an admin." +msgstr "" diff --git a/priv/repo/migrations/20200825061316_move_activity_expirations_to_oban.exs b/priv/repo/migrations/20200825061316_move_activity_expirations_to_oban.exs index cdc00d20b..a703af83f 100644 --- a/priv/repo/migrations/20200825061316_move_activity_expirations_to_oban.exs +++ b/priv/repo/migrations/20200825061316_move_activity_expirations_to_oban.exs @@ -4,6 +4,8 @@ defmodule Pleroma.Repo.Migrations.MoveActivityExpirationsToOban do import Ecto.Query, only: [from: 2] def change do + Pleroma.Config.Oban.warn() + Supervisor.start_link([{Oban, Pleroma.Config.get(Oban)}], strategy: :one_for_one, name: Pleroma.Supervisor diff --git a/priv/repo/migrations/20200907092050_move_tokens_expiration_into_oban.exs b/priv/repo/migrations/20200907092050_move_tokens_expiration_into_oban.exs index 832bd02a7..9e49ddacb 100644 --- a/priv/repo/migrations/20200907092050_move_tokens_expiration_into_oban.exs +++ b/priv/repo/migrations/20200907092050_move_tokens_expiration_into_oban.exs @@ -4,6 +4,8 @@ defmodule Pleroma.Repo.Migrations.MoveTokensExpirationIntoOban do import Ecto.Query, only: [from: 2] def change do + Pleroma.Config.Oban.warn() + Supervisor.start_link([{Oban, Pleroma.Config.get(Oban)}], strategy: :one_for_one, name: Pleroma.Supervisor diff --git a/priv/repo/migrations/20200910113106_remove_managed_config_from_db.exs b/priv/repo/migrations/20200910113106_remove_managed_config_from_db.exs new file mode 100644 index 000000000..e27a9ae48 --- /dev/null +++ b/priv/repo/migrations/20200910113106_remove_managed_config_from_db.exs @@ -0,0 +1,27 @@ +defmodule Pleroma.Repo.Migrations.RemoveManagedConfigFromDb do + use Ecto.Migration + import Ecto.Query + alias Pleroma.ConfigDB + alias Pleroma.Repo + + def up do + config_entry = + from(c in ConfigDB, + select: [:id, :value], + where: c.group == ^:pleroma and c.key == ^:instance + ) + |> Repo.one() + + if config_entry do + {_, value} = Keyword.pop(config_entry.value, :managed_config) + + config_entry + |> Ecto.Changeset.change(value: value) + |> Repo.update() + end + end + + def down do + :ok + end +end diff --git a/priv/repo/migrations/20200911055909_remove_cron_jobs.exs b/priv/repo/migrations/20200911055909_remove_cron_jobs.exs new file mode 100644 index 000000000..33897d128 --- /dev/null +++ b/priv/repo/migrations/20200911055909_remove_cron_jobs.exs @@ -0,0 
+1,20 @@ +defmodule Pleroma.Repo.Migrations.RemoveCronJobs do + use Ecto.Migration + + import Ecto.Query, only: [from: 2] + + def up do + from(j in "oban_jobs", + where: + j.worker in ^[ + "Pleroma.Workers.Cron.PurgeExpiredActivitiesWorker", + "Pleroma.Workers.Cron.StatsWorker", + "Pleroma.Workers.Cron.ClearOauthTokenWorker" + ], + select: [:id] + ) + |> Pleroma.Repo.delete_all() + end + + def down, do: :ok +end diff --git a/priv/repo/migrations/20200914105638_delete_notification_without_activity.exs b/priv/repo/migrations/20200914105638_delete_notification_without_activity.exs new file mode 100644 index 000000000..9333fc5a1 --- /dev/null +++ b/priv/repo/migrations/20200914105638_delete_notification_without_activity.exs @@ -0,0 +1,30 @@ +defmodule Pleroma.Repo.Migrations.DeleteNotificationWithoutActivity do + use Ecto.Migration + + import Ecto.Query + alias Pleroma.Repo + + def up do + from( + q in Pleroma.Notification, + left_join: c in assoc(q, :activity), + select: %{id: type(q.id, :integer)}, + where: is_nil(c.id) + ) + |> Repo.chunk_stream(1_000, :batches) + |> Stream.each(fn records -> + notification_ids = Enum.map(records, fn %{id: id} -> id end) + + Repo.delete_all( + from(n in "notifications", + where: n.id in ^notification_ids + ) + ) + end) + |> Stream.run() + end + + def down do + :ok + end +end diff --git a/priv/repo/migrations/20200914105800_add_notification_constraints.exs b/priv/repo/migrations/20200914105800_add_notification_constraints.exs new file mode 100644 index 000000000..a65c35fd0 --- /dev/null +++ b/priv/repo/migrations/20200914105800_add_notification_constraints.exs @@ -0,0 +1,23 @@ +defmodule Pleroma.Repo.Migrations.AddNotificationConstraints do + use Ecto.Migration + + def up do + drop(constraint(:notifications, "notifications_activity_id_fkey")) + + alter table(:notifications) do + modify(:activity_id, references(:activities, type: :uuid, on_delete: :delete_all), + null: false + ) + end + end + + def down do + drop(constraint(:notifications, "notifications_activity_id_fkey")) + + alter table(:notifications) do + modify(:activity_id, references(:activities, type: :uuid, on_delete: :delete_all), + null: true + ) + end + end +end diff --git a/priv/repo/migrations/20200925065249_make_user_ids_ci.exs b/priv/repo/migrations/20200925065249_make_user_ids_ci.exs new file mode 100644 index 000000000..8ea0f2cf1 --- /dev/null +++ b/priv/repo/migrations/20200925065249_make_user_ids_ci.exs @@ -0,0 +1,9 @@ +defmodule Pleroma.Repo.Migrations.MakeUserIdsCI do + use Ecto.Migration + + def change do + # Migration retired, see + # https://git.pleroma.social/pleroma/pleroma/-/issues/2188 + :noop + end +end diff --git a/priv/repo/migrations/20200928145912_revert_citext_change.exs b/priv/repo/migrations/20200928145912_revert_citext_change.exs new file mode 100644 index 000000000..685a98533 --- /dev/null +++ b/priv/repo/migrations/20200928145912_revert_citext_change.exs @@ -0,0 +1,11 @@ +defmodule Pleroma.Repo.Migrations.RevertCitextChange do + use Ecto.Migration + + def change do + alter table(:users) do + modify(:uri, :text) + end + + # create_if_not_exists(unique_index(:users, :uri)) + end +end diff --git a/priv/repo/migrations/20200930082320_user_ur_is_index_part_three.exs b/priv/repo/migrations/20200930082320_user_ur_is_index_part_three.exs new file mode 100644 index 000000000..816c6526e --- /dev/null +++ b/priv/repo/migrations/20200930082320_user_ur_is_index_part_three.exs @@ -0,0 +1,8 @@ +defmodule Pleroma.Repo.Migrations.UserURIsIndexPartThree do + use Ecto.Migration + + def 
change do + drop_if_exists(unique_index(:users, :uri)) + create_if_not_exists(index(:users, :uri)) + end +end diff --git a/priv/static/index.html b/priv/static/index.html index 6fa237768..f5690a8d6 100644 --- a/priv/static/index.html +++ b/priv/static/index.html @@ -1 +1 @@ -Pleroma
\ No newline at end of file +Pleroma
\ No newline at end of file diff --git a/priv/static/static/font/fontello.1599568314856.woff b/priv/static/static/font/fontello.1599568314856.woff deleted file mode 100644 index 64f566383..000000000 Binary files a/priv/static/static/font/fontello.1599568314856.woff and /dev/null differ diff --git a/priv/static/static/font/fontello.1599568314856.woff2 b/priv/static/static/font/fontello.1599568314856.woff2 deleted file mode 100644 index 972e70831..000000000 Binary files a/priv/static/static/font/fontello.1599568314856.woff2 and /dev/null differ diff --git a/priv/static/static/font/fontello.1599568314856.eot b/priv/static/static/font/fontello.1600365488745.eot similarity index 89% rename from priv/static/static/font/fontello.1599568314856.eot rename to priv/static/static/font/fontello.1600365488745.eot index 1a6931a0e..255f50372 100644 Binary files a/priv/static/static/font/fontello.1599568314856.eot and b/priv/static/static/font/fontello.1600365488745.eot differ diff --git a/priv/static/static/font/fontello.1599568314856.svg b/priv/static/static/font/fontello.1600365488745.svg similarity index 98% rename from priv/static/static/font/fontello.1599568314856.svg rename to priv/static/static/font/fontello.1600365488745.svg index 71b5d70af..9eddf62ea 100644 --- a/priv/static/static/font/fontello.1599568314856.svg +++ b/priv/static/static/font/fontello.1600365488745.svg @@ -92,6 +92,8 @@ + + diff --git a/priv/static/static/font/fontello.1599568314856.ttf b/priv/static/static/font/fontello.1600365488745.ttf similarity index 89% rename from priv/static/static/font/fontello.1599568314856.ttf rename to priv/static/static/font/fontello.1600365488745.ttf index 795464475..6bda99d50 100644 Binary files a/priv/static/static/font/fontello.1599568314856.ttf and b/priv/static/static/font/fontello.1600365488745.ttf differ diff --git a/priv/static/static/font/fontello.1600365488745.woff b/priv/static/static/font/fontello.1600365488745.woff new file mode 100644 index 000000000..11c866ae0 Binary files /dev/null and b/priv/static/static/font/fontello.1600365488745.woff differ diff --git a/priv/static/static/font/fontello.1600365488745.woff2 b/priv/static/static/font/fontello.1600365488745.woff2 new file mode 100644 index 000000000..06151d28c Binary files /dev/null and b/priv/static/static/font/fontello.1600365488745.woff2 differ diff --git a/priv/static/static/fontello.1599568314856.css b/priv/static/static/fontello.1600365488745.css similarity index 89% rename from priv/static/static/fontello.1599568314856.css rename to priv/static/static/fontello.1600365488745.css index e636286c0..781ff7620 100644 --- a/priv/static/static/fontello.1599568314856.css +++ b/priv/static/static/fontello.1600365488745.css @@ -1,11 +1,11 @@ @font-face { font-family: "Icons"; - src: url("./font/fontello.1599568314856.eot"); - src: url("./font/fontello.1599568314856.eot") format("embedded-opentype"), - url("./font/fontello.1599568314856.woff2") format("woff2"), - url("./font/fontello.1599568314856.woff") format("woff"), - url("./font/fontello.1599568314856.ttf") format("truetype"), - url("./font/fontello.1599568314856.svg") format("svg"); + src: url("./font/fontello.1600365488745.eot"); + src: url("./font/fontello.1600365488745.eot") format("embedded-opentype"), + url("./font/fontello.1600365488745.woff2") format("woff2"), + url("./font/fontello.1600365488745.woff") format("woff"), + url("./font/fontello.1600365488745.ttf") format("truetype"), + url("./font/fontello.1600365488745.svg") format("svg"); font-weight: normal; font-style: 
normal; } @@ -156,3 +156,5 @@ .icon-music::before { content: "\e828"; } .icon-doc::before { content: "\e829"; } .icon-block::before { content: "\e82a"; } + +.icon-megaphone::before { content: "\e82b"; } diff --git a/priv/static/static/fontello.json b/priv/static/static/fontello.json index 706800cdb..b0136fd90 100644 --- a/priv/static/static/fontello.json +++ b/priv/static/static/fontello.json @@ -405,6 +405,12 @@ "css": "block", "code": 59434, "src": "fontawesome" + }, + { + "uid": "3e674995cacc2b09692c096ea7eb6165", + "css": "megaphone", + "code": 59435, + "src": "fontawesome" } ] } \ No newline at end of file diff --git a/priv/static/static/js/2.c92f4803ff24726cea58.js.map b/priv/static/static/js/2.c92f4803ff24726cea58.js.map deleted file mode 100644 index e3cc6a3fb..000000000 Binary files a/priv/static/static/js/2.c92f4803ff24726cea58.js.map and /dev/null differ diff --git a/priv/static/static/js/2.c92f4803ff24726cea58.js b/priv/static/static/js/2.e852a6b4b3bba752b838.js similarity index 66% rename from priv/static/static/js/2.c92f4803ff24726cea58.js rename to priv/static/static/js/2.e852a6b4b3bba752b838.js index 55aa1f44e..42e446575 100644 Binary files a/priv/static/static/js/2.c92f4803ff24726cea58.js and b/priv/static/static/js/2.e852a6b4b3bba752b838.js differ diff --git a/priv/static/static/js/2.e852a6b4b3bba752b838.js.map b/priv/static/static/js/2.e852a6b4b3bba752b838.js.map new file mode 100644 index 000000000..d698f09e1 Binary files /dev/null and b/priv/static/static/js/2.e852a6b4b3bba752b838.js.map differ diff --git a/priv/static/static/js/app.55d173dc5e39519aa518.js b/priv/static/static/js/app.55d173dc5e39519aa518.js deleted file mode 100644 index d04ae3499..000000000 Binary files a/priv/static/static/js/app.55d173dc5e39519aa518.js and /dev/null differ diff --git a/priv/static/static/js/app.55d173dc5e39519aa518.js.map b/priv/static/static/js/app.55d173dc5e39519aa518.js.map deleted file mode 100644 index 600e97afa..000000000 Binary files a/priv/static/static/js/app.55d173dc5e39519aa518.js.map and /dev/null differ diff --git a/priv/static/static/js/app.826c44232e0a76bbd9ba.js b/priv/static/static/js/app.826c44232e0a76bbd9ba.js new file mode 100644 index 000000000..16762165e Binary files /dev/null and b/priv/static/static/js/app.826c44232e0a76bbd9ba.js differ diff --git a/priv/static/static/js/app.826c44232e0a76bbd9ba.js.map b/priv/static/static/js/app.826c44232e0a76bbd9ba.js.map new file mode 100644 index 000000000..b188c3379 Binary files /dev/null and b/priv/static/static/js/app.826c44232e0a76bbd9ba.js.map differ diff --git a/priv/static/sw-pleroma.js b/priv/static/sw-pleroma.js index 54c6ed8f0..fa4969025 100644 Binary files a/priv/static/sw-pleroma.js and b/priv/static/sw-pleroma.js differ diff --git a/test/config/deprecation_warnings_test.exs b/test/config/deprecation_warnings_test.exs index e22052404..f81a7b580 100644 --- a/test/config/deprecation_warnings_test.exs +++ b/test/config/deprecation_warnings_test.exs @@ -1,5 +1,5 @@ defmodule Pleroma.Config.DeprecationWarningsTest do - use ExUnit.Case, async: true + use ExUnit.Case use Pleroma.Tests.Helpers import ExUnit.CaptureLog @@ -66,6 +66,30 @@ test "check_media_proxy_whitelist_config/0" do end) =~ "Your config is using old format (only domain) for MediaProxy whitelist option" end + test "check_welcome_message_config/0" do + clear_config([:instance, :welcome_user_nickname], "LainChan") + + assert capture_log(fn -> + DeprecationWarnings.check_welcome_message_config() + end) =~ "Your config is using the old namespace for 
Welcome messages configuration." + end + + test "check_hellthread_threshold/0" do + clear_config([:mrf_hellthread, :threshold], 16) + + assert capture_log(fn -> + DeprecationWarnings.check_hellthread_threshold() + end) =~ "You are using the old configuration mechanism for the hellthread filter." + end + + test "check_activity_expiration_config/0" do + clear_config([Pleroma.ActivityExpiration, :enabled], true) + + assert capture_log(fn -> + DeprecationWarnings.check_activity_expiration_config() + end) =~ "Your config is using old namespace for activity expiration configuration." + end + describe "check_gun_pool_options/0" do test "await_up_timeout" do config = Config.get(:connections_pool) @@ -74,7 +98,7 @@ test "await_up_timeout" do assert capture_log(fn -> DeprecationWarnings.check_gun_pool_options() end) =~ - "Your config is using old setting name `await_up_timeout` instead of `connect_timeout`" + "Your config is using old setting `config :pleroma, :connections_pool, await_up_timeout`." end test "pool timeout" do diff --git a/test/emoji/pack_test.exs b/test/emoji/pack_test.exs new file mode 100644 index 000000000..70d1eaa1b --- /dev/null +++ b/test/emoji/pack_test.exs @@ -0,0 +1,93 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Emoji.PackTest do + use ExUnit.Case, async: true + alias Pleroma.Emoji.Pack + + @emoji_path Path.join( + Pleroma.Config.get!([:instance, :static_dir]), + "emoji" + ) + + setup do + pack_path = Path.join(@emoji_path, "dump_pack") + File.mkdir(pack_path) + + File.write!(Path.join(pack_path, "pack.json"), """ + { + "files": { }, + "pack": { + "description": "Dump pack", "homepage": "https://pleroma.social", + "license": "Test license", "share-files": true + }} + """) + + {:ok, pack} = Pleroma.Emoji.Pack.load_pack("dump_pack") + + on_exit(fn -> + File.rm_rf!(pack_path) + end) + + {:ok, pack: pack} + end + + describe "add_file/4" do + test "add emojies from zip file", %{pack: pack} do + file = %Plug.Upload{ + content_type: "application/zip", + filename: "emojis.zip", + path: Path.absname("test/fixtures/emojis.zip") + } + + {:ok, updated_pack} = Pack.add_file(pack, nil, nil, file) + + assert updated_pack.files == %{ + "a_trusted_friend-128" => "128px/a_trusted_friend-128.png", + "auroraborealis" => "auroraborealis.png", + "baby_in_a_box" => "1000px/baby_in_a_box.png", + "bear" => "1000px/bear.png", + "bear-128" => "128px/bear-128.png" + } + + assert updated_pack.files_count == 5 + end + end + + test "returns error when zip file is bad", %{pack: pack} do + file = %Plug.Upload{ + content_type: "application/zip", + filename: "emojis.zip", + path: Path.absname("test/instance_static/emoji/test_pack/blank.png") + } + + assert Pack.add_file(pack, nil, nil, file) == {:error, :einval} + end + + test "returns pack when zip file is empty", %{pack: pack} do + file = %Plug.Upload{ + content_type: "application/zip", + filename: "emojis.zip", + path: Path.absname("test/fixtures/empty.zip") + } + + {:ok, updated_pack} = Pack.add_file(pack, nil, nil, file) + assert updated_pack == pack + end + + test "add emoji file", %{pack: pack} do + file = %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + + {:ok, updated_pack} = Pack.add_file(pack, "test_blank", "test_blank.png", file) + + assert updated_pack.files == %{ + "test_blank" => "test_blank.png" + } + + assert updated_pack.files_count == 1 + end +end diff --git a/test/emoji_test.exs 
b/test/emoji_test.exs index b36047578..1dd3c58c6 100644 --- a/test/emoji_test.exs +++ b/test/emoji_test.exs @@ -3,7 +3,7 @@ # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.EmojiTest do - use ExUnit.Case, async: true + use ExUnit.Case alias Pleroma.Emoji describe "is_unicode_emoji?/1" do diff --git a/test/fixtures/custom_instance_panel.html b/test/fixtures/custom_instance_panel.html new file mode 100644 index 000000000..6371a1e43 --- /dev/null +++ b/test/fixtures/custom_instance_panel.html @@ -0,0 +1 @@ +
Custom instance panel
\ No newline at end of file diff --git a/test/fixtures/emojis.zip b/test/fixtures/emojis.zip new file mode 100644 index 000000000..d7fc4732b Binary files /dev/null and b/test/fixtures/emojis.zip differ diff --git a/test/fixtures/empty.zip b/test/fixtures/empty.zip new file mode 100644 index 000000000..15cb0ecb3 Binary files /dev/null and b/test/fixtures/empty.zip differ diff --git a/test/fixtures/image.gif b/test/fixtures/image.gif new file mode 100755 index 000000000..9df64778b Binary files /dev/null and b/test/fixtures/image.gif differ diff --git a/test/fixtures/image.png b/test/fixtures/image.png new file mode 100755 index 000000000..e999e8800 Binary files /dev/null and b/test/fixtures/image.png differ diff --git a/test/fixtures/tesla_mock/admin@mastdon.example.org.json b/test/fixtures/tesla_mock/admin@mastdon.example.org.json index a911b979a..f961ccb36 100644 --- a/test/fixtures/tesla_mock/admin@mastdon.example.org.json +++ b/test/fixtures/tesla_mock/admin@mastdon.example.org.json @@ -1,20 +1,24 @@ { - "@context": ["https://www.w3.org/ns/activitystreams", "https://w3id.org/security/v1", { - "manuallyApprovesFollowers": "as:manuallyApprovesFollowers", - "sensitive": "as:sensitive", - "movedTo": "as:movedTo", - "Hashtag": "as:Hashtag", - "ostatus": "http://ostatus.org#", - "atomUri": "ostatus:atomUri", - "inReplyToAtomUri": "ostatus:inReplyToAtomUri", - "conversation": "ostatus:conversation", - "toot": "http://joinmastodon.org/ns#", - "Emoji": "toot:Emoji", - "alsoKnownAs": { - "@id": "as:alsoKnownAs", - "@type": "@id" + "@context": [ + "https://www.w3.org/ns/activitystreams", + "https://w3id.org/security/v1", + { + "manuallyApprovesFollowers": "as:manuallyApprovesFollowers", + "sensitive": "as:sensitive", + "movedTo": "as:movedTo", + "Hashtag": "as:Hashtag", + "ostatus": "http://ostatus.org#", + "atomUri": "ostatus:atomUri", + "inReplyToAtomUri": "ostatus:inReplyToAtomUri", + "conversation": "ostatus:conversation", + "toot": "http://joinmastodon.org/ns#", + "Emoji": "toot:Emoji", + "alsoKnownAs": { + "@id": "as:alsoKnownAs", + "@type": "@id" + } } - }], + ], "id": "http://mastodon.example.org/users/admin", "type": "Person", "following": "http://mastodon.example.org/users/admin/following", @@ -23,6 +27,7 @@ "outbox": "http://mastodon.example.org/users/admin/outbox", "preferredUsername": "admin", "name": null, + "discoverable": "true", "summary": "\u003cp\u003e\u003c/p\u003e", "url": "http://mastodon.example.org/@admin", "manuallyApprovesFollowers": false, @@ -34,7 +39,8 @@ "owner": "http://mastodon.example.org/users/admin", "publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAtc4Tir+3ADhSNF6VKrtW\nOU32T01w7V0yshmQei38YyiVwVvFu8XOP6ACchkdxbJ+C9mZud8qWaRJKVbFTMUG\nNX4+6Q+FobyuKrwN7CEwhDALZtaN2IPbaPd6uG1B7QhWorrY+yFa8f2TBM3BxnUy\nI4T+bMIZIEYG7KtljCBoQXuTQmGtuffO0UwJksidg2ffCF5Q+K//JfQagJ3UzrR+\nZXbKMJdAw4bCVJYs4Z5EhHYBwQWiXCyMGTd7BGlmMkY6Av7ZqHKC/owp3/0EWDNz\nNqF09Wcpr3y3e8nA10X40MJqp/wR+1xtxp+YGbq/Cj5hZGBG7etFOmIpVBrDOhry\nBwIDAQAB\n-----END PUBLIC KEY-----\n" }, - "attachment": [{ + "attachment": [ + { "type": "PropertyValue", "name": "foo", "value": "bar" @@ -58,5 +64,7 @@ "mediaType": "image/png", "url": "https://cdn.niu.moe/accounts/headers/000/033/323/original/850b3448fa5fd477.png" }, - "alsoKnownAs": ["http://example.org/users/foo"] -} + "alsoKnownAs": [ + "http://example.org/users/foo" + ] +} \ No newline at end of file diff --git a/test/fixtures/tesla_mock/framatube.org-video.json b/test/fixtures/tesla_mock/framatube.org-video.json index 
3d53f0c97..1fa529886 100644 --- a/test/fixtures/tesla_mock/framatube.org-video.json +++ b/test/fixtures/tesla_mock/framatube.org-video.json @@ -1 +1 @@ -{"type":"Video","id":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206","name":"Déframasoftisons Internet [Framasoft]","duration":"PT3622S","uuid":"6050732a-8a7a-43d4-a6cd-809525a1d206","tag":[{"type":"Hashtag","name":"déframasoftisons"},{"type":"Hashtag","name":"EPN23"},{"type":"Hashtag","name":"framaconf"},{"type":"Hashtag","name":"Framasoft"},{"type":"Hashtag","name":"pyg"}],"category":{"identifier":"15","name":"Science & Technology"},"views":122,"sensitive":false,"waitTranscoding":false,"state":1,"commentsEnabled":true,"downloadEnabled":true,"published":"2020-05-24T18:34:31.569Z","originallyPublishedAt":"2019-11-30T23:00:00.000Z","updated":"2020-07-05T09:01:01.720Z","mediaType":"text/markdown","content":"Après avoir mené avec un certain succès la campagne « Dégooglisons Internet » en 2014, l’association Framasoft annonce fin 2019 arrêter progressivement un certain nombre de ses services alternatifs aux GAFAM. Pourquoi ?\r\n\r\nTranscription par @april...","support":null,"subtitleLanguage":[],"icon":{"type":"Image","url":"https://framatube.org/static/thumbnails/6050732a-8a7a-43d4-a6cd-809525a1d206.jpg","mediaType":"image/jpeg","width":223,"height":122},"url":[{"type":"Link","mediaType":"text/html","href":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206"},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4","height":1080,"size":1157359410,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309939","height":1080,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-1080.torrent","height":1080},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.torrent&xt=urn:btih:381c9429900552e23a4eb506318f1fa01e4d63a8&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4","height":1080},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4","height":480,"size":250095131,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309941","height":480,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-480.torrent","height":480},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-
a6cd-809525a1d206-480.torrent&xt=urn:btih:a181dcbb5368ab5c31cc9ff07634becb72c344ee&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4","height":480},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4","height":360,"size":171357733,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309942","height":360,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-360.torrent","height":360},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.torrent&xt=urn:btih:aedfa9479ea04a175eee0b0bd0bda64076308746&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4","height":360},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4","height":720,"size":497100839,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309943","height":720,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-720.torrent","height":720},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.torrent&xt=urn:btih:71971668f82a3b24ac71bc3a982848dd8dc5a5f5&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4","height":720},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4","height":240,"size":113038
439,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309944","height":240,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-240.torrent","height":240},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.torrent&xt=urn:btih:c42aa6c95efb28d9f114ebd98537f7b00fa72246&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4","height":240},{"type":"Link","mediaType":"application/x-mpegURL","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/master.m3u8","tag":[{"type":"Infohash","name":"f7428214539626e062f300f2ca4cf9154575144e"},{"type":"Infohash","name":"46e236dffb1ea6b9123a5396cbe88e97dd94cc6c"},{"type":"Infohash","name":"11f1045830b5d786c788f2594d19f128764e7d87"},{"type":"Infohash","name":"4327ad3e0d84de100130a27e9ab6fe40c4284f0e"},{"type":"Infohash","name":"41e2eee8e7b23a63c23a77c40a46de11492a4831"},{"type":"Link","name":"sha256","mediaType":"application/json","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/segments-sha256.json"},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-1080-fragmented.mp4","height":1080,"size":1156777472,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309940","height":1080,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-1080-hls.torrent","height":1080},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080-hls.torrent&xt=urn:btih:0204d780ebfab0d5d9d3476a038e812ad792deeb&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080-fragmented.mp4","height":1080},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-480-fragmented.mp4","height":480,"size":249562889,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309945","height":480,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a
7a-43d4-a6cd-809525a1d206-480-hls.torrent","height":480},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480-hls.torrent&xt=urn:btih:5d14f38ded29de629668fe1cfc61a75f4cce2628&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480-fragmented.mp4","height":480},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-360-fragmented.mp4","height":360,"size":170836415,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309946","height":360,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-360-hls.torrent","height":360},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360-hls.torrent&xt=urn:btih:30125488789080ad405ebcee6c214945f31b8f30&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360-fragmented.mp4","height":360},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-720-fragmented.mp4","height":720,"size":496533741,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309947","height":720,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-720-hls.torrent","height":720},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720-hls.torrent&xt=urn:btih:8ed1e8bccde709901c26e315fc8f53bfd26d1ba6&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720-fragmented.mp4","height":720},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-240-fragmented.mp4","height":240,"size":112529249,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309948","height":240,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torre
nts/6050732a-8a7a-43d4-a6cd-809525a1d206-240-hls.torrent","height":240},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240-hls.torrent&xt=urn:btih:8b452bf4e70b9078d4e74ca8b5523cc9dc70d10a&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240-fragmented.mp4","height":240}]}],"likes":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/likes","dislikes":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/dislikes","shares":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/announces","comments":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/comments","attributedTo":[{"type":"Person","id":"https://framatube.org/accounts/framasoft"},{"type":"Group","id":"https://framatube.org/video-channels/bf54d359-cfad-4935-9d45-9d6be93f63e8"}],"to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["https://framatube.org/accounts/framasoft/followers"],"@context":["https://www.w3.org/ns/activitystreams","https://w3id.org/security/v1",{"RsaSignature2017":"https://w3id.org/security#RsaSignature2017"},{"pt":"https://joinpeertube.org/ns#","sc":"http://schema.org#","Hashtag":"as:Hashtag","uuid":"sc:identifier","category":"sc:category","licence":"sc:license","subtitleLanguage":"sc:subtitleLanguage","sensitive":"as:sensitive","language":"sc:inLanguage","Infohash":"pt:Infohash","Playlist":"pt:Playlist","PlaylistElement":"pt:PlaylistElement","originallyPublishedAt":"sc:datePublished","views":{"@type":"sc:Number","@id":"pt:views"},"state":{"@type":"sc:Number","@id":"pt:state"},"size":{"@type":"sc:Number","@id":"pt:size"},"fps":{"@type":"sc:Number","@id":"pt:fps"},"startTimestamp":{"@type":"sc:Number","@id":"pt:startTimestamp"},"stopTimestamp":{"@type":"sc:Number","@id":"pt:stopTimestamp"},"position":{"@type":"sc:Number","@id":"pt:position"},"commentsEnabled":{"@type":"sc:Boolean","@id":"pt:commentsEnabled"},"downloadEnabled":{"@type":"sc:Boolean","@id":"pt:downloadEnabled"},"waitTranscoding":{"@type":"sc:Boolean","@id":"pt:waitTranscoding"},"support":{"@type":"sc:Text","@id":"pt:support"},"likes":{"@id":"as:likes","@type":"@id"},"dislikes":{"@id":"as:dislikes","@type":"@id"},"playlists":{"@id":"pt:playlists","@type":"@id"},"shares":{"@id":"as:shares","@type":"@id"},"comments":{"@id":"as:comments","@type":"@id"}}]} \ No newline at end of file +{"type":"Create","id":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/activity","actor":"https://framatube.org/accounts/framasoft","object":{"type":"Video","id":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206","name":"Déframasoftisons Internet [Framasoft]","duration":"PT3622S","uuid":"6050732a-8a7a-43d4-a6cd-809525a1d206","tag":[{"type":"Hashtag","name":"déframasoftisons"},{"type":"Hashtag","name":"EPN23"},{"type":"Hashtag","name":"framaconf"},{"type":"Hashtag","name":"Framasoft"},{"type":"Hashtag","name":"pyg"}],"category":{"identifier":"15","name":"Science & 
Technology"},"views":154,"sensitive":false,"waitTranscoding":false,"state":1,"commentsEnabled":true,"downloadEnabled":true,"published":"2020-05-24T18:34:31.569Z","originallyPublishedAt":"2019-11-30T23:00:00.000Z","updated":"2020-08-17T11:01:02.994Z","mediaType":"text/markdown","content":"Après avoir mené avec un certain succès la campagne « Dégooglisons Internet » en 2014, l’association Framasoft annonce fin 2019 arrêter progressivement un certain nombre de ses services alternatifs aux GAFAM. Pourquoi ?\r\n\r\nTranscription par @aprilorg ici : https://www.april.org/deframasoftisons-internet-pierre-yves-gosset-framasoft","support":null,"subtitleLanguage":[],"icon":[{"type":"Image","url":"https://framatube.org/static/thumbnails/6050732a-8a7a-43d4-a6cd-809525a1d206.jpg","mediaType":"image/jpeg","width":223,"height":122},{"type":"Image","url":"https://framatube.org/static/previews/6050732a-8a7a-43d4-a6cd-809525a1d206.jpg","mediaType":"image/jpeg","width":850,"height":480}],"url":[{"type":"Link","mediaType":"text/html","href":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206"},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4","height":1080,"size":1157359410,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309939","height":1080,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-1080.torrent","height":1080},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.torrent&xt=urn:btih:381c9429900552e23a4eb506318f1fa01e4d63a8&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4","height":1080},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4","height":720,"size":497100839,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309943","height":720,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-720.torrent","height":720},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.torrent&xt=urn:btih:71971668f82a3b24ac71bc3a982848dd8dc5a5f5&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4&ws=https%3A%2F%2Ftube.privacyt
ools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720.mp4","height":720},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4","height":480,"size":250095131,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309941","height":480,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-480.torrent","height":480},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.torrent&xt=urn:btih:a181dcbb5368ab5c31cc9ff07634becb72c344ee&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480.mp4","height":480},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4","height":360,"size":171357733,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309942","height":360,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-360.torrent","height":360},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.torrent&xt=urn:btih:aedfa9479ea04a175eee0b0bd0bda64076308746&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4&ws=https%3A%2F%2Fpeertube.live%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360.mp4","height":360},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4","height":240,"size":113038439,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309944","height":240,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-240.torrent","he
ight":240},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.torrent&xt=urn:btih:c42aa6c95efb28d9f114ebd98537f7b00fa72246&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fwebseed%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4&ws=https%3A%2F%2Fpeertube.iselfhost.com%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4&ws=https%3A%2F%2Ftube.privacytools.io%2Fstatic%2Fredundancy%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240.mp4","height":240},{"type":"Link","mediaType":"application/x-mpegURL","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/master.m3u8","tag":[{"type":"Infohash","name":"f7428214539626e062f300f2ca4cf9154575144e"},{"type":"Infohash","name":"46e236dffb1ea6b9123a5396cbe88e97dd94cc6c"},{"type":"Infohash","name":"11f1045830b5d786c788f2594d19f128764e7d87"},{"type":"Infohash","name":"4327ad3e0d84de100130a27e9ab6fe40c4284f0e"},{"type":"Infohash","name":"41e2eee8e7b23a63c23a77c40a46de11492a4831"},{"type":"Link","name":"sha256","mediaType":"application/json","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/segments-sha256.json"},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-1080-fragmented.mp4","height":1080,"size":1156777472,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309940","height":1080,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-1080-hls.torrent","height":1080},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080-hls.torrent&xt=urn:btih:0204d780ebfab0d5d9d3476a038e812ad792deeb&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-1080-fragmented.mp4","height":1080},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-720-fragmented.mp4","height":720,"size":496533741,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309947","height":720,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-720-hls.torrent","height":720},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720-hls.torrent&xt=urn:btih:8ed1e8bccde709901c26e315fc8f53bfd26d1ba6&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss
%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-720-fragmented.mp4","height":720},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-480-fragmented.mp4","height":480,"size":249562889,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309945","height":480,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-480-hls.torrent","height":480},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480-hls.torrent&xt=urn:btih:5d14f38ded29de629668fe1cfc61a75f4cce2628&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-480-fragmented.mp4","height":480},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-360-fragmented.mp4","height":360,"size":170836415,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309946","height":360,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-360-hls.torrent","height":360},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360-hls.torrent&xt=urn:btih:30125488789080ad405ebcee6c214945f31b8f30&dn=D%C3%A9framasoftisons+Internet+%5BFramasoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-360-fragmented.mp4","height":360},{"type":"Link","mediaType":"video/mp4","href":"https://framatube.org/static/streaming-playlists/hls/6050732a-8a7a-43d4-a6cd-809525a1d206/6050732a-8a7a-43d4-a6cd-809525a1d206-240-fragmented.mp4","height":240,"size":112529249,"fps":25},{"type":"Link","rel":["metadata","video/mp4"],"mediaType":"application/json","href":"https://framatube.org/api/v1/videos/6050732a-8a7a-43d4-a6cd-809525a1d206/metadata/1309948","height":240,"fps":25},{"type":"Link","mediaType":"application/x-bittorrent","href":"https://framatube.org/static/torrents/6050732a-8a7a-43d4-a6cd-809525a1d206-240-hls.torrent","height":240},{"type":"Link","mediaType":"application/x-bittorrent;x-scheme-handler/magnet","href":"magnet:?xs=https%3A%2F%2Fframatube.org%2Fstatic%2Ftorrents%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240-hls.torrent&xt=urn:btih:8b452bf4e70b9078d4e74ca8b5523cc9dc70d10a&dn=D%C3%A9framasoftisons+Internet+%5BFram
asoft%5D&tr=wss%3A%2F%2Fframatube.org%3A443%2Ftracker%2Fsocket&tr=https%3A%2F%2Fframatube.org%2Ftracker%2Fannounce&ws=https%3A%2F%2Fframatube.org%2Fstatic%2Fstreaming-playlists%2Fhls%2F6050732a-8a7a-43d4-a6cd-809525a1d206%2F6050732a-8a7a-43d4-a6cd-809525a1d206-240-fragmented.mp4","height":240}]}],"likes":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/likes","dislikes":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/dislikes","shares":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/announces","comments":"https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206/comments","attributedTo":[{"type":"Person","id":"https://framatube.org/accounts/framasoft"},{"type":"Group","id":"https://framatube.org/video-channels/bf54d359-cfad-4935-9d45-9d6be93f63e8"}],"to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["https://framatube.org/accounts/framasoft/followers"]},"to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["https://framatube.org/accounts/framasoft/followers"],"@context":["https://www.w3.org/ns/activitystreams","https://w3id.org/security/v1",{"RsaSignature2017":"https://w3id.org/security#RsaSignature2017"},{"pt":"https://joinpeertube.org/ns#","sc":"http://schema.org#","Hashtag":"as:Hashtag","uuid":"sc:identifier","category":"sc:category","licence":"sc:license","subtitleLanguage":"sc:subtitleLanguage","sensitive":"as:sensitive","language":"sc:inLanguage","Infohash":"pt:Infohash","Playlist":"pt:Playlist","PlaylistElement":"pt:PlaylistElement","originallyPublishedAt":"sc:datePublished","views":{"@type":"sc:Number","@id":"pt:views"},"state":{"@type":"sc:Number","@id":"pt:state"},"size":{"@type":"sc:Number","@id":"pt:size"},"fps":{"@type":"sc:Number","@id":"pt:fps"},"startTimestamp":{"@type":"sc:Number","@id":"pt:startTimestamp"},"stopTimestamp":{"@type":"sc:Number","@id":"pt:stopTimestamp"},"position":{"@type":"sc:Number","@id":"pt:position"},"commentsEnabled":{"@type":"sc:Boolean","@id":"pt:commentsEnabled"},"downloadEnabled":{"@type":"sc:Boolean","@id":"pt:downloadEnabled"},"waitTranscoding":{"@type":"sc:Boolean","@id":"pt:waitTranscoding"},"support":{"@type":"sc:Text","@id":"pt:support"},"likes":{"@id":"as:likes","@type":"@id"},"dislikes":{"@id":"as:dislikes","@type":"@id"},"playlists":{"@id":"pt:playlists","@type":"@id"},"shares":{"@id":"as:shares","@type":"@id"},"comments":{"@id":"as:comments","@type":"@id"}}]} \ No newline at end of file diff --git a/test/fixtures/tesla_mock/https___osada.macgirvin.com_channel_mike.json b/test/fixtures/tesla_mock/https___osada.macgirvin.com_channel_mike.json index c42f3a53c..ca76d6e17 100644 --- a/test/fixtures/tesla_mock/https___osada.macgirvin.com_channel_mike.json +++ b/test/fixtures/tesla_mock/https___osada.macgirvin.com_channel_mike.json @@ -8,6 +8,7 @@ "preferredUsername": "mike", "name": "Mike Macgirvin (Osada)", "updated": "2018-08-29T03:09:11Z", + "discoverable": "true", "icon": { "type": "Image", "mediaType": "image/jpeg", @@ -51,4 +52,4 @@ "created": "2018-10-17T07:16:28Z", "signatureValue": 
"WbfFVIPImkd3yNu6brz0CvZaeV242rwAbH0vy8DM4vfnXCxLr5Uv/Wj9gwP+tbooTxGaahAKBeqlGkQp8RLEo37LATrKMRLA/0V6DeeV+C5ORWR9B4WxyWiD3s/9Wf+KesFMtktNLAcMZ5PfnOS/xNYerhnpkp/gWPxtkglmLIWJv+w18A5zZ01JCxsO4QljHbhYaEUPHUfQ97abrkLECeam+FThVwdO6BFCtbjoNXHfzjpSZL/oKyBpi5/fpnqMqOLOQPs5WgBBZJvjEYYkQcoPTyxYI5NGpNbzIjGHPQNuACnOelH16A7L+q4swLWDIaEFeXQ2/5bmqVKZDZZ6usNP4QyTVszwd8jqo27qcDTNibXDUTsTdKpNQvM/3UncBuzuzmUV3FczhtGshIU1/pRVZiQycpVqPlGLvXhP/yZCe+1siyqDd+3uMaS2vkHTObSl5r+VYof+c+TcjrZXHSWnQTg8/X3zkoBWosrQ93VZcwjzMxQoARYv6rphbOoTz7RPmGAXYUt3/PDWkqDlmQDwCpLNNkJo1EidyefZBdD9HXQpCBO0ZU0NHb0JmPvg/+zU0krxlv70bm3RHA/maBETVjroIWzt7EwQEg5pL2hVnvSBG+1wF3BtRVe77etkPOHxLnYYIcAMLlVKCcgDd89DPIziQyruvkx1busHI08=" } -} +} \ No newline at end of file diff --git a/test/fixtures/tesla_mock/wedistribute-create-article.json b/test/fixtures/tesla_mock/wedistribute-create-article.json new file mode 100644 index 000000000..3cfef8b99 --- /dev/null +++ b/test/fixtures/tesla_mock/wedistribute-create-article.json @@ -0,0 +1 @@ +{"@context":["https:\/\/www.w3.org\/ns\/activitystreams"],"type":"Create","actor":"https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/actor\/-blog","object":{"@context":["https:\/\/www.w3.org\/ns\/activitystreams"],"type":"Article","name":"The end is near: Mastodon plans to drop OStatus support","content":"\n

The days of OStatus are numbered. The venerable protocol has served as a glue between many different types of servers since the early days of the Fediverse, connecting StatusNet (now GNU Social) to Friendica, Hubzilla, Mastodon, and Pleroma.<\/p>\n\n\n\n

Now that many fediverse platforms support ActivityPub as a successor protocol, Mastodon appears to be drawing a line in the sand. In a Patreon update<\/a>, Eugen Rochko writes:<\/p>\n\n\n\n

...OStatus...has overstayed its welcome in the code...and now that most of the network uses ActivityPub, it's time for it to go. <\/p>Eugen Rochko, Mastodon creator<\/cite><\/blockquote>\n\n\n\n

The pull request<\/a> to remove Pubsubhubbub and Salmon, two of the main components of OStatus, has already been merged into Mastodon's master branch.<\/p>\n\n\n\n

Some projects will be left in the dark as a side effect of this. GNU Social and PostActiv, for example, both only communicate using OStatus. While some discussion<\/a> exists regarding adopting ActivityPub for GNU Social, and a plugin is in development<\/a>, it hasn't been formally adopted yet. We just hope that the Free Software Foundation's instance<\/a> gets updated in time!<\/p>\n","summary":"One of the largest platforms in the federated social web is dropping the protocol that it started with.","attributedTo":"https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/actor\/-blog","url":"https:\/\/wedistribute.org\/2019\/07\/mastodon-drops-ostatus\/","to":["https:\/\/www.w3.org\/ns\/activitystreams#Public","https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/actor\/-blog\/followers"],"id":"https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/object\/85810","likes":"https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/object\/85810\/likes","shares":"https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/object\/85810\/shares"},"to":["https:\/\/www.w3.org\/ns\/activitystreams#Public","https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/actor\/-blog\/followers"],"id":"https:\/\/wedistribute.org\/wp-json\/pterotype\/v1\/object\/85809"} \ No newline at end of file diff --git a/test/instance_static/emoji/blobs.gg/blank.png b/test/instance_static/emoji/blobs.gg/blank.png new file mode 100644 index 000000000..8f50fa023 Binary files /dev/null and b/test/instance_static/emoji/blobs.gg/blank.png differ diff --git a/test/instance_static/emoji/blobs.gg/pack.json b/test/instance_static/emoji/blobs.gg/pack.json new file mode 100644 index 000000000..481891b08 --- /dev/null +++ b/test/instance_static/emoji/blobs.gg/pack.json @@ -0,0 +1,11 @@ +{ + "files": { + "blank": "blank.png" + }, + "pack": { + "description": "Test description", + "homepage": "https://pleroma.social", + "license": "Test license", + "share-files": true + } +} \ No newline at end of file diff --git a/test/integration/mastodon_websocket_test.exs b/test/integration/mastodon_websocket_test.exs index 76fbc8bda..0f2e6cc2b 100644 --- a/test/integration/mastodon_websocket_test.exs +++ b/test/integration/mastodon_websocket_test.exs @@ -78,7 +78,7 @@ test "receives well formatted events" do Pleroma.Repo.insert( OAuth.App.register_changeset(%OAuth.App{}, %{ client_name: "client", - scopes: ["scope"], + scopes: ["read"], redirect_uris: "url" }) ) diff --git a/test/marker_test.exs b/test/marker_test.exs index 5b6d0b4a4..7b3943c7b 100644 --- a/test/marker_test.exs +++ b/test/marker_test.exs @@ -33,8 +33,8 @@ test "return empty multi" do test "returns user markers" do user = insert(:user) marker = insert(:marker, user: user) - insert(:notification, user: user) - insert(:notification, user: user) + insert(:notification, user: user, activity: insert(:note_activity)) + insert(:notification, user: user, activity: insert(:note_activity)) insert(:marker, timeline: "home", user: user) assert Marker.get_markers( diff --git a/test/notification_test.exs b/test/notification_test.exs index a09b08675..f2e0f0b0d 100644 --- a/test/notification_test.exs +++ b/test/notification_test.exs @@ -179,17 +179,19 @@ test "does not create a notification for subscribed users if status is a reply" describe "create_notification" do @tag needs_streamer: true test "it creates a notification for user and send to the 'user' and the 'user:notification' stream" do - user = insert(:user) + %{user: user, token: oauth_token} = oauth_access(["read"]) task = Task.async(fn -> - 
Streamer.get_topic_and_add_socket("user", user) + {:ok, _topic} = Streamer.get_topic_and_add_socket("user", user, oauth_token) assert_receive {:render_with_user, _, _, _}, 4_000 end) task_user_notification = Task.async(fn -> - Streamer.get_topic_and_add_socket("user:notification", user) + {:ok, _topic} = + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) + assert_receive {:render_with_user, _, _, _}, 4_000 end) diff --git a/test/object/fetcher_test.exs b/test/object/fetcher_test.exs index 16cfa7f5c..14d2c645f 100644 --- a/test/object/fetcher_test.exs +++ b/test/object/fetcher_test.exs @@ -6,10 +6,12 @@ defmodule Pleroma.Object.FetcherTest do use Pleroma.DataCase alias Pleroma.Activity + alias Pleroma.Config alias Pleroma.Object alias Pleroma.Object.Fetcher - import Tesla.Mock + import Mock + import Tesla.Mock setup do mock(fn @@ -71,20 +73,20 @@ test "it works when fetching the OP actor errors out" do setup do: clear_config([:instance, :federation_incoming_replies_max_depth]) test "it returns thread depth exceeded error if thread depth is exceeded" do - Pleroma.Config.put([:instance, :federation_incoming_replies_max_depth], 0) + Config.put([:instance, :federation_incoming_replies_max_depth], 0) assert {:error, "Max thread distance exceeded."} = Fetcher.fetch_object_from_id(@ap_id, depth: 1) end test "it fetches object if max thread depth is restricted to 0 and depth is not specified" do - Pleroma.Config.put([:instance, :federation_incoming_replies_max_depth], 0) + Config.put([:instance, :federation_incoming_replies_max_depth], 0) assert {:ok, _} = Fetcher.fetch_object_from_id(@ap_id) end test "it fetches object if requested depth does not exceed max thread depth" do - Pleroma.Config.put([:instance, :federation_incoming_replies_max_depth], 10) + Config.put([:instance, :federation_incoming_replies_max_depth], 10) assert {:ok, _} = Fetcher.fetch_object_from_id(@ap_id, depth: 10) end @@ -120,6 +122,16 @@ test "it fetches an object" do assert object == object_again end + + test "Return MRF reason when fetched status is rejected by one" do + clear_config([:mrf_keyword, :reject], ["yeah"]) + clear_config([:mrf, :policies], [Pleroma.Web.ActivityPub.MRF.KeywordPolicy]) + + assert {:reject, "[KeywordPolicy] Matches with rejected keyword"} == + Fetcher.fetch_object_from_id( + "http://mastodon.example.org/@admin/99541947525187367" + ) + end end describe "implementation quirks" do @@ -212,7 +224,7 @@ test "it can refetch pruned objects" do Pleroma.Signature, [:passthrough], [] do - Pleroma.Config.put([:activitypub, :sign_object_fetches], true) + Config.put([:activitypub, :sign_object_fetches], true) Fetcher.fetch_object_from_id("http://mastodon.example.org/@admin/99541947525187367") @@ -223,7 +235,7 @@ test "it can refetch pruned objects" do Pleroma.Signature, [:passthrough], [] do - Pleroma.Config.put([:activitypub, :sign_object_fetches], false) + Config.put([:activitypub, :sign_object_fetches], false) Fetcher.fetch_object_from_id("http://mastodon.example.org/@admin/99541947525187367") diff --git a/test/plugs/remote_ip_test.exs b/test/plugs/remote_ip_test.exs index 752ab32e7..6d01c812d 100644 --- a/test/plugs/remote_ip_test.exs +++ b/test/plugs/remote_ip_test.exs @@ -3,13 +3,27 @@ # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.Plugs.RemoteIpTest do - use ExUnit.Case, async: true + use ExUnit.Case use Plug.Test alias Pleroma.Plugs.RemoteIp - import Pleroma.Tests.Helpers, only: [clear_config: 1, clear_config: 2] - setup do: clear_config(RemoteIp) + import 
Pleroma.Tests.Helpers, only: [clear_config: 2] + + setup do: + clear_config(RemoteIp, + enabled: true, + headers: ["x-forwarded-for"], + proxies: [], + reserved: [ + "127.0.0.0/8", + "::1/128", + "fc00::/7", + "10.0.0.0/8", + "172.16.0.0/12", + "192.168.0.0/16" + ] + ) test "disabled" do Pleroma.Config.put(RemoteIp, enabled: false) @@ -25,8 +39,6 @@ test "disabled" do end test "enabled" do - Pleroma.Config.put(RemoteIp, enabled: true) - conn = conn(:get, "/") |> put_req_header("x-forwarded-for", "1.1.1.1") @@ -54,8 +66,6 @@ test "custom headers" do end test "custom proxies" do - Pleroma.Config.put(RemoteIp, enabled: true) - conn = conn(:get, "/") |> put_req_header("x-forwarded-for", "173.245.48.1, 1.1.1.1, 173.245.48.2") @@ -72,4 +82,27 @@ test "custom proxies" do assert conn.remote_ip == {1, 1, 1, 1} end + + test "proxies set without CIDR format" do + Pleroma.Config.put([RemoteIp, :proxies], ["173.245.48.1"]) + + conn = + conn(:get, "/") + |> put_req_header("x-forwarded-for", "173.245.48.1, 1.1.1.1") + |> RemoteIp.call(nil) + + assert conn.remote_ip == {1, 1, 1, 1} + end + + test "proxies set `nonsensical` CIDR" do + Pleroma.Config.put([RemoteIp, :reserved], ["127.0.0.0/8"]) + Pleroma.Config.put([RemoteIp, :proxies], ["10.0.0.3/24"]) + + conn = + conn(:get, "/") + |> put_req_header("x-forwarded-for", "10.0.0.3, 1.1.1.1") + |> RemoteIp.call(nil) + + assert conn.remote_ip == {1, 1, 1, 1} + end end diff --git a/test/repo_test.exs b/test/repo_test.exs index 92e827c95..155791be2 100644 --- a/test/repo_test.exs +++ b/test/repo_test.exs @@ -37,7 +37,9 @@ test "get one-to-one assoc from repo" do test "get one-to-many assoc from repo" do user = insert(:user) - notification = refresh_record(insert(:notification, user: user)) + + notification = + refresh_record(insert(:notification, user: user, activity: insert(:note_activity))) assert Repo.get_assoc(user, :notifications) == {:ok, [notification]} end @@ -47,4 +49,32 @@ test "return error if has not assoc " do assert Repo.get_assoc(token, :user) == {:error, :not_found} end end + + describe "chunk_stream/3" do + test "fetch records one-by-one" do + users = insert_list(50, :user) + + {fetch_users, 50} = + from(t in User) + |> Repo.chunk_stream(5) + |> Enum.reduce({[], 0}, fn %User{} = user, {acc, count} -> + {acc ++ [user], count + 1} + end) + + assert users == fetch_users + end + + test "fetch records in bulk" do + users = insert_list(50, :user) + + {fetch_users, 10} = + from(t in User) + |> Repo.chunk_stream(5, :batches) + |> Enum.reduce({[], 0}, fn users, {acc, count} -> + {acc ++ users, count + 1} + end) + + assert users == fetch_users + end + end end diff --git a/test/support/data_case.ex b/test/support/data_case.ex index ba8848952..d5456521c 100644 --- a/test/support/data_case.ex +++ b/test/support/data_case.ex @@ -27,6 +27,21 @@ defmodule Pleroma.DataCase do import Ecto.Query import Pleroma.DataCase use Pleroma.Tests.Helpers + + # Sets up OAuth access with specified scopes + defp oauth_access(scopes, opts \\ []) do + user = + Keyword.get_lazy(opts, :user, fn -> + Pleroma.Factory.insert(:user) + end) + + token = + Keyword.get_lazy(opts, :oauth_token, fn -> + Pleroma.Factory.insert(:oauth_token, user: user, scopes: scopes) + end) + + %{user: user, token: token} + end end end diff --git a/test/support/factory.ex b/test/support/factory.ex index 2fdfabbc5..fb82be0c4 100644 --- a/test/support/factory.ex +++ b/test/support/factory.ex @@ -31,6 +31,7 @@ def user_factory do nickname: sequence(:nickname, &"nick#{&1}"), password_hash: 
Pbkdf2.hash_pwd_salt("test"), bio: sequence(:bio, &"Tester Number #{&1}"), + discoverable: true, last_digest_emailed_at: NaiveDateTime.utc_now(), last_refreshed_at: NaiveDateTime.utc_now(), notification_settings: %Pleroma.User.NotificationSetting{}, diff --git a/test/support/http_request_mock.ex b/test/support/http_request_mock.ex index 344e27f13..cb022333f 100644 --- a/test/support/http_request_mock.ex +++ b/test/support/http_request_mock.ex @@ -1262,4 +1262,21 @@ def post(url, query, body, headers) do inspect(headers) }"} end + + # Most of the rich media mocks are missing HEAD requests, so we just return 404. + @rich_media_mocks [ + "https://example.com/ogp", + "https://example.com/ogp-missing-data", + "https://example.com/twitter-card" + ] + def head(url, _query, _body, _headers) when url in @rich_media_mocks do + {:ok, %Tesla.Env{status: 404, body: ""}} + end + + def head(url, query, body, headers) do + {:error, + "Mock response not implemented for HEAD #{inspect(url)}, #{query}, #{inspect(body)}, #{ + inspect(headers) + }"} + end end diff --git a/test/support/web_push_http_client_mock.ex b/test/support/web_push_http_client_mock.ex deleted file mode 100644 index 3cd12957d..000000000 --- a/test/support/web_push_http_client_mock.ex +++ /dev/null @@ -1,23 +0,0 @@ -# Pleroma: A lightweight social networking server -# Copyright © 2017-2020 Pleroma Authors -# SPDX-License-Identifier: AGPL-3.0-only - -defmodule Pleroma.Web.WebPushHttpClientMock do - def get(url, headers \\ [], options \\ []) do - { - res, - %Tesla.Env{status: status} - } = Pleroma.HTTP.request(:get, url, "", headers, options) - - {res, %{status_code: status}} - end - - def post(url, body, headers \\ [], options \\ []) do - { - res, - %Tesla.Env{status: status} - } = Pleroma.HTTP.request(:post, url, body, headers, options) - - {res, %{status_code: status}} - end -end diff --git a/test/tasks/config_test.exs b/test/tasks/config_test.exs index fb12e7fb3..f36648829 100644 --- a/test/tasks/config_test.exs +++ b/test/tasks/config_test.exs @@ -40,6 +40,19 @@ test "error if file with custom settings doesn't exist" do on_exit(fn -> Application.put_env(:quack, :level, initial) end) end + @tag capture_log: true + test "config migration refused when deprecated settings are found" do + clear_config([:media_proxy, :whitelist], ["domain_without_scheme.com"]) + assert Repo.all(ConfigDB) == [] + + Mix.Tasks.Pleroma.Config.migrate_to_db("test/fixtures/config/temp.secret.exs") + + assert_received {:mix_shell, :error, [message]} + + assert message =~ + "Migration is not allowed until all deprecation warnings have been resolved." 
+ end + test "filtered settings are migrated to db" do assert Repo.all(ConfigDB) == [] diff --git a/test/tasks/email_test.exs b/test/tasks/email_test.exs index c3af7ef68..5393e3573 100644 --- a/test/tasks/email_test.exs +++ b/test/tasks/email_test.exs @@ -6,6 +6,8 @@ defmodule Mix.Tasks.Pleroma.EmailTest do alias Pleroma.Config alias Pleroma.Tests.ObanHelpers + import Pleroma.Factory + setup_all do Mix.shell(Mix.Shell.Process) @@ -17,6 +19,7 @@ defmodule Mix.Tasks.Pleroma.EmailTest do end setup do: clear_config([Pleroma.Emails.Mailer, :enabled], true) + setup do: clear_config([:instance, :account_activation_required], true) describe "pleroma.email test" do test "Sends test email with no given address" do @@ -50,5 +53,71 @@ test "Sends test email with given address" do html_body: ~r/a test email was requested./i ) end + + test "Sends confirmation emails" do + local_user1 = + insert(:user, %{ + confirmation_pending: true, + confirmation_token: "mytoken", + deactivated: false, + email: "local1@pleroma.com", + local: true + }) + + local_user2 = + insert(:user, %{ + confirmation_pending: true, + confirmation_token: "mytoken", + deactivated: false, + email: "local2@pleroma.com", + local: true + }) + + :ok = Mix.Tasks.Pleroma.Email.run(["resend_confirmation_emails"]) + + ObanHelpers.perform_all() + + assert_email_sent(to: {local_user1.name, local_user1.email}) + assert_email_sent(to: {local_user2.name, local_user2.email}) + end + + test "Does not send confirmation email to inappropriate users" do + # confirmed user + insert(:user, %{ + confirmation_pending: false, + confirmation_token: "mytoken", + deactivated: false, + email: "confirmed@pleroma.com", + local: true + }) + + # remote user + insert(:user, %{ + deactivated: false, + email: "remote@not-pleroma.com", + local: false + }) + + # deactivated user = + insert(:user, %{ + deactivated: true, + email: "deactivated@pleroma.com", + local: false + }) + + # invisible user + insert(:user, %{ + deactivated: false, + email: "invisible@pleroma.com", + local: true, + invisible: true + }) + + :ok = Mix.Tasks.Pleroma.Email.run(["resend_confirmation_emails"]) + + ObanHelpers.perform_all() + + refute_email_sent() + end end end diff --git a/test/tasks/relay_test.exs b/test/tasks/relay_test.exs index e5225b64c..cf48e7dda 100644 --- a/test/tasks/relay_test.exs +++ b/test/tasks/relay_test.exs @@ -81,6 +81,80 @@ test "relay is unfollowed" do assert undo_activity.data["object"]["id"] == cancelled_activity.data["id"] refute "#{target_instance}/followers" in User.following(local_user) end + + test "unfollow when relay is dead" do + user = insert(:user) + target_instance = user.ap_id + + Mix.Tasks.Pleroma.Relay.run(["follow", target_instance]) + + %User{ap_id: follower_id} = local_user = Relay.get_actor() + target_user = User.get_cached_by_ap_id(target_instance) + follow_activity = Utils.fetch_latest_follow(local_user, target_user) + User.follow(local_user, target_user) + + assert "#{target_instance}/followers" in User.following(local_user) + + Tesla.Mock.mock(fn %{method: :get, url: ^target_instance} -> + %Tesla.Env{status: 404} + end) + + Pleroma.Repo.delete(user) + Cachex.clear(:user_cache) + + Mix.Tasks.Pleroma.Relay.run(["unfollow", target_instance]) + + cancelled_activity = Activity.get_by_ap_id(follow_activity.data["id"]) + assert cancelled_activity.data["state"] == "accept" + + assert [] == + ActivityPub.fetch_activities( + [], + %{ + type: "Undo", + actor_id: follower_id, + skip_preload: true, + invisible_actors: true + } + ) + end + + test "force unfollow 
when relay is dead" do + user = insert(:user) + target_instance = user.ap_id + + Mix.Tasks.Pleroma.Relay.run(["follow", target_instance]) + + %User{ap_id: follower_id} = local_user = Relay.get_actor() + target_user = User.get_cached_by_ap_id(target_instance) + follow_activity = Utils.fetch_latest_follow(local_user, target_user) + User.follow(local_user, target_user) + + assert "#{target_instance}/followers" in User.following(local_user) + + Tesla.Mock.mock(fn %{method: :get, url: ^target_instance} -> + %Tesla.Env{status: 404} + end) + + Pleroma.Repo.delete(user) + Cachex.clear(:user_cache) + + Mix.Tasks.Pleroma.Relay.run(["unfollow", target_instance, "--force"]) + + cancelled_activity = Activity.get_by_ap_id(follow_activity.data["id"]) + assert cancelled_activity.data["state"] == "cancelled" + + [undo_activity] = + ActivityPub.fetch_activities( + [], + %{type: "Undo", actor_id: follower_id, skip_preload: true, invisible_actors: true} + ) + + assert undo_activity.data["type"] == "Undo" + assert undo_activity.data["actor"] == local_user.ap_id + assert undo_activity.data["object"]["id"] == cancelled_activity.data["id"] + refute "#{target_instance}/followers" in User.following(local_user) + end end describe "mix pleroma.relay list" do diff --git a/test/tasks/user_test.exs b/test/tasks/user_test.exs index ce43a9cc7..b8c423c48 100644 --- a/test/tasks/user_test.exs +++ b/test/tasks/user_test.exs @@ -225,47 +225,64 @@ test "no user to deactivate" do test "All statuses set" do user = insert(:user) - Mix.Tasks.Pleroma.User.run(["set", user.nickname, "--moderator", "--admin", "--locked"]) + Mix.Tasks.Pleroma.User.run([ + "set", + user.nickname, + "--admin", + "--confirmed", + "--locked", + "--moderator" + ]) assert_received {:mix_shell, :info, [message]} - assert message =~ ~r/Moderator status .* true/ + assert message =~ ~r/Admin status .* true/ + + assert_received {:mix_shell, :info, [message]} + assert message =~ ~r/Confirmation pending .* false/ assert_received {:mix_shell, :info, [message]} assert message =~ ~r/Locked status .* true/ assert_received {:mix_shell, :info, [message]} - assert message =~ ~r/Admin status .* true/ + assert message =~ ~r/Moderator status .* true/ user = User.get_cached_by_nickname(user.nickname) assert user.is_moderator assert user.locked assert user.is_admin + refute user.confirmation_pending end test "All statuses unset" do - user = insert(:user, locked: true, is_moderator: true, is_admin: true) + user = + insert(:user, locked: true, is_moderator: true, is_admin: true, confirmation_pending: true) Mix.Tasks.Pleroma.User.run([ "set", user.nickname, - "--no-moderator", "--no-admin", - "--no-locked" + "--no-confirmed", + "--no-locked", + "--no-moderator" ]) assert_received {:mix_shell, :info, [message]} - assert message =~ ~r/Moderator status .* false/ + assert message =~ ~r/Admin status .* false/ + + assert_received {:mix_shell, :info, [message]} + assert message =~ ~r/Confirmation pending .* true/ assert_received {:mix_shell, :info, [message]} assert message =~ ~r/Locked status .* false/ assert_received {:mix_shell, :info, [message]} - assert message =~ ~r/Admin status .* false/ + assert message =~ ~r/Moderator status .* false/ user = User.get_cached_by_nickname(user.nickname) refute user.is_moderator refute user.locked refute user.is_admin + assert user.confirmation_pending end test "no user to set status" do @@ -554,4 +571,44 @@ test "it prints an error message when user is not exist" do assert message =~ "Could not change user tags" end end + + describe "bulk confirm 
and unconfirm" do + test "confirm all" do + user1 = insert(:user, confirmation_pending: true) + user2 = insert(:user, confirmation_pending: true) + + assert user1.confirmation_pending + assert user2.confirmation_pending + + Mix.Tasks.Pleroma.User.run(["confirm_all"]) + + user1 = User.get_cached_by_nickname(user1.nickname) + user2 = User.get_cached_by_nickname(user2.nickname) + + refute user1.confirmation_pending + refute user2.confirmation_pending + end + + test "unconfirm all" do + user1 = insert(:user, confirmation_pending: false) + user2 = insert(:user, confirmation_pending: false) + admin = insert(:user, is_admin: true, confirmation_pending: false) + mod = insert(:user, is_moderator: true, confirmation_pending: false) + + refute user1.confirmation_pending + refute user2.confirmation_pending + + Mix.Tasks.Pleroma.User.run(["unconfirm_all"]) + + user1 = User.get_cached_by_nickname(user1.nickname) + user2 = User.get_cached_by_nickname(user2.nickname) + admin = User.get_cached_by_nickname(admin.nickname) + mod = User.get_cached_by_nickname(mod.nickname) + + assert user1.confirmation_pending + assert user2.confirmation_pending + refute admin.confirmation_pending + refute mod.confirmation_pending + end + end end diff --git a/test/user/import_test.exs b/test/user/import_test.exs new file mode 100644 index 000000000..e404deeb5 --- /dev/null +++ b/test/user/import_test.exs @@ -0,0 +1,76 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.User.ImportTest do + alias Pleroma.Repo + alias Pleroma.Tests.ObanHelpers + alias Pleroma.User + + use Pleroma.DataCase + use Oban.Testing, repo: Pleroma.Repo + + import Pleroma.Factory + + setup_all do + Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end) + :ok + end + + describe "follow_import" do + test "it imports user followings from list" do + [user1, user2, user3] = insert_list(3, :user) + + identifiers = [ + user2.ap_id, + user3.nickname + ] + + {:ok, job} = User.Import.follow_import(user1, identifiers) + + assert {:ok, result} = ObanHelpers.perform(job) + assert is_list(result) + assert result == [user2, user3] + assert User.following?(user1, user2) + assert User.following?(user1, user3) + end + end + + describe "blocks_import" do + test "it imports user blocks from list" do + [user1, user2, user3] = insert_list(3, :user) + + identifiers = [ + user2.ap_id, + user3.nickname + ] + + {:ok, job} = User.Import.blocks_import(user1, identifiers) + + assert {:ok, result} = ObanHelpers.perform(job) + assert is_list(result) + assert result == [user2, user3] + assert User.blocks?(user1, user2) + assert User.blocks?(user1, user3) + end + end + + describe "mutes_import" do + test "it imports user mutes from list" do + [user1, user2, user3] = insert_list(3, :user) + + identifiers = [ + user2.ap_id, + user3.nickname + ] + + {:ok, job} = User.Import.mutes_import(user1, identifiers) + + assert {:ok, result} = ObanHelpers.perform(job) + assert is_list(result) + assert result == [user2, user3] + assert User.mutes?(user1, user2) + assert User.mutes?(user1, user3) + end + end +end diff --git a/test/user/query_test.exs b/test/user/query_test.exs new file mode 100644 index 000000000..e2f5c7d81 --- /dev/null +++ b/test/user/query_test.exs @@ -0,0 +1,37 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.User.QueryTest do + use 
Pleroma.DataCase, async: true + + alias Pleroma.Repo + alias Pleroma.User + alias Pleroma.User.Query + alias Pleroma.Web.ActivityPub.InternalFetchActor + + import Pleroma.Factory + + describe "internal users" do + test "it filters out internal users by default" do + %User{nickname: "internal.fetch"} = InternalFetchActor.get_actor() + + assert [_user] = User |> Repo.all() + assert [] == %{} |> Query.build() |> Repo.all() + end + + test "it filters out users without nickname by default" do + insert(:user, %{nickname: nil}) + + assert [_user] = User |> Repo.all() + assert [] == %{} |> Query.build() |> Repo.all() + end + + test "it returns internal users when enabled" do + %User{nickname: "internal.fetch"} = InternalFetchActor.get_actor() + insert(:user, %{nickname: nil}) + + assert %{internal: true} |> Query.build() |> Repo.aggregate(:count) == 2 + end + end +end diff --git a/test/user_search_test.exs b/test/user_search_test.exs index 01976bf58..c4b805005 100644 --- a/test/user_search_test.exs +++ b/test/user_search_test.exs @@ -17,6 +17,46 @@ defmodule Pleroma.UserSearchTest do describe "User.search" do setup do: clear_config([:instance, :limit_to_local_content]) + test "returns a resolved user as the first result" do + Pleroma.Config.put([:instance, :limit_to_local_content], false) + user = insert(:user, %{nickname: "no_relation", ap_id: "https://lain.com/users/lain"}) + _user = insert(:user, %{nickname: "com_user"}) + + [first_user, _second_user] = User.search("https://lain.com/users/lain", resolve: true) + + assert first_user.id == user.id + end + + test "returns a user with matching ap_id as the first result" do + user = insert(:user, %{nickname: "no_relation", ap_id: "https://lain.com/users/lain"}) + _user = insert(:user, %{nickname: "com_user"}) + + [first_user, _second_user] = User.search("https://lain.com/users/lain") + + assert first_user.id == user.id + end + + test "doesn't die if two users have the same uri" do + insert(:user, %{uri: "https://gensokyo.2hu/@raymoo"}) + insert(:user, %{uri: "https://gensokyo.2hu/@raymoo"}) + assert [_first_user, _second_user] = User.search("https://gensokyo.2hu/@raymoo") + end + + test "returns a user with matching uri as the first result" do + user = + insert(:user, %{ + nickname: "no_relation", + ap_id: "https://lain.com/users/lain", + uri: "https://lain.com/@lain" + }) + + _user = insert(:user, %{nickname: "com_user"}) + + [first_user, _second_user] = User.search("https://lain.com/@lain") + + assert first_user.id == user.id + end + test "excludes invisible users from results" do user = insert(:user, %{nickname: "john t1000"}) insert(:user, %{invisible: true, nickname: "john t800"}) @@ -25,6 +65,14 @@ test "excludes invisible users from results" do assert found_user.id == user.id end + test "excludes users when discoverable is false" do + insert(:user, %{nickname: "john 3000", discoverable: false}) + insert(:user, %{nickname: "john 3001"}) + + users = User.search("john") + assert Enum.count(users) == 1 + end + test "excludes service actors from results" do insert(:user, actor_type: "Application", nickname: "user1") service = insert(:user, actor_type: "Service", nickname: "user2") diff --git a/test/user_test.exs b/test/user_test.exs index 50f72549e..d506f7047 100644 --- a/test/user_test.exs +++ b/test/user_test.exs @@ -440,6 +440,45 @@ test "it sends a welcome chat message if it is set" do assert activity.actor == welcome_user.ap_id end + setup do: + clear_config(:mrf_simple, + media_removal: [], + media_nsfw: [], + federated_timeline_removal: [], 
+ report_removal: [], + reject: [], + followers_only: [], + accept: [], + avatar_removal: [], + banner_removal: [], + reject_deletes: [] + ) + + setup do: + clear_config(:mrf, + policies: [ + Pleroma.Web.ActivityPub.MRF.SimplePolicy + ] + ) + + test "it sends a welcome chat message when Simple policy applied to local instance" do + Pleroma.Config.put([:mrf_simple, :media_nsfw], ["localhost"]) + + welcome_user = insert(:user) + Pleroma.Config.put([:welcome, :chat_message, :enabled], true) + Pleroma.Config.put([:welcome, :chat_message, :sender_nickname], welcome_user.nickname) + Pleroma.Config.put([:welcome, :chat_message, :message], "Hello, this is a chat message") + + cng = User.register_changeset(%User{}, @full_user_data) + {:ok, registered_user} = User.register(cng) + ObanHelpers.perform_all() + + activity = Repo.one(Pleroma.Activity) + assert registered_user.ap_id in activity.recipients + assert Object.normalize(activity).data["content"] =~ "chat message" + assert activity.actor == welcome_user.ap_id + end + test "it sends a welcome email message if it is set" do welcome_user = insert(:user) Pleroma.Config.put([:welcome, :email, :enabled], true) @@ -470,7 +509,12 @@ test "it sends a confirm email" do cng = User.register_changeset(%User{}, @full_user_data) {:ok, registered_user} = User.register(cng) ObanHelpers.perform_all() - assert_email_sent(Pleroma.Emails.UserEmail.account_confirmation_email(registered_user)) + + Pleroma.Emails.UserEmail.account_confirmation_email(registered_user) + # temporary hackney fix until hackney max_connections bug is fixed + # https://git.pleroma.social/pleroma/pleroma/-/issues/2101 + |> Swoosh.Email.put_private(:hackney_options, ssl_options: [versions: [:"tlsv1.2"]]) + |> assert_email_sent() end test "it requires an email, name, nickname and password, bio is optional when account_activation_required is enabled" do @@ -932,23 +976,6 @@ test "it sets the follower_count property" do end end - describe "follow_import" do - test "it imports user followings from list" do - [user1, user2, user3] = insert_list(3, :user) - - identifiers = [ - user2.ap_id, - user3.nickname - ] - - {:ok, job} = User.follow_import(user1, identifiers) - - assert {:ok, result} = ObanHelpers.perform(job) - assert is_list(result) - assert result == [user2, user3] - end - end - describe "mutes" do test "it mutes people" do user = insert(:user) @@ -1155,23 +1182,6 @@ test "follows take precedence over domain blocks" do end end - describe "blocks_import" do - test "it imports user blocks from list" do - [user1, user2, user3] = insert_list(3, :user) - - identifiers = [ - user2.ap_id, - user3.nickname - ] - - {:ok, job} = User.blocks_import(user1, identifiers) - - assert {:ok, result} = ObanHelpers.perform(job) - assert is_list(result) - assert result == [user2, user3] - end - end - describe "get_recipients_from_activity" do test "works for announces" do actor = insert(:user) @@ -1637,7 +1647,7 @@ test "returns true when the account is itself" do assert User.visible_for(user, user) == :visible end - test "returns false when the account is unauthenticated and auth is required" do + test "returns false when the account is unconfirmed and confirmation is required" do Pleroma.Config.put([:instance, :account_activation_required], true) user = insert(:user, local: true, confirmation_pending: true) @@ -1646,14 +1656,23 @@ test "returns false when the account is unauthenticated and auth is required" do refute User.visible_for(user, other_user) == :visible end - test "returns true when the account is 
unauthenticated and auth is not required" do + test "returns true when the account is unconfirmed and confirmation is required but the account is remote" do + Pleroma.Config.put([:instance, :account_activation_required], true) + + user = insert(:user, local: false, confirmation_pending: true) + other_user = insert(:user, local: true) + + assert User.visible_for(user, other_user) == :visible + end + + test "returns true when the account is unconfirmed and confirmation is not required" do user = insert(:user, local: true, confirmation_pending: true) other_user = insert(:user, local: true) assert User.visible_for(user, other_user) == :visible end - test "returns true when the account is unauthenticated and being viewed by a privileged account (auth required)" do + test "returns true when the account is unconfirmed and being viewed by a privileged account (confirmation required)" do Pleroma.Config.put([:instance, :account_activation_required], true) user = insert(:user, local: true, confirmation_pending: true) diff --git a/test/utils_test.exs b/test/utils_test.exs new file mode 100644 index 000000000..460f7e0b5 --- /dev/null +++ b/test/utils_test.exs @@ -0,0 +1,15 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.UtilsTest do + use ExUnit.Case, async: true + + describe "tmp_dir/1" do + test "returns unique temporary directory" do + {:ok, path} = Pleroma.Utils.tmp_dir("emoji") + assert path =~ ~r/\/emoji-(.*)-#{:os.getpid()}-(.*)/ + File.rm_rf(path) + end + end +end diff --git a/test/web/activity_pub/activity_pub_test.exs b/test/web/activity_pub/activity_pub_test.exs index d8caa0b00..804305a13 100644 --- a/test/web/activity_pub/activity_pub_test.exs +++ b/test/web/activity_pub/activity_pub_test.exs @@ -1810,6 +1810,14 @@ test "public timeline with default reply_visibility `self`", %{users: %{u1: user |> Enum.map(& &1.id) assert activities_ids == [] + + activities_ids = + %{} + |> Map.put(:reply_visibility, "self") + |> Map.put(:reply_filtering_user, nil) + |> ActivityPub.fetch_public_activities() + + assert activities_ids == [] end test "home timeline", %{users: %{u1: user}} do @@ -2169,4 +2177,84 @@ test "does nothing with a clashing nickname and the same ap id" do assert user.nickname == orig_user.nickname end end + + describe "reply filtering" do + test "`following` still contains announcements by friends" do + user = insert(:user) + followed = insert(:user) + not_followed = insert(:user) + + User.follow(user, followed) + + {:ok, followed_post} = CommonAPI.post(followed, %{status: "Hello"}) + + {:ok, not_followed_to_followed} = + CommonAPI.post(not_followed, %{ + status: "Also hello", + in_reply_to_status_id: followed_post.id + }) + + {:ok, retoot} = CommonAPI.repeat(not_followed_to_followed.id, followed) + + params = + %{} + |> Map.put(:type, ["Create", "Announce"]) + |> Map.put(:blocking_user, user) + |> Map.put(:muting_user, user) + |> Map.put(:reply_filtering_user, user) + |> Map.put(:reply_visibility, "following") + |> Map.put(:announce_filtering_user, user) + |> Map.put(:user, user) + + activities = + [user.ap_id | User.following(user)] + |> ActivityPub.fetch_activities(params) + + followed_post_id = followed_post.id + retoot_id = retoot.id + + assert [%{id: ^followed_post_id}, %{id: ^retoot_id}] = activities + + assert length(activities) == 2 + end + + # This test is skipped because, while this is the desired behavior, + # there seems to be no good way to achieve it with the 
method that + # we currently use for detecting to who a reply is directed. + # This is a TODO and should be fixed by a later rewrite of the code + # in question. + @tag skip: true + test "`following` still contains self-replies by friends" do + user = insert(:user) + followed = insert(:user) + not_followed = insert(:user) + + User.follow(user, followed) + + {:ok, followed_post} = CommonAPI.post(followed, %{status: "Hello"}) + {:ok, not_followed_post} = CommonAPI.post(not_followed, %{status: "Also hello"}) + + {:ok, _followed_to_not_followed} = + CommonAPI.post(followed, %{status: "sup", in_reply_to_status_id: not_followed_post.id}) + + {:ok, _followed_self_reply} = + CommonAPI.post(followed, %{status: "Also cofe", in_reply_to_status_id: followed_post.id}) + + params = + %{} + |> Map.put(:type, ["Create", "Announce"]) + |> Map.put(:blocking_user, user) + |> Map.put(:muting_user, user) + |> Map.put(:reply_filtering_user, user) + |> Map.put(:reply_visibility, "following") + |> Map.put(:announce_filtering_user, user) + |> Map.put(:user, user) + + activities = + [user.ap_id | User.following(user)] + |> ActivityPub.fetch_activities(params) + + assert length(activities) == 2 + end + end end diff --git a/test/web/activity_pub/mrf/mediaproxy_warming_policy_test.exs b/test/web/activity_pub/mrf/mediaproxy_warming_policy_test.exs index 313d59a66..1710c4d2a 100644 --- a/test/web/activity_pub/mrf/mediaproxy_warming_policy_test.exs +++ b/test/web/activity_pub/mrf/mediaproxy_warming_policy_test.exs @@ -22,6 +22,8 @@ defmodule Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicyTest do } } + setup do: clear_config([:media_proxy, :enabled], true) + test "it prefetches media proxy URIs" do with_mock HTTP, get: fn _, _, _ -> {:ok, []} end do MediaProxyWarmingPolicy.filter(@message) diff --git a/test/web/activity_pub/mrf/mrf_test.exs b/test/web/activity_pub/mrf/mrf_test.exs index a63b25423..e82c8afa6 100644 --- a/test/web/activity_pub/mrf/mrf_test.exs +++ b/test/web/activity_pub/mrf/mrf_test.exs @@ -61,6 +61,8 @@ test "matches are case-insensitive" do describe "describe/0" do test "it works as expected with noop policy" do + clear_config([:mrf, :policies], [Pleroma.Web.ActivityPub.MRF.NoOpPolicy]) + expected = %{ mrf_policies: ["NoOpPolicy"], exclusions: false diff --git a/test/web/activity_pub/object_validators/note_validator_test.exs b/test/web/activity_pub/object_validators/article_note_validator_test.exs similarity index 76% rename from test/web/activity_pub/object_validators/note_validator_test.exs rename to test/web/activity_pub/object_validators/article_note_validator_test.exs index 30c481ffb..cc6dab872 100644 --- a/test/web/activity_pub/object_validators/note_validator_test.exs +++ b/test/web/activity_pub/object_validators/article_note_validator_test.exs @@ -2,10 +2,10 @@ # Copyright © 2017-2020 Pleroma Authors # SPDX-License-Identifier: AGPL-3.0-only -defmodule Pleroma.Web.ActivityPub.ObjectValidators.NoteValidatorTest do +defmodule Pleroma.Web.ActivityPub.ObjectValidators.ArticleNoteValidatorTest do use Pleroma.DataCase - alias Pleroma.Web.ActivityPub.ObjectValidators.NoteValidator + alias Pleroma.Web.ActivityPub.ObjectValidators.ArticleNoteValidator alias Pleroma.Web.ActivityPub.Utils import Pleroma.Factory @@ -29,7 +29,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.NoteValidatorTest do end test "a basic note validates", %{note: note} do - %{valid?: true} = NoteValidator.cast_and_validate(note) + %{valid?: true} = ArticleNoteValidator.cast_and_validate(note) end end end diff --git 
a/test/web/activity_pub/pipeline_test.exs b/test/web/activity_pub/pipeline_test.exs index f2a231eaf..210a06563 100644 --- a/test/web/activity_pub/pipeline_test.exs +++ b/test/web/activity_pub/pipeline_test.exs @@ -26,7 +26,7 @@ test "when given an `object_data` in meta, Federation will receive a the origina { Pleroma.Web.ActivityPub.MRF, [], - [filter: fn o -> {:ok, o} end] + [pipeline_filter: fn o, m -> {:ok, o, m} end] }, { Pleroma.Web.ActivityPub.ActivityPub, @@ -51,7 +51,7 @@ test "when given an `object_data` in meta, Federation will receive a the origina Pleroma.Web.ActivityPub.Pipeline.common_pipeline(activity, meta) assert_called(Pleroma.Web.ActivityPub.ObjectValidator.validate(activity, meta)) - assert_called(Pleroma.Web.ActivityPub.MRF.filter(activity)) + assert_called(Pleroma.Web.ActivityPub.MRF.pipeline_filter(activity, meta)) assert_called(Pleroma.Web.ActivityPub.ActivityPub.persist(activity, meta)) assert_called(Pleroma.Web.ActivityPub.SideEffects.handle(activity, meta)) refute called(Pleroma.Web.Federator.publish(activity)) @@ -68,7 +68,7 @@ test "it goes through validation, filtering, persisting, side effects and federa { Pleroma.Web.ActivityPub.MRF, [], - [filter: fn o -> {:ok, o} end] + [pipeline_filter: fn o, m -> {:ok, o, m} end] }, { Pleroma.Web.ActivityPub.ActivityPub, @@ -93,7 +93,7 @@ test "it goes through validation, filtering, persisting, side effects and federa Pleroma.Web.ActivityPub.Pipeline.common_pipeline(activity, meta) assert_called(Pleroma.Web.ActivityPub.ObjectValidator.validate(activity, meta)) - assert_called(Pleroma.Web.ActivityPub.MRF.filter(activity)) + assert_called(Pleroma.Web.ActivityPub.MRF.pipeline_filter(activity, meta)) assert_called(Pleroma.Web.ActivityPub.ActivityPub.persist(activity, meta)) assert_called(Pleroma.Web.ActivityPub.SideEffects.handle(activity, meta)) assert_called(Pleroma.Web.Federator.publish(activity)) @@ -109,7 +109,7 @@ test "it goes through validation, filtering, persisting, side effects without fe { Pleroma.Web.ActivityPub.MRF, [], - [filter: fn o -> {:ok, o} end] + [pipeline_filter: fn o, m -> {:ok, o, m} end] }, { Pleroma.Web.ActivityPub.ActivityPub, @@ -131,7 +131,7 @@ test "it goes through validation, filtering, persisting, side effects without fe Pleroma.Web.ActivityPub.Pipeline.common_pipeline(activity, meta) assert_called(Pleroma.Web.ActivityPub.ObjectValidator.validate(activity, meta)) - assert_called(Pleroma.Web.ActivityPub.MRF.filter(activity)) + assert_called(Pleroma.Web.ActivityPub.MRF.pipeline_filter(activity, meta)) assert_called(Pleroma.Web.ActivityPub.ActivityPub.persist(activity, meta)) assert_called(Pleroma.Web.ActivityPub.SideEffects.handle(activity, meta)) end @@ -148,7 +148,7 @@ test "it goes through validation, filtering, persisting, side effects without fe { Pleroma.Web.ActivityPub.MRF, [], - [filter: fn o -> {:ok, o} end] + [pipeline_filter: fn o, m -> {:ok, o, m} end] }, { Pleroma.Web.ActivityPub.ActivityPub, @@ -170,7 +170,7 @@ test "it goes through validation, filtering, persisting, side effects without fe Pleroma.Web.ActivityPub.Pipeline.common_pipeline(activity, meta) assert_called(Pleroma.Web.ActivityPub.ObjectValidator.validate(activity, meta)) - assert_called(Pleroma.Web.ActivityPub.MRF.filter(activity)) + assert_called(Pleroma.Web.ActivityPub.MRF.pipeline_filter(activity, meta)) assert_called(Pleroma.Web.ActivityPub.ActivityPub.persist(activity, meta)) assert_called(Pleroma.Web.ActivityPub.SideEffects.handle(activity, meta)) end diff --git a/test/web/activity_pub/relay_test.exs 
b/test/web/activity_pub/relay_test.exs index 9d657ac4f..3284980f7 100644 --- a/test/web/activity_pub/relay_test.exs +++ b/test/web/activity_pub/relay_test.exs @@ -63,6 +63,46 @@ test "returns activity" do assert activity.data["to"] == [user.ap_id] refute "#{user.ap_id}/followers" in User.following(service_actor) end + + test "force unfollow when target service is dead" do + user = insert(:user) + user_ap_id = user.ap_id + user_id = user.id + + Tesla.Mock.mock(fn %{method: :get, url: ^user_ap_id} -> + %Tesla.Env{status: 404} + end) + + service_actor = Relay.get_actor() + CommonAPI.follow(service_actor, user) + assert "#{user.ap_id}/followers" in User.following(service_actor) + + assert Pleroma.Repo.get_by( + Pleroma.FollowingRelationship, + follower_id: service_actor.id, + following_id: user_id + ) + + Pleroma.Repo.delete(user) + Cachex.clear(:user_cache) + + assert {:ok, %Activity{} = activity} = Relay.unfollow(user_ap_id, %{force: true}) + + assert refresh_record(service_actor).following_count == 0 + + refute Pleroma.Repo.get_by( + Pleroma.FollowingRelationship, + follower_id: service_actor.id, + following_id: user_id + ) + + assert activity.actor == "#{Pleroma.Web.Endpoint.url()}/relay" + assert user.ap_id in activity.recipients + assert activity.data["type"] == "Undo" + assert activity.data["actor"] == service_actor.ap_id + assert activity.data["to"] == [user_ap_id] + refute "#{user.ap_id}/followers" in User.following(service_actor) + end end describe "publish/1" do diff --git a/test/web/activity_pub/transmogrifier/article_handling_test.exs b/test/web/activity_pub/transmogrifier/article_handling_test.exs new file mode 100644 index 000000000..9b12a470a --- /dev/null +++ b/test/web/activity_pub/transmogrifier/article_handling_test.exs @@ -0,0 +1,75 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ActivityPub.Transmogrifier.ArticleHandlingTest do + use Oban.Testing, repo: Pleroma.Repo + use Pleroma.DataCase + + alias Pleroma.Activity + alias Pleroma.Object + alias Pleroma.Object.Fetcher + alias Pleroma.Web.ActivityPub.Transmogrifier + + test "Pterotype (Wordpress Plugin) Article" do + Tesla.Mock.mock(fn %{url: "https://wedistribute.org/wp-json/pterotype/v1/actor/-blog"} -> + %Tesla.Env{status: 200, body: File.read!("test/fixtures/tesla_mock/wedistribute-user.json")} + end) + + data = + File.read!("test/fixtures/tesla_mock/wedistribute-create-article.json") |> Jason.decode!() + + {:ok, %Activity{data: data, local: false}} = Transmogrifier.handle_incoming(data) + + object = Object.normalize(data["object"]) + + assert object.data["name"] == "The end is near: Mastodon plans to drop OStatus support" + + assert object.data["summary"] == + "One of the largest platforms in the federated social web is dropping the protocol that it started with." 
+ + assert object.data["url"] == "https://wedistribute.org/2019/07/mastodon-drops-ostatus/" + end + + test "Plume Article" do + Tesla.Mock.mock(fn + %{url: "https://baptiste.gelez.xyz/~/PlumeDevelopment/this-month-in-plume-june-2018/"} -> + %Tesla.Env{ + status: 200, + body: File.read!("test/fixtures/tesla_mock/baptiste.gelex.xyz-article.json") + } + + %{url: "https://baptiste.gelez.xyz/@/BaptisteGelez"} -> + %Tesla.Env{ + status: 200, + body: File.read!("test/fixtures/tesla_mock/baptiste.gelex.xyz-user.json") + } + end) + + {:ok, object} = + Fetcher.fetch_object_from_id( + "https://baptiste.gelez.xyz/~/PlumeDevelopment/this-month-in-plume-june-2018/" + ) + + assert object.data["name"] == "This Month in Plume: June 2018" + + assert object.data["url"] == + "https://baptiste.gelez.xyz/~/PlumeDevelopment/this-month-in-plume-june-2018/" + end + + test "Prismo Article" do + Tesla.Mock.mock(fn %{url: "https://prismo.news/@mxb"} -> + %Tesla.Env{ + status: 200, + body: File.read!("test/fixtures/tesla_mock/https___prismo.news__mxb.json") + } + end) + + data = File.read!("test/fixtures/prismo-url-map.json") |> Jason.decode!() + + {:ok, %Activity{data: data, local: false}} = Transmogrifier.handle_incoming(data) + object = Object.normalize(data["object"]) + + assert object.data["url"] == "https://prismo.news/posts/83" + end +end diff --git a/test/web/activity_pub/transmogrifier/question_handling_test.exs b/test/web/activity_pub/transmogrifier/question_handling_test.exs index 74ee79543..d2822ce75 100644 --- a/test/web/activity_pub/transmogrifier/question_handling_test.exs +++ b/test/web/activity_pub/transmogrifier/question_handling_test.exs @@ -157,12 +157,12 @@ test "Mastodon Question activity with custom emojis" do } end - test "returns an error if received a second time" do + test "returns same activity if received a second time" do data = File.read!("test/fixtures/mastodon-question-activity.json") |> Poison.decode!() assert {:ok, %Activity{local: false} = activity} = Transmogrifier.handle_incoming(data) - assert {:error, {:validate_object, {:error, _}}} = Transmogrifier.handle_incoming(data) + assert {:ok, ^activity} = Transmogrifier.handle_incoming(data) end test "accepts a Question with no content" do diff --git a/test/web/activity_pub/transmogrifier/video_handling_test.exs b/test/web/activity_pub/transmogrifier/video_handling_test.exs new file mode 100644 index 000000000..69c953a2e --- /dev/null +++ b/test/web/activity_pub/transmogrifier/video_handling_test.exs @@ -0,0 +1,93 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.ActivityPub.Transmogrifier.VideoHandlingTest do + use Oban.Testing, repo: Pleroma.Repo + use Pleroma.DataCase + + alias Pleroma.Activity + alias Pleroma.Object + alias Pleroma.Object.Fetcher + alias Pleroma.Web.ActivityPub.Transmogrifier + + setup_all do + Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end) + :ok + end + + test "skip converting the content when it is nil" do + data = + File.read!("test/fixtures/tesla_mock/framatube.org-video.json") + |> Jason.decode!() + |> Kernel.put_in(["object", "content"], nil) + + {:ok, %Activity{local: false} = activity} = Transmogrifier.handle_incoming(data) + + assert object = Object.normalize(activity, false) + + assert object.data["content"] == nil + end + + test "it converts content of object to html" do + data = File.read!("test/fixtures/tesla_mock/framatube.org-video.json") |> Jason.decode!() + 
+ {:ok, %Activity{local: false} = activity} = Transmogrifier.handle_incoming(data) + + assert object = Object.normalize(activity, false) + + assert object.data["content"] == + "
Après avoir mené avec un certain succès la campagne « Dégooglisons Internet » en 2014, l’association Framasoft annonce fin 2019 arrêter progressivement un certain nombre de ses services alternatifs aux GAFAM. Pourquoi ?
Transcription par @aprilorg ici : https://www.april.org/deframasoftisons-internet-pierre-yves-gosset-framasoft
" + end + + test "it remaps video URLs as attachments if necessary" do + {:ok, object} = + Fetcher.fetch_object_from_id( + "https://peertube.moe/videos/watch/df5f464b-be8d-46fb-ad81-2d4c2d1630e3" + ) + + assert object.data["url"] == + "https://peertube.moe/videos/watch/df5f464b-be8d-46fb-ad81-2d4c2d1630e3" + + assert object.data["attachment"] == [ + %{ + "type" => "Link", + "mediaType" => "video/mp4", + "name" => nil, + "url" => [ + %{ + "href" => + "https://peertube.moe/static/webseed/df5f464b-be8d-46fb-ad81-2d4c2d1630e3-480.mp4", + "mediaType" => "video/mp4", + "type" => "Link" + } + ] + } + ] + + data = File.read!("test/fixtures/tesla_mock/framatube.org-video.json") |> Jason.decode!() + + {:ok, %Activity{local: false} = activity} = Transmogrifier.handle_incoming(data) + + assert object = Object.normalize(activity, false) + + assert object.data["attachment"] == [ + %{ + "type" => "Link", + "mediaType" => "video/mp4", + "name" => nil, + "url" => [ + %{ + "href" => + "https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4", + "mediaType" => "video/mp4", + "type" => "Link" + } + ] + } + ] + + assert object.data["url"] == + "https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206" + end +end diff --git a/test/web/activity_pub/transmogrifier_test.exs b/test/web/activity_pub/transmogrifier_test.exs index cc55a7be7..561674f01 100644 --- a/test/web/activity_pub/transmogrifier_test.exs +++ b/test/web/activity_pub/transmogrifier_test.exs @@ -8,7 +8,6 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do alias Pleroma.Activity alias Pleroma.Object - alias Pleroma.Object.Fetcher alias Pleroma.Tests.ObanHelpers alias Pleroma.User alias Pleroma.Web.ActivityPub.Transmogrifier @@ -45,15 +44,6 @@ test "it works for incoming notices with tag not being an array (kroeg)" do assert "test" in object.data["tag"] end - test "it works for incoming notices with url not being a string (prismo)" do - data = File.read!("test/fixtures/prismo-url-map.json") |> Poison.decode!() - - {:ok, %Activity{data: data, local: false}} = Transmogrifier.handle_incoming(data) - object = Object.normalize(data["object"]) - - assert object.data["url"] == "https://prismo.news/posts/83" - end - test "it cleans up incoming notices which are not really DMs" do user = insert(:user) other_user = insert(:user) @@ -355,83 +345,6 @@ test "it works for incoming unfollows with an existing follow" do refute User.following?(User.get_cached_by_ap_id(data["actor"]), user) end - test "skip converting the content when it is nil" do - object_id = "https://peertube.social/videos/watch/278d2b7c-0f38-4aaa-afe6-9ecc0c4a34fe" - - {:ok, object} = Fetcher.fetch_and_contain_remote_object_from_id(object_id) - - result = - Pleroma.Web.ActivityPub.Transmogrifier.fix_object(Map.merge(object, %{"content" => nil})) - - assert result["content"] == nil - end - - test "it converts content of object to html" do - object_id = "https://peertube.social/videos/watch/278d2b7c-0f38-4aaa-afe6-9ecc0c4a34fe" - - {:ok, %{"content" => content_markdown}} = - Fetcher.fetch_and_contain_remote_object_from_id(object_id) - - {:ok, %Pleroma.Object{data: %{"content" => content}} = object} = - Fetcher.fetch_object_from_id(object_id) - - assert content_markdown == - "Support this and our other Michigan!/usr/group videos and meetings. Learn more at http://mug.org/membership\n\nTwenty Years in Jail: FreeBSD's Jails, Then and Now\n\nJails started as a limited virtualization system, but over the last two years they've..." - - assert content == - "
Support this and our other Michigan!/usr/group videos and meetings. Learn more at http://mug.org/membership
Twenty Years in Jail: FreeBSD’s Jails, Then and Now
Jails started as a limited virtualization system, but over the last two years they’ve…
" - - assert object.data["mediaType"] == "text/html" - end - - test "it remaps video URLs as attachments if necessary" do - {:ok, object} = - Fetcher.fetch_object_from_id( - "https://peertube.moe/videos/watch/df5f464b-be8d-46fb-ad81-2d4c2d1630e3" - ) - - assert object.data["url"] == - "https://peertube.moe/videos/watch/df5f464b-be8d-46fb-ad81-2d4c2d1630e3" - - assert object.data["attachment"] == [ - %{ - "type" => "Link", - "mediaType" => "video/mp4", - "url" => [ - %{ - "href" => - "https://peertube.moe/static/webseed/df5f464b-be8d-46fb-ad81-2d4c2d1630e3-480.mp4", - "mediaType" => "video/mp4", - "type" => "Link" - } - ] - } - ] - - {:ok, object} = - Fetcher.fetch_object_from_id( - "https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206" - ) - - assert object.data["attachment"] == [ - %{ - "type" => "Link", - "mediaType" => "video/mp4", - "url" => [ - %{ - "href" => - "https://framatube.org/static/webseed/6050732a-8a7a-43d4-a6cd-809525a1d206-1080.mp4", - "mediaType" => "video/mp4", - "type" => "Link" - } - ] - } - ] - - assert object.data["url"] == - "https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206" - end - test "it accepts Flag activities" do user = insert(:user) other_user = insert(:user) @@ -1133,75 +1046,7 @@ test "fixes data for object when url is map" do } end - test "fixes data for video object" do - object = %{ - "type" => "Video", - "url" => [ - %{ - "type" => "Link", - "mimeType" => "video/mp4", - "href" => "https://peede8d-46fb-ad81-2d4c2d1630e3-480.mp4" - }, - %{ - "type" => "Link", - "mimeType" => "video/mp4", - "href" => "https://peertube46fb-ad81-2d4c2d1630e3-240.mp4" - }, - %{ - "type" => "Link", - "mimeType" => "text/html", - "href" => "https://peertube.-2d4c2d1630e3" - }, - %{ - "type" => "Link", - "mimeType" => "text/html", - "href" => "https://peertube.-2d4c2d16377-42" - } - ] - } - - assert Transmogrifier.fix_url(object) == %{ - "attachment" => [ - %{ - "href" => "https://peede8d-46fb-ad81-2d4c2d1630e3-480.mp4", - "mimeType" => "video/mp4", - "type" => "Link" - } - ], - "type" => "Video", - "url" => "https://peertube.-2d4c2d1630e3" - } - end - - test "fixes url for not Video object" do - object = %{ - "type" => "Text", - "url" => [ - %{ - "type" => "Link", - "mimeType" => "text/html", - "href" => "https://peertube.-2d4c2d1630e3" - }, - %{ - "type" => "Link", - "mimeType" => "text/html", - "href" => "https://peertube.-2d4c2d16377-42" - } - ] - } - - assert Transmogrifier.fix_url(object) == %{ - "type" => "Text", - "url" => "https://peertube.-2d4c2d1630e3" - } - - assert Transmogrifier.fix_url(%{"type" => "Text", "url" => []}) == %{ - "type" => "Text", - "url" => "" - } - end - - test "retunrs not modified object" do + test "returns non-modified object" do assert Transmogrifier.fix_url(%{"type" => "Text"}) == %{"type" => "Text"} end end diff --git a/test/web/admin_api/controllers/admin_api_controller_test.exs b/test/web/admin_api/controllers/admin_api_controller_test.exs index 3bc88c6a9..cba6b43d3 100644 --- a/test/web/admin_api/controllers/admin_api_controller_test.exs +++ b/test/web/admin_api/controllers/admin_api_controller_test.exs @@ -1510,6 +1510,56 @@ test "excludes reblogs by default", %{conn: conn, user: user} do end end + describe "GET /api/pleroma/admin/users/:nickname/chats" do + setup do + user = insert(:user) + recipients = insert_list(3, :user) + + Enum.each(recipients, fn recipient -> + CommonAPI.post_chat_message(user, recipient, "yo") + end) + + %{user: user} + end + + test "renders user's chats", %{conn: conn, 
user: user} do + conn = get(conn, "/api/pleroma/admin/users/#{user.nickname}/chats") + + assert json_response(conn, 200) |> length() == 3 + end + end + + describe "GET /api/pleroma/admin/users/:nickname/chats unauthorized" do + setup do + user = insert(:user) + recipient = insert(:user) + CommonAPI.post_chat_message(user, recipient, "yo") + %{conn: conn} = oauth_access(["read:chats"]) + %{conn: conn, user: user} + end + + test "returns 403", %{conn: conn, user: user} do + conn + |> get("/api/pleroma/admin/users/#{user.nickname}/chats") + |> json_response(403) + end + end + + describe "GET /api/pleroma/admin/users/:nickname/chats unauthenticated" do + setup do + user = insert(:user) + recipient = insert(:user) + CommonAPI.post_chat_message(user, recipient, "yo") + %{conn: build_conn(), user: user} + end + + test "returns 403", %{conn: conn, user: user} do + conn + |> get("/api/pleroma/admin/users/#{user.nickname}/chats") + |> json_response(403) + end + end + describe "GET /api/pleroma/admin/moderation_log" do setup do moderator = insert(:user, is_moderator: true) @@ -1927,7 +1977,12 @@ test "it resend emails for two users", %{conn: conn, admin: admin} do }" ObanHelpers.perform_all() - assert_email_sent(Pleroma.Emails.UserEmail.account_confirmation_email(first_user)) + + Pleroma.Emails.UserEmail.account_confirmation_email(first_user) + # temporary hackney fix until hackney max_connections bug is fixed + # https://git.pleroma.social/pleroma/pleroma/-/issues/2101 + |> Swoosh.Email.put_private(:hackney_options, ssl_options: [versions: [:"tlsv1.2"]]) + |> assert_email_sent() end end diff --git a/test/web/admin_api/controllers/chat_controller_test.exs b/test/web/admin_api/controllers/chat_controller_test.exs new file mode 100644 index 000000000..bd4c9c9d1 --- /dev/null +++ b/test/web/admin_api/controllers/chat_controller_test.exs @@ -0,0 +1,219 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.AdminAPI.ChatControllerTest do + use Pleroma.Web.ConnCase + + import Pleroma.Factory + + alias Pleroma.Chat + alias Pleroma.Chat.MessageReference + alias Pleroma.Config + alias Pleroma.ModerationLog + alias Pleroma.Object + alias Pleroma.Repo + alias Pleroma.Web.CommonAPI + + defp admin_setup do + admin = insert(:user, is_admin: true) + token = insert(:oauth_admin_token, user: admin) + + conn = + build_conn() + |> assign(:user, admin) + |> assign(:token, token) + + {:ok, %{admin: admin, token: token, conn: conn}} + end + + describe "DELETE /api/pleroma/admin/chats/:id/messages/:message_id" do + setup do: admin_setup() + + test "it deletes a message from the chat", %{conn: conn, admin: admin} do + user = insert(:user) + recipient = insert(:user) + + {:ok, message} = + CommonAPI.post_chat_message(user, recipient, "Hello darkness my old friend") + + object = Object.normalize(message, false) + + chat = Chat.get(user.id, recipient.ap_id) + recipient_chat = Chat.get(recipient.id, user.ap_id) + + cm_ref = MessageReference.for_chat_and_object(chat, object) + recipient_cm_ref = MessageReference.for_chat_and_object(recipient_chat, object) + + result = + conn + |> put_req_header("content-type", "application/json") + |> delete("/api/pleroma/admin/chats/#{chat.id}/messages/#{cm_ref.id}") + |> json_response_and_validate_schema(200) + + log_entry = Repo.one(ModerationLog) + + assert ModerationLog.get_log_entry_message(log_entry) == + "@#{admin.nickname} deleted chat message ##{cm_ref.id}" + + assert result["id"] 
== cm_ref.id + refute MessageReference.get_by_id(cm_ref.id) + refute MessageReference.get_by_id(recipient_cm_ref.id) + assert %{data: %{"type" => "Tombstone"}} = Object.get_by_id(object.id) + end + end + + describe "GET /api/pleroma/admin/chats/:id/messages" do + setup do: admin_setup() + + test "it paginates", %{conn: conn} do + user = insert(:user) + recipient = insert(:user) + + Enum.each(1..30, fn _ -> + {:ok, _} = CommonAPI.post_chat_message(user, recipient, "hey") + end) + + chat = Chat.get(user.id, recipient.ap_id) + + result = + conn + |> get("/api/pleroma/admin/chats/#{chat.id}/messages") + |> json_response_and_validate_schema(200) + + assert length(result) == 20 + + result = + conn + |> get("/api/pleroma/admin/chats/#{chat.id}/messages?max_id=#{List.last(result)["id"]}") + |> json_response_and_validate_schema(200) + + assert length(result) == 10 + end + + test "it returns the messages for a given chat", %{conn: conn} do + user = insert(:user) + other_user = insert(:user) + third_user = insert(:user) + + {:ok, _} = CommonAPI.post_chat_message(user, other_user, "hey") + {:ok, _} = CommonAPI.post_chat_message(user, third_user, "hey") + {:ok, _} = CommonAPI.post_chat_message(user, other_user, "how are you?") + {:ok, _} = CommonAPI.post_chat_message(other_user, user, "fine, how about you?") + + chat = Chat.get(user.id, other_user.ap_id) + + result = + conn + |> get("/api/pleroma/admin/chats/#{chat.id}/messages") + |> json_response_and_validate_schema(200) + + result + |> Enum.each(fn message -> + assert message["chat_id"] == chat.id |> to_string() + end) + + assert length(result) == 3 + end + end + + describe "GET /api/pleroma/admin/chats/:id" do + setup do: admin_setup() + + test "it returns a chat", %{conn: conn} do + user = insert(:user) + other_user = insert(:user) + + {:ok, chat} = Chat.get_or_create(user.id, other_user.ap_id) + + result = + conn + |> get("/api/pleroma/admin/chats/#{chat.id}") + |> json_response_and_validate_schema(200) + + assert result["id"] == to_string(chat.id) + assert %{} = result["sender"] + assert %{} = result["receiver"] + refute result["account"] + end + end + + describe "unauthorized chat moderation" do + setup do + user = insert(:user) + recipient = insert(:user) + + {:ok, message} = CommonAPI.post_chat_message(user, recipient, "Yo") + object = Object.normalize(message, false) + chat = Chat.get(user.id, recipient.ap_id) + cm_ref = MessageReference.for_chat_and_object(chat, object) + + %{conn: conn} = oauth_access(["read:chats", "write:chats"]) + %{conn: conn, chat: chat, cm_ref: cm_ref} + end + + test "DELETE /api/pleroma/admin/chats/:id/messages/:message_id", %{ + conn: conn, + chat: chat, + cm_ref: cm_ref + } do + conn + |> put_req_header("content-type", "application/json") + |> delete("/api/pleroma/admin/chats/#{chat.id}/messages/#{cm_ref.id}") + |> json_response(403) + + assert MessageReference.get_by_id(cm_ref.id) == cm_ref + end + + test "GET /api/pleroma/admin/chats/:id/messages", %{conn: conn, chat: chat} do + conn + |> get("/api/pleroma/admin/chats/#{chat.id}/messages") + |> json_response(403) + end + + test "GET /api/pleroma/admin/chats/:id", %{conn: conn, chat: chat} do + conn + |> get("/api/pleroma/admin/chats/#{chat.id}") + |> json_response(403) + end + end + + describe "unauthenticated chat moderation" do + setup do + user = insert(:user) + recipient = insert(:user) + + {:ok, message} = CommonAPI.post_chat_message(user, recipient, "Yo") + object = Object.normalize(message, false) + chat = Chat.get(user.id, recipient.ap_id) + cm_ref = 
MessageReference.for_chat_and_object(chat, object) + + %{conn: build_conn(), chat: chat, cm_ref: cm_ref} + end + + test "DELETE /api/pleroma/admin/chats/:id/messages/:message_id", %{ + conn: conn, + chat: chat, + cm_ref: cm_ref + } do + conn + |> put_req_header("content-type", "application/json") + |> delete("/api/pleroma/admin/chats/#{chat.id}/messages/#{cm_ref.id}") + |> json_response(403) + + assert MessageReference.get_by_id(cm_ref.id) == cm_ref + end + + test "GET /api/pleroma/admin/chats/:id/messages", %{conn: conn, chat: chat} do + conn + |> get("/api/pleroma/admin/chats/#{chat.id}/messages") + |> json_response(403) + end + + test "GET /api/pleroma/admin/chats/:id", %{conn: conn, chat: chat} do + conn + |> get("/api/pleroma/admin/chats/#{chat.id}") + |> json_response(403) + end + end +end diff --git a/test/web/admin_api/controllers/instance_document_controller_test.exs b/test/web/admin_api/controllers/instance_document_controller_test.exs new file mode 100644 index 000000000..5f7b042f6 --- /dev/null +++ b/test/web/admin_api/controllers/instance_document_controller_test.exs @@ -0,0 +1,106 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.AdminAPI.InstanceDocumentControllerTest do + use Pleroma.Web.ConnCase, async: true + import Pleroma.Factory + alias Pleroma.Config + + @dir "test/tmp/instance_static" + @default_instance_panel ~s(
Welcome to Pleroma!
) + + setup do + File.mkdir_p!(@dir) + on_exit(fn -> File.rm_rf(@dir) end) + end + + setup do: clear_config([:instance, :static_dir], @dir) + + setup do + admin = insert(:user, is_admin: true) + token = insert(:oauth_admin_token, user: admin) + + conn = + build_conn() + |> assign(:user, admin) + |> assign(:token, token) + + {:ok, %{admin: admin, token: token, conn: conn}} + end + + describe "GET /api/pleroma/admin/instance_document/:name" do + test "return the instance document url", %{conn: conn} do + conn = get(conn, "/api/pleroma/admin/instance_document/instance-panel") + + assert content = html_response(conn, 200) + assert String.contains?(content, @default_instance_panel) + end + + test "it returns 403 if requested by a non-admin" do + non_admin_user = insert(:user) + token = insert(:oauth_token, user: non_admin_user) + + conn = + build_conn() + |> assign(:user, non_admin_user) + |> assign(:token, token) + |> get("/api/pleroma/admin/instance_document/instance-panel") + + assert json_response(conn, :forbidden) + end + + test "it returns 404 if the instance document with the given name doesn't exist", %{ + conn: conn + } do + conn = get(conn, "/api/pleroma/admin/instance_document/1234") + + assert json_response_and_validate_schema(conn, 404) + end + end + + describe "PATCH /api/pleroma/admin/instance_document/:name" do + test "uploads the instance document", %{conn: conn} do + image = %Plug.Upload{ + content_type: "text/html", + path: Path.absname("test/fixtures/custom_instance_panel.html"), + filename: "custom_instance_panel.html" + } + + conn = + conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/admin/instance_document/instance-panel", %{ + "file" => image + }) + + assert %{"url" => url} = json_response_and_validate_schema(conn, 200) + index = get(build_conn(), url) + assert html_response(index, 200) == "
Custom instance panel
" + end + end + + describe "DELETE /api/pleroma/admin/instance_document/:name" do + test "deletes the instance document", %{conn: conn} do + File.mkdir!(@dir <> "/instance/") + File.write!(@dir <> "/instance/panel.html", "Custom instance panel") + + conn_resp = + conn + |> get("/api/pleroma/admin/instance_document/instance-panel") + + assert html_response(conn_resp, 200) == "Custom instance panel" + + conn + |> delete("/api/pleroma/admin/instance_document/instance-panel") + |> json_response_and_validate_schema(200) + + conn_resp = + conn + |> get("/api/pleroma/admin/instance_document/instance-panel") + + assert content = html_response(conn_resp, 200) + assert String.contains?(content, @default_instance_panel) + end + end +end diff --git a/test/web/admin_api/search_test.exs b/test/web/admin_api/search_test.exs index b974cedd5..d88867c52 100644 --- a/test/web/admin_api/search_test.exs +++ b/test/web/admin_api/search_test.exs @@ -177,5 +177,14 @@ test "it returns unapproved user" do assert total == 3 assert count == 1 end + + test "it returns non-discoverable users" do + insert(:user) + insert(:user, discoverable: false) + + {:ok, _results, total} = Search.user() + + assert total == 2 + end end end diff --git a/test/web/common_api/common_api_test.exs b/test/web/common_api/common_api_test.exs index 5afb0a6dc..e34f5a49b 100644 --- a/test/web/common_api/common_api_test.exs +++ b/test/web/common_api/common_api_test.exs @@ -29,6 +29,23 @@ defmodule Pleroma.Web.CommonAPITest do setup do: clear_config([:instance, :limit]) setup do: clear_config([:instance, :max_pinned_statuses]) + describe "posting polls" do + test "it posts a poll" do + user = insert(:user) + + {:ok, activity} = + CommonAPI.post(user, %{ + status: "who is the best", + poll: %{expires_in: 600, options: ["reimu", "marisa"]} + }) + + object = Object.normalize(activity) + + assert object.data["type"] == "Question" + assert object.data["oneOf"] |> length() == 2 + end + end + describe "blocking" do setup do blocker = insert(:user) @@ -217,6 +234,17 @@ test "it reject messages over the local limit" do assert message == :content_too_long end + + test "it reject messages via MRF" do + clear_config([:mrf_keyword, :reject], ["GNO"]) + clear_config([:mrf, :policies], [Pleroma.Web.ActivityPub.MRF.KeywordPolicy]) + + author = insert(:user) + recipient = insert(:user) + + assert {:reject, "[KeywordPolicy] Matches with rejected keyword"} == + CommonAPI.post_chat_message(author, recipient, "GNO/Linux") + end end describe "unblocking" do @@ -1193,4 +1221,24 @@ test "respects visibility=private" do assert Visibility.get_visibility(activity) == "private" end end + + describe "get_user/1" do + test "gets user by ap_id" do + user = insert(:user) + assert CommonAPI.get_user(user.ap_id) == user + end + + test "gets user by guessed nickname" do + user = insert(:user, ap_id: "", nickname: "mario@mushroom.kingdom") + assert CommonAPI.get_user("https://mushroom.kingdom/users/mario") == user + end + + test "fallback" do + assert %User{ + name: "", + ap_id: "", + nickname: "erroruser@example.com" + } = CommonAPI.get_user("") + end + end end diff --git a/test/web/fed_sockets/fed_registry_test.exs b/test/web/fed_sockets/fed_registry_test.exs new file mode 100644 index 000000000..19ac874d6 --- /dev/null +++ b/test/web/fed_sockets/fed_registry_test.exs @@ -0,0 +1,124 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.FedRegistryTest do + use 
ExUnit.Case + + alias Pleroma.Web.FedSockets + alias Pleroma.Web.FedSockets.FedRegistry + alias Pleroma.Web.FedSockets.SocketInfo + + @good_domain "http://good.domain" + @good_domain_origin "good.domain:80" + + setup do + start_supervised({Pleroma.Web.FedSockets.Supervisor, []}) + build_test_socket(@good_domain) + Process.sleep(10) + + :ok + end + + describe "add_fed_socket/1 without conflicting sockets" do + test "can be added" do + Process.sleep(10) + assert {:ok, %SocketInfo{origin: origin}} = FedRegistry.get_fed_socket(@good_domain_origin) + assert origin == "good.domain:80" + end + + test "multiple origins can be added" do + build_test_socket("http://anothergood.domain") + Process.sleep(10) + + assert {:ok, %SocketInfo{origin: origin_1}} = + FedRegistry.get_fed_socket(@good_domain_origin) + + assert {:ok, %SocketInfo{origin: origin_2}} = + FedRegistry.get_fed_socket("anothergood.domain:80") + + assert origin_1 == "good.domain:80" + assert origin_2 == "anothergood.domain:80" + assert FedRegistry.list_all() |> Enum.count() == 2 + end + end + + describe "add_fed_socket/1 when duplicate sockets conflict" do + setup do + build_test_socket(@good_domain) + build_test_socket(@good_domain) + Process.sleep(10) + :ok + end + + test "will be ignored" do + assert {:ok, %SocketInfo{origin: origin, pid: pid_one}} = + FedRegistry.get_fed_socket(@good_domain_origin) + + assert origin == "good.domain:80" + + assert FedRegistry.list_all() |> Enum.count() == 1 + end + + test "the newer process will be closed" do + pid_two = build_test_socket(@good_domain) + + assert {:ok, %SocketInfo{origin: origin, pid: pid_one}} = + FedRegistry.get_fed_socket(@good_domain_origin) + + assert origin == "good.domain:80" + Process.sleep(10) + + refute Process.alive?(pid_two) + + assert FedRegistry.list_all() |> Enum.count() == 1 + end + end + + describe "get_fed_socket/1" do + test "returns missing for unknown hosts" do + assert {:error, :missing} = FedRegistry.get_fed_socket("not_a_dmoain") + end + + test "returns rejected for hosts previously rejected" do + "rejected.domain:80" + |> FedSockets.uri_for_origin() + |> FedRegistry.set_host_rejected() + + assert {:error, :rejected} = FedRegistry.get_fed_socket("rejected.domain:80") + end + + test "can retrieve a previously added SocketInfo" do + build_test_socket(@good_domain) + Process.sleep(10) + assert {:ok, %SocketInfo{origin: origin}} = FedRegistry.get_fed_socket(@good_domain_origin) + assert origin == "good.domain:80" + end + + test "removes references to SocketInfos when the process crashes" do + assert {:ok, %SocketInfo{origin: origin, pid: pid}} = + FedRegistry.get_fed_socket(@good_domain_origin) + + assert origin == "good.domain:80" + + Process.exit(pid, :testing) + Process.sleep(100) + assert {:error, :missing} = FedRegistry.get_fed_socket(@good_domain_origin) + end + end + + def build_test_socket(uri) do + Kernel.spawn(fn -> fed_socket_almost(uri) end) + end + + def fed_socket_almost(origin) do + FedRegistry.add_fed_socket(origin) + + receive do + :close -> + :ok + after + 5_000 -> :timeout + end + end +end diff --git a/test/web/fed_sockets/fetch_registry_test.exs b/test/web/fed_sockets/fetch_registry_test.exs new file mode 100644 index 000000000..7bd2d995a --- /dev/null +++ b/test/web/fed_sockets/fetch_registry_test.exs @@ -0,0 +1,67 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.FetchRegistryTest do + use ExUnit.Case + + alias 
Pleroma.Web.FedSockets.FetchRegistry + alias Pleroma.Web.FedSockets.FetchRegistry.FetchRegistryData + + @json_message "hello" + @json_reply "hello back" + + setup do + start_supervised( + {Pleroma.Web.FedSockets.Supervisor, + [ + ping_interval: 8, + connection_duration: 15, + rejection_duration: 5, + fed_socket_fetches: [default: 10, interval: 10] + ]} + ) + + :ok + end + + test "fetches can be stored" do + uuid = FetchRegistry.register_fetch(@json_message) + + assert {:error, :waiting} = FetchRegistry.check_fetch(uuid) + end + + test "fetches can return" do + uuid = FetchRegistry.register_fetch(@json_message) + task = Task.async(fn -> FetchRegistry.register_fetch_received(uuid, @json_reply) end) + + assert {:error, :waiting} = FetchRegistry.check_fetch(uuid) + Task.await(task) + + assert {:ok, %FetchRegistryData{received_json: received_json}} = + FetchRegistry.check_fetch(uuid) + + assert received_json == @json_reply + end + + test "fetches are deleted once popped from stack" do + uuid = FetchRegistry.register_fetch(@json_message) + task = Task.async(fn -> FetchRegistry.register_fetch_received(uuid, @json_reply) end) + Task.await(task) + + assert {:ok, %FetchRegistryData{received_json: received_json}} = + FetchRegistry.check_fetch(uuid) + + assert received_json == @json_reply + assert {:ok, @json_reply} = FetchRegistry.pop_fetch(uuid) + + assert {:error, :missing} = FetchRegistry.check_fetch(uuid) + end + + test "fetches can time out" do + uuid = FetchRegistry.register_fetch(@json_message) + assert {:error, :waiting} = FetchRegistry.check_fetch(uuid) + Process.sleep(500) + assert {:error, :missing} = FetchRegistry.check_fetch(uuid) + end +end diff --git a/test/web/fed_sockets/socket_info_test.exs b/test/web/fed_sockets/socket_info_test.exs new file mode 100644 index 000000000..db3d6edcd --- /dev/null +++ b/test/web/fed_sockets/socket_info_test.exs @@ -0,0 +1,118 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.FedSockets.SocketInfoTest do + use ExUnit.Case + + alias Pleroma.Web.FedSockets + alias Pleroma.Web.FedSockets.SocketInfo + + describe "uri_for_origin" do + test "provides the fed_socket URL given the origin information" do + endpoint = "example.com:4000" + assert FedSockets.uri_for_origin(endpoint) =~ "ws://" + assert FedSockets.uri_for_origin(endpoint) =~ endpoint + end + end + + describe "origin" do + test "will provide the origin field given a url" do + endpoint = "example.com:4000" + assert SocketInfo.origin("ws://#{endpoint}") == endpoint + assert SocketInfo.origin("http://#{endpoint}") == endpoint + assert SocketInfo.origin("https://#{endpoint}") == endpoint + end + + test "will proide the origin field given a uri" do + endpoint = "example.com:4000" + uri = URI.parse("http://#{endpoint}") + + assert SocketInfo.origin(uri) == endpoint + end + end + + describe "touch" do + test "will update the TTL" do + endpoint = "example.com:4000" + socket = SocketInfo.build("ws://#{endpoint}") + Process.sleep(2) + touched_socket = SocketInfo.touch(socket) + + assert socket.connected_until < touched_socket.connected_until + end + end + + describe "expired?" 
do + setup do + start_supervised( + {Pleroma.Web.FedSockets.Supervisor, + [ + ping_interval: 8, + connection_duration: 5, + rejection_duration: 5, + fed_socket_rejections: [lazy: true] + ]} + ) + + :ok + end + + test "tests if the TTL is exceeded" do + endpoint = "example.com:4000" + socket = SocketInfo.build("ws://#{endpoint}") + refute SocketInfo.expired?(socket) + Process.sleep(10) + + assert SocketInfo.expired?(socket) + end + end + + describe "creating outgoing connection records" do + test "can be passed a string" do + assert %{conn_pid: :pid, origin: _origin} = SocketInfo.build("example.com:4000", :pid) + end + + test "can be passed a URI" do + uri = URI.parse("http://example.com:4000") + assert %{conn_pid: :pid, origin: origin} = SocketInfo.build(uri, :pid) + assert origin =~ "example.com:4000" + end + + test "will include the port number" do + assert %{conn_pid: :pid, origin: origin} = SocketInfo.build("http://example.com:4000", :pid) + + assert origin =~ ":4000" + end + + test "will provide the port if missing" do + assert %{conn_pid: :pid, origin: "example.com:80"} = + SocketInfo.build("http://example.com", :pid) + + assert %{conn_pid: :pid, origin: "example.com:443"} = + SocketInfo.build("https://example.com", :pid) + end + end + + describe "creating incoming connection records" do + test "can be passed a string" do + assert %{pid: _, origin: _origin} = SocketInfo.build("example.com:4000") + end + + test "can be passed a URI" do + uri = URI.parse("example.com:4000") + assert %{pid: _, origin: _origin} = SocketInfo.build(uri) + end + + test "will include the port number" do + assert %{pid: _, origin: origin} = SocketInfo.build("http://example.com:4000") + + assert origin =~ ":4000" + end + + test "will provide the port if missing" do + assert %{pid: _, origin: "example.com:80"} = SocketInfo.build("http://example.com") + assert %{pid: _, origin: "example.com:443"} = SocketInfo.build("https://example.com") + end + end +end diff --git a/test/web/instances/instance_test.exs b/test/web/instances/instance_test.exs index dc6ace843..4f0805100 100644 --- a/test/web/instances/instance_test.exs +++ b/test/web/instances/instance_test.exs @@ -99,35 +99,54 @@ test "does NOT modify `unreachable_since` value of existing record in case it's end end - test "Scrapes favicon URLs" do - Tesla.Mock.mock(fn %{url: "https://favicon.example.org/"} -> - %Tesla.Env{ - status: 200, - body: ~s[] - } - end) + describe "get_or_update_favicon/1" do + test "Scrapes favicon URLs" do + Tesla.Mock.mock(fn %{url: "https://favicon.example.org/"} -> + %Tesla.Env{ + status: 200, + body: ~s[] + } + end) - assert "https://favicon.example.org/favicon.png" == - Instance.get_or_update_favicon(URI.parse("https://favicon.example.org/")) - end + assert "https://favicon.example.org/favicon.png" == + Instance.get_or_update_favicon(URI.parse("https://favicon.example.org/")) + end - test "Returns nil on too long favicon URLs" do - long_favicon_url = - "https://Lorem.ipsum.dolor.sit.amet/consecteturadipiscingelit/Praesentpharetrapurusutaliquamtempus/Mauriseulaoreetarcu/atfacilisisorci/Nullamporttitor/nequesedfeugiatmollis/dolormagnaefficiturlorem/nonpretiumsapienorcieurisus/Nullamveleratsem/Maecenassedaccumsanexnam/favicon.png" + test "Returns nil on too long favicon URLs" do + long_favicon_url = + 
"https://Lorem.ipsum.dolor.sit.amet/consecteturadipiscingelit/Praesentpharetrapurusutaliquamtempus/Mauriseulaoreetarcu/atfacilisisorci/Nullamporttitor/nequesedfeugiatmollis/dolormagnaefficiturlorem/nonpretiumsapienorcieurisus/Nullamveleratsem/Maecenassedaccumsanexnam/favicon.png" - Tesla.Mock.mock(fn %{url: "https://long-favicon.example.org/"} -> - %Tesla.Env{ - status: 200, - body: ~s[] - } - end) + Tesla.Mock.mock(fn %{url: "https://long-favicon.example.org/"} -> + %Tesla.Env{ + status: 200, + body: + ~s[] + } + end) - assert capture_log(fn -> - assert nil == - Instance.get_or_update_favicon( - URI.parse("https://long-favicon.example.org/") - ) - end) =~ - "Instance.get_or_update_favicon(\"long-favicon.example.org\") error: %Postgrex.Error{" + assert capture_log(fn -> + assert nil == + Instance.get_or_update_favicon( + URI.parse("https://long-favicon.example.org/") + ) + end) =~ + "Instance.get_or_update_favicon(\"long-favicon.example.org\") error: %Postgrex.Error{" + end + + test "Handles not getting a favicon URL properly" do + Tesla.Mock.mock(fn %{url: "https://no-favicon.example.org/"} -> + %Tesla.Env{ + status: 200, + body: ~s[
I wil look down and whisper "GNO.."
] + } + end) + + refute capture_log(fn -> + assert nil == + Instance.get_or_update_favicon( + URI.parse("https://no-favicon.example.org/") + ) + end) =~ "Instance.scrape_favicon(\"https://no-favicon.example.org/\") error: " + end end end diff --git a/test/web/mastodon_api/controllers/account_controller_test.exs b/test/web/mastodon_api/controllers/account_controller_test.exs index 17a1e7d66..f7f1369e4 100644 --- a/test/web/mastodon_api/controllers/account_controller_test.exs +++ b/test/web/mastodon_api/controllers/account_controller_test.exs @@ -1442,7 +1442,10 @@ test "returns lists to which the account belongs" do describe "verify_credentials" do test "verify_credentials" do %{user: user, conn: conn} = oauth_access(["read:accounts"]) - [notification | _] = insert_list(7, :notification, user: user) + + [notification | _] = + insert_list(7, :notification, user: user, activity: insert(:note_activity)) + Pleroma.Notification.set_read_up_to(user, notification.id) conn = get(conn, "/api/v1/accounts/verify_credentials") diff --git a/test/web/mastodon_api/controllers/auth_controller_test.exs b/test/web/mastodon_api/controllers/auth_controller_test.exs index 4fa95fce1..bf2438fe2 100644 --- a/test/web/mastodon_api/controllers/auth_controller_test.exs +++ b/test/web/mastodon_api/controllers/auth_controller_test.exs @@ -61,7 +61,7 @@ test "redirects to the getting-started page when referer is not present", %{conn end test "it returns 204", %{conn: conn} do - assert json_response(conn, :no_content) + assert empty_json_response(conn) end test "it creates a PasswordResetToken record for user", %{user: user} do @@ -91,7 +91,7 @@ test "it returns 204", %{conn: conn} do assert conn |> post("/auth/password?nickname=#{user.nickname}") - |> json_response(:no_content) + |> empty_json_response() ObanHelpers.perform_all() token_record = Repo.get_by(Pleroma.PasswordResetToken, user_id: user.id) @@ -112,7 +112,7 @@ test "it doesn't fail when a user has no email", %{conn: conn} do assert conn |> post("/auth/password?nickname=#{user.nickname}") - |> json_response(:no_content) + |> empty_json_response() end end @@ -125,24 +125,21 @@ test "it doesn't fail when a user has no email", %{conn: conn} do test "it returns 204 when user is not found", %{conn: conn, user: user} do conn = post(conn, "/auth/password?email=nonexisting_#{user.email}") - assert conn - |> json_response(:no_content) + assert empty_json_response(conn) end test "it returns 204 when user is not local", %{conn: conn, user: user} do {:ok, user} = Repo.update(Ecto.Changeset.change(user, local: false)) conn = post(conn, "/auth/password?email=#{user.email}") - assert conn - |> json_response(:no_content) + assert empty_json_response(conn) end test "it returns 204 when user is deactivated", %{conn: conn, user: user} do {:ok, user} = Repo.update(Ecto.Changeset.change(user, deactivated: true, local: true)) conn = post(conn, "/auth/password?email=#{user.email}") - assert conn - |> json_response(:no_content) + assert empty_json_response(conn) end end diff --git a/test/web/mastodon_api/controllers/marker_controller_test.exs b/test/web/mastodon_api/controllers/marker_controller_test.exs index 6dd40fb4a..9f0481120 100644 --- a/test/web/mastodon_api/controllers/marker_controller_test.exs +++ b/test/web/mastodon_api/controllers/marker_controller_test.exs @@ -11,7 +11,7 @@ defmodule Pleroma.Web.MastodonAPI.MarkerControllerTest do test "gets markers with correct scopes", %{conn: conn} do user = insert(:user) token = insert(:oauth_token, user: user, scopes: 
["read:statuses"]) - insert_list(7, :notification, user: user) + insert_list(7, :notification, user: user, activity: insert(:note_activity)) {:ok, %{"notifications" => marker}} = Pleroma.Marker.upsert( diff --git a/test/web/mastodon_api/controllers/timeline_controller_test.exs b/test/web/mastodon_api/controllers/timeline_controller_test.exs index 517cabcff..c6e0268fd 100644 --- a/test/web/mastodon_api/controllers/timeline_controller_test.exs +++ b/test/web/mastodon_api/controllers/timeline_controller_test.exs @@ -114,8 +114,16 @@ test "doesn't return replies if follower is posting with blocked user" do {:ok, _reply_from_friend} = CommonAPI.post(friend, %{status: "status", in_reply_to_status_id: reply_from_blockee}) - res_conn = get(conn, "/api/v1/timelines/public") - [%{"id" => ^activity_id}] = json_response_and_validate_schema(res_conn, 200) + # Still shows replies from yourself + {:ok, %{id: reply_from_me}} = + CommonAPI.post(blocker, %{status: "status", in_reply_to_status_id: reply_from_blockee}) + + response = + get(conn, "/api/v1/timelines/public") + |> json_response_and_validate_schema(200) + + assert length(response) == 2 + [%{"id" => ^reply_from_me}, %{"id" => ^activity_id}] = response end test "doesn't return replies if follow is posting with users from blocked domain" do diff --git a/test/web/mastodon_api/views/account_view_test.exs b/test/web/mastodon_api/views/account_view_test.exs index 9f22f9dcf..a5f39b215 100644 --- a/test/web/mastodon_api/views/account_view_test.exs +++ b/test/web/mastodon_api/views/account_view_test.exs @@ -5,6 +5,7 @@ defmodule Pleroma.Web.MastodonAPI.AccountViewTest do use Pleroma.DataCase + alias Pleroma.Config alias Pleroma.User alias Pleroma.UserRelationship alias Pleroma.Web.CommonAPI @@ -68,7 +69,7 @@ test "Represent a user account" do sensitive: false, pleroma: %{ actor_type: "Person", - discoverable: false + discoverable: true }, fields: [] }, @@ -166,7 +167,7 @@ test "Represent a Service(bot) account" do sensitive: false, pleroma: %{ actor_type: "Service", - discoverable: false + discoverable: true }, fields: [] }, @@ -448,7 +449,7 @@ test "shows unread_conversation_count only to the account owner" do test "shows unread_count only to the account owner" do user = insert(:user) - insert_list(7, :notification, user: user) + insert_list(7, :notification, user: user, activity: insert(:note_activity)) other_user = insert(:user) user = User.get_cached_by_ap_id(user.ap_id) @@ -540,8 +541,9 @@ test "shows non-zero when historical unapproved requests are present" do end end - test "uses mediaproxy urls when it's enabled" do + test "uses mediaproxy urls when it's enabled (regardless of media preview proxy state)" do clear_config([:media_proxy, :enabled], true) + clear_config([:media_preview_proxy, :enabled]) user = insert(:user, @@ -550,20 +552,24 @@ test "uses mediaproxy urls when it's enabled" do emoji: %{"joker_smile" => "https://evil.website/society.png"} ) - AccountView.render("show.json", %{user: user, skip_visibility_check: true}) - |> Enum.all?(fn - {key, url} when key in [:avatar, :avatar_static, :header, :header_static] -> - String.starts_with?(url, Pleroma.Web.base_url()) + with media_preview_enabled <- [false, true] do + Config.put([:media_preview_proxy, :enabled], media_preview_enabled) - {:emojis, emojis} -> - Enum.all?(emojis, fn %{url: url, static_url: static_url} -> - String.starts_with?(url, Pleroma.Web.base_url()) && - String.starts_with?(static_url, Pleroma.Web.base_url()) - end) + AccountView.render("show.json", %{user: user, 
skip_visibility_check: true}) + |> Enum.all?(fn + {key, url} when key in [:avatar, :avatar_static, :header, :header_static] -> + String.starts_with?(url, Pleroma.Web.base_url()) - _ -> - true - end) - |> assert() + {:emojis, emojis} -> + Enum.all?(emojis, fn %{url: url, static_url: static_url} -> + String.starts_with?(url, Pleroma.Web.base_url()) && + String.starts_with?(static_url, Pleroma.Web.base_url()) + end) + + _ -> + true + end) + |> assert() + end end end diff --git a/test/web/media_proxy/media_proxy_controller_test.exs b/test/web/media_proxy/media_proxy_controller_test.exs index d4db44c63..e9b584822 100644 --- a/test/web/media_proxy/media_proxy_controller_test.exs +++ b/test/web/media_proxy/media_proxy_controller_test.exs @@ -8,34 +8,34 @@ defmodule Pleroma.Web.MediaProxy.MediaProxyControllerTest do import Mock alias Pleroma.Web.MediaProxy - alias Pleroma.Web.MediaProxy.MediaProxyController alias Plug.Conn setup do on_exit(fn -> Cachex.clear(:banned_urls_cache) end) end - test "it returns 404 when MediaProxy disabled", %{conn: conn} do - clear_config([:media_proxy, :enabled], false) - - assert %Conn{ - status: 404, - resp_body: "Not Found" - } = get(conn, "/proxy/hhgfh/eeeee") - - assert %Conn{ - status: 404, - resp_body: "Not Found" - } = get(conn, "/proxy/hhgfh/eeee/fff") - end - - describe "" do + describe "Media Proxy" do setup do clear_config([:media_proxy, :enabled], true) clear_config([Pleroma.Web.Endpoint, :secret_key_base], "00000000000") + [url: MediaProxy.encode_url("https://google.fn/test.png")] end + test "it returns 404 when disabled", %{conn: conn} do + clear_config([:media_proxy, :enabled], false) + + assert %Conn{ + status: 404, + resp_body: "Not Found" + } = get(conn, "/proxy/hhgfh/eeeee") + + assert %Conn{ + status: 404, + resp_body: "Not Found" + } = get(conn, "/proxy/hhgfh/eeee/fff") + end + test "it returns 403 for invalid signature", %{conn: conn, url: url} do Pleroma.Config.put([Pleroma.Web.Endpoint, :secret_key_base], "000") %{path: path} = URI.parse(url) @@ -56,7 +56,7 @@ test "it returns 403 for invalid signature", %{conn: conn, url: url} do } = get(conn, "/proxy/hhgfh/eeee/fff") end - test "redirects on valid url when filename is invalidated", %{conn: conn, url: url} do + test "redirects to valid url when filename is invalidated", %{conn: conn, url: url} do invalid_url = String.replace(url, "test.png", "test-file.png") response = get(conn, invalid_url) assert response.status == 302 @@ -80,42 +80,263 @@ test "it returns 404 when url is in banned_urls cache", %{conn: conn, url: url} end end - describe "filename_matches/3" do - test "preserves the encoded or decoded path" do - assert MediaProxyController.filename_matches( - %{"filename" => "/Hello world.jpg"}, - "/Hello world.jpg", - "http://pleroma.social/Hello world.jpg" - ) == :ok + describe "Media Preview Proxy" do + def assert_dependencies_installed do + missing_dependencies = Pleroma.Helpers.MediaHelper.missing_dependencies() - assert MediaProxyController.filename_matches( - %{"filename" => "/Hello%20world.jpg"}, - "/Hello%20world.jpg", - "http://pleroma.social/Hello%20world.jpg" - ) == :ok - - assert MediaProxyController.filename_matches( - %{"filename" => "/my%2Flong%2Furl%2F2019%2F07%2FS.jpg"}, - "/my%2Flong%2Furl%2F2019%2F07%2FS.jpg", - "http://pleroma.social/my%2Flong%2Furl%2F2019%2F07%2FS.jpg" - ) == :ok - - assert MediaProxyController.filename_matches( - %{"filename" => "/my%2Flong%2Furl%2F2019%2F07%2FS.jp"}, - "/my%2Flong%2Furl%2F2019%2F07%2FS.jp", - 
"http://pleroma.social/my%2Flong%2Furl%2F2019%2F07%2FS.jpg" - ) == {:wrong_filename, "my%2Flong%2Furl%2F2019%2F07%2FS.jpg"} + assert missing_dependencies == [], + "Error: missing dependencies (please refer to `docs/installation`): #{ + inspect(missing_dependencies) + }" end - test "encoded url are tried to match for proxy as `conn.request_path` encodes the url" do - # conn.request_path will return encoded url - request_path = "/ANALYSE-DAI-_-LE-STABLECOIN-100-D%C3%89CENTRALIS%C3%89-BQ.jpg" + setup do + clear_config([:media_proxy, :enabled], true) + clear_config([:media_preview_proxy, :enabled], true) + clear_config([Pleroma.Web.Endpoint, :secret_key_base], "00000000000") - assert MediaProxyController.filename_matches( - true, - request_path, - "https://mydomain.com/uploads/2019/07/ANALYSE-DAI-_-LE-STABLECOIN-100-DÉCENTRALISÉ-BQ.jpg" - ) == :ok + original_url = "https://google.fn/test.png" + + [ + url: MediaProxy.encode_preview_url(original_url), + media_proxy_url: MediaProxy.encode_url(original_url) + ] + end + + test "returns 404 when media proxy is disabled", %{conn: conn} do + clear_config([:media_proxy, :enabled], false) + + assert %Conn{ + status: 404, + resp_body: "Not Found" + } = get(conn, "/proxy/preview/hhgfh/eeeee") + + assert %Conn{ + status: 404, + resp_body: "Not Found" + } = get(conn, "/proxy/preview/hhgfh/fff") + end + + test "returns 404 when disabled", %{conn: conn} do + clear_config([:media_preview_proxy, :enabled], false) + + assert %Conn{ + status: 404, + resp_body: "Not Found" + } = get(conn, "/proxy/preview/hhgfh/eeeee") + + assert %Conn{ + status: 404, + resp_body: "Not Found" + } = get(conn, "/proxy/preview/hhgfh/fff") + end + + test "it returns 403 for invalid signature", %{conn: conn, url: url} do + Pleroma.Config.put([Pleroma.Web.Endpoint, :secret_key_base], "000") + %{path: path} = URI.parse(url) + + assert %Conn{ + status: 403, + resp_body: "Forbidden" + } = get(conn, path) + + assert %Conn{ + status: 403, + resp_body: "Forbidden" + } = get(conn, "/proxy/preview/hhgfh/eeee") + + assert %Conn{ + status: 403, + resp_body: "Forbidden" + } = get(conn, "/proxy/preview/hhgfh/eeee/fff") + end + + test "redirects to valid url when filename is invalidated", %{conn: conn, url: url} do + invalid_url = String.replace(url, "test.png", "test-file.png") + response = get(conn, invalid_url) + assert response.status == 302 + assert redirected_to(response) == url + end + + test "responds with 424 Failed Dependency if HEAD request to media proxy fails", %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 500, body: ""} + end) + + response = get(conn, url) + assert response.status == 424 + assert response.resp_body == "Can't fetch HTTP headers (HTTP 500)." 
+ end + + test "redirects to media proxy URI on unsupported content type", %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "", headers: [{"content-type", "application/pdf"}]} + end) + + response = get(conn, url) + assert response.status == 302 + assert redirected_to(response) == media_proxy_url + end + + test "with `static=true` and GIF image preview requested, responds with JPEG image", %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + assert_dependencies_installed() + + # Setting a high :min_content_length to ensure this scenario is not affected by its logic + clear_config([:media_preview_proxy, :min_content_length], 1_000_000_000) + + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{ + status: 200, + body: "", + headers: [{"content-type", "image/gif"}, {"content-length", "1001718"}] + } + + %{method: :get, url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: File.read!("test/fixtures/image.gif")} + end) + + response = get(conn, url <> "?static=true") + + assert response.status == 200 + assert Conn.get_resp_header(response, "content-type") == ["image/jpeg"] + assert response.resp_body != "" + end + + test "with GIF image preview requested and no `static` param, redirects to media proxy URI", + %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "", headers: [{"content-type", "image/gif"}]} + end) + + response = get(conn, url) + + assert response.status == 302 + assert redirected_to(response) == media_proxy_url + end + + test "with `static` param and non-GIF image preview requested, " <> + "redirects to media preview proxy URI without `static` param", + %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "", headers: [{"content-type", "image/jpeg"}]} + end) + + response = get(conn, url <> "?static=true") + + assert response.status == 302 + assert redirected_to(response) == url + end + + test "with :min_content_length setting not matched by Content-Length header, " <> + "redirects to media proxy URI", + %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + clear_config([:media_preview_proxy, :min_content_length], 100_000) + + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{ + status: 200, + body: "", + headers: [{"content-type", "image/gif"}, {"content-length", "5000"}] + } + end) + + response = get(conn, url) + + assert response.status == 302 + assert redirected_to(response) == media_proxy_url + end + + test "thumbnails PNG images into PNG", %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + assert_dependencies_installed() + + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "", headers: [{"content-type", "image/png"}]} + + %{method: :get, url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: File.read!("test/fixtures/image.png")} + end) + + response = get(conn, url) + + assert response.status == 200 + assert Conn.get_resp_header(response, "content-type") == ["image/png"] + assert response.resp_body != "" + end + + test "thumbnails JPEG images into JPEG", %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + assert_dependencies_installed() + 
+ Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "", headers: [{"content-type", "image/jpeg"}]} + + %{method: :get, url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")} + end) + + response = get(conn, url) + + assert response.status == 200 + assert Conn.get_resp_header(response, "content-type") == ["image/jpeg"] + assert response.resp_body != "" + end + + test "redirects to media proxy URI in case of thumbnailing error", %{ + conn: conn, + url: url, + media_proxy_url: media_proxy_url + } do + Tesla.Mock.mock(fn + %{method: "head", url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "", headers: [{"content-type", "image/jpeg"}]} + + %{method: :get, url: ^media_proxy_url} -> + %Tesla.Env{status: 200, body: "error"} + end) + + response = get(conn, url) + + assert response.status == 302 + assert redirected_to(response) == media_proxy_url end end end diff --git a/test/web/media_proxy/media_proxy_test.exs b/test/web/media_proxy/media_proxy_test.exs index 72885cfdd..0e6df826c 100644 --- a/test/web/media_proxy/media_proxy_test.exs +++ b/test/web/media_proxy/media_proxy_test.exs @@ -6,9 +6,16 @@ defmodule Pleroma.Web.MediaProxyTest do use ExUnit.Case use Pleroma.Tests.Helpers + alias Pleroma.Config alias Pleroma.Web.Endpoint alias Pleroma.Web.MediaProxy + defp decode_result(encoded) do + [_, "proxy", sig, base64 | _] = URI.parse(encoded).path |> String.split("/") + {:ok, decoded} = MediaProxy.decode_url(sig, base64) + decoded + end + describe "when enabled" do setup do: clear_config([:media_proxy, :enabled], true) @@ -35,7 +42,7 @@ test "encodes and decodes URL" do assert String.starts_with?( encoded, - Pleroma.Config.get([:media_proxy, :base_url], Pleroma.Web.base_url()) + Config.get([:media_proxy, :base_url], Pleroma.Web.base_url()) ) assert String.ends_with?(encoded, "/logo.png") @@ -75,6 +82,64 @@ test "validates signature" do assert MediaProxy.decode_url(sig, base64) == {:error, :invalid_signature} end + def test_verify_request_path_and_url(request_path, url, expected_result) do + assert MediaProxy.verify_request_path_and_url(request_path, url) == expected_result + + assert MediaProxy.verify_request_path_and_url( + %Plug.Conn{ + params: %{"filename" => Path.basename(request_path)}, + request_path: request_path + }, + url + ) == expected_result + end + + test "if first arg of `verify_request_path_and_url/2` is a Plug.Conn without \"filename\" " <> + "parameter, `verify_request_path_and_url/2` returns :ok " do + assert MediaProxy.verify_request_path_and_url( + %Plug.Conn{params: %{}, request_path: "/some/path"}, + "https://instance.com/file.jpg" + ) == :ok + + assert MediaProxy.verify_request_path_and_url( + %Plug.Conn{params: %{}, request_path: "/path/to/file.jpg"}, + "https://instance.com/file.jpg" + ) == :ok + end + + test "`verify_request_path_and_url/2` preserves the encoded or decoded path" do + test_verify_request_path_and_url( + "/Hello world.jpg", + "http://pleroma.social/Hello world.jpg", + :ok + ) + + test_verify_request_path_and_url( + "/Hello%20world.jpg", + "http://pleroma.social/Hello%20world.jpg", + :ok + ) + + test_verify_request_path_and_url( + "/my%2Flong%2Furl%2F2019%2F07%2FS.jpg", + "http://pleroma.social/my%2Flong%2Furl%2F2019%2F07%2FS.jpg", + :ok + ) + + test_verify_request_path_and_url( + # Note: `conn.request_path` returns encoded url + "/ANALYSE-DAI-_-LE-STABLECOIN-100-D%C3%89CENTRALIS%C3%89-BQ.jpg", + 
"https://mydomain.com/uploads/2019/07/ANALYSE-DAI-_-LE-STABLECOIN-100-DÉCENTRALISÉ-BQ.jpg", + :ok + ) + + test_verify_request_path_and_url( + "/my%2Flong%2Furl%2F2019%2F07%2FS", + "http://pleroma.social/my%2Flong%2Furl%2F2019%2F07%2FS.jpg", + {:wrong_filename, "my%2Flong%2Furl%2F2019%2F07%2FS.jpg"} + ) + end + test "uses the configured base_url" do base_url = "https://cache.pleroma.social" clear_config([:media_proxy, :base_url], base_url) @@ -124,12 +189,6 @@ test "does not encode remote urls" do end end - defp decode_result(encoded) do - [_, "proxy", sig, base64 | _] = URI.parse(encoded).path |> String.split("/") - {:ok, decoded} = MediaProxy.decode_url(sig, base64) - decoded - end - describe "whitelist" do setup do: clear_config([:media_proxy, :enabled], true) diff --git a/test/web/metadata/metadata_test.exs b/test/web/metadata/metadata_test.exs index 9d3121b7b..ca6cbe67f 100644 --- a/test/web/metadata/metadata_test.exs +++ b/test/web/metadata/metadata_test.exs @@ -16,7 +16,14 @@ test "for remote user" do end test "for local user" do - user = insert(:user) + user = insert(:user, discoverable: false) + + assert Pleroma.Web.Metadata.build_tags(%{user: user}) =~ + "" + end + + test "for local user set to discoverable" do + user = insert(:user, discoverable: true) refute Pleroma.Web.Metadata.build_tags(%{user: user}) =~ "" @@ -24,11 +31,19 @@ test "for local user" do end describe "no metadata for private instances" do - test "for local user" do + test "for local user set to discoverable" do clear_config([:instance, :public], false) - user = insert(:user, bio: "This is my secret fedi account bio") + user = insert(:user, bio: "This is my secret fedi account bio", discoverable: true) assert "" = Pleroma.Web.Metadata.build_tags(%{user: user}) end + + test "search exclusion metadata is included" do + clear_config([:instance, :public], false) + user = insert(:user, bio: "This is my secret fedi account bio", discoverable: false) + + assert ~s() == + Pleroma.Web.Metadata.build_tags(%{user: user}) + end end end diff --git a/test/web/metadata/restrict_indexing_test.exs b/test/web/metadata/restrict_indexing_test.exs index aad0bac42..6b3a65372 100644 --- a/test/web/metadata/restrict_indexing_test.exs +++ b/test/web/metadata/restrict_indexing_test.exs @@ -14,8 +14,14 @@ test "for remote user" do test "for local user" do assert Pleroma.Web.Metadata.Providers.RestrictIndexing.build_tags(%{ - user: %Pleroma.User{local: true} + user: %Pleroma.User{local: true, discoverable: true} }) == [] end + + test "for local user when discoverable is false" do + assert Pleroma.Web.Metadata.Providers.RestrictIndexing.build_tags(%{ + user: %Pleroma.User{local: true, discoverable: false} + }) == [{:meta, [name: "robots", content: "noindex, noarchive"], []}] + end end end diff --git a/test/web/pleroma_api/controllers/chat_controller_test.exs b/test/web/pleroma_api/controllers/chat_controller_test.exs index 7be5fe09c..11d5ba373 100644 --- a/test/web/pleroma_api/controllers/chat_controller_test.exs +++ b/test/web/pleroma_api/controllers/chat_controller_test.exs @@ -2,7 +2,7 @@ # Copyright © 2017-2020 Pleroma Authors # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.Web.PleromaAPI.ChatControllerTest do - use Pleroma.Web.ConnCase, async: true + use Pleroma.Web.ConnCase alias Pleroma.Chat alias Pleroma.Chat.MessageReference @@ -100,7 +100,7 @@ test "it fails if there is no content", %{conn: conn, user: user} do |> post("/api/v1/pleroma/chats/#{chat.id}/messages") |> json_response_and_validate_schema(400) - assert result 
+ assert %{"error" => "no_content"} == result end test "it works with an attachment", %{conn: conn, user: user} do @@ -126,6 +126,23 @@ test "it works with an attachment", %{conn: conn, user: user} do assert result["attachment"] end + + test "gets MRF reason when rejected", %{conn: conn, user: user} do + clear_config([:mrf_keyword, :reject], ["GNO"]) + clear_config([:mrf, :policies], [Pleroma.Web.ActivityPub.MRF.KeywordPolicy]) + + other_user = insert(:user) + + {:ok, chat} = Chat.get_or_create(user.id, other_user.ap_id) + + result = + conn + |> put_req_header("content-type", "application/json") + |> post("/api/v1/pleroma/chats/#{chat.id}/messages", %{"content" => "GNO/Linux"}) + |> json_response_and_validate_schema(422) + + assert %{"error" => "[KeywordPolicy] Matches with rejected keyword"} == result + end end describe "DELETE /api/v1/pleroma/chats/:id/messages/:message_id" do @@ -184,17 +201,39 @@ test "it paginates", %{conn: conn, user: user} do chat = Chat.get(user.id, recipient.ap_id) - result = - conn - |> get("/api/v1/pleroma/chats/#{chat.id}/messages") - |> json_response_and_validate_schema(200) + response = get(conn, "/api/v1/pleroma/chats/#{chat.id}/messages") + result = json_response_and_validate_schema(response, 200) + + [next, prev] = get_resp_header(response, "link") |> hd() |> String.split(", ") + api_endpoint = "/api/v1/pleroma/chats/" + + assert String.match?( + next, + ~r(#{api_endpoint}.*/messages\?id=.*&limit=\d+&max_id=.*; rel=\"next\"$) + ) + + assert String.match?( + prev, + ~r(#{api_endpoint}.*/messages\?id=.*&limit=\d+&min_id=.*; rel=\"prev\"$) + ) assert length(result) == 20 - result = - conn - |> get("/api/v1/pleroma/chats/#{chat.id}/messages?max_id=#{List.last(result)["id"]}") - |> json_response_and_validate_schema(200) + response = + get(conn, "/api/v1/pleroma/chats/#{chat.id}/messages?max_id=#{List.last(result)["id"]}") + + result = json_response_and_validate_schema(response, 200) + [next, prev] = get_resp_header(response, "link") |> hd() |> String.split(", ") + + assert String.match?( + next, + ~r(#{api_endpoint}.*/messages\?id=.*&limit=\d+&max_id=.*; rel=\"next\"$) + ) + + assert String.match?( + prev, + ~r(#{api_endpoint}.*/messages\?id=.*&limit=\d+&max_id=.*&min_id=.*; rel=\"prev\"$) + ) assert length(result) == 10 end @@ -223,12 +262,10 @@ test "it returns the messages for a given chat", %{conn: conn, user: user} do assert length(result) == 3 # Trying to get the chat of a different user - result = - conn - |> assign(:user, other_user) - |> get("/api/v1/pleroma/chats/#{chat.id}/messages") - - assert result |> json_response(404) + conn + |> assign(:user, other_user) + |> get("/api/v1/pleroma/chats/#{chat.id}/messages") + |> json_response_and_validate_schema(404) end end diff --git a/test/web/pleroma_api/controllers/emoji_file_controller_test.exs b/test/web/pleroma_api/controllers/emoji_file_controller_test.exs new file mode 100644 index 000000000..82de86ee3 --- /dev/null +++ b/test/web/pleroma_api/controllers/emoji_file_controller_test.exs @@ -0,0 +1,357 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.PleromaAPI.EmojiFileControllerTest do + use Pleroma.Web.ConnCase + + import Tesla.Mock + import Pleroma.Factory + + @emoji_path Path.join( + Pleroma.Config.get!([:instance, :static_dir]), + "emoji" + ) + setup do: clear_config([:auth, :enforce_oauth_admin_scope_usage], false) + + setup do: clear_config([:instance, :public], true) + + setup do + 
admin = insert(:user, is_admin: true) + token = insert(:oauth_admin_token, user: admin) + + admin_conn = + build_conn() + |> assign(:user, admin) + |> assign(:token, token) + + Pleroma.Emoji.reload() + {:ok, %{admin_conn: admin_conn}} + end + + describe "POST/PATCH/DELETE /api/pleroma/emoji/packs/files?name=:name" do + setup do + pack_file = "#{@emoji_path}/test_pack/pack.json" + original_content = File.read!(pack_file) + + on_exit(fn -> + File.write!(pack_file, original_content) + end) + + :ok + end + + test "upload zip file with emojies", %{admin_conn: admin_conn} do + on_exit(fn -> + [ + "128px/a_trusted_friend-128.png", + "auroraborealis.png", + "1000px/baby_in_a_box.png", + "1000px/bear.png", + "128px/bear-128.png" + ] + |> Enum.each(fn path -> File.rm_rf!("#{@emoji_path}/test_pack/#{path}") end) + end) + + resp = + admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + file: %Plug.Upload{ + content_type: "application/zip", + filename: "emojis.zip", + path: Path.absname("test/fixtures/emojis.zip") + } + }) + |> json_response_and_validate_schema(200) + + assert resp == %{ + "a_trusted_friend-128" => "128px/a_trusted_friend-128.png", + "auroraborealis" => "auroraborealis.png", + "baby_in_a_box" => "1000px/baby_in_a_box.png", + "bear" => "1000px/bear.png", + "bear-128" => "128px/bear-128.png", + "blank" => "blank.png", + "blank2" => "blank2.png" + } + + Enum.each(Map.values(resp), fn path -> + assert File.exists?("#{@emoji_path}/test_pack/#{path}") + end) + end + + test "create shortcode exists", %{admin_conn: admin_conn} do + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank", + filename: "dir/blank.png", + file: %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + }) + |> json_response_and_validate_schema(:conflict) == %{ + "error" => "An emoji with the \"blank\" shortcode already exists" + } + end + + test "don't rewrite old emoji", %{admin_conn: admin_conn} do + on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/dir/") end) + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank3", + filename: "dir/blank.png", + file: %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + }) + |> json_response_and_validate_schema(200) == %{ + "blank" => "blank.png", + "blank2" => "blank2.png", + "blank3" => "dir/blank.png" + } + + assert File.exists?("#{@emoji_path}/test_pack/dir/blank.png") + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank", + new_shortcode: "blank2", + new_filename: "dir_2/blank_3.png" + }) + |> json_response_and_validate_schema(:conflict) == %{ + "error" => + "New shortcode \"blank2\" is already used. 
If you want to override emoji use 'force' option" + } + end + + test "rewrite old emoji with force option", %{admin_conn: admin_conn} do + on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/dir_2/") end) + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank3", + filename: "dir/blank.png", + file: %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + }) + |> json_response_and_validate_schema(200) == %{ + "blank" => "blank.png", + "blank2" => "blank2.png", + "blank3" => "dir/blank.png" + } + + assert File.exists?("#{@emoji_path}/test_pack/dir/blank.png") + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank3", + new_shortcode: "blank4", + new_filename: "dir_2/blank_3.png", + force: true + }) + |> json_response_and_validate_schema(200) == %{ + "blank" => "blank.png", + "blank2" => "blank2.png", + "blank4" => "dir_2/blank_3.png" + } + + assert File.exists?("#{@emoji_path}/test_pack/dir_2/blank_3.png") + end + + test "with empty filename", %{admin_conn: admin_conn} do + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank2", + filename: "", + file: %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + }) + |> json_response_and_validate_schema(422) == %{ + "error" => "pack name, shortcode or filename cannot be empty" + } + end + + test "add file with not loaded pack", %{admin_conn: admin_conn} do + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=not_loaded", %{ + shortcode: "blank3", + filename: "dir/blank.png", + file: %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + }) + |> json_response_and_validate_schema(:not_found) == %{ + "error" => "pack \"not_loaded\" is not found" + } + end + + test "remove file with not loaded pack", %{admin_conn: admin_conn} do + assert admin_conn + |> delete("/api/pleroma/emoji/packs/files?name=not_loaded&shortcode=blank3") + |> json_response_and_validate_schema(:not_found) == %{ + "error" => "pack \"not_loaded\" is not found" + } + end + + test "remove file with empty shortcode", %{admin_conn: admin_conn} do + assert admin_conn + |> delete("/api/pleroma/emoji/packs/files?name=not_loaded&shortcode=") + |> json_response_and_validate_schema(:not_found) == %{ + "error" => "pack \"not_loaded\" is not found" + } + end + + test "update file with not loaded pack", %{admin_conn: admin_conn} do + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/emoji/packs/files?name=not_loaded", %{ + shortcode: "blank4", + new_shortcode: "blank3", + new_filename: "dir_2/blank_3.png" + }) + |> json_response_and_validate_schema(:not_found) == %{ + "error" => "pack \"not_loaded\" is not found" + } + end + + test "new with shortcode as file with update", %{admin_conn: admin_conn} do + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank4", + filename: "dir/blank.png", + file: %Plug.Upload{ + filename: "blank.png", + path: "#{@emoji_path}/test_pack/blank.png" + } + }) + |> json_response_and_validate_schema(200) == %{ + "blank" => 
"blank.png", + "blank4" => "dir/blank.png", + "blank2" => "blank2.png" + } + + assert File.exists?("#{@emoji_path}/test_pack/dir/blank.png") + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank4", + new_shortcode: "blank3", + new_filename: "dir_2/blank_3.png" + }) + |> json_response_and_validate_schema(200) == %{ + "blank3" => "dir_2/blank_3.png", + "blank" => "blank.png", + "blank2" => "blank2.png" + } + + refute File.exists?("#{@emoji_path}/test_pack/dir/") + assert File.exists?("#{@emoji_path}/test_pack/dir_2/blank_3.png") + + assert admin_conn + |> delete("/api/pleroma/emoji/packs/files?name=test_pack&shortcode=blank3") + |> json_response_and_validate_schema(200) == %{ + "blank" => "blank.png", + "blank2" => "blank2.png" + } + + refute File.exists?("#{@emoji_path}/test_pack/dir_2/") + + on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/dir") end) + end + + test "new with shortcode from url", %{admin_conn: admin_conn} do + mock(fn + %{ + method: :get, + url: "https://test-blank/blank_url.png" + } -> + text(File.read!("#{@emoji_path}/test_pack/blank.png")) + end) + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank_url", + file: "https://test-blank/blank_url.png" + }) + |> json_response_and_validate_schema(200) == %{ + "blank_url" => "blank_url.png", + "blank" => "blank.png", + "blank2" => "blank2.png" + } + + assert File.exists?("#{@emoji_path}/test_pack/blank_url.png") + + on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/blank_url.png") end) + end + + test "new without shortcode", %{admin_conn: admin_conn} do + on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/shortcode.png") end) + + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> post("/api/pleroma/emoji/packs/files?name=test_pack", %{ + file: %Plug.Upload{ + filename: "shortcode.png", + path: "#{Pleroma.Config.get([:instance, :static_dir])}/add/shortcode.png" + } + }) + |> json_response_and_validate_schema(200) == %{ + "shortcode" => "shortcode.png", + "blank" => "blank.png", + "blank2" => "blank2.png" + } + end + + test "remove non existing shortcode in pack.json", %{admin_conn: admin_conn} do + assert admin_conn + |> delete("/api/pleroma/emoji/packs/files?name=test_pack&shortcode=blank3") + |> json_response_and_validate_schema(:bad_request) == %{ + "error" => "Emoji \"blank3\" does not exist" + } + end + + test "update non existing emoji", %{admin_conn: admin_conn} do + assert admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank3", + new_shortcode: "blank4", + new_filename: "dir_2/blank_3.png" + }) + |> json_response_and_validate_schema(:bad_request) == %{ + "error" => "Emoji \"blank3\" does not exist" + } + end + + test "update with empty shortcode", %{admin_conn: admin_conn} do + assert %{ + "error" => "Missing field: new_shortcode." 
+ } = + admin_conn + |> put_req_header("content-type", "multipart/form-data") + |> patch("/api/pleroma/emoji/packs/files?name=test_pack", %{ + shortcode: "blank", + new_filename: "dir_2/blank_3.png" + }) + |> json_response_and_validate_schema(:bad_request) + end + end +end diff --git a/test/web/pleroma_api/controllers/emoji_pack_controller_test.exs b/test/web/pleroma_api/controllers/emoji_pack_controller_test.exs index e113bb15f..386ad8634 100644 --- a/test/web/pleroma_api/controllers/emoji_pack_controller_test.exs +++ b/test/web/pleroma_api/controllers/emoji_pack_controller_test.exs @@ -37,11 +37,11 @@ test "GET /api/pleroma/emoji/packs when :public: false", %{conn: conn} do test "GET /api/pleroma/emoji/packs", %{conn: conn} do resp = conn |> get("/api/pleroma/emoji/packs") |> json_response_and_validate_schema(200) - assert resp["count"] == 3 + assert resp["count"] == 4 assert resp["packs"] |> Map.keys() - |> length() == 3 + |> length() == 4 shared = resp["packs"]["test_pack"] assert shared["files"] == %{"blank" => "blank.png", "blank2" => "blank2.png"} @@ -58,7 +58,7 @@ test "GET /api/pleroma/emoji/packs", %{conn: conn} do |> get("/api/pleroma/emoji/packs?page_size=1") |> json_response_and_validate_schema(200) - assert resp["count"] == 3 + assert resp["count"] == 4 packs = Map.keys(resp["packs"]) @@ -71,7 +71,7 @@ test "GET /api/pleroma/emoji/packs", %{conn: conn} do |> get("/api/pleroma/emoji/packs?page_size=1&page=2") |> json_response_and_validate_schema(200) - assert resp["count"] == 3 + assert resp["count"] == 4 packs = Map.keys(resp["packs"]) assert length(packs) == 1 [pack2] = packs @@ -81,18 +81,28 @@ test "GET /api/pleroma/emoji/packs", %{conn: conn} do |> get("/api/pleroma/emoji/packs?page_size=1&page=3") |> json_response_and_validate_schema(200) - assert resp["count"] == 3 + assert resp["count"] == 4 packs = Map.keys(resp["packs"]) assert length(packs) == 1 [pack3] = packs - assert [pack1, pack2, pack3] |> Enum.uniq() |> length() == 3 + + resp = + conn + |> get("/api/pleroma/emoji/packs?page_size=1&page=4") + |> json_response_and_validate_schema(200) + + assert resp["count"] == 4 + packs = Map.keys(resp["packs"]) + assert length(packs) == 1 + [pack4] = packs + assert [pack1, pack2, pack3, pack4] |> Enum.uniq() |> length() == 4 end describe "GET /api/pleroma/emoji/packs/remote" do test "shareable instance", %{admin_conn: admin_conn, conn: conn} do resp = conn - |> get("/api/pleroma/emoji/packs") + |> get("/api/pleroma/emoji/packs?page=2&page_size=1") |> json_response_and_validate_schema(200) mock(fn @@ -102,12 +112,12 @@ test "shareable instance", %{admin_conn: admin_conn, conn: conn} do %{method: :get, url: "https://example.com/nodeinfo/2.1.json"} -> json(%{metadata: %{features: ["shareable_emoji_packs"]}}) - %{method: :get, url: "https://example.com/api/pleroma/emoji/packs"} -> + %{method: :get, url: "https://example.com/api/pleroma/emoji/packs?page=2&page_size=1"} -> json(resp) end) assert admin_conn - |> get("/api/pleroma/emoji/packs/remote?url=https://example.com") + |> get("/api/pleroma/emoji/packs/remote?url=https://example.com&page=2&page_size=1") |> json_response_and_validate_schema(200) == resp end @@ -128,11 +138,11 @@ test "non shareable instance", %{admin_conn: admin_conn} do end end - describe "GET /api/pleroma/emoji/packs/:name/archive" do + describe "GET /api/pleroma/emoji/packs/archive?name=:name" do test "download shared pack", %{conn: conn} do resp = conn - |> get("/api/pleroma/emoji/packs/test_pack/archive") + |> 
get("/api/pleroma/emoji/packs/archive?name=test_pack") |> response(200) {:ok, arch} = :zip.unzip(resp, [:memory]) @@ -143,7 +153,7 @@ test "download shared pack", %{conn: conn} do test "non existing pack", %{conn: conn} do assert conn - |> get("/api/pleroma/emoji/packs/test_pack_for_import/archive") + |> get("/api/pleroma/emoji/packs/archive?name=test_pack_for_import") |> json_response_and_validate_schema(:not_found) == %{ "error" => "Pack test_pack_for_import does not exist" } @@ -151,7 +161,7 @@ test "non existing pack", %{conn: conn} do test "non downloadable pack", %{conn: conn} do assert conn - |> get("/api/pleroma/emoji/packs/test_pack_nonshared/archive") + |> get("/api/pleroma/emoji/packs/archive?name=test_pack_nonshared") |> json_response_and_validate_schema(:forbidden) == %{ "error" => "Pack test_pack_nonshared cannot be downloaded from this instance, either pack sharing was disabled for this pack or some files are missing" @@ -173,28 +183,28 @@ test "shared pack from remote and non shared from fallback-src", %{ %{ method: :get, - url: "https://example.com/api/pleroma/emoji/packs/test_pack" + url: "https://example.com/api/pleroma/emoji/pack?name=test_pack" } -> conn - |> get("/api/pleroma/emoji/packs/test_pack") + |> get("/api/pleroma/emoji/pack?name=test_pack") |> json_response_and_validate_schema(200) |> json() %{ method: :get, - url: "https://example.com/api/pleroma/emoji/packs/test_pack/archive" + url: "https://example.com/api/pleroma/emoji/packs/archive?name=test_pack" } -> conn - |> get("/api/pleroma/emoji/packs/test_pack/archive") + |> get("/api/pleroma/emoji/packs/archive?name=test_pack") |> response(200) |> text() %{ method: :get, - url: "https://example.com/api/pleroma/emoji/packs/test_pack_nonshared" + url: "https://example.com/api/pleroma/emoji/pack?name=test_pack_nonshared" } -> conn - |> get("/api/pleroma/emoji/packs/test_pack_nonshared") + |> get("/api/pleroma/emoji/pack?name=test_pack_nonshared") |> json_response_and_validate_schema(200) |> json() @@ -218,7 +228,7 @@ test "shared pack from remote and non shared from fallback-src", %{ assert File.exists?("#{@emoji_path}/test_pack2/blank.png") assert admin_conn - |> delete("/api/pleroma/emoji/packs/test_pack2") + |> delete("/api/pleroma/emoji/pack?name=test_pack2") |> json_response_and_validate_schema(200) == "ok" refute File.exists?("#{@emoji_path}/test_pack2") @@ -239,7 +249,7 @@ test "shared pack from remote and non shared from fallback-src", %{ assert File.exists?("#{@emoji_path}/test_pack_nonshared2/blank.png") assert admin_conn - |> delete("/api/pleroma/emoji/packs/test_pack_nonshared2") + |> delete("/api/pleroma/emoji/pack?name=test_pack_nonshared2") |> json_response_and_validate_schema(200) == "ok" refute File.exists?("#{@emoji_path}/test_pack_nonshared2") @@ -279,14 +289,14 @@ test "checksum fail", %{admin_conn: admin_conn} do %{ method: :get, - url: "https://example.com/api/pleroma/emoji/packs/pack_bad_sha" + url: "https://example.com/api/pleroma/emoji/pack?name=pack_bad_sha" } -> {:ok, pack} = Pleroma.Emoji.Pack.load_pack("pack_bad_sha") %Tesla.Env{status: 200, body: Jason.encode!(pack)} %{ method: :get, - url: "https://example.com/api/pleroma/emoji/packs/pack_bad_sha/archive" + url: "https://example.com/api/pleroma/emoji/packs/archive?name=pack_bad_sha" } -> %Tesla.Env{ status: 200, @@ -316,7 +326,7 @@ test "other error", %{admin_conn: admin_conn} do %{ method: :get, - url: "https://example.com/api/pleroma/emoji/packs/test_pack" + url: "https://example.com/api/pleroma/emoji/pack?name=test_pack" } -> {:ok, 
pack} = Pleroma.Emoji.Pack.load_pack("test_pack") %Tesla.Env{status: 200, body: Jason.encode!(pack)} @@ -336,7 +346,7 @@ test "other error", %{admin_conn: admin_conn} do end end - describe "PATCH /api/pleroma/emoji/packs/:name" do + describe "PATCH /api/pleroma/emoji/pack?name=:name" do setup do pack_file = "#{@emoji_path}/test_pack/pack.json" original_content = File.read!(pack_file) @@ -358,7 +368,9 @@ test "other error", %{admin_conn: admin_conn} do test "for a pack without a fallback source", ctx do assert ctx[:admin_conn] |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack", %{"metadata" => ctx[:new_data]}) + |> patch("/api/pleroma/emoji/pack?name=test_pack", %{ + "metadata" => ctx[:new_data] + }) |> json_response_and_validate_schema(200) == ctx[:new_data] assert Jason.decode!(File.read!(ctx[:pack_file]))["pack"] == ctx[:new_data] @@ -384,7 +396,7 @@ test "for a pack with a fallback source", ctx do assert ctx[:admin_conn] |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack", %{metadata: new_data}) + |> patch("/api/pleroma/emoji/pack?name=test_pack", %{metadata: new_data}) |> json_response_and_validate_schema(200) == new_data_with_sha assert Jason.decode!(File.read!(ctx[:pack_file]))["pack"] == new_data_with_sha @@ -404,304 +416,17 @@ test "when the fallback source doesn't have all the files", ctx do assert ctx[:admin_conn] |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack", %{metadata: new_data}) + |> patch("/api/pleroma/emoji/pack?name=test_pack", %{metadata: new_data}) |> json_response_and_validate_schema(:bad_request) == %{ "error" => "The fallback archive does not have all files specified in pack.json" } end end - describe "POST/PATCH/DELETE /api/pleroma/emoji/packs/:name/files" do - setup do - pack_file = "#{@emoji_path}/test_pack/pack.json" - original_content = File.read!(pack_file) - - on_exit(fn -> - File.write!(pack_file, original_content) - end) - - :ok - end - - test "create shortcode exists", %{admin_conn: admin_conn} do - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank", - filename: "dir/blank.png", - file: %Plug.Upload{ - filename: "blank.png", - path: "#{@emoji_path}/test_pack/blank.png" - } - }) - |> json_response_and_validate_schema(:conflict) == %{ - "error" => "An emoji with the \"blank\" shortcode already exists" - } - end - - test "don't rewrite old emoji", %{admin_conn: admin_conn} do - on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/dir/") end) - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank3", - filename: "dir/blank.png", - file: %Plug.Upload{ - filename: "blank.png", - path: "#{@emoji_path}/test_pack/blank.png" - } - }) - |> json_response_and_validate_schema(200) == %{ - "blank" => "blank.png", - "blank2" => "blank2.png", - "blank3" => "dir/blank.png" - } - - assert File.exists?("#{@emoji_path}/test_pack/dir/blank.png") - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank", - new_shortcode: "blank2", - new_filename: "dir_2/blank_3.png" - }) - |> json_response_and_validate_schema(:conflict) == %{ - "error" => - "New shortcode \"blank2\" is already used. 
If you want to override emoji use 'force' option" - } - end - - test "rewrite old emoji with force option", %{admin_conn: admin_conn} do - on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/dir_2/") end) - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank3", - filename: "dir/blank.png", - file: %Plug.Upload{ - filename: "blank.png", - path: "#{@emoji_path}/test_pack/blank.png" - } - }) - |> json_response_and_validate_schema(200) == %{ - "blank" => "blank.png", - "blank2" => "blank2.png", - "blank3" => "dir/blank.png" - } - - assert File.exists?("#{@emoji_path}/test_pack/dir/blank.png") - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank3", - new_shortcode: "blank4", - new_filename: "dir_2/blank_3.png", - force: true - }) - |> json_response_and_validate_schema(200) == %{ - "blank" => "blank.png", - "blank2" => "blank2.png", - "blank4" => "dir_2/blank_3.png" - } - - assert File.exists?("#{@emoji_path}/test_pack/dir_2/blank_3.png") - end - - test "with empty filename", %{admin_conn: admin_conn} do - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank2", - filename: "", - file: %Plug.Upload{ - filename: "blank.png", - path: "#{@emoji_path}/test_pack/blank.png" - } - }) - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "pack name, shortcode or filename cannot be empty" - } - end - - test "add file with not loaded pack", %{admin_conn: admin_conn} do - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/not_loaded/files", %{ - shortcode: "blank3", - filename: "dir/blank.png", - file: %Plug.Upload{ - filename: "blank.png", - path: "#{@emoji_path}/test_pack/blank.png" - } - }) - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "pack \"not_loaded\" is not found" - } - end - - test "remove file with not loaded pack", %{admin_conn: admin_conn} do - assert admin_conn - |> delete("/api/pleroma/emoji/packs/not_loaded/files?shortcode=blank3") - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "pack \"not_loaded\" is not found" - } - end - - test "remove file with empty shortcode", %{admin_conn: admin_conn} do - assert admin_conn - |> delete("/api/pleroma/emoji/packs/not_loaded/files?shortcode=") - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "pack name or shortcode cannot be empty" - } - end - - test "update file with not loaded pack", %{admin_conn: admin_conn} do - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/not_loaded/files", %{ - shortcode: "blank4", - new_shortcode: "blank3", - new_filename: "dir_2/blank_3.png" - }) - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "pack \"not_loaded\" is not found" - } - end - - test "new with shortcode as file with update", %{admin_conn: admin_conn} do - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank4", - filename: "dir/blank.png", - file: %Plug.Upload{ - filename: "blank.png", - path: "#{@emoji_path}/test_pack/blank.png" - } - }) - |> json_response_and_validate_schema(200) == %{ - "blank" => "blank.png", - 
"blank4" => "dir/blank.png", - "blank2" => "blank2.png" - } - - assert File.exists?("#{@emoji_path}/test_pack/dir/blank.png") - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank4", - new_shortcode: "blank3", - new_filename: "dir_2/blank_3.png" - }) - |> json_response_and_validate_schema(200) == %{ - "blank3" => "dir_2/blank_3.png", - "blank" => "blank.png", - "blank2" => "blank2.png" - } - - refute File.exists?("#{@emoji_path}/test_pack/dir/") - assert File.exists?("#{@emoji_path}/test_pack/dir_2/blank_3.png") - - assert admin_conn - |> delete("/api/pleroma/emoji/packs/test_pack/files?shortcode=blank3") - |> json_response_and_validate_schema(200) == %{ - "blank" => "blank.png", - "blank2" => "blank2.png" - } - - refute File.exists?("#{@emoji_path}/test_pack/dir_2/") - - on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/dir") end) - end - - test "new with shortcode from url", %{admin_conn: admin_conn} do - mock(fn - %{ - method: :get, - url: "https://test-blank/blank_url.png" - } -> - text(File.read!("#{@emoji_path}/test_pack/blank.png")) - end) - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank_url", - file: "https://test-blank/blank_url.png" - }) - |> json_response_and_validate_schema(200) == %{ - "blank_url" => "blank_url.png", - "blank" => "blank.png", - "blank2" => "blank2.png" - } - - assert File.exists?("#{@emoji_path}/test_pack/blank_url.png") - - on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/blank_url.png") end) - end - - test "new without shortcode", %{admin_conn: admin_conn} do - on_exit(fn -> File.rm_rf!("#{@emoji_path}/test_pack/shortcode.png") end) - - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> post("/api/pleroma/emoji/packs/test_pack/files", %{ - file: %Plug.Upload{ - filename: "shortcode.png", - path: "#{Pleroma.Config.get([:instance, :static_dir])}/add/shortcode.png" - } - }) - |> json_response_and_validate_schema(200) == %{ - "shortcode" => "shortcode.png", - "blank" => "blank.png", - "blank2" => "blank2.png" - } - end - - test "remove non existing shortcode in pack.json", %{admin_conn: admin_conn} do - assert admin_conn - |> delete("/api/pleroma/emoji/packs/test_pack/files?shortcode=blank3") - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "Emoji \"blank3\" does not exist" - } - end - - test "update non existing emoji", %{admin_conn: admin_conn} do - assert admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank3", - new_shortcode: "blank4", - new_filename: "dir_2/blank_3.png" - }) - |> json_response_and_validate_schema(:bad_request) == %{ - "error" => "Emoji \"blank3\" does not exist" - } - end - - test "update with empty shortcode", %{admin_conn: admin_conn} do - assert %{ - "error" => "Missing field: new_shortcode." 
- } = - admin_conn - |> put_req_header("content-type", "multipart/form-data") - |> patch("/api/pleroma/emoji/packs/test_pack/files", %{ - shortcode: "blank", - new_filename: "dir_2/blank_3.png" - }) - |> json_response_and_validate_schema(:bad_request) - end - end - - describe "POST/DELETE /api/pleroma/emoji/packs/:name" do + describe "POST/DELETE /api/pleroma/emoji/pack?name=:name" do test "creating and deleting a pack", %{admin_conn: admin_conn} do assert admin_conn - |> post("/api/pleroma/emoji/packs/test_created") + |> post("/api/pleroma/emoji/pack?name=test_created") |> json_response_and_validate_schema(200) == "ok" assert File.exists?("#{@emoji_path}/test_created/pack.json") @@ -713,7 +438,7 @@ test "creating and deleting a pack", %{admin_conn: admin_conn} do } assert admin_conn - |> delete("/api/pleroma/emoji/packs/test_created") + |> delete("/api/pleroma/emoji/pack?name=test_created") |> json_response_and_validate_schema(200) == "ok" refute File.exists?("#{@emoji_path}/test_created/pack.json") @@ -726,7 +451,7 @@ test "if pack exists", %{admin_conn: admin_conn} do File.write!(Path.join(path, "pack.json"), pack_file) assert admin_conn - |> post("/api/pleroma/emoji/packs/test_created") + |> post("/api/pleroma/emoji/pack?name=test_created") |> json_response_and_validate_schema(:conflict) == %{ "error" => "A pack named \"test_created\" already exists" } @@ -736,7 +461,7 @@ test "if pack exists", %{admin_conn: admin_conn} do test "with empty name", %{admin_conn: admin_conn} do assert admin_conn - |> post("/api/pleroma/emoji/packs/ ") + |> post("/api/pleroma/emoji/pack?name= ") |> json_response_and_validate_schema(:bad_request) == %{ "error" => "pack name cannot be empty" } @@ -745,7 +470,7 @@ test "with empty name", %{admin_conn: admin_conn} do test "deleting nonexisting pack", %{admin_conn: admin_conn} do assert admin_conn - |> delete("/api/pleroma/emoji/packs/non_existing") + |> delete("/api/pleroma/emoji/pack?name=non_existing") |> json_response_and_validate_schema(:not_found) == %{ "error" => "Pack non_existing does not exist" } @@ -753,7 +478,7 @@ test "deleting nonexisting pack", %{admin_conn: admin_conn} do test "deleting with empty name", %{admin_conn: admin_conn} do assert admin_conn - |> delete("/api/pleroma/emoji/packs/ ") + |> delete("/api/pleroma/emoji/pack?name= ") |> json_response_and_validate_schema(:bad_request) == %{ "error" => "pack name cannot be empty" } @@ -801,7 +526,7 @@ test "filesystem import", %{admin_conn: admin_conn, conn: conn} do } end - describe "GET /api/pleroma/emoji/packs/:name" do + describe "GET /api/pleroma/emoji/pack?name=:name" do test "shows pack.json", %{conn: conn} do assert %{ "files" => files, @@ -816,7 +541,7 @@ test "shows pack.json", %{conn: conn} do } } = conn - |> get("/api/pleroma/emoji/packs/test_pack") + |> get("/api/pleroma/emoji/pack?name=test_pack") |> json_response_and_validate_schema(200) assert files == %{"blank" => "blank.png", "blank2" => "blank2.png"} @@ -826,7 +551,7 @@ test "shows pack.json", %{conn: conn} do "files_count" => 2 } = conn - |> get("/api/pleroma/emoji/packs/test_pack?page_size=1") + |> get("/api/pleroma/emoji/pack?name=test_pack&page_size=1") |> json_response_and_validate_schema(200) assert files |> Map.keys() |> length() == 1 @@ -836,15 +561,33 @@ test "shows pack.json", %{conn: conn} do "files_count" => 2 } = conn - |> get("/api/pleroma/emoji/packs/test_pack?page_size=1&page=2") + |> get("/api/pleroma/emoji/pack?name=test_pack&page_size=1&page=2") |> json_response_and_validate_schema(200) assert files |> 
Map.keys() |> length() == 1 end + test "for pack name with special chars", %{conn: conn} do + assert %{ + "files" => files, + "files_count" => 1, + "pack" => %{ + "can-download" => true, + "description" => "Test description", + "download-sha256" => _, + "homepage" => "https://pleroma.social", + "license" => "Test license", + "share-files" => true + } + } = + conn + |> get("/api/pleroma/emoji/pack?name=blobs.gg") + |> json_response_and_validate_schema(200) + end + test "non existing pack", %{conn: conn} do assert conn - |> get("/api/pleroma/emoji/packs/non_existing") + |> get("/api/pleroma/emoji/pack?name=non_existing") |> json_response_and_validate_schema(:not_found) == %{ "error" => "Pack non_existing does not exist" } @@ -852,7 +595,7 @@ test "non existing pack", %{conn: conn} do test "error name", %{conn: conn} do assert conn - |> get("/api/pleroma/emoji/packs/ ") + |> get("/api/pleroma/emoji/pack?name= ") |> json_response_and_validate_schema(:bad_request) == %{ "error" => "pack name cannot be empty" } diff --git a/test/web/pleroma_api/controllers/user_import_controller_test.exs b/test/web/pleroma_api/controllers/user_import_controller_test.exs new file mode 100644 index 000000000..433c97e81 --- /dev/null +++ b/test/web/pleroma_api/controllers/user_import_controller_test.exs @@ -0,0 +1,235 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Web.PleromaAPI.UserImportControllerTest do + use Pleroma.Web.ConnCase + use Oban.Testing, repo: Pleroma.Repo + + alias Pleroma.Config + alias Pleroma.Tests.ObanHelpers + + import Pleroma.Factory + import Mock + + setup do + Tesla.Mock.mock(fn env -> apply(HttpRequestMock, :request, [env]) end) + :ok + end + + describe "POST /api/pleroma/follow_import" do + setup do: oauth_access(["follow"]) + + test "it returns HTTP 200", %{conn: conn} do + user2 = insert(:user) + + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/follow_import", %{"list" => "#{user2.ap_id}"}) + |> json_response_and_validate_schema(200) + end + + test "it imports follow lists from file", %{conn: conn} do + user2 = insert(:user) + + with_mocks([ + {File, [], + read!: fn "follow_list.txt" -> + "Account address,Show boosts\n#{user2.ap_id},true" + end} + ]) do + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/follow_import", %{ + "list" => %Plug.Upload{path: "follow_list.txt"} + }) + |> json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == [user2] + end + end + + test "it imports new-style mastodon follow lists", %{conn: conn} do + user2 = insert(:user) + + response = + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/follow_import", %{ + "list" => "Account address,Show boosts\n#{user2.ap_id},true" + }) + |> json_response_and_validate_schema(200) + + assert response == "job started" + end + + test "requires 'follow' or 'write:follows' permissions" do + token1 = insert(:oauth_token, scopes: ["read", "write"]) + token2 = insert(:oauth_token, scopes: ["follow"]) + token3 = insert(:oauth_token, scopes: ["something"]) + another_user = insert(:user) + + for token <- [token1, token2, token3] do + conn = + build_conn() + |> put_req_header("authorization", "Bearer #{token.token}") + |> put_req_header("content-type", "application/json") + |> 
post("/api/pleroma/follow_import", %{"list" => "#{another_user.ap_id}"}) + + if token == token3 do + assert %{"error" => "Insufficient permissions: follow | write:follows."} == + json_response(conn, 403) + else + assert json_response(conn, 200) + end + end + end + + test "it imports follows with different nickname variations", %{conn: conn} do + users = [user2, user3, user4, user5, user6] = insert_list(5, :user) + + identifiers = + [ + user2.ap_id, + user3.nickname, + " ", + "@" <> user4.nickname, + user5.nickname <> "@localhost", + "@" <> user6.nickname <> "@localhost" + ] + |> Enum.join("\n") + + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/follow_import", %{"list" => identifiers}) + |> json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == users + end + end + + describe "POST /api/pleroma/blocks_import" do + # Note: "follow" or "write:blocks" permission is required + setup do: oauth_access(["write:blocks"]) + + test "it returns HTTP 200", %{conn: conn} do + user2 = insert(:user) + + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/blocks_import", %{"list" => "#{user2.ap_id}"}) + |> json_response_and_validate_schema(200) + end + + test "it imports blocks users from file", %{conn: conn} do + users = [user2, user3] = insert_list(2, :user) + + with_mocks([ + {File, [], read!: fn "blocks_list.txt" -> "#{user2.ap_id} #{user3.ap_id}" end} + ]) do + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/blocks_import", %{ + "list" => %Plug.Upload{path: "blocks_list.txt"} + }) + |> json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == users + end + end + + test "it imports blocks with different nickname variations", %{conn: conn} do + users = [user2, user3, user4, user5, user6] = insert_list(5, :user) + + identifiers = + [ + user2.ap_id, + user3.nickname, + "@" <> user4.nickname, + user5.nickname <> "@localhost", + "@" <> user6.nickname <> "@localhost" + ] + |> Enum.join(" ") + + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/blocks_import", %{"list" => identifiers}) + |> json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == users + end + end + + describe "POST /api/pleroma/mutes_import" do + # Note: "follow" or "write:mutes" permission is required + setup do: oauth_access(["write:mutes"]) + + test "it returns HTTP 200", %{user: user, conn: conn} do + user2 = insert(:user) + + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/mutes_import", %{"list" => "#{user2.ap_id}"}) + |> json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == [user2] + assert Pleroma.User.mutes?(user, user2) + end + + test "it imports mutes users from file", %{user: user, conn: conn} do + users = [user2, user3] = insert_list(2, :user) + + with_mocks([ + {File, [], read!: fn "mutes_list.txt" -> "#{user2.ap_id} #{user3.ap_id}" end} + ]) do + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/mutes_import", %{ + "list" => %Plug.Upload{path: "mutes_list.txt"} + }) + |> 
json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == users + assert Enum.all?(users, &Pleroma.User.mutes?(user, &1)) + end + end + + test "it imports mutes with different nickname variations", %{user: user, conn: conn} do + users = [user2, user3, user4, user5, user6] = insert_list(5, :user) + + identifiers = + [ + user2.ap_id, + user3.nickname, + "@" <> user4.nickname, + user5.nickname <> "@localhost", + "@" <> user6.nickname <> "@localhost" + ] + |> Enum.join(" ") + + assert "job started" == + conn + |> put_req_header("content-type", "application/json") + |> post("/api/pleroma/mutes_import", %{"list" => identifiers}) + |> json_response_and_validate_schema(200) + + assert [{:ok, job_result}] = ObanHelpers.perform_all() + assert job_result == users + assert Enum.all?(users, &Pleroma.User.mutes?(user, &1)) + end + end +end diff --git a/test/web/push/impl_test.exs b/test/web/push/impl_test.exs index aeb5c1fbd..6cab46696 100644 --- a/test/web/push/impl_test.exs +++ b/test/web/push/impl_test.exs @@ -5,6 +5,8 @@ defmodule Pleroma.Web.Push.ImplTest do use Pleroma.DataCase + import Pleroma.Factory + alias Pleroma.Notification alias Pleroma.Object alias Pleroma.User @@ -13,8 +15,6 @@ defmodule Pleroma.Web.Push.ImplTest do alias Pleroma.Web.Push.Impl alias Pleroma.Web.Push.Subscription - import Pleroma.Factory - setup do Tesla.Mock.mock(fn %{method: :post, url: "https://example.com/example/1234"} -> diff --git a/test/web/rich_media/helpers_test.exs b/test/web/rich_media/helpers_test.exs index 8264a9c41..4b97bd66b 100644 --- a/test/web/rich_media/helpers_test.exs +++ b/test/web/rich_media/helpers_test.exs @@ -64,41 +64,6 @@ test "crawls valid, complete URLs" do Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity) end - test "refuses to crawl URLs from posts marked sensitive" do - user = insert(:user) - - {:ok, activity} = - CommonAPI.post(user, %{ - status: "http://example.com/ogp", - sensitive: true - }) - - %Object{} = object = Object.normalize(activity) - - assert object.data["sensitive"] - - Config.put([:rich_media, :enabled], true) - - assert %{} = Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity) - end - - test "refuses to crawl URLs from posts tagged NSFW" do - user = insert(:user) - - {:ok, activity} = - CommonAPI.post(user, %{ - status: "http://example.com/ogp #nsfw" - }) - - %Object{} = object = Object.normalize(activity) - - assert object.data["sensitive"] - - Config.put([:rich_media, :enabled], true) - - assert %{} = Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity) - end - test "refuses to crawl URLs of private network from posts" do user = insert(:user) diff --git a/test/web/rich_media/parser_test.exs b/test/web/rich_media/parser_test.exs index 21ae35f8b..6d00c2af5 100644 --- a/test/web/rich_media/parser_test.exs +++ b/test/web/rich_media/parser_test.exs @@ -56,6 +56,27 @@ defmodule Pleroma.Web.RichMedia.ParserTest do %{method: :get, url: "http://example.com/error"} -> {:error, :overload} + + %{ + method: :head, + url: "http://example.com/huge-page" + } -> + %Tesla.Env{ + status: 200, + headers: [{"content-length", "2000001"}, {"content-type", "text/html"}] + } + + %{ + method: :head, + url: "http://example.com/pdf-file" + } -> + %Tesla.Env{ + status: 200, + headers: [{"content-length", "1000000"}, {"content-type", "application/pdf"}] + } + + %{method: :head} -> + %Tesla.Env{status: 404, body: "", headers: []} end) :ok @@ -144,4 +165,12 @@ test "rejects invalid OGP data" do test 
"returns error if getting page was not successful" do assert {:error, :overload} = Parser.parse("http://example.com/error") end + + test "does a HEAD request to check if the body is too large" do + assert {:error, :body_too_large} = Parser.parse("http://example.com/huge-page") + end + + test "does a HEAD request to check if the body is html" do + assert {:error, {:content_type, _}} = Parser.parse("http://example.com/pdf-file") + end end diff --git a/test/web/streamer/streamer_test.exs b/test/web/streamer/streamer_test.exs index d56d74464..185724a9f 100644 --- a/test/web/streamer/streamer_test.exs +++ b/test/web/streamer/streamer_test.exs @@ -21,92 +21,148 @@ defmodule Pleroma.Web.StreamerTest do setup do: clear_config([:instance, :skip_thread_containment]) - describe "get_topic without an user" do + describe "get_topic/_ (unauthenticated)" do test "allows public" do - assert {:ok, "public"} = Streamer.get_topic("public", nil) - assert {:ok, "public:local"} = Streamer.get_topic("public:local", nil) - assert {:ok, "public:media"} = Streamer.get_topic("public:media", nil) - assert {:ok, "public:local:media"} = Streamer.get_topic("public:local:media", nil) + assert {:ok, "public"} = Streamer.get_topic("public", nil, nil) + assert {:ok, "public:local"} = Streamer.get_topic("public:local", nil, nil) + assert {:ok, "public:media"} = Streamer.get_topic("public:media", nil, nil) + assert {:ok, "public:local:media"} = Streamer.get_topic("public:local:media", nil, nil) end test "allows hashtag streams" do - assert {:ok, "hashtag:cofe"} = Streamer.get_topic("hashtag", nil, %{"tag" => "cofe"}) + assert {:ok, "hashtag:cofe"} = Streamer.get_topic("hashtag", nil, nil, %{"tag" => "cofe"}) end test "disallows user streams" do - assert {:error, _} = Streamer.get_topic("user", nil) - assert {:error, _} = Streamer.get_topic("user:notification", nil) - assert {:error, _} = Streamer.get_topic("direct", nil) + assert {:error, _} = Streamer.get_topic("user", nil, nil) + assert {:error, _} = Streamer.get_topic("user:notification", nil, nil) + assert {:error, _} = Streamer.get_topic("direct", nil, nil) end test "disallows list streams" do - assert {:error, _} = Streamer.get_topic("list", nil, %{"list" => 42}) + assert {:error, _} = Streamer.get_topic("list", nil, nil, %{"list" => 42}) end end - describe "get_topic with an user" do - setup do - user = insert(:user) - {:ok, %{user: user}} + describe "get_topic/_ (authenticated)" do + setup do: oauth_access(["read"]) + + test "allows public streams (regardless of OAuth token scopes)", %{ + user: user, + token: read_oauth_token + } do + with oauth_token <- [nil, read_oauth_token] do + assert {:ok, "public"} = Streamer.get_topic("public", user, oauth_token) + assert {:ok, "public:local"} = Streamer.get_topic("public:local", user, oauth_token) + assert {:ok, "public:media"} = Streamer.get_topic("public:media", user, oauth_token) + + assert {:ok, "public:local:media"} = + Streamer.get_topic("public:local:media", user, oauth_token) + end end - test "allows public streams", %{user: user} do - assert {:ok, "public"} = Streamer.get_topic("public", user) - assert {:ok, "public:local"} = Streamer.get_topic("public:local", user) - assert {:ok, "public:media"} = Streamer.get_topic("public:media", user) - assert {:ok, "public:local:media"} = Streamer.get_topic("public:local:media", user) - end + test "allows user streams (with proper OAuth token scopes)", %{ + user: user, + token: read_oauth_token + } do + %{token: read_notifications_token} = oauth_access(["read:notifications"], 
user: user) + %{token: read_statuses_token} = oauth_access(["read:statuses"], user: user) + %{token: badly_scoped_token} = oauth_access(["irrelevant:scope"], user: user) - test "allows user streams", %{user: user} do expected_user_topic = "user:#{user.id}" - expected_notif_topic = "user:notification:#{user.id}" + expected_notification_topic = "user:notification:#{user.id}" expected_direct_topic = "direct:#{user.id}" - assert {:ok, ^expected_user_topic} = Streamer.get_topic("user", user) - assert {:ok, ^expected_notif_topic} = Streamer.get_topic("user:notification", user) - assert {:ok, ^expected_direct_topic} = Streamer.get_topic("direct", user) + expected_pleroma_chat_topic = "user:pleroma_chat:#{user.id}" + + for valid_user_token <- [read_oauth_token, read_statuses_token] do + assert {:ok, ^expected_user_topic} = Streamer.get_topic("user", user, valid_user_token) + + assert {:ok, ^expected_direct_topic} = + Streamer.get_topic("direct", user, valid_user_token) + + assert {:ok, ^expected_pleroma_chat_topic} = + Streamer.get_topic("user:pleroma_chat", user, valid_user_token) + end + + for invalid_user_token <- [read_notifications_token, badly_scoped_token], + user_topic <- ["user", "direct", "user:pleroma_chat"] do + assert {:error, :unauthorized} = Streamer.get_topic(user_topic, user, invalid_user_token) + end + + for valid_notification_token <- [read_oauth_token, read_notifications_token] do + assert {:ok, ^expected_notification_topic} = + Streamer.get_topic("user:notification", user, valid_notification_token) + end + + for invalid_notification_token <- [read_statuses_token, badly_scoped_token] do + assert {:error, :unauthorized} = + Streamer.get_topic("user:notification", user, invalid_notification_token) + end end - test "allows hashtag streams", %{user: user} do - assert {:ok, "hashtag:cofe"} = Streamer.get_topic("hashtag", user, %{"tag" => "cofe"}) + test "allows hashtag streams (regardless of OAuth token scopes)", %{ + user: user, + token: read_oauth_token + } do + for oauth_token <- [nil, read_oauth_token] do + assert {:ok, "hashtag:cofe"} = + Streamer.get_topic("hashtag", user, oauth_token, %{"tag" => "cofe"}) + end end - test "disallows registering to an user stream", %{user: user} do + test "disallows registering to another user's stream", %{user: user, token: read_oauth_token} do another_user = insert(:user) - assert {:error, _} = Streamer.get_topic("user:#{another_user.id}", user) - assert {:error, _} = Streamer.get_topic("user:notification:#{another_user.id}", user) - assert {:error, _} = Streamer.get_topic("direct:#{another_user.id}", user) + assert {:error, _} = Streamer.get_topic("user:#{another_user.id}", user, read_oauth_token) + + assert {:error, _} = + Streamer.get_topic("user:notification:#{another_user.id}", user, read_oauth_token) + + assert {:error, _} = Streamer.get_topic("direct:#{another_user.id}", user, read_oauth_token) end - test "allows list stream that are owned by the user", %{user: user} do + test "allows list stream that are owned by the user (with `read` or `read:lists` scopes)", %{ + user: user, + token: read_oauth_token + } do + %{token: read_lists_token} = oauth_access(["read:lists"], user: user) + %{token: invalid_token} = oauth_access(["irrelevant:scope"], user: user) {:ok, list} = List.create("Test", user) - assert {:error, _} = Streamer.get_topic("list:#{list.id}", user) - assert {:ok, _} = Streamer.get_topic("list", user, %{"list" => list.id}) + + assert {:error, _} = Streamer.get_topic("list:#{list.id}", user, read_oauth_token) + + for 
valid_token <- [read_oauth_token, read_lists_token] do + assert {:ok, _} = Streamer.get_topic("list", user, valid_token, %{"list" => list.id}) + end + + assert {:error, _} = Streamer.get_topic("list", user, invalid_token, %{"list" => list.id}) end - test "disallows list stream that are not owned by the user", %{user: user} do + test "disallows list stream that are not owned by the user", %{user: user, token: oauth_token} do another_user = insert(:user) {:ok, list} = List.create("Test", another_user) - assert {:error, _} = Streamer.get_topic("list:#{list.id}", user) - assert {:error, _} = Streamer.get_topic("list", user, %{"list" => list.id}) + + assert {:error, _} = Streamer.get_topic("list:#{list.id}", user, oauth_token) + assert {:error, _} = Streamer.get_topic("list", user, oauth_token, %{"list" => list.id}) end end describe "user streams" do setup do - user = insert(:user) + %{user: user, token: token} = oauth_access(["read"]) notify = insert(:notification, user: user, activity: build(:note_activity)) - {:ok, %{user: user, notify: notify}} + {:ok, %{user: user, notify: notify, token: token}} end - test "it streams the user's post in the 'user' stream", %{user: user} do - Streamer.get_topic_and_add_socket("user", user) + test "it streams the user's post in the 'user' stream", %{user: user, token: oauth_token} do + Streamer.get_topic_and_add_socket("user", user, oauth_token) {:ok, activity} = CommonAPI.post(user, %{status: "hey"}) + assert_receive {:render_with_user, _, _, ^activity} refute Streamer.filtered_by_user?(user, activity) end - test "it streams boosts of the user in the 'user' stream", %{user: user} do - Streamer.get_topic_and_add_socket("user", user) + test "it streams boosts of the user in the 'user' stream", %{user: user, token: oauth_token} do + Streamer.get_topic_and_add_socket("user", user, oauth_token) other_user = insert(:user) {:ok, activity} = CommonAPI.post(other_user, %{status: "hey"}) @@ -117,9 +173,10 @@ test "it streams boosts of the user in the 'user' stream", %{user: user} do end test "it does not stream announces of the user's own posts in the 'user' stream", %{ - user: user + user: user, + token: oauth_token } do - Streamer.get_topic_and_add_socket("user", user) + Streamer.get_topic_and_add_socket("user", user, oauth_token) other_user = insert(:user) {:ok, activity} = CommonAPI.post(user, %{status: "hey"}) @@ -129,9 +186,10 @@ test "it does not stream announces of the user's own posts in the 'user' stream" end test "it does stream notifications announces of the user's own posts in the 'user' stream", %{ - user: user + user: user, + token: oauth_token } do - Streamer.get_topic_and_add_socket("user", user) + Streamer.get_topic_and_add_socket("user", user, oauth_token) other_user = insert(:user) {:ok, activity} = CommonAPI.post(user, %{status: "hey"}) @@ -145,8 +203,11 @@ test "it does stream notifications announces of the user's own posts in the 'use refute Streamer.filtered_by_user?(user, notification) end - test "it streams boosts of mastodon user in the 'user' stream", %{user: user} do - Streamer.get_topic_and_add_socket("user", user) + test "it streams boosts of mastodon user in the 'user' stream", %{ + user: user, + token: oauth_token + } do + Streamer.get_topic_and_add_socket("user", user, oauth_token) other_user = insert(:user) {:ok, activity} = CommonAPI.post(other_user, %{status: "hey"}) @@ -164,21 +225,34 @@ test "it streams boosts of mastodon user in the 'user' stream", %{user: user} do refute Streamer.filtered_by_user?(user, announce) end - test 
"it sends notify to in the 'user' stream", %{user: user, notify: notify} do - Streamer.get_topic_and_add_socket("user", user) + test "it sends notify to in the 'user' stream", %{ + user: user, + token: oauth_token, + notify: notify + } do + Streamer.get_topic_and_add_socket("user", user, oauth_token) Streamer.stream("user", notify) + assert_receive {:render_with_user, _, _, ^notify} refute Streamer.filtered_by_user?(user, notify) end - test "it sends notify to in the 'user:notification' stream", %{user: user, notify: notify} do - Streamer.get_topic_and_add_socket("user:notification", user) + test "it sends notify to in the 'user:notification' stream", %{ + user: user, + token: oauth_token, + notify: notify + } do + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) Streamer.stream("user:notification", notify) + assert_receive {:render_with_user, _, _, ^notify} refute Streamer.filtered_by_user?(user, notify) end - test "it sends chat messages to the 'user:pleroma_chat' stream", %{user: user} do + test "it sends chat messages to the 'user:pleroma_chat' stream", %{ + user: user, + token: oauth_token + } do other_user = insert(:user) {:ok, create_activity} = CommonAPI.post_chat_message(other_user, user, "hey cirno") @@ -187,7 +261,7 @@ test "it sends chat messages to the 'user:pleroma_chat' stream", %{user: user} d cm_ref = MessageReference.for_chat_and_object(chat, object) cm_ref = %{cm_ref | chat: chat, object: object} - Streamer.get_topic_and_add_socket("user:pleroma_chat", user) + Streamer.get_topic_and_add_socket("user:pleroma_chat", user, oauth_token) Streamer.stream("user:pleroma_chat", {user, cm_ref}) text = StreamerView.render("chat_update.json", %{chat_message_reference: cm_ref}) @@ -196,7 +270,7 @@ test "it sends chat messages to the 'user:pleroma_chat' stream", %{user: user} d assert_receive {:text, ^text} end - test "it sends chat messages to the 'user' stream", %{user: user} do + test "it sends chat messages to the 'user' stream", %{user: user, token: oauth_token} do other_user = insert(:user) {:ok, create_activity} = CommonAPI.post_chat_message(other_user, user, "hey cirno") @@ -205,7 +279,7 @@ test "it sends chat messages to the 'user' stream", %{user: user} do cm_ref = MessageReference.for_chat_and_object(chat, object) cm_ref = %{cm_ref | chat: chat, object: object} - Streamer.get_topic_and_add_socket("user", user) + Streamer.get_topic_and_add_socket("user", user, oauth_token) Streamer.stream("user", {user, cm_ref}) text = StreamerView.render("chat_update.json", %{chat_message_reference: cm_ref}) @@ -214,7 +288,10 @@ test "it sends chat messages to the 'user' stream", %{user: user} do assert_receive {:text, ^text} end - test "it sends chat message notifications to the 'user:notification' stream", %{user: user} do + test "it sends chat message notifications to the 'user:notification' stream", %{ + user: user, + token: oauth_token + } do other_user = insert(:user) {:ok, create_activity} = CommonAPI.post_chat_message(other_user, user, "hey") @@ -223,19 +300,21 @@ test "it sends chat message notifications to the 'user:notification' stream", %{ Repo.get_by(Pleroma.Notification, user_id: user.id, activity_id: create_activity.id) |> Repo.preload(:activity) - Streamer.get_topic_and_add_socket("user:notification", user) + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) Streamer.stream("user:notification", notify) + assert_receive {:render_with_user, _, _, ^notify} refute Streamer.filtered_by_user?(user, notify) end test "it doesn't send 
notify to the 'user:notification' stream when a user is blocked", %{ - user: user + user: user, + token: oauth_token } do blocked = insert(:user) {:ok, _user_relationship} = User.block(user, blocked) - Streamer.get_topic_and_add_socket("user:notification", user) + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) {:ok, activity} = CommonAPI.post(user, %{status: ":("}) {:ok, _} = CommonAPI.favorite(blocked, activity.id) @@ -244,14 +323,15 @@ test "it doesn't send notify to the 'user:notification' stream when a user is bl end test "it doesn't send notify to the 'user:notification' stream when a thread is muted", %{ - user: user + user: user, + token: oauth_token } do user2 = insert(:user) {:ok, activity} = CommonAPI.post(user, %{status: "super hot take"}) {:ok, _} = CommonAPI.add_mute(user, activity) - Streamer.get_topic_and_add_socket("user:notification", user) + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) {:ok, favorite_activity} = CommonAPI.favorite(user2, activity.id) @@ -260,12 +340,13 @@ test "it doesn't send notify to the 'user:notification' stream when a thread is end test "it sends favorite to 'user:notification' stream'", %{ - user: user + user: user, + token: oauth_token } do user2 = insert(:user, %{ap_id: "https://hecking-lewd-place.com/user/meanie"}) {:ok, activity} = CommonAPI.post(user, %{status: "super hot take"}) - Streamer.get_topic_and_add_socket("user:notification", user) + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) {:ok, favorite_activity} = CommonAPI.favorite(user2, activity.id) assert_receive {:render_with_user, _, "notification.json", notif} @@ -274,13 +355,14 @@ test "it sends favorite to 'user:notification' stream'", %{ end test "it doesn't send the 'user:notification' stream' when a domain is blocked", %{ - user: user + user: user, + token: oauth_token } do user2 = insert(:user, %{ap_id: "https://hecking-lewd-place.com/user/meanie"}) {:ok, user} = User.block_domain(user, "hecking-lewd-place.com") {:ok, activity} = CommonAPI.post(user, %{status: "super hot take"}) - Streamer.get_topic_and_add_socket("user:notification", user) + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) {:ok, favorite_activity} = CommonAPI.favorite(user2, activity.id) refute_receive _ @@ -288,7 +370,8 @@ test "it doesn't send the 'user:notification' stream' when a domain is blocked", end test "it sends follow activities to the 'user:notification' stream", %{ - user: user + user: user, + token: oauth_token } do user_url = user.ap_id user2 = insert(:user) @@ -303,7 +386,7 @@ test "it sends follow activities to the 'user:notification' stream", %{ %Tesla.Env{status: 200, body: body} end) - Streamer.get_topic_and_add_socket("user:notification", user) + Streamer.get_topic_and_add_socket("user:notification", user, oauth_token) {:ok, _follower, _followed, follow_activity} = CommonAPI.follow(user2, user) assert_receive {:render_with_user, _, "notification.json", notif} @@ -312,51 +395,53 @@ test "it sends follow activities to the 'user:notification' stream", %{ end end - test "it sends to public authenticated" do - user = insert(:user) - other_user = insert(:user) + describe "public streams" do + test "it sends to public (authenticated)" do + %{user: user, token: oauth_token} = oauth_access(["read"]) + other_user = insert(:user) - Streamer.get_topic_and_add_socket("public", other_user) + Streamer.get_topic_and_add_socket("public", user, oauth_token) - {:ok, activity} = CommonAPI.post(user, 
%{status: "Test"}) - assert_receive {:render_with_user, _, _, ^activity} - refute Streamer.filtered_by_user?(user, activity) + {:ok, activity} = CommonAPI.post(other_user, %{status: "Test"}) + assert_receive {:render_with_user, _, _, ^activity} + refute Streamer.filtered_by_user?(other_user, activity) + end + + test "it sends to public (unauthenticated)" do + user = insert(:user) + + Streamer.get_topic_and_add_socket("public", nil, nil) + + {:ok, activity} = CommonAPI.post(user, %{status: "Test"}) + activity_id = activity.id + assert_receive {:text, event} + assert %{"event" => "update", "payload" => payload} = Jason.decode!(event) + assert %{"id" => ^activity_id} = Jason.decode!(payload) + + {:ok, _} = CommonAPI.delete(activity.id, user) + assert_receive {:text, event} + assert %{"event" => "delete", "payload" => ^activity_id} = Jason.decode!(event) + end + + test "handles deletions" do + %{user: user, token: oauth_token} = oauth_access(["read"]) + other_user = insert(:user) + {:ok, activity} = CommonAPI.post(other_user, %{status: "Test"}) + + Streamer.get_topic_and_add_socket("public", user, oauth_token) + + {:ok, _} = CommonAPI.delete(activity.id, other_user) + activity_id = activity.id + assert_receive {:text, event} + assert %{"event" => "delete", "payload" => ^activity_id} = Jason.decode!(event) + end end - test "works for deletions" do - user = insert(:user) - other_user = insert(:user) - {:ok, activity} = CommonAPI.post(other_user, %{status: "Test"}) - - Streamer.get_topic_and_add_socket("public", user) - - {:ok, _} = CommonAPI.delete(activity.id, other_user) - activity_id = activity.id - assert_receive {:text, event} - assert %{"event" => "delete", "payload" => ^activity_id} = Jason.decode!(event) - end - - test "it sends to public unauthenticated" do - user = insert(:user) - - Streamer.get_topic_and_add_socket("public", nil) - - {:ok, activity} = CommonAPI.post(user, %{status: "Test"}) - activity_id = activity.id - assert_receive {:text, event} - assert %{"event" => "update", "payload" => payload} = Jason.decode!(event) - assert %{"id" => ^activity_id} = Jason.decode!(payload) - - {:ok, _} = CommonAPI.delete(activity.id, user) - assert_receive {:text, event} - assert %{"event" => "delete", "payload" => ^activity_id} = Jason.decode!(event) - end - - describe "thread_containment" do + describe "thread_containment/2" do test "it filters to user if recipients invalid and thread containment is enabled" do Pleroma.Config.put([:instance, :skip_thread_containment], false) author = insert(:user) - user = insert(:user) + %{user: user, token: oauth_token} = oauth_access(["read"]) User.follow(user, author, :follow_accept) activity = @@ -368,7 +453,7 @@ test "it filters to user if recipients invalid and thread containment is enabled ) ) - Streamer.get_topic_and_add_socket("public", user) + Streamer.get_topic_and_add_socket("public", user, oauth_token) Streamer.stream("public", activity) assert_receive {:render_with_user, _, _, ^activity} assert Streamer.filtered_by_user?(user, activity) @@ -377,7 +462,7 @@ test "it filters to user if recipients invalid and thread containment is enabled test "it sends message if recipients invalid and thread containment is disabled" do Pleroma.Config.put([:instance, :skip_thread_containment], true) author = insert(:user) - user = insert(:user) + %{user: user, token: oauth_token} = oauth_access(["read"]) User.follow(user, author, :follow_accept) activity = @@ -389,7 +474,7 @@ test "it sends message if recipients invalid and thread containment is disabled" ) ) - 
Streamer.get_topic_and_add_socket("public", user) + Streamer.get_topic_and_add_socket("public", user, oauth_token) Streamer.stream("public", activity) assert_receive {:render_with_user, _, _, ^activity} @@ -400,6 +485,7 @@ test "it sends message if recipients invalid and thread containment is enabled b Pleroma.Config.put([:instance, :skip_thread_containment], false) author = insert(:user) user = insert(:user, skip_thread_containment: true) + %{token: oauth_token} = oauth_access(["read"], user: user) User.follow(user, author, :follow_accept) activity = @@ -411,7 +497,7 @@ test "it sends message if recipients invalid and thread containment is enabled b ) ) - Streamer.get_topic_and_add_socket("public", user) + Streamer.get_topic_and_add_socket("public", user, oauth_token) Streamer.stream("public", activity) assert_receive {:render_with_user, _, _, ^activity} @@ -420,23 +506,26 @@ test "it sends message if recipients invalid and thread containment is enabled b end describe "blocks" do - test "it filters messages involving blocked users" do - user = insert(:user) + setup do: oauth_access(["read"]) + + test "it filters messages involving blocked users", %{user: user, token: oauth_token} do blocked_user = insert(:user) {:ok, _user_relationship} = User.block(user, blocked_user) - Streamer.get_topic_and_add_socket("public", user) + Streamer.get_topic_and_add_socket("public", user, oauth_token) {:ok, activity} = CommonAPI.post(blocked_user, %{status: "Test"}) assert_receive {:render_with_user, _, _, ^activity} assert Streamer.filtered_by_user?(user, activity) end - test "it filters messages transitively involving blocked users" do - blocker = insert(:user) + test "it filters messages transitively involving blocked users", %{ + user: blocker, + token: blocker_token + } do blockee = insert(:user) friend = insert(:user) - Streamer.get_topic_and_add_socket("public", blocker) + Streamer.get_topic_and_add_socket("public", blocker, blocker_token) {:ok, _user_relationship} = User.block(blocker, blockee) @@ -458,8 +547,9 @@ test "it filters messages transitively involving blocked users" do end describe "lists" do - test "it doesn't send unwanted DMs to list" do - user_a = insert(:user) + setup do: oauth_access(["read"]) + + test "it doesn't send unwanted DMs to list", %{user: user_a, token: user_a_token} do user_b = insert(:user) user_c = insert(:user) @@ -468,7 +558,7 @@ test "it doesn't send unwanted DMs to list" do {:ok, list} = List.create("Test", user_a) {:ok, list} = List.follow(list, user_b) - Streamer.get_topic_and_add_socket("list", user_a, %{"list" => list.id}) + Streamer.get_topic_and_add_socket("list", user_a, user_a_token, %{"list" => list.id}) {:ok, _activity} = CommonAPI.post(user_b, %{ @@ -479,14 +569,13 @@ test "it doesn't send unwanted DMs to list" do refute_receive _ end - test "it doesn't send unwanted private posts to list" do - user_a = insert(:user) + test "it doesn't send unwanted private posts to list", %{user: user_a, token: user_a_token} do user_b = insert(:user) {:ok, list} = List.create("Test", user_a) {:ok, list} = List.follow(list, user_b) - Streamer.get_topic_and_add_socket("list", user_a, %{"list" => list.id}) + Streamer.get_topic_and_add_socket("list", user_a, user_a_token, %{"list" => list.id}) {:ok, _activity} = CommonAPI.post(user_b, %{ @@ -497,8 +586,7 @@ test "it doesn't send unwanted private posts to list" do refute_receive _ end - test "it sends wanted private posts to list" do - user_a = insert(:user) + test "it sends wanted private posts to list", %{user: user_a, 
token: user_a_token} do user_b = insert(:user) {:ok, user_a} = User.follow(user_a, user_b) @@ -506,7 +594,7 @@ test "it sends wanted private posts to list" do {:ok, list} = List.create("Test", user_a) {:ok, list} = List.follow(list, user_b) - Streamer.get_topic_and_add_socket("list", user_a, %{"list" => list.id}) + Streamer.get_topic_and_add_socket("list", user_a, user_a_token, %{"list" => list.id}) {:ok, activity} = CommonAPI.post(user_b, %{ @@ -520,8 +608,9 @@ test "it sends wanted private posts to list" do end describe "muted reblogs" do - test "it filters muted reblogs" do - user1 = insert(:user) + setup do: oauth_access(["read"]) + + test "it filters muted reblogs", %{user: user1, token: user1_token} do user2 = insert(:user) user3 = insert(:user) CommonAPI.follow(user1, user2) @@ -529,34 +618,38 @@ test "it filters muted reblogs" do {:ok, create_activity} = CommonAPI.post(user3, %{status: "I'm kawen"}) - Streamer.get_topic_and_add_socket("user", user1) + Streamer.get_topic_and_add_socket("user", user1, user1_token) {:ok, announce_activity} = CommonAPI.repeat(create_activity.id, user2) assert_receive {:render_with_user, _, _, ^announce_activity} assert Streamer.filtered_by_user?(user1, announce_activity) end - test "it filters reblog notification for reblog-muted actors" do - user1 = insert(:user) + test "it filters reblog notification for reblog-muted actors", %{ + user: user1, + token: user1_token + } do user2 = insert(:user) CommonAPI.follow(user1, user2) CommonAPI.hide_reblogs(user1, user2) {:ok, create_activity} = CommonAPI.post(user1, %{status: "I'm kawen"}) - Streamer.get_topic_and_add_socket("user", user1) + Streamer.get_topic_and_add_socket("user", user1, user1_token) {:ok, _announce_activity} = CommonAPI.repeat(create_activity.id, user2) assert_receive {:render_with_user, _, "notification.json", notif} assert Streamer.filtered_by_user?(user1, notif) end - test "it send non-reblog notification for reblog-muted actors" do - user1 = insert(:user) + test "it send non-reblog notification for reblog-muted actors", %{ + user: user1, + token: user1_token + } do user2 = insert(:user) CommonAPI.follow(user1, user2) CommonAPI.hide_reblogs(user1, user2) {:ok, create_activity} = CommonAPI.post(user1, %{status: "I'm kawen"}) - Streamer.get_topic_and_add_socket("user", user1) + Streamer.get_topic_and_add_socket("user", user1, user1_token) {:ok, _favorite_activity} = CommonAPI.favorite(user2, create_activity.id) assert_receive {:render_with_user, _, "notification.json", notif} @@ -564,27 +657,28 @@ test "it send non-reblog notification for reblog-muted actors" do end end - test "it filters posts from muted threads" do - user = insert(:user) - user2 = insert(:user) - Streamer.get_topic_and_add_socket("user", user2) - {:ok, user2, user, _activity} = CommonAPI.follow(user2, user) - {:ok, activity} = CommonAPI.post(user, %{status: "super hot take"}) - {:ok, _} = CommonAPI.add_mute(user2, activity) - assert_receive {:render_with_user, _, _, ^activity} - assert Streamer.filtered_by_user?(user2, activity) + describe "muted threads" do + test "it filters posts from muted threads" do + user = insert(:user) + %{user: user2, token: user2_token} = oauth_access(["read"]) + Streamer.get_topic_and_add_socket("user", user2, user2_token) + + {:ok, user2, user, _activity} = CommonAPI.follow(user2, user) + {:ok, activity} = CommonAPI.post(user, %{status: "super hot take"}) + {:ok, _} = CommonAPI.add_mute(user2, activity) + + assert_receive {:render_with_user, _, _, ^activity} + assert 
Streamer.filtered_by_user?(user2, activity) + end end describe "direct streams" do - setup do - :ok - end + setup do: oauth_access(["read"]) - test "it sends conversation update to the 'direct' stream", %{} do - user = insert(:user) + test "it sends conversation update to the 'direct' stream", %{user: user, token: oauth_token} do another_user = insert(:user) - Streamer.get_topic_and_add_socket("direct", user) + Streamer.get_topic_and_add_socket("direct", user, oauth_token) {:ok, _create_activity} = CommonAPI.post(another_user, %{ @@ -602,11 +696,11 @@ test "it sends conversation update to the 'direct' stream", %{} do assert last_status["pleroma"]["direct_conversation_id"] == participation.id end - test "it doesn't send conversation update to the 'direct' stream when the last message in the conversation is deleted" do - user = insert(:user) + test "it doesn't send conversation update to the 'direct' stream when the last message in the conversation is deleted", + %{user: user, token: oauth_token} do another_user = insert(:user) - Streamer.get_topic_and_add_socket("direct", user) + Streamer.get_topic_and_add_socket("direct", user, oauth_token) {:ok, create_activity} = CommonAPI.post(another_user, %{ @@ -629,10 +723,12 @@ test "it doesn't send conversation update to the 'direct' stream when the last m refute_receive _ end - test "it sends conversation update to the 'direct' stream when a message is deleted" do - user = insert(:user) + test "it sends conversation update to the 'direct' stream when a message is deleted", %{ + user: user, + token: oauth_token + } do another_user = insert(:user) - Streamer.get_topic_and_add_socket("direct", user) + Streamer.get_topic_and_add_socket("direct", user, oauth_token) {:ok, create_activity} = CommonAPI.post(another_user, %{ diff --git a/test/web/twitter_api/util_controller_test.exs b/test/web/twitter_api/util_controller_test.exs index d164127ee..60f2fb052 100644 --- a/test/web/twitter_api/util_controller_test.exs +++ b/test/web/twitter_api/util_controller_test.exs @@ -21,170 +21,6 @@ defmodule Pleroma.Web.TwitterAPI.UtilControllerTest do setup do: clear_config([:instance]) setup do: clear_config([:frontend_configurations, :pleroma_fe]) - describe "POST /api/pleroma/follow_import" do - setup do: oauth_access(["follow"]) - - test "it returns HTTP 200", %{conn: conn} do - user2 = insert(:user) - - response = - conn - |> post("/api/pleroma/follow_import", %{"list" => "#{user2.ap_id}"}) - |> json_response(:ok) - - assert response == "job started" - end - - test "it imports follow lists from file", %{user: user1, conn: conn} do - user2 = insert(:user) - - with_mocks([ - {File, [], - read!: fn "follow_list.txt" -> - "Account address,Show boosts\n#{user2.ap_id},true" - end} - ]) do - response = - conn - |> post("/api/pleroma/follow_import", %{"list" => %Plug.Upload{path: "follow_list.txt"}}) - |> json_response(:ok) - - assert response == "job started" - - assert ObanHelpers.member?( - %{ - "op" => "follow_import", - "follower_id" => user1.id, - "followed_identifiers" => [user2.ap_id] - }, - all_enqueued(worker: Pleroma.Workers.BackgroundWorker) - ) - end - end - - test "it imports new-style mastodon follow lists", %{conn: conn} do - user2 = insert(:user) - - response = - conn - |> post("/api/pleroma/follow_import", %{ - "list" => "Account address,Show boosts\n#{user2.ap_id},true" - }) - |> json_response(:ok) - - assert response == "job started" - end - - test "requires 'follow' or 'write:follows' permissions" do - token1 = insert(:oauth_token, scopes: ["read", 
"write"]) - token2 = insert(:oauth_token, scopes: ["follow"]) - token3 = insert(:oauth_token, scopes: ["something"]) - another_user = insert(:user) - - for token <- [token1, token2, token3] do - conn = - build_conn() - |> put_req_header("authorization", "Bearer #{token.token}") - |> post("/api/pleroma/follow_import", %{"list" => "#{another_user.ap_id}"}) - - if token == token3 do - assert %{"error" => "Insufficient permissions: follow | write:follows."} == - json_response(conn, 403) - else - assert json_response(conn, 200) - end - end - end - - test "it imports follows with different nickname variations", %{conn: conn} do - [user2, user3, user4, user5, user6] = insert_list(5, :user) - - identifiers = - [ - user2.ap_id, - user3.nickname, - " ", - "@" <> user4.nickname, - user5.nickname <> "@localhost", - "@" <> user6.nickname <> "@localhost" - ] - |> Enum.join("\n") - - response = - conn - |> post("/api/pleroma/follow_import", %{"list" => identifiers}) - |> json_response(:ok) - - assert response == "job started" - assert [{:ok, job_result}] = ObanHelpers.perform_all() - assert job_result == [user2, user3, user4, user5, user6] - end - end - - describe "POST /api/pleroma/blocks_import" do - # Note: "follow" or "write:blocks" permission is required - setup do: oauth_access(["write:blocks"]) - - test "it returns HTTP 200", %{conn: conn} do - user2 = insert(:user) - - response = - conn - |> post("/api/pleroma/blocks_import", %{"list" => "#{user2.ap_id}"}) - |> json_response(:ok) - - assert response == "job started" - end - - test "it imports blocks users from file", %{user: user1, conn: conn} do - user2 = insert(:user) - user3 = insert(:user) - - with_mocks([ - {File, [], read!: fn "blocks_list.txt" -> "#{user2.ap_id} #{user3.ap_id}" end} - ]) do - response = - conn - |> post("/api/pleroma/blocks_import", %{"list" => %Plug.Upload{path: "blocks_list.txt"}}) - |> json_response(:ok) - - assert response == "job started" - - assert ObanHelpers.member?( - %{ - "op" => "blocks_import", - "blocker_id" => user1.id, - "blocked_identifiers" => [user2.ap_id, user3.ap_id] - }, - all_enqueued(worker: Pleroma.Workers.BackgroundWorker) - ) - end - end - - test "it imports blocks with different nickname variations", %{conn: conn} do - [user2, user3, user4, user5, user6] = insert_list(5, :user) - - identifiers = - [ - user2.ap_id, - user3.nickname, - "@" <> user4.nickname, - user5.nickname <> "@localhost", - "@" <> user6.nickname <> "@localhost" - ] - |> Enum.join(" ") - - response = - conn - |> post("/api/pleroma/blocks_import", %{"list" => identifiers}) - |> json_response(:ok) - - assert response == "job started" - assert [{:ok, job_result}] = ObanHelpers.perform_all() - assert job_result == [user2, user3, user4, user5, user6] - end - end - describe "PUT /api/pleroma/notification_settings" do setup do: oauth_access(["write:accounts"])