Merge branch 'release/2.0.1' into 'stable'

2.0.1 release

See merge request pleroma/pleroma!2298

Commit bb49d8f5a0 (114 changed files with 846 additions and 417 deletions)
CHANGELOG.md (+29)

@@ -3,6 +3,34 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 
+## [2.0.1] - 2020-03-15
+
+### Security
+- Static-FE: Fix remote posts not being sanitized
+
+### Fixed
+- Rate limiter crashes when there is no explicitly specified ip in the config
+- 500 errors when no `Accept` header is present if Static-FE is enabled
+- Instance panel not being updated immediately due to wrong `Cache-Control` headers
+- Statuses posted with BBCode/Markdown having unnecessary newlines in Pleroma-FE
+- OTP: Fix some settings not being migrated to in-database config properly
+- No `Cache-Control` headers on attachment/media proxy requests
+- Character limit enforcement being off by 1
+- Mastodon Streaming API: hashtag timelines not working
+
+### Changed
+- BBCode and Markdown formatters will no longer return any `\n` and only use `<br/>` for newlines
+- Mastodon API: Allow registration without email if email verification is not enabled
+
+### Upgrade notes
+
+#### Nginx only
+1. Remove `proxy_ignore_headers Cache-Control;` and `proxy_hide_header Cache-Control;` from your config.
+
+#### Everyone
+1. Run database migrations (inside the Pleroma directory):
+   - OTP: `./bin/pleroma_ctl migrate`
+   - From Source: `mix ecto.migrate`
+2. Restart Pleroma
+
 ## [2.0.0] - 2019-03-08
 ### Security
 - Mastodon API: Fix being able to request enormous amount of statuses in timelines leading to DoS. Now limited to 40 per request.

@@ -38,6 +66,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - Rate limiter is now disabled for localhost/socket (unless remoteip plug is enabled)
 - Logger: default log level changed from `warn` to `info`.
 - Config mix task `migrate_to_db` truncates `config` table before migrating the config file.
+- Allow account registration without an email
 - Default to `prepare: :unnamed` in the database configuration.
 - Instance stats are now loaded on startup instead of being empty until next hourly job.
 <details>
@@ -504,10 +504,6 @@
     federator_outgoing: 5
   ]
 
-config :pleroma, :fetch_initial_posts,
-  enabled: false,
-  pages: 5
-
 config :auto_linker,
   opts: [
     extra: true,

@@ -2007,25 +2007,6 @@
       }
     ]
   },
-  %{
-    group: :pleroma,
-    key: :fetch_initial_posts,
-    type: :group,
-    description: "Fetching initial posts settings",
-    children: [
-      %{
-        key: :enabled,
-        type: :boolean,
-        description: "Fetch posts when a new user is federated with"
-      },
-      %{
-        key: :pages,
-        type: :integer,
-        description: "The amount of pages to fetch",
-        suggestions: [5]
-      }
-    ]
-  },
   %{
    group: :auto_linker,
    key: :opts,
@@ -92,6 +92,8 @@
 
 config :pleroma, Pleroma.Emails.NewUsersDigestEmail, enabled: true
 
+config :pleroma, Pleroma.Plugs.RemoteIp, enabled: false
+
 if File.exists?("./config/test.secret.exs") do
   import_config "test.secret.exs"
 else
@@ -288,10 +288,11 @@ Pleroma Conversations have the same general structure that Mastodon Conversation
 2. Pleroma Conversations statuses can be requested by Conversation id.
 3. Pleroma Conversations can be replied to.
 
-Conversations have the additional field "recipients" under the "pleroma" key. This holds a list of all the accounts that will receive a message in this conversation.
+Conversations have the additional field `recipients` under the `pleroma` key. This holds a list of all the accounts that will receive a message in this conversation.
 
 The status posting endpoint takes an additional parameter, `in_reply_to_conversation_id`, which, when set, will set the visibility to direct and address only the people who are the recipients of that Conversation.
 
+⚠ Conversation IDs can be found in direct messages under the `pleroma.direct_conversation_id` key; do not confuse it with `pleroma.conversation_id`.
 
 ## `GET /api/v1/pleroma/conversations/:id/statuses`
 ### Timeline for a given conversation
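For illustration only, a status created through Pleroma's internal CommonAPI with this parameter set could look roughly like the sketch below; the variable names are placeholders, and routing the parameter through `CommonAPI.post/2` is an assumption about internals rather than something this merge shows directly.

```elixir
# Hedged sketch: reply within an existing Pleroma Conversation.
# `user` is a %Pleroma.User{}; `direct_conversation_id` is the value a client
# sees as `pleroma.direct_conversation_id` on a direct message.
{:ok, _activity} =
  Pleroma.Web.CommonAPI.post(user, %{
    "status" => "A reply that stays inside the conversation",
    "in_reply_to_conversation_id" => direct_conversation_id
  })
```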
@@ -10,11 +10,11 @@
 Replaces embedded objects with references to them in the `objects` table. Only needs to be run once if the instance was created before Pleroma 1.0.5. The reason this is not a migration is that it could significantly increase the database size after being run; however, `VACUUM FULL` will then be able to reclaim about 20% (really depends on what is in the database, your mileage may vary) of the db size before the migration.
 
 ```sh tab="OTP"
-./bin/pleroma_ctl database remove_embedded_objects [<options>]
+./bin/pleroma_ctl database remove_embedded_objects [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.database remove_embedded_objects [<options>]
+mix pleroma.database remove_embedded_objects [option ...]
 ```
 
 ### Options

@@ -28,11 +28,11 @@ This will prune remote posts older than 90 days (configurable with [`config :ple
 The disk space will only be reclaimed after `VACUUM FULL`. You may run out of disk space during the execution of the task or vacuuming if you don't have about one third of the database size free.
 
 ```sh tab="OTP"
-./bin/pleroma_ctl database prune_objects [<options>]
+./bin/pleroma_ctl database prune_objects [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.database prune_objects [<options>]
+mix pleroma.database prune_objects [option ...]
 ```
 
 ### Options

@@ -5,11 +5,11 @@
 ## Send digest email since given date (user registration date by default) ignoring user activity status.
 
 ```sh tab="OTP"
-./bin/pleroma_ctl digest test <nickname> [<since_date>]
+./bin/pleroma_ctl digest test <nickname> [since_date]
 ```
 
 ```sh tab="From Source"
-mix pleroma.digest test <nickname> [<since_date>]
+mix pleroma.digest test <nickname> [since_date]
 ```
 

@@ -5,11 +5,11 @@
 ## Lists emoji packs and metadata specified in the manifest
 
 ```sh tab="OTP"
-./bin/pleroma_ctl emoji ls-packs [<options>]
+./bin/pleroma_ctl emoji ls-packs [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.emoji ls-packs [<options>]
+mix pleroma.emoji ls-packs [option ...]
 ```
 

@@ -19,11 +19,11 @@ mix pleroma.emoji ls-packs [<options>]
 ## Fetch, verify and install the specified packs from the manifest into `STATIC-DIR/emoji/PACK-NAME`
 
 ```sh tab="OTP"
-./bin/pleroma_ctl emoji get-packs [<options>] <packs>
+./bin/pleroma_ctl emoji get-packs [option ...] <pack ...>
 ```
 
 ```sh tab="From Source"
-mix pleroma.emoji get-packs [<options>] <packs>
+mix pleroma.emoji get-packs [option ...] <pack ...>
 ```
 
 ### Options

@@ -4,11 +4,11 @@
 
 ## Generate a new configuration file
 ```sh tab="OTP"
-./bin/pleroma_ctl instance gen [<options>]
+./bin/pleroma_ctl instance gen [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.instance gen [<options>]
+mix pleroma.instance gen [option ...]
 ```
 

@@ -4,11 +4,11 @@
 
 ## Migrate uploads from local to remote storage
 ```sh tab="OTP"
-./bin/pleroma_ctl uploads migrate_local <target_uploader> [<options>]
+./bin/pleroma_ctl uploads migrate_local <target_uploader> [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.uploads migrate_local <target_uploader> [<options>]
+mix pleroma.uploads migrate_local <target_uploader> [option ...]
 ```
 
 ### Options

@@ -5,11 +5,11 @@
 ## Create a user
 
 ```sh tab="OTP"
-./bin/pleroma_ctl user new <email> [<options>]
+./bin/pleroma_ctl user new <nickname> <email> [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.user new <email> [<options>]
+mix pleroma.user new <nickname> <email> [option ...]
 ```
 

@@ -33,11 +33,11 @@ mix pleroma.user list
 
 ## Generate an invite link
 ```sh tab="OTP"
-./bin/pleroma_ctl user invite [<options>]
+./bin/pleroma_ctl user invite [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.user invite [<options>]
+mix pleroma.user invite [option ...]
 ```
 

@@ -137,11 +137,11 @@ mix pleroma.user reset_password <nickname>
 
 ## Set the value of the given user's settings
 ```sh tab="OTP"
-./bin/pleroma_ctl user set <nickname> [<options>]
+./bin/pleroma_ctl user set <nickname> [option ...]
 ```
 
 ```sh tab="From Source"
-mix pleroma.user set <nickname> [<options>]
+mix pleroma.user set <nickname> [option ...]
 ```
 
 ### Options
@@ -151,14 +151,6 @@ config :pleroma, :mrf_user_allowlist,
 * `sign_object_fetches`: Sign object fetches with HTTP signatures
 * `authorized_fetch_mode`: Require HTTP signatures for AP fetches
 
-### :fetch_initial_posts
-
-!!! warning
-    Be careful with this setting, fetching posts may lead to new users being discovered whose posts will then also be fetched. This can lead to serious load on your instance and database.
-
-* `enabled`: If enabled, when a new user is discovered by your instance, fetch some of their latest posts.
-* `pages`: The amount of pages to fetch
-
 ## Pleroma.ScheduledActivity
 
 * `daily_user_limit`: the number of scheduled activities a user is allowed to create in a single day (Default: `25`)
@@ -156,8 +156,8 @@ cp /opt/pleroma/installation/pleroma.nginx /etc/nginx/conf.d/pleroma.conf
 ```
 
 ```sh tab="Debian/Ubuntu"
-cp /opt/pleroma/installation/pleroma.nginx /etc/nginx/sites-available/pleroma.nginx
-ln -s /etc/nginx/sites-available/pleroma.nginx /etc/nginx/sites-enabled/pleroma.nginx
+cp /opt/pleroma/installation/pleroma.nginx /etc/nginx/sites-available/pleroma.conf
+ln -s /etc/nginx/sites-available/pleroma.conf /etc/nginx/sites-enabled/pleroma.conf
 ```
 
 If your distro does not have either of those you can append `include /etc/nginx/pleroma.conf` to the end of the http section in /etc/nginx/nginx.conf and
@@ -90,8 +90,6 @@ server {
         proxy_ignore_client_abort on;
         proxy_buffering on;
         chunked_transfer_encoding on;
-        proxy_ignore_headers Cache-Control;
-        proxy_hide_header Cache-Control;
         proxy_pass http://127.0.0.1:4000;
     }
 }
@@ -28,7 +28,7 @@ def run(_) do
   defp do_run(implementation) do
     start_pleroma()
 
-    with descriptions <- Pleroma.Config.Loader.load("config/description.exs"),
+    with descriptions <- Pleroma.Config.Loader.read("config/description.exs"),
          {:ok, file_path} <-
            Pleroma.Docs.Generator.process(
              implementation,

@@ -35,7 +35,7 @@ def run(["unfollow", target]) do
   def run(["list"]) do
     start_pleroma()
 
-    with {:ok, list} <- Relay.list() do
+    with {:ok, list} <- Relay.list(true) do
       list |> Enum.each(&shell_info(&1))
     else
       {:error, e} -> shell_error("Error while fetching relay subscription list: #{inspect(e)}")
@@ -308,6 +308,13 @@ def follow_requests_for_actor(%Pleroma.User{ap_id: ap_id}) do
     |> where([a], fragment("? ->> 'state' = 'pending'", a.data))
   end
 
+  def following_requests_for_actor(%Pleroma.User{ap_id: ap_id}) do
+    Queries.by_type("Follow")
+    |> where([a], fragment("?->>'state' = 'pending'", a.data))
+    |> where([a], a.actor == ^ap_id)
+    |> Repo.all()
+  end
+
   def restrict_deactivated_users(query) do
     deactivated_users =
       from(u in User.Query.build(%{deactivated: true}), select: u.ap_id)
@@ -39,7 +39,7 @@ defp visibility_tags(object, activity) do
     end
   end
 
-  defp item_creation_tags(tags, %{data: %{"type" => "Create"}} = object, activity) do
+  defp item_creation_tags(tags, object, %{data: %{"type" => "Create"}} = activity) do
     tags ++ hashtags_to_topics(object) ++ attachment_topics(object, activity)
   end
 
@@ -31,6 +31,7 @@ def user_agent do
   # See http://elixir-lang.org/docs/stable/elixir/Application.html
   # for more information on OTP Applications
   def start(_type, _args) do
+    Pleroma.Config.Holder.save_default()
     Pleroma.HTML.compile_scrubbers()
     Pleroma.Config.DeprecationWarnings.warn()
     Pleroma.Plugs.HTTPSecurityPlug.warn_if_disabled()
@@ -3,14 +3,33 @@
 # SPDX-License-Identifier: AGPL-3.0-only
 
 defmodule Pleroma.Config.Holder do
-  @config Pleroma.Config.Loader.load_and_merge()
+  @config Pleroma.Config.Loader.default_config()
 
-  @spec config() :: keyword()
-  def config, do: @config
-
-  @spec config(atom()) :: any()
-  def config(group), do: @config[group]
-
-  @spec config(atom(), atom()) :: any()
-  def config(group, key), do: @config[group][key]
+  @spec save_default() :: :ok
+  def save_default do
+    default_config =
+      if System.get_env("RELEASE_NAME") do
+        release_config =
+          [:code.root_dir(), "releases", System.get_env("RELEASE_VSN"), "releases.exs"]
+          |> Path.join()
+          |> Pleroma.Config.Loader.read()
+
+        Pleroma.Config.Loader.merge(@config, release_config)
+      else
+        @config
+      end
+
+    Pleroma.Config.put(:default_config, default_config)
+  end
+
+  @spec default_config() :: keyword()
+  def default_config, do: get_default()
+
+  @spec default_config(atom()) :: keyword()
+  def default_config(group), do: Keyword.get(get_default(), group)
+
+  @spec default_config(atom(), atom()) :: keyword()
+  def default_config(group, key), do: get_in(get_default(), [group, key])
+
+  defp get_default, do: Pleroma.Config.get(:default_config)
 end
@@ -13,32 +13,28 @@ defmodule Pleroma.Config.Loader do
   ]
 
   if Code.ensure_loaded?(Config.Reader) do
-    @spec load(Path.t()) :: keyword()
-    def load(path), do: Config.Reader.read!(path)
-
-    defp do_merge(conf1, conf2), do: Config.Reader.merge(conf1, conf2)
+    @reader Config.Reader
+
+    def read(path), do: @reader.read!(path)
   else
     # support for Elixir less than 1.9
-    @spec load(Path.t()) :: keyword()
-    def load(path) do
+    @reader Mix.Config
+    def read(path) do
       path
-      |> Mix.Config.eval!()
+      |> @reader.eval!()
       |> elem(0)
     end
-
-    defp do_merge(conf1, conf2), do: Mix.Config.merge(conf1, conf2)
   end
 
-  @spec load_and_merge() :: keyword()
-  def load_and_merge do
-    all_paths =
-      if Pleroma.Config.get(:release),
-        do: ["config/config.exs", "config/releases.exs"],
-        else: ["config/config.exs"]
-
-    all_paths
-    |> Enum.map(&load(&1))
-    |> Enum.reduce([], &do_merge(&2, &1))
+  @spec read(Path.t()) :: keyword()
+
+  @spec merge(keyword(), keyword()) :: keyword()
+  def merge(c1, c2), do: @reader.merge(c1, c2)
+
+  @spec default_config() :: keyword()
+  def default_config do
+    "config/config.exs"
+    |> read()
     |> filter()
   end
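Taken together, the reworked Holder and Loader are meant to be used roughly as follows at runtime; the sketch only uses functions shown in the two hunks above, and the `:pleroma` / `:instance` group/key pair is an illustrative example, not something introduced by this merge.

```elixir
# Minimal sketch of the new config flow, assuming the modules shown above.
# 1. At boot, persist the compile-time defaults (Pleroma.Application now does this):
Pleroma.Config.Holder.save_default()

# 2. Later, look up a default value to merge against in-database settings;
#    the :pleroma / :instance pair is only an example.
instance_defaults = Pleroma.Config.Holder.default_config(:pleroma, :instance)
```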
@@ -83,7 +83,7 @@ defp merge_and_update(setting) do
       key = ConfigDB.from_string(setting.key)
       group = ConfigDB.from_string(setting.group)
 
-      default = Pleroma.Config.Holder.config(group, key)
+      default = Pleroma.Config.Holder.default_config(group, key)
       value = ConfigDB.from_binary(setting.value)
 
       merged_value =
@@ -15,7 +15,7 @@ def process(descriptions) do
   end
 
   def compile do
-    with config <- Pleroma.Config.Loader.load("config/description.exs") do
+    with config <- Pleroma.Config.Loader.read("config/description.exs") do
       config[:pleroma][:config_description]
       |> Pleroma.Docs.Generator.convert_to_strings()
       |> Jason.encode!()
lib/pleroma/earmark_renderer.ex (new file, 256 lines)

@@ -0,0 +1,256 @@
# Pleroma: A lightweight social networking server
# Copyright © 2020 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
#
# This file is derived from Earmark, under the following copyright:
# Copyright © 2014 Dave Thomas, The Pragmatic Programmers
# SPDX-License-Identifier: Apache-2.0
# Upstream: https://github.com/pragdave/earmark/blob/master/lib/earmark/html_renderer.ex
defmodule Pleroma.EarmarkRenderer do
  @moduledoc false

  alias Earmark.Block
  alias Earmark.Context
  alias Earmark.HtmlRenderer
  alias Earmark.Options

  import Earmark.Inline, only: [convert: 3]
  import Earmark.Helpers.HtmlHelpers
  import Earmark.Message, only: [add_messages_from: 2, get_messages: 1, set_messages: 2]
  import Earmark.Context, only: [append: 2, set_value: 2]
  import Earmark.Options, only: [get_mapper: 1]

  @doc false
  def render(blocks, %Context{options: %Options{}} = context) do
    messages = get_messages(context)

    {contexts, html} =
      get_mapper(context.options).(
        blocks,
        &render_block(&1, put_in(context.options.messages, []))
      )
      |> Enum.unzip()

    all_messages =
      contexts
      |> Enum.reduce(messages, fn ctx, messages1 -> messages1 ++ get_messages(ctx) end)

    {put_in(context.options.messages, all_messages), html |> IO.iodata_to_binary()}
  end

  #############
  # Paragraph #
  #############
  defp render_block(%Block.Para{lnb: lnb, lines: lines, attrs: attrs}, context) do
    lines = convert(lines, lnb, context)
    add_attrs(lines, "<p>#{lines.value}</p>", attrs, [], lnb)
  end

  ########
  # Html #
  ########
  defp render_block(%Block.Html{html: html}, context) do
    {context, html}
  end

  defp render_block(%Block.HtmlComment{lines: lines}, context) do
    {context, lines}
  end

  defp render_block(%Block.HtmlOneline{html: html}, context) do
    {context, html}
  end

  #########
  # Ruler #
  #########
  defp render_block(%Block.Ruler{lnb: lnb, attrs: attrs}, context) do
    add_attrs(context, "<hr />", attrs, [], lnb)
  end

  ###########
  # Heading #
  ###########
  defp render_block(
         %Block.Heading{lnb: lnb, level: level, content: content, attrs: attrs},
         context
       ) do
    converted = convert(content, lnb, context)
    html = "<h#{level}>#{converted.value}</h#{level}>"
    add_attrs(converted, html, attrs, [], lnb)
  end

  ##############
  # Blockquote #
  ##############

  defp render_block(%Block.BlockQuote{lnb: lnb, blocks: blocks, attrs: attrs}, context) do
    {context1, body} = render(blocks, context)
    html = "<blockquote>#{body}</blockquote>"
    add_attrs(context1, html, attrs, [], lnb)
  end

  #########
  # Table #
  #########

  defp render_block(
         %Block.Table{lnb: lnb, header: header, rows: rows, alignments: aligns, attrs: attrs},
         context
       ) do
    {context1, html} = add_attrs(context, "<table>", attrs, [], lnb)
    context2 = set_value(context1, html)

    context3 =
      if header do
        append(add_trs(append(context2, "<thead>"), [header], "th", aligns, lnb), "</thead>")
      else
        # Maybe an error, needed append(context, html)
        context2
      end

    context4 = append(add_trs(append(context3, "<tbody>"), rows, "td", aligns, lnb), "</tbody>")

    {context4, [context4.value, "</table>"]}
  end

  ########
  # Code #
  ########

  defp render_block(
         %Block.Code{lnb: lnb, language: language, attrs: attrs} = block,
         %Context{options: options} = context
       ) do
    class =
      if language, do: ~s{ class="#{code_classes(language, options.code_class_prefix)}"}, else: ""

    tag = ~s[<pre><code#{class}>]
    lines = options.render_code.(block)
    html = ~s[#{tag}#{lines}</code></pre>]
    add_attrs(context, html, attrs, [], lnb)
  end

  #########
  # Lists #
  #########

  defp render_block(
         %Block.List{lnb: lnb, type: type, blocks: items, attrs: attrs, start: start},
         context
       ) do
    {context1, content} = render(items, context)
    html = "<#{type}#{start}>#{content}</#{type}>"
    add_attrs(context1, html, attrs, [], lnb)
  end

  # format a single paragraph list item, and remove the para tags
  defp render_block(
         %Block.ListItem{lnb: lnb, blocks: blocks, spaced: false, attrs: attrs},
         context
       )
       when length(blocks) == 1 do
    {context1, content} = render(blocks, context)
    content = Regex.replace(~r{</?p>}, content, "")
    html = "<li>#{content}</li>"
    add_attrs(context1, html, attrs, [], lnb)
  end

  # format a spaced list item
  defp render_block(%Block.ListItem{lnb: lnb, blocks: blocks, attrs: attrs}, context) do
    {context1, content} = render(blocks, context)
    html = "<li>#{content}</li>"
    add_attrs(context1, html, attrs, [], lnb)
  end

  ##################
  # Footnote Block #
  ##################

  defp render_block(%Block.FnList{blocks: footnotes}, context) do
    items =
      Enum.map(footnotes, fn note ->
        blocks = append_footnote_link(note)
        %Block.ListItem{attrs: "#fn:#{note.number}", type: :ol, blocks: blocks}
      end)

    {context1, html} = render_block(%Block.List{type: :ol, blocks: items}, context)
    {context1, Enum.join([~s[<div class="footnotes">], "<hr />", html, "</div>"])}
  end

  #######################################
  # Isolated IALs are rendered as paras #
  #######################################

  defp render_block(%Block.Ial{verbatim: verbatim}, context) do
    {context, "<p>{:#{verbatim}}</p>"}
  end

  ####################
  # IDDef is ignored #
  ####################

  defp render_block(%Block.IdDef{}, context), do: {context, ""}

  #####################################
  # And here are the inline renderers #
  #####################################

  defdelegate br, to: HtmlRenderer
  defdelegate codespan(text), to: HtmlRenderer
  defdelegate em(text), to: HtmlRenderer
  defdelegate strong(text), to: HtmlRenderer
  defdelegate strikethrough(text), to: HtmlRenderer

  defdelegate link(url, text), to: HtmlRenderer
  defdelegate link(url, text, title), to: HtmlRenderer

  defdelegate image(path, alt, title), to: HtmlRenderer

  defdelegate footnote_link(ref, backref, number), to: HtmlRenderer

  # Table rows
  defp add_trs(context, rows, tag, aligns, lnb) do
    numbered_rows =
      rows
      |> Enum.zip(Stream.iterate(lnb, &(&1 + 1)))

    numbered_rows
    |> Enum.reduce(context, fn {row, lnb}, ctx ->
      append(add_tds(append(ctx, "<tr>"), row, tag, aligns, lnb), "</tr>")
    end)
  end

  defp add_tds(context, row, tag, aligns, lnb) do
    Enum.reduce(1..length(row), context, add_td_fn(row, tag, aligns, lnb))
  end

  defp add_td_fn(row, tag, aligns, lnb) do
    fn n, ctx ->
      style =
        case Enum.at(aligns, n - 1, :default) do
          :default -> ""
          align -> " style=\"text-align: #{align}\""
        end

      col = Enum.at(row, n - 1)
      converted = convert(col, lnb, set_messages(ctx, []))
      append(add_messages_from(ctx, converted), "<#{tag}#{style}>#{converted.value}</#{tag}>")
    end
  end

  ###############################
  # Append Footnote Return Link #
  ###############################

  defdelegate append_footnote_link(note), to: HtmlRenderer
  defdelegate append_footnote_link(note, fnlink), to: HtmlRenderer

  defdelegate render_code(lines), to: HtmlRenderer

  defp code_classes(language, prefix) do
    ["" | String.split(prefix || "")]
    |> Enum.map(fn pfx -> "#{pfx}#{language}" end)
    |> Enum.join(" ")
  end
end
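The renderer is selected through Earmark's options; a minimal sketch of that call, mirroring the `format_input/3` change further down in this merge (`markdown` stands for any Markdown string):

```elixir
# Render Markdown through the Pleroma-specific renderer instead of Earmark's
# default HTML renderer (no stray `\n`, only `<br/>` for newlines).
html = Earmark.as_html!(markdown, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
```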
@@ -78,7 +78,7 @@ def init(plug_opts) do
   end
 
   def call(conn, plug_opts) do
-    if disabled?() do
+    if disabled?(conn) do
       handle_disabled(conn)
     else
       action_settings = action_settings(plug_opts)

@@ -87,9 +87,9 @@ def call(conn, plug_opts) do
   end
 
   defp handle_disabled(conn) do
-    if Config.get(:env) == :prod do
-      Logger.warn("Rate limiter is disabled for localhost/socket")
-    end
+    Logger.warn(
+      "Rate limiter disabled due to forwarded IP not being found. Please ensure your reverse proxy is providing the X-Forwarded-For header or disable the RemoteIP plug/rate limiter."
+    )
 
     conn
   end
@@ -109,16 +109,21 @@ defp handle(conn, action_settings) do
     end
   end
 
-  def disabled? do
+  def disabled?(conn) do
     localhost_or_socket =
-      Config.get([Pleroma.Web.Endpoint, :http, :ip])
-      |> Tuple.to_list()
-      |> Enum.join(".")
-      |> String.match?(~r/^local|^127.0.0.1/)
+      case Config.get([Pleroma.Web.Endpoint, :http, :ip]) do
+        {127, 0, 0, 1} -> true
+        {0, 0, 0, 0, 0, 0, 0, 1} -> true
+        {:local, _} -> true
+        _ -> false
+      end
 
-    remote_ip_disabled = not Config.get([Pleroma.Plugs.RemoteIp, :enabled])
+    remote_ip_not_found =
+      if Map.has_key?(conn.assigns, :remote_ip_found),
+        do: !conn.assigns.remote_ip_found,
+        else: false
 
-    localhost_or_socket and remote_ip_disabled
+    localhost_or_socket and remote_ip_not_found
   end
 
   @inspect_bucket_not_found {:error, :not_found}
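For reference, `disabled?/1` above only stands down when the endpoint listens on loopback or a unix socket and no forwarded client address was found. An endpoint binding that its `{127, 0, 0, 1}` clause would match looks roughly like the sketch below; the port and address are illustrative, not values taken from this merge.

```elixir
# Illustrative listener config matched by the localhost clause above.
config :pleroma, Pleroma.Web.Endpoint,
  http: [ip: {127, 0, 0, 1}, port: 4000]
```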
@@ -7,6 +7,8 @@ defmodule Pleroma.Plugs.RemoteIp do
   This is a shim to call [`RemoteIp`](https://git.pleroma.social/pleroma/remote_ip) but with runtime configuration.
   """
 
+  import Plug.Conn
+
   @behaviour Plug
 
   @headers ~w[

@@ -26,11 +28,12 @@ defmodule Pleroma.Plugs.RemoteIp do
 
   def init(_), do: nil
 
-  def call(conn, _) do
+  def call(%{remote_ip: original_remote_ip} = conn, _) do
     config = Pleroma.Config.get(__MODULE__, [])
 
     if Keyword.get(config, :enabled, false) do
-      RemoteIp.call(conn, remote_ip_opts(config))
+      %{remote_ip: new_remote_ip} = conn = RemoteIp.call(conn, remote_ip_opts(config))
+      assign(conn, :remote_ip_found, original_remote_ip != new_remote_ip)
     else
       conn
     end
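Combined with the rate limiter change above, instances behind a reverse proxy are expected to enable this plug so `conn.remote_ip` reflects the forwarded client address. A minimal, hedged sketch of that setting (only `enabled` is shown; other RemoteIp options are left at their defaults):

```elixir
# Enable the RemoteIp shim; `enabled: false` is the default, as seen in the
# test configuration earlier in this merge.
config :pleroma, Pleroma.Plugs.RemoteIp, enabled: true
```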
@@ -21,6 +21,9 @@ def call(conn, _) do
   defp enabled?, do: Pleroma.Config.get([:static_fe, :enabled], false)
 
   defp accepts_html?(conn) do
-    conn |> get_req_header("accept") |> List.first() |> String.contains?("text/html")
+    case get_req_header(conn, "accept") do
+      [accept | _] -> String.contains?(accept, "text/html")
+      _ -> false
+    end
   end
 end
@@ -14,9 +14,14 @@ defmodule Pleroma.Plugs.UploadedMedia do
   # no slashes
   @path "media"
 
+  @default_cache_control_header "public, max-age=1209600"
+
   def init(_opts) do
     static_plug_opts =
-      []
+      [
+        headers: %{"cache-control" => @default_cache_control_header},
+        cache_control_for_etags: @default_cache_control_header
+      ]
       |> Keyword.put(:from, "__unconfigured_media_plug")
       |> Keyword.put(:at, "/__unconfigured_media_plug")
       |> Plug.Static.init()
@@ -7,7 +7,7 @@ defmodule Pleroma.ReverseProxy do
 
   @keep_req_headers ~w(accept user-agent accept-encoding cache-control if-modified-since) ++
                       ~w(if-unmodified-since if-none-match if-range range)
-  @resp_cache_headers ~w(etag date last-modified cache-control)
+  @resp_cache_headers ~w(etag date last-modified)
   @keep_resp_headers @resp_cache_headers ++
                        ~w(content-type content-disposition content-encoding content-range) ++
                        ~w(accept-ranges vary)

@@ -34,9 +34,6 @@ defmodule Pleroma.ReverseProxy do
   * request: `#{inspect(@keep_req_headers)}`
   * response: `#{inspect(@keep_resp_headers)}`
 
-  If no caching headers (`#{inspect(@resp_cache_headers)}`) are returned by upstream, `cache-control` will be
-  set to `#{inspect(@default_cache_control_header)}`.
-
   Options:
 
   * `redirect_on_failure` (default `false`). Redirects the client to the real remote URL if there's any HTTP

@@ -297,16 +294,17 @@ defp build_resp_headers(headers, opts) do
 
   defp build_resp_cache_headers(headers, _opts) do
     has_cache? = Enum.any?(headers, fn {k, _} -> k in @resp_cache_headers end)
-    has_cache_control? = List.keymember?(headers, "cache-control", 0)
 
     cond do
-      has_cache? && has_cache_control? ->
-        headers
-
       has_cache? ->
-        # There's caching header present but no cache-control -- we need to explicitely override it
-        # to public as Plug defaults to "max-age=0, private, must-revalidate"
-        List.keystore(headers, "cache-control", 0, {"cache-control", "public"})
+        # There's caching header present but no cache-control -- we need to set our own
+        # as Plug defaults to "max-age=0, private, must-revalidate"
+        List.keystore(
+          headers,
+          "cache-control",
+          0,
+          {"cache-control", @default_cache_control_header}
+        )
 
       true ->
         List.keystore(
@@ -16,6 +16,7 @@ defmodule Pleroma.User do
   alias Pleroma.Conversation.Participation
   alias Pleroma.Delivery
   alias Pleroma.FollowingRelationship
+  alias Pleroma.HTML
   alias Pleroma.Keys
   alias Pleroma.Notification
   alias Pleroma.Object
@@ -530,7 +531,14 @@ def register_changeset(struct, params \\ %{}, opts \\ []) do
   end
 
   def maybe_validate_required_email(changeset, true), do: changeset
-  def maybe_validate_required_email(changeset, _), do: validate_required(changeset, [:email])
+
+  def maybe_validate_required_email(changeset, _) do
+    if Pleroma.Config.get([:instance, :account_activation_required]) do
+      validate_required(changeset, [:email])
+    else
+      changeset
+    end
+  end
 
   defp put_ap_id(changeset) do
     ap_id = ap_id(%User{nickname: get_field(changeset, :nickname)})
@@ -832,10 +840,6 @@ def get_or_fetch_by_nickname(nickname) do
       _e ->
         with [_nick, _domain] <- String.split(nickname, "@"),
              {:ok, user} <- fetch_by_nickname(nickname) do
-          if Pleroma.Config.get([:fetch_initial_posts, :enabled]) do
-            fetch_initial_posts(user)
-          end
-
           {:ok, user}
         else
           _e -> {:error, "not found " <> nickname}

@@ -843,11 +847,6 @@ def get_or_fetch_by_nickname(nickname) do
     end
   end
 
-  @doc "Fetch some posts when the user has just been federated with"
-  def fetch_initial_posts(user) do
-    BackgroundWorker.enqueue("fetch_initial_posts", %{"user_id" => user.id})
-  end
-
   @spec get_followers_query(User.t(), pos_integer() | nil) :: Ecto.Query.t()
   def get_followers_query(%User{} = user, nil) do
     User.Query.build(%{followers: user, deactivated: false})
@@ -1313,16 +1312,6 @@ def perform(:delete, %User{} = user) do
     Repo.delete(user)
   end
 
-  def perform(:fetch_initial_posts, %User{} = user) do
-    pages = Pleroma.Config.get!([:fetch_initial_posts, :pages])
-
-    # Insert all the posts in reverse order, so they're in the right order on the timeline
-    user.source_data["outbox"]
-    |> Utils.fetch_ordered_collection(pages)
-    |> Enum.reverse()
-    |> Enum.each(&Pleroma.Web.Federator.incoming_ap_doc/1)
-  end
-
   def perform(:deactivate_async, user, status), do: deactivate(user, status)
 
   @spec perform(atom(), User.t(), list()) :: list() | {:error, any()}
@@ -1451,18 +1440,7 @@ def get_or_fetch_by_ap_id(ap_id) do
     if !is_nil(user) and !needs_update?(user) do
       {:ok, user}
     else
-      # Whether to fetch initial posts for the user (if it's a new user & the fetching is enabled)
-      should_fetch_initial = is_nil(user) and Pleroma.Config.get([:fetch_initial_posts, :enabled])
-
-      resp = fetch_by_ap_id(ap_id)
-
-      if should_fetch_initial do
-        with {:ok, %User{} = user} <- resp do
-          fetch_initial_posts(user)
-        end
-      end
-
-      resp
+      fetch_by_ap_id(ap_id)
     end
   end
 
@@ -2055,4 +2033,27 @@ def set_invisible(user, invisible) do
     |> validate_required([:invisible])
     |> update_and_set_cache()
   end
+
+  def sanitize_html(%User{} = user) do
+    sanitize_html(user, nil)
+  end
+
+  # User data that mastodon isn't filtering (treated as plaintext):
+  # - field name
+  # - display name
+  def sanitize_html(%User{} = user, filter) do
+    fields =
+      user
+      |> User.fields()
+      |> Enum.map(fn %{"name" => name, "value" => value} ->
+        %{
+          "name" => name,
+          "value" => HTML.filter_tags(value, Pleroma.HTML.Scrubber.LinksOnly)
+        }
+      end)
+
+    user
+    |> Map.put(:bio, HTML.filter_tags(user.bio, filter))
+    |> Map.put(:fields, fields)
+  end
 end
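The views touched later in this merge call this helper before rendering; a short sketch of those call sites, using only filter modules that appear verbatim further down (`user` is a `%Pleroma.User{}`):

```elixir
# Sanitize user-controlled HTML before rendering, as the Static-FE fix requires.
ap_user = Pleroma.User.sanitize_html(user)
admin_user = Pleroma.User.sanitize_html(user, FastSanitize.Sanitizer.StripTags)
```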
@@ -60,15 +60,28 @@ def publish(%Activity{data: %{"type" => "Create"}} = activity) do
 
   def publish(_), do: {:error, "Not implemented"}
 
-  @spec list() :: {:ok, [String.t()]} | {:error, any()}
-  def list do
+  @spec list(boolean()) :: {:ok, [String.t()]} | {:error, any()}
+  def list(with_not_accepted \\ false) do
     with %User{} = user <- get_actor() do
-      list =
+      accepted =
         user
         |> User.following()
         |> Enum.map(fn entry -> URI.parse(entry).host end)
         |> Enum.uniq()
 
+      list =
+        if with_not_accepted do
+          without_accept =
+            user
+            |> Pleroma.Activity.following_requests_for_actor()
+            |> Enum.map(fn a -> URI.parse(a.data["object"]).host <> " (no Accept received)" end)
+            |> Enum.uniq()
+
+          accepted ++ without_accept
+        else
+          accepted
+        end
+
       {:ok, list}
     else
       error -> format_error(error)
@@ -784,45 +784,6 @@ defp build_flag_object(act) when is_map(act) or is_binary(act) do
 
   defp build_flag_object(_), do: []
 
-  @doc """
-  Fetches the OrderedCollection/OrderedCollectionPage from `from`, limiting the amount of pages fetched after
-  the first one to `pages_left` pages.
-  If the amount of pages is higher than the collection has, it returns whatever was there.
-  """
-  def fetch_ordered_collection(from, pages_left, acc \\ []) do
-    with {:ok, response} <- Tesla.get(from),
-         {:ok, collection} <- Jason.decode(response.body) do
-      case collection["type"] do
-        "OrderedCollection" ->
-          # If we've encountered the OrderedCollection and not the page,
-          # just call the same function on the page address
-          fetch_ordered_collection(collection["first"], pages_left)
-
-        "OrderedCollectionPage" ->
-          if pages_left > 0 do
-            # There are still more pages
-            if Map.has_key?(collection, "next") do
-              # There are still more pages, go deeper saving what we have into the accumulator
-              fetch_ordered_collection(
-                collection["next"],
-                pages_left - 1,
-                acc ++ collection["orderedItems"]
-              )
-            else
-              # No more pages left, just return whatever we already have
-              acc ++ collection["orderedItems"]
-            end
-          else
-            # Got the amount of pages needed, add them all to the accumulator
-            acc ++ collection["orderedItems"]
-          end
-
-        _ ->
-          {:error, "Not an OrderedCollection or OrderedCollectionPage"}
-      end
-    end
-  end
-
   #### Report-related helpers
   def get_reports(params, page, page_size) do
     params =
@@ -73,6 +73,7 @@ def render("user.json", %{user: user}) do
     {:ok, _, public_key} = Keys.keys_from_pem(user.keys)
     public_key = :public_key.pem_entry_encode(:SubjectPublicKeyInfo, public_key)
     public_key = :public_key.pem_encode([public_key])
+    user = User.sanitize_html(user)
 
     endpoints = render("endpoints.json", %{user: user})
 

@@ -81,12 +82,6 @@ def render("user.json", %{user: user}) do
     fields =
       user
       |> User.fields()
-      |> Enum.map(fn %{"name" => name, "value" => value} ->
-        %{
-          "name" => Pleroma.HTML.strip_tags(name),
-          "value" => Pleroma.HTML.filter_tags(value, Pleroma.HTML.Scrubber.LinksOnly)
-        }
-      end)
       |> Enum.map(&Map.put(&1, "type", "PropertyValue"))
 
     %{
@@ -834,7 +834,7 @@ def config_show(conn, _params) do
     configs = ConfigDB.get_all_as_keyword()
 
     merged =
-      Config.Holder.config()
+      Config.Holder.default_config()
       |> ConfigDB.merge(configs)
       |> Enum.map(fn {group, values} ->
        Enum.map(values, fn {key, value} ->
@@ -5,7 +5,6 @@
 defmodule Pleroma.Web.AdminAPI.AccountView do
   use Pleroma.Web, :view
 
-  alias Pleroma.HTML
   alias Pleroma.User
   alias Pleroma.Web.AdminAPI.AccountView
   alias Pleroma.Web.MediaProxy

@@ -26,7 +25,8 @@ def render("index.json", %{users: users}) do
 
   def render("show.json", %{user: user}) do
     avatar = User.avatar_url(user) |> MediaProxy.url()
-    display_name = HTML.strip_tags(user.name || user.nickname)
+    display_name = Pleroma.HTML.strip_tags(user.name || user.nickname)
+    user = User.sanitize_html(user, FastSanitize.Sanitizer.StripTags)
 
     %{
       "id" => user.id,
@@ -331,7 +331,7 @@ def format_input(text, "text/html", options) do
   def format_input(text, "text/markdown", options) do
     text
     |> Formatter.mentions_escape(options)
-    |> Earmark.as_html!()
+    |> Earmark.as_html!(%Earmark.Options{renderer: Pleroma.EarmarkRenderer})
     |> Formatter.linkify(options)
     |> Formatter.html_escape("text/html")
   end

@@ -591,7 +591,7 @@ def validate_character_limit(full_payload, _attachments) do
     limit = Pleroma.Config.get([:instance, :limit])
     length = String.length(full_payload)
 
-    if length < limit do
+    if length <= limit do
       :ok
     else
       {:error, dgettext("errors", "The status is over the character limit")}
@@ -12,7 +12,7 @@ defmodule Pleroma.Web.Endpoint do
   plug(Pleroma.Plugs.HTTPSecurityPlug)
   plug(Pleroma.Plugs.UploadedMedia)
 
-  @static_cache_control "public max-age=86400 must-revalidate"
+  @static_cache_control "public, no-cache"
 
   # InstanceStatic needs to be before Plug.Static to be able to override shipped-static files
   # If you're adding new paths to `only:` you'll need to configure them in InstanceStatic as well
@@ -76,7 +76,7 @@ defmodule Pleroma.Web.MastodonAPI.AccountController do
   @doc "POST /api/v1/accounts"
   def create(
         %{assigns: %{app: app}} = conn,
-        %{"username" => nickname, "email" => _, "password" => _, "agreement" => true} = params
+        %{"username" => nickname, "password" => _, "agreement" => true} = params
       ) do
     params =
       params

@@ -93,7 +93,8 @@ def create(
       |> Map.put("bio", params["bio"] || "")
       |> Map.put("confirm", params["password"])
 
-    with {:ok, user} <- TwitterAPI.register_user(params, need_confirmation: true),
+    with :ok <- validate_email_param(params),
+         {:ok, user} <- TwitterAPI.register_user(params, need_confirmation: true),
          {:ok, token} <- Token.create_token(app, user, %{scopes: app.scopes}) do
       json(conn, %{
         token_type: "Bearer",

@@ -114,6 +115,15 @@ def create(conn, _) do
     render_error(conn, :forbidden, "Invalid credentials")
   end
 
+  defp validate_email_param(%{"email" => _}), do: :ok
+
+  defp validate_email_param(_) do
+    case Pleroma.Config.get([:instance, :account_activation_required]) do
+      true -> {:error, %{"error" => "Missing parameters"}}
+      _ -> :ok
+    end
+  end
+
   @doc "GET /api/v1/accounts/verify_credentials"
   def verify_credentials(%{assigns: %{user: user}} = conn, _) do
     chat_token = Phoenix.Token.sign(conn, "user socket", user.id)
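In effect, `validate_email_param/1` only rejects a missing email when email confirmation is required; the instance setting it consults is sketched below, with an illustrative value rather than one taken from this merge.

```elixir
# With activation disabled, POST /api/v1/accounts may omit "email";
# with it enabled, the request fails with "Missing parameters" as above.
config :pleroma, :instance,
  account_activation_required: false
```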
@@ -86,6 +86,6 @@ defp local_mastodon_root_path(conn) do
   @spec get_or_make_app() :: {:ok, App.t()} | {:error, Ecto.Changeset.t()}
   defp get_or_make_app do
     %{client_name: @local_mastodon_name, redirect_uris: "."}
-    |> App.get_or_make(["read", "write", "follow", "push"])
+    |> App.get_or_make(["read", "write", "follow", "push", "admin"])
   end
 end
@@ -5,7 +5,6 @@
 defmodule Pleroma.Web.MastodonAPI.AccountView do
   use Pleroma.Web, :view
 
-  alias Pleroma.HTML
   alias Pleroma.User
   alias Pleroma.Web.CommonAPI.Utils
   alias Pleroma.Web.MastodonAPI.AccountView

@@ -67,6 +66,7 @@ def render("relationships.json", %{user: user, targets: targets}) do
   end
 
   defp do_render("show.json", %{user: user} = opts) do
+    user = User.sanitize_html(user, User.html_filter_policy(opts[:for]))
     display_name = user.name || user.nickname
 
     image = User.avatar_url(user) |> MediaProxy.url()
@ -100,17 +100,6 @@ defp do_render("show.json", %{user: user} = opts) do
|
||||||
}
|
}
|
||||||
end)
|
end)
|
||||||
|
|
||||||
fields =
|
|
||||||
user
|
|
||||||
|> User.fields()
|
|
||||||
|> Enum.map(fn %{"name" => name, "value" => value} ->
|
|
||||||
%{
|
|
||||||
"name" => name,
|
|
||||||
"value" => Pleroma.HTML.filter_tags(value, Pleroma.HTML.Scrubber.LinksOnly)
|
|
||||||
}
|
|
||||||
end)
|
|
||||||
|
|
||||||
bio = HTML.filter_tags(user.bio, User.html_filter_policy(opts[:for]))
|
|
||||||
relationship = render("relationship.json", %{user: opts[:for], target: user})
|
relationship = render("relationship.json", %{user: opts[:for], target: user})
|
||||||
|
|
||||||
%{
|
%{
|
||||||
|
@ -123,17 +112,17 @@ defp do_render("show.json", %{user: user} = opts) do
|
||||||
followers_count: followers_count,
|
followers_count: followers_count,
|
||||||
following_count: following_count,
|
following_count: following_count,
|
||||||
statuses_count: user.note_count,
|
statuses_count: user.note_count,
|
||||||
note: bio || "",
|
note: user.bio || "",
|
||||||
url: User.profile_url(user),
|
url: User.profile_url(user),
|
||||||
avatar: image,
|
avatar: image,
|
||||||
avatar_static: image,
|
avatar_static: image,
|
||||||
header: header,
|
header: header,
|
||||||
header_static: header,
|
header_static: header,
|
||||||
emojis: emojis,
|
emojis: emojis,
|
||||||
fields: fields,
|
fields: user.fields,
|
||||||
bot: bot,
|
bot: bot,
|
||||||
source: %{
|
source: %{
|
||||||
note: HTML.strip_tags((user.bio || "") |> String.replace("<br>", "\n")),
|
note: Pleroma.HTML.strip_tags((user.bio || "") |> String.replace("<br>", "\n")),
|
||||||
sensitive: false,
|
sensitive: false,
|
||||||
fields: user.raw_fields,
|
fields: user.raw_fields,
|
||||||
pleroma: %{
|
pleroma: %{
|
||||||
|
|
|
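The view changes above stop filtering `bio` and each profile field inline and instead sanitize the whole user once via `User.sanitize_html/2` before rendering, then read the already-clean `user.bio` and `user.fields`. A rough, stand-alone approximation of that "sanitize once, render plain values" shape (the `scrub/1` helper is a hypothetical stand-in for Pleroma's HTML filtering):

```elixir
# Sanitize the user map once up front, then render without any further filtering.
defmodule SanitizeOnceSketch do
  defp scrub(nil), do: nil
  defp scrub(html), do: String.replace(html, ~r/<script.*?<\/script>/s, "")

  def sanitize(user) do
    %{
      user
      | bio: scrub(user.bio),
        fields: Enum.map(user.fields, fn f -> Map.update!(f, "value", &scrub/1) end)
    }
  end

  def render(user) do
    user = sanitize(user)
    # The view only reads pre-sanitized values from here on.
    %{note: user.bio || "", fields: user.fields}
  end
end

user = %{
  bio: "hi<script>alert(1)</script>",
  fields: [%{"name" => "web", "value" => "<script>x</script>ok"}]
}

%{note: "hi", fields: [%{"name" => "web", "value" => "ok"}]} = SanitizeOnceSketch.render(user)
```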
@@ -101,6 +101,11 @@ def conversation(%{assigns: %{user: user}} = conn, %{"id" => participation_id})
       conn
       |> put_view(ConversationView)
       |> render("participation.json", %{participation: participation, for: user})
+    else
+      _error ->
+        conn
+        |> put_status(404)
+        |> json(%{"error" => "Unknown conversation id"})
     end
   end

@@ -108,9 +113,9 @@ def conversation_statuses(
         %{assigns: %{user: user}} = conn,
         %{"id" => participation_id} = params
       ) do
-    participation = Participation.get(participation_id, preload: [:conversation])
-
-    if user.id == participation.user_id do
+    with %Participation{} = participation <-
+           Participation.get(participation_id, preload: [:conversation]),
+         true <- user.id == participation.user_id do
       params =
         params
         |> Map.put("blocking_user", user)
@@ -126,6 +131,11 @@ def conversation_statuses(
       |> add_link_headers(activities)
       |> put_view(StatusView)
      |> render("index.json", %{activities: activities, for: user, as: :activity})
+    else
+      _error ->
+        conn
+        |> put_status(404)
+        |> json(%{"error" => "Unknown conversation id"})
     end
   end

@@ -133,15 +143,22 @@ def update_conversation(
         %{assigns: %{user: user}} = conn,
         %{"id" => participation_id, "recipients" => recipients}
       ) do
-    participation =
-      participation_id
-      |> Participation.get()
-
-    with true <- user.id == participation.user_id,
+    with %Participation{} = participation <- Participation.get(participation_id),
+         true <- user.id == participation.user_id,
          {:ok, participation} <- Participation.set_recipients(participation, recipients) do
       conn
       |> put_view(ConversationView)
       |> render("participation.json", %{participation: participation, for: user})
+    else
+      {:error, message} ->
+        conn
+        |> put_status(:bad_request)
+        |> json(%{"error" => message})
+
+      _error ->
+        conn
+        |> put_status(404)
+        |> json(%{"error" => "Unknown conversation id"})
     end
   end
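All three conversation actions above now share the same `with/else` shape: a missing participation or an ownership mismatch falls through to an `else` branch that renders a 404 instead of crashing on a failed match. A self-contained illustration of that control flow (`fetch/1` is a hypothetical stand-in for `Participation.get/2`):

```elixir
# Any failed clause in the `with` drops into `else` instead of raising.
defmodule WithElseSketch do
  @participations %{1 => %{id: 1, user_id: 42}}

  defp fetch(id), do: Map.get(@participations, id)

  def conversation_for(user_id, participation_id) do
    with %{} = participation <- fetch(participation_id),
         true <- user_id == participation.user_id do
      {:ok, participation}
    else
      # nil participation or ownership mismatch both land here.
      _error -> {:error, :not_found}
    end
  end
end

{:ok, _} = WithElseSketch.conversation_for(42, 1)
{:error, :not_found} = WithElseSketch.conversation_for(42, 99)
{:error, :not_found} = WithElseSketch.conversation_for(7, 1)
```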
@@ -54,10 +54,17 @@ def represent(%Activity{object: %Object{data: data}} = activity, selected) do
         _ -> data["url"] || data["external_url"] || data["id"]
       end

+    content =
+      if data["content"] do
+        Pleroma.HTML.filter_tags(data["content"])
+      else
+        nil
+      end
+
     %{
-      user: user,
+      user: User.sanitize_html(user),
       title: get_title(activity.object),
-      content: data["content"] || nil,
+      content: content,
       attachment: data["attachment"],
       link: link,
       published: data["published"],
@@ -109,7 +116,7 @@ def show(%{assigns: %{username_or_id: username_or_id}} = conn, params) do
     next_page_id = List.last(timeline) && List.last(timeline).id

     render(conn, "profile.html", %{
-      user: user,
+      user: User.sanitize_html(user),
       timeline: timeline,
       prev_page_id: prev_page_id,
       next_page_id: next_page_id,
@@ -10,10 +10,6 @@ defmodule Pleroma.Workers.BackgroundWorker do
   use Pleroma.Workers.WorkerHelper, queue: "background"

   @impl Oban.Worker
-  def perform(%{"op" => "fetch_initial_posts", "user_id" => user_id}, _job) do
-    user = User.get_cached_by_id(user_id)
-    User.perform(:fetch_initial_posts, user)
-  end
-
   def perform(%{"op" => "deactivate_user", "user_id" => user_id, "status" => status}, _job) do
     user = User.get_cached_by_id(user_id)
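The worker above dispatches jobs by pattern matching on the `"op"` key, so deleting the `fetch_initial_posts` clause leaves any still-queued job with that op without a matching clause; the `DeleteFetchInitialPostsJobs` migration further down clears those rows out of `oban_jobs` for exactly that reason. A stand-alone sketch of the dispatch style (the catch-all clause is added here only so the example returns a value; the real worker has no such clause, and a stale job would fail with a `FunctionClauseError`):

```elixir
# "op"-keyed dispatch: each job map is routed by its "op" value.
defmodule WorkerDispatchSketch do
  def perform(%{"op" => "deactivate_user", "user_id" => id, "status" => status}, _job),
    do: {:deactivate_user, id, status}

  # Catch-all for illustration only.
  def perform(%{"op" => op}, _job), do: {:error, {:unknown_op, op}}
end

{:deactivate_user, 1, true} =
  WorkerDispatchSketch.perform(%{"op" => "deactivate_user", "user_id" => 1, "status" => true}, nil)

{:error, {:unknown_op, "fetch_initial_posts"}} =
  WorkerDispatchSketch.perform(%{"op" => "fetch_initial_posts", "user_id" => 1}, nil)
```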
mix.exs (4 changed lines)
@@ -4,7 +4,7 @@ defmodule Pleroma.Mixfile do
   def project do
     [
       app: :pleroma,
-      version: version("2.0.0"),
+      version: version("2.0.1"),
       elixir: "~> 1.8",
       elixirc_paths: elixirc_paths(Mix.env()),
       compilers: [:phoenix, :gettext] ++ Mix.compilers(),
@@ -126,7 +126,7 @@ defp deps do
       {:ex_aws_s3, "~> 2.0"},
       {:sweet_xml, "~> 0.6.6"},
       {:earmark, "~> 1.3"},
-      {:bbcode, "~> 0.1.1"},
+      {:bbcode_pleroma, "~> 0.2.0"},
       {:ex_machina, "~> 2.3", only: :test},
       {:credo, "~> 1.1.0", only: [:dev, :test], runtime: false},
       {:mock, "~> 0.3.3", only: :test},
mix.lock (4 changed lines)
@@ -3,7 +3,8 @@
  "auto_linker": {:git, "https://git.pleroma.social/pleroma/auto_linker.git", "95e8188490e97505c56636c1379ffdf036c1fdde", [ref: "95e8188490e97505c56636c1379ffdf036c1fdde"]},
  "base62": {:hex, :base62, "1.2.1", "4866763e08555a7b3917064e9eef9194c41667276c51b59de2bc42c6ea65f806", [:mix], [{:custom_base, "~> 0.2.1", [hex: :custom_base, repo: "hexpm", optional: false]}], "hexpm", "3b29948de2013d3f93aa898c884a9dff847e7aec75d9d6d8c1dc4c61c2716c42"},
  "base64url": {:hex, :base64url, "0.0.1", "36a90125f5948e3afd7be97662a1504b934dd5dac78451ca6e9abf85a10286be", [:rebar], [], "hexpm"},
- "bbcode": {:hex, :bbcode, "0.1.1", "0023e2c7814119b2e620b7add67182e3f6019f92bfec9a22da7e99821aceba70", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "5a981b98ac7d366a9b6bf40eac389aaf4d6e623c631e6b6f8a6b571efaafd338"},
+ "bbcode": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/bbcode.git", "f2d267675e9a7e1ad1ea9beb4cc23382762b66c2", [ref: "v0.2.0"]},
+ "bbcode_pleroma": {:hex, :bbcode_pleroma, "0.2.0", "d36f5bca6e2f62261c45be30fa9b92725c0655ad45c99025cb1c3e28e25803ef", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "19851074419a5fedb4ef49e1f01b30df504bb5dbb6d6adfc135238063bebd1c3"},
  "benchee": {:hex, :benchee, "1.0.1", "66b211f9bfd84bd97e6d1beaddf8fc2312aaabe192f776e8931cb0c16f53a521", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}], "hexpm", "3ad58ae787e9c7c94dd7ceda3b587ec2c64604563e049b2a0e8baafae832addb"},
  "bunt": {:hex, :bunt, "0.2.0", "951c6e801e8b1d2cbe58ebbd3e616a869061ddadcc4863d0a2182541acae9a38", [:mix], [], "hexpm", "7af5c7e09fe1d40f76c8e4f9dd2be7cebd83909f31fee7cd0e9eadc567da8353"},
  "cachex": {:hex, :cachex, "3.2.0", "a596476c781b0646e6cb5cd9751af2e2974c3e0d5498a8cab71807618b74fe2f", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:jumper, "~> 1.0", [hex: :jumper, repo: "hexpm", optional: false]}, {:sleeplocks, "~> 1.1", [hex: :sleeplocks, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm", "aef93694067a43697ae0531727e097754a9e992a1e7946296f5969d6dd9ac986"},
@@ -110,4 +111,3 @@
  "web_push_encryption": {:hex, :web_push_encryption, "0.2.3", "a0ceab85a805a30852f143d22d71c434046fbdbafbc7292e7887cec500826a80", [:mix], [{:httpoison, "~> 1.0", [hex: :httpoison, repo: "hexpm", optional: false]}, {:jose, "~> 1.8", [hex: :jose, repo: "hexpm", optional: false]}, {:poison, "~> 3.0", [hex: :poison, repo: "hexpm", optional: false]}], "hexpm", "9315c8f37c108835cf3f8e9157d7a9b8f420a34f402d1b1620a31aed5b93ecdf"},
  "websocket_client": {:git, "https://github.com/jeremyong/websocket_client.git", "9a6f65d05ebf2725d62fb19262b21f1805a59fbf", []},
 }
@@ -0,0 +1,10 @@ (new file)
+defmodule Pleroma.Repo.Migrations.ConfigRemoveFetchInitialPosts do
+  use Ecto.Migration
+
+  def change do
+    execute(
+      "delete from config where config.key = ':fetch_initial_posts' and config.group = ':pleroma';",
+      ""
+    )
+  end
+end

@@ -0,0 +1,10 @@ (new file)
+defmodule Pleroma.Repo.Migrations.DeleteFetchInitialPostsJobs do
+  use Ecto.Migration
+
+  def change do
+    execute(
+      "delete from oban_jobs where worker = 'Pleroma.Workers.BackgroundWorker' and args->>'op' = 'fetch_initial_posts';",
+      ""
+    )
+  end
+end
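Both migrations rely on `Ecto.Migration.execute/2`, whose first argument runs on migrate and whose second runs on rollback; passing `""` makes the rollback a deliberate no-op, since deleted config rows and Oban jobs cannot be restored. A generic sketch of the same pattern, with a hypothetical module and table name (assumes Ecto is available, as it is in this project):

```elixir
defmodule MyApp.Repo.Migrations.RemoveStaleRows do
  use Ecto.Migration

  def change do
    execute(
      # Runs on `mix ecto.migrate`.
      "delete from some_table where stale = true;",
      # Runs on `mix ecto.rollback`; the empty string means "do nothing".
      ""
    )
  end
end
```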
Admin FE and static assets updated (binary diffs not shown).

@@ -1 +1 @@ (Admin FE index page)
-<!DOCTYPE html><html><head><meta charset=utf-8><meta http-equiv=X-UA-Compatible content="IE=edge,chrome=1"><meta name=renderer content=webkit><meta name=viewport content="width=device-width,initial-scale=1,maximum-scale=1,user-scalable=no"><title>Admin FE</title><link rel="shortcut icon" href=favicon.ico><link href=chunk-elementUI.1abbc9b8.css rel=stylesheet><link href=chunk-libs.686b5876.css rel=stylesheet><link href=app.c836e084.css rel=stylesheet></head><body><div id=app></div><script type=text/javascript src=static/js/runtime.ae93ea9f.js></script><script type=text/javascript src=static/js/chunk-elementUI.fba0efec.js></script><script type=text/javascript src=static/js/chunk-libs.b8c453ab.js></script><script type=text/javascript src=static/js/app.55df3157.js></script></body></html>
+<!DOCTYPE html><html><head><meta charset=utf-8><meta http-equiv=X-UA-Compatible content="IE=edge,chrome=1"><meta name=renderer content=webkit><meta name=viewport content="width=device-width,initial-scale=1,maximum-scale=1,user-scalable=no"><title>Admin FE</title><link rel="shortcut icon" href=favicon.ico><link href=chunk-elementUI.1abbc9b8.css rel=stylesheet><link href=chunk-libs.686b5876.css rel=stylesheet><link href=app.c836e084.css rel=stylesheet></head><body><div id=app></div><script type=text/javascript src=static/js/runtime.fa19e5d1.js></script><script type=text/javascript src=static/js/chunk-elementUI.fba0efec.js></script><script type=text/javascript src=static/js/chunk-libs.b8c453ab.js></script><script type=text/javascript src=static/js/app.d2c3c6b3.js></script></body></html>

New binary files (contents not shown):
BIN priv/static/adminfe/static/js/app.d2c3c6b3.js (and .map)
BIN priv/static/adminfe/static/js/chunk-4e7d.a40ad735.js (and .map)
BIN priv/static/adminfe/static/js/chunk-87b3.4704cadf.js (and .map)
BIN priv/static/adminfe/static/js/chunk-cf57.42b96339.js (and .map)
BIN priv/static/adminfe/static/js/runtime.fa19e5d1.js
BIN priv/static/static/static-fe.css

Several other existing binary assets were replaced; their diffs are not shown.
@@ -59,8 +59,8 @@ test "non-local action does not produce public:local topic", %{activity: activit
   describe "public visibility create events" do
     setup do
       activity = %Activity{
-        object: %Object{data: %{"type" => "Create", "attachment" => []}},
-        data: %{"to" => [Pleroma.Constants.as_public()]}
+        object: %Object{data: %{"attachment" => []}},
+        data: %{"type" => "Create", "to" => [Pleroma.Constants.as_public()]}
       }

       {:ok, activity: activity}
@@ -98,8 +98,8 @@ test "only converts strinngs to hash tags", %{
   describe "public visibility create events with attachments" do
     setup do
       activity = %Activity{
-        object: %Object{data: %{"type" => "Create", "attachment" => ["foo"]}},
-        data: %{"to" => [Pleroma.Constants.as_public()]}
+        object: %Object{data: %{"attachment" => ["foo"]}},
+        data: %{"type" => "Create", "to" => [Pleroma.Constants.as_public()]}
       }

       {:ok, activity: activity}
@@ -7,8 +7,8 @@ defmodule Pleroma.Config.HolderTest do

   alias Pleroma.Config.Holder

-  test "config/0" do
-    config = Holder.config()
+  test "default_config/0" do
+    config = Holder.default_config()
     assert config[:pleroma][Pleroma.Uploaders.Local][:uploads] == "test/uploads"
     assert config[:tesla][:adapter] == Tesla.Mock

@@ -20,15 +20,15 @@ test "config/0" do
     refute config[:phoenix][:serve_endpoints]
   end

-  test "config/1" do
-    pleroma_config = Holder.config(:pleroma)
+  test "default_config/1" do
+    pleroma_config = Holder.default_config(:pleroma)
     assert pleroma_config[Pleroma.Uploaders.Local][:uploads] == "test/uploads"
-    tesla_config = Holder.config(:tesla)
+    tesla_config = Holder.default_config(:tesla)
     assert tesla_config[:adapter] == Tesla.Mock
   end

-  test "config/2" do
-    assert Holder.config(:pleroma, Pleroma.Uploaders.Local) == [uploads: "test/uploads"]
-    assert Holder.config(:tesla, :adapter) == Tesla.Mock
+  test "default_config/2" do
+    assert Holder.default_config(:pleroma, Pleroma.Uploaders.Local) == [uploads: "test/uploads"]
+    assert Holder.default_config(:tesla, :adapter) == Tesla.Mock
   end
 end
@@ -7,28 +7,13 @@ defmodule Pleroma.Config.LoaderTest do

   alias Pleroma.Config.Loader

-  test "load/1" do
-    config = Loader.load("test/fixtures/config/temp.secret.exs")
+  test "read/1" do
+    config = Loader.read("test/fixtures/config/temp.secret.exs")
     assert config[:pleroma][:first_setting][:key] == "value"
     assert config[:pleroma][:first_setting][:key2] == [Pleroma.Repo]
     assert config[:quack][:level] == :info
   end

-  test "load_and_merge/0" do
-    config = Loader.load_and_merge()
-
-    refute config[:pleroma][Pleroma.Repo]
-    refute config[:pleroma][Pleroma.Web.Endpoint]
-    refute config[:pleroma][:env]
-    refute config[:pleroma][:configurable_from_database]
-    refute config[:pleroma][:database]
-    refute config[:phoenix][:serve_endpoints]
-
-    assert config[:pleroma][:ecto_repos] == [Pleroma.Repo]
-    assert config[:pleroma][Pleroma.Uploaders.Local][:uploads] == "test/uploads"
-    assert config[:tesla][:adapter] == Tesla.Mock
-  end
-
   test "filter_group/2" do
     assert Loader.filter_group(:pleroma,
              pleroma: [
@@ -70,7 +70,7 @@ test "transfer config values for 1 group and some keys" do

     assert Application.get_env(:quack, :level) == :info
     assert Application.get_env(:quack, :meta) == [:none]
-    default = Pleroma.Config.Holder.config(:quack, :webhook_url)
+    default = Pleroma.Config.Holder.default_config(:quack, :webhook_url)
     assert Application.get_env(:quack, :webhook_url) == default

     on_exit(fn ->
test/earmark_renderer_test.ex (new file, 79 lines)
@@ -0,0 +1,79 @@
+# Pleroma: A lightweight social networking server
+# Copyright © 2020 Pleroma Authors <https://pleroma.social/>
+# SPDX-License-Identifier: AGPL-3.0-only
+defmodule Pleroma.EarmarkRendererTest do
+  use ExUnit.Case
+
+  test "Paragraph" do
+    code = ~s[Hello\n\nWorld!]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<p>Hello</p><p>World!</p>"
+  end
+
+  test "raw HTML" do
+    code = ~s[<a href="http://example.org/">OwO</a><!-- what's this?-->]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<p>#{code}</p>"
+  end
+
+  test "rulers" do
+    code = ~s[before\n\n-----\n\nafter]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<p>before</p><hr /><p>after</p>"
+  end
+
+  test "headings" do
+    code = ~s[# h1\n## h2\n### h3\n]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<h1>h1</h1><h2>h2</h2><h3>h3</h3>]
+  end
+
+  test "blockquote" do
+    code = ~s[> whoms't are you quoting?]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<blockquote><p>whoms’t are you quoting?</p></blockquote>"
+  end
+
+  test "code" do
+    code = ~s[`mix`]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<p><code class="inline">mix</code></p>]
+
+    code = ~s[``mix``]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<p><code class="inline">mix</code></p>]
+
+    code = ~s[```\nputs "Hello World"\n```]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<pre><code class="">puts "Hello World"</code></pre>]
+  end
+
+  test "lists" do
+    code = ~s[- one\n- two\n- three\n- four]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<ul><li>one</li><li>two</li><li>three</li><li>four</li></ul>"
+
+    code = ~s[1. one\n2. two\n3. three\n4. four\n]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<ol><li>one</li><li>two</li><li>three</li><li>four</li></ol>"
+  end
+
+  test "delegated renderers" do
+    code = ~s[a<br/>b]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == "<p>#{code}</p>"
+
+    code = ~s[*aaaa~*]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<p><em>aaaa~</em></p>]
+
+    code = ~s[**aaaa~**]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<p><strong>aaaa~</strong></p>]
+
+    # strikethrought
+    code = ~s[<del>aaaa~</del>]
+    result = Earmark.as_html!(code, %Earmark.Options{renderer: Pleroma.EarmarkRenderer})
+    assert result == ~s[<p><del>aaaa~</del></p>]
+  end
+end
Some files were not shown because too many files have changed in this diff.