Merge branch 'develop' of https://git.pleroma.social/pleroma/pleroma into develop

This commit is contained in:
sadposter 2019-06-25 12:40:02 +01:00
commit 3a71016699
88 changed files with 2973 additions and 390 deletions

View file

@@ -173,6 +173,7 @@ amd64:
  script: &release
  - mix deps.get --only prod
  - mkdir release
+ - export PLEROMA_BUILD_BRANCH=$CI_COMMIT_REF_NAME
  - mix release --path release

View file

@@ -62,8 +62,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
  - MRF: Support for running subchains.
  - Configuration: `skip_thread_containment` option
  - Configuration: `rate_limit` option. See `Pleroma.Plugs.RateLimiter` documentation for details.
+ - MRF: Support for filtering out likely spam messages by rejecting posts from new users that contain links.
  ### Changed
+ - **Breaking:** bind to 127.0.0.1 instead of 0.0.0.0 by default
  - **Breaking:** Configuration: move from Pleroma.Mailer to Pleroma.Emails.Mailer
  - Thread containment / test for complete visibility will be skipped by default.
  - Enforcement of OAuth scopes

View file

@@ -139,6 +139,7 @@
  instrumenters: [Pleroma.Web.Endpoint.Instrumenter],
  url: [host: "localhost"],
  http: [
+ ip: {127, 0, 0, 1},
  dispatch: [
  {:_,
  [

View file

@@ -60,5 +60,5 @@
  )
  end
- if File.exists?("./config/dev.migrated.secret.exs"),
-   do: import_config("./config/dev.migrated.secret.exs")
+ if File.exists?("./config/dev.exported_from_db.secret.exs"),
+   do: import_config("dev.exported_from_db.secret.exs")

View file

@@ -64,5 +64,5 @@
  # which should be versioned separately.
  import_config "prod.secret.exs"
- if File.exists?("./config/prod.migrated.secret.exs"),
-   do: import_config("./config/prod.migrated.secret.exs")
+ if File.exists?("./config/prod.exported_from_db.secret.exs"),
+   do: import_config("prod.exported_from_db.secret.exs")

View file

@@ -568,8 +568,9 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
  {
  configs: [
  {
+ "group": string,
  "key": string,
- "value": string or {} or []
+ "value": string or {} or [] or {"tuple": []}
  }
  ]
  }
@@ -580,6 +581,8 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
  Module name can be passed as string, which starts with `Pleroma`, e.g. `"Pleroma.Upload"`.
  Atom or boolean value can be passed with `:` in the beginning, e.g. `":true"`, `":upload"`.
  Integer with `i:`, e.g. `"i:150"`.
+ Tuple with more than 2 values with `{"tuple": ["first_val", Pleroma.Module, []]}`.
+ `{"tuple": ["some_string", "Pleroma.Some.Module", []]}` will be converted to `{"some_string", Pleroma.Some.Module, []}`.
  Compile time settings (need instance reboot):
  - all settings by these keys:
@@ -595,8 +598,9 @@ Compile time settings (need instance reboot):
  - Method `POST`
  - Params:
  - `configs` => [
+ - `group` (string)
  - `key` (string)
- - `value` (string, [], {})
+ - `value` (string, [], {} or {"tuple": []})
  - `delete` = true (optional, if parameter must be deleted)
  ]
@@ -606,6 +610,7 @@ Compile time settings (need instance reboot):
  {
  configs: [
  {
+ "group": "pleroma",
  "key": "Pleroma.Upload",
  "value": {
  "uploader": "Pleroma.Uploaders.Local",
@@ -619,6 +624,9 @@ Compile time settings (need instance reboot):
  "follow_redirect": ":true",
  "pool": ":upload"
  }
+ },
+ "dispatch": {
+ "tuple": ["/api/v1/streaming", "Pleroma.Web.MastodonAPI.WebsocketHandler", []]
  }
  }
  }
@@ -631,8 +639,9 @@ Compile time settings (need instance reboot):
  {
  configs: [
  {
+ "group": string,
  "key": string,
- "value": string or {} or []
+ "value": string or {} or [] or {"tuple": []}
  }
  ]
  }
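The encoding rules above can be read as a small decoding table. The sketch below is an illustration only, not the actual `Pleroma.Web.AdminAPI.Config` transform code; the module name is hypothetical and module/atom handling is simplified:

```elixir
defmodule AdminConfigValueSketch do
  # Illustration of the string encodings used by the admin config API.
  def decode(%{"tuple" => values}), do: values |> Enum.map(&decode/1) |> List.to_tuple()
  def decode(values) when is_list(values), do: Enum.map(values, &decode/1)
  def decode("i:" <> int), do: String.to_integer(int)           # "i:150"  -> 150
  def decode(":" <> atom), do: String.to_atom(atom)             # ":upload" -> :upload, ":true" -> true
  def decode("Pleroma" <> _ = mod), do: String.to_atom("Elixir." <> mod)  # "Pleroma.Upload" -> Pleroma.Upload
  def decode(other), do: other                                  # plain strings and maps pass through
end

# AdminConfigValueSketch.decode(%{"tuple" => ["some_string", "Pleroma.Some.Module", []]})
# #=> {"some_string", Pleroma.Some.Module, []}
```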

View file

@@ -49,13 +49,6 @@ Feel free to contact us to be added to this list!
  - Platforms: iOS, Android
  - Features: No Streaming
- ### Tootdon
- - Homepage: <http://tootdon.club/>, <http://blog.mastodon-tootdon.com/>
- - Source Code: ???
- - Contact: [@tootdon@mstdn.jp](https://mstdn.jp/users/tootdon)
- - Platforms: Android, iOS
- - Features: No Streaming
  ### Tusky
  - Homepage: <https://tuskyapp.github.io/>
  - Source Code: <https://github.com/tuskyapp/Tusky>

View file

@@ -16,6 +16,13 @@ Note: `strip_exif` has been replaced by `Pleroma.Upload.Filter.Mogrify`.
  ## Pleroma.Uploaders.Local
  * `uploads`: Which directory to store the user-uploads in, relative to pleroma's working directory
+ ## Pleroma.Uploaders.S3
+ * `bucket`: S3 bucket name
+ * `public_endpoint`: S3 endpoint that the user finally accesses (e.g. "https://s3.dualstack.ap-northeast-1.amazonaws.com")
+ * `truncated_namespace`: If you use an S3-compatible service such as Digital Ocean Spaces or a CDN, set the folder name or "" etc.
+ For example, when using a CDN with the S3 virtual-host format, set "".
+ In that case, put the CDN's CNAME in `public_endpoint`.
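A minimal sketch of what such a setup could look like in `prod.secret.exs`; only the keys come from the documentation above, the bucket name and endpoint are placeholders, and credentials for the underlying S3 client are configured separately and omitted here:

```elixir
# Hypothetical values for illustration.
config :pleroma, Pleroma.Upload, uploader: Pleroma.Uploaders.S3

config :pleroma, Pleroma.Uploaders.S3,
  bucket: "my-pleroma-media",
  public_endpoint: "https://s3.dualstack.ap-northeast-1.amazonaws.com",
  # "" when the CDN/virtual-host name already maps to the bucket
  truncated_namespace: ""
```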
  ## Pleroma.Upload.Filter.Mogrify
  * `args`: List of actions for the `mogrify` command like `"strip"` or `["strip", "auto-orient", {"implode", "1"}]`.
@@ -90,6 +97,7 @@ config :pleroma, Pleroma.Emails.Mailer,
  * `Pleroma.Web.ActivityPub.MRF.SubchainPolicy`: Selectively runs other MRF policies when messages match (see ``:mrf_subchain`` section)
  * `Pleroma.Web.ActivityPub.MRF.RejectNonPublic`: Drops posts with non-public visibility settings (See ``:mrf_rejectnonpublic`` section)
  * `Pleroma.Web.ActivityPub.MRF.EnsureRePrepended`: Rewrites posts to ensure that replies to posts with subjects do not have an identical subject and instead begin with re:.
+ * `Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy`: Rejects posts that contain links from users who have never posted or gained followers, i.e. likely spambots (see the example after this list).
  * `public`: Makes the client API in authenticated mode-only except for user-profiles. Useful for disabling the Local Timeline and The Whole Known Network.
  * `quarantined_instances`: List of ActivityPub instances where private (DMs, followers-only) activities will not be sent.
  * `managed_config`: Whether the config for pleroma-fe is configured in this config or in ``static/config.json``
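A sketch of enabling the new policy. The assumption here is that, in this era of Pleroma, the MRF chain is the `rewrite_policy` list under `:instance`; verify the key against your own `config.exs`:

```elixir
config :pleroma, :instance,
  rewrite_policy: [
    # keep whatever policies you already run here
    Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy
  ]
```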

View file

@@ -2,19 +2,23 @@
  # Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
  # SPDX-License-Identifier: AGPL-3.0-only
- defmodule Mix.Tasks.Pleroma.Common do
+ defmodule Mix.Pleroma do
  @doc "Common functions to be reused in mix tasks"
  def start_pleroma do
  Application.put_env(:phoenix, :serve_endpoints, false, persistent: true)
  {:ok, _} = Application.ensure_all_started(:pleroma)
  end
+ def load_pleroma do
+ Application.load(:pleroma)
+ end
  def get_option(options, opt, prompt, defval \\ nil, defname \\ nil) do
  Keyword.get(options, opt) || shell_prompt(prompt, defval, defname)
  end
  def shell_prompt(prompt, defval \\ nil, defname \\ nil) do
- prompt_message = "#{prompt} [#{defname || defval}]"
+ prompt_message = "#{prompt} [#{defname || defval}] "
  input =
  if mix_shell?(),

View file

@@ -1,9 +1,9 @@
  defmodule Mix.Tasks.Pleroma.Benchmark do
+ import Mix.Pleroma
  use Mix.Task
- alias Mix.Tasks.Pleroma.Common
  def run(["search"]) do
- Common.start_pleroma()
+ start_pleroma()
  Benchee.run(%{
  "search" => fn ->
@@ -13,7 +13,7 @@ def run(["search"]) do
  end
  def run(["tag"]) do
- Common.start_pleroma()
+ start_pleroma()
  Benchee.run(%{
  "tag" => fn ->

View file

@@ -1,6 +1,6 @@
  defmodule Mix.Tasks.Pleroma.Config do
  use Mix.Task
- alias Mix.Tasks.Pleroma.Common
+ import Mix.Pleroma
  alias Pleroma.Repo
  alias Pleroma.Web.AdminAPI.Config
  @shortdoc "Manages the location of the config"
@@ -17,14 +17,14 @@ defmodule Mix.Tasks.Pleroma.Config do
  """
  def run(["migrate_to_db"]) do
- Common.start_pleroma()
+ start_pleroma()
  if Pleroma.Config.get([:instance, :dynamic_configuration]) do
  Application.get_all_env(:pleroma)
  |> Enum.reject(fn {k, _v} -> k in [Pleroma.Repo, :env] end)
  |> Enum.each(fn {k, v} ->
  key = to_string(k) |> String.replace("Elixir.", "")
- {:ok, _} = Config.update_or_create(%{key: key, value: v})
+ {:ok, _} = Config.update_or_create(%{group: "pleroma", key: key, value: v})
  Mix.shell().info("#{key} is migrated.")
  end)
@@ -37,12 +37,13 @@ def run(["migrate_to_db"]) do
  end
  def run(["migrate_from_db", env]) do
- Common.start_pleroma()
+ start_pleroma()
  if Pleroma.Config.get([:instance, :dynamic_configuration]) do
- config_path = "config/#{env}.migrated.secret.exs"
+ config_path = "config/#{env}.exported_from_db.secret.exs"
  {:ok, file} = File.open(config_path, [:write])
+ IO.write(file, "use Mix.Config\r\n")
  Repo.all(Config)
  |> Enum.each(fn config ->
@@ -50,7 +51,9 @@ def run(["migrate_from_db", env]) do
  IO.write(
  file,
- "config :pleroma, #{config.key}#{mark} #{inspect(Config.from_binary(config.value))}\r\n"
+ "config :#{config.group}, #{config.key}#{mark} #{
+   inspect(Config.from_binary(config.value))
+ }\r\n"
  )
  {:ok, _} = Repo.delete(config)
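For illustration, after this change `migrate_from_db` writes a plain Mix config file (e.g. `config/prod.exported_from_db.secret.exs`). A hypothetical exported `:instance` row would come out roughly like this; the values are invented and the exact formatting depends on the stored value:

```elixir
use Mix.Config

config :pleroma, :instance, [name: "My Pleroma", notify_email: "admin@example.com"]
```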

View file

@ -3,12 +3,12 @@
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Database do defmodule Mix.Tasks.Pleroma.Database do
alias Mix.Tasks.Pleroma.Common
alias Pleroma.Conversation alias Pleroma.Conversation
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.Repo alias Pleroma.Repo
alias Pleroma.User alias Pleroma.User
require Logger require Logger
import Mix.Pleroma
use Mix.Task use Mix.Task
@shortdoc "A collection of database related tasks" @shortdoc "A collection of database related tasks"
@ -45,7 +45,7 @@ def run(["remove_embedded_objects" | args]) do
] ]
) )
Common.start_pleroma() start_pleroma()
Logger.info("Removing embedded objects") Logger.info("Removing embedded objects")
Repo.query!( Repo.query!(
@ -66,12 +66,12 @@ def run(["remove_embedded_objects" | args]) do
end end
def run(["bump_all_conversations"]) do def run(["bump_all_conversations"]) do
Common.start_pleroma() start_pleroma()
Conversation.bump_for_all_activities() Conversation.bump_for_all_activities()
end end
def run(["update_users_following_followers_counts"]) do def run(["update_users_following_followers_counts"]) do
Common.start_pleroma() start_pleroma()
users = Repo.all(User) users = Repo.all(User)
Enum.each(users, &User.remove_duplicated_following/1) Enum.each(users, &User.remove_duplicated_following/1)
@ -89,7 +89,7 @@ def run(["prune_objects" | args]) do
] ]
) )
Common.start_pleroma() start_pleroma()
deadline = Pleroma.Config.get([:instance, :remote_post_retention_days]) deadline = Pleroma.Config.get([:instance, :remote_post_retention_days])

View file

@ -9,6 +9,15 @@ defmodule Mix.Tasks.Pleroma.Ecto do
def ensure_migrations_path(repo, opts) do def ensure_migrations_path(repo, opts) do
path = opts[:migrations_path] || Path.join(source_repo_priv(repo), "migrations") path = opts[:migrations_path] || Path.join(source_repo_priv(repo), "migrations")
path =
case Path.type(path) do
:relative ->
Path.join(Application.app_dir(:pleroma), path)
:absolute ->
path
end
if not File.dir?(path) do if not File.dir?(path) do
raise_missing_migrations(Path.relative_to_cwd(path), repo) raise_missing_migrations(Path.relative_to_cwd(path), repo)
end end
@ -22,7 +31,7 @@ def ensure_migrations_path(repo, opts) do
def source_repo_priv(repo) do def source_repo_priv(repo) do
config = repo.config() config = repo.config()
priv = config[:priv] || "priv/#{repo |> Module.split() |> List.last() |> Macro.underscore()}" priv = config[:priv] || "priv/#{repo |> Module.split() |> List.last() |> Macro.underscore()}"
Path.join(File.cwd!(), priv) Path.join(Application.app_dir(:pleroma), priv)
end end
defp raise_missing_migrations(path, repo) do defp raise_missing_migrations(path, repo) do

View file

@ -4,6 +4,7 @@
defmodule Mix.Tasks.Pleroma.Ecto.Migrate do defmodule Mix.Tasks.Pleroma.Ecto.Migrate do
use Mix.Task use Mix.Task
import Mix.Pleroma
require Logger require Logger
@shortdoc "Wrapper on `ecto.migrate` task." @shortdoc "Wrapper on `ecto.migrate` task."
@ -37,6 +38,7 @@ defmodule Mix.Tasks.Pleroma.Ecto.Migrate do
@impl true @impl true
def run(args \\ []) do def run(args \\ []) do
load_pleroma()
{opts, _} = OptionParser.parse!(args, strict: @switches, aliases: @aliases) {opts, _} = OptionParser.parse!(args, strict: @switches, aliases: @aliases)
opts = opts =

View file

@ -4,6 +4,7 @@
defmodule Mix.Tasks.Pleroma.Ecto.Rollback do defmodule Mix.Tasks.Pleroma.Ecto.Rollback do
use Mix.Task use Mix.Task
import Mix.Pleroma
require Logger require Logger
@shortdoc "Wrapper on `ecto.rollback` task" @shortdoc "Wrapper on `ecto.rollback` task"
@ -36,6 +37,7 @@ defmodule Mix.Tasks.Pleroma.Ecto.Rollback do
@impl true @impl true
def run(args \\ []) do def run(args \\ []) do
load_pleroma()
{opts, _} = OptionParser.parse!(args, strict: @switches, aliases: @aliases) {opts, _} = OptionParser.parse!(args, strict: @switches, aliases: @aliases)
opts = opts =

View file

@ -4,7 +4,7 @@
defmodule Mix.Tasks.Pleroma.Instance do defmodule Mix.Tasks.Pleroma.Instance do
use Mix.Task use Mix.Task
alias Mix.Tasks.Pleroma.Common import Mix.Pleroma
@shortdoc "Manages Pleroma instance" @shortdoc "Manages Pleroma instance"
@moduledoc """ @moduledoc """
@ -29,8 +29,11 @@ defmodule Mix.Tasks.Pleroma.Instance do
- `--dbname DBNAME` - the name of the database to use - `--dbname DBNAME` - the name of the database to use
- `--dbuser DBUSER` - the user (aka role) to use for the database connection - `--dbuser DBUSER` - the user (aka role) to use for the database connection
- `--dbpass DBPASS` - the password to use for the database connection - `--dbpass DBPASS` - the password to use for the database connection
- `--rum Y/N` - Whether to enable RUM indexes
- `--indexable Y/N` - Allow/disallow indexing site by search engines - `--indexable Y/N` - Allow/disallow indexing site by search engines
- `--db-configurable Y/N` - Allow/disallow configuring instance from admin part - `--db-configurable Y/N` - Allow/disallow configuring instance from admin part
- `--uploads-dir` - the directory uploads go in when using a local uploader
- `--static-dir` - the directory custom public files should be read from (custom emojis, frontend bundle overrides, robots.txt, etc.)
""" """
def run(["gen" | rest]) do def run(["gen" | rest]) do
@ -49,8 +52,11 @@ def run(["gen" | rest]) do
dbname: :string, dbname: :string,
dbuser: :string, dbuser: :string,
dbpass: :string, dbpass: :string,
rum: :string,
indexable: :string, indexable: :string,
db_configurable: :string db_configurable: :string,
uploads_dir: :string,
static_dir: :string
], ],
aliases: [ aliases: [
o: :output, o: :output,
@ -70,7 +76,7 @@ def run(["gen" | rest]) do
if proceed? do if proceed? do
[domain, port | _] = [domain, port | _] =
String.split( String.split(
Common.get_option( get_option(
options, options,
:domain, :domain,
"What domain will your instance use? (e.g pleroma.soykaf.com)" "What domain will your instance use? (e.g pleroma.soykaf.com)"
@ -79,16 +85,16 @@ def run(["gen" | rest]) do
) ++ [443] ) ++ [443]
name = name =
Common.get_option( get_option(
options, options,
:instance_name, :instance_name,
"What is the name of your instance? (e.g. Pleroma/Soykaf)" "What is the name of your instance? (e.g. Pleroma/Soykaf)"
) )
email = Common.get_option(options, :admin_email, "What is your admin email address?") email = get_option(options, :admin_email, "What is your admin email address?")
notify_email = notify_email =
Common.get_option( get_option(
options, options,
:notify_email, :notify_email,
"What email address do you want to use for sending email notifications?", "What email address do you want to use for sending email notifications?",
@ -96,7 +102,7 @@ def run(["gen" | rest]) do
) )
indexable = indexable =
Common.get_option( get_option(
options, options,
:indexable, :indexable,
"Do you want search engines to index your site? (y/n)", "Do you want search engines to index your site? (y/n)",
@ -104,21 +110,19 @@ def run(["gen" | rest]) do
) === "y" ) === "y"
db_configurable? = db_configurable? =
Common.get_option( get_option(
options, options,
:db_configurable, :db_configurable,
"Do you want to be able to configure instance from admin part? (y/n)", "Do you want to store the configuration in the database (allows controlling it from admin-fe)? (y/n)",
"y" "y"
) === "y" ) === "y"
dbhost = dbhost = get_option(options, :dbhost, "What is the hostname of your database?", "localhost")
Common.get_option(options, :dbhost, "What is the hostname of your database?", "localhost")
dbname = dbname = get_option(options, :dbname, "What is the name of your database?", "pleroma_dev")
Common.get_option(options, :dbname, "What is the name of your database?", "pleroma_dev")
dbuser = dbuser =
Common.get_option( get_option(
options, options,
:dbuser, :dbuser,
"What is the user used to connect to your database?", "What is the user used to connect to your database?",
@ -126,7 +130,7 @@ def run(["gen" | rest]) do
) )
dbpass = dbpass =
Common.get_option( get_option(
options, options,
:dbpass, :dbpass,
"What is the password used to connect to your database?", "What is the password used to connect to your database?",
@ -134,13 +138,38 @@ def run(["gen" | rest]) do
"autogenerated" "autogenerated"
) )
rum_enabled =
get_option(
options,
:rum,
"Would you like to use RUM indices?",
"n"
) === "y"
uploads_dir =
get_option(
options,
:upload_dir,
"What directory should media uploads go in (when using the local uploader)?",
Pleroma.Config.get([Pleroma.Uploaders.Local, :uploads])
)
static_dir =
get_option(
options,
:static_dir,
"What directory should custom public files be read from (custom emojis, frontend bundle overrides, robots.txt, etc.)?",
Pleroma.Config.get([:instance, :static_dir])
)
secret = :crypto.strong_rand_bytes(64) |> Base.encode64() |> binary_part(0, 64) secret = :crypto.strong_rand_bytes(64) |> Base.encode64() |> binary_part(0, 64)
signing_salt = :crypto.strong_rand_bytes(8) |> Base.encode64() |> binary_part(0, 8) signing_salt = :crypto.strong_rand_bytes(8) |> Base.encode64() |> binary_part(0, 8)
{web_push_public_key, web_push_private_key} = :crypto.generate_key(:ecdh, :prime256v1) {web_push_public_key, web_push_private_key} = :crypto.generate_key(:ecdh, :prime256v1)
template_dir = Application.app_dir(:pleroma, "priv") <> "/templates"
result_config = result_config =
EEx.eval_file( EEx.eval_file(
"sample_config.eex" |> Path.expand(__DIR__), template_dir <> "/sample_config.eex",
domain: domain, domain: domain,
port: port, port: port,
email: email, email: email,
@ -150,47 +179,50 @@ def run(["gen" | rest]) do
dbname: dbname, dbname: dbname,
dbuser: dbuser, dbuser: dbuser,
dbpass: dbpass, dbpass: dbpass,
version: Pleroma.Mixfile.project() |> Keyword.get(:version),
secret: secret, secret: secret,
signing_salt: signing_salt, signing_salt: signing_salt,
web_push_public_key: Base.url_encode64(web_push_public_key, padding: false), web_push_public_key: Base.url_encode64(web_push_public_key, padding: false),
web_push_private_key: Base.url_encode64(web_push_private_key, padding: false), web_push_private_key: Base.url_encode64(web_push_private_key, padding: false),
db_configurable?: db_configurable? db_configurable?: db_configurable?,
static_dir: static_dir,
uploads_dir: uploads_dir,
rum_enabled: rum_enabled
) )
result_psql = result_psql =
EEx.eval_file( EEx.eval_file(
"sample_psql.eex" |> Path.expand(__DIR__), template_dir <> "/sample_psql.eex",
dbname: dbname, dbname: dbname,
dbuser: dbuser, dbuser: dbuser,
dbpass: dbpass dbpass: dbpass,
rum_enabled: rum_enabled
) )
Common.shell_info( shell_info(
"Writing config to #{config_path}. You should rename it to config/prod.secret.exs or config/dev.secret.exs." "Writing config to #{config_path}. You should rename it to config/prod.secret.exs or config/dev.secret.exs."
) )
File.write(config_path, result_config) File.write(config_path, result_config)
Common.shell_info("Writing #{psql_path}.") shell_info("Writing #{psql_path}.")
File.write(psql_path, result_psql) File.write(psql_path, result_psql)
write_robots_txt(indexable) write_robots_txt(indexable, template_dir)
Common.shell_info( shell_info(
"\n" <> "\n" <>
""" """
To get started: To get started:
1. Verify the contents of the generated files. 1. Verify the contents of the generated files.
2. Run `sudo -u postgres psql -f #{Common.escape_sh_path(psql_path)}`. 2. Run `sudo -u postgres psql -f #{escape_sh_path(psql_path)}`.
""" <> """ <>
if config_path in ["config/dev.secret.exs", "config/prod.secret.exs"] do if config_path in ["config/dev.secret.exs", "config/prod.secret.exs"] do
"" ""
else else
"3. Run `mv #{Common.escape_sh_path(config_path)} 'config/prod.secret.exs'`." "3. Run `mv #{escape_sh_path(config_path)} 'config/prod.secret.exs'`."
end end
) )
else else
Common.shell_error( shell_error(
"The task would have overwritten the following files:\n" <> "The task would have overwritten the following files:\n" <>
(Enum.map(paths, &"- #{&1}\n") |> Enum.join("")) <> (Enum.map(paths, &"- #{&1}\n") |> Enum.join("")) <>
"Rerun with `--force` to overwrite them." "Rerun with `--force` to overwrite them."
@ -198,10 +230,10 @@ def run(["gen" | rest]) do
end end
end end
defp write_robots_txt(indexable) do defp write_robots_txt(indexable, template_dir) do
robots_txt = robots_txt =
EEx.eval_file( EEx.eval_file(
Path.expand("robots_txt.eex", __DIR__), template_dir <> "/robots_txt.eex",
indexable: indexable indexable: indexable
) )
@ -215,10 +247,10 @@ defp write_robots_txt(indexable) do
if File.exists?(robots_txt_path) do if File.exists?(robots_txt_path) do
File.cp!(robots_txt_path, "#{robots_txt_path}.bak") File.cp!(robots_txt_path, "#{robots_txt_path}.bak")
Common.shell_info("Backing up existing robots.txt to #{robots_txt_path}.bak") shell_info("Backing up existing robots.txt to #{robots_txt_path}.bak")
end end
File.write(robots_txt_path, robots_txt) File.write(robots_txt_path, robots_txt)
Common.shell_info("Writing #{robots_txt_path}.") shell_info("Writing #{robots_txt_path}.")
end end
end end

View file

@ -4,7 +4,7 @@
defmodule Mix.Tasks.Pleroma.Relay do defmodule Mix.Tasks.Pleroma.Relay do
use Mix.Task use Mix.Task
alias Mix.Tasks.Pleroma.Common import Mix.Pleroma
alias Pleroma.Web.ActivityPub.Relay alias Pleroma.Web.ActivityPub.Relay
@shortdoc "Manages remote relays" @shortdoc "Manages remote relays"
@ -24,24 +24,24 @@ defmodule Mix.Tasks.Pleroma.Relay do
Example: ``mix pleroma.relay unfollow https://example.org/relay`` Example: ``mix pleroma.relay unfollow https://example.org/relay``
""" """
def run(["follow", target]) do def run(["follow", target]) do
Common.start_pleroma() start_pleroma()
with {:ok, _activity} <- Relay.follow(target) do with {:ok, _activity} <- Relay.follow(target) do
# put this task to sleep to allow the genserver to push out the messages # put this task to sleep to allow the genserver to push out the messages
:timer.sleep(500) :timer.sleep(500)
else else
{:error, e} -> Common.shell_error("Error while following #{target}: #{inspect(e)}") {:error, e} -> shell_error("Error while following #{target}: #{inspect(e)}")
end end
end end
def run(["unfollow", target]) do def run(["unfollow", target]) do
Common.start_pleroma() start_pleroma()
with {:ok, _activity} <- Relay.unfollow(target) do with {:ok, _activity} <- Relay.unfollow(target) do
# put this task to sleep to allow the genserver to push out the messages # put this task to sleep to allow the genserver to push out the messages
:timer.sleep(500) :timer.sleep(500)
else else
{:error, e} -> Common.shell_error("Error while following #{target}: #{inspect(e)}") {:error, e} -> shell_error("Error while following #{target}: #{inspect(e)}")
end end
end end
end end

View file

@ -4,7 +4,7 @@
defmodule Mix.Tasks.Pleroma.Uploads do defmodule Mix.Tasks.Pleroma.Uploads do
use Mix.Task use Mix.Task
alias Mix.Tasks.Pleroma.Common import Mix.Pleroma
alias Pleroma.Upload alias Pleroma.Upload
alias Pleroma.Uploaders.Local alias Pleroma.Uploaders.Local
require Logger require Logger
@ -24,7 +24,7 @@ defmodule Mix.Tasks.Pleroma.Uploads do
""" """
def run(["migrate_local", target_uploader | args]) do def run(["migrate_local", target_uploader | args]) do
delete? = Enum.member?(args, "--delete") delete? = Enum.member?(args, "--delete")
Common.start_pleroma() start_pleroma()
local_path = Pleroma.Config.get!([Local, :uploads]) local_path = Pleroma.Config.get!([Local, :uploads])
uploader = Module.concat(Pleroma.Uploaders, target_uploader) uploader = Module.concat(Pleroma.Uploaders, target_uploader)
@ -38,10 +38,10 @@ def run(["migrate_local", target_uploader | args]) do
Pleroma.Config.put([Upload, :uploader], uploader) Pleroma.Config.put([Upload, :uploader], uploader)
end end
Common.shell_info("Migrating files from local #{local_path} to #{to_string(uploader)}") shell_info("Migrating files from local #{local_path} to #{to_string(uploader)}")
if delete? do if delete? do
Common.shell_info( shell_info(
"Attention: uploaded files will be deleted, hope you have backups! (--delete ; cancel with ^C)" "Attention: uploaded files will be deleted, hope you have backups! (--delete ; cancel with ^C)"
) )
@ -78,7 +78,7 @@ def run(["migrate_local", target_uploader | args]) do
|> Enum.filter(& &1) |> Enum.filter(& &1)
total_count = length(uploads) total_count = length(uploads)
Common.shell_info("Found #{total_count} uploads") shell_info("Found #{total_count} uploads")
uploads uploads
|> Task.async_stream( |> Task.async_stream(
@ -90,7 +90,7 @@ def run(["migrate_local", target_uploader | args]) do
:ok :ok
error -> error ->
Common.shell_error("failed to upload #{inspect(upload.path)}: #{inspect(error)}") shell_error("failed to upload #{inspect(upload.path)}: #{inspect(error)}")
end end
end, end,
timeout: 150_000 timeout: 150_000
@ -99,10 +99,10 @@ def run(["migrate_local", target_uploader | args]) do
# credo:disable-for-next-line Credo.Check.Warning.UnusedEnumOperation # credo:disable-for-next-line Credo.Check.Warning.UnusedEnumOperation
|> Enum.reduce(0, fn done, count -> |> Enum.reduce(0, fn done, count ->
count = count + length(done) count = count + length(done)
Common.shell_info("Uploaded #{count}/#{total_count} files") shell_info("Uploaded #{count}/#{total_count} files")
count count
end) end)
Common.shell_info("Done!") shell_info("Done!")
end end
end end

View file

@ -5,9 +5,10 @@
defmodule Mix.Tasks.Pleroma.User do defmodule Mix.Tasks.Pleroma.User do
use Mix.Task use Mix.Task
import Ecto.Changeset import Ecto.Changeset
alias Mix.Tasks.Pleroma.Common import Mix.Pleroma
alias Pleroma.User alias Pleroma.User
alias Pleroma.UserInviteToken alias Pleroma.UserInviteToken
alias Pleroma.Web.OAuth
@shortdoc "Manages Pleroma users" @shortdoc "Manages Pleroma users"
@moduledoc """ @moduledoc """
@ -49,6 +50,10 @@ defmodule Mix.Tasks.Pleroma.User do
mix pleroma.user delete_activities NICKNAME mix pleroma.user delete_activities NICKNAME
## Sign user out from all applications (delete user's OAuth tokens and authorizations).
mix pleroma.user sign_out NICKNAME
## Deactivate or activate the user's account. ## Deactivate or activate the user's account.
mix pleroma.user toggle_activated NICKNAME mix pleroma.user toggle_activated NICKNAME
@ -115,7 +120,7 @@ def run(["new", nickname, email | rest]) do
admin? = Keyword.get(options, :admin, false) admin? = Keyword.get(options, :admin, false)
assume_yes? = Keyword.get(options, :assume_yes, false) assume_yes? = Keyword.get(options, :assume_yes, false)
Common.shell_info(""" shell_info("""
A user will be created with the following information: A user will be created with the following information:
- nickname: #{nickname} - nickname: #{nickname}
- email: #{email} - email: #{email}
@ -128,10 +133,10 @@ def run(["new", nickname, email | rest]) do
- admin: #{if(admin?, do: "true", else: "false")} - admin: #{if(admin?, do: "true", else: "false")}
""") """)
proceed? = assume_yes? or Common.shell_yes?("Continue?") proceed? = assume_yes? or shell_yes?("Continue?")
if proceed? do if proceed? do
Common.start_pleroma() start_pleroma()
params = %{ params = %{
nickname: nickname, nickname: nickname,
@ -145,7 +150,7 @@ def run(["new", nickname, email | rest]) do
changeset = User.register_changeset(%User{}, params, need_confirmation: false) changeset = User.register_changeset(%User{}, params, need_confirmation: false)
{:ok, _user} = User.register(changeset) {:ok, _user} = User.register(changeset)
Common.shell_info("User #{nickname} created") shell_info("User #{nickname} created")
if moderator? do if moderator? do
run(["set", nickname, "--moderator"]) run(["set", nickname, "--moderator"])
@ -159,64 +164,64 @@ def run(["new", nickname, email | rest]) do
run(["reset_password", nickname]) run(["reset_password", nickname])
end end
else else
Common.shell_info("User will not be created.") shell_info("User will not be created.")
end end
end end
def run(["rm", nickname]) do def run(["rm", nickname]) do
Common.start_pleroma() start_pleroma()
with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do
User.perform(:delete, user) User.perform(:delete, user)
Common.shell_info("User #{nickname} deleted.") shell_info("User #{nickname} deleted.")
else else
_ -> _ ->
Common.shell_error("No local user #{nickname}") shell_error("No local user #{nickname}")
end end
end end
def run(["toggle_activated", nickname]) do def run(["toggle_activated", nickname]) do
Common.start_pleroma() start_pleroma()
with %User{} = user <- User.get_cached_by_nickname(nickname) do with %User{} = user <- User.get_cached_by_nickname(nickname) do
{:ok, user} = User.deactivate(user, !user.info.deactivated) {:ok, user} = User.deactivate(user, !user.info.deactivated)
Common.shell_info( shell_info(
"Activation status of #{nickname}: #{if(user.info.deactivated, do: "de", else: "")}activated" "Activation status of #{nickname}: #{if(user.info.deactivated, do: "de", else: "")}activated"
) )
else else
_ -> _ ->
Common.shell_error("No user #{nickname}") shell_error("No user #{nickname}")
end end
end end
def run(["reset_password", nickname]) do def run(["reset_password", nickname]) do
Common.start_pleroma() start_pleroma()
with %User{local: true} = user <- User.get_cached_by_nickname(nickname), with %User{local: true} = user <- User.get_cached_by_nickname(nickname),
{:ok, token} <- Pleroma.PasswordResetToken.create_token(user) do {:ok, token} <- Pleroma.PasswordResetToken.create_token(user) do
Common.shell_info("Generated password reset token for #{user.nickname}") shell_info("Generated password reset token for #{user.nickname}")
IO.puts( IO.puts(
"URL: #{ "URL: #{
Pleroma.Web.Router.Helpers.util_url( Pleroma.Web.Router.Helpers.reset_password_url(
Pleroma.Web.Endpoint, Pleroma.Web.Endpoint,
:show_password_reset, :reset,
token.token token.token
) )
}" }"
) )
else else
_ -> _ ->
Common.shell_error("No local user #{nickname}") shell_error("No local user #{nickname}")
end end
end end
def run(["unsubscribe", nickname]) do def run(["unsubscribe", nickname]) do
Common.start_pleroma() start_pleroma()
with %User{} = user <- User.get_cached_by_nickname(nickname) do with %User{} = user <- User.get_cached_by_nickname(nickname) do
Common.shell_info("Deactivating #{user.nickname}") shell_info("Deactivating #{user.nickname}")
User.deactivate(user) User.deactivate(user)
{:ok, friends} = User.get_friends(user) {:ok, friends} = User.get_friends(user)
@ -224,7 +229,7 @@ def run(["unsubscribe", nickname]) do
Enum.each(friends, fn friend -> Enum.each(friends, fn friend ->
user = User.get_cached_by_id(user.id) user = User.get_cached_by_id(user.id)
Common.shell_info("Unsubscribing #{friend.nickname} from #{user.nickname}") shell_info("Unsubscribing #{friend.nickname} from #{user.nickname}")
User.unfollow(user, friend) User.unfollow(user, friend)
end) end)
@ -233,16 +238,16 @@ def run(["unsubscribe", nickname]) do
user = User.get_cached_by_id(user.id) user = User.get_cached_by_id(user.id)
if Enum.empty?(user.following) do if Enum.empty?(user.following) do
Common.shell_info("Successfully unsubscribed all followers from #{user.nickname}") shell_info("Successfully unsubscribed all followers from #{user.nickname}")
end end
else else
_ -> _ ->
Common.shell_error("No user #{nickname}") shell_error("No user #{nickname}")
end end
end end
def run(["set", nickname | rest]) do def run(["set", nickname | rest]) do
Common.start_pleroma() start_pleroma()
{options, [], []} = {options, [], []} =
OptionParser.parse( OptionParser.parse(
@ -274,33 +279,33 @@ def run(["set", nickname | rest]) do
end end
else else
_ -> _ ->
Common.shell_error("No local user #{nickname}") shell_error("No local user #{nickname}")
end end
end end
def run(["tag", nickname | tags]) do def run(["tag", nickname | tags]) do
Common.start_pleroma() start_pleroma()
with %User{} = user <- User.get_cached_by_nickname(nickname) do with %User{} = user <- User.get_cached_by_nickname(nickname) do
user = user |> User.tag(tags) user = user |> User.tag(tags)
Common.shell_info("Tags of #{user.nickname}: #{inspect(tags)}") shell_info("Tags of #{user.nickname}: #{inspect(tags)}")
else else
_ -> _ ->
Common.shell_error("Could not change user tags for #{nickname}") shell_error("Could not change user tags for #{nickname}")
end end
end end
def run(["untag", nickname | tags]) do def run(["untag", nickname | tags]) do
Common.start_pleroma() start_pleroma()
with %User{} = user <- User.get_cached_by_nickname(nickname) do with %User{} = user <- User.get_cached_by_nickname(nickname) do
user = user |> User.untag(tags) user = user |> User.untag(tags)
Common.shell_info("Tags of #{user.nickname}: #{inspect(tags)}") shell_info("Tags of #{user.nickname}: #{inspect(tags)}")
else else
_ -> _ ->
Common.shell_error("Could not change user tags for #{nickname}") shell_error("Could not change user tags for #{nickname}")
end end
end end
@ -321,14 +326,12 @@ def run(["invite" | rest]) do
end) end)
|> Enum.into(%{}) |> Enum.into(%{})
Common.start_pleroma() start_pleroma()
with {:ok, val} <- options[:expires_at], with {:ok, val} <- options[:expires_at],
options = Map.put(options, :expires_at, val), options = Map.put(options, :expires_at, val),
{:ok, invite} <- UserInviteToken.create_invite(options) do {:ok, invite} <- UserInviteToken.create_invite(options) do
Common.shell_info( shell_info("Generated user invite token " <> String.replace(invite.invite_type, "_", " "))
"Generated user invite token " <> String.replace(invite.invite_type, "_", " ")
)
url = url =
Pleroma.Web.Router.Helpers.redirect_url( Pleroma.Web.Router.Helpers.redirect_url(
@ -340,14 +343,14 @@ def run(["invite" | rest]) do
IO.puts(url) IO.puts(url)
else else
error -> error ->
Common.shell_error("Could not create invite token: #{inspect(error)}") shell_error("Could not create invite token: #{inspect(error)}")
end end
end end
def run(["invites"]) do def run(["invites"]) do
Common.start_pleroma() start_pleroma()
Common.shell_info("Invites list:") shell_info("Invites list:")
UserInviteToken.list_invites() UserInviteToken.list_invites()
|> Enum.each(fn invite -> |> Enum.each(fn invite ->
@ -361,7 +364,7 @@ def run(["invites"]) do
" | Max use: #{max_use} Left use: #{max_use - invite.uses}" " | Max use: #{max_use} Left use: #{max_use - invite.uses}"
end end
Common.shell_info( shell_info(
"ID: #{invite.id} | Token: #{invite.token} | Token type: #{invite.invite_type} | Used: #{ "ID: #{invite.id} | Token: #{invite.token} | Token type: #{invite.invite_type} | Used: #{
invite.used invite.used
}#{expire_info}#{using_info}" }#{expire_info}#{using_info}"
@ -370,40 +373,54 @@ def run(["invites"]) do
end end
def run(["revoke_invite", token]) do def run(["revoke_invite", token]) do
Common.start_pleroma() start_pleroma()
with {:ok, invite} <- UserInviteToken.find_by_token(token), with {:ok, invite} <- UserInviteToken.find_by_token(token),
{:ok, _} <- UserInviteToken.update_invite(invite, %{used: true}) do {:ok, _} <- UserInviteToken.update_invite(invite, %{used: true}) do
Common.shell_info("Invite for token #{token} was revoked.") shell_info("Invite for token #{token} was revoked.")
else else
_ -> Common.shell_error("No invite found with token #{token}") _ -> shell_error("No invite found with token #{token}")
end end
end end
def run(["delete_activities", nickname]) do def run(["delete_activities", nickname]) do
Common.start_pleroma() start_pleroma()
with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do
{:ok, _} = User.delete_user_activities(user) {:ok, _} = User.delete_user_activities(user)
Common.shell_info("User #{nickname} statuses deleted.") shell_info("User #{nickname} statuses deleted.")
else else
_ -> _ ->
Common.shell_error("No local user #{nickname}") shell_error("No local user #{nickname}")
end end
end end
def run(["toggle_confirmed", nickname]) do def run(["toggle_confirmed", nickname]) do
Common.start_pleroma() start_pleroma()
with %User{} = user <- User.get_cached_by_nickname(nickname) do with %User{} = user <- User.get_cached_by_nickname(nickname) do
{:ok, user} = User.toggle_confirmation(user) {:ok, user} = User.toggle_confirmation(user)
message = if user.info.confirmation_pending, do: "needs", else: "doesn't need" message = if user.info.confirmation_pending, do: "needs", else: "doesn't need"
Common.shell_info("#{nickname} #{message} confirmation.") shell_info("#{nickname} #{message} confirmation.")
else else
_ -> _ ->
Common.shell_error("No local user #{nickname}") shell_error("No local user #{nickname}")
end
end
def run(["sign_out", nickname]) do
start_pleroma()
with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do
OAuth.Token.delete_user_tokens(user)
OAuth.Authorization.delete_user_authorizations(user)
shell_info("#{nickname} signed out from all apps.")
else
_ ->
shell_error("No local user #{nickname}")
end end
end end
@ -416,7 +433,7 @@ defp set_moderator(user, value) do
{:ok, user} = User.update_and_set_cache(user_cng) {:ok, user} = User.update_and_set_cache(user_cng)
Common.shell_info("Moderator status of #{user.nickname}: #{user.info.is_moderator}") shell_info("Moderator status of #{user.nickname}: #{user.info.is_moderator}")
user user
end end
@ -429,7 +446,7 @@ defp set_admin(user, value) do
{:ok, user} = User.update_and_set_cache(user_cng) {:ok, user} = User.update_and_set_cache(user_cng)
Common.shell_info("Admin status of #{user.nickname}: #{user.info.is_admin}") shell_info("Admin status of #{user.nickname}: #{user.info.is_admin}")
user user
end end
@ -442,7 +459,7 @@ defp set_locked(user, value) do
{:ok, user} = User.update_and_set_cache(user_cng) {:ok, user} = User.update_and_set_cache(user_cng)
Common.shell_info("Locked status of #{user.nickname}: #{user.info.locked}") shell_info("Locked status of #{user.nickname}: #{user.info.locked}")
user user
end end
end end

View file

@ -11,8 +11,17 @@ def start_link do
def load_and_update_env do def load_and_update_env do
if Pleroma.Config.get([:instance, :dynamic_configuration]) and if Pleroma.Config.get([:instance, :dynamic_configuration]) and
Ecto.Adapters.SQL.table_exists?(Pleroma.Repo, "config") do Ecto.Adapters.SQL.table_exists?(Pleroma.Repo, "config") do
Pleroma.Repo.all(Config) for_restart =
|> Enum.each(&update_env(&1)) Pleroma.Repo.all(Config)
|> Enum.map(&update_env(&1))
# We need to restart applications for loaded settings take effect
for_restart
|> Enum.reject(&(&1 in [:pleroma, :ok]))
|> Enum.each(fn app ->
Application.stop(app)
:ok = Application.start(app)
end)
end end
end end
@ -25,11 +34,15 @@ defp update_env(setting) do
setting.key setting.key
end end
group = String.to_existing_atom(setting.group)
Application.put_env( Application.put_env(
:pleroma, group,
String.to_existing_atom(key), String.to_existing_atom(key),
Config.from_binary(setting.value) Config.from_binary(setting.value)
) )
group
rescue rescue
e -> e ->
require Logger require Logger

View file

@@ -23,13 +23,8 @@ defp recipient(email, nil), do: email
  defp recipient(email, name), do: {name, email}
  defp recipient(%Pleroma.User{} = user), do: recipient(user.email, user.name)
- def password_reset_email(user, password_reset_token) when is_binary(password_reset_token) do
- password_reset_url =
- Router.Helpers.util_url(
- Endpoint,
- :show_password_reset,
- password_reset_token
- )
+ def password_reset_email(user, token) when is_binary(token) do
+ password_reset_url = Router.Helpers.reset_password_url(Endpoint, :reset, token)
  html_body = """
  <h3>Reset your password at #{instance_name()}</h3>

View file

@@ -127,8 +127,7 @@ def dismiss(%{id: user_id} = _user, id) do
  end
  end
- def create_notifications(%Activity{data: %{"to" => _, "type" => type}} = activity)
-   when type in ["Create", "Like", "Announce", "Follow"] do
+ def create_notifications(%Activity{data: %{"to" => _, "type" => "Create"}} = activity) do
  object = Object.normalize(activity)
  unless object && object.data["type"] == "Answer" do
@@ -140,6 +139,13 @@ def create_notifications(%Activity{data: %{"to" => _, "type" => type}} = activit
  end
  end
+ def create_notifications(%Activity{data: %{"to" => _, "type" => type}} = activity)
+   when type in ["Like", "Announce", "Follow"] do
+   users = get_notified_from_activity(activity)
+   notifications = Enum.map(users, fn user -> create_notification(activity, user) end)
+   {:ok, notifications}
+ end
  def create_notifications(_), do: {:ok, []}
# TODO move to sql, too. # TODO move to sql, too.

View file

@ -37,6 +37,7 @@ def used_changeset(struct) do
|> put_change(:used, true) |> put_change(:used, true)
end end
@spec reset_password(binary(), map()) :: {:ok, User.t()} | {:error, binary()}
def reset_password(token, data) do def reset_password(token, data) do
with %{used: false} = token <- Repo.get_by(PasswordResetToken, %{token: token}), with %{used: false} = token <- Repo.get_by(PasswordResetToken, %{token: token}),
%User{} = user <- User.get_cached_by_id(token.user_id), %User{} = user <- User.get_cached_by_id(token.user_id),

View file

@ -17,6 +17,7 @@ def run(args) do
end end
defp mix_task(task, args) do defp mix_task(task, args) do
Application.load(:pleroma)
{:ok, modules} = :application.get_key(:pleroma, :modules) {:ok, modules} = :application.get_key(:pleroma, :modules)
module = module =
@ -43,6 +44,8 @@ def rollback(args) do
end end
def create do def create do
Application.load(:pleroma)
case @repo.__adapter__.storage_up(@repo.config) do case @repo.__adapter__.storage_up(@repo.config) do
:ok -> :ok ->
IO.puts("The database for #{inspect(@repo)} has been created") IO.puts("The database for #{inspect(@repo)} has been created")

View file

@ -0,0 +1,34 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.RepoStreamer do
alias Pleroma.Repo
import Ecto.Query
def chunk_stream(query, chunk_size) do
Stream.unfold(0, fn
:halt ->
{[], :halt}
last_id ->
query
|> order_by(asc: :id)
|> where([r], r.id > ^last_id)
|> limit(^chunk_size)
|> Repo.all()
|> case do
[] ->
{[], :halt}
records ->
last_id = List.last(records).id
{records, last_id}
end
end)
|> Stream.take_while(fn
[] -> false
_ -> true
end)
end
end
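`chunk_stream/2` pages through a query by ascending `id` instead of keeping one long `Repo.stream/1` transaction open. A minimal usage sketch, mirroring how `User.delete_user_activities/1` uses it later in this commit (the `ap_id` value is a placeholder):

```elixir
ap_id = "https://example.com/users/alice"  # placeholder actor

ap_id
|> Pleroma.Activity.query_by_actor()
|> Pleroma.RepoStreamer.chunk_stream(50)
|> Stream.each(fn chunk ->
  # each chunk is a list of up to 50 Activity structs
  IO.puts("processing #{length(chunk)} activities")
end)
|> Stream.run()
```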

View file

@@ -9,12 +9,14 @@ defmodule Pleroma.User do
  import Ecto.Query
  alias Comeonin.Pbkdf2
+ alias Ecto.Multi
  alias Pleroma.Activity
  alias Pleroma.Keys
  alias Pleroma.Notification
  alias Pleroma.Object
  alias Pleroma.Registration
  alias Pleroma.Repo
+ alias Pleroma.RepoStreamer
  alias Pleroma.User
  alias Pleroma.Web
  alias Pleroma.Web.ActivityPub.ActivityPub
@@ -193,27 +195,24 @@ def upgrade_changeset(struct, params \\ %{}) do
  end
  def password_update_changeset(struct, params) do
- changeset =
- struct
- |> cast(params, [:password, :password_confirmation])
- |> validate_required([:password, :password_confirmation])
- |> validate_confirmation(:password)
- OAuth.Token.delete_user_tokens(struct)
- OAuth.Authorization.delete_user_authorizations(struct)
- if changeset.valid? do
- hashed = Pbkdf2.hashpwsalt(changeset.changes[:password])
- changeset
- |> put_change(:password_hash, hashed)
- else
- changeset
- end
+ struct
+ |> cast(params, [:password, :password_confirmation])
+ |> validate_required([:password, :password_confirmation])
+ |> validate_confirmation(:password)
+ |> put_password_hash
  end
- def reset_password(user, data) do
- update_and_set_cache(password_update_changeset(user, data))
+ def reset_password(%User{id: user_id} = user, data) do
+ multi =
+ Multi.new()
+ |> Multi.update(:user, password_update_changeset(user, data))
+ |> Multi.delete_all(:tokens, OAuth.Token.Query.get_by_user(user_id))
+ |> Multi.delete_all(:auth, OAuth.Authorization.delete_by_user_query(user))
+ case Repo.transaction(multi) do
+ {:ok, %{user: user} = _} -> set_cache(user)
+ {:error, _, changeset, _} -> {:error, changeset}
+ end
  end
  def register_changeset(struct, params \\ %{}, opts \\ []) do
@@ -249,12 +248,11 @@ def register_changeset(struct, params \\ %{}, opts \\ []) do
  end
  if changeset.valid? do
- hashed = Pbkdf2.hashpwsalt(changeset.changes[:password])
  ap_id = User.ap_id(%User{nickname: changeset.changes[:nickname]})
  followers = User.ap_followers(%User{nickname: changeset.changes[:nickname]})
  changeset
- |> put_change(:password_hash, hashed)
+ |> put_password_hash
  |> put_change(:ap_id, ap_id)
  |> unique_constraint(:ap_id)
  |> put_change(:following, [followers])
@@ -932,18 +930,24 @@ def delete(%User{} = user),
  @spec perform(atom(), User.t()) :: {:ok, User.t()}
  def perform(:delete, %User{} = user) do
+ {:ok, user} = User.deactivate(user)
  # Remove all relationships
  {:ok, followers} = User.get_followers(user)
- Enum.each(followers, fn follower -> User.unfollow(follower, user) end)
+ Enum.each(followers, fn follower ->
+ ActivityPub.unfollow(follower, user)
+ User.unfollow(follower, user)
+ end)
  {:ok, friends} = User.get_friends(user)
- Enum.each(friends, fn followed -> User.unfollow(user, followed) end)
+ Enum.each(friends, fn followed ->
+ ActivityPub.unfollow(user, followed)
+ User.unfollow(user, followed)
+ end)
  delete_user_activities(user)
- {:ok, _user} = Repo.delete(user)
  end
  @spec perform(atom(), User.t()) :: {:ok, User.t()}
@@ -1016,18 +1020,35 @@ def follow_import(%User{} = follower, followed_identifiers) when is_list(followe
  ])
  def delete_user_activities(%User{ap_id: ap_id} = user) do
- stream =
- ap_id
- |> Activity.query_by_actor()
- |> Repo.stream()
- Repo.transaction(fn -> Enum.each(stream, &delete_activity(&1)) end, timeout: :infinity)
+ ap_id
+ |> Activity.query_by_actor()
+ |> RepoStreamer.chunk_stream(50)
+ |> Stream.each(fn activities ->
+ Enum.each(activities, &delete_activity(&1))
+ end)
+ |> Stream.run()
  {:ok, user}
  end
  defp delete_activity(%{data: %{"type" => "Create"}} = activity) do
- Object.normalize(activity) |> ActivityPub.delete()
+ activity
+ |> Object.normalize()
+ |> ActivityPub.delete()
  end
+ defp delete_activity(%{data: %{"type" => "Like"}} = activity) do
+ user = get_cached_by_ap_id(activity.actor)
+ object = Object.normalize(activity)
+ ActivityPub.unlike(user, object)
+ end
+ defp delete_activity(%{data: %{"type" => "Announce"}} = activity) do
+ user = get_cached_by_ap_id(activity.actor)
+ object = Object.normalize(activity)
+ ActivityPub.unannounce(user, object)
+ end
  defp delete_activity(_activity), do: "Doing nothing"
@@ -1325,4 +1346,12 @@ def get_ap_ids_by_nicknames(nicknames) do
  end
  defdelegate search(query, opts \\ []), to: User.Search
+ defp put_password_hash(
+   %Ecto.Changeset{valid?: true, changes: %{password: password}} = changeset
+ ) do
+ change(changeset, password_hash: Pbkdf2.hashpwsalt(password))
+ end
+ defp put_password_hash(changeset), do: changeset
  end

View file

@ -189,6 +189,22 @@ def stream_out_participations(participations) do
end) end)
end end
def stream_out_participations(%Object{data: %{"context" => context}}, user) do
with %Conversation{} = conversation <- Conversation.get_for_ap_id(context),
conversation = Repo.preload(conversation, :participations),
last_activity_id =
fetch_latest_activity_id_for_context(conversation.ap_id, %{
"user" => user,
"blocking_user" => user
}) do
if last_activity_id do
stream_out_participations(conversation.participations)
end
end
end
def stream_out_participations(_, _), do: :noop
def stream_out(activity) do def stream_out(activity) do
public = "https://www.w3.org/ns/activitystreams#Public" public = "https://www.w3.org/ns/activitystreams#Public"
@ -401,7 +417,8 @@ def delete(%Object{data: %{"id" => id, "actor" => actor}} = object, local \\ tru
"to" => to, "to" => to,
"deleted_activity_id" => activity && activity.id "deleted_activity_id" => activity && activity.id
}, },
{:ok, activity} <- insert(data, local), {:ok, activity} <- insert(data, local, false),
stream_out_participations(object, user),
_ <- decrease_replies_count_if_reply(object), _ <- decrease_replies_count_if_reply(object),
# Changing note count prior to enqueuing federation task in order to avoid # Changing note count prior to enqueuing federation task in order to avoid
# race conditions on updating user.info # race conditions on updating user.info

View file

@ -0,0 +1,48 @@
# Pleroma: A lightweight social networking server
# Copyright © 2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy do
alias Pleroma.User
require Logger
# has the user successfully posted before?
defp old_user?(%User{} = u) do
u.info.note_count > 0 || u.info.follower_count > 0
end
# does the post contain links?
defp contains_links?(%{"content" => content} = _object) do
content
|> Floki.filter_out("a.mention,a.hashtag,a[rel~=\"tag\"],a.zrl")
|> Floki.attribute("a", "href")
|> length() > 0
end
defp contains_links?(_), do: false
def filter(%{"type" => "Create", "actor" => actor, "object" => object} = message) do
with {:ok, %User{} = u} <- User.get_or_fetch_by_ap_id(actor),
{:contains_links, true} <- {:contains_links, contains_links?(object)},
{:old_user, true} <- {:old_user, old_user?(u)} do
{:ok, message}
else
{:contains_links, false} ->
{:ok, message}
{:old_user, false} ->
{:reject, nil}
{:error, _} ->
{:reject, nil}
e ->
Logger.warn("[MRF anti-link-spam] WTF: unhandled error #{inspect(e)}")
{:reject, nil}
end
end
# in all other cases, pass through
def filter(message), do: {:ok, message}
end
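To make the flow concrete, here is a hypothetical activity as the policy would see it. This is not runnable in isolation, since `User.get_or_fetch_by_ap_id/1` needs a running instance and database; the actor URL and content are made up:

```elixir
message = %{
  "type" => "Create",
  "actor" => "https://example.com/users/newcomer",   # hypothetical brand-new account
  "object" => %{
    "content" => ~s(hi! check out <a href="https://spam.example/">this site</a>)
  }
}

# With note_count == 0 and follower_count == 0 the actor is not an "old user",
# and the content contains a plain (non-mention, non-hashtag) link, so the
# policy returns {:reject, nil}. The same message from an established account
# returns {:ok, message}.
Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy.filter(message)
```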

View file

@@ -151,16 +151,18 @@ def get_notified_from_object(object) do
  def create_context(context) do
  context = context || generate_id("contexts")
- changeset = Object.context_mapping(context)
- case Repo.insert(changeset) do
- {:ok, object} ->
- object
- # This should be solved by an upsert, but it seems ecto
- # has problems accessing the constraint inside the jsonb.
- {:error, _} ->
- Object.get_cached_by_ap_id(context)
+ # Ecto has problems accessing the constraint inside the jsonb,
+ # so we explicitly check for the existed object before insert
+ object = Object.get_cached_by_ap_id(context)
+ with true <- is_nil(object),
+ changeset <- Object.context_mapping(context),
+ {:ok, inserted_object} <- Repo.insert(changeset) do
+ inserted_object
+ else
+ _ ->
+ object
  end
  end

View file

@@ -377,12 +377,12 @@ def config_update(conn, %{"configs" => configs}) do
  if Pleroma.Config.get([:instance, :dynamic_configuration]) do
  updated =
  Enum.map(configs, fn
- %{"key" => key, "value" => value} ->
- {:ok, config} = Config.update_or_create(%{key: key, value: value})
+ %{"group" => group, "key" => key, "value" => value} ->
+ {:ok, config} = Config.update_or_create(%{group: group, key: key, value: value})
  config
- %{"key" => key, "delete" => "true"} ->
- {:ok, _} = Config.delete(key)
+ %{"group" => group, "key" => key, "delete" => "true"} ->
+ {:ok, _} = Config.delete(%{group: group, key: key})
  nil
  end)
  |> Enum.reject(&is_nil(&1))

View file

@@ -12,26 +12,27 @@ defmodule Pleroma.Web.AdminAPI.Config do
  schema "config" do
  field(:key, :string)
+ field(:group, :string)
  field(:value, :binary)
  timestamps()
  end
- @spec get_by_key(String.t()) :: Config.t() | nil
- def get_by_key(key), do: Repo.get_by(Config, key: key)
+ @spec get_by_params(map()) :: Config.t() | nil
+ def get_by_params(params), do: Repo.get_by(Config, params)
  @spec changeset(Config.t(), map()) :: Changeset.t()
  def changeset(config, params \\ %{}) do
  config
- |> cast(params, [:key, :value])
- |> validate_required([:key, :value])
- |> unique_constraint(:key)
+ |> cast(params, [:key, :group, :value])
+ |> validate_required([:key, :group, :value])
+ |> unique_constraint(:key, name: :config_group_key_index)
  end
  @spec create(map()) :: {:ok, Config.t()} | {:error, Changeset.t()}
- def create(%{key: key, value: value}) do
+ def create(params) do
  %Config{}
- |> changeset(%{key: key, value: transform(value)})
+ |> changeset(Map.put(params, :value, transform(params[:value])))
  |> Repo.insert()
  end
@@ -43,20 +44,20 @@ def update(%Config{} = config, %{value: value}) do
  end
  @spec update_or_create(map()) :: {:ok, Config.t()} | {:error, Changeset.t()}
- def update_or_create(%{key: key} = params) do
- with %Config{} = config <- Config.get_by_key(key) do
+ def update_or_create(params) do
+ with %Config{} = config <- Config.get_by_params(Map.take(params, [:group, :key])) do
  Config.update(config, params)
  else
  nil -> Config.create(params)
  end
  end
- @spec delete(String.t()) :: {:ok, Config.t()} | {:error, Changeset.t()}
- def delete(key) do
- with %Config{} = config <- Config.get_by_key(key) do
+ @spec delete(map()) :: {:ok, Config.t()} | {:error, Changeset.t()}
+ def delete(params) do
+ with %Config{} = config <- Config.get_by_params(params) do
  Repo.delete(config)
  else
- nil -> {:error, "Config with key #{key} not found"}
+ nil -> {:error, "Config with params #{inspect(params)} not found"}
  end
  end
@@ -77,10 +78,21 @@ defp do_convert(values) when is_list(values), do: for(val <- values, do: do_conv
  defp do_convert({k, v} = value) when is_tuple(value),
  do: %{k => do_convert(v)}
- defp do_convert(value) when is_binary(value) or is_atom(value) or is_map(value),
- do: value
+ defp do_convert(value) when is_tuple(value), do: %{"tuple" => do_convert(Tuple.to_list(value))}
+ defp do_convert(value) when is_binary(value) or is_map(value) or is_number(value), do: value
+ defp do_convert(value) when is_atom(value) do
+ string = to_string(value)
+ if String.starts_with?(string, "Elixir."),
+ do: String.trim_leading(string, "Elixir."),
+ else: value
+ end
  @spec transform(any()) :: binary()
+ def transform(%{"tuple" => _} = entity), do: :erlang.term_to_binary(do_transform(entity))
  def transform(entity) when is_map(entity) do
  tuples =
  for {k, v} <- entity,
@@ -101,11 +113,16 @@ def transform(entity), do: :erlang.term_to_binary(entity)
  defp do_transform(%Regex{} = value) when is_map(value), do: value
+ defp do_transform(%{"tuple" => [k, values] = entity}) when length(entity) == 2 do
+ {do_transform(k), do_transform(values)}
+ end
+ defp do_transform(%{"tuple" => values}) do
+ Enum.reduce(values, {}, fn val, acc -> Tuple.append(acc, do_transform(val)) end)
+ end
  defp do_transform(value) when is_map(value) do
values = values = for {key, val} <- value, into: [], do: {String.to_atom(key), do_transform(val)}
for {key, val} <- value,
into: [],
do: {String.to_atom(key), do_transform(val)}
Enum.sort(values) Enum.sort(values)
end end
@ -117,28 +134,27 @@ defp do_transform(value) when is_list(value) do
defp do_transform(entity) when is_list(entity) and length(entity) == 1, do: hd(entity) defp do_transform(entity) when is_list(entity) and length(entity) == 1, do: hd(entity)
defp do_transform(value) when is_binary(value) do defp do_transform(value) when is_binary(value) do
value = String.trim(value) String.trim(value)
|> do_transform_string()
case String.length(value) do
0 ->
nil
_ ->
cond do
String.starts_with?(value, "Pleroma") ->
String.to_existing_atom("Elixir." <> value)
String.starts_with?(value, ":") ->
String.replace(value, ":", "") |> String.to_existing_atom()
String.starts_with?(value, "i:") ->
String.replace(value, "i:", "") |> String.to_integer()
true ->
value
end
end
end end
defp do_transform(value), do: value defp do_transform(value), do: value
defp do_transform_string(value) when byte_size(value) == 0, do: nil
defp do_transform_string(value) do
cond do
String.starts_with?(value, "Pleroma") or String.starts_with?(value, "Phoenix") ->
String.to_existing_atom("Elixir." <> value)
String.starts_with?(value, ":") ->
String.replace(value, ":", "") |> String.to_existing_atom()
String.starts_with?(value, "i:") ->
String.replace(value, "i:", "") |> String.to_integer()
true ->
value
end
end
end end
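To make the new coercion rules easier to follow, a small illustrative module (not part of the diff) that mirrors the cond in do_transform_string/1; the example values are assumptions for demonstration only:

defmodule TransformSketch do
  # Module names, ":atom" strings and "i:integer" strings are coerced,
  # everything else is passed through unchanged.
  def coerce(""), do: nil

  def coerce(value) do
    cond do
      String.starts_with?(value, "Pleroma") or String.starts_with?(value, "Phoenix") ->
        String.to_existing_atom("Elixir." <> value)

      String.starts_with?(value, ":") ->
        value |> String.replace(":", "") |> String.to_existing_atom()

      String.starts_with?(value, "i:") ->
        value |> String.replace("i:", "") |> String.to_integer()

      true ->
        value
    end
  end
end

# TransformSketch.coerce("i:60")           #=> 60
# TransformSketch.coerce(":upload")        #=> :upload
# TransformSketch.coerce("Pleroma.Upload") #=> Pleroma.Upload (atom must already exist)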

View file

@ -10,6 +10,7 @@ def render("index.json", %{configs: configs}) do
def render("show.json", %{config: config}) do def render("show.json", %{config: config}) do
%{ %{
key: config.key, key: config.key,
group: config.group,
value: Pleroma.Web.AdminAPI.Config.from_binary_to_map(config.value) value: Pleroma.Web.AdminAPI.Config.from_binary_to_map(config.value)
} }
end end

View file

@ -76,14 +76,16 @@ def use_token(%Authorization{used: false, valid_until: valid_until} = auth) do
def use_token(%Authorization{used: true}), do: {:error, "already used"} def use_token(%Authorization{used: true}), do: {:error, "already used"}
@spec delete_user_authorizations(User.t()) :: {integer(), any()} @spec delete_user_authorizations(User.t()) :: {integer(), any()}
def delete_user_authorizations(%User{id: user_id}) do def delete_user_authorizations(%User{} = user) do
from( user
a in Pleroma.Web.OAuth.Authorization, |> delete_by_user_query
where: a.user_id == ^user_id
)
|> Repo.delete_all() |> Repo.delete_all()
end end
def delete_by_user_query(%User{id: user_id}) do
from(a in __MODULE__, where: a.user_id == ^user_id)
end
@doc "gets auth for app by token" @doc "gets auth for app by token"
@spec get_by_token(App.t(), String.t()) :: {:ok, t()} | {:error, :not_found} @spec get_by_token(App.t(), String.t()) :: {:ok, t()} | {:error, :not_found}
def get_by_token(%App{id: app_id} = _app, token) do def get_by_token(%App{id: app_id} = _app, token) do
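A sketch of how the extracted query helper can be composed by callers, for example inside an Ecto.Multi during account deletion (the multi step name and surrounding aliases are assumptions):

alias Pleroma.Repo
alias Pleroma.Web.OAuth.Authorization

Ecto.Multi.new()
|> Ecto.Multi.delete_all(:delete_authorizations, Authorization.delete_by_user_query(user))
|> Repo.transaction()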

View file

@ -64,26 +64,34 @@ defp do_authorize(%Plug.Conn{} = conn, params) do
defp handle_existing_authorization( defp handle_existing_authorization(
%Plug.Conn{assigns: %{token: %Token{} = token}} = conn, %Plug.Conn{assigns: %{token: %Token{} = token}} = conn,
params %{"redirect_uri" => @oob_token_redirect_uri}
) do ) do
token = Repo.preload(token, :app) render(conn, "oob_token_exists.html", %{token: token})
end
defp handle_existing_authorization(
%Plug.Conn{assigns: %{token: %Token{} = token}} = conn,
%{} = params
) do
app = Repo.preload(token, :app).app
redirect_uri = redirect_uri =
if is_binary(params["redirect_uri"]) do if is_binary(params["redirect_uri"]) do
params["redirect_uri"] params["redirect_uri"]
else else
default_redirect_uri(token.app) default_redirect_uri(app)
end end
redirect_uri = redirect_uri(conn, redirect_uri) if redirect_uri in String.split(app.redirect_uris) do
redirect_uri = redirect_uri(conn, redirect_uri)
if redirect_uri == @oob_token_redirect_uri do
render(conn, "oob_token_exists.html", %{token: token})
else
url_params = %{access_token: token.token} url_params = %{access_token: token.token}
url_params = UriHelper.append_param_if_present(url_params, :state, params["state"]) url_params = UriHelper.append_param_if_present(url_params, :state, params["state"])
url = UriHelper.append_uri_params(redirect_uri, url_params) url = UriHelper.append_uri_params(redirect_uri, url_params)
redirect(conn, external: url) redirect(conn, external: url)
else
conn
|> put_flash(:error, "Unlisted redirect_uri.")
|> redirect(external: redirect_uri(conn, redirect_uri))
end end
end end
@ -100,18 +108,28 @@ def create_authorization(
end end
end end
def after_create_authorization(%Plug.Conn{} = conn, %Authorization{} = auth, %{
"authorization" => %{"redirect_uri" => @oob_token_redirect_uri}
}) do
render(conn, "oob_authorization_created.html", %{auth: auth})
end
def after_create_authorization(%Plug.Conn{} = conn, %Authorization{} = auth, %{ def after_create_authorization(%Plug.Conn{} = conn, %Authorization{} = auth, %{
"authorization" => %{"redirect_uri" => redirect_uri} = auth_attrs "authorization" => %{"redirect_uri" => redirect_uri} = auth_attrs
}) do }) do
redirect_uri = redirect_uri(conn, redirect_uri) app = Repo.preload(auth, :app).app
if redirect_uri == @oob_token_redirect_uri do # An extra safety measure before we redirect (also done in `do_create_authorization/2`)
render(conn, "oob_authorization_created.html", %{auth: auth}) if redirect_uri in String.split(app.redirect_uris) do
else redirect_uri = redirect_uri(conn, redirect_uri)
url_params = %{code: auth.token} url_params = %{code: auth.token}
url_params = UriHelper.append_param_if_present(url_params, :state, auth_attrs["state"]) url_params = UriHelper.append_param_if_present(url_params, :state, auth_attrs["state"])
url = UriHelper.append_uri_params(redirect_uri, url_params) url = UriHelper.append_uri_params(redirect_uri, url_params)
redirect(conn, external: url) redirect(conn, external: url)
else
conn
|> put_flash(:error, "Unlisted redirect_uri.")
|> redirect(external: redirect_uri(conn, redirect_uri))
end end
end end
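An illustrative run of the new unlisted-redirect guard, assuming an app registered with two space-separated redirect URIs (the URIs below are made up):

redirect_uris = "https://client.example/cb urn:ietf:wg:oauth:2.0:oob"

"https://client.example/cb" in String.split(redirect_uris)    #=> true
"https://attacker.example/cb" in String.split(redirect_uris)  #=> false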
@ -324,7 +342,7 @@ def callback(%Plug.Conn{} = conn, params) do
}) })
conn conn
|> put_session(:registration_id, registration.id) |> put_session_registration_id(registration.id)
|> registration_details(%{"authorization" => registration_params}) |> registration_details(%{"authorization" => registration_params})
end end
else else
@ -445,7 +463,7 @@ defp validate_scopes(app, params) do
|> Scopes.validates(app.scopes) |> Scopes.validates(app.scopes)
end end
defp default_redirect_uri(%App{} = app) do def default_redirect_uri(%App{} = app) do
app.redirect_uris app.redirect_uris
|> String.split() |> String.split()
|> Enum.at(0) |> Enum.at(0)

View file

@ -34,13 +34,15 @@ defp normalize_attributes(html_node, prefix, key_name, value_name) do
defp maybe_put_title(%{title: _} = meta, _), do: meta defp maybe_put_title(%{title: _} = meta, _), do: meta
defp maybe_put_title(meta, html) do defp maybe_put_title(meta, html) when meta != %{} do
case get_page_title(html) do case get_page_title(html) do
"" -> meta "" -> meta
title -> Map.put_new(meta, :title, title) title -> Map.put_new(meta, :title, title)
end end
end end
defp maybe_put_title(meta, _), do: meta
defp get_page_title(html) do defp get_page_title(html) do
Floki.find(html, "title") |> Floki.text() Floki.find(html, "title") |> Floki.text()
end end
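Illustrative behaviour of the new guard clause: an empty metadata map now passes through untouched instead of being seeded with a title (values are made up):

# maybe_put_title(%{}, html)                 #=> %{}
# maybe_put_title(%{description: "d"}, html) #=> %{description: "d", title: "Some page"}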

View file

@ -133,8 +133,8 @@ defmodule Pleroma.Web.Router do
scope "/api/pleroma", Pleroma.Web.TwitterAPI do scope "/api/pleroma", Pleroma.Web.TwitterAPI do
pipe_through(:pleroma_api) pipe_through(:pleroma_api)
get("/password_reset/:token", UtilController, :show_password_reset) get("/password_reset/:token", PasswordController, :reset, as: :reset_password)
post("/password_reset", UtilController, :password_reset) post("/password_reset", PasswordController, :do_reset, as: :reset_password)
get("/emoji", UtilController, :emoji) get("/emoji", UtilController, :emoji)
get("/captcha", UtilController, :captcha) get("/captcha", UtilController, :captcha)
get("/healthcheck", UtilController, :healthcheck) get("/healthcheck", UtilController, :healthcheck)

View file

@ -1,5 +1,5 @@
<h2>Password Reset for <%= @user.nickname %></h2> <h2>Password Reset for <%= @user.nickname %></h2>
<%= form_for @conn, util_path(@conn, :password_reset), [as: "data"], fn f -> %> <%= form_for @conn, reset_password_path(@conn, :do_reset), [as: "data"], fn f -> %>
<div class="form-row"> <div class="form-row">
<%= label f, :password, "Password" %> <%= label f, :password, "Password" %>
<%= password_input f, :password %> <%= password_input f, :password %>

View file

@ -0,0 +1,37 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.TwitterAPI.PasswordController do
@moduledoc """
The module contains functions for password reset.
"""
use Pleroma.Web, :controller
require Logger
alias Pleroma.PasswordResetToken
alias Pleroma.Repo
alias Pleroma.User
def reset(conn, %{"token" => token}) do
with %{used: false} = token <- Repo.get_by(PasswordResetToken, %{token: token}),
%User{} = user <- User.get_cached_by_id(token.user_id) do
render(conn, "reset.html", %{
token: token,
user: user
})
else
_e -> render(conn, "invalid_token.html")
end
end
def do_reset(conn, %{"data" => data}) do
with {:ok, _} <- PasswordResetToken.reset_password(data["token"], data) do
render(conn, "reset_success.html")
else
_e -> render(conn, "reset_failed.html")
end
end
end

View file

@ -11,8 +11,6 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
alias Pleroma.Activity alias Pleroma.Activity
alias Pleroma.Emoji alias Pleroma.Emoji
alias Pleroma.Notification alias Pleroma.Notification
alias Pleroma.PasswordResetToken
alias Pleroma.Repo
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web alias Pleroma.Web
alias Pleroma.Web.ActivityPub.ActivityPub alias Pleroma.Web.ActivityPub.ActivityPub
@ -20,26 +18,6 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
alias Pleroma.Web.OStatus alias Pleroma.Web.OStatus
alias Pleroma.Web.WebFinger alias Pleroma.Web.WebFinger
def show_password_reset(conn, %{"token" => token}) do
with %{used: false} = token <- Repo.get_by(PasswordResetToken, %{token: token}),
%User{} = user <- User.get_cached_by_id(token.user_id) do
render(conn, "password_reset.html", %{
token: token,
user: user
})
else
_e -> render(conn, "invalid_token.html")
end
end
def password_reset(conn, %{"data" => data}) do
with {:ok, _} <- PasswordResetToken.reset_password(data["token"], data) do
render(conn, "password_reset_success.html")
else
_e -> render(conn, "password_reset_failed.html")
end
end
def help_test(conn, _params) do def help_test(conn, _params) do
json(conn, "ok") json(conn, "ok")
end end

View file

@ -0,0 +1,8 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.TwitterAPI.PasswordView do
use Pleroma.Web, :view
import Phoenix.HTML.Form
end

11
mix.exs
View file

@ -37,14 +37,14 @@ def project do
pleroma: [ pleroma: [
include_executables_for: [:unix], include_executables_for: [:unix],
applications: [ex_syslogger: :load, syslog: :load], applications: [ex_syslogger: :load, syslog: :load],
steps: [:assemble, &copy_pleroma_ctl/1] steps: [:assemble, &copy_files/1]
] ]
] ]
] ]
end end
def copy_pleroma_ctl(%{path: target_path} = release) do def copy_files(%{path: target_path} = release) do
File.cp!("./rel/pleroma_ctl", Path.join([target_path, "bin", "pleroma_ctl"])) File.cp_r!("./rel/files", target_path)
release release
end end
@ -108,7 +108,7 @@ defp deps do
{:ex_aws, "~> 2.0"}, {:ex_aws, "~> 2.0"},
{:ex_aws_s3, "~> 2.0"}, {:ex_aws_s3, "~> 2.0"},
{:earmark, "~> 1.3"}, {:earmark, "~> 1.3"},
{:bbcode, "~> 0.1"}, {:bbcode, "~> 0.1.1"},
{:ex_machina, "~> 2.3", only: :test}, {:ex_machina, "~> 2.3", only: :test},
{:credo, "~> 0.9.3", only: [:dev, :test]}, {:credo, "~> 0.9.3", only: [:dev, :test]},
{:mock, "~> 0.3.3", only: :test}, {:mock, "~> 0.3.3", only: :test},
@ -209,10 +209,11 @@ defp version(version) do
branch_name = branch_name =
with {branch_name, 0} <- System.cmd("git", ["rev-parse", "--abbrev-ref", "HEAD"]), with {branch_name, 0} <- System.cmd("git", ["rev-parse", "--abbrev-ref", "HEAD"]),
branch_name <- System.get_env("PLEROMA_BUILD_BRANCH") || branch_name,
true <- branch_name != "master" do true <- branch_name != "master" do
branch_name = branch_name =
String.trim(branch_name) String.trim(branch_name)
|> String.replace(~r/\W+/, "-") |> String.replace(~r/[^0-9a-z\-\.]+/i, "-")
"-" <> branch_name "-" <> branch_name
end end
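A worked example of the new sanitisation, assuming the branch name resolves to "feature/config_groups" (either from $PLEROMA_BUILD_BRANCH or from git rev-parse):

branch_name = "feature/config_groups"

"-" <> String.replace(branch_name, ~r/[^0-9a-z\-\.]+/i, "-")
#=> "-feature-config-groups"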

View file

@ -2,7 +2,7 @@
"accept": {:hex, :accept, "0.3.5", "b33b127abca7cc948bbe6caa4c263369abf1347cfa9d8e699c6d214660f10cd1", [:rebar3], [], "hexpm"}, "accept": {:hex, :accept, "0.3.5", "b33b127abca7cc948bbe6caa4c263369abf1347cfa9d8e699c6d214660f10cd1", [:rebar3], [], "hexpm"},
"auto_linker": {:git, "https://git.pleroma.social/pleroma/auto_linker.git", "95e8188490e97505c56636c1379ffdf036c1fdde", [ref: "95e8188490e97505c56636c1379ffdf036c1fdde"]}, "auto_linker": {:git, "https://git.pleroma.social/pleroma/auto_linker.git", "95e8188490e97505c56636c1379ffdf036c1fdde", [ref: "95e8188490e97505c56636c1379ffdf036c1fdde"]},
"base64url": {:hex, :base64url, "0.0.1", "36a90125f5948e3afd7be97662a1504b934dd5dac78451ca6e9abf85a10286be", [:rebar], [], "hexpm"}, "base64url": {:hex, :base64url, "0.0.1", "36a90125f5948e3afd7be97662a1504b934dd5dac78451ca6e9abf85a10286be", [:rebar], [], "hexpm"},
"bbcode": {:hex, :bbcode, "0.1.0", "400e618b640b635261611d7fb7f79d104917fc5b084aae371ab6b08477cb035b", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"}, "bbcode": {:hex, :bbcode, "0.1.1", "0023e2c7814119b2e620b7add67182e3f6019f92bfec9a22da7e99821aceba70", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"},
"benchee": {:hex, :benchee, "1.0.1", "66b211f9bfd84bd97e6d1beaddf8fc2312aaabe192f776e8931cb0c16f53a521", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}], "hexpm"}, "benchee": {:hex, :benchee, "1.0.1", "66b211f9bfd84bd97e6d1beaddf8fc2312aaabe192f776e8931cb0c16f53a521", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}], "hexpm"},
"bunt": {:hex, :bunt, "0.2.0", "951c6e801e8b1d2cbe58ebbd3e616a869061ddadcc4863d0a2182541acae9a38", [:mix], [], "hexpm"}, "bunt": {:hex, :bunt, "0.2.0", "951c6e801e8b1d2cbe58ebbd3e616a869061ddadcc4863d0a2182541acae9a38", [:mix], [], "hexpm"},
"cachex": {:hex, :cachex, "3.0.2", "1351caa4e26e29f7d7ec1d29b53d6013f0447630bbf382b4fb5d5bad0209f203", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm"}, "cachex": {:hex, :cachex, "3.0.2", "1351caa4e26e29f7d7ec1d29b53d6013f0447630bbf382b4fb5d5bad0209f203", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm"},

View file

@ -0,0 +1,12 @@
defmodule Pleroma.Repo.Migrations.AddGroupKeyToConfig do
use Ecto.Migration
def change do
alter table("config") do
add(:group, :string)
end
drop(unique_index("config", :key))
create(unique_index("config", [:group, :key]))
end
end

View file

@ -0,0 +1 @@
.status-account{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-align:center;-ms-flex-align:center;align-items:center}.status-avatar-img{width:15px;height:15px;margin-right:5px}.status-account-name{margin:0;height:22px}.status-body{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column}.status-content{font-size:15px}.status-card{margin-bottom:15px}.status-header{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.el-message{min-width:80%}.el-message-box{width:80%}.status-card .el-card__header{padding:10px 17px}.status-card .el-tag{margin:3px 4px 3px 0}.status-card .status-account-container{margin-bottom:5px}.status-card .status-actions-button{margin:3px 0}.status-card .status-actions{display:-webkit-box;display:-ms-flexbox;display:flex;-ms-flex-wrap:wrap;flex-wrap:wrap}.status-card .status-header{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column}}.account{text-decoration:underline}.avatar-img{vertical-align:bottom;width:15px;height:15px;margin-left:5px}.el-card__body{padding:17px}.el-card__header{background-color:#fafafa;padding:10px 20px}.el-collapse{border-bottom:none}.el-collapse-item__header{height:46px;font-size:14px}.el-collapse-item__content{padding-bottom:7px}.el-icon-arrow-right{margin-right:6px}.el-icon-close{padding:10px 5px 10px 10px;cursor:pointer}h4{margin:0;height:17px}.header-container{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:baseline;-ms-flex-align:baseline;align-items:baseline;height:40px}.id{color:grey;margin-top:6px}.line{width:100%;height:0;border:.5px solid #ebeef5;margin:15px 0}.new-note p{font-size:14px;font-weight:500;height:17px;margin:13px 0 7px}.note{-webkit-box-shadow:0 2px 5px 0 rgba(0,0,0,.1);box-shadow:0 2px 5px 0 rgba(0,0,0,.1);margin-bottom:10px}.no-notes{font-style:italic;color:grey}.report-row-key{font-weight:500;font-size:14px}.report-title{margin:0}.statuses{margin-top:15px}.submit-button{display:block;margin:7px 0 17px auto}.timestamp{margin:0;font-style:italic;color:grey}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.timeline-item-container .header-container{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;height:80px}.timeline-item-container .id{margin:6px 0 0}}.select-field[data-v-bb4390da]{width:350px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.select-field[data-v-bb4390da]{width:100%;margin-bottom:5px}}.reports-container .el-timeline[data-v-e32c7dc6]{margin:45px 45px 45px 19px;padding:0}.reports-container .filter-container[data-v-e32c7dc6]{margin:22px 15px;padding-bottom:0}.reports-container h1[data-v-e32c7dc6]{margin:22px 0 0 15px}.reports-container .no-reports-message[data-v-e32c7dc6]{color:grey;margin-left:19px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.reports-container h1[data-v-e32c7dc6]{margin:7px 10px 15px}.reports-container .filter-container[data-v-e32c7dc6]{margin:0 
10px}.reports-container .timeline[data-v-e32c7dc6]{margin:20px 20px 20px 18px}}

View file

@ -0,0 +1 @@
.select-field[data-v-71bc6b38]{width:350px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.select-field[data-v-71bc6b38]{width:100%;margin-bottom:5px}}.actions-button[data-v-94227b1e]{text-align:left;width:350px;padding:10px}.actions-button-container[data-v-94227b1e]{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between}.el-dropdown[data-v-94227b1e]{float:right}.el-icon-edit[data-v-94227b1e]{margin-right:5px}.tag-container[data-v-94227b1e]{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center}.tag-text[data-v-94227b1e]{padding-right:20px}.no-hover[data-v-94227b1e]:hover{color:#606266;background-color:#fff;cursor:auto}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.create-user-dialog{width:80%}.create-account-form-item{margin-bottom:30px}.el-dialog__body{padding:20px 20px 0}}.actions-button[data-v-3ffddd00]{text-align:left;width:350px;padding:10px}.actions-container[data-v-3ffddd00]{display:-webkit-box;display:-ms-flexbox;display:flex;height:36px;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:0 15px 10px}.active-tag[data-v-3ffddd00]{color:#409eff;font-weight:700}.active-tag .el-icon-check[data-v-3ffddd00]{color:#409eff;float:right;margin:7px 0 0 15px}.el-dropdown-link[data-v-3ffddd00]:hover{cursor:pointer;color:#409eff}.el-icon-plus[data-v-3ffddd00]{margin-right:5px}.users-container h1[data-v-3ffddd00]{margin:22px 0 0 15px}.users-container .pagination[data-v-3ffddd00]{margin:25px 0;text-align:center}.users-container .search[data-v-3ffddd00]{width:350px;float:right}.users-container .filter-container[data-v-3ffddd00]{display:-webkit-box;display:-ms-flexbox;display:flex;height:36px;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:22px 15px 15px}.users-container .user-count[data-v-3ffddd00]{color:grey;font-size:28px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.users-container h1[data-v-3ffddd00]{margin:7px 10px 15px}.users-container .actions-container[data-v-3ffddd00]{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;margin:0 10px 7px}.users-container .create-account[data-v-3ffddd00]{width:100%}.users-container .el-icon-arrow-down[data-v-3ffddd00]{font-size:12px}.users-container .search[data-v-3ffddd00]{width:100%}.users-container .filter-container[data-v-3ffddd00]{display:-webkit-box;display:-ms-flexbox;display:flex;height:82px;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;margin:0 10px}.users-container .el-tag[data-v-3ffddd00]{width:30px;display:inline-block;margin-bottom:4px;font-weight:700}.users-container .el-tag.el-tag--danger[data-v-3ffddd00],.users-container .el-tag.el-tag--success[data-v-3ffddd00]{padding-left:8px}}

File diff suppressed because one or more lines are too long

View file

@ -1 +1 @@
<!DOCTYPE html><html><head><meta charset=utf-8><meta http-equiv=X-UA-Compatible content="IE=edge,chrome=1"><meta name=renderer content=webkit><meta name=viewport content="width=device-width,initial-scale=1,maximum-scale=1,user-scalable=no"><title>Admin FE</title><link rel="shortcut icon" href=favicon.ico><link href=static/css/chunk-elementUI.4296cedf.css rel=stylesheet><link href=static/css/chunk-libs.bd17d456.css rel=stylesheet><link href=static/css/app.cea15678.css rel=stylesheet></head><body><script src=/pleroma/admin/static/tinymce4.7.5/tinymce.min.js></script><div id=app></div><script type=text/javascript src=static/js/runtime.7144b2cf.js></script><script type=text/javascript src=static/js/chunk-elementUI.d388c21d.js></script><script type=text/javascript src=static/js/chunk-libs.48e79a9e.js></script><script type=text/javascript src=static/js/app.25699e3d.js></script></body></html> <!DOCTYPE html><html><head><meta charset=utf-8><meta http-equiv=X-UA-Compatible content="IE=edge,chrome=1"><meta name=renderer content=webkit><meta name=viewport content="width=device-width,initial-scale=1,maximum-scale=1,user-scalable=no"><title>Admin FE</title><link rel="shortcut icon" href=favicon.ico><link href=chunk-elementUI.f74c256b.css rel=stylesheet><link href=chunk-libs.4e8c4664.css rel=stylesheet><link href=app.34fc670f.css rel=stylesheet></head><body><script src=/pleroma/admin/static/tinymce4.7.5/tinymce.min.js></script><div id=app></div><script type=text/javascript src=static/js/runtime.d8d12c12.js></script><script type=text/javascript src=static/js/chunk-elementUI.1fa5434b.js></script><script type=text/javascript src=static/js/chunk-libs.d5609760.js></script><script type=text/javascript src=static/js/app.4137ad8f.js></script></body></html>

View file

@ -1 +0,0 @@
.select-field[data-v-71bc6b38]{width:350px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.select-field[data-v-71bc6b38]{width:100%;margin-bottom:5px}}.active-tag[data-v-693dba04]{color:#409eff;font-weight:700}.active-tag .el-icon-check[data-v-693dba04]{color:#409eff;float:right;margin:7px 0 0 15px}.users-container h1[data-v-693dba04]{margin:22px 0 0 15px}.users-container .pagination[data-v-693dba04]{margin:25px 0;text-align:center}.users-container .search[data-v-693dba04]{width:350px;float:right}.users-container .search-container[data-v-693dba04]{display:-webkit-box;display:-ms-flexbox;display:flex;height:36px;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:22px 15px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.users-container h1[data-v-693dba04]{margin:7px 10px}.users-container .el-dropdown-link[data-v-693dba04]{cursor:pointer;color:#409eff}.users-container .el-icon-arrow-down[data-v-693dba04]{font-size:12px}.users-container .search[data-v-693dba04]{width:100%}.users-container .search-container[data-v-693dba04]{display:-webkit-box;display:-ms-flexbox;display:flex;height:82px;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;margin:0 10px 7px}.users-container .el-tag[data-v-693dba04]{width:30px;display:inline-block;margin-bottom:4px;font-weight:700}.users-container .el-tag.el-tag--danger[data-v-693dba04],.users-container .el-tag.el-tag--success[data-v-693dba04]{padding-left:8px}}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View file

@ -1 +0,0 @@
!function(e){function t(t){for(var r,o,c=t[0],i=t[1],f=t[2],l=0,d=[];l<c.length;l++)o=c[l],u[o]&&d.push(u[o][0]),u[o]=0;for(r in i)Object.prototype.hasOwnProperty.call(i,r)&&(e[r]=i[r]);for(s&&s(t);d.length;)d.shift()();return a.push.apply(a,f||[]),n()}function n(){for(var e,t=0;t<a.length;t++){for(var n=a[t],r=!0,o=1;o<n.length;o++){var i=n[o];0!==u[i]&&(r=!1)}r&&(a.splice(t--,1),e=c(c.s=n[0]))}return e}var r={},o={runtime:0},u={runtime:0},a=[];function c(t){if(r[t])return r[t].exports;var n=r[t]={i:t,l:!1,exports:{}};return e[t].call(n.exports,n,n.exports,c),n.l=!0,n.exports}c.e=function(e){var t=[];o[e]?t.push(o[e]):0!==o[e]&&{"chunk-18e1":1,"chunk-50cf":1,"chunk-8b70":1,"chunk-f018":1}[e]&&t.push(o[e]=new Promise(function(t,n){for(var r="static/css/"+({}[e]||e)+"."+{"7zzA":"31d6cfe0",JEtC:"31d6cfe0","chunk-18e1":"6aaab273","chunk-50cf":"1db1ed5b","chunk-8b70":"9ba0945c","chunk-f018":"0d22684d"}[e]+".css",o=c.p+r,u=document.getElementsByTagName("link"),a=0;a<u.length;a++){var i=(l=u[a]).getAttribute("data-href")||l.getAttribute("href");if("stylesheet"===l.rel&&(i===r||i===o))return t()}var f=document.getElementsByTagName("style");for(a=0;a<f.length;a++){var l;if((i=(l=f[a]).getAttribute("data-href"))===r||i===o)return t()}var s=document.createElement("link");s.rel="stylesheet",s.type="text/css",s.onload=t,s.onerror=function(t){var r=t&&t.target&&t.target.src||o,u=new Error("Loading CSS chunk "+e+" failed.\n("+r+")");u.request=r,n(u)},s.href=o,document.getElementsByTagName("head")[0].appendChild(s)}).then(function(){o[e]=0}));var n=u[e];if(0!==n)if(n)t.push(n[2]);else{var r=new Promise(function(t,r){n=u[e]=[t,r]});t.push(n[2]=r);var a,i=document.createElement("script");i.charset="utf-8",i.timeout=120,c.nc&&i.setAttribute("nonce",c.nc),i.src=function(e){return c.p+"static/js/"+({}[e]||e)+"."+{"7zzA":"e1ae1c94",JEtC:"f9ba4594","chunk-18e1":"7f9c377c","chunk-50cf":"b9b1df43","chunk-8b70":"46525646","chunk-f018":"e1a7a454"}[e]+".js"}(e),a=function(t){i.onerror=i.onload=null,clearTimeout(f);var n=u[e];if(0!==n){if(n){var r=t&&("load"===t.type?"missing":t.type),o=t&&t.target&&t.target.src,a=new Error("Loading chunk "+e+" failed.\n("+r+": "+o+")");a.type=r,a.request=o,n[1](a)}u[e]=void 0}};var f=setTimeout(function(){a({type:"timeout",target:i})},12e4);i.onerror=i.onload=a,document.head.appendChild(i)}return Promise.all(t)},c.m=e,c.c=r,c.d=function(e,t,n){c.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:n})},c.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},c.t=function(e,t){if(1&t&&(e=c(e)),8&t)return e;if(4&t&&"object"==typeof e&&e&&e.__esModule)return e;var n=Object.create(null);if(c.r(n),Object.defineProperty(n,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var r in e)c.d(n,r,function(t){return e[t]}.bind(null,r));return n},c.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return c.d(t,"a",t),t},c.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},c.p="",c.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],f=i.push.bind(i);i.push=t,i=i.slice();for(var l=0;l<i.length;l++)t(i[l]);var s=f;n()}([]);

View file

@ -0,0 +1 @@
!function(e){function t(t){for(var r,o,a=t[0],i=t[1],f=t[2],l=0,h=[];l<a.length;l++)o=a[l],u[o]&&h.push(u[o][0]),u[o]=0;for(r in i)Object.prototype.hasOwnProperty.call(i,r)&&(e[r]=i[r]);for(s&&s(t);h.length;)h.shift()();return c.push.apply(c,f||[]),n()}function n(){for(var e,t=0;t<c.length;t++){for(var n=c[t],r=!0,o=1;o<n.length;o++){var i=n[o];0!==u[i]&&(r=!1)}r&&(c.splice(t--,1),e=a(a.s=n[0]))}return e}var r={},o={runtime:0},u={runtime:0},c=[];function a(t){if(r[t])return r[t].exports;var n=r[t]={i:t,l:!1,exports:{}};return e[t].call(n.exports,n,n.exports,a),n.l=!0,n.exports}a.e=function(e){var t=[];o[e]?t.push(o[e]):0!==o[e]&&{"chunk-56c9":1,"chunk-5eaf":1,"chunk-18e1":1,"chunk-8b70":1,"chunk-f018":1}[e]&&t.push(o[e]=new Promise(function(t,n){for(var r=({}[e]||e)+"."+{"7zzA":"31d6cfe0",JEtC:"31d6cfe0","chunk-02a0":"31d6cfe0","chunk-56c9":"c27dac5e","chunk-0620":"31d6cfe0","chunk-5eaf":"1a04e979","chunk-18e1":"6aaab273","chunk-8b70":"9ba0945c","chunk-f018":"0d22684d"}[e]+".css",o=a.p+r,u=document.getElementsByTagName("link"),c=0;c<u.length;c++){var i=(l=u[c]).getAttribute("data-href")||l.getAttribute("href");if("stylesheet"===l.rel&&(i===r||i===o))return t()}var f=document.getElementsByTagName("style");for(c=0;c<f.length;c++){var l;if((i=(l=f[c]).getAttribute("data-href"))===r||i===o)return t()}var s=document.createElement("link");s.rel="stylesheet",s.type="text/css",s.onload=t,s.onerror=function(t){var r=t&&t.target&&t.target.src||o,u=new Error("Loading CSS chunk "+e+" failed.\n("+r+")");u.request=r,n(u)},s.href=o,document.getElementsByTagName("head")[0].appendChild(s)}).then(function(){o[e]=0}));var n=u[e];if(0!==n)if(n)t.push(n[2]);else{var r=new Promise(function(t,r){n=u[e]=[t,r]});t.push(n[2]=r);var c,i=document.createElement("script");i.charset="utf-8",i.timeout=120,a.nc&&i.setAttribute("nonce",a.nc),i.src=function(e){return a.p+"static/js/"+({}[e]||e)+"."+{"7zzA":"e1ae1c94",JEtC:"f9ba4594","chunk-02a0":"db6ec114","chunk-56c9":"28e35fc3","chunk-0620":"c765c190","chunk-5eaf":"5b76e416","chunk-18e1":"7f9c377c","chunk-8b70":"46525646","chunk-f018":"e1a7a454"}[e]+".js"}(e),c=function(t){i.onerror=i.onload=null,clearTimeout(f);var n=u[e];if(0!==n){if(n){var r=t&&("load"===t.type?"missing":t.type),o=t&&t.target&&t.target.src,c=new Error("Loading chunk "+e+" failed.\n("+r+": "+o+")");c.type=r,c.request=o,n[1](c)}u[e]=void 0}};var f=setTimeout(function(){c({type:"timeout",target:i})},12e4);i.onerror=i.onload=c,document.head.appendChild(i)}return Promise.all(t)},a.m=e,a.c=r,a.d=function(e,t,n){a.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:n})},a.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},a.t=function(e,t){if(1&t&&(e=a(e)),8&t)return e;if(4&t&&"object"==typeof e&&e&&e.__esModule)return e;var n=Object.create(null);if(a.r(n),Object.defineProperty(n,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var r in e)a.d(n,r,function(t){return e[t]}.bind(null,r));return n},a.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return a.d(t,"a",t),t},a.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},a.p="",a.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],f=i.push.bind(i);i.push=t,i=i.slice();for(var l=0;l<i.length;l++)t(i[l]);var s=f;n()}([]);

View file

@ -3,7 +3,11 @@
# NOTE: This file should not be committed to a repo or otherwise made public # NOTE: This file should not be committed to a repo or otherwise made public
# without removing sensitive information. # without removing sensitive information.
use Mix.Config <%= if Code.ensure_loaded?(Config) or not Code.ensure_loaded?(Mix.Config) do
"import Config"
else
"use Mix.Config"
end %>
config :pleroma, Pleroma.Web.Endpoint, config :pleroma, Pleroma.Web.Endpoint,
url: [host: "<%= domain %>", scheme: "https", port: <%= port %>], url: [host: "<%= domain %>", scheme: "https", port: <%= port %>],
@ -16,7 +20,6 @@ config :pleroma, :instance,
notify_email: "<%= notify_email %>", notify_email: "<%= notify_email %>",
limit: 5000, limit: 5000,
registrations_open: true, registrations_open: true,
dedupe_media: false,
dynamic_configuration: <%= db_configurable? %> dynamic_configuration: <%= db_configurable? %>
config :pleroma, :media_proxy, config :pleroma, :media_proxy,
@ -38,6 +41,10 @@ config :web_push_encryption, :vapid_details,
public_key: "<%= web_push_public_key %>", public_key: "<%= web_push_public_key %>",
private_key: "<%= web_push_private_key %>" private_key: "<%= web_push_private_key %>"
config :pleroma, :database, rum_enabled: <%= rum_enabled %>
config :pleroma, :instance, static_dir: "<%= static_dir %>"
config :pleroma, Pleroma.Uploaders.Local, uploads: "<%= uploads_dir %>"
# Enable Strict-Transport-Security once SSL is working: # Enable Strict-Transport-Security once SSL is working:
# config :pleroma, :http_security, # config :pleroma, :http_security,
# sts: true # sts: true

View file

@ -5,3 +5,8 @@ CREATE DATABASE <%= dbname %> OWNER <%= dbuser %>;
CREATE EXTENSION IF NOT EXISTS citext; CREATE EXTENSION IF NOT EXISTS citext;
CREATE EXTENSION IF NOT EXISTS pg_trgm; CREATE EXTENSION IF NOT EXISTS pg_trgm;
CREATE EXTENSION IF NOT EXISTS "uuid-ossp"; CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
<%= if rum_enabled do
"CREATE EXTENSION IF NOT EXISTS rum;"
else
""
end %>

118
rel/files/bin/pleroma_ctl Executable file
View file

@ -0,0 +1,118 @@
#!/bin/sh
# XXX: This should be removed when elixir's releases get custom command support
detect_flavour() {
arch="$(arch)"
if [ "$arch" = "x86_64" ]; then
arch="amd64"
elif [ "$arch" = "armv7l" ]; then
arch="arm"
elif [ "$arch" = "aarch64" ]; then
arch="arm64"
else
echo "Unsupported arch: $arch" >&2
exit 1
fi
if getconf GNU_LIBC_VERSION >/dev/null; then
libc_postfix=""
elif [ "$(ldd 2>&1 | head -c 9)" = "musl libc" ]; then
libc_postfix="-musl"
elif [ "$(find /lib/libc.musl* | wc -l)" ]; then
libc_postfix="-musl"
else
echo "Unsupported libc" >&2
exit 1
fi
echo "$arch$libc_postfix"
}
detect_branch() {
version="$(cut -d' ' -f2 <"$RELEASE_ROOT"/releases/start_erl.data)"
branch="$(echo "$version" | cut -d'-' -f 4)"
if [ "$branch" = "develop" ]; then
echo "develop"
elif [ "$branch" = "" ]; then
echo "master"
else
echo "Releases are built only for master and develop branches" >&2
exit 1
fi
}
update() {
set -e
RELEASE_ROOT=$(dirname "$SCRIPTPATH")
uri="${PLEROMA_CTL_URI:-https://git.pleroma.social}"
project_id="${PLEROMA_CTL_PROJECT_ID:-2}"
project_branch="$(detect_branch)"
flavour="${PLEROMA_CTL_FLAVOUR:-$(detect_flavour)}"
echo "Detected flavour: $flavour"
tmp="${PLEROMA_CTL_TMP_DIR:-/tmp}"
artifact="$tmp/pleroma.zip"
full_uri="${uri}/api/v4/projects/${project_id}/jobs/artifacts/${project_branch}/download?job=${flavour}"
echo "Downloading the artifact from ${full_uri} to ${artifact}"
curl "$full_uri" -o "${artifact}"
echo "Unpacking ${artifact} to ${tmp}"
unzip -q "$artifact" -d "$tmp"
echo "Copying files over to $RELEASE_ROOT"
if [ "$1" != "--no-rm" ]; then
rm -r "${RELEASE_ROOT:-?}"/*
fi
cp -rf "$tmp/release"/* "$RELEASE_ROOT"
echo "Removing temporary files"
rm -r "$tmp/release"
rm "$artifact"
echo "Done! Please refer to the changelog/release notes for changes and update instructions"
set +e
}
if [ -z "$1" ] || [ "$1" = "help" ]; then
# TODO: Just list the commands on `pleroma_ctl help` and output help for the individual command on `pleroma_ctl help $COMMAND`
echo "Usage: $(basename "$0") COMMAND [ARGS]
The known commands are:
create
Create database schema (needs to be executed only once)
migrate
Execute database migrations (needs to be done after updates)
rollback [VERSION]
Rollback database migrations (needs to be done before downgrading)
update [OPTIONS]
Update the instance using the latest CI artifact for the current branch.
The only supported option is --no-rm, when set the script won't delete the whole directory, but
just force copy over files from the new release. This wastes more space, but may be useful if
some files are stored inside of the release directories (although you really shouldn't store them
there), or if you want to be able to quickly revert a broken update.
The script will try to detect your architecture and ABI and set a flavour automatically,
but if it is wrong, you can overwrite it by setting PLEROMA_CTL_FLAVOUR to the desired flavour.
By default the artifact will be downloaded from https://git.pleroma.social for pleroma/pleroma (project id: 2)
to /tmp/, you can overwrite these settings by setting PLEROMA_CTL_URI, PLEROMA_CTL_PROJECT_ID and PLEROMA_CTL_TMP_DIR
respectively.
and any mix tasks under Pleroma namespace, for example \`mix pleroma.user COMMAND\` is
equivalent to \`$(basename "$0") user COMMAND\`
By default pleroma_ctl will try calling into a running instance to execute non migration-related commands,
if for some reason this is undesired, set PLEROMA_CTL_RPC_DISABLED environment variable
"
else
SCRIPT=$(readlink -f "$0")
SCRIPTPATH=$(dirname "$SCRIPT")
if [ "$1" = "update" ]; then
update "$2"
elif [ "$1" = "migrate" ] || [ "$1" = "rollback" ] || [ "$1" = "create" ] || [ "$1 $2" = "instance gen" ] || [ -n "$PLEROMA_CTL_RPC_DISABLED" ]; then
"$SCRIPTPATH"/pleroma eval 'Pleroma.ReleaseTasks.run("'"$*"'")'
else
"$SCRIPTPATH"/pleroma rpc 'Pleroma.ReleaseTasks.run("'"$*"'")'
fi
fi

View file

@ -1,26 +0,0 @@
#!/bin/sh
# XXX: This should be removed when elixir's releases get custom command support
if [ -z "$1" ] || [ "$1" = "help" ]; then
echo "Usage: $(basename "$0") COMMAND [ARGS]
The known commands are:
create Create database schema (needs to be executed only once)
migrate Execute database migrations (needs to be done after updates)
rollback [VERSION] Rollback database migrations (needs to be done before downgrading)
and any mix tasks under Pleroma namespace, for example \`mix pleroma.user COMMAND\` is
equivalent to \`$(basename "$0") user COMMAND\`
By default pleroma_ctl will try calling into a running instance to execute non migration-related commands,
if for some reason this is undesired, set PLEROMA_CTL_RPC_DISABLED environment variable
"
else
SCRIPT=$(readlink -f "$0")
SCRIPTPATH=$(dirname "$SCRIPT")
if [ "$1" = "migrate" ] || [ "$1" = "rollback" ] || [ "$1" = "create" ] || [ -n "$PLEROMA_CTL_RPC_DISABLED" ]; then
"$SCRIPTPATH"/pleroma eval 'Pleroma.ReleaseTasks.run("'"$*"'")'
else
"$SCRIPTPATH"/pleroma rpc 'Pleroma.ReleaseTasks.run("'"$*"'")'
fi
fi

View file

@ -13,19 +13,37 @@ defmodule Pleroma.Config.TransferTaskTest do
test "transfer config values from db to env" do test "transfer config values from db to env" do
refute Application.get_env(:pleroma, :test_key) refute Application.get_env(:pleroma, :test_key)
Pleroma.Web.AdminAPI.Config.create(%{key: "test_key", value: [live: 2, com: 3]}) refute Application.get_env(:idna, :test_key)
Pleroma.Web.AdminAPI.Config.create(%{
group: "pleroma",
key: "test_key",
value: [live: 2, com: 3]
})
Pleroma.Web.AdminAPI.Config.create(%{
group: "idna",
key: "test_key",
value: [live: 15, com: 35]
})
Pleroma.Config.TransferTask.start_link() Pleroma.Config.TransferTask.start_link()
assert Application.get_env(:pleroma, :test_key) == [live: 2, com: 3] assert Application.get_env(:pleroma, :test_key) == [live: 2, com: 3]
assert Application.get_env(:idna, :test_key) == [live: 15, com: 35]
on_exit(fn -> on_exit(fn ->
Application.delete_env(:pleroma, :test_key) Application.delete_env(:pleroma, :test_key)
Application.delete_env(:idna, :test_key)
end) end)
end end
test "non existing atom" do test "non existing atom" do
Pleroma.Web.AdminAPI.Config.create(%{key: "undefined_atom_key", value: [live: 2, com: 3]}) Pleroma.Web.AdminAPI.Config.create(%{
group: "pleroma",
key: "undefined_atom_key",
value: [live: 2, com: 3]
})
assert ExUnit.CaptureLog.capture_log(fn -> assert ExUnit.CaptureLog.capture_log(fn ->
Pleroma.Config.TransferTask.start_link() Pleroma.Config.TransferTask.start_link()

File diff suppressed because it is too large

View file

@ -20,7 +20,7 @@ test "ip/1" do
end end
test "it restricts by opts" do test "it restricts by opts" do
scale = 100 scale = 1000
limit = 5 limit = 5
Pleroma.Config.put([:rate_limit, @limiter_name], {scale, limit}) Pleroma.Config.put([:rate_limit, @limiter_name], {scale, limit})
@ -64,7 +64,7 @@ test "it restricts by opts" do
test "optional limits for authenticated users" do test "optional limits for authenticated users" do
Ecto.Adapters.SQL.Sandbox.checkout(Pleroma.Repo) Ecto.Adapters.SQL.Sandbox.checkout(Pleroma.Repo)
scale = 100 scale = 1000
limit = 5 limit = 5
Pleroma.Config.put([:rate_limit, @limiter_name], [{1, 10}, {scale, limit}]) Pleroma.Config.put([:rate_limit, @limiter_name], [{1, 10}, {scale, limit}])

View file

@ -314,6 +314,7 @@ def registration_factory do
def config_factory do def config_factory do
%Pleroma.Web.AdminAPI.Config{ %Pleroma.Web.AdminAPI.Config{
key: sequence(:key, &"some_key_#{&1}"), key: sequence(:key, &"some_key_#{&1}"),
group: "pleroma",
value: value:
sequence( sequence(
:value, :value,

View file

@ -5,7 +5,7 @@ defmodule Mix.Tasks.Pleroma.ConfigTest do
setup_all do setup_all do
Mix.shell(Mix.Shell.Process) Mix.shell(Mix.Shell.Process)
temp_file = "config/temp.migrated.secret.exs" temp_file = "config/temp.exported_from_db.secret.exs"
dynamic = Pleroma.Config.get([:instance, :dynamic_configuration]) dynamic = Pleroma.Config.get([:instance, :dynamic_configuration])
@ -30,17 +30,26 @@ test "settings are migrated to db" do
Mix.Tasks.Pleroma.Config.run(["migrate_to_db"]) Mix.Tasks.Pleroma.Config.run(["migrate_to_db"])
first_db = Config.get_by_key("first_setting") first_db = Config.get_by_params(%{group: "pleroma", key: "first_setting"})
second_db = Config.get_by_key("second_setting") second_db = Config.get_by_params(%{group: "pleroma", key: "second_setting"})
refute Config.get_by_key("Pleroma.Repo") refute Config.get_by_params(%{group: "pleroma", key: "Pleroma.Repo"})
assert Config.from_binary(first_db.value) == [key: "value", key2: [Pleroma.Repo]] assert Config.from_binary(first_db.value) == [key: "value", key2: [Pleroma.Repo]]
assert Config.from_binary(second_db.value) == [key: "value2", key2: [Pleroma.Activity]] assert Config.from_binary(second_db.value) == [key: "value2", key2: [Pleroma.Activity]]
end end
test "settings are migrated to file and deleted from db", %{temp_file: temp_file} do test "settings are migrated to file and deleted from db", %{temp_file: temp_file} do
Config.create(%{key: "setting_first", value: [key: "value", key2: [Pleroma.Activity]]}) Config.create(%{
Config.create(%{key: "setting_second", value: [key: "valu2", key2: [Pleroma.Repo]]}) group: "pleroma",
key: "setting_first",
value: [key: "value", key2: [Pleroma.Activity]]
})
Config.create(%{
group: "pleroma",
key: "setting_second",
value: [key: "valu2", key2: [Pleroma.Repo]]
})
Mix.Tasks.Pleroma.Config.run(["migrate_from_db", "temp"]) Mix.Tasks.Pleroma.Config.run(["migrate_from_db", "temp"])

View file

@ -89,8 +89,7 @@ test "user is deleted" do
assert_received {:mix_shell, :info, [message]} assert_received {:mix_shell, :info, [message]}
assert message =~ " deleted" assert message =~ " deleted"
user = User.get_cached_by_nickname(user.nickname) refute User.get_by_nickname(user.nickname)
assert user.info.deactivated
end end
test "no user to delete" do test "no user to delete" do

View file

@ -920,42 +920,44 @@ test ".delete_user_activities deletes all create activities" do
{:ok, activity} = CommonAPI.post(user, %{"status" => "2hu"}) {:ok, activity} = CommonAPI.post(user, %{"status" => "2hu"})
Ecto.Adapters.SQL.Sandbox.unboxed_run(Repo, fn -> {:ok, _} = User.delete_user_activities(user)
{:ok, _} = User.delete_user_activities(user)
# TODO: Remove favorites, repeats, delete activities. # TODO: Remove favorites, repeats, delete activities.
refute Activity.get_by_id(activity.id) refute Activity.get_by_id(activity.id)
end)
end end
test ".delete deactivates a user, all follow relationships and all create activities" do test ".delete deactivates a user, all follow relationships and all activities" do
user = insert(:user) user = insert(:user)
followed = insert(:user)
follower = insert(:user) follower = insert(:user)
{:ok, user} = User.follow(user, followed)
{:ok, follower} = User.follow(follower, user) {:ok, follower} = User.follow(follower, user)
{:ok, activity} = CommonAPI.post(user, %{"status" => "2hu"}) {:ok, activity} = CommonAPI.post(user, %{"status" => "2hu"})
{:ok, activity_two} = CommonAPI.post(follower, %{"status" => "3hu"}) {:ok, activity_two} = CommonAPI.post(follower, %{"status" => "3hu"})
{:ok, _, _} = CommonAPI.favorite(activity_two.id, user) {:ok, like, _} = CommonAPI.favorite(activity_two.id, user)
{:ok, _, _} = CommonAPI.favorite(activity.id, follower) {:ok, like_two, _} = CommonAPI.favorite(activity.id, follower)
{:ok, _, _} = CommonAPI.repeat(activity.id, follower) {:ok, repeat, _} = CommonAPI.repeat(activity_two.id, user)
{:ok, _} = User.delete(user) {:ok, _} = User.delete(user)
followed = User.get_cached_by_id(followed.id)
follower = User.get_cached_by_id(follower.id) follower = User.get_cached_by_id(follower.id)
user = User.get_cached_by_id(user.id)
assert user.info.deactivated refute User.following?(follower, user)
refute User.get_by_id(user.id)
refute User.following?(user, followed) user_activities =
refute User.following?(followed, follower) user.ap_id
|> Activity.query_by_actor()
|> Repo.all()
|> Enum.map(fn act -> act.data["type"] end)
# TODO: Remove favorites, repeats, delete activities. assert Enum.all?(user_activities, fn act -> act in ~w(Delete Undo) end)
refute Activity.get_by_id(activity.id) refute Activity.get_by_id(activity.id)
refute Activity.get_by_id(like.id)
refute Activity.get_by_id(like_two.id)
refute Activity.get_by_id(repeat.id)
end end
test "get_public_key_for_ap_id fetches a user that's not in the db" do test "get_public_key_for_ap_id fetches a user that's not in the db" do

View file

@ -0,0 +1,145 @@
# Pleroma: A lightweight social networking server
# Copyright © 2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicyTest do
use Pleroma.DataCase
import Pleroma.Factory
import ExUnit.CaptureLog
alias Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy
@linkless_message %{
"type" => "Create",
"object" => %{
"content" => "hi world!"
}
}
@linkful_message %{
"type" => "Create",
"object" => %{
"content" => "<a href='https://example.com'>hi world!</a>"
}
}
@response_message %{
"type" => "Create",
"object" => %{
"name" => "yes",
"type" => "Answer"
}
}
describe "with new user" do
test "it allows posts without links" do
user = insert(:user)
assert user.info.note_count == 0
message =
@linkless_message
|> Map.put("actor", user.ap_id)
{:ok, _message} = AntiLinkSpamPolicy.filter(message)
end
test "it disallows posts with links" do
user = insert(:user)
assert user.info.note_count == 0
message =
@linkful_message
|> Map.put("actor", user.ap_id)
{:reject, _} = AntiLinkSpamPolicy.filter(message)
end
end
describe "with old user" do
test "it allows posts without links" do
user = insert(:user, info: %{note_count: 1})
assert user.info.note_count == 1
message =
@linkless_message
|> Map.put("actor", user.ap_id)
{:ok, _message} = AntiLinkSpamPolicy.filter(message)
end
test "it allows posts with links" do
user = insert(:user, info: %{note_count: 1})
assert user.info.note_count == 1
message =
@linkful_message
|> Map.put("actor", user.ap_id)
{:ok, _message} = AntiLinkSpamPolicy.filter(message)
end
end
describe "with followed new user" do
test "it allows posts without links" do
user = insert(:user, info: %{follower_count: 1})
assert user.info.follower_count == 1
message =
@linkless_message
|> Map.put("actor", user.ap_id)
{:ok, _message} = AntiLinkSpamPolicy.filter(message)
end
test "it allows posts with links" do
user = insert(:user, info: %{follower_count: 1})
assert user.info.follower_count == 1
message =
@linkful_message
|> Map.put("actor", user.ap_id)
{:ok, _message} = AntiLinkSpamPolicy.filter(message)
end
end
describe "with unknown actors" do
test "it rejects posts without links" do
message =
@linkless_message
|> Map.put("actor", "http://invalid.actor")
assert capture_log(fn ->
{:reject, _} = AntiLinkSpamPolicy.filter(message)
end) =~ "[error] Could not decode user at fetch http://invalid.actor"
end
test "it rejects posts with links" do
message =
@linkful_message
|> Map.put("actor", "http://invalid.actor")
assert capture_log(fn ->
{:reject, _} = AntiLinkSpamPolicy.filter(message)
end) =~ "[error] Could not decode user at fetch http://invalid.actor"
end
end
describe "with contentless-objects" do
test "it does not reject them or error out" do
user = insert(:user, info: %{note_count: 1})
message =
@response_message
|> Map.put("actor", user.ap_id)
{:ok, _message} = AntiLinkSpamPolicy.filter(message)
end
end
end

View file

@ -1334,7 +1334,7 @@ test "with settings in db", %{conn: conn} do
setup %{conn: conn} do setup %{conn: conn} do
admin = insert(:user, info: %{is_admin: true}) admin = insert(:user, info: %{is_admin: true})
temp_file = "config/test.migrated.secret.exs" temp_file = "config/test.exported_from_db.secret.exs"
on_exit(fn -> on_exit(fn ->
Application.delete_env(:pleroma, :key1) Application.delete_env(:pleroma, :key1)
@ -1343,6 +1343,8 @@ test "with settings in db", %{conn: conn} do
Application.delete_env(:pleroma, :key4) Application.delete_env(:pleroma, :key4)
Application.delete_env(:pleroma, :keyaa1) Application.delete_env(:pleroma, :keyaa1)
Application.delete_env(:pleroma, :keyaa2) Application.delete_env(:pleroma, :keyaa2)
Application.delete_env(:pleroma, Pleroma.Web.Endpoint.NotReal)
Application.delete_env(:pleroma, Pleroma.Captcha.NotReal)
:ok = File.rm(temp_file) :ok = File.rm(temp_file)
end) end)
@ -1361,8 +1363,9 @@ test "create new config setting in db", %{conn: conn} do
conn = conn =
post(conn, "/api/pleroma/admin/config", %{ post(conn, "/api/pleroma/admin/config", %{
configs: [ configs: [
%{key: "key1", value: "value1"}, %{group: "pleroma", key: "key1", value: "value1"},
%{ %{
group: "pleroma",
key: "key2", key: "key2",
value: %{ value: %{
"nested_1" => "nested_value1", "nested_1" => "nested_value1",
@ -1373,6 +1376,7 @@ test "create new config setting in db", %{conn: conn} do
} }
}, },
%{ %{
group: "pleroma",
key: "key3", key: "key3",
value: [ value: [
%{"nested_3" => ":nested_3", "nested_33" => "nested_33"}, %{"nested_3" => ":nested_3", "nested_33" => "nested_33"},
@ -1380,8 +1384,14 @@ test "create new config setting in db", %{conn: conn} do
] ]
}, },
%{ %{
group: "pleroma",
key: "key4", key: "key4",
value: %{"nested_5" => ":upload", "endpoint" => "https://example.com"} value: %{"nested_5" => ":upload", "endpoint" => "https://example.com"}
},
%{
group: "idna",
key: "key5",
value: %{"tuple" => ["string", "Pleroma.Captcha.NotReal", []]}
} }
] ]
}) })
@ -1389,10 +1399,12 @@ test "create new config setting in db", %{conn: conn} do
assert json_response(conn, 200) == %{ assert json_response(conn, 200) == %{
"configs" => [ "configs" => [
%{ %{
"group" => "pleroma",
"key" => "key1", "key" => "key1",
"value" => "value1" "value" => "value1"
}, },
%{ %{
"group" => "pleroma",
"key" => "key2", "key" => "key2",
"value" => [ "value" => [
%{"nested_1" => "nested_value1"}, %{"nested_1" => "nested_value1"},
@ -1405,6 +1417,7 @@ test "create new config setting in db", %{conn: conn} do
] ]
}, },
%{ %{
"group" => "pleroma",
"key" => "key3", "key" => "key3",
"value" => [ "value" => [
[%{"nested_3" => "nested_3"}, %{"nested_33" => "nested_33"}], [%{"nested_3" => "nested_3"}, %{"nested_33" => "nested_33"}],
@ -1412,8 +1425,14 @@ test "create new config setting in db", %{conn: conn} do
] ]
}, },
%{ %{
"group" => "pleroma",
"key" => "key4", "key" => "key4",
"value" => [%{"endpoint" => "https://example.com"}, %{"nested_5" => "upload"}] "value" => [%{"endpoint" => "https://example.com"}, %{"nested_5" => "upload"}]
},
%{
"group" => "idna",
"key" => "key5",
"value" => %{"tuple" => ["string", "Pleroma.Captcha.NotReal", []]}
} }
] ]
} }
@ -1437,6 +1456,8 @@ test "create new config setting in db", %{conn: conn} do
endpoint: "https://example.com", endpoint: "https://example.com",
nested_5: :upload nested_5: :upload
] ]
assert Application.get_env(:idna, :key5) == {"string", Pleroma.Captcha.NotReal, []}
end end
test "update config setting & delete", %{conn: conn} do test "update config setting & delete", %{conn: conn} do
@ -1446,14 +1467,15 @@ test "update config setting & delete", %{conn: conn} do
conn = conn =
post(conn, "/api/pleroma/admin/config", %{ post(conn, "/api/pleroma/admin/config", %{
configs: [ configs: [
%{key: config1.key, value: "another_value"}, %{group: config1.group, key: config1.key, value: "another_value"},
%{key: config2.key, delete: "true"} %{group: config2.group, key: config2.key, delete: "true"}
] ]
}) })
assert json_response(conn, 200) == %{ assert json_response(conn, 200) == %{
"configs" => [ "configs" => [
%{ %{
"group" => "pleroma",
"key" => config1.key, "key" => config1.key,
"value" => "another_value" "value" => "another_value"
} }
@ -1463,5 +1485,152 @@ test "update config setting & delete", %{conn: conn} do
assert Application.get_env(:pleroma, :keyaa1) == "another_value" assert Application.get_env(:pleroma, :keyaa1) == "another_value"
refute Application.get_env(:pleroma, :keyaa2) refute Application.get_env(:pleroma, :keyaa2)
end end
test "common config example", %{conn: conn} do
conn =
post(conn, "/api/pleroma/admin/config", %{
configs: [
%{
"group" => "pleroma",
"key" => "Pleroma.Captcha.NotReal",
"value" => %{
"enabled" => ":false",
"method" => "Pleroma.Captcha.Kocaptcha",
"seconds_valid" => "i:60"
}
}
]
})
assert json_response(conn, 200) == %{
"configs" => [
%{
"group" => "pleroma",
"key" => "Pleroma.Captcha.NotReal",
"value" => [
%{"enabled" => false},
%{"method" => "Pleroma.Captcha.Kocaptcha"},
%{"seconds_valid" => 60}
]
}
]
}
end
test "tuples with more than two values", %{conn: conn} do
conn =
post(conn, "/api/pleroma/admin/config", %{
configs: [
%{
"group" => "pleroma",
"key" => "Pleroma.Web.Endpoint.NotReal",
"value" => [
%{
"http" => %{
"dispatch" => [
%{
"tuple" => [
":_",
[
%{
"tuple" => [
"/api/v1/streaming",
"Pleroma.Web.MastodonAPI.WebsocketHandler",
[]
]
},
%{
"tuple" => [
"/websocket",
"Phoenix.Endpoint.CowboyWebSocket",
%{
"tuple" => [
"Phoenix.Transports.WebSocket",
%{
"tuple" => [
"Pleroma.Web.Endpoint",
"Pleroma.Web.UserSocket",
[]
]
}
]
}
]
},
%{
"tuple" => [
":_",
"Phoenix.Endpoint.Cowboy2Handler",
%{
"tuple" => ["Pleroma.Web.Endpoint", []]
}
]
}
]
]
}
]
}
}
]
}
]
})
assert json_response(conn, 200) == %{
"configs" => [
%{
"group" => "pleroma",
"key" => "Pleroma.Web.Endpoint.NotReal",
"value" => [
%{
"http" => %{
"dispatch" => %{
"_" => [
%{
"tuple" => [
"/api/v1/streaming",
"Pleroma.Web.MastodonAPI.WebsocketHandler",
[]
]
},
%{
"tuple" => [
"/websocket",
"Phoenix.Endpoint.CowboyWebSocket",
%{
"Elixir.Phoenix.Transports.WebSocket" => %{
"tuple" => [
"Pleroma.Web.Endpoint",
"Pleroma.Web.UserSocket",
[]
]
}
}
]
},
%{
"tuple" => [
"_",
"Phoenix.Endpoint.Cowboy2Handler",
%{"Elixir.Pleroma.Web.Endpoint" => []}
]
}
]
}
}
}
]
}
]
}
end
end end
end end
# Needed for testing
defmodule Pleroma.Web.Endpoint.NotReal do
end
defmodule Pleroma.Captcha.NotReal do
end

View file

@ -7,18 +7,18 @@ test "get_by_key/1" do
config = insert(:config) config = insert(:config)
insert(:config) insert(:config)
assert config == Config.get_by_key(config.key) assert config == Config.get_by_params(%{group: config.group, key: config.key})
end end
test "create/1" do test "create/1" do
{:ok, config} = Config.create(%{key: "some_key", value: "some_value"}) {:ok, config} = Config.create(%{group: "pleroma", key: "some_key", value: "some_value"})
assert config == Config.get_by_key("some_key") assert config == Config.get_by_params(%{group: "pleroma", key: "some_key"})
end end
test "update/1" do test "update/1" do
config = insert(:config) config = insert(:config)
{:ok, updated} = Config.update(config, %{value: "some_value"}) {:ok, updated} = Config.update(config, %{value: "some_value"})
loaded = Config.get_by_key(config.key) loaded = Config.get_by_params(%{group: config.group, key: config.key})
assert loaded == updated assert loaded == updated
end end
@ -27,8 +27,8 @@ test "update_or_create/1" do
key2 = "another_key" key2 = "another_key"
params = [ params = [
%{key: key2, value: "another_value"}, %{group: "pleroma", key: key2, value: "another_value"},
%{key: config.key, value: "new_value"} %{group: config.group, key: config.key, value: "new_value"}
] ]
assert Repo.all(Config) |> length() == 1 assert Repo.all(Config) |> length() == 1
@ -37,8 +37,8 @@ test "update_or_create/1" do
assert Repo.all(Config) |> length() == 2 assert Repo.all(Config) |> length() == 2
config1 = Config.get_by_key(config.key) config1 = Config.get_by_params(%{group: config.group, key: config.key})
config2 = Config.get_by_key(key2) config2 = Config.get_by_params(%{group: "pleroma", key: key2})
assert config1.value == Config.transform("new_value") assert config1.value == Config.transform("new_value")
assert config2.value == Config.transform("another_value") assert config2.value == Config.transform("another_value")
@ -46,8 +46,8 @@ test "update_or_create/1" do
test "delete/1" do test "delete/1" do
config = insert(:config) config = insert(:config)
{:ok, _} = Config.delete(config.key) {:ok, _} = Config.delete(%{key: config.key, group: config.group})
refute Config.get_by_key(config.key) refute Config.get_by_params(%{key: config.key, group: config.group})
end end
describe "transform/1" do describe "transform/1" do
@ -179,5 +179,80 @@ test "complex map with sigil" do
assert Config.from_binary(binary) == assert Config.from_binary(binary) ==
[federated_timeline_removal: [], reject: [~r/comp[lL][aA][iI][nN]er/], replace: []] [federated_timeline_removal: [], reject: [~r/comp[lL][aA][iI][nN]er/], replace: []]
end end
test "complex map with tuples with more than 2 values" do
binary =
Config.transform(%{
"http" => %{
"dispatch" => [
%{
"tuple" => [
":_",
[
%{
"tuple" => [
"/api/v1/streaming",
"Pleroma.Web.MastodonAPI.WebsocketHandler",
[]
]
},
%{
"tuple" => [
"/websocket",
"Phoenix.Endpoint.CowboyWebSocket",
%{
"tuple" => [
"Phoenix.Transports.WebSocket",
%{"tuple" => ["Pleroma.Web.Endpoint", "Pleroma.Web.UserSocket", []]}
]
}
]
},
%{
"tuple" => [
":_",
"Phoenix.Endpoint.Cowboy2Handler",
%{
"tuple" => ["Pleroma.Web.Endpoint", []]
}
]
}
]
]
}
]
}
})
assert binary ==
:erlang.term_to_binary(
http: [
dispatch: [
_: [
{"/api/v1/streaming", Pleroma.Web.MastodonAPI.WebsocketHandler, []},
{"/websocket", Phoenix.Endpoint.CowboyWebSocket,
{Phoenix.Transports.WebSocket,
{Pleroma.Web.Endpoint, Pleroma.Web.UserSocket, []}}},
{:_, Phoenix.Endpoint.Cowboy2Handler, {Pleroma.Web.Endpoint, []}}
]
]
]
)
assert Config.from_binary(binary) == [
http: [
dispatch: [
{:_,
[
{"/api/v1/streaming", Pleroma.Web.MastodonAPI.WebsocketHandler, []},
{"/websocket", Phoenix.Endpoint.CowboyWebSocket,
{Phoenix.Transports.WebSocket,
{Pleroma.Web.Endpoint, Pleroma.Web.UserSocket, []}}},
{:_, Phoenix.Endpoint.Cowboy2Handler, {Pleroma.Web.Endpoint, []}}
]}
]
]
]
end
end end
end end
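
Each transform/1 case above reduces to the same recursive walk: JSON maps become keyword lists, %{"tuple" => [...]} wrappers become real tuples of any arity, and the ":", "i:" and module-name string prefixes select atoms, integers and modules. A simplified, self-contained sketch of that walk, for orientation only (the converter under test also handles binaries, sigils and other cases not shown here):

defmodule ConfigTransformSketch do
  # %{"tuple" => [...]} becomes a real tuple; nesting is handled recursively.
  def transform(%{"tuple" => values}) when is_list(values) do
    values |> Enum.map(&transform/1) |> List.to_tuple()
  end

  # Plain maps become keyword lists, matching the `http: [dispatch: ...]` expectation above.
  def transform(%{} = map) do
    Enum.map(map, fn {key, value} -> {to_key(key), transform(value)} end)
  end

  def transform(list) when is_list(list), do: Enum.map(list, &transform/1)
  def transform(":" <> name), do: String.to_atom(name)
  def transform("i:" <> int), do: String.to_integer(int)
  def transform("Pleroma." <> _ = mod), do: Module.concat([mod])
  def transform("Phoenix." <> _ = mod), do: Module.concat([mod])
  def transform(other), do: other

  defp to_key(":" <> name), do: String.to_atom(name)
  defp to_key(key) when is_binary(key), do: String.to_atom(key)
end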

View file

@ -10,6 +10,7 @@ defmodule Pleroma.Web.OAuth.OAuthControllerTest do
alias Pleroma.Registration alias Pleroma.Registration
alias Pleroma.Repo alias Pleroma.Repo
alias Pleroma.Web.OAuth.Authorization alias Pleroma.Web.OAuth.Authorization
alias Pleroma.Web.OAuth.OAuthController
alias Pleroma.Web.OAuth.Token alias Pleroma.Web.OAuth.Token
@oauth_config_path [:oauth2, :issue_new_refresh_token] @oauth_config_path [:oauth2, :issue_new_refresh_token]
@ -49,7 +50,7 @@ test "GET /oauth/authorize renders auth forms, including OAuth consumer form", %
%{ %{
"response_type" => "code", "response_type" => "code",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"scope" => "read" "scope" => "read"
} }
) )
@ -72,7 +73,7 @@ test "GET /oauth/prepare_request encodes parameters as `state` and redirects", %
"authorization" => %{ "authorization" => %{
"scope" => "read follow", "scope" => "read follow",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "a_state" "state" => "a_state"
} }
} }
@ -98,11 +99,12 @@ test "GET /oauth/prepare_request encodes parameters as `state` and redirects", %
test "with user-bound registration, GET /oauth/<provider>/callback redirects to `redirect_uri` with `code`", test "with user-bound registration, GET /oauth/<provider>/callback redirects to `redirect_uri` with `code`",
%{app: app, conn: conn} do %{app: app, conn: conn} do
registration = insert(:registration) registration = insert(:registration)
redirect_uri = OAuthController.default_redirect_uri(app)
state_params = %{ state_params = %{
"scope" => Enum.join(app.scopes, " "), "scope" => Enum.join(app.scopes, " "),
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"state" => "" "state" => ""
} }
@ -121,7 +123,7 @@ test "with user-bound registration, GET /oauth/<provider>/callback redirects to
) )
assert response = html_response(conn, 302) assert response = html_response(conn, 302)
assert redirected_to(conn) =~ ~r/#{app.redirect_uris}\?code=.+/ assert redirected_to(conn) =~ ~r/#{redirect_uri}\?code=.+/
end end
end end
@ -132,7 +134,7 @@ test "with user-unbound registration, GET /oauth/<provider>/callback renders reg
state_params = %{ state_params = %{
"scope" => "read write", "scope" => "read write",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "a_state" "state" => "a_state"
} }
@ -165,7 +167,7 @@ test "on authentication error, GET /oauth/<provider>/callback redirects to `redi
state_params = %{ state_params = %{
"scope" => Enum.join(app.scopes, " "), "scope" => Enum.join(app.scopes, " "),
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "" "state" => ""
} }
@ -199,7 +201,7 @@ test "GET /oauth/registration_details renders registration details form", %{
"authorization" => %{ "authorization" => %{
"scopes" => app.scopes, "scopes" => app.scopes,
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "a_state", "state" => "a_state",
"nickname" => nil, "nickname" => nil,
"email" => "john@doe.com" "email" => "john@doe.com"
@ -218,6 +220,7 @@ test "with valid params, POST /oauth/register?op=register redirects to `redirect
conn: conn conn: conn
} do } do
registration = insert(:registration, user: nil, info: %{"nickname" => nil, "email" => nil}) registration = insert(:registration, user: nil, info: %{"nickname" => nil, "email" => nil})
redirect_uri = OAuthController.default_redirect_uri(app)
conn = conn =
conn conn
@ -229,7 +232,7 @@ test "with valid params, POST /oauth/register?op=register redirects to `redirect
"authorization" => %{ "authorization" => %{
"scopes" => app.scopes, "scopes" => app.scopes,
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"state" => "a_state", "state" => "a_state",
"nickname" => "availablenick", "nickname" => "availablenick",
"email" => "available@email.com" "email" => "available@email.com"
@ -238,7 +241,36 @@ test "with valid params, POST /oauth/register?op=register redirects to `redirect
) )
assert response = html_response(conn, 302) assert response = html_response(conn, 302)
assert redirected_to(conn) =~ ~r/#{app.redirect_uris}\?code=.+/ assert redirected_to(conn) =~ ~r/#{redirect_uri}\?code=.+/
end
test "with unlisted `redirect_uri`, POST /oauth/register?op=register results in HTTP 401",
%{
app: app,
conn: conn
} do
registration = insert(:registration, user: nil, info: %{"nickname" => nil, "email" => nil})
unlisted_redirect_uri = "http://cross-site-request.com"
conn =
conn
|> put_session(:registration_id, registration.id)
|> post(
"/oauth/register",
%{
"op" => "register",
"authorization" => %{
"scopes" => app.scopes,
"client_id" => app.client_id,
"redirect_uri" => unlisted_redirect_uri,
"state" => "a_state",
"nickname" => "availablenick",
"email" => "available@email.com"
}
}
)
assert response = html_response(conn, 401)
end end
test "with invalid params, POST /oauth/register?op=register renders registration_details page", test "with invalid params, POST /oauth/register?op=register renders registration_details page",
@ -254,7 +286,7 @@ test "with invalid params, POST /oauth/register?op=register renders registration
"authorization" => %{ "authorization" => %{
"scopes" => app.scopes, "scopes" => app.scopes,
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "a_state", "state" => "a_state",
"nickname" => "availablenickname", "nickname" => "availablenickname",
"email" => "available@email.com" "email" => "available@email.com"
@ -286,6 +318,7 @@ test "with valid params, POST /oauth/register?op=connect redirects to `redirect_
} do } do
user = insert(:user, password_hash: Comeonin.Pbkdf2.hashpwsalt("testpassword")) user = insert(:user, password_hash: Comeonin.Pbkdf2.hashpwsalt("testpassword"))
registration = insert(:registration, user: nil) registration = insert(:registration, user: nil)
redirect_uri = OAuthController.default_redirect_uri(app)
conn = conn =
conn conn
@ -297,7 +330,7 @@ test "with valid params, POST /oauth/register?op=connect redirects to `redirect_
"authorization" => %{ "authorization" => %{
"scopes" => app.scopes, "scopes" => app.scopes,
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"state" => "a_state", "state" => "a_state",
"name" => user.nickname, "name" => user.nickname,
"password" => "testpassword" "password" => "testpassword"
@ -306,7 +339,37 @@ test "with valid params, POST /oauth/register?op=connect redirects to `redirect_
) )
assert response = html_response(conn, 302) assert response = html_response(conn, 302)
assert redirected_to(conn) =~ ~r/#{app.redirect_uris}\?code=.+/ assert redirected_to(conn) =~ ~r/#{redirect_uri}\?code=.+/
end
test "with unlisted `redirect_uri`, POST /oauth/register?op=connect results in HTTP 401`",
%{
app: app,
conn: conn
} do
user = insert(:user, password_hash: Comeonin.Pbkdf2.hashpwsalt("testpassword"))
registration = insert(:registration, user: nil)
unlisted_redirect_uri = "http://cross-site-request.com"
conn =
conn
|> put_session(:registration_id, registration.id)
|> post(
"/oauth/register",
%{
"op" => "connect",
"authorization" => %{
"scopes" => app.scopes,
"client_id" => app.client_id,
"redirect_uri" => unlisted_redirect_uri,
"state" => "a_state",
"name" => user.nickname,
"password" => "testpassword"
}
}
)
assert response = html_response(conn, 401)
end end
test "with invalid params, POST /oauth/register?op=connect renders registration_details page", test "with invalid params, POST /oauth/register?op=connect renders registration_details page",
@ -322,7 +385,7 @@ test "with invalid params, POST /oauth/register?op=connect renders registration_
"authorization" => %{ "authorization" => %{
"scopes" => app.scopes, "scopes" => app.scopes,
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "a_state", "state" => "a_state",
"name" => user.nickname, "name" => user.nickname,
"password" => "wrong password" "password" => "wrong password"
@ -358,7 +421,7 @@ test "renders authentication page", %{app: app, conn: conn} do
%{ %{
"response_type" => "code", "response_type" => "code",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"scope" => "read" "scope" => "read"
} }
) )
@ -378,7 +441,7 @@ test "properly handles internal calls with `authorization`-wrapped params", %{
"authorization" => %{ "authorization" => %{
"response_type" => "code", "response_type" => "code",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"scope" => "read" "scope" => "read"
} }
} }
@ -399,7 +462,7 @@ test "renders authentication page if user is already authenticated but `force_lo
%{ %{
"response_type" => "code", "response_type" => "code",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"scope" => "read", "scope" => "read",
"force_login" => "true" "force_login" => "true"
} }
@ -423,7 +486,7 @@ test "with existing authentication and non-OOB `redirect_uri`, redirects to app
%{ %{
"response_type" => "code", "response_type" => "code",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"state" => "specific_client_state", "state" => "specific_client_state",
"scope" => "read" "scope" => "read"
} }
@ -433,6 +496,31 @@ test "with existing authentication and non-OOB `redirect_uri`, redirects to app
"https://redirect.url?access_token=#{token.token}&state=specific_client_state" "https://redirect.url?access_token=#{token.token}&state=specific_client_state"
end end
test "with existing authentication and unlisted non-OOB `redirect_uri`, redirects without credentials",
%{
app: app,
conn: conn
} do
unlisted_redirect_uri = "http://cross-site-request.com"
token = insert(:oauth_token, app_id: app.id)
conn =
conn
|> put_session(:oauth_token, token.token)
|> get(
"/oauth/authorize",
%{
"response_type" => "code",
"client_id" => app.client_id,
"redirect_uri" => unlisted_redirect_uri,
"state" => "specific_client_state",
"scope" => "read"
}
)
assert redirected_to(conn) == unlisted_redirect_uri
end
test "with existing authentication and OOB `redirect_uri`, redirects to app with `token` and `state` params", test "with existing authentication and OOB `redirect_uri`, redirects to app with `token` and `state` params",
%{ %{
app: app, app: app,
@ -461,6 +549,7 @@ test "with existing authentication and OOB `redirect_uri`, redirects to app with
test "redirects with oauth authorization" do test "redirects with oauth authorization" do
user = insert(:user) user = insert(:user)
app = insert(:oauth_app, scopes: ["read", "write", "follow"]) app = insert(:oauth_app, scopes: ["read", "write", "follow"])
redirect_uri = OAuthController.default_redirect_uri(app)
conn = conn =
build_conn() build_conn()
@ -469,14 +558,14 @@ test "redirects with oauth authorization" do
"name" => user.nickname, "name" => user.nickname,
"password" => "test", "password" => "test",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"scope" => "read write", "scope" => "read write",
"state" => "statepassed" "state" => "statepassed"
} }
}) })
target = redirected_to(conn) target = redirected_to(conn)
assert target =~ app.redirect_uris assert target =~ redirect_uri
query = URI.parse(target).query |> URI.query_decoder() |> Map.new() query = URI.parse(target).query |> URI.query_decoder() |> Map.new()
@ -489,6 +578,7 @@ test "redirects with oauth authorization" do
test "returns 401 for wrong credentials", %{conn: conn} do test "returns 401 for wrong credentials", %{conn: conn} do
user = insert(:user) user = insert(:user)
app = insert(:oauth_app) app = insert(:oauth_app)
redirect_uri = OAuthController.default_redirect_uri(app)
result = result =
conn conn
@ -497,7 +587,7 @@ test "returns 401 for wrong credentials", %{conn: conn} do
"name" => user.nickname, "name" => user.nickname,
"password" => "wrong", "password" => "wrong",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"state" => "statepassed", "state" => "statepassed",
"scope" => Enum.join(app.scopes, " ") "scope" => Enum.join(app.scopes, " ")
} }
@ -506,7 +596,7 @@ test "returns 401 for wrong credentials", %{conn: conn} do
# Keep the details # Keep the details
assert result =~ app.client_id assert result =~ app.client_id
assert result =~ app.redirect_uris assert result =~ redirect_uri
# Error message # Error message
assert result =~ "Invalid Username/Password" assert result =~ "Invalid Username/Password"
@ -515,6 +605,7 @@ test "returns 401 for wrong credentials", %{conn: conn} do
test "returns 401 for missing scopes", %{conn: conn} do test "returns 401 for missing scopes", %{conn: conn} do
user = insert(:user) user = insert(:user)
app = insert(:oauth_app) app = insert(:oauth_app)
redirect_uri = OAuthController.default_redirect_uri(app)
result = result =
conn conn
@ -523,7 +614,7 @@ test "returns 401 for missing scopes", %{conn: conn} do
"name" => user.nickname, "name" => user.nickname,
"password" => "test", "password" => "test",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"state" => "statepassed", "state" => "statepassed",
"scope" => "" "scope" => ""
} }
@ -532,7 +623,7 @@ test "returns 401 for missing scopes", %{conn: conn} do
# Keep the details # Keep the details
assert result =~ app.client_id assert result =~ app.client_id
assert result =~ app.redirect_uris assert result =~ redirect_uri
# Error message # Error message
assert result =~ "This action is outside the authorized scopes" assert result =~ "This action is outside the authorized scopes"
@ -541,6 +632,7 @@ test "returns 401 for missing scopes", %{conn: conn} do
test "returns 401 for scopes beyond app scopes", %{conn: conn} do test "returns 401 for scopes beyond app scopes", %{conn: conn} do
user = insert(:user) user = insert(:user)
app = insert(:oauth_app, scopes: ["read", "write"]) app = insert(:oauth_app, scopes: ["read", "write"])
redirect_uri = OAuthController.default_redirect_uri(app)
result = result =
conn conn
@ -549,7 +641,7 @@ test "returns 401 for scopes beyond app scopes", %{conn: conn} do
"name" => user.nickname, "name" => user.nickname,
"password" => "test", "password" => "test",
"client_id" => app.client_id, "client_id" => app.client_id,
"redirect_uri" => app.redirect_uris, "redirect_uri" => redirect_uri,
"state" => "statepassed", "state" => "statepassed",
"scope" => "read write follow" "scope" => "read write follow"
} }
@ -558,7 +650,7 @@ test "returns 401 for scopes beyond app scopes", %{conn: conn} do
# Keep the details # Keep the details
assert result =~ app.client_id assert result =~ app.client_id
assert result =~ app.redirect_uris assert result =~ redirect_uri
# Error message # Error message
assert result =~ "This action is outside the authorized scopes" assert result =~ "This action is outside the authorized scopes"
@ -577,7 +669,7 @@ test "issues a token for an all-body request" do
|> post("/oauth/token", %{ |> post("/oauth/token", %{
"grant_type" => "authorization_code", "grant_type" => "authorization_code",
"code" => auth.token, "code" => auth.token,
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"client_id" => app.client_id, "client_id" => app.client_id,
"client_secret" => app.client_secret "client_secret" => app.client_secret
}) })
@ -631,7 +723,7 @@ test "issues a token for request with HTTP basic auth client credentials" do
|> post("/oauth/token", %{ |> post("/oauth/token", %{
"grant_type" => "authorization_code", "grant_type" => "authorization_code",
"code" => auth.token, "code" => auth.token,
"redirect_uri" => app.redirect_uris "redirect_uri" => OAuthController.default_redirect_uri(app)
}) })
assert %{"access_token" => token, "scope" => scope} = json_response(conn, 200) assert %{"access_token" => token, "scope" => scope} = json_response(conn, 200)
@ -676,7 +768,7 @@ test "rejects token exchange with invalid client credentials" do
|> post("/oauth/token", %{ |> post("/oauth/token", %{
"grant_type" => "authorization_code", "grant_type" => "authorization_code",
"code" => auth.token, "code" => auth.token,
"redirect_uri" => app.redirect_uris "redirect_uri" => OAuthController.default_redirect_uri(app)
}) })
assert resp = json_response(conn, 400) assert resp = json_response(conn, 400)
@ -755,7 +847,7 @@ test "rejects an invalid authorization code" do
|> post("/oauth/token", %{ |> post("/oauth/token", %{
"grant_type" => "authorization_code", "grant_type" => "authorization_code",
"code" => "Imobviouslyinvalid", "code" => "Imobviouslyinvalid",
"redirect_uri" => app.redirect_uris, "redirect_uri" => OAuthController.default_redirect_uri(app),
"client_id" => app.client_id, "client_id" => app.client_id,
"client_secret" => app.client_secret "client_secret" => app.client_secret
}) })
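
Most of the churn in this file swaps app.redirect_uris for OAuthController.default_redirect_uri(app) and adds negative tests for unlisted URIs. As a hedged sketch of what such helpers might look like, assuming redirect_uris is stored as a newline-separated string (the controller itself is the authority here, and its real validation differs in detail):

defmodule RedirectURISketch do
  # Treat the first registered URI as the default used by the tests above.
  def default_redirect_uri(%{redirect_uris: uris}) when is_binary(uris) do
    uris
    |> String.split("\n", trim: true)
    |> List.first()
  end

  # An unlisted redirect_uri (e.g. "http://cross-site-request.com") must never be
  # honoured with credentials attached.
  def registered_redirect_uri?(%{redirect_uris: uris}, requested) when is_binary(uris) do
    requested in String.split(uris, "\n", trim: true)
  end
end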

View file

@ -9,6 +9,12 @@ defmodule Pleroma.Web.RichMedia.ParserTest do
} -> } ->
%Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")} %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")}
%{
method: :get,
url: "http://example.com/non-ogp"
} ->
%Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/non_ogp_embed.html")}
%{ %{
method: :get, method: :get,
url: "http://example.com/ogp-missing-title" url: "http://example.com/ogp-missing-title"
@ -47,6 +53,11 @@ test "returns error when no metadata present" do
assert {:error, _} = Pleroma.Web.RichMedia.Parser.parse("http://example.com/empty") assert {:error, _} = Pleroma.Web.RichMedia.Parser.parse("http://example.com/empty")
end end
test "doesn't just add a title" do
assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/non-ogp") ==
{:error, "Found metadata was invalid or incomplete: %{}"}
end
test "parses ogp" do test "parses ogp" do
assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/ogp") == assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/ogp") ==
{:ok, {:ok,

View file

@ -356,4 +356,110 @@ test "it doesn't send muted reblogs" do
Task.await(task) Task.await(task)
end end
describe "direct streams" do
setup do
GenServer.start(Streamer, %{}, name: Streamer)
on_exit(fn ->
if pid = Process.whereis(Streamer) do
Process.exit(pid, :kill)
end
end)
:ok
end
test "it sends conversation update to the 'direct' stream", %{} do
user = insert(:user)
another_user = insert(:user)
task =
Task.async(fn ->
assert_receive {:text, _received_event}, 4_000
end)
Streamer.add_socket(
"direct",
%{transport_pid: task.pid, assigns: %{user: user}}
)
{:ok, _create_activity} =
CommonAPI.post(another_user, %{
"status" => "hey @#{user.nickname}",
"visibility" => "direct"
})
Task.await(task)
end
test "it doesn't send conversation update to the 'direct' streamj when the last message in the conversation is deleted" do
user = insert(:user)
another_user = insert(:user)
{:ok, create_activity} =
CommonAPI.post(another_user, %{
"status" => "hi @#{user.nickname}",
"visibility" => "direct"
})
task =
Task.async(fn ->
assert_receive {:text, received_event}, 4_000
assert %{"event" => "delete", "payload" => _} = Jason.decode!(received_event)
refute_receive {:text, _}, 4_000
end)
Streamer.add_socket(
"direct",
%{transport_pid: task.pid, assigns: %{user: user}}
)
{:ok, _} = CommonAPI.delete(create_activity.id, another_user)
Task.await(task)
end
test "it sends conversation update to the 'direct' stream when a message is deleted" do
user = insert(:user)
another_user = insert(:user)
{:ok, create_activity} =
CommonAPI.post(another_user, %{
"status" => "hi @#{user.nickname}",
"visibility" => "direct"
})
{:ok, create_activity2} =
CommonAPI.post(another_user, %{
"status" => "hi @#{user.nickname}",
"in_reply_to_status_id" => create_activity.id,
"visibility" => "direct"
})
task =
Task.async(fn ->
assert_receive {:text, received_event}, 4_000
assert %{"event" => "delete", "payload" => _} = Jason.decode!(received_event)
assert_receive {:text, received_event}, 4_000
assert %{"event" => "conversation", "payload" => received_payload} =
Jason.decode!(received_event)
assert %{"last_status" => last_status} = Jason.decode!(received_payload)
assert last_status["id"] == to_string(create_activity.id)
end)
Streamer.add_socket(
"direct",
%{transport_pid: task.pid, assigns: %{user: user}}
)
{:ok, _} = CommonAPI.delete(create_activity2.id, another_user)
Task.await(task)
end
end
end end
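
All three new tests follow the same shape: start a task that waits for a streamed frame, register the task's pid as a fake transport on the "direct" topic, then run the CommonAPI call that should trigger (or suppress) the push. A hypothetical helper capturing that pattern, assuming it lives inside this test module so Streamer, Jason, CommonAPI and the ExUnit assertions are already in scope (the name await_direct_event is illustrative, not part of the file above):

# Hypothetical helper; not part of the diff above.
defp await_direct_event(user, expected_event, fun) do
  task =
    Task.async(fn ->
      # The task's mailbox stands in for the websocket transport.
      assert_receive {:text, raw}, 4_000
      assert %{"event" => ^expected_event} = Jason.decode!(raw)
    end)

  Streamer.add_socket("direct", %{transport_pid: task.pid, assigns: %{user: user}})
  fun.()
  Task.await(task)
end

# Example usage:
#   await_direct_event(user, "conversation", fn ->
#     CommonAPI.post(another_user, %{"status" => "hey @#{user.nickname}", "visibility" => "direct"})
#   end)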

View file

@ -0,0 +1,56 @@
defmodule Pleroma.Web.TwitterAPI.PasswordControllerTest do
use Pleroma.Web.ConnCase
alias Pleroma.PasswordResetToken
alias Pleroma.Web.OAuth.Token
import Pleroma.Factory
describe "GET /api/pleroma/password_reset/token" do
test "it returns error when token invalid", %{conn: conn} do
response =
conn
|> get("/api/pleroma/password_reset/token")
|> html_response(:ok)
assert response =~ "<h2>Invalid Token</h2>"
end
test "it shows password reset form", %{conn: conn} do
user = insert(:user)
{:ok, token} = PasswordResetToken.create_token(user)
response =
conn
|> get("/api/pleroma/password_reset/#{token.token}")
|> html_response(:ok)
assert response =~ "<h2>Password Reset for #{user.nickname}</h2>"
end
end
describe "POST /api/pleroma/password_reset" do
test "it returns HTTP 200", %{conn: conn} do
user = insert(:user)
{:ok, token} = PasswordResetToken.create_token(user)
{:ok, _access_token} = Token.create_token(insert(:oauth_app), user, %{})
params = %{
"password" => "test",
password_confirmation: "test",
token: token.token
}
response =
conn
|> assign(:user, user)
|> post("/api/pleroma/password_reset", %{data: params})
|> html_response(:ok)
assert response =~ "<h2>Password changed!</h2>"
user = refresh_record(user)
assert Comeonin.Pbkdf2.checkpw("test", user.password_hash)
assert length(Token.get_user_tokens(user)) == 0
end
end
end