forked from AkkomaGang/akkoma

commit c98e761d28
Merge remote-tracking branch 'remotes/upstream/develop' into 1260-rate-limited-auth-actions

18 changed files with 507 additions and 190 deletions
CHANGELOG.md (90 changed lines)

@@ -4,14 +4,18 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

 ## [Unreleased]
+### Changed
+- **Breaking:** Elixir >=1.8 is now required (was >= 1.7)
+- Replaced [pleroma_job_queue](https://git.pleroma.social/pleroma/pleroma_job_queue) and `Pleroma.Web.Federator.RetryQueue` with [Oban](https://github.com/sorentwo/oban) (see [`docs/config.md`](docs/config.md) on migrating customized worker / retry settings)
+- Introduced [quantum](https://github.com/quantum-elixir/quantum-core) job scheduler
+- Admin API: Return `total` when querying for reports
+
+## [1.1.0] - 2019-??-??
 ### Security
-- OStatus: eliminate the possibility of a protocol downgrade attack.
-- OStatus: prevent following locked accounts, bypassing the approval process.
 - Mastodon API: respect post privacy in `/api/v1/statuses/:id/{favourited,reblogged}_by`

 ### Removed
 - **Breaking:** GNU Social API with Qvitter extensions support
-- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
 - Emoji: Remove longfox emojis.
 - Remove `Reply-To` header from report emails for admins.

@@ -19,8 +23,6 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - **Breaking:** Configuration: A setting to explicitly disable the mailer was added, defaulting to true, if you are using a mailer add `config :pleroma, Pleroma.Emails.Mailer, enabled: true` to your config
 - **Breaking:** Configuration: `/media/` is now removed when `base_url` is configured, append `/media/` to your `base_url` config to keep the old behaviour if desired
 - **Breaking:** `/api/pleroma/notifications/read` is moved to `/api/v1/pleroma/notifications/read` and now supports `max_id` and responds with Mastodon API entities.
-- Configuration: OpenGraph and TwitterCard providers enabled by default
-- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
 - Configuration: added `config/description.exs`, from which `docs/config.md` is generated
 - Federation: Return 403 errors when trying to request pages from a user's follower/following collections if they have `hide_followers`/`hide_follows` set
 - NodeInfo: Return `skipThreadContainment` in `metadata` for the `skip_thread_containment` option

@@ -30,23 +32,16 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - AdminAPI: Add "godmode" while fetching user statuses (i.e. admin can see private statuses)
 - Improve digest email template
 – Pagination: (optional) return `total` alongside with `items` when paginating
-- Replaced [pleroma_job_queue](https://git.pleroma.social/pleroma/pleroma_job_queue) and `Pleroma.Web.Federator.RetryQueue` with [Oban](https://github.com/sorentwo/oban) (see [`docs/config.md`](docs/config.md) on migrating customized worker / retry settings)
-- Introduced [quantum](https://github.com/quantum-elixir/quantum-core) job scheduler

 ### Fixed
 - Following from Osada
-- Not being able to pin unlisted posts
-- Objects being re-embedded to activities after being updated (e.g faved/reposted). Running 'mix pleroma.database prune_objects' again is advised.
 - Favorites timeline doing database-intensive queries
 - Metadata rendering errors resulting in the entire page being inaccessible
 - `federation_incoming_replies_max_depth` option being ignored in certain cases
-- Federation/MediaProxy not working with instances that have wrong certificate order
 - Mastodon API: Handling of search timeouts (`/api/v1/search` and `/api/v2/search`)
 - Mastodon API: Misskey's endless polls being unable to render
 - Mastodon API: Embedded relationships not being properly rendered in the Account entity of Status entity
 - Mastodon API: Notifications endpoint crashing if one notification failed to render
-- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
-- Mastodon API: `muted` in the Status entity, using author's account to determine if the tread was muted
 - Mastodon API: Add `account_id`, `type`, `offset`, and `limit` to search API (`/api/v1/search` and `/api/v2/search`)
 - Mastodon API, streaming: Fix filtering of notifications based on blocks/mutes/thread mutes
 - ActivityPub C2S: follower/following collection pages being inaccessible even when authentifucated if `hide_followers`/ `hide_follows` was set

@@ -54,15 +49,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - Rich Media: Parser failing when no TTL can be found by image TTL setters
 - Rich Media: The crawled URL is now spliced into the rich media data.
 - ActivityPub S2S: sharedInbox usage has been mostly aligned with the rules in the AP specification.
-- ActivityPub S2S: remote user deletions now work the same as local user deletions.
-- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
-- Not being able to access the Mastodon FE login page on private instances
-- Invalid SemVer version generation, when the current branch does not have commits ahead of tag/checked out on a tag
 - Pleroma.Upload base_url was not automatically whitelisted by MediaProxy. Now your custom CDN or file hosting will be accessed directly as expected.
 - Report email not being sent to admins when the reporter is a remote user
-- MRF: ensure that subdomain_match calls are case-insensitive
 - Reverse Proxy limiting `max_body_length` was incorrectly defined and only checked `Content-Length` headers which may not be sufficient in some circumstances
-- MRF: fix use of unserializable keyword lists in describe() implementations
 - ActivityPub: Deactivated user deletion
 - ActivityPub: Fix `/users/:nickname/inbox` crashing without an authenticated user
 - MRF: fix ability to follow a relay when AntiFollowbotPolicy was enabled

@@ -73,16 +62,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - Mastodon API: all status JSON responses contain a `pleroma.expires_at` item which states when an activity will expire. The value is only shown to the user who created the activity. To everyone else it's empty.
 - Configuration: `ActivityExpiration.enabled` controls whether expired activites will get deleted at the appropriate time. Enabled by default.
 - Conversations: Add Pleroma-specific conversation endpoints and status posting extensions. Run the `bump_all_conversations` task again to create the necessary data.
-- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
-  Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
 - MRF: Support for priming the mediaproxy cache (`Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy`)
 - MRF: Support for excluding specific domains from Transparency.
 - MRF: Support for filtering posts based on who they mention (`Pleroma.Web.ActivityPub.MRF.MentionPolicy`)
-- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
-- MRF (Simple Policy): Support for wildcard domains.
-- Support for wildcard domains in user domain blocks setting.
-- Configuration: `quarantined_instances` support wildcard domains.
-- Configuration: `federation_incoming_replies_max_depth` option
 - Mastodon API: Support for the [`tagged` filter](https://github.com/tootsuite/mastodon/pull/9755) in [`GET /api/v1/accounts/:id/statuses`](https://docs.joinmastodon.org/api/rest/accounts/#get-api-v1-accounts-id-statuses)
 - Mastodon API, streaming: Add support for passing the token in the `Sec-WebSocket-Protocol` header
 - Mastodon API, extension: Ability to reset avatar, profile banner, and background

@@ -110,9 +92,6 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - Admin API: Endpoint for fetching latest user's statuses
 - Pleroma API: Add `/api/v1/pleroma/accounts/confirmation_resend?email=<email>` for resending account confirmation.
 - Pleroma API: Email change endpoint.
-- Relays: Added a task to list relay subscriptions.
-- Mix Tasks: `mix pleroma.database fix_likes_collections`
-- Federation: Remove `likes` from objects.
 - Admin API: Added moderation log
 - Web response cache (currently, enabled for ActivityPub)
 - Mastodon API: Added an endpoint to get multiple statuses by IDs (`GET /api/v1/statuses/?ids[]=1&ids[]=2`)

@@ -124,6 +103,61 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - RichMedia: parsers and their order are configured in `rich_media` config.
 - RichMedia: add the rich media ttl based on image expiration time.

+## [1.0.6] - 2019-08-14
+### Fixed
+- MRF: fix use of unserializable keyword lists in describe() implementations
+- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
+
+## [1.0.5] - 2019-08-13
+### Fixed
+- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
+- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
+- Mastodon API: return the actual profile URL in the Account entity's `url` property when appropriate
+- Templates: properly style anchor tags
+- Objects being re-embedded to activities after being updated (e.g faved/reposted). Running 'mix pleroma.database prune_objects' again is advised.
+- Not being able to access the Mastodon FE login page on private instances
+- MRF: ensure that subdomain_match calls are case-insensitive
+- Fix internal server error when using the healthcheck API.
+
+### Added
+- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
+  Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
+- Relays: Added a task to list relay subscriptions.
+- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
+- MRF (Simple Policy): Support for wildcard domains.
+- Support for wildcard domains in user domain blocks setting.
+- Configuration: `quarantined_instances` support wildcard domains.
+- Mix Tasks: `mix pleroma.database fix_likes_collections`
+- Configuration: `federation_incoming_replies_max_depth` option
+
+### Removed
+- Federation: Remove `likes` from objects.
+- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
+
+## [1.0.4] - 2019-08-01
+### Fixed
+- Invalid SemVer version generation, when the current branch does not have commits ahead of tag/checked out on a tag
+
+## [1.0.3] - 2019-07-31
+### Security
+- OStatus: eliminate the possibility of a protocol downgrade attack.
+- OStatus: prevent following locked accounts, bypassing the approval process.
+- TwitterAPI: use CommonAPI to handle remote follows instead of OStatus.
+
+## [1.0.2] - 2019-07-28
+### Fixed
+- Not being able to pin unlisted posts
+- Mastodon API: represent poll IDs as strings
+- MediaProxy: fix matching filenames
+- MediaProxy: fix filename encoding
+- Migrations: fix a sporadic migration failure
+- Metadata rendering errors resulting in the entire page being inaccessible
+- Federation/MediaProxy not working with instances that have wrong certificate order
+- ActivityPub S2S: remote user deletions now work the same as local user deletions.
+
+### Changed
+- Configuration: OpenGraph and TwitterCard providers enabled by default
+- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
+
 ## [1.0.1] - 2019-07-14
 ### Security
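The mailer entry in the changelog above is terse; for an instance that actually sends mail, the new `enabled: true` flag is the only key the changelog names, and the rest of a working mailer block looks roughly like the sketch below. The adapter and SMTP details are illustrative assumptions (Pleroma's mailer is Swoosh-based), not part of this diff:

```elixir
# Sketch only: `enabled: true` is the flag introduced by this release;
# adapter/relay/credentials are placeholders for whatever the instance already uses.
import Config

config :pleroma, Pleroma.Emails.Mailer,
  enabled: true,
  adapter: Swoosh.Adapters.SMTP,
  relay: "smtp.example.com",
  username: "postmaster@example.com",
  password: "secret"
```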
@@ -317,6 +317,7 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
 ```json
 {
+  "total" : 1,
   "reports": [
     {
       "account": {
@@ -43,23 +43,9 @@ def start(_type, _args) do
       hackney_pool_children() ++
         [
           Pleroma.Stats,
-          {Oban, Pleroma.Config.get(Oban)},
-          %{
-            id: :web_push_init,
-            start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
-            restart: :temporary
-          },
-          %{
-            id: :federator_init,
-            start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
-            restart: :temporary
-          },
-          %{
-            id: :internal_fetch_init,
-            start: {Task, :start_link, [&Pleroma.Web.ActivityPub.InternalFetchActor.init/0]},
-            restart: :temporary
-          }
+          {Oban, Pleroma.Config.get(Oban)}
         ] ++
+        task_children(@env) ++
         oauth_cleanup_child(oauth_cleanup_enabled?()) ++
         streamer_child(@env) ++
         chat_child(@env, chat_enabled?()) ++

@@ -163,4 +149,39 @@ defp hackney_pool_children do
       :hackney_pool.child_spec(pool, options)
     end
   end
+
+  defp task_children(:test) do
+    [
+      %{
+        id: :web_push_init,
+        start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
+        restart: :temporary
+      },
+      %{
+        id: :federator_init,
+        start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
+        restart: :temporary
+      }
+    ]
+  end
+
+  defp task_children(_) do
+    [
+      %{
+        id: :web_push_init,
+        start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
+        restart: :temporary
+      },
+      %{
+        id: :federator_init,
+        start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
+        restart: :temporary
+      },
+      %{
+        id: :internal_fetch_init,
+        start: {Task, :start_link, [&Pleroma.Web.ActivityPub.InternalFetchActor.init/0]},
+        restart: :temporary
+      }
+    ]
+  end
 end
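The application change above folds the three one-shot init tasks into `task_children(@env)`, so the test environment starts a trimmed child list (no `InternalFetchActor` init). A self-contained sketch of the same pattern, with hypothetical task names, for readers unfamiliar with the shape of those child specs (assumes compilation under Mix so `Mix.env/0` is available):

```elixir
defmodule EnvTaskChildren do
  # Captured at compile time, as in the Pleroma modules in this diff.
  @env Mix.env()

  def children, do: task_children(@env)

  # In :test, start only the cheap task.
  defp task_children(:test), do: [task_spec(:cheap_init, fn -> :ok end)]

  # In any other env, also start the more expensive one.
  defp task_children(_env) do
    [task_spec(:cheap_init, fn -> :ok end), task_spec(:expensive_init, fn -> :ok end)]
  end

  # One-shot Task child spec: run once, never restart.
  defp task_spec(id, fun) do
    %{id: id, start: {Task, :start_link, [fun]}, restart: :temporary}
  end
end
```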
@@ -14,7 +14,7 @@ defmodule Pleroma.FlakeId do
   @type t :: binary

-  @behaviour Ecto.Type
+  use Ecto.Type
   use GenServer
   require Logger
   alias __MODULE__
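The `@behaviour Ecto.Type` → `use Ecto.Type` swap goes with the Ecto 3.2 bump further down in `mix.exs`/`mix.lock`: `use Ecto.Type` injects default implementations for the newer optional callbacks (such as `embed_as/1` and `equal?/2`) that a bare behaviour declaration leaves undefined. A minimal custom-type sketch with a hypothetical module name, assuming Ecto >= 3.2 is available:

```elixir
defmodule MyApp.DowncasedString do
  # `use` (rather than `@behaviour`) pulls in defaults for optional callbacks
  # like embed_as/1 and equal?/2 introduced around Ecto 3.2.
  use Ecto.Type

  def type, do: :string

  def cast(value) when is_binary(value), do: {:ok, String.downcase(value)}
  def cast(_), do: :error

  def load(value) when is_binary(value), do: {:ok, value}

  def dump(value) when is_binary(value), do: {:ok, value}
  def dump(_), do: :error
end
```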
@@ -150,6 +150,7 @@ def get_cached_follow_state(user, target) do
     Cachex.fetch!(:user_cache, key, fn _ -> {:commit, follow_state(user, target)} end)
   end

+  @spec set_follow_state_cache(String.t(), String.t(), String.t()) :: {:ok | :error, boolean()}
   def set_follow_state_cache(user_ap_id, target_ap_id, state) do
     Cachex.put(
       :user_cache,
@@ -410,6 +410,7 @@ def delete(%Object{data: %{"id" => id, "actor" => actor}} = object, local \\ tru
     end
   end

+  @spec block(User.t(), User.t(), String.t() | nil, boolean) :: {:ok, Activity.t() | nil}
   def block(blocker, blocked, activity_id \\ nil, local \\ true) do
     outgoing_blocks = Config.get([:activitypub, :outgoing_blocks])
     unfollow_blocked = Config.get([:activitypub, :unfollow_blocked])

@@ -438,10 +439,11 @@ def unblock(blocker, blocked, activity_id \\ nil, local \\ true) do
     end
   end

+  @spec flag(map()) :: {:ok, Activity.t()} | any
   def flag(
         %{
           actor: actor,
-          context: context,
+          context: _context,
           account: account,
           statuses: statuses,
           content: content

@@ -453,14 +455,6 @@ def flag(
     additional = params[:additional] || %{}

-    params = %{
-      actor: actor,
-      context: context,
-      account: account,
-      statuses: statuses,
-      content: content
-    }
-
     additional =
       if forward do
         Map.merge(additional, %{"to" => [], "cc" => [account.ap_id]})

@@ -1050,7 +1050,7 @@ def upgrade_user_from_ap_id(ap_id) do
     with %User{local: false} = user <- User.get_cached_by_ap_id(ap_id),
          {:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id),
          already_ap <- User.ap_enabled?(user),
-         {:ok, user} <- user |> User.upgrade_changeset(data) |> User.update_and_set_cache() do
+         {:ok, user} <- user |> User.upgrade_changeset(data, true) |> User.update_and_set_cache() do
       unless already_ap do
         TransmogrifierWorker.enqueue("user_upgrade", %{"user_id" => user.id})
       end
@@ -33,50 +33,40 @@ def normalize_params(params) do
     Map.put(params, "actor", get_ap_id(params["actor"]))
   end

-  def determine_explicit_mentions(%{"tag" => tag} = _object) when is_list(tag) do
-    tag
-    |> Enum.filter(fn x -> is_map(x) end)
-    |> Enum.filter(fn x -> x["type"] == "Mention" end)
-    |> Enum.map(fn x -> x["href"] end)
+  @spec determine_explicit_mentions(map()) :: map()
+  def determine_explicit_mentions(%{"tag" => tag} = _) when is_list(tag) do
+    Enum.flat_map(tag, fn
+      %{"type" => "Mention", "href" => href} -> [href]
+      _ -> []
+    end)
   end

   def determine_explicit_mentions(%{"tag" => tag} = object) when is_map(tag) do
-    Map.put(object, "tag", [tag])
+    object
+    |> Map.put("tag", [tag])
     |> determine_explicit_mentions()
   end

   def determine_explicit_mentions(_), do: []

+  @spec recipient_in_collection(any(), any()) :: boolean()
   defp recipient_in_collection(ap_id, coll) when is_binary(coll), do: ap_id == coll
   defp recipient_in_collection(ap_id, coll) when is_list(coll), do: ap_id in coll
   defp recipient_in_collection(_, _), do: false

+  @spec recipient_in_message(User.t(), User.t(), map()) :: boolean()
   def recipient_in_message(%User{ap_id: ap_id} = recipient, %User{} = actor, params) do
+    addresses = [params["to"], params["cc"], params["bto"], params["bcc"]]
+
     cond do
-      recipient_in_collection(ap_id, params["to"]) ->
-        true
-
-      recipient_in_collection(ap_id, params["cc"]) ->
-        true
-
-      recipient_in_collection(ap_id, params["bto"]) ->
-        true
-
-      recipient_in_collection(ap_id, params["bcc"]) ->
-        true
+      Enum.any?(addresses, &recipient_in_collection(ap_id, &1)) -> true

       # if the message is unaddressed at all, then assume it is directly addressed
       # to the recipient
-      !params["to"] && !params["cc"] && !params["bto"] && !params["bcc"] ->
-        true
+      Enum.all?(addresses, &is_nil(&1)) -> true

       # if the message is sent from somebody the user is following, then assume it
       # is addressed to the recipient
-      User.following?(recipient, actor) ->
-        true
-
-      true ->
-        false
+      User.following?(recipient, actor) -> true
+      true -> false
     end
   end
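For reference, the rewritten `determine_explicit_mentions/1` above collapses the old filter/filter/map chain into a single `Enum.flat_map/2` pass: only `Mention` tags that carry an `href` survive. A small standalone illustration of that core expression (sample data is made up):

```elixir
tags = [
  %{"type" => "Mention", "href" => "https://example.social/users/alice"},
  %{"type" => "Hashtag", "name" => "#elixir"},
  "not-a-map"
]

mentions =
  Enum.flat_map(tags, fn
    %{"type" => "Mention", "href" => href} -> [href]
    _ -> []
  end)

# mentions == ["https://example.social/users/alice"]
```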
@@ -179,53 +169,58 @@ def maybe_federate(_), do: :ok
   Adds an id and a published data if they aren't there,
   also adds it to an included object
   """
-  def lazy_put_activity_defaults(map, fake? \\ false) do
-    map =
-      if not fake? do
-        %{data: %{"id" => context}, id: context_id} = create_context(map["context"])
-
-        map
-        |> Map.put_new_lazy("id", &generate_activity_id/0)
-        |> Map.put_new_lazy("published", &make_date/0)
-        |> Map.put_new("context", context)
-        |> Map.put_new("context_id", context_id)
-      else
-        map
-        |> Map.put_new("id", "pleroma:fakeid")
-        |> Map.put_new_lazy("published", &make_date/0)
-        |> Map.put_new("context", "pleroma:fakecontext")
-        |> Map.put_new("context_id", -1)
-      end
-
-    if is_map(map["object"]) do
-      object = lazy_put_object_defaults(map["object"], map, fake?)
-      %{map | "object" => object}
-    else
-      map
-    end
-  end
-
-  @doc """
-  Adds an id and published date if they aren't there.
-  """
-  def lazy_put_object_defaults(map, activity \\ %{}, fake?)
-
-  def lazy_put_object_defaults(map, activity, true = _fake?) do
-    map
-    |> Map.put_new_lazy("published", &make_date/0)
-    |> Map.put_new("id", "pleroma:fake_object_id")
-    |> Map.put_new("context", activity["context"])
-    |> Map.put_new("fake", true)
-    |> Map.put_new("context_id", activity["context_id"])
-  end
-
-  def lazy_put_object_defaults(map, activity, _fake?) do
-    map
-    |> Map.put_new_lazy("id", &generate_object_id/0)
-    |> Map.put_new_lazy("published", &make_date/0)
-    |> Map.put_new("context", activity["context"])
-    |> Map.put_new("context_id", activity["context_id"])
-  end
+  @spec lazy_put_activity_defaults(map(), boolean) :: map()
+  def lazy_put_activity_defaults(map, fake? \\ false)
+
+  def lazy_put_activity_defaults(map, true) do
+    map
+    |> Map.put_new("id", "pleroma:fakeid")
+    |> Map.put_new_lazy("published", &make_date/0)
+    |> Map.put_new("context", "pleroma:fakecontext")
+    |> Map.put_new("context_id", -1)
+    |> lazy_put_object_defaults(true)
+  end
+
+  def lazy_put_activity_defaults(map, _fake?) do
+    %{data: %{"id" => context}, id: context_id} = create_context(map["context"])
+
+    map
+    |> Map.put_new_lazy("id", &generate_activity_id/0)
+    |> Map.put_new_lazy("published", &make_date/0)
+    |> Map.put_new("context", context)
+    |> Map.put_new("context_id", context_id)
+    |> lazy_put_object_defaults(false)
+  end
+
+  # Adds an id and published date if they aren't there.
+  #
+  @spec lazy_put_object_defaults(map(), boolean()) :: map()
+  defp lazy_put_object_defaults(%{"object" => map} = activity, true)
+       when is_map(map) do
+    object =
+      map
+      |> Map.put_new("id", "pleroma:fake_object_id")
+      |> Map.put_new_lazy("published", &make_date/0)
+      |> Map.put_new("context", activity["context"])
+      |> Map.put_new("context_id", activity["context_id"])
+      |> Map.put_new("fake", true)
+
+    %{activity | "object" => object}
+  end
+
+  defp lazy_put_object_defaults(%{"object" => map} = activity, _)
+       when is_map(map) do
+    object =
+      map
+      |> Map.put_new_lazy("id", &generate_object_id/0)
+      |> Map.put_new_lazy("published", &make_date/0)
+      |> Map.put_new("context", activity["context"])
+      |> Map.put_new("context_id", activity["context_id"])
+
+    %{activity | "object" => object}
+  end
+
+  defp lazy_put_object_defaults(activity, _), do: activity

   @doc """
   Inserts a full object if it is contained in an activity.
@@ -345,24 +340,24 @@ defp fetch_likes(object) do
   @doc """
   Updates a follow activity's state (for locked accounts).
   """
+  @spec update_follow_state_for_all(Activity.t(), String.t()) :: {:ok, Activity} | {:error, any()}
   def update_follow_state_for_all(
         %Activity{data: %{"actor" => actor, "object" => object}} = activity,
         state
       ) do
-    try do
-      Ecto.Adapters.SQL.query!(
-        Repo,
-        "UPDATE activities SET data = jsonb_set(data, '{state}', $1) WHERE data->>'type' = 'Follow' AND data->>'actor' = $2 AND data->>'object' = $3 AND data->>'state' = 'pending'",
-        [state, actor, object]
-      )
+    "Follow"
+    |> Activity.Queries.by_type()
+    |> Activity.Queries.by_actor(actor)
+    |> Activity.Queries.by_object_id(object)
+    |> where(fragment("data->>'state' = 'pending'"))
+    |> update(set: [data: fragment("jsonb_set(data, '{state}', ?)", ^state)])
+    |> Repo.update_all([])

-      User.set_follow_state_cache(actor, object, state)
-      activity = Activity.get_by_id(activity.id)
-      {:ok, activity}
-    rescue
-      e ->
-        {:error, e}
-    end
+    User.set_follow_state_cache(actor, object, state)
+
+    activity = Activity.get_by_id(activity.id)
+
+    {:ok, activity}
   end

   def update_follow_state(
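The hunk above trades a hand-written `Ecto.Adapters.SQL.query!/3` string for a composable Ecto query ending in `Repo.update_all/2`. A generic, schemaless sketch of that `jsonb_set` + `update_all` pattern, assuming Ecto is available (table and repo names are placeholders; note the bound value must be valid JSON text, hence the quoted literal):

```elixir
import Ecto.Query

# A JSON-encoded string literal: the characters "accept" including the quotes.
new_state = ~s("accept")

query =
  from(a in "activities",
    where: fragment("data->>'type' = 'Follow'"),
    where: fragment("data->>'state' = 'pending'"),
    update: [set: [data: fragment("jsonb_set(data, '{state}', ?::jsonb)", ^new_state)]]
  )

# MyApp.Repo.update_all(query, [])
```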
@@ -413,6 +408,7 @@ def fetch_latest_follow(%User{ap_id: follower_id}, %User{ap_id: followed_id}) do
   @doc """
   Retruns an existing announce activity if the notice has already been announced
   """
+  @spec get_existing_announce(String.t(), map()) :: Activity.t() | nil
   def get_existing_announce(actor, %{data: %{"id" => ap_id}}) do
     "Announce"
     |> Activity.Queries.by_type()
@@ -495,33 +491,35 @@ def make_unlike_data(
     |> maybe_put("id", activity_id)
   end

+  @spec add_announce_to_object(Activity.t(), Object.t()) ::
+          {:ok, Object.t()} | {:error, Ecto.Changeset.t()}
   def add_announce_to_object(
-        %Activity{
-          data: %{"actor" => actor, "cc" => [Pleroma.Constants.as_public()]}
-        },
+        %Activity{data: %{"actor" => actor, "cc" => [Pleroma.Constants.as_public()]}},
         object
       ) do
-    announcements =
-      if is_list(object.data["announcements"]) do
-        Enum.uniq([actor | object.data["announcements"]])
-      else
-        [actor]
-      end
+    announcements = take_announcements(object)

-    update_element_in_object("announcement", announcements, object)
+    with announcements <- Enum.uniq([actor | announcements]) do
+      update_element_in_object("announcement", announcements, object)
+    end
   end

   def add_announce_to_object(_, object), do: {:ok, object}

+  @spec remove_announce_from_object(Activity.t(), Object.t()) ::
+          {:ok, Object.t()} | {:error, Ecto.Changeset.t()}
   def remove_announce_from_object(%Activity{data: %{"actor" => actor}}, object) do
-    announcements =
-      if is_list(object.data["announcements"]), do: object.data["announcements"], else: []
-
-    with announcements <- announcements |> List.delete(actor) do
+    with announcements <- List.delete(take_announcements(object), actor) do
       update_element_in_object("announcement", announcements, object)
     end
   end

+  defp take_announcements(%{data: %{"announcements" => announcements}} = _)
+       when is_list(announcements),
+       do: announcements
+
+  defp take_announcements(_), do: []
+
   #### Unfollow-related helpers

   def make_unfollow_data(follower, followed, follow_activity, activity_id) do
@@ -535,6 +533,7 @@ def make_unfollow_data(follower, followed, follow_activity, activity_id) do
   end

   #### Block-related helpers
+  @spec fetch_latest_block(User.t(), User.t()) :: Activity.t() | nil
   def fetch_latest_block(%User{ap_id: blocker_id}, %User{ap_id: blocked_id}) do
     "Block"
     |> Activity.Queries.by_type()
@@ -583,28 +582,32 @@ def make_create_data(params, additional) do
   end

   #### Flag-related helpers
-  def make_flag_data(params, additional) do
-    status_ap_ids =
-      Enum.map(params.statuses || [], fn
-        %Activity{} = act -> act.data["id"]
-        act when is_map(act) -> act["id"]
-        act when is_binary(act) -> act
-      end)
-
-    object = [params.account.ap_id] ++ status_ap_ids
-
+  @spec make_flag_data(map(), map()) :: map()
+  def make_flag_data(%{actor: actor, context: context, content: content} = params, additional) do
     %{
       "type" => "Flag",
-      "actor" => params.actor.ap_id,
-      "content" => params.content,
-      "object" => object,
-      "context" => params.context,
+      "actor" => actor.ap_id,
+      "content" => content,
+      "object" => build_flag_object(params),
+      "context" => context,
       "state" => "open"
     }
     |> Map.merge(additional)
   end
+
+  def make_flag_data(_, _), do: %{}
+
+  defp build_flag_object(%{account: account, statuses: statuses} = _) do
+    [account.ap_id] ++
+      Enum.map(statuses || [], fn
+        %Activity{} = act -> act.data["id"]
+        act when is_map(act) -> act["id"]
+        act when is_binary(act) -> act
+      end)
+  end
+
+  defp build_flag_object(_), do: []

   @doc """
   Fetches the OrderedCollection/OrderedCollectionPage from `from`, limiting the amount of pages fetched after
   the first one to `pages_left` pages.
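Concretely, the `make_flag_data/2` + `build_flag_object/1` pair above produces report payloads shaped like the following data literal (the IDs are illustrative; the `object` list is the reported account's AP id followed by the flagged status ids, and `Map.merge(additional)` later layers in addressing such as `"to"`/`"cc"` when the report is forwarded):

```elixir
%{
  "type" => "Flag",
  "actor" => "https://example.social/users/moderator",
  "content" => "please look into this",
  "object" => [
    "https://example.social/users/spammer",
    "https://example.social/objects/9y2d7Qqqf2hVfEMoMc"
  ],
  "context" => "https://example.social/contexts/3f3b5d2e",
  "state" => "open"
}
```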
@@ -442,11 +442,9 @@ def list_reports(conn, params) do
       params
       |> Map.put("type", "Flag")
       |> Map.put("skip_preload", true)
+      |> Map.put("total", true)

-    reports =
-      []
-      |> ActivityPub.fetch_activities(params)
-      |> Enum.reverse()
+    reports = ActivityPub.fetch_activities([], params)

     conn
     |> put_view(ReportView)
@@ -12,7 +12,9 @@ defmodule Pleroma.Web.AdminAPI.ReportView do

   def render("index.json", %{reports: reports}) do
     %{
-      reports: render_many(reports, __MODULE__, "show.json", as: :report)
+      reports:
+        render_many(reports[:items], __MODULE__, "show.json", as: :report) |> Enum.reverse(),
+      total: reports[:total]
     }
   end
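With the controller now passing `"total" => true` and the view reading `reports[:items]`/`reports[:total]`, the admin report index renders to a map carrying both keys, matching the documentation change near the top of this diff. Roughly, as a sketch with the per-report fields elided:

```elixir
%{
  total: 1,
  reports: [
    %{
      # one "show.json"-rendered report per item; the list is reversed
      # relative to fetch order by the Enum.reverse/1 call in the view
    }
  ]
}
```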
@@ -4,16 +4,18 @@ defmodule Pleroma.Web.Streamer.State do

   alias Pleroma.Web.Streamer.StreamerSocket

+  @env Mix.env()
+
   def start_link(_) do
     GenServer.start_link(__MODULE__, %{sockets: %{}}, name: __MODULE__)
   end

   def add_socket(topic, socket) do
-    GenServer.call(__MODULE__, {:add, socket, topic})
+    GenServer.call(__MODULE__, {:add, topic, socket})
   end

   def remove_socket(topic, socket) do
-    GenServer.call(__MODULE__, {:remove, socket, topic})
+    do_remove_socket(@env, topic, socket)
   end

   def get_sockets do

@@ -29,7 +31,7 @@ def handle_call(:get_state, _from, state) do
     {:reply, state, state}
   end

-  def handle_call({:add, socket, topic}, _from, %{sockets: sockets} = state) do
+  def handle_call({:add, topic, socket}, _from, %{sockets: sockets} = state) do
     internal_topic = internal_topic(topic, socket)
     stream_socket = StreamerSocket.from_socket(socket)

@@ -44,7 +46,7 @@ def handle_call({:add, socket, topic}, _from, %{sockets: sockets} = state) do
     {:reply, state, state}
   end

-  def handle_call({:remove, socket, topic}, _from, %{sockets: sockets} = state) do
+  def handle_call({:remove, topic, socket}, _from, %{sockets: sockets} = state) do
     internal_topic = internal_topic(topic, socket)
     stream_socket = StreamerSocket.from_socket(socket)

@@ -57,6 +59,14 @@ def handle_call({:remove, socket, topic}, _from, %{sockets: sockets} = state) do
     {:reply, state, state}
   end

+  defp do_remove_socket(:test, _, _) do
+    :ok
+  end
+
+  defp do_remove_socket(_env, topic, socket) do
+    GenServer.call(__MODULE__, {:remove, topic, socket})
+  end
+
   defp internal_topic(topic, socket)
        when topic in ~w[user user:notification direct] do
     "#{topic}:#{socket.assigns[:user].id}"
@@ -10,7 +10,11 @@ defmodule Pleroma.Workers.WebPusherWorker do

   @impl Oban.Worker
   def perform(%{"op" => "web_push", "notification_id" => notification_id}, _job) do
-    notification = Repo.get(Notification, notification_id)
+    notification =
+      Notification
+      |> Repo.get(notification_id)
+      |> Repo.preload([:activity])
+
     Pleroma.Web.Push.Impl.perform(notification)
   end
 end
mix.exs (8 changed lines)

@@ -5,7 +5,7 @@ def project do
     [
       app: :pleroma,
       version: version("1.0.0"),
-      elixir: "~> 1.7",
+      elixir: "~> 1.8",
       elixirc_paths: elixirc_paths(Mix.env()),
       compilers: [:phoenix, :gettext] ++ Mix.compilers(),
       elixirc_options: [warnings_as_errors: true],

@@ -99,9 +99,9 @@ defp deps do
       {:plug_cowboy, "~> 2.0"},
       {:phoenix_pubsub, "~> 1.1"},
       {:phoenix_ecto, "~> 4.0"},
-      {:ecto_sql, "~> 3.1"},
+      {:ecto_sql, "~> 3.2"},
       {:postgrex, ">= 0.13.5"},
-      {:oban, "~> 0.7"},
+      {:oban, "~> 0.8.1"},
       {:quantum, "~> 2.3"},
       {:gettext, "~> 0.15"},
       {:comeonin, "~> 4.1.1"},

@@ -113,7 +113,7 @@ defp deps do
       {:calendar, "~> 0.17.4"},
       {:cachex, "~> 3.0.2"},
       {:poison, "~> 3.0", override: true},
-      {:tesla, "~> 1.2"},
+      {:tesla, "~> 1.3", override: true},
       {:jason, "~> 1.0"},
       {:mogrify, "~> 0.6.1"},
       {:ex_aws, "~> 2.1"},
10
mix.lock
10
mix.lock
|
@ -21,8 +21,8 @@
|
||||||
"decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"},
|
"decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"},
|
||||||
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm"},
|
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm"},
|
||||||
"earmark": {:hex, :earmark, "1.3.6", "ce1d0675e10a5bb46b007549362bd3f5f08908843957687d8484fe7f37466b19", [:mix], [], "hexpm"},
|
"earmark": {:hex, :earmark, "1.3.6", "ce1d0675e10a5bb46b007549362bd3f5f08908843957687d8484fe7f37466b19", [:mix], [], "hexpm"},
|
||||||
"ecto": {:hex, :ecto, "3.1.4", "69d852da7a9f04ede725855a35ede48d158ca11a404fe94f8b2fb3b2162cd3c9", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
|
"ecto": {:hex, :ecto, "3.2.0", "940e2598813f205223d60c78d66e514afe1db5167ed8075510a59e496619cfb5", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
|
||||||
"ecto_sql": {:hex, :ecto_sql, "3.1.3", "2c536139190492d9de33c5fefac7323c5eaaa82e1b9bf93482a14649042f7cd9", [:mix], [{:db_connection, "~> 2.0", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.1.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:mariaex, "~> 0.9.1", [hex: :mariaex, repo: "hexpm", optional: true]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.14.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
|
"ecto_sql": {:hex, :ecto_sql, "3.2.0", "751cea597e8deb616084894dd75cbabfdbe7255ff01e8c058ca13f0353a3921b", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.2.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.15.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
"esshd": {:hex, :esshd, "0.1.0", "6f93a2062adb43637edad0ea7357db2702a4b80dd9683482fe00f5134e97f4c1", [:mix], [], "hexpm"},
|
"esshd": {:hex, :esshd, "0.1.0", "6f93a2062adb43637edad0ea7357db2702a4b80dd9683482fe00f5134e97f4c1", [:mix], [], "hexpm"},
|
||||||
"eternal": {:hex, :eternal, "1.2.0", "e2a6b6ce3b8c248f7dc31451aefca57e3bdf0e48d73ae5043229380a67614c41", [:mix], [], "hexpm"},
|
"eternal": {:hex, :eternal, "1.2.0", "e2a6b6ce3b8c248f7dc31451aefca57e3bdf0e48d73ae5043229380a67614c41", [:mix], [], "hexpm"},
|
||||||
"ex2ms": {:hex, :ex2ms, "1.5.0", "19e27f9212be9a96093fed8cdfbef0a2b56c21237196d26760f11dfcfae58e97", [:mix], [], "hexpm"},
|
"ex2ms": {:hex, :ex2ms, "1.5.0", "19e27f9212be9a96093fed8cdfbef0a2b56c21237196d26760f11dfcfae58e97", [:mix], [], "hexpm"},
|
||||||
|
@ -60,7 +60,7 @@
|
||||||
"mogrify": {:hex, :mogrify, "0.6.1", "de1b527514f2d95a7bbe9642eb556061afb337e220cf97adbf3a4e6438ed70af", [:mix], [], "hexpm"},
|
"mogrify": {:hex, :mogrify, "0.6.1", "de1b527514f2d95a7bbe9642eb556061afb337e220cf97adbf3a4e6438ed70af", [:mix], [], "hexpm"},
|
||||||
"mox": {:hex, :mox, "0.5.1", "f86bb36026aac1e6f924a4b6d024b05e9adbed5c63e8daa069bd66fb3292165b", [:mix], [], "hexpm"},
|
"mox": {:hex, :mox, "0.5.1", "f86bb36026aac1e6f924a4b6d024b05e9adbed5c63e8daa069bd66fb3292165b", [:mix], [], "hexpm"},
|
||||||
"nimble_parsec": {:hex, :nimble_parsec, "0.5.1", "c90796ecee0289dbb5ad16d3ad06f957b0cd1199769641c961cfe0b97db190e0", [:mix], [], "hexpm"},
|
"nimble_parsec": {:hex, :nimble_parsec, "0.5.1", "c90796ecee0289dbb5ad16d3ad06f957b0cd1199769641c961cfe0b97db190e0", [:mix], [], "hexpm"},
|
||||||
"oban": {:hex, :oban, "0.7.1", "171bdd1b69c1a4a839f8c768f5e962fc22d1de1513d459fb6b8e0cbd34817a9a", [:mix], [{:ecto_sql, "~> 3.1", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.14", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
|
"oban": {:hex, :oban, "0.8.1", "4bbf62eb1829f856d69aeb5069ac7036afe07db8221a17de2a9169cc7a58a318", [:mix], [{:ecto_sql, "~> 3.1", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.14", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
"parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm"},
|
"parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm"},
|
||||||
"pbkdf2_elixir": {:hex, :pbkdf2_elixir, "0.12.3", "6706a148809a29c306062862c803406e88f048277f6e85b68faf73291e820b84", [:mix], [], "hexpm"},
|
"pbkdf2_elixir": {:hex, :pbkdf2_elixir, "0.12.3", "6706a148809a29c306062862c803406e88f048277f6e85b68faf73291e820b84", [:mix], [], "hexpm"},
|
||||||
"phoenix": {:hex, :phoenix, "1.4.9", "746d098e10741c334d88143d3c94cab1756435f94387a63441792e66ec0ee974", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
|
"phoenix": {:hex, :phoenix, "1.4.9", "746d098e10741c334d88143d3c94cab1756435f94387a63441792e66ec0ee974", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
|
@ -74,7 +74,7 @@
|
||||||
"plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
|
"plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
"poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], [], "hexpm"},
|
"poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], [], "hexpm"},
|
||||||
"poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm"},
|
"poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm"},
|
||||||
"postgrex": {:hex, :postgrex, "0.14.3", "5754dee2fdf6e9e508cbf49ab138df964278700b764177e8f3871e658b345a1e", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.0", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
|
"postgrex": {:hex, :postgrex, "0.15.1", "23ce3417de70f4c0e9e7419ad85bdabcc6860a6925fe2c6f3b1b5b1e8e47bf2f", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
|
||||||
"prometheus": {:hex, :prometheus, "4.4.1", "1e96073b3ed7788053768fea779cbc896ddc3bdd9ba60687f2ad50b252ac87d6", [:mix, :rebar3], [], "hexpm"},
|
"prometheus": {:hex, :prometheus, "4.4.1", "1e96073b3ed7788053768fea779cbc896ddc3bdd9ba60687f2ad50b252ac87d6", [:mix, :rebar3], [], "hexpm"},
|
||||||
"prometheus_ecto": {:hex, :prometheus_ecto, "1.4.1", "6c768ea9654de871e5b32fab2eac348467b3021604ebebbcbd8bcbe806a65ed5", [:mix], [{:ecto, "~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm"},
|
"prometheus_ecto": {:hex, :prometheus_ecto, "1.4.1", "6c768ea9654de871e5b32fab2eac348467b3021604ebebbcbd8bcbe806a65ed5", [:mix], [{:ecto, "~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
"prometheus_ex": {:hex, :prometheus_ex, "3.0.5", "fa58cfd983487fc5ead331e9a3e0aa622c67232b3ec71710ced122c4c453a02f", [:mix], [{:prometheus, "~> 4.0", [hex: :prometheus, repo: "hexpm", optional: false]}], "hexpm"},
|
"prometheus_ex": {:hex, :prometheus_ex, "3.0.5", "fa58cfd983487fc5ead331e9a3e0aa622c67232b3ec71710ced122c4c453a02f", [:mix], [{:prometheus, "~> 4.0", [hex: :prometheus, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
|
@ -90,7 +90,7 @@
|
||||||
"swoosh": {:hex, :swoosh, "0.23.2", "7dda95ff0bf54a2298328d6899c74dae1223777b43563ccebebb4b5d2b61df38", [:mix], [{:cowboy, "~> 1.0.1 or ~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}], "hexpm"},
|
"swoosh": {:hex, :swoosh, "0.23.2", "7dda95ff0bf54a2298328d6899c74dae1223777b43563ccebebb4b5d2b61df38", [:mix], [{:cowboy, "~> 1.0.1 or ~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}], "hexpm"},
|
||||||
"syslog": {:git, "https://github.com/Vagabond/erlang-syslog.git", "4a6c6f2c996483e86c1320e9553f91d337bcb6aa", [tag: "1.0.5"]},
|
"syslog": {:git, "https://github.com/Vagabond/erlang-syslog.git", "4a6c6f2c996483e86c1320e9553f91d337bcb6aa", [tag: "1.0.5"]},
|
||||||
"telemetry": {:hex, :telemetry, "0.4.0", "8339bee3fa8b91cb84d14c2935f8ecf399ccd87301ad6da6b71c09553834b2ab", [:rebar3], [], "hexpm"},
|
"telemetry": {:hex, :telemetry, "0.4.0", "8339bee3fa8b91cb84d14c2935f8ecf399ccd87301ad6da6b71c09553834b2ab", [:rebar3], [], "hexpm"},
|
||||||
"tesla": {:hex, :tesla, "1.2.1", "864783cc27f71dd8c8969163704752476cec0f3a51eb3b06393b3971dc9733ff", [:mix], [{:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "~> 4.4.0", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}], "hexpm"},
|
"tesla": {:hex, :tesla, "1.3.0", "f35d72f029e608f9cdc6f6d6fcc7c66cf6d6512a70cfef9206b21b8bd0203a30", [:mix], [{:castore, "~> 0.1", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, "~> 1.3", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "~> 4.4.0", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 0.4", [hex: :mint, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.3", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm"},
|
||||||
"timex": {:hex, :timex, "3.6.1", "efdf56d0e67a6b956cc57774353b0329c8ab7726766a11547e529357ffdc1d56", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5 or ~> 1.0.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
|
"timex": {:hex, :timex, "3.6.1", "efdf56d0e67a6b956cc57774353b0329c8ab7726766a11547e529357ffdc1d56", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5 or ~> 1.0.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
|
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||
"tzdata": {:hex, :tzdata, "0.5.21", "8cbf3607fcce69636c672d5be2bbb08687fe26639a62bdcc283d267277db7cf0", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm"},
|
"tzdata": {:hex, :tzdata, "0.5.21", "8cbf3607fcce69636c672d5be2bbb08687fe26639a62bdcc283d267277db7cf0", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm"},
|
||||||

priv/repo/migrations/20190917100019_update_oban.exs (new file, 11 lines)
@@ -0,0 +1,11 @@
+defmodule Pleroma.Repo.Migrations.UpdateOban do
+  use Ecto.Migration
+
+  def up do
+    Oban.Migrations.up(version: 4)
+  end
+
+  def down do
+    Oban.Migrations.down(version: 2)
+  end
+end
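For context on the migration above: `Oban.Migrations.up/1` and `Oban.Migrations.down/1` run Oban's own bundled database migrations up to (or back down to) the given schema version, so this file only delegates to them. A rough sketch of the kind of Oban configuration a deployment pairs with such a migration; the repo module and the queue names and sizes below are illustrative assumptions, not the project's actual settings:

```elixir
# config/config.exs -- illustrative only, following Oban's documented config shape.
use Mix.Config

config :pleroma, Oban,
  repo: Pleroma.Repo,
  # queue name => maximum number of jobs that may run concurrently (examples)
  queues: [federator_outgoing: 10, mailer: 5]
```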
@@ -18,6 +18,11 @@ defmodule Pleroma.Integration.MastodonWebsocketTest do
         |> Map.put(:path, "/api/v1/streaming")
         |> URI.to_string()
 
+  setup_all do
+    start_supervised(Pleroma.Web.Streamer.supervisor())
+    :ok
+  end
+
   def start_socket(qs \\ nil, headers \\ []) do
     path =
       case qs do
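The `setup_all` block added above boots the streamer's supervision tree once for the whole test module via ExUnit's `start_supervised/1`, which is why the per-test `@tag needs_streamer: true` markers disappear in the hunks that follow. A minimal, self-contained sketch of that ExUnit mechanism (the `Counter` agent is a made-up example, not Pleroma code):

```elixir
defmodule CounterTest do
  use ExUnit.Case, async: true

  defmodule Counter do
    use Agent

    def start_link(_opts), do: Agent.start_link(fn -> 0 end, name: __MODULE__)
    def value, do: Agent.get(__MODULE__, & &1)
  end

  setup_all do
    # Started under the test supervisor and shut down automatically
    # once the whole module has finished running.
    start_supervised!(Counter)
    :ok
  end

  test "the supervised process is available to every test in the module" do
    assert Counter.value() == 0
  end
end
```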
@ -32,6 +37,7 @@ test "refuses invalid requests" do
|
||||||
capture_log(fn ->
|
capture_log(fn ->
|
||||||
assert {:error, {400, _}} = start_socket()
|
assert {:error, {400, _}} = start_socket()
|
||||||
assert {:error, {404, _}} = start_socket("?stream=ncjdk")
|
assert {:error, {404, _}} = start_socket("?stream=ncjdk")
|
||||||
|
Process.sleep(30)
|
||||||
end)
|
end)
|
||||||
end
|
end
|
||||||
|
|
||||||
|
@ -39,17 +45,16 @@ test "requires authentication and a valid token for protected streams" do
|
||||||
capture_log(fn ->
|
capture_log(fn ->
|
||||||
assert {:error, {403, _}} = start_socket("?stream=user&access_token=aaaaaaaaaaaa")
|
assert {:error, {403, _}} = start_socket("?stream=user&access_token=aaaaaaaaaaaa")
|
||||||
assert {:error, {403, _}} = start_socket("?stream=user")
|
assert {:error, {403, _}} = start_socket("?stream=user")
|
||||||
|
Process.sleep(30)
|
||||||
end)
|
end)
|
||||||
end
|
end
|
||||||
|
|
||||||
@tag needs_streamer: true
|
|
||||||
test "allows public streams without authentication" do
|
test "allows public streams without authentication" do
|
||||||
assert {:ok, _} = start_socket("?stream=public")
|
assert {:ok, _} = start_socket("?stream=public")
|
||||||
assert {:ok, _} = start_socket("?stream=public:local")
|
assert {:ok, _} = start_socket("?stream=public:local")
|
||||||
assert {:ok, _} = start_socket("?stream=hashtag&tag=lain")
|
assert {:ok, _} = start_socket("?stream=hashtag&tag=lain")
|
||||||
end
|
end
|
||||||
|
|
||||||
@tag needs_streamer: true
|
|
||||||
test "receives well formatted events" do
|
test "receives well formatted events" do
|
||||||
user = insert(:user)
|
user = insert(:user)
|
||||||
{:ok, _} = start_socket("?stream=public")
|
{:ok, _} = start_socket("?stream=public")
|
||||||
|
@ -94,31 +99,32 @@ test "accepts valid tokens", state do
|
||||||
assert {:ok, _} = start_socket("?stream=user&access_token=#{state.token.token}")
|
assert {:ok, _} = start_socket("?stream=user&access_token=#{state.token.token}")
|
||||||
end
|
end
|
||||||
|
|
||||||
@tag needs_streamer: true
|
|
||||||
test "accepts the 'user' stream", %{token: token} = _state do
|
test "accepts the 'user' stream", %{token: token} = _state do
|
||||||
assert {:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
|
assert {:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
|
||||||
|
|
||||||
assert capture_log(fn ->
|
assert capture_log(fn ->
|
||||||
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user")
|
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user")
|
||||||
|
Process.sleep(30)
|
||||||
end) =~ ":badarg"
|
end) =~ ":badarg"
|
||||||
end
|
end
|
||||||
|
|
||||||
@tag needs_streamer: true
|
|
||||||
test "accepts the 'user:notification' stream", %{token: token} = _state do
|
test "accepts the 'user:notification' stream", %{token: token} = _state do
|
||||||
assert {:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}")
|
assert {:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}")
|
||||||
|
|
||||||
assert capture_log(fn ->
|
assert capture_log(fn ->
|
||||||
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user:notification")
|
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user:notification")
|
||||||
|
Process.sleep(30)
|
||||||
end) =~ ":badarg"
|
end) =~ ":badarg"
|
||||||
end
|
end
|
||||||
|
|
||||||
@tag needs_streamer: true
|
|
||||||
test "accepts valid token on Sec-WebSocket-Protocol header", %{token: token} do
|
test "accepts valid token on Sec-WebSocket-Protocol header", %{token: token} do
|
||||||
assert {:ok, _} = start_socket("?stream=user", [{"Sec-WebSocket-Protocol", token.token}])
|
assert {:ok, _} = start_socket("?stream=user", [{"Sec-WebSocket-Protocol", token.token}])
|
||||||
|
|
||||||
assert capture_log(fn ->
|
assert capture_log(fn ->
|
||||||
assert {:error, {403, "Forbidden"}} =
|
assert {:error, {403, "Forbidden"}} =
|
||||||
start_socket("?stream=user", [{"Sec-WebSocket-Protocol", "I am a friend"}])
|
start_socket("?stream=user", [{"Sec-WebSocket-Protocol", "I am a friend"}])
|
||||||
|
|
||||||
|
Process.sleep(30)
|
||||||
end) =~ ":badarg"
|
end) =~ ":badarg"
|
||||||
end
|
end
|
||||||
end
|
end
|
||||||
|
|
|
@ -87,6 +87,18 @@ test "works with an object that has only IR tags" do
|
||||||
|
|
||||||
assert Utils.determine_explicit_mentions(object) == []
|
assert Utils.determine_explicit_mentions(object) == []
|
||||||
end
|
end
|
||||||
|
|
||||||
|
test "works with an object has tags as map" do
|
||||||
|
object = %{
|
||||||
|
"tag" => %{
|
||||||
|
"type" => "Mention",
|
||||||
|
"href" => "https://example.com/~alyssa",
|
||||||
|
"name" => "Alyssa P. Hacker"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
assert Utils.determine_explicit_mentions(object) == ["https://example.com/~alyssa"]
|
||||||
|
end
|
||||||
end
|
end
|
||||||
|
|
||||||
describe "make_unlike_data/3" do
|
describe "make_unlike_data/3" do
|
||||||
|
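The added test exercises `determine_explicit_mentions/1` with a `"tag"` field that is a single map instead of a list. One plausible way to support that is a clause that wraps the map in a list and falls through to the existing list handling; the standalone sketch below is an assumption about the shape of such a fix, not the commit's actual implementation in `Pleroma.Web.ActivityPub.Utils`:

```elixir
defmodule MentionsSketch do
  # Normalise a single-map "tag" into a one-element list, then reuse the list clause.
  def determine_explicit_mentions(%{"tag" => tag} = object) when is_map(tag) do
    object
    |> Map.put("tag", [tag])
    |> determine_explicit_mentions()
  end

  # Collect the href of every Mention tag; non-mention entries are skipped.
  def determine_explicit_mentions(%{"tag" => tags}) when is_list(tags) do
    for %{"type" => "Mention", "href" => href} <- tags, do: href
  end

  def determine_explicit_mentions(_), do: []
end
```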
@ -300,8 +312,8 @@ test "updates the state of all Follow activities with the same actor and object"
|
||||||
{:ok, follow_activity_two} =
|
{:ok, follow_activity_two} =
|
||||||
Utils.update_follow_state_for_all(follow_activity_two, "accept")
|
Utils.update_follow_state_for_all(follow_activity_two, "accept")
|
||||||
|
|
||||||
assert Repo.get(Activity, follow_activity.id).data["state"] == "accept"
|
assert refresh_record(follow_activity).data["state"] == "accept"
|
||||||
assert Repo.get(Activity, follow_activity_two.id).data["state"] == "accept"
|
assert refresh_record(follow_activity_two).data["state"] == "accept"
|
||||||
end
|
end
|
||||||
end
|
end
|
||||||
|
|
||||||
|
@ -323,8 +335,8 @@ test "updates the state of the given follow activity" do
|
||||||
|
|
||||||
{:ok, follow_activity_two} = Utils.update_follow_state(follow_activity_two, "reject")
|
{:ok, follow_activity_two} = Utils.update_follow_state(follow_activity_two, "reject")
|
||||||
|
|
||||||
assert Repo.get(Activity, follow_activity.id).data["state"] == "pending"
|
assert refresh_record(follow_activity).data["state"] == "pending"
|
||||||
assert Repo.get(Activity, follow_activity_two.id).data["state"] == "reject"
|
assert refresh_record(follow_activity_two).data["state"] == "reject"
|
||||||
end
|
end
|
||||||
end
|
end
|
||||||
|
|
||||||
|
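`refresh_record/1`, used in the rewritten assertions above, is one of Pleroma's test helpers: it reloads a struct from the database by primary key so the assertion sees the state written by the code under test rather than the stale in-memory copy. Roughly, such a helper boils down to the following sketch (the actual implementation lives in Pleroma's test support code and may differ):

```elixir
# Hypothetical sketch of a refresh_record/1 helper, not Pleroma's exact code.
def refresh_record(%{id: id, __struct__: schema} = _record) do
  # Re-fetch the same row so callers see what is currently persisted.
  Pleroma.Repo.get(schema, id)
end
```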
@ -401,4 +413,216 @@ test "fetches existing like" do
|
||||||
assert ^like_activity = Utils.get_existing_like(user.ap_id, object)
|
assert ^like_activity = Utils.get_existing_like(user.ap_id, object)
|
||||||
end
|
end
|
||||||
end
|
end
|
||||||
|
|
||||||
|
describe "get_get_existing_announce/2" do
|
||||||
|
test "returns nil if announce not found" do
|
||||||
|
actor = insert(:user)
|
||||||
|
refute Utils.get_existing_announce(actor.ap_id, %{data: %{"id" => "test"}})
|
||||||
|
end
|
||||||
|
|
||||||
|
test "fetches existing announce" do
|
||||||
|
note_activity = insert(:note_activity)
|
||||||
|
assert object = Object.normalize(note_activity)
|
||||||
|
actor = insert(:user)
|
||||||
|
|
||||||
|
{:ok, announce, _object} = ActivityPub.announce(actor, object)
|
||||||
|
assert Utils.get_existing_announce(actor.ap_id, object) == announce
|
||||||
|
end
|
||||||
|
end
|
||||||
|
|
||||||
|
describe "fetch_latest_block/2" do
|
||||||
|
test "fetches last block activities" do
|
||||||
|
user1 = insert(:user)
|
||||||
|
user2 = insert(:user)
|
||||||
|
|
||||||
|
assert {:ok, %Activity{} = _} = ActivityPub.block(user1, user2)
|
||||||
|
assert {:ok, %Activity{} = _} = ActivityPub.block(user1, user2)
|
||||||
|
assert {:ok, %Activity{} = activity} = ActivityPub.block(user1, user2)
|
||||||
|
|
||||||
|
assert Utils.fetch_latest_block(user1, user2) == activity
|
||||||
|
end
|
||||||
|
end
|
||||||
|
|
||||||
|
describe "recipient_in_message/3" do
|
||||||
|
test "returns true when recipient in `to`" do
|
||||||
|
recipient = insert(:user)
|
||||||
|
actor = insert(:user)
|
||||||
|
assert Utils.recipient_in_message(recipient, actor, %{"to" => recipient.ap_id})
|
||||||
|
|
||||||
|
assert Utils.recipient_in_message(
|
||||||
|
recipient,
|
||||||
|
actor,
|
||||||
|
%{"to" => [recipient.ap_id], "cc" => ""}
|
||||||
|
)
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns true when recipient in `cc`" do
|
||||||
|
recipient = insert(:user)
|
||||||
|
actor = insert(:user)
|
||||||
|
assert Utils.recipient_in_message(recipient, actor, %{"cc" => recipient.ap_id})
|
||||||
|
|
||||||
|
assert Utils.recipient_in_message(
|
||||||
|
recipient,
|
||||||
|
actor,
|
||||||
|
%{"cc" => [recipient.ap_id], "to" => ""}
|
||||||
|
)
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns true when recipient in `bto`" do
|
||||||
|
recipient = insert(:user)
|
||||||
|
actor = insert(:user)
|
||||||
|
assert Utils.recipient_in_message(recipient, actor, %{"bto" => recipient.ap_id})
|
||||||
|
|
||||||
|
assert Utils.recipient_in_message(
|
||||||
|
recipient,
|
||||||
|
actor,
|
||||||
|
%{"bcc" => "", "bto" => [recipient.ap_id]}
|
||||||
|
)
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns true when recipient in `bcc`" do
|
||||||
|
recipient = insert(:user)
|
||||||
|
actor = insert(:user)
|
||||||
|
assert Utils.recipient_in_message(recipient, actor, %{"bcc" => recipient.ap_id})
|
||||||
|
|
||||||
|
assert Utils.recipient_in_message(
|
||||||
|
recipient,
|
||||||
|
actor,
|
||||||
|
%{"bto" => "", "bcc" => [recipient.ap_id]}
|
||||||
|
)
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns true when message without addresses fields" do
|
||||||
|
recipient = insert(:user)
|
||||||
|
actor = insert(:user)
|
||||||
|
assert Utils.recipient_in_message(recipient, actor, %{"bccc" => recipient.ap_id})
|
||||||
|
|
||||||
|
assert Utils.recipient_in_message(
|
||||||
|
recipient,
|
||||||
|
actor,
|
||||||
|
%{"btod" => "", "bccc" => [recipient.ap_id]}
|
||||||
|
)
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns false" do
|
||||||
|
recipient = insert(:user)
|
||||||
|
actor = insert(:user)
|
||||||
|
refute Utils.recipient_in_message(recipient, actor, %{"to" => "ap_id"})
|
||||||
|
end
|
||||||
|
end
|
||||||
|
|
||||||
|
describe "lazy_put_activity_defaults/2" do
|
||||||
|
test "returns map with id and published data" do
|
||||||
|
note_activity = insert(:note_activity)
|
||||||
|
object = Object.normalize(note_activity)
|
||||||
|
res = Utils.lazy_put_activity_defaults(%{"context" => object.data["id"]})
|
||||||
|
assert res["context"] == object.data["id"]
|
||||||
|
assert res["context_id"] == object.id
|
||||||
|
assert res["id"]
|
||||||
|
assert res["published"]
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns map with fake id and published data" do
|
||||||
|
assert %{
|
||||||
|
"context" => "pleroma:fakecontext",
|
||||||
|
"context_id" => -1,
|
||||||
|
"id" => "pleroma:fakeid",
|
||||||
|
"published" => _
|
||||||
|
} = Utils.lazy_put_activity_defaults(%{}, true)
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns activity data with object" do
|
||||||
|
note_activity = insert(:note_activity)
|
||||||
|
object = Object.normalize(note_activity)
|
||||||
|
|
||||||
|
res =
|
||||||
|
Utils.lazy_put_activity_defaults(%{
|
||||||
|
"context" => object.data["id"],
|
||||||
|
"object" => %{}
|
||||||
|
})
|
||||||
|
|
||||||
|
assert res["context"] == object.data["id"]
|
||||||
|
assert res["context_id"] == object.id
|
||||||
|
assert res["id"]
|
||||||
|
assert res["published"]
|
||||||
|
assert res["object"]["id"]
|
||||||
|
assert res["object"]["published"]
|
||||||
|
assert res["object"]["context"] == object.data["id"]
|
||||||
|
assert res["object"]["context_id"] == object.id
|
||||||
|
end
|
||||||
|
end
|
||||||
|
|
||||||
|
describe "make_flag_data" do
|
||||||
|
test "returns empty map when params is invalid" do
|
||||||
|
assert Utils.make_flag_data(%{}, %{}) == %{}
|
||||||
|
end
|
||||||
|
|
||||||
|
test "returns map with Flag object" do
|
||||||
|
reporter = insert(:user)
|
||||||
|
target_account = insert(:user)
|
||||||
|
{:ok, activity} = CommonAPI.post(target_account, %{"status" => "foobar"})
|
||||||
|
context = Utils.generate_context_id()
|
||||||
|
content = "foobar"
|
||||||
|
|
||||||
|
target_ap_id = target_account.ap_id
|
||||||
|
activity_ap_id = activity.data["id"]
|
||||||
|
|
||||||
|
res =
|
||||||
|
Utils.make_flag_data(
|
||||||
|
%{
|
||||||
|
actor: reporter,
|
||||||
|
context: context,
|
||||||
|
account: target_account,
|
||||||
|
statuses: [%{"id" => activity.data["id"]}],
|
||||||
|
content: content
|
||||||
|
},
|
||||||
|
%{}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert %{
|
||||||
|
"type" => "Flag",
|
||||||
|
"content" => ^content,
|
||||||
|
"context" => ^context,
|
||||||
|
"object" => [^target_ap_id, ^activity_ap_id],
|
||||||
|
"state" => "open"
|
||||||
|
} = res
|
||||||
|
end
|
||||||
|
end
|
||||||
|
|
||||||
|
describe "add_announce_to_object/2" do
|
||||||
|
test "adds actor to announcement" do
|
||||||
|
user = insert(:user)
|
||||||
|
object = insert(:note)
|
||||||
|
|
||||||
|
activity =
|
||||||
|
insert(:note_activity,
|
||||||
|
data: %{
|
||||||
|
"actor" => user.ap_id,
|
||||||
|
"cc" => [Pleroma.Constants.as_public()]
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert {:ok, updated_object} = Utils.add_announce_to_object(activity, object)
|
||||||
|
assert updated_object.data["announcements"] == [user.ap_id]
|
||||||
|
assert updated_object.data["announcement_count"] == 1
|
||||||
|
end
|
||||||
|
end
|
||||||
|
|
||||||
|
describe "remove_announce_from_object/2" do
|
||||||
|
test "removes actor from announcements" do
|
||||||
|
user = insert(:user)
|
||||||
|
user2 = insert(:user)
|
||||||
|
|
||||||
|
object =
|
||||||
|
insert(:note,
|
||||||
|
data: %{"announcements" => [user.ap_id, user2.ap_id], "announcement_count" => 2}
|
||||||
|
)
|
||||||
|
|
||||||
|
activity = insert(:note_activity, data: %{"actor" => user.ap_id})
|
||||||
|
|
||||||
|
assert {:ok, updated_object} = Utils.remove_announce_from_object(activity, object)
|
||||||
|
assert updated_object.data["announcements"] == [user2.ap_id]
|
||||||
|
assert updated_object.data["announcement_count"] == 1
|
||||||
|
end
|
||||||
|
end
|
||||||
end
|
end
|
||||||
|
|
|
@ -1309,6 +1309,7 @@ test "returns empty response when no reports created", %{conn: conn} do
|
||||||
|> json_response(:ok)
|
|> json_response(:ok)
|
||||||
|
|
||||||
assert Enum.empty?(response["reports"])
|
assert Enum.empty?(response["reports"])
|
||||||
|
assert response["total"] == 0
|
||||||
end
|
end
|
||||||
|
|
||||||
test "returns reports", %{conn: conn} do
|
test "returns reports", %{conn: conn} do
|
||||||
|
@ -1331,6 +1332,8 @@ test "returns reports", %{conn: conn} do
|
||||||
|
|
||||||
assert length(response["reports"]) == 1
|
assert length(response["reports"]) == 1
|
||||||
assert report["id"] == report_id
|
assert report["id"] == report_id
|
||||||
|
|
||||||
|
assert response["total"] == 1
|
||||||
end
|
end
|
||||||
|
|
||||||
test "returns reports with specified state", %{conn: conn} do
|
test "returns reports with specified state", %{conn: conn} do
|
||||||
|
@ -1364,6 +1367,8 @@ test "returns reports with specified state", %{conn: conn} do
|
||||||
assert length(response["reports"]) == 1
|
assert length(response["reports"]) == 1
|
||||||
assert open_report["id"] == first_report_id
|
assert open_report["id"] == first_report_id
|
||||||
|
|
||||||
|
assert response["total"] == 1
|
||||||
|
|
||||||
response =
|
response =
|
||||||
conn
|
conn
|
||||||
|> get("/api/pleroma/admin/reports", %{
|
|> get("/api/pleroma/admin/reports", %{
|
||||||
|
@ -1376,6 +1381,8 @@ test "returns reports with specified state", %{conn: conn} do
|
||||||
assert length(response["reports"]) == 1
|
assert length(response["reports"]) == 1
|
||||||
assert closed_report["id"] == second_report_id
|
assert closed_report["id"] == second_report_id
|
||||||
|
|
||||||
|
assert response["total"] == 1
|
||||||
|
|
||||||
response =
|
response =
|
||||||
conn
|
conn
|
||||||
|> get("/api/pleroma/admin/reports", %{
|
|> get("/api/pleroma/admin/reports", %{
|
||||||
|
@ -1384,6 +1391,7 @@ test "returns reports with specified state", %{conn: conn} do
|
||||||
|> json_response(:ok)
|
|> json_response(:ok)
|
||||||
|
|
||||||
assert Enum.empty?(response["reports"])
|
assert Enum.empty?(response["reports"])
|
||||||
|
assert response["total"] == 0
|
||||||
end
|
end
|
||||||
|
|
||||||
test "returns 403 when requested by a non-admin" do
|
test "returns 403 when requested by a non-admin" do
|
||||||
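The assertions added in these hunks expect the admin report listing to return a `total` count alongside the `reports` array. A minimal illustration of the response shape the tests rely on (the id and state values are invented for the example, and real report entries carry many more fields):

```elixir
# Illustrative shape only, not an actual API response.
%{
  "total" => 1,
  "reports" => [
    %{"id" => "9m3q0dXtNn6N5I6UCa", "state" => "open"}
  ]
}
```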