Merge branch 'develop' into issue/1218

Maksim Pechnikov 2019-09-25 12:24:12 +03:00
commit 1a858134ed
285 changed files with 9234 additions and 2247 deletions

.gitignore

@ -38,7 +38,12 @@ erl_crash.dump
# Prevent committing docs files
/priv/static/doc/*
docs/generated_config.md
# Code test coverage
/cover
/Elixir.*.coverdata
.idea
pleroma.iml


@ -4,13 +4,25 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## [Unreleased]
### Added
- Refreshing poll results for remote polls
- Admin API: Add ability to require password reset
### Changed
- **Breaking:** Elixir >=1.8 is now required (was >= 1.7)
- Replaced [pleroma_job_queue](https://git.pleroma.social/pleroma/pleroma_job_queue) and `Pleroma.Web.Federator.RetryQueue` with [Oban](https://github.com/sorentwo/oban) (see [`docs/config.md`](docs/config.md) on migrating customized worker / retry settings)
- Introduced [quantum](https://github.com/quantum-elixir/quantum-core) job scheduler
- Admin API: Return `total` when querying for reports
- Mastodon API: Return `pleroma.direct_conversation_id` when creating a direct message (`POST /api/v1/statuses`)
### Fixed
- Mastodon API: Fix private and direct statuses not being filtered out from the public timeline for an authenticated user (`GET /api/v1/timelines/public`)
## [1.1.0] - 2019-??-??
### Security
- OStatus: eliminate the possibility of a protocol downgrade attack.
- OStatus: prevent following locked accounts, bypassing the approval process.
- Mastodon API: respect post privacy in `/api/v1/statuses/:id/{favourited,reblogged}_by`
### Removed
- **Breaking:** GNU Social API with Qvitter extensions support
- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
- Emoji: Remove longfox emojis.
- Remove `Reply-To` header from report emails for admins.
@ -18,6 +30,8 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- **Breaking:** Configuration: A setting to explicitly disable the mailer was added, defaulting to true. If you are using a mailer, add `config :pleroma, Pleroma.Emails.Mailer, enabled: true` to your config.
- **Breaking:** Configuration: `/media/` is now removed when `base_url` is configured, append `/media/` to your `base_url` config to keep the old behaviour if desired
- **Breaking:** `/api/pleroma/notifications/read` is moved to `/api/v1/pleroma/notifications/read` and now supports `max_id` and responds with Mastodon API entities.
- **Breaking:** `/api/pleroma/admin/users/invite_token` now uses `POST`, accepts different params, and returns the full invite in JSON instead of only the token string.
- Configuration: added `config/description.exs`, from which `docs/config.md` is generated
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
- Mastodon API: `pleroma.thread_muted` key in the Status entity
@ -25,24 +39,21 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- NodeInfo: Return `skipThreadContainment` in `metadata` for the `skip_thread_containment` option
- NodeInfo: Return `mailerEnabled` in `metadata`
- Mastodon API: Unsubscribe followers when they unfollow a user
- Mastodon API: `pleroma.thread_muted` key in the Status entity
- AdminAPI: Add "godmode" while fetching user statuses (i.e. admin can see private statuses)
- Improve digest email template
- Pagination: (optional) return `total` alongside `items` when paginating
- Add `rel="ugc"` to all links in statuses, to prevent SEO spam
### Fixed
- Following from Osada
- Not being able to pin unlisted posts
- Objects being re-embedded to activities after being updated (e.g. faved/reposted). Running `mix pleroma.database prune_objects` again is advised.
- Favorites timeline doing database-intensive queries
- Metadata rendering errors resulting in the entire page being inaccessible
- `federation_incoming_replies_max_depth` option being ignored in certain cases
- Federation/MediaProxy not working with instances that have wrong certificate order
- Mastodon API: Handling of search timeouts (`/api/v1/search` and `/api/v2/search`)
- Mastodon API: Misskey's endless polls being unable to render
- Mastodon API: Embedded relationships not being properly rendered in the Account entity of Status entity
- Mastodon API: Notifications endpoint crashing if one notification failed to render
- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
- Mastodon API: Add `account_id`, `type`, `offset`, and `limit` to search API (`/api/v1/search` and `/api/v2/search`)
- Mastodon API, streaming: Fix filtering of notifications based on blocks/mutes/thread mutes
- ActivityPub C2S: follower/following collection pages being inaccessible even when authenticated if `hide_followers`/`hide_follows` was set
@ -50,16 +61,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Rich Media: Parser failing when no TTL can be found by image TTL setters
- Rich Media: The crawled URL is now spliced into the rich media data.
- ActivityPub S2S: sharedInbox usage has been mostly aligned with the rules in the AP specification.
- ActivityPub S2S: remote user deletions now work the same as local user deletions.
- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
- Not being able to access the Mastodon FE login page on private instances
- Invalid SemVer version generation when the current branch does not have commits ahead of a tag or is checked out on a tag
- Pleroma.Upload base_url was not automatically whitelisted by MediaProxy. Now your custom CDN or file hosting will be accessed directly as expected.
- Report email not being sent to admins when the reporter is a remote user
- MRF: ensure that subdomain_match calls are case-insensitive
- Reverse Proxy limiting `max_body_length` was incorrectly defined and only checked `Content-Length` headers which may not be sufficient in some circumstances
- MRF: fix use of unserializable keyword lists in describe() implementations
- ActivityPub: Deactivated user deletion
- ActivityPub: Fix `/users/:nickname/inbox` crashing without an authenticated user
- MRF: fix ability to follow a relay when AntiFollowbotPolicy was enabled
### Added
@ -68,16 +74,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Mastodon API: all status JSON responses contain a `pleroma.expires_at` item which states when an activity will expire. The value is only shown to the user who created the activity. To everyone else it's empty.
- Configuration: `ActivityExpiration.enabled` controls whether expired activities will get deleted at the appropriate time. Enabled by default.
- Conversations: Add Pleroma-specific conversation endpoints and status posting extensions. Run the `bump_all_conversations` task again to create the necessary data.
- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
- MRF: Support for priming the mediaproxy cache (`Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy`)
- MRF: Support for excluding specific domains from Transparency.
- MRF: Support for filtering posts based on who they mention (`Pleroma.Web.ActivityPub.MRF.MentionPolicy`)
- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
- MRF (Simple Policy): Support for wildcard domains.
- Support for wildcard domains in user domain blocks setting.
- Configuration: `quarantined_instances` support wildcard domains.
- Configuration: `federation_incoming_replies_max_depth` option
- Mastodon API: Support for the [`tagged` filter](https://github.com/tootsuite/mastodon/pull/9755) in [`GET /api/v1/accounts/:id/statuses`](https://docs.joinmastodon.org/api/rest/accounts/#get-api-v1-accounts-id-statuses)
- Mastodon API, streaming: Add support for passing the token in the `Sec-WebSocket-Protocol` header
- Mastodon API, extension: Ability to reset avatar, profile banner, and background
@ -89,6 +88,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Mastodon API: added `/auth/password` endpoint for password reset with rate limit.
- Mastodon API: /api/v1/accounts/:id/statuses now supports nicknames or user id
- Mastodon API: Improve support for the user profile custom fields
- Mastodon API: follower/following counters are nullified when `hide_follows`/`hide_followers` and `hide_follows_count`/`hide_followers_count` are set
- Admin API: Return users' tags when querying reports
- Admin API: Return avatar and display name when querying users
- Admin API: Allow querying user by ID
@ -104,10 +104,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- ActivityPub: Optional signing of ActivityPub object fetches.
- Admin API: Endpoint for fetching latest user's statuses
- Pleroma API: Add `/api/v1/pleroma/accounts/confirmation_resend?email=<email>` for resending account confirmation.
- Relays: Added a task to list relay subscriptions.
- Mix Tasks: `mix pleroma.database fix_likes_collections`
- Federation: Remove `likes` from objects.
- Pleroma API: Email change endpoint.
- Admin API: Added moderation log
- Web response cache (currently, enabled for ActivityPub)
- Mastodon API: Added an endpoint to get multiple statuses by IDs (`GET /api/v1/statuses/?ids[]=1&ids[]=2`)
### Changed
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
@ -115,6 +115,61 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- RichMedia: parsers and their order are configured in `rich_media` config.
- RichMedia: add the rich media ttl based on image expiration time.
## [1.0.6] - 2019-08-14
### Fixed
- MRF: fix use of unserializable keyword lists in describe() implementations
- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
## [1.0.5] - 2019-08-13
### Fixed
- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
- Mastodon API: return the actual profile URL in the Account entity's `url` property when appropriate
- Templates: properly style anchor tags
- Objects being re-embedded to activities after being updated (e.g. faved/reposted). Running `mix pleroma.database prune_objects` again is advised.
- Not being able to access the Mastodon FE login page on private instances
- MRF: ensure that subdomain_match calls are case-insensitive
- Fix internal server error when using the healthcheck API.
### Added
- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
- Relays: Added a task to list relay subscriptions.
- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
- MRF (Simple Policy): Support for wildcard domains.
- Support for wildcard domains in user domain blocks setting.
- Configuration: `quarantined_instances` support wildcard domains.
- Mix Tasks: `mix pleroma.database fix_likes_collections`
- Configuration: `federation_incoming_replies_max_depth` option
### Removed
- Federation: Remove `likes` from objects.
- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
## [1.0.4] - 2019-08-01
### Fixed
- Invalid SemVer version generation when the current branch does not have commits ahead of a tag or is checked out on a tag
## [1.0.3] - 2019-07-31
### Security
- OStatus: eliminate the possibility of a protocol downgrade attack.
- OStatus: prevent following locked accounts, bypassing the approval process.
- TwitterAPI: use CommonAPI to handle remote follows instead of OStatus.
## [1.0.2] - 2019-07-28
### Fixed
- Not being able to pin unlisted posts
- Mastodon API: represent poll IDs as strings
- MediaProxy: fix matching filenames
- MediaProxy: fix filename encoding
- Migrations: fix a sporadic migration failure
- Metadata rendering errors resulting in the entire page being inaccessible
- Federation/MediaProxy not working with instances that have wrong certificate order
- ActivityPub S2S: remote user deletions now work the same as local user deletions.
### Changed
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
## [1.0.1] - 2019-07-14
### Security


@ -51,6 +51,24 @@
telemetry_event: [Pleroma.Repo.Instrumenter],
migration_lock: nil
scheduled_jobs =
with digest_config <- Application.get_env(:pleroma, :email_notifications)[:digest],
true <- digest_config[:active] do
[{digest_config[:schedule], {Pleroma.Daemons.DigestEmailDaemon, :perform, []}}]
else
_ -> []
end
scheduled_jobs =
scheduled_jobs ++
[{"0 */6 * * * *", {Pleroma.Web.Websub, :refresh_subscriptions, []}}]
config :pleroma, Pleroma.Scheduler,
global: true,
overlap: true,
timezone: :utc,
jobs: scheduled_jobs
config :pleroma, Pleroma.Captcha,
enabled: false,
seconds_valid: 60,
@ -91,6 +109,7 @@
config :pleroma, Pleroma.Uploaders.S3,
bucket: nil,
streaming_enabled: true,
public_endpoint: "https://s3.amazonaws.com"
config :pleroma, Pleroma.Uploaders.MDII,
@ -104,7 +123,8 @@
# Put groups that have higher priority than defaults here. Example in `docs/config/custom_emoji.md`
Custom: ["/emoji/*.png", "/emoji/**/*.png"]
],
default_manifest: "https://git.pleroma.social/pleroma/emoji-index/raw/master/index.json"
default_manifest: "https://git.pleroma.social/pleroma/emoji-index/raw/master/index.json",
shared_pack_cache_seconds_per_file: 60
config :pleroma, :uri_schemes,
valid_schemes: [
@ -258,7 +278,7 @@
max_account_fields: 10,
max_remote_account_fields: 20,
account_field_name_length: 512,
account_field_value_length: 512,
account_field_value_length: 2048,
external_user_synchronization: true
config :pleroma, :markup,
@ -313,6 +333,10 @@
follow_handshake_timeout: 500,
sign_object_fetches: true
config :pleroma, :streamer,
workers: 3,
overflow_workers: 2
config :pleroma, :user, deny_follow_blocked: true
config :pleroma, :mrf_normalize_markup, scrub_policy: Pleroma.HTML.Scrubber.Default
@ -373,6 +397,8 @@
config :phoenix, :format_encoders, json: Jason
config :phoenix, :json_library, Jason
config :pleroma, :gopher,
enabled: false,
ip: {0, 0, 0, 0},
@ -449,13 +475,11 @@
"web"
]
config :pleroma, Pleroma.Web.Federator.RetryQueue,
enabled: false,
max_jobs: 20,
initial_timeout: 30,
max_retries: 5
config :pleroma_job_queue, :queues,
config :pleroma, Oban,
repo: Pleroma.Repo,
verbose: false,
prune: {:maxlen, 1500},
queues: [
activity_expiration: 10,
federator_incoming: 50,
federator_outgoing: 50,
@ -464,6 +488,13 @@
transmogrifier: 20,
scheduled_activities: 10,
background: 5
]
config :pleroma, :workers,
retries: [
federator_incoming: 5,
federator_outgoing: 5
]
config :pleroma, :fetch_initial_posts,
enabled: false,
@ -478,7 +509,7 @@
class: false,
strip_prefix: false,
new_window: false,
rel: false
rel: "ugc"
]
config :pleroma, :ldap,
@ -560,6 +591,10 @@
config :pleroma, Pleroma.ActivityExpiration, enabled: true
config :pleroma, :web_cache_ttl,
activity_pub: nil,
activity_pub_question: 30_000
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"

config/description.exs (new file)

File diff suppressed because it is too large.


@ -30,7 +30,8 @@
notify_email: "noreply@example.com",
skip_thread_containment: false,
federating: false,
external_user_synchronization: false
external_user_synchronization: false,
static_dir: "test/instance_static/"
config :pleroma, :activitypub, sign_object_fetches: false
@ -61,7 +62,11 @@
config :web_push_encryption, :http_client, Pleroma.Web.WebPushHttpClientMock
config :pleroma_job_queue, disabled: true
config :pleroma, Oban,
queues: false,
prune: :disabled
config :pleroma, Pleroma.Scheduler, jobs: []
config :pleroma, Pleroma.ScheduledActivity,
daily_user_limit: 2,


@ -60,9 +60,13 @@ Authentication is required and the user must be an admin.
- Method: `POST`
- Params:
- `nickname`
- `email`
- `password`
`users`: [
{
`nickname`,
`email`,
`password`
}
]
- Response: User's nickname
## `/api/pleroma/admin/users/follow`
@ -220,15 +224,25 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
## `/api/pleroma/admin/users/invite_token`
### Get an account registration invite token
### Create an account registration invite token
- Methods: `GET`
- Methods: `POST`
- Params:
- *optional* `invite` => [
- *optional* `max_use` (integer)
- *optional* `expires_at` (date string e.g. "2019-04-07")
]
- Response: invite token (base64 string)
- Response:
```json
{
"id": integer,
"token": string,
"used": boolean,
"expires_at": date,
"uses": integer,
"max_use": integer,
"invite_type": string (possible values: `one_time`, `reusable`, `date_limited`, `reusable_date_limited`)
}
```
## `/api/pleroma/admin/users/invites`
@ -296,6 +310,14 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
- Params: none
- Response: password reset token (base64 string)
## `/api/pleroma/admin/users/:nickname/force_password_reset`
### Force password reset for a user with a given nickname
- Methods: `PATCH`
- Params: none
- Response: none (code `204`)
## `/api/pleroma/admin/reports`
### Get a list of reports
- Method `GET`
@ -313,6 +335,7 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
```json
{
"total" : 1,
"reports": [
{
"account": {
@ -718,3 +741,10 @@ Compile time settings (need instance reboot):
}
]
```
## `POST /api/pleroma/admin/reload_emoji`
### Reload the instance's custom emoji
* Method `POST`
* Authentication: required
* Params: None
* Response: JSON, "ok" and 200 status


@ -21,7 +21,8 @@ Adding the parameter `with_muted=true` to the timeline queries will also return
Has these additional fields under the `pleroma` object:
- `local`: true if the post was made on the local instance
- `conversation_id`: the ID of the conversation the status is associated with (if any)
- `conversation_id`: the ID of the AP context the status is associated with (if any)
- `direct_conversation_id`: the ID of the Mastodon direct message conversation the status is associated with (if any)
- `in_reply_to_account_acct`: the `acct` property of User entity for replied user (if any)
- `content`: a map consisting of alternate representations of the `content` property with the key being its mimetype. Currently the only alternate representation supported is `text/plain`
- `spoiler_text`: a map consisting of alternate representations of the `spoiler_text` property with the key being its mimetype. Currently the only alternate representation supported is `text/plain`
@ -50,6 +51,8 @@ Has these additional fields under the `pleroma` object:
- `confirmation_pending`: boolean, true if a new user account is waiting on email confirmation to be activated
- `hide_followers`: boolean, true when the user has follower hiding enabled
- `hide_follows`: boolean, true when the user has follow hiding enabled
- `hide_followers_count`: boolean, true when the user has follower stat hiding enabled
- `hide_follows_count`: boolean, true when the user has follow stat hiding enabled
- `settings_store`: A generic map of settings for frontends. Opaque to the backend. Only returned in `verify_credentials` and `update_credentials`
- `chat_token`: The token needed for Pleroma chat. Only returned in `verify_credentials`
- `deactivated`: boolean, true when the user is deactivated
@ -91,6 +94,20 @@ Additional parameters can be added to the JSON body/Form data:
- `expires_in`: The number of seconds the posted activity should expire in. When a posted activity expires it will be deleted from the server, and a delete request for it will be federated. This needs to be longer than an hour.
- `in_reply_to_conversation_id`: Will reply to a given conversation, addressing only the people who are part of the recipient set of that conversation. Sets the visibility to `direct`.
## GET `/api/v1/statuses`
An endpoint to get multiple statuses by IDs.
Required parameters:
- `ids`: array of activity ids
Usage example: `GET /api/v1/statuses/?ids[]=1&ids[]=2`.
Returns: array of Status.
The maximum number of statuses is limited to 100 per request.
## PATCH `/api/v1/update_credentials`
Additional parameters can be added to the JSON body/Form data:
@ -98,6 +115,8 @@ Additional parameters can be added to the JSON body/Form data:
- `no_rich_text` - if true, html tags are stripped from all statuses requested from the API
- `hide_followers` - if true, user's followers will be hidden
- `hide_follows` - if true, user's follows will be hidden
- `hide_followers_count` - if true, user's follower count will be hidden
- `hide_follows_count` - if true, user's follow count will be hidden
- `hide_favorites` - if true, user's favorites timeline will be hidden
- `show_role` - if true, user's role (e.g admin, moderator) will be exposed to anyone in the API
- `default_scope` - the scope returned under `privacy` key in Source subentity


@ -321,6 +321,16 @@ See [Admin-API](Admin-API.md)
}
```
## `/api/pleroma/change_email`
### Change account email
* Method `POST`
* Authentication: required
* Params:
* `password`: user's password
* `email`: new email
* Response: JSON. Returns `{"status": "success"}` if the change was successful, `{"error": "[error message]"}` otherwise
* Note: Currently, Mastodon has no API for changing email. If they add it in the future, it might be incompatible with Pleroma.
# Pleroma Conversations
Pleroma Conversations have the same general structure that Mastodon Conversations have. The behavior differs in the following ways when using these endpoints:
@ -355,3 +365,68 @@ The status posting endpoint takes an additional parameter, `in_reply_to_conversa
* Params:
* `recipients`: A list of ids of users that should receive posts to this conversation. This will replace the current list of recipients, so submit the full list. The owner of the conversation will always be part of the set of recipients, though.
* Response: JSON, statuses (200 - healthy, 503 unhealthy)
## `GET /api/pleroma/emoji/packs`
### Lists the custom emoji packs on the server
* Method `GET`
* Authentication: not required
* Params: None
* Response: JSON, "ok" and 200 status and the JSON hashmap of "pack name" to "pack contents"
## `PUT /api/pleroma/emoji/packs/:name`
### Creates an empty custom emoji pack
* Method `PUT`
* Authentication: required
* Params: None
* Response: JSON, "ok" and 200 status or 409 if the pack with that name already exists
## `DELETE /api/pleroma/emoji/packs/:name`
### Delete a custom emoji pack
* Method `DELETE`
* Authentication: required
* Params: None
* Response: JSON, "ok" and 200 status or 500 if there was an error deleting the pack
## `POST /api/pleroma/emoji/packs/:name/update_file`
### Update a file in a custom emoji pack
* Method `POST`
* Authentication: required
* Params:
* if the `action` is `add`, adds an emoji named `shortcode` to the pack `pack_name`,
that means that the emoji file needs to be uploaded with the request
(thus requiring it to be a multipart request) and be named `file`.
There can also be an optional `filename` that will be the new emoji file name
(if it's not there, the name will be taken from the uploaded file).
* if the `action` is `update`, changes emoji shortcode
(from `shortcode` to `new_shortcode`) or moves the file (from the current filename to `new_filename`)
* if the `action` is `remove`, removes the emoji named `shortcode` and its associated file
* Response: JSON, updated "files" section of the pack and 200 status, 409 if trying to use a shortcode
that is already taken, 400 if there was an error with the shortcode, filename or file (additional info
in the "error" part of the response JSON)
## `POST /api/pleroma/emoji/packs/:name/update_metadata`
### Updates (replaces) pack metadata
* Method `POST`
* Authentication: required
* Params:
* `new_data`: new metadata to replace the old one
* Response: JSON, updated "metadata" section of the pack and 200 status or 400 if there was a
problem with the new metadata (the error is specified in the "error" part of the response JSON)
## `POST /api/pleroma/emoji/packs/download_from`
### Requests the instance to download the pack from another instance
* Method `POST`
* Authentication: required
* Params:
* `instance_address`: the address of the instance to download from
* `pack_name`: the pack to download from that instance
* Response: JSON, "ok" and 200 status if the pack was downloaded, or 500 if there were
errors downloading the pack
## `GET /api/pleroma/emoji/packs/:name/download_shared`
### Requests a local pack from the instance
* Method `GET`
* Authentication: not required
* Params: None
* Response: the archive of the pack with a 200 status code, 403 if the pack is not set as shared,
404 if the pack does not exist


@ -39,7 +39,7 @@ Feel free to contact us to be added to this list!
### Nekonium
- Homepage: [F-Droid Repository](https://repo.gdgd.jp.net/), [Google Play](https://play.google.com/store/apps/details?id=com.apps.nekonium), [Amazon](https://www.amazon.co.jp/dp/B076FXPRBC/)
- Source: <https://git.gdgd.jp.net/lin/nekonium/>
- Source: <https://gogs.gdgd.jp.net/lin/nekonium>
- Contact: [@lin@pleroma.gdgd.jp.net](https://pleroma.gdgd.jp.net/users/lin)
- Platforms: Android
- Features: Streaming Ready
@ -67,7 +67,7 @@ Feel free to contact us to be added to this list!
## Alternative Web Interfaces
### Brutaldon
- Homepage: <https://jfm.carcosa.net/projects/software/brutaldon/>
- Source Code: <https://github.com/jfmcbrayer/brutaldon>
- Source Code: <https://git.carcosa.net/jmcbray/brutaldon>
- Contact: [@gcupc@glitch.social](https://glitch.social/users/gcupc)
- Features: No Streaming


@ -23,6 +23,7 @@ Note: `strip_exif` has been replaced by `Pleroma.Upload.Filter.Mogrify`.
* `truncated_namespace`: If you use S3 compatible service such as Digital Ocean Spaces or CDN, set folder name or "" etc.
For example, when using CDN to S3 virtual host format, set "".
At this time, write CNAME to CDN in public_endpoint.
* `streaming_enabled`: Enable streaming uploads; when enabled, the file will be sent to the server in chunks as it's being read. This may be unsupported by some providers; try disabling this if you have upload problems (see the example below).
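A minimal sketch of the related `Pleroma.Uploaders.S3` settings, mirroring the defaults from `config/config.exs` in this commit (the bucket name is a placeholder):

```elixir
config :pleroma, Pleroma.Uploaders.S3,
  # placeholder bucket name
  bucket: "my-pleroma-uploads",
  # upload files to S3 in chunks as they are read
  streaming_enabled: true,
  public_endpoint: "https://s3.amazonaws.com"
```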
## Pleroma.Upload.Filter.Mogrify
@ -135,7 +136,7 @@ config :pleroma, Pleroma.Emails.Mailer,
* `max_account_fields`: The maximum number of custom fields in the user profile (default: `10`)
* `max_remote_account_fields`: The maximum number of custom fields in the remote user profile (default: `20`)
* `account_field_name_length`: An account field name maximum length (default: `512`)
* `account_field_value_length`: An account field value maximum length (default: `512`)
* `account_field_value_length`: An account field value maximum length (default: `2048`; see the example below)
* `external_user_synchronization`: Enabling following/followers counters synchronization for external users.
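For reference, these limits live under the `:instance` key in `config/config.exs`; a sketch with the defaults listed above:

```elixir
config :pleroma, :instance,
  max_account_fields: 10,
  max_remote_account_fields: 20,
  account_field_name_length: 512,
  # raised from 512 in this release
  account_field_value_length: 2048
```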
@ -400,35 +401,71 @@ You can then do
curl "http://localhost:4000/api/pleroma/admin/invite_token?admin_token=somerandomtoken"
```
## :pleroma_job_queue
## Oban
[Pleroma Job Queue](https://git.pleroma.social/pleroma/pleroma_job_queue) configuration: a list of queues with maximum concurrent jobs.
[Oban](https://github.com/sorentwo/oban) asynchronous job processor configuration.
Configuration options described in [Oban readme](https://github.com/sorentwo/oban#usage):
* `repo` - app's Ecto repo (`Pleroma.Repo`)
* `verbose` - logs verbosity
* `prune` - non-retryable jobs [pruning settings](https://github.com/sorentwo/oban#pruning) (`:disabled` / `{:maxlen, value}` / `{:maxage, value}`)
* `queues` - job queues (see below)
Pleroma has the following queues:
* `activity_expiration` - Activity expiration
* `federator_outgoing` - Outgoing federation
* `federator_incoming` - Incoming federation
* `mailer` - Email sender, see [`Pleroma.Emails.Mailer`](#pleroma-emails-mailer)
* `mailer` - Email sender, see [`Pleroma.Emails.Mailer`](#pleromaemailsmailer)
* `transmogrifier` - Transmogrifier
* `web_push` - Web push notifications
* `scheduled_activities` - Scheduled activities, see [`Pleroma.ScheduledActivities`](#pleromascheduledactivity)
* `scheduled_activities` - Scheduled activities, see [`Pleroma.ScheduledActivity`](#pleromascheduledactivity)
Example:
```elixir
config :pleroma_job_queue, :queues,
config :pleroma, Oban,
repo: Pleroma.Repo,
verbose: false,
prune: {:maxlen, 1500},
queues: [
federator_incoming: 50,
federator_outgoing: 50
]
```
This config contains two queues: `federator_incoming` and `federator_outgoing`. Both have the `max_jobs` set to `50`.
This config contains two queues: `federator_incoming` and `federator_outgoing`. Both have the number of max concurrent jobs set to `50`.
## Pleroma.Web.Federator.RetryQueue
### Migrating `pleroma_job_queue` settings
* `enabled`: If set to `true`, failed federation jobs will be retried
* `max_jobs`: The maximum amount of parallel federation jobs running at the same time.
* `initial_timeout`: The initial timeout in seconds
* `max_retries`: The maximum number of times a federation job is retried
`config :pleroma_job_queue, :queues` is replaced by `config :pleroma, Oban, :queues` and uses the same format (keys are queues' names, values are max concurrent jobs numbers).
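As a concrete sketch, a customized queue configuration migrates like this (the queue names and limits below are just the defaults shown earlier):

```elixir
# Before: pleroma_job_queue
config :pleroma_job_queue, :queues,
  federator_incoming: 50,
  federator_outgoing: 50

# After: the same keys and limits move under Oban's :queues option
config :pleroma, Oban,
  queues: [
    federator_incoming: 50,
    federator_outgoing: 50
  ]
```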
### Note on running with PostgreSQL in silent mode
If you are running PostgreSQL in [`silent_mode`](https://postgresqlco.nf/en/doc/param/silent_mode?version=9.1), it's advised to set [`log_destination`](https://postgresqlco.nf/en/doc/param/log_destination?version=9.1) to `syslog`,
otherwise `postmaster.log` file may grow because of "you don't own a lock of type ShareLock" warnings (see https://github.com/sorentwo/oban/issues/52).
## :workers
Includes custom worker options not interpretable directly by `Oban`.
* `retries` — a keyword list where keys are `Oban` queues (see above) and values are the maximum number of attempts for failed jobs.
Example:
```elixir
config :pleroma, :workers,
retries: [
federator_incoming: 5,
federator_outgoing: 5
]
```
### Migrating `Pleroma.Web.Federator.RetryQueue` settings
* `max_retries` is replaced with `config :pleroma, :workers, retries: [federator_outgoing: 5]`
* `enabled: false` corresponds to `config :pleroma, :workers, retries: [federator_outgoing: 1]`
* deprecated options: `max_jobs`, `initial_timeout` (see the example below)
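A before/after sketch of the mapping above (the retry count of `5` mirrors the old `max_retries` example):

```elixir
# Before: Pleroma.Web.Federator.RetryQueue (removed)
config :pleroma, Pleroma.Web.Federator.RetryQueue,
  enabled: true,
  max_retries: 5

# After: per-queue retry limits under :workers
config :pleroma, :workers,
  retries: [federator_outgoing: 5]
```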
## Pleroma.Web.Metadata
* `providers`: a list of metadata providers to enable. Providers available:
@ -485,10 +522,28 @@ config :auto_linker,
class: false,
strip_prefix: false,
new_window: false,
rel: false
rel: "ugc"
]
```
## Pleroma.Scheduler
Configuration for [Quantum](https://github.com/quantum-elixir/quantum-core) jobs scheduler.
See [Quantum readme](https://github.com/quantum-elixir/quantum-core#usage) for the list of supported options.
Example:
```elixir
config :pleroma, Pleroma.Scheduler,
global: true,
overlap: true,
timezone: :utc,
jobs: [{"0 */6 * * * *", {Pleroma.Web.Websub, :refresh_subscriptions, []}}]
```
The above example defines a single job which invokes `Pleroma.Web.Websub.refresh_subscriptions()` every 6 hours ("0 */6 * * * *", [crontab format](https://en.wikipedia.org/wiki/Cron)).
## Pleroma.ScheduledActivity
* `daily_user_limit`: the number of scheduled activities a user is allowed to create in a single day (Default: `25`)
@ -653,6 +708,8 @@ Configure OAuth 2 provider capabilities:
* `pack_extensions`: A list of file extensions for emojis, when no emoji.txt for a pack is present. Example `[".png", ".gif"]`
* `groups`: Emojis are ordered in groups (tags). This is an array of key-value pairs where the key is the groupname and the value the location or array of locations. `*` can be used as a wildcard. Example `[Custom: ["/emoji/*.png", "/emoji/custom/*.png"]]`
* `default_manifest`: Location of the JSON-manifest. This manifest contains information about the emoji-packs you can download. Currently only one manifest can be added (no arrays).
* `shared_pack_cache_seconds_per_file`: When an emoji pack is shared, the archive is created and cached in memory for this number of seconds multiplied by the number of files (see the example below).
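For example, with the stock value from `config/config.exs`, a shared pack with 10 files stays cached for roughly 600 seconds:

```elixir
config :pleroma, :emoji,
  # seconds of cache per file in a shared pack archive
  shared_pack_cache_seconds_per_file: 60
```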
## Database options
@ -690,3 +747,12 @@ Supported rate limiters:
* `:relation_id_action` for actions on relation with a specific user (follow, unfollow)
* `:statuses_actions` for create / delete / fav / unfav / reblog / unreblog actions on any statuses
* `:status_id_action` for fav / unfav or reblog / unreblog actions on the same status by the same user
## :web_cache_ttl
The expiration time for the web responses cache. Values should be in milliseconds or `nil` to disable expiration.
Available caches:
* `:activity_pub` - activity pub routes (except question activities). Defaults to `nil` (no expiration).
* `:activity_pub_question` - activity pub routes (question activities). Defaults to `30_000` (30 seconds). See the example below.
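A minimal example mirroring the defaults above:

```elixir
config :pleroma, :web_cache_ttl,
  # no expiration for generic ActivityPub routes
  activity_pub: nil,
  # 30 seconds for question (poll) activities
  activity_pub_question: 30_000
```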


@ -215,7 +215,9 @@
]}
]},
{ 5222, ejabberd_c2s, [
%% If you want dual stack, you have to clone this entire config stanza
%% and change the bind to "::"
{ {5222, "0.0.0.0"}, ejabberd_c2s, [
%%
%% If TLS is compiled in and you installed a SSL
@ -246,7 +248,9 @@
%% {max_stanza_size, 65536}
%% ]},
{ 5269, ejabberd_s2s_in, [
%% If you want dual stack, you have to clone this entire config stanza
%% and change the bind to "::"
{ {5269, "0.0.0.0"}, ejabberd_s2s_in, [
{shaper, s2s_shaper},
{max_stanza_size, 131072},
{protocol_options, ["no_sslv3"]}


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Pleroma do


@ -27,7 +27,7 @@ def run(["tag"]) do
})
end
def run(["render_timeline", nickname]) do
def run(["render_timeline", nickname | _] = args) do
start_pleroma()
user = Pleroma.User.get_by_nickname(nickname)
@ -37,33 +37,37 @@ def run(["render_timeline", nickname]) do
|> Map.put("blocking_user", user)
|> Map.put("muting_user", user)
|> Map.put("user", user)
|> Map.put("limit", 80)
|> Map.put("limit", 4096)
|> Pleroma.Web.ActivityPub.ActivityPub.fetch_public_activities()
|> Enum.reverse()
inputs = %{
"One activity" => Enum.take_random(activities, 1),
"Ten activities" => Enum.take_random(activities, 10),
"Twenty activities" => Enum.take_random(activities, 20),
"Forty activities" => Enum.take_random(activities, 40),
"Eighty activities" => Enum.take_random(activities, 80)
"1 activity" => Enum.take_random(activities, 1),
"10 activities" => Enum.take_random(activities, 10),
"20 activities" => Enum.take_random(activities, 20),
"40 activities" => Enum.take_random(activities, 40),
"80 activities" => Enum.take_random(activities, 80)
}
inputs =
if Enum.at(args, 2) == "extended" do
Map.merge(inputs, %{
"200 activities" => Enum.take_random(activities, 200),
"500 activities" => Enum.take_random(activities, 500),
"2000 activities" => Enum.take_random(activities, 2000),
"4096 activities" => Enum.take_random(activities, 4096)
})
else
inputs
end
Benchee.run(
%{
"Parallel rendering" => fn activities ->
Pleroma.Web.MastodonAPI.StatusView.render("index.json", %{
activities: activities,
for: user,
as: :activity
})
end,
"Standard rendering" => fn activities ->
Pleroma.Web.MastodonAPI.StatusView.render("index.json", %{
activities: activities,
for: user,
as: :activity,
parallel: false
as: :activity
})
end
},


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Database do


@ -0,0 +1,42 @@
defmodule Mix.Tasks.Pleroma.Docs do
use Mix.Task
import Mix.Pleroma
@shortdoc "Generates docs from descriptions.exs"
@moduledoc """
Generates docs from `descriptions.exs`.
Supports two formats: `markdown` and `json`.
## Generate Markdown docs
`mix pleroma.docs`
## Generate JSON docs
`mix pleroma.docs json`
"""
def run(["json"]) do
do_run(Pleroma.Docs.JSON)
end
def run(_) do
do_run(Pleroma.Docs.Markdown)
end
defp do_run(implementation) do
start_pleroma()
with {descriptions, _paths} <- Mix.Config.eval!("config/description.exs"),
{:ok, file_path} <-
Pleroma.Docs.Generator.process(
implementation,
descriptions[:pleroma][:config_description]
) do
type = if implementation == Pleroma.Docs.Markdown, do: "Markdown", else: "JSON"
Mix.shell().info([:green, "#{type} docs successfully generated to #{file_path}."])
end
end
end


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Ecto do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Ecto.Migrate do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Ecto.Rollback do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Emoji do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Instance do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Relay do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Uploads do


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.User do


@ -6,6 +6,7 @@ defmodule Pleroma.Activity do
use Ecto.Schema
alias Pleroma.Activity
alias Pleroma.Activity.Queries
alias Pleroma.ActivityExpiration
alias Pleroma.Bookmark
alias Pleroma.Notification
@ -65,8 +66,8 @@ defmodule Pleroma.Activity do
timestamps()
end
def with_joined_object(query) do
join(query, :inner, [activity], o in Object,
def with_joined_object(query, join_type \\ :inner) do
join(query, join_type, [activity], o in Object,
on:
fragment(
"(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
@ -78,10 +79,10 @@ def with_joined_object(query) do
)
end
def with_preloaded_object(query) do
def with_preloaded_object(query, join_type \\ :inner) do
query
|> has_named_binding?(:object)
|> if(do: query, else: with_joined_object(query))
|> if(do: query, else: with_joined_object(query, join_type))
|> preload([activity, object: object], object: object)
end
@ -107,12 +108,9 @@ def with_set_thread_muted_field(query, %User{} = user) do
def with_set_thread_muted_field(query, _), do: query
def get_by_ap_id(ap_id) do
Repo.one(
from(
activity in Activity,
where: fragment("(?)->>'id' = ?", activity.data, ^to_string(ap_id))
)
)
ap_id
|> Queries.by_ap_id()
|> Repo.one()
end
def get_bookmark(%Activity{} = activity, %User{} = user) do
@ -133,21 +131,10 @@ def change(struct, params \\ %{}) do
end
def get_by_ap_id_with_object(ap_id) do
Repo.one(
from(
activity in Activity,
where: fragment("(?)->>'id' = ?", activity.data, ^to_string(ap_id)),
left_join: o in Object,
on:
fragment(
"(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
o.data,
activity.data,
activity.data
),
preload: [object: o]
)
)
ap_id
|> Queries.by_ap_id()
|> with_preloaded_object(:left)
|> Repo.one()
end
def get_by_id(id) do
@ -158,66 +145,34 @@ def get_by_id(id) do
end
def get_by_id_with_object(id) do
from(activity in Activity,
where: activity.id == ^id,
inner_join: o in Object,
on:
fragment(
"(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
o.data,
activity.data,
activity.data
),
preload: [object: o]
)
Activity
|> where(id: ^id)
|> with_preloaded_object()
|> Repo.one()
end
def by_object_ap_id(ap_id) do
from(
activity in Activity,
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
activity.data,
activity.data,
^to_string(ap_id)
)
)
def all_by_ids_with_object(ids) do
Activity
|> where([a], a.id in ^ids)
|> with_preloaded_object()
|> Repo.all()
end
def create_by_object_ap_id(ap_ids) when is_list(ap_ids) do
from(
activity in Activity,
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ANY(?)",
activity.data,
activity.data,
^ap_ids
),
where: fragment("(?)->>'type' = 'Create'", activity.data)
)
@doc """
Accepts `ap_id` or list of `ap_id`.
Returns a query.
"""
@spec create_by_object_ap_id(String.t() | [String.t()]) :: Ecto.Queryable.t()
def create_by_object_ap_id(ap_id) do
ap_id
|> Queries.by_object_id()
|> Queries.by_type("Create")
end
def create_by_object_ap_id(ap_id) when is_binary(ap_id) do
from(
activity in Activity,
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
activity.data,
activity.data,
^to_string(ap_id)
),
where: fragment("(?)->>'type' = 'Create'", activity.data)
)
end
def create_by_object_ap_id(_), do: nil
def get_all_create_by_object_ap_id(ap_id) do
Repo.all(create_by_object_ap_id(ap_id))
ap_id
|> create_by_object_ap_id()
|> Repo.all()
end
def get_create_by_object_ap_id(ap_id) when is_binary(ap_id) do
@ -228,54 +183,17 @@ def get_create_by_object_ap_id(ap_id) when is_binary(ap_id) do
def get_create_by_object_ap_id(_), do: nil
def create_by_object_ap_id_with_object(ap_ids) when is_list(ap_ids) do
from(
activity in Activity,
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ANY(?)",
activity.data,
activity.data,
^ap_ids
),
where: fragment("(?)->>'type' = 'Create'", activity.data),
inner_join: o in Object,
on:
fragment(
"(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
o.data,
activity.data,
activity.data
),
preload: [object: o]
)
@doc """
Accepts `ap_id` or list of `ap_id`.
Returns a query.
"""
@spec create_by_object_ap_id_with_object(String.t() | [String.t()]) :: Ecto.Queryable.t()
def create_by_object_ap_id_with_object(ap_id) do
ap_id
|> create_by_object_ap_id()
|> with_preloaded_object()
end
def create_by_object_ap_id_with_object(ap_id) when is_binary(ap_id) do
from(
activity in Activity,
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
activity.data,
activity.data,
^to_string(ap_id)
),
where: fragment("(?)->>'type' = 'Create'", activity.data),
inner_join: o in Object,
on:
fragment(
"(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
o.data,
activity.data,
activity.data
),
preload: [object: o]
)
end
def create_by_object_ap_id_with_object(_), do: nil
def get_create_by_object_ap_id_with_object(ap_id) when is_binary(ap_id) do
ap_id
|> create_by_object_ap_id_with_object()
@ -299,7 +217,8 @@ def normalize(ap_id) when is_binary(ap_id), do: get_by_ap_id_with_object(ap_id)
def normalize(_), do: nil
def delete_by_ap_id(id) when is_binary(id) do
by_object_ap_id(id)
id
|> Queries.by_object_id()
|> select([u], u)
|> Repo.delete_all()
|> elem(1)
@ -308,10 +227,19 @@ def delete_by_ap_id(id) when is_binary(id) do
%{data: %{"type" => "Create", "object" => %{"id" => ap_id}}} -> ap_id == id
_ -> nil
end)
|> purge_web_resp_cache()
end
def delete_by_ap_id(_), do: nil
defp purge_web_resp_cache(%Activity{} = activity) do
%{path: path} = URI.parse(activity.data["id"])
Cachex.del(:web_resp_cache, path)
activity
end
defp purge_web_resp_cache(nil), do: nil
for {ap_type, type} <- @mastodon_notification_types do
def mastodon_notification_type(%Activity{data: %{"type" => unquote(ap_type)}}),
do: unquote(type)
@ -334,31 +262,10 @@ def all_by_actor_and_id(actor, status_ids) do
end
def follow_requests_for_actor(%Pleroma.User{ap_id: ap_id}) do
from(
a in Activity,
where:
fragment(
"? ->> 'type' = 'Follow'",
a.data
),
where:
fragment(
"? ->> 'state' = 'pending'",
a.data
),
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
a.data,
a.data,
^ap_id
)
)
end
@spec query_by_actor(actor()) :: Ecto.Query.t()
def query_by_actor(actor) do
from(a in Activity, where: a.actor == ^actor)
ap_id
|> Queries.by_object_id()
|> Queries.by_type("Follow")
|> where([a], fragment("? ->> 'state' = 'pending'", a.data))
end
def restrict_deactivated_users(query) do


@ -0,0 +1,63 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Activity.Ir.Topics do
alias Pleroma.Object
alias Pleroma.Web.ActivityPub.Visibility
def get_activity_topics(activity) do
activity
|> Object.normalize()
|> generate_topics(activity)
|> List.flatten()
end
defp generate_topics(%{data: %{"type" => "Answer"}}, _) do
[]
end
defp generate_topics(object, activity) do
["user", "list"] ++ visibility_tags(object, activity)
end
defp visibility_tags(object, activity) do
case Visibility.get_visibility(activity) do
"public" ->
if activity.local do
["public", "public:local"]
else
["public"]
end
|> item_creation_tags(object, activity)
"direct" ->
["direct"]
_ ->
[]
end
end
defp item_creation_tags(tags, %{data: %{"type" => "Create"}} = object, activity) do
tags ++ hashtags_to_topics(object) ++ attachment_topics(object, activity)
end
defp item_creation_tags(tags, _, _) do
tags
end
defp hashtags_to_topics(%{data: %{"tag" => tags}}) do
tags
|> Enum.filter(&is_bitstring(&1))
|> Enum.map(fn tag -> "hashtag:" <> tag end)
end
defp hashtags_to_topics(_), do: []
defp attachment_topics(%{data: %{"attachment" => []}}, _act), do: []
defp attachment_topics(_object, %{local: true}), do: ["public:media", "public:local:media"]
defp attachment_topics(_object, _act), do: ["public:media"]
end


@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Activity.Queries do
@ -13,6 +13,14 @@ defmodule Pleroma.Activity.Queries do
alias Pleroma.Activity
@spec by_ap_id(query, String.t()) :: query
def by_ap_id(query \\ Activity, ap_id) do
from(
activity in query,
where: fragment("(?)->>'id' = ?", activity.data, ^to_string(ap_id))
)
end
@spec by_actor(query, String.t()) :: query
def by_actor(query \\ Activity, actor) do
from(
@ -21,8 +29,23 @@ def by_actor(query \\ Activity, actor) do
)
end
@spec by_object_id(query, String.t()) :: query
def by_object_id(query \\ Activity, object_id) do
@spec by_object_id(query, String.t() | [String.t()]) :: query
def by_object_id(query \\ Activity, object_id)
def by_object_id(query, object_ids) when is_list(object_ids) do
from(
activity in query,
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ANY(?)",
activity.data,
activity.data,
^object_ids
)
)
end
def by_object_id(query, object_id) when is_binary(object_id) do
from(activity in query,
where:
fragment(
@ -41,9 +64,4 @@ def by_type(query \\ Activity, activity_type) do
where: fragment("(?)->>'type' = ?", activity.data, ^activity_type)
)
end
@spec limit(query, pos_integer()) :: query
def limit(query \\ Activity, limit) do
from(activity in query, limit: ^limit)
end
end


@ -31,34 +31,21 @@ def start(_type, _args) do
children =
[
Pleroma.Repo,
Pleroma.Scheduler,
Pleroma.Config.TransferTask,
Pleroma.Emoji,
Pleroma.Captcha,
Pleroma.FlakeId,
Pleroma.ScheduledActivityWorker,
Pleroma.ActivityExpirationWorker
Pleroma.Daemons.ScheduledActivityDaemon,
Pleroma.Daemons.ActivityExpirationDaemon
] ++
cachex_children() ++
hackney_pool_children() ++
[
Pleroma.Web.Federator.RetryQueue,
Pleroma.Stats,
%{
id: :web_push_init,
start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
restart: :temporary
},
%{
id: :federator_init,
start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
restart: :temporary
},
%{
id: :internal_fetch_init,
start: {Task, :start_link, [&Pleroma.Web.ActivityPub.InternalFetchActor.init/0]},
restart: :temporary
}
{Oban, Pleroma.Config.get(Oban)}
] ++
task_children(@env) ++
oauth_cleanup_child(oauth_cleanup_enabled?()) ++
streamer_child(@env) ++
chat_child(@env, chat_enabled?()) ++
@ -70,9 +57,7 @@ def start(_type, _args) do
# See http://elixir-lang.org/docs/stable/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Pleroma.Supervisor]
result = Supervisor.start_link(children, opts)
:ok = after_supervisor_start()
result
Supervisor.start_link(children, opts)
end
defp setup_instrumenters do
@ -116,10 +101,15 @@ defp cachex_children do
build_cachex("object", default_ttl: 25_000, ttl_interval: 1000, limit: 2500),
build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000),
build_cachex("scrubber", limit: 2500),
build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500)
build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500),
build_cachex("web_resp", limit: 2500),
build_cachex("emoji_packs", expiration: emoji_packs_expiration(), limit: 10)
]
end
defp emoji_packs_expiration,
do: expiration(default: :timer.seconds(5 * 60), interval: :timer.seconds(60))
defp idempotency_expiration,
do: expiration(default: :timer.seconds(6 * 60 * 60), interval: :timer.seconds(60))
@ -141,7 +131,7 @@ defp oauth_cleanup_enabled?,
defp streamer_child(:test), do: []
defp streamer_child(_) do
[Pleroma.Web.Streamer]
[Pleroma.Web.Streamer.supervisor()]
end
defp oauth_cleanup_child(true),
@ -164,16 +154,38 @@ defp hackney_pool_children do
end
end
defp after_supervisor_start do
with digest_config <- Application.get_env(:pleroma, :email_notifications)[:digest],
true <- digest_config[:active] do
PleromaJobQueue.schedule(
digest_config[:schedule],
:digest_emails,
Pleroma.DigestEmailWorker
)
defp task_children(:test) do
[
%{
id: :web_push_init,
start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
restart: :temporary
},
%{
id: :federator_init,
start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
restart: :temporary
}
]
end
:ok
defp task_children(_) do
[
%{
id: :web_push_init,
start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
restart: :temporary
},
%{
id: :federator_init,
start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
restart: :temporary
},
%{
id: :internal_fetch_init,
start: {Task, :start_link, [&Pleroma.Web.ActivityPub.InternalFetchActor.init/0]},
restart: :temporary
}
]
end
end


@ -6,4 +6,16 @@ defmodule Pleroma.Constants do
use Const
const(as_public, do: "https://www.w3.org/ns/activitystreams#Public")
const(object_internal_fields,
do: [
"likes",
"like_count",
"announcements",
"announcement_count",
"emoji",
"context_id",
"deleted_activity_id"
]
)
end


@ -2,13 +2,14 @@
# Copyright © 2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.ActivityExpirationWorker do
defmodule Pleroma.Daemons.ActivityExpirationDaemon do
alias Pleroma.Activity
alias Pleroma.ActivityExpiration
alias Pleroma.Config
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.CommonAPI
require Logger
use GenServer
import Ecto.Query
@ -49,7 +50,10 @@ def perform(:execute, expiration_id) do
def handle_info(:perform, state) do
ActivityExpiration.due_expirations(@schedule_interval)
|> Enum.each(fn expiration ->
PleromaJobQueue.enqueue(:activity_expiration, __MODULE__, [:execute, expiration.id])
Pleroma.Workers.ActivityExpirationWorker.enqueue(
"activity_expiration",
%{"activity_expiration_id" => expiration.id}
)
end)
schedule_next()


@ -2,10 +2,11 @@
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.DigestEmailWorker do
import Ecto.Query
defmodule Pleroma.Daemons.DigestEmailDaemon do
alias Pleroma.Repo
alias Pleroma.Workers.DigestEmailsWorker
@queue_name :digest_emails
import Ecto.Query
def perform do
config = Pleroma.Config.get([:email_notifications, :digest])
@ -20,8 +21,10 @@ def perform do
where: u.last_digest_emailed_at < datetime_add(^now, ^negative_interval, "day"),
select: u
)
|> Pleroma.Repo.all()
|> Enum.each(&PleromaJobQueue.enqueue(@queue_name, __MODULE__, [&1]))
|> Repo.all()
|> Enum.each(fn user ->
DigestEmailsWorker.enqueue("digest_email", %{"user_id" => user.id})
end)
end
@doc """


@ -2,7 +2,7 @@
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.ScheduledActivityWorker do
defmodule Pleroma.Daemons.ScheduledActivityDaemon do
@moduledoc """
Sends scheduled activities to the job queue.
"""
@ -11,6 +11,7 @@ defmodule Pleroma.ScheduledActivityWorker do
alias Pleroma.ScheduledActivity
alias Pleroma.User
alias Pleroma.Web.CommonAPI
use GenServer
require Logger
@ -45,7 +46,10 @@ def perform(:execute, scheduled_activity_id) do
def handle_info(:perform, state) do
ScheduledActivity.due_activities(@schedule_interval)
|> Enum.each(fn scheduled_activity ->
PleromaJobQueue.enqueue(:scheduled_activities, __MODULE__, [:execute, scheduled_activity.id])
Pleroma.Workers.ScheduledActivityWorker.enqueue(
"execute",
%{"activity_id" => scheduled_activity.id}
)
end)
schedule_next()

lib/pleroma/delivery.ex (new file)

@ -0,0 +1,51 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Delivery do
use Ecto.Schema
alias Pleroma.Delivery
alias Pleroma.FlakeId
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.User
import Ecto.Changeset
import Ecto.Query
schema "deliveries" do
belongs_to(:user, User, type: FlakeId)
belongs_to(:object, Object)
end
def changeset(delivery, params \\ %{}) do
delivery
|> cast(params, [:user_id, :object_id])
|> validate_required([:user_id, :object_id])
|> foreign_key_constraint(:object_id)
|> foreign_key_constraint(:user_id)
|> unique_constraint(:user_id, name: :deliveries_user_id_object_id_index)
end
def create(object_id, user_id) do
%Delivery{}
|> changeset(%{user_id: user_id, object_id: object_id})
|> Repo.insert(on_conflict: :nothing)
end
def get(object_id, user_id) do
from(d in Delivery, where: d.user_id == ^user_id and d.object_id == ^object_id)
|> Repo.one()
end
# A hack because user delete activities have a fake id for whatever reason
# TODO: Get rid of this
def delete_all_by_object_id("pleroma:fake_object_id"), do: {0, []}
def delete_all_by_object_id(object_id) do
from(d in Delivery, where: d.object_id == ^object_id)
|> Repo.delete_all()
end
end


@ -0,0 +1,73 @@
defmodule Pleroma.Docs.Generator do
@callback process(keyword()) :: {:ok, String.t()}
@spec process(module(), keyword()) :: {:ok, String.t()}
def process(implementation, descriptions) do
implementation.process(descriptions)
end
@spec uploaders_list() :: [module()]
def uploaders_list do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Uploaders"]) and
List.last(name_as_list) != "Uploader"
end)
end
@spec filters_list() :: [module()]
def filters_list do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Upload", "Filter"])
end)
end
@spec mrf_list() :: [module()]
def mrf_list do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Web", "ActivityPub", "MRF"]) and
length(name_as_list) > 4
end)
end
@spec richmedia_parsers() :: [module()]
def richmedia_parsers do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Web", "RichMedia", "Parsers"]) and
length(name_as_list) == 5
end)
end
end
defimpl Jason.Encoder, for: Tuple do
def encode(tuple, opts) do
Jason.Encode.list(Tuple.to_list(tuple), opts)
end
end
defimpl Jason.Encoder, for: [Regex, Function] do
def encode(term, opts) do
Jason.Encode.string(inspect(term), opts)
end
end
defimpl String.Chars, for: Regex do
def to_string(term) do
inspect(term)
end
end
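These protocol implementations presumably exist so that config values such as tuples, regexes and functions survive JSON generation; a quick illustration of what they produce (results shown as comments):

Jason.encode!({1, :two, "three"})   # => "[1,\"two\",\"three\"]" — tuples become JSON arrays
Jason.encode!(~r/foo/)              # => "\"~r/foo/\"" — regexes are rendered via inspect/1
to_string(~r/foo/)                  # => "~r/foo/"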

20
lib/pleroma/docs/json.ex Normal file
View file

@ -0,0 +1,20 @@
defmodule Pleroma.Docs.JSON do
@behaviour Pleroma.Docs.Generator
@spec process(keyword()) :: {:ok, String.t()}
def process(descriptions) do
config_path = "docs/generate_config.json"
with {:ok, file} <- File.open(config_path, [:write]),
json <- generate_json(descriptions),
:ok <- IO.write(file, json),
:ok <- File.close(file) do
{:ok, config_path}
end
end
@spec generate_json([keyword()]) :: String.t()
def generate_json(descriptions) do
Jason.encode!(descriptions)
end
end
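For orientation, a minimal sketch of driving a generator through the callback defined above (the `descriptions` literal is invented for illustration and assumes the `docs/` directory exists; the real descriptions are far richer):

descriptions = [
  %{group: :pleroma, key: :instance, description: "Instance settings", children: []}
]

{:ok, "docs/generate_config.json"} =
  Pleroma.Docs.Generator.process(Pleroma.Docs.JSON, descriptions)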

View file

@ -0,0 +1,88 @@
defmodule Pleroma.Docs.Markdown do
@behaviour Pleroma.Docs.Generator
@spec process(keyword()) :: {:ok, String.t()}
def process(descriptions) do
config_path = "docs/generated_config.md"
{:ok, file} = File.open(config_path, [:utf8, :write])
IO.write(file, "# Generated configuration\n")
IO.write(file, "Date of generation: #{Date.utc_today()}\n\n")
IO.write(
file,
"This file describe the configuration, it is recommended to edit the relevant `*.secret.exs` file instead of the others founds in the ``config`` directory.\n\n" <>
"If you run Pleroma with ``MIX_ENV=prod`` the file is ``prod.secret.exs``, otherwise it is ``dev.secret.exs``.\n\n"
)
for group <- descriptions do
if is_nil(group[:key]) do
IO.write(file, "## #{inspect(group[:group])}\n")
else
IO.write(file, "## #{inspect(group[:key])}\n")
end
IO.write(file, "#{group[:description]}\n")
for child <- group[:children] || [] do
print_child_header(file, child)
print_suggestions(file, child[:suggestions])
if child[:children] do
for subchild <- child[:children] do
print_child_header(file, subchild)
print_suggestions(file, subchild[:suggestions])
end
end
end
IO.write(file, "\n")
end
:ok = File.close(file)
{:ok, config_path}
end
defp print_child_header(file, %{key: key, type: type, description: description} = _child) do
IO.write(
file,
"- `#{inspect(key)}` (`#{inspect(type)}`): #{description} \n"
)
end
defp print_child_header(file, %{key: key, type: type} = _child) do
IO.write(file, "- `#{inspect(key)}` (`#{inspect(type)}`) \n")
end
defp print_suggestion(file, suggestion) when is_list(suggestion) do
IO.write(file, " `#{inspect(suggestion)}`\n")
end
defp print_suggestion(file, suggestion) when is_function(suggestion) do
IO.write(file, " `#{inspect(suggestion.())}`\n")
end
defp print_suggestion(file, suggestion, as_list \\ false) do
list_mark = if as_list, do: "- ", else: ""
IO.write(file, " #{list_mark}`#{inspect(suggestion)}`\n")
end
defp print_suggestions(_file, nil), do: nil
defp print_suggestions(_file, ""), do: nil
defp print_suggestions(file, suggestions) do
if length(suggestions) > 1 do
IO.write(file, "Suggestions:\n")
for suggestion <- suggestions do
print_suggestion(file, suggestion, true)
end
else
IO.write(file, " Suggestion: ")
print_suggestion(file, List.first(suggestions))
end
end
end

View file

@ -9,6 +9,7 @@ defmodule Pleroma.Emails.Mailer do
The module contains functions to deliver email using Swoosh.Mailer.
"""
alias Pleroma.Workers.MailerWorker
alias Swoosh.DeliveryError
@otp_app :pleroma
@ -19,7 +20,12 @@ def enabled?, do: Pleroma.Config.get([__MODULE__, :enabled])
@doc "add email to queue"
def deliver_async(email, config \\ []) do
PleromaJobQueue.enqueue(:mailer, __MODULE__, [:deliver_async, email, config])
encoded_email =
email
|> :erlang.term_to_binary()
|> Base.encode64()
MailerWorker.enqueue("email", %{"encoded_email" => encoded_email, "config" => config})
end
@doc "callback to perform send email from queue"

View file

@ -62,6 +62,9 @@ def get_all do
:ets.tab2list(@ets)
end
@doc "Clear out old emojis"
def clear_all, do: :ets.delete_all_objects(@ets)
@doc false
def init(_) do
@ets = :ets.new(@ets, @ets_options)

View file

@ -64,9 +64,13 @@ def load do
)
end
emojis =
Enum.flat_map(packs, fn pack ->
load_pack(Path.join(emoji_dir_path, pack), emoji_groups)
end)
Emoji.clear_all()
emojis
end
# Compat thing for old custom emoji handling & default emoji,
@ -87,15 +91,29 @@ defp prepare_emoji({code, _, _} = emoji), do: {code, Emoji.build(emoji)}
defp load_pack(pack_dir, emoji_groups) do
pack_name = Path.basename(pack_dir)
pack_file = Path.join(pack_dir, "pack.json")
if File.exists?(pack_file) do
contents = Jason.decode!(File.read!(pack_file))
contents["files"]
|> Enum.map(fn {name, rel_file} ->
filename = Path.join("/emoji/#{pack_name}", rel_file)
{name, filename, pack_name}
end)
else
# Load from emoji.txt / all files
emoji_txt = Path.join(pack_dir, "emoji.txt")
if File.exists?(emoji_txt) do
load_from_file(emoji_txt, emoji_groups)
else
extensions = Config.get([:emoji, :pack_extensions])
extensions = Pleroma.Config.get([:emoji, :pack_extensions])
Logger.info(
"No emoji.txt found for pack \"#{pack_name}\", assuming all #{Enum.join(extensions, ", ")} files are emoji"
"No emoji.txt found for pack \"#{pack_name}\", assuming all #{
Enum.join(extensions, ", ")
} files are emoji"
)
make_shortcode_to_file_map(pack_dir, extensions)
@ -106,6 +124,7 @@ defp load_pack(pack_dir, emoji_groups) do
end)
end
end
end
def make_shortcode_to_file_map(pack_dir, exts) do
find_all_emoji(pack_dir, exts)

View file

@ -14,7 +14,7 @@ defmodule Pleroma.FlakeId do
@type t :: binary
@behaviour Ecto.Type
use Ecto.Type
use GenServer
require Logger
alias __MODULE__

View file

@ -34,9 +34,9 @@ def mention_handler("@" <> nickname, buffer, opts, acc) do
nickname_text = get_nickname_text(nickname, opts)
link =
"<span class='h-card'><a data-user='#{id}' class='u-url mention' href='#{ap_id}'>@<span>#{
~s(<span class="h-card"><a data-user="#{id}" class="u-url mention" href="#{ap_id}" rel="ugc">@<span>#{
nickname_text
}</span></a></span>"
}</span></a></span>)
{link, %{acc | mentions: MapSet.put(acc.mentions, {"@" <> nickname, user})}}
@ -48,7 +48,7 @@ def mention_handler("@" <> nickname, buffer, opts, acc) do
def hashtag_handler("#" <> tag = tag_text, _buffer, _opts, acc) do
tag = String.downcase(tag)
url = "#{Pleroma.Web.base_url()}/tag/#{tag}"
link = "<a class='hashtag' data-tag='#{tag}' href='#{url}' rel='tag'>#{tag_text}</a>"
link = ~s(<a class="hashtag" data-tag="#{tag}" href="#{url}" rel="tag ugc">#{tag_text}</a>)
{link, %{acc | tags: MapSet.put(acc.tags, {tag_text, tag})}}
end

View file

@ -9,6 +9,7 @@ defmodule Pleroma.Healthcheck do
alias Pleroma.Healthcheck
alias Pleroma.Repo
@derive Jason.Encoder
defstruct pool_size: 0,
active: 0,
idle: 0,

View file

@ -184,7 +184,8 @@ defmodule Pleroma.HTML.Scrubber.Default do
"tag",
"nofollow",
"noopener",
"noreferrer"
"noreferrer",
"ugc"
])
Meta.allow_tag_with_these_attributes("a", ["name", "title"])
@ -304,7 +305,8 @@ defmodule Pleroma.HTML.Scrubber.LinksOnly do
"nofollow",
"noopener",
"noreferrer",
"me"
"me",
"ugc"
])
Meta.allow_tag_with_these_attributes("a", ["name", "title"])

View file

@ -90,7 +90,7 @@ def set_reachable(_), do: {:error, nil}
def set_unreachable(url_or_host, unreachable_since \\ nil)
def set_unreachable(url_or_host, unreachable_since) when is_binary(url_or_host) do
unreachable_since = unreachable_since || DateTime.utc_now()
unreachable_since = parse_datetime(unreachable_since) || NaiveDateTime.utc_now()
host = host(url_or_host)
existing_record = Repo.get_by(Instance, %{host: host})
@ -114,4 +114,10 @@ def set_unreachable(url_or_host, unreachable_since) when is_binary(url_or_host)
end
def set_unreachable(_, _), do: {:error, nil}
defp parse_datetime(datetime) when is_binary(datetime) do
NaiveDateTime.from_iso8601(datetime)
end
defp parse_datetime(datetime), do: datetime
end

View file

@ -210,8 +210,10 @@ def create_notification(%Activity{} = activity, %User{} = user) do
unless skip?(activity, user) do
notification = %Notification{user_id: user.id, activity: activity}
{:ok, notification} = Repo.insert(notification)
Streamer.stream("user", notification)
Streamer.stream("user:notification", notification)
["user", "user:notification"]
|> Streamer.stream(notification)
Push.send(notification)
notification
end

View file

@ -38,6 +38,24 @@ def change(struct, params \\ %{}) do
def get_by_id(nil), do: nil
def get_by_id(id), do: Repo.get(Object, id)
def get_by_id_and_maybe_refetch(id, opts \\ []) do
%{updated_at: updated_at} = object = get_by_id(id)
if opts[:interval] &&
NaiveDateTime.diff(NaiveDateTime.utc_now(), updated_at) > opts[:interval] do
case Fetcher.refetch_object(object) do
{:ok, %Object{} = object} ->
object
e ->
Logger.error("Couldn't refresh #{object.data["id"]}:\n#{inspect(e)}")
object
end
else
object
end
end
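A hypothetical call, assuming `id` points at a known remote object; the `:interval` value is in seconds (it is compared against `NaiveDateTime.diff/2`, which defaults to seconds), and a failed refetch only logs an error and falls back to the stale copy:

object = Pleroma.Object.get_by_id_and_maybe_refetch(id, interval: 60)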
def get_by_ap_id(nil), do: nil
def get_by_ap_id(ap_id) do
@ -130,14 +148,16 @@ def swap_object_with_tombstone(object) do
def delete(%Object{data: %{"id" => id}} = object) do
with {:ok, _obj} = swap_object_with_tombstone(object),
deleted_activity = Activity.delete_by_ap_id(id),
{:ok, true} <- Cachex.del(:object_cache, "object:#{id}") do
{:ok, true} <- Cachex.del(:object_cache, "object:#{id}"),
{:ok, _} <- Cachex.del(:web_resp_cache, URI.parse(id).path) do
{:ok, object, deleted_activity}
end
end
def prune(%Object{data: %{"id" => id}} = object) do
with {:ok, object} <- Repo.delete(object),
{:ok, true} <- Cachex.del(:object_cache, "object:#{id}") do
{:ok, true} <- Cachex.del(:object_cache, "object:#{id}"),
{:ok, _} <- Cachex.del(:web_resp_cache, URI.parse(id).path) do
{:ok, object}
end
end

View file

@ -6,18 +6,40 @@ defmodule Pleroma.Object.Fetcher do
alias Pleroma.HTTP
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.Repo
alias Pleroma.Signature
alias Pleroma.Web.ActivityPub.InternalFetchActor
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.OStatus
require Logger
require Pleroma.Constants
defp reinject_object(data) do
defp touch_changeset(changeset) do
updated_at =
NaiveDateTime.utc_now()
|> NaiveDateTime.truncate(:second)
Ecto.Changeset.put_change(changeset, :updated_at, updated_at)
end
defp maybe_reinject_internal_fields(data, %{data: %{} = old_data}) do
internal_fields = Map.take(old_data, Pleroma.Constants.object_internal_fields())
Map.merge(data, internal_fields)
end
defp maybe_reinject_internal_fields(data, _), do: data
@spec reinject_object(struct(), map()) :: {:ok, Object.t()} | {:error, any()}
defp reinject_object(struct, data) do
Logger.debug("Reinjecting object #{data["id"]}")
with data <- Transmogrifier.fix_object(data),
{:ok, object} <- Object.create(data) do
data <- maybe_reinject_internal_fields(data, struct),
changeset <- Object.change(struct, %{data: data}),
changeset <- touch_changeset(changeset),
{:ok, object} <- Repo.insert_or_update(changeset) do
{:ok, object}
else
e ->
@ -26,24 +48,24 @@ defp reinject_object(data) do
end
end
def refetch_object(%Object{data: %{"id" => id}} = object) do
with {:local, false} <- {:local, String.starts_with?(id, Pleroma.Web.base_url() <> "/")},
{:ok, data} <- fetch_and_contain_remote_object_from_id(id),
{:ok, object} <- reinject_object(object, data) do
{:ok, object}
else
{:local, true} -> object
e -> {:error, e}
end
end
# TODO:
# This will create a Create activity, which we need internally at the moment.
def fetch_object_from_id(id, options \\ []) do
if object = Object.get_cached_by_ap_id(id) do
{:ok, object}
else
Logger.info("Fetching #{id} via AP")
with {:fetch, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)},
with {:fetch_object, nil} <- {:fetch_object, Object.get_cached_by_ap_id(id)},
{:fetch, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)},
{:normalize, nil} <- {:normalize, Object.normalize(data, false)},
params <- %{
"type" => "Create",
"to" => data["to"],
"cc" => data["cc"],
# Should we seriously keep this attributedTo thing?
"actor" => data["actor"] || data["attributedTo"],
"object" => data
},
params <- prepare_activity_params(data),
{:containment, :ok} <- {:containment, Containment.contain_origin(id, params)},
{:ok, activity} <- Transmogrifier.handle_incoming(params, options),
{:object, _data, %Object{} = object} <-
@ -57,11 +79,14 @@ def fetch_object_from_id(id, options \\ []) do
{:reject, nil}
{:object, data, nil} ->
reinject_object(data)
reinject_object(%Object{}, data)
{:normalize, object = %Object{}} ->
{:ok, object}
{:fetch_object, %Object{} = object} ->
{:ok, object}
_e ->
# Only fall back when receiving a fetch/normalization error with ActivityPub
Logger.info("Couldn't get object via AP, trying out OStatus fetching...")
@ -73,6 +98,16 @@ def fetch_object_from_id(id, options \\ []) do
end
end
end
defp prepare_activity_params(data) do
%{
"type" => "Create",
"to" => data["to"],
"cc" => data["cc"],
# Should we seriously keep this attributedTo thing?
"actor" => data["actor"] || data["attributedTo"],
"object" => data
}
end
def fetch_object_from_id!(id, options \\ []) do

136
lib/pleroma/plugs/cache.ex Normal file
View file

@ -0,0 +1,136 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Plugs.Cache do
@moduledoc """
Caches successful GET responses.
To enable the cache add the plug to a router pipeline or controller:
plug(Pleroma.Plugs.Cache)
## Configuration
To configure the plug you need to pass settings as the second argument to the `plug/2` macro:
plug(Pleroma.Plugs.Cache, [ttl: nil, query_params: true])
Available options:
- `ttl`: An expiration time (time-to-live). This value should be in milliseconds or `nil` to disable expiration. Defaults to `nil`.
- `query_params`: Take URL query string into account (`true`), ignore it (`false`) or limit to specific params only (list). Defaults to `true`.
- `tracking_fun`: A function that is called on successful responses, no matter if the request is cached or not. It should accept a conn as the first argument and the value assigned to `tracking_fun_data` as the second.
Additionally, you can overwrite the TTL inside a controller action by assigning `cache_ttl` to the connection struct:
def index(conn, _params) do
ttl = 60_000 # one minute
conn
|> assign(:cache_ttl, ttl)
|> render("index.html")
end
"""
import Phoenix.Controller, only: [current_path: 1, json: 2]
import Plug.Conn
@behaviour Plug
@defaults %{ttl: nil, query_params: true}
@impl true
def init([]), do: @defaults
def init(opts) do
opts = Map.new(opts)
Map.merge(@defaults, opts)
end
@impl true
def call(%{method: "GET"} = conn, opts) do
key = cache_key(conn, opts)
case Cachex.get(:web_resp_cache, key) do
{:ok, nil} ->
cache_resp(conn, opts)
{:ok, {content_type, body, tracking_fun_data}} ->
conn = opts.tracking_fun.(conn, tracking_fun_data)
send_cached(conn, {content_type, body})
{:ok, record} ->
send_cached(conn, record)
{atom, message} when atom in [:ignore, :error] ->
render_error(conn, message)
end
end
def call(conn, _), do: conn
# full path including query params
defp cache_key(conn, %{query_params: true}), do: current_path(conn)
# request path without query params
defp cache_key(conn, %{query_params: false}), do: conn.request_path
# request path with specific query params
defp cache_key(conn, %{query_params: query_params}) when is_list(query_params) do
query_string =
conn.params
|> Map.take(query_params)
|> URI.encode_query()
conn.request_path <> "?" <> query_string
end
defp cache_resp(conn, opts) do
register_before_send(conn, fn
%{status: 200, resp_body: body} = conn ->
ttl = Map.get(conn.assigns, :cache_ttl, opts.ttl)
key = cache_key(conn, opts)
content_type = content_type(conn)
conn =
unless opts[:tracking_fun] do
Cachex.put(:web_resp_cache, key, {content_type, body}, ttl: ttl)
conn
else
tracking_fun_data = Map.get(conn.assigns, :tracking_fun_data, nil)
Cachex.put(:web_resp_cache, key, {content_type, body, tracking_fun_data}, ttl: ttl)
opts.tracking_fun.(conn, tracking_fun_data)
end
put_resp_header(conn, "x-cache", "MISS from Pleroma")
conn ->
conn
end)
end
defp content_type(conn) do
conn
|> Plug.Conn.get_resp_header("content-type")
|> hd()
end
defp send_cached(conn, {content_type, body}) do
conn
|> put_resp_content_type(content_type, nil)
|> put_resp_header("x-cache", "HIT from Pleroma")
|> send_resp(:ok, body)
|> halt()
end
defp render_error(conn, message) do
conn
|> put_status(:internal_server_error)
|> json(%{error: message})
|> halt()
end
end
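A hypothetical wiring inside a Phoenix controller, combining the options documented above (the `MyAppWeb.Stats.bump/2` tracking function and the assigned values are invented for illustration):

plug(Pleroma.Plugs.Cache,
  ttl: :timer.minutes(5),
  query_params: ["page"],
  tracking_fun: &MyAppWeb.Stats.bump/2
)

def show(conn, %{"id" => id}) do
  conn
  # value handed to the tracking function on both cache hits and misses
  |> assign(:tracking_fun_data, id)
  # per-action TTL override, in milliseconds
  |> assign(:cache_ttl, :timer.hours(1))
  |> render("show.html")
end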

View file

@ -15,7 +15,8 @@ def call(%{assigns: %{valid_signature: true}} = conn, _opts) do
end
def call(conn, _opts) do
[signature | _] = get_req_header(conn, "signature")
headers = get_req_header(conn, "signature")
signature = Enum.at(headers, 0)
if signature do
# set (request-target) header to the appropriate value

7
lib/pleroma/scheduler.ex Normal file
View file

@ -0,0 +1,7 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Scheduler do
use Quantum.Scheduler, otp_app: :pleroma
end
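Quantum jobs are then declared in the application config; a hedged sketch (the cron expression and the job picked here are only an example, not the schedule Pleroma actually ships):

config :pleroma, Pleroma.Scheduler,
  jobs: [
    # run the digest mailer once a day at midnight
    {"0 0 * * *", {Pleroma.Daemons.DigestEmailDaemon, :perform, []}}
  ]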

View file

@ -38,16 +38,26 @@ def get_file(file) do
def put_file(%Pleroma.Upload{} = upload) do
config = Config.get([__MODULE__])
bucket = Keyword.get(config, :bucket)
streaming = Keyword.get(config, :streaming_enabled)
s3_name = strict_encode(upload.path)
op =
if streaming do
upload.tempfile
|> ExAws.S3.Upload.stream_file()
|> ExAws.S3.upload(bucket, s3_name, [
{:acl, :public_read},
{:content_type, upload.content_type}
])
else
{:ok, file_data} = File.read(upload.tempfile)
ExAws.S3.put_object(bucket, s3_name, file_data, [
{:acl, :public_read},
{:content_type, upload.content_type}
])
end
case ExAws.request(op) do
{:ok, _} ->

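For the streaming branch above to be taken, the uploader config needs `streaming_enabled` set; a sketch (the bucket name is illustrative):

config :pleroma, Pleroma.Uploaders.S3,
  bucket: "pleroma-media",
  streaming_enabled: true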
View file

@ -11,6 +11,7 @@ defmodule Pleroma.User do
alias Comeonin.Pbkdf2
alias Ecto.Multi
alias Pleroma.Activity
alias Pleroma.Delivery
alias Pleroma.Keys
alias Pleroma.Notification
alias Pleroma.Object
@ -27,6 +28,7 @@ defmodule Pleroma.User do
alias Pleroma.Web.OStatus
alias Pleroma.Web.RelMe
alias Pleroma.Web.Websub
alias Pleroma.Workers.BackgroundWorker
require Logger
@ -61,6 +63,7 @@ defmodule Pleroma.User do
field(:last_digest_emailed_at, :naive_datetime)
has_many(:notifications, Notification)
has_many(:registrations, Registration)
has_many(:deliveries, Delivery)
embeds_one(:info, User.Info)
timestamps()
@ -147,6 +150,7 @@ def get_cached_follow_state(user, target) do
Cachex.fetch!(:user_cache, key, fn _ -> {:commit, follow_state(user, target)} end)
end
@spec set_follow_state_cache(String.t(), String.t(), String.t()) :: {:ok | :error, boolean()}
def set_follow_state_cache(user_ap_id, target_ap_id, state) do
Cachex.put(
:user_cache,
@ -174,11 +178,25 @@ def following_count(%User{} = user) do
|> Repo.aggregate(:count, :id)
end
defp truncate_if_exists(params, key, max_length) do
if Map.has_key?(params, key) and is_binary(params[key]) do
{value, _chopped} = String.split_at(params[key], max_length)
Map.put(params, key, value)
else
params
end
end
def remote_user_creation(params) do
bio_limit = Pleroma.Config.get([:instance, :user_bio_length], 5000)
name_limit = Pleroma.Config.get([:instance, :user_name_length], 100)
params = Map.put(params, :info, params[:info] || %{})
params =
params
|> Map.put(:info, params[:info] || %{})
|> truncate_if_exists(:name, name_limit)
|> truncate_if_exists(:bio, bio_limit)
info_cng = User.Info.remote_user_creation(%User.Info{}, params[:info])
changes =
@ -251,6 +269,7 @@ def password_update_changeset(struct, params) do
|> validate_required([:password, :password_confirmation])
|> validate_confirmation(:password)
|> put_password_hash
|> put_embed(:info, User.Info.set_password_reset_pending(struct.info, false))
end
@spec reset_password(User.t(), map) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
@ -267,6 +286,20 @@ def reset_password(%User{id: user_id} = user, data) do
end
end
def force_password_reset_async(user) do
BackgroundWorker.enqueue("force_password_reset", %{"user_id" => user.id})
end
@spec force_password_reset(User.t()) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
def force_password_reset(user) do
info_cng = User.Info.set_password_reset_pending(user.info, true)
user
|> change()
|> put_embed(:info, info_cng)
|> update_and_set_cache()
end
def register_changeset(struct, params \\ %{}, opts \\ []) do
bio_limit = Pleroma.Config.get([:instance, :user_bio_length], 5000)
name_limit = Pleroma.Config.get([:instance, :user_name_length], 100)
@ -633,8 +666,9 @@ def get_or_fetch_by_nickname(nickname) do
end
@doc "Fetch some posts when the user has just been federated with"
def fetch_initial_posts(user),
do: PleromaJobQueue.enqueue(:background, __MODULE__, [:fetch_initial_posts, user])
def fetch_initial_posts(user) do
BackgroundWorker.enqueue("fetch_initial_posts", %{"user_id" => user.id})
end
@spec get_followers_query(User.t(), pos_integer() | nil) :: Ecto.Query.t()
def get_followers_query(%User{} = user, nil) do
@ -1064,7 +1098,7 @@ def unblock_domain(user, domain) do
end
def deactivate_async(user, status \\ true) do
PleromaJobQueue.enqueue(:background, __MODULE__, [:deactivate_async, user, status])
BackgroundWorker.enqueue("deactivate_user", %{"user_id" => user.id, "status" => status})
end
def deactivate(%User{} = user, status \\ true) do
@ -1092,9 +1126,11 @@ def update_notification_settings(%User{} = user, settings \\ %{}) do
|> update_and_set_cache()
end
@spec delete(User.t()) :: :ok
def delete(%User{} = user),
do: PleromaJobQueue.enqueue(:background, __MODULE__, [:delete, user])
def delete(%User{} = user) do
BackgroundWorker.enqueue("delete_user", %{"user_id" => user.id})
end
def perform(:force_password_reset, user), do: force_password_reset(user)
@spec perform(atom(), User.t()) :: {:ok, User.t()}
def perform(:delete, %User{} = user) do
@ -1201,25 +1237,24 @@ def external_users(opts \\ []) do
Repo.all(query)
end
def blocks_import(%User{} = blocker, blocked_identifiers) when is_list(blocked_identifiers),
do:
PleromaJobQueue.enqueue(:background, __MODULE__, [
:blocks_import,
blocker,
blocked_identifiers
])
def blocks_import(%User{} = blocker, blocked_identifiers) when is_list(blocked_identifiers) do
BackgroundWorker.enqueue("blocks_import", %{
"blocker_id" => blocker.id,
"blocked_identifiers" => blocked_identifiers
})
end
def follow_import(%User{} = follower, followed_identifiers) when is_list(followed_identifiers),
do:
PleromaJobQueue.enqueue(:background, __MODULE__, [
:follow_import,
follower,
followed_identifiers
])
def follow_import(%User{} = follower, followed_identifiers)
when is_list(followed_identifiers) do
BackgroundWorker.enqueue("follow_import", %{
"follower_id" => follower.id,
"followed_identifiers" => followed_identifiers
})
end
def delete_user_activities(%User{ap_id: ap_id} = user) do
ap_id
|> Activity.query_by_actor()
|> Activity.Queries.by_actor()
|> RepoStreamer.chunk_stream(50)
|> Stream.each(fn activities ->
Enum.each(activities, &delete_activity(&1))
@ -1624,4 +1659,25 @@ defp put_password_hash(changeset), do: changeset
def is_internal_user?(%User{nickname: nil}), do: true
def is_internal_user?(%User{local: true, nickname: "internal." <> _}), do: true
def is_internal_user?(_), do: false
# A hack because user delete activities have a fake id for whatever reason
# TODO: Get rid of this
def get_delivered_users_by_object_id("pleroma:fake_object_id"), do: []
def get_delivered_users_by_object_id(object_id) do
from(u in User,
inner_join: delivery in assoc(u, :deliveries),
where: delivery.object_id == ^object_id
)
|> Repo.all()
end
def change_email(user, email) do
user
|> cast(%{email: email}, [:email])
|> validate_required([:email])
|> unique_constraint(:email)
|> validate_format(:email, @email_regex)
|> update_and_set_cache()
end
end

View file

@ -20,6 +20,7 @@ defmodule Pleroma.User.Info do
field(:following_count, :integer, default: nil)
field(:locked, :boolean, default: false)
field(:confirmation_pending, :boolean, default: false)
field(:password_reset_pending, :boolean, default: false)
field(:confirmation_token, :string, default: nil)
field(:default_scope, :string, default: "public")
field(:blocks, {:array, :string}, default: [])
@ -41,6 +42,8 @@ defmodule Pleroma.User.Info do
field(:topic, :string, default: nil)
field(:hub, :string, default: nil)
field(:salmon, :string, default: nil)
field(:hide_followers_count, :boolean, default: false)
field(:hide_follows_count, :boolean, default: false)
field(:hide_followers, :boolean, default: false)
field(:hide_follows, :boolean, default: false)
field(:hide_favorites, :boolean, default: true)
@ -80,6 +83,14 @@ def set_activation_status(info, deactivated) do
|> validate_required([:deactivated])
end
def set_password_reset_pending(info, pending) do
params = %{password_reset_pending: pending}
info
|> cast(params, [:password_reset_pending])
|> validate_required([:password_reset_pending])
end
def update_notification_settings(info, settings) do
settings =
settings
@ -242,6 +253,13 @@ def set_keys(info, keys) do
end
def remote_user_creation(info, params) do
params =
if Map.has_key?(params, :fields) do
Map.put(params, :fields, Enum.map(params[:fields], &truncate_field/1))
else
params
end
info
|> cast(params, [
:ap_enabled,
@ -255,6 +273,8 @@ def remote_user_creation(info, params) do
:salmon,
:hide_followers,
:hide_follows,
:hide_followers_count,
:hide_follows_count,
:follower_count,
:fields,
:following_count
@ -274,7 +294,9 @@ def user_upgrade(info, params, remote? \\ false) do
:following_count,
:hide_follows,
:fields,
:hide_followers
:hide_followers,
:hide_followers_count,
:hide_follows_count
])
|> validate_fields(remote?)
end
@ -288,6 +310,8 @@ def profile_update(info, params) do
:banner,
:hide_follows,
:hide_followers,
:hide_followers_count,
:hide_follows_count,
:hide_favorites,
:background,
:show_role,
@ -326,6 +350,16 @@ defp valid_field?(%{"name" => name, "value" => value}) do
defp valid_field?(_), do: false
defp truncate_field(%{"name" => name, "value" => value}) do
{name, _chopped} =
String.split_at(name, Pleroma.Config.get([:instance, :account_field_name_length], 255))
{value, _chopped} =
String.split_at(value, Pleroma.Config.get([:instance, :account_field_value_length], 255))
%{"name" => name, "value" => value}
end
@spec confirmation_changeset(Info.t(), keyword()) :: Changeset.t()
def confirmation_changeset(info, opts) do
need_confirmation? = Keyword.get(opts, :need_confirmation)
@ -441,7 +475,9 @@ def follow_information_update(info, params) do
:hide_followers,
:hide_follows,
:follower_count,
:following_count
:following_count,
:hide_followers_count,
:hide_follows_count
])
end
end

View file

@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.User.Query do

View file

@ -4,6 +4,7 @@
defmodule Pleroma.Web.ActivityPub.ActivityPub do
alias Pleroma.Activity
alias Pleroma.Activity.Ir.Topics
alias Pleroma.Config
alias Pleroma.Conversation
alias Pleroma.Notification
@ -16,7 +17,9 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
alias Pleroma.User
alias Pleroma.Web.ActivityPub.MRF
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.Streamer
alias Pleroma.Web.WebFinger
alias Pleroma.Workers.BackgroundWorker
import Ecto.Query
import Pleroma.Web.ActivityPub.Utils
@ -145,7 +148,7 @@ def insert(map, local \\ true, fake \\ false, bypass_actor_check \\ false) when
activity
end
PleromaJobQueue.enqueue(:background, Pleroma.Web.RichMedia.Helpers, [:fetch, activity])
BackgroundWorker.enqueue("fetch_data_for_activity", %{"activity_id" => activity.id})
Notification.create_notifications(activity)
@ -186,9 +189,7 @@ def stream_out_participations(participations) do
participations
|> Repo.preload(:user)
Enum.each(participations, fn participation ->
Pleroma.Web.Streamer.stream("participation", participation)
end)
Streamer.stream("participation", participations)
end
def stream_out_participations(%Object{data: %{"context" => context}}, user) do
@ -207,41 +208,15 @@ def stream_out_participations(%Object{data: %{"context" => context}}, user) do
def stream_out_participations(_, _), do: :noop
def stream_out(activity) do
if activity.data["type"] in ["Create", "Announce", "Delete"] do
object = Object.normalize(activity)
# Do not stream out poll replies
unless object.data["type"] == "Answer" do
Pleroma.Web.Streamer.stream("user", activity)
Pleroma.Web.Streamer.stream("list", activity)
if get_visibility(activity) == "public" do
Pleroma.Web.Streamer.stream("public", activity)
if activity.local do
Pleroma.Web.Streamer.stream("public:local", activity)
def stream_out(%Activity{data: %{"type" => data_type}} = activity)
when data_type in ["Create", "Announce", "Delete"] do
activity
|> Topics.get_activity_topics()
|> Streamer.stream(activity)
end
if activity.data["type"] in ["Create"] do
object.data
|> Map.get("tag", [])
|> Enum.filter(fn tag -> is_bitstring(tag) end)
|> Enum.each(fn tag -> Pleroma.Web.Streamer.stream("hashtag:" <> tag, activity) end)
if object.data["attachment"] != [] do
Pleroma.Web.Streamer.stream("public:media", activity)
if activity.local do
Pleroma.Web.Streamer.stream("public:local:media", activity)
end
end
end
else
if get_visibility(activity) == "direct",
do: Pleroma.Web.Streamer.stream("direct", activity)
end
end
end
def stream_out(_activity) do
:noop
end
def create(%{to: to, actor: actor, context: context, object: object} = params, fake \\ false) do
@ -435,6 +410,7 @@ def delete(%Object{data: %{"id" => id, "actor" => actor}} = object, local \\ tru
end
end
@spec block(User.t(), User.t(), String.t() | nil, boolean) :: {:ok, Activity.t() | nil}
def block(blocker, blocked, activity_id \\ nil, local \\ true) do
outgoing_blocks = Config.get([:activitypub, :outgoing_blocks])
unfollow_blocked = Config.get([:activitypub, :unfollow_blocked])
@ -463,10 +439,11 @@ def unblock(blocker, blocked, activity_id \\ nil, local \\ true) do
end
end
@spec flag(map()) :: {:ok, Activity.t()} | any
def flag(
%{
actor: actor,
context: context,
context: _context,
account: account,
statuses: statuses,
content: content
@ -478,14 +455,6 @@ def flag(
additional = params[:additional] || %{}
params = %{
actor: actor,
context: context,
account: account,
statuses: statuses,
content: content
}
additional =
if forward do
Map.merge(additional, %{"to" => [], "cc" => [account.ap_id]})
@ -551,9 +520,10 @@ def fetch_latest_activity_id_for_context(context, opts \\ %{}) do
end
def fetch_public_activities(opts \\ %{}) do
q = fetch_activities_query([Pleroma.Constants.as_public()], opts)
opts = Map.drop(opts, ["user"])
q
[Pleroma.Constants.as_public()]
|> fetch_activities_query(opts)
|> restrict_unlisted()
|> Pagination.fetch_paginated(opts)
|> Enum.reverse()

View file

@ -6,6 +6,7 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
use Pleroma.Web, :controller
alias Pleroma.Activity
alias Pleroma.Delivery
alias Pleroma.Object
alias Pleroma.Object.Fetcher
alias Pleroma.User
@ -23,6 +24,12 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
action_fallback(:errors)
plug(
Pleroma.Plugs.Cache,
[query_params: false, tracking_fun: &__MODULE__.track_object_fetch/2]
when action in [:activity, :object]
)
plug(Pleroma.Web.FederatingPlug when action in [:inbox, :relay])
plug(:set_requester_reachable when action in [:inbox])
plug(:relay_active? when action in [:relay])
@ -42,7 +49,8 @@ def user(conn, %{"nickname" => nickname}) do
{:ok, user} <- User.ensure_keys_present(user) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("user.json", %{user: user}))
|> put_view(UserView)
|> render("user.json", %{user: user})
else
nil -> {:error, :not_found}
end
@ -53,14 +61,27 @@ def object(conn, %{"uuid" => uuid}) do
%Object{} = object <- Object.get_cached_by_ap_id(ap_id),
{_, true} <- {:public?, Visibility.is_public?(object)} do
conn
|> assign(:tracking_fun_data, object.id)
|> set_cache_ttl_for(object)
|> put_resp_content_type("application/activity+json")
|> json(ObjectView.render("object.json", %{object: object}))
|> put_view(ObjectView)
|> render("object.json", object: object)
else
{:public?, false} ->
{:error, :not_found}
end
end
def track_object_fetch(conn, nil), do: conn
def track_object_fetch(conn, object_id) do
with %{assigns: %{user: %User{id: user_id}}} <- conn do
Delivery.create(object_id, user_id)
end
conn
end
def object_likes(conn, %{"uuid" => uuid, "page" => page}) do
with ap_id <- o_status_url(conn, :object, uuid),
%Object{} = object <- Object.get_cached_by_ap_id(ap_id),
@ -70,7 +91,8 @@ def object_likes(conn, %{"uuid" => uuid, "page" => page}) do
conn
|> put_resp_content_type("application/activity+json")
|> json(ObjectView.render("likes.json", ap_id, likes, page))
|> put_view(ObjectView)
|> render("likes.json", %{ap_id: ap_id, likes: likes, page: page})
else
{:public?, false} ->
{:error, :not_found}
@ -84,7 +106,8 @@ def object_likes(conn, %{"uuid" => uuid}) do
likes <- Utils.get_object_likes(object) do
conn
|> put_resp_content_type("application/activity+json")
|> json(ObjectView.render("likes.json", ap_id, likes))
|> put_view(ObjectView)
|> render("likes.json", %{ap_id: ap_id, likes: likes})
else
{:public?, false} ->
{:error, :not_found}
@ -96,19 +119,50 @@ def activity(conn, %{"uuid" => uuid}) do
%Activity{} = activity <- Activity.normalize(ap_id),
{_, true} <- {:public?, Visibility.is_public?(activity)} do
conn
|> maybe_set_tracking_data(activity)
|> set_cache_ttl_for(activity)
|> put_resp_content_type("application/activity+json")
|> json(ObjectView.render("object.json", %{object: activity}))
|> put_view(ObjectView)
|> render("object.json", object: activity)
else
{:public?, false} ->
{:error, :not_found}
{:public?, false} -> {:error, :not_found}
nil -> {:error, :not_found}
end
end
defp maybe_set_tracking_data(conn, %Activity{data: %{"type" => "Create"}} = activity) do
object_id = Object.normalize(activity).id
assign(conn, :tracking_fun_data, object_id)
end
defp maybe_set_tracking_data(conn, _activity), do: conn
defp set_cache_ttl_for(conn, %Activity{object: object}) do
set_cache_ttl_for(conn, object)
end
defp set_cache_ttl_for(conn, entity) do
ttl =
case entity do
%Object{data: %{"type" => "Question"}} ->
Pleroma.Config.get([:web_cache_ttl, :activity_pub_question])
%Object{} ->
Pleroma.Config.get([:web_cache_ttl, :activity_pub])
_ ->
nil
end
assign(conn, :cache_ttl, ttl)
end
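The TTLs consulted here would come from a `:web_cache_ttl` config block along these lines (the values are illustrative, not necessarily the shipped defaults):

config :pleroma, :web_cache_ttl,
  activity_pub: nil,
  activity_pub_question: :timer.seconds(30)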
# GET /relay/following
def following(%{assigns: %{relay: true}} = conn, _params) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("following.json", %{user: Relay.get_actor()}))
|> put_view(UserView)
|> render("following.json", %{user: Relay.get_actor()})
end
def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do
@ -120,7 +174,8 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "p
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("following.json", %{user: user, page: page, for: for_user}))
|> put_view(UserView)
|> render("following.json", %{user: user, page: page, for: for_user})
else
{:show_follows, _} ->
conn
@ -134,7 +189,8 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) d
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("following.json", %{user: user, for: for_user}))
|> put_view(UserView)
|> render("following.json", %{user: user, for: for_user})
end
end
@ -142,7 +198,8 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) d
def followers(%{assigns: %{relay: true}} = conn, _params) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("followers.json", %{user: Relay.get_actor()}))
|> put_view(UserView)
|> render("followers.json", %{user: Relay.get_actor()})
end
def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do
@ -154,7 +211,8 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "p
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("followers.json", %{user: user, page: page, for: for_user}))
|> put_view(UserView)
|> render("followers.json", %{user: user, page: page, for: for_user})
else
{:show_followers, _} ->
conn
@ -168,7 +226,8 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) d
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("followers.json", %{user: user, for: for_user}))
|> put_view(UserView)
|> render("followers.json", %{user: user, for: for_user})
end
end
@ -177,7 +236,8 @@ def outbox(conn, %{"nickname" => nickname} = params) do
{:ok, user} <- User.ensure_keys_present(user) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("outbox.json", %{user: user, max_id: params["max_id"]}))
|> put_view(UserView)
|> render("outbox.json", %{user: user, max_id: params["max_id"]})
end
end
@ -225,7 +285,8 @@ defp represent_service_actor(%User{} = user, conn) do
with {:ok, user} <- User.ensure_keys_present(user) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("user.json", %{user: user}))
|> put_view(UserView)
|> render("user.json", %{user: user})
else
nil -> {:error, :not_found}
end
@ -246,27 +307,42 @@ def internal_fetch(conn, _params) do
def whoami(%{assigns: %{user: %User{} = user}} = conn, _params) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("user.json", %{user: user}))
|> put_view(UserView)
|> render("user.json", %{user: user})
end
def whoami(_conn, _params), do: {:error, :not_found}
def read_inbox(%{assigns: %{user: user}} = conn, %{"nickname" => nickname} = params) do
if nickname == user.nickname do
def read_inbox(
%{assigns: %{user: %{nickname: nickname} = user}} = conn,
%{"nickname" => nickname} = params
) do
conn
|> put_resp_content_type("application/activity+json")
|> json(UserView.render("inbox.json", %{user: user, max_id: params["max_id"]}))
else
err =
dgettext("errors", "can't read inbox of %{nickname} as %{as_nickname}",
nickname: nickname,
as_nickname: user.nickname
)
|> put_view(UserView)
|> render("inbox.json", user: user, max_id: params["max_id"])
end
def read_inbox(%{assigns: %{user: nil}} = conn, %{"nickname" => nickname}) do
err = dgettext("errors", "can't read inbox of %{nickname}", nickname: nickname)
conn
|> put_status(:forbidden)
|> json(err)
end
def read_inbox(%{assigns: %{user: %{nickname: as_nickname}}} = conn, %{
"nickname" => nickname
}) do
err =
dgettext("errors", "can't read inbox of %{nickname} as %{as_nickname}",
nickname: nickname,
as_nickname: as_nickname
)
conn
|> put_status(:forbidden)
|> json(err)
end
def handle_user_activity(user, %{"type" => "Create"} = params) do

View file

@ -8,6 +8,7 @@ defmodule Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy do
alias Pleroma.HTTP
alias Pleroma.Web.MediaProxy
alias Pleroma.Workers.BackgroundWorker
require Logger
@ -30,7 +31,7 @@ def perform(:preload, %{"object" => %{"attachment" => attachments}} = _message)
url
|> Enum.each(fn
%{"href" => href} ->
PleromaJobQueue.enqueue(:background, __MODULE__, [:prefetch, href])
BackgroundWorker.enqueue("media_proxy_prefetch", %{"url" => href})
x ->
Logger.debug("Unhandled attachment URL object #{inspect(x)}")
@ -46,7 +47,7 @@ def filter(
%{"type" => "Create", "object" => %{"attachment" => attachments} = _object} = message
)
when is_list(attachments) and length(attachments) > 0 do
PleromaJobQueue.enqueue(:background, __MODULE__, [:preload, message])
BackgroundWorker.enqueue("media_proxy_preload", %{"message" => message})
{:ok, message}
end

View file

@ -5,8 +5,10 @@
defmodule Pleroma.Web.ActivityPub.Publisher do
alias Pleroma.Activity
alias Pleroma.Config
alias Pleroma.Delivery
alias Pleroma.HTTP
alias Pleroma.Instances
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Relay
alias Pleroma.Web.ActivityPub.Transmogrifier
@ -84,6 +86,15 @@ def publish_one(%{inbox: inbox, json: json, actor: %User{} = actor, id: id} = pa
end
end
def publish_one(%{actor_id: actor_id} = params) do
actor = User.get_cached_by_id(actor_id)
params
|> Map.delete(:actor_id)
|> Map.put(:actor, actor)
|> publish_one()
end
defp should_federate?(inbox, public) do
if public do
true
@ -107,7 +118,18 @@ defp recipients(actor, activity) do
{:ok, []}
end
Pleroma.Web.Salmon.remote_users(actor, activity) ++ followers
fetchers =
with %Activity{data: %{"type" => "Delete"}} <- activity,
%Object{id: object_id} <- Object.normalize(activity),
fetchers <- User.get_delivered_users_by_object_id(object_id),
_ <- Delivery.delete_all_by_object_id(object_id) do
fetchers
else
_ ->
[]
end
Pleroma.Web.Salmon.remote_users(actor, activity) ++ followers ++ fetchers
end
defp get_cc_ap_ids(ap_id, recipients) do
@ -159,7 +181,8 @@ def determine_inbox(
Publishes an activity with BCC to all relevant peers.
"""
def publish(actor, %{data: %{"bcc" => bcc}} = activity) when is_list(bcc) and bcc != [] do
def publish(%User{} = actor, %{data: %{"bcc" => bcc}} = activity)
when is_list(bcc) and bcc != [] do
public = is_public?(activity)
{:ok, data} = Transmogrifier.prepare_outgoing(activity.data)
@ -186,7 +209,7 @@ def publish(actor, %{data: %{"bcc" => bcc}} = activity) when is_list(bcc) and bc
Pleroma.Web.Federator.Publisher.enqueue_one(__MODULE__, %{
inbox: inbox,
json: json,
actor: actor,
actor_id: actor.id,
id: activity.data["id"],
unreachable_since: unreachable_since
})
@ -221,7 +244,7 @@ def publish(%User{} = actor, %Activity{} = activity) do
%{
inbox: inbox,
json: json,
actor: actor,
actor_id: actor.id,
id: activity.data["id"],
unreachable_since: unreachable_since
}

View file

@ -15,6 +15,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Federator
alias Pleroma.Workers.TransmogrifierWorker
import Ecto.Query
@ -41,8 +42,7 @@ def fix_object(object, options \\ []) do
end
def fix_summary(%{"summary" => nil} = object) do
object
|> Map.put("summary", "")
Map.put(object, "summary", "")
end
def fix_summary(%{"summary" => _} = object) do
@ -50,10 +50,7 @@ def fix_summary(%{"summary" => _} = object) do
object
end
def fix_summary(object) do
object
|> Map.put("summary", "")
end
def fix_summary(object), do: Map.put(object, "summary", "")
def fix_addressing_list(map, field) do
cond do
@ -73,13 +70,9 @@ def fix_explicit_addressing(
explicit_mentions,
follower_collection
) do
explicit_to =
to
|> Enum.filter(fn x -> x in explicit_mentions end)
explicit_to = Enum.filter(to, fn x -> x in explicit_mentions end)
explicit_cc =
to
|> Enum.filter(fn x -> x not in explicit_mentions end)
explicit_cc = Enum.filter(to, fn x -> x not in explicit_mentions end)
final_cc =
(cc ++ explicit_cc)
@ -97,13 +90,19 @@ def fix_explicit_addressing(object, _explicit_mentions, _followers_collection),
def fix_explicit_addressing(%{"directMessage" => true} = object), do: object
def fix_explicit_addressing(object) do
explicit_mentions =
explicit_mentions = Utils.determine_explicit_mentions(object)
%User{follower_address: follower_collection} =
object
|> Utils.determine_explicit_mentions()
|> Containment.get_actor()
|> User.get_cached_by_ap_id()
follower_collection = User.get_cached_by_ap_id(Containment.get_actor(object)).follower_address
explicit_mentions = explicit_mentions ++ [Pleroma.Constants.as_public(), follower_collection]
explicit_mentions =
explicit_mentions ++
[
Pleroma.Constants.as_public(),
follower_collection
]
fix_explicit_addressing(object, explicit_mentions, follower_collection)
end
@ -147,15 +146,37 @@ def fix_addressing(object) do
end
def fix_actor(%{"attributedTo" => actor} = object) do
object
|> Map.put("actor", Containment.get_actor(%{"actor" => actor}))
Map.put(object, "actor", Containment.get_actor(%{"actor" => actor}))
end
def fix_in_reply_to(object, options \\ [])
def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
when not is_nil(in_reply_to) do
in_reply_to_id =
in_reply_to_id = prepare_in_reply_to(in_reply_to)
object = Map.put(object, "inReplyToAtomUri", in_reply_to_id)
if Federator.allowed_incoming_reply_depth?(options[:depth]) do
with {:ok, replied_object} <- get_obj_helper(in_reply_to_id, options),
%Activity{} = _ <- Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
object
|> Map.put("inReplyTo", replied_object.data["id"])
|> Map.put("inReplyToAtomUri", object["inReplyToAtomUri"] || in_reply_to_id)
|> Map.put("conversation", replied_object.data["context"] || object["conversation"])
|> Map.put("context", replied_object.data["context"] || object["conversation"])
else
e ->
Logger.error("Couldn't fetch #{inspect(in_reply_to_id)}, error: #{inspect(e)}")
object
end
else
object
end
end
def fix_in_reply_to(object, _options), do: object
defp prepare_in_reply_to(in_reply_to) do
cond do
is_bitstring(in_reply_to) ->
in_reply_to
@ -166,40 +187,11 @@ def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
is_list(in_reply_to) && is_bitstring(Enum.at(in_reply_to, 0)) ->
Enum.at(in_reply_to, 0)
# Maybe I should output an error too?
true ->
""
end
object = Map.put(object, "inReplyToAtomUri", in_reply_to_id)
if Federator.allowed_incoming_reply_depth?(options[:depth]) do
case get_obj_helper(in_reply_to_id, options) do
{:ok, replied_object} ->
with %Activity{} = _activity <-
Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
object
|> Map.put("inReplyTo", replied_object.data["id"])
|> Map.put("inReplyToAtomUri", object["inReplyToAtomUri"] || in_reply_to_id)
|> Map.put("conversation", replied_object.data["context"] || object["conversation"])
|> Map.put("context", replied_object.data["context"] || object["conversation"])
else
e ->
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
e ->
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
else
object
end
end
def fix_in_reply_to(object, _options), do: object
def fix_context(object) do
context = object["context"] || object["conversation"] || Utils.generate_context_id()
@ -210,11 +202,9 @@ def fix_context(object) do
def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachment) do
attachments =
attachment
|> Enum.map(fn data ->
Enum.map(attachment, fn data ->
media_type = data["mediaType"] || data["mimeType"]
href = data["url"] || data["href"]
url = [%{"type" => "Link", "mediaType" => media_type, "href" => href}]
data
@ -222,30 +212,25 @@ def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachm
|> Map.put("url", url)
end)
object
|> Map.put("attachment", attachments)
Map.put(object, "attachment", attachments)
end
def fix_attachments(%{"attachment" => attachment} = object) when is_map(attachment) do
Map.put(object, "attachment", [attachment])
object
|> Map.put("attachment", [attachment])
|> fix_attachments()
end
def fix_attachments(object), do: object
def fix_url(%{"url" => url} = object) when is_map(url) do
object
|> Map.put("url", url["href"])
Map.put(object, "url", url["href"])
end
def fix_url(%{"type" => "Video", "url" => url} = object) when is_list(url) do
first_element = Enum.at(url, 0)
link_element =
url
|> Enum.filter(fn x -> is_map(x) end)
|> Enum.filter(fn x -> x["mimeType"] == "text/html" end)
|> Enum.at(0)
link_element = Enum.find(url, fn x -> is_map(x) and x["mimeType"] == "text/html" end)
object
|> Map.put("attachment", [first_element])
@ -263,36 +248,32 @@ def fix_url(%{"type" => object_type, "url" => url} = object)
true -> ""
end
object
|> Map.put("url", url_string)
Map.put(object, "url", url_string)
end
def fix_url(object), do: object
def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do
emoji = tags |> Enum.filter(fn data -> data["type"] == "Emoji" and data["icon"] end)
emoji =
emoji
tags
|> Enum.filter(fn data -> data["type"] == "Emoji" and data["icon"] end)
|> Enum.reduce(%{}, fn data, mapping ->
name = String.trim(data["name"], ":")
mapping |> Map.put(name, data["icon"]["url"])
Map.put(mapping, name, data["icon"]["url"])
end)
# we merge mastodon and pleroma emoji into a single mapping, to allow for both wire formats
emoji = Map.merge(object["emoji"] || %{}, emoji)
object
|> Map.put("emoji", emoji)
Map.put(object, "emoji", emoji)
end
def fix_emoji(%{"tag" => %{"type" => "Emoji"} = tag} = object) do
name = String.trim(tag["name"], ":")
emoji = %{name => tag["icon"]["url"]}
object
|> Map.put("emoji", emoji)
Map.put(object, "emoji", emoji)
end
def fix_emoji(object), do: object
@ -303,17 +284,13 @@ def fix_tag(%{"tag" => tag} = object) when is_list(tag) do
|> Enum.filter(fn data -> data["type"] == "Hashtag" and data["name"] end)
|> Enum.map(fn data -> String.slice(data["name"], 1..-1) end)
combined = tag ++ tags
object
|> Map.put("tag", combined)
Map.put(object, "tag", tag ++ tags)
end
def fix_tag(%{"tag" => %{"type" => "Hashtag", "name" => hashtag} = tag} = object) do
combined = [tag, String.slice(hashtag, 1..-1)]
object
|> Map.put("tag", combined)
Map.put(object, "tag", combined)
end
def fix_tag(%{"tag" => %{} = tag} = object), do: Map.put(object, "tag", [tag])
@ -325,8 +302,7 @@ def fix_content_map(%{"contentMap" => content_map} = object) do
content_groups = Map.to_list(content_map)
{_, content} = Enum.at(content_groups, 0)
object
|> Map.put("content", content)
Map.put(object, "content", content)
end
def fix_content_map(object), do: object
@ -335,16 +311,11 @@ def fix_type(object, options \\ [])
def fix_type(%{"inReplyTo" => reply_id, "name" => _} = object, options)
when is_binary(reply_id) do
reply =
with true <- Federator.allowed_incoming_reply_depth?(options[:depth]),
{:ok, object} <- get_obj_helper(reply_id, options) do
object
end
if reply && reply.data["type"] == "Question" do
{:ok, %{data: %{"type" => "Question"} = _} = _} <- get_obj_helper(reply_id, options) do
Map.put(object, "type", "Answer")
else
object
_ -> object
end
end
@ -376,6 +347,17 @@ defp get_follow_activity(follow_object, followed) do
end
end
# Reduce the object list to find the reported user.
defp get_reported(objects) do
Enum.reduce_while(objects, nil, fn ap_id, _ ->
with %User{} = user <- User.get_cached_by_ap_id(ap_id) do
{:halt, user}
else
_ -> {:cont, nil}
end
end)
end
def handle_incoming(data, options \\ [])
# Flag objects are placed ahead of the ID check because Mastodon 2.8 and earlier send them
@ -384,31 +366,19 @@ def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} =
with context <- data["context"] || Utils.generate_context_id(),
content <- data["content"] || "",
%User{} = actor <- User.get_cached_by_ap_id(actor),
# Reduce the object list to find the reported user.
%User{} = account <-
Enum.reduce_while(objects, nil, fn ap_id, _ ->
with %User{} = user <- User.get_cached_by_ap_id(ap_id) do
{:halt, user}
else
_ -> {:cont, nil}
end
end),
%User{} = account <- get_reported(objects),
# Remove the reported user from the object list.
statuses <- Enum.filter(objects, fn ap_id -> ap_id != account.ap_id end) do
params = %{
%{
actor: actor,
context: context,
account: account,
statuses: statuses,
content: content,
additional: %{
"cc" => [account.ap_id]
additional: %{"cc" => [account.ap_id]}
}
}
ActivityPub.flag(params)
|> ActivityPub.flag()
end
end
@ -755,8 +725,12 @@ def handle_incoming(
def handle_incoming(_, _), do: :error
@spec get_obj_helper(String.t(), Keyword.t()) :: {:ok, Object.t()} | nil
def get_obj_helper(id, options \\ []) do
if object = Object.normalize(id, true, options), do: {:ok, object}, else: nil
case Object.normalize(id, true, options) do
%Object{} = object -> {:ok, object}
_ -> nil
end
end
def set_reply_to_uri(%{"inReplyTo" => in_reply_to} = object) when is_binary(in_reply_to) do
@ -855,26 +829,23 @@ def prepare_outgoing(%{"type" => _type} = data) do
{:ok, data}
end
def maybe_fix_object_url(data) do
if is_binary(data["object"]) and not String.starts_with?(data["object"], "http") do
case get_obj_helper(data["object"]) do
{:ok, relative_object} ->
if relative_object.data["external_url"] do
_data =
data
|> Map.put("object", relative_object.data["external_url"])
def maybe_fix_object_url(%{"object" => object} = data) when is_binary(object) do
with false <- String.starts_with?(object, "http"),
{:fetch, {:ok, relative_object}} <- {:fetch, get_obj_helper(object)},
%{data: %{"external_url" => external_url}} when not is_nil(external_url) <-
relative_object do
Map.put(data, "object", external_url)
else
{:fetch, e} ->
Logger.error("Couldn't fetch #{object} #{inspect(e)}")
data
_ ->
data
end
end
e ->
Logger.error("Couldn't fetch #{data["object"]} #{inspect(e)}")
data
end
else
data
end
end
def maybe_fix_object_url(data), do: data
def add_hashtags(object) do
tags =
@ -893,38 +864,42 @@ def add_hashtags(object) do
tag
end)
object
|> Map.put("tag", tags)
Map.put(object, "tag", tags)
end
def add_mention_tags(object) do
mentions =
object
|> Utils.get_notified_from_object()
|> Enum.map(fn user ->
%{"type" => "Mention", "href" => user.ap_id, "name" => "@#{user.nickname}"}
end)
|> Enum.map(&build_mention_tag/1)
tags = object["tag"] || []
object
|> Map.put("tag", tags ++ mentions)
Map.put(object, "tag", tags ++ mentions)
end
def add_emoji_tags(%User{info: %{"emoji" => _emoji} = user_info} = object) do
user_info = add_emoji_tags(user_info)
defp build_mention_tag(%{ap_id: ap_id, nickname: nickname} = _) do
%{"type" => "Mention", "href" => ap_id, "name" => "@#{nickname}"}
end
object
|> Map.put(:info, user_info)
def take_emoji_tags(%User{info: %{emoji: emoji} = _user_info} = _user) do
emoji
|> Enum.flat_map(&Map.to_list/1)
|> Enum.map(&build_emoji_tag/1)
end
# TODO: we should probably send mtime instead of unix epoch time for updated
def add_emoji_tags(%{"emoji" => emoji} = object) do
tags = object["tag"] || []
out =
emoji
|> Enum.map(fn {name, url} ->
out = Enum.map(emoji, &build_emoji_tag/1)
Map.put(object, "tag", tags ++ out)
end
def add_emoji_tags(object), do: object
defp build_emoji_tag({name, url}) do
%{
"icon" => %{"url" => url, "type" => "Image"},
"name" => ":" <> name <> ":",
@ -932,14 +907,6 @@ def add_emoji_tags(%{"emoji" => emoji} = object) do
"updated" => "1970-01-01T00:00:00Z",
"id" => url
}
end)
object
|> Map.put("tag", tags ++ out)
end
def add_emoji_tags(object) do
object
end
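A small illustration of the reworked emoji tagging (the emoji name and URL are made up):

object = %{
  "content" => "hello :firefox:",
  "emoji" => %{"firefox" => "https://example.social/emoji/firefox.png"},
  "tag" => []
}

Pleroma.Web.ActivityPub.Transmogrifier.add_emoji_tags(object)
# => the "tag" list gains one entry:
#    %{"type" => "Emoji", "name" => ":firefox:", "icon" => %{"type" => "Image", "url" => ...}, ...}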
def set_conversation(object) do
@ -959,9 +926,7 @@ def set_type(object), do: object
def add_attributed_to(object) do
attributed_to = object["attributedTo"] || object["actor"]
object
|> Map.put("attributedTo", attributed_to)
Map.put(object, "attributedTo", attributed_to)
end
def prepare_attachments(object) do
@ -972,30 +937,18 @@ def prepare_attachments(object) do
%{"url" => href, "mediaType" => media_type, "name" => data["name"], "type" => "Document"}
end)
object
|> Map.put("attachment", attachments)
Map.put(object, "attachment", attachments)
end
defp strip_internal_fields(object) do
object
|> Map.drop([
"likes",
"like_count",
"announcements",
"announcement_count",
"emoji",
"context_id",
"deleted_activity_id"
])
|> Map.drop(Pleroma.Constants.object_internal_fields())
end
defp strip_internal_tags(%{"tag" => tags} = object) do
tags =
tags
|> Enum.filter(fn x -> is_map(x) end)
tags = Enum.filter(tags, fn x -> is_map(x) end)
object
|> Map.put("tag", tags)
Map.put(object, "tag", tags)
end
defp strip_internal_tags(object), do: object
@ -1049,9 +1002,9 @@ def upgrade_user_from_ap_id(ap_id) do
with %User{local: false} = user <- User.get_cached_by_ap_id(ap_id),
{:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id),
already_ap <- User.ap_enabled?(user),
{:ok, user} <- user |> User.upgrade_changeset(data) |> User.update_and_set_cache() do
unless already_ap do
PleromaJobQueue.enqueue(:transmogrifier, __MODULE__, [:user_upgrade, user])
{:ok, user} <- upgrade_user(user, data) do
if not already_ap do
TransmogrifierWorker.enqueue("user_upgrade", %{"user_id" => user.id})
end
{:ok, user}
@ -1061,6 +1014,12 @@ def upgrade_user_from_ap_id(ap_id) do
end
end
defp upgrade_user(user, data) do
user
|> User.upgrade_changeset(data, true)
|> User.update_and_set_cache()
end
def maybe_retire_websub(ap_id) do
# some sanity checks
if is_binary(ap_id) && String.length(ap_id) > 8 do
@ -1074,16 +1033,11 @@ def maybe_retire_websub(ap_id) do
end
end
def maybe_fix_user_url(data) do
if is_map(data["url"]) do
Map.put(data, "url", data["url"]["href"])
else
data
end
def maybe_fix_user_url(%{"url" => url} = data) when is_map(url) do
Map.put(data, "url", url["href"])
end
def maybe_fix_user_object(data) do
data
|> maybe_fix_user_url
end
def maybe_fix_user_url(data), do: data
def maybe_fix_user_object(data), do: maybe_fix_user_url(data)
end

View file

@ -33,50 +33,40 @@ def normalize_params(params) do
Map.put(params, "actor", get_ap_id(params["actor"]))
end
def determine_explicit_mentions(%{"tag" => tag} = _object) when is_list(tag) do
tag
|> Enum.filter(fn x -> is_map(x) end)
|> Enum.filter(fn x -> x["type"] == "Mention" end)
|> Enum.map(fn x -> x["href"] end)
@spec determine_explicit_mentions(map()) :: map()
def determine_explicit_mentions(%{"tag" => tag} = _) when is_list(tag) do
Enum.flat_map(tag, fn
%{"type" => "Mention", "href" => href} -> [href]
_ -> []
end)
end
def determine_explicit_mentions(%{"tag" => tag} = object) when is_map(tag) do
Map.put(object, "tag", [tag])
object
|> Map.put("tag", [tag])
|> determine_explicit_mentions()
end
def determine_explicit_mentions(_), do: []
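# Illustrative usage of the flat_map-based rewrite above (AP ids are
# hypothetical): only "Mention" tags contribute their href, everything else is
# dropped, and a single map-valued "tag" is wrapped in a list first.
#
#   iex> determine_explicit_mentions(%{
#   ...>   "tag" => [
#   ...>     %{"type" => "Mention", "href" => "https://example.test/users/alice"},
#   ...>     %{"type" => "Hashtag", "name" => "#pleroma"}
#   ...>   ]
#   ...> })
#   ["https://example.test/users/alice"]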
@spec recipient_in_collection(any(), any()) :: boolean()
defp recipient_in_collection(ap_id, coll) when is_binary(coll), do: ap_id == coll
defp recipient_in_collection(ap_id, coll) when is_list(coll), do: ap_id in coll
defp recipient_in_collection(_, _), do: false
@spec recipient_in_message(User.t(), User.t(), map()) :: boolean()
def recipient_in_message(%User{ap_id: ap_id} = recipient, %User{} = actor, params) do
addresses = [params["to"], params["cc"], params["bto"], params["bcc"]]
cond do
recipient_in_collection(ap_id, params["to"]) ->
true
recipient_in_collection(ap_id, params["cc"]) ->
true
recipient_in_collection(ap_id, params["bto"]) ->
true
recipient_in_collection(ap_id, params["bcc"]) ->
true
Enum.any?(addresses, &recipient_in_collection(ap_id, &1)) -> true
# if the message is unaddressed at all, then assume it is directly addressed
# to the recipient
!params["to"] && !params["cc"] && !params["bto"] && !params["bcc"] ->
true
Enum.all?(addresses, &is_nil(&1)) -> true
# if the message is sent from somebody the user is following, then assume it
# is addressed to the recipient
User.following?(recipient, actor) ->
true
true ->
false
User.following?(recipient, actor) -> true
true -> false
end
end
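# Sketch of the condensed addressing rules (values are hypothetical): a
# recipient listed in any of to/cc/bto/bcc matches, a completely unaddressed
# message is treated as addressed to the recipient, and otherwise following
# the actor decides.
#
#   iex> recipient_in_message(
#   ...>   %User{ap_id: "https://a.test/users/bob"},
#   ...>   %User{ap_id: "https://b.test/users/alice"},
#   ...>   %{"to" => ["https://a.test/users/bob"], "cc" => []}
#   ...> )
#   true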
@ -85,15 +75,13 @@ defp extract_list(lst) when is_list(lst), do: lst
defp extract_list(_), do: []
def maybe_splice_recipient(ap_id, params) do
need_splice =
need_splice? =
!recipient_in_collection(ap_id, params["to"]) &&
!recipient_in_collection(ap_id, params["cc"])
if need_splice? do
cc_list = extract_list(params["cc"])
if need_splice do
params
|> Map.put("cc", [ap_id | cc_list])
Map.put(params, "cc", [ap_id | cc_list])
else
params
end
@ -139,7 +127,7 @@ def get_notified_from_object(%{"type" => type} = object) when type in @supported
"object" => object
}
Notification.get_notified_from_activity(%Activity{data: fake_create_activity}, false)
get_notified_from_object(fake_create_activity)
end
def get_notified_from_object(object) do
@ -169,14 +157,7 @@ def create_context(context) do
@spec maybe_federate(any()) :: :ok
def maybe_federate(%Activity{local: true} = activity) do
if Pleroma.Config.get!([:instance, :federating]) do
priority =
case activity.data["type"] do
"Delete" -> 10
"Create" -> 1
_ -> 5
end
Pleroma.Web.Federator.publish(activity, priority)
Pleroma.Web.Federator.publish(activity)
end
:ok
@ -188,9 +169,19 @@ def maybe_federate(_), do: :ok
Adds an id and a published date if they aren't there,
also adds it to an included object
"""
def lazy_put_activity_defaults(map, fake \\ false) do
map =
unless fake do
@spec lazy_put_activity_defaults(map(), boolean) :: map()
def lazy_put_activity_defaults(map, fake? \\ false)
def lazy_put_activity_defaults(map, true) do
map
|> Map.put_new("id", "pleroma:fakeid")
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", "pleroma:fakecontext")
|> Map.put_new("context_id", -1)
|> lazy_put_object_defaults(true)
end
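# Sketch of the fake clause above on a hypothetical draft activity: placeholder
# ids and contexts are filled in so a preview can be rendered without touching
# the database, and the embedded object gets the same treatment.
#
#   iex> lazy_put_activity_defaults(%{"type" => "Create", "object" => %{}}, true)
#   %{
#     "type" => "Create",
#     "id" => "pleroma:fakeid",
#     "context" => "pleroma:fakecontext",
#     "context_id" => -1,
#     "published" => "...",  # make_date/0 timestamp
#     "object" => %{
#       "id" => "pleroma:fake_object_id",
#       "context" => "pleroma:fakecontext",
#       "context_id" => -1,
#       "fake" => true,
#       "published" => "..."  # make_date/0 timestamp
#     }
#   }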
def lazy_put_activity_defaults(map, _fake?) do
%{data: %{"id" => context}, id: context_id} = create_context(map["context"])
map
@ -198,53 +189,46 @@ def lazy_put_activity_defaults(map, fake \\ false) do
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", context)
|> Map.put_new("context_id", context_id)
else
map
|> Map.put_new("id", "pleroma:fakeid")
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", "pleroma:fakecontext")
|> Map.put_new("context_id", -1)
|> lazy_put_object_defaults(false)
end
if is_map(map["object"]) do
object = lazy_put_object_defaults(map["object"], map, fake)
%{map | "object" => object}
else
# Adds an id and published date if they aren't there.
#
@spec lazy_put_object_defaults(map(), boolean()) :: map()
defp lazy_put_object_defaults(%{"object" => map} = activity, true)
when is_map(map) do
object =
map
end
end
@doc """
Adds an id and published date if they aren't there.
"""
def lazy_put_object_defaults(map, activity \\ %{}, fake)
def lazy_put_object_defaults(map, activity, true = _fake) do
map
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("id", "pleroma:fake_object_id")
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", activity["context"])
|> Map.put_new("fake", true)
|> Map.put_new("context_id", activity["context_id"])
|> Map.put_new("fake", true)
%{activity | "object" => object}
end
def lazy_put_object_defaults(map, activity, _fake) do
defp lazy_put_object_defaults(%{"object" => map} = activity, _)
when is_map(map) do
object =
map
|> Map.put_new_lazy("id", &generate_object_id/0)
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", activity["context"])
|> Map.put_new("context_id", activity["context_id"])
%{activity | "object" => object}
end
defp lazy_put_object_defaults(activity, _), do: activity
@doc """
Inserts a full object if it is contained in an activity.
"""
def insert_full_object(%{"object" => %{"type" => type} = object_data} = map)
when is_map(object_data) and type in @supported_object_types do
with {:ok, object} <- Object.create(object_data) do
map =
map
|> Map.put("object", object.data["id"])
map = Map.put(map, "object", object.data["id"])
{:ok, map, object}
end
@ -263,7 +247,7 @@ def get_existing_like(actor, %{data: %{"id" => id}}) do
|> Activity.Queries.by_actor()
|> Activity.Queries.by_object_id(id)
|> Activity.Queries.by_type("Like")
|> Activity.Queries.limit(1)
|> limit(1)
|> Repo.one()
end
@ -356,36 +340,35 @@ defp fetch_likes(object) do
@doc """
Updates a follow activity's state (for locked accounts).
"""
@spec update_follow_state_for_all(Activity.t(), String.t()) :: {:ok, Activity} | {:error, any()}
def update_follow_state_for_all(
%Activity{data: %{"actor" => actor, "object" => object}} = activity,
state
) do
try do
Ecto.Adapters.SQL.query!(
Repo,
"UPDATE activities SET data = jsonb_set(data, '{state}', $1) WHERE data->>'type' = 'Follow' AND data->>'actor' = $2 AND data->>'object' = $3 AND data->>'state' = 'pending'",
[state, actor, object]
)
"Follow"
|> Activity.Queries.by_type()
|> Activity.Queries.by_actor(actor)
|> Activity.Queries.by_object_id(object)
|> where(fragment("data->>'state' = 'pending'"))
|> update(set: [data: fragment("jsonb_set(data, '{state}', ?)", ^state)])
|> Repo.update_all([])
User.set_follow_state_cache(actor, object, state)
activity = Activity.get_by_id(activity.id)
{:ok, activity}
rescue
e ->
{:error, e}
end
end
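# Usage sketch (follow is a hypothetical pending Follow activity): the Ecto
# pipeline above rewrites the state of every matching pending Follow in a
# single UPDATE, refreshes the follow-state cache, and returns the reloaded
# activity.
#
#   iex> {:ok, %Activity{}} = update_follow_state_for_all(follow, "accept")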
def update_follow_state(
%Activity{data: %{"actor" => actor, "object" => object}} = activity,
state
) do
with new_data <-
activity.data
|> Map.put("state", state),
changeset <- Changeset.change(activity, data: new_data),
{:ok, activity} <- Repo.update(changeset),
_ <- User.set_follow_state_cache(actor, object, state) do
new_data = Map.put(activity.data, "state", state)
changeset = Changeset.change(activity, data: new_data)
with {:ok, activity} <- Repo.update(changeset) do
User.set_follow_state_cache(actor, object, state)
{:ok, activity}
end
end
@ -410,28 +393,14 @@ def make_follow_data(
end
def fetch_latest_follow(%User{ap_id: follower_id}, %User{ap_id: followed_id}) do
query =
from(
activity in Activity,
where:
fragment(
"? ->> 'type' = 'Follow'",
activity.data
),
where: activity.actor == ^follower_id,
"Follow"
|> Activity.Queries.by_type()
|> where(actor: ^follower_id)
# this is to use the index
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
activity.data,
activity.data,
^followed_id
),
order_by: [fragment("? desc nulls last", activity.id)],
limit: 1
)
Repo.one(query)
|> Activity.Queries.by_object_id(followed_id)
|> order_by([activity], fragment("? desc nulls last", activity.id))
|> limit(1)
|> Repo.one()
end
#### Announce-related helpers
@ -439,23 +408,14 @@ def fetch_latest_follow(%User{ap_id: follower_id}, %User{ap_id: followed_id}) do
@doc """
Returns an existing announce activity if the notice has already been announced
"""
def get_existing_announce(actor, %{data: %{"id" => id}}) do
query =
from(
activity in Activity,
where: activity.actor == ^actor,
@spec get_existing_announce(String.t(), map()) :: Activity.t() | nil
def get_existing_announce(actor, %{data: %{"id" => ap_id}}) do
"Announce"
|> Activity.Queries.by_type()
|> where(actor: ^actor)
# this is to use the index
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
activity.data,
activity.data,
^id
),
where: fragment("(?)->>'type' = 'Announce'", activity.data)
)
Repo.one(query)
|> Activity.Queries.by_object_id(ap_id)
|> Repo.one()
end
@doc """
@ -531,31 +491,35 @@ def make_unlike_data(
|> maybe_put("id", activity_id)
end
@spec add_announce_to_object(Activity.t(), Object.t()) ::
{:ok, Object.t()} | {:error, Ecto.Changeset.t()}
def add_announce_to_object(
%Activity{
data: %{"actor" => actor, "cc" => [Pleroma.Constants.as_public()]}
},
%Activity{data: %{"actor" => actor, "cc" => [Pleroma.Constants.as_public()]}},
object
) do
announcements =
if is_list(object.data["announcements"]), do: object.data["announcements"], else: []
announcements = take_announcements(object)
with announcements <- [actor | announcements] |> Enum.uniq() do
with announcements <- Enum.uniq([actor | announcements]) do
update_element_in_object("announcement", announcements, object)
end
end
def add_announce_to_object(_, object), do: {:ok, object}
@spec remove_announce_from_object(Activity.t(), Object.t()) ::
{:ok, Object.t()} | {:error, Ecto.Changeset.t()}
def remove_announce_from_object(%Activity{data: %{"actor" => actor}}, object) do
announcements =
if is_list(object.data["announcements"]), do: object.data["announcements"], else: []
with announcements <- announcements |> List.delete(actor) do
with announcements <- List.delete(take_announcements(object), actor) do
update_element_in_object("announcement", announcements, object)
end
end
defp take_announcements(%{data: %{"announcements" => announcements}} = _)
when is_list(announcements),
do: announcements
defp take_announcements(_), do: []
#### Unfollow-related helpers
def make_unfollow_data(follower, followed, follow_activity, activity_id) do
@ -569,29 +533,16 @@ def make_unfollow_data(follower, followed, follow_activity, activity_id) do
end
#### Block-related helpers
@spec fetch_latest_block(User.t(), User.t()) :: Activity.t() | nil
def fetch_latest_block(%User{ap_id: blocker_id}, %User{ap_id: blocked_id}) do
query =
from(
activity in Activity,
where:
fragment(
"? ->> 'type' = 'Block'",
activity.data
),
where: activity.actor == ^blocker_id,
"Block"
|> Activity.Queries.by_type()
|> where(actor: ^blocker_id)
# this is to use the index
where:
fragment(
"coalesce((?)->'object'->>'id', (?)->>'object') = ?",
activity.data,
activity.data,
^blocked_id
),
order_by: [fragment("? desc nulls last", activity.id)],
limit: 1
)
Repo.one(query)
|> Activity.Queries.by_object_id(blocked_id)
|> order_by([activity], fragment("? desc nulls last", activity.id))
|> limit(1)
|> Repo.one()
end
def make_block_data(blocker, blocked, activity_id) do
@ -631,28 +582,32 @@ def make_create_data(params, additional) do
end
#### Flag-related helpers
def make_flag_data(params, additional) do
status_ap_ids =
Enum.map(params.statuses || [], fn
%Activity{} = act -> act.data["id"]
act when is_map(act) -> act["id"]
act when is_binary(act) -> act
end)
object = [params.account.ap_id] ++ status_ap_ids
@spec make_flag_data(map(), map()) :: map()
def make_flag_data(%{actor: actor, context: context, content: content} = params, additional) do
%{
"type" => "Flag",
"actor" => params.actor.ap_id,
"content" => params.content,
"object" => object,
"context" => params.context,
"actor" => actor.ap_id,
"content" => content,
"object" => build_flag_object(params),
"context" => context,
"state" => "open"
}
|> Map.merge(additional)
end
def make_flag_data(_, _), do: %{}
defp build_flag_object(%{account: account, statuses: statuses} = _) do
[account.ap_id] ++
Enum.map(statuses || [], fn
%Activity{} = act -> act.data["id"]
act when is_map(act) -> act["id"]
act when is_binary(act) -> act
end)
end
defp build_flag_object(_), do: []
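# Sketch of the Flag data produced by make_flag_data/2 for a typical report
# (all ids and the content are hypothetical); build_flag_object/1 puts the
# reported account first, followed by the reported status ids:
#
#   %{
#     "type" => "Flag",
#     "actor" => "https://a.test/users/reporter",
#     "content" => "spam",
#     "object" => ["https://a.test/users/spammer", "https://a.test/objects/1"],
#     "context" => "https://a.test/contexts/abc",
#     "state" => "open"
#   }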
@doc """
Fetches the OrderedCollection/OrderedCollectionPage from `from`, limiting the amount of pages fetched after
the first one to `pages_left` pages.
@ -695,11 +650,11 @@ def fetch_ordered_collection(from, pages_left, acc \\ []) do
#### Report-related helpers
def update_report_state(%Activity{} = activity, state) when state in @supported_report_states do
with new_data <- Map.put(activity.data, "state", state),
changeset <- Changeset.change(activity, data: new_data),
{:ok, activity} <- Repo.update(changeset) do
{:ok, activity}
end
new_data = Map.put(activity.data, "state", state)
activity
|> Changeset.change(data: new_data)
|> Repo.update()
end
def update_report_state(_, _), do: {:error, "Unsupported state"}
@ -766,21 +721,13 @@ defp get_updated_targets(
end
def get_existing_votes(actor, %{data: %{"id" => id}}) do
query =
from(
[activity, object: object] in Activity.with_preloaded_object(Activity),
where: fragment("(?)->>'type' = 'Create'", activity.data),
where: fragment("(?)->>'actor' = ?", activity.data, ^actor),
where:
fragment(
"(?)->>'inReplyTo' = ?",
object.data,
^to_string(id)
),
where: fragment("(?)->>'type' = 'Answer'", object.data)
)
Repo.all(query)
actor
|> Activity.Queries.by_actor()
|> Activity.Queries.by_type("Create")
|> Activity.with_preloaded_object()
|> where([a, object: o], fragment("(?)->>'inReplyTo' = ?", o.data, ^to_string(id)))
|> where([a, object: o], fragment("(?)->>'type' = 'Answer'", o.data))
|> Repo.all()
end
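# Usage sketch (actor and poll object are hypothetical): returns the actor's
# Create activities whose object is an "Answer" replying to the given poll.
#
#   iex> get_existing_votes("https://a.test/users/alice", %Object{data: %{"id" => "https://a.test/objects/poll"}})
#   [%Activity{}, ...]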
defp maybe_put(map, _key, nil), do: map

View file

@ -37,12 +37,12 @@ def render("object.json", %{object: %Activity{} = activity}) do
Map.merge(base, additional)
end
def render("likes.json", ap_id, likes, page) do
def render("likes.json", %{ap_id: ap_id, likes: likes, page: page}) do
collection(likes, "#{ap_id}/likes", page)
|> Map.merge(Pleroma.Web.ActivityPub.Utils.make_json_ld_header())
end
def render("likes.json", ap_id, likes) do
def render("likes.json", %{ap_id: ap_id, likes: likes}) do
%{
"id" => "#{ap_id}/likes",
"type" => "OrderedCollection",

View file

@ -75,10 +75,7 @@ def render("user.json", %{user: user}) do
endpoints = render("endpoints.json", %{user: user})
user_tags =
user
|> Transmogrifier.add_emoji_tags()
|> Map.get("tag", [])
emoji_tags = Transmogrifier.take_emoji_tags(user)
fields =
user.info
@ -110,7 +107,7 @@ def render("user.json", %{user: user}) do
},
"endpoints" => endpoints,
"attachment" => fields,
"tag" => (user.info.source_data["tag"] || []) ++ user_tags
"tag" => (user.info.source_data["tag"] || []) ++ emoji_tags
}
|> Map.merge(maybe_make_image(&User.avatar_url/2, "icon", user))
|> Map.merge(maybe_make_image(&User.banner_url/2, "image", user))
@ -118,30 +115,34 @@ def render("user.json", %{user: user}) do
end
def render("following.json", %{user: user, page: page} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_follows
showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_follows
showing_count = showing_items || !user.info.hide_follows_count
query = User.get_friends_query(user)
query = from(user in query, select: [:ap_id])
following = Repo.all(query)
total =
if showing do
if showing_count do
length(following)
else
0
end
collection(following, "#{user.ap_id}/following", page, showing, total)
collection(following, "#{user.ap_id}/following", page, showing_items, total)
|> Map.merge(Utils.make_json_ld_header())
end
def render("following.json", %{user: user} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_follows
showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_follows
showing_count = showing_items || !user.info.hide_follows_count
query = User.get_friends_query(user)
query = from(user in query, select: [:ap_id])
following = Repo.all(query)
total =
if showing do
if showing_count do
length(following)
else
0
@ -152,7 +153,7 @@ def render("following.json", %{user: user} = opts) do
"type" => "OrderedCollection",
"totalItems" => total,
"first" =>
if showing do
if showing_items do
collection(following, "#{user.ap_id}/following", 1, !user.info.hide_follows)
else
"#{user.ap_id}/following?page=1"
@ -162,32 +163,34 @@ def render("following.json", %{user: user} = opts) do
end
def render("followers.json", %{user: user, page: page} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_followers
showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_followers
showing_count = showing_items || !user.info.hide_followers_count
query = User.get_followers_query(user)
query = from(user in query, select: [:ap_id])
followers = Repo.all(query)
total =
if showing do
if showing_count do
length(followers)
else
0
end
collection(followers, "#{user.ap_id}/followers", page, showing, total)
collection(followers, "#{user.ap_id}/followers", page, showing_items, total)
|> Map.merge(Utils.make_json_ld_header())
end
def render("followers.json", %{user: user} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_followers
showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_followers
showing_count = showing_items || !user.info.hide_followers_count
query = User.get_followers_query(user)
query = from(user in query, select: [:ap_id])
followers = Repo.all(query)
total =
if showing do
if showing_count do
length(followers)
else
0
@ -198,8 +201,8 @@ def render("followers.json", %{user: user} = opts) do
"type" => "OrderedCollection",
"totalItems" => total,
"first" =>
if showing do
collection(followers, "#{user.ap_id}/followers", 1, showing, total)
if showing_items do
collection(followers, "#{user.ap_id}/followers", 1, showing_items, total)
else
"#{user.ap_id}/followers?page=1"
end
@ -221,11 +224,12 @@ def render("outbox.json", %{user: user, max_id: max_qid}) do
activities = ActivityPub.fetch_user_activities(user, nil, params)
# this is sorted chronologically, so first activity is the newest (max)
{max_id, min_id, collection} =
if length(activities) > 0 do
{
Enum.at(Enum.reverse(activities), 0).id,
Enum.at(activities, 0).id,
Enum.at(Enum.reverse(activities), 0).id,
Enum.map(activities, fn act ->
{:ok, data} = Transmogrifier.prepare_outgoing(act.data)
data

View file

@ -14,6 +14,7 @@ defmodule Pleroma.Web.AdminAPI.AdminAPIController do
alias Pleroma.Web.AdminAPI.Config
alias Pleroma.Web.AdminAPI.ConfigView
alias Pleroma.Web.AdminAPI.ModerationLogView
alias Pleroma.Web.AdminAPI.Report
alias Pleroma.Web.AdminAPI.ReportView
alias Pleroma.Web.AdminAPI.Search
alias Pleroma.Web.CommonAPI
@ -139,7 +140,8 @@ def users_create(%{assigns: %{user: admin}} = conn, %{"users" => users}) do
def user_show(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname_or_id(nickname) do
conn
|> json(AccountView.render("show.json", %{user: user}))
|> put_view(AccountView)
|> render("show.json", %{user: user})
else
_ -> {:error, :not_found}
end
@ -158,7 +160,8 @@ def list_user_statuses(conn, %{"nickname" => nickname} = params) do
})
conn
|> json(StatusView.render("index.json", %{activities: activities, as: :activity}))
|> put_view(StatusView)
|> render("index.json", %{activities: activities, as: :activity})
else
_ -> {:error, :not_found}
end
@ -178,7 +181,8 @@ def user_toggle_activation(%{assigns: %{user: admin}} = conn, %{"nickname" => ni
})
conn
|> json(AccountView.render("show.json", %{user: updated_user}))
|> put_view(AccountView)
|> render("show.json", %{user: updated_user})
end
def tag_users(%{assigns: %{user: admin}} = conn, %{"nicknames" => nicknames, "tags" => tags}) do
@ -400,13 +404,23 @@ def email_invite(%{assigns: %{user: user}} = conn, %{"email" => email} = params)
end
end
@doc "Get a account registeration invite token (base64 string)"
def get_invite_token(conn, params) do
options = params["invite"] || %{}
{:ok, invite} = UserInviteToken.create_invite(options)
@doc "Create an account registration invite token"
def create_invite_token(conn, params) do
opts = %{}
conn
|> json(invite.token)
opts =
if params["max_use"],
do: Map.put(opts, :max_use, params["max_use"]),
else: opts
opts =
if params["expires_at"],
do: Map.put(opts, :expires_at, params["expires_at"]),
else: opts
{:ok, invite} = UserInviteToken.create_invite(opts)
json(conn, AccountView.render("invite.json", %{invite: invite}))
end
@doc "Get list of created invites"
@ -414,7 +428,8 @@ def invites(conn, _params) do
invites = UserInviteToken.list_invites()
conn
|> json(AccountView.render("invites.json", %{invites: invites}))
|> put_view(AccountView)
|> render("invites.json", %{invites: invites})
end
@doc "Revokes invite by token"
@ -422,7 +437,8 @@ def revoke_invite(conn, %{"token" => token}) do
with {:ok, invite} <- UserInviteToken.find_by_token(token),
{:ok, updated_invite} = UserInviteToken.update_invite(invite, %{used: true}) do
conn
|> json(AccountView.render("invite.json", %{invite: updated_invite}))
|> put_view(AccountView)
|> render("invite.json", %{invite: updated_invite})
else
nil -> {:error, :not_found}
end
@ -437,16 +453,23 @@ def get_password_reset(conn, %{"nickname" => nickname}) do
|> json(token.token)
end
@doc "Force password reset for a given user"
def force_password_reset(conn, %{"nickname" => nickname}) do
(%User{local: true} = user) = User.get_cached_by_nickname(nickname)
User.force_password_reset_async(user)
json_response(conn, :no_content, "")
end
def list_reports(conn, params) do
params =
params
|> Map.put("type", "Flag")
|> Map.put("skip_preload", true)
|> Map.put("total", true)
reports =
[]
|> ActivityPub.fetch_activities(params)
|> Enum.reverse()
reports = ActivityPub.fetch_activities([], params)
conn
|> put_view(ReportView)
@ -457,7 +480,7 @@ def report_show(conn, %{"id" => id}) do
with %Activity{} = report <- Activity.get_by_id(id) do
conn
|> put_view(ReportView)
|> render("show.json", %{report: report})
|> render("show.json", Report.extract_report_info(report))
else
_ -> {:error, :not_found}
end
@ -473,7 +496,7 @@ def report_update_state(%{assigns: %{user: admin}} = conn, %{"id" => id, "state"
conn
|> put_view(ReportView)
|> render("show.json", %{report: report})
|> render("show.json", Report.extract_report_info(report))
end
end
@ -591,6 +614,12 @@ def config_update(conn, %{"configs" => configs}) do
|> render("index.json", %{configs: updated})
end
def reload_emoji(conn, _params) do
Pleroma.Emoji.reload()
conn |> json("ok")
end
def errors(conn, {:error, :not_found}) do
conn
|> put_status(:not_found)

View file

@ -90,6 +90,8 @@ defp do_convert(entity) when is_list(entity) do
for v <- entity, into: [], do: do_convert(v)
end
defp do_convert(%Regex{} = entity), do: inspect(entity)
defp do_convert(entity) when is_map(entity) do
for {k, v} <- entity, into: %{}, do: {do_convert(k), do_convert(v)}
end
@ -122,7 +124,7 @@ def transform(entity) when is_binary(entity) or is_map(entity) or is_list(entity
def transform(entity), do: :erlang.term_to_binary(entity)
defp do_transform(%Regex{} = entity) when is_map(entity), do: entity
defp do_transform(%Regex{} = entity), do: entity
defp do_transform(%{"tuple" => [":dispatch", [entity]]}) do
{dispatch_settings, []} = do_eval(entity)
@ -154,8 +156,15 @@ defp do_transform(entity) when is_binary(entity) do
defp do_transform(entity), do: entity
defp do_transform_string("~r/" <> pattern) do
pattern = String.trim_trailing(pattern, "/")
~r/#{pattern}/
modificator = String.split(pattern, "/") |> List.last()
pattern = String.trim_trailing(pattern, "/" <> modificator)
case modificator do
"" -> ~r/#{pattern}/
"i" -> ~r/#{pattern}/i
"u" -> ~r/#{pattern}/u
"s" -> ~r/#{pattern}/s
end
end
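# Illustrative examples for the modifier-aware regex parsing above:
#
#   do_transform_string("~r/foo/")   #=> ~r/foo/
#   do_transform_string("~r/foo/i")  #=> ~r/foo/i
#   do_transform_string("~r/foo/u")  #=> ~r/foo/u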
defp do_transform_string(":" <> atom), do: String.to_atom(atom)

View file

@ -0,0 +1,22 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.AdminAPI.Report do
alias Pleroma.Activity
alias Pleroma.User
def extract_report_info(
%{data: %{"actor" => actor, "object" => [account_ap_id | status_ap_ids]}} = report
) do
user = User.get_cached_by_ap_id(actor)
account = User.get_cached_by_ap_id(account_ap_id)
statuses =
Enum.map(status_ap_ids, fn ap_id ->
Activity.get_by_ap_id_with_object(ap_id)
end)
%{report: report, user: user, account: account, statuses: statuses}
end
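# Sketch of the returned shape (the report is a hypothetical Flag activity):
#
#   %{
#     report: %Activity{},     # the Flag activity itself
#     user: %User{},           # the reporter, looked up from "actor"
#     account: %User{},        # the reported account (first "object" entry)
#     statuses: [%Activity{}]  # reported statuses; entries can be nil if deleted
#   }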
end

View file

@ -4,25 +4,26 @@
defmodule Pleroma.Web.AdminAPI.ReportView do
use Pleroma.Web, :view
alias Pleroma.Activity
alias Pleroma.HTML
alias Pleroma.User
alias Pleroma.Web.AdminAPI.Report
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.MastodonAPI.StatusView
def render("index.json", %{reports: reports}) do
%{
reports: render_many(reports, __MODULE__, "show.json", as: :report)
reports:
reports[:items]
|> Enum.map(&Report.extract_report_info(&1))
|> Enum.map(&render(__MODULE__, "show.json", &1))
|> Enum.reverse(),
total: reports[:total]
}
end
def render("show.json", %{report: report}) do
user = User.get_cached_by_ap_id(report.data["actor"])
def render("show.json", %{report: report, user: user, account: account, statuses: statuses}) do
created_at = Utils.to_masto_date(report.data["published"])
[account_ap_id | status_ap_ids] = report.data["object"]
account = User.get_cached_by_ap_id(account_ap_id)
content =
unless is_nil(report.data["content"]) do
HTML.filter_tags(report.data["content"])
@ -30,11 +31,6 @@ def render("show.json", %{report: report}) do
nil
end
statuses =
Enum.map(status_ap_ids, fn ap_id ->
Activity.get_by_ap_id_with_object(ap_id)
end)
%{
id: report.id,
account: merge_account_views(account),

View file

@ -34,27 +34,14 @@ defp param_to_integer(val, default) when is_binary(val) do
defp param_to_integer(_, default), do: default
def add_link_headers(
conn,
method,
activities,
param \\ nil,
params \\ %{},
func3 \\ nil,
func4 \\ nil
) do
def add_link_headers(conn, activities, extra_params \\ %{}) do
case List.last(activities) do
%{id: max_id} ->
params =
conn.params
|> Map.drop(Map.keys(conn.path_params))
|> Map.drop(["since_id", "max_id", "min_id"])
|> Map.merge(params)
last = List.last(activities)
func3 = func3 || (&mastodon_api_url/3)
func4 = func4 || (&mastodon_api_url/4)
if last do
max_id = last.id
|> Map.merge(extra_params)
limit =
params
@ -72,40 +59,12 @@ def add_link_headers(
|> Map.get(:id)
end
{next_url, prev_url} =
if param do
{
func4.(
Pleroma.Web.Endpoint,
method,
param,
Map.merge(params, %{max_id: max_id})
),
func4.(
Pleroma.Web.Endpoint,
method,
param,
Map.merge(params, %{min_id: min_id})
)
}
else
{
func3.(
Pleroma.Web.Endpoint,
method,
Map.merge(params, %{max_id: max_id})
),
func3.(
Pleroma.Web.Endpoint,
method,
Map.merge(params, %{min_id: min_id})
)
}
end
next_url = current_url(conn, Map.merge(params, %{max_id: max_id}))
prev_url = current_url(conn, Map.merge(params, %{min_id: min_id}))
conn
|> put_resp_header("link", "<#{next_url}>; rel=\"next\", <#{prev_url}>; rel=\"prev\"")
else
put_resp_header(conn, "link", "<#{next_url}>; rel=\"next\", <#{prev_url}>; rel=\"prev\"")
_ ->
conn
end
end
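# Sketch of the resulting Link header (URLs and ids are hypothetical): the
# "next" page carries the last item's id as max_id and the "prev" page the
# first item's id as min_id.
#
#   link: <https://a.test/api/v1/timelines/home?max_id=9>; rel="next",
#         <https://a.test/api/v1/timelines/home?min_id=42>; rel="prev"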

View file

@ -10,16 +10,17 @@ defmodule Pleroma.Web.Federator do
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.Federator.Publisher
alias Pleroma.Web.Federator.RetryQueue
alias Pleroma.Web.OStatus
alias Pleroma.Web.Websub
alias Pleroma.Workers.PublisherWorker
alias Pleroma.Workers.ReceiverWorker
alias Pleroma.Workers.SubscriberWorker
require Logger
def init do
# 1 minute
Process.sleep(1000 * 60)
refresh_subscriptions()
# To do: consider removing this call in favor of scheduled execution (`quantum`-based)
refresh_subscriptions(schedule_in: 60)
end
@doc "Addresses [memory leaks on recursive replies fetching](https://git.pleroma.social/pleroma/pleroma/issues/161)"
@ -37,50 +38,38 @@ def allowed_incoming_reply_depth?(depth) do
# Client API
def incoming_doc(doc) do
PleromaJobQueue.enqueue(:federator_incoming, __MODULE__, [:incoming_doc, doc])
ReceiverWorker.enqueue("incoming_doc", %{"body" => doc})
end
def incoming_ap_doc(params) do
PleromaJobQueue.enqueue(:federator_incoming, __MODULE__, [:incoming_ap_doc, params])
ReceiverWorker.enqueue("incoming_ap_doc", %{"params" => params})
end
def publish(activity, priority \\ 1) do
PleromaJobQueue.enqueue(:federator_outgoing, __MODULE__, [:publish, activity], priority)
def publish(%{id: "pleroma:fakeid"} = activity) do
perform(:publish, activity)
end
def publish(activity) do
PublisherWorker.enqueue("publish", %{"activity_id" => activity.id})
end
def verify_websub(websub) do
PleromaJobQueue.enqueue(:federator_outgoing, __MODULE__, [:verify_websub, websub])
SubscriberWorker.enqueue("verify_websub", %{"websub_id" => websub.id})
end
def request_subscription(sub) do
PleromaJobQueue.enqueue(:federator_outgoing, __MODULE__, [:request_subscription, sub])
def request_subscription(websub) do
SubscriberWorker.enqueue("request_subscription", %{"websub_id" => websub.id})
end
def refresh_subscriptions do
PleromaJobQueue.enqueue(:federator_outgoing, __MODULE__, [:refresh_subscriptions])
def refresh_subscriptions(worker_args \\ []) do
SubscriberWorker.enqueue("refresh_subscriptions", %{}, worker_args ++ [max_attempts: 1])
end
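# Usage sketch: init/0 above now just schedules the first refresh through the
# Oban-backed SubscriberWorker instead of sleeping in-process.
#
#   iex> Federator.refresh_subscriptions(schedule_in: 60)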
# Job Worker Callbacks
def perform(:refresh_subscriptions) do
Logger.debug("Federator running refresh subscriptions")
Websub.refresh_subscriptions()
spawn(fn ->
# 6 hours
Process.sleep(1000 * 60 * 60 * 6)
refresh_subscriptions()
end)
end
def perform(:request_subscription, websub) do
Logger.debug("Refreshing #{websub.topic}")
with {:ok, websub} <- Websub.request_subscription(websub) do
Logger.debug("Successfully refreshed #{websub.topic}")
else
_e -> Logger.debug("Couldn't refresh #{websub.topic}")
end
@spec perform(atom(), module(), any()) :: {:ok, any()} | {:error, any()}
def perform(:publish_one, module, params) do
apply(module, :publish_one, [params])
end
def perform(:publish, activity) do
@ -92,14 +81,6 @@ def perform(:publish, activity) do
end
end
def perform(:verify_websub, websub) do
Logger.debug(fn ->
"Running WebSub verification for #{websub.id} (#{websub.topic}, #{websub.callback})"
end)
Websub.verify(websub)
end
def perform(:incoming_doc, doc) do
Logger.info("Got document, trying to parse")
OStatus.handle_incoming(doc)
@ -130,22 +111,27 @@ def perform(:incoming_ap_doc, params) do
end
end
def perform(
:publish_single_websub,
%{xml: _xml, topic: _topic, callback: _callback, secret: _secret} = params
) do
case Websub.publish_one(params) do
{:ok, _} ->
:ok
def perform(:request_subscription, websub) do
Logger.debug("Refreshing #{websub.topic}")
{:error, _} ->
RetryQueue.enqueue(params, Websub)
with {:ok, websub} <- Websub.request_subscription(websub) do
Logger.debug("Successfully refreshed #{websub.topic}")
else
_e -> Logger.debug("Couldn't refresh #{websub.topic}")
end
end
def perform(type, _) do
Logger.debug(fn -> "Unknown task: #{type}" end)
{:error, "Don't know what to do with this"}
def perform(:verify_websub, websub) do
Logger.debug(fn ->
"Running WebSub verification for #{websub.id} (#{websub.topic}, #{websub.callback})"
end)
Websub.verify(websub)
end
def perform(:refresh_subscriptions) do
Logger.debug("Federator running refresh subscriptions")
Websub.refresh_subscriptions()
end
def ap_enabled_actor(id) do

View file

@ -6,7 +6,7 @@ defmodule Pleroma.Web.Federator.Publisher do
alias Pleroma.Activity
alias Pleroma.Config
alias Pleroma.User
alias Pleroma.Web.Federator.RetryQueue
alias Pleroma.Workers.PublisherWorker
require Logger
@ -30,23 +30,11 @@ defmodule Pleroma.Web.Federator.Publisher do
Enqueue publishing a single activity.
"""
@spec enqueue_one(module(), Map.t()) :: :ok
def enqueue_one(module, %{} = params),
do: PleromaJobQueue.enqueue(:federator_outgoing, __MODULE__, [:publish_one, module, params])
@spec perform(atom(), module(), any()) :: {:ok, any()} | {:error, any()}
def perform(:publish_one, module, params) do
case apply(module, :publish_one, [params]) do
{:ok, _} ->
:ok
{:error, _e} ->
RetryQueue.enqueue(params, module)
end
end
def perform(type, _, _) do
Logger.debug("Unknown task: #{type}")
{:error, "Don't know what to do with this"}
def enqueue_one(module, %{} = params) do
PublisherWorker.enqueue(
"publish_one",
%{"module" => to_string(module), "params" => params}
)
end
@doc """

View file

@ -1,239 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Federator.RetryQueue do
use GenServer
require Logger
def init(args) do
queue_table = :ets.new(:pleroma_retry_queue, [:bag, :protected])
{:ok, %{args | queue_table: queue_table, running_jobs: :sets.new()}}
end
def start_link(_) do
enabled =
if Pleroma.Config.get(:env) == :test,
do: true,
else: Pleroma.Config.get([__MODULE__, :enabled], false)
if enabled do
Logger.info("Starting retry queue")
linkres =
GenServer.start_link(
__MODULE__,
%{delivered: 0, dropped: 0, queue_table: nil, running_jobs: nil},
name: __MODULE__
)
maybe_kickoff_timer()
linkres
else
Logger.info("Retry queue disabled")
:ignore
end
end
def enqueue(data, transport, retries \\ 0) do
GenServer.cast(__MODULE__, {:maybe_enqueue, data, transport, retries + 1})
end
def get_stats do
GenServer.call(__MODULE__, :get_stats)
end
def reset_stats do
GenServer.call(__MODULE__, :reset_stats)
end
def get_retry_params(retries) do
if retries > Pleroma.Config.get([__MODULE__, :max_retries]) do
{:drop, "Max retries reached"}
else
{:retry, growth_function(retries)}
end
end
def get_retry_timer_interval do
Pleroma.Config.get([:retry_queue, :interval], 1000)
end
defp ets_count_expires(table, current_time) do
:ets.select_count(
table,
[
{
{:"$1", :"$2"},
[{:"=<", :"$1", {:const, current_time}}],
[true]
}
]
)
end
defp ets_pop_n_expired(table, current_time, desired) do
{popped, _continuation} =
:ets.select(
table,
[
{
{:"$1", :"$2"},
[{:"=<", :"$1", {:const, current_time}}],
[:"$_"]
}
],
desired
)
popped
|> Enum.each(fn e ->
:ets.delete_object(table, e)
end)
popped
end
def maybe_start_job(running_jobs, queue_table) do
# we don't want to hit the ets or the DateTime more times than we have to
# could optimize slightly further by not using the count, and instead grabbing
# up to N objects early...
current_time = DateTime.to_unix(DateTime.utc_now())
n_running_jobs = :sets.size(running_jobs)
if n_running_jobs < Pleroma.Config.get([__MODULE__, :max_jobs]) do
n_ready_jobs = ets_count_expires(queue_table, current_time)
if n_ready_jobs > 0 do
# figure out how many we could start
available_job_slots = Pleroma.Config.get([__MODULE__, :max_jobs]) - n_running_jobs
start_n_jobs(running_jobs, queue_table, current_time, available_job_slots)
else
running_jobs
end
else
running_jobs
end
end
defp start_n_jobs(running_jobs, _queue_table, _current_time, 0) do
running_jobs
end
defp start_n_jobs(running_jobs, queue_table, current_time, available_job_slots)
when available_job_slots > 0 do
candidates = ets_pop_n_expired(queue_table, current_time, available_job_slots)
candidates
|> List.foldl(running_jobs, fn {_, e}, rj ->
{:ok, pid} = Task.start(fn -> worker(e) end)
mref = Process.monitor(pid)
:sets.add_element(mref, rj)
end)
end
def worker({:send, data, transport, retries}) do
case transport.publish_one(data) do
{:ok, _} ->
GenServer.cast(__MODULE__, :inc_delivered)
:delivered
{:error, _reason} ->
enqueue(data, transport, retries)
:retry
end
end
def handle_call(:get_stats, _from, %{delivered: delivery_count, dropped: drop_count} = state) do
{:reply, %{delivered: delivery_count, dropped: drop_count}, state}
end
def handle_call(:reset_stats, _from, %{delivered: delivery_count, dropped: drop_count} = state) do
{:reply, %{delivered: delivery_count, dropped: drop_count},
%{state | delivered: 0, dropped: 0}}
end
def handle_cast(:reset_stats, state) do
{:noreply, %{state | delivered: 0, dropped: 0}}
end
def handle_cast(
{:maybe_enqueue, data, transport, retries},
%{dropped: drop_count, queue_table: queue_table, running_jobs: running_jobs} = state
) do
case get_retry_params(retries) do
{:retry, timeout} ->
:ets.insert(queue_table, {timeout, {:send, data, transport, retries}})
running_jobs = maybe_start_job(running_jobs, queue_table)
{:noreply, %{state | running_jobs: running_jobs}}
{:drop, message} ->
Logger.debug(message)
{:noreply, %{state | dropped: drop_count + 1}}
end
end
def handle_cast(:kickoff_timer, state) do
retry_interval = get_retry_timer_interval()
Process.send_after(__MODULE__, :retry_timer_run, retry_interval)
{:noreply, state}
end
def handle_cast(:inc_delivered, %{delivered: delivery_count} = state) do
{:noreply, %{state | delivered: delivery_count + 1}}
end
def handle_cast(:inc_dropped, %{dropped: drop_count} = state) do
{:noreply, %{state | dropped: drop_count + 1}}
end
def handle_info({:send, data, transport, retries}, %{delivered: delivery_count} = state) do
case transport.publish_one(data) do
{:ok, _} ->
{:noreply, %{state | delivered: delivery_count + 1}}
{:error, _reason} ->
enqueue(data, transport, retries)
{:noreply, state}
end
end
def handle_info(
:retry_timer_run,
%{queue_table: queue_table, running_jobs: running_jobs} = state
) do
maybe_kickoff_timer()
running_jobs = maybe_start_job(running_jobs, queue_table)
{:noreply, %{state | running_jobs: running_jobs}}
end
def handle_info({:DOWN, ref, :process, _pid, _reason}, state) do
%{running_jobs: running_jobs, queue_table: queue_table} = state
running_jobs = :sets.del_element(ref, running_jobs)
running_jobs = maybe_start_job(running_jobs, queue_table)
{:noreply, %{state | running_jobs: running_jobs}}
end
def handle_info(unknown, state) do
Logger.debug("RetryQueue: don't know what to do with #{inspect(unknown)}, ignoring")
{:noreply, state}
end
if Pleroma.Config.get(:env) == :test do
defp growth_function(_retries) do
_shutit = Pleroma.Config.get([__MODULE__, :initial_timeout])
DateTime.to_unix(DateTime.utc_now()) - 1
end
else
defp growth_function(retries) do
round(Pleroma.Config.get([__MODULE__, :initial_timeout]) * :math.pow(retries, 3)) +
DateTime.to_unix(DateTime.utc_now())
end
end
defp maybe_kickoff_timer do
GenServer.cast(__MODULE__, :kickoff_timer)
end
end

View file

@ -6,7 +6,7 @@ defmodule Pleroma.Web.MastodonAPI.MastodonAPIController do
use Pleroma.Web, :controller
import Pleroma.Web.ControllerHelper,
only: [json_response: 3, add_link_headers: 5, add_link_headers: 4, add_link_headers: 3]
only: [json_response: 3, add_link_headers: 2, add_link_headers: 3]
alias Ecto.Changeset
alias Pleroma.Activity
@ -147,6 +147,8 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
[
:no_rich_text,
:locked,
:hide_followers_count,
:hide_follows_count,
:hide_followers,
:hide_follows,
:hide_favorites,
@ -365,7 +367,7 @@ def home_timeline(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse()
conn
|> add_link_headers(:home_timeline, activities)
|> add_link_headers(activities)
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end
@ -379,12 +381,11 @@ def public_timeline(%{assigns: %{user: user}} = conn, params) do
|> Map.put("local_only", local_only)
|> Map.put("blocking_user", user)
|> Map.put("muting_user", user)
|> Map.put("user", user)
|> ActivityPub.fetch_public_activities()
|> Enum.reverse()
conn
|> add_link_headers(:public_timeline, activities, false, %{"local" => local_only})
|> add_link_headers(activities, %{"local" => local_only})
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end
@ -398,7 +399,7 @@ def user_statuses(%{assigns: %{user: reading_user}} = conn, params) do
activities = ActivityPub.fetch_user_activities(user, reading_user, params)
conn
|> add_link_headers(:user_statuses, activities, params["id"])
|> add_link_headers(activities)
|> put_view(StatusView)
|> render("index.json", %{
activities: activities,
@ -422,11 +423,25 @@ def dm_timeline(%{assigns: %{user: user}} = conn, params) do
|> Pagination.fetch_paginated(params)
conn
|> add_link_headers(:dm_timeline, activities)
|> add_link_headers(activities)
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end
def get_statuses(%{assigns: %{user: user}} = conn, %{"ids" => ids}) do
limit = 100
activities =
ids
|> Enum.take(limit)
|> Activity.all_by_ids_with_object()
|> Enum.filter(&Visibility.visible_for_user?(&1, user))
conn
|> put_view(StatusView)
|> render("index.json", activities: activities, for: user, as: :activity)
end
def get_status(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.visible_for_user?(activity, user) do
@ -471,7 +486,7 @@ def get_context(%{assigns: %{user: user}} = conn, %{"id" => id}) do
end
def get_poll(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Object{} = object <- Object.get_by_id(id),
with %Object{} = object <- Object.get_by_id_and_maybe_refetch(id, interval: 60),
%Activity{} = activity <- Activity.get_create_by_object_ap_id(object.data["id"]),
true <- Visibility.visible_for_user?(activity, user) do
conn
@ -523,7 +538,7 @@ def poll_vote(%{assigns: %{user: user}} = conn, %{"id" => id, "choices" => choic
def scheduled_statuses(%{assigns: %{user: user}} = conn, params) do
with scheduled_activities <- MastodonAPI.get_scheduled_activities(user, params) do
conn
|> add_link_headers(:scheduled_statuses, scheduled_activities)
|> add_link_headers(scheduled_activities)
|> put_view(ScheduledActivityView)
|> render("index.json", %{scheduled_activities: scheduled_activities})
end
@ -595,7 +610,12 @@ def post_status(%{assigns: %{user: user}} = conn, %{"status" => _} = params) do
{:ok, activity} ->
conn
|> put_view(StatusView)
|> try_render("status.json", %{activity: activity, for: user, as: :activity})
|> try_render("status.json", %{
activity: activity,
for: user,
as: :activity,
with_direct_conversation_id: true
})
end
end
end
@ -706,7 +726,7 @@ def notifications(%{assigns: %{user: user}} = conn, params) do
notifications = MastodonAPI.get_notifications(user, params)
conn
|> add_link_headers(:notifications, notifications)
|> add_link_headers(notifications)
|> put_view(NotificationView)
|> render("index.json", %{notifications: notifications, for: user})
end
@ -828,6 +848,7 @@ def get_mascot(%{assigns: %{user: user}} = conn, _params) do
def favourited_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
%Object{data: %{"likes" => likes}} <- Object.normalize(activity) do
q = from(u in User, where: u.ap_id in ^likes)
@ -839,12 +860,14 @@ def favourited_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
|> put_view(AccountView)
|> render("accounts.json", %{for: user, users: users, as: :user})
else
{:visible, false} -> {:error, :not_found}
_ -> json(conn, [])
end
end
def reblogged_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
%Object{data: %{"announcements" => announces}} <- Object.normalize(activity) do
q = from(u in User, where: u.ap_id in ^announces)
@ -856,6 +879,7 @@ def reblogged_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
|> put_view(AccountView)
|> render("accounts.json", %{for: user, users: users, as: :user})
else
{:visible, false} -> {:error, :not_found}
_ -> json(conn, [])
end
end
@ -894,7 +918,7 @@ def hashtag_timeline(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse()
conn
|> add_link_headers(:hashtag_timeline, activities, params["tag"], %{"local" => local_only})
|> add_link_headers(activities, %{"local" => local_only})
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end
@ -910,7 +934,7 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params) do
end
conn
|> add_link_headers(:followers, followers, user)
|> add_link_headers(followers)
|> put_view(AccountView)
|> render("accounts.json", %{for: for_user, users: followers, as: :user})
end
@ -927,7 +951,7 @@ def following(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params) do
end
conn
|> add_link_headers(:following, followers, user)
|> add_link_headers(followers)
|> put_view(AccountView)
|> render("accounts.json", %{for: for_user, users: followers, as: :user})
end
@ -1152,7 +1176,7 @@ def favourites(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse()
conn
|> add_link_headers(:favourites, activities)
|> add_link_headers(activities)
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end
@ -1179,7 +1203,7 @@ def user_favourites(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params
|> Enum.reverse()
conn
|> add_link_headers(:favourites, activities)
|> add_link_headers(activities)
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: for_user, as: :activity})
else
@ -1200,7 +1224,7 @@ def bookmarks(%{assigns: %{user: user}} = conn, params) do
|> Enum.map(fn b -> Map.put(b.activity, :bookmark, Map.delete(b, :activity)) end)
conn
|> add_link_headers(:bookmarks, bookmarks)
|> add_link_headers(bookmarks)
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end
@ -1640,7 +1664,7 @@ def conversations(%{assigns: %{user: user}} = conn, params) do
end)
conn
|> add_link_headers(:conversations, participations)
|> add_link_headers(participations)
|> json(conversations)
end

View file

@ -19,9 +19,10 @@ defmodule Pleroma.Web.MastodonAPI.SearchController do
def account_search(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
accounts = User.search(query, search_options(params, user))
res = AccountView.render("accounts.json", users: accounts, for: user, as: :user)
json(conn, res)
conn
|> put_view(AccountView)
|> render("accounts.json", users: accounts, for: user, as: :user)
end
def search2(conn, params), do: do_search(:v2, conn, params)

View file

@ -74,10 +74,18 @@ defp do_render("account.json", %{user: user} = opts) do
user_info = User.get_cached_user_info(user)
following_count =
((!user.info.hide_follows or opts[:for] == user) && user_info.following_count) || 0
if !user.info.hide_follows_count or !user.info.hide_follows or opts[:for] == user do
user_info.following_count
else
0
end
followers_count =
((!user.info.hide_followers or opts[:for] == user) && user_info.follower_count) || 0
if !user.info.hide_followers_count or !user.info.hide_followers or opts[:for] == user do
user_info.follower_count
else
0
end
bot = (user.info.source_data["type"] || "Person") in ["Application", "Service"]
@ -138,6 +146,8 @@ defp do_render("account.json", %{user: user} = opts) do
pleroma: %{
confirmation_pending: user_info.confirmation_pending,
tags: user.tags,
hide_followers_count: user.info.hide_followers_count,
hide_follows_count: user.info.hide_follows_count,
hide_followers: user.info.hide_followers,
hide_follows: user.info.hide_follows,
hide_favorites: user.info.hide_favorites,

View file

@ -73,14 +73,12 @@ defp reblogged?(activity, user) do
def render("index.json", opts) do
replied_to_activities = get_replied_to_activities(opts.activities)
parallel = unless is_nil(opts[:parallel]), do: opts[:parallel], else: true
opts.activities
|> safe_render_many(
StatusView,
"status.json",
Map.put(opts, :replied_to_activities, replied_to_activities),
parallel
Map.put(opts, :replied_to_activities, replied_to_activities)
)
end
@ -499,7 +497,7 @@ def build_tags(object_tags) when is_list(object_tags) do
object_tags = for tag when is_binary(tag) <- object_tags, do: tag
Enum.reduce(object_tags, [], fn tag, tags ->
tags ++ [%{name: tag, url: "/tag/#{tag}"}]
tags ++ [%{name: tag, url: "/tag/#{URI.encode(tag)}"}]
end)
end

View file

@ -8,6 +8,7 @@ defmodule Pleroma.Web.MastodonAPI.WebsocketHandler do
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.OAuth.Token
alias Pleroma.Web.Streamer
@behaviour :cowboy_websocket
@ -24,7 +25,7 @@ defmodule Pleroma.Web.MastodonAPI.WebsocketHandler do
]
@anonymous_streams ["public", "public:local", "hashtag"]
# Handled by periodic keepalive in Pleroma.Web.Streamer.
# Handled by periodic keepalive in Pleroma.Web.Streamer.Ping.
@timeout :infinity
def init(%{qs: qs} = req, state) do
@ -65,7 +66,7 @@ def websocket_info(:subscribe, state) do
}, topic #{state.topic}"
)
Pleroma.Web.Streamer.add_socket(state.topic, streamer_socket(state))
Streamer.add_socket(state.topic, streamer_socket(state))
{:ok, state}
end
@ -80,7 +81,7 @@ def terminate(reason, _req, state) do
}, topic #{state.topic || "?"}: #{inspect(reason)}"
)
Pleroma.Web.Streamer.remove_socket(state.topic, streamer_socket(state))
Streamer.remove_socket(state.topic, streamer_socket(state))
:ok
end

View file

@ -57,6 +57,7 @@ def raw_nodeinfo do
"mastodon_api_streaming",
"polls",
"pleroma_explicit_addressing",
"shareable_emoji_packs",
if Config.get([:media_proxy, :enabled]) do
"media_proxy"
end,

View file

@ -202,6 +202,8 @@ def token_exchange(
{:ok, app} <- Token.Utils.fetch_app(conn),
{:auth_active, true} <- {:auth_active, User.auth_active?(user)},
{:user_active, true} <- {:user_active, !user.info.deactivated},
{:password_reset_pending, false} <-
{:password_reset_pending, user.info.password_reset_pending},
{:ok, scopes} <- validate_scopes(app, params),
{:ok, auth} <- Authorization.create_authorization(app, user, scopes),
{:ok, token} <- Token.exchange_token(app, auth) do
@ -215,6 +217,9 @@ def token_exchange(
{:user_active, false} ->
render_error(conn, :forbidden, "Your account is currently disabled")
{:password_reset_pending, true} ->
render_error(conn, :forbidden, "Password reset is required")
_error ->
render_invalid_credentials_error(conn)
end

View file

@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OAuth.Token.CleanWorker do
@ -17,6 +17,7 @@ defmodule Pleroma.Web.OAuth.Token.CleanWorker do
)
alias Pleroma.Web.OAuth.Token
alias Pleroma.Workers.BackgroundWorker
def start_link(_), do: GenServer.start_link(__MODULE__, %{})
@ -27,9 +28,11 @@ def init(_) do
@doc false
def handle_info(:perform, state) do
Token.delete_expired_tokens()
BackgroundWorker.enqueue("clean_expired_tokens", %{})
Process.send_after(self(), :perform, @interval)
{:noreply, state}
end
def perform(:clean), do: Token.delete_expired_tokens()
end

View file

@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OAuth.Token.Query do

View file

@ -3,14 +3,12 @@
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OStatus do
import Ecto.Query
import Pleroma.Web.XML
require Logger
alias Pleroma.Activity
alias Pleroma.HTTP
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web
alias Pleroma.Web.ActivityPub.ActivityPub
@ -38,21 +36,13 @@ def is_representable?(%Activity{} = activity) do
end
end
def feed_path(user) do
"#{user.ap_id}/feed.atom"
end
def feed_path(user), do: "#{user.ap_id}/feed.atom"
def pubsub_path(user) do
"#{Web.base_url()}/push/hub/#{user.nickname}"
end
def pubsub_path(user), do: "#{Web.base_url()}/push/hub/#{user.nickname}"
def salmon_path(user) do
"#{user.ap_id}/salmon"
end
def salmon_path(user), do: "#{user.ap_id}/salmon"
def remote_follow_path do
"#{Web.base_url()}/ostatus_subscribe?acct={uri}"
end
def remote_follow_path, do: "#{Web.base_url()}/ostatus_subscribe?acct={uri}"
def handle_incoming(xml_string, options \\ []) do
with doc when doc != :error <- parse_document(xml_string) do
@ -217,10 +207,9 @@ def get_content(entry) do
Get the cw that mastodon uses.
"""
def get_cw(entry) do
with cw when not is_nil(cw) <- string_from_xpath("/*/summary", entry) do
cw
else
_e -> nil
case string_from_xpath("/*/summary", entry) do
cw when not is_nil(cw) -> cw
_ -> nil
end
end
@ -232,19 +221,17 @@ def get_tags(entry) do
end
def maybe_update(doc, user) do
if "true" == string_from_xpath("//author[1]/ap_enabled", doc) do
case string_from_xpath("//author[1]/ap_enabled", doc) do
"true" ->
Transmogrifier.upgrade_user_from_ap_id(user.ap_id)
else
_ ->
maybe_update_ostatus(doc, user)
end
end
def maybe_update_ostatus(doc, user) do
old_data = %{
avatar: user.avatar,
bio: user.bio,
name: user.name
}
old_data = Map.take(user, [:bio, :avatar, :name])
with false <- user.local,
avatar <- make_avatar_object(doc),
@ -279,21 +266,28 @@ def find_make_or_update_actor(doc) do
end
end
@spec find_or_make_user(String.t()) :: {:ok, User.t()}
def find_or_make_user(uri) do
query = from(user in User, where: user.ap_id == ^uri)
user = Repo.one(query)
if is_nil(user) do
make_user(uri)
else
{:ok, user}
case User.get_by_ap_id(uri) do
%User{} = user -> {:ok, user}
_ -> make_user(uri)
end
end
@spec make_user(String.t(), boolean()) :: {:ok, User.t()} | {:error, any()}
def make_user(uri, update \\ false) do
with {:ok, info} <- gather_user_info(uri) do
data = %{
with false <- update,
%User{} = user <- User.get_cached_by_ap_id(info["uri"]) do
{:ok, user}
else
_e -> User.insert_or_update_user(build_user_data(info))
end
end
end
defp build_user_data(info) do
%{
name: info["name"],
nickname: info["nickname"] <> "@" <> info["host"],
ap_id: info["uri"],
@ -301,14 +295,6 @@ def make_user(uri, update \\ false) do
avatar: info["avatar"],
bio: info["bio"]
}
with false <- update,
%User{} = user <- User.get_cached_by_ap_id(data.ap_id) do
{:ok, user}
else
_e -> User.insert_or_update_user(data)
end
end
end
# TODO: Just takes the first one for now.
@ -319,23 +305,23 @@ def make_avatar_object(author_doc, rel \\ "avatar") do
if href do
%{
"type" => "Image",
"url" => [
%{
"type" => "Link",
"mediaType" => type,
"href" => href
}
]
"url" => [%{"type" => "Link", "mediaType" => type, "href" => href}]
}
else
nil
end
end
@spec gather_user_info(String.t()) :: {:ok, map()} | {:error, any()}
def gather_user_info(username) do
with {:ok, webfinger_data} <- WebFinger.finger(username),
{:ok, feed_data} <- Websub.gather_feed_data(webfinger_data["topic"]) do
{:ok, Map.merge(webfinger_data, feed_data) |> Map.put("fqn", username)}
data =
webfinger_data
|> Map.merge(feed_data)
|> Map.put("fqn", username)
{:ok, data}
else
e ->
Logger.debug(fn -> "Couldn't gather info for #{username}" end)
@ -371,10 +357,7 @@ def get_atom_url(body) do
def fetch_activity_from_atom_url(url, options \\ []) do
with true <- String.starts_with?(url, "http"),
{:ok, %{body: body, status: code}} when code in 200..299 <-
HTTP.get(
url,
[{:Accept, "application/atom+xml"}]
) do
HTTP.get(url, [{:Accept, "application/atom+xml"}]) do
Logger.debug("Got document from #{url}, handling...")
handle_incoming(body, options)
else

View file

@ -55,12 +55,11 @@ def feed_redirect(conn, %{"nickname" => nickname}) do
def feed(conn, %{"nickname" => nickname} = params) do
with {_, %User{} = user} <- {:fetch_user, User.get_cached_by_nickname(nickname)} do
query_params =
Map.take(params, ["max_id"])
|> Map.merge(%{"whole_db" => true, "actor_id" => user.ap_id})
activities =
ActivityPub.fetch_public_activities(query_params)
params
|> Map.take(["max_id"])
|> Map.merge(%{"whole_db" => true, "actor_id" => user.ap_id})
|> ActivityPub.fetch_public_activities()
|> Enum.reverse()
response =
@ -98,8 +97,7 @@ def salmon_incoming(conn, _) do
Federator.incoming_doc(doc)
conn
|> send_resp(200, "")
send_resp(conn, 200, "")
end
def object(%{assigns: %{format: format}} = conn, %{"uuid" => _uuid})
@@ -218,7 +216,8 @@ defp represent_activity(
conn
|> put_resp_header("content-type", "application/activity+json")
|> json(ObjectView.render("object.json", %{object: object}))
|> put_view(ObjectView)
|> render("object.json", %{object: object})
end
defp represent_activity(_conn, "activity+json", _, _) do

View file

@@ -0,0 +1,575 @@
defmodule Pleroma.Web.PleromaAPI.EmojiAPIController do
use Pleroma.Web, :controller
require Logger
@emoji_dir_path Path.join(
Pleroma.Config.get!([:instance, :static_dir]),
"emoji"
)
@cache_seconds_per_file Pleroma.Config.get!([:emoji, :shared_pack_cache_seconds_per_file])
@doc """
Lists the packs available on the instance as JSON.
The information is public and does not require authentication. The format is
a map of "pack directory name" to pack.json contents.
"""
def list_packs(conn, _params) do
with {:ok, results} <- File.ls(@emoji_dir_path) do
pack_infos =
results
|> Enum.filter(&has_pack_json?/1)
|> Enum.map(&load_pack/1)
# Check if all the files are in place and can be sent
|> Enum.map(&validate_pack/1)
# Transform into a map of pack-name => pack-data
|> Enum.into(%{})
json(conn, pack_infos)
end
end
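As a rough, hypothetical sketch of what a client sees from this endpoint (the instance URL and the "blobs" pack are made up; the route is the GET /api/pleroma/emoji/packs one registered in the router later in this diff):

    {:ok, %Tesla.Env{body: body}} = Tesla.get("https://example.instance/api/pleroma/emoji/packs")

    Jason.decode!(body)
    # => %{
    #      "blobs" => %{
    #        "pack" => %{"share-files" => true, "can-download" => true, "download-sha256" => "..."},
    #        "files" => %{"blobcat" => "blobcat.png"}
    #      }
    #    }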
defp has_pack_json?(file) do
dir_path = Path.join(@emoji_dir_path, file)
# Filter to only use the pack.json packs
File.dir?(dir_path) and File.exists?(Path.join(dir_path, "pack.json"))
end
defp load_pack(pack_name) do
pack_path = Path.join(@emoji_dir_path, pack_name)
pack_file = Path.join(pack_path, "pack.json")
{pack_name, Jason.decode!(File.read!(pack_file))}
end
defp validate_pack({name, pack}) do
pack_path = Path.join(@emoji_dir_path, name)
if can_download?(pack, pack_path) do
archive_for_sha = make_archive(name, pack, pack_path)
archive_sha = :crypto.hash(:sha256, archive_for_sha) |> Base.encode16()
pack =
pack
|> put_in(["pack", "can-download"], true)
|> put_in(["pack", "download-sha256"], archive_sha)
{name, pack}
else
{name, put_in(pack, ["pack", "can-download"], false)}
end
end
defp can_download?(pack, pack_path) do
# If the pack is set as shared, check if it can be downloaded
# That means that, when asked, the pack can be archived and sent to the remote instance
# Otherwise, the remote would have to download it from fallback-src
pack["pack"]["share-files"] &&
Enum.all?(pack["files"], fn {_, path} ->
File.exists?(Path.join(pack_path, path))
end)
end
defp create_archive_and_cache(name, pack, pack_dir, md5) do
files =
['pack.json'] ++
(pack["files"] |> Enum.map(fn {_, path} -> to_charlist(path) end))
{:ok, {_, zip_result}} = :zip.zip('#{name}.zip', files, [:memory, cwd: to_charlist(pack_dir)])
cache_ms = :timer.seconds(@cache_seconds_per_file * Enum.count(files))
Cachex.put!(
:emoji_packs_cache,
name,
# if pack.json MD5 changes, the cache is not valid anymore
%{pack_file_md5: md5, pack_data: zip_result},
# Cache time grows by the configured seconds-per-file for every file in the pack
ttl: cache_ms
)
Logger.debug("Created an archive for the '#{name}' emoji pack, \
keeping it in cache for #{div(cache_ms, 1000)}s")
zip_result
end
defp make_archive(name, pack, pack_dir) do
# Having a different pack.json md5 invalidates cache
pack_file_md5 = :crypto.hash(:md5, File.read!(Path.join(pack_dir, "pack.json")))
case Cachex.get!(:emoji_packs_cache, name) do
%{pack_file_md5: ^pack_file_md5, pack_data: zip_result} ->
Logger.debug("Using cache for the '#{name}' shared emoji pack")
zip_result
_ ->
create_archive_and_cache(name, pack, pack_dir, pack_file_md5)
end
end
@doc """
An endpoint for other instances (via admin UI) or users (via browser)
to download packs that the instance shares.
"""
def download_shared(conn, %{"name" => name}) do
pack_dir = Path.join(@emoji_dir_path, name)
pack_file = Path.join(pack_dir, "pack.json")
with {_, true} <- {:exists?, File.exists?(pack_file)},
pack = Jason.decode!(File.read!(pack_file)),
{_, true} <- {:can_download?, can_download?(pack, pack_dir)} do
zip_result = make_archive(name, pack, pack_dir)
send_download(conn, {:binary, zip_result}, filename: "#{name}.zip")
else
{:can_download?, _} ->
conn
|> put_status(:forbidden)
|> json(%{
error: "Pack #{name} cannot be downloaded from this instance, either pack sharing\
was disabled for this pack or some files are missing"
})
{:exists?, _} ->
conn
|> put_status(:not_found)
|> json(%{error: "Pack #{name} does not exist"})
end
end
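A hedged example of how another instance (or a curious admin) might consume this endpoint and verify the archive against the `download-sha256` advertised by `list_packs`, mirroring the checksum check `download_from` performs below; the instance URL and pack name are hypothetical:

    packs =
      Tesla.get!("https://example.instance/api/pleroma/emoji/packs").body
      |> Jason.decode!()

    %{"pack" => %{"can-download" => true, "download-sha256" => sha}} = packs["blobs"]

    archive =
      Tesla.get!("https://example.instance/api/pleroma/emoji/packs/blobs/download_shared/").body

    ^sha = :crypto.hash(:sha256, archive) |> Base.encode16()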
@doc """
An admin endpoint to request downloading a pack named `pack_name` from the instance
`instance_address`.
If the requested instance's admin chose to share the pack, it will be downloaded
from that instance; otherwise it will be downloaded from the fallback source, if there is one.
"""
def download_from(conn, %{"instance_address" => address, "pack_name" => name} = data) do
shareable_packs_available =
"#{address}/.well-known/nodeinfo"
|> Tesla.get!()
|> Map.get(:body)
|> Jason.decode!()
|> List.last()
|> Map.get("href")
# Get the actual nodeinfo address and fetch it
|> Tesla.get!()
|> Map.get(:body)
|> Jason.decode!()
|> get_in(["metadata", "features"])
|> Enum.member?("shareable_emoji_packs")
if shareable_packs_available do
full_pack =
"#{address}/api/pleroma/emoji/packs/list"
|> Tesla.get!()
|> Map.get(:body)
|> Jason.decode!()
|> Map.get(name)
pack_info_res =
case full_pack["pack"] do
%{"share-files" => true, "can-download" => true, "download-sha256" => sha} ->
{:ok,
%{
sha: sha,
uri: "#{address}/api/pleroma/emoji/packs/download_shared/#{name}"
}}
%{"fallback-src" => src, "fallback-src-sha256" => sha} when is_binary(src) ->
{:ok,
%{
sha: sha,
uri: src,
fallback: true
}}
_ ->
{:error,
"The pack was not set as shared and there is no fallback src to download from"}
end
with {:ok, %{sha: sha, uri: uri} = pinfo} <- pack_info_res,
%{body: emoji_archive} <- Tesla.get!(uri),
{_, true} <- {:checksum, Base.decode16!(sha) == :crypto.hash(:sha256, emoji_archive)} do
local_name = data["as"] || name
pack_dir = Path.join(@emoji_dir_path, local_name)
File.mkdir_p!(pack_dir)
files = Enum.map(full_pack["files"], fn {_, path} -> to_charlist(path) end)
# Fallback cannot contain a pack.json file
files = if pinfo[:fallback], do: files, else: ['pack.json'] ++ files
{:ok, _} = :zip.unzip(emoji_archive, cwd: to_charlist(pack_dir), file_list: files)
# Fallback can't contain a pack.json file, since that would cause the fallback-src-sha256
# in it to depend on itself
if pinfo[:fallback] do
pack_file_path = Path.join(pack_dir, "pack.json")
File.write!(pack_file_path, Jason.encode!(full_pack, pretty: true))
end
json(conn, "ok")
else
{:error, e} ->
conn |> put_status(:internal_server_error) |> json(%{error: e})
{:checksum, _} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "SHA256 for the pack doesn't match the one sent by the server"})
end
else
conn
|> put_status(:internal_server_error)
|> json(%{error: "The requested instance does not support sharing emoji packs"})
end
end
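A minimal sketch of calling this admin endpoint; the instance addresses, pack names, and bearer token are placeholders, and the parameter names match what `download_from/2` reads above:

    Tesla.post!(
      "https://my.instance/api/pleroma/emoji/packs/download_from",
      URI.encode_query(%{
        "instance_address" => "https://other.instance",
        "pack_name" => "blobs",
        "as" => "blobs_local"
      }),
      headers: [
        {"content-type", "application/x-www-form-urlencoded"},
        {"authorization", "Bearer <admin token>"}
      ]
    )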
@doc """
Creates an empty pack named `name` which then can be updated via the admin UI.
"""
def create(conn, %{"name" => name}) do
pack_dir = Path.join(@emoji_dir_path, name)
if not File.exists?(pack_dir) do
File.mkdir_p!(pack_dir)
pack_file_p = Path.join(pack_dir, "pack.json")
File.write!(
pack_file_p,
Jason.encode!(%{pack: %{}, files: %{}}, pretty: true)
)
conn |> json("ok")
else
conn
|> put_status(:conflict)
|> json(%{error: "A pack named \"#{name}\" already exists"})
end
end
@doc """
Deletes the pack `name` and all its files.
"""
def delete(conn, %{"name" => name}) do
pack_dir = Path.join(@emoji_dir_path, name)
case File.rm_rf(pack_dir) do
{:ok, _} ->
conn |> json("ok")
{:error, _} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "Couldn't delete the pack #{name}"})
end
end
@doc """
An endpoint to update `pack_name`'s metadata.
`new_data` is the new metadata for the pack, that will replace the old metadata.
"""
def update_metadata(conn, %{"pack_name" => name, "new_data" => new_data}) do
pack_file_p = Path.join([@emoji_dir_path, name, "pack.json"])
full_pack = Jason.decode!(File.read!(pack_file_p))
# The new fallback-src is in the new data and it's not the same as it was in the old data
should_update_fb_sha =
not is_nil(new_data["fallback-src"]) and
new_data["fallback-src"] != full_pack["pack"]["fallback-src"]
with {_, true} <- {:should_update?, should_update_fb_sha},
%{body: pack_arch} <- Tesla.get!(new_data["fallback-src"]),
{:ok, flist} <- :zip.unzip(pack_arch, [:memory]),
{_, true} <- {:has_all_files?, has_all_files?(full_pack, flist)} do
fallback_sha = :crypto.hash(:sha256, pack_arch) |> Base.encode16()
new_data = Map.put(new_data, "fallback-src-sha256", fallback_sha)
update_metadata_and_send(conn, full_pack, new_data, pack_file_p)
else
{:should_update?, _} ->
update_metadata_and_send(conn, full_pack, new_data, pack_file_p)
{:has_all_files?, _} ->
conn
|> put_status(:bad_request)
|> json(%{error: "The fallback archive does not have all files specified in pack.json"})
end
end
# Check if all files from the pack.json are in the archive
defp has_all_files?(%{"files" => files}, flist) do
Enum.all?(files, fn {_, from_manifest} ->
Enum.find(flist, fn {from_archive, _} ->
to_string(from_archive) == from_manifest
end)
end)
end
defp update_metadata_and_send(conn, full_pack, new_data, pack_file_p) do
full_pack = Map.put(full_pack, "pack", new_data)
File.write!(pack_file_p, Jason.encode!(full_pack, pretty: true))
# Send new data back with fallback sha filled
json(conn, new_data)
end
defp get_filename(%{"filename" => filename}), do: filename
defp get_filename(%{"file" => file}) do
case file do
%Plug.Upload{filename: filename} -> filename
url when is_binary(url) -> Path.basename(url)
end
end
defp empty?(str), do: String.trim(str) == ""
defp update_file_and_send(conn, updated_full_pack, pack_file_p) do
# Write the emoji pack file
File.write!(pack_file_p, Jason.encode!(updated_full_pack, pretty: true))
# Return the modified file list
json(conn, updated_full_pack["files"])
end
@doc """
Updates a file in a pack.
Updating can mean three things:
- `add` adds an emoji named `shortcode` to the pack `pack_name`,
that means that the emoji file needs to be uploaded with the request
(thus requiring it to be a multipart request) and be named `file`.
There can also be an optional `filename` that will be the new emoji file name
(if it's not there, the name will be taken from the uploaded file).
- `update` changes the emoji shortcode (from `shortcode` to `new_shortcode`) or moves the file
(from the current filename to `new_filename`)
- `remove` removes the emoji named `shortcode` and its associated file
"""
# Add
def update_file(
conn,
%{"pack_name" => pack_name, "action" => "add", "shortcode" => shortcode} = params
) do
pack_dir = Path.join(@emoji_dir_path, pack_name)
pack_file_p = Path.join(pack_dir, "pack.json")
full_pack = Jason.decode!(File.read!(pack_file_p))
with {_, false} <- {:has_shortcode, Map.has_key?(full_pack["files"], shortcode)},
filename <- get_filename(params),
false <- empty?(shortcode),
false <- empty?(filename) do
file_path = Path.join(pack_dir, filename)
# If the name contains directories, create them
if String.contains?(file_path, "/") do
File.mkdir_p!(Path.dirname(file_path))
end
case params["file"] do
%Plug.Upload{path: upload_path} ->
# Copy the uploaded file from the temporary directory
File.copy!(upload_path, file_path)
url when is_binary(url) ->
# Download and write the file
file_contents = Tesla.get!(url).body
File.write!(file_path, file_contents)
end
updated_full_pack = put_in(full_pack, ["files", shortcode], filename)
update_file_and_send(conn, updated_full_pack, pack_file_p)
else
{:has_shortcode, _} ->
conn
|> put_status(:conflict)
|> json(%{error: "An emoji with the \"#{shortcode}\" shortcode already exists"})
true ->
conn
|> put_status(:bad_request)
|> json(%{error: "shortcode or filename cannot be empty"})
end
end
# Remove
def update_file(conn, %{
"pack_name" => pack_name,
"action" => "remove",
"shortcode" => shortcode
}) do
pack_dir = Path.join(@emoji_dir_path, pack_name)
pack_file_p = Path.join(pack_dir, "pack.json")
full_pack = Jason.decode!(File.read!(pack_file_p))
if Map.has_key?(full_pack["files"], shortcode) do
{emoji_file_path, updated_full_pack} = pop_in(full_pack, ["files", shortcode])
emoji_file_path = Path.join(pack_dir, emoji_file_path)
# Delete the emoji file
File.rm!(emoji_file_path)
# If the old directory has no more files, remove it
if String.contains?(emoji_file_path, "/") do
dir = Path.dirname(emoji_file_path)
if Enum.empty?(File.ls!(dir)) do
File.rmdir!(dir)
end
end
update_file_and_send(conn, updated_full_pack, pack_file_p)
else
conn
|> put_status(:bad_request)
|> json(%{error: "Emoji \"#{shortcode}\" does not exist"})
end
end
# Update
def update_file(
conn,
%{"pack_name" => pack_name, "action" => "update", "shortcode" => shortcode} = params
) do
pack_dir = Path.join(@emoji_dir_path, pack_name)
pack_file_p = Path.join(pack_dir, "pack.json")
full_pack = Jason.decode!(File.read!(pack_file_p))
with {_, true} <- {:has_shortcode, Map.has_key?(full_pack["files"], shortcode)},
%{"new_shortcode" => new_shortcode, "new_filename" => new_filename} <- params,
false <- empty?(new_shortcode),
false <- empty?(new_filename) do
# First, remove the old shortcode, saving the old path
{old_emoji_file_path, updated_full_pack} = pop_in(full_pack, ["files", shortcode])
old_emoji_file_path = Path.join(pack_dir, old_emoji_file_path)
new_emoji_file_path = Path.join(pack_dir, new_filename)
# If the name contains directories, create them
if String.contains?(new_emoji_file_path, "/") do
File.mkdir_p!(Path.dirname(new_emoji_file_path))
end
# Move/Rename the old filename to a new filename
# These are probably on the same filesystem, so just rename should work
:ok = File.rename(old_emoji_file_path, new_emoji_file_path)
# If the old directory has no more files, remove it
if String.contains?(old_emoji_file_path, "/") do
dir = Path.dirname(old_emoji_file_path)
if Enum.empty?(File.ls!(dir)) do
File.rmdir!(dir)
end
end
# Then, put in the new shortcode with the new path
updated_full_pack = put_in(updated_full_pack, ["files", new_shortcode], new_filename)
update_file_and_send(conn, updated_full_pack, pack_file_p)
else
{:has_shortcode, _} ->
conn
|> put_status(:bad_request)
|> json(%{error: "Emoji \"#{shortcode}\" does not exist"})
true ->
conn
|> put_status(:bad_request)
|> json(%{error: "new_shortcode or new_filename cannot be empty"})
_ ->
conn
|> put_status(:bad_request)
|> json(%{error: "new_shortcode or new_file were not specified"})
end
end
def update_file(conn, %{"action" => action}) do
conn
|> put_status(:bad_request)
|> json(%{error: "Unknown action: #{action}"})
end
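A hedged example of the `add` action using the URL variant of `file` (so no multipart upload is needed); the instance, pack, and emoji names are made up, and the form-encoded body is just one convenient way to send the params the clauses above pattern-match on:

    Tesla.post!(
      "https://my.instance/api/pleroma/emoji/packs/blobs/update_file",
      URI.encode_query(%{
        "action" => "add",
        "shortcode" => "blobcat",
        "filename" => "blobcat.png",
        "file" => "https://other.instance/emoji/blobs/blobcat.png"
      }),
      headers: [
        {"content-type", "application/x-www-form-urlencoded"},
        {"authorization", "Bearer <admin token>"}
      ]
    )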
@doc """
Imports emoji from the filesystem.
Importing means checking all the directories in
`$instance_static/emoji/` for ones that do not have a
`pack.json`. If a directory has an `emoji.txt` file, that file will be used
to create a `pack.json` file from its contents. If the directory has
neither, all the files with the configured extensions will be
assumed to be emojis and stored in the new `pack.json` file.
"""
def import_from_fs(conn, _params) do
with {:ok, results} <- File.ls(@emoji_dir_path) do
imported_pack_names =
results
|> Enum.filter(fn file ->
dir_path = Path.join(@emoji_dir_path, file)
# Find the directories that do NOT have pack.json
File.dir?(dir_path) and not File.exists?(Path.join(dir_path, "pack.json"))
end)
|> Enum.map(&write_pack_json_contents/1)
json(conn, imported_pack_names)
else
{:error, _} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "Error accessing emoji pack directory"})
end
end
defp write_pack_json_contents(dir) do
dir_path = Path.join(@emoji_dir_path, dir)
emoji_txt_path = Path.join(dir_path, "emoji.txt")
files_for_pack = files_for_pack(emoji_txt_path, dir_path)
pack_json_contents = Jason.encode!(%{pack: %{}, files: files_for_pack})
File.write!(Path.join(dir_path, "pack.json"), pack_json_contents)
dir
end
defp files_for_pack(emoji_txt_path, dir_path) do
if File.exists?(emoji_txt_path) do
# There's an emoji.txt file; it's likely from a pack installed by the pack manager.
# Make a pack.json file from the contents of that emoji.txt file
# FIXME: Copy-pasted from Pleroma.Emoji/load_from_file_stream/2
# Create a map of shortcodes to filenames from emoji.txt
File.read!(emoji_txt_path)
|> String.split("\n")
|> Enum.map(&String.trim/1)
|> Enum.map(fn line ->
case String.split(line, ~r/,\s*/) do
# This matches both strings with and without tags
# and we don't care about tags here
[name, file | _] -> {name, file}
_ -> nil
end
end)
|> Enum.filter(fn x -> not is_nil(x) end)
|> Enum.into(%{})
else
# If there's no emoji.txt, assume all files
# that are of certain extensions from the config are emojis and import them all
pack_extensions = Pleroma.Config.get!([:emoji, :pack_extensions])
Pleroma.Emoji.Loader.make_shortcode_to_file_map(dir_path, pack_extensions)
end
end
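For reference, a small runnable sketch of the `emoji.txt` shape this branch accepts (shortcode, filename, optional tags, comma-separated); the entries are invented:

    emoji_txt = """
    blobcat, blobcat.png, blob
    blobaww, blobaww.png
    """

    emoji_txt
    |> String.split("\n")
    |> Enum.map(&String.trim/1)
    |> Enum.map(fn line ->
      case String.split(line, ~r/,\s*/) do
        [name, file | _] -> {name, file}
        _ -> nil
      end
    end)
    |> Enum.reject(&is_nil/1)
    |> Enum.into(%{})
    # => %{"blobaww" => "blobaww.png", "blobcat" => "blobcat.png"}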
end

View file

@@ -5,7 +5,7 @@
defmodule Pleroma.Web.PleromaAPI.PleromaAPIController do
use Pleroma.Web, :controller
import Pleroma.Web.ControllerHelper, only: [add_link_headers: 7]
import Pleroma.Web.ControllerHelper, only: [add_link_headers: 2]
alias Pleroma.Conversation.Participation
alias Pleroma.Notification
@@ -27,31 +27,22 @@ def conversation_statuses(
%{assigns: %{user: user}} = conn,
%{"id" => participation_id} = params
) do
participation = Participation.get(participation_id, preload: [:conversation])
if user.id == participation.user_id do
params =
params
|> Map.put("blocking_user", user)
|> Map.put("muting_user", user)
|> Map.put("user", user)
participation =
participation_id
|> Participation.get(preload: [:conversation])
if user.id == participation.user_id do
activities =
participation.conversation.ap_id
|> ActivityPub.fetch_activities_for_context(params)
|> Enum.reverse()
conn
|> add_link_headers(
:conversation_statuses,
activities,
participation_id,
params,
nil,
&pleroma_api_url/4
)
|> add_link_headers(activities)
|> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity})
end

View file

@@ -3,7 +3,7 @@
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Push do
alias Pleroma.Web.Push.Impl
alias Pleroma.Workers.WebPusherWorker
require Logger
@@ -31,6 +31,7 @@ def enabled do
end
end
def send(notification),
do: PleromaJobQueue.enqueue(:web_push, Impl, [notification])
def send(notification) do
WebPusherWorker.enqueue("web_push", %{"notification_id" => notification.id})
end
end

View file

@@ -81,6 +81,7 @@ defp parse_url(url) do
{:ok, %Tesla.Env{body: html}} = Pleroma.HTTP.get(url, [], adapter: @hackney_options)
html
|> parse_html
|> maybe_parse()
|> Map.put(:url, url)
|> clean_parsed_data()
@@ -91,6 +92,8 @@ defp parse_url(url) do
end
end
defp parse_html(html), do: Floki.parse(html)
defp maybe_parse(html) do
Enum.reduce_while(parsers(), %{}, fn parser, acc ->
case parser.parse(html, acc) do
@@ -100,7 +103,8 @@ defp maybe_parse(html) do
end)
end
defp check_parsed_data(%{title: title} = data) when is_binary(title) and byte_size(title) > 0 do
defp check_parsed_data(%{title: title} = data)
when is_binary(title) and byte_size(title) > 0 do
{:ok, data}
end

View file

@@ -135,6 +135,7 @@ defmodule Pleroma.Web.Router do
pipeline :http_signature do
plug(Pleroma.Web.Plugs.HTTPSignaturePlug)
plug(Pleroma.Web.Plugs.MappedSignatureToIdentityPlug)
end
scope "/api/pleroma", Pleroma.Web.TwitterAPI do
@@ -179,12 +180,13 @@ defmodule Pleroma.Web.Router do
post("/relay", AdminAPIController, :relay_follow)
delete("/relay", AdminAPIController, :relay_unfollow)
get("/users/invite_token", AdminAPIController, :get_invite_token)
post("/users/invite_token", AdminAPIController, :create_invite_token)
get("/users/invites", AdminAPIController, :invites)
post("/users/revoke_invite", AdminAPIController, :revoke_invite)
post("/users/email_invite", AdminAPIController, :email_invite)
get("/users/:nickname/password_reset", AdminAPIController, :get_password_reset)
patch("/users/:nickname/force_password_reset", AdminAPIController, :force_password_reset)
get("/users", AdminAPIController, :list_users)
get("/users/:nickname", AdminAPIController, :user_show)
@@ -204,6 +206,29 @@ defmodule Pleroma.Web.Router do
get("/config/migrate_from_db", AdminAPIController, :migrate_from_db)
get("/moderation_log", AdminAPIController, :list_log)
post("/reload_emoji", AdminAPIController, :reload_emoji)
end
scope "/api/pleroma/emoji", Pleroma.Web.PleromaAPI do
scope "/packs" do
# Modifying packs
pipe_through([:admin_api, :oauth_write])
post("/import_from_fs", EmojiAPIController, :import_from_fs)
post("/:pack_name/update_file", EmojiAPIController, :update_file)
post("/:pack_name/update_metadata", EmojiAPIController, :update_metadata)
put("/:name", EmojiAPIController, :create)
delete("/:name", EmojiAPIController, :delete)
post("/download_from", EmojiAPIController, :download_from)
end
scope "/packs" do
# Pack info / downloading
get("/", EmojiAPIController, :list_packs)
get("/:name/download_shared/", EmojiAPIController, :download_shared)
end
end
scope "/", Pleroma.Web.TwitterAPI do
@@ -224,6 +249,7 @@ defmodule Pleroma.Web.Router do
scope [] do
pipe_through(:oauth_write)
post("/change_email", UtilController, :change_email)
post("/change_password", UtilController, :change_password)
post("/delete_account", UtilController, :delete_account)
put("/notification_settings", UtilController, :update_notificaton_settings)
@@ -443,6 +469,7 @@ defmodule Pleroma.Web.Router do
get("/timelines/tag/:tag", MastodonAPIController, :hashtag_timeline)
get("/timelines/list/:list_id", MastodonAPIController, :list_timeline)
get("/statuses", MastodonAPIController, :get_statuses)
get("/statuses/:id", MastodonAPIController, :get_status)
get("/statuses/:id/context", MastodonAPIController, :get_context)
@@ -512,6 +539,7 @@ defmodule Pleroma.Web.Router do
scope "/", Pleroma.Web do
pipe_through(:ostatus)
pipe_through(:http_signature)
get("/objects/:uuid", OStatus.OStatusController, :object)
get("/activities/:uuid", OStatus.OStatusController, :activity)

View file

@@ -170,6 +170,15 @@ def publish_one(%{recipient: url, feed: feed} = params) when is_binary(url) do
end
end
def publish_one(%{recipient_id: recipient_id} = params) do
recipient = User.get_cached_by_id(recipient_id)
params
|> Map.delete(:recipient_id)
|> Map.put(:recipient, recipient)
|> publish_one()
end
def publish_one(_), do: :noop
@supported_activities [
@@ -218,7 +227,7 @@ def publish(%{info: %{keys: keys}} = user, %{data: %{"type" => type}} = activity
Logger.debug(fn -> "Sending Salmon to #{remote_user.ap_id}" end)
Publisher.enqueue_one(__MODULE__, %{
recipient: remote_user,
recipient_id: remote_user.id,
feed: feed,
unreachable_since: reachable_urls_metadata[remote_user.info.salmon]
})

View file

@@ -1,318 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Streamer do
use GenServer
require Logger
alias Pleroma.Activity
alias Pleroma.Config
alias Pleroma.Conversation.Participation
alias Pleroma.Notification
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.MastodonAPI.NotificationView
@keepalive_interval :timer.seconds(30)
def start_link(_) do
GenServer.start_link(__MODULE__, %{}, name: __MODULE__)
end
def add_socket(topic, socket) do
GenServer.cast(__MODULE__, %{action: :add, socket: socket, topic: topic})
end
def remove_socket(topic, socket) do
GenServer.cast(__MODULE__, %{action: :remove, socket: socket, topic: topic})
end
def stream(topic, item) do
GenServer.cast(__MODULE__, %{action: :stream, topic: topic, item: item})
end
def init(args) do
Process.send_after(self(), %{action: :ping}, @keepalive_interval)
{:ok, args}
end
def handle_info(%{action: :ping}, topics) do
topics
|> Map.values()
|> List.flatten()
|> Enum.each(fn socket ->
Logger.debug("Sending keepalive ping")
send(socket.transport_pid, {:text, ""})
end)
Process.send_after(self(), %{action: :ping}, @keepalive_interval)
{:noreply, topics}
end
def handle_cast(%{action: :stream, topic: "direct", item: item}, topics) do
recipient_topics =
User.get_recipients_from_activity(item)
|> Enum.map(fn %{id: id} -> "direct:#{id}" end)
Enum.each(recipient_topics || [], fn user_topic ->
Logger.debug("Trying to push direct message to #{user_topic}\n\n")
push_to_socket(topics, user_topic, item)
end)
{:noreply, topics}
end
def handle_cast(%{action: :stream, topic: "participation", item: participation}, topics) do
user_topic = "direct:#{participation.user_id}"
Logger.debug("Trying to push a conversation participation to #{user_topic}\n\n")
push_to_socket(topics, user_topic, participation)
{:noreply, topics}
end
def handle_cast(%{action: :stream, topic: "list", item: item}, topics) do
# filter the recipient list if the activity is not public, see #270.
recipient_lists =
case Visibility.is_public?(item) do
true ->
Pleroma.List.get_lists_from_activity(item)
_ ->
Pleroma.List.get_lists_from_activity(item)
|> Enum.filter(fn list ->
owner = User.get_cached_by_id(list.user_id)
Visibility.visible_for_user?(item, owner)
end)
end
recipient_topics =
recipient_lists
|> Enum.map(fn %{id: id} -> "list:#{id}" end)
Enum.each(recipient_topics || [], fn list_topic ->
Logger.debug("Trying to push message to #{list_topic}\n\n")
push_to_socket(topics, list_topic, item)
end)
{:noreply, topics}
end
def handle_cast(
%{action: :stream, topic: topic, item: %Notification{} = item},
topics
)
when topic in ["user", "user:notification"] do
topics
|> Map.get("#{topic}:#{item.user_id}", [])
|> Enum.each(fn socket ->
with %User{} = user <- User.get_cached_by_ap_id(socket.assigns[:user].ap_id),
true <- should_send?(user, item) do
send(
socket.transport_pid,
{:text, represent_notification(socket.assigns[:user], item)}
)
end
end)
{:noreply, topics}
end
def handle_cast(%{action: :stream, topic: "user", item: item}, topics) do
Logger.debug("Trying to push to users")
recipient_topics =
User.get_recipients_from_activity(item)
|> Enum.map(fn %{id: id} -> "user:#{id}" end)
Enum.each(recipient_topics, fn topic ->
push_to_socket(topics, topic, item)
end)
{:noreply, topics}
end
def handle_cast(%{action: :stream, topic: topic, item: item}, topics) do
Logger.debug("Trying to push to #{topic}")
Logger.debug("Pushing item to #{topic}")
push_to_socket(topics, topic, item)
{:noreply, topics}
end
def handle_cast(%{action: :add, topic: topic, socket: socket}, sockets) do
topic = internal_topic(topic, socket)
sockets_for_topic = sockets[topic] || []
sockets_for_topic = Enum.uniq([socket | sockets_for_topic])
sockets = Map.put(sockets, topic, sockets_for_topic)
Logger.debug("Got new conn for #{topic}")
{:noreply, sockets}
end
def handle_cast(%{action: :remove, topic: topic, socket: socket}, sockets) do
topic = internal_topic(topic, socket)
sockets_for_topic = sockets[topic] || []
sockets_for_topic = List.delete(sockets_for_topic, socket)
sockets = Map.put(sockets, topic, sockets_for_topic)
Logger.debug("Removed conn for #{topic}")
{:noreply, sockets}
end
def handle_cast(m, state) do
Logger.info("Unknown: #{inspect(m)}, #{inspect(state)}")
{:noreply, state}
end
defp represent_update(%Activity{} = activity, %User{} = user) do
%{
event: "update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"status.json",
activity: activity,
for: user
)
|> Jason.encode!()
}
|> Jason.encode!()
end
defp represent_update(%Activity{} = activity) do
%{
event: "update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"status.json",
activity: activity
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def represent_conversation(%Participation{} = participation) do
%{
event: "conversation",
payload:
Pleroma.Web.MastodonAPI.ConversationView.render("participation.json", %{
participation: participation,
for: participation.user
})
|> Jason.encode!()
}
|> Jason.encode!()
end
@spec represent_notification(User.t(), Notification.t()) :: binary()
defp represent_notification(%User{} = user, %Notification{} = notify) do
%{
event: "notification",
payload:
NotificationView.render(
"show.json",
%{notification: notify, for: user}
)
|> Jason.encode!()
}
|> Jason.encode!()
end
defp should_send?(%User{} = user, %Activity{} = item) do
blocks = user.info.blocks || []
mutes = user.info.mutes || []
reblog_mutes = user.info.muted_reblogs || []
domain_blocks = Pleroma.Web.ActivityPub.MRF.subdomains_regex(user.info.domain_blocks)
with parent when not is_nil(parent) <- Object.normalize(item),
true <- Enum.all?([blocks, mutes, reblog_mutes], &(item.actor not in &1)),
true <- Enum.all?([blocks, mutes], &(parent.data["actor"] not in &1)),
%{host: item_host} <- URI.parse(item.actor),
%{host: parent_host} <- URI.parse(parent.data["actor"]),
false <- Pleroma.Web.ActivityPub.MRF.subdomain_match?(domain_blocks, item_host),
false <- Pleroma.Web.ActivityPub.MRF.subdomain_match?(domain_blocks, parent_host),
true <- thread_containment(item, user),
false <- CommonAPI.thread_muted?(user, item) do
true
else
_ -> false
end
end
defp should_send?(%User{} = user, %Notification{activity: activity}) do
should_send?(user, activity)
end
def push_to_socket(topics, topic, %Activity{data: %{"type" => "Announce"}} = item) do
Enum.each(topics[topic] || [], fn socket ->
# Get the current user so we have up-to-date blocks etc.
if socket.assigns[:user] do
user = User.get_cached_by_ap_id(socket.assigns[:user].ap_id)
if should_send?(user, item) do
send(socket.transport_pid, {:text, represent_update(item, user)})
end
else
send(socket.transport_pid, {:text, represent_update(item)})
end
end)
end
def push_to_socket(topics, topic, %Participation{} = participation) do
Enum.each(topics[topic] || [], fn socket ->
send(socket.transport_pid, {:text, represent_conversation(participation)})
end)
end
def push_to_socket(topics, topic, %Activity{
data: %{"type" => "Delete", "deleted_activity_id" => deleted_activity_id}
}) do
Enum.each(topics[topic] || [], fn socket ->
send(
socket.transport_pid,
{:text, %{event: "delete", payload: to_string(deleted_activity_id)} |> Jason.encode!()}
)
end)
end
def push_to_socket(_topics, _topic, %Activity{data: %{"type" => "Delete"}}), do: :noop
def push_to_socket(topics, topic, item) do
Enum.each(topics[topic] || [], fn socket ->
# Get the current user so we have up-to-date blocks etc.
if socket.assigns[:user] do
user = User.get_cached_by_ap_id(socket.assigns[:user].ap_id)
blocks = user.info.blocks || []
mutes = user.info.mutes || []
with true <- Enum.all?([blocks, mutes], &(item.actor not in &1)),
true <- thread_containment(item, user) do
send(socket.transport_pid, {:text, represent_update(item, user)})
end
else
send(socket.transport_pid, {:text, represent_update(item)})
end
end)
end
defp internal_topic(topic, socket) when topic in ~w[user user:notification direct] do
"#{topic}:#{socket.assigns[:user].id}"
end
defp internal_topic(topic, _), do: topic
@spec thread_containment(Activity.t(), User.t()) :: boolean()
defp thread_containment(_activity, %User{info: %{skip_thread_containment: true}}), do: true
defp thread_containment(activity, user) do
if Config.get([:instance, :skip_thread_containment]) do
true
else
ActivityPub.contain_activity(activity, user)
end
end
end

View file

@@ -0,0 +1,33 @@
defmodule Pleroma.Web.Streamer.Ping do
use GenServer
require Logger
alias Pleroma.Web.Streamer.State
alias Pleroma.Web.Streamer.StreamerSocket
@keepalive_interval :timer.seconds(30)
def start_link(opts) do
ping_interval = Keyword.get(opts, :ping_interval, @keepalive_interval)
GenServer.start_link(__MODULE__, %{ping_interval: ping_interval}, name: __MODULE__)
end
def init(%{ping_interval: ping_interval} = args) do
Process.send_after(self(), :ping, ping_interval)
{:ok, args}
end
def handle_info(:ping, %{ping_interval: ping_interval} = state) do
State.get_sockets()
|> Map.values()
|> List.flatten()
|> Enum.each(fn %StreamerSocket{transport_pid: transport_pid} ->
Logger.debug("Sending keepalive ping")
send(transport_pid, {:text, ""})
end)
Process.send_after(self(), :ping, ping_interval)
{:noreply, state}
end
end

View file

@@ -0,0 +1,78 @@
defmodule Pleroma.Web.Streamer.State do
use GenServer
require Logger
alias Pleroma.Web.Streamer.StreamerSocket
@env Mix.env()
def start_link(_) do
GenServer.start_link(__MODULE__, %{sockets: %{}}, name: __MODULE__)
end
def add_socket(topic, socket) do
GenServer.call(__MODULE__, {:add, topic, socket})
end
def remove_socket(topic, socket) do
do_remove_socket(@env, topic, socket)
end
def get_sockets do
%{sockets: stream_sockets} = GenServer.call(__MODULE__, :get_state)
stream_sockets
end
def init(init_arg) do
{:ok, init_arg}
end
def handle_call(:get_state, _from, state) do
{:reply, state, state}
end
def handle_call({:add, topic, socket}, _from, %{sockets: sockets} = state) do
internal_topic = internal_topic(topic, socket)
stream_socket = StreamerSocket.from_socket(socket)
sockets_for_topic =
sockets
|> Map.get(internal_topic, [])
|> List.insert_at(0, stream_socket)
|> Enum.uniq()
state = put_in(state, [:sockets, internal_topic], sockets_for_topic)
Logger.debug("Got new conn for #{topic}")
{:reply, state, state}
end
def handle_call({:remove, topic, socket}, _from, %{sockets: sockets} = state) do
internal_topic = internal_topic(topic, socket)
stream_socket = StreamerSocket.from_socket(socket)
sockets_for_topic =
sockets
|> Map.get(internal_topic, [])
|> List.delete(stream_socket)
state = Kernel.put_in(state, [:sockets, internal_topic], sockets_for_topic)
{:reply, state, state}
end
defp do_remove_socket(:test, _, _) do
:ok
end
defp do_remove_socket(_env, topic, socket) do
GenServer.call(__MODULE__, {:remove, topic, socket})
end
defp internal_topic(topic, socket)
when topic in ~w[user user:notification direct] do
"#{topic}:#{socket.assigns[:user].id}"
end
defp internal_topic(topic, _) do
topic
end
end

View file

@@ -0,0 +1,55 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Streamer do
alias Pleroma.Web.Streamer.State
alias Pleroma.Web.Streamer.Worker
@timeout 60_000
@mix_env Mix.env()
def add_socket(topic, socket) do
State.add_socket(topic, socket)
end
def remove_socket(topic, socket) do
State.remove_socket(topic, socket)
end
def get_sockets do
State.get_sockets()
end
def stream(topics, items) do
if should_send?() do
Task.async(fn ->
:poolboy.transaction(
:streamer_worker,
&Worker.stream(&1, topics, items),
@timeout
)
end)
end
end
def supervisor, do: Pleroma.Web.Streamer.Supervisor
defp should_send? do
handle_should_send(@mix_env)
end
defp handle_should_send(:test) do
case Process.whereis(:streamer_worker) do
nil ->
false
pid ->
Process.alive?(pid)
end
end
defp handle_should_send(_) do
true
end
end

View file

@@ -0,0 +1,31 @@
defmodule Pleroma.Web.Streamer.StreamerSocket do
defstruct transport_pid: nil, user: nil
alias Pleroma.User
alias Pleroma.Web.Streamer.StreamerSocket
def from_socket(%{
transport_pid: transport_pid,
assigns: %{user: nil}
}) do
%StreamerSocket{
transport_pid: transport_pid
}
end
def from_socket(%{
transport_pid: transport_pid,
assigns: %{user: %User{} = user}
}) do
%StreamerSocket{
transport_pid: transport_pid,
user: user
}
end
def from_socket(%{transport_pid: transport_pid}) do
%StreamerSocket{
transport_pid: transport_pid
}
end
end

View file

@@ -0,0 +1,33 @@
defmodule Pleroma.Web.Streamer.Supervisor do
use Supervisor
def start_link(opts) do
Supervisor.start_link(__MODULE__, opts, name: __MODULE__)
end
def init(args) do
children = [
{Pleroma.Web.Streamer.State, args},
{Pleroma.Web.Streamer.Ping, args},
:poolboy.child_spec(:streamer_worker, poolboy_config())
]
opts = [strategy: :one_for_one, name: Pleroma.Web.Streamer.Supervisor]
Supervisor.init(children, opts)
end
defp poolboy_config do
opts =
Pleroma.Config.get(:streamer,
workers: 3,
overflow_workers: 2
)
[
{:name, {:local, :streamer_worker}},
{:worker_module, Pleroma.Web.Streamer.Worker},
{:size, opts[:workers]},
{:max_overflow, opts[:overflow_workers]}
]
end
end
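The pool size comes from the application environment via Pleroma.Config.get(:streamer, ...); a minimal config sketch, assuming you want to override the defaults shown above:

    # config/prod.secret.exs (or config.exs)
    config :pleroma, :streamer,
      workers: 3,
      overflow_workers: 2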

View file

@@ -0,0 +1,220 @@
defmodule Pleroma.Web.Streamer.Worker do
use GenServer
require Logger
alias Pleroma.Activity
alias Pleroma.Config
alias Pleroma.Conversation.Participation
alias Pleroma.Notification
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.Streamer.State
alias Pleroma.Web.Streamer.StreamerSocket
alias Pleroma.Web.StreamerView
def start_link(_) do
GenServer.start_link(__MODULE__, %{}, [])
end
def init(init_arg) do
{:ok, init_arg}
end
def stream(pid, topics, items) do
GenServer.call(pid, {:stream, topics, items})
end
def handle_call({:stream, topics, item}, _from, state) when is_list(topics) do
Enum.each(topics, fn t ->
do_stream(%{topic: t, item: item})
end)
{:reply, state, state}
end
def handle_call({:stream, topic, items}, _from, state) when is_list(items) do
Enum.each(items, fn i ->
do_stream(%{topic: topic, item: i})
end)
{:reply, state, state}
end
def handle_call({:stream, topic, item}, _from, state) do
do_stream(%{topic: topic, item: item})
{:reply, state, state}
end
defp do_stream(%{topic: "direct", item: item}) do
recipient_topics =
User.get_recipients_from_activity(item)
|> Enum.map(fn %{id: id} -> "direct:#{id}" end)
Enum.each(recipient_topics, fn user_topic ->
Logger.debug("Trying to push direct message to #{user_topic}\n\n")
push_to_socket(State.get_sockets(), user_topic, item)
end)
end
defp do_stream(%{topic: "participation", item: participation}) do
user_topic = "direct:#{participation.user_id}"
Logger.debug("Trying to push a conversation participation to #{user_topic}\n\n")
push_to_socket(State.get_sockets(), user_topic, participation)
end
defp do_stream(%{topic: "list", item: item}) do
# filter the recipient list if the activity is not public, see #270.
recipient_lists =
case Visibility.is_public?(item) do
true ->
Pleroma.List.get_lists_from_activity(item)
_ ->
Pleroma.List.get_lists_from_activity(item)
|> Enum.filter(fn list ->
owner = User.get_cached_by_id(list.user_id)
Visibility.visible_for_user?(item, owner)
end)
end
recipient_topics =
recipient_lists
|> Enum.map(fn %{id: id} -> "list:#{id}" end)
Enum.each(recipient_topics, fn list_topic ->
Logger.debug("Trying to push message to #{list_topic}\n\n")
push_to_socket(State.get_sockets(), list_topic, item)
end)
end
defp do_stream(%{topic: topic, item: %Notification{} = item})
when topic in ["user", "user:notification"] do
State.get_sockets()
|> Map.get("#{topic}:#{item.user_id}", [])
|> Enum.each(fn %StreamerSocket{transport_pid: transport_pid, user: socket_user} ->
with %User{} = user <- User.get_cached_by_ap_id(socket_user.ap_id),
true <- should_send?(user, item) do
send(transport_pid, {:text, StreamerView.render("notification.json", socket_user, item)})
end
end)
end
defp do_stream(%{topic: "user", item: item}) do
Logger.debug("Trying to push to users")
recipient_topics =
User.get_recipients_from_activity(item)
|> Enum.map(fn %{id: id} -> "user:#{id}" end)
Enum.each(recipient_topics, fn topic ->
push_to_socket(State.get_sockets(), topic, item)
end)
end
defp do_stream(%{topic: topic, item: item}) do
Logger.debug("Trying to push to #{topic}")
Logger.debug("Pushing item to #{topic}")
push_to_socket(State.get_sockets(), topic, item)
end
defp should_send?(%User{} = user, %Activity{} = item) do
blocks = user.info.blocks || []
mutes = user.info.mutes || []
reblog_mutes = user.info.muted_reblogs || []
domain_blocks = Pleroma.Web.ActivityPub.MRF.subdomains_regex(user.info.domain_blocks)
with parent when not is_nil(parent) <- Object.normalize(item),
true <- Enum.all?([blocks, mutes, reblog_mutes], &(item.actor not in &1)),
true <- Enum.all?([blocks, mutes], &(parent.data["actor"] not in &1)),
%{host: item_host} <- URI.parse(item.actor),
%{host: parent_host} <- URI.parse(parent.data["actor"]),
false <- Pleroma.Web.ActivityPub.MRF.subdomain_match?(domain_blocks, item_host),
false <- Pleroma.Web.ActivityPub.MRF.subdomain_match?(domain_blocks, parent_host),
true <- thread_containment(item, user),
false <- CommonAPI.thread_muted?(user, item) do
true
else
_ -> false
end
end
defp should_send?(%User{} = user, %Notification{activity: activity}) do
should_send?(user, activity)
end
def push_to_socket(topics, topic, %Activity{data: %{"type" => "Announce"}} = item) do
Enum.each(topics[topic] || [], fn %StreamerSocket{
transport_pid: transport_pid,
user: socket_user
} ->
# Get the current user so we have up-to-date blocks etc.
if socket_user do
user = User.get_cached_by_ap_id(socket_user.ap_id)
if should_send?(user, item) do
send(transport_pid, {:text, StreamerView.render("update.json", item, user)})
end
else
send(transport_pid, {:text, StreamerView.render("update.json", item)})
end
end)
end
def push_to_socket(topics, topic, %Participation{} = participation) do
Enum.each(topics[topic] || [], fn %StreamerSocket{transport_pid: transport_pid} ->
send(transport_pid, {:text, StreamerView.render("conversation.json", participation)})
end)
end
def push_to_socket(topics, topic, %Activity{
data: %{"type" => "Delete", "deleted_activity_id" => deleted_activity_id}
}) do
Enum.each(topics[topic] || [], fn %StreamerSocket{transport_pid: transport_pid} ->
send(
transport_pid,
{:text, %{event: "delete", payload: to_string(deleted_activity_id)} |> Jason.encode!()}
)
end)
end
def push_to_socket(_topics, _topic, %Activity{data: %{"type" => "Delete"}}), do: :noop
def push_to_socket(topics, topic, item) do
Enum.each(topics[topic] || [], fn %StreamerSocket{
transport_pid: transport_pid,
user: socket_user
} ->
# Get the current user so we have up-to-date blocks etc.
if socket_user do
user = User.get_cached_by_ap_id(socket_user.ap_id)
blocks = user.info.blocks || []
mutes = user.info.mutes || []
with true <- Enum.all?([blocks, mutes], &(item.actor not in &1)),
true <- thread_containment(item, user) do
send(transport_pid, {:text, StreamerView.render("update.json", item, user)})
end
else
send(transport_pid, {:text, StreamerView.render("update.json", item)})
end
end)
end
@spec thread_containment(Activity.t(), User.t()) :: boolean()
defp thread_containment(_activity, %User{info: %{skip_thread_containment: true}}), do: true
defp thread_containment(activity, user) do
if Config.get([:instance, :skip_thread_containment]) do
true
else
ActivityPub.contain_activity(activity, user)
end
end
end

View file

@@ -263,12 +263,7 @@ def follow_import(%{assigns: %{user: follower}} = conn, %{"list" => list}) do
String.split(line, ",") |> List.first()
end)
|> List.delete("Account address") do
PleromaJobQueue.enqueue(:background, User, [
:follow_import,
follower,
followed_identifiers
])
User.follow_import(follower, followed_identifiers)
json(conn, "job started")
end
end
@@ -279,12 +274,7 @@ def blocks_import(conn, %{"list" => %Plug.Upload{} = listfile}) do
def blocks_import(%{assigns: %{user: blocker}} = conn, %{"list" => list}) do
with blocked_identifiers <- String.split(list) do
PleromaJobQueue.enqueue(:background, User, [
:blocks_import,
blocker,
blocked_identifiers
])
User.blocks_import(blocker, blocked_identifiers)
json(conn, "job started")
end
end
@@ -312,6 +302,25 @@ def change_password(%{assigns: %{user: user}} = conn, params) do
end
end
def change_email(%{assigns: %{user: user}} = conn, params) do
case CommonAPI.Utils.confirm_current_password(user, params["password"]) do
{:ok, user} ->
with {:ok, _user} <- User.change_email(user, params["email"]) do
json(conn, %{status: "success"})
else
{:error, changeset} ->
{_, {error, _}} = Enum.at(changeset.errors, 0)
json(conn, %{error: "Email #{error}."})
_ ->
json(conn, %{error: "Unable to change email."})
end
{:error, msg} ->
json(conn, %{error: msg})
end
end
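A rough illustration of calling the new endpoint, assuming the usual /api/pleroma prefix for these TwitterAPI util routes and an OAuth token with write scope; the params are the ones `change_email/2` reads above:

    Tesla.post!(
      "https://my.instance/api/pleroma/change_email",
      URI.encode_query(%{"password" => "current password", "email" => "new@example.com"}),
      headers: [
        {"content-type", "application/x-www-form-urlencoded"},
        {"authorization", "Bearer <token>"}
      ]
    )
    # response body: {"status": "success"} on success, {"error": "..."} otherwise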
def delete_account(%{assigns: %{user: user}} = conn, params) do
case CommonAPI.Utils.confirm_current_password(user, params["password"]) do
{:ok, user} ->

View file

@@ -0,0 +1,66 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.StreamerView do
use Pleroma.Web, :view
alias Pleroma.Activity
alias Pleroma.Conversation.Participation
alias Pleroma.Notification
alias Pleroma.User
alias Pleroma.Web.MastodonAPI.NotificationView
def render("update.json", %Activity{} = activity, %User{} = user) do
%{
event: "update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"status.json",
activity: activity,
for: user
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def render("notification.json", %User{} = user, %Notification{} = notify) do
%{
event: "notification",
payload:
NotificationView.render(
"show.json",
%{notification: notify, for: user}
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def render("update.json", %Activity{} = activity) do
%{
event: "update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"status.json",
activity: activity
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def render("conversation.json", %Participation{} = participation) do
%{
event: "conversation",
payload:
Pleroma.Web.MastodonAPI.ConversationView.render("participation.json", %{
participation: participation,
for: participation.user
})
|> Jason.encode!()
}
|> Jason.encode!()
end
end

View file

@@ -66,23 +66,9 @@ def safe_render(view, template, assigns \\ %{}) do
end
@doc """
Same as `render_many/4` but wrapped in rescue block and parallelized (unless disabled by passing false as a fifth argument).
Same as `render_many/4` but wrapped in rescue block.
"""
def safe_render_many(collection, view, template, assigns \\ %{}, parallel \\ true)
def safe_render_many(collection, view, template, assigns, true) do
Enum.map(collection, fn resource ->
Task.async(fn ->
as = Map.get(assigns, :as) || view.__resource__
assigns = Map.put(assigns, as, resource)
safe_render(view, template, assigns)
end)
end)
|> Enum.map(&Task.await(&1, :infinity))
|> Enum.filter(& &1)
end
def safe_render_many(collection, view, template, assigns, false) do
def safe_render_many(collection, view, template, assigns \\ %{}) do
Enum.map(collection, fn resource ->
as = Map.get(assigns, :as) || view.__resource__
assigns = Map.put(assigns, as, resource)

View file

@@ -0,0 +1,18 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.ActivityExpirationWorker do
use Pleroma.Workers.WorkerHelper, queue: "activity_expiration"
@impl Oban.Worker
def perform(
%{
"op" => "activity_expiration",
"activity_expiration_id" => activity_expiration_id
},
_job
) do
Pleroma.Daemons.ActivityExpirationDaemon.perform(:execute, activity_expiration_id)
end
end

View file

@@ -0,0 +1,74 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.BackgroundWorker do
alias Pleroma.Activity
alias Pleroma.User
alias Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy
alias Pleroma.Web.OAuth.Token.CleanWorker
use Pleroma.Workers.WorkerHelper, queue: "background"
@impl Oban.Worker
def perform(%{"op" => "fetch_initial_posts", "user_id" => user_id}, _job) do
user = User.get_cached_by_id(user_id)
User.perform(:fetch_initial_posts, user)
end
def perform(%{"op" => "deactivate_user", "user_id" => user_id, "status" => status}, _job) do
user = User.get_cached_by_id(user_id)
User.perform(:deactivate_async, user, status)
end
def perform(%{"op" => "delete_user", "user_id" => user_id}, _job) do
user = User.get_cached_by_id(user_id)
User.perform(:delete, user)
end
def perform(%{"op" => "force_password_reset", "user_id" => user_id}, _job) do
user = User.get_cached_by_id(user_id)
User.perform(:force_password_reset, user)
end
def perform(
%{
"op" => "blocks_import",
"blocker_id" => blocker_id,
"blocked_identifiers" => blocked_identifiers
},
_job
) do
blocker = User.get_cached_by_id(blocker_id)
User.perform(:blocks_import, blocker, blocked_identifiers)
end
def perform(
%{
"op" => "follow_import",
"follower_id" => follower_id,
"followed_identifiers" => followed_identifiers
},
_job
) do
follower = User.get_cached_by_id(follower_id)
User.perform(:follow_import, follower, followed_identifiers)
end
def perform(%{"op" => "clean_expired_tokens"}, _job) do
CleanWorker.perform(:clean)
end
def perform(%{"op" => "media_proxy_preload", "message" => message}, _job) do
MediaProxyWarmingPolicy.perform(:preload, message)
end
def perform(%{"op" => "media_proxy_prefetch", "url" => url}, _job) do
MediaProxyWarmingPolicy.perform(:prefetch, url)
end
def perform(%{"op" => "fetch_data_for_activity", "activity_id" => activity_id}, _job) do
activity = Activity.get_by_id(activity_id)
Pleroma.Web.RichMedia.Helpers.perform(:fetch, activity)
end
end
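Callers reach these clauses through the `enqueue/2` helper generated by `Pleroma.Workers.WorkerHelper` (not shown in this diff); a hedged sketch, mirroring the `WebPusherWorker.enqueue/2` call earlier in this diff, with `user`, `blocker`, and `blocked_identifiers` as placeholders:

    Pleroma.Workers.BackgroundWorker.enqueue("delete_user", %{"user_id" => user.id})

    Pleroma.Workers.BackgroundWorker.enqueue("blocks_import", %{
      "blocker_id" => blocker.id,
      "blocked_identifiers" => blocked_identifiers
    })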

View file

@@ -0,0 +1,16 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.DigestEmailsWorker do
alias Pleroma.User
use Pleroma.Workers.WorkerHelper, queue: "digest_emails"
@impl Oban.Worker
def perform(%{"op" => "digest_email", "user_id" => user_id}, _job) do
user_id
|> User.get_cached_by_id()
|> Pleroma.Daemons.DigestEmailDaemon.perform()
end
end

View file

@@ -0,0 +1,15 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.MailerWorker do
use Pleroma.Workers.WorkerHelper, queue: "mailer"
@impl Oban.Worker
def perform(%{"op" => "email", "encoded_email" => encoded_email, "config" => config}, _job) do
encoded_email
|> Base.decode64!()
|> :erlang.binary_to_term()
|> Pleroma.Emails.Mailer.deliver(config)
end
end
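The enqueueing side has to serialize the email the same way this clause deserializes it; a minimal sketch under that assumption, with `email` and `config` as placeholders:

    encoded_email =
      email
      |> :erlang.term_to_binary()
      |> Base.encode64()

    Pleroma.Workers.MailerWorker.enqueue("email", %{
      "encoded_email" => encoded_email,
      "config" => config
    })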

Some files were not shown because too many files have changed in this diff.