Merge develop

Roman Chvanikov 2019-09-26 10:38:54 +03:00
commit b4b147000c
258 changed files with 4798 additions and 2735 deletions

View file

@@ -4,14 +4,26 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## [Unreleased]
### Added
- Refreshing poll results for remote polls
- Admin API: Add ability to require password reset
### Changed
- **Breaking:** Elixir >=1.8 is now required (was >= 1.7)
- Replaced [pleroma_job_queue](https://git.pleroma.social/pleroma/pleroma_job_queue) and `Pleroma.Web.Federator.RetryQueue` with [Oban](https://github.com/sorentwo/oban) (see [`docs/config.md`](docs/config.md) on migrating customized worker / retry settings)
- Introduced [quantum](https://github.com/quantum-elixir/quantum-core) job scheduler
- Admin API: Return `total` when querying for reports
- Mastodon API: Return `pleroma.direct_conversation_id` when creating a direct message (`POST /api/v1/statuses`)
- Admin API: Return link alongside the token on password reset
### Fixed
- Mastodon API: Fix private and direct statuses not being filtered out from the public timeline for an authenticated user (`GET /api/v1/timelines/public`)
## [1.1.0] - 2019-??-??
### Security
- OStatus: eliminate the possibility of a protocol downgrade attack.
- OStatus: prevent following locked accounts, bypassing the approval process.
- Mastodon API: respect post privacy in `/api/v1/statuses/:id/{favourited,reblogged}_by`
### Removed
- **Breaking:** GNU Social API with Qvitter extensions support
- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
- Emoji: Remove longfox emojis.
- Remove `Reply-To` header from report emails for admins.
@@ -19,9 +31,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- **Breaking:** Configuration: A setting to explicitly disable the mailer was added, defaulting to true; if you are using a mailer, add `config :pleroma, Pleroma.Emails.Mailer, enabled: true` to your config
- **Breaking:** Configuration: `/media/` is now removed when `base_url` is configured, append `/media/` to your `base_url` config to keep the old behaviour if desired
- **Breaking:** `/api/pleroma/notifications/read` is moved to `/api/v1/pleroma/notifications/read` and now supports `max_id` and responds with Mastodon API entities.
- **Breaking:** `/api/pleroma/admin/users/invite_token` now uses `POST`, changed accepted params and returns full invite in json instead of only token string.
- Configuration: added `config/description.exs`, from which `docs/config.md` is generated
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
- Mastodon API: `pleroma.thread_muted` key in the Status entity
- Federation: Return 403 errors when trying to request pages from a user's follower/following collections if they have `hide_followers`/`hide_follows` set
- NodeInfo: Return `skipThreadContainment` in `metadata` for the `skip_thread_containment` option
- NodeInfo: Return `mailerEnabled` in `metadata`
@@ -30,23 +44,18 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- AdminAPI: Add "godmode" while fetching user statuses (i.e. admin can see private statuses)
- Improve digest email template
- Pagination: (optional) return `total` alongside `items` when paginating
- Add `rel="ugc"` to all links in statuses, to prevent SEO spam
- ActivityPub: The first page in inboxes/outboxes is no longer embedded.
### Fixed
- Following from Osada
- Not being able to pin unlisted posts
- Objects being re-embedded to activities after being updated (e.g. faved/reposted). Running 'mix pleroma.database prune_objects' again is advised.
- Favorites timeline doing database-intensive queries
- Metadata rendering errors resulting in the entire page being inaccessible
- `federation_incoming_replies_max_depth` option being ignored in certain cases
- Federation/MediaProxy not working with instances that have wrong certificate order
- Mastodon API: Handling of search timeouts (`/api/v1/search` and `/api/v2/search`)
- Mastodon API: Misskey's endless polls being unable to render
- Mastodon API: Embedded relationships not being properly rendered in the Account entity of Status entity
- Mastodon API: Notifications endpoint crashing if one notification failed to render
- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
- Mastodon API: Add `account_id`, `type`, `offset`, and `limit` to search API (`/api/v1/search` and `/api/v2/search`)
- Mastodon API, streaming: Fix filtering of notifications based on blocks/mutes/thread mutes
- ActivityPub C2S: follower/following collection pages being inaccessible even when authenticated if `hide_followers`/ `hide_follows` was set
@@ -54,15 +63,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Rich Media: Parser failing when no TTL can be found by image TTL setters
- Rich Media: The crawled URL is now spliced into the rich media data.
- ActivityPub S2S: sharedInbox usage has been mostly aligned with the rules in the AP specification.
- ActivityPub S2S: remote user deletions now work the same as local user deletions.
- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
- Not being able to access the Mastodon FE login page on private instances
- Invalid SemVer version generation, when the current branch does not have commits ahead of tag/checked out on a tag
- Pleroma.Upload base_url was not automatically whitelisted by MediaProxy. Now your custom CDN or file hosting will be accessed directly as expected.
- Report email not being sent to admins when the reporter is a remote user
- MRF: ensure that subdomain_match calls are case-insensitive
- Reverse Proxy limiting `max_body_length` was incorrectly defined and only checked `Content-Length` headers which may not be sufficient in some circumstances
- MRF: fix use of unserializable keyword lists in describe() implementations
- ActivityPub: Deactivated user deletion
- ActivityPub: Fix `/users/:nickname/inbox` crashing without an authenticated user
- MRF: fix ability to follow a relay when AntiFollowbotPolicy was enabled
@@ -73,16 +76,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Mastodon API: all status JSON responses contain a `pleroma.expires_at` item which states when an activity will expire. The value is only shown to the user who created the activity. To everyone else it's empty.
- Configuration: `ActivityExpiration.enabled` controls whether expired activities will get deleted at the appropriate time. Enabled by default.
- Conversations: Add Pleroma-specific conversation endpoints and status posting extensions. Run the `bump_all_conversations` task again to create the necessary data.
- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
- MRF: Support for priming the mediaproxy cache (`Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy`)
- MRF: Support for excluding specific domains from Transparency.
- MRF: Support for filtering posts based on who they mention (`Pleroma.Web.ActivityPub.MRF.MentionPolicy`)
- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
- MRF (Simple Policy): Support for wildcard domains.
- Support for wildcard domains in user domain blocks setting.
- Configuration: `quarantined_instances` support wildcard domains.
- Configuration: `federation_incoming_replies_max_depth` option
- Mastodon API: Support for the [`tagged` filter](https://github.com/tootsuite/mastodon/pull/9755) in [`GET /api/v1/accounts/:id/statuses`](https://docs.joinmastodon.org/api/rest/accounts/#get-api-v1-accounts-id-statuses)
- Mastodon API, streaming: Add support for passing the token in the `Sec-WebSocket-Protocol` header
- Mastodon API, extension: Ability to reset avatar, profile banner, and background
@@ -94,6 +90,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Mastodon API: added `/auth/password` endpoint for password reset with rate limit.
- Mastodon API: /api/v1/accounts/:id/statuses now supports nicknames or user id
- Mastodon API: Improve support for the user profile custom fields
- Mastodon API: follower/following counters are nullified when `hide_follows`/`hide_followers` and `hide_follows_count`/`hide_followers_count` are set
- Admin API: Return users' tags when querying reports
- Admin API: Return avatar and display name when querying users
- Admin API: Allow querying user by ID
@@ -110,12 +107,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Admin API: Endpoint for fetching latest user's statuses
- Pleroma API: Add `/api/v1/pleroma/accounts/confirmation_resend?email=<email>` for resending account confirmation.
- Pleroma API: Email change endpoint.
- Relays: Added a task to list relay subscriptions.
- Mix Tasks: `mix pleroma.database fix_likes_collections`
- Federation: Remove `likes` from objects.
- Admin API: Added moderation log
- Web response cache (currently, enabled for ActivityPub)
- Mastodon API: Added an endpoint to get multiple statuses by IDs (`GET /api/v1/statuses/?ids[]=1&ids[]=2`)
- ActivityPub: Add ActivityPub actor's `discoverable` parameter.
### Changed
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
@@ -123,6 +118,61 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- RichMedia: parsers and their order are configured in `rich_media` config.
- RichMedia: add the rich media ttl based on image expiration time.
## [1.0.6] - 2019-08-14
### Fixed
- MRF: fix use of unserializable keyword lists in describe() implementations
- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
## [1.0.5] - 2019-08-13
### Fixed
- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
- Mastodon API: return the actual profile URL in the Account entity's `url` property when appropriate
- Templates: properly style anchor tags
- Objects being re-embedded to activities after being updated (e.g faved/reposted). Running 'mix pleroma.database prune_objects' again is advised.
- Not being able to access the Mastodon FE login page on private instances
- MRF: ensure that subdomain_match calls are case-insensitive
- Fix internal server error when using the healthcheck API.
### Added
- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
- Relays: Added a task to list relay subscriptions.
- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
- MRF (Simple Policy): Support for wildcard domains.
- Support for wildcard domains in user domain blocks setting.
- Configuration: `quarantined_instances` support wildcard domains.
- Mix Tasks: `mix pleroma.database fix_likes_collections`
- Configuration: `federation_incoming_replies_max_depth` option
### Removed
- Federation: Remove `likes` from objects.
- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
## [1.0.4] - 2019-08-01
### Fixed
- Invalid SemVer version generation, when the current branch does not have commits ahead of tag/checked out on a tag
## [1.0.3] - 2019-07-31
### Security
- OStatus: eliminate the possibility of a protocol downgrade attack.
- OStatus: prevent following locked accounts, bypassing the approval process.
- TwitterAPI: use CommonAPI to handle remote follows instead of OStatus.
## [1.0.2] - 2019-07-28
### Fixed
- Not being able to pin unlisted posts
- Mastodon API: represent poll IDs as strings
- MediaProxy: fix matching filenames
- MediaProxy: fix filename encoding
- Migrations: fix a sporadic migration failure
- Metadata rendering errors resulting in the entire page being inaccessible
- Federation/MediaProxy not working with instances that have wrong certificate order
- ActivityPub S2S: remote user deletions now work the same as local user deletions.
### Changed
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
## [1.0.1] - 2019-07-14
### Security

View file

@@ -109,6 +109,7 @@
config :pleroma, Pleroma.Uploaders.S3,
bucket: nil,
streaming_enabled: true,
public_endpoint: "https://s3.amazonaws.com"
config :pleroma, Pleroma.Uploaders.MDII,
@@ -122,7 +123,8 @@
# Put groups that have higher priority than defaults here. Example in `docs/config/custom_emoji.md`
Custom: ["/emoji/*.png", "/emoji/**/*.png"]
],
default_manifest: "https://git.pleroma.social/pleroma/emoji-index/raw/master/index.json",
shared_pack_cache_seconds_per_file: 60
config :pleroma, :uri_schemes,
valid_schemes: [
@@ -507,7 +509,7 @@
class: false,
strip_prefix: false,
new_window: false,
rel: "ugc"
]
config :pleroma, :ldap,

File diff suppressed because it is too large.

View file

@@ -30,7 +30,8 @@
notify_email: "noreply@example.com",
skip_thread_containment: false,
federating: false,
external_user_synchronization: false,
static_dir: "test/instance_static/"
config :pleroma, :activitypub, sign_object_fetches: false

View file

@@ -224,15 +224,25 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
## `/api/pleroma/admin/users/invite_token`
### Create an account registration invite token
- Methods: `POST`
- Params:
- *optional* `max_use` (integer)
- *optional* `expires_at` (date string e.g. "2019-04-07")
- Response:
```json
{
"id": integer,
"token": string,
"used": boolean,
"expires_at": date,
"uses": integer,
"max_use": integer,
"invite_type": string (possible values: `one_time`, `reusable`, `date_limited`, `reusable_date_limited`)
}
```
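As an illustration only (not part of the upstream docs): a minimal Elixir sketch of calling the new endpoint, assuming the `HTTPoison` and `Jason` libraries and an admin-scoped OAuth token; sending the params as a JSON body is an assumption, form encoding may be accepted as well.

```elixir
# Hypothetical sketch: create a date-limited invite and return its token.
# `base_url` and `admin_token` are placeholders supplied by the caller.
defmodule InviteExample do
  def create_invite(base_url, admin_token) do
    headers = [
      {"authorization", "Bearer " <> admin_token},
      {"content-type", "application/json"}
    ]

    body = Jason.encode!(%{"max_use" => 5, "expires_at" => "2019-04-07"})

    %HTTPoison.Response{status_code: 200, body: resp} =
      HTTPoison.post!(base_url <> "/api/pleroma/admin/users/invite_token", body, headers)

    # The response is the full invite map documented above; pull out the token.
    Jason.decode!(resp)["token"]
  end
end
```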
## `/api/pleroma/admin/users/invites`
@@ -298,7 +308,23 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
- Methods: `GET`
- Params: none
- Response:
```json
{
"token": "U13DX6muOvpRsj35_ij9wLxUbkU-eFvfKttxs6gIajo=", // password reset token (base64 string)
"link": "https://pleroma.social/api/pleroma/password_reset/U13DX6muOvpRsj35_ij9wLxUbkU-eFvfKttxs6gIajo%3D"
}
```
## `/api/pleroma/admin/users/:nickname/force_password_reset`
### Force password reset for a user with a given nickname
- Methods: `PATCH`
- Params: none
- Response: none (code `204`)
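A hedged sketch of driving both password-reset endpoints from Elixir (again assuming `HTTPoison`/`Jason`, an admin token, and a hypothetical nickname `lain`; the `password_reset` path is taken from the surrounding Admin API docs):

```elixir
headers = [{"authorization", "Bearer " <> admin_token}]

# Fetch a reset token plus the ready-made link for the user.
%HTTPoison.Response{status_code: 200, body: body} =
  HTTPoison.get!(base_url <> "/api/pleroma/admin/users/lain/password_reset", headers)

reset_link = Jason.decode!(body)["link"]
IO.puts("Send this to the user: #{reset_link}")

# Additionally require the user to pick a new password on next login.
%HTTPoison.Response{status_code: 204} =
  HTTPoison.patch!(base_url <> "/api/pleroma/admin/users/lain/force_password_reset", "", headers)
```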
## `/api/pleroma/admin/reports`
### Get a list of reports
@@ -317,6 +343,7 @@ Note: Available `:permission_group` is currently moderator and admin. 404 is ret
```json
{
"total" : 1,
"reports": [
{
"account": {
@@ -722,3 +749,10 @@ Compile time settings (need instance reboot):
}
]
```
## `POST /api/pleroma/admin/reload_emoji`
### Reload the instance's custom emoji
* Method `POST`
* Authentication: required
* Params: None
* Response: JSON, "ok" and 200 status

View file

@@ -21,7 +21,8 @@ Adding the parameter `with_muted=true` to the timeline queries will also return
Has these additional fields under the `pleroma` object:
- `local`: true if the post was made on the local instance
- `conversation_id`: the ID of the AP context the status is associated with (if any)
- `direct_conversation_id`: the ID of the Mastodon direct message conversation the status is associated with (if any)
- `in_reply_to_account_acct`: the `acct` property of User entity for replied user (if any)
- `content`: a map consisting of alternate representations of the `content` property with the key being its mimetype. Currently the only alternate representation supported is `text/plain`
- `spoiler_text`: a map consisting of alternate representations of the `spoiler_text` property with the key being its mimetype. Currently the only alternate representation supported is `text/plain`
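A sketch of how a client could pick up the new `direct_conversation_id` right after posting a direct message; `base_url`, `token` and the recipient are placeholders, and `HTTPoison`/`Jason` are assumptions rather than anything implied by the API:

```elixir
headers = [
  {"authorization", "Bearer " <> token},
  {"content-type", "application/json"}
]

# Post a direct status...
body = Jason.encode!(%{"status" => "@lain hi!", "visibility" => "direct"})

%HTTPoison.Response{body: resp} =
  HTTPoison.post!(base_url <> "/api/v1/statuses", body, headers)

# ...and read the Mastodon-style conversation id from the `pleroma` extension.
resp
|> Jason.decode!()
|> get_in(["pleroma", "direct_conversation_id"])
```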
@@ -50,6 +51,8 @@ Has these additional fields under the `pleroma` object:
- `confirmation_pending`: boolean, true if a new user account is waiting on email confirmation to be activated
- `hide_followers`: boolean, true when the user has follower hiding enabled
- `hide_follows`: boolean, true when the user has follow hiding enabled
- `hide_followers_count`: boolean, true when the user has follower stat hiding enabled
- `hide_follows_count`: boolean, true when the user has follow stat hiding enabled
- `settings_store`: A generic map of settings for frontends. Opaque to the backend. Only returned in `verify_credentials` and `update_credentials`
- `chat_token`: The token needed for Pleroma chat. Only returned in `verify_credentials`
- `deactivated`: boolean, true when the user is deactivated
@@ -112,6 +115,8 @@ Additional parameters can be added to the JSON body/Form data:
- `no_rich_text` - if true, html tags are stripped from all statuses requested from the API
- `hide_followers` - if true, user's followers will be hidden
- `hide_follows` - if true, user's follows will be hidden
- `hide_followers_count` - if true, user's follower count will be hidden
- `hide_follows_count` - if true, user's follow count will be hidden
- `hide_favorites` - if true, user's favorites timeline will be hidden
- `show_role` - if true, user's role (e.g admin, moderator) will be exposed to anyone in the API
- `default_scope` - the scope returned under `privacy` key in Source subentity
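As an illustrative sketch, hiding both counters via `PATCH /api/v1/accounts/update_credentials` (form-encoded here; `HTTPoison` and the `base_url`/`token` bindings are assumptions):

```elixir
# Hide follow/follower counts for the authenticated account.
HTTPoison.patch!(
  base_url <> "/api/v1/accounts/update_credentials",
  {:form, [hide_follows_count: "true", hide_followers_count: "true"]},
  [{"authorization", "Bearer " <> token}]
)
```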

View file

@@ -365,3 +365,68 @@ The status posting endpoint takes an additional parameter, `in_reply_to_conversa
* Params:
* `recipients`: A list of ids of users that should receive posts to this conversation. This will replace the current list of recipients, so submit the full list. The owner of the conversation will always be part of the set of recipients, though.
* Response: JSON, statuses (200 - healthy, 503 unhealthy)
## `GET /api/pleroma/emoji/packs`
### Lists the custom emoji packs on the server
* Method `GET`
* Authentication: not required
* Params: None
* Response: JSON, "ok" and 200 status and the JSON hashmap of "pack name" to "pack contents"
## `PUT /api/pleroma/emoji/packs/:name`
### Creates an empty custom emoji pack
* Method `PUT`
* Authentication: required
* Params: None
* Response: JSON, "ok" and 200 status or 409 if the pack with that name already exists
## `DELETE /api/pleroma/emoji/packs/:name`
### Delete a custom emoji pack
* Method `DELETE`
* Authentication: required
* Params: None
* Response: JSON, "ok" and 200 status or 500 if there was an error deleting the pack
## `POST /api/pleroma/emoji/packs/:name/update_file`
### Update a file in a custom emoji pack
* Method `POST`
* Authentication: required
* Params:
* if the `action` is `add`, adds an emoji named `shortcode` to the pack `pack_name`,
that means that the emoji file needs to be uploaded with the request
(thus requiring it to be a multipart request) and be named `file`.
There can also be an optional `filename` that will be the new emoji file name
(if it's not there, the name will be taken from the uploaded file).
* if the `action` is `update`, changes emoji shortcode
(from `shortcode` to `new_shortcode`) or moves the file (from the current filename to `new_filename`)
* if the `action` is `remove`, removes the emoji named `shortcode` and its associated file
* Response: JSON, updated "files" section of the pack and 200 status, 409 if trying to use a shortcode
that is already taken, 400 if there was an error with the shortcode, filename or file (additional info
in the "error" part of the response JSON)
## `POST /api/pleroma/emoji/packs/:name/update_metadata`
### Updates (replaces) pack metadata
* Method `POST`
* Authentication: required
* Params:
* `new_data`: new metadata to replace the old one
* Response: JSON, updated "metadata" section of the pack and 200 status or 400 if there was a
problem with the new metadata (the error is specified in the "error" part of the response JSON)
## `POST /api/pleroma/emoji/packs/download_from`
### Requests the instance to download the pack from another instance
* Method `POST`
* Authentication: required
* Params:
* `instance_address`: the address of the instance to download from
* `pack_name`: the pack to download from that instance
* Response: JSON, "ok" and 200 status if the pack was downloaded, or 500 if there were
errors downloading the pack
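A sketch of the request (instance address and pack name are made-up values; `HTTPoison`/`Jason` and the `base_url`/`admin_token` bindings are assumptions):

```elixir
# Ask the local instance to fetch the shared pack "mypack" from a remote instance.
HTTPoison.post!(
  base_url <> "/api/pleroma/emoji/packs/download_from",
  Jason.encode!(%{"instance_address" => "https://example.social", "pack_name" => "mypack"}),
  [{"authorization", "Bearer " <> admin_token}, {"content-type", "application/json"}]
)
```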
## `GET /api/pleroma/emoji/packs/:name/download_shared`
### Requests a local pack from the instance
* Method `GET`
* Authentication: not required
* Params: None
* Response: the archive of the pack with a 200 status code, 403 if the pack is not set as shared,
404 if the pack does not exist

View file

@@ -39,7 +39,7 @@ Feel free to contact us to be added to this list!
### Nekonium
- Homepage: [F-Droid Repository](https://repo.gdgd.jp.net/), [Google Play](https://play.google.com/store/apps/details?id=com.apps.nekonium), [Amazon](https://www.amazon.co.jp/dp/B076FXPRBC/)
- Source: <https://gogs.gdgd.jp.net/lin/nekonium>
- Contact: [@lin@pleroma.gdgd.jp.net](https://pleroma.gdgd.jp.net/users/lin)
- Platforms: Android
- Features: Streaming Ready
@@ -67,7 +67,7 @@ Feel free to contact us to be added to this list!
## Alternative Web Interfaces
### Brutaldon
- Homepage: <https://jfm.carcosa.net/projects/software/brutaldon/>
- Source Code: <https://git.carcosa.net/jmcbray/brutaldon>
- Contact: [@gcupc@glitch.social](https://glitch.social/users/gcupc)
- Features: No Streaming

View file

@@ -23,6 +23,7 @@ Note: `strip_exif` has been replaced by `Pleroma.Upload.Filter.Mogrify`.
* `truncated_namespace`: If you use S3 compatible service such as Digital Ocean Spaces or CDN, set folder name or "" etc.
For example, when using CDN to S3 virtual host format, set "".
At this time, write CNAME to CDN in public_endpoint.
* `streaming_enabled`: Enable streaming uploads; when enabled, the file will be sent to the server in chunks as it's being read. This may be unsupported by some providers, try disabling this if you have upload problems.
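For reference, a config sketch with streaming enabled; the bucket name is a placeholder, the rest mirrors the defaults shown in `config.exs`:

```elixir
config :pleroma, Pleroma.Uploaders.S3,
  bucket: "my-pleroma-uploads",
  streaming_enabled: true,
  public_endpoint: "https://s3.amazonaws.com"
```

If your S3-compatible provider rejects chunked uploads, setting `streaming_enabled: false` restores the previous whole-file behaviour.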
## Pleroma.Upload.Filter.Mogrify
@@ -521,7 +522,7 @@ config :auto_linker,
class: false,
strip_prefix: false,
new_window: false,
rel: "ugc"
]
```
@@ -707,6 +708,8 @@ Configure OAuth 2 provider capabilities:
* `pack_extensions`: A list of file extensions for emojis, when no emoji.txt for a pack is present. Example `[".png", ".gif"]`
* `groups`: Emojis are ordered in groups (tags). This is an array of key-value pairs where the key is the groupname and the value the location or array of locations. `*` can be used as a wildcard. Example `[Custom: ["/emoji/*.png", "/emoji/custom/*.png"]]`
* `default_manifest`: Location of the JSON-manifest. This manifest contains information about the emoji-packs you can download. Currently only one manifest can be added (no arrays).
* `shared_pack_cache_seconds_per_file`: When an emoji pack is shared, the archive is created and cached in
memory for this amount of seconds multiplied by the number of files.
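A config sketch matching the default shown in `config.exs` (60 seconds per file):

```elixir
config :pleroma, :emoji,
  default_manifest: "https://git.pleroma.social/pleroma/emoji-index/raw/master/index.json",
  shared_pack_cache_seconds_per_file: 60
```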
## Database options

View file

@@ -215,7 +215,9 @@
]}
]},
%% If you want dual stack, you have to clone this entire config stanza
%% and change the bind to "::"
{ {5222, "0.0.0.0"}, ejabberd_c2s, [
%%
%% If TLS is compiled in and you installed a SSL
@@ -246,7 +248,9 @@
%% {max_stanza_size, 65536}
%% ]},
%% If you want dual stack, you have to clone this entire config stanza
%% and change the bind to "::"
{ {5269, "0.0.0.0"}, ejabberd_s2s_in, [
{shaper, s2s_shaper},
{max_stanza_size, 131072},
{protocol_options, ["no_sslv3"]}

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Pleroma do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Database do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-onl
defmodule Mix.Tasks.Pleroma.Ecto do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-onl
defmodule Mix.Tasks.Pleroma.Ecto.Migrate do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-onl
defmodule Mix.Tasks.Pleroma.Ecto.Rollback do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Emoji do
@@ -235,7 +235,7 @@ def run(["gen-pack", src]) do
cwd: tmp_pack_dir
)
emoji_map = Pleroma.Emoji.Loader.make_shortcode_to_file_map(tmp_pack_dir, exts)
File.write!(files_name, Jason.encode!(emoji_map, pretty: true))

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Instance do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Relay do

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Uploads do

View file

@@ -1,10 +1,9 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.User do
use Mix.Task
import Ecto.Changeset
import Mix.Pleroma
alias Pleroma.User
alias Pleroma.UserInviteToken
@@ -228,9 +227,9 @@ def run(["unsubscribe", nickname]) do
shell_info("Deactivating #{user.nickname}")
User.deactivate(user)
user
|> User.get_friends()
|> Enum.each(fn friend ->
user = User.get_cached_by_id(user.id)
shell_info("Unsubscribing #{friend.nickname} from #{user.nickname}")
@@ -405,7 +404,7 @@ def run(["delete_activities", nickname]) do
start_pleroma()
with %User{local: true} = user <- User.get_cached_by_nickname(nickname) do
User.delete_user_activities(user)
shell_info("User #{nickname} statuses deleted.")
else
_ ->
@@ -443,39 +442,21 @@ def run(["sign_out", nickname]) do
end
defp set_moderator(user, value) do
{:ok, user} = User.update_info(user, &User.Info.admin_api_update(&1, %{is_moderator: value}))
shell_info("Moderator status of #{user.nickname}: #{user.info.is_moderator}")
user
end
defp set_admin(user, value) do
{:ok, user} = User.update_info(user, &User.Info.admin_api_update(&1, %{is_admin: value}))
shell_info("Admin status of #{user.nickname}: #{user.info.is_admin}")
user
end
defp set_locked(user, value) do
{:ok, user} = User.update_info(user, &User.Info.user_upgrade(&1, %{locked: value}))
shell_info("Locked status of #{user.nickname}: #{user.info.locked}")
user

View file

@@ -21,7 +21,7 @@ defmodule Pleroma.Activity do
@type t :: %__MODULE__{}
@type actor :: String.t()
@primary_key {:id, FlakeId.Ecto.CompatType, autogenerate: true}
# https://github.com/tootsuite/mastodon/blob/master/app/models/notification.rb#L19
@mastodon_notification_types %{

View file

@@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Activity.Queries do

View file

@@ -7,7 +7,6 @@ defmodule Pleroma.ActivityExpiration do
alias Pleroma.Activity
alias Pleroma.ActivityExpiration
alias Pleroma.FlakeId
alias Pleroma.Repo
import Ecto.Changeset
@@ -17,7 +16,7 @@ defmodule Pleroma.ActivityExpiration do
@min_activity_lifetime :timer.hours(1)
schema "activity_expirations" do
belongs_to(:activity, Activity, type: FlakeId.Ecto.CompatType)
field(:scheduled_at, :naive_datetime)
end

View file

@@ -35,7 +35,6 @@ def start(_type, _args) do
Pleroma.Config.TransferTask,
Pleroma.Emoji,
Pleroma.Captcha,
Pleroma.FlakeId,
Pleroma.Daemons.ScheduledActivityDaemon,
Pleroma.Daemons.ActivityExpirationDaemon
] ++
@@ -43,23 +42,9 @@ def start(_type, _args) do
hackney_pool_children() ++
[
Pleroma.Stats,
{Oban, Pleroma.Config.get(Oban)}
] ++
task_children(@env) ++
oauth_cleanup_child(oauth_cleanup_enabled?()) ++
streamer_child(@env) ++
chat_child(@env, chat_enabled?()) ++
@@ -116,10 +101,14 @@ defp cachex_children do
build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000),
build_cachex("scrubber", limit: 2500),
build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500),
build_cachex("web_resp", limit: 2500),
build_cachex("emoji_packs", expiration: emoji_packs_expiration(), limit: 10)
]
end
defp emoji_packs_expiration,
do: expiration(default: :timer.seconds(5 * 60), interval: :timer.seconds(60))
defp idempotency_expiration,
do: expiration(default: :timer.seconds(6 * 60 * 60), interval: :timer.seconds(60))
@@ -163,4 +152,39 @@ defp hackney_pool_children do
:hackney_pool.child_spec(pool, options)
end
end
defp task_children(:test) do
[
%{
id: :web_push_init,
start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
restart: :temporary
},
%{
id: :federator_init,
start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
restart: :temporary
}
]
end
defp task_children(_) do
[
%{
id: :web_push_init,
start: {Task, :start_link, [&Pleroma.Web.Push.init/0]},
restart: :temporary
},
%{
id: :federator_init,
start: {Task, :start_link, [&Pleroma.Web.Federator.init/0]},
restart: :temporary
},
%{
id: :internal_fetch_init,
start: {Task, :start_link, [&Pleroma.Web.ActivityPub.InternalFetchActor.init/0]},
restart: :temporary
}
]
end
end

View file

@@ -10,20 +10,20 @@ defmodule Pleroma.Bookmark do
alias Pleroma.Activity
alias Pleroma.Bookmark
alias Pleroma.FlakeId
alias Pleroma.Repo
alias Pleroma.User
@type t :: %__MODULE__{}
schema "bookmarks" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:activity, Activity, type: FlakeId.Ecto.CompatType)
timestamps()
end
@spec create(FlakeId.Ecto.CompatType.t(), FlakeId.Ecto.CompatType.t()) ::
{:ok, Bookmark.t()} | {:error, Changeset.t()}
def create(user_id, activity_id) do
attrs = %{
user_id: user_id,
@@ -37,7 +37,7 @@ def create(user_id, activity_id) do
|> Repo.insert()
end
@spec for_user_query(FlakeId.Ecto.CompatType.t()) :: Ecto.Query.t()
def for_user_query(user_id) do
Bookmark
|> where(user_id: ^user_id)
@@ -52,7 +52,8 @@ def get(user_id, activity_id) do
|> Repo.one()
end
@spec destroy(FlakeId.Ecto.CompatType.t(), FlakeId.Ecto.CompatType.t()) ::
{:ok, Bookmark.t()} | {:error, Changeset.t()}
def destroy(user_id, activity_id) do
from(b in Bookmark,
where: b.user_id == ^user_id,

View file

@@ -6,4 +6,16 @@ defmodule Pleroma.Constants do
use Const
const(as_public, do: "https://www.w3.org/ns/activitystreams#Public")
const(object_internal_fields,
do: [
"likes",
"like_count",
"announcements",
"announcement_count",
"emoji",
"context_id",
"deleted_activity_id"
]
)
end

View file

@@ -13,10 +13,10 @@ defmodule Pleroma.Conversation.Participation do
import Ecto.Query
schema "conversation_participations" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:conversation, Conversation)
field(:read, :boolean, default: false)
field(:last_activity_id, FlakeId.Ecto.CompatType, virtual: true)
has_many(:recipient_ships, RecipientShip)
has_many(:recipients, through: [:recipient_ships, :user])

View file

@@ -12,7 +12,7 @@ defmodule Pleroma.Conversation.Participation.RecipientShip do
import Ecto.Changeset
schema "conversation_participation_recipient_ships" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:participation, Participation)
end

View file

@@ -6,7 +6,6 @@ defmodule Pleroma.Delivery do
use Ecto.Schema
alias Pleroma.Delivery
alias Pleroma.FlakeId
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User
@@ -16,7 +15,7 @@ defmodule Pleroma.Delivery do
import Ecto.Query
schema "deliveries" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:object, Object)
end

View file

@@ -23,7 +23,7 @@ def process(descriptions) do
IO.write(file, "#{group[:description]}\n")
for child <- group[:children] || [] do
print_child_header(file, child)
print_suggestions(file, child[:suggestions])
@@ -44,6 +44,17 @@ def process(descriptions) do
{:ok, config_path}
end
defp print_child_header(file, %{key: key, type: type, description: description} = _child) do
IO.write(
file,
"- `#{inspect(key)}` (`#{inspect(type)}`): #{description} \n"
)
end
defp print_child_header(file, %{key: key, type: type} = _child) do
IO.write(file, "- `#{inspect(key)}` (`#{inspect(type)}`) \n")
end
defp print_suggestion(file, suggestion) when is_list(suggestion) do
IO.write(file, " `#{inspect(suggestion)}`\n")
end
@@ -59,20 +70,19 @@ defp print_suggestion(file, suggestion, as_list \\ false) do
defp print_suggestions(_file, nil), do: nil
defp print_suggestions(_file, ""), do: nil
defp print_suggestions(file, suggestions) do
if length(suggestions) > 1 do
IO.write(file, "Suggestions:\n")
for suggestion <- suggestions do
print_suggestion(file, suggestion, true)
end
else
print_suggestion(file, List.first(suggestions))
end
end
end

View file

@@ -4,24 +4,37 @@
defmodule Pleroma.Emoji do
@moduledoc """
This GenServer stores in an ETS table the list of the loaded emojis,
and also allows to reload the list at runtime.
"""
use GenServer
alias Pleroma.Emoji.Loader
require Logger
@ets __MODULE__.Ets
@ets_options [
:ordered_set,
:protected,
:named_table,
{:read_concurrency, true}
]
defstruct [:code, :file, :tags, :safe_code, :safe_file]
@doc "Build emoji struct"
def build({code, file, tags}) do
%__MODULE__{
code: code,
file: file,
tags: tags,
safe_code: Pleroma.HTML.strip_tags(code),
safe_file: Pleroma.HTML.strip_tags(file)
}
end
def build({code, file}), do: build({code, file, []})
@doc false
def start_link(_) do
@@ -44,11 +57,14 @@ def get(name) do
end
@doc "Returns all the emojos!!"
@spec get_all() :: list({String.t(), String.t(), String.t()})
def get_all do
:ets.tab2list(@ets)
end
@doc "Clear out old emojis"
def clear_all, do: :ets.delete_all_objects(@ets)
@doc false
def init(_) do
@ets = :ets.new(@ets, @ets_options)
@@ -58,13 +74,13 @@ def init(_) do
@doc false
def handle_cast(:reload, state) do
update_emojis(Loader.load())
{:noreply, state}
end
@doc false
def handle_call(:reload, _from, state) do
update_emojis(Loader.load())
{:reply, :ok, state}
end
@@ -75,189 +91,11 @@ def terminate(_, _) do
@doc false
def code_change(_old_vsn, state, _extra) do
update_emojis(Loader.load())
{:ok, state}
end
defp update_emojis(emojis) do
:ets.insert(@ets, emojis)
defp load do
emoji_dir_path =
Path.join(
Pleroma.Config.get!([:instance, :static_dir]),
"emoji"
)
emoji_groups = Pleroma.Config.get([:emoji, :groups])
case File.ls(emoji_dir_path) do
{:error, :enoent} ->
# The custom emoji directory doesn't exist,
# don't do anything
nil
{:error, e} ->
# There was some other error
Logger.error("Could not access the custom emoji directory #{emoji_dir_path}: #{e}")
{:ok, results} ->
grouped =
Enum.group_by(results, fn file -> File.dir?(Path.join(emoji_dir_path, file)) end)
packs = grouped[true] || []
files = grouped[false] || []
# Print the packs we've found
Logger.info("Found emoji packs: #{Enum.join(packs, ", ")}")
if not Enum.empty?(files) do
Logger.warn(
"Found files in the emoji folder. These will be ignored, please move them to a subdirectory\nFound files: #{
Enum.join(files, ", ")
}"
)
end
emojis =
Enum.flat_map(
packs,
fn pack -> load_pack(Path.join(emoji_dir_path, pack), emoji_groups) end
)
true = :ets.insert(@ets, emojis)
end
# Compat thing for old custom emoji handling & default emoji,
# it should run even if there are no emoji packs
shortcode_globs = Pleroma.Config.get([:emoji, :shortcode_globs], [])
emojis =
(load_from_file("config/emoji.txt", emoji_groups) ++
load_from_file("config/custom_emoji.txt", emoji_groups) ++
load_from_globs(shortcode_globs, emoji_groups))
|> Enum.reject(fn value -> value == nil end)
true = :ets.insert(@ets, emojis)
:ok
end
defp load_pack(pack_dir, emoji_groups) do
pack_name = Path.basename(pack_dir)
emoji_txt = Path.join(pack_dir, "emoji.txt")
if File.exists?(emoji_txt) do
load_from_file(emoji_txt, emoji_groups)
else
extensions = Pleroma.Config.get([:emoji, :pack_extensions])
Logger.info(
"No emoji.txt found for pack \"#{pack_name}\", assuming all #{Enum.join(extensions, ", ")} files are emoji"
)
make_shortcode_to_file_map(pack_dir, extensions)
|> Enum.map(fn {shortcode, rel_file} ->
filename = Path.join("/emoji/#{pack_name}", rel_file)
{shortcode, filename, [to_string(match_extra(emoji_groups, filename))]}
end)
end
end
def make_shortcode_to_file_map(pack_dir, exts) do
find_all_emoji(pack_dir, exts)
|> Enum.map(&Path.relative_to(&1, pack_dir))
|> Enum.map(fn f -> {f |> Path.basename() |> Path.rootname(), f} end)
|> Enum.into(%{})
end
def find_all_emoji(dir, exts) do
Enum.reduce(
File.ls!(dir),
[],
fn f, acc ->
filepath = Path.join(dir, f)
if File.dir?(filepath) do
acc ++ find_all_emoji(filepath, exts)
else
acc ++ [filepath]
end
end
)
|> Enum.filter(fn f -> Path.extname(f) in exts end)
end
defp load_from_file(file, emoji_groups) do
if File.exists?(file) do
load_from_file_stream(File.stream!(file), emoji_groups)
else
[]
end
end
defp load_from_file_stream(stream, emoji_groups) do
stream
|> Stream.map(&String.trim/1)
|> Stream.map(fn line ->
case String.split(line, ~r/,\s*/) do
[name, file] ->
{name, file, [to_string(match_extra(emoji_groups, file))]}
[name, file | tags] ->
{name, file, tags}
_ ->
nil
end
end)
|> Enum.to_list()
end
defp load_from_globs(globs, emoji_groups) do
static_path = Path.join(:code.priv_dir(:pleroma), "static")
paths =
Enum.map(globs, fn glob ->
Path.join(static_path, glob)
|> Path.wildcard()
end)
|> Enum.concat()
Enum.map(paths, fn path ->
tag = match_extra(emoji_groups, Path.join("/", Path.relative_to(path, static_path)))
shortcode = Path.basename(path, Path.extname(path))
external_path = Path.join("/", Path.relative_to(path, static_path))
{shortcode, external_path, [to_string(tag)]}
end)
end
@doc """
Finds a matching group for the given emoji filename
"""
@spec match_extra(group_patterns(), String.t()) :: atom() | nil
def match_extra(group_patterns, filename) do
match_group_patterns(group_patterns, fn pattern ->
case pattern do
%Regex{} = regex -> Regex.match?(regex, filename)
string when is_binary(string) -> filename == string
end
end)
end
defp match_group_patterns(group_patterns, matcher) do
Enum.find_value(group_patterns, fn {group, patterns} ->
patterns =
patterns
|> List.wrap()
|> Enum.map(fn pattern ->
if String.contains?(pattern, "*") do
~r(#{String.replace(pattern, "*", ".*")})
else
pattern
end
end)
Enum.any?(patterns, matcher) && group
end)
end end
end end

@ -0,0 +1,59 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Emoji.Formatter do
alias Pleroma.Emoji
alias Pleroma.HTML
alias Pleroma.Web.MediaProxy
def emojify(text) do
emojify(text, Emoji.get_all())
end
def emojify(text, nil), do: text
def emojify(text, emoji, strip \\ false) do
Enum.reduce(emoji, text, fn
{_, %Emoji{safe_code: emoji, safe_file: file}}, text ->
String.replace(text, ":#{emoji}:", prepare_emoji_html(emoji, file, strip))
{unsafe_emoji, unsafe_file}, text ->
emoji = HTML.strip_tags(unsafe_emoji)
file = HTML.strip_tags(unsafe_file)
String.replace(text, ":#{emoji}:", prepare_emoji_html(emoji, file, strip))
end)
|> HTML.filter_tags()
end
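A hedged sketch of what emojify/3 does with the cache entries; the shortcode and file are illustrative, the markup comes from prepare_emoji_html/3 below, the src is rewritten through MediaProxy.url/1, and the whole result is passed through HTML.filter_tags/1:

```elixir
emoji = %{"firefox" => %Pleroma.Emoji{safe_code: "firefox", safe_file: "/emoji/Firefox.gif"}}

Pleroma.Emoji.Formatter.emojify("hello :firefox:", emoji)
# => roughly: "hello <img class='emoji' alt='firefox' title='firefox' src='…/emoji/Firefox.gif' />"
```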
defp prepare_emoji_html(_emoji, _file, true), do: ""
defp prepare_emoji_html(emoji, file, _strip) do
"<img class='emoji' alt='#{emoji}' title='#{emoji}' src='#{MediaProxy.url(file)}' />"
end
def demojify(text) do
emojify(text, Emoji.get_all(), true)
end
def demojify(text, nil), do: text
@doc "Outputs a list of the emoji-shortcodes in a text"
def get_emoji(text) when is_binary(text) do
Enum.filter(Emoji.get_all(), fn {emoji, %Emoji{}} ->
String.contains?(text, ":#{emoji}:")
end)
end
def get_emoji(_), do: []
@doc "Outputs a list of the emoji-Maps in a text"
def get_emoji_map(text) when is_binary(text) do
get_emoji(text)
|> Enum.reduce(%{}, fn {name, %Emoji{file: file}}, acc ->
Map.put(acc, name, "#{Pleroma.Web.Endpoint.static_url()}#{file}")
end)
end
def get_emoji_map(_), do: []
end
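And a sketch of the shortcode helpers with illustrative data; get_emoji_map/1 keys are the shortcodes found in the text and values are URLs built from the endpoint's static URL:

```elixir
text = "custom :firefox: reaction"

Pleroma.Emoji.Formatter.get_emoji(text)
# => [{"firefox", %Pleroma.Emoji{file: "/emoji/Firefox.gif", ...}}]

Pleroma.Emoji.Formatter.get_emoji_map(text)
# => %{"firefox" => "https://example.instance/emoji/Firefox.gif"}
```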

lib/pleroma/emoji/loader.ex Normal file
@ -0,0 +1,224 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Emoji.Loader do
@moduledoc """
Loads emoji from:
* emoji packs in INSTANCE-DIR/emoji
* the files: `config/emoji.txt` and `config/custom_emoji.txt`
* glob paths; the nested folder name is used as the tag for grouping, e.g. `priv/static/emoji/custom/nested_folder`
"""
alias Pleroma.Config
alias Pleroma.Emoji
require Logger
@type pattern :: Regex.t() | module() | String.t()
@type patterns :: pattern() | [pattern()]
@type group_patterns :: keyword(patterns())
@type emoji :: {String.t(), Emoji.t()}
@doc """
Loads emojis from files/packs.
Returns a list of emoji in the format:
`{"000", "/emoji/freespeechextremist.com/000.png", ["Custom"]}`
"""
@spec load() :: list(emoji)
def load do
emoji_dir_path = Path.join(Config.get!([:instance, :static_dir]), "emoji")
emoji_groups = Config.get([:emoji, :groups])
emojis =
case File.ls(emoji_dir_path) do
{:error, :enoent} ->
# The custom emoji directory doesn't exist,
# don't do anything
[]
{:error, e} ->
# There was some other error
Logger.error("Could not access the custom emoji directory #{emoji_dir_path}: #{e}")
[]
{:ok, results} ->
grouped =
Enum.group_by(results, fn file ->
File.dir?(Path.join(emoji_dir_path, file))
end)
packs = grouped[true] || []
files = grouped[false] || []
# Print the packs we've found
Logger.info("Found emoji packs: #{Enum.join(packs, ", ")}")
if not Enum.empty?(files) do
Logger.warn(
"Found files in the emoji folder. These will be ignored, please move them to a subdirectory\nFound files: #{
Enum.join(files, ", ")
}"
)
end
emojis =
Enum.flat_map(packs, fn pack ->
load_pack(Path.join(emoji_dir_path, pack), emoji_groups)
end)
Emoji.clear_all()
emojis
end
# Compat thing for old custom emoji handling & default emoji,
# it should run even if there are no emoji packs
shortcode_globs = Config.get([:emoji, :shortcode_globs], [])
emojis_txt =
(load_from_file("config/emoji.txt", emoji_groups) ++
load_from_file("config/custom_emoji.txt", emoji_groups) ++
load_from_globs(shortcode_globs, emoji_groups))
|> Enum.reject(fn value -> value == nil end)
Enum.map(emojis ++ emojis_txt, &prepare_emoji/1)
end
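The loader reads a handful of `:emoji` configuration keys (`groups`, `shortcode_globs`, `pack_extensions`). A hedged sketch of what such a config file entry might look like; the values are illustrative, not defaults taken from this diff:

```elixir
config :pleroma, :emoji,
  # Filename patterns used by match_extra/2 below to pick a group/tag.
  groups: [
    Custom: ["/emoji/*.png", "/emoji/custom/*.png"]
  ],
  # Old-style glob loading; the nested folder becomes the tag.
  shortcode_globs: ["/emoji/custom/**/*.png"],
  # Extensions treated as emoji when a pack ships no pack.json or emoji.txt.
  pack_extensions: [".png", ".gif"]
```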
defp prepare_emoji({code, _, _} = emoji), do: {code, Emoji.build(emoji)}
defp load_pack(pack_dir, emoji_groups) do
pack_name = Path.basename(pack_dir)
pack_file = Path.join(pack_dir, "pack.json")
if File.exists?(pack_file) do
contents = Jason.decode!(File.read!(pack_file))
contents["files"]
|> Enum.map(fn {name, rel_file} ->
filename = Path.join("/emoji/#{pack_name}", rel_file)
{name, filename, ["pack:#{pack_name}"]}
end)
else
# Load from emoji.txt / all files
emoji_txt = Path.join(pack_dir, "emoji.txt")
if File.exists?(emoji_txt) do
load_from_file(emoji_txt, emoji_groups)
else
extensions = Pleroma.Config.get([:emoji, :pack_extensions])
Logger.info(
"No emoji.txt found for pack \"#{pack_name}\", assuming all #{
Enum.join(extensions, ", ")
} files are emoji"
)
make_shortcode_to_file_map(pack_dir, extensions)
|> Enum.map(fn {shortcode, rel_file} ->
filename = Path.join("/emoji/#{pack_name}", rel_file)
{shortcode, filename, [to_string(match_extra(emoji_groups, filename))]}
end)
end
end
end
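When a pack ships a pack.json, only its "files" map is consumed here (emoji name mapped to a file path relative to the pack directory); the entries below are illustrative:

```elixir
# Illustrative decoded pack.json; only the "files" map matters to load_pack/2.
%{
  "files" => %{
    "blank" => "blank.png",
    "dancing_blob" => "dancing_blob.gif"
  }
}
# Each entry becomes {"blank", "/emoji/<pack_name>/blank.png", ["pack:<pack_name>"]}.
```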
def make_shortcode_to_file_map(pack_dir, exts) do
find_all_emoji(pack_dir, exts)
|> Enum.map(&Path.relative_to(&1, pack_dir))
|> Enum.map(fn f -> {f |> Path.basename() |> Path.rootname(), f} end)
|> Enum.into(%{})
end
def find_all_emoji(dir, exts) do
dir
|> File.ls!()
|> Enum.flat_map(fn f ->
filepath = Path.join(dir, f)
if File.dir?(filepath) do
find_all_emoji(filepath, exts)
else
[filepath]
end
end)
|> Enum.filter(fn f -> Path.extname(f) in exts end)
end
defp load_from_file(file, emoji_groups) do
if File.exists?(file) do
load_from_file_stream(File.stream!(file), emoji_groups)
else
[]
end
end
defp load_from_file_stream(stream, emoji_groups) do
stream
|> Stream.map(&String.trim/1)
|> Stream.map(fn line ->
case String.split(line, ~r/,\s*/) do
[name, file] ->
{name, file, [to_string(match_extra(emoji_groups, file))]}
[name, file | tags] ->
{name, file, tags}
_ ->
nil
end
end)
|> Enum.to_list()
end
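For the emoji.txt / custom_emoji.txt path, each line is `shortcode, file[, tag, ...]`; when no tags are given, the group is inferred via match_extra/2. An illustrative sketch of what load_from_file_stream/2 yields (the "Custom" group name assumes a matching pattern in the config):

```elixir
lines = [
  "firefox, /emoji/Firefox.gif, Gif, Fun",
  "blank, /emoji/blank.png"
]

# Roughly:
# [
#   {"firefox", "/emoji/Firefox.gif", ["Gif", "Fun"]},
#   {"blank", "/emoji/blank.png", ["Custom"]}
# ]
# Malformed lines parse to nil and are rejected by the caller.
```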
defp load_from_globs(globs, emoji_groups) do
static_path = Path.join(:code.priv_dir(:pleroma), "static")
paths =
Enum.map(globs, fn glob ->
Path.join(static_path, glob)
|> Path.wildcard()
end)
|> Enum.concat()
Enum.map(paths, fn path ->
tag = match_extra(emoji_groups, Path.join("/", Path.relative_to(path, static_path)))
shortcode = Path.basename(path, Path.extname(path))
external_path = Path.join("/", Path.relative_to(path, static_path))
{shortcode, external_path, [to_string(tag)]}
end)
end
@doc """
Finds a matching group for the given emoji filename
"""
@spec match_extra(group_patterns(), String.t()) :: atom() | nil
def match_extra(group_patterns, filename) do
match_group_patterns(group_patterns, fn pattern ->
case pattern do
%Regex{} = regex -> Regex.match?(regex, filename)
string when is_binary(string) -> filename == string
end
end)
end
defp match_group_patterns(group_patterns, matcher) do
Enum.find_value(group_patterns, fn {group, patterns} ->
patterns =
patterns
|> List.wrap()
|> Enum.map(fn pattern ->
if String.contains?(pattern, "*") do
~r(#{String.replace(pattern, "*", ".*")})
else
pattern
end
end)
Enum.any?(patterns, matcher) && group
end)
end
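A small sketch of the group matching: patterns may be exact strings, regexes, or `*` globs (compiled to regexes above), and the first group whose pattern matches the filename wins. Paths below are illustrative:

```elixir
groups = [Custom: ["/emoji/custom/*.png"], Finmoji: "/finmoji/128px/a_trusted_friend-128.png"]

Pleroma.Emoji.Loader.match_extra(groups, "/emoji/custom/blank.png")
# => :Custom

Pleroma.Emoji.Loader.match_extra(groups, "/emoji/unmatched.png")
# => nil
```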
end

@ -12,7 +12,7 @@ defmodule Pleroma.Filter do
alias Pleroma.User alias Pleroma.User
schema "filters" do schema "filters" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:filter_id, :integer) field(:filter_id, :integer)
field(:hide, :boolean, default: false) field(:hide, :boolean, default: false)
field(:whole_word, :boolean, default: true) field(:whole_word, :boolean, default: true)

@ -1,182 +0,0 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.FlakeId do
@moduledoc """
Flake is a decentralized, k-ordered id generation service.
Adapted from:
* [flaky](https://github.com/nirvana/flaky), released under the terms of the Truly Free License,
* [Flake](https://github.com/boundary/flake), Copyright 2012, Boundary, Apache License, Version 2.0
"""
@type t :: binary
@behaviour Ecto.Type
use GenServer
require Logger
alias __MODULE__
import Kernel, except: [to_string: 1]
defstruct node: nil, time: 0, sq: 0
@doc "Converts a binary Flake to a String"
def to_string(<<0::integer-size(64), id::integer-size(64)>>) do
Kernel.to_string(id)
end
def to_string(<<_::integer-size(64), _::integer-size(48), _::integer-size(16)>> = flake) do
encode_base62(flake)
end
def to_string(s), do: s
def from_string(int) when is_integer(int) do
from_string(Kernel.to_string(int))
end
for i <- [-1, 0] do
def from_string(unquote(i)), do: <<0::integer-size(128)>>
def from_string(unquote(Kernel.to_string(i))), do: <<0::integer-size(128)>>
end
def from_string(<<_::integer-size(128)>> = flake), do: flake
def from_string(string) when is_binary(string) and byte_size(string) < 18 do
case Integer.parse(string) do
{id, ""} -> <<0::integer-size(64), id::integer-size(64)>>
_ -> nil
end
end
def from_string(string) do
string |> decode_base62 |> from_integer
end
def to_integer(<<integer::integer-size(128)>>), do: integer
def from_integer(integer) do
<<_time::integer-size(64), _node::integer-size(48), _seq::integer-size(16)>> =
<<integer::integer-size(128)>>
end
@doc "Generates a Flake"
@spec get :: binary
def get, do: to_string(:gen_server.call(:flake, :get))
# Checks that the ID is a valid FlakeId
@spec is_flake_id?(String.t()) :: boolean
def is_flake_id?(id), do: is_flake_id?(String.to_charlist(id), true)
defp is_flake_id?([c | cs], true) when c >= ?0 and c <= ?9, do: is_flake_id?(cs, true)
defp is_flake_id?([c | cs], true) when c >= ?A and c <= ?Z, do: is_flake_id?(cs, true)
defp is_flake_id?([c | cs], true) when c >= ?a and c <= ?z, do: is_flake_id?(cs, true)
defp is_flake_id?([], true), do: true
defp is_flake_id?(_, _), do: false
# -- Ecto.Type API
@impl Ecto.Type
def type, do: :uuid
@impl Ecto.Type
def cast(value) do
{:ok, FlakeId.to_string(value)}
end
@impl Ecto.Type
def load(value) do
{:ok, FlakeId.to_string(value)}
end
@impl Ecto.Type
def dump(value) do
{:ok, FlakeId.from_string(value)}
end
def autogenerate, do: get()
# -- GenServer API
def start_link(_) do
:gen_server.start_link({:local, :flake}, __MODULE__, [], [])
end
@impl GenServer
def init([]) do
{:ok, %FlakeId{node: worker_id(), time: time()}}
end
@impl GenServer
def handle_call(:get, _from, state) do
{flake, new_state} = get(time(), state)
{:reply, flake, new_state}
end
# Matches when the calling time is the same as the state time. Incr. sq
defp get(time, %FlakeId{time: time, node: node, sq: seq}) do
new_state = %FlakeId{time: time, node: node, sq: seq + 1}
{gen_flake(new_state), new_state}
end
# Matches when the times are different, reset sq
defp get(newtime, %FlakeId{time: time, node: node}) when newtime > time do
new_state = %FlakeId{time: newtime, node: node, sq: 0}
{gen_flake(new_state), new_state}
end
# Error when clock is running backwards
defp get(newtime, %FlakeId{time: time}) when newtime < time do
{:error, :clock_running_backwards}
end
defp gen_flake(%FlakeId{time: time, node: node, sq: seq}) do
<<time::integer-size(64), node::integer-size(48), seq::integer-size(16)>>
end
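For context on the module being removed here: a flake is a 128-bit binary laid out as 64 bits of timestamp, 48 bits of worker id and 16 bits of per-millisecond sequence, exactly as gen_flake/1 above constructs it. A quick illustrative decomposition (values made up):

```elixir
<<time::integer-size(64), node::integer-size(48), seq::integer-size(16)>> =
  <<1_569_000_000_000::integer-size(64), 42::integer-size(48), 7::integer-size(16)>>

# time => 1_569_000_000_000, node => 42, seq => 7; 128 bits in total,
# which to_string/1 then renders in base62 (or as a plain integer for old IDs).
```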
defp nthchar_base62(n) when n <= 9, do: ?0 + n
defp nthchar_base62(n) when n <= 35, do: ?A + n - 10
defp nthchar_base62(n), do: ?a + n - 36
defp encode_base62(<<integer::integer-size(128)>>) do
integer
|> encode_base62([])
|> List.to_string()
end
defp encode_base62(int, acc) when int < 0, do: encode_base62(-int, acc)
defp encode_base62(int, []) when int == 0, do: '0'
defp encode_base62(int, acc) when int == 0, do: acc
defp encode_base62(int, acc) do
r = rem(int, 62)
id = div(int, 62)
acc = [nthchar_base62(r) | acc]
encode_base62(id, acc)
end
defp decode_base62(s) do
decode_base62(String.to_charlist(s), 0)
end
defp decode_base62([c | cs], acc) when c >= ?0 and c <= ?9,
do: decode_base62(cs, 62 * acc + (c - ?0))
defp decode_base62([c | cs], acc) when c >= ?A and c <= ?Z,
do: decode_base62(cs, 62 * acc + (c - ?A + 10))
defp decode_base62([c | cs], acc) when c >= ?a and c <= ?z,
do: decode_base62(cs, 62 * acc + (c - ?a + 36))
defp decode_base62([], acc), do: acc
defp time do
{mega_seconds, seconds, micro_seconds} = :erlang.timestamp()
1_000_000_000 * mega_seconds + seconds * 1000 + :erlang.trunc(micro_seconds / 1000)
end
defp worker_id do
<<worker::integer-size(48)>> = :crypto.strong_rand_bytes(6)
worker
end
end

@ -3,10 +3,8 @@
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Formatter do defmodule Pleroma.Formatter do
alias Pleroma.Emoji
alias Pleroma.HTML alias Pleroma.HTML
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web.MediaProxy
@safe_mention_regex ~r/^(\s*(?<mentions>(@.+?\s+){1,})+)(?<rest>.*)/s @safe_mention_regex ~r/^(\s*(?<mentions>(@.+?\s+){1,})+)(?<rest>.*)/s
@link_regex ~r"((?:http(s)?:\/\/)?[\w.-]+(?:\.[\w\.-]+)+[\w\-\._~%:/?#[\]@!\$&'\(\)\*\+,;=.]+)|[0-9a-z+\-\.]+:[0-9a-z$-_.+!*'(),]+"ui @link_regex ~r"((?:http(s)?:\/\/)?[\w.-]+(?:\.[\w\.-]+)+[\w\-\._~%:/?#[\]@!\$&'\(\)\*\+,;=.]+)|[0-9a-z+\-\.]+:[0-9a-z$-_.+!*'(),]+"ui
@ -36,9 +34,9 @@ def mention_handler("@" <> nickname, buffer, opts, acc) do
nickname_text = get_nickname_text(nickname, opts) nickname_text = get_nickname_text(nickname, opts)
link = link =
"<span class='h-card'><a data-user='#{id}' class='u-url mention' href='#{ap_id}'>@<span>#{ ~s(<span class="h-card"><a data-user="#{id}" class="u-url mention" href="#{ap_id}" rel="ugc">@<span>#{
nickname_text nickname_text
}</span></a></span>" }</span></a></span>)
{link, %{acc | mentions: MapSet.put(acc.mentions, {"@" <> nickname, user})}} {link, %{acc | mentions: MapSet.put(acc.mentions, {"@" <> nickname, user})}}
@ -50,7 +48,7 @@ def mention_handler("@" <> nickname, buffer, opts, acc) do
def hashtag_handler("#" <> tag = tag_text, _buffer, _opts, acc) do def hashtag_handler("#" <> tag = tag_text, _buffer, _opts, acc) do
tag = String.downcase(tag) tag = String.downcase(tag)
url = "#{Pleroma.Web.base_url()}/tag/#{tag}" url = "#{Pleroma.Web.base_url()}/tag/#{tag}"
link = "<a class='hashtag' data-tag='#{tag}' href='#{url}' rel='tag'>#{tag_text}</a>" link = ~s(<a class="hashtag" data-tag="#{tag}" href="#{url}" rel="tag ugc">#{tag_text}</a>)
{link, %{acc | tags: MapSet.put(acc.tags, {tag_text, tag})}} {link, %{acc | tags: MapSet.put(acc.tags, {tag_text, tag})}}
end end
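The net effect of this hunk is double-quoted attributes plus `ugc` in the `rel` of generated links. Illustratively (base URL and tag are made up), a hashtag now renders roughly as:

```elixir
expected =
  ~s(<a class="hashtag" data-tag="nextcloud" href="https://example.instance/tag/nextcloud" rel="tag ugc">#nextcloud</a>)
```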
@ -100,51 +98,6 @@ def mentions_escape(text, options \\ []) do
end end
end end
def emojify(text) do
emojify(text, Emoji.get_all())
end
def emojify(text, nil), do: text
def emojify(text, emoji, strip \\ false) do
Enum.reduce(emoji, text, fn emoji_data, text ->
emoji = HTML.strip_tags(elem(emoji_data, 0))
file = HTML.strip_tags(elem(emoji_data, 1))
html =
if not strip do
"<img class='emoji' alt='#{emoji}' title='#{emoji}' src='#{MediaProxy.url(file)}' />"
else
""
end
String.replace(text, ":#{emoji}:", html) |> HTML.filter_tags()
end)
end
def demojify(text) do
emojify(text, Emoji.get_all(), true)
end
def demojify(text, nil), do: text
@doc "Outputs a list of the emoji-shortcodes in a text"
def get_emoji(text) when is_binary(text) do
Enum.filter(Emoji.get_all(), fn {emoji, _, _} -> String.contains?(text, ":#{emoji}:") end)
end
def get_emoji(_), do: []
@doc "Outputs a list of the emoji-Maps in a text"
def get_emoji_map(text) when is_binary(text) do
get_emoji(text)
|> Enum.reduce(%{}, fn {name, file, _group}, acc ->
Map.put(acc, name, "#{Pleroma.Web.Endpoint.static_url()}#{file}")
end)
end
def get_emoji_map(_), do: []
def html_escape({text, mentions, hashtags}, type) do def html_escape({text, mentions, hashtags}, type) do
{html_escape(text, type), mentions, hashtags} {html_escape(text, type), mentions, hashtags}
end end

@ -184,7 +184,8 @@ defmodule Pleroma.HTML.Scrubber.Default do
"tag", "tag",
"nofollow", "nofollow",
"noopener", "noopener",
"noreferrer" "noreferrer",
"ugc"
]) ])
Meta.allow_tag_with_these_attributes("a", ["name", "title"]) Meta.allow_tag_with_these_attributes("a", ["name", "title"])
@ -304,7 +305,8 @@ defmodule Pleroma.HTML.Scrubber.LinksOnly do
"nofollow", "nofollow",
"noopener", "noopener",
"noreferrer", "noreferrer",
"me" "me",
"ugc"
]) ])
Meta.allow_tag_with_these_attributes("a", ["name", "title"]) Meta.allow_tag_with_these_attributes("a", ["name", "title"])

@ -13,7 +13,7 @@ defmodule Pleroma.List do
alias Pleroma.User alias Pleroma.User
schema "lists" do schema "lists" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:title, :string) field(:title, :string)
field(:following, {:array, :string}, default: []) field(:following, {:array, :string}, default: [])
field(:ap_id, :string) field(:ap_id, :string)

@ -22,8 +22,8 @@ defmodule Pleroma.Notification do
schema "notifications" do schema "notifications" do
field(:seen, :boolean, default: false) field(:seen, :boolean, default: false)
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:activity, Activity, type: Pleroma.FlakeId) belongs_to(:activity, Activity, type: FlakeId.Ecto.CompatType)
timestamps() timestamps()
end end

@ -38,6 +38,24 @@ def change(struct, params \\ %{}) do
def get_by_id(nil), do: nil def get_by_id(nil), do: nil
def get_by_id(id), do: Repo.get(Object, id) def get_by_id(id), do: Repo.get(Object, id)
def get_by_id_and_maybe_refetch(id, opts \\ []) do
%{updated_at: updated_at} = object = get_by_id(id)
if opts[:interval] &&
NaiveDateTime.diff(NaiveDateTime.utc_now(), updated_at) > opts[:interval] do
case Fetcher.refetch_object(object) do
{:ok, %Object{} = object} ->
object
e ->
Logger.error("Couldn't refresh #{object.data["id"]}:\n#{inspect(e)}")
object
end
else
object
end
end
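A hedged usage sketch of the new helper: pass `interval:` in seconds and the object is refetched when its `updated_at` is older than that; without the option it behaves like get_by_id/1. The poll-result refreshing mentioned in the changelog presumably builds on this. The id and interval are illustrative:

```elixir
# Refetch the remote object if our copy is older than 120 seconds.
object = Pleroma.Object.get_by_id_and_maybe_refetch(123, interval: 120)

# With no :interval option the stored object is returned as-is.
object = Pleroma.Object.get_by_id_and_maybe_refetch(123)
```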
def get_by_ap_id(nil), do: nil def get_by_ap_id(nil), do: nil
def get_by_ap_id(ap_id) do def get_by_ap_id(ap_id) do

@ -6,18 +6,40 @@ defmodule Pleroma.Object.Fetcher do
alias Pleroma.HTTP alias Pleroma.HTTP
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.Object.Containment alias Pleroma.Object.Containment
alias Pleroma.Repo
alias Pleroma.Signature alias Pleroma.Signature
alias Pleroma.Web.ActivityPub.InternalFetchActor alias Pleroma.Web.ActivityPub.InternalFetchActor
alias Pleroma.Web.ActivityPub.Transmogrifier alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.OStatus alias Pleroma.Web.OStatus
require Logger require Logger
require Pleroma.Constants
defp reinject_object(data) do defp touch_changeset(changeset) do
updated_at =
NaiveDateTime.utc_now()
|> NaiveDateTime.truncate(:second)
Ecto.Changeset.put_change(changeset, :updated_at, updated_at)
end
defp maybe_reinject_internal_fields(data, %{data: %{} = old_data}) do
internal_fields = Map.take(old_data, Pleroma.Constants.object_internal_fields())
Map.merge(data, internal_fields)
end
defp maybe_reinject_internal_fields(data, _), do: data
@spec reinject_object(struct(), map()) :: {:ok, Object.t()} | {:error, any()}
defp reinject_object(struct, data) do
Logger.debug("Reinjecting object #{data["id"]}") Logger.debug("Reinjecting object #{data["id"]}")
with data <- Transmogrifier.fix_object(data), with data <- Transmogrifier.fix_object(data),
{:ok, object} <- Object.create(data) do data <- maybe_reinject_internal_fields(data, struct),
changeset <- Object.change(struct, %{data: data}),
changeset <- touch_changeset(changeset),
{:ok, object} <- Repo.insert_or_update(changeset) do
{:ok, object} {:ok, object}
else else
e -> e ->
@ -26,55 +48,68 @@ defp reinject_object(data) do
end end
end end
def refetch_object(%Object{data: %{"id" => id}} = object) do
with {:local, false} <- {:local, String.starts_with?(id, Pleroma.Web.base_url() <> "/")},
{:ok, data} <- fetch_and_contain_remote_object_from_id(id),
{:ok, object} <- reinject_object(object, data) do
{:ok, object}
else
{:local, true} -> object
e -> {:error, e}
end
end
# TODO: # TODO:
# This will create a Create activity, which we need internally at the moment. # This will create a Create activity, which we need internally at the moment.
def fetch_object_from_id(id, options \\ []) do def fetch_object_from_id(id, options \\ []) do
if object = Object.get_cached_by_ap_id(id) do with {:fetch_object, nil} <- {:fetch_object, Object.get_cached_by_ap_id(id)},
{:fetch, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)},
{:normalize, nil} <- {:normalize, Object.normalize(data, false)},
params <- prepare_activity_params(data),
{:containment, :ok} <- {:containment, Containment.contain_origin(id, params)},
{:ok, activity} <- Transmogrifier.handle_incoming(params, options),
{:object, _data, %Object{} = object} <-
{:object, data, Object.normalize(activity, false)} do
{:ok, object} {:ok, object}
else else
Logger.info("Fetching #{id} via AP") {:containment, _} ->
{:error, "Object containment failed."}
with {:fetch, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)}, {:error, {:reject, nil}} ->
{:normalize, nil} <- {:normalize, Object.normalize(data, false)}, {:reject, nil}
params <- %{
"type" => "Create", {:object, data, nil} ->
"to" => data["to"], reinject_object(%Object{}, data)
"cc" => data["cc"],
# Should we seriously keep this attributedTo thing? {:normalize, object = %Object{}} ->
"actor" => data["actor"] || data["attributedTo"],
"object" => data
},
{:containment, :ok} <- {:containment, Containment.contain_origin(id, params)},
{:ok, activity} <- Transmogrifier.handle_incoming(params, options),
{:object, _data, %Object{} = object} <-
{:object, data, Object.normalize(activity, false)} do
{:ok, object} {:ok, object}
else
{:containment, _} ->
{:error, "Object containment failed."}
{:error, {:reject, nil}} -> {:fetch_object, %Object{} = object} ->
{:reject, nil} {:ok, object}
{:object, data, nil} -> _e ->
reinject_object(data) # Only fallback when receiving a fetch/normalization error with ActivityPub
Logger.info("Couldn't get object via AP, trying out OStatus fetching...")
{:normalize, object = %Object{}} -> # FIXME: OStatus Object Containment?
{:ok, object} case OStatus.fetch_activity_from_url(id) do
{:ok, [activity | _]} -> {:ok, Object.normalize(activity, false)}
_e -> e -> e
# Only fallback when receiving a fetch/normalization error with ActivityPub end
Logger.info("Couldn't get object via AP, trying out OStatus fetching...")
# FIXME: OStatus Object Containment?
case OStatus.fetch_activity_from_url(id) do
{:ok, [activity | _]} -> {:ok, Object.normalize(activity, false)}
e -> e
end
end
end end
end end
defp prepare_activity_params(data) do
%{
"type" => "Create",
"to" => data["to"],
"cc" => data["cc"],
# Should we seriously keep this attributedTo thing?
"actor" => data["actor"] || data["attributedTo"],
"object" => data
}
end
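The refactored fetch path reads as: cache hit returns immediately; otherwise the object is fetched, containment-checked, and fed through Transmogrifier.handle_incoming/2 as a synthesized Create (params built by prepare_activity_params/1), with OStatus as the last-resort fallback. A rough call sketch with an illustrative URL:

```elixir
case Pleroma.Object.Fetcher.fetch_object_from_id("https://remote.example/objects/1234") do
  {:ok, %Pleroma.Object{} = object} -> object
  {:error, reason} -> reason      # e.g. "Object containment failed."
  other -> other                  # e.g. {:reject, nil} when MRF rejects the activity
end
```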
def fetch_object_from_id!(id, options \\ []) do def fetch_object_from_id!(id, options \\ []) do
with {:ok, object} <- fetch_object_from_id(id, options) do with {:ok, object} <- fetch_object_from_id(id, options) do
object object

@ -64,6 +64,7 @@ def paginate(query, options, :keyset) do
def paginate(query, options, :offset) do def paginate(query, options, :offset) do
query query
|> restrict(:order, options)
|> restrict(:offset, options) |> restrict(:offset, options)
|> restrict(:limit, options) |> restrict(:limit, options)
end end

@ -12,7 +12,7 @@ defmodule Pleroma.PasswordResetToken do
alias Pleroma.User alias Pleroma.User
schema "password_reset_tokens" do schema "password_reset_tokens" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:token, :string) field(:token, :string)
field(:used, :boolean, default: false) field(:used, :boolean, default: false)

@ -11,10 +11,10 @@ defmodule Pleroma.Registration do
alias Pleroma.Repo alias Pleroma.Repo
alias Pleroma.User alias Pleroma.User
@primary_key {:id, Pleroma.FlakeId, autogenerate: true} @primary_key {:id, FlakeId.Ecto.CompatType, autogenerate: true}
schema "registrations" do schema "registrations" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:provider, :string) field(:provider, :string)
field(:uid, :string) field(:uid, :string)
field(:info, :map, default: %{}) field(:info, :map, default: %{})

@ -17,7 +17,7 @@ defmodule Pleroma.ScheduledActivity do
@min_offset :timer.minutes(5) @min_offset :timer.minutes(5)
schema "scheduled_activities" do schema "scheduled_activities" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:scheduled_at, :naive_datetime) field(:scheduled_at, :naive_datetime)
field(:params, :map) field(:params, :map)

@ -21,8 +21,8 @@ defmodule Pleroma.SubscriptionNotification do
@type t :: %__MODULE__{} @type t :: %__MODULE__{}
schema "subscription_notifications" do schema "subscription_notifications" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:activity, Activity, type: Pleroma.FlakeId) belongs_to(:activity, Activity, type: FlakeId.Ecto.CompatType)
timestamps() timestamps()
end end

@ -12,7 +12,7 @@ defmodule Pleroma.ThreadMute do
require Ecto.Query require Ecto.Query
schema "thread_mutes" do schema "thread_mutes" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:context, :string) field(:context, :string)
end end
@ -24,7 +24,7 @@ def changeset(mute, params \\ %{}) do
end end
def query(user_id, context) do def query(user_id, context) do
user_id = Pleroma.FlakeId.from_string(user_id) {:ok, user_id} = FlakeId.Ecto.CompatType.dump(user_id)
ThreadMute ThreadMute
|> Ecto.Query.where(user_id: ^user_id) |> Ecto.Query.where(user_id: ^user_id)
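Related to the removal of `Pleroma.FlakeId` elsewhere in this commit: schemas now use `FlakeId.Ecto.CompatType` from the external flake_id library, and manual conversions go through its Ecto.Type callbacks as in query/2 above. A short sketch with an illustrative id:

```elixir
# String form -> 128-bit binary suitable for a raw Ecto query.
{:ok, user_id} = FlakeId.Ecto.CompatType.dump("9nBWs8h9Q3Pg7dkmMa")

# Validity check replacing Pleroma.FlakeId.is_flake_id?/1:
FlakeId.flake_id?("9nBWs8h9Q3Pg7dkmMa")
# => true
```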

@ -38,16 +38,26 @@ def get_file(file) do
def put_file(%Pleroma.Upload{} = upload) do def put_file(%Pleroma.Upload{} = upload) do
config = Config.get([__MODULE__]) config = Config.get([__MODULE__])
bucket = Keyword.get(config, :bucket) bucket = Keyword.get(config, :bucket)
streaming = Keyword.get(config, :streaming_enabled)
s3_name = strict_encode(upload.path) s3_name = strict_encode(upload.path)
op = op =
upload.tempfile if streaming do
|> ExAws.S3.Upload.stream_file() upload.tempfile
|> ExAws.S3.upload(bucket, s3_name, [ |> ExAws.S3.Upload.stream_file()
{:acl, :public_read}, |> ExAws.S3.upload(bucket, s3_name, [
{:content_type, upload.content_type} {:acl, :public_read},
]) {:content_type, upload.content_type}
])
else
{:ok, file_data} = File.read(upload.tempfile)
ExAws.S3.put_object(bucket, s3_name, file_data, [
{:acl, :public_read},
{:content_type, upload.content_type}
])
end
case ExAws.request(op) do case ExAws.request(op) do
{:ok, _} -> {:ok, _} ->
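The new `streaming_enabled` flag selects between a chunked multipart upload (ExAws.S3.Upload.stream_file/1) and a single put_object call, which can help with S3-compatible backends that handle streaming poorly. A hedged config sketch; bucket name is illustrative:

```elixir
config :pleroma, Pleroma.Uploaders.S3,
  bucket: "my-pleroma-bucket",
  # Read via Keyword.get(config, :streaming_enabled) above.
  streaming_enabled: false
```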

@ -34,7 +34,7 @@ defmodule Pleroma.User do
@type t :: %__MODULE__{} @type t :: %__MODULE__{}
@primary_key {:id, Pleroma.FlakeId, autogenerate: true} @primary_key {:id, FlakeId.Ecto.CompatType, autogenerate: true}
# credo:disable-for-next-line Credo.Check.Readability.MaxLineLength # credo:disable-for-next-line Credo.Check.Readability.MaxLineLength
@email_regex ~r/^[a-zA-Z0-9.!#$%&'*+\/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/ @email_regex ~r/^[a-zA-Z0-9.!#$%&'*+\/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/
@ -106,9 +106,7 @@ def profile_url(%User{info: %{source_data: %{"url" => url}}}), do: url
def profile_url(%User{ap_id: ap_id}), do: ap_id def profile_url(%User{ap_id: ap_id}), do: ap_id
def profile_url(_), do: nil def profile_url(_), do: nil
def ap_id(%User{nickname: nickname}) do def ap_id(%User{nickname: nickname}), do: "#{Web.base_url()}/users/#{nickname}"
"#{Web.base_url()}/users/#{nickname}"
end
def ap_followers(%User{follower_address: fa}) when is_binary(fa), do: fa def ap_followers(%User{follower_address: fa}) when is_binary(fa), do: fa
def ap_followers(%User{} = user), do: "#{ap_id(user)}/followers" def ap_followers(%User{} = user), do: "#{ap_id(user)}/followers"
@ -119,12 +117,9 @@ def ap_following(%User{} = user), do: "#{ap_id(user)}/following"
def user_info(%User{} = user, args \\ %{}) do def user_info(%User{} = user, args \\ %{}) do
following_count = following_count =
if args[:following_count], Map.get(args, :following_count, user.info.following_count || following_count(user))
do: args[:following_count],
else: user.info.following_count || following_count(user)
follower_count = follower_count = Map.get(args, :follower_count, user.info.follower_count)
if args[:follower_count], do: args[:follower_count], else: user.info.follower_count
%{ %{
note_count: user.info.note_count, note_count: user.info.note_count,
@ -137,12 +132,11 @@ def user_info(%User{} = user, args \\ %{}) do
end end
def follow_state(%User{} = user, %User{} = target) do def follow_state(%User{} = user, %User{} = target) do
follow_activity = Utils.fetch_latest_follow(user, target) case Utils.fetch_latest_follow(user, target) do
%{data: %{"state" => state}} -> state
if follow_activity,
do: follow_activity.data["state"],
# Ideally this would be nil, but then Cachex does not commit the value # Ideally this would be nil, but then Cachex does not commit the value
else: false _ -> false
end
end end
def get_cached_follow_state(user, target) do def get_cached_follow_state(user, target) do
@ -150,12 +144,9 @@ def get_cached_follow_state(user, target) do
Cachex.fetch!(:user_cache, key, fn _ -> {:commit, follow_state(user, target)} end) Cachex.fetch!(:user_cache, key, fn _ -> {:commit, follow_state(user, target)} end)
end end
@spec set_follow_state_cache(String.t(), String.t(), String.t()) :: {:ok | :error, boolean()}
def set_follow_state_cache(user_ap_id, target_ap_id, state) do def set_follow_state_cache(user_ap_id, target_ap_id, state) do
Cachex.put( Cachex.put(:user_cache, "follow_state:#{user_ap_id}|#{target_ap_id}", state)
:user_cache,
"follow_state:#{user_ap_id}|#{target_ap_id}",
state
)
end end
def set_info_cache(user, args) do def set_info_cache(user, args) do
@ -196,34 +187,25 @@ def remote_user_creation(params) do
|> truncate_if_exists(:name, name_limit) |> truncate_if_exists(:name, name_limit)
|> truncate_if_exists(:bio, bio_limit) |> truncate_if_exists(:bio, bio_limit)
info_cng = User.Info.remote_user_creation(%User.Info{}, params[:info]) changeset =
%User{local: false}
changes =
%User{}
|> cast(params, [:bio, :name, :ap_id, :nickname, :avatar]) |> cast(params, [:bio, :name, :ap_id, :nickname, :avatar])
|> validate_required([:name, :ap_id]) |> validate_required([:name, :ap_id])
|> unique_constraint(:nickname) |> unique_constraint(:nickname)
|> validate_format(:nickname, @email_regex) |> validate_format(:nickname, @email_regex)
|> validate_length(:bio, max: bio_limit) |> validate_length(:bio, max: bio_limit)
|> validate_length(:name, max: name_limit) |> validate_length(:name, max: name_limit)
|> put_change(:local, false) |> change_info(&User.Info.remote_user_creation(&1, params[:info]))
|> put_embed(:info, info_cng)
if changes.valid? do case params[:info][:source_data] do
case info_cng.changes[:source_data] do %{"followers" => followers, "following" => following} ->
%{"followers" => followers, "following" => following} -> changeset
changes |> put_change(:follower_address, followers)
|> put_change(:follower_address, followers) |> put_change(:following_address, following)
|> put_change(:following_address, following)
_ -> _ ->
followers = User.ap_followers(%User{nickname: changes.changes[:nickname]}) followers = ap_followers(%User{nickname: get_field(changeset, :nickname)})
put_change(changeset, :follower_address, followers)
changes
|> put_change(:follower_address, followers)
end
else
changes
end end
end end
@ -244,7 +226,6 @@ def upgrade_changeset(struct, params \\ %{}, remote? \\ false) do
name_limit = Pleroma.Config.get([:instance, :user_name_length], 100) name_limit = Pleroma.Config.get([:instance, :user_name_length], 100)
params = Map.put(params, :last_refreshed_at, NaiveDateTime.utc_now()) params = Map.put(params, :last_refreshed_at, NaiveDateTime.utc_now())
info_cng = User.Info.user_upgrade(struct.info, params[:info], remote?)
struct struct
|> cast(params, [ |> cast(params, [
@ -259,7 +240,7 @@ def upgrade_changeset(struct, params \\ %{}, remote? \\ false) do
|> validate_format(:nickname, local_nickname_regex()) |> validate_format(:nickname, local_nickname_regex())
|> validate_length(:bio, max: bio_limit) |> validate_length(:bio, max: bio_limit)
|> validate_length(:name, max: name_limit) |> validate_length(:name, max: name_limit)
|> put_embed(:info, info_cng) |> change_info(&User.Info.user_upgrade(&1, params[:info], remote?))
end end
def password_update_changeset(struct, params) do def password_update_changeset(struct, params) do
@ -268,6 +249,7 @@ def password_update_changeset(struct, params) do
|> validate_required([:password, :password_confirmation]) |> validate_required([:password, :password_confirmation])
|> validate_confirmation(:password) |> validate_confirmation(:password)
|> put_password_hash |> put_password_hash
|> put_embed(:info, User.Info.set_password_reset_pending(struct.info, false))
end end
@spec reset_password(User.t(), map) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()} @spec reset_password(User.t(), map) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
@ -284,6 +266,20 @@ def reset_password(%User{id: user_id} = user, data) do
end end
end end
def force_password_reset_async(user) do
BackgroundWorker.enqueue("force_password_reset", %{"user_id" => user.id})
end
@spec force_password_reset(User.t()) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
def force_password_reset(user) do
info_cng = User.Info.set_password_reset_pending(user.info, true)
user
|> change()
|> put_embed(:info, info_cng)
|> update_and_set_cache()
end
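A short usage sketch of the new helpers; the Admin API "require password reset" feature from the changelog presumably goes through these. The async variant enqueues a background job, the sync one flags the account via User.Info.set_password_reset_pending/2. Nickname is illustrative:

```elixir
user = Pleroma.User.get_cached_by_nickname("lain")

# Enqueue a "force_password_reset" job for the BackgroundWorker:
Pleroma.User.force_password_reset_async(user)

# Or apply it immediately:
{:ok, %Pleroma.User{} = user} = Pleroma.User.force_password_reset(user)
```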
def register_changeset(struct, params \\ %{}, opts \\ []) do def register_changeset(struct, params \\ %{}, opts \\ []) do
bio_limit = Pleroma.Config.get([:instance, :user_bio_length], 5000) bio_limit = Pleroma.Config.get([:instance, :user_bio_length], 5000)
name_limit = Pleroma.Config.get([:instance, :user_name_length], 100) name_limit = Pleroma.Config.get([:instance, :user_name_length], 100)
@ -295,43 +291,39 @@ def register_changeset(struct, params \\ %{}, opts \\ []) do
opts[:need_confirmation] opts[:need_confirmation]
end end
info_change = struct
User.Info.confirmation_changeset(%User.Info{}, need_confirmation: need_confirmation?) |> cast(params, [:bio, :email, :name, :nickname, :password, :password_confirmation])
|> validate_required([:name, :nickname, :password, :password_confirmation])
|> validate_confirmation(:password)
|> unique_constraint(:email)
|> unique_constraint(:nickname)
|> validate_exclusion(:nickname, Pleroma.Config.get([User, :restricted_nicknames]))
|> validate_format(:nickname, local_nickname_regex())
|> validate_format(:email, @email_regex)
|> validate_length(:bio, max: bio_limit)
|> validate_length(:name, min: 1, max: name_limit)
|> change_info(&User.Info.confirmation_changeset(&1, need_confirmation: need_confirmation?))
|> maybe_validate_required_email(opts[:external])
|> put_password_hash
|> put_ap_id()
|> unique_constraint(:ap_id)
|> put_following_and_follower_address()
end
changeset = def maybe_validate_required_email(changeset, true), do: changeset
struct def maybe_validate_required_email(changeset, _), do: validate_required(changeset, [:email])
|> cast(params, [:bio, :email, :name, :nickname, :password, :password_confirmation])
|> validate_required([:name, :nickname, :password, :password_confirmation])
|> validate_confirmation(:password)
|> unique_constraint(:email)
|> unique_constraint(:nickname)
|> validate_exclusion(:nickname, Pleroma.Config.get([User, :restricted_nicknames]))
|> validate_format(:nickname, local_nickname_regex())
|> validate_format(:email, @email_regex)
|> validate_length(:bio, max: bio_limit)
|> validate_length(:name, min: 1, max: name_limit)
|> put_change(:info, info_change)
changeset = defp put_ap_id(changeset) do
if opts[:external] do ap_id = ap_id(%User{nickname: get_field(changeset, :nickname)})
changeset put_change(changeset, :ap_id, ap_id)
else end
validate_required(changeset, [:email])
end
if changeset.valid? do defp put_following_and_follower_address(changeset) do
ap_id = User.ap_id(%User{nickname: changeset.changes[:nickname]}) followers = ap_followers(%User{nickname: get_field(changeset, :nickname)})
followers = User.ap_followers(%User{nickname: changeset.changes[:nickname]})
changeset changeset
|> put_password_hash |> put_change(:following, [followers])
|> put_change(:ap_id, ap_id) |> put_change(:follower_address, followers)
|> unique_constraint(:ap_id)
|> put_change(:following, [followers])
|> put_change(:follower_address, followers)
else
changeset
end
end end
defp autofollow_users(user) do defp autofollow_users(user) do
@ -346,9 +338,8 @@ defp autofollow_users(user) do
@doc "Inserts provided changeset, performs post-registration actions (confirmation email sending etc.)" @doc "Inserts provided changeset, performs post-registration actions (confirmation email sending etc.)"
def register(%Ecto.Changeset{} = changeset) do def register(%Ecto.Changeset{} = changeset) do
with {:ok, user} <- Repo.insert(changeset), with {:ok, user} <- Repo.insert(changeset) do
{:ok, user} <- post_register_action(user) do post_register_action(user)
{:ok, user}
end end
end end
@ -394,7 +385,7 @@ def maybe_direct_follow(%User{} = follower, %User{local: true} = followed) do
end end
def maybe_direct_follow(%User{} = follower, %User{} = followed) do def maybe_direct_follow(%User{} = follower, %User{} = followed) do
if not User.ap_enabled?(followed) do if not ap_enabled?(followed) do
follow(follower, followed) follow(follower, followed)
else else
{:ok, follower} {:ok, follower}
@ -427,9 +418,7 @@ def follow_all(follower, followeds) do
{1, [follower]} = Repo.update_all(q, []) {1, [follower]} = Repo.update_all(q, [])
Enum.each(followeds, fn followed -> Enum.each(followeds, &update_follower_count/1)
update_follower_count(followed)
end)
set_cache(follower) set_cache(follower)
end end
@ -539,8 +528,6 @@ def set_cache(%User{} = user) do
def update_and_set_cache(changeset) do def update_and_set_cache(changeset) do
with {:ok, user} <- Repo.update(changeset, stale_error_field: :id) do with {:ok, user} <- Repo.update(changeset, stale_error_field: :id) do
set_cache(user) set_cache(user)
else
e -> e
end end
end end
@ -577,9 +564,7 @@ def get_cached_by_nickname(nickname) do
key = "nickname:#{nickname}" key = "nickname:#{nickname}"
Cachex.fetch!(:user_cache, key, fn -> Cachex.fetch!(:user_cache, key, fn ->
user_result = get_or_fetch_by_nickname(nickname) case get_or_fetch_by_nickname(nickname) do
case user_result do
{:ok, user} -> {:commit, user} {:ok, user} -> {:commit, user}
{:error, _error} -> {:ignore, nil} {:error, _error} -> {:ignore, nil}
end end
@ -590,7 +575,7 @@ def get_cached_by_nickname_or_id(nickname_or_id, opts \\ []) do
restrict_to_local = Pleroma.Config.get([:instance, :limit_to_local_content]) restrict_to_local = Pleroma.Config.get([:instance, :limit_to_local_content])
cond do cond do
is_integer(nickname_or_id) or Pleroma.FlakeId.is_flake_id?(nickname_or_id) -> is_integer(nickname_or_id) or FlakeId.flake_id?(nickname_or_id) ->
get_cached_by_id(nickname_or_id) || get_cached_by_nickname(nickname_or_id) get_cached_by_id(nickname_or_id) || get_cached_by_nickname(nickname_or_id)
restrict_to_local == false -> restrict_to_local == false ->
@ -619,13 +604,11 @@ def get_by_nickname_or_email(nickname_or_email) do
def get_cached_user_info(user) do def get_cached_user_info(user) do
key = "user_info:#{user.id}" key = "user_info:#{user.id}"
Cachex.fetch!(:user_cache, key, fn _ -> user_info(user) end) Cachex.fetch!(:user_cache, key, fn -> user_info(user) end)
end end
def fetch_by_nickname(nickname) do def fetch_by_nickname(nickname) do
ap_try = ActivityPub.make_user_from_nickname(nickname) case ActivityPub.make_user_from_nickname(nickname) do
case ap_try do
{:ok, user} -> {:ok, user} {:ok, user} -> {:ok, user}
_ -> OStatus.make_user(nickname) _ -> OStatus.make_user(nickname)
end end
@ -660,7 +643,8 @@ def get_followers_query(%User{} = user, nil) do
end end
def get_followers_query(user, page) do def get_followers_query(user, page) do
from(u in get_followers_query(user, nil)) user
|> get_followers_query(nil)
|> User.Query.paginate(page, 20) |> User.Query.paginate(page, 20)
end end
@ -669,25 +653,24 @@ def get_followers_query(user), do: get_followers_query(user, nil)
@spec get_followers(User.t(), pos_integer()) :: {:ok, list(User.t())} @spec get_followers(User.t(), pos_integer()) :: {:ok, list(User.t())}
def get_followers(user, page \\ nil) do def get_followers(user, page \\ nil) do
q = get_followers_query(user, page) user
|> get_followers_query(page)
{:ok, Repo.all(q)} |> Repo.all()
end end
@spec get_external_followers(User.t(), pos_integer()) :: {:ok, list(User.t())} @spec get_external_followers(User.t(), pos_integer()) :: {:ok, list(User.t())}
def get_external_followers(user, page \\ nil) do def get_external_followers(user, page \\ nil) do
q = user
user |> get_followers_query(page)
|> get_followers_query(page) |> User.Query.build(%{external: true})
|> User.Query.build(%{external: true}) |> Repo.all()
{:ok, Repo.all(q)}
end end
def get_followers_ids(user, page \\ nil) do def get_followers_ids(user, page \\ nil) do
q = get_followers_query(user, page) user
|> get_followers_query(page)
Repo.all(from(u in q, select: u.id)) |> select([u], u.id)
|> Repo.all()
end end
@spec get_friends_query(User.t(), pos_integer() | nil) :: Ecto.Query.t() @spec get_friends_query(User.t(), pos_integer() | nil) :: Ecto.Query.t()
@ -696,7 +679,8 @@ def get_friends_query(%User{} = user, nil) do
end end
def get_friends_query(user, page) do def get_friends_query(user, page) do
from(u in get_friends_query(user, nil)) user
|> get_friends_query(nil)
|> User.Query.paginate(page, 20) |> User.Query.paginate(page, 20)
end end
@ -704,28 +688,27 @@ def get_friends_query(user, page) do
def get_friends_query(user), do: get_friends_query(user, nil) def get_friends_query(user), do: get_friends_query(user, nil)
def get_friends(user, page \\ nil) do def get_friends(user, page \\ nil) do
q = get_friends_query(user, page) user
|> get_friends_query(page)
{:ok, Repo.all(q)} |> Repo.all()
end end
def get_friends_ids(user, page \\ nil) do def get_friends_ids(user, page \\ nil) do
q = get_friends_query(user, page) user
|> get_friends_query(page)
Repo.all(from(u in q, select: u.id)) |> select([u], u.id)
|> Repo.all()
end end
@spec get_follow_requests(User.t()) :: {:ok, [User.t()]} @spec get_follow_requests(User.t()) :: {:ok, [User.t()]}
def get_follow_requests(%User{} = user) do def get_follow_requests(%User{} = user) do
users = user
Activity.follow_requests_for_actor(user) |> Activity.follow_requests_for_actor()
|> join(:inner, [a], u in User, on: a.actor == u.ap_id) |> join(:inner, [a], u in User, on: a.actor == u.ap_id)
|> where([a, u], not fragment("? @> ?", u.following, ^[user.follower_address])) |> where([a, u], not fragment("? @> ?", u.following, ^[user.follower_address]))
|> group_by([a, u], u.id) |> group_by([a, u], u.id)
|> select([a, u], u) |> select([a, u], u)
|> Repo.all() |> Repo.all()
{:ok, users}
end end
def increase_note_count(%User{} = user) do def increase_note_count(%User{} = user) do
@ -771,21 +754,15 @@ def decrease_note_count(%User{} = user) do
end end
def update_note_count(%User{} = user) do def update_note_count(%User{} = user) do
note_count_query = note_count =
from( from(
a in Object, a in Object,
where: fragment("?->>'actor' = ? and ?->>'type' = 'Note'", a.data, ^user.ap_id, a.data), where: fragment("?->>'actor' = ? and ?->>'type' = 'Note'", a.data, ^user.ap_id, a.data),
select: count(a.id) select: count(a.id)
) )
|> Repo.one()
note_count = Repo.one(note_count_query) update_info(user, &User.Info.set_note_count(&1, note_count))
info_cng = User.Info.set_note_count(user.info, note_count)
user
|> change()
|> put_embed(:info, info_cng)
|> update_and_set_cache()
end end
@spec maybe_fetch_follow_information(User.t()) :: User.t() @spec maybe_fetch_follow_information(User.t()) :: User.t()
@ -802,17 +779,7 @@ def maybe_fetch_follow_information(user) do
def fetch_follow_information(user) do def fetch_follow_information(user) do
with {:ok, info} <- ActivityPub.fetch_follow_information_for_user(user) do with {:ok, info} <- ActivityPub.fetch_follow_information_for_user(user) do
info_cng = User.Info.follow_information_update(user.info, info) update_info(user, &User.Info.follow_information_update(&1, info))
changeset =
user
|> change()
|> put_embed(:info, info_cng)
update_and_set_cache(changeset)
else
{:error, _} = e -> e
e -> {:error, e}
end end
end end
@ -886,62 +853,28 @@ def get_recipients_from_activity(%Activity{recipients: to}) do
@spec mute(User.t(), User.t(), boolean()) :: {:ok, User.t()} | {:error, String.t()} @spec mute(User.t(), User.t(), boolean()) :: {:ok, User.t()} | {:error, String.t()}
def mute(muter, %User{ap_id: ap_id}, notifications? \\ true) do def mute(muter, %User{ap_id: ap_id}, notifications? \\ true) do
info = muter.info update_info(muter, &User.Info.add_to_mutes(&1, ap_id, notifications?))
info_cng =
User.Info.add_to_mutes(info, ap_id)
|> User.Info.add_to_muted_notifications(info, ap_id, notifications?)
cng =
change(muter)
|> put_embed(:info, info_cng)
update_and_set_cache(cng)
end end
def unmute(muter, %{ap_id: ap_id}) do def unmute(muter, %{ap_id: ap_id}) do
info = muter.info update_info(muter, &User.Info.remove_from_mutes(&1, ap_id))
info_cng =
User.Info.remove_from_mutes(info, ap_id)
|> User.Info.remove_from_muted_notifications(info, ap_id)
cng =
change(muter)
|> put_embed(:info, info_cng)
update_and_set_cache(cng)
end end
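Most of the old `info_cng`/`put_embed` boilerplate in this file now funnels through an `update_info/2` helper that is not part of this hunk; presumably it wraps the old pattern, roughly along these lines (hedged sketch only, not the actual definition):

```elixir
def update_info(user, fun) do
  user
  |> Ecto.Changeset.change()
  |> Ecto.Changeset.put_embed(:info, fun.(user.info))
  |> update_and_set_cache()
end
```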
def subscribe(subscriber, %{ap_id: ap_id}) do def subscribe(subscriber, %{ap_id: ap_id}) do
deny_follow_blocked = Pleroma.Config.get([:user, :deny_follow_blocked])
with %User{} = subscribed <- get_cached_by_ap_id(ap_id) do with %User{} = subscribed <- get_cached_by_ap_id(ap_id) do
blocked = blocks?(subscribed, subscriber) and deny_follow_blocked deny_follow_blocked = Pleroma.Config.get([:user, :deny_follow_blocked])
if blocked do if blocks?(subscribed, subscriber) and deny_follow_blocked do
{:error, "Could not subscribe: #{subscribed.nickname} is blocking you"} {:error, "Could not subscribe: #{subscribed.nickname} is blocking you"}
else else
info_cng = update_info(subscribed, &User.Info.add_to_subscribers(&1, subscriber.ap_id))
subscribed.info
|> User.Info.add_to_subscribers(subscriber.ap_id)
change(subscribed)
|> put_embed(:info, info_cng)
|> update_and_set_cache()
end end
end end
end end
def unsubscribe(unsubscriber, %{ap_id: ap_id}) do def unsubscribe(unsubscriber, %{ap_id: ap_id}) do
with %User{} = user <- get_cached_by_ap_id(ap_id) do with %User{} = user <- get_cached_by_ap_id(ap_id) do
info_cng = update_info(user, &User.Info.remove_from_subscribers(&1, unsubscriber.ap_id))
user.info
|> User.Info.remove_from_subscribers(unsubscriber.ap_id)
change(user)
|> put_embed(:info, info_cng)
|> update_and_set_cache()
end end
end end
@ -970,21 +903,11 @@ def block(blocker, %User{ap_id: ap_id} = blocked) do
blocker blocker
end end
if following?(blocked, blocker) do if following?(blocked, blocker), do: unfollow(blocked, blocker)
unfollow(blocked, blocker)
end
{:ok, blocker} = update_follower_count(blocker) {:ok, blocker} = update_follower_count(blocker)
info_cng = update_info(blocker, &User.Info.add_to_block(&1, ap_id))
blocker.info
|> User.Info.add_to_block(ap_id)
cng =
change(blocker)
|> put_embed(:info, info_cng)
update_and_set_cache(cng)
end end
# helper to handle the block given only an actor's AP id # helper to handle the block given only an actor's AP id
@ -993,15 +916,7 @@ def block(blocker, %{ap_id: ap_id}) do
end end
def unblock(blocker, %{ap_id: ap_id}) do def unblock(blocker, %{ap_id: ap_id}) do
info_cng = update_info(blocker, &User.Info.remove_from_block(&1, ap_id))
blocker.info
|> User.Info.remove_from_block(ap_id)
cng =
change(blocker)
|> put_embed(:info, info_cng)
update_and_set_cache(cng)
end end
def mutes?(nil, _), do: false def mutes?(nil, _), do: false
@ -1058,27 +973,11 @@ def subscribers(user) do
end end
def block_domain(user, domain) do def block_domain(user, domain) do
info_cng = update_info(user, &User.Info.add_to_domain_block(&1, domain))
user.info
|> User.Info.add_to_domain_block(domain)
cng =
change(user)
|> put_embed(:info, info_cng)
update_and_set_cache(cng)
end end
def unblock_domain(user, domain) do def unblock_domain(user, domain) do
info_cng = update_info(user, &User.Info.remove_from_domain_block(&1, domain))
user.info
|> User.Info.remove_from_domain_block(domain)
cng =
change(user)
|> put_embed(:info, info_cng)
update_and_set_cache(cng)
end end
def deactivate_async(user, status \\ true) do def deactivate_async(user, status \\ true) do
@ -1086,51 +985,41 @@ def deactivate_async(user, status \\ true) do
end end
def deactivate(%User{} = user, status \\ true) do def deactivate(%User{} = user, status \\ true) do
info_cng = User.Info.set_activation_status(user.info, status) with {:ok, user} <- update_info(user, &User.Info.set_activation_status(&1, status)) do
Enum.each(get_followers(user), &invalidate_cache/1)
with {:ok, friends} <- User.get_friends(user), Enum.each(get_friends(user), &update_follower_count/1)
{:ok, followers} <- User.get_followers(user),
{:ok, user} <-
user
|> change()
|> put_embed(:info, info_cng)
|> update_and_set_cache() do
Enum.each(followers, &invalidate_cache(&1))
Enum.each(friends, &update_follower_count(&1))
{:ok, user} {:ok, user}
end end
end end
def update_notification_settings(%User{} = user, settings \\ %{}) do def update_notification_settings(%User{} = user, settings \\ %{}) do
info_changeset = User.Info.update_notification_settings(user.info, settings) update_info(user, &User.Info.update_notification_settings(&1, settings))
change(user)
|> put_embed(:info, info_changeset)
|> update_and_set_cache()
end end
def delete(%User{} = user) do def delete(%User{} = user) do
BackgroundWorker.enqueue("delete_user", %{"user_id" => user.id}) BackgroundWorker.enqueue("delete_user", %{"user_id" => user.id})
end end
def perform(:force_password_reset, user), do: force_password_reset(user)
@spec perform(atom(), User.t()) :: {:ok, User.t()} @spec perform(atom(), User.t()) :: {:ok, User.t()}
def perform(:delete, %User{} = user) do def perform(:delete, %User{} = user) do
{:ok, _user} = ActivityPub.delete(user) {:ok, _user} = ActivityPub.delete(user)
# Remove all relationships # Remove all relationships
{:ok, followers} = User.get_followers(user) user
|> get_followers()
Enum.each(followers, fn follower -> |> Enum.each(fn follower ->
ActivityPub.unfollow(follower, user) ActivityPub.unfollow(follower, user)
User.unfollow(follower, user) unfollow(follower, user)
end) end)
{:ok, friends} = User.get_friends(user) user
|> get_friends()
Enum.each(friends, fn followed -> |> Enum.each(fn followed ->
ActivityPub.unfollow(user, followed) ActivityPub.unfollow(user, followed)
User.unfollow(user, followed) unfollow(user, followed)
end) end)
delete_user_activities(user) delete_user_activities(user)
@ -1142,13 +1031,11 @@ def perform(:delete, %User{} = user) do
def perform(:fetch_initial_posts, %User{} = user) do def perform(:fetch_initial_posts, %User{} = user) do
pages = Pleroma.Config.get!([:fetch_initial_posts, :pages]) pages = Pleroma.Config.get!([:fetch_initial_posts, :pages])
Enum.each( # Insert all the posts in reverse order, so they're in the right order on the timeline
# Insert all the posts in reverse order, so they're in the right order on the timeline user.info.source_data["outbox"]
Enum.reverse(Utils.fetch_ordered_collection(user.info.source_data["outbox"], pages)), |> Utils.fetch_ordered_collection(pages)
&Pleroma.Web.Federator.incoming_ap_doc/1 |> Enum.reverse()
) |> Enum.each(&Pleroma.Web.Federator.incoming_ap_doc/1)
{:ok, user}
end end
def perform(:deactivate_async, user, status), do: deactivate(user, status) def perform(:deactivate_async, user, status), do: deactivate(user, status)
@ -1234,16 +1121,12 @@ def follow_import(%User{} = follower, followed_identifiers)
}) })
end end
def delete_user_activities(%User{ap_id: ap_id} = user) do def delete_user_activities(%User{ap_id: ap_id}) do
ap_id ap_id
|> Activity.Queries.by_actor() |> Activity.Queries.by_actor()
|> RepoStreamer.chunk_stream(50) |> RepoStreamer.chunk_stream(50)
|> Stream.each(fn activities -> |> Stream.each(fn activities -> Enum.each(activities, &delete_activity/1) end)
Enum.each(activities, &delete_activity(&1))
end)
|> Stream.run() |> Stream.run()
{:ok, user}
end end
defp delete_activity(%{data: %{"type" => "Create"}} = activity) do defp delete_activity(%{data: %{"type" => "Create"}} = activity) do
@ -1253,17 +1136,19 @@ defp delete_activity(%{data: %{"type" => "Create"}} = activity) do
end end
defp delete_activity(%{data: %{"type" => "Like"}} = activity) do defp delete_activity(%{data: %{"type" => "Like"}} = activity) do
user = get_cached_by_ap_id(activity.actor)
object = Object.normalize(activity) object = Object.normalize(activity)
ActivityPub.unlike(user, object) activity.actor
|> get_cached_by_ap_id()
|> ActivityPub.unlike(object)
end end
defp delete_activity(%{data: %{"type" => "Announce"}} = activity) do defp delete_activity(%{data: %{"type" => "Announce"}} = activity) do
user = get_cached_by_ap_id(activity.actor)
object = Object.normalize(activity) object = Object.normalize(activity)
ActivityPub.unannounce(user, object) activity.actor
|> get_cached_by_ap_id()
|> ActivityPub.unannounce(object)
end end
defp delete_activity(_activity), do: "Doing nothing" defp delete_activity(_activity), do: "Doing nothing"
@ -1275,9 +1160,7 @@ def html_filter_policy(%User{info: %{no_rich_text: true}}) do
def html_filter_policy(_), do: Pleroma.Config.get([:markup, :scrub_policy]) def html_filter_policy(_), do: Pleroma.Config.get([:markup, :scrub_policy])
def fetch_by_ap_id(ap_id) do def fetch_by_ap_id(ap_id) do
ap_try = ActivityPub.make_user_from_ap_id(ap_id) case ActivityPub.make_user_from_ap_id(ap_id) do
case ap_try do
{:ok, user} -> {:ok, user} ->
{:ok, user} {:ok, user}
@ -1292,7 +1175,7 @@ def fetch_by_ap_id(ap_id) do
def get_or_fetch_by_ap_id(ap_id) do def get_or_fetch_by_ap_id(ap_id) do
user = get_cached_by_ap_id(ap_id) user = get_cached_by_ap_id(ap_id)
if !is_nil(user) and !User.needs_update?(user) do if !is_nil(user) and !needs_update?(user) do
{:ok, user} {:ok, user}
else else
# Whether to fetch initial posts for the user (if it's a new user & the fetching is enabled) # Whether to fetch initial posts for the user (if it's a new user & the fetching is enabled)
@ -1312,19 +1195,20 @@ def get_or_fetch_by_ap_id(ap_id) do
@doc "Creates an internal service actor by URI if missing. Optionally takes nickname for addressing." @doc "Creates an internal service actor by URI if missing. Optionally takes nickname for addressing."
def get_or_create_service_actor_by_ap_id(uri, nickname \\ nil) do def get_or_create_service_actor_by_ap_id(uri, nickname \\ nil) do
-    if user = get_cached_by_ap_id(uri) do
+    with %User{} = user <- get_cached_by_ap_id(uri) do
       user
     else
-      changes =
-        %User{info: %User.Info{}}
-        |> cast(%{}, [:ap_id, :nickname, :local])
-        |> put_change(:ap_id, uri)
-        |> put_change(:nickname, nickname)
-        |> put_change(:local, true)
-        |> put_change(:follower_address, uri <> "/followers")
-
-      {:ok, user} = Repo.insert(changes)
-      user
+      _ ->
+        {:ok, user} =
+          %User{info: %User.Info{}}
+          |> cast(%{}, [:ap_id, :nickname, :local])
+          |> put_change(:ap_id, uri)
+          |> put_change(:nickname, nickname)
+          |> put_change(:local, true)
+          |> put_change(:follower_address, uri <> "/followers")
+          |> Repo.insert()
+
+        user
     end
   end
@ -1381,23 +1265,21 @@ def get_or_fetch(nickname), do: get_or_fetch_by_nickname(nickname)
   # this is because we have synchronous follow APIs and need to simulate them
   # with an async handshake
   def wait_and_refresh(_, %User{local: true} = a, %User{local: true} = b) do
-    with %User{} = a <- User.get_cached_by_id(a.id),
-         %User{} = b <- User.get_cached_by_id(b.id) do
+    with %User{} = a <- get_cached_by_id(a.id),
+         %User{} = b <- get_cached_by_id(b.id) do
       {:ok, a, b}
     else
-      _e ->
-        :error
+      nil -> :error
     end
   end
 
   def wait_and_refresh(timeout, %User{} = a, %User{} = b) do
     with :ok <- :timer.sleep(timeout),
-         %User{} = a <- User.get_cached_by_id(a.id),
-         %User{} = b <- User.get_cached_by_id(b.id) do
+         %User{} = a <- get_cached_by_id(a.id),
+         %User{} = b <- get_cached_by_id(b.id) do
       {:ok, a, b}
     else
-      _e ->
-        :error
+      nil -> :error
     end
   end
@ -1459,7 +1341,7 @@ defp update_tags(%User{} = user, new_tags) do
defp normalize_tags(tags) do defp normalize_tags(tags) do
[tags] [tags]
|> List.flatten() |> List.flatten()
|> Enum.map(&String.downcase(&1)) |> Enum.map(&String.downcase/1)
end end
defp local_nickname_regex do defp local_nickname_regex do
@ -1552,11 +1434,7 @@ def list_inactive_users_query(inactivity_threshold \\ 7) do
@spec switch_email_notifications(t(), String.t(), boolean()) :: @spec switch_email_notifications(t(), String.t(), boolean()) ::
{:ok, t()} | {:error, Ecto.Changeset.t()} {:ok, t()} | {:error, Ecto.Changeset.t()}
   def switch_email_notifications(user, type, status) do
-    info = Pleroma.User.Info.update_email_notifications(user.info, %{type => status})
-
-    change(user)
-    |> put_embed(:info, info)
-    |> update_and_set_cache()
+    update_info(user, &User.Info.update_email_notifications(&1, %{type => status}))
   end
@doc """ @doc """
@ -1578,13 +1456,8 @@ def touch_last_digest_emailed_at(user) do
   def toggle_confirmation(%User{} = user) do
     need_confirmation? = !user.info.confirmation_pending
 
-    info_changeset =
-      User.Info.confirmation_changeset(user.info, need_confirmation: need_confirmation?)
-
     user
-    |> change()
-    |> put_embed(:info, info_changeset)
-    |> update_and_set_cache()
+    |> update_info(&User.Info.confirmation_changeset(&1, need_confirmation: need_confirmation?))
   end
def get_mascot(%{info: %{mascot: %{} = mascot}}) when not is_nil(mascot) do def get_mascot(%{info: %{mascot: %{} = mascot}}) when not is_nil(mascot) do
@ -1607,16 +1480,11 @@ def get_mascot(%{info: %{mascot: mascot}}) when is_nil(mascot) do
} }
end end
-  def ensure_keys_present(%User{info: info} = user) do
-    if info.keys do
-      {:ok, user}
-    else
-      {:ok, pem} = Keys.generate_rsa_pem()
+  def ensure_keys_present(%{info: %{keys: keys}} = user) when not is_nil(keys), do: {:ok, user}
 
-      user
-      |> Ecto.Changeset.change()
-      |> Ecto.Changeset.put_embed(:info, User.Info.set_keys(info, pem))
-      |> update_and_set_cache()
+  def ensure_keys_present(%User{} = user) do
+    with {:ok, pem} <- Keys.generate_rsa_pem() do
+      update_info(user, &User.Info.set_keys(&1, pem))
     end
   end
@ -1662,4 +1530,26 @@ def change_email(user, email) do
|> validate_format(:email, @email_regex) |> validate_format(:email, @email_regex)
|> update_and_set_cache() |> update_and_set_cache()
end end
@doc """
Changes `user.info` and returns the user changeset.
`fun` is called with the `user.info`.
"""
def change_info(user, fun) do
changeset = change(user)
info = get_field(changeset, :info) || %User.Info{}
put_embed(changeset, :info, fun.(info))
end
@doc """
Updates `user.info` and sets cache.
`fun` is called with the `user.info`.
"""
def update_info(user, fun) do
user
|> change_info(fun)
|> update_and_set_cache()
end
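A minimal usage sketch of the new helpers (the caller and values are hypothetical; `User.Info.confirmation_changeset/2` is taken from the hunks above):

```elixir
# Any User.Info changeset helper can now be applied and persisted in one call.
{:ok, user} =
  Pleroma.User.update_info(user, &Pleroma.User.Info.confirmation_changeset(&1, need_confirmation: false))
```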
end end
View file
@ -20,6 +20,7 @@ defmodule Pleroma.User.Info do
field(:following_count, :integer, default: nil) field(:following_count, :integer, default: nil)
field(:locked, :boolean, default: false) field(:locked, :boolean, default: false)
field(:confirmation_pending, :boolean, default: false) field(:confirmation_pending, :boolean, default: false)
field(:password_reset_pending, :boolean, default: false)
field(:confirmation_token, :string, default: nil) field(:confirmation_token, :string, default: nil)
field(:default_scope, :string, default: "public") field(:default_scope, :string, default: "public")
field(:blocks, {:array, :string}, default: []) field(:blocks, {:array, :string}, default: [])
@ -41,6 +42,8 @@ defmodule Pleroma.User.Info do
field(:topic, :string, default: nil) field(:topic, :string, default: nil)
field(:hub, :string, default: nil) field(:hub, :string, default: nil)
field(:salmon, :string, default: nil) field(:salmon, :string, default: nil)
field(:hide_followers_count, :boolean, default: false)
field(:hide_follows_count, :boolean, default: false)
field(:hide_followers, :boolean, default: false) field(:hide_followers, :boolean, default: false)
field(:hide_follows, :boolean, default: false) field(:hide_follows, :boolean, default: false)
field(:hide_favorites, :boolean, default: true) field(:hide_favorites, :boolean, default: true)
@ -51,6 +54,7 @@ defmodule Pleroma.User.Info do
field(:pleroma_settings_store, :map, default: %{}) field(:pleroma_settings_store, :map, default: %{})
field(:fields, {:array, :map}, default: nil) field(:fields, {:array, :map}, default: nil)
field(:raw_fields, {:array, :map}, default: []) field(:raw_fields, {:array, :map}, default: [])
field(:discoverable, :boolean, default: false)
field(:notification_settings, :map, field(:notification_settings, :map,
default: %{ default: %{
@ -80,6 +84,14 @@ def set_activation_status(info, deactivated) do
|> validate_required([:deactivated]) |> validate_required([:deactivated])
end end
def set_password_reset_pending(info, pending) do
params = %{password_reset_pending: pending}
info
|> cast(params, [:password_reset_pending])
|> validate_required([:password_reset_pending])
end
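Presumably this pairs with the `update_info/2` helper added to `Pleroma.User` above; a hedged sketch:

```elixir
# Hypothetical admin-side call: require a password reset on next login.
{:ok, user} =
  Pleroma.User.update_info(user, &Pleroma.User.Info.set_password_reset_pending(&1, true))
```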
def update_notification_settings(info, settings) do def update_notification_settings(info, settings) do
settings = settings =
settings settings
@ -176,16 +188,11 @@ def set_subscribers(info, subscribers) do
|> validate_required([:subscribers]) |> validate_required([:subscribers])
end end
-  @spec add_to_mutes(Info.t(), String.t()) :: Changeset.t()
-  def add_to_mutes(info, muted) do
-    set_mutes(info, Enum.uniq([muted | info.mutes]))
-  end
-
-  @spec add_to_muted_notifications(Changeset.t(), Info.t(), String.t(), boolean()) ::
-          Changeset.t()
-  def add_to_muted_notifications(changeset, info, muted, notifications?) do
-    set_notification_mutes(
-      changeset,
+  @spec add_to_mutes(Info.t(), String.t(), boolean()) :: Changeset.t()
+  def add_to_mutes(info, muted, notifications?) do
+    info
+    |> set_mutes(Enum.uniq([muted | info.mutes]))
+    |> set_notification_mutes(
       Enum.uniq([muted | info.muted_notifications]),
       notifications?
     )
@@ -193,12 +200,9 @@ def add_to_muted_notifications(changeset, info, muted, notifications?) do
   @spec remove_from_mutes(Info.t(), String.t()) :: Changeset.t()
   def remove_from_mutes(info, muted) do
-    set_mutes(info, List.delete(info.mutes, muted))
-  end
-
-  @spec remove_from_muted_notifications(Changeset.t(), Info.t(), String.t()) :: Changeset.t()
-  def remove_from_muted_notifications(changeset, info, muted) do
-    set_notification_mutes(changeset, List.delete(info.muted_notifications, muted), true)
+    info
+    |> set_mutes(List.delete(info.mutes, muted))
+    |> set_notification_mutes(List.delete(info.muted_notifications, muted), true)
   end
def add_to_block(info, blocked) do def add_to_block(info, blocked) do
@ -262,9 +266,12 @@ def remote_user_creation(info, params) do
:salmon, :salmon,
:hide_followers, :hide_followers,
:hide_follows, :hide_follows,
:hide_followers_count,
:hide_follows_count,
:follower_count, :follower_count,
:fields, :fields,
:following_count :following_count,
:discoverable
]) ])
|> validate_fields(true) |> validate_fields(true)
end end
@ -281,7 +288,10 @@ def user_upgrade(info, params, remote? \\ false) do
:following_count, :following_count,
:hide_follows, :hide_follows,
:fields, :fields,
:hide_followers :hide_followers,
:discoverable,
:hide_followers_count,
:hide_follows_count
]) ])
|> validate_fields(remote?) |> validate_fields(remote?)
end end
@ -295,13 +305,16 @@ def profile_update(info, params) do
:banner, :banner,
:hide_follows, :hide_follows,
:hide_followers, :hide_followers,
:hide_followers_count,
:hide_follows_count,
:hide_favorites, :hide_favorites,
:background, :background,
:show_role, :show_role,
:skip_thread_containment, :skip_thread_containment,
:fields, :fields,
:raw_fields, :raw_fields,
:pleroma_settings_store :pleroma_settings_store,
:discoverable
]) ])
|> validate_fields() |> validate_fields()
end end
@ -458,7 +471,9 @@ def follow_information_update(info, params) do
:hide_followers, :hide_followers,
:hide_follows, :hide_follows,
:follower_count, :follower_count,
:following_count :following_count,
:hide_followers_count,
:hide_follows_count
]) ])
end end
end end
View file
@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server # Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/> # Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.User.Query do defmodule Pleroma.User.Query do
View file
@ -412,6 +412,7 @@ def delete(%Object{data: %{"id" => id, "actor" => actor}} = object, local \\ tru
end end
end end
@spec block(User.t(), User.t(), String.t() | nil, boolean) :: {:ok, Activity.t() | nil}
def block(blocker, blocked, activity_id \\ nil, local \\ true) do def block(blocker, blocked, activity_id \\ nil, local \\ true) do
outgoing_blocks = Config.get([:activitypub, :outgoing_blocks]) outgoing_blocks = Config.get([:activitypub, :outgoing_blocks])
unfollow_blocked = Config.get([:activitypub, :unfollow_blocked]) unfollow_blocked = Config.get([:activitypub, :unfollow_blocked])
@ -440,10 +441,11 @@ def unblock(blocker, blocked, activity_id \\ nil, local \\ true) do
end end
end end
@spec flag(map()) :: {:ok, Activity.t()} | any
def flag( def flag(
%{ %{
actor: actor, actor: actor,
context: context, context: _context,
account: account, account: account,
statuses: statuses, statuses: statuses,
content: content content: content
@ -455,14 +457,6 @@ def flag(
additional = params[:additional] || %{} additional = params[:additional] || %{}
params = %{
actor: actor,
context: context,
account: account,
statuses: statuses,
content: content
}
additional = additional =
if forward do if forward do
Map.merge(additional, %{"to" => [], "cc" => [account.ap_id]}) Map.merge(additional, %{"to" => [], "cc" => [account.ap_id]})
@ -518,7 +512,7 @@ def fetch_activities_for_context(context, opts \\ %{}) do
end end
@spec fetch_latest_activity_id_for_context(String.t(), keyword() | map()) :: @spec fetch_latest_activity_id_for_context(String.t(), keyword() | map()) ::
Pleroma.FlakeId.t() | nil FlakeId.Ecto.CompatType.t() | nil
def fetch_latest_activity_id_for_context(context, opts \\ %{}) do def fetch_latest_activity_id_for_context(context, opts \\ %{}) do
context context
|> fetch_activities_for_context_query(Map.merge(%{"skip_preload" => true}, opts)) |> fetch_activities_for_context_query(Map.merge(%{"skip_preload" => true}, opts))
@ -527,12 +521,13 @@ def fetch_latest_activity_id_for_context(context, opts \\ %{}) do
|> Repo.one() |> Repo.one()
end end
-  def fetch_public_activities(opts \\ %{}) do
-    q = fetch_activities_query([Pleroma.Constants.as_public()], opts)
+  def fetch_public_activities(opts \\ %{}, pagination \\ :keyset) do
+    opts = Map.drop(opts, ["user"])
 
-    q
+    [Pleroma.Constants.as_public()]
+    |> fetch_activities_query(opts)
     |> restrict_unlisted()
-    |> Pagination.fetch_paginated(opts)
+    |> Pagination.fetch_paginated(opts, pagination)
     |> Enum.reverse()
   end
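A sketch of how a caller picks up the new argument (values hypothetical); existing callers keep the default keyset behaviour:

```elixir
# Default call keeps the previous behaviour (keyset pagination).
activities = ActivityPub.fetch_public_activities(%{"limit" => 20})

# The second argument now selects the pagination strategy explicitly.
activities = ActivityPub.fetch_public_activities(%{"limit" => 20}, :keyset)
```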
@ -841,7 +836,7 @@ defp restrict_muted_reblogs(query, %{"muting_user" => %User{info: info}}) do
defp restrict_muted_reblogs(query, _), do: query defp restrict_muted_reblogs(query, _), do: query
-  defp exclude_poll_votes(query, %{"include_poll_votes" => "true"}), do: query
+  defp exclude_poll_votes(query, %{"include_poll_votes" => true}), do: query
defp exclude_poll_votes(query, _) do defp exclude_poll_votes(query, _) do
if has_named_binding?(query, :object) do if has_named_binding?(query, :object) do
@ -925,11 +920,11 @@ def fetch_activities_query(recipients, opts \\ %{}) do
|> exclude_poll_votes(opts) |> exclude_poll_votes(opts)
end end
-  def fetch_activities(recipients, opts \\ %{}) do
+  def fetch_activities(recipients, opts \\ %{}, pagination \\ :keyset) do
     list_memberships = Pleroma.List.memberships(opts["user"])
 
     fetch_activities_query(recipients ++ list_memberships, opts)
-    |> Pagination.fetch_paginated(opts)
+    |> Pagination.fetch_paginated(opts, pagination)
|> Enum.reverse() |> Enum.reverse()
|> maybe_update_cc(list_memberships, opts["user"]) |> maybe_update_cc(list_memberships, opts["user"])
end end
@ -960,10 +955,15 @@ def fetch_activities_bounded_query(query, recipients, recipients_with_public) do
) )
end end
def fetch_activities_bounded(recipients, recipients_with_public, opts \\ %{}) do def fetch_activities_bounded(
recipients,
recipients_with_public,
opts \\ %{},
pagination \\ :keyset
) do
fetch_activities_query([], opts) fetch_activities_query([], opts)
|> fetch_activities_bounded_query(recipients, recipients_with_public) |> fetch_activities_bounded_query(recipients, recipients_with_public)
|> Pagination.fetch_paginated(opts) |> Pagination.fetch_paginated(opts, pagination)
|> Enum.reverse() |> Enum.reverse()
end end
@ -1003,6 +1003,7 @@ defp object_to_user_data(data) do
locked = data["manuallyApprovesFollowers"] || false locked = data["manuallyApprovesFollowers"] || false
data = Transmogrifier.maybe_fix_user_object(data) data = Transmogrifier.maybe_fix_user_object(data)
discoverable = data["discoverable"] || false
user_data = %{ user_data = %{
ap_id: data["id"], ap_id: data["id"],
@ -1011,7 +1012,8 @@ defp object_to_user_data(data) do
source_data: data, source_data: data,
banner: banner, banner: banner,
fields: fields, fields: fields,
locked: locked locked: locked,
discoverable: discoverable
}, },
avatar: avatar, avatar: avatar,
name: data["name"], name: data["name"],
View file
@ -49,7 +49,8 @@ def user(conn, %{"nickname" => nickname}) do
{:ok, user} <- User.ensure_keys_present(user) do {:ok, user} <- User.ensure_keys_present(user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("user.json", %{user: user})) |> put_view(UserView)
|> render("user.json", %{user: user})
else else
nil -> {:error, :not_found} nil -> {:error, :not_found}
end end
@ -90,7 +91,8 @@ def object_likes(conn, %{"uuid" => uuid, "page" => page}) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(ObjectView.render("likes.json", ap_id, likes, page)) |> put_view(ObjectView)
|> render("likes.json", %{ap_id: ap_id, likes: likes, page: page})
else else
{:public?, false} -> {:public?, false} ->
{:error, :not_found} {:error, :not_found}
@ -104,7 +106,8 @@ def object_likes(conn, %{"uuid" => uuid}) do
likes <- Utils.get_object_likes(object) do likes <- Utils.get_object_likes(object) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(ObjectView.render("likes.json", ap_id, likes)) |> put_view(ObjectView)
|> render("likes.json", %{ap_id: ap_id, likes: likes})
else else
{:public?, false} -> {:public?, false} ->
{:error, :not_found} {:error, :not_found}
@ -158,7 +161,8 @@ defp set_cache_ttl_for(conn, entity) do
def following(%{assigns: %{relay: true}} = conn, _params) do def following(%{assigns: %{relay: true}} = conn, _params) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("following.json", %{user: Relay.get_actor()})) |> put_view(UserView)
|> render("following.json", %{user: Relay.get_actor()})
end end
def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do
@ -170,7 +174,8 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "p
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("following.json", %{user: user, page: page, for: for_user})) |> put_view(UserView)
|> render("following.json", %{user: user, page: page, for: for_user})
else else
{:show_follows, _} -> {:show_follows, _} ->
conn conn
@ -184,7 +189,8 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) d
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do {user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("following.json", %{user: user, for: for_user})) |> put_view(UserView)
|> render("following.json", %{user: user, for: for_user})
end end
end end
@ -192,7 +198,8 @@ def following(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) d
def followers(%{assigns: %{relay: true}} = conn, _params) do def followers(%{assigns: %{relay: true}} = conn, _params) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("followers.json", %{user: Relay.get_actor()})) |> put_view(UserView)
|> render("followers.json", %{user: Relay.get_actor()})
end end
def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "page" => page}) do
@ -204,7 +211,8 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname, "p
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("followers.json", %{user: user, page: page, for: for_user})) |> put_view(UserView)
|> render("followers.json", %{user: user, page: page, for: for_user})
else else
{:show_followers, _} -> {:show_followers, _} ->
conn conn
@ -218,16 +226,48 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"nickname" => nickname}) d
{user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do {user, for_user} <- ensure_user_keys_present_and_maybe_refresh_for_user(user, for_user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("followers.json", %{user: user, for: for_user})) |> put_view(UserView)
|> render("followers.json", %{user: user, for: for_user})
end end
end end
def outbox(conn, %{"nickname" => nickname} = params) do def outbox(conn, %{"nickname" => nickname, "page" => page?} = params)
when page? in [true, "true"] do
with %User{} = user <- User.get_cached_by_nickname(nickname),
{:ok, user} <- User.ensure_keys_present(user) do
activities =
if params["max_id"] do
ActivityPub.fetch_user_activities(user, nil, %{
"max_id" => params["max_id"],
# This is a hack because postgres generates inefficient queries when filtering by
# 'Answer', poll votes will be hidden by the visibility filter in this case anyway
"include_poll_votes" => true,
"limit" => 10
})
else
ActivityPub.fetch_user_activities(user, nil, %{
"limit" => 10,
"include_poll_votes" => true
})
end
conn
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("activity_collection_page.json", %{
activities: activities,
iri: "#{user.ap_id}/outbox"
})
end
end
def outbox(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname), with %User{} = user <- User.get_cached_by_nickname(nickname),
{:ok, user} <- User.ensure_keys_present(user) do {:ok, user} <- User.ensure_keys_present(user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("outbox.json", %{user: user, max_id: params["max_id"]})) |> put_view(UserView)
|> render("activity_collection.json", %{iri: "#{user.ap_id}/outbox"})
end end
end end
@ -275,7 +315,8 @@ defp represent_service_actor(%User{} = user, conn) do
with {:ok, user} <- User.ensure_keys_present(user) do with {:ok, user} <- User.ensure_keys_present(user) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("user.json", %{user: user})) |> put_view(UserView)
|> render("user.json", %{user: user})
else else
nil -> {:error, :not_found} nil -> {:error, :not_found}
end end
@ -296,19 +337,45 @@ def internal_fetch(conn, _params) do
def whoami(%{assigns: %{user: %User{} = user}} = conn, _params) do def whoami(%{assigns: %{user: %User{} = user}} = conn, _params) do
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> json(UserView.render("user.json", %{user: user})) |> put_view(UserView)
|> render("user.json", %{user: user})
end end
def whoami(_conn, _params), do: {:error, :not_found} def whoami(_conn, _params), do: {:error, :not_found}
def read_inbox( def read_inbox(
%{assigns: %{user: %{nickname: nickname} = user}} = conn, %{assigns: %{user: %{nickname: nickname} = user}} = conn,
%{"nickname" => nickname} = params %{"nickname" => nickname, "page" => page?} = params
) do )
when page? in [true, "true"] do
activities =
if params["max_id"] do
ActivityPub.fetch_activities([user.ap_id | user.following], %{
"max_id" => params["max_id"],
"limit" => 10
})
else
ActivityPub.fetch_activities([user.ap_id | user.following], %{"limit" => 10})
end
conn conn
|> put_resp_content_type("application/activity+json") |> put_resp_content_type("application/activity+json")
|> put_view(UserView) |> put_view(UserView)
|> render("inbox.json", user: user, max_id: params["max_id"]) |> render("activity_collection_page.json", %{
activities: activities,
iri: "#{user.ap_id}/inbox"
})
end
def read_inbox(%{assigns: %{user: %{nickname: nickname} = user}} = conn, %{
"nickname" => nickname
}) do
with {:ok, user} <- User.ensure_keys_present(user) do
conn
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("activity_collection.json", %{iri: "#{user.ap_id}/inbox"})
end
end end
def read_inbox(%{assigns: %{user: nil}} = conn, %{"nickname" => nickname}) do def read_inbox(%{assigns: %{user: nil}} = conn, %{"nickname" => nickname}) do
View file
@ -111,11 +111,11 @@ defp should_federate?(inbox, public) do
@spec recipients(User.t(), Activity.t()) :: list(User.t()) | [] @spec recipients(User.t(), Activity.t()) :: list(User.t()) | []
defp recipients(actor, activity) do defp recipients(actor, activity) do
-    {:ok, followers} =
+    followers =
       if actor.follower_address in activity.recipients do
         User.get_external_followers(actor)
       else
-        {:ok, []}
+        []
       end
fetchers = fetchers =
View file
@ -42,8 +42,7 @@ def fix_object(object, options \\ []) do
end end
def fix_summary(%{"summary" => nil} = object) do def fix_summary(%{"summary" => nil} = object) do
object Map.put(object, "summary", "")
|> Map.put("summary", "")
end end
def fix_summary(%{"summary" => _} = object) do def fix_summary(%{"summary" => _} = object) do
@ -51,10 +50,7 @@ def fix_summary(%{"summary" => _} = object) do
object object
end end
def fix_summary(object) do def fix_summary(object), do: Map.put(object, "summary", "")
object
|> Map.put("summary", "")
end
def fix_addressing_list(map, field) do def fix_addressing_list(map, field) do
cond do cond do
@ -74,13 +70,9 @@ def fix_explicit_addressing(
explicit_mentions, explicit_mentions,
follower_collection follower_collection
) do ) do
explicit_to = explicit_to = Enum.filter(to, fn x -> x in explicit_mentions end)
to
|> Enum.filter(fn x -> x in explicit_mentions end)
explicit_cc = explicit_cc = Enum.filter(to, fn x -> x not in explicit_mentions end)
to
|> Enum.filter(fn x -> x not in explicit_mentions end)
final_cc = final_cc =
(cc ++ explicit_cc) (cc ++ explicit_cc)
@ -98,13 +90,19 @@ def fix_explicit_addressing(object, _explicit_mentions, _followers_collection),
def fix_explicit_addressing(%{"directMessage" => true} = object), do: object def fix_explicit_addressing(%{"directMessage" => true} = object), do: object
   def fix_explicit_addressing(object) do
-    explicit_mentions =
-      object
-      |> Utils.determine_explicit_mentions()
+    explicit_mentions = Utils.determine_explicit_mentions(object)
 
-    follower_collection = User.get_cached_by_ap_id(Containment.get_actor(object)).follower_address
+    %User{follower_address: follower_collection} =
+      object
+      |> Containment.get_actor()
+      |> User.get_cached_by_ap_id()
 
-    explicit_mentions = explicit_mentions ++ [Pleroma.Constants.as_public(), follower_collection]
+    explicit_mentions =
+      explicit_mentions ++
+        [
+          Pleroma.Constants.as_public(),
+          follower_collection
+        ]
 
     fix_explicit_addressing(object, explicit_mentions, follower_collection)
   end
@ -148,48 +146,25 @@ def fix_addressing(object) do
end end
def fix_actor(%{"attributedTo" => actor} = object) do def fix_actor(%{"attributedTo" => actor} = object) do
object Map.put(object, "actor", Containment.get_actor(%{"actor" => actor}))
|> Map.put("actor", Containment.get_actor(%{"actor" => actor}))
end end
def fix_in_reply_to(object, options \\ []) def fix_in_reply_to(object, options \\ [])
   def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
       when not is_nil(in_reply_to) do
-    in_reply_to_id =
-      cond do
-        is_bitstring(in_reply_to) ->
-          in_reply_to
-
-        is_map(in_reply_to) && is_bitstring(in_reply_to["id"]) ->
-          in_reply_to["id"]
-
-        is_list(in_reply_to) && is_bitstring(Enum.at(in_reply_to, 0)) ->
-          Enum.at(in_reply_to, 0)
-
-        # Maybe I should output an error too?
-        true ->
-          ""
-      end
-
+    in_reply_to_id = prepare_in_reply_to(in_reply_to)
     object = Map.put(object, "inReplyToAtomUri", in_reply_to_id)
 
     if Federator.allowed_incoming_reply_depth?(options[:depth]) do
-      case get_obj_helper(in_reply_to_id, options) do
-        {:ok, replied_object} ->
-          with %Activity{} = _activity <-
-                 Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
-            object
-            |> Map.put("inReplyTo", replied_object.data["id"])
-            |> Map.put("inReplyToAtomUri", object["inReplyToAtomUri"] || in_reply_to_id)
-            |> Map.put("conversation", replied_object.data["context"] || object["conversation"])
-            |> Map.put("context", replied_object.data["context"] || object["conversation"])
-          else
-            e ->
-              Logger.error("Couldn't fetch #{inspect(in_reply_to_id)}, error: #{inspect(e)}")
-              object
-          end
-
+      with {:ok, replied_object} <- get_obj_helper(in_reply_to_id, options),
+           %Activity{} = _ <- Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
+        object
+        |> Map.put("inReplyTo", replied_object.data["id"])
+        |> Map.put("inReplyToAtomUri", object["inReplyToAtomUri"] || in_reply_to_id)
+        |> Map.put("conversation", replied_object.data["context"] || object["conversation"])
+        |> Map.put("context", replied_object.data["context"] || object["conversation"])
+      else
         e ->
           Logger.error("Couldn't fetch #{inspect(in_reply_to_id)}, error: #{inspect(e)}")
           object
@ -201,6 +176,22 @@ def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
def fix_in_reply_to(object, _options), do: object def fix_in_reply_to(object, _options), do: object
defp prepare_in_reply_to(in_reply_to) do
cond do
is_bitstring(in_reply_to) ->
in_reply_to
is_map(in_reply_to) && is_bitstring(in_reply_to["id"]) ->
in_reply_to["id"]
is_list(in_reply_to) && is_bitstring(Enum.at(in_reply_to, 0)) ->
Enum.at(in_reply_to, 0)
true ->
""
end
end
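The extracted helper accepts the three shapes `inReplyTo` commonly arrives in; illustrative calls (the function is private, so this is only a sketch):

```elixir
prepare_in_reply_to("https://example.com/objects/1")
#=> "https://example.com/objects/1"
prepare_in_reply_to(%{"id" => "https://example.com/objects/1"})
#=> "https://example.com/objects/1"
prepare_in_reply_to(["https://example.com/objects/1", "ignored"])
#=> "https://example.com/objects/1"
prepare_in_reply_to(%{})
#=> ""
```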
def fix_context(object) do def fix_context(object) do
context = object["context"] || object["conversation"] || Utils.generate_context_id() context = object["context"] || object["conversation"] || Utils.generate_context_id()
@ -211,11 +202,9 @@ def fix_context(object) do
def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachment) do def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachment) do
attachments = attachments =
attachment Enum.map(attachment, fn data ->
|> Enum.map(fn data ->
media_type = data["mediaType"] || data["mimeType"] media_type = data["mediaType"] || data["mimeType"]
href = data["url"] || data["href"] href = data["url"] || data["href"]
url = [%{"type" => "Link", "mediaType" => media_type, "href" => href}] url = [%{"type" => "Link", "mediaType" => media_type, "href" => href}]
data data
@ -223,30 +212,25 @@ def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachm
|> Map.put("url", url) |> Map.put("url", url)
end) end)
object Map.put(object, "attachment", attachments)
|> Map.put("attachment", attachments)
end end
def fix_attachments(%{"attachment" => attachment} = object) when is_map(attachment) do def fix_attachments(%{"attachment" => attachment} = object) when is_map(attachment) do
Map.put(object, "attachment", [attachment]) object
|> Map.put("attachment", [attachment])
|> fix_attachments() |> fix_attachments()
end end
def fix_attachments(object), do: object def fix_attachments(object), do: object
def fix_url(%{"url" => url} = object) when is_map(url) do def fix_url(%{"url" => url} = object) when is_map(url) do
object Map.put(object, "url", url["href"])
|> Map.put("url", url["href"])
end end
def fix_url(%{"type" => "Video", "url" => url} = object) when is_list(url) do def fix_url(%{"type" => "Video", "url" => url} = object) when is_list(url) do
first_element = Enum.at(url, 0) first_element = Enum.at(url, 0)
link_element = link_element = Enum.find(url, fn x -> is_map(x) and x["mimeType"] == "text/html" end)
url
|> Enum.filter(fn x -> is_map(x) end)
|> Enum.filter(fn x -> x["mimeType"] == "text/html" end)
|> Enum.at(0)
object object
|> Map.put("attachment", [first_element]) |> Map.put("attachment", [first_element])
@ -264,36 +248,32 @@ def fix_url(%{"type" => object_type, "url" => url} = object)
true -> "" true -> ""
end end
object Map.put(object, "url", url_string)
|> Map.put("url", url_string)
end end
def fix_url(object), do: object def fix_url(object), do: object
def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do
emoji = tags |> Enum.filter(fn data -> data["type"] == "Emoji" and data["icon"] end)
emoji = emoji =
emoji tags
|> Enum.filter(fn data -> data["type"] == "Emoji" and data["icon"] end)
|> Enum.reduce(%{}, fn data, mapping -> |> Enum.reduce(%{}, fn data, mapping ->
name = String.trim(data["name"], ":") name = String.trim(data["name"], ":")
mapping |> Map.put(name, data["icon"]["url"]) Map.put(mapping, name, data["icon"]["url"])
end) end)
# we merge mastodon and pleroma emoji into a single mapping, to allow for both wire formats # we merge mastodon and pleroma emoji into a single mapping, to allow for both wire formats
emoji = Map.merge(object["emoji"] || %{}, emoji) emoji = Map.merge(object["emoji"] || %{}, emoji)
object Map.put(object, "emoji", emoji)
|> Map.put("emoji", emoji)
end end
def fix_emoji(%{"tag" => %{"type" => "Emoji"} = tag} = object) do def fix_emoji(%{"tag" => %{"type" => "Emoji"} = tag} = object) do
name = String.trim(tag["name"], ":") name = String.trim(tag["name"], ":")
emoji = %{name => tag["icon"]["url"]} emoji = %{name => tag["icon"]["url"]}
object Map.put(object, "emoji", emoji)
|> Map.put("emoji", emoji)
end end
def fix_emoji(object), do: object def fix_emoji(object), do: object
@ -304,17 +284,13 @@ def fix_tag(%{"tag" => tag} = object) when is_list(tag) do
|> Enum.filter(fn data -> data["type"] == "Hashtag" and data["name"] end) |> Enum.filter(fn data -> data["type"] == "Hashtag" and data["name"] end)
|> Enum.map(fn data -> String.slice(data["name"], 1..-1) end) |> Enum.map(fn data -> String.slice(data["name"], 1..-1) end)
combined = tag ++ tags Map.put(object, "tag", tag ++ tags)
object
|> Map.put("tag", combined)
end end
def fix_tag(%{"tag" => %{"type" => "Hashtag", "name" => hashtag} = tag} = object) do def fix_tag(%{"tag" => %{"type" => "Hashtag", "name" => hashtag} = tag} = object) do
combined = [tag, String.slice(hashtag, 1..-1)] combined = [tag, String.slice(hashtag, 1..-1)]
object Map.put(object, "tag", combined)
|> Map.put("tag", combined)
end end
def fix_tag(%{"tag" => %{} = tag} = object), do: Map.put(object, "tag", [tag]) def fix_tag(%{"tag" => %{} = tag} = object), do: Map.put(object, "tag", [tag])
@ -326,8 +302,7 @@ def fix_content_map(%{"contentMap" => content_map} = object) do
content_groups = Map.to_list(content_map) content_groups = Map.to_list(content_map)
{_, content} = Enum.at(content_groups, 0) {_, content} = Enum.at(content_groups, 0)
object Map.put(object, "content", content)
|> Map.put("content", content)
end end
def fix_content_map(object), do: object def fix_content_map(object), do: object
@ -336,16 +311,11 @@ def fix_type(object, options \\ [])
   def fix_type(%{"inReplyTo" => reply_id, "name" => _} = object, options)
       when is_binary(reply_id) do
-    reply =
-      with true <- Federator.allowed_incoming_reply_depth?(options[:depth]),
-           {:ok, object} <- get_obj_helper(reply_id, options) do
-        object
-      end
-
-    if reply && reply.data["type"] == "Question" do
+    with true <- Federator.allowed_incoming_reply_depth?(options[:depth]),
+         {:ok, %{data: %{"type" => "Question"} = _} = _} <- get_obj_helper(reply_id, options) do
       Map.put(object, "type", "Answer")
     else
-      object
+      _ -> object
     end
end end
@ -377,6 +347,17 @@ defp get_follow_activity(follow_object, followed) do
end end
end end
# Reduce the object list to find the reported user.
defp get_reported(objects) do
Enum.reduce_while(objects, nil, fn ap_id, _ ->
with %User{} = user <- User.get_cached_by_ap_id(ap_id) do
{:halt, user}
else
_ -> {:cont, nil}
end
end)
end
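Illustrative behaviour of the extracted helper (it is private; the ids are hypothetical): the first AP id that resolves to a cached user halts the reduction.

```elixir
get_reported([
  "https://remote.example/objects/1",
  "https://remote.example/users/alice"
])
#=> %User{ap_id: "https://remote.example/users/alice", ...}
#   (assuming only the second id belongs to a known user; nil if none do)
```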
def handle_incoming(data, options \\ []) def handle_incoming(data, options \\ [])
# Flag objects are placed ahead of the ID check because Mastodon 2.8 and earlier send them # Flag objects are placed ahead of the ID check because Mastodon 2.8 and earlier send them
@ -385,31 +366,19 @@ def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} =
     with context <- data["context"] || Utils.generate_context_id(),
          content <- data["content"] || "",
          %User{} = actor <- User.get_cached_by_ap_id(actor),
          # Reduce the object list to find the reported user.
-         %User{} = account <-
-           Enum.reduce_while(objects, nil, fn ap_id, _ ->
-             with %User{} = user <- User.get_cached_by_ap_id(ap_id) do
-               {:halt, user}
-             else
-               _ -> {:cont, nil}
-             end
-           end),
+         %User{} = account <- get_reported(objects),
          # Remove the reported user from the object list.
          statuses <- Enum.filter(objects, fn ap_id -> ap_id != account.ap_id end) do
-      params = %{
+      %{
         actor: actor,
         context: context,
         account: account,
         statuses: statuses,
         content: content,
-        additional: %{
-          "cc" => [account.ap_id]
-        }
+        additional: %{"cc" => [account.ap_id]}
       }
-
-      ActivityPub.flag(params)
+      |> ActivityPub.flag()
end end
end end
@ -756,8 +725,12 @@ def handle_incoming(
def handle_incoming(_, _), do: :error def handle_incoming(_, _), do: :error
+  @spec get_obj_helper(String.t(), Keyword.t()) :: {:ok, Object.t()} | nil
   def get_obj_helper(id, options \\ []) do
-    if object = Object.normalize(id, true, options), do: {:ok, object}, else: nil
+    case Object.normalize(id, true, options) do
+      %Object{} = object -> {:ok, object}
+      _ -> nil
+    end
   end
def set_reply_to_uri(%{"inReplyTo" => in_reply_to} = object) when is_binary(in_reply_to) do def set_reply_to_uri(%{"inReplyTo" => in_reply_to} = object) when is_binary(in_reply_to) do
@ -856,27 +829,24 @@ def prepare_outgoing(%{"type" => _type} = data) do
{:ok, data} {:ok, data}
end end
-  def maybe_fix_object_url(data) do
-    if is_binary(data["object"]) and not String.starts_with?(data["object"], "http") do
-      case get_obj_helper(data["object"]) do
-        {:ok, relative_object} ->
-          if relative_object.data["external_url"] do
-            _data =
-              data
-              |> Map.put("object", relative_object.data["external_url"])
-          else
-            data
-          end
-
-        e ->
-          Logger.error("Couldn't fetch #{data["object"]} #{inspect(e)}")
-          data
-      end
+  def maybe_fix_object_url(%{"object" => object} = data) when is_binary(object) do
+    with false <- String.starts_with?(object, "http"),
+         {:fetch, {:ok, relative_object}} <- {:fetch, get_obj_helper(object)},
+         %{data: %{"external_url" => external_url}} when not is_nil(external_url) <-
+           relative_object do
+      Map.put(data, "object", external_url)
     else
-      data
+      {:fetch, e} ->
+        Logger.error("Couldn't fetch #{object} #{inspect(e)}")
+        data
+
+      _ ->
+        data
     end
   end
+
+  def maybe_fix_object_url(data), do: data
def add_hashtags(object) do def add_hashtags(object) do
tags = tags =
(object["tag"] || []) (object["tag"] || [])
@ -894,53 +864,49 @@ def add_hashtags(object) do
tag tag
end) end)
object Map.put(object, "tag", tags)
|> Map.put("tag", tags)
end end
def add_mention_tags(object) do def add_mention_tags(object) do
mentions = mentions =
object object
|> Utils.get_notified_from_object() |> Utils.get_notified_from_object()
|> Enum.map(fn user -> |> Enum.map(&build_mention_tag/1)
%{"type" => "Mention", "href" => user.ap_id, "name" => "@#{user.nickname}"}
end)
tags = object["tag"] || [] tags = object["tag"] || []
object Map.put(object, "tag", tags ++ mentions)
|> Map.put("tag", tags ++ mentions)
end end
def add_emoji_tags(%User{info: %{"emoji" => _emoji} = user_info} = object) do defp build_mention_tag(%{ap_id: ap_id, nickname: nickname} = _) do
user_info = add_emoji_tags(user_info) %{"type" => "Mention", "href" => ap_id, "name" => "@#{nickname}"}
end
object def take_emoji_tags(%User{info: %{emoji: emoji} = _user_info} = _user) do
|> Map.put(:info, user_info) emoji
|> Enum.flat_map(&Map.to_list/1)
|> Enum.map(&build_emoji_tag/1)
end end
# TODO: we should probably send mtime instead of unix epoch time for updated # TODO: we should probably send mtime instead of unix epoch time for updated
def add_emoji_tags(%{"emoji" => emoji} = object) do def add_emoji_tags(%{"emoji" => emoji} = object) do
tags = object["tag"] || [] tags = object["tag"] || []
out = out = Enum.map(emoji, &build_emoji_tag/1)
emoji
|> Enum.map(fn {name, url} ->
%{
"icon" => %{"url" => url, "type" => "Image"},
"name" => ":" <> name <> ":",
"type" => "Emoji",
"updated" => "1970-01-01T00:00:00Z",
"id" => url
}
end)
object Map.put(object, "tag", tags ++ out)
|> Map.put("tag", tags ++ out)
end end
def add_emoji_tags(object) do def add_emoji_tags(object), do: object
object
defp build_emoji_tag({name, url}) do
%{
"icon" => %{"url" => url, "type" => "Image"},
"name" => ":" <> name <> ":",
"type" => "Emoji",
"updated" => "1970-01-01T00:00:00Z",
"id" => url
}
end end
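Illustrative input/output for the extracted tag builder (emoji name and URL are hypothetical):

```elixir
build_emoji_tag({"firefox", "https://example.com/emoji/firefox.png"})
#=> %{
#     "icon" => %{"url" => "https://example.com/emoji/firefox.png", "type" => "Image"},
#     "name" => ":firefox:",
#     "type" => "Emoji",
#     "updated" => "1970-01-01T00:00:00Z",
#     "id" => "https://example.com/emoji/firefox.png"
#   }
```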
def set_conversation(object) do def set_conversation(object) do
@ -960,9 +926,7 @@ def set_type(object), do: object
def add_attributed_to(object) do def add_attributed_to(object) do
attributed_to = object["attributedTo"] || object["actor"] attributed_to = object["attributedTo"] || object["actor"]
Map.put(object, "attributedTo", attributed_to)
object
|> Map.put("attributedTo", attributed_to)
end end
def prepare_attachments(object) do def prepare_attachments(object) do
@ -973,30 +937,18 @@ def prepare_attachments(object) do
%{"url" => href, "mediaType" => media_type, "name" => data["name"], "type" => "Document"} %{"url" => href, "mediaType" => media_type, "name" => data["name"], "type" => "Document"}
end) end)
object Map.put(object, "attachment", attachments)
|> Map.put("attachment", attachments)
end end
defp strip_internal_fields(object) do defp strip_internal_fields(object) do
object object
|> Map.drop([ |> Map.drop(Pleroma.Constants.object_internal_fields())
"likes",
"like_count",
"announcements",
"announcement_count",
"emoji",
"context_id",
"deleted_activity_id"
])
end end
defp strip_internal_tags(%{"tag" => tags} = object) do defp strip_internal_tags(%{"tag" => tags} = object) do
tags = tags = Enum.filter(tags, fn x -> is_map(x) end)
tags
|> Enum.filter(fn x -> is_map(x) end)
object Map.put(object, "tag", tags)
|> Map.put("tag", tags)
end end
defp strip_internal_tags(object), do: object defp strip_internal_tags(object), do: object
@ -1050,8 +1002,8 @@ def upgrade_user_from_ap_id(ap_id) do
with %User{local: false} = user <- User.get_cached_by_ap_id(ap_id), with %User{local: false} = user <- User.get_cached_by_ap_id(ap_id),
{:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id), {:ok, data} <- ActivityPub.fetch_and_prepare_user_from_ap_id(ap_id),
already_ap <- User.ap_enabled?(user), already_ap <- User.ap_enabled?(user),
{:ok, user} <- user |> User.upgrade_changeset(data) |> User.update_and_set_cache() do {:ok, user} <- upgrade_user(user, data) do
unless already_ap do if not already_ap do
TransmogrifierWorker.enqueue("user_upgrade", %{"user_id" => user.id}) TransmogrifierWorker.enqueue("user_upgrade", %{"user_id" => user.id})
end end
@ -1062,6 +1014,12 @@ def upgrade_user_from_ap_id(ap_id) do
end end
end end
defp upgrade_user(user, data) do
user
|> User.upgrade_changeset(data, true)
|> User.update_and_set_cache()
end
def maybe_retire_websub(ap_id) do def maybe_retire_websub(ap_id) do
# some sanity checks # some sanity checks
if is_binary(ap_id) && String.length(ap_id) > 8 do if is_binary(ap_id) && String.length(ap_id) > 8 do
@ -1075,16 +1033,11 @@ def maybe_retire_websub(ap_id) do
end end
end end
-  def maybe_fix_user_url(data) do
-    if is_map(data["url"]) do
-      Map.put(data, "url", data["url"]["href"])
-    else
-      data
-    end
+  def maybe_fix_user_url(%{"url" => url} = data) when is_map(url) do
+    Map.put(data, "url", url["href"])
   end
 
-  def maybe_fix_user_object(data) do
-    data
-    |> maybe_fix_user_url
-  end
+  def maybe_fix_user_url(data), do: data
+
+  def maybe_fix_user_object(data), do: maybe_fix_user_url(data)
end end
View file
@ -33,50 +33,40 @@ def normalize_params(params) do
Map.put(params, "actor", get_ap_id(params["actor"])) Map.put(params, "actor", get_ap_id(params["actor"]))
end end
-  def determine_explicit_mentions(%{"tag" => tag} = _object) when is_list(tag) do
-    tag
-    |> Enum.filter(fn x -> is_map(x) end)
-    |> Enum.filter(fn x -> x["type"] == "Mention" end)
-    |> Enum.map(fn x -> x["href"] end)
+  @spec determine_explicit_mentions(map()) :: map()
+  def determine_explicit_mentions(%{"tag" => tag} = _) when is_list(tag) do
+    Enum.flat_map(tag, fn
+      %{"type" => "Mention", "href" => href} -> [href]
+      _ -> []
+    end)
   end
 
   def determine_explicit_mentions(%{"tag" => tag} = object) when is_map(tag) do
-    Map.put(object, "tag", [tag])
+    object
+    |> Map.put("tag", [tag])
     |> determine_explicit_mentions()
   end
 
   def determine_explicit_mentions(_), do: []
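With the flat-map rewrite, only well-formed `Mention` tags contribute hrefs; a sketch with hypothetical data:

```elixir
Utils.determine_explicit_mentions(%{
  "tag" => [
    %{"type" => "Mention", "href" => "https://example.com/users/alice"},
    %{"type" => "Hashtag", "name" => "#cats"},
    "garbage"
  ]
})
#=> ["https://example.com/users/alice"]
```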
@spec recipient_in_collection(any(), any()) :: boolean()
defp recipient_in_collection(ap_id, coll) when is_binary(coll), do: ap_id == coll defp recipient_in_collection(ap_id, coll) when is_binary(coll), do: ap_id == coll
defp recipient_in_collection(ap_id, coll) when is_list(coll), do: ap_id in coll defp recipient_in_collection(ap_id, coll) when is_list(coll), do: ap_id in coll
defp recipient_in_collection(_, _), do: false defp recipient_in_collection(_, _), do: false
+  @spec recipient_in_message(User.t(), User.t(), map()) :: boolean()
   def recipient_in_message(%User{ap_id: ap_id} = recipient, %User{} = actor, params) do
+    addresses = [params["to"], params["cc"], params["bto"], params["bcc"]]
+
     cond do
-      recipient_in_collection(ap_id, params["to"]) ->
-        true
-
-      recipient_in_collection(ap_id, params["cc"]) ->
-        true
-
-      recipient_in_collection(ap_id, params["bto"]) ->
-        true
-
-      recipient_in_collection(ap_id, params["bcc"]) ->
-        true
+      Enum.any?(addresses, &recipient_in_collection(ap_id, &1)) -> true
 
       # if the message is unaddressed at all, then assume it is directly addressed
       # to the recipient
-      !params["to"] && !params["cc"] && !params["bto"] && !params["bcc"] ->
-        true
+      Enum.all?(addresses, &is_nil(&1)) -> true
 
       # if the message is sent from somebody the user is following, then assume it
       # is addressed to the recipient
-      User.following?(recipient, actor) ->
-        true
-
-      true ->
-        false
+      User.following?(recipient, actor) -> true
+      true -> false
     end
   end
@ -179,53 +169,58 @@ def maybe_federate(_), do: :ok
Adds an id and a published data if they aren't there, Adds an id and a published data if they aren't there,
also adds it to an included object also adds it to an included object
""" """
def lazy_put_activity_defaults(map, fake? \\ false) do @spec lazy_put_activity_defaults(map(), boolean) :: map()
map = def lazy_put_activity_defaults(map, fake? \\ false)
if not fake? do
%{data: %{"id" => context}, id: context_id} = create_context(map["context"])
map def lazy_put_activity_defaults(map, true) do
|> Map.put_new_lazy("id", &generate_activity_id/0) map
|> Map.put_new_lazy("published", &make_date/0) |> Map.put_new("id", "pleroma:fakeid")
|> Map.put_new("context", context) |> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context_id", context_id) |> Map.put_new("context", "pleroma:fakecontext")
else |> Map.put_new("context_id", -1)
map |> lazy_put_object_defaults(true)
|> Map.put_new("id", "pleroma:fakeid") end
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", "pleroma:fakecontext")
|> Map.put_new("context_id", -1)
end
if is_map(map["object"]) do def lazy_put_activity_defaults(map, _fake?) do
object = lazy_put_object_defaults(map["object"], map, fake?) %{data: %{"id" => context}, id: context_id} = create_context(map["context"])
%{map | "object" => object}
else map
|> Map.put_new_lazy("id", &generate_activity_id/0)
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", context)
|> Map.put_new("context_id", context_id)
|> lazy_put_object_defaults(false)
end
# Adds an id and published date if they aren't there.
#
@spec lazy_put_object_defaults(map(), boolean()) :: map()
defp lazy_put_object_defaults(%{"object" => map} = activity, true)
when is_map(map) do
object =
map map
end |> Map.put_new("id", "pleroma:fake_object_id")
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", activity["context"])
|> Map.put_new("context_id", activity["context_id"])
|> Map.put_new("fake", true)
%{activity | "object" => object}
end end
@doc """ defp lazy_put_object_defaults(%{"object" => map} = activity, _)
Adds an id and published date if they aren't there. when is_map(map) do
""" object =
def lazy_put_object_defaults(map, activity \\ %{}, fake?) map
|> Map.put_new_lazy("id", &generate_object_id/0)
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", activity["context"])
|> Map.put_new("context_id", activity["context_id"])
def lazy_put_object_defaults(map, activity, true = _fake?) do %{activity | "object" => object}
map
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("id", "pleroma:fake_object_id")
|> Map.put_new("context", activity["context"])
|> Map.put_new("fake", true)
|> Map.put_new("context_id", activity["context_id"])
end end
def lazy_put_object_defaults(map, activity, _fake?) do defp lazy_put_object_defaults(activity, _), do: activity
map
|> Map.put_new_lazy("id", &generate_object_id/0)
|> Map.put_new_lazy("published", &make_date/0)
|> Map.put_new("context", activity["context"])
|> Map.put_new("context_id", activity["context_id"])
end
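A sketch of the `fake?` path with hypothetical input: placeholder ids and contexts are used so nothing real is generated or persisted.

```elixir
activity = Utils.lazy_put_activity_defaults(%{"object" => %{}}, true)

activity["id"]             #=> "pleroma:fakeid"
activity["context"]        #=> "pleroma:fakecontext"
activity["object"]["id"]   #=> "pleroma:fake_object_id"
activity["object"]["fake"] #=> true
```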
@doc """ @doc """
Inserts a full object if it is contained in an activity. Inserts a full object if it is contained in an activity.
@ -345,24 +340,24 @@ defp fetch_likes(object) do
@doc """ @doc """
Updates a follow activity's state (for locked accounts). Updates a follow activity's state (for locked accounts).
""" """
+  @spec update_follow_state_for_all(Activity.t(), String.t()) :: {:ok, Activity} | {:error, any()}
   def update_follow_state_for_all(
         %Activity{data: %{"actor" => actor, "object" => object}} = activity,
         state
       ) do
-    try do
-      Ecto.Adapters.SQL.query!(
-        Repo,
-        "UPDATE activities SET data = jsonb_set(data, '{state}', $1) WHERE data->>'type' = 'Follow' AND data->>'actor' = $2 AND data->>'object' = $3 AND data->>'state' = 'pending'",
-        [state, actor, object]
-      )
+    "Follow"
+    |> Activity.Queries.by_type()
+    |> Activity.Queries.by_actor(actor)
+    |> Activity.Queries.by_object_id(object)
+    |> where(fragment("data->>'state' = 'pending'"))
+    |> update(set: [data: fragment("jsonb_set(data, '{state}', ?)", ^state)])
+    |> Repo.update_all([])
 
-      User.set_follow_state_cache(actor, object, state)
-      activity = Activity.get_by_id(activity.id)
-      {:ok, activity}
-    rescue
-      e ->
-        {:error, e}
-    end
+    User.set_follow_state_cache(actor, object, state)
+
+    activity = Activity.get_by_id(activity.id)
+
+    {:ok, activity}
   end
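Minimal usage sketch (the variable is hypothetical); the pending `Follow` rows are now updated through a single `Repo.update_all/2` query instead of raw SQL:

```elixir
# `follow_activity` is assumed to be the pending Follow %Activity{}.
{:ok, follow_activity} = Utils.update_follow_state_for_all(follow_activity, "accept")
```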
def update_follow_state( def update_follow_state(
@ -413,6 +408,7 @@ def fetch_latest_follow(%User{ap_id: follower_id}, %User{ap_id: followed_id}) do
@doc """ @doc """
Retruns an existing announce activity if the notice has already been announced Retruns an existing announce activity if the notice has already been announced
""" """
@spec get_existing_announce(String.t(), map()) :: Activity.t() | nil
def get_existing_announce(actor, %{data: %{"id" => ap_id}}) do def get_existing_announce(actor, %{data: %{"id" => ap_id}}) do
"Announce" "Announce"
|> Activity.Queries.by_type() |> Activity.Queries.by_type()
@ -495,33 +491,35 @@ def make_unlike_data(
|> maybe_put("id", activity_id) |> maybe_put("id", activity_id)
end end
@spec add_announce_to_object(Activity.t(), Object.t()) ::
{:ok, Object.t()} | {:error, Ecto.Changeset.t()}
def add_announce_to_object( def add_announce_to_object(
%Activity{ %Activity{data: %{"actor" => actor, "cc" => [Pleroma.Constants.as_public()]}},
data: %{"actor" => actor, "cc" => [Pleroma.Constants.as_public()]}
},
object object
) do ) do
announcements = announcements = take_announcements(object)
if is_list(object.data["announcements"]) do
Enum.uniq([actor | object.data["announcements"]])
else
[actor]
end
update_element_in_object("announcement", announcements, object) with announcements <- Enum.uniq([actor | announcements]) do
update_element_in_object("announcement", announcements, object)
end
end end
def add_announce_to_object(_, object), do: {:ok, object} def add_announce_to_object(_, object), do: {:ok, object}
@spec remove_announce_from_object(Activity.t(), Object.t()) ::
{:ok, Object.t()} | {:error, Ecto.Changeset.t()}
def remove_announce_from_object(%Activity{data: %{"actor" => actor}}, object) do def remove_announce_from_object(%Activity{data: %{"actor" => actor}}, object) do
announcements = with announcements <- List.delete(take_announcements(object), actor) do
if is_list(object.data["announcements"]), do: object.data["announcements"], else: []
with announcements <- announcements |> List.delete(actor) do
update_element_in_object("announcement", announcements, object) update_element_in_object("announcement", announcements, object)
end end
end end
defp take_announcements(%{data: %{"announcements" => announcements}} = _)
when is_list(announcements),
do: announcements
defp take_announcements(_), do: []
#### Unfollow-related helpers #### Unfollow-related helpers
def make_unfollow_data(follower, followed, follow_activity, activity_id) do def make_unfollow_data(follower, followed, follow_activity, activity_id) do
@ -535,6 +533,7 @@ def make_unfollow_data(follower, followed, follow_activity, activity_id) do
end end
#### Block-related helpers #### Block-related helpers
@spec fetch_latest_block(User.t(), User.t()) :: Activity.t() | nil
def fetch_latest_block(%User{ap_id: blocker_id}, %User{ap_id: blocked_id}) do def fetch_latest_block(%User{ap_id: blocker_id}, %User{ap_id: blocked_id}) do
"Block" "Block"
|> Activity.Queries.by_type() |> Activity.Queries.by_type()
@ -583,28 +582,32 @@ def make_create_data(params, additional) do
end end
#### Flag-related helpers #### Flag-related helpers
@spec make_flag_data(map(), map()) :: map()
def make_flag_data(params, additional) do def make_flag_data(%{actor: actor, context: context, content: content} = params, additional) do
status_ap_ids =
Enum.map(params.statuses || [], fn
%Activity{} = act -> act.data["id"]
act when is_map(act) -> act["id"]
act when is_binary(act) -> act
end)
object = [params.account.ap_id] ++ status_ap_ids
%{ %{
"type" => "Flag", "type" => "Flag",
"actor" => params.actor.ap_id, "actor" => actor.ap_id,
"content" => params.content, "content" => content,
"object" => object, "object" => build_flag_object(params),
"context" => params.context, "context" => context,
"state" => "open" "state" => "open"
} }
|> Map.merge(additional) |> Map.merge(additional)
end end
def make_flag_data(_, _), do: %{}
defp build_flag_object(%{account: account, statuses: statuses} = _) do
[account.ap_id] ++
Enum.map(statuses || [], fn
%Activity{} = act -> act.data["id"]
act when is_map(act) -> act["id"]
act when is_binary(act) -> act
end)
end
defp build_flag_object(_), do: []
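Rough sketch of the resulting flag data (actors and ids are hypothetical); the reported account's AP id comes first, followed by the offending statuses:

```elixir
Utils.make_flag_data(
  %{
    actor: %User{ap_id: "https://example.com/users/admin"},
    account: %User{ap_id: "https://remote.example/users/spammer"},
    statuses: ["https://remote.example/objects/1"],
    content: "spam",
    context: "https://example.com/contexts/abc"
  },
  %{}
)
#=> %{
#     "type" => "Flag",
#     "actor" => "https://example.com/users/admin",
#     "content" => "spam",
#     "object" => ["https://remote.example/users/spammer", "https://remote.example/objects/1"],
#     "context" => "https://example.com/contexts/abc",
#     "state" => "open"
#   }
```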
@doc """ @doc """
Fetches the OrderedCollection/OrderedCollectionPage from `from`, limiting the amount of pages fetched after Fetches the OrderedCollection/OrderedCollectionPage from `from`, limiting the amount of pages fetched after
the first one to `pages_left` pages. the first one to `pages_left` pages.
View file
@ -37,12 +37,12 @@ def render("object.json", %{object: %Activity{} = activity}) do
Map.merge(base, additional) Map.merge(base, additional)
end end
def render("likes.json", ap_id, likes, page) do def render("likes.json", %{ap_id: ap_id, likes: likes, page: page}) do
collection(likes, "#{ap_id}/likes", page) collection(likes, "#{ap_id}/likes", page)
|> Map.merge(Pleroma.Web.ActivityPub.Utils.make_json_ld_header()) |> Map.merge(Pleroma.Web.ActivityPub.Utils.make_json_ld_header())
end end
def render("likes.json", ap_id, likes) do def render("likes.json", %{ap_id: ap_id, likes: likes}) do
%{ %{
"id" => "#{ap_id}/likes", "id" => "#{ap_id}/likes",
"type" => "OrderedCollection", "type" => "OrderedCollection",
@ -8,7 +8,6 @@ defmodule Pleroma.Web.ActivityPub.UserView do
alias Pleroma.Keys alias Pleroma.Keys
alias Pleroma.Repo alias Pleroma.Repo
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Transmogrifier alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.Endpoint alias Pleroma.Web.Endpoint
@ -75,10 +74,7 @@ def render("user.json", %{user: user}) do
endpoints = render("endpoints.json", %{user: user}) endpoints = render("endpoints.json", %{user: user})
user_tags = emoji_tags = Transmogrifier.take_emoji_tags(user)
user
|> Transmogrifier.add_emoji_tags()
|> Map.get("tag", [])
fields = fields =
user.info user.info
@ -110,7 +106,8 @@ def render("user.json", %{user: user}) do
}, },
"endpoints" => endpoints, "endpoints" => endpoints,
"attachment" => fields, "attachment" => fields,
"tag" => (user.info.source_data["tag"] || []) ++ user_tags "tag" => (user.info.source_data["tag"] || []) ++ emoji_tags,
"discoverable" => user.info.discoverable
} }
|> Map.merge(maybe_make_image(&User.avatar_url/2, "icon", user)) |> Map.merge(maybe_make_image(&User.avatar_url/2, "icon", user))
|> Map.merge(maybe_make_image(&User.banner_url/2, "image", user)) |> Map.merge(maybe_make_image(&User.banner_url/2, "image", user))
@ -118,30 +115,34 @@ def render("user.json", %{user: user}) do
end end
def render("following.json", %{user: user, page: page} = opts) do def render("following.json", %{user: user, page: page} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_follows showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_follows
showing_count = showing_items || !user.info.hide_follows_count
query = User.get_friends_query(user) query = User.get_friends_query(user)
query = from(user in query, select: [:ap_id]) query = from(user in query, select: [:ap_id])
following = Repo.all(query) following = Repo.all(query)
total = total =
if showing do if showing_count do
length(following) length(following)
else else
0 0
end end
collection(following, "#{user.ap_id}/following", page, showing, total) collection(following, "#{user.ap_id}/following", page, showing_items, total)
|> Map.merge(Utils.make_json_ld_header()) |> Map.merge(Utils.make_json_ld_header())
end end
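# Worked example of the two flags above, for a viewer other than the user:
#
#   hide_follows  hide_follows_count  ->  items shown  totalItems
#   false         false                   yes          real count
#   true          false                   no           real count
#   true          true                    no           0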
def render("following.json", %{user: user} = opts) do def render("following.json", %{user: user} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_follows showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_follows
showing_count = showing_items || !user.info.hide_follows_count
query = User.get_friends_query(user) query = User.get_friends_query(user)
query = from(user in query, select: [:ap_id]) query = from(user in query, select: [:ap_id])
following = Repo.all(query) following = Repo.all(query)
total = total =
if showing do if showing_count do
length(following) length(following)
else else
0 0
@ -152,7 +153,7 @@ def render("following.json", %{user: user} = opts) do
"type" => "OrderedCollection", "type" => "OrderedCollection",
"totalItems" => total, "totalItems" => total,
"first" => "first" =>
if showing do if showing_items do
collection(following, "#{user.ap_id}/following", 1, !user.info.hide_follows) collection(following, "#{user.ap_id}/following", 1, !user.info.hide_follows)
else else
"#{user.ap_id}/following?page=1" "#{user.ap_id}/following?page=1"
@ -162,32 +163,34 @@ def render("following.json", %{user: user} = opts) do
end end
def render("followers.json", %{user: user, page: page} = opts) do def render("followers.json", %{user: user, page: page} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_followers showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_followers
showing_count = showing_items || !user.info.hide_followers_count
query = User.get_followers_query(user) query = User.get_followers_query(user)
query = from(user in query, select: [:ap_id]) query = from(user in query, select: [:ap_id])
followers = Repo.all(query) followers = Repo.all(query)
total = total =
if showing do if showing_count do
length(followers) length(followers)
else else
0 0
end end
collection(followers, "#{user.ap_id}/followers", page, showing, total) collection(followers, "#{user.ap_id}/followers", page, showing_items, total)
|> Map.merge(Utils.make_json_ld_header()) |> Map.merge(Utils.make_json_ld_header())
end end
def render("followers.json", %{user: user} = opts) do def render("followers.json", %{user: user} = opts) do
showing = (opts[:for] && opts[:for] == user) || !user.info.hide_followers showing_items = (opts[:for] && opts[:for] == user) || !user.info.hide_followers
showing_count = showing_items || !user.info.hide_followers_count
query = User.get_followers_query(user) query = User.get_followers_query(user)
query = from(user in query, select: [:ap_id]) query = from(user in query, select: [:ap_id])
followers = Repo.all(query) followers = Repo.all(query)
total = total =
if showing do if showing_count do
length(followers) length(followers)
else else
0 0
@ -198,8 +201,8 @@ def render("followers.json", %{user: user} = opts) do
"type" => "OrderedCollection", "type" => "OrderedCollection",
"totalItems" => total, "totalItems" => total,
"first" => "first" =>
if showing do if showing_items do
collection(followers, "#{user.ap_id}/followers", 1, showing, total) collection(followers, "#{user.ap_id}/followers", 1, showing_items, total)
else else
"#{user.ap_id}/followers?page=1" "#{user.ap_id}/followers?page=1"
end end
@ -207,25 +210,22 @@ def render("followers.json", %{user: user} = opts) do
|> Map.merge(Utils.make_json_ld_header()) |> Map.merge(Utils.make_json_ld_header())
end end
def render("outbox.json", %{user: user, max_id: max_qid}) do def render("activity_collection.json", %{iri: iri}) do
params = %{ %{
"limit" => "10" "id" => iri,
"type" => "OrderedCollection",
"first" => "#{iri}?page=true"
} }
|> Map.merge(Utils.make_json_ld_header())
end
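# Sketch of what this clause renders (the IRI is a placeholder):
#
#   render("activity_collection.json", %{iri: "https://example.com/users/alice/outbox"})
#   # => %{
#   #   "id" => "https://example.com/users/alice/outbox",
#   #   "type" => "OrderedCollection",
#   #   "first" => "https://example.com/users/alice/outbox?page=true"
#   # } plus the JSON-LD header keys from Utils.make_json_ld_header/0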
params = def render("activity_collection_page.json", %{activities: activities, iri: iri}) do
if max_qid != nil do # this is sorted chronologically, so first activity is the newest (max)
Map.put(params, "max_id", max_qid)
else
params
end
activities = ActivityPub.fetch_user_activities(user, nil, params)
{max_id, min_id, collection} = {max_id, min_id, collection} =
if length(activities) > 0 do if length(activities) > 0 do
{ {
Enum.at(Enum.reverse(activities), 0).id,
Enum.at(activities, 0).id, Enum.at(activities, 0).id,
Enum.at(Enum.reverse(activities), 0).id,
Enum.map(activities, fn act -> Enum.map(activities, fn act ->
{:ok, data} = Transmogrifier.prepare_outgoing(act.data) {:ok, data} = Transmogrifier.prepare_outgoing(act.data)
data data
@ -239,71 +239,14 @@ def render("outbox.json", %{user: user, max_id: max_qid}) do
} }
end end
iri = "#{user.ap_id}/outbox" %{
"id" => "#{iri}?max_id=#{max_id}&page=true",
page = %{
"id" => "#{iri}?max_id=#{max_id}",
"type" => "OrderedCollectionPage", "type" => "OrderedCollectionPage",
"partOf" => iri, "partOf" => iri,
"orderedItems" => collection, "orderedItems" => collection,
"next" => "#{iri}?max_id=#{min_id}" "next" => "#{iri}?max_id=#{min_id}&page=true"
} }
|> Map.merge(Utils.make_json_ld_header())
if max_qid == nil do
%{
"id" => iri,
"type" => "OrderedCollection",
"first" => page
}
|> Map.merge(Utils.make_json_ld_header())
else
page |> Map.merge(Utils.make_json_ld_header())
end
end
def render("inbox.json", %{user: user, max_id: max_qid}) do
params = %{
"limit" => "10"
}
params =
if max_qid != nil do
Map.put(params, "max_id", max_qid)
else
params
end
activities = ActivityPub.fetch_activities([user.ap_id | user.following], params)
min_id = Enum.at(Enum.reverse(activities), 0).id
max_id = Enum.at(activities, 0).id
collection =
Enum.map(activities, fn act ->
{:ok, data} = Transmogrifier.prepare_outgoing(act.data)
data
end)
iri = "#{user.ap_id}/inbox"
page = %{
"id" => "#{iri}?max_id=#{max_id}",
"type" => "OrderedCollectionPage",
"partOf" => iri,
"orderedItems" => collection,
"next" => "#{iri}?max_id=#{min_id}"
}
if max_qid == nil do
%{
"id" => iri,
"type" => "OrderedCollection",
"first" => page
}
|> Map.merge(Utils.make_json_ld_header())
else
page |> Map.merge(Utils.make_json_ld_header())
end
end end
def collection(collection, iri, page, show_items \\ true, total \\ nil) do def collection(collection, iri, page, show_items \\ true, total \\ nil) do

View file

@ -14,10 +14,13 @@ defmodule Pleroma.Web.AdminAPI.AdminAPIController do
alias Pleroma.Web.AdminAPI.Config alias Pleroma.Web.AdminAPI.Config
alias Pleroma.Web.AdminAPI.ConfigView alias Pleroma.Web.AdminAPI.ConfigView
alias Pleroma.Web.AdminAPI.ModerationLogView alias Pleroma.Web.AdminAPI.ModerationLogView
alias Pleroma.Web.AdminAPI.Report
alias Pleroma.Web.AdminAPI.ReportView alias Pleroma.Web.AdminAPI.ReportView
alias Pleroma.Web.AdminAPI.Search alias Pleroma.Web.AdminAPI.Search
alias Pleroma.Web.CommonAPI alias Pleroma.Web.CommonAPI
alias Pleroma.Web.Endpoint
alias Pleroma.Web.MastodonAPI.StatusView alias Pleroma.Web.MastodonAPI.StatusView
alias Pleroma.Web.Router
import Pleroma.Web.ControllerHelper, only: [json_response: 3] import Pleroma.Web.ControllerHelper, only: [json_response: 3]
@ -139,7 +142,8 @@ def users_create(%{assigns: %{user: admin}} = conn, %{"users" => users}) do
def user_show(conn, %{"nickname" => nickname}) do def user_show(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname_or_id(nickname) do with %User{} = user <- User.get_cached_by_nickname_or_id(nickname) do
conn conn
|> json(AccountView.render("show.json", %{user: user})) |> put_view(AccountView)
|> render("show.json", %{user: user})
else else
_ -> {:error, :not_found} _ -> {:error, :not_found}
end end
@ -158,7 +162,8 @@ def list_user_statuses(conn, %{"nickname" => nickname} = params) do
}) })
conn conn
|> json(StatusView.render("index.json", %{activities: activities, as: :activity})) |> put_view(StatusView)
|> render("index.json", %{activities: activities, as: :activity})
else else
_ -> {:error, :not_found} _ -> {:error, :not_found}
end end
@ -178,7 +183,8 @@ def user_toggle_activation(%{assigns: %{user: admin}} = conn, %{"nickname" => ni
}) })
conn conn
|> json(AccountView.render("show.json", %{user: updated_user})) |> put_view(AccountView)
|> render("show.json", %{user: updated_user})
end end
def tag_users(%{assigns: %{user: admin}} = conn, %{"nicknames" => nicknames, "tags" => tags}) do def tag_users(%{assigns: %{user: admin}} = conn, %{"nicknames" => nicknames, "tags" => tags}) do
@ -250,18 +256,12 @@ def right_add(%{assigns: %{user: admin}} = conn, %{
"nickname" => nickname "nickname" => nickname
}) })
when permission_group in ["moderator", "admin"] do when permission_group in ["moderator", "admin"] do
user = User.get_cached_by_nickname(nickname) info = Map.put(%{}, "is_" <> permission_group, true)
info = {:ok, user} =
%{} nickname
|> Map.put("is_" <> permission_group, true) |> User.get_cached_by_nickname()
|> User.update_info(&User.Info.admin_api_update(&1, info))
info_cng = User.Info.admin_api_update(user.info, info)
cng =
user
|> Ecto.Changeset.change()
|> Ecto.Changeset.put_embed(:info, info_cng)
ModerationLog.insert_log(%{ ModerationLog.insert_log(%{
action: "grant", action: "grant",
@ -270,8 +270,6 @@ def right_add(%{assigns: %{user: admin}} = conn, %{
permission: permission_group permission: permission_group
}) })
{:ok, _user} = User.update_and_set_cache(cng)
json(conn, info) json(conn, info)
end end
@ -289,40 +287,33 @@ def right_get(conn, %{"nickname" => nickname}) do
}) })
end end
def right_delete(%{assigns: %{user: %{nickname: nickname}}} = conn, %{"nickname" => nickname}) do
render_error(conn, :forbidden, "You can't revoke your own admin status.")
end
def right_delete( def right_delete(
%{assigns: %{user: %User{:nickname => admin_nickname} = admin}} = conn, %{assigns: %{user: admin}} = conn,
%{ %{
"permission_group" => permission_group, "permission_group" => permission_group,
"nickname" => nickname "nickname" => nickname
} }
) )
when permission_group in ["moderator", "admin"] do when permission_group in ["moderator", "admin"] do
if admin_nickname == nickname do info = Map.put(%{}, "is_" <> permission_group, false)
render_error(conn, :forbidden, "You can't revoke your own admin status.")
else
user = User.get_cached_by_nickname(nickname)
info = {:ok, user} =
%{} nickname
|> Map.put("is_" <> permission_group, false) |> User.get_cached_by_nickname()
|> User.update_info(&User.Info.admin_api_update(&1, info))
info_cng = User.Info.admin_api_update(user.info, info) ModerationLog.insert_log(%{
action: "revoke",
actor: admin,
subject: user,
permission: permission_group
})
cng = json(conn, info)
Ecto.Changeset.change(user)
|> Ecto.Changeset.put_embed(:info, info_cng)
{:ok, _user} = User.update_and_set_cache(cng)
ModerationLog.insert_log(%{
action: "revoke",
actor: admin,
subject: user,
permission: permission_group
})
json(conn, info)
end
end end
def right_delete(conn, _) do def right_delete(conn, _) do
@ -400,13 +391,23 @@ def email_invite(%{assigns: %{user: user}} = conn, %{"email" => email} = params)
end end
end end
@doc "Get a account registeration invite token (base64 string)" @doc "Create an account registration invite token"
def get_invite_token(conn, params) do def create_invite_token(conn, params) do
options = params["invite"] || %{} opts = %{}
{:ok, invite} = UserInviteToken.create_invite(options)
conn opts =
|> json(invite.token) if params["max_use"],
do: Map.put(opts, :max_use, params["max_use"]),
else: opts
opts =
if params["expires_at"],
do: Map.put(opts, :expires_at, params["expires_at"]),
else: opts
{:ok, invite} = UserInviteToken.create_invite(opts)
json(conn, AccountView.render("invite.json", %{invite: invite}))
end end
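# Parameter sketch (values invented): a request body such as
#
#   %{"max_use" => 5, "expires_at" => "2019-12-31"}
#
# creates an invite limited to five uses with the given expiry; both keys are
# optional, and the response is the invite rendered via AccountView's "invite.json".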
@doc "Get list of created invites" @doc "Get list of created invites"
@ -414,7 +415,8 @@ def invites(conn, _params) do
invites = UserInviteToken.list_invites() invites = UserInviteToken.list_invites()
conn conn
|> json(AccountView.render("invites.json", %{invites: invites})) |> put_view(AccountView)
|> render("invites.json", %{invites: invites})
end end
@doc "Revokes invite by token" @doc "Revokes invite by token"
@ -422,7 +424,8 @@ def revoke_invite(conn, %{"token" => token}) do
with {:ok, invite} <- UserInviteToken.find_by_token(token), with {:ok, invite} <- UserInviteToken.find_by_token(token),
{:ok, updated_invite} = UserInviteToken.update_invite(invite, %{used: true}) do {:ok, updated_invite} = UserInviteToken.update_invite(invite, %{used: true}) do
conn conn
|> json(AccountView.render("invite.json", %{invite: updated_invite})) |> put_view(AccountView)
|> render("invite.json", %{invite: updated_invite})
else else
nil -> {:error, :not_found} nil -> {:error, :not_found}
end end
@ -434,19 +437,33 @@ def get_password_reset(conn, %{"nickname" => nickname}) do
{:ok, token} = Pleroma.PasswordResetToken.create_token(user) {:ok, token} = Pleroma.PasswordResetToken.create_token(user)
conn conn
|> json(token.token) |> json(%{
token: token.token,
link: Router.Helpers.reset_password_url(Endpoint, :reset, token.token)
})
end
@doc "Force password reset for a given user"
def force_password_reset(conn, %{"nickname" => nickname}) do
(%User{local: true} = user) = User.get_cached_by_nickname(nickname)
User.force_password_reset_async(user)
json_response(conn, :no_content, "")
end end
def list_reports(conn, params) do def list_reports(conn, params) do
{page, page_size} = page_params(params)
params = params =
params params
|> Map.put("type", "Flag") |> Map.put("type", "Flag")
|> Map.put("skip_preload", true) |> Map.put("skip_preload", true)
|> Map.put("total", true)
|> Map.put("limit", page_size)
|> Map.put("offset", (page - 1) * page_size)
reports = reports = ActivityPub.fetch_activities([], params, :offset)
[]
|> ActivityPub.fetch_activities(params)
|> Enum.reverse()
conn conn
|> put_view(ReportView) |> put_view(ReportView)
@ -457,7 +474,7 @@ def report_show(conn, %{"id" => id}) do
with %Activity{} = report <- Activity.get_by_id(id) do with %Activity{} = report <- Activity.get_by_id(id) do
conn conn
|> put_view(ReportView) |> put_view(ReportView)
|> render("show.json", %{report: report}) |> render("show.json", Report.extract_report_info(report))
else else
_ -> {:error, :not_found} _ -> {:error, :not_found}
end end
@ -473,7 +490,7 @@ def report_update_state(%{assigns: %{user: admin}} = conn, %{"id" => id, "state"
conn conn
|> put_view(ReportView) |> put_view(ReportView)
|> render("show.json", %{report: report}) |> render("show.json", Report.extract_report_info(report))
end end
end end
@ -591,6 +608,12 @@ def config_update(conn, %{"configs" => configs}) do
|> render("index.json", %{configs: updated}) |> render("index.json", %{configs: updated})
end end
def reload_emoji(conn, _params) do
Pleroma.Emoji.reload()
conn |> json("ok")
end
def errors(conn, {:error, :not_found}) do def errors(conn, {:error, :not_found}) do
conn conn
|> put_status(:not_found) |> put_status(:not_found)
@ -0,0 +1,22 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.AdminAPI.Report do
alias Pleroma.Activity
alias Pleroma.User
def extract_report_info(
%{data: %{"actor" => actor, "object" => [account_ap_id | status_ap_ids]}} = report
) do
user = User.get_cached_by_ap_id(actor)
account = User.get_cached_by_ap_id(account_ap_id)
statuses =
Enum.map(status_ap_ids, fn ap_id ->
Activity.get_by_ap_id_with_object(ap_id)
end)
%{report: report, user: user, account: account, statuses: statuses}
end
end
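# Sketch of the report activity this expects (placeholder AP IDs), matching the
# Flag activities built by Pleroma.Web.ActivityPub.Utils.make_flag_data/2:
#
#   %Activity{
#     data: %{
#       "actor" => "https://example.com/users/reporter",
#       "object" => [
#         "https://example.com/users/spammer",
#         "https://example.com/objects/some-status"
#       ]
#     }
#   }
#
# The first "object" entry resolves to the reported account, the rest to statuses.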
@ -4,25 +4,26 @@
defmodule Pleroma.Web.AdminAPI.ReportView do defmodule Pleroma.Web.AdminAPI.ReportView do
use Pleroma.Web, :view use Pleroma.Web, :view
alias Pleroma.Activity
alias Pleroma.HTML alias Pleroma.HTML
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web.AdminAPI.Report
alias Pleroma.Web.CommonAPI.Utils alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.MastodonAPI.StatusView alias Pleroma.Web.MastodonAPI.StatusView
def render("index.json", %{reports: reports}) do def render("index.json", %{reports: reports}) do
%{ %{
reports: render_many(reports, __MODULE__, "show.json", as: :report) reports:
reports[:items]
|> Enum.map(&Report.extract_report_info(&1))
|> Enum.map(&render(__MODULE__, "show.json", &1))
|> Enum.reverse(),
total: reports[:total]
} }
end end
def render("show.json", %{report: report}) do def render("show.json", %{report: report, user: user, account: account, statuses: statuses}) do
user = User.get_cached_by_ap_id(report.data["actor"])
created_at = Utils.to_masto_date(report.data["published"]) created_at = Utils.to_masto_date(report.data["published"])
[account_ap_id | status_ap_ids] = report.data["object"]
account = User.get_cached_by_ap_id(account_ap_id)
content = content =
unless is_nil(report.data["content"]) do unless is_nil(report.data["content"]) do
HTML.filter_tags(report.data["content"]) HTML.filter_tags(report.data["content"])
@ -30,11 +31,6 @@ def render("show.json", %{report: report}) do
nil nil
end end
statuses =
Enum.map(status_ap_ids, fn ap_id ->
Activity.get_by_ap_id_with_object(ap_id)
end)
%{ %{
id: report.id, id: report.id,
account: merge_account_views(account), account: merge_account_views(account),
@ -6,7 +6,7 @@ defmodule Pleroma.Web.CommonAPI do
alias Pleroma.Activity alias Pleroma.Activity
alias Pleroma.ActivityExpiration alias Pleroma.ActivityExpiration
alias Pleroma.Conversation.Participation alias Pleroma.Conversation.Participation
alias Pleroma.Formatter alias Pleroma.Emoji
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.ThreadMute alias Pleroma.ThreadMute
alias Pleroma.User alias Pleroma.User
@ -261,12 +261,7 @@ def post(user, %{"status" => status} = data) do
sensitive, sensitive,
poll poll
), ),
object <- object <- put_emoji(object, full_payload, poll_emoji) do
Map.put(
object,
"emoji",
Map.merge(Formatter.get_emoji_map(full_payload), poll_emoji)
) do
preview? = Pleroma.Web.ControllerHelper.truthy_param?(data["preview"]) || false preview? = Pleroma.Web.ControllerHelper.truthy_param?(data["preview"]) || false
direct? = visibility == "direct" direct? = visibility == "direct"
@ -300,18 +295,25 @@ def post(user, %{"status" => status} = data) do
end end
end end
# parse and put emoji to object data
defp put_emoji(map, text, emojis) do
Map.put(
map,
"emoji",
Map.merge(Emoji.Formatter.get_emoji_map(text), emojis)
)
end
# Updates the emojis for a user based on their profile # Updates the emojis for a user based on their profile
def update(user) do def update(user) do
emoji = emoji_from_profile(user)
source_data = user.info |> Map.get(:source_data, {}) |> Map.put("tag", emoji)
user = user =
with emoji <- emoji_from_profile(user), with {:ok, user} <- User.update_info(user, &User.Info.set_source_data(&1, source_data)) do
source_data <- (user.info.source_data || %{}) |> Map.put("tag", emoji),
info_cng <- User.Info.set_source_data(user.info, source_data),
change <- Ecto.Changeset.change(user) |> Ecto.Changeset.put_embed(:info, info_cng),
{:ok, user} <- User.update_and_set_cache(change) do
user user
else else
_e -> _e -> user
user
end end
ActivityPub.update(%{ ActivityPub.update(%{
@ -336,34 +338,21 @@ def pin(id_or_ap_id, %{ap_id: user_ap_id} = user) do
} }
} = activity <- get_by_id_or_ap_id(id_or_ap_id), } = activity <- get_by_id_or_ap_id(id_or_ap_id),
true <- Visibility.is_public?(activity), true <- Visibility.is_public?(activity),
%{valid?: true} = info_changeset <- User.Info.add_pinnned_activity(user.info, activity), {:ok, _user} <- User.update_info(user, &User.Info.add_pinnned_activity(&1, activity)) do
changeset <-
Ecto.Changeset.change(user) |> Ecto.Changeset.put_embed(:info, info_changeset),
{:ok, _user} <- User.update_and_set_cache(changeset) do
{:ok, activity} {:ok, activity}
else else
%{errors: [pinned_activities: {err, _}]} -> {:error, %{changes: %{info: %{errors: [pinned_activities: {err, _}]}}}} -> {:error, err}
{:error, err} _ -> {:error, dgettext("errors", "Could not pin")}
_ ->
{:error, dgettext("errors", "Could not pin")}
end end
end end
def unpin(id_or_ap_id, user) do def unpin(id_or_ap_id, user) do
with %Activity{} = activity <- get_by_id_or_ap_id(id_or_ap_id), with %Activity{} = activity <- get_by_id_or_ap_id(id_or_ap_id),
%{valid?: true} = info_changeset <- {:ok, _user} <- User.update_info(user, &User.Info.remove_pinnned_activity(&1, activity)) do
User.Info.remove_pinnned_activity(user.info, activity),
changeset <-
Ecto.Changeset.change(user) |> Ecto.Changeset.put_embed(:info, info_changeset),
{:ok, _user} <- User.update_and_set_cache(changeset) do
{:ok, activity} {:ok, activity}
else else
%{errors: [pinned_activities: {err, _}]} -> %{errors: [pinned_activities: {err, _}]} -> {:error, err}
{:error, err} _ -> {:error, dgettext("errors", "Could not unpin")}
_ ->
{:error, dgettext("errors", "Could not unpin")}
end end
end end
@ -458,23 +447,15 @@ defp set_visibility(activity, %{"visibility" => visibility}) do
defp set_visibility(activity, _), do: {:ok, activity} defp set_visibility(activity, _), do: {:ok, activity}
def hide_reblogs(user, muted) do def hide_reblogs(user, %{ap_id: ap_id} = _muted) do
ap_id = muted.ap_id
if ap_id not in user.info.muted_reblogs do if ap_id not in user.info.muted_reblogs do
info_changeset = User.Info.add_reblog_mute(user.info, ap_id) User.update_info(user, &User.Info.add_reblog_mute(&1, ap_id))
changeset = Ecto.Changeset.change(user) |> Ecto.Changeset.put_embed(:info, info_changeset)
User.update_and_set_cache(changeset)
end end
end end
def show_reblogs(user, muted) do def show_reblogs(user, %{ap_id: ap_id} = _muted) do
ap_id = muted.ap_id
if ap_id in user.info.muted_reblogs do if ap_id in user.info.muted_reblogs do
info_changeset = User.Info.remove_reblog_mute(user.info, ap_id) User.update_info(user, &User.Info.remove_reblog_mute(&1, ap_id))
changeset = Ecto.Changeset.change(user) |> Ecto.Changeset.put_embed(:info, info_changeset)
User.update_and_set_cache(changeset)
end end
end end
end end
@ -9,6 +9,7 @@ defmodule Pleroma.Web.CommonAPI.Utils do
alias Pleroma.Activity alias Pleroma.Activity
alias Pleroma.Config alias Pleroma.Config
alias Pleroma.Conversation.Participation alias Pleroma.Conversation.Participation
alias Pleroma.Emoji
alias Pleroma.Formatter alias Pleroma.Formatter
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.Plugs.AuthenticationPlug alias Pleroma.Plugs.AuthenticationPlug
@ -25,7 +26,7 @@ defmodule Pleroma.Web.CommonAPI.Utils do
# This is a hack for twidere. # This is a hack for twidere.
def get_by_id_or_ap_id(id) do def get_by_id_or_ap_id(id) do
activity = activity =
with true <- Pleroma.FlakeId.is_flake_id?(id), with true <- FlakeId.flake_id?(id),
%Activity{} = activity <- Activity.get_by_id_with_object(id) do %Activity{} = activity <- Activity.get_by_id_with_object(id) do
activity activity
else else
@ -184,7 +185,7 @@ def make_poll_data(%{"poll" => %{"options" => options, "expires_in" => expires_i
"name" => option, "name" => option,
"type" => "Note", "type" => "Note",
"replies" => %{"type" => "Collection", "totalItems" => 0} "replies" => %{"type" => "Collection", "totalItems" => 0}
}, Map.merge(emoji, Formatter.get_emoji_map(option))} }, Map.merge(emoji, Emoji.Formatter.get_emoji_map(option))}
end) end)
case expires_in do case expires_in do
@ -434,8 +435,8 @@ def confirm_current_password(user, password) do
end end
def emoji_from_profile(%{info: _info} = user) do def emoji_from_profile(%{info: _info} = user) do
(Formatter.get_emoji(user.bio) ++ Formatter.get_emoji(user.name)) (Emoji.Formatter.get_emoji(user.bio) ++ Emoji.Formatter.get_emoji(user.name))
|> Enum.map(fn {shortcode, url, _} -> |> Enum.map(fn {shortcode, %Emoji{file: url}} ->
%{ %{
"type" => "Emoji", "type" => "Emoji",
"icon" => %{"type" => "Image", "url" => "#{Endpoint.url()}#{url}"}, "icon" => %{"type" => "Image", "url" => "#{Endpoint.url()}#{url}"},
@ -13,10 +13,9 @@ defmodule Pleroma.Web.MastodonAPI.MastodonAPIController do
alias Pleroma.Bookmark alias Pleroma.Bookmark
alias Pleroma.Config alias Pleroma.Config
alias Pleroma.Conversation.Participation alias Pleroma.Conversation.Participation
alias Pleroma.Emoji
alias Pleroma.Filter alias Pleroma.Filter
alias Pleroma.Formatter
alias Pleroma.HTTP alias Pleroma.HTTP
alias Pleroma.Notification
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.Pagination alias Pleroma.Pagination
alias Pleroma.Plugs.RateLimiter alias Pleroma.Plugs.RateLimiter
@ -35,7 +34,6 @@ defmodule Pleroma.Web.MastodonAPI.MastodonAPIController do
alias Pleroma.Web.MastodonAPI.ListView alias Pleroma.Web.MastodonAPI.ListView
alias Pleroma.Web.MastodonAPI.MastodonAPI alias Pleroma.Web.MastodonAPI.MastodonAPI
alias Pleroma.Web.MastodonAPI.MastodonView alias Pleroma.Web.MastodonAPI.MastodonView
alias Pleroma.Web.MastodonAPI.NotificationView
alias Pleroma.Web.MastodonAPI.ReportView alias Pleroma.Web.MastodonAPI.ReportView
alias Pleroma.Web.MastodonAPI.ScheduledActivityView alias Pleroma.Web.MastodonAPI.ScheduledActivityView
alias Pleroma.Web.MastodonAPI.StatusView alias Pleroma.Web.MastodonAPI.StatusView
@ -140,18 +138,21 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
user_info_emojis = user_info_emojis =
user.info user.info
|> Map.get(:emoji, []) |> Map.get(:emoji, [])
|> Enum.concat(Formatter.get_emoji_map(emojis_text)) |> Enum.concat(Emoji.Formatter.get_emoji_map(emojis_text))
|> Enum.dedup() |> Enum.dedup()
info_params = info_params =
[ [
:no_rich_text, :no_rich_text,
:locked, :locked,
:hide_followers_count,
:hide_follows_count,
:hide_followers, :hide_followers,
:hide_follows, :hide_follows,
:hide_favorites, :hide_favorites,
:show_role, :show_role,
:skip_thread_containment :skip_thread_containment,
:discoverable
] ]
|> Enum.reduce(%{}, fn key, acc -> |> Enum.reduce(%{}, fn key, acc ->
add_if_present(acc, params, to_string(key), key, fn value -> add_if_present(acc, params, to_string(key), key, fn value ->
@ -186,14 +187,13 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
end) end)
|> Map.put(:emoji, user_info_emojis) |> Map.put(:emoji, user_info_emojis)
info_cng = User.Info.profile_update(user.info, info_params) changeset =
user
|> User.update_changeset(user_params)
|> User.change_info(&User.Info.profile_update(&1, info_params))
with changeset <- User.update_changeset(user, user_params), with {:ok, user} <- User.update_and_set_cache(changeset) do
changeset <- Changeset.put_embed(changeset, :info, info_cng), if original_user != user, do: CommonAPI.update(user)
{:ok, user} <- User.update_and_set_cache(changeset) do
if original_user != user do
CommonAPI.update(user)
end
json( json(
conn, conn,
@ -223,12 +223,10 @@ def update_avatar(%{assigns: %{user: user}} = conn, params) do
end end
def update_banner(%{assigns: %{user: user}} = conn, %{"banner" => ""}) do def update_banner(%{assigns: %{user: user}} = conn, %{"banner" => ""}) do
with new_info <- %{"banner" => %{}}, new_info = %{"banner" => %{}}
info_cng <- User.Info.profile_update(user.info, new_info),
changeset <- Changeset.change(user) |> Changeset.put_embed(:info, info_cng),
{:ok, user} <- User.update_and_set_cache(changeset) do
CommonAPI.update(user)
with {:ok, user} <- User.update_info(user, &User.Info.profile_update(&1, new_info)) do
CommonAPI.update(user)
json(conn, %{url: nil}) json(conn, %{url: nil})
end end
end end
@ -236,9 +234,7 @@ def update_banner(%{assigns: %{user: user}} = conn, %{"banner" => ""}) do
def update_banner(%{assigns: %{user: user}} = conn, params) do def update_banner(%{assigns: %{user: user}} = conn, params) do
with {:ok, object} <- ActivityPub.upload(%{"img" => params["banner"]}, type: :banner), with {:ok, object} <- ActivityPub.upload(%{"img" => params["banner"]}, type: :banner),
new_info <- %{"banner" => object.data}, new_info <- %{"banner" => object.data},
info_cng <- User.Info.profile_update(user.info, new_info), {:ok, user} <- User.update_info(user, &User.Info.profile_update(&1, new_info)) do
changeset <- Changeset.change(user) |> Changeset.put_embed(:info, info_cng),
{:ok, user} <- User.update_and_set_cache(changeset) do
CommonAPI.update(user) CommonAPI.update(user)
%{"url" => [%{"href" => href} | _]} = object.data %{"url" => [%{"href" => href} | _]} = object.data
@ -247,10 +243,9 @@ def update_banner(%{assigns: %{user: user}} = conn, params) do
end end
def update_background(%{assigns: %{user: user}} = conn, %{"img" => ""}) do def update_background(%{assigns: %{user: user}} = conn, %{"img" => ""}) do
with new_info <- %{"background" => %{}}, new_info = %{"background" => %{}}
info_cng <- User.Info.profile_update(user.info, new_info),
changeset <- Changeset.change(user) |> Changeset.put_embed(:info, info_cng), with {:ok, _user} <- User.update_info(user, &User.Info.profile_update(&1, new_info)) do
{:ok, _user} <- User.update_and_set_cache(changeset) do
json(conn, %{url: nil}) json(conn, %{url: nil})
end end
end end
@ -258,9 +253,7 @@ def update_background(%{assigns: %{user: user}} = conn, %{"img" => ""}) do
def update_background(%{assigns: %{user: user}} = conn, params) do def update_background(%{assigns: %{user: user}} = conn, params) do
with {:ok, object} <- ActivityPub.upload(params, type: :background), with {:ok, object} <- ActivityPub.upload(params, type: :background),
new_info <- %{"background" => object.data}, new_info <- %{"background" => object.data},
info_cng <- User.Info.profile_update(user.info, new_info), {:ok, _user} <- User.update_info(user, &User.Info.profile_update(&1, new_info)) do
changeset <- Changeset.change(user) |> Changeset.put_embed(:info, info_cng),
{:ok, _user} <- User.update_and_set_cache(changeset) do
%{"url" => [%{"href" => href} | _]} = object.data %{"url" => [%{"href" => href} | _]} = object.data
json(conn, %{url: href}) json(conn, %{url: href})
@ -331,7 +324,7 @@ def peers(conn, _params) do
defp mastodonized_emoji do defp mastodonized_emoji do
Pleroma.Emoji.get_all() Pleroma.Emoji.get_all()
|> Enum.map(fn {shortcode, relative_url, tags} -> |> Enum.map(fn {shortcode, %Pleroma.Emoji{file: relative_url, tags: tags}} ->
url = to_string(URI.merge(Web.base_url(), relative_url)) url = to_string(URI.merge(Web.base_url(), relative_url))
%{ %{
@ -379,7 +372,6 @@ def public_timeline(%{assigns: %{user: user}} = conn, params) do
|> Map.put("local_only", local_only) |> Map.put("local_only", local_only)
|> Map.put("blocking_user", user) |> Map.put("blocking_user", user)
|> Map.put("muting_user", user) |> Map.put("muting_user", user)
|> Map.put("user", user)
|> ActivityPub.fetch_public_activities() |> ActivityPub.fetch_public_activities()
|> Enum.reverse() |> Enum.reverse()
@ -485,7 +477,7 @@ def get_context(%{assigns: %{user: user}} = conn, %{"id" => id}) do
end end
def get_poll(%{assigns: %{user: user}} = conn, %{"id" => id}) do def get_poll(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Object{} = object <- Object.get_by_id(id), with %Object{} = object <- Object.get_by_id_and_maybe_refetch(id, interval: 60),
%Activity{} = activity <- Activity.get_create_by_object_ap_id(object.data["id"]), %Activity{} = activity <- Activity.get_create_by_object_ap_id(object.data["id"]),
true <- Visibility.visible_for_user?(activity, user) do true <- Visibility.visible_for_user?(activity, user) do
conn conn
@ -609,7 +601,12 @@ def post_status(%{assigns: %{user: user}} = conn, %{"status" => _} = params) do
{:ok, activity} -> {:ok, activity} ->
conn conn
|> put_view(StatusView) |> put_view(StatusView)
|> try_render("status.json", %{activity: activity, for: user, as: :activity}) |> try_render("status.json", %{
activity: activity,
for: user,
as: :activity,
with_direct_conversation_id: true
})
end end
end end
end end
@ -716,49 +713,6 @@ def unmute_conversation(%{assigns: %{user: user}} = conn, %{"id" => id}) do
end end
end end
def notifications(%{assigns: %{user: user}} = conn, params) do
notifications = MastodonAPI.get_notifications(user, params)
conn
|> add_link_headers(notifications)
|> put_view(NotificationView)
|> render("index.json", %{notifications: notifications, for: user})
end
def get_notification(%{assigns: %{user: user}} = conn, %{"id" => id} = _params) do
with {:ok, notification} <- Notification.get(user, id) do
conn
|> put_view(NotificationView)
|> render("show.json", %{notification: notification, for: user})
else
{:error, reason} ->
conn
|> put_status(:forbidden)
|> json(%{"error" => reason})
end
end
def clear_notifications(%{assigns: %{user: user}} = conn, _params) do
Notification.clear(user)
json(conn, %{})
end
def dismiss_notification(%{assigns: %{user: user}} = conn, %{"id" => id} = _params) do
with {:ok, _notif} <- Notification.dismiss(user, id) do
json(conn, %{})
else
{:error, reason} ->
conn
|> put_status(:forbidden)
|> json(%{"error" => reason})
end
end
def destroy_multiple(%{assigns: %{user: user}} = conn, %{"ids" => ids} = _params) do
Notification.destroy_multiple(user, ids)
json(conn, %{})
end
def relationships(%{assigns: %{user: user}} = conn, %{"id" => id}) do def relationships(%{assigns: %{user: user}} = conn, %{"id" => id}) do
id = List.wrap(id) id = List.wrap(id)
q = from(u in User, where: u.id in ^id) q = from(u in User, where: u.id in ^id)
@ -810,26 +764,16 @@ def upload(%{assigns: %{user: user}} = conn, %{"file" => file} = data) do
def set_mascot(%{assigns: %{user: user}} = conn, %{"file" => file}) do def set_mascot(%{assigns: %{user: user}} = conn, %{"file" => file}) do
with {:ok, object} <- ActivityPub.upload(file, actor: User.ap_id(user)), with {:ok, object} <- ActivityPub.upload(file, actor: User.ap_id(user)),
%{} = attachment_data <- Map.put(object.data, "id", object.id), %{} = attachment_data <- Map.put(object.data, "id", object.id),
%{type: type} = rendered <- # Reject if not an image
%{type: "image"} = rendered <-
StatusView.render("attachment.json", %{attachment: attachment_data}) do StatusView.render("attachment.json", %{attachment: attachment_data}) do
# Reject if not an image # Sure!
if type == "image" do # Save to the user's info
# Sure! {:ok, _user} = User.update_info(user, &User.Info.mascot_update(&1, rendered))
# Save to the user's info
info_changeset = User.Info.mascot_update(user.info, rendered)
user_changeset = json(conn, rendered)
user else
|> Changeset.change() %{type: _} -> render_error(conn, :unsupported_media_type, "mascots can only be images")
|> Changeset.put_embed(:info, info_changeset)
{:ok, _user} = User.update_and_set_cache(user_changeset)
conn
|> json(rendered)
else
render_error(conn, :unsupported_media_type, "mascots can only be images")
end
end end
end end
@ -952,11 +896,11 @@ def following(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params) do
end end
def follow_requests(%{assigns: %{user: followed}} = conn, _params) do def follow_requests(%{assigns: %{user: followed}} = conn, _params) do
with {:ok, follow_requests} <- User.get_follow_requests(followed) do follow_requests = User.get_follow_requests(followed)
conn
|> put_view(AccountView) conn
|> render("accounts.json", %{for: followed, users: follow_requests, as: :user}) |> put_view(AccountView)
end |> render("accounts.json", %{for: followed, users: follow_requests, as: :user})
end end
def authorize_follow_request(%{assigns: %{user: followed}} = conn, %{"id" => id}) do def authorize_follow_request(%{assigns: %{user: followed}} = conn, %{"id" => id}) do
@ -1360,11 +1304,7 @@ def index(%{assigns: %{user: user}} = conn, _params) do
end end
def put_settings(%{assigns: %{user: user}} = conn, %{"data" => settings} = _params) do def put_settings(%{assigns: %{user: user}} = conn, %{"data" => settings} = _params) do
info_cng = User.Info.mastodon_settings_update(user.info, settings) with {:ok, _} <- User.update_info(user, &User.Info.mastodon_settings_update(&1, settings)) do
with changeset <- Changeset.change(user),
changeset <- Changeset.put_embed(changeset, :info, info_cng),
{:ok, _user} <- User.update_and_set_cache(changeset) do
json(conn, %{}) json(conn, %{})
else else
e -> e ->
@ -0,0 +1,57 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.MastodonAPI.NotificationController do
use Pleroma.Web, :controller
import Pleroma.Web.ControllerHelper, only: [add_link_headers: 2]
alias Pleroma.Notification
alias Pleroma.Web.MastodonAPI.MastodonAPI
# GET /api/v1/notifications
def index(%{assigns: %{user: user}} = conn, params) do
notifications = MastodonAPI.get_notifications(user, params)
conn
|> add_link_headers(notifications)
|> render("index.json", notifications: notifications, for: user)
end
# GET /api/v1/notifications/:id
def show(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with {:ok, notification} <- Notification.get(user, id) do
render(conn, "show.json", notification: notification, for: user)
else
{:error, reason} ->
conn
|> put_status(:forbidden)
|> json(%{"error" => reason})
end
end
# POST /api/v1/notifications/clear
def clear(%{assigns: %{user: user}} = conn, _params) do
Notification.clear(user)
json(conn, %{})
end
# POST /api/v1/notifications/dismiss
def dismiss(%{assigns: %{user: user}} = conn, %{"id" => id} = _params) do
with {:ok, _notif} <- Notification.dismiss(user, id) do
json(conn, %{})
else
{:error, reason} ->
conn
|> put_status(:forbidden)
|> json(%{"error" => reason})
end
end
# DELETE /api/v1/notifications/destroy_multiple
def destroy_multiple(%{assigns: %{user: user}} = conn, %{"ids" => ids} = _params) do
Notification.destroy_multiple(user, ids)
json(conn, %{})
end
end
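# Request sketch for the routes above (IDs invented); each mutating endpoint
# answers with an empty JSON object on success:
#
#   POST   /api/v1/notifications/dismiss            body: {"id": "123"}
#   POST   /api/v1/notifications/clear
#   DELETE /api/v1/notifications/destroy_multiple   query: ids[]=123&ids[]=456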
@ -19,9 +19,10 @@ defmodule Pleroma.Web.MastodonAPI.SearchController do
def account_search(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do def account_search(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
accounts = User.search(query, search_options(params, user)) accounts = User.search(query, search_options(params, user))
res = AccountView.render("accounts.json", users: accounts, for: user, as: :user)
json(conn, res) conn
|> put_view(AccountView)
|> render("accounts.json", users: accounts, for: user, as: :user)
end end
def search2(conn, params), do: do_search(:v2, conn, params) def search2(conn, params), do: do_search(:v2, conn, params)
@ -74,10 +74,18 @@ defp do_render("account.json", %{user: user} = opts) do
user_info = User.get_cached_user_info(user) user_info = User.get_cached_user_info(user)
following_count = following_count =
((!user.info.hide_follows or opts[:for] == user) && user_info.following_count) || 0 if !user.info.hide_follows_count or !user.info.hide_follows or opts[:for] == user do
user_info.following_count
else
0
end
followers_count = followers_count =
((!user.info.hide_followers or opts[:for] == user) && user_info.follower_count) || 0 if !user.info.hide_followers_count or !user.info.hide_followers or opts[:for] == user do
user_info.follower_count
else
0
end
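# Worked example of the counter logic above, for a viewer other than the account
# owner: hiding only the follower/following list no longer zeroes the counter;
# it reads 0 only when the matching hide_*_count flag is set as well.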
bot = (user.info.source_data["type"] || "Person") in ["Application", "Service"] bot = (user.info.source_data["type"] || "Person") in ["Application", "Service"]
@ -108,6 +116,8 @@ defp do_render("account.json", %{user: user} = opts) do
bio = HTML.filter_tags(user.bio, User.html_filter_policy(opts[:for])) bio = HTML.filter_tags(user.bio, User.html_filter_policy(opts[:for]))
relationship = render("relationship.json", %{user: opts[:for], target: user}) relationship = render("relationship.json", %{user: opts[:for], target: user})
discoverable = user.info.discoverable
%{ %{
id: to_string(user.id), id: to_string(user.id),
username: username_from_nickname(user.nickname), username: username_from_nickname(user.nickname),
@ -131,13 +141,17 @@ defp do_render("account.json", %{user: user} = opts) do
note: HTML.strip_tags((user.bio || "") |> String.replace("<br>", "\n")), note: HTML.strip_tags((user.bio || "") |> String.replace("<br>", "\n")),
sensitive: false, sensitive: false,
fields: raw_fields, fields: raw_fields,
pleroma: %{} pleroma: %{
discoverable: discoverable
}
}, },
# Pleroma extension # Pleroma extension
pleroma: %{ pleroma: %{
confirmation_pending: user_info.confirmation_pending, confirmation_pending: user_info.confirmation_pending,
tags: user.tags, tags: user.tags,
hide_followers_count: user.info.hide_followers_count,
hide_follows_count: user.info.hide_follows_count,
hide_followers: user.info.hide_followers, hide_followers: user.info.hide_followers,
hide_follows: user.info.hide_follows, hide_follows: user.info.hide_follows,
hide_favorites: user.info.hide_favorites, hide_favorites: user.info.hide_favorites,
@ -3,6 +3,7 @@
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Metadata.Utils do defmodule Pleroma.Web.Metadata.Utils do
alias Pleroma.Emoji
alias Pleroma.Formatter alias Pleroma.Formatter
alias Pleroma.HTML alias Pleroma.HTML
alias Pleroma.Web.MediaProxy alias Pleroma.Web.MediaProxy
@ -13,7 +14,7 @@ def scrub_html_and_truncate(%{data: %{"content" => content}} = object) do
|> HtmlEntities.decode() |> HtmlEntities.decode()
|> String.replace(~r/<br\s?\/?>/, " ") |> String.replace(~r/<br\s?\/?>/, " ")
|> HTML.get_cached_stripped_html_for_activity(object, "metadata") |> HTML.get_cached_stripped_html_for_activity(object, "metadata")
|> Formatter.demojify() |> Emoji.Formatter.demojify()
|> Formatter.truncate() |> Formatter.truncate()
end end
@ -23,7 +24,7 @@ def scrub_html_and_truncate(content, max_length \\ 200) when is_binary(content)
|> HtmlEntities.decode() |> HtmlEntities.decode()
|> String.replace(~r/<br\s?\/?>/, " ") |> String.replace(~r/<br\s?\/?>/, " ")
|> HTML.strip_tags() |> HTML.strip_tags()
|> Formatter.demojify() |> Emoji.Formatter.demojify()
|> Formatter.truncate(max_length) |> Formatter.truncate(max_length)
end end
@ -57,6 +57,7 @@ def raw_nodeinfo do
"mastodon_api_streaming", "mastodon_api_streaming",
"polls", "polls",
"pleroma_explicit_addressing", "pleroma_explicit_addressing",
"shareable_emoji_packs",
if Config.get([:media_proxy, :enabled]) do if Config.get([:media_proxy, :enabled]) do
"media_proxy" "media_proxy"
end, end,
@ -20,7 +20,7 @@ defmodule Pleroma.Web.OAuth.Authorization do
field(:scopes, {:array, :string}, default: []) field(:scopes, {:array, :string}, default: [])
field(:valid_until, :naive_datetime_usec) field(:valid_until, :naive_datetime_usec)
field(:used, :boolean, default: false) field(:used, :boolean, default: false)
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:app, App) belongs_to(:app, App)
timestamps() timestamps()
@ -202,6 +202,8 @@ def token_exchange(
{:ok, app} <- Token.Utils.fetch_app(conn), {:ok, app} <- Token.Utils.fetch_app(conn),
{:auth_active, true} <- {:auth_active, User.auth_active?(user)}, {:auth_active, true} <- {:auth_active, User.auth_active?(user)},
{:user_active, true} <- {:user_active, !user.info.deactivated}, {:user_active, true} <- {:user_active, !user.info.deactivated},
{:password_reset_pending, false} <-
{:password_reset_pending, user.info.password_reset_pending},
{:ok, scopes} <- validate_scopes(app, params), {:ok, scopes} <- validate_scopes(app, params),
{:ok, auth} <- Authorization.create_authorization(app, user, scopes), {:ok, auth} <- Authorization.create_authorization(app, user, scopes),
{:ok, token} <- Token.exchange_token(app, auth) do {:ok, token} <- Token.exchange_token(app, auth) do
@ -215,6 +217,9 @@ def token_exchange(
{:user_active, false} -> {:user_active, false} ->
render_error(conn, :forbidden, "Your account is currently disabled") render_error(conn, :forbidden, "Your account is currently disabled")
{:password_reset_pending, true} ->
render_error(conn, :forbidden, "Password reset is required")
_error -> _error ->
render_invalid_credentials_error(conn) render_invalid_credentials_error(conn)
end end
@ -21,7 +21,7 @@ defmodule Pleroma.Web.OAuth.Token do
field(:refresh_token, :string) field(:refresh_token, :string)
field(:scopes, {:array, :string}, default: []) field(:scopes, {:array, :string}, default: [])
field(:valid_until, :naive_datetime_usec) field(:valid_until, :naive_datetime_usec)
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:app, App) belongs_to(:app, App)
timestamps() timestamps()
@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server # Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/> # Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OAuth.Token.CleanWorker do defmodule Pleroma.Web.OAuth.Token.CleanWorker do
@ -1,5 +1,5 @@
# Pleroma: A lightweight social networking server # Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/> # Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OAuth.Token.Query do defmodule Pleroma.Web.OAuth.Token.Query do
@ -3,14 +3,12 @@
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OStatus do defmodule Pleroma.Web.OStatus do
import Ecto.Query
import Pleroma.Web.XML import Pleroma.Web.XML
require Logger require Logger
alias Pleroma.Activity alias Pleroma.Activity
alias Pleroma.HTTP alias Pleroma.HTTP
alias Pleroma.Object alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web alias Pleroma.Web
alias Pleroma.Web.ActivityPub.ActivityPub alias Pleroma.Web.ActivityPub.ActivityPub
@ -38,21 +36,13 @@ def is_representable?(%Activity{} = activity) do
end end
end end
def feed_path(user) do def feed_path(user), do: "#{user.ap_id}/feed.atom"
"#{user.ap_id}/feed.atom"
end
def pubsub_path(user) do def pubsub_path(user), do: "#{Web.base_url()}/push/hub/#{user.nickname}"
"#{Web.base_url()}/push/hub/#{user.nickname}"
end
def salmon_path(user) do def salmon_path(user), do: "#{user.ap_id}/salmon"
"#{user.ap_id}/salmon"
end
def remote_follow_path do def remote_follow_path, do: "#{Web.base_url()}/ostatus_subscribe?acct={uri}"
"#{Web.base_url()}/ostatus_subscribe?acct={uri}"
end
def handle_incoming(xml_string, options \\ []) do def handle_incoming(xml_string, options \\ []) do
with doc when doc != :error <- parse_document(xml_string) do with doc when doc != :error <- parse_document(xml_string) do
@ -217,10 +207,9 @@ def get_content(entry) do
Get the cw that mastodon uses. Get the cw that mastodon uses.
""" """
def get_cw(entry) do def get_cw(entry) do
with cw when not is_nil(cw) <- string_from_xpath("/*/summary", entry) do case string_from_xpath("/*/summary", entry) do
cw cw when not is_nil(cw) -> cw
else _ -> nil
_e -> nil
end end
end end
@ -232,19 +221,17 @@ def get_tags(entry) do
end end
def maybe_update(doc, user) do def maybe_update(doc, user) do
if "true" == string_from_xpath("//author[1]/ap_enabled", doc) do case string_from_xpath("//author[1]/ap_enabled", doc) do
Transmogrifier.upgrade_user_from_ap_id(user.ap_id) "true" ->
else Transmogrifier.upgrade_user_from_ap_id(user.ap_id)
maybe_update_ostatus(doc, user)
_ ->
maybe_update_ostatus(doc, user)
end end
end end
def maybe_update_ostatus(doc, user) do def maybe_update_ostatus(doc, user) do
old_data = %{ old_data = Map.take(user, [:bio, :avatar, :name])
avatar: user.avatar,
bio: user.bio,
name: user.name
}
with false <- user.local, with false <- user.local,
avatar <- make_avatar_object(doc), avatar <- make_avatar_object(doc),
@ -279,38 +266,37 @@ def find_make_or_update_actor(doc) do
end end
end end
@spec find_or_make_user(String.t()) :: {:ok, User.t()}
def find_or_make_user(uri) do def find_or_make_user(uri) do
query = from(user in User, where: user.ap_id == ^uri) case User.get_by_ap_id(uri) do
%User{} = user -> {:ok, user}
user = Repo.one(query) _ -> make_user(uri)
if is_nil(user) do
make_user(uri)
else
{:ok, user}
end end
end end
@spec make_user(String.t(), boolean()) :: {:ok, User.t()} | {:error, any()}
def make_user(uri, update \\ false) do def make_user(uri, update \\ false) do
with {:ok, info} <- gather_user_info(uri) do with {:ok, info} <- gather_user_info(uri) do
data = %{
name: info["name"],
nickname: info["nickname"] <> "@" <> info["host"],
ap_id: info["uri"],
info: info,
avatar: info["avatar"],
bio: info["bio"]
}
with false <- update, with false <- update,
%User{} = user <- User.get_cached_by_ap_id(data.ap_id) do %User{} = user <- User.get_cached_by_ap_id(info["uri"]) do
{:ok, user} {:ok, user}
else else
_e -> User.insert_or_update_user(data) _e -> User.insert_or_update_user(build_user_data(info))
end end
end end
end end
defp build_user_data(info) do
%{
name: info["name"],
nickname: info["nickname"] <> "@" <> info["host"],
ap_id: info["uri"],
info: info,
avatar: info["avatar"],
bio: info["bio"]
}
end
# TODO: Just takes the first one for now. # TODO: Just takes the first one for now.
def make_avatar_object(author_doc, rel \\ "avatar") do def make_avatar_object(author_doc, rel \\ "avatar") do
href = string_from_xpath("//author[1]/link[@rel=\"#{rel}\"]/@href", author_doc) href = string_from_xpath("//author[1]/link[@rel=\"#{rel}\"]/@href", author_doc)
@ -319,23 +305,23 @@ def make_avatar_object(author_doc, rel \\ "avatar") do
if href do if href do
%{ %{
"type" => "Image", "type" => "Image",
"url" => [ "url" => [%{"type" => "Link", "mediaType" => type, "href" => href}]
%{
"type" => "Link",
"mediaType" => type,
"href" => href
}
]
} }
else else
nil nil
end end
end end
@spec gather_user_info(String.t()) :: {:ok, map()} | {:error, any()}
def gather_user_info(username) do def gather_user_info(username) do
with {:ok, webfinger_data} <- WebFinger.finger(username), with {:ok, webfinger_data} <- WebFinger.finger(username),
{:ok, feed_data} <- Websub.gather_feed_data(webfinger_data["topic"]) do {:ok, feed_data} <- Websub.gather_feed_data(webfinger_data["topic"]) do
{:ok, Map.merge(webfinger_data, feed_data) |> Map.put("fqn", username)} data =
webfinger_data
|> Map.merge(feed_data)
|> Map.put("fqn", username)
{:ok, data}
else else
e -> e ->
Logger.debug(fn -> "Couldn't gather info for #{username}" end) Logger.debug(fn -> "Couldn't gather info for #{username}" end)
@ -371,10 +357,7 @@ def get_atom_url(body) do
def fetch_activity_from_atom_url(url, options \\ []) do def fetch_activity_from_atom_url(url, options \\ []) do
with true <- String.starts_with?(url, "http"), with true <- String.starts_with?(url, "http"),
{:ok, %{body: body, status: code}} when code in 200..299 <- {:ok, %{body: body, status: code}} when code in 200..299 <-
HTTP.get( HTTP.get(url, [{:Accept, "application/atom+xml"}]) do
url,
[{:Accept, "application/atom+xml"}]
) do
Logger.debug("Got document from #{url}, handling...") Logger.debug("Got document from #{url}, handling...")
handle_incoming(body, options) handle_incoming(body, options)
else else
@ -55,12 +55,11 @@ def feed_redirect(conn, %{"nickname" => nickname}) do
def feed(conn, %{"nickname" => nickname} = params) do def feed(conn, %{"nickname" => nickname} = params) do
with {_, %User{} = user} <- {:fetch_user, User.get_cached_by_nickname(nickname)} do with {_, %User{} = user} <- {:fetch_user, User.get_cached_by_nickname(nickname)} do
query_params =
Map.take(params, ["max_id"])
|> Map.merge(%{"whole_db" => true, "actor_id" => user.ap_id})
activities = activities =
ActivityPub.fetch_public_activities(query_params) params
|> Map.take(["max_id"])
|> Map.merge(%{"whole_db" => true, "actor_id" => user.ap_id})
|> ActivityPub.fetch_public_activities()
|> Enum.reverse() |> Enum.reverse()
response = response =
@ -98,8 +97,7 @@ def salmon_incoming(conn, _) do
Federator.incoming_doc(doc) Federator.incoming_doc(doc)
conn send_resp(conn, 200, "")
|> send_resp(200, "")
end end
def object(%{assigns: %{format: format}} = conn, %{"uuid" => _uuid}) def object(%{assigns: %{format: format}} = conn, %{"uuid" => _uuid})
@ -218,7 +216,8 @@ defp represent_activity(
conn conn
|> put_resp_header("content-type", "application/activity+json") |> put_resp_header("content-type", "application/activity+json")
|> json(ObjectView.render("object.json", %{object: object})) |> put_view(ObjectView)
|> render("object.json", %{object: object})
end end
defp represent_activity(_conn, "activity+json", _, _) do defp represent_activity(_conn, "activity+json", _, _) do
@ -0,0 +1,617 @@
defmodule Pleroma.Web.PleromaAPI.EmojiAPIController do
use Pleroma.Web, :controller
require Logger
def emoji_dir_path do
Path.join(
Pleroma.Config.get!([:instance, :static_dir]),
"emoji"
)
end
@doc """
Lists packs from the remote instance.
Since JS cannot ask remote instances for their packs due to CORS, it has to
be done by the server.
"""
def list_from(conn, %{"instance_address" => address}) do
address = String.trim(address)
if shareable_packs_available(address) do
list_resp =
"#{address}/api/pleroma/emoji/packs" |> Tesla.get!() |> Map.get(:body) |> Jason.decode!()
json(conn, list_resp)
else
conn
|> put_status(:internal_server_error)
|> json(%{error: "The requested instance does not support sharing emoji packs"})
end
end
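# Example flow (address invented): %{"instance_address" => "https://friends.example"}
# makes this instance fetch https://friends.example/api/pleroma/emoji/packs and
# relay the decoded JSON back to the requesting admin frontend.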
@doc """
Lists the packs available on the instance as JSON.
The information is public and does not require authentication. The format is
a map of "pack directory name" to pack.json contents.
"""
def list_packs(conn, _params) do
# Create the directory first if it does not exist. This is probably the first request made
# with the API, so creating it here should be sufficient.
with {:create_dir, :ok} <- {:create_dir, File.mkdir_p(emoji_dir_path())},
{:ls, {:ok, results}} <- {:ls, File.ls(emoji_dir_path())} do
pack_infos =
results
|> Enum.filter(&has_pack_json?/1)
|> Enum.map(&load_pack/1)
# Check if all the files are in place and can be sent
|> Enum.map(&validate_pack/1)
# Transform into a map of pack-name => pack-data
|> Enum.into(%{})
json(conn, pack_infos)
else
{:create_dir, {:error, e}} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "Failed to create the emoji pack directory at #{emoji_dir_path()}: #{e}"})
{:ls, {:error, e}} ->
conn
|> put_status(:internal_server_error)
|> json(%{
error:
"Failed to get the contents of the emoji pack directory at #{emoji_dir_path()}: #{e}"
})
end
end
defp has_pack_json?(file) do
dir_path = Path.join(emoji_dir_path(), file)
# Filter to only use the pack.json packs
File.dir?(dir_path) and File.exists?(Path.join(dir_path, "pack.json"))
end
defp load_pack(pack_name) do
pack_path = Path.join(emoji_dir_path(), pack_name)
pack_file = Path.join(pack_path, "pack.json")
{pack_name, Jason.decode!(File.read!(pack_file))}
end
defp validate_pack({name, pack}) do
pack_path = Path.join(emoji_dir_path(), name)
if can_download?(pack, pack_path) do
archive_for_sha = make_archive(name, pack, pack_path)
archive_sha = :crypto.hash(:sha256, archive_for_sha) |> Base.encode16()
pack =
pack
|> put_in(["pack", "can-download"], true)
|> put_in(["pack", "download-sha256"], archive_sha)
{name, pack}
else
{name, put_in(pack, ["pack", "can-download"], false)}
end
end
defp can_download?(pack, pack_path) do
# If the pack is set as shared, check if it can be downloaded
# That means that when asked, the pack can be packed and sent to the remote
# Otherwise, they'd have to download it from fallback-src
pack["pack"]["share-files"] &&
Enum.all?(pack["files"], fn {_, path} ->
File.exists?(Path.join(pack_path, path))
end)
end
defp create_archive_and_cache(name, pack, pack_dir, md5) do
files =
['pack.json'] ++
(pack["files"] |> Enum.map(fn {_, path} -> to_charlist(path) end))
{:ok, {_, zip_result}} = :zip.zip('#{name}.zip', files, [:memory, cwd: to_charlist(pack_dir)])
cache_seconds_per_file = Pleroma.Config.get!([:emoji, :shared_pack_cache_seconds_per_file])
cache_ms = :timer.seconds(cache_seconds_per_file * Enum.count(files))
Cachex.put!(
:emoji_packs_cache,
name,
# if pack.json MD5 changes, the cache is not valid anymore
%{pack_file_md5: md5, pack_data: zip_result},
# Cache for `shared_pack_cache_seconds_per_file` seconds per file in the pack
ttl: cache_ms
)
Logger.debug("Created an archive for the '#{name}' emoji pack, \
keeping it in cache for #{div(cache_ms, 1000)}s")
zip_result
end
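# Worked example, assuming `shared_pack_cache_seconds_per_file` is set to a
# hypothetical 60s: a pack with 24 emoji plus pack.json is 25 files, so the
# archive stays cached for 25 * 60 = 1500 seconds (25 minutes).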
defp make_archive(name, pack, pack_dir) do
# Having a different pack.json md5 invalidates cache
pack_file_md5 = :crypto.hash(:md5, File.read!(Path.join(pack_dir, "pack.json")))
case Cachex.get!(:emoji_packs_cache, name) do
%{pack_file_md5: ^pack_file_md5, pack_data: zip_result} ->
Logger.debug("Using cache for the '#{name}' shared emoji pack")
zip_result
_ ->
create_archive_and_cache(name, pack, pack_dir, pack_file_md5)
end
end
@doc """
An endpoint for other instances (via admin UI) or users (via browser)
to download packs that the instance shares.
"""
def download_shared(conn, %{"name" => name}) do
pack_dir = Path.join(emoji_dir_path(), name)
pack_file = Path.join(pack_dir, "pack.json")
with {_, true} <- {:exists?, File.exists?(pack_file)},
pack = Jason.decode!(File.read!(pack_file)),
{_, true} <- {:can_download?, can_download?(pack, pack_dir)} do
zip_result = make_archive(name, pack, pack_dir)
send_download(conn, {:binary, zip_result}, filename: "#{name}.zip")
else
{:can_download?, _} ->
conn
|> put_status(:forbidden)
|> json(%{
error: "Pack #{name} cannot be downloaded from this instance, either pack sharing\
was disabled for this pack or some files are missing"
})
{:exists?, _} ->
conn
|> put_status(:not_found)
|> json(%{error: "Pack #{name} does not exist"})
end
end
defp shareable_packs_available(address) do
"#{address}/.well-known/nodeinfo"
|> Tesla.get!()
|> Map.get(:body)
|> Jason.decode!()
|> Map.get("links")
|> List.last()
|> Map.get("href")
# Get the actual nodeinfo address and fetch it
|> Tesla.get!()
|> Map.get(:body)
|> Jason.decode!()
|> get_in(["metadata", "features"])
|> Enum.member?("shareable_emoji_packs")
end
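# Sketch of the data this pipeline walks (field names follow the nodeinfo
# convention, values are placeholders):
#
#     /.well-known/nodeinfo -> %{"links" => [%{"href" => "https://example.tld/nodeinfo/2.1.json"}]}
#     that href             -> %{"metadata" => %{"features" => ["shareable_emoji_packs", ...]}}
#
# Note that the *last* entry in "links" is used, assuming it points to the
# newest nodeinfo document the remote advertises.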
@doc """
An admin endpoint to request downloading a pack named `pack_name` from the instance
`instance_address`.
If the requested instance's admin chose to share the pack, it will be downloaded
from that instance, otherwise it will be downloaded from the fallback source, if there is one.
"""
def download_from(conn, %{"instance_address" => address, "pack_name" => name} = data) do
address = String.trim(address)
if shareable_packs_available(address) do
full_pack =
"#{address}/api/pleroma/emoji/packs/list"
|> Tesla.get!()
|> Map.get(:body)
|> Jason.decode!()
|> Map.get(name)
pack_info_res =
case full_pack["pack"] do
%{"share-files" => true, "can-download" => true, "download-sha256" => sha} ->
{:ok,
%{
sha: sha,
uri: "#{address}/api/pleroma/emoji/packs/download_shared/#{name}"
}}
%{"fallback-src" => src, "fallback-src-sha256" => sha} when is_binary(src) ->
{:ok,
%{
sha: sha,
uri: src,
fallback: true
}}
_ ->
{:error,
"The pack was not set as shared and there is no fallback src to download from"}
end
with {:ok, %{sha: sha, uri: uri} = pinfo} <- pack_info_res,
%{body: emoji_archive} <- Tesla.get!(uri),
{_, true} <- {:checksum, Base.decode16!(sha) == :crypto.hash(:sha256, emoji_archive)} do
local_name = data["as"] || name
pack_dir = Path.join(emoji_dir_path(), local_name)
File.mkdir_p!(pack_dir)
files = Enum.map(full_pack["files"], fn {_, path} -> to_charlist(path) end)
# Fallback cannot contain a pack.json file
files = if pinfo[:fallback], do: files, else: ['pack.json'] ++ files
{:ok, _} = :zip.unzip(emoji_archive, cwd: to_charlist(pack_dir), file_list: files)
# Fallback can't contain a pack.json file, since that would cause the fallback-src-sha256
# in it to depend on itself
if pinfo[:fallback] do
pack_file_path = Path.join(pack_dir, "pack.json")
File.write!(pack_file_path, Jason.encode!(full_pack, pretty: true))
end
json(conn, "ok")
else
{:error, e} ->
conn |> put_status(:internal_server_error) |> json(%{error: e})
{:checksum, _} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "SHA256 for the pack doesn't match the one sent by the server"})
end
else
conn
|> put_status(:internal_server_error)
|> json(%{error: "The requested instance does not support sharing emoji packs"})
end
end
@doc """
Creates an empty pack named `name` which then can be updated via the admin UI.
"""
def create(conn, %{"name" => name}) do
pack_dir = Path.join(emoji_dir_path(), name)
if not File.exists?(pack_dir) do
File.mkdir_p!(pack_dir)
pack_file_p = Path.join(pack_dir, "pack.json")
File.write!(
pack_file_p,
Jason.encode!(%{pack: %{}, files: %{}}, pretty: true)
)
conn |> json("ok")
else
conn
|> put_status(:conflict)
|> json(%{error: "A pack named \"#{name}\" already exists"})
end
end
@doc """
Deletes the pack `name` and all its files.
"""
def delete(conn, %{"name" => name}) do
pack_dir = Path.join(emoji_dir_path(), name)
case File.rm_rf(pack_dir) do
{:ok, _} ->
conn |> json("ok")
{:error, _} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "Couldn't delete the pack #{name}"})
end
end
@doc """
An endpoint to update `pack_name`'s metadata.
`new_data` is the new metadata for the pack, which will replace the old metadata.
"""
def update_metadata(conn, %{"pack_name" => name, "new_data" => new_data}) do
pack_file_p = Path.join([emoji_dir_path(), name, "pack.json"])
full_pack = Jason.decode!(File.read!(pack_file_p))
# The new fallback-src is in the new data and it's not the same as it was in the old data
should_update_fb_sha =
not is_nil(new_data["fallback-src"]) and
new_data["fallback-src"] != full_pack["pack"]["fallback-src"]
with {_, true} <- {:should_update?, should_update_fb_sha},
%{body: pack_arch} <- Tesla.get!(new_data["fallback-src"]),
{:ok, flist} <- :zip.unzip(pack_arch, [:memory]),
{_, true} <- {:has_all_files?, has_all_files?(full_pack, flist)} do
fallback_sha = :crypto.hash(:sha256, pack_arch) |> Base.encode16()
new_data = Map.put(new_data, "fallback-src-sha256", fallback_sha)
update_metadata_and_send(conn, full_pack, new_data, pack_file_p)
else
{:should_update?, _} ->
update_metadata_and_send(conn, full_pack, new_data, pack_file_p)
{:has_all_files?, _} ->
conn
|> put_status(:bad_request)
|> json(%{error: "The fallback archive does not have all files specified in pack.json"})
end
end
# Check if all files from the pack.json are in the archive
defp has_all_files?(%{"files" => files}, flist) do
Enum.all?(files, fn {_, from_manifest} ->
Enum.find(flist, fn {from_archive, _} ->
to_string(from_archive) == from_manifest
end)
end)
end
defp update_metadata_and_send(conn, full_pack, new_data, pack_file_p) do
full_pack = Map.put(full_pack, "pack", new_data)
File.write!(pack_file_p, Jason.encode!(full_pack, pretty: true))
# Send new data back with fallback sha filled
json(conn, new_data)
end
defp get_filename(%{"filename" => filename}), do: filename
defp get_filename(%{"file" => file}) do
case file do
%Plug.Upload{filename: filename} -> filename
url when is_binary(url) -> Path.basename(url)
end
end
defp empty?(str), do: String.trim(str) == ""
defp update_file_and_send(conn, updated_full_pack, pack_file_p) do
# Write the emoji pack file
File.write!(pack_file_p, Jason.encode!(updated_full_pack, pretty: true))
# Return the modified file list
json(conn, updated_full_pack["files"])
end
@doc """
Updates a file in a pack.
Updating can mean three things:
- `add` adds an emoji named `shortcode` to the pack `pack_name`,
that means that the emoji file needs to be uploaded with the request
(thus requiring it to be a multipart request) and be named `file`.
There can also be an optional `filename` that will be the new emoji file name
(if it's not there, the name will be taken from the uploaded file).
- `update` changes the emoji shortcode (from `shortcode` to `new_shortcode`) or moves the file
(from the current filename to `new_filename`)
- `remove` removes the emoji named `shortcode` and its associated file
"""
# Add
def update_file(
conn,
%{"pack_name" => pack_name, "action" => "add", "shortcode" => shortcode} = params
) do
pack_dir = Path.join(emoji_dir_path(), pack_name)
pack_file_p = Path.join(pack_dir, "pack.json")
full_pack = Jason.decode!(File.read!(pack_file_p))
with {_, false} <- {:has_shortcode, Map.has_key?(full_pack["files"], shortcode)},
filename <- get_filename(params),
false <- empty?(shortcode),
false <- empty?(filename) do
file_path = Path.join(pack_dir, filename)
# If the name contains directories, create them
if String.contains?(file_path, "/") do
File.mkdir_p!(Path.dirname(file_path))
end
case params["file"] do
%Plug.Upload{path: upload_path} ->
# Copy the uploaded file from the temporary directory
File.copy!(upload_path, file_path)
url when is_binary(url) ->
# Download and write the file
file_contents = Tesla.get!(url).body
File.write!(file_path, file_contents)
end
updated_full_pack = put_in(full_pack, ["files", shortcode], filename)
update_file_and_send(conn, updated_full_pack, pack_file_p)
else
{:has_shortcode, _} ->
conn
|> put_status(:conflict)
|> json(%{error: "An emoji with the \"#{shortcode}\" shortcode already exists"})
true ->
conn
|> put_status(:bad_request)
|> json(%{error: "shortcode or filename cannot be empty"})
end
end
# Remove
def update_file(conn, %{
"pack_name" => pack_name,
"action" => "remove",
"shortcode" => shortcode
}) do
pack_dir = Path.join(emoji_dir_path(), pack_name)
pack_file_p = Path.join(pack_dir, "pack.json")
full_pack = Jason.decode!(File.read!(pack_file_p))
if Map.has_key?(full_pack["files"], shortcode) do
{emoji_file_path, updated_full_pack} = pop_in(full_pack, ["files", shortcode])
emoji_file_path = Path.join(pack_dir, emoji_file_path)
# Delete the emoji file
File.rm!(emoji_file_path)
# If the old directory has no more files, remove it
if String.contains?(emoji_file_path, "/") do
dir = Path.dirname(emoji_file_path)
if Enum.empty?(File.ls!(dir)) do
File.rmdir!(dir)
end
end
update_file_and_send(conn, updated_full_pack, pack_file_p)
else
conn
|> put_status(:bad_request)
|> json(%{error: "Emoji \"#{shortcode}\" does not exist"})
end
end
# Update
def update_file(
conn,
%{"pack_name" => pack_name, "action" => "update", "shortcode" => shortcode} = params
) do
pack_dir = Path.join(emoji_dir_path(), pack_name)
pack_file_p = Path.join(pack_dir, "pack.json")
full_pack = Jason.decode!(File.read!(pack_file_p))
with {_, true} <- {:has_shortcode, Map.has_key?(full_pack["files"], shortcode)},
%{"new_shortcode" => new_shortcode, "new_filename" => new_filename} <- params,
false <- empty?(new_shortcode),
false <- empty?(new_filename) do
# First, remove the old shortcode, saving the old path
{old_emoji_file_path, updated_full_pack} = pop_in(full_pack, ["files", shortcode])
old_emoji_file_path = Path.join(pack_dir, old_emoji_file_path)
new_emoji_file_path = Path.join(pack_dir, new_filename)
# If the name contains directories, create them
if String.contains?(new_emoji_file_path, "/") do
File.mkdir_p!(Path.dirname(new_emoji_file_path))
end
# Move/Rename the old filename to a new filename
# These are probably on the same filesystem, so just rename should work
:ok = File.rename(old_emoji_file_path, new_emoji_file_path)
# If the old directory has no more files, remove it
if String.contains?(old_emoji_file_path, "/") do
dir = Path.dirname(old_emoji_file_path)
if Enum.empty?(File.ls!(dir)) do
File.rmdir!(dir)
end
end
# Then, put in the new shortcode with the new path
updated_full_pack = put_in(updated_full_pack, ["files", new_shortcode], new_filename)
update_file_and_send(conn, updated_full_pack, pack_file_p)
else
{:has_shortcode, _} ->
conn
|> put_status(:bad_request)
|> json(%{error: "Emoji \"#{shortcode}\" does not exist"})
true ->
conn
|> put_status(:bad_request)
|> json(%{error: "new_shortcode or new_filename cannot be empty"})
_ ->
conn
|> put_status(:bad_request)
|> json(%{error: "new_shortcode or new_file were not specified"})
end
end
def update_file(conn, %{"action" => action}) do
conn
|> put_status(:bad_request)
|> json(%{error: "Unknown action: #{action}"})
end
@doc """
Imports emoji from the filesystem.
Importing means scanning the directories in
`$instance_static/emoji/` for ones which do not have
`pack.json`. If one has an emoji.txt file, that file will be used
to create a `pack.json` file with its contents. If the directory has
neither, all the files with specific configured extensions will be
assumed to be emojis and stored in the new `pack.json` file.
"""
def import_from_fs(conn, _params) do
with {:ok, results} <- File.ls(emoji_dir_path()) do
imported_pack_names =
results
|> Enum.filter(fn file ->
dir_path = Path.join(emoji_dir_path(), file)
# Find the directories that do NOT have pack.json
File.dir?(dir_path) and not File.exists?(Path.join(dir_path, "pack.json"))
end)
|> Enum.map(&write_pack_json_contents/1)
json(conn, imported_pack_names)
else
{:error, _} ->
conn
|> put_status(:internal_server_error)
|> json(%{error: "Error accessing emoji pack directory"})
end
end
defp write_pack_json_contents(dir) do
dir_path = Path.join(emoji_dir_path(), dir)
emoji_txt_path = Path.join(dir_path, "emoji.txt")
files_for_pack = files_for_pack(emoji_txt_path, dir_path)
pack_json_contents = Jason.encode!(%{pack: %{}, files: files_for_pack})
File.write!(Path.join(dir_path, "pack.json"), pack_json_contents)
dir
end
defp files_for_pack(emoji_txt_path, dir_path) do
if File.exists?(emoji_txt_path) do
# There's an emoji.txt file, it's likely from a pack installed by the pack manager.
# Make a pack.json file from the contents of that emoji.txt file
# FIXME: Copy-pasted from Pleroma.Emoji/load_from_file_stream/2
# Create a map of shortcodes to filenames from emoji.txt
File.read!(emoji_txt_path)
|> String.split("\n")
|> Enum.map(&String.trim/1)
|> Enum.map(fn line ->
case String.split(line, ~r/,\s*/) do
# This matches both strings with and without tags
# and we don't care about tags here
[name, file | _] -> {name, file}
_ -> nil
end
end)
|> Enum.filter(fn x -> not is_nil(x) end)
|> Enum.into(%{})
else
# If there's no emoji.txt, assume all files
# that are of certain extensions from the config are emojis and import them all
pack_extensions = Pleroma.Config.get!([:emoji, :pack_extensions])
Pleroma.Emoji.Loader.make_shortcode_to_file_map(dir_path, pack_extensions)
end
end
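# Example of the emoji.txt conversion above (tags after the file name are
# ignored): the line "blank, blank.png, Fun" becomes {"blank", "blank.png"}
# and ends up as "blank" => "blank.png" in the generated pack.json.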
end

View file

@ -15,7 +15,7 @@ defmodule Pleroma.Web.Push.Subscription do
@type t :: %__MODULE__{} @type t :: %__MODULE__{}
schema "push_subscriptions" do schema "push_subscriptions" do
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:token, Token) belongs_to(:token, Token)
field(:endpoint, :string) field(:endpoint, :string)
field(:key_p256dh, :string) field(:key_p256dh, :string)

View file

@ -180,12 +180,13 @@ defmodule Pleroma.Web.Router do
post("/relay", AdminAPIController, :relay_follow) post("/relay", AdminAPIController, :relay_follow)
delete("/relay", AdminAPIController, :relay_unfollow) delete("/relay", AdminAPIController, :relay_unfollow)
get("/users/invite_token", AdminAPIController, :get_invite_token) post("/users/invite_token", AdminAPIController, :create_invite_token)
get("/users/invites", AdminAPIController, :invites) get("/users/invites", AdminAPIController, :invites)
post("/users/revoke_invite", AdminAPIController, :revoke_invite) post("/users/revoke_invite", AdminAPIController, :revoke_invite)
post("/users/email_invite", AdminAPIController, :email_invite) post("/users/email_invite", AdminAPIController, :email_invite)
get("/users/:nickname/password_reset", AdminAPIController, :get_password_reset) get("/users/:nickname/password_reset", AdminAPIController, :get_password_reset)
patch("/users/:nickname/force_password_reset", AdminAPIController, :force_password_reset)
get("/users", AdminAPIController, :list_users) get("/users", AdminAPIController, :list_users)
get("/users/:nickname", AdminAPIController, :user_show) get("/users/:nickname", AdminAPIController, :user_show)
@ -205,6 +206,30 @@ defmodule Pleroma.Web.Router do
get("/config/migrate_from_db", AdminAPIController, :migrate_from_db) get("/config/migrate_from_db", AdminAPIController, :migrate_from_db)
get("/moderation_log", AdminAPIController, :list_log) get("/moderation_log", AdminAPIController, :list_log)
post("/reload_emoji", AdminAPIController, :reload_emoji)
end
scope "/api/pleroma/emoji", Pleroma.Web.PleromaAPI do
scope "/packs" do
# Modifying packs
pipe_through([:admin_api, :oauth_write])
post("/import_from_fs", EmojiAPIController, :import_from_fs)
post("/:pack_name/update_file", EmojiAPIController, :update_file)
post("/:pack_name/update_metadata", EmojiAPIController, :update_metadata)
put("/:name", EmojiAPIController, :create)
delete("/:name", EmojiAPIController, :delete)
post("/download_from", EmojiAPIController, :download_from)
post("/list_from", EmojiAPIController, :list_from)
end
scope "/packs" do
# Pack info / downloading
get("/", EmojiAPIController, :list_packs)
get("/:name/download_shared/", EmojiAPIController, :download_shared)
end
end end
scope "/", Pleroma.Web.TwitterAPI do scope "/", Pleroma.Web.TwitterAPI do
@ -308,14 +333,11 @@ defmodule Pleroma.Web.Router do
get("/favourites", MastodonAPIController, :favourites) get("/favourites", MastodonAPIController, :favourites)
get("/bookmarks", MastodonAPIController, :bookmarks) get("/bookmarks", MastodonAPIController, :bookmarks)
post("/notifications/clear", MastodonAPIController, :clear_notifications) get("/notifications", NotificationController, :index)
get("/notifications/:id", NotificationController, :show)
post("/notifications/dismiss", MastodonAPIController, :dismiss_notification) post("/notifications/clear", NotificationController, :clear)
post("/notifications/dismiss", NotificationController, :dismiss)
get("/notifications", MastodonAPIController, :notifications) delete("/notifications/destroy_multiple", NotificationController, :destroy_multiple)
get("/notifications/:id", MastodonAPIController, :get_notification)
delete("/notifications/destroy_multiple", MastodonAPIController, :destroy_multiple)
get("/scheduled_statuses", MastodonAPIController, :scheduled_statuses) get("/scheduled_statuses", MastodonAPIController, :scheduled_statuses)
get("/scheduled_statuses/:id", MastodonAPIController, :show_scheduled_status) get("/scheduled_statuses/:id", MastodonAPIController, :show_scheduled_status)

View file

@ -4,16 +4,18 @@ defmodule Pleroma.Web.Streamer.State do
alias Pleroma.Web.Streamer.StreamerSocket alias Pleroma.Web.Streamer.StreamerSocket
@env Mix.env()
def start_link(_) do def start_link(_) do
GenServer.start_link(__MODULE__, %{sockets: %{}}, name: __MODULE__) GenServer.start_link(__MODULE__, %{sockets: %{}}, name: __MODULE__)
end end
def add_socket(topic, socket) do def add_socket(topic, socket) do
GenServer.call(__MODULE__, {:add, socket, topic}) GenServer.call(__MODULE__, {:add, topic, socket})
end end
def remove_socket(topic, socket) do def remove_socket(topic, socket) do
GenServer.call(__MODULE__, {:remove, socket, topic}) do_remove_socket(@env, topic, socket)
end end
def get_sockets do def get_sockets do
@ -29,7 +31,7 @@ def handle_call(:get_state, _from, state) do
{:reply, state, state} {:reply, state, state}
end end
def handle_call({:add, socket, topic}, _from, %{sockets: sockets} = state) do def handle_call({:add, topic, socket}, _from, %{sockets: sockets} = state) do
internal_topic = internal_topic(topic, socket) internal_topic = internal_topic(topic, socket)
stream_socket = StreamerSocket.from_socket(socket) stream_socket = StreamerSocket.from_socket(socket)
@ -44,7 +46,7 @@ def handle_call({:add, socket, topic}, _from, %{sockets: sockets} = state) do
{:reply, state, state} {:reply, state, state}
end end
def handle_call({:remove, socket, topic}, _from, %{sockets: sockets} = state) do def handle_call({:remove, topic, socket}, _from, %{sockets: sockets} = state) do
internal_topic = internal_topic(topic, socket) internal_topic = internal_topic(topic, socket)
stream_socket = StreamerSocket.from_socket(socket) stream_socket = StreamerSocket.from_socket(socket)
@ -57,6 +59,14 @@ def handle_call({:remove, socket, topic}, _from, %{sockets: sockets} = state) do
{:reply, state, state} {:reply, state, state}
end end
defp do_remove_socket(:test, _, _) do
:ok
end
defp do_remove_socket(_env, topic, socket) do
GenServer.call(__MODULE__, {:remove, topic, socket})
end
defp internal_topic(topic, socket) defp internal_topic(topic, socket)
when topic in ~w[user user:notification direct] do when topic in ~w[user user:notification direct] do
"#{topic}:#{socket.assigns[:user].id}" "#{topic}:#{socket.assigns[:user].id}"

View file

@ -239,11 +239,9 @@ def version(conn, _params) do
def emoji(conn, _params) do def emoji(conn, _params) do
emoji = emoji =
Emoji.get_all() Enum.reduce(Emoji.get_all(), %{}, fn {code, %Emoji{file: file, tags: tags}}, acc ->
|> Enum.map(fn {short_code, path, tags} -> Map.put(acc, code, %{image_url: file, tags: tags})
{short_code, %{image_url: path, tags: tags}}
end) end)
|> Enum.into(%{})
json(conn, emoji) json(conn, emoji)
end end

View file

@ -5,7 +5,6 @@
defmodule Pleroma.Web.TwitterAPI.Controller do defmodule Pleroma.Web.TwitterAPI.Controller do
use Pleroma.Web, :controller use Pleroma.Web, :controller
alias Ecto.Changeset
alias Pleroma.Notification alias Pleroma.Notification
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web.OAuth.Token alias Pleroma.Web.OAuth.Token
@ -16,15 +15,12 @@ defmodule Pleroma.Web.TwitterAPI.Controller do
action_fallback(:errors) action_fallback(:errors)
def confirm_email(conn, %{"user_id" => uid, "token" => token}) do def confirm_email(conn, %{"user_id" => uid, "token" => token}) do
with %User{} = user <- User.get_cached_by_id(uid), new_info = [need_confirmation: false]
true <- user.local,
true <- user.info.confirmation_pending, with %User{info: info} = user <- User.get_cached_by_id(uid),
true <- user.info.confirmation_token == token, true <- user.local and info.confirmation_pending and info.confirmation_token == token,
info_change <- User.Info.confirmation_changeset(user.info, need_confirmation: false), {:ok, _} <- User.update_info(user, &User.Info.confirmation_changeset(&1, new_info)) do
changeset <- Changeset.change(user) |> Changeset.put_embed(:info, info_change), redirect(conn, to: "/")
{:ok, _} <- User.update_and_set_cache(changeset) do
conn
|> redirect(to: "/")
end end
end end

View file

@ -13,7 +13,7 @@ defmodule Pleroma.Web.Websub.WebsubClientSubscription do
field(:state, :string) field(:state, :string)
field(:subscribers, {:array, :string}, default: []) field(:subscribers, {:array, :string}, default: [])
field(:hub, :string) field(:hub, :string)
belongs_to(:user, User, type: Pleroma.FlakeId) belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
timestamps() timestamps()
end end

View file

@ -26,6 +26,11 @@ def perform(%{"op" => "delete_user", "user_id" => user_id}, _job) do
User.perform(:delete, user) User.perform(:delete, user)
end end
def perform(%{"op" => "force_password_reset", "user_id" => user_id}, _job) do
user = User.get_cached_by_id(user_id)
User.perform(:force_password_reset, user)
end
def perform( def perform(
%{ %{
"op" => "blocks_import", "op" => "blocks_import",

View file

@ -10,7 +10,11 @@ defmodule Pleroma.Workers.WebPusherWorker do
@impl Oban.Worker @impl Oban.Worker
def perform(%{"op" => "web_push", "notification_id" => notification_id}, _job) do def perform(%{"op" => "web_push", "notification_id" => notification_id}, _job) do
notification = Repo.get(Notification, notification_id) notification =
Notification
|> Repo.get(notification_id)
|> Repo.preload([:activity])
Pleroma.Web.Push.Impl.perform(notification) Pleroma.Web.Push.Impl.perform(notification)
end end
end end

12
mix.exs
View file

@ -5,7 +5,7 @@ def project do
[ [
app: :pleroma, app: :pleroma,
version: version("1.0.0"), version: version("1.0.0"),
elixir: "~> 1.7", elixir: "~> 1.8",
elixirc_paths: elixirc_paths(Mix.env()), elixirc_paths: elixirc_paths(Mix.env()),
compilers: [:phoenix, :gettext] ++ Mix.compilers(), compilers: [:phoenix, :gettext] ++ Mix.compilers(),
elixirc_options: [warnings_as_errors: true], elixirc_options: [warnings_as_errors: true],
@ -99,9 +99,9 @@ defp deps do
{:plug_cowboy, "~> 2.0"}, {:plug_cowboy, "~> 2.0"},
{:phoenix_pubsub, "~> 1.1"}, {:phoenix_pubsub, "~> 1.1"},
{:phoenix_ecto, "~> 4.0"}, {:phoenix_ecto, "~> 4.0"},
{:ecto_sql, "~> 3.1"}, {:ecto_sql, "~> 3.2"},
{:postgrex, ">= 0.13.5"}, {:postgrex, ">= 0.13.5"},
{:oban, "~> 0.7"}, {:oban, "~> 0.8.1"},
{:quantum, "~> 2.3"}, {:quantum, "~> 2.3"},
{:gettext, "~> 0.15"}, {:gettext, "~> 0.15"},
{:comeonin, "~> 4.1.1"}, {:comeonin, "~> 4.1.1"},
@ -113,7 +113,7 @@ defp deps do
{:calendar, "~> 0.17.4"}, {:calendar, "~> 0.17.4"},
{:cachex, "~> 3.0.2"}, {:cachex, "~> 3.0.2"},
{:poison, "~> 3.0", override: true}, {:poison, "~> 3.0", override: true},
{:tesla, "~> 1.2"}, {:tesla, "~> 1.3", override: true},
{:jason, "~> 1.0"}, {:jason, "~> 1.0"},
{:mogrify, "~> 0.6.1"}, {:mogrify, "~> 0.6.1"},
{:ex_aws, "~> 2.1"}, {:ex_aws, "~> 2.1"},
@ -158,6 +158,7 @@ defp deps do
{:ex_const, "~> 0.2"}, {:ex_const, "~> 0.2"},
{:plug_static_index_html, "~> 1.0.0"}, {:plug_static_index_html, "~> 1.0.0"},
{:excoveralls, "~> 0.11.1", only: :test}, {:excoveralls, "~> 0.11.1", only: :test},
{:flake_id, "~> 0.1.0"},
{:mox, "~> 0.5", only: :test} {:mox, "~> 0.5", only: :test}
] ++ oauth_deps() ] ++ oauth_deps()
end end
@ -174,7 +175,8 @@ defp aliases do
"ecto.rollback": ["pleroma.ecto.rollback"], "ecto.rollback": ["pleroma.ecto.rollback"],
"ecto.setup": ["ecto.create", "ecto.migrate", "run priv/repo/seeds.exs"], "ecto.setup": ["ecto.create", "ecto.migrate", "run priv/repo/seeds.exs"],
"ecto.reset": ["ecto.drop", "ecto.setup"], "ecto.reset": ["ecto.drop", "ecto.setup"],
test: ["ecto.create --quiet", "ecto.migrate", "test"] test: ["ecto.create --quiet", "ecto.migrate", "test"],
docs: ["pleroma.docs", "docs"]
] ]
end end

View file

@ -1,6 +1,7 @@
%{ %{
"accept": {:hex, :accept, "0.3.5", "b33b127abca7cc948bbe6caa4c263369abf1347cfa9d8e699c6d214660f10cd1", [:rebar3], [], "hexpm"}, "accept": {:hex, :accept, "0.3.5", "b33b127abca7cc948bbe6caa4c263369abf1347cfa9d8e699c6d214660f10cd1", [:rebar3], [], "hexpm"},
"auto_linker": {:git, "https://git.pleroma.social/pleroma/auto_linker.git", "95e8188490e97505c56636c1379ffdf036c1fdde", [ref: "95e8188490e97505c56636c1379ffdf036c1fdde"]}, "auto_linker": {:git, "https://git.pleroma.social/pleroma/auto_linker.git", "95e8188490e97505c56636c1379ffdf036c1fdde", [ref: "95e8188490e97505c56636c1379ffdf036c1fdde"]},
"base62": {:hex, :base62, "1.2.1", "4866763e08555a7b3917064e9eef9194c41667276c51b59de2bc42c6ea65f806", [:mix], [{:custom_base, "~> 0.2.1", [hex: :custom_base, repo: "hexpm", optional: false]}], "hexpm"},
"base64url": {:hex, :base64url, "0.0.1", "36a90125f5948e3afd7be97662a1504b934dd5dac78451ca6e9abf85a10286be", [:rebar], [], "hexpm"}, "base64url": {:hex, :base64url, "0.0.1", "36a90125f5948e3afd7be97662a1504b934dd5dac78451ca6e9abf85a10286be", [:rebar], [], "hexpm"},
"bbcode": {:hex, :bbcode, "0.1.1", "0023e2c7814119b2e620b7add67182e3f6019f92bfec9a22da7e99821aceba70", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"}, "bbcode": {:hex, :bbcode, "0.1.1", "0023e2c7814119b2e620b7add67182e3f6019f92bfec9a22da7e99821aceba70", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"},
"benchee": {:hex, :benchee, "1.0.1", "66b211f9bfd84bd97e6d1beaddf8fc2312aaabe192f776e8931cb0c16f53a521", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}], "hexpm"}, "benchee": {:hex, :benchee, "1.0.1", "66b211f9bfd84bd97e6d1beaddf8fc2312aaabe192f776e8931cb0c16f53a521", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}], "hexpm"},
@ -17,12 +18,13 @@
"credo": {:hex, :credo, "0.9.3", "76fa3e9e497ab282e0cf64b98a624aa11da702854c52c82db1bf24e54ab7c97a", [:mix], [{:bunt, "~> 0.2.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:poison, ">= 0.0.0", [hex: :poison, repo: "hexpm", optional: false]}], "hexpm"}, "credo": {:hex, :credo, "0.9.3", "76fa3e9e497ab282e0cf64b98a624aa11da702854c52c82db1bf24e54ab7c97a", [:mix], [{:bunt, "~> 0.2.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:poison, ">= 0.0.0", [hex: :poison, repo: "hexpm", optional: false]}], "hexpm"},
"crontab": {:hex, :crontab, "1.1.7", "b9219f0bdc8678b94143655a8f229716c5810c0636a4489f98c0956137e53985", [:mix], [{:ecto, "~> 1.0 or ~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm"}, "crontab": {:hex, :crontab, "1.1.7", "b9219f0bdc8678b94143655a8f229716c5810c0636a4489f98c0956137e53985", [:mix], [{:ecto, "~> 1.0 or ~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm"},
"crypt": {:git, "https://github.com/msantos/crypt", "1f2b58927ab57e72910191a7ebaeff984382a1d3", [ref: "1f2b58927ab57e72910191a7ebaeff984382a1d3"]}, "crypt": {:git, "https://github.com/msantos/crypt", "1f2b58927ab57e72910191a7ebaeff984382a1d3", [ref: "1f2b58927ab57e72910191a7ebaeff984382a1d3"]},
"custom_base": {:hex, :custom_base, "0.2.1", "4a832a42ea0552299d81652aa0b1f775d462175293e99dfbe4d7dbaab785a706", [:mix], [], "hexpm"},
"db_connection": {:hex, :db_connection, "2.1.1", "a51e8a2ee54ef2ae6ec41a668c85787ed40cb8944928c191280fe34c15b76ae5", [:mix], [{:connection, "~> 1.0.2", [hex: :connection, repo: "hexpm", optional: false]}], "hexpm"}, "db_connection": {:hex, :db_connection, "2.1.1", "a51e8a2ee54ef2ae6ec41a668c85787ed40cb8944928c191280fe34c15b76ae5", [:mix], [{:connection, "~> 1.0.2", [hex: :connection, repo: "hexpm", optional: false]}], "hexpm"},
"decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"}, "decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"},
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm"}, "deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm"},
"earmark": {:hex, :earmark, "1.3.6", "ce1d0675e10a5bb46b007549362bd3f5f08908843957687d8484fe7f37466b19", [:mix], [], "hexpm"}, "earmark": {:hex, :earmark, "1.3.6", "ce1d0675e10a5bb46b007549362bd3f5f08908843957687d8484fe7f37466b19", [:mix], [], "hexpm"},
"ecto": {:hex, :ecto, "3.1.4", "69d852da7a9f04ede725855a35ede48d158ca11a404fe94f8b2fb3b2162cd3c9", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"}, "ecto": {:hex, :ecto, "3.2.0", "940e2598813f205223d60c78d66e514afe1db5167ed8075510a59e496619cfb5", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
"ecto_sql": {:hex, :ecto_sql, "3.1.3", "2c536139190492d9de33c5fefac7323c5eaaa82e1b9bf93482a14649042f7cd9", [:mix], [{:db_connection, "~> 2.0", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.1.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:mariaex, "~> 0.9.1", [hex: :mariaex, repo: "hexpm", optional: true]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.14.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"}, "ecto_sql": {:hex, :ecto_sql, "3.2.0", "751cea597e8deb616084894dd75cbabfdbe7255ff01e8c058ca13f0353a3921b", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.2.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.15.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
"esshd": {:hex, :esshd, "0.1.0", "6f93a2062adb43637edad0ea7357db2702a4b80dd9683482fe00f5134e97f4c1", [:mix], [], "hexpm"}, "esshd": {:hex, :esshd, "0.1.0", "6f93a2062adb43637edad0ea7357db2702a4b80dd9683482fe00f5134e97f4c1", [:mix], [], "hexpm"},
"eternal": {:hex, :eternal, "1.2.0", "e2a6b6ce3b8c248f7dc31451aefca57e3bdf0e48d73ae5043229380a67614c41", [:mix], [], "hexpm"}, "eternal": {:hex, :eternal, "1.2.0", "e2a6b6ce3b8c248f7dc31451aefca57e3bdf0e48d73ae5043229380a67614c41", [:mix], [], "hexpm"},
"ex2ms": {:hex, :ex2ms, "1.5.0", "19e27f9212be9a96093fed8cdfbef0a2b56c21237196d26760f11dfcfae58e97", [:mix], [], "hexpm"}, "ex2ms": {:hex, :ex2ms, "1.5.0", "19e27f9212be9a96093fed8cdfbef0a2b56c21237196d26760f11dfcfae58e97", [:mix], [], "hexpm"},
@ -34,12 +36,13 @@
"ex_rated": {:hex, :ex_rated, "1.3.3", "30ecbdabe91f7eaa9d37fa4e81c85ba420f371babeb9d1910adbcd79ec798d27", [:mix], [{:ex2ms, "~> 1.5", [hex: :ex2ms, repo: "hexpm", optional: false]}], "hexpm"}, "ex_rated": {:hex, :ex_rated, "1.3.3", "30ecbdabe91f7eaa9d37fa4e81c85ba420f371babeb9d1910adbcd79ec798d27", [:mix], [{:ex2ms, "~> 1.5", [hex: :ex2ms, repo: "hexpm", optional: false]}], "hexpm"},
"ex_syslogger": {:git, "https://github.com/slashmili/ex_syslogger.git", "f3963399047af17e038897c69e20d552e6899e1d", [tag: "1.4.0"]}, "ex_syslogger": {:git, "https://github.com/slashmili/ex_syslogger.git", "f3963399047af17e038897c69e20d552e6899e1d", [tag: "1.4.0"]},
"excoveralls": {:hex, :excoveralls, "0.11.1", "dd677fbdd49114fdbdbf445540ec735808250d56b011077798316505064edb2c", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm"}, "excoveralls": {:hex, :excoveralls, "0.11.1", "dd677fbdd49114fdbdbf445540ec735808250d56b011077798316505064edb2c", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm"},
"flake_id": {:hex, :flake_id, "0.1.0", "7716b086d2e405d09b647121a166498a0d93d1a623bead243e1f74216079ccb3", [:mix], [{:base62, "~> 1.2", [hex: :base62, repo: "hexpm", optional: false]}, {:ecto, ">= 2.0.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm"},
"floki": {:hex, :floki, "0.23.0", "956ab6dba828c96e732454809fb0bd8d43ce0979b75f34de6322e73d4c917829", [:mix], [{:html_entities, "~> 0.4.0", [hex: :html_entities, repo: "hexpm", optional: false]}], "hexpm"}, "floki": {:hex, :floki, "0.23.0", "956ab6dba828c96e732454809fb0bd8d43ce0979b75f34de6322e73d4c917829", [:mix], [{:html_entities, "~> 0.4.0", [hex: :html_entities, repo: "hexpm", optional: false]}], "hexpm"},
"gen_smtp": {:hex, :gen_smtp, "0.14.0", "39846a03522456077c6429b4badfd1d55e5e7d0fdfb65e935b7c5e38549d9202", [:rebar3], [], "hexpm"}, "gen_smtp": {:hex, :gen_smtp, "0.14.0", "39846a03522456077c6429b4badfd1d55e5e7d0fdfb65e935b7c5e38549d9202", [:rebar3], [], "hexpm"},
"gen_stage": {:hex, :gen_stage, "0.14.2", "6a2a578a510c5bfca8a45e6b27552f613b41cf584b58210f017088d3d17d0b14", [:mix], [], "hexpm"}, "gen_stage": {:hex, :gen_stage, "0.14.2", "6a2a578a510c5bfca8a45e6b27552f613b41cf584b58210f017088d3d17d0b14", [:mix], [], "hexpm"},
"gen_state_machine": {:hex, :gen_state_machine, "2.0.5", "9ac15ec6e66acac994cc442dcc2c6f9796cf380ec4b08267223014be1c728a95", [:mix], [], "hexpm"}, "gen_state_machine": {:hex, :gen_state_machine, "2.0.5", "9ac15ec6e66acac994cc442dcc2c6f9796cf380ec4b08267223014be1c728a95", [:mix], [], "hexpm"},
"gettext": {:hex, :gettext, "0.17.0", "abe21542c831887a2b16f4c94556db9c421ab301aee417b7c4fbde7fbdbe01ec", [:mix], [], "hexpm"}, "gettext": {:hex, :gettext, "0.17.0", "abe21542c831887a2b16f4c94556db9c421ab301aee417b7c4fbde7fbdbe01ec", [:mix], [], "hexpm"},
"hackney": {:hex, :hackney, "1.15.1", "9f8f471c844b8ce395f7b6d8398139e26ddca9ebc171a8b91342ee15a19963f4", [:rebar3], [{:certifi, "2.5.1", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "6.0.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "1.0.1", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "1.1.4", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm"}, "hackney": {:hex, :hackney, "1.15.2", "07e33c794f8f8964ee86cebec1a8ed88db5070e52e904b8f12209773c1036085", [:rebar3], [{:certifi, "2.5.1", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "6.0.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "1.0.1", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "1.1.5", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm"},
"html_entities": {:hex, :html_entities, "0.4.0", "f2fee876858cf6aaa9db608820a3209e45a087c5177332799592142b50e89a6b", [:mix], [], "hexpm"}, "html_entities": {:hex, :html_entities, "0.4.0", "f2fee876858cf6aaa9db608820a3209e45a087c5177332799592142b50e89a6b", [:mix], [], "hexpm"},
"html_sanitize_ex": {:hex, :html_sanitize_ex, "1.3.0", "f005ad692b717691203f940c686208aa3d8ffd9dd4bb3699240096a51fa9564e", [:mix], [{:mochiweb, "~> 2.15", [hex: :mochiweb, repo: "hexpm", optional: false]}], "hexpm"}, "html_sanitize_ex": {:hex, :html_sanitize_ex, "1.3.0", "f005ad692b717691203f940c686208aa3d8ffd9dd4bb3699240096a51fa9564e", [:mix], [{:mochiweb, "~> 2.15", [hex: :mochiweb, repo: "hexpm", optional: false]}], "hexpm"},
"http_signatures": {:git, "https://git.pleroma.social/pleroma/http_signatures.git", "293d77bb6f4a67ac8bde1428735c3b42f22cbb30", [ref: "293d77bb6f4a67ac8bde1428735c3b42f22cbb30"]}, "http_signatures": {:git, "https://git.pleroma.social/pleroma/http_signatures.git", "293d77bb6f4a67ac8bde1428735c3b42f22cbb30", [ref: "293d77bb6f4a67ac8bde1428735c3b42f22cbb30"]},
@ -60,7 +63,7 @@
"mogrify": {:hex, :mogrify, "0.6.1", "de1b527514f2d95a7bbe9642eb556061afb337e220cf97adbf3a4e6438ed70af", [:mix], [], "hexpm"}, "mogrify": {:hex, :mogrify, "0.6.1", "de1b527514f2d95a7bbe9642eb556061afb337e220cf97adbf3a4e6438ed70af", [:mix], [], "hexpm"},
"mox": {:hex, :mox, "0.5.1", "f86bb36026aac1e6f924a4b6d024b05e9adbed5c63e8daa069bd66fb3292165b", [:mix], [], "hexpm"}, "mox": {:hex, :mox, "0.5.1", "f86bb36026aac1e6f924a4b6d024b05e9adbed5c63e8daa069bd66fb3292165b", [:mix], [], "hexpm"},
"nimble_parsec": {:hex, :nimble_parsec, "0.5.1", "c90796ecee0289dbb5ad16d3ad06f957b0cd1199769641c961cfe0b97db190e0", [:mix], [], "hexpm"}, "nimble_parsec": {:hex, :nimble_parsec, "0.5.1", "c90796ecee0289dbb5ad16d3ad06f957b0cd1199769641c961cfe0b97db190e0", [:mix], [], "hexpm"},
"oban": {:hex, :oban, "0.7.1", "171bdd1b69c1a4a839f8c768f5e962fc22d1de1513d459fb6b8e0cbd34817a9a", [:mix], [{:ecto_sql, "~> 3.1", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.14", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"}, "oban": {:hex, :oban, "0.8.1", "4bbf62eb1829f856d69aeb5069ac7036afe07db8221a17de2a9169cc7a58a318", [:mix], [{:ecto_sql, "~> 3.1", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.14", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
"parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm"}, "parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm"},
"pbkdf2_elixir": {:hex, :pbkdf2_elixir, "0.12.3", "6706a148809a29c306062862c803406e88f048277f6e85b68faf73291e820b84", [:mix], [], "hexpm"}, "pbkdf2_elixir": {:hex, :pbkdf2_elixir, "0.12.3", "6706a148809a29c306062862c803406e88f048277f6e85b68faf73291e820b84", [:mix], [], "hexpm"},
"phoenix": {:hex, :phoenix, "1.4.9", "746d098e10741c334d88143d3c94cab1756435f94387a63441792e66ec0ee974", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"}, "phoenix": {:hex, :phoenix, "1.4.9", "746d098e10741c334d88143d3c94cab1756435f94387a63441792e66ec0ee974", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
@ -74,7 +77,7 @@
"plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"}, "plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
"poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], [], "hexpm"}, "poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], [], "hexpm"},
"poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm"}, "poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm"},
"postgrex": {:hex, :postgrex, "0.14.3", "5754dee2fdf6e9e508cbf49ab138df964278700b764177e8f3871e658b345a1e", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.0", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"}, "postgrex": {:hex, :postgrex, "0.15.1", "23ce3417de70f4c0e9e7419ad85bdabcc6860a6925fe2c6f3b1b5b1e8e47bf2f", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
"prometheus": {:hex, :prometheus, "4.4.1", "1e96073b3ed7788053768fea779cbc896ddc3bdd9ba60687f2ad50b252ac87d6", [:mix, :rebar3], [], "hexpm"}, "prometheus": {:hex, :prometheus, "4.4.1", "1e96073b3ed7788053768fea779cbc896ddc3bdd9ba60687f2ad50b252ac87d6", [:mix, :rebar3], [], "hexpm"},
"prometheus_ecto": {:hex, :prometheus_ecto, "1.4.1", "6c768ea9654de871e5b32fab2eac348467b3021604ebebbcbd8bcbe806a65ed5", [:mix], [{:ecto, "~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm"}, "prometheus_ecto": {:hex, :prometheus_ecto, "1.4.1", "6c768ea9654de871e5b32fab2eac348467b3021604ebebbcbd8bcbe806a65ed5", [:mix], [{:ecto, "~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm"},
"prometheus_ex": {:hex, :prometheus_ex, "3.0.5", "fa58cfd983487fc5ead331e9a3e0aa622c67232b3ec71710ced122c4c453a02f", [:mix], [{:prometheus, "~> 4.0", [hex: :prometheus, repo: "hexpm", optional: false]}], "hexpm"}, "prometheus_ex": {:hex, :prometheus_ex, "3.0.5", "fa58cfd983487fc5ead331e9a3e0aa622c67232b3ec71710ced122c4c453a02f", [:mix], [{:prometheus, "~> 4.0", [hex: :prometheus, repo: "hexpm", optional: false]}], "hexpm"},
@ -84,13 +87,13 @@
"quantum": {:hex, :quantum, "2.3.4", "72a0e8855e2adc101459eac8454787cb74ab4169de6ca50f670e72142d4960e9", [:mix], [{:calendar, "~> 0.17", [hex: :calendar, repo: "hexpm", optional: true]}, {:crontab, "~> 1.1", [hex: :crontab, repo: "hexpm", optional: false]}, {:gen_stage, "~> 0.12", [hex: :gen_stage, repo: "hexpm", optional: false]}, {:swarm, "~> 3.3", [hex: :swarm, repo: "hexpm", optional: false]}, {:timex, "~> 3.1", [hex: :timex, repo: "hexpm", optional: true]}], "hexpm"}, "quantum": {:hex, :quantum, "2.3.4", "72a0e8855e2adc101459eac8454787cb74ab4169de6ca50f670e72142d4960e9", [:mix], [{:calendar, "~> 0.17", [hex: :calendar, repo: "hexpm", optional: true]}, {:crontab, "~> 1.1", [hex: :crontab, repo: "hexpm", optional: false]}, {:gen_stage, "~> 0.12", [hex: :gen_stage, repo: "hexpm", optional: false]}, {:swarm, "~> 3.3", [hex: :swarm, repo: "hexpm", optional: false]}, {:timex, "~> 3.1", [hex: :timex, repo: "hexpm", optional: true]}], "hexpm"},
"ranch": {:hex, :ranch, "1.7.1", "6b1fab51b49196860b733a49c07604465a47bdb78aa10c1c16a3d199f7f8c881", [:rebar3], [], "hexpm"}, "ranch": {:hex, :ranch, "1.7.1", "6b1fab51b49196860b733a49c07604465a47bdb78aa10c1c16a3d199f7f8c881", [:rebar3], [], "hexpm"},
"recon": {:git, "https://github.com/ferd/recon.git", "75d70c7c08926d2f24f1ee6de14ee50fe8a52763", [tag: "2.4.0"]}, "recon": {:git, "https://github.com/ferd/recon.git", "75d70c7c08926d2f24f1ee6de14ee50fe8a52763", [tag: "2.4.0"]},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.4", "f0eafff810d2041e93f915ef59899c923f4568f4585904d010387ed74988e77b", [:make, :mix, :rebar3], [], "hexpm"}, "ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.5", "6eaf7ad16cb568bb01753dbbd7a95ff8b91c7979482b95f38443fe2c8852a79b", [:make, :mix, :rebar3], [], "hexpm"},
"swarm": {:hex, :swarm, "3.4.0", "64f8b30055d74640d2186c66354b33b999438692a91be275bb89cdc7e401f448", [:mix], [{:gen_state_machine, "~> 2.0", [hex: :gen_state_machine, repo: "hexpm", optional: false]}, {:libring, "~> 1.0", [hex: :libring, repo: "hexpm", optional: false]}], "hexpm"}, "swarm": {:hex, :swarm, "3.4.0", "64f8b30055d74640d2186c66354b33b999438692a91be275bb89cdc7e401f448", [:mix], [{:gen_state_machine, "~> 2.0", [hex: :gen_state_machine, repo: "hexpm", optional: false]}, {:libring, "~> 1.0", [hex: :libring, repo: "hexpm", optional: false]}], "hexpm"},
"sweet_xml": {:hex, :sweet_xml, "0.6.6", "fc3e91ec5dd7c787b6195757fbcf0abc670cee1e4172687b45183032221b66b8", [:mix], [], "hexpm"}, "sweet_xml": {:hex, :sweet_xml, "0.6.6", "fc3e91ec5dd7c787b6195757fbcf0abc670cee1e4172687b45183032221b66b8", [:mix], [], "hexpm"},
"swoosh": {:hex, :swoosh, "0.23.2", "7dda95ff0bf54a2298328d6899c74dae1223777b43563ccebebb4b5d2b61df38", [:mix], [{:cowboy, "~> 1.0.1 or ~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}], "hexpm"}, "swoosh": {:hex, :swoosh, "0.23.2", "7dda95ff0bf54a2298328d6899c74dae1223777b43563ccebebb4b5d2b61df38", [:mix], [{:cowboy, "~> 1.0.1 or ~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}], "hexpm"},
"syslog": {:git, "https://github.com/Vagabond/erlang-syslog.git", "4a6c6f2c996483e86c1320e9553f91d337bcb6aa", [tag: "1.0.5"]}, "syslog": {:git, "https://github.com/Vagabond/erlang-syslog.git", "4a6c6f2c996483e86c1320e9553f91d337bcb6aa", [tag: "1.0.5"]},
"telemetry": {:hex, :telemetry, "0.4.0", "8339bee3fa8b91cb84d14c2935f8ecf399ccd87301ad6da6b71c09553834b2ab", [:rebar3], [], "hexpm"}, "telemetry": {:hex, :telemetry, "0.4.0", "8339bee3fa8b91cb84d14c2935f8ecf399ccd87301ad6da6b71c09553834b2ab", [:rebar3], [], "hexpm"},
"tesla": {:hex, :tesla, "1.2.1", "864783cc27f71dd8c8969163704752476cec0f3a51eb3b06393b3971dc9733ff", [:mix], [{:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "~> 4.4.0", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}], "hexpm"}, "tesla": {:hex, :tesla, "1.3.0", "f35d72f029e608f9cdc6f6d6fcc7c66cf6d6512a70cfef9206b21b8bd0203a30", [:mix], [{:castore, "~> 0.1", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, "~> 1.3", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "~> 4.4.0", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 0.4", [hex: :mint, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.3", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm"},
"timex": {:hex, :timex, "3.6.1", "efdf56d0e67a6b956cc57774353b0329c8ab7726766a11547e529357ffdc1d56", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5 or ~> 1.0.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"}, "timex": {:hex, :timex, "3.6.1", "efdf56d0e67a6b956cc57774353b0329c8ab7726766a11547e529357ffdc1d56", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5 or ~> 1.0.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"}, "trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
"tzdata": {:hex, :tzdata, "0.5.21", "8cbf3607fcce69636c672d5be2bbb08687fe26639a62bdcc283d267277db7cf0", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm"}, "tzdata": {:hex, :tzdata, "0.5.21", "8cbf3607fcce69636c672d5be2bbb08687fe26639a62bdcc283d267277db7cf0", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm"},

View file

@ -0,0 +1,11 @@
defmodule Pleroma.Repo.Migrations.UpdateOban do
use Ecto.Migration
def up do
Oban.Migrations.up(version: 4)
end
def down do
Oban.Migrations.down(version: 2)
end
end

File diff suppressed because one or more lines are too long

View file

@ -0,0 +1 @@
@supports (-webkit-mask:none) and (not (cater-color:#fff)){.login-container .el-input input{color:#fff}.login-container .el-input input:first-line{color:#eee}}.login-container .el-input{display:inline-block;height:47px;width:85%}.login-container .el-input input{background:transparent;border:0;-webkit-appearance:none;border-radius:0;padding:12px 5px 12px 15px;color:#eee;height:47px;caret-color:#fff}.login-container .el-input input:-webkit-autofill{-webkit-box-shadow:0 0 0 1000px #283443 inset!important;-webkit-text-fill-color:#fff!important}.login-container .el-form-item{border:1px solid hsla(0,0%,100%,.1);background:rgba(0,0,0,.1);border-radius:5px;color:#454545}.login-container .login-button{width:100%;margin:0 0 10px}.login-container .omit-host-note{color:#596f8c;font-size:.8em;font-style:italic;margin:-20px 0 15px;padding:3px 0 0 15px}.login-container[data-v-d027d802]{min-height:100%;width:100%;background-color:#2d3a4b;overflow:hidden}.login-container .login-form[data-v-d027d802]{position:relative;width:520px;max-width:100%;padding:160px 35px 0;margin:0 auto;overflow:hidden}.login-container .tips[data-v-d027d802]{font-size:14px;color:#fff;margin-bottom:10px}.login-container .tips span[data-v-d027d802]:first-of-type{margin-right:16px}.login-container .svg-container[data-v-d027d802]{padding:6px 5px 6px 15px;color:#889aa4;vertical-align:middle;width:30px;display:inline-block}.login-container .title-container[data-v-d027d802]{position:relative}.login-container .title-container .title[data-v-d027d802]{font-size:26px;color:#eee;margin:0 auto 40px;text-align:center;font-weight:700}.login-container .title-container .set-language[data-v-d027d802]{color:#fff;position:absolute;top:3px;font-size:18px;right:0;cursor:pointer}.login-container .show-pwd[data-v-d027d802]{position:absolute;right:10px;top:7px;font-size:16px;color:#889aa4;cursor:pointer;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.login-container .thirdparty-button[data-v-d027d802]{position:absolute;right:0;bottom:6px}

View file

@ -0,0 +1 @@
.wscn-http404-container[data-v-1d6b2d2a]{-webkit-transform:translate(-50%,-50%);transform:translate(-50%,-50%);position:absolute;top:40%;left:50%}.wscn-http404[data-v-1d6b2d2a]{position:relative;width:1200px;padding:0 50px;overflow:hidden}.wscn-http404 .pic-404[data-v-1d6b2d2a]{position:relative;float:left;width:600px;overflow:hidden}.wscn-http404 .pic-404__parent[data-v-1d6b2d2a]{width:100%}.wscn-http404 .pic-404__child[data-v-1d6b2d2a]{position:absolute}.wscn-http404 .pic-404__child.left[data-v-1d6b2d2a]{width:80px;top:17px;left:220px;opacity:0;-webkit-animation-name:cloudLeft-data-v-1d6b2d2a;animation-name:cloudLeft-data-v-1d6b2d2a;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-delay:1s;animation-delay:1s}.wscn-http404 .pic-404__child.mid[data-v-1d6b2d2a]{width:46px;top:10px;left:420px;opacity:0;-webkit-animation-name:cloudMid-data-v-1d6b2d2a;animation-name:cloudMid-data-v-1d6b2d2a;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-delay:1.2s;animation-delay:1.2s}.wscn-http404 .pic-404__child.right[data-v-1d6b2d2a]{width:62px;top:100px;left:500px;opacity:0;-webkit-animation-name:cloudRight-data-v-1d6b2d2a;animation-name:cloudRight-data-v-1d6b2d2a;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-delay:1s;animation-delay:1s}@-webkit-keyframes cloudLeft-data-v-1d6b2d2a{0%{top:17px;left:220px;opacity:0}20%{top:33px;left:188px;opacity:1}80%{top:81px;left:92px;opacity:1}to{top:97px;left:60px;opacity:0}}@keyframes cloudLeft-data-v-1d6b2d2a{0%{top:17px;left:220px;opacity:0}20%{top:33px;left:188px;opacity:1}80%{top:81px;left:92px;opacity:1}to{top:97px;left:60px;opacity:0}}@-webkit-keyframes cloudMid-data-v-1d6b2d2a{0%{top:10px;left:420px;opacity:0}20%{top:40px;left:360px;opacity:1}70%{top:130px;left:180px;opacity:1}to{top:160px;left:120px;opacity:0}}@keyframes cloudMid-data-v-1d6b2d2a{0%{top:10px;left:420px;opacity:0}20%{top:40px;left:360px;opacity:1}70%{top:130px;left:180px;opacity:1}to{top:160px;left:120px;opacity:0}}@-webkit-keyframes cloudRight-data-v-1d6b2d2a{0%{top:100px;left:500px;opacity:0}20%{top:120px;left:460px;opacity:1}80%{top:180px;left:340px;opacity:1}to{top:200px;left:300px;opacity:0}}@keyframes cloudRight-data-v-1d6b2d2a{0%{top:100px;left:500px;opacity:0}20%{top:120px;left:460px;opacity:1}80%{top:180px;left:340px;opacity:1}to{top:200px;left:300px;opacity:0}}.wscn-http404 .bullshit[data-v-1d6b2d2a]{position:relative;float:left;width:300px;padding:30px 0;overflow:hidden}.wscn-http404 .bullshit__oops[data-v-1d6b2d2a]{font-size:32px;line-height:40px;color:#1482f0;margin-bottom:20px;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}.wscn-http404 .bullshit__headline[data-v-1d6b2d2a],.wscn-http404 .bullshit__oops[data-v-1d6b2d2a]{font-weight:700;opacity:0;-webkit-animation-name:slideUp-data-v-1d6b2d2a;animation-name:slideUp-data-v-1d6b2d2a;-webkit-animation-duration:.5s;animation-duration:.5s}.wscn-http404 
.bullshit__headline[data-v-1d6b2d2a]{font-size:20px;line-height:24px;color:#222;margin-bottom:10px;-webkit-animation-delay:.1s;animation-delay:.1s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}.wscn-http404 .bullshit__info[data-v-1d6b2d2a]{font-size:13px;line-height:21px;color:grey;margin-bottom:30px;-webkit-animation-delay:.2s;animation-delay:.2s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}.wscn-http404 .bullshit__info[data-v-1d6b2d2a],.wscn-http404 .bullshit__return-home[data-v-1d6b2d2a]{opacity:0;-webkit-animation-name:slideUp-data-v-1d6b2d2a;animation-name:slideUp-data-v-1d6b2d2a;-webkit-animation-duration:.5s;animation-duration:.5s}.wscn-http404 .bullshit__return-home[data-v-1d6b2d2a]{display:block;float:left;width:165px;height:36px;background:#1482f0;border-radius:100px;text-align:center;color:#fff;font-size:14px;line-height:36px;cursor:pointer;-webkit-animation-delay:.3s;animation-delay:.3s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}@-webkit-keyframes slideUp-data-v-1d6b2d2a{0%{-webkit-transform:translateY(60px);transform:translateY(60px);opacity:0}to{-webkit-transform:translateY(0);transform:translateY(0);opacity:1}}@keyframes slideUp-data-v-1d6b2d2a{0%{-webkit-transform:translateY(60px);transform:translateY(60px);opacity:0}to{-webkit-transform:translateY(0);transform:translateY(0);opacity:1}}

View file

@@ -0,0 +1 @@
.prop-row{margin-bottom:1em}.emoji-preview-img{max-width:5em}.copy-to-local-button{margin-top:2em;float:right}.new-emoji-col{margin-top:8em}.or,.shared-pack-dl-box{margin:1em}.dl-as-input{margin:1em;max-width:30%}.contents-collapse{margin:1em}.pack-actions{margin-top:1em}.new-emoji-uploader{margin-bottom:3em}.emoji-packs-container{margin:22px 0 0 15px}.local-packs-actions{margin-top:1em;margin-bottom:1em}.remote-instance-input{max-width:10%}.create-pack-button{margin-top:1em}

View file

@@ -1 +0,0 @@
.wscn-http404-container[data-v-b8c8aa9a]{-webkit-transform:translate(-50%,-50%);transform:translate(-50%,-50%);position:absolute;top:40%;left:50%}.wscn-http404[data-v-b8c8aa9a]{position:relative;width:1200px;padding:0 50px;overflow:hidden}.wscn-http404 .pic-404[data-v-b8c8aa9a]{position:relative;float:left;width:600px;overflow:hidden}.wscn-http404 .pic-404__parent[data-v-b8c8aa9a]{width:100%}.wscn-http404 .pic-404__child[data-v-b8c8aa9a]{position:absolute}.wscn-http404 .pic-404__child.left[data-v-b8c8aa9a]{width:80px;top:17px;left:220px;opacity:0;-webkit-animation-name:cloudLeft-data-v-b8c8aa9a;animation-name:cloudLeft-data-v-b8c8aa9a;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-delay:1s;animation-delay:1s}.wscn-http404 .pic-404__child.mid[data-v-b8c8aa9a]{width:46px;top:10px;left:420px;opacity:0;-webkit-animation-name:cloudMid-data-v-b8c8aa9a;animation-name:cloudMid-data-v-b8c8aa9a;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-delay:1.2s;animation-delay:1.2s}.wscn-http404 .pic-404__child.right[data-v-b8c8aa9a]{width:62px;top:100px;left:500px;opacity:0;-webkit-animation-name:cloudRight-data-v-b8c8aa9a;animation-name:cloudRight-data-v-b8c8aa9a;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-delay:1s;animation-delay:1s}@-webkit-keyframes cloudLeft-data-v-b8c8aa9a{0%{top:17px;left:220px;opacity:0}20%{top:33px;left:188px;opacity:1}80%{top:81px;left:92px;opacity:1}to{top:97px;left:60px;opacity:0}}@keyframes cloudLeft-data-v-b8c8aa9a{0%{top:17px;left:220px;opacity:0}20%{top:33px;left:188px;opacity:1}80%{top:81px;left:92px;opacity:1}to{top:97px;left:60px;opacity:0}}@-webkit-keyframes cloudMid-data-v-b8c8aa9a{0%{top:10px;left:420px;opacity:0}20%{top:40px;left:360px;opacity:1}70%{top:130px;left:180px;opacity:1}to{top:160px;left:120px;opacity:0}}@keyframes cloudMid-data-v-b8c8aa9a{0%{top:10px;left:420px;opacity:0}20%{top:40px;left:360px;opacity:1}70%{top:130px;left:180px;opacity:1}to{top:160px;left:120px;opacity:0}}@-webkit-keyframes cloudRight-data-v-b8c8aa9a{0%{top:100px;left:500px;opacity:0}20%{top:120px;left:460px;opacity:1}80%{top:180px;left:340px;opacity:1}to{top:200px;left:300px;opacity:0}}@keyframes cloudRight-data-v-b8c8aa9a{0%{top:100px;left:500px;opacity:0}20%{top:120px;left:460px;opacity:1}80%{top:180px;left:340px;opacity:1}to{top:200px;left:300px;opacity:0}}.wscn-http404 .bullshit[data-v-b8c8aa9a]{position:relative;float:left;width:300px;padding:30px 0;overflow:hidden}.wscn-http404 .bullshit__oops[data-v-b8c8aa9a]{font-size:32px;line-height:40px;color:#1482f0;margin-bottom:20px;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}.wscn-http404 .bullshit__headline[data-v-b8c8aa9a],.wscn-http404 .bullshit__oops[data-v-b8c8aa9a]{font-weight:700;opacity:0;-webkit-animation-name:slideUp-data-v-b8c8aa9a;animation-name:slideUp-data-v-b8c8aa9a;-webkit-animation-duration:.5s;animation-duration:.5s}.wscn-http404 
.bullshit__headline[data-v-b8c8aa9a]{font-size:20px;line-height:24px;color:#222;margin-bottom:10px;-webkit-animation-delay:.1s;animation-delay:.1s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}.wscn-http404 .bullshit__info[data-v-b8c8aa9a]{font-size:13px;line-height:21px;color:grey;margin-bottom:30px;-webkit-animation-delay:.2s;animation-delay:.2s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}.wscn-http404 .bullshit__info[data-v-b8c8aa9a],.wscn-http404 .bullshit__return-home[data-v-b8c8aa9a]{opacity:0;-webkit-animation-name:slideUp-data-v-b8c8aa9a;animation-name:slideUp-data-v-b8c8aa9a;-webkit-animation-duration:.5s;animation-duration:.5s}.wscn-http404 .bullshit__return-home[data-v-b8c8aa9a]{display:block;float:left;width:110px;height:36px;background:#1482f0;border-radius:100px;text-align:center;color:#fff;font-size:14px;line-height:36px;cursor:pointer;-webkit-animation-delay:.3s;animation-delay:.3s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards}@-webkit-keyframes slideUp-data-v-b8c8aa9a{0%{-webkit-transform:translateY(60px);transform:translateY(60px);opacity:0}to{-webkit-transform:translateY(0);transform:translateY(0);opacity:1}}@keyframes slideUp-data-v-b8c8aa9a{0%{-webkit-transform:translateY(60px);transform:translateY(60px);opacity:0}to{-webkit-transform:translateY(0);transform:translateY(0);opacity:1}}

View file

@@ -0,0 +1 @@
.select-field[data-v-71bc6b38]{width:350px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.select-field[data-v-71bc6b38]{width:100%;margin-bottom:5px}}.actions-button[data-v-19afabea]{text-align:left;width:350px;padding:10px}.actions-button-container[data-v-19afabea]{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between}.el-dropdown[data-v-19afabea]{float:right}.el-icon-edit[data-v-19afabea]{margin-right:5px}.tag-container[data-v-19afabea]{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center}.tag-text[data-v-19afabea]{padding-right:20px}.no-hover[data-v-19afabea]:hover{color:#606266;background-color:#fff;cursor:auto}.el-dialog__body{padding:20px}.create-account-form-item{margin-bottom:20px}.create-account-form-item-without-margin{margin-bottom:0}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.create-user-dialog{width:85%}.create-account-form-item{margin-bottom:20px}.el-dialog__body{padding:20px}}.actions-button{text-align:left;width:350px;padding:10px}.actions-container{display:-webkit-box;display:-ms-flexbox;display:flex;height:36px;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:0 15px 10px}.active-tag{color:#409eff;font-weight:700}.active-tag .el-icon-check{color:#409eff;float:right;margin:7px 0 0 15px}.el-dropdown-link:hover{cursor:pointer;color:#409eff}.el-icon-plus{margin-right:5px}.password-reset-token{margin:0 0 14px}.password-reset-token-dialog{width:50%}.reset-password-link{text-decoration:underline}.users-container h1{margin:22px 0 0 15px}.users-container .pagination{margin:25px 0;text-align:center}.users-container .search{width:350px;float:right}.users-container .filter-container{display:-webkit-box;display:-ms-flexbox;display:flex;height:36px;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:22px 15px 15px}.users-container .user-count{color:grey;font-size:28px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.password-reset-token-dialog{width:85%}.users-container h1{margin:7px 10px 15px}.users-container .actions-container{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;margin:0 10px 7px}.users-container .create-account{width:100%}.users-container .el-icon-arrow-down{font-size:12px}.users-container .search{width:100%}.users-container .filter-container{display:-webkit-box;display:-ms-flexbox;display:flex;height:82px;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;margin:0 10px}.users-container .el-tag{width:30px;display:inline-block;margin-bottom:4px;font-weight:700}.users-container .el-tag.el-tag--danger,.users-container .el-tag.el-tag--success{padding-left:8px}}

View file

@@ -0,0 +1 @@
.invites-container .actions-container{display:-webkit-box;display:-ms-flexbox;display:flex;height:36px;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:20px 15px 15px}.invites-container .create-invite-token{text-align:left;width:350px;padding:10px}.invites-container .create-new-token-dialog{width:40%}.invites-container .el-dialog__body{padding:5px 20px 0}.invites-container h1{margin:22px 0 0 15px}.invites-container .icon{margin-right:5px}.invites-container .invite-token-table{width:100%;margin:0 15px}.invites-container .invite-via-email{text-align:left;width:350px;padding:10px}.invites-container .invite-via-email-dialog{width:50%}.invites-container .info{color:#666;font-size:13px;line-height:22px;margin:0 0 10px}@media (min-device-width:768px) and (max-device-width:1024px),only screen and (max-width:760px){.invites-container .actions-container{display:-webkit-box;display:-ms-flexbox;display:flex;height:82px;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column;-webkit-box-align:center;-ms-flex-align:center;align-items:center;margin:15px 10px 7px}.invites-container .create-invite-token{width:100%}.invites-container .create-new-token-dialog{width:85%}.invites-container .el-date-editor{width:150px}.invites-container .el-dialog__body{padding:5px 15px 0}.invites-container h1{margin:7px 10px 15px}.invites-container .invite-token-table{width:100%;margin:0}.invites-container .invite-via-email{width:100%;margin:10px 0 0}.invites-container .invite-via-email-dialog{width:85%}.invites-container .info{margin:0 0 10px 5px}.create-invite-token,.invite-via-email{width:100%}}

View file

@@ -1 +0,0 @@
@supports (-webkit-mask:none) and (not (cater-color:#fff)){.login-container .el-input input{color:#fff}.login-container .el-input input:first-line{color:#eee}}.login-container .el-input{display:inline-block;height:47px;width:85%}.login-container .el-input input{background:transparent;border:0;-webkit-appearance:none;border-radius:0;padding:12px 5px 12px 15px;color:#eee;height:47px;caret-color:#fff}.login-container .el-input input:-webkit-autofill{-webkit-box-shadow:0 0 0 1000px #283443 inset!important;-webkit-text-fill-color:#fff!important}.login-container .el-form-item{border:1px solid hsla(0,0%,100%,.1);background:rgba(0,0,0,.1);border-radius:5px;color:#454545}.login-container[data-v-57350b8e]{min-height:100%;width:100%;background-color:#2d3a4b;overflow:hidden}.login-container .login-form[data-v-57350b8e]{position:relative;width:520px;max-width:100%;padding:160px 35px 0;margin:0 auto;overflow:hidden}.login-container .tips[data-v-57350b8e]{font-size:14px;color:#fff;margin-bottom:10px}.login-container .tips span[data-v-57350b8e]:first-of-type{margin-right:16px}.login-container .svg-container[data-v-57350b8e]{padding:6px 5px 6px 15px;color:#889aa4;vertical-align:middle;width:30px;display:inline-block}.login-container .title-container[data-v-57350b8e]{position:relative}.login-container .title-container .title[data-v-57350b8e]{font-size:26px;color:#eee;margin:0 auto 40px;text-align:center;font-weight:700}.login-container .title-container .set-language[data-v-57350b8e]{color:#fff;position:absolute;top:3px;font-size:18px;right:0;cursor:pointer}.login-container .show-pwd[data-v-57350b8e]{position:absolute;right:10px;top:7px;font-size:16px;color:#889aa4;cursor:pointer;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.login-container .thirdparty-button[data-v-57350b8e]{position:absolute;right:0;bottom:6px}

Some files were not shown because too many files have changed in this diff.