Merge branch 'master' into remote-context-processing

Conflicts:
	mix.exs
	mix.lock
Marcel Otto 2020-06-16 22:22:09 +02:00
commit 277fabedb4
23 changed files with 273 additions and 148 deletions

.gitignore (2 changes)

@ -15,3 +15,5 @@ erl_crash.dump
# Also ignore archive artifacts (built via "mix archive.build").
*.ez
.tool-versions*

.iex.exs (new file, 29 lines)

@ -0,0 +1,29 @@
import RDF.Sigils
alias RDF.NS
alias RDF.NS.{RDFS, OWL, SKOS}
alias RDF.{
Term,
IRI,
Literal,
BlankNode,
Triple,
Quad,
Statement,
Description,
Graph,
Dataset,
XSD
}
alias RDF.BlankNode, as: BNode
alias RDF.{NTriples, NQuads, Turtle}
alias Decimal, as: D
alias JSON.LD


@ -1,14 +1,24 @@
language: elixir
matrix:
include:
- otp_release: 19.3
elixir: 1.5
- otp_release: 20.0
elixir: 1.5
- otp_release: 19.3
elixir: 1.6
- otp_release: 20.0
elixir: 1.6
# TODO: temporarily disabled since the newest version of Hackney (used by excoveralls) doesn't seem to work with OTP 20 ("Compiling src/hackney_ssl.erl failed")
# - otp_release: 20.0
# elixir: 1.8
- otp_release: 21.0
elixir: 1.8
- otp_release: 22.0
elixir: 1.8
# TODO: temporarily disabled since the newest version of Hackney (used by excoveralls) doesn't seem to work with OTP 20 ("Compiling src/hackney_ssl.erl failed")
# - otp_release: 20.0
# elixir: 1.9
- otp_release: 22.0
elixir: 1.9
- otp_release: 21.0
elixir: 1.10
- otp_release: 22.0
elixir: 1.10
- otp_release: 23.0
elixir: 1.10
sudo: false
script:
- MIX_ENV=test mix coveralls.travis


@ -5,6 +5,48 @@ This project adheres to [Semantic Versioning](http://semver.org/) and
[Keep a CHANGELOG](http://keepachangelog.com).
## 0.3.1 - 2020-06-01
This version just upgrades to RDF.ex 0.8. With that, Elixir versions < 1.8 are no longer supported.
[Compare v0.3.0...v0.3.1](https://github.com/rdf-elixir/jsonld-ex/compare/v0.3.0...v0.3.1)
## 0.3.0 - 2018-09-17
No significant changes; just some adaptations to work with RDF.ex 0.5.
Together with RDF.ex 0.5, Elixir versions < 1.6 are no longer supported.
[Compare v0.2.3...v0.3.0](https://github.com/rdf-elixir/jsonld-ex/compare/v0.2.3...v0.3.0)
## 0.2.3 - 2018-07-11
- Upgrade to Jason 1.1
- Pass options to `JSON.LD.Encoder.encode/2` and `JSON.LD.Encoder.encode!/2`
through to Jason; this allows using the new Jason pretty-printing options
[Compare v0.2.2...v0.2.3](https://github.com/rdf-elixir/jsonld-ex/compare/v0.2.2...v0.2.3)
## 0.2.2 - 2018-03-17
### Added
- JSON-LD encoder can handle `RDF.Graph`s and `RDF.Description`s
### Changed
- Use Jason instead of Poison for JSON encoding and decoding, since it's faster and more standards-conformant
[Compare v0.2.1...v0.2.2](https://github.com/rdf-elixir/jsonld-ex/compare/v0.2.1...v0.2.2)
## 0.2.1 - 2018-03-10
### Changed
@ -13,7 +55,7 @@ This project adheres to [Semantic Versioning](http://semver.org/) and
- Fixed all warnings ([@talklittle](https://github.com/talklittle))
[Compare v0.2.0...v0.2.1](https://github.com/marcelotto/jsonld-ex/compare/v0.2.0...v0.2.1)
[Compare v0.2.0...v0.2.1](https://github.com/rdf-elixir/jsonld-ex/compare/v0.2.0...v0.2.1)
@ -24,7 +66,7 @@ This project adheres to [Semantic Versioning](http://semver.org/) and
- Upgrade to RDF.ex 0.3.0
[Compare v0.1.1...v0.2.0](https://github.com/marcelotto/jsonld-ex/compare/v0.1.1...v0.2.0)
[Compare v0.1.1...v0.2.0](https://github.com/rdf-elixir/jsonld-ex/compare/v0.1.1...v0.2.0)
@ -35,7 +77,7 @@ This project adheres to [Semantic Versioning](http://semver.org/) and
- Don't support Elixir versions < 1.5, since `URI.merge` is broken in earlier versions
[Compare v0.1.0...v0.1.1](https://github.com/marcelotto/jsonld-ex/compare/v0.1.0...v0.1.1)
[Compare v0.1.0...v0.1.1](https://github.com/rdf-elixir/jsonld-ex/compare/v0.1.0...v0.1.1)


@ -1,4 +1,4 @@
1. Fork it ( <https://github.com/marcelotto/jsonld-ex/fork> )
1. Fork it ( <https://github.com/rdf-elixir/jsonld-ex/fork> )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Make your changes, with new passing tests. Follow this [style guide].
4. Execute all tests.


@ -1,6 +1,6 @@
# MIT License
Copyright (c) 2017 Marcel Otto
Copyright (c) 2017-2020 Marcel Otto
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the


@ -1,8 +1,10 @@
<img style="border:0px;" width="64" src="https://json-ld.org/images/json-ld-logo-64.png" alt="JSON-LD-logo-64" align="right"/>
# JSON-LD.ex
[![Travis](https://img.shields.io/travis/marcelotto/jsonld-ex.svg?style=flat-square)](https://travis-ci.org/marcelotto/jsonld-ex)
[![Travis](https://img.shields.io/travis/rdf-elixir/jsonld-ex.svg?style=flat-square)](https://travis-ci.org/rdf-elixir/jsonld-ex)
[![Hex.pm](https://img.shields.io/hexpm/v/json_ld.svg?style=flat-square)](https://hex.pm/packages/json_ld)
[![Coverage Status](https://coveralls.io/repos/github/marcelotto/jsonld-ex/badge.svg?branch=master)](https://coveralls.io/github/marcelotto/jsonld-ex?branch=master)
[![Coverage Status](https://coveralls.io/repos/github/rdf-elixir/jsonld-ex/badge.svg?branch=master)](https://coveralls.io/github/rdf-elixir/jsonld-ex?branch=master)
An implementation of the [JSON-LD] standard for Elixir and [RDF.ex].
@ -12,7 +14,7 @@ An implementation of the [JSON-LD] standard for Elixir and [RDF.ex].
- fully conforming JSON-LD API processor
- JSON-LD reader/writer for [RDF.ex]
- tests of the [JSON-LD test suite][] (see [here](https://github.com/marcelotto/jsonld-ex/wiki/JSON-LD.ex-implementation-report) for a detailed status report)
- tests of the [JSON-LD test suite][] (see [here](https://github.com/rdf-elixir/jsonld-ex/wiki/JSON-LD.ex-implementation-report) for a detailed status report)
## TODO
@ -28,7 +30,7 @@ The [JSON-LD.ex](https://hex.pm/packages/json_ld) Hex package can be installed a
```elixir
def deps do
[{:json_ld, "~> 0.2"}]
[{:json_ld, "~> 0.3"}]
end
```
@ -53,7 +55,7 @@ end
"homepage": "http://manu.sporny.org/"
}
"""
|> Poison.Parser.parse!
|> Jason.decode!
|> JSON.LD.expand
```
@ -67,7 +69,7 @@ produces
### Compact a document
```elixir
context = Poison.Parser.parse! """
context = Jason.decode! """
{
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",
@ -91,7 +93,7 @@ context = Poison.Parser.parse! """
}
]
"""
|> Poison.Parser.parse!
|> Jason.decode!
|> JSON.LD.compact(context)
```
@ -119,6 +121,16 @@ JSON.LD.write_file!(dataset, "file.jsonld")
```
## Pretty printing
All writer functions support pretty printing with the formatter options of [Jason](https://hexdocs.pm/jason/Jason.Formatter.html#pretty_print/2), the underlying JSON encoder, to which the given options are passed through.
```elixir
JSON.LD.write_file!(dataset, "file.jsonld", pretty: true)
JSON.LD.write_string(dataset, pretty: [indent: "\t"])
```
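For illustration, the pretty-printed output looks roughly like this (a made-up one-triple dataset, mirroring the encoder test added further down in this commit):

```elixir
import RDF.Sigils

dataset = RDF.Dataset.new({~I<http://a/b>, ~I<http://a/c>, ~I<http://a/d>})

JSON.LD.Encoder.encode!(dataset, pretty: true)
# [
#   {
#     "@id": "http://a/b",
#     "http://a/c": [
#       {
#         "@id": "http://a/d"
#       }
#     ]
#   }
# ]
```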
## Getting help
- [Documentation](http://hexdocs.pm/json_ld)
@ -132,7 +144,7 @@ see [CONTRIBUTING](CONTRIBUTING.md) for details.
## License and Copyright
(c) 2017 Marcel Otto. MIT Licensed, see [LICENSE](LICENSE.md) for details.
(c) 2017-2020 Marcel Otto. MIT Licensed, see [LICENSE](LICENSE.md) for details.
[RDF.ex]: https://hex.pm/packages/rdf


@ -1 +1 @@
0.2.1
0.3.1


@ -345,7 +345,7 @@ defmodule JSON.LD.Context do
# 16.1)
defp do_create_container_definition(_, %{"@container" => container})
when not container in ~w[@list @set @index @language],
when container not in ~w[@list @set @index @language],
do: raise JSON.LD.InvalidContainerMappingError,
message: "#{inspect container} is not a valid container mapping; @container must be either @list, @set, @index, or @language"
# 16.2)
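As a rough illustration of the guard above (hypothetical term and IRI; this processor implements JSON-LD 1.0, where only the four listed container values are allowed):

```elixir
# Any other @container value is rejected during context processing.
JSON.LD.context(%{
  "term" => %{"@id" => "http://example.com/term", "@container" => "@graph"}
})
# ** (JSON.LD.InvalidContainerMappingError) "@graph" is not a valid container mapping; ...
```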


@ -6,10 +6,10 @@ defmodule JSON.LD.Decoder do
import JSON.LD.{NodeIdentifierMap, Utils}
alias JSON.LD.NodeIdentifierMap
alias RDF.{Dataset, Graph}
alias RDF.NS.{XSD}
alias RDF.{Dataset, Graph, NS}
@impl RDF.Serialization.Decoder
def decode(content, opts \\ []) do
with {:ok, json_ld_object} <- parse_json(content),
dataset = to_rdf(json_ld_object, opts) do
@ -86,14 +86,12 @@ defmodule JSON.LD.Decoder do
end
end
# TODO: This should not be dependent on Poison as a JSON parser in general,
# but determine available JSON parsers and use one heuristically or by configuration
def parse_json(content, _opts \\ []) do
Poison.Parser.parse(content)
Jason.decode(content)
end
def parse_json!(content, _opts \\ []) do
Poison.Parser.parse!(content)
Jason.decode!(content)
end
def node_to_rdf(nil), do: nil
@ -118,30 +116,34 @@ defmodule JSON.LD.Decoder do
{value, datatype} =
cond do
is_boolean(value) ->
value = value |> RDF.Boolean.new |> RDF.Literal.canonical |> RDF.Literal.lexical
datatype = if is_nil(datatype), do: XSD.boolean, else: datatype
value = value |> RDF.XSD.Boolean.new() |> RDF.XSD.Boolean.canonical() |> RDF.XSD.Boolean.lexical()
datatype = if is_nil(datatype), do: NS.XSD.boolean, else: datatype
{value, datatype}
is_float(value) or (is_number(value) and datatype == to_string(XSD.double)) ->
value = value |> RDF.Double.new |> RDF.Literal.canonical |> RDF.Literal.lexical
datatype = if is_nil(datatype), do: XSD.double, else: datatype
is_float(value) or (is_number(value) and datatype == to_string(NS.XSD.double)) ->
value = value |> RDF.XSD.Double.new() |> RDF.XSD.Double.canonical() |> RDF.XSD.Double.lexical()
datatype = if is_nil(datatype), do: NS.XSD.double, else: datatype
{value, datatype}
is_integer(value) or (is_number(value) and datatype == to_string(XSD.integer)) ->
value = value |> RDF.Integer.new |> RDF.Literal.canonical |> RDF.Literal.lexical
datatype = if is_nil(datatype), do: XSD.integer, else: datatype
is_integer(value) or (is_number(value) and datatype == to_string(NS.XSD.integer)) ->
value = value |> RDF.XSD.Integer.new() |> RDF.XSD.Integer.canonical() |> RDF.XSD.Integer.lexical()
datatype = if is_nil(datatype), do: NS.XSD.integer, else: datatype
{value, datatype}
is_nil(datatype) ->
datatype =
if Map.has_key?(item, "@language") do
RDF.langString
else
XSD.string
NS.XSD.string
end
{value, datatype}
true ->
{value, datatype}
end
RDF.Literal.new(value,
%{datatype: datatype, language: item["@language"], canonicalize: true})
if language = item["@language"] do
RDF.Literal.new(value, language: language, canonicalize: true)
else
RDF.Literal.new(value, datatype: datatype, canonicalize: true)
end
end
defp list_to_rdf(list, node_id_map) do
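A rough sketch of what this value conversion yields (invented IRIs; `to_rdf/2` works on an already-parsed JSON-LD object): native JSON numbers and booleans become canonical `xsd:integer`/`xsd:double`/`xsd:boolean` literals, and language-tagged values become `rdf:langString` literals.

```elixir
# Assumed illustration, not part of this diff:
"""
{
  "@id": "http://example.com/a",
  "http://example.com/count": 42,
  "http://example.com/active": true,
  "http://example.com/label": {"@value": "Haus", "@language": "de"}
}
"""
|> Jason.decode!()
|> JSON.LD.Decoder.to_rdf()
# returns an RDF.Dataset containing
#   <http://example.com/a> <http://example.com/count>  "42"^^xsd:integer
#   <http://example.com/a> <http://example.com/active> "true"^^xsd:boolean
#   <http://example.com/a> <http://example.com/label>  "Haus"@de
```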


@ -4,8 +4,7 @@ defmodule JSON.LD.Encoder do
use RDF.Serialization.Encoder
alias RDF.{IRI, BlankNode, Literal}
alias RDF.NS.{XSD}
alias RDF.{IRI, BlankNode, Literal, XSD, NS}
@rdf_type to_string(RDF.NS.RDF.type)
@rdf_nil to_string(RDF.NS.RDF.nil)
@ -13,16 +12,17 @@ defmodule JSON.LD.Encoder do
@rdf_rest to_string(RDF.NS.RDF.rest)
@rdf_list to_string(RDF.uri(RDF.NS.RDF.List))
@impl RDF.Serialization.Encoder
def encode(data, opts \\ []) do
with {:ok, json_ld_object} <- from_rdf(data, opts) do
encode_json(json_ld_object)
encode_json(json_ld_object, opts)
end
end
def encode!(data, opts \\ []) do
data
|> from_rdf!(opts)
|> encode_json!
|> encode_json!(opts)
end
def from_rdf(dataset, options \\ %JSON.LD.Options{}) do
@ -33,7 +33,9 @@ defmodule JSON.LD.Encoder do
end
end
def from_rdf!(dataset, options \\ %JSON.LD.Options{}) do
def from_rdf!(rdf_data, options \\ %JSON.LD.Options{})
def from_rdf!(%RDF.Dataset{} = dataset, options) do
with options = JSON.LD.Options.new(options) do
graph_map =
Enum.reduce RDF.Dataset.graphs(dataset), %{},
@ -76,7 +78,7 @@ defmodule JSON.LD.Encoder do
|> Enum.sort_by(fn {s, _} -> s end)
|> Enum.reduce([], fn ({_s, n}, graph_nodes) ->
n = Map.delete(n, "usages")
if Map.size(n) == 1 and Map.has_key?(n, "@id") do
if map_size(n) == 1 and Map.has_key?(n, "@id") do
graph_nodes
else
[n | graph_nodes]
@ -89,7 +91,7 @@ defmodule JSON.LD.Encoder do
# 6.2)
node = Map.delete(node, "usages")
if Map.size(node) == 1 and Map.has_key?(node, "@id") do
if map_size(node) == 1 and Map.has_key?(node, "@id") do
result
else
[node | result]
@ -99,6 +101,9 @@ defmodule JSON.LD.Encoder do
end
end
def from_rdf!(rdf_data, options),
do: rdf_data |> RDF.Dataset.new() |> from_rdf!(options)
# 3.5)
defp node_map_from_graph(graph, current, use_native_types, use_rdf_type) do
Enum.reduce(graph, current, fn ({subject, predicate, object}, node_map) ->
@ -273,38 +278,39 @@ defmodule JSON.LD.Encoder do
%{"@id" => to_string(bnode)}
end
defp rdf_to_object(%Literal{value: value, datatype: datatype} = literal, use_native_types) do
defp rdf_to_object(%Literal{literal: %datatype{}} = literal, use_native_types) do
result = %{}
value = Literal.value(literal)
converted_value = literal
type = nil
{converted_value, type, result} =
if use_native_types do
cond do
datatype == XSD.string ->
datatype == XSD.String ->
{value, type, result}
datatype == XSD.boolean ->
if RDF.Boolean.valid?(literal) do
datatype == XSD.Boolean ->
if RDF.XSD.Boolean.valid?(literal) do
{value, type, result}
else
{converted_value, XSD.boolean, result}
{converted_value, NS.XSD.boolean, result}
end
datatype in [XSD.integer, XSD.double] ->
if RDF.Literal.valid?(literal) do
datatype in [XSD.Integer, XSD.Double] ->
if Literal.valid?(literal) do
{value, type, result}
else
{converted_value, type, result}
end
true ->
{converted_value, datatype, result}
{converted_value, Literal.datatype_id(literal), result}
end
else
cond do
datatype == RDF.langString ->
{converted_value, type, Map.put(result, "@language", literal.language)}
datatype == XSD.string ->
datatype == RDF.LangString ->
{converted_value, type, Map.put(result, "@language", Literal.language(literal))}
datatype == XSD.String ->
{converted_value, type, result}
true ->
{converted_value, datatype, result}
{Literal.lexical(literal), Literal.datatype_id(literal), result}
end
end
@ -314,14 +320,11 @@ defmodule JSON.LD.Encoder do
end
# TODO: This should not be dependent on Poison as a JSON encoder in general,
# but determine available JSON encoders and use one heuristically or by configuration
defp encode_json(value, opts \\ []) do
Poison.encode(value)
defp encode_json(value, opts) do
Jason.encode(value, opts)
end
defp encode_json!(value, opts \\ []) do
Poison.encode!(value)
defp encode_json!(value, opts) do
Jason.encode!(value, opts)
end
end
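A rough usage sketch of the new `from_rdf!/2` clause (invented IRIs): RDF data other than an `RDF.Dataset` is now wrapped in a dataset automatically, so graphs and descriptions can be passed directly.

```elixir
import RDF.Sigils

# Assumed example: no manual RDF.Dataset wrapping needed anymore.
graph = RDF.Graph.new({~I<http://a/b>, ~I<http://a/c>, ~I<http://a/d>})

JSON.LD.Encoder.from_rdf!(graph)
#=> [%{"@id" => "http://a/b", "http://a/c" => [%{"@id" => "http://a/d"}]}]
```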


@ -304,7 +304,7 @@ defmodule JSON.LD.Expansion do
# 8)
%{"@value" => value} ->
with keys = Map.keys(result) do # 8.1)
if Enum.any?(keys, &(not &1 in ~w[@value @language @type @index])) ||
if Enum.any?(keys, &(&1 not in ~w[@value @language @type @index])) ||
("@language" in keys and "@type" in keys) do
raise JSON.LD.InvalidValueObjectError,
message: "value object with disallowed members"
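A rough illustration of this check (invented IRIs): besides `@value`, a value object may only contain `@language`, `@type` and `@index`.

```elixir
# Hypothetical example: "@id" next to "@value" triggers the error.
JSON.LD.expand(%{
  "http://example.com/prop" => %{"@value" => "v", "@id" => "http://example.com/x"}
})
# ** (JSON.LD.InvalidValueObjectError) value object with disallowed members
```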


@ -9,7 +9,7 @@ defmodule JSON.LD do
@extension "jsonld"
@media_type "application/ld+json"
def options, do: JSON.LD.Options.new
def options, do: JSON.LD.Options.new
@keywords ~w[
@base

mix.exs (25 changes)

@ -1,7 +1,7 @@
defmodule JSON.LD.Mixfile do
use Mix.Project
@repo_url "https://github.com/marcelotto/jsonld-ex"
@repo_url "https://github.com/rdf-elixir/jsonld-ex"
@version File.read!("VERSION") |> String.trim
@ -9,10 +9,11 @@ defmodule JSON.LD.Mixfile do
[
app: :json_ld,
version: @version,
elixir: "~> 1.5",
elixir: "~> 1.8",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
deps: deps(),
elixirc_paths: elixirc_paths(Mix.env()),
# Hex
package: package(),
@ -30,7 +31,7 @@ defmodule JSON.LD.Mixfile do
# ExCoveralls
test_coverage: [tool: ExCoveralls],
preferred_cli_env: [
"coveralls": :test,
coveralls: :test,
"coveralls.detail": :test,
"coveralls.post": :test,
"coveralls.html": :test
@ -59,13 +60,17 @@ defmodule JSON.LD.Mixfile do
defp deps do
[
{:rdf, "~> 0.4"},
{:poison, "~> 3.0"},
{:httpoison, "~> 0.13"},
{:dialyxir, "~> 0.4", only: [:dev, :test], runtime: false},
{:credo, "~> 0.6", only: [:dev, :test], runtime: false},
{:ex_doc, "~> 0.14", only: :dev, runtime: false},
{:excoveralls, "~> 0.7", only: :test},
{:rdf, "~> 0.8"},
{:jason, "~> 1.2"},
{:httpoison, "~> 1.7"},
{:credo, "~> 1.4", only: [:dev, :test], runtime: false},
{:dialyxir, "~> 1.0", only: :dev, runtime: false},
{:ex_doc, "~> 0.22", only: :dev, runtime: false},
{:excoveralls, "~> 0.13", only: :test},
]
end
defp elixirc_paths(:test), do: ["lib", "test/support"]
defp elixirc_paths(_), do: ["lib"]
end


@ -1,20 +1,25 @@
%{
"bunt": {:hex, :bunt, "0.2.0", "951c6e801e8b1d2cbe58ebbd3e616a869061ddadcc4863d0a2182541acae9a38", [:mix], []},
"certifi": {:hex, :certifi, "2.0.0", "a0c0e475107135f76b8c1d5bc7efb33cd3815cb3cf3dea7aefdd174dabead064", [:rebar3], []},
"credo": {:hex, :credo, "0.8.10", "261862bb7363247762e1063713bb85df2bbd84af8d8610d1272cd9c1943bba63", [:mix], [{:bunt, "~> 0.2.0", [hex: :bunt, optional: false]}]},
"dialyxir": {:hex, :dialyxir, "0.5.1", "b331b091720fd93e878137add264bac4f644e1ddae07a70bf7062c7862c4b952", [:mix], []},
"earmark": {:hex, :earmark, "1.2.4", "99b637c62a4d65a20a9fb674b8cffb8baa771c04605a80c911c4418c69b75439", [:mix], []},
"ex_doc": {:hex, :ex_doc, "0.18.3", "f4b0e4a2ec6f333dccf761838a4b253d75e11f714b85ae271c9ae361367897b7", [:mix], [{:earmark, "~> 1.1", [hex: :earmark, optional: false]}]},
"excoveralls": {:hex, :excoveralls, "0.8.1", "0bbf67f22c7dbf7503981d21a5eef5db8bbc3cb86e70d3798e8c802c74fa5e27", [:mix], [{:exjsx, ">= 3.0.0", [hex: :exjsx, optional: false]}, {:hackney, ">= 0.12.0", [hex: :hackney, optional: false]}]},
"exjsx": {:hex, :exjsx, "4.0.0", "60548841e0212df401e38e63c0078ec57b33e7ea49b032c796ccad8cde794b5c", [:mix], [{:jsx, "~> 2.8.0", [hex: :jsx, optional: false]}]},
"hackney": {:hex, :hackney, "1.11.0", "4951ee019df102492dabba66a09e305f61919a8a183a7860236c0fde586134b6", [:rebar3], [{:certifi, "2.0.0", [hex: :certifi, optional: false]}, {:idna, "5.1.0", [hex: :idna, optional: false]}, {:metrics, "1.0.1", [hex: :metrics, optional: false]}, {:mimerl, "1.0.2", [hex: :mimerl, optional: false]}, {:ssl_verify_fun, "1.1.1", [hex: :ssl_verify_fun, optional: false]}]},
"httpoison": {:hex, :httpoison, "0.13.0", "bfaf44d9f133a6599886720f3937a7699466d23bb0cd7a88b6ba011f53c6f562", [:mix], [{:hackney, "~> 1.8", [hex: :hackney, optional: false]}]},
"idna": {:hex, :idna, "5.1.0", "d72b4effeb324ad5da3cab1767cb16b17939004e789d8c0ad5b70f3cea20c89a", [:rebar3], [{:unicode_util_compat, "0.3.1", [hex: :unicode_util_compat, optional: false]}]},
"jsx": {:hex, :jsx, "2.8.3", "a05252d381885240744d955fbe3cf810504eb2567164824e19303ea59eef62cf", [:mix, :rebar3], []},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], []},
"mimerl": {:hex, :mimerl, "1.0.2", "993f9b0e084083405ed8252b99460c4f0563e41729ab42d9074fd5e52439be88", [:rebar3], []},
"poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], []},
"rdf": {:hex, :rdf, "0.4.0", "10db43f71931f95e3707a5a0357d46149e6c47be8ae83f17199ff296765891db", [:mix], []},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.1", "28a4d65b7f59893bc2c7de786dec1e1555bd742d336043fe644ae956c3497fbe", [:make, :rebar], []},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.3.1", "a1f612a7b512638634a603c8f401892afbf99b8ce93a45041f8aaca99cadb85e", [:rebar3], []},
"bunt": {:hex, :bunt, "0.2.0", "951c6e801e8b1d2cbe58ebbd3e616a869061ddadcc4863d0a2182541acae9a38", [:mix], [], "hexpm", "7af5c7e09fe1d40f76c8e4f9dd2be7cebd83909f31fee7cd0e9eadc567da8353"},
"certifi": {:hex, :certifi, "2.5.2", "b7cfeae9d2ed395695dd8201c57a2d019c0c43ecaf8b8bcb9320b40d6662f340", [:rebar3], [{:parse_trans, "~>3.3", [hex: :parse_trans, repo: "hexpm", optional: false]}], "hexpm", "3b3b5f36493004ac3455966991eaf6e768ce9884693d9968055aeeeb1e575040"},
"credo": {:hex, :credo, "1.4.0", "92339d4cbadd1e88b5ee43d427b639b68a11071b6f73854e33638e30a0ea11f5", [:mix], [{:bunt, "~> 0.2.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "1fd3b70dce216574ce3c18bdf510b57e7c4c85c2ec9cad4bff854abaf7e58658"},
"decimal": {:hex, :decimal, "1.8.1", "a4ef3f5f3428bdbc0d35374029ffcf4ede8533536fa79896dd450168d9acdf3c", [:mix], [], "hexpm", "3cb154b00225ac687f6cbd4acc4b7960027c757a5152b369923ead9ddbca7aec"},
"dialyxir": {:hex, :dialyxir, "1.0.0", "6a1fa629f7881a9f5aaf3a78f094b2a51a0357c843871b8bc98824e7342d00a5", [:mix], [{:erlex, ">= 0.2.6", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "aeb06588145fac14ca08d8061a142d52753dbc2cf7f0d00fc1013f53f8654654"},
"earmark": {:hex, :earmark, "1.4.4", "4821b8d05cda507189d51f2caeef370cf1e18ca5d7dfb7d31e9cafe6688106a4", [:mix], [], "hexpm", "1f93aba7340574847c0f609da787f0d79efcab51b044bb6e242cae5aca9d264d"},
"erlex": {:hex, :erlex, "0.2.6", "c7987d15e899c7a2f34f5420d2a2ea0d659682c06ac607572df55a43753aa12e", [:mix], [], "hexpm", "2ed2e25711feb44d52b17d2780eabf998452f6efda104877a3881c2f8c0c0c75"},
"ex_doc": {:hex, :ex_doc, "0.22.1", "9bb6d51508778193a4ea90fa16eac47f8b67934f33f8271d5e1edec2dc0eee4c", [:mix], [{:earmark, "~> 1.4.0", [hex: :earmark, repo: "hexpm", optional: false]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}], "hexpm", "d957de1b75cb9f78d3ee17820733dc4460114d8b1e11f7ee4fd6546e69b1db60"},
"excoveralls": {:hex, :excoveralls, "0.13.0", "4e1b7cc4e0351d8d16e9be21b0345a7e165798ee5319c7800b9138ce17e0b38e", [:mix], [{:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "fe2a56c8909564e2e6764765878d7d5e141f2af3bc8ff3b018a68ee2a218fced"},
"hackney": {:hex, :hackney, "1.16.0", "5096ac8e823e3a441477b2d187e30dd3fff1a82991a806b2003845ce72ce2d84", [:rebar3], [{:certifi, "2.5.2", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "6.0.1", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "1.0.1", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.3.0", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "1.1.6", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm", "3bf0bebbd5d3092a3543b783bf065165fa5d3ad4b899b836810e513064134e18"},
"httpoison": {:hex, :httpoison, "1.7.0", "abba7d086233c2d8574726227b6c2c4f6e53c4deae7fe5f6de531162ce9929a0", [:mix], [{:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "975cc87c845a103d3d1ea1ccfd68a2700c211a434d8428b10c323dc95dc5b980"},
"idna": {:hex, :idna, "6.0.1", "1d038fb2e7668ce41fbf681d2c45902e52b3cb9e9c77b55334353b222c2ee50c", [:rebar3], [{:unicode_util_compat, "0.5.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "a02c8a1c4fd601215bb0b0324c8a6986749f807ce35f25449ec9e69758708122"},
"jason": {:hex, :jason, "1.2.1", "12b22825e22f468c02eb3e4b9985f3d0cb8dc40b9bd704730efa11abd2708c44", [:mix], [{:decimal, "~> 1.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "b659b8571deedf60f79c5a608e15414085fa141344e2716fbd6988a084b5f993"},
"makeup": {:hex, :makeup, "1.0.1", "82f332e461dc6c79dbd82fbe2a9c10d48ed07146f0a478286e590c83c52010b5", [:mix], [{:nimble_parsec, "~> 0.5.0", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "49736fe5b66a08d8575bf5321d716bac5da20c8e6b97714fec2bcd6febcfa1f8"},
"makeup_elixir": {:hex, :makeup_elixir, "0.14.0", "cf8b7c66ad1cff4c14679698d532f0b5d45a3968ffbcbfd590339cb57742f1ae", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "d4b316c7222a85bbaa2fd7c6e90e37e953257ad196dc229505137c5e505e9eff"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mimerl": {:hex, :mimerl, "1.2.0", "67e2d3f571088d5cfd3e550c383094b47159f3eee8ffa08e64106cdf5e981be3", [:rebar3], [], "hexpm", "f278585650aa581986264638ebf698f8bb19df297f66ad91b18910dfc6e19323"},
"nimble_parsec": {:hex, :nimble_parsec, "0.5.3", "def21c10a9ed70ce22754fdeea0810dafd53c2db3219a0cd54cf5526377af1c6", [:mix], [], "hexpm", "589b5af56f4afca65217a1f3eb3fee7e79b09c40c742fddc1c312b3ac0b3399f"},
"parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm", "17ef63abde837ad30680ea7f857dd9e7ced9476cdd7b0394432af4bfc241b960"},
"protocol_ex": {:hex, :protocol_ex, "0.4.3", "4acbe35da85109dc40315c1139bb7a65ebc7fc102d384cd8b3038384fbb9b282", [:mix], [], "hexpm", "6ca5ddb3505c9c86f17cd3f19838b34bf89966ae17078f79f81983b6a4391fe9"},
"rdf": {:hex, :rdf, "0.8.0", "9e797e1120d7eb4285874eb3e293493670d3deddcca8315abec9598ca80ae7d4", [:mix], [{:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:protocol_ex, "~> 0.4", [hex: :protocol_ex, repo: "hexpm", optional: false]}], "hexpm", "d256e7d35d03dd22ae44850c31d5e7eca77563985b8ba7dbf6d639f7ddadd596"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.6", "cf344f5692c82d2cd7554f5ec8fd961548d4fd09e7d22f5b62482e5aeaebd4b0", [:make, :mix, :rebar3], [], "hexpm", "bdb0d2471f453c88ff3908e7686f86f9be327d065cc1ec16fa4540197ea04680"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.5.0", "8516502659002cec19e244ebd90d312183064be95025a319a6c7e89f4bccd65b", [:rebar3], [], "hexpm", "d48d002e15f5cc105a696cf2f1bbb3fc72b4b770a184d8420c8db20da2674b38"},
}


@ -49,6 +49,6 @@ defmodule JSON.LD.TestSuite.FromRdfTest do
filename
|> file
|> File.read!
|> Poison.Parser.parse!
|> Jason.decode!
end
end


@ -7,7 +7,7 @@ defmodule JSON.LD.TestSuite do
def parse_json_file!(file) do
case File.read(file(file)) do
{:ok, content} -> Poison.Parser.parse!(content)
{:ok, content} -> Jason.decode!(content)
{:error, reason} -> raise File.Error, path: file, action: "read", reason: reason
end
end
@ -49,7 +49,6 @@ defmodule JSON.LD.TestSuite do
{:expand_context, file} -> {:expand_context, j(file)}
option -> option
end)
|> JSON.LD.Options.new
end
def exception(error) do


@ -1,7 +1 @@
ExUnit.start()
with files = File.ls!("./test/support") do
Enum.each files, fn(file) ->
Code.require_file "support/#{file}", __DIR__
end
end


@ -4,7 +4,7 @@ defmodule JSON.LD.CompactionTest do
alias RDF.NS.{RDFS, XSD}
test "Flattened form of a JSON-LD document (EXAMPLES 57-59 of https://www.w3.org/TR/json-ld/#compacted-document-form)" do
input = Poison.Parser.parse! """
input = Jason.decode! """
[
{
"http://xmlns.com/foaf/0.1/name": [ "Manu Sporny" ],
@ -16,7 +16,7 @@ defmodule JSON.LD.CompactionTest do
}
]
"""
context = Poison.Parser.parse! """
context = Jason.decode! """
{
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",
@ -27,7 +27,7 @@ defmodule JSON.LD.CompactionTest do
}
}
"""
assert JSON.LD.compact(input, context) == Poison.Parser.parse! """
assert JSON.LD.compact(input, context) == Jason.decode! """
{
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",
@ -324,16 +324,16 @@ defmodule JSON.LD.CompactionTest do
}
},
"compact-0007" => %{
input: Poison.Parser.parse!("""
input: Jason.decode!("""
{"http://example.org/vocab#contains": "this-is-not-an-IRI"}
"""),
context: Poison.Parser.parse!("""
context: Jason.decode!("""
{
"ex": "http://example.org/vocab#",
"ex:contains": {"@type": "@id"}
}
"""),
output: Poison.Parser.parse!("""
output: Jason.decode!("""
{
"@context": {
"ex": "http://example.org/vocab#",
@ -355,7 +355,7 @@ defmodule JSON.LD.CompactionTest do
describe "@reverse" do
%{
"compact-0033" => %{
input: Poison.Parser.parse!("""
input: Jason.decode!("""
[
{
"@id": "http://example.com/people/markus",
@ -371,13 +371,13 @@ defmodule JSON.LD.CompactionTest do
}
]
"""),
context: Poison.Parser.parse!("""
context: Jason.decode!("""
{
"name": "http://xmlns.com/foaf/0.1/name",
"isKnownBy": { "@reverse": "http://xmlns.com/foaf/0.1/knows" }
}
"""),
output: Poison.Parser.parse!("""
output: Jason.decode!("""
{
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",


@ -3,9 +3,8 @@ defmodule JSON.LD.EncoderTest do
doctest JSON.LD.Encoder
alias RDF.{Dataset}
alias RDF.{Dataset, Graph, Description}
alias RDF.NS
alias RDF.NS.{XSD, RDFS}
import RDF.Sigils
@ -17,14 +16,37 @@ defmodule JSON.LD.EncoderTest do
alias TestNS.{EX, S}
@compile {:no_warn_undefined, JSON.LD.EncoderTest.TestNS.EX}
@compile {:no_warn_undefined, JSON.LD.EncoderTest.TestNS.S}
def gets_serialized_to(input, output, opts \\ []) do
data_structs = Keyword.get(opts, :only, [Dataset])
data_structs = Keyword.get(opts, :data_structs, [Dataset, Graph])
Enum.each data_structs, fn data_struct ->
assert JSON.LD.Encoder.from_rdf!(data_struct.new(input), opts) == output
end
end
test "pretty printing" do
dataset = Dataset.new {~I<http://a/b>, ~I<http://a/c>, ~I<http://a/d>}
assert JSON.LD.Encoder.encode!(dataset) ==
"[{\"@id\":\"http://a/b\",\"http://a/c\":[{\"@id\":\"http://a/d\"}]}]"
assert JSON.LD.Encoder.encode!(dataset, pretty: true) ==
"""
[
{
"@id": "http://a/b",
"http://a/c": [
{
"@id": "http://a/d"
}
]
}
]
""" |> String.trim()
end
test "an empty RDF.Dataset is serialized to an JSON array string" do
assert JSON.LD.Encoder.encode!(Dataset.new) == "[]"
@ -36,7 +58,7 @@ defmodule JSON.LD.EncoderTest do
|> gets_serialized_to([%{
"@id" => "http://a/b",
"http://a/c" => [%{"@id" => "http://a/d"}]
}])
}], data_structs: [Dataset, Graph, Description])
end
test "should generate object list" do
@ -47,7 +69,7 @@ defmodule JSON.LD.EncoderTest do
%{"@id" => "http://example.com/d"},
%{"@id" => "http://example.com/e"}
]
}])
}], data_structs: [Dataset, Graph, Description])
end
test "should generate property list" do
@ -56,7 +78,7 @@ defmodule JSON.LD.EncoderTest do
"@id" => "http://example.com/b",
"http://example.com/c" => [%{"@id" => "http://example.com/d"}],
"http://example.com/e" => [%{"@id" => "http://example.com/f"}]
}])
}], data_structs: [Dataset, Graph, Description])
end
test "serializes multiple subjects" do
@ -77,7 +99,7 @@ defmodule JSON.LD.EncoderTest do
|> gets_serialized_to([%{
"@id" => "http://example.com/a",
"http://example.com/b" => [%{"@value" => "foo", "@type" => "http://example.com/d"}]
}])
}], data_structs: [Dataset, Graph, Description])
end
test "integer" do
@ -144,14 +166,14 @@ defmodule JSON.LD.EncoderTest do
integer: 1,
unsignedInt: 1,
nonNegativeInteger: 1,
float: 1,
float: 1.0,
nonPositiveInteger: -1,
negativeInteger: -1,
}
|> Enum.each(fn ({type, _} = data) ->
@tag data: data
test "#{type}", %{data: {type, value}} do
{EX.a, EX.b, RDF.literal(value, datatype: apply(XSD, type, []))}
{EX.a, EX.b, RDF.literal(value, datatype: apply(NS.XSD, type, []))}
|> gets_serialized_to([%{
"@id" => "http://example.com/a",
"http://example.com/b" => [%{"@value" => "#{value}", "@type" => "http://www.w3.org/2001/XMLSchema##{type}"}]
@ -183,7 +205,7 @@ defmodule JSON.LD.EncoderTest do
|> gets_serialized_to([%{
"@id" => "_:a",
"http://example.com/a" => [%{"@id" => "http://example.com/b"}]
}])
}], data_structs: [Dataset, Graph, Description])
end
test "should generate blank nodes as object" do
@ -409,7 +431,7 @@ defmodule JSON.LD.EncoderTest do
|> Enum.each(fn ({title, data}) ->
@tag data: data
test title, %{data: %{input: input, output: output}} do
input |> gets_serialized_to(output, only: [Dataset])
input |> gets_serialized_to(output, data_structs: [Dataset])
end
end)
end
@ -417,7 +439,7 @@ defmodule JSON.LD.EncoderTest do
describe "problems" do
%{
"xsd:boolean as value" => {
{~I<http://data.wikia.com/terms#playable>, RDFS.range, XSD.boolean},
{~I<http://data.wikia.com/terms#playable>, NS.RDFS.range, NS.XSD.boolean},
[%{
"@id" => "http://data.wikia.com/terms#playable",
"http://www.w3.org/2000/01/rdf-schema#range" => [


@ -6,7 +6,7 @@ defmodule JSON.LD.ExpansionTest do
alias RDF.NS.{RDFS, XSD}
test "Expanded form of a JSON-LD document (EXAMPLE 55 and 56 of https://www.w3.org/TR/json-ld/#expanded-document-form)" do
input = Poison.Parser.parse! """
input = Jason.decode! """
{
"@context":
{
@ -20,7 +20,7 @@ defmodule JSON.LD.ExpansionTest do
"homepage": "http://manu.sporny.org/"
}
"""
assert JSON.LD.expand(input) == Poison.Parser.parse! """
assert JSON.LD.expand(input) == Jason.decode! """
[
{
"http://xmlns.com/foaf/0.1/name": [
@ -540,7 +540,7 @@ defmodule JSON.LD.ExpansionTest do
}]
},
"expand-0004" => %{
input: Poison.Parser.parse!(~s({
input: Jason.decode!(~s({
"@context": {
"mylist1": {"@id": "http://example.com/mylist1", "@container": "@list"},
"mylist2": {"@id": "http://example.com/mylist2", "@container": "@list"},
@ -549,7 +549,7 @@ defmodule JSON.LD.ExpansionTest do
},
"http://example.org/property": { "@list": "one item" }
})),
output: Poison.Parser.parse!(~s([
output: Jason.decode!(~s([
{
"http://example.org/property": [
{
@ -700,7 +700,7 @@ defmodule JSON.LD.ExpansionTest do
describe "@reverse" do
%{
"expand-0037" => %{
input: Poison.Parser.parse!(~s({
input: Jason.decode!(~s({
"@context": {
"name": "http://xmlns.com/foaf/0.1/name"
},
@ -713,7 +713,7 @@ defmodule JSON.LD.ExpansionTest do
}
}
})),
output: Poison.Parser.parse!(~s([
output: Jason.decode!(~s([
{
"@id": "http://example.com/people/markus",
"@reverse": {
@ -737,7 +737,7 @@ defmodule JSON.LD.ExpansionTest do
]))
},
"expand-0043" => %{
input: Poison.Parser.parse!(~s({
input: Jason.decode!(~s({
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",
"isKnownBy": { "@reverse": "http://xmlns.com/foaf/0.1/knows" }
@ -757,7 +757,7 @@ defmodule JSON.LD.ExpansionTest do
]
}
})),
output: Poison.Parser.parse!(~s([
output: Jason.decode!(~s([
{
"@id": "http://example.com/people/markus",
"http://xmlns.com/foaf/0.1/knows": [
@ -870,7 +870,7 @@ defmodule JSON.LD.ExpansionTest do
},
"@reverse object with an @id property" => %{
input: Poison.Parser.parse!(~s({
input: Jason.decode!(~s({
"@id": "http://example/foo",
"@reverse": {
"@id": "http://example/bar"
@ -879,7 +879,7 @@ defmodule JSON.LD.ExpansionTest do
exception: JSON.LD.InvalidReversePropertyMapError,
},
"colliding keywords" => %{
input: Poison.Parser.parse!(~s({
input: Jason.decode!(~s({
"@context": {
"id": "@id",
"ID": "@id"


@ -4,7 +4,7 @@ defmodule JSON.LD.FlatteningTest do
alias RDF.NS.RDFS
test "Flattened form of a JSON-LD document (EXAMPLE 60 and 61 of https://www.w3.org/TR/json-ld/#flattened-document-form)" do
input = Poison.Parser.parse! """
input = Jason.decode! """
{
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",
@ -23,7 +23,7 @@ defmodule JSON.LD.FlatteningTest do
]
}
"""
assert JSON.LD.flatten(input, input) == Poison.Parser.parse! """
assert JSON.LD.flatten(input, input) == Jason.decode! """
{
"@context": {
"name": "http://xmlns.com/foaf/0.1/name",
@ -107,7 +107,7 @@ defmodule JSON.LD.FlatteningTest do
]
},
"reverse properties" => %{
input: Poison.Parser.parse!("""
input: Jason.decode!("""
[
{
"@id": "http://example.com/people/markus",
@ -125,7 +125,7 @@ defmodule JSON.LD.FlatteningTest do
}
]
"""),
output: Poison.Parser.parse!("""
output: Jason.decode!("""
[
{
"@id": "http://example.com/people/dave",
@ -155,7 +155,7 @@ defmodule JSON.LD.FlatteningTest do
""")
},
"Simple named graph (Wikidata)" => %{
input: Poison.Parser.parse!("""
input: Jason.decode!("""
{
"@context": {
"rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
@ -187,7 +187,7 @@ defmodule JSON.LD.FlatteningTest do
]
}
"""),
output: Poison.Parser.parse!("""
output: Jason.decode!("""
[{
"@id": "http://example.org/ParisFact1",
"@type": ["http://www.w3.org/1999/02/22-rdf-syntax-ns#Graph"],
@ -212,7 +212,7 @@ defmodule JSON.LD.FlatteningTest do
""")
},
"Test Manifest (shortened)" => %{
input: Poison.Parser.parse!("""
input: Jason.decode!("""
{
"@id": "",
"http://example/sequence": {"@list": [
@ -224,7 +224,7 @@ defmodule JSON.LD.FlatteningTest do
]}
}
"""),
output: Poison.Parser.parse!("""
output: Jason.decode!("""
[{
"@id": "",
"http://example/sequence": [{"@list": [{"@id": "#t0001"}]}]
@ -237,7 +237,7 @@ defmodule JSON.LD.FlatteningTest do
options: %{}
},
"@reverse bnode issue (0045)" => %{
input: Poison.Parser.parse!("""
input: Jason.decode!("""
{
"@context": {
"foo": "http://example.org/foo",
@ -247,7 +247,7 @@ defmodule JSON.LD.FlatteningTest do
"bar": [ "http://example.org/origin", "_:b0" ]
}
"""),
output: Poison.Parser.parse!("""
output: Jason.decode!("""
[
{
"@id": "_:b0",


@ -210,7 +210,7 @@ defmodule JSON.LD.IRICompactionTest do
describe "compact-0018" do
setup do
context = JSON.LD.context(Poison.Parser.parse! """
context = JSON.LD.context(Jason.decode! """
{
"id1": "http://example.com/id1",
"type1": "http://example.com/t1",
@ -324,7 +324,7 @@ defmodule JSON.LD.IRICompactionTest do
do: [values],
else: values
Enum.each(values, fn value ->
value = Poison.Parser.parse!(value)
value = Jason.decode!(value)
@tag data: {term, value}
test "uses #{term} for #{inspect value, limit: 3}",
%{data: {term, value}, example_context: context,