Merge branch 'develop' into reducer-dev

commit de379b6c04
.github/ISSUE_TEMPLATE/developer-bug.md (vendored, 13 lines): deleted

@@ -1,13 +0,0 @@
----
-name: Developer friction when contributing to Squiggle
-about: Have a testing-related task? Did your yarn scripts fail? Did the CI diverge from a README? Etc.
-labels: "ops & testing"
----
-
-# Description:
-
-# The OS and version, yarn version, etc. in which this came up
-
-_delete this section if testing task_
-
-# Desired behavior
.github/ISSUE_TEMPLATE/ops-testing.md (vendored, 12 lines): new file

@@ -0,0 +1,12 @@
+---
+name: Operations and testing
+about: Have a testing-related task? Developer friction when contributing to squiggle? Etc.
+labels: "ops & testing"
+---
+
+# Description:
+
+<!-- delete this section if testing task or otherwise not applicable -->
+# The OS and version, yarn version, etc. in which this came up
+
+# Desired behavior
.github/ISSUE_TEMPLATE/pl.md (vendored, 4 changes)

@@ -1,6 +1,6 @@
 ---
-name: Regarding the programming language
-about: Interpreter, parser, syntax, semantics, and including distributions
+name: Regarding the programming language (the `squiggle-lang` package)
+about: Anything concerning distributions/numerics, as well as the interpreter, parser, syntax, semantics
 labels: "programming language"
 ---
.github/workflows/codeql-analysis.yml (vendored, 5 changes)

@@ -62,7 +62,10 @@ jobs:
       # If this step fails, then you should remove it and run the build manually (see below)
       - name: Autobuild
         uses: github/codeql-action/autobuild@v1
+      - name: Install dependencies
+        run: yarn
+      - name: Build rescript
+        run: cd packages/squiggle-lang && yarn build
       # ℹ️ Command-line programs to run using the OS shell.
       # 📚 https://git.io/JvXDl
(unnamed file)

@@ -51,7 +51,7 @@ See [here](https://github.com/NixOS/nixpkgs/issues/107375)
 
 # Pull request protocol
 
 Please work against `develop` branch. **Do not** work against `master`.
 
 - For rescript code: Quinn and Ozzie are reviewers
 - For js or typescript code: Sam and Ozzie are reviewers

@@ -60,7 +60,8 @@ Please work against `develop` branch. **Do not** work against `master`.
 Autopings are set up: if you are not autopinged, you are welcome to comment, but please do not use the formal review feature, send approvals, rejections, or merges.
 
 # Code Quality
-- Aim for at least 8/10* quality in ``/packages/squiggle-lang``, and 7/10 quality in ``/packages/components``.
+
+- Aim for at least 8/10\* quality in `/packages/squiggle-lang`, and 7/10 quality in `/packages/components`.
 - If you submit a PR that is under a 7, for some reason, describe the reasoning for this in the PR.
 
 * This quality score is subjective.

@@ -74,6 +75,7 @@ Note: Our codebase used to use `|>`, so there's a lot of that in the system. We'
 
 **Don't use anonymous functions with over three lines**
 Bad:
 
 ```rescript
 foo
 -> E.O.fmap(r => {

@@ -83,7 +85,9 @@ Bad:
   r + a + b + c
 }
 ```
 
 Good:
 
 ```rescript
 let addingFn = (r => {
   let a = 34;

@@ -101,6 +105,7 @@ We'll try this for one month (ending May 5, 2022), then revisit.
 Rescript is clever about function inputs. There's custom syntax for default and optional arguments. In the cases where this applies, use it.
 
 From https://rescript-lang.org/docs/manual/latest/function:
 
 ```rescript
 // radius can be omitted
 let drawCircle = (~color, ~radius=?, ()) => {

@@ -114,22 +119,23 @@ let drawCircle = (~color, ~radius=?, ()) => {
 
 **Use named arguments**
 If a function is called externally (in a different file), and has either:
 
 1. Two arguments of the same type
 2. Three paramaters or more.
 
 **Module naming: Use x_y as module names**
-For example: ``Myname_Myproject_Add.res``. Rescript/Ocaml both require files to have unique names, so long names are needed to keep different parts separate from each other.
+For example: `Myname_Myproject_Add.res`. Rescript/Ocaml both require files to have unique names, so long names are needed to keep different parts separate from each other.
 
 See [this page](https://dev.to/yawaramin/a-modular-ocaml-project-structure-1ikd) for more information. (Though note that they use two underscores, and we do one. We might refactor that later.
 
 **Module naming: Don't rename modules**
 We have some of this in the Reducer code, but generally discourage it.
 
 **Use interface files (.resi) for files with very public interfaces**
 
 ### Recommended Rescript resources
-- https://dev.to/yawaramin/a-modular-ocaml-project-structure-1ikd
-- https://github.com/avohq/reasonml-code-style-guide
-- https://cs.brown.edu/courses/cs017/content/docs/reasonml-style.pdf
-- https://github.com/ostera/reason-design-patterns/
+
+- https://dev.to/yawaramin/a-modular-ocaml-project-structure-1ikd
+- https://github.com/avohq/reasonml-code-style-guide
+- https://cs.brown.edu/courses/cs017/content/docs/reasonml-style.pdf
+- https://github.com/ostera/reason-design-patterns/
(unnamed file)

@@ -2,7 +2,9 @@
   "private": true,
   "name": "squiggle",
   "scripts": {
-    "nodeclean": "rm -r node_modules && rm -r packages/*/node_modules"
+    "nodeclean": "rm -r node_modules && rm -r packages/*/node_modules",
+    "format:all": "prettier --write . && cd packages/squiggle-lang && yarn format",
+    "lint:all": "prettier --check . && cd packages/squiggle-lang && yarn lint:rescript"
   },
   "devDependencies": {
     "prettier": "^2.6.2"
(unnamed file)

@@ -9,7 +9,7 @@
     "@types/jest": "^27.4.0",
     "@types/lodash": "^4.14.181",
     "@types/node": "^17.0.23",
-    "@types/react": "^18.0.1",
+    "@types/react": "^18.0.3",
     "@types/react-dom": "^18.0.0",
     "antd": "^4.19.3",
     "cross-env": "^7.0.3",

@@ -17,7 +17,7 @@
     "react": "^18.0.0",
     "react-ace": "9.5.0",
     "react-dom": "^18.0.0",
-    "react-scripts": "5.0.0",
+    "react-scripts": "5.0.1",
     "react-vega": "^7.5.0",
     "styled-components": "^5.3.5",
     "tsconfig-paths-webpack-plugin": "^3.5.2",
(unnamed file)

@@ -37,7 +37,7 @@ could be continuous, discrete or mixed.
   <Story
     name="Discrete"
     args={{
-      squiggleString: "mm(0, 1, 3, 5, 8, 10, [0.1, 0.8, 0.5, 0.3, 0.2, 0.1])",
+      squiggleString: "mx(0, 1, 3, 5, 8, 10, [0.1, 0.8, 0.5, 0.3, 0.2, 0.1])",
     }}
   >
     {Template.bind({})}

@@ -51,7 +51,7 @@ could be continuous, discrete or mixed.
     name="Mixed"
     args={{
       squiggleString:
-        "mm(0, 1, 3, 5, 8, normal(8, 1), [0.1, 0.3, 0.4, 0.35, 0.2, 0.8])",
+        "mx(0, 1, 3, 5, 8, normal(8, 1), [0.1, 0.3, 0.4, 0.35, 0.2, 0.8])",
     }}
   >
     {Template.bind({})}
(unnamed file)

@@ -130,10 +130,6 @@
       },
       "encode": {
         "enter": {
-          "y2": {
-            "scale": "yscale",
-            "value": 0
-          },
           "width": {
             "value": 1
           }

@@ -146,6 +142,10 @@
           "y": {
             "scale": "yscale",
            "field": "y"
+          },
+          "y2": {
+            "scale": "yscale",
+            "value": 0
           }
         }
       }

@@ -160,7 +160,7 @@
         "shape": {
           "value": "circle"
         },
-        "size": [{ "value": 30 }],
+        "size": [{ "value": 100 }],
         "tooltip": {
           "signal": "datum.y"
         }
packages/squiggle-lang/.prettierignore (4 lines): new file

@@ -0,0 +1,4 @@
+dist
+lib
+*.bs.js
+*.gen.tsx
(unnamed file)

@@ -4,10 +4,10 @@ open Expect
 describe("Bandwidth", () => {
   test("nrd0()", () => {
     let data = [1., 4., 3., 2.]
-    expect(SampleSetDist_Bandwidth.nrd0(data)) -> toEqual(0.7625801874014622)
+    expect(SampleSetDist_Bandwidth.nrd0(data))->toEqual(0.7625801874014622)
   })
   test("nrd()", () => {
     let data = [1., 4., 3., 2.]
-    expect(SampleSetDist_Bandwidth.nrd(data)) -> toEqual(0.8981499984950554)
+    expect(SampleSetDist_Bandwidth.nrd(data))->toEqual(0.8981499984950554)
   })
 })
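The two expected values in the test above look like R-style kernel bandwidth rules of thumb. A minimal Python sketch, assuming `nrd0` follows Silverman's rule (R's `bw.nrd0`) and `nrd` the 1.06-factor variant (R's `bw.nrd`); the exact quantile interpolation in `SampleSetDist_Bandwidth` may differ slightly, so only loose agreement with the test constants is expected:

```python
import statistics

def quantile(xs, p):
    # Type-7 (linear interpolation) quantile, the default in R and d3.
    s = sorted(xs)
    h = (len(s) - 1) * p
    lo = int(h)
    return s[lo] + (h - lo) * (s[min(lo + 1, len(s) - 1)] - s[lo])

def nrd0(xs):
    # Silverman's rule of thumb: 0.9 * min(sd, IQR/1.34) * n^(-1/5)
    iqr = quantile(xs, 0.75) - quantile(xs, 0.25)
    spread = min(statistics.stdev(xs), iqr / 1.34)
    return 0.9 * spread * len(xs) ** -0.2

def nrd(xs):
    # Scott-style variant: same spread estimate with factor 1.06
    iqr = quantile(xs, 0.75) - quantile(xs, 0.25)
    spread = min(statistics.stdev(xs), iqr / 1.34)
    return 1.06 * spread * len(xs) ** -0.2

data = [1.0, 4.0, 3.0, 2.0]
print(nrd0(data))  # close to the 0.7625801874014622 expected by the test
print(nrd(data))   # close to the 0.8981499984950554 expected by the test
```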
(unnamed file)

@@ -18,11 +18,9 @@ let {
   triangularDist,
   exponentialDist,
 } = module(GenericDist_Fixtures)
-let mkNormal = (mean, stdev) => GenericDist_Types.Symbolic(#Normal({mean: mean, stdev: stdev}))
 
-let {toFloat, toDist, toString, toError} = module(DistributionOperation.Output)
+let {toFloat, toDist, toString, toError, fmap} = module(DistributionOperation.Output)
 let {run} = module(DistributionOperation)
-let {fmap} = module(DistributionOperation.Output)
 let run = run(~env)
 let outputMap = fmap(~env)
 let toExt: option<'a> => 'a = E.O.toExt(
(unnamed file)

@@ -6,6 +6,9 @@ let normalDist: GenericDist_Types.genericDist = normalDist5
 let betaDist: GenericDist_Types.genericDist = Symbolic(#Beta({alpha: 2.0, beta: 5.0}))
 let lognormalDist: GenericDist_Types.genericDist = Symbolic(#Lognormal({mu: 0.0, sigma: 1.0}))
 let cauchyDist: GenericDist_Types.genericDist = Symbolic(#Cauchy({local: 1.0, scale: 1.0}))
-let triangularDist: GenericDist_Types.genericDist = Symbolic(#Triangular({low: 1.0, medium: 2.0, high: 3.0}))
+let triangularDist: GenericDist_Types.genericDist = Symbolic(
+  #Triangular({low: 1.0, medium: 2.0, high: 3.0}),
+)
 let exponentialDist: GenericDist_Types.genericDist = Symbolic(#Exponential({rate: 2.0}))
 let uniformDist: GenericDist_Types.genericDist = Symbolic(#Uniform({low: 9.0, high: 10.0}))
+let floatDist: GenericDist_Types.genericDist = Symbolic(#Float(1e1))
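The fixtures above have closed-form means that the invariant tests below lean on (the commented-out `uniformMean` and `betaMean` calculations, for instance). A quick stdlib-Python check of those formulas, with a Monte Carlo cross-check for the beta case; variable names here are illustrative, not from the codebase:

```python
import random

random.seed(0)

# Closed-form means of some of the fixture distributions
beta_mean = 2.0 / (2.0 + 5.0)              # Beta(alpha=2, beta=5): a / (a + b)
uniform_mean = (9.0 + 10.0) / 2.0          # Uniform(low=9, high=10)
triangular_mean = (1.0 + 2.0 + 3.0) / 3.0  # Triangular(low, medium, high)
exponential_mean = 1.0 / 2.0               # Exponential(rate=2): 1 / rate

# Monte Carlo cross-check for the beta fixture
samples = [random.betavariate(2.0, 5.0) for _ in range(100_000)]
mc_beta_mean = sum(samples) / len(samples)

print(beta_mean)     # ~0.2857
print(mc_beta_mean)  # within ~0.01 of beta_mean
```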
(unnamed file)

@@ -0,0 +1,368 @@
+/*
+This file is like a half measure between one-off unit tests and proper invariant validation.
+As such, I'm not that excited about it, though it does provide some structure and will alarm us
+when things substantially change.
+Also, there are some open comments in https://github.com/quantified-uncertainty/squiggle/pull/232 that haven't been addressed.
+*/
+
+open Jest
+open Expect
+open TestHelpers
+
+let {
+  normalDist5, // mean=5, stdev=2
+  normalDist10, // mean=10, stdev=2
+  normalDist20, // mean=20, stdev=2
+  normalDist, // mean=5; stdev=2
+  uniformDist, // low=9; high=10
+  betaDist, // alpha=2; beta=5
+  lognormalDist, // mu=0; sigma=1
+  cauchyDist, // local=1; scale=1
+  triangularDist, // low=1; medium=2; high=3;
+  exponentialDist, // rate=2
+} = module(GenericDist_Fixtures)
+
+let {
+  algebraicAdd,
+  algebraicMultiply,
+  algebraicDivide,
+  algebraicSubtract,
+  algebraicLogarithm,
+  algebraicPower,
+} = module(DistributionOperation.Constructors)
+
+let algebraicAdd = algebraicAdd(~env)
+let algebraicMultiply = algebraicMultiply(~env)
+let algebraicDivide = algebraicDivide(~env)
+let algebraicSubtract = algebraicSubtract(~env)
+let algebraicLogarithm = algebraicLogarithm(~env)
+let algebraicPower = algebraicPower(~env)
+
+describe("(Algebraic) addition of distributions", () => {
+  describe("mean", () => {
+    test("normal(mean=5) + normal(mean=20)", () => {
+      normalDist5
+      ->algebraicAdd(normalDist20)
+      ->E.R2.fmap(GenericDist_Types.Constructors.UsingDists.mean)
+      ->E.R2.fmap(run)
+      ->E.R2.fmap(toFloat)
+      ->E.R.toExn
+      ->expect
+      ->toBe(Some(2.5e1))
+    })
+
+    test("uniform(low=9, high=10) + beta(alpha=2, beta=5)", () => {
+      // let uniformMean = (9.0 +. 10.0) /. 2.0
+      // let betaMean = 1.0 /. (1.0 +. 5.0 /. 2.0)
+      let received =
+        uniformDist
+        ->algebraicAdd(betaDist)
+        ->E.R2.fmap(GenericDist_Types.Constructors.UsingDists.mean)
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=2.
+      | Some(x) => x->expect->toBeSoCloseTo(0.01927225696028752, ~digits=1) // (uniformMean +. betaMean)
+      }
+    })
+    test("beta(alpha=2, beta=5) + uniform(low=9, high=10)", () => {
+      // let uniformMean = (9.0 +. 10.0) /. 2.0
+      // let betaMean = 1.0 /. (1.0 +. 5.0 /. 2.0)
+      let received =
+        betaDist
+        ->algebraicAdd(uniformDist)
+        ->E.R2.fmap(GenericDist_Types.Constructors.UsingDists.mean)
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=2.
+      | Some(x) => x->expect->toBeSoCloseTo(0.019275414920485248, ~digits=1) // (uniformMean +. betaMean)
+      }
+    })
+  })
+  describe("pdf", () => {
+    // TEST IS WRONG. SEE STDEV ADDITION EXPRESSION.
+    testAll(
+      "(normal(mean=5) + normal(mean=5)).pdf (imprecise)",
+      list{8e0, 1e1, 1.2e1, 1.4e1},
+      x => {
+        let received =
+          normalDist10 // this should be normal(10, sqrt(8))
+          ->Ok
+          ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.pdf(d, x))
+          ->E.R2.fmap(run)
+          ->E.R2.fmap(toFloat)
+          ->E.R.toOption
+          ->E.O.flatten
+        let calculated =
+          normalDist5
+          ->algebraicAdd(normalDist5)
+          ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.pdf(d, x))
+          ->E.R2.fmap(run)
+          ->E.R2.fmap(toFloat)
+          ->E.R.toOption
+          ->E.O.flatten
+
+        switch received {
+        | None =>
+          "this branch occurs when the dispatch to Jstat on trusted input fails."
+          ->expect
+          ->toBe("never")
+        | Some(x) =>
+          switch calculated {
+          | None => "algebraicAdd has"->expect->toBe("failed")
+          | Some(y) => x->expect->toBeSoCloseTo(y, ~digits=0)
+          }
+        }
+      },
+    )
+    test("(normal(mean=10) + normal(mean=10)).pdf(1.9e1)", () => {
+      let received =
+        normalDist20
+        ->Ok
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.pdf(d, 1.9e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      let calculated =
+        normalDist10
+        ->algebraicAdd(normalDist10)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.pdf(d, 1.9e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      switch received {
+      | None =>
+        "this branch occurs when the dispatch to Jstat on trusted input fails."
+        ->expect
+        ->toBe("never")
+      | Some(x) =>
+        switch calculated {
+        | None => "algebraicAdd has"->expect->toBe("failed")
+        | Some(y) => x->expect->toBeSoCloseTo(y, ~digits=1)
+        }
+      }
+    })
+    test("(uniform(low=9, high=10) + beta(alpha=2, beta=5)).pdf(10)", () => {
+      let received =
+        uniformDist
+        ->algebraicAdd(betaDist)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.pdf(d, 1e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=4.
+      | Some(x) => x->expect->toBeSoCloseTo(0.001978994877226945, ~digits=3)
+      }
+    })
+    test("(beta(alpha=2, beta=5) + uniform(low=9, high=10)).pdf(10)", () => {
+      let received =
+        betaDist
+        ->algebraicAdd(uniformDist)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.pdf(d, 1e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=4.
+      | Some(x) => x->expect->toBeSoCloseTo(0.001978994877226945, ~digits=3)
+      }
+    })
+  })
+  describe("cdf", () => {
+    testAll("(normal(mean=5) + normal(mean=5)).cdf (imprecise)", list{6e0, 8e0, 1e1, 1.2e1}, x => {
+      let received =
+        normalDist10
+        ->Ok
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.cdf(d, x))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      let calculated =
+        normalDist5
+        ->algebraicAdd(normalDist5)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.cdf(d, x))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+
+      switch received {
+      | None =>
+        "this branch occurs when the dispatch to Jstat on trusted input fails."
+        ->expect
+        ->toBe("never")
+      | Some(x) =>
+        switch calculated {
+        | None => "algebraicAdd has"->expect->toBe("failed")
+        | Some(y) => x->expect->toBeSoCloseTo(y, ~digits=0)
+        }
+      }
+    })
+    test("(normal(mean=10) + normal(mean=10)).cdf(1.25e1)", () => {
+      let received =
+        normalDist20
+        ->Ok
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.cdf(d, 1.25e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      let calculated =
+        normalDist10
+        ->algebraicAdd(normalDist10)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.cdf(d, 1.25e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      switch received {
+      | None =>
+        "this branch occurs when the dispatch to Jstat on trusted input fails."
+        ->expect
+        ->toBe("never")
+      | Some(x) =>
+        switch calculated {
+        | None => "algebraicAdd has"->expect->toBe("failed")
+        | Some(y) => x->expect->toBeSoCloseTo(y, ~digits=2)
+        }
+      }
+    })
+    test("(uniform(low=9, high=10) + beta(alpha=2, beta=5)).cdf(10)", () => {
+      let received =
+        uniformDist
+        ->algebraicAdd(betaDist)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.cdf(d, 1e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=4.
+      | Some(x) => x->expect->toBeSoCloseTo(0.0013961779932477507, ~digits=3)
+      }
+    })
+    test("(beta(alpha=2, beta=5) + uniform(low=9, high=10)).cdf(10)", () => {
+      let received =
+        betaDist
+        ->algebraicAdd(uniformDist)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.cdf(d, 1e1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=4.
+      | Some(x) => x->expect->toBeSoCloseTo(0.001388898111625753, ~digits=3)
+      }
+    })
+  })
+
+  describe("inv", () => {
+    testAll("(normal(mean=5) + normal(mean=5)).inv (imprecise)", list{5e-2, 4.2e-3, 9e-3}, x => {
+      let received =
+        normalDist10
+        ->Ok
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.inv(d, x))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      let calculated =
+        normalDist5
+        ->algebraicAdd(normalDist5)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.inv(d, x))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+
+      switch received {
+      | None =>
+        "this branch occurs when the dispatch to Jstat on trusted input fails."
+        ->expect
+        ->toBe("never")
+      | Some(x) =>
+        switch calculated {
+        | None => "algebraicAdd has"->expect->toBe("failed")
+        | Some(y) => x->expect->toBeSoCloseTo(y, ~digits=-1)
+        }
+      }
+    })
+    test("(normal(mean=10) + normal(mean=10)).inv(1e-1)", () => {
+      let received =
+        normalDist20
+        ->Ok
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.inv(d, 1e-1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      let calculated =
+        normalDist10
+        ->algebraicAdd(normalDist10)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.inv(d, 1e-1))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toOption
+        ->E.O.flatten
+      switch received {
+      | None =>
+        "this branch occurs when the dispatch to Jstat on trusted input fails."
+        ->expect
+        ->toBe("never")
+      | Some(x) =>
+        switch calculated {
+        | None => "algebraicAdd has"->expect->toBe("failed")
+        | Some(y) => x->expect->toBeSoCloseTo(y, ~digits=-1)
+        }
+      }
+    })
+    test("(uniform(low=9, high=10) + beta(alpha=2, beta=5)).inv(2e-2)", () => {
+      let received =
+        uniformDist
+        ->algebraicAdd(betaDist)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.inv(d, 2e-2))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=2.
+      | Some(x) => x->expect->toBeSoCloseTo(10.927078217530806, ~digits=0)
+      }
+    })
+    test("(beta(alpha=2, beta=5) + uniform(low=9, high=10)).inv(2e-2)", () => {
+      let received =
+        betaDist
+        ->algebraicAdd(uniformDist)
+        ->E.R2.fmap(d => GenericDist_Types.Constructors.UsingDists.inv(d, 2e-2))
+        ->E.R2.fmap(run)
+        ->E.R2.fmap(toFloat)
+        ->E.R.toExn
+      switch received {
+      | None => "algebraicAdd has"->expect->toBe("failed")
+      // This is nondeterministic, we could be in a situation where ci fails but you click rerun and it passes, which is bad.
+      // sometimes it works with ~digits=2.
+      | Some(x) => x->expect->toBeSoCloseTo(10.915396627014363, ~digits=0)
+      }
+    })
+  })
+})
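The `TEST IS WRONG. SEE STDEV ADDITION EXPRESSION.` comment in the file above refers to the fact that summing independent normals adds variances, not standard deviations: normal(5, 2) + normal(5, 2) is normal(10, sqrt(8)), not the fixture normalDist10 = normal(10, 2). A Python sketch of the discrepancy, using only the normal pdf formula (no project code):

```python
import math

def normal_pdf(x, mean, stdev):
    # Density of the normal distribution N(mean, stdev)
    return math.exp(-((x - mean) ** 2) / (2 * stdev**2)) / (stdev * math.sqrt(2 * math.pi))

# For independent normals, means add and variances add.
mean_sum = 5.0 + 5.0                      # 10
stdev_sum = math.sqrt(2.0**2 + 2.0**2)    # sqrt(8) ~ 2.828, not 2

for x in [8.0, 10.0, 12.0, 14.0]:
    fixture = normal_pdf(x, 10.0, 2.0)           # what the test compares against
    correct = normal_pdf(x, mean_sum, stdev_sum)  # the true sum distribution
    print(x, fixture, correct)  # the two diverge, hence the loose ~digits=0 check
```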
(unnamed file)

@@ -0,0 +1,140 @@
+/*
+This is the most basic file in our invariants family of tests.
+
+See document in https://github.com/quantified-uncertainty/squiggle/pull/238 for details
+
+Note: digits parameter should be higher than -4.
+*/
+
+open Jest
+open Expect
+open TestHelpers
+
+let {
+  algebraicAdd,
+  algebraicMultiply,
+  algebraicDivide,
+  algebraicSubtract,
+  algebraicLogarithm,
+  algebraicPower,
+} = module(DistributionOperation.Constructors)
+
+let algebraicAdd = algebraicAdd(~env)
+let algebraicMultiply = algebraicMultiply(~env)
+let algebraicDivide = algebraicDivide(~env)
+let algebraicSubtract = algebraicSubtract(~env)
+let algebraicLogarithm = algebraicLogarithm(~env)
+let algebraicPower = algebraicPower(~env)
+
+describe("Mean", () => {
+  let digits = -4
+
+  let mean = GenericDist_Types.Constructors.UsingDists.mean
+
+  let runMean: result<DistributionTypes.genericDist, DistributionTypes.error> => float = distR => {
+    distR
+    ->E.R2.fmap(mean)
+    ->E.R2.fmap(run)
+    ->E.R2.fmap(toFloat)
+    ->E.R.toExn
+    ->E.O2.toExn("Shouldn't see this because we trust testcase input")
+  }
+
+  let impossiblePath: string => assertion = algebraicOp =>
+    `${algebraicOp} has`->expect->toEqual("failed")
+
+  let distributions = list{
+    normalMake(0.0, 1e0),
+    betaMake(2e0, 4e0),
+    exponentialMake(1.234e0),
+    uniformMake(7e0, 1e1),
+    // cauchyMake(1e0, 1e0),
+    lognormalMake(1e0, 1e0),
+    triangularMake(1e0, 1e1, 5e1),
+    Ok(floatMake(1e1)),
+  }
+  let combinations = E.L.combinations2(distributions)
+  let zipDistsDists = E.L.zip(distributions, distributions)
+
+  let testOperationMean = (
+    distOp: (DistributionTypes.genericDist, DistributionTypes.genericDist) => result<DistributionTypes.genericDist, DistributionTypes.error>,
+    description: string,
+    floatOp: (float, float) => float,
+    dist1': result<SymbolicDistTypes.symbolicDist, string>,
+    dist2': result<SymbolicDistTypes.symbolicDist, string>
+  ) => {
+    let dist1 = dist1'->E.R2.fmap(x=>DistributionTypes.Symbolic(x))->E.R2.fmap2(s=>DistributionTypes.Other(s))
+    let dist2 = dist2'->E.R2.fmap(x=>DistributionTypes.Symbolic(x))->E.R2.fmap2(s=>DistributionTypes.Other(s))
+    let received =
+      E.R.liftJoin2(distOp, dist1, dist2)
+      ->E.R2.fmap(mean)
+      ->E.R2.fmap(run)
+      ->E.R2.fmap(toFloat)
+    let expected = floatOp(runMean(dist1), runMean(dist2))
+    switch received {
+    | Error(err) => impossiblePath(description)
+    | Ok(x) =>
+      switch x {
+      | None => impossiblePath(description)
+      | Some(x) => x->expect->toBeSoCloseTo(expected, ~digits)
+      }
+    }
+  }
+
+  describe("addition", () => {
+    let testAdditionMean = testOperationMean(algebraicAdd, "algebraicAdd", \"+.")
+
+    testAll("homogeneous addition", zipDistsDists, dists => {
+      let (dist1, dist2) = dists
+      testAdditionMean(dist1, dist2)
+    })
+
+    testAll("heterogeneous addition (1)", combinations, dists => {
+      let (dist1, dist2) = dists
+      testAdditionMean(dist1, dist2)
+    })
+
+    testAll("heterogeneous addition (commuted of 1 (or; 2))", combinations, dists => {
+      let (dist1, dist2) = dists
+      testAdditionMean(dist2, dist1)
+    })
+  })
+
+  describe("subtraction", () => {
+    let testSubtractionMean = testOperationMean(algebraicSubtract, "algebraicSubtract", \"-.")
+
+    testAll("homogeneous subtraction", zipDistsDists, dists => {
+      let (dist1, dist2) = dists
+      testSubtractionMean(dist1, dist2)
+    })
+
+    testAll("heterogeneous subtraction (1)", combinations, dists => {
+      let (dist1, dist2) = dists
+      testSubtractionMean(dist1, dist2)
+    })
+
+    testAll("heterogeneous subtraction (commuted of 1 (or; 2))", combinations, dists => {
+      let (dist1, dist2) = dists
+      testSubtractionMean(dist2, dist1)
+    })
+  })
+
+  describe("multiplication", () => {
+    let testMultiplicationMean = testOperationMean(algebraicMultiply, "algebraicMultiply", \"*.")
|
||||||
|
|
||||||
|
testAll("homogeneous subtraction", zipDistsDists, dists => {
|
||||||
|
let (dist1, dist2) = dists
|
||||||
|
testMultiplicationMean(dist1, dist2)
|
||||||
|
})
|
||||||
|
|
||||||
|
testAll("heterogeneoous subtraction (1)", combinations, dists => {
|
||||||
|
let (dist1, dist2) = dists
|
||||||
|
testMultiplicationMean(dist1, dist2)
|
||||||
|
})
|
||||||
|
|
||||||
|
testAll("heterogeneoous subtraction (commuted of 1 (or; 2))", combinations, dists => {
|
||||||
|
let (dist1, dist2) = dists
|
||||||
|
testMultiplicationMean(dist2, dist1)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
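The homogeneous and heterogeneous cases above all check the same invariant: for independent X and Y, E[X ± Y] = E[X] ± E[Y] and E[X · Y] = E[X] · E[Y]. A minimal Monte Carlo sketch of that invariant in plain Python (illustrative only, not the Squiggle implementation; distribution parameters mirror two of the test fixtures):

```python
import random

def mc_mean(sample, n=100_000):
    """Monte Carlo estimate of a distribution's mean from a sampler."""
    return sum(sample() for _ in range(n)) / n

random.seed(0)
norm = lambda: random.gauss(0.0, 1.0)     # like normalMake(0.0, 1e0)
expo = lambda: random.expovariate(1.234)  # like exponentialMake(1.234e0), mean 1/1.234

# E[X + Y] == E[X] + E[Y]; E[X * Y] == E[X] * E[Y] for independent X, Y
sum_mean = mc_mean(lambda: norm() + expo())
prod_mean = mc_mean(lambda: norm() * expo())

assert abs(sum_mean - (0.0 + 1 / 1.234)) < 0.05
assert abs(prod_mean - 0.0 * (1 / 1.234)) < 0.05
```

The tolerances play the role of `toBeSoCloseTo(..., ~digits)` above: sampling noise means the identity only holds approximately.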
@@ -1,70 +1,65 @@
open Jest
open Expect
open TestHelpers

describe("mixture", () => {
  testAll(
    "fair mean of two normal distributions",
    list{(0.0, 1e2), (-1e1, -1e-4), (-1e1, 1e2), (-1e1, 1e1)},
    tup => {
      // should be property
      let (mean1, mean2) = tup
      let meanValue = {
        run(Mixture([(mkNormal(mean1, 9e-1), 0.5), (mkNormal(mean2, 9e-1), 0.5)]))->outputMap(
          FromDist(ToFloat(#Mean)),
        )
      }
      meanValue->unpackFloat->expect->toBeSoCloseTo((mean1 +. mean2) /. 2.0, ~digits=-1)
    },
  )
  testAll(
    "weighted mean of a beta and an exponential",
    // This would not survive property testing, it was easy for me to find cases that NaN'd out.
    list{((128.0, 1.0), 2.0), ((2e-1, 64.0), 16.0), ((1e0, 1e0), 64.0)},
    tup => {
      let ((alpha, beta), rate) = tup
      let betaWeight = 0.25
      let exponentialWeight = 0.75
      let meanValue = {
        run(
          Mixture([(mkBeta(alpha, beta), betaWeight), (mkExponential(rate), exponentialWeight)]),
        )->outputMap(FromDist(ToFloat(#Mean)))
      }
      let betaMean = 1.0 /. (1.0 +. beta /. alpha)
      let exponentialMean = 1.0 /. rate
      meanValue
      ->unpackFloat
      ->expect
      ->toBeSoCloseTo(betaWeight *. betaMean +. exponentialWeight *. exponentialMean, ~digits=-1)
    },
  )
  testAll(
    "weighted mean of lognormal and uniform",
    // Would not survive property tests: very easy to find cases that NaN out.
    list{((-1e2, 1e1), (2e0, 1e0)), ((-1e-16, 1e-16), (1e-8, 1e0)), ((0.0, 1e0), (1e0, 1e-2))},
    tup => {
      let ((low, high), (mu, sigma)) = tup
      let uniformWeight = 0.6
      let lognormalWeight = 0.4
      let meanValue = {
        run(
          Mixture([
            (mkUniform(low, high), uniformWeight),
            (mkLognormal(mu, sigma), lognormalWeight),
          ]),
        )->outputMap(FromDist(ToFloat(#Mean)))
      }
      let uniformMean = (low +. high) /. 2.0
      let lognormalMean = mu +. sigma ** 2.0 /. 2.0
      meanValue
      ->unpackFloat
      ->expect
      ->toBeSoCloseTo(uniformWeight *. uniformMean +. lognormalWeight *. lognormalMean, ~digits=-1)
    },
  )
})
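The weighted cases reduce to a closed form: the mean of a mixture is the weight-averaged mean of its components, Σ wᵢ·μᵢ. A quick sketch of the beta-plus-exponential expectation used above, in plain Python (illustrative only, not the Squiggle code):

```python
def mixture_mean(components):
    """Mean of a mixture: sum over (component mean, weight) pairs."""
    return sum(w * m for m, w in components)

def beta_mean(alpha, beta):
    # 1/(1 + beta/alpha) is algebraically alpha/(alpha + beta)
    return 1.0 / (1.0 + beta / alpha)

def exponential_mean(rate):
    return 1.0 / rate

# mirrors the ((128.0, 1.0), 2.0) test case with weights 0.25 / 0.75
m = mixture_mean([(beta_mean(128.0, 1.0), 0.25), (exponential_mean(2.0), 0.75)])
assert abs(m - (0.25 * (128.0 / 129.0) + 0.75 * 0.5)) < 1e-12
```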
@@ -38,4 +38,3 @@ describe("Continuous and discrete splits", () => {
  let toArr2 = discrete2 |> E.FloatFloatMap.toArray
  makeTest("splitMedium at count=500", toArr2 |> Belt.Array.length, 500)
})
@@ -3,131 +3,115 @@ open Expect
open TestHelpers

// TODO: use Normal.make (but preferably after the new validation dispatch is in)
let mkNormal = (mean, stdev) => DistributionTypes.Symbolic(#Normal({mean: mean, stdev: stdev}))

describe("(Symbolic) normalize", () => {
  testAll("has no impact on normal distributions", list{-1e8, -1e-2, 0.0, 1e-4, 1e16}, mean => {
    let normalValue = mkNormal(mean, 2.0)
    let normalizedValue = run(FromDist(ToDist(Normalize), normalValue))
    normalizedValue->unpackDist->expect->toEqual(normalValue)
  })
})

describe("(Symbolic) mean", () => {
  testAll("of normal distributions", list{-1e8, -16.0, -1e-2, 0.0, 1e-4, 32.0, 1e16}, mean => {
    run(FromDist(ToFloat(#Mean), mkNormal(mean, 4.0)))->unpackFloat->expect->toBeCloseTo(mean)
  })

  Skip.test("of normal(0, -1) (it NaNs out)", () => {
    run(FromDist(ToFloat(#Mean), mkNormal(1e1, -1e0)))->unpackFloat->expect->ExpectJs.toBeFalsy
  })

  test("of normal(0, 1e-8) (it doesn't freak out at tiny stdev)", () => {
    run(FromDist(ToFloat(#Mean), mkNormal(0.0, 1e-8)))->unpackFloat->expect->toBeCloseTo(0.0)
  })

  testAll("of exponential distributions", list{1e-7, 2.0, 10.0, 100.0}, rate => {
    let meanValue = run(
      FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Exponential({rate: rate}))),
    )
    meanValue->unpackFloat->expect->toBeCloseTo(1.0 /. rate) // https://en.wikipedia.org/wiki/Exponential_distribution#Mean,_variance,_moments,_and_median
  })

  test("of a cauchy distribution", () => {
    let meanValue = run(
      FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Cauchy({local: 1.0, scale: 1.0}))),
    )
    meanValue->unpackFloat->expect->toBeSoCloseTo(1.0098094001641797, ~digits=5)
    //-> toBe(GenDistError(Other("Cauchy distributions may have no mean value.")))
  })

  testAll(
    "of triangular distributions",
    list{(1.0, 2.0, 3.0), (-1e7, -1e-7, 1e-7), (-1e-7, 1e0, 1e7), (-1e-16, 0.0, 1e-16)},
    tup => {
      let (low, medium, high) = tup
      let meanValue = run(
        FromDist(
          ToFloat(#Mean),
          DistributionTypes.Symbolic(#Triangular({low: low, medium: medium, high: high})),
        ),
      )
      meanValue->unpackFloat->expect->toBeCloseTo((low +. medium +. high) /. 3.0) // https://www.statology.org/triangular-distribution/
    },
  )

  // TODO: nonpositive inputs are SUPPOSED to crash.
  testAll(
    "of beta distributions",
    list{(1e-4, 6.4e1), (1.28e2, 1e0), (1e-16, 1e-16), (1e16, 1e16), (-1e4, 1e1), (1e1, -1e4)},
    tup => {
      let (alpha, beta) = tup
      let meanValue = run(
        FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Beta({alpha: alpha, beta: beta}))),
      )
      meanValue->unpackFloat->expect->toBeCloseTo(1.0 /. (1.0 +. beta /. alpha)) // https://en.wikipedia.org/wiki/Beta_distribution#Mean
    },
  )

  // TODO: When we have our theory of validators we won't want this to be NaN but to be an error.
  test("of beta(0, 0)", () => {
    let meanValue = run(
      FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Beta({alpha: 0.0, beta: 0.0}))),
    )
    meanValue->unpackFloat->expect->ExpectJs.toBeFalsy
  })

  testAll(
    "of lognormal distributions",
    list{(2.0, 4.0), (1e-7, 1e-2), (-1e6, 10.0), (1e3, -1e2), (-1e8, -1e4), (1e2, 1e-5)},
    tup => {
      let (mu, sigma) = tup
      let meanValue = run(
        FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Lognormal({mu: mu, sigma: sigma}))),
      )
      meanValue->unpackFloat->expect->toBeCloseTo(Js.Math.exp(mu +. sigma ** 2.0 /. 2.0)) // https://brilliant.org/wiki/log-normal-distribution/
    },
  )

  testAll(
    "of uniform distributions",
    list{(1e-5, 12.345), (-1e4, 1e4), (-1e16, -1e2), (5.3e3, 9e9)},
    tup => {
      let (low, high) = tup
      let meanValue = run(
        FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Uniform({low: low, high: high}))),
      )
      meanValue->unpackFloat->expect->toBeCloseTo((low +. high) /. 2.0) // https://en.wikipedia.org/wiki/Continuous_uniform_distribution#Moments
    },
  )

  test("of a float", () => {
    let meanValue = run(FromDist(ToFloat(#Mean), DistributionTypes.Symbolic(#Float(7.7))))
    meanValue->unpackFloat->expect->toBeCloseTo(7.7)
  })
})

describe("Normal distribution with sparklines", () => {
  let parameterWiseAdditionPdf = (n1: SymbolicDistTypes.normal, n2: SymbolicDistTypes.normal) => {
    let normalDistAtSumMeanConstr = SymbolicDist.Normal.add(n1, n2)
    let normalDistAtSumMean: SymbolicDistTypes.normal = switch normalDistAtSumMeanConstr {
    | #Normal(params) => params
    }
    x => SymbolicDist.Normal.pdf(x, normalDistAtSumMean)
  }

@@ -138,24 +122,25 @@ describe("Normal distribution with sparklines", () => {
  test("mean=5 pdf", () => {
    let pdfNormalDistAtMean5 = x => SymbolicDist.Normal.pdf(x, normalDistAtMean5)
    let sparklineMean5 = fnImage(pdfNormalDistAtMean5, range20Float)
    Sparklines.create(sparklineMean5, ())
    ->expect
    ->toEqual(`▁▂▃▆██▇▅▂▁▁▁▁▁▁▁▁▁▁▁`)
  })

  test("parameter-wise addition of two normal distributions", () => {
    let sparklineMean15 =
      normalDistAtMean5->parameterWiseAdditionPdf(normalDistAtMean10)->fnImage(range20Float)
    Sparklines.create(sparklineMean15, ())
    ->expect
    ->toEqual(`▁▁▁▁▁▁▁▁▁▂▃▄▆███▇▅▄▂`)
  })

  test("mean=10 cdf", () => {
    let cdfNormalDistAtMean10 = x => SymbolicDist.Normal.cdf(x, normalDistAtMean10)
    let sparklineMean10 = fnImage(cdfNormalDistAtMean10, range20Float)
    Sparklines.create(sparklineMean10, ())
    ->expect
    ->toEqual(`▁▁▁▁▁▁▁▁▂▄▅▇████████`)
  })
})
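Each symbolic-mean assertion above checks a textbook closed form. The formulas, collected in one place as a plain-Python sketch (illustrative reference only, not the ReScript implementation):

```python
import math

def normal_mean(mean, stdev):        return mean
def exponential_mean(rate):          return 1.0 / rate
def triangular_mean(low, med, high): return (low + med + high) / 3.0
def beta_mean(alpha, beta):          return 1.0 / (1.0 + beta / alpha)
def lognormal_mean(mu, sigma):       return math.exp(mu + sigma ** 2 / 2.0)
def uniform_mean(low, high):         return (low + high) / 2.0

assert exponential_mean(2.0) == 0.5
assert triangular_mean(1.0, 2.0, 3.0) == 2.0
assert abs(beta_mean(128.0, 1.0) - 128.0 / 129.0) < 1e-12
assert abs(lognormal_mean(2.0, 4.0) - math.exp(10.0)) < 1e-6
assert uniform_mean(-1e4, 1e4) == 0.0
```

Note the Cauchy distribution is absent: its mean is undefined, which is why the test above pins an arbitrary numeric artifact and carries a commented-out "may have no mean value" alternative.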
@@ -3,8 +3,8 @@ open Expect

let makeTest = (~only=false, str, item1, item2) =>
  only
    ? Only.test(str, () => expect(item1)->toEqual(item2))
    : test(str, () => expect(item1)->toEqual(item2))

describe("Lodash", () =>
  describe("Lodash", () => {
@@ -6,8 +6,7 @@ open Expect
let expectEvalToBe = (expr: string, answer: string) =>
  Reducer.evaluate(expr)->ExpressionValue.toStringResult->expect->toBe(answer)

let testEval = (expr, answer) => test(expr, () => expectEvalToBe(expr, answer))

describe("builtin", () => {
  // All MathJs operators and functions are available for string, number and boolean
@@ -14,7 +14,8 @@ let testDescriptionParse = (desc, expr, answer) => test(desc, () => expectParseT
module MySkip = {
  let testParse = (expr, answer) => Skip.test(expr, () => expectParseToBe(expr, answer))
  let testDescriptionParse = (desc, expr, answer) =>
    Skip.test(desc, () => expectParseToBe(expr, answer))
}

describe("MathJs parse", () => {

@@ -60,7 +61,8 @@ describe("MathJs parse", () => {
  MySkip.testDescriptionParse("define", "# This is a comment", "???")
})

describe("if statement", () => {
  // TODO: use the ternary operator instead
  MySkip.testDescriptionParse("define", "if (true) { 1 } else { 0 }", "???")
})
})
@@ -3,7 +3,8 @@ open Reducer_TestHelpers

let testParseToBe = (expr, answer) => test(expr, () => expectParseToBe(expr, answer))

let testDescriptionParseToBe = (desc, expr, answer) =>
  test(desc, () => expectParseToBe(expr, answer))

let testEvalToBe = (expr, answer) => test(expr, () => expectEvalToBe(expr, answer))
@@ -44,13 +45,21 @@ describe("reducer using mathjs parse", () => {
  })
  describe("multi-line", () => {
    testParseToBe("1; 2", "Ok((:$$bindExpression (:$$bindStatement (:$$bindings) 1) 2))")
    testParseToBe(
      "1+1; 2+1",
      "Ok((:$$bindExpression (:$$bindStatement (:$$bindings) (:add 1 1)) (:add 2 1)))",
    )
  })
  describe("assignment", () => {
    testParseToBe(
      "x=1; x",
      "Ok((:$$bindExpression (:$$bindStatement (:$$bindings) (:$let :x 1)) :x))",
    )
    testParseToBe(
      "x=1+1; x+1",
      "Ok((:$$bindExpression (:$$bindStatement (:$$bindings) (:$let :x (:add 1 1))) (:add :x 1)))",
    )
  })
})

describe("eval", () => {

@@ -101,5 +110,9 @@ describe("test exceptions", () => {
    "javascriptraise('div by 0')",
    "Error(JS Exception: Error: 'div by 0')",
  )
  testDescriptionEvalToBe(
    "rescript exception",
    "rescriptraise()",
    "Error(TODO: unhandled rescript exception)",
  )
})
@@ -90,16 +90,8 @@ describe("eval on distribution functions", () => {
  })

  describe("mixture", () => {
    testEval("mx(normal(5,2), normal(10,1), normal(15, 1))", "Ok(Point Set Distribution)")
    testEval("mixture(normal(5,2), normal(10,1), [0.2, 0.4])", "Ok(Point Set Distribution)")
  })
})
@@ -111,7 +103,11 @@ describe("parse on distribution functions", () => {
  })
  describe("pointwise arithmetic expressions", () => {
    testParse(~skip=true, "normal(5,2) .+ normal(5,1)", "Ok((:dotAdd (:normal 5 2) (:normal 5 1)))")
    testParse(
      ~skip=true,
      "normal(5,2) .- normal(5,1)",
      "Ok((:dotSubtract (:normal 5 2) (:normal 5 1)))",
    )
    testParse("normal(5,2) .* normal(5,1)", "Ok((:dotMultiply (:normal 5 2) (:normal 5 1)))")
    testParse("normal(5,2) ./ normal(5,1)", "Ok((:dotDivide (:normal 5 2) (:normal 5 1)))")
    testParse("normal(5,2) .^ normal(5,1)", "Ok((:dotPow (:normal 5 2) (:normal 5 1)))")
@@ -3,24 +3,41 @@ open Expect

let makeTest = (~only=false, str, item1, item2) =>
  only
    ? Only.test(str, () => expect(item1)->toEqual(item2))
    : test(str, () => expect(item1)->toEqual(item2))

let {toFloat, toDist, toString, toError, fmap} = module(DistributionOperation.Output)

let fnImage = (theFn, inps) => Js.Array.map(theFn, inps)

let env: DistributionOperation.env = {
  sampleCount: 10000,
  xyPointLength: 1000,
}

let run = DistributionOperation.run(~env)
let outputMap = fmap(~env)
let unreachableInTestFileMessage = "Should be impossible to reach (This error is in test file)"
let toExtFloat: option<float> => float = E.O.toExt(unreachableInTestFileMessage)
let toExtDist: option<DistributionTypes.genericDist> => DistributionTypes.genericDist = E.O.toExt(
  unreachableInTestFileMessage,
)
// let toExt: option<'a> => 'a = E.O.toExt(unreachableInTestFileMessage)
let unpackFloat = x => x->toFloat->toExtFloat
let unpackDist = y => y->toDist->toExtDist

let mkNormal = (mean, stdev) => DistributionTypes.Symbolic(#Normal({mean: mean, stdev: stdev}))
let mkBeta = (alpha, beta) => DistributionTypes.Symbolic(#Beta({alpha: alpha, beta: beta}))
let mkExponential = rate => DistributionTypes.Symbolic(#Exponential({rate: rate}))
let mkUniform = (low, high) => DistributionTypes.Symbolic(#Uniform({low: low, high: high}))
let mkCauchy = (local, scale) => DistributionTypes.Symbolic(#Cauchy({local: local, scale: scale}))
let mkLognormal = (mu, sigma) => DistributionTypes.Symbolic(#Lognormal({mu: mu, sigma: sigma}))

let normalMake = SymbolicDist.Normal.make
let betaMake = SymbolicDist.Beta.make
let exponentialMake = SymbolicDist.Exponential.make
let uniformMake = SymbolicDist.Uniform.make
let cauchyMake = SymbolicDist.Cauchy.make
let lognormalMake = SymbolicDist.Lognormal.make
let triangularMake = SymbolicDist.Triangular.make
let floatMake = SymbolicDist.Float.make
10 packages/squiggle-lang/__tests__/Utility_test.res (new file)

@@ -0,0 +1,10 @@
open Jest
open Expect

describe("E.L.combinations2", () => {
  test("size three", () => {
    E.L.combinations2(list{"alice", "bob", "eve"})
    ->expect
    ->toEqual(list{("alice", "bob"), ("alice", "eve"), ("bob", "eve")})
  })
})
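`E.L.combinations2` enumerates unordered pairs of distinct list elements, in order. Python's standard library has the same operation; a sketch of the equivalence (illustrative, not Squiggle code):

```python
from itertools import combinations

def combinations2(xs):
    """All unordered pairs of distinct positions, in order — like E.L.combinations2."""
    return list(combinations(xs, 2))

pairs = combinations2(["alice", "bob", "eve"])
assert pairs == [("alice", "bob"), ("alice", "eve"), ("bob", "eve")]
```

For a list of length n this yields n·(n−1)/2 pairs, which is why the invariant tests above can cover every heterogeneous combination cheaply.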
@@ -3,8 +3,8 @@ open Expect

let makeTest = (~only=false, str, item1, item2) =>
  only
    ? Only.test(str, () => expect(item1)->toEqual(item2))
    : test(str, () => expect(item1)->toEqual(item2))

let pointSetDist1: PointSetTypes.xyShape = {xs: [1., 4., 8.], ys: [0.2, 0.4, 0.8]}

@@ -21,7 +21,11 @@ let pointSetDist3: PointSetTypes.xyShape = {
describe("XYShapes", () => {
  describe("logScorePoint", () => {
    makeTest("When identical", XYShape.logScorePoint(30, pointSetDist1, pointSetDist1), Some(0.0))
    makeTest(
      "When similar",
      XYShape.logScorePoint(30, pointSetDist1, pointSetDist2),
      Some(1.658971191043856),
    )
    makeTest(
      "When very different",
      XYShape.logScorePoint(30, pointSetDist1, pointSetDist3),
@@ -10,6 +10,7 @@
     "test:reducer": "jest --testPathPattern '.*__tests__/Reducer.*'",
     "test": "jest",
     "test:watch": "jest --watchAll",
+    "test:quick": "jest --modulePathIgnorePatterns=__tests__/Distributions/Invariants/*",
     "coverage": "rm -f *.coverage; yarn clean; BISECT_ENABLE=yes yarn build; yarn test; bisect-ppx-report html",
     "coverage:ci": "yarn clean; BISECT_ENABLE=yes yarn build; yarn test; bisect-ppx-report send-to Codecov",
     "lint:rescript": "./lint.sh",
@@ -1,6 +1,6 @@
 type functionCallInfo = GenericDist_Types.Operation.genericFunctionCallInfo
-type genericDist = GenericDist_Types.genericDist
-type error = GenericDist_Types.error
+type genericDist = DistributionTypes.genericDist
+type error = DistributionTypes.error

 // TODO: It could be great to use a cache for some calculations (basically, do memoization). Also, better analytics/tracking could go a long way.
@@ -39,57 +39,52 @@ module Output: {
 }

 module Constructors: {
   @genType
   let mean: (~env: env, genericDist) => result<float, error>
   @genType
   let sample: (~env: env, genericDist) => result<float, error>
   @genType
   let cdf: (~env: env, genericDist, float) => result<float, error>
   @genType
   let inv: (~env: env, genericDist, float) => result<float, error>
   @genType
   let pdf: (~env: env, genericDist, float) => result<float, error>
   @genType
   let normalize: (~env: env, genericDist) => result<genericDist, error>
   @genType
   let toPointSet: (~env: env, genericDist) => result<genericDist, error>
   @genType
   let toSampleSet: (~env: env, genericDist, int) => result<genericDist, error>
   @genType
-  let truncate: (
-    ~env: env,
-    genericDist,
-    option<float>,
-    option<float>,
-  ) => result<genericDist, error>
+  let truncate: (~env: env, genericDist, option<float>, option<float>) => result<genericDist, error>
   @genType
   let inspect: (~env: env, genericDist) => result<genericDist, error>
   @genType
   let toString: (~env: env, genericDist) => result<string, error>
   @genType
   let toSparkline: (~env: env, genericDist, int) => result<string, error>
   @genType
   let algebraicAdd: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let algebraicMultiply: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let algebraicDivide: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let algebraicSubtract: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let algebraicLogarithm: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let algebraicPower: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let pointwiseAdd: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let pointwiseMultiply: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let pointwiseDivide: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let pointwiseSubtract: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let pointwiseLogarithm: (~env: env, genericDist, genericDist) => result<genericDist, error>
   @genType
   let pointwisePower: (~env: env, genericDist, genericDist) => result<genericDist, error>
 }
@@ -1,12 +1,15 @@
+@genType
 type genericDist =
   | PointSet(PointSetTypes.pointSetDist)
-  | SampleSet(array<float>)
+  | SampleSet(SampleSetDist.t)
   | Symbolic(SymbolicDistTypes.symbolicDist)

+@genType
 type error =
   | NotYetImplemented
   | Unreachable
   | DistributionVerticalShiftIsInvalid
+  | ArgumentError(string)
   | Other(string)

 module Operation = {
@@ -55,7 +58,11 @@ module DistributionOperation = {
   type fromDist =
     | ToFloat(Operation.toFloat)
     | ToDist(toDist)
-    | ToDistCombination(Operation.direction, Operation.arithmeticOperation, [#Dist(genericDist) | #Float(float)])
+    | ToDistCombination(
+        Operation.direction,
+        Operation.arithmeticOperation,
+        [#Dist(genericDist) | #Float(float)],
+      )
     | ToString

   type singleParamaterFunction =
@@ -1,6 +1,6 @@
 //TODO: multimodal, add interface, test somehow, track performance, refactor sampleSet, refactor ASTEvaluator.res.
-type t = GenericDist_Types.genericDist
-type error = GenericDist_Types.error
+type t = DistributionTypes.genericDist
+type error = DistributionTypes.error
 type toPointSetFn = t => result<PointSetTypes.pointSetDist, error>
 type toSampleSetFn = t => result<SampleSetDist.t, error>
 type scaleMultiplyFn = (t, float) => result<t, error>
@@ -115,7 +115,7 @@ module Truncate = {
     | Some(r) => Ok(r)
     | None =>
       toPointSetFn(t)->E.R2.fmap(t => {
-        GenericDist_Types.PointSet(PointSetDist.T.truncate(leftCutoff, rightCutoff, t))
+        DistributionTypes.PointSet(PointSetDist.T.truncate(leftCutoff, rightCutoff, t))
       })
     }
 }
@@ -168,7 +168,7 @@ module AlgebraicCombination = {
     ->E.R.bind(((t1, t2)) => {
       SampleSetDist.map2(~fn, ~t1, ~t2)->GenericDist_Types.Error.resultStringToResultError
     })
-    ->E.R2.fmap(r => GenericDist_Types.SampleSet(r))
+    ->E.R2.fmap(r => DistributionTypes.SampleSet(r))
   }

   //I'm (Ozzie) really just guessing here, very little idea what's best
@@ -206,7 +206,7 @@ module AlgebraicCombination = {
       arithmeticOperation,
       t1,
       t2,
-    )->E.R2.fmap(r => GenericDist_Types.PointSet(r))
+    )->E.R2.fmap(r => DistributionTypes.PointSet(r))
     }
   }
 }
@@ -229,7 +229,7 @@ let pointwiseCombination = (
       t2,
     )
   )
-  ->E.R2.fmap(r => GenericDist_Types.PointSet(r))
+  ->E.R2.fmap(r => DistributionTypes.PointSet(r))
 }

 let pointwiseCombinationFloat = (
@@ -239,7 +239,7 @@ let pointwiseCombinationFloat = (
   ~float: float,
 ): result<t, error> => {
   let m = switch arithmeticOperation {
-  | #Add | #Subtract => Error(GenericDist_Types.DistributionVerticalShiftIsInvalid)
+  | #Add | #Subtract => Error(DistributionTypes.DistributionVerticalShiftIsInvalid)
   | (#Multiply | #Divide | #Power | #Logarithm) as arithmeticOperation =>
     toPointSetFn(t)->E.R2.fmap(t => {
       //TODO: Move to PointSet codebase
@@ -254,7 +254,7 @@ let pointwiseCombinationFloat = (
     )
   })
   }
-  m->E.R2.fmap(r => GenericDist_Types.PointSet(r))
+  m->E.R2.fmap(r => DistributionTypes.PointSet(r))
 }

 //Note: The result should always cumulatively sum to 1. This would be good to test.
@@ -265,7 +265,7 @@ let mixture = (
   ~pointwiseAddFn: pointwiseAddFn,
 ) => {
   if E.A.length(values) == 0 {
-    Error(GenericDist_Types.Other("Mixture error: mixture must have at least 1 element"))
+    Error(DistributionTypes.Other("Mixture error: mixture must have at least 1 element"))
   } else {
     let totalWeight = values->E.A2.fmap(E.Tuple2.second)->E.A.Floats.sum
     let properlyWeightedValues =
@@ -1,27 +1,20 @@
-type genericDist =
-  | PointSet(PointSetTypes.pointSetDist)
-  | SampleSet(SampleSetDist.t)
-  | Symbolic(SymbolicDistTypes.symbolicDist)
+type genericDist = DistributionTypes.genericDist

 @genType
-type error =
-  | NotYetImplemented
-  | Unreachable
-  | DistributionVerticalShiftIsInvalid
-  | Other(string)
+type error = DistributionTypes.error

 @genType
 module Error = {
   type t = error

   let fromString = (s: string): t => Other(s)

   @genType
   let toString = (x: t) => {
     switch x {
     | NotYetImplemented => "Not Yet Implemented"
     | Unreachable => "Unreachable"
     | DistributionVerticalShiftIsInvalid => "Distribution Vertical Shift Is Invalid"
+    | ArgumentError(x) => `Argument Error: ${x}`
     | Other(s) => s
     }
   }
 }
@@ -100,7 +100,6 @@ let combineShapesContinuousContinuous = (
   s1: PointSetTypes.xyShape,
   s2: PointSetTypes.xyShape,
 ): PointSetTypes.xyShape => {
-
   // if we add the two distributions, we should probably use normal filters.
   // if we multiply the two distributions, we should probably use lognormal filters.
   let t1m = toDiscretePointMassesFromTriangulars(s1)
@@ -235,18 +235,10 @@ module T = Dist({
     let indefiniteIntegralStepwise = (p, h1) => h1 *. p ** 2.0 /. 2.0
     let indefiniteIntegralLinear = (p, a, b) => a *. p ** 2.0 /. 2.0 +. b *. p ** 3.0 /. 3.0

-    Analysis.integrate(
-      ~indefiniteIntegralStepwise,
-      ~indefiniteIntegralLinear,
-      t,
-    )
+    Analysis.integrate(~indefiniteIntegralStepwise, ~indefiniteIntegralLinear, t)
   }
   let variance = (t: t): float =>
-    XYShape.Analysis.getVarianceDangerously(
-      t,
-      mean,
-      Analysis.getMeanOfSquares,
-    )
+    XYShape.Analysis.getVarianceDangerously(t, mean, Analysis.getMeanOfSquares)
 })

 let downsampleEquallyOverX = (length, t): t =>
@@ -212,8 +212,7 @@ module T = Dist({
     let totalIntegralSum = discreteIntegralSum +. continuousIntegralSum

     let getMeanOfSquares = ({discrete, continuous}: t) => {
-      let discreteMean =
-        discrete |> Discrete.shapeMap(XYShape.T.square) |> Discrete.T.mean
+      let discreteMean = discrete |> Discrete.shapeMap(XYShape.T.square) |> Discrete.T.mean
       let continuousMean = continuous |> Continuous.Analysis.getMeanOfSquares
       (discreteMean *. discreteIntegralSum +. continuousMean *. continuousIntegralSum) /.
         totalIntegralSum
@@ -207,4 +207,4 @@ let toSparkline = (t: t, bucketCount) =>
   T.toContinuous(t)
   ->E.O2.fmap(Continuous.downsampleEquallyOverX(bucketCount))
   ->E.O2.toResult("toContinous Error: Could not convert into continuous distribution")
   ->E.R2.fmap(r => Continuous.getShape(r).ys->Sparklines.create())
@@ -14,10 +14,10 @@ type distributionType = [
   | #CDF
 ]

-type xyShape = XYShape.xyShape;
-type interpolationStrategy = XYShape.interpolationStrategy;
-type extrapolationStrategy = XYShape.extrapolationStrategy;
-type interpolator = XYShape.extrapolationStrategy;
+type xyShape = XYShape.xyShape
+type interpolationStrategy = XYShape.interpolationStrategy
+type extrapolationStrategy = XYShape.extrapolationStrategy
+type interpolator = XYShape.extrapolationStrategy

 @genType
 type rec continuousShape = {
@@ -81,7 +81,7 @@ module Triangular = {
     low < medium && medium < high
       ? Ok(#Triangular({low: low, medium: medium, high: high}))
       : Error("Triangular values must be increasing order.")
   let pdf = (x, t: t) => Jstat.Triangular.pdf(x, t.low, t.high, t.medium) // not obvious in jstat docs that high comes before medium?
   let cdf = (x, t: t) => Jstat.Triangular.cdf(x, t.low, t.high, t.medium)
   let inv = (p, t: t) => Jstat.Triangular.inv(p, t.low, t.high, t.medium)
   let sample = (t: t) => Jstat.Triangular.sample(t.low, t.high, t.medium)
@@ -141,6 +141,8 @@ module Lognormal = {
   }
   let divide = (l1, l2) => {
     let mu = l1.mu -. l2.mu
+    // We believe the ratiands will have covariance zero.
+    // See here https://stats.stackexchange.com/questions/21735/what-are-the-mean-and-variance-of-the-ratio-of-two-lognormal-variables for details
     let sigma = l1.sigma +. l2.sigma
     #Lognormal({mu: mu, sigma: sigma})
   }
@@ -346,7 +348,11 @@ module T = {
     | _ => #NoSolution
     }

-  let toPointSetDist = (~xSelection=#ByWeight, sampleCount, d: symbolicDist): PointSetTypes.pointSetDist =>
+  let toPointSetDist = (
+    ~xSelection=#ByWeight,
+    sampleCount,
+    d: symbolicDist,
+  ): PointSetTypes.pointSetDist =>
     switch d {
     | #Float(v) => Discrete(Discrete.make(~integralSumCache=Some(1.0), {xs: [v], ys: [1.0]}))
     | _ =>
@@ -21,4 +21,4 @@ let toPointSetDist = (samplingInputs, environment, node: node) =>
 let runFunction = (samplingInputs, environment, inputs, fn: ASTTypes.Function.t) => {
   let params = envs(samplingInputs, environment)
   ASTTypes.Function.run(params, inputs, fn)
 }
@@ -22,7 +22,7 @@ let makeSymbolicFromTwoFloats = (name, fn) =>
     ~inputTypes=[#Float, #Float],
     ~run=x =>
       switch x {
-      | [#Float(a), #Float(b)] => fn(a, b) |> E.R.fmap(r => (#SymbolicDist(r)))
+      | [#Float(a), #Float(b)] => fn(a, b) |> E.R.fmap(r => #SymbolicDist(r))
       | e => wrongInputsError(e)
       },
     (),
@@ -90,7 +90,8 @@ let floatFromDist = (
   switch t {
   | #SymbolicDist(s) =>
     SymbolicDist.T.operate(distToFloatOp, s) |> E.R.bind(_, v => Ok(#SymbolicDist(#Float(v))))
-  | #RenderedDist(rs) => PointSetDist.operate(distToFloatOp, rs) |> (v => Ok(#SymbolicDist(#Float(v))))
+  | #RenderedDist(rs) =>
+    PointSetDist.operate(distToFloatOp, rs) |> (v => Ok(#SymbolicDist(#Float(v))))
   }

 let verticalScaling = (scaleOp, rs, scaleBy) => {
@@ -125,10 +126,15 @@ module Multimodal = {
       ->E.R.bind(TypeSystem.TypedValue.toArray)
       ->E.R.bind(r => r |> E.A.fmap(TypeSystem.TypedValue.toFloat) |> E.A.R.firstErrorOrOpen)

-    E.R.merge(dists, weights) -> E.R.bind(((a, b)) =>
-      E.A.length(b) > E.A.length(a) ?
-        Error("Too many weights provided") :
-        Ok(E.A.zipMaxLength(a, b) |> E.A.fmap(((a, b)) => (a |> E.O.toExn(""), b |> E.O.default(1.0))))
+    E.R.merge(dists, weights)->E.R.bind(((a, b)) =>
+      E.A.length(b) > E.A.length(a)
+        ? Error("Too many weights provided")
+        : Ok(
+            E.A.zipMaxLength(a, b) |> E.A.fmap(((a, b)) => (
+              a |> E.O.toExn(""),
+              b |> E.O.default(1.0),
+            )),
+          )
     )
   | _ => Error("Needs items")
   }
@@ -86,11 +86,7 @@ module TypedValue = {
       |> E.R.fmap(r => #Array(r))
     | (#Hash(named), #Hash(r)) =>
       let keyValues =
-        named |> E.A.fmap(((name, intendedType)) => (
-          name,
-          intendedType,
-          Hash.getByName(r, name),
-        ))
+        named |> E.A.fmap(((name, intendedType)) => (name, intendedType, Hash.getByName(r, name)))
       let typedHash =
         keyValues
         |> E.A.fmap(((name, intendedType, optionNode)) =>
|
||||||
_coerceInputNodes(evaluationParams, t.inputTypes, t.shouldCoerceTypes),
|
_coerceInputNodes(evaluationParams, t.inputTypes, t.shouldCoerceTypes),
|
||||||
)
|
)
|
||||||
|
|
||||||
let run = (
|
let run = (evaluationParams: ASTTypes.evaluationParams, inputNodes: inputNodes, t: t) =>
|
||||||
evaluationParams: ASTTypes.evaluationParams,
|
|
||||||
inputNodes: inputNodes,
|
|
||||||
t: t,
|
|
||||||
) =>
|
|
||||||
inputsToTypedValues(evaluationParams, inputNodes, t)->E.R.bind(t.run)
|
inputsToTypedValues(evaluationParams, inputNodes, t)->E.R.bind(t.run)
|
||||||
|> (
|
|> (
|
||||||
x =>
|
x =>
|
||||||
|
|
|
@@ -6,7 +6,7 @@ module Js = Reducer_Js
 module MathJs = Reducer_MathJs

 @genType
-type expressionValue = Reducer_Expression.expressionValue
+type expressionValue = ReducerInterface_ExpressionValue.expressionValue

 @genType
 let evaluate: string => result<expressionValue, Reducer_ErrorValue.errorValue>
@@ -104,7 +104,7 @@ let rec reduceExpression = (expression: t, bindings: T.bindings): result<express
   }

   switch list {
-  | list{T.EValue(EvCall("$$bindings"))} => bindings->EBindings->Ok
+  | list{T.EValue(EvCall("$$bindings"))} => bindings->T.EBindings->Ok

   | list{T.EValue(EvCall("$$bindStatement")), T.EBindings(bindings), statement} =>
     doBindStatement(statement, bindings)
@@ -66,6 +66,64 @@ module Helpers = {
       dist1,
     )->runGenericOperation
   }
+
+  let parseNumber = (args: expressionValue): Belt.Result.t<float, string> =>
+    switch args {
+    | EvNumber(x) => Ok(x)
+    | _ => Error("Not a number")
+    }
+
+  let parseNumberArray = (ags: array<expressionValue>): Belt.Result.t<array<float>, string> =>
+    E.A.fmap(parseNumber, ags) |> E.A.R.firstErrorOrOpen
+
+  let parseDist = (args: expressionValue): Belt.Result.t<GenericDist_Types.genericDist, string> =>
+    switch args {
+    | EvDistribution(x) => Ok(x)
+    | EvNumber(x) => Ok(GenericDist.fromFloat(x))
+    | _ => Error("Not a distribution")
+    }
+
+  let parseDistributionArray = (ags: array<expressionValue>): Belt.Result.t<
+    array<GenericDist_Types.genericDist>,
+    string,
+  > => E.A.fmap(parseDist, ags) |> E.A.R.firstErrorOrOpen
+
+  let mixtureWithGivenWeights = (
+    distributions: array<GenericDist_Types.genericDist>,
+    weights: array<float>,
+  ): DistributionOperation.outputType =>
+    E.A.length(distributions) == E.A.length(weights)
+      ? Mixture(Belt.Array.zip(distributions, weights))->runGenericOperation
+      : GenDistError(
+          ArgumentError("Error, mixture call has different number of distributions and weights"),
+        )
+
+  let mixtureWithDefaultWeights = (
+    distributions: array<GenericDist_Types.genericDist>,
+  ): DistributionOperation.outputType => {
+    let length = E.A.length(distributions)
+    let weights = Belt.Array.make(length, 1.0 /. Belt.Int.toFloat(length))
+    mixtureWithGivenWeights(distributions, weights)
+  }
+
+  let mixture = (args: array<expressionValue>): DistributionOperation.outputType => {
+    switch E.A.last(args) {
+    | Some(EvArray(b)) => {
+        let weights = parseNumberArray(b)
+        let distributions = parseDistributionArray(
+          Belt.Array.slice(args, ~offset=0, ~len=E.A.length(args) - 1),
+        )
+        switch E.R.merge(distributions, weights) {
+        | Ok(d, w) => mixtureWithGivenWeights(d, w)
+        | Error(err) => GenDistError(ArgumentError(err))
+        }
+      }
+    | Some(EvDistribution(b)) => switch parseDistributionArray(args) {
+      | Ok(distributions) => mixtureWithDefaultWeights(distributions)
+      | Error(err) => GenDistError(ArgumentError(err))
+      }
+    | _ => GenDistError(ArgumentError("Last argument of mx must be array or distribution"))
+    }
+  }
 }

 module SymbolicConstructors = {
@@ -146,6 +204,7 @@ let dispatchToGenericOutput = (call: ExpressionValue.functionCall): option<
     Helpers.toDistFn(Truncate(None, Some(float)), dist)
   | ("truncate", [EvDistribution(dist), EvNumber(float1), EvNumber(float2)]) =>
     Helpers.toDistFn(Truncate(Some(float1), Some(float2)), dist)
+  | ("mx" | "mixture", args) => Helpers.mixture(args)->Some
   | ("log", [EvDistribution(a)]) =>
     Helpers.twoDiststoDistFn(Algebraic, "log", a, GenericDist.fromFloat(Math.e))->Some
   | ("log10", [EvDistribution(a)]) =>
@@ -187,7 +246,8 @@ let genericOutputToReducerValue = (o: DistributionOperation.outputType): result<
   | GenDistError(NotYetImplemented) => Error(RETodo("Function not yet implemented"))
   | GenDistError(Unreachable) => Error(RETodo("Unreachable"))
   | GenDistError(DistributionVerticalShiftIsInvalid) =>
-    Error(RETodo("Distribution Vertical Shift is Invalid"))
+    Error(RETodo("Distribution Vertical Shift Is Invalid"))
+  | GenDistError(ArgumentError(err)) => Error(RETodo("Argument Error: " ++ err))
   | GenDistError(Other(s)) => Error(RETodo(s))
   }
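The new `mixture` helper dispatches on the last argument: an array of numbers supplies explicit weights, otherwise every distribution gets weight `1/n` via `mixtureWithDefaultWeights`. A Python sketch of just that weight logic (function and error names are ours, illustrating the branches, not the library's API):

```python
def mixture_weights(n_dists, weights=None):
    # Explicit weights must match the number of distributions,
    # mirroring the ArgumentError branch of mixtureWithGivenWeights;
    # with no weights, default to a uniform 1/n each.
    if weights is None:
        return [1.0 / n_dists] * n_dists
    if len(weights) != n_dists:
        raise ValueError(
            "mixture call has different number of distributions and weights"
        )
    return list(weights)

print(mixture_weights(4))
# → [0.25, 0.25, 0.25, 0.25]
```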
@@ -11,10 +11,10 @@ The below few seem to work fine. In the future there's definitely more work to d
 type samplingParams = DistributionOperation.env

 @genType
-type genericDist = GenericDist_Types.genericDist
+type genericDist = DistributionTypes.genericDist

 @genType
-type distributionError = GenericDist_Types.error
+type distributionError = DistributionTypes.error

 @genType
 type resultDist = result<genericDist, distributionError>
@@ -32,7 +32,7 @@ let makeSampleSetDist = SampleSetDist.make
 let evaluate = Reducer.evaluate

 @genType
-type expressionValue = Reducer_Expression.expressionValue
+type expressionValue = ReducerInterface_ExpressionValue.expressionValue

 @genType
 type errorValue = Reducer_ErrorValue.errorValue
@ -59,8 +59,9 @@ module O = {
|
||||||
let toExn = Rationale.Option.toExn
|
let toExn = Rationale.Option.toExn
|
||||||
let some = Rationale.Option.some
|
let some = Rationale.Option.some
|
||||||
let firstSome = Rationale.Option.firstSome
|
let firstSome = Rationale.Option.firstSome
|
||||||
let toExt = Rationale.Option.toExn
|
let toExt = Rationale.Option.toExn // flagging this: `toExt` looks like a typo, but `Rationale.Option.toExt` doesn't exist, so this aliases `toExn`
|
||||||
let flatApply = (fn, b) => Rationale.Option.apply(fn, Some(b)) |> Rationale.Option.flatten
|
let flatApply = (fn, b) => Rationale.Option.apply(fn, Some(b)) |> Rationale.Option.flatten
|
||||||
|
let flatten = Rationale.Option.flatten
|
||||||
|
|
||||||
let toBool = opt =>
|
let toBool = opt =>
|
||||||
switch opt {
|
switch opt {
|
||||||
|
@ -103,6 +104,7 @@ module O2 = {
|
||||||
let toExn = (a, b) => O.toExn(b, a)
|
let toExn = (a, b) => O.toExn(b, a)
|
||||||
let fmap = (a, b) => O.fmap(b, a)
|
let fmap = (a, b) => O.fmap(b, a)
|
||||||
let toResult = (a, b) => O.toResult(b, a)
|
let toResult = (a, b) => O.toResult(b, a)
|
||||||
|
let bind = (a, b) => O.bind(b, a)
|
||||||
}
|
}
|
||||||
|
|
||||||
/* Functions */
|
/* Functions */
|
||||||
|
@ -176,17 +178,49 @@ module R = {
|
||||||
|
|
||||||
let errorIfCondition = (errorCondition, errorMessage, r) =>
|
let errorIfCondition = (errorCondition, errorMessage, r) =>
|
||||||
errorCondition(r) ? Error(errorMessage) : Ok(r)
|
errorCondition(r) ? Error(errorMessage) : Ok(r)
|
||||||
|
|
||||||
|
let ap = Rationale.Result.ap
|
||||||
|
let ap' = (r, a) =>
|
||||||
|
switch r {
|
||||||
|
| Ok(f) => fmap(f, a)
|
||||||
|
| Error(err) => Error(err)
|
||||||
|
}
|
||||||
|
// (a1 -> a2 -> r) -> m a1 -> m a2 -> m r // not in Rationale
|
||||||
|
let liftM2: (('a, 'b) => 'c, result<'a, 'd>, result<'b, 'd>) => result<'c, 'd> = (op, xR, yR) => {
|
||||||
|
ap'(fmap(op, xR), yR)
|
||||||
|
}
|
||||||
|
|
||||||
|
let liftJoin2: (('a, 'b) => result<'c, 'd>, result<'a, 'd>, result<'b, 'd>) => result<'c, 'd> = (
|
||||||
|
op,
|
||||||
|
xR,
|
||||||
|
yR,
|
||||||
|
) => {
|
||||||
|
bind(liftM2(op, xR, yR), x => x)
|
||||||
|
}
|
||||||
|
|
||||||
|
let fmap2 = (f, r) =>
|
||||||
|
switch r {
|
||||||
|
| Ok(r) => r->Ok
|
||||||
|
| Error(x) => x->f->Error
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
module R2 = {
|
module R2 = {
|
||||||
let fmap = (a,b) => R.fmap(b,a)
|
let fmap = (a, b) => R.fmap(b, a)
|
||||||
let bind = (a, b) => R.bind(b, a)
|
let bind = (a, b) => R.bind(b, a)
|
||||||
|
|
||||||
//Converts result type to change error type only
|
//Converts result type to change error type only
|
||||||
let errMap = (a, map) => switch(a){
|
let errMap = (a, map) =>
|
||||||
|
switch a {
|
||||||
| Ok(r) => Ok(r)
|
| Ok(r) => Ok(r)
|
||||||
| Error(e) => map(e)
|
| Error(e) => map(e)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
let fmap2 = (xR, f) =>
|
||||||
|
switch xR {
|
||||||
|
| Ok(x) => x->Ok
|
||||||
|
| Error(x) => x->f->Error
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
let safe_fn_of_string = (fn, s: string): option<'a> =>
|
let safe_fn_of_string = (fn, s: string): option<'a> =>
|
||||||
|
@ -257,6 +291,29 @@ module L = {
|
||||||
let update = Rationale.RList.update
|
let update = Rationale.RList.update
|
||||||
let iter = List.iter
|
let iter = List.iter
|
||||||
let findIndex = Rationale.RList.findIndex
|
let findIndex = Rationale.RList.findIndex
|
||||||
|
let headSafe = Belt.List.head
|
||||||
|
let tailSafe = Belt.List.tail
|
||||||
|
let headExn = Belt.List.headExn
|
||||||
|
let tailExn = Belt.List.tailExn
|
||||||
|
let zip = Belt.List.zip
|
||||||
|
|
||||||
|
let combinations2: list<'a> => list<('a, 'a)> = xs => {
|
||||||
|
let rec loop: ('a, list<'a>) => list<('a, 'a)> = (x', xs') => {
|
||||||
|
let n = length(xs')
|
||||||
|
if n == 0 {
|
||||||
|
list{}
|
||||||
|
} else {
|
||||||
|
let combs = fmap(y => (x', y), xs')
|
||||||
|
let hd = headExn(xs')
|
||||||
|
let tl = tailExn(xs')
|
||||||
|
concat(list{combs, loop(hd, tl)})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
switch (headSafe(xs), tailSafe(xs)) {
|
||||||
|
| (Some(x'), Some(xs')) => loop(x', xs')
|
||||||
|
| (_, _) => list{}
|
||||||
|
}
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/* A for Array */
|
/* A for Array */
|
||||||
|
@ -300,7 +357,6 @@ module A = {
|
||||||
|> Rationale.Result.return
|
|> Rationale.Result.return
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
// This zips while taking the longest elements of each array.
|
// This zips while taking the longest elements of each array.
|
||||||
let zipMaxLength = (array1, array2) => {
|
let zipMaxLength = (array1, array2) => {
|
||||||
let maxLength = Int.max(length(array1), length(array2))
|
let maxLength = Int.max(length(array1), length(array2))
|
||||||
|
@ -456,7 +512,6 @@ module A = {
|
||||||
let diff = (arr: array<float>): array<float> =>
|
let diff = (arr: array<float>): array<float> =>
|
||||||
Belt.Array.zipBy(arr, Belt.Array.sliceToEnd(arr, 1), (left, right) => right -. left)
|
Belt.Array.zipBy(arr, Belt.Array.sliceToEnd(arr, 1), (left, right) => right -. left)
|
||||||
|
|
||||||
|
|
||||||
exception RangeError(string)
|
exception RangeError(string)
|
||||||
let range = (min: float, max: float, n: int): array<float> =>
|
let range = (min: float, max: float, n: int): array<float> =>
|
||||||
switch n {
|
switch n {
|
||||||
|
@ -474,7 +529,7 @@ module A = {
|
||||||
}
|
}
|
||||||
|
|
||||||
module A2 = {
|
module A2 = {
|
||||||
let fmap = (a,b) => A.fmap(b,a)
|
let fmap = (a, b) => A.fmap(b, a)
|
||||||
let joinWith = (a, b) => A.joinWith(b, a)
|
let joinWith = (a, b) => A.joinWith(b, a)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
@ -36,8 +36,8 @@ module Exponential = {
|
||||||
@module("jstat") @scope("exponential") external pdf: (float, float) => float = "pdf"
|
@module("jstat") @scope("exponential") external pdf: (float, float) => float = "pdf"
|
||||||
@module("jstat") @scope("exponential") external cdf: (float, float) => float = "cdf"
|
@module("jstat") @scope("exponential") external cdf: (float, float) => float = "cdf"
|
||||||
@module("jstat") @scope("exponential") external inv: (float, float) => float = "inv"
|
@module("jstat") @scope("exponential") external inv: (float, float) => float = "inv"
|
||||||
@module("jstat") @scope("exponential") external sample: (float) => float = "sample"
|
@module("jstat") @scope("exponential") external sample: float => float = "sample"
|
||||||
@module("jstat") @scope("exponential") external mean: (float) => float = "mean"
|
@module("jstat") @scope("exponential") external mean: float => float = "mean"
|
||||||
}
|
}
|
||||||
|
|
||||||
module Cauchy = {
|
module Cauchy = {
|
||||||
|
@ -56,7 +56,6 @@ module Triangular = {
|
||||||
@module("jstat") @scope("triangular") external mean: (float, float, float) => float = "mean"
|
@module("jstat") @scope("triangular") external mean: (float, float, float) => float = "mean"
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
module Pareto = {
|
module Pareto = {
|
||||||
@module("jstat") @scope("pareto") external pdf: (float, float, float) => float = "pdf"
|
@module("jstat") @scope("pareto") external pdf: (float, float, float) => float = "pdf"
|
||||||
@module("jstat") @scope("pareto") external cdf: (float, float, float) => float = "cdf"
|
@module("jstat") @scope("pareto") external cdf: (float, float, float) => float = "cdf"
|
||||||
|
@ -66,20 +65,20 @@ module Pareto = {
|
||||||
module Poisson = {
|
module Poisson = {
|
||||||
@module("jstat") @scope("poisson") external pdf: (float, float) => float = "pdf"
|
@module("jstat") @scope("poisson") external pdf: (float, float) => float = "pdf"
|
||||||
@module("jstat") @scope("poisson") external cdf: (float, float) => float = "cdf"
|
@module("jstat") @scope("poisson") external cdf: (float, float) => float = "cdf"
|
||||||
@module("jstat") @scope("poisson") external sample: (float) => float = "sample"
|
@module("jstat") @scope("poisson") external sample: float => float = "sample"
|
||||||
@module("jstat") @scope("poisson") external mean: (float) => float = "mean"
|
@module("jstat") @scope("poisson") external mean: float => float = "mean"
|
||||||
}
|
}
|
||||||
|
|
||||||
module Weibull = {
|
module Weibull = {
|
||||||
@module("jstat") @scope("weibull") external pdf: (float, float, float) => float = "pdf"
|
@module("jstat") @scope("weibull") external pdf: (float, float, float) => float = "pdf"
|
||||||
@module("jstat") @scope("weibull") external cdf: (float, float,float ) => float = "cdf"
|
@module("jstat") @scope("weibull") external cdf: (float, float, float) => float = "cdf"
|
||||||
@module("jstat") @scope("weibull") external sample: (float,float) => float = "sample"
|
@module("jstat") @scope("weibull") external sample: (float, float) => float = "sample"
|
||||||
@module("jstat") @scope("weibull") external mean: (float,float) => float = "mean"
|
@module("jstat") @scope("weibull") external mean: (float, float) => float = "mean"
|
||||||
}
|
}
|
||||||
|
|
||||||
module Binomial = {
|
module Binomial = {
|
||||||
@module("jstat") @scope("binomial") external pdf: (float, float, float) => float = "pdf"
|
@module("jstat") @scope("binomial") external pdf: (float, float, float) => float = "pdf"
|
||||||
@module("jstat") @scope("binomial") external cdf: (float, float,float ) => float = "cdf"
|
@module("jstat") @scope("binomial") external cdf: (float, float, float) => float = "cdf"
|
||||||
}
|
}
|
||||||
|
|
||||||
@module("jstat") external sum: array<float> => float = "sum"
|
@module("jstat") external sum: array<float> => float = "sum"
|
||||||
|
|
|
@ -2,21 +2,24 @@
|
||||||
|
|
||||||
This website is built using [Docusaurus 2](https://docusaurus.io/), a modern static website generator.
|
This website is built using [Docusaurus 2](https://docusaurus.io/), a modern static website generator.
|
||||||
|
|
||||||
## Build for development and production
|
# Build for development
|
||||||
|
|
||||||
This one actually works without running `yarn` at the monorepo level, but it doesn't hurt. You must at least run it at this package level
|
We assume you've already run `yarn` at the monorepo level.
|
||||||
|
|
||||||
|
The website depends on `squiggle-lang`, which you have to build manually.
|
||||||
|
|
||||||
```sh
|
```sh
|
||||||
yarn
|
cd ../squiggle-lang
|
||||||
|
yarn build
|
||||||
```
|
```
|
||||||
|
|
||||||
This command generates static content into the `build` directory and can be served using any static contents hosting service.
|
Generate static content into the `build` directory.
|
||||||
|
|
||||||
```sh
|
```sh
|
||||||
yarn build
|
yarn build
|
||||||
```
|
```
|
||||||
|
|
||||||
Your local dev server is here, opening up a browser window.
|
Open a local dev server (this will open a browser window).
|
||||||
|
|
||||||
```sh
|
```sh
|
||||||
yarn start
|
yarn start
|
||||||
|
|
|
@ -68,15 +68,15 @@ combination of the two. The first positional arguments represent the distributio
|
||||||
to be combined, and the last argument is how much to weigh every distribution in the
|
to be combined, and the last argument is how much to weigh every distribution in the
|
||||||
combination.
|
combination.
|
||||||
|
|
||||||
<SquiggleEditor initialSquiggleString="mm(uniform(0,1), normal(1,1), [0.5, 0.5])" />
|
<SquiggleEditor initialSquiggleString="mx(uniform(0,1), normal(1,1), [0.5, 0.5])" />
|
||||||
|
|
||||||
It's possible to create discrete distributions using this method.
|
It's possible to create discrete distributions using this method.
|
||||||
|
|
||||||
<SquiggleEditor initialSquiggleString="mm(0, 1, [0.2,0.8])" />
|
<SquiggleEditor initialSquiggleString="mx(0, 1, [0.2,0.8])" />
|
||||||
|
|
||||||
As well as mixed distributions:
|
As well as mixed distributions:
|
||||||
|
|
||||||
<SquiggleEditor initialSquiggleString="mm(3, 8, 1 to 10, [0.2, 0.3, 0.5])" />
|
<SquiggleEditor initialSquiggleString="mx(3, 8, 1 to 10, [0.2, 0.3, 0.5])" />
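Conceptually, sampling from a weighted mixture like the ones above amounts to picking one component with probability proportional to its weight and then sampling from that component. Here is a minimal JavaScript sketch of that idea (illustrative only; the names are ad hoc and this is not squiggle's actual implementation):

```javascript
// Pick a component index with probability proportional to its weight,
// then draw a sample from the chosen component.
const sampleMixture = (samplers, weights) => {
  const total = weights.reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (let i = 0; i < samplers.length; i++) {
    r -= weights[i];
    if (r <= 0) return samplers[i]();
  }
  // Guard against floating-point underflow: fall back to the last component.
  return samplers[samplers.length - 1]();
};

// Analogue of mx(0, 1, [0.2, 0.8]): point masses at 0 and 1.
const sample = () => sampleMixture([() => 0, () => 1], [0.2, 0.8]);
```

Averaging many draws of `sample()` should come out near 0.2 · 0 + 0.8 · 1 = 0.8, the mixture's mean.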
|
||||||
|
|
||||||
## Other Functions
|
## Other Functions
|
||||||
|
|
||||||
|
|
126
packages/website/docs/Internal/Invariants.md
Normal file
126
packages/website/docs/Internal/Invariants.md
Normal file
|
@ -0,0 +1,126 @@
|
||||||
|
---
title: Statistical properties of algebraic combinations of distributions for property testing.
urlcolor: blue
author:
- Nuño Sempere
- Quinn Dougherty
abstract: This document outlines some properties of algebraic combinations of distributions. It is meant to facilitate property tests for [Squiggle](https://squiggle-language.com/), an estimation language for forecasters. So far, we are focusing on the means, the standard deviations and the shapes of the pdfs.
---

_This document right now is normative and aspirational, not a description of the testing that's currently done_.

The academic keyword to search for in relation to this document is "[algebra of random variables](https://wikiless.org/wiki/Algebra_of_random_variables?lang=en)". Squiggle doesn't yet support getting the standard deviation, denoted by $\sigma$, but such support could yet be added. Throughout, $f$ and $g$ are assumed to be independent.

## Means and standard deviations

### Sums

$$
mean(f+g) = mean(f) + mean(g)
$$

$$
\sigma(f+g) = \sqrt{\sigma(f)^2 + \sigma(g)^2}
$$

In the case of normal distributions,

$$
mean(normal(a,b) + normal(c,d)) = mean(normal(a+c, \sqrt{b^2 + d^2}))
$$

### Subtractions

$$
mean(f-g) = mean(f) - mean(g)
$$

$$
\sigma(f-g) = \sqrt{\sigma(f)^2 + \sigma(g)^2}
$$

### Multiplications

$$
mean(f \cdot g) = mean(f) \cdot mean(g)
$$

$$
\sigma(f \cdot g) = \sqrt{(\sigma(f)^2 + mean(f)^2) \cdot (\sigma(g)^2 + mean(g)^2) - (mean(f) \cdot mean(g))^2}
$$

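These multiplication identities lend themselves to a quick Monte Carlo spot check: for independent $X$ and $Y$, $E[XY] = E[X]E[Y]$ and $Var(XY) = (\sigma_X^2 + \mu_X^2)(\sigma_Y^2 + \mu_Y^2) - (\mu_X \mu_Y)^2$ (note that the means enter squared). A JavaScript sketch, illustrative rather than part of squiggle's test suite:

```javascript
// Empirical check of the product mean/variance identities for independent
// X ~ uniform(0, 1) and Y ~ uniform(2, 3), using 10^6 samples.
const N = 1e6;
const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;
const std = (xs) => {
  const m = mean(xs);
  return Math.sqrt(mean(xs.map((x) => (x - m) ** 2)));
};

const xs = Array.from({ length: N }, () => Math.random());
const ys = Array.from({ length: N }, () => 2 + Math.random());
const products = xs.map((x, i) => x * ys[i]);

// mean(f * g) = mean(f) * mean(g)
const meanGap = Math.abs(mean(products) - mean(xs) * mean(ys));

// sigma(f * g), with squared means inside the radical
const predictedSigma = Math.sqrt(
  (std(xs) ** 2 + mean(xs) ** 2) * (std(ys) ** 2 + mean(ys) ** 2) -
    (mean(xs) * mean(ys)) ** 2
);
const sigmaGap = Math.abs(std(products) - predictedSigma);

console.log(meanGap < 0.01, sigmaGap < 0.01);
```

Both gaps should come out on the order of $10^{-3}$ or smaller with $10^6$ samples.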
### Divisions

Divisions are tricky, and in general we don't have good expressions to characterize properties of ratios. In particular, the ratio of two centered normals is a Cauchy distribution, which has no mean.

## Probability density functions (pdfs)

Specifying the pdf of the sum/multiplication/... of distributions as a function of the pdfs of the individual arguments can still be done, but it requires integration. My sense is that this is still doable, and I (Nuño) provide some _pseudocode_ to do this.

### Sums

Let $f, g$ be two independent distributions. Then the pdf of their sum, evaluated at a point $z$ and written $(f + g)(z)$, is given by:

$$
(f + g)(z) = \int_{-\infty}^{\infty} f(x) \cdot g(z-x) \,dx
$$

See a proof sketch [here](https://www.milefoot.com/math/stat/rv-sums.htm).

Here is some pseudocode to approximate this:

```js
// pdf1 and pdf2 are pdfs,
// and cdf1 and cdf2 are their corresponding cdfs

let epsilonForBounds = 2 ** -16;
let getBounds = (cdf) => {
  let cdf_min = -1;
  let cdf_max = 1;
  let n = 0;
  while (
    (cdf(cdf_min) > epsilonForBounds || 1 - cdf(cdf_max) > epsilonForBounds) &&
    n < 10
  ) {
    if (cdf(cdf_min) > epsilonForBounds) {
      cdf_min = cdf_min * 2;
    }
    if (1 - cdf(cdf_max) > epsilonForBounds) {
      cdf_max = cdf_max * 2;
    }
    n = n + 1; // cap the doubling at 10 iterations
  }
  return [cdf_min, cdf_max];
};

let epsilonForIntegrals = 2 ** -16;
let pdfOfSum = (pdf1, pdf2, cdf1, cdf2, z) => {
  let bounds1 = getBounds(cdf1);
  let bounds2 = getBounds(cdf2);
  let bounds = [
    Math.min(bounds1[0], bounds2[0]),
    Math.max(bounds1[1], bounds2[1]),
  ];

  let result = 0;
  for (let x = bounds[0]; x < bounds[1]; x = x + epsilonForIntegrals) {
    let delta = pdf1(x) * pdf2(z - x);
    result = result + delta * epsilonForIntegrals;
  }
  return result;
};
```

## Cumulative distribution functions

TODO

## Inverse cumulative distribution functions

TODO

# To do:

- Provide sources or derivations, useful as this document becomes more complicated
- Provide definitions for the probability density function, exponential, inverse, log, etc.
- Provide at least some tests for division
- See if playing around with characteristic functions turns out anything useful

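One way to sanity-check a convolution routine like the pseudocode above: the sum of two independent standard normals is $normal(0, \sqrt{2})$, so the numeric integral can be compared against a closed form. A compact JavaScript sketch (illustrative; the names are ad hoc):

```javascript
// pdf of normal(mu, sigma), plus a Riemann-sum convolution evaluated at z.
const normalPdf = (mu, sigma) => (x) =>
  Math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) /
  (sigma * Math.sqrt(2 * Math.PI));

const convolveAt = (pdf1, pdf2, z, lo, hi, step) => {
  let result = 0;
  for (let x = lo; x < hi; x = x + step) {
    result += pdf1(x) * pdf2(z - x) * step;
  }
  return result;
};

const f = normalPdf(0, 1);
const numeric = convolveAt(f, f, 0, -8, 8, 1e-3);
const analytic = normalPdf(0, Math.sqrt(2))(0); // 1 / (2 * sqrt(pi)) ≈ 0.282
console.log(Math.abs(numeric - analytic) < 1e-4); // true
```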
@ -1,5 +1,7 @@
|
||||||
// @ts-check
|
// @ts-check
|
||||||
// Note: type annotations allow type checking and IDEs autocompletion
|
// Note: type annotations allow type checking and IDEs autocompletion
|
||||||
|
const math = require("remark-math");
|
||||||
|
const katex = require("rehype-katex");
|
||||||
|
|
||||||
const lightCodeTheme = require("prism-react-renderer/themes/github");
|
const lightCodeTheme = require("prism-react-renderer/themes/github");
|
||||||
const darkCodeTheme = require("prism-react-renderer/themes/dracula");
|
const darkCodeTheme = require("prism-react-renderer/themes/dracula");
|
||||||
|
@ -14,7 +16,7 @@ const config = {
|
||||||
onBrokenLinks: "throw",
|
onBrokenLinks: "throw",
|
||||||
onBrokenMarkdownLinks: "warn",
|
onBrokenMarkdownLinks: "warn",
|
||||||
favicon: "img/favicon.ico",
|
favicon: "img/favicon.ico",
|
||||||
organizationName: "QURIResearch", // Usually your GitHub org/user name.
|
organizationName: "quantified-uncertainty", // Usually your GitHub org/user name.
|
||||||
projectName: "squiggle", // Usually your repo name.
|
projectName: "squiggle", // Usually your repo name.
|
||||||
|
|
||||||
plugins: [
|
plugins: [
|
||||||
|
@ -47,13 +49,15 @@ const config = {
|
||||||
sidebarPath: require.resolve("./sidebars.js"),
|
sidebarPath: require.resolve("./sidebars.js"),
|
||||||
// Please change this to your repo.
|
// Please change this to your repo.
|
||||||
editUrl:
|
editUrl:
|
||||||
"https://github.com/foretold-app/squiggle/tree/master/packages/website/",
|
"https://github.com/quantified-uncertainty/squiggle/tree/master/packages/website/",
|
||||||
|
remarkPlugins: [math],
|
||||||
|
rehypePlugins: [katex],
|
||||||
},
|
},
|
||||||
blog: {
|
blog: {
|
||||||
showReadingTime: true,
|
showReadingTime: true,
|
||||||
// Please change this to your repo.
|
// Please change this to your repo.
|
||||||
editUrl:
|
editUrl:
|
||||||
"https://github.com/foretold-app/squiggle/tree/master/packages/website/",
|
"https://github.com/quantified-uncertainty/squiggle/tree/master/packages/website/",
|
||||||
},
|
},
|
||||||
theme: {
|
theme: {
|
||||||
customCss: require.resolve("./src/css/custom.css"),
|
customCss: require.resolve("./src/css/custom.css"),
|
||||||
|
@ -111,6 +115,15 @@ const config = {
|
||||||
darkTheme: darkCodeTheme,
|
darkTheme: darkCodeTheme,
|
||||||
},
|
},
|
||||||
}),
|
}),
|
||||||
|
stylesheets: [
|
||||||
|
{
|
||||||
|
href: "https://cdn.jsdelivr.net/npm/katex@0.13.24/dist/katex.min.css",
|
||||||
|
type: "text/css",
|
||||||
|
integrity:
|
||||||
|
"sha384-odtC+0UGzzFL/6PNoE8rX/SPcQDXBJ+uRepguP4QkPCm2LBxH3FA3y+fKSiJ+AmM",
|
||||||
|
crossorigin: "anonymous",
|
||||||
|
},
|
||||||
|
],
|
||||||
};
|
};
|
||||||
|
|
||||||
module.exports = config;
|
module.exports = config;
|
||||||
|
|
|
@ -17,7 +17,10 @@
|
||||||
"clsx": "^1.1.1",
|
"clsx": "^1.1.1",
|
||||||
"prism-react-renderer": "^1.2.1",
|
"prism-react-renderer": "^1.2.1",
|
||||||
"react": "^18.0.0",
|
"react": "^18.0.0",
|
||||||
"react-dom": "^18.0.0"
|
"react-dom": "^18.0.0",
|
||||||
|
"remark-math": "^3",
|
||||||
|
"rehype-katex": "^5",
|
||||||
|
"hast-util-is-element": "1.1.0"
|
||||||
},
|
},
|
||||||
"browserslist": {
|
"browserslist": {
|
||||||
"production": [
|
"production": [
|
||||||
|
|
|
@ -40,6 +40,16 @@ const sidebars = {
|
||||||
},
|
},
|
||||||
],
|
],
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
type: "category",
|
||||||
|
label: "Internal",
|
||||||
|
items: [
|
||||||
|
{
|
||||||
|
type: "autogenerated",
|
||||||
|
dirName: "Internal",
|
||||||
|
},
|
||||||
|
],
|
||||||
|
},
|
||||||
],
|
],
|
||||||
|
|
||||||
// But you can create a sidebar manually
|
// But you can create a sidebar manually
|
||||||
|
|
120
yarn.lock
120
yarn.lock
|
@ -3890,6 +3890,11 @@
|
||||||
resolved "https://registry.yarnpkg.com/@types/json5/-/json5-0.0.29.tgz#ee28707ae94e11d2b827bcbe5270bcea7f3e71ee"
|
resolved "https://registry.yarnpkg.com/@types/json5/-/json5-0.0.29.tgz#ee28707ae94e11d2b827bcbe5270bcea7f3e71ee"
|
||||||
integrity sha1-7ihweulOEdK4J7y+UnC86n8+ce4=
|
integrity sha1-7ihweulOEdK4J7y+UnC86n8+ce4=
|
||||||
|
|
||||||
|
"@types/katex@^0.11.0":
|
||||||
|
version "0.11.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/@types/katex/-/katex-0.11.1.tgz#34de04477dcf79e2ef6c8d23b41a3d81f9ebeaf5"
|
||||||
|
integrity sha512-DUlIj2nk0YnJdlWgsFuVKcX27MLW0KbKmGVoUHmFr+74FYYNUDAaj9ZqTADvsbE8rfxuVmSFc7KczYn5Y09ozg==
|
||||||
|
|
||||||
"@types/lodash@^4.14.181":
|
"@types/lodash@^4.14.181":
|
||||||
version "4.14.181"
|
version "4.14.181"
|
||||||
resolved "https://registry.yarnpkg.com/@types/lodash/-/lodash-4.14.181.tgz#d1d3740c379fda17ab175165ba04e2d03389385d"
|
resolved "https://registry.yarnpkg.com/@types/lodash/-/lodash-4.14.181.tgz#d1d3740c379fda17ab175165ba04e2d03389385d"
|
||||||
|
@ -4025,10 +4030,10 @@
|
||||||
dependencies:
|
dependencies:
|
||||||
"@types/react" "*"
|
"@types/react" "*"
|
||||||
|
|
||||||
"@types/react@*", "@types/react@^16.9.19", "@types/react@^18.0.1":
|
"@types/react@*", "@types/react@^16.9.19", "@types/react@^18.0.1", "@types/react@^18.0.3":
|
||||||
version "18.0.2"
|
version "18.0.3"
|
||||||
resolved "https://registry.yarnpkg.com/@types/react/-/react-18.0.2.tgz#bc6a0572d434642ebe8b4ac0f121d18e2f2d8f7f"
|
resolved "https://registry.yarnpkg.com/@types/react/-/react-18.0.3.tgz#baefa397561372015b9f8ba5bc83bc3f84ae8fcb"
|
||||||
integrity sha512-2poV9ReTwwV5ZNxkKyk7t6Vp/odeTfYI3vRjtDYWfUdEstx9mp26jzELfMBwV6gXg1irhHUnmZJH/dJW7xafcA==
|
integrity sha512-P8QUaMW4k+kH9aKNPl9b3XWcKMSSALYprLL8xpAMJOLUn3Pl6B+6nKC4F7dsk9oJPwkiRx+qlwhG/Zc1LxFVuQ==
|
||||||
dependencies:
|
dependencies:
|
||||||
"@types/prop-types" "*"
|
"@types/prop-types" "*"
|
||||||
"@types/scheduler" "*"
|
"@types/scheduler" "*"
|
||||||
|
@ -6394,7 +6399,7 @@ commander@^6.2.1:
|
||||||
resolved "https://registry.yarnpkg.com/commander/-/commander-6.2.1.tgz#0792eb682dfbc325999bb2b84fddddba110ac73c"
|
resolved "https://registry.yarnpkg.com/commander/-/commander-6.2.1.tgz#0792eb682dfbc325999bb2b84fddddba110ac73c"
|
||||||
integrity sha512-U7VdrJFnJgo4xjrHpTzu0yrHPGImdsmD95ZlgYSEajAn2JKzDhDTPG9kBTefmObL2w/ngeZnilk+OV9CG3d7UA==
|
integrity sha512-U7VdrJFnJgo4xjrHpTzu0yrHPGImdsmD95ZlgYSEajAn2JKzDhDTPG9kBTefmObL2w/ngeZnilk+OV9CG3d7UA==
|
||||||
|
|
||||||
commander@^8.3.0:
|
commander@^8.0.0, commander@^8.3.0:
|
||||||
version "8.3.0"
|
version "8.3.0"
|
||||||
resolved "https://registry.yarnpkg.com/commander/-/commander-8.3.0.tgz#4837ea1b2da67b9c616a67afbb0fafee567bca66"
|
resolved "https://registry.yarnpkg.com/commander/-/commander-8.3.0.tgz#4837ea1b2da67b9c616a67afbb0fafee567bca66"
|
||||||
integrity sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww==
|
integrity sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww==
|
||||||
|
@ -7936,10 +7941,10 @@ escodegen@^2.0.0:
|
||||||
optionalDependencies:
|
optionalDependencies:
|
||||||
source-map "~0.6.1"
|
source-map "~0.6.1"
|
||||||
|
|
||||||
eslint-config-react-app@^7.0.0:
|
eslint-config-react-app@^7.0.1:
|
||||||
version "7.0.0"
|
version "7.0.1"
|
||||||
resolved "https://registry.yarnpkg.com/eslint-config-react-app/-/eslint-config-react-app-7.0.0.tgz#0fa96d5ec1dfb99c029b1554362ab3fa1c3757df"
|
resolved "https://registry.yarnpkg.com/eslint-config-react-app/-/eslint-config-react-app-7.0.1.tgz#73ba3929978001c5c86274c017ea57eb5fa644b4"
|
||||||
integrity sha512-xyymoxtIt1EOsSaGag+/jmcywRuieQoA2JbPCjnw9HukFj9/97aGPoZVFioaotzk1K5Qt9sHO5EutZbkrAXS0g==
|
integrity sha512-K6rNzvkIeHaTd8m/QEh1Zko0KI7BACWkkneSs6s9cKZC/J27X3eZR6Upt1jkmZ/4FK+XUOPPxMEN7+lbUXfSlA==
|
||||||
dependencies:
|
dependencies:
|
||||||
"@babel/core" "^7.16.0"
|
"@babel/core" "^7.16.0"
|
||||||
"@babel/eslint-parser" "^7.16.3"
|
"@babel/eslint-parser" "^7.16.3"
|
||||||
|
@ -9278,6 +9283,11 @@ hast-util-from-parse5@^6.0.0:
|
||||||
vfile-location "^3.2.0"
|
vfile-location "^3.2.0"
|
||||||
web-namespaces "^1.0.0"
|
web-namespaces "^1.0.0"
|
||||||
|
|
||||||
|
hast-util-is-element@1.1.0, hast-util-is-element@^1.0.0:
|
||||||
|
version "1.1.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/hast-util-is-element/-/hast-util-is-element-1.1.0.tgz#3b3ed5159a2707c6137b48637fbfe068e175a425"
|
||||||
|
integrity sha512-oUmNua0bFbdrD/ELDSSEadRVtWZOf3iF6Lbv81naqsIV99RnSCieTbWuWCY8BAeEfKJTKl0gRdokv+dELutHGQ==
|
||||||
|
|
||||||
hast-util-parse-selector@^2.0.0:
|
hast-util-parse-selector@^2.0.0:
|
||||||
version "2.2.5"
|
version "2.2.5"
|
||||||
resolved "https://registry.yarnpkg.com/hast-util-parse-selector/-/hast-util-parse-selector-2.2.5.tgz#d57c23f4da16ae3c63b3b6ca4616683313499c3a"
|
resolved "https://registry.yarnpkg.com/hast-util-parse-selector/-/hast-util-parse-selector-2.2.5.tgz#d57c23f4da16ae3c63b3b6ca4616683313499c3a"
|
||||||
|
@ -9310,6 +9320,15 @@ hast-util-to-parse5@^6.0.0:
|
||||||
xtend "^4.0.0"
|
xtend "^4.0.0"
|
||||||
zwitch "^1.0.0"
|
zwitch "^1.0.0"
|
||||||
|
|
||||||
|
hast-util-to-text@^2.0.0:
|
||||||
|
version "2.0.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/hast-util-to-text/-/hast-util-to-text-2.0.1.tgz#04f2e065642a0edb08341976084aa217624a0f8b"
|
||||||
|
integrity sha512-8nsgCARfs6VkwH2jJU9b8LNTuR4700na+0h3PqCaEk4MAnMDeu5P0tP8mjk9LLNGxIeQRLbiDbZVw6rku+pYsQ==
|
||||||
|
dependencies:
|
||||||
|
hast-util-is-element "^1.0.0"
|
||||||
|
repeat-string "^1.0.0"
|
||||||
|
unist-util-find-after "^3.0.0"
|
||||||
|
|
||||||
hastscript@^5.0.0:
|
hastscript@^5.0.0:
|
||||||
version "5.1.2"
|
version "5.1.2"
|
||||||
resolved "https://registry.yarnpkg.com/hastscript/-/hastscript-5.1.2.tgz#bde2c2e56d04c62dd24e8c5df288d050a355fb8a"
|
resolved "https://registry.yarnpkg.com/hastscript/-/hastscript-5.1.2.tgz#bde2c2e56d04c62dd24e8c5df288d050a355fb8a"
|
||||||
|
@ -10968,6 +10987,13 @@ junk@^3.1.0:
|
||||||
resolved "https://registry.yarnpkg.com/junk/-/junk-3.1.0.tgz#31499098d902b7e98c5d9b9c80f43457a88abfa1"
|
resolved "https://registry.yarnpkg.com/junk/-/junk-3.1.0.tgz#31499098d902b7e98c5d9b9c80f43457a88abfa1"
|
||||||
integrity sha512-pBxcB3LFc8QVgdggvZWyeys+hnrNWg4OcZIU/1X59k5jQdLBlCsYGRQaz234SqoRLTCgMH00fY0xRJH+F9METQ==
|
integrity sha512-pBxcB3LFc8QVgdggvZWyeys+hnrNWg4OcZIU/1X59k5jQdLBlCsYGRQaz234SqoRLTCgMH00fY0xRJH+F9METQ==
|
||||||
|
|
||||||
|
katex@^0.13.0:
|
||||||
|
version "0.13.24"
|
||||||
|
resolved "https://registry.yarnpkg.com/katex/-/katex-0.13.24.tgz#fe55455eb455698cb24b911a353d16a3c855d905"
|
||||||
|
integrity sha512-jZxYuKCma3VS5UuxOx/rFV1QyGSl3Uy/i0kTJF3HgQ5xMinCQVF8Zd4bMY/9aI9b9A2pjIBOsjSSm68ykTAr8w==
|
||||||
|
dependencies:
|
||||||
|
commander "^8.0.0"
|
||||||
|
|
||||||
keyv@^3.0.0:
|
keyv@^3.0.0:
|
||||||
version "3.1.0"
|
version "3.1.0"
|
||||||
resolved "https://registry.yarnpkg.com/keyv/-/keyv-3.1.0.tgz#ecc228486f69991e49e9476485a5be1e8fc5c4d9"
|
resolved "https://registry.yarnpkg.com/keyv/-/keyv-3.1.0.tgz#ecc228486f69991e49e9476485a5be1e8fc5c4d9"
|
||||||
|
@ -14036,10 +14062,10 @@ react-colorful@^5.1.2:
|
||||||
resolved "https://registry.yarnpkg.com/react-colorful/-/react-colorful-5.5.1.tgz#29d9c4e496f2ca784dd2bb5053a3a4340cfaf784"
|
resolved "https://registry.yarnpkg.com/react-colorful/-/react-colorful-5.5.1.tgz#29d9c4e496f2ca784dd2bb5053a3a4340cfaf784"
|
||||||
integrity sha512-M1TJH2X3RXEt12sWkpa6hLc/bbYS0H6F4rIqjQZ+RxNBstpY67d9TrFXtqdZwhpmBXcCwEi7stKqFue3ZRkiOg==
|
integrity sha512-M1TJH2X3RXEt12sWkpa6hLc/bbYS0H6F4rIqjQZ+RxNBstpY67d9TrFXtqdZwhpmBXcCwEi7stKqFue3ZRkiOg==
|
||||||
|
|
||||||
react-dev-utils@^12.0.0:
|
react-dev-utils@^12.0.0, react-dev-utils@^12.0.1:
|
||||||
version "12.0.0"
|
version "12.0.1"
|
||||||
resolved "https://registry.yarnpkg.com/react-dev-utils/-/react-dev-utils-12.0.0.tgz#4eab12cdb95692a077616770b5988f0adf806526"
|
resolved "https://registry.yarnpkg.com/react-dev-utils/-/react-dev-utils-12.0.1.tgz#ba92edb4a1f379bd46ccd6bcd4e7bc398df33e73"
|
||||||
integrity sha512-xBQkitdxozPxt1YZ9O1097EJiVpwHr9FoAuEVURCKV0Av8NBERovJauzP7bo1ThvuhZ4shsQ1AJiu4vQpoT1AQ==
|
integrity sha512-84Ivxmr17KjUupyqzFode6xKhjwuEJDROWKJy/BthkL7Wn6NJ8h4WE6k/exAv6ImS+0oZLRRW5j/aINMHyeGeQ==
|
||||||
dependencies:
|
dependencies:
|
||||||
"@babel/code-frame" "^7.16.0"
|
"@babel/code-frame" "^7.16.0"
|
||||||
address "^1.1.2"
|
address "^1.1.2"
|
||||||
|
@ -14060,7 +14086,7 @@ react-dev-utils@^12.0.0:
|
||||||
open "^8.4.0"
|
open "^8.4.0"
|
||||||
pkg-up "^3.1.0"
|
pkg-up "^3.1.0"
|
||||||
prompts "^2.4.2"
|
prompts "^2.4.2"
|
||||||
react-error-overlay "^6.0.10"
|
react-error-overlay "^6.0.11"
|
||||||
recursive-readdir "^2.2.2"
|
recursive-readdir "^2.2.2"
|
||||||
shell-quote "^1.7.3"
|
shell-quote "^1.7.3"
|
||||||
strip-ansi "^6.0.1"
|
strip-ansi "^6.0.1"
|
||||||
|
@ -14112,10 +14138,10 @@ react-element-to-jsx-string@^14.3.4:
|
||||||
is-plain-object "5.0.0"
|
is-plain-object "5.0.0"
|
||||||
react-is "17.0.2"
|
react-is "17.0.2"
|
||||||
|
|
||||||
react-error-overlay@^6.0.10:
|
react-error-overlay@^6.0.11:
|
||||||
version "6.0.10"
|
version "6.0.11"
|
||||||
resolved "https://registry.yarnpkg.com/react-error-overlay/-/react-error-overlay-6.0.10.tgz#0fe26db4fa85d9dbb8624729580e90e7159a59a6"
|
resolved "https://registry.yarnpkg.com/react-error-overlay/-/react-error-overlay-6.0.11.tgz#92835de5841c5cf08ba00ddd2d677b6d17ff9adb"
|
||||||
-  integrity sha512-mKR90fX7Pm5seCOfz8q9F+66VCc1PGsWSBxKbITjfKVQHMNF2zudxHnMdJiB1fRCb+XsbQV9sO9DCkgsMQgBIA==
+  integrity sha512-/6UZ2qgEyH2aqzYZgQPxEnz33NJ2gNsnHA2o5+o4wW9bLM/JYQitNP9xPhsXwC08hMMovfGe/8retsdDsczPRg==
 
 react-fast-compare@^3.0.1, react-fast-compare@^3.2.0:
   version "3.2.0"
@@ -14247,10 +14273,10 @@ react-router@6.3.0, react-router@^6.0.0:
   dependencies:
     history "^5.2.0"
 
-react-scripts@5.0.0:
-  version "5.0.0"
-  resolved "https://registry.yarnpkg.com/react-scripts/-/react-scripts-5.0.0.tgz#6547a6d7f8b64364ef95273767466cc577cb4b60"
-  integrity sha512-3i0L2CyIlROz7mxETEdfif6Sfhh9Lfpzi10CtcGs1emDQStmZfWjJbAIMtRD0opVUjQuFWqHZyRZ9PPzKCFxWg==
+react-scripts@5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/react-scripts/-/react-scripts-5.0.1.tgz#6285dbd65a8ba6e49ca8d651ce30645a6d980003"
+  integrity sha512-8VAmEm/ZAwQzJ+GOMLbBsTdDKOpuZh7RPs0UymvBR2vRk4iZWCskjbFnxqjrzoIvlNNRZ3QJFx6/qDSi6zSnaQ==
   dependencies:
     "@babel/core" "^7.16.0"
     "@pmmmwh/react-refresh-webpack-plugin" "^0.5.3"
@@ -14268,7 +14294,7 @@ react-scripts@5.0.0:
     dotenv "^10.0.0"
     dotenv-expand "^5.1.0"
     eslint "^8.3.0"
-    eslint-config-react-app "^7.0.0"
+    eslint-config-react-app "^7.0.1"
     eslint-webpack-plugin "^3.1.1"
     file-loader "^6.2.0"
     fs-extra "^10.0.0"
@@ -14285,7 +14311,7 @@ react-scripts@5.0.0:
     postcss-preset-env "^7.0.1"
     prompts "^2.4.2"
     react-app-polyfill "^3.0.0"
-    react-dev-utils "^12.0.0"
+    react-dev-utils "^12.0.1"
     react-refresh "^0.11.0"
     resolve "^1.20.0"
     resolve-url-loader "^4.0.0"
@@ -14541,6 +14567,18 @@ regjsparser@^0.8.2:
   dependencies:
     jsesc "~0.5.0"
 
+rehype-katex@^5:
+  version "5.0.0"
+  resolved "https://registry.yarnpkg.com/rehype-katex/-/rehype-katex-5.0.0.tgz#b556f24fde918f28ba1cb642ea71c7e82f3373d7"
+  integrity sha512-ksSuEKCql/IiIadOHiKRMjypva9BLhuwQNascMqaoGLDVd0k2NlE2wMvgZ3rpItzRKCd6vs8s7MFbb8pcR0AEg==
+  dependencies:
+    "@types/katex" "^0.11.0"
+    hast-util-to-text "^2.0.0"
+    katex "^0.13.0"
+    rehype-parse "^7.0.0"
+    unified "^9.0.0"
+    unist-util-visit "^2.0.0"
+
 rehype-parse@^6.0.2:
   version "6.0.2"
   resolved "https://registry.yarnpkg.com/rehype-parse/-/rehype-parse-6.0.2.tgz#aeb3fdd68085f9f796f1d3137ae2b85a98406964"
@@ -14550,6 +14588,14 @@ rehype-parse@^6.0.2:
     parse5 "^5.0.0"
     xtend "^4.0.0"
 
+rehype-parse@^7.0.0:
+  version "7.0.1"
+  resolved "https://registry.yarnpkg.com/rehype-parse/-/rehype-parse-7.0.1.tgz#58900f6702b56767814afc2a9efa2d42b1c90c57"
+  integrity sha512-fOiR9a9xH+Le19i4fGzIEowAbwG7idy2Jzs4mOrFWBSJ0sNUgy0ev871dwWnbOo371SjgjG4pwzrbgSVrKxecw==
+  dependencies:
+    hast-util-from-parse5 "^6.0.0"
+    parse5 "^6.0.0"
+
 relateurl@^0.2.7:
   version "0.2.7"
   resolved "https://registry.yarnpkg.com/relateurl/-/relateurl-0.2.7.tgz#54dbf377e51440aca90a4cd274600d3ff2d888a9"
@@ -14589,6 +14635,11 @@ remark-footnotes@2.0.0:
   resolved "https://registry.yarnpkg.com/remark-footnotes/-/remark-footnotes-2.0.0.tgz#9001c4c2ffebba55695d2dd80ffb8b82f7e6303f"
   integrity sha512-3Clt8ZMH75Ayjp9q4CorNeyjwIxHFcTkaektplKGl2A1jNGEUey8cKL0ZC5vJwfcD5GFGsNLImLG/NGzWIzoMQ==
 
+remark-math@^3:
+  version "3.0.1"
+  resolved "https://registry.yarnpkg.com/remark-math/-/remark-math-3.0.1.tgz#85a02a15b15cad34b89a27244d4887b3a95185bb"
+  integrity sha512-epT77R/HK0x7NqrWHdSV75uNLwn8g9qTyMqCRCDujL0vj/6T6+yhdrR7mjELWtkse+Fw02kijAaBuVcHBor1+Q==
+
 remark-mdx@1.6.22:
   version "1.6.22"
   resolved "https://registry.yarnpkg.com/remark-mdx/-/remark-mdx-1.6.22.tgz#06a8dab07dcfdd57f3373af7f86bd0e992108bbd"
@@ -14673,7 +14724,7 @@ repeat-element@^1.1.2:
   resolved "https://registry.yarnpkg.com/repeat-element/-/repeat-element-1.1.4.tgz#be681520847ab58c7568ac75fbfad28ed42d39e9"
   integrity sha512-LFiNfRcSu7KK3evMyYOuCzv3L10TW7yC1G2/+StMjK8Y6Vqd2MG7r/Qjw4ghtuCOjFvlnms/iMmLqpvW/ES/WQ==
 
-repeat-string@^1.5.4, repeat-string@^1.6.1:
+repeat-string@^1.0.0, repeat-string@^1.5.4, repeat-string@^1.6.1:
   version "1.6.1"
   resolved "https://registry.yarnpkg.com/repeat-string/-/repeat-string-1.6.1.tgz#8dcae470e1c88abc2d600fff4a776286da75e637"
   integrity sha1-jcrkcOHIirwtYA//Sndihtp15jc=
@@ -16509,6 +16560,18 @@ unified@^8.4.2:
     trough "^1.0.0"
     vfile "^4.0.0"
 
+unified@^9.0.0:
+  version "9.2.2"
+  resolved "https://registry.yarnpkg.com/unified/-/unified-9.2.2.tgz#67649a1abfc3ab85d2969502902775eb03146975"
+  integrity sha512-Sg7j110mtefBD+qunSLO1lqOEKdrwBFBrR6Qd8f4uwkhWNlbkaqwHse6e7QvD3AP/MNoJdEDLaf8OxYyoWgorQ==
+  dependencies:
+    bail "^1.0.0"
+    extend "^3.0.0"
+    is-buffer "^2.0.0"
+    is-plain-obj "^2.0.0"
+    trough "^1.0.0"
+    vfile "^4.0.0"
+
 union-value@^1.0.0:
   version "1.0.1"
   resolved "https://registry.yarnpkg.com/union-value/-/union-value-1.0.1.tgz#0b6fe7b835aecda61c6ea4d4f02c14221e109847"
@@ -16545,6 +16608,13 @@ unist-builder@2.0.3, unist-builder@^2.0.0:
   resolved "https://registry.yarnpkg.com/unist-builder/-/unist-builder-2.0.3.tgz#77648711b5d86af0942f334397a33c5e91516436"
   integrity sha512-f98yt5pnlMWlzP539tPc4grGMsFaQQlP/vM396b00jngsiINumNmsY8rkXjfoi1c6QaM8nQ3vaGDuoKWbe/1Uw==
 
+unist-util-find-after@^3.0.0:
+  version "3.0.0"
+  resolved "https://registry.yarnpkg.com/unist-util-find-after/-/unist-util-find-after-3.0.0.tgz#5c65fcebf64d4f8f496db46fa8fd0fbf354b43e6"
+  integrity sha512-ojlBqfsBftYXExNu3+hHLfJQ/X1jYY/9vdm4yZWjIbf0VuWF6CRufci1ZyoD/wV2TYMKxXUoNuoqwy+CkgzAiQ==
+  dependencies:
+    unist-util-is "^4.0.0"
+
 unist-util-generated@^1.0.0:
   version "1.1.6"
   resolved "https://registry.yarnpkg.com/unist-util-generated/-/unist-util-generated-1.1.6.tgz#5ab51f689e2992a472beb1b35f2ce7ff2f324d4b"