Backend 1.5.0 #782

Merged
merged 5 commits into from
Dec 15, 2024
62 changes: 61 additions & 1 deletion docs/examples.md
@@ -644,7 +644,67 @@ You can then view the logs with:
tensorboard --logdir logs/
```

## 13. Additional features
## 13. Using differential operators

As part of the `TemplateExpressionSpec` described above,
you can also use differential operators within the template.
The operator for this is `D`, which takes an expression as its first argument
and the _index_ of the argument to differentiate with respect to as its second argument.
This lets you compute integrals via evolution.

For example, let's say we wish to find the integral of $\frac{1}{x^2 \sqrt{x^2 - 1}}$
in the range $x > 1$.
We can compute the derivative of a candidate function $f(x)$ and compare that
to numerical samples of $\frac{1}{x^2\sqrt{x^2-1}}$. If the derivative matches the samples,
then $f(x)$ represents an antiderivative of the integrand, up to a constant offset!

```python
import numpy as np
from pysr import PySRRegressor, TemplateExpressionSpec

x = np.random.uniform(1, 10, (1000,)) # Integrand sampling points
y = 1 / (x**2 * np.sqrt(x**2 - 1)) # Evaluation of the integrand

expression_spec = TemplateExpressionSpec(
    ["f"],  # Names of the sub-expressions to be evolved
    """
    function diff_f_x((; f), (x,))
        df = D(f, 1)  # Symbolic derivative of f with respect to its first arg
        return df(x)
    end
    """
)

model = PySRRegressor(
    binary_operators=["+", "-", "*", "/"],
    unary_operators=["sqrt"],
    expression_spec=expression_spec,
    maxsize=20,
)
model.fit(x[:, np.newaxis], y)
```

If everything works, you should find something that simplifies to $\frac{\sqrt{x^2 - 1}}{x}$.
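
To sanity-check the result, you could compare the model's predictions (which evaluate
the full template, i.e. the derivative of the discovered $f$) against the integrand samples.
A minimal sketch, assuming the usual scikit-learn-style `predict` call also applies to template expressions:

```python
# Sketch: the predictions should approximate the integrand samples y
# if the evolved f is a valid antiderivative.
y_pred = model.predict(x[:, np.newaxis])
print(np.max(np.abs(y_pred - y)))  # a small value indicates a good fit
```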

Here, we wrote out a full function in Julia.
But we could also use an anonymous function, like `((; f), (x,)) -> D(f, 1)(x)`. We could also avoid
the fancy unpacking syntax and write `(nt, xs) -> D(nt.f, 1)(xs[1])`, which is completely equivalent.
Note that in Julia, the following two syntaxes are equivalent:

```julia
nt = (; f=1, g=2) # Create a "named tuple"
(; f, g) = nt
```

and

```julia
f = nt.f
g = nt.g
```
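
As a concrete illustration, the anonymous-function form above could be passed as the template
string directly. A minimal sketch, assuming the same constructor arguments as in the example above:

```python
from pysr import TemplateExpressionSpec

# Sketch: the same template as before, written with an anonymous Julia
# function instead of a named one.
expression_spec = TemplateExpressionSpec(
    ["f"],
    "((; f), (x,)) -> D(f, 1)(x)",
)
```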


## 14. Additional features

For the many other features available in PySR, please
read the [Options section](options.md).
80 changes: 30 additions & 50 deletions docs/operators.md
@@ -7,56 +7,32 @@ takes one or two scalars as input, and returns one scalar as output,
is likely to be a valid operator[^1].
A selection of these and other valid operators are stated below.

**Binary**

- `+`
- `-`
- `*`
- `/`
- `^`
- `max`
- `min`
- `mod`
- `cond`
- Equal to `(x, y) -> x > 0 ? y : 0`
- `greater`
- Equal to `(x, y) -> x > y ? 1 : 0`
- `logical_or`
- Equal to `(x, y) -> (x > 0 || y > 0) ? 1 : 0`
- `logical_and`
- Equal to `(x, y) -> (x > 0 && y > 0) ? 1 : 0`

**Unary**

- `neg`
- `square`
- `cube`
- `exp`
- `abs`
- `log`
- `log10`
- `log2`
- `log1p`
- `sqrt`
- `sin`
- `cos`
- `tan`
- `sinh`
- `cosh`
- `tanh`
- `atan`
- `asinh`
- `acosh`
- `atanh_clip`
- Equal to `atanh(mod(x + 1, 2) - 1)`
- `erf`
- `erfc`
- `gamma`
- `relu`
- `round`
- `floor`
- `ceil`
- `sign`
Also, note that it's a good idea not to use too many operators, since
doing so can exponentially increase the search space (see the sketch after the tables below).

**Binary Operators**

| Arithmetic | Comparison | Logic |
|--------------|------------|----------|
| `+` | `max` | `logical_or`[^2] |
| `-` | `min` | `logical_and`[^3]|
| `*` | `greater`[^4] | |
| `/` | `cond`[^5] | |
| `^` | `mod` | |

**Unary Operators**

| Basic | Exp/Log | Trig | Hyperbolic | Special | Rounding |
|------------|------------|-----------|------------|-----------|------------|
| `neg` | `exp` | `sin` | `sinh` | `erf` | `round` |
| `square` | `log` | `cos` | `cosh` | `erfc` | `floor` |
| `cube` | `log10` | `tan` | `tanh` | `gamma` | `ceil` |
| `cbrt` | `log2` | `asin` | `asinh` | `relu` | |
| `sqrt` | `log1p` | `acos` | `acosh` | `sinc` | |
| `abs` | | `atan` | `atanh` | | |
| `sign` | | | | | |
| `inv` | | | | | |
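
These operator names are passed as strings to `PySRRegressor`, as in the examples
elsewhere in the docs. A minimal sketch with a (hypothetically chosen) small operator set,
keeping the note above about search-space growth in mind:

```python
from pysr import PySRRegressor

# Sketch: a deliberately small operator set, since each additional
# operator enlarges the search space.
model = PySRRegressor(
    binary_operators=["+", "-", "*", "/", "max"],
    unary_operators=["exp", "log", "sqrt"],
)
```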


## Custom

@@ -96,3 +72,7 @@ any invalid values over the training dataset.
<!-- (Will say "However, you may need to define a `extra_sympy_mapping`":) -->

[^1]: However, you will need to define a sympy equivalent in `extra_sympy_mapping` if you want to use a function not in the above list.
[^2]: `logical_or` is equivalent to `(x, y) -> (x > 0 || y > 0) ? 1 : 0`
[^3]: `logical_and` is equivalent to `(x, y) -> (x > 0 && y > 0) ? 1 : 0`
[^4]: `greater` is equivalent to `(x, y) -> x > y ? 1 : 0`
[^5]: `cond` is equivalent to `(x, y) -> x > 0 ? y : 0`
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "pysr"
version = "1.2.0"
version = "1.3.0"
authors = [
{name = "Miles Cranmer", email = "[email protected]"},
]
2 changes: 1 addition & 1 deletion pysr/juliapkg.json
@@ -3,7 +3,7 @@
"packages": {
"SymbolicRegression": {
"uuid": "8254be44-1295-4e6a-a16d-46603ac705cb",
"version": "=1.4.0"
"version": "=1.5.0"
},
"Serialization": {
"uuid": "9e88b42a-f829-5b0c-bbe9-9e923198166b",