Contributions are welcome, and they are greatly appreciated! Every little bit helps, and credit will always be given.
You can contribute in many ways:
Report bugs by opening an issue on the project's GitHub issue tracker.
If you are reporting a bug, please include:
- Your operating system name and version.
- Any details about your local setup that might be helpful in troubleshooting.
- Detailed steps to reproduce the bug.
Look through the GitHub issues for bugs. Anything tagged with "bug" and "help wanted" is open to whoever wants to implement it.
Look through the GitHub issues for features. Anything tagged with "enhancement" and "help wanted" is open to whoever wants to implement it.
This project could always use more documentation, whether as part of the official docs, in docstrings, or even on the web in blog posts, articles, and such.
The best way to send feedback is to file an issue on the GitHub issue tracker.
If you are proposing a feature:
- Explain in detail how it would work.
- Keep the scope as narrow as possible, to make it easier to implement.
- Remember that this is a volunteer-driven project, and that contributions are welcome!
Ready to contribute? Here's how to make a contribution.
- Fork the repo on GitHub.

- Clone your fork locally:

  ```shell
  git clone [email protected]:your_name_here/analphipy.git
  ```

  If the repo includes submodules, you can fetch them either with the initial clone using:

  ```shell
  git clone --recurse-submodules [email protected]:your_name_here/analphipy.git
  ```

  or after the clone using:

  ```shell
  cd analphipy
  git submodule update --init --recursive
  ```
- Initialize pre-commit with:

  ```shell
  pre-commit install
  ```

  To update the hook versions, periodically run:

  ```shell
  pre-commit autoupdate
  ```

  If the hooks change over time, you can clean up old hook environments with:

  ```shell
  pre-commit gc
  ```
- Create a branch for local development:

  ```shell
  git checkout -b name-of-your-bugfix-or-feature
  ```

  Now you can make your changes locally. Alternatively, we recommend using git-flow.
- When you're done making changes, check that your changes pass the pre-commit checks and the tests:

  ```shell
  pre-commit run [--all-files]
  ```

  To run the tests, use:

  ```shell
  pytest
  ```

  To test against multiple python versions, use nox:

  ```shell
  nox -s test
  ```
- Create a changelog fragment. See scriv for more info:

  ```shell
  scriv create --edit
  ```
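For reference, a changelog fragment is just a small markdown file under `changelog.d/`. The sketch below shows what one might look like; the filename and the category header are placeholders, since the actual template and categories come from this repo's scriv configuration:

```shell
# Hypothetical example only: write a changelog fragment by hand.
# scriv normally generates the (timestamped) filename for you.
mkdir -p changelog.d
cat > changelog.d/20240101_120000_your_name.md <<'EOF'
### Changed

- Describe the user-visible change here.
EOF
cat changelog.d/20240101_120000_your_name.md
```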
- Commit your changes and push your branch to GitHub:

  ```shell
  git add .
  git commit -m "Your detailed description of your changes."
  git push origin name-of-your-bugfix-or-feature
  ```

  Note that the pre-commit hooks will force the commit message to be in the conventional style. To assist with this, you may want to commit using commitizen:

  ```shell
  cz commit
  ```
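Since the hooks enforce the conventional style, it can help to sanity-check a message before committing. The sketch below is a rough approximation of that format (a type, an optional scope, then a description); the allowed types shown are common defaults, not necessarily the exact set this repo's hooks accept:

```shell
# Rough sketch of the conventional-commit shape; the types listed here
# (feat, fix, docs, ...) are common defaults, not this repo's exact config.
check_msg() {
  printf '%s\n' "$1" |
    grep -Eq '^(feat|fix|docs|style|refactor|perf|test|build|ci|chore)(\([^)]+\))?!?: .+'
}
check_msg "feat(potentials): add a new potential" && echo "ok"
check_msg "fixed a bug" || echo "rejected"
```

Running this prints `ok` for the first message and `rejected` for the second.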
- Submit a pull request through the GitHub website.
Before you submit a pull request, check that it meets these guidelines:
- The pull request should include tests.
- If the pull request adds functionality, the docs should be updated. Put your
  new functionality into a function with a docstring, and add the feature to
  the list in `CHANGELOG.md`. You should use scriv for this.
- The pull request should work for all supported python versions.
Using pre-commit
It is highly recommended to enable pre-commit. See the pre-commit documentation for installation instructions. To install the pre-commit hooks, run:

```shell
pre-commit install
```

This will run a variety of code checkers (linters) whenever you stage a file for commit. Alternatively, you can run the hooks over all files using:

```shell
pre-commit run --all-files
```

You can also run pre-commit on all files via nox using:

```shell
nox -s lint
```
This project makes extensive use of nox to automate testing, typing, documentation creation, etc. One downside of tox with this particular workflow is the need for multiple scripts/makefiles, while with nox, almost everything is self-contained in the file `noxfile.py`. nox also allows for a mix of conda and virtualenv environments. The default is for the development environment to use conda, while all other environments are virtualenvs. There are conda sessions for testing (`test-conda`), typing (`typing-conda`), docs (`docs-conda`), etc.
If using virtualenvs across multiple python versions (e.g., `test`, `typing`, etc.), you'll need to install python interpreters for each version. If using pyenv, you should be good to go.
Instead of using pyenv, I use uv to manage python versions. For example:

```shell
uv python install 3.12
```

I also set the global uv config file (`~/.config/uv/uv.toml` on mac and linux) to use only managed pythons:

```toml
python-preference = "only-managed"
```
The `noxfile.py` is set up to automatically add the python interpreters installed by uv to the path. Note that a python version needs to be installed before it can be used with nox.

To see all nox sessions, run:

```shell
nox --list
```
To simplify passing options to underlying commands, the options to a particular nox session use `+` instead of `-`. For example, to pass options to pytest, use:

```shell
nox -s test -- ++test-opts -x -v
```

Using `+` for the session option `++test-opts` means we don't have to escape `-x` or `-v`. To see all options:

```shell
nox -- ++help/+h
```

Note that these options should be passed after `--`. For example, to build and open the documentation, run:

```shell
nox -s docs -- +d build open
```
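To see why the `+` prefix helps, here is a toy sketch (not nox's actual parser) of an option handler that collects everything after a `++`-named option verbatim, so that `-`-prefixed values pass through without being mistaken for new options:

```shell
# Toy illustration only: values starting with '-' can safely follow a
# '++'-named option, since only '++test-opts' itself is treated as an option.
collect_test_opts() {
  while [ $# -gt 0 ]; do
    case "$1" in
      ++test-opts) shift; printf '%s\n' "$*"; return ;;
      *) shift ;;
    esac
  done
}
collect_test_opts -s test ++test-opts -x -v   # prints "-x -v"
```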
The project is set up to create `environment.yaml` and `requirements.txt` files from `pyproject.toml`. This can be done using:

```shell
nox -s requirements
```

This uses pyproject2conda to create the requirement files. Note that all requirement files are under something like `requirements/py{version}-{env-name}.yaml` (conda environment) or `requirements/{env-name}.txt` (virtual environment).
Additionally, requirement files for virtualenvs (e.g., `requirements.txt`-like files) will be "locked" using `uv pip compile` from uv. These files are placed under `requirements/lock`. Note that the session `requirements` automatically calls the session `lock`.
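As a concrete illustration of the naming scheme above, the sketch below fills in placeholder values for `{version}` and `{env-name}` (the exact version formatting used by the project may differ):

```shell
# Placeholder values only; illustrates the layout described above.
version="3.11"
env_name="dev"
printf '%s\n' \
  "requirements/py${version}-${env_name}.yaml" \
  "requirements/${env_name}.txt" \
  "requirements/lock/py${version}-${env_name}.txt"
```

This prints the conda environment file, the virtualenv file, and the locked virtualenv file paths, respectively.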
To upgrade the dependencies in the lock files, you'll need to pass the option:

```shell
nox -s lock -- +L/++lock-upgrade
```

This will also update `uv.lock` if it's being used.
The environments created by `nox -s dev`, or by running `make install-kernel`, will try to add meaningful display names for ipykernel. These are installed at the user level. To clean up the kernels (that is, to remove installed kernels that point to a removed environment), you can use the script `tools/clean_kernelspec.py`:

```shell
python tools/clean_kernelspec.py
```
We use nox to isolate the documentation build. Specific tasks can be run with:

```shell
nox -s docs -- +d [commands]
```

where `commands` can be one of:

- `clean`: remove the old doc build
- `build`/`html`: build the html documentation
- `spelling`: check spelling
- `linkcheck`: check the links
- `symlink`: rebuild symlinks from `examples` to `docs/examples`
- `release`: make the pages branch for documentation hosting (using ghp-import)
- `livehtml`: live documentation updates
- `open`: open the documentation in a web browser
- `serve`: serve the built documentation webpage (needed to view javascript in the created pages)
The basic command is:

```shell
nox -s test -- [++test-opts] [++no-cov]
```

where you can pass additional pytest options via `++test-opts`. For example:

```shell
nox -s test -- ++test-opts -x -v
```

Use the session `test-conda` to test under a conda environment.

Note that, by default, these sessions install an isolated copy of the package, as opposed to installing with `pip install -e . --no-deps`. This is similar to how tox works. This uses the nox session `build` behind the scenes, and should therefore be a fast operation.
For the most part, we use grayskull to create the conda recipe. However, I've had issues getting it to play nicely with `pyproject.toml` for some of the 'extra' variables. So, we use grayskull to build the majority of the recipe and append the file `config/recipe-append.yaml`. For some edge cases (install name different from package name, etc.), you'll need to manually edit this file to create the final recipe.

To build the conda recipe using grayskull:

```shell
nox -s conda-recipe -- ++conda-recipe [recipe, recipe-full]
```

To build the conda distribution:

```shell
nox -s conda-build -- ++conda-build [build, clean]
```

To upload the recipe, you'll need to run an external command like:

```shell
nox -s conda-build -- ++conda-build-run "anaconda upload PATH-TO-TARBALL"
```
The basic command is:

```shell
nox -s build
```

To upload the pypi distribution:

```shell
nox -s publish -- +p [release, test]
```

- `test`: upload to testpypi
- `release`: upload to pypi

Run:

```shell
nox -s testdist-pypi -- ++version [version]
```

to test a specific version from pypi, and:

```shell
nox -s testdist-conda -- ++version [version]
```

to do likewise from conda.
Testing notebooks with nbval
To test the notebooks' expected output using nbval, run:

```shell
nox -s test-notebook
```
Run:

```shell
nox -s typing -- +m [commands] [options]
```

Use `typing-conda` to check typing in a conda environment.

Note that the repo is set up to use a single install of mypy and pyright. The script `tools/uvxrun.py` will check whether an appropriate version of the typecheckers is installed. If not, they will be run (and cached) using uvx.
This project uses a host of tools to (hopefully) make development easier. We recommend installing some of these tools system wide. For this, we recommend using uv (or pipx or condax). We mostly use uv, but the choice is yours.

For conda, we recommend actually using mamba. Alternatively, you can set up `conda` to use the faster `mamba` solver. See the conda documentation for details.
To install a development environment using conda/mamba, run:

```shell
conda env create -n {env-name} -f requirements/py{version}-dev.yaml
conda activate {env-name}
pip install -e . --no-deps
```
The easiest way to create a development environment, if using the `uv.lock` mechanism, is:

```shell
uv sync
```

If the project does not use `uv.lock`, or you don't want to use uv to manage your environment, then use one of the following:

```shell
# using venv
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements/lock/py{version}-dev.txt
python -m pip install -e . --no-deps
```

```shell
# using uv
uv venv --python 3.11 .venv
uv pip sync requirements/lock/py{version}-dev.txt
uv pip install -e . --no-deps
```
Note that if the project is set up to use `uv.lock` but you'd like to use one of the above, you may have to run something like:

```shell
uv export --dev > requirements.txt
```

and use this requirements file in the commands above.
If the project includes an ipython kernel, you can install it with:

```shell
make install-kernel
```

Alternatively, you can simply use:

```shell
nox -s dev
```

which will create a virtual environment under `.venv`. If you go this route, you may want to use something like zsh-autoenv (if using the zsh shell) or autoenv (if using bash) to auto-activate the development environment when in the parent directory.
Additional tools are:

- pre-commit
- uv (optional, highly recommended)
- scriv (optional)
- pyright (optional)
- cruft (optional)
- commitizen (optional)
- cog (optional)
- nbqa (optional)

We recommend installing these tools with uv, but feel free to use pipx or condax:

```shell
uv tool/condax/pipx install pre-commit

# optional packages
uv tool/condax/pipx install scriv
uv tool/condax/pipx install uv
uv tool/condax/pipx install pyright
uv tool/condax/pipx install cruft
uv tool/condax/pipx install commitizen
uv tool/condax/pipx install cogapp
uv tool/condax/pipx install nbqa
```
Note that the repo is set up to automatically use uvx for many of these tools. Behind the scenes, the makefile and `noxfile.py` will invoke `tools/uvxrun.py`, which runs the tool via uvx with the proper tool version. Note that if the tool is already installed with the proper version, uvx will use it. This prevents having to install a bunch of tooling in the "dev" environment, and also avoids creating a bunch of throwaway nox environments. This is experimental, and I might change back to using small nox environments again in the future.
Versioning is handled with hatch-vcs. The package version is set by the git tag. For convenience, you can override the version with the nox option `++version ...`. This is useful for updating the docs, etc.

Note that the version in a given environment/session can become stale. The easiest way to update the installed package version is to reinstall the package. This can be done using the following:
```shell
# using pip
pip install -e . --no-deps

# using uv
uv pip install -e . --no-deps
```

To do this in a given session, use:

```shell
nox -s {session} -- +P/++update-package
```