WIP: Test lowest versions of all required and optional dependencies #3639

Draft · wants to merge 14 commits into main
43 changes: 19 additions & 24 deletions .github/workflows/ci_tests_legacy.yaml
@@ -1,8 +1,8 @@
# Test PyGMT with GMT legacy versions on Linux/macOS/Windows
# Test PyGMT with GMT legacy versions and old Python dependencies on Linux/macOS/Windows
#
# This workflow runs regular PyGMT tests with GMT legacy versions. Due to the minor
# baseline image changes between GMT versions, the workflow only runs the tests but
# doesn't do image comparisons.
# This workflow runs regular PyGMT tests with GMT legacy versions and old versions of
# all optional Python dependencies. Due to the minor baseline image changes between GMT
# versions, the workflow only runs the tests but doesn't do image comparisons.
#
# It is scheduled to run every Tuesday on the main branch.
#
@@ -12,7 +12,7 @@ on:
# push:
# branches: [ main ]
# Uncomment the 'pull_request' line below to trigger the workflow in PR
# pull_request:
pull_request:
# types: [ready_for_review]
# paths:
# - 'pygmt/**'
@@ -62,23 +62,6 @@ jobs:
python=3.10
gmt=${{ matrix.gmt_version }}
ghostscript<10
numpy
pandas
xarray
netCDF4
packaging
contextily
geopandas
ipython
pyarrow
rioxarray
sphinx-gallery
make
pip
python-build
pytest
pytest-doctestplus
pytest-mpl

# Download cached remote files (artifacts) from GitHub
- name: Download remote data from GitHub
@@ -95,10 +78,22 @@
env:
GH_TOKEN: ${{ github.token }}

# Install uv package manager
- name: Install uv
uses: astral-sh/setup-uv@v3

# Install the package that we want to test
- name: Install the package
run: make install
run: |
uv venv
source .venv/bin/activate
uv run --with pip==23 --resolution lowest-direct --all-extras --dev make install
uv pip list
Member Author:
Using --resolution lowest-direct for now, because --resolution lowest also resolves transitive dependencies to their lowest versions, many of which may be hard to install (e.g. they lack wheels for newer Python versions and so require compilation from source). We could switch to --resolution lowest once the Python ecosystem pins lower bounds more thoroughly (which may take years).


# Run the tests but skip images
- name: Run tests
run: make test_no_images PYTEST_EXTRA="-r P"
run: |
source .venv/bin/activate
uv run --with pytest==8,pytest-mpl==0.17,pytest-doctestplus==1.2 --resolution lowest-direct --all-extras --dev make test_no_images PYTEST_EXTRA="-r P"
env:
GMT_LIBRARY_PATH: $CONDA_PREFIX/lib
2 changes: 1 addition & 1 deletion Makefile
@@ -55,7 +55,7 @@ doctest: _runtest
# run tests without image comparisons
# run pytest without the --mpl option to disable image comparisons
# use '-o addopts' to override 'addopts' settings in pyproject.toml file
test_no_images: PYTEST_ARGS=-o addopts="--verbose --durations=0 --durations-min=0.2 --doctest-modules"
test_no_images: PYTEST_ARGS=-o addopts="--verbose --color=yes --durations=0 --durations-min=0.2 --doctest-modules"
test_no_images: _runtest

format:
14 changes: 7 additions & 7 deletions pyproject.toml
@@ -33,18 +33,18 @@ dependencies = [
"numpy>=1.24",
"pandas>=2.0",
"xarray>=2023.04",
"netCDF4",
"packaging",
"netCDF4>=1.7",
Member Author:
We discussed making NetCDF an optional dependency before in #429. Maybe we could revisit that discussion?

Member (@seisman), Nov 22, 2024:
That's a good point. NetCDF4 is needed by xarray for reading and writing netCDF files. As mentioned in #239 (comment), our load_xxx functions (e.g., load_earth_relief) previously called xarray.load_dataarray to load local netCDF grids into an xarray.DataArray object, making netCDF a highly "required" dependency.

After we refactored our load_xxx functions in #3120, we no longer call xarray.load_dataarray, so netCDF should no longer be strictly needed by PyGMT.

I've opened PR #3643 to see how much PyGMT relies on the netCDF package.

Edit: There are only 14 failures in the Python 3.11 + Ubuntu CI job after removing netCDF entirely (see https://github.com/GenericMappingTools/pygmt/actions/runs/11968416461/job/33367210857?pr=3643) and most of them are caused by load_static_earth_relief.
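
For context on the xarray.load_dataarray point above, a minimal illustration (not PyGMT's actual code; the file name is hypothetical) of why that call path made a netCDF backend effectively required:

```python
import xarray as xr

# Opening a local netCDF grid goes through one of xarray's IO backends
# (e.g. netCDF4 or h5netcdf). With no netCDF-capable engine installed,
# this raises an error about missing backends instead of returning the grid.
grid = xr.load_dataarray("earth_relief_01d.nc")  # hypothetical cached grid file
```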

"packaging>=22.0",
]
dynamic = ["version"]

[project.optional-dependencies]
all = [
"contextily",
"geopandas",
"IPython", # 'ipython' is not the correct module name.
"pyarrow",
"rioxarray",
"contextily>=1.2",
Member Author:
Need to pin to contextily>=1.2, otherwise this code block won't work as expected: if contextily=1.1 is installed but xyzservices is not, the from xyzservices import TileProvider line raises an ImportError and _HAS_CONTEXTILY is set to False even though contextily is installed:

try:
    import contextily
    from xyzservices import TileProvider
    _HAS_CONTEXTILY = True
except ImportError:
    TileProvider = None
    _HAS_CONTEXTILY = False

Xref geopandas/contextily#183, where xyzservices was added as a required dependency of contextily (in v1.2.0).
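
As an aside, here is a sketch of how such a guard is typically consumed downstream (the helper and its error message are illustrative, not necessarily PyGMT's exact code); if _HAS_CONTEXTILY is wrongly False, users are told to install contextily even though it is already installed:

```python
def load_tile_map(region, source=None):
    """
    Illustrative sketch of a tile-loading helper guarded by _HAS_CONTEXTILY.
    """
    if not _HAS_CONTEXTILY:
        raise ImportError(
            "Package 'contextily' is required to be installed to use this function."
        )
    # ... fetch tiles via contextily.bounds2img() and assemble the map ...
```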

Member:
Good to know that we require contextily>=1.2.

"geopandas>=0.14",
"IPython>=8", # 'ipython' is not the correct module name.
"pyarrow>=16",
Member Author:
Pinning to pyarrow>=16 here to fix this error:

________ test_to_numpy_pyarrow_array_pyarrow_dtypes_string[string_view] ________
>   ???
E   KeyError: 'string_view'
pyarrow/types.pxi:5025: KeyError
During handling of the above exception, another exception occurred:
dtype = 'string_view'
    @pytest.mark.skipif(not _HAS_PYARROW, reason="pyarrow is not installed")
    @pytest.mark.parametrize(
        "dtype",
        [
            None,
            "string",
            "utf8",  # alias for string
            "large_string",
            "large_utf8",  # alias for large_string
            "string_view",
        ],
    )
    def test_to_numpy_pyarrow_array_pyarrow_dtypes_string(dtype):
        """
        Test the _to_numpy function with PyArrow arrays of PyArrow string types.
        """
>       array = pa.array(["abc", "defg", "12345"], type=dtype)
../pygmt/tests/test_clib_to_numpy.py:333: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pyarrow/array.pxi:230: in pyarrow.lib.array
    ???
pyarrow/types.pxi:5040: in pyarrow.lib.ensure_type
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>   ???
E   ValueError: No type alias for string_view
pyarrow/types.pxi:5027: ValueError

Code was added in #3608:

"string_view",
],
)
def test_to_numpy_pyarrow_array_pyarrow_dtypes_string(dtype):
"""
Test the _to_numpy function with PyArrow arrays of PyArrow string types.
"""
array = pa.array(["abc", "defg", "12345"], type=dtype)
result = _to_numpy(array)
_check_result(result, np.str_)
npt.assert_array_equal(result, array)

The pyarrow.StringViewArray class was added in apache/arrow#39652, which was released in pyarrow v16.0.

Member:
PyArrow v16.0 was released in April 2024. Instead of pinning pyarrow>=16.0, we can just skip string_view if pyarrow<16.0.

Member Author:
OK, I just tested older versions from pyarrow v12 to v15 and think we can pin to pyarrow>=13 (v13 was released on 24 Aug 2023) if we ignore string_view. With v12, there is another error related to datetime:

_________________________________________________________________ test_to_numpy_pyarrow_array_pyarrow_dtypes_date[date64[ms]] __________________________________________________________________

dtype = 'date64[ms]', expected_dtype = 'datetime64[ms]'

    @pytest.mark.skipif(not _HAS_PYARROW, reason="pyarrow is not installed")
    @pytest.mark.parametrize(
        ("dtype", "expected_dtype"),
        [
            pytest.param("date32[day]", "datetime64[D]", id="date32[day]"),
            pytest.param("date64[ms]", "datetime64[ms]", id="date64[ms]"),
        ],
    )
    def test_to_numpy_pyarrow_array_pyarrow_dtypes_date(dtype, expected_dtype):
        """
        Test the _to_numpy function with PyArrow arrays of PyArrow date types.

        date32[day] and date64[ms] are stored as 32-bit and 64-bit integers, respectively,
        representing the number of days and milliseconds since the UNIX epoch (1970-01-01).

        Here we explicitly check the dtype and date unit of the result.
        """
        data = [
            date(2024, 1, 1),
            datetime(2024, 1, 2),
            datetime(2024, 1, 3),
        ]
        array = pa.array(data, type=dtype)
        result = _to_numpy(array)
        _check_result(result, np.datetime64)
>       assert result.dtype == expected_dtype  # Explicitly check the date unit.
E       AssertionError: assert dtype('<M8[D]') == 'datetime64[ms]'
E        +  where dtype('<M8[D]') = array(['2024-01-01', '2024-01-02', '2024-01-03'], dtype='datetime64[D]').dtype

../pygmt/tests/test_clib_to_numpy.py:364: AssertionError

The main change appears to be apache/arrow#33321, in which pyarrow added support for preserving the datetime64 temporal resolution.

Maybe we can keep the string_view test, but add a skip_if(pyarrow.__version__ < 16) marker or something like that.
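
A minimal sketch of that approach (ignoring the existing _HAS_PYARROW guard for brevity, and using packaging, which is already a required dependency, for the version comparison):

```python
import pyarrow as pa
import pytest
from packaging.version import Version

# Skip the 'string_view' case on pyarrow<16, where the type alias doesn't exist.
skip_if_no_string_view = pytest.mark.skipif(
    Version(pa.__version__) < Version("16.0"),
    reason="string_view type requires pyarrow>=16.",
)


@pytest.mark.parametrize(
    "dtype",
    [
        None,
        "string",
        "utf8",  # alias for string
        "large_string",
        "large_utf8",  # alias for large_string
        pytest.param("string_view", marks=skip_if_no_string_view),
    ],
)
def test_to_numpy_pyarrow_array_pyarrow_dtypes_string(dtype):
    ...
```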

Member Author:
Adjusted the pin to pyarrow>=13 in commit 4733bda.

"rioxarray>=0.14",
]

[project.urls]