[master] feat: Add OS grain mappings for AlmaLinux Kitten #66991

Open
wants to merge 4 commits into base: master

Conversation


@jamesharr commented Oct 23, 2024

What does this PR do?

Add OS grain mappings to appropriately detect AlmaLinux Kitten (beta release of AlmaLinux 10) and treat it as a RHEL distribution.

AlmaLinux Kitten Release Notes: https://almalinux.org/blog/2024-10-22-introducing-almalinux-os-kitten/

What issues does this PR fix or reference?

No existing bug/issue

New Behavior

AlmaLinux Kitten is now detected and sets os_family="RedHat" in grains; additionally, the os grain is mapped to "AlmaLinux" rather than "AlmaLinux Kitten".
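
A minimal sketch of the idea (not the exact diff): salt/grains/core.py keeps lookup tables such as _OS_NAME_MAP and _OS_FAMILY_MAP, and the change amounts to an entry along these lines. The key spelling below is an assumption, since the grains code derives and truncates these keys:

    # Hypothetical excerpt; the actual entry added to salt/grains/core.py may differ.
    _OS_NAME_MAP = {
        # ...existing entries...
        # Report the "AlmaLinux Kitten" beta under the plain AlmaLinux name so the
        # os grain (and the existing AlmaLinux -> RedHat family mapping) stays consistent.
        "almalinuxkitten": "AlmaLinux",
    }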

Merge requirements satisfied?

[NOTICE] Bug fixes or features added to Salt require tests.

Commits signed with GPG?

No

Please review Salt's Contributing Guide for best practices, including the PR Guidelines.

See GitHub's page on GPG signing for more information about signing commits with GPG.

@jamesharr jamesharr requested a review from a team as a code owner October 23, 2024 16:00
@salt-project-bot-prod-environment salt-project-bot-prod-environment bot changed the title feat: Add OS grain mappings for AlmaLinux Kitten [master] feat: Add OS grain mappings for AlmaLinux Kitten Oct 23, 2024
@jamesharr
Author

Question for a reviewer: I looked for ways to add tests, and I couldn't find any existing grain testing I could tack this onto.

Anecdotally, I did run this code against an AlmaLinux Kitten VM and it worked. I know this doesn't cover regression testing, but I'm not sure I'm capable of developing the scaffolding to test grain detection at this point in time.

@dmurphy18
Contributor

@jamesharr You can mock the values returned by the function that reads the file, for example, and check that the code processes them correctly. Make sure you use pytest.
Note: all changes need tests exercising the code changes. Using mock for testing, with an appropriate mocked return value, is easy to do; there are plenty of examples in other tests showing how it is done.
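
For illustration, a minimal, generic sketch of that pattern (the function names here are hypothetical, not Salt's; the real grains tests in tests/pytests/unit/grains/test_core.py use their own helpers):

    from unittest.mock import patch

    def read_os_release():
        # Placeholder for the real function that reads /etc/os-release from disk
        raise NotImplementedError

    def detect_os():
        # Placeholder for the code under test: map the distro name to the os grain value
        data = read_os_release()
        name = data.get("NAME", "")
        return "AlmaLinux" if name.startswith("AlmaLinux") else name

    def test_detect_os_almalinux_kitten():
        # Mock the file-reading function so the test is independent of the host OS
        fake = {"NAME": "AlmaLinux Kitten", "ID": "almalinux"}
        with patch(__name__ + ".read_os_release", return_value=fake):
            assert detect_os() == "AlmaLinux"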

@dmurphy18 dmurphy18 self-requested a review October 23, 2024 21:56
@dmurphy18 dmurphy18 added the Needs-Testcase PR needs test cases written, or the issue is about a bug/feature that needs test cases label Oct 23, 2024
@dmurphy18 dmurphy18 added this to the Argon v3008.0 milestone Oct 23, 2024
@jamesharr
Author

@dmurphy18, tests are added.

Let me know if I need to do anything else for this PR. Apart from the changelog fragment, I'm not sure whether there are docs that need to be added.

dmurphy18 previously approved these changes Oct 24, 2024
@jamesharr
Author

I'll get the CI test failures related to the feature change fixed. I wound up running pytest on my laptop (a Mac) and not on a Linux box, so I think it skipped the test. I'm struggling to get pytest to run, though:

% pytest -vx tests/pytests/unit/grains/test_core.py::test_almalinux_kitten_os_grains
<frozen importlib._bootstrap>:1047: ImportWarning: NaclImporter.find_spec() not found; falling back to find_module()
=========================================================================================================== test session starts ===========================================================================================================
platform linux -- Python 3.11.7, pytest-8.1.1, pluggy-1.4.0 -- /home/jharr/workspace/github/jamesharr/salt/.venv/bin/python3.11
cachedir: .pytest_cache
max open files: soft=524288; hard=524288
salt-transport: zeromq
rootdir: /home/jharr/workspace/github/jamesharr/salt
configfile: pytest.ini
plugins: flaky-3.8.1, httpserver-1.0.8, timeout-2.3.1, helpers-namespace-2021.12.29, custom-exit-code-0.3.0, system-statistics-1.0.2, pyfakefs-5.3.1, anyio-4.1.0, subtests-0.4.0, skip-markers-1.5.0, shell-utilities-1.8.0, salt-factories-1.0.1
collected 1 item                                                                                                                                                                                                                          

tests/pytests/unit/grains/test_core.py::test_almalinux_kitten_os_grains FAILED                                                                                                                                                      [100%]

================================================================================================================ FAILURES =================================================================================================================
_____________________________________________________________________________________________________ test_almalinux_kitten_os_grains _____________________________________________________________________________________________________

    @pytest.mark.skip_unless_on_linux
    def test_almalinux_kitten_os_grains():
        """
        Test that 'os' grain will properly detect AlmaLinux Kitten
        """
        # /etc/os-release data taken from an AlmaLinux Kitten VM, install ISO
        # AlmaLinux-Kitten-10-20241018.0-x86_64_v2-minimal.iso. At the time of
        # writting, there was no docker image for Kitten
        _os_release_data = {
            "NAME": "AlmaLinux Kitten",
            "VERSION": "10 (Lion Cub)",
            "ID": "almalinux",
            "ID_LIKE": "rhel centos fedora",
            "VERSION_ID": "10",
            "PLATFORM_ID": "platform:el10",
            "PRETTY_NAME": "AlmaLinux Kitten 10 (Lion Cub)",
            "ANSI_COLOR": "0;34",
            "LOGO": "fedora-logo-icon",
            "CPE_NAME": "cpe:/o:almalinux:almalinux:10::baseos",
            "HOME_URL": "https://almalinux.org/",
            "DOCUMENTATION_URL": "https://wiki.almalinux.org/",
            "VENDOR_NAME": "AlmaLinux",
            "VENDOR_URL": "https://almalinux.org/",
            "BUG_REPORT_URL": "https://bugs.almalinux.org/",
            "ALMALINUX_MANTISBT_PROJECT": "AlmaLinux-10",
            "ALMALINUX_MANTISBT_PROJECT_VERSION": "10",
            "REDHAT_SUPPORT_PRODUCT": "AlmaLinux",
            "REDHAT_SUPPORT_PRODUCT_VERSION": "10",
        }
        expectation = {
            "os": "AlmaLinux",
            "os_family": "RedHat",
            "oscodename": "Lion Cub",
            "osfullname": "AlmaLinux",
            "osrelease": "10",
            "osrelease_info": (10, 0),
            "osmajorrelease": 10,
            "osfinger": "AlmaLinux-10",
        }
>       _run_os_grains_tests(_os_release_data, {}, expectation)

tests/pytests/unit/grains/test_core.py:1197: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/pytests/unit/grains/test_core.py:647: in _run_os_grains_tests
    os_grains = core.os_data()
salt/grains/core.py:2618: in os_data
    (
/usr/lib64/python3.11/platform.py:799: in __iter__
    (self.processor,)
/usr/lib64/python3.11/functools.py:1001: in __get__
    val = self.func(instance)
/usr/lib64/python3.11/platform.py:794: in processor
    return _unknown_as_blank(_Processor.get())
/usr/lib64/python3.11/platform.py:739: in get
    return func() or ''
/usr/lib64/python3.11/platform.py:762: in from_subprocess
    return subprocess.check_output(
/usr/lib64/python3.11/subprocess.py:466: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
/usr/lib64/python3.11/subprocess.py:548: in run
    with Popen(*popenargs, **kwargs) as process:
/usr/lib64/python3.11/subprocess.py:1018: in __init__
    self.stdout = io.TextIOWrapper(self.stdout,
/usr/lib64/python3.11/encodings/__init__.py:99: in search_function
    mod = __import__('encodings.' + modname, fromlist=_import_tail,
.venv/lib64/python3.11/site-packages/mock/mock.py:1178: in __call__
    return _mock_self._mock_call(*args, **kwargs)
.venv/lib64/python3.11/site-packages/mock/mock.py:1182: in _mock_call
    return _mock_self._execute_mock_call(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

_mock_self = <MagicMock name='__import__' id='140589785572240'>, args = ('encodings.utf_8',), kwargs = {'fromlist': ['*'], 'level': 0}, self = <MagicMock name='__import__' id='140589785572240'>
effect = <function _run_os_grains_tests.<locals>._import_mock at 0x7fdd9c3849a0>

    def _execute_mock_call(_mock_self, *args, **kwargs):
        self = _mock_self
        # separate from _increment_mock_call so that awaited functions are
        # executed separately from their call, also AsyncMock overrides this method
    
        effect = self.side_effect
        if effect is not None:
            if _is_exception(effect):
                raise effect
            elif not _callable(effect):
                result = next(effect)
                if _is_exception(result):
                    raise result
            else:
>               result = effect(*args, **kwargs)
E               TypeError: _run_os_grains_tests.<locals>._import_mock() got an unexpected keyword argument 'fromlist'

.venv/lib64/python3.11/site-packages/mock/mock.py:1245: TypeError
============================================================================================================ warnings summary =============================================================================================================
salt/utils/pycrypto.py:27
  /home/jharr/workspace/github/jamesharr/salt/salt/utils/pycrypto.py:27: DeprecationWarning: 'crypt' is deprecated and slated for removal in Python 3.13
    import crypt

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================================================================================================= short test summary info =========================================================================================================
FAILED tests/pytests/unit/grains/test_core.py::test_almalinux_kitten_os_grains - TypeError: _run_os_grains_tests.<locals>._import_mock() got an unexpected keyword argument 'fromlist'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
====================================================================================================== 1 failed, 1 warning in 1.22s =======================================================================================================

jamesharr and others added 2 commits October 25, 2024 14:20
Add OS grain mappings to appropriately detect AlmaLinux Kitten (beta release of AlmaLinux 10) and treat it as a RHEL distribution.

https://almalinux.org/blog/2024-10-22-introducing-almalinux-os-kitten/
@jamesharr
Author

So for whatever reason, a few __import__ mocks were not passing through keyword arguments.

I'm not sure how these tests were passing before or what the environment differences were, but I did add a patch to pass through kwargs in the mock. If you feel this is a mistake, I can remove that commit.

With that addition, the tests now pass on a Linux system when running nox -e 'test-3(coverage=False)' -- -v tests/pytests/unit/grains/test_core.py
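
For context, a hedged sketch of the kind of pass-through fix described above; the real _import_mock helper in tests/pytests/unit/grains/test_core.py may differ in its details, and the intercepted module name here is only illustrative:

    # Keep a reference to the real builtin before __import__ is patched
    built_in_import = __import__

    def _import_mock(name, *args, **kwargs):
        # The earlier signature dropped keyword arguments, so stdlib calls that
        # pass fromlist=/level= raised TypeError. Accepting and forwarding
        # *args/**kwargs lets everything except the intercepted module through.
        if name == "lsb_release":
            raise ImportError("No module named lsb_release")
        return built_in_import(name, *args, **kwargs)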

@dmurphy18
Contributor

@jamesharr try pre-commit install to ensure that, when you run git commit, problems with your changes are caught before the PR is tested. You have pre-commit errors which need fixing before the PR can be reviewed.

Contributor

@twangboy left a comment


Please fix the pre-commit issues

Labels
test:full Run the full test suite

3 participants