
Feature: add kubernetes api_client parameter #133

Open
alexandrem opened this issue Aug 13, 2019 · 7 comments
Labels
enhancement New feature or request

Comments

@alexandrem

In the context of parallelizing my tests more in the future, I'd like the ability for kubetest to work on different remote Kubernetes clusters during the same pytest run.

At the moment, the kube fixture relies on the global kubeconfig discovered by the kubernetes client module, as in:

client.CoreV1Api().list_node()

What I would like is the ability to instantiate an api_client somewhere and pass it down to the kube fixture somehow, so that it would, for instance, use it to list nodes:

client.CoreV1Api(api_client=self.api_client).list_node()

That definitely requires more thinking about how to instantiate many clients and keep them available for fixtures.

Right now I have a parent fixture that generates different Kubernetes clients, and a child fixture that uses them for the rest of my tests. If I could pass the kubeconfig path to the kubetest client, I could return that and use a different fixture name later instead of the default kube. Or perhaps there's a better way to accomplish this while keeping the standardized kube name.
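A rough sketch of that parent/child fixture pattern might look like this (the fixture names, kubeconfig paths, and the `make_api_client` helper are all illustrative; the only real API assumed is `kubernetes.config.new_client_from_config`):

```python
import pytest


def make_api_client(kubeconfig_path, factory):
    """Hypothetical helper: build one API client for a given kubeconfig.

    In a real setup `factory` would be
    kubernetes.config.new_client_from_config.
    """
    return factory(config_file=kubeconfig_path)


@pytest.fixture(scope="session")
def cluster_clients():
    # Imported lazily so the sketch stays self-contained; assumes the
    # `kubernetes` package is installed.
    from kubernetes import config

    # Illustrative paths; in practice these would point at the clusters
    # provisioned for the test session.
    paths = ["./kubeconfig-a", "./kubeconfig-b"]
    return {p: make_api_client(p, config.new_client_from_config) for p in paths}


@pytest.fixture
def cluster_a_client(cluster_clients):
    # Child fixture: hand one specific cluster's client to a test.
    return cluster_clients["./kubeconfig-a"]
```

The session scope means each kubeconfig is loaded once per run, and every child fixture (or test) shares the same client objects.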

@alexandrem alexandrem changed the title Feature: add kubernetes apiclient parameter Feature: add kubernetes api_client parameter Aug 13, 2019
@edaniszewski
Contributor

Thanks for opening this feature request -- I think this sounds like a pretty sweet idea. Admittedly, I haven't used different API clients in this manner before, so I'll need to read up on it a bit more. Hopefully without sounding too naive about how this works, my initial thought is that this could be done using pytest markers, e.g. something like

def test_case_1(kube):
    """Tests something using the default global kubeconfig."""
    ...

@pytest.mark.kubetest_config('./path/to/other/kubeconfig')
def test_case_2(kube):
    """Tests something using a different kubeconfig."""
    ...

Under the hood, what I'm envisioning is that on test setup, this would register the config with the test case metadata here (kubetest/plugin.py, lines 233 to 244 at ac4104e):

try:
    # Register test case state based on markers on the test case.
    test_case.register_rolebindings(
        *markers.rolebindings_from_marker(item, test_case.ns)
    )
    test_case.register_clusterrolebindings(
        *markers.clusterrolebindings_from_marker(item, test_case.ns)
    )

    # Apply manifests for the test case, if any are specified.
    markers.apply_manifests_from_marker(item, test_case)
    markers.apply_manifest_from_marker(item, test_case)

The general idea being that the test manager would create an ApiClient for each unique value passed to the marker in a test suite and cache it. Any test case with that marker would be configured to use the corresponding ApiClient transparently, so the test could just use the kube fixture as it would before.
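That per-kubeconfig caching could be sketched roughly like this (the `ApiClientCache` name and the `factory` hook are hypothetical, not kubetest API; in a real implementation `factory` would be something like `kubernetes.config.new_client_from_config`):

```python
class ApiClientCache:
    """Hypothetical cache: one ApiClient per unique kubeconfig path."""

    def __init__(self, factory):
        # `factory` builds a client from a kubeconfig path, e.g.
        # kubernetes.config.new_client_from_config in a real implementation.
        self._factory = factory
        self._clients = {}

    def get(self, kubeconfig_path):
        # Create the client on first use, then reuse it for every test
        # case marked with the same kubeconfig path.
        if kubeconfig_path not in self._clients:
            self._clients[kubeconfig_path] = self._factory(kubeconfig_path)
        return self._clients[kubeconfig_path]
```

Two tests marked with the same path would then transparently share one client, while tests marked with different paths get distinct clients.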

That's my initial thought, at least. I'm sure there are other ways it could be done. If that proposed usage sounds reasonable to you, I can start working towards its implementation.

@alexandrem
Author

Static markers will not do it for my use case, since I'm provisioning different clusters at runtime in pytest_generate_tests and passing the cluster attributes to a cluster fixture.

I could probably hack something together with pytest_collection_modifyitems, though?
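As a rough illustration of that hack, a conftest.py hook could attach the proposed kubetest_config marker to each collected item at runtime (the `lookup_cluster_kubeconfig` helper is entirely hypothetical; how a test item maps to its cluster's kubeconfig depends on how the clusters are provisioned):

```python
import pytest


def lookup_cluster_kubeconfig(item):
    # Placeholder: map a test item to its runtime-provisioned cluster's
    # kubeconfig path. A real implementation might inspect
    # item.callspec.params from pytest_generate_tests parametrization.
    return getattr(item, "_cluster_kubeconfig", None)


def pytest_collection_modifyitems(config, items):
    # Attach the cluster's kubeconfig to each collected test via the
    # kubetest_config marker proposed above, instead of a static decorator.
    for item in items:
        kubeconfig = lookup_cluster_kubeconfig(item)
        if kubeconfig is not None:
            item.add_marker(pytest.mark.kubetest_config(kubeconfig))
```

This keeps the marker-based mechanism intact while letting the marker values come from runtime state rather than literals in the test file.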

@edaniszewski
Contributor

Does each of your test cases run in its own runtime-provisioned cluster? Hacking something together with pytest_collection_modifyitems seems like it could work, but I don't think I understand the use case well enough to be particularly insightful about what the best implementation would be.

@alexandrem
Author

@edaniszewski The test cases receive a cluster object which is session-scoped. So I have one runtime-provisioned cluster per suite of dynamically parametrized tests.

@edaniszewski
Contributor

After thinking about this a little more, I suppose one thing that could work is to allow a client to be set on the TestClient object (returned via the kubetest fixture), so then tests are free to use their own client. This would put the onus on the user to update the test client, but it seems like the simplest way to implement this, from what I can see right now.

@pytest.fixture(scope='session')
def custom_api_client():
    return generated_api_client

def test_something(kube, custom_api_client):
    # Manually set the custom API client at the start of the test.
    kube.api_client = custom_api_client

    # Continue to use as you would otherwise.
    kube.load_deployment(...)

Thoughts?

@alexandrem
Author

I think this can work just fine.

@edaniszewski
Contributor

Check out #144 -- I believe that should implement this feature in the most basic way. Let me know what you think!
