Feature: add kubernetes api_client parameter #133
Comments
Thanks for opening this feature request -- I think this sounds like a pretty sweet idea. Admittedly, I haven't used different API clients in this manner before, so I'll need to read up on it a bit more. Hopefully without sounding too naive about how this works, my initial thought is that this could be done using pytest markers, e.g. something like:

```python
def test_case_1(kube):
    """Tests something using the default global kubeconfig."""
    ...


@pytest.mark.kubetest_config('./path/to/other/kubeconfig')
def test_case_2(kube):
    """Tests something using a different kubeconfig."""
    ...
```

Under the hood, what I'm envisioning is that on test setup, this would register the config with the test case metadata (lines 233 to 244 in ac4104e). The general idea being that the test manager would create an ApiClient for each unique value passed to the marker in a test suite and cache it. Any test case with said marker would be configured to use those ApiClients transparently, so the test could just use the `kube` fixture as usual.

That's my initial thought, at least. I'm sure there are other ways it could be done. If that proposed usage sounds reasonable to you, I can start working towards its implementation.
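For illustration, a rough sketch of how that marker-driven caching could work -- this is not kubetest's actual implementation; the `kube_client` fixture name and `_client_for` helper are hypothetical, and it assumes the kubernetes Python client:

```python
import kubernetes
import pytest

# Cache of kubeconfig path -> ApiClient, shared across the test session.
_api_clients = {}


def _client_for(kubeconfig_path):
    """Create, or reuse from the cache, an ApiClient for the given kubeconfig."""
    if kubeconfig_path not in _api_clients:
        _api_clients[kubeconfig_path] = kubernetes.config.new_client_from_config(
            config_file=kubeconfig_path,
        )
    return _api_clients[kubeconfig_path]


@pytest.fixture()
def kube_client(request):
    """Resolve the ApiClient for a test, honoring the marker when present."""
    marker = request.node.get_closest_marker('kubetest_config')
    if marker is None:
        # No marker: fall back to the globally discovered kubeconfig.
        return kubernetes.config.new_client_from_config()
    return _client_for(marker.args[0])
```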
Static markers will not do it for my use case, since I'm provisioning different clusters at runtime in [...]. I could probably hack something with [...].
Does each of your test cases run in its own runtime-provisioned cluster? Hacking something together with [...]
@edaniszewski The test cases receive a cluster object which is session-scoped, so I have one runtime-provisioned cluster per suite of dynamically parametrized tests.
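A minimal sketch of that arrangement, where `provision_cluster` and `teardown_cluster` are hypothetical stand-ins for the real runtime provisioning logic:

```python
import pytest


def provision_cluster(spec):
    """Hypothetical stand-in for the real cluster provisioning logic."""
    return {'name': spec}


def teardown_cluster(cluster):
    """Hypothetical stand-in for the real cluster teardown logic."""


# In practice these specs are generated dynamically at runtime.
CLUSTER_SPECS = ['cluster-a', 'cluster-b']


@pytest.fixture(scope='session', params=CLUSTER_SPECS)
def cluster(request):
    """One runtime-provisioned cluster per parametrized suite of tests."""
    c = provision_cluster(request.param)
    yield c
    teardown_cluster(c)
```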
After thinking about this a little more, I suppose one thing that could work is to allow a client to be set on the TestClient object (returned via the `kube` fixture), e.g.:

```python
@pytest.fixture(scope='session')
def custom_api_client():
    return generated_api_client


def test_something(kube, custom_api_client):
    # Manually set the custom API client at the start of the test.
    kube.api_client = custom_api_client

    # Continue to use as you would otherwise.
    kube.load_deployment(...)
```

Thoughts?
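To make that concrete, a minimal sketch of how such a session-scoped fixture could build its client, assuming the kubernetes Python client and a placeholder kubeconfig path:

```python
import kubernetes
import pytest


@pytest.fixture(scope='session')
def custom_api_client():
    # Build an ApiClient from an explicit kubeconfig rather than the
    # globally discovered one. The path here is a placeholder.
    return kubernetes.config.new_client_from_config(
        config_file='./clusters/other-kubeconfig',
    )
```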
I think this can work just fine.
Check out #144 -- I believe that should implement this feature in the most basic way. Let me know what you think!
In the context of parallelizing my tests more in the future, I'd like the ability for kubetest to work on different remote Kubernetes clusters during the same pytest run.
At the moment, the `kube` fixture relies on the global kubeconfig discovered by the kubernetes client module. What I would like is the ability to instantiate an `api_client` somewhere and pass that down to the `kube` fixture somehow, so that it would, for instance, use that client to list nodes. That definitely requires more thinking about how to instantiate many clients and keep them available for fixtures.
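For instance, a minimal sketch of listing nodes through an explicitly constructed client, assuming the kubernetes Python client (the kubeconfig path is a placeholder):

```python
import kubernetes

# Build an ApiClient from an explicit kubeconfig instead of the global one.
api_client = kubernetes.config.new_client_from_config(
    config_file='/path/to/cluster-kubeconfig',
)

# Route CoreV1 API calls through that specific client.
core_v1 = kubernetes.client.CoreV1Api(api_client=api_client)
for node in core_v1.list_node().items:
    print(node.metadata.name)
```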
Right now I have a parent fixture that generates different kubernetes clients and a child fixture that uses them for the rest of my tests. If I could pass the kubeconfig path to the kubetest client, then I could return that and use a different fixture name later instead of the default `kube`. Or perhaps there's a better way to accomplish that and keep the standardized `kube` name.
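That parent/child fixture workaround might look something like the following sketch, with placeholder kubeconfig paths and hypothetical fixture names:

```python
import kubernetes
import pytest

# Placeholder paths; in practice they come from the runtime-provisioned clusters.
KUBECONFIGS = ['/tmp/kubeconfig-a', '/tmp/kubeconfig-b']


@pytest.fixture(scope='session', params=KUBECONFIGS)
def api_client(request):
    """Parent fixture: one ApiClient per kubeconfig path."""
    return kubernetes.config.new_client_from_config(config_file=request.param)


@pytest.fixture()
def core_v1(api_client):
    """Child fixture built on the parent's client, used by the tests."""
    return kubernetes.client.CoreV1Api(api_client=api_client)
```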