fix: Update all Learn the basics notebooks with the latest version of the source transpiler and remove legacy parts
hmahmood24 committed Oct 19, 2024
1 parent d3c9cd6 commit 76f73fc
Showing 6 changed files with 441 additions and 272 deletions.
2 changes: 1 addition & 1 deletion learn_the_basics.rst
@@ -1,4 +1,4 @@
Learn the basics
Learn the Basics
----------------

.. grid:: 1 1 3 3
79 changes: 31 additions & 48 deletions learn_the_basics/03_trace_code.ipynb
@@ -23,7 +23,7 @@
"source": [
"⚠️ If you are running this notebook in Colab, you will have to install `Ivy` and some dependencies manually. You can do so by running the cell below ⬇️\n",
"\n",
"If you want to run the notebook locally but don't have Ivy installed just yet, you can check out the [Get Started section of the docs.](https://unify.ai/docs/ivy/overview/get_started.html)"
"If you want to run the notebook locally but don't have Ivy installed just yet, you can check out the [Get Started section of the docs.](https://www.docs.ivy.dev/overview/get_started.html)"
]
},
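For reference, a minimal sketch of what such a Colab install cell typically looks like (the actual cell is collapsed in this diff, so the exact package list is an assumption):

    # Hypothetical install cell for Colab; packages beyond `ivy` are assumed
    !pip install -q ivy
    !pip install -q torch jax jaxlib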
{
@@ -40,24 +40,21 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Firstly, let's pick up where we left off in the [last notebook](02_unify_code.ipynb), with our unified `normalize` function:"
"Let's begin with an implementation of the `normalize` function using `ivy`'s Functional API:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"import ivy\n",
"import torch\n",
"\n",
"def normalize(x):\n",
" mean = torch.mean(x)\n",
" std = torch.std(x)\n",
" return torch.div(torch.sub(x, mean), std)\n",
"\n",
"normalize = ivy.unify(normalize, source=\"torch\")"
" mean = ivy.mean(x)\n",
" std = ivy.std(x)\n",
" return ivy.divide(ivy.subtract(x, mean), std)"
]
},
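As a quick sanity check of what this function computes, here is a framework-free sketch of the same standardization (whether the backend uses the population or sample standard deviation is an assumption here; NumPy's default shown is the population form):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    mean, std = x.mean(), x.std()   # population std (ddof=0) assumed
    print((x - mean) / std)         # -> [-1.3416, -0.4472, 0.4472, 1.3416]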
{
@@ -70,47 +67,31 @@
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# set ivy's backend to jax\n",
"ivy.set_backend(\"jax\")\n",
"\n",
"# Import jax\n",
"import jax\n",
"\n",
"# create random jax arrays for testing\n",
"key = jax.random.PRNGKey(42)\n",
"x = jax.random.uniform(key, shape=(10,))"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"As in the previous example, the Ivy function can be executed like so (in this case it will trigger lazy unification, see the [Lazy vs Eager](05_lazy_vs_eager.ipynb) section for more details):"
]
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ivy.array([ 0.55563945, -0.65538704, -1.14150524, 1.46951997, 1.30220294,\n",
" -1.14739668, -0.57017946, -0.91962677, 0.51029003, 0.59644395])"
"ivy.array([ 0.58569533, -0.69083852, -1.20325196, 1.5490098 , 1.37264228,\n",
" -1.20946217, -0.60102183, -0.96937162, 0.53789282, 0.62870705])"
]
},
"execution_count": 5,
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# set ivy's backend to jax\n",
"ivy.set_backend(\"jax\")\n",
"\n",
"# Import jax\n",
"import jax\n",
"\n",
"# create random jax arrays for testing\n",
"key = jax.random.PRNGKey(42)\n",
"x = jax.random.uniform(key, shape=(10,))\n",
"normalize(x)"
]
},
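The cells that follow call a `traced` version of this function. The tracing step itself is collapsed in this diff, but with Ivy's graph tracer it would look roughly like the sketch below (the exact call signature is an assumption):

    # Trace a backend-native computation graph for normalize
    # (API name and signature assumed: ivy.trace_graph with example args)
    traced = ivy.trace_graph(normalize, args=(x,))
    traced(x)  # runs the traced graph and returns a native JAX array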
@@ -137,7 +118,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The traced function can be executed in exactly the same manner as the non-traced function (in this case it will also trigger lazy graph tracing, see the [Lazy vs Eager](05_lazy_vs_eager.ipynb) section for more details):"
"The traced function can be executed in exactly the same manner as the non-traced function:"
]
},
{
@@ -148,8 +129,8 @@
{
"data": {
"text/plain": [
"Array([ 0.5556394 , -0.655387 , -1.1415051 , 1.4695197 , 1.3022028 ,\n",
" -1.1473966 , -0.5701794 , -0.91962665, 0.51028997, 0.5964439 ], dtype=float32)"
"Array([ 0.5856953 , -0.6908385 , -1.203252 , 1.5490098 , 1.3726423 ,\n",
" -1.2094622 , -0.6010218 , -0.9693716 , 0.5378928 , 0.62870705], dtype=float32)"
]
},
"execution_count": 9,
@@ -171,14 +152,14 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"985 µs ± 6.76 µs per loop (mean ± std. dev. of 7 runs, 1,000 loops each)\n"
"138 ms ± 3.57 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n"
]
}
],
@@ -196,7 +177,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
"69.5 µs ± 1.24 µs per loop (mean ± std. dev. of 7 runs, 10,000 loops each)\n"
"122 µs ± 2.02 µs per loop (mean ± std. dev. of 7 runs, 10,000 loops each)\n"
]
}
],
Expand All @@ -210,7 +191,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"As expected, we can see that `normalize` is slower, as it includes all `ivy` wrapping overhead. On the other hand, `traced` has no wrapping overhead and it's more efficient!"
"As expected, we can see that `normalize` is slower, as it includes all `ivy` wrapping overhead. On the other hand, `traced` has no wrapping overhead and it's more efficient!\n",
"\n",
"> Fun Fact: You can use the graph tracer with pretty much any code written in one of the ML frameworks Ivy supports i.e. PyTorch, TensorFlow, Jax, NumPy etc. and speed it up by removing unnecessary computations that don't contribute towards the output by extracting an efficient computation graph stitched together in the set backend framework!"
]
},
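To make that point about unnecessary computations concrete, here is a hedged sketch: the intermediate `cumsum` below never reaches the output, so the tracer is expected to drop it from the extracted graph (the function names and the `trace_graph` call are illustrative assumptions, not part of this diff):

    import torch
    import ivy

    ivy.set_backend("torch")

    def noisy_normalize(t):
        unused = torch.cumsum(t, dim=0)   # dead computation: never used below
        mean = torch.mean(t)
        std = torch.std(t)
        return torch.div(torch.sub(t, mean), std)

    t = torch.rand(10)
    graph = ivy.trace_graph(noisy_normalize, args=(t,))
    print(graph(t))  # same result as noisy_normalize(t), without the cumsum op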
{
@@ -226,13 +209,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"That's it, you can now trace `ivy` code for more efficient inference! However, there are several other important topics to master before you're ready to unify ML code like a pro 🥷. Next, we'll be learning how to transpile code from one framework to another in a single line of code 🔄"
"That's it, you can now trace `ivy` code for more efficient inference! However, there are several other [important topics](https://www.docs.ivy.dev/demos/learn_the_basics.html) to master before you're ready to play with ML code like a pro 🥷."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "tracer-transpiler",
"language": "python",
"name": "python3"
},
@@ -246,7 +229,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.10"
"version": "3.10.13"
},
"orig_nbformat": 4
},