On this page we extend our sincere thanks to everyone who has helped with the Chinese localization of the Deep Learning course. If you would prefer not to have your name listed here, would rather appear under an alias or nickname, or would like a personal link attached to your name, please let us know.

(The list below is in no particular order.)

The course's Smartling captions and srt subtitles are maintained by members of the GTC Chinese community and the Deep Learning subtitle team, respectively; details on those contributors will be added later.

If you spot any incorrect information, please report it promptly; we apologize for any inconvenience this may cause.
No. | Lecture | Timing | Translation | Review |
---|---|---|---|---|
01.01 | Welcome | AcceptedDoge | AcceptedDoge / Fei Li | AcceptedDoge |
01.02 | What is a neural network | AcceptedDoge | AcceptedDoge / Fei Li | AcceptedDoge |
01.03 | Supervised Learning with Neural Networks | AcceptedDoge | AcceptedDoge | AcceptedDoge |
01.04 | Why is Deep Learning taking off | AcceptedDoge | AcceptedDoge | AcceptedDoge |
01.05 | About this Course | AcceptedDoge | AcceptedDoge | AcceptedDoge |
01.06 | Course Resources | AcceptedDoge | AcceptedDoge | AcceptedDoge |
02.01 | Binary Classification | Coursera | 谢小彬 | AcceptedDoge |
02.02 | Logistic Regression | AcceptedDoge | 宋泽翰 | AcceptedDoge |
02.03 | Logistic Regression Cost Function | Coursera | 陈明 | AcceptedDoge |
02.04 | Gradient Descent | Coursera | 曹越 | AcceptedDoge |
02.05 | Derivatives | AcceptedDoge / 陈倩倩 | 陈倩倩 | AcceptedDoge |
02.06 | More Derivative Examples | AcceptedDoge | 谢小彬 | AcceptedDoge |
02.07 | Computation graph | AcceptedDoge | 刘振卫 | AcceptedDoge |
02.08 | Derivatives with a Computation Graph | AcceptedDoge | 史红光 | 李智锋 |
02.09 | Logistic Regression Gradient Descent | AcceptedDoge | 彭世锦 | AcceptedDoge |
02.10 | Gradient Descent on m Examples | 庞伟 | 庞伟 | AcceptedDoge |
02.11 | Vectorization | 杨先圣 | 杨先圣 | 李智锋 |
02.12 | More Examples of Vectorization | 舒正英 | 舒正英 | AcceptedDoge |
02.13 | Vectorizing Logistic Regression | hengC-137 | hengC-137 | AcceptedDoge |
02.14 | Vectorizing Logistic Regression's Gradient Output | 李晶 | 李晶 | AcceptedDoge |
02.15 | Broadcasting in Python | 李晶 | 李晶 | AcceptedDoge |
02.16 | A note on python/numpy vectors | 张雲飞 | 张雲飞 | AcceptedDoge |
02.17 | Quick tour of Jupyter/iPython Notebooks | 庞伟 | 舒正英 | AcceptedDoge |
02.18 | Explanation of logistic regression cost function (optional) | 彭世锦 | 彭世锦 | AcceptedDoge |
02.19 | Heroes of Deep Learning (Optional) Pieter Abbeel interview | ihollywhy | ihollywhy | AcceptedDoge |
03.01 | Neural Networks Overview | 李晶 | 李晶 | AcceptedDoge |
03.02 | Neural Network Representation | 谢小彬 | 谢小彬 | 黄海广 |
03.03 | Computing a Neural Network's Output | hengC-137 | hengC-137 | 李智锋 |
03.04 | Vectorizing across multiple examples | 刘通 | 刘通 | 谢小彬 |
03.06 | Activation functions | 潘哲 | 潘哲 | 谢小彬 |
03.09 | Gradient descent for Neural Networks | 李晶 | 李晶 | 谢小彬 |
03.10 | Backpropagation intuition | 谢小彬 | 谢小彬 | 谢小彬 |
03.11 | Random Initialization | 刘振卫 | 刘振卫 | 谢小彬 |
- @wetstreet fixed some grammatical and translation errors #1
- @lzwhard 2.14 Vectorizing Logistic Regression's Gradient Output #2
- @Ldream suggested unifying the file encoding #4
- @Ldream converted the encoding of existing files and added an encoding-conversion tool #6
- @athemeroy raised questions about 2.2 Logistic Regression #7
- @grsgth corrected some sentences #9 #10
- @ihollywhy added the Pieter Abbeel interview #11
- @ForgotPS reported a translation issue with σ in Week 2, video 09 #12 #13 #14
- @carefree0910 Patch 1 #20, extensive improvements