knowledge, the teacher is able to provide the neural network with a desired response for that training vector. Indeed, the desired response represents the "optimum" action to be performed by the neural network. The network parameters are adjusted under the combined influence of the training vector and the error signal. The error signal is defined as the difference between the desired response and the actual response of the network. This adjustment is carried out iteratively in a step-by-step fashion with the aim of eventually making the neural network emulate the teacher; the emulation is presumed to be optimum in some statistical sense. In this way, knowledge of the environment available to the teacher is transferred to the neural network through training and stored in the form of "fixed" synaptic weights, representing long-term memory. When this condition is reached, we may then dispense with the teacher and let the neural network deal with the environment completely by itself.
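The step-by-step adjustment described above can be sketched as a minimal error-correction rule for a single linear neuron (the delta/LMS rule): compute the error signal as the difference between the desired and actual responses, then nudge each weight in proportion to that error. The learning rate `eta`, the training vector, and the desired response below are illustrative assumptions, not values from the text.

```python
# Minimal sketch of error-correction learning for one linear neuron.
# eta, the training vector, and the desired response are illustrative.

def adjust(weights, x, desired, eta=0.1):
    """One error-correction step: e = d - y, then w_i += eta * e * x_i."""
    y = sum(w * xi for w, xi in zip(weights, x))        # actual response
    e = desired - y                                      # error signal
    return [w + eta * e * xi for w, xi in zip(weights, x)], e

w = [0.0, 0.0]
errors = []
for _ in range(50):                 # iterative, step-by-step adjustment
    w, e = adjust(w, [1.0, 2.0], desired=3.0)
    errors.append(abs(e))

print(errors[0] > errors[-1])       # the error shrinks as the network
                                    # comes to emulate the teacher
```

With a small enough learning rate the error contracts at every step, so the network's actual response converges to the teacher's desired response for this training vector.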
The form of supervised learning we have just described is the basis of error-correction learning. From Fig. 24, we see that the supervised-learning process constitutes a closed-loop feedback system, but the unknown environment is outside the loop. As a performance measure for the system, we may think in terms of the mean-square error, or the sum of squared errors over the training sample, defined as a function of the free parameters (i.e., synaptic weights) of the system. This function may be visualized as a multidimensional error-performance surface, or simply error surface, with the free parameters as coordinates. The true error surface is averaged over all possible input-output examples. Any given operation of the system under the teacher's supervision is represented as a point on the error surface. For the system to improve performance over time and therefore learn from the teacher, the operating point has to move down successively toward a minimum point of the error surface; the minimum point may be a local minimum or a global minimum. A supervised learning system is able to do this with the useful information it has about the gradient of the error surface corresponding to the current behavior of the system.
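The descent of the operating point on the error surface can be sketched as batch gradient descent on the mean-square error of a linear model over a small training sample. The sample, step size, and number of iterations below are illustrative assumptions; the point of the sketch is only that each update moves the operating point downhill toward a minimum.

```python
# Sketch of descending the error surface: batch gradient descent on the
# mean-square error of a linear model. Sample and step size are illustrative.

samples = [([1.0, 0.0], 1.0), ([0.0, 1.0], 2.0), ([1.0, 1.0], 3.0)]

def mse(w):
    """Mean-square error over the training sample: height of the error surface at w."""
    return sum((d - sum(wi * xi for wi, xi in zip(w, x))) ** 2
               for x, d in samples) / len(samples)

def gradient(w):
    """Gradient of the mean-square error with respect to the weights."""
    g = [0.0] * len(w)
    for x, d in samples:
        e = d - sum(wi * xi for wi, xi in zip(w, x))
        for i, xi in enumerate(x):
            g[i] += -2.0 * e * xi / len(samples)
    return g

w = [0.0, 0.0]
history = [mse(w)]
for _ in range(200):
    g = gradient(w)
    w = [wi - 0.1 * gi for wi, gi in zip(w, g)]   # step down the surface
    history.append(mse(w))

print(history[0], history[-1])  # operating point descends toward a minimum
```

Because the mean-square error of a linear model is a quadratic bowl, this descent reaches the global minimum; for a multilayer network the surface has local minima as well, which is exactly the local-versus-global distinction drawn in the text.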
Source: Neural Networks and Learning Machines (English edition, 3rd ed.), Simon Haykin; China Machine Press. Published online 2011-09-20.