knowledge, the teacher is able to provide the neural network with a desired response for that training vector. Indeed, the desired response represents the "optimum" action to be performed by the neural network. The network parameters are adjusted under the combined influence of the training vector and the error signal. The error signal is defined as the difference between the desired response and the actual response of the network. This adjustment is carried out iteratively in a step-by-step fashion with the aim of eventually making the neural network emulate the teacher; the emulation is presumed to be optimum in some statistical sense. In this way, knowledge of the environment available to the teacher is transferred to the neural network through training and stored in the form of "fixed" synaptic weights, representing long-term memory. When this condition is reached, we may then dispense with the teacher and let the neural network deal with the environment completely by itself.
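The error-correction adjustment described above can be sketched for a single linear neuron. This is a minimal illustration, not the book's code: the "teacher" is simulated by a fixed linear map, and the learning rate and toy data are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "teacher": the desired response is a fixed linear map of the
# input (an assumption for this sketch; any knowledgeable teacher would do).
true_w = np.array([2.0, -1.0, 0.5])

w = np.zeros(3)   # synaptic weights (the free parameters being adjusted)
eta = 0.1         # learning-rate parameter (illustrative choice)

for step in range(500):
    x = rng.normal(size=3)   # training vector drawn from the environment
    d = true_w @ x           # desired response supplied by the teacher
    y = w @ x                # actual response of the network
    e = d - y                # error signal: desired minus actual response
    w += eta * e * x         # step-by-step error-correction adjustment
```

After enough iterations the weights settle near the teacher's, at which point the teacher can be dispensed with: the knowledge is stored in the (now fixed) synaptic weights.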
The form of supervised learning we have just described is the basis of error-correction learning. From Fig. 24, we see that the supervised-learning process constitutes a closed-loop feedback system, but the unknown environment is outside the loop. As a performance measure for the system, we may think in terms of the mean-square error, or the sum of squared errors over the training sample, defined as a function of the free parameters (i.e., synaptic weights) of the system. This function may be visualized as a multidimensional error-performance surface, or simply error surface, with the free parameters as coordinates. The true error surface is averaged over all possible input-output examples. Any given operation of the system under the teacher's supervision is represented as a point on the error surface. For the system to improve performance over time and therefore learn from the teacher, the operating point has to move down successively toward a minimum point of the error surface; the minimum point may be a local minimum or a global minimum. A supervised-learning system is able to do this with the useful information it has about the gradient of the error surface corresponding to the current behavior of the system.
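The descent over the error surface can be sketched concretely: take the sum of squared errors over a fixed training sample as the height of the surface, and move the operating point against the gradient. The data, step size, and single-linear-neuron model below are assumptions chosen to keep the sketch minimal.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))       # training sample of input vectors
d = X @ np.array([1.5, -2.0])      # desired responses (assumed linear teacher)

w = np.zeros(2)                    # starting point on the error surface
eta = 0.005                        # step size (illustrative choice)

def sse(w):
    """Sum of squared errors: the height of the error surface at w."""
    e = d - X @ w
    return float(e @ e)

costs = [sse(w)]
for _ in range(300):
    grad = -2.0 * X.T @ (d - X @ w)   # gradient of the error surface at w
    w -= eta * grad                   # move the operating point downhill
    costs.append(sse(w))
```

Each step uses only local gradient information, yet the recorded costs decrease monotonically toward a minimum point, which is exactly the behavior the closed-loop learning system exhibits under the teacher's supervision.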
Source: Neural Networks and Learning Machines (English Edition, 3rd Edition), by Simon Haykin (Canada)
Origin: Internet · Published: 2011-09-20 18:43:17