Neural Network Introduction — Foreign Literature Translation and Original Text (Undergraduate Thesis)
Uploaded by: Suo Hong | Upload date: 2017-06-03
Graduation Project (Thesis)
Foreign Literature Translation
Chinese title of the literature: Introduction to Neural Networks
English title of the literature: Neural Network Introduction    Source of the literature:
Publication date of the literature:
School (Department):
Major:
Class:
Name:
Student ID:
Supervisor:
Date of translation: 2017.02.14
Foreign Literature Translation
Note: excerpted from the introduction of Neural Network Introduction.
History
The history of artificial neural networks is filled with colorful, creative individuals from many different fields, many of whom struggled for decades to develop concepts that we now take for granted. This history has been documented by various authors. One particularly interesting book is Neurocomputing: Foundations of Research by John
Anderson and Edward Rosenfeld. They have collected and edited a set of some 43 papers of special historical interest. Each paper is preceded by an introduction that puts the paper in historical perspective.
Histories of some of the main neural network contributors are included at the
beginning of various chapters throughout this text and will not be repeated here. However, it seems appropriate to give a brief overview, a sample of the major developments.
At least two ingredients are necessary for the advancement of a technology: concept and implementation. First, one must have a concept, a way of thinking about a topic, some view of it that gives clarity not there before. This may involve a simple idea, or it may be more specific and include a mathematical description. To illustrate this point, consider the history of the heart. It was thought to be, at various times, the center of the soul or a source of heat. In the 17th century medical practitioners finally began to view the heart as a pump, and they designed experiments to study its pumping action. These experiments revolutionized our view of the circulatory system. Without the pump concept, an understanding of the heart was out of grasp.
Concepts and their accompanying mathematics are not sufficient for a technology to mature unless there is some way to implement the system. For instance, the mathematics necessary for the reconstruction of images from computer-aided tomography (CAT) scans was known many years before the availability of high-speed computers and efficient algorithms finally made it practical to implement a useful CAT system.
The history of neural networks has progressed through both conceptual innovations and implementation developments. These advancements, however, seem to have occurred in fits and starts rather than by steady evolution.
Some of the background work for the field of neural networks occurred in the late 19th and early 20th centuries. This consisted primarily of interdisciplinary work in
physics, psychology and neurophysiology by such scientists as Hermann von Helmholtz, Ernst Mach and Ivan Pavlov. This early work emphasized general theories of learning, vision, conditioning, etc., and did not include specific mathematical models of neuron operation.
The modern view of neural networks began in the 1940s with the work of Warren McCulloch and Walter Pitts [McPi43], who showed that networks of artificial neurons could, in principle, compute any arithmetic or logical function. Their work is often acknowledged as the origin of the
neural network field.
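The McCulloch-Pitts result can be illustrated with a minimal sketch: a unit that fires when the weighted sum of its binary inputs reaches a threshold. The function and variable names below are illustrative, not McCulloch and Pitts' original notation, and the weight/threshold choices for AND and OR are just one of many possible settings.

```python
# A McCulloch-Pitts-style neuron: binary inputs, fixed weights, hard threshold.
def mcp_neuron(inputs, weights, threshold):
    """Fire (output 1) if the weighted input sum meets the threshold, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Logical functions realized by choosing weights and thresholds.
AND = lambda a, b: mcp_neuron([a, b], [1, 1], 2)  # fires only when both inputs fire
OR = lambda a, b: mcp_neuron([a, b], [1, 1], 1)   # fires when either input fires
```

Networks of such units, wired together, suffice in principle for any logical function, which is the sense in which the 1943 paper founded the field.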
McCulloch and Pitts were followed by Donald Hebb [Hebb49], who proposed that classical conditioning (as discovered by Pavlov) is present because of the properties of individual neurons. He proposed a mechanism for learning in biological neurons.
The first practical application of artificial neural networks came in the late 1950s, with the invention of the perceptron network and associated learning rule by Frank Rosenblatt [Rose58]. Rosenblatt and his colleagues built a perceptron network and demonstrated its ability to perform pattern recognition. This early success generated a great deal of interest in neural network research. Unfortunately, it was later shown that the basic perceptron network could solve only a limited class of problems. (See Chapter 4 for more on Rosenblatt and the perceptron learning rule.)
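A minimal sketch of the perceptron error-correction rule follows: on each misclassified sample the weights move toward (or away from) the input. The function names, zero initialization, and epoch count are illustrative assumptions, not Rosenblatt's original formulation.

```python
def perceptron_train(samples, epochs=20, lr=1.0):
    """Train one perceptron with the error-correction rule.

    samples: list of (inputs, target) pairs with targets 0 or 1.
    Weights change only when the current prediction is wrong.
    """
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            err = t - y  # +1, 0, or -1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def perceptron_predict(w, b, x):
    """Hard-threshold output of the trained perceptron."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
```

For a linearly separable task such as logical AND, the convergence theorem guarantees this loop finds a correct separating boundary in a finite number of updates; the limitation noted above is exactly that non-separable tasks (such as XOR) admit no such boundary.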
At about the same time, Bernard Widrow and Ted Hoff [WiHo60] introduced a new learning algorithm and used it to train adaptive linear neural networks, which were
similar in structure and capability to Rosenblatt's perceptron. The Widrow-Hoff learning rule is still in use today. (See Chapter 10 for more on Widrow-Hoff learning.)
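The Widrow-Hoff (LMS) rule differs from the perceptron rule in that it trains the linear output before thresholding, descending the squared-error surface. The sketch below is a generic LMS step; the step-size value and function name are illustrative assumptions.

```python
def lms_update(w, x, target, mu=0.05):
    """One Widrow-Hoff (LMS) step on a linear, ADALINE-style neuron.

    The weights move in proportion to (error * input), which is the
    negative gradient of the squared error for this sample.
    """
    y = sum(wi * xi for wi, xi in zip(w, x))  # linear output, no threshold
    e = target - y
    w_new = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]
    return w_new, e
```

Repeated over a realizable data set (targets generated by some true linear map), the weights converge geometrically to that map, which is why the rule remains the workhorse of adaptive filtering.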
Unfortunately, both Rosenblatt's and Widrow's networks suffered from the same inherent limitations, which were widely publicized in a book by Marvin Minsky and Seymour Papert [MiPa69]. Rosenblatt and Widrow were
aware of these limitations and proposed new networks that would overcome them.
However, they were not able to successfully modify their learning algorithms to train the more complex networks.
Many people, influenced by Minsky and Papert, believed that further research on neural networks was a dead end. This, combined with the fact that there were no powerful digital computers on which to experiment,
caused many researchers to leave the field. For a decade neural network research was largely suspended. Some important work, however, did continue during the 1970s. In 1972 Teuvo Kohonen [Koho72] and James Anderson [Ande72] independently and
separately developed new neural networks that could act as memories. Stephen Grossberg
[Gros76] was also very active during this period in the investigation of self-organizing networks.
Interest in neural networks had faltered during the late 1960s because of the lack of new ideas and powerful computers with which to experiment. During the 1980s both of these impediments were overcome, and research
in neural networks increased dramatically. New personal computers and
workstations, which rapidly grew in capability, became widely available. In addition, important new concepts were introduced.
Two new concepts were most responsible for the rebirth of neural networks. The first was the use of statistical mechanics to explain the operation of a certain class of recurrent network, which could be used as an associative memory. This was described in a seminal paper by physicist John Hopfield [Hopf82].
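The associative-memory behavior can be sketched with a small discrete network of the kind Hopfield analyzed: patterns are stored with an outer-product (Hebbian) rule, and recall iterates threshold updates until the state settles into a stored pattern. The pattern length, update schedule, and function names below are illustrative assumptions, not the notation of [Hopf82].

```python
def hopfield_weights(patterns):
    """Store +/-1 patterns via the outer-product rule (zero diagonal)."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def hopfield_recall(W, state, sweeps=5):
    """Recall by repeated sequential threshold updates of each neuron."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(W[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s
```

Starting from a corrupted version of a stored pattern, the updates only ever decrease the network's energy, so the state slides into the nearby stored pattern; this energy argument is the statistical-mechanics connection mentioned above.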
The second key development of the 1980s was the backpropagation algorithm for training multilayer perceptron networks, which was discovered independently by several different researchers. The most influential publication of the backpropagation algorithm was by David Rumelhart and James McClelland [RuMc86]. This algorithm was the
answer to the criticisms Minsky and Papert had made in the 1960s. (See Chapters 11 and 12 for a development of the backpropagation algorithm.)
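The sense in which backpropagation answered Minsky and Papert can be shown on XOR, the classic task a single perceptron cannot solve. The sketch below trains a tiny 2-2-1 sigmoid network by chain-ruling the squared error backward through the layers; the architecture, learning rate, seed, and epoch count are illustrative choices, and a network this small can still land in a poor local minimum.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_xor(epochs=5000, lr=0.5, seed=1):
    """Train a 2-2-1 sigmoid network on XOR with backpropagation."""
    rnd = random.Random(seed)
    W1 = [[rnd.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
    b1 = [rnd.uniform(-1, 1) for _ in range(2)]
    W2 = [rnd.uniform(-1, 1) for _ in range(2)]                      # output weights
    b2 = rnd.uniform(-1, 1)
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(epochs):
        for x, t in data:
            # forward pass
            h = [sigmoid(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i]) for i in range(2)]
            y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
            # backward pass: chain-rule the squared error through each layer
            dy = (y - t) * y * (1 - y)
            dh = [dy * W2[i] * h[i] * (1 - h[i]) for i in range(2)]
            W2 = [W2[i] - lr * dy * h[i] for i in range(2)]
            b2 -= lr * dy
            for i in range(2):
                W1[i] = [W1[i][j] - lr * dh[i] * x[j] for j in range(2)]
                b1[i] -= lr * dh[i]
    def predict(x):
        h = [sigmoid(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i]) for i in range(2)]
        return sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
    return predict
```

The crucial point is the hidden layer: backpropagation supplies the error signal for units that have no directly observable target, which is exactly what Rosenblatt and Widrow could not do for their multilayer proposals.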
These new developments reinvigorated the field of neural networks. In the last ten years, thousands of papers have been written, and neural networks have found many applications. The field is buzzing with new theoretical and practical work. As noted below, it is not clear where all of this will lead us.
The brief historical account given above is not intended to identify all of the major contributors, but is simply to give the reader some feel for how knowledge in the neural
network field has progressed. As one might note, the progress has not always been "slow
but sure." There have been periods of dramatic progress and periods when relatively little
has been accomplished.
Many of the advances in neural networks have had to do with new concepts, such as
innovative architectures and training rules. Just as important has been the availability of
powerful new computers on which to test these new concepts.
Well, so much for the history of neural networks to this date. The real question is,
What will happen in the next ten to twenty years? Will neural networks take a
permanent place as a mathematical/engineering tool, or will they fade away as have so
many promising technologies? At present, the answer seems to be that neural networks
will not only have their day but will have a permanent place, not as a solution to every
problem, but as a tool to be used in appropriate situations. In addition, remember that we
still know very little about how the brain works. The most important advances in neural
networks almost certainly lie in the future.
Although it is difficult to predict the future success of neural networks, the large
number and wide variety of applications of this new technology are very encouraging.
The next section describes some of these applications.
Applications
A recent newspaper article described the use of neural networks in literature
research by Aston University. It stated that the network can be taught to recognize
individual writing styles, and the researchers used it to compare works attributed to
Shakespeare and his contemporaries. A popular science television program recently
documented the use of neural networks by an Italian research institute to test the purity of
olive oil. These examples are indicative of the broad range of applications that can be
found for neural networks. The applications are expanding because neural networks are
good at solving problems, not just in engineering, science and mathematics, but in
medicine, business, finance and literature as well. Their application to a wide variety of
problems in many fields makes them very attractive. Also, faster computers and faster
algorithms have made it possible to use neural networks to solve complex industrial
problems that formerly required too much computation.
The following note and Table of Neural Network Applications are reproduced here from the Neural Network Toolbox for MATLAB with the permission of The MathWorks, Inc.
The 1988 DARPA Neural Network Study [DARP88] lists various neural network applications, beginning with the adaptive channel equalizer in about 1984. This device, which is an outstanding commercial success, is a single-neuron network used in long distance telephone systems to stabilize voice signals. The DARPA report goes on to list other commercial applications, including a small word recognizer, a process monitor, a sonar classifier and a risk analysis system.
Neural networks have been applied in many fields since the DARPA report was written. A list of some applications mentioned in the literature follows.
Aerospace
High performance aircraft autopilots, flight path simulations, aircraft control systems, autopilot enhancements, aircraft component simulations, aircraft component fault detectors
Automotive
Automobile automatic guidance systems, warranty activity analyzers
Banking
Check and other document readers, credit application evaluators
Defense
Weapon steering, target tracking, object discrimination, facial recognition, new kinds of sensors, sonar, radar and image signal processing including data compression, feature extraction and noise suppression, signal/image identification
Electronics
Code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling
Entertainment
Animation, special effects, market forecasting