A Self-Learning, Brain-Inspired Computer - A New Era of Computing
Abstract: Computing has entered a new era. Brain-like computers can not only automate tasks that once required painstaking programming; they are also fault tolerant. How is this achieved? Mainly because the new machines are based on biological designs, so their algorithms change continuously, which lets the system adapt and work around problems.
According to a report on The New York Times website, computers in Palo Alto, California have entered an age in which they can learn from their own mistakes. Machines built on the new chips can not only automate tasks that require painstaking programming (such as moving a robot's arm smoothly and efficiently), they can also sidestep and even tolerate errors, potentially making the "computer crash" a thing of the past.
The new computing approach is already in use at some large technology companies. Its algorithms are modeled on the biological nervous system; what makes it special is that its "neurons" react to stimuli and cooperate with other neurons to interpret information. Because of this mechanism, the computer can absorb new information while carrying out a task and adjust itself accordingly.
In the coming years, this approach is likely to produce a new generation of artificial intelligence systems that can easily perform functions so far reserved for humans, such as seeing, speaking, listening, navigating, manipulating and controlling. It can handle enormous amounts of data for tasks such as facial and speech recognition, but for now, despite some fault tolerance, these systems remain at an early stage and rely heavily on human programming.
Brain-like computers: a small step forward that is really a giant leap
Designers say that although a thinking or conscious computer still sounds like a science-fiction theme and remains far off, this style of computing clears the way for robots to walk and drive safely in the physical world. Larry Smarr, an astrophysicist devoted to research on these new computer circuits, says the effort is shifting from engineered computing systems toward biological computing.
Conventional computers are limited by the code people program into them. In a computer vision system, for example, the only objects "recognized" are those identifiable by the statistics-oriented algorithms programmed in; an algorithm is like a recipe, a set of step-by-step instructions for performing a calculation. Last year, however, without any supervision, Google researchers got a machine-learning algorithm known as a neural network to carry out an identification task: the network scanned a database of 10 million images, and in doing so trained itself to recognize cats.
In June, Google said it had used these neural network techniques to develop a new search service that helps customers find specific photos more accurately. Neural networks are now used in both hardware and software, driven by the explosion of scientific knowledge about the brain.
Kwabena Boahen, the computer scientist who leads Stanford's brains-in-silicon research program, says this is also one of the approach's limitations, because scientists are still far from fully understanding how the brain works. Even so, he says, these highfalutin theories give him inspiration to build things.
How brain-like computers differ from conventional ones
Until now, computer design has followed ideas laid out by the mathematician John von Neumann about 65 years ago. Microprocessors execute long strings of programmed 0s and 1s at lightning speed, and that information is generally stored separately in memory, whether in the processor itself, in adjacent storage chips or in higher-capacity disk drives. When handling, say, temperatures for a climate model or letters for word processing, the data have only short-term residence: they shuttle in and out of the processor while the programmed action runs, and only the final result is moved to main memory.
The processors of the new computers, by contrast, consist of many electronic components wired together in imitation of biological synapses. Because they are built from large groups of neuron-like elements, they are called neuromorphic processors, a term first coined by the Caltech physicist Carver Mead, who pioneered the concept in the late 1980s.
Because the connections between data are "weighted", the processor is no longer executing "programmed" commands; judging from the correlations in the data, it is learning. In the chip, the weights alter the data flow, causing components to change their values and "spike"; each spike generates a signal that spreads to other components and triggers reactions that ultimately reshape the neural network. In essence, this design works much the way information alters human thoughts and actions.
Dharmendra Modha, the IBM computer scientist who leads the company's cognitive computing research, says that instead of passively feeding data into computation as we do today, we can bring computation to the data. Sensors will become the computer, he adds, and this opens up a new way of using computer chips, which can then be everywhere.
Brain-like computers have clear advantages, but will not replace today's machines
These brain-like computers are still based on silicon chips and, at least for now, will augment rather than replace today's computers. Many computer designers expect them to serve as coprocessors, working in tandem with circuits embedded in smartphones and in the giant centralized computers that make up the cloud. Indeed, modern computers already contain a variety of coprocessors for specialized tasks, such as processing graphics and converting visual and audio data on a phone, while a laptop handles other data.
One of the biggest advantages of the new machines is their ability to tolerate faults. Traditional computers are precise, but they are rigid and crash when they hit a failure. The new computers are different: because they are based on biological designs, their algorithms change continuously, which lets the system keep adapting, work around failures, and complete its task.
Compared with the human brain, brain-like computers still have a long way to go
In fact, such brain simulations are also remarkably energy inefficient, especially compared with a real brain. Last year IBM announced that it had built a supercomputer simulation of the brain encompassing roughly 10 billion neurons, more than 10 percent of a human brain, yet it ran about 1,500 times more slowly than an actual brain. Moreover, it required several megawatts of power, while a biological brain needs only 20 watts.
According to IBM computer scientist Dharmendra Modha, running this supercomputer, known as Compass, at the speed of a human brain would require as much electricity as powering both San Francisco and New York.
Research teams at IBM, Qualcomm and Stanford have already designed neuromorphic processors, and Qualcomm says the first commercial version will come out in 2014, expected to be used mainly for further development. Many universities are now focusing on this new style of computing as well: this fall the National Science Foundation funded a research center formed by MIT, Harvard, Cornell and others that pursues similar work.
"That reflects the zeitgeist," said Terry Sejnowski, a computational neuroscientist at the Salk Institute. "Everyone knows there is something big happening, and they're trying to find out what it is."
Brainlike Computers, Learning From Experience - NYTimes.com
Erin Lubin/The New York Times
Kwabena Boahen holding a biologically inspired processor attached to a robotic arm in a laboratory at Stanford University.
By JOHN MARKOFF
Published: December 28, 2013
PALO ALTO, Calif. — Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.
The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete.
The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals.
In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That can hold enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming.
Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, though a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon.
“We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits.
Conventional computers are limited by what they have been programmed to do. Computer vision systems, for example, only “recognize” objects that can be identified by the statistics-oriented algorithms programmed into them. An algorithm is like a recipe, a set of step-by-step instructions to perform a calculation.
But last year, Google researchers were able to get a machine-learning algorithm, known as a neural network, to perform an identification task without supervision. The network scanned a database of 10 million images, and in doing so trained itself to recognize cats.
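The self-training step can be sketched in miniature. Google's network was a large multi-layer system, but the core idea, a neuron adjusting its own synaptic weights from unlabeled input, shows up even in a single unit trained with Oja's Hebbian rule. All data and constants below are invented for illustration:

```python
import numpy as np

# Toy illustration of unsupervised learning: one linear "neuron" trained
# with Oja's Hebbian rule discovers the dominant direction of variation
# in its inputs without any labels or supervision.
rng = np.random.default_rng(0)

# Synthetic data: 2-D points stretched along the direction (1, 1).
direction = np.array([1.0, 1.0]) / np.sqrt(2.0)
data = rng.normal(size=(5000, 1)) * direction + 0.1 * rng.normal(size=(5000, 2))

w = rng.normal(size=2)            # random initial synaptic weights
lr = 0.01                         # learning rate
for x in data:
    y = w @ x                     # neuron's response to the stimulus
    w += lr * y * (x - y * w)     # Oja's rule: Hebbian growth plus decay

# The weight vector aligns (up to sign) with the data's main direction.
alignment = abs(w @ direction) / np.linalg.norm(w)
print(round(alignment, 3))        # close to 1.0
```

The same principle, repeated across millions of units and layers, is what lets a network build up its own features from raw images.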
In June, the company said it had used those neural network techniques to develop a new search service to help customers find specific photos more accurately.
The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that is also its limitation, as scientists are far from fully understanding how brains function.
“We have no clue,” he said. “I’m an engineer, and I build things. There are these highfalutin theories, but give me one that will let me build something.”
Until now, the design of computers was dictated by ideas originated by the mathematician John von Neumann about 65 years ago. Microprocessors perform operations at lightning speed, following instructions programmed using long strings of 1s and 0s. They generally store that information separately in what is known, colloquially, as memory, either in the processor itself, in adjacent storage chips or in higher capacity magnetic disk drives.
The data — for instance, temperatures for a climate model or letters for word processing — are shuttled in and out of the processor’s short-term memory while the computer carries out the programmed action. The result is then moved to its main memory.
The new processors consist of electronic components that can be connected by wires that mimic biological synapses. Because they are based on large groups of neuron-like elements, they are known as neuromorphic processors, a term credited to the California Institute of Technology physicist Carver Mead, who pioneered the concept in the late 1980s.
They are not “programmed.” Rather the connections between the circuits are “weighted” according to correlations in data that the processor has already “learned.” Those weights are then altered as data flows into the chip, causing them to change their values and to “spike.” That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions.
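One such component can be sketched with a simple leaky integrate-and-fire model: weighted inputs accumulate on a membrane potential that leaks over time, and crossing a threshold emits a spike to downstream units. Real neuromorphic chips implement this in circuitry; the weights and constants here are invented for illustration:

```python
import numpy as np

# Minimal leaky integrate-and-fire element: weighted inputs charge a
# "membrane potential" that decays each step; crossing the threshold
# fires a spike that would travel on to other components.
rng = np.random.default_rng(1)

weights = np.array([0.5, 0.3, 0.2])   # synaptic weights (illustrative)
threshold = 1.0                        # fire when potential crosses this
leak = 0.9                             # fraction of potential kept per step

potential = 0.0
spikes = []
for step in range(100):
    inputs = rng.random(3)             # incoming activity from other units
    potential = leak * potential + weights @ inputs
    if potential >= threshold:
        spikes.append(step)            # signal sent downstream
        potential = 0.0                # reset after firing

print(len(spikes))                     # number of spikes in 100 steps
```

In hardware, learning amounts to adjusting `weights` in response to which spikes coincide, rather than fetching instructions from memory.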
“Instead of bringing data to computation as we do today, we can now bring computation to data,” said Dharmendra Modha, an I.B.M. computer scientist who leads the company’s cognitive computing research effort. “Sensors become the computer, and it opens up a new way to use computer chips that can be everywhere.”
The new computers, which are still based on silicon chips, will not replace today’s computers, but will augment them, at least for now. Many computer designers see them as coprocessors, meaning they can work in tandem with other circuits that can be embedded in smartphones and in the giant centralized computers that make up the cloud. Modern computers already consist of a variety of coprocessors that perform specialized tasks, like producing graphics on your cellphone and converting visual, audio and other data for your laptop.
One great advantage of the new approach is its ability to tolerate glitches. Traditional computers are precise, but they cannot work around the failure of even a single transistor. With the biological designs, the algorithms are ever changing, allowing the system to continuously adapt and work around failures to complete tasks.
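How ever-changing algorithms work around a failure can be sketched with a toy adaptive unit: one of two redundant inputs dies mid-run, and because the unit never stops learning, weight migrates to the surviving input and the error recovers. The setup and numbers are invented for illustration:

```python
import numpy as np

# Sketch of adaptation around a hardware fault: a tiny linear unit is
# trained online with an LMS update; midway, one input "dies" (stuck at
# zero). Continuous learning shifts weight to a redundant input, so the
# error recovers instead of the system crashing.
rng = np.random.default_rng(2)

def run(steps, fail_at):
    w = np.zeros(2)
    errs = []
    for t in range(steps):
        s = rng.normal()                  # underlying signal
        x = np.array([s, s])              # two redundant sensors
        if t >= fail_at:
            x[0] = 0.0                    # sensor 0 fails permanently
        y = w @ x
        err = s - y                       # target: recover the signal
        w += 0.05 * err * x               # LMS update keeps adapting
        errs.append(err ** 2)
    return np.mean(errs[-100:])           # error well after the failure

print(run(2500, fail_at=1000) < 0.01)     # error recovered after the fault
```

A rigid, pre-programmed pipeline reading sensor 0 would simply produce wrong answers forever; here the adaptation itself is the fault tolerance.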
Traditional computers are also remarkably energy inefficient, especially when compared to actual brains, which the new neurons are built to mimic.
I.B.M. announced last year that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons — more than 10 percent of a human brain. It ran about 1,500 times more slowly than an actual brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Dr. Modha said.
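A back-of-envelope check of that claim, assuming "several megawatts" means roughly 4 MW and that power scales linearly with simulation speed. Both are simplifying assumptions for illustration, not figures from the article:

```python
# Naive scaling of Compass's power draw up to real-brain speed.
compass_power_watts = 4e6           # assumed: "several megawatts"
slowdown = 1500                     # Compass runs ~1,500x slower than a brain
brain_power_watts = 20              # a biological brain

at_brain_speed = compass_power_watts * slowdown    # linear-scaling estimate
print(f"{at_brain_speed / 1e9:.0f} GW")            # gigawatt, city-scale power
print(f"{at_brain_speed / brain_power_watts:.0e}") # ratio to a real brain
```

Even this crude estimate lands in the gigawatt range, hundreds of millions of times the brain's 20 watts, which is the gap neuromorphic hardware aims to close.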
I.B.M. and Qualcomm, as well as the Stanford research team, have already designed neuromorphic processors, and Qualcomm has said that it is coming out in 2014 with a commercial version, which is expected to be used largely for further development. Moreover, many universities are now focused on this new style of computing. This fall the National Science Foundation financed the Center for Brains, Minds and Machines, a new research center based at the Massachusetts Institute of Technology, with Harvard and Cornell.
The largest class on campus this fall at Stanford was a graduate level machine-learning course covering both statistical and biological approaches, taught by the computer scientist Andrew Ng. More than 760 students enrolled. “That reflects the zeitgeist,” said Terry Sejnowski, a computational neuroscientist at the Salk Institute, who pioneered early biologically inspired algorithms. “Everyone knows there is something big happening, and they’re trying to find out what it is.”
Related Links:
Brain-like computers: machines that can adapt and learn from experience
Brainlike Computers, Learning From Experience 