Paging Herr Dr. Frankenstein... paging Dr. Victor Frankenstein... (in the background: GRRRRR... growl... "Fire! Get fire! Quick, shoot it! SHOOT IT! KILL IT!"... snap, crackle, POP... howl... silence) Um, Dr. Frankenstein? Your monster is LOOSE again. Or maybe paging Dr...? (HAL) "What are you doing, Dave? Dave, don't do that. Dave, don't pull..." ...silence. :) lol, kooky fucks
The military seeks help in developing an artificial human brain
The US military's research wing, DARPA, is seeking help with a new project that would allow computers to imitate the human neocortex, the region of the brain we use for language, reasoning, and perception. What could go wrong?
Though we already have machine learning algorithms that let computers make decisions based on constantly shifting data, DARPA wants more. They hope for a "cortical computation model" that will allow computers to recognize new data when it's relevant to their tasks, and to adapt based on new inputs from their environments. Though cortical processors sound like something out of a 1980s sci-fi movie, they're exactly what a robot or drone in the field will need to gather intel and make decisions based on what it learns.
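To make the "adapting to shifting data" idea concrete, here's a minimal sketch of online learning: an exponentially weighted estimator that keeps tracking a stream even as the underlying signal drifts. The stream values and learning rate are invented for illustration; nothing here comes from DARPA's actual program.

```python
def make_online_mean(rate=0.1):
    """Return an update function that tracks a drifting data stream."""
    state = {"estimate": 0.0}

    def update(observation):
        # Nudge the estimate a fraction of the way toward each new input,
        # so recent data gradually outweighs stale data.
        state["estimate"] += rate * (observation - state["estimate"])
        return state["estimate"]

    return update

track = make_online_mean(rate=0.2)
for x in [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]:  # the "environment" shifts mid-stream
    print(round(track(x), 3))  # estimate follows the jump from ~1 toward ~5
```

Real field systems would use far richer models, but the core loop is the same: update continuously, never assume the world holds still.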
Writes DARPA:
Although a thorough understanding of how the cortex works is beyond current state of the art, we are at a point where some basic algorithmic principles are being identified and merged into machine learning and neural network techniques. Algorithms inspired by neural models, in particular neocortex, can recognize complex spatial and temporal patterns and can adapt to changing environments. Consequently, these algorithms are a promising approach to data stream filtering and processing and have the potential for providing new levels of performance and capabilities for a range of data recognition problems. The cortical computational model should be fault tolerant to gaps in data, massively parallel, extremely power efficient, and highly scalable. It should also have minimal arithmetic precision requirements, and allow ultra-dense, low power implementations.
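Two of the properties in that wish list, tolerance to gaps in data and minimal arithmetic precision, are easy to illustrate with a toy sparse-pattern matcher in the spirit of neocortex-inspired models. Patterns are stored as sparse sets of active bit indices (no floating point at all), and recognition is just set overlap above a threshold, so a pattern still matches when some of its bits are missing. The patterns and threshold below are made up for the example, not anything from the solicitation.

```python
def overlap(stored, observed):
    """Count how many active bits two sparse patterns share."""
    return len(stored & observed)

def recognize(memory, observed, threshold):
    """Return names of stored patterns whose overlap clears the threshold."""
    return [name for name, bits in memory.items()
            if overlap(bits, observed) >= threshold]

# Hypothetical stored patterns: each is a sparse set of active bit indices.
memory = {
    "cat":   {3, 17, 42, 61, 88},
    "drone": {5, 17, 29, 73, 90},
}

# Input with one "cat" bit dropped and one noise bit added still matches,
# showing fault tolerance to gaps using nothing but integer set operations.
noisy = {3, 42, 61, 88, 99}
print(recognize(memory, noisy, threshold=4))  # -> ['cat']
```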
Read more via Network World