100% All-Natural Composition
No Artificial Intelligence!
Showing posts with label neurobiology. Show all posts

Saturday, March 07, 2026

Boffins in Australia make human neurons play Doom

For some time now there has been one ultimate benchmark for whether or not something has a truly computerized central component: Does it play Doom?  More than thirty years after id Software released their 3D first-person shooter, it has been ported to seemingly everything from refrigerators to tractors to home pregnancy tests.  And more often than not, those gadgets do indeed run Doom.

A few years ago an outfit in Australia called Cortical Labs, working with lab-grown human neural cells, coaxed a Petri dish of neurons into playing Pong.  Which is about as rudimentary a video game as they come.  Almost immediately the company started getting asked the same question: Could their "brains on a chip" play Doom?  Something that would require substantially more sensory input and calculation than simply moving a virtual paddle up and down.  The real question was, people didn't want to know if the neurons could run Doom.  They wanted to know if the neurons could play it with some semblance of a human being's participation in the game.

Lo and behold, Cortical Labs has done it.

A little high-tech box containing 200,000 human neurons is now playing Doom at Cortical Labs's facility.  It's playing Freedoom, a free re-creation of the game's content that runs on the engine source code id released a long time ago.  The classic imps, cacodemons and former humans aren't in the lab's version of the game - that would be copyright infringement - but it's still the same basic design and functionality.  "Doomguy" moves around in Doom's simulated 3D space and fires his weapons at the generic enemy targets, just as he would if a living person were operating him.

Furthermore, the neurons are gradually learning the parameters of the game.  They are getting better.  One can only wonder what would happen if they got turned loose in the classic doom.wad game file and started discovering how to evade, and then shoot back at, the original in-game enemies running their primitive-by-today's-standards but awesome-back-in-the-day programmed artificial intelligence.

This is a major step forward toward the development of true AI.  If this kind of technology migrates out of the lab and into industrial production, Lord only knows what kind of applications could come of it.  Some good, and some... not so much.  This is the sort of development that William Gibson wrote about in his novel Count Zero (the sequel to Neuromancer) forty years ago.  It was kind of scary then and it's a bit scary now.

I guess there could be some benefit though.  If you ever needed a more-than-silicon opponent to play Call of Duty against, there might be an ever-ready one sitting in a lab dish waiting to compete with you.  That might be a spinoff (albeit not an altogether comfortable one to have).

Friday, June 25, 2010

Meet Oscar: The world's first bionic cat

Oscar the cat lost most of his back legs in a harvester accident this past fall. But thanks to some British researchers, Oscar is now enjoying a fully functional life complete with two fore paws and two faux paws!

Popular Science has more about Oscar: the world's first bionically-enabled feline. It's thought that the technology will soon be applicable to human patients.

And here's some video of Oscar strutting his stuff!

He just needs some adamantium claws in his front paws and he'll be all set :-P

Wednesday, March 24, 2010

Later start time decreasing absenteeism in high school students

Don'cha wish we knew this when we were in high school!

(Oh who am I kidding? North Carolina's government is so bass-ackwards on everything, the concept would never even get the chance to fly here...)

Anyway, an experiment being conducted by an Oxford neuroscience professor at Monkseaton High School in North Tyneside in Great Britain has had students starting classes an hour later than usual, at 10 a.m. The remarkable findings of the experiment thus far are that the later class time has caused an 8% drop in general absence and a 27% drop in chronic absenteeism. Furthermore, memory testing done on the students indicates that the best time for learning more difficult lessons is in the afternoon. Researchers believe that teenagers wanting to sleep in is not a matter of laziness, but merely a component of biology adjusting during the adolescent years.

(Or maybe it's just that they're staying up at later hours playing World of Warcraft? :-P)

Thursday, October 29, 2009

Bad driving has a genetic component

Have a horrible driving record? It might be in your DNA. Here's what scientists at the University of California at Irvine have found...
People with a particular gene variant performed more than 20 percent worse on a driving test than people without it - and a follow-up test a few days later yielded similar results. About 30 percent of Americans have the variant.

"These people make more errors from the get-go, and they forget more of what they learned after time away," says Dr. Steven Cramer, neurology associate professor and senior author of the study published recently in the journal Cerebral Cortex.

This gene variant limits the availability of a protein called brain-derived neurotrophic factor during activity. BDNF keeps memory strong by supporting communication among brain cells and keeping them functioning optimally. When a person is engaged in a particular task, BDNF is secreted in the brain area connected with that activity to help the body respond.

Previous studies have shown that in people with the variant, a smaller portion of the brain is stimulated when doing a task than in those with a normal BDNF gene. People with the variant also don't recover as well after a stroke. Given these differences, the UCI scientists wondered: Could the variant affect an activity such as driving?

"We wanted to study motor behavior, something more complex than finger-tapping," says Stephanie McHughen, graduate student and lead author of the study. "Driving seemed like a good choice because it has a learning curve and it's something most people know how to do."

The driving test was taken by 29 people - 22 without the gene variant and seven with it. They were asked to drive 15 laps on a simulator that required them to learn the nuances of a track programmed to have difficult curves and turns. Researchers recorded how well they stayed on the course over time. Four days later, the test was repeated.

Results showed that people with the variant did worse on both tests than the other participants, and they remembered less the second time. "Behavior derives from dozens and dozens of neurophysiologic events, so it's somewhat surprising this exercise bore fruit," Cramer says.

And we are now one step closer toward understanding my family :-P

Thursday, September 10, 2009

Motivation: A requisite for useful artificial intelligence?

Edward Boyden has a fascinating essay at MIT's Technology Review website in which he describes a problem that could possibly arise from super-smart artificial intelligence. The problem, Boyden notes, is motivation: even with all of that intelligence and computational power, how does a possibly sentient computer become moved to utilize that power?
Indeed, a really advanced intelligence, improperly motivated, might realize the impermanence of all things, calculate that the sun will burn out in a few billion years, and decide to play video games for the remainder of its existence, concluding that inventing an even smarter machine is pointless. (A corollary of this thinking might explain why we haven't found extraterrestrial life yet: intelligences on the cusp of achieving interstellar travel might be prone to thinking that with the galaxies boiling away in just 10^19 years, it might be better just to stay home and watch TV.) Thus, if one is trying to build an intelligent machine capable of devising more intelligent machines, it is important to find a way to build in not only motivation, but motivation amplification--the continued desire to build in self-sustaining motivation, as intelligence amplifies. If such motivation is to be possessed by future generations of intelligence--meta-motivation, as it were--then it's important to discover these principles now.
A second possibility that Boyden theorizes is that a strong AI might simply become overwhelmed by its own decision-making process and lock up contemplating factors and uncertainties (which sounds a lot like the "rampancy" that eventually afflicts AIs in the Halo franchise).

It's a very deep and most intriguing read about what may or may not be waiting for us around the corner from the realm of computers and neuroscience. Click here and partake of the article... if you think your brains can handle it :-)

Thursday, July 30, 2009

Coloring for blue M&Ms found to heal spinal injuries

In what has to be one of the more bizarre bits of medical news we've heard lately, the blue dye used to give blue M&Ms their color has been found to help mend severe spinal injuries.

Researchers at the University of Rochester Medical Center in New York discovered that when tested on laboratory rats, Brilliant Blue G blocks the action of a chemical that causes more damage to neural tissue around an already injured area. Rats with damaged spines that received injections of BBG eventually regained the ability to walk, while those that did not receive the BBG treatment never recovered. The one side effect found so far: injections of BBG cause the skin to temporarily turn bright blue.

Research is still being conducted, but it's thought that human trials with BBG may begin within the next few years.

(I wonder if Brilliant Blue G can counteract all those effects of Yellow 5 in Mountain Dew that my old roomie used to tell me about...)