Friday, September 17, 2010
Impressive HRP-4 robot will make you bow in deference (video) -- Engadget:
By Thomas Ricker posted Sep 17th 2010 6:29AM
Japan's National Institute of Advanced Industrial Science and Technology (AIST) is back with the mighty impressive HRP-4 humanoid. Created in partnership with Kawada Industries, this 151-centimeter (roughly 5-foot), 39-kilo (86-pound) walking follow-up to the HRP-4C, HRP-3, and HRP-2 robots (pictured in the background) was developed to help take over manufacturing duties from a rapidly aging Japanese workforce. The highly mobile HRP-4 features 34 degrees of freedom, with AIST's proprietary control software running on a Linux core. Things get weird at the 5:30 mark of the video embedded after the break, when a human enters the stage for a good ol' fashioned stare down. Probably has something to do with his hot wife.
Here's the video:
Thursday, September 2, 2010
We should take a look at these and see if they'd be good for the next robot project.
Chumby hacker boards
For hackers who missed out on the Chumby craze, Adafruit has a pile of Chumby PCBs for sale.
The Chumby Hacker Board is a cool single board Linux computer that has much of the same hardware as the famous Chumby One. It's great for people who are experienced with Linux and want to have the power of a microcomputer with audio and video output while at the same time getting all the peripherals of a microcontroller such as analog-to-digital conversion, PWM outputs, sensors, bit twiddling, and broken-out GPIOs!
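As a rough illustration of what the microcontroller-style peripherals buy you, here's a minimal Python sketch of turning a raw ADC reading into a voltage. The bit width and reference voltage are generic defaults for illustration, not the Chumby board's actual specs.

```python
def adc_to_volts(raw, bits=10, vref=3.3):
    """Convert a raw ADC code to a voltage.

    Assumes an unsigned `bits`-wide converter referenced to `vref`.
    These defaults are illustrative, not the Chumby board's real specs.
    """
    max_code = (1 << bits) - 1
    if not 0 <= raw <= max_code:
        raise ValueError("reading out of range for a %d-bit ADC" % bits)
    return raw * vref / max_code

# A full-scale reading on a 10-bit, 3.3 V converter maps to 3.3 V;
# mid-scale lands near half the reference.
print(adc_to_volts(1023))
```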
Wednesday, September 1, 2010
Open source synthetic intelligence project
Another day, another cool-sounding Kickstarter project. This one is titled E1: synthetic intelligence, open source.
E1 is an inexpensive open source hardware kit in the same theme as the Arduino--for bringing synthetic intelligence to electronics projects. We've made tremendous progress over the past year, but now we need your help to get it manufactured.
A while ago we realized even the most powerful microcontrollers are just too limited for complex machine learning tasks. At the same time, we weren't interested in all the overhead of a processor and OS. We wanted something right in the middle, made for the task, to coordinate between our sensors, locomotion, and the user. E1 is a custom core embedded within an FPGA. It requires no PC to use or train, is thoroughly flexible, and completely open.
Here's how it works. Attach inputs like cameras, microphones, and sensors--and output mechanics like servos, actuators, or motors. E1 starts out in an untrained state, but can receive reward and punishment from a remote. It can also detect a set of behaviors, like facial expressions. Over time, E1 not only learns what you teach it, but also learns the conditions that lead to reward and punishment--and thus when it should reward or punish itself.
Surprisingly complex behaviors are possible by combining simple training with the sensory analytics E1 performs on its own. All of these details--from signal decoding to feature detection--are handled for you. Tap the outputs via the header pins on the board itself, or let E1 drive your outputs for you.
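The training loop described above--reward and punishment gradually shaping a mapping from sensed conditions to actions--has the flavor of a bare-bones reinforcement-learning update. This Python sketch is only a guess at that flavor, not E1's actual FPGA algorithm; the condition and action names are made up for illustration.

```python
import random

class RewardLearner:
    """Toy reward/punishment learner in the spirit described above.

    Keeps a score for each (condition, action) pair; a reward press
    nudges the score toward +1, a punishment press toward -1.
    Entirely illustrative -- not E1's documented implementation.
    """

    def __init__(self, actions, lr=0.5):
        self.actions = list(actions)
        self.lr = lr  # how strongly each press moves the score
        self.q = {}   # (condition, action) -> learned score

    def choose(self, condition, explore=0.0):
        # Occasionally try a random action; otherwise pick the best-scored one.
        if random.random() < explore:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q.get((condition, a), 0.0))

    def feedback(self, condition, action, signal):
        """signal: +1 for a reward press, -1 for a punishment press."""
        key = (condition, action)
        old = self.q.get(key, 0.0)
        self.q[key] = old + self.lr * (signal - old)

# Hypothetical session: whenever a frowning face is detected,
# punish "advance" and reward "retreat".
learner = RewardLearner(["advance", "retreat"])
for _ in range(10):
    learner.feedback("frown", "advance", -1)
    learner.feedback("frown", "retreat", +1)
```

After a few presses the learner reliably picks "retreat" when it sees a frown--the same shape of behavior the E1 pitch describes, where the device eventually anticipates reward and punishment on its own.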
See more on the project website. (Note that the Kickstarter funding goal must be met by September 6.)