NASA Robotics Competition

Re: NASA Robotics Competition

Postby stealthpaladin » Mon Sep 30, 2013 11:33 am

shr wrote:Different strokes for different folks. Using the ARM for high level supervision and coordination broadens the potential tools and developers that can be applied. Leveraging the FPGA is more hardware efficient. Leveraging the ARM/Linux and high level tools is more developer efficient. Either approach could be effective for the Sample Return Robot Challenge. The promise of such flexibility is one of the attractive aspects of the Parallella. It would be interesting to see teams representing both approaches in the competition.


Hi shr, thanks for your feedback - Gravis and I continued this conversation a bit outside the thread so we wouldn't clutter it, and we came to some agreement later =)

Certainly for any of the actual heavy processing we'll need native code running on the Epiphany/FPGA. For orchestration I've recently been working on this platform: https://www.github.com/stealthpaladin/Approach . It's a very pre-alpha project; I'm collecting as much spare time as I can to finish turning the closed-source code into generic open-source code. Hoping the C++ and Python versions will be out soon.

For robotics I think this package would be useful. Individual physical units like "spine" or "waist" can each be defined as a component, and more complex units can be built from them through composition or nesting. Services can then launch native code that peers into any physically mapped scope and runs occasional validation, so people can glue all sorts of native Epiphany programs together! Hope it works out =J
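Roughly, as a minimal Python sketch - the names here (Component, compose, supervise) are just illustrative placeholders, not the actual Approach API:

[code]
# Hypothetical sketch only: physical units as components, nested into
# larger units, with a supervising walk that runs each unit's validation.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Component:
    """A single physical unit, e.g. a joint, with a validation hook."""
    name: str
    children: List["Component"] = field(default_factory=list)
    # Placeholder hook; the real thing would call into native Epiphany code.
    validate: Callable[[], bool] = lambda: True

    def compose(self, *parts: "Component") -> "Component":
        """Nest sub-units inside this one to build a more complex unit."""
        self.children.extend(parts)
        return self


def supervise(root: Component) -> Dict[str, bool]:
    """Walk the physically mapped scope and record each unit's validation."""
    results: Dict[str, bool] = {}
    stack = [root]
    while stack:
        unit = stack.pop()
        results[unit.name] = unit.validate()
        stack.extend(unit.children)
    return results


if __name__ == "__main__":
    # Build a torso out of simpler units, as described above.
    spine = Component("spine")
    waist = Component("waist")
    torso = Component("torso").compose(spine, waist)

    print(supervise(torso))  # {'torso': True, 'waist': True, 'spine': True}
[/code]

The real platform would attach native Epiphany routines where the Python callables sit, but the composition/nesting shape is the same idea.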