Re: NASA Robotics Competition

PostPosted: Mon Sep 30, 2013 11:33 am
by stealthpaladin
shr wrote:Different strokes for different folks. Using the ARM for high-level supervision and coordination broadens the potential tools and developers that can be applied. Leveraging the FPGA is more hardware efficient. Leveraging the ARM/Linux and high-level tools is more developer efficient. Either approach could be effective for the Sample Return Robot Challenge. The promise of such flexibility is one of the attractive aspects of the Parallella. It would be interesting to see teams representing both approaches in the competition.

Hi shr, thanks for your feedback - Gravis and I had continued this convo a bit outside the thread so we wouldn't clutter it, and we came to some agreement later =)

Certainly, any of the actual heavy processing will need native code running on the Epiphany/FPGA. For orchestration, I've recently been working up this platform at: . It's a very pre-alpha project; I'm collecting as much spare time as I can to finish turning the closed-source code into generic open-source code. Hoping the C++ and Python versions will be out soon.

For robotics, I think this package would be useful. Individual physical units like "spine" or "waist" can be defined as components, and more complex units can be built from them through composition or nesting. Services can then launch native code able to peer into any physically mapped scope and run occasional validation, so that people can glue all sorts of native Epiphany programs together! Hope it works out =J
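To make the component/composition idea a bit more concrete, here's a minimal Python sketch of how it might look. All the names here (`Component`, `Composite`, `scope`, `validate`) are my own illustrative assumptions, not the actual API of the pre-alpha platform described above:

```python
# Hypothetical sketch of defining physical units as components and
# nesting them into larger units, with a validation hook a supervising
# service could call over any physically mapped scope.

class Component:
    """A single physical unit, e.g. 'spine' or 'waist'."""
    def __init__(self, name):
        self.name = name

    def scope(self):
        """Return the physically mapped scope: this unit's name path(s)."""
        return [self.name]

    def validate(self):
        """Occasional sanity check a service could run over this scope."""
        return True


class Composite(Component):
    """A more complex unit built by composing/nesting simpler components."""
    def __init__(self, name, children):
        super().__init__(name)
        self.children = list(children)

    def scope(self):
        # Expose this unit plus all nested units as slash-separated paths,
        # so a service can peer into any level of the physical hierarchy.
        paths = [self.name]
        for child in self.children:
            paths.extend(f"{self.name}/{p}" for p in child.scope())
        return paths

    def validate(self):
        # A composite is valid only if all of its nested units are.
        return all(child.validate() for child in self.children)


# Example: a torso composed of spine and waist units.
torso = Composite("torso", [Component("spine"), Component("waist")])
print(torso.scope())     # -> ['torso', 'torso/spine', 'torso/waist']
print(torso.validate())  # -> True
```

In a setup like this, each native Epiphany program could be attached to one scope path, and the supervising service on the ARM side would just walk the hierarchy to validate and coordinate them.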