Parallella is dead. So Long, and Thanks for All the Fish


Postby roberto » Fri Jun 02, 2017 4:42 pm

Parallella was a one-man company, with Andreas Olofsson as its principal actor. Now that Andreas has joined the "dark side" (http://www.darpa.mil/staff/mr-andreas-olofsson), all his time will go to the new job and no one will be focused on Parallella. I fear the new 1024-core chip will be "co-opted" by DARPA, and bye-bye to the community. We lived a sweet dream; now it is time to wake up: no Epiphany-V for us.

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby nickoppen » Sat Jun 03, 2017 12:47 am

Roberto you are so dramatic!

It was amazing that the Parallella got off the ground in the first place. Yes the initial enthusiasm has waned but there are still some of the originals active and new people pop up all the time. There is still core development going on with the platform and the tools.

I've had my own businesses in the past and I don't judge anyone negatively for taking a salaried job. Taking on something big is tough.

I enjoy programming on the Parallella because it is hard. Most people don't enjoy doing hard things. Some of the people who got involved early on were absolutely brilliant. I think it comes with the territory that these people often have a short attention span. Others will come, and the lessons learnt with the Parallella are still out there.

I think that it will lead to something big in the future. I don't know when. It might be the E-V but maybe not.

Best regards,

nick
Sharing is what makes the internet Great!

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby sebraa » Sun Jun 04, 2017 7:41 pm

I am sure that the Epiphany V will be a thing. Just not for us.
We have to live with the Parallella and the Epiphany-III, and there won't be a future for either.

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby dobkeratops » Mon Jun 05, 2017 9:00 am

I think that it comes with the territory that these people often have a short attention span


The problem seems to be that it needed to cross a certain critical mass to get a cutting-edge chip mass-produced (hence being stuck at 65nm when you could get traditional chips on better processes).
This chip requires big rethinks in software approaches, whilst the mainstream is edging over from the classic CPU+GPU ensemble, gradually generalising it, with real-world usable software at every step. (The latest Nvidia machines have NVLink for scalability, so they will still be building grids.)

The fact that the Epiphany-V has GPIO is really interesting; I still hope they could do something like the 'Movidius Fathom' USB stick with it: make it a peripheral which could slot into a Raspberry Pi (relying on that for storage, wifi, ..) and have it processing camera inputs or whatever. But we'd have to pull together as a community to organise and fund that, I guess.

Parallella itself was a 'devkit', and the uncertainty was always there: if you put time into something so radical, will the next version actually appear?

I enjoy programming on the Parrallella because it is hard. Most people don't enjoy doing hard things.


That kind of personal challenge is great; but to be useful, the tools that make it easy need to be developed. It seemed really hard to coordinate the community to do this in time.

The complexity of real-world applications continues to rise; it's all about the tooling to handle it better. And the mainstream languages *are* moving in the necessary direction, because of clusters and GPUs. It just takes the ability to write code with the right kind of abstractions.

I note that the ability to roll code using C++ templates has eventually appeared: https://parallella.org/forums/viewtopic.php?f=53&t=4050&sid=2e892751d4ecc2b6f4d88e16f34f95dd (i.e. slotting some sort of kernel function into template code that handles the transfers). That is what was needed: then you'd be able to write code that is portable between CPUs, GPUs and the Epiphany, and also code that is tuneable. If you separate the details of the function from the dataflow framework, you can try different permutations more easily. Code needs to stay in a fluid form.

GPGPU code just deals with a glorified 'for-each', really, but we could go further with 'map-reduce' or whatever. Note that in its primary mode, the GPU is doing a two-stage operation with some dataflow: vertex transformations, then pixel shading, with on-chip dataflow between those two stages (transformed vertices). It's just a special case that the machine is built for. The Epiphany could generalise that.

Baking dataflow with DMA and so on into your actual source code is like trying to program a traditional CPU in assembler, manually doing the register allocation, when we rely on good compilers for that now. Working on AI code or whatever, you have to be able to focus on increasingly complex mathematics and actually getting formulae right, and you need to be able to dive in and write all sorts of debug code and visualisers to have a hope of getting that right. It's just too much to try to do all that *and* think about DMA at the same time. So you break it into layers.

just not for us


I do personally worry that the right chips for AI (and the Epiphany is a great contender) will only be available in a centralised form (data-centres, proprietary systems), which would mean a really bad outcome for everyone. AI is going to change the world (transport, food production, healthcare), and if this is controlled by a few it's a recipe for disaster... a disaster that the average person is sleepwalking toward (they stop buying PCs and rely on phones and the cloud for everything, but keep complaining when the wealth gap grows).

We had the same thing with GPUs, but the way it happened went OK.
You had PCs and dedicated consoles. There were 3D games on open PCs, but then the Sony PlayStation came out, which was dramatically better than software engines on PCs. Still, the PC GPU industry took off with 3dfx, ATI, and of course Nvidia (who survived). The rate of progress in PC graphics cards was actually faster than in the dedicated consoles, and by the time the Xbox 360/PS3 came along, they were just using PC chips (or early access to the latest tech from a PC chip vendor).

The problem with AI today is that computing is moving over toward servers, so 'the common man' doesn't generate demand for an AI chip in a programmable form.
There might be AI chips in phones, but those are locked-down devices; the superpowers will do everything to tie you into their services.

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby roberto » Mon Jun 05, 2017 8:17 pm

nickoppen wrote:Roberto you are so dramatic!

Oh, thank you, I take it as a compliment. Latin blood here, so a dramatic soul is an unavoidable built-in feature.

nickoppen wrote:It was amazing that the Parallella got off the ground in the first place. Yes the initial enthusiasm has waned but there are still some of the originals active and new people pop up all the time. There is still core development going on with the platform and the tools.

The chip appeals to only a small percentage of the audience, not big enough to create a self-sustaining business.
nickoppen wrote:I've had my own businesses in the past and I don't judge anyone negatively for taking a salaried job. Taking on something big is tough.

I don't judge either. Aside from sweet dreams (I had mine in the past too), there is reality to fight with: food to put on the table every day, the dentist for the kids, gasoline for the car, etc. Parallella can't satisfy those needs, and Andreas correctly switched to a salaried job; he must survive.
Not judging, just facing the facts.
nickoppen wrote:
I enjoy programming on the Parallella because it is hard. Most people don't enjoy doing hard things. Some of the people who got involved early on were absolutely brilliant. I think it comes with the territory that these people often have a short attention span. Others will come, and the lessons learnt with the Parallella are still out there.

I think that it will lead to something big in the future. I don't know when. It might be the E-V but maybe not.

Best regards,

nick


We as a community failed him too. This dream was too big for a single man, and we were not enough to support him.
But the main thing is not whether you fail; it is to do your best to reach the goal: failure or victory is a collateral thing. In this case, failure popped up. Now it is time to put it all behind us and walk another street.

I bet there will be no E-V for ordinary people.

best regards.

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby nickoppen » Tue Jun 06, 2017 6:14 am

Roberto, drama is not a bad thing. Especially if you are Latin.

the tools need to be developed that make it easy.


I couldn't agree more. It needs to be something that can be taught at university so that graduates can sit down and start churning out working code. The platform has to be stable, the tools that are available (IDEs, debuggers etc.) have to be solid, and I'd say there needs to be some sort of guidance available for the programmer on how to write efficient (i.e. fast) programs. I don't think it will get off the ground if the algorithm has to be implemented more than once in order to get the best from the multi-core advantage.

AI ... will only be available in a centralised form


I tend to agree. Smart stuff is hard and sometimes takes a lot of data to make it work. Companies who put in this amount of effort will want a commercial return, and locking it down is the only way to get it. I'm still positive about the future though. I think there is enough available for the average person who wants to do something hard to be able to do it.

Thanks for taking the time to put your thoughts down.

nick

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby dobkeratops » Tue Jun 06, 2017 5:28 pm

for the programmer to know how to write efficient (i.e. fast) programs. I don't think that it will get off the ground if the algorithm has to be implemented more than once in order to get the best from multi-core advantage


Functional programming does it, IMO, and it has become quite mainstream now.

It just takes the ability to plug a lambda function into a given framework; C++ finally gained this capability, and eventually the Parallella did get the C++ support to do it.
And of course even more exotic approaches are out there (pure-functional languages etc.), but they are more disparate/incomplete and require more development to get done. It has happened because web, GUI and cluster programming all involve handling asynchronicity and parallelism.

GPGPU programming already has some very convenient programming models available, where you can spawn kernels from within your main program, with type information flowing between the main program and the parallel 'kernel' interface; that is what was sorely missing from the raw C programming model (where the parallel kernel is written as a separately compiled executable, which is very hard to interface with).

Spawning parallel code needs to be as easy as writing a regular for loop, so it can be used routinely... and with the right tools, it is.

https://msdn.microsoft.com/en-us/library/hh265137.aspx
-> https://msdn.microsoft.com/en-us/library/hh873133.aspx

https://devblogs.nvidia.com/parallelfor ... a-c-and-c/

The way I'd ideally see Epiphany programming work is that you build a library of higher-order functions, starting with the most basic 'map' operation (take a stream of data, apply a function), then making more elaborate variations, like 2D convolutions or scatter/gather approaches (e.g. give one function to generate indices, another to compute something), etc. This is similar to the approach modern languages take with various iterator libraries (e.g. see all the 'iterator chaining' stuff in Rust, although Rust is mostly external iteration, and I think parallelism needs internal iteration).

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby nickoppen » Tue Jun 06, 2017 11:22 pm

Again, I totally agree. You are much more up to date with all of this "modern" stuff than I am. Your point about "edging over with the classic CPU+GPU ... with real world useable software at every step" is very true. I think you would need very deep pockets to take the leap with a new hardware platform and be able to pull all of the software together to get it to work.

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby jar » Wed Jun 07, 2017 6:42 pm

Adapteva was smart to have invested in the GCC toolchain early on. The fact that the Epiphany has a real compiler and doesn't require some hacked-up or extended version of C to program sets it beyond 90% of other startup offerings.

Yes, the tools need to be developed, and that takes time and resources (which Adapteva didn't have). Andreas has written at length about finding the right small group of people to make the whole thing work. A small group of people can make an impact on the many, but the goals must be aligned.

Parallel computing isn't going away, and parallel programming will always have challenges. Finding the appropriate amount of software abstraction and flexibility remains an unsolved problem in computer science.

If I thought the story was over, I wouldn't still be here.
User avatar
jar
 
Posts: 294
Joined: Mon Dec 17, 2012 3:27 am

Re: Parallella is dead. So Long, and Thanks for All the Fish

Postby aolofsson » Thu Jun 08, 2017 3:13 am

...feel compelled to respond.

Here's my 2 cents on free will. I chose to destroy my body and mind for 9 years with relentless 80-hour work weeks in an attempt to bring financial security to my wife and kids and to show the world a more efficient way to compute. There was not a single day during that time that Adapteva wasn't on my mind. In late 2016 I decided that Adapteva was never going to break through to the mainstream and that my impact on the world was going to be marginal at best (you will never know just how close we got....), so I chose to move to DARPA, where I get a safe salary, a significant research budget and the ability to affect a whole industry. It wasn't an easy decision, but definitely the right one for me and my family.

So where does that leave the community? Well, it's your choice...but before you choose to abandon the Parallella consider this:
-What is the alternative?
-All of the software is open source so you are free to choose your path
-All the boards are open source so you are free to choose your path
-There are thousands of Parallella boards for sale at Digikey (not running out for many years to come...)
-There are 20,000 Epiphany-III devices on the shelf, so if the E3 fits your needs and you are brave enough and have the $, you could build your own boards
-The Epiphany (3, 4, 5) IP is (and always has been) available for licensing, so if you are brave enough and have the $, you can build your own chip.

What you are not going to see is any chip development from Adapteva in the near future.

The Parallella community was by far the best result of the 9-year Adapteva journey, and I will be forever grateful to folks like @tnt, @dar, @coduin, @ericsson, @polas, @dimako, @nickoppen and others for their incredible efforts to build open source parallel programming tools, tutorials, and libraries. If only the world had paid more attention!

I wish everyone best of luck on their chosen path!

Andreas
