It is my experience that most "new tools", Hadoop included, are developed primarily for 64-bit platforms. It's possible that some components are offered in 32-bit versions. However, we live in a world of very interdependent software components: it takes only a bug or two in a 32-bit version, one that's not getting any developer attention because it's 32-bit, to bring your project to a grinding halt. Better to expend energy, time, or money to start at 64-bit and deal with hard problems rather than asinine ones.
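(As an aside, a quick sketch for checking whether a box is running a 64-bit kernel before installing anything; the list of architecture strings below is illustrative, not exhaustive:)

```shell
# Report whether the running kernel is 64-bit, based on the machine
# architecture string from uname. x86_64/aarch64/ppc64 variants are
# 64-bit; i386/i686/armv7l and friends are 32-bit.
arch=$(uname -m)
case "$arch" in
  x86_64|aarch64|ppc64*) echo "$arch: 64-bit kernel" ;;
  *)                     echo "$arch: likely 32-bit kernel" ;;
esac
```

Note that a 64-bit CPU can still be running a 32-bit kernel, so this checks what the OS is actually doing, not just what the hardware could do.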

/opinion

On Jan 15, 2015 3:16 PM, "Clive DaSilva" <cdasilva@iprimus.ca> wrote:

Hello all

I decided to take a shot at Hadoop, as a lot of my career adventures deal with manipulating large data streams. While Hadoop usually runs on a cluster of Linux boxes, for self-experimental purposes some distros have included Hadoop or work-alikes for single-node activity (Apache Bigtop for Ubuntu being one example, which insists on a 64-bit box). But it's not clear whether you need a 32- or 64-bit computer; after two days of googling the subject, I find the documentation to be quite vague. Does anyone here have experience with this beast? Any suggestions, links, etc. would be appreciated.
---------

Clive DaSilva – cdasilva@iprimus.ca

---------

GTALUG Talk Mailing List - talk@gtalug.org http://gtalug.org/mailman/listinfo/talk
