Both reports and summaries try to inject a dose of reality into Big Data adoption. I don't plan to summarise them here (ZDNet already did that), but I noticed two important points:
Between one-third and almost half of respondents don't know whether ROI will be positive or negative.
According to Gartner / ZDNet, two-thirds of Big Data projects will fail.
ROI is obviously an elusive measurement: sometimes it's used as a predictive tool to justify investments, and then the measurement never actually takes place. We have seen plenty of Big Data projects that go through an elaborate internal dance of justification using ROI calculations, only for those calculations to be forgotten or ignored once the project is up and running.
For sure, many Big Data projects will fail (lots of our successful projects have been built on the back of failed projects from other suppliers), but most of the customers we talk to are becoming knowledgeable enough to avoid basic pitfalls like poor business linkage, inadequate skills building and so on. The real challenge for them is keeping up with the frenetic pace of change in the technology; I am sure this is the number 1 factor (among many others) that causes projects to fail.
If you feel out of control in setting your Big Data priorities, there is one activity above all others that avoids the pitfalls Gartner outlines. In many ways it's a paradigm shift, as Open Source has moved into mainstream Enterprise adoption. A prerequisite for success is reading, and lots of it. Very basic, but in such a fast-moving environment, with relatively high levels of complexity and still very much tools/platform based, to avoid failure you must (must) spend maybe an hour a day reading blogs, product announcements, Apache project details and white papers. It's that simple. Read lots and don't fail... don't read and fail (lots).
What do you think - do technologists and non-technologists alike have to up their reading time? What are the other critical factors for success in Big Data?