Non-Quantifiable ROI of Big Data Projects


While everyone seems convinced that Big Data is the next big thing in technology, there is considerable uncertainty about how enterprises can actually leverage it. The question is whether Big Data deployment is worth it and, if so, by how much.

Traditional financial measures such as Internal Rate of Return (IRR), Net Present Value (NPV), Payback Period, or Return on Investment (ROI) can be applied to Big Data projects. Typically, a cost-benefit analysis considers the monetized value of the proposed benefit and the cost incurred through implementation. Although this works for most Business Intelligence/data warehousing projects, it doesn't necessarily hold true for Big Data projects. To fully evaluate a Big Data project from a financial standpoint, organizations also have to take the non-quantifiable benefits into consideration.

Listed below are a few hard-to-measure factors that organizations tend to overlook when calculating returns on Big Data projects.

1. More analysis on all of your data, with less investment: With Big Data technologies, multiple business scenarios can be run on the complete data set rather than on a traditional sample. With no sampling errors, your insights are more accurate, a benefit that cannot be expressed in pure ROI terms.

2. Enhanced incremental ROI: The cost of implementing new insights is significantly lower than with traditional BI techniques, for two reasons:

a) The price of traditional BI solutions grows linearly with the size of the data, whereas incremental insights on a Big Data platform require only a few more machines.

b) No specialized hardware is required for Big Data projects; commodity hardware is more than sufficient, which keeps the cost of new insights low.

3. Scalability: Big Data technologies offer effortless scalability, as your data infrastructure can be readily scaled to handle higher data loads without altering your current data setup.

4. Ease of adjustment to changing data requirements: Traditional BI implementations tend to be non-effective the moment data points or structure change. Fortunately, Big Data technologies don't have this constraint; a change in data pointers, schemas, or data volume does not require any modifications to your data systems. If time is money, this flexibility should be factored in when monetizing the benefits for a cost-versus-benefit analysis.

5. The chance of hitting the jackpot: Big Data projects aim to derive insights from data points that could not previously be analyzed. There is always a chance of inadvertently discovering critical insights worth far more than the initial investment.
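The full-data argument in point 1 can be made concrete with a minimal sketch (the data here is synthetic and purely illustrative): a statistic computed over the complete data set has no sampling error, while a traditional sampled analysis always carries some.

```python
import random

# Illustrative only: a synthetic population of one million "purchase values".
random.seed(42)
full_data = [random.gauss(100, 30) for _ in range(1_000_000)]

# Big Data approach: compute over the complete data set.
true_mean = sum(full_data) / len(full_data)

# Traditional approach: estimate from a 1,000-record sample.
sample = random.sample(full_data, 1_000)
sample_mean = sum(sample) / len(sample)

print(f"full-data mean: {true_mean:.2f}")
print(f"sampled mean:   {sample_mean:.2f}")
print(f"sampling error: {abs(true_mean - sample_mean):.2f}")
```

The gap between the two means is the sampling error the article refers to; it disappears entirely when the whole data set is processed.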
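The scalability claim in point 3 rests on the map-reduce style of processing: the job is written once over partitions of the data, and handling more load means adding workers, not rewriting the job. Below is a single-machine sketch of that idea using a thread pool; the partitioning, worker counts, and word-count task are illustrative assumptions, not a real Hadoop deployment.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def map_count(chunk):
    """Map step: count words within one partition of the data."""
    return Counter(chunk.split())

def merge(a, b):
    """Reduce step: merge two partial counts."""
    return a + b

def word_count(partitions, workers):
    """Run the same job over any number of partitions/workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(map_count, partitions))
    return reduce(merge, partials, Counter())

data = ["big data big insights", "data scales out", "big wins"]
# Scaling to a higher load means raising `workers`; the job is untouched.
print(word_count(data, workers=2).most_common(2))  # [('big', 3), ('data', 2)]
```

Because the result is independent of the worker count, capacity can be grown horizontally without altering the existing data setup, which is the point being made above.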
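The flexibility described in point 4 is often called "schema on read": each record carries its own structure, so a new field can appear without any change to the storage layer or to existing jobs. A minimal sketch, with hypothetical event records invented for illustration:

```python
import json

# Hypothetical event stream; a new "channel" field starts appearing,
# with no ALTER TABLE or ETL rewrite required.
raw_events = [
    '{"user": "a1", "amount": 40}',
    '{"user": "b2", "amount": 25, "channel": "mobile"}',
    '{"user": "c3", "amount": 60, "channel": "web"}',
]

events = [json.loads(line) for line in raw_events]

# The old job keeps working, ignoring fields it does not know about.
total = sum(e["amount"] for e in events)

# A new job can exploit the new field wherever it is present.
by_channel = {}
for e in events:
    key = e.get("channel", "unknown")
    by_channel[key] = by_channel.get(key, 0) + e["amount"]

print(total)       # 125
print(by_channel)  # {'unknown': 40, 'mobile': 25, 'web': 60}
```

A fixed relational schema would have required a migration before the third record could even be loaded; here both the old and the new analysis run unchanged.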

Obviously, strategic as well as operational considerations also play a big part in deciding whether or not to go for Big Data. But these broad (or should we say big) insights need to be considered while analyzing any Big Data project from a financial standpoint.


Comments
  • Riven

How is scalability put forth as a non-quantifiable ROI? Let's say I use a cloud service that can technically offer me unlimited scalability if I were to run a Hadoop analysis on my data. But can this be perceived as any sort of ROI for my business?

You say "Traditional BI implementations tend to be non-effective the moment data points or structure change. Fortunately, Big Data technologies don't have this constraint"

But do you know that you have to prepare your input data for ingest? For instance, the above can't be done on the great Informatica's Hadoop product; check this presentation.

So what is the basis your theory rests on?

Would be great if you could give an example of a Hadoop product variant where you can pass in any sort of data structure and have ingest and processing taken care of automatically.

  • Mukund Deshpande

@Riven: Traditional BI technologies (even if they are on the cloud) struggle to give linear scalability with the size of the data. Most of these technologies are not designed for the horizontal scaling that Big Data technologies are built for, and hence typically require additional investment to give you this kind of scalability.

I think what is more important is the framework under which Hadoop operates. This framework makes it more amenable to processing unstructured data, whereas traditional BI technologies are focused on processing structured data in relational form. Most Hadoop platforms support processing unstructured data in some form, and some commercial Hadoop distributions also support extensive text-analytics algorithms; IBM BigInsights is a good example of such a distribution.
