Sunday, December 29, 2013

Independent Report: Pivotal Data Dispatch Reached Payback in 3 Months with 377% Annual ROI at NYSE Euronext

This week, Nucleus Research published an independent report profiling how NYSE Euronext solved the challenge of big data in their organization, with big returns, using Pivotal technology that we have now released to the market as Pivotal Data Dispatch. By all accounts, NYSE Euronext was caught between the proverbial rock and a hard place with their data requirements. With their stock trades representing one third of the world’s equities volume, and federal regulations requiring them to keep 7 years of history, by 2006 their data volume was staggering. The cost of maintaining this data, and the penalties the business was paying in data latency, had become unacceptable.
However, instead of following the industry norm and continuing to invest in monolithic data stores with fixed data ceilings, NYSE Euronext read the future and decided to embrace modern big data strategies early, resulting in a scalable and cost-effective data solution. As a result, they are now recognized as an early leader in the big data market, achieving payback on their efforts in just 3 months and enjoying a staggering 377% annual ROI.

Background

To give an idea of the scope of the problem: by 2007, when NYSE Euronext started to look at viable solutions to their growing challenge, even the lower-cost alternative of Massively Parallel Processing (MPP) still carried a market price of about $22,000 per terabyte. At the time, they were generating an average of ¾ of a terabyte a day. Knowing data volumes were only going to grow, and that they needed to keep 7 years on hand, storage of the data alone was a crippling prospect.
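As a rough illustration (my arithmetic, not figures from the Nucleus report), the scale those numbers imply can be sketched directly, assuming a flat ¾ TB/day ingest rate and a constant $22,000/TB MPP price over the full retention window:

```python
# Back-of-the-envelope sketch of the 2007 storage problem described above.
# Assumptions (illustrative, not from the report): a flat 0.75 TB/day ingest
# rate and a constant $22,000/TB MPP price held over the 7-year window.
TB_PER_DAY = 0.75
DAYS_PER_YEAR = 365
RETENTION_YEARS = 7
COST_PER_TB = 22_000

total_tb = TB_PER_DAY * DAYS_PER_YEAR * RETENTION_YEARS   # ~1,916 TB on hand
total_cost = total_tb * COST_PER_TB                       # ~$42 million

print(f"~{total_tb:,.0f} TB retained, ~${total_cost:,.0f} at MPP prices")
```

Even under these simplified assumptions, the steady-state bill lands in the tens of millions, which is why "crippling" is not an exaggeration.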
Actually using the data was also problematic. Typical queries had to be performed by experienced DBAs so as not to upset the infrastructure, average query time was 7 hours, and results frequently required additional filters and follow-up queries. NYSE Euronext executives often waited up to 3 weeks to receive the results of their queries. With data neither ubiquitous nor available in even near real-time, the NYSE knew it was missing opportunities, big ones as it turned out.

The Solution

Their first attempt to solve these data challenges began in 2007. NYSE Euronext set out to develop a solution built on our Greenplum technology (inherited from EMC in our spin-out) and IBM Netezza. The idea was to build a Data Lake, where active and historical data could be self-provisioned on demand by data analysts. Separated from real-time operations, analysts were free to use the data at will.
Data volumes continued to grow, and by 2010 NYSE Euronext was collecting 2 TB a day. While MPP processing was becoming cheaper, the growth rate canceled out any further savings, and their estimated costs were approaching $4.5 million a year just to store data. However, since the data could be federated across commodity hardware, this solution was still estimated at 1/40th the cost of traditional data storage in an analytics environment, and it provided broader access to analysts, something that delivered enormous value to daily operations.
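For a sense of what that 1/40th ratio implies, here is a hypothetical cross-check. The $4.5M/year figure and the 40x multiplier come from the text; the implied traditional-storage cost is my inference, not a number from the report:

```python
# Hypothetical cross-check of the 2010 comparison above. The inputs come from
# the text; the implied traditional-storage total is an inference.
mpp_annual_cost = 4_500_000     # estimated yearly storage cost, commodity MPP
traditional_multiplier = 40     # MPP was ~1/40th the cost of traditional storage

implied_traditional = mpp_annual_cost * traditional_multiplier
print(f"Implied traditional-storage cost: ${implied_traditional:,}/year")
```

In other words, the "expensive" $4.5 million a year was avoiding a bill roughly forty times larger.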
With that in mind, NYSE Euronext took a leap that many organizations to date have not: they went big on big data. With help from vendors like Pivotal, they built a system, now publicly available as Pivotal Data Dispatch, that should be treated as a model for the enterprise.

Key Benefits

For full financial disclosure on the economics, please read the full report by Nucleus Research. Some notable highlights from the report speak volumes about this aggressive approach to big data, and about why big data leaders like NYSE Euronext are showing the market that investing in strategies to make data ubiquitous and real-time really pays off:
  • Power user productivity. With IT removed from the active process of harvesting data, business users are not only in control, they are empowered to use data daily to fully understand their markets. With data readily available, they use it more and make better business decisions.
  • Increased productivity. With the back and forth between IT and the business eliminated from every data request, both sides can focus on their own areas of expertise. Data requests are fulfilled more quickly, with the person who actually knows what they are after, and what the data means, in the driver’s seat. IT can also service the business more effectively while providing less direct help. This is a win-win for everyone.
  • Reduced IT labor costs. The Pivotal Data Dispatch tool services about 2,000 data requests each month. Historically, each request took a DBA about 1 hour, so approximately 2,000 hours, or over 83 man-days, of DBA work can be refocused on other areas of their massive data infrastructure.
  • Reduced decision latency. With the data request cycle compressed from 3 weeks to hours, the nearly 250 data analysts at NYSE Euronext are by default working with fresher data. This reduces decision latency, allowing them to use near real-time data to make important inferences and prove empirically what their markets need.
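The labor figure in the third bullet can be reproduced with simple arithmetic (assumptions as stated there: roughly 2,000 requests a month at about one DBA-hour each, with man-days counted as 24-hour days to match the "over 83"):

```python
# Reproduces the DBA-labor figure cited above. Assumes ~2,000 requests/month
# at ~1 hour each; "man-days" here are 24-hour days, matching the report's math.
REQUESTS_PER_MONTH = 2_000
HOURS_PER_REQUEST = 1

dba_hours = REQUESTS_PER_MONTH * HOURS_PER_REQUEST   # 2,000 hours/month
man_days = dba_hours / 24                            # ~83.3 man-days

print(f"{dba_hours:,} DBA hours/month ≈ {man_days:.1f} man-days")
```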

More on NYSE Euronext and Pivotal Data Dispatch

See more at: http://blog.gopivotal.com/case-studies-2/independent-report-pivotal-data-dispatch-reached-payback-in-3-months-with-377-annual-roi-at-nyse-euronext
