
Blame Big Data as Poor IT Performance Costs Billions?


By Jim D’Arezzo   |   Tuesday, 05 March 2019 12:42 PM EST

The explosion of “big data” in the past few years means that businesses in virtually every industry are now collecting, storing and using large amounts of data – from healthcare to automotive.

Accompanying this explosion is the need for massive IT spending in order to handle all this data; global IT spending is expected to reach $3.87 trillion by 2020. That number is only going up.

But while people are worrying about the staggering amounts of data being collected, they aren’t paying much attention to how overwhelmed most IT infrastructure is trying to keep up with it. They especially seem to have overlooked the massive performance issues all this data is causing and how much it’s costing entire industries.

Healthcare is one industry that is struggling to keep up with increased data usage. New digital tools and equipment are constantly being adopted, and electronic medical imaging needs have exploded. A whole human genome already requires hundreds of gigabytes to store, and with sequence data doubling (yes, doubling) every seven to nine months, hospitals are going to need exponentially more storage going forward. For an underfunded hospital already struggling to stay afloat, data needs that grow this fast can be overwhelming.

Retail also makes heavy use of data. Brands practicing omnichannel retail often use the same inventory to fulfill both brick-and-mortar and online orders. They also collect enormous amounts of customer data and need to access it instantly, both online and in person, to provide the seamless experience their customers expect. Returns alone can be overwhelming, with clothing return rates often topping 30% to 40%. But for retailers whose IT networks can’t handle all this traffic, more data means more time and money wasted. Even a few seconds of delay is known to disrupt the customer experience.

Even traditionally mechanical industries aren’t immune. Automakers are increasingly trying to make sense of the sheer amount of data that today’s cars (and consumers) demand. Annual sales of connected-car technologies (think connecting your car’s infotainment system to a music streaming service) are expected to triple to $122 billion by 2021. Telematics – in-car hardware that records and transmits driving metrics, often used by insurers to set rates – now comes pre-installed in many vehicles. Automakers are looking at huge investments in IT to handle all this data – costly investments, considering the low returns the automotive industry has seen in recent years.

If data hasn’t yet become an integral part of doing business in an industry, it likely will soon. But with more data comes more problems for IT systems that can’t handle the influx. IT departments are increasingly witnessing both day-to-day business and customer-facing services dragged down by shoddy system performance – performance issues that don’t come from hardware alone.

As it turns out, most servers used to store and process data (typically running Microsoft SQL Server) can be dreadfully inefficient, generating streams of small, random, fragmented input/output (I/O). Windows abstracts storage from compute and breaks I/O into smaller random “chunks” to move and process it.

This creates a noisy environment that bottlenecks storage and can drag down entire systems; it’s not uncommon for inefficient, fragmented I/O to slow performance by 30% to 50%.
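To see why thousands of small requests cost more than a few large ones, here is a minimal sketch using a toy cost model. The per-operation overhead and transfer rate below are illustrative assumptions, not measurements of any real system:

```python
# Toy cost model: every I/O request pays a fixed per-operation overhead
# (queueing, seeks, system calls), plus a transfer cost proportional to
# its size. The constants here are assumptions for illustration only.

PER_OP_OVERHEAD_MS = 0.5   # assumed fixed cost per I/O request
TRANSFER_MS_PER_MB = 2.0   # assumed streaming cost per megabyte

def io_time_ms(total_mb, chunk_mb):
    """Total time to move total_mb of data in chunks of chunk_mb."""
    ops = total_mb / chunk_mb
    return ops * PER_OP_OVERHEAD_MS + total_mb * TRANSFER_MS_PER_MB

total = 64  # megabytes to move

sequential = io_time_ms(total, chunk_mb=8.0)    # a handful of large writes
fragmented = io_time_ms(total, chunk_mb=0.064)  # a thousand 64 KB chunks

print(f"large chunks: {sequential:.0f} ms")
print(f"small chunks: {fragmented:.0f} ms")
print(f"slowdown:     {fragmented / sequential:.1f}x")
```

The same amount of data moves in both cases; only the per-request overhead changes, and it dominates once the chunks get small enough.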

Think about it: 50%. If your IT system is 50% slower than it should be, you’re essentially throwing away half your IT budget – paying twice as much for half the results. In some industries, the resulting losses run into the millions, even billions.

Lost employee productivity illustrates the point well: the typical user of an inefficient system spends about 15 minutes a day simply waiting for the system to respond or complete tasks. That’s roughly 5 hours a month. With the combined annual payroll of the U.S. retail, business services and financial industries clocking in at $2.1 trillion, losing just 15 minutes of every workday means these industries lose almost $66 billion every year.
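The figures above can be checked with back-of-the-envelope arithmetic (assuming an 8-hour workday and roughly 20 workdays a month):

```python
# Back-of-the-envelope check of the productivity figures above
# (assumes an 8-hour workday and ~20 workdays per month).

minutes_lost_per_day = 15
workdays_per_month = 20
workday_minutes = 8 * 60                 # 480 minutes in a workday
combined_annual_payroll = 2.1e12         # USD: retail + business
                                         # services + financial industries

hours_lost_per_month = minutes_lost_per_day * workdays_per_month / 60
fraction_of_workday_lost = minutes_lost_per_day / workday_minutes
implied_annual_loss = combined_annual_payroll * fraction_of_workday_lost

print(f"hours lost per month: {hours_lost_per_month:.0f}")
print(f"share of the workday lost: {fraction_of_workday_lost:.2%}")
print(f"implied annual loss: ${implied_annual_loss / 1e9:.0f} billion")
```

Fifteen minutes is about 3% of an 8-hour day, and 3% of a $2.1 trillion payroll lands just under the $66 billion cited above.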

Imagine the effect saving those 15 minutes per day could have on employee productivity over time.

When CIOs discuss solutions for handling big data, they rarely talk about these kinds of I/O performance issues. In fact, almost no one does. But by leaving them unaddressed, companies are losing billions every year.

What’s the solution? New hardware and upgraded infrastructure?

Those help, certainly. The problem is they are expensive. Enterprise hardware spending across all industries is expected to top $1.5 trillion in 2020. Telecom providers alone may need $150 billion in infrastructure upgrades in order to handle the expected fourfold increase in mobile data traffic over the next few years. With 5G due to hit widely this year or next, you can expect that number to climb even higher.

Hardware upgrades aren’t the most effective solution, anyway. If the problem lies with inefficient data-handling due to fractured I/O, fixing that inefficiency through software solves the problem where it begins. Clean up data I/O by 50% and it’s like adding 50% more throughput – while saving hardware upgrades for further down the road.
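The throughput claim above is simple arithmetic; a quick sketch, assuming the fragmented-I/O overhead is the only thing slowing the workload (the job time is an illustrative number):

```python
# If fragmented I/O inflates every job's runtime by 50%, removing that
# overhead raises throughput by the same factor. Numbers are illustrative.

base_job_seconds = 10.0    # hypothetical job time with clean I/O
io_overhead = 0.50         # fragmented I/O adds 50% to runtime

slow_job_seconds = base_job_seconds * (1 + io_overhead)
jobs_per_hour_slow = 3600 / slow_job_seconds
jobs_per_hour_fast = 3600 / base_job_seconds

gain = jobs_per_hour_fast / jobs_per_hour_slow - 1
print(f"throughput gain from cleaning up I/O: {gain:.0%}")
```

The same hardware does 50% more work per hour once the overhead is gone, which is why deferring upgrades becomes an option.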

This may seem like a “technical” problem rather than a financial one. Today, technical IS financial. Big data is cutting into profits. If the CEOs, CIOs and other executives responsible for profitability, costs, capital expenditure and risk mitigation don’t educate themselves on this underlying “dirty little secret” of performance – and adequately equip their database servers and networks to increase data throughput – then slowdowns, crashes and the sheer overwhelm of big data will thwart the speed and profitability gains it is supposed to bring. It is the nasty silent killer in many industries today, one that causes “death by millions of data particles.”

Companies need to be aware of what’s causing the problem before they can tackle it.

Jim D’Arezzo has a distinguished career in technology that started at IBM and has included senior executive positions at Compaq, Autodesk and as President and COO of Radiant Logic. He is currently CEO of Condusiv Technologies, the world leader in software-only storage performance solutions for virtual and physical server environments.

© 2024 Newsmax Finance. All rights reserved.
