There is no question about it. We are facing a data tsunami, one that threatens to swamp organizations just as major IT advances are changing the face of industry, government, healthcare, education and almost every aspect of our lives.
Artificial intelligence, the Internet of Things, predictive analytics and the sheer volume of on-demand data add up to a 100-foot wave starting to crash over your IT organization.
It’s an apt analogy: as the data waters rise, IT departments must find ways to contain costs even as data needs keep growing.
Indeed, Cisco’s latest Global Cloud Index report projects that the volume of stored big data will reach 403 exabytes by 2021, a nearly eight-fold increase from 51 EB in 2016. The growing importance of data analytics, driven by big data coming from ubiquitously networked end-user devices and IoT alike, has added to the value and growth of data centers. They touch nearly every aspect of an enterprise, whether internal, employee-related data, communication or processes, or partner- and customer-facing information and services. Big data alone will represent 30 percent of data stored in data centers by 2021, up from 18 percent in 2016.
According to a recent report from International Data Corporation, the volume of data processed in the overall healthcare sector is projected to increase at a compound annual growth rate of 36 percent through 2025, significantly faster than in other data-intensive industries such as manufacturing (30 percent projected CAGR), financial services (26 percent) and media and entertainment (25 percent).
Big data presents opportunities to gain new insights, deliver better products and customer service, and improve operations, but it can also cause bottlenecks that lower productivity and degrade the user experience.
More Data, More Problems
IT departments are struggling to keep up with demand. Like the proverbial Dutch boy with his finger in the dike, IT staff can barely hold back the sheer amount of data, much less meet the performance demands of users.
We can all relate to this problem. We are users of massive amounts of data. We also have little patience for slow downloads, uploads, processing or wait times for systems to refresh. IT departments are generally measured on three fundamentals: the efficacy of the applications they provide to end users, uptime of systems and speed (user experience). The applications are getting more robust, systems are generally more reliable, but speed (performance) is a constant challenge that can get worse by the day.
From an IT investment perspective, improvements in technology have given us much faster networks, much faster processing and huge amounts of storage. Virtualization of the traditional client-server IT model has provided massive cost savings, and new hyperconverged systems can also improve performance in certain instances. Cloud computing has given us economies of scale.
But costs will not easily be contained as the mounting waves of data continue to pound against the IT breakwaters.
Containing IT Costs
Traditional thinking about IT investments goes like this: We need more compute power; we buy more systems. We need faster network speeds; we increase network bandwidth and buy the hardware that goes with it. We need more storage; we buy more hardware. Costs continue to rise in proportion to demand for the three fundamentals (applications, uptime and speed).
However, there are solutions that can help contain IT costs. Data Center Infrastructure Management (DCIM) software has become an effective tool for analyzing and then reducing the overall cost of IT. In fact, the U.S. government’s Data Center Optimization Initiative claims to have saved nearly $2 billion since 2016.
Other solutions are also available that improve performance and extend the life of existing systems without requiring new hardware.
What is often overlooked is that processing and analyzing data depend on the overall system’s input/output (I/O) performance, also known as throughput. Many large organizations performing data analytics require a computer system to access multiple, widespread databases, pulling information together through millions of I/O operations. The system’s analytic capability depends on the efficiency of those operations, which in turn depends on the efficiency of the computer’s operating environment.
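To make that dependence concrete, here is a minimal back-of-the-envelope sketch in Python. The per-operation latency and bandwidth figures are illustrative assumptions, not measurements of any particular system or product; the point is simply that moving the same volume of data takes far longer when the work is broken into many small I/O operations.

```python
# Illustrative model: time to move a fixed volume of data when the work is
# split into many small I/O operations versus a few large ones.
# The per-operation latency and bandwidth below are assumed figures for
# illustration only, not measurements of any real system.

DATA_VOLUME = 1 * 1024**3        # 1 GiB to read
PER_OP_LATENCY = 0.0005          # 0.5 ms of fixed overhead per I/O (assumption)
BANDWIDTH = 500 * 1024**2        # 500 MiB/s sustained transfer rate (assumption)

def transfer_time(io_size_bytes: int) -> float:
    """Seconds to read DATA_VOLUME using I/O operations of the given size."""
    ops = DATA_VOLUME / io_size_bytes
    return ops * PER_OP_LATENCY + DATA_VOLUME / BANDWIDTH

for size_kib in (4, 64, 1024):
    seconds = transfer_time(size_kib * 1024)
    mib_per_s = DATA_VOLUME / seconds / 1024**2
    print(f"{size_kib:>5} KiB I/Os: {seconds:7.1f} s  (~{mib_per_s:6.1f} MiB/s effective)")
```

Under these assumed numbers, the same gigabyte that streams in a few seconds as large I/Os takes minutes as 4 KiB requests, because fixed per-operation overhead dominates the transfer.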
This is especially true in the Windows environment (which runs about 80 percent of the world’s computers), where I/O performance degrades progressively over time. This degradation, which can cut a system’s overall throughput capacity by 50 percent or more, happens in any storage environment: inefficiencies in the way the server hands data off to storage penalize performance in any data center, whether in the cloud or on premises. It gets worse in a virtualized computing environment, where a multitude of systems all sending I/O up and down the stack to and from storage creates tiny, fractured, random I/O, producing a “noisy” environment that slows down application performance. Left untreated, it only worsens with time.
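The cost of fractured, random I/O can be sketched with a rough micro-benchmark like the one below. It is a simple illustration under stated assumptions, not a representation of any vendor’s tooling: it reads the same file once sequentially in large blocks and once as many small reads at random offsets, and the gap in effective throughput hints at what happens when a virtualized stack chops application I/O into tiny, scattered requests.

```python
# Rough micro-benchmark sketch: read the same file once sequentially in large
# blocks and once as many small reads at random offsets, then compare the
# effective throughput. The file size and block sizes are arbitrary
# assumptions; results will vary widely with hardware, OS caching and the
# filesystem, so treat the numbers as directional only.

import os
import random
import tempfile
import time

FILE_SIZE = 256 * 1024**2    # 256 MiB test file (assumed size)
LARGE_BLOCK = 1024 * 1024    # 1 MiB sequential reads
SMALL_BLOCK = 4 * 1024       # 4 KiB random reads

def read_throughput(path: str, block_size: int, randomize: bool) -> float:
    """Read the whole file in block_size chunks; return MiB/s."""
    offsets = list(range(0, FILE_SIZE, block_size))
    if randomize:
        random.shuffle(offsets)
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(block_size)
    elapsed = time.perf_counter() - start
    return (FILE_SIZE / 1024**2) / elapsed

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(FILE_SIZE))
    path = tmp.name

try:
    print(f"sequential 1 MiB reads: {read_throughput(path, LARGE_BLOCK, False):8.1f} MiB/s")
    print(f"random     4 KiB reads: {read_throughput(path, SMALL_BLOCK, True):8.1f} MiB/s")
finally:
    os.remove(path)
```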
Even experienced IT professionals mistakenly assume that new hardware will solve these problems. Because data is so essential to running organizations, they are tempted to throw money at the problem by buying expensive new hardware. While additional hardware can temporarily mask the degradation, targeted software can improve system throughput by 30 to 50 percent or more. Software like this has the advantage of being non-disruptive (no ripping and replacing hardware), and it can be transparent to end users because it is added in the background. Thus, a software solution can handle more data by eliminating overhead, increase performance at a far lower cost and extend the life of existing systems.
With the tsunami of data threatening IT, organizations should consider solutions like these to contain costs.
Jim D’Arezzo has a distinguished career in technology that started at IBM and has included senior executive positions at Compaq, Autodesk and as President and COO of Radiant Logic. He is currently CEO of Condusiv Technologies, the world leader in software-only storage performance solutions for virtual and physical server environments.