When IT workers are tasked with planning and building a data center, they usually base their estimates on present or historical benchmarks. For example, if the company runs 150 servers today, planners might project that it will need 25 percent more servers in five years.
While this might seem logical, David Cappuccio, vice president of research at Gartner, thinks it's misguided. According to a post he wrote for the Gartner blog, sizing data centers strictly on historical data fails to account for advances in data center technology.
The key metric to keep an eye on is compute capacity per square foot, which he demonstrates in a mock data center scenario; the higher this figure, the better. Performing these kinds of calculations helps data center managers realistically estimate the space they'll need as their data centers grow, Cappuccio explains.
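As a rough illustration of the metric (the figures below are invented for the example, not taken from Cappuccio's mock scenario), compute capacity per square foot is simply total compute capacity divided by floor area:

```python
# Illustrative only: hypothetical numbers, not from Gartner's scenario.
servers = 150
cores_per_server = 16          # assumed average compute per server
floor_space_sqft = 1200        # assumed raised-floor footprint

compute_per_sqft = servers * cores_per_server / floor_space_sqft
print(f"{compute_per_sqft:.1f} cores per square foot")  # → 2.0 cores per square foot
```

The same ratio can be tracked in whatever unit of compute the organization prefers (cores, VMs, workload units); what matters is watching how it changes as hardware is refreshed.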
Doing some simple spreadsheet exercises and asking these “what if” questions can yield some startling results when it comes to capacity estimates. And the logic works with servers as well as storage, as each device category continues to decrease in size, improve in capacity and performance, and reduce its power consumption per unit of work with each new generation.
If we were to look at these performance and density trends and assume that the curve will continue, even at a much slower pace, it becomes clear that even small data center environments can sustain significant growth rates (well in excess of 20 percent compound annual growth) while maintaining the exact same footprint over the next 15 to 20 years.
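A minimal sketch of that "what if" exercise, with all improvement rates assumed for illustration rather than drawn from Gartner data: if each hardware refresh delivers more compute in the same rack space, capacity in a fixed footprint compounds quickly.

```python
# Hypothetical "what if" projection: same floor footprint, periodic refresh.
# The 80% per-generation gain and 3-year cycle are assumptions, not benchmarks.
baseline_capacity = 1.0        # normalized compute capacity today
gain_per_refresh = 0.80        # assumed compute gain per refresh generation
refresh_years = 3
horizon_years = 15

generations = horizon_years // refresh_years
future_capacity = baseline_capacity * (1 + gain_per_refresh) ** generations
cagr = (future_capacity / baseline_capacity) ** (1 / horizon_years) - 1
print(f"Capacity multiple after {horizon_years} years: {future_capacity:.1f}x")
print(f"Implied compound annual growth rate: {cagr:.0%}")
```

Under these assumed inputs the same floor space delivers roughly 19 times the compute after 15 years, an annual growth rate above 20 percent, which is the kind of startling result the spreadsheet exercise is meant to surface.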
How do you plan for growth in your data center?