The New York Times published an anti-data-center series last fall. From an energy drain on local communities to smog-producing buildings, the 'newspaper of record' ripped cloud computing as if the new paradigm had rolled out coal-fired power plants. The Times conveniently left out the other side of the eco story: the reduction of electronic waste, forests of paper, and offsite file storage with trucks rumbling to and from the office.
It was hard to tell whether the series was written for an ecological or a technology audience. No matter: the Times shut down its Environmental Desk last month, as the analog Gray Lady struggles to find its way in the Digital Age, with data-at-your-fingertips access delivered via the cloud.
Had the Times written about the inner workings of data centers, it would have been shocked to learn that the tools used to optimize them--rack space, floor-plan layouts, power, and expansion projects--have been analog and inefficient. We're talking Phoenician-era spreadsheets and handwritten flowcharts.
Coming from the construction industry, which is dead last of fifteen verticals investing in technology (according to a recent Gartner report), I was amazed to learn that data center projects are plagued by the same antiquated tools, both for optimizing space and for the predictive analytics behind later phased expansions.
A recent meeting in Silicon Valley with Mark Harris, Nlyte Software's VP of Marketing and Data Center Strategy, changed that perception.
"What happens at a data center is that when new servers come online, they are often installed in a sub-optimal location, which, repeated over and over, effectively strands valuable resources and wastes others," Harris said.
As with construction projects or cloud services, precise execution is rewarded. So planning is critical both to maximizing ROI for customers and to optimizing the nexus of power, space, and maintenance across thousands to millions of servers.
Nlyte 7.0 - Data Center Focus
"We have found that every data center is so unique, even the largest ones are night and day in comparison," explained Harris. "A handful of years ago, Power Usage Effectiveness (PUE) became a new metric to measure data center efficiency. PUE is a great directional metric, but it provides little guidance on actually optimizing inefficiencies. PUE doesn't address capacity planning within the confines of a data center. It falls short."
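For readers unfamiliar with the metric Harris mentions: PUE is the ratio of a facility's total energy consumption to the energy delivered to IT equipment alone, so a value of 1.0 would mean zero overhead for cooling, lighting, and power distribution. The figures below are hypothetical, used only to illustrate the arithmetic:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 means every watt reaches IT gear; typical data centers run higher,
    since cooling and power distribution add overhead.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical month: 1,500,000 kWh drawn by the whole facility,
# of which 1,000,000 kWh actually powered servers, storage, and network.
print(pue(1_500_000, 1_000_000))  # 1.5
```

Note that the number says nothing about *where* the overhead goes or how to reduce it, which is Harris's point: it is directional, not diagnostic.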
During the same conference call, Nlyte CEO Doug Sabella underscored that point: "The release of Nlyte 7.0 comes from a brand messaging standpoint that every data center has its own unique DNA. It requires all physical, virtual and IT logical layers--not merely the facilities fabric--to be coordinated for the enterprise customer. Through our contextual data repository, 7.0 has this coordination, with scalability and performance. It forecasts where you go from where you have been, accounting for the time-sensitive nature of each."
Nlyte's value proposition can be found in "maximizing the financial benefits from the optimized utilization of power, space and assets." In other words, clients can execute more projects with fewer people, in less time, reduce hidden duplicate steps, and avoid rework errors that cost both time and money.
So if space is at a premium amid today's explosive growth of big data, mobile data, and the variety of consumer and social data, then planning data center architecture--in terms of energy use and physical/virtual layout--will be critical to the long-term success of the all-digital world.
"Strictly from a maturity standpoint, what separates Nlyte is our seventh generation software, which includes thousands of those customer requests," Mr. Sabella said. "We put these customer-driven inputs into the software that enables superior floor planning, navigating in and around rows of servers, with far fewer through-clicks on user experience. V7 isn't about if you can do something, but instead how easily can you do it."
With the advent of the small, smart screen of mobile devices, he added, "I want the data on the screen to be there. Today, information is so big. Data correlation is needed to break through the clutter. There's too much data to effectively use. That's why a robust business intelligence (BI) model that can slice and dice questions is key."
What Nlyte 7.0 offers is the ability for customers to consolidate data centers, advance scenario planning, and manage 100 percent of their assets. "We strive to make their lives easier, whether doing new projects or just maintaining an existing data center, which will turn over thousands of servers a year," said Doug Sabella.
The Nlyte software platform, which integrates that BI--manage, visualize, report, analyze, and predict--into its central repository, enables its customers to plan, track, and maintain all of the mission-critical aspects of a data center. These include all resources--power, cooling, network, and miscellaneous--combined with the three 'S's of space, servers, and storage.
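The article doesn't describe Nlyte's actual schema, but the idea of a central asset repository can be sketched in miniature. Everything here--the field names, the `ServerAsset` type, the sample values--is a hypothetical illustration of tracking location, power, and warranty status per server, not Nlyte's implementation:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ServerAsset:
    """One entry in a hypothetical asset repository."""
    asset_id: str
    rack: str                # physical location, e.g. "Row 4, Rack 12"
    rack_unit: int           # slot within the rack
    power_draw_watts: float  # measured or nameplate draw
    warranty_expires: date
    workload: str            # what the box is actually running

def under_warranty(asset: ServerAsset, today: date) -> bool:
    return asset.warranty_expires >= today

srv = ServerAsset("SRV-0042", "Row 4, Rack 12", 17,
                  350.0, date(2014, 6, 30), "billing-db")
print(under_warranty(srv, date(2013, 2, 1)))  # True
```

Even a toy record like this shows why a repository matters: once location, power draw, and warranty live in one queryable place, questions like "which out-of-warranty servers are still drawing power?" become a lookup instead of a spreadsheet hunt.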
Knowing asset allocation and resource management enables the customer to manage and mitigate risk, while reducing the frequency and duration of downtime.
Nlyte's Core Message Getting Out
"The level of interest in our software is growing exponentially. Enterprises are in a world of hurt," stated Mr. Sabella. "We have the best purpose-built software in the industry. And with the largest installed base of any modern solution like this, Version 7.0 is the latest proof of our product's tremendous maturity."
He added, "Strategically, the most important solutions spaces are evolving from islands to now being part of the overall IT fabric. The DC management world has not been connected yet to the rest of the business; it's not optimized and not managed well.
"Look at the impact of asset knowledge on the broader environment. A manufacturing company today knows more about a two-dollar widget they produce than a $3,000 server that consumes 5x that in annual maintenance, while running a hundred thousand dollar piece of software. They know precisely where that widget is in their warehouse, when it arrived, and when and where it was manufactured. By comparison, most companies today have relatively limited knowledge about where their servers are, how much work they are doing, if they are still on lease or under warranty, let alone what workload is running on them. It's crazy."
"What about Green energy uses?" I asked.
"The tech industry has wrestled with how to be Green for some time. Anything that eliminates wasted energy is currently considered Green--which is precisely what Nlyte's offering accomplishes. Whether it's increasing the amount of work done by servers, or increasing visibility into the cloud or virtualization layers already in use, Nlyte allows the Data Center to run at the lowest cost, with the least amount of energy wasted," Sabella said.
"Contrary to the Luddite view by the Times, data centers are at the very core of highly efficient cloud services, and optimizing their operation is the main industry goal," he said.
With the explosive growth of new data centers, it's more important than ever to have full visibility into cloud assets: where the work is being performed and how much, what the energy requirements are per unit of work, and the cost savings that a vendor like Nlyte Software brings to this emerging industry.