Many moons ago, Auto-Graphics’ infrastructure was the best in the business. We pioneered the concept of what is now called Software as a Service in the library space, and we had a data center to match. Over the years, technology moved along at a lightning-fast pace. During the dot-com boom in particular, generic data centers rich in computing power and connectivity sprang up wherever fiber connections came together. With the rise of inexpensive space in these data centers, Auto-Graphics needed to re-evaluate its need for total control over its own facility.
At a glance, total control over your own data center, especially if you’re providing Software as a Service to customers, seems like a good thing. In some cases it can be, but one must consider the total cost of ownership. To build a world-class data center, one must first be willing to build infrastructure that is resilient to all kinds of disasters: environmental, electrical, human error, and so on. In addition, such a data center is best built in a location capable of connecting to as many networks as possible. Lastly, security needs to be one of the primary concerns.
Given all these requirements, outsourcing data center services is attractive: the needs of many businesses that require this kind of resilience can be pooled into a single facility built to meet them all. A data center that satisfies these requirements, and provides additional services on top, allows a business like Auto-Graphics to concentrate on serving its customers with what it knows best. In our case, that’s delivering high-quality library solutions. While it’s possible for Auto-Graphics to also provide world-class data center services, doing so would require significant resources that are better spent refining the services we are best at delivering.
Once we realized that moving our data center services to a provider better equipped to meet our customers’ future needs was the right choice, we still had to decide what type of infrastructure to use. Because world-class data centers offer many, many service options, it wasn’t good enough for us to simply move the servers from our own data center to somewhere “better.” We had the opportunity to move to a more flexible fabric of infrastructure elements, allowing us to grow faster and be more fault tolerant. The possibilities generally fell into three categories: a physical environment, a virtual environment on public cloud infrastructure, and a virtual environment on private cloud infrastructure.
We ended up deciding on a private cloud infrastructure because it offered the hardware control of a physical environment along with the flexibility of a public cloud. In addition, our customers are privacy-conscious. Moving our customers’ data into the hands of a public cloud provider, where its physical location could not be immediately determined, would not serve our customers’ goal of keeping their patrons’ data private.
In the end, our new facility and new infrastructure allowed us to be much more flexible in allocating resources. The move also allowed us to quickly engineer solutions that speed up our entire process pipeline, from conception and development all the way through deployment, training, and end-user use.