By Craigh Stuart, Systems Engineering Manager at Nutanix SADC & WECA
A data-driven culture is essential for any business that wants to lead through digital innovation. But traditional data management infrastructures are rigid and unable to combine diverse data types. Then there is the challenge of storing, managing and securing growing volumes of data, and of extracting insights from it. Throw in cloud data and the fact that data is moving closer to the application and the user, and the result is a hotbed of poorly managed databases undermining the value of analytics.
Organisations that are unable to access accurate, timely, relevant, and reliable information from their data typically run a legacy datacentre infrastructure based on proprietary SAN arrays and storage fabric networks. These “hardware-defined” datacentres are, in essence, the first problem. What is needed is a data services platform that supports an open architecture and is maximally available, assuring the business of continuous access to, and insights from, its data, with infrastructure that is effectively invisible to users.
Open Architecture
To turn data into a competitive advantage, one needs to understand that data comes from different business areas and falls into three primary forms: relational databases, unstructured data, and high-velocity data. Because data is so diverse, its collection often results in silos: individual repositories with no cohesion between them, because storage formats and data types differ.
Once data is organised, silos can be eliminated by adhering to open standards, adopting open paradigms, offering choice, and avoiding lock-in for users, customers, and operators alike. This helps the business first bring in all data types and then consume that data – in this case, use it to glean insights for improved business decision-making. This model also supports the API-centric IT economy, as it ensures that data can be collected from any source because there is software interoperability via generic endpoints.
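The “generic endpoint” idea above can be sketched in a few lines: records arriving from a relational table, a document store, and an event stream are normalised into one common envelope so downstream consumers need not care about the source. The envelope fields and source names here are illustrative assumptions, not any real platform API.

```python
# Minimal sketch: normalise diverse data types into a common envelope.
# Field and source names below are hypothetical, chosen for illustration.
from dataclasses import dataclass
from typing import Any

@dataclass
class Envelope:
    source: str              # where the record came from
    kind: str                # "relational", "document", or "event"
    payload: dict[str, Any]  # the record, reduced to key/value form

def normalise(source: str, kind: str, raw: Any) -> Envelope:
    """Wrap an incoming record in the common envelope."""
    if kind == "relational":                 # a row: (column names, values)
        columns, row = raw
        payload = dict(zip(columns, row))
    elif kind in ("document", "event"):      # already key/value shaped
        payload = dict(raw)
    else:
        raise ValueError(f"unknown data kind: {kind}")
    return Envelope(source=source, kind=kind, payload=payload)

# Three different sources, one consumable shape:
row = normalise("crm_db", "relational", (["id", "name"], (1, "Acme")))
doc = normalise("catalogue", "document", {"sku": "A-1", "price": 9.99})
evt = normalise("clickstream", "event", {"user": 1, "action": "view"})
```

Once everything shares one shape, analytics can query across sources without caring whether a record began life as a row, a document, or an event.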
Maximally Available Infrastructure
Making infrastructure maximally available calls for systems that can self-heal, dynamically adjust the write path to suit the workload, and dynamically apply data transforms based on heuristics. When designing these environments, discipline is required: application placement and layout matter, and data sovereignty, workload affinity, workload availability, and workload redundancy must all be factored in upfront.
Maximally available data is simple to manage through a central management plane that offers end-to-end capabilities around alerting, events, and monitoring. It brings data closer to analytics – which is where it should be.
But what does this mean in practice? Consider a traditional database environment, where an application or database update may cause issues across systems. There, DBAs take a snapshot before each update in case they need to roll back to a known-good state. With a maximally available data architecture, this can be avoided entirely: upgrades to the underlying infrastructure supporting your data are continuous, without interruption to services. Adding storage or performance capacity, or performing a software update, becomes seamless and non-disruptive to your data services.
Making Data Infrastructure Invisible
A competitive data infrastructure is effectively invisible, trusted, and autonomous. It is trusted because it already takes into consideration data and application security, data protection, and data governance and compliance. It is autonomous because it brings together cloud-like simplicity and ease of use to database provisioning, management, and patching.
Traditional database management relies on an error-prone, siloed approach to provisioning and cloning, which is time-consuming and makes troubleshooting exceptionally difficult. With an invisible data infrastructure, the environment integrates with your automation tools and gives DBAs the ability to create a self-service catalogue and delegate access: who gets access, what they get access to (provision, clone, patch, etc.), how much (storage, RAM, vCPU), and for how long (spin-down of resources).
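The four delegation questions – who, what, how much, and for how long – amount to a policy check on every self-service request. A minimal sketch follows; the `Policy` fields, action names, and quotas are made-up examples, not any vendor's actual delegation API.

```python
# Hedged sketch of self-service delegation: each request is checked
# against who may act, which actions they may take, resource caps,
# and an expiry after which resources spin down. All names illustrative.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Policy:
    user: str
    actions: set[str]    # e.g. {"provision", "clone", "patch"}
    max_vcpu: int
    max_ram_gb: int
    max_storage_gb: int
    expires: datetime    # "for how long": spin-down deadline

def allowed(policy: Policy, action: str, vcpu: int, ram_gb: int,
            storage_gb: int, now: datetime) -> bool:
    """Check a self-service request against the delegated policy."""
    return (action in policy.actions
            and vcpu <= policy.max_vcpu
            and ram_gb <= policy.max_ram_gb
            and storage_gb <= policy.max_storage_gb
            and now < policy.expires)

dev = Policy("dba-team", {"provision", "clone"}, max_vcpu=8,
             max_ram_gb=32, max_storage_gb=500,
             expires=datetime(2030, 1, 1))
now = datetime(2025, 6, 1)
ok = allowed(dev, "clone", vcpu=4, ram_gb=16, storage_gb=100, now=now)  # within caps
no = allowed(dev, "patch", vcpu=4, ram_gb=16, storage_gb=100, now=now)  # not delegated
```

The point of encoding delegation this way is that a catalogue request either passes the policy or is refused automatically – no ticket, no manual review.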
Companies do not have the time or the resources to perform these functions manually in this digitally connected world. There are too many moving parts (applications). Today’s DBAs must be able to copy/paste a particular database instance and create clones from any point in time.
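Cloning from “any point in time” usually means resolving the requested moment to the most recent snapshot taken at or before it, then building the clone from that base. A small illustrative sketch of that resolution step, under the assumption that snapshots are identified by timestamp:

```python
# Illustrative sketch: pick the latest snapshot at or before the
# requested point in time as the base for a clone. Snapshot layout
# here is a made-up example, not a specific product's mechanism.
from bisect import bisect_right
from datetime import datetime

def snapshot_for(point_in_time: datetime,
                 snapshots: list[datetime]) -> datetime:
    """Return the latest snapshot taken at or before the requested time."""
    ordered = sorted(snapshots)
    i = bisect_right(ordered, point_in_time)
    if i == 0:
        raise ValueError("no snapshot exists at or before that time")
    return ordered[i - 1]

# Four snapshots on 1 June 2025, at 00:00, 06:00, 12:00, and 18:00:
snaps = [datetime(2025, 6, 1, h) for h in (0, 6, 12, 18)]
base = snapshot_for(datetime(2025, 6, 1, 14, 30), snaps)
# base resolves to the 12:00 snapshot; the clone is built from it.
```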
Building It
To build an invisible data infrastructure, your organisation must follow a software-defined approach to data services. By adding the tools needed to architect an invisible plane that can orchestrate the functions of your databases, your business can unify how data is stored and consumed with a data services platform that is open, operative, and opaque, supported both on-premises and in the public cloud.
Using integrated database management software, you can automate and simplify management, enjoying one-click simplicity and invisible operations for database provisioning, lifecycle management, and copy data management. In essence, this platform paves the way to the advantages inherent in Database as a Service (DBaaS).
The benefit to your business is an end-to-end environment for structured and unstructured data, with raw storage, delivered as a cloud-like experience with full API-first automation and self-service. Once in place, you gain the agility to embrace DevOps while maintaining traditional enterprise apps, speed up the provisioning of new file servers, object repositories, and databases, and simplify recovery – empowering end-users, including DBAs, and reducing their dependence on infrastructure and ops teams.