By Alex McMullan, EMEA CTO, Pure Storage
In the same way that a CFO sets out the strategy and framework for investment and cash flow, CIOs and CTOs need to take control of making dataflow work for their organisations. Data is now a currency, but one which carries extra responsibility for the holder, especially where our personal information is involved. This data currency will come under regulation in 2018, when failing to achieve a clean audit will carry reputational and monetary consequences similar to those of failing a finance audit.
Against this backdrop, as we head into 2018, here are four factors that CIOs and CTOs should consider to ensure they get the most value from their organisations’ data.
Cloud control – hybrid architectures will dominate
The debate around whether to use cloud technologies is history; multi-cloud deployment is now the norm. With growing awareness that uncapped or per-second pricing can spiral out of control, a return to hybrid architecture is underway, marrying the controllable cost and high performance of on-premises systems with burstable, global cloud services.
The increased requirements for data control will further boost the attractiveness of the hybrid model, a trend that will only accelerate as each business balances its own technology demands against the relative TCO of public versus on-premises cloud platforms.
As the transition from virtualisation to cloud-native and containerised applications gathers momentum, there will be an increasing focus on seamless private/public cloud application and data mobility through to 2020. The technologies that integrate security with performance and multi-tenancy to enable this mobility on demand will become the market and mind-share leaders. Being able to efficiently migrate and repatriate data will be a key feature of cloud capability as organisations review where applications and data are most advantageously hosted from both legal and operational perspectives. Returning data to centralised pools in each legislative region is likely to become more common, increasingly augmented (even at the edge) by directly linked, high-performance storage and compute at a local level for mission-critical, real-time or highly sensitive applications.
Data stewardship must become a core competency in 2018: IT and culture must change to support that
GDPR implementation is imminent, and for some organisations the initial cost of compliance will be substantial. The regulation fundamentally requires companies to be good data stewards: knowing and showing where personal data is and is not, and demonstrating fine-grained control over it from ingestion to deletion.
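To make “ingestion to deletion” concrete, the sketch below flags personal-data records that have aged past a retention window. The record shape, field names and `records_due_for_deletion` helper are hypothetical, not any particular product’s API; real GDPR tooling also has to handle indexing, audit trails and legal holds.

```python
from datetime import datetime, timedelta, timezone

def records_due_for_deletion(records, retention_days, now):
    """Return the ids of records ingested before the retention cutoff.

    A toy stand-in for retention enforcement: each record carries an
    'ingested_at' timestamp, and anything older than `retention_days`
    relative to `now` is due for deletion.
    """
    cutoff = now - timedelta(days=retention_days)
    return [r["id"] for r in records if r["ingested_at"] < cutoff]

# Illustrative data: one stale record, one recent one.
now = datetime(2018, 5, 25, tzinfo=timezone.utc)  # GDPR enforcement date
records = [
    {"id": "a", "ingested_at": datetime(2016, 1, 1, tzinfo=timezone.utc)},
    {"id": "b", "ingested_at": datetime(2018, 5, 1, tzinfo=timezone.utc)},
]
print(records_due_for_deletion(records, 365, now))  # ['a']
```

Passing `now` explicitly, rather than reading the clock inside the function, keeps the policy check deterministic and easy to test.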
Becoming an exemplary data steward, or responding effectively to security incidents, is near impossible with systems that take days to back up, index or restore data. To reach the stage where this level of control and safeguarding is possible, we will see significant investment during 2018 in faster networks, search and indexing. Tools and platforms that improve the visibility, manageability and performance of data pipelines in general will also see substantial investment.
That said, organisations that rely on technology alone for GDPR compliance will struggle; significant cultural and procedural changes will also be needed to achieve and maintain compliance. The right cultural approach needs to be led by senior management, and the right tools implemented to support that behaviour. IT can help, but it has to be part of an end-to-end approach, starting with the data architect and permeating the organisation from the back office through to every customer-facing representative.
Leveraging AI and Machine Learning
Data from Forrester suggests that 70% of enterprises expect to implement some form of AI over the next year. However, across the full range of businesses, I believe the benefits of Machine Learning (ML) will be felt more immediately. ML’s quick wins will come lower down the business technology stack: ML-based automation is already proven to save hours each day in the routine administration of IT infrastructure. Instrumenting systems via the Internet of Things, and using ML to analyse the resulting data, delivers valuable, actionable information that can be used to resolve issues automatically before they have a business impact.
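As a rough illustration of this instrument-and-analyse loop, the sketch below flags outlying telemetry readings with a simple standard-deviation test. Production ML-based monitoring is of course far more sophisticated; the `flag_anomalies` function and the latency figures here are illustrative only.

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=3.0):
    """Return the indices of readings more than `threshold` standard
    deviations from the mean: a minimal stand-in for the ML-driven
    analysis of instrumented infrastructure described above."""
    if len(readings) < 2:
        return []
    mu, sigma = mean(readings), stdev(readings)
    if sigma == 0:
        return []  # perfectly flat telemetry has no outliers
    return [i for i, r in enumerate(readings) if abs(r - mu) > threshold * sigma]

# Hypothetical storage-latency samples in milliseconds; one spike.
latency_ms = [2.1, 2.0, 2.2, 2.1, 2.0, 45.0, 2.1, 2.2]
print(flag_anomalies(latency_ms, threshold=2.0))  # [5]
```

Flagging the spike before users notice it is exactly the kind of routine task that, automated, hands hours back to the infrastructure team each day.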
In my discussions with customers, several have equated the automation and guidance provided by ML-based systems with having an additional infrastructure engineer on staff 24/7. This frees IT staff to invest time in making use of the data they are storing and securing for their organisations.
Storage conversations should become data conversations
Technically and commercially, the problem of delivering high-performance, robust, simple and scalable storage has been solved. 2018 will be seen as a tipping point, where automation and orchestration technologies abstract modern infrastructure operations into a REST API call from tools such as Ansible, Chef, Puppet and Kubernetes.
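As an illustration of what that abstraction looks like in practice, the sketch below composes and sends a volume-provisioning request to a REST endpoint. The `/volumes` path, payload fields and token handling are hypothetical, not any particular vendor’s API; an orchestration tool such as Ansible would wrap an equivalent call in a module or playbook task.

```python
import json
import urllib.request

def build_volume_request(name, size_gb, tier="performance"):
    """Compose the JSON body an orchestration tool might POST to a
    storage REST API. Field names are illustrative, not a real schema."""
    return {
        "name": name,
        "size_bytes": size_gb * 1024 ** 3,
        "tier": tier,
        "protocols": ["block"],  # could equally be "file" or "object"
    }

def provision_volume(api_base, token, spec):
    """POST the spec to a (hypothetical) /volumes endpoint."""
    req = urllib.request.Request(
        f"{api_base}/volumes",
        data=json.dumps(spec).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)  # returns the HTTP response

# A developer or CI pipeline requests the next terabyte declaratively:
spec = build_volume_request("analytics-scratch", size_gb=1024)
print(spec["size_bytes"])  # 1099511627776 bytes, i.e. 1 TiB
```

The point is not the specific call but the shape of it: capacity becomes a parameter in a request, rather than the subject of a meeting.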
Developers can now be given access to the block, file and object storage that they need, on a common scalable platform, with certainty over performance and ongoing, non-disruptive enhancement of the underlying technology. They no longer need to have discussions about where the next terabyte is coming from, or how it will be delivered. The simplification of data management that this offers opens up the opportunity for data scientists, researchers and designers to focus on their data pipelines and on enhancing design processes, rather than talking about infrastructure.
Being able to shift data and workloads between clouds, taking advantage of multi-cloud and hybrid architectures as cloud usage and the data regulation landscape evolve, is going to become a key capability for IT teams. In parallel, getting data storage infrastructure and dataflow right will be essential to equip organisations to take advantage of the machine learning and AI technologies that leading businesses are already deploying. As we head into 2018, the onus is on CIOs and CTOs to ensure that their data storage strategy and infrastructure allow the organisation to extract maximum value from its data.