The Data and Analytics Rethink at Australia’s AI-driven Organisations
I always thought I had only one option for getting to my data: consolidate it into one place, then transform and organise it for reporting. We know this works, but it's not always affordable, nor is it truly agile.
Two-and-a-half years into the current AI boom, here's what we now know:
- AI needs data.
- Data is best leveraged with a strategy.
- If you have a strategy, you then need a way to implement it.
The first two are uncontroversial; implementation is where organisations can come unstuck.
AI, and generative AI in particular, is evolving with near-ubiquitous appeal and arguably at a rate not seen before. Three overarching principles govern the data strategies of organisations: a desire to improve their agility to pursue data-driven innovation; to reduce the time to data, insights and value; and to do all of this in a scalable, cost-effective manner.
To deliver on these principles, a key pillar of any data strategy is a modern data and analytics platform or ecosystem that is AI-ready for current and future use cases. One of the main reasons hyperscale cloud-based data lakes and lakehouses have become singular destinations for all data is a desire for simplicity. That simplicity, however, can come at the expense of agility and cost.
Moving data into such platforms, using the proven approach of data consolidation, means taking copies of data from operational source systems and storing those copies in the cloud. This is necessary where data is not persistent at source, but that is not always the case. Consolidation also requires us to organise, secure, cleanse and transform the data before we can use it, which adds time and cost. Organisations typically replicate more data into these environments than they need, which increases costs further, and exploring that data to generate new insights in a single cloud-based store costs more again.
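By way of illustration, here is a minimal sketch of the consolidation pattern described above: extract a copy from an operational source, cleanse it, and land another copy in cloud storage. The source database, table, column names and bucket are hypothetical, and a real pipeline would add scheduling, monitoring and security; the point is simply where the copying, transformation and storage effort (and cost) accumulates.

```python
# Hypothetical extract-transform-load job illustrating data consolidation.
import pandas as pd
import sqlalchemy
import boto3

# Extract: take a copy of yesterday's operational data (source is hypothetical).
source = sqlalchemy.create_engine("postgresql://etl_user:***@source-db/ops")
orders = pd.read_sql(
    "SELECT * FROM orders WHERE order_date >= CURRENT_DATE - 1", source
)

# Transform: cleanse and organise before the data is usable for reporting.
orders["order_total"] = orders["order_total"].fillna(0).round(2)
orders = orders.drop_duplicates(subset="order_id")

# Load: store yet another copy in the cloud landing zone (storage cost accrues here).
orders.to_parquet("/tmp/orders.parquet", index=False)
boto3.client("s3").upload_file(
    "/tmp/orders.parquet", "analytics-landing", "orders/orders.parquet"
)
```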
The cost to source, transform and store data is only one dimension to consider; the other is compute. The computation required to meet the business need is equally relevant to cost and agility, and depending on why you need the data and what you need to do with it, costs can vary significantly.
Data lakes and lakehouses are critical, but not the only way
To be clear, we see cloud-based data lakes as a critical element of any organisation's data and analytics capability. Organisations, however, demand stronger cost control and quicker time to data and time to value.
A hybrid architecture for both data and infrastructure offers a path to meet this challenge. Since 2023, industry analysts have reported that organisations are deploying hybrid cloud to accelerate their adoption of AI. It provides a cost- and performance-appropriate alternative that addresses data sovereignty, low latency and edge requirements, while offering a level of vendor independence.
To get the most out of a hybrid architecture, it’s best to consider both compute and data requirements and explore what the best combination of compute, data storage and data management capabilities is for your primary use cases.
For example, an organisation using a machine learning (ML) algorithm to support predictive maintenance should consider where the data is created, and how much is created. It might make sense to develop the initial ML model in a cloud-based analytics platform and then execute the model at the edge (on-site next to the machine). Or, given data volumes, it might make sense to develop the model in a private cloud and execute at the edge.
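As a minimal sketch of the "develop in the cloud, execute at the edge" pattern, the snippet below trains a simple model on historical sensor data and then scores live readings next to the machine. The file names, feature columns and model choice are all hypothetical stand-ins for whatever the actual use case requires; the point is that the heavy training compute happens in the cloud while only the small, trained artefact runs at the edge.

```python
# Hypothetical train-in-cloud / score-at-edge sketch for predictive maintenance.
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# --- Cloud side: train on historical sensor data (columns are hypothetical) ---
history = pd.read_csv("sensor_history.csv")
features = history[["vibration", "temperature", "pressure"]]
labels = history["failure_within_30_days"]

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(features, labels)
joblib.dump(model, "predictive_maintenance.joblib")  # ship this artefact to the edge

# --- Edge side: score live readings on-site, next to the machine ---
model = joblib.load("predictive_maintenance.joblib")
latest = pd.DataFrame([{"vibration": 0.82, "temperature": 71.4, "pressure": 3.1}])
print("Failure risk:", model.predict_proba(latest)[0][1])
```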
Leading with a Hybrid Approach
Data virtualisation establishes a single data-access layer for finding and using all enterprise data, providing real-time access to data stored in multiple places. It abstracts data held on premises or in the cloud into a logical framework, through which governance and security are applied as data is delivered to data consumers.
In practice, this means leaving most of the data in its source systems and bringing only what is needed, when it is needed, to an external AI model, or holding other data in a hyperscale, private or hybrid cloud environment, depending on the use case.
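From a data consumer's point of view, this can be as simple as one query against a logical view, with the virtualisation layer resolving where the underlying data actually lives. The sketch below assumes the platform exposes a standard ODBC endpoint (the DSN, credentials and view name are hypothetical) and a view that federates an on-premises table with a cloud-hosted dataset; no copies of the data are made.

```python
# Hypothetical consumer query against a data virtualisation layer via ODBC.
import pyodbc

# Connect to the virtual layer (DSN and credentials are illustrative only).
conn = pyodbc.connect("DSN=virtual_layer;UID=analyst;PWD=********")
cursor = conn.cursor()

# One query against a logical view; the layer pushes the work down to each source.
cursor.execute(
    """
    SELECT customer_id, region, lifetime_value
    FROM customer_360
    WHERE region = ?
    """,
    "ANZ",
)

for customer_id, region, lifetime_value in cursor.fetchall():
    print(customer_id, region, lifetime_value)

conn.close()
```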
Significant tangible benefits are on offer: in a Total Economic Impact™ (TEI) study of the Denodo Platform, Forrester Consulting found that organisations realised benefits of:
- 83% reduction in time-to-revenue
- 67% reduction in data preparation effort
- 65% decrease in delivery times over ETL
Leading with a hybrid approach across both data analytics platforms and infrastructure gives you agility and reduces risk when implementing your data and AI strategies.
Enabling incremental transformation – Organisations cannot wait for a ‘big bang’ migration to the cloud. A hybrid approach enables staged pathways to modernise data platforms, linking new AI capabilities with legacy systems and new cloud platforms without forcing disruptive ‘all-or-nothing’ migrations.
Supporting diverse use cases and stakeholders – Business units have different requirements in both urgency and data governance. Finance may demand data be held on premises, whilst Marketing may need rapid access to cloud-based AI tools; Marketing’s deadlines may not be as tight as Finance’s need to report to the Board. A hybrid architecture accommodates these varying requirements without forcing a ‘one-size-fits-all’ solution.
Future-proofing against vendor lock-in – A hybrid approach avoids tying data and AI capabilities to a single vendor’s roadmap. The pace of AI development across many vendors suggests an increasing menu of capabilities to choose from, not all of which will come from today’s main players.
Enhancing resilience and continuity – Distributing data and AI workloads across a hybrid architecture reduces single points of failure and limits the attack surface, providing greater operational resilience. Data virtualisation assists by avoiding the creation of yet more copies of data.
Alignment with sovereignty requirements – Cloud platforms’ sovereignty capabilities are not always acceptable. A hybrid approach connects data held on premises with data held elsewhere, allowing sensitive data to remain where regulation requires it.
A hybrid architecture comprising multiple data and infrastructure options, connected by an overarching data virtualisation layer that enables centralised oversight, data access, management and control, is fast being recognised as the new standard for organisations pursuing AI goals.
About the author
Petar Bielovich is the General Manager, Data & Analytics at Atturra. He leads a team delivering data, analytics and AI solutions, enabling digital transformation and generating more value from all forms of data. Petar has more than 25 years’ experience working with clients, including Australian Defence, Boral, Telstra and Nestle, and has worked for large professional services organisations and start-ups.