Cloud & Edge Computing Trends and Predictions 2025 From Industry Insiders
IT leaders and industry insiders share their cloud computing and edge computing trends and predictions for 2025.
January 17, 2025
In 2024, AI and cost optimization shaped the cloud landscape. In 2025, expect growing interest in supercloud and significant momentum for multicloud (or, even better, cross-cloud), according to ITPro Today tech expert Christopher Tozzi.
Those were just a few of his cloud computing and edge computing predictions for the coming year. Now it's IT leaders' and industry insiders' turn to share what they are expecting in the cloud computing and edge computing realms in 2025. Check out their predictions below.
First, explore all of our 2025 tech predictions, including "anti-predictions" that challenge widely anticipated IT trends with fresh insights from our experts.
Cloud Computing Trends of 2025:
2025 Will Be the Year of the Move from VMware to Other Cloud Orchestration Platforms
Enterprises will increasingly move away from traditional virtualization platforms like Broadcom's VMware to open-source cloud management/orchestration platforms with no vendor lock-in — such as CloudStack, OpenStack, OpenNebula, Proxmox, and Kubernetes, all of which are KVM-based. Even commercial or non-open-source alternatives like Red Hat OpenShift and HPE Morpheus VM Essentials are KVM-based.
KVM is the default hypervisor for the majority of large-scale cloud-native companies in SaaS, PaaS, IaaS, e-commerce, and similar businesses. It is now becoming the preferred hypervisor in many traditional enterprises as well, thanks to its openness, performance, capabilities, flexibility, and very wide usage.
In 2025, organizations of all types will increasingly adopt cloud-native orchestration platforms such as those listed above. This trend is driven by the need to reduce costs, optimize operational efficiency and profit margins, increase performance, and reduce vendor lock-in. — Boyan Ivanov, CEO, StorPool Storage
SMBs Will Embrace Modern Cloud Solutions to Simplify CMMC Compliance
As CMMC requirements enter contracts in mid-2025, small and medium-sized businesses will transition from legacy systems to cloud-based solutions for a simpler, more cost-effective path to CMMC compliance. These modern solutions, combining built-in security controls with detailed compliance documentation, will help SMBs in the Defense Industrial Base dramatically reduce both the complexity and cost of achieving and maintaining CMMC certification. — Sanjeev Verma, co-founder, PreVeil
Cloud-Agnosticism Gains Momentum
Digital-native or digitally transformed companies have started moving their cloud solutions to on-prem as a means of reducing cost, while others are exploring multicloud or hybrid-cloud solutions in order to reduce dependence on a single vendor. This makes "run anywhere" management layers (such as Kubernetes) increasingly important to build towards. — Robert Elwell, VP of engineering, MacStadium
State of IT in 2025: Remigration to On-Premises
With rising IT costs driven by increased license fees from major vendors and soaring hyperscaler bills, many organizations are facing budget crises. In 2025, we predict a shift as companies begin moving workloads back from the cloud to on-premises or co-locations to reduce operational expenses. — Sascha Giese, Global Tech Evangelist, Observability, SolarWinds
Adaptive Strategies for Modern Data Protection
Hybrid and multi-cloud architectures are the lifeblood of modern business agility. However, with great flexibility comes great responsibility. For 2025, we must enforce consistent, adaptive security policies that accompany data wherever it flows — cloud, on-premises, or edge. This is not just about safeguarding data but about building a resilient and trust-driven digital economy. — Balaji Ganesan, co-founder and CEO, Privacera
Cloud Detection and Response Will Be Essential
As cloud environments become essential, perimeter-based security is obsolete. By 2025, Cloud Detection and Response (CDR) will be crucial for securing cloud-native infrastructures with real-time monitoring, machine learning, and actionable insights. CDR will address misconfigurations and external threats, ensuring visibility across multi-cloud setups. Additionally, real-time CDR tools must expand to cover edge infrastructure vulnerabilities, which remain a risk due to rushed pandemic-era deployments. — Jimmy Mesta, CTO and founder, RAD Security
Cloud Repatriation Won't Be the Key to Cost Savings (for Most)
There's been a lot of talk about organizations moving their workloads from cloud to on-prem; however, that won't be feasible (or strategic) for most. It's true that certain organizations with predictable workloads might benefit from hybrid or on-prem solutions — like large-scale social media networks. However, for most companies, the time, money, resources, and overall complexity of full-scale cloud repatriation won't be offset by the savings. Instead, they should look into a targeted optimization approach — instead of abandoning their cloud infrastructure, they can optimize it for cost, performance, and scalability. This requires a mix of FinOps, leveraging the right tools, and continuous monitoring of infrastructure economics, but teams that lean into this approach will see meaningful cost savings without sacrificing the agility and scalability that drew them to cloud platforms in the first place. — Richard "RichiH" Hartmann, Director of Community & Office of the CTO, Grafana Labs
Catch You Later, Cloud
The cloud exit movement picks up more steam, thanks to rising costs. Players both small and large will continue to leave cloud providers to find other, more budget-friendly alternatives. Tech companies with a vested interest wait to see if there's a calm after the storm. — Jay Upchurch, chief information officer, SAS
Cloud Providers and AI Users Will Share Environmental Responsibility
The rush to adopt AI is leading to inefficient models that consume vast amounts of cloud resources and contribute to a larger carbon footprint. It is not only up to hardware providers and hyperscalers to reduce environmental impact — it's a shared responsibility with the AI users managing data and AI workloads. Greater efficiency in AI model development — made possible by cloud-optimized data and AI platforms — will help to reduce unnecessary duplication and waste and minimize energy consumption. — Jerry Williams, chief environmental officer, SAS
AI and Cloud Acceleration Will Trigger a Great IT Rationalization
Businesses have long run on siloed systems, each serving a different function or customer segment. IT teams are buckling under the weight of cumbersome integrations, unable to provide the agility their enterprises need. A Great IT Rationalization is on the horizon, where business leaders will leverage the cloud to simplify their IT infrastructures and vendor relationships, gain critical speed and cut costs. Those who modernize on a cloud-native, AI-powered platform that drives multiple functions will derive the greatest value. They can achieve integrated, democratized data and decisioning capabilities that span the customer life cycle and the enterprise at large. — Stu Bradley, Sr. VP, Risk, Fraud and Compliance Solutions, SAS
Cloud Innovation Will Die Unless Walled Gardens Are Broken Open
A handful of big tech companies in the United States have historically controlled most of the world's cloud computing infrastructure. This structure encourages lock-in, consolidating access to AI by building walls around the infrastructure needed to utilize it at scale. In 2025 we're going to see a shift away from "all in one" commercially available models and towards lightweight, open-source, purpose-built deployments. This will do three things: lower the barrier to entry for startups and scaleups, improve accessibility in regions traditionally underserved by the hyperscalers, and make enterprise workloads more efficient. If we don't see this, I'm afraid cloud, and particularly AI, innovation will stagnate and AI adoption will become prohibitively costly. Closed platforms inherently lack the flexibility needed for businesses to rapidly adapt their AI tech stacks to capitalize on the latest innovations, building latency into the foundations of their tech stacks. — Kevin Cochrane, CMO, Vultr
All Hail the Sovereign Cloud
In 2025, we're going to see a real push toward sovereign and private clouds. We're already seeing the largest hyperscalers pouring billions of dollars into constructing data centers around the world to offer these capabilities. This rush to build capacity will take a while to come online; in the meantime, demand will skyrocket, fueled by a wave of legislation coming predominantly from the EU. Those with flexible, scalable, and elastic cloud infrastructure will be able to adopt sovereign or private approaches quickly. Those with monolithic, rigid infrastructure will be putting themselves behind the curve. — Kevin Cochrane, CMO, Vultr
AI: The Catalyst for the Alt-cloud
AI will become smarter and more dependable in the next year, but businesses will require agile, scalable, open, composable ecosystems to unlock its full potential — something Big Tech's cloud titans aren't capable of delivering. Enterprises will increasingly look to alternative cloud providers to supply the kind of infrastructure that supports the rapid deployment of new AI models without skyrocketing overheads. These open ecosystems will supplant the monolithic, rigid, and costly single-vendor paradigm that has disproportionately favored enterprises operating closer to the traditional tech heartlands, leveling the playing field for AI innovation across all regions of the world. — J.J. Kardwell, CEO, Vultr
Cloudy with a Chance of Doubt: Reassessing Cloud Platforms
Many organizations are increasingly shifting their HPC and AI workloads back to on-premises hybrid cluster environments, driven by some unexpected challenges from generative AI. As GenAI surged in popularity, it has used up most of the available GPUs from major cloud providers, making it hard for other users to access the resources they need when they need them — and driving costs up significantly. As a result, companies are reconsidering their cloud strategies and looking to bring servers back in-house. This hybrid approach allows them to benefit from a scalable computing environment for large tasks while still having on-premises GPUs for their AI workloads. It's all about finding the right balance that meets their needs without the headaches that come with cloud dependency. — Bill Bryce, SVP, product management, Altair
Cloud Security in the Age of Multicloud and Hybrid Environments
The shared responsibility model for cloud security is proving insufficient as multicloud and hybrid environments expand. As outages and third-party supply chain attacks increase, we'll see more reliance on developers who understand the full scope of major cloud platforms, while MFA becomes essential across all CSPs. — George Gerchow, head of trust, MongoDB
AI-Driven Cloud Economics Reshape Infrastructure Decisions
In 2025, organizations will fundamentally reshape their cloud strategies around AI economics. The focus will shift from traditional cloud cost optimization to AI-specific ROI optimization. Organizations will develop sophisticated modeling capabilities to understand and predict AI workload costs across different infrastructure options. This will lead to more nuanced hybrid deployment strategies where companies carefully balance the cost-performance trade-offs of training and inference workloads across cloud providers and on-premises infrastructure. — Haoyuan Li, founder and CEO, Alluxio
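To put rough numbers on the kind of cost modeling described above, here is a minimal sketch comparing a GPU inference workload on rented cloud capacity against amortized on-premises hardware. The hourly rate, hardware price, lifetime, and utilization figures are illustrative assumptions for this article, not quotes from any provider.

```python
# Rough sketch: monthly cost of a GPU workload on rented cloud instances
# vs. amortized on-prem hardware. All figures are illustrative assumptions.

CLOUD_GPU_HOURLY = 2.50        # assumed $/hour for one rented GPU
ONPREM_GPU_CAPEX = 30_000.0    # assumed purchase price per GPU server
ONPREM_LIFETIME_MONTHS = 36    # amortization window
ONPREM_OPEX_MONTHLY = 400.0    # assumed power/cooling/ops per GPU per month

def cloud_monthly_cost(gpu_hours: float) -> float:
    """Cost of renting cloud GPUs for the given number of GPU-hours."""
    return gpu_hours * CLOUD_GPU_HOURLY

def onprem_monthly_cost(num_gpus: int) -> float:
    """Amortized hardware cost plus operating cost for owned GPUs."""
    capex_per_month = num_gpus * ONPREM_GPU_CAPEX / ONPREM_LIFETIME_MONTHS
    return capex_per_month + num_gpus * ONPREM_OPEX_MONTHLY

if __name__ == "__main__":
    # A steady workload that keeps 4 GPUs busy roughly 60% of the time.
    gpu_hours = 4 * 0.60 * 24 * 30
    print(f"Cloud:   ${cloud_monthly_cost(gpu_hours):,.0f}/month")
    print(f"On-prem: ${onprem_monthly_cost(4):,.0f}/month")
```

Even a toy model like this makes the trade-off visible: steady, high-utilization workloads favor owned hardware, while bursty or uncertain demand favors renting.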
Cloud Cost Management Strategies Will Be Increasingly Driven by AI and Containerization
As more organizations adopt technologies like AI and containerization, robust visibility into their cloud costs will become even more important to make better decisions. With the use of edge content networks and more storage, organizations in 2025 will increasingly need a better handle on cloud cost management. Engineering and finance teams will increasingly need to understand how to normalize the cost data from many providers and then understand unit economics. AI features are exciting but often are associated with significant and variable costs. Many of those costs will be worthwhile, but without real-time monitoring of their unit economics, cost management teams will struggle to help their companies maximize those AI features' reach and economic impact. Another factor here is that more companies are shifting to outcome-based pricing for AI products, which means pricing will be tied directly to the outcomes delivered by AI agents. As a result, you can't set prices without understanding the cost per AI agent outcome, making a modern cloud cost management strategy essential. — Bill Buckley, senior vice president of engineering, CloudZero
Fragmentation in the Cloud
Companies, especially larger ones using AI, will see an increasing fragmentation in their cloud as they try out different AI services and chase where GPU and compute are available. With an explosion of exciting third-party vendors helping companies bring AI and AI-powered solutions to market, many companies will bring on new vendors in this space. Lastly, good data is the lifeblood of many of these AI trends, so companies will continue to invest in data platforms. These pressures will make it even harder for cost management teams to collect, normalize, and organize their cloud vendor costs in an actionable way. Increasingly, companies are looking to use tooling to assist with this data problem. Through either build or buy strategies, companies will need to deploy more software to assist in getting timely visibility into this increased vendor sprawl. — Bill Buckley, senior vice president of engineering, CloudZero
Higher Demand for Real-Time Insights
Many teams are still struggling to predict cloud costs, with many lacking the right tools or knowledge to make early estimates. That could be a wake-up call for leaders with a false sense of how prepared their teams are to manage cloud costs. The good news is that we expect to see continued advancements in monitoring tools in the coming year, which should improve real-time insights. The best tools will continue to offer easy-to-leverage ETL and normalization to allow a single pane of glass across all costs. They will also, as AI spending increases, make it easier and easier to bring in non-cost spending, like revenue, to understand unit economics. — Bill Buckley, senior vice president of engineering, CloudZero
FinOps Adoption to Face Uphill Battle
FinOps talks a big game, but have FinOps practices really shifted left, or is it still centralized at most companies? While the idea of integrating FinOps early in the process sounds great, many organizations still need help to make this shift. Even FinOps practitioners admit to not having widespread buy-in yet, suggesting that it's more of an aspiration than a reality for many. — Bill Buckley, senior vice president of engineering, CloudZero
The FinOps Role Will Need Redefining
The expectations for FinOps professionals are growing unsustainably broad, which will prompt a need to redefine these roles. For instance, many job descriptions ask FinOps professionals to have DevOps, architecture, and accounting skills — essentially, wearing all hats simultaneously. This could spark debates within the community about whether the role needs more specialization or if companies are setting themselves up for failure by demanding too much from too few people. This could be one reason why the market doesn't seem to have caught up in terms of hiring for FinOps roles. In 2025, organizations must look closer at how they define these roles and reevaluate the talent they seek. — Bill Buckley, senior vice president of engineering, CloudZero
More Companies Will Turn to Cloud Unit Economics to Understand Their Cloud Spend
Global cloud spending is poised to reach at least $824 billion in 2025. For SaaS companies, cloud spending is in the top three areas of overall expenditure, and AI is driving up cloud costs faster than expected. In fact, for 73% of companies, cloud costs consume at least 6% of revenue. Organizations need to find a way to change the status quo. As a result, they will increasingly turn to cloud unit economics to create more profitable pricing and packaging — including using dynamic pricing to change prices based on current market demands. To do this effectively, organizations must understand how efficient their cloud cost spending is and how much revenue is spent on cloud costs. — Bill Buckley, senior vice president of engineering, CloudZero
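To make the unit-economics idea concrete, the short sketch below computes cloud cost as a share of revenue and cost per unit of value. The dollar figures and the choice of "active customers" as the unit are illustrative assumptions, not data from any company.

```python
# Illustrative cloud unit economics; all figures are made-up assumptions.
monthly_cloud_cost = 180_000.0    # assumed total cloud bill for the month
monthly_revenue = 2_400_000.0     # assumed revenue for the same month
active_customers = 1_200          # assumed unit of value: active customers

cost_to_revenue = monthly_cloud_cost / monthly_revenue
cost_per_customer = monthly_cloud_cost / active_customers

print(f"Cloud cost as % of revenue: {cost_to_revenue:.1%}")     # 7.5%
print(f"Cloud cost per customer:    ${cost_per_customer:,.2f}")  # $150.00
# Metrics like these feed pricing and packaging decisions: if cost per
# customer grows faster than revenue per customer, margins erode.
```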
GreenOps Will Grab a Greater Foothold
Statistics show that the public cloud now has a larger carbon footprint than even the airline industry, and a single public data center uses as much electricity as 50,000 homes. Amid new regulations, particularly in Europe, coupled with consumer pressure, we predict more interest in the concept of GreenOps. Put simply, GreenOps is the practice of minimizing a cloud environment's carbon footprint by efficiently using cloud resources. This can only be done with visibility into an organization's true cloud spend and a deeper understanding of how resources are allocated. Optimizing cloud use to reduce waste will be a key part of this puzzle, leading organizations and individuals to take a closer look at their data usage. — Bill Buckley, senior vice president of engineering, CloudZero
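As a hedged illustration of the GreenOps idea, the sketch below estimates the carbon footprint of an instance fleet before and after rightsizing. The power-draw and grid-intensity constants are placeholder assumptions; real GreenOps tooling would pull these values from provider and grid data.

```python
# Rough GreenOps-style estimate of CO2 from a fleet of cloud instances.
# Power and grid-intensity figures are illustrative assumptions.
IDLE_POWER_KW = 0.10        # assumed draw of a mostly idle instance
MAX_POWER_KW = 0.30         # assumed draw at full utilization
GRID_KG_CO2_PER_KWH = 0.4   # assumed grid carbon intensity
HOURS_PER_MONTH = 24 * 30

def monthly_co2_kg(instances: int, utilization: float) -> float:
    """Estimate kg CO2 per month; power scales linearly with utilization."""
    power = IDLE_POWER_KW + (MAX_POWER_KW - IDLE_POWER_KW) * utilization
    energy_kwh = instances * power * HOURS_PER_MONTH
    return energy_kwh * GRID_KG_CO2_PER_KWH

# Same useful work (~70 "busy" instances) before and after rightsizing:
print(f"200 instances @ 35% util: {monthly_co2_kg(200, 0.35):,.0f} kg CO2")
print(f"100 instances @ 70% util: {monthly_co2_kg(100, 0.70):,.0f} kg CO2")
```

The point of the exercise is the same as the prediction's: without visibility into allocation and utilization, the "before" number is invisible and the waste never gets cut.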
Greater Adoption of Hybrid Cloud Failover Clustering
Enterprises will increasingly implement hybrid cloud architectures, combining on-premises data centers with public cloud platforms for failover clustering. This setup will ensure high availability and disaster recovery while offering flexibility and cost-effectiveness. Organizations will prioritize failover solutions that seamlessly bridge on-prem and cloud environments, allowing IT teams to leverage the cloud's resilience without abandoning existing infrastructure. — Cassius Rhue, vice president, customer experience, SIOS Technology
Shifting to AI-Driven Remediation for Stronger Cloud Resilience
In the cloud-native space, we anticipate a shift from prioritizing vulnerability detection to focusing on streamlined remediation, driven by faster, automated responses to security issues. With rising threat volumes, organizations will increasingly rely on AI-guided remediation, automated workflows, and contextual analysis to expedite fixes and reduce manual workload. Advanced tools will assign responsibility, provide targeted guidance, and adapt in real time, enhancing both accuracy and speed. This transition will strengthen cloud resilience, as organizations move from merely identifying risks to actively and efficiently closing vulnerabilities across their dynamic infrastructures. — Gilad Elyashar, chief product officer, Aqua Security
Code-to-Cloud Security Set to Redefine Protection from Development to Deployment
The convergence of cloud security and application security will drive code-to-cloud approaches to become standard in cloud security solutions. As cloud environments grow more complex, identifying and fixing security issues at the code level before production becomes essential. This approach integrates security throughout the software lifecycle — from development through runtime. With DevSecOps, CI/CD integration, and automated threat response, code-to-cloud strategies streamline security practices, making it easier to trace vulnerabilities back to their source and resolve them quickly. — Gilad Elyashar, chief product officer, Aqua Security
Automated Runtime Blocking to Lead the Charge in Cloud Security Defense
With the huge increase in the volume and frequency of attacks on cloud environments, defenders will have to automate their side as well in order to scale to the challenge. We expect more organizations will adopt runtime blocking controls to close security gaps in real time. This shift is vital in cloud-native environments, where applications and workloads operate across dynamic platforms, expanding the attack surface. Runtime blocking provides proactive defense, helping companies meet shared responsibility requirements, enhance threat response, and maintain regulatory compliance. — Gilad Elyashar, chief product officer, Aqua Security
Cloud Exposure and Consolidation Risks
Organizations will start to realize just how exposed their cloud infrastructure and applications are, and that some of their choices in the past 12-18 months were too aggressive towards consolidation, leaving large gaps. The expectation that you can replace five to six tools with a single tool is unrealistic, especially at this point of maturity in the cloud-native market. A successful transition requires restructuring teams to support this major change. Companies must completely change the way they handle cloud vulnerabilities. I expect we will see a shift back to the best-of-breed point solutions in an effort to maintain effective cloud security programs. — Rani Osnat, SVP strategy, Aqua Security
Re-evaluating Cloud vs. On-Prem Decisions
Enterprises, especially in regulated industries, will adopt a more structured and informed framework for deciding which applications will run in public clouds vs. in private cloud/virtualization infrastructure. This will be driven by performance, governance and cost considerations, as well as security. The automatic assumption that every new application needs to run in a public cloud has been losing its inertia as infrastructure and operations teams realize that, in some circumstances, they can do a better job with less effort and cost on-prem, and that not all applications need the on-demand elasticity enabled by public cloud providers. — Rani Osnat, SVP strategy, Aqua Security
GenAI to Drive the Future of Cloud Security Against Evolving Threats
Continuing from last year, GenAI will keep empowering both attackers and defenders. Attackers can now use AI to generate complex, targeted phishing, deepfakes, and adaptive malware. In response, cloud-native security solutions leverage GenAI to automate threat detection and response across distributed environments, enabling real-time analysis and predictive defense. By 2025, using AI within cloud-native frameworks will be essential for maintaining the agility needed to counter increasingly adaptive threats. — Moshe Weis, CISO, Aqua Security
Cloud-Native Solutions to Shape the Future of Data Security
With data spread across diverse cloud-native architectures, adaptive, data-centric security is essential. Cloud-native solutions now provide dynamic protection across data lifecycles, securing data at rest, in motion, and in use. This will be critical in 2025 as stricter compliance standards and more data-centric attacks demand robust, consistent security for data everywhere. In 2025, cloud-native solutions will be crucial for staying resilient, adapting to new regulations, and navigating an ever-evolving threat landscape. — Moshe Weis, CISO, Aqua Security
The Forecast Calls for Multicloud Chaos
By 2025, multicloud environments will become the "new normal," but with a twist — organizations will be navigating clouds with as much finesse as a game of Twister. A recent Gartner report found that by 2028, more than 50% of enterprises will use industry cloud platforms, but managing these clouds will resemble herding digital cats. — Ravi Ithal, GVP and CTO, Proofpoint DSPM Group
Ensuring Success of Long-Term Hybrid Work Models
In 2025, for CTOs to ensure that their organizations' technology infrastructure supports long-term hybrid work models, the basic strategy is to design systems and processes that can be moved to the cloud quickly as the need for virtual collaboration and distributed processing demands it, while also designing for cost-effective repatriation of stable workloads that don't need the cloud's elasticity or broad network access, in order to optimize spend and efficiency. — Michael Allen, CTO, Laserfiche
Building Synergy Between On-Prem and Cloud Infrastructure
In order to create seamless integration between on-prem and cloud infrastructure in the new year, organizations should adopt infrastructure-as-code tools that allow them to define the configuration of their IT infrastructure as code and deploy it to multiple environments. Additionally, retrofitting these tools on top of brownfield (existing) environments and adopting them for all greenfield (new) infrastructure projects will enhance the agility of infrastructure by significantly reducing the effort required to move services between on-prem and cloud environments. In 2025, buyers should be wary of making large investments in on-prem infrastructure that isn't compatible with infrastructure-as-code. — Michael Allen, CTO, Laserfiche
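A minimal sketch of the "define once, deploy to many environments" idea follows. It renders per-environment settings from a single declarative spec rather than using any specific IaC tool (Terraform, Pulumi, and similar tools are the usual choices); the service name, fields, and environments are hypothetical.

```python
# Minimal illustration of infrastructure-as-code: one declarative service
# spec, rendered into per-environment configurations. All names are hypothetical.
import json

SERVICE_SPEC = {
    "name": "document-api",
    "replicas": {"on_prem": 2, "cloud": 6},
    "storage_gb": 500,
    "tls": True,
}

ENVIRONMENTS = {
    "on_prem": {"provider": "vsphere", "region": "dc-local"},
    "cloud":   {"provider": "aws", "region": "us-east-1"},
}

def render(env: str) -> dict:
    """Combine the shared spec with environment-specific settings."""
    base = ENVIRONMENTS[env]
    return {
        "service": SERVICE_SPEC["name"],
        "provider": base["provider"],
        "region": base["region"],
        "replicas": SERVICE_SPEC["replicas"][env],
        "storage_gb": SERVICE_SPEC["storage_gb"],
        "tls": SERVICE_SPEC["tls"],
    }

for env in ENVIRONMENTS:
    print(json.dumps(render(env), indent=2))
```

Because the whole definition lives in version-controlled code, moving the service between on-prem and cloud becomes a rendering change rather than a re-engineering project, which is the agility the prediction describes.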
Which SaaS Vendors Will Win in AI
AI is here to stay. It's not a wave that'll go away. It's like mobile; it's like the cloud; it's like the internet. SaaS vendors are scrambling to add AI to their existing products, with mixed results. The long-term winners here will have three things: deep domain expertise, a tight feedback loop, and unique data that only they have access to. Building great products with AI isn't a matter of stuffing an LLM into existing features and seeing what sticks — and doing it right is no simple task. You need to have that unique data AND be able to utilize it in a way that deeply respects your customer's privacy; finding that balance is crucial. It's about leveraging your exclusive insight—in a privacy-minded, compliant way—to solve real problems for your customers and help them achieve outcomes they otherwise couldn't. Done right, you won't need to sell anyone anything; you're scratching an itch you already know your customer has. — Milin Desai, CEO, Sentry
Evolution of FinOps Practices
2025 will mark the end of purely operational FinOps, as teams shift from operational cost managers to strategic value enablers. This transformation will be driven by two factors: the increasing complexity of hybrid cloud environments and the C-suite's growing focus on technology ROI. Organizations that fail to make this transition risk being left behind as cloud costs continue to accelerate. This will create disruption for FinOps teams that are used to being operators and not strategists, and will challenge the notion of DIY/native tooling usage. — Kyle Campos, chief technology & product officer, CloudBolt
AI Reality Check: From Gold Rush to Trough of Disillusionment
The AI gold rush will force a fundamental rethinking of cloud economics in 2025. As organizations grapple with unprecedented GPU demands and costs, we'll see public cloud laggards make aggressive moves to public cloud. However, as AI progresses through the Hype Cycle, expect an 'AI reality check' by Q3. Organizations will shift from the exuberance of 'AI Everything' to a more measured focus on tangible business outcomes and ROI. This tension will push cloud providers to innovate new pricing models and optimization levers specifically for AI workloads. — Kyle Campos, chief technology & product officer, CloudBolt
Platform Consolidation & Integration Propelled by FOCUS
2025 will be the year of the platform play in cloud management, catalyzed by the widespread adoption of FOCUS (FinOps Open Cost and Usage Specification). As this standard enables unified visibility across public clouds, private infrastructure, and data centers, organizations will reject today's fragmented tooling landscape in favor of comprehensive platforms that can treat their entire hybrid environment as a single cloud fabric. This shift will drive significant market consolidation as vendors race to deliver complete visibility and automation across all technology investments. For the first time, organizations will be able to make truly workload-optimized decisions based on complete economic and performance data. — Kyle Campos, chief technology & product officer, CloudBolt
Cloud Value Optimization
The focus will shift from "cloud-first" to "cloud-right" as organizations demand deeper insights into workload economics across their entire cloud fabric. Understanding unit costs — especially in private cloud environments — will become crucial for strategic decision-making. We'll see the emergence of new metrics and KPIs designed specifically to measure cloud ROI and effectiveness across hybrid environments. — Kyle Campos, chief technology & product officer, CloudBolt
AI in the Cloud Moves From Simply 'Spotting Things' to Actually 'Doing Things'
Beyond data crunching and spitting out insights, AI-driven automation that turns insights into actions, automatically optimizes cloud performance and spend, and reduces the insight-to-action gap becomes the new table stakes by the end of 2025. Agentic AI gains rapid adoption and is integrated into workflows to accelerate AI impact, such that the industry begins seeing "near-real-time FinOps" for the first time. And AI begins playing a bigger role in spotting anomalies and making decisions at moments of truth at the edge as organizations continue finding ways to shift left. — Kyle Campos, chief technology & product officer, CloudBolt
The Audacious Bet — Microsoft Buys IBM
Cloud has seen vast acquisition and consolidation. But the biggest move is yet to come. In a bold move to overtake AWS in size and relevance, and in the process own the FinOps space, Microsoft makes the biggest gamble by vying to purchase IBM to acquire its cloud, tech, and customers in one of the largest acquisitions in history. — Kyle Campos, chief technology & product officer, CloudBolt
Cloud-Only Analytics Fade as Enterprises Shift to Cost-Effective, Predictable Solutions
Enterprises are on the verge of a major shift in their approach to data analytics, as cloud-only solutions face increasing scrutiny due to opaque billing practices from providers, often leading to unexpected expenses that undermine financial planning. Currently, more than half of companies identify cloud spend as a top concern, yet lack the visibility needed to truly control or optimize these costs. Without clear insights into actual usage and application requirements, businesses are essentially "driving blind," much like driving without a gauge for fuel mileage. In the next 12-18 months, this lack of transparency is likely to drive a substantial movement toward hybrid and alternative models that promise greater predictability and control. Until cloud providers deliver the transparency needed for accurate spending oversight, cloud-only models will take a back seat as businesses seek sustainable, controlled solutions that allow them to manage and optimize their usage effectively. — Chris Gladwin, CEO and founder, Ocient
Risk Management Strategies Will Embrace the Cloud
The constant specter of cyber threats and the need for data protection will compel more IT pros to situate data and applications in the cloud not solely for availability, as in the past, but for improved security, compliance, and disaster recovery capabilities. The Change Healthcare data breach disrupted the medical industry in 2024, and the CrowdStrike incident disrupted almost everything. Cloud-based risk management solutions will be more valued for business continuity and maintaining productivity. — Karen Gondoly, CEO, Leostream
Going Back to Cloud Basics for AI Success
The integration of generative AI services will continue to be a top priority for organizations in 2025, although only a small percentage will make much headway. Too many organizations took shortcuts on their transition to cloud computing, opting for lift-and-shift migrations that resulted in high costs, poor performance, and increased security risks. The risk of leveraging AI has little to do with the services themselves, and more to do with the shaky cloud computing foundations that generative AI is being built upon. Organizations will need to take the necessary steps back to establish fundamental cloud native practices in 2025 before taking the steps forward into the future of generative AI — otherwise their investments are built upon a house of cards. — Drew Firment, chief cloud strategist, Pluralsight
Fundamental Cloud Skills Will Be Key
Many enterprises are still in the early stages of adopting cloud computing, and are still building a workforce that is literate in the foundational skills required to transition to the new operating model. Until organizations can reach critical mass of cloud fluency, there are very few cloud skills that will become obsolete in 2025. While cloud fundamentals remain in high demand as organizations try to close their skills gap, the emergence of generative AI is creating a new demand on the workforce. In 2025, individuals will need to gain a firm grasp of basic generative AI technologies and clearly understand strategies and methods for applying AI services to specific use cases. — Drew Firment, chief cloud strategist, Pluralsight
The Price of Two Cloud Approaches
The current state of cloud computing is a tale of two enterprises separated by a large divide of maturity. For disciplined organizations that spent the past decade investing in cloud native practices and skills, their leaders will be focused on leveraging that solid foundation as a springboard for implementing generative AI solutions in 2025. On the other side of the chasm, enterprises that focused on "lift and shift" migrations to secure quick wins are now paying the price. As we head into 2025, those leaders will need to prioritize cost control and cloud security as a result of tactical migrations. Any aspirations for large-scale generative AI initiatives will remain in the backlog as their teams sprint to transition from legacy workloads and practices to cloud native architectures. — Drew Firment, chief cloud strategist, Pluralsight
Generative AI will Impact Cloud Hosting Models
While the optimal hosting model for delivering customer value is "all-in" on a single public cloud provider, the reality is that most organizations will be managing a hot-mess combination of private data centers, multiple public cloud providers, and an emphasis on containers. Until organizations can finally get around to migrating workloads from their mainframes, most private data centers will continue to coexist in a hybrid model with a predominant public cloud provider. While using a single public cloud provider is the ideal approach to avoid draining talent and value, the sprawl of multi-cloud will still be prevalent in 2025 given the difficulty of unwinding those decisions. Using containers as a method for encapsulating applications so they can work in these heterogeneous environments is a popular choice, but negates the opportunity to take advantage of more modern and less expensive approaches such as event-driven cloud native architectures. While organizations have used containers to manage porting their applications across the sprawl, it'll be much more difficult to ring fence their data in these complex hosting models with the anticipated demands of generative AI in 2025 and beyond. — Drew Firment, chief cloud strategist, Pluralsight
A SaaS Reckoning Is Coming
A SaaS reckoning is coming. Businesses have built elaborate tech stacks with an average of more than 300 apps. Company leaders are under pressure to cut costs, leading them to start questioning the ROI of these platforms and eliminating the ones that are not delivering value. This means that ensuring there's real ROI and making it easy to see and understand are more important than ever for SaaS companies. The eventual outcome of this will be a culling of any SaaS that has failed to deliver on its promised results and an uplift for those that have. — Michael Zuercher, CEO, Prismatic
Cloud Environments Will Face Escalating Security Risks Amidst Visibility and Cost Challenges
2025 will see rising security risks driven by limited visibility across multi-cloud environments and underinvestment in protection. With cloud attacks surging 75% in 2023, attackers will continue to exploit unsecured containers, default settings, and inadequate monitoring, particularly in platforms like Azure and Google. — Jim Broome, CTO and president, DirectDefense
Visibility Into User Experience with Cloud-Based Apps Becomes More Important
As companies reverse many of the remote and hybrid work models of the last several years, visibility and observability among corporate offices and remote locations will be increasingly important in 2025. Similarly, as enterprises roll out new cloud-based AI applications and integrations, if "User Experience" is a focus going forward, as some suggest, then maintaining visibility into the performance and user experience (UX) of cloud-based applications needs to be a top priority. Unfortunately, too often, it is difficult for organizations to appreciate the level of visibility they need until after their employees or customers encounter UX issues with their mission-critical applications. — Eileen Haggerty, AVP, NETSCOUT
Thanks to AI, Hybrid Cloud Is Here to Stay
Only about two years ago, it was a very "cloud only" environment with some companies ready to get rid of their data centers altogether. The reality is, many businesses still have over half their data living outside of the cloud — and it will likely stay there based on what makes the most sense for their use case (in high stakes environments such as healthcare, for example). Therefore, hybrid cloud strategies are alive and well, especially with the proliferation of AI. Organizations can maintain on-premises GPU infrastructure for consistent, high-priority workloads while using cloud GPUs for burst capacity. This avoids complete lock-in to cloud providers' premium GPU pricing and grants better control over total cost of ownership for expensive AI infrastructure. — Haseeb Budhani, co-founder and CEO, Rafay Systems
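A hedged sketch of the placement decision that hybrid GPU strategy implies: keep steady work on owned hardware and burst overflow to rented cloud GPUs. The capacity figures and the burst threshold below are invented for illustration.

```python
# Toy placement policy for a hybrid GPU strategy: steady work stays on
# owned hardware, overflow bursts to cloud GPUs. Figures are illustrative.
ONPREM_GPUS = 8
BURST_THRESHOLD = 0.85   # burst once owned GPUs are ~85% committed

def place_job(job_gpus: int, onprem_in_use: int) -> str:
    """Return 'on_prem' or 'cloud_burst' for a job needing job_gpus GPUs."""
    projected = (onprem_in_use + job_gpus) / ONPREM_GPUS
    if projected <= BURST_THRESHOLD:
        return "on_prem"
    return "cloud_burst"

print(place_job(job_gpus=2, onprem_in_use=3))  # on_prem
print(place_job(job_gpus=4, onprem_in_use=5))  # cloud_burst
```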
2025 Will Be the Year of Cloud Cost Optimization and Data Governance
As budgets tighten in 2025, organizations will intensify their scrutiny of operational efficiencies and cloud spending. Instead of prioritizing development speed, companies will focus on return on investment and total cost of ownership, conducting granular cost assessments at the application level. Revenue generated by applications will be compared to their development and maintenance costs, potentially accelerating the shift back to on-premises or hybrid environments. The complexity and expense of cloud-native modernization, coupled with increasing data privacy and AI regulations, will drive a renewed emphasis on data control and governance. — Lee Faus, global field CTO, GitLab
Slow and Steady Will Not Win the Cloud Modernization Race
The traditional "slow and steady" approach to cloud modernization isn't working — traditional engagement models often prioritize time and materials over measurable transformation outcomes. Organizations will need to break free from gradual modernization patterns and embrace more decisive approaches. We'll see increased adoption of modernization strategies that emphasize speed and efficiency, backed by clear metrics and business outcomes. The market will not forever tolerate endless modernization cycles that fail to deliver real transformation. Organizations must either modernize decisively or watch more agile competitors capture market share. — Amir Rapson, CTO, CCSO, and co-founder, vFunction
Hybrid Cloud Environments Will Embrace Advanced Multi-Agent Observability
In 2025, hybrid cloud environments will see a major shift toward advanced observability solutions built on multi-agent architectures. This surge in adoption will unify visibility across on-premises, cloud, and multi-cloud infrastructures, using AI and machine learning to predict and prevent issues before they occur. Organizations need tools that can deliver a real-time view of complex, interconnected systems and empower IT teams to proactively manage discrepancies, optimize performance, and enhance security. With these capabilities, companies will be better equipped to ensure reliable performance, security, and scalability across diverse infrastructures. — Gab Menachem, VP ITOM, ServiceNow
Cloud Platforms Evolve to Prioritize Simplicity, Security, Compliance
In 2025, the cloud will evolve into a core platform prioritizing simplicity, security, and compliance. By integrating AI-driven insights and automation, cloud environments will empower diverse teams—not just developers—to drive efficient, secure, and compliant workflows. This growth will solidify the cloud as essential infrastructure, supporting seamless digital operations and driving strategic business growth. — Gab Menachem, VP ITOM, ServiceNow
AI Will Fuel Cloud Growth
From a client perspective, AI-augmented application development lends itself well to the service-oriented architecture that helps businesses achieve the benefits of moving to the cloud. This not only benefits the hyperscalers but can also enable businesses to accelerate their application modernization efforts — which, in turn, allows them to see the business value around increased efficiency and improvement of top and bottom-line numbers. Ultimately, I believe that AI-augmented software development will contribute to cloud growth and modernization efforts by removing complexity, increasing quality and reducing the cost for cloud modernization to improve the ROI. — Darlene Burke, director, Application Services NORAM, SoftwareOne
Cloud-Native Will Replace 40% of Traditional VMware Deployments
One of the beauties (and costs) of capitalism is that dramatic price increases to existing technologies drives innovation and adaptation. With average price increases of 325% (and some substantially higher), and three-year lock-ins, both the buy side and the sell side of this market segment will scream for alternatives, and they will show up. The coming year will see legacy VMware users re-evaluate their options, with many pivoting to cloud-native approaches for greater flexibility. This shift for smaller organizations is already a no-brainer, but the real coup will be in federal and large enterprise entities. While there are no technologies capable of making the transition as simple as checking a box (due to patent infringement and scary men in black suits showing up at your office), rethinking the convoluted Gordian knot of interconnectedness into more straightforward implementations with modern approaches is the clear and effective way forward. — Charles Ruffino, fellow, Cloud Architecture, SoftIron
CIOs and Regulators Must Address Cloud Concentration Risk
In the wake of major vendor outages such as CrowdStrike, CIOs and regulators are both focusing on cloud concentration risk. Truly business-critical applications need to be resilient against both regional and cloud-wide failures, which requires a fully distributed database architecture supporting seamless failover from one cloud region to another, even across clouds. Expect regulators to start scrutinizing cloud provider dependencies at the application level. — Phillip Merrick, co-founder and CEO, pgEdge
Support Instantaneous Failover from One Cloud to Another
In the coming year we expect to see ever more attention on multi-cloud and hybrid-cloud architectures. For resilience against both region-wide and provider-wide cloud failures, organizations need to architect their "always on" applications to support instantaneous failover from one cloud to another, or from the cloud to on-premises infrastructure. Fully distributed databases do a great job of underpinning these architectures, often with minimal application code changes. — Phillip Merrick, co-founder and CEO, pgEdge
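As a hedged sketch of the application-side half of that pattern, the function below tries an ordered list of database endpoints (primary region, secondary region, another cloud) and returns the first one that answers a basic reachability check. The endpoints are placeholders, and a real distributed database handles replication and consistency underneath; this only illustrates how little application code the failover itself needs.

```python
# Sketch of application-side failover across an ordered list of database
# endpoints in different regions/clouds. Hostnames are placeholders.
import socket

ENDPOINTS = [
    ("db.us-east.cloud-a.example.com", 5432),   # primary region
    ("db.eu-west.cloud-a.example.com", 5432),   # secondary region
    ("db.us-east.cloud-b.example.com", 5432),   # different cloud provider
]

def first_healthy(endpoints, timeout=0.5):
    """Return the first endpoint that accepts a TCP connection."""
    for host, port in endpoints:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return host, port
        except OSError:
            continue
    raise RuntimeError("no database endpoint reachable")

# The application connects to whichever endpoint is currently reachable, so a
# region- or provider-wide outage costs only a failed connection attempt or two.
```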
Cloud-Powered Platforms: Backbone of Cybersecurity
Cloud-powered platforms are becoming the new backbone of cybersecurity, where AI-driven integration outperforms standalone tools. By unifying diverse security operations, these platforms simplify complexity and enable organizations to manage threats and vulnerabilities across the cloud more effectively and efficiently. — Brian McHenry, head of cloud security engineering, Check Point Software
Sovereign Clouds Will Protect AI Data
Both businesses and countries will prioritize the creation of sovereign clouds in 2025, driven by the need to control and protect sensitive AI data. With artificial intelligence emerging as a critical driver of economic and political power, organizations and governments will demand private cloud solutions that ensure data remains within specific jurisdictions. Sovereign clouds will align with strict regulatory frameworks such as GDPR and act as a counterweight to the dominance of global hyperscalers.
A critical enabler of sovereign clouds will be GPU virtualization through advanced infrastructure software. As AI workloads require high-performance computing, the ability to virtualize GPUs will allow organizations to maximize hardware utilization and dynamically allocate GPU power across workloads. This capability will not only optimize AI operations but also ensure that data remains secure and private within sovereign environments. — George Crump, CMO, VergeIO
Rise of the Open Cloud
The Open Cloud concept will continue to evolve rapidly in 2025, driven by growing demand for flexibility, interoperability, transparency, and control in cloud services. As organizations increasingly rely on multi-cloud and hybrid-cloud architectures, the Open Cloud's principles — openness, collaboration, and user empowerment — will become even more central to the way cloud computing is delivered and consumed. The need for seamless integration between different cloud providers and on-premises systems will drive further development of open-source standards and frameworks. We expect to see more efforts towards multi-cloud compatibility, making it easier for companies to run workloads across various clouds without being locked into one provider's ecosystem. — Pat Patterson, chief evangelist, Backblaze
Security, Data Sovereignty, and Compliance Take Center Stage in Cloud Computing
With increasing concerns about data privacy, breaches, and regulatory requirements, security, data sovereignty, and compliance continue to be critical areas of focus in cloud computing. We see organizations deploying immutable storage to guard against malware infection and active hacking attacks as well as increased sensitivity to the geographical location of data as governments implement stricter data sovereignty laws. — Pat Patterson, chief evangelist, Backblaze
AI/ML Innovations Redefine Cloud Computing Strategy
Despite reports of its impending implosion, AI/ML continues to dominate the cloud computing agenda. Cloud storage providers will tap AI to build a new layer of enhanced services on their existing data storage/retrieval foundations, delivering business insights derived from both the data itself and the way in which it is accessed. — Pat Patterson, chief evangelist, Backblaze
Threat Actors Will Turn Their Eyes to Cloud Technologies
As organizations migrate workloads to the cloud for cost-efficiency and faster delivery, cloud assets are becoming more attractive targets for attackers. Consolidating critical functions like identity and authentication in the cloud, while potentially enhancing security, also creates a larger, more valuable target: the centralized identity and authentication provider becomes a single point of compromise, making it easier for threat actors to breach multiple systems through one point of access. In 2025, we will see cybercriminals seek to compromise a single, centralized cloud access point for an opportunity to breach all of a company's most important assets. — Paul Reid, VP of Adversary Research, AttackIQ
Cloud Security and Automation Will Drive the Future of Cybersecurity
In 2025, the focus on cloud security will intensify as organizations deepen their reliance on cloud infrastructure and adopt advanced tools to defend against increasingly sophisticated threats. At the same time, the shift toward proactive strategies — such as predictive threat modeling and risk management — will gain significant momentum, driving the need for greater expertise in automation, threat intelligence, and DevSecOps. As network professionals are already spending up to 50% of their workweek on manual tasks, automation will become a key priority, helping to streamline processes and free up valuable resources. These trends, which have been growing steadily, will see broader adoption by 2025, underscoring the critical need for continuous upskilling in automation and other essential cybersecurity capabilities. — Adi Dubin, VP of product management, Skybox Security
Edge Computing Trends in 2025:
AI Moves Closer to the Edge
In the year ahead, we anticipate AI at the edge will further enhance applications and improve efficiency with increasingly specialized edge-AI chips that can enable tasks with lower power consumption. AI techniques like TinyML and model quantization will continue to advance, allowing more sophisticated AI algorithms to run on resource-constrained devices. We expect more real-time speech recognition, computer vision, and predictive maintenance on small edge devices, along with more local data processing. Current edge applications mostly use pre-trained models, but a move toward real-time, on-device training and fine-tuning will become more common. This means edge devices could adapt and learn from local data over time, improving performance and personalization without relying on cloud retraining. — Rashmi Misra, chief AI officer, Analog Devices
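To make "model quantization" concrete, the NumPy sketch below applies simple 8-bit affine quantization to a weight tensor, the basic trick TinyML toolchains use to shrink models for resource-constrained devices. The scale/zero-point scheme shown is the textbook version, not any particular framework's implementation.

```python
# Minimal 8-bit affine quantization of a weight tensor, the core idea
# behind shrinking models for resource-constrained edge devices.
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights onto int8 with a per-tensor scale and zero point."""
    w_min, w_max = float(w.min()), float(w.max())
    scale = (w_max - w_min) / 255.0
    zero_point = round(-w_min / scale) - 128
    q = np.clip(np.round(w / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return (q.astype(np.float32) - zero_point) * scale

weights = np.random.randn(256, 64).astype(np.float32)
q, scale, zp = quantize_int8(weights)
error = np.abs(weights - dequantize(q, scale, zp)).max()
print(f"int8 storage: {q.nbytes} bytes vs float32: {weights.nbytes} bytes")
print(f"max round-trip error: {error:.4f}")
```

The 4x storage reduction (and the corresponding drop in memory bandwidth and energy) is what lets increasingly capable models fit on microcontroller-class edge hardware.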
AI Inference at the Edge
AI inference at the edge is rapidly approaching mainstream adoption, with applications like image processing, PPE detection, and efficient oil well operations. Key drivers include advancements in edge-specific hardware, ecosystem readiness, and regulatory compliance. Real-world use cases also span computer vision and predictive maintenance. — Eva Feng, VP of product, ZEDEDA
The Future of Edge Investment
The most compelling investment opportunities lie in AI/ML at the edge, with real-time inference and edge-native services growing rapidly, particularly in healthcare and energy. Additionally, edge security and data management are crucial, as the demand for endpoint authentication and compliance solutions increases, and localized data processing drives innovation in analytics without the need to transfer all data to the cloud. — Eva Feng, VP of product, ZEDEDA
Edge Orchestration Solutions Become Crucial for Critical Infrastructure Security
Growing concerns over critical infrastructure security and the increasing sophistication of cyberattacks, including state-sponsored threats, highlight the urgent need for robust edge orchestration solutions. These solutions must incorporate built-in zero-trust security, encryption, real-time threat detection, and compliance with data security and sovereignty requirements. As global privacy regulations continue to evolve, many jurisdictions now mandate that data be processed and stored locally. Edge computing, with its ability to secure and manage diverse network devices while maintaining awareness of data and system integrity, is emerging as a critical solution for protecting distributed systems. — Eva Feng, VP of product, ZEDEDA
Edge Computing Adoption Accelerates
Organizations will accelerate their adoption of edge computing, taking advantage of the increasing capabilities of edge devices to reduce spend on centralized compute. We'll see continuing separation of the control and data planes, with edge devices interacting directly with cloud storage instead of sending data via a server application. — Pat Patterson, chief evangelist, Backblaze
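A hedged sketch of that "edge device talks straight to object storage" pattern: a device uploads a reading to an S3-compatible bucket rather than posting it to an application server in the data path. The endpoint URL, bucket name, and credentials are placeholders; S3, Backblaze B2, and most object stores expose a compatible API.

```python
# Sketch: an edge device uploads a reading directly to S3-compatible object
# storage instead of routing it through an intermediary server application.
# Endpoint, bucket name, and credentials below are placeholders.
import json
import time
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-object-store.com",  # placeholder endpoint
    aws_access_key_id="EDGE_DEVICE_KEY_ID",
    aws_secret_access_key="EDGE_DEVICE_SECRET",
)

reading = {"device": "sensor-042", "temp_c": 21.7, "ts": time.time()}
s3.put_object(
    Bucket="edge-telemetry",                      # placeholder bucket
    Key=f"sensor-042/{int(reading['ts'])}.json",
    Body=json.dumps(reading).encode("utf-8"),
    ContentType="application/json",
)
# Dashboards and batch analytics read from the same bucket, so the control
# plane and the data plane stay separated, as the prediction describes.
```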
Remote Office/Branch Office (ROBO) Will Lead the 'Edge' Conversation
While edge computing has been heralded as the next big thing, its full realization is likely a 2030 event. In 2025, the focus will shift to remote office/branch office (ROBO) environments as a more immediate and actionable opportunity. Data center infrastructure software (DCIS) platforms will enable centralized management of these remote sites, offering improved reliability and simplified operations. ROBO environments will drive the edge conversation forward, delivering practical benefits to distributed organizations. — George Crump, CMO, VergeIO
AI Moves to the Edge
With increasing emphasis on data privacy and real-time processing, AI will continue its journey from the cloud to the edge. Think about autonomous cars making split-second decisions on the road, wearables that personalize health recommendations in real time, and smart home devices that process locally without sending data off-device. Edge AI will enable responsive, private, and latency-free applications that feel personal. 2025 will see the first commercial applications where advanced intelligence operates at the edge. — Luca Antiga, CTO, Lightning AI
Prevalence of GenAI Use Cases Will Produce Small Language Models for Edge Computing
As edge devices struggle with resource and performance limitations, Small Language Models (SLMs) tailored for these environments will provide a viable solution, enabling more efficient GenAI deployment at the edge. Initiatives like ONNX, which promote multi-platform compatibility and hardware acceleration, will play a pivotal role in enhancing performance. This will pave the way for a broader integration of GenAI applications in edge computing, especially in industries that demand real-time processing and localized intelligence, such as healthcare, manufacturing, and autonomous systems. — Pankaj Mendki, head of emerging technology, Talentica Software
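As a rough sketch of running a small exported model at the edge with ONNX Runtime (the kind of multi-platform, hardware-accelerated runtime the prediction alludes to), the snippet below loads a hypothetical quantized model file and runs one inference pass. The file name, input tensor name, and token values are assumptions about the exported model, not a real artifact.

```python
# Sketch: run a small quantized language model at the edge with ONNX Runtime.
# The model file, tensor names, and input values are hypothetical.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "slm-int8.onnx",                        # hypothetical quantized SLM export
    providers=["CPUExecutionProvider"],     # swap in an accelerator EP if available
)

token_ids = np.array([[101, 2054, 2003, 1996, 5703, 102]], dtype=np.int64)
outputs = session.run(
    None,                                   # fetch all model outputs
    {"input_ids": token_ids},               # assumed input tensor name
)
print(outputs[0].shape)
```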
2025 Is the Year of the Platform
If 2024 was the year of the LLM, 2025 will be the year of the platform. There's no shortage of models on the market — plenty to address just about any use case. But there's no point for businesses in talking about models if you don't have a strong platform to support them. In 2025, technology leaders will shift their focus toward investing in platforms that have built-in security, grounding capabilities to reduce hallucinations, and can serve as a one-stop-shop to bring the potential of these models to life. — Raj Pai, Vice President, Product Management, Cloud AI, Google Cloud
Edge Computing Will Have a Significant Impact on Cloud Landscape
The combination of edge computing — the practice of processing data near the edge of the network where it is generated — and 5G will have a significant impact on the cloud landscape. The synergy between the technologies will enable real-time data processing and low latency, minimizing the need to send data to centralized clouds. Enterprises will become less reliant on centralized data centers and, as a result, will be well positioned to reap benefits such as reduced bandwidth usage and associated costs, as well as improved scalability for applications. Currently, we are seeing massive growth in IoT and low-latency apps. With this growth, we expect to see demand for more localized computing so that IoT devices can make quick decisions instantly and locally. This will have wide-ranging applications in fields as diverse as healthcare (wearables administering medicine to patients without user input) and industrial machinery (sensors automatically controlling machines). Looking ahead, we can expect the combination of edge computing and 5G to bring greater speed to operations, as well as better scalability and accessibility for emerging technologies. — Alex Galbraith, CTO, Cloud Services, SoftwareOne
Edge Computing to Become a Shield for Data
Security and IoT breaches are continuing to rapidly evolve and are expected to continue in 2025. If data becomes the "new oil," companies and countries will want to keep theirs close to the vest. In an increasingly uncertain world, investments in on-prem infrastructure will increasingly serve compliance needs for businesses around the world. — Dan Wright, CEO and co-founder, Armada
AI Inches Closer to the Edge
2025 will be the year of real-time, multimodal AI. AI will join the action alongside humans and machinery in entirely new ways, bringing data from sensors, drones, robotics, and machinery together to take action. — Dan Wright, CEO and co-founder, Armada
Massive Edge Computing Investments to Come
As companies invest in a growing number of edge sites in 2025, the needs at the edge will continue to evolve and grow. We're still in the early days of deploying robotics and autonomy on the jobsite; however, the investment in these capabilities is not slowing down. As much as $378 billion is projected to be invested in edge computing by 2028, according to IDC. With this, massive planned investments are coming down the pipeline, and this kind of AI and autonomous work will help keep humans safe from dangerous tasks while helping companies grow revenue through faster production at scale. — Dan Wright, CEO and co-founder, Armada
The Supply Chain Strain of a Mobile, High-Demand Future
Edge computing relies not only on access to cutting-edge chips but also on a supply chain that can take them to remote areas. Massive demand for these chips will continue to forge ahead; however, increasingly mobile customers could stretch the industry's nascent supply chains. — Dan Wright, CEO and co-founder, Armada
Energy and Defense Reach an AI Tipping Point
In 2025, AI will hit its tipping point in energy, with edge computing bringing intelligence directly to the oil rig. Much like how railroads revolutionized the oil industry by unlocking new markets in the 19th century, cutting-edge computing infrastructure will transport AI to the farthest reaches of the edge in the 21st century. 2025 will also mark a seismic shift in defense, as edge computing becomes indispensable in the era of autonomous warfare. It's the modern-day railroad that delivers AI to the frontlines, empowering the U.S. military to navigate the complexities of the battlefield with unprecedented speed and precision. — Dan Wright, CEO and co-founder, Armada
Using Edge to Democratize AI
I'm looking at the potential expansion of edge computing (thanks to 5G proliferation, we're putting data processing closer to the data source and reducing latency) as a way to democratize AI. Can we build efficient AI apps that run on mobile devices and may or may not incorporate cloud resources? If 5G is available to field technicians, they may be able to use AI to assist in their jobs: medical experts with diagnosis and treatment in disaster areas where 5G is available but wi-fi is not; engineers and scientists making decisions in the field with AI-assisted research and calculations based on real-world conditions; and so on. — Jerod Johnson, Sr. technical evangelist, CData
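The "may or may not incorporate cloud resources" idea above lends itself to a local-first fallback pattern. The sketch below is an editorial illustration under stated assumptions: local_model(), cloud_model(), and the example.com endpoint are hypothetical placeholders, not a real service.

```python
# Hedged sketch of "cloud when reachable, on-device otherwise."
import socket
import urllib.request

def cloud_reachable(host="example.com", timeout=1.0):
    """Cheap connectivity probe; a real app would check its actual endpoint."""
    try:
        socket.create_connection((host, 443), timeout=timeout).close()
        return True
    except OSError:
        return False

def local_model(prompt):
    return f"[on-device answer to: {prompt}]"   # stand-in for a small local model

def cloud_model(prompt):
    # Stand-in for a call to a larger hosted model over 5G (hypothetical URL).
    req = urllib.request.Request("https://example.com/infer", data=prompt.encode())
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.read().decode()

def assist(prompt):
    if cloud_reachable():
        try:
            return cloud_model(prompt)
        except OSError:
            pass                                 # degrade gracefully to local
    return local_model(prompt)

print(assist("Suggest next diagnostic step for symptoms X and Y"))
```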
Smarter Edge (Client) Applications
In 2025, expect a rise in intelligent, edge-centric applications that enhance user productivity. With companies like Intel, AMD, and Qualcomm releasing AI-enabled CPUs for client devices, the term "AIPC" (AI-Powered Client) is gaining traction. While applications like Zoom and Microsoft Office 365 Copilot have started to tap into these capabilities, there's vast untapped potential across the client ecosystem, including independent software vendors (ISVs). This could lead to a broader range of AI-enabled applications tailored to client devices. — Peter Morales, CEO, Code Metal
Growth of Heterogeneous Mesh Networks in Edge Development
In 2025, mesh networking is expected to expand through heterogeneous sensor networks, similar to Apple AirTags, which have no GPS of their own and instead piggyback on location data from nearby Apple devices. With the proliferation of Starlink, 5G, and LTE-enabled edge devices, we anticipate a rise in remote sensors that can relay AI-processed insights through low-powered mesh networks. These systems will enable high-bandwidth sensor data, like video, to be summarized locally, allowing even isolated sensors to extend their range and send key insights through a network of connected nodes. — Peter Morales, CEO, Code Metal
The Rise of Agentic AI Will Push Us to the Edge
In 2025, agentic AI will leap from imaginary to necessary, quickly redefining enterprise automation. Self-directed AI applications will allow organizations to make real-time, data-driven decisions, particularly in sectors already making use of sovereign and private clouds. Expect early enterprise-level adopters to crop up in places where CapEx isn't an issue, deploying high-performance GPU and CPU clusters for mission-critical applications. Simultaneously, lighter agentic AI solutions will flourish through alternative cloud providers, enabling serverless inference at the edge, slashing costs and complexity. By outsourcing infrastructure management, businesses will be able to focus on optimizing the AI application layer, unlocking unparalleled productivity and customer engagement. To support the massive scale of AI inference required, organizations will increasingly deploy specialized models paired with vector databases and RAG at edge locations. This edge-focused architecture will deliver the ultra-low latency needed for AI agents to effectively support the volume of AI interactions needed for agentic AI at scale. — J.J. Kardwell, CEO, Vultr
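The mention above of specialized models paired with vector databases and RAG at edge locations can be illustrated with a small sketch. Everything here is a hypothetical stand-in: the toy embedding, the in-memory index, the documents, and the local_llm() call are placeholders for a real on-device embedding model, edge vector store, and locally served model.

```python
# Minimal editorial sketch of edge-located retrieval-augmented generation (RAG).
import math

DOCS = {
    "doc1": "Pump 7 vibration above 4 mm/s requires throttling to 60%.",
    "doc2": "Retail kiosk returns are processed at counter 3 after 6 pm.",
}

def embed(text):
    """Toy embedding (character histogram); a real deployment would use a
    small on-device embedding model."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - 97] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

INDEX = {doc_id: embed(text) for doc_id, text in DOCS.items()}

def retrieve(query, k=1):
    # Cosine-style similarity over the tiny local index.
    q = embed(query)
    scored = sorted(INDEX.items(),
                    key=lambda item: -sum(a * b for a, b in zip(q, item[1])))
    return [DOCS[doc_id] for doc_id, _ in scored[:k]]

def local_llm(prompt):
    return f"[edge model answer grounded in: {prompt!r}]"   # placeholder

question = "How should pump 7 respond to high vibration?"
context = " ".join(retrieve(question))
print(local_llm(f"Context: {context}\nQuestion: {question}"))
```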
Rise of Industry-Specific GenAI Inference Solutions at the Edge
2025 will be the year when GenAI inference solutions designed for specific industries, such as automotive, manufacturing, and retail, grow and gain traction. Customized GenAI solutions allow companies to leverage foundational AI models augmented with their proprietary data to deliver real-time insights about the business that are precise and context-aware. Moreover, performing customized GenAI inference at the edge enables taking action in real time. In the automotive sector, edge-based AI can interpret sensor data for autonomous driving, providing immediate responses in dynamic environments. Manufacturers will benefit from AI-driven quality control systems, allowing defects to be identified and addressed in real time, enhancing productivity and reducing waste. For retail, edge AI can support personalized customer interactions by analyzing shopper behavior instantly and adjusting experiences accordingly. By focusing on these specialized edge-based solutions, companies will gain a competitive edge in creating responsive, efficient, and cost-effective AI applications that align with industry-specific needs. — Yoram Novick, CEO, Zadara
Expansion of Edge AI for Applications that Require Privacy
In 2025, industries that work with sensitive data, like healthcare and finance, will prioritize deploying AI inference directly at the edge to maintain privacy and comply with regulatory standards. This approach enables data processing on local devices instead of on cloud servers, reducing the risk of data breaches while providing a faster, more secure user experience. For example, healthcare providers may employ on-device AI for real-time patient monitoring, diagnosis assistance, and treatment recommendations while adhering to data protection laws. Financial institutions can benefit from on-site AI models to detect fraud, minimizing latency and safeguarding sensitive customer data. The shift towards privacy-sensitive, edge-based AI solutions will continue to help industries operate within regulatory frameworks, ensuring security and efficiency. — Yoram Novick, CEO, Zadara
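As an editorial illustration of the privacy pattern above, the sketch below scores a transaction on a local device and transmits only an identifier and a flag, never the raw record. The rule-based scorer and the reporting function are hypothetical stand-ins for a real on-device fraud model and upstream channel.

```python
# Illustrative sketch: on-device scoring, only a boolean flag leaves the device.
def fraud_score(txn):
    """Tiny rule-based stand-in for an on-device fraud model."""
    score = 0.0
    if txn["amount"] > 5000:
        score += 0.6
    if txn["country"] != txn["home_country"]:
        score += 0.3
    if txn["hour"] < 5:
        score += 0.2
    return min(score, 1.0)

def report_flag(txn_id, flagged):
    # Only the identifier and a boolean are sent upstream; raw data stays local.
    print(f"sent upstream: txn={txn_id}, flagged={flagged}")

txn = {"id": "t-1029", "amount": 7200.0, "country": "DE",
       "home_country": "US", "hour": 3}
report_flag(txn["id"], fraud_score(txn) >= 0.7)
```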
Edge Databases Will Be Table Stakes for Next-Gen AI
We've already seen organizations move away from a cloud-only approach and embrace edge AI for inferencing and real-time decisions, enabling faster processing with minimal latency. By bringing data processing and models closer to data, organizations are able to get more value out of their applications and make faster business decisions. This is why, in the next year, edge databases will be a must for enabling agentic AI — which we see as an increasingly prominent trend. Edge databases can process data closer to the source, reduce latency and allow AI applications to rapidly perceive their environment and take immediate actions. By processing data at the edge, agentic AI can perform time-sensitive actions, such as managing industrial equipment or making real-time retail recommendations. Plus, edge databases provide the low-latency data access and processing required for agentic AI to truly behave as an autonomous agent. The ability to rapidly ingest, analyze and act on local data empowers agentic AI to quickly adapt to dynamic real-world conditions, a key tenet of agency. Overall, the rise of edge AI and edge databases will be mutually reinforcing trends. Edge databases will facilitate the deployment of agentic AI by providing low-latency access to data, while edge AI will drive demand for edge databases that can support time-sensitive analytics and decision-making. As agentic AI proliferates, edge databases will become a critical infrastructure for powering these autonomous, intelligent applications. — Rahul Pradhan, VP of product and strategy, Couchbase
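The ingest-analyze-act loop described above can be sketched with a generic local store. The example below uses SQLite purely as a stand-in for an edge database (not the vendor's actual product); the schema, threshold, and act() step are illustrative assumptions.

```python
# Editorial sketch of a local ingest-analyze-act loop at the edge.
import sqlite3
import random

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts INTEGER, sensor TEXT, value REAL)")

def act(sensor, avg):
    print(f"agent action: slowing {sensor}, 10-reading average={avg:.1f}")

for ts in range(100):
    # Ingest locally: no round trip to a central cloud database.
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)",
                 (ts, "line-3", random.uniform(50, 110)))
    # Analyze the most recent window with a local, low-latency query.
    (avg,) = conn.execute(
        "SELECT AVG(value) FROM (SELECT value FROM readings "
        "WHERE sensor = 'line-3' ORDER BY ts DESC LIMIT 10) AS recent").fetchone()
    # Act immediately when the local condition is met.
    if avg is not None and avg > 90:
        act("line-3", avg)
conn.close()
```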