October 14, 2025

Industry Reports Agree: DevOps is the Key to Unlocking AI's Potential

Recent industry research shows that AI is accelerating code creation but producing mixed results downstream. The same research shows that better platforms and pipelines yield better outcomes for teams adopting AI for coding.

Every engineering leader I talk to is asking the same questions about AI coding assistants: How much faster can we ship? How much more productive can my developers be?

On the surface, the answers look pretty good. The 2025 "State of AI in Software Engineering" report from Harness found that 63% of organizations report shipping code to production faster since adopting AI. Developers certainly feel more productive, and who are we to argue with feelings?

Here's the thing, though: this acceleration is telling a more complicated story. While developers are spending less time typing, a study from the research nonprofit METR found that for experienced developers, the impact of AI could actually be negative, even though they felt that they were moving faster.

This highlights a growing consensus I'm seeing across the industry: we are facing an AI Velocity Paradox. AI makes generating that first draft of code easier than ever. But figuring out if that code is actually good—functional, performant, and secure—well, that still takes time.

What we’re seeing is that when you hold the quality bar high, as METR did, velocity can sometimes dip with AI. More often, organizations are letting their quality bar slip, and stability issues are emerging in production. AI is supercharging the front end of the software development lifecycle (SDLC), and this flood of new code demands a serious upgrade to our feedback loops—the core promise of DevOps—to manage all this change.

Part 1: How AI is Reshaping a Developer’s Day

Let’s be clear: AI is absolutely changing the coding experience. The latest DORA "State of AI-assisted Software Development" report found that AI adoption now positively correlates with software delivery throughput—a complete reversal from the previous year. Developers say it's great for boilerplate, scaffolding, and getting quick options on the table.

But to get the full picture, we have to look at how the work itself is changing.

While developers feel faster, the METR study uncovered a critical nuance: the work isn't eliminated—it changes. The cognitive load moves from typing to a whole new, demanding set of tasks: specifying what’s needed, validating the AI's output, carefully reviewing its logic, hunting for subtle bugs, and trying to integrate it into a decade's worth of architectural decisions the AI knows nothing about. Think less bricklayer, more architect meets building inspector.

This shift from creation to verification is so profound that the METR study found experienced developers sometimes took 19% longer on certain tasks, even as they felt more productive. The end result is the same: a firehose of new code, pull requests, and changes aimed directly at your delivery pipeline. And frankly, that pipeline is starting to buckle.

Part 2: The Downstream Bottleneck

This is the heart of the AI Velocity Paradox. In the Harness report, a respondent put it perfectly, describing it as "squeezing a balloon - the volume of work stays the same, it's just forced from one side to another".

The data backs this up. The imbalance in automation across the SDLC is stark. While coding workflows are 51% automated on average, that number drops to just 43% for CI/build pipeline creation and continuous delivery. We're simply generating code faster than we can validate and deploy it.

The consequences are as predictable as they are severe:

  • Increased Failures: Nearly half (45%) of all deployments linked to AI-generated code lead to problems.
  • Rising Instability: The DORA report found that while throughput is up, AI adoption is still associated with a problematic increase of about 9% in “software delivery instability”. Their conclusion is blunt: our systems "have not yet evolved to safely manage AI-accelerated development".
  • Growing Risk: Almost half (48%) of teams in the Harness report are worried they will see an increase in software vulnerabilities from using AI coding assistants. I think that might be an optimistic take.

We’re driving faster on bad roads. Sometimes we get there faster. Sometimes we crash.

Part 3: The DevOps Decoupling Point - How to Win

So what does this mean? How do you tackle the paradox?

Code generation isn’t the problem. The feedback loop is. How quickly can you determine if a change is beneficial or detrimental? How fast can you fix it if it's not? Amplifying these feedback loops is DevOps 101, and it's never been more critical. The answer isn't in the code creation phase, but in everything that comes after it.

Both the DORA and Harness reports, despite their different approaches, converge on a single, powerful conclusion: mature DevOps practices are the critical mitigating factor. This is the decoupling point that separates the teams who are just creating chaos faster from those who are actually delivering value faster.

The Platform as an Amplifier

The DORA report highlights the importance of having a "quality internal platform". A good platform is what operationalizes these feedback loops at scale, giving you standardized pipelines, automated governance, and developer-friendly guardrails. It’s the foundation you need to let the benefits of AI actually scale. DORA's research found that a high-quality platform amplifies the positive effects of AI adoption on organizational performance.
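
To make that concrete, here is a minimal sketch of the kind of guardrail a good platform standardizes across every pipeline. The names, thresholds, and checks are hypothetical illustrations, not any specific vendor's API; the point is that AI-generated and human-written changes face the same automated quality bar, and the author gets the feedback immediately.

```python
# Minimal sketch of a platform-level guardrail gate. All names and thresholds
# are hypothetical illustrations, not a prescribed implementation.
from dataclasses import dataclass


@dataclass
class ChangeReport:
    """Signals the platform already collects for a proposed change."""
    tests_passed: bool
    coverage_pct: float      # test coverage for the changed code
    critical_vulns: int      # critical findings from the security scan
    reviewed_by_human: bool  # AI-generated code still gets a human review


def guardrail_gate(report: ChangeReport) -> tuple[bool, list[str]]:
    """Return (promote?, reasons) so feedback goes straight back to the author."""
    reasons = []
    if not report.tests_passed:
        reasons.append("test suite failed")
    if report.coverage_pct < 80.0:
        reasons.append(f"coverage {report.coverage_pct:.0f}% is below the 80% bar")
    if report.critical_vulns > 0:
        reasons.append(f"{report.critical_vulns} critical vulnerabilities found")
    if not report.reviewed_by_human:
        reasons.append("no human review recorded")
    return (not reasons, reasons)


if __name__ == "__main__":
    ok, why = guardrail_gate(ChangeReport(True, 72.0, 0, True))
    print("promote" if ok else "blocked: " + ", ".join(why))
```

The specific checks will differ in every organization; what matters is that the gate is automated, consistent, and fast enough that developers treat it as part of their inner loop rather than a late-stage surprise.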

The Power of Good Continuous Delivery (CD)

The Harness report delivers a stunning statistic: organizations with moderate automation in their CD processes are more than twice as likely to see a velocity gain from their AI coding tools compared to those with low automation. A robust, automated CD pipeline gives you a tight, reliable feedback loop for that deluge of new code.
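
As a rough illustration of what that feedback loop looks like, here is a minimal sketch of automated post-deployment verification with rollback. The metric names and threshold are hypothetical; a real pipeline would pull them from your observability stack, but the shape is the same: deploy, compare the new version against the baseline, and reverse the change automatically if it degrades.

```python
# Minimal sketch of an automated verification step in a CD pipeline.
# Metric names and the regression threshold are hypothetical illustrations.

def error_rate(errors: int, requests: int) -> float:
    """Fraction of requests that failed."""
    return errors / requests if requests else 0.0


def verify_release(baseline: dict, canary: dict, max_regression: float = 0.02) -> str:
    """Compare the canary against the baseline and decide the next action."""
    base = error_rate(baseline["errors"], baseline["requests"])
    new = error_rate(canary["errors"], canary["requests"])
    if new - base > max_regression:
        return "rollback"  # fast, automatic feedback: the change was detrimental
    return "promote"       # the change held the quality bar; finish the rollout


if __name__ == "__main__":
    decision = verify_release(
        baseline={"errors": 12, "requests": 10_000},
        canary={"errors": 450, "requests": 10_000},
    )
    print(decision)  # "rollback": the canary's error rate regressed past the threshold
```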

Bottom line: both reports are saying the same thing. To solve the problems created by AI at the beginning of the lifecycle, you must invest in the systems that manage the end of it—you must invest in the feedback loop.

Conclusion: From AI-Assisted Coding to an AI-Powered System

If the first chapter of AI in software development was about individual productivity, the next chapter is all about systemic health. The paradox is real: just handing developers AI assistants without upgrading your delivery infrastructure is a recipe for riskier, more chaotic releases. And let's be honest, if stability continues to slip, it won’t take long for the business to tell us to slow down.

The path forward starts with the fundamentals. The Harness report shows a huge jump in success just by moving from low to medium CD maturity. A solid foundation of basic DevOps and automated testing is the first step to handling today's AI-assisted reality.

But we have to look ahead, too. Today, developers use AI chat interfaces and IDE-based assistants. Tomorrow, they might be acting as "first-line managers" for teams of autonomous coding agents. In that future, the sheer volume of change will be unimaginable, and "basic" DevOps won't cut it. The feedback loops will need to be instantaneous and intelligent. We'll need AI woven into the very fabric of DevOps—AI-powered verification, AI-driven testing, and intelligent pipeline orchestration, just to keep our heads above water.

Start building that foundation now. The paradox is a warning, sure, but it's also a massive opportunity to build the resilient, high-performing systems that will define the next era of software development.

Eric Minick

Eric Minick is an internationally recognized expert in software delivery with experience in Continuous Delivery, DevOps, and Agile practices, working as a developer, marketer, and product manager. Eric is the co-author of “AI Native Software Delivery” (O’Reilly) and is cited or acknowledged in the books “Continuous Integration,” “Agile Conversations,” and “Team Topologies.” Today, Eric works on the Harness product management team to bring its solutions to market. Eric joined Harness from CodeLogic, where he was Head of Product.
