The 2026 Waste Management Tech Trends Report Findings
December 29, 2025
In this episode of Waste Nexus, hosts Brian Dolan and Charlie Dolan of DSQ Technology break down the key findings from the 2026 Waste Management Tech Trends Report, offering a candid look at how AI is actually being used inside waste management today.
The report reveals a clear pattern across the industry: while personal use of AI and informal experimentation are widespread, most companies lack the documentation, governance, and operational processes required to turn AI into a true driver of scale and efficiency. Brian and Charlie explore why so much AI adoption stalls at the “cool demo” stage, and what it really takes to move from tinkering to tangible impact.
They also walk through the shift from AI as a toy to AI as a tool, outlining practical steps operators and leaders can take to formalize usage, embed AI into workflows, and align technology adoption with real operational outcomes. Moving past the AI hype cycle, the episode emphasizes discipline, process design, and leadership intent as the foundations for meaningful change in the waste industry.
To dive deeper into the data, insights, and recommendations shaping the future of waste management technology, read the full 2026 Waste Management Tech Trends Report.
Brian: I don’t know if teams necessarily want AI. I think they want their jobs to be easier and to deliver better products for customers. I’m not sure everyone has the preparation, diligence, and experience to deploy it.
Charlie: Isn’t that similar to machine learning? What’s the difference between machine learning and AI? Most of that doesn’t matter to people—they just want the outcome. They want faster human-to-human connection and a better result. When you talk about processes, workflows, and standard operating procedures, it’s not just automation—it’s optimization.
Brian: Right. You want to say, “This is what we do. This is what we do well. Now how do we improve it by having a process?” AI is really an extension of having good process.
[Music]
Charlie: Welcome, everybody. Charlie Dolan here with my older (and better) brother Brian. We’re here to talk about the 2026 Waste Management Tech Trends Report, which is coming out soon. We wanted to run through some highlights: what we’re seeing in the market, how AI is being used within waste and more broadly, and how it’s working for us as a software company.
Without further ado, let’s hit a few high points. The first one: about 75% of respondents say they’re now using AI weekly. That’s a big change from when we did this last year. The tools weren’t as developed then, and people weren’t getting as much real utility out of them.
Almost half—45%—say they have a daily use case. A lot of those are probably still a ChatGPT-style interface, not deeply embedded into workflows, but it’s still an impressive gain from last year. Back then it felt more like a parlor trick—something you tried and thought, “That’s interesting.” And sure, there were a lot of cat memes.
If you think of this as a tech adoption curve, there’s definitely been a maturing since the ChatGPT wave hit in 2023. It still feels a bit slow, though. We’ve seen use cases across different parts of the industry. Are there any you want to highlight, inside or outside waste?
Brian: Using our own example is probably the most applicable, because you and I use AI entirely differently. You’re using it in a more deterministic, code-based environment. I’m using it as a tool to supplement my thinking.
I was working on a commission plan the other day and had it in the system asking: “How do I make this better? What do I want to adjust? What am I missing? What did I forget? Where are the holes? Are there inconsistencies? Did I contradict myself?” That’s very useful. But it’s not doing the work for me—it’s supplementing my brain.
From my perspective, you’re using it more as a multiplier—duplicating or scaling effort.
Charlie: Yeah, I think so. Let’s break that into two parts. First: what can you use AI for daily that’s lower risk—almost like a coworker sitting next to you helping you work through a problem?
That might still be the best use case right now: help me think through ideas, prompt me on what I might have forgotten, edit a paragraph, modify a sentence, or review a commission plan based on patterns it’s seen.
I had an example where there was a typo—something was “1.1” in one place and “1.2” in another. When you read a document ten times, you can easily miss that.
But don’t you think those general, quick tasks are also where people get frustrated with AI? They ask it to do something small and it gives back a generic answer—and people decide it’s not good at what they need.
Brian: In that specific case, I was using it with a very clear purpose. I wasn’t asking it to do math—I was asking for error checking and logical consistency. But I agree with your larger point: there’s a lower level of trust. We see that in the survey results—people aren’t always willing to rely on it.
And part of that distrust is really about how people use it. They look at themselves and say, “I’m not sure how to use this tool well. I don’t know how to get the best output.”
Charlie: Exactly. If we bucket this into the “coworker next to you” use case—delegating something and getting feedback—the reason it doesn’t feel like a huge breakthrough is that the guardrails aren’t great yet. It’s hard to reliably get the outputs you want.
It’s also broad. These models were trained on a lot of generic internet content. So when you ask about a business problem, you might get something high-quality—like Harvard Business Review-level thinking—and you might also get something that sounds like it came from a random chat room. That mix leads to very generic, vanilla outputs.
The way to take control is what we’ve been pushing people to do—and what we’re doing ourselves: document everything you can, then use language models to generate or structure outputs based on your company context.
You can load in meeting recordings, call transcripts, stale documentation from Google Drive or SharePoint, and use that to create a “source of truth” for how your company wants to run. That becomes the guide rails. Then you can tell the model: “Here’s the structured output I want when a customer service complaint comes in.”
There’s a truism: if you only rely on the average of what everyone else is doing, you get the average outcome. If you put in the work—standard operating procedures, documentation, a system that understands your business uniquely—you level up beyond what the internet spits out.
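To make that pattern concrete, here is a minimal sketch of what Charlie describes: load your own documentation as context and ask the model for a structured output when a customer service complaint comes in. The file names, the COMPANY_DOCS list, and the call_llm helper are illustrative placeholders, not part of any specific product; wire call_llm to whichever model provider you use.

```python
import json
from pathlib import Path

# Illustrative only: the company "source of truth" exported from
# Google Drive / SharePoint as plain-text files.
COMPANY_DOCS = ["sop_customer_service.txt", "service_area_rules.txt"]

def load_context(doc_paths):
    """Concatenate internal documentation into one context block."""
    parts = []
    for path in doc_paths:
        p = Path(path)
        if p.exists():
            parts.append(f"## {p.name}\n{p.read_text()}")
    return "\n\n".join(parts)

def call_llm(prompt: str) -> str:
    """Placeholder: connect this to whatever model API your company uses."""
    raise NotImplementedError("wire this to your model provider")

def handle_complaint(complaint_text: str) -> dict:
    """Triage an inbound complaint using the company's own docs as guardrails,
    and ask for a structured (JSON) result instead of free-form prose."""
    prompt = (
        "You are a customer service assistant for a waste hauler.\n"
        "Follow ONLY the documentation below when triaging.\n\n"
        f"{load_context(COMPANY_DOCS)}\n\n"
        "Return JSON with keys: category, priority, route_to, summary.\n\n"
        f"Complaint:\n{complaint_text}"
    )
    return json.loads(call_llm(prompt))
```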
Brian: And the use cases we still see working best are tied to unstructured, front-end customer inputs—phone calls, emails, service requests. That’s where both the waste industry and the broader market are seeing success.
Didn’t we see one example from someone we follow—Hoffman out of St. Louis? He introduced an AI customer service system for inbound call routing.
Charlie: Right, an AI customer service agent that answers after hours (and I think during business hours too). It’s a large HVAC company. They saw a drop-off in calls being answered and delays in response time, meaning how quickly a caller gets connected to a human who can schedule service.
Speed-to-response was critical. They started that project in 2024, and it was a long-term effort to get the AI to follow the rules of their business. HVAC has scheduling complexity: what services do you offer, how do you route to the right team, how do you capture the right details into backend systems, and how do you make the experience good enough that customers don’t view it as friction?
Ultimately the goal is to move from a computer interaction to a human-to-human interaction as quickly as possible, because in HVAC—like “my furnace is down”—there’s still triage that needs a human at some point.
What’s interesting is that it’s a limited use case with a very specific outcome. They achieved it by documenting scenarios and iterating: running dry runs of calls, checking whether the AI routed correctly, and if it didn’t, updating documentation so it would improve on the next one.
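As a rough illustration of that iteration loop (the scenarios, expected routes, and route_call helper below are hypothetical, not taken from the company discussed above), a dry-run harness can replay documented call scenarios and show exactly where the agent misroutes, which is the cue to update the documentation and run it again.

```python
# Hypothetical dry-run harness: replay documented call scenarios and
# check whether the agent routes each one to the expected team.
SCENARIOS = [
    {"call": "My furnace is down and it's 20 degrees out.", "expected": "emergency_service"},
    {"call": "I'd like a quote on a new AC unit.", "expected": "sales"},
    {"call": "When is my maintenance visit scheduled?", "expected": "scheduling"},
]

def route_call(transcript: str) -> str:
    """Placeholder for the AI agent's routing decision."""
    raise NotImplementedError("call your routing agent here")

def run_dry_runs():
    """Print every misroute; each one points at a gap in the documentation."""
    for scenario in SCENARIOS:
        actual = route_call(scenario["call"])
        if actual != scenario["expected"]:
            print(f"MISROUTED: {scenario['call']!r} -> {actual} "
                  f"(expected {scenario['expected']})")
```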
Brian: That ties into two themes we saw in the report: people are trying to get rid of manual data entry—something we’ve been focused on for a decade—and they’re trying to improve processes. It’s not just automation; it’s optimization.
You want to say, “This is what we do well—how do we improve it with process?” AI is an extension of good process, not a replacement for it.
Without good process, the AI gives you generic answers—which isn’t what you want for your company.
Another benefit is using AI for double-checking at scale. It can reread presentations before they’re given, review emails, sift through information humans don’t have capacity to review, and flag potential issues. It’s like scaling the example you mentioned earlier—iterating on documents—but across the whole team to catch inconsistencies and ensure things align to standard process.
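As a sketch of what double-checking at scale could look like in practice (the paths and the call_llm stub are placeholders, and this is not a description of any specific product), a short script can loop over a folder of documents and ask a model to flag anything that contradicts the standard operating procedure or contradicts itself.

```python
from pathlib import Path

def call_llm(prompt: str) -> str:
    """Placeholder: connect this to whatever model API your company uses."""
    raise NotImplementedError("wire this to your model provider")

def review_against_sop(doc_dir: str, sop_path: str) -> dict:
    """Ask the model to flag inconsistencies between each document and the SOP,
    or within the document itself (e.g., '1.1' in one place, '1.2' in another)."""
    sop = Path(sop_path).read_text()
    findings = {}
    for doc in Path(doc_dir).glob("*.txt"):
        prompt = (
            "Standard operating procedure:\n" + sop + "\n\n"
            "Document under review:\n" + doc.read_text() + "\n\n"
            "List any inconsistencies with the SOP or internal contradictions. "
            "Reply 'none found' if the document is clean."
        )
        findings[doc.name] = call_llm(prompt)
    return findings
```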
Charlie: Looking ahead, it seems like teams want AI—it’s not a fad, even though it had a fad-like phase. There was a hype cycle. But I don’t think everyone has the preparation or experience to deploy it.
When I look at customers we’re onboarding, many are coming from older software or manual processes. They’re asking: “How do I employ this? Is it part of what you offer? How does it work for me?” There’s an ongoing gap between what people want and what they can execute on their own. That gap will be filled by third parties—companies like us—who can help deploy it.
Brian: And that comes down to picking partners with APIs—systems where you can access your data in ways AI can actually use. If you’re watching this and you don’t have an API, we welcome you to start building one.
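For a sense of what “data AI can actually use” might look like, here is a minimal, hypothetical sketch: the URL, token, and endpoint below are made up, but the point is that a documented API lets a script (or a model) pull structured records instead of working from screenshots and spreadsheets.

```python
import json
import urllib.request

# Hypothetical vendor API; the URL, token, and endpoint are illustrative only.
API_URL = "https://api.example-hauler-software.com/v1/service-requests?status=open"
API_TOKEN = "YOUR_TOKEN"

def fetch_open_requests():
    """Pull open service requests as structured JSON that downstream
    AI tooling can summarize, prioritize, or cross-check."""
    req = urllib.request.Request(
        API_URL, headers={"Authorization": f"Bearer {API_TOKEN}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```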
I’ll be pedantic about one thing: I don’t know if teams necessarily want “AI.” I think they want their jobs to be easier and they want to deliver better products for customers. AI is a big word—and if you ask ten people what it is, you’ll get ten answers. Some might ask, “Isn’t it just machine learning?” But most of that doesn’t matter. People want faster human-to-human connection and better results.
There’s a learning curve around what’s truly new here—because it is very different than what existed before. But the underlying need is the same: deliver better products and operate better.
Charlie: That’s interesting. I think there’s also an element—like when the iPhone or Blackberry came out—where some people just want the newest, fanciest tech. They want to be seen as cutting edge. We’ve seen that in fundraising too: add “AI” to a homepage and suddenly you can raise more money. So I think both things are true.
Brian: Agreed—both are true.
Charlie: The last point, from where we sit, is that legacy processes and systems are what will be challenged most over the next couple of years.
Brian: Yes. If you’re not documenting things—and you don’t have software, processes, and tools these models can interface with—you’re almost “offline.” You’re going to miss out.
You need structure: guardrails, guideposts, documentation, quality control—so AI can navigate your systems and deliver impact. Otherwise, you risk the worst kind of automation: automating things you don’t want and multiplying problems.
Charlie: If 75% are using AI weekly and 45% daily today, I’d guess those numbers rise next year—80% or higher. Even if daily use is just something small, most products people use will integrate AI in a way that creates ongoing daily benefit.
It might be as simple as voice assistants getting better, or more intelligence embedded in core systems. It’ll be interesting to see how many people are using AI without even realizing it—if companies deliver the seamless experience you described.
Brian: That will be interesting to watch.
Charlie: Okay—take us home.
Brian: Thanks for joining us. This is our second annual Tech Trends report. Look for the 2026 Waste Management Tech Trends Report, and look out for Waste Nexus being announced in January.
Charlie: If it is.
Brian: (laughs) I think we’ll announce it in January.
Charlie: Thanks for joining us—have a good one.

