Where Did the Money Go?
Picture this: your company has just spent tens of millions of baht on a new ERP system with a full suite of AI modules. The vendor completed the implementation, delivered a polished demo, and management applauded — but six months later, employees are still buried in Excel, doing things the same old way.
Sound familiar?
Data from Talyx.ai makes it painfully clear: 90% of enterprise AI projects fail. Not just “underperform” — they fail outright, consuming budget without delivering the value that was promised. Even the projects that technically “succeed” often produce only 10–30% of the expected value.
The uncomfortable question few people ask is: where did the other 70–90% go?
The answer is not in the technology itself. It is not because the wrong vendor was selected. And it is not because the budget was too small. It lies in the “Last Mile” — the final stage where technology must turn into real human action inside the organization.
What Is the “Last Mile” — and Why Is It Where AI + ERP Breaks Down?
In logistics, the term “Last Mile” refers to the final leg of delivery — from the distribution center to the customer’s doorstep. It is usually the most expensive and difficult part, even though it covers the shortest distance.
In AI + ERP, it works the same way. The Last Mile is the point where AI makes recommendations, but people still have to decide whether they trust them enough to act.
In its March 2026 issue, Harvard Business Review put it bluntly: the Last Mile problem is significantly slowing AI transformation — not because AI is bad at thinking, but because organizations have failed to redesign decision-making processes to support AI.
erp.today goes even further: AI failure in ERP is almost never a technology failure. It happens because organizations deploy intelligence into their systems without redesigning how decisions are actually made.
Think about it: you install highly capable AI into your ERP system, but the people using that system still work the old way, rely on the same instincts, and do not trust AI-generated insights. The result? The AI works perfectly well — but no one listens to it.
90% Fail — But Not Because the Technology Is Bad
If you are an executive who recently approved an AI + ERP budget, these numbers may keep you awake at night:
- 95% of AI pilots fail due to governance issues and lack of trust
- 60% of AI projects without proper data readiness will be canceled by 2026
- Only 48% of AI projects actually make it into production — the other half die in the pilot stage
- IBM reports that 53% of executives admit that difficulties integrating AI with legacy systems caused projects to collapse
But more important than the statistics is the real root cause. Across report after report, the conclusion is the same: the issue is not that AI is not smart enough. The issue is that the organization never changed.
McKinsey estimates that AI-ERP integration issues are causing average losses of €500,000 per company in Europe, and the situation in Asia is no better. Organizations buy world-class technology — and get village-level outcomes.
Why does this happen? Because technology can change in a day, but organizational culture takes years. And most companies do not want to invest in something intangible.
A Key Warning Sign: Employees Go Back to Excel
How can you tell if your organization is facing a Last Mile problem? There are several warning signs, and they are often overlooked:
1. Employees go back to spreadsheets — This is the number-one red flag. When teams export data from the ERP system into Excel and redo the analysis themselves, it means they either do not trust the AI’s output or do not know how to interpret it.
2. AI recommendations are overridden as a habit — Not occasionally, but routinely. The procurement team receives an AI recommendation on how much to order, but the team lead insists on using the same numbers “we’ve always used,” ignoring the new data entirely.
3. The system is used only for compliance — Staff enter data into the system because they are required to, but real decisions happen elsewhere: in meeting rooms, in LINE chat groups, or inside a manager’s head.
4. “AI doesn’t understand our business” — This is a dangerous phrase. Sometimes it is true, but most of the time it is an excuse to avoid changing how work gets done. If you hear it often, ask a harder question: “Have we ever actually followed the AI’s recommendation and compared the results?”
5. ERP data no longer reflects reality — Once employees lose faith in the system, they also stop caring about data quality. The result is garbage in, bad recommendations out, and even less trust in the system — a vicious cycle that becomes harder and harder to fix.
What makes this especially dangerous is that these symptoms often do not show up on executive dashboards. The system may still report “Active Users: 500”, but in reality, 400 of them are only logging in to do the bare minimum. They are not using AI in any meaningful decision-making at all.
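If your ERP keeps an event log of AI recommendations, the gap between “active users” and meaningful adoption can be made measurable instead of invisible. The sketch below is a minimal illustration, assuming a hypothetical log where each recommendation records who received it and whether they acted on it; the field names, structure, and metrics are invented for this example, not taken from any real ERP product.

```python
from dataclasses import dataclass

@dataclass
class RecommendationEvent:
    """One AI recommendation shown to a user (hypothetical log format)."""
    user: str       # who received the recommendation
    accepted: bool  # True if they acted on it, False if they overrode it

def adoption_metrics(events: list[RecommendationEvent]) -> dict[str, float]:
    """Two illustrative 'Last Mile' metrics:
    - override_rate: how often AI advice is ignored
    - engaged_user_share: fraction of users who have ever accepted advice
    """
    if not events:
        return {"override_rate": 0.0, "engaged_user_share": 0.0}
    overridden = sum(1 for e in events if not e.accepted)
    all_users = {e.user for e in events}
    engaged = {e.user for e in events if e.accepted}
    return {
        "override_rate": overridden / len(events),
        "engaged_user_share": len(engaged) / len(all_users),
    }

# Toy data: three users, but only one ever follows the AI's advice.
log = [
    RecommendationEvent("a", True),
    RecommendationEvent("a", False),
    RecommendationEvent("b", False),
    RecommendationEvent("c", False),
]
print(adoption_metrics(log))
```

On this toy log, the override rate is 75% and only one user in three has ever accepted a recommendation, even though all three would show up as “active users” on a login-based dashboard. That contrast is the point: tracking acceptance rather than logins surfaces the Last Mile problem the standard dashboard hides.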