AI That Moves the Numbers: Jeff X. Li on KPI-Driven Execution

Artificial intelligence sits high on corporate agendas, yet its business impact often remains unproven. For Jeff X. Li, SVP, Technology AI Strategy and Solutions at Daitrix, that gap reflects a fundamental misunderstanding of AI’s role. “If it does not move revenue, cost, or customer experience, it does not matter,” he says. In his view, AI must earn its place on the scoreboard.

Li’s career across enterprise IT, B2B SaaS, retail, and manufacturing has shaped a disciplined thesis. AI initiatives succeed when they’re anchored to financial outcomes and operational metrics that executives already care about. They fail when they remain detached pilots, admired for their novelty but disconnected from performance.

From Process Discipline to AI Accountability

While Li was leading Lean Six Sigma initiatives at a global coatings manufacturer, every project he worked on was tied to financial validation. Each division had a finance partner assigned to quantify results, and savings were categorized into two buckets: hard savings that directly reduced spend and flowed to the bottom line, and cash flow improvements such as inventory reduction.

“There were only two kinds of savings,” Li says. “Hard savings that go directly into the bottom line, and cash flow, like reducing inventory. We tracked them separately, and leadership looked at those numbers first.”

That discipline created clarity: regional leaders focused their reviews on bottom-line impact. Years later, while supporting an AI solution for hospital operating rooms that automated paperwork through IoT devices and computer vision, Li encountered the opposite dynamic. The solution sold for more than $100,000. Months later, it was still not deployed. “That’s where I realized technology without ownership is theater,” he says. “Somebody needs to own the business outcome. Otherwise the impact is not there.”

The contrast crystallized the belief that AI must be treated with the same financial rigor as any operational initiative.

Why AI Bets Miss the Scoreboard

Organizations often lack alignment over their AI project ideas. While early experimentation once had value as companies learned what AI could do, the landscape has matured. “There has been enough experience around specific use cases that yield real business value,” he says. “The era of experimentation for experimentation’s sake is over.”

The core disconnect is accountability. Projects begin without a clearly identified business sponsor. Use cases are selected for technical intrigue rather than measurable impact. KPIs are defined after the fact, if at all.

Li maintains that proven AI use cases now exist across nearly every function, from pricing optimization to demand forecasting to customer support automation. “Every AI initiative is an investment decision,” he says. “But the odds have dramatically improved. If we link the KPIs to the right use cases and execute well, the scoreboard should look good.”

The 90-Day KPI Alignment Playbook

When brought into a private equity-backed B2B SaaS company with scattered AI pilots, Li begins with clarity and accountability.

First, he identifies a business sponsor for every initiative. “If there is no business sponsor, it is not a priority,” he says. Ownership determines seriousness.

Second, he traces a direct line from each project to a specific KPI owned by that sponsor. The question is straightforward: How does this initiative improve revenue, margin, churn, or working capital? If the connection is vague, the project is paused or redesigned.

Third, he establishes cadence. Li often advocates for an AI Center of Excellence structure that keeps product, data, and business leaders aligned around outcomes rather than model performance. “Focus drives performance. When leadership pays attention, results improve,” he says.

Execution discipline follows classic lean management principles: What was accomplished yesterday? What is planned for today? What are the roadblocks? As a senior leader, Li concentrates on removing obstacles for the highest-impact initiatives rather than micromanaging technical details.

Designing the AI Scorecard

As predictive models and generative tools become embedded in pricing, forecasting, and service operations, some executives question whether entirely new metrics are required. Li’s answer is pragmatic.

Board-level KPIs such as EBITDA, gross margin, and working capital remain unchanged. “AI is just a tool set,” he says. “It is a breakthrough tool set, but it is still a tool set.”

Boards will ask how much of performance improvement is AI-enabled — and how much is not. Leaders must quantify AI’s contribution to established objectives rather than invent parallel scorecards that obscure accountability. AI isn’t an alternative to business fundamentals. It is a lever to accelerate them.

Turning Possibility Into Measurable Value

Li’s approach reflects a broader maturation in enterprise AI adoption. The AI conversation is shifting from curiosity to capability. That transition requires disciplined sponsorship, KPI alignment, and operational rigor.

For enterprise IT and digital transformation leaders, every AI initiative must answer a simple question before it begins: How will this change the numbers that matter?

When AI is aligned with financial accountability and embedded into daily workflows, it stops being a headline and starts becoming infrastructure.
That’s when it starts moving the numbers.
