"We want to use AI. Which model do you recommend? GPT-4? Claude? Or domestic DeepSeek?"
One of the most common opening questions in first client meetings.
The question itself has the order wrong: it sets the first step of enterprise AI zero-to-one at "which tool" rather than "which business problem".
This piece lays out the correct path: inventory workflows → pick scenarios → design solutions → pick tools → pilot → scale. Counterintuitive, but validated across many projects.
1. Why "tool first" fails
The "tool first" mindset:
- GPT-4 is strong — buy Copilot for Business
- Roll it out to employees
- Months later: "how's it going?"
- Usage rate has dropped from 60% to 20%
- Blame "employees don't know how to use it"
Root problem: no business objective. Employees hold a tool without knowing where to apply it — they end up writing weekly reports, translating email, summarizing meetings. All fine, but zero impact on the company's core business metrics.
Six months later leadership asks: "we spent ¥500k — what did we earn?" No answer. Year-two budget cut. Project dies.
The correct mindset, inverted:
- Inventory the company's 10 most resource-intensive workflows
- Identify 1-2 suited to AI redesign (highest ROI)
- Design new "AI + workflow" solutions for those
- Based on solution needs, pick tools and models
- Pilot → validate ROI → scale
Order: workflow → scenario → solution → tool. Not reversed.
2. Step 0: Workflow inventory (the most important step)
This step decides the direction of everything downstream. Skip it and downstream investment is wasted.
Method: list all major business workflows — at least 10. Describe each as a row:
| Workflow | Roles involved | Monthly labor (hr) | Typical pain | Business impact |
|---|---|---|---|---|
| Customer inquiry response | Sales + assistant | 120 | Slow replies, missed inquiries, repeated questions | Low inquiry conversion |
| Quote generation | Sales + engineering | 200 | Slow rate lookup, surcharge errors | Long quote cycle |
| Shipment follow-up | Logistics + sales assistant | 160 | Cross-system lookup, manual reconciliation | High delay rate |
| ... | ... | ... | ... | ... |
This table is company-level, not department-level. First time takes 1-2 weeks (cross-department interviews required), but this is the single highest-value investment in enterprise AI zero-to-one. Once done, you have a baseline — any AI project should target one or more rows in this table.
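The inventory is easier to reuse as a baseline if it is kept as structured data rather than a slide. A minimal sketch in Python — the field names and figures come from the sample table above; the `Workflow` record and the sort-by-labor first cut are illustrative, not a prescribed tool:

```python
from dataclasses import dataclass

@dataclass
class Workflow:
    name: str
    roles: str
    monthly_labor_hours: int
    pain: str
    business_impact: str

# Rows from the sample inventory table
inventory = [
    Workflow("Customer inquiry response", "Sales + assistant", 120,
             "Slow replies, missed inquiries, repeated questions",
             "Low inquiry conversion"),
    Workflow("Quote generation", "Sales + engineering", 200,
             "Slow rate lookup, surcharge errors", "Long quote cycle"),
    Workflow("Shipment follow-up", "Logistics + sales assistant", 160,
             "Cross-system lookup, manual reconciliation", "High delay rate"),
]

# A first cut at prioritization: sort by monthly labor spend
by_labor = sorted(inventory, key=lambda w: w.monthly_labor_hours, reverse=True)
for w in by_labor:
    print(f"{w.name}: {w.monthly_labor_hours} hr/month")
```

With the rows in code, the later ROI ranking is a one-line sort instead of a fresh round of interviews.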
3. Step 1: Identify highest-ROI scenarios
After inventory, sort by ROI. Simple formula:
Scenario ROI ≈ (current labor cost + error cost + opportunity cost) × AI's improvable share
Factors affecting AI's "improvable share":
- Structuredness — can inputs and outputs be precisely described? The more structured the task, the more AI can take over
- Rule stability — are business rules stable? Volatile rules are hard for AI
- Data availability — is required data already in digital systems? Or scattered in email and Word?
- Error tolerance — how costly is an AI error? (Near-zero for medical/finance; higher for marketing)
Example:
| Scenario | Labor cost | Structure | Rule stability | Data availability | Error tolerance | ROI rating |
|---|---|---|---|---|---|---|
| Customer email auto-reply | High | Medium | High | High | Medium | ⭐⭐⭐⭐ |
| Contract clause review | High | Medium | Medium | Medium | Low | ⭐⭐ |
| Product copy generation | Medium | High | High | High | Medium | ⭐⭐⭐⭐⭐ |
| Sales forecast | Medium | Medium | Low | Low | Medium | ⭐⭐ |
Illustrative — exact ratings differ per company. The point isn't the numbers — it's doing the ranking at all.
Ranking principles
- Phase 1 picks 1-2 scenarios — no matter how tempting other options look, going beyond two disperses focus
- Favor high labor + medium-plus error tolerance + already-digital data
- Avoid volatile rules + scattered data + near-zero error tolerance
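The ROI formula and the four factors can be combined into a rough scoring script. Everything numeric here is an assumption for the sketch: the 1-5 factor scales, the equal weighting inside `improvable_share`, and the monthly cost figures are all illustrative and should be calibrated per company — the point, as above, is doing the ranking at all:

```python
# Illustrative scoring of the formula:
#   scenario ROI ≈ (labor + error + opportunity cost) × AI-improvable share
# Factor scores are 1-5; equal weighting is an assumed simplification.

def improvable_share(structure, rule_stability, data_availability, error_tolerance):
    """Map four 1-5 factor scores to a 0-1 'AI-improvable share'."""
    factors = (structure, rule_stability, data_availability, error_tolerance)
    return sum(factors) / (5 * len(factors))

def scenario_roi(monthly_cost, structure, rule_stability,
                 data_availability, error_tolerance):
    return monthly_cost * improvable_share(
        structure, rule_stability, data_availability, error_tolerance)

scenarios = {
    # name: (monthly cost ¥, structure, rules, data, error tolerance)
    "Customer email auto-reply": (80_000, 3, 5, 5, 3),
    "Contract clause review":    (90_000, 3, 3, 3, 1),
    "Product copy generation":   (60_000, 5, 5, 5, 3),
    "Sales forecast":            (40_000, 3, 1, 1, 3),
}

ranked = sorted(scenarios, key=lambda s: scenario_roi(*scenarios[s]), reverse=True)
print(ranked[:2])  # the 1-2 scenarios to take into Phase 1
```

Note how the ranking reproduces the principles above: the two winners have high labor cost, stable rules, and digital data, while sales forecasting sinks on volatile rules and scattered data despite decent labor cost.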
4. Step 2: Design "AI + workflow" solution
After picking scenarios, don't jump to "which tool". First do workflow redesign.
Two questions:
Question 1: What should the new workflow look like?
Take "quote generation":
Original:
Inquiry → sales log → sales look up rate → sales calculate surcharges → manager approval → sales issue quote → send
Post-AI:
Inquiry → AI extract info → AI lookup + calc → AI draft quote → sales review → send
Key change: seven steps become six, and the four that required sales and a manager (lookup, calculation, approval, issuing) collapse into AI doing the work plus a single sales review.
Question 2: Responsibility split across steps?
In the new flow:
- AI: info extraction, lookup, calc, draft generation
- Human: review, edit, final decision, exceptions
- Exception handling: AI's low-confidence cases auto-escalate to manager
- Audit: every AI generation and human edit logged
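The responsibility split above can be sketched as control flow. This is a minimal illustration, not an implementation: the `ai_extract` and `ai_draft_quote` functions are stubs standing in for real model calls, and the 0.8 confidence threshold is an assumed escalation policy:

```python
# Sketch of the redesigned quote flow: AI extracts, looks up, and drafts;
# low-confidence cases escalate; everything is written to an audit log.

from datetime import datetime, timezone

CONFIDENCE_THRESHOLD = 0.8  # assumed policy for auto-escalation
audit_log = []

def log(event, payload):
    # Audit requirement: every AI generation and human edit is recorded
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "event": event,
        "payload": payload,
    })

def ai_extract(inquiry_text):
    # Stub: a real system would call an LLM with a structured output schema
    info = {"product": "widget-A", "qty": 500}
    confidence = 0.9
    log("ai_extract", info)
    return info, confidence

def ai_draft_quote(info):
    # Stub: look up rates, compute surcharges, render the draft quote
    draft = {"product": info["product"], "qty": info["qty"], "total": 12_500}
    log("ai_draft", draft)
    return draft

def handle_inquiry(inquiry_text):
    info, confidence = ai_extract(inquiry_text)
    if confidence < CONFIDENCE_THRESHOLD:
        log("escalate", {"reason": "low confidence"})
        return {"status": "escalated_to_manager"}
    draft = ai_draft_quote(info)
    # Human-in-the-loop: sales reviews and edits before anything is sent
    return {"status": "awaiting_sales_review", "draft": draft}

result = handle_inquiry("Need 500 units of widget-A, delivery to Hamburg")
print(result["status"])
```

The design choice to note: the human review and the escalation path are part of the workflow itself, not an afterthought bolted onto a model call.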
Only with clear new-workflow design can you evaluate what AI capabilities you actually need — which directly determines tool choice.
5. Step 3: Pick tools and models
Once the solution is clear, tool selection isn't hard — you know exactly what capabilities you need.
Typical cases:
Case A: Generic, highly standardized
E.g.: email reply, copy generation, simple QA.
Recommend: SaaS AI products (ChatGPT Enterprise, Claude for Work, Doubao / Zhipu in China), priced per seat.
Why: generic capabilities are mature, cost is low, no customization.
Case B: Needs internal data
E.g.: look up customer receivables, find product specs, generate business reports.
Recommend: enterprise AI gateway (e.g., SiNan) + existing LLM APIs + internal knowledge base.
Why: it requires permission pass-through, internal data connections, and an audit trail — capabilities SaaS tools don't provide.
Case C: Strict compliance (central SOE, finance, energy)
Recommend: private deployment + domestic models (GLM, Qwen, DeepSeek open source).
Why: data must stay in-network, the stack must be domestic, and audit requirements are strict.
Case D: Industrial on-site
Recommend: edge inference devices + specialized models (vision / time-series).
Why: low latency, tolerance for network drops, 24/7 stability — requirements office-grade AI can't meet.
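The four cases reduce to a short decision function. This mirrors the text's triage order (the most constraining requirement wins); the flag names are assumptions for the sketch, and a real selection would weigh more dimensions:

```python
# Cases A-D from the text as a rule-based selector. The ordering matters:
# the hardest constraint (industrial on-site) is checked first.

def recommend_deployment(needs_internal_data=False,
                         strict_compliance=False,
                         industrial_onsite=False):
    if industrial_onsite:        # Case D: factory floor
        return "edge inference + specialized vision/time-series models"
    if strict_compliance:        # Case C: central SOE, finance, energy
        return "private deployment + domestic open-source models"
    if needs_internal_data:      # Case B: internal data and permissions
        return "enterprise AI gateway + LLM APIs + internal knowledge base"
    return "SaaS AI, per-seat"   # Case A: generic, standardized tasks

print(recommend_deployment(needs_internal_data=True))
```

The check order encodes a real constraint hierarchy: a compliant industrial site still needs edge inference, so Case D outranks Case C.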
Don't over-invest in tool selection. An 80%-fit choice is enough to start; the remaining 20% comes from tuning in real operation. Over-researching tools is usually a stall tactic for avoiding the start.
6. Step 4: Pilot
Phase 1 scope must be small, duration short, evaluation strict.
Small: 1-2 scenarios, 1 department piloting (not company-wide)
Short: 2-3 months for full pilot cycle (solution + implementation + 30-day validation)
Strict: pilot conditions must approximate production — no "curated POC"
Pilot answers three questions:
- Technically feasible? (Binary: it works or it doesn't)
- ROI hits target? (Specific numbers vs baseline)
- Org-ready to scale? (Will employees use it? Can management sustain it?)
All three yes → scale. Any no → fix before scaling.
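The three-question gate is worth writing down as an explicit check, so "scale" becomes a recorded decision rather than a feeling. A minimal sketch — the field names and the `roi_vs_baseline` convention (1.0 = on target) are assumptions:

```python
from dataclasses import dataclass

@dataclass
class PilotResult:
    technically_feasible: bool  # binary: it works or it doesn't
    roi_vs_baseline: float      # measured ROI relative to target; 1.0 = on target
    org_ready: bool             # employees use it, management sustains it

def gate(result, roi_target=1.0):
    """All three checks must pass before scaling; any failure means fix first."""
    checks = {
        "feasible": result.technically_feasible,
        "roi": result.roi_vs_baseline >= roi_target,
        "org": result.org_ready,
    }
    return all(checks.values()), checks

ok, checks = gate(PilotResult(True, 1.3, True))
print("scale" if ok else "fix first", checks)
```

Returning the per-question breakdown alongside the verdict matters: a failed gate should say *which* question failed, because that determines what gets fixed before the next attempt.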
7. Step 5: Scale
Scaling has its own order:
1. Vertical before horizontal
- Vertical: same department / scenario, more users (sales 20 → 200)
- Horizontal: cross-department / cross-scenario (sales → CS → marketing)
Vertical first stabilizes ops; horizontal tests org capability.
2. Every new expansion needs its own mini-pilot
Don't directly roll out to new scenarios. Even with identical tech, new scenarios have different business rules, data states, user habits — 2-4 weeks of small-scale validation.
3. Establish "content owner / operator" roles
As argued in "Why enterprise KBs fail", knowledge needs owners — and so does every AI scenario. The broader the scale, the more critical this role becomes.
8. A complete path map
Strung together — from "AI zero" to "AI integrated into business" over 1.5-2 years:
Months 0-2: workflow inventory + free AI maturity audit + pick 1 scenario
Months 2-6: first scenario pilot (full cycle: solution → implementation → 30-day validation)
Months 6-9: scale first scenario to more users; start second scenario
Months 9-15: second scenario pilot + scale; start building enterprise AI governance
Months 15-24: 3-5 core scenarios in stable operation + cross-department collaborative agents in pilot
This is a restrained but realistic rhythm. Slower than the "6-month full AI transformation" pitches you see. But every step has deliverables, validation, and feedback.
9. Closing
Enterprise AI zero-to-one doesn't start with "pick a model" — that's the finish, not the start.
Start with workflow inventory, identify highest-ROI scenarios, design AI + workflow solutions, then pick tools, pilot, scale.
This order isn't a framework I invented — it's validated by the sum of many failed and successful projects. Done right, 2 years lets AI genuinely enter your business. Done wrong, you spend 5 years cycling through "buy tool, no result, swap tool".
If your company is at the zero-to-one starting line, the most valuable first step is a free AI maturity audit — 15-minute questionnaire + half-day on-site. We'll surface the top 3 AI entry points in your workflows and their deployment priority. If we're not a fit, treat the diagnostic as internal reference.
This is piece #12 of the Knowledge Center, and the wrap of the series. If you've read from start to finish, you should now have a systemic view of enterprise AI delivery — from methodology (#1 consulting, #12 zero-to-one), through deployment (#2-#6 services and governance), GEO (#7-#9), to knowledge bases and industrial AI (#10-#11).
Want to talk about your specific case? Reach out.