Making AI Useful: What the Vancouver Survey Says Leaders Actually Need

  • Writer: JR
  • Feb 11
  • 9 min read

Updated: Feb 20

Executive Summary

  • A small Vancouver workshop survey (n=12) shows leaders are clear on outcomes (cost reduction and growth lead), but far less confident about execution and scale.

  • The biggest blocker is not tools. It is people capability: 10 of 12 respondents cite a skills gap as the primary constraint.

  • “Pilot gravity” is real: half report zero AI/automation pilots reaching production in the last 12 months, even while most are experimenting.

  • Governance is immature: no respondent describes strong, consistently enforced AI safety rules; most report either none or partial rules.

  • Ownership exists, but it is uneven: responsibility often sits with a CEO/founder or a department leader, and a meaningful minority describe a working group without a single accountable owner.

  • Industry data suggests the same pattern across sectors: adoption is rising, but value shows up when organizations invest in operating models, data foundations, and measured rollout, not one-off experiments.

  • Practical next steps: assign a single accountable AI owner, pick 1–2 workflows to redesign end-to-end, establish minimal governance, and build role-based training tied to production metrics.


What the Survey Reveals About AI Readiness


Outcomes leaders want


Even in a small sample, the “why” is consistent: leaders want AI to move real numbers. The most common stated outcomes cluster into two buckets:

  1. Efficiency and cost reduction. A plurality of respondents explicitly want AI to reduce costs, eliminate manual work, or improve throughput.

  2. Growth and customer impact. The next cluster aims at revenue growth, better customer experience, and faster decision-making.


This matters because it sets the bar for success. If the goal is cost reduction or growth, AI cannot remain a set of isolated tools used by a few people. It has to become part of how work gets done in core workflows, with measurement attached.


What’s blocking progress


The single dominant blocker is capability and skills. Ten of twelve respondents selected the skills gap as the number one blocker. That finding lines up with what many large-scale studies observe in practice: AI value is constrained less by model availability and more by whether teams can translate opportunities into redesigned workflows, data pipelines, and governance.


Two other blockers appear in the remaining responses: leadership alignment/buy-in and uncertainty about where to start. Those “secondary blockers” often behave like multipliers. When skills are thin, it becomes harder to create credible business cases, which in turn makes buy-in brittle, which then reinforces pilot churn.


The ownership gap and why it matters


Respondents most often place ownership in one of three places:

  • CEO/founder accountability (common in smaller or faster-moving companies)

  • A functional or department leader (operations, IT, or another business unit leader)

  • A working group without a single accountable owner


What is missing is not “involvement.” It is single-threaded accountability: one person responsible for outcomes, resourcing, governance, and production rollout, even if execution is distributed.


This shows up elsewhere in the survey. Most organizations describe themselves as experimenting or operating in silos, and only one respondent indicates AI is already scaled across core workflows. The implication is straightforward: without a clear owner, AI stays as activity. With a clear owner, it becomes an operating capability.


A useful external reality check is firm-level adoption data. Across OECD countries, the share of firms reporting AI use rose to 20.2% in 2025 (up from 14.2% in 2024), showing adoption is accelerating, but still far from universal. Early adoption does not automatically translate to scaled advantage.


Industry Intelligence: How 5 Sectors Are Responding to AI Right Now


The survey included a wide mix of industries. To ground the findings, here are five sector mini-briefs that reflect common patterns relevant to participants.


1) Healthcare and medical services


What’s changing: Healthcare is moving from “AI as innovation” to “AI as capacity.” The pressure points are well known: clinical workforce constraints, administrative burden, and a growing emphasis on outcomes and patient experience.


Where AI is being applied (realistic use cases)

  • Clinical documentation support and administrative automation

  • Revenue cycle optimization and denial management

  • Care navigation, triage support, and patient communications

  • Operations: staffing, throughput, and scheduling optimization


Common pitfalls

  • Deploying tools without workflow integration (adds clicks instead of removing work)

  • Governance gaps: privacy, model risk, and clinical accountability

  • Data fragmentation across systems


Key stats and sources

  • In a McKinsey survey of healthcare organizations, over 70% reported pursuing or implementing generative AI capabilities, with many using partnerships rather than purely in-house builds.

  • Among organizations that had implemented gen AI, 60% reported a positive ROI, and the most frequently cited value came from administrative and operational use cases.

  • Evidence from healthcare settings suggests meaningful productivity potential when deployed in specific tasks and roles, but outcomes depend on implementation design, not just technology selection.


2) Logistics and warehousing


What’s changing: Logistics leaders are under pressure to deliver resilience (disruption response), cost control, and speed. AI is increasingly framed as a decision advantage: better forecasts, tighter inventory, and faster exception handling.


Where AI is being applied

  • Demand forecasting and inventory optimization

  • Routing, slotting, labor planning, and warehouse task orchestration

  • Computer vision for quality checks and damage detection

  • Customer-facing ETAs and exception communications


Common pitfalls

  • “Project-by-project” AI creates disconnected systems that do not scale

  • Data quality and master data issues undermine model reliability

  • Underinvesting in change management for frontline workflows


Key stats and sources

  • Gartner found only 23% of surveyed supply chain organizations had a formal AI strategy (surveyed Dec 2024–Jan 2025), warning that short-term ROI pressure can create fragile, hard-to-scale architectures.

  • PwC’s 2024 operations and supply chain survey highlights broad operating-model change, with 80% reporting changes implemented or planned within 12 months, and a meaningful investment focus on AI/ML in operations technology.

  • Gartner’s 2025 supply chain technology trends emphasize agentic AI and an “augmented connected workforce,” reinforcing that workforce enablement is part of the AI story, not separate from it.


3) Construction and building materials


What’s changing: Construction is adopting AI unevenly, but the direction is clear: schedule risk, cost volatility, and safety pressures are pushing firms toward data-driven delivery.


Where AI is being applied

  • Estimating and bid support; change-order risk detection

  • Schedule forecasting, progress tracking, and productivity analytics

  • Safety monitoring (computer vision), compliance documentation

  • Equipment maintenance and utilization optimization


Common pitfalls

  • Pilots that never connect to day-to-day site routines

  • Lack of skilled personnel and integration challenges

  • Data capture gaps (field data not consistently digitized)


Key stats and sources

  • A RICS report on AI in construction found 45% of respondents reported no AI use, while 34% were in early pilot stages; only a small minority reported embedded use across multiple processes.

  • The same RICS report cites lack of skilled personnel as a leading barrier (alongside integration complexity), echoing what the Vancouver workshop participants reported.

  • Deloitte’s construction digital adoption research in Australia reported an increase in AI/ML use (for example, 37% of surveyed businesses using AI/ML, up from 26% previously), pointing to momentum but still early maturity.


4) Manufacturing and industrial production


What’s changing: Manufacturers are moving from isolated “Industry 4.0” initiatives toward more integrated, productivity-focused programs. The leading edge is not flashy AI demos. It is reliable data, connected equipment, and governed rollout.


Where AI is being applied

  • Predictive maintenance and reliability

  • Quality inspection (vision systems) and defect reduction

  • Production planning and constraint resolution

  • Workforce enablement (digital work instructions, knowledge capture)


Common pitfalls

  • Underestimating cybersecurity and operational risk

  • Treating AI as an IT program rather than an operations transformation

  • Skill gaps in data, controls, and frontline adoption


Key stats and sources

  • Deloitte’s 2025 Smart Manufacturing and Operations Survey reports average improvements of 10–20% in production output and 7–20% in employee productivity among respondents after implementing smart manufacturing initiatives.

  • The same survey reports 29% using AI/ML at the facility or network level and 24% deploying generative AI at that scale, indicating adoption but not yet universal maturity.

  • Rockwell Automation’s State of Smart Manufacturing reporting indicates 95% of manufacturers have invested in or plan to invest in AI/ML (including gen AI) over the next five years, reflecting the strategic shift from experimentation to planned investment.


5) Commercial real estate and property management


What’s changing: Real estate and property management are adopting AI in response to margin pressure, tenant expectations, and the need to operate portfolios more efficiently. The sector is also wrestling with data fragmentation across buildings, vendors, and legacy systems.


Where AI is being applied

  • Lease abstraction and document intelligence

  • Predictive maintenance and energy optimization

  • Tenant service automation and operations workflows

  • Investment analysis, underwriting support, and market intelligence


Common pitfalls

  • Poor data structure and weak privacy controls

  • Over-automating without redesigning the service model

  • Vendor sprawl with inconsistent integration


Key stats and sources

  • Deloitte’s 2025 commercial real estate outlook reports 76% of organizations are still in early stages of adopting AI, while 97% say they are committed to investing in AI-enabled solutions.

  • The same Deloitte research highlights data limitations, noting only 14% report well-structured data with robust privacy policies, a common reason pilots stall.

  • JLL reports broad experimentation, with 92% of real estate organizations piloting or planning to use gen AI, but only a small fraction reporting they have fully met their initial goals, underscoring the execution gap.


What High-Performing Organizations Are Doing Differently


Across industries, the pattern is consistent: high performers treat AI as an operating capability, not a toolset. That shows up in five practical operating principles.

  1. They assign accountable ownership. One owner holds outcomes, sequencing, governance, and resourcing, even when implementation is cross-functional. This directly counters “working group drift,” where everyone is involved but no one is accountable.

  2. They pick workflows, not use cases. Instead of starting with “chatbots” or “summarization,” they start with a measurable workflow (for example: intake-to-resolution, quote-to-cash, schedule-to-delivery). They redesign it end-to-end, then embed AI where it removes friction.

  3. They build minimum viable governance early. Clear rules on data handling, model usage boundaries, review/approval, and auditability prevent later rework. This is especially relevant given the workshop’s widespread reports of weak or non-enforced safe AI rules.

  4. They invest in data readiness as a product. Not “a data project,” but a maintained capability: definitions, quality checks, access controls, and ownership. That is how they avoid models trained on inconsistent, unlabeled, or inaccessible data.

  5. They measure what changed in production. Pilots are judged by operational metrics tied to the workflow: cycle time, cost-to-serve, error rates, throughput, and customer satisfaction. This is how AI moves from promise to operating leverage, and it aligns with supply chain research cautioning against project-by-project “franken-systems.”


Recommendations Informed by the Workshop Data


Below are recommendations tied directly to what participants said they want (outcomes), what blocks them (skills), and what the ownership patterns imply. Given the small sample size (n=12), treat these as directional, not definitive.


Quick wins


  1. Name one accountable AI Owner (single-threaded), even if delivery is shared. If AI ownership is a working group today, appoint a lead with decision rights on priorities, spend, and standards. This reduces diffusion and speeds execution.

  2. Pick one workflow to redesign in 30 days, with a production definition upfront. Many respondents report either zero production pilots or siloed automation. Define “in production” clearly (who uses it, how often, what metric moves), then design backward.

  3. Create a one-page “Safe AI Minimum Standard.” Because most respondents report weak or absent rules, start simple: approved tools, prohibited data types, human review requirements, and escalation paths. Keep it enforceable.

  4. Run role-based training, not generic “AI 101.” Since skills are the top blocker, train by role and workflow: operations leaders, analysts, customer-facing teams, and IT/security. Tie training to the chosen workflow and the metrics.
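A one-page standard is most useful when it can actually be checked. As a minimal sketch (the tool names and data categories below are hypothetical, not survey findings), the approved-tools and prohibited-data rules reduce to a single yes/no test:

```python
# Hypothetical allowlist and prohibited data categories for illustration only.
APPROVED_TOOLS = {"approved_chat_assistant", "internal_summarizer"}
PROHIBITED_DATA = {"pii", "phi", "payment"}

def allowed(tool: str, data_categories: set[str]) -> bool:
    """Permit a use only if the tool is approved and no prohibited data is involved."""
    return tool in APPROVED_TOOLS and not (data_categories & PROHIBITED_DATA)

print(allowed("approved_chat_assistant", {"marketing_copy"}))  # → True
print(allowed("approved_chat_assistant", {"pii"}))             # → False
```

Even if the real standard lives in a policy document rather than code, writing it this concretely forces the clarity (which tools, which data) that makes it enforceable.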


Deeper changes


  1. Build a small “AI delivery spine”: product, data, and change management. A recurring issue in industry research is pilots that do not scale due to integration, data, and people adoption. Establish a repeatable delivery model: backlog, data readiness checks, governance gate, rollout plan, measurement cadence.

  2. Adopt a “portfolio” approach to investments: run, grow, transform. This counters the short-term ROI trap highlighted in supply chain research. Use a three-tier portfolio: quick operational automations (run), cross-functional improvements (grow), and a small number of bigger bets (transform).

  3. Treat data readiness as an operational asset. Participants’ data readiness answers suggest many have “scattered exports” or “raw data that could be labeled.” Convert that into a data inventory: what exists, who owns it, quality level, and what is safe to use.

  4. Move governance closer to operations, not only IT. Manufacturing and construction research both show success depends on operational ownership and workforce enablement, not just technology selection. Align IT, security, and operations around shared standards and rollout.
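The data inventory in point 3 needs no special tooling; a spreadsheet works. As a minimal sketch (the asset names, owners, and quality levels here are hypothetical), each entry only needs four fields to answer "what exists, who owns it, and what is safe to use":

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    name: str
    owner: str          # accountable person or team
    quality: str        # e.g. "raw", "labeled", "curated"
    safe_for_ai: bool   # cleared under the org's data-handling rules

# Hypothetical inventory entries for illustration
inventory = [
    DataAsset("crm_exports", owner="Sales Ops", quality="raw", safe_for_ai=False),
    DataAsset("ticket_history", owner="Support", quality="labeled", safe_for_ai=True),
]

# Only cleared, labeled-or-better assets should feed a pilot
usable = [a.name for a in inventory if a.safe_for_ai and a.quality != "raw"]
print(usable)  # → ['ticket_history']
```

The value is less in the code than in the discipline: every asset gets an owner and a cleared/not-cleared status before any model touches it.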


Measurement and sustainability


  1. Use a simple “pilot scorecard” every two weeks. Track: adoption (active users), cycle time impact, error rate impact, cost impact, and risk incidents. If a pilot is not trending toward production, stop it or redesign it quickly.

  2. Build an internal AI Leader pathway. Most respondents expressed interest in learning more about developing an internal AI leader. Treat that as a capability-building agenda: mandate, competencies, and a progression plan tied to real delivery outcomes.
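The biweekly scorecard in point 1 can be as simple as five numbers and one gate. A minimal sketch, with illustrative field names and a deliberately crude "trending toward production" rule (assumptions, not a prescribed threshold):

```python
from dataclasses import dataclass

@dataclass
class PilotScorecard:
    """Biweekly snapshot of one AI pilot. Field names are illustrative."""
    active_users: int             # adoption
    cycle_time_change_pct: float  # negative = faster
    error_rate_change_pct: float  # negative = fewer errors
    cost_change_pct: float        # negative = cheaper
    risk_incidents: int

    def trending_to_production(self) -> bool:
        """Crude gate: someone actually uses it, no open risk incidents,
        and at least one operational metric is moving the right way."""
        improving = min(self.cycle_time_change_pct,
                        self.error_rate_change_pct,
                        self.cost_change_pct) < 0
        return self.active_users > 0 and self.risk_incidents == 0 and improving

# Example: 14 active users, cycle time down 12%, no incidents
snapshot = PilotScorecard(active_users=14, cycle_time_change_pct=-12.0,
                          error_rate_change_pct=0.0, cost_change_pct=0.0,
                          risk_incidents=0)
print(snapshot.trending_to_production())  # → True
```

Whatever the exact rule, the point is that the stop/continue decision is made against recorded numbers every two weeks, not against enthusiasm.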


Implications for Future Workshops and Initiatives


What resonated


Participant comments consistently point to a desire for practical clarity: how to prioritize, how to govern safely, and how to build internal capability rather than relying on outside vendors for everything. The strong interest in developing an internal AI leader reinforces that the audience is not looking for inspiration. They want an operating approach.


Ready to build internal AI capability?



Revenue growth requires internal leaders who can execute. Will you build that capability?
