
What’s Really Happening? Discovery, Opportunity & Risk with AI

Updated: Oct 28

We’re continuing our exploration of Leadership in the Age of AI.



In our first webinar, we explored ambition, value, and leadership behaviours for an AI-driven world (catch the headlines here). This month, we moved from setting direction to seeing reality: discovering what’s already happening with AI inside our organisations—and how leaders can manage it wisely.


We were joined by AI expert Paul Mills, Director and Co-founder of Mai.


Framing the Journey


Martin opened the session by revisiting the Tillon Approach to AI, using a simple quadrant model to help participants locate where their organisations are today and where to focus next:


  • Raising Awareness & Setting Ambition

  • Discovering What's There

  • Finding What’s Working / Proving Value

  • Providing Plans, Guidance & Enablers


These quadrants continuously inform one another: Discovery fuels ambition; ambition shapes discovery; proven use cases become plans and operating models—and the cycle continues.


This month’s focus was on the bottom-left quadrant: Discovering What’s There.




Discovering What's Already Happening


AI is already in your business—whether you’ve formally introduced it or not. It’s embedded in vendor tools and being used by individuals with “bring your own AI” behaviour.


The first leadership act is to look under the lid and ask:

  • Do I understand where AI is being used—and by whom?

  • Do I understand which tools and data are in play and why?

  • What is official and accompanied by guidance, and what’s happening under the radar?

  • Where are transparency gaps—and where might risk be accumulating?


During the session, a poll revealed that most attendees felt only 'somewhat confident' about how AI is being used in their organisations, while nearly a third admitted they were 'not very confident at all'.


Why Visibility Matters


So why does visibility matter?


  • Risk: Undeclared tools and unknown data flows can create IP, compliance, and reputational risks.

  • Opportunity: Effort is wasted and opportunities are missed when motivated people have the skills and tools but aren’t aligned to business needs.

  • Ambition Fuel: Seeing tangible value from AI helps align teams and leaders on what is possible, keeps the focus on real-life problem solving, and ensures the right resources are allocated to experimentation and scaling.


When polled on “bring your own AI”, webinar participants said their top three concerns were:


  • Sensitive data and IP leakage

  • Compliance and legal exposure

  • Output accuracy and quality 


These results show a clear awareness of the risks.


Practical Ways to Discover What’s There


The panel recommended easy-to-action, human-centred methods for surfacing AI use responsibly:


  • Anonymous pulse surveys and community roundtables (with a clear message “we’re learning, not auditing”).

  • Story invitations: Ask people what they’re already doing with AI, at work and at home, acknowledging openly that it is being used and recognising the innovation behind it.

  • Role-based conversations: Link enquiry to roles. Real insight often sits deeper in the organisation.


Key insight: Discovery is not one-off and should avoid being seen as an audit—it’s a continuous feedback loop. Seeing what’s there refines the ambition you set from the top, and vice versa.


From Discovery to Value


Martin invited us to think about AI value in relation to three essential components:


  • Business Need: Are we applying AI to the right problems?

  • Tools and Data: Do people have access to appropriate tools with usable, secure data?

  • Skills: Do people know how to use AI effectively, both to prompt and execute tasks and to reflect on and evaluate its use?



When one of these elements is missing, value leaks occur:


  • Great skills and tools, wrong need → wasted effort

  • Right need and tools, low skill → guesswork

  • Clear need and skill, no approved tool → shadow tooling


Leadership’s role: bring all three together—where value, safety, and learning overlap.


Managing What You’ve Found


Once you’ve discovered what’s happening with AI, the next step is managing it with a balance of discipline and curiosity.


1. Discipline: Guardrails & Guidance

  • Publish clear principles and escalation routes.

  • Tailor guidance by function and task (regulated vs. creative).

  • Make tool access conditional on lightweight enablement and training.

  • Keep humans in the loop—accuracy is a capability and process issue, not just a model issue.


2. Curiosity: Culture & Enablement

  • Signal permission to experiment safely within guardrails.

  • Start small: Copilot for drafting, structured prompting for synthesis, etc.

  • Be explicit about your posture: how self-regulated vs. centrally managed do you want to be?

  • Expect shifts in roles—more people managing agents and processes rather than performing manual tasks.


Key insight: The secret to sustainable AI adoption is discipline + curiosity. Tighten the rails where risk is high, and widen the lanes where learning and value can compound.


Looking Ahead


Next, we’ll move into proving AI’s value in the flow of work—how to align opportunities with your roadmap, measure outcomes, and decide what’s ready to scale.


Book your place at the 'Leading in the Age of AI' Webinars Now



Ready to Discover What’s Really Happening with AI in Your Business?


If you’re unsure what AI is already in use—or want to turn fragmented experimentation into focused value—it’s time to start with Discovery.


Our AI Discovery Sprint helps leaders uncover hidden risks, identify real opportunities, and align AI activity with strategic intent.



 
 
 