
Creating a Culture of Learning That’s Built to Last

  • Writer: Yingyang Wu
  • Sep 17, 2025
  • 3 min read

Upskilling is a recurring theme in today’s workplace. But with the rise of AI tools, the question is no longer just what employees need to learn, but how we prepare people to learn in environments where the rules are still being written.


This shift is especially relevant because much of the recent focus has been on tool/process-specific training. While useful in the short term, this approach assumes a relatively stable work environment and a predictable shelf life for skills. AI challenges that assumption. What’s being asked of workers today is not simply the ability to use a tool, but to navigate a changing landscape with uncertainty, evaluate ambiguous outputs, and explore the limitations of the systems now embedded in their workflows.

The Limits of Instruction-Based Work

In many industries, AI now performs tasks that once required human effort, such as summarizing documents, writing drafts, generating images, even building dashboards. These capabilities are changing the nature of work from “doing the task” to evaluating the task’s output. This shift raises different learning demands. Memorizing steps or following fixed processes is less useful when the system completes them automatically. What becomes more important is the ability to assess, refine, and contextualize the result.


This reorientation from procedure to judgment requires a different learning culture, one that values questioning over compliance and treats learning as inquiry rather than memorization. The Dreyfus model of skill acquisition offers one way to understand this shift: moving from novice rule-following to expert intuition, flexibility, and context sensitivity. These forms of expertise are difficult to automate.


The Value of Curiosity and Critical Thinking

Curiosity, creativity, and critical thinking are often treated as soft skills. But they are becoming structural to how people function alongside AI. A curious employee might not just accept an AI-generated response but ask why it framed something a certain way. A critical thinker may notice when an AI output appears confident but is misaligned with the actual business need. These aren’t add-ons to technical fluency; they are part of what makes the use of AI productive rather than performative.


Reports from the World Economic Forum consistently rank analytical thinking, creativity, and flexibility among the most critical future skills. This focus is not accidental. These are the very capabilities that enable people to work across uncertainty, generate original solutions, and challenge assumptions. These skills will gain value precisely because AI can’t define the problem space.


This shift isn’t just a practical adjustment. It aligns with what learning science tells us about transfer: the ability to apply knowledge in unfamiliar situations is a hallmark of deep, durable learning. Transferable skills like problem-solving and communication require metacognitive development: awareness of how we think, what we know, and how we adjust when new information emerges.


Rethinking What It Means to Upskill

Upskilling in this context goes beyond developing technical capabilities. It’s about shaping a culture that allows people to explore, evaluate, and grow.


When I led a team of instructional designers, we integrated AI into our design workflows. One junior designer leaned heavily on generative AI to write course content. The result resembled technical documentation: grammatically correct but flat and cognitively disengaging. It lacked the ability to guide a learner through a meaningful experience. We had to start over, beginning not with better prompts, but with a conversation about what good instructional writing is. It reminded me that without foundational capabilities in analysis and judgment, AI magnifies the gap.


This dynamic plays out across functions. A marketer who can’t distinguish insight from noise won’t make better decisions just because AI gives them more content. A product manager who hasn’t developed systems thinking won’t ask better questions simply because AI gives them faster answers.


The issue is not about using AI well. It’s about thinking well in a world where AI exists.


Building Toward a Durable Learning Culture

Companies often ask: “What tools/processes should we train on?” But that question may be too narrow. What’s needed is a broader view of what workplace learning is for. That includes:

  • Encouraging exploration over prescription

  • Creating space for collective learning and reflection

  • Rewarding employees who ask thoughtful questions, not just follow efficient steps

  • Recognizing that foundational skills like reasoning, communication, and conceptual thinking don’t go out of date


In AI ethics, there’s a growing consensus that human oversight is not optional. Context, values, and judgment cannot be outsourced. These are not just risk mitigation concerns. They define the ongoing relevance of human decision-making in technology-mediated environments.


Upskilling is often framed as a response to automation. But in reality, it is an opportunity to ask what we want human work to be. What kind of learning helps us make sense of the systems we’re building? What kind of thinking helps us direct them toward meaningful outcomes? These are questions AI can’t, and shouldn’t, answer for us. They are ours to ask.
