Sparks + Embers Episode No. 008: Being Human Together
This episode explores six research-backed practices for building learning communities that preserve human connection and collective wisdom while thoughtfully integrating AI tools as supportive resources rather than replacements for human relationship. (The Kindling companion article, "Building Learning Communities That Thrive," can be found here.) Through the metaphor of a master craftsperson teaching an apprentice, it demonstrates how authentic learning requires the vulnerable, inefficient, and irreplaceable process of consciousness engaging with consciousness – something that cannot be optimized away without losing what makes us most human.
Episode Transcript
Tiffany: Tyler, I’ve been thinking about your latest piece, “The Long Apprenticeship,” and there’s this moment where you describe your mentor teaching you woodworking. You could have learned the same techniques from YouTube videos or AI tutorials, but you chose the messiness of human instruction. What’s really at stake in that choice?
Tyler: That’s the question that started this whole piece, Tiffany. I stood at that workbench watching my mentor guide first his hands, then mine, through using a hand plane, and I realized we’re making this choice constantly now – convenience versus connection. AI promises to deliver all the benefits of learning without the costs: no scheduling conflicts, no personality clashes, no vulnerability of exposing ignorance to another person.
Tiffany: But there’s something we lose in that trade, right?
Tyler: What we lose is what I call the “thorniness of bumping into others” – and that thorniness isn’t a bug to be fixed. It’s the feature. When my mentor lets me struggle with a plane that catches and tears the grain, when he’s patient enough to let me discover the error through my own hands, that requires something no algorithm can provide: the capacity to care about my development more than his own efficiency.
Tiffany: You write about this as a “lotus flower effect.” Can you explain that?
Tyler: [laughs] Like the lotus flowers in the Odyssey – AI as a sophisticated narcotic. It delivers the dopamine hit of learning and connection while eliminating the challenges that actually develop us. Half a million people got fooled by an AI-generated band called The Velvet Sundown because they’d learned to trust algorithmic curation over their own discernment. The music followed every pattern but lacked something essential – the trace of consciousness engaging with reality.
Tiffany: So we’re not just talking about learning skills. We’re talking about learning to be human.
Tyler: Exactly. Human learning is fundamentally social – knowledge gets constructed through conversation and negotiation, not just information transfer. When I teach someone else, I have to organize my thoughts, present them clearly, identify my knowledge gaps. That vulnerability, that risk of being wrong in front of another conscious being, builds something AI can’t replicate.
Tiffany: You outline six practices for what you call “learning to be human together.” Which one do you think people struggle with most?
Tyler: Practice Five – using AI as a tool rather than a replacement. People don’t realize they’re making this choice. We drift toward AI-mediated interaction that feels like community but lacks the essential elements of genuine relationship. When my daughter needs to understand something difficult, I can ask AI to explain it perfectly tailored to her comprehension level. But if I always take that shortcut, we both lose the practice of wrestling with ideas together, of sitting with confusion, of finding our way through not-knowing.
Tiffany: There’s this line in your piece that really stuck with me: “Learning is about becoming more fully human together, not just more knowledgeable individually.” What does that look like practically?
Tyler: It means choosing the inefficient path sometimes. When my writing group meets, we could use AI to synthesize our feedback and deliver optimized suggestions. Instead, we sit in a circle and stumble through trying to articulate what isn’t working, what resonates, why a particular passage feels off. That stumbling – that’s where the magic happens. We’re not just improving the writing; we’re developing our capacity to think together, to hold tension, to build on each other’s insights.
Tiffany: But you’re not anti-AI. You’re using it as we speak.
Tyler: Right. The question isn’t whether to use these tools but how to maintain what I call “cognitive sovereignty” – keeping final authority over our intellectual development while benefiting from technological augmentation. I use AI to help research, to generate hypotheses, to challenge my assumptions. But I edit ruthlessly, I verify independently, and I preserve space for the messy, inefficient work of developing my own thoughts.
Tiffany: You mention something called “the convenience trap.” How do we know when we’ve fallen into it?
Tyler: When we stop being able to tolerate the inconvenience of coordinating with others, waiting for slower learners, working through interpersonal challenges. The diagnostic question I use: Can I think well when my usual AI tools aren’t available? If the answer is no, I’ve outsourced too much. The goal isn’t efficiency – it’s developing the collaborative capacity we need for navigating real-world problems that require collective intelligence.
Tiffany: There’s an urgency in this piece that feels different from your other work. Why now?
Tyler: Because we’re at a choice point that won’t stay open forever. Each time we choose AI convenience over human connection, we’re training ourselves for a certain kind of future – one where we can process information but can’t create meaning together. The master craftsperson and apprentice relationship represents something precious: consciousness transmission that happens through relationship, patience, shared struggle. Once we optimize that away, getting it back requires a reckoning we’re not prepared for.
Tiffany: So what’s the call to action here? What should people do after listening to this?
Tyler: Start small. Next time you’re tempted to ask AI for a quick answer, pause and ask: What would I learn if I wrestled with this question first? Find one place in your life where you can choose the thorny path of human collaboration over the smooth path of algorithmic convenience. Join something messy – a book club, a maker space, a community garden – where you have to coordinate schedules and navigate personality differences and argue about approaches.
Tiffany: And if people want to dive deeper into these ideas?
Tyler: The full article walks through all six practices with specific examples and research backing. But more than that, it’s an invitation to examine your own learning ecology. Are you becoming more capable of independent thought, or more dependent on external systems? The choice shapes not just what we learn but who we become. And that choice? We make it new every day.
Tiffany: The piece is called “The Long Apprenticeship: Learning to Be Human Together,” and it’s available now.
Tyler: Here’s to choosing relationship over efficiency, connection over convenience – one decision at a time.
