Saturday, 27 December 2025

From "Ums" to Outcomes: How a Wireless Mic & AI Changed My Teaching Workflow

I haven't written a blog post in a while. Usually, when I do write here, I link it to personal growth, general things I’ve been up to, or things that have interested me. But today, I want to talk about something specific that happened just before we broke up for Christmas—a shift in how I work that has been an absolute game-changer.

I want to talk about contextualised Speech-to-Structured Text AI.

The "Admin Beast" and the Friction Point

If you work in Further Education (FE) or SEN (Special Educational Needs), you know that tracking meaningful progress is vital. But the administrative burden of typing detailed feedback is a massive friction point.
For me, this is personal. As a professional formally diagnosed with dyslexia, getting my observations out of my head and onto a screen is a battle. I know the value is in what I see in the classroom, but typing it out is exhausting. And let’s be honest—if you just type "Well done," it doesn't really work. Feedback needs to be bespoke, detailed, and mapped to specific outcomes.
So, I decided to try something different.

The Experiment: A Mic and a Prompt

I purchased a set of wireless lavalier microphones and developed a specific workflow using AI.

It’s actually quite simple: At the end of a session, instead of sitting at a keyboard, I just speak. I record a big block of raw speech—"ums," "errs," and all. I then run that raw transcript through a custom AI prompt I designed specifically for UKFE and Skills compliance.
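
For the technically curious, the overall shape is something like the sketch below. I have used an OpenAI-style speech-to-text call and a chat model purely as stand-ins here; the specific services, model names and file names are placeholders, not a description of my exact setup.

    # A minimal sketch of the dictate-then-structure workflow (Python).
    # Everything named here is a stand-in for whichever tools you prefer.
    from openai import OpenAI

    client = OpenAI()  # assumes an API key is already configured

    # Step 1: turn the raw end-of-session recording into text, "ums" and all
    with open("session_recording.m4a", "rb") as audio:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio,
        )

    # Step 2: run the raw transcript through the custom compliance prompt
    # (a sketch of the prompt itself follows the list below)
    with open("compliance_prompt.txt") as f:
        system_prompt = f.read()

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": transcript.text},
        ],
    )

    structured_updates = response.choices[0].message.content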

Here is what the system does for me (a rough sketch of the prompt logic follows the list):
 * Personalisation at Scale: It takes that one big audio file and splits it into structured, personalised written updates for every learner on the roster.

 * EHCP Alignment: It automatically scans my spoken feedback against an attached list of EHCP learning outcomes and Core 4 targets. It maps the feedback to the specific goal, identifying exactly where the evidence was met.

 * Zero "Hallucinations": I built in strict rules. The AI is forbidden from guessing. If I didn't mention a learner in the recording, it flags "No feedback given" rather than inventing progress.

 * Accessibility: It formats the output into clear British English at Level 1 readability, making it perfect for sharing directly with learners and families.
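
To make those rules concrete, here is a rough sketch of how the prompt and a simple safety check could look. The learner names, outcomes and function are invented for illustration; the real prompt is tied to specific EHCP outcomes and Core 4 targets and sits alongside the transcription step above.

    # Illustrative only: the roster and outcomes are made up for this sketch
    ROSTER = ["Learner A", "Learner B", "Learner C"]
    EHCP_OUTCOMES = {
        "Learner A": ["Starts a task with a single verbal prompt"],
        "Learner B": ["Uses a visual schedule to move between activities"],
        "Learner C": ["Communicates a choice using an AAC device"],
    }

    SYSTEM_PROMPT = f"""
    You write learner progress updates for a UK further education setting.
    Rules:
    1. Produce one written update for each learner on this roster: {', '.join(ROSTER)}.
    2. Map each piece of feedback to the attached EHCP outcomes and Core 4 targets,
       stating which outcome the evidence relates to.
    3. Never infer or invent progress. If a learner is not mentioned in the
       transcript, output exactly: "No feedback given".
    4. Write in clear British English at Level 1 readability.
    Attached outcomes: {EHCP_OUTCOMES}
    """

    def check_coverage(updates: dict, roster: list) -> dict:
        """Belt-and-braces check after the AI responds: every learner gets an
        entry, even if it is only the 'No feedback given' flag."""
        return {name: updates.get(name, "No feedback given") for name in roster}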

The Result

The impact has been massive. I can now deliver meaningful, evidence-backed feedback and upload it directly to Evidence for Learning (EFL) with minimal typing.

It also solved a problem I didn't realise we had. When staff are overseeing an EHCP review, they need high-quality data. Because of this workflow, the quality of the progress updates is so much better, meaning the review team is equipped with really good information about how that learner is progressing.

From Invisible Work to National Recognition with Natspec

I’m really looking forward to seeing how this evolves. In fact, I received some incredible news just before the break.

I have been invited by Natspec (The National Association of Specialist Colleges) to showcase this workflow during their upcoming Peer Exchange Week. I’ll be contributing on the topic of using AI to increase productivity in the sector.

To be honest, working where I work—often offsite—means that effectively nobody gets a 'seat' to see me in action. It’s easy to feel a bit invisible. That is why this invitation is such a big deal. It is fantastic for the college to be represented in a national project, and personally, it is incredibly validating to have my work recognised on this platform.

Keeping it Real (The Caveats)

I do want to acknowledge that it’s not always perfect. There are a few caveats. We all have different accents and dialects, and that can affect how accurately names and words come through in the transcript. Also, if you have two people in your class with the same first name, the AI can get confused.

Even though this is a productivity game-changer, human oversight is still essential. You can't just leave it on autopilot. It keeps the "human" in the loop but removes the admin barrier.

I’d highly recommend exploring speech-to-text workflows if you want to let technology handle the sorting while you handle the teaching. I'm excited to share more with Natspec soon!

Tags: #EdTech #Dyslexia #AIinEducation #UKFE #Natspec #Productivity #PersonalGrowth

The Data Doesn't Lie: Why I’m Finally Fixing My Sleep Debt

I haven't written a blog post in a while. To be honest, I haven’t had the energy.

Lately, my sleep has been absolutely atrocious. I feel constantly tired—actually, "tired" doesn't quite cover it. I feel re-tired, drained, and perpetually foggy. For a long time, I just assumed this was the new normal of modern life.

Then I got a Samsung Watch. I started tracking my sleep, thinking I’d find out I had insomnia or restless legs. But the data does not lie, and it told a very different, much clearer story.

It turns out, I’m not just "tired." I am chronically, clinically sleep-deprived.


The Reality Check

I used to think that because I could fall asleep instantly, I was a "good sleeper." My watch data quickly debunked that myth. Here is the breakdown of what is actually happening to my brain:

 * The Massive Deficit: I am averaging about 5 hours and 2 minutes of sleep per night. My biological need is closer to 7.5 hours. That means I am missing 2.5 hours of recovery every single night.

 * The Maths: Over a week, that adds up to a deficit of more than 17 hours. That is physically the equivalent of pulling two full "all-nighters" every single week (the sum is laid out below). No wonder I feel like I'm running on fumes.
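
For anyone who likes to see the working, here it is in full, using the figures above:

    # The sums behind the bullet points (all figures from the watch data)
    actual_sleep = 5 + 2 / 60        # 5 h 02 min average per night
    needed_sleep = 7.5               # rough biological need in hours

    nightly_deficit = needed_sleep - actual_sleep           # about 2.5 h
    weekly_deficit = nightly_deficit * 7                    # about 17.3 h
    missed_nights_per_week = weekly_deficit / needed_sleep  # about 2.3 full nights

    print(f"Nightly deficit: {nightly_deficit:.1f} h")
    print(f"Weekly deficit:  {weekly_deficit:.1f} h")
    print(f"Full nights of sleep lost per week: {missed_nights_per_week:.1f}")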

The "3-Minute" Warning

The most alarming statistic wasn't the total hours—it was how fast I fall asleep.
According to my data, it takes me on average 3 minutes to fall asleep.

I used to wear this as a badge of honour. I thought it meant I was efficient. In reality, falling asleep in under five minutes is a clinical sign of pathological sleepiness. A well-rested human takes 15 to 20 minutes to drift off. My body is essentially "crashing" the moment I stop moving because it is so starved for rest.

It’s Not Insomnia, It’s Neglect

Here is the silver lining (and the frustration): My sleep efficiency is 86%. My Deep Sleep and REM cycles are actually normal relative to the time I'm in bed.

This means my internal machinery works perfectly fine. I don't have insomnia. I don't struggle to stay asleep. The problem is entirely self-inflicted. I am simply not staying in bed long enough to let my brain finish its wash cycle. By cutting my sleep at the 5-hour mark, I’m waking up right before the heavy REM stages occur in the early morning—cutting off my mental recovery at the knees.

The Mission for 2026

I am realising that I cannot optimise my professional life or my physical health while running a 17-hour weekly sleep deficit. The inconsistency—my bedtime shifting from midnight to 2:00 am—is causing "social jetlag," confusing my circadian rhythm even further.

So, as we approach 2026, my mission is simple: Respect the debt.

I’m not going to try complex bio-hacks. I don't need supplements. I need time.

 * Anchor the Bedtime: Stop the 2:00 am drift.

 * The 15-Minute Rule: I’m going to start going to bed 15 minutes earlier every few days until I hit that 7.5-hour mark (a rough timeline is sketched below).

 * Prioritise Recovery: Treat sleep not as a luxury or a waste of time, but as the foundation for everything else I want to achieve.
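
For a sense of scale, here is the timeline, assuming I move bedtime by one 15-minute step every three days (the exact pace of "every few days" is my guess, not a rule):

    # Rough timeline for the 15-minute rule; the pace is an assumption
    import math

    current_sleep_min = 5 * 60 + 2     # 5 h 02 min per night now
    target_sleep_min = int(7.5 * 60)   # 7 h 30 min goal
    step_min = 15                      # move bedtime 15 minutes at a time
    days_per_step = 3                  # assumed "every few days" pace

    gap_min = target_sleep_min - current_sleep_min   # 148 minutes to claw back
    steps = math.ceil(gap_min / step_min)            # 10 steps
    print(f"{steps} steps of {step_min} minutes: roughly {steps * days_per_step} days to reach 7.5 hours")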

The data showed me I was crashing. Now, I’m going to use the data to make sure I finally recharge.