3. Fireflies Tutorial
Note: The video covers material not in the guide below — please watch in full.
Action Step
Complete this before moving on.
Make sure the Fireflies Chrome extension is installed, then go to meet.google.com and create an instant meeting. Talk for about thirty seconds — say your name, today's date, and a sentence about what you are testing. End the meeting, then go to your Fireflies dashboard and verify the transcript appeared. Take a screenshot of the test meeting in Fireflies with the transcript view open and submit it.
Training Guide
You've been building kickoff prep docs and decks — all from transcripts that were handed to you. Somebody recorded those calls and turned them into text you could feed to Claude.
That somebody is now you. And the tool is Fireflies.
You already saw Fireflies mentioned in the APIs training — it's one of the services you can pull data from. But before you can pull a transcript via API, you need a transcript. And before you have a transcript, you need Fireflies actually recording your calls.
This is a setup training. Short and practical.
(Here's what Fireflies actually does and why it matters for everything that comes after)
What Fireflies Does
Fireflies joins your calls — Google Meet, Zoom, Teams — and records them. After the call ends, it generates a full transcript. That transcript is the raw material for everything you've been doing: extracting pain points, building prep docs, processing action items.
For AI processing, audio is all you need to capture — the transcript it produces is what matters: who said what, in what order. The AI doesn't watch video. It reads text.
For human consumption — like sharing a recording with a teammate who missed the call — you want video too. But for your AI workflows, the transcript is the thing.
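To make "the transcript is the thing" concrete, here's a minimal sketch of flattening transcript sentences into plain "Speaker: line" text you could paste into a Claude prompt. The `speaker_name`/`text` field names are illustrative assumptions — check the actual shape of your Fireflies export before relying on them.

```python
# Sketch: flatten a Fireflies-style transcript into plain text for an AI prompt.
# The field names (speaker_name, text) are assumptions — verify them against
# your actual Fireflies export or API response.

def transcript_to_text(sentences: list[dict]) -> str:
    """Render [{'speaker_name': ..., 'text': ...}, ...] as 'Speaker: line' text."""
    return "\n".join(f"{s['speaker_name']}: {s['text']}" for s in sentences)

# Hypothetical sample matching a solo test call
sample = [
    {"speaker_name": "Jordan", "text": "It's March 3rd and I'm testing Fireflies."},
    {"speaker_name": "Jordan", "text": "This is a solo test call."},
]
print(transcript_to_text(sample))
```

The output preserves exactly what the AI needs — who said what, in what order — with none of the video overhead.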
The Fireflies Chrome extension captures audio from your browser. It's lightweight, it works with Google Meet directly, and it gives you the transcript.
(Let's get it installed)
Step 1: Install the Chrome Extension
- Open Chrome
- Go to the Chrome Web Store and search for Fireflies
- Find the official Fireflies.ai extension and click Add to Chrome
- The Fireflies icon appears in your extensions bar (top right of Chrome)
- Click the icon and sign in — use your LeanScale Google account
If it asks you to grant microphone permissions, say yes — it needs to hear the call to transcribe it.
(Now let's make sure it actually works)
Step 2: Create a Test Google Meet
- Go to meet.google.com
- Click New meeting → Start an instant meeting
- You're now in a Google Meet by yourself — you just need audio
Look for the Fireflies icon in your browser or in the meeting interface. Click it to start recording. Some setups auto-join — if Fireflies is already capturing, you'll see a notification or a small indicator.
Talk for thirty seconds. Say your name, today's date, and a sentence or two about what you're testing. This gives Fireflies actual audio to transcribe — silence won't give it anything to work with.
End the meeting. Fireflies processes the recording — this takes a few minutes.
Step 3: Verify the Capture
Go to app.fireflies.ai and log in.
Your test meeting appears in the dashboard. Click into it. You're looking for:
- The transcript — your spoken words, converted to text. It won't be perfect, but it'll be recognizable.
- The recording — the audio capture from the call.
- Speaker labels — Fireflies identifies who's talking. In a solo test, it'll just be you.
If the transcript shows your test audio converted to text, Fireflies is working.
If nothing shows up, check three things: the extension is enabled, you granted microphone permissions, and you gave it a few minutes to process. Transcription isn't instant.
(One more thing before you move on)
Audio vs Video — When Each Matters
Most of your AI workflows only need the transcript. When you pull a call from the Fireflies API and feed it to Claude, you're sending text. The AI doesn't watch a video. It reads the words.
When does video matter? When a human needs to review the call — onboarding someone to a client, showing a teammate how a conversation went, referencing a screen share. For those, record via Zoom or Google Meet's built-in recording, not just the Chrome extension.
For day-to-day AI work — processing transcripts, extracting action items, building prep docs — the Chrome extension and the transcript it produces are all you need.
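As a preview of the API side mentioned above, here's a hedged sketch of requesting recent transcripts from the Fireflies GraphQL API. The endpoint URL, query shape, and field names reflect the Fireflies API as commonly documented, but treat them as assumptions and verify against the official API reference before building on this.

```python
# Sketch: pull recent transcripts from the Fireflies GraphQL API.
# Endpoint, query fields, and auth scheme are assumptions — confirm them
# against the Fireflies API docs for your account.
import json
import urllib.request

FIREFLIES_API_URL = "https://api.fireflies.ai/graphql"  # assumed GraphQL endpoint

def build_transcript_query(limit: int = 5) -> dict:
    """Build a GraphQL payload asking for the most recent transcripts."""
    query = """
    query RecentTranscripts($limit: Int) {
      transcripts(limit: $limit) {
        title
        date
        sentences { speaker_name text }
      }
    }
    """
    return {"query": query, "variables": {"limit": limit}}

def fetch_transcripts(api_key: str, limit: int = 5) -> dict:
    """POST the query with a Bearer token and return the parsed JSON response."""
    payload = json.dumps(build_transcript_query(limit)).encode()
    req = urllib.request.Request(
        FIREFLIES_API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

You'd call `fetch_transcripts(your_api_key)` after a call has finished processing — the JSON that comes back is the text you feed to Claude.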
Submission
Take a screenshot showing your test meeting in Fireflies — the transcript view with your test audio captured and converted to text. Submit the screenshot.
What You Just Did
Every call from here on out can be transcribed automatically — raw material for prep docs, action items, client recaps, and task lists.
(You've got the recorder. Next, you're going to process what it captures — pull a transcript, extract the work, and push it where it needs to go)
Comment in Slack
Post your answer in your onboarding channel.
What were your biggest takeaways from this training?