Desktop App: Recording & Analyzing Meetings

Record any meeting — Zoom, Google Meet, Teams, or in-person — and get evidence, signals, and a full transcript automatically, right as the conversation happens.

Install & Sign In

The UpSight desktop app runs on macOS and Windows. It sits in your menu bar / system tray and automatically detects when you join a meeting.

1. Download the app

Go to getupsight.com/download and install the app for your OS.

2. Sign in with your UpSight account

Click the UpSight icon in your menu bar / tray and choose Sign In. Your browser will open to complete authentication — same account as the web app.

3. Grant microphone access

On first launch, macOS or Windows will ask for microphone permission. Allow it — the app needs mic access to record your side of the conversation.

4. Join a meeting — the panel appears automatically

When the app detects a Zoom, Google Meet, or Teams call, the floating panel slides in. You can also open it manually from the tray icon.

macOS vs. Windows

On macOS, the app captures system audio directly — all participants are recorded automatically with no extra setup. On Windows, use the cloud bot instead: it joins your meeting as a participant and records every speaker cleanly. See the Cloud Bot section below.

The Floating Panel

The floating panel overlays your screen during a meeting. It has five tabs — each giving you a different live view into the conversation.

Guidance (default)

Shows the research questions you set up for this project, so you never lose track of what you're trying to learn. Questions check off as they're covered. Configure them in the web app under project settings.

Notes

Type a quick note any time during the call — it's saved and linked to this meeting. Use the input at the bottom of the panel. Toggle between Note and Task mode with the icon button or ⌘⇧T.

Tasks

AI-suggested follow-up actions appear here as the conversation progresses. Review and accept them individually or hit Accept All. Accepted tasks sync to the web app.

Signals

Real-time evidence cards — goals, pain points, context, and workflow moments extracted from the live transcript as they happen. Each card is saved to your project automatically.

Transcript

Speaker-attributed live transcript, word by word. Useful for double-checking a quote or catching something you missed. The full transcript is also available in the web app after the meeting ends.

Panel modes

Compact — default size, shows tabs and input bar
Expanded — taller view, more content visible at once
Minimized — collapses to a small indicator; pulses red while recording

Starting & Stopping a Recording

1. Open your meeting and the panel

The panel opens automatically when a meeting is detected. If it doesn't appear, click the UpSight icon in your menu bar / tray.

2. Click the Record button

Press the red circle button at the top of the panel. It pulses amber while connecting, then turns solid red with a running timer once recording is confirmed. The Signals and Transcript tabs will start populating within seconds.

3. Conduct your meeting normally

The panel runs in the background. Glance at the Signals tab to see evidence cards appearing in real time. Use the input bar at the bottom to jot notes or create tasks without breaking your flow.

4. Stop the recording when done

Click the Record button again to stop. The app finalizes the transcript, uploads the audio, and triggers full AI processing. You'll see the completed interview in the web app within a few minutes.

Tip: minimize the panel to stay focused

Once recording is running you can minimize the panel — it shrinks to a small indicator that pulses red so you know it's active. Click it to bring the panel back.

Cloud Bot (Windows)

Windows users should use the cloud bot instead of local mic recording. It joins your Zoom or Google Meet call as a named participant ("UpSight") and records every speaker cleanly. macOS users don't need this — system audio capture handles all participants automatically.

1. Enable cloud bot in the panel header

The cloud bot toggle appears in the panel header once a compatible meeting is detected. Click it to invite the bot into your call.

2. Admit the bot from your meeting app

You'll see a join request from "UpSight" — admit it like any other participant. Once admitted, the bot records all audio and the panel shows its active recording state in red.

3. Bot leaves automatically when you end the meeting

When the meeting ends, the bot exits and the recording is finalized just like a local recording.

Heads up: participants see the bot

The cloud bot joins as a visible participant. It's good practice to let attendees know the meeting is being recorded — and in many regions, recording consent is a legal requirement.

After the Meeting: What Happens Next

Once you stop recording, UpSight processes the conversation in the background. You don't need to do anything — open the web app a few minutes later and it's all there.

What gets created automatically:

Full transcript — speaker-attributed, timestamped, searchable
Evidence — AI-extracted quotes tagged as goals, pain points, context, workflow moments
Themes — patterns that link this conversation to others in your project
People records — participants are identified and added to your contacts
Any notes and tasks you captured during the call

Where to find it in the web app:

Open your project and go to Conversations — the recording appears at the top of the list
Click the conversation to see the full transcript, evidence cards, and lens analysis
Click any evidence quote to see it in context — play the audio clip at that exact timestamp
Go to Themes to see how this conversation connects to patterns across your project

Typical processing time: 3–8 minutes

A 60-minute conversation usually finishes processing in 5–8 minutes. Shorter calls are faster. If you don't see the conversation in the web app after 10 minutes, refresh the page.

Tips for Better Results
Set up your project questions before the call

The Guidance tab shows your project's research questions during the call. If you haven't set them up yet, do it in the web app under Project Settings before your first meeting.

Use a headset or separate mic for cleaner audio

Better audio means better transcription, which means better evidence extraction. Built-in laptop mics work, but a headset noticeably improves accuracy.

Say speaker names when you introduce participants

If you say "Sarah, can you tell us more about..." at the start, the AI can better attribute quotes to the right people. Speaker names you say explicitly are captured and used in the evidence cards.

Jot a note for anything you want to follow up on

Use the panel's note input during the call — it takes seconds and ensures you don't lose the thought. Notes become searchable evidence attached to this conversation.

Windows: use the cloud bot instead of mic recording

On Windows, the cloud bot is the recommended way to record remote meetings — it captures every speaker cleanly. macOS captures system audio directly, so no bot is needed.