Survey Branching & AI Modes
Route respondents to relevant questions using skip logic, and control how much freedom the AI interviewer has to adapt the conversation.
Skip logic lets you create rules that route respondents to different questions based on their answers. Rules are configured per-question in the survey editor.
Condition Operators
- equals / not_equals: exact match on the answer value
- contains / not_contains: answer includes a substring
- selected / not_selected: option was chosen (multi-select)
- answered / not_answered: question was answered at all
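The operator pairs above can be sketched as a single evaluation function. This is an illustrative sketch, not the product's actual implementation; the function name and answer representation (multi-select answers as lists, unanswered questions as None) are assumptions.

```python
def evaluate(operator: str, answer, target=None) -> bool:
    """Evaluate one branching condition (illustrative names)."""
    if answer is None:
        # An unanswered question only satisfies not_answered.
        return operator == "not_answered"
    checks = {
        "equals": answer == target,
        "not_equals": answer != target,
        "contains": target in str(answer),
        "not_contains": target not in str(answer),
        # Multi-select answers are modeled here as lists of chosen options.
        "selected": isinstance(answer, list) and target in answer,
        "not_selected": isinstance(answer, list) and target not in answer,
        "answered": True,
        "not_answered": False,
    }
    return checks[operator]
```

For example, `evaluate("selected", ["Email", "Chat"], "Chat")` is true because the option was chosen, while `evaluate("answered", None)` is false.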
Actions
AND / OR Logic
Combine multiple conditions with AND (all must match) or OR (any can match). For example: "If role is Founder AND company size is 1-10, skip to solo founder section."
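The founder example above can be sketched as a rule evaluator. All names here are hypothetical; for brevity each condition uses an equals check against a dictionary of answers.

```python
def rule_matches(conditions: list, logic: str, answers: dict) -> bool:
    """AND requires every condition to match; OR requires at least one."""
    results = [answers.get(c["question"]) == c["value"] for c in conditions]
    return all(results) if logic == "AND" else any(results)

# "If role is Founder AND company size is 1-10, skip to solo founder section."
conditions = [
    {"question": "role", "value": "Founder"},
    {"question": "company_size", "value": "1-10"},
]
answers = {"role": "Founder", "company_size": "1-10"}
rule_matches(conditions, "AND", answers)  # both conditions match, so the rule fires
```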
Natural language rules
You can type rules in plain English like "if they haven't purchased, skip to awareness questions" and the AI will parse it into structured conditions.
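As a rough illustration, a parsed rule might come back as a structure like the one below. The exact schema, field names, and the mapping of "haven't purchased" to a specific condition are assumptions for the sake of example.

```python
# Hypothetical structured form of the rule
# "if they haven't purchased, skip to awareness questions".
parsed_rule = {
    "conditions": [
        # Assumes a "purchased" question answered yes/no.
        {"question": "purchased", "operator": "equals", "value": "No"},
    ],
    "logic": "AND",
    "action": {"type": "skip_to", "target": "awareness_section"},
}
```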
In Chat or Voice mode, you control how much freedom the AI interviewer has to adapt. Choose the mode that matches your research goals.
Strict
Default. Asks each question exactly as written, in order. No follow-ups, no improvisation. Every respondent gets the identical experience.
Best for: standardized surveys, compliance-sensitive research, large sample sizes
Moderate
Questions in order, but the AI may ask one brief follow-up if an answer is particularly interesting. Skips clearly irrelevant questions when respondent context is known.
Best for: customer feedback, satisfaction surveys, general research
Adaptive
Pro. The AI uses your project's research goals, the respondent's CRM profile, and past interactions to guide the conversation. It probes deeper on important topics, reorders questions for flow, and asks natural follow-ups.
Best for: discovery research, user interviews, exploratory conversations
- References the respondent's job title, company, and segment
- Uses project research goals to prioritize questions
- Can reference past interviews with the same person
Today, branching rules can only reference answers given during the current survey. Person-attribute branching extends this to reference CRM data imported before the survey starts.
What This Enables
Available Person Attributes
How It Works
1. Survey start: the respondent is matched by email to an imported person record
2. Attributes loaded: title, segment, seniority, and other fields from the person record are loaded into a branching context
3. Rules evaluated: branching conditions can reference either question responses or person attributes
4. In-session updates: when a survey answer maps to a person field (e.g., a "role" question), the attribute updates mid-survey so later rules see the fresh value
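The branching-context steps above can be sketched as a small class. This is a sketch under assumed names (the class, its methods, and the `maps_to` parameter are all illustrative), showing attributes loaded at survey start, in-session updates, and lookups that fall through from answers to person attributes.

```python
class BranchingContext:
    """Illustrative branching context: person attributes plus survey answers."""

    def __init__(self, person_attributes: dict):
        self.attributes = dict(person_attributes)  # loaded at survey start
        self.answers = {}

    def record_answer(self, question_id: str, value, maps_to: str = None):
        self.answers[question_id] = value
        if maps_to:
            # A survey answer that maps to a person field (e.g. a "role"
            # question) updates the attribute mid-survey, so later rules
            # see the fresh value.
            self.attributes[maps_to] = value

    def lookup(self, ref: str):
        # Rules may reference either question responses or person attributes.
        if ref in self.answers:
            return self.answers[ref]
        return self.attributes.get(ref)

ctx = BranchingContext({"title": "Engineer", "segment": "SMB"})
ctx.record_answer("q_role", "Founder", maps_to="title")
ctx.lookup("title")  # returns the mid-survey update, "Founder"
```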
This feature is being developed on the feat/person-attribute-branching branch and will work in both Form and Chat modes.
- Import your contacts to enable profile-aware personalization and attribute branching
- Send surveys with pre-filled email for seamless identity matching
- Research workflow covering the complete process from planning to analysis