AI Survey Builder

I led the redesign of Pollfish’s AI Survey Builder, evolving it from a one-time generator into an intelligent research collaborator. My work focused on redefining the AI experience across the entire survey creation flow: making AI feel like a true partner rather than a tool.

I was responsible for UX strategy, interaction design, and visual identity, collaborating closely with product, data, and engineering teams.

AI survey generation

AI editing & refinement

2022 - Present

AI Survey Builder interface preview

First approach, 2022

In 2022, we introduced AI survey generation in two places: on the homepage to attract users, and in the My Surveys page, where they could describe their research goal and have AI build a complete questionnaire. The AI would generate questions based on their input, show a preview for the user to accept or decline, and create the survey.

Early AI survey generation entry on Pollfish homepage

2022 My Surveys AI builder workflow

Pain points of this approach

We knew this approach had limitations, but we deliberately kept it simple. We wanted to ship quickly, test user adoption, and validate whether AI-powered survey creation had real value before investing in a more complex solution.

  • There was no ongoing AI support after creation.
  • There was no collaboration with AI once the survey was built.
  • Users had to start completely over if they changed direction.
  • It was only a one-time generator.

New approach, 2024

The opportunity

Revisiting the AI Builder in 2024 gave us the opportunity to evolve it from a one-time generator into a continuous research advisor, supporting users from the first question to final refinement and enabling advanced research methods without requiring specialized expertise.

My role

I led early cross-functional workshops with business, product, engineering, and data to define the role AI should play in Pollfish.

Together, we:

  • Identified core user pain points and operational constraints.
  • Aligned on where AI could create genuine researcher value.
  • Translated insights into clear design principles that shaped the product direction.

Workshop outcomes that shaped AI Builder's constraints, user needs, and product direction.

Workshop board used to define constraints, pain points, and opportunities

From principles to execution

Once the strategic direction was defined, I drove its execution by:

  • Driving iterative exploration of the AI direction.
  • Building cross-functional alignment through recurring stakeholder reviews.
  • Partnering with engineering to ensure technical feasibility.
  • Translating early insights into tangible experience concepts.
  • Working closely with the data team to validate AI capabilities within strict time constraints.
  • Facilitating fast feedback loops to accelerate decision-making.

Visual Exploration

Exploration

I led the end-to-end design direction for the AI Builder, from early problem framing through execution and refinement.

I translated workshop outcomes into clear design principles, guiding exploration across interaction models, visual identity, and system behavior. I drove rapid iteration cycles, presenting work regularly to stakeholders, incorporating feedback, and refining solutions as constraints became clearer.

AI button visual exploration states
AI iconography visual exploration states

Align intent with constraints

Throughout the process, I worked closely with engineering and the data team to ensure feasibility, validate AI capabilities, and align design intent with response-time and technical constraints.

The interface itself went through multiple iterations. I explored different color palettes (green, magenta, dark, gradients), various layouts, and different ways to structure the welcome experience. Each iteration was presented to product, engineering, and business stakeholders, gathering feedback that shaped the next round of designs.

The back-and-forth was constant, but it ensured we were building something that worked for users and aligned with business goals.

Evolution of AI Builder panel concepts

Visual identity & branding

Through this process, the design evolved from a simple chat panel to a task-oriented system with the "What do you want to do today?" framework. Working with the support team's insights about common survey types, we shifted from general topic categories to specific use case cards.

I designed all the iconography and visual elements to create a cohesive AI brand within Pollfish. The purple/magenta color palette distinguishes AI features while complementing the main Pollfish blue. Each of the eight icons (for survey types and AI actions) follows the same visual language: simple, recognizable, and clearly connected to the AI brand.

I even designed a subtle animation for the gradient header bar, creating specs for the development team that showed how it should move smoothly rather than remain static. These details matter: they make AI feel premium and intentional, not tacked on.

Refined AI Builder interface with branded actions and capability cards

Positioning AI as a tool, not a chatbot

First decisions

My first decision was placement. A floating bottom-right button felt like a support chatbot and conflicted with our existing help chat. I needed AI to feel like a creation tool, not an assistance widget.

I explored placing "Ask AI" next to "Add question." It worked well in the empty state and remains available at the bottom of the survey - close to where users actively build.

However, as surveys grow longer, that placement alone isn't sufficient. To ensure consistent access, I also introduced a persistent entry point in the top-right corner alongside preview and versioning controls.

This multi-entry approach keeps AI contextual during creation, while also making it globally accessible - reinforcing that it's a core tool, not an afterthought.

Exploring CTA placements

Exploring different AI call-to-action placements in the questionnaire builder

Final placement in an active questionnaire

AI placement in an active questionnaire with existing questions

Final placement in an empty questionnaire

AI placement in an empty questionnaire state

Making AI feel guided, not overwhelming

Using the insights from our support team about common survey types, I designed specific starting points: "Create a survey," "Brand feedback," "Product feedback," "The right pricing," "Conjoint analysis," and "Max Diff analysis."

This gave users a clear place to begin while still allowing free-form requests through the text input below. The "What do you want to do today?" framing made AI feel helpful rather than intimidating.

AI Builder panel with guided survey creation options

Adapting to context

The AI interface changes based on whether you're starting fresh or editing an existing survey. In an empty state, it offers creation tasks. When a survey already exists, different options appear: "Translate survey," "Set tone," "Rephrase."

This contextual awareness makes AI feel like a true collaborator that understands where you are in the process.

AI Builder adapting options to an existing questionnaire context

Progressive disclosure

I wanted to give users control without overwhelming them. When someone selects an option like "Translate survey," a follow-up appears with language choices rather than showing everything at once.

Revealing options step by step keeps the interface focused while still giving users control over each decision.

Progressive disclosure states for AI actions

Question & answer level assistance

Beyond the main AI Builder interface, I designed AI assistance at the micro level for every individual question and answer. Users can click an AI icon next to any question or answer to access options like "Generate answers," "Rephrase," "Set tone," or "Translate."

Question and answer level AI assistance options

Business & personal impact

Results

The redesigned AI Builder has seen strong adoption, with the majority of Pollfish users now using it, especially first-time users who rely on AI to create their research from scratch. Support tickets related to question quality have decreased, and direct communication around survey structure has been significantly reduced.

More importantly, AI has become part of the natural workflow. Users don't just generate a survey and move on; they collaborate with AI throughout the process, from initial creation to final refinements.

What I learnt

This project reinforced that designing AI features isn't about showcasing technical capability; it's about defining where AI adds real value. Even the most powerful tool is ineffective if users don't know where to start or how to use it.

Working under constraints (tight timelines, no formal user research, and multiple stakeholder needs) taught me to be resourceful. Collaborating closely with the support team to understand user patterns proved just as valuable as formal research, and in many cases allowed for faster iteration.

What I would change

If I were to approach this again, I would push harder for early user testing, even through informal sessions. While internal feedback helped shape a strong product, direct input from users earlier in the process would have accelerated our learning.