Guides and Relevant Use Cases for Filtering Research Results
Filtering your research study results in Crowd allows you to focus on specific data subsets, making analysis more
efficient and insights more actionable. This guide provides detailed instructions on using the Filter feature, located
on the results page, to refine your data by Duration, Country, Date, Device, Browser, and Recruitment Source.
Additionally, it offers practical use cases to demonstrate how filters can uncover meaningful patterns and trends.
Accessing the Filter Feature
On the results page of your research study, locate the "Filter" button at the top-right corner. Clicking this button
opens a dropdown menu listing the available filters: Duration, Country, Date, Device, Browser, and Recruitment Source.
Each filter comes with specific attributes to customize your data view.
Understanding Filter Types and Attributes
Below is a breakdown of each filter, its attributes, and examples to illustrate their application.
Duration
The Duration filter allows you to analyze results based on the time participants spent on the study.
- Attributes: More than, Exactly, Less than, In range
- Units: Seconds (sec), Minutes (min), Hours (hour)
- Action: Select an attribute, specify a value, choose a unit, and click "Apply".
- Examples:
- More than 5 min: View responses from participants who took longer than 5 minutes, potentially indicating
thorough engagement or difficulty.
- Exactly 30 sec: Analyze responses completed in exactly 30 seconds, useful for quick impression tests.
- Less than 2 min: Identify rushed responses that may lack depth.
- In range 1 min to 3 min: Focus on responses within a typical engagement window, excluding outliers.
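The four Duration attributes reduce to simple numeric comparisons on elapsed time. As a rough illustration only (hypothetical function and field names, not Crowd's internal API), a sketch in Python:

```python
# Hypothetical sketch of the Duration filter's comparison logic.
# Durations are normalized to seconds before comparing.

UNIT_SECONDS = {"sec": 1, "min": 60, "hour": 3600}

def matches_duration(duration_sec, attribute, value, unit="min", upper=None):
    """Return True if a response's duration satisfies the filter.

    attribute: "more_than", "exactly", "less_than", or "in_range"
    (for "in_range", `value` is the lower bound and `upper` the upper bound).
    """
    threshold = value * UNIT_SECONDS[unit]
    if attribute == "more_than":
        return duration_sec > threshold
    if attribute == "exactly":
        return duration_sec == threshold
    if attribute == "less_than":
        return duration_sec < threshold
    if attribute == "in_range":
        return threshold <= duration_sec <= upper * UNIT_SECONDS[unit]
    raise ValueError(f"unknown attribute: {attribute}")

# "More than 5 min" keeps a 6-minute response (360 sec) but not a 4-minute one:
# matches_duration(360, "more_than", 5)  -> True
# matches_duration(240, "more_than", 5)  -> False
```

Note that "Exactly" is a strict equality on the stored duration, which is why the troubleshooting section below recommends "In range" when an exact match returns no results.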
Country
The Country filter helps segment data by geographic location.
- Attributes: Is, Is not, Starts with, Ends with, Contains, Does not contain
- Action: Select an attribute, enter a value in the text field, and click "Apply".
- Examples:
- Is United States: Include only responses from the United States to assess regional feedback.
- Is not Canada: Exclude Canadian responses to focus on other regions.
- Starts with Aus: Capture responses from countries like Australia or Austria.
- Ends with land: Include countries like Finland or Thailand.
- Contains rico: Target responses from Puerto Rico or Costa Rica.
- Does not contain Asia: Exclude countries with "Asia" in their name, such as Malaysia.
Date
The Date filter enables analysis based on the recency of responses.
- Attributes: More than, Exactly, Less than, In range
- Action: Select an attribute, enter the number of days ago (e.g., 5), and click "Apply".
- Examples:
- More than 7 days ago: Review older responses to compare with recent feedback.
- Exactly 3 days ago: Focus on responses submitted exactly three days ago for a specific campaign analysis.
- Less than 2 days ago: Analyze the most recent responses for real-time insights.
- In range 1 to 5 days ago: Examine responses from the past week, excluding older data.
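The Date filter compares how many whole days ago a response was submitted against the number you enter. A sketch of that logic (hypothetical function names, assuming whole-day granularity):

```python
# Hypothetical sketch of the Date filter: compare a response's age in
# whole days against the filter value.
from datetime import date

def matches_date(submitted, attribute, days, upper=None, today=None):
    age = ((today or date.today()) - submitted).days
    if attribute == "more_than":
        return age > days
    if attribute == "exactly":
        return age == days
    if attribute == "less_than":
        return age < days
    if attribute == "in_range":
        return days <= age <= upper
    raise ValueError(f"unknown attribute: {attribute}")

# With today fixed at 2024-06-10, "Less than 2 days ago" keeps
# yesterday's responses:
# matches_date(date(2024, 6, 9), "less_than", 2, today=date(2024, 6, 10))
```

Because ages are counted in whole days, a response from earlier today has age 0 and is caught by "Less than 1 day ago" rather than "Exactly 1 day ago".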
Device
The Device filter segments data by the type of device used by participants.
- Attributes: Desktop, Smartphone, Tablet
- Action: Select a device type and click "Apply".
- Examples:
- Desktop: Assess responses from desktop users to evaluate desktop-specific usability.
- Smartphone: Focus on mobile user feedback for a mobile-first design study.
- Tablet: Analyze tablet responses to identify device-specific trends.
Browser
The Browser filter allows segmentation based on the browser used by participants.
- Attributes: Is, Is not, Starts with, Ends with, Contains, Does not contain
- Action: Select an attribute, enter a value in the text field, and click "Apply".
- Examples:
- Is Chrome: Include responses from Chrome users to check browser compatibility.
- Is not Safari: Exclude Safari users to focus on other browser performance.
- Starts with Fire: Capture responses from Firefox users.
- Ends with Edge: Include Microsoft Edge users.
- Contains fox: Target Firefox users specifically.
- Does not contain IE: Exclude Internet Explorer responses to focus on modern browsers.
Recruitment Source
The Recruitment Source filter segments data by how participants were recruited.
- Attributes: Website, Email, URL, Participant Pool
- Action: Select a recruitment source and click "Apply".
- Examples:
- Website: Analyze responses from participants recruited via a web prompt to evaluate its effectiveness.
- Email: Focus on responses from email invitations to assess campaign reach.
- URL: Review responses from participants who joined via a shareable link.
- Participant Pool: Examine responses from Crowd’s participant pool for broader demographic insights.
Combining Multiple Filters
1. To streamline your data further, you can apply multiple filters simultaneously. For example:
- Combine Country Is United States and Device Smartphone to analyze mobile responses from U.S. participants.
- Use Duration Less than 1 min and Recruitment Source Email to identify quick responses from email-invited
participants, possibly indicating rushed feedback.
2. After selecting your filters, click "Apply" to update the results. To remove a filter, click the "X" next to it
in the filter bar.
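Conceptually, combined filters act as a logical AND: a response appears in the results only if it satisfies every active filter. A short sketch with hypothetical sample data (not Crowd's data model):

```python
# Hypothetical sketch of combining filters: keep a response only if it
# passes every active filter (logical AND).

responses = [
    {"country": "United States", "device": "Smartphone", "duration_sec": 45},
    {"country": "United States", "device": "Desktop",    "duration_sec": 200},
    {"country": "Canada",        "device": "Smartphone", "duration_sec": 50},
]

active_filters = [
    lambda r: r["country"] == "United States",  # Country Is United States
    lambda r: r["device"] == "Smartphone",      # Device Smartphone
]

filtered = [r for r in responses if all(f(r) for f in active_filters)]
# Only the first response satisfies both filters.
```

This AND behavior is also why stacking many filters can shrink the result set quickly, as noted under "Monitor Data Volume" below.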
Practical Use Cases for Filtering
Filters enable targeted analysis to address specific business and organizational goals. Below are representative use
cases with sample data, each highlighting its objective, findings, and resulting action.
- Enhancing E-Commerce Mobile Experience
- Objective: A retail business aims to improve mobile checkout usability to boost conversion rates.
- Filters Applied: Device Smartphone, Country Is United States
- Data: 300 responses, 120 (40%) from U.S. smartphones, completion rate 65%, average duration 3.2 minutes, 25%
reported checkout issues.
- Insight: U.S. smartphone users frequently cited small button sizes as a barrier, with 25% abandoning due to
usability, impacting potential sales.
- Action: Increase button size by 20% and retest with the same filter to measure improvement.
- Optimizing a Global Marketing Campaign
- Objective: A multinational corporation seeks to assess the effectiveness of a recent ad campaign across regions.
- Filters Applied: Recruitment Source Email, Date In range 1 to 7 days ago, Country Contains Europe
- Data: 500 responses, 200 (40%) from email invites in Europe over 7 days, completion rate 88%, 60% rated the ad
positively.
- Insight: High engagement in Europe suggests successful targeting, but 40% negative feedback highlighted unclear
messaging in France.
- Action: Refine messaging for France and re-survey with Country Is France.
- Improving Software Compatibility
- Objective: A software company aims to ensure cross-browser compatibility for a new release.
- Filters Applied: Browser Is Firefox, Duration More than 4 min
- Data: 180 responses, 45 (25%) from Firefox with durations over 4 minutes, completion rate 60%, 35% reported lag.
- Insight: Firefox users experienced significant lag, reducing completion rates and indicating a compatibility
issue.
- Action: Optimize for Firefox and retest with the same filter.
- Tailoring Educational Content by Region
- Objective: An e-learning platform wants to customize content for Asian markets based on device preferences.
- Filters Applied: Country Contains Asia, Device Smartphone
- Data: 250 responses, 100 (40%) from Asian smartphones, completion rate 80%, 70% preferred video content.
- Insight: High smartphone usage and video preference in Asia suggest a mobile-first video strategy.
- Action: Develop mobile-optimized video courses and re-evaluate with the same filter.
- Evaluating Corporate Training Engagement
- Objective: A corporate HR department seeks to improve training program retention.
- Filters Applied: Recruitment Source Participant Pool, Duration In range 10 to 20 min
- Data: 400 responses, 150 (37.5%) from participant pool within 10-20 minutes, completion rate 92%, 15% dropped at
module 3.
- Insight: Strong engagement overall, but a 15% drop at module 3 indicates content fatigue or complexity.
- Action: Simplify module 3 and retest with the same duration filter.
- Diagnosing Browser-Specific Delays
- Filters Applied: Browser Is Chrome, Duration More than 5 min
- Data: 120 responses, 50 (42%) from Chrome with durations over 5 minutes, completion rate 70%.
- Insight: Chrome users experienced delays, with 30% citing loading issues, pointing to potential compatibility
problems.
- Action: Optimize for Chrome and retest.
Best Practices for Effective Filtering
- Start Broad, Then Narrow: Begin with a single filter (e.g., Country) to understand general trends, then add more
filters (e.g., Device) for specificity.
- Use Relevant Combinations: Pair filters that align with your goals, such as Device and Browser for technical
analysis.
- Monitor Data Volume: Ensure your filters don’t exclude so much data that too few responses remain for meaningful
analysis.
- Reset When Needed: If results are too narrow, remove filters and reapply with broader criteria.
- Document Insights: Note which filters yield the most actionable insights for future studies.
Troubleshooting Common Issues
- No Results After Filtering: Check whether your filters are too restrictive (e.g., Duration Exactly 10 sec); broaden
the criteria, for example by switching to In range.
- Unexpected Data: Verify each filter’s attribute and value (e.g., Country Is not Canada still includes every
non-Canadian response); recheck the settings and reapply.
- Slow Performance: Reduce the number of filters if the dataset is large, as multiple filters may slow down
processing.
Related Resources
AI Chat - Chat with Your Data: Gain deep insights into your test responses by using the AI Chat feature. This tool
allows you to interactively break down, analyze, and explain your collected data across the workspace, providing a
comprehensive understanding of participant responses.
For support, contact our team via in-app chat or email support@crowdapp.io.