
Guides

This category lists step-by-step guides on how to use, access, and understand the different features on Crowd.
By Crowd team
10 articles

Conditional logic

Key Points:
- What is conditional logic?
- Steps on how to use conditional logic on Crowd
- A question-and-answer example for conditional logic
- Some FAQs on conditional logic

What Is Conditional Logic?

Conditional logic is a feature that lets you create questions and answer options that appear or disappear based on how users answered previous questions. This way, you can create personalized and relevant tests for your users.

Steps on How to Use Conditional Logic on Crowd

To use conditional logic in Crowd:
1. Create a test, then click on "Add a block".
2. Add at least three questions to it.
3. Click on the "Add Conditional Logic" button under the question you want to add logic to.
4. Choose the conditions under which the question or answer option should be displayed or hidden: click on "Show this question if", select a previously set question, then select the answer that should trigger this question.
5. You can click on "GO TO" to skip a question that you do not want answered.

Tip: The GO TO feature can only be used for the first or second question.

Question-and-Answer Example for Conditional Logic

Here is a detailed question-and-answer example for conditional logic.

Question 1: What type of phone do you have?
Answer options: iPhone, Android, Other

Question 2: If iPhone, what model do you have?
Answer options: iPhone 11, iPhone X, iPhone 8, Other
Conditional logic: Show this question only if the answer to Question 1 is "iPhone".

Question 3: If Android, what brand is your phone?
Answer options: Samsung, Google, OnePlus, Other
Conditional logic: Show this question only if the answer to Question 1 is "Android".

Question 4: How satisfied are you with your phone's camera?
Answer options: Very satisfied, Somewhat satisfied, Neutral, Somewhat dissatisfied, Very dissatisfied
Conditional logic: Show this question only if the answer to Question 2 is "iPhone X".

Question 5: Which feature of your phone do you use the most?
Answer options: Camera, Social media, Gaming, Other
Conditional logic: Show this question only if the answer to Question 1 is "Android" and the answer to Question 3 is "Samsung".

Some FAQs on Conditional Logic

- What kinds of conditions can I use in conditional logic?
  You can set conditional logic to show questions based on the answers/options selected in previous questions.

- When can I use conditional logic?
  Say you are creating a survey to gather feedback on a new product. You first ask respondents to select their age group: 18-24, 25-34, 35-44, or 45 and above. Depending on the answer selected, you can use conditional logic to show different follow-up questions. For example, if the respondent selects the 18-24 age group, you might show a question about their preferred social media platform; if they select the 45 and above age group, you might instead show a question about their preferred method of communication with friends and family. By tailoring the survey questions to the respondent's age group, you make the survey more engaging and relevant to their interests.

- Can I preview how my conditional logic will work before publishing my test?
  Yes! Crowd has a preview mode that allows you to test your conditional logic before publishing your test. This way, you can make sure your logic works correctly and provides a good user experience.

- Are there any limitations to using conditional logic in Crowd?
  While conditional logic is a powerful feature, there are some limitations to keep in mind. You can only use conditional logic on certain question types, and you cannot use it to create loops or complex branching paths. Additionally, complex logic can sometimes slow down the performance of your test.

- Can I use conditional logic with short-text or long-text answers?
  Yes, you can show or hide follow-up questions based on specific keywords or phrases in short-text or long-text answers.

- Can I use conditional logic with multiple-choice or checkbox answers?
  Yes, you can show or hide follow-up questions based on the selected answer(s) in multiple-choice or checkbox questions.

- Can I use conditional logic with linear scale answers?
  Yes, you can show or hide follow-up questions based on the selected value(s) on a linear scale.

- Can I use conditional logic with yes/no questions?
  Yes, you can show or hide follow-up questions based on the selected answer (yes or no) in a yes/no question.

- Are there any limitations to using conditional logic with different answer types?
  While conditional logic can be used with most answer types in Crowd, there may be limitations based on the complexity of the logic or the number of follow-up questions. It's always best to test your conditional logic thoroughly to ensure it works as intended.

- Do I need a minimum number of questions to use conditional logic?
  Yes, you must have at least two questions in order to set conditional logic.

- Is there a limit to the number of questions I can apply conditional logic to?
  No, you can apply conditional logic to as many questions as you like within a test or survey.

- Can I use conditional logic on just one question?
  Yes, you can apply conditional logic to just one question if desired.

- Can I set multiple conditions on a single question?
  Yes, you can set multiple conditions on a single question, allowing for more complex logic to be applied.
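The show/hide rules from the phone-survey example above can be sketched as data plus a small visibility check. This is an illustrative model only; the rule shape, question IDs, and function names are assumptions, not Crowd's actual API.

```javascript
// Illustrative sketch of the phone-survey logic above. Each entry lists the
// conditions under which a question is shown; a question with no entry is
// always shown. Names and shapes are hypothetical, not Crowd's API.
const rules = {
  q2: [{ question: "q1", equals: "iPhone" }],
  q3: [{ question: "q1", equals: "Android" }],
  q4: [{ question: "q2", equals: "iPhone X" }],
  // Question 5 has multiple conditions: Q1 = Android AND Q3 = Samsung.
  q5: [
    { question: "q1", equals: "Android" },
    { question: "q3", equals: "Samsung" },
  ],
};

function isVisible(questionId, answers) {
  const conditions = rules[questionId];
  if (!conditions) return true; // unconditioned questions always show
  return conditions.every((c) => answers[c.question] === c.equals);
}
```

For a respondent who answered "Android" to Question 1 and "Samsung" to Question 3, `isVisible` hides Question 2 and shows Question 5, matching the behavior described above.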

Last updated on Jun 13, 2024

How to set up a moderated session

Click on "Check out interview guides" to get tips that will guide you through the process of using the moderated session feature. Crowd allows you to schedule an instant session or a session for later.

To Get Started on an Instant Moderated Session
1. Navigate to the moderated session dashboard and click on "Schedule new session".
2. Select the "Start an instant session" button to continue.
3. Fill in the necessary information: the session name and the participant name. You can add more participant names by ticking the "This is a group session" box. Then click on "Create" to continue.
4. Once the session has been created, copy the link and share it with the participants and observers.
5. Click on "Join as moderator" to join the session.

To Schedule a Session for Later
1. Click on "Schedule new session".
2. Select the "Schedule a session for later" button to continue.
3. Fill in the necessary details: the session name, the participant name, the date, and the time. You can add more participants by ticking the "This is a group session" box and entering their names and email addresses. Then click on "Create" to continue.
4. You can send the custom emails directly to the participants' email addresses, or copy the invitation and send it to participants manually.
5. To copy the link for the participants and observers, navigate back to your live session dashboard and click on "View details" on the session you scheduled for later.

Last updated on Jun 13, 2024

Steps on how to create an unmoderated test

Key Points:
- Steps to create an unmoderated test
- Running an unmoderated test
- Analyzing results

Steps to Create an Unmoderated Test on Crowd
1. Navigate to the Crowd dashboard, click on "Unmoderated test", then click on "Create new test".
2. Select from our numerous template boxes, or start from scratch.
3. Edit your test details.
4. Click on "Add new block" to add your preferred test.
5. Select your preferred test.
6. Preview your test and click on "Publish" to go to the recruit page.

Running the Unmoderated Test
1. Launching the test: Launch the test and send invitations to participants. Ensure they understand the process and have access to the necessary resources.
2. Ensuring privacy and security: Protect participant data and privacy. Follow data protection regulations and inform participants of how their data will be used.
3. Handling participant questions: Provide a means for participants to ask questions or seek clarification, even though the test is unmoderated. Be responsive to their inquiries.

Analyzing Results
1. Quantitative vs. qualitative data: Differentiate between quantitative data (e.g., task duration) and qualitative data (e.g., user comments) when analyzing results.
2. Data analysis tools: Use appropriate data analysis tools to process and interpret the collected data.
3. Interpreting user feedback: Look for patterns, trends, and insights within the user feedback to draw meaningful conclusions.
4. Reporting insights: Summarize your findings in a comprehensive report that includes key insights, recommendations, and actionable next steps.

Iterating and Improving
- Continuous improvement: Use the insights from the unmoderated test to inform product or website improvements. Continuously iterate and refine your designs based on user feedback.
- Incorporating feedback: Engage with your team and stakeholders to incorporate user feedback into the design and development process.
- Future testing strategies: Consider conducting additional unmoderated tests to track progress and assess the impact of changes made based on previous test findings.
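As a minimal sketch of the quantitative side of the analysis (task duration, as mentioned above), a summary over participant sessions might look like the following. The field name `taskDurationSec` and the function name are assumptions for illustration, not part of Crowd's export format.

```javascript
// Illustrative quantitative summary for unmoderated-test results:
// average and maximum task duration across participant sessions.
// The data shape is hypothetical, not Crowd's actual export format.
function summarizeDurations(sessions) {
  const durations = sessions.map((s) => s.taskDurationSec);
  const avg = durations.reduce((sum, d) => sum + d, 0) / durations.length;
  return { avg, max: Math.max(...durations) };
}
```

Qualitative data (user comments) would instead go through theme or pattern analysis, as described in the "Interpreting user feedback" step.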

Last updated on Jun 13, 2024

How to invite participants to a session and how to reschedule a session

Key Points:
- How to invite participants to a session
- How to reschedule a session

1. How to Invite Participants to a Session

To an instant moderated session: an instant moderated session does not automatically share the link with participants when you create the test. To share the link once the meeting has started:
- The meeting page is different from the session setup page, so switch tabs from the meeting page to the setup page.
- On the session setup page, copy the link to the participant test and share it.

To a session scheduled for later: you can add as many participants as you want while creating the test. To add participants during the session:
- After creating the session, navigate to the live session dashboard.
- Click on the "Session details" box on the test that is scheduled for later.
- Copy the link to the participants' test and share it.

2. How to Reschedule a Session

To reschedule a session:
- Navigate to the live session dashboard and click on "View details".
- Click on the "Reschedule" button.
- Fill in the new time and date. You can also edit the session name, the participant's name, and the email address.

You can also reschedule an instant meeting:
- Navigate to your live session dashboard.
- Click on "Reschedule" for the session you want to reschedule.
- Fill in the new date and time. You can also make other necessary edits.

Last updated on May 09, 2024

How to share a test and track results during live interviews

Key Points:
- How to share an unmoderated test during a moderated session
- How to track your responses and metrics after a session

How to Share an Unmoderated Test During a Moderated Session

1. Prepare the unmoderated test. Before the moderated session, set up the unmoderated test:
- Define objectives: Clearly define the goals and objectives of the test. What specific information are you looking to gather from participants?
- Select tasks: Identify the tasks or scenarios you want participants to complete during the test. Ensure that these tasks align with your objectives.
- Create test materials: Prepare any materials needed for the test, such as prototypes, websites, apps, or documents.
- Create test instructions: Write clear and concise instructions for participants, including details on how to perform each task.

2. Conduct the moderated session. During the moderated session, follow these steps:
- Navigate to the right part of the screen and click on "Share test".
- Select the test you have created for this session.
- Click on "Track" to time participants during the session.
- Request participants to share their screens (optional).
- Click on "Stop" to stop tracking participants' responses.
- Click on "Status" after the test is over.

How to Track Your Responses and Metrics After a Session

When the session has ended:
- Navigate to your moderated session dashboard.
- Click on "View details" on the just-concluded session.
- Click on "Shared test".
- Click on "View responses". Here you can check the responses and metrics of your session.

Last updated on May 20, 2025

How to install a widget, deactivate a widget, and manage feedback submissions

Key Points:
- How the widget installation works
- How to share, delete, edit, and rename widgets from the response page
- How Crowd detects that your widget is installed
- Deactivating your widget
- How to manage feedback submissions

1. How the Widget Installation Works

Once a new widget has been created, Crowd takes you to the "Edit widget" page. After customizing your widget, scroll down to Installation and follow these steps:
1. Copy the embed code snippet provided by Crowd by clicking on "Copy".
2. Paste it before the closing tag in the HTML of every page where you want the widget to display.
3. Test the connection to be sure it's properly installed.

Once your widget has been installed to your preference, save your work and your widget is active on your website. You can then proceed to the feedback submission settings.

2. How to Share, Delete, Edit, and Rename Widgets from the Response Page

You can share, edit, delete, and even rename your widget from your feedback widget response page. Click on the "More options" tab at the top right of your response page.

3. How Crowd Detects That Your Widget Is Installed

When you paste the code snippet into a page on your website and refresh the page, a request is made to the Crowd server to fetch and load the widget details. Before the widget is displayed on your website, the following checks are made:
1. Is the widget activated? By default, every widget created is active, meaning that once installed it will be visible on the website. If it has been deactivated, it will not be displayed.
2. Display rules: While editing and customizing your widget, there are two major display rules you have to set: device, and visibility based on your URL. The widget will show only if these checks pass.
3. Domain check: Note that a Crowd widget can only be installed on one domain or subdomain at a time, meaning you can't install it on two domains (e.g. example.com and test.example.com). If you want to install it on a new domain or subdomain, you have to unlink the current domain first: click on the Installation item on the Edit Widget page, and you will see an "Unlink website" button to unlink the domain.

4. Deactivating Your Widget

To deactivate your widget, navigate to the response page and click on the "ACTIVATE" button at the top right corner of your screen. Once your widget has been deactivated, it will no longer be visible on your website.

5. How to Manage Feedback Submissions

Managing feedback submissions requires a systematic approach to ensure that user inputs are captured, categorized, acted upon, and communicated effectively. Here's a structured method for handling feedback submissions on Crowd.

Collection and storage:
- Centralized database: Store all feedback submissions in a centralized database to facilitate easy access and analysis.
- Timestamp and metadata: Record the time of submission and any associated metadata (like the user ID, device type, or page they were on) to provide context.

Prioritization:
- Severity level: For feedback indicating issues or bugs, assess the severity of the problem.
- User demand: Prioritize feedback based on the frequency of similar requests or suggestions.
- Feasibility: Consider the practicality and resources required to address the feedback.

Categorization and tagging:
- Automated classification: Use algorithms or keyword filters to auto-classify feedback into broad categories like bugs, feature requests, or general comments.
- Manual review: Designate team members to manually review, categorize, and tag feedback to ensure accuracy and identify nuances.

Assignment and action:
- Direct to relevant teams: Forward feedback to appropriate departments or teams (e.g., technical issues to the IT team, feature requests to the product team).
- Integration with task management: Integrate feedback with Crowd's task management tool to track the progress of addressing the feedback.

Communication:
- Acknowledgment: Edit your thank-you screen to acknowledge users who submit feedback, thanking them for their input.
- Status updates: If feasible, inform users about the status of their feedback, especially if it leads to tangible changes or fixes.

Analysis and reporting:
- Trend analysis: Identify patterns or recurring feedback to understand overarching user sentiments or pressing issues.
- Regular reporting: Create regular reports or dashboards for stakeholders, summarizing feedback trends, resolved issues, and ongoing concerns.

Feedback loop closure:
- Release notes or updates: Publicly share updates or changes made in response to feedback to demonstrate that you value user input.
- User testing: If major changes are implemented based on feedback, consider running focused user testing sessions to validate them.

Continuous improvement:
- Refine the widget: Use feedback about the feedback widget itself to make it more user-friendly and efficient.
- Iterate: As your product or service evolves, regularly revisit and update the feedback management process.

Data privacy:
- Anonymize data: If you're sharing or analyzing feedback publicly, ensure personal details are anonymized.
- Compliance: Ensure your feedback collection and storage practices comply with data protection regulations relevant to your audience.

Feedback incentivization:
- Rewards: Consider offering users incentives like discounts, badges, or early access to features for providing valuable feedback.

Effectively managing feedback submissions not only enhances the user experience but also provides valuable insights for product and service improvement. It's a two-way street that benefits both users and businesses.
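The "automated classification" step above (keyword filters that route feedback into broad categories) can be sketched as follows. The category names and keyword lists are assumptions for illustration; they are not Crowd's actual classifier.

```javascript
// Illustrative keyword filter for auto-classifying feedback into the broad
// categories mentioned above. The keyword lists are assumptions, not
// Crowd's actual classification rules.
const categoryKeywords = {
  bug: ["bug", "error", "crash", "broken"],
  feature_request: ["feature", "add", "wish", "would be great"],
};

function classifyFeedback(text) {
  const lower = text.toLowerCase();
  for (const [category, keywords] of Object.entries(categoryKeywords)) {
    if (keywords.some((k) => lower.includes(k))) return category;
  }
  return "general_comment"; // no keyword match: leave for manual review
}
```

In practice a filter like this is only a first pass; as the article notes, a manual review step catches nuances the keywords miss.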

Last updated on Jan 31, 2024

Crowd widget and its different feedback types

Key Points:
- NPS with Crowd feedback widget
- CSAT surveys with Crowd feedback widget
- Collecting feature requests with Crowd feedback widget
- Bug tracking with Crowd feedback widget

1. NPS with Crowd Feedback Widget

Net Promoter Score (NPS) is a widely used metric that gauges customer loyalty by asking customers to rate how likely they are to recommend a product or service to others. Integrating NPS with a crowd widget can provide a more comprehensive feedback system and allow businesses to tap into crowdsourced insights about their performance. Here's how to effectively use NPS with a crowd widget:

Integration:
- Embed the NPS question: Within your crowd widget, embed the standard NPS question: "On a scale of 0-10, how likely are you to recommend [Product/Service/Brand] to a friend or colleague?"
- Follow-up questions: Allow space for open-ended feedback so respondents can elaborate on their scores. This provides context for the numeric rating and offers actionable insights.

Categorization:
- Promoters (9-10): Users who are extremely likely to recommend your product or service.
- Passives (7-8): Satisfied users who are neutral and might switch to competitors under the right circumstances.
- Detractors (0-6): Unhappy users who could potentially harm your brand through negative word-of-mouth.

Analysis:
- Aggregate data: Calculate your overall NPS by subtracting the percentage of detractors from the percentage of promoters. This gives you a score between -100 and 100.
- Feedback themes: Analyze the open-ended feedback to identify common themes or issues among promoters, passives, and detractors.

Feedback loop:
- Engage with respondents: Consider reaching out to detractors to understand their grievances and potentially convert them into promoters. Similarly, engage with promoters to understand what you're doing right.
- Continuous feedback: Use the crowd widget to continuously solicit NPS feedback, tracking how your score evolves over time and in response to changes or improvements you make.

Actionable insights:
- Address common issues: Use feedback from detractors to identify and address widespread issues or pain points.
- Leverage promoter feedback: Understand what's working well and consider doubling down on those areas.

Integration with other tools:
- Task management: Integrate the feedback from the crowd widget into the Crowd Kanban board to ensure that actionable insights are addressed.
- CRM integration: If applicable, integrate with your Customer Relationship Management (CRM) system to track individual user feedback and tailor your interactions accordingly.

Data visualization:
- Use dashboards to visualize your NPS trends over time, the distribution of promoters, passives, and detractors, and the key themes emerging from open-ended feedback.

Data privacy:
- Ensure that NPS responses, especially open-ended feedback, are collected and stored in compliance with relevant data protection regulations.
- Provide transparency to respondents about how their feedback will be used.

Promote the NPS question:
- Encourage users to participate by highlighting the simplicity of the NPS question and the value their feedback brings.

By integrating NPS with a crowd widget, businesses gain both a quantitative measure of customer loyalty and qualitative insights to drive improvement. The real value lies not just in collecting the scores but in acting on the feedback to enhance user experience and satisfaction.

2. CSAT Surveys with Crowd Feedback Widget

Customer Satisfaction (CSAT) surveys measure how satisfied your users are with a specific interaction or transaction. Integrating CSAT surveys with the Crowd feedback widget gives users a convenient and seamless way to share their experiences. Here's a guide on how to effectively use CSAT surveys with the Crowd feedback widget:

Survey design:
- Simplicity: CSAT questions should be direct and concise, often centered around "How satisfied were you with [specific interaction/feature]?"
- Rating scale: Crowd CSAT uses a star scale, such as 1-5, where 1 might mean "Very Unsatisfied" and 5 means "Very Satisfied". Crowd also offers an emoji scale.

Integration points:
- Relevant interactions: Place the feedback widget at the end of significant touchpoints: after a support ticket resolution, a purchase, or the use of a new feature.
- Non-intrusive placement: Ensure the widget doesn't disrupt the user experience. It should be easily accessible but not obstructive.

Follow-up questions:
- After the initial CSAT question, give users an option to elaborate on their rating. Open-ended questions can yield valuable insights.

Real-time analysis:
- Instant notifications: Set up real-time alerts for low scores so you can address issues promptly.
- Aggregate data: Over time, analyze the collected data to identify trends in customer satisfaction.

Feedback loop:
- Immediate response: If a user leaves a negative rating, consider responding with a message that acknowledges their feedback and assures them their concerns will be addressed.
- Iterative improvements: Use the feedback to implement improvements and iterate on your product or service.

Transparency and trust:
- Privacy assurance: Communicate how the feedback will be used and ensure user data privacy.
- Opt-in/opt-out: Give users the option to participate in the survey and ensure they can easily opt out if they choose.

Integration with other tools:
- CRM systems: Link feedback to individual user profiles in your Customer Relationship Management system to tailor future interactions.
- Support platforms: If the feedback widget is used after support interactions, integrate it with your support platform to get a comprehensive view of customer issues.

Reporting and visualization:
- Use the feedback analysis dashboards to track CSAT scores over time, breaking the data down by category, touchpoint, or demographic.

Incentivization:
- Consider offering users a small incentive for participating in the CSAT survey, like entry into a prize draw or a discount on their next purchase.

Continuous improvement:
- Regularly review and refine your CSAT questions to ensure they stay relevant and continue to provide actionable insights.
- Adjust the widget's design or placement based on user engagement metrics to increase participation.

By effectively integrating CSAT surveys into your feedback widget, you give users an easy and intuitive way to share their experiences, leading to actionable insights that can drive improvements in your product or service.

3. Collecting Feature Requests with Crowd Feedback Widget

Collecting feature requests via a feedback widget is a great way to gauge user needs, prioritize product development, and enhance the overall user experience. Implementing a system that captures, manages, and acts upon these feature requests efficiently is crucial. Here's a guide on how to effectively use a feedback widget for feature request collection on Crowd.

Widget design and placement:
- Accessibility: Ensure the widget is easily accessible from all pages or sections where users might want to suggest features.
- Visibility: While the widget should be noticeable, it shouldn't disrupt the user experience. Consider using an icon or a floating button that expands on click.
- Responsiveness: Ensure the widget is mobile-responsive if your platform supports mobile users.

Encourage context:
- Ask users to provide use cases or scenarios so you understand the context behind their request.

Acknowledgment:
- Confirmation message: Upon submission, show a customized message thanking the user and confirming receipt of their suggestion.
- Feedback loop: Consider notifying users if their suggested feature gets implemented or if there are updates about it.

Integration and analysis:
- Centralized repository: Store all feature requests in a centralized system or platform for easy review and analysis.
- Trend identification: Periodically analyze requests to identify commonly requested features or themes.
- Prioritization: Use urgency indicators, user suggestions, and your product strategy to prioritize feature development.

Engagement and community building:
- Public roadmap: If appropriate, maintain a public roadmap where users can see planned features and the status of requested features.
- Community discussions: Allow comments or discussions around feature requests to encourage community engagement and flesh out ideas.

Data privacy:
- Anonymity option: Allow users to submit feature requests anonymously if they prefer.
- Data protection: Ensure you handle and store user data in compliance with relevant regulations.

Feedback incentivization:
- Consider offering rewards or recognition to users who suggest features that get implemented. This can encourage more users to provide valuable input.

Collecting feature requests through a feedback widget not only empowers users by giving them a voice in the development process but also provides invaluable insights to product teams, ensuring that the product evolves in a direction aligned with user needs.

4. Bug Tracking with Crowd Feedback Widget

Bug tracking is a critical component of any software development and maintenance process. A feedback widget can be a valuable tool for capturing, categorizing, and addressing bugs reported by end users in real time. Here's how to effectively use a feedback widget for bug tracking on Crowd.

Widget design and placement:
- Visibility: Ensure the widget is noticeable yet unobtrusive. Consider a bug icon or label to make it explicit.
- Accessibility: Place the widget consistently across all pages or sections of your application or website.
- User experience: Ensure the widget is easy to use, intuitive, and responsive across devices.

Bug reporting form:
- Description: Include a text field for users to describe the bug in detail.
- Screenshots/attachments: Allow users to attach screenshots, screen recordings, or other files that illustrate the issue.
- Reproduction steps: Ask users for the steps to reproduce the bug, for example via a screen recording, which is invaluable for developers.

Feedback loop:
- Acknowledgment: Send a customized automatic response or show a message thanking the user for their report and assuring them it will be looked into.
- Progress updates: If feasible, notify users about the status of the bugs they reported (e.g., "Under Review", "In Progress", "Resolved").

Integration with a bug tracking system:
- Ensure the feedback widget integrates seamlessly with your existing bug-tracking or issue-management system.
- Automate the creation of new tickets or issues based on user reports, complete with all the captured data.

Prioritization and triage:
- Use severity levels, frequency of reports, and potential impact to prioritize bug fixing.
- Regularly review and triage incoming bug reports to address critical issues promptly.

Analysis and insights:
- Periodically analyze the collected data to identify recurring issues, potential areas of improvement, or patterns indicating systemic problems.

Data privacy and security:
- Ensure that any user data collected is protected and handled according to relevant regulations.
- Allow users to report bugs anonymously if they prefer.

Incentivizing reports:
- Consider offering rewards, discounts, or recognition to users who report significant or critical bugs. This can motivate more users to take the time to report issues.

Incorporating a feedback widget for bug tracking can greatly enhance the user experience by making it easy for users to report issues and feel heard. For developers, it provides real-time insights from actual users, enabling quicker resolutions and a more stable product.
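The NPS aggregation described in section 1 (promoters score 9-10, detractors 0-6, and NPS = percentage of promoters minus percentage of detractors) can be sketched as a small function. The function name and input shape are illustrative, not part of Crowd's API.

```javascript
// Sketch of the NPS aggregation described above: promoters rate 9-10,
// detractors rate 0-6, passives (7-8) count only toward the total, and
// NPS = %promoters - %detractors, giving a value between -100 and 100.
function netPromoterScore(ratings) {
  const promoters = ratings.filter((r) => r >= 9).length;
  const detractors = ratings.filter((r) => r <= 6).length;
  return Math.round(((promoters - detractors) / ratings.length) * 100);
}
```

For example, four responses of 10, 9, 8, and 3 give two promoters, one passive, and one detractor, so the score is (50% − 25%) = 25.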

Last updated on Jan 31, 2024

Feedback widget and best practices for collecting feedback

Key Points:
- Best practices for collecting feedback
- Implementing a feedback widget

Best Practices for Collecting Feedback

1. Clearly define objectives: Before collecting feedback, outline your goals and the specific information you're seeking to gather. Be clear about the problems you want to address or the aspects of your product or service that need improvement.
2. Create user-friendly feedback forms: If using forms or surveys, design them to be user-friendly and easy to complete. Keep them short, use plain language, and offer clear instructions. Include a mix of open-ended and closed-ended questions.
3. Implement feedback widgets: Use feedback widgets on your website or within your application to make it easy for users to provide feedback in real time. Widgets can be strategically placed for maximum visibility.
4. Anonymity and privacy: Allow users to provide feedback anonymously, especially if they are sharing negative experiences or sensitive information. Ensure that user data is protected and that privacy is maintained.
5. Timing and context: Ask for feedback at appropriate times and in relevant contexts. For example, request feedback after a support interaction, following a purchase, or during specific interactions with your product.
6. Incentives: Consider providing incentives, such as discounts, access to exclusive content, or other rewards, to encourage users to provide feedback.
7. Multichannel approach: Collect feedback through multiple channels, including email, on-site widgets, social media, and user interviews. Different channels reach different user segments and provide diverse insights.
8. Listen actively: Actively listen to the feedback you receive. Show empathy and acknowledge the user's perspective, even if you can't immediately address their concerns.
9. Respond and close the feedback loop: Respond to user feedback promptly. Whether it's a thank-you message or addressing a specific issue, closing the feedback loop shows that you value user input.
10. Analyze and prioritize: Analyze the feedback systematically. Identify recurring themes, common pain points, and opportunities for improvement. Prioritize changes based on impact and feasibility.
11. Implement changes transparently: After making improvements based on feedback, communicate the changes to your users. Transparency builds trust and shows that you're responsive to their needs.
12. Continuous monitoring: Collect feedback regularly and consistently. User needs and preferences can evolve, and ongoing monitoring ensures that your products and services stay aligned with these changes.
13. Training and empowerment: Train your support and customer service teams to collect and manage feedback effectively. Empower them to address user concerns and escalate feedback to the relevant teams.
14. Data integration: Integrate feedback data with other user data and analytics to gain a comprehensive understanding of user behavior and sentiment.
15. User testing and observational feedback: Combine qualitative feedback methods, such as user testing and observational feedback, with quantitative data to gain a holistic view of user experiences.
16. Benchmarking: Benchmark your feedback against industry standards or competitors to gain context and identify areas where you can excel.
17. Iterate and improve: Continuously iterate on your feedback collection processes and act on the insights you gain. Improvement should be an ongoing, iterative process.

Implementing a Feedback Widget

Once you've selected the right widget, the next step is implementation.

1. Navigate to your dashboard, click on "Feedback Widget", then click on "Create new widget".
   Tip: You can select from our numerous feedback widget templates rather than building one from scratch.
2. Select the widget type you want to create: single or multiple feedback types.
3. Customize your widget:
   - Edit the widget name.
   - Select the preferred feedback type for your widget.
   - Style your widget to your preference.
   - Customize your thank-you message page.
   - Customize your display rule.
4. Install the widget code:
   - Configure the widget's placement on your website or app. Ensure it's easily accessible and visible to users.
   - Before making it live, thoroughly test the widget's functionality to make sure it works as expected.
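For orientation, the "install the widget code" step typically involves pasting a small script tag into your pages. The sketch below shows the general shape of such an embed snippet only; the URL and attribute names are placeholders, not Crowd's real code. Always copy the actual snippet from your widget's installation settings.

```html
<!-- Hypothetical embed snippet: the src and data-widget-id are placeholders,
     NOT Crowd's real code. Copy the actual snippet from your widget's
     installation settings and paste it into every page that should show the
     widget, typically just before the page's closing body tag. -->
<script
  async
  src="https://widget.example.com/loader.js"
  data-widget-id="YOUR_WIDGET_ID">
</script>
```

Loading the script asynchronously is the usual choice for widgets like this, so the embed does not block the rest of the page from rendering.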

Last updated on Jun 14, 2024