Makerble® Surveys

Bring your surveys to life with a platform that tracks change over time

Branded, flexible and built for insight: Makerble Surveys lets you collect data across moments, monitor impact, and turn responses into reports that tell a powerful story.

Get Started →
Get full access ✦ No credit card required

WHAT’S NEW

Ask Matt: Can I omit survey responses from people who've dropped out?
In Hints, Tips and Feature Releases

In this instalment of Ask Matt, we're answering the question of how to handle participants who drop out over the course of an initiative that you want to analyse longitudinally.

Let's face it, people drop out of programmes. While that can be easy to manage operationally, it can present problems when you're analysing your results. The last thing you want is for your results to become skewed - or rather, diluted - by participants who began with you but didn't stick around until the end. Thankfully there's a way to accommodate that on Makerble. Let's dive in.

In this example we'll imagine a Coding Course with three participants at the start, one of whom drops out. Here's how to manage that and ensure you have reliable results at the end.

As you'll see, the filtering isn't limited to the Distribution over Time chart; the same criteria are applied to the other charts shown on your Progress Board.

There you have it! Hopefully you won't need to contend with dropouts, but if you do, now you know how to.

Using the WEMWBS Survey to Understand Wellbeing – and How Makerble Enhances It
In The School of Impact

Measuring mental wellbeing is an essential part of understanding the impact of your work. The Warwick-Edinburgh Mental Wellbeing Scale (WEMWBS) is widely used across sectors for this purpose. It’s a reliable, straightforward way to capture data about how people are feeling - particularly in the context of social and community-based services.

This article looks at what makes WEMWBS effective and how using it within the Makerble platform can help you go further - streamlining how you collect, analyse and learn from wellbeing data.

What is WEMWBS – and Why Use It?

WEMWBS is a 14-question scale designed to assess positive aspects of mental wellbeing. Developed through academic research and validated across diverse populations, it captures indicators such as optimism, energy, and connection to others.
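
For orientation, here is a minimal sketch of how a WEMWBS total is commonly scored: each of the 14 items is rated from 1 ("none of the time") to 5 ("all of the time") and the ratings are summed, giving a total between 14 and 70. The code below is illustrative Python only, not part of the Makerble platform.

```python
# Illustrative WEMWBS scoring sketch (not Makerble's implementation).
# Each of the 14 statements is rated 1 ("none of the time") to 5 ("all of the time"),
# and the total is the simple sum, giving a score between 14 and 70.

def wemwbs_total(item_scores: list[int]) -> int:
    """Sum a respondent's 14 item ratings into a single WEMWBS score."""
    if len(item_scores) != 14:
        raise ValueError("WEMWBS has exactly 14 items")
    if any(s < 1 or s > 5 for s in item_scores):
        raise ValueError("Each item is rated on a 1-5 scale")
    return sum(item_scores)

# Example: a respondent who answers '4' to every statement scores 56 out of 70.
print(wemwbs_total([4] * 14))  # 56
```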

Organisations use WEMWBS when they want a simple but robust way of understanding how their services affect people’s mental wellbeing. It’s particularly common in:

  • Charities working in mental health
  • Youth and community services
  • Education settings
  • Local authority and NHS-funded programmes

Limitations of Traditional Survey Tools

While WEMWBS itself is effective, how it’s delivered and analysed can be a challenge. Many organisations still rely on spreadsheets, paper forms or general-purpose survey tools — which often results in:

  • Time-consuming data analysis
  • Difficulty comparing results across projects or time
  • Limited insight into individual journeys or group trends
  • Insights that are delayed or underused

How Makerble Enhances WEMWBS

Makerble has been designed specifically with impact-focused organisations in mind. It’s built to make evidence collection easier — and to help you extract meaningful insight from that evidence as you go.

Here’s how Makerble supports organisations using WEMWBS:

📊 Instant Insight

When a participant completes WEMWBS via Makerble, their scores are reflected instantly on your dashboard. This allows teams to monitor wellbeing trends in real time — without needing to download data or build custom charts.

🧍 Track Individual Progress Over Time

One of the strengths of Makerble is the ability to track progress at both the group and individual level. You can easily see how someone’s wellbeing has changed across multiple touchpoints — useful for frontline workers and useful for reporting.

🧩 Segment and Compare Results

Whether you’re running a single programme or delivering multiple services across locations, Makerble makes it easy to filter and compare wellbeing scores by demographic, service, team, or location.

📁 Outcomes That Link to the Bigger Picture

WEMWBS scores are part of a broader story. Within Makerble, you can link WEMWBS data to your project outcomes — showing how improvements in wellbeing contribute to your wider goals.

📄 Ready-Made Reporting

With Makerble’s reporting tools, it’s straightforward to produce summaries that highlight trends and tell the story behind the data — without needing to build reports from scratch.

Get Started

If you’re already using WEMWBS, you can get started with Makerble in a few simple steps — whether you want to import existing data or run your first survey through the platform.

We’re happy to walk you through how it works and explore how it could support your existing processes.

Start using WEMWBS for free on Makerble or Book a Demo

12 Ways To Supercharge Survey Completions & Analysis on Makerble
In The School of Impact

Get all the insights you need for growth by making the most of every survey you run.

Get more out of your surveys. Save time with automations, improve the survey completion experience and expand the range of insights available from your responses. Discover all this and more in our round up of the top 12 ways to supercharge surveys on Makerble. Let's dive in.

#12 Skip Logic

Increase survey completion rates by ensuring that respondents only see the questions that relate to them. Skip Logic enables you to tailor the survey experience to each respondent based on their answers to your questions. You can categorise some of your survey questions as screening questions which means that the answers to those questions dictate which subsequent questions are displayed to a respondent.

There are two types of Skip Logic available to you on Makerble Surveys.

  • Basic (Jump To Question) will immediately skip past one or more questions based on the answer that the respondent has just provided to a question
  • Advanced (Hide Question Unless...) will remember a respondent's answer to an earlier question and based on that earlier answer, it will show or hide a question which appears later in the survey

Skip Logic is a great way to manage complex data collection requirements without needing to complicate things by creating many different surveys that are only slightly different from each other. It lets you consolidate all your questions into a single survey that uses screening questions to guide respondents to the parts that are relevant to them.
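
To illustrate the two styles, here is a minimal sketch using hypothetical questions: a 'Jump To Question' rule skips ahead based on the answer just given, while a 'Hide Question Unless...' condition only shows a later question if an earlier answer matches. This is illustrative Python, not Makerble's configuration format.

```python
# A minimal sketch of the two Skip Logic styles described above, using
# hypothetical questions and data structures (not Makerble's internal format).

questions = ["Q1: Are you a parent?", "Q2: How old is your child?", "Q3: Do you work?"]

# Basic (Jump To Question): the answer just given decides which question comes next.
jump_rules = {("Q1: Are you a parent?", "No"): "Q3: Do you work?"}

# Advanced (Hide Question Unless...): a later question is only shown if an
# earlier answer matches a condition.
show_conditions = {"Q2: How old is your child?": ("Q1: Are you a parent?", "Yes")}

def next_question(current: str, answer: str, answers: dict) -> str | None:
    """Return the next question a respondent should see."""
    # Basic skip: jump straight to a later question if a rule matches.
    if (current, answer) in jump_rules:
        return jump_rules[(current, answer)]
    # Otherwise walk forward, hiding questions whose conditions aren't met.
    idx = questions.index(current) + 1
    for q in questions[idx:]:
        condition = show_conditions.get(q)
        if condition is None or answers.get(condition[0]) == condition[1]:
            return q
    return None  # end of survey

answers = {"Q1: Are you a parent?": "No"}
print(next_question("Q1: Are you a parent?", "No", answers))  # "Q3: Do you work?"
```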

Get started with Skip Logic

#11 Success Scores

Success Scores give you an instant understanding of the change in people's answers between their pre and post survey responses. There are three scores and each one is automatically calculated for every Scale-based question set to Single Choice. This means you can speed up your longitudinal analysis, report accurate results quickly and see at a glance how well a particular cohort is doing.

The Success Scores show you the following:

  • the % of contacts or survey respondents who have improved (in other words, Distance Travelled)
  • the average amount of improvement across all contacts
  • the % of the way there - a current snapshot of progress across all contacts

Now you can get an understanding of Distance Travelled without needing to review every respondent's answer on every survey they've completed.
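
As a rough illustration of how these three scores relate to pre and post answers, here is a minimal Python sketch using invented data on a 1-10 scale. The '% of the way there' line assumes a simple definition (average latest score as a percentage of the scale maximum); Makerble calculates the Success Scores for you automatically, so this is not its actual implementation.

```python
# A minimal sketch of the three Success Scores for one Scale-based question,
# computed from hypothetical pre/post answers on a 1-10 scale. Illustrative only.

pre  = {"Amira": 3, "Ben": 5, "Chloe": 7}   # baseline answers
post = {"Amira": 6, "Ben": 4, "Chloe": 9}   # follow-up answers
scale_max = 10

improvements = [post[name] - pre[name] for name in pre]

percent_improved    = 100 * sum(d > 0 for d in improvements) / len(improvements)
average_improvement = sum(improvements) / len(improvements)
# Assumed definition: average latest score as a % of the scale maximum.
percent_of_the_way  = 100 * sum(post.values()) / (scale_max * len(post))

print(f"% improved (Distance Travelled): {percent_improved:.0f}%")    # 67%
print(f"Average improvement:             {average_improvement:+.1f}") # +1.3
print(f"% of the way there:              {percent_of_the_way:.0f}%")  # 63%
```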

Find out more about Success Scores

#10 Question Rotation

There's always a trade-off with longitudinal surveys between consistency and variety. Question Rotation on Makerble gives you the best of both.

Here's a quick rundown of how it works:

  • By default your questions are static - which means they will appear in the survey every time it is completed.
  • But for questions that don't need to be asked as frequently, you can add them to a Pool.
  • Every time someone is shown the survey to complete, they will see a different question to the one they saw last time from that Pool.
  • You can add as many Pools as you like to each survey.

This lets you rotate the questions which are being asked in order to give respondents a degree of variety, whilst at the same time being able to maintain consistency where you need it.
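
As a rough sketch of the rotation idea, the hypothetical example below keeps static questions in every completion and serves one question from a Pool per completion. Cycling through the Pool in order is an assumption made for illustration, not necessarily how Makerble picks the next Pool question.

```python
# A minimal sketch of Question Rotation: static questions appear every time,
# while questions in a Pool rotate so a respondent sees a different one on
# each completion. Hypothetical structures, not Makerble's implementation.

static_questions = ["How satisfied are you with the course?"]
pool = ["What did you learn this week?",
        "What would you change about the sessions?",
        "How useful were the materials?"]

def build_survey(completion_count: int) -> list[str]:
    """Return the questions shown on a respondent's nth completion."""
    # Rotate through the pool: completion 0 gets pool[0], completion 1 gets pool[1], ...
    rotating = pool[completion_count % len(pool)]
    return static_questions + [rotating]

for n in range(4):
    print(n, build_survey(n))
```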

Get started with Question Rotation

#9 Automated Survey Distribution

Send out personalised survey invitations, on time and with accuracy, using Makerble Automation. Set rules that govern when people are sent specific surveys, such as a feedback survey at the end of a course or a baseline survey once they register with you. Intelligently customise the email and SMS message containing the survey link using mail merge tags that give every communication the personal touch, even as you scale.

Automated Survey Distribution is particularly useful when you have a constant influx of people you need to survey. Do you have:

  • new customers or enquiries every day?
  • new courses every couple of months?
  • new cohorts of students every academic year?
  • new employees ad-hoc?

If you need to deal with any of these, sending out one-off survey links quickly becomes tricky. If you need to go beyond that and collect survey responses from the same people over time - such as a Pre and Post survey - it can become a nightmare to administer all those survey invitations. Makerble Automation can simplify it all. Now your team can focus on what they do best - coming up with ways to better serve your audience rather than getting bogged down in survey administration.
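
To make the rule-based idea concrete, here is a minimal sketch in which an event (such as registering or completing a course) triggers a personalised survey message built with mail-merge-style tags. The triggers, fields and link format are hypothetical; Makerble's automations are configured in the platform, not in code.

```python
# A minimal sketch of an automation rule: when a contact reaches a given
# milestone, send them a personalised survey link. Names, triggers and the
# link format are hypothetical and for illustration only.

from string import Template

rules = [
    {"trigger": "registered",       "survey": "Baseline Survey"},
    {"trigger": "course_completed", "survey": "End-of-Course Feedback"},
]

message = Template("Hi $first_name, please complete the $survey: $link")

def on_event(contact: dict, event: str) -> None:
    """Send the survey(s) whose trigger matches the event that just happened."""
    for rule in rules:
        if rule["trigger"] == event:
            link = f"https://example.org/s/{rule['survey'].lower().replace(' ', '-')}"
            body = message.substitute(first_name=contact["first_name"],
                                      survey=rule["survey"],
                                      link=link)
            print(f"Sending to {contact['email']}: {body}")

on_event({"first_name": "Sam", "email": "sam@example.org"}, "course_completed")
```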

Get started with Automated Survey Distribution

#8 Verdicts

Every survey response means something, but interpreting the meaning can take time. Verdicts on Makerble allow you to automatically classify every survey response based on the combination of answers chosen. This speeds up decision-making when it comes to deciding what to do next with a participant once they've completed a survey.

Verdicts are typically used in medical scenarios where a particular set of answers indicates a specific diagnosis. But you can create your own decision-making formulas that enable you to intelligently classify each participant based on their answers to your questions.
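
As a simple illustration of the idea, the sketch below classifies a response from the sum of its answers against made-up thresholds. It is not a real clinical scoring rule and not Makerble's implementation - with Verdicts you define your own formula in the platform.

```python
# A minimal sketch of a Verdict: a formula that classifies a whole survey
# response from the combination of answers. The questions and thresholds
# below are invented for illustration only.

def verdict(answers: dict[str, int]) -> str:
    """Classify one survey response based on its total score."""
    total = sum(answers.values())
    if total >= 20:
        return "Low risk - no follow-up needed"
    if total >= 12:
        return "Medium risk - schedule a check-in"
    return "High risk - alert the safeguarding lead"

print(verdict({"Sleep": 4, "Mood": 3, "Appetite": 5, "Energy": 4}))  # Medium risk
```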

Get started with Verdicts

#7 Response-based Workflows

When people don't respond

Increase your survey's response rate with automated reminders that chase people up only if they haven't submitted their survey response yet. Chasing people up is tedious, and there's administration involved - cross-checking your submissions list against your distribution list. With No-Response Reminders you can sit back and watch the survey responses roll in.

Get started with No-Response Reminders

When people do respond

Take action in response to people's answers. Answer-based Workflows allow you to deliver personalised interventions based on the way people have answered your survey. For example - if a respondent has mentioned a particular opportunity or risk (e.g. being ready to look for a new job) you can automatically alert relevant colleagues, add a flag to their contact profile or even send them personalised communications via SMS and Email. In this way you can act on every survey response in real time, improving risk management and taking advantage of time-sensitive opportunities.
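
To make this concrete, here is a minimal sketch of a response-based rule, with hypothetical question text and placeholder actions standing in for the alerts, flags and messages described above. It is illustrative Python, not Makerble's configuration format.

```python
# A minimal sketch of an answer-based workflow: take an action when a specific
# answer is given. The print statements stand in for real alerts, flags and
# messages; names and actions are hypothetical.

workflows = [
    {"question": "Are you ready to look for a new job?",
     "answer": "Yes",
     "actions": ["alert_employment_advisor", "flag_profile:job-ready", "send_sms:opportunities"]},
]

def process_response(contact: str, response: dict[str, str]) -> None:
    """Run every workflow whose question/answer pair matches this response."""
    for wf in workflows:
        if response.get(wf["question"]) == wf["answer"]:
            for action in wf["actions"]:
                print(f"{contact}: {action}")

process_response("Priya", {"Are you ready to look for a new job?": "Yes"})
```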

Get started with Answer-based Workflows

#6 Outcome Rings

Outcome Rings turn individual survey responses into a clear visual story that makes it easy for everyone to understand the shape of the results. Outcome Rings display a person's answers to related survey questions on a 'spider plot' which reveals strengths, gaps, and overall patterns at a glance.

As a visual aid, Outcome Rings help everyone make sense of progress. This means you can use them as an engagement tool when working with individual respondents or an entire group of respondents. Rather than requiring people to interpret tables with words and numbers, you can instead empower them with visuals that highlight just how far they have come, while at the same time pinpointing areas for further growth. This makes Outcome Rings a dynamic and interactive diagnostic tool. Now you can transform raw data into meaningful insights that drive personal development and honest group discussions.

You're not limited to looking at individual participants on Outcome Rings. You can also see and overlay the following:

  • the Mean Average response from an entire cohort of participants
  • Observations recorded by your colleagues about participants
  • Self-reflections submitted by participants themselves
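
As a rough sketch of what a 'spider plot' overlay involves, here is a minimal matplotlib example that plots one participant's answers alongside the cohort's mean average. The outcome names and scores are invented for illustration; Makerble renders Outcome Rings for you without any code.

```python
# A minimal sketch of an Outcome Ring style 'spider plot', overlaying one
# participant's answers with the cohort's mean average. Data is invented.

import numpy as np
import matplotlib.pyplot as plt

outcomes = ["Confidence", "Communication", "Teamwork", "Resilience", "Motivation"]
participant = [7, 4, 6, 5, 8]          # one person's latest answers (1-10 scale)
cohort_mean = [6, 5, 5, 6, 7]          # mean average across the cohort

angles = np.linspace(0, 2 * np.pi, len(outcomes), endpoint=False).tolist()
angles += angles[:1]                    # repeat the first angle to close the ring

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, values in [("Participant", participant), ("Cohort mean", cohort_mean)]:
    data = values + values[:1]          # close the ring for this series too
    ax.plot(angles, data, label=label)
    ax.fill(angles, data, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(outcomes)
ax.legend()
plt.show()
```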

Get started with Outcome Rings

#5 360° Surveys

Create a culture of openness and embed a Growth Mindset, with 360° Surveys. You can collect several people's perspectives on the same person or organisation by categorising your respondents as Observers. This is useful when you want to:

  • Provide employees with rich rounded feedback from their peers about how they're doing - you designate one employee as the 'subject' and the other employees are linked to them as 'observers'
  • Conduct an OCAT (Organisational Capacity Assessment Tool) survey that gathers feedback about an organisation or team from across the workforce or team members, in order to assess how the organisation is really doing - you designate the organisation or team as the 'subject' and the workforce or team members are linked to them as 'observers'

In each of these scenarios you can use Outcome Rings to compare and contrast what Subjects say about themselves with what Observers say about them. This can lead to honest and fruitful feedback and reflections that drive progress.

Get started with 360° Surveys

#4 Tabulated Interval Analysis

One of the best ways to track improvement is to compare survey responses collected over time from the same respondents. This enables you to establish a baseline and then see how responses vary. Each point in time is an Interval and you can even give each Interval a name so that it's easier to identify them in your analysis.

Makerble gives you several ways to conduct Interval Analysis:

  • Tabulated responses that are ordered sequentially i.e. 1st Time, 2nd Time, 3rd Time
  • Tabulated responses that are ordered according to each named interval e.g. baseline - midline - endline

If you prefer a more visual interface, you can instead opt for the chart-based dashboards.
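
As a rough illustration of what a tabulated interval view looks like - one row per contact, one column per named Interval - here is a hypothetical pandas sketch. It is illustrative only and not how Makerble stores or displays your responses.

```python
# A minimal sketch of tabulating the same question across named Intervals
# (baseline - midline - endline), one row per contact. Data is invented.

import pandas as pd

responses = [
    {"contact": "Amira", "interval": "baseline", "score": 3},
    {"contact": "Amira", "interval": "midline",  "score": 5},
    {"contact": "Amira", "interval": "endline",  "score": 7},
    {"contact": "Ben",   "interval": "baseline", "score": 4},
    {"contact": "Ben",   "interval": "endline",  "score": 6},
]

# Pivot the long list of responses into one row per contact, one column per interval.
table = (pd.DataFrame(responses)
           .pivot(index="contact", columns="interval", values="score")
           .reindex(columns=["baseline", "midline", "endline"]))
print(table)  # Ben's missing midline response shows as NaN
```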

Get started with Interval Analysis

#3 Multi-version surveys

Tweak your survey questions over time, without needing to create duplicate questions. Survey Versions on Makerble give you the freedom to adapt and adjust your surveys as you go, whether that's based on changes to research requirements, feedback or any other reason.

Each of your surveys contains Core Questions which you can choose to hide, reword or keep in subsequent versions of the survey. This means you can still compare responses - for longitudinal analysis, such as when you have a baseline, midpoint and endpoint - and you can aggregate responses to see the totals across different projects, even though the survey questions themselves might be worded slightly differently or only appear in certain versions.

Get started with Multi-version surveys

#2 Benchmark Comparisons

Understand how one cohort performed compared to another. Contrast contacts with one characteristic versus contacts of another. See how results from one year differ from results from another. You can do side-by-side comparisons like these and more using the Comparison features available on Makerble.

One definition of impact is the difference you've made compared to what would have happened anyway. With Benchmark Comparisons on Makerble you can create your own benchmarks to compare results against. And if you're running a control group, or even a randomised controlled trial, you can manage your results on Makerble and easily see the differences between the two groups.

Get started with Comparisons

#1 Branded Survey Microsites

Lead the conversation in your sector with a Branded Survey Microsite. You create the surveys, give your members access to their private dashboard and then they share survey links with their audiences. They get to analyse only their results, whereas you get to see anonymised results from across your members' audiences.

Armed with those insights you can see the big picture of what's happening in your sector. You have full access to the dashboards that let you identify themes and spot trends. You can even share your benchmarks so that members can compare their results against your benchmark and see how their own results change year-on-year.

Get started with Branded Survey Microsites

So that's it - a summary of all the ways you can get the most out of your surveys and ensure you gain the insights you need for growth. To put these into practice, start using Makerble Surveys for free, or book a call to dive deeper with a specialist.

How Buttle UK automate Longitudinal Grant Evaluation with Makerble
In Case Studies

Buttle UK transformed a complex, spreadsheet-heavy impact process into a seamless, automated workflow - giving them richer insights, with less admin.

ABOUT

Buttle UK provides life-changing hardship grants to young people across the UK, many of whom are living in poverty, experiencing family breakdown, or estranged from their parents or guardians. These grants provide practical, emotional and educational support - often marking a turning point in a young person’s life.

But for a charity committed to learning and transparency, understanding the real impact of those grants over time is just as vital as giving them.

THE CHALLENGE

To capture outcomes effectively, Buttle UK had developed a thoughtful longitudinal evaluation model, sending three different surveys over a six-month period:

  • A 3-month survey to the referring professional (e.g. youth worker or case worker)
  • A 3-month survey to the young person or their parent/guardian
  • A 6-month follow-up survey to the young person or parent/guardian

But with new grants being made every month - and a growing caseload - keeping track of survey timing, respondents, and data analysis became overwhelming. The team was juggling complex spreadsheets just to stay on top of who needed what, when.

THE SOLUTION

With Makerble, Buttle UK streamlined their entire follow-up process. Now, once a new batch of grantees is approved, staff simply upload a spreadsheet of recipients into the platform.

From there, Makerble takes care of the rest:

  • Automatically sending 3-month surveys to referrers and families
  • Triggering the 6-month surveys at the right time, without manual chasing
  • Allowing surveys to be completed via email or SMS - whichever works best for recipients

Everything is securely logged and instantly available for analysis.

THE IMPACT

Instead of spending hours managing reminders and data merges, the Buttle UK team can now focus on what really matters: learning from the data.

With Makerble’s filters and Lenses tools, they can:

  • Segment results by age, region, or type of hardship
  • Compare progress between different groups
  • Surface powerful insights — both qualitative and quantitative — in real time

Makerble has helped Buttle UK transform a demanding process into a scalable, insightful, and human-centred evaluation system - one that matches the care and ambition behind every grant they give.

Why survey responses dip during programmes and why that's okay
In The School of Impact

Survey responses often fluctuate during programmes, with some scores dipping mid-way. These changes aren’t necessarily a sign of failure but rather reflect natural shifts in self-perception, emotional growth, or external challenges. Understanding why these dips happen and how to design surveys to capture real progress can help you interpret the data more effectively and support participants on their journey.

There are several reasons why your contacts' Average Improvement along a particular Scale could be below zero.

1. The Self-Perception Shift

What it is

When running courses designed to help people develop, it’s common to use surveys to track participants’ progress. However, these surveys can sometimes seem to suggest a decline in self-assessment over time, which you’ll see on Makerble as a below-zero Average Improvement if it happens to enough of your participants.

Here’s an example. You run a parenting course and you have a question that asks participants at the start of the course: "How good a parent are you?" The participants, thinking that they’re already good parents, provide answers such as, "Great!" But after a few weeks on the course, over which time those participants have learnt a lot more about what good parenting actually looks like, those same participants rate themselves lower. It’s not that they’ve become worse parents, it’s because their self-perception has shifted.

This phenomenon is often referred to as the Dunning-Kruger effect and reflects an increase in self-awareness. At the beginning of a course, people may not fully understand what “great” looks like. As they learn more, they may realise their initial assessment was overly optimistic. This shouldn’t be seen as a sign of failure, rather this shift can indicate meaningful growth: participants are learning to evaluate themselves against higher, more accurate standards.

What you can do about it

To avoid misinterpreting these results and better capture true development, consider these strategies:

  1. Include Objective Measures
    Combine self-reported data with objective metrics, where possible. For example, track changes in specific parenting behaviours, like communication frequency or positive reinforcement, alongside self-perception.
  2. Acknowledge Changing Perspectives
    Incorporate questions that recognise shifts in understanding, such as:
    "How has your understanding of effective parenting changed since starting the course?"
    This helps contextualise changes in self-assessment as part of a broader learning journey.
  3. Use Retrospective Pre-Post Questions
    Ask participants to reflect on where they were at the start of the course, e.g.:
    "Looking back, how would you rate your parenting skills when you started?"
    This approach highlights their progress by allowing them to reassess their starting point with their new perspective.
  4. Invite Reflection
    Instead of a raw self-assessment, phrase questions to highlight growth. For instance:
    "How confident are you in your parenting skills compared to when you started the course?"
    This helps participants anchor their responses to progress rather than static self-judgment.

In summary

When participants lower their self-assessments, it’s not a sign that they’re getting worse. Instead, it often reflects growing insight into what “good” really looks like. By designing survey questions that account for this shift, you can more effectively capture the progress that truly matters—the development of self-awareness, skills, and confidence.

Accurate tracking ensures your data reflects the real impact of your programme, celebrating participants’ growth in ways that inspire and motivate.

2. The Mid-Programme Slump

If you’ve ever reviewed survey data during the middle of a programme and noticed a decline in responses compared to the initial assessment, you’re not alone. Particularly for questions about mental health and wellbeing, these dips can feel disheartening—but they’re not necessarily a sign that things are going wrong. Progress, especially in areas tied to personal growth or emotional wellbeing, is rarely linear.

Why it happens

Mid-programme dips are natural and often reflect the complexity of personal progress. Here’s why they might occur:

  • Emotional Vulnerability
    Programmes that tackle deep issues often surface emotions or challenges that participants may not have previously acknowledged. For example, a mental health programme may bring attention to stressors or unhelpful habits that people had ignored. While this is part of growth, it can feel overwhelming and temporarily impact wellbeing scores.
  • The "Implementation Dip"
    Borrowing a term from education, the "implementation dip" refers to the temporary challenges people face when applying new knowledge or skills. For example, adopting healthier coping mechanisms may feel awkward or frustrating at first, leading to perceived setbacks.
  • Temporary Life Events
    Declines in wellbeing might also reflect external factors, such as unexpected stressors in participants' lives. These are unrelated to the programme but can impact responses.
  • Programme Fatigue
    For some participants, the middle of a programme can feel like a slog—especially if they’ve invested effort but haven’t yet seen tangible results. This can lead to a temporary decline in motivation or optimism.

What you can do about it

To reassure yourself (and participants), here are ways to interpret and respond to these dips:

  • Normalise the Journey
    Recognise that ups and downs are a normal part of personal growth. Share this with participants during the programme to help manage their expectations. Framing these dips as part of the process can reduce discouragement.
  • Supplement Surveys with Qualitative Data
    Pair quantitative responses with open-ended questions like, "What challenges are you facing right now?" or "What are you learning about yourself so far?" This can provide richer insights into why responses are declining and whether it’s part of a broader growth pattern.
  • Track Long-Term Trends
    Focus on overall trajectories rather than isolated data points. If scores dip mid-programme but improve by the end, the temporary decline isn’t necessarily a problem—it’s just part of the process.
  • Provide Support During Dips
    Use mid-programme surveys as an opportunity to check in with participants. For example, if mental health scores decline, it could signal a need for additional support, like a reminder of available resources or a session focused on self-care.

Should You Adjust Your Survey Design?

It’s worth considering whether your survey questions could better accommodate these natural fluctuations. Here are some options:

  • Ask About Change
    Rather than asking about absolute states (e.g., "How happy are you right now?"), ask about changes relative to their baseline: "How has your wellbeing changed since the start of the programme?"
  • Incorporate Expectation Management
    Add questions that acknowledge the journey, such as:
    "What challenges are you currently facing as part of this process?" or "What progress are you most proud of so far?"
  • Introduce Reflective Questions
    Retrospective questions encourage participants to reflect on their growth:
    "Looking back to the start of the programme, how have your perspectives or experiences changed?"
  • Be Cautious About Over-Adjusting
    While some tweaks can help, you don’t want to obscure the real data by artificially smoothing over dips. Instead, use communication and interpretation strategies to contextualise fluctuations rather than trying to eliminate them.

In summary

Progress isn’t linear - and that’s okay. Survey responses that show declines mid-programme don’t necessarily mean something’s gone wrong. Instead, they often reflect the ups and downs of real personal growth, emotional reflection, or external life circumstances. By normalising these dips, supporting participants through them, and thoughtfully designing surveys, you can ensure that your data—and your programme—accurately reflects the transformative journeys people are on.

Ultimately, growth isn’t about a straight upward line. It’s about where participants end up by the finish.

Related articles

Create a survey by creating the metrics first

Calculate Distance Travelled

5 Stages of Impact: How to Measure Your Project’s Progress on Makerble
In The School of Impact

A practical guide to understanding and tracking outcomes using Makerble’s 5-stage impact framework.

If you’re unsure how to identify, measure or categorise your outcomes, that’s exactly where this article will help.

What are outcomes on Makerble?

  1. Makerble is designed to help you achieve your project’s ambitions.
  2. Makerble does this by organising your project’s ambitions into 5 stages of progress.
  3. Because these 5 stages of progress are measurable, you have an easy way to see how well your project is doing at achieving its ambitions.
  4. Activity and Engagement are things that you can simply count. On Makerble they’re called trackers; elsewhere you might see them described as outputs.
  5. Ability, Behaviour and Growth aren’t things that you can simply count. They need to be defined:
    a. First, write each one as a statement with a direction. Each of these statements is an outcome. For example, “Improve people's knowledge of how to cook a healthy meal”.
    b. Second, choose something you can look at that shows you whether progress is happening. This is called an indicator (or a Progress Tracker) because it indicates whether progress is happening. For example, one of your indicators could be called “Cooking Ability”: you could rate someone’s level of cooking ability on a scale of 1-10 at the start of the project and compare it with their level at the end.

Here’s an example

  • Let’s imagine that your project’s ambition is to decrease obesity by changing people’s eating habits.
  • Your 5 stages of progress might look like this:
  • On Makerble you can measure your progress at each of these 5 stages:
    • Create these 5 stages of progress on Makerble and add them to your project.
    • Then start measuring progress at each stage on your project:
      ◦ Measure Activity and Engagement by entering the number of Classes Provided and using the checkbox to record which participants attended the class that day.
      ◦ Measure Ability by entering each participant’s ability at the start of the project and again at the end, so you can compare the two and see the difference over time.
      ◦ Measure Behaviour
      ◦ Measure Growth
  • Because you have defined your outcomes as statements and added Progress Trackers to them, you can now see the change that’s happened over time. A minimal sketch of this appears below.
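
To make the healthy-eating example above concrete, here is a minimal sketch with invented numbers: the Activity and Engagement stages are simple counts, while Ability, Behaviour and Growth each have an indicator rated at the start and end of the project so the change can be calculated. The indicator names and figures are hypothetical.

```python
# A minimal sketch of the 5 stages with hypothetical data: counted outputs for
# Activity and Engagement, and pre/post indicator ratings for the defined outcomes.

stages = {
    "Activity":   {"Classes Provided": 12},                          # counted (tracker/output)
    "Engagement": {"Class Attendances": 87},                         # counted (tracker/output)
    "Ability":    {"Cooking Ability (1-10)":      {"start": 3, "end": 7}},
    "Behaviour":  {"Healthy Meals per Week":      {"start": 1, "end": 4}},
    "Growth":     {"Self-Reported Health (1-10)": {"start": 4, "end": 6}},
}

for stage, indicators in stages.items():
    for name, value in indicators.items():
        if isinstance(value, dict):                                  # outcome with an indicator
            change = value["end"] - value["start"]
            print(f"{stage}: {name} changed by {change:+d}")
        else:                                                        # output you simply count
            print(f"{stage}: {name} = {value}")
```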

See it for yourself

Surveys for every sector and scenario

360° Staff Surveys
Diversity Equality Inclusion
Staff Satisfaction
Event Feedback
Course Feedback
Training Evaluation
Outcome Assessments
Mental Health
Medical Assessments
Longitudinal Research

Flexible, insightful and made to scale with you

Your branding, your experience

Launch surveys that reflect your organisation’s identity. Customise everything from domain to design, creating a smooth, trusted experience for respondents.

Track progress over time

Whether you’re surveying once or at regular intervals, Makerble makes it easy to compare trends, spot changes and measure longitudinal impact across individuals or groups.

Turn insights into stories

Automatically generate reports and dashboards that bring your data to life. See trends at a glance and download custom reports in a couple of clicks.

Advanced logic, simple setup

Use skip logic, branching and multiple question types without needing a developer. Design sophisticated surveys that stay intuitive for your team and your respondents.

Understand your audiences

Group and compare responses by demographic criteria. Filter your dashboards to reveal what matters to different cohorts.

Insight, wherever you are

Survey offline and on the go

Use tablets and smartphones to record survey responses even without an internet connection - all results sync to the cloud automatically when you're back online.

Right survey. Right person. Right moment.

Automate survey deployment based on people’s journeys

Set rules that trigger surveys automatically - when someone joins a programme, attends an event or reaches a key milestone. Initiate follow-ups, send reminders and personalise survey journeys to keep participants involved throughout.

Pick up where you left off

Import previous survey responses with ease

Bring your survey history with you. Makerble lets you upload legacy data from spreadsheets or other platforms - and even link it to individual profiles in your CRM. That means no disruption, just continuity and deeper insight.

One platform for every survey you’ll ever send

Makerble Surveys adapts to your context, whether you're gathering feedback or conducting policy-influencing research.

Create benchmarks and baselines

Compare responses to historical data or sector averages. Understand where you are now - and where you're heading.

Mix quant and qual

Combine closed questions with journaling, photos, and quotes. Capture data that tells the full story, not just a number.

Visualise strengths and gaps

Bring 360° survey results to life with interactive charts that show shifts over time, different people's perspectives and average responses across a cohort.

What customers say

Open the door to new funding

“So (due to using Makerble) we’re able to say that actually, we’re supporting a far wider number of people. And that has enabled us to reframe our offering for funding and attract funding for areas we’ve not been funded for before”

Andrew Ainsworth, Operations Manager, Vintage Vibes

Change the world faster

“Makerble's in-built Impact Analytics allow us to look at our programmes through the lens of every datapoint – stats, surveys, stories, even other systems – so we can better understand the impact our programmes have on the people we work with.”

Tom Beaumont, Operations Director, Christians In Sport

Inspire partners to share progress

“We went with Makerble because of the flexibility - particularly with participant and activity surveys which we can do on an iPad, on the bus on the way back from an event or over the phone; rather than on paper forms. Now there’s no paperwork.”

Amy Harrison, Head of Community Partnerships, Bristol Climate & Nature Partnership

Integrations

Makerble integrates with Mailchimp, Google Drive, Google Calendar, Gmail and more. You also gain access to Makerble’s open API, which gives you the freedom to connect your data to your own software applications.

Open API
Facebook
LinkedIn
WordPress
Joomla
GoDaddy
Squarespace
Webflow
Wix
PowerPoint
PDF
Microsoft Word
Tableau
Power BI
Outlook
Slack
Trello
Mailchimp
Google Calendar
Google Drive
Gmail

Built for scale, designed for control

Assign survey permissions, control visibility and work together across departments and projects. Get consent, anonymise responses when needed and uphold ethical research standards.

Awards

Frequently Asked Questions

Are you GDPR, HIPAA and FDA compliant?
Can I calculate scores based on survey responses?
Can data be exported?
Can staff see different views based on their role?
Can we access the platform on a mobile phone or tablet?
Do you have API services?
Is Makerble suitable for large-scale programmes?
Which question formats do you have?

Contact us on +44 (0) 1225 595594

You're already changing the world.

We're here to help you change it faster.