News
It’s easier than ever to prepare and improve your data with our upgraded Data Quality Tool
25 March 2025
We’ve updated our Data Quality Tool with a new look to improve usability and make the feedback it provides clearer so that it’s easier for you to prepare, improve and submit your grants data.
Our Data Quality Tool is a bit of a workhorse – not glamorous, but essential. This tool knows all the rules of the 360Giving Data Standard and can check files with 50 or 50,000 grants, spotting issues that would be impossible to find with the naked eye and giving feedback on where the issue is located in the file.
Without the ability to check whether the data is following the Standard’s rules properly, we could not support hundreds of funders in publishing their grants in a consistent format. Once data has passed the tool’s checks we know that it can be included in our tools like GrantNav, which allows people to search and filter information about multiple grants, grantmakers and recipients all in one place.
The tool performs this critical function by providing detailed and specific feedback, and depending on the number of issues found in a file, there can be a lot of information to process. While this level of detail is essential, it can be daunting. That’s why, over the past few months, we’ve been working on an upgrade to make the tool more user-friendly by changing the design and making the feedback clearer and easier to navigate.
Doing our homework
This is the first major upgrade to our Data Quality Tool since it launched in 2016. Over the years we’ve made some improvements, like adding new checks and refining the feedback it provides, but the overall look and feel of the tool have remained relatively unchanged.
We had plenty of ideas for improvements: our Publisher Helpdesk team are super users who rely on the tool every day to check data and to support funders preparing and publishing their grants. But before making any significant changes, we wanted to find out what our community of users thought of the tool and of our ideas for improving it.
So we carried out a survey. It told us that while most people found it fairly easy to check files, they had a harder time finding key pieces of information, such as how to submit data using the tool. Similarly, although most said they understood the feedback the tool provides, many weren’t sure how to fix their data when it was invalid.
As well as asking for feedback, we asked people to rank some of our suggested improvements, helping us to decide what we should prioritise.
What’s new in the Data Quality Tool?
If you’re a regular user of the Data Quality Tool, here’s a quick summary of the key changes we’ve made:
Design changes
We have:
- Summarised the tool’s feedback into headline results, so you get a clearer overview of any issues with the file
- Divided detailed feedback into separate tabs to make it easier to navigate
- Given certain pieces of critical feedback higher priority so it’s easier to find important information
- Grouped feedback according to the field or category it relates to so that related pieces of feedback are shown together
- Added a separate “Submit” page with its own URL to make it easier for users who want to submit files to our Data Registry
New checks
We added:
- A check for grant duration to raise awareness of the importance of this data for understanding the length of grants
- More nuanced feedback about location, which recognises when a file includes one type of location data but not others (such as data which includes the location of the recipient organisation but not the location of the grant)
- More specific feedback for common issues the tool identifies when multiple funders appear in a file by accident
We’ve also refreshed our guidance on using the tool.
What’s next for the Data Quality Tool?
This upgrade is a big step towards making the tool more user-friendly and effective for our community of funders. We weren’t able to prioritise all of the improvements we’d have liked to in this round of development, but we’ll be making further enhancements in the coming year. Ideas we’d like to explore include:
- How the Data Quality Tool can give positive feedback to reflect the useful features of the data that people have included in the file, alongside highlighting issues
- The ability for people to download the results, making it easier to work through the feedback outside of the tool or with colleagues
We’re grateful to all the funders who contributed to our user research in 2024. Our next step will be testing the tool with publishers to gather more feedback, informing our development plans over the coming year so that we can improve our support even further.