A quick guide to testing your new grants management system (GMS)

Testing is one of the most important things you can do to ensure your new system meets your requirements and works for your team. In most software implementations, the client is responsible for User Acceptance Testing (UAT), which requires the client to test the new system to ensure it meets the specifications stated in the implementation contract or Statement of Work (SOW). User Acceptance Testing may also be needed after a new release of system upgrades, or after a major enhancement of an existing system. The client is given a period of time to plan, execute, and report issues uncovered during testing. Once all testing is complete, and all identified issues are resolved, retested, and validated, the system is handed off to the client to launch and begin using.

Most clients we work with understand the importance of testing their new system but often find the job daunting and overwhelming. This is understandable - software testing is a practice that requires knowledge and a skillset that come with training and experience. My colleagues and I at Grantbook have guided and supported many organizations in testing their new grants management systems, including spending hundreds of hours conducting testing ourselves. Below are a few tips and how-tos we picked up along the way that can make testing as successful and painless as possible.

Steps for Testing your GMS

1. Assemble your testing team

Testing is a very important training opportunity for your team. As they test the functionality of the new system, your team will gain a deep understanding of how it works. As end users of the system, your staff are also best equipped to assess if and how the system meets the requirements and needs of the organization. With that in mind, we recommend assembling a testing team with representatives from all user groups, in order to ensure, as early in the process as possible, that the system will meet the needs of all stakeholders. To test for user-friendliness and intuitive design, we recommend engaging one or two testers who are less familiar with the processes being carried out in the system. And, lastly, no matter who you recruit to test, a thorough orientation to the system is in order.

2. Determine your testing style

Before you dive head first into testing, we recommend that you take some time to reflect on the amount and types of testing you feel are right for your project, and use that to inform your approach and testing strategy. Much of this will depend on the size of your project and team. For smaller projects you might consider testing as a group, where everyone meets and tests together. For larger teams a divide-and-conquer approach might be best: divide your team into smaller groups or individuals and assign specific parts of the process/system for them to test and report on. No matter what strategy you follow, it always helps to block off a large chunk of uninterrupted time to run through your list of specific steps/clicks for a test case without distractions.

3. Project manage your testing

Begin by determining how much time you have to conduct the testing and how many staff you have to do it. Once you have your timelines planned, establish a process for documenting and tracking test cases and results, and for submitting bugs. In most cases this can simply be a spreadsheet, but you can also use project management tools like Airtable and Trello for test cases, or Freshdesk to track and log tickets. If this is a large implementation, consider professional test management software such as Jira.
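If you go the spreadsheet route, the tracker can start as a simple CSV with a handful of columns. Here is a minimal sketch in Python; the column names and sample record are our suggestions, not a standard:

```python
import csv
import io

# Suggested columns for a lightweight test-case tracker (adjust to your needs).
COLUMNS = ["Test ID", "Scenario", "Steps", "Expected Result",
           "Actual Result", "Status", "Tester", "Bug Link"]

def new_tracker():
    """Return an in-memory CSV tracker with a header row already written."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    return buf, writer

buf, writer = new_tracker()

# A hypothetical failed test case, logged with enough detail to reproduce.
writer.writerow({
    "Test ID": "TC-001",
    "Scenario": "Submit a grant application",
    "Steps": "Log in as applicant; complete form; submit",
    "Expected Result": "Confirmation email generated by the system",
    "Actual Result": "No email generated",
    "Status": "Fail",
    "Tester": "Tester A",
    "Bug Link": "BUG-042",
})
print(buf.getvalue())
```

The same columns transfer directly to a shared spreadsheet or an Airtable base; what matters is that every result records who tested, with what data, and what was expected versus observed.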

If your testing plan includes individuals testing on their own, you will find that holding a regular check-in at a consistent time helps keep everyone on track and address common issues. Follow a consistent format for check-ins and decide what drives the agenda and meeting discussions (e.g., the defect tracker or the test progress tracker). Make sure your team understands the process and knows where and how to share testing results.

4. Plan testing scenarios

Identify the different ways a process workflow can run to completion, including happy and not-so-happy paths. Identify exception scenarios that sometimes occur, along with context on how frequently they occur (e.g., an approver is out of the office on extended vacation, a grantee contact leaves). Also include looking up data or running ad hoc reports not directly tied to administering a grant.
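One lightweight way to capture these paths is a structured list, so each scenario carries its path type and rough frequency and the exception paths can't be overlooked. A sketch, with purely illustrative scenario names and frequencies:

```python
# Illustrative scenario catalogue; names and frequencies are made up.
scenarios = [
    {"name": "Standard grant approval",          "path": "happy",     "frequency": "every cycle"},
    {"name": "Approver on extended vacation",    "path": "exception", "frequency": "occasional"},
    {"name": "Grantee contact leaves mid-cycle", "path": "exception", "frequency": "rare"},
    {"name": "Ad hoc report on declined grants", "path": "happy",     "frequency": "quarterly"},
]

# Pull out the exception paths so they get explicit test coverage.
exceptions = [s["name"] for s in scenarios if s["path"] == "exception"]
print(exceptions)
# → ['Approver on extended vacation', 'Grantee contact leaves mid-cycle']
```

The same catalogue works just as well as rows in your test tracker; the point is to enumerate paths deliberately rather than testing only the happy path.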

5. Create test data

Start by deciding which environment you are testing in. If this is an implementation of a new system, testing in production prior to launch is appropriate. After launch, the right environment could vary depending on your setup and strategy. Once you decide where you will test, you can begin creating test data. Having test data available in the system in advance will make testing run more smoothly and quickly, because you can jump right into testing without having to create records first. As you create test records, we recommend that you follow a naming convention so it's easy to identify test data for clean-up later. We often use tags such as TESTTEST, or a theme to make things fun - we recently created an entire set of testing records using Harry Potter characters and places! Create a test user for each role so that you can validate user permissions. Share a common set of test data with the team to save duplicated effort.
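To show why the naming convention pays off, here is a sketch of generating tagged test records and then finding them for clean-up. The record fields are hypothetical; the tag and themed names follow the conventions mentioned above:

```python
import uuid

TEST_TAG = "TESTTEST"  # naming convention that marks records as test data

# Themed names make test data memorable (and fun); purely illustrative.
THEME_NAMES = ["Harry Potter", "Hermione Granger", "Ron Weasley"]

def make_test_record(name):
    """Build a tagged test record; the tag makes later clean-up a simple filter."""
    return {
        "id": str(uuid.uuid4()),
        "name": f"{TEST_TAG} {name}",
        "is_test": True,
    }

records = [make_test_record(n) for n in THEME_NAMES]

# Clean-up later is just a filter on the tag - no guessing which records are real.
to_delete = [r for r in records if r["name"].startswith(TEST_TAG)]
print(len(to_delete))  # → 3
```

In practice the "records" live in your GMS rather than a Python list, but the principle is the same: a consistent prefix turns clean-up into a single search.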

6. Coach your testers on best practices 

In addition to orienting the testers to the system we recommend that you also coach (and re-coach) testers on how to conduct testing and give good quality feedback. For example, tell testers to include notes on the specific records and roles that were used to test and details about what went wrong rather than broad statements.

The following are the instructions we provide to our clients:

  • Approach testing like running a science experiment; follow a procedure, alter one variable, observe and note the results.
  • Go into testing with a pre-existing notion or hypothesis of what should happen or the expected result. Compare this against the actual results.
  • Test one thing at a time so if the results are not in line with what was expected, you can more confidently assume it was due to the variable that was changed.
  • Document everything! Make sure the steps that you took are noted and repeatable, so others can follow them to the exact specifications to reproduce your results.
  • Test workflows with the test user roles and not as a system administrator. 
  • Know where to find the emails that are supposed to be sent from the system so that you can verify they are going out when they need to. Since you will likely be using fake testing accounts, you will need to find the emails in the system rather than in a (fake) inbox.
  • Try to “break the system” by following unusual process paths. For example: trying to access something your test role shouldn’t be able to, entering invalid data, adding and then deleting data, and leaving fields blank before saving/submitting to check required-field validation.

7. Use helpful tools

Our team has found the following tools very helpful in making testing easier and more time-efficient.

  • Form Filler Chrome extension (to quickly fill in forms with gibberish)
  • Session Box Chrome extension to enable logging in with multiple accounts in the same browser session window (you can also do this with Incognito mode)

8. Manage your bugs proactively

To keep testing focused, we recommend that you identify the most important test scenarios and make addressing bugs that show up in those scenarios your first priority. If you are employing a large testing team with many novice testers, have an experienced person triage reported bugs in order to prevent duplicate bugs from being logged, verify the quality of information provided in the testing log, and identify the appropriate priority of each bug. Lastly, once testing is underway, make sure to monitor for forward progress on bugs - this may involve nudging people to follow up on their actions if there’s been no activity for a while.
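The triage step above boils down to de-duplicating reports and sorting what remains by priority. A rough sketch of that logic, with invented bug reports and a simple exact-match de-dupe (a human triager would also catch near-duplicates):

```python
# Hypothetical bug reports as logged by testers (priority 1 = most urgent).
reports = [
    {"summary": "email not sent on approval", "priority": 1},
    {"summary": "typo on login page",         "priority": 3},
    {"summary": "email not sent on approval", "priority": 1},  # duplicate report
    {"summary": "report export times out",    "priority": 2},
]

# De-duplicate on the summary, keeping the first occurrence of each.
seen, unique = set(), []
for r in reports:
    if r["summary"] not in seen:
        seen.add(r["summary"])
        unique.append(r)

# Most urgent first, so bugs in the key scenarios get fixed before cosmetic ones.
triaged = sorted(unique, key=lambda r: r["priority"])
print([r["summary"] for r in triaged])
# → ['email not sent on approval', 'report export times out', 'typo on login page']
```

Tools like Jira or Trello do this for you with duplicate-linking and priority fields; the sketch just makes the underlying triage order explicit.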

Don't sweat it

Test-driving your new Grants Management System may seem stressful and daunting at first, but it does not have to be. With the right approach, tips, and tools, testing can be an exciting and valuable opportunity for your team to verify that the new system works as it should, while getting to know it well and becoming confident power users right from the start!

Haifa Staiti

Philanthropy Solutions Consultant

Solutions Selection, Grantmaking Best Practices, Empathic Grantmaking

I love working at Grantbook because it allows me to combine my experience in grantmaking with my love for technology. I love that I get to use my 8-plus years as a grant manager to help other foundations get the technology they need to maximize their impact. I am motivated by problem-solving and applying creative and innovative solutions to make philanthropy more efficient and effective.