I have now been with Foundant about nine months after working six years in the philanthropic community. When people ask how I like it, there’s only one answer: I love it. Even though I’ve made my exodus from the nonprofit world, I am so lucky to have more than two thousand Foundant clients (over 100 on my personal CSM roster!) to keep me connected to the sector I love. A few months ago, I wrote about my experience with organizational evaluation and learning . . . that continual improvement process to make sure we’re getting to the highest impact as grantmakers. A few months after that article was published, I had the pleasure of organizing a client panel at our biannual users’ Summit to ask a couple of Foundant clients what evaluation and learning mean to them. The panel blew me away. They were gracious enough to allow me to share how their organizations view evaluation and learning!
Meet Amy Nossaman from the Ottumwa Regional Legacy Foundation. Amy, with her self-described “analytical” mind, saw the need for a new grant management system and a new way to pull data from applicants and grantees. For Amy, the grant management side of things is great for overhauling the systemic way Ottumwa manages grants; that said, there’s still a very human element to grantmaking. Amy quickly put as many trust- and relationship-building tools in place as she could to help nonprofits in her area write better grants and build their capacity:
- Meeting with nonprofits in her area (with no agenda, just a conversation!).
- Offering assistance in reading application drafts (with the emphasis that she does NOT evaluate grants, but she has a sense of what her grant committee is looking for).
- Even offering assistance with OTHER funder applications.
- Reducing the reporting burden – replacing the 6-month interim report with a phone call check-in.
- Does a grantee need to return the money? Amy thinks grant refunds are a pain (she’s not wrong there!) – so instead, she works with the grantee to find other ways to expend the rest of the grant award.
One of the most powerful things Amy said was, “I like to get a pulse on the nonprofits – what keeps them up at night? Are we meeting your needs from a foundation standpoint? If you’re not getting what you need, ask for it. This is setting me up for the ‘moving the needle’ work my board is asking of my foundation.”
Next up was Kristen Summers from Saint Luke’s Foundation. I first met Kristen in my grantmaker days when we both did a Foundant education webinar called “The Funder’s Role in Collecting, Tracking, and Using Data.” She was great in the webinar and she was equally great in the client panel!
Kristen found that some of the things the foundation wanted to know were hard to find; that information was never asked of the applicant on the application or in any follow-up reports. It’s kind of hard to report on data that isn’t there! They utilized some outside tools like Project Streamline and CEP surveys to assist them in assessing their grantmaking process. One of Kristen’s biggest takeaways? “We need to be more intentional about what we ask in our application questions. We asked a lot of ‘nice to know’ questions but weren’t asking ‘need to know’ questions.”
Some other changes they made were:
- Broke some compound questions into separate questions – this makes the applications longer, but as a funder, they receive more actionable data. (Also, can we just shout out Kristen here for that fantastic term? Actionable data?!)
- Correlated application data with evaluation data – making sure questions and follow-up questions are asked in both places.
- Included a lot more instructional text on questions, with more formatting and more detail on what answers they were looking for.
Kristen mentioned this on our webinar together, and I remember literally saying “OOOOOO” out loud when I heard her talk about it . . . she helped set up fields in Grant Lifecycle Manager (GLM) – their grant management software – so program officers can document the lessons they or their applicants have learned on follow-up forms. They can then check a “lessons learned type” to tag this information, essentially turning GLM follow-up forms into a “lessons learned library.” How very cool!
Kristen ended her talk with this: “It’s a lot of iterative, continuous improvement. It makes work easier and harder at the same time. Change is 80% culture, 20% technology.”
Linsey Sauer from The Russell Family Foundation wrapped up our panel. Linsey and The Russell Family Foundation stress having better relationships with grantees, so much of their evaluation and learning comes from bettering the applicant experience. They seek to collect immediate feedback, including asking the question, “How long does it take you to fill this out?” directly on the application form. (What a GREAT idea!) Linsey and the program officers at Russell talk about the choice between asking for certain information on the application OR emphasizing the relationship with the applicant and calling them to get the information that way instead. Understanding that nuance of “is this a question for the application or a phone call?” is an art.
Linsey also implemented some time-saving efficiencies. One was reducing financial reporting based on grant size: using Question Branching in GLM, applications for grants under $20,000 don’t ask for as much detailed information as those above $20,000. This practice cuts down on the burden for program staff, applicants, and Linsey – who does the financial reviews.
Linsey also stresses the importance of offering applicants the opportunity for anonymous feedback, something the foundation is looking at implementing in the next year. With truly anonymous feedback, applicants can feel free to offer honest opinions without fear of being penalized for their critiques.
I am so appreciative of these three ladies sharing their experiences in evaluation and learning – whether that be looking introspectively at foundation policies and procedures, or simply listening to their applicants and grantees and finding ways to ease the grant writing burden. Want to try implementing evaluation and learning at your foundation? I think when it comes down to it, it’s really this simple recipe:
- Ask questions
- Rinse and repeat