
Customized Reporting: One Size Does Not Fit All

Funder Profile – Qatar Foundation International

Interview with Kelly Doffing, Program Officer

Kelly Doffing is a Program Officer with the Arabic Language and Arab Culture Program at Qatar Foundation International. She holds a master’s degree in Arabic from the University of Maryland, College Park and completed the Graduate Arabic Flagship Program. She has worked as an Arabic teacher, administrator, and translator in the United States and Egypt. Her interests include expanding opportunities for Arabic learning and improving the quality of Arabic language instruction.

Qatar Foundation International (QFI), LLC, is a U.S.-based member of Qatar Foundation (QF). QFI is a grantmaking organization and a convener of thought leaders on issues related to global and international education, open education, and education technologies. Its programmatic areas are Arabic language and Arab culture, STE{A}M (STEM plus the Arts), and Youth Engagement. QFI inspires meaningful connections to the Arab world by creating a global community of diverse learners and educators, connecting them through effective and collaborative learning environments.

How does Qatar Foundation International think about reporting – what is its raison d’être?

We think about reporting in a number of different ways. Of course, we look for fidelity to the proposed project – we want to make sure the grantee did what they said they’d do. And, we want to know how it went. We also always look at the financial side, to make sure the funds were spent appropriately.

Beyond that, we want to learn and we want to help grantees learn. We fund similar projects across a number of grantees, and so we use reports to help us choose, guide, or advise other grantees. Reports help us share how common challenges have been addressed and allow grantees with similar projects to learn from one another. Program officers are the primary users of reports for these purposes.

With that said, we are also always looking for good stories as well as ways to share the good work done by our grant recipients. QFI is a bit unusual, because we work with 501(c)(3) groups, but we also work with individual students, teachers, and even for-profit organizations.

What happens to a report when it is received?

Because our grant cycles and deadlines run at various points, we are getting reports in all the time. So, when we get reports, we try to read and clear them immediately, so the reports do not just “sit” in our system.

The reports are received by a program officer, who does an initial review and, if it is an interim report that triggers contingencies or additional payments, notes those. Otherwise, the program officer pulls out potential stories for the communications team and identifies impact and beneficiary numbers for our staff. We review financial reporting and check budgets to ensure funds have been spent according to the agreement. If things look good, then the program officer sends an email to the grantee to say the report has been received and includes any follow-up questions.

How has technology helped Qatar Foundation International to learn from reports?

We use an online system for grantees to submit reports. Program officers associated with each grant read and respond to every report, and within our system we built in a step that asks, “Would anything in this report be interesting to the communications team?” While the communications team might not be interested in all the specifics, financial details, and other data that a program officer needs, we know there could be one piece or an anecdote that should go to communications.

The system allows the program officer to very easily cut and paste sections and send them automatically to a communications database. So, when they go looking for stories, they have a place to start, with enough information to go back to the program officer, or to go directly to the grantee as long as the program officer gets a heads up.

Can you share an example of how your reports are being used by grantees, to help them learn?

The individuals with whom we work take a lot of pride in being profiled by us; our profiles are often included in students’ college and internship applications. Teachers have told us that they keep our profiles in their professional portfolios when seeking promotions and jobs. Stories based on reports help QFI to expand our reach, amplify our mission, and attract future applicants.

We are especially excited about one approach we’ve used for recipients of professional development grants for teachers. To help social studies, science, math, and other subject teachers integrate Arab world topics into their teaching, we support them in attending professional development opportunities.

Instead of asking just for a narrative report or proof that they attended, we ask for a lesson plan based on what they learned in the training. In this way, they are applying what they learned and providing us with a product they’d create anyway. It’s useful for both QFI and the grantee. Even better, we place the lesson plans on an open-source website for educators, so other teachers can access these materials for free. We are building this great bank of lesson plans for a community of teachers.

Has your approach to reporting evolved over the life of the foundation? How so, and why?

I started at QFI about five years ago, and at that time, we used one set of reporting guidelines for all our programs – about five or six at that point. Then, as we expanded to twelve grantmaking programs and looked at our reporting requirements, we saw that some questions were just overkill, some were outdated or irrelevant, and, most importantly, a lot of the information we required was not actually useful for our analysis.

This prompted us to start revising the forms, trimming them down, and adding more intentional and specific questions that related more closely to what we needed to learn. During this time, we also shifted our grant agreements so that, from the start, grant recipients would have a better understanding of our expectations. We customized the documents to the actual program – no longer “one size fits all.”

From there, it was an easy jump to tweak the reporting based on the type of grant and its size and scope, because we want teachers in the classroom teaching, not spending time filling out our reporting forms. We started distinguishing reporting by program, based on what we wanted to learn and what we knew about grant recipients.

We found that if the report did not ask for specific participant information, grantees might not give us that information. Because the foundation wanted to report specific data, we needed to ask for specific data – like demographics, numbers of K-12 students served, and numbers of participants in professional development programs.

For renewable and larger grants, we need more information because we will use it to make decisions. Those large grants require more frequent reporting – as well as an after-funding site visit. As an aside – or maybe not an aside at all, because this is really a part of our reporting – we conduct after-funding site visits to schools or other institutions running observable programs. After those site visits, staff completes an online form capturing what has been learned from observations and meetings. Those in-person site visits yield a lot of great information that we would never get in a more formal written report.

Can you share an example of how you’ve customized reporting requirements to fit grant programs?

For smaller individual-focused programs, we’ve flipped our reporting requirements completely. With students, we ask them for a blog post that is immediately externally facing – our communications team does just a slight edit for clarity, but the students know that what they write will be public and will serve future applicants and recipients. Our prompts might vary, but generally, we ask students to tell us about their experience: What did you like? What was a challenge? What surprised you? We give them a 300-500 word limit and ask for a couple of photos to use with the blog.

We get some students who really focus on the academic aspects of, say, a summer study experience. Others focus more on the friends they made. The variety is OK, because across all the blogs, readers (and we, as staff) get a fuller, more realistic sense of what recipients are getting out of these programs. We’ve been pleasantly surprised that students are very honest and offer both their positive experiences and the experiences that were more challenging.

For honest feedback, it helps that they are students. Young people tend to be really honest! But, also, it’s important to note that there is nothing riding on these reports. They’ve already received the award and we urge them to be honest.

How do individual reports help you assess entire programs, with such diverse strategies and goals?

Individual reports tell us about individual grants, but we make sense of our grants and what we are learning through conversations with other program officers. When we are about to enter a new grantmaking cycle or are looking at a larger strategy, that’s when we pull common themes and trends together.

We ask ourselves: do the reports, taken together, show us any common concerns or solutions to common problems? Program officers keep in mind the overarching goal of promoting Arabic language programs in schools. We are always looking to see how, even among all these very different schools in very different contexts, there are commonalities and overarching lessons.

The program staff is responsible for this analysis, so I know when I read a set of reports, I have these larger questions in mind. We tweak programs and grants almost every year based on what we learn from reports.

How would you set up a reporting system from scratch if you had the opportunity?
  1. Utility. Ask yourself: what is the usefulness of what we are asking from grantees? Do we need to know, or are we asking for the sake of asking? If the latter, then let it go!
  2. Always balance the value to the foundation with the burden on the grantee. Is the burden commensurate with the amount of the grant? Or, with how much of the total project the foundation is actually funding? If the foundation supports a large percentage of a project, it makes sense to track it closely. Alternatively, if the foundation is supporting a tiny percentage, why ask for a huge report?
  3. I’d also look at flexibility: can reporting be tailored or adapted to accommodate the different kinds of programs we support?
  4. Look to PEAK Grantmaking for advice and new ideas! At the annual conference last year, I learned so much about streamlining the application and reporting process. It really prompted me to come back and look at the programs I manage, and to urge my colleagues to do the same. Previously, we might have incorporated that lesson plan approach into our reporting requirements for teachers, but we probably would have also asked for a traditional narrative. After speaking with colleagues, we just let that go.