This past spring a client dedicated significant resources to responding to a national solicitation for STEM education funding. Two months into the proposal-writing effort, sequestration forced withdrawal of the funds. As organizations lose faith in the national government's ability to respond to STEM funding needs, they increasingly turn to local corporate, foundation, and government sources. How do these institutions evaluate and select among the many applications they receive?
The FY2014 federal budget requests about $3 billion for STEM education. According to Rita Ferrandino of Arc Capital Development, a private equity and advisory firm specializing in education, despite this sizable investment, the U.S. lags behind peer nations in its funding of STEM education as a share of GDP. At the same time, Ferrandino notes that over the last 10 years private investment in education has grown. Regardless of their motivations for investment, corporations, foundations, and private investors want to support and invest in effective education efforts. To do this, they increasingly expect recipients to provide data demonstrating program efficacy.
National funding for STEM education eclipses state efforts. For example, in FY2013 Iowa allocated $4.7 million to STEM education and Colorado disbursed about $1.6 million. In contrast, the Saint Vrain Valley School District (Longmont, CO) received over $16 million from the federally funded Race to the Top program for a K-12 STEM education initiative.
State Funding Efforts
While awaiting the end of sequestration and approval of the FY2014 budget, STEM programs turn to local resources to fulfill their budget needs. Last year the state of Iowa received approximately forty requests for STEM education funding, of which about 25% were funded to meet Iowa's scale-up needs. Jeff Weld, executive director of the Iowa Governor's STEM Advisory Council, explains that "to separate the wheat from the chaff," his team used components of the Design Principles Rubric. The Rubric is a tool produced by Change the Equation (CTEq) to improve decision making by philanthropic entities. According to Weld, the Rubric allowed the Council to quickly separate programs of promise from those less suitable for the state.
The Rubric is not the only tool employed by states to select STEM education programs for funding. In the spring of 2013, along with about thirty other volunteers from the Colorado STEM education community, I acted as a reviewer for the STEM in Action program, which awarded about $250,000 each to four of twenty-nine applicants. Although we did not use the CTEq Design Principles Rubric, I felt that the review process was rigorous. Individually and in teams of three, we rated five proposals against an assessment tool aligned to the specific solicitation. To ensure parity across teams, staff from the department of education led all the volunteers through an evaluation of one proposal, modeling the process, allowing discussion, and providing clarity throughout.
Initially developed for corporate philanthropy, the CTEq Design Principles Rubric has since been adopted by many corporations and foundations. Version 3.0 provides funders with a list of ten criteria with which to assess the status and capacity of existing STEM learning programs. In addition to setting out key criteria, the Rubric supplies examples of the type of evidence a program should supply to demonstrate that it meets the criteria. Based upon the evidence provided, funders then decide whether the organization is accomplished, developing, or undeveloped with respect to each of the ten criteria. Claus von Zastrow, COO and Director of Research at Change the Equation, explains that the Rubric was developed in response to corporate requests for a tool to increase the impact of their philanthropic efforts. Once developed, the Rubric was pilot tested by an independent evaluator to refine the criteria and demonstrate the efficacy of the Rubric itself. Flip the intention around, and program staff may use the Rubric to demonstrate the efficacy of their program to potential funders, perform an internal self-evaluation, or rate their likelihood of receiving funds when held to a rigorous review. Moreover, once around the table, the Rubric acts as a centerpiece for structuring and focusing conversations between corporate funders and education partners. Von Zastrow hopes that funders will use the Rubric not only to identify the most promising education efforts but also to support programs as they grow and change to remain relevant and effective over time. In the meantime, Change the Equation has worked with WestEd, a nonprofit research organization, to create STEMworks, a database of programs that meet the Design Principles articulated by the Rubric.
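To make the three-level rating scheme concrete, here is a minimal sketch of how a funder or program might tally a rubric-style review. The criterion names below are illustrative placeholders, not the actual CTEq Design Principles; only the three rating levels (accomplished, developing, undeveloped) and the count of ten criteria come from the Rubric as described above.

```python
# Hypothetical tally of a rubric-style review. Ratings map to points so a
# program's standing across criteria can be summarized in one number.
RATINGS = {"accomplished": 2, "developing": 1, "undeveloped": 0}

def score_program(ratings_by_criterion):
    """Return (total points, maximum possible) for a set of criterion ratings."""
    unknown = [r for r in ratings_by_criterion.values() if r not in RATINGS]
    if unknown:
        raise ValueError(f"unknown rating(s): {unknown}")
    total = sum(RATINGS[r] for r in ratings_by_criterion.values())
    maximum = 2 * len(ratings_by_criterion)
    return total, maximum

# Placeholder criteria for illustration; the real Rubric defines ten.
ratings = {
    "evidence of outcomes": "accomplished",
    "capacity to scale": "developing",
    "sustainability plan": "undeveloped",
}
total, maximum = score_program(ratings)
print(f"{total}/{maximum}")  # 3/6
```

A real review would weigh the evidence behind each rating rather than reduce it to a score, but a summary like this makes it easy to compare many applicants at a glance.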
Scott Fast, Executive Director of the Accenture Foundation, explains that when weighing philanthropic options, industry takes into consideration corporate needs, long-term impact, and the employee experience. About the employees, Fast notes that corporations often select local and national efforts that make the staff feel proud, excited, engaged, and willing to donate their time. This may lead businesses to seek "signature" programs, those that demonstrate measurable success and wide local or national support. Once on board, a company may want exclusive rights to support a program to demonstrate its philanthropic efforts. This works well for a few proven programs but may hurt the potential for innovation beyond them. In response, Iowan Jeff Weld explained that his Council may ask the state to set up a separate funding source designed to cultivate new and innovative approaches to STEM education, those efforts that have yet to acquire the foothold or data needed to meet Rubric criteria.
Crunch the Data
Weld and Fast both emphasize the importance of data. Fast notes that corporations make decisions based upon data and expect their education partners to do the same. Fast, who has watched corporate philanthropy change with time, explains that donations and support now come with expectations of specific outcomes aligned to stated goals and proven measurement methods. Discussions about the nature and depth of data collection expected and performed should be part of the conversation between donors and recipients. As a result, organizations receiving grants need to anticipate and set aside time and funds (about 30% of the award) to perform evaluation. Evaluation may require hiring a third party to perform quantitative and qualitative research: gathering baseline data, providing formative and summative feedback, and reporting on the change in abilities and attitudes of students and/or teachers toward STEM teaching and learning.
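The budgeting implication of that 30% guideline can be sketched in a few lines. The $250,000 figure reuses the award size mentioned earlier for the STEM in Action program; the 30% rate is the rough rule of thumb above, not a fixed requirement.

```python
# Rough split of a grant award into evaluation and program budgets,
# assuming roughly 30% is set aside for evaluation as suggested above.
def evaluation_set_aside(award, rate=0.30):
    """Return (evaluation budget, remaining program budget) for an award."""
    evaluation = award * rate
    program = award - evaluation
    return evaluation, program

evaluation, program = evaluation_set_aside(250_000)
print(f"evaluation: ${evaluation:,.0f}, program: ${program:,.0f}")
# evaluation: $75,000, program: $175,000
```

In other words, a program planning around a $250,000 award should expect only about $175,000 to reach direct programming once evaluation is properly funded.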
Corporations, foundations, and states will continue to feel pressure to fund STEM education, taking up the slack as federal efforts remain unclear and sequestration remains in effect. With increased requests, funders will expect more impact for their investment, requiring data and data-driven decisions from their education partners. Programs demonstrating lasting and effective change will likely receive more, and more consistent, funding as long as they have the data to back up their approaches and actions.