How do we measure impact and success?
1. Your Impact/Success Checklist
We know we're able to demonstrate impact and success when:
We have found ways to quantify the problem space and how our solution makes a difference
We've explored ways to demonstrate learning outcomes as well as other value metrics like time and money
We are looking to collect and analyze useful data about the use of our idea/solution wherever we can
2. How do we measure impact and success?
When thinking about measuring impact and success, keep your goals and objectives in mind. Consider what your short-term, medium-term, and long-term goals look like.
Example: “In the next 3 months we would like to have our new social media platform product launched and within a year we would like to have 1,000 active users. Our end goal is to attract users from other geographic locations”
When identifying your goals and objectives, make sure they are: 1) realistic, 2) specific, 3) time-bound, and 4) quantitative. In the broader spectrum, your impact or success shouldn't necessarily tie only to your product idea or concept. It could relate to the human resource capacity you build or the amount of seed funding you raise (or lack thereof).
Much of it is about what is successful for you! What are your personal goals in this venture? Did you achieve what you had hoped for? Many of you will be busy with academics in school, so find a balance between the level of commitment you can give and what you consider a reasonable goal.
Additionally, there are some key metrics that generally apply to most venture ideas and that you would want to assess as part of measuring success and impact. See below.
3. How do we quantify our problem?
You can quantify a problem through (1) lost money, (2) lost time, (3) a currently inefficient use of money, or (4) learning outcomes.
Good example 1: New teachers who quit in their first year cost school districts $21,000 per year, so investing anything up to that amount to prevent a teacher from quitting is a win.
Good example 2: Parents of pre-school-aged children have a hard time finding quality, education-focused childcare and waste 6 working days on average comparison shopping.
Good example 3: The median office of career services spends $34,650 per year on tools and resources, but many of the tools are under-utilized by students.
Keep in mind that, in education innovation, we're always talking about LEARNING outcomes. And, since we can't measure learning directly, we have to use proxies combined with sound logic. You can have students take tests and compare your intervention against students NOT using your idea/solution. You can measure how students, parents, and teachers FEEL - some people consider this a confidence or self-efficacy measurement. You can also find ways to quantify indicators that you are onto something of value - number of visits to a clinic, repeat use, time spent using your idea/solution, number of referrals, etc. Regardless of the method you choose, you have to try to remove bias from your data collection process and collect data in a rigorous and consistent way.
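The intervention-vs-control comparison described above can be sketched as a simple difference in mean test scores. The numbers below are entirely hypothetical, and a rigorous evaluation would also need random assignment and a larger sample:

```python
# Hypothetical test scores: students using the solution vs a comparison group
# that did not. All values are made-up illustration data.
treatment = [78, 85, 92, 74, 88]
control = [70, 75, 80, 72, 77]

def mean(xs):
    """Arithmetic mean of a list of scores."""
    return sum(xs) / len(xs)

# A simple first look: the difference in average scores between groups.
effect = mean(treatment) - mean(control)
print(round(effect, 1))  # 8.6
```

Even a rough comparison like this is more persuasive than anecdotes, provided the two groups were chosen consistently and without bias.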
There are other ways to quantify the problem as well (e.g. “Only 59% of college students graduate within 6 years”), but the beauty of using the first three quantifiers is that they help you better articulate why people should be willing to pay for a solution. #1: if people are losing $X, then they should be willing to pay $Y for a fix as long as $Y < $X. #2: if time is money, then lost time = lost money. #3: if people already pay $X, then it should be a no-brainer for them to pay for something that does the same job better. If we apply this thinking to the 59%, we’d end up with a meatier quantifier of “schools lose $100,000 for every dropout.” Notice the difference? 59% is alarming, yes, but 59% is such an abstract number that it’s difficult to pinpoint who specifically would care about it. $100,000, on the other hand, is surely going to be alarming for any college’s Chief Financial Officer. Whereas citing 59% will get you a bunch of nods (but few people willing to open up their wallets to pay for a solution), you can be sure that the CFO would want to speak to you if you have ideas on how to save them up to $100,000 per student.
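The $X-versus-$Y logic above can be turned into a quick back-of-the-envelope check. This is only a sketch; the hourly rate and all figures are assumptions, not data from the text:

```python
# Back-of-the-envelope check: is a proposed price justified by the
# quantified problem? All figures here are hypothetical examples.

HOURLY_RATE = 50.0  # assumed dollar value of one hour of the buyer's time

def value_of_problem(lost_money=0.0, lost_hours=0.0, current_spend=0.0):
    """Quantifier #1: lost money; #2: lost time converted to money;
    #3: money already being spent on an inefficient alternative."""
    return lost_money + lost_hours * HOURLY_RATE + current_spend

def price_is_justified(price, problem_value):
    """A buyer should be willing to pay $Y for a fix as long as $Y < $X."""
    return price < problem_value

# Example: a district loses $21,000 per first-year teacher who quits.
x = value_of_problem(lost_money=21_000)
print(price_is_justified(5_000, x))   # True: $5,000 < $21,000
print(price_is_justified(25_000, x))  # False: costs more than the problem
```

The point is not the arithmetic itself but the framing: every quantifier you choose should let a specific buyer compare your price to a dollar figure they already feel.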
Questions to ask yourself:
Have we thought about ways to demonstrate our idea improves learning?
Have we identified other metrics to demonstrate our solution's value - such as time and money?
4. How do we know if the launch was a success?
It’s all about the data. For pre-launch, you need to identify the most meaningful data points (see examples below) and put procedures and tools in place to track them consistently. For post-launch, here are some options for data tracking and for determining whether you are making progress toward your mission and goals.
For a physical product, e.g. a board game or learning cards, you should track the # of online and in-store sales transactions by vendor, by geography, by date/season, and by customer type if purchasers provide information at the time of sale. Also track the # of customer issues, the # of returns, and the reasons for return or replacement. Over time you can also assess % market share, positioning in relation to competition, etc.
For a digital product, e.g. a mobile application or video game, you could track on the front end (related to the targeted end-user activity): user logins, # of clicks, # of shares, functions used least/most frequently, duration of use, user login frequency, and # of user errors. You can even watch for positive product reviews on Yelp, rankings on product sites such as https://www.producthunt.com/, etc. On the application backend (the support/technical maintenance side of the solution) you could track the # of downloads, # of system errors, device types, # and timing of purchase transactions by geography, activity related to specific promotions/campaigns, and user retention.
For a program, e.g. a new SEL curriculum for middle school students, you should track the # and timing of general sales, the # of pilots that converted to a sale, the # of sales transactions by timing and geography, and the # of subscription renewals. If the curriculum/teacher guide is distributed from a website, track the # of downloads, the # of teachers using the curriculum, and class sizes/# of students if provided with subscription sign-on.
For a platform, e.g. a community website for teachers to share lesson plans and learning assets, you could track on the front end (related to the targeted end-user activity): user logins, # of clicks, # of shares, functions used least/most frequently, duration of use, user login frequency, # of user errors, # of content downloads, # of content uploads, # of discussions, and the range of topics discussed. On the platform backend (the support/technical maintenance side of the solution) you could track platform responsiveness/uptime.
Over time you can also assess % market share and positioning in relation to competition, and use this data to make decisions about scaling.
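Several of the usage metrics above (logins, login frequency, user retention) can be computed from a simple event log. A minimal sketch, assuming you record one (user_id, login_date) row per login; the users and dates are hypothetical:

```python
from datetime import date

# Hypothetical event log: one (user_id, login_date) record per login.
events = [
    ("ana", date(2024, 1, 1)), ("ana", date(2024, 1, 8)),
    ("ben", date(2024, 1, 2)),
    ("cam", date(2024, 1, 3)), ("cam", date(2024, 1, 9)),
]

def active_users(events, start, end):
    """Users with at least one login between start and end, inclusive."""
    return {user for user, day in events if start <= day <= end}

week1 = active_users(events, date(2024, 1, 1), date(2024, 1, 7))
week2 = active_users(events, date(2024, 1, 8), date(2024, 1, 14))

# Week-over-week retention: share of week-1 users who logged in again
# during week 2.
retention = len(week1 & week2) / len(week1)
print(f"{retention:.0%}")  # 67%: ana and cam returned, ben did not
```

In practice an analytics tool would do this for you, but the definition matters more than the tooling: decide up front what counts as "active" and measure it the same way every week.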
Questions to ask yourself:
How do we define success for our venture?
What does success mean for our customers? users? buyers?
How do we measure impact over time?
Resources at Harvard
Meet with a SMART Fellow!
Harvard Business Review Article: The Hard Work of Measuring Social Impact
Social Innovation and Change Initiative, Harvard Kennedy School of Government
Harvard Innovation Labs - look out for social impact workshops!