At the start of the 2015 school year, The Moriah School in Englewood, NJ (“Moriah”) launched a series of parent and employee surveys. The purpose of the surveys was to generate real data on the issues the school faced, from core issues such as tuition, morale, and curriculum to non-core but equally important issues such as lunch and carpool. The goal was to prioritize our agenda, decisions, and resources based on data rather than on noise and anecdotal issues.
One of the first decisions we made was to use SurveyMonkey for our data collection and analysis. We realized quickly that we required the paid subscription as we expected to receive more than 100 responses. SurveyMonkey is a great tool, and allows skip logic so that, for example, if you answer that you do not have a child in the Early Childhood program, it skips the section on Early Childhood and jumps to ask if you have a child in Lower School. It also has functionality to aid in analysis and presentation of results. The functionality you need for school surveys will cost around $350/year.
The surveys were run by a committee of lay-leaders (parents) who reported to the President and Chairman of the board. The surveys were shared with the Head of School and his direct reports for their feedback prior to publication. Surveys were anonymous, and no effort was made to associate answers with any individual. Parent surveys were open to both parents, and we turned on the SurveyMonkey setting that limits responses to one per IP address.
A presentation of the survey results was shared with the Board of Trustees and the Head of School, who shared the summary results with his direct reports. The complete written responses were shared only with the Head of School and not further disseminated.
The results of the teacher surveys were presented in their entirety at an annual town hall meeting with teachers. The teachers were initially skeptical that we would share the full results, and as a group they were particularly appreciative of the transparency we displayed in the presentation. In our school, the parent results were not shared in full with the parents. This was because we did not have an appropriate forum in which to share them, not because we were withholding any negative outcomes.
We were interested not only in creating a survey to test a single point in time, but to create a model for how we would measure our performance over time. We therefore put a lot of time and thought into the questions that we asked, and the format of the answers. For most questions, we offered statements to which the respondent could select from the following responses:
|Strongly Disagree||Somewhat Disagree||Neither Agree nor Disagree||Somewhat Agree||Strongly Agree|
We found it important to offer a balanced “Likert” scale for responses, in which the opportunity for a positive response equals the opportunity for a negative response. This removes some of the implied bias in the solicitation of feedback and can provide a more meaningful result when you compile your data into numerical results. When we compiled results, we grouped the positive columns together and the negative columns together to get a better sense of how the sentiment was leaning. An example of the grouped results is below.
| |Disagree|Neither Agree nor Disagree|Agree|Total|Weighted Average|
|---|---|---|---|---|---|
|My child has effective teachers|2.78%| | |108|2.94 (out of 3)*|
*Note – when you group the responses into three categories, the numerical value changes from a 5 scale to a 3 scale.
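The grouping arithmetic can be sketched in a few lines. The five-column response counts below are hypothetical, chosen only so that the grouped output matches the figures in the table above; they are not Moriah's actual data:

```python
# Hypothetical 5-point Likert tally for one statement, from
# Strongly Disagree (1) to Strongly Agree (5). Invented numbers.
responses = {1: 1, 2: 2, 3: 0, 4: 45, 5: 60}  # 108 responses in all

# Collapse to the 3-point grouping: Disagree / Neither / Agree.
grouped = {
    1: responses[1] + responses[2],  # Strongly + Somewhat Disagree
    2: responses[3],                 # Neither Agree nor Disagree
    3: responses[4] + responses[5],  # Somewhat + Strongly Agree
}

total = sum(grouped.values())
percentages = {k: 100 * v / total for k, v in grouped.items()}
# Note the scale change: the weighted average is now out of 3, not 5.
weighted_avg = sum(k * v for k, v in grouped.items()) / total

print(f"Disagree: {percentages[1]:.2f}%")           # 2.78%
print(f"Total: {total}")                            # 108
print(f"Weighted average: {weighted_avg:.2f} / 3")  # 2.94
```

SurveyMonkey performs this grouping for you, but doing the arithmetic by hand makes the footnote's point concrete: collapsing the columns changes the scale of the weighted average.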
Deciding what questions to ask and how to frame each question was the most challenging part of the survey development. It is important that you create a survey around a largely repeatable group of questions for use over time. This will allow you to create a baseline, and then to track your progress against the baseline over time. If you change the questions significantly each year, you will not be able to easily track your progress against your baseline.
The development of the questions takes significant thought and discussion with key stakeholders. We reached out to parent liaisons who regularly field comments from parents; this group typically receives what we call non-core feedback on administrative or non-educational matters. We also took input from the administration, that is, the leadership of the early childhood program, lower school, middle school, and special services. Finally, we enlisted a parent volunteer who works in the field of market research to help us frame the questions and responses. We had multiple testers do a dry run of the survey before we published it; all completed it in approximately 10 minutes.
This was the overall structure of our survey:
Early Childhood/Lower School/Middle School
We also tested questions that we determined to be non-core; these were hot topics at the time and may or may not be repeated in future surveys. They included dress code, the lunch program, cell phone policy, facilities, communications, administration and board leadership, and tuition sensitivity and value. It is also important to test these questions with parents of students in different age groups, as you will want to know where responses vary significantly across the “generations” of students and parents. For example, we found that parents of middle school children were clearly against the concept of a school uniform, while parents of early childhood children were clearly in favor of it. We would have to test this question over time to see how the same group of parents responds as their children age through our school.
We asked a Net Promoter Score (NPS) question, which is derived by asking how likely you would be to recommend us to a friend. Significant research has been done into this single question as a barometer of the health of your key customer relationships. Do not forgo the opportunity to ask this.
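For reference, the NPS arithmetic is simple. Assuming the standard 0–10 scale (promoters answer 9–10, detractors 0–6), the score is the percentage of promoters minus the percentage of detractors; the ratings below are invented for illustration:

```python
def net_promoter_score(ratings):
    """NPS from a list of 0-10 'likelihood to recommend' ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    toward the total but neither add nor subtract.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Invented sample: 5 promoters, 3 passives, 2 detractors.
sample = [10, 9, 9, 8, 7, 6, 10, 9, 3, 8]
print(net_promoter_score(sample))  # 30.0
```

The score ranges from -100 (all detractors) to +100 (all promoters), which makes it easy to track year over year against your baseline.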
We offered free-form text boxes on nearly every question, giving respondents an opportunity to leave additional comments. This yields additional insight into each question, but the responses are very difficult to categorize and analyze. I recommend soliciting free-form feedback only where it will be most useful, rather than everywhere. We also provided two questions at the end of our survey which were purely free-form: what are two things you like most about Moriah, and what are two things you would want changed? We got a lot of interesting responses, but again, they are difficult to categorize. SurveyMonkey now offers word cloud functionality, so even without a scientific analysis you can create a meaningful discussion with your stakeholders about popular responses.
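Short of a full text analysis, a simple word-frequency count gives a rough first pass over free-form answers, much as a word cloud does. A sketch using Python's standard library (the sample comments are invented):

```python
from collections import Counter
import re

# Invented free-form responses for illustration.
comments = [
    "Smaller class sizes and better communication",
    "Love the teachers, but the lunch program needs work",
    "Better communication about the lunch program",
]

# Words too common to be informative; extend this list as needed.
stopwords = {"the", "and", "but", "about", "a", "of", "to"}

words = Counter(
    w
    for c in comments
    for w in re.findall(r"[a-z']+", c.lower())
    if w not in stopwords
)

# The most frequent words point at recurring themes.
print(words.most_common(4))
```

This will not replace reading the comments, but it surfaces recurring themes ("communication", "lunch") quickly enough to seed a discussion with stakeholders.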
The staff survey tested key areas of employee satisfaction.
Similar to the parent survey, we also asked some free form questions at the end: what is the best change you have seen in Moriah in the past two years, and what would you most like to see improved for next year?
We ran both the parent and staff surveys in the first year of our work, but we have since moved to an alternating schedule: each year we send either the parent survey or the staff survey, so each group is surveyed every two years. This prevents survey fatigue and leaves enough time between surveys for the administration to implement changes that can move the results by the next survey two years later.
The results of our surveys surprised us in some areas and confirmed what we already knew in others. The key lesson from our initial surveys in 2015 was that, while we heard a lot of complaints from both parents and staff, the data showed that all stakeholders were significantly more satisfied than the noise indicated. We were also able to pinpoint some key issues that needed attention but had been lost in the noise.

We are now in maintenance mode: we don't make significant changes to the surveys as we track our performance against our baseline. We have real data that tells us why our teachers like working at Moriah and what the key levers of their satisfaction are. We have constantly heard from vocal parents that we send too many emails, and at the same time that parents aren't informed about what is going on in the school, but the data simply does not support those claims. We know that we haven't achieved enough around Safa and Ivrit immersion, but we also know that Moriah parents think this is important, and they support our recent efforts to change course. We know specifically which parts of the curriculum generate lower satisfaction, and we can track our performance in these areas over time.
Develop questions around your own key areas where you invested, or are considering investing, time or capital. This will help you allocate your resources appropriately and measure one key aspect of your success – stakeholder satisfaction.
The purpose of the survey is to generate real data and actionable intelligence. It isn’t necessarily the role of the survey committee to create or track an action plan. One person such as a Principal or Head of School should be accountable to create a written action plan from the survey results and track it regularly with an oversight committee. The more closely you track the action plan, the more likely you are to impact the key drivers of customer and employee satisfaction.
Adam Z. Cohen was a Trustee of The Moriah School from 2013 – 2019. During his tenure on the board, he created the Surveys Committee and Security Committee. In real life, he is an operations management consultant. He can be reached at email@example.com.