Surveys are an important part of the program review process. A good survey can capture the experiences and perspectives of multiple key stakeholders in a way that yields data that is easy to collate and analyze. The richness of the survey data can then inform how you write the questions for the focus group stage of data gathering.
Getting started:
- decide who you want to survey and why
- think about what specifically you want to find out
- refer back to institutional priorities (e.g., the Widening Our Doorways Academic Plan) to ensure that what you ask will provide the data you need
Guidelines for creating surveys:
- make them as short as possible, ideally taking no more than 5 minutes to complete
- ensure the questions relate to an identified and clearly-articulated research goal (i.e., get clear on what it is you want to find out)
- take the time to formulate questions so that they can be used in future reviews with few or no changes required, to optimize comparability over time and across other programs / departments
- ensure the questions are relevant to the respondents being surveyed
Constructing Surveys is an Open Text BC resource that explores the psychological process of responding to questions, the influence of context, and types of questions. A summary of key takeaways is available here.
This resource outlines types of questions and how to write them: Harvard University: Tips sheet on question wording
Common pitfalls in writing survey questions:
- writing overly complex questions
- asking double-barrelled questions to which only a yes / no response is possible (e.g., Were the courses available relevant and was there enough choice?)
- asking dichotomous questions (i.e., questions where only a yes / no answer is possible, but neither response may capture what a respondent truly feels or believes; for example, “Was the scheduling of courses convenient?” The answer might be ‘yes’ for most courses, but ‘no’ for one of the required courses.)
- asking leading or loaded questions (e.g., Tell us why you had a great experience studying on this program.)
- not taking into consideration who the respondents are (e.g., asking current students, “Did studying on this program give you the exit qualifications you needed to gain your current employment?”)
- forgetting accessibility (e.g., use plain language for ease of understanding; avoid biased language; consider the time and technology required to respond)
- forgetting that the order in which the questions are asked matters
Other considerations:
From a cultural perspective, if using a Likert scale for responses (e.g., on a continuum from strongly agree to strongly disagree), research shows that individuals from certain cultural backgrounds are more likely to respond ‘neither agree nor disagree’ (a middle, neutral response), or to choose the more extreme responses at either end of the continuum. Therefore, in some cases, using a 4- or 7-point Likert scale may be more useful.
Who can help? The Institutional Research and Planning Department can provide consultation on the development and implementation of tools for gathering new data, along with assistance in interpreting and displaying data. The Centre for Teaching and Learning Innovation can support the program advisory group in creating and conducting surveys.
Examples of surveys:
For students:
University of Western Ontario – Student / Alumni Learning Experience Survey
BCIT Program Review Current Student Survey Questions
BCIT Program Review Alumni Survey Questions
For faculty:
BCIT Program Review Faculty Survey Questions
For community partners:
BCIT Program Review Industry Survey Questions