
Tuesday 24 March 2015

Mike Shortt Reports: Methodology

This post gives more details on the surveys and methodology, if you're interested. I decided it was too boring for a general audience, so I cut it out of my original post. But you are an eager beaver! So here you go. Feel free to send me a message if you want your own copies of the PDFs; my contact info is on my website.

Methodology
Sample
Mike tells me they surveyed "between 700 and 1300 students over the years". Respondents came from all faculties, and efforts were made to select a representative sample. They claim they succeeded in this.

Results
Most questions offered the standard five-point scale: strongly agree, somewhat agree, neutral, somewhat disagree, and strongly disagree.

When a report says "only 10% agreed", that can mean either that only 10% strongly agreed, or that only 10% strongly or somewhat agreed. The reports make it clear which reading they chose, and it varies by question. This doesn't matter much here, though, because I mostly just compare scores between departments.

Changes Over Time
Even though the reports span several years, each new report usually refers back to the same two surveys, each conducted only once. I make it clear when this is not the case and when we can actually talk about trends.

Cherry Picking
These surveys measured a lot of metrics, and I am of course cherry-picking bad results to support my previous criticisms of McGill's education department. So where did education score best?

Compared to other departments, the language skills of our TAs are great, our library and study spaces are great, and library staff are especially helpful. But of course, those are not the things I've been criticizing! Beyond scanning the 100+ pages of reports, I also searched for "education". Most of the time education was mentioned in a paragraph or table, it scored the worst.