Anonymous wrote:The WSJ rankings also let you reweight the factors you care about most. For example, I put 70% on outcomes and 10% each on resources, engagement, and environment. If you do this, Cornell jumps to 7, Columbia falls a few places, Virginia jumps up to the 30s, and Maryland jumps to the 40s.
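The reweighting the poster describes is just a weighted average of the four pillar scores. A minimal sketch, using made-up pillar scores (the real THE/WSJ per-school scores aren't shown here) and assuming the published default weights of 40% outcomes, 30% resources, 20% engagement, and 10% environment:

```python
# Sketch: recomputing an overall score under custom pillar weights.
# The pillar scores below are hypothetical illustrative numbers,
# NOT real THE/WSJ data for any school.

def reweight(pillar_scores, weights):
    """Weighted average of 0-100 pillar scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(pillar_scores[p] * w for p, w in weights.items())

school = {"outcomes": 92.0, "resources": 75.0,
          "engagement": 68.0, "environment": 80.0}

default = {"outcomes": 0.40, "resources": 0.30,
           "engagement": 0.20, "environment": 0.10}
custom  = {"outcomes": 0.70, "resources": 0.10,
           "engagement": 0.10, "environment": 0.10}

print(round(reweight(school, default), 1))  # 80.9 under default weights
print(round(reweight(school, custom), 1))   # 86.7 under outcomes-heavy weights
```

A school that is strong on outcomes but weaker on engagement gains ground under the custom weighting, which is why the rank movements the poster reports are plausible.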
+1.
I don't think giving engagement a 20% weight in the ranking is reasonable. Also, they couldn't include a 2020 survey because of the coronavirus pandemic.
https://www.timeshighereducation.com/USmethodology2021#
Engagement (20%)
Decades of research have found that the best way to truly understand teaching quality at an institution – how well it manages to inform, inspire and challenge students – is through capturing what is known as “student engagement”. This was described by Malcolm Gladwell in The New Yorker in 2011 as “the extent to which students immerse themselves in the intellectual and social life of their college – and a major component of engagement is the quality of a student’s contacts with faculty”.
THE has captured student engagement across the US through its US Student Survey, carried out in partnership with two leading market research providers. In 2018 and 2019, we gathered the views of more than 170,000 current college and university students on a range of issues relating directly to their experience at college (see key changes detailed above).
Students answer 12 core questions about their experience that are either multiple choice or on a scale from 0 to 10, and also provide background information about themselves. The survey was conducted online and respondents were recruited by research firm Streetbees using social media, facilitated, in part, by student representatives at individual schools. We also worked with participating institutions that distributed the survey to random samples of their own students. Respondents were verified as students of their reported college using their email addresses.
We used an aggregated group of respondents from both years (2018 and 2019 surveys). At least 50 validated responses in the 2019 survey were required for a university to be included.
To capture engagement with learning (7%), we look at the answers to four key questions:
*to what extent does the student’s college or university support critical thinking? For example, developing new concepts or evaluating different points of view;
*to what extent does the teaching support reflection on, or making connections among, the things that the student has learned? For example, combining ideas from different lessons to complete a task;
*to what extent does the teaching support applying the student’s learning to the real world? For example, taking study excursions to see concepts in action;
*to what extent do the classes taken in college challenge the student? For example, presenting new ways of thinking to challenge assumptions or values.
To capture a student’s opportunity to interact with others (4%) to support learning, we use the responses to two questions: to what extent does the student have the opportunity to interact with faculty and teachers? For example, talking about personal progress in feedback sessions; and to what extent does the college provide opportunities for collaborative learning? For example, group assignments.
The final measure in this area from the survey is around student recommendation (6%): if a friend or family member were considering going to university, based on your experience, how likely or unlikely are you to recommend your college or university to them?
In this pillar of indicators we also seek to help a student understand the opportunities that are on offer at the institution, and the likelihood of getting a more rounded education, by providing an indicator of the number of different subjects taught (3%). While other components of the Engagement pillar are drawn from the student survey, the source of this metric is IPEDS. We are using the average of two years of data for this metric in order to provide a better long-term view.
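The four sub-weights above (7% + 4% + 6% + 3%) add up to the Engagement pillar's 20% share of the overall score. The methodology page doesn't spell out how sub-indicators are normalized before aggregation, so this is only a sketch assuming each sub-indicator is already scaled 0-100:

```python
# Sketch: rolling the four Engagement sub-indicators up into the pillar's
# contribution to a 0-100 overall score. Sub-weights come from the quoted
# methodology; the sub-scores are hypothetical, and THE's actual
# normalization may differ.

SUB_WEIGHTS = {
    "engagement_with_learning": 0.07,
    "interaction_with_others":  0.04,
    "student_recommendation":   0.06,
    "subjects_taught":          0.03,
}  # sums to 0.20, the Engagement pillar's share of the total

def engagement_contribution(sub_scores):
    """Points (out of a maximum of 20) that Engagement adds to the overall score."""
    return sum(sub_scores[k] * w for k, w in SUB_WEIGHTS.items())

sub_scores = {"engagement_with_learning": 80, "interaction_with_others": 70,
              "student_recommendation": 90, "subjects_taught": 60}

print(round(engagement_contribution(sub_scores), 2))  # 15.6 of a possible 20
```

Seen this way, the earlier complaint makes sense: roughly a fifth of a school's overall score rides on a survey that could not be refreshed in 2020.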