Tag Archives: evaluation

Evaluation, Internships and Community Partners 

Our undergraduate students enrolled in the PACE unit “Internships in Social Research” frequently conduct evaluation studies for different community partners. However, we noticed that the time frame of a single semester is too short to both develop and implement a sophisticated evaluation plan…

Continue reading Evaluation, Internships and Community Partners 

Connect More…with Alex Woods

Tell us something you’ve learned about teaching from your colleagues.

One of the most valuable lessons I have learned from a colleague is to show my passion and enthusiasm for acquiring skills and knowledge in my subject area, the study of ancient Egypt, so students in turn will be inspired to continue their learning journey long after their studies have ceased.

Continue reading Connect More…with Alex Woods

A little TED goes a long way: Avoiding survey fatigue

Tired of all those emails inviting you to participate in online surveys?

Spare a thought for Macquarie students, who in the past might have been asked to complete more than 10 student survey forms in the same study period. Over-surveying leads to survey fatigue and ultimately disengagement, which in turn has a negative impact on your survey response rates.

Here are 3 questions to consider before you order a TEDS survey:

Continue reading A little TED goes a long way: Avoiding survey fatigue

How do I know whether my TEDS results are “good” or not?

It’s difficult to determine a clear “standard” for TEDS results, since we know that they are affected by a range of contextual variables that relate to the learning and teaching environment.

Over the years, analysis of TEDS data has demonstrated persistent and consistent differences according to:
• discipline area (Faculty; this is more a reflection of student cohort differences than of variation in teaching or curriculum quality);
• class level (100, 200, 300-500 and 800-900 level, with 600-700 level yet to be examined); and
• class size (this tends to have more impact on teaching evaluation results than on unit evaluation results, but is evident in both).

Interpreting Your TEDS Results in Context

Without a measure of the variation attributable to each of these factors, it’s hard for an individual teacher or unit convenor to “place” their own TEDS results in the context of their own teaching environment. However, help is at hand!

Now that the revised TEDS surveys have been running for several semesters, we have sufficient data to provide descriptive statistics for groups of evaluations within the same context, at least at Faculty by Unit Level refinement in most Faculties. These statistics, based on the distribution of mean (average) scores rather than individual scores within each Faculty/Unit Level category, will enable you to see where your results sit in relation to others who teach in the same context.

Where to Access the Summaries

Guidance for interpreting your results in relation to the data summaries, together with the summary tables themselves, is available now (for LEU surveys only) at http://staff.mq.edu.au/teaching/evaluation/surveys/compare_leu/.

The TEDS team are working on the LET tables and will inform all staff when these are ready to be accessed.