Case study – Peer Review (GIS)

Supporting problem-solving skills in GIS and spatial optimisation through peer assessment and application-oriented, software-based tests

As part of a series of case studies, staff at LET sat down with Monika Niederhuber and Andreas Gabriel of the Institute of Terrestrial Ecosystems, Department of Environmental Systems Science and Technology, to discuss their geographic information systems (GIS) project.

What is the project about?

The aim of this project was to create a new learning arrangement for the course on spatial analysis, modelling and optimisation. Building on the theoretical content, exercises, peer assessments, and software-based exams were arranged for an in-depth examination of the subject matter. For correcting the exercises, we introduced a peer review process. Its purpose was to ensure that students learn to assess their own performance, and that of others, accurately. At the same time, they practised formulating constructive criticism. Our intention was to promote self-responsibility, an important skill for later professional life.

What motivated you to initiate the project?

Usually, GIS and optimisation courses focus on practical aspects, so we wanted to develop an exam that requires students to use the relevant software. The practical evaluation of the acquired software skills, in combination with analytical competencies, became an essential part of our competence-oriented approach. In addition, the amount of time we spent correcting the exercises was very high, so we wanted to find a way to reduce that effort, and the peer assessment process was critical to achieving this goal.

How did you do it?

We teach several subject areas and each topic contains the following elements: theoretical input in the form of a two-hour lecture, followed by a related computer-based exercise (8-10 hours, conducted in groups of two and led by tutors), and a peer assessment process.

The peer assessment process was designed to take a maximum of one hour. We divided the groups so that each individual student would receive two assignments to review, which means that each group received four lots of feedback at the end. Students are assessed both on their submitted exercise and on how well they carry out the review.

The exam is a hybrid solution: the examination tasks are handed out on paper, the practical tasks have to be solved and saved on the computer, and additional questions have to be answered on paper. We work with examination logins, and after the exam the data are copied to another drive.

Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?

At D-USYS, we had very good support from our Educational Developer Urs Brändle, who knows the necessary functions of Moodle, the online learning platform, very well.

One challenge was that we had to organise the exam twice, because we had so many students and only 30 computers in our lab. A large computer lab at ETH with the necessary technical requirements and equipment would have been helpful.

Please describe some of the key outcomes of the project.

With the introduction of the peer review process, some of the students also became familiar with other possible solutions to the exercises. Furthermore, due to this new type of practical exam, students have to examine their solution approach in more detail, which led to a much deeper understanding of the learning content.

How did the project impact learners or the way in which you teach?

We found that the majority of students have a positive view of the peer assessment method. We therefore believe that students will increasingly discover the value of the entire approach, especially as they see other possible solutions to the same problem and learn to give and receive relevant, appropriate feedback.

Non-disciplinary competences such as communication, collaboration and critical thinking became an integral part of this course concept. In this way, students have a concrete opportunity to develop these important skills.

What are lessons learned that you want to share with your colleagues?

Overall, we had positive experiences with peer assessment and the very practice-oriented examinations.

Concerning the peer review, many students mentioned that the review itself had already led to a deeper engagement with the learning content. Upon further investigation, however, we realised that only a few students found the feedback helpful. A first conclusion is therefore that it was carrying out the review, not the feedback received, that deepened the knowledge. Students only read peer feedback carefully and took it into consideration when it was of high relevance and quality, and it appears this was often not the case, so we have to work on that point. Students need time and space to learn to give constructive, high-quality feedback.

With the software-based exams, we have created a tool for testing practical skills. Our experiences with it are mostly positive. One caveat: the time required to develop such questions, as well as the time students need to answer them, should not be underestimated.

Initially, we scheduled the exercise allocation timeline too tightly. We quickly switched to a slower pace that gave students more time for all the tasks, and then things worked well.

Overall, we recommend that colleagues carefully select topics and courses where peer review fits and creates added value. A small pilot project could be a good and manageable way to try out this approach.

What are the future plans for this work? How do you plan to sustain what you have created through the project?

Based on our experience, we are thinking about new ways to conduct the review process and how best to convince students of its value. One idea is for students to do more self-reviews based on a standard master solution that can additionally show various solution methods. This way, they would reflect more at the meta level. The peer reviews would be reduced in number, but might increase in quality.

Additional resources and comments

Final report (Abschlussbericht) of this Innovedum project:
https://lue.ethz.ch/GIS/Projekte/praktische-gis-pruefungen.html