The pivot to remote online teaching on the MA in Conference Interpreting in Cologne: Lessons learned from an unexpected experience
Barbara Ahrens, Morven Beaton-Thome and Anja Rütten, TH Köln – University of Applied Sciences, Cologne
ABSTRACT
This paper describes and critically evaluates the new online setting encountered when the MA in Conference Interpreting at the Institute of Translation and Multilingual Communication at TH Köln – University of Applied Sciences, Cologne, was forced to move completely online as a result of the COVID-19 pandemic. The pedagogical and interactional challenges of the pivot to remote online teaching are first contextualised and discussed, before the results of a longitudinal survey of staff and students are presented and analysed. The transition to remote online teaching brought into sharp relief the fact that pedagogical concepts and lesson plans cannot simply be transposed directly from face-to-face to online teaching, particularly regarding issues around interaction between all participants. Peer-to-peer interaction was perceived to suffer most in this context. What was particularly striking about the results of the survey was that the success of remote online teaching in conference interpreting depended on small groups, individualised and personalised learning and feedback, and reliable and user-friendly technical solutions. A strengthened pedagogical focus on remote interpreting proved to be an unintended benefit of the transition.
KEYWORDS
Conference interpreting, remote online teaching, COVID-19 pandemic, remote interpreting, community of practice, interpreting technology, conference interpreting teaching platform, longitudinal survey.
1. Introduction
When the COVID-19 pandemic crisis hit Germany in March 2020, staff on the MA in Conference Interpreting at the Institute of Translation and Multilingual Communication at TH Köln – University of Applied Sciences, Cologne, were faced with an overnight blanket ban on all on-site teaching and great uncertainty as to whether this would resume during the summer semester. Under these circumstances, the challenging decision was taken to turn the entire MA programme, which is normally run purely on-site and face-to-face (with the exception of one online course dedicated to Information Management and New Technologies), into a remote online programme until further notice.
In this paper, the groundwork will first be laid by describing and critically evaluating the new online training setting. The results of the longitudinal survey will then be presented and analysed, and the benefits and limitations of remote online teaching and examining in the field of conference interpreting discussed on the basis of this experience. The paper concludes with suggestions for the meaningful use of online pedagogy in this field.
2. Technical setup
In the MA training context, technical issues broadly consisted of providing and adjusting existing hardware and software solutions to the specific setting and requirements of remote online teaching, as the basic technical setup was already in place. Rather than shaping and defining teaching, technical solutions were viewed as underlying infrastructure used to support the transfer of an existing training concept to the new remote setting.
2.1. Teaching conference interpreting remotely—a niche in the niche in the niche
Conference interpreting constitutes a relatively small market segment in software development (Rütten 2017: 98), with conference interpreting training accounting for an even smaller share. Even in on-site interpreting training, technical solutions for the training needs of the institution need to be devised in close cooperation with teaching staff, and usually involve a combination of interpreting hardware from the professional market, combined with tailored solutions for recording, dual-track replay and storage of interpreted speeches. It was therefore no surprise that an off-the-shelf solution for remote interpreting training could not immediately be procured when the COVID-19 pandemic hit.
In Cologne, the Adobe Connect platform (2021) initially appeared promising for synchronous teaching of both consecutive and simultaneous interpreting, but was ultimately rejected due to the instability of the network provided by DFN (German National Research and Education Network) (2020). Despite security concerns, synchronous teaching was then moved to Zoom (2021), using the breakout room and simultaneous interpreting functions for real-time consecutive and simultaneous interpreting in teaching and exams. Overall, Zoom was considered the solution that came closest to combining an online teaching platform with a remote interpreting platform. As the interpreting function in Zoom does not support relay interpreting, relay was not taught during the semester. Acquiring an additional Remote Simultaneous Interpreting (RSI) tool for relay interpreting, in a semester when no interpreting course required such a function, was considered disproportionate. The most important constraint when using Zoom, however, was the lack of dual-track recording, or even dual-track listening (i.e. listening to a source video and a student’s interpretation at the same time). Dual-track recording was therefore carried out via Audacity (2021) to enable strategic feedback on issues such as décalage, with student recordings saved to individual or course folders in sciebo, a non-commercial cloud storage service for university research, studying and teaching, located in North Rhine-Westphalia. GoReact (2020), with its time-stamped feedback mechanism, was also used both in synchronous teaching and asynchronously for the structured assignment, delivery and evaluation of interpreting tasks.
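By way of illustration, the sketch below shows what such a dual-track recording amounts to in practice: a time-aligned source speech and a student’s interpretation are combined into a single stereo file, with the original on the left channel and the interpretation on the right, so that décalage can be reviewed by replaying both together. This is a minimal Python sketch, not the Audacity workflow described above, and the file names are hypothetical.

```python
# Minimal sketch of dual-track recording for decalage feedback (illustrative
# only; the programme described above used Audacity). Assumes two mono WAV
# files of the same session; file names are hypothetical.
from pydub import AudioSegment

source = AudioSegment.from_wav("source_speech.wav").set_channels(1)
interpretation = AudioSegment.from_wav("student_interpretation.wav").set_channels(1)

# Pad the shorter track with silence so both channels cover the whole session.
length = max(len(source), len(interpretation))  # pydub lengths are in milliseconds
source += AudioSegment.silent(duration=length - len(source))
interpretation += AudioSegment.silent(duration=length - len(interpretation))

# Left channel: original speech; right channel: student's interpretation.
dual_track = AudioSegment.from_mono_audiosegments(source, interpretation)
dual_track.export("dual_track_feedback.wav", format="wav")
```

Replaying such a file with a stereo balance control then allows trainer and student to shift attention between the original and the interpretation, or to listen to both simultaneously.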
The challenge of finding a single platform for the synchronous teaching of simultaneous interpreting, or even one covering the complete workflow of conference interpreting training, is certainly not confined to Cologne, as witnessed by the lively exchange of best practice within the interpreter training community. Indeed, CIUTI, the International Association of University Institutes with translation and interpretation programmes, released a call in August 2020 for proofs-of-concept for single-device remote teaching of simultaneous interpreting, worth €10,000, indicating that the technical aspects related to ease of use have not yet been satisfactorily resolved (CIUTI 2020). The contract was awarded to Green Terp, who now aim to provide a platform offering trainers the possibility to virtually enter each booth to listen to and comment on students’ performance, among other features (Green Terp 2021).
2.2. Multichanneling
Apart from tools used for teaching purposes, other platforms were already in use for organisational purposes pre-pandemic. These were Airtable for internal administrative purposes and feedback, as well as Google Drive and Google Calendar for timetable planning. Teaching material was shared using email, sciebo, Google Drive, YouTube, the university-wide learning management system ILIAS, WhatsApp or Zoom Chat. The range of tools available allowed a high degree of flexibility and a swift start to remote teaching, as trainers could use the platform(s) they felt most comfortable with.
Interaction between trainers and students as well as peer interaction between students required channels that could be used outside the actual (virtual) ‘classroom’, such as text messaging and social media (Clifford 2018: 180-183).
In terms of interaction during synchronous lessons, Zoom offers a chat window which can be used for written communication on a parallel channel, either with the whole group or privately between individuals. Apart from the options available in the system itself, students also use additional devices to send text messages to each other during lessons. However, using multiple channels during synchronous lessons has the potential to be a source of additional stress and fatigue (see section 5.1.). Indeed, research has shown that multitasking activities, such as texting during lessons, increase the load placed on a finite cognitive system (Lepp et al. 2019: 1). Studies also show that multitasking during educational activities negatively affects comprehension, recall and retention (see Lepp et al. 2019: 1 for a comprehensive overview).
In addition, stress and fatigue during videoconferences, dubbed “Zoom fatigue,” have been claimed to be related specifically to four features, namely excessive amounts of close-up eye gaze, cognitive load, increased self-evaluation from staring at video of oneself, and constraints on physical mobility (Bailenson 2021). Of these four features, the first three involve, or result from, multichannel processing. Fatigue also arises from multitasking on a technical level, such as the constant monitoring of connectivity and stability of video and/or audio streams.
3. New online setting
Given the sudden nature of the transition, the first response was to provide the best possible “emergency remote teaching” (ERT). Defined as “a temporary shift of instructional delivery to an alternate delivery mode due to crisis circumstances” (Hodges et al. 2020), ERT also implies a limited period of remote teaching and a return to the usual format “once the crisis or emergency has abated” (Hodges et al. 2020). This assumption was central to the strategic response of the MA in Conference Interpreting in the initial phase, when it was assumed that face-to-face teaching would resume a month later, on April 20, 2020. Even once it became clear that teaching would remain remote and online for the duration of the semester, the question of the format of the final interpreting exams, whether online or in person, was not settled until May 11, 2020, resulting in parallel planning of multiple scenarios for teaching and assessment. The stress resulting from the highly volatile situation and the impact of the shutdown of public life on the learning and teaching experience are also unique aspects of this type of emergency transition (see e.g., IAU 2020 for the impact of COVID-19 on higher education world-wide and Czerniewicz 2020 on the effect of university shutdowns in South Africa).
Despite these considerations, as time progressed, it became clear that ERT could not be continued indefinitely. The MA programme has now mastered another period of exclusively remote online teaching with both new and existing students during the winter semester 2020/2021. As such, the initial ERT phase has been concluded and a temporary online pivot, defined as “longer-term, but crucially still temporary, plans to continue teaching online” (Nordmann et al. 2020: 2), is well underway.
For the purposes of this paper, the discussion will focus on practical interpreting courses on the MA programme in Conference Interpreting in Cologne in the summer semester 2020. In that semester, these practical courses took the form of individual introductory courses to consecutive and simultaneous interpreting in all directions (B/C-A, A-B) in first year, advanced courses in consecutive and simultaneous interpreting in all directions (B/C-A, A-B) in second year, as well as a course in interpreting for international organisations (B/C-A) for first-year students, and a course in note-taking for consecutive interpreting for first-year students. Throughout the paper, the term ‘programme’ will be used to refer to the MA as a whole, the term ‘course’ to individual courses within that programme (such as Introduction to Consecutive Interpreting from French to German), and the term ‘lesson’ to individual teaching sessions within the course (see TH Köln 2021 for further details on the structure of the MA). To situate the findings of the survey on these practical interpreting courses, it is first important to review studies of remote teaching for practical interpreting. These can be broken down into five distinct categories:
- The use of (at times highly innovative) remote technological solutions to complement student learning and encourage reflective self-study on a face-to-face programme (Gorm Hansen and Shlesinger 2007; Sandrelli 2015; Kim 2017; Braun et al. 2020).
- The delivery of online components in a blended learning format, often based specifically on a socioconstructivist approach to learner interaction where “distance is a positive principle not a deficit” (Nordmann et al. 2020: 2; see also Moser-Mercer et al. 2014; Motta 2016; Skaaden 2016; Lee and Huh 2018). Such an approach emphasises the role of the student as the initiator at the centre of the learning experience, an active role that is reflected in other learning theories such as situated learning (Lave and Wenger 1991) that will be referenced later in this paper, as a means to complement remote online teaching.
- Programme design with a longer distinct online phase, such as at Glendon College, York University (Clifford 2017, 2018).
- Programme design with a remote option and interpreting courses for face-to-face students opened to remote students, such as at the University of Vic (Perramon and Ugarte Ballester 2020).
- Programmes designed to be delivered exclusively via remote teaching (Ko 2006, 2008; Ko and Chen 2011; Mulayim and Lai 2015).
Although experience in all five categories is relevant to the case study, the pedagogical and interactional issues in category (1) differ somewhat from those in categories (2) to (5). For this reason, the discussion will focus on studies with self-contained, longer online teaching phases (see discussion in Ko 2008: 838).
3.1. Pedagogical issues
One major pedagogical issue in the research literature is that teaching in remote formats is generally perceived as more time-intensive in terms of staff and student workload. Studies have found that administration on the part of staff takes longer, particularly at the beginning (IAU 2020: 25-26; Ko 2008: 834), and that monitoring and encouraging interaction, as well as pastoral care (see Gómez and Weinreb 2002: 647), constitutes a larger part of staff workload throughout the course, when compared with face-to-face teaching. The desire to provide comprehensive feedback and individual mentoring to compensate for lack of face-to-face contact is also a factor in staff workload, particularly in an ERT situation. In considering pedagogical issues, best practice in pedagogy has to be weighed up against the time that staff and students can invest in a particular course or programme as a whole.
The discussion of pedagogical issues will focus on two categories: training and assessment of paralinguistic features and feedback design.
3.1.1. Training and assessment of paralinguistic features
In studies of remote interpreting teaching, both training and assessment of paralinguistic features, particularly prosody and voice work, gaze, facial expression and gesture, have been flagged as problematic (Perramon and Ugarte Ballester 2020: 175; Ko 2006: 84; Ko and Chen 2011: 138). This issue should be viewed as distinct from the influence of these factors on interaction in a remote online setting (see section 3.2.).
In face-to-face lessons, reflective practice on these factors forms an integral part of training and often constitutes a separate category of feedback to students. However, in the ERT phase in Cologne, such paralinguistic factors may have been weighted less heavily as both staff and students devoted their attention to content and language-based factors. Moving forward, one promising approach is to frame interpreting assignments in class as actual remote interpreting assignments, rather than projected and simulated face-to-face scenarios. Following this situated learning approach (see González-Davies and Enríquez-Raído 2016), assessment and evaluation of interpreting performance would also focus on the remote setting, thus providing current students with more intensive reflection on their role in remote scenarios while continuing to thematise commonalities with, and differences to, face-to-face settings.
3.1.2. Feedback
The traditional form of feedback in conference interpreting training is inextricably linked to the traditional master/apprentice structure in which the master (trainer) gives oral feedback to the apprentice (student) in front of their peers (e.g., Setton and Dawrant 2016: XXIX-XXX, 58-59, 74-75). This is now generally complemented by forms of peer feedback and self-feedback. A well-designed face-to-face conference interpreting lesson usually combines all three aspects of feedback (e.g., Behr 2015: 213-214). Although instructor-to-student feedback is open to criticism in the face-to-face mode, it is generally accepted that feedback in the group in face-to-face teaching environments is beneficial to all as it highlights issues in one performance that other students may also be experiencing (e.g., Setton and Dawrant 2016: 90, 96).
The effectiveness of such an approach in remote online mode is, however, open to question. Certainly, questions regarding the organisation of peer feedback (including technical issues such as the use of breakout rooms) and the spontaneity of dialogue versus the static nature of the feedback monologue arose in Cologne. Students’ desire for highly individualised (written) feedback, a growing impatience with feedback directed towards one particular student, and difficulties in keeping track of feedback in the remote online environment were all observed throughout the ERT phase.
Another challenge of remote online training situations may be that students are unaware of the interactional factors and their role as interpreters in on-site communicative events (Mouzourakis 2010). Therefore, feedback should focus on these aspects, both in remote and on-site settings, even if courses are exclusively remote and online. In addition, interaction in the online classroom, both between trainers and students, and between students, needs to be approached differently in remote online training in order to compensate for the lack of physical proximity.
3.2. Interactional issues
In conference interpreting training, interaction is at the very heart of the subject matter. This manifests itself in two distinct ways. On one hand, interaction in the training situation is essential for teaching, i.e. it refers to the interaction between the learner(s) and the instructor(s). On the other hand, interaction is also taught as part of an interpreter’s skillset, via this multi-layered interaction in the classroom (see section 3.1.). The following section will concentrate on interaction in the synchronous online conference interpreting classroom.
3.2.1. Technically-mediated communication
One of the characteristics of face-to-face communication is immediate mutual perception (Braun 2004: 32), with interpersonal perception creating a feeling of proximity among communication partners who are physically present on-site (Short et al. 1976). In an online setting, the communication partners try to achieve social presence even though they are physically distant (Williams 1977). This requires more effort than in traditional communication settings and carries the risk of a feeling of alienation, social distance or of being ‘lost in (virtual) space’ (Braun 2004: 62-66, 2007: 23; Mouzourakis 2006: 52). In virtually mediated communication, more “process overhead” (Olson et al. 1997) is needed to keep all communication channels open and to maintain social presence, i.e. interaction in the remote online classroom has to focus both on content and on conversational management. In addition, problem solving constantly requires time and effort, leading to ‘operational’ rather than ‘content/task-centred’ interaction. It is important to note that in technically-mediated communication there is a higher risk of losing the overall view of the communicative event, as there is none of the immediate contact found in face-to-face situations (Braun 2004: 32).
Another important aspect which has rarely been mentioned in publications on online courses is the use of additional devices as a fallback or emergency option for learner-instructor interaction. This becomes necessary if technology fails—not only during lessons. Especially at a time when exams at all stages of the MA cycle (aptitude testing, first-year exams and final exams) have to be taken online, connectivity problems are an additional stress factor for the candidate. Whereas in on-site situations the candidate and the jury are in the same place, in virtual exams they are separated, making an extra channel for emergency communication necessary.
3.2.2. Interpersonal relationships
In terms of interaction between communication partners in learning environments, Moore (1989) distinguishes between learner-learner interaction, learner-instructor interaction—both functioning reciprocally between the individuals involved—and one-way learner-content interaction. The two interpersonal types of interaction have been researched in a number of studies on online teaching (see Kuo et al. 2013 for an overview), but with heterogeneous results. In Jung et al.’s (2002) study, social interaction between learner and instructor—which “involves reciprocal stimulation or response between two or more individuals” and “includes […] the dynamics of group behavior” (APA 2020a)—had the strongest impact on learning outcomes, while satisfaction with the online learning experience depended to a great extent on collaborative interaction between the learners themselves. Kuo et al. (2013: 33), by contrast, found that “[l]earner-instructor interaction, learner-content interaction and internet self-efficacy were significant predictors of student satisfaction in fully online learning settings.” In interpreter training, collaborative interaction and communities of practice with peers are considered essential for successful learning (e.g., D’Hayer 2012; Ehrlich and Napier (eds) 2015; Braun et al. 2020, among others). Pre-COVID-19, Cologne had already prepared its students for these important aspects of self-study, which was considered an asset in the transition to remote online mode. However, as student-to-student communication and interaction have been found to suffer generally during the abrupt shift to remote online learning as a result of the COVID-19 pandemic (Lee et al. 2021: 166), this is an aspect worth focusing on in the discussion of this particular context.
Another important aspect in this context is group size in interpreter training: groups are rather small compared to those in training for many other careers. In interpreter training, groups with more than six to eight participants can already be considered large, while three to six students are an ideal number for individually focused, practice-intensive classes (Setton and Dawrant 2016: 23). Interestingly, the interpersonal dynamics in groups of more than eight students are similar to those in much larger groups, where participants no longer interact with each individual group member (Sorgalla 2015: 3). A lack of (inter-)active participation and zoning out can be the consequence (see also section 6.).
3.2.3. Synchronous mode of delivery
Synchronous online teaching of various types of interpreting (e.g., dialogue, consecutive, simultaneous) was evaluated as feasible as early as Ko and Chen’s (2011) pilot study. For conference interpreting in particular, Clifford (2017, 2018) provides a detailed description of an MA programme in conference interpreting in Canada with 100% online teaching during the first year. Clifford (2018: 172) describes “interaction in […] virtual classes [as] closer to traditional, onsite interaction” than one would expect, probably because synchronous teaching is the modus operandi. However, in online learner-instructor interaction, non-verbal communication by means of gestures, facial expression and gaze is also more difficult to decode. Questions from both sides often have to be verbalised, whereas in an on-site setting a frown would probably trigger a reaction. In our experience, instructors need to be more explicit in their explanations and instructions and also have to repeat them more often. This is much in line with the general characterisation of technically mediated communication as less natural and spontaneous (Sellen 1995; Olson et al. 1997; Clifford 2018). In the context of interpreter training, where trainers can adapt quickly to student needs in face-to-face lessons without using any technical equipment (for example, spontaneous brainstorming or group work, intensive voice and breath work, or unplanned changes in booth constellations), the constraints of the technical platform in remote online mode impact on the spontaneity of lesson progression.
What emerges in terms of interaction is that in remote online interpreter training all three types of interaction are intrinsically linked to the mode of delivery. A process of acclimatisation and habituation also has to take place before both interpreting trainers and students fully adjust to the interactional demands of an online setting.
4. Survey methodology
Following the abrupt change to ERT and the ongoing temporary online pivot, staff were keen to receive regular feedback from one another and from students in a format that would enable teaching to be adapted quickly and pragmatically in order to improve learning conditions for everyone involved. The purpose of the survey was not to obtain statistically relevant results, an aim which is unrealistic under any circumstances given the potential sample size of students on interpreting programmes (Gile 1998: 80). Given the time pressure of the transition, it was also not possible to run a pilot project or to refine and adjust the survey in any significant manner, as would have been standard procedure for a study planned on a longer-term basis. Despite these constraints, it was felt that the unique opportunity for gathering time-sensitive data on the abrupt transition to online interpreting training could offer insight into a number of factors that could not have been explored previously. Therefore, on the initiative of a number of members of staff, based on the issues that appeared pertinent in light of their limited experience of teaching interpreting remotely and flanked by a summary review of the literature, a simple questionnaire was created in Airtable, which staff and students completed on a weekly basis (see Appendix). It was distributed at the end of each week for all 14 weeks of the summer semester. In order to reduce the effort to a minimum and motivate respondents to reply every week, the questionnaire started with a set of simple rating questions about time spent per course and per week, contact time, quality of interaction, level of efficiency and degree of fatigue (all compared to on-site teaching). The options respondents could choose from on a Likert-type scale were ‘the same,’ ‘more/better’ and ‘less/worse.’ After the first week of teaching, a number of respondents requested more nuanced response options for the question pertaining to time spent per course; as a result, two further options, ‘a bit more’ and ‘substantially more,’ were added immediately. The questionnaire also included open questions about advantages and disadvantages, as well as information that the results might be used for research purposes and that data would be anonymised. All staff had access to the responses, which were discussed informally on an ongoing basis and in more formal weekly or fortnightly staff meetings. All respondents had the option to remain anonymous or to provide their name on a voluntary basis. Direct access to responses enabled staff to make adjustments quickly and to tailor individual responses to students who may have been experiencing difficulty, provided they had chosen not to remain anonymous. Collecting weekly data also allowed for a longitudinal study as the situation developed and teaching adapted to feedback.
The MA programme in Cologne is designed as a two-year programme. However, students often take longer than four semesters to complete the programme and continue to attend second-year courses. As a result, reference to second-year students includes this group.
Overall, the survey was sent to six first-year students and 15 second-year students who actively participated in the remote online courses on a regular basis. It was also distributed to all 15 trainers (14 conference interpreters and one speech therapist).
Figure 1. Total number of respondents
The response rate across all 14 weeks and respondent groups was 31%. It was 21% on average across all 14 weeks for the first-year students (0% from weeks 10 to 13), 36% for second-year students and 35% for staff. In week 1, it was as high as 50% for the first year, 80% for the second year and 47% for staff.
Figure 2. Response rate
While the impressions and opinions expressed by, on average, a third of all students and staff of one Master’s programme in Conference Interpreting can certainly not be regarded as representative of the population of all students and trainers of conference interpreting, the results of the survey can be viewed as an account of developments in remote online teaching across an entire semester in the direct wake of the COVID-19 pandemic.
5. Survey results
In the following section, the results of the survey will be summarised and visualised. For illustration purposes, ‘the same’ was assumed to be 0, ‘more/better’ = 1 and ‘less/worse’ = -1, and the average was calculated for each respondent group. When respondents chose two options, e.g., ‘the same’ and ‘more,’ for one question, the average of the two replies was calculated (in this case 0.5). The more nuanced replies for time spent per course were calculated accordingly: ‘a bit more’ = 0.5, ‘substantially more’ = 1.5, ‘a bit less’ = -0.5.
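As a purely illustrative sketch of the scoring scheme just described (and not of the tool actually used for the analysis), the following Python snippet maps the Likert-type options to their numeric values, averages double answers, and computes a group average for one hypothetical week; the sample replies are invented.

```python
# Illustrative scoring of Likert-type survey replies as described above;
# the replies below are invented for demonstration purposes.
from statistics import mean

SCORES = {
    "the same": 0.0,
    "more/better": 1.0,
    "less/worse": -1.0,
    "a bit more": 0.5,
    "substantially more": 1.5,
    "a bit less": -0.5,
}

def score(reply):
    """A reply may contain one or two options; two options are averaged."""
    return mean(SCORES[option] for option in reply)

# One hypothetical week of replies from one respondent group (e.g. trainers).
trainer_replies = [["the same", "more/better"], ["more/better"], ["the same"]]
weekly_average = mean(score(reply) for reply in trainer_replies)
print(f"Group average: {weekly_average:+.2f}")  # individual scores 0.5, 1.0, 0.0 -> +0.50
```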
The averages of the three respondent groups (trainers, first-year students, second-year students) are presented for the 14 weeks of the semester in the charts below. In addition to these respondent group averages, individual responses were studied in certain cases, particularly when there were notable differences in responses within a single group.
The comments were allocated manually to thematic sections by the researchers, using keywords (technical issues, individual and collective issues), and each section was then divided into positive and negative aspects by respondent group (trainers, first-year and second-year students).
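For illustration only, a keyword-based allocation of this kind could be sketched as follows; in the study itself the allocation was carried out manually by the researchers, and the keyword lists and sample comment below are hypothetical.

```python
# Hypothetical sketch of keyword-based allocation of free-text comments to
# thematic sections; the actual allocation in the study was done manually.
THEMES = {
    "technical issues": ["connection", "zoom", "microphone", "platform"],
    "individual issues": ["fatigue", "concentration", "screen", "break"],
    "collective issues": ["feedback", "group", "peer", "interaction"],
}

def allocate(comment):
    """Return the thematic sections whose keywords appear in the comment."""
    text = comment.lower()
    matches = [theme for theme, keywords in THEMES.items()
               if any(keyword in text for keyword in keywords)]
    return matches or ["unclassified"]

print(allocate("Zoom connection dropped twice during peer feedback"))
# ['technical issues', 'collective issues']
```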
5.1. Fatigue of remote lessons compared to on-site teaching
Overall, remote online lessons were on average found to be more strenuous and exhausting than on-site lessons throughout the whole semester, both for trainers and for students. Not a single respondent reports feeling less strain or fatigue remotely than on-site, although some respondents throughout the semester, in both the trainer and student groups, find it equally strenuous or exhausting.
Figure 3. Strain and fatigue
This finding correlates with the literature on the use of videoconferencing software such as Zoom for synchronous online learning (see e.g., Blum 2020; Cohn 2020) and may result both from technical challenges such as bandwidth problems and from the effort required to foster teamwork and maintain high-quality exchange (Schroeder 2020).
Interestingly, the level of strain and fatigue decreases over the semester for trainers while it increases for both first- and second-year students. This may be due to the fact that students are approaching their exams while trainers are acclimatising to the remote format and/or finding better coping strategies. A further reason could be the positive effect on staff of a decrease in the baseline of non-academic stress as a result of the improving pandemic situation (including the gradual lifting of restrictions on day-to-day life from May to July 2020), as well as the transition out of ‘fight or flight’ mode and from ERT to the temporary online pivot. One of the potential stressors mentioned by staff in the qualitative comments in the initial stages was uncertainty surrounding planning, which reflects the day-to-day ‘survival mode’ characteristic of ERT.
A number of freely formulated comments were made which directly relate to the issue of fatigue. These can be split into physical and cognitive reasons. Physically, staff referred to ergonomic factors such as eye-strain, headaches and aches and pains from sitting still in front of a screen (five times). The strain of intensive screen time was also mentioned by both first-year (three times) and second-year students (four times) in the negative comments. Cognitively, staff referred to a difficulty in focus and concentration (once). First-year students found they were exhausted (twice), found it difficult to maintain concentration (five times) and struggled to adapt to the new situation (three times). Second-year students also highlighted their difficulty in remaining concentrated (nine times), and the need for more frequent breaks in online teaching was mentioned in the comments made by second-year students as a request (three times) and as an improvement (once).
Unsurprisingly, connection issues were mentioned as the most frequent technology-related stressor. Interestingly, multichannel processing, i.e. handling different meeting and information-sharing platforms in parallel, was the second most frequently referenced technology-related stressor (mentioned five times by students and ten times by trainers). This finding ties in with the studies on multitasking referenced in section 2.2. and suggests that reducing the complexity and number of concurrent tasks during online teaching should be considered.
Although some of these fatigue issues can be addressed by improving the technical environment in terms of hardware and software, as well as the ergonomics of individual work stations, others could potentially be linked to issues around interaction, such as a lack of non-verbal cues, mentioned explicitly by staff ten times and by students four times (see section 3.2.). Fatigue could also be attributed to the strain of constantly monitoring one’s own video and audio stream (see section 2.2.), mentioned as a negative point twice each by trainers, first-year students and second-year students (see section 5.4. for a more in-depth discussion of non-verbal factors in interaction).
5.2. Time investment
The issue of time investment was divided into two parts: one question covered the time spent per course and the other the time spent per week, in each case compared with an equivalent on-site week. The question on total time spent per week explicitly included teaching time in addition to the regular timetable, cancelled lessons, organisation, overhead, travel time, etc. At the beginning of the semester it was difficult to predict whether all courses would take place remotely, so some respondents may have had a higher overall workload than others. In order to rule out bias due to differences in the respective weekly timetables and workloads of students with different language combinations, or of trainers teaching different subjects, both the overall time investment and the time spent per course were covered. By making this distinction, the ‘overhead,’ course-independent time spent could also be identified more easily. As a caveat, the results presented here need to be viewed in the context of the (un)reliability of self-reporting of time spent on task when not accompanied by systematic time tracking (see Robinson and Godbey 1997 for a discussion of the problems surrounding this).
5.2.1. Time spent per remote course compared to on-site
Regarding the time spent for each lesson, i.e. presence, preparation and follow-up, the general trend seems to be that trainers in Cologne with extensive experience in teaching face-to-face on average spend more time on each remote lesson than they do for on-site teaching, with only a slight downward trend towards the end of the semester. This ties in with the findings of Kenny and Fluck (2017) who reported that online teaching is more time intensive than face-to-face teaching for trainers used to teaching predominantly face-to-face.
Figure 4. Time spent per remote course
Both first-year and second-year students on average reported spending more time on remote lessons than on on-site ones, which is also reflected in the comments.
In the last week of the semester, all four second-year student respondents said they spent as much time on a remote lesson as on on-site ones while four out of six trainer respondents stated that they still spent more time remotely (the other two spent as much time remotely as they did on-site).
In all three respondent groups, there was one respondent at the beginning and at the end of the semester who reported spending less time on remote online lessons than on on-site ones.
Individual comments clearly show that there are a number of reasons for the increased time investment per lesson. Staff mention spending more time on preparation (three times) and the higher time demand of asynchronous work (once), as students’ performances are evaluated after the lesson. While the overall workload is increased for trainers by asynchronous teaching, the online group time can be used more efficiently, which is also confirmed by students’ comments. One member of staff also mentions that more preparation is needed as online lessons are not as dynamic and spontaneous as face-to-face teaching and therefore need more detailed planning (particularly regarding the use of features such as breakout rooms and video feeds). These are important, fine-grained observations that should shape pedagogical concepts during the temporary online pivot.
5.2.2. Time spent per remote week compared to on-site
For the time spent per online week, the overall picture is that the weekly workload compared to on-site teaching increases for trainers. However, this increase is not as high as the increase in workload per course. In comparison, students’ workload on average remains stable or even decreases in comparison to an on-site semester. Reasons for more time spent per week for staff might be the initial increase in workload in general due to re-organisation, testing of different software solutions, a higher need for coordination, and more staff meetings. The uncertainty of exam modalities is mentioned three times in staff members’ comments, with one trainer explicitly stating that exams worked well but involved time-consuming preparation.
Figure 5. Time spent per remote week
For second-year students, while the average is below 0 (i.e. less time spent) almost throughout the whole semester, a higher or slightly higher workload is reported by one respondent in weeks 3, 6, 9, 11 and 12, by two respondents in weeks 1 and 7 and by five in week 5. Two second-year students comment on their workload, which they feel is higher than in a ‘normal’ semester. In the comments two first-year students feel they struggle to keep up with the different tasks/courses and information load, while others find it easier and feel better organised than before.
There is a downward trend for all respondent groups, with the trainers’ trendline dropping below 0 (i.e. spending less time remotely than on-site) in the last week, being much closer to the students’ line at the end of the semester than at the beginning. On average, all respondent groups report spending less time for a remote than for an on-site week at the end of the semester.
5.2.3. Contact time
That contact time is perceived to decrease with remote online teaching for students whilst increasing for trainers may be attributable to the greater amount of one-to-one contact between trainers and students: the trainer is present at each individual appointment with students to discuss their performance, whereas each student is only present at their own appointment. There may also have been confusion as to what contact time actually refers to, namely that it is not exclusively equivalent to time spent in lessons and may also include individual feedback or advice.
Figure 6. Contact time
This is in line with the comments, where it is mentioned nine times that one-to-one feedback is perceived as more efficient than discussion of every student’s performance in the group. This finding points to the need for adapted feedback design and mechanisms for remote online teaching mentioned in section 3.1.2.
For both trainers and second-year students, perception of contact time increases towards the end of the semester. This might be due to additional tuition ahead of imminent exams.
According to the comments, at least one trainer switched from asynchronous to synchronous teaching in the course of the semester. This may be the reason for trainers’ average contact time being less than on-site only at the beginning of the semester.
Comments on contact time also focus on the quality of that time. In addition to emphasis on the targeted use of lesson time for individual feedback, both staff and students were critical of technical and organisational details encroaching on contact time, with staff complaining of technical and connection issues (15 times), first-year students mentioning microphone and sound issues (three times) and second-year students indicating that internet connectivity negatively impacted on the quality of lesson time. Students also mentioned the difficulty of finding a quiet place to set up their work station (once) and the potential for distraction being higher at home (four times). This was likely intensified by the fact that a number of staff and students were juggling small children and home-schooling older children during the initial stages of ERT. Although potentially more time-consuming, asynchronous assignments with individual feedback provided the potential for reducing contact time during assigned slots and were positively evaluated by trainers three times in the survey. Maintaining such flexibility in combining synchronous and asynchronous teaching in the temporary online pivot would build on these positive comments for those staff and students who may continue to have pandemic-related, non-academic demands on their time. This is in line with best practice during such a pivot (Nordmann et al. 2020), where the provision of asynchronous content, and planning for both synchronous and asynchronous communication, are second and third respectively on the list of ten simple rules for the online pivot.
5.3. Efficiency
After the questions of time and quality of interaction, respondents were asked to rate the efficiency of remote vs. face-to-face teaching. The German term Wirkungsgrad originally refers to the input-output ratio in power generation, so it may be assumed that it was understood by respondents as the learning outcome in relation to the time invested. On average, the efficiency of remote lessons is perceived as equal or worse than on-site by most respondents. The trendline goes up slightly for students and remains stable for trainers. Interestingly, the average for both trainers and second-year students develops in parallel, going down drastically in week 13 and rising to an all-time high in week 14 where the end-of-semester value rises to 0 (equally efficient) for trainers and 0.8 (more efficient) for second-year students.
Figure 7. Efficiency
Interestingly, the trainers’ average of 0 in week 14 is the result of half of the six trainer respondents rating efficiency for remote online teaching higher and half rating it lower than on-site. None considers it equally efficient. There seems to be a clear divide as to the perceived efficiency of remote online teaching among trainers. The role of confirmation bias, defined as when “information is searched for, interpreted, and remembered in such a way that it systematically impedes the possibility that the hypothesis could be rejected” (Oswald and Grosjean 2012: 79), should not be underestimated here in terms of the attitude towards remote technology that trainers have developed over time. The abrupt nature of the shift to remote teaching, as well as the lack of choice and the steep learning curve involved, may also have affected the individual responses to this issue.
Efficiency is rated better than on-site ten times over the whole semester by various second-year students, while four different trainers consider remote teaching more efficient than on-site teaching, rating it better a total of seventeen times over the whole semester.
In the comments one-to-one contact is mentioned five times by trainers and nine times by second-year students as an ‘efficiency booster.’ There are a number of comments which explicitly refer to the advantage of working in smaller groups. Trainers specifically mention small groups as positive (once) and large groups as being more difficult (once). Interestingly, the negative comment from the trainer group related specifically to the lack of spontaneous verbal interaction in larger groups (see section 3.2.2.). Students mention the advantages of smaller groups twice and also mention greater flexibility in planning their practice sessions six times.
From a collective organisation perspective, three students felt they had poorer access to, and overview of, information on organisational issues and the weekly timetable, while two reported having better access to documents and videos, and one found digital files easier to annotate than paper documents.
5.4. Quality of interaction
The three time-related questions, which were of a rather quantitative nature, were followed by a question regarding the quality of the interaction. The concept of quality was deliberately not pre-defined in order to avoid bias due to suggestive questioning. On average, the quality of remote interaction is perceived as equal to or worse than on-site by almost all respondents. Interestingly, the trendline for trainers and second-year students is almost identical (going up very slightly towards the end of the semester).
Figure 8. Quality of interaction
In the comments, lack of non-verbal communication is mentioned as a downside by all respondent groups. However, rather interestingly, one member of staff reported having a better (closer) view of students’ facial expressions.
Remote interaction is rated better than on-site only twice by second-year students, while three different trainers consider it better than on-site contact, rating it better a total of 14 times over the whole semester.
In the comments, one-to-one contact is mentioned four times by trainers (and nine times by second-year students) as being better and/or more efficient. The variety of tasks and formats was mentioned positively by one second-year student.
While all respondent groups seem to agree that non-verbal communication and personal contact are missing, the lack of contact among peers is by far the type of contact most mentioned in the comments. Students mentioned three times that in the remote online modality they felt there was little possibility to share experiences and/or frustrations, as they were not used to the absence of personal/face-to-face contact. Trainers also argued that it is more difficult to engage with students. However, all groups also said that cooperation was good. Students mentioned the trainers’ effort, flexibility, response times and understanding a total of 16 times. Trainers mentioned mutual support and effort from all sides ten times, explicitly praised students for supporting trainers three times, and mentioned the support trainers gave each other five times.
6. Discussion and conclusions
As can be seen by the quantitative and qualitative results from each of the questions in the survey, as well as the growing body of literature on online teaching and learning, it is difficult to clearly delineate the two categories of pedagogy and interaction. The technical setup emerges as a cross-cutting issue which clearly impacts significantly on the success or otherwise of initiatives in these areas.
One decisive aspect mentioned in the survey as a success factor for remote online teaching was the meaningful use of contact time between trainers and students. This contact time is a valuable resource per se, but even more so in the form of screen contact time with its higher physical and mental strain. It requires smart management if students are to participate actively in the lesson on the one hand and are not to be overburdened cognitively by several 90-minute remote online lessons in a row on the other, as evidenced by the need for breaks voiced by students in the survey. This need is also important in relation to the much-appreciated one-to-one contact and feedback, and in smaller groups when attention is centred on one or two individuals. In contrast, in online teaching with larger groups, students appear to zone out periodically, especially if the discussion during the lesson does not specifically relate to their performance. This phenomenon can also be observed in larger face-to-face groups. Since students are entitled to a defined number of contact hours per course and term, the combination of parallel synchronous and asynchronous teaching of the same lesson might be an option, especially for bigger groups, which pose a challenge to time management in any lesson, whether face-to-face or remote. Online lessons—or blended learning as a post-pandemic option—might thus offer opportunities for a more productive involvement of every single student in large(r) groups, since materials and tasks can be reused or redefined according to students’ individual learning progression (Bergmann and Sams 2012: 19-34; Rodríguez Melchor 2020: 68-71).
A challenge that still needs to be overcome is the digital divide and the degree of adaptability of different groups to technology. The term ‘digital divide’ was defined by the OECD (2001: 4) twenty years ago as “the gap between individuals […] with regard to both their opportunities to access IT and their use of the Internet for a wide variety of activities.” It still seems to exist in German higher education, both in terms of technical equipment/connectivity and in terms of digital literacy (Breitenbach 2021: 4). Broadly speaking, access to technical equipment and connectivity was not a major issue on the MA in Conference Interpreting in Cologne. However, there was at least one student who could not fully participate due to persistent connection issues for which workarounds involving asynchronous work had to be found. In addition, at least one student did not have access to a printer, resulting in the decision to send hard copies of documents by mail to all candidates for one exam which required speech preparation based on manuscripts. Regarding digital literacy, the freely formulated comments suggest that there is still a gap between those who struggle to cope with the new online format and those who appreciate it at least in part. The fact that it was only trainers who formulated concerns about data protection also shows a difference in perception of online collaboration between trainers and students.
One interesting development that became clear in the results of the survey is that the abrupt and complete shift to remote online teaching and learning mirrored the increase in the use of various forms of remote interpreting on the professional interpreting market. As the majority of Cologne staff are practising conference interpreters, they were experiencing this professional reality first-hand, allowing for enriching debate and reflection on issues arising in both training and professional contexts. In recent years, much has been done to move away from the traditional master and apprentice training model in conference interpreting training, towards communities of practice (Lave and Wenger 1991), defined as “groups of people who share a concern, a set of problems, or a passion about a topic, and who deepen their knowledge and expertise in this area by interacting on an ongoing basis” (Wenger et al. 2002: 4). However, although mutually beneficial communities of practice have emerged (see D’Hayer 2012; Beaton-Thome 2018), these have predominantly been deliberately created and cultivated, focusing mainly on easing the passage of students into the market.
Central to the community of practice of trainers and students that emerged under ERT, however, was the organic nature of its emergence, and the fact that the established profession of conference interpreting was also undergoing a shift in attitudes towards remote interpreting (AIIC 2020a). There was a palpable sense that both staff and students were ‘in it together’ and, perhaps more altruistically, that the mutual trialling of particular tools and modes of interaction would benefit both groups in the workplace. What was originally born out of necessity became a mode of ensuring mutual benefit in a community of practice characterised by “aliveness” (Wenger et al. 2002: 50). A number of initiatives have arisen out of this community of practice, such as the trialling of the software GoReact as a self-study tool and the ongoing project of equipping one of the physical interpreting labs in Cologne to function as a remote interpreting hub. Similar developments could also be observed regarding the strong desire of both staff and students to remain connected as part of a learning community, a finding that has been backed up by previous research on crisis-induced ERT (Czerniewicz et al. 2019; Shin and Hickey 2020).
In terms of interaction, the results of this particular study indicate that interaction requires more effort in a remote online environment, especially if this is not the format students and trainers are used to. Almost all respondents felt that the quality of interaction was worse than or equal to on-site interaction. Towards the end of the semester, trainers and second-year students in particular felt slightly more comfortable with remote online teaching, especially where it related to one-to-one contact and feedback between trainer and student. The lack of non-verbal communication and immediate eye contact is by far the strongest downside of the remote modality, felt by all groups involved. The fact that one respondent comments on the better, i.e. closer, view of students’ facial expressions might suggest that there is margin for better non-verbal interaction if high-resolution video is used systematically (where participants’ bandwidths permit this). Apart from this finding, these results show that there is a need for proximity which has to be achieved by means other than non-verbal communication (e.g., Breitenbach 2021: 15). Since students and trainers alike seem to appreciate the more focused feedback/interaction in one-to-one or smaller group constellations, this seems to be a way to improve the quality of interaction and to create a feeling of proximity. This is also in line with Jung et al. (2002) and Kuo et al. (2013), whose studies, despite their differing results, agree on the importance of learner-instructor interaction in online learning (see also Breitenbach 2021: 9). Online teaching set-ups should therefore enhance coherent communication and interaction among the teachers and students involved. In this respect, online teaching is comparable to traditional on-site teaching, despite the fact that it is technically mediated and therefore requires more explicit explanations and instructions (see section 3.2.1.).
The importance of interaction among peers (Clifford 2018) is another aspect of interaction reflected in the comments: Lack of motivation from one’s peers, feedback or contact is mentioned six times by students as a disadvantage of remote lessons; it seems to be the kind of interaction which is most missed (for similar results of other surveys, see also Breitenbach 2021: 9). On the other hand, the ease of organising peer practice groups online was mentioned as positive. Interestingly, this matches discussions among professional interpreters on the importance of team collaboration in remote interpreting. While remote simultaneous interpreting as such seems to be accepted as feasible provided the technical requirements are met, “an interpreter must be able to work with their language team and other language teams seamlessly” (AIIC 2020b: 5). Interpreting remotely from home offices is considered an in extremis situation that is to be avoided, among other reasons, due to the lack of collaboration between interpreters. Working together in teams from distance interpreting hubs, i.e. locations providing professional conference interpreting equipment as well as a secure, high quality internet connection, is considered a better alternative (AIIC 2020c: 2-3). There also seems to be a trend towards creating ‘private hubs’ in interpreters’ home offices, which do not necessarily meet the technical standards of commercial hubs created by conference equipment providers, but still provide for a setting where interpreters can be located together and collaborate directly (Rausch 2020). Similarly, private ‘student hubs’, created on student initiative, might be an option worth exploring further to create an alternative space for on-site peer learning in small groups in compliance with hygiene standards. This may also be a way of specifically targeting the assessment of paralinguistic elements and integrating tailored and individualised peer feedback more directly into the classroom.
The students’ comments in the first two weeks about acclimatising to the new format and setting of lessons, as well as the comments on technical matters (e.g., disturbances, internet stability, handling of different platforms), indicate that technical problems also interfered with actual interaction. Even if there is a slight ‘habituation effect’ in handling the different technologies used (see APA 2020b, especially the aspect of habituation as “learning, through repetition and practice, a skill”), problems with technology and internet stability may arise at any moment during synchronous online classes and require ad-hoc solutions which cost both time and effort (see section 3.2.1.). Technology-centred interaction is therefore perceived as having a negative impact on remote online classroom interaction in general. On the other hand, the variety of formats, the flexibility and the technical possibilities are mentioned as positives in several comments. Zoom was also perceived as very user-friendly by trainers and students alike.
There also did not seem to be agreement as to how effective the organisation of the programme as a whole was, or how well organised individuals were: some respondents perceived organisation as more efficient, others as more cumbersome, as a result of the online mode. However, on the technical level, and despite the positive comments mentioned above, there seemed to be wide agreement as to the level of stress involved in handling many different tools and platforms. Finding a more centralised software solution to cover the organisational workflow and reduce multichannel processing could be a path worth exploring to reduce stress and increase efficiency so that both staff and students feel more at ease in a remote online setting.
7. Final remarks
The remote online teaching of conference interpreting during the summer semester of 2020 on the MA in Conference Interpreting at TH Köln has demonstrated both the benefits and the limitations of this teaching modality.
A clear picture emerged from the survey: students appreciated targeted and individualised feedback. This finding is important not just for tailoring feedback during the temporary online pivot, but also for fine-tuning teaching methodology in the return to face-to-face teaching and learning. However, care should be taken to support peer-to-peer learning and interaction, and not to minimise the importance of such learning in the online mode. Rather, we should seek ways of opening up and deepening the student-to-student channel of interaction both in and around remote online lessons (for examples see Clifford 2018: 182-185).
As to the technical platforms available, delivering simultaneous interpreting training online and synchronously, i.e. listening live to both the original and the interpreted speech, remains an unresolved technological issue. Green Terp (2021), as the winner of CIUTI’s (2020) competition, is expected to provide a single-device online training platform in the future. Such a solution would truly enhance remote online interpreting teaching internationally and allow interpreting staff to minimise technological multitasking, thus freeing up cognitive resources which can then be employed to focus on pedagogy. Furthermore, a single platform which goes beyond the mere teaching function and covers the complete organisational workflow (i.e. including administrative functions, timetable coordination, communication/chat/email functions and file management) could be a way forward to further enhance efficiency and keep on board those who feel less comfortable handling many different tools or platforms.
One aspect of our remote online teaching experience in Cologne is especially noteworthy: the whole semester was characterised by a feeling of proximity and closeness which was sometimes more tangible than in other, exclusively on-site semesters, and which helped to overcome the distance caused by technologically mediated communication. This correlates with the findings of Shin and Hickey (2020: 12), who conclude that “rather than mere content coverage, we should emphasise social-emotional support.” Since universities worldwide are heading for another semester, or perhaps even longer, of remote online teaching, the onus is now on students and staff to build on this momentum and to further develop remote online teaching and learning strategies, e.g. by using new or additional platforms and developing hub solutions in Cologne’s on-site interpreting labs. Another challenge is to successfully include the new first-year students, who have already had to sit a remote online aptitude test as their very first MA experience, in the remote online setup of the MA in Conference Interpreting, and to support their identification as part of our organically emerging community of practice. The experience of being ‘forced’ by an unpredictable reality into remote interpreting scenarios in training also underlined the importance of consolidating and extending the focus on remote interpreting for the post-pandemic market. In sum, much can be learnt from this longitudinal documentation of our remote online teaching experience, both in terms of complementing face-to-face teaching once it resumes and in terms of the insights that the pedagogical and interactional factors experienced in the past semester have afforded us into conference interpreting pedagogy in general.
Acknowledgements
The authors would like to expressly thank all teaching colleagues and students on the MA in Conference Interpreting for their consistent participation in the survey and their helpful input throughout the online phase.
Bibliography
- AIIC, Association Internationale des Interprètes de Conférence (2020a). AIIC Interpreter Checklist – Performing Remote Interpreting Assignments from Home in extremis During the Covid-19 Pandemic. https://aiic2.in1touch.org/document/4845/AIIC-Interpreter-Checklist.pdf (consulted 8.9.2020).
- AIIC, Association Internationale des Interprètes de Conférence (2020b). Reference Guide to Remote Simultaneous Interpreting – Version 1. https://aiic.ch/wp-content/uploads/2020/05/aiic-ch-reference-guide-to-rsi.pdf (consulted 13.10.2020).
- AIIC, Association Internationale des Interprètes de Conférence (2020c). AIIC Guidelines for Distance Interpreting. https://aiic.org/document/4418/.pdf (consulted 19.2.2021).
- Bailenson, Jeremy N. (2021). “Nonverbal overload: A theoretical argument for the causes of Zoom fatigue.” Technology, Mind, and Behavior 2(1). https://tmb.apaopen.org/pub/nonverbal-overload/release/1 (consulted 2.3.2021).
- Beaton-Thome, Morven (2018). “Situated expertise in interpreting.” Barbara Ahrens et al. (eds) (2018). Translation – Didaktik – Kompetenz. Berlin: Frank & Timme, 145-167.
- Behr, Martina (2015). “How to back the students – Quality, assessment & feedback.” Dörte Andres and Martina Behr (eds) (2015). To Know How to Suggest… Approaches to Teaching Conference Interpreting. Berlin: Frank & Timme, 201-217.
- Bergmann, Jonathan and Aaron Sams (2012). Flip Your Classroom: Reach Every Student in Every Class Every Day. Eugene, OR and Arlington, VA/Alexandria, VA: International Society for Technology in Education (ISTE)/ASCD.
- Blum, Susan D. (2020). “Why we’re exhausted by Zoom.” Inside Higher Ed, Apr. 22, 2020, https://www.insidehighered.com/advice/2020/04/22/professor-explores-why-zoom-classes-deplete-her-energy-opinion (consulted 4.3.2021).
- Braun, Sabine (2004). Kommunikation unter widrigen Umständen? Fallstudien zu einsprachigen und gedolmetschten Videokonferenzen. Tübingen: Gunter Narr.
- Braun, Sabine (2007). “Interpreting in small-group bilingual videoconferences – Challenges and adaptation processes.” Interpreting 9(1), 21-46.
- Braun, Sabine, Davitti, Elena and Catherine Slater (2020). “‘It’s like being in bubbles’: Affordances and challenges of virtual learning environments for collaborative learning in interpreter education.” The Interpreter and Translator Trainer 14(3), 259-278.
- Breitenbach, Andrea (2021). “Digitale Lehre in Zeiten von Covid-19: Risiken und Chancen.” Marburg, 18p. http://nbn-resolving.org/urn:nbn:de:0111-pedocs-212740 (consulted 19.2.2021).
- Clifford, Andrew (2017). “Becoming close despite the distance: Communities of practice and online interpreting training.” Tópicos e Contextos em Interpretação 2. http://interpret2b.com/publicacoes (consulted 4.9.2020).
- Clifford, Andrew (2018). “What does it take to train interpreters online? Communication, communication, and communication.” Barbara Ahrens et al. (eds) (2018). Translation – Didaktik – Kompetenz. Berlin: Frank & Timme, 169-187.
- Cohn, Jenae (2020). “A day in the life of a remote instructor: Fall 2020.” Inside Higher Ed, June 24, 2020. https://www.insidehighered.com/digital-learning/views/2020/06/24/day-life-remote-instructor-fall-opinion (consulted 4.3.2021).
- Czerniewicz, Laura (2020). “What we learnt from ‘going online’ during university shutdowns in South Africa.” PhilOnEdTech, Mar. 15, 2020. https://philonedtech.com/what-we-learnt-from-going-online-during-university-shutdowns-in-south-africa/ (consulted 4.9.2020).
- Czerniewicz, Laura, Henry Trotter and Genevieve Haupt (2019). “Online teaching in response to student protest and campus shutdowns: Academics’ perspectives.” International Journal of Educational Technology in Higher Education 16(43), 1-22.
- D’Hayer, Danielle (2012). “Public service interpreting and translation: Moving towards a (virtual) community of practice.” Meta 57(1), 235-247.
- Ehrlich, Suzanne and Jemina Napier (eds) (2015). Interpreter Education in the Digital Age: Innovation, Access, and Change. Washington, DC: Gallaudet University Press.
- Gile, Daniel (1998). “Observational studies and experimental studies in the investigation of conference interpreting.” Target 10(1), 69-93.
- Gómez, María Asunción and Steven Weinreb (2002). “An alternative instructional model: Teaching medical translation online.” Meta 47(4), 643-648.
- González-Davies, Maria and Vanessa Enríquez-Raído (2016). “Situated learning in translator and interpreter training: Bridging research and good practice.” The Interpreter and Translator Trainer 10(1), 1-11.
- Gorm Hansen, Inge and Miriam Shlesinger (2007). “The silver lining: Technology and self-study in the interpreting classroom.” Interpreting 9(1), 95-118.
- Hodges, Charles et al. (2020). “The difference between emergency remote teaching and online learning.” Educause Review 3. https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning (consulted 28.7.2020).
- IAU, International Association of Universities (2020). The Impact of COVID-19 on Higher Education Around the World. IAU Global Survey Report. https://www.iau-aiu.net/IMG/pdf/iau_covid19_and_he_survey_report_final_may_2020.pdf (consulted 20.2.2021).
- Jung, Insung et al. (2002). “Effects of different types of interaction on learning achievement, satisfaction and participation in web-based instruction.” Innovations in Education and Teaching International 39(2), 153-162.
- Kenny, John and Andrew Edward Fluck (2017). “Towards a methodology to determine standard time allocations for academic work.” Journal of Higher Education Policy and Management 39(5), 503-523.
- Kim, Dohun (2017). “Flipped interpreting classroom: Flipping approaches, student perceptions and design considerations.” The Interpreter and Translator Trainer 11(1), 38-55.
- Ko, Leong (2006). “Teaching interpreting by distance mode: Possibilities and constraints.” Interpreting 8(1), 67-96.
- Ko, Leong (2008). “Teaching interpreting by distance mode: An empirical study.” Meta 53(4), 814-840.
- Ko, Leong and Nian-Shing Chen (2011). “Online-interpreting in synchronous cyber classrooms.” Babel 52(2), 123-143.
- Kuo, Yu-Chun et al. (2013). “A predictive study of student satisfaction in online education programs.” The International Review of Research in Open and Distributed Learning 14(1), 16-39.
- Lave, Jean and Etienne Wenger (1991). Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.
- Lee, Jieun and Jiun Huh (2018). “Why not go online? A case study of blended mode business interpreting and translation certificate program.” The Interpreter and Translator Trainer 12(4), 444-466.
- Lee, Kyungmee et al. (2021). “Student learning during COVID-19: It was not as bad as we feared.” Distance Education 42(1), 164-172.
- Lepp, Andrew et al. (2019). “College students’ multitasking behavior in online versus face-to-face courses.” SAGE Open, January-March 2019, 1-9.
- Moore, Michael G. (1989). “Three types of interactions.” The American Journal of Distance Education 3(2), 1-6.
- Moser-Mercer, Barbara (2005). “Remote interpreting: Issues of multi-sensory integration in a multilingual task.” Meta 50(2), 727-738.
- Moser-Mercer, Barbara, Kherbiche, Leila and Barbara Class (2014). “Interpreting conflict: Training challenges in humanitarian field interpreting.” Journal of Human Rights Practice 6(1), 140-158.
- Motta, Manuela (2016). “A blended learning environment based on the principles of deliberate practice for the acquisition of interpreting skills.” The Interpreter and Translator Trainer 10(1), 133-149.
- Mouzourakis, Panayotis (2006). “Remote interpreting: A technical perspective on recent experiments.” Interpreting 8(1), 45-66.
- Mouzourakis, Panayotis (2010). “Remote interpreter training – training for remote interpreting?” Cadernos CML – Cadernos do Centro de Multimédia e Linguas. https://multimedialinguas.wordpress.com/edicoes/ano-i-2010/0001-janeiro/panayotis-mouzourakis-%C2%ABremote-interpreter-training-training-for-remote-interpreting%C2%BB/ (consulted 17.2.2021).
- Mulayim, Sedat and Miranda Lai (2015). “The community-of-inquiry framework in online interpreter training.” Suzanne Ehrlich and Jemina Napier (eds) (2015). Interpreter Education in the Digital Age: Innovation, Access, and Change. Washington, DC: Gallaudet University Press, 95-124.
- Nordmann, Emily et al. (2020). “Ten simple rules for supporting a temporary online pivot in higher education.” PLoS Computational Biology 16(10): e100824.
- OECD, Organisation for Economic Co-operation and Development (2001). Understanding the Digital Divide. Paris: OECD Publications. https://www.oecd.org/sti/1888451.pdf (consulted 19.2.2021).
- Olson, Judith, Olson, Gary and David Meader (1997). “Face-to-face group work compared to remote group work with and without video.” Kathleen E. Finn, Abigail J. Sellen and Sylvia B. Wilbur (eds) (1997). Video-Mediated Communication. Mahwah, NJ: Lawrence Erlbaum Associates, 157-172.
- Oswald, Margit E. and Stefan Grosjean (2012). “Confirmation bias.” Rüdiger F. Pohl (ed.) (2012). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove: Psychology Press, 79-98.
- Perramon, María and Xus Ugarte Ballester (2020). “Teaching interpreting online for the Translation and Interpreting Degree at the University of Vic: A nonstop challenge since 2001.” Pilar Rodríguez Reina and Estefanía Flores Acuña (eds) (2020). Teaching and Practising Interpreting: From Traditional to New Remote Approaches. Special issue of Translation and Translanguaging in Multilingual Contexts 6(2), 172-182.
- Robinson, John and Geoffrey Godbey (1997). Time for Life: The Surprising Ways Americans Use Their Time. University Park, PA: Penn State University Press.
- Rodríguez Melchor, María Dolores (2020). “Meeting the challenge of adapting interpreter training and assessment to blended learning environments.” María Dolores Rodríguez Melchor, Ildikó Horvath and Kate Ferguson (eds) (2020). The Role of Technology in Conference Interpreter Training. Oxford: Peter Lang, 59-76.
- Rütten, Anja (2017). “Terminology management tools for conference interpreters – Current tools and how they address the specific needs of interpreters.” AsLing – The International Association for Advancement in Language Technology, Proceedings of the 39th Conference Translating and the Computer, London, UK, November 16-17, 2017, 98-103.
- Sandrelli, Annalisa (2015). “Becoming an interpreter: The role of computer technology.” MonTI Special Issue 2, 111-138.
- Schroeder, Ray (2020). “Are you a victim of Zoom fatigue?” Inside Higher Ed, May 6, 2020. https://www.insidehighered.com/digital-learning/blogs/online-trending-now/are-you-victim-zoom-fatigue (consulted 4.3.2021).
- Sellen, Abigail (1995). “Remote conversations: The effects of mediating talk with technology.” Human-Computer Interaction 10(4), 401-444.
- Setton, Robin and Andrew Dawrant (2016). Conference Interpreting: A Trainer’s Guide. Amsterdam/Philadelphia: John Benjamins.
- Shin, Minsun and Kasey Hickey (2020). “Needs a little TLC: Examining college students’ emergency remote teaching and learning experiences during COVID-19.” Journal of Further and Higher Education. https://doi.org/10.1080/0309877X.2020.1847261 (consulted 4.3.2021).
- Short, John, Williams, Ederyn and Bruce Christie (1976). The Social Psychology of Telecommunication. Chichester: Wiley & Sons.
- Skaaden, Hanne (2016). “Online learning on remote interpreting: A pilot course (HIOA)”. Katalin Balogh, Heidi Salaets and Dominique Van Schoor (eds) (2016). TraiLLD: Training in Languages of Lesser Diffusion. Leuven: Lannoo, 162-184.
- Sorgalla, Mario (2015). Gruppendynamik. Der DIE-Wissensbaustein für die Praxis. https://www.die-bonn.de/wb/2015-gruppendynamik-01.pdf (consulted 20.2.2021).
- TH Köln (2021). MA Konferenzdolmetschen: Studienverlaufsplan mit Prüfungserfordernissen. https://www.th-koeln.de/mam/downloads/f03_stvpl_mpo_kfd_20122018.pdf (consulted 19.2.2021).
- Wenger, Etienne, McDermott, Richard and William Snyder (2002). Cultivating Communities of Practice: A Guide to Managing Knowledge. Cambridge, MA: Harvard Business School Press.
- Williams, Ederyn (1977). “Experimental comparison of face-to-face and mediated communication: A review.” Psychological Bulletin 84, 963-976.
Websites
- APA, American Psychological Association (2020a). APA Dictionary of Psychology: Social interaction. https://dictionary.apa.org/social-interactions (consulted 17.2.2021).
- APA, American Psychological Association (2020b). APA Dictionary of Psychology: Habituation. https://dictionary.apa.org/habituation (consulted 20.2.2021).
- Adobe Connect (2021). https://www.adobe.com/de/products/adobeconnect.html# (consulted 17.2.2021).
- Audacity (2021). “Audacity – Tonstudio-Technik für zuhause.” https://www.audacity.de/ (consulted 17.2.2021).
- CIUTI, Conférence Internationale Permanente d’Instituts Universitaires de Traducteurs et Interprètes (2020). “Online platform conference interpreting.” https://www.ciuti.org/education-training/online-platform-conference-interpreting/ (consulted 29.7.2020).
- DFN, German National Research and Education Network (2020). https://www.dfn.de/en/ (consulted 17.2.2021).
- GoReact (2020). “The #1 video software for skill development.” https://get.goreact.com/ (consulted 17.2.2021).
- Green Terp (2021). “GT EDU Green Terp simultaneous interpreting training platform.” https://www.gtmeeting.com/solutions/gt-edu-green-terp-si-training-platform (consulted 19.2.2021).
- Rausch, Jan (2020). “Remote interpreting hubs.” https://www.google.com/maps/d/viewer?mid=1sp-0NXphjX3BHo7g95XTZFly0a8Qb2qd&ll=-3.81666561775622e-14%2C25.870848599999988&z=1 (consulted 13.10.2020).
- Zoom (2021). “Zoom video communications.” https://zoom.us/ (consulted 19.2.2021).
Appendix
Survey questions
| German [original] | English [translation] |
|---|---|
| Name (kann auch leer bleiben) | Name (optional) |
| Rolle | Role |
| Woche | Week |
| Zeitaufwand insgesamt im Vgl. zu vergleichbarer Präsenzwoche (einschl. Überziehungen/weggelassener LV, Orga-Overhead, Fahrzeiten etc.) | Total time spent compared to a comparable on-site week (taking into account, e.g., teaching time in addition to the regular timetable, cancelled lessons, organisational overhead, travel time, etc.) |
| Zeitaufwand pro Lehrveranstaltung im Vgl. zu Präsenz-LV | Time spent per course compared to on-site teaching |
| Kontaktzeit (gleichzeitige Anwesenheit in Raum/virtuellem Meeting) insgesamt im Vgl. zu vergleichbarer Präsenzwoche | Total contact time (being present together in a physical room/virtual meeting) compared to a comparable on-site week |
| Qualität der Interaktion mit Dozenten/Studierenden im Vgl. zu Präsenzveranstaltungen | Quality of interaction with trainers/students compared to on-site teaching |
| Wirkungsgrad im Vgl. zu Präsenzveranstaltungen | Level of efficiency compared to on-site teaching |
| Anstrengung/Erschöpfung im Vgl. zu Präsenzveranstaltungen | Degree of strain/fatigue compared to on-site teaching |
| Was war besonders schwierig/gewöhnungsbedürftig? | What was particularly difficult/hard to get used to? |
| Was war besonders hilfreich? | What was particularly helpful? |
| Anekdoten (Pannen, Highlights, Lustiges)? | Anecdotes (mishaps, highlights, funny stories)? |
| Wünsche/Vorschläge? | Wishes/suggestions? |
| Sonstige Kommentare? | Any other comments? |
Biographies
Barbara Ahrens has been Full Professor for Interpreting (Spanish) at TH Köln, Cologne, Germany, since 2006, after graduating in conference interpreting from the University of Heidelberg, completing a PhD on prosody in simultaneous interpreting and holding a junior professorship for Translation Studies, both at the University of Mainz/Germersheim. Her research focuses on prosody, note-taking and cognition in interpreting. She is a practising conference interpreter, a member of AIIC and AIIC’s Research Committee, and a member of the CIUTI Board.
E-mail: barbara.ahrens@th-koeln.de
Morven Beaton-Thome is Full Professor for Interpreting (English) and Programme Director of the MA in Conference Interpreting at TH Köln – University of Applied Sciences, Cologne. Prior to her appointment in 2013, she held tenured academic positions at the Centre for Translation and Intercultural Studies (CTIS) at The University of Manchester and at Saarland University. Her research interest in interpreting pedagogy and didactics focuses on situated expertise and communities of practice.
E-mail: morven.beaton-thome@th-koeln.de
Anja Rütten is a freelance conference interpreter and translator for German, Spanish, English and French, working on the private market and for the EU institutions and the European Patent Office. She has taught on the MA in Conference Interpreting at TH Köln since 2004 and completed her PhD on Information and Knowledge Management in conference interpreting in 2006. Her research focuses on Information and Knowledge Management and its intersection with new technologies.
E-mail: ruetten@sprachmanagement.net