Jack Burston
Temple University
Abstract:
The purpose of this paper is to describe theoretical and practical considerations related to the provision of feedback on the written compositions of advanced level foreign language learners, as exemplified by second year (semester 7-8) students of French. The paper begins by discussing the approach taken to teaching and assessing the writing skills of students at Monash University. It then proceeds to a consideration of how using a computer-based composition annotation program, Markin32, can contribute to the reduction of correction loads for instructors and the improvement of the quality and usefulness of composition feedback for students.
KEYWORDS
Advanced Level Writing, Computer Feedback, Automated Annotation, Focus-On-Form Approaches
INTRODUCTION
As part of a long-term strategy to improve the effectiveness of the teaching of writing skills in advanced level (semester 7-8) second-year French language courses at Monash University, a two-pronged pedagogical approach has been adopted.1 The first, reported previously in the CALICO Journal, concentrates on the problem of morphosyntactic accuracy and ways to remedy it (Burston, 2001). The second focuses on higher level
syntactic, semantic, and lexical accuracy as well as discourse coherence and cohesion. It is this latter aspect of advanced writing skills that is the topic of this article.
The acquisition of foreign language writing skills at advanced levels is a challenging task for students, one for which dominant teaching methodologies that focus on oral communicative competency provide little preparation. For English speaking students of French, the task is made all the more difficult by the norms of formal discourse in French, which are frequently more demanding than those expected in their native language. Learner-centered approaches to the development of writing skills quite justifiably focus on the process of writing as opposed to the product of writing. To be effective, however, learner-centered approaches necessarily involve a cyclical process of drafting and feedback. From the teacher's perspective, marking essays can be enormously time consuming. From the student's viewpoint, the more detailed the corrective comments, the more difficult and off-putting they can be to assimilate. Practical problems, such as the legibility and consistency of marginal and interlinear annotations, further contribute to student difficulties. Moreover, as is well known, little is to be gained from the correction of compositions if feedback is not systematically linked to the process of rewriting. As traditionally practiced, multiple correction of essays is as burdensome for students as it is impractical for teachers and, hence, not practiced nearly as much as it should be.
THE PROCESS AND PRODUCT OF WRITING AT MONASH UNIVERSITY
Within the framework of second language acquisition theory, advocates of "Focus-on-Form" approaches (Doughty & Williams, 1998) stress the need to direct the attention of language learners to formal problems within a meaningful linguistic environment, as opposed to decontextualized grammar exercises. While there are many ways to achieve this objective, the guiding principle of such approaches is to disrupt the flow of communication as little as possible and to get learners to "notice the gap" between what they have produced and what is required by the target language. In essence, the approach is intended to foster discovery learning. When applied to composition correction, such an approach can also be used effectively to reduce marking loads substantially, thereby making cyclical correction and rewriting possible.
In the case of our students, the process and product of writing are dealt with separately during the term. In previous years, students wrote three essays (250-300 words), each marked only once, plus a final exam composition. Students now write only two essays but submit them for correction
two or even three times. The former final 90-minute examination composition, the correction of which students usually never saw, has now been replaced by two essays written in class under test conditions (i.e., completely closed book). The first TST (Travail sur Table, as it is called in French) is given at midterm and, like homework essays, is returned to students within a week. Aside from the linguistic feedback it provides, the corrected TST also serves as an early warning to students of what to expect on the second TST given during the final week of classes. The second TST, too, is corrected and returned to students within a week, that is, prior to the final exam. Whether the focus is on the writing process or product, an important part of our pedagogical strategy is making writing activities during the term count substantially in course assessment: 15% for the two homework essays plus 10% for each of the TSTs, for a total of 35%. To encourage students to take their writing seriously from the beginning, first drafts account for 80% of the homework essay mark.
Compared to previous practice, where feedback was only provided on three pieces of written work, students now receive feedback on their writing nine times during the term. Of particular importance, because of the use of focus-on-form techniques, the goal of nine corrections has been achieved without significantly increasing workloads for students or staff. First drafts of homework assignments are corrected with minimal intervention from instructors who use two types of mark-up devices. Morphosyntactic mistakes (e.g., errors in spelling, accents, and gender agreement) are simply highlighted. Higher level syntactic and semantic difficulties (e.g., vocabulary, tense usage, and phrasal constructions) receive equally minimal treatment the first time around: a maximum of 20 errors are simply underlined without comment. Only in the case of lexical problems is the nature of the difficulty identified by the symbol "Voc." Using this technique, an instructor can usually mark a first draft in about 5 minutes rather than the 20-25 minutes usually required for a full correction.
As part of normal classroom instruction, students are taught how to spot and eliminate simple mistakes on their own. They are also taught how to use a French grammar checker (Antidote) to verify the accuracy of their own corrections. In any event, nothing is to be gained from commenting on simple mistakes in marking students' work and even less from correcting them. From a theoretical as well as a practical point of view, all that can be done is to draw attention to the gap between students' level of writing proficiency and that required in French and place responsibility on students to eliminate it. Since 20% of their first draft mark (i.e., the equivalent of a full letter grade) is determined by basic grammatical accuracy, students quickly get the message and submit work reasonably free of fautes bêtes 'low level errors.' (For discussion of the issue of low level errors, see Burston, 2001.)
The simple underlining of other types of difficulty in homework essays serves the same purpose of focusing students' attention on errors and requiring them to reflect on what they have written. Even at an advanced level, the number of such problems can run into the dozens in a typical 250-300 word composition. Restricting the number of underlinings to 20 errors has important advantages for teachers and learners alike. For the instructor, it reduces the amount of correction required and helps to ensure that areas targeted for treatment are those that are the most relevant to the curriculum. For students, it significantly stems the usual flow of red ink, thereby helping to keep the "affective filter" down. Analysis of second draft compositions has shown that about three-quarters of underlined problems are adequately addressed by the students themselves, either on their own or through out-of-class discussions with their instructors. Admittedly, this approach occasionally leads to avoidance strategies and some inappropriate corrections, but, overall, second drafts are substantially improved with minimal teacher intervention.
The correction of second drafts, though more comprehensive, is nonetheless again kept as minimal as possible in order to engage the maximum attention and reflection of students. As with first draft corrections, minimal commentary has the advantage of reducing marking workloads, focusing on the most relevant problem areas, and not overwhelming student compositions with annotations. The workload involved in marking second drafts is of course considerably reduced by the prior elimination of morphosyntactic mistakes and other errors. To facilitate correction, a standardized code of some two dozen short annotations is employed, for example, Réf (ambiguous reference), Tps (wrong tense), Ord (incorrect word order), Cv (faulty verbal construction), and Con (discourse connector needed). Faulty sentences are not rewritten for students and individual comments are again intentionally kept to a minimum in order to focus as much as possible on the positive features of the substance of the compositions. As a rule, second drafts usually take no more than 10 minutes each to mark in this fashion and can be quickly returned to students.
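To make the mechanics of such a code concrete, the sketch below represents the five annotations cited above as a simple mapping and tallies how often each is applied to a draft. This is purely illustrative Python, not Markin32's internal format, and the remaining codes used in the course are not reproduced here.

```python
# Illustrative sketch only (not Markin32's internal format): the short
# annotation code represented as a mapping, plus a tally of how often each
# code was applied to one essay. Only the five codes cited in the text appear;
# the other codes used in the course would be added in the same way.
ANNOTATION_CODES = {
    "Réf": "ambiguous reference",
    "Tps": "wrong tense",
    "Ord": "incorrect word order",
    "Cv": "faulty verbal construction",
    "Con": "discourse connector needed",
}

def tally_annotations(codes_used):
    """Count how often each annotation code was applied to one essay."""
    counts = {}
    for code in codes_used:
        if code not in ANNOTATION_CODES:
            raise ValueError(f"Unknown annotation code: {code}")
        counts[code] = counts.get(code, 0) + 1
    return counts

# Codes applied while marking a hypothetical second draft
print(tally_annotations(["Tps", "Réf", "Tps", "Con"]))
# {'Tps': 2, 'Réf': 1, 'Con': 1}
```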
Because of time and workload limits, only one of the two term essays undergoes a third draft. However, knowing that they must produce two compositions in class under test conditions, students are inclined to look over their returned second draft carefully (whether or not they are required to resubmit it), which was demonstrably not the case when they only submitted a single draft of essays and faced a distant final exam. Given the writing-feedback cycles which precede third drafts, final correction can focus on sentence remodeling and more discourse related difficulties. While the emphasis remains on the process of writing, the final product is also very much in evidence at this point. In terms of correction time, it can still take 10-15 minutes to mark the third draft of an essay; but
the time is productively spent on higher order matters, and the results are well worth the effort. Since only half the term essays are corrected a third time, overall correction workloads remain essentially the same as before.
As previously indicated, the product of student writing is separately assessed by two TSTs, one at midterm, the other in the last week of classes. The TSTs are marked in exactly the same manner as the first drafts of homework essays, in the form of highlighted morphosyntactic mistakes and underlined syntactic/semantic errors. Although instructors have to mark two TSTs, compared to the previous one exam essay, minimal correction results in comparable marking workloads. The midterm TST is returned to students, who then resubmit it for 20% of the mark. The final TST is the only composition submitted just once. A detailed analysis of TST results from the second semester of 1999 (when they were first introduced) showed a marked improvement in grammatical accuracy of the midterm TST compared to the final exam essay of the first semester: 6.5 errors per 100 words versus 8.1 errors per 100 words. Likewise, the end-of-term TST showed an equally large improvement compared to the first TST (4.8 errors per 100 words vs. 8.1 errors per 100 words). The results from the first-semester 2000 TSTs are even more encouraging. A diagnostic TST was administered during the second week of classes. Analysis revealed an underlying error rate of 10.1 errors per 100 words, which was essentially what had been observed in previous years' examinations. In comparison, the midterm TST results were 5 errors per 100 words, an improvement of more than 50% and very nearly what it took a whole year to achieve in 1999. Analysis of this TST data set with regard to higher order syntax, vocabulary, and discourse structure has not yet been completed, but early results also appear encouraging.
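For readers who want to reproduce the measure, the error rates quoted above are simply raw error counts normalized to essay length. The short sketch below shows the calculation; the 10.1 and 5.0 figures come from the text, while the raw counts in the first example are invented for illustration.

```python
# A minimal sketch of the error-rate measure used above: errors per 100 words
# and the percentage improvement between two sittings. Only the 10.1 and 5.0
# values come from the article; the raw counts below are illustrative.
def errors_per_100_words(error_count, word_count):
    return round(error_count / word_count * 100, 1)

print(errors_per_100_words(25, 250))   # 10.0 errors per 100 words

def improvement(before, after):
    """Percentage reduction in error rate between two measurements."""
    return round((before - after) / before * 100, 1)

print(improvement(10.1, 5.0))  # 50.5, i.e., "an improvement of more than 50%"
```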
MARKIN32
In seeking to improve composition feedback without increasing marking workloads, our attention was drawn to an inexpensive shareware program called Markin32 which allows automated annotation of essays. (For a review of Markin32, see Burston, 1998.) Markin32 operates only on Windows 95/98 but produces annotated compositions in ASCII, RTF, and HTML. Consequently, work can be returned to students in a format compatible with both PC and Macintosh platforms. Likewise, since Markin32 accepts as input texts in either ASCII or RTF, it can accommodate student essays written with virtually any word processor regardless of platform.
From the instructor's viewpoint, Markin32 is extremely easy and flexible to use. The work space of Markin32 is essentially that of a word processor with familiar File and Edit options in the top toolbar menu. Although texts can be created directly in Markin32, they are normally imported as ASCII or RTF files for composition correction.
The heart of Markin32 is the system of automated annotation buttons (see Figure 1).
Figure 1
Markin32 Annotation Buttons
As can be seen, the buttons display only a very abbreviated annotation, but when the cursor is placed over them, a fuller description is revealed. Markin32 was originally developed for ESL students and so is delivered with a set of default annotation buttons in English. The buttons of course need to be adapted for foreign language work, which can be easily done by typing the required annotations into three text fields (see Figure 2).
Figure 2
Adapting Markin32 to French
Once the changes are made, the buttons can be saved as an external set and made the default for the program. It is possible to create any number of annotation button sets and use them to meet different requirements (e.g., advanced level discourse analysis or content commentary). Using the annotation buttons is equally simple; the instructor just highlights the portion of the text to be annotated and presses the appropriate button. Annotations can be undone by selecting the annotation and then clicking the delete annotation button.
In addition to the standard annotation buttons, Markin32 also allows for the insertion of free-form comments. The same procedure is used to insert comments; the instructor just highlights the portion of the text to be commented on and then presses the appropriate button (see Figure 3).
Figure 3
Instructor Comments in Markin32
Another useful feature of Markin32 is the tabulation of error statistics which can be included in returned work, if desired.
Corrected compositions can be returned to students in two text formats as well as in the form of a web page, the selection of which is just a matter of clicking on the corresponding icon. As shown in Figure 4, work returned in RTF appears essentially as it does within the Markin32 editing window.
Figure 4
RTF Text from Markin32
Work returned in HTML format displays only the annotation tags; however, these are in fact hyperlinked to fuller explanations.
The advantages of using Markin32 should be reasonably apparent.
• Standard annotations can be quickly inserted and changed as required.
• Free form comments are readily accommodated.
• Annotation is legible and unobtrusive.
While Markin32 can be used in principle for any kind of annotation, a word of caution is in order regarding the purposes to which it is put and the danger of overwhelming students with feedback. As previously mentioned, within the composition correction framework established for our students, basic morphosyntactic mistakes are noted but never commented on. Consequently, the use of Markin32 for the correction of first drafts would serve no purpose, the whole point of such correction being to focus on form with minimal teacher intervention. The marking of second drafts is an entirely different matter, however, and the kind of abbreviated feedback
provided by Markin32 is particularly appropriate for the intended purpose. Because of prior corrections, second drafts are largely free of low level errors. Accordingly, the annotation code used with Markin32 essentially concentrates on higher order problems. Were this not the case, and correction attempted to deal with lower level errors at the same time, students' compositions would typically be festooned with annotations. As it is, even with a self-imposed limit of 20 syntactic/semantic annotations per essay, care needs to be taken not to overwhelm students with negative feedback and always to include some positive reinforcement, which can be done either with a standard annotation button or a free-form comment.
Benefiting from Markin32 requires a certain number of adaptations to traditional composition writing/correction practice. First, students must of course submit work in word-processed form. In a situation such as ours, where multiple drafting is required and where access to word processors is readily available on campus as well as in the home, getting students to use a word processor poses no problem. Nonetheless, anyone attempting to integrate the use of Markin32 into the curriculum will need to deal with two practical problems: file format and procedures for the submission and return of work.
Unless compositions are saved in ASCII or RTF, Markin32 will be unable to import them. While file formatting is easy enough to do, most students need to be taught how to do it. Inevitably, especially at the beginning of the course, some students will submit their work without properly formatting it, but this difficulty usually takes care of itself very quickly. Moreover, all industry standard word processors, like Word or WordPerfect, can import multiple versions of their own and each other's file formats. As a result, the instructor can usually reformat a student essay before importing it into Markin32, if necessary. Should worse come to worst, the essay can always be returned to students for resubmission in the required format.
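As a rough illustration of how incoming files might be screened before import, the following sketch sorts a folder of submissions by file extension. The folder name and the accepted-extension list are assumptions made for the example; nothing here is part of Markin32 itself.

```python
# Hypothetical sketch: flag submitted essays that are not already in a format
# Markin32 can import (plain text or RTF). The folder name and accepted
# extensions are illustrative assumptions only.
from pathlib import Path

ACCEPTED = {".txt", ".rtf"}

def screen_submissions(folder):
    """Split submissions into those ready to import and those needing reformatting."""
    ready, needs_reformatting = [], []
    for path in sorted(Path(folder).iterdir()):
        (ready if path.suffix.lower() in ACCEPTED else needs_reformatting).append(path.name)
    return ready, needs_reformatting

if Path("submissions/week5").exists():
    ready, to_fix = screen_submissions("submissions/week5")
    print("Ready to import:", ready)
    print("Reformat or return to student:", to_fix)
```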
More problematic than file formatting is the issue of how students should submit their work and how it should be returned to them. After some experimentation, we discovered that the best solution was for second and third drafts to be sent and returned as e-mail attachments. This process has several advantages.
• Unlike ordinary e-mail messages, which, especially on Wintel systems, can preclude the use of diacritics or make access to them very cumbersome (i.e., typing in ANSI codes), file attachments can use any fonts supported by the originating word processor.
• Students can submit their work at virtually any time from a large number of points on campus as well as from home if they have an Internet connection. The same applies to the electronic retrieval of corrected assignments.
• The instructor has proof of submission and, thanks to the time/date stamp accompanying all e-mail messages, can substantiate whether or not deadlines have been respected.
• E-mail messages themselves can facilitate communication between students and the instructor, in particular allowing for more personal interchanges (e.g., explanations why an assignment was late, general words of encouragement, and admonitions).
• Since attached files remain with original messages even after they have been extracted, there is always a backup copy in reserve should disaster strike.
• The submission of essays in digital form creates a potentially rich database for future research into writing skills development.
As with Markin32 itself, the use of e-mail distribution of student essays also has its special requirements. If instructors retrieve e-mail or extract attachments at more than one location, some system needs to be devised to combine extracted student essays into one database or otherwise keep track of them. Needless to say, up-to-date antivirus protection is an absolute necessity, preferably one which operates automatically when attachments are extracted from e-mail messages.
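One simple way of meeting this bookkeeping requirement is sketched below: saved attachments are logged into a single CSV file, with a timestamp check against the deadline. The filename convention (student_task_draft.rtf), folder names, and the deadline itself are hypothetical assumptions made for the sake of the example.

```python
# Rough sketch of one way to keep track of submissions extracted at several
# locations: record each saved attachment with its receipt time in a single
# CSV log that can be merged later. The filename convention and deadline are
# illustrative assumptions only.
import csv
from datetime import datetime
from pathlib import Path

def log_submissions(attachment_folder, log_file, deadline):
    rows = []
    for path in sorted(Path(attachment_folder).glob("*.rtf")):
        received = datetime.fromtimestamp(path.stat().st_mtime)
        student, task, draft = path.stem.split("_")  # e.g. dupont_essay1_draft2.rtf
        rows.append({
            "student": student,
            "task": task,
            "draft": draft,
            "received": received.isoformat(timespec="minutes"),
            "on_time": received <= deadline,
        })
    with open(log_file, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["student", "task", "draft", "received", "on_time"])
        if f.tell() == 0:            # write the header only once, for a new log
            writer.writeheader()
        writer.writerows(rows)

log_submissions("extracted_attachments", "submissions_log.csv",
                deadline=datetime(2000, 5, 12, 17, 0))
```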
The electronic correction of student essays of course can only take place when connected to a computer. Instructors used to taking home a pile of compositions and correcting them on the fly as circumstances permit (e.g., on the train or while waiting for a dental appointment) are likely to find that the technological overhead outweighs the potential gain. On the other hand, computer-based composition correction makes possible the use of some very powerful ancillary tools such as on-line grammar checkers and dictionaries.
The ability of Markin32 to produce HTML versions of corrected work also opens up the possibility of integrating the results into a course web site. This potential is further extended by the ability of Markin32 to link annotations to external HTML sources such as web sites. For students in our course, this feature has been exploited to make available a context-sensitive on-line reference grammar. For the instructor, linking Markin32 annotations to an appropriate grammar reference is simply a matter of clicking on a menu selection (see Figure 5).
Figure 5
Markin32 Linking Procedures
Likewise, to consult a grammatical reference, all students need to do is click on a designated option (see Figure 6).
Figure 6
Markin32 Grammatical Reference
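Conceptually, the linkage shown in Figures 5 and 6 amounts to a mapping from annotation codes to pages of an on-line reference grammar. The sketch below shows one way such a mapping could be represented and turned into hyperlinked tags; the URLs are invented placeholders and the code is illustrative only, not Markin32's own linking mechanism.

```python
# Hypothetical illustration only: a mapping from annotation codes to sections
# of an on-line reference grammar, and a helper that turns a code into the
# kind of hyperlinked tag an HTML-format essay might carry. The URLs are
# invented placeholders, not those used in the course.
GRAMMAR_LINKS = {
    "Tps": "https://example.edu/grammaire/temps.html",           # tense usage
    "Ord": "https://example.edu/grammaire/ordre-des-mots.html",  # word order
    "Con": "https://example.edu/grammaire/connecteurs.html",     # discourse connectors
}

def linked_tag(code):
    """Return an HTML anchor for an annotation code, or the bare code if no link exists."""
    url = GRAMMAR_LINKS.get(code)
    return f'<a href="{url}">{code}</a>' if url else code

print(linked_tag("Tps"))
# <a href="https://example.edu/grammaire/temps.html">Tps</a>
```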
CONCLUSION
As should be obvious, Markin32 offers many ways of facilitating composition correction and improving feedback to students. Just as obviously, its successful use requires some technological literacy on the part of teachers and learners. Preliminary reactions from students in an earlier pilot group were very positive, and the end-of-year results pointed to improvements in writing skills above those in the control group. One must remember, however, that a number of interdependent factors come into play when introducing pedagogical innovations like the use of Markin32 into the curriculum, not the least of which is the instructor's enthusiasm. For a teacher who is already at ease with instructional technology and accustomed to correcting essays in electronic form, Markin32 can save time and energy, at least enough to compensate for the technological overhead its use entails. Whether or not Markin32 can be integrated into an entire course, to be used by all instructors, would of course depend on the common denominator of technological literacy among those called upon to teach the subject. For those willing to experiment on their own, it is certainly worth its shareware price of $30.
NOTE
1 This project was made possible through the generous support of an Australian National Teaching Development Grant, for which I would like to express my sincere appreciation. The research described here also owes much to the collaboration of my colleague Patrick Durel and graduate student Eugene Mogilevski whose assistance and enthusiasm are equally appreciated.
REFERENCES
Burston, J. (1998). Review of Markin32 (Ver. 1.2). CALICO Journal, 15 (4), 67-74.
Burston, J. (2001). Exploiting the potential of a computer-based grammar checker in conjunction with self-monitoring strategies with advanced level students of French. CALICO Journal, 18 (3), 499-515.
Doughty, C., & Williams, J. (Eds.). (1998). Focus on form in classroom second language acquisition. Cambridge: Cambridge University Press.