Peer Review Activity and a Search-Engine based Corpus System
By Yin Ling Cheung, National Institute of Education, Nanyang Technological University, Singapore
Contact: yinling.cheung@nie.edu.sg
Abstract
For the past two decades, we have witnessed a number of peer review research studies in both first and second/foreign language writing classrooms. Few studies, however, have built a custom search-engine based corpus system that performs searches on relevant texts for academic writing tasks such as peer review. This study investigates students' perceptions of a peer feedback task supported by a search-engine based corpus system called Word Engine. The participants were 322 first-year undergraduates across disciplines who took an academic writing course at a large public university in Singapore. Data were collected from background questionnaires about the participants, peer reviews on first drafts of the students' papers, and the students' final papers after incorporating feedback from the peer review. Findings showed that students believed the peer feedback activity was useful. They made revisions to various aspects of their papers, including the discussion of results, the development of ideas, the macro-rhetorical goal of the paper, and the use of academic language such as hedges. Students used Word Engine because it excluded all non-academic websites. The study contributes to the fields of academic writing and corpus linguistics, particularly by showing how peer feedback combined with Word Engine can promote student autonomy in learning.
Keywords: Peer review, search-engine based corpus system, academic writing, undergraduate students
Introduction
In writing pedagogy, online corpora merit particular attention for their educational potential: they provide resources that help learners develop confidence in using the target language, form positive perceptions of corpus use, build error-correction skills, and develop lexico-grammatical competence. Studies have shown that learners' textual performance, e.g. the extent to which they improve their essays through revision, relies on a series of hypothesis-testing endeavors whereby they compare example sentences with their own. A search engine (Google, for example) can be a practical scaffolding resource, as it allows users to revise their searches and retrieve samples quickly and flexibly, and thereby test their linguistic hypotheses.
Few studies, however, have built a custom search-engine based corpus system that performs searches on relevant texts for academic writing tasks, particularly peer review. My project team proposed to integrate such a system into an actual writing course for tasks such as research paper writing. Peer review offers a good opportunity for preservice teachers to enhance their writing skills through corpus use: the activity requires students to take a reviewer's perspective rather than an author's, so they need to look for language examples against which to compare the text under review and give comments to the writer. In other words, we hope the corpus system will serve as a textual repertoire for preservice teachers to explore and discover the rhetorical conventions and lexico-grammatical patterns used in an unfamiliar discourse, i.e., the peer reviewer's report. Findings of the study contribute to the field by suggesting how the corpus system might help preservice teachers improve their academic writing.
Literature Review
We have witnessed a number of peer review research studies in both first and second/foreign language writing classrooms. Peer review, also commonly known as peer response, is defined as the process in which 'students critique and provide feedback on one another's writing in small groups' (Zhu, 2001, p. 251). Many writing researchers believe that peer review brings benefits in the 'cognitive, affective, social, and linguistic' dimensions (Min, 2006, p. 118). Peer review benefits university and secondary school students in various ways, such as enhancing their awareness of readers in writing (Min, 2006), developing a positive attitude toward writing (Min, 2006), and facilitating student autonomy in learning (Yang, Badger, & Yu, 2006). Peer review activity is also useful to writing teachers when they plan Assessment for Learning (Swaffield, 2011; Willis, 2011) and formative assessment (Sadler, 1989), and set achievement goals for students (Clark, 2012).
Over the years, one set of studies has investigated the effects of peer feedback on student revision and writing quality (Kaufman & Schunn, 2011; Yang, Badger, & Yu, 2006). These studies have yielded mixed results: some have revealed that peer feedback leads to better texts, while others have shown that peer review does not improve the quality of the papers. Motivated by these mixed results, another set of studies has focused on the impact of training on peer feedback (Hovardas, Tsivitanidou, & Zacharia, 2014; Kulkarni, Bernstein, & Klemmer, 2015; Min, 2016; Rahimi, 2013). The positive results of these studies indicate that training can improve the quantity and quality of peer feedback and increase student interaction and negotiation in the peer review process. A third strand of research has examined student participation and interaction in peer review processes (Zhu, 2001; Zhu & Mitchell, 2012). Since student perceptions could affect the quality of writing and revision, a further line of research has investigated student perceptions of peer feedback (Kaufman & Schunn, 2011).
Criticisms have been raised concerning the research design of these early studies on peer feedback. First, some studies combined teacher-student conferences with trained peer feedback (e.g., Min, 2006, 2016). In such designs, students in the control groups were not treated fairly, since, unlike the experimental groups, they did not receive teacher input before the final submission of their papers for grading. It is also unclear whether the observed improvements in final drafts should be attributed to feedback from teacher-student conferences or to peer/self-feedback. Second, the majority of the studies employed small samples, so it is difficult to estimate the extent to which the findings generalize to larger populations. Third, over the past two decades, the methodologies of peer review research have focused on the use of think-aloud protocols to (i) identify the kinds of revisions made (organization, content, vocabulary, grammar, etc.); (ii) familiarize students with the genres of their classmates' writing; and (iii) encourage students to use effective communication skills in interacting with their peers, in order to facilitate peer response and peer talk. Studies that use an innovative research methodology to systematically investigate the impact of trained peer feedback on student writing are absent. Adopting such a method can shed new light on why trained peer feedback may enhance writing quality and promote student autonomy in learning.
The present study aims to investigate students' perceptions of the peer feedback task using an innovative research method. To address the problems mentioned above, we did not include teacher-student conferences; rather, we focused on trained peer evaluation using a new research tool. We engaged a large number of participants, specifically first-year undergraduates majoring in Education at a large public university in Singapore. We introduced an innovative method to examine peer feedback by creating a web-based system hosting a database of academic texts and a custom search engine, for our participants to use in the trained peer review process. The peer response, the first draft of the research paper before the peer response, and the final version of the paper after the peer response all constituted part of the course assessment, so that all students, whether or not they took part in this research study, treated the peer review task seriously.
Our study has three novel features. First, we examine perceptions of trained peer feedback on the writing of first-year pre-service student-teachers based in Asia who use English as their first language; a population of this type and size has not been investigated in previous evidence-based research. Second, the participants were taught techniques for conducting systematic peer reviews, using a new web-based system hosting a database of academic texts and a custom search engine created by our research team. This method enhanced the participants' interaction and negotiation with their peers and at the same time promoted autonomy in learning. Third, we collected both quantitative and qualitative data to understand the kinds of peer feedback that students incorporated into the final versions of their research papers and the types of revisions that lead to better texts, regarding idea development, the organization of information, and sufficiency. We conducted the study in the context of a writing course because writing proficiency is considered a key indicator of college success. The study thus contributes new knowledge on English-as-a-first/second-language writing, trained peer review, corpus linguistics, and the impact of peer review on student revision.
Research Design
(i) Research Questions
We sought answers to three research questions:
- How did the students perceive the peer review activity?
- Which aspects of the draft did the students revise, as indicated in their self-evaluation forms?
- What factors might facilitate/hinder the students’ use of ‘Word Engine’ for a peer review activity?
(ii) Introducing an innovative method in peer feedback research
As a resource for the course and a research tool in its own right, the project team created a web-based system with a database of academic texts and a custom search engine built on Google's Custom Search Engine service. As an electronic text collection, the system allows students to access more than 400,000 words of academic texts in relevant broad topic areas: communication and technology, language use, and language learning.
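To make the architecture concrete, the sketch below shows how such an engine can be queried programmatically through Google's Custom Search JSON API, the interface on which the Custom Search Engine service is exposed. This is a minimal sketch, assuming a hypothetical API key and engine ID (cx) for an engine restricted to the curated academic collection; it is not the project's actual configuration.

    import requests

    # Hypothetical credentials: a real deployment would use the course's
    # own API key and the ID (cx) of a Custom Search Engine restricted
    # to the curated collection of academic texts.
    API_KEY = "YOUR_API_KEY"
    ENGINE_ID = "YOUR_ENGINE_CX"

    def search_corpus(query, num_results=10):
        """Query the custom engine and return (title, snippet) pairs."""
        response = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": ENGINE_ID,
                    "q": query, "num": num_results},
            timeout=10,
        )
        response.raise_for_status()
        items = response.json().get("items", [])
        return [(item["title"], item["snippet"]) for item in items]

    # Example: a student checking how a hedging phrase is used in
    # academic prose before commenting on a peer's draft.
    for title, snippet in search_corpus('"appears to suggest"'):
        print(title, "-", snippet)

Because such an engine indexes only the sites it is configured with, every result comes from the curated academic collection rather than the open web.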
The system's search tool provides rich context: it shows the sentences that co-occur with the search terms and highlights all the search terms rather than single keywords. This functionality is essential for the writing course in this study, and it lets the system function as an interactive writing reference rather than a word list or dictionary.
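A minimal sketch of this kind of sentence-level, multi-term highlighting is given below. The function names, the plain-text **...** markers, and the simple sentence splitting are illustrative assumptions, not the system's actual implementation.

    import re

    def highlight_terms(sentence, terms):
        """Wrap every occurrence of every search term in **...** markers."""
        # Match longer terms first so multi-word terms are found before
        # their component words.
        pattern = "|".join(re.escape(t)
                           for t in sorted(terms, key=len, reverse=True))
        return re.sub(f"({pattern})", r"**\1**", sentence,
                      flags=re.IGNORECASE)

    def concordance(text, terms):
        """Return each sentence containing at least one search term,
        with all terms highlighted, so hits are read in full context."""
        sentences = re.split(r"(?<=[.!?])\s+", text)
        hits = [s for s in sentences
                if any(t.lower() in s.lower() for t in terms)]
        return [highlight_terms(s, terms) for s in hits]

    sample = ("The results appear to suggest a modest effect. "
              "This effect may be moderated by prior training.")
    for line in concordance(sample, ["appear", "may"]):
        print(line)

Returning whole sentences rather than isolated keywords is what lets students study how a term behaves in running academic prose.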
In addition to its search capability, the system has a logging device to record the searches that users enter. The device was implemented as a program plugged into the system; it saves users' queries in a separate database, from which a log of the search queries can be retrieved.
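The sketch below illustrates one way such query logging might work, assuming a SQLite store; the table layout, anonymized user IDs, and function names are illustrative assumptions rather than the project's actual schema.

    import sqlite3
    from datetime import datetime, timezone

    # Hypothetical schema: one row per query, timestamped and keyed to
    # an anonymized user ID so logs can be analyzed per participant.
    conn = sqlite3.connect("query_log.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS query_log (
                        user_id   TEXT,
                        query     TEXT,
                        logged_at TEXT)""")

    def log_query(user_id, query):
        """Record a search query with a UTC timestamp."""
        conn.execute("INSERT INTO query_log VALUES (?, ?, ?)",
                     (user_id, query,
                      datetime.now(timezone.utc).isoformat()))
        conn.commit()

    def retrieve_log(user_id):
        """Retrieve one user's queries in chronological order."""
        cursor = conn.execute(
            "SELECT query, logged_at FROM query_log "
            "WHERE user_id = ? ORDER BY logged_at", (user_id,))
        return cursor.fetchall()

    log_query("s001", '"appears to suggest"')
    print(retrieve_log("s001"))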
(iii) Research Methodologies
In order to collect accurate and reliable data that shed light on the improvement of student writing over a school semester, we used the following instruments: background questionnaires about the students, peer reviews on the first drafts of the students’ research papers, and the students’ final papers after incorporating feedback from the peer review.
(a) Participants and context of study
The participants were 322 first-year undergraduates across disciplines who took the Academic Discourse Skills course in the January 2013 semester at a large public university in Singapore. Their ages ranged from 18 to 59. The course was compulsory for all first-year undergraduates in the BA/BSc (Education) programme. The project was approved by the Institutional Review Board (IRB-2012-10-037).
(b) Background questionnaires about the participants
All participants were required to complete a background questionnaire in the first week of class, which elicited information about their experience of peer feedback, their paper writing activities in the previous semester, their level of integrative and instrumental motivation in writing academic papers, and the writing skills they hoped to have upon completing the Academic Discourse Skills course.
(c) Peer reviews on first drafts of the students’ research papers
In the penultimate week of the academic semester, each student was asked to bring two copies of their draft and invite two classmates to give both quantitative and qualitative comments on it. All students in the course were involved in the in-class peer review session. Before the students performed the peer review task, the writing instructors went through the detailed peer review forms specially designed for the course and the database of academic texts created by the project team. The students were asked to follow the guidelines in the peer review form to give their peers useful feedback on the first drafts of their research papers. All students were required to submit their first drafts two weeks before the final paper deadline; the first draft and the completed peer review forms together made up 10% of the course grade. Before the peer review activity, all students were expected to have completed a self-evaluation form on the first draft of their papers at home.
(d) The students’ final papers after incorporating feedback from the peer review
The students' final revised research papers were analyzed to find out what kinds of peer feedback had been incorporated. Another objective of this analysis was to find out whether the incorporated peer feedback indeed led to better writing.
In the first and the penultimate weeks of the academic semester, the course coordinator explained the project via Blackboard (and tutors explained it at the beginning of class) and invited students to participate. Students who did not wish to take part in the project still completed the background questionnaires, the peer reviews on the first drafts of their research papers, and the final papers, but their data were not included in the project.
(e) Data analysis
A selective reading approach (Van Manen, 1990) was used: attention was paid to data that shed light on the research questions. Through reading and re-reading, categories were established relevant to the research questions and the conceptual framework, i.e., Assessment for Learning (Swaffield, 2011) and formative assessment (Sadler, 1989), and the data were coded accordingly. The categories that emerged from the data included (a) how students understood the goals of learning and what constituted quality work; (b) how they saw their role and the peer reviewers' role in the peer review activity; and (c) how they made multi-criterion judgments with reference to salient properties of the marking rubrics.
Findings and Discussion
Student perceptions of the peer review activity
- 92% of the participants believed that the peer feedback activity was useful because it allowed them to see their peers' perspectives on the same topic. The peer reviewers were able to point out mistakes and specific areas for improvement (e.g., the logical flow of ideas, the organization of the paper, content, and linguistic aspects).
- Students gradually developed their evaluative knowledge and expertise as they tried out ways in which others had approached the same task (Sadler, 1989).
- Students used success criteria (i.e., the items in the peer review form and the moves of different sections of a research paper) to establish what counted as achievement of the learning goals (Clark, 2012).
Aspects of the papers that students revised
- 62% of the students revised various aspects of their papers: the description and discussion of findings, the development of ideas, the use of secondary sources, the macro-rhetorical goal of the paper, citation of sources, grammar, the organization of ideas, sentence structure, paraphrasing of ideas, and the use of academic language such as hedges.
- Students compared their current performance with what was expected (Sadler, 1989); when they found a particular piece of peer feedback useful, they acted on it. The trained peer review activity enhanced students' ability to revise and improve their writing (Min, 2006).
Factors facilitating the use of Word Engine
- Students used Word Engine for the following reasons:
- It excluded all non-academic websites.
- It focused on academic journals which were reliable and credible.
- It was easily accessible.
- It was more efficient than traditional methods such as looking up resources in the library.
- Its use was encouraged by the writing teachers.
Factors hindering the use of Word Engine
- Some students chose not to use Word Engine to assist with their peer review task, for the following reasons:
- Some of them were more comfortable with existing tools, e.g., Google and library e-resources.
- Google was easier to use, more efficient and effective.
- Books were better than open-access journal articles.
- Some students believed that Word Engine was not needed as they had already gathered sufficient reading sources from Google and the university library.
- Self-monitoring leads to student ownership of and responsibility for learning (Swaffield, 2011; Willis, 2011). Students are resourceful: they can identify the secondary sources they need, whether from an existing search engine (Google) or from a custom search engine developed for the course.
Conclusion
The study investigated the impact of trained peer feedback with the use of a custom search engine and identified the kinds of revisions that lead to better texts. It has provided crucial empirical evidence to support the claim that trained peer feedback may affect student writing in important ways. The findings contribute to knowledge in English-as-a-first-language and English-as-a-second-language writing and corpus linguistics. The study also provides insights into how trained peer feedback with the use of Word Engine can promote student autonomy in learning.
Acknowledgements
This study was supported by a start-up grant (SUG 6/13 CYL) from the National Institute of Education, Singapore.
References
Clark, I. 2012. ‘Formative assessment: Assessment is for self-regulated learning.’ Educational Psychology Review, 24, 2, pp. 205-249.
Hovardas, T., Tsivitanidou, O. E., & Zacharia, Z. C. 2014. ‘Peer versus expert feedback: An investigation of the quality of peer feedback among secondary school students.’ Computers & Education, 71, pp. 133-152.
Kaufman, J. H., & Schunn, C. D. 2011. ‘Students’ perceptions about peer assessment for writing: their origin and impact on revision work.’ Instructional Science, 39, pp. 387-406.
Kulkarni, C., Bernstein, M. S., & Klemmer, S. 2015. ‘PeerStudio: Rapid peer feedback emphasizes revision and improves performance.’ Proceedings of the Second ACM Conference on Learning @ Scale, pp. 75-84.
Min, H. T. 2006. ‘The effects of trained peer review on EFL students’ revision types and writing quality.’ Journal of Second Language Writing, 15, pp. 118-141.
Min, H. T. 2008. ‘Reviewer stances and writer perceptions in EFL peer review training.’ English for Specific Purposes, 27, 3, pp. 285-305.
Min, H. T. 2016. ‘Effect of teacher modeling and feedback on EFL students’ peer review skills in peer review training.’ Journal of Second Language Writing, 31, pp. 43-57.
Rahimi, M. 2013. ‘Is training student reviewers worth its while? A study of how training influences the quality of students’ feedback and writing.’ Language Teaching Research, 17, pp. 67-89.
Sadler, D. R. 1989. ‘Formative assessment and the design of instructional systems.’ Instructional Science, 18, 2, pp. 119-144.
Swaffield, S. 2011. ‘Getting to the heart of authentic Assessment for Learning.’ Assessment in Education: Principles, Policy, & Practice, 18, 4, pp. 433-449.
Van Manen, M. 1990. Researching lived experience: human science for an action sensitive pedagogy. London, ON: Althouse Press.
Willis, J. 2011. ‘Affiliation, autonomy, and Assessment for Learning.’ Assessment in Education: Principles, Policy, & Practice, 18, 4, pp. 399-415.
Yang, M., Badger, R., & Yu, Z. 2006. ‘A comparative study of peer and teacher feedback in a Chinese EFL writing class.’ Journal of Second Language Writing, 15, pp. 179-200.
Zhu, W. 2001. ‘Interaction and feedback in mixed peer response groups.’ Journal of Second Language Writing, 10, pp. 251-276.
Zhu, W., & Mitchell, D. A. 2012. ‘Participation in peer response as activity: An examination of peer stances from an activity theory perspective.’ TESOL Quarterly, 46, 2, pp. 362-386.