To score highly on essay-writing tests, learners of English as a foreign language should focus more on making good arguments than on using complex grammar. This finding from Kobe University challenges conventional approaches to test preparation and to scoring rubrics.
Writing essays is a well-established tool for monitoring progress in learning English as a foreign language, as it provides a snapshot of a student’s mastery of grammar and vocabulary. This is especially true in Japan, where English language tests are often required for university admission and students closely follow advice on how to achieve high scores on them; a “good essay” is often seen as one that demonstrates a high level of grammatical complexity. But is this actually reflected in test scores?
Kobe University linguist YASUDA Sachiko expresses her doubts: “Based on my experience of teaching academic writing to students at various levels in Japan, I believe that linguistically complex texts do not always result in better writing.” She therefore conducted an experiment with over 100 Japanese high school students, having them write a short essay on a given topic. She then examined the relationship between the linguistic complexity of the texts and the writers’ ability to present complex arguments, and how both related to the grades the essays received under official scoring rubrics. She adds, “This study is the first to focus on the relationship between features of linguistic complexity and features of meaning complexity; no one else in the relevant fields has looked at the relationship between these two.”
The results, published in the journal Assessing Writing, confirmed her suspicions. She found that high-scoring essays shared features related more to the ability to express complex meaning, such as lexical diversity, noun modification, and the soundness and number of arguments, than to structural complexity. “Interestingly, low scoring essays showed the highest level of complexity in finite adverbial dependent clauses,” the linguist writes in her paper. Reinforcing this point, her analysis showed that the ability to express complex meaning correlated strongly only with lexical diversity and noun modification, not with grammatical features. Yasuda concludes, “Simply having complex sentence structures does not necessarily lead to a better essay.”
The findings have implications for how essay writing tests are scored. The Kobe University researcher explains: “Current rubrics for writing questions on language tests instruct test-takers to use ‘complex grammar appropriately’ or ‘a variety of complex structures.’ However, since sentence complexity does not significantly affect overall essay quality, it may be more appropriate to use terms such as ‘contextually appropriate grammar’ or ‘genre-appropriate grammar.’” She argues that the ability to express one’s opinion in varied and complex ways is a marker of students’ writing ability, and advocates that this characteristic be better represented both in how tests are scored and in how feedback is provided to students.
This so-called washback effect, whereby test scoring rubrics shape the way language is taught, is at the heart of what drives Yasuda. She says: “I am committed to using the results of this study for practical applications, such as refining assessment criteria for evaluating students’ writing, developing tasks and materials to improve their writing skills, and identifying the key knowledge that teachers need to help students become better writers. The ability to write in English has become increasingly important in the 21st century, as it is a crucial medium that allows us to connect with others around the world.”
This research was funded by the Japan Society for the Promotion of Science (JSPS) under grant JP24K04031.