Prof Cyril James Weir AcSS MA MSc PhD

Present Appointments

From 2005: Powdrill Professor in English Language Acquisition, Director of the Centre for Research in English Language Learning and Assessment (CRELLA), University of Bedfordshire, and Guest Professor, Shanghai Jiaotong University, PRC.

Education and Qualifications

Dates, establishment, qualification (and class), and subject:

  • 1968-71 University of Reading B.A. [IIi] Politics
  • 1971-72 University of Birmingham Cert. Ed. [P.G.C.E.] Liberal Studies
  • 1972-74 University of Reading M.A. Political Philosophy
  • 1977-78 University of Edinburgh M.Sc. Applied Linguistics
  • 1979-83 University of London Ph.D. Language Testing

Previous Appointments

  • 1972 - 1974 Lecturer in European Studies, Middlesex Polytechnic
  • 1975 - 1977 Senior Lecturer in EFL, MIS Technical College, Iran
  • 1979 - 1983 Research Officer, Associated Examining Board, Aldershot
  • 1983 - 1985 Lecturer in ELT, University of Exeter
  • 1985 - 1986 Teaching Fellow in ELT, University of Lancaster
  • 1986 - 2000 Lecturer in EFL, Director of Testing and Evaluation Unit, CALS
  • 2000 - 2005 Professor in ELT, University of Surrey Roehampton

The social and economic impact of my research

My research resulted in an innovative socio-cognitive framework for language test development and validation, elaborated in two publications: Weir (2005) Language Testing and Validation: An Evidence-Based Approach and Shaw and Weir (2007) Examining Writing. The framework marks the first systematic attempt at providing assessment stakeholders with a coherent and accessible methodology for test development and validation research, covering the social, cognitive and evaluative (scoring) dimensions of language use and linking these in turn to the context and consequences of test use, including the interface with language teaching and learning. The framework allows for serious theoretical consideration of the issues but has also proven capable of being applied practically in critical analyses of test content across the proficiency spectrum; it therefore has direct relevance and value in an operational language testing/assessment context, especially when that testing is conducted on an industrial scale. While other frameworks developed during the 1990s (e.g. Bachman's 1990 Communicative Language Ability (CLA) model and the Council of Europe's 2001 Common European Framework of Reference (CEFR)) undoubtedly helped end users to consider key issues from a theoretical perspective (with the notable lacuna of the cognitive dimension), they generally proved somewhat difficult for examination boards to operationalise in a manageable and meaningful way.

1) To be effective, English language learning, teaching and assessment require meaningful and useful definitions of levels of language proficiency. My research has sought to clarify the English as a second language (ESL) proficiency levels employed by major international English language examinations, particularly the criterial features distinguishing one proficiency level from another. My research into the nature and influence of these differentiating socio-cognitive parameters has helped test providers worldwide to create more effective and efficient tools for assessing language proficiency. Enhanced tests improve selection/gatekeeping functions in society and open doors to higher education, improve job prospects and increase transnational mobility for millions of successful candidates, which is particularly important in times of economic austerity. My work with Cambridge ESOL Examinations, which offers multiple English language examinations at different levels and across different domains to over four million candidates per annum in 130 countries, is a good example of this.

2) Academic English language skills are needed to perform effectively as a language user within university or college contexts, engaging in academic studies with relative independence and understanding. This area of my research has been instrumental in enhancing knowledge of international students' academic proficiency in English and in developing efficient and effective entry-level tests. These tests, as well as satisfying an important selection function, also serve as useful diagnostic measures of a student's proficiency in using Academic English, which enable receiving institutions to offer weaker students appropriate language and study skills support pre- and post-entry. A key application of this research is evidenced by my involvement in global test development projects for entry to and assessment within higher education (e.g. the British Council IELTS test (1.7 million candidates per annum) and the College English Test (CET) in China (17 million candidates per annum)).

Publications

Books

Sole Author
  • Communicative Language Testing with Special Reference to English as a Foreign Language. Exeter Linguistic Studies Volume XI. Exeter University Press, 1988. 241pp.
  • Communicative Language Testing. Prentice Hall, 1990. 215pp.
  • Understanding and Developing Language Tests. Prentice Hall, 1993. 203pp.
  • Language Testing and Validation: An Evidence-Based Approach. Palgrave, 2005. 288pp.
Co-author
  • Alderson, J.C., Candlin, C.C., Clapham, J., Martin, D.J. and Weir, C.J. Language Proficiency Testing for Migrant Professionals: New Directions for the Occupational English Test. Centre for Research in Language Education, University of Lancaster, 1986. 93pp.
  • Weir, C.J. and Roberts, J.R. Evaluation in ELT. Oxford: Blackwell, 1994. 338pp.
  • Shen, Z., Weir, C.J. and Green, R. The Test for English Majors Validation Project. Shanghai Foreign Language Education Press, 1997. 309pp.
  • Urquhart, A. and Weir, C.J. Reading in a Second Language: Process, Product and Practice. Longman, 1998. 345pp.
  • Yang, H. and Weir, C.J. Empirical Bases for Construct Validation: The College English Test - a case study. Shanghai Foreign Language Education Press, 1998. 241pp.
  • Weir, C.J., Yang Huizhong and Jin Yan. An Empirical Investigation of the Componentiality of L2 Reading in English for Academic Purposes. Cambridge University Press, 2000. 311pp.
  • Shaw, S. and Weir, C.J. (2007). Examining Writing in a Second Language: Research and practice in assessing second language writing. Studies in Language Testing 26. Cambridge: Cambridge University Press and Cambridge ESOL.
  • Khalifa, H. and Weir, C.J. (2009). Examining Reading: Research and practice in assessing second language reading. Studies in Language Testing 29. Cambridge: Cambridge University Press. (Runner-up in the Sage/ILTA triennial award for the best book on language assessment in 2008-11.)
  • Weir, C.J., Vidakovic, I. and Galaczi, E. (2013). Measured Constructs: A history of the constructs underlying Cambridge English language (ESOL) examinations 1913-2012. Cambridge: Cambridge University Press. 652pp.
Co-Editor
  • English Language Testing Service: Research Report 1(ii), ELTS Validation Project. Cambridge: UCLES and British Council, 1988. 105pp. With D. Porter and A. Hughes.
  • Balancing Continuity and Innovation. A History of the CPE Examination 1913-2002. Cambridge: Cambridge University Press, 2003. 550pp. With M. Milanovic.
  • European language testing in a global context. Cambridge: Cambridge University Press, 2004. 304pp. With M. Milanovic.
  • Multilingualism and Assessment: Achieving transparency, assuring quality, sustaining diversity. Studies in Language Testing 27. Cambridge ESOL/Cambridge University Press, 2008. With L. Taylor.
  • Language Testing Matters: Investigating the wider social and educational impact of assessment. Studies in Language Testing 31. Cambridge: Cambridge University Press, 2009. With L. Taylor.
  • Research in reading and listening assessment. Studies in Language Testing 34. Cambridge: UCLES/Cambridge University Press, 2012. With L. Taylor.
  • Exploring Language Frameworks. Studies in Language Testing 36. Cambridge: Cambridge University Press, 2013. With E. Galaczi.

Articles on assessment in international refereed journals: 17 (5 in Language Testing)

Chapters on assessment in edited books: 25

Learning and Teaching

Postgraduate Research Students

  • Number of current PhD students: 5
  • Number of successful PhD completions: 25
  • I have acted as external examiner for 10 PhD theses.

Test development and validation

I have worked closely with examination boards and ministries of education, applying my research-based socio-cognitive framework (SCF) to their assessment policies and products to improve the technical quality of the language proficiency tests they offer. Test providers are now expected to offer stakeholders explicit evidence of how the tests they place in the public domain meet validity demands and technical standards. The SCF has enabled test providers to assemble the logical and empirical evidence needed to support claims about a test's usefulness. As a result, test developers and providers have revisited their test designs and reframed their validity arguments in new and more effective ways to demonstrate evidence of test quality and fitness for purpose. Tests include:

  • The College English Test (CET), The Test for English Majors (TEM), and the Advanced English Reading Test (AERT), PRC
  • Cambridge ESOL Main Suite
  • British Council International Language Assessment (ILA), IELTS and Aptis
  • Test of English for Academic Purposes (TEAP) and EIKEN Society for Testing English Proficiency (STEP), Japan
  • General English Proficiency Test (GEPT), Taiwan

Some responsibilities and positions held

UK

  • 1983 - 1986 Chief Examiner in English as a Foreign Language, Institute of Linguists
  • 1983 - 1991 Chief Examiner Test in English for Educational Purposes, Associated Examining Board, Aldershot
  • 1988 - 1991 Language Advisor to the General Medical Council PLAB test for overseas medical professionals

Overseas

  • 1988 - 1993 Senior Consultant, Evaluation Department, Overseas Development Administration [Baseline evaluation studies of language projects in Nepal, Guinea, Ecuador]
  • 1990 - 1995 Co-ordinator, Universities' EAP Proficiency Test Project, Egypt
  • 1991 - 1995 Senior UK Consultant, College English Test [CET] Validation Study, PRC [EAP test taken by 16 million undergraduates per annum]
  • 1993 - 1996 Senior UK Consultant, Test for English Majors [TEM] Validation Study, PRC.
  • 1995 - 1998 Senior UK Consultant, Advanced Reading Test in English for Academic Purposes (AERT), PRC.
  • 1999 - 2004 Senior Consultant USAID Student Achievement Test Development Project, Egypt
  • 2006 - Visiting Professor at Shanghai Jiaotong University, PRC, the institution responsible for the administration of the College English Test (CET) to over 17 million undergraduates per annum
  • 2007 - International representative on the research board of the LTTC's state GEPT examination in Taiwan
  • I have carried out project-related consultancies in over 50 countries worldwide in the fields of language testing, programme and project evaluation, and curriculum renewal, and have given over 60 keynote and plenary speeches and 40 seminars/workshops in these areas.

Measures of Esteem

  • Series Editor for Studies in Language Testing (CUP)
  • Member of the Editorial Board of Language Testing
  • Academician of the Academy of Social Sciences
  • On the steering group of the English Profile Project, which is making a major contribution to the revision of the Common European Framework of Reference for Languages (CEFR; Council of Europe, 2001).
  • Chairman of the British Council Assessment Advisory Board

Address

Professor Tony Green
Director of CRELLA
University of Bedfordshire
Putteridge Bury
Hitchin Road
Luton, Bedfordshire
UK, LU2 8LE

tony.green@beds.ac.uk

Telephone

+44 (0)1582 489086