Enhancing learner knowledge and the application of that knowledge via computer-based assessment
Lynne Reynolds, Applied Social Studies, Health and Social Sciences, University of Bedfordshire
Abstract
This paper details the process that the author went through as a novice action researcher whilst designing and implementing a new computer-based assignment within a Higher Education institution in the UK. The paper outlines the initial stages of a project designed to assist students in the transformation from declarative to functioning knowledge (Biggs & Tang 2011). The implementation of a new summative assessment was intended to help students develop a deep rather than surface approach to learning. Owing to the personal and professional beliefs of the author, the project was designed using Norton's (2009) ITDEM action research methodology. The research was also framed by a specific theoretical framework comprising Kolb's (1984) and Atherton's (2009) theories of experiential learning and a constructivist approach (Swan 2005) to designing and developing an intervention. The paper also highlights the difficulties faced by the researcher whilst identifying and tackling this issue and implementing the new assessment. During the initial stages, the research design encompassed the piloting of the Touchstone Open Source platform, because the University's Question Mark platform was not compatible with the demands of the new assignment. Touchstone would allow an online assignment to be utilised, producing instant results and feedback for the students whilst reducing marking loads (Wilkinson & Rai 2007). To evaluate and analyse the results, data were collected through the individual summative grades that were available as part of the normal academic process within the university infrastructure. Upon analysis, initial results indicated an increase in the number of students who had achieved a level of functioning knowledge in comparison to previous cohorts (see fig 1). However, despite some indications of success, the author is unable to generalise from these results at present, as the project was a pilot study of the new Touchstone platform. The paper concludes with a number of suggestions for modifying the new assessment and recommendations for the next cycle in the research process.
Keywords: Action Research, declarative knowledge, functioning knowledge, Touchstone, constructivism.
Context of the Subject
I am the unit lead on one of the core level 4 units. Students from four degree programmes within the Applied Social Studies Department (Criminology, Child and Adolescent Studies, Applied Social Studies, and Health & Social Care) traditionally study this unit as part of their core level 4 provision. The current cohort consists of 245 students from diverse backgrounds and with a wide range of abilities. The curriculum consists of one lecture and one workshop per week. It is on this unit that students develop and expand their knowledge of social divisions within society, such as class and ethnicity. As such, the unit aims to develop in students an understanding of the major theoretical explanations of contemporary social divisions. It also aims to develop key knowledge of the critical and dominant political ideologies responsible for the policies designed to address the many divisions within society. In this way the unit aims to develop the clarity and depth of knowledge required for further study and employment within the Human Services.
The unit assesses student performance (the depth and clarity of their knowledge) via three assessment points: an in-class test, a 1,500-word written essay, and an end-of-year computer-based multiple-choice exam. This research project is concerned with the second assignment on the unit. Results obtained from the second assignment over the previous three years had provided evidence that all was not well with the level of student knowledge of theoretical concepts within the unit. During an annual review of the unit I began to think about the different types of assessment available to pedagogic practitioners. I also began to question what it was that I hoped to achieve by changing the assignment. This reflection led to the development of specific research questions, aims and objectives.
Research Questions
- What could I do to assist the students in moving beyond declarative knowledge towards functioning knowledge?
- How can the students' level of knowledge/performance be measured or assessed?
- Would the University's Question Mark Platform be suitable for the assessment changes I had in mind?
- If not, what type of intervention could be used and would it be robust enough to meet university and HE standards/policies?
- How will the intervention (the computer-based essay) impact on student performance and experience?
Aims/objectives
- For students to achieve a deeper level of understanding and knowledge of specific theories associated with the unit; these theories are also valuable for cross-unit purposes.
- To achieve a robust summative assignment which would address the issue of developing functioning knowledge within the students.
- To design a robust and constructively aligned summative assessment which would assess the learning objectives for the unit.
- To produce an Action Research Project that was worthy of scholarly publication.
- To produce evidence from the action research project to inform the pedagogic practice and personal development of the author and the wider community.
Action Research
Action research is an essential and reflexive tool for improving pedagogic knowledge and practice. The process is conducted through three basic principles: observation, planning and action (Kember 2000; Norton 2009). However, McNiff & Whitehead (2011) and Fox et al (2007) suggest that reflection is also an important principle within the process, because reflection allows the practitioner to 'review and rework' the problem at hand (Fox et al 2007:83). Action research should also be discipline-based. This Action Research Project was therefore conducted via the ITDEM process (Identifying a problem, Thinking of ways to tackle it, Doing it, Evaluating it, Modifying future practice), as described by Norton (2009:70), who suggests that there are five steps to the action research process.
The first step is to identify a problem or issue within the educational practitioner's pedagogic practice. At this stage I had identified that the issue was that students were developing declarative rather than functioning knowledge. The second step is for the action researcher to think of ways to resolve the issue (Norton 2009). It was at this stage that I considered changing the more traditional type of assessment (a 1,500-word essay) to an electronic assessment (an online essay). The third step entails the pedagogic practitioner actually doing something; in this case I began designing and implementing a new e-assignment utilising the Touchstone software. Step four is where the researcher evaluates the innovation that has been put in place to overcome the pedagogic issue; quite simply, this is where I would collate and analyse the research findings. The final step in the process is to modify future practice (Norton 2009; McNiff & Whitehead 2011). This means that I would need to reflect further on the implementation of the innovation and the results obtained. Future practice, or a new cycle in the action research process, would then begin with adjustments and modifications to the original solution to the problem (i.e. the e-assessment).
The Nature of the Problem
During an annual review I began to reflect on the unit and realised that the students were having difficulty with the transformation from declarative to functioning knowledge. Biggs & Tang (2011) suggest that there are two main types of knowledge: declarative knowledge (verbal/symbolic/surface knowledge) and functioning knowledge (deeper knowledge which informs action by the learner). In the formation of declarative knowledge, the student is a passive recipient who learns by rote, memorising information with little comprehension and an inability to apply that knowledge. A constructivist would argue that such students have constructed knowledge (schema construction) but have failed to place any importance on the meaning of that knowledge (Swan 2005). Biggs & Tang (2011) argue further that students who formulate functioning knowledge understand the meaning behind the literature/theory, can apply that knowledge, and can see the bigger picture by observing the facts and details which make a difference to a practical issue. Furthermore, experiential learning theorists would argue that these students have reflected on their experience to formulate new knowledge. Constructivists such as Swan (2005) would add that this reflection is coupled with the assimilation or application of the new knowledge to knowledge the student already holds. In this way the student is actively engaged with his or her learning.
I also realised that some students had a specific difficulty with threshold concepts (Meyer & Land 2003), such as understanding the differences between applicable theories. It was here that I realised that students were unable to put their limited knowledge into practice during the second assignment, which meant that they were unable to provide solutions to the problems posed. Their learning and understanding were limited by their surface approach to learning and their failure to build concrete knowledge (Biggs & Tang 2011; Swan 2005; Kolb 1984; Atherton 2009). In order to do well in the assignment and the unit they needed a deeper approach to learning so as to develop their functioning knowledge. Quite simply, these students needed to build concrete or known experiences (Kolb 1984; Atherton 2009) by actively applying existing knowledge to new experiences (the theories used within the assessment). At that point the students would have a deeper level of understanding and be able to utilise functioning rather than declarative knowledge (Biggs & Tang 2011). But threshold concepts can be difficult for level 4 students to grasp. It was at this stage that I began to think about Prosser & Trigwell's (1999) argument that if teaching starts from the learners' experience, then as teachers we should aim to assist students in altering their perspectives (how they see things, facts, the environment and so on) and how they represent that knowledge (Prosser & Trigwell 1999, cited in Biggs & Tang 2011). So how could I help the students to change their academic practice?
Ethics
According to Norton (2009) and Gordon et al. (2003), pedagogic action research is a form of enquiry which incorporates elements of both pedagogic development and pedagogic research. As such, ethical considerations are of the utmost importance. The researcher has not only a moral but also a professional obligation to follow ethical guidelines whilst actively researching participants within a pedagogic framework or environment (Fox et al 2007). I had noted three specific areas for consideration: anonymity, confidentiality and protection from harm. For the purpose of this particular intervention, therefore, I did not include any identifiable or distinguishing factors (such as students' names or unique identity numbers) when collecting data. In addition, the data collected from the three specific cohorts within the department were labelled A, B and C so that individual cohorts could not be easily identified. Furthermore, data collection was non-intrusive: only data produced as a natural part of the academic process within the department were collected. The data were stored on a personal computer with a secure password known only to the researcher.
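By way of illustration, the sketch below shows one way this anonymisation step could be carried out. It is a minimal sketch only: the field names, file layout and cohort identifiers are illustrative assumptions, not the university's actual records system.

```python
import csv

# Hypothetical mapping from real cohort identifiers to anonymous labels;
# the actual cohort names are not reproduced here.
COHORT_LABELS = {"2009-10": "A", "2010-11": "B", "2011-12": "C"}

def anonymise_grades(in_path: str, out_path: str) -> None:
    """Strip identifying fields, keeping only an anonymous cohort label
    and the grade. Assumes an input CSV with 'student_id', 'name',
    'cohort' and 'grade' columns - an illustrative layout only."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["cohort", "grade"])
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "cohort": COHORT_LABELS[row["cohort"]],  # A, B or C only
                "grade": row["grade"],                   # no names or IDs retained
            })

anonymise_grades("raw_grades.csv", "anonymised_grades.csv")
```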
Tackling the Issue
As I reflected upon my role as a pedagogic practitioner, I realised that I should be acting as a facilitator who guides students through the learning process and towards the practical application of more complex concepts, so that they can achieve a higher level of understanding. What was required was something that would achieve a conceptual change within the learner, enabling the transformation of declarative knowledge into functioning knowledge. In other words, the students would understand the theory so well that they would be able to apply it to any given situation or problem. At this point I started to question my own pedagogical stance, and after some deep thought I realised that I believe learning is built on the premise of both knowledge construction and experiential learning (Swan 2005; Atherton 2009; Kolb 1984). We cannot achieve one without the other. If we do not internalise what we learn through experience, then how can we know that we are correct in the acquisition of that knowledge? We should aim to build upon what we know as we gain new knowledge through new experiences. Indeed, Bruner (1985:8) states that 'there is not one kind of learning' but several strategies that allow learners to learn. In addition, Mezirow (1991) suggests that it is through these experiences that learning begins to take on new meaning, which allows us to construct new knowledge. So, after some discussion with senior colleagues, I started to explore different assessment types and favoured Computer Based Assessments (CBAs).
For a number of years the Higher Education sector within the UK has recognised the numerous advantages of utilising CBAs (see Gibbs 1992; Stephens et al 1998; Brown et al 1999; JISC 2008; Wilkinson & Rai 2007a, 2007b). A CBA would be beneficial for both students and staff, saving time on assessment preparation, marking and feedback whilst helping to assist the transformation from declarative to functioning knowledge. I had some firm ideas and took the planned changes through the next Faculty Teaching Quality and Standards Committee meeting. By the time I started to lay down firm plans and make further investigations it was August 2011. My initial investigations led me to the CBA Policy and Procedures documentation (UOB 2008) and to discussions with the Teaching and Learning Directorate. I then started to plan the new assignment. It would be an electronic essay with drop-down menus placed at strategic points within the text, each offering four possible answers: three incorrect and one correct. This would allow the students to develop a deeper level of understanding, as they would have to think about the theories learnt and apply each one to a given situation or problem at strategic points within the essay, promoting engagement with the assignment. They would also be transforming their knowledge from declarative to functioning knowledge (Biggs & Tang 2011), building on previous knowledge whilst constructing new knowledge in arriving at the correct answer (UOB 2011; Bruner 1998; Kolb 1984; Rogers & Frieberg 1994). The CBA and the Touchstone platform would be beneficial in this instance because they would embed technology into the curriculum in order to generate student engagement directly with knowledge learnt on the unit. They would also provide a positive learning experience, as the learner would be able to identify the value of the knowledge previously learnt on the unit (UOB 2011).
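As a rough sketch of this design (not the actual Question Mark or Touchstone data model, which I did not work with at that level), each gap in the essay can be modelled as a four-option item embedded in the running text, with exactly one correct answer; this structure is also what makes automatic marking possible. The example item is invented for illustration and is not taken from the real assignment.

```python
from dataclasses import dataclass

@dataclass
class EssayGap:
    """One drop-down menu at a strategic point in the essay text."""
    preceding_text: str   # the essay prose leading up to the gap
    options: list[str]    # the four possible answers in the drop-down
    correct_index: int    # exactly one of the four is correct

# Illustrative item only - not taken from the real assignment.
gap = EssayGap(
    preceding_text="According to this perspective, social divisions are best explained by",
    options=[
        "individual moral failings",           # incorrect
        "structural inequalities in society",  # correct
        "purely biological differences",       # incorrect
        "random historical accident",          # incorrect
    ],
    correct_index=1,
)

def score(gaps: list[EssayGap], responses: list[int]) -> int:
    """Auto-mark: one point per gap answered correctly."""
    return sum(1 for g, r in zip(gaps, responses) if r == g.correct_index)
```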
In October 2011 I asked a colleague to help me pen the essay, to ensure that it was robust and tested the students' knowledge acquisition and their ability to use both declarative and functioning knowledge. After some consideration I realised that two papers would be required: a 'mock' paper and the 'unseen' paper. In December 2011 I also consulted the University's Learner Experience Strategy documentation (UOB 2011) to ensure that I was developing the assignment in line with the University's vision of the learner experience. The Teaching and Learning Directorate then transformed the Word documents into HTML documents and uploaded both the mock and the unseen assignment into the University's Question Mark platform. I was quite excited at this stage and was busy reflecting on the possible advantages for the students.
However, all was not well: the platform was not able to produce the desired finished product. I had envisioned that the students would be able to complete the assignment from home via remote access to the platform through the University's VLE system. They would be able to access the assignment for a period of 24 hours but would only be able to complete it during a two-hour window within those 24 hours; in other words, the assignment would cut off two hours after the student started the paper. In this way the assignment would form part of an inclusive and equitable curriculum whilst allowing the students to achieve the best possible outcome (UOB 2012a). This was essential, as there were a number of dyslexic students in the cohort who should be neither advantaged nor disadvantaged by the change in the assignment (UOB 2012b). After further discussions, the head of the Teaching and Learning Directorate investigated. Eventually a new platform was found: Touchstone Open Source from the University of Nottingham. The new assignment was to be loaded into the new platform and used as a pilot for this university. If this worked, there were wider implications: the new assignment and platform could be rolled out throughout the University and used within other departments. On reflection this was quite a heavy responsibility, and it was at this stage that I realised that further planning was required for the assignment to be successful.
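The timing requirement can be stated precisely. The minimal sketch below captures the rule in my own formulation; it is not the platform's actual configuration, and the dates are placeholders.

```python
from datetime import datetime, timedelta

WINDOW_OPEN = datetime(2012, 1, 23, 9, 0)         # illustrative date/time only
WINDOW_CLOSE = WINDOW_OPEN + timedelta(hours=24)  # 24-hour access window
ATTEMPT_LIMIT = timedelta(hours=2)                # cut-off after starting

def may_submit(started_at: datetime, now: datetime) -> bool:
    """A submission is valid only if the paper was opened inside the
    24-hour window, no more than two hours have elapsed since the
    student started, and the overall window has not yet closed."""
    return (WINDOW_OPEN <= started_at <= WINDOW_CLOSE
            and now <= started_at + ATTEMPT_LIMIT
            and now <= WINDOW_CLOSE)
```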
Innovative implementation
The plan was for both papers to be imported into the new platform, ready for the testing phase by early January 2012. The students would sit the 'mock' paper during the regular one-hour tutorial period two weeks prior to the unseen paper, so that I would be on hand to answer queries. The final implementation of the intervention would occur in January 2012, with the Touchstone software enabling a remote and 'live' assessment via a secure network. Data collection would occur by the end of February 2012 and a focus group would be held later in the year.
But the mock paper was not as successful as I had hoped. There were errors in the software: every time there was a comma within the text, the platform inserted a new line. This made the text difficult to read and obscured its meaning. At this stage I had further discussions with Teaching and Learning, and it was decided that contact with Nottingham University was essential. The date of the unseen paper was fast approaching and I was becoming more anxious, so I decided to write a paper-based equivalent of the CBA which contained gaps in the text, just as the CBA did. A further paper was produced containing copies of the four answers held within each drop-down box on the CBA. At least this way the students would have the same information and be able to complete the same task. However, a new problem arose: where would the students sit the unseen paper? After much deliberation I decided that they would have to sit the assignment in the lecture hall during the normal lecture period. This had further repercussions, as the lecture programme would then have to be redesigned to accommodate all of the planned lectures on the unit.
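The cause of the comma fault was never established. One plausible explanation, offered here purely as an illustrative assumption, is an import routine that split text on commas instead of parsing quoted fields. The sketch below reproduces the symptom and shows the standard remedy.

```python
import csv
import io

essay_text = 'Weber, unlike Marx, saw class as multidimensional.'

# Naive comma-splitting reproduces the observed symptom:
# every comma starts a "new line" of text.
broken = essay_text.split(",")
print(broken)  # ['Weber', ' unlike Marx', ' saw class as multidimensional.']

# A proper CSV parser respects quoting, so commas inside a
# quoted field survive intact.
buffer = io.StringIO('"' + essay_text + '",next_field\n')
for row in csv.reader(buffer):
    print(row[0])  # Weber, unlike Marx, saw class as multidimensional.
```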
By this time Nottingham University had sent the Teaching and Learning team the most up-to-date version of Touchstone, and the team were busily trying to upload the assignment information into the new platform. This was achieved with only days to go until the actual assignment. I was fast beginning to realise that I had not thought this through properly, owing to my inexperience with this type of software platform and my failure to anticipate the potential pitfalls of developing a new CBA. It also meant that I would be unable to test the live version of the assignment prior to allowing students access, which was pedagogically unacceptable. Following discussion with the IT department, usernames and passwords had to be generated for the students and the system had to be secured before they could access the assignment, which would mean further delay. Finally, with a few days left, I decided that the students would need to sit the paper version, as the CBA had not been tested prior to use. So the students sat the 'unseen' paper in the lecture hall as planned and the lecture programme was redesigned.
Evaluation
The whole experience was not what I had envisioned for the students. However, despite student and staff apprehension, which according to JISC (2008) is relatively normal, the actual assignment went fairly well. I started to analyse the results by collecting data via a non-intrusive investigation of the results available through the normal academic process. Quite simply, I utilised the grades from the second assignment that were available within the university infrastructure, ensuring that only the grades were used, without any identifying factors which could relate to individual students, thereby retaining anonymity and confidentiality. The analysis consisted of a comparison between the previous two cohorts' (A and B) and the current cohort's (C) grade profiles for the second assignment on the unit. Initial analysis showed that the number of students passing the assignment had risen (see fig 3). The numbers achieving a grade within the A and B range were similar to those of previous years (see fig 1). The notable difference was in the number of passes achieved: the number of students achieving a D grade had risen by 49 (see figs 1, 2 & 3). Quite simply, more students had passed the assignment than in previous years. In summary, it can be argued that the intervention was successful in so much as more students had passed the unit.
Figure 1 - results for all cohorts
Figure 2 - results for cohorts A, B and C
Figure 3 - results for cohort C
But could it be claimed that more students had achieved the transformation from declarative to functioning knowledge? Certain factors need to be taken into consideration, such as the possibility that the current cohort is slightly stronger or weaker than those of previous years. Although it is difficult to compare the students' full range of abilities at a single assessment point, early indicators show that more of the students achieved their learning outcomes than in previous cohorts. At this early stage, however, it is important to recognise that the results from the intervention do not allow generalisations to be made, as this is a pilot study of a new assessment type. Further analysis shows that 84 students within cohort C had improved on their grade from assignment 1. Whilst 76 remained within the same grade band, 25 of those 76 had improved their grade within the band (e.g. from a B- to a B+). This would suggest that, overall, the current cohort of students had achieved a deeper level of knowledge and understanding of the unit in comparison to previous cohorts.
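The comparisons described above are straightforward to reproduce from the anonymised grade lists. The sketch below uses made-up grade bands and results rather than the real cohort data, and it simplifies away the within-band distinction (e.g. B- to B+); it shows the two calculations reported: the grade profile of each cohort and the number of students who improved between assignments.

```python
from collections import Counter

GRADE_ORDER = ["A", "B", "C", "D", "F"]  # simplified bands for illustration

def band_counts(grades: list[str]) -> Counter:
    """Grade profile of a cohort: how many students fall in each band."""
    return Counter(grades)

def improvements(assignment1: list[str], assignment2: list[str]) -> int:
    """Count students whose assignment 2 band beats their assignment 1 band.
    Assumes the two lists are aligned by (anonymised) student."""
    rank = {g: i for i, g in enumerate(GRADE_ORDER)}  # lower index = better
    return sum(1 for g1, g2 in zip(assignment1, assignment2)
               if rank[g2] < rank[g1])

# Illustrative data only - not the real cohort results.
cohort_b = ["A", "C", "F", "D", "C"]
cohort_c = ["A", "C", "D", "D", "C"]
print(band_counts(cohort_b), band_counts(cohort_c))
print(improvements(["D", "C", "F"], ["C", "C", "D"]))  # 2 students improved
```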
Modifications
Upon reflection I now realise that further modifications are required in order to build on this initial success. These modifications include the following:
- Ensuring that the new version of the Touchstone platform is flexible and robust (see Wilkinson & Rai 2007a, 2007b).
- Testing the mock and unseen papers in order to iron out any further problems with delivery.
- Having academic staff test the system remotely to ensure that the paper is available within the timeframe set for the assignment.
- Assessing the validity and reliability of the Touchstone platform, and the remote system's marking of the assignment, prior to allowing students access.
Concluding discussion and Recommendations
Despite several difficulties within the project, early indications of success suggest that the change in assignment type has been beneficial for students: a higher number of students have achieved the learning objectives for the unit than in previous cohorts. At this early stage, however, it is not possible to generalise from the results or to establish the validity or reliability of the new assignment. Further indications suggest that the current cohort of students have achieved a higher level of transformation from declarative to functioning knowledge, and from this point of view the project has been successful. But it is important to remember specific factors, such as the possibility that the current cohort is academically stronger than previous cohorts. The following recommendations should therefore be followed prior to further modifications:
- Further research should include a more in-depth study of this assessment type. Furthermore, as students have traditionally struggled with the second assignment on this unit, more in-class facilitation should occur regarding the terminology used, so that more in-depth knowledge can be developed within the student body.
- Since terminology is essential for comprehension and deeper understanding, it should also be included within the focus of the next cycle of the action research project.
- The intervention should be utilised again with the next cohort of students to enable a valid, reliable and representative comparison between cohorts.
- Students' opinions of the intervention should be recorded in an ethical manner in order to assess the student experience of this type of assessment. This could be achieved through the use of a focus group; ethical approval and informed consent should, of course, be sought prior to commencing any data collection.
- If subsequent cycles of the action research project show a positive improvement in the students' level of understanding (formulating functioning knowledge), the Touchstone platform could be rolled out across the University to a larger group of students. This may add validity and reliability to the study whilst allowing generalisations to be made.
References
- Atherton, J. S. (2009) Learning and Teaching: Experiential Learning [online]. Available at: www.learningandteaching.info/learning/experience.html?step=9&step=8. Accessed 29/09/09.
- Biggs, J. & Tang, C. (2011) Teaching for Quality Learning at University (4th edn), Berkshire, Open University Press.
- Brown, S., Race, P. & Bull, J. (1999) Computer Assisted Assessment in Higher Education, London, Kogan Page.
- Fox, M., Martin, M., & Green, G. (2007) Doing Practitioner Research, London, Sage.
- Gibbs, G. (1992) Improving the Quality of Student Learning, Bristol, Technical and Educational Services
- Gordon, G., D'Andrea, V., Gosling, D. & Stefani, L. (2003) 'Building capacity for change: research on the scholarship of teaching'. Report to HEFCE [online]. Available at: www.hefce.ac.uk/pubs/RDreports/2003/rd02_03/. Accessed 31/03/12.
- JISC (2008) Exploring Tangible Benefits of e-Learning: Does Investment Yield Interest?, JISC Infonet, University of Northumbria.
- Kember, D. (2000) Action Learning and Action Research: Improving the Quality of Teaching and Learning, London, Kogan Page.
- Kolb, D. A. (1984) The Experiential Learning Cycle. In Atherton, J. S. (2009) Learning & Teaching: Experiential Learning [online]. Available at: www.learningandteaching.info/learning/experience.htm. Accessed 29/09/09.
- McNiff, J., & Whitehead, J. (2011) Doing and Writing Action Research, London, Sage.
- Meyer, J. H. F. & Land, R. (2003) Threshold Concepts and Troublesome Knowledge (1): Linkages to Ways of Thinking and Practising Within Disciplines [online]. Available at: www.utwente.nl/so/vop/nieuwsbrief_17/land_paper.pdf. Accessed 15/02/12.
- Mezirow, J. (1991) Transformative Dimensions of Adult Learning, San Francisco, CA, Jossey-Bass.
- Norton, L. (2009) Action Research in Teaching & Learning: A practical guide to conducting pedagogical research in universities, Abingdon: Routledge.
- Prosser, M. & Trigwell, K. (1999) Understanding Learning and Teaching: The Experience in Higher Education, Buckingham, Open University Press. Cited in Biggs, J. & Tang, C. (2011) Teaching for Quality Learning at University (4th edn), Berkshire, Open University Press.
- Stephens, D., Bull, J. & Wade, W. (1998) 'Computer Assisted Assessment: Suggested Guidelines for Institutional Strategy', Assessment & Evaluation in Higher Education, 23(3), pp. 283-296.
- Swan, K. (2005) 'A Constructivist Model for Thinking About Learning Online'. In Bourne, J. & Moore, J. C. (eds) Elements of Quality Online Education: Engaging Communities, Needham, MA, Sloan.
- Wilkinson, S. & Rai, H. (2007a) OMR to CBA for Summative Assessments. Camelbelt, JISC [online]. Available at: jiscinfonetcasestudies.pbworks.com. Accessed 13/04/12.
- Wilkinson, S. & Rai, H. (2007b) Disability Support in CBA. Camelbelt, JISC [online]. Available at: jiscinfonetcasestudies.pbworks.com. Accessed 13/04/12.
- University of Bedfordshire (UOB) (2008) Computer Based Assessment: Policy & Procedures. Available at: www.beds.ac.uk.
- University of Bedfordshire (UOB) (2011) Learner Experience Strategy: December 2011 Post-Consultation Document Draft [internal]. Available at: in.beds.ac.uk/consultation/strategy. Accessed 20/12/11.
- University of Bedfordshire (UOB) (2012a) Guidance Paper Number 4: Inclusivity in Assessment. Available at: www.beds.ac.uk/learning/curriculum/delivery/incanddiv. Accessed 03/01/12.
- University of Bedfordshire (UOB) (2012b) Guide Number 3: A Guide to Assessment for Learning. Accessed 20/01/12.
Address
Academy for Learning and Teaching Excellence
University of Bedfordshire
University Square
Luton, Bedfordshire
LU1 3JU