Tuesday, 8 August 2017

Improvement through Self-evaluation

The pivotal role of intrinsic motivation in self-improvement and advancement is well documented in cognitive psychology and learning theories. According to Ryan and Deci (2000), this type of motivation is defined as:

the doing of an activity for its inherent satisfactions rather than for some separable consequence. When intrinsically motivated, a person is moved to act for the fun or challenge entailed rather than because of external prods, pressures, or rewards.

By contrast, the concept of extrinsic motivation refers to activities that are performed in order to “attain some separable outcome” (Ryan and Deci, 2000), for example a certificate or another form of external validation. Research suggests that intrinsic motivation is the most effective in achieving improved outcomes or personal growth. Creating environments and situations which favour the development of intrinsic motivation is therefore a challenge for us all. With reference to teaching and learning situations, this type of motivation can be achieved when tasks are well matched to the learners’ skills.

Csikszentmihalyi’s work on flow theory (1991) and Dweck’s research (2012) on ability versus effort, framed in terms of beliefs as mindsets (fixed and growth), also indicate the crucial role of intrinsic motivation in achieving improved outcomes and personal growth.

School inspections based mainly on external accountability measures depend heavily on extrinsic motivation to bring improvement. In these types of evaluations, the guidance for development is usually communicated through externally articulated recommendations for improvement. It is questionable to what extent these types of evaluations actually benefit organisational improvement. MacBeath et al. (2000) assert that exclusively external systems of school inspections, for example in England and The Netherlands, are mainly driven by control and the need for accountability even if they have an improvement perspective.

In contrast, robust and contextualized self-evaluation arrangements can visibly benefit organisational development by providing the right level of motivation to achieve higher outcomes (intrinsic motivation). Organisational development, where improvement is mainly driven through self-assessment, can be very effective with regard to growth. When fully embedded into an institutional context, the process of self-evaluation becomes almost self-managing in productively meeting institutional objectives and success performance indicators.

Indeed, some of the most successful educational systems in terms of outcomes for young people, for example Finland or Singapore, are not subjected to external evaluations of their performance; school audits there are focused only on financial control.

In England, the most successful schools consistently self-evaluate their own performance against their own challenging and evolving performance indicators. To add greater rigour and challenge to their arrangements, some institutions validate their self-evaluation processes through peer-reviews. These encourage the right level of professional dialogue leading to further improvement.

At this level, success does not happen by chance. At the heart of improving education for all children is school improvement. The outcomes for all children matter, regardless of their postcode or circumstances of birth. If school improvement can be achieved through a more effective system of performance evaluation, I think it is worth a try. Perhaps the next stage is to examine the effectiveness of the current inspection framework: to what extent does it directly benefit school improvement in terms of pupils’ outcomes?

On the impact of the inspection regime on school improvement, former Chief Inspector, David Bell, said, “I have always been cautious in saying that inspections cause improvement because, frankly, we do not”.

A Government Select Committee Report in 2010 concluded that “true self-evaluation is at the heart of what a good school does” and that:

Self-evaluation – as an iterative, reflexive and continuous process, embedded in the culture of a school – is a highly effective means for a school to consolidate success and secure improvement across the full range of its activities.

Research literature indicates that external evaluations are most effective when they are focused on improvement and collaboration. Arguably, new inspection arrangements conducted in the spirit of a peer-review process focused on schools’ self-evaluation are perhaps what is needed to improve education for all. This type of peer-review would focus to a greater degree on the school’s own objectives rather than on standard pre-determined criteria. Although self-evaluation can play a part in some inspection systems, it is not always a requirement. I feel that putting self-evaluation at the centre of the inspection process, and shifting the focus from external accountability to internal accountability measured against the school’s own objectives, would lead to greater school improvement for all.

The evidence suggests that intrinsic motivation is most conducive to achieving greater learning gains. By analogy, if the same theory is applied to institutional development, for example school improvement, the importance of contextualized self-evaluation and its value in the school evaluation system is perhaps worthy of a longer discussion.


Csikszentmihalyi, M. (1991). Flow: The Psychology of Optimal Experience. New York, NY: Harper Perennial.
Dweck, C. S. (2012). Mindset: How You Can Fulfil Your Potential. London: Robinson.
MacBeath, J. et al. (2000). Self-evaluation in European Schools. London: Routledge.
Ryan, R. M. and Deci, E. L. (2000). Intrinsic and extrinsic motivations: classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67.

Monday, 9 January 2017

Building Resilience Early for Later Success: preventing potential mental health issues and poor achievement

As an experienced educationalist, I am a great believer in developing learners through encouraging their effort and providing constructive guidance on future learning. This approach not only helps to develop the right attitudes to learning, leading to improved outcomes, but, with a focus on actions, it takes the ‘person’ out of the frame and allows learners, whatever their abilities, to direct their attention to the tasks in hand.

All teachers will have encountered learners, including those considered highly able, who can be reluctant to try for fear of failing. Carol Dweck’s extensive research into the psychology of learning confirms that learners’ attitudes, and especially their willingness to put in effort, rather than their abilities, are crucial to successful learning and making progress. So how can we ensure that our pupils develop the right attitudes needed for future success?

All young children are keen to explore, discover and learn. During their early development, they make rapid progress and soak up new knowledge like ever-expanding sponges. As Dweck says, “You never see an unmotivated baby”. However, as children grow older, they develop different attitudes to learning (“mindsets”), which can have an enormous impact on their future achievement. As adults, we influence children’s thinking and shape their mindsets through our own sets of beliefs and values. In societies that place value on ability over effort, children who feel that they are clever, but are faced with obstacles, can become reluctant learners in order to avoid looking stupid. Such mindsets (“fixed mindsets”), according to research, are formed by adults who tend to focus their praise on the person (“What a clever girl!”), rather than the actions, and this has a negative impact on the child’s future success, especially when that child is faced with difficulties. Moreover, when these children cannot get by on wits alone, they can develop various avoidance mechanisms and behavioural issues that can lead to more serious mental health problems in the longer term. It is this inability to compete on equal terms with others, whom they perceive as no more able than themselves, and the fear of visible failure, that can lead to isolation, depression and poor self-worth.

The demands of the global economy, parental expectations of high achievement to secure first-class qualifications for better job prospects, increasing higher education costs and other accountability measures based on sometimes flawed assessment systems all contribute to greater pressures on young people to improve their outcomes. It is therefore crucial for education and health care professionals to take note of evidence-based strategies in order to help young people develop effective self-regulation skills, to enable them to cope with temporary failures and to equip them with strategies for overcoming setbacks. Teaching young people the value of effort through praising their actions and hard work, and preparing them for setbacks through challenging tasks, where they have the opportunity to progress at their own level with an appropriate level of guidance to experience success, are some of the strategies that can be used to build resilience. The aim of these strategies is to enable young people to bounce back from difficulties through their own efforts. Since prevention is always preferable to cure, an early focus on developing the right mindsets to learning and cultivating strength of mind through encouraging and praising effort can avert the development of mental health problems in school pupils and help them to build resilience.

However, building resilience through developing the right attitudes to learning requires consistency of approach by all professionals and a greater understanding of developing self-regulation strategies in young children. It is counter-productive and utterly frustrating when I repeatedly hear early years education and health care professionals/therapists lavish their praise on my young grand-daughter in the form of “clever girl”. Developing the growth mindset by instilling the value of effort for future success, through praising effort, not the person, is important to building resilience and determination early, and to preventing emotional or behavioural problems later on. Ego-enhancing strategies, however, can result in creating a fixed mindset, where an individual’s full potential can be compromised, and where the lack of adequate effort and running away from challenges can lead to failures and wasted talents.

Wednesday, 21 December 2016

Assessing Pupils’ Progress: primary curriculum, exams and assessment

Following the publication of the Key Stage 2 assessment (SATs) results, a report in the Times Educational Supplement (TES) on 14th December calls for reforms to the system, claiming that the tests have “affected pupils’ wellbeing”.

In the drive to improve pupils’ readiness for secondary education, this year pupils sat new and more rigorous tests. According to the provisional figures released by the Department for Education, only about half of pupils in Year 6 have met the new expected standard. The results show that 53% of pupils achieved the expected standard in reading, writing and mathematics. While it is difficult to make direct comparisons with the previous year, when the expected standard (NC level 4) was attained by 80% of pupils, this year’s results certainly look different.

Since the abolition of NC levels, schools have been encouraged to develop their own assessment systems to monitor the progress of their own school populations. Life beyond levels became quite a challenge for most schools and for the teaching profession, which searched for new off-the-shelf solutions to replace the familiar concept of levels. This thinking showed a certain lack of understanding of the needs of different school populations and gave little consideration to the opportunities for developing specific attainment targets, no longer stipulated by the government, in order to assess different curriculum areas by content and time. The development of new assessment systems requires an in-depth understanding of assessment principles and a clear awareness of what makes a ‘good’ assessment. Although assessment is central to any learning situation and is a driver for the school curriculum, it can often be misunderstood by the teaching profession, which, for about 25 years, has been fed a diet of abstract levels. These were used routinely to determine attainment, or indeed progress, by attaching a number to the level of knowledge.

School leaders and school associations are highly critical of the new reforms and, as reported by the TES after the publications of the results, describe the recent changes as “diabolical” and “unacceptable”. Their main concerns are that:
• the reforms are unhelpful to children with special educational needs;
• the tests are stressful and affect pupils’ well-being;
• the school curriculum is affected because of focus on assessment and non-exam subjects are side-lined;
• high stakes accountability has a negative impact on teachers and how they teach.

Whilst any new reforms need some time to be effectively implemented, I am particularly concerned with the apparent focus on teaching-to-the-test, where other areas of the curriculum are reportedly side-lined, and the notion that accountability for high-stakes testing can reduce teaching to the demands of a tick list, as concluded by one deputy headteacher: “The ticklist I’ve got to go through to give my school’s data (…) is massively influencing the way I teach. And I will teach to that ticklist”.

For fear of accountability and an inability to let go, children can be deprived of the love of learning and of opportunities to develop their wide interests and talents, and ultimately to achieve better results in the end. We need to focus on LEARNING, not accountability and teaching-to-the-test, to improve progress and achieve better outcomes for all children. To view SATs as the raison d’être of primary school education is to limit that education to the narrow syllabus required by the final test. The evidence shows that pupils’ results improve when they are fully involved in their learning, including participation in a wider curriculum and extra-curricular activities.

In order to fully engage pupils in their learning for improved results, schools need to develop effective formative assessment strategies aimed at developing pupils as autonomous learners. This includes pupils with special educational needs in particular, as they need quite specific feedback on next steps in learning to help them with progress and to develop their learning independence. Currently, the use of assessment for learning (AfL) techniques to move learning forward is highly ineffective. The thinking behind AfL is poorly understood, resulting in weak implementation, which is often condensed to routine strategies or ‘toolkits’. This may be a result of an earlier AfL government initiative, which presented the strategy as a mini summative assessment system, known as APP (assessing pupils’ progress); this had little in common with the essence of AfL. In fact, such an interpretation of AfL is at odds with its spirit: an approach aimed at assessing learning as it is produced, similar to coaching, and moving pupils’ learning on to the next level through effective guidance on their next steps in learning and matching learning objectives to pupils’ abilities. As a fluid process, it requires adjustments to teaching in order to match pupils’ level of understanding and, when used effectively, can result in big learning gains. Effective use of assessment for learning strategies allows personalised learning to take place, where each individual pupil can make progress at his or her level. A relentless focus on teaching-to-the-test, rather than on developing pupils as effective learners who are motivated to achieve, can therefore have a negative impact on their high-stakes exam outcomes, and on later success.

Undoubtedly, schools need to be accountable for the quality of the education they provide. However, narrowing the primary school curriculum to the core subjects, as tested by the SATs, is not the way to go. Preparation for the next stage in education includes all-round pupil development, which has a positive impact on examination results. To succeed in life and in education, pupils must be exposed to a rich educational experience and a broad curriculum. They must be equipped with skills for life-long learning in the ever-changing environment of technological advancement, and they must be given a chance to develop their talents and learning to the full. It’s the only chance they have.

Sunday, 10 May 2015

Learning for the 21st Century: how to succeed in the digital age of globalisation

Rapid technological advancement and globalisation create new demands for the world of education. The current generation of school pupils needs to be prepared for multiple career changes, and this necessitates life-long learning. So what does this mean for schools and the curricula they deliver?

‘Knowledge’ in the traditional sense of fact-finding can no longer provide an adequate basis for 21st-century learning, where ‘knowledge’ needs to be seen as a competence and a result of in-depth learning leading to the acquisition of new skills. During the learning process, it is therefore crucial to bridge knowledge-based material with the skills needed not only for the application of the learnt material but, ultimately, for developing the new skills required for creating new solutions. This equates to progress, which has never been as rapid as in the digital age.

This focus on learning mastery (in-depth learning for understanding) undoubtedly has implications for classroom learning and the teaching strategies used. Drawing on the research into what makes successful learners (Boekaerts, Dweck, Schunk, Stipek), I would suggest that at the core of effective learning is the mastery of learning independence, where learners are capable of making their own learning decisions. Moreover, learning independence leads to learning sustainability, which is essential for successful career development in the 21st century and beyond. Aiming to achieve learning independence, by actively involving pupils in their learning and teaching them to think in critical, creative and evaluative terms through problem solving, investigative tasks or application of the learnt material to new situations, should be at the heart of any 21st-century curriculum.

Since assessment is often the curriculum driver, as it leads to gaining new competencies and qualifications, its role and its quality are absolutely crucial in the changing educational landscape. The way pupils are assessed has an impact on developing reasoning and higher-order thinking skills, such as analytical, evaluative, critical or creative skills. The ability to think effectively by possessing these higher-order thinking skills will become invaluable to future success at work and will shape individual learning independence. Globalisation, new technologies and social networking opportunities for professional growth and business development create endless opportunities for new solutions to old and new problems. Talents can be developed, and educational systems aimed at advancing appropriate skills-based education, rooted in learning mastery of key concepts, will have the capacity to create independent learners capable of thinking for themselves.

In a world where new knowledge can be found at the touch of a button, learning is changing. It is no longer confined to mere fact acquisition, and its social aspect of interaction with others or with resources, including technologies, is fundamental to gaining new skills, competencies or qualifications. The capacity for instant feedback, for example on goods or services available on-line, has an impact on the changing role of assessment and on how feedback is perceived. As effective feedback is key to any improvement and the concept of feedback is being accepted as part of everyday life, teaching students evaluative skills through self-reflection or peer-evaluation is another essential element of learning in the 21st century. It also extends to developing essential critical evaluation skills – the ability to think critically about newly found evidence or new ‘knowledge’.

Effective learning in the 21st century requires learners to be active participants in the learning process in order to maximise opportunities for problem solving and developing learning independence. Classroom strategies based on formative approaches to assessment, as an integral part of the learning process, create conducive learning environments for fostering the skills needed for success in the digital age, where there is no ceiling to learning. Utilising smart technologies and virtual learning environments for greater engagement with learning provides ideal opportunities for developing higher-order thinking skills of analysis or creativity, and can be used to embed the independent learning essential to future success.

It is important for qualification providers to recognise the changing needs of 21st-century learning, so that qualifications reflect the skills required for successful employment or further education in the highly competitive global economy.

Thursday, 18 September 2014

Assessment beyond Levels: the benefits of using standardised assessments for planning and monitoring purposes

Arrangements for assessment in the new curriculum give schools greater opportunity to develop assessment frameworks that are best suited to their own school populations. Although ‘assessment without levels’, a new concept for a generation of teachers brought up on tracking and evaluating progress through reference to levels, may initially pose some challenges for school leaders, in practice schools have been given greater freedom to develop their own assessment systems.
In this new climate, where schools are accountable for demonstrating year-on-year progress without using levels, diagnostic, standardised assessments are an excellent way of supporting other school assessments as evidence of progress. Since CEM (University of Durham) standardised assessments are an excellent tool for providing valid and reliable information for pupil academic profiling, and can work alongside any school assessments already in place, these types of assessments are key in providing informed evidence for progress tracking and for benchmarking pupils’ performance against national norms. In addition, CEM assessments can be used not only to establish a baseline for every pupil or a particular cohort, but can also complement teacher assessment in monitoring learning and assist with reporting of progress to parents and other stakeholders. Many schools looking for reliable ways of demonstrating year-on-year progress through externally verified standardised tests have already embedded CEM assessments into their assessment frameworks, to assist with evidence of progress for inspection purposes and for their own self-evaluation of teaching and learning.
What makes CEM assessments particularly useful to teachers and school leaders is not only the range of assessments available to suit different ages and key stages in education, but also their reliability in terms of future predictions – a feature especially useful for target-setting on an individual and cohort basis. Since test development at CEM, the largest provider of standardised school assessments in the world, is research-based, CEM assessments are trialled on very large samples, which makes the tests highly reliable. CEM has been delivering baseline tests and value-added measures for over 30 years to over 3,000 secondary schools worldwide.

Range of Assessments
CEM assessments cover all stages of education: from Nursery / EYFS to Post 16 and CEM’s Pscales+ assessments are aimed at supporting pupils with special educational needs.

• Aspects – Nursery/EYFS (ages 3 – 4)
• PIPS – Primary (ages 5 – 11)
• InCAS – Primary (ages 5 – 11)
• MidYIS – KS3 (ages 11 – 14)
• INSIGHT – End of KS3 (ages 13 – 14)
• Yellis – KS4 (ages 14 – 16)
• Alis – KS5 (16+)
• CEM IBE – KS5 (16+ IB courses)
• Pscales+ – All key stages (ages 3 – 19), special schools

CEM assessments are adaptive and computer-based, meaning each pupil is challenged at a level that is appropriate to them. They are easy to administer, and rapid, comprehensive analysis and feedback, including analytical software, are provided for schools.
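The adaptive idea can be sketched as a simple staircase rule. This is only a toy illustration under invented numbers: CEM's actual item-selection algorithms are not described here, and the response model below is a crude logistic guess. The point is simply that difficulty steps up after a correct answer and down after an incorrect one, so each pupil converges on items near their own level.

```python
import random

# Toy staircase sketch of adaptive testing (illustrative only; real
# adaptive tests use far more sophisticated item selection and scoring).

def run_adaptive_test(pupil_ability, n_items=20, start=50, step=5):
    """Return the sequence of item difficulties presented to one pupil."""
    random.seed(0)                      # reproducible illustration
    difficulty, presented = start, []
    for _ in range(n_items):
        presented.append(difficulty)
        # Crude response model: correct answers become more likely
        # as ability exceeds item difficulty.
        p_correct = 1 / (1 + 10 ** ((difficulty - pupil_ability) / 10))
        correct = random.random() < p_correct
        difficulty += step if correct else -step   # staircase rule
        difficulty = max(0, min(100, difficulty))  # keep within item bank
    return presented

# Item difficulties drift towards each pupil's own level.
print(run_adaptive_test(pupil_ability=70)[-5:])
print(run_adaptive_test(pupil_ability=30)[-5:])
```

Because later items cluster around the pupil's level, an adaptive test can locate very different abilities with the same short test, which is why one instrument can serve a whole cohort.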

Primary Schools: aims and benefits of CEM assessments
How can CEM data help schools to measure attainment and track progress?
At primary level, Aspects, PIPS and InCAS assessments provide an objective baseline measure which is invaluable for tracking progress at individual and cohort level. In the absence of levels, they provide an objective, external evaluation and standardisation against national norms or age-equivalent scores and, alongside school internal assessment data, assist with year-on-year tracking of progress. Information from on-entry assessments helps with pupil profiling by establishing an initial ability baseline which can be used to inform future teaching and learning, and to inform school curriculum planning. The detailed data obtained over time allow teachers to identify pupils’ gaps in learning and monitor individual progress, and can be used to inform future teaching and the planning of learning activities. At whole-school level, the data not only provide robust evidence of progress that can be used for accountability purposes, but can also support monitoring of teaching and learning at cohort level; inform curriculum planning and evaluation; identify staff professional development needs; and support school improvement.

Can CEM data be used by teachers for target-setting?
As PIPS and InCAS data identify gaps in learning, they can help with individual target-setting for improvement and with mapping pupil progress. Easy access to detailed analysis of performance at cohort level, for example by class, year group or subject area, can support school leaders in identifying targets for particular cohorts, as well as in monitoring and evaluating earlier interventions or particular targets. The predictive nature of the assessments assists with curriculum planning and target-setting for whole cohorts against school expectations or national accountability measures (floor standards).

How can the CEM data help parents understand their child’s attainment and progress?

PIPS and InCAS data provide a wealth of information that can support school assessment data in reporting attainment and progress to parents. Within the changing school landscape, parents need simple and accurate information about their children’s learning in an easy-to-understand format. CEM reports identify where children are with their learning against their own prior achievement and/or national standards in a visual format, and parental feedback suggests that parents find PIPS data a particularly useful tool in the reporting of progress. InCAS data show progress based on age-standardised scores, so parents can see how pupils’ abilities relate to their chronological age. Where there is an indication of a weakness, parents always welcome early identification. In instances of high achievement, CEM data can be used to report on an appropriate level of challenge. A school which introduced the sharing of PIPS data with parents reported higher levels of parental engagement and improved parental participation at school events.

How can the CEM assessment data help with providing evidence of progress for school inspectors?
Alongside school assessment data, CEM assessments provide an extra layer of external, robust evidence of pupil progress and attainment. As inspections are focused on the impact of school processes, the ability to demonstrate how a school’s assessment framework contributes to improved outcomes for pupils is crucial to evaluating the effectiveness of teaching and learning, and pupil achievement. In the absence of levels, the data from CEM assessments can be used to support teacher assessment data as evidence of progress at individual and cohort level. The information from baseline assessments can also be used to establish pupils’ ability levels on entry, as compared with national norms. CEM assessment data are a source of reliable, standardised information for inspectors and allow school leaders to demonstrate how the data are used for school self-evaluation and improvement. One school reported that PIPS data helped to establish pupils’ ability levels during inspection; PIPS data, along with other school assessment data, were used as evidence of outstanding progress made by pupils.

How can CEM assessments inform standardisation and national benchmarking in the absence of levels?
Comparisons with national norms allow teachers to put school assessment data into context and assist with the benchmarking of pupils’ performance, which is particularly useful when schools work in greater isolation and need to demonstrate progress between standardised key stage assessments. CEM assessments are in tune with the new way of assessing progress and attainment in the new curriculum, where the new assessments at the end of key stage 2 will report “secondary readiness” by way of national standardisation (by comparison with others). According to the new accountability measures, schools will be expected to get at least 85% of their pupils to the new “secondary ready” standard, which is expected to be reported on a standardised national scale of 80 – 130, with a score of 100 described as the “secondary ready” standard. The new floor standards will be based on key stage 2 results and pupil progress, and schools will be expected to track the rate of progress from a new baseline assessment in Reception. As CEM assessments already provide similar baseline data and can be used for tracking progress throughout the primary years based on national standardisation and evidence of individual attainment/progress, together with local school data based on teacher assessment, they can provide robust evidence of progress benchmarked against national standards. For comparison purposes, PIPS assessment feedback is standardised with an average pupil scoring 50. CEM data are ideally placed to contribute to the broad range of information that can be made available to parents and the wider public about school performance, and to contribute to a fair and transparent school accountability system.
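The mechanics of a 100-centred standardised scale like the 80 – 130 one above can be sketched with a simple z-score transform against a norming sample. This is a hypothetical illustration: the centre/spread values and the example marks below are invented, and the actual scaling methods used by the DfE or CEM are not described here.

```python
# Hypothetical sketch: map a raw test mark onto a standardised scale
# centred on 100 (as in the 80-130 reporting scale described above).
# The norming mean/SD and example marks are illustrative, not real figures.

def standardise(raw_mark, norm_mean, norm_sd, centre=100, spread=15,
                lo=80, hi=130):
    """Map a raw mark onto a standardised scale, clipped to [lo, hi]."""
    z = (raw_mark - norm_mean) / norm_sd          # z-score vs. norming sample
    score = centre + spread * z                   # rescale around the centre
    return max(lo, min(hi, round(score)))         # clip to reporting range

# A pupil scoring exactly the (assumed) national mean gets 100.
print(standardise(62, norm_mean=62, norm_sd=8))   # 100
print(standardise(70, norm_mean=62, norm_sd=8))   # 115
```

The same transform explains the other scales mentioned in this post: a PIPS-style scale centred on 50 or a MidYIS-style scale centred on 100 differ only in the `centre` and `spread` chosen.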

How can CEM tests help with screening for SEN, EAL and More Able Pupils?
The diagnostic nature of CEM assessments provides data for the early identification of special educational or EAL needs that can be a barrier to learning. At the other end of the spectrum, early identification of pupils who are particularly able (Gifted & Talented) can assist with curriculum planning and inform teaching and learning to a greater degree. Early identification of specific learning needs is crucial to providing intervention at an appropriate level, and the data from subsequent assessments can be used to evaluate school interventions. Diagnostic assessments in InCAS Reading and Maths are supported by research-based remediation advice to empower teachers in diagnosing specific learning difficulties and providing the right level of support.

How do CEM assessments contribute to pupils’ voice?
As pupil engagement in learning is key to motivation and improvement, CEM assessments, through measuring attitudes, can provide valuable information for teachers and school leaders regarding pupils’ attitudes to learning. At primary stage, attitudes to mathematics, reading and school are assessed. These data can help with identifying mindsets towards learning in different areas of the school curriculum and can help with bridging pastoral and academic dimensions of the school. The data can also be used for mentoring learning and developing cross-curricular thinking and study skills to improve pupils’ attitudes towards learning. This information is an invaluable addition to academic and pastoral profiling, and can contribute significantly to reporting and discussions with parents.

Secondary Schools: aims and benefits of CEM assessments
The benefits of CEM assessments (MidYIS, INSIGHT, Yellis, Alis and CEM IBE) at secondary level lie in their ability to add an extra dimension to other school data used for monitoring teaching and learning at subject, department/faculty and whole-school level. As students progress through their school career, more information becomes available about their progress and attainment. These data enrich students’ academic profiles, making CEM assessments an accurate and reliable tool for future predictions at individual subject level. This type of information is very useful for curriculum planning at secondary level and for careers advice, including choices between subject options and/or qualifications at GCSE level and beyond. MidYIS tests have an established reputation for delivering accurate baseline data that can be used as a springboard for progress measures as well as for validating “secondary readiness” data. Invaluable feedback for progress tracking includes predictions and chances graphs for external examinations, as well as a full progress reporting system currently based on value-added. CEM assessment data can be fully integrated into schools’ information management systems for analysis, and can be used by teachers and leaders to inform school planning and monitoring and to evaluate school effectiveness.

How can CEM data help schools with progress measures and provide standardisation with reference to benchmarking?
As secondary schools move from the five A*–C benchmark performance measure to Progress 8 and average point score (APS), CEM assessments can provide crucial evidence to inform the tracking and monitoring of student progress. As the new accountability measures will assess students on their progress relative to their starting points, schools will need reliable baseline data for accurate progress monitoring. MidYIS can provide an accurate and stable alternative or addition to the new end of key stage 2 secondary readiness measures. In the absence of key stage 3 national standardised assessments, INSIGHT tests can deliver useful data for progress tracking in core subjects with reference to national norms and inform curriculum planning, including qualification choices at 14+. Since ‘progress’ is key to the new accountability system, schools will need to develop robust evidence for demonstrating progress, which can be supported by standardised CEM assessment data. All CEM assessment feedback delivers standardised data and, in the case of MidYIS, the average student score is 100. This standardisation can be used by school leaders as extra validation for progress tracking and provides a national benchmark for identifying the ability levels of particular cohorts as well as of individual students. As the tests are designed to measure a ‘typical’ student’s performance, the data inform teachers about the required level of preparation for external examinations. Benchmarking pupils’ performance against others of the same age can also assist school leaders with evaluating the school’s teaching and learning or particular interventions, as well as with reporting to parents and inspectors.
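To illustrate the mechanics behind this kind of standardisation, the sketch below converts a raw score into a score on a mean-100 scale and reads off the implied percentile. This is a generic illustration only: the standard deviation of 15 and the assumption of a normal distribution are assumptions made here for the example, not CEM’s published scaling method, which is derived from a nationally representative sample.

```python
from statistics import NormalDist

def standardise(raw, cohort_mean, cohort_sd, scale_mean=100, scale_sd=15):
    """Map a raw score onto a standardised scale with the given mean.

    scale_sd=15 is an illustrative assumption; actual test scaling
    is derived empirically from a national standardisation sample.
    """
    z = (raw - cohort_mean) / cohort_sd  # distance from the cohort mean in SDs
    return scale_mean + z * scale_sd

def percentile(standard_score, scale_mean=100, scale_sd=15):
    """Percentile rank implied by a standardised score, assuming normality."""
    return round(NormalDist(scale_mean, scale_sd).cdf(standard_score) * 100)

# A pupil scoring one standard deviation above the cohort mean
# lands at 115 on the mean-100 scale, around the 84th percentile.
print(standardise(65, 50, 15))  # -> 115.0
print(percentile(115))          # -> 84
```

The same arithmetic applies to a mean-50 scale such as the PIPS feedback described earlier; only `scale_mean` changes. The usefulness for benchmarking comes from the fact that the mean and spread are fixed nationally, so a score of 115 means the same thing in every school.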

How can the feedback data be used to inform target-setting?
Feedback from CEM secondary school assessments can support target-setting at individual and cohort level, including targets for specific subject areas, and can provide performance indicators for post-16 education. In the absence of levels, CEM data can provide reliable baseline information for setting minimum individual and group targets in different subject areas. This information can be used by subject teachers for monitoring individual attainment and for triangulation with other teacher assessment data. It can also be useful to tutors for monitoring students’ effort in meeting their minimum targets and for providing a basis for informed teacher-student learning dialogues. The predictive nature of the assessments can assist with curriculum planning and with target-setting for whole cohorts against school expectations or national accountability measures. For reporting, monitoring and accountability purposes in the absence of levels, information about likely examination performance can assist with on-track monitoring and with establishing individual and cohort positions against expected targets. Target predictions are a useful tool for informing teachers about setting appropriate challenge at student and class level and, at whole-school level, can contribute to evaluating the effectiveness of teaching and learning.

To what extent can CEM data be used for reporting purposes?
Comprehensive feedback from CEM assessments, which includes predictions and chances graphs related to external examinations as well as a full value-added reporting system, is a reliable way of communicating progress to parents and other stakeholders, often providing external validation of school assessment data. Parental feedback indicates that parents value assessment feedback that pinpoints their child’s position in comparison with other students or national standards. Indeed, one of the reasons for removing national curriculum levels was that parents found levels confusing: they attached an abstract number to particular attainment without identifying students’ strengths or weaknesses. Since CEM assessment data are standardised against a nationally representative sample of schools, and the feedback includes the percentile band into which a student’s score falls and the ability band (A–D) to which it belongs, parents can be reliably informed about their child’s ability profile. This type of reporting helps with setting future targets, and Individual Pupil Record Sheets (IPRS) provide an accurate record which summarises all the baseline information for a particular student on one sheet. Easy access to data manipulation and analysis can help senior leaders with the analysis of predicted performance for particular cohorts or groups of students. This extra layer of evidence for pupil progress and predictions based on ability profiles provides substantiation regarding school effectiveness when reporting to school inspectors. This is particularly useful evidence in the absence of levels, where schools are expected to provide reliable data for accountability purposes and demonstrate year-on-year progress between key stage 2 and end of key stage 4 qualifications.
CEM assessment data can also support evidence for the effectiveness of school interventions and can provide evidence for reporting pupil achievement by comparisons with different groups of students in the school and nationally.

How can CEM tests help with screening for SEN, EAL and More Able Pupils?
The diagnostic nature of CEM assessments provides data for early identification of special educational or EAL needs that can be a barrier to learning. At the other end of the spectrum, early identification of pupils who are particularly able (Gifted and Talented) can assist with curriculum planning and inform teaching and learning to a greater degree, as challenge can be more effectively matched to students’ skills and abilities. Early identification of specific learning needs is crucial to providing intervention at an appropriate level and to preventing failure at a later stage. The data from subsequent assessments can be used to evaluate school interventions.

How can feedback from CEM secondary assessments assist school leadership with accountability measures and school improvement?
MidYIS, INSIGHT, Yellis and Alis/CEM IBE standardised assessment data can be used to support effective school self-evaluation and promote school improvement by identifying a school’s strengths and weaknesses with reference to national benchmarking. CEM feedback can also facilitate the monitoring of standards over time, where statistical process control charts show yearly progress measures against statistical significance boundaries. In the absence of levels and of common benchmarking at key stage 3, schools need to ensure that their assessment frameworks provide reliable data demonstrating students’ progress over time and attainment relative to their baseline on entry. Schools therefore need to develop systems for tracking student progress in order to present data in support of self-evaluation statements about the progress students make. Although Ofsted do not hold a predetermined view as to which system schools should use, inspectors’ main interest is whether the school’s approach is effective in measuring what progress pupils are making and how this relates to their expected level of progress. To provide reliable evidence of students’ progress over time, one of the key indicators of school effectiveness, INSIGHT assessments offer value-added measures of progress compared against pupils of similar abilities in other schools. The feedback from these tests can also show progress from MidYIS tests (progress at key stage 3) and can give predicted grades at GCSE level in a wide range of subjects. These are invaluable data for school leaders with regard to external validation of the effectiveness of teaching and learning, and can assist with curriculum and improvement planning.
Regarding accountability measures, from September 2016 schools will be required to provide information via a standard ‘snapshot’ on the school website, with performance tables giving more detailed information, including access to data of interest to Ofsted, parents and other interested parties. To provide accurate and reliable information for the new accountability measures, CEM secondary school assessment data can give school leaders robust evidence of pupil progress, and the standardised pupil scores for reading, maths and science enable leaders to make comparisons between students and classes for monitoring, planning and evaluation purposes. School leaders can also benefit from PARIS software, which is designed to provide detailed assessment data analysis and to produce reports and value-added information for cohorts of students, helping with progress monitoring and GCSE predictions. This type of data analysis can also be very useful for validating school processes and inspection evidence.
In one school, senior leaders used MidYIS data to monitor teaching and learning and adjusted staffing where the data demonstrated poor cohort progress in a subject area with long-term teacher absence covered by non-specialist staff. In this case, the intervention resulted in recruiting a specialist teacher to cover the absence, leading to improved progress.