Beyond “dumb” tests: NAPLAN needs to value regional, rural and remote students

  • Written by Philip Roberts, Associate professor, University of Canberra

NAPLAN testing starts this week. With calls for a review, many education experts are calling the future of NAPLAN into question. In this series, experts look at options for removing, replacing, rethinking or retaining NAPLAN.

The recently released Independent Review into Rural, Regional and Remote Education again noted that students in non-metropolitan areas perform, on average, below those in metropolitan areas. One measure commonly used to make this claim is students’ NAPLAN results.

But the problem with NAPLAN is that it’s a “dumb” test: administered to all students, regardless of contextual factors such as location and culture. This is said to be a benefit, as it provides a measure comparable across all students.

Monitoring the performance of a system is an important and necessary public policy process. But in practice, no tool is able to neutralise the influence of contextual factors.

Read more: Standardised tests are culturally biased against rural students

Relevance over one-size-fits-all

The review questioned how relevant the “metro-centric” nature of these assessments was to those in rural areas. The review gave the example “of a question asking what you could see at the busy train station — of which many rural, regional and remote students have no experience and therefore no concepts to be able to respond accurately”.

The review went on to ask “do rural, regional and remote students (and others) have knowledge and skills which they value and find useful but which are not measured and therefore not valued more widely?”

Asking metropolitan students questions that draw on life in the country could promote a common understanding of country life. Shutterstock

The problem of relevance raises two possibilities for the future of NAPLAN in its current form: the inclusion of different questions drawing on different knowledge, or questions asked in different ways that still assess the same basic skill.

In modern educational assessment, there is no need for all students to sit the same test, with the same questions, on the same day. We are smarter than that. We can test the same skills that need monitoring through different types of questions that differentiate for students’ contexts and prior learning. In this way, we can build on the experiences of students in a city school and a rural school, or in two metropolitan schools serving culturally diverse communities.

Does a question have to draw on the example of a train timetable to assess numeracy, or ask students to write about a “day at the beach” to assess writing? Logically, the answer is no. This may be where the move to purpose-designed online testing could be beneficial.

We could, for instance, include examples of country life in the curriculum, such as calving, and ask metropolitan students questions drawing on life in the country, to promote a common understanding of it.

Currently we don’t; instead we include metropolitan examples, and this is the basis of the bias. This is again an issue picked up by the review of rural, regional and remote education, which noted the need for direct participation of educators and communities in the development of curriculum and assessment. Too often consultation is optional, and not representative of the community itself.

Creating a curriculum based on more universal concepts might be a positive for the future of NAPLAN testing. Shutterstock

Read more: How to solve Australia's 'rural school challenge': focus on research and communities

Making education contextually meaningful

Students do better when the course material is more relevant to them. Creating a curriculum based on more universal concepts and allowing teachers to choose the examples they use to illustrate the concepts is achievable. It happens in the ACT, South Australia and Queensland, where teachers moderate assessment across schools as part of their work.

Previously, we had the “Country Areas Program”, which aimed to work with communities to make schooling meaningful and relevant to students’ lives.

But this was replaced in 2009 by national partnerships based on literacy and numeracy benchmarks. That approach also assumed all non-metropolitan schools were low socioeconomic status schools because of their NAPLAN scores, ignoring the cultural relevance problems of the test. As such, a metro-centric, one-size-fits-all approach to school improvement was applied.

This was around the same time the testing regime (which previously took forms such as the Basic Skills Test in NSW) went “high-stakes” by being publicly reported on the My School website. Clearly we want all kids reaching minimum benchmarks, but that approach doesn’t examine the nature of the benchmark or the way it’s measured.

There have been many tests, like the Basic Skills Test in NSW, for many years without the current problems of NAPLAN. Shutterstock

Instead, the result is a high-stakes measure applied in a census-like fashion (the same test for everyone) that distorts practice. It’s perhaps not surprising that analysis shows results for disadvantaged groups have not improved, and may even have declined, in the ten years NAPLAN has been operating.

Read more: NAPLAN only tells part of the story of student achievement

The medium is not the message

We’ve had tests for many years without the current problems of NAPLAN. Part of the problem is the publication of results to compare achievement, and the high stakes this creates. This is exacerbated by the nature of NAPLAN as a “dumb” test.

A simple change would be to continue NAPLAN testing, but using “smarter tests” and valuing teachers’ professionalism by engaging them in moderation practices. In this way we can get a genuine picture of how all students, regardless of location and cultural background, are travelling.

This approach would enhance our appreciation of the diversity of students and enhance the skills of the profession. Teachers would be comparing work samples in response to questions drawing on the local knowledge of students from Menindee and Manly. This would force teachers to focus on the skills of literacy and numeracy, and understand how they are enacted differently in different places.

Alternatively, testing a random sample of schools is an equally valid approach to monitoring a system. This is the approach used in international tests such as PISA and TIMSS.

Ultimately we need to be smarter. We need to move away from dumb tests that treat the profession as incapable of measuring student performance in a valid and meaningful way, and students and communities as neutral cultures without histories.

Author: Philip Roberts, Associate Professor, University of Canberra

Read more: http://theconversation.com/beyond-dumb-tests-naplan-needs-to-value-regional-rural-and-remote-students-93356
