Four common claims about education funding and quality that need explaining

Written by Alan Reid, Research Professor, School of Education, University of South Australia

When the 2016 NAPLAN results were released a couple of weeks ago, a claim from the Education Minister Simon Birmingham attracted a lot of attention.

The minister said that despite a 23% increase in federal education funding over the past three years, NAPLAN results have plateaued.

He concluded that there should be less concern about the amount of funding going to schools, and more focus on ensuring that the existing money is spent on “evidence-based measures”.

The claim has been picked up in a number of quarters and repeated so often it has taken on the aura of a universal “truth”.

The claim actually bundles together several assertions. I’ll take each in turn.

Claim 1: Education funding has increased significantly over the past few years

This part of the claim is questionable. Federal funding has certainly increased as a consequence of the government funding the first four years of the Gonski plan. However, when the budget figures are adjusted for rises in wage costs and enrolments, the increase is less than half the 23% claimed by the federal government.

This is well short of the funding that the Gonski Review estimated is needed to improve the educational outcomes of students from educationally disadvantaged backgrounds.

In any event, it is difficult to understand why the minister chose only federal funding to make his point. Sure, that is the money over which he has control. But if education standards are to be linked with funding, the only figure that can really be used is one that reflects the total amount schools receive. This means combining state and federal government funding.

As Save Our Schools president Trevor Cobbold has demonstrated, when that is done for the period 2009-2013 (under the previous Labor government), and then adjusted for inflation and rises in enrolments, increases in funding were very modest indeed.

During that five-year period, total government funding for education rose by only about 1% across Australia. Available data suggests not much has changed since.
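As a rough illustration of how these adjustments shrink a headline figure, the short sketch below converts a nominal funding increase into a real, per-student increase. The percentages used are hypothetical placeholders chosen for the example, not the actual budget or enrolment figures.

```python
# Hypothetical illustration: how a headline funding rise shrinks once it is
# expressed in real (cost-adjusted), per-student terms.

def real_per_student_growth(nominal_growth, cost_inflation, enrolment_growth):
    """Convert a nominal funding increase into a real, per-student increase."""
    nominal_index = 1 + nominal_growth    # e.g. 1.23 for a 23% headline rise
    deflator = 1 + cost_inflation         # rise in wage and other cost indexes
    enrolments = 1 + enrolment_growth     # rise in student numbers
    return nominal_index / (deflator * enrolments) - 1

# Placeholder inputs: a 23% nominal rise, 7% cost inflation, 4% more students.
growth = real_per_student_growth(0.23, 0.07, 0.04)
print(f"{growth:.1%}")  # about 10.5% - less than half the headline figure
```

On these placeholder inputs, the headline 23% increase falls to roughly 10% once costs and enrolments are taken into account, which is the kind of gap between nominal and adjusted figures described above.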

It is hard to escape the conclusion that the federal government is searching for excuses to justify its stated intention not to fund the remaining 70% of the Gonski plan.

Claim 2: NAPLAN results have plateaued

The claim that results are plateauing means that if you compare NAPLAN results for the same year level (say year 7) over a number of years, scores have stayed much the same. This is accurate, but it can be misleading.

Statisticians and educational researchers have shown that there can be a 10% margin of error for each individual score in NAPLAN. This is compounded when results are used to make comparisons of the same year level over time.
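To see why comparisons compound the uncertainty, the sketch below combines the margins of error on two scores in quadrature, a standard way of adding independent measurement errors. The scores are invented for the example and are not actual NAPLAN results.

```python
import math

# Hypothetical illustration: the margin of error on a year-on-year difference
# is larger than the margin on either year's score.

def comparison_margin(score_a, score_b, relative_error=0.10):
    """Margin of error on the difference of two independent scores,
    each carrying a 10% margin, combined in quadrature."""
    err_a = score_a * relative_error
    err_b = score_b * relative_error
    return math.sqrt(err_a ** 2 + err_b ** 2)

# Invented scores: last year's mean 540, this year's mean 545.
change = 545 - 540
margin = comparison_margin(540, 545)
print(change, round(margin, 1))  # a 5-point change against a margin of about 76.7
```

On these invented figures, an apparent 5-point movement sits well inside a combined margin of roughly 77 points, so it says little about whether results have genuinely plateaued, improved or declined.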

The social composition of cohorts of students at a particular year level can vary markedly from year to year. This will have an impact on test results and thus on overall scores over time.

Given this uncertainty, it is important to take care when interpreting the results. It is far more illuminating to drill down into how groups of students in particular year levels are faring in any one year.

When that is done, the stark reality is that a consistently larger percentage of students from educationally disadvantaged backgrounds fall below the minimum standards than students from advantaged backgrounds. Such an insight suggests a very different conclusion about desirable levels of education funding.

Claim 3: Despite spending more money, education outcomes have not improved

This involves combining the previous two claims and concluding that extra funding won’t help to raise educational standards. There are a number of problems with this logic.

First, it ignores a fundamental rule of statistics: correlation is not causation. If NAPLAN results have “plateaued”, that outcome could be linked to any number of variables; it is even possible that results would have been worse if the money had not been spent.

Second, even if a direct correlation could be drawn, NAPLAN is an annual standardised test that focuses solely on literacy and numeracy. It cannot tell us about all the other outcomes from the rich array of learning that occurs in schools, and to which resources are also directed.

This means that if you want to correlate funding with NAPLAN results, you would have to isolate the money spent specifically on supporting teaching and learning in literacy and numeracy from everything else schools spend money on. The minister’s claim did not do that.

Third, the claim assumes that money was distributed on the basis of need. It wasn’t. Large sums of state and federal money are still going to the most affluent and well-resourced schools, thus diminishing the amount going to educationally disadvantaged schools.

Claim 4: We need to focus on evidence-based measures that will get results for students

The minister says we need to focus on evidence about best practice, rather than on funding amounts. The problem is that the strategies suggested by the research evidence, such as professional learning programs and smaller classes for targeted groups of students, require adequate funding.

If policymakers are looking for “evidence-based measures” to improve educational outcomes, they need look no further than the research evidence highlighting the importance of targeting strategies where they are needed most.

There are large disparities in educational outcomes between students from educationally advantaged and disadvantaged backgrounds. By year 9, these differences can be as much as five years of learning.

And many research studies point to the fact that a significant portion of the variation in school outcomes is explained by substantial differences in resources between schools. These studies show a strong connection between increased expenditure and improved educational outcomes.

There is also plenty of evidence about how to enhance quality in education, through strategies such as supporting teachers with ongoing professional development, providing additional support staff, introducing quality intervention programs for students at risk, and shaping curriculum to suit identified needs. All of these require adequate funding, and they also take more than a few years to bear fruit.

If the goal is to improve educational outcomes, it makes more sense to increase funding and direct that money to programs that support the most educationally disadvantaged schools and the students who are struggling most.

Authors: Alan Reid, Research Professor, School of Education, University of South Australia

Read more http://theconversation.com/four-common-claims-about-education-funding-and-quality-that-need-explaining-64480
