Legislative Council Tuesday 10 September 2019
Ms FORREST - Mr President, I thank the member for Elwick for bringing this motion forward. It is a really important topic. I have had a number of representations from my constituents on this matter, not just since the report was tabled but beforehand as well. I will be reflecting particularly on that feedback. The member for Elwick has gone through a lot of the Tasmanian background to this. I will come back to some of those points.
I note it is very difficult for public servants to speak out publicly regarding their concerns on government policy so I believe the comments I make here reflect only the very tip of the iceberg. There are many more teachers and principals in Tasmania who share similar views and concerns as those I will raise.
I assume the member for Elwick is talking about different people from those I have been speaking to, being from different parts of the state.
Mr Willie - The State Service Act makes a lot of them nervous, but they will talk to you privately.
Ms FORREST - Yes, that is what I am saying. Retired teachers too will sometimes speak up. I do not intend to address each point of the motion separately as the member for Elwick has done this; I intend to speak in broader terms.
A number of current and former teachers have informed me that they have been long-time opponents of NAPLAN in its current framework and how it is applied for a number of reasons. A former, now retired primary school teacher in the north-west provided her views to me. She indicated that standardised testing is appropriate when the aim is to inform teaching, and inform planning, monitoring and assessment of student growth over time, which is underpinned by an understanding that students of the same age and the same year at school can be at very different levels in their development and learning.
She also suggested that a good assessment process should enable teachers to tailor their teaching to the needs of the individual student, allowing them to advance and progress regardless of their starting point. At that point I am in furious agreement with her.
A good assessment should provide an overall description of the types of skills mastered and those still to be developed, based on the test performance, leading to an effective and targeted learning program and improved outcomes for all students. Isn't that what we want for all of our children?
In my constituent's view, a view shared by me and many others, NAPLAN as it is currently being conducted does not achieve these outcomes. My constituent suggested that the only people for whom NAPLAN is useful are the statisticians, the writers, the publishers and booksellers of the plethora of books on how to practise and pass NAPLAN; the parents who use results to choose a 'good school' for their children; and, last but not least, the government, which uses the results to denigrate teachers. That is her view, but it is shared by many others.
I know the member for Elwick referred to many of these matters, but this is the reality teachers are facing in the classroom. I believe these views are widely shared; it is not just a small group of renegade teachers. I support the motion calling for the Tasmanian Government to join the breakaway - as it is called - review that will report to the Education Council. The member for Elwick outlined the process that would be required for that in his closing comments. I believe there is still time to get involved. We should be at the table with these things; we should be there. Tasmanian children are just as important as any other child in the country and Tasmania's voice should be there.
In saying that, I do not support this lightly. As we know, we need to ensure that the education of our children is effective and contemporary. It also needs to be adaptable to meet the needs of the individual child and to acknowledge the uniqueness of each child.
I have studied the education systems in some of the best performing countries around the world, including the Scandinavian countries, particularly Finland, which I have visited twice to meet with those involved in education there. I will comment more on that a bit later.
First, I would like to explore some of the problems with NAPLAN, as viewed by those at the coalface - that is, in the classroom. Although I will repeat some of what the member for Elwick has said, I will continue because it is important for me to put my constituents' views clearly on the record too.
The view is that it does not inform teaching and planning because the results arrive months after the test, so you do not get that rapid feedback and response. Results, when they do arrive, are inconclusive and do not contain sufficient information to target the skills required for growth and advancement. The online testing would be laughable were it not for the distress it caused both teachers and students at its inception. The member for Elwick alluded to this.
The following are some of the problems encountered by some of the teaching staff I spoke to: an inability to log on; a lack of ICT support personnel when problems occurred; constant dropouts; students with insufficient typing skills to finish the test in a timely manner; children with learning disabilities being set up to fail; and so it goes on.
In this current age, we expect and demand high-quality, high-speed and reliable access to fast internet; however, in many parts of the state, particularly in my vast electorate, this is, very sadly, not the case. Personally, I have regular issues with internet in my office in the main street of Wynyard, and I am sure my electorate officer, who is here today in the Chamber, would vouch for that.
Furthermore, some excellent standardised tests have been available for at least 20 years that some teachers have used to their benefit, helping them identify and plan for varying ability levels within a single year group or, in the case of composite classes, to meet the needs of the broader range of students in their class.
I am informed that these tests that have been used on an individual basis are now available online, including the ACER progressive assessment test - or PAT - where the results are available in a timely manner and provide useful diagnostic results which inform teaching and planning for all ability levels. Teachers are out there using these sorts of mechanisms to guide their practice and to get that immediate feedback so they can directly target or address the needs of their individual students.
I will share some of the feedback which I see more specifically related to points (3) and (4) of the motion.
Points (3) and (4) of the motion -
(3) Is concerned that this year's NAPLAN testing encountered a range of difficulties – with students hampered by technological issues in Tasmania and across the country;
(4) Acknowledges education expert David Gonski's comments regarding limitations at the classroom level where NAPLAN reports on achievement rather than growth and the results are six months old by the time they are released.
I was told that when the school implemented NAPLAN online for the first time, after doing two practice sessions in 2017-18, the staff were assured that all technical difficulties would be worked out, and all the technology in the schools would be up to the task by 2019. For the four months before testing took place, a school in my electorate was having constant NBN dropouts, leaving the school with not just poor internet access but no internet access.
Most of the desktop computers were over five years old. Staff were concerned about dropouts during the test, and how the hardware would hold up with so many students online at one time. One teacher informed me that they also had a feeling ACARA may have felt the same, because every school was sent a complete set of the test books in paper form, none of which were used.
Understandably, this did not inspire much confidence in the process to begin with. It is staggering to think of the cost involved in producing booklets for every grade and test nationwide that were not used and were sent back to be shredded.
The other concern is that some schools, especially in my electorate, do not have an IT specialist onsite to support the school during the testing period. All technical difficulties were to be solved by staff members supervising the test. Training for implementing these tests and troubleshooting was conducted in-house and by reading the instruction manuals as mandated by the department. The teachers were having to do all of this. They are not IT technicians. They are teachers who are trying to teach the kids, supervise the test and fix all the IT problems.
I can only imagine how stressful this was for the teachers supervising the test, who were rightly concerned about the welfare of their students but instead were diverted and distracted by the need to fix internet dropouts, sort out sound tests that were not working, and deal with broken headphones and ports, screens blanking out and so on. This occurred in every test across every grade. In one school I was talking to, there were numerous occasions where student tests had to be paused to sort out technical issues of some sort.
While some students were very flexible and tended to take it in their stride, many students were put off by this - particularly when it was happening during the test situation.
One teacher informed me that with all of this, they thought they were lucky to only have two or three students in tears, which is awful when you put this into context. No child should be crying at school.
Another key concern related to the inequity of the test and its implementation. In paper form, if a question is too hard, a child can see other questions that may be more accessible to them and attempt them first to build up their confidence before proceeding. This is not possible in the online format. Students can skip questions but can only see one question at a time. In the reading test, many of the questions are linked to a single text.
A student who was not a confident reader was presented with a grade 5 or 6 text as the first part of the reading test. The child was presented with a wall of small-sized text to read and scroll through. According to teachers supervising the test, there was a real risk they would immediately shut down in this situation and proceed to have a meltdown. It is just too overwhelming.
When this has occurred, staff members would talk to the child and encourage them to have a go and skip over the questions. In this one specific example of the reading task, all eight of the next questions were to do with the text, so they would have had to skip eight questions. You can understand how, in such circumstances, a child will not fully recover from this. It will certainly have an impact on their final result.
In the paper test, the magazine starts with the easiest text and proceeds to the hardest. Students can see at a glance which texts they can access, which gives them more confidence, and they can move backwards and forwards throughout it.
I am informed that the reading test, which is completed second, determined which spelling words students would access in the language conventions test. While reading levels can indicate the spelling levels of some students, it is not a consistent correlation that a good reader is a good speller. If this is a fair test, should not all students receive the same questions? This has been explained to me by people who are actually supervising the test and understand how it works.
It seems an odd way to do it. There may be some good research backing this, but it seems odd to me. In the writing test, students receive different prompts and stimuli as well, but still the same text structure.
I understand and appreciate the randomisation of the questions in an online setting, particularly to prevent possible cheating, but I question how fair and equitable this approach is. I also question how the marking can be compared if not every student is completing the same questions in the test. If standardised results are the goal, how does this work?
These sorts of tests are quite appropriate on an individual basis where you are getting immediate feedback to then guide your teaching. That would not be a problem.
With regard to point (4), teachers I spoke to said Gonski is correct. We are constantly told NAPLAN is a valuable teaching resource and able to be used to identify need and future planning. The test itself is well structured and links well to the curriculum.
Teachers will often use parts of the test, particularly in maths, to test whether a concept has been understood. It is a useful tool; I am not saying it is not. NAPLAN marking guides can be used for teaching and assessing narrative and persuasive texts because they are comprehensive and well researched. The key issue is that standardised test information is not reported back in time to make it useful for teaching practice and for focusing on the identified areas of need.
Teachers who have marked NAPLAN papers in the past informed me all papers are marked and the results are sent back to the Australian Curriculum, Assessment and Reporting Authority within a month. However, teachers do not get the results until at least September, and in the past it has been delayed until October, which the member for Elwick referred to.
I understand it is a bit better now, but there is still a gap between the test and the results. Having a young grade 3 child remember back to what they wrote, and then teaching them on the basis of this, is not easy. I am not a teacher, so I am not assuming anything about the actual application.
The reason for the delay is to gather and smooth the statistics so they look presentable to stakeholders. This would have been a much tougher job this year, because not all students received the same time to complete tasks due to the constant interruptions, for example, in their online access.
I am informed teachers would rather be given the raw data as soon as possible so they can work with their students. For example, if four students struggle with fractions and five did not understand speech marks, this is information teachers can work with right away and target their teaching to make sure the students are not missing key messages and key learnings.
If the marking rubrics for the writing were provided almost immediately after testing, the teachers would have a huge amount of information to work with in regard to spelling, structure, language conventions, grammar and more.
Teachers are professionals, able to analyse data and results to achieve better teaching outcomes. It is part of their core business every day: giving students feedback and reflecting on their practice and how best to improve student outcomes. Not providing this information in a timely manner and then complaining that teachers are not doing their jobs by not using the information to improve outcomes must be extremely frustrating and demoralising for teachers who genuinely care about and value students and seek the best educational outcomes they can for each child.
Talking to teachers when the NAPLAN results are due, I know the anxiety they feel about how they will be judged based on the results of their particular school and students. One teacher told me they already use their own data collections, work samples, observations, in-class testing, post-test diagnostic tests, conferences and so on to provide individualised programs and to set further goals for students. NAPLAN could work alongside these if the issues with it were addressed.
One teacher provided an excellent example of NAPLAN versus teacher-collected data. In this teacher's first year of NAPLAN, they had a student with learning difficulties who participated in the testing as the child was not eligible for exemption. This child had a DE average and was working at least two years behind in all subject areas. They could not comprehend the test questions and resorted to guessing the answers just to finish the test. The teacher was sitting with the student, who was talking through why they were choosing the answers. When the results were released, the student received average scores throughout, with some being slightly higher. This resulted in a very awkward conversation with the parents, she said, to explain the difference between the NAPLAN results and the reality of what was happening in the class.
It lacks rigour if you undertake the test that way. Some of us may remember doing multiple choice questions over the years and some of it is guessing.
The teachers have copious amounts of testing, observations, work samples, diagnostic testing et cetera to provide evidence of the student's learning abilities, and were able to demonstrate to the parents how the information they had collected was informing their individual learning plans. The child I was talking about basically guessed the answers. Unfortunately, because NAPLAN is so widely discussed in the community and in the media, some parents believe it is more important than the other pieces of work teachers gather daily.
NAPLAN, in my view, is a tool and when used well it can be effective in highlighting areas of need. If the results are used in conjunction with all the other data teachers collect throughout the year, it can provide a more balanced view of students and class progress - that is, if teachers can get the information as soon as it is marked. It needs to be provided much more quickly after the test to make it meaningful.
Many teachers are feeling very concerned and stressed out about how NAPLAN is being used, saying that it is such a huge issue and is becoming more detrimental every year to the health and wellbeing of students and teachers, especially with how politicised it has become and how much airplay it gets in the media.
Teachers work very hard in schools to make it as stress-free as possible for the students and to remind them that their work is more than a mark on a standardised test. I want to acknowledge the hard work that teaching is. It is one of the most important jobs that anyone can do. We undervalue teachers in this country. In Finland, for example, teaching is one of the most highly regarded professions. You have to have a master's degree before you can step into a classroom. They take the best of the best students into university education to be teachers. They are treated with the same respect as we tend to treat people like doctors and others at that level. It is a shame we, as a population, do not seem to have the same respect for teachers that the Finnish do.
I agree it is time for a reconsideration of NAPLAN and how the testing is used and how it works. I am not saying we should throw it out and I am not saying we should not do standardised testing. I am saying the way it is done and applied needs to be reviewed.
In preparing for this contribution, I read a number of articles about education and NAPLAN, including a paper by Dr Bob Lingard, published in the Queensland Teachers Union Professional Magazine in 2009. I know this is a teachers union and they are going to have a view, of course, but he did a long paper on NAPLAN 10 years ago. While there are elements of political commentary in this paper, which I acknowledge, I believe the points he raised 10 years ago remain relevant now. I will refer to some of his comments, but I encourage members to read his article if they want to get a historical context. He does some significant comparative analysis of global education policy and its application. I want to quote from a few parts of it. It is called 'Testing times: The need for new intelligent accountabilities for schooling' -
Recently, the Australian Curriculum, Assessment and Reporting Authority (ACARA) has been established to oversee these national involvements.
He is talking about when the Rudd government brought in the NAPLAN process -
Apart from investment in school infrastructure, the most obvious manifestation of the strengthened national presence in schooling and new national accountabilities is the National Assessment Program - Literacy and Numeracy (NAPLAN). NAPLAN entails yearly full-cohort standardised testing in literacy and numeracy at years 3, 5, 7 and 9, conducted in all schools in Australia. The outcomes of these results gain a great deal of media coverage and provoke cross‑state and cross-school comparisons. Over the coming months, the federal government will also release 'like school' measures, comparing school performance for policy and practice interventions. They will also prepare league tables of school performance on NAPLAN.
It goes on to say -
Despite claims to the contrary, NAPLAN tests have quickly become high-stakes, with all the potentially negative effects on pedagogies and curricula.
Even back then it was named as very rapidly becoming high stakes. Turning to policy learning in other jurisdictions - England and Finland - I will quote a couple of sections of this. He said -
In policy and political terms there is now some belated recognition of the negative effects of the dominant policy regime, with also some stepping-back from the absolute emphasis on high-stakes testing. Despite this, there remains an incapacity to move beyond the dominant policy paradigm of seeking to achieve better educational and equity outcomes through targeting linked to league tables of performance on high-stakes testing.
Having said that, the motivation for the Blair and Brown New Labour schooling reforms has been laudable -
This is in the UK.
... namely to improve educational outcomes for all students, and specifically to improve the outcomes from schooling of the most disadvantaged students, so as to enhance their life chances.
Yes, there has been a failure to recognise that it is the quality of teacher classroom practices that count most in terms of school effects upon student learning, and especially in relation to students from disadvantaged backgrounds.
I want to pause there because it is so important we recognise - and that is what the Finns recognise - the value of quality teaching. That is the biggest impact you can have on a child or a young person, particularly those from disadvantaged backgrounds.
He goes on -
Recognition of the importance of teacher classroom practices demands informed prescription at the policy centre, working with a culture of trust and respect of teachers and full support for teachers to develop and practise their professional judgements. In other words, the quality of classroom practices is what counts. This means that governments need to invest heavily in ongoing teacher learning. The evidence is very clear that high-stakes testing produces 'defensive pedagogies', ... rather than pedagogies of the kind described in the productive pedagogies research, which make a real difference to the quality of schooling outcomes.
He then goes on talking about Finland's approach -
The global trend, represented by the Anglo-American model, has been towards standardisation, while Finland retains flexibility and comparatively 'loose standards'. The global trend has been towards a narrowed focus on literacy and numeracy, while Finnish schooling continues to emphasise broad learning combined with creativity. Sahlberg suggests that the global education reform trend has been towards 'consequential accountability', where negative consequences flow from the failure to meet targets, while Finland works with intelligent accountability and trust-based professionalism.
Moreover, Finnish teachers have high status. Teaching is a highly respected profession and an attractive career option for high-achieving students at the end of secondary schooling. Teachers in Finland are comparatively well paid. They also have master's degrees, with a good number of principals having doctorates. There is a real valuing of learning for all associated with schooling. Teachers have a considerable degree of professional autonomy. There is no high-stakes testing. While teacher pedagogies appear to be teacher-centred, they are intellectually demanding. There are only government schools. In a sense all students attend the same school. Finland has a low Gini coefficient of social inequality. That is, it is a relatively egalitarian society with a high degree of equality. But Finland is also a small and relatively ethnically homogenous society. This suggests the need for some caution in borrowing or learning from Finland.
I want to read that last bit in - you cannot just pick up what the Finns are doing and drop it in here, it is a completely different system. But the key messages are there. I encourage members interested in this to read more about that. This matter is not only a Tasmanian issue either. The member for Elwick talked about some of the other states who are looking at this - what is it called, a breakaway group - to look at how NAPLAN is applied. Even in the process those other states are suggesting, I do not believe anyone is saying NAPLAN has no place, or there is no place for standardised testing. What is being questioned is whether the current approach is meeting the needs of our students and resulting in improved educational programs that meet the needs of students in a timely manner.
There was an interesting opinion piece in The Age, written by Adam Voigt, a former school principal, on 28 August from which I want to quote a couple of sections. He wrote -
Can you imagine the Federal Minister for Health, Greg Hunt, decreeing to the masses that, due to some concerns about hospital performance, we're regressing to the treatment of patients with leeches?
'Sure,' he'd say. 'We know that this is a tool of a bygone era and that there's zero scientific proof that it works. We know the professionals who will use the leeches despise the idea, but we think we know better. It's time that our hospitals got back to basics.'
Well, that's effectively what Victorian education minister James Merlino is doing in response to some disappointing data about year 9 performance in NAPLAN in our state.
His idea to attach year 9 NAPLAN scores to their future job applications does nothing more than increase the damage that this failed tool is causing as we all attempt to build a better education system.
NAPLAN is dead. It has been for years. If there's even a shred of relevance in the results released this week, it's a celebration point for Merlino. Victoria leading all states in seven of the 10 general categories for primary schools. Last year we led in only four of those categories.
The work that Merlino's government primary schools have done in achieving this should be his chance to shower them with champagne atop an educational podium. But they've achieved this despite the outrageous pressure applied by the NAPLAN blowtorch, not because of it.
NAPLAN has failed because of one key misuse of the tool: we publish the data. This creates an environment of competition and fear across our education sector, which results in schools focusing on the test and hoarding successful practices when they should be sharing them.
I agree with that point. If we are going to do the best for our students and someone has a good idea that is working - a teacher who has had success in a particularly difficult circumstance with children's learning - we should be sharing that. I am sure teachers would prefer to do that, but this makes them want to hold it close to their chest. We do need to use data to assist all students and support teachers to undertake their vital role. Mr Voigt continues -
NAPLAN was supposed to help teachers spot areas of opportunity and risk. But NAPLAN has pitted our schools against each other and distracted from a true educative pursuit.
Having spoken to parents and teachers about this point, it is safe to say that similar things are happening in Tasmania. Mr Voigt continues -
Schools are already obsessed with the May madness that NAPLAN represents and petrified of the accountability that comes with any slip in results.
Only recently, I spoke to a principal who was already preparing for how she would explain away the performance of a struggling year 2 cohort who won't even sit the tests until next year.
When you're a government school that takes all comers, occasionally a year level comes through with more kids with special needs, more kids living with trauma, more kids with learning impairments and more kids whose parents struggle to send them to school ready to learn.
That is the reality we also face in Tasmania. Schools in my electorate, and I know in others too, can have cohorts of students who have come through with an extraordinarily high incidence of trauma or special needs. Some of these families and students have moved here from the mainland for their own protection. We can only try to imagine how hard it must be for these young children, and for their teachers, who do a remarkable job in simply helping them to feel safe in their new location before they can begin to learn. I have talked to teachers in schools in my electorate where this is the case. This is the reality. These kids are hypervigilant. They have come from a really unsafe place.
A child- and student-centred focus must remain our priority in any policy we develop and use, and the same must apply to how standardised and other testing of students is applied. Mr Voigt also said in his article -
We shouldn't be judging this school or its early childhood educators and compelling them into excuse making 12 months in advance of an inevitable outcome. They've got better things to do.
We should be congratulating them on the important nation building they do by preparing these kids for a better life.
Mr Voigt, in that opinion piece, was responding to an article in The Age published the day before. It was reported the Victorian Education minister will push for a dramatic overhaul of NAPLAN, including linking the tests for year 9 students to future job applications -
James Merlino said it was clear year 9 students were stubbornly disengaged from the national reading and numeracy test, as preliminary results for 2019 revealed a lack of improvement in literacy and numeracy at that year level.
He has proposed the creation of a new proficiency certificate tied to year 9 students' NAPLAN results, which would reveal their proficiency in numeracy and literacy to would-be employers.
He said the certificate would give the test relevance to students as they approach their senior school years, guiding them in their VCE subject selection.
An advisory committee of principals from government, Catholic and independent schools will immediately begin assessing the proposal, among other options to make NAPLAN more relevant to year 9 students.
Mr Merlino has also proposed changing the years students sit the NAPLAN test from years 3, 5, 7 and 9, to years 4, 6, 8 and 10.
The Andrews government will put the proposal to change the test years to the NSW and Queensland governments, which are conducting a breakaway review of NAPLAN with Victoria to determine if it is still fit for purpose 11 years after it was established.
Mr Merlino said a test in year 6 would provide timely information about a student's proficiency in the year before the move to secondary school, and a year 10 test would be similarly useful ahead of VCE studies.
This is the point I am hearing from teachers around Tasmania; they need timely access to data to enable them to focus on areas their students need more assistance with. The article also notes -
The annual literacy and numeracy tests have been plagued by controversy since they were introduced in 2008. Critics argue they create anxiety among children and drive unhealthy competition between schools.
The Australian Education Union reiterated its call on Wednesday for NAPLAN to be discontinued and replaced with 'a new assessment strategy that has students and teachers at its heart'.
'Teachers and principals cannot trust NAPLAN or the results it has produced,' acting federal president Meredith Peace said.
But Federal Education Minister Dan Tehan said NAPLAN continued to provide a crucial tool for parents to understand how their child is progressing.
'We now have rich NAPLAN data on the literacy and numeracy standards of Australian students at key points in their schooling', Mr Tehan said. 'Today's results show that since testing began, progress has been made in most areas but there remains room to improve, particularly in the high school years.'
Victorian Curriculum and Assessment Authority chief executive David Howes said NAPLAN was a test of a student's performance at a point in time, not of their overall ability.
'NAPLAN shouldn't be a great source of stress and anxiety; it shouldn't be regarded by students as being determinant of their future,' he said.
'Our message to parents and students is: make sure you check results with a teacher who has knowledge about the student's level of performance.
'Sometimes the teacher can show data that shows it does not reflect what they are doing day to day.'
Those last few messages are really important. We are not saying chuck out standardised testing. Maybe NAPLAN is not fit for purpose in its current form. Yes, it is time to have a look at it. I support that call to have Tasmania involved in a full and genuinely inquiring review to look at what other options there might be.
The key message I am hearing from teachers and parents is not unique to Tasmania. As I said, I am not suggesting we do away with standardised testing, although my view is that it should not be introduced under the age of eight years; NAPLAN is currently not used until after this age. Any such testing must support a student-centred approach to education that provides timely and meaningful data to teachers to assist them in their crucial role as educators of our children and grandchildren.
As with all practices and policies in areas such as Health and Education, these must be regularly reviewed and tested against available evidence. The adage 'If we always do what we always did, we will always get what we always got' is true in considering the basis of this motion. I acknowledge and I am happy to support the motion, including the call for the Government to be proactive and to join a breakaway review that will report back to the Education Council.
In my view, and the view of many closer to this than I am, it is time for a full, comprehensive review that takes into account all aspects of the motion before us. It does not mean that we are going to be throwing NAPLAN out, but it does mean we are at the table if changes are to be made.