WALNA


WALNA (Western Australian Literacy and Numeracy Assessment)

"The West Australian Literacy and Numeracy Assessment (WALNA) is a curriculum-based assessment that is criterion-referenced and tests students' knowledge and skills in numeracy, reading, spelling and writing. The WALNA test is administered annually to students in years 3, 5 and 7." (WALNA, 20071)

WALNA was "developed through a collaborative effort between the Western Australian Department of Education and Training, the Catholic Education Office of Western Australia, the Association of Independent Schools of Western Australia, and the Australian Council for Educational Research." (WALNA 2006: Information for Parents, 2006)


What is the WALNA?

In Western Australia, teachers from Kindergarten through to Year 12 currently assess students' progress using an outcomes-based method of assessment. This outcomes-based education (OBE) has been a topic of much debate, and recent results from WALNA testing have suggested that the lack of discipline in the current OBE system, as well as the lack of a syllabus, could have contributed to a recent decline in literacy and numeracy levels.

WALNA testing was based on a joint initiative in 1997 between Commonwealth, State and Territory Education Ministers, as part of a national plan focused on the needs of students and emphasising the importance of literacy and numeracy competence.

Testing has been carried out each year since 1998; however, it will cease at the end of 2007. Year 3, Year 5 and Year 7 students participate in the testing. Performance is analysed by individual assessors and scored according to strict criteria. The results of the testing are made available to the schools and teachers of the students, to be used for future planning and focus.

Outcomes Based Education (OBE)

Outcomes-based education is a planning and assessment method that allows schools and teachers to accommodate the individual learning needs and varied learning styles of children in schools. OBE is a topic of much debate in WA presently (as at April 2007), although the achievement levels of Australian students would not appear to be cause for concern.

Who participates in WALNA testing?

The total Year 3, Year 5 and Year 7 population of public schools in Western Australia is approximately 56,500. With a few exceptions, all of these students undertake the reading, writing, spelling and numeracy assessments. Around 22,000 students from WA Catholic and independent schools also participate in the assessments. Public schools in Tasmania and some independent schools in South Australia and the ACT also participate in WALNA. In all, over 100,000 students across Australia participate in WALNA each year. (WALNA, 2007)

The future of WALNA testing

The future of WALNA testing in Western Australia is unclear. Almost ten years of time and effort have produced a final product of the highest standard. Each year the assessments are reviewed and rewritten as necessary, making the instructions for the students clear and precise. The tasks in the assessment gather the required information from the students and are easy to assess, yet they will not be used in Western Australia in 2008 and beyond.

National Literacy and Numeracy Assessment

From 2008, Western Australian schools will instead use the National Literacy and Numeracy assessments for students in Year 3, Year 5, Year 7 and Year 9. This assessment will be standardised Australia-wide. More information on these assessments will be presented as it becomes available.


Opinions of WALNA

There are many differing views on the current levels of achievement by students, both in Western Australia and nationally.

Parents' and schools' assessment of WALNA's worth

"[WALNA testing was]introduced in 1998 under a storm of opposition and anger from educators. In early 1999 an evaluation was conducted to formally gauge parent and teacher reactions to the program. The results indicated relatively high levels of mistrust and dissatisfaction amongst teachers with WALNA." (Cook, 2005. pp. 4)

Teacher dissatisfaction

Figure 1 below details the dissatisfaction amongst teachers with the WALNA.

Fairly incidental work done with schools in interpreting their data suggested that, even in schools where they were trying to use the data for school improvement purposes, the level of knowledge about the assessment was an impediment to efficient and effective use of the data. We also observed more than the occasional instance where teachers and their principals were over-interpreting the data and, as a corollary, their teachers were disposed to over-prepare for the assessment at the cost of curriculum balance. (Cook, 2005, p. 4)


Survey question: "[Does WALNA] assist schools to provide better literacy teaching for all students?"
Image:Satisfaction1.jpg (graph not shown)
Figure 1. Parent and teacher satisfaction with the ability of WALNA to provide better literacy teaching for all students. (Data from http://cmslive.curriculum.edu.au/verve/_resources/Cook_edited.pdf)

Data Club and Assessment for Improvement initiatives

From 1999 to 2003 two significant programs were introduced to support better use of the data by key stakeholders of WALNA – principals and teachers. The first to be introduced was the Data Club which targeted principals. The Data Club supports school leaders in making performance judgements based on their school’s WALNA data.

The second program which, targeting teachers, began life as the ‘Teachers’ Data Club’, has since been rebadged as ‘Assessment for Improvement’. The aim of this professional development program is to increase teachers’ confidence in judgements they make from a range of assessments. Teachers’ analysis workshops have been specifically designed to build teachers’ ability to blend their classroom monitoring with WALNA results to judge student achievement and plan for future teaching and learning.

While both programs were about understanding data, they were built to meet the needs identified by principals and teachers. Beyond the initial data provided, the displays and graphs included in WALNA reporting have been those requested by principals and teachers, rather than all of those that a powerful statistical software package can generate.

A subsequent evaluation was carried out at the end of 2002. Using the (then) original questionnaire, a representative sample of teachers and parents was canvassed as to their opinion of the key aspects of the WALNA. Parents never doubted that external assessment would support schools to provide better literacy teaching (92%), and this view firmed up over the intervening three years (95%). In 1999 close to half (42%) of the teachers surveyed disagreed that system-level test information would assist schools in providing better literacy teaching. By 2002 that percentage had reduced significantly (down to 27%). (Cook, 2005, p. 4)

The changes that were made by introducing programs to assist teachers and principals are evident. Figure 2 below highlights the level of satisfaction achieved after the changes were made, based on a survey in 2003.

Survey questions: "The [WALNA] data on individual students is useful for diagnostic purposes." and "The [WALNA] test results provided me with valuable diagnostic information about my students."
Image:Satisfaction3.jpg (graph not shown)
Figure 2. Satisfaction results after the introduction of programs to assist in the interpretation of WALNA data. (Data from http://cmslive.curriculum.edu.au/verve/_resources/Cook_edited.pdf)

The effectiveness of the Data Club

Overall, we were impressed by how open the principals were to learning more about the data coming from their schools. Before the Data Club it was not at all evident that the great majority of primary school principals would take on board the results of national tests in examining the school’s practices and strategies. The very favourable response to the offer of help in analysing and understanding the WALNA data by the principals is an important finding in its own right.

The information gathered from this survey about the use being made of the Data Club by principals (and incidentally by Districts) suggests that the data, the analysis, and the skill and understanding developed through participation in the Data Club is being applied extensively.… In the process of working through the Data Club, principals (and schools) appear to have stopped being defensive if they are not performing as well as expected.

In sum, the professional development aspect of the Data Club has provided much needed tutoring about analysing and using the WALNA data in schools, and it has been very much appreciated. There was not a single principal who felt that he or she had not learned what was intended for them to learn. The outcome from such successful PD is that principals want more – more for themselves and more for their teachers.

The Data Club has begun very well, but its role has only just begun. Schools recognise that there will be much more for them to learn about using data over the next few years. And they will want reliable help from independent experts. The Data Club has provided those services to everyone’s satisfaction – indeed it seems to have exceeded expectations. (Louden & Wildy, 2001, p. 1)

State and national opinions of WALNA and OBE

With WALNA in its last year (2007) and a move towards national testing from 2008 onwards, it is unclear what will happen to all that has been achieved. The assessments have been modified each year to allow easier testing of students and grading of work. Parents, teachers and school principals have familiarised themselves with the system and are beginning to use the data for their planning. From 2008 it is not just WALNA that will have changed. There is ever-increasing pressure for WA to drop OBE and adopt a national curriculum. Perhaps a move to national testing is just the first step towards the federal government achieving this.

International opinion of Australia's academic performance

Whilst within Australia there is much criticism over the achievement levels of students, OECD data would suggest that Australia is perhaps on the right track. Results from the 2003 PISA test suggested that:

"Australia was outperformed by only one country in reading literacy (Finland); by three countries in scientific literacy (Finland, Japan and Korea); by four countries in mathematical literacy (Hong Kong-China, Finland, Korea and the Netherlands), and by four countries in problem solving (Korea, Hong Kong-China, Finland and Japan)." (Rann, 2006)

References

All graphs were created using the National Center for Education Statistics' Create a Graph tool, available at http://nces.ed.gov/nceskids/createagraph/