Student Assessment

Hi loopers,

I have been in the loop from the beginning but have rarely had time to contribute; I have also felt somewhat reluctant to push the BEAM project. However, I would like to comment on recent discussions about student assessment.

BEAM very consciously adopted a constructivist approach to teaching and learning, with a heavy emphasis on the development of higher order thinking skills. This approach and emphasis have permeated everything we have done: our trainings, our learning guides and other learning materials, and our approach to the assessment of student learning outcomes. In fact, the complete focus of our final summer training last May for teachers, principals and supervisors was classroom-based assessment, and a comprehensive training package was developed along with a DVD, ‘Constructivism in Teaching-Learning and Assessment’. All of these resources are available to everyone via the Acumen website (www.acumen.ph) under the Educators / Assessment / Assessment Training / Assessment Package section (http://davao.acumen.ph/static/Assessment/index-2.html). There are snippets of the DVD integrated into the training materials, and you will soon be able to access the full DVD through Acumen once current modifications are completed.

We have also completed an e-learning training package on student assessment, and this will likewise be available soon through Acumen, which is a repository for all of our learning materials (SBM, teacher in-service training, pre-service, multigrade, special education, distance learning, ALIVE, IP and, of course, assessment). We are glad that the British Council and SEAMEO Innotech have agreed to join us and place all of their learning materials into Acumen too. We hope to have all of these modifications and additions to Acumen completed by November.

In developing the training materials on assessment and working through all of the relevant DepEd Orders pertaining to student assessment, the need for an overarching DepEd policy on student assessment became obvious. Working with people from the NETRC and the Central Office, we drafted a policy paper for others to consider as a starting point for further elaboration. Among other things, the paper endeavors to relate classroom assessment to the BEC and advocates authentic assessment strategies consistent with the philosophy of the BEC. I have attached a copy of that draft paper for your information.

Without trying to draw analogies with the NAT, the RAMSE (Region-wide Assessment in Maths, Science and English), which we have conducted in Regions XI, XII and ARMM each year since 2004, has proved to be a very useful evaluation tool for the system. A sample of more than 22,000 students from Grade 4 and Year 2 has been tested each year, with the focus on the BEC competencies and higher order thinking skills. The Grade 4 test included Grades 1 to 4 competencies, while the Year 2 test covered competencies from Grade 4 to Year 2. Every booklet was numbered and collected, and about 60% of the items were common each year (anchored items). One of the most significant findings was that the mean score on the anchored items, averaged across the three subjects, improved by 22.95% for Grade 4 and 21.12% for Year 2. Admittedly, the baseline means in 2004 were only in the high 20s, but increases of this nature over only five years are very considerable. If you are interested in all of the findings of the RAMSE tests, the 2004 to 2007 RAMSE reports are available on the BEAM website (www.beam.org.ph) under BEAM Systems / Evaluation Reports / RAMSE. The 2008 report is almost finished and will be posted on the website very soon.
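
To make the anchored-item arithmetic concrete, here is a minimal sketch. The per-subject figures and function names are entirely hypothetical (not drawn from the RAMSE reports), and the gain is computed as a simple difference of mean scores on the common items, which is one plausible reading of the reported figures:

```python
# Illustrative only: hypothetical per-subject mean scores (in %) on the
# anchored items, i.e. the ~60% of items common to both test years.

def mean(values):
    return sum(values) / len(values)

def anchored_gain(baseline, latest):
    """Difference of mean anchored-item scores, averaged across subjects."""
    return mean(latest) - mean(baseline)

# Hypothetical data: baseline means "in the high 20s", as in 2004
baseline = {"Maths": 27.0, "Science": 28.5, "English": 29.5}
latest = {"Maths": 50.1, "Science": 51.2, "English": 52.5}

gain = anchored_gain(list(baseline.values()), list(latest.values()))
print(f"Average gain on anchored items: {gain:.2f}")
```

Because the same (anchored) items appear in both administrations, a difference of this kind reflects changes in achievement rather than changes in test difficulty; that is the design rationale behind anchoring.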

It is heartening to read the comments about assessment and to see the obvious interest. I hope DepEd picks up on this area and gives it appropriate and due recognition. After all, what you really value is what you assess, so if you are advocating the development of problem solving and other higher order thinking skills (HOTS), then you have to assess whether they are being developed.

Cheers,

D’Arcy

(Ian D’Arcy Walsh is Australian Project Director, Basic Education Assistance for Mindanao – BEAM Project)

_________________________________________

DRAFT OF
POSITION ON STUDENT ASSESSMENT

THE BASIC EDUCATION ASSISTANCE FOR MINDANAO (BEAM) PROJECT

Acknowledgements

This document has drawn on ideas developed in other school systems to describe holistic approaches to student and classroom assessment and the reporting of this information to parents.

Segments from the South Australian policy statement Assessing and Reporting for Schools (1996), the National Assessment and Reporting Policy of the Papua New Guinea Department of Education (2003) and the Background, Rationale and Specifications of the Queensland Curriculum, Assessment and Reporting Framework (2005) have been particularly helpful.

The document has also drawn on key Department of Education documents, particularly DepEd Orders of the Republic of the Philippines.

Introduction

The overall goal of the BEAM project is to improve the quality of teaching and learning in basic education in Mindanao. Specifically, the project aims to improve and enhance the skills and knowledge of teachers and educational managers, and to address other community needs. These needs include basic education for Indigenous Cultural Communities (ICCs) and Madaris. The project covers Regions XI, XII and ARMM.

The project approach to student assessment is driven by the main assumptions of the BEAM approach to student learning. The BEAM approach is designed to align with the assessment requirements of the Basic Education Curriculum (BEC), consistent with DepEd Order 16 of 2002 relating to Foreign-Assisted Projects.

The BEC was promulgated in 2002 under DepEd Order 43 of 2002 (and an earlier Order 25 of 2002).

Order 43 provides guidelines for elementary and secondary schools. These guidelines cover, in general terms, the structure of the curriculum, time allotments, grading procedures (adopted from Orders 66 of 1995 and 80 of 1993), the approach to character traits, and modified student report formats (DepEd Form 138). Details of the competencies required in each learning area of the BEC have been distributed and are available from the DepEd website[1]; the elementary subject areas are English, Science, Filipino, Edukasyong Pangtahanan at Pangkabuhayan, Mathematics, Makabayan and Edukasyong Pagpapakatao.

Complementing the BEC is the revised approach to grading, detailed in DepEd Order 79 of 2003, which advises the basis on which grades should be calculated, tests designed, and grades reported. From the documents explored to date, it appears that there is neither a definitive position on student assessment (beyond grading) nor a global DepEd policy on student assessment.

Rationale for this document

The BEAM project promotes a student-centered approach to learning.  A fundamental tenet of the approach is that each student ‘constructs’ their own understanding of the world (and of what is taught to them) based on their personal assimilation of information, moderated by their current ‘accumulation’ of information and understandings.

No matter what the teacher hopes, students who do not yet have the background knowledge and skills cannot meaningfully accommodate what they are told, what they explore, or what they attempt to read. A teacher sensitive to the student’s ‘construction’ of learning has a view of the current accumulated learning of each student, and can attempt to ensure that each student engages in the learning process at an appropriate level. This is recognized as very complex for a teacher, with the complexity increasing as the size of the teaching group increases.

The task required of the teacher is made more manageable by approaches for estimating each student’s accumulated learning, for keeping track of it, and by the range of teaching and learning approaches that can be used to assist and motivate the student.

DepEd Order 35/2005 addresses the findings of the Bureau of Secondary Education’s monitoring and evaluation of the implementation of the BEC in secondary schools. In particular, the Order notes that ‘teachers have limited knowledge of constructivism’[2]. The report goes on to indicate that “ ‘learning as a construction process and the learner as a constructor of meaning’ is among the basic concepts of the BEC. … Although the concept was unfamiliar to many teachers, … its operationalization was observable in some classes … where problem solving, inquiry or discovery approaches were being used.”

This document outlines the approach of the BEAM project to student assessment, and presumes a ‘constructivist’ view of learning and an interest in promoting higher order thinking.

Role of student assessment in BEAM

‘Assessment’ can mean different things to different people. BEAM promotes a very broad concept of assessment, consistent with the dictionary sense of ‘fixing the amount of, estimating the value of’ (Concise Oxford). In non-technical use it can be used interchangeably with ‘evaluation’, but BEAM prefers to reserve ‘evaluation’ for ‘big picture’ assessment; that is, analyzing data from a range of sources to evaluate a larger-scale process rather than the learning position of any individual student.

The assessment literature distinguishes between ‘formative’ assessment and ‘summative’ assessment.  The former is the concept of making a judgment of a student’s understanding in order to help him/her make further progress, often described as ‘assessment for learning’.  The latter is used to summarize, to take a more detailed ‘reading’ after a period of time, at the end of a grading period or year, described as ‘assessment of learning’.  The distinction has value for some purposes but BEAM, in promoting a strong emphasis on formative assessment, will normally imply ‘formative’ in its use of the term ‘assessment’.

A ‘student assessment’ is any event in which an observer, directly or through the use of a tool such as a test or a task, estimates what or how much a student knows or understands.  For BEAM, assessment is the process of identifying, gathering and interpreting information about the current position of a student’s learning.  By relating a current assessment to previous assessments, growth in learning can be observed.

The purposes of assessment, for BEAM, are:

  • to improve the learning outcomes of all students;
  • to provide specific feedback to students to help them improve their performance;
  • to modify instructional strategies to meet individual student needs;
  • to provide information about whether the learning goals of the teaching program have been achieved;
  • to assist with making decisions about subsequent teaching and learning; and
  • to make decisions regarding the effectiveness of instructional programs.

Teachers make assessments of students for much of the time they are in contact with them. Most commonly these assessments are informal observations of student behaviours, responses and interactions. Only infrequently, compared to the rate at which teachers are assessing, are assessments ‘formal’, that is, recorded and reported.

Assessments need not be confined to observation of students going about learning, an often mysterious and hidden process. The classroom can be set up for students to regularly provide feedback about their own learning through response techniques such as ‘traffic light cards’ or student-initiated signals that let the teacher appreciate the degree to which a student understands a lesson. Under BEAM and the BEC the teacher is required to understand each student’s level of achievement and to use this understanding to help each student; practical techniques are therefore required to achieve this.

Formal assessments of student ‘products’ and pencil-and-paper tests are also used to monitor student learning.  The results of these formal assessments, as well as informing the teacher and the student, are often used for reporting to other interested parties: parents and carers, and future teachers in higher grade levels for example.  If the assessment data can be aggregated and summarized, they can also be used for the principal (and other school leaders) to understand the success of the school in helping students learn, as well as used by divisional, regional or national offices for similar analyses.

Under the requirements of DepEd Order 79 of 2003, the approach of teachers to assessment must mesh with grading policies, that is, the reporting of students’ educational achievements to parents. For each of the four grading periods, a grade in the form of a percentage must be calculated for each student, on the basis described in the Order. The BEAM approach to assessment de-emphasizes formal grades (summative assessments) and emphasizes the regular formative assessments and observations that help a student (and teacher) appreciate that progress is being made.

Principles of Assessment within BEAM

The following assessment principles underpin the position of BEAM on student assessment.

  • Assessment is continual, recorded in a simple fashion when recording is required, and is able to illustrate each individual’s learning improvement over time.
  • Assessment is related to clear expectations of what is to be achieved for any student, based on clear criteria (BEC competencies as a possible example) described in a form that shows empirically established ‘increasing difficulty to learn’ sequences.  Establishing approximately where a student is positioned in such a sequence (that is what outcomes have been achieved) can be used as a measure of student progress.
  • Within a spiral curriculum (key skills, ideas regularly revisited, with each new visit establishing a deeper and richer understanding), assessment of achievements can be related to the understandings and skill levels appropriate to the cycle of the spiral and thus illustrate the student’s improvement.
  • Assessments are sufficiently sensitive to help students understand that they are making progress in their learning, and what they need to learn next.
  • Assessments help teachers appreciate where students are in their development of skills, knowledge and concepts, and thus inform teachers where modification of their teaching strategies is needed to ensure the student is engaged within their zone of proximal development (ZPD).[3]
  • Assessments are based on a deep belief that all students can succeed in their learning, given appropriate time and support.
  • Assessment methods and strategies are appropriate to the age and stage of development of students and are as varied and unobtrusive as possible.
  • Students are exposed to a wide range of assessment methods and, as a result, understand and are prepared for any assessment approach they might encounter, in advance of that exposure; that is, students develop strategies to deal with any required testing or examination process.
  • Assessment in BEAM encourages a commitment by teachers to being personally accountable for student improvement over time, that is for student learning growth.  Good performance in school is strongly influenced by student ‘entry’ skill levels and home/economic background.  A teacher lucky enough to get students who already understand the material being taught, or from ‘achievement rich’ home backgrounds, is not necessarily an effective teacher.  The effective teacher is the one who achieves learning growth for all students.  True measures of learning should focus on growth in knowledge and skills and not just reflect the student’s inherent aptitude.
  • Assessment processes used by a teacher for each student are sensitive to individual differences in student development, confidence, learning styles and motivation, and consider the implications of multiple intelligences.
  • Teachers are expected to use their ‘on-balance’ judgment of a range of sources of information to assess students’ progress.  Teachers must be supported in the development of these assessment approaches through peer moderation, within and across schools.   Test moderation of teacher judgments, that is comparing teacher and test assessments to improve the ability of teachers to mimic the scales of the tests, should also be considered.
  • Assessments are, and are seen to be, valid and reliable. However, some loss of reliability in exchange for increased validity should be accepted when teachers exercise their professional judgment in assessing students through observation and mixed methods.

Assessment as a connected process

From a BEAM perspective assessment is but one phase of a recurring cycle of steps.  In the metaphor of a journey, these steps include:

  • Teaching, or setting the path: encouraging and preparing students for discoveries and helping them read the map of the journey.
  • Assessing, or finding who is lost (off the path), tired (stopped), who is almost there, and so on; that is, estimating where each person is and updating this assessment as new information is obtained.
  • Recording, or keeping notes on who is where so you can help them get back onto the right path.
  • Reporting, or letting other appropriate interested parties (mainly the students themselves) know where the student is and where they have been.
  • Evaluation, or reflecting on the whole trip and deciding whether taking the same path in the future would be desirable.

Assessment cannot be developed as a separate process, independent of its relationship to, and impact on, the other recurring phases of learning.

Issues for further consideration

This document describes the position adopted on student assessment within the BEAM project, which is understood to be consistent with approaches to assessment and evaluation within the Basic Education Curriculum. Exemplar lesson plans included in the English Philippine Elementary Learning Competencies (PELC) include sample ‘evaluations’ that are consistent with the BEAM approach.

DepEd Order 79/2003, along with its implementing guidelines for secondary education (DepEd Order 37/2003), advises the position on a significant aspect of student assessment: reporting to parents and students through grades and the structure of reports. By implication, this Order sets the tone for all other aspects of student assessment. That tone may be discordant with the view of student assessment taken by BEAM and by key evaluators of the BEC (the Bureau of Secondary Education and the Bureau of Elementary Education).

The framework is however open to a broad interpretation that allows, among other matters, for

  • the use of non-traditional assessment approaches;
  • the general promotion of rubrics; and
  • the use of quizzes, recitations/interactions, behaviour observation, homework, projects, themes/experiments, and other performance outputs.

Both Orders provide very specific advice about how grades are to be calculated, which might be seen as a constraint on some of the assessment strategies considered appropriate by BEAM.

However, the general assessment principles of BEAM have been developed in a way that is consistent with these Orders. The approaches to assessment promoted by BEAM can be seen to fit within the broad categories of the grading system, perhaps with expanded definitions (possibly needing DepEd approval) for the major categories, viz.:

  • Periodical tests
  • Quizzes
  • Participation
  • Project/Output
  • Performance.

The grading system requires significant contributions from each of the five categories; in secondary school, three of them can contribute up to 25% each, depending on the subject. This means a variety of assessment approaches must be encouraged. A revised description of classroom assessment strategies and their link to the broad categories above might be an initial strategy for enhanced approaches to classroom assessment.
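
As a rough illustration of how the five categories could combine into a grading-period percentage, here is a sketch. The weights are invented purely for the example (actual allocations vary by subject and are set by the Orders, not by this document):

```python
# Hypothetical weights for the five broad grading categories; real
# allocations are set per subject by DepEd Orders 79/2003 and 37/2003,
# and these numbers are invented purely for illustration.
WEIGHTS = {
    "Periodical tests": 0.25,
    "Quizzes": 0.25,
    "Participation": 0.15,
    "Project/Output": 0.25,
    "Performance": 0.10,
}

def period_grade(scores):
    """Weighted percentage grade for one grading period."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Illustrative per-category percentage scores for one student
student = {
    "Periodical tests": 82.0,
    "Quizzes": 75.0,
    "Participation": 90.0,
    "Project/Output": 88.0,
    "Performance": 70.0,
}
print(f"Grading-period grade: {period_grade(student):.1f}%")
```

A variety of assessment approaches feeds naturally into such a scheme: each category can be populated by different instruments while the combination rule stays fixed.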

The hope of the BEAM project is that, by attempting to articulate the broad vision for student assessment adopted in the BEAM approach to learning, discussions can occur among key policy planners in DepEd to develop additional statements of policy on classroom student assessment. These DepEd statements could then guide refinements to the BEC, to BEAM and to related department activities, such as those of the National Educational and Testing Research Center, as well as a new Order updating 79/2003.

9 thoughts on “Student Assessment”

  1. Once DepEd Order No. 74 s.2009 kicks in, it will throw a curve at some current assessment baselines and methodologies.

    The paper, “Alternative assessment and the teaching of Mother Tongue languages in Singapore schools” by Koh Kim Hong, Gong Wengao, and Lye Mun Sum, presents an overview of the necessity, feasibility and advantages of alternative assessment, of how its use can improve the quality of MT teaching and learning in the actual classroom context, and of the practical constraints of introducing alternative assessment into MT education.

    Singapore’s experience is unique in that it has a bilingual policy whereby English is the medium of instruction in ALL subjects, except in one of three official mother tongue languages (Mandarin Chinese, Malay, and Tamil). [Extraneous observation: in 2007, Singapore was at the top, if not among the top 10, in the TIMSS.] The Philippine MLE model will be a bit different in that the MT is used as the medium of instruction in all subjects at the initial phases, then as scaffolding later. I suppose student assessment, whether traditional or constructivist, will have its own challenges once MT-based MLE is in force. For one thing, in addition to the usual assessment techniques, you will use the MT in the assessment process, necessitating that whoever conducts the assessment be familiar with whatever MT is being used. [In Ilocano for instance, "12 divided by 5" most likely translates to "pagkalimaen ti sangapulo ket dua".] Ah, Cyndi Lauper put it best: [boys and] “girls just wanna have fun”!

  2. Dear D’Arcy Walsh:

    I read your posting about BEAM in the TEDLOOP, and it touched my interest. I have heard of BEAM before, but your posting was my first real contact with someone who breathes it. I want to know more about it. My first interest is to see the (technical) reports about the gains of 21+% and 22+%. I looked for any relevant report under the BEAM website and under “acumen” but could not find any. If you could walk me through the BEAM experience, we might be able to beam BEAM to other assessment projects afoot.

    • Hi Abe,

      Just to give you a bit of background, BEAM is a special DepEd project jointly funded by the Philippine and Australian Governments, with the Australian grant being AUD 53.4m, or 2.07 billion pesos. It began in January 2002 and focuses on improving the quality of, and access to, basic education in Regions XI, XII and ARMM. It is the biggest bilateral education project DepEd has ever had, and the first and only one where management has been decentralized to the Regional level. We finished in Regions XI and XII on May 31st and will conclude in ARMM on Nov 30.

      While much has been achieved here in our 3 targeted Regions, many of these achievements have also been picked up by Central Office and are now being rolled out across the country. I speak of the NCBTS, much of the SBM training and processes, the extended practicum and revised pre-service curriculum for TEIs, the ALIVE program and other Muslim and IP Education initiatives, the HRIS, the processes and systems we used in the access programs, our Learning Guides and now this Acumen repository, to mention a few.

      I rarely comment on this TEDP loop as I just don’t have the time, and furthermore many of the members already know all about BEAM (e.g. Nap Imperial is on our PCC and Linda Pefianco was on our QAP), not to mention the other senior DepEd officials involved in different ways.

      Anyway, as I mentioned in my email, the 2008 RAMSE Report, which contains the analysis you refer to (the 21% and 22% gains), is undergoing final edit and will not be on our website for another two weeks; however, the 2004 to 2007 RAMSE Reports are there and, while not showing the improvement level of the most recent report, do show that considerable gains have been made. These reports are on our website under “Evaluation Reports”, along with our External Evaluation Reports, our Access Tracer Studies and our ALIVE Evaluation. In fact the website is full of a huge range of technical reports: see under the M & E outputs, outcomes and impact sections, under the various Stage 1 and 2 Milestone Reports, and under the Resources section. These website sections also contain annual plans, design documents and other things like KPI reports. You should have no difficulty finding them.

      During the life of the project we developed special Learning Guides for all Maths, Science and English competencies for Grades 1 to 6 and Years 1 to 4. These are not lesson plans but rather a section of work around a particular competency which may take about 4-6 lessons to teach. All the guides are structured around the six stages of a constructivist approach to teaching and learning. We built special authoring software for the development of these guides and have recently expanded it to include a repository of all our learning materials, and now those of the British Council and Innotech. We call this new program Acumen and hope to have the final version up by the end of October. If you want to look at these learning guides, our distance learning materials and other resources for each competency, go to acumen.ph and click on Davao, then Curriculum - BEC, then click on, say, Science Grade 5 and choose a learning guide or distance learning module from a particular topic. You can then move across the competencies/content/activities and resources. When you click on resources you can actually download a PDF file of the learning guide or distance learning module. In some cases you can also obtain assessment items under the resources section. Under the Educator’s section of Acumen you can download a range of teacher training and support materials for all the areas listed. Remember we are still ironing out a few little bugs in Acumen and trying to improve its usability.

      I hope that is enough “walking through” for you.

      Cheers,

      D’Arcy

      • Dear D’Arcy:

        Thank you for walking me through Beam. It was a very thorough briefing, and I really appreciate your taking time out of your busy schedule to respond to a request. The scope of the Beam project impressed me.

        I have noted what you said about 2004-07 RAMSE Reports, together with the 2008 report being finalized. I have not yet seen them because I have been briefly away but will definitely read them with great interest.

        I hope to give you substantive comments next time.

        Many thanks.

        Abe

  3. Dear D’Arcy,

    Thank you so much for the link to Acumen. It’s just fabulous! I hope all teachers will be able to access such a wealth of resources. Maybe we can also upload there the training materials we use for MLE (based on the Lubuagan model).

    For now, only the Davao server is functional and so the materials you mentioned are found in the link below: http://davao.acumen.ph/static/Assessment/index-2.html

    best regards,

    ched

  4. Hi Ched,

    Thanks for the prompt response. We are in the process of handing over Acumen to DepEd and SEAMEO Innotech, at which time there will be three servers operating across the Philippines to accelerate access. There are other discussions going on about possibly installing a copy of the resources (synchronized with the main servers) in individual schools to speed up access to video clips and other interactive materials incorporated within Acumen, but all of these discussions are very much in the early stages. In fact Acumen itself is in its early stages as well and is, based on feedback from users, currently being modified to be more user-friendly. We expect this process to be completed by November.

    There is no problem uploading MLE materials; we would like to see as much material as possible included in Acumen to support teachers and educators in general.

    Cheers,

    D’Arcy

  5. Assessment and evaluation are two different concepts. Here’s a wrinkle on “teacher evaluation” from today’s Los Angeles Times:

    http://www.latimes.com/news/local/la-me-data29-2009jul29,0,4522983.story

    From the Los Angeles Times

    California schools chief reacts to U.S. criticism on teacher evaluation

    Jack O’Connell visits Long Beach to show that districts in the state are allowed to tie test scores to educator assessments. Obama and his Education secretary chided California on the issue last week.

    By Seema Mehta
    10:18 PM PDT, July 28, 2009

    California’s top education official sought Tuesday to counter federal criticism of the state’s reluctance to use student test scores to evaluate teachers, paying a visit to Long Beach to highlight one of the few California school districts to make extensive use of such data.

    The Long Beach Unified School District’s use of student scores to assess the effectiveness of programs, instructional strategies and teachers is a rarity in California, and state Supt. of Public Instruction Jack O’Connell called it a model for other California school districts during a hastily arranged round-table discussion. Other participants included district administrators and staff.

    “Becoming a data-oriented culture, as Long Beach is, won’t be easy, and it won’t be overnight,” O’Connell said. “Long Beach is ahead of the curve. . . . You’re a model for this new culture of data for education.”

    The visit followed comments last week by President Obama and U.S. Education Secretary Arne Duncan, in which they criticized the state for not allowing such test data to be linked to teacher performance evaluations.

    On Friday, Obama singled out California for failing to use student test scores to distinguish poor teachers from good ones and Duncan warned that states that bar linking such data to evaluations will be ineligible to compete for the $4.35-billion “Race to the Top” grants. That funding is part of roughly $100 billion earmarked for education in the economic-stimulus package.

    The U.S. Department of Education will be awarding the money in competitive grants to states. Applications are due in December.

    Duncan has repeatedly raised the issue, including during a trip to San Francisco in May, when he called California’s position “mind-boggling.”

    “The firewall between students and teachers is bad for children and bad for education,” he said. “I challenge the state to think very, very differently about that.”

    At issue is a 2006 California law that prohibits use of student data to evaluate teachers at the state level. O’Connell said Obama and Duncan misunderstand the law, which does not bar local districts from using the information.

    “I need to do a better job making that case,” the schools chief said, adding that he would be open to amending the law to clarify the matter. Gov. Arnold Schwarzenegger has also supported such a move, though it would probably draw opposition from the state’s powerful teachers unions.

    O’Connell’s Long Beach visit, which a district official said was not put on the schedule until Monday, was designed to show that California school districts are already able to use student data to assess teachers.

    The 87,499-student Long Beach Unified School District has won national acclaim for its students’ academic performance. Obama cited the district in his first major speech on education.

    “The reason we have been successful . . . is because we base all of our decisions on data,” Long Beach Supt. Christopher J. Steinhauser said Tuesday.

    Seven years ago, the district developed a sophisticated centralized data system that allows it to track individual student achievement, attendance and discipline over time. The system also lets the district see how students are faring collectively in a particular classroom or school, and how subsets such as English learners or special education students are performing. District officials can then use the information for staffing decisions, such as where to send specialists.

    Tom Malkus, principal of Lee Elementary School, said he and other school leaders use the data to spot struggling teachers and offer coaching, professional development and other support.

    If that fails, Steinhauser said, the district has “courageous conversations” with teachers that can result in their leaving the profession.

    The system allows teachers to look at their students’ most recent work to ensure that they understand a particular lesson, or double back to concepts that are difficult for them.

    “You can look at individual students’ needs and you can look at the group’s needs,” said Christina Benson, a teacher at Lee Elementary. “It’s perfect for me.”

    seema.mehta@latimes.com

  6. On Sunday, 8/2/09, Abraham Felipe wrote:

    Dear D’Arcy:

    I have gone through the RAMSE reports and am now back, as I said I would the last time I wrote.

    Let me first commend you for the BEAM Project. It is a lot of important work of significant implications in the classroom. I hope to see in the future follow-ups of the logical implications of the present findings in the project.

    My interest in BEAM is connected to a little work in which I am now involved. You might have heard of a plan to organize a National Educational Evaluation Testing System or NEETS, a recommendation of the recently concluded Presidential Task Force for Educational Reforms (PTFER). I am doing some work for NEETS.

    NEETS has recently been clarifying its focus and is now at that point where it is de-emphasizing testing and converging on “assessment” instead. From its viewpoint, “assessment” is not the ‘small picture’ evaluation in the classroom in contrast to the big-picture variety that some might want to call “evaluation”. From our group’s standpoint, we would rather speak of a classroom level type of assessment, an institutional (school) level type, and beyond that, a system-level type. We have more or less accepted that NEETS has to go beyond testing which primarily implies measurement, and attend to assessment which further implies the application of standards.

    There is still some internal debate in our group over whether to restrict ourselves to the system-level variety and leave the classroom and institutional levels for others to do. I am one of those who hold this view because I fear the sheer volume of work that classroom and institutional assessments entail, and I doubt that NEETS could ever organize sufficient technical staff with the experience, training and technology for these types of work. I am also aware of at least one organization with experience in institutional-level assessments. And now I am aware of BEAM.

    I see the work of BEAM, the work in institutional assessment and the emerging focus of NEETS at the systems level as connected. The thread connecting them is the idea that assessment is one tool that government could use in order to rationally direct efforts to improve the system. At the classroom level, assessment will inform a teacher of the degree to which he is attaining his objectives; likewise for the school and systems manager vis-à-vis the institutional and systems-level objectives, respectively. If classroom assessment has value, that value should register at the institutional level; and if institutional assessment in turn has value, it should register at the systems level. A coordinated effort to assess the effectiveness of classroom teaching, wherein assessment is done at various levels starting from the classroom, should help guide the system toward progress.

    Thus, my initial interest in the RAMSE reports was to find whether any evidence exists that classroom-level assessment leads to higher achievement. To my mind, the basic design to address this issue would involve the following: (a) treating classroom assessment as the independent variable, and (b) using as the dependent variable a measure that the (classroom-level) implementors do not use in tracking student progress at the classroom level.

    Requirement “a” is self-explanatory. As for “b”, progress must be established using measures other than those proposed by the advocates of classroom assessment, so that any evidence proffered cannot be dismissed as merely circular. To my mind, a DepEd-advocated instrument like the NEAT, NSAT or NAT would be the logical best. It is understandable if these measures do not meet your criteria of “satisfactoriness” on technical or some other grounds and are therefore not accepted, but it is important that the dependent variable finally used be independently defined. As an aside, I do not consider procedures for weighting samples to be important at this point. The important point is to demonstrate the value of classroom assessment, and for this purpose an experimental approach that is not weighed down by sampling issues has higher priority. Replicating in the Philippines experimental demonstrations of classroom assessment has its own political arguments besides its technical arguments.
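    The design Felipe outlines can be sketched in a few lines: treat classroom assessment as the independent variable (schools with vs. without it) and an independently defined score such as the NAT as the dependent variable, then compare the two groups. The sketch below is purely illustrative; the school scores and the use of Welch's t statistic are my assumptions, not BEAM or NEETS data.

    ```python
    # Illustrative comparison of an independently defined outcome (NAT-style
    # scores) between schools with and without classroom-assessment training.
    from statistics import mean, stdev
    from math import sqrt

    def welch_t(treated, control):
        """Welch's t statistic for two independent samples with unequal variances."""
        m1, m2 = mean(treated), mean(control)
        v1, v2 = stdev(treated) ** 2, stdev(control) ** 2
        n1, n2 = len(treated), len(control)
        return (m1 - m2) / sqrt(v1 / n1 + v2 / n2)

    # Hypothetical school-level mean scores (made-up numbers).
    with_ca    = [52, 58, 61, 55, 60, 57, 63, 59]   # schools using classroom assessment
    without_ca = [48, 50, 53, 47, 55, 51, 49, 54]   # comparison schools

    t = welch_t(with_ca, without_ca)
    print(f"mean difference = {mean(with_ca) - mean(without_ca):.2f}, t = {t:.2f}")
    ```

    The key point of the design survives even in this toy form: the dependent variable is a test the classroom implementors do not themselves use, so a positive difference cannot be dismissed as circular.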

    Finally, as an aside, I noted the repeated RAMSE observations of very small differences between G4 and Y2. You might want to factor into your explanation some findings which might be related:

    * In 1976, the then Ministry of Education reported a study comparing the achievement test results of 5th and 6th grade pupils, in which Grade 6 pupils did not perform much better than Grade 5 pupils on the same tests. The report spoke of an arresting of development, as if something like a Piagetian ceiling had been hit. Follow-ups in 1986 and 1988 reported a recovery for Grade 6 in 1986, only to fall back to the old (1976) level by 1988.

    * In 2004, I replicated the study with more controls. I reported that G5 pupils had more learning competencies than G6 pupils, a worse scenario than in 1976. If you want to check out the details, download the December 2006 paper [Unexpected learning competencies of Grades 5 and 6 pupils in public elementary schools: A Philippine report--see abstract below] from http://iej.cjb.net. The RAMSE report could mean that the first two years of high school are a period for recovering whatever was lost at the end of the elementary years. LONG SHOT though!!

    Regards,

    Abe

    ABSTRACT: The…study tested the assumption of a positive and linear relation between years of schooling and school learning in the Philippine setting. It replicated a 1976 study that had cast doubt on this assumption in the Philippine public educational system. It tested three competing hypotheses for that finding: common sense, the 1976 arrested development hypothesis, and the alternative accelerated development hypothesis. To test these competing hypotheses, two factors were systematically varied: the grade levels of Ss and the levels of the tests used. The competing hypotheses have different predicted outcomes. A total n of 7097 from 96 schools participated in the study. The results showed that on all tests Grade 5 showed more competencies than Grades 4 and 6, although Grade 6 continued to perform better than Grade 4. When sub-test level was held constant in multiple comparisons, Grade 5 was learning more Grade 6 competencies, whereas Grade 6 was losing not only Grade 6 but also Grade 5 competencies. It is noted that whereas Grade 6 enjoyed a slight superiority in achievement scores circa 1976, the present study shows that Grade 5 enjoys an impressive superiority over Grade 6 circa 2003. That in Grade 6 one knows more competencies than in Grade 5 seems to be a myth. The common sense hypothesis has been ruled out. The results are consistent with the accelerated development hypothesis.

  7. The discussion has piqued my interest as a bystander. I can only sympathize with the hard work needed to reach your common goal: to improve, in general, the quality of Philippine education, which to this day has not come even near what is desired. It reminds me of the ancient mathematician Archimedes, who said: “Give me a lever long enough and a place to stand and I could lift the world.”
