Improvement Plan

Improvements needed:  Fall 2011-Spring 2012

Test time: In the first common test, many students felt they did not have enough time to complete the reading test in the 30-45 minutes provided (the time varied by level). For subsequent tests, we will try to allow enough time for most students to finish the reading tests. A proposed guideline: if the designated time for the reading test has passed and more than 20% of the students have not yet finished, an extra 10 minutes may be given.

Assessing Problems in Student Learning: When we give our next common exam to assess SLOs (Fall 2012), we plan to do a more detailed analysis of the student papers that we agree do not meet the SLOs for each course. Our goal is to discover patterns of weakness in the students' work and seek ways to improve instruction and student learning in those areas.

Fall 2012

Our assessment of the student papers that did not meet the writing-related SLOs revealed the following primary areas of weakness.  For each course, the most widespread problem is listed first.

In ESL 181RW, the most common problems were these:
  • incomplete sentences / sentence structure
  • length of writing (fewer than 150 words on the assigned topic)
  • plagiarism (some students had access to the reading while writing their papers because some teachers did not follow the prescribed procedure, which disallows this access)
In ESL 182RW, the most common problems were these:  
  • incomplete sentences / sentence structure
  • basic verb tenses (those included in the 181RW curriculum)  
  • incorrect use of conjunctions, especially subordinating conjunctions
In light of these results, the department will meet to discuss how to improve our instruction in these areas. We will collaborate, seek out best practices, and explore new resources.

At the same time, we are also considering changing the type of assessment tools and methods we use. Thus far, we have usually chosen readings on topics that the students have not previously studied, along with writing topics based on those readings. As a result, the students enter the test "cold," with no direct preparation for it. This ensures that no instructors teach to the test or prepare their students for it better than others. Although this kind of test has some validity for proficiency and placement testing, it does not mirror the most common type of writing task students are given in their classes, where they are introduced to a topic, given a chance to develop vocabulary and background knowledge related to it, and then asked to write about it.

For the coming round of assessments (Spring or Fall 2013), we may introduce a different assessment method.
