Questions are placed into categories for ease in reviewing. Viewers may select from the following menu of questions to locate topics of particular interest:
- What are the DIBELS measures?
DIBELS are five brief measures of the important skills forming the basis for early success in reading. DIBELS measure a student’s ability to hear the individual sounds or phonemes in words, to decode words, and to read connected text. DIBELS are typically administered from pre-kindergarten through third grade, the time when these skills should be firmly established.
- Where were DIBELS originally constructed, and do they have a strong research base?
DIBELS were originally developed at the Early Childhood Research Institute on Measuring Growth and Development at the University of Oregon. Extensive research has been conducted on the DIBELS measures and on how accurately they predict performance on important outcomes that depend on the ability to read and comprehend written text. Technical reports are available from the University of Oregon website http://www.uoregon.edu.
- Are DIBELS considered diagnostic measures?
DIBELS may be considered a first step in the diagnostic process but are not typically thought of as a diagnostic tool. They are best suited to screening students for weaknesses in phonemic awareness, phonics, and reading fluency, and they are an efficient and practical progress monitoring measure for K – 3 students. In many cases, a student who performs in the at-risk range on DIBELS may not need additional assessment (e.g., a diagnostic measure) before the teacher is able to implement remedial strategies. In other cases, when the remediation is not successful, the teacher may seek additional data from a diagnostic test of reading to better target the areas of weakness that are impeding the student’s progress.
- Can paraprofessionals be trained to administer and score DIBELS?
Yes. Paraprofessionals can receive training in the administration and scoring of the DIBELS measures. It is important that the training include additional opportunities for practice and oversight, as many paraprofessionals have had little, if any, experience conducting standardized testing. It is sometimes advisable to assign paraprofessionals to one grade level (e.g., second and/or third grade) so that only one or two DIBELS measures need to be learned to proficiency.
- Do all children participate in DIBELS?
If a school is receiving a Reading First grant, it is generally expected that all K – 3 children enrolled in a Reading First school participate in the DIBELS and other Reading First required assessments. As a part of the overall No Child Left Behind federal legislation, states are accountable for 99% of those students. Therefore, careful consideration must be given to any decision not to include a student in the progress monitoring (and other required Reading First) measures. It is recommended that school staff work closely with district administrators of the Reading First grant before deciding that DIBELS as a whole is inappropriate for a specific student. With some students, it may be appropriate to administer only one or several of the required measures due to sensory limitations.
- Are there accommodations that can be made to the DIBELS administration procedures to ensure valid assessment of students with special behavioral and/or cognitive needs?
The University of Oregon has established accommodations that can be made to the administration procedures when certain conditions indicate that the assessment of the student will be more accurate if minor adjustments are made to the standardized procedures. For example, students with limited vision would profit from having the student materials enlarged, and students who have difficulty following oral directions would profit from a repetition of the directions or the examples that are given to help the student understand the requirements of the measure. One accommodation that cannot be made is to change the timing of the measures. DIBELS is a fluency measure, and the performance of each student must be calculated within the timed procedures in order to determine the level of fluency the student has with the early literacy skill. It is recommended that any accommodation be implemented only when an accurate assessment of the student would otherwise be in jeopardy. It is further recommended that assessment team members make such a decision in consultation with the school reading coach and/or others on the assessment team. In this way, accommodations will be considered carefully and will not be interpreted as unlimited permission to deviate from the standardized administration procedures. These accommodations are available with the revised 6th Edition of the Administration and Scoring Guide for DIBELS.
- How are students with limited English proficiency assessed with DIBELS?
The FCRR has translated the administration directions for DIBELS into Spanish and Haitian Creole. Additional translations are being secured and will be made available to districts early in the 2003-2004 school year. The responses of students assessed with translated directions are scored on the basis of English responses. The assessment team members, unless they are proficient in the language of administration, may need the assistance of individuals proficient in the student’s language to be certain that the student has a full grasp of the specific requirements of the individual measures.
DIBELS student materials are being translated into Braille as well. These will be distributed to requesting schools as soon as available.
- Can teachers become trained to use DIBELS for frequent progress monitoring of selected students?
Yes. Teachers should be supported in learning the DIBELS measures that are specific to their grade level. Once the reading coach or others on the assessment team have been trained to provide training to others, they can facilitate teacher training for more frequent, ongoing monitoring of student progress with the DIBELS measures. Materials for the more frequent, ongoing progress monitoring can be purchased from Sopris West or downloaded from the University of Oregon website.
- If someone has been trained to administer DIBELS and has conducted numerous administrations of all of the measures, can that person train others in DIBELS?
The requirements to train others in the administration and scoring procedures for DIBELS are specific to individual states and school districts. It is important that every person who is trained in DIBELS is afforded the same opportunities and experiences as every other person. That assurance is made possible through a “Train the Trainer” model. Every Reading First school and district will benefit from a large group of educators who are skilled in the DIBELS assessment and in the DIBELS training areas.
- There appears to be some overlap and perhaps conflicts in the assessment requirements of several grants that have been provided to one or more schools. How can this be resolved?
These conflicts can only be resolved at the district level by the individuals who are responsible for the grant requirements. Any school or district participating in more than one reading grant should be aware of, and must agree to abide by, the individual grant requirements.
- How can a school determine when their specific DIBELS assessments will take place?
The window for each assessment is determined by the individual state or district, and this information is most likely found in the state’s Reading First grant or available from the state Department of Education.
- Is parent consent required for DIBELS assessment?
No, parent consent is not required because all children will be participating in the assessment. It is important that parents are informed about the type of assessment that will be taking place with the K – 3 students during the year. In turn, teachers are encouraged to send home a parent report after each assessment period and to be prepared to discuss ways in which parents can support the child’s reading achievement at home.
- Once a child has achieved the “low risk” level of proficiency, can they be excluded from the next assessment?
No. There are several reasons to include these students in later DIBELS assessments of a measure on which they have shown proficiency. First, many of the probes require higher levels of proficiency at each assessment period. Should a student “plateau” or not continue to improve between the two assessments, it is very likely that the student would fall into the “moderate risk” range and need instructional supports to meet the goals for the following assessment. Second, the teacher, school, district, and state need information about the success of reading instruction for students at all levels of reading proficiency. We hope that all students will show meaningful growth between assessment periods; without measuring the growth of all students, we will not be able to determine if this is happening.
- What should be done if there is some reason to believe that the score obtained by a student is not an accurate reflection of the student’s skill?
The best way to handle such a situation is to repeat the assessment on a different day, about a week later if possible. The repeated measure scores should be entered into the student’s record. Retest effects should be minimal in this situation.
- It seems more efficient to test an entire classroom or group of children before taking the time to score each form. Is this acceptable?
It is best to score each child’s performance as soon as the measure is completed. It is okay to prorate scores (such as on ISF) after completing all of the initial raw score calculations. The important thing to remember when calculating scores is to double-check all scoring; if the examiner determines scores at the conclusion of each testing, it is then possible to verify the scores before turning them in to the person responsible for entering them into the student’s record or the system for maintaining DIBELS data.
- When students are known to be well below grade level, shouldn’t we be using DIBELS measures at their functioning level rather than their assigned grade level?
The measures that are administered for Reading First purposes should always be at the student’s assigned grade level. It is possible, after the required assessments have taken place, to determine these students’ performance on lower grade DIBELS measures. For example, a third grader who is weak in ORF might be assessed on NWF to determine if decoding skills are hampering reading fluency. These scores should be shared with the teacher when deciding intervention options.
- Is it permissible to administer DIBELS during the 90-minute uninterrupted reading block?
Yes. Each student misses only a very brief period of instruction over the entire school year. Consideration should be given to ensuring minimal distractions during this instructional time by working with the teacher on a testing process that addresses the class disruption issue.
- After viewing reports for her students, a teacher does not agree with the child’s performance and requests that the child be retested. Is this permissible?
If the teacher challenges a score as being invalid (i.e., inconsistent with all other information that the teacher has about the student), it would be permissible to retest with as much time as possible between the two assessments. The same measures can be used (scores may be slightly inflated by this method), or alternate measures may be downloaded from the University of Oregon website. Best practice would suggest that the student be re-evaluated more than once, but this is probably not feasible because there would not be enough time within the testing window to do repeated testing over several weeks. If the student's score was significantly lower than expectations, the reason for this should be explored. It could be that the child was simply having an off day. There could also be a test administration issue in which the examiner erred in conducting the assessment or scoring the performance. If the teacher questions the score because it is much higher than expected, it would be advisable to first check for a scoring error. It may be that the examiner did not time or score correctly, and this resulted in an inflated score. There is also the possibility, of course, that the student does possess these higher-level skills.
- What are some of the more common errors that might be made when first learning to administer the DIBELS measures?
We have found that the most common errors made by individuals when they are first learning to administer and score the DIBELS measures are:
a. Starting and stopping the stopwatch at incorrect times
b. Not reading the directions verbatim
c. Not following the discontinue rules
d. Not using the correct prompts when a student does not follow the directions
e. Practicing the measures where other children can overhear
f. Administering the measures in locations where interruptions occur or the student cannot be heard
g. Using the assessment time as a “teaching” moment
h. Overemphasizing the phonemes when administering phoneme segmentation fluency and initial sound fluency
i. Calculating or transferring scores incorrectly
j. Prorating scores or determining median scores incorrectly
It is important that DIBELS trainers provide technical assistance and support to individuals as they learn to administer the measures so that these and other errors are avoided.
Initial Sound Fluency
- If students are not familiar with the names that are provided for the pictures (e.g., frame, hamster, etc.), won’t this interfere with their ability to respond correctly?
Yes, this is very possible. Therefore, the examiner should spend a little time being sure that the student knows the labels that are being assigned to the pictures. This can be done before assessment begins by telling the child the name of the picture followed by asking the child to give the name. Once all pictures have been reviewed and the label seems to be known, the assessment can begin.
- When calculating the score for ISF with the prorating formula, how should the score be reported if it is carried out to the decimal place?
Only whole numbers are entered into the student’s record. Scores of .5 or above should therefore be rounded up to the next whole number, and scores below .5 should be rounded down.
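As a sketch of this rounding rule applied to a prorated ISF score (the function name is illustrative, and the 60 × correct sounds ÷ seconds proration shown here is an assumption for the example, not official scoring software):

```python
import math

def prorated_isf(correct_sounds, seconds):
    """Prorate an ISF score to correct initial sounds per minute,
    then round to a whole number (.5 and above rounds up)."""
    raw = 60 * correct_sounds / seconds
    # Python's built-in round() uses banker's rounding, so apply the
    # ".5 rounds up" rule explicitly with floor(x + 0.5).
    return math.floor(raw + 0.5)

print(prorated_isf(12, 45))  # 60*12/45 = 16.0 -> 16
print(prorated_isf(11, 40))  # 60*11/40 = 16.5 -> 17 (rounds up)
```

Note that the explicit `floor(x + 0.5)` step matters: a plain `round(16.5)` in Python would return 16, not the 17 this rule requires.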
- Why is ISF only administered for the first two assessment periods in kindergarten? This is a difficult task for students at this grade level.
This very early stage of phonemic awareness should be mastered by the third quarter of kindergarten. If students have not acquired competency in this skill and teachers are working on building recognition and production of beginning, ending and medial sounds, additional measures of ISF can be administered for information purposes. These measures are available at the University of Oregon website: http://dibels.uoregon.edu.
Nonsense Word Fluency
- If a student sounds out each letter of the word correctly then follows this by blending the sounds incorrectly, how should the item be scored?
The student should get full credit for the correct sounding out of the letters. If this is a recurring pattern, it may be mentioned to the teacher that the student needs additional skill building on the blending of words after decoding. The reason for this scoring decision is that, when students are first learning to sound out individual letters, we are not as concerned about the direction that they sound out the letters but that they know the letter sounds. However, when they are sounding out complete words, we definitely want them to read the words in the correct order, from left to right.
- Why don’t we use real words for decoding skills assessment rather than non-words?
The nonsense words are used because students differ on their sight word recognition skills. By using non-words, we can more accurately assess their ability to match letters to sounds and their ability to decode an unknown word when it is presented.
Oral Reading Fluency
- What are the readability levels of the ORF passages?
All passages were evaluated with numerous readability formulas before being selected for DIBELS reading fluency assessment. Although the formulas reveal differences in readability, the developers of DIBELS selected the passages that conformed most closely to mid-grade readability for the grade level to which they were assigned.
- How is ORF scored when a student skips an entire line? Are the words in the omitted line counted as errors, or is the omitted line ignored in the calculation of the fluency score?
With Oral Reading Fluency, if a child skips an entire line of text, each word in the line is counted as an error. This scoring rule is in contrast to the rule for skipped lines with the Letter Naming Fluency measure. With LNF, an entire line that is skipped is not counted as letters read correctly or incorrectly. The reason for counting skipped lines as errors of omission with ORF is that omitting words when reading text significantly impacts comprehension. The words-read-correctly score does not change whether the skipped line is ignored or counted as an error; however, the error rate is a helpful number in determining a student's instructional level in reading.
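A minimal sketch of this rule (the function and parameter names are illustrative): charging the skipped line as errors of omission changes only the error count, never the words-read-correctly score.

```python
def orf_scores(words_correct, errors_in_lines_read, skipped_line_word_counts):
    """Return (words read correctly, total errors) for an ORF passage.
    Each word in a skipped line is an error of omission, but the
    words-read-correctly score is unaffected."""
    omission_errors = sum(skipped_line_word_counts)
    return words_correct, errors_in_lines_read + omission_errors

# A student reads 52 words correctly with 3 errors and skips one 9-word line:
correct, errors = orf_scores(52, 3, [9])
print(correct, errors)  # 52 12
```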
- Some people may fail to use the discontinue rule for the Oral Fluency section and continue to administer the second and third passages. Do you take the score from the first passage since the administration should have been discontinued or take the median score from all three passages?
The purpose of the discontinue rule is to stop testing when there is little chance that you will gain additional, meaningful information from continued testing. In the case of ORF, if the student scores less than “10” on the first passage, he or she is more than likely going to read somewhere in that range on the other passages. This student is essentially a non-reader, this experience is frustrating, and continued testing with additional passages is probably not going to yield different results.
However, if one does go ahead and administers the other two passages, it makes sense to take the median score; and this is recommended in this situation.
- The scoring form for ORF indicates that the “middle score” is recorded as the student’s score. Is the “middle score” the same as the “median?”
Yes. We often encountered confusion over the more technical term, “median,” and chose to use the words “middle score.” This score is obtained by recording the three scores and crossing out the lowest and the highest; the remaining score is the “middle” or “median” score representing the student’s ORF.
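The “cross out the lowest and highest” procedure is simply the median of the three passage scores, for example (the function name is illustrative):

```python
def middle_score(passage_scores):
    """Return the middle (median) of three ORF passage scores by
    discarding the lowest and highest."""
    low, middle, high = sorted(passage_scores)
    return middle

print(middle_score([47, 62, 55]))  # 55
```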
- How might parents, teachers, principals, district administration, and the state use the data from DIBELS?
DIBELS was developed to identify students at risk of reading failure so that appropriate interventions could be put into place that would remediate the deficiencies. In that spirit, the information gained from DIBELS assessment should be used to support children in their acquisition of reading skills, to determine how children can best be served within their classroom, to evaluate the effectiveness of remedial instruction, and to determine the resource and staff development needs of teachers to achieve success for all students. Each school district and the state will likely consider DIBELS data while evaluating the effectiveness of the state’s Reading First grant to accomplish the goals set forth in the application for federal funding.
- Should DIBELS be used to make decisions about student promotion or retention, to evaluate teachers, or to determine if a teacher should receive merit pay based on student performance on DIBELS measures?
DIBELS was not designed for any of these purposes. Certainly the performance of a student on any measure of reading readiness or reading achievement is an important consideration when determining how a student’s educational needs can best be met in future years. However, the DIBELS subtests should only be one aspect of a broad array of information that is used for making these important decisions.
Teachers should be supported for using DIBELS as a reading progress monitoring measure, for applying the DIBELS data to decisions about grouping, instruction, and remedial strategies and for taking frequent DIBELS measures during the implementation of interventions. The use of individual or group DIBELS data as a teacher evaluation information source should be avoided.