Appendix 6-12

The Friedman Model in Action

Sample Strategy to Evaluate Progress

To assist the school team in creating an evaluation strategy before beginning the initiative, a suggested plan using Friedman’s steps is presented below, followed by examples that illustrate Friedman’s four-quadrant model in action.

  1. Identify and establish a politically grounded set of results.
  2. Distribute a general survey to teaching staff to collect baseline data (Appendix 6-13, Keeping Quality Teachers Survey, is an example of such an instrument), or use the self-assessment checklists to help determine what is working and what areas need more focus. Establish a baseline, report the “story behind the baseline” (the history), and develop forecasting trend lines.
  3. Select performance measures and indicators that communicate whether the results are being met. Review the “Potential Evaluation Plan for Performance Indicators and Targets” in Appendix 6-10 and the baseline data collected to date. Are there additional data that need to be collected to measure impact?
  4. Review strategies and resources to assist in turning the curve away from the baseline.
  5. Involve partners in implementing research-based strategies to produce the desired results (e.g., the State Department of Education, IHEs).
  6. Begin implementation of the selected strategies while continuing to look for new ones that will stand the test of time.
  7. Use a feedback loop to review the success of the strategies and make corrections as needed.
  8. Distribute the general survey to teaching staff to collect post-implementation data. Were the desired results achieved?
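            To make steps 2 and 7 of this plan concrete, the sketch below shows one way a team might establish a baseline and project a forecasting trend line (“the curve” the initiative is trying to turn). It is a minimal illustration in Python; the retention figures and years are invented for the example, not drawn from the survey itself.

```python
def linear_trend(years, values):
    """Fit a least-squares line to baseline data; return (slope, intercept)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values)) \
        / sum((x - mean_x) ** 2 for x in years)
    return slope, mean_y - slope * mean_x

# Hypothetical baseline: % of new teachers retained, by school year.
baseline_years = [2001, 2002, 2003, 2004]
retention_pct = [78.0, 76.5, 74.0, 72.5]

slope, intercept = linear_trend(baseline_years, retention_pct)
for year in (2005, 2006, 2007):
    forecast = slope * year + intercept
    print(f"{year}: forecast {forecast:.1f}% retention if nothing changes")

# Step 7's feedback loop compares post-implementation data against these
# forecasts to show whether the strategies are "turning the curve."
```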

 Improving Working Conditions:
Performance Indicators and Targets Example

End/Results:  

Means/Strategies:     

Indicators and performance measures are then matched to the effort/effect, quantity/quality standards. By returning to the “Working Conditions: Self-Assessment Instrument,” specific areas that may have been rated “never” or “seldom” can be targeted for focus. These are the data points that will be collected, analyzed, and reported over the period of the project.
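Because this matching step recurs in each example that follows, a small sketch may help. The snippet below, a hypothetical illustration rather than part of Friedman’s materials, tags each indicator by dimension (effort vs. effect) and standard (quantity vs. quality) and sorts it under the quadrant question it answers; the indicator names are taken from the example that follows.

```python
# Friedman's four quadrant questions, keyed by (dimension, standard).
QUADRANT_QUESTIONS = {
    ("effort", "quantity"): "How much did we do?",
    ("effort", "quality"):  "How well did we do it?",
    ("effect", "quantity"): "Is anyone better off? (#)",
    ("effect", "quality"):  "Is anyone better off? (%)",
}

# Each indicator is tagged with the dimension and standard it measures.
indicators = [
    ("# of parents represented in policy making", "effort", "quantity"),
    ("% of parents reporting satisfaction with their participation in policy making", "effort", "quality"),
    ("# of families reporting increased involvement", "effect", "quantity"),
    ("% of families reporting increased involvement", "effect", "quality"),
]

# Group indicators under the quadrant question they answer.
quadrants = {}
for name, dimension, standard in indicators:
    question = QUADRANT_QUESTIONS[(dimension, standard)]
    quadrants.setdefault(question, []).append(name)

for question, names in quadrants.items():
    print(question)
    for name in names:
        print(f"  - {name}")
```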

How much did we do?

# of parents represented in policy making
# of parents participating in administration and faculty hiring decisions
# of newsletters/communications to parents providing an update of school/district activities/issues
# of Special Education Advisory Council meetings

How well did we do it?

% of parents reporting satisfaction with their participation in policy making
% of parents reporting satisfaction with their participation in administration and faculty hiring decisions
% of parents reporting satisfaction with newsletters or communications received from the school/district
% of Advisory Council members reporting satisfaction with the meetings, process, accomplishments, etc.

Is anyone better off?

# of families reporting increased involvement (baseline-to-current data changes)
# of new teachers remaining in their current positions for 5 years (periodic data collection)
# of students’ outcome data that can be correlated with teacher longevity (baseline-to-current data changes)

% of families reporting increased involvement (baseline-to-current data changes)
% of new teachers remaining in their current positions for 5 years (periodic data collection)
% of students’ outcome data that can be correlated with teacher longevity (baseline-to-current data changes)


Role of the Administrator:
Performance Indicators and Targets Example

End/Results:

Means/Strategies:

Indicators and performance measures are then matched to the effort/effect, quantity/quality standards. By returning to “The Role of the Administrator in Teacher Retention: Self-Assessment Instrument,” specific areas that may have been rated “never” or “seldom” can be targeted for focus. These represent the data points that will be collected, analyzed, and reported over the period of the project.

How much did we do?

# of revised job descriptions
# of teachers who visit other classrooms
# of consistent discipline policies/procedures developed

How well did we do it?

% of teachers indicating satisfaction with revised job descriptions
% of teachers indicating satisfaction with visits to other classrooms
% of staff indicating satisfaction with discipline policies/procedures

Is anyone better off?

# of job descriptions developed (baseline-to-current)
# of teachers visiting other classrooms (baseline-to-current)
# of schools with consistent discipline policies/procedures (baseline-to-current)

% of administrative policies changed that can be correlated with teacher retention
% of new teachers remaining in their current positions for 5 years (periodic data collection)
% of students’ outcome data that can be correlated with teacher longevity (baseline-to-current data changes)


Induction and Mentoring Programs that Work:
Performance Indicators and Targets Example
 

End/Results:

Means/Strategies:

Indicators and performance measures are then matched to the effort/effect, quantity/quality standards. By returning to the “Developing Effective Mentor Programs” rating rubric, specific areas that may have been rated “inadequate” or “basic” can be targeted for focus. These represent the data points that will be collected, analyzed, and reported over the period of the project.

How much did we do?

# of new teachers and mentors who are paired up
# of mentors compared to new teachers
# of modified schedules and meeting times
# of mentors trained in adult learning theory and cognitive coaching

How well did we do it?

% of new teachers and mentors indicating satisfaction with the selection and matching process
% of teachers indicating satisfaction with the ratio of mentors to new teachers
% of staff indicating satisfaction with modification of schedules and meeting times to enhance mentoring
% of mentors indicating satisfaction with training in adult learning theory and cognitive coaching

Is anyone better off?

# of new teachers and mentors who are paired up (baseline-to-current)
# of mentors compared to new teachers (baseline-to-current)
# of modified schedules and meeting times (baseline-to-current)
# of mentors trained in adult learning theory and cognitive coaching (baseline-to-current)

% of mentoring practices implemented that can be correlated with teacher retention
% of new teachers remaining in their current positions for 5 years (periodic data collection)
% of students’ outcome data that can be correlated with teacher longevity (baseline-to-current data changes)


Partnerships between Schools and Higher Education:
Performance Indicators and Targets Example

End/Results:

Means/Strategies:

Indicators and performance measures are then matched to the effort/effect, quantity/quality standards. By returning to the “Rubric for Assessing the Qualities of Partnerships between Schools and Teacher Preparation Programs at Institutions of Higher Education,” specific areas that may have been rated “drawing board” or “evolving” can be targeted for focus. These are the data points that will be collected, analyzed, and reported over the period of the project.

How much did we do?

# of CPW members attending meetings
# of stakeholder groups represented on the CPW
# of research/inquiry projects at the school
# of student teachers at the school reaching a satisfactory score on a collaboratively designed performance assessment
# of school district staff participating in the two-credit summer course on “Differentiating Instruction”
# of IHE-enrolled students participating in the two-credit summer course on “Differentiating Instruction”

How well did we do it?

% of CPW members reporting satisfaction with their participation (e.g., meetings, projects)
% of school/IHE representatives at meetings
% of IHE/school faculty reporting satisfaction with projects at the school
% of student teachers reporting satisfaction with field placement
% of participants reporting satisfaction with the “Differentiating Instruction” course

 

 

Is anyone better off?

# of CPW members attending meetings (baseline-to-current data changes)
# of stakeholder groups represented on the CPW (baseline-to-current data changes)
# of research/inquiry projects at the school (baseline-to-current data changes)
# of student teachers at the school reaching a satisfactory score on a collaboratively designed performance assessment (baseline-to-current data changes)
# of school district staff participating in the two-credit summer course on “Differentiating Instruction” (# that register and # that complete all requirements)
# of IHE-enrolled students participating in the two-credit summer course on “Differentiating Instruction” (# that register and # that complete all requirements)

% of CPW members attending meetings where the range of stakeholder groups is represented
% of research projects conducted at the school where the faculty member documents instructor and/or student impact
% of student teachers at the school indicating interest in teaching in the district upon graduation
% of school district staff participating in the “Differentiating Instruction” course who implement the strategies in the academic year (baseline-to-current data)
% of IHE-enrolled students participating in the two-credit summer course on “Differentiating Instruction” (baseline-to-current data)