{| style="margin:0 0 0 0; background:none; width:100%; margin-top:2px; background:transparent;"
| style="width:50%; border:2px solid #a7d7f9; vertical-align:top; color:#000; padding: 0px 0px 0px 0px; -moz-border-radius: 20px; -webkit-border-radius: 10px; border-radius:5px;" |
<div class="wikipedia-ko manual main-box" style="width: 100%; margin-top: 5px; flex: 1;">
<div class="wikipedia-ko participation-header" style="width: 100%; font-size: 1.3em; overflow: auto;"></div>
{| cellpadding="2" cellspacing="5" style="width:50%; vertical-align:top; background:transparent;"
<!-- To place text in box 1, put it here -->
|}
{| style="width:100%;border: 0; background-color: #ffffff" cellpadding="0" cellspacing="10"
| style="width:20%; height:10%; vertical-align: top; border:1px solid #ffffff; border-radius: 10px; background-color: #FFFFFF" |
<div style="font-size: 18px; color:#9767C6;"> <div style="display: inline-block; margin-left: 10px; padding-left: 10px;">
<div style="font-size: 16px; font-weight: bold; text-align: center;">
 
PROGRAM EVALUATION PLAN </div>
<hr />
<div style="font-size: 14px; text-align: center;"></div>
| style="width:82%; height:10%; vertical-align: top; border:1px solid #ffffff; border-radius: 10px; background-color: #FFFFFF" |
This guide is written specifically for people who want to evaluate their school's or system's CLISE implementation but have not been trained in program evaluation and are not working with a professional evaluator. It is not a general guide for evaluating school programs; it addresses only the particulars of evaluating the CLISE program.
 
|-
|}
|}
<br>
 
<div style="font-size: 18px; color:#A880CF">'''Why Evaluate?'''</div>
 
There are a variety of reasons to evaluate CLISE use. In general, the goal is to show that the resources put into the program are paying off, so one of the most common audiences for evaluations is the funder, the Vinschool management board. Another important audience is parents and school staff.
 
Many people choose to evaluate the program to see how it’s working. Evaluation evidence can increase staff motivation and commitment to implementing the program fully and well. Evaluation can also help schools see how implementation might be affecting outcomes and how it might be improved to ensure students are benefitting fully. Evaluation is also useful for tracking progress toward desired program goals and outcomes over time.
<br />
 
{| style="margin:0 0 0 0; background:none; width:100%; margin-top:3px; background:transparent;"
| style="width:50%; border:2px solid #a7d7f9; vertical-align:top; color:#000; padding: 0px 0px 0px 0px; -moz-border-radius: 20px; -webkit-border-radius: 10px; border-radius:5px;" |
<div class="wikipedia-ko manual main-box" style="width: 100%; margin-top: 10px; flex: 1;">
<div class="wikipedia-ko participation-header" style="width: 100%; font-size: 1.3em; overflow: auto;">
<span class="heading" style="display: inline-block; height: 2rem; line-height: 2rem; padding-left: .5rem; padding-right: 1rem; margin: .6rem 0; border-radius: 0 1rem 1rem 0; background-color: #A880CF; color:#FFFFFF;">'''Evaluating Implementation'''</span>  </div>
 
<div style="text-align: center; font-size: 16px; color:#A880CF;"> One of the keys to successful, effective evaluation is to be sure you know exactly what you’re evaluating.  </div>
 
<div style="font-size: 16px; color:#A880CF">What Am I Evaluating?</div>
Every school implements the same CLISE program, but what students actually receive can vary widely. You can make your evaluation more powerful and useful by examining how the program is being implemented in your school. Remember, you’re evaluating the intervention your students actually get, which, depending on implementation, might be more or less like the exact program designed by the Program Department.
 
<br />
<div style="font-size: 16px; color:#A880CF">What Information Should I Gather?</div>
To include implementation in your evaluation, gather information on how the program is being taught in your setting. In particular:
 
*How many students are receiving CLISE lessons? All students? Only certain grades? Only certain classrooms?
*How many of the lessons are being taught?
*How closely do the lessons as taught follow the way they are written?
*Are students doing Advisory Activities?
*What else is being done outside formal lessons to reinforce CLISE skills, both in the classroom and throughout the school?
 
<br>
<div style="font-size: 16px; color:#A880CF">How Do I Gather It?</div>
Collecting data on what students are receiving typically involves having staff complete a simple survey that asks the questions listed above. Some surveys for this purpose are described here:
 
<div style="font-size: 14px; color:#A880CF">'''Implementation Preparedness Survey'''</div>
Collects information about staff readiness to begin teaching lessons and helps you plan for implementation support needs.
 
<div style="font-size: 14px; color:#A880CF">'''Implementation Survey'''</div>
Collects information about program implementation as experienced by those teaching the program. It can function as a formative assessment of the implementation process during the year or as a record of implementation experiences at the end.
 
Surveys should be filled out by the relevant staff. In some schools, homeroom teachers teach most of the lessons, but they still need other teachers’ input to find out how much and what parts of the program students are actually getting, since other teachers are responsible for skill reinforcement.
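
As a rough illustration only, if survey responses are exported to a spreadsheet, a short script can summarize how much of the program each classroom reports delivering. The file name and columns below (classroom, lessons_taught, lessons_total, advisory_activities) are assumptions for the sketch, not part of the official surveys.
<pre>
# Sketch: summarize implementation survey responses per classroom.
# Assumes a CSV export with hypothetical columns:
#   classroom, lessons_taught, lessons_total, advisory_activities ("yes"/"no")
import pandas as pd

responses = pd.read_csv("implementation_survey.csv")

# Share of CLISE lessons each respondent reports having taught
responses["pct_lessons_taught"] = (
    responses["lessons_taught"] / responses["lessons_total"] * 100
)

summary = responses.groupby("classroom").agg(
    pct_lessons_taught=("pct_lessons_taught", "mean"),
    advisory_activity_rate=("advisory_activities", lambda s: (s == "yes").mean()),
)
print(summary.round(1))
</pre>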
 
<br />
 
<div style="font-size: 16px; color:#A880CF"> What's Implementation Fidelity?</div>
Surveying staff on how the program is being taught can go beyond examining how many students are receiving how many lessons. Implementation evaluation can also look at the “fidelity” of implementation, meaning the extent to which the program is taught as designed.
 
A full implementation ideally means students are receiving all the lessons in order and all the concepts and skills in each lesson. For a variety of reasons, staff sometimes only teach parts of lessons and skip others, teach lessons out of order, or change some of the content. These are all examples of low fidelity. Obviously it’s possible to change lessons in ways that don’t harm or might even improve outcomes, but it’s also possible to change lessons in ways that reduce program effectiveness. The Program Department recommends implementing the program with as much fidelity as possible. It can be useful in an evaluation to know the fidelity with which the program was taught.
{| cellpadding="2" cellspacing="5" style="width:50%; vertical-align:top; background:transparent;"
<!-- To place text in box 1, put it here -->
|}<!-- TOP RIGHT CORNER -->
|}
 
 
{| style="margin:0 0 0 0; background:none; width:100%; margin-top:3px; background:transparent;"
| style="width:50%; border:2px solid #a7d7f9; vertical-align:top; color:#000; padding: 0px 0px 0px 0px; -moz-border-radius: 20px; -webkit-border-radius: 10px; border-radius:5px;" |
<div class="wikipedia-ko manual main-box" style="width: 100%; margin-top: 10px; flex: 1;">
<div class="wikipedia-ko participation-header" style="width: 100%; font-size: 1.3em; overflow: auto;">
<span class="heading" style="display: inline-block; height: 2rem; line-height: 2rem; padding-left: .5rem; padding-right: 1rem; margin: .6rem 0; border-radius: 0 1rem 1rem 0; background-color: #A880CF; color:#FFFFFF;">'''Evaluation Design'''</span>  </div>
 
<div style="text-align: center; font-size: 16px; color:#A880CF;"> Create an evaluation design that aligns with your data-collection resources
It might be helpful to think about your CLISE evaluation as falling somewhere along a spectrum of evaluation rigor. The most rigorous approach is an experimental design, in the middle is what is called quasi-experimental design, and the least rigorous approach is a non-experimental design. Each of these designs and their pros and cons are described below.
</div>
 
<div style="font-size: 16px; color:#A880CF">Experimental Design</div>
One of the main challenges in program evaluation is determining whether any effects you find were in fact caused by the program you’re evaluating. In any given school, CLISE is only one of many factors affecting students’ attitudes and behaviors. The purpose of an experimental design is to increase your confidence that changes you find in students were caused by their exposure to the program.
 
This is primarily accomplished through random assignment. Random assignment means you determine which students will be involved in the study (your study population), and each of those students has an equal chance of either being taught the program or not. Random assignment is a powerful way to create two groups that are as likely as possible not to be significantly different. This goes a long way toward ruling out differences in outcomes being due to initial differences in the students being studied.
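
For illustration only, the idea of random assignment can be sketched in a few lines of code; in practice, as described next, CLISE evaluations would need to randomize whole schools rather than individual students. The student IDs below are hypothetical.
<pre>
# Sketch: randomly assign a study population to program vs. control,
# so each unit has an equal chance of ending up in either group.
import random

study_population = [f"student_{i}" for i in range(1, 201)]  # hypothetical IDs

random.seed(42)                     # fixed seed so the split is reproducible
random.shuffle(study_population)

half = len(study_population) // 2
program_group = study_population[:half]
control_group = study_population[half:]
</pre>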
 
For complicated technical reasons, random assignment for evaluating a social-emotional learning program requires assigning entire schools to either implement the program or not (the ones that don’t implement serve as non-intervention controls). In addition, for statistical reasons, a large number of schools must be involved in the evaluation. Scientifically valid experimental design evaluations of CLISE commonly involve 30 to 60 or more schools. A study this large is typically not feasible for Vinschool at the moment, and since experimental design requires randomizing entire schools, this approach can’t be done by an individual school.
 
<br />
 
<div style="font-size: 16px; color:#A880CF">Quasi-Experimental Design</div>
Quasi-experimental designs are a way to try to assess program effects when random assignment isn’t possible. Rather than a randomly selected control group, a quasi-experimental design includes a comparison group. Comparison groups are made up of students who aren't receiving the program. The key to creating a good comparison group is attempting to match the students as closely as possible to those receiving CLISE lessons. The more alike the two groups are, the more useful the comparison group data will be. The most common way to match comparison group students (or classrooms or schools) to those getting CLISE lessons is by using demographics, such as age, race or ethnicity, gender, income, etc.
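
As a simple sketch of demographic matching, assuming a student roster with hypothetical columns for grade, gender, and program status, comparison students could be drawn from the same demographic groups as the CLISE students:
<pre>
# Sketch: build a demographically matched comparison group.
# Assumes a roster CSV with hypothetical columns:
#   student_id, grade, gender, receives_clise (True/False)
import pandas as pd

roster = pd.read_csv("student_roster.csv")
clise = roster[roster["receives_clise"]]
pool = roster[~roster["receives_clise"]]

matched = []
for (grade, gender), group in clise.groupby(["grade", "gender"]):
    candidates = pool[(pool["grade"] == grade) & (pool["gender"] == gender)]
    # Draw the same number of comparison students as CLISE students, if possible
    n = min(len(group), len(candidates))
    matched.append(candidates.sample(n=n, random_state=0))

comparison_group = pd.concat(matched)
</pre>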
 
The drawback of the quasi-experimental approach is that, compared with random assignment, you ultimately have less certainty that the students in the two groups are alike to begin with, and differences between the groups unrelated to the CLISE program may account for part of the differences you find in outcomes. However, this approach is a reasonable way to increase the strength of an evaluation.
 
<br />
 
<div style="font-size: 16px; color:#A880CF">Non-Experimental Design</div>
A non-experimental design means gathering data on children who receive CLISE only, without any control or comparison children involved. This approach is often the most feasible for many schools. Just keep in mind that it can’t tell you whether any outcomes you find were actually caused by the program. It may be that the program is causing the changes you find, or it could be that schools using the program are also doing other things that benefit children and cause the changes you’re finding.
 
The clear advantage of not including control or comparison groups in your evaluation is that it’s simpler and relatively inexpensive.
 
The primary approach used in non-experimental CLISE evaluation is to collect data before and after the program is implemented. This information is often called pre- and post-test data. Getting this information typically involves measuring outcomes for students and/or staff in the fall and again in the spring.
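
A minimal sketch of that pre/post comparison, assuming fall and spring survey scores are exported with a hypothetical score column, might look like this:
<pre>
# Sketch: compare average outcome scores before (fall) and after (spring).
# Assumes two CSV exports with hypothetical columns: student_id, score
import pandas as pd

fall = pd.read_csv("fall_scores.csv")
spring = pd.read_csv("spring_scores.csv")

# Keep only students measured at both time points (paired pre/post data)
paired = fall.merge(spring, on="student_id", suffixes=("_fall", "_spring"))

change = paired["score_spring"] - paired["score_fall"]
print(f"Students with both measurements: {len(paired)}")
print(f"Average fall score:   {paired['score_fall'].mean():.2f}")
print(f"Average spring score: {paired['score_spring'].mean():.2f}")
print(f"Average change:       {change.mean():+.2f}")
</pre>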
 
Although it’s difficult to know how much of the change (positive or negative) from the first to the second semester was caused by CLISE, there are ways to make this evaluation approach stronger and more informative. Student behavior typically changes from the beginning to the end of the school year, regardless of what programs you’re using. Students often start the school year out on their best behavior, but by the end of the year their behavior can look worse - even if you implement the program and it’s working. It may be that students are having more conflicts and problems by the end of the year, but without CLISE lessons those increases would have been much larger.
 
<div style="font-size: 14px; color:#A880CF">'''Strengthening Your Non-Experimental Evaluation'''</div>
One way to tease out these types of effects and strengthen a simple pre/post evaluation is to collect data across multiple years. It can be particularly useful, once a fall baseline is established, to collect data each spring. It often takes time for staff to become familiar with the program, so implementation quality can improve over time, yielding better outcomes when the program has been in place longer. Tracking data across multiple years also allows you to see the cumulative effect of students receiving a larger dose of the program. CLISE isn’t intended as a one-year intervention. It’s carefully designed so each year’s lessons build on those that came before. Collecting data on outcomes across multiple years allows you to capture that growth.
 
A final way to strengthen a non-experimental approach to evaluation is to examine implementation. In some schools, implementation will vary - some students will get more lessons than others, some staff will implement the lessons more fully than others, and some staff will reinforce skills more than others. If you’re collecting data from staff on implementation, you may be able to compare outcomes for students who received different amounts, or doses, of the program. If students who received more lessons or more reinforcement show better outcomes, that can help you see how to increase outcomes for more students.
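
If implementation data from staff can be joined to outcome data, a rough dose comparison can be sketched as follows; the file name and columns (lessons_received, spring_score) are assumptions for illustration.
<pre>
# Sketch: compare outcomes by program "dose" (number of lessons received).
# Assumes one merged table with hypothetical columns:
#   student_id, lessons_received, spring_score
import pandas as pd

data = pd.read_csv("outcomes_with_dose.csv")

# Split students into low / medium / high dose groups by lessons received
data["dose_group"] = pd.qcut(
    data["lessons_received"], q=3, labels=["low", "medium", "high"]
)

print(data.groupby("dose_group", observed=True)["spring_score"].agg(["count", "mean"]))
</pre>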
{| cellpadding="2" cellspacing="5" style="width:50%; vertical-align:top; background:transparent;"
<!-- To place text in box 1, put it here -->
|}<!-- TOP RIGHT CORNER -->
|}
 
 
{| style="margin:0 0 0 0; background:none; width:100%; margin-top:3px; background:transparent;"
| style="width:50%; border:2px solid #a7d7f9; vertical-align:top; color:#000; padding: 0px 0px 0px 0px; -moz-border-radius: 20px; -webkit-border-radius: 10px; border-radius:5px;" |
<div class="wikipedia-ko manual main-box" style="width: 100%; margin-top: 10px; flex: 1;">
<div class="wikipedia-ko participation-header" style="width: 100%; font-size: 1.3em; overflow: auto;">
<span class="heading" style="display: inline-block; height: 2rem; line-height: 2rem; padding-left: .5rem; padding-right: 1rem; margin: .6rem 0; border-radius: 0 1rem 1rem 0; background-color: #A880CF; color:#FFFFFF;">'''Using School Data'''</span> </div>
 
<div style="text-align: center; font-size: 16px; color:#A880CF;"> How to use school data in your evaluation process  </div>
<br />
 
<div style="font-size: 16px; color:#A880CF"> Using School Data for Evaluation </div>
Schools collect data as part of their everyday operations, and the most commonly used school data is probably discipline referral data. Many schools look at their disciplinary referrals over time as a way to see whether CLISE implementation has resulted in fewer problem behaviors. One of the advantages of this approach is that schools can often compare the number of referrals for the year before they implemented the CLISE program to the number once the program has been in place.
 
It’s also possible to track referrals over time to see whether the program results in fewer students having behavioral problems once it’s been in place for multiple years.
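
As a sketch, if discipline referrals are exported with a date, they can be counted per school year and compared with the year before implementation. The file name, column names, and the assumption that the school year starts in August are all hypothetical.
<pre>
# Sketch: count discipline referrals per school year.
# Assumes a referral log CSV with hypothetical columns: referral_date, student_id
import pandas as pd

referrals = pd.read_csv("discipline_referrals.csv", parse_dates=["referral_date"])

# Label each referral with its school year (assumed to start in August)
def school_year(date):
    start = date.year if date.month >= 8 else date.year - 1
    return f"{start}-{start + 1}"

referrals["school_year"] = referrals["referral_date"].apply(school_year)

print(referrals.groupby("school_year").size())                    # referrals per year
print(referrals.groupby("school_year")["student_id"].nunique())   # students referred
</pre>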
 
Although it’s possible to look at other types of school data for evaluation purposes, disciplinary referrals are the most common and safest source of information on CLISE outcomes. Things like attendance, grades, and test scores can be affected by the program, but its effect on those outcomes is less direct and can be harder to see.
{| cellpadding="2" cellspacing="5" style="width:50%; vertical-align:top; background:transparent;"
<!-- To place text in box 1, put it here -->
|}<!-- TOP RIGHT CORNER -->
|}
 
 
{| style="margin:0 0 0 0; background:none; width:100%; margin-top:3px; background:transparent;"
| style="width:50%; border:2px solid #a7d7f9; vertical-align:top; color:#000; padding: 0px 0px 0px 0px; -moz-border-radius: 20px; -webkit-border-radius: 10px; border-radius:5px;" |
<div class="wikipedia-ko manual main-box" style="width: 100%; margin-top: 10px; flex: 1;">
<div class="wikipedia-ko participation-header" style="width: 100%; font-size: 1.3em; overflow: auto;">
<span class="heading" style="display: inline-block; height: 2rem; line-height: 2rem; padding-left: .5rem; padding-right: 1rem; margin: .6rem 0; border-radius: 0 1rem 1rem 0; background-color: #A880CF; color:#FFFFFF;">'''Using Evaluation Results'''</span>  </div>
 
<div style="text-align: center; font-size: 16px; color:#A880CF;"> How both positive and poor outcomes can be used to refine your program implementation  </div>
<br />
<div style="font-size: 16px; color:#A880CF">Positive Outcomes</div>
Congratulations! Your evaluation has shown that your CLISE implementation has improved outcomes for students. This is the time to ensure that your school or system continues to teach the program and supports what students are learning in CLISE lessons throughout the school day and the school environment. Remember that ongoing support for the program by building leaders has been shown to be the number one factor that drives continued successful implementation over time. Share the good news with school staff, system staff, parents, and the community so your efforts continue to be sustained and supported.
 
<br />
<div style="font-size: 16px; color:#A880CF">Poor Outcomes</div>
 
<div style="font-size: 14px; color:#A880CF"> '''With No Implementation Evaluation'''</div>
If your evaluation suggests students aren't benefitting sufficiently from the CLISE program, a natural place to look for reasons is implementation. As discussed in the Evaluating Implementation section, how the program is implemented is very important and has been shown to affect outcomes. If you haven’t examined CLISE implementation as part of your evaluation, doing so may provide you with ideas for how to improve the impact on students.
<br />
 
<div style="font-size: 14px; color:#A880CF"> '''With Implementation Evaluation'''</div>
If your evaluation included implementation, then poor outcomes indicate the importance of looking closely at how the program is being implemented to see where there's room for improvement that may increase program effects.
 
Keep in mind that high quality CLISE implementation goes beyond teaching the lessons. Just as with academics, CLISE skills have to be reinforced and practiced in order to be mastered. Look for ways staff can cue students to use CLISE skills throughout the school day and school environment, and find ways to reinforce students’ skill use.
 
If it appears that CLISE implementation in your setting has been done well, it can be harder to know where to turn if you’re not finding sufficiently positive outcomes from your evaluation. Keep in mind that a truly rigorous evaluation requires random assignment of a large number of schools, and that quasi-experimental or non-experimental evaluations can make it hard to separate CLISE effects from other factors. Also recall that positive program outcomes may be masked in a one-year pre/post evaluation, because behaviors typically worsen from fall to spring. A lack of findings may result from changes in student behavior throughout the school year, despite positive program effects.
 
If your one-year evaluation produces disappointing results, remember that the program is designed to have a cumulative effect across multiple years, and that teaching it, like anything else, takes time to master. A one-year evaluation doesn't necessarily capture program effects well, and it may be that data collected across more than one year will tell a different and more positive story.
{| cellpadding="2" cellspacing="5" style="width:50%; vertical-align:top; background:transparent;"
<!-- To place text in box 1, put it here -->
|}<!-- TOP RIGHT CORNER -->
|}
 
 
{| style="margin:0 0 0 0; background:none; width:100%; margin-top:3px; background:transparent;"
| style="width:50%; border:2px solid #a7d7f9; vertical-align:top; color:#000; padding: 0px 0px 0px 0px; -moz-border-radius: 20px; -webkit-border-radius: 10px; border-radius:5px;" |
<div class="wikipedia-ko manual main-box" style="width: 100%; margin-top: 10px; flex: 1;">
<div class="wikipedia-ko participation-header" style="width: 100%; font-size: 1.3em; overflow: auto;">
<span class="heading" style="display: inline-block; height: 2rem; line-height: 2rem; padding-left: .5rem; padding-right: 1rem; margin: .6rem 0; border-radius: 0 1rem 1rem 0; background-color: #A880CF; color:#FFFFFF;">'''Outcome Measures'''</span>  </div>
 
<div style="text-align: center; font-size: 16px; color:#A880CF;"> Research-validated evaluation tools
It’s important to choose carefully developed and tested tools for your CLISE evaluation. The basic approach to looking at data from surveys is to compare averages across surveys administered at different times. The following are survey measures we recommend you use.
</div>
 
<div style="font-size: 16px; color:#A880CF">Panorama</div>
Panorama Education’s social-emotional learning measurement platform aligns with the CLISE program. Panorama’s Teacher Perception of Students’ SEL surveys can be used for Grade 1 through Grade 8, and the Student Perception of SEL self-reports are available for Grades 3 through 8. Panorama helps educators collect, analyze, and act on data about social-emotional learning, school climate, and more. Using Panorama’s customizable reporting, educators can analyze their data by subgroups - such as race or ethnicity, gender, and Title I status - and they can explore data at the individual, class, grade, school, and system levels.
 
<br />
 
<div style="font-size: 16px; color:#A880CF">Strengths and Difficulties Questionnaire (SDQ)</div>
The Strengths and Difficulties Questionnaire is a brief behavioral screening questionnaire for use with 3- to 16-year-olds. It asks about 25 attributes, some positive and some negative, on five different scales: emotional symptoms, conduct problems, hyperactivity/inattention, peer relationship problems, and prosocial behavior.
 
<br />
 
<div style="font-size: 16px; color:#A880CF">Devereux Student Strengths Assessment - Second Step Edition (DESSA-SSE)</div>
The Devereux Student Strengths Assessment - Second Step Edition is a 36-item standardized, norm-referenced behavior rating scale for Kindergarten through Grade 5. It’s designed to assess students’ skills related to social-emotional competence, resilience, and academic success. It measures four key social-emotional competencies taught in CLISE lessons: skills for learning, empathy, emotion management, and problem solving.
 
The DESSA is entirely strength-based. It focuses only on positive behaviors (such as getting along with others) rather than maladaptive ones (such as annoying others). It can be completed by parents, teachers, or staff at schools and child-serving agencies.
 
The DESSA can:
 
*Provide schools with a strength-based measure of social-emotional competence for Kindergarten through Grade 5 students
*Be used as part of a CLISE needs assessment process
*Allow schools to evaluate the impact of CLISE at the child, classroom, and school levels
 
Apperson, a trusted provider of performance assessment measures since 1955, offers an online version of the DESSA-SSE. Educators can score and track changes in social-emotional competencies quickly and easily, and results can be accessed on demand from anywhere.
 
<div style="font-size: 14px; color:#A880CF">'''DESSA Resources''' </div>
 
''DESSA Manual''
 
''Brief DESSA User's Guide''
 
''DESSA Classroom Profile (XLS)''
 
''DESSA Teacher Individual Student Profile''
 
''DESSA Parent Individual Student Profile''
{| cellpadding="2" cellspacing="5" style="width:50%; vertical-align:top; background:transparent;"
<!-- To place text in box 1, put it here -->
|}<!-- TOP RIGHT CORNER -->
|}
