Smarter Balanced Summative Assessment Results for ELA and Math
The group view provides an aggregate and individual display of a selected assessment for an assigned group, a custom group, or a school. Refer to the View Student Group Assessment Results section for details.
Once a user selects a summative assessment to display in the group view, the Results page displays as shown in figure 1.
Figure 1. Summative Assessment Results Page
The Results page for a Smarter Balanced summative assessment displays the following information and elements:
- Name and grade of the assessment
- Group Aggregate panel: Displays aggregated data for the selected group of students
- Select a results view drop-down menu: Offers the following options for displaying results:
- [Results By Student] (default view)
- [Writing Trait Scores] (for ELA only)
- [Target Report] (for summative assessments only; available for student groups of 30 or more)
- [Lexile Report] (for ELA only) or [Quantile Report] (for Math only)
- Results View table
- Display value as field: Contains a toggle that allows a user to switch the student score distribution display between the percentage and the number of students in each reporting category
- [Collapse All] button: Hides the Results View table for all the displayed assessments and toggles to an [Expand All] button, which displays the results again
- [Export CSV] button: Gives the option to download the results in CSV format
- [Overall/Composite/Claim] toggle: Switches among Overall, Composite, and Claim
- The [Overall] toggle displays the Student Score Distribution with four achievement levels.
- The [Composite] toggle displays the Student Composite Claim Score Distribution across three score categories.
- The [Claim] toggle displays four claims with three reporting categories.
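The Display value as toggle described above is simple arithmetic over the same underlying counts. A minimal sketch of the conversion, using a hypothetical function name and example categories (not CERS code):

```python
def score_distribution(counts):
    """Convert per-category student counts into the two displays
    offered by the 'Display value as' toggle.

    counts: dict mapping reporting category -> number of students.
    Returns (numbers, percentages); percentages are rounded to whole
    percents, matching the report display.
    """
    total = sum(counts.values())
    percentages = {cat: round(100 * n / total) for cat, n in counts.items()}
    return counts, percentages

# Example: 30 students across the four achievement levels
numbers, pcts = score_distribution(
    {"Level 1": 6, "Level 2": 9, "Level 3": 9, "Level 4": 6}
)
# pcts -> {"Level 1": 20, "Level 2": 30, "Level 3": 30, "Level 4": 20}
```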
Smarter Balanced Summative Assessment Group Results
The Smarter Balanced summative assessments report achievement Overall, by Composite Claim, and by Claim. The group Results page for a Smarter Balanced summative assessment is displayed in figure 2.
Figure 2. Group Aggregate Results for ELA Summative Assessment
The Group Aggregate panel displays the following information and elements:
- Average Scale Score: Average scale score for the selected group of students and an error band based on the standard error of the mean
- Total number of student test results used to calculate the average scale score and score distribution
- Student Score Distribution: Score distribution for the performance levels of the overall score
- [Overall/Composite/Claim] toggle: Switches among Overall, Composite, and Claim
- [Show/Hide Results] button: Displays the Results View table for the assessment and toggles to a [Hide Results] button, which hides the table
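The error band on the group average is based on the standard error of the mean. A minimal sketch of the calculation, assuming a band of plus or minus one standard error (a hypothetical helper, not CERS code; the exact band width CERS draws is not specified here):

```python
import math

def average_with_error_band(scale_scores):
    """Average scale score for a group and an error band based on the
    standard error of the mean (SEM = sample std dev / sqrt(n)).
    The +/- 1 SEM band is an illustrative assumption.
    """
    n = len(scale_scores)
    mean = sum(scale_scores) / n
    variance = sum((s - mean) ** 2 for s in scale_scores) / (n - 1)
    sem = math.sqrt(variance) / math.sqrt(n)
    return mean, (mean - sem, mean + sem)

mean, band = average_with_error_band([2400, 2450, 2500, 2550, 2600])
# mean -> 2500.0; the band spans roughly +/- 35 points around the mean
```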
Composite Claim Group Results: Smarter Balanced Summative Assessment
The composite claims for the summative assessment for English language arts/literacy (ELA) are:
- Composite Claim 1: Reading and Listening—Students can comprehend, by reading or listening closely and analytically, a range of increasingly complex literary and informational texts.
- Composite Claim 2: Writing and Research—Students can produce organized and focused written texts for a range of purposes and audiences and can apply research and inquiry skills to investigate topics and analyze, integrate, and present information.
The composite claims for the summative assessment for mathematics are:
- Composite Claim 1: Concepts and Procedures—Students can explain and apply mathematical concepts and interpret and carry out mathematical procedures with precision and fluency.
- Composite Claim 2: Mathematical Practices (Problem Solving, Communicating Reasoning, and Modeling and Data Analysis)—Students can use problem-solving strategies and mathematical models to represent, analyze, and solve complex, well‐formed or not yet fully formed problems that are presented in mathematical or real‐world contexts; make productive use of mathematical concepts, procedures, and tools; interpret results; and communicate clearly and precisely about their own reasoning and the reasoning of others.
The [Composite] toggle displays the composite claim results (Below, Near, or Above Standard) based on the subject (ELA or math).
Figure 3. Group Aggregate Results for Composite Claims for ELA Summative Assessment
The Group Aggregate panel for composite claim results (figure 3) displays the following information and elements:
- Student Composite Claim Score Distribution: Displays claim level distributions for the selected group of students
- [Overall/Composite/Claim] toggle: Switches among Overall, Composite, and Claim
- [Show/Hide Results] button: Displays the Results View table for the assessment and toggles to a [Hide Results] button, which hides the table
Overall Claim Results: Smarter Balanced Summative Assessment
Results in the [Claim] tab will display “No Scores Reported” (figure 4) if students completed the adjusted blueprint of the Smarter Balanced Summative Assessments for ELA and Mathematics. Refer to the [Composite] tab for composite claim results.
Figure 4. Student Results by Claim Level with No Scores Reported
Smarter Balanced Summative Assessment Results By Student
The Results View for Smarter Balanced summative assessment results can be toggled to show Overall, Composite, and Claim results. Overall is the default selection.
Student Overall Scores
When the [Overall] button is selected, the Results By Student table includes the information shown in figure 5 for each student.
Figure 5. Summative Assessment Results By Student Overall Scores
The table includes the following information for each student:
- Student: Student’s full name with a Context Menu three-dot icon [⋮] that provides access to additional student details (refer to the Summative Student Options section)
- Date: Date student completed the assessment
- Session: Testing session identifier (applicable to interim assessments only)
- Enrolled Grade: Student’s enrolled grade at the time of the assessment
- School: Student’s assigned school at the time of the assessment
- Status: The assessment status describes whether the administration condition (Manner of Administration) was Valid or Invalid. A blank Status field indicates a valid assessment status.
- Achievement Level: Student’s achievement level on the assessment: Standard Not Met (Level 1), Standard Nearly Met (Level 2), Standard Met (Level 3), Standard Exceeded (Level 4)
- Scale Score/Error Band: Student’s scale score and error band based on the Standard Error of Measurement (SEM) associated with that score
Student Composite Claim Level Scores
When the [Composite] button is selected (figure 6), the Results View panel changes to show each student’s assessment results by Composite Claim. Return to the default view by selecting the [Overall] button.
The Student Composite Claim Level Distribution panel (figure 6) displays the number or percentage of students scoring in each reporting category (Below Standard, Near Standard, or Above Standard) for each claim.
Figure 6. Summative Assessment Results By Student Composite Claim
The table includes the following information for each student:
- Student: Student’s full name with a Context Menu three-dot icon [⋮] that provides access to additional student details (refer to the Summative Student Options section)
- Date: Date student completed the assessment
- Session: Test session identifier (applicable to interim assessments only)
- School: Student’s assigned school at the time of the assessment
- Composite Claims for the selected assessment: Student performance for each composite claim is described as Above Standard, Near Standard, or Below Standard
Student Claim Level Scores
When the [Claim] button is selected, the Results View panel changes to show each student’s assessment results by Claim (figure 7). Return to the default view by selecting the [Overall] button.
The Student Claim Level Distribution table displays the number or percentage of students scoring in each reporting category (Below Standard, Near Standard, or Above Standard) for each claim.
Figure 7. Summative Results By Student Claim Level Scores
Each student’s result in the Claim Scores table shows the following elements:
- Student: Student’s full name with a Context Menu three-dot icon [⋮] that provides access to additional student details (refer to the Summative Student Options section)
- Date: Date student completed the assessment
- Session: Test session identifier (applicable to interim assessments only)
- School: Student’s assigned school at the time of the assessment
- Claims for the selected assessment: Student performance on each claim is described as Above Standard, Near Standard, or Below Standard
Summative Student Options
When the Context Menu three-dot icon [⋮] next to a student’s name is selected, the Student Options pop-up menu opens (figure 8).
Figure 8. Student Options Pop-Up
The menu options for each student are:
- [Student’s Test History]: Displays the student’s test history. Refer to the Student Test History Report section for details.
- [Print Student’s Summative Report]: Links to a printable Individual Student Report (ISR) in PDF format. Filter selections are automatically filled for some fields, such as the assessment type, subject, and school year. Refer to the Printable Reports section for details.
Smarter Balanced Summative Writing Trait Scores
The Writing Trait Scores view (figure 9) for Smarter Balanced ELA summative results displays a table of aggregate student results grouped by writing purpose (Argumentative, Explanatory, Informational, Narrative, or Opinion). Individual student writing trait scores are available in the individual student report (ISR) PDF. Refer to the Printable Reports section for details.
Figure 9. Writing Trait Scores
The Writing Trait Scores table shows the following elements:
- The writing purpose tabs display based on student results. Selecting a writing purpose displays a table with aggregated student results.
- Each table row lists the category of writing performance, or writing trait (Evidence/Elaboration, Organization/Purpose, and Conventions), the group average and maximum points for that category, and the percentage/number of students who earned each number of points for that category.
- An [Export] button is available to export the results table to a comma-separated value (CSV) file. All available tabs will be included in the report.
Writing Extended Response (WER) Condition Codes for ELA
Writing Extended Response (WER) Condition Codes are used for responses to full-write essay items (an example of an ELA performance task) that cannot be scored because of the nature of the student’s response. A detailed description of each available condition code is provided in table 1.
When a student response is assigned a condition code, it is equivalent to a score of zero, except for the Off-Purpose code. In most cases, when a full-write response receives a condition code, that code is assigned to all three dimensions of the response. However, beginning with the 2022–23 test administration, if the condition code for a response is Off-Purpose, only the Evidence/Elaboration and Organization/Purpose dimensions are assigned Off-Purpose, and the conventions dimension is still scored. The conventions item score is included in the total ELA score and the writing claim score. The rule for Off-Purpose scoring of conventions is applied regardless of the genre of the writing prompt.
Table 1. Writing Extended Response (WER) Condition Codes

| Condition Code | Description |
|---|---|
| Blank (B) | Student did not enter a response. |
| Insufficient (I) | Student has not provided a meaningful response; examples include responses where the student’s original work is insufficient to determine whether the student is able to organize, cite evidence and elaborate, and use conventions as defined in the rubrics, or where the response is too brief to determine whether it is on purpose or on topic. |
| Non-scorable Language (L) | ELA: language other than English. Mathematics: language other than English or Spanish. |
| Off-Topic (T) | A writing sample is judged off topic when the response is unrelated to the task or the sources, or when the response shows no evidence that the student has read the task or the sources (especially for informational/explanatory and opinion/argumentative purposes). Off-topic responses are generally substantial responses. |
| Off-Purpose (M) | A writing sample is judged off purpose when the student has clearly not written to the purpose designated in the task. An off-purpose response addresses the topic of the task but not the purpose of the task. Off-purpose responses are generally developed responses (essays, poems, etc.) clearly not written to the designated purpose. If the condition code for a response is Off-Purpose, only the Evidence/Elaboration and Organization/Purpose dimensions are assigned Off-Purpose; the Conventions dimension is still scored. |
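The dimension-level rule above (most condition codes apply to all three dimensions; Off-Purpose spares Conventions) can be sketched as follows. This is a hypothetical illustration with made-up names, not code from the actual scoring system:

```python
def apply_condition_code(code):
    """Assign a WER condition code to the three writing dimensions.

    Per the rule effective with the 2022-23 administration: most codes
    are assigned to all three dimensions, but Off-Purpose ("M") applies
    only to Evidence/Elaboration and Organization/Purpose; Conventions
    is still scored normally (None here means "score as usual").
    Hypothetical sketch, not actual scoring-engine code.
    """
    dimensions = ["Evidence/Elaboration", "Organization/Purpose", "Conventions"]
    if code == "M":  # Off-Purpose: Conventions is still scored
        return {d: ("M" if d != "Conventions" else None) for d in dimensions}
    return {d: code for d in dimensions}  # all other codes hit every dimension

apply_condition_code("M")
# -> {"Evidence/Elaboration": "M", "Organization/Purpose": "M", "Conventions": None}
```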
Smarter Balanced Summative Target Reports
Each Smarter Balanced Claim for English language arts/literacy and mathematics includes a set of assessment targets that provide more detail about the expectations of the knowledge, skills, and/or abilities assessed by the items and tasks within each claim.
- For English language arts/literacy (figure 10), the summative assessment target-level scores are calculated for each claim.
- For mathematics, target-level scores are calculated for Claim 1 only.
CERS displays aggregate target-level reports for each summative assessment. Target scores are reported as Performance Relative to the Entire Test and Performance Relative to Level 3 (Standard Met).
Figure 10. Target Report
The Target Report table includes the following information:
- [Subgroups] button: Disaggregates the target report by student demographic or program group
- Claim: The Claim(s) associated with the assessment subject
- Target: The Targets associated with each Claim
- Subgroup: The student subgroup breakdown of the report
- Students Tested: The number of students who answered questions related to the target
- Performance Relative to Entire Test: The overall student Performance Relative to the Entire Test (Better, Similar, or Worse)
- Performance Relative to Met Standard (Level 3): The overall student Performance Relative to Level 3 (Below, Near, or Above) (figure 11)
Figure 11. Target Report Table Indicators
Performance Relative to the Entire Test
Performance Relative to the Entire Test is reported in one of three reporting categories: Better, Similar, or Worse. This report indicates whether students’ performance on a target was better than, the same as, or worse than their performance on the entire test. A "Worse" indicator does not necessarily mean poor performance on a target, but rather that students’ performance in this area was weaker than their overall performance.
Performance Relative to Level 3
Performance Relative to Level 3 (Met Standard) is reported in one of three reporting categories: Above, Near, or Below. This report indicates whether students’ performance on a target was above, near, or below the performance standard (Level 3: Met Standard). A "Below" indicator suggests that students have not yet mastered the content assessed in a target; however, the students’ overall performance on the test may be near or above standard.
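Both target-report indicators are three-way classifications of a comparison statistic. The sketch below uses hypothetical threshold values purely for illustration; Smarter Balanced defines the actual statistical cutoffs:

```python
def classify_relative_to_test(difference, threshold=1.0):
    """Classify target performance relative to the entire test as
    Better / Similar / Worse. `difference` is a standardized measure of
    how target performance compares to overall test performance; the
    threshold here is illustrative, not the official cutoff.
    """
    if difference > threshold:
        return "Better"
    if difference < -threshold:
        return "Worse"
    return "Similar"

def classify_relative_to_level3(distance, threshold=1.0):
    """Classify target performance relative to the Level 3 (Standard
    Met) cut as Above / Near / Below, using the same illustrative
    threshold convention.
    """
    if distance > threshold:
        return "Above"
    if distance < -threshold:
        return "Below"
    return "Near"

classify_relative_to_test(1.8)     # "Better"
classify_relative_to_level3(-2.3)  # "Below"
```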