
The Second Draft - Volume 35, No. 1

An Effective Self-Assessment Tool for Beginner IRAC Assignments

  • Nicole Coon
    Associate Teaching Professor of Law, Academic Excellence Specialist
    Mitchell Hamline School of Law
  • Kari Milligan
    Associate Teaching Professor of Law, Academic Excellence Specialist
    Mitchell Hamline School of Law

Looking for an effective self-assessment tool for your students’ first IRAC[1] assignment? We were too.

To that end, we: (1) canvassed existing academic literature on self-assessments; (2) created our own self-assessment tool[2] that integrates Self-Assessment by Comparative Analysis (comparing one’s work to that of an expert); and (3) categorized our 1Ls’ answers to this self-assessment tool, so we could “see what our students were seeing” as they began their IRAC journey.

Once we could “see what our students were seeing,” we were able to check any assumptions we had about our students, self-assess our efficacy as legal educators, and identify data-driven adjustments to our curriculum.

 

  I. The Broader Academic Landscape of Self-Assessments

 

Self-assessments are a type of formative assessment, which the American Bar Association began requiring law schools to use in 2016-2017.[3] Formative assessments are “measurements [taken] at different points during a particular course . . . that provide meaningful feedback to improve student learning.”[4]

 

Self-assessments teach students how to become better legal writers. They also teach students how to independently assess their legal writing, a skill required to succeed in law school, on the bar examination, and in practice. Further, many self-assessments teach students the critical skill of metacognition. Metacognition is the skill of “thinking about [one’s] thinking,” and “self-regulat[ing] . . . learning with the goal of transferring learned skills to new situations.”[5]

The concept of “Self-Assessment by Comparative Analysis” is well-suited to self-assessments built for students’ first IRAC assignments, when many professors want students to begin evaluating their legal writing end-products and their legal writing processes. Specifically, Self-Assessment by Comparative Analysis requires “student[s] to compare [their] own work to the work of an ‘expert’ on the same project, analyze the differences, and identify what [they] need to improve and how [they] will improve it, focusing on both [their] product and [their] process.”[6] Frankly, beginning 1Ls may not be able to identify good legal writing without an example and can benefit from examining expert work.[7] While there is a risk students will believe the expert work is the one and only “right answer,”[8] professors can ameliorate that risk by explaining that no IRAC is perfect and asking students to simultaneously critique their IRACs and the expert IRAC. 

 

  II. The Self-Assessment Tool

 

We designed and administered the self-assessment tool using a straightforward assault fact pattern with a simple, one-page expert IRAC answer. The analysis section of that expert answer contained only deductive reasoning, with no viable counterarguments or analogical reasoning. Before completing the self-assessment, students studied assault in their doctrinal Torts classes, wrote IRAC answers to the assault fact pattern, and reviewed our feedback on those answers.

 

The self-assessment tool contained four sections: two highlighting exercises, a sentence-by-sentence comparative review, and a set of high-level take-away questions.

In the self-assessment tool’s first and second sections, we asked students to complete a highlighting exercise. We instructed students to take their own IRAC answers and highlight the issue sentence green, the rule sentences yellow, the analysis sentences (containing both legal conclusions and facts) blue, and the conclusion pink.[9] Next, we prompted each student to complete the same highlighting exercise using an expert IRAC answer we provided. Following these highlighting exercises, students were left with two visually informative IRACs, enabling them to compare the coloration of their IRACs with that of the expert IRAC.

In the self-assessment tool’s third section, we asked students to complete a sentence-by-sentence comparative review of their own IRACs and the expert IRAC. We divided the expert IRAC into its components: Issue, Rule, Analysis, and Conclusion. Within each component, we broke out each sentence, or pair of sentences, for separate student review. Each student began by reading the relevant sentence or sentences from the expert IRAC and then pasted the corresponding sentence or sentences from their own IRAC. Next, students indicated whether their IRACs “fully articulated,” “partially articulated,” or did “not at all” articulate the content in the expert IRAC.[10] If students did not capture the expert IRAC’s content in their own answers, we asked them to explain why they did not and why they thought the expert IRAC did. We then invited students to paste any additional, related content from their answers that the expert IRAC did not contain, and to explain why they included that content and why they thought the expert IRAC did not.[11]

Below is an example of the “IRAC Issue Statement” portion of this exercise.

-------------------------------------------------

MODEL ANSWER:

The issue is whether Ref is likely to succeed in a claim for assault against Ron Fan.

 

PASTE YOUR IRAC HERE:

 

GRADE YOUR IRAC HERE:

Fully Articulated

Partially Articulated

Not At All

 

If your answer does not have a statement that corresponds to the statement in the model answer, explain why you did not include one and why you think the model does.

 

 

Paste any additional issues you have in your answer that the model answer does not contain, and explain why you included those issues and why you think the model does not.

 

--------------------------------------------------

Following this sentence-by-sentence comparative review, students were left with a chart enabling them to evaluate their product and process. With respect to product, students could assess whether their IRACs contained necessary or extraneous information. With respect to process, each student could assess the choices made while writing the IRAC.

 

In the fourth and final section of the self-assessment, we asked students to answer high-level take-away questions. We first asked students to: (1) identify the strengths of their IRAC answers; (2) identify the strengths of the expert IRAC answer; (3) identify the weaknesses of their IRAC answers; and (4) identify the weaknesses of the expert IRAC answer. We also challenged each student to think metacognitively—specifically, to think about what they would do if they repeated the assignment and how they would transfer their skills to new situations. We prompted each student to answer the following questions: (1) if you were to write a second draft of your IRAC, what would you do that is substantively different from your current answer? and (2) if you were to write a second draft of your IRAC, what would you do that is procedurally different from your current answer? By substance, we were referring to the accuracy of each student’s content. By procedure, we were referring to each student’s legal writing process. Finally, we asked students to list the top two things that they learned about writing like an attorney from the self-assessment that they would apply to future legal writing.[12] After answering these questions, students were left with concrete take-aways regarding their IRAC product and their legal writing process to apply to their next legal writing assignment.

 

  III. What Our Students Saw When They Self-Assessed Their First IRAC Assignment and Our Response

 

Once we received our students’ self-assessment submissions, we had the opportunity to “see what 196 1Ls were seeing” as they began their IRAC journey and compared their IRAC answers to an expert IRAC answer.

 

To gather data from the self-assessment assignment, we reviewed and categorized the students’ responses to the questions in the fourth and final section.

 

     A.  Students’ Substantive Changes

 

We first looked at the students’ responses to the question: “What would you do that is substantively different?” Thirty-three percent of responses focused on the analysis, whereas 27% focused on the rule. Students’ responses also addressed various other issues, including conciseness, organization, the issue statement, no changes at all (no doubt our perfect students), and the conclusion. The “other” category contained an assortment of answers, such as “I would check over the grammar carefully” and “I would get my cause of action correct.”

We further broke down the rule and analysis responses.

 

  1. Students’ Substantive Changes to the Rule

 

For the rule, nearly half the responses focused on adding sub-definitions/sub-rules (47%). The majority of the remaining responses fell within the “other” category, with answers such as “I would break down the three elements [of assault] into two elements” and “I would see if there is a more logical way to separate the elements than I did in the original assignment.”

 

  2. Students’ Substantive Changes to the Analysis

 

For the analysis, we identified a sizable “other” category. Once we dove more deeply into it, we saw that students’ responses were all over the map, and many students had a difficult time articulating the actual problems with their analysis and the solutions to those problems. For example, we saw vague statements like: “I would analyze the facts in regards to the rule a little more clear[ly]. It was a little jumbled and could have been set up better.” Beyond the “other” category, student responses addressed different issues, including the need to consider counterarguments, the need to further explain the analysis, use of “Legal Conclusion because Facts” (the deductive reasoning sentence structure we teach), and use of the mirroring concept (the rule-to-analysis matching construction we teach).

 

     B.  Students’ Procedural Changes

 

Next, we looked at the question: “What would you do that is procedurally different?” Here, 34% of students said they would have created a better outline (or what we call a “T-Chart”), and 21% of students said they would have spent more time on their legal writing. The remaining responses covered a host of issues, including doing nothing (no doubt our perfect students again), outlining, organization, more emphasis on rules, and seeking clarification from professors. Responses in the “other” category focused on different procedural issues, such as “I would take more time to fully digest the instructions” and “[I would p]roofread to find more passive voice.”

 

     C.  Students’ Top Two Takeaways

 

The final question we looked at was: “List the top two things that you learned about writing like an attorney from this exercise that you will apply to future legal writing.” A large percentage of the responses focused on rules and analysis. Students further emphasized being concise, formulaic, and precise.

 

     D.  Data Reactions and Recommendations

 

Overall, we were pleased to see beginning 1Ls concentrating on some of the most important substantive aspects of IRAC and the most important procedural aspects of the legal writing process. With respect to the substantive aspects of IRAC, 60% of student responses focused on revising the rule and analysis sections. Only 7% of student responses focused on revising the issue and conclusion sections. These results suggest that, early in their law school careers, students are tuning into the major components of IRAC and recognizing that their rule and analysis sections are often more important than “getting the right answer.” With respect to the procedural aspects of the legal writing process, 55% of student responses focused on outlining and building in more time to write. These results appear to show that novice 1Ls quickly appreciate the need to take ownership of their writing processes and to make a solid plan for, and devote more time to, their legal writing. These results should encourage skills professors like us, who often focus on the rookie mistakes our rookie students make and on our most at-risk students. This glimpse into what our 1Ls were seeing also prompted us to respond by checking any assumptions we had about our students, self-assessing our efficacy as legal educators, and identifying data-driven adjustments to our curriculum.

 

Checking our assumptions about our students involved colleague brainstorming sessions, in which we identified personal as well as generally held assumptions about incoming law school students. We then compared these assumptions to the data presented above. With this comparison in hand, we reflected on the course competencies outlined in our syllabi and provided to students, and we considered possible curriculum revisions.

For example, our assumptions included: (1) 1Ls beginning their IRAC journey would not focus on some of the most important substantive and procedural aspects of IRAC writing—the rules, the analysis, outlining, and spending time writing; (2) novice 1Ls would have a difficult time articulating the actual problems with their analysis and the solutions to those problems; and (3) some 1Ls would not appreciate that they make major mistakes when writing and would instead assume that their grades reflect only minor mistakes.

The data we gathered challenged the first assumption while supporting the second and third—and illuminated the struggle many 1Ls have when trying to reflect deeply on the substance and structure of their work, particularly their analysis. Given the unique nature of legal writing, it makes perfect sense that students early in their law school careers would have difficulty stepping outside of themselves to diagnose significant issues with their legal writing and to determine how to rectify them.

With this information, we then turned to course competencies. We focused on two competencies in particular. By the end of the course, students should be able to: (1) identify rules from cases and develop the applicable legal rules and standards through synthesis of multiple cases and other sources; and (2) analyze hypothetical situations by identifying the correct legal issues and relevant facts, and applying the appropriate legal standard(s) to do the following: (a) utilize the T-chart method to prepare an outline of an organized and complete legal analysis; and (b) utilize the IRAC structure to prepare a clear and complete written legal analysis on the legal issues presented. We reflected on our efficacy in teaching these particular competencies and considered possible curriculum revisions.

Specifically, we identified three main areas for improvement. First, we would like students to incorporate more sub-definitions and sub-rules into their rule sections. By enhancing students’ skills in presenting rules, we can help them unpack and fully present the law. Building out sub-definitions and sub-rules can also have a cascading effect by simultaneously improving students’ analysis sections: because students’ analysis sections should mirror their rule sections, a rule section that contains more sub-definitions and sub-rules will necessarily yield a more thorough analysis section. Second, we would like to teach students how to articulate the actual problems with their analysis sections and the solutions to those problems. Third, we would like to help normalize the process of learning legal writing by offering future 1Ls some of the previous 1Ls’ answers to the self-assessment assignment as a peer perspective.

 

  IV. Conclusion

 

As the next cohort of 1Ls enters law school and begins their IRAC journeys, we encourage you to modify the self-assessment tool described in this article to fit your needs and to position your 1Ls to improve their IRACs and legal writing processes from their very first IRAC assignment.

 

[1] This article refers to IRAC; however, the self-assessment tool is applicable to any of the various organizational paradigms, such as CREAC.

[2] A copy of the self-assessment tool is available in the Legal Writing Institute Teaching Bank, under the All Other Teaching Materials Section, at https://www.lwionline.org/resources/teaching-bank. Alternatively, if you are unable to access the self-assessment tool through this website, please email the authors directly at nicole.coon@mitchellhamline.edu or kari.milligan@mitchellhamline.edu for a copy of the self-assessment tool.

[3] Diana Donahoe & Julie Ross, Lighting the Fires of Learning in Law School: Implementing ABA Standard 314 by Incorporating Effective Formative Assessment Techniques Across the Curriculum, 81 U. Pitt. L. Rev. 657, 660-61 (2020); ABA Standards & Rules of Proc. for Approval of Law Sch.’s 2016-2017, at 23 (2016), available at https://www.americanbar.org/content/dam/aba/publications/misc/legal_education/Standards/2016_2017_aba_standards_and_rules_of_procedure.pdf [hereinafter Standard 314].

[4] Standard 314, supra note 3, at 23.

[5] Anthony Niedwiecki, Teaching for Lifelong Learning: Improving the Metacognitive Skills of Law Students Through More Effective Formative Assessment Techniques, 40 Cap. U. L. Rev. 149, 156 (2012).

[6] Joi Montiel, Empower the Student, Liberate the Professor: Self-Assessment by Comparative Analysis, 39 S. Ill. U. L.J. 249, 251 (2015).

[7] See id. at 262.

[8] Id. at 271.

[9] Id. at 267-68 (discussing the role of highlighting in a self-evaluation of CREAC parts); see also Mary Beth Beazley, The Self-Graded Draft: Teaching Students to Revise Using Guided Self-Critique, 3 Legal Writing 175, 182-85 (1997) (setting forth a self-grading exercise that includes highlighting).

[10] Betsy Brand Six & Jamie Kleppetsch, Presenters, Presentation at the 2018 Ass’n of Acad. Support Educ. Conf.: Self-Assessment as the Ultimate Tool for Success (May 24, 2018).

[11] See Montiel, supra note 6, at 268 (presenting a self-evaluation worksheet wherein students must explain why they did not include rules in their memo that were included in the “good memo” and vice versa).

[12] See id. at 269-70 (setting forth a line of questioning in a self-assessment assignment designed to re-assess the accuracy of the students’ self-monitoring).