
The Second Draft - Volume 34, No. 3

Early Warning Systems: The Case for Written Diagnostics as a Tool for Scaffolding

  • Cara Shaffer
    Writing Specialist
    South Texas College of Law
  1. Introduction

Judge Gerald Lebovits remarked that “good legal writing is clear, concise, and engaging in ways unheard of in college.”[1]  

The judge is, of course, correct. Yet 1L students balk at this advice.[2] In student meetings at the writing center where I work, a surprising number of struggling 1L students tell me, “I’m a very strong writer, but legal writing is just so different.” This is not hubris. In college, they diligently adopted a system of writing that worked in that context. Their papers scored well, often with minimal written critique. And now they find themselves here with their papers dripping in red ink.

Legal writing is different. Different as it is, though, many problems in law student writing were born long before 1L year. As legal writing professors well know, many of these same students struggle with fundamental writing concepts that we wish had crystallized in high school—paragraph organization and unity, structure, introductory sentences, crisp syntax, and more. The problem, in other words, goes beyond the mere difficulty of adjusting to a new mode of writing.

Every legal writing teacher knows these students. We’ve critiqued papers from liberal arts students who maintain the habit of burying conclusions in the back of paragraphs or, worse, writing cliffhangers that never reach the paragraph’s point. We’ve encouraged former government majors not to use “throat-clearing” phrases—flights of purple rhetoric—for unnecessary emphasis. We’ve convinced students from science and business backgrounds not to rely on the passive voice even though it is standard in their prior fields. Out of History and English departments creep vague phrases like “it appears” and “one can infer.” We’ve promised students of all majors that it is acceptable and even best practice to repeatedly reference facts despite high school warnings of redundant rhetoric. 

In sum, 1L students lack a clear understanding of which parts of their undergraduate and high school writing education to keep and which to abandon. More basically, they lack the tools and context to assess their own fundamental skills. How do we give students a clear idea of where they stand relative to the expectations of the field? Sometimes, students receive this information with the first graded assignment—which comes as a nasty shock. How can we best untangle the unhelpful habits of their prose from principles and skills they should foster—techniques of organization, clarity, and cohesion? How do we best undertake this work in an era when students are spread across a spectrum of skills? 

Professor Kari Johnson wrote that “we’ve gone from teaching a mostly unified group of students to the classroom equivalent of herding cats. Your students—with their disparate entry skills, various languages, and hit-and-miss powers of concentration—will never be in one proximal development zone. You cannot count on students to be similarly situated at [the] start of the year.”[3]

To address this spectrum of ability, professors have historically adapted with “scaffolding”—the method of “meeting learners where they are” and adjusting accordingly.[4] While writing professors are already scaffolding because they critique individual student work, the structure of law school itself imposes many challenges on the process.[5] Before professors can give any writing critique, they are tasked with first teaching fundamentals of the law itself. They must teach the basis of the court systems, jurisdiction, and stare decisis. They must explain an unintuitive system of research. They must teach students an entirely new organizational paradigm. These lessons naturally take time. Sometimes, untangling a student’s misconceptions about writing itself can only take place after conferencing with a student on multiple intersecting issues of misunderstanding. Writing centers offer another space for this clarification to happen. But many students do not come to the center until several weeks into the semester.

Are there ways we can support scaffolding and offer students more insight into the challenges they may face before they are several weeks into the semester? This semester, I decided to test out a particular way—indeed, a highly particular way—of addressing these issues. In the first week at my school, I asked every entering 1L to carefully consider whether Taylor Swift would be theoretically liable for insulting a gorilla under the fictional Gorilla Provocation Act.

Bear with me.

  2. Assessment & Diagnostic Testing

Before we get to Taylor and her fictional misdeeds, I would like to address diagnostic testing.

Over the past decade, legal writing scholars have done important work developing and implementing multiple-choice diagnostics for grammar, punctuation, and style.[6]  These tests allow law schools to understand what their students have learned, identify students who need help, and concretely track progress across semesters. This intervention is critical because students who do poorly in their first semester are less likely to pass the bar exam.[7] Diagnostics also help students identify how competent they are at fundamental grammar and punctuation concepts relative to expectations in the field.  

While critical, these tests nonetheless have limitations. They are not designed to test clarity, organization, cohesion, and other key fundamental writing skills students often struggle with in the first year. Written essay diagnostics, by contrast, offer a path to gathering data on these elements.

Though common at the undergraduate level as placement tools, essay diagnostics are not generally used in law school. There are certainly reasons why they would be less favored in a law school context. For example, law schools do not place students according to their writing; applicants have already demonstrated minimum writing proficiency through the writing samples they submit for admission. In addition, grading an essay for an entire entering class is a time-intensive endeavor that professors and writing center professionals simply do not have time to undertake. We are also in the business of teaching legal writing. But is it a waste of time to ask students to write something unrelated to law? In reality, offering a writing assessment that is disconnected from the new challenges of legal writing offers students the chance to productively assess the habits and skills they bring to law school.

Last year, I became curious as to whether gathering information about structure and syntax might yield useful data for students, writing centers, and professors. I decided to test this theory by devising an essay prompt and administering it in the first week of the fall semester alongside our program’s multiple-choice diagnostic. This brings us back to Taylor Swift and that gorilla.

  3. The Prompt

I created the following prompt: 

Last year, Taylor visited The Houston Zoo. Taylor felt great because she was wearing a new Gucci hat. The hat is described as “bright neon yellow” on Gucci’s website. Taylor made a beeline to see her favorite animal—the gorilla. Watching the gorilla in its enclosure, Taylor was filled with awe. Inspired, Taylor threw the gorilla a “Taylor” concert shirt she had in her bag. The gorilla gave a great roar and tore the shirt to pieces. Furious at the gorilla’s rejection of her gift, Taylor screamed, “You are the worst gorilla I’ve ever seen! You are a loser! I love all gorillas, but you are a joke!” Taylor banged her hands on the railing of the gorilla’s cage so hard that her hat fell off. She swiftly departed. Since the incident, the gorilla appears sad and depressed. The zoo is losing business because visitors state the gorilla is “really boring now.” They no longer visit the enclosure. Our client, The Houston Zoo, is suing Taylor under the Gorilla Provocation Act.

The Gorilla Provocation Act (GPA) imposes liability upon an individual when that individual

I. substantially provokes a gorilla
II. while wearing a banana yellow hat.

  • “Banana yellow” is defined in the GPA as “any shade that a reasonable person would find to be yellow like a banana.”

  • “Substantially provokes” is not defined in the Act; however, there is one case addressing the matter that we can use to persuade our judge that Taylor substantially provoked the gorilla. In that case, Judge Park wrote the following:


“In the case we deal with today, Josephs, the defendant, screamed at the gorilla: ‘Fustiliarian scoundrel! Thou art a boil and a plague!’ Josephs then poked the gorilla in the shoulder with the tip of his umbrella. Josephs’ extremely aggressive and melodramatic behavior—particularly the theatrical language—clearly meets the ‘substantial provocation’ threshold.” (Gorilla Circus v. Josephs)

I asked my students to explain in 2-3 short paragraphs (or one page) why the court would find:

  • Taylor Swift was wearing a banana yellow hat; and
  • Taylor Swift substantially provoked a gorilla.

I used a fictional legal structure because I did not want to risk confusing students before they received the tools to understand the problem. I used language that was less complex than even the most basic memo prompt. I did not prescribe a specific organizational structure; however, I purposefully outlined the two big points. I wanted to keep the structure similar enough to a legal problem that I could use the responses as teaching tools in the fall semester.

  4. Gathering Data: Formulating a Reflective Student Assessment

This year, I chose to assess student responses myself so that I could finesse the grading system and ensure that it was useful. In the future, I plan to have students self-assess their own work in writing center workshops, as an out-of-class assignment, or in groups.

I created a simple point system based on Strunk and White’s The Elements of Style.[8] I chose this text because it offers simple, practical rules and is nearly universally considered a gold standard in the field.[9]

  1. 1 point: The student introduces the paper response.
  2. 1 point: The response has multiple paragraphs.
  3. 1 point: There is one paragraph for each main idea.
  4. 1 point: Each paragraph begins with a topic sentence.
  5. 1 point: Each paragraph concludes on its specific issue.
  6. 1 point: The sentences in each paragraph all relate to the main idea.
  7. 1 point: Sentences are under three lines long or fewer than twenty words.
  8. 1 point: The student uses transition words or phrases.
  9. 1 point: The student uses fewer than two passive voice constructions.
  10. 1 point: The student uses fewer than two “throat-clearer” words.

Below the point system, I explain to students how to look for each point. The grading exercise itself is meant to be a refresher on basic points of organization and style. For example, after explaining what active voice is, I instruct them to circle the word “by” and conjugations of “to be” verbs, which I list. For the “throat-clearing” words, I list the most common filler words that students use along with some common undergraduate writing phrases, including “one can infer,” “seemingly,” and “it appears.”
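For anyone who wants to semi-automate part of this self-check, the circling exercise maps neatly onto a few string heuristics. Below is a minimal Python sketch of three of the rubric’s marker checks (items 7, 9, and 10); the marker lists are illustrative assumptions rather than the full lists I give students, and a flagged word is only a cue to evaluate, not proof of a problem.

```python
import re

# Illustrative marker lists — assumptions for this sketch, not the full handout lists.
TO_BE_FORMS = {"am", "is", "are", "was", "were", "be", "been", "being"}
THROAT_CLEARERS = ("one can infer", "it appears", "seemingly")

def flag_markers(text: str) -> dict:
    """Flag the rubric's self-check markers in a writing sample."""
    # Naive sentence split on terminal punctuation.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())

    return {
        # Item 7: sentences of twenty words or more.
        "long_sentences": [s for s in sentences if len(s.split()) >= 20],
        # Item 9: words to circle as possible passive-voice markers.
        "passive_markers": [w for w in words if w in TO_BE_FORMS or w == "by"],
        # Item 10: throat-clearing phrases to circle.
        "throat_clearers": [p for p in THROAT_CLEARERS if p in text.lower()],
    }

print(flag_markers("It appears that the shirt was torn to pieces by the gorilla."))
```

Because “by” and “to be” verbs appear in plenty of active sentences, the output is a set of candidates for the student to circle and judge, mirroring the manual exercise rather than replacing it.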

Education scholarship suggests that there are direct and meaningful benefits to having students self-assess their own work.[10] Of course, these responses could be graded by anyone—but I think it is generally safe to say professors and writing center specialists do not have time to add a grading task to their lengthy to-do lists.[11] Despite legitimate anxieties over computer “robo-grading,” we are reaching the point at which AI software may also add a useful arrow to our data-gathering quiver.[12] That, though, is a topic for another day.


  5. Assessing the Benefits of Responses

Simply put, essay diagnostics provide a battery of information about students that multiple-choice testing alone does not. The essay assessment has potential as a useful tool for students, writing center specialists, and professors.

  • Students

Self-assessment has the potential to meaningfully improve student writing.[13] Students benefit from thinking about rules of paragraph cohesion and unity, passive voice, wordiness, and clarity. During the semester, these issues can become blurred in the huge list of requirements students must meet in their memos and briefs. Assessing a concrete sample of their own writing forces students to scrutinize their habits related to key principles of organization and syntax. Front-loading these key concepts encourages students to prioritize them throughout the semester.

  • Writing Center & Professors

The ability to review a written sample allows me to understand a student’s writing profile earlier and to tailor my critique specifically to that student. This assessment method immediately clarifies which issues a student should review.

Samples generally confirm the errors flagged by the multiple-choice diagnostic tests. This is useful because, in conferences, students can learn directly by correcting their own work. Whereas my students generally correct grammar exercises that I create, they were far more enthusiastic (and bashful) about correcting their own samples. My hope is that this makes the lessons stick.

Student writing in these responses is usefully unclouded by students’ attempts to implement CREAC, understand what “extreme and outrageous” means, or weave in a bevy of complex legal language. The commingling of prior bad habits with the introduction of legal writing expectations can obscure underlying issues. Does a student not understand basic principles of paragraph unity? Or is the student wildly misunderstanding CREAC? Can a student truly not articulate a clear thought because they don’t understand the legal issue? Or have they been taught that purple writing is sophisticated? Is a student’s faulty analysis due to a weakness in logic? Or is it due to a pre-existing misconception that repeating any idea in writing is always a problem?

Although legal writing professors can parse underlying misconceptions with a student in conferences, the writing diagnostic offers an efficient way to instantly gather more data about a student’s specific challenges. Essay diagnostic data enables professors to recommend tailored supplemental writing resources early in the semester. Through diagnostic data, professors may glean an understanding of the makeup of a particular class’s mechanical skills, allowing them to adjust or add interventions cued to student weaknesses. Finally, the essay diagnostic offers another pathway through which professors can begin a dialogue with students about writing technique early in the semester.

The essay diagnostic also generates communication between writing centers, professors, and students. For example, I recently had a student whose sentences were so counterintuitive and wordy that I could barely understand the legal framework under which she was writing. Modifiers were misplaced. Legal terminology was scrunched together. Multiple subject-verb agreement errors plagued the paragraphs. I looked at the student’s writing diagnostic and found that her prose in the sample was clear and logical. Based on our conference and the sample, I suspected that the student was just deeply confused about the law. I recommended that the student conference with her writing professor. When the student returned, her draft was far clearer. I did not spend time trying to teach that student how to write clear sentences or what subject-verb agreement was. It was clear from the strong sample that the student already knew. Instead, we quickly diagnosed and addressed the issue and were able to move on to other matters.

Finally, although I am not actively gathering data on this, the writing assessment allows me to immediately understand which students have more trouble with critical reasoning. Stripping the problem of any actual legal complexity shines a bright light on a student’s reasoning faculties. Although this is not technically part of the assessment, it does help me tailor my feedback and suggestions to students.

  6. Conclusion

As Judge Lebovits noted, good legal writing is clear, concise, and engaging in ways unheard of in college. At the same time, good legal writing is clear, concise, and engaging in ways that should be heard of before law school. Recognizing the abrupt shift in writing standards in 1L, we should intervene early, not only to implement multiple-choice diagnostics but also to collect non-legal writing samples such as the example I discuss above. The combination of these evaluations allows us to simultaneously encourage extant skills and disabuse our new students of pernicious writing habits.


[1] Hon. Gerald Lebovits, Surviving Your 1L Year (Again): A Primer for First-Year Legal-Writing Adjuncts, 25 Persps.: Teaching Legal Rsch. & Writing 133, 135 (2017).

[2] Bryan Garner, Why Lawyers Can’t Write, A.B.A. J. (Mar. 1, 2013), https://www.abajournal.com/magazine/article/why_lawyers_cant_write (“I’ve been trying, in other words, to say that lawyers on the whole don’t write well and have no clue that they don’t write well.”).

[3] Kari Johnson, Scaffolding on Steroids: Meeting Your Students Where They Are is Harder than Ever . . . and Easier Than You Think, 31 Second Draft 2, 2-3 (2018).

[4] Id. at 3-4.

[5] See Jan Levine, Leveling the Hill of Sisyphus: Becoming a Professor of Legal Writing, 26 Fla. St. U. L. Rev. 1067, 1072 (1999) (describing the widely acknowledged grueling nature of teaching 1L writing: “Teaching legal writing may be the most demanding teaching job in the law school.”).

[6] See Jeremy Francis et al., Designing Success: Motivating and Measuring Successful 1L Student Engagement in an Optional, Proficiency-Based Program Teaching Grammar and Punctuation, 21 Legal Writing: J. Legal Writing Inst. 129 (2016); Laurel Currie Oates et al., The Legal Writing Handbook: Analysis, Research, and Writing (5th ed. 2017) (featuring a diagnostic exam connected to the book material); John D. Schunk, Indirectly Assessing Written and Analysis Skills in a First-Year Legal Writing Course, 40 S.U. L. Rev. 47, 48 (2012).

[7] See generally Amy N. Farley et al., A Deeper Look at Bar Success: The Relationship Between Law Student Success, Academic Performance, and Student Characteristics, 16 J. Empirical Legal Stud. 605 (2019); John F. Murphy, Teaching Remedial Problem-Solving Skills to a Law School’s Underperforming Students, 16 Nev. L.J. 173, 174 (2015) (“Students in the bottom quarter of the class at the beginning of their 2L year are most at risk for failing the bar exam after graduation.”).

[8] William Strunk Jr. & E.B. White, The Elements of Style (4th ed. 2000).

[9] Rick G. Paszkiet, Style Guidelines, Americanbar.org (Sept. 2019), https://www.americanbar.org/groups/construction_industry/about_us/rules_procedures/style_guidelines/ (referring to Strunk & White’s The Elements of Style as “the bible of the economical, careful writer”).

[10] Kristen Nielsen, Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices, and Essential Strategies, 37 J. Rsch. Reading 1, 1 (2014).

[11] Daniel L. Barnett, Triage in the Trenches of the Legal Writing Course: The Theory and Methodology of Analytical Critique, 38 U. Tol. L. Rev. 651, 652 (2007) (explaining that legal writing professors critique a mind-boggling 1200 pages of student writing per year).

[12] See Aluizio Haendchen Filho et al., An Approach to Evaluate Adherence to the Theme and the Argumentative Structure of Essays, 26 Procedia Comput. Sci. 788, 796-98 (2018); Babak K. Khoshnevisan, The Affordances and Constraints of Automatic Writing Evaluation (AWE) Tools: A Case for Grammarly, 2.2 ARTESOL EFL J. 12, 14-15 (2019).

[13] See Lorelei A. Ortiz, A Heuristic Tool for Teaching Business Writing: Self-Assessment, Knowledge Transfer, and Writing Exercises, 76 Bus. Commc’n Q. 226 (2013); Ricky Lam, Assessment as Learning: Examining a Cycle of Teaching, Learning, and Assessment of Writing in the Portfolio-Based Classroom, 41 Stud. Higher Educ. 1900 (2016).