STEM Writing Project
Automated Supports
The Challenge
Instructors teaching large sections of introductory biology cannot simply copy the model of college writing programs. How, then, can they reduce the workload enough to make scientific writing a viable part of their courses?
Our Approach
Our strategy has been to identify and automate time-intensive but low-level tasks so that instructors can spend more time and effort on higher-level writing support. For example, we observed that many of our first-year students have too little prior exposure to scientific writing to judge accurately whether their writing meets minimum requirements. As a result, instructors spent considerable time and energy pointing out missing elements and making corrections that did not require human insight.
To address this problem, we created the STEM Automated Writing Help Tool (SAWHET), a web-based workflow that pre-checks student lab reports for essential elements. SAWHET provides near-real-time answers to questions such as the following (a sketch of such checks appears after the list):
- Are all of the required parts there?
- Are any sections too short or too long?
- Does the text have citations in the correct format in the correct sections?
- Does the report contain a hypothesis statement?
- Are there in-text references to figures and tables?
- Are the methods described as a narrative or as a list?
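SAWHET's source is not shown here, but checks of this kind are simple enough to sketch. Below is a minimal, hypothetical Python version of a few of them; the section names, word-count bounds, and citation pattern are illustrative assumptions, not SAWHET's actual rules:

```python
import re

# Illustrative section names and word-count bounds; a real course would set its own.
REQUIRED_SECTIONS = ["Introduction", "Methods", "Results", "Discussion", "References"]
SECTION_LIMITS = {
    "Introduction": (150, 600),
    "Methods": (100, 500),
    "Results": (100, 600),
    "Discussion": (150, 800),
}

def split_sections(text):
    """Split a report into {heading: body}, using the headings as delimiters."""
    heading = re.compile(r"^(%s)\s*$" % "|".join(REQUIRED_SECTIONS), re.MULTILINE)
    pieces = heading.split(text)
    # pieces alternates [preamble, heading1, body1, heading2, body2, ...]
    return dict(zip(pieces[1::2], pieces[2::2]))

def check_report(text):
    """Return human-readable flags for one report; an empty list means it passed."""
    sections = split_sections(text)
    flags = []

    # Are all of the required parts there?
    flags += [f"Missing section: {s}" for s in REQUIRED_SECTIONS if s not in sections]

    # Are any sections too short or too long?
    for name, (lo, hi) in SECTION_LIMITS.items():
        if name in sections:
            words = len(sections[name].split())
            if not lo <= words <= hi:
                flags.append(f"{name} is {words} words (expected {lo}-{hi})")

    # Author-year citations, e.g. "(Smith, 2019)" or "(Smith et al., 2019)",
    # expected in the Introduction and Discussion.
    cite = re.compile(r"\([A-Z][A-Za-z-]+(?: et al\.)?, \d{4}\)")
    for name in ("Introduction", "Discussion"):
        if name in sections and not cite.search(sections[name]):
            flags.append(f"No in-text citations found in {name}")

    # Does the report contain a hypothesis statement? (Crude keyword test.)
    if not re.search(r"hypothes|we predict", text, re.IGNORECASE):
        flags.append("No hypothesis statement detected")

    # Are there in-text references to figures and tables?
    if not re.search(r"\b(?:Figure|Fig\.|Table)\s*\d", text):
        flags.append("No references to figures or tables")

    return flags
```

Because each check is a cheap string or regex operation, a whole report can be screened in well under a second, which is what makes the rapid turnaround described below possible.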
Students could use SAWHET as often as they liked to check their reports. Results were returned within 10 minutes, so students could correct their reports before the documents went to graduate teaching assistants (GTAs) for grading. Only each student's most recently submitted report was forwarded to GTAs.
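Forwarding only the latest version is likewise straightforward bookkeeping. A minimal sketch, assuming each submission carries a student ID and a timestamp (the field names are illustrative, not SAWHET's actual schema):

```python
def latest_per_student(submissions):
    """Keep only each student's most recent submission for grading.

    `submissions` is an iterable of (student_id, submitted_at, report_text)
    tuples, with `submitted_at` orderable (e.g. a datetime).
    """
    latest = {}
    for student_id, submitted_at, report in submissions:
        prev = latest.get(student_id)
        if prev is None or submitted_at > prev[0]:
            latest[student_id] = (submitted_at, report)
    return {sid: report for sid, (_, report) in latest.items()}
```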
Lessons Learned
For us, SAWHET pre-screening reduced both the number of incomplete reports and the instructor time spent identifying mechanical errors.
| Report version | Pre-SAWHET v1 (AY16/17)* | Post-SAWHET v1 (AY17/18)* |
| --- | --- | --- |
| Initial drafts | 64% (n=100) | 94.5% (n=1062) |
| Revised drafts | 89.6% (n=100) | 98.0% (n=967) |

\* Percentage of student reports meeting all five “basic criteria”
| | Pre-SAWHET v1 (AY16/17) | Post-SAWHET v1 (AY17/18) |
| --- | --- | --- |
| GTAs' self-reported grading time (32 reports) | 7.2 ± 3.8 hrs (n=13 GTAs) | 5.3 ± 2.4 hrs (n=15 GTAs) |
Looking Ahead
We are always looking for new automated support tools. If you have a tool or strategy that works for you, please describe it in the Comments for this sub-project. Also check the list of To Do items for the Automated Supports project, and let us know if you want to take on one or more.
Where to Learn More
- Haswell, R.H., 2005. Automated Text-checkers: A Chronology and a Bibliography of Commentary. Computers and Composition Online, Fall, 53.
- Haswell, R.H., 2006. Automatons and Automated Scoring: Drudges, Black Boxes, and Dei ex Machina, in: Ericsson, P.F., Haswell, R.H. (Eds.), Machine Scoring of Student Essays: Truth and Consequences. Utah State University Press, Logan, pp. 57–78.
- Jackson, N.C., Olinger, A.R., 2021. Preparing Graduate Students and Contingent Faculty for Online Writing Instruction: A Responsive and Strategic Approach to Designing Professional Development Opportunities, in: Borgman, J., McArdle, C. (Eds.), PARS in Practice: More Resources and Strategies for Online Writing Instructors. The WAC Clearinghouse; University Press of Colorado, pp. 225–242. https://doi.org/10.37514/PRA-B.2021.1145.2.13
- Reiff, M.J., Bawarshi, A., 2011. Tracing Discursive Resources: How Students Use Prior Genre Knowledge to Negotiate New Writing Contexts in First-Year Composition. Written Communication 28, 312–337.
- Silge, J., Robinson, D., 2017. Text Mining with R. O'Reilly Media, Inc., 194 pp. https://www.tidytextmining.com/