In its July 15/16 weekend edition, the Wall Street Journal ran an ominous story on the growing influence of a few technology companies over every aspect of our lives. The main focus fell on the extraordinary growth of Google, Amazon, Facebook, Apple, and Microsoft, a colossal quintet that makes the old days of the Robber Barons look minor league.
The opening point, however, notes technology’s revolutionary impact on the labor market. When driverless cars come along (currently the most discussed example of automation’s effect on the workforce), the story reports, we may see 300,000 jobs disappear every year. Truck drivers, cabbies, delivery men, movers, driving teachers . . . millions of them will fall out of the workforce. When I mentioned to a friend the other day that the political/culture war ignited by the election of Donald Trump may get more violent as we approach the midterm elections, he shook his head and said, “That’s nothing compared to what’s going to happen when robots take over all the driving. All those men can’t do anything else, and they’re going to be hungry.”
This is one area where, for once, the humanities are safe. Professions involving not just transportation but also calculation, computing, and unskilled factory labor will give way to automation. Cautionary voices such as Nicholas Carr’s won’t slow the process. Robots don’t need health coverage and pensions, and they don’t join unions.
But the humanities presume human contact. The interaction of teacher and student involves much more than the transfer of information. The materials on the table are emotional and value-laden. They touch profound joys and dark ambitions. It is hard to discuss the Grand Inquisitor or watch Lady Macbeth sleepwalking at night (“Yet who would have thought the old man to have had so much blood in him?”) without the human factor coloring the session.
Freudians speak of transference and counter-transference in the psychoanalytic method, and both play a part in meaningful humanities teaching, too. These currents can go awry, as when discipleship grows so strong that the student never comes into his own, but we can’t get rid of them without turning the classroom into a non-humanistic zone.
And so, we can’t replace humanities teachers with automated instruction.
Everything I’ve said up to this point is true—except for the previous sentence. Just this morning I read about a new step in the evolution of humanities automation. The University of Michigan is testing an “automated text-analysis tool” in large lower-level science classes, a program that lets teachers assign more writing while assuming much of the burden of feedback and guidance. Teachers no longer have to sit with students one-on-one and go over rough drafts, a process that can run all day and reach only a dozen or so students. The program does it automatically: as soon as students log on and submit their writing, it responds with corrections and suggestions.
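To make the mechanics concrete, here is a minimal sketch of such a feedback loop in Python. It is emphatically not Michigan’s tool, whose internals the story doesn’t describe; the function name, heuristics, and thresholds below are all invented for illustration.

```python
import re

# A toy sketch of automated writing feedback, NOT the University of
# Michigan's actual tool. Every heuristic and threshold here is invented
# purely to illustrate the feedback loop described above.

def review_draft(text: str) -> list[str]:
    """Scan a submitted draft and return corrections and suggestions."""
    suggestions = []

    # Flag very long sentences, a common readability heuristic.
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        if len(sentence.split()) > 40:
            suggestions.append(f"Consider splitting: '{sentence[:50]}...'")

    # Crude passive-voice check ("was revised", "is graded", etc.).
    if re.search(r"\b(is|are|was|were|been|being)\s+\w+ed\b", text):
        suggestions.append("Possible passive voice; prefer active constructions.")

    # Flag filler words that rarely add meaning.
    for filler in ("very", "really", "quite", "basically"):
        if re.search(rf"\b{filler}\b", text, re.IGNORECASE):
            suggestions.append(f"Consider cutting '{filler}'.")

    return suggestions or ["No issues flagged. Nice work."]

# The moment a student logs on and submits, feedback comes back:
print(review_draft("The thesis was revised by me. It is really very good."))
```

A real system would be trained on corpora and rubrics rather than hand-written rules, but the economics are the same: the marginal cost of the ten-thousandth review is close to zero.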
Now, this is a science class, not a humanities class. Furthermore, the University of Michigan states that computers will not assign grades. The program is an advisory device, not an evaluative one. But anyone who believes that automated advising and grading are not coming soon to all the disciplines doesn’t understand college finances.
Freshman composition is a big problem in administrators’ eyes when it comes to labor productivity. I don’t mean the product of the labor, namely, an articulate sophomore, but the nature of the labor itself. The product is disappointing enough, as we see when we ask teachers across the curriculum how well students write. But the labor is exasperating, too.
When the budget people visualize an instructor in Psychology 101 lecturing to 350 students, relying on three graduate teaching assistants to run once-a-week discussion sections, and assigning grades with multiple-choice tests, they smile. But when they see a freshman comp instructor with a class of 25 students who write six five-page papers during the semester, they see a gross inefficiency. Paying an instructor to spend six hours every other week solely on grading 25 essays looks awfully expensive, especially when the psychology teacher can do the same job and cover 350 students.
Automation is a solution. What took the person six hours to do, the computer can do in a few minutes. If we can pay one composition instructor to teach a class of 200 students, lecturing to the group on general principles of strong writing but using teaching-assistant robots to individualize the instruction, then we don’t have to pay eight teachers to cover those same students in sections of 25. One can envision a stressed-out dean rubbing his hands over the prospect.
It’s already happening in the scoring of standardized tests. I heard of it a few years ago while working for ETS on the GRE. The background was the pressure on schools and governments to bring accountability to student performance in writing, which colleges and businesses constantly deplore. (No Child Left Behind, passed a few years earlier, emphasized testing in reading and math, but not writing.) Enrollments in remedial writing classes were going up, and so was the number of organizations hiring writing tutors for younger workers.
Assessing writing and improving instruction accordingly was going to take a lot of money. The SAT added a writing component in 2005, for example, which meant some 1.6 million pieces of writing had to be read and scored each year. You can imagine the cost of paying temp workers to do so. The State of Illinois, in fact, dropped writing tests in 2005 to save $6 million.
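Machine scoring makes that cost nearly vanish. Real engines such as ETS’s e-rater learn their weights from thousands of human-scored essays and dozens of engineered features, but even a deliberately crude sketch shows how surface features can be mapped to a holistic score. The function, features, and weights below are invented purely as illustration.

```python
# A deliberately crude sketch of machine essay scoring. Real systems
# (ETS's e-rater, for instance) learn weights from human-scored essays;
# the features and weights here are invented purely for illustration.

def score_essay(text: str) -> int:
    """Map crude surface features of an essay to a 1-6 holistic score."""
    words = text.split()
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]

    # Invented features: length, vocabulary variety, sentence fluency.
    length_pts = min(len(words) / 150, 2.0)            # longer essays earn up to 2
    unique_ratio = len({w.lower() for w in words}) / max(len(words), 1)
    variety_pts = min(unique_ratio * 4, 2.0)           # varied vocabulary, up to 2
    avg_len = len(words) / max(len(sentences), 1)
    fluency_pts = 2.0 if 12 <= avg_len <= 25 else 1.0  # readable sentence length

    return max(1, min(6, round(length_pts + variety_pts + fluency_pts)))

# 1.6 million essays become a loop that runs in minutes, not a season
# of temp-worker labor:
batch = ["A short essay.", "A longer essay with rather more varied diction."]
print([score_essay(e) for e in batch])
```

Whatever one thinks of the pedagogy, the budget logic is hard to miss: once the program exists, the cost per essay is effectively zero.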
With these burdens, institutions can’t help but spread computerized grading of writing throughout higher education. It doesn’t matter that current programs have their flaws. Enough of those flaws will be ironed out to justify using the programs, especially when school officials see the savings.
And there is another benefit. It bears precisely on the humanistic nature of the humanities that I highlighted above. What we might take as an enticement, namely, the emotional and psychological nature of humanities teaching, administrators see as a risk. With relations between teachers and students becoming tenser, and with students growing more conscious of offense and discrimination, human-to-robot contact seems not just cheaper but safer. The more instruction and grading can be rationalized and dehumanized, especially in courses that touch upon delicate issues, the fewer complaints, allegations, and lawsuits there will be.
One final consideration. It used to be that parents and students demanded small classes and lots of instructor attention. But from what I’ve seen and heard lately, grades and credentials increasingly matter more than the human touch. Administrators and humanities colleagues at other campuses tell me that income and employment prospects are #1 in the minds of the “customers,” not an intense engagement with professors.
In fact, if they sense that teachers demand too much attention, at least in courses not directly related to their future careers, students drift away. Not many of them will object to receiving instruction through a screen in their dorm rooms at night rather than spending a half hour in a professor’s office in the middle of the day. Millennials prefer it that way.