How AI is Changing Government
Old State House
Hartford
July 12, 2024
The Connecticut Old State House hosted a lecture titled “How AI is Changing Government” on Thursday evening as part of its “Ctrl+Shift+Democracy: The Impact of Technology on Society and Governance” series of talks. While the panel, which featured four experts from academia and government, was ostensibly about the impact of artificial intelligence on governance, it was the discussion of AI and academia that got my noodle cooking.
One of the most interesting questions addressed by the panel was also one of the most basic: What is artificial intelligence?
James Maroney, state senator from the 14th District and one of the legislature’s leading experts on the subject, pointed out that we’ve been using AI for decades. Tools as common as spellcheck and Google Maps fall under Maroney’s definition of AI. He differentiated between what is popularly known as AI and the generative AI and large language models (LLMs), such as ChatGPT, that have recently exploded into our consciousness.
While the rest of the panel was intriguing, it was really the conversation around LLMs that grabbed my attention. I’ve been a teacher or a tutor for most of my adult life, and have worked at nearly every level of education from elementary through undergraduate. I also write for my livelihood, so I follow with great interest the development of machines that can replace me.
During the Q&A session, a professor in the crowd asked whether we’ve established the ethical guidelines necessary to live in a world with LLMs and other increasingly sophisticated AI tools. His comments focused primarily on students using ChatGPT to write college papers for them. Panelist Belinha de Abreu, who teaches in the Journalism & Media Production program at Sacred Heart University, said the problem had become so pervasive that she no longer assigns papers to her students.
“My concern is that students won’t be able to think anymore without asking a computer to do it for them,” she said.
I want to set aside the ethical considerations for a moment (actually, it’s pretty easy: if your professor tells you not to use ChatGPT, then don’t) and address what teachers, professors and other academics worry about when it comes to LLMs and the production of academic work. I think educators need to hold students accountable for producing accurate work. LLMs are notorious for providing inaccurate information at best, and at worst for simply making things up out of whole cloth. So yes, educators should grade students harshly when they turn in a ChatGPT essay riddled with factual errors, just as they would if the student had written it themselves.
But the issue of ChatGPT writing the essay itself? I’m far more sanguine about that than many other educators are. I’m just not convinced that knowing how to write a five-paragraph essay is that important a skill. It’s not something anyone ever does outside of an academic setting, and the skills used to write one are not as transferable as students are led to believe. When I write for a news outlet, I use a completely different style guide, a different methodology and a different approach to references and verification than I ever used in school. Most people who write outside of school do so for pleasure or for technical work, which involves yet another set of style guidelines that bear little resemblance to academic style.
Perhaps a better way to approach the “problem” would be to let students use ChatGPT to write first drafts, then have them fact-check the information contained therein and rewrite a final draft. The skill of critically evaluating information for accuracy would still be developed, while sparing students the mundane task of constructing topic sentences and transitions for every paragraph.
There’s still a lot to talk about regarding the use of AI as a tool in academia, rather than as a boogeyman that’s going to shrivel the brains of people who rely on it. The same arguments were made when calculators became ubiquitous. Technological change always displaces old ways of doing things. When I was a third-grade teacher, you wouldn’t believe how many people asked me why we no longer teach children cursive, or how to read analog clocks. Those skills simply aren’t as necessary anymore thanks to word processors and digital clocks. I mean, we don’t read sundials anymore, right?
Still, perhaps some part of our critical thinking ability was lost when we stopped doing math by hand and looping our letters. Maybe some part will be lost if we stop writing essays the old-fashioned way. But the tradeoff in terms of ease and time saved seems worth it to me. I suspect we’ll be saying the same thing about essay writing in 30 years, but I’ve been wrong before.
I loved this lecture because it got me thinking about where I stand on issues I hadn’t really considered before. I like to write my stories soon after events so that they’re fresh in my mind, but I found myself pondering the points that Professor de Abreu and the other panelists made for hours. In the end, I don’t have any better answers than they did, but it was worthwhile to think about.
NEXT
The next lecture in the “Ctrl+Shift+Democracy” series, Digital Divide/Digital Equity, takes place Thursday, July 25, at the Old State House.
Jamil has done enough thinking for one week. See you next week!