Wesley R. Bishop
Wesley R. Bishop is a historian, writer, and editor living in northern Alabama. He is the
founding and managing editor of North Meridian Press.
1.
In Rome late last August, I met a Buddhist monk who was also a computer scientist.
The air was oven-hot and ready to melt
and I was dehydrated and desperate for AC and cool shade and iced drinks
but we were both attending a panel on thought and AI
but the presenting professors did not show
and grumbling from us, the audience, the few who braved the Roman air, vibrated in that cruel
dry space.
As we left, I chatted with the monk philosopher computer scientist
and we noted how just that month Meta was pushing a Bot that would summarize and comment
on content.
Was this intelligence?
No clear answer, and the monk was annoyed at the heat and the truancy of his colleagues and
the fact that computers were proving so hard to pin down
and I found it funny, a monk annoyed and a Bot that would patiently explain everything to me as
I shuffled from the conference heat to the cooling night streets and eventually the Trevi Fountain.
Late that night, after my credit card had been declined (the AI protection software wondered how
an Alabama resident could be so far from home), I walked the numerous blocks back to my hotel
room, that slice of Rome I was renting.
I toyed with the Meta Bot, asking it questions about consciousness, labor unions, and sex.
Finally, back in bed, I told my phone to set an alarm for morning when the sunlight would trip
the wires and wake me for another day of philosophers in Rome.
2.
It’s the end of the semester and my students and I are set to debate the death penalty in
Alabama—Pros and Cons— but instead we talk about the ethics of AI since the admin is pushing
a survey to see how much we are all using AI to gauge our comfort and no doubt measure how
fast they can move to implement it for cost saving measures because the president and provost
just got thirty-five percent raises while the janitors continue to fight for a living wage and the
president and provost say they will not meet with their union, no way no how, but they will want
us to meet them in the survey to talk about robots and how comfortable we are using them for
work and my students have mixed thoughts, most saying AI is bad but then I ask how many of
them used AI to do their assignment on the death penalty and they all say they did not despite the
AI on my computer flagging five for using AI and as we dig deeper into the conversation some
students admit they use AI selectively like to find articles and books and summarize articles and
books and maybe even to outline a paper or two but no, oh no, the words in their papers are all
theirs they really do support the death penalty, after all, would throw the switch, after all, and
administer justice through killing without question and I ask how they can trust AI to summarize
articles and books and outlines and they say they know because they know how to read but I say
they admit they have not read the articles or the books and they admit this is a problem but they
wouldn’t have read the articles or books anyway so what difference does it make and one student
who has proudly stated all semester they are going into politics says they don’t have time to
research and AI is great because it allows them to move fast, so fast really, through their work so
they can focus on their calling of going into government and advocating for important things like
keeping the death penalty and the survey dings again in my inbox, the provost wonders why I
haven’t taken it myself and really, if I don’t take it I’m probably not telling my students about it,
and really the students need to hear about it because we are an engaged community and we need
the feedback to justify the decision that was already made by admins not in labor unions, who
refuse to meet with labor unions, who see labor as dirty and time consuming and best left to the
peons of Alabama or the robots of Silicon Valley, because they know what they would have read
is true and I can’t stop thinking of the overwhelming support for the death penalty in my class
because they weighed all the evidence, except they didn’t, but it didn’t matter because how
would any of that have changed their minds?
3.
That first classroom was a converted storage unit painted corporate gray. There were no
windows, just a row of refurbished computers blinking like cricking frogs sounding in the night.
The software gave each student a maze: math, history, grammar. Didn’t matter what they wanted
to learn. The machine decided.
I remember one boy, quiet and brilliant in flashes, whose thoughts had begun to fray like
damaged clothing.
He kept his hood up, his head down, and one day asked, barely audible, “Is the computer talking
to me?” The screen was silent. The fan only hummed. Outside, I knew plastic bags cartwheeled
through dead patches of grass where we’d let off rockets for a science project the week before.
I didn’t know what to say. I think about him now when people talk about AI hallucinations, how
the systems sometimes make things up. Tuning to spring light and rust, leaning toward any
flicker that resembles voice.
That room smelled like dry-erase markers and baking computer plastic. And I don’t know what
to say still, about the subject of AI talking and the hallucinations it may provoke.