New applications of artificial intelligence technology have generated great excitement, but they have also created expectations that AI will disrupt many work processes. Because AI produces written material or images that are not the direct output of humans, its use raises many questions: when is it appropriate and professional, and when is it ethical?
At the SC Municipal Finance Officers, Clerks and Treasurers Association Spring Academy, Urica Floyd, the Municipal Association of SC staff associate for training and learning solutions, demonstrated several AI services now in common use. The potential impact of AI, she said, will be much like that of other recent technology advances.
“What AI is going to do for us is the same as what computers did for us in the early 1990s. It’s just a transition from one tool to another, that’s going to assist us in how we work every day.”
The key point, she said, and perhaps the first rule of using AI, is that it can assist humans, but it cannot fully replace human effort and human judgment.
“You have to be involved,” she said. “You can’t just accept the results that AI spits out to you, you have to review. You have to read it carefully and edit all the generative AI content that you received.”
She first looked at ChatGPT, a conversational tool, or chatbot, that answers questions by generating human-like responses and written material. In the municipal government sphere, that could mean creating a starting point for draft documents like letters or proclamations, or condensing longer documents into summaries, although users need to be careful not to upload or input any confidential or sensitive information.
Floyd illustrated ChatGPT’s function by prompting it to create names for a hypothetical food truck event organized by the City of Columbia. The prompt included some guidance, noting that both “Cola” and “Soda City” are common nicknames for Columbia.
ChatGPT answered the request with “Soda City Street Eats,” “Cola Cuisine Carnival” and “Capital City Chow Down,” as well as with potential written descriptions of such an event.
Floyd next reviewed Otter.ai, a tool that transcribes spoken language into written text and creates summaries of what was discussed. This makes it capable of processing recordings of meetings, lectures or interviews, and it can also integrate with virtual meeting services like Zoom, Microsoft Teams or Google Meet. While it can save workers time by rapidly creating transcriptions, once a laborious process, its results are not always completely accurate.
Demonstrating its abilities, she had it transcribe a brief recorded conversation of young children discussing caterpillars and butterflies. It transcribed their words, including the potentially challenging phrase “the caterpillar is making a chrysalis.” It did not correctly detect that multiple children were speaking, however, as their voices had a similar pitch.
Finally, Floyd examined Microsoft Copilot Image Generator, which can work from text descriptions to generate visuals for presentations or publications, although there is an emerging expectation that AI-generated images be labeled as such. She used it to create an image of a female government employee at work, making a series of adjustments to the woman’s appearance and even changing the art style so that she resembled a comic book character.
When used with skill and discernment, Floyd said, AI allows people to “work smarter, create content and improve productivity.”