With Artificial Intelligence (AI) advancing by leaps and bounds seemingly every day, it’s easy to have a bit of apprehension mixed in with the excitement of seeing what happens next, according to Aric Mitchell.
“The real threat is right here with us today,” Mitchell told a group during a presentation June 29 for the Fort Smith Regional Chamber of Commerce. “It’s in how we use it. The scary thing is if companies are using it to augment expertise or to see how many jobs can be cut.”
Mitchell, the public affairs officer for the Fort Smith Police Department, is a self-proclaimed AI nerd.
For the first time, in May of this year, there were 4,000 jobs reported lost directly due to AI, Mitchell said, noting that office jobs and legal jobs are and will be the most impacted. Sam Altman, OpenAI CEO, has admitted that even he is a little scared that AI will “eliminate many jobs” and is especially concerned about the speed with which it might happen, Mitchell said.
“Job displacement, that uncertainty, is for me the Terminator,” Mitchell said. “It can go real bad, especially if we go crazy pulling jobs out of our community. Everyone has to consider their role as a community partner. I am just hoping the jobs lost will be with jobs that AI opens up.”
Another concern is cybersecurity, he said, noting that AI requires large amounts of data to operate as it should.
“Who is collecting, storing and using this personal and sensitive information?” Mitchell said.
He said as AI grows and is used more and more, it could lead to advanced cyberattacks, attacks on AI systems, autonomous attacks, infringement of privacy and civil rights abuses.
“And as AI is used to make more videos and those videos are released, what might happen? What are we going to do when we can’t differentiate video? What do we do when image generation gets better, when someone can generate a real-looking video of Aric Mitchell robbing a store?” he asked.
But even with the concerns and fears, AI can be an important tool for many businesses, he said. Businesses, non-profit organizations and government agencies can incorporate it into grant writing.
“The qualifications that have to be met, the instructions that have to be followed, AI can do it. It can read, understand patterns and then generate new word patterns that copy that. It’s learning how we think,” Mitchell said.
Police departments could use AI to help responders write reports, Mitchell said, noting that officers often spend a large amount of time on them. If AI could write a good report in a few minutes that officers could review, correct and submit quickly, it would allow them to answer more calls and help more citizens, he said.
“We could take that off the responders, so they can do what we hire them to do,” Mitchell said.
But AI isn’t the answer to everything. What AI cannot do, he said, is pick up on nonverbal cues or apply expertise and intuition.
“We can train it to do our job better, but it still needs us to operate. It learns from trial and error,” he said.
What companies need to do is learn how to implement the tool. They need to be clear about their motives and put employees’ minds at ease, he said.
“We need clear, ethical guidelines. We don’t need everyone being willy-nilly with this,” he said. “The key is to keep your human hands on it. Don’t just trust what it pops out at you. Just as soon as you do, it will be wrong, and that will be embarrassing.”