Positioning UA for an AI Future

AI generated image using Adobe Firefly with the prompt: A robot elephant covered in computer circuits standing on a red surface with a laptop.

Artificial intelligence is generating headlines at a breakneck pace, and the science behind it is advancing just as quickly. In just a few years, its influence has passed “trending” status and become entrenched in technology and our everyday lives.

The University of Alabama is taking steps to ensure The University and its students are at the forefront of the AI wave. Artificial intelligence is no longer confined to science fiction, but even though we use it daily, the term may mean more — or less — than many people think.

Building AI Capability

What AI isn’t, says Allen Parrish, executive director of the Alabama Cyber Institute at The University of Alabama, is a reason to worry.

“Many universities have had a defensive posture about it,” Parrish said. “But instead, we need to think about the ways AI will change our society and workforce and be prepared to educate our students in a way that helps them seize that opportunity.”

The University has begun to move forward on a project that Parrish views as the gateway to a new era of computing at UA: the High Performance Computing and Data Center, announced in 2023.

Dr. Allen Parrish

The $100 million facility will scale up The University’s computing capacity roughly a hundredfold. For a sense of scale, the HPC will have 125 racks to house servers. The University’s current capacity would fit in one of these racks.

“In a sense, we will be scaling our ability to do high-level research by 100,” Parrish said. “We’ll be able to take in a much larger amount of data, and we’ll be able to handle it in-house.” This opens up new possibilities with projects requiring a higher level of security, for example. “At the same time, we can focus on new ways of teaching our students how to work with AI, which will be an important part of the job market going forward.”

The HPC will house a petascale computing system, meaning it will be capable of performing at least one quadrillion calculations per second.

Researchers across campus use the power of data for tasks as varied as flood prediction, designing chemical catalysts, and improving our understanding of how and when structures fail. The HPC will enable a higher number of these projects and increase the speed and power available to drive them. The HPC can also support arts and visual learning like the work of Dr. Jennifer Feltman, an art historian who helped build a virtual reality experience of the Notre Dame cathedral in Paris for middle school students.  

What we’re seeing today in AI is a microcosm of what we’ll see in the next five years, and we want to be ready for it.

Dr. Allen Parrish

“This project is for the whole campus,” Parrish said. “What we’re seeing today in AI is a microcosm of what we’ll see in the next five years, and we want to be ready for it.”

Defining the Next Generation of Artificial Intelligence

The term artificial intelligence itself encompasses a lot of ideas. Some of those, like self-aware robots bent on world domination, lie firmly in the realm of fiction. Dr. Sergei Gleyzer is an associate professor in the Department of Physics and Astronomy who uses AI to probe the nature of the universe. He is among the individuals worldwide who are pushing the boundaries of the field itself.

A more accurate term for what we mean when we talk about AI is “machine learning,” he says. It’s merely the science of algorithms that learn from data and then use that analysis to make predictions about new data. A simple example might be separating images of cats from dogs. The more times the model completes this task and is corrected when it makes an error, the more successful it will be at identifying cats versus dogs in the future.
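That learn-and-correct loop can be sketched in a few lines of code. The toy Python example below trains a simple perceptron, one of the oldest machine learning algorithms; the “features” (body weight, ear length) and the data points are invented for illustration and have nothing to do with any real classifier or UA project.

```python
# A minimal sketch of supervised learning: a perceptron that learns to
# separate "cats" (label 0) from "dogs" (label 1) from made-up features.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Adjust weights a little each time the model guesses wrong."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred              # the correction signal: 0 when right
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(model, x):
    w, b = model
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Toy features: (body weight in kg, ear length in cm)
cats = [(4, 6), (3, 5), (5, 7)]
dogs = [(20, 12), (30, 10), (25, 14)]
model = train_perceptron(cats + dogs, [0, 0, 0, 1, 1, 1])

print(predict(model, (3, 6)))    # a cat-like animal -> 0
print(predict(model, (28, 11)))  # a dog-like animal -> 1
```

Each wrong guess nudges the weights toward the correct answer, which is exactly the “corrected when it makes an error” behavior Gleyzer describes.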

Around 2010 came the “deep learning revolution.” A class of algorithms based on how our brains work, called neural networks, can be layered for more sophisticated decision-making processes. Technical innovations made it possible to build deeper neural networks with more layers.
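The idea of layering can be made concrete with a tiny example. The Python sketch below hand-wires a two-layer network that computes XOR (output 1 only when its two inputs differ), a classic function that a single layer of neurons cannot represent; the specific weights are illustrative choices, not taken from any real system.

```python
import math

def layer(inputs, weights, biases):
    """One neural-network layer: weighted sums squashed by a sigmoid."""
    return [
        1 / (1 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

# Hand-picked weights so that stacking two layers computes XOR,
# something a single layer of neurons cannot do.
hidden_w = [[10, 10], [-10, -10]]
hidden_b = [-5, 15]
out_w = [[10, 10]]
out_b = [-15]

def xor_net(x1, x2):
    hidden = layer([x1, x2], hidden_w, hidden_b)  # layer 1
    out = layer(hidden, out_w, out_b)             # layer 2
    return round(out[0])

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
```

In real deep learning these weights are learned from data rather than set by hand, and modern networks stack dozens or hundreds of such layers, but the principle is the same: each layer builds on what the previous one computed.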

Dr. Sergei Gleyzer

This was the development that propelled machine learning forward in areas like image processing and natural language processing and laid the basis for generative AI models like ChatGPT and Midjourney.

Gleyzer is active in the development of deep learning algorithms for use in physics and astronomy, including work at the Large Hadron Collider at CERN near Geneva, Switzerland. The next frontier, he says, is quantum machine learning.

Quantum computing is still in its infancy, but it is learning to crawl. In classical computing, information is coded in ones and zeroes: binary code. Quantum computing, Gleyzer said, makes it possible to encode data in a superposition, a weighted combination of zero and one in which both exist as possibilities. That is the quantum state.
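The weighted combination Gleyzer describes can be simulated on an ordinary computer. The short Python sketch below (purely illustrative, not how real quantum hardware works internally) represents one qubit as a pair of amplitudes: the squares of their magnitudes give the probabilities of measuring 0 or 1, and a measurement collapses the superposition to a single classical bit.

```python
import math
import random

def qubit(alpha, beta):
    """A qubit state as two amplitudes, normalized so probabilities sum to 1."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def probabilities(state):
    """Chance of reading out 0 or 1: the squared magnitude of each amplitude."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

def measure(state, rng=random):
    """Measurement collapses the superposition to a classical bit."""
    p0, _ = probabilities(state)
    return 0 if rng.random() < p0 else 1

# An equal superposition: zero and one each carry 50% probability.
plus = qubit(1, 1)
print(probabilities(plus))  # approximately (0.5, 0.5)
print(measure(plus))        # randomly 0 or 1
```

A classical bit would be `qubit(1, 0)` or `qubit(0, 1)`; everything in between is the superposition that gives quantum algorithms their distinctive power.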

“This is not a theoretical thing. Quantum computers already exist,” Gleyzer said. And he is writing algorithms for them.

On the left is a multi-tiered metallic machine with complicated parts. To the right is an illustration of a bit next to a qubit.
Quantum computer with a visual representation of the qubit, the basic unit of data in quantum computing.

Quantum computing today is at a stage comparable to classical computing in the 1950s: the technology is bulky and expensive, and few people have access to the machines that have been built. “We asked, if we had a quantum computer, could we design better machine learning algorithms for that hardware? And that is what we are doing,” Gleyzer said. Although quantum machine learning is less than a decade old, Gleyzer’s group has already designed quantum versions of machine learning algorithms that perform as well as or better than comparable algorithms for classical computing.

Both classical and quantum machine learning will have a place in the future, Gleyzer believes, much as radio and TV still have uses alongside the internet. But as the nature of computing evolves, he and his students at The University of Alabama are on the front lines.

Alabama Generative AI Task Force

UA is also at the table as Alabama’s state leaders look at both the possibilities and the cautions that come with widespread use of AI tools.

Dr. Matthew Hudnall, assistant professor of management information systems, has 20 years of experience with Alabama’s data systems. When you register your vehicle at the DMV, that documentation is handled with software he helped write. Earlier this year, Hudnall joined Gov. Kay Ivey’s task force on generative AI.

“Our task is to develop a set of recommendations and a plan that leverages generative AI in the most optimal but secure ways possible for the betterment of the state of Alabama,” Hudnall said.  Four working groups within the task force meet regularly and hammer out recommendations that will go into a final report.

Dr. Matthew Hudnall, assistant professor of management information systems, is on the governor's artificial intelligence task force.
Dr. Matthew Hudnall

Generative AI is only a small part of the broader field of machine learning, but it is widely available and at the forefront of popular culture. The unique feature of generative AI is that the models, once trained, do not make predictions about unseen content. They create new content.

Helpful generative AI assistants now accompany search engines and software at every turn. Need help drafting a routine email? Let your AI assistant help!

Regulations like HIPAA and FERPA protect our sensitive data, but there is an emerging and urgent need for AI-related guidance on the varying levels of sensitive information that state employees might handle. “You don’t want to, for example, send tax data for a citizen of Alabama to ChatGPT, because there’s a data security concern,” Hudnall said. The final report will include a data matrix one of his working groups devised, based on a system used by UA’s Office of Information Technology, to help advise state agencies on the level of sensitivity of data.

From process automation to quicker access to data, I only see [generative AI] as becoming more integrated into our society.

Dr. Matthew Hudnall

Each of the four working groups will provide similar recommendations for various aspects of generative AI.

As an offshoot of his work with the task force, Hudnall is guiding a group of UA graduate students in the creation of a large language model designed to answer questions about the Alabama legal code. The project will give students a front-row seat to the creation of data policy and a hands-on learning opportunity.

The students are using a large language model created by other developers and made available for free online. They downloaded the model and trained it on the Alabama code of law. Once training is complete, anyone could use the software to answer questions about the most obscure Alabama statute in seconds. And because it is hosted and run on local hardware, it doesn’t feed queries or training data back to any outside party.
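One ingredient of systems like this is finding the statute sections relevant to a question before the model answers, a step that can run entirely on local hardware. The toy Python sketch below is not the students’ system and uses no language model at all; it simply illustrates matching a question against locally stored text by word overlap, with invented placeholder snippets standing in for the actual code of law.

```python
import string

def tokenize(text):
    """Lowercase words with surrounding punctuation stripped."""
    return {w.strip(string.punctuation) for w in text.lower().split()}

def best_section(question, sections):
    """Return the section whose text shares the most words with the question."""
    q = tokenize(question)
    return max(sections, key=lambda s: len(q & tokenize(s["text"])))

# Invented placeholder snippets, not quotations of the actual Alabama code.
sections = [
    {"id": "13A-1-1", "text": "placeholder text about criminal offenses"},
    {"id": "40-1-1", "text": "placeholder text about taxation and revenue"},
]

hit = best_section("Which section covers taxation?", sections)
print(hit["id"])  # -> 40-1-1
```

Because both the question and the stored text stay on the machine running the script, nothing is transmitted to an outside service, which is the security property the student project is meant to demonstrate.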

This student project will be included in the report as an example of a secure way to harness the power of generative AI.

“From process automation to quicker access to data, I only see it as becoming more integrated into our society,” Hudnall said. “It’s incumbent upon us as an institution to make sure our students are equipped with these tools.”