While no one can predict what the art of the future will look like, it seems increasingly clear that at least some of it will be created with the help of artificial intelligence. A growing number of visual arts programs at independent art colleges and universities now offer courses—and in some cases, entire degree programs—focused on advanced technology. These include Art Center College of Design in Pasadena, the California Institute of the Arts in Santa Clarita, the Cleveland Institute of Art & Design in Ohio, the Eskenazi School of Art, Architecture + Design at Indiana University in Bloomington, Parsons School of Design in Manhattan, the Rhode Island School of Design in Providence and the Savannah College of Art and Design in Georgia. Students graduating from these and similar programs are unlikely to abandon what they’ve learned once they leave school.
What exactly are they being taught? Rick Dakan, chair of the Emerging Technology Committee at Ringling College of Art and Design in Sarasota, Florida, described artists who make “digital images with complex workflows that incorporate A.I. tools in their multi-step process. They start with a hand-drawn sketch or drawing, put that in A.I. to get a more detailed color rendering, take that back out and paint over it in Photoshop or Illustrator”—image editing software first developed in the 1980s—“then take that image and use an A.I. tool to make a three-dimensional rendering of it and print a model with a 3-D printer.”
Caleb Weintraub, director of the Eskenazi Technology and Innovation Lab in the School of Art, Architecture + Design at Indiana University, described one of his own creations, a project called “Speculative Portraiture,” in which “a voice becomes a picture. The finished works are hand-painted portraits on panel, not prints or screen outputs.” He works with A.I.-generated voices—sometimes his own or those of consenting colleagues and family members—to explore what images of human heads are produced when different vocal features are fed into an A.I. program. “I then make a series of gesture drawings using the images as references, as if encountering the same person in different moments.”
Tom Leeser, founding director of the Art and Technology Program and director of the Center for Integrated Media at the California Institute of the Arts (CalArts), described the initial artistic use of A.I. as “making images from prompts, asking a program to generate something based on a question. For instance, make me a sunny day and then you’ll get any number of images of a sunny day.” The next steps, however, can be far more technical.

Scott Benzel, a faculty member in CalArts’ Program in Art and Technology, recalled a 2017 piece he created, Mathesis and Mathematikoi, for string quartet, dancers and telescope operator. The “score was composed using earlyish generative A.I., including Noatikl and Wolframtones, and was based on Hubble’s Constant, the law of cosmic expansion that Edwin Hubble discovered using the telescope at Mt. Wilson in 1929. Other elements of the piece, including the motion of the telescope dome and the ‘counting’ elements of the dance, were determined using similar generative methods. For the score, I chose and edited the generative output into something that I found pleasing. The score turned out to be too complex, unplayable by humans, and had to be simplified by the score conformer and the players.”
While the process may sound playful, these and other professors stress that A.I. is a tool—not the final product. “Outputs generated solely by A.I. systems are rarely treated as finished work,” Weintraub said, adding that “legal and ethical questions—authorship, attribution, data provenance—are part of the broader critical conversation in courses that engage with A.I. tools.”
Flynn, the first A.I. art student, in a classroom at the University of Applied Arts Vienna. Courtesy Malpractice
Ry Fryar, assistant professor of art at York College of Pennsylvania, explained that his courses in the Digital Art and A.I. major teach students to use A.I. tools as part of a larger creative process. “The focus is on creativity itself, because without that, the results are common, therefore dull and fundamentally inexpert. We work with students on how to guide A.I. tools at a professional level, stay aligned with developing good practices and understand current copyright law, ethics and other standards for responsible A.I. use.”
Artificial intelligence, particularly generative A.I.—which produces text, videos, images and other forms of data—has been a source of both fascination and anxiety in recent years. On the worry side, high school and university faculty in the humanities struggle to determine whether written assignments were completed by students themselves or with ChatGPT or another generative A.I. program. Existing detection tools are unreliable, and educators are hesitant to accuse students of misconduct without solid evidence. Jessica Sponsler, assistant professor of art history at York College of Pennsylvania, noted that “I’m a medievalist. I can read Latin,” but added, “I’m dealing, we’re all dealing, with A.I. We are in the middle of a change period, and most faculty here acknowledge that students are using ChatGPT. Our job is to help them use A.I. in a way that improves their skills and doesn’t substitute for those skills.”
Andrew Shea, associate dean of the School of Design Strategies at Parsons School of Design, said that the college isn’t teaching students to code or create a “large language model,” nor to “push a button and get an answer.” Instead, “It’s about teaching them how to think about when, why and how they might use A.I., if they need to at all. If they do use it, students are expected to learn how to use it intentionally, reflect on its impact on their process, and keep their own voice and vision at the center. I tell them A.I. is like a spice—it can add complexity or shift a flavor, but it can’t be the whole meal. We’re trying to develop better thinkers and makers. Heavily using A.I. often limits both of those from happening.”
Outside academia, concerns extend further. Will entire fields of employment be replaced by A.I.? Generative A.I. has already become a source of companionship for isolated young people and the elderly, sometimes with troubling consequences. Lawsuits by Getty Images and several artists against Stability AI (creator of Stable Diffusion), Midjourney and DeviantArt (maker of DreamUp), alleging copyright infringement, have been winding through the courts for years. Meanwhile, the U.S. Copyright Office continues to wrestle with whether an artwork created using A.I. qualifies for copyright protection, since only human-created works are eligible.
Basil Masri Zada, assistant professor of digital art and technology and head of the Center for Advanced Computing in the School of Art + Design at Ohio University, said convincing faculty, administrators and legal counsel that they weren’t encouraging cheating or risking legal exposure “took us a year to make sure that we are doing things correctly. We had to educate our own school about what this all is.”
While colleges and universities in the U.S. appear to be embracing A.I. in studio art courses, not everyone is on board. This fall, the University of New South Wales in Australia introduced a course in its School of Art and Design titled “Generative A.I. for Artists,” prompting a petition signed by more than 7,000 students demanding its cancellation. “Our key objection is that the course encourages and requires students to use generative A.I. to create art,” said Robin, a fourth-year student in the university’s fine art degree program who asked that their last name not be used. “As detailed further in the petition, there are numerous ethical, environmental and humanitarian issues with the use of generative A.I., and we do not believe it is appropriate for a university to be encouraging it.”
Robin called the idea that A.I. is the inevitable future “defeatist,” adding, “we’ve seen how campaigns and legislation around technology, particularly in the E.U., have served to protect people and the environment. Why should we give up hope with generative A.I.? We don’t need to quit before we’ve even started.” A university spokesperson defended the course, noting that it “critically explores the creative and ethical questions raised by A.I., rather than simply promoting or celebrating its use.”
Whether to teach A.I. and how to teach it remains a matter of debate across academia, though the question is increasingly less about if than about how. Andrew Shea said, “The deeper issue is authorship. And this is what we focus more on at Parsons with our students: Are you owning your process? Do you decide what to keep, what to reject and what meaning to shape? Or do you simply hand all of that over to systems built on the unpaid labor of countless others?”
Will A.I. in the arts endure, or is it just the latest trend? Fifty years ago, art students often entered school aspiring to become painters or sculptors but shifted to film and video. Twenty years ago, they turned toward social practice. Now, courses and degrees in artificial intelligence are redefining creative education once again—shaping students’ thinking and offering seemingly new career paths.
In recent years, university studio art programs—and especially independent art colleges—have expanded their offerings in every direction. The Savannah College of Art and Design, for instance, has degree programs in Equestrian Studies and the Business of Beauty and Fragrance. The Maryland Institute College of Art allows students to minor in cartooning. Ringling College of Art and Design offers an undergraduate degree in Business of Art and Design, and CalArts offers a graduate degree in Aesthetics and Politics. The Rhode Island School of Design has even introduced its first-ever tattooing course, which could one day lead to a full program. Maybe the proliferation of A.I. courses in art schools is simply part of higher education’s version of throwing spaghetti at the wall to see what sticks.