
Brave new technology: A&S experts provide insight into generative AI tools like ChatGPT and DALL-E


This image was created by directing DALL-E to produce an image of “the University of Toronto in the style of van Gogh’s The Starry Night.” (Image: DALL-E/directed by Chris Sasaki.)

As artificial intelligence (AI) continues to rapidly advance, there has been a surge in the development of AI-powered content creation tools like ChatGPT and DALL-E that offer users a range of personalized experiences. However, with this growth come concerns about the potential dangers and ramifications of such apps, from privacy concerns to the displacement of human workers.

For example, the previous paragraph was written by ChatGPT for this story, illustrating the blurring of lines between AI- and human-generated content. And the accompanying image was created by directing DALL-E to produce an image of “the University of Toronto in the style of van Gogh’s The Starry Night.”

In recent months, headlines have appeared on an almost weekly basis outlining the issues surrounding generative AI tools and the content they produce. Illustrators, graphic designers, photographers, musicians and writers have expressed concerns about losing income to generative AI and having their creations used as source material without permission or compensation; they also argue that the resulting work lacks originality, artistry and soul.

Instructors are having to cope with students submitting work written by ChatGPT and are re-evaluating how best to teach and assess courses as a result. Institutions like U of T are examining the ramifications of this technology and providing guidelines for students and instructors.

Some scientific journal publishers are requiring authors to declare the use of generative AI in their papers, while other publishers forbid its use entirely, characterizing it as “scientific misconduct.”

At the same time, it hasn’t taken long for the tone of headlines to change from dystopically fearful to cautiously constructive. Many experts point out that the technology is here to stay, and our focus should be on establishing guidelines and safeguards for its use; others look to its positive potential.

A&S News spoke with members of the Arts & Science community and asked them what they think about generative AI tools, and what we need to do about them.

Here’s what Assistant Professor Ashton Anderson had to say:

We are increasingly seeing AI game-playing, text generation and artistic expression tools that are designed to simulate a specific person. For example, it is easy to imagine AI models that play in the style of chess champion Magnus Carlsen, write like a famous author, or interact with students like their favourite teaching assistant. My colleagues and I call these mimetic models, because they mimic specific individuals, and they raise important social and ethical issues across a variety of applications. They affect the person being modelled, the “operator” of the model and anyone interacting with the model, and they can be used either as a means to an end, e.g., to prepare for an interview, or as an end in themselves, e.g., to replace a particular person with their “digital doppelganger.”

Will they be used to deceive others into thinking they are dealing with a real person, say a business colleague, celebrity or political figure? What happens to an individual’s value or worth when a mimetic model performs well enough to replace that person? Conversely, what happens when the model exhibits bad behaviour, and how does that affect the reputation of the person being modelled? And in all of these scenarios, has the person being modelled given consent? It is vital to consider all of these questions as these tools increasingly become part of our everyday lives.

For more, read Anderson and colleagues’ research article, Mimetic Models: Ethical Implications of AI that Acts Like You.

Read more at A&S News for additional faculty insights on the technology.

— Original story by Chris Sasaki for A&S News