
Community Voice: Campus Survey Results


Binghamton University Opinions

Introduction: Our Method

We designed a survey for the Binghamton University community to gather opinions on AI-generated art. The survey is divided into three parts:

  • The first part collects basic demographic information, such as participants’ roles within the Binghamton University community, their major or academic area, and how frequently they’ve encountered AI-generated art over the past year.
  • The second part is an AI vs. human art guessing game with 11 questions. In each question, participants are shown two visually similar images and asked to identify which one was created by AI.
  • The third part includes a series of questions addressing ethical concerns related to AI art, using both Likert scale and open-ended responses.

We promoted the survey through multiple channels to reach as many members of the BU community as possible:

  • We contacted faculty members to request permission to share the survey link via email through departmental listservs.
  • We designed and posted visually appealing, theme-relevant flyers, each with a convenient QR code linking directly to the survey.
  • We shared the survey in social media chat groups with large numbers of BU students.
  • We promoted the survey in person at public spaces on campus, such as the Marketplace, to engage with passersby directly.
  • We also reached out to our personal networks—including friends and familiar professors—for help spreading the word. *A special thanks goes to Professor Fetten, who generously offered survey participation as an extra credit opportunity for her yoga class!

Who Took the Survey?


[Pie chart: breakdown of survey participants by role]
[Bar graph: class years of the students who participated]

AI Art Guessing Game Results


[Chart: Binghamton University community's AI vs. human art scores]

The most common score was 9 out of 11, meaning the majority of participants correctly identified most of our AI-generated images.

AI Usage Among Participants and Interest in Using AI


[Pie chart: use of AI tools to create art]

About 28.2% of participants reported using AI tools like Midjourney or DALL·E.

[Pie chart: interest in using AI to create art]

About 43.7% of participants reported no interest in using AI tools to create art.

Participant Attitudes Toward AI Art


Participant Thoughts in Their Own Words

What are your initial thoughts about AI-generated art?

[Word cloud: positive and negative keywords from responses]
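A rough sense of how a word cloud like this one (and the others below) can be generated is sketched here. It is only an illustrative sketch, not the tool used for the figures on this page: the `answers` list is an invented placeholder for the open-ended responses, and the example assumes the third-party `wordcloud` and `matplotlib` Python packages.

```python
# Minimal sketch: build a word cloud from open-ended survey answers.
# The `answers` list is a hypothetical placeholder, not real survey data.
from wordcloud import WordCloud
import matplotlib.pyplot as plt

answers = [
    "impressive but a little unsettling",
    "worried it devalues the work of human artists",
    "useful as a tool, not as a replacement for creativity",
]

# Word frequency across all answers determines each word's size in the cloud.
cloud = WordCloud(width=800, height=400, background_color="white").generate(" ".join(answers))

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```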

Do you have any concerns about AI-generated art?

[Network graph of word-to-word connections, highlighting the most frequently occurring word pairs from the responses]

*The whiter the arrow, the stronger the connection between the words.
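For readers curious how a word-pair network like this can be built, the sketch below shows one minimal approach: count adjacent word pairs (bigrams) across the open-ended responses and treat the counts as edge weights, which the figure renders as brighter or dimmer arrows. This is an illustration only, not the pipeline used for the figure; the `responses` list is a hypothetical placeholder and the stopword list is deliberately tiny.

```python
# Minimal sketch: count adjacent word pairs (bigrams) in open-ended answers.
# The `responses` list is a hypothetical placeholder, not real survey data.
import re
from collections import Counter

responses = [
    "I am concerned about copyright and artist consent",
    "copyright and compensation for artists worry me",
]

stopwords = {"i", "am", "about", "and", "for", "me", "the", "a", "of"}

def tokenize(text):
    """Lowercase, keep alphabetic tokens, drop common stopwords."""
    return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stopwords]

# Each adjacent pair becomes a directed edge, weighted by how often it occurs.
pair_counts = Counter()
for answer in responses:
    words = tokenize(answer)
    pair_counts.update(zip(words, words[1:]))

# In the figure, higher counts correspond to brighter (whiter) arrows.
for (w1, w2), count in pair_counts.most_common(10):
    print(f"{w1} -> {w2}: {count}")
```

A graph library such as networkx could then lay out the nodes and draw each edge with its weight mapped to arrow brightness.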


[Selected quotes expressing concerns about AI-generated art]

Do you believe AI should be a collaborator rather than a creator in art?

[Word cloud: 'yes', 'no', and other keywords from responses]

Do you think AI art will create, take, or affect job opportunities?

[Word cloud: 'take', 'create', 'affect', and other keywords from responses]

How do you think AI-generated art will impact the future of creative industries?

[Word cloud: keywords such as 'negatively', 'decrease', and 'harder']

If you incorporate AI in your art, what is the boundary between your work and AI? If you don't incorporate AI in your art, why?

[Selected quotes from responses]

Are there any policies or guidelines you think should be in place for AI-generated art?

[Selected quotes from responses]

Do you have any other thoughts or insights on AI-generated art that weren't asked about in this survey? Feel free to provide any input.

[Selected quotes from responses]

Key Findings

  • From the guessing game results, we found that most participants could identify AI art fairly easily, with 9 out of 11 correct being the most frequent score. These results suggest that while humans are generally good at distinguishing human art from generative AI, correct identification is never guaranteed.

  • From the survey results, the majority of participants agreed that AI art is not a suitable replacement for human expression. On questions of authorship, consent, transparency, and compensation, the majority of participants strongly sided with artists. Despite our small sample size and potential sampling bias, these results may point to a broader conclusion: AI has become far-reaching, and without human intervention its impact can be detrimental.

  • From the demographics survey, we can conclude that the majority of our participants were students in their senior year. This lack of sampling diversity may have skewed our results, since younger individuals who are immersed in current technological trends have scarcely experienced a creative landscape without AI. Industry professionals, with deeper knowledge of corporate standards, may have had more to weigh in on and could have shifted the trajectory of our results.

  • From the pie charts indicating ‘Interest in using AI to create art’ and ‘Use of AI tools to create art,’ we can conclude that our results may contain some response bias. While 28.2% of our participants reported having used AI tools to create art, only 26.1% reported prior use when answering the ‘Interest’ question. Although the difference is small, the inconsistency hints at the social stigma associated with generative AI, which can keep participants from being fully transparent even when AI was used only partially. As AI becomes more ubiquitous and normalized, this stigma may fade, but stronger safeguards must be put in place to ensure that artists and their work are protected.

Limitations of Our Method

While our survey provided valuable insights from the Binghamton University community, several limitations should be acknowledged:


  • Survey Format and Time Constraints: To encourage participation, the survey was designed to be short and accessible. While this made it more user-friendly, it also limited the depth of information we could gather. Some ethical issues may have required more context or nuanced framing to elicit more thoughtful answers.

  • Sampling Bias: Despite our efforts to reach a broad audience, the limitations of our distribution methods likely introduced sampling bias. The majority of our responses came from students, and while participation was voluntary and promoted through channels such as social media, flyers, and email, the sample tends to reflect individuals already interested in art or AI. Additionally, our sample size was relatively small (n = 142), which further limits the generalizability of our findings. As a result, the perspectives collected may not fully represent the broader Binghamton University community or capture the diversity of viewpoints necessary for more comprehensive conclusions.

Conclusion

In conclusion, Beyond the Brush aims to explore the ethical, legal, and cultural implications of AI-generated art, emphasizing the urgent need to preserve human creativity in a rapidly evolving technological landscape. While AI tools like DALL-E, Midjourney, and Adobe Firefly can replicate the look of art, they lack the emotional depth, intentionality, and contextual understanding that define true artistic expression. Such distinctions have significant implications, as seen in Getty Images v. Stability AI, where corporate profit can take precedence over artistic fairness, authorship, and compensation. These trends expose the vague boundaries between innovation and exploitation, particularly the lack of transparency about how the datasets used to train AI models are sourced. Artists must now contend with their work being appropriated, or risk exclusion from major creative platforms that rely on that very work to fuel their AI’s output.


Beyond legal and ethical frameworks, AI-generated art has larger cultural implications. As society and technology move closer to automation, many individuals fear the loss of creative professions and the devaluing of their work. This trend risks undermining what makes art meaningful: its ability to reflect personal experiences, question norms, and foster human connection.


Through our research, we found strong evidence that generative AI art is detectable, though not consistently or with complete certainty. The feedback from the Binghamton University community reinforces these concerns, as many individuals expressed growing concern over authorship, consent, and transparency. Survey respondents were skeptical of AI’s capacity to replace human expression and voiced support for stronger ethical guidelines, clearer copyright policies, and equitable compensation for artists whose work is taken without permission.


Although our methodology limited the scope of our research, it also revealed a lack of public awareness and understanding of generative AI. While many recognized the ethical stakes, education and dialogue must extend beyond our own professional circles. Only through inclusive, informed conversations can we shape the trajectory of public opinion and responsibly integrate AI as a tool that enriches human creativity rather than replaces it.


As AI continues to become more ubiquitous, it is imperative that we protect all forms of human expression, and develop strategies that balance innovation with creative ingenuity.
