Dataset Consent Kiosk
A fictional interface exploring the choices artists were never given.
Artists were not asked this question during dataset creation.
Style-level consent is not a feature available in most AI systems.
Commercial permissions were not discussed with creators.
These options were not offered to artists included in large datasets.
Consent played little role in early model training pipelines.
Reflection: Dataset Consent and Creative Responsibility
For this project I wanted something that felt simple and direct. A museum kiosk interface made the most sense because it gets the message across without decoration or emotional framing. It presents questions plainly, the way a real system might have done if anyone had taken the time to build one. The calm and neutral look helps emphasize how basic these questions are and how strange it is that artists were never given anything like this.
While making the kiosk, I kept thinking about how easy it would have been for developers to include an opt-in system. If you are training a massive model on the creative work of real people, something as basic as a consent form should be a first step. It would have clearly divided those who wanted to contribute to a community-built model from those who wanted to keep their personal style private. I do not think this idea is unrealistic. A simple interface could have let people say yes or no; a toggle or a checkbox would have been enough to respect someone’s choice. Instead, the current situation is a mix of uncertainty, scraped content, and blurred ownership.
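To show just how small that opt-in step would have been, here is a minimal sketch in Python. Everything in it is hypothetical — the `ConsentRecord` class, its field names, and `filter_training_works` are illustrative assumptions, not part of any real training pipeline — but it captures the core idea: a few yes/no flags per artist, and silence defaults to "no".

```python
from dataclasses import dataclass

# Hypothetical consent record -- the fields mirror the kiosk's toggles.
# All names here are illustrative assumptions, not a real pipeline's API.
@dataclass
class ConsentRecord:
    artist_id: str
    allow_training: bool = False        # include the work in training at all?
    allow_style_learning: bool = False  # may the model learn this artist's style?
    allow_commercial_use: bool = False  # may outputs be used commercially?

def filter_training_works(works, consents):
    """Keep only works whose artist explicitly opted in to training."""
    by_artist = {c.artist_id: c for c in consents}
    return [
        w for w in works
        # Unknown artists get a default record with allow_training=False,
        # so absence of an answer never counts as consent.
        if by_artist.get(w["artist_id"], ConsentRecord(w["artist_id"])).allow_training
    ]

works = [
    {"artist_id": "a1", "title": "Night Study"},
    {"artist_id": "a2", "title": "Harbor Sketch"},
]
consents = [ConsentRecord("a1", allow_training=True)]
print([w["title"] for w in filter_training_works(works, consents)])  # → ['Night Study']
```

The design choice worth noticing is the default: opting in requires an explicit `True`, which is exactly the inversion of how scraped datasets were actually built.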
Working on this project also reminded me why I still hesitate to fully embrace AI tools. There is always something in the back of my mind holding me back from treating AI as the future of everything. I see the potential and I use these tools because they are genuinely helpful, but the way training was handled from the beginning makes it hard to view the systems as clean or fair. It feels like there was a one-time chance to build these models in a respectful and transparent way, and that chance was ignored. As a result, even the best tools carry a weight that should not be there.
Consent and respect for artists continue to be my main moral discomforts. I do not think it is asking too much for artists to be able to choose whether their work is included. Anyone who did not want their writing, images, or style used in training should be able to have their work removed. I understand that full removal might be technically difficult, but the fact that it was not even attempted speaks to the priorities of early model creators. The people who made the art that now fuels these systems were not considered. Their labor and creativity became raw material. My project tries to show how simple it would have been to ask them first.
In the kiosk itself, the consent questions say the most about where I stand. The sliders and toggles are there to make the piece interactive and readable, but the heart of it is the question of permission. The final message, which tells the user that their answers cannot be applied, is meant to highlight the gap between this fictional interface and the reality of how datasets were built. It should feel slightly wrong to see a clean and organized system that never existed in practice.
Throughout this class and while working on this project, I think my own viewpoint changed in a useful way. I started out strongly against AI as a whole, but I realized that the problem is not the technology itself. AI is a powerful collaborator when used in a thoughtful and transparent way. It helps me move between different stages of a project without losing track of ideas. It supports the messy workflow that I naturally have. The problem is the development process and the choices that were made early on. Those choices continue to shape the field and the public debate around it.
I also realized that AI tools do not have to be the enemy of human creativity. They can play a real role in the brainstorming process and in the organization of complex projects, which is something I rely on. Using AI to build the structure of this kiosk felt like working with a capable assistant who helped keep everything consistent. That part of the collaboration felt completely fine. It was the larger context of consent and creative respect that stayed on my mind.
If someone interacts with my kiosk for even a few seconds, I want them to think about how these models were created. I want them to notice the absence of consent and think about the artists who were never asked before their work was scraped and folded into a dataset. These tools are becoming more important every year, and people should be aware of the cost behind them. Not the financial cost, but the human one.
This project is my attempt to make that absence visible. It imagines what could have happened. It shows how simple the process could have been and how different things might look today if artists had been considered from the beginning.
Tools Used & AI Chats
AI Tools Used
- ChatGPT (concept development, code creation)