🌸 Digital Garden

Workshop 7


What I’ve done

This workshop explored how identity, experience, and generative AI interact. We learned key theoretical concepts and then applied them through practical work with AI.

Key Concepts to Review

Screening and Discussion

Exploring Identity Through Negative Prompting

Step 1: Story Collaboration with ChatGPT

Step 2: Apply Negative Prompting

More thoughts

In this workshop, I started to realize that negative prompting is not just a technical trick for controlling AI’s output, but a way of rethinking my own experiences and identity. Every time I told the AI “don’t write it like this” or “avoid describing it that way,” I became aware of how much we define ourselves through exclusion and difference. Interacting with the AI also showed me that experience isn’t something that comes only from myself — it forms through back-and-forth negotiation between me and the machine. When the AI told my story in a generic or overly dramatic way, I felt a bit uncomfortable, and it made me reflect on how AI tends to use those standardized narratives to predict who I am. By using negative prompting, I could pull the AI away from those patterns and bring back the uniqueness in my own experience. This whole process helped me see more clearly that identity grows through relationships, differences, and constant change.

Reading references (Munster, 2025)

  • “Negative prompting,” or writing a script for the opposite of what the prompt will sample from the image space. In negative prompting, the model samples images the furthest statistical distance away from their match to the text written in the prompt itself, that is, furthest along a distribution of text and image-matched data on which the model has been trained.
  • As computation has increasingly been inflected by ml, our cultural, computational, and medial outputs as online images, generative artworks, and text corpuses — indeed, all and any data — have been modeled into maximum and minimum clusters of proximities that simultaneously butt up against one another as continuous regions or disperse away from one another via discontinuous edges and outliers.
  • Experience, then, for ml, is simultaneously quantifiable as a state of measurable change and the ongoing process of learning that variably qualifies what that change is to be over time. The final improvement, or what the model has learned, is therefore really a coalescence of many processes of modulation differing from and conjoining with one another.
  • Many routinely performed operations of data preparation and optimisation in machine learning … organise data relations according to vectors of similarity and difference. … At the very moment data is quantitively operated upon by the statistical methods of ML, it is also re-spatialised and re-configured with ‘hidden’ or latent potential for pattern and relation. Slippages, then, between quantity and quality compose the ‘experience’ of ML at large.
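The first quotation describes negative prompting as sampling from the region of the model's latent space furthest from the negative text. In diffusion tools, this is commonly implemented via classifier-free guidance: the denoiser runs once conditioned on the positive prompt and once on the negative prompt, and the final prediction is pushed toward the former and away from the latter. The sketch below is illustrative only; the function name and toy vectors are my own, not from any particular library.

```python
# Toy sketch of the guidance arithmetic behind "negative prompting".
# pred_negative / pred_positive stand in for two denoiser outputs:
# one conditioned on the negative prompt, one on the positive prompt.

def guided_prediction(pred_negative, pred_positive, guidance_scale=7.5):
    """Combine two predictions so the result moves toward the positive
    prompt's region and away from the negative prompt's region."""
    return [n + guidance_scale * (p - n)
            for n, p in zip(pred_negative, pred_positive)]

# With a scale above 1, the result overshoots past the positive
# direction, i.e. away from what the negative text describes.
neg = [1.0, 0.0]   # hypothetical "negative prompt" prediction
pos = [0.0, 1.0]   # hypothetical "positive prompt" prediction
out = guided_prediction(neg, pos, guidance_scale=2.0)
# out = neg + 2 * (pos - neg) = [-1.0, 2.0]
```

In this framing, "don't write it like this" is literally a second conditioning signal that the model is steered away from, which is why the quoted passage can describe the output as the "furthest statistical distance" from the negative text.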