This session covers the early history and emergence of generativity in art, the proliferation of machines, popular and ancient techniques used in generative art such as randomness and noise, the use of chance operations and constrained randomness, the mathematical approaches of chaos theory and fractals in generative art, and the pros and cons of these tools.
"Art is about form," and it is not strictly reducible to utilitarian functions. There is a co-evolution of artists and their tools: one can only make what the available tools, whether concrete or conceptual, afford.
Algorithms are developed in conjunction with our knowledge of geometry.
The industrial era brought about the rise of more sophisticated machines through a societal shift to operating in more organized, systematic, and algorithmic ways. Electricity brought a wealth of new possibilities.
Progress in mechanics, optics, and chemistry made disciplines like photography possible—arguably the automation of figurative representation—influencing new aesthetics in painting.
The 1950s saw the deployment of the first computers, the discovery of the structure of DNA, and the rise of cybernetics—the study of autonomous machines that accept inputs through sensors and act on the world through effectors in closed- or open-loop systems.
Computers sped up the development of generative art because they are generic machines, designed to execute any computable algorithm. They were efficient first at manipulating text, so generative practices expanded first into literary fields, as in the practice of découpé, or the cut-up technique.
IBM began working with writers on experiments in electronic literature in the late 1950s and early 1960s. Works such as the permutational poem I Am That I Am by Brion Gysin, the Tape Mark poems by Nanni Balestrini, and the book Aesthetica by Max Bense appeared in the 1960s.
One of the most basic ways of delegating control of artistic output in favor of a process is by using chance operations and randomness.
Randomness is observed in a sequence when order and patterns seem to be lacking. It allows the introduction of variability and unpredictability into a system. History has produced many ways to derive random values, but computers revolutionized the speed at which they can be generated and processed. Sources of randomness include lookup tables of manually produced random values, measurements of random physical phenomena, and algorithmically generated (pseudo-random) values.
Probability theory is the main framework for chance events. A probability distribution retains the notion of chance while varying the likelihood of each possible outcome, over a domain that can be either discrete or continuous. Types of distributions include the uniform, Bernoulli, binomial, hypergeometric, and Poisson distributions, to name a few. A probability mass function allows us to favor some outcomes over others for a discrete variable; similarly, a probability density function allows such weighting for a continuous variable.
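As a sketch of these ideas using Python's standard `random` module (the outcomes and weights below are invented for illustration): a uniform draw, a hand-specified probability mass function over discrete outcomes, and a sample from a continuous density, the Gaussian.

```python
import random

random.seed(42)  # make this run reproducible

# Uniform distribution: every outcome equally likely.
die_roll = random.randint(1, 6)

# A probability mass function over a discrete variable:
# favor some outcomes over others with explicit weights.
outcomes = ["red", "green", "blue"]
pmf = [0.7, 0.2, 0.1]  # weights sum to 1
sample = random.choices(outcomes, weights=pmf, k=1000)
# "red" should appear roughly 70% of the time.

# A probability density function over a continuous variable,
# here the Gaussian (normal) distribution:
value = random.gauss(mu=0.0, sigma=1.0)
```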
When all the elements of a sequence that serves as a signal are decided at random, we call this noise. Different types of noise are characterized by the correlation between values in the sequence. These types of noise occur and can be reproduced in multiple dimensions.
A random walk is a sequence of steps in which each step is chosen at random according to a probability distribution. The steps can be of fixed or dynamic size, can change according to any rule, and can be taken in any number of dimensions. A random walk approaches Brownian motion as the step size (and the time between steps) tends toward zero.
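A minimal one-dimensional random walk might be sketched as follows; shrinking the step size toward zero, while scaling time accordingly, approximates Brownian motion.

```python
import random

def random_walk(steps, step_size=1.0):
    """1-D random walk: each step is +step_size or -step_size
    with equal probability."""
    position = 0.0
    path = [position]
    for _ in range(steps):
        position += random.choice([-step_size, step_size])
        path.append(position)
    return path

random.seed(1)
path = random_walk(1000)

# Many small steps approximate Brownian motion.
fine_path = random_walk(100_000, step_size=0.01)
```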
Pink noise (1/f) is a signal with a frequency spectrum such that the power spectral density is inversely proportional to the frequency. It occurs in many physical, biological and economic systems and is useful in generative art since it is a simple way to model natural phenomena.
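One well-known way to approximate pink noise in code is the Voss-McCartney algorithm, sketched below: several white-noise sources are summed, with source k refreshed only every 2**k samples, so that the slowly varying sources contribute extra low-frequency power. The function name and parameter choices are illustrative.

```python
import random

def voss_pink_noise(n_samples, n_generators=8):
    """Approximate 1/f (pink) noise with the Voss-McCartney
    scheme: sum several white-noise sources, refreshing source k
    only every 2**k samples."""
    sources = [random.uniform(-1, 1) for _ in range(n_generators)]
    out = []
    for i in range(n_samples):
        for k in range(n_generators):
            if i % (2 ** k) == 0:   # source k updates every 2**k samples
                sources[k] = random.uniform(-1, 1)
        out.append(sum(sources) / n_generators)  # keep output in [-1, 1]
    return out

signal = voss_pink_noise(256)
```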
Leaving dimensions of a work unfixed and employing chance, accident, and improvisation can free creative processes from rational control and open up novel configurations in the creative space.
A process is said to be aleatoric if its course is determined in general but depends on chance in detail. — Werner Meyer-Eppler
Aleatoric music is music in which some element of the composition is left to chance. Chance operations can be used in either the composition or the interpretation processes of music for example.
There are two broad families of approaches—combinatorial and procedural. The combinatorial approach requires a corpus of data as an input whereas the procedural approach generates its corpus.
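A toy contrast between the two families in Python (the corpus and the syllable set are invented for illustration):

```python
import random

random.seed(7)

# Combinatorial: recombine an existing corpus, in the spirit
# of the cut-up technique.
corpus = "the machine dreams in silent unfolding algorithms".split()
shuffled = corpus[:]
random.shuffle(shuffled)
cut_up_line = " ".join(shuffled)

# Procedural: generate material from rules alone, without a
# pre-existing corpus (here, random syllable strings).
def procedural_word(n_syllables=3):
    syllables = ["ka", "lo", "mi", "ra", "tu"]
    return "".join(random.choice(syllables) for _ in range(n_syllables))

generated_line = " ".join(procedural_word() for _ in range(5))
```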
Chaos theory and fractals both concern dynamical systems—systems that change through time. A dynamical system is expressed as an iterative equation that describes the state of the system at the next timestep or iteration.
A dynamical system can converge to one of four long-term behaviors: a fixed point, a periodic orbit (limit cycle), a quasi-periodic orbit, or chaos.
These behaviors are more often expressed as attractors—a set of states toward which a system tends to evolve.
Note that chaotic does not mean random. Chaotic systems are deterministic, but small changes in input can make the output unpredictable. They were popularized in the 1980s by the meteorologist Edward Lorenz who is famous for his description of the "Butterfly Effect."
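The logistic map, x_{n+1} = r·x_n·(1 − x_n), is a standard minimal illustration of both points: the same deterministic rule settles onto a fixed point for some parameter values and becomes chaotic for others. A sketch, with illustrative parameter choices:

```python
def trajectory(r, x0, n):
    """Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

# Fixed-point regime: at r = 2.5 the orbit settles onto the
# stable fixed point 1 - 1/r = 0.6, whatever the start.
settled = trajectory(2.5, 0.2, 200)[-1]

# Chaotic regime: at r = 4 the rule is still deterministic,
# but a change of one part in a billion in the initial
# condition yields a completely different orbit.
orbit_a = trajectory(4.0, 0.2, 100)
orbit_b = trajectory(4.0, 0.2 + 1e-9, 100)
```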
Composers began integrating chaos theory into music in the 1980s, applying it to pitch, duration, dynamics, and orchestration. Chaotic systems have also been applied at the micro level, in the context of sound synthesis.
Fractals are the most popular and most studied family of chaotic systems. They are mathematical dynamical systems, represented by iterative equations, that develop curves and geometrical shapes with the property of self-similarity: their patterns repeat at different scales.
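One concrete example of such an iterative equation is the escape-time iteration behind classic Mandelbrot set renderings, z_{n+1} = z_n² + c; a sketch, with an illustrative function name:

```python
def mandelbrot_escape(c, max_iter=100):
    """Iterate z -> z*z + c from z = 0; return the number of
    iterations before |z| exceeds 2, or max_iter if the orbit
    never escapes (the point is taken to be in the set)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# c = 0 and c = -1 stay bounded forever; c = 2 escapes quickly.
inside = mandelbrot_escape(0j)
outside = mandelbrot_escape(2 + 0j)
```

Coloring each pixel of the complex plane by its escape count is what produces the familiar images, and zooming into the boundary reveals the self-similar repetition described above.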
Art is contingent on the tools and resources available at a given time. Discoveries in mathematics and science have a direct effect on art whether conscious or unconscious to the artist.