
Stochastic Media

Jacob Weinberg


In 2023, I traveled to Miami as part of a grad school-funded trip to research the Everglades wetlands. We spent little time in Miami itself, traveling most days out to the wetlands while staying in Miami Beach. Early in the morning on the day we were set to leave, I sat out on South Beach just before dawn with a radio I had built from guides found on old enthusiast internet forums. It was designed to tune into signals produced by atmospheric noise and lightning. These signals are best heard at dawn, dusk, and night, away from the hum of the power grid, so I stayed close to the water. The radio picked up louder clicks and chirps emitted by a series of thunderstorms just off the coast, interspersed between hundreds of fainter pops from lightning strikes further away. Additional noise grew louder with the sunrise as its radiation energized the ionosphere. These noise sources were thoroughly displaced from my location: the ionosphere begins at least 40 or 50 miles overhead, and lightning’s signals travel from up to thousands of miles away, propagating between the ionosphere and the earth.1 The beach became a kind of non-location, referring less to the sand, sun, water, and clouds in the immediate vicinity and more to what it offered as a place for listening to everything far away from my sensory boundaries.

As I listened to those louder cracks and chirps generated by the approaching storm, I envisioned the water in front of me also approaching, not because of the tide, but due to the global rise in sea levels in the coming decades. I was reminded of a conversation I had that week in the Everglades, where someone mentioned that insurance companies had increasingly ceased offering property insurance for homes on Florida’s shores and even inland, as the market could not withstand ongoing shocks from litigation over damages tied to storm frequency and intensity. Beyond the impacts on homeowners, private insurers themselves could no longer find reinsurance companies (insurance for the insurers) to help cover the claims they were paying out, and international firms in Europe were providing a large share of the capital needed to cover volatile coastal assets. On the beach, then, it seemed that the state’s housing market could not recover and that collapse was only a matter of time.

But I found later that the market at that time was already moving past its lowest point, with most companies having already exited, and local and state governments working to address its structural problems. And it became clear that, while this predicament was wrapped up with the inevitability of weather-related disasters, the story was much more about their secondary effects: impacts to the risk assessment of Florida’s coast rather than primary damages to livelihood. Since then, Florida has reconfigured its property insurance regulations, adjusting for the certainties of increased risk, but also addressing the means by which capital was able to flow via that risk. This was not simply a matter of loosening restrictions and opening up caps on homeowner costs, but rather of addressing the intricacies of climate change’s effects from financial and legal perspectives. Contractors frequently sued insurance companies to be paid inflated estimates for house repairs and renovations under the guise of weather-related damages, along with exorbitant associated lawyer’s fees. Because of the prevailing legal frameworks and the general ubiquity of weather damage to property, this was so commonplace that Florida had been responsible for just over 75% of all property insurance lawsuits in the entire country.2 Reinsurers did not want to risk paying out claims submitted by property insurers covering these much higher repair costs and fees.

The impact of climate change and weather-related catastrophe in Florida becomes more oblique and almost ironic in light of this crisis. A secondary, discursive environment around storms and their effects created situations where house repairs (most often roofs) could be paid for via litigation even if they were aging and not directly damaged by a particular storm, with contracts often citing past weather events. The prospect of risk proved more valuable than the root of its causes. We can imagine, then, that even a person who did not believe climate would have any future impact on Florida might nonetheless leverage these conditions as a means to obtain funding from insurers. The insurance company would then raise premiums for everyone else in the area, whether or not they had yet been involved in any claims, as its actuaries assessed costs based on shared risk and the likelihood of future financial impacts. In this way, the models of risk that translated into increased premium costs and diminishing coverage for homeowners became inflated, not by any direct address of impending changes in climate, but by the models themselves as abstractions responding to market conditions.
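To make that pooling dynamic concrete, here is a minimal sketch, with entirely hypothetical figures and parameters, of how a premium set from expected losses across a shared pool rises for every policyholder once litigation inflates the cost of the average claim. It is a toy illustration of the pooling logic, not any insurer’s actual pricing model.

```python
# Toy premium calculation for a shared risk pool. All numbers are invented;
# this illustrates the pooling logic, not an actuarial model.

def pooled_premium(policies, claim_probability, avg_repair_cost,
                   litigation_multiplier=1.0, expense_loading=1.25):
    """Expected annual payout per policy, with a loading for expenses and margin."""
    expected_claims = policies * claim_probability
    expected_payout = expected_claims * avg_repair_cost * litigation_multiplier
    return expected_payout * expense_loading / policies

if __name__ == "__main__":
    base = pooled_premium(policies=10_000, claim_probability=0.05,
                          avg_repair_cost=20_000)
    inflated = pooled_premium(policies=10_000, claim_probability=0.05,
                              avg_repair_cost=20_000, litigation_multiplier=2.5)
    print(f"Premium without inflated litigation: ${base:,.0f}")        # ~$1,250
    print(f"Premium with inflated claims and fees: ${inflated:,.0f}")  # ~$3,125
```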

In The Precession of Simulacra, Jean Baudrillard writes that Jorge Luis Borges’ allegory of the map3 has been superseded:

“But it is no longer a question of either maps or territories… No more mirror of being and appearances, of the real and its concept. No more imaginary coextensivity: it is genetic miniaturization that is the dimension of simulation. The real is produced from miniaturized cells, matrices, and memory banks, models of control….”4

Baudrillard’s description of hyperreality here is reflected not only in the classic example of Disney World, located in Florida’s interior, but in these minutiae of risk assessment that drive the property markets of Florida’s coastal regions. The hyperreal appears in a twofold way. On one hand, it is generated by an enduring belief in the idyllic, replicated images of coastal life and a denial of its impending change. On the other, it takes the form of the actuarial tables that render abstracted images of that life’s volatile future.

The dilemma of Florida’s coasts resonates with a futurity that extends to wider cultural touchpoints, not only through its preoccupation with prediction, but also in how the market operates as a fundamentally stochastic medium, where prediction by statistical analysis is rooted in stochastic methods, those that embrace randomness in their calculations. This sort of media permeates platforms today, most literally in consumer prediction markets like Kalshi and Polymarket, where users abstract significant and insignificant world events into gambling opportunities. But it also appears structurally in algorithmic content feeds, which work to anticipate and shape user engagement. Stochastic processes are likewise embedded within computational forms of text and image, such as generative AI models, which iterate through feedback cycles to shape noise into meaning.5 I want to suggest here that, unlike previous forms of media, these examples attempt to address future circumstances rather than document or reflect past ones. They operationalize noise and randomness to consider the world across a totality of possible states, and in doing so, they present the world less as a set of unknown material conditions and more as sets of meanings to be interpreted, replicated, decoded, and shaped.

Stochastic media sit within a long and storied history of probability in the West. The formalization of probabilistic principles was central to the Enlightenment, shifting older notions of fortune and fate toward rationalized chance across the 17th through 19th centuries.6 By the 19th century, formalized randomness had entered science and laid the groundwork for a series of developments that would accelerate in the 20th. The radio signals detected on Miami Beach, emitted by physical processes in the atmosphere, are one of many sources that can be used to create highly random numbers that are difficult to replicate in any predictable way. Other often-used examples of these physical, non-human processes are radioactive decay, thermal noise, and shot noise in electronics.7 Random values became the basis for computational models that rely on high-quality random numbers to sample possible outcomes in simulations, and these models are entangled with the post-war United States and the rise of its neoliberal order throughout the 20th century.
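As a rough sketch of how such a physical source becomes usable randomness, the fragment below thresholds a noisy signal into raw bits and then applies von Neumann debiasing to strip out simple bias. The noise here is simulated in software as a stand-in for an antenna or sensor, and the thresholding scheme is an assumption for illustration, not a description of any particular hardware generator.

```python
# Minimal sketch: turn a noisy physical signal into random bits.
# The "signal" is simulated; in practice it would come from an ADC sampling
# atmospheric radio noise, thermal noise, shot noise, etc.

import random

def sample_noise(n):
    """Stand-in for n analog samples of a noisy physical source."""
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def raw_bits(samples):
    """Threshold each sample against the mean to get a raw (possibly biased) bit."""
    mean = sum(samples) / len(samples)
    return [1 if s > mean else 0 for s in samples]

def von_neumann_debias(bits):
    """Discard 00/11 pairs; map 01 -> 0 and 10 -> 1 to remove simple bias."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

if __name__ == "__main__":
    bits = von_neumann_debias(raw_bits(sample_noise(10_000)))
    print(f"{len(bits)} debiased bits, e.g. {bits[:16]}")
```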

Formed in Santa Monica just after the end of World War II, the RAND Corporation became the pre-eminent example of the present-day think tank, conducting interdisciplinary research that was often grounded in quantitative methods. Throughout the Cold War, many of its core preoccupations centered on problems of prediction, including nuclear deterrence, intelligence, military simulation, and game theory. Noise played a central role in this research, both conceptually and practically. Conceptually, Roberta and Albert Wohlstetter, who worked for RAND as military historians and strategists, shifted the United States toward embracing uncertainty as fundamentally embedded within military strategy, often drawing on the principles of signal and noise that Claude Shannon first published in the same year RAND was established.8 During her time at RAND, Roberta published an account of intelligence challenges during Pearl Harbor and the Cuban Missile Crisis, emphasizing the difficulty of identifying opponent signals amidst noise, writing that, “...true signals were always embedded in the noise or irrelevance of false ones.”9 Albert, in his writings on nuclear strategy, stressed that “no bit of noise is unambiguously noise; it is always possible to hypothesize that some apparently random series of events contains a piece of information, deliberately, or actually concealed.”10

In her book Think Tank Aesthetics: Midcentury Modernism, the Cold War, and the Neoliberal Present, Pamela Lee argues that these approaches to information and meaning can lead to an expanded notion of signs, drawing a parallel to Meyer Schapiro’s notion of polysemic signs in visual art, found in the figure/ground field and in non-representational, non-mimetic elements. Lee presents these frameworks, once absorbed into the work of the intelligence analyst, as contributing to “an expanding empire of signs” that the neoliberal order seeks to dominate.11 These theories of signals, noise, and signs, when applied to intelligence, present a world where all events are interpretable. Within this framework, an analyst might be driven toward maximum vigilance and maximum interpretation as insurance against being caught unprepared. This kind of interpretive maximalism accelerates the collection of information with the intent of generating meaning, anticipating the large-scale databases and large language models of today.

Modeling and simulation were pervasive at RAND, and alongside its conceptual applications, researchers used noise toward practical ends as well. To run simulation games accurately, researchers needed an abundance of high-quality random numbers to sample from in their calculations, and they sought out physical processes to generate them. In 1955, the RAND Corporation published A Million Random Digits with 100,000 Normal Deviates, an expansive text filled with tables of numbers generated from a complex electronic roulette wheel whose electric pulses were sampled at regular intervals.12 Though now obsolete, the book’s number tables provided enough entropy at the time for RAND to conduct large-scale mathematical simulations in which the numbers were sampled and processed to produce ranges of likely and unlikely real-world outcomes. In this way, the incorporation of unpredictable noise produced a prediction-heavy, future-oriented mediation of the world.
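To make the procedure concrete, the sketch below runs a toy Monte Carlo experiment off a fixed table of digits, in the way an analyst might have consumed a printed page of random numbers. The digit string, the reliability problem, and the two-digits-per-draw convention are all hypothetical choices for illustration; the digits are not taken from RAND’s tables.

```python
# A toy Monte Carlo run driven by a fixed table of digits. The digits below
# are a hypothetical stand-in (NOT from RAND's published tables), and the
# problem, a three-component system where each part works 90% of the time,
# is invented purely to show the mechanics of drawing from tabled randomness.

DIGIT_TABLE = (
    "73735 45963 78134 63873 02965 58303 90708 20025 98859 23851 "
    "27965 62394 33666 62570 64775 78428 81666 26440 20422 05720"
)
DIGITS = [int(ch) for ch in DIGIT_TABLE if ch.isdigit()]

def draw(stream):
    """Consume two digits from the stream and return an integer in 0..99."""
    return 10 * next(stream) + next(stream)

def simulate(n_trials=16):
    """Fraction of trials in which all three 90%-reliable components work."""
    stream = iter(DIGITS)
    successes = 0
    for _ in range(n_trials):
        components = [draw(stream) for _ in range(3)]  # one two-digit draw per component
        if all(value < 90 for value in components):    # a draw below 90 means "works"
            successes += 1
    return successes / n_trials

if __name__ == "__main__":
    print(f"Estimated probability the whole system works: {simulate():.2f}")
```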

While these uses at RAND formed metaphorical images or pictures of the future, we can find their methods embedded quite literally in everyday encounters with media today. For instance, path tracing in rendering software uses Monte Carlo methods of the kind employed at RAND to simulate the behavior of light emitted from objects as it falls, bounces, and scatters across surfaces and through volumes. This is most noticeable when rendering scenes at low sample counts. In the resulting images, we begin to see a visual snow that exposes the points of light being sampled across the scene. This process also exposes the image as the result of a computational process rooted in prediction. The rendered image is only an artifact or after-effect of the medium, which is itself a simulation of how these objects would interact according to the physics of light and volume. The rendering illustrates the implied correlation between clarity of prediction and volume of sampling; the more that randomness is employed, the clearer the outcome.
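That correlation between sample volume and clarity can be sketched without a full path tracer. Below, a single pixel’s brightness is estimated by averaging random light contributions, and the spread of repeated estimates, the visual snow of the render, shrinks roughly with the square root of the sample count. The incoming-light function and its figures are invented stand-ins, not any renderer’s actual shading model.

```python
# Minimal sketch of the sample-count/noise relationship: a single "pixel"
# is estimated by averaging random samples of a hypothetical light function,
# and the spread of repeated estimates falls off roughly as 1/sqrt(samples).

import math
import random

def incoming_light():
    """Stand-in for one random light path's contribution to a pixel."""
    # Assume a bright source is hit 20% of the time, dim ambient light otherwise.
    return 1.0 if random.random() < 0.2 else 0.05

def estimate_pixel(samples):
    """Monte Carlo estimate: average of `samples` random light contributions."""
    return sum(incoming_light() for _ in range(samples)) / samples

def spread(samples, trials=200):
    """Standard deviation of repeated pixel estimates, i.e. how noisy the pixel looks."""
    estimates = [estimate_pixel(samples) for _ in range(trials)]
    mean = sum(estimates) / trials
    return math.sqrt(sum((e - mean) ** 2 for e in estimates) / trials)

if __name__ == "__main__":
    for n in (4, 16, 64, 256, 1024):
        print(f"{n:5d} samples per pixel -> noise ~ {spread(n):.4f}")
```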

Across both the conceptual and practical applications of noise at RAND, there was a necessary maximalism in practice. Just as the intelligence analyst benefits from a maximally broad interpretation of information as holding meaning, the computer scientist computes the simulation across a maximal number of samples, each purporting to contribute to the overall clarity of a future scenario. These approaches have only become more pervasive since then, reflected in the maximalism of data centers, in platforms that act as vast networks ingesting user behaviors and feeding back personalized content, and in networked supercomputers that run billions of trials (albeit far different from and more complex than RAND’s simulations) to train large language and other machine learning models.

Returning to Baudrillard, I am reminded that at some point, once simulacra reach critical mass, rationality is no longer needed.13 And importantly, I think about how this extreme proliferation of signs is entangled with processes of prediction and futurity as they attempt first to describe, then to predict, and finally to determine the real. The introduction of randomness into prediction models, and the meaningfulness granted to noise in the interpretation of information across the 20th century, not only operate as vehicles for contending with uncertainty; in becoming a totalizing mass of possible meanings, they also lean toward a form of certainty in themselves, the certainty that signs can signal possible futures and that a “mastery of signs” might unlock or even determine them.

This maximalism reveals something peculiar about contemporary prediction. Randomness, when deployed at sufficient scale and with enough computational force, produces an inversion. The more exhaustively these systems account for randomness, whether in samples processed, signs interpreted, or noise converted to signal, the less they resemble frameworks of probability and the more they echo the older frameworks that probability began to replace. Actuarial tables, intelligence assessments, and the training cycles of large language models purport not only to estimate likelihood but to actively reveal a set of conditions, a piece of content, or the form of an image. What begins as a means to contend with uncertainty blurs the border between possibility and determination, leaning increasingly toward prophecy while doing so through an aesthetics of logic and rationality.

Not long after dawn broke, I could see these mechanics play out in real time as more people gathered on the shore. Some sat and watched, a few meeting others who were already there. Others used the scene as a backdrop for photos of themselves, rushing to catch the right angle for a soon-to-be post before it was too late. And some livestreamed themselves to an audience, giving good-morning messages in an attempt to be a channel of positive energy, motivation, and vibes. The scene was less a matter of spectacle and more an example of its abolishment.14 There was no singularity of vision or perspective,15 but rather a fragmentation and reprocessing of the sunrise into a myriad of images to be distributed across online platforms.

Photography is often discussed as emphasizing or contending with time passed: a perpetual departure from a sampled moment. But amidst a noisy environment of competing, infinite online content, the images generated here suggested an inverse relationship with time. Each constructed a moment to be projected forward, perpetually reaching toward its maker’s desired future, operating as a form of manifestation. I looked at these scenes but found myself seeing instead the signals they sought to generate, which in turn determined and defined what I was looking at. In this way, the world appeared to stretch and fragment to meet the contours of itself as content being generated in real time. I imagined these scenes taking place each day ad infinitum, until the beach no longer rendered itself as paradise.

1. VLF radio has been studied for some time, in part because it has been used as a medium to communicate with the United States’ nuclear submarine fleet; VLF signals take an enormous amount of energy to generate and occupy one of the few parts of the radio spectrum able to penetrate seawater. For more information about the earth-ionosphere waveguide, see: J. R. Wait and K. P. Spies, Characteristics of the Earth-Ionosphere Waveguide for VLF Radio Waves (Boulder, Colorado: National Bureau of Standards, 1965).

2. Paige Sutherland and Meghna Chakrabarti, “Inside Florida’s Property Insurance Crisis,” On Point, May 3, 2022, https://www.wbur.org/onpoint/2022/05/03/inside-floridas-property-insurance-crisis.

3. See Jorge Luis Borges, On Exactitude in Science.

4. Jean Baudrillard, “The Precession of Simulacra,” in Simulacra and Simulation (Ann Arbor, Michigan: University of Michigan Press, 1994), 2.

5. Though the technology has since developed, in diffusion-based models the image begins as pure noise and gains clarity with each denoising cycle.

6. See Lorraine Daston, Classical Probability in the Enlightenment.

7. Cloudflare famously used a wall of lava lamps to generate random values for its internet security services, relying on the unpredictable flow of liquid in each lamp.

8. See Claude Shannon et al., The Mathematical Theory of Communication.

9. Roberta M. Wohlstetter, Cuba and Pearl Harbor: Hindsight and Foresight (Santa Monica, CA: RAND Corporation, 1965), 36. 

10. Pamela M. Lee, Think Tank Aesthetics: Midcentury Modernism, the Cold War, and the Neoliberal Present (Cambridge, MA: The MIT Press, 2020), 79. 

11. Lee, Think Tank Aesthetics, 85. 

12. RAND, “Foreword to the Online Edition,” in A Million Random Digits with 100,000 Normal Deviates (Santa Monica, CA: RAND Corporation, 2001).

13. “It no longer needs to be rational, because it no longer measures itself against either an ideal or negative instance. It is no longer anything but operational.” Jean Baudrillard, “The Precession of Simulacra," in Simulacra and Simulation (Ann Arbor, Michigan: University of Michigan Press, 1994), 2. 

14. Baudrillard, "The Precession of Simulacra," 30.

15. Notably, everyone was faced in different directions, with many people taking images literally having their backs turned away from the coast so that their phone could capture their faces in front of the sky.