Category: Implicit Generative Modeling
-
Go with the Flow: Seeking the Philosophers’ Stone for the New Alchemy of Generative Modeling
Although I risk abusing the metaphor by describing implicit generative modeling (IGM) as alchemy, I remain fond of the view of IGM as transmuting digital “lead” (the cheap and plentiful source data that we can easily sample) into digital “gold” (the precious target data that we want to…
-
Life on a Chain
In this post we will step out of the technical weeds and take a higher-level look at what is going on under the hood of modern sequence-prediction models, which include large language models such as ChatGPT, audio-generation models such as WaveNet, and image-generation models such as the original version of DALL·E. We will provide a…
-
The Long Walk Home: Statistical Distances, Part 2
To recap, in the setting of implicit generative modeling (IGM), we wish to train a model to produce data that appears to be drawn from a target distribution $p$, which might be images of cats or whatever else we’re interested in. The distribution $p$ is unknown, and we only have indirect access to it through…
-
The “New Alchemy” of Generative Modeling
In the very early days of the study of physics, natural philosophers extended their world-understanding efforts to the pursuit of immortality-granting elixirs, cure-all potions, and the transmutation of base metals into precious metals. In this latter application, the precious targets were typically gold and silver, and the respective base sources were most often lead, copper,…