OpenAI’s DALL·E borrows from GPT-3 to create high-fidelity images from text

Last year, OpenAI released GPT-3, the largest transformer model to date, with 175 billion parameters. The model demonstrated remarkable prowess at generating text from a given context, and OpenAI licensed it exclusively to Microsoft, which provides the computational backend required to host and run the model for its customers.

Building on this, OpenAI has today announced a smaller, 12-billion parameter version of GPT-3. Dubbed DALL·E, the new transformer model borrows heavily from GPT-3 but combines its abilities with those of ImageGPT …