2023/02/28

GAN LLM WTF? Guardian Columnist Explains Key AI Acronyms without Technical Jargon

“It seems that forcing a neural network to ‘squeeze’ its thinking through a bottleneck of just a few neurons can improve the quality of the output. Why? We don’t really know. It just does.”
– TechScape columnist Alex Hern, describing an idiosyncrasy of neural network design. In a (largely) jargon-free ‘glossary of AI acronyms,’ Hern breaks down the meaning of ubiquitous AI terminology (GAN, LLM, compute, fine-tuning, etc.).
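The ‘bottleneck’ Hern refers to is the narrow hidden layer used in autoencoder-style network designs. As a rough illustration only (not from the article; the layer sizes and PyTorch framing are arbitrary assumptions), a network squeezed through a few-neuron bottleneck might look like this:

import torch
import torch.nn as nn

# Illustrative autoencoder with a narrow "bottleneck" hidden layer.
# The encoder compresses a 784-dim input down to just a few units,
# forcing the network to keep only its most salient features before
# the decoder reconstructs the original input.
class BottleneckAutoencoder(nn.Module):
    def __init__(self, input_dim=784, bottleneck_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, bottleneck_dim),   # the few-neuron bottleneck
        )
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = BottleneckAutoencoder()
x = torch.randn(8, 784)            # a batch of dummy inputs
reconstruction = model(x)          # output squeezed through 4 units
print(reconstruction.shape)        # torch.Size([8, 784])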
$40 USD