power of a lifetime

The son of an old friend of mine is deep in the world of artificial intelligence and machine learning. Not so long ago, while trying to explain to me what he does, he started talking about transformer models (GPT-3 from OpenAI is probably the best known example):

They basically try and do away with every traditional analytic assumption and just throw mountains (read: Iceland sized countries, for the big models) of electricity into computing outcomes.

To put that into some perspective in relation to the power demands and flexibility of human cognition, Andy Crouch writes:

Furthermore, human babies accomplish all this cognition with the roughly one-hundred-watt power supply of the human body (a single training run for GPT-3, one set of researchers estimated, consumes 189,000 kWh of power, roughly what a human being would consume over an entire lifetime). How would we ever engineer a silicon-based system to use so little power to mobilize curiosity, engage relationally, and infer effortlessly from a few examples the shape of the learner’s world? Now we truly seem in the realm of the inconceivable.

– Andy Crouch, The Life We’re Looking For: Reclaiming Relationship in a Technological World
