Talk: Neuromorphic Computing - Next Generation of Brain-Inspired AI Computing
Lecturers
Description
This is the announcement of the main Research Lecture by Kwabena Boahen (Professor of Bioengineering and of Electrical Engineering at Stanford University).
Lecture Title
Scaling Knowledge Processing from 2D Chips to 3D Brains
Abstract
As a computer's processors increase in number, they process data at a higher rate and exchange results across a greater distance. Thus, the computer consumes energy at a rate that increases quadratically. In contrast, as a brain's neurons increase in number, it consumes energy at a rate that increases not quadratically but rather just linearly. Thus, an 86B-neuron human brain consumes not 2 terawatts but rather just 25 watts. To scale linearly rather than quadratically, the brain follows two design principles.
First, pack neurons in three dimensions (3D) rather than just two (2D). This principle shortens wires and thus reduces the energy a signal consumes as well as the heat it generates.
Second, scale the number of signals per second as the square-root of the number of neurons rather than linearly. This principle matches the heat generated to the surface area available and thus avoids overheating.
I will illustrate how we could apply these two principles to design AI hardware that runs not with megawatts in the cloud but rather with watts on a phone.
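The figures in the abstract can be sanity-checked with a short back-of-the-envelope calculation. The Python sketch below assumes, purely for illustration, that the quadratic and linear regimes share the same per-neuron coefficient, fixed so that linear scaling yields 25 watts at 86 billion neurons; under that assumption the quadratic regime lands at roughly 2 terawatts, consistent with the abstract, and the square-root signalling scale from the second principle is shown alongside.

    # Back-of-the-envelope check of the scaling figures quoted above.
    # Assumption (for illustration only): both regimes share a per-unit
    # coefficient c, fixed so the linear law gives 25 W at 86e9 neurons.

    N = 86e9                # neurons in the human brain (from the abstract)
    P_LINEAR = 25.0         # watts at N neurons under linear scaling (from the abstract)

    c = P_LINEAR / N        # per-neuron coefficient, roughly 2.9e-10 W

    p_linear = c * N        # linear regime (brain-like):      25 W
    p_quadratic = c * N**2  # quadratic regime (conventional): ~2.1e12 W, i.e. ~2 TW

    # Second principle: signalling rate grows as sqrt(N), so dissipated heat
    # tracks the available surface area (proportionality constant left abstract).
    signals_scale = N ** 0.5

    print(f"linear:    {p_linear:.3g} W")
    print(f"quadratic: {p_quadratic:.3g} W")
    print(f"sqrt(N) signalling scale: {signals_scale:.3g}")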
Further information
Location: (66/E34)
Times: Thursday, 30.05.2024, 19:00 - 22:00
First session: Thursday, 30.05.2024, 19:00 - 22:00, Location: (66/E34)
Course type: other (official courses)
Study areas
- Cognitive Science > Introductory courses for first-year students
- Cognitive Science > Bachelor's program
- Cognitive Science > Master's program
- Cognitive Science > Doctoral program
- Cognitive Science > Courses