Bernard Widrow
Bernard Widrow (born December 24, 1929) is an American professor of electrical engineering at Stanford University. He is the co-inventor of the Widrow–Hoff least mean squares filter (LMS) adaptive algorithm with his then doctoral student Ted Hoff. The LMS algorithm led to the ADALINE and MADALINE artificial neural networks and to the backpropagation technique. He made other fundamental contributions to the development of signal processing in the fields of geophysics, adaptive antennas, and adaptive filtering.
Widrow is the namesake of "Uncle Bernie's Rule": the training sample size should be 10 times the number of weights in a network.
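As a quick illustration, the rule of thumb can be read as a minimum-sample-count formula. A minimal sketch (the helper name and the example network dimensions are ours, not Widrow's):

```python
def uncle_bernies_rule(n_weights: int) -> int:
    """Rule-of-thumb minimum training set size: 10x the weight count."""
    return 10 * n_weights

# A hypothetical single-layer network with 50 inputs and 10 outputs
# has 50 * 10 = 500 weights, so the rule asks for 5000 training samples.
print(uncle_bernies_rule(50 * 10))  # 5000
```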
He was born in Norwich, Connecticut. While young, he was interested in electronics. During WWII, he found an entry on "Radios" in the World Book Encyclopedia, and built a one-tube radio.
He entered MIT in 1947, studied electrical engineering and electronics, and graduated in 1951. After that, he got a research assistantship in the MIT Digital Computer Laboratory, in the magnetic core memory group. The DCL was a division of the Servomechanisms Laboratory, which was building the Whirlwind I computer. The experience of building magnetic core memory shaped his understanding of computers into a "memory's eye view", that is, to "look for the memory and see what you have to connect around it".
For his master's thesis (1953, advised by William Linvill), he worked on raising the signal-to-noise ratio of the sensing signal of magnetic core memory. Back then, the hysteresis loops of magnetic cores were not square enough, making the sensing signal noisy.
For his PhD (1956, advised by William Linvill), he worked on the statistical theory of quantization noise, inspired by work by William Linvill and David Middleton.
During his PhD, he learned about the Wiener filter from Lee Yuk-wing. To design a Wiener filter, one must know the statistics of the noiseless signal that one wants to recover; if those statistics are unknown, the filter cannot be designed. Widrow thus designed an adaptive filter that uses gradient descent to minimize the mean square error. He also attended the Dartmouth workshop in 1956 and was inspired to work on AI.
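The adaptive filter described above became the LMS (Widrow–Hoff) update: at each step the weights are nudged along the negative gradient of the instantaneous squared error. A minimal NumPy sketch, assuming a simple sinusoid-in-noise test signal (the step size, tap count, and signals are illustrative choices, not Widrow's originals):

```python
import numpy as np

def lms_filter(x, d, n_taps=8, mu=0.02):
    """Least mean squares (Widrow-Hoff) adaptive filter.

    x: observed input signal, d: desired signal, mu: step size.
    Returns the filter output y and the error e = d - y.
    """
    w = np.zeros(n_taps)                      # adaptive weights
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]   # most recent n_taps samples
        y[n] = w @ u                          # filter output
        e[n] = d[n] - y[n]                    # instantaneous error
        w += mu * e[n] * u                    # gradient-descent weight update
    return y, e

# Usage: recover a sinusoid buried in white noise.
rng = np.random.default_rng(0)
t = np.arange(2000)
clean = np.sin(0.05 * t)
noisy = clean + 0.5 * rng.standard_normal(len(t))
y, e = lms_filter(noisy, clean)
```

After convergence, the residual error `e` is smaller than the raw noise, because the filter's weights approach the optimal (Wiener) solution that the known-statistics design would have produced directly.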