About David Zuckerman's Work
My research focuses on randomness and computing. Many randomized algorithms are faster than algorithms that don't use randomness. Scientists use random numbers to simulate complex systems such as the climate, and computer security is impossible without random numbers. On the other hand, computers use ad hoc sources of randomness that are really only slightly random. It is therefore essential to improve the quality of a random source. Unfortunately, this is impossible with just one general, low-quality random source. In joint work with my then PhD student Eshan Chattopadhyay, I recently solved a longstanding open problem by introducing an algorithm that combines two independent low-quality random sources to create one high-quality random source. Previous attempts needed at least one of the two input sources to be of moderately high quality. The new algorithm, called a two-source extractor, also gives a major improvement to an important mathematical problem in Ramsey theory.
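To give a flavor of what a two-source extractor does, here is a minimal sketch of a much older and simpler construction, the inner-product extractor of Chor and Goldreich: it extracts a single nearly uniform bit from two independent n-bit sources, but only when each source has min-entropy rate above 1/2. The Chattopadhyay–Zuckerman extractor is far stronger and far more involved, working even when both sources have only polylogarithmic min-entropy; this toy code illustrates the problem setup, not their algorithm.

```python
def inner_product_extractor(x: int, y: int, n: int) -> int:
    """Toy two-source extractor (Chor-Goldreich inner product over GF(2)).

    Given samples x and y from two independent weak n-bit random sources,
    return their inner product mod 2: the parity of the bitwise AND.
    The output bit is close to uniform whenever each source has
    min-entropy greater than n/2.
    """
    masked = (x & y) & ((1 << n) - 1)  # bitwise AND, truncated to n bits
    return bin(masked).count("1") % 2  # parity of the set bits

# Example: two 4-bit samples from hypothetical weak sources.
bit = inner_product_extractor(0b1010, 0b0110, 4)  # AND = 0010, parity = 1
```

Each of the n bit-products contributes to the parity, so bias in either individual source tends to cancel out; proving that this cancellation actually yields a near-uniform bit, and doing so for far weaker sources, is where the real difficulty lies.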
Awards and Achievements
Simons Investigator Award
ACM Fellow (2013)
Guggenheim Memorial Foundation Fellowship (2004)
Alfred P. Sloan Research Fellowship (1996)
NSF Young Investigator Award (1994)