Dean received a B.S., summa cum laude, in computer science and economics from the University of Minnesota in 1990.[3] His undergraduate thesis, advised by Vipin Kumar, was on implementing neural networks in C.[4][5]
Before joining Google, Dean worked at DEC/Compaq's Western Research Laboratory,[9] where his work focused on profiling tools, microprocessor architecture, and information retrieval.[10] Much of this work was done in close collaboration with Sanjay Ghemawat.[11][6]
While at Google, he designed and implemented large portions of the company's advertising, crawling, indexing, and query-serving systems, along with various pieces of the distributed computing infrastructure that underlies most of Google's products.[6] At various times, he has also worked on improving search quality, statistical machine translation, and internal software development tools, and he has been significantly involved in the engineering hiring process.
The projects Dean has worked on include:
Original design of Protocol Buffers, an open-source data interchange format.
Spanner, a scalable, multi-version, globally distributed, and synchronously replicated database
Parts of the production system design and the statistical machine translation system for Google Translate
Bigtable, a large-scale semi-structured storage system[6]
MapReduce, a system for large-scale data processing applications[6]
DistBelief, a proprietary machine-learning system for the distributed training of deep neural networks; its name refers in part to its ability to train deep belief networks. It was later refactored into TensorFlow, and it was used to train the network in "the cat neuron paper".[12][14]
TensorFlow, an open-source machine-learning software library. He was one of the primary designers and implementors of the initial system.[15]
Pathways, an asynchronous distributed dataflow system for neural networks. It was used in PaLM.[15]
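Of the systems above, MapReduce is defined by an especially simple programming model: a user-supplied map function emits key/value pairs from each input, and a reduce function aggregates the values collected for each key. The following is an illustrative single-process Python sketch of that model (the function and variable names are hypothetical, and the real system distributes these phases across many machines):

```python
from collections import defaultdict
from itertools import chain

def map_fn(doc):
    # Map phase: emit a (word, 1) pair for every word in a document.
    for word in doc.split():
        yield (word, 1)

def reduce_fn(key, values):
    # Reduce phase: fold all counts emitted for one word into a total.
    return key, sum(values)

def map_reduce(inputs, map_fn, reduce_fn):
    # Shuffle phase: group every emitted value by its key.
    groups = defaultdict(list)
    for key, value in chain.from_iterable(map_fn(x) for x in inputs):
        groups[key].append(value)
    # Apply the reduce function to each key's grouped values.
    return dict(reduce_fn(k, vs) for k, vs in sorted(groups.items()))

counts = map_reduce(["the cat sat", "the cat"], map_fn, reduce_fn)
# counts == {"cat": 2, "sat": 1, "the": 2}
```

Because the map calls are independent and the reduce calls touch disjoint keys, both phases parallelize naturally, which is what made the model suitable for large-scale data processing.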
He was an early member of Google Brain,[6] a team that studies large-scale artificial neural networks, and he has headed artificial intelligence efforts since they were split from Google Search.[16]
In 2020, after Timnit Gebru attempted to publish a paper, Dean wrote that an internal review had concluded the paper "ignored too much relevant research" and did not meet Google's bar for publication, noting also that it had been submitted one day before its deadline rather than the required two weeks. Gebru challenged Google's research review process and wrote that, if her concerns were not addressed, they could "work on an end date". Google replied that it could not meet her conditions and accepted her resignation immediately; Gebru stated that she had been fired, leading to a controversy. Dean later published a memo on Google's approach to the review process.[17][18]
In 2023, DeepMind was merged with Google Brain to form a unified AI research unit, Google DeepMind. As part of this reorganization, Dean became Google's chief scientist.[2][15]
He is the subject of an Internet meme, "Jeff Dean facts", which, like Chuck Norris facts, exaggerate his programming powers.[20] For example:[21]
Once, in early 2002, when the index servers went down, Jeff Dean answered user queries manually for two hours. Evals showed a quality improvement of 5 points.
Dean was interviewed for the 2018 book Architects of Intelligence: The Truth About AI from the People Building It by the American futurist Martin Ford.[25]
Fay Chang, Jeff Dean, Sanjay Ghemawat, Wilson C. Hsieh, Deborah A. Wallach, Mike Burrows, Tushar Chandra, Andrew Fikes, and Robert E. Gruber. 2006. Bigtable: A Distributed Storage System for Structured Data. OSDI'06: 7th Symposium on Operating System Design and Implementation (October 2006)