Bio

Gideon Mann is the Head of Data Science at Bloomberg L.P., where he guides the strategic direction for machine learning, natural language processing (NLP), and search across the company. He is part of the leadership team for the Office of the CTO. His team works on the company-wide data science platform, natural language question answering, and deep learning text processing, among other products. He is a founding member of both the Data for Good Exchange, an annual conference on data science applications for social good, and the Shift Commission on Work, Workers and Technology.

He has also been active in academic research on fact extraction, weakly supervised learning, and distributed optimization. More recently, he has been interested in applications of machine learning to problems in software engineering. From 2007 to 2014, he worked at Google Research in New York City, where his team built core internal machine learning libraries, released the Google Prediction API, and developed coLaboratory, a collaborative IPython application. Before Google, he completed a short post-doc at UMass Amherst working on problems in weakly supervised machine learning.

Mann graduated from Brown University in 1999 and received a Ph.D. from The Johns Hopkins University in 2006, where he focused on natural language processing, with a dissertation on multi-document fact extraction and fusion.

Keynote: Information in Context: Financial Conversations and News Flows

Financial information is the lifeblood of decision-making in finance, but when that information is textual, it requires significant work to uncover. At Bloomberg, we are applying machine learning techniques to extract information automatically from recovered sources in our effort to make sense of the massive volume of text available on the Bloomberg Terminal every day. The current frontier of this work lies not only in extracting facts from individual sentences, but also in beginning to understand the larger context of streams of text. The first part of the talk focuses on conversations, where the structure of the conversation itself can be used to glean insight. The second part deals with less structured news flows, where summarization and salience can aid understanding at scale. Together, these two problems illustrate how deep understanding can lead to better products.