Harrison, Matthew

Assistant Professor of Applied Mathematics

Applied Mathematics
Brown University
Box G-S121-7
121 S. Main Street
Providence, RI 02912
Phone: +1 401 863 1834
Alt. Phone: +1 401 863 2115





My research has focused primarily on applications related to neuroscience, information theory and computer vision. These topics provide ample opportunities to participate in collaborative, interdisciplinary research, something that I value and enjoy, and they relate to a subject that I find fascinating: the mathematical and computational foundations of learning and intelligence. They also provide an endless supply of interesting mathematical and statistical problems.


BIOGRAPHY


Please visit Professor Harrison's Homepage for current information.
Professor Harrison's research interests include the following areas:
Statistics. Conditional inference, Multiple hypothesis testing, Sequential importance sampling
Neuroscience. Pattern detection in multi-neuronal spiking data, Exploratory data analysis
Information theory. Rate distortion theory, Model selection
Computer vision. Structured statistical models, Natural scene statistics, Perceptual organization


INTERESTS


Over the next several years I plan to continue existing collaborations and develop new collaborations with scientists studying biological and machine intelligence. Anticipating the specific research questions that will arise is difficult, but probability, statistics and computing will surely play important roles. I expect neuroscience, information theory and computer vision to remain central to my work.
Neuroscience
Perception, cognition and behavior arise in part from the electrical signals among billions of neurons. The details of this process are still a mystery. Modern neurophysiological techniques permit the simultaneous recording of the electrical activity of hundreds of individual neurons (and this number is quickly growing) or, more coarsely, the global activity of brain regions composed of hundreds of thousands of neurons. The high dimensionality of the resulting data sets, combined with the peculiarities of neural activity patterns, creates challenging statistical issues. Methods for exploratory statistical analysis are crucial right now in neuroscience, and will probably remain crucial for years to come. This is not to say that statistical modeling is premature, but rather that models are typically used in an exploratory fashion. The conflux of rapidly changing technology, high-dimensional datasets, demand for exploratory methods, and sustained scientific excitement (and funding) all combine to create a fertile environment for statistical creativity and novel methodology.
Information theory
Information theory provides a rigorous probabilistic setting for understanding the fundamental limits of digital coding and communication. Because of its tight connection to probability theory, many results in probability and statistics can be interpreted from an information theoretic point of view. This new perspective has led to a variety of insights, such as the minimum description length (MDL) approach to model selection and learning [1]. Information theoretic ideas and interpretations are becoming commonplace in several applied areas, such as the biological sciences. Typically these applications borrow ideas, like entropy, that are related to theoretical lossless compression. Theoretical lossy compression, on the other hand, is a much less developed field with few existing connections to scientific applications. Here the rate-distortion function plays the role of the entropy and describes the optimal amount of achievable compression. Furthermore, practical lossy compression lags far behind practical lossless compression in terms of approaching optimal compression performance. Closing this gap will likely require new mathematical and computational insights, as will drawing tighter connections between lossy compression and scientific applications. I am convinced such connections exist. For example, lossy compression (which attempts to preserve semantic content at the expense of precise detail) seems more analogous to biological learning and perception than does lossless compression (which must preserve every detail).
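For readers unfamiliar with the quantity mentioned above, the rate-distortion function is the standard information-theoretic object (due to Shannon) that makes "optimal amount of achievable compression" precise. For a source X, a reconstruction X̂, and a distortion measure d, it is defined as the minimum mutual information over all conditional reconstruction distributions meeting an average-distortion budget D:

```latex
R(D) \;=\; \min_{\substack{p(\hat{x}\mid x)\,:\\ \mathbb{E}\,[\,d(X,\hat{X})\,]\,\le\, D}} \; I(X;\hat{X})
```

In the limit D → 0 (lossless recovery, for a discrete source) R(D) approaches the entropy H(X), which is the sense in which the rate-distortion function generalizes the role entropy plays in lossless compression.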
Computer vision
The gap between biological and machine vision is substantial. The technological impact of closing this gap and the potential benefits for society are quite profound. One place where this gap is especially apparent is visual learning. A child can accurately learn a novel visual category (like "helicopter") with only a few examples (maybe only a single example). But a visual category is an incredibly complicated, high-dimensional statistical object (think of the collection of retinal images that contain a helicopter). How is rapid learning possible? Knowledge sharing (or transfer) among visual categories is often suggested as a possible solution. Inheriting ideas from my PhD advisor, Stuart Geman, I believe that highly structured statistical models (in particular, compositional models) are required for efficient sharing and transfer of knowledge. In any event, visual category learning represents an incredibly difficult statistical problem for which we know a solution exists. Even partial progress will push the boundaries of statistical methodology, generate new technology and provide scientific insights into biological vision.


DEGREES



AWARDS


Phi Beta Kappa, University of Virginia (1997)
Jefferson Scholarship, University of Virginia (1994-1998)
Howard Hughes Medical Institute Predoctoral Fellowship in Biological Sciences Awardee (1998)
National Defense Science and Engineering Graduate Fellowship (1998-2001)


AFFILIATIONS


Co-Chair, Bayesian Nonparametrics Workshop, ICERM, Providence, RI, 2012
Organizer, Invited Session on Statistical Methods in Neuroscience, 8th World Congress in Probability and Statistics, Istanbul, Turkey, 2012
Member of AMA, ASA, IEEE, SIAM


TEACHING


Probability and Mathematical Statistics
Intermediate Statistics
Probability Theory and Random Processes
Engineering Statistics and Quality Control
Mathematical Methods in the Brain Sciences


FUNDED RESEARCH


Please see Curriculum Vitae


WEB LINKS



Curriculum Vitae


Download Matthew Harrison's Curriculum Vitae in PDF Format


© 2015 All Rights Reserved | Brown University Center for Statistical Sciences | Phone: 401.863.9181 | Fax: 401.863.9182 | center@stat.brown.edu