RESEARCH INTERESTS
My interests are in harmonic analysis, wavelets, and multiscale analysis in general, in particular with applications to the analysis of graphs and of data sets viewed as discrete or sampled continuous geometric structures embedded in high-dimensional spaces. I am also interested in machine learning problems, mostly from the point of view of approximation and fitting of functions under random noise and random sampling.
See the Research tab for more info about: Diffusion Wavelets: a construction of families of wavelets and Multiresolution Analyses on graphs, manifolds and point clouds. Pictures, papers and presentations available.
 Diffusion Geometries: here are some links to the use of diffusion geometries in data analysis.
 Multiscale Geometric Methods for Data: various techniques for studying the geometry of high-dimensional data in a multiscale fashion.
 Analysis of Molecular Dynamics Data: in collaboration with Cecilia Clementi and her lab, we use the geometric structure of data generated by molecular dynamics simulations to construct observables that provide reaction coordinates and reduced, low-dimensional dynamics that well-approximate the long-time dynamics of the original system.
 Multiscale Analysis of Markov Decision Processes.
 Visualization of large data sets.
 Harmonic Analysis and Wavelets: here I talk a bit about Harmonic Analysis and provide links to related web pages.
 HyperSpectral Imaging and Pathology: hyperspectral imaging applied to pathology.
Postdocs and students
Current: Paul Escande, Sven Cattell, Stefano Vigogna, Alessandro Lanteri, Wenjing Liao, James Murphy, Sui Tang, Ming Zhong
Past: Jake Bouvrie, Guangliang Chen (SJSU), Miles Crosskey (now at CoVar Applied Technologies), Mark Iwen (MSU), David Lawlor (now at NOKIA's HERE), Stas Minsker (USC), Jason Lee, Nate Strawn (Georgetown University), Josh Vogelstein (JHU), YoonMo Jung, Prakash Balachandran, Anna V Little (Jacksonville U).
Pointers to some future, present and recent past events
 HopHacks Data Science Workshop and Challenge at MINDS
 The Mathematical Institute for Data Science (MINDS) at Johns Hopkins is born! The inaugural workshop is on Nov. 3rd.
 The Workshop on Trends and Advances in Monte Carlo Sampling Algorithms will be held at SAMSI on 12/11-12/15, 2017.
 The Data Seminar is continuing this Fall 2017 at JHU
 Slides of M. Fornasier's lecture on agents and sparse control
 Trends and Advances in Monte Carlo Sampling Algorithms, Duke University, December 11-15
 Dynamics and geometry from high dimensional data, CMU, Mar 14-16, 2017. Poster
 Conferences and Meetings on Applied Maths: Neural Networks and Artificial Intelligence, Machine Learning
 Norbert Wiener Center at UMD
 The IPAM program on Understanding Many-Particle Systems with Machine Learning is running Sep.-Dec. 2016
 ICERM workshop on High-Dimensional Stochastic Systems and Statistics of Data, 2016
 PCMI Summer School on Mathematics of Data, 2016. Materials for my lectures.
 Duke Workshop on Sensing and Analysis of High-Dimensional Data, July 23-25, 2013
 Structure in Complex Data Sets, a Duke Mathematics N.S.F.-funded Research Training Grant.
 Triangle Computer Science Distinguished Lecturer Series
 Mathematical Biology, a Duke Mathematics N.S.F.-funded Research Training Grant.
 Computational Geometry Week, including ACM SoCG 2012, June 17-20, in Chapel Hill, NC, USA. Special workshop Connections between analysis and computational geometry, organized by Chris Bishop. See also the special workshop Computational Geometric Learning: Exploring geometric structure in high dimensions organized by J. Giesen, C.K. Muller, G. Rote.
 Challenges in Geometry, Analysis and Computation: High-Dimensional Synthesis, June 4-6, 2012. A conference in honor of R.R. Coifman, P.W. Jones and V. Rokhlin.
 I.M.A. Thematic Year on the Mathematics of Information 2011-12
 ICIAM Minisymposium on Harmonic Analysis on Graphs and Networks on July 22, 2011
 Duke workshop on Sensing and Analysis of High-Dimensional Data on July 26-28
 AMS Special Session on The Mathematics of Information and Knowledge: a page with links and some of the slides of the talks here.
 CTMS Workshop on Large Data Sets: Computation and Structure, Nov. 13th, Duke University, NC. The schedule will be available here.
 Forum on Geometric Aspects of Machine Learning and Visual Analytics, Oct 11-12, IEEE VisWeek, Atlantic City, NJ.
 SAMSI opening workshop on stochastic dynamics, part of the long program on stochastic dynamics.
Collaborators
Cecilia Clementi, Ronald R. Coifman, David Brady, Jason Lee, Sridhar Mahadevan, Raanan Schul, Sayan Mukherjee, Andreas Coppi, Frank B. Geshwind, Richard DeVerse, Gustave L. Davis, Francis Woolfe, Peter W. Jones, Stephane Lafon, M. Mahoney, Francois Meyer, Hrushikesh Mhaskar, Arthur D. Szlam, Johan Walden, Frederick J. Warner, Steven W. Zucker, Petros Drineas, James C. Bremer Jr., Anna Lin, Eric Monson, Xilin Shen, William Goetzmann
Links of Interest
 Special issue of Science on Dealing with Data
 Statistical Theory and Methods for Complex, High-Dimensional Data, Isaac Newton Institute for Mathematical Sciences, Cambridge, U.K.
 Slides of talks given at the workshop on eigenfunctions of the Laplacian, organized by N. Saito and myself, held at ICIAM 2007.
 Tutorial on Manifold and Spectral Methods for MDPs, by S. Mahadevan and myself, held at ICML 2006.
 Mathematics Calendar from AMS.
 Institute for Pure and Applied Mathematics (IPAM) at UCLA. Conference on Document Space at IPAM in January 2006. Summer School on Intelligent Extraction of Information from Graphs and High Dimensional Data (July 2005), talks available in pdf and video formats.
 Institute for Mathematics and its Applications (IMA) at the University of Minnesota, MSRI at Berkeley, SAMSI at Research Triangle Park, SISSA, Trieste.
 TTI Chicago. Conference on Machine Learning (summer 2005). Talks available in pdf and video formats.
 Manifold Learning Page, Fast Manifold Learning
 Mathematics Department and Applied Mathematics at Yale University.
 MathsciNet
 American Mathematical Society
 The Society of Mathematical Biology: meetings, conferences and workshops
 Applications of Analysis to Mathematical Biology conference at Duke.
 Visualization Tools by Mario Valle. In particular see ParaView and Ggobi
 Home pages of mathematicians (under construction): Michael Christ, David Donoho, Timothy Gowers, Fan Chung Graham, Terry Tao, Izabella Laba, ...
 The journal Applied Computational Harmonic Analysis
 The summer school on Wavelet and Multifractals (Corsica, 2004)
 Math/Harmonic Analysis Blog
 Tutorial on spectral clustering, by C. Ding
Programming, Code, etc...
 Lightspeed Matlab package, and other useful packages and suggestions for speeding up Matlab.
 Matlab tricks and tips
 Sparse Reconstruction packages:
 Software by Kevin Murphy, mostly Matlab packages (e.g. HMM, MDP, graphs...)
Data Sets
 UCI benchmark repository and UCI KDD archive.
 Benchmark data sets for semi-supervised classification, from the book by Chapelle et al.
 Benchmark repository at the Intelligent Data Analysis Group at Fraunhofer.
 Regression data sets available at LIACC.
 JSE data archive.
 Statlib data set archive.
Miscellanea

Newspapers: Il Giornale, Il Resto del Carlino
 Multiscale Geometric Methods for Data
 Diffusion Wavelets
 Diffusion Geometry
 Molecular Dynamics
 Markov Decision Processes
 Data Visualization
 Hyperspectral Imaging and Pathology
 Harmonic Analysis and Wavelets
Overview
We have developed several methods for analyzing, in a multiscale fashion, the geometry of data in high dimensions when the data is close to being low-dimensional. In particular, we developed a technique we call Multiscale SVD: the main idea is to fix a point and look at the behavior of the singular values of the covariance of the data in a ball of radius r centered at that point, as r varies. This behavior reveals the local intrinsic dimension of the data, is robust to noise and sampling, and can also be used to estimate the largest region around the point that is approximately low-dimensional. This is useful in a wide variety of applications: we have used it to analyze molecular dynamics data and construct reduced models, to estimate the intrinsic dimension of a variety of data sets, and to estimate K-flats approximations to data efficiently.
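As a rough illustration of the multiscale SVD idea, here is a minimal Python/NumPy sketch (the function names, parameters and toy data are mine, not a reference implementation): it computes the singular values of the local covariance in balls of growing radius around a chosen point.

    import numpy as np

    def multiscale_singular_values(X, center, radii):
        """Singular values of the local covariance of X in balls of
        increasing radius r around `center` (one row per radius)."""
        dists = np.linalg.norm(X - center, axis=1)
        out = []
        for r in radii:
            ball = X[dists <= r]
            if len(ball) < 2:
                out.append(np.zeros(X.shape[1]))
                continue
            C = ball - ball.mean(axis=0)          # center the local data
            s = np.linalg.svd(C / np.sqrt(len(ball)), compute_uv=False)
            out.append(np.pad(s, (0, X.shape[1] - len(s))))
        return np.array(out)

    # Toy example: a noisy 2-d plane embedded in R^10.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 2)) @ rng.normal(size=(2, 10))
    X += 0.05 * rng.normal(size=X.shape)          # ambient noise
    S = multiscale_singular_values(X, X[0], radii=np.linspace(0.3, 3.0, 10))
    # The first 2 singular values grow roughly linearly with r, the rest stay
    # near the noise level: the gap reveals the intrinsic dimension (here, 2).
    print(S.round(2))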
We have also developed a technique called Geometric Multi-Resolution Analysis (GMRA), which we use to construct piecewise-linear multiscale approximations to data. This leads to a novel way of approximating data, of decomposing data matrices in a hierarchical fashion, and of constructing dictionaries that sparsify data; we also use it as a scaffold on which to efficiently encode data and run a variety of estimation tasks (from regression to classification to model reduction for stochastic systems), efficiently both from a sample-complexity and from a computational perspective.
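A toy sketch of the GMRA idea, under simplifying assumptions of my own (cells obtained by recursively splitting along the top local principal direction, a fixed local dimension d, no wavelet correction terms); it only illustrates "piecewise-linear multiscale approximation" and is not the actual GMRA code:

    import numpy as np

    def build_gmra(X, d=1, max_depth=4, min_pts=20):
        """Recursively split X and store, for each cell, its mean and
        top-d principal directions: a crude piecewise-linear multiscale model."""
        node = {"mean": X.mean(axis=0)}
        C = X - node["mean"]
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        node["basis"] = Vt[:d]                       # local d-dim PCA plane
        if max_depth > 0 and len(X) >= 2 * min_pts:
            side = C @ Vt[0] > 0                     # split along top direction
            if side.sum() >= min_pts and (~side).sum() >= min_pts:
                node["children"] = [build_gmra(X[~side], d, max_depth - 1, min_pts),
                                    build_gmra(X[side], d, max_depth - 1, min_pts)]
        return node

    def project(node, x, depth):
        """Approximate x with the local PCA plane of the cell containing it
        at the requested depth (coarse-to-fine refinement)."""
        if depth > 0 and "children" in node:
            child = node["children"][int((x - node["mean"]) @ node["basis"][0] > 0)]
            return project(child, x, depth - 1)
        return node["mean"] + ((x - node["mean"]) @ node["basis"].T) @ node["basis"]

    # Noisy circle in the plane: the approximation error decreases with depth.
    t = np.linspace(0, 2 * np.pi, 1000)
    X = np.c_[np.cos(t), np.sin(t)] + 0.01 * np.random.randn(1000, 2)
    tree = build_gmra(X, d=1)
    for depth in range(5):
        err = np.mean([np.linalg.norm(x - project(tree, x, depth)) for x in X])
        print(depth, round(err, 4))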
Papers
 Multiscale SVD and Intrinsic Dimensions
 Multiscale Geometric Methods for Data Sets I: Multiscale SVD, Noise and Curvature, A. V. Little, M. Maggioni, L. Rosasco. This paper summarizes the work in A.V. Little's thesis (May 2011) on multiscale singular values for noisy point clouds. It extends the analysis of the constructions and results in our previous work Multiscale Estimation of Intrinsic Dimensionality of Data Sets (A.V. Little, Y.M. Jung and M. Maggioni, Proc. AAAI, 2009; see also a presentation by A.V. Little here) and Estimation of intrinsic dimensionality of samples from noisy low-dimensional manifolds in high dimensions with multiscale SVD (A.V. Little, J. Lee, Y.M. Jung and M. Maggioni, Proc. SSP, 2009). Appeared in A.C.H.A. in Mar. '16, almost 4 years after submission.
 Multiscale Geometric Methods for estimating intrinsic dimension, A.V. Little, M. Maggioni, L. Rosasco, Proc. SampTA 2011
 Multiscale geometric wavelets for the analysis of point clouds, G. Chen, M. Maggioni, Proc. CISS, 2009
 Geometric MultiResolution Analysis (GMRA) and Dictionary Learning:
 Dictionary Learning and NonAsymptotic Bounds for Geometric MultiResolution Analysis, with S. Minsker and N. Strawn, in Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology", 2014, as well as here
 Multiscale Dictionary Learning: NonAsymptotic Bounds and Robustness, M. Maggioni, S. Minsker, N. Strawn, to appear in J.M.L.R., submitted in 2014
 With connections to compressed sensing and estimation of probability measures: A Fast Multiscale Framework for Data in High Dimensions: Measure Estimation, Anomaly Detection, and Compressive Measurements, G. Chen and M. Iwen and S. Chin and M. Maggioni, Visual Communications and Image Processing (VCIP), Nov. 2012 IEEE (submitted April 2012, published version available here).
 With connections to compressed sensing: Approximation of Points on LowDimensional Manifolds Via Random Linear Projections, M. Iwen, M. Maggioni. Appears here in Information and Inference, 2, 2013.
 With connections to learning dictionaries for multiscale patches: Multiscale dictionaries, transforms, and learning in highdimensions, S. Gerber, M. Maggioni, Proc. SPIE 8858, Wavelets and Sparsity XV, 88581T, 2013.
 Multiscale Geometric Methods for Data Sets II: Geometric Wavelets, W.K. Allard, G. Chen, M. Maggioni, Appl. Comp. Harm. Anal., Vol. 32(3), May 2012, 435-462. This is a detailed and improved construction of geometric wavelets for point clouds, originally introduced in the 2010 conference paper here.
 With applications to model reduction and control: Geometric Multiscale Reduction for Autonomous and Controlled Nonlinear Systems, J. Bouvrie and M. Maggioni, IEEE Conference on Decision and Control (CDC), 2012.
 Multiscale Geometric Dictionaries for pointcloud data, G. Chen, M. Maggioni, Proc. SampTA 2011
 Modeling with multiple planes:
 Multiscale Analysis of Plane Arrangements, G. Chen, M. Maggioni, CVPR, 2011. See also the corresponding poster and code on G. Chen's webpage.
Overview
Diffusion wavelets generalize classical wavelets, allowing for multiscale analysis on general structures such as manifolds, graphs and point clouds in Euclidean space. They make it possible to perform signal processing tasks on functions defined on these spaces, and this has several applications. The ones we are currently focusing on arise in the study of data sets that can be modeled as graphs, where one is interested in learning functions on such graphs. For example, we can consider a graph whose vertices are proteins and whose edges connect interacting proteins, with the function on the graph labeling a functionality of the protein. Or each vertex could be an image (e.g. a handwritten digit), edges connect very similar images, and the function at each vertex is the value of the digit represented by that vertex.

In classical, one-dimensional wavelet theory one applies dilations by powers of 2 and translations by integers to a mother wavelet, and obtains orthonormal wavelet bases. The classical construction has of course been generalized in many ways, considering wider groups of transformations and spaces different from the real line, such as higher-dimensional Euclidean spaces, Lie groups, etc. Most of these constructions are based on groups of geometric transformations of the space, which are then applied (as a "change of variable") to functions on that space to obtain wavelets. On general graphs, point clouds and manifolds there may not be nice or rich groups of transformations. So instead of assuming the existence of these "symmetries", we directly use semigroups of operators acting on functions on the space (and not on the space itself). We typically use "diffusion operators", because of their nice properties. Let T be a diffusion operator on a graph (e.g. the heat operator); in many cases, our graph will be a discretization of a manifold. The study of the eigenfunctions and eigenvalues of T is known as Spectral Graph Theory and can be viewed (for our purposes) as a generalization of the classical theory of Fourier series on the circle. The advantages of wavelets and multiscale techniques over classical Fourier series are well known, so it is natural to attempt to generalize wavelet theory to the setting of diffusion operators on graphs. Following Stein, we take the view that the dyadic powers of the operator T establish a scale for performing multiresolution analysis on the graph. In the paper "Diffusion Wavelets" we introduce a procedure for constructing scaling functions and wavelets adapted to these scales.
Index
 Papers
 Matlab code
 Talks
 Sketch of the Construction
 Examples
 Diffusion Wavelet Packets
 Local Discriminant Bases with Diffusion Wavelet Packets
Papers
 Diffusiondriven Multiscale Analysis on Manifolds and Graphs: topdown and bottomup constructions, M Maggioni, A Szlam, RR Coifman, JC Bremer, Proc. SPIE Wavelet XI, August 2005.
 Biorthogonal Diffusion Wavelets for Multiscale Representations on Manifolds and Graphs, M Maggioni, JC Bremer, RR Coifman, A Szlam, Proc. SPIE Wavelet XI, August 2005.
 Diffusion Wavelet Packets, JC Bremer, RR Coifman, M Maggioni, A Szlam, submitted, August 2004.
 Diffusion Wavelets, RR Coifman, M Maggioni, submitted, August 2004. ACHA special issue on Diffusion Maps and Wavelets (July 2006). If this link does not work, start from ACHA home page (see the most downloaded papers).
Sketch of the Construction
In several applications there is a need to organize structures in a multiresolution fashion, in order to process them, understand them, compress them, etc. Perhaps the most classical example is given by Fourier analysis, where different resolutions correspond to different frequency bands in which a signal can be analyzed. Wavelets, however, allow one to perform a multiresolution analysis in a somewhat stronger and better organized way in the spatial domain. The construction of wavelets in Euclidean spaces is by now in many ways quite well understood, even if interesting open questions remain for higher-dimensional wavelet constructions. Wavelets have also found wide applications in numerical analysis, both as a mathematical foundation of Fast Multipole Methods and by providing bases for Galerkin methods. Already in this latter setting, though, higher flexibility is required than low-dimensional Euclidean wavelets provide, since multiresolutions are needed on rather general domains and manifolds. Both Fast Multipole Methods and Galerkin wavelet methods are not very well adapted to the operator, and only through some effort are they adapted to the geometry of the domain/manifold. Finally, we mention the setting of Spectral Graph Theory, where one can naturally do Fourier analysis through the Laplacian of a graph, and where several multiscale constructions, for use in a variety of applications, are still quite ad hoc.

In the paper "Diffusion Wavelets" we propose a construction of wavelets on discrete (or discretized continuous) graphs and spaces, adapted to the "geometry" of a given diffusion operator T, where the attribute "diffusion" is intended in a rather general sense. The motivation for starting with a given diffusion operator is that in many cases one is interested in studying functions on the graph/space, and hence it seems natural to start with a local operator generating local relationships between functions. Powers of the operator propagate these relationships farther away until they become global. If the spectrum of T decays, large powers have low rank, and hence are compressible. For example, we can think of the range of a high power of T as being essentially spanned by very smooth functions with small gradient, or even by band-limited functions. It is natural to take advantage of this and compress the ranges of (dyadic) powers of the operator, thus obtaining a decreasing chain of subspaces, which can be interpreted as scaling-function approximation spaces; the differences between these subspaces can be called wavelet subspaces. The construction of the basis elements is nontrivial: we show one can build orthonormal bases of scaling functions and wavelets, with good localization properties, in about order n(log n)^2 operations, even if the constants are still large and their improvement is of great practical significance and is being investigated. The algorithm proceeds by applying T^(2^j), expressed on the basis of orthonormal scaling functions spanning V_j, then orthogonalizing the resulting set of vectors and discarding the ones not needed to span (numerically) the same subspace, and so on; the orthonormalization step thus encapsulates the downsampling step. Our construction works on a Riemannian manifold, with respect to, for example, the Laplace-Beltrami diffusion on the manifold, and on a weighted graph, with respect, for example, to the natural diffusion induced by the weights on the graph.
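A toy rendering of the compression step just described (my own simplified reading, not the reference Matlab code): dyadic powers of a diffusion operator are applied to the current scaling basis and their numerical range is orthonormalized with a pivoted QR, giving a decreasing chain of approximation subspaces V_j.

    import numpy as np
    from scipy.linalg import qr

    def diffusion_scaling_spaces(T, eps=1e-6, levels=6):
        """Toy version of the compression step: at each level, apply T^(2^j)
        to the current scaling basis, then keep an orthonormal basis of its
        numerical range via a rank-revealing (pivoted) QR."""
        n = T.shape[0]
        basis = np.eye(n)             # V_0: delta functions
        spaces, Tj = [basis], T.copy()
        for j in range(levels):
            M = Tj @ basis            # columns spanning T^(2^j) V_j (approx.)
            Q, R, piv = qr(M, pivoting=True, mode='economic')
            rank = int(np.sum(np.abs(np.diag(R)) > eps * np.abs(R[0, 0])))
            basis = Q[:, :rank]       # scaling functions for V_{j+1}
            spaces.append(basis)
            Tj = Tj @ Tj              # next dyadic power T^(2^(j+1))
        return spaces

    # Example: random walk on a cycle graph with 256 nodes.
    n = 256
    W = np.zeros((n, n))
    idx = np.arange(n)
    W[idx, (idx + 1) % n] = W[idx, (idx - 1) % n] = 0.5
    spaces = diffusion_scaling_spaces(W)
    print([V.shape[1] for V in spaces])   # dimensions of V_0, V_1, ... decrease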
Examples
Homogeneous diffusion on a circle
The first example is a standard diffusion on the circle: our set is the circle sampled at 256 points, an initial orthonormal basis is given by the set of 256 delta functions at these points, and the diffusion is the standard homogeneous diffusion on the circle. In the picture we plot several diffusion scaling functions in various scaling-function spaces. The finest scaling functions are the given delta functions. These diffuse into 'triangle functions' (linear splines), which are orthonormalized into a (non-translation-invariant) basis of 'triangle functions' and linear combinations of 'triangle functions'; those are then diffused again, twice, etc. The coarse scaling-function spaces are spanned by the first few top eigenfunctions of the diffusion operator, which are simply trigonometric polynomials of small frequency. The algorithm still tries to build well-localized scaling functions out of these trigonometric polynomials (see e.g. V_9, V_10, V_11) when possible. On the left we plot the compressed matrices representing powers of the diffusion operator; on the right we plot the entries of the same matrices which are above precision. The initial powers get fuller because the spectrum of the diffusion is slowly decaying; it is enough to consider, instead of the given diffusion, a small power of it to avoid this partial fill-up.
Nonhomogeneous diffusion on the circle
We consider a circle as before, but now the diffusion operator is not translation-invariant or homogeneous: the conductivity is non-constant and is depicted in the top-left figure. Think of the circle as made of different materials, most conductive at the top and least conductive (almost insulating) at the bottom. In the top-right picture we represent some scaling functions at level 15; by construction/definition, these scaling functions span the range of T^(2^16-1). Observe that the "scale" of these scaling functions is far from uniform if we measure it in a translation-invariant way (for example with standard wavelets on the circle, or with the diffusion wavelets of the previous example). However, this is exactly the scale at which the corresponding power of T is operating: at the top of the circle the diffusion acted fast, the numerical rank of the operator restricted to that region is very small, and the scaling functions are trivial there; on the bottom part of the circle this power of T is still far from trivial, since there the diffusion is much slower, and the scaling functions exhibit a more complex behavior (they still contain local high-frequency components). In the second row we show how one can compute an eigenfunction of the compressed operator and extend it back to the whole space, with good precision. In the third row we show the entries above precision of a power of the operator (left); on the right we show a "diffusion scaling function" embedding, which shows that points at the bottom of the circle have a large "diffusion distance" (it is hard/takes a long time to diffuse from one to the other) while points at the top have a small diffusion distance. Diffusion distance in the original space is roughly Euclidean distance in this "diffusion scaling function" embedding.
Diffusion on a graph of 3 Gaussian clouds
In this example we consider the graph induced by points randomly distributed according to three Gaussian random variables centered at different points. A graph is associated to these points by putting edges between nearby points, with weights that are an exponentially decaying function of the distances between the points. The top-left picture shows the points, the top-center picture shows the image of the points under the Laplacian eigenfunction map, and the remaining figures show some diffusion scaling functions and wavelets, hinting at how they could be used to localize the analysis of this graph.
Diffusion on a dumbbell-shaped manifold
We consider a dumbbell-shaped manifold, with diffusion given by (a discrete approximation to) the Laplace-Beltrami operator. The picture shows different scaling functions and wavelets at different scales on this manifold.
Extension outside the set
We also propose an algorithm for extending scaling functions and wavelets outside the data set. In the figure we show the extension of some scaling functions from the set (left) to many points randomly generated around the set.
Diffusion Wavelet Packets
Diffusion Wavelet Packets can be constructed by further splitting the wavelet subspaces, as in the classical case.
Local Discriminant Bases with Diffusion Wavelet Packets
We consider two classes of functions on the sphere, and the problem of learning a good set of features such that the projection onto them allows for good discrimination between the functions of the two classes. Functions of class A are built as the superposition of three ripply functions, with equi-oriented ripples of the same frequency, centered around three points moving slightly in a noisy way, plus two Gaussian bumps acting as decoys, which do not overlap with the three ripply functions. Functions of class B are similar, but one (randomly chosen) ripply function has ripples in a different direction than the other two. Running CART on the original data has an error of 48%, running it on the top 40 LDB features leads to an error of about 12.5%, and running it on the top discriminating eigenfunctions leads to an error of about 18%. As a second example of local discriminant bases, we consider the following two classes of functions. We fix a direction v and slightly perturb it with random Gaussian noise. Around that direction we create a ripply spherical cap, with one or two oscillations depending on the class. We then add 5 non-overlapping ripply functions, acting as decoys, randomly on the sphere. CART run on the original data has an error of about 17.5%, CART on the top 20 LDB features has an error of about 3.5%, and CART on the first 300 eigenfunctions has an error of about 31%.
Overview
Diffusion geometries refer to the large-scale geometry of a manifold or of a graph representing a data set, which is determined by long-time heat flows on the manifold/graph/data set (a multiscale analysis, including all scales and not just the large ones, is possible with diffusion wavelets). This behavior can be accurately described by the bottom eigenfunctions of the Laplacian operator on the manifold or on the graph (or data set). Moreover, these eigenfunctions can be used to define an embedding (sometimes called an eigenmap) of the manifold or graph into low-dimensional Euclidean space, in such a way that Euclidean distances in the range of the eigenmap correspond to "diffusion distances" on the manifold or graph. Techniques based on eigenvectors of similarity matrices (which are related to the Laplacian) have been used successfully for some time in several problems in data analysis, segmentation, clustering, etc. These connections, the connections to differential-geometric structures of manifolds, and problems related to the stability, computability, and extendibility of these eigenfunctions have been explored extensively in Stephane Lafon's thesis. Further material, including talks and papers, is available on Stephane Lafon's webpage.
Diffusion Geometry was introduced with R. Coifman and Stephane Lafon (now at Google; check his home page for cool demos introducing the basics of diffusion geometries!). There are a few key ideas behind the introduction of Diffusion Geometry. The first key idea is that for high-dimensional data large distances are often not reliable, as they are severely corrupted by noise, and if the data has low-dimensional geometry, large distances do not respect such geometry. Therefore one starts by connecting each data point only with the closest points, those close enough that the distance may be trusted. This leads to a graph where each point is connected to its nearest points, possibly weighted in such a way that closer points are connected by edges with greater weight.
 The second key idea is that even if the data lies on a low-dimensional manifold, the geodesic (i.e. shortest-path) distance on the manifold is not necessarily an effective distance, as it may too easily be corrupted by noise. In diffusion geometry the geodesic distance is replaced by the diffusion distance, which is in fact a family of distances parametrized by time. The diffusion distance between two points at time t takes into consideration all paths of length up to t in order to evaluate a distance between the two points, weighting each path by the probability of realizing that path according to the random walk on the graph of nearest points previously constructed.
 The third idea is that one may create a map from the high-dimensional space to a low-dimensional space that respects diffusion distances, by using the top eigenvectors of the random walk on the graph (a minimal sketch appears after the applications below). This is essentially kernel PCA, with the kernel being the random walk on the graph; in particular, the kernel is data-adaptive.
 The fourth idea is that this construction may be made multiscale. With diffusion wavelets, this idea is carried further, to consider multiple scales on the graph in a multiresolution fashion.
Applications include:
 Model reduction in molecular dynamics: Dimitris Giannakis and his collaborators have extended the construction of diffusion geometry to trajectories of dynamical systems by taking velocities into account, and used this construction to perform dimension reduction of complex high-dimensional systems.
 The study of paths of activity in gene networks, DNA sequencing, hyperspectral imaging, and much more.
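A minimal sketch of the basic eigenvector embedding mentioned in the third idea above (the kernel, bandwidth, toy data and parameter names are illustrative choices of mine, not the constructions in the papers):

    import numpy as np
    from scipy.spatial.distance import cdist

    def diffusion_map(X, eps, t=1, n_coords=2):
        """Basic diffusion map: Gaussian affinities, random-walk normalization,
        embedding by the top non-trivial eigenvectors scaled by eigenvalue^t."""
        K = np.exp(-cdist(X, X, 'sqeuclidean') / eps)    # affinities
        P = K / K.sum(axis=1, keepdims=True)             # random walk matrix
        evals, evecs = np.linalg.eig(P)
        order = np.argsort(-evals.real)
        evals, evecs = evals.real[order], evecs.real[:, order]
        # Skip the trivial constant eigenvector (eigenvalue 1).
        return evecs[:, 1:n_coords + 1] * evals[1:n_coords + 1] ** t

    # Noisy circle in R^3: the embedding recovers the circular structure.
    theta = np.linspace(0, 2 * np.pi, 300, endpoint=False)
    X = np.c_[np.cos(theta), np.sin(theta), 0.1 * np.random.randn(300)]
    Y = diffusion_map(X, eps=0.1, t=2)
    print(Y.shape)    # (300, 2); Euclidean distances in Y approximate
                      # diffusion distances at time t on the data graph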
See here the list of works citing Diffusion Geometry
Papers
 The short papers (1,2) that appeared in P.N.A.S. are a short introduction to the main ideas.
 The ACHA special issue on Diffusion Maps and Wavelets is a good starting point for more in-depth reading. It contains several papers, including one laying down the main construction and ideas, and one with diffusion wavelets.
 A general framework for adaptive regularization based on diffusion processes on graphs, A.D. Szlam, M. Maggioni, R.R. Coifman, to appear in J.M.L.R., August 2006. Contains applications to machine learning, in particular semi-supervised learning, as well as image denoising (in a perhaps surprisingly unifying framework). It also gives a diffusion-geometry interpretation of non-local means (which we rediscovered in this paper), as essentially running a heat-kernel smoothing on the graph of patches of an image. Note that the JMLR version of the paper does not include the denoising of images (considered not relevant to the journal), but the original version does.
 The paper Multiscale Analysis of Data Sets using Diffusion Wavelets, which appeared in Proc. Data Mining for BIOmedical Informatics, in conjunction with the SIAM Conference on Data Mining, contains applications to the analysis of document corpora.
Overview
This is an ongoing collaboration with Cecilia Clementi and various people in her lab at Rice. Publications:
 M. A. Rohrdanz, W. Zheng, M. Maggioni, C. Clementi, Determination of reaction coordinates via locally scaled diffusion map, J. Chem. Phys., 134 (2011): 124116
 W. Zheng, M. A. Rohrdanz, M. Maggioni, C. Clementi, Polymer reversal rate calculated via locally scaled diffusion map, J. Chem. Phys., 134 (2011): 144108
See also the snippet at the Institute for Pure and Applied Mathematics, where this research was initiated, here
The Extensible Tools for Advanced Sampling and Analysis project, for developing large-scale analysis tools for molecular dynamics data
The main idea is that data from molecular dynamics simulations, e.g. in the form of the coordinates of the atoms in a molecule as a function of time, lie on or near an intrinsically low-dimensional set in the high-dimensional state space of the molecule, and that geometric properties of such sets provide important information about the dynamics, and about how to build low-dimensional representations of it. We apply recent work on the estimation of the intrinsic dimension of data sets in high dimensions to such data, validating the hypothesis that the set of configurations of the molecule does indeed lie on an intrinsically low-dimensional set (at least in the examples considered). We then use this information, together with a notion of local scale (roughly defined as the largest scale at which the data is well-approximated by a low-dimensional linear subspace), to introduce a variation of diffusion maps that leads to a set of nonlinear coordinates in state space onto which we may project the dynamics and construct a low-dimensional diffusion process that well-approximates the large-time behavior of the molecular dynamics simulation.
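A crude illustration of the "local scale" ingredient (the bandwidth rule below, the distance to the k-th nearest neighbor, is only a placeholder; the locally scaled diffusion map in the papers above estimates the local scale more carefully):

    import numpy as np
    from scipy.spatial.distance import cdist

    def locally_scaled_kernel(X, k=10):
        """Gaussian affinities with a per-point bandwidth eps_i (here, simply the
        distance to the k-th nearest neighbor), K_ij = exp(-d_ij^2/(eps_i*eps_j)).
        A crude stand-in for the local-scale estimate used in the papers."""
        D = cdist(X, X)
        eps = np.sort(D, axis=1)[:, k]            # per-point local scale
        K = np.exp(-D**2 / np.outer(eps, eps))
        return K / K.sum(axis=1, keepdims=True)   # row-normalize: random walk

    # Toy data with regions of very different sampling density.
    rng = np.random.default_rng(1)
    X = np.concatenate([rng.normal(0, 0.1, size=(500, 3)),     # dense basin
                        rng.normal(3, 1.0, size=(100, 3))])    # sparse basin
    P = locally_scaled_kernel(X)
    print(P.shape)   # eigenvectors of P would give the reaction-coordinate-like
                     # nonlinear coordinates discussed above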
See this nugget and the papers above for more information.
Links to pages of interest
Contents
I recently started working with Sridhar Mahadevan on Markov Decision Processes. The tools of harmonic analysis on graphs, especially the eigenfunctions of the Laplacian and Diffusion Wavelets, seem to find very natural applications to the study of Markov Decision Processes. First of all, they can efficiently encode functions on the general state spaces of these processes, thus performing a dimensionality-reduction task which is recognized as fundamental for the efficient computation of these processes. Secondly, there are many connections between Markov Decision Processes and the harmonic analysis of Markov Chains, stemming from the connections between the Bellman equation and the Green's function, or fundamental matrix, of a Markov Chain. On the other hand, the nonlinearity of the optimization problem and the stochasticity of most ingredients of a Markov Decision Process offer new challenges. I will add more to this page as we make progress in this program.
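As a small illustration of the first point, here is a sketch of using the lowest graph-Laplacian eigenvectors as a basis for functions on a state-space graph (the grid, the target function and the basis size are illustrative choices only, not taken from the papers below):

    import numpy as np

    # State space: a 20x20 grid graph; states are its vertices.
    n = 20
    idx = lambda i, j: i * n + j
    A = np.zeros((n * n, n * n))
    for i in range(n):
        for j in range(n):
            for di, dj in [(1, 0), (0, 1)]:
                if i + di < n and j + dj < n:
                    A[idx(i, j), idx(i + di, j + dj)] = 1
                    A[idx(i + di, j + dj), idx(i, j)] = 1
    L = np.diag(A.sum(axis=1)) - A                 # combinatorial graph Laplacian

    # Basis functions: the eigenvectors with smallest eigenvalues.
    evals, evecs = np.linalg.eigh(L)
    Phi = evecs[:, :20]                            # 20 basis functions

    # A target function on the state space (a smooth bump, standing in for the
    # value function of some policy), approximated by least squares in Phi.
    xs, ys = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    V = np.exp(-((xs - 15)**2 + (ys - 15)**2) / 50.0).ravel()
    coeffs, *_ = np.linalg.lstsq(Phi, V, rcond=None)
    # Relative approximation error (decreases as more basis functions are used).
    print(np.linalg.norm(Phi @ coeffs - V) / np.linalg.norm(V))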
Publications
The most recent work on multiscale/hierarchical representations for MDP's is this paper: Multiscale Markov Decision Problems: Compression, Solution, and Transfer Learning, J. Bouvrie, M. Maggioni. A short version is available here: Efficient Solution of Markov Decision Problems with Multiscale Representations, J. Bouvrie and M. Maggioni, Proc. 50th Annual Allerton Conference on Communication, Control, and Computing, 2012.
In these works we introduce an automatic multiscale decomposition of MDP's, which leads to a hierarchical set of MDP's "at different scales", corresponding to different portions of state space separated by "geometric" and "task-specific" bottlenecks. The hierarchy of nested subproblems is such that each subproblem is itself a general type of MDP that may be solved independently of the others (thereby giving trivial parallelization), and the solutions may then be stitched together through a certain type of "boundary conditions". These boundary conditions are propagated top to bottom, by solving the coarsest problem(s) first, propagating down the solutions (essentially, the boundary conditions for the finer-scale problems), and "filling in" these coarse solutions by solving the finer problems with the propagated boundary conditions. Besides giving fast parallelizable algorithms for the solution of large MDP's that are amenable to this decomposition, these decompositions also allow one to perform transfer learning of subproblems (possibly at different scales), thereby enhancing the possibilities and power of transfer learning.

Here are electronic copies of two Technical Reports written with my collaborator Sridhar Mahadevan at the Computer Science Dept. at the University of Massachusetts at Amherst about the application of the eigenfunctions of the Laplacian and Diffusion Wavelets to the solution of Markov Decision Processes:
Sridhar and I organized a workshop on the application of spectral methods to Markov Decision Processes; check out the page on our ICML '06 Workshop, held during ICML 2006.
Proto-value Functions: A Laplacian Framework for Learning Representation and Control in Markov Decision Processes, with S. Mahadevan. Tech Report Univ. Mass. Amherst, CMPSCI 2006-35, May 2005.
This paper introduces a novel paradigm for solving Markov decision processes (MDPs), based on jointly learning representations and optimal policies. Proto-value functions are geometrically customized, task-independent basis functions forming the building blocks of all value functions on a given state-space graph or manifold. In this first of two papers, proto-value functions are constructed using the eigenfunctions of the (graph or manifold) Laplacian, which can be viewed as undertaking a Fourier analysis on the state-space graph. The companion paper (Maggioni and Mahadevan, 2006) investigates building proto-value functions using a multiresolution manifold analysis framework called diffusion wavelets, which is an extension of classical wavelet representations to graphs and manifolds. Proto-value functions combine insights from spectral graph theory, harmonic analysis, and Riemannian manifolds. A novel variant of approximate policy iteration, called representation policy iteration, is described, which combines learning representations and approximately optimal policies. Two strategies for scaling proto-value functions to continuous or large discrete MDPs are described. For continuous domains, the Nyström extension is used to interpolate Laplacian eigenfunctions to novel states. To handle large structured domains, a hierarchical framework is presented that compactly represents proto-value functions as tensor products of simpler proto-value functions on component subgraphs. A variety of experiments are reported, including perturbation analysis to evaluate parameter sensitivity, and detailed comparisons of proto-value functions with traditional parametric function approximators.
A Multiscale Framework for Markov Decision Processes using Diffusion Wavelets, with S. Mahadevan. Tech Report Univ. Mass. Amherst, CMPSCI 2006-36.
We present a novel hierarchical framework for solving Markov decision processes (MDPs) using a multiscale method called diffusion wavelets. Diffusion wavelet bases significantly differ from the Laplacian eigenfunctions studied in the companion paper (Mahadevan and Maggioni, 2006): the basis functions have compact support, are inherently multiscale both spectrally and spatially, and capture localized geometric features of the state space, and of functions on it, at different granularities in space-frequency. Classes of (value) functions that can be compactly represented in diffusion wavelets include piecewise smooth functions. Diffusion wavelets also provide a novel approach to approximate powers of transition matrices. Policy evaluation is usually the expensive step in policy iteration, requiring O(S^3) time to directly solve the Bellman equation (where S is the number of states for discrete state spaces, or the sample size in continuous spaces). Diffusion wavelets compactly represent powers of transition matrices, yielding a direct policy-evaluation method requiring only O(S) complexity in many cases, which is remarkable because the Green's function (I - P^\pi)^{-1} is usually a full matrix requiring quadratic space just to store each entry. A range of illustrative examples and experiments, from simple discrete MDPs to classic continuous benchmark tasks like the inverted pendulum and the mountain car, are used to evaluate the proposed framework.
Value Function Approximation with Diffusion Wavelets and Laplacian Eigenfunctions, with S. Mahadevan. Tech Report Univ. Mass. Amherst, CMPSCI 05-38, May 2005.
Analysis of functions on manifolds and graphs is essential in many tasks, such as learning, classification, clustering, and reinforcement learning. The construction of efficient decompositions of functions has until now been quite problematic, and restricted to a few choices, such as the eigenfunctions of the Laplacian on a manifold or graph, which have found interesting applications. In this paper we propose a novel paradigm for analysis on manifolds and graphs, based on the recently constructed diffusion wavelets. They allow a coherent and effective multiscale analysis of the space and of functions on the space, and are a promising new tool in classification and learning tasks, in reinforcement learning, among others. In this paper we overview the main motivations behind their introduction, their properties, and sketch a series of applications, among which multiscale document corpora analysis, structural nonlinear denoising of data sets, and the tasks of value function approximation and policy evaluation in reinforcement learning, analyzed in two companion papers. The final form of this paper has appeared in Proc. NIPS 2005, with Sridhar Mahadevan; Tech Report Univ. Mass. Amherst, CMPSCI 05-39, May 2005.
Fast Direct Policy Evaluation using Multiscale Analysis of Markov Diffusion Processes, with Sridhar Mahadevan, accepted to ICML 2006.
Policy evaluation is a critical step in the approximate solution of large Markov decision processes (MDPs), typically requiring $O(S^3)$ operations to directly solve the Bellman system of $S$ linear equations (where $S$ is the state-space size). In this paper we apply a recently introduced multiscale framework for analysis on graphs to design a faster algorithm for policy evaluation. For a fixed policy $\pi$, this framework efficiently constructs a multiscale decomposition of the random walk $P^\pi$ associated with the policy $\pi$. This enables efficiently computing medium- and long-term state distributions, approximation of value functions, and the {\it direct} computation of the potential operator $(I-\gamma P^\pi)^{-1}$ needed to solve Bellman's equation. We show that even a preliminary non-optimized version of the solver competes with highly optimized iterative techniques, requiring in many cases a complexity of $O(S \log^2 S)$.
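For concreteness, the direct Bellman solve the abstract refers to looks as follows (a generic textbook sketch of the O(S^3) baseline, not the multiscale algorithm of the paper):

    import numpy as np

    def policy_evaluation_direct(P_pi, R_pi, gamma=0.95):
        """Solve the Bellman equation (I - gamma * P_pi) V = R_pi directly.
        The dense solve costs O(S^3); the multiscale approach above targets
        roughly O(S log^2 S) for structured problems."""
        S = P_pi.shape[0]
        return np.linalg.solve(np.eye(S) - gamma * P_pi, R_pi)

    # Tiny example: random walk on a chain of 50 states, reward at the last state.
    S = 50
    P = np.zeros((S, S))
    for s in range(S):
        P[s, max(s - 1, 0)] += 0.5
        P[s, min(s + 1, S - 1)] += 0.5
    R = np.zeros(S); R[-1] = 1.0
    V = policy_evaluation_direct(P, R)
    print(V[:5].round(3), V[-5:].round(3))   # value increases toward the goal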
 Learning Representation and Control in Continuous Markov Decision Processes, with Sridhar Mahadevan, Kimberly Ferguson, Sarah Osentoski, accepted to AAAI 2006. This paper presents a novel framework for simultaneously learning representation and control in continuous Markov decision processes. Our approach builds on the framework of proto-value functions, in which the underlying representation or basis functions are automatically derived from a spectral analysis of the state-space manifold. The proto-value functions correspond to the eigenfunctions of the graph Laplacian. We describe an approach to extend the eigenfunctions to novel states using the Nyström extension. A least-squares policy iteration method is used to learn the control policy, where the underlying subspace for approximating the value function is spanned by the learned proto-value functions. A detailed set of experiments is presented using classic benchmark tasks, including the inverted pendulum and the mountain car, showing the sensitivity in performance to various parameters, and including comparisons with a parametric radial basis function method.
Links
Here are some links to useful pages:
A Markov Decision Process Toolbox for Matlab, with a quick intro to MDPs and various useful links to review papers and books
Another Markov Decision Process Toolbox for Matlab, by Iadine Chadès, MarieJosée Cros, Frédérick Garcia, Régis Sabbadin, at Inria.
Reinforcement Learning FAQ and Reinforcement Learning Software and Stuff, by Rich Sutton.
LeastSquares Policy Iteration code by R. Parr and M.G. Lagoudakis, Duke University.
An Introduction to Markov Decision Processes, by B. Givan and R. Parr.
Online book on Markov Decision Processes, by Rich Sutton.
Links
2006 NSF Workshop and Outreach Tutorials on Approximate Dynamic Programming. Check out tutorials and workshop presentations.
People
Links to some people in the field (this is a very partial list and under continuous construction!):
Ronald Parr and Michael G. Lagoudakis, CS dept. at Duke University.
Sridhar Mahadevan at the Computer Science Dept. at University of Mass. at Amherst .
Andrew W. Moore, previously at Computer Science Department of Carnegie Mellon University, now at Google
Silvia Ferrari at the Laboratory for Intelligent Systems and Controls at Duke University.
Overview
This is an ongoing collaboration with Rachael Brady and Eric Monson, in the Scientific Visualization group at Duke.
Here is a list of posters and papers:
 Eric E Monson, Rachael Brady, Guangliang Chen, Mauro Maggioni, Exploration and Representation of Data with Geometric Wavelets, Poster and short paper at Visweek 2010
 G. Chen, M. Maggioni Multiscale Analysis of Plane Arrangements, CVPR, 2011. See also the corresponding poster.
See Eric's page on the FODAVA activities, which contains more materials and software.
Overview
With light sources of increasingly broad spectral range, spectral analysis of tissue sections has evolved from two-wavelength image-subtraction techniques to hyperspectral mapping. A variety of proprietary spectral splitting devices (prisms and mirrors, interferometers, variable interference filter-based monochromators, tuned liquid crystals), mounted on microscopes in combination with CCD cameras and computers, have been used to discriminate among cell types and endogenous and exogenous pigments.
Goals
We use a unique prototype tuned light source, a digital mirror array device (Plain Sight Systems) based on micro-opto-electro-mechanical systems, in combination with analytic algorithms developed in the Yale Program in Applied Mathematics, to evaluate the diagnostic efficiency of hyperspectral microscopic analysis of normal and neoplastic colon biopsies prepared as microarray tissue sections. We compare the results to our previous spectral analysis of colon tissues and to other spectral studies of tissues and cells.
Experimental Details
Platform: The prototype tuned-light digital mirror array device transilluminates H & E stained microarray tissue sections with any combination of light frequencies, range 440 nm – 700 nm, through a Nikon Biophot microscope. Hyperspectral tissue images, multiplexed with a CCD camera (Sensovation), are captured and analyzed mathematically with a PC.
Image Source: 147 (76 normal and 71 malignant) hyperspectral gray-scale 400X images are derived from 68 normal and 62 malignant colon biopsies, selected from about 200 normal and about 600 malignant H & E stained biopsies arrayed respectively on two different slides.
Cube: Each hyperspectral image is a 3D data cube (Figure 3) with spatial coordinates x = 491 pixels, y = 653 pixels, and spectral coordinate z = 128 pixels, a total of 41 million transmitted spectra.
Data
Figure 2: Microarray biopsies
Figure 3: Hyperspectral data cube (image from DataFusion Corp)
Figure: Average nucleus spectrum with standard deviation bars
Figure: A spectral slice of a normal gland
Figure: A spectral slice of a cancerous gland
TISSUE CLASSIFICATION
Figures: Tissue classification algorithm on two samples
ALGORITHM FOR NORMAL/ABNORMAL DISCRIMINATION
Figures: Normal and Abnormal examples
Table 1: Performance of the classifier on nuclei patches, cross-validated.

  Patches/Nuclei (8688)          | True Positive (4860) | True Negative (3828)
  Predicted Positive (malignant) | 94.0% (4568)         | 7.3% (280)
  Predicted Negative (normal)    | 6.0% (292)           | 92.7% (3548)
CLASSIFICATION OF A WHOLE SLIDE
The classification of a whole slide is obtained by selecting 40 random nuclei patches from the slide and averaging the corresponding classifications. The classification of the slides makes no mistakes, since the few errors of classification on the nuclei are averaged out over the whole slide.
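A schematic version of this slide-level rule (the patch scores and the 0.5 threshold are placeholders; the actual patch classifier is the one described above):

    import numpy as np

    def classify_slide(patch_scores, n_samples=40, seed=None):
        """Slide-level decision: sample n_samples nuclei-patch scores at random
        (here scores in [0,1], 1 = malignant) and average them; occasional
        patch-level mistakes are averaged out at the slide level."""
        rng = np.random.default_rng(seed)
        chosen = rng.choice(patch_scores, size=n_samples, replace=False)
        return "malignant" if chosen.mean() > 0.5 else "normal"

    # Toy example: a malignant slide where 6% of patches are misclassified.
    rng = np.random.default_rng(0)
    scores = (rng.random(500) > 0.06).astype(float)   # mostly correct (=1)
    print(classify_slide(scores, seed=0))              # "malignant"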
A SPATIAL-SPECTRAL CLASSIFIER
Papers
 Hyperspectral pathology:
 Hyperspectral microscopic discrimination between normal and cancerous colon biopsies. Franco Woolfe, Mauro Maggioni, Gustave Davis, Frederick Warner, Ronald Coifman, and Steven Zucker, submitted, 2006.
 Algorithms from Signal and Data Processing Applied to Hyperspectral Analysis: Discriminating Normal and Malignant Microarray Colon Tissue Sections Using a Novel Digital Mirror Device System. M. Maggioni, G.L. Davis, F. J. Warner, F. B. Geshwind, A.C. Coppi, R.A. DeVerse, R.R. Coifman, Tech Report, Dept. Comp. Science, Yale University, November 2004.
 Hyperspectral Analysis of normal and malignant colon tissue microarray sections using a novel DMD system, G. L. Davis, M. Maggioni, F. J. Warner, F. B. Geshwind, A. C. Coppi, R. A. DeVerse, R. Coifman, poster session at Optical Imaging NIH workshop, Sep. 2004.
 HyperSpectral analysis of normal and malignant microarray tissue sections using a novel micro-opto-electro-mechanical system, G.L. Davis, M. Maggioni, F. J. Warner, F. B. Geshwind, A.C. Coppi, R.A. DeVerse, R.R. Coifman.
 Spatial-Spectral Analysis of Colon Carcinoma, G.L. Davis, R.R. Coifman, R. Levinson.
 Target and anomaly detection
 With connections to compressed sensing and estimation of probability measures: A Fast Multiscale Framework for Data in High Dimensions: Measure Estimation, Anomaly Detection, and Compressive Measurements, G. Chen and M. Iwen and S. Chin and M. Maggioni, Visual Communications and Image Processing (VCIP), Nov. 2012 IEEE (submitted April 2012, published version available here).
 Math/Applied math
 Markov Decision Processes
 Hyperspectral Imaging and Pathology
 Finance
 Miscellanea
 Talks
 Tutorials and Lecture Series
 PathBased Spectral Clustering: Guarantees, Robustness to Outliers, and Fast Algorithms, with A. Little, J. Murphy, 2017.
 Nonlinear Unsupervised Clustering of Hyperspectral Images with Applications to Anomaly Detection and Active Learning, with J. Murphy, 2017; software package available.
 Adaptive Geometric Multiscale Approximations for Intrinsically Low-dimensional Data, with W. Liao, 2017.
 Multiscale Strategies for Computing Optimal Transport, with S. Gerber, appeared in J.M.L.R., 2017. An updated version is available.
 Inferring Interaction Rules From Observations of Evolutive Systems I: The Variational Approach, with M. Bongini, M. Fornasier, M. Hansen, Mathematical Models and Methods in Applied Sciences, 27(05), 909-951, 2017
 Learning adaptive multiscale approximations to data and functions near low-dimensional sets, with W. Liao and S. Vigogna, in IEEE Information Theory Workshop (ITW), 226-230, 2016.
 High Dimensional Data Modeling Techniques for Detection of Chemical Plumes and Anomalies in Hyperspectral Images and Movies, with Wang, Y., Chen, G., IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, DOI: 10.1109/JSTARS.2016.2539968, 2016. (published version here) , software package.
 Dictionary Learning and NonAsymptotic Bounds for Geometric MultiResolution Analysis, with S. Minsker and N. Strawn, in JMLR, Jan. 2016, 4393; short version in Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology", 2014, as well as here
 Object recognition in art drawings: Transfer of a neural network, Yin, R., Monson, E., Honig, E., Daubechies, I., Maggioni, M., IEEE ICASSP 2016.
 ATLAS: A geometric approach to learning high-dimensional stochastic systems near manifolds, with M. Crosskey, M. Maggioni, submitted, 2014, to appear in Journ. Mult. Model. Simul., 2016
 Genomic Characterization of Large Heterochromatic Gaps in the Human Genome Assembly, with N. Altemose, K. H. Miga, M. Maggioni, H. F. Willard, PLOS Computational Biology, 2014
 Multiscale dictionaries, transforms, and learning in highdimensions, with S. Gerber, M. Maggioni, Proc. SPIE 8858, Wavelets and Sparsity XV, 88581T, 2013.
 Branched-chain amino acids alter neurobehavioral function in rats, with Coppola A., Wenner B.R., Ilkayeva O., Stevens R.D., Maggioni M., Slotkin T.A., Levin E.D., Newgard C.B., Am J Physiol Endocrinol Metab. 2013 Feb 15;304(4):E405-13.
 Geometric Multiscale Reduction for Autonomous and Controlled Nonlinear Systems, with J. Bouvrie and M. Maggioni, IEEE Conference on Decision and Control (CDC), 2012.
 Multiscale Geometric Methods for Data Sets I: Multiscale SVD, Noise and Curvature, with A. V. Little, M. Maggioni, L. Rosasco. This paper summarizes the work in A.V. Little's thesis (May 2011) on multiscale singular values for noisy point clouds. It extends the analysis of the constructions and results in our previous work Multiscale Estimation of Intrinsic Dimensionality of Data Sets (A.V. Little, Y.M. Jung and M. Maggioni, Proc. AAAI, 2009; see also a presentation by A.V. Little here) and Estimation of intrinsic dimensionality of samples from noisy low-dimensional manifolds in high dimensions with multiscale SVD (A.V. Little, J. Lee, Y.M. Jung and M. Maggioni, Proc. SSP, 2009).
 Approximation of Points on LowDimensional Manifolds Via Random Linear Projections, M. Iwen, M. Maggioni. Appears here in Information and Inference, 2, 2013.
 A Fast Multiscale Framework for Data in High Dimensions: Measure Estimation, Anomaly Detection, and Compressive Measurements, Chen, G. and M. Iwen and S. Chin and M. Maggioni, Visual Communications and Image Processing (VCIP), Nov. 2012 IEEE (submitted April 2012, published version available here).
 What is...Data Mining?, M. Maggioni , A.M.S. Notices, April 2012
 Multiscale Analysis of Plane Arrangements, G. Chen, M. Maggioni, CVPR, 2011. See also the corresponding poster and code on G. Chen's webpage.
 Multiscale Geometric Methods for Data Sets II: Geometric Wavelets, W.K. Allard, G. Chen, M. Maggioni, Appl. Comp. Harm. Anal., Vol. 32(3), May 2012, 435-462. This is a detailed and improved construction of geometric wavelets for point clouds, originally introduced in the 2010 conference paper here.
 Determination of reaction coordinates via locally scaled diffusion map, M. A. Rohrdanz, W. Zheng, M. Maggioni, C. Clementi, J. Chem. Phys., 134 2011: 124116
 Polymer reversal rate calculated via locally scaled diffusion map, W. Zheng, M. A. Rohrdanz, M. Maggioni, C. Clementi, J. Chem. Phys., 134 2011: 144108
 Some Recent Advances in the Geometric Analysis of Point Clouds in High Dimensions, G. Chen, A.V. Little, M. Maggioni and L. Rosasco, in Wavelets and Multiscale Analysis: Theory and Applications, March 2011, Springer
 Exploration and Representation of Data with Geometric Wavelets, Eric E Monson, Rachael Brady, Guangliang Chen, Mauro Maggioni, Poster and short paper at Visweek 2010
 Justin Guinney, Phillip Febbo, Mauro Maggioni, Sayan Mukherjee, Multiscale factor models for molecular networks, JSM Proc. (2010), pp. 4887-4901, A.S.A.
 Learning gradients: predictive models that infer geometry and statistical dependence, Q. Wu, J. Guinney, M. Maggioni, S. Mukherjee, J.M.L.R., vol. 11 (August, 2010), pp. 2175-2198
 Multiscale Geometric Dictionaries for pointcloud data, G. Chen, M. Maggioni, Proc. SampTA 2011
 Multiscale Analysis of Time Series of Graphs, J. Lee, M. Maggioni, Proc. SampTA 2011
 Multiscale Geometric Methods for estimating intrinsic dimension, A.V. Little, M. Maggioni, L. Rosasco, Proc. SampTA 2011
 Universal Local Parametrizations via Heat Kernels and Eigenfunctions of the Laplacian, P.W. Jones, M. Maggioni, R. Schul, Ann. Acad. Scient. Fen., Vol. 35:144, 2010. A short version has appeared in P.N.A.S here. Localized eigenfunctions appear incidentally in this paper. See Terry Tao's discussion on scarring for the Bunimovich stadium and the pictures by Denis Grebenkov here. More pictures and talks about Laplacian eigenfunctions can be found on Naoki Saito's page on the ICIAM minisymposium on Laplacian eigenfunctions and their applications. See also D. Stone's page, and in particular his paper on Modes of wavechaotic dielectric resonators.
 Multiscale geometric wavelets for the analysis of point clouds, G. Chen, M. Maggioni, Proc. CISS
 Geometry, Analysis and Signal Processing on Digital Data, Emergent Structures, and Knowledge Building, R.R. Coifman, M. Maggioni, SIAM News, Dec. 2008.
 A general framework for adaptive regularization based on diffusion processes on graphs, A.D. Szlam, M. Maggioni, R.R. Coifman. J.M.L.R., 9 (2008) 1711-1739 (published, shortened version here).
 Multiscale Analysis of Data Sets using Diffusion Wavelets, M. Maggioni and R.R. Coifman, Proc. Data Mining for BIOmedical Informatics.
 Diffusion Polynomial Frames on measure metric spaces, M. Maggioni, H.N. Mhaskar. Appl. Comp. Harm. Anal., 24(3): 329-353, 2007.
 TensorCUR Decompositions for TensorBased Data, M Mahoney, M Maggioni, and P Drineas. to appear in SIAM Jour. on Mat. Anal. and Appl., 2007.
 Geometries of sensor outputs, inference and information processing, R.R. Coifman, S Lafon, M Maggioni, Y Keller, AD Szlam, F J Warner, S W Zucker. Proc. SPIE, Vol. 6232, 623204, May 2006.
 Geometric diffusions for the analysis of data from sensor networks, RR Coifman, M Maggioni, SW Zucker, and IG Kevrekidis. Curr. Opin. Neurobiol., 15(5):576-84, October 2005.
 Biorthogonal diffusion wavelets for multiscale representations on manifolds and graphs, M Maggioni, JC Bremer Jr, RR Coifman, and AD Szlam. Proc. SPIE Wavelet XI, Vol 5914, 59141M, Sep 2005.
 Diffusion-driven multiscale analysis on manifolds and graphs: top-down and bottom-up constructions, M Maggioni, AD Szlam, RR Coifman, and JC Bremer Jr. Proc. SPIE Wavelet XI, Vol 5914, 59141D, Sep 2005.
 Diffusion wavelet packets, JC Bremer, RR Coifman, M Maggioni, and AD Szlam. Appl. Comp. Harm. Anal., 15(1): 95-112.
 Diffusion wavelets for multiscale analysis on graphs and manifolds, RR Coifman and M Maggioni. Proc. Conf. Interactions Splines and Wavelets, Oct. 2005.
 Diffusion wavelets, RR Coifman and M Maggioni. Appl. Comp. Harm. Anal., 21(1):53-94, July 2006.
 Geometric diffusions as a tool for harmonic analysis and structure definition of data. Part I: Diffusion maps, RR Coifman, S Lafon, A Lee, M Maggioni, B Nadler, FJ Warner, and SW Zucker. Proc. of Nat. Acad. Sci., 102:7426-7431, May 2005.
 Geometric diffusions as a tool for harmonic analysis and structure definition of data. Part II: Multiscale methods, RR Coifman, S Lafon, A Lee, M Maggioni, B Nadler, FJ Warner, and SW Zucker. Proc. of Nat. Acad. Sci., 102:7432-7438, May 2005.
 Multiresolution analysis associated to diffusion semigroups: construction and fast algorithms, RR Coifman and M Maggioni. Tech. Rep. YALE/DCS/TR-1292, Dept. Comp. Sci., Yale University, June 2004. This has been superseded by Diffusion Wavelets.
 Wavelet frames on groups and hypergroups via discretization of Calderón formulas, M Maggioni. Monats. Math., (143):299-331, 2004.
 On the box problem, NH Katz, E Krop, and M Maggioni, Math. Research Letters, 4:515-519, 2002.
 M-band Burt-Adelson wavelets, M Maggioni, Appl. Comput. Harm. Anal., 3:286-311, 2000.
 Characterization of tight wavelet frames with arbitrary matrix dilations and tightness preserving oversampling, with CK Chui, W Czaja, M Maggioni, and GL Weiss, J. Four. Anal. Appl., 8(2):173-200, 2002.
 Critical exponent of short even filters and biorthogonal Burt-Adelson wavelets, with M Maggioni, Monats. Math., 131(1):49-70, 2000.
 Multiscale Markov Decision Problems: Compression, Solution, and Transfer Learning, with J. Bouvrie, M. Maggioni, 2012, submitted.
 Efficient Solution of Markov Decision Problems with Multiscale Representations, J. Bouvrie and M. Maggioni, Proc. 50th Annual Allerton Conference on Communication, Control, and Computing, 2012.
 Proto-value Functions: A Laplacian Framework for Learning Representation and Control in Markov Decision Processes, with S Mahadevan, M Maggioni. J.M.L.R., 2007.
 Multiscale Diffusion Bases for Policy Iteration in Markov Decision Processes, M Maggioni, S Mahadevan. Submitted, 2006.
 Simultaneous Learning of Representation and Control In Continuous Domains, with S Mahadevan, Kimberly Ferguson, Sarah Osentoski and M Maggioni, AAAI 2006.
 A Multiscale Framework For Markov Decision Processes using Diffusion Wavelets, with M Maggioni and S Mahadevan, Univ. of Mass., Dept. of Computer Science TR-2006-36, and ICML 2005.
 Value function approximation with diffusion wavelets and Laplacian eigenfunctions, with S Mahadevan and M Maggioni, Univ. of Massachusetts, Dept. of Computer Science TR-2005-38, and NIPS, 2005.
 Fast Direct Policy Evaluation Using Multiscale Markov Diffusion Processes, with M Maggioni and S Mahadevan, Univ. of Mass., Dept. of Computer Science, TR-2005-39, and NIPS, 2005.
 High Dimensional Data Modeling Techniques for Detection of Chemical Plumes and Anomalies in Hyperspectral Images and Movies, with Wang, Y., Chen, G., IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, DOI: 10.1109/JSTARS.2016.2539968, 2016.
 A Fast Multiscale Framework for Data in High Dimensions: Measure Estimation, Anomaly Detection, and Compressive Measurements, G. Chen and M. Iwen and S. Chin and M. Maggioni, Visual Communications and Image Processing (VCIP), Nov. 2012 IEEE (submitted April 2012, published version available here).
 Hyperspectral Microscopic Analysis of Normal, Adenoma and Carcinoma Microarray Tissue Sections, with M Maggioni, FJ Warner, GL Davis, RR Coifman, FB Geshwind, AC Coppi, and RA DeVerse, SPIE Optical Biopsy, 2006, Vol. 6091, 60910I, available here.
 QEEG-based classification with wavelet packets and microstate features for triage applications in the ER, E Causevic, RR Coifman, R Isenhart, A Jacquin, ER John, M Maggioni, LS Prichep, FJ Warner, ICASSP, 2006. Here is a preprint of the poster.
 Algorithms from signal and data processing applied to hyperspectral analysis: Application to discriminating normal and malignant microarray colon tissue sections, M Maggioni, FJ Warner, GL Davis, RR Coifman, FB Geshwind, AC Coppi, and RA DeVerse, submitted, 2004.
 Algorithms from signal and data processing applied to hyperspectral analysis: Application to discriminating normal and malignant microarray colon tissue sections, M Maggioni, FJ Warner, GL Davis, RR Coifman, FB Geshwind, AC Coppi, and RA DeVerse, Tech. Rep. 1311, Yale University, Dept. Comp. Sci., Feb 2004.
 Hyperspectral analysis of normal and malignant colon tissue microarray sections using a novel DMD system, GL Davis, M Maggioni, FJ Warner, FB Geshwind, AC Coppi, RA DeVerse, and RR Coifman, Poster, Optical Imaging NIH workshop, Sep 2004.
 Beauty is in the bid of the beholder: An empirical basis for style, with Goetzmann, WN, Jones, PW, Maggioni, M, and Walden, J., Research in Economics 70.3 (September 2016): 388-402
 Data Representation and Exploration with Geometric Wavelets, with E. Monson, G. Chen, R. Brady, IEEE Symposium on Visual Analytics Science and Technology (VAST), 2010. Corresponding poster is available here.
 Auditory display of hyperspectral colon tissue images using vocal synthesis models, with RJ Cassidy, J Berger, M Maggioni, and RR Coifman, Proc. 2004 Intern. Conf. Auditory Display, 2004.
 Multiscale approximation with hierarchical radial basis functions networks, with S Ferrari, M Maggioni, and NA Borghese, IEEE Trans. on Neural Networks, 15(1):178-188, January 2004.
 Lectures at Summer School at Peking University, July 2017.
 PCMI Lectures, Summer 2016: Lecture 1, Lecture 2, Problems/discussion points
 Google Scholar
 Papers on the ArXiv
 Papers on MathsciNet
 Tutorials on diffusion geometry and multiscale analysis on graphs at the MRA Internet Program at IPAM: Part I and Part II.
 Diffusion Geometries, Diffusion Wavelets and Harmonic Analysis of large data sets, IPAM, Multiscale Geometric Analysis Program, Fall 2004.
 Diffusion Geometries, global and multiscale, IPAM, 2005.
Introduction to Harmonic Analysis and its Applications  Spring 2018
Office hours: Krieger 405, on Wed. after 4:15pm, by appointment
Section: Thu 6:00-6:50pm, Krieger 205
Final exam date: decided by registrar
We will use materials from several lecture notes, books, and papers, including:
 Fourier Analysis, E. M. Stein and R. Shakarchi
 A wavelet tour of signal processing, S. Mallat
 Real Analysis, G.B. Folland
 Ten Lectures on Wavelets, I. Daubechies
 Harmonic Analysis for Engineers and Applied Scientists, G.S. Chirikjian and A.B. Kyatkin
 Mathematics of Medical Imaging, C. Epstein
Homework sets:
 Homework 1, due 2/14
 Homework 2, due 2/21
 Homework 3, due 3/7
HighDimensional Approximation, Probability, and Statistical Learning  Fall 2017
Course number: AS.110.675, EN.553.738. Classroom: Hodson 311, Mon-Wed 1:30pm-2:45pm. IMPORTANT: class on 9/20 will be in Hackerman 320.
Office hours: Krieger 405, Tue 4:15pm (instructor); Wed 4-6pm in Whitehead 212 with Zachary Lubberts, TA.
Final exam: Thu 12/14, 2-5pm, usual classroom (as per this page)
Synopsis. We will introduce fundamental techniques for the approximation of functions in low and high dimensions, using Fourier, wavelet and other multiscale techniques. Both linear and nonlinear approximation techniques will be introduced. We will discuss some of the applications of these techniques to signal processing and imaging. We will then connect these techniques with probability, and apply them to fundamental problems in statistical and machine learning theory. Tools such as concentration inequalities and basic analysis of random matrices will be introduced and applied. We will also discuss other high-dimensional estimation and inverse problems, and their geometric interpretation. Finally, we will discuss dimension reduction, random projections, manifold learning, optimal transportation, and estimation problems in the space of probability measures via optimal transport.
Prerequisites. Linear algebra will be used throughout the course, as will multivariable calculus and probability (discrete and continuous random variables). Basic functional analysis (e.g. Lebesgue spaces, linear operators), or at the very least a firm grasp of basic properties of vector spaces and linear operators. Basic experience in programming in C or MATLAB or R or Octave, in order to perform basic numerical experiments.
Grading. Grade to be based on assignments (roughly biweekly), and a final presentation and report. The latter will be about a paper or topic selected in agreement with the instructor, and depending on the paper/topic it may be worked on by a team rather than a single student.
Students from all areas of science, mathematics and applied mathematics, engineering, computer science, statistics, quantitative biology, and economics that need advanced-level skills in solving problems involving probability and stochastic processes in high dimensions, signal processing, or statistical modeling, often arising from the analysis or modeling of high-dimensional data, are encouraged to enroll. The Instructor is available during his office hours, in Krieger 405, at a time to be mutually agreed upon at the beginning of the semester. Our Teaching Assistant, Mr. Zachary Lubberts, is available during his office hours, at a time to be mutually agreed upon at the beginning of the semester.
 Roman Vershynin's High Dimensional Probability, An Introduction with Applications in Data Science, draft textbook
 Philippe Rigollet's High Dimensional Statistics lecture notes
 Binev et al.'s paper Universal Algorithms for Learning Theory Part I: Piecewise Constant Functions
 Roman Vershynin's Estimation in high dimensions: a geometric perspective (much of this material is now part of the textbook above)
 Alexander Barvinok's Measure Concentration lecture notes
 Avrim Blum, John Hopcroft, and Ravindran Kannan's Foundations of Data Science textbook.
 Ronald A. DeVore's Nonlinear Approximation
 Afonso Bandeira's Ten Lectures and Forty-Two Open Problems in the Mathematics of Data Science
 Introduction to nonparametric estimation, A. Tsybakov
 A distribution-free theory of nonparametric regression, L. Gyorfi, M. Kohler, A. Krzyzak, H. Walk
 Universal Algorithms for Learning Theory Part I: Piecewise Constant Functions, P. Binev, A. Cohen, W. Dahmen, R. DeVore, V. Temlyakov
 Fourier Analysis, E. M. Stein and R. Shakarchi
 A wavelet tour of signal processing, S. Mallat
 Real Analysis, G.B. Folland
 Ten Lectures on Wavelets, I. Daubechies
 Harmonic Analysis for Engineers and Applied Scientists, G.S. Chirikjian and A.B. Kyatkin
 Mathematics of Medical Imaging, C. Epstein
Homework sets:
 Homework 1. Due on Fri. Sep. 15th.
 Homework 2. Due on Fri. Sep. 29th.
Possible Topics for Final Project: The final project consists of a presentation by the team (1-4 members) that worked on the project, with each team member presenting for 6 minutes, together with a short, concise, clearly-written report (6 single-column pages would be a typical length) sent by email, with High Dimensional Approximation final report in the subject line. The report should be authored by all team members, each stating their role and contribution to the project and report. Please send me the presentation in pdf or keynote format in advance so we can proceed quickly through them.
Topics can range from Fourier and wavelet analysis to random matrices, to compressed sensing, for example:
 Random Matrices: the matrix Bernstein inequality, asymptotic distribution of eigenvalues (Wigner's semicircle law)
 Compressed Sensing: original Fourier, original polytopes, multiscale for imaging, for CT
 Chaining: parts of the corresponding chapter in R. Vershynin's lecture notes on High Dimensional Probability (see reference above)
 Dvoretzky-Milman's theorem: section 11.3.2 in R. Vershynin's lecture notes on High Dimensional Probability (see reference above)
 Spectral Clustering: any or all of the topics in the corresponding chapter of the Foundations of Data Science by A. Blum et al. (see reference above)
 Randomized SVD and other linear algebra algorithms, for example following this review paper
 Compressed Sensing from nonlinear observations, for example section 13 in R. Vershynin's Estimation in High Dimensions reference above
 Windowed Fourier and its Application, for example following S. Mallat's book on Wavelets and Signal Processing, or Daubechies' book Ten Lectures on Wavelets
Introduction to Statistical Learning, Data Analysis and Signal Processing  Spring 2017
Course number: AS.110.446, EN.550.416. A synopsis is available here. A syllabus is available here.
Office hours: Krieger 405, Wednesday 4:15-5:15pm (instructor); Whitehead 212, Thursdays 4:30-6:30pm (Zachary Lubberts, TA).
Midterm exam: Wednesday 3/15. Extra office hours on Tuesday 2-4pm (Shangsi Wang) and 4pm-5pm (Zachary Lubberts).
This wiki contains materials for the lectures, links to references, and data sets.
If you are unable to connect to the wiki, it is most likely because of one of the recurrent JHU network outages in my office, which makes my servers unavailable. I will transition the materials to my home server when I have time. You may try this alternative link (JHU local network only).
Synopsis of course content. Introduction to high dimensional data sets: key problems in statistical and machine learning. Geometric aspects. Principal component analysis, linear dimension reduction, random projections. Concentration phenomena: examples and basic inequalities. Metric spaces and embeddings thereof. Kernel methods. Nonlinear dimension reduction, manifold models. Regression. Vector spaces of functions, linear operators, projections. Orthonormal bases; Fourier and wavelet bases, and their use in signal processing and time series analysis. Basic approximation theory. Linear models, least squares. Bias and variance tradeoffs, regularization. Sparsity and compressed sensing. Multiscale methods. Graphs and networks. Random walks on graphs, diffusions, page rank. Block models. Spectral clustering, classification, semisupervised learning. Algorithmic and computational aspects of the above will be consistently in focus, as will be computational experiments on synthetic and real data.
Prerequisites. Linear algebra will be used throughout the course, as will multivariable calculus and basic probability (discrete random variables). Basic experience in programming in C or MATLAB or R or Octave. Recommended: more than basic programming experience in Matlab or R; some more advanced probability (e.g. continuous random variables); some signal processing (e.g. Fourier transform, discrete and continuous).
Grading. Grade to be based on weekly assignments, midterm and final exams, and a final team project. The final team project includes a report and the presentation of a poster on the topic of the project (typically involving studying a small number of papers and summarizing them and/or working on a data set). Weekly problem sets will focus on computational projects and theory.
Students from all areas of science, mathematics and applied mathematics, engineering, computer science, statistics, quantitative biology, and economics that need advanced-level skills in solving problems related to the analysis of data, signal processing, or statistical modeling are encouraged to enroll. The Instructor is available during his office hours, in Krieger 405, on Tuesdays from 2pm to 3pm. Our Teaching Assistant, Mr. Zachary Lubberts, is available during his office hours, in Whitehead 212 on Thursdays from 4:30-6:30pm. The Math help room, in Krieger 213 I believe, is staffed daily, roughly from 9am-9pm; I found a schedule for last semester at http://www.math.jhu.edu/helproomschedule.pdf. You may ask questions there about any math topic related to the course.

Homework sets:
 Homework 1. Due on Fri. Feb. 3rd.
 Homework 2. Due on Mon. Feb. 13th.
 Homework 3. Due on Mon. Feb. 20th.
 Homework 4. Due on Mon. Feb. 27th.
 Homework 5. Due on Mon. Mar. 6th.
 Homework 6. Due on Mon. Mar. 13th.
 Homework 7. Due on Fri. Apr. 7th.
 Possible topics and questions for final exam.
Math 431  Advanced Calculus, I  Spring 2016
Office hours: TBA, Gross Hall
This course will develop a rigorous theory of elementary mathematical analysis including differentiation, integration, and convergence of sequences and series. Students will learn how to write mathematical proofs, how to construct counterexamples, and how to think clearly and logically. These topics are part of the foundation of all of mathematical analysis and applied mathematics, geometry, ordinary and partial differential equations, probability, and stochastic analysis.
Textbook: Fundamental Ideas of Analysis, by Michael Reed. The course will cover most, but not all, of the material in Chapters 1-6.
Homework sets:
Math 561  Scientific Computing, I  Fall 2015
Office hours: Mondays at 2:45 in Gross Hall 318.
The first part of the course will cover basic numerical linear algebra, in particular matrix factorizations, solution of linear systems and eigenproblems. The suggested textbook is Trefethen and Bau's Numerical Linear Algebra. The textbook by Heath, Scientific Computing, may be a useful reference for exercises, of which it contains many. The second part of the course will cover basic nonlinear optimization (gradient descent, Newton's method, stochastic gradient descent), and basic notions in linear programming. The third part of the course will cover the basics of Monte Carlo algorithms. A short syllabus is available here, and an extended one is here. One midterm exam. The final exam will include a take-home portion (algorithms, coding) and an in-class portion. Homework: weekly homework.
Homework:
The code we discussed in our last class on 11/23, which includes solving a discretized linear PDE in various ways, as well as computing eigenvalues/eigenvectors, is available here.
Useful links:
Sample code I used in the first class to produce some examples.
John Trangenstein's home page contains a link to his online book on Scientific Computing, as well as several useful links to programming guides for Fortran, C, C++ and Lapack on his page for Math 225. William Allard's home page also contains useful material, such as notes and links to online guides and materials.
Fortran tutorial: here and here.
Matlab tutorial: from Mathworks here, from the University of Florida here. Many more are available online, just use your favourite search engine to look for "matlab tutorial".
Cleve Moler book (selected chapters): Numerical computing with MATLAB and Experiments with MATLAB.

Math 690  Topics in statistical learning theory  Spring 2015
Class: MW 10:05-11:20, Location: Physics 119
Math 561  Scientific Computing, I  Fall 2014
Class: MW 1:25-2:40, Location: Gross 304B
Office hours: Tue 12:00-1:00pm, Location: Gross 309
Here is the synopsis and its extended version in the syllabus.
Review on 12/10 at noon in 304B
Take home exam: I will email you a link to the exam on the night of Friday Dec. 5th. You pick a 6-hour period between then and the final in-class exam; at the beginning of the period of your choice you download the exam, and you email me an electronic copy by the end of the 6-hour period. A (matching) paper copy is due at the end of the in-class exam. You can use the textbook (no other books), your notes from class, and your homework. You cannot copy code directly or reuse code from your homework. Needless to say, do not discuss the problems with anybody else.
The first part of the course will cover basic numerical linear algebra, in particular matrix factorizations, solution of linear systems and eigenproblems.
The suggested textbook is Trefethen and Bau's Numerical Linear Algebra. The textbook by Heath, Scientific Computing, may be a useful reference for exercises, of which it contains many.
Useful links:
John Trangenstein's home page contains a link to his online book on Scientific Computing, as well as several useful links to programming guides for Fortran, C, C++ and Lapack on his page for Math 225. William Allard's home page also contains useful material, such as notes and links to online guides and materials.
Fortran tutorial: here and here.
Matlab tutorial: from Mathworks here, from the University of Florida here. Many more are available online, just use your favourite search engine to look for "matlab tutorial".
Cleve Moler book (selected chapters): Numerical computing with MATLAB and Experiments with MATLAB.
Math 431  Advanced Calculus, I  Spring 2014
Office hours: Mondays, 1:30-2:30pm, Gross Hall, Office 318
This course will develop a rigorous theory of elementary mathematical analysis including differentiation, integration, and convergence of sequences and series. Students will learn how to write mathematical proofs, how to construct counterexamples, and how to think clearly and logically. These topics are part of the foundation of all of mathematical analysis and applied mathematics, geometry, ordinary and partial differential equations, probability, and stochastic analysis.
Textbook: Fundamental Ideas of Analysis, by Michael Reed. The course will cover most, but not all, of the material in Chapters 1-6.
Midterms: February 19th, and March 26th.
Homework:
 Homework 1. Office hours moved to 11:30am, instead of 1:30pm on Monday 13th. Office: 319 Gross Chem.
 Homework 2. Office hours at 1:30pm on Monday 13th, 318 Gross Chem. [Updated pdf on 1/16/14 at 11:30pm: it contained a misspelled word and I corrected office hours in the .pdf]
 Homework 3
 Homework 4
 Homework 5
 Homework 6
 Homework 7
 Homework 8
 Homework 9
Math 790  Random Matrices, Concentration Inequalities and Applications to Signal Processing and Machine Learning  Spring 2013
We discuss several related topics and techniques at the intersection between probability, approximation theory, high-dimensional geometry, and machine learning and statistics. Synopsis. Wiki page
Math 224  Scientific Computing  Fall 2011
Office hours: Tue 4-5pm, my office is Room 293 in the Physics Bldg.
Here is the synopsis.
The first part of the course will cover basic numerical linear algebra, in particular matrix factorizations, solution of linear systems and eigenproblems, and nonlinear equations in one dimension. If time permits, we shall discuss recent randomized algorithms in numerical linear algebra.
Useful links:
John Trangenstein's home page contains a link to his online book on Scientific Computing, as well as several useful links to programming guides for Fortran, C, C++ and Lapack on his page for Math 225. William Allard's home page also contains useful material, such as notes and links to online guides and materials.
Fortran tutorial: here and here.
Matlab tutorial: from Mathworks here, from the University of Florida here. Many more are available online, just use your favourite search engine to look for "matlab tutorial".
Cleve Moler book (selected chapters): Numerical computing with MATLAB and Experiments with MATLAB.
Math 288  Topics in Probability: Geometry, Functions and Learning in High Dimensions  Fall 2011
Here is a flier for the course.
We discuss several related topics and techniques at the intersection between probability, approximation theory, high-dimensional geometry, and machine learning and statistics. We build on basic tools in large deviation theory and concentration of measure and move to problems in nonasymptotic random matrix theory (RMT), such as estimating the spectral properties of certain classes of random matrices. We then use these tools to study metric properties of certain maps between linear spaces that are near-isometries, such as random projections. We then move to the setting of general metric spaces, and introduce multiscale approximation of metric spaces a la Bourgain, and also discuss tree approximations, and hint at the algorithmic applications of these ideas. Finally we move to the realm of function approximation/estimation/machine learning for functions defined on high-dimensional spaces. We discuss Reproducing Kernel Hilbert Spaces and learning with RKHSs, and we also discuss multiscale techniques for function approximation in high dimensions. We also discuss geometric methods, both graph-based (Laplacians, manifold learning) and multiscale-based. Finally, we discuss recent fast randomized algorithms for certain numerical linear algebra computations, which use the nonasymptotic RMT results discussed above.
Requirements: solid linear algebra and basic probability. Of help, but to be introduced in the course: metric spaces, function spaces, matrix factorizations.
A course wiki contains links to lecture notes, papers and other materials. May be edited by students in the class.
Math 139  Real Analysis  Fall 2010
Office hours: Mon, 1:30pm-3:30pm, or by appointment
Textbook: Fundamental Ideas of Analysis, by Michael Reed. The course will cover most, but not all, of the material in Chapters 1-6.
There will be a midterm exam, a final exam and weekly homework.
Evaluation: There will also be at least one lengthy assignment which challenges you to write carefully constructed proofs. Your final letter grade will be based on these components weighted as follows: long assignment(s) 10-15%, regular homework 20-25%, midterm exam 25%, final exam 40%. Homework is due at the beginning of class, stapled, written legibly, on one side of each page only, and must contain the reaffirmation of the Duke community standard; otherwise, it will be returned ungraded. The logic of a proof must be completely clear and all steps justified. The clarity and completeness of your arguments will count as much as their correctness. Some problems from the homework will reappear on exams. I will go over in detail the solution to any homework problem during office hours. You may use a computational aid for the homework but I do not recommend it. Calculators and computers will not be allowed on the quizzes and exams. The lowest homework score will be dropped. No late homework will be accepted. Duke policies apply with no exceptions to cases of incapacitating short-term illness, or for officially recognized religious holidays. You may, and are encouraged to, discuss issues raised by the class or the homework problems with your fellow students and both offer and receive advice. However, all submitted homework must be written up individually without consulting anyone else's written solution.
Math 338  Topics in Graph Theory, Random Matrices, and Applications  Spring 2010
A flier for the course, with summary of some of the topics.
Math 348  Applied Harmonic and Multiscale Analysis  Fall 2009
Office hours: by appointment.
Students have access to a wiki and blog with materials for the course and to discuss topics and problems
The overarching theme is applied multiscale harmonic analysis, in various of its forms. Very rough outline (probably overambitious):
 Basic Fourier analysis; Littlewood-Paley theory (a.k.a. how to do multiscale analysis with Fourier); square functions & Khintchine inequality; applications to signal processing (audio/video)
 Classical multiresolution analysis through wavelets; applications to Calderón-Zygmund integral operators and associated numerical algorithms; applications to signal processing (audio/video)
 Multiscale analysis of random walks on graphs; applications to analysis of high-dimensional data sets, regression and inference.
 A sketch of some techniques in multiscale geometric measure theory, in particular the geometric version of square functions + concentration phenomena for measures in high-dimensional spaces + results in random matrix theory. Applications to analysis of high-dimensional data sets and inference, as well as numerical linear algebra.
Math 225  Scientific Computing II  Spring 2009
Office hours: by appointment.
Here is the synopsis.
 Homework 1. Due Jan. 22.
 Homework 2. Due Jan. 29.
 Homework 3. Due Feb. 5th.
 Homework 4, and associated data sets. Due Feb. 17th.
 Homework 5. Due Feb. 24th.
 Homework 6. Due Mar. 19th.
 Homework 7. Due Mar. 26th. Here are the examples that I ran in class.
 No homework for Apr. 2nd.
 Homework 8. Due Apr. 7th.
 Homework 9. Due Apr. 21st.
Cleve Moler book (selected chapters): Numerical computing with MATLAB and Experiments with MATLAB.
Math 133  Introduction to Partial Differential Equations  Spring 2009
Office hours: Tuesday 3-4:30pm, or by appointment.
Here is the synopsis.
First test will be on February 17th (usual class time). (Solution)
Review session will be on April 23rd, 4:30pm-5:30pm, usual classroom. I will answer your questions.
Material will be added as the course proceeds. Matlab tutorials are available from Mathworks, UFL, cyclismo.org, Matlab resources and many others.
Math 378  Minicourse  Introduction to Spectral Graph Theory and Applications  Spring 2008
We will discuss the basics of spectral graph theory, which studies random walks on graphs and related objects, such as the Laplacian and its eigenfunctions, on a weighted graph. This can be thought of as a discrete analogue of spectral geometry, although the geometry of graphs and their discrete nature give rise to issues not generally considered in the continuous, smooth case of Riemannian manifolds. We will present some classical connections between properties of the random walks and the geometry of the graph. We will then discuss disparate applications: the solution of sparse linear systems by multiscale methods based on random walks; analysis of large data sets (images, web pages, etc...), in particular how to find systems of coordinates on them, performing dimensionality reduction, and performing multiscale analysis on them; tasks in learning, such as spectral clustering, classification and regression on data sets.
Materials:
 Code for the demo involving drawing a set of points, constructing an associated proximity graph, and computing and displaying eigenvalues/eigenfunctions of the Laplacian. You need to first download and install the general code for Diffusion Geometry, and then download and install this code for running the demo I ran in class, with some images already prepared. After installing the Diffusion Geometry package, please run DGPaths in order to set the Matlab paths correctly. The script for the demo is called GraphEigenFcnsEx_01.m, and it is fairly extensively commented. I will be happy to add your own examples here! The code works best with the Approximate Nearest Neighbor Searching Library by D. Mount and S. Arya. To install this code, simply untar it in a directory and run make. This should produce the file libANN.a in the lib subdirectory. This file is already included in the Diffusion Geometry package, in the MEX directory, compiled on a Unix machine at Duke Math. Copy this library libANN.a into the MEX directory under the directory where the Diffusion Geometry package is installed, and from the Matlab prompt run "mex ANNsearch libANN.a" and "mex ANNrsearch libANN.a". This will yield two .mexglx files, which are what Matlab will call. These two files are already included in the Diffusion Geometry package, after compilation on a Unix machine at Duke Math.
References:
 R. Diestel's book on Graph Theory is an excellent general reference. Available online here.
 D. Spielman's notes for his course on Spectral Graph Theory at Yale; several papers on specific applications, depending on the attendees' interests.
 D. Spielman's notes for his course on Graphs and Networks at Yale. Some overlap with the above, but also other references and materials.
 F. Chung's book "Spectral Graph Theory". She also wrote a book on "Complex Graphs and Networks", mostly on random graphs and their degree distribution properties, and also some spectral results for them. Visit her homepage for lots of interesting material on graphs, spectral graph theory and its applications. In particular see the gallery of graphs.
 S. Lafon's web page has some cool tutorials and interactive demos on diffusion geometry.
Math 224  Scientific Computing  Fall 2007
Office hours: Wed 4:30-5:30, Thu 1:10-2:10, or by appointment.
Here is the synopsis.
The first part of the course will cover basic numerical linear algebra, in particular matrix factorizations, solution of linear systems and eigenproblems. The second part of the course will cover nonlinear equations, numerical integration and differentiation, basic techniques for ODEs, and the Fast Fourier Transform.
Useful links:
John Trangenstein's home page contains a link to his online book on Scientific Computing, as well as several useful links to programming guides for Fortran, C, C++ and Lapack on his page for Math 225. William Allard's home page also contains useful material, such as notes and links to online guides and materials.
Fortran tutorial: here and here.
Matlab tutorial: from Mathworks here, from the University of Florida here. Many more are available online, just use your favourite search engine to look for "matlab tutorial".
Homework sets:
1 (solution), 2 (solution), 3 (solution), 4 (solution), 5 (solution), 6 (solution), 7, 8.
Partial solution to test 1.
Math 348  Harmonic Analysis and Applications  Curr Res in Analysis  Spring 2007
Please find the synopsis here.
I plan to develop lecture notes as the course proceeds. Last update: 1/10/07. The notes are still in a very preliminary state; they should be downloaded and used by students of the course only, and should not be circulated or reproduced except for purposes related to the course. When a more stable version becomes available, some of these restrictions will be removed. This link will be updated regularly. Right now they are in an extremely preliminary state, and at times they may not even be accessible through the link provided.
A list of topics for presentation suggested for the course (by instructor or students), and the students currently working on them is available here.
Presentations by students:
 Statistical Approach to Wavelet Shrinkage, Simon Lunagomez
 Semisupervised Learning on Graphs, Chungping Wang
 Markov Decision Processes, Rachel Thomas
 The Fast Multipole Method, Veronica Rozmiarek
 Multiscale Reconstruction of Hyperspectral Data , Kalyani Shivakumar and Cristina Fernandez.
 Stochastic Filtering, Zachary Harmany and William Lee.
 Geometric MultiResolution Analysis (GMRA) Code
 Diffusion Geometry Code
 Diffusion Wavelets Code
 MAPA
 Multiscale SVD code
 Compressed Sensing Inversion for GMRA
 Data Sets
 Class Demo
Diffusion Geometry Code
See the paper Geometric diffusions as a tool for harmonic analysis and structure definition of data. Part I: Diffusion maps, RR Coifman, S Lafon, A Lee, M Maggioni, B Nadler, FJ Warner, and SW Zucker. Proc. of Nat. Acad. Sci., 102:7426-7431, May 2005.
Matlab code for Diffusion Geometry.
Latest version of the code: Diffusion geometry [Updated on 7/22/17]. This version now includes covertree for fast nearest neighbor searches. It may not (yet) be compatible with all Linux versions.
Previous version of the code: Diffusion geometry [Updated on 11/12/14].
Instructions:
Unzip in a directory, preserving the subdirectory structure. Open Matlab, go in the installation directory and run Startup_DiffusionGeometry.
RunExamples contains examples for running the GraphDiffusion code, for constructing nearest neighbor graphs and computing eigenvectors of the Laplacian on such graphs.
The main script for constructing graphs and computing Laplacians and their eigenfunctions is called GraphDiffusion. Type help GraphDiffusion at the Matlab prompt to see the options, and run the example there (diffusion on a circle) to test installation.
Important note regarding examples and data sets:
Some of the examples rely on data sets, publicly available on the web, in the appropriate format: you may find them, and their source, below.
Important note regarding nearest neighbor:
The package supports either the nn_search and range_search functions of the TSToolbox or the ANNsearch functions of the Approximate Nearest Neighbor Searching Library by D. Mount and S. Arya. MEX files for some platforms are already included in the Diffusion Geometry package, in the 'NearestNeighbors' directory. If MEX files for your machine are not included, you should compile those files and make them available to Matlab by modifying its search paths as needed.
Please let me know if you encounter problems with the installation, or report successes under different systems. Thanks.
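For orientation before installing the package, here is a minimal stand-alone sketch, in plain Matlab, of the kind of computation that GraphDiffusion automates: build a Gaussian proximity graph on a point cloud and compute a few eigenfunctions of the normalized graph Laplacian. The variable names, the kernel, and the bandwidth choice below are illustrative only and are not the package's interface; type help GraphDiffusion for the actual options.

  % Minimal illustration (not the GraphDiffusion interface): proximity graph +
  % Laplacian eigenfunctions on a noisy circle, using only base Matlab.
  n  = 500;  t = 2*pi*rand(n,1);
  X  = [cos(t) sin(t)] + 0.01*randn(n,2);           % noisy circle in R^2
  D2 = sum(X.^2,2) + sum(X.^2,2)' - 2*(X*X');       % squared distances (implicit expansion, R2016b+)
  eps0 = median(D2(:));                             % crude kernel bandwidth (illustrative choice)
  W  = exp(-D2/eps0);                               % Gaussian affinity matrix
  d  = sum(W,2);
  S  = diag(1./sqrt(d)) * W * diag(1./sqrt(d));     % symmetrically normalized kernel
  S  = (S + S')/2;                                  % enforce symmetry against round-off
  [V,E] = eig(S);
  [~,idx] = sort(diag(E),'descend');                % top eigenvectors <-> low-frequency eigenfunctions
  phi = diag(1./sqrt(d)) * V(:,idx);                % eigenfunctions of the random-walk Laplacian
  scatter(X(:,1), X(:,2), 20, phi(:,2), 'filled'); colorbar
  title('Second Laplacian eigenfunction on a noisy circle')

On the circle, the second and third eigenfunctions recover (approximately) the sine and cosine of the angle, which is the kind of intrinsic coordinate system the diffusion geometry framework produces on more general data sets.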
Diffusion Wavelets Code
See the paper Diffusion wavelets, RR Coifman and M Maggioni. Appl. Comp. Harm. Anal., 21(1):53-94, July 2006.
Matlab code for Diffusion Wavelets. A new, much faster version is in the works.
This was originally a joint effort with James C. Bremer Jr. and Arthur D. Szlam.
Last version of the code: Diffusion wavelets [Updated on 7/13/11].
Instructions:
Unzip in a directory, preserving the subdirectory structure. Open Matlab, go in the installation directory and run Startup_DiffusionWavelets.
RunExamples contains examples for running the DWPTree code that constructs diffusion wavelet trees on graphs.
Important note regarding examples and data sets:
The code, especially the examples, rely on the Diffusion Geometry package above and on the data sets below. All notes/comments that apply to the Diffusion Geometry package apply here as well.
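To convey the idea behind the construction, here is a schematic, heavily simplified sketch in plain Matlab (it is not the DWPTree algorithm, and it omits the sparsification and localization steps that make the actual construction efficient): repeatedly orthonormalize the numerical range of the current power of a diffusion operator with a rank-revealing QR, and represent the squared operator on the compressed basis. The cycle graph, the precision, and the number of levels below are arbitrary choices for illustration.

  % Schematic multiscale compression of dyadic powers of a diffusion operator.
  n = 128;
  W = diag(ones(n-1,1),1) + diag(ones(n-1,1),-1); W(1,n) = 1; W(n,1) = 1;   % cycle graph
  P = diag(1./sum(W,2)) * W;
  T = (eye(n) + P)/2;                          % lazy random walk: spectrum in [0,1]
  prec = 1e-8;                                 % precision at which ranges are truncated
  Tj = T; Phi = eye(n); dims = n;
  for j = 1:8
      [Q,R,~] = qr(Tj, 0);                     % column-pivoted QR of the current operator
      r   = sum(abs(diag(R)) > prec);          % numerical rank at this scale
      V   = Q(:,1:r);                          % scaling functions, in the current coordinates
      Phi = Phi * V;                           % same functions, expressed in the original coordinates
      Tj  = (V' * Tj * V)^2;                   % compressed operator, squared: represents T^(2^j)
      dims(end+1) = r;                         % the subspaces shrink as the scale coarsens
  end
  disp(dims)

The printed sequence of dimensions shows the multiresolution structure: high powers of the diffusion operator have rapidly decreasing numerical rank, so coarser scales are represented on smaller and smaller subspaces.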
Multiscale Analysis of Plane Arrangement
See Multiscale Analysis of Plane Arrangements on Guangliang Chen's webpage.
Multiscale SVD Code
See the paper Multiscale Geometric Methods for Data Sets I: Multiscale SVD, Noise and Curvature, A. V. Little, M. Maggioni, L. Rosasco. This paper summarizes the work in A.V. Little's thesis (May 2011) on multiscale singular values for noisy point clouds. It extends the analysis of the constructions and results in our previous work Multiscale Estimation of Intrinsic Dimensionality of Data Sets (A.V. Little, Y.M. Jung and M. Maggioni, Proc. AAAI, 2009) and Estimation of intrinsic dimensionality of samples from noisy low-dimensional manifolds in high dimensions with multiscale SVD (A.V. Little, J. Lee, Y.M. Jung and M. Maggioni, Proc. SSP, 2009).
Latest version of the code: Multiscale SVD Code [Updated on 11/29/12].
Instructions:
Unzip in a directory, preserving the subdirectory structure. Open Matlab, go in the installation directory and run Startup_MSVD.
RunExamples contains examples for running the MSVD code, including several examples in the paper.
Important note: The code requires the DiffusionGeometry package to be properly installed, as well as the Data Sets (for running certain examples).
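As a quick illustration of what the package computes (a toy experiment, not the package's interface nor the estimator analyzed in the paper): take a base point, form neighborhoods of growing radius, and look at the singular values of each centered neighborhood. On a d-dimensional set corrupted by noise, the top d singular values grow with the radius while the remaining ones stay near the noise level over a range of scales, which is what the dimension estimator exploits. The data, scales and variable names below are arbitrary.

  % Toy multiscale SVD: singular values of local neighborhoods at growing radii.
  rng(0);
  D = 100; d = 3; n = 2000; sigma = 0.05;
  X = [randn(n,d) zeros(n,D-d)] + sigma*randn(n,D);   % d-dimensional plane in R^D, plus noise
  x0 = X(1,:);                                        % base point
  dists = sqrt(sum((X - x0).^2, 2));                  % distances to the base point (implicit expansion, R2016b+)
  sd = sort(dists);
  radii = sd(round(linspace(50, n/2, 10)));           % 10 scales, from small to moderate neighborhoods
  S = zeros(numel(radii), 10);
  for k = 1:numel(radii)
      Nbr = X(dists <= radii(k), :);
      Nbr = Nbr - mean(Nbr,1);                        % center the neighborhood
      s = svd(Nbr, 'econ') / sqrt(size(Nbr,1));       % singular values at scale radii(k)
      S(k,:) = s(1:10)';
  end
  plot(radii, S, '.-'); xlabel('scale r'); ylabel('multiscale singular values');
  % the top d = 3 curves grow roughly linearly in r; the others flatten near the noise level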
Geometric MultiResolution Analysis (GMRA) Code
See the paper Geometric Multi-Resolution Analysis, W.K. Allard, G. Chen and M. Maggioni (preprint here), as well as the related paper on compressive sampling for GMRA, Approximation of Points on Low-Dimensional Manifolds Via Random Linear Projections, M. Iwen, M. Maggioni.
Matlab code for GMRA and Geometric Wavelet Analysis and Transforms.
Last version of the code: Geometric MultiResolution Analysis [Updated on 2/24/17].
Instructions:
Unzip in a directory, preserving the subdirectory structure. Open Matlab, go in the installation directory and run Startup_GMRA. This will add the required paths to Matlab. RunExamples contains examples for running the GMRA code.
Important note: The code includes the Diffusion Geometry package on which it depends. If you have Diffusion Geometry already installed, please delete the Diffusion Geometry subdirectory that gets installed with this package. The code depends on the Data Sets (for running certain examples). The code requires SuiteSparse and METIS to be installed; it comes with precompiled versions for OS X and Linux 64-bit, in which case the installation of these packages is not required.
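For orientation, here is a stripped-down, single-scale version of the idea behind geometric multi-resolution analysis, written in plain Matlab. It is not the GMRA code's algorithm or interface, and it uses kmeans (Statistics Toolbox) as a stand-in for the multiscale tree: partition the data into cells, fit a low-dimensional affine plane to each cell by local PCA, and approximate each point by its projection onto the plane of its cell.

  % One-scale "geometric wavelet" approximation: partition + local PCA projections.
  rng(1);
  n = 4000; D = 50; d = 2;
  t = rand(n,2);                                                     % parameters on a 2-d sheet
  X = [t, sin(2*pi*t(:,1)), cos(2*pi*t(:,2)), zeros(n,D-4)] + 0.01*randn(n,D);
  K = 32;                                                            % number of cells at this scale
  [idx, C] = kmeans(X, K, 'Replicates', 3);                          % stand-in for a multiscale partition
  Xhat = zeros(size(X));
  for k = 1:K
      I = (idx == k);
      Y = X(I,:) - C(k,:);                                           % center the cell
      [~,~,V] = svd(Y, 'econ');                                      % local PCA
      Xhat(I,:) = C(k,:) + (Y*V(:,1:d))*V(:,1:d)';                   % project onto the local d-plane
  end
  fprintf('relative error with %d cells, local dim %d: %.3g\n', K, d, norm(X-Xhat,'fro')/norm(X,'fro'));

In the full construction the cells are organized in a multiscale tree and the differences between approximations at consecutive scales are encoded by geometric wavelets, so refining the scale reduces the error at a rate governed by the smoothness of the underlying set.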
Compressed Sensing Inversion for GMRA Code
See the paper Approximation of points on low-dimensional manifolds via random linear projections, M. A. Iwen and M. Maggioni. Preprint here.
Matlab code for Compressive Sensing inversion for GMRA.
Last version of the code: Compressive Sensing Inversion for GMRA [Updated on 6/19/13].
Instructions:
Note: The code requires the Diffusion toolbox and all the packages required for that (in particular, Diffusion Geometry; see the GMRA code section), and the SpaRSA toolbox for the performance comparisons.
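The idea of the paper can be illustrated with a small self-contained experiment (a schematic in plain Matlab, not the package's algorithm or interface): a point lying on one of several low-dimensional affine pieces is observed only through a few random linear measurements, and it is recovered by solving a small least-squares problem on each candidate piece and keeping the best fit. The dimensions and variable names below are arbitrary.

  % Schematic recovery from random linear measurements using a dictionary of affine pieces.
  rng(2);
  D = 200; d = 3; K = 10; m = 20;                           % ambient dim, piece dim, number of pieces, measurements
  C = randn(K, D);  V = cell(K,1);
  for k = 1:K, [Q,~] = qr(randn(D,d), 0); V{k} = Q; end     % random d-dimensional affine pieces
  ktrue = 4;  x = C(ktrue,:)' + V{ktrue}*randn(d,1);        % unknown point on piece ktrue
  M = randn(m, D)/sqrt(m);                                  % random measurement matrix
  y = M*x;                                                  % the only data we observe
  best = inf;
  for k = 1:K
      A = M*V{k};  b = y - M*C(k,:)';
      a = A\b;                                              % least squares on candidate piece k
      res = norm(b - A*a);
      if res < best, best = res; xhat = C(k,:)' + V{k}*a; end
  end
  fprintf('relative recovery error: %.2e\n', norm(xhat - x)/norm(x));

In the GMRA setting the affine pieces come from the multiscale tree rather than being given, and the number of measurements needed scales with the intrinsic dimension d rather than with the ambient dimension D, which is the point of the paper.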
Some demo code I use in a class on Spectral Graph Theory
This code takes a black-and-white picture of points, constructs an associated proximity graph, and then computes and displays eigenvalues/eigenfunctions of the Laplacian on that graph. You need to first download and install the general code for Diffusion Geometry above, and then download and install this code for running the demo I ran in class, with some images already prepared.
The script for the demo is called GraphEigenFcnsEx_01.m, and it is fairly extensively commented. I will be happy to add your own examples here!
Some data sets
Data Sets: Example Data Sets [Updated on 6/7/11].
I include here data sets from a variety of sources, transformed to Matlab format and typically preprocessed, for use with the Diffusion Geometry Code. They include variations/subsets of the MNIST data base, CBCL Face Data, Science News articles (prepared by J. Solka), Yale Face B database. Please refer to the original data on these sites if you plan to use this data, as the parts included here are only for the purpose of running examples with the packages above.
They may be installed in any directory; that directory should then be added to the Matlab path, so that the packages above can find them (this works for the most recent versions of Matlab).