Making sense of big neural data...

About me...

[ cv ] [ scholar ] [ github ] [ xbrain web ]
I am currently an Assistant Professor in the Department of Biomedical Engineering at the Georgia Institute of Technology and Emory University. At Georgia Tech, I run the Neural Data Science (NerDS) Lab, which develops new machine learning and data science approaches for analyzing and making sense of neural datasets. Before this, I was a Research Scientist in the Bayesian Behavior Lab at Northwestern University, where I worked with Konrad Kording (now at UPenn). I completed my Ph.D. in Electrical & Computer Engineering at Rice University in 2014, under the supervision of Richard Baraniuk. While at Rice, I co-developed the edX MOOC Discrete-Time Signals and Systems. Before that, I received a BSEE from the University of Miami, where I completed a double major in Audio Engineering and Physics. At the University of Miami, I also worked for the Edward Arnold Center for Confluent Media Studies as a multimedia designer and as an assistant sound designer for the documentary One Water: A collaborative effort for a sustainable future.

Projects

High-throughput methods for quantifying neuroanatomy

With a team of researchers at Argonne National Laboratory (led by Bobby Kasthuri), I have developed the first open data analysis pipeline to convert X-ray image volumes to dense micron-scale brain maps of cell bodies and blood vessels. Our results demonstrate that X-ray sources can be used with image parsing techniques to rapidly quantify neuroanatomy at the mesoscale. We are currently exploring ways in which our techniques can be combined with electron microscopy to obtain multi-modal brain maps.
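The core image-parsing step, turning a thresholded image volume into a count and map of candidate cell bodies, can be sketched in a few lines. This is only a toy illustration on synthetic data (the volume, threshold, and blob sizes are invented here), not the actual XBRAIN pipeline:

```python
import numpy as np
from scipy import ndimage

# Toy 3-D "image volume" with a few bright spherical blobs standing in
# for cell bodies; real X-ray volumes are far larger and noisier.
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 0.1, size=(40, 40, 40))
for center in [(10, 10, 10), (25, 30, 15), (32, 8, 28)]:
    zz, yy, xx = np.ogrid[:40, :40, :40]
    dist2 = (zz - center[0])**2 + (yy - center[1])**2 + (xx - center[2])**2
    vol[dist2 <= 9] += 1.0  # add a radius-3 bright sphere

# Threshold, then label connected components to count candidate cell bodies.
mask = vol > 0.6
labels, n_cells = ndimage.label(mask)
centroids = ndimage.center_of_mass(mask, labels, range(1, n_cells + 1))
print(n_cells)  # → 3
```

In the real pipeline the segmentation step is learned rather than a fixed threshold, but the quantification step (connected components, then per-component statistics) has this same shape.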

Related publications:

  • Dyer et al., 2016. Quantifying mesoscale neuroanatomy using X-ray microtomography (Web, Paper)

Large-scale optimization

Optimization problems are ubiquitous in machine learning and neuroscience. I am currently working on two projects in this domain. First, with Azalia Mirhoseini and Farinaz Koushanfar, I am developing frameworks for data-aware distributed learning. In a recent paper, we showed how the low rank and multi-subspace structure of large datasets can be leveraged to accelerate a broad class of iterative optimization methods. Second, with Mohammad Gheshlaghi Azar and Konrad Kording, I am developing approaches for non-convex and black-box optimization. In a recent paper at UAI 2016, we introduced a provable black-box approach for global optimization that learns a convex envelope from samples of the function.
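The idea behind exploiting low-rank structure can be sketched as follows: if the data matrix factors as A ≈ UV with small inner dimension r, every matrix-vector product in an iterative solver can be routed through the factors, dropping the per-iteration cost from O(nd) to O((n + d)r). A minimal NumPy sketch with synthetic data (parameters of my own choosing, not the RankMap implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 2000, 500, 10

# Synthetic exactly low-rank data matrix A = U @ V and a least-squares target.
U = rng.normal(size=(n, r))
V = rng.normal(size=(r, d))
A = U @ V
x_true = rng.normal(size=d)
b = A @ x_true

# Gradient descent on f(x) = 0.5 * ||A x - b||^2, where every matrix-vector
# product goes through the factors: A x = U (V x), A^T y = V^T (U^T y).
# Each iteration costs O((n + d) r) instead of O(n d).
x = np.zeros(d)
step = 1.0 / (np.linalg.norm(U, 2)**2 * np.linalg.norm(V, 2)**2)
for _ in range(500):
    resid = U @ (V @ x) - b               # A x - b via the factors
    x = x - step * (V.T @ (U.T @ resid))  # A^T resid via the factors

print(np.linalg.norm(A @ x - b) / np.linalg.norm(b))  # small relative residual
```

The step size 1 / (||U||² ||V||²) is a safe bound on the inverse Lipschitz constant, since the largest singular value of A is at most the product of those of U and V.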

Related publications:

  • A. Mirhoseini, E.L. Dyer, E. Songhori, R.G. Baraniuk, and F. Koushanfar, RankMap: A platform-aware framework for distributed learning from dense datasets, IEEE Transactions on Neural Networks and Learning Systems, 2017. (Paper, Code)

  • M. Gheshlaghi Azar, E.L. Dyer, and K.P. Körding, Convex Relaxation Regression (CoRR): Black-box optimization of a smooth function by learning its convex envelope, Proc. of the Conference on Uncertainty in Artificial Intelligence (UAI), 2016. (Paper)

Low-dimensional signal models

Unions of subspaces (UoS) are a generalization of single-subspace models: rather than assuming the data live near one global low-dimensional subspace (as in PCA), a UoS model approximates data points as lying on multiple low-dimensional subspaces. Modeling data with mixtures of subspaces provides a more compact and simple representation, which can lead to better partitioning (clustering) of the data and can help with compression and denoising.
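A small numerical illustration of why the union model helps (synthetic data of my own construction, not an example from the papers below): points drawn from two different 1-D subspaces in R³ are fit poorly by a single 1-D PCA subspace, but exactly by one 1-D subspace per group.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 1-D subspaces (lines through the origin) in R^3.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
X1 = np.outer(rng.normal(size=100), u1)   # points on subspace 1
X2 = np.outer(rng.normal(size=100), u2)   # points on subspace 2
X = np.vstack([X1, X2])                   # data from the union

def best_rank1_error(Y):
    """Residual after projecting Y onto its best single 1-D subspace (PCA)."""
    _, _, Vt = np.linalg.svd(Y, full_matrices=False)
    proj = np.outer(Y @ Vt[0], Vt[0])
    return np.linalg.norm(Y - proj)

# One global 1-D subspace cannot capture both lines...
global_err = best_rank1_error(X)
# ...but one 1-D subspace per group fits each part exactly.
union_err = best_rank1_error(X1) + best_rank1_error(X2)
print(global_err, union_err)  # large residual vs. numerically zero
```

The hard part in practice, which the papers below address, is that the assignment of points to subspaces is unknown and must be recovered jointly with the subspaces themselves.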

Related publications:

  • E.L. Dyer, A.C. Sankaranarayanan, and R.G. Baraniuk, Greedy feature selection for subspace clustering, The Journal of Machine Learning Research 14 (1), 2487-2517, September, 2013. (Paper)

  • E.L. Dyer, T.A. Goldstein, R. Patel, K.P. Körding, and R.G. Baraniuk, Sparse self-expressive decompositions for dimensionality reduction and clustering (Paper)

  • R.J. Patel, T.A. Goldstein, E.L. Dyer, A. Mirhoseini, and R.G. Baraniuk, Deterministic column sampling for low rank approximation: Nyström vs. Incomplete Cholesky Decomposition, SIAM Data Mining (SDM) Conference, May 2016. (Paper, Code)

Analyzing the activity of neuronal populations

Advances in monitoring the activity of large populations of neurons have provided new insights into the collective dynamics of neurons. I am working on methods that learn and exploit low-dimensional structure in neural activity for decoding, denoising, and deconvolution.
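As a minimal illustration of the low-dimensional idea (a synthetic sketch, not any of the methods in the papers below): if a few shared latent signals drive the whole population, projecting noisy activity onto its top singular subspace removes much of the noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_timebins, n_latents = 80, 500, 3

# Simulated population activity: a few shared latent signals drive all
# neurons (low-dimensional structure), plus independent noise per neuron.
latents = np.cumsum(rng.normal(size=(n_latents, n_timebins)), axis=1)
loadings = rng.normal(size=(n_neurons, n_latents))
clean = loadings @ latents
noisy = clean + rng.normal(0.0, 5.0, size=clean.shape)

# Denoise by projecting onto the top singular subspace (truncated SVD).
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = (U[:, :n_latents] * s[:n_latents]) @ Vt[:n_latents]

err_noisy = np.linalg.norm(noisy - clean) / np.linalg.norm(clean)
err_denoised = np.linalg.norm(denoised - clean) / np.linalg.norm(clean)
print(err_noisy, err_denoised)  # the denoised error is much smaller
```

Real neural recordings add complications (spiking noise is non-Gaussian, the latent dimension is unknown, and dynamics matter), which is where the methods below depart from this plain SVD sketch.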

Related publications:

  • E.L. Dyer, M. Azar, H.L. Fernandes, M. Perich, L.E. Miller, and K.P. Körding: A cryptography-based approach to brain decoding (Web, Paper)

  • E.L. Dyer, C. Studer, J.T. Robinson, and R.G. Baraniuk, A robust and efficient method to recover neural events from noisy and corrupted data, IEEE EMBS NER Conference, 2013. (Paper, Code)

Papers

In the pipeline…

  • X. Yang, V. De Andrade, F. De Carlo, E.L. Dyer, N. Kasthuri, D. Gürsoy, Seeing the structure of objects at the nanoscale through low dose computed X-ray tomography, 2017.

  • E.L. Dyer, W.G. Roncal, H.L. Fernandes, D. Gürsoy, V. De Andrade, R. Vescovi, K. Fezzaa, X. Xiao, J.T. Vogelstein, C. Jacobsen, K.P. Körding and N. Kasthuri, Quantifying mesoscale neuroanatomy using X-ray microtomography, in review at eNeuro, 2016. (Paper, Code, Data)

  • E.L. Dyer, M. Azar, H.L. Fernandes, M. Perich, L.E. Miller, and K.P. Körding: A cryptography-based approach to brain decoding, in review at Nature Biomedical Engineering, 2016. (Paper, Code)

  • E.L. Dyer, W.G. Roncal, D. Gürsoy, K.P. Körding, N. Kasthuri: From sample to knowledge: Towards an integrated approach for neuroscience discovery, arXiv:1604.03199 [q-bio.QM], 2016. (Paper)

  • E.L. Dyer, T.A. Goldstein, R.J. Patel, K.P. Körding, and R.G. Baraniuk: Sparse self-expressive decompositions for matrix approximation and clustering, arXiv:1505.00824 [cs.IT], 2015. (Paper, Code)

Publications

  • A. Mirhoseini, E.L. Dyer, E. Songhori, R.G. Baraniuk, and F. Koushanfar, RankMap: A platform-aware framework for distributed learning from dense datasets, IEEE Transactions on Neural Networks and Learning Systems, 2017. (Paper, Code)

  • M. Azar, E.L. Dyer, and K.P. Körding, Convex relaxation regression: Black-box optimization of smooth functions by learning their convex envelopes, Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI), June 2016. (Paper, Poster, Slides)

  • R.J. Patel, T.A. Goldstein, E.L. Dyer, A. Mirhoseini, and R.G. Baraniuk, Deterministic column sampling for low rank approximation: Nyström vs. Incomplete Cholesky Decomposition, SIAM Data Mining (SDM) Conference, May 2016. (Paper, Code)

  • E.L. Dyer, A.C. Sankaranarayanan, and R.G. Baraniuk, Greedy feature selection for subspace clustering, The Journal of Machine Learning Research 14 (1), 2487-2517, September, 2013. (Paper)

  • E.L. Dyer, C. Studer, J.T. Robinson, and R.G. Baraniuk, A robust and efficient method to recover neural events from noisy and corrupted data, IEEE EMBS Neural Engineering (NER) Conference, 2013. (Paper, Code)

  • E.L. Dyer, C. Studer, and R.G. Baraniuk, Subspace clustering with dense representations, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2013 Proceedings, Vancouver, BC, 2013. (Paper)

  • E.L. Dyer, M. Majzoobi, and F. Koushanfar, Hybrid modeling of non-stationary process variations, IEEE/ACM Design Automation Conference (DAC) 2011 Proceedings, San Diego, CA, 2011. (Paper)

  • M. Majzoobi, E.L. Dyer, A. Enably, and F. Koushanfar, Rapid FPGA characterization using clock synthesis and signal sparsity, IEEE International Test Conference (ITC) 2010 Proceedings, Austin, TX, November 2010. (Paper)

  • E.L. Dyer, M.F. Duarte, D.H. Johnson, and R.G. Baraniuk, Recovering spikes from noisy neuronal calcium signals via structured sparse approximation, Lecture Notes in Computer Science, LVA/ICA 2010, Volume 6365/2010, 604-611. (Paper)

  • G. Fischer, E.L. Dyer, C. Csoma, A. Deguet, and G. Fichtinger, Validation system for MR image overlay and other needle insertion techniques, Medicine Meets Virtual Reality 15- in vivo, in vitro, in silico: Designing the Next in Medicine, IOS Press, 2007. (Paper)

Abstracts

  • A. Bleckert, A. Bodor, J. Borseth, D. Brittain, D. Bumbarger, D. Castelli, E.L. Dyer, T. Keenan, Y. Li, F. Long, J. Perkins, D. Reid, D. Sullivan, M. Takeno, R. Torres, D. Williams, C. Reid, N. da Costa: Linking functional and anatomical circuit connectivity using fast parallelized TEM imaging, Society for Neuroscience Annual Meeting (SFN), November 2016.

  • R. Vescovi, E. Miqueles, D. Gürsoy, V. De Andrade, E.L. Dyer, K. Kording, M. Cardoso, F. De Carlo, C. Jacobsen, and N. Kasthuri, TOMOSAIC: Towards Terabyte Tomography, submitted to the International Conference on X-Ray Microscopy (XRM), 2016.

  • E.L. Dyer, H.L. Fernandes, X. Xiao, W. Gray Roncal, J.T. Vogelstein, C. Jacobsen, K.P. Körding and N. Kasthuri, Quantifying mesoscale neuroanatomy using X-ray microtomography, presented at the Society for Neuroscience (SFN) Annual Meeting in October 2015 and the Annual Statistical Analysis of Neural Data (SAND) Meeting in May 2015. (Abstract)

  • E.L. Dyer, T.A. Goldstein, R. Patel, K.P. Körding, and R.G. Baraniuk, Sparse self-expressive decompositions for dimensionality reduction and clustering, Signal Processing with Adaptive Sparse Structured Representations (SPARS), July, 2015. (Abstract)

  • E.L. Dyer, D.B. Murphy, R.G. Baraniuk, and J.T. Robinson, Compressive neural circuit reconstruction using patterned optical stimulation, Society for Neuroscience (SFN) Annual Meeting, 2013.

  • E.L. Dyer, C. Studer, and R.G. Baraniuk, Subspace clustering with dense representations, Signal Processing with Adaptive Sparse Structured Representations (SPARS) 2013 Proceedings, Lausanne, Switzerland, 2013.

  • E.L. Dyer, U. Rutishauser, and R.G. Baraniuk, Group sparse coding with collections of winner-take-all (WTA) circuits, Organization for Computational Neurosciences (OCNS), BMC Neuroscience, 2012.

  • E.L. Dyer, A.C. Sankaranarayanan, and R.G. Baraniuk, Learning hybrid linear models via sparse recovery, Signal Processing with Adaptive Sparse Structured Representations (SPARS) 2011 Proceedings.

  • E.L. Dyer, D.H. Johnson, and R.G. Baraniuk, Learning modular representations from global sparse coding networks, Organization for Computational Neurosciences (OCNS), BMC Neuroscience 2010, 11(1): P131.

  • E.L. Dyer, D.H. Johnson, and R.G. Baraniuk, Sparse coding in modular networks, Computational and Systems Neuroscience (COSYNE), 2010.

  • E.L. Dyer, D.H. Johnson, and R.G. Baraniuk, Sparse coding with population sketches, Organization for Computational Neurosciences (OCNS), BMC Neuroscience 2009, 10(1): P132.

Theses

  • New Theory and Methods for Signals in Unions of Subspaces, Ph.D. Thesis, Dept. of Electrical and Computer Engineering, Rice University, September, 2014.
  • Endogenous Sparse Recovery, M.S. Thesis, Dept. of Electrical and Computer Engineering, Rice University, October, 2011.

Code

  • XBRAIN: X-ray Brain Reconstruction, Analytics, and Inference for Neuroanatomy (Code, Data)
  • RankMap API for Distributed Learning (Code, Paper)
  • Self-Expressive Decomposition (SEED) (Code, Paper)
  • Accelerated Sequential Incoherence Sampling (oASIS) (Code, Paper)
  • Neural Event Recovery and Detection via Sparsity (NERDS) (Code, Paper)
  • Rapid Characterization of FPGAs with Matrix Completion (Code)