L. Nathan Perkins

Statement of Purpose for Systems Neuroscience

(Note: This is the generic version of my Statement of Purpose for applying to programs in systems neuroscience. School-specific references were removed.)

As I dialed down the exogenous inhibitory signal from the central nervous system, the simple six-neuron model began slow bilateral oscillations. These oscillations mirrored the locomotor controls identified in the spinal cord of the lamprey [3], where bilateral phase differences drive the muscle contractions necessary for the undulatory motion of the eel-like fish through water. By extracting and exploring the differential equations underlying the model and their phase planes, I was able not only to observe the oscillations but also to understand the frequencies and steady states that give the central nervous system control over swimming speed. This model, created as part of my computational neuroscience term project, was my first taste of how computational modeling could elucidate the neural networks that neurobiology was just beginning to disentangle.
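The alternation at the heart of such models can be illustrated with a minimal half-center oscillator: two mutually inhibiting rate units with slow adaptation, integrated by Euler's method. The equations and parameters below are an illustrative sketch, not those of the Jung et al. lamprey model.

```python
# Half-center oscillator sketch: two rate units with cross-inhibition
# and slow adaptation produce antiphase (bilateral) oscillations.
# Parameters are illustrative, chosen only to show the mechanism.

def relu(x):
    """Rectify: firing rates cannot go negative."""
    return x if x > 0.0 else 0.0

def simulate(drive=1.0, w=2.0, b=2.0, tau=0.1, tau_a=1.0,
             dt=0.005, steps=4000):
    """Euler-integrate left/right rates (r) and adaptation (a).

    The active side fatigues as its adaptation variable grows,
    eventually releasing the inhibited side -- the basis of the
    left/right alternation seen in locomotor pattern generators.
    """
    r_l, r_r, a_l, a_r = 0.5, 0.0, 0.0, 0.0  # slight asymmetry to start
    trace_l, trace_r = [], []
    for _ in range(steps):
        dr_l = (-r_l + relu(drive - w * r_r - a_l)) / tau
        dr_r = (-r_r + relu(drive - w * r_l - a_r)) / tau
        da_l = (-a_l + b * r_l) / tau_a
        da_r = (-a_r + b * r_r) / tau_a
        r_l += dt * dr_l; r_r += dt * dr_r
        a_l += dt * da_l; a_r += dt * da_r
        trace_l.append(r_l); trace_r.append(r_r)
    return trace_l, trace_r

left, right = simulate()
```

Varying the tonic drive (or an added inhibitory term) shifts the oscillation frequency, which is the sense in which a descending signal can modulate swimming speed in such models.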

As genetic and imaging techniques have markedly improved, the opportunities to capture and analyze neural connections have opened the floodgates of new research. Genetic sequencing and numerous knockout studies have provided greater clarity in understanding biological development and the roles of specific neural structures, while improvements in serial electron microscopy have made the previously elusive detail necessary for mapping connectomes a reality. These breakthroughs lie at the core of my interest in researching systems neuroscience.

Research Interests

Researchers are for the first time getting a glimpse of the larger scale computational structures that underlie complex behavior, thought and disease. Previous theoretical connectionist models are now being validated, revised and improved based on their biological counterparts [1]. I hope to work at this intersection, either building computational models that are concretely informed by their biological bases, or using theoretical models to interpret the narrow and still-noisy data emerging from connectomics imaging.

I am specifically interested in understanding emergence in neural computation, a phenomenon which depends both on intrinsic features (such as genetically determined intraspecies morphological building blocks and common developmental phases, including neurogenesis and synaptogenesis) and on extrinsic developmental learning and exposure (such as learned behavior and plasticity). I believe that combining advances in genetics and imaging with computational models will help us better understand the biological ability to assemble a generic computational lattice of neurons that then adapts through training and stimuli.

Visual processing has already benefited substantially from increased understanding of the biological basis of vision in animals. Early work proposed high-level processing stages inspired by the distinct layers and regions identified in the occipital lobe, in which a visual stimulus is decomposed into topographical features and then selectively reconstituted into higher-level representations based on attention [4]. This early work became a cornerstone of how many computer algorithms perform image recognition [2]. As the understanding of cognitive image recognition has deepened, newer algorithms that incorporate these neural insights have achieved improved performance [5]. This trend of novel neurological insights being incorporated into theories of complex computation will likely gain momentum as further research and imaging allow for more detailed knowledge of the underlying neural circuitry. And in many fields, such as image recognition and language processing, human performance continues to outshine leading computational models, suggesting that there is substantially more to learn through such ongoing investigation. I hope to contribute to the incremental steps integrating cognition theory with neurobiological evidence, resulting in improved approaches to such challenging computational problems as machine learning, image processing and language parsing.


My diverse educational background, including quantitative research experience while working toward my master's degree, and a strong computational skill set honed through both my work and personal projects have prepared me for the research I might undertake as a doctoral student. I was originally entranced by the idea of emergent properties in biology and neuroscience, and this concept became a defining theme in my academic trajectory. My undergraduate education provided a strong background in neurobiology, as well as in computational neuroscience, through courses in artificial intelligence and computational modeling.

My graduate research at MIT focused on the emergence and organization of systems at a societal scale. In my thesis and related publications, I built a novel tool for evaluating organizational strengths. Starting from an existing, comprehensive evaluation tool developed by my lab, and data from actual implementations, I first identified the principal components of the evaluation. Using machine learning algorithms, I then identified a more concise subset of data points that could be used to extrapolate the same detail and insight. From this, I built and validated a new adaptive evaluation that requires only a quarter as many data points (selected on a case-by-case basis) to yield the same information.
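The underlying idea — that a long instrument driven by a few latent factors can be compressed to its most informative questions — can be sketched on synthetic data. The data, dimensions and thresholds below are hypothetical; the actual instrument and algorithms from the thesis are not reproduced here.

```python
# Sketch: compress a long evaluation driven by a few latent factors.
# All data here is synthetic; dimensions and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
# 200 completed evaluations of a 40-question instrument, where responses
# are driven by 4 latent organizational factors plus measurement noise
latent = rng.normal(size=(200, 4))
loadings = rng.normal(size=(4, 40))
responses = latent @ loadings + 0.1 * rng.normal(size=(200, 40))

# principal components of the full instrument via SVD
centered = responses - responses.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# only a handful of components are needed to capture most of the variance
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1

# keep the questions that load most heavily on the retained components --
# the seed for a much shorter (and potentially adaptive) instrument
key_questions = np.unique(np.abs(vt[:k]).argmax(axis=1))
```

In this toy setup the retained components recover essentially all of the signal with a small fraction of the original questions, which is the same compression logic, in miniature, that motivates an adaptive evaluation.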

Outside of academia, I have supported myself through robust and versatile programming work. Most recently, I created Arctic Reservations, a software platform that provides reservation management, accounting and customer relations for outdoor adventure companies. Beyond the programming itself, this has involved deploying and maintaining a highly reliable database cluster with over 50 million records. In support of Arctic and my other products, I have built extensive libraries for highly optimized data storage and processing. For example, while working with an education nonprofit in Los Angeles, I built a crawler and indexer with a custom database format for highly responsive search results, as well as a linguistic clustering algorithm to automatically group related documents based on both n-gram frequency and prevalence. I am confident that my technical experience and skills will be a valuable asset in performing research.
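The flavor of n-gram-based grouping can be shown with a toy sketch: build a bigram profile per document, compare profiles by cosine similarity, and greedily attach each document to the first sufficiently similar group. This is a simplified stand-in, not the production algorithm, and the threshold is arbitrary.

```python
# Toy n-gram clustering sketch (a simplified stand-in for the production
# system): bigram profiles + cosine similarity + greedy grouping.
from collections import Counter
from math import sqrt

def ngrams(text, n=2):
    """Word n-gram frequency profile of a document."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def cosine(a, b):
    """Cosine similarity between two n-gram frequency profiles."""
    dot = sum(a[g] * b[g] for g in a if g in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def group(docs, threshold=0.2):
    """Greedy single pass: join the first existing group whose
    representative profile is similar enough, else start a new group."""
    groups = []  # list of (representative profile, member indices)
    for i, doc in enumerate(docs):
        profile = ngrams(doc)
        for rep, members in groups:
            if cosine(profile, rep) >= threshold:
                members.append(i)
                break
        else:
            groups.append((profile, [i]))
    return [members for _, members in groups]
```

For example, `group(["the cat sat on the mat", "the cat sat on the rug", "stock prices rose sharply today", "stock prices fell sharply today"])` splits the four documents into two groups by shared bigrams.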


Following a doctoral program, I anticipate having further developed my quantitative research skills and having gained a greater understanding of the unanswered questions at the forefront of systems neuroscience. Thus equipped, I plan to continue doing research, most likely pursuing an investigative or tenure-track position at a research university. The field of neuroscience, which lies at the nexus of many distinct disciplines — biology, chemistry, cognition, computation, physics, mathematics — will consistently provide a fount of new questions and new challenges. Drawn by this breadth, I am eager to gain the depth of knowledge and capabilities necessary to contribute to the advancement of our understanding of the mind.

Thank you for your time and consideration.

  1. Denk, W., Briggman, K. L., & Helmstaedter, M. (2012). “Structural neurobiology: missing link to a mechanistic understanding of neural computation.” Nature Reviews Neuroscience, 13(5), 351–358. 

  2. Itti, L., & Koch, C. (2001). “Computational modelling of visual attention.” Nature Reviews Neuroscience, 2, 1–11. 

  3. Jung, R., Kiemel, T., & Cohen, A. H. (1996). “Dynamic behavior of a neural network model of locomotor control in the lamprey.” Journal of Neurophysiology, 75(3), 1074–86. 

  4. Koch, C., & Ullman, S. (1985). “Shifts in selective visual attention: towards the underlying neural circuitry.” Human Neurobiology, 4(4), 219–27. 

  5. Serre, T., Wolf, L., & Poggio, T. (2005). “Object Recognition with Features Inspired by Visual Cortex.” 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), 2, 994–1000.