Modeling and Computation Seminar
Modeling and Computation challenges for soft, wireless sensors and photonics for assessment and stimulation of biological systems
Recent advances in materials and fabrication concepts coupled with miniaturization of wireless energy transfer schemes enable the construction of soft high-performance electronic and optoelectronic systems with sizes, shapes and physical properties matched to biological systems. Applications range from continuous monitors for health diagnosis to minimally invasive exploratory tools for neuroscience. This seminar explores basic science and engineering aspects for the creation of such systems and outlines modeling and related computational challenges and opportunities. Applications of such systems are discussed in the context of imperceptible body-worn devices for the assessment of hemodynamics, sweat and thermal properties of the skin. Highly miniaturized embodiments featuring advanced capabilities in energy harvesting and photonics allow for deployment as multifunctional subdermal neuroscience tools for wireless recording of genetically targeted indicators and optogenetic stimulation of the brain, peripheral nervous system and the heart. The seminar will highlight areas of opportunity for advanced modeling tools as well as computational challenges in the context of these applications.
Bio: Dr. Philipp Gutruf is an Assistant Professor in the Biomedical Engineering Department at the University of Arizona. He received his postdoctoral training in the Rogers Research Group at Northwestern University and the University of Illinois Urbana-Champaign (UIUC), where he developed a broad set of soft, highly miniaturized, wireless, battery-free tools for the characterization and stimulation of biological systems. Dr. Gutruf received his PhD in 2016 at RMIT University, where he worked on oxide-based stretchable electronics, sensors and photonics, with emphasis on device fabrication and material concepts for intrinsically stretchable devices. He has authored over 36 journal articles, received 4 patents, and his work has been highlighted on 8 journal covers. He has also been the recipient of prestigious scholarships and fellowships such as the International Postgraduate Research Scholarship (IPRS) and the Australian Nano Technology Network Travel Fellowship.
Numerical Solution of Riemann Hilbert Problems Applied to High Speed Communication
The main achievement of the last decade in high-speed information transmission is coherent optical communication. This technology is based on the linear regime of pulse dynamics in optical fibers. With increasing transmission speed and distance (long-haul communications), coherent systems experience the negative effects of fiber nonlinearity. For currently existing optical fibers and optical pulse parameters, the pulse dynamics are well described by the nonlinear Schrödinger equation (NLS). Recently it has been proposed to overcome the limitations caused by nonlinearity in coherent systems by transmitting, linearly, the scattering data of the Lax operator associated with the NLS. In this talk, we develop a method for computing the scattering data and for recovering the data signal from back-processed scattering data using the solution of the associated Riemann-Hilbert problem, which is solved with a fast and accurate numerical method.
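To make the setting concrete: in normalized units the focusing NLS reads i q_z + (1/2) q_tt + |q|^2 q = 0. As a rough illustration of the pulse dynamics the talk builds on (a standard split-step Fourier integrator, not the speaker's Riemann-Hilbert method), here is a minimal sketch; all parameter values are illustrative assumptions:

```python
import numpy as np

# Minimal split-step Fourier solver for the focusing NLS
#   i q_z + (1/2) q_tt + |q|^2 q = 0   (normalized units)
# Parameter values below are illustrative, not taken from the talk.

def ssf_nls(q0, z_max, n_steps, t):
    dt = t[1] - t[0]
    omega = 2 * np.pi * np.fft.fftfreq(t.size, d=dt)
    dz = z_max / n_steps
    q = q0.astype(complex)
    lin = np.exp(-0.5j * omega**2 * dz)  # dispersive half step in Fourier space
    for _ in range(n_steps):
        q = np.fft.ifft(lin * np.fft.fft(q))     # linear substep
        q = q * np.exp(1j * np.abs(q)**2 * dz)   # nonlinear phase rotation
    return q

t = np.linspace(-20, 20, 1024, endpoint=False)
q0 = 1.0 / np.cosh(t)  # fundamental soliton initial condition
q = ssf_nls(q0, z_max=5.0, n_steps=2000, t=t)
print(np.max(np.abs(q)))  # soliton amplitude is preserved up to small error
```

Running this propagates the fundamental soliton q0 = sech(t); since |q| is invariant under the exact flow, the final maximum amplitude stays near 1, which is a quick sanity check on the scheme.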
Mechanisms for gamma rhythm production in the brain
Populations of neurons in the brain can produce distinctive, rhythmic activity observable in extracellular recordings. Activity in the ~30-90 Hz frequency band of these recordings has been the subject of much attention and is known as gamma rhythm. It has been found in the visual cortex, for example, where power in the gamma band is modulated by the orientation and contrast of visual stimuli. Whether it performs a specific function in the brain or is merely a side-effect of neural computation is not yet settled. My aim will be to discuss the mechanism that produces gamma rhythm.
As gamma rhythm is thought to be locally produced, I will begin by introducing a spiking network model of a generic local population of neurons in the brain, with biophysical parameters and network structure chosen to reflect data where possible. When the network is driven, an emergent spike pattern appears consisting of the briefly coordinated spiking activity of small groups of neurons followed by relative lulls in activity. The frequency and size of these events are consistent with gamma rhythm. I will explain the mechanism of these events, their statistics, and how synaptic parameters influence their size and inter-event times.
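To fix ideas, a cartoon of such a spiking network (a toy leaky integrate-and-fire model, not the speaker's model) can be written in a few lines; every parameter below is an assumed placeholder rather than a value from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy leaky integrate-and-fire network: 80 excitatory and 20 inhibitory
# cells with random all-to-all coupling and constant external drive.
n_e, n_i = 80, 20
n = n_e + n_i
dt, t_max = 0.1, 500.0              # ms
tau, v_th, v_reset = 10.0, 1.0, 0.0
drive = 0.15                        # external input (per ms)
w = rng.random((n, n)) * 0.02       # synaptic weights
w[:, n_e:] *= -4.0                  # inhibitory columns: negative and stronger

v = rng.random(n) * v_th
spike_times, spike_ids = [], []
for step in range(int(t_max / dt)):
    spiked = v >= v_th
    v[spiked] = v_reset
    idx = np.flatnonzero(spiked)
    spike_times.extend([step * dt] * len(idx))
    spike_ids.extend(idx.tolist())
    # Euler step: leak + external drive + recurrent input from this step's spikes
    v += dt * (-v / tau + drive) + w @ spiked.astype(float)
print(f"{len(spike_times)} spikes in {t_max} ms")
```

In simulations of this kind, one looks for episodes in which many cells spike within a few milliseconds, separated by relative lulls; depending on the synaptic and drive parameters, the resulting population rhythm can fall in the gamma band.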
Next I will show how such rhythms appear in a semi-realistic model of the visual cortex. The model covers a small patch of monkey V1 and reproduces many of its known phenomena. Driven now by visual stimuli, a gamma rhythm is again produced in a way that is graded with stimulus features like contrast and orientation. The more detailed and realistic model benchmarked against different forms of experimental data increases our confidence in the mechanism for gamma rhythm production, and allows us to relate dynamics to the network’s visual functions and response statistics. This is joint work with Lai-Sang Young and Robert Shapley.
From sea slugs to robots: Soft mechanics, discrete geometry, and computation of hyperbolic elastic sheets
Why are there intricate, self-similar wrinkles along the edges of growing leaves, blooming flowers, torn plastic sheets, and frilly sea slugs? We argue that the soft mechanics and dynamics of these non-Euclidean elastic sheets are governed by interacting non-smooth geometric defects in the material. I will describe novel ideas stemming from characterizing and modeling these defects using discrete differential geometry in order to uncover fundamental insights into the elastic behavior and properties of thin hyperbolic bodies. New theories based on the mechanics of non-smooth defects may (i) explain biological phenomena, from the morphogenesis of leaves, flowers, etc. to the biomechanics of sea slugs, as well as (ii) introduce new paradigms for materials design and actuation in a variety of new technologies, e.g., soft robotics. This is joint work with Shankar Venkataramani.
Modeling cosmological observations
The accelerated expansion of the Universe is the most surprising cosmological discovery in decades. It has inspired a new generation of ambitious surveys to determine the fundamental nature of this acceleration. I will introduce the observations used by today’s cosmologists, describe the modeling techniques used to connect these observations to fundamental physics, and present cosmology constraints from the Dark Energy Survey (DES Y1). This analysis constrains the composition and evolution of the Universe through joint modeling of galaxy clustering, galaxy-galaxy lensing, and cosmic shear. These three measurements, based on 20 million galaxies, yield consistent cosmological results, and in combination they provide some of the most stringent constraints on cosmological parameters. I will describe the validation of measurements and modeling from pixels to cosmology and give an outlook on the modeling and analysis challenges for future, much larger experiments that will map the position and shapes of billions of galaxies.
Deep Learning Algorithms for Autonomous Space Guidance
Autonomous and unconstrained exploration of small and large bodies of the solar system requires the development of a new class of intelligent systems capable of integrating real-time streams of sensor data and autonomously taking optimal decisions, i.e., deciding the best course of action. For example, future missions to asteroids and comets will require that the spacecraft be able to autonomously navigate in uncertain dynamical environments by executing a precise sequence of maneuvers (e.g., hovering, landing, touch-and-go) based on information collected and processed during the close-proximity operations phase. Currently, optimal trajectories are determined by solving optimal guidance problems for a variety of scenarios, generally yielding open-loop trajectories that must be tracked by the guidance system. Although deeply rooted in the powerful tools of optimal control theory, such trajectories are computationally expensive and must be determined off-line, hindering the ability to optimally adapt and respond in real time to 1) uncertainties in the unknown dynamical environment; 2) detected hazards; and 3) science value analysis. Over the past few years, there has been an explosion of machine learning techniques involving the use of shallow and deep neural networks to solve a variety of problems spanning from object detection to image recognition to natural language processing and sentiment analysis. The recent success of deep learning is due to concurrent advances in the fundamental understanding of how to train deep architectures, the availability of large amounts of data, and critical advances in computing power (e.g., the extensive use of GPUs). One can naturally ask the following: how can such techniques help the development of the next generation of robust and adaptive algorithms that may enable autonomous space exploration?
In this talk, I will address this problem by presenting a variety of methods and techniques recently developed by my research team in the context of autonomous planetary landing and close-proximity operations around small bodies. The methodologies span from supervised learning to deep reinforcement learning and demonstrate that such approaches may be implemented to enable intelligent autonomous systems for guidance, control, and real-time decision-making during the robotic exploration of the solar system.
Brief Bio: Roberto Furfaro is a Full Professor in the Department of Systems and Industrial Engineering and the Department of Aerospace and Mechanical Engineering, University of Arizona. He is the Director of the Space Situational Awareness Arizona (SSA-Arizona) Initiative at the Defense and Security Research Institute (DSRI). He has published 46 peer-reviewed journal papers and more than 200 conference papers and abstracts. As PI and Co-PI, he has received more than $30M from NASA and the Department of Defense. He was the systems engineering lead for the Science Processing and Operations Center (SPOC) of the NASA OSIRIS-REx mission. He is currently part of the NASA NEOCam Science Team, responsible for developing an autonomous system for NEOCam spacecraft follow-up and survey planning. In his honor, the asteroid 2003 WX3 was named 133474 Roberto Furfaro.
Instability and Fragility in Wall Turbulence Transition
The mathematical analysis of transition to turbulence in wall-bounded shear flows has been a long-standing problem in hydrodynamic stability since the early twentieth century. The problem is central to understanding skin-friction drag and its reduction and control by both active and passive mechanisms. While a satisfactory mathematical theory of transition in these flows remains incomplete, substantial progress has been made in the past two decades, related to the concepts of non-normal growth and pseudo-spectral analysis. I will argue that one key to this progress is to view transition in shear flows not only as a stability problem, but also as a fragility of the underlying dynamics to structural perturbations. I will describe these recent developments, which have interesting connections with control theory, and in particular with robust stability problems. The implications of this new theory for the control of turbulent skin-friction drag will also be outlined.
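The non-normal growth mentioned above can be illustrated with a two-by-two toy system (my example, not from the talk): a linearly stable but non-normal matrix whose solution-operator norm grows far above 1 before decaying.

```python
import numpy as np

# A = [[-1, b], [0, -2]] is stable (eigenvalues -1 and -2) but, for large b,
# strongly non-normal. For this upper-triangular A the matrix exponential is
# known in closed form, so no ODE solver is needed.
b = 100.0  # off-diagonal coupling (illustrative value)

def exp_At(t):
    # expm(t*A) for A = [[-1, b], [0, -2]]
    return np.array([[np.exp(-t), b * (np.exp(-t) - np.exp(-2 * t))],
                     [0.0, np.exp(-2 * t)]])

ts = np.linspace(0.0, 5.0, 501)
growth = [np.linalg.norm(exp_At(t), 2) for t in ts]
print(max(growth))  # transient amplification well above 1
```

Even though every eigenvalue of A has negative real part, the norm of exp(tA) transiently reaches about 25 here before decaying to zero; this kind of transient energy growth is the linear mechanism invoked in subcritical transition scenarios.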
Experiments in Traffic Flow Control with Low and High Density of Autonomous Vehicles
This talk describes two experiments in traffic flow control that involve the University of Arizona CAT Vehicle Testbed. The first experiment explores how to dampen emerging waves in traffic that are due to congestive effects. This experiment grew out of a theory of how traffic flow could be improved through sparse velocity control (e.g., ~5% of the vehicles) in the flow. The second experiment examines an analogous case, where 100% of the vehicles are controlled, though this time using off-the-shelf (rather than customized) cruise control algorithms. The talk will examine the hypotheses, methods, and results of these experiments, and explore the theory and motivation for the research as a means to provide insights into the obtained results. The talk will discuss how some results suggest the potential for tremendous societal reductions in stress and fuel use, and how other results indicate the need for more research to determine whether deployment of the available technology at a societal scale would have positive or negative effects on safety and congestion. The research was sponsored by the National Science Foundation under award CNS-1446435, and is collaborative work with Benedetto Piccoli, Benjamin Seibold, and Dan Work.
12:30 p.m. Sept. 26, 2019
The Modeling, Computation, Nonlinearity, Randomness and Waves Seminar will have an organizational meeting Thursday, September 12, at 12:30 p.m. in Math 402. If you would like to speak this semester or nominate someone to speak this semester, please attend this meeting or send an email to firstname.lastname@example.org. If you do not already subscribe to seminar announcements, please send an email to email@example.com.
12:30 p.m. Sept. 12, 2019