Student: Jackson Zariski, Program in Applied Mathematics
Title: Supervised Learning for Telescope Control Systems and Performance Assessment via Record-Based Metrics
Advisor: Kaitlin Kratter, Steward Observatory
Location: Math Building, Room 501
Abstract: Despite the substantial telemetry produced by modern observatories, telescope acquisition typically still relies on static pointing models and manually tuned heuristics. We present a data-driven alternative for acquisition offset correction that unifies supervised machine learning (ML), global sensitivity analysis, and an operations-centric software framework. Using archival telemetry logs from a variety of telescopes and observatories, we train regression models to predict on-sky horizon acquisition offsets from a diverse set of features, and briefly examine guiding forecasts with recurrent neural networks. To interpret these predictors and guide model tuning, we compute variance-based sensitivity indices via Monte Carlo estimation, quantifying each input's contribution to the variance of the predicted offsets. By repurposing historical telemetry for ML training, our pipeline achieves substantial improvements in offset corrections on a temporally split test set. For ease of use and broader applicability, we wrap these trainers and diagnostic tools in a lightweight, open-source, Dockerized microservice with a browser-based GUI, enabling operators to upload new logs, visualize feature sensitivities, and obtain acquisition offset corrections. In addition, we introduce a new problem in combinatorics, based on probabilistic record theory, that we call the Disappear Sort procedure. We present novel closed-form expressions for two versions of this procedure and explore its application as a metric for evaluating an acquisition model's effectiveness at reducing large, localized errors prevalent in certain regions of feature space.
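The variance-based sensitivity analysis mentioned above can be illustrated with a standard Saltelli-style Monte Carlo estimator of first-order Sobol' indices. The sketch below is illustrative only: `toy_offset_model` is a hypothetical stand-in for a trained offset predictor (not the talk's actual model), and the uniform input ranges are assumptions.

```python
import numpy as np

def toy_offset_model(x):
    # Hypothetical stand-in for a trained offset predictor:
    # depends strongly on x[:, 0], weakly on x[:, 1], not at all on x[:, 2].
    return 3.0 * x[:, 0] + 0.5 * np.sin(x[:, 1])

def first_order_sobol(f, d, n=100_000, rng=None):
    """Saltelli-style Monte Carlo estimate of first-order Sobol' indices.

    S_i measures the fraction of output variance attributable to input i alone.
    """
    rng = np.random.default_rng(rng)
    A = rng.uniform(-1.0, 1.0, size=(n, d))  # assumed input ranges
    B = rng.uniform(-1.0, 1.0, size=(n, d))
    fA, fB = f(A), f(B)
    total_var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]  # A with column i swapped in from B
        S[i] = np.mean(fB * (f(ABi) - fA)) / total_var
    return S

S = first_order_sobol(toy_offset_model, d=3, rng=0)
```

Here nearly all predicted-offset variance is attributed to the first input, a small share to the second, and essentially none to the inert third, which is the kind of ranking an operator could use to prioritize which telemetry channels matter for acquisition corrections.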
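The Disappear Sort procedure itself is not defined in the abstract, but as background for the record-based metric it builds on: in classical record theory, an entry of a sequence is a (upper) record if it exceeds every earlier entry, and for n i.i.d. continuous values the expected number of records is the harmonic number H_n = 1 + 1/2 + ... + 1/n. A minimal sketch of counting records, e.g. in a stream of absolute acquisition errors:

```python
def count_records(xs):
    """Count upper records (new running maxima) in a sequence."""
    records = 0
    best = float("-inf")
    for x in xs:
        if x > best:  # strictly exceeds every earlier entry
            records += 1
            best = x
    return records

# Records in this sequence are 3, 4, 5, 9.
n_rec = count_records([3, 1, 4, 1, 5, 9, 2, 6])  # -> 4
```

A model that suppresses large, localized errors would tend to produce fewer and smaller new record errors over time, which is one plausible way a record-based statistic can serve as an evaluation metric.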