Technology

  • Epidemics and pandemics are characterized by highly complex and dynamic systems, requiring data- and model-driven simulations to support decision making. Due to the large number of unknowns, decision makers usually need to generate ensembles of stochastic scenarios, requiring hundreds or thousands of individual simulation instances, each with different parameter settings corresponding to a distinct plausible scenario. Intuitively, each simulation takes as input relevant models and observations (e.g., demographic data, transportation networks, disease models) and produces a rich simulation ensemble of alternative outcomes -- each outcome representing a potential timeline.

    As the number of model parameters of a simulation increases, the number of potential situations one can simulate increases exponentially -- consequently, simulation ensembles are inherently sparse, even when they are extremely large.

    The novel PanCommunity:DataStorm platform supports data- and model-driven simulation ensemble management, optimization, analysis, and exploration. The platform helps decision makers (a) decide which simulation instances to execute and, (b) given a large simulation ensemble, explore the resulting alternative timelines.
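    The combinatorial growth described above can be illustrated with a minimal sketch. The toy SIR model, parameter values, and function names below are hypothetical illustrations, not the platform's actual models: each additional parameter multiplies the number of simulation instances needed to cover the scenario space.

```python
import itertools
import random

def simulate_sir(beta, gamma, seed, population=1000, infected0=5, days=30):
    """Run one stochastic SIR instance; return the daily infected counts."""
    rng = random.Random(seed)
    s, i, r = population - infected0, infected0, 0
    timeline = [i]
    for _ in range(days):
        # Per-individual Bernoulli draws make each run a distinct
        # stochastic realization of the same parameter setting.
        new_inf = sum(rng.random() < beta * i / population for _ in range(s))
        new_rec = sum(rng.random() < gamma for _ in range(i))
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        timeline.append(i)
    return timeline

# Every extra parameter multiplies the scenario count: 3 transmission rates
# x 3 recovery rates x 5 random seeds already means 45 runs; ten parameters
# with 3 settings each would require 3**10 = 59,049 runs.
betas = [0.15, 0.25, 0.35]
gammas = [0.05, 0.10, 0.15]
seeds = range(5)

ensemble = {
    (b, g, s): simulate_sir(b, g, s)
    for b, g, s in itertools.product(betas, gammas, seeds)
}
print(len(ensemble))  # 45 simulation instances
```

    Even this tiny grid shows why realistic ensembles, which vary many more parameters, are inherently sparse samples of the full scenario space.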

  • Real-time and continuous analysis and decision making (including estimating the transmissibility of the disease, forecasting its spatio-temporal spread at different spatial scales, assessing the effect of travel controls, predicting the effect of school closures, and assessing the impact of pharmaceutical interventions) through models and simulations, however, requires data. When a new virus starts to spread, vaccines are not an option; nonetheless, diagnostic tests can be developed at different costs and with different accuracy (sensitivity and specificity parameters). Such tests serve not only as a method to identify positive cases of the disease for routine contact tracing aimed at stopping chains of transmission, but also as a means to dynamically collect data that provides a real-time picture of the trajectory of the epidemic. This way, a silent epidemic can be monitored, and appropriate countermeasures can be put in place.

    Unfortunately, modeling an epidemic can be very complex and can involve a large space of capabilities and “future” actions to be accounted for. Consequently, developing an efficient and effective rapid testing strategy requires the ability to select potential scenarios from a high-dimensional parameter space. Existing approaches fail to account for the complexity, dynamicity, cost, and accuracy of testing modalities and cannot scale to the needs of rapid testing applications. We are, therefore, developing an innovative family of budgeted rapid testing algorithms that generate rapid testing ensembles that help explain past observations, in terms of the target disease model, while providing insights into possible future scenarios.

    Click for RTEM Model Overview

    Click for RTEM Simulation Interface (V0.1)
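    To make the idea of budgeted scenario selection concrete, the following is a heavily simplified greedy sketch, not the actual budgeted rapid testing algorithms: it ranks candidate scenarios by how well they explain past observations relative to their testing cost, then selects within a budget. All scenario names, costs, and case curves are hypothetical.

```python
def select_test_scenarios(candidates, observed, budget):
    """Greedily pick scenarios that best explain past observations,
    subject to a total testing-cost budget.
    Each candidate is a (name, cost, predicted_curve) tuple."""
    def fit_error(predicted):
        # Sum of squared errors against the observed epidemic prefix.
        return sum((p - o) ** 2 for p, o in zip(predicted, observed))

    # Rank by (error x cost): cheap, well-fitting scenarios come first.
    ranked = sorted(candidates, key=lambda c: fit_error(c[2]) * c[1])
    chosen, spent = [], 0
    for name, cost, _curve in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

# Hypothetical scenarios: (label, testing cost, predicted case counts).
observed = [10, 14, 21, 30]
candidates = [
    ("low-spread",  3, [10, 12, 15, 18]),
    ("baseline",    5, [10, 14, 20, 29]),
    ("high-spread", 4, [10, 18, 32, 55]),
]
chosen, spent = select_test_scenarios(candidates, observed, budget=8)
print(chosen, spent)  # ['baseline', 'low-spread'] 8
```

    The real algorithms must additionally account for the dynamicity of the epidemic and the sensitivity/specificity of each testing modality, which this sketch deliberately omits.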

The 2019 National Response Framework report by the Federal Emergency Management Agency (FEMA) recognizes the role of data and models and the need for new and innovative methods that can exploit them in a unified manner. We therefore design and implement a novel participatory modeling-as-a-service (simulation, analysis, and model orchestration) and computing framework, PanCommunity, for data collection, analysis, and decision support in COVID-19-like pandemics.

The PanCommunity framework will support seamless integration and manipulation of independently developed, reusable scientific models and analysis components within the same framework as the data, for understanding and improving community response in pandemics, and will empower experts with data- and model-driven situational awareness, socio-behavioral understanding, and decision making. This is complemented by a novel multi-scale and multi-fidelity optimization framework that accounts for economic and social costs.

To achieve this objective, this work addresses several fundamental research challenges across computing and community health: (a) integration and alignment of epidemic, testing, vaccination, intervention, and behavior models and data; (b) multi-model and multi-scale simulation ensemble creation and decision support; and (c) the multi-scale social impact of decision making across communities.