MindBytes Poster Gallery 2014
Comparative Analysis of Capital Investments in the Electric Utility Industry
The utility industry is characterized by high capital investments whose returns accrue over decades. Raising and justifying large capital funds far in advance of highly uncertain
future returns is therefore often difficult, resulting in (a) systematic underinvestment in infrastructure in many regions and (b) deprivation of those regions of investment in other economic areas because of fragile
utility infrastructure. Analytical tools may help turn the uncertainty in the future returns of capital investments in the utility industry into a calculated risk. This presentation focuses on the
electricity distribution grid and aims to: 1. Model the historical relationship between the economic, demographic, and technological characteristics of US regions and their demand for electricity. 2. Model the relationship
between capital investment amounts and electricity load on the grid over time. 3. Identify regions that are under- or over-investing in infrastructure relative to future electricity loads forecast on the
basis of the economic, demographic, and technology-related characteristics of the regions.
Identifying Students at Risk Accurately and Early
The high school dropout crisis in the United States claims more than one million students each year, costing individuals the loss of potential earnings and the nation hundreds of billions of dollars in lost revenue,
lower economic activity, and increased social-service costs. Interventions have long been shown to be an effective way to address the needs of students who are falling behind with respect to their educational goals.
To increase the odds of success, however, interventions must reach the right students, at the right time, and with the correct message. With student-level data becoming almost ubiquitous, educators today can
obtain valuable insight to tackle these challenges in a much more timely manner. Using techniques from machine learning and data analytics, we developed an approach that can be used to select and prioritize
students who are likely to be at risk of dropping out of high school, and to suggest how particular students may differ in their needs. These predictions and explanations can then be used to target interventions for these
students, hopefully leading to better outcomes. We are currently analyzing data from additional cohorts and school districts to assess how well this approach generalizes and scales to student data from across
the country. Furthermore, we are developing a student data pipeline such that these methods can be applied efficiently and effectively with data from school districts in the future.
Computationally Efficient Electrode Model to Simulate Electrochemical Cells
Understanding the mechanisms in electrochemical cells can provide meaningful insights for improving battery efficiency. Building computational models of electrochemical cells is a non-trivial task. A computationally
efficient method is presented here for treating the complicated interactions between polarizable metallic electrodes held at constant potentials and the electrolyte layer separating them. The method
combines a fluctuating uniform electrode charge with explicit image charges to account for the polarization effect of the electrode, and a constant uniform charge can be added to account for the constant applied
voltages.
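The image-charge ingredient of this approach can be illustrated with a textbook example (a self-contained sketch of the underlying electrostatics, not the combined fluctuating-charge method presented here): a point charge above a single grounded conducting plane, together with its mirror image of opposite sign, leaves the plane at zero potential, which is exactly the boundary condition a metallic electrode at fixed potential imposes.

```python
import math

# Coulomb potential (in units where 1/(4*pi*eps0) = 1) at point r
# due to a point charge q located at r0.
def coulomb(q, r0, r):
    return q / math.dist(r0, r)

# A charge q a distance d above a grounded conducting plane at z = 0 is
# screened by an image charge -q at z = -d.  Together the pair makes the
# plane an equipotential at zero.
def potential_above_plane(q, d, r):
    return coulomb(q, (0.0, 0.0, d), r) + coulomb(-q, (0.0, 0.0, -d), r)

# Check: the combined potential vanishes everywhere on the plane z = 0.
for x, y in [(0.5, 0.0), (1.0, 2.0), (-3.0, 0.7)]:
    assert abs(potential_above_plane(1.0, 1.0, (x, y, 0.0))) < 1e-12
```

For two parallel electrodes the mirror construction generates an infinite series of images, which is part of what makes an efficient treatment non-trivial.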
Towards Anatomic Scale Agent-Based Modeling with a Spatially Explicit General-Purpose Model of Enteric Tissue
Inflammation of the ileal pouch following remedial surgery for ulcerative colitis, a condition termed pouchitis, is a significant source of morbidity in patients with inflammatory bowel disease, with reported long-term
incidence rates of up to 95%. The pathogenesis of pouchitis is believed to involve the intersection of dysregulated intestinal inflammation, abnormal mucosal tissue response, and alterations in gut microflora
due to stasis resulting from the anatomic configuration of the ileal pouch. Thus the pathogenesis of pouchitis is a multiscale process that extends from microscale molecular signaling to tissue-scale cellular
patterning to anatomic-scale dynamics of the flow of intestinal contents. We have previously developed the Spatially Explicit General-purpose Model of Enteric Tissue (SEGMEnT) to dynamically represent existing
knowledge of the behavior of enteric epithelial tissue as influenced by inflammation, with the ability to generate a variety of pathophysiological processes. Because the progression of pouchitis and the spatial
distribution of the stimuli that drive it are not homogeneous throughout the pouch, anatomic-scale simulations are required in order to plausibly capture the complicated interplay
of ileal and rectal (colonic) tissue with a dynamic microbiome. We have therefore implemented a parallelized version of SEGMEnT, SEGMEnT_HPC, capable of anatomic-scale simulations of intestinal epithelial
tissue patterning and response to inflammation.
Metadynamics: Improved Methodology and a Proof of Convergence
Metadynamics is a versatile and capable enhanced sampling method for the computational study of soft matter materials and biomolecular systems. However, it is an adaptive biasing method and the field of adaptive
simulation is quite young; there are many open questions about the statistical mechanics and design of adaptive simulations, and metadynamics is no exception. This poster presents an improved metadynamics method that
can achieve an order of magnitude faster convergence than the previous state of the art without compromising exploration efficiency and summarizes the most important aspects of our recent proof that metadynamics
converges asymptotically. In particular, strict timescale separation is not required for metadynamics to converge accurately and knowing the rough size of the energy barriers in a system beforehand is not necessary
for efficient convergence.
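The basic recipe that these improvements build on can be sketched in a few lines (a toy one-dimensional illustration of standard metadynamics, not the improved method from the poster): Gaussian hills are periodically deposited at the current value of a collective variable, and the accumulated bias pushes the system out of basins it has already explored. All parameter values below are illustrative choices.

```python
import math, random

random.seed(0)

# Model free-energy surface: a 1D double well along a collective variable s,
# with minima at s = -1 and s = +1 and a barrier at s = 0.
def free_energy(s):
    return (s * s - 1.0) ** 2

# Metadynamics bias: a sum of Gaussian hills of height w and width sigma
# centered at previously visited CV values.
def bias(s, hills, w=0.1, sigma=0.2):
    return sum(w * math.exp(-(s - c) ** 2 / (2 * sigma ** 2)) for c in hills)

def run(steps=20000, stride=100, beta=3.0, dx=0.05):
    s, hills = -1.0, []
    for step in range(1, steps + 1):
        trial = s + random.uniform(-dx, dx)
        dE = (free_energy(trial) + bias(trial, hills)) - (free_energy(s) + bias(s, hills))
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            s = trial
        if step % stride == 0:
            hills.append(s)   # deposit a hill at the current CV value
    return hills

hills = run()
# The accumulated bias fills the starting well, so the walker
# eventually samples both basins rather than staying trapped at s = -1.
assert any(h > 0.5 for h in hills) and any(h < -0.5 for h in hills)
```

In the long-time limit the negative of the accumulated bias approximates the free-energy surface, which is the convergence property the poster's proof addresses.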
Simulating the Dark Side of the Universe
We present a suite of large simulations of the collapse of dark matter into the structure known as the cosmic web, and into dark matter halos in particular. We highlight some key aspects of various research projects
focused on the density profiles of these halos.
Listening Closely for Black Hole Collisions
With the most sensitive ground-based gravitational wave detectors ever made coming online in the next year, the birth of a new field of observational astronomy is closer than ever. We outline the effort to detect
gravitational waves and characterize their sources, a computationally expensive endeavor requiring the generation of millions of model waveforms for each detected signal.
Using Global View Resilience (GVR) to Add Resilience to Exascale Applications
Resilience is one of the most significant challenges to achieving exascale computing. In the Global View Resilience (GVR) project, in order to mitigate or tolerate the high error rates expected in the future, we have been
developing a library which enables scientific applications to run reliably on unreliable computers. The GVR approach builds on a global view data model, adding versioning (multi-version), user control of timing
and rate (multi-stream), and flexible cross-layer error signaling and recovery. With a versioned array as a portable abstraction, GVR enables application programmers to exploit deep scientific and application
code insights to manage resilience (and its overhead) in a flexible, portable fashion. We have developed a prototype implementation of the GVR library and applied it to several existing production-level scientific
applications, such as ddcMD (molecular dynamics, from LLNL), OpenMC (Monte Carlo neutron transport simulation for nuclear reactors, from the CESAR co-design center), Chombo (block-structured adaptive mesh refinement
framework, from LBNL), Trilinos (library collection for scientific computation, from SNL), and linear solvers. Through these application studies we have shown that applying GVR to existing applications is
straightforward and substantially improves their resilience. In this poster we present the overall idea of GVR and the latest results demonstrating that GVR requires negligible code change and runtime
overhead when applied to these existing scientific applications.
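The multi-version idea behind the versioned-array abstraction can be sketched as follows (a hypothetical illustration of the concept in Python, not the GVR library API): an application commits snapshots of a global array at points it chooses and, on detecting an error, rolls back to an earlier version.

```python
# Hypothetical sketch of a multi-version array: the names and methods here
# are illustrative, not GVR's actual interface.
class VersionedArray:
    def __init__(self, data):
        self.data = list(data)
        self.versions = []

    def commit(self):
        """User-controlled snapshot; returns the new version number."""
        self.versions.append(list(self.data))
        return len(self.versions) - 1

    def restore(self, version):
        """Roll the live data back to a previously committed version."""
        self.data = list(self.versions[version])

a = VersionedArray([0.0] * 4)
a.data = [x + 1.0 for x in a.data]   # one step of "computation"
good = a.commit()                    # checkpoint the known-good state
a.data[2] = float("nan")             # simulated silent data corruption
a.restore(good)                      # recover from the last good version
assert a.data == [1.0, 1.0, 1.0, 1.0]
```

The point of user control over commit timing and rate is that the application, not the runtime, decides how much resilience overhead to pay.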
Coarse-Grained (CG) Computer Simulations of Biomolecular Phenomena
The use of "coarse-grained" (CG) models can extend the reach of computer simulation to time- and length-scales capable of examining important biophysical phenomena. We present a motivation for using coarse-grained
models in the context of several different systems, and highlight what such approaches can reveal about key processes in biology.
Structures of Disordered Polypeptides
The folding of intrinsically disordered peptides is often difficult to elucidate by traditional experimental techniques because of the rapid interconversion of these peptides among many low-free-energy states. Molecular
simulations are one tool to resolve the structures of proteins on the angstrom and picosecond scales that can be applied to a diverse set of systems. This poster discusses how advanced sampling techniques such
as bias exchange metadynamics and replica exchange solute tempering were used to determine the important conformations of proteins in a variety of environments. For example, the aggregation of the peptide human
amylin has been associated with the development of type II diabetes. Oligomers of amylin are suspected to cause the death of insulin-producing cells in the pancreas; however, their transient nature has made them
difficult to identify experimentally. Molecular simulations were used to probe the conformations of single peptides in solution. The fraction of peptides in a β-hairpin, α-helix, or random coil state was determined
for human and rat amylin. Different force fields and their resulting ensembles of structures were then evaluated on their ability to reproduce experimentally measured NMR chemical shifts. Another project
examines the role of peptide chirality in the formation of coacervates, a liquid-liquid phase separation with applications in drug delivery, gene therapy, food science, and cosmetics. While homochiral peptides form
precipitates, achiral systems instead form coacervates. Molecular simulations were performed to understand the effect of chirality on the strength of interactions between pairs of these peptides.
A Robust Redesign of High School Match
I propose a method to estimate the parameters of students' preferences over schools using rank-order lists that are not necessarily identical to their true rankings of schools.
Multi-Layered, Iterative Protocols for Quantum Chemical Calculations
A common strategy in quantum chemical calculations is to start by modeling a system with a low level of theory and to progress to the desired (high) level of theory. While this seems intuitively reasonable, there
is no formal reason that such a sequence is guaranteed to converge to the optimum for the desired level of theory. In fact, in cases in which the low and high levels of theory favor very different solutions,
this approach can trap the calculation in local minima and slow convergence. Here, we propose a theoretical framework for how one force field can be used to precondition another, so as to seamlessly accelerate the convergence
of the latter. We demonstrate this idea by applying it to reaction path discovery, obtaining speedups of 3-5 fold.
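The general low-level-then-high-level strategy that motivates this work can be sketched numerically (a hypothetical toy illustration of the generic idea, not the preconditioning framework presented on the poster): a cheap surrogate objective roughly shares its minimum with the expensive target objective, so starting the expensive optimization from the surrogate's minimizer leaves only a short refinement.

```python
import math

# Toy stand-ins: "cheap" plays the role of a low level of theory,
# "expensive" the desired high level; both names are illustrative.
def cheap(x):
    return (x - 2.0) ** 2

def expensive(x):
    return (x - 2.1) ** 2 + 0.05 * math.sin(5 * x)

def gradient_descent(f, x, lr=0.05, steps=200, h=1e-6):
    """Plain gradient descent with a central-difference gradient."""
    for _ in range(steps):
        g = (f(x + h) - f(x - h)) / (2 * h)
        x -= lr * g
    return x

x0 = gradient_descent(cheap, 10.0)              # many cheap steps reach the basin
x1 = gradient_descent(expensive, x0, steps=50)  # few expensive steps refine
assert abs(x0 - 2.0) < 0.01
assert expensive(x1) <= expensive(x0)
```

The abstract's caution applies equally here: if the two objectives favored very different minima, the cheap stage would steer the expensive stage into the wrong basin.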
A Highly Customizable MRI Reconstruction and Post-Processing Platform with Integrated High-Performance Computing for Cardiac MRI: A Basic Framework Description
We present a basic framework that integrates the MRI scanner, a local workstation, and Midway RCC's high-performance computing for efficient reconstruction and post-processing of Cardiac MRI data.
Forgotten Treasures: MPI, Algorithms, Data Structures and the C Language to Empower Distributed Web Crawler
Would you like to know the tricks for scaling your projects? Do you wonder how to use your knowledge of data structures and algorithms to scale? Do you think you have drifted away from the C language? If you answer at least
one “yes”, this project could be an eye-opener and might even surprise you.
Pattern Design for Self-Assembly of Block Copolymers by Computational Evolutionary Strategy
Design of the guiding patterns for directed self-assembly of block copolymers to a desired morphology is a challenging task. In the past, trial-and-error and random search methods have been used in order to find
the appropriate underlying pattern. These methods struggle to explore the parameter space of directed self-assembly problems comprehensively and efficiently. In this work, a computational evolutionary
strategy along with molecular simulations is used to find the optimum substrate chemical patterns for the desired block copolymer morphology. This evolutionary scheme using the substrate pattern-copolymer combinations
is an efficient method for substrate design which leads to faster convergence compared to a blind random search.
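The core loop of such an evolutionary search can be sketched in miniature (a hedged, generic (1+1) evolution strategy on a toy objective; the real work couples the search to molecular simulations of the copolymer, and all names and numbers below are illustrative):

```python
import math, random

random.seed(1)

# Toy stand-in for the expensive simulation: squared distance between the
# morphology produced by a pattern parameter vector and a target morphology.
TARGET = [0.6, -0.3, 0.1]

def mismatch(params):
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))

# Minimal (1+1) evolution strategy with simple step-size adaptation:
# mutate the current pattern, keep the child only if it scores better.
def evolve(generations=500, sigma=0.5):
    parent = [0.0, 0.0, 0.0]
    best = mismatch(parent)
    for _ in range(generations):
        child = [p + random.gauss(0.0, sigma) for p in parent]
        score = mismatch(child)
        if score < best:
            parent, best = child, score
            sigma *= 1.1    # success: widen the search
        else:
            sigma *= 0.98   # failure: narrow the search
    return parent, best

params, best = evolve()
assert best < 1e-2          # far closer to the target than the starting guess
```

Because each fitness evaluation stands in for a full molecular simulation, the practical gain of such a strategy over blind random search is measured in simulations saved, which is the convergence advantage the abstract reports.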
Learning Natural Language Morphology from a Raw Text
We work on the unsupervised learning of natural language morphology, devising methods that take a raw and unannotated text and induce various types of morphological structure. Our work makes extensive use of the
services---both cluster computing and data visualization---provided by the Research Computing Center (RCC). This poster presentation focuses on the visualization strategies, developed in collaboration with the
RCC, for the n-gram data structure of a raw text. The resultant visualization tools have offered much insight into both strengths and weaknesses of our methods and algorithms for inducing linguistic structure,
and therefore have pointed towards directions for further research.
Coarse-Grained Molecular Modeling of DNA: From Nanotechnology to Chromatin
Molecular-level information about DNA at nanometer length scales is of fundamental interest to nanotechnology and biology. Here we use coarse-grained molecular simulation to explore DNA-mediated self assembly and
the packaging of DNA as chromatin.
Controlling Anisotropy in Vapor-Deposited Glasses
In agreement with experiments, it is shown that the degree of anisotropy in vapor-deposited glasses can be controlled by tuning the substrate temperature. The mechanism of molecular orientation was investigated in atomistic
MD simulations of ethylbenzene and ...
CosmoSIS: Cosmological Survey Inference System
CosmoSIS is a flexible framework for the joint analysis of cosmological datasets and the exploration of cosmological models. It was designed with modularity at its heart to enable researchers to tackle the challenges
of working with modern datasets.
Classification of Magnetized Star-Planet Interactions: Bow Shocks, Comet-Like Tails, and In-Spiraling Streams
Stellar irradiation is believed to drive outflows from the surface of close-in exoplanets, a phenomenon that is supported by transit observations of Hot Jupiters. Assuming planetary magnetospheres similar to those
of our solar system, such outflows are expected to be magnetized. Moreover, the environment of short-period orbits consists of the sweeping stellar wind plasma, which is known to attain supersonic velocities.
This framework suggests the manifestation of complex magnetized star-planet interactions in systems harboring Hot Jupiters. In this work, we perform a series of parameterized 3D magneto-hydrodynamic numerical
simulations in order to provide a classification for the different types of interactions that may occur (Matsakos et al. 2014). We incorporate stellar and planetary outflows that are consistent with detailed
physical models and investigate case by case the exhibited dynamics.
Cosmic Walkers in Midway: Combined Probes Analysis with the Dark Energy Survey
Understanding dark energy, the cosmic ingredient that drives the accelerated expansion of our universe, is a key pursuit of modern cosmology. The Dark Energy Survey attempts to answer this challenge by observing
more than 300 million galaxies across an eighth of the sky, thereby obtaining data on the weak lensing and clustering of galaxies. The ultimate goal of our analysis is to put constraints on six key cosmological
parameters, but robust constraints can only be obtained by considering numerous additional nuisance parameters, which leaves a vast and degenerate parameter space to be explored. On Midway, we use the
emcee affine-invariant ensemble sampler to perform an efficient and easily parallelized MCMC likelihood analysis of this parameter space, thereby obtaining state-of-the-art parameter constraints that contribute
to our best current understanding of the cosmos.
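The affine-invariant "stretch move" of Goodman & Weare (2010) that emcee implements can be sketched in plain Python (a simplified serial version on a toy 2D Gaussian posterior, not the cosmological likelihood or the emcee code itself):

```python
import math, random

random.seed(2)

# Toy 2D Gaussian posterior standing in for the real likelihood.
MU = (1.0, -2.0)

def log_prob(x):
    return -0.5 * sum((xi - mi) ** 2 for xi, mi in zip(x, MU))

# One pass of the stretch move: each walker proposes a point on the line
# through itself and a randomly chosen companion walker.
def stretch_step(walkers, logps, a=2.0):
    ndim = len(walkers[0])
    for k in range(len(walkers)):
        j = random.randrange(len(walkers) - 1)
        if j >= k:
            j += 1                                      # companion != k
        z = (1 + (a - 1) * random.random()) ** 2 / a    # z ~ g(z) on [1/a, a]
        y = tuple(walkers[j][d] + z * (walkers[k][d] - walkers[j][d])
                  for d in range(ndim))
        logp_y = log_prob(y)
        # acceptance includes the z**(ndim-1) volume factor
        if math.log(random.random()) < (ndim - 1) * math.log(z) + logp_y - logps[k]:
            walkers[k], logps[k] = y, logp_y

walkers = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(32)]
logps = [log_prob(w) for w in walkers]
samples = []
for step in range(2000):
    stretch_step(walkers, logps)
    if step >= 500:                                     # discard burn-in
        samples.extend(walkers)
mean0 = sum(s[0] for s in samples) / len(samples)
mean1 = sum(s[1] for s in samples) / len(samples)
assert abs(mean0 - MU[0]) < 0.2 and abs(mean1 - MU[1]) < 0.2
```

The move's affine invariance makes it insensitive to linear correlations among parameters, which is exactly what makes it attractive for the degenerate parameter space described above; emcee additionally updates the walkers in two half-ensembles so the step parallelizes.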
Geant4 Simulation of Neutrons and Other Backgrounds in Coherent Neutrino
We have simulated the high energy neutron flux from the Spallation Neutron Source (SNS) in a CsI detector located in the basement of the facility. The neutron background is due to the interactions of 1 GeV protons
with mercury. We used Geant4, an object-oriented Monte Carlo simulation toolkit written in C++, on the Midway compute cluster at the University of Chicago. Importance sampling was used to bias the neutrons in order to track
them through large volumes of shielding. 180K CPU-hours of computation were used to optimize and run the simulations.
Simulations of Liquid Crystals
The de Pablo group uses three different descriptions to capture LC dynamics: atomistic, mesoscale, and continuum. Each of these models provides a different level of molecular detail, and the choice among them
depends on the level of detail a given problem requires. Continuum-based LC models have been successful for decades in predicting and understanding LC systems. Traditional numerical techniques, however, lack
fluctuations, and sufficiently complex LC systems can be challenging to model with these approaches. We propose a Monte-Carlo based continuum approach to avoid this limitation. By looking at a variety of LC
systems, we demonstrate that our proposed approach is an effective means of avoiding local energy minima. Recent experiments have demonstrated that when micrometer-sized particles are placed at the surface of
an LC droplet, these particles tend to form hexagonal 2D crystals. To understand this phenomenon, we use a continuum-based LC model to measure the free energy of various particle arrangements. With a combination
of theory, simulation and experiment, we now have a more complete understanding of these particle-decorated droplets. For some LC phenomena, a continuum model is insufficient. Recent experiments looking at the
interaction between complex lipids and LCs necessarily require a model with greater molecular detail. To simulate these systems, we use an atomistic approach. With this approach, we are able to complement experimental
findings by providing molecular details impossible to obtain through experiment.
Toward Turbulent Galaxy Formation Modeling
Typical astrophysical flows are highly turbulent, with Reynolds numbers as high as a few billion. In contrast, simulations have limited resolution, and the highest achievable Reynolds numbers in simulations are about
a thousand at best. Consequently, simulations hardly capture the transition from laminar flow to turbulence, and the turbulent cascade on small scales is not resolved. The unresolved turbulent motions exert a pressure
which is not taken into account in modern cosmological simulations. To model this additional pressure support we need a Sub-Grid Scale (SGS) model of turbulence. Following the method described by Schmidt et
al. (2014), such a model was implemented in the cosmological code ART. Fully turbulent box tests performed with ART show that the model works fairly well for developed isotropic turbulence. However, the model needs to
be improved to reproduce the dynamics of stratified turbulent flows, which are the most important ones from the practical point of view. In the future, such models can be used to improve the connection between the resolved
gas dynamics and sub-grid-scale models of star formation and feedback.