Abstract: The acquisition, analysis, and visualization of diffusion tensor magnetic resonance imaging (DT-MRI) are still evolving technologies. This article reviews the fundamentals of the data acquisition process and the pipeline leading to visual results that are interpretable by physicians in their clinical practice. The limitations of common approaches for visualizing the retrieved data are discussed, and a new statistical method is presented to assess the reliability of the acquired tensor field. A novel visualization method is proposed and discussed in light of neurophysiological considerations of the perception of colored patterns. It is argued that this method is more accurate for medical data while providing a nearly optimal visual stimulus. The method is evaluated in a study of a patient with a brain tumor.
Abstract: We present results for two colliding black holes (BHs), with angular momentum, spin, and unequal mass. For the first time, gravitational waveforms are computed for a grazing collision from a full 3D numerical evolution. The collision can be followed through the merger to form a single BH, and through part of the ringdown period of the final BH. The apparent horizon is tracked and studied, and physical parameters, such as the mass of the final BH, are computed. The total energy radiated in gravitational waves is shown to be consistent with the total initial mass of the spacetime and the apparent horizon mass of the final BH.
Abstract: Images in scientific visualization are the end product of data processing. Starting from higher-dimensional datasets, such as scalar, vector, and tensor fields given on 2D, 3D, or 4D domains, the objective is to reduce this complexity to two-dimensional images comprehensible to the human visual system. Various mathematical fields, in particular Differential Geometry, Topology (the theory of discretized manifolds), Differential Topology, Linear Algebra, Geometric Algebra, Vector Field and Tensor Analysis, and Partial Differential Equations, contribute to the data filtering and transformation algorithms used in scientific visualization. The application of differential methods is core to all these fields. The following chapter provides examples from current research on the application of these mathematical domains to scientific visualization and, ultimately, on the generation of images for the analysis of multidimensional datasets.
Abstract: Linux clusters have become an important tool in doing today's science and have made it possible to study very complex phenomena in a wide range of scientific fields. Data modeling, efficient data management, and data visualization have become a critical part of the scientific discovery process and present new challenges to users. While multiple cores create gigabytes of data, and parallel file systems in combination with parallel I/O libraries, such as MPI-IO, provide scalable and manageable I/O, computational power alone is not enough to deal with data complexity, extensibility, and portability. Scientists need a way to describe complex data and the relationships between data components that often extend beyond one application and one computational platform. This tutorial introduces two software packages: HDF5 (Hierarchical Data Format 5) and F5 (the Fiber Bundle HDF5 library), a thin library layer on top of HDF5 for storing, managing, and visualizing big and complex data. Created more than a decade ago at the National Center for Supercomputing Applications at the University of Illinois and currently maintained by the non-profit company "The HDF Group", HDF5 addresses data complexity and data management needs in today's high-performance computational environments, including Linux clusters. It provides a simple yet powerful data model along with portable and scalable access to data. Inspired by the mathematical concept of fiber bundles, F5 was created by Dr. Werner Benger at the Albert Einstein Institute and further developed at the Center for Computation and Technology at Louisiana State University. It offers a Common Data Model implemented using HDF5 and provides a simple Application Programming Interface (API) to formulate a wide range of data types used for scientific visualization, thereby achieving interoperability among applications and making possible the modeling and visualization of diverse physical phenomena.
HDF5 and F5 have been successfully used to model Hurricane Katrina, collisions of galaxies, and rotations of neutron stars and black holes, to name a few applications. This full-day tutorial will be equally useful to anyone who wants to learn HDF5 and to anyone who wants to master HDF5 in order to go from data to vision and beyond.