Enabling Technologies (ET)

The top priorities for ET are:

Visual Data Mining Research and Applications

Development of CTA-specific Visualization Systems

Development of CTA-specific Problem Solving Environments

Automation of Mesh Generation Tools

ET will concentrate on:

Visualization systems that exploit heterogeneous distributed computational
resources and can ingest enormous (10TB) datasets

One or two problem solving environments

Automation in geometry processing for mesh generation

Training in visualization, mesh generation, and data mining tools

The ET FY03 project tasks are:

ET-03-001: EnVis and EnVisU – Distributed High Performance Batchmode
Visualization (PI: Robert Moorhead, Mississippi State University [MSU])

ET-03-002: GGTK (Geometry-Grid Tool Kit) (PI: Bharat Soni, University of
Alabama-Birmingham [UAB])

ET-03-008: Development of an Integrated Simulation Environment (PI: Ralph
Noack, MSU)

ET-03-011: Building Interoperable Portals with Web Services (PI: Mary
Thomas, Texas Advanced Computing Center [TACC], UT-Austin)

The strategic roadmap for ET is as follows:

Functional Area: ET

Strategic Focus Areas             FY03            FY04-FY07
1. Distributed Vis                ET001           ET-DV
2. Large Scale Vis                ET001           ET-LSV, ET-FV
3. Visual Data Mining             --              ET-DM
4. Problem Solving Environments   ET008, ET011    ET-PSE
5. Grid Generation                ET002           ET-GG

FY04-07 areas of concentration for ET include:

ET-DV: Distributed Visualization — Once a solution to a CTA problem has been computed, the data is usually remote from those who need to see and analyze the results. To analyze data effectively, the computational task is often divided into parts, so that one part is carried out at the remote MSRC/ADC/DDC and the other on resources local to the analyst. For example, feature detection and extraction is often best done at the MSRC/ADC/DDC, while mapping to 2D/3D graphical primitives is often best done on the user's local workstation. The issues in solving this problem start with balancing computational demands against communication bandwidth. The issues then become where it is best to do which computation; where to put the long-haul transmission in a visualization pipeline that ingests 4D data and outputs imagery (after reading the data, after every time step, after constructing 3D graphical objects, after projecting to 2D graphical objects, after rendering to images, etc.); and how much speed to give up to provide more flexibility to the analyst. Performance is probably a more critical issue than portability in the end, but because the collection of resources changes continuously, portability is probably a significant issue initially. (Generally, resources at MSRCs/ADCs/DDCs are known, but resources on users' desktops are not.) Projects that address ingesting, coding, transferring, decoding, and visualizing enormous (10TB) raw datasets need to be undertaken. Existing frameworks (e.g., vtk) should be leveraged where possible. Security issues (Kerberos) must be addressed.
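
As one illustration of the placement trade-off, the sketch below scores candidate cut points by the time needed to ship each stage's output over the long-haul link. The stage names, output sizes, and bandwidth are hypothetical placeholders, not measurements from any MSRC/ADC/DDC.

    # Illustrative sketch: estimate the cheapest place to cut a remote/local
    # visualization pipeline. Stage output sizes and the assumed bandwidth
    # are hypothetical placeholders, not measured values.

    # (stage name, size of its output in GB) for one step of a 4D dataset
    STAGES = [
        ("raw data", 40.0),              # read from disk at the MSRC/ADC/DDC
        ("extracted features", 2.0),     # after feature detection/extraction
        ("3D graphical objects", 0.5),   # iso-surfaces, streamlines, etc.
        ("2D graphical objects", 0.1),   # after projection
        ("rendered images", 0.01),       # final imagery
    ]

    LONG_HAUL_GBPS = 0.1  # assumed wide-area bandwidth in GB/s

    def transfer_seconds(size_gb, bandwidth=LONG_HAUL_GBPS):
        """Time to ship one stage's output over the long-haul link."""
        return size_gb / bandwidth

    # The cheapest cut by transfer time alone is after the smallest output;
    # real placement must also weigh compute power on each side and how much
    # interactivity the analyst gives up downstream of the cut.
    for name, size in STAGES:
        print(f"cut after {name:22s}: {transfer_seconds(size):8.1f} s/step")

    best = min(STAGES, key=lambda stage: stage[1])
    print(f"cheapest cut (bandwidth only): after {best[0]}")
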
ET-LSV: Large-scale Visualization — Although visualization of high-order entities, like features and objects, is the ultimate
goal, visualization of raw data is often first required. Analysts are legitimately
skeptical of too much automation too soon. To provide analysts with a
sufficient comfort level, they often need to be able to browse and explore
large datasets. They need to be able to do data mining and knowledge
discovery in a labor-intensive way before trusting an automated tool, if for no
other reason than to generate some test cases. ET-LSV is not necessarily
interactive visualization and in fact often is batch-mode visualization. These
projects should produce algorithms, tools, and knowledge bases that allow
rapid efficient deployment of analysis systems. Issues that need to be
addressed under ET-LSV to improve the working environments for DoD users
include application-specific coding/compression schemes that compress the
data more or run faster, automated sizing metrics (to help determine
appropriate chunking of data), and visualization algorithms and user interfaces
that help an analyst to better explore and understand massive datasets. User
interfaces are an often-neglected area in CTA programs that could benefit from
some of the research done by cognitive scientists and human-computer
interface specialists. User interfaces include issues like how to best organize
data accesses and computations, as well as contextual display issues like text
annotation, time line indicators, and latitude/longitude lines.
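
One possible form for the automated sizing metrics mentioned above: the sketch below repeatedly halves the largest axis of a grid until a single chunk fits an assumed per-node memory budget. All constants are illustrative assumptions, not requirements.

    # Illustrative chunk-sizing metric for browsing very large datasets.
    # The budget and bytes-per-cell figures are assumptions for the sketch.

    import math

    MEMORY_BUDGET = 2 * 1024**3   # assume ~2 GB usable memory per node
    BYTES_PER_CELL = 4 * 5        # e.g., five float32 variables per cell

    def chunk_plan(grid_shape, budget=MEMORY_BUDGET):
        """Halve the largest axis until one chunk fits in the budget."""
        shape = list(grid_shape)
        while math.prod(shape) * BYTES_PER_CELL > budget:
            axis = shape.index(max(shape))
            shape[axis] = max(1, shape[axis] // 2)
        return tuple(shape), math.prod(shape) * BYTES_PER_CELL

    # A single time step of a large 3D grid, standing in for real CTA data.
    shape, nbytes = chunk_plan((2048, 2048, 1024))
    print(f"chunk shape {shape}, {nbytes / 1024**2:.0f} MB per chunk")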

ET-FV: Feature-based Visualization — Ultimately it is the features, not the raw data, that the analyst seeks: where are the (shock, strong, peak) waves and how strong are they; where are the eddies, and how big, how strong, how fast; where is the information content; where is the target; where are the hot spots; where are the narrow junctions; etc. Much research has been done on feature detection, feature extraction, and feature classification. This work needs to be captured in a body of knowledge or library so that it is useful across CTAs. The issue then to be addressed in this area is how best to visualize those features. Developing these visualization methods will require multidisciplinary projects involving personnel knowledgeable both in the various CTAs and in ET technology. Projects addressing this area that produce software modules within a standard framework like vtk would be highly desirable. Exploitation of scripting languages (e.g., tcl/tk, Python, Perl) would help the portability of the developed tools.
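
To make the feature idea concrete, the sketch below thresholds a scalar field and labels each connected region as a feature, reporting its size, peak strength, and centroid. The field is synthetic and the function boundary is an assumption; a production module would sit behind a standard framework interface such as a vtk filter.

    # Illustrative feature detection/extraction: threshold a scalar field
    # and treat connected regions above the threshold as features. The data
    # here is synthetic, standing in for a real CTA dataset.

    import numpy as np
    from scipy import ndimage

    def extract_features(field, threshold):
        """Return a label array plus a summary record per feature."""
        labels, nfeatures = ndimage.label(field > threshold)
        summaries = []
        for i in range(1, nfeatures + 1):
            region = labels == i
            summaries.append({
                "id": i,
                "size": int(region.sum()),            # cells in the feature
                "peak": float(field[region].max()),   # feature strength
                "centroid": ndimage.center_of_mass(field, labels, i),
            })
        return labels, summaries

    # Synthetic 3D field with two hot spots.
    rng = np.random.default_rng(0)
    field = rng.normal(0.0, 0.1, (32, 32, 32))
    field[5:9, 5:9, 5:9] += 1.0
    field[20:24, 20:24, 20:24] += 2.0

    for f in extract_features(field, threshold=0.5)[1]:
        print(f"feature {f['id']}: size={f['size']} peak={f['peak']:.2f}")
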
ET-DM: Data Mining — ET-FV and ET-DM are highly supportive of each other, but have distinct goals. ET-DM projects will focus on the process
of data mining and the computational techniques and methods for the
extraction of useful information from data. To improve working environments
for DoD users in this area, more data mining techniques need to be developed
and the existing techniques need to be tested on more datasets. Different DM
techniques (clustering, trees, neural networks, etc.) work better on different
data sets. A better body of knowledge needs to be developed about when and how to apply each technique, and libraries of techniques need to be built. To
maximize return on investment, it would be reasonable for some ET-DM
projects to be done in collaboration with some SIP-IE projects, especially
those based on image analysis. To be able to mine data, some idea of what is
being sought is useful. A greater body of knowledge as to what needs to be
mined and/or what knowledge needs to be discovered would be useful. To
improve the working environments for DoD users, projects in this area need to
involve a domain expert, not just a mining expert. This area is viewed as a
high-risk area, as many projects have promised much and delivered little.
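
The try-several-techniques-and-score workflow such a body of knowledge would capture can be as simple as the sketch below, which compares two clustering methods on synthetic data. The use of scikit-learn and of the silhouette criterion are assumptions of the sketch, not program mandates.

    # Illustrative "which technique fits this data?" comparison for ET-DM.
    # Synthetic blobs stand in for feature vectors mined from CTA output.

    import numpy as np
    from sklearn.cluster import AgglomerativeClustering, KMeans
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(0)
    data = np.vstack([rng.normal(center, 0.3, (50, 2))
                      for center in ((0, 0), (3, 0), (0, 3))])

    candidates = {
        "k-means": KMeans(n_clusters=3, n_init=10, random_state=0),
        "agglomerative": AgglomerativeClustering(n_clusters=3),
    }

    # One scoring criterion; a real knowledge base would also vary cluster
    # counts, distance metrics, and technique families (trees, neural nets).
    for name, model in candidates.items():
        labels = model.fit_predict(data)
        print(f"{name:14s} silhouette = {silhouette_score(data, labels):.3f}")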

ET-PSE: Problem Solving Environments — Initial projects in this area should develop PSEs for some of the most heavily used codes at the MSRCs/ADCs/DDCs. In particular, initial development should be on code-specific portals instead of more abstract PSEs. Once several application-code-specific PSEs have been developed (2-3 years), significant infrastructure development should accelerate, probably as the computational grid infrastructure becomes available to the DoD user. The technology underpinnings of portal-based PSEs continue to evolve rapidly. However, in anticipation of the need to exploit the computational grid infrastructure, DoD involvement in the Global Grid Forum would be appropriate. Involvement in this body would ensure the DoD remains in contact with the larger computing portals and grids communities and is able to influence the standards and activities of this group. Projects in this subarea could be viewed as developing/tracking enabling technology for enabling technology!
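
A code-specific portal can begin as modestly as the sketch below: a single hypothetical solver ("flow3d" here, an invented name) exposed through an HTTP submit/status interface. A deployable portal would add Kerberos authentication, batch-queue integration, and grid middleware, none of which is shown.

    # Minimal code-specific portal sketch: submit and query jobs for one
    # application code over HTTP. The solver name and routes are invented.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    JOBS = {}      # job id -> status; a real portal would query the queue
    NEXT_ID = [1]

    class PortalHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            if self.path == "/flow3d/submit":
                job_id, NEXT_ID[0] = NEXT_ID[0], NEXT_ID[0] + 1
                JOBS[job_id] = "queued"   # stand-in for a batch submission
                self._reply({"job": job_id, "status": "queued"})
            else:
                self.send_error(404)

        def do_GET(self):
            if self.path.startswith("/flow3d/status/"):
                job_id = int(self.path.rsplit("/", 1)[1])
                self._reply({"job": job_id,
                             "status": JOBS.get(job_id, "unknown")})
            else:
                self.send_error(404)

        def _reply(self, payload):
            body = json.dumps(payload).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), PortalHandler).serve_forever()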

ET-GG: Grid Generation — An important element, and a current user need, is the accurate and rapid processing of the geometric models being designed and analyzed. Grid (or mesh) generation (GG) technology bridges the gap between digital geometry models and computational simulations for engineering analyses. Current GG techniques are being developed for incorporation in large production, or commercial, software systems. Such an approach limits the use of emerging algorithms from outside developers. The work in GG needs to focus on components and standard interfaces that can be used among larger software systems (a sketch of such a component interface follows the list below). The underlying geometry is moving from CAD/CAM-based to solid models. Adaptive and moving meshes need to be developed, with the goal being dynamic meshes. The mesh technology needs to support multiple disciplines (multidisciplinary optimization). The underlying coding needs to move to object-oriented technology. The modules or components need to be thread-safe and operate in parallel and distributed computing environments. The creation of meshes needs to become more automatic and exploit some intelligence. In particular, users need:
- Dynamic solution-adaptive unstructured mesh capability for viscous-dominated flows
- Temporally/spatially deforming geometry/mesh capability for multidisciplinary analysis
- Parametric, automatic, and intelligent complex structured, unstructured, and hybrid mesh generation for multidisciplinary applications
- Parallel unstructured/generalized mesh generation with dynamic load balancing
- Conservative interpolation techniques for multidisciplinary interactions
- Robust mesh generation methods for multidisciplinary applications, including geo-spatial processes which may not include man-made models
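
The component-and-standard-interface style argued for above might look like the sketch below, where any generator, structured or unstructured, plugs into a host code through one small, immutable-data interface. Class and method names are hypothetical illustrations, not an existing GGTK API.

    # Sketch of a standard mesh-generation component interface. Immutable
    # mesh objects (never mutated in place) are one simple route to the
    # thread-safety the components need in parallel/distributed drivers.

    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Mesh:
        points: tuple   # node coordinates
        cells: tuple    # element connectivity

    class MeshGenerator(ABC):
        """Interface a host code programs against, whatever the mesh type."""

        @abstractmethod
        def generate(self, geometry) -> Mesh: ...

        @abstractmethod
        def adapt(self, mesh: Mesh, solution) -> Mesh:
            """Return a new solution-adapted mesh; inputs are not mutated."""

    class UnstructuredGenerator(MeshGenerator):
        def generate(self, geometry) -> Mesh:
            # Placeholder: a real component would triangulate the geometry.
            return Mesh(points=tuple(geometry), cells=())

        def adapt(self, mesh: Mesh, solution) -> Mesh:
            # Placeholder refinement; returning a fresh Mesh keeps the
            # component safe to call from concurrent drivers.
            return Mesh(points=mesh.points, cells=mesh.cells)

    mesh = UnstructuredGenerator().generate([(0, 0), (1, 0), (0, 1)])
    print(len(mesh.points), "boundary points accepted")
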
The ET UAP includes:

Alan Wallcraft (NRL/SSC)

John E. West (ERDC MSRC)

Jerry Clarke (ARL MSRC)

Frank Witzeman (AFRL)

Dan Kedziorek (Army Tank-automotive and Armaments Command [TACOM])

Aram Kevorkian (SPAWAR Systems Center San Diego [SSCSD])

Pete Gruzinskas (NAVO MSRC)