ARSC T3E Users' Newsletter 155, November 5, 1998

SC98 Issue

This is a special issue on Supercomputing '98 (next week in Orlando): it's a "What's Happening" for attendees and non-attendees alike.

The next regular T3E Newsletter issue will come out on Nov. 20th.

ARSC Research Booth

ARSC's mission is to support high performance computational research in science and engineering with an emphasis on high latitudes and the Arctic.

At our booth (# R760) you'll see posters, 3-D demos, videos, us (!), and other stuff. Please drop by!

ARSC Booth Posters

  • Oceans "New Insight Into Global Climate Change" Wieslaw Maslowski, Naval Post Graduate School
  • Oceans "Model Reveals Arctic Ocean Tides" Andrey Proshutinsky and Igor Polyakov, UAF
  • Arctic Engineering "Arctic Engineer Works to Smooth Alaska's Roads" Doug Goering, UAF
  • Grand Challenge "Unsteady Dynamics of the Maneuvering Submarine" Roger Briley, DOD-HPCMO
  • Earth Sciences "Using Satellite Data for Soil Moisture Mapping" Larry Hinzman, Neal Meade, and Douglas Kane, UAF
  • Earth Sciences "Satellite Data and Supercomputers Used in Topographic Mapping" Rick Guritz, UAF
  • Space "Eulerian Parallel Polar Ionosphere Model" Serge Maurtis, ARSC
  • T3E Newsletter "Four Years and Counting" Tom Baring and Guy Robinson, ARSC
  • Supercomputing in the Far North (Introduction to ARSC Hardware and Facilities) ARSC

ARSC Booth 3-D interactive demos (two of several described here)

Alaskan and Western Arctic meteorological phenomena

You can "dive into" or "hover above" this 3D animation of an actual synoptic scale cyclonal-frontal weather system from the Aleutian islands. The particular weather system contained hazardous flying conditions including in-flight icing and sever turbulence.

This research was done by Jeff Tilley of the UAF Geophysical Institute. Jeff used NCAR's mesoscale meteorological modeling system, MM5, and Vis5D. (See NCAR's booth for more on MM5.)

Active soil depth for an area from north of Alaska's Brooks Range to the Arctic Ocean

The active layer is the zone of seasonal thaw above the permafrost. It is the locale of biological and physical processes: it affects the growth of plants and, in turn, the migration and survival of animals in the tundra environment.

As this thin layer is quite sensitive to damage, it is of interest to the Army, the oil industry, and other groups concerned with the use and management of natural resources in the Arctic.

The simulation shows 107 days, from June onwards, and lets you zoom in and out and otherwise probe the entire region for areas of interest.

This research is based on the work of Larry Hinzman of the Water and Environmental Research Center, UAF.

ARSC Booth Research Highlights Video

Video program:

  1. Ocean
    • Arctic Ocean Parallel Model
  2. Earth Sciences
    • Kuparuk Watershed Thermal Model
    • MM5 Alaska Model
    • Healy Clean Coal Project
  3. Space
    • UAF Eulerian Parallel Polar Ionosphere Model
    • Reconnection of Magnetic Flux Tubes
  4. Computer Art and Animation
    • Comet
    • Rotating Sphere
    • The Ball and the Flower
    • Firewind
    • Space Ball
    • Boggy and Tripod
    • Sledding Snowman

Chris Hartman's Demo at HPCMO Booth

Gesture analysis and the Body Language User Interface (BLUI) will be demonstrated on the ImmersaDesk on Wednesday, Nov. 11 at 3 pm in the HPCMO booth (number R780).

Press Release:

Most people can imagine using a computer program to produce art. But imagine having the ability to run a 3D program with your body--without the aid of a mouse or keyboard. This is just the kind of technology that Chris Hartman, a UAF Computer Science faculty member, and Bill Brody, head of UAF's Art Department, are striving for. (Both Hartman and Brody are ARSC/UAF joint faculty members.)

Hartman and Brody are using the virtual reality technology at the Arctic Region Supercomputing Center to design a program called Body Language User Interface (BLUI).

"It's a three dimensional drawing program," says Hartman, "But what makes it special is that it's based on your gestures."

The user draws on a virtual reality screen with a wand, and can "undo" with a shake of the head, or "quit" by dropping his or her hands to the side.

In order for BLUI to work, two TV cameras placed at 45-degree angles on either side of an ImmersaDesk visualization screen give the computer a pair of "eyes." Bill and Chris will program these "eyes" to recognize the hand as a tool. As an open hand swipes across a line on a 3D sketch, the line will widen. A grabbing motion will replace the usual "select" menu.

"Eventually Bill wants to sculpt--to do things you normally wouldn't be able to do with certain materials," says Hartman. This would mean the ability for an artist to add to a rock sculpture, when in reality an artist can only subtract rock to create a sculpture. The researchers use to video capture human gestures which are used to drive the program. But some information is still based on how the user moves the wand and where the user looks on the screen. "Ideally, you eventually won't use the wand at all," says Hartman.

Don Morton's Poster

The biological and physical processes which characterize Arctic ecosystems are affected by intricately related thermal and hydrologic processes.

This work describes the coupling of two parallel computer models that, before this project, independently simulated thermal and hydrologic processes. The coupling of the two models allows the thermal model to be influenced by recent hydrologic calculations, and vice versa. Feedback loops existing in the field are captured, providing a more realistic simulation of ecosystem interactions.

The two original codes have been coupled by combining them into an SPMD format, communicating through the use of MPI intra- and inter-communicators on the CRAY T3E as described in newsletter #146:

/arsc/support/news/t3enews/t3enews146/index.xml

This approach allows the original codes to remain largely unmodified, retaining parallel characteristics while facilitating the exchange of data for coupling. Preliminary results from the coupled model show solutions that exhibit a large degree of heterogeneity, which is expected when the hydrologic and thermal models drive each other.

Don Morton, Department of Computer Science, The University of Montana
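
As a rough sketch (hypothetical code, not Morton's; the names and the half-and-half split of PEs are invented for illustration), the following shows the communicator plumbing such a coupling implies: MPI_COMM_WORLD is split into two intra-communicators, one per model, which are then joined by an inter-communicator over which the models exchange coupling data.

      program couple_sketch
c
c Hypothetical SPMD coupling skeleton: the first half of the PEs
c play the "thermal" model, the second half the "hydrologic" model.
c Run on at least 2 PEs.
c
      implicit none
      include 'mpif.h'
      integer ierr, wrank, wsize, lrank, color, rleader
      integer intra, inter, status(MPI_STATUS_SIZE)
      double precision field

      call MPI_INIT(ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD, wrank, ierr)
      call MPI_COMM_SIZE(MPI_COMM_WORLD, wsize, ierr)

c Split MPI_COMM_WORLD into two intra-communicators, one per model.
      color = 0
      if (wrank .ge. wsize/2) color = 1
      call MPI_COMM_SPLIT(MPI_COMM_WORLD, color, wrank, intra, ierr)
      call MPI_COMM_RANK(intra, lrank, ierr)

c Build the inter-communicator; the remote leader is the first
c world rank of the other group.
      rleader = wsize/2
      if (color .eq. 1) rleader = 0
      call MPI_INTERCOMM_CREATE(intra, 0, MPI_COMM_WORLD,
     &                          rleader, 99, inter, ierr)

c Exchange one coupling value between the two group leaders.
      field = dble(color)
      if (lrank .eq. 0) then
         call MPI_SENDRECV_REPLACE(field, 1, MPI_DOUBLE_PRECISION,
     &                             0, 1, 0, 1, inter, status, ierr)
         print *, 'group', color, 'received', field
      endif

      call MPI_FINALIZE(ierr)
      end

Because each model keeps its own intra-communicator for internal communication, collective operations in the original codes run unchanged; only the data-exchange step uses the inter-communicator.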

Guy Robinson's Poster

Guiding User Improvements Through Tool Application

Abstract:

Computer Centres have often noted the difficulty in encouraging use of the various software development tools currently available. This is disappointing since the latest generation of tools can, when applied correctly, solve many problems which cause users considerable trouble when developing code, modifying existing code, or attempting to tweak code for performance optimization.

ARSC has provided several tools for users, but even with traditional training courses and on-line materials to promote the availability of these tools, take-up by users has been a slow and uncertain process.

Several means of helping users get started are presented, along with examples from real users solving actual problems and achieving real performance improvements.

Guy Robinson, ARSC

OSC Research Booth

The Ohio Supercomputer Center (OSC) will present Collaborations for the 21st Century - a showcase of projects setting the tone for Ohio's future in high performance computing and communications.

OSC's collaborative efforts with its users benefit Ohio research and may affect the way the world turns in the next millennium. This year OSC will showcase the work of researchers from 12 Ohio universities.

This research ranges from using molecular modeling to develop new drugs that may someday help individuals with Alzheimer's disease regain their memory, to examining ways to improve industrial mixing processes. This work will impact both equipment manufacturers and the polymer processing industry.

Check out OSC's booth, number R766, located in the SC98 Research Exhibits area. For more information on OSC, visit:

http://www.osc.edu

DoD HPCMP Booth

[ Note: Wednesday at 3pm--ARSC participation. ]

The DoD HPCMP booth is organized around the theme "Wrights to Bytes, Computing the Future." The booth will feature HPC research exhibits and provide information on all the computational science areas investigated using DoD HPCMP computing resources.

Below is the current schedule for the HPCMP booth (R-790) at SC '98. A detailed schedule is posted at:

http://www.asc.hpc.mil/sc98/



Monday, November 9, 1998


7:00-8:00:   MetaComputing Demonstration
8:00-9:00:   Bradley Tank CAVE Demonstration

  
Tuesday, November 10, 1998


10:00-11:00: Computational Electronics and Nanoelectronics, Advanced
             Visualization Network Using AVS Express, and CAPTools
             Demonstrations
11:00-12:00: Aircraft Wing Deformation and Axisymmetric Damage Model,
             and Dynamic Fracture in Gallium Arsenide Demonstrations
12:00-1:00:  MetaComputing Demonstration and the Wright Brothers (Mbone)
1:00-2:00:   Dual-Level Parallel Analysis of Harbor Wave Response Using
             MPI and OpenMP, Variable Charge Molecular Dynamics
             Simulations, and Real-Time Video Feed From DISA-RCC in
             Ohio via Mbone
2:00-3:00:   FDTD Code Analysis, Aircraft Wing Deformation and
             Axisymmetric Damage Model, and Tank Towing Fuel Trailer
             Demonstrations; and the Wright Brothers (Mbone)
3:00-4:00:   Run Time Visualization of an FMC Computational Chemistry
             and Materials Science Application in a Distributed
             Interactive Computing Environment, Sparse Matrix Ordering
             for Parallel Computation Demonstration, and Web-Based
             Training Demonstration
4:00-5:00:   Run Time Visualization of CTH Computational Structural
             Mechanics Application in a Distributed Interactive
             Computing Environment, and SC98 News
5:00-6:00:   Virtual Tanker Demonstration

  
Wednesday, November 11, 1998


10:00-11:00: Computational Electronics and Nanoelectronics and CAPTools
             Demonstrations
11:00-12:00: Aircraft Wing Deformation and Axisymmetric Damage Model,
             SciVis Animation, Tank Towing Fuel Trailer, and Web-Based
             Training Demonstrations
12:00-1:00:  HPC Challenge Competition and the Wright Brothers (Mbone)
1:00-2:00:   Simulation of an F-18 With Stores
2:00-3:00:   Molecular Docking, Web Computational Electronics and
             Nanoelectronics, and Aircraft Wing Deformation and
             Axisymmetric Damage Model Demonstrations
3:00-4:00:   Immersive Gesture-Driven Interface Demonstration
             and the Wright Brothers (Mbone)
4:00-5:00:   Haptic Device Demonstration and SC98 News
5:00-6:00:   Wave Field Animation and Bradley Tank CAVE Demonstrations

  
Thursday, November 12, 1998


10:00-11:00: Aircraft Wing Deformation and Axisymmetric Damage Model
             Demonstration
11:00-12:00: Virtual Tanker Demonstration
12:00-1:00:  MetaComputing Demonstration, Simulation of an F-18 With
             Stores Demonstration, and the Wright Brothers (Mbone)
1:00-2:00:   Navigation of Time-Varying Multi-Layered Ocean Circulation
             Models and Virtual Reality Modeling Language
             Demonstrations, and SC98 News
2:00-3:00:   Wright Brothers (Mbone)
3:00-4:00:   Display of Molecular Structure and Properties, Sparse
             Matrix Ordering for Parallel Computation, and Web-Based
             Training Demonstrations

For more information on the HPCMP, see our home page at:

http://www.hpcmo.hpc.mil/

or call (703) 812-8205.

The "Supercomputing98" issue of Parallel Computing Research

The "Supercomputing98" issue of Parallel Computing Research, the newsletter of the Center for Research on Parallel Computation, is now available at:

http://www.crpc.rice.edu/CRPC/newsletters/fal98/ .

Previous issues and articles can be found at:

http://www.crpc.rice.edu/CRPC/newsletters/index.html .

At ARSC November 16, 1998: CRAY SV-1 Presentation

ARSC is pleased to host Patti Langer, Gary Shorrell, and Frank Chism of SGI, who will provide an in-depth presentation on the new CRAY SV-1 supercomputer.

The talk is scheduled for Monday, November 16th from 10:00 to 12:00 in Butrovich room 109 (the Board of Regents Conference room) on the UAF campus.

The CRAY SV-1 is a first-generation scalable vector supercomputer. It combines high single-processor performance with scalability from the entry level to the teraflops level. To quote from a recent SGI press release regarding the CRAY SV-1:

Boasting several bold technical innovations, the CRAY SV-1 supercomputer is the most technologically sophisticated supercomputer on the planet.

The CRAY SV-1 supercomputer is also the first to use vector cache memory, boosting actual processor performance by dramatically increasing effective memory bandwidth. It is also the first to feature an adjustable-size processor. Called the Multi-Streaming Processor (MSP), this technology adds efficiency by allowing each processor to be configured as one ultra-performance four-gigaflop processor or as four single-gigaflop processors.

In addition, with a maximum of eight pipes, the CRAY SV-1 supercomputer offers a processor architecture optimized for high performance on real world workloads rather than mere peak performance.

The presentation will cover the following topics:

  • Technology roadmap
  • SV-1 product highlights
  • Detailed hardware overview
  • Detailed MSP and SuperCluster overview
  • Benchmark results

Everyone interested in the latest in computing technology is invited to attend. More information on SGI supercomputer systems, including the SV-1, can be found at:

http://www.cray.com

Quick-Tip Q & A


Bonus Answers: 
  {{ Why does Fortran array indexing start at 1 while C starts at 0? }}

  Here are two late reader responses to the Fortran/C question--thanks! 
  ########
  As the attached snippet from a J90 session shows, Cray and most
  other Fortran 77 compilers have handled arbitrary lower array bounds
  for a long time.  Like maybe twenty years.

  The question really is, why would anyone still think that Fortran
  arrays 'have' to start at 1?

  snippet follows:

  j90% cat bounds.f
        program bounds
  c
  c Show use of arbitrary lower bounds in Fortran 77 arrays
  c
        dimension pre77(10)
        dimension c(0:9)
        dimension f77(-9:0)

        do i=1,10
         pre77(i)=i
         c(i-1)=i
         f77(-10+i)=i
        end do

        print *,"pre77=",pre77
        print *,"c    =",c
        print *,"f77  =",f77

        end
  j90% cf77 -o bounds bounds.f
  j90% ./bounds
   pre77=1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.,  9.,  10.
   c    =1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.,  9.,  10.
   f77  =1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.,  9.,  10.
  j90%

  ########

  It's been too many years to have 100% confidence in what I'm about to
  say, but in the early 80's I worked on Burroughs mainframes, whose OS
  is written in a variant of Algol.  It allowed any lower bound.  And,
  a particularly nice feature of the Burroughs mainframe was that
  bounds were enforced in hardware - you could not turn bound checking
  off!  Absolutely no running off the end of your array without knowing
  it.

  ########

  Editor's Notes:  

    In cf90:

      "f90 -Rb" -- Enables run-time checking of array bounds
      "f90 -Rs" -- Enables run-time checking of character substring bounds

    ---

    Co-array image numbers (which "replace" PE numbers in co-array
    Fortran) start counting at 1 by default.  Thus, they are consistent
    with Fortran arrays.

    However, like arrays, they can be declared to start counting at any
    convenient integer.  The co-array, "imgarr" in this example has
    both array indices and image numbers starting at 0:

       program co_hello
       implicit none
       integer N
       parameter (N=128)
       integer :: myimg, masterimg, totimgs, i

       integer :: imgarr (0:N)[0:*]

       masterimg = 0
       myimg = this_image () - 1
       totimgs = num_images ()

       imgarr(myimg) = myimg             ! local assignment

       call sync_images ()               ! synchronize

       do i = 0, totimgs - 1             ! global exchange
         imgarr(i) = imgarr(i)[i]
       enddo

       if (myimg.EQ.masterimg) then      ! Master image prints array
         do i = 0, totimgs - 1
           print*, "Image array value: imgarr(", i, ") = ", imgarr(i)
         enddo
       endif

       end 


Q: Arctic survival skills answers still being accepted. (In response
   to one e-mail, no, neither Guy nor Tom has ever driven away
   without "unplugging."  But it could happen...) 
[ Answers, questions, and tips graciously accepted. ]
Current Editors:
  Ed Kornkven, ARSC HPC Specialist, ph: 907-450-8669
  Kate Hedstrom, ARSC Oceanographic Specialist, ph: 907-450-8678
Arctic Region Supercomputing Center
University of Alaska Fairbanks
PO Box 756020
Fairbanks AK 99775-6020
Archives:
    Back issues of the ASCII e-mail edition of the ARSC T3D/T3E/HPC Users' Newsletter are available by request. Please contact the editors.