ARSC HPC Users' Newsletter 299, September 10, 2004

Combined X1 FAQ

The four DoD X1 sites (ERDC, AHPCRC, SMDC, and ARSC) have produced a collaborative Cray X1 FAQ.

The X1 FAQ is linked from the "Computational Environments" (CE) area of the PET On-line Knowledge Center. Start here:

https://okc.erdc.hpc.mil/index.jsp

Navigate to the FAQ by clicking on:
  1. the "CE" link under the "Cross-Cutting Areas" heading, then
  2. "Papers/Pubs", and finally
  3. "Cray X1 Frequently Asked Questions."

The FAQ will be updated, well... frequently, or at least quarterly. If you've got some burning X1 questions, send them to your center's help desk or this newsletter, and they may make their way into this FAQ for the benefit of all.

Retrieving Compile Information From an IBM Executable

[ Thanks to Don Bahls of ARSC. ]

IBM Power4 systems such as iceberg and iceflyer support both 32- and 64-bit compilation. This can leave you wondering, later, which mode was used to create a given executable. Fortunately, there's a simple way to check an executable using the standard Unix "file" and "dump" commands.
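
The sample.c used in the examples below isn't shown in this article; any trivial C source will do. A hypothetical stand-in:

  #include <stdio.h>

  int main(void)
  {
      /* any small program suffices for comparing compile modes */
      printf("hello from sample\n");
      return 0;
  }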

"file" command

Below is a sample compilation line using the 64-bit mode.


  iceberg2 1% cc -q64 sample.c -o sample

Issuing the "file" command shows that the executable is indeed a 64-bit executable:

  iceberg2 2% file sample
  readfile: 64-bit XCOFF executable or object module not stripped

Alternatively, if the source had been compiled in 32-bit mode, the "file" command tells us that the executable is a RISC System/6000 executable:

  iceberg2 3% cc -q32 sample.c -o sample
  iceberg2 4% file sample
  readfile: executable (RISC System/6000) or object module not stripped

"dump" command

The 32-bit compilation mode also allows the maximum heap and stack sizes to be adjusted (by passing the -bmaxdata and -bmaxstack flags to the linker). The "dump" command has an option which reads these values from the header of the executable file.

Here's a compile command which sets the maximum heap and stack sizes to 512 MB each. (NOTE: 0x20000000 is 536,870,912 in decimal, or 512 MB.)


  iceberg2 5% cc -q32 -bmaxdata:0x20000000 -bmaxstack:0x20000000 \
                  sample.c -o sample

In the following "dump" command, we pass the "-X32" flag because we know the executable was compiled in 32-bit mode. The -o flag instructs dump to print the "optional" header of the executable, which contains the maxdata and maxstack values:

  iceberg2 6% dump -X32 -o sample
  
  readfile:

                          ***Object Module Header***
  # Sections      Symbol Ptr      # Symbols       Opt Hdr Len     Flags
           4      0x000009d4            121                72     0x1002
  Timestamp = 1093630304
  Magic = 0x1df  
  
                          ***Optional Header***
  Tsize        Dsize       Bsize       Tstart      Dstart
  0x000003a8  0x000000fc  0x00000004  0x10000128  0x300004d0

  SNloader     SNentry     SNtext      SNtoc       SNdata
  0x0004      0x0002      0x0001      0x0002      0x0002    

  TXTalign     DATAalign   TOC         vstamp      entry
  0x0002      0x0003      0x30000574  0x0001      0x30000560

  maxSTACK     maxDATA     SNbss       magic       modtype
  0x20000000  0x20000000  0x0003      0x010b        1L

Had we incorrectly given "dump" the "-X64" option for this 32-bit executable, it would have reported an error message:

  iceberg2 7% dump -X64 -o sample
  
  readfile:
  dump: readfile: 0654-108 file is not valid in the current object file mode.
        Use the -X option to specify the desired object mode.

Don't know if your executable is 32- or 64-bit? Find out using the "file" command as described above, and then run "dump."
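
Alternatively, the AIX object tools, "dump" included, should also accept a "-X32_64" option, which handles either object mode (check the man page to be sure):

  iceberg2 8% dump -X32_64 -o sample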

Multi-Streaming C Programs on X1

ARSC vector specialist Lee Higbie has done exhaustive testing of a small set of loops, coded in both C and Fortran.

He's discovered that aggressive optimization using compiler options improves streaming (and performance) notably for the C version. In the Fortran version, however, the default compilation options produce code which is as well-streamed as that produced using aggressive options.

Thus, this recommendation for C programmers on the X1: try recompiling with these options to "cc":


   -O3 -h aggress
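
For example, to recompile a hypothetical source file, loops.c:

   cc -O3 -h aggress -c loops.c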

Lee's test set is admittedly small, and your results will vary. (You might let us know what you discover.) And, of course, regardless of what language you're using, or platform you're running on, remember that aggressive compiler optimization can alter your results. Always validate test results.

X1 PrgEnv.new Upgraded

During downtime on 9/1, the mpt, craytools, cftn, and CC components of PrgEnv.new on the X1 were upgraded. The default PrgEnv remains unchanged.


  KLONDIKE$ module switch PrgEnv PrgEnv.new
  KLONDIKE$ module list
  Currently Loaded Modulefiles:
    1) modules             6) craytools.5.2.0.4  11) totalview
    2) PrgEnv.new          7) cal                12) X11
    3) craylibs.5.2.0.1    8) CC.5.2.0.4         13) pbs
    4) libsci.5.2.0.1      9) cftn.5.2.0.4       14) open
    5) mpt.2.3.0.5        10) motif.2.1.0.0
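
To return to the default environment, just reverse the arguments:

  KLONDIKE$ module switch PrgEnv.new PrgEnv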

Book Review: "High Tide: News from a Warming World"

[ Many thanks to Guy Robinson for this review. ]

High Tide: News from a Warming World. Mark Lynas. Flamingo, ISBN 0-00-713939-X.

It is sometimes very easy to regard global warming as just so many numbers in a computer, and to forget the impact it is having in the real world and, perhaps more importantly, what the various changes mean to real people. The author, Mark Lynas, has produced an interesting and very readable book describing the impact global warming is having, not only on his own doorstep but at various locations around the world.

The book tells of the author's trips to various locations around the world; the locations were chosen not for their appeal to the tourist but for their relevance to the changing conditions global warming causes. The events investigated are diverse: how intense storms cause flooding in the UK and Europe; the changes in Alaska as permafrost melts and seasonal ice cover changes; how rising sea levels are gradually drowning a chain of Pacific islands; the problems of dust clouds and eroding farmland in central China; the increasing power of hurricanes and storms on the eastern US seaboard; and rapidly retreating glaciers in the Andes and the impact this will have on water supplies in South America.

In each case the tale is effectively told by combining the author's own experiences gathering information with the experiences of locals. Recent events are put into context by comparison with the past, and both the regional economic impact and the scientific background are explained.

The final chapter describes the author's experiences reporting on recent meetings that have attempted to promote a broader understanding of the changes a warming world will cause, and looks toward policies which might alleviate the impact of those changes. It is clear that the interplay between politics, economics, and scientific research is very complex, and finding solutions will not be easy; balancing the needs of the developing world with those of the industrial nations is just one example. A great many resources are listed in the book's appendices; pointers to the scientific publications and political groups provide support for the arguments in each chapter.

This book gives a firsthand account of how our planet is currently changing and how this is affecting the lives of many people today. It justifies the view that these changes are not simply local, once-in-a-lifetime fluctuations, but presents a picture of global change. It also looks forward to how these changes will spread to affect more of the planet, and therefore more of us, either directly at our places of work or our homes, or indirectly through the complex interplay of modern society.

ScicomP News

Here's a recent email from the IBM System Scientific User Group. IBM users might consider attending the next group meeting, or at least subscribing to the email list.


 
> Here are some news and updates on recent and upcoming ScicomP meetings.
> 
> * ScicomP10 : An excellent series of talks was presented on topics
> ranging from scientific application performance on existing IBM
> hardware to emerging hardware and new ACTC tools. Talk materials from
> the Austin meeting are now online.
> 
> http://www.spscicomp.org/ScicomP10/abstracts.html
> 
> * New ScicomP officers : At the Austin meeting the ScicomP executive
> committee saw some changes. Bronis de Supinski is replaced by Gary
> Kumfert of LLNL as President. David Skinner moved from Secretary to
> Vice-President. We welcome Bernd Mohr from Julich as Secretary. Thanks
> for all your hard work Bronis!
> 
> * A decision was made to meet once yearly as opposed to twice. As often
> as possible we will try to co-locate with SP-XXL meetings as there is a
> good deal of overlap in the IBM speakers.
> 
> * An informal ScicomP Birds of a Feather meeting will take place at SC04
> in Pittsburgh this November.
> 
> * ScicomP11 : Will be hosted by HPCx in Edinburgh in Spring 2005.
> Exact dates for the meeting and registration will be announced later
> this year.
> 
> * ScicomP12 : Will be hosted at NCAR in May 2006.
> 
> * ScicomP will be sending a Letter of Feedback and Suggestions on IBM's
> HPC offerings and services. If you would like to contribute to this
> letter or participate in discussions of IBM HPC application and user
> issues send email to majordomo@nersc.gov with
> 
> subscribe forum_spscicomp
> 
> in the message body. This will subscribe you to the opt-in discussion
> list forum@spscicomp.org which is open to all members.
> 
> * General information about ScicomP email lists is available at
> 
> http://www.spscicomp.org/index.html#mailinglists
> 

Answers to 10th Anniversary Quiz

The grand prize winner was Trey White. The only question Trey missed was number 9, that President Bush's first name in 1994 was not Bill, but George. Here're all the answers:
  1. The name of the company that built ARSC's T3D was, in August 1994:

    [] Cray Inc. [x] Cray Research, Inc. [] Cray Computer Corp.

  2. ARSC's 1st supercomputer was a CRAY Y-MP M98. Its name was?

    [] nenana [] mckinley [x] denali [] rainier [] whitney [] elbert [] buckhill

  3. ARSC's Y-MP M98 (at install) had 1 GB memory, 4 CPUs, and liquid cooling. It was notable as the world's very first shared-memory supercomputer with:

    [x] 1 GB memory [] 4 CPUs [] liquid cooling

  4. Attendees of the Fall 1995 Alaska CUG witnessed:

    [x] The Aurora Borealis [] The Aurora Pectoralis

  5. If room air conditioning for ARSC's air-cooled, 84-node IBM Power4 complex fails, sysadmins have how many minutes to start bringing it down gracefully before it overheats and shuts itself down:

    [] 5 [x] 15 [] 60

  6. The T3D used a DEC Alpha EV4 processor which had a per-processor peak theoretical performance of 150 Mflops. The processor clock speed was:

    [] 37.5 MHz [] 50 MHz [] 75 MHz [x] 150 MHz

  7. MPI was available on the Cray T3D when it was installed in 1994:

    [] yes [x] no

  8. On the T3D, applications could run on "NPES" processors where:

    [] NPES >= 1 [] NPES >= 2 and NPES is a multiple of 2 [x] NPES >= 1 and NPES is a power of 2 [] NPES >= 2 and NPES is a power of 2

  9. On August 25, 1994, the President's first name was:

    [] Bill [x] George

  10. Computers built in Poughkeepsie NY must travel about how far north to reach Fairbanks, Alaska (in degrees, latitude)?

    [] 13 [] 18 [x] 23 [] 28

Thanks to everyone who gave it a shot... There were fewer than ten entries, so everyone gets a prize. They're in the mail... if not there already.

Quick-Tip Q & A



Q: Does anyone know a work-around for this?  Everything compiled, but 
   it fails to link.

   % mpxlf_r -O3 -qarch=pwr4 -qtune=pwr4 -qcache=auto -qhot -qstrict \
     -bmaxdata:1000000000 -bmaxstack:256000000 -o myprog  \
     modules/*.o grid/*.o tools/*.o local/*.o atmos/*.o  \
     land/*.o coupler/*.o main/*.o
   /bin/sh: /usr/bin/mpxlf_r: 0403-027 The parameter list is too long.
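
   One possible work-around (a sketch, untested here): pre-combine the
   object files with "ld -r" so that fewer arguments reach the final
   link, e.g.:

   % ld -r -o part1.o modules/*.o grid/*.o tools/*.o local/*.o
   % ld -r -o part2.o atmos/*.o land/*.o coupler/*.o main/*.o
   % mpxlf_r -O3 -qarch=pwr4 -qtune=pwr4 -qcache=auto -qhot -qstrict \
     -bmaxdata:1000000000 -bmaxstack:256000000 -o myprog part1.o part2.o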

[[ Answers, Questions, and Tips Graciously Accepted ]]


Current Editors:
Ed Kornkven ARSC HPC Specialist ph: 907-450-8669
Kate Hedstrom ARSC Oceanographic Specialist ph: 907-450-8678
Arctic Region Supercomputing Center
University of Alaska Fairbanks
PO Box 756020
Fairbanks AK 99775-6020
Archives:
    Back issues of the ASCII e-mail edition of the ARSC T3D/T3E/HPC Users' Newsletter are available by request. Please contact the editors.