author     Albert Cheng <acheng@hdfgroup.org>   1999-10-01 18:00:02 (GMT)
committer  Albert Cheng <acheng@hdfgroup.org>   1999-10-01 18:00:02 (GMT)
commit     a9b3674f9759f0132fe006c5d3df89b917739e8e (patch)
tree       7f9a334fb5e7269dc035041332a8af5196642422
parent     bc87d2a2cea06cabc7d0cfb94d400b45f402cf49 (diff)
[svn-r1707] Changed to different locations or filenames.
INSTALL.ascired           -> INSTALL_TFLOPS
INSTALL.ibm.sp.parallel   -> bin/config_para_ibm_sp.sh
INSTALL_parallel.ascired  -> bin/config_para_tflops.sh
-rw-r--r--  INSTALL.ascired           107
-rw-r--r--  INSTALL.ibm.sp.parallel    88
-rw-r--r--  INSTALL_parallel.ascired   58
-rw-r--r--  MANIFEST                    6
4 files changed, 3 insertions, 256 deletions
diff --git a/INSTALL.ascired b/INSTALL.ascired
deleted file mode 100644
index 43dea2a..0000000
--- a/INSTALL.ascired
+++ /dev/null
@@ -1,107 +0,0 @@
-
-FOR THE INTEL TFLOPS MACHINE:
-
-Below are the step-by-step procedures for building, testing, and
-installing both the sequential and parallel versions of the HDF5 library.
-
----------------
-Sequential HDF5:
----------------
-
-Building the sequential HDF5 library for the ASCI Red machine requires
-coordinating steps between sasn100 and janus. Although janus can compile,
-it is better to build from sasn100, which has more complete build tools and
-runs faster; it is also anti-social to tie up janus with compiling. The
-build still requires janus because one of the steps executes a program that
-determines the run-time characteristics of the TFLOPS machine.
-
-Assuming you have already unpacked the HDF5 tar-file into the
-directory <hdf5>, follow the steps below:
-
-FROM SASN100,
-
-1) cd <hdf5>
-
-2) ./configure tflop
-
-3) make H5detect
-
-
-FROM JANUS,
-
-4) cd <hdf5>
-
-5) make H5Tinit.c
-
-
-FROM SASN100,
-
-6) make
-
-
-When everything is finished compiling and linking,
-you can run the tests as follows:
-
-FROM JANUS,
-
-7.1) Due to a bug, you must first remove the following line from
- the file test/Makefile before the next step.
- RUNTEST=$(LT_RUN)
-7.2) make check
-
-
-Once satisfied with the test results, you can install
-the software as follows:
-
-FROM SASN100,
-
-8) make install
-
-
----------------
-Parallel HDF5:
----------------
-
-The setup process for building the parallel version of the HDF5 library for
-the ASCI Red machine is very similar to the sequential version. Since TFLOPS
-does not support MPI-IO natively, we have prepared a shell script that runs
-configure with the appropriate MPI library.
-
-Assuming you have already unpacked the HDF5 tar-file into the
-directory <hdf5>, follow the steps below:
-FROM SASN100,
-
-1) cd <hdf5>
-
-2) sh INSTALL_parallel.ascired    (this step differs from the sequential version)
-
-3) make H5detect
-
-
-FROM JANUS,
-
-4) cd <hdf5>
-
-5) make H5Tinit.c
-
-
-FROM SASN100,
-
-6) make
-
-
-When everything is finished compiling and linking,
-you can run the tests as follows:
-
-FROM JANUS,
-
-7.1) Due to a bug, you must first remove the following line from
- the file test/Makefile before the next step.
- RUNTEST=$(LT_RUN)
-7.2) make check
-
-
-Once satisfied with the parallel test results, and provided you
-have the correct permissions, you can install the software as follows:
-
-FROM SASN100,
-
-8) make install
-
-
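For reference, the two-machine procedure spelled out in the deleted INSTALL.ascired
above can be condensed into the short shell sketch below. It is illustrative only:
<hdf5> is the document's placeholder for the unpack directory, and grep is used to
automate what the instructions describe as a hand edit of test/Makefile.

    # --- on sasn100 ------------------------------------------------------
    cd <hdf5>
    ./configure tflop          # step 2; the parallel build instead runs the
                               # configure wrapper script at this point
    make H5detect              # step 3

    # --- on janus --------------------------------------------------------
    cd <hdf5>
    make H5Tinit.c             # step 5: runs a program on the TFLOPS machine
                               # to probe its run-time characteristics

    # --- back on sasn100 -------------------------------------------------
    make                       # step 6

    # --- on janus --------------------------------------------------------
    # step 7.1: drop the buggy RUNTEST line before testing (a hand edit in
    # the original instructions; grep -v is one way to automate it)
    grep -v 'RUNTEST=$(LT_RUN)' test/Makefile > test/Makefile.fixed
    mv test/Makefile.fixed test/Makefile
    make check                 # step 7.2

    # --- on sasn100 ------------------------------------------------------
    make install               # step 8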
diff --git a/INSTALL.ibm.sp.parallel b/INSTALL.ibm.sp.parallel
deleted file mode 100644
index b269527..0000000
--- a/INSTALL.ibm.sp.parallel
+++ /dev/null
@@ -1,88 +0,0 @@
-# How to create a parallel version of HDF5 on an IBM SP system
-# that uses MPI and MPI-IO.
-
-# Unfortunately, the configure/make process to create the parallel version of
-# HDF5 has not yet been automated to the same extent that the sequential
-# version has.
-# Read the INSTALL file to understand the configure/make process for the
-# sequential (i.e., uniprocessor) version of HDF5.
-# The process for creating the parallel version of HDF5 using MPI-IO
-# is similar, but first you will have to set up some environment variables
-# with values specific to your local installation.
-# The relevant variables are shown below, with values that work for LLNL's
-# ASCI baby blue pacific SP as of the writing of these instructions (980210).
-
-# In addition to the environment variables, you _might_ also have to
-# create a new file in the config directory.
-# You will need to create this file only if the execution of the ./configure
-# program aborts with an error after printing the message
-# "checking whether byte ordering is bigendian..."
-#
-# If this is the case, create a new file in the config directory
-# whose name is of the form architecture-vendor-OSversion
-# (e.g., for baby blue pacific, this file is named powerpc-ibm-aix4.2.1.0)
-# and which contains the line
-# ac_cv_c_bigendian=${ac_cv_c_bigendian='yes'}
-# if the target architecture is bigendian, or
-# ac_cv_c_bigendian=${ac_cv_c_bigendian='no'}
-# otherwise.
-# Running the program ./bin/config.guess will print out the name
-# of the new file you must create.
-
-# Don't try to make a parallel version of HDF5 from the same hdf5 root
-# directory where you made a sequential version of HDF5 -- start with
-# a fresh copy.
-# Here are the flags you must set before running the ./configure program
-# to create the parallel version of HDF5.
-# (We use csh here, but of course you can adapt to whatever shell you like.)
-
-# compile for MPI jobs
-setenv CC "/usr/local/mpich-1.1.2+romio_lgfiles/bin/mpicc"
-
-#
-# next 4 for IBM mpi
-#
-#setenv CC /usr/lpp/ppe.poe/bin/mpcc_r
-
-#
-# for both
-#
-setenv MP_PROCS 1
-
-
-# These compiler flags work on ASCI baby blue pacific (IBM SP),
-# using IBM's MPI and Argonne's MPI-IO (ROMIO):
-# -DHAVE_FUNCTION compiler accepts __FUNCTION__ notation
-# -I/usr/local/mpio/include/ibm using ROMIO's MPI-IO header files
-#
-# The following flags are only needed when compiling/linking a user program
-# for execution.
-# -bI:/usr/include/piofs/piofs.exp this MPI-IO uses PIOFS file system
-# -L/usr/local/mpio/lib/ibm -lmpio  link to this MPI-IO lib
-#
-#setenv CFLAGS "-D_LARGE_FILES $CFLAGS"
-
-# The configure/make process needs to be able to run some programs,
-# so you need to specify a processor pool.
-# Also, don't prepend the process id in the output of the programs
-# run by config/make.
-setenv MP_RMPOOL 0
-setenv MP_LABELIO no
-
-# Once these variables are set to the proper values for your installation,
-# you can run the configure program (i.e., ./configure)
-# to set up the Makefiles, etc.
-# After configuring, run the make as described in the INSTALL file.
-# Once the configuration is complete, you can set any of your
-# environment variables to whatever you like.
-
-# the files in the config directory, such as
-# config/powerpc-ibm-aix4.2.1.0
-# config/powerpc-ibm-aix4.x
-# config/powerpc-ibm-aix4.3.2.0
-# sometimes will need some help, depending on subtleties of the installation
-
-
-# When compiling and linking your application, don't forget to compile with
-# mpcc and link to the MPI-IO library and the parallel version of the HDF5
-# library (that was created and installed with the configure/make process).
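The environment setup and the big-endian hint file described in the deleted notes
above can also be scripted. The following is a minimal sh sketch (the notes
themselves use csh); the compiler path and the 'yes' endianness value are taken
from the notes and are site-specific assumptions, not part of this commit.

    #! /bin/sh
    # Only needed if ./configure aborts after printing
    # "checking whether byte ordering is bigendian..."
    ARCH=`./bin/config.guess`                # e.g. powerpc-ibm-aix4.2.1.0
    echo "ac_cv_c_bigendian=\${ac_cv_c_bigendian='yes'}" > config/$ARCH

    # Environment for an MPICH/ROMIO build (adjust paths for your site).
    CC=/usr/local/mpich-1.1.2+romio_lgfiles/bin/mpicc
    MP_PROCS=1                               # configure must run test programs
    MP_RMPOOL=0                              # processor pool for those runs
    MP_LABELIO=no                            # no process-id prefixes on output
    export CC MP_PROCS MP_RMPOOL MP_LABELIO

    ./configure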
diff --git a/INSTALL_parallel.ascired b/INSTALL_parallel.ascired
deleted file mode 100644
index 648b093..0000000
--- a/INSTALL_parallel.ascired
+++ /dev/null
@@ -1,58 +0,0 @@
-#! /bin/sh
-# How to create a parallel version of HDF5 on the Intel Asci Red System
-# that uses MPI and MPI-IO.
-
-# Read the INSTALL.ascired file to understand the configure/make process
-# for the sequential (i.e., uniprocessor) version of HDF5.
-# The process for creating the parallel version of HDF5 using MPI-IO
-# is similar, but first you will have to set up some environment variables
-# with values specific to your local installation.
-# The relevant variables are shown below, with values that work for Sandia's
-# ASCI Red Tflops machine as of the writing of these instructions (980421).
-
-# Don't try to make a parallel version of HDF5 from the same hdf5 root
-# directory where you made a sequential version of HDF5 -- start with
-# a fresh copy.
-# Here are the flags you must set before running the ./configure program
-# to create the parallel version of HDF5.
-# (We use sh here, but of course you can adapt to whatever shell you like.)
-
-# compile for MPI jobs
-#CC=cicc
-
-# The following flags are only needed when compiling/linking a user program
-# for execution.
-#
-
-# Using the MPICH library by Daniel Sands.
-# It contains both MPI-1 and MPI-IO functions.
-ROMIO="${HOME}/MPIO/mpich"
-if [ ! -d $ROMIO ]
-then
- echo "ROMIO directory ($ROMIO) not found"
- echo "Aborted"
- exit 1
-fi
-mpi1_inc=""
-mpi1_lib=""
-mpio_inc="-I$ROMIO/include"
-mpio_lib="-L$ROMIO/lib"
-
-
-MPI_INC="$mpi1_inc $mpio_inc"
-MPI_LIB="$mpi1_lib $mpio_lib"
-
-
-# Once these variables are set to the proper values for your installation,
-# you can run the configure program (i.e., ./configure tflop --enable-parallel=mpio)
-# to set up the Makefiles, etc.
-# After configuring, run the make as described in the INSTALL file.
-
-# When compiling and linking your application, don't forget to compile with
-# cicc and link to the MPI-IO library and the parallel version of the HDF5
-# library (that was created and installed with the configure/make process).
-
-CPPFLAGS=$MPI_INC \
-LDFLAGS=$MPI_LIB \
-LIBS="-lmpich" \
-./configure --enable-parallel tflop "$@"
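After this commit the script above is distributed as bin/config_para_tflops.sh, so
step 2 of the parallel TFLOPS procedure becomes a call to the relocated wrapper.
A minimal usage sketch, assuming it is run from the top of the unpacked source tree
on sasn100; the --prefix value is only an example, and extra arguments are passed
straight through to configure via "$@".

    # step 2 of the parallel procedure, using the new location
    sh bin/config_para_tflops.sh
    sh bin/config_para_tflops.sh --prefix=/usr/local/hdf5   # example argument

    # then continue with the usual two-machine sequence
    make H5detect        # on sasn100
    make H5Tinit.c       # on janus
    make                 # on sasn100
    make check           # on janus, after the test/Makefile workaround
    make install         # on sasn100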
diff --git a/MANIFEST b/MANIFEST
index eabae91..e85a15b 100644
--- a/MANIFEST
+++ b/MANIFEST
@@ -7,11 +7,9 @@
./COPYING
./INSTALL
-./INSTALL.ascired
-./INSTALL.ibm.sp.parallel
./INSTALL_MAINT
+./INSTALL_TFLOPS
./INSTALL_parallel
-./INSTALL_parallel.ascired
./INSTALL_Windows.txt
./MANIFEST
./Makefile.dist
@@ -28,6 +26,8 @@
./bin/checkposix _DO_NOT_DISTRIBUTE_
./bin/config.guess
./bin/config.sub
+./bin/config_para_ibm_sp.sh
+./bin/config_para_tflops.sh
./bin/debug-ohdr _DO_NOT_DISTRIBUTE_
./bin/distdep
./bin/errors _DO_NOT_DISTRIBUTE_