From 2edd657fa0b01d69a4dfe91de207b208d0853c4f Mon Sep 17 00:00:00 2001 From: Frank Baker Date: Fri, 20 Jun 2003 16:53:38 -0500 Subject: [svn-r7076] Purpose: Add new tools: h5diff h5perf h5redeploy h5c++ Corrections to h5import Description: h5diff, h5perf, h5redeploy, h5c++ -- Added these tools. h5import -- Corrected the description of the naming convention for output datasets. Other edits based on review comments. Copy edits and source code formatting. Platforms tested: IE 5, Safari --- doc/html/Tools.html | 1742 ++++++++++++++++++++++++++++++++------------------- 1 file changed, 1114 insertions(+), 628 deletions(-) diff --git a/doc/html/Tools.html b/doc/html/Tools.html index 21f6d9e..a6ebe55 100644 --- a/doc/html/Tools.html +++ b/doc/html/Tools.html @@ -60,6 +60,9 @@ to convert files from HDF4 format to HDF5 format and vice versa. A tool for displaying HDF5 file contents
  • h5ls -- A tool for listing specified features of HDF5 file contents +
  • h5diff -- + A tool to identify and locate the differences between two HDF5 files + (Beta version)
  • h5repart -- A tool for repartitioning a file, creating a family of files
  • h5import -- @@ -72,10 +75,16 @@ to convert files from HDF4 format to HDF5 format and vice versa. A tool for converting an HDF5 file to an HDF4 file
  • h4toh5 -- A tool for converting an HDF4 file to an HDF5 file +
  • h5perf -- + A tool for measuring HDF5 performance +
  • h5redeploy -- + A tool for updating HDF5 compiler tools after installing HDF5
  • h5cc -- - A tool for compiling HDF5 programs + A tool for compiling HDF5 programs written in C
  • h5fc -- A tool for compiling HDF5 programs written in Fortran90 +
  • h5c++ -- + A tool for compiling HDF5 programs written in C++
  • Java-based tools for HDF5 -- (at http://hdf.ncsa.uiuc.edu/java-hdf5-html/)
    @@ -263,7 +272,7 @@ to convert files from HDF4 format to HDF5 format and vice versa. Default: 1 in all dimensions.
    --
    Indicate that all following arguments are non-options. - E.g., to dump a file called `-f', use h5dump -- -f.
    + E.g., to dump a file called `-f', use h5dump -- -f.
    file
    The file to be examined.


    @@ -444,7 +453,7 @@ to convert files from HDF4 format to HDF5 format and vice versa.
    -f   or   --full
    Print full path names instead of base names. -
    -g   or   +
    -g   or   --group
    Show information about a group, not its contents.
    -l   or   @@ -493,6 +502,102 @@ to convert files from HDF4 format to HDF5 format and vice versa.
    +
    Tool Name: h5diff    (Beta version) +
    Syntax: +
    h5diff file1 file2 + [OPTIONS] + [object1 [object2 ] ] +
    Purpose: +
    Compares two HDF5 files and reports the differences. +
    Description: +
    h5diff is a command line tool that compares + two HDF5 files, file1 and file2, and + reports the differences between them.  +

    + Optionally, h5diff will compare two objects + within these files. + If only one object, object1, is specified, + h5diff will compare + object1 in file1 + with object1 in file2. + If two objects, object1 and object2, + are specified, h5diff will compare + object1 in file1 + with object2 in file2. + These objects must be HDF5 datasets. +

    + object1 and object2 must be expressed + as absolute paths from the respective file's root group. +

    + Additional information, with several sample cases, + can be found in the document + + H5diff Examples. +

    Options and Parameters: +
    +
    file1 +
    file2 +
    The HDF5 files to be compared. +
    -h +
    Print all differences. +
    -r +
    Print only the names of objects that differ; + do not print the differences. + These objects may be HDF5 datasets, groups, + or named datatypes. +
    -n count +
    Print differences up to a maximum of count differences, + then stop. + count must be a positive integer. +
    -d delta +
    Print only differences that are greater than the + limit delta. + delta must be a positive number. + The comparison criterion is whether the + absolute value of the difference of + two corresponding values is greater than + delta +
    (e.g., |a–b| > delta, + where a is a value in file1 and + b is a value in file2). +
    -p relative +
    Print only differences that are greater than a + relative error. + relative must be a positive number. + The comparison criterion is whether the + absolute value of the difference between 1 + and the ratio of two corresponding values + is greater than relative + (e.g., |1–(b/a)| > relative, + where a is a value in file1 and + b is a value in file2). +
    object1 +
    object2 +
    Specific object(s) within the files to be compared. +
    +
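    + As an illustrative numeric sketch (the values are hypothetical):
    + if a corresponding value is 100.0 in file1 and 100.4 in file2,
    + then -d 0.5 suppresses that difference, since |100.0 – 100.4| = 0.4
    + is not greater than 0.5, while -p 0.003 reports it, since
    + |1 – (100.4/100.0)| = 0.004 is greater than 0.003.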
    Examples: +
    The following h5diff call compares + the object /a/b in file1 + with the object /a/c in file2:
    +     h5diff file1 file2 /a/b /a/c +
    This h5diff call compares + the object /a/b in file1 + with the same object in file2:
    +     h5diff file1 file2 /a/b +
    And this h5diff call compares + all objects in both files:
    +     h5diff file1 file2 + +
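    + The options above may be combined with object paths.
    + As an illustrative sketch (the file and object names are hypothetical),
    + the following call reports at most 20 differences in /a/b,
    + ignoring differences whose absolute value does not exceed 0.001:
    +     h5diff file1 file2 -n 20 -d 0.001 /a/b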
    + + +
    +
    Tool Name: h5repart
    Syntax:
    h5repart @@ -547,14 +652,14 @@ to convert files from HDF4 format to HDF5 format and vice versa.
    Syntax:
    h5import infile in_options - [infile in_options ...] - -o outfile - + [infile in_options ...] + -o outfile +
    h5import infile in_options - [infile in_options ...] - -outfile outfile - + [infile in_options ...] + -outfile outfile +
    h5import -h
    h5import -help
    Purpose: @@ -563,624 +668,657 @@ to convert files from HDF4 format to HDF5 format and vice versa.
    h5import converts data from one or more ASCII or binary files, infile, into the same number of HDF5 datasets - in the existing or new HDF5 file, outfile. - Data conversion is performed in accordance with the + in the existing or new HDF5 file, outfile. + Data conversion is performed in accordance with the user-specified type and storage properties - specified in in_options. -

    - The primary objective of h5import is to - import floating point or integer data. + specified in in_options. +

    + The primary objective of h5import is to + import floating point or integer data. The utility's design allows for future versions that - accept ASCII text files and store the contents as a + accept ASCII text files and store the contents as a compact array of one-dimensional strings, - but that capability is not implemented in HDF5 Release 1.6. + but that capability is not implemented in HDF5 Release 1.6. -

    - Input data and options:
    - Input data can be provided in one of the follwing forms: -

    • As an ASCII, or plain-text, file containing either - floating point or integer data -
    • As a binary file containing either 32-bit or +

      + Input data and options:
      + Input data can be provided in one of the following forms: +

      • As an ASCII, or plain-text, file containing either + floating point or integer data +
      • As a binary file containing either 32-bit or 64-bit native floating point data -
      • As a binary file containing native integer data, - signed or unsigned and - 8-bit, 16-bit, 32-bit, or 64-bit. -
      • As an ASCII, or plain-text, file containing text data. +
      • As a binary file containing native integer data, + signed or unsigned and + 8-bit, 16-bit, 32-bit, or 64-bit. +
      • As an ASCII, or plain-text, file containing text data. (This feature is not implemented in HDF5 Release 1.6.) -
      - Each input file, infile, - contains a single n-dimensional - array of values of one of the above types expressed - in the order of fastest-changing dimensions first. -

      - Floating point data in an ASCII input file must be - expressed in the fixed floating form (e.g., 323.56) - h5import is designed to accept scientific notation - (e.g., 3.23E+02) in an ASCII, but that is not implemented in HDF5 release 1.6. -

      - Each input file can be associated with options specifying - the datatype and storage properties. - These options can be specified either as - command line arguments - or in a configuration file. - Note that exactly one of these approaches must be used with a - single input file. -

      - Command line arguments, best used with simple input files, - can be used to specify - the class, size, dimensions of the input data and - a path identifying the output dataset. -

      - The recommended means of specifying input data options - is in a configuration file; this is also the only means of - specifying advanced storage features. - See further discussion in "The configuration file" below. -

      - The only required option for input data is dimension sizes; - defaults are available for all others. -

      - h5import will accept up to 30 input files in a single call. - Other considerations, such as the maximum length of a command line, - may impose a more stringent limitation. +

    + Each input file, infile, + contains a single n-dimensional + array of values of one of the above types expressed + in the order of fastest-changing dimensions first. +

    + Floating point data in an ASCII input file must be + expressed in fixed floating-point form (e.g., 323.56). + h5import is designed to accept scientific notation + (e.g., 3.23E+02) in ASCII files, but that capability is not implemented in HDF5 Release 1.6. +

    + Each input file can be associated with options specifying + the datatype and storage properties. + These options can be specified either as + command line arguments + or in a configuration file. + Note that exactly one of these approaches must be used with a + single input file. +

    + Command line arguments, best used with simple input files, + can be used to specify + the class, size, dimensions of the input data and + a path identifying the output dataset. +

    + The recommended means of specifying input data options + is in a configuration file; this is also the only means of + specifying advanced storage features. + See further discussion in "The configuration file" below. +

    + The only required option for input data is dimension sizes; + defaults are available for all others. +

    + h5import will accept up to 30 input files in a single call. + Other considerations, such as the maximum length of a command line, + may impose a more stringent limitation. -

    - Output data and options:
    - The name of the output file is specified following - the -o or -output option - in outfile. - The data from each input file is stored as a separate dataset - in this output file. - outfile may be an existing file. - If it does not yet exist, h5import will create it. -

    - Output dataset information and storage properties can be - specified only by means of a configuration file. - - - - - - - - - - - - - - -
      - Dataset path - If the groups in the path leading to the dataset - do not exist, h5import will create them.
    - If no group is specified, the dataset will be created - under the root group.
    - If no dataset name is specified, the dataset will be created - as dataset1.
    - h5import does not check for a pre-existing dataset - of the specified or default name; it overwrites any such dataset - without offering an opportunity to preserve it. -
      - Output type - Datatype parameters for output data -
      -     Output data class - Signed or unsigned integer or floating point -
      -     Output data size - 8-, 16-, 32-, or 64-bit integer
    - 31- or 64-bit floating point -
      -     Output architecture - IEEE
    - STD
    - NATIVE (Default)
    - Other architectures are included in the h5import design - but are not implemented in this release. -
      -     Output byte order - Little- or big-endian.
    - Relevant only if output architecture - is IEEE, UNIX, or STD; - fixed for other architectures. -
      - Dataset layout and storage  
    -         properties -
    Denote how raw data is to be organized on the disk. - If none of the following are specified, - the default configuration is contiguous layout and with no compression. -
      -     Layout - Contiguous (Default)
    - Chunked -
      -     External storage - Allows raw data to be stored in a non-HDF5 file or in an - external HDF5 file.
    - Requires contiguous layout. -
      -     Compressed - Sets the type of compression and the - level to which the dataset must be compressed.
    - Requires chunked layout. -
      -     Extendible - Allows the dimensions of the dataset increase over time - and/or to be unlimited.
    - Requires chunked layout. -
      -     Compressed and
    -         extendible -
    Requires chunked layout. -
      - -   -
    -

    +

    + Output data and options:
    + The name of the output file, outfile, + is specified following + the -o or -outfile option. + The data from each input file is stored as a separate dataset + in this output file. + outfile may be an existing file. + If it does not yet exist, h5import will create it. +

    + Output dataset information and storage properties can be + specified only by means of a configuration file. + + + + + + + + + + + + + + +
      + Dataset path + If the groups in the path leading to the dataset + do not exist, h5import will create them.
    + If no group is specified, the dataset will be created + as a member of the root group.
    + If no dataset name is specified, the default name is + dataset1 for the first input dataset, + dataset2 for the second input dataset, + dataset3 for the third input dataset, + etc.
    + h5import does not overwrite a pre-existing + dataset of the specified or default name. + When an existing dataset of a conflicting name is + encountered, h5import quits with an error; + the current input file and any subsequent input files + are not processed. +
      + Output type + Datatype parameters for output data +
      +     Output data class + Signed or unsigned integer or floating point +
      +     Output data size + 8-, 16-, 32-, or 64-bit integer
    + 32- or 64-bit floating point +
      +     Output architecture + IEEE
    + STD
    + NATIVE (Default)
    + Other architectures are included in the h5import design + but are not implemented in this release. +
      +     Output byte order + Little- or big-endian.
    + Relevant only if output architecture + is IEEE, UNIX, or STD; + fixed for other architectures. +
      + Dataset layout and storage  
    +         properties +
    Denote how raw data is to be organized on the disk. + If none of the following are specified, + the default configuration is contiguous layout with no compression. +
      +     Layout + Contiguous (Default)
    + Chunked +
      +     External storage + Allows raw data to be stored in a non-HDF5 file or in an + external HDF5 file.
    + Requires contiguous layout. +
      +     Compressed + Sets the type of compression and the + level to which the dataset must be compressed.
    + Requires chunked layout. +
      +     Extendible + Allows the dimensions of the dataset to increase over time + and/or to be unlimited.
    + Requires chunked layout. +
      +     Compressed and
    +         extendible +
    Requires chunked layout. +
      + +   +
    +

    -

    - Command-line arguments:
    - The h5import syntax for the command-line arguments, - in_options, is as follows: - - -
         - h5import infile -d dim_list - [-p pathname] - [-t input_class] - [-s input_size] - [infile ...] - -o outfile
    - or
    - h5import infile -dims dim_list - [-path pathname] - [-type input_class] - [-size input_size] - [infile ...] - -outfile outfile
    - or
    - h5import infile -c config_file - [infile ...] - -outfile outfile -
    - Note the following: - If the -c config_file option is used with - an input file, no other argument can be used with that input file. - If the -c config_file option is not used with - an input data file, the -d dim_list argument - (or -dims dim_list) - must be used and any combination of the remaining options may be used. - Any arguments used must appear in exactly the order used - in the syntax declarations immediately above. +

    + Command-line arguments:
    + The h5import syntax for the command-line arguments, + in_options, is as follows: + + +
         + h5import infile -d dim_list + [-p pathname] + [-t input_class] + [-s input_size] + [infile ...] + -o outfile
    + or
    + h5import infile -dims dim_list + [-path pathname] + [-type input_class] + [-size input_size] + [infile ...] + -outfile outfile
    + or
    + h5import infile -c config_file + [infile ...] + -outfile outfile +
    + Note the following: + If the -c config_file option is used with + an input file, no other argument can be used with that input file. + If the -c config_file option is not used with + an input data file, the -d dim_list argument + (or -dims dim_list) + must be used and any combination of the remaining options may be used. + Any arguments used must appear in exactly the order used + in the syntax declarations immediately above. -

    - The configuration file:
    - A configuration file is specified with the - -c config_file option: - - -
         - h5import infile -c config_file - [infile -c config_file2 ...] - -outfile >outfile -
    -

    - The configuration file is an ASCII file and must be - organized as "Configuration_Keyword Value" pairs, - with one pair on each line. - For example, the line indicating that - the input data class (configuration keyword INPUT-CLASS) - is floating point in a text file (value TEXTFP) - would appear as follows:
    -     INPUT-CLASS TEXTFP -

    +

    + The configuration file:
    + A configuration file is specified with the + -c config_file option: + + +
         + h5import infile -c config_file + [infile -c config_file2 ...] + -outfile outfile +
    +

    + The configuration file is an ASCII file and must be + organized as "Configuration_Keyword Value" pairs, + with one pair on each line. + For example, the line indicating that + the input data class (configuration keyword INPUT-CLASS) + is floating point in a text file (value TEXTFP) + would appear as follows:
    +     INPUT-CLASS TEXTFP +

    A configuration file may have the following keywords each - followed by one of the following defined values. - One entry for each of the first two keywords, - RANK and DIMENSION-SIZES, - is required; all other keywords are optional. - -

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    Keyword  
        Value -

    Description -
    -
    RANK   -

    The number of dimensions in the dataset. (Required) -
    -     rank - An integer specifying the number of dimensions in the dataset.
    - Example:   4   for a 4-dimensional dataset. -
    -
    DIMENSION-SIZES -

    Sizes of the dataset dimensions. (Required) -
    -     dim_sizes - A string of space-separated integers - specifying the sizes of the dimensions in the dataset. - The number of sizes in this entry must match the value in - the RANK entry.
    - Example:   4 3 4 38   for a 4x3x4x38 dataset. -
    -
    PATH -

    Path of the output dataset. -
    -     path - The full HDF5 pathname identifying the output dataset - relative to the root group within the output file.
    - I.e., path is a string of optional group names, - each followed by a slash, - and ending with a dataset name. - If the groups in the path do no exist, they will be created.
    - If PATH is not specified, the default - path is /dataset1.
    - Example: The configuration file entry - - -
         - PATH grp1/grp2/dataset1 -
    - indicates that the output dataset dataset1 will be written - in the group grp2/ which is in the group grp1/, - a member of the root group in the output file. -
    -
    INPUT-CLASS   -

    A string denoting the type of input data. -
    -     TEXTIN - Input is signed integer data in an ASCII file. -
    -     TEXTUIN - Input is unsigned integer data in an ASCII file. -
    -     TEXTFP - Input is floating point data in fixed notation (e.g., 325.34) - in an ASCII file. -
    -     TEXTFPE - Input is floating point data in scientific notation (e.g., 3.2534E+02) - in an ASCII file.
    - (Not implemented in this release.) -
    -     IN - Input is signed integer data in a binary file. -
    -     UIN - Input is unsigned integer data in a binary file. -
    -     FP - Input is floating point data in a binary file. (Default) -
    -     STR - Input is character data in an ASCII file. - With this value, the configuration keywords - RANK, DIMENSION-SIZES, - OUTPUT-CLASS, OUTPUT-SIZE, - OUTPUT-ARCHITECTURE, and OUTPUT-BYTE-ORDER - will be ignored.
    - (Not implemented in this release.) -
    -
    INPUT-SIZE -

    An integer denoting the size of the input data, in bits. -
    -     8
    -     16
    -     32
    -     64 -
    For signed and unsigned integer data: - TEXTIN, TEXTUIN, - IN, or UIN. - (Default: 32) -
    -     32
    -     64 -
    For floating point data: - TEXTFP, TEXTFPE, - or FP. - (Default: 32) -
    -
    OUTPUT-CLASS   -

    A string denoting the type of output data. -
    -     IN - Output is signed integer data.
    - (Default if INPUT-CLASS is - IN or TEXTIN) -
    -     UIN - Output is unsigned integer data.
    - (Default if INPUT-CLASS is - UIN or TEXTUIN) -
    -     FP - Output is floating point data.
    - (Default if INPUT-CLASS is not specified or is - FP, TEXTFP, or TEXTFPE) -
    -     STR - Output is character data, - to be written as a 1-dimensional array of strings.
    - (Default if INPUT-CLASS is UIN - or TEXTUIN)
    - (Not implemented in this release.) -
    -
    OUTPUT-SIZE -

    An integer denoting the size of the output data, in bits. -
    -     8
    -     16
    -     32
    -     64 -
    For signed and unsigned integer data: - IN or UIN. - (Default: Same as INPUT-SIZE, else 32) -
    -     32
    -     64 -
    For floating point data: - FP. - (Default: Same as INPUT-SIZE, else 32) -
    -
    OUTPUT-ARCHITECTURE -

    A string denoting the type of output architecture. -
    -     STD
    -     IEEE
    -     INTEL *
    -     CRAY *
    -     MIPS *
    -     ALPHA *
    -     NATIVE
    -     UNIX * -
    See the "Predefined Atomic Types" section - in the "HDF5 Datatypes" chapter - of the HDF5 User's Guide - for a discussion of these architectures.
    - Values marked with an asterisk (*) are not implemented in this release.
    - (Default: NATIVE) -
    -
    OUTPUT-BYTE-ORDER -

    A string denoting the output byte order. - This entry is ignored if the OUTPUT-ARCHITECTURE - is not specified or if it is specified as IEEE, - UNIX, or STD. -
    -     BE - Big-endian. (Default) -
    -     LE - Little-endian. -
    -
    The following options are disabled by default, making - the default storage properties no chunking, no compression, - no external storage, and no extensible dimensions. -
    -
    CHUNKED-DIMENSION
    -

    Dimension sizes of the chunk for chunked output data. -
    BTW, is this CHUNKED-DIMENSION or CHUNKED-D...-SIZES? -
    -     chunk_dims - A string of space-separated integers specifying the - dimension sizes of the chunk for chunked output data. - The number of dimensions must correspond to the value - of RANK.
    - The presence of this field indicates that the - output dataset is to be stored in chunked layout; - if this configuration field is absent, - the dataset will be stored in contiguous layout. -
    -
    COMPRESSION-TYPE -

    Type of compression to be used with chunked storage. - Requires that CHUNKED-DIMENSION be specified. -
    -     GZIP - Gzip compression.
    - Othe compression algorithms are not implemented - in this release of h5import. -
    -
    COMPRESSION-PARAM -

    Compression level. - Required if COMPRESSION-TYPE is specified. -Since there is a default, is "required" true? -
    -     1 through 9 - Gzip compression levels: - 1 will result in the fastest compression - while 9 will result in the best compression ratio. - (Default: 6) -
    -
    EXTERNAL-STORAGE -

    Name of an external file in which to create the output dataset. - Cannot be used with CHUNKED-DIMENSIONS, - COMPRESSION-TYPE, OR MAXIMUM-DIMENSIONS. -
    -     external_file    - - - A string specifying the name of an external file. -
    -
    MAXIMUM-DIMENSIONS -

    Maximum sizes of all dimensions. - Requires that CHUNKED-DIMENSION be specified. -
    -     max_dims - A string of space-separated integers specifying the - maximum size of each dimension of the output dataset. - A value of -1 for any dimension implies - unlimited size for that particular dimension.
    - The number of dimensions must correspond to the value - of RANK.
    -


    + followed by one of the following defined values. + One entry for each of the first two keywords, + RANK and DIMENSION-SIZES, + is required; all other keywords are optional. + +

    + + + + + + + + + + + + + + + + + + + + + + + + + -

    - The help option:
    - The help option, expressed as one of -

    +
    Keyword  
        Value +

    Description +
    +
    RANK   +

    The number of dimensions in the dataset. (Required) +
    +     rank + An integer specifying the number of dimensions in the dataset.
    + Example:   4   for a 4-dimensional dataset. +
    +
    DIMENSION-SIZES +

    Sizes of the dataset dimensions. (Required) +
    +     dim_sizes + A string of space-separated integers + specifying the sizes of the dimensions in the dataset. + The number of sizes in this entry must match the value in + the RANK entry. + The fastest-changing dimension must be listed first.
    + Example:   4 3 4 38   for a 38x4x3x4 dataset. +
    +
    PATH +

    Path of the output dataset. +
    +     path + The full HDF5 pathname identifying the output dataset + relative to the root group within the output file.
    + I.e., path is a string consisting of + optional group names, each followed by a slash, + and ending with a dataset name. + If the groups in the path do not exist, they will be + created.
    + If PATH is not specified, the output dataset + is stored as a member of the root group and the + default dataset name is + dataset1 for the first input dataset, + dataset2 for the second input dataset, + dataset3 for the third input dataset, etc.
    + Note that h5import does not overwrite a + pre-existing dataset of the specified or default name. + When an existing dataset of a conflicting name is + encountered, h5import quits with an error; + the current input file and any subsequent input files + are not processed.
    + Example: The configuration file entry + + +
         + PATH grp1/grp2/dataset1 +
    + indicates that the output dataset dataset1 will + be written in the group grp2/ which is in + the group grp1/, + a member of the root group in the output file. +
    +
    INPUT-CLASS   +

    A string denoting the type of input data. +
    +     TEXTIN + Input is signed integer data in an ASCII file. +
    +     TEXTUIN + Input is unsigned integer data in an ASCII file. +
    +     TEXTFP + Input is floating point data in fixed notation (e.g., 325.34) + in an ASCII file. +
    +     TEXTFPE + Input is floating point data in scientific notation (e.g., 3.2534E+02) + in an ASCII file.
    + (Not implemented in this release.) +
    +     IN + Input is signed integer data in a binary file. +
    +     UIN + Input is unsigned integer data in a binary file. +
    +     FP + Input is floating point data in a binary file. (Default) +
    +     STR + Input is character data in an ASCII file. + With this value, the configuration keywords + RANK, DIMENSION-SIZES, + OUTPUT-CLASS, OUTPUT-SIZE, + OUTPUT-ARCHITECTURE, and OUTPUT-BYTE-ORDER + will be ignored.
    + (Not implemented in this release.) +
    +
    INPUT-SIZE +

    An integer denoting the size of the input data, in bits. +
    +     8
    +     16
    +     32
    +     64 +
    For signed and unsigned integer data: + TEXTIN, TEXTUIN, + IN, or UIN. + (Default: 32) +
    +     32
    +     64 +
    For floating point data: + TEXTFP, TEXTFPE, + or FP. + (Default: 32) +
    - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
         - h5import -h
    - or
    - h5import -help
    -
    prints the h5import usage summary
         - - h5import -h[elp], OR
    +
    +
    OUTPUT-CLASS   +

    A string denoting the type of output data. +
    +     IN + Output is signed integer data.
    + (Default if INPUT-CLASS is + IN or TEXTIN) +
    +     UIN + Output is unsigned integer data.
    + (Default if INPUT-CLASS is + UIN or TEXTUIN) +
    +     FP + Output is floating point data.
    + (Default if INPUT-CLASS is not specified or is + FP, TEXTFP, or TEXTFPE) +
    +     STR + Output is character data, + to be written as a 1-dimensional array of strings.
    + (Default if INPUT-CLASS is STR)
    + (Not implemented in this release.) +
    +
    OUTPUT-SIZE +

    An integer denoting the size of the output data, in bits. +
    +     8
    +     16
    +     32
    +     64 +
    For signed and unsigned integer data: + IN or UIN. + (Default: Same as INPUT-SIZE, else 32) +
    +     32
    +     64 +
    For floating point data: + FP. + (Default: Same as INPUT-SIZE, else 32) +
    +
    OUTPUT-ARCHITECTURE +

    A string denoting the type of output architecture. +
    +     NATIVE
    +     STD
    +     IEEE
    +     INTEL *
    +     CRAY *
    +     MIPS *
    +     ALPHA *
    +     UNIX * +
    See the "Predefined Atomic Types" section + in the "HDF5 Datatypes" chapter + of the HDF5 User's Guide + for a discussion of these architectures.
    + Values marked with an asterisk (*) are not implemented in this release.
    + (Default: NATIVE) +
    +
    OUTPUT-BYTE-ORDER +

    A string denoting the output byte order. + This entry is ignored if the OUTPUT-ARCHITECTURE + is not specified or if it is not specified as IEEE, + UNIX, or STD. +
    +     BE + Big-endian. (Default) +
    +     LE + Little-endian. +
    +
    The following options are disabled by default; + the default storage properties are therefore no chunking, no compression, + no external storage, and no extendible dimensions. +
    +
    CHUNKED-DIMENSION-SIZES
    +

    Dimension sizes of the chunk for chunked output data. +
    +     chunk_dims + A string of space-separated integers specifying the + dimension sizes of the chunk for chunked output data. + The number of dimensions must correspond to the value + of RANK.
    + The presence of this field indicates that the + output dataset is to be stored in chunked layout; + if this configuration field is absent, + the dataset will be stored in contiguous layout. +
    +
    COMPRESSION-TYPE +

    Type of compression to be used with chunked storage. + Requires that CHUNKED-DIMENSION-SIZES + be specified. +
    +     GZIP + Gzip compression.
    + Other compression algorithms are not implemented + in this release of h5import. +
    +
    COMPRESSION-PARAM +

    Compression level. + Required if COMPRESSION-TYPE is specified. +
    +     1 through 9 + Gzip compression levels: + 1 will result in the fastest compression + while 9 will result in the + best compression ratio.
    + (Default: 6. This default applies to gzip compression; + not all compression methods will have a default level.) +
    +
    EXTERNAL-STORAGE +

    Name of an external file in which to create the output dataset. + Cannot be used with CHUNKED-DIMENSION-SIZES, + COMPRESSION-TYPE, or MAXIMUM-DIMENSIONS. +
    +     external_file        + + + + A string specifying the name of an external file. +
    +
    MAXIMUM-DIMENSIONS +

    Maximum sizes of all dimensions. + Requires that CHUNKED-DIMENSION-SIZES be specified. +
    +     max_dims + A string of space-separated integers specifying the + maximum size of each dimension of the output dataset. + A value of -1 for any dimension implies + unlimited size for that particular dimension.
    + The number of dimensions must correspond to the value + of RANK.
    +


    + +

    + The help option:
    + The help option, expressed as one of + + + + - -
         + h5import -h
    + or
    + h5import -help
    +
    prints the h5import usage summary
         + + h5import -h[elp], OR
    h5import <infile> <options> - [<infile> <options>...] - -o[utfile] <outfile>
    -
    then exits.
    -

    - + [<infile> <options>...] + -o[utfile] <outfile> + + then exits. + +

    +

    Options and Parameters:
    infile(s)
    Name of the Input file(s).
    in_options
    Input options. Note that while only the -dims argument - is required, arguments must used in the order in which they are listed below. -
    -
    -d dim_list -
    -dims dim_list -
    Input data dimensions. - dim_list is a string of - comma-separated numbers with no spaces - describing the dimensions of the input data. - For example, a 50 x 100 2-dimensional array would be - specified as -dims 50,100.
    - Required argument: if no configuration file is used, - this command-line argument is mandatory. -
    -p pathname -
    -pathname pathname -
    pathname is a string consisiting of - one or more strings separated by '/' specifying the path - of the dataset in the output file. - If the groups in the path do no exist, they will be created.
    - Optional argument: if not specified, - the default path is /dataset1.
    - h5import does not check for a pre-existing dataset - of the specified or default name; it overwrites any such dataset - without offering an opportunity to preserve it. -
    -t input_class -
    -type input_class -
    input_class specifies the class of the - input data and determines the class of the output data.
    - Valid values are as defined in the "ZZZinput_classZZZ" - section of "ZZZconfig_fileZZZ".
    - Optional argument: if not specified, - the default value is FP. -
    -s input_size -
    -size input_size -
    input_size specifies the size in bits - of the input data and determines the size of the output data.
    - Valid values for signed or unsigned integers are - 8, 16, 32, and 64.
    - Valid values for floating point data are - 32 and 64.
    - Optional argument: if not specified, - the default value is 32. -
    -c config_file -
    config_file specifies a configuration file.
    - This argument replaces all other arguments except - infileand -o outfile -
    + is required, arguments must be used in the order in which they are listed below. +
    +
    -d dim_list +
    -dims dim_list +
    Input data dimensions. + dim_list is a string of + comma-separated numbers with no spaces + describing the dimensions of the input data. + For example, a 50 x 100 2-dimensional array would be + specified as -dims 50,100.
    + Required argument: if no configuration file is used, + this command-line argument is mandatory. +
    -p pathname +
    -pathname pathname +
    pathname is a string consisting of + one or more strings separated by slashes (/) + specifying the path of the dataset in the output file. + If the groups in the path do not exist, they will be + created.
    + Optional argument: if not specified, + the default path is + dataset1 for the first input dataset, + dataset2 for the second input dataset, + dataset3 for the third input dataset, + etc.
    + h5import does not overwrite a pre-existing + dataset of the specified or default name. + When an existing dataset of a conflicting name is + encountered, h5import quits with an error; + the current input file and any subsequent input files + are not processed.
    -t input_class +
    -type input_class +
    input_class specifies the class of the + input data and determines the class of the output data.
    + Valid values are as defined in the Keyword/Values table + in the section "The configuration file" above.
    + Optional argument: if not specified, + the default value is FP. +
    -s input_size +
    -size input_size +
    input_size specifies the size in bits + of the input data and determines the size of the output data.
    + Valid values for signed or unsigned integers are + 8, 16, 32, and 64.
    + Valid values for floating point data are + 32 and 64.
    + Optional argument: if not specified, + the default value is 32. +
    -c config_file +
    config_file specifies a + configuration file.
    + This argument replaces all other arguments except + infile and + -o outfile +
    outfile
    Name of the HDF5 output file.
    Examples:
    Using command-line arguments: - - -
    - h5import infile -dims 2,3,4 -type TEXTIN -size 32 -o out1 -
         - This command creates a file out1 containing - a single 2x3x4 32-bit integer dataset. - Since no pathname is specified, the dataset is stored - in out1 as /dataset1. -
    - h5import infile -dims 20,50 -path bin1/dset1 -type FP -size 64 -o out2 -
         - This command creates a file out2 containing - a single a 20x50 64-bit floating point dataset. - The dataset is stored in out2 as /bin1/dset1. -
    + + +
    + h5import infile -dims 2,3,4 -type TEXTIN -size 32 -o out1 +
         + This command creates a file out1 containing + a single 2x3x4 32-bit integer dataset. + Since no pathname is specified, the dataset is stored + in out1 as /dataset1. +
    + h5import infile -dims 20,50 -path bin1/dset1 -type FP -size 64 -o out2 +
         + This command creates a file out2 containing + a single 20x50 64-bit floating point dataset. + The dataset is stored in out2 as /bin1/dset1. +
    Sample configuration files:
    - The following configuration file specifies the following:
    - – The input data is a 5x2x4 floating point array in an ASCII file.
    - – The output dataset will be saved in chunked layout, - with chunk dimension sizes of 2x2x2.
    - – The output datatype will be 64-bit floating point, little-endian, IEEE.
    - – The output dataset will be stored in outfile - at /work/h5/pkamat/First-set.
    - – The maximum dimension sizes of the output dataset - will be 8x8x(unlimited). -
    -            PATH work h5 pkamat First-set
    +        The following configuration file specifies the following:
    + – The input data is a 5x2x4 floating point array in + an ASCII file.
    + – The output dataset will be saved in chunked layout, + with chunk dimension sizes of 2x2x2.
    + – The output datatype will be 64-bit floating point, + little-endian, IEEE.
    + – The output dataset will be stored in + outfile + at /work/h5/pkamat/First-set.
    + – The maximum dimension sizes of the output dataset + will be 8x8x(unlimited). +
    +            PATH work/h5/pkamat/First-set
                 INPUT-CLASS TEXTFP
                 RANK 3
                 DIMENSION-SIZES 5 2 4
    @@ -1188,31 +1326,33 @@ to convert files from HDF4 format to HDF5 format and vice versa.
                 OUTPUT-SIZE 64
                 OUTPUT-ARCHITECTURE IEEE
                 OUTPUT-BYTE-ORDER LE
    -            CHUNKED-DIMENSION 2 2 2 
    +            CHUNKED-DIMENSION-SIZES 2 2 2 
                 MAXIMUM-DIMENSIONS 8 8 -1
    -	
    - +
    + The next configuration file specifies the following:
    - – The input data is a 6x3x5x2x4 integer array in a binary file.
    - – The output dataset will be saved in chunked layout, - with chunk dimension sizes of 2x2x2x2x2.
    - – The output datatype will be 32-bit integer in NATIVE format - (as the output architecure is not specified).
    + – The input data is a 6x3x5x2x4 integer array in + a binary file.
    + – The output dataset will be saved in chunked layout, + with chunk dimension sizes of 2x2x2x2x2.
    + – The output datatype will be 32-bit integer in + NATIVE format + (as the output architecture is not specified).
    – The output dataset will be compressed using Gzip compression - with a compression level of 7.
    - – The output dataset will be stored in outfile - at /Second-set. -
    +                with a compression level of 7.
    + – The output dataset will be stored in + outfile at /Second-set. +
                 PATH Second-set
    -            INPUT-CLASS IN	
    +            INPUT-CLASS IN
                 RANK 5
                 DIMENSION-SIZES 6 3 5 2 4
                 OUTPUT-CLASS IN
                 OUTPUT-SIZE 32
    -            CHUNKED-DIMENSION 2 2 2 2 2
    +            CHUNKED-DIMENSION-SIZES 2 2 2 2 2
                 COMPRESSION-TYPE GZIP
                 COMPRESSION-PARAM 7
    -	
    +
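    + As an illustrative sketch (the input and configuration file names are
    + hypothetical), the two configuration files above could be applied to two
    + input files in a single call:
    +     h5import data1.txt -c config1 data2.bin -c config2 -o outfile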
    + +
    Purpose: +
    Tests Parallel HDF5 performance. +
    Description: +
    h5perf provides tools for testing the performance + of the Parallel HDF5 library. +

    + The following environment variables affect + h5perf behavior: + + + +
         + HDF5_NOCLEANUP + If set, h5perf does not remove data files. + (Default: Remove)
      + HDF5_MPI_INFO + Must be set to a string containing a list of semi-colon separated + key=value pairs for the MPI INFO object.
    + Example:
      + HDF5_PARAPREFIX   + Sets the prefix for parallel output data files.
    +
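    + As a hypothetical illustration (the values shown are placeholders and are
    + site-dependent), these variables could be set in the shell before
    + launching h5perf:
    +     export HDF5_PARAPREFIX=/scratch/username
    +     export HDF5_NOCLEANUP=yes
    +     export HDF5_MPI_INFO="key1=value1;key2=value2"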

    Options and Parameters: +
    +
    These terms are used as follows in this section: + + + + +
         + file   + A filename
      + size + A size specifier, expressed as an integer greater than or equal + to 0 (zero) followed by a size indicator:
    +     K for kilobytes (1024 bytes)
    +     M for megabytes (1048576 bytes)
    +     G for gigabytes (1073741824 bytes)
    + Example: 37M specifies 37 megabytes or 38797312 bytes.
      + N + An integer greater than or equal to 0 (zero)
    +

    +

    -h, --help +
    Prints a usage message and exits. +
    -a size, --align=size +
    Specifies the alignment of objects in the HDF5 file. + (Default: 1) +
    -A api_list, --api=api_list +
    Specifies which APIs to test. + api_list is a comma-separated list with the + following valid values: + + + + +
         + phdf5  Parallel HDF5
      + mpiio  MPI-I/O
      + posix  POSIX
    + (Default: All APIs)

    + For example, --api=mpiio,phdf5 specifies that the + MPI-I/O and Parallel HDF5 APIs are to be monitored.

    +
    -B size, --block-size=size +
    Specifies the block size within the transfer buffer. + (Default: 128K)

    + Block size versus transfer buffer size: + The transfer buffer size is the size of a + buffer in memory. The data in that buffer is broken + into block size pieces and written to the + file.

    + The transfer buffer size is set by the + -x (or --min-xfer-size) and + -X (or --max-xfer-size) + options.
    + The pattern in which the blocks + are written to the file is described in the discussion + of the -I (or --interleaved) + option.

    +
    -c, --chunk +
    Creates HDF5 datasets in chunked layout. + (Default: Off) +
    -C, --collective +
    Use collective I/O for the MPI I/O and Parallel HDF5 APIs.
    + (Default: Off, i.e., independent I/O)

    + If this option is set and the MPI-I/O and PHDF5 APIs + are in use, all the blocks in each transfer buffer + will be written at once with an MPI derived type. +

    +
    -d N, --num-dsets=N +
    Sets the number of datasets per file. + (Default: 1) +
    -D debug_flags, --debug=debug_flags +
    Sets the debugging level. + debug_flags is a comma-separated list of + debugging flags with the following valid values: + + + + + + + + +
         + 1  Minimal debugging
      + 2  Moderate debugging (“not quite everything”)
      + 3  Extensive debugging (“everything”)
      + 4  All possible debugging (“the kitchen sink”)
      + r  Raw data I/O throughput information
      + t  Times, in addition to throughputs
      + v  Verify data correctness
    + (Default: No debugging)

    + Example: --debug=2,r,t specifies a moderate level of debugging + combined with collection of raw data I/O throughput information + and verification of data correctness.

    +
    -e size, --num-bytes=size +
    Specifies the number of bytes per process per dataset. + (Default: 256K) +
    -F N, --num-files=N +
    Specifies the number of files. + (Default: 1) +
    -i N, --num-iterations=N +
    Sets the number of iterations to perform. + (Default: 1) +
    -I, --interleaved +
    Sets interleaved block I/O.
    + (Default: Contiguous block I/O)

    + Interleaved vs. Contiguous blocks + in a parallel environment:
    + When contiguous blocks are written to a dataset, + the dataset is divided into m regions, + where m is the number of processes writing + separate portions of the dataset. Each process + then writes data to its own region. + When interleaved blocks are written to a dataset, + space for the first block of the first process is + allocated in the dataset, then space is allocated + for the first block of the second process, etc., + until space has been allocated for the first block + of each process. Space is then allocated for + the second block of the first process, + the second block of the second process, etc.

    + For example, in the case of a 4 process run + with 1M bytes-per-process, 256K transfer buffer size, + and 64KB block size, 16 contiguous blocks + per process would be written to the file in the + manner
    +     1111111111111111222222222222222233333333333333334444444444444444
    + while 16 interleaved blocks per process would be + written to the file as +     1234123412341234123412341234123412341234123412341234123412341234
    + If collective I/O is turned on, all of the four + blocks per transfer buffer will be written in + one collective I/O call.

    +
    -m, --mpi-posix +
    Sets use of MPI-posix driver for HDF5 I/O. + (Default: MPI-I/O driver) +
    -n, --no-fill +
    Specifies to not write fill values to HDF5 datasets. + This option is supported only in HDF5 Release v1.6 or later.
    + (Default: Off, i.e., write fill values) +
    -o file, --output=file +
    Sets the output file for raw data to file. + (Default: None) +
    -p N, --min-num-processes=N +
    Sets the minimum number of processes to be used. + (Default: 1) +
    -P N, --max-num-processes=N +
    Sets the maximum number of processes to be used.
    + (Default: All MPI_COMM_WORLD processes) +
    -T size, --threshold=size +
    Sets the threshold for alignment of objects in the HDF5 file. + (Default: 1) +
    -w, --write-only +
    Performs only write tests, not read tests. + (Default: Read and write tests) +
    -x size, --min-xfer-size=size +
    Sets the minimum transfer buffer size. + (Default: 128K) +
    -X size, --max-xfer-size=size +
    Sets the maximum transfer buffer size. + (Default: 1M) +
    + +
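    + The following invocation is an illustrative sketch only; the MPI launch
    + command and all values are hypothetical and site-dependent. It tests the
    + Parallel HDF5 and MPI-I/O APIs with 1 megabyte per process per dataset,
    + a 256K transfer buffer divided into 64K blocks, and up to 4 processes:
    +     mpirun -np 4 h5perf --api=phdf5,mpiio -e 1M -x 256K -X 256K -B 64K -P 4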
    + + +
    +
    +
    Tool Name: h5redeploy +
    Syntax: +
    h5redeploy + [help | -help] +
    h5redeploy + [-echo] + [-force] + [-prefix=dir] + [-tool=tool] + [-show] +
    Purpose: +
    Updates HDF5 compiler tools after an HDF5 software installation + in a new location. +
    Description: +
    h5redeploy updates the HDF5 compiler tools after + the HDF5 software has been installed in a new location. + +
    Options and Parameters: +
    +
    help, -help +
    Prints a help message. +
    -echo +
    Shows all the shell commands executed. +
    -force +
    Performs the requested action without first prompting + for confirmation. +
    -prefix=dir +
    Specifies a new directory in which to find the + HDF5 subdirectories lib/ and + include/.
    + (Default: current working directory) +
    -tool=tool +
    Specifies the tool to update. + tool must be in the current directory + and must be writable.
    + (Default: h5cc) +
    -show +
    Shows all of the shell commands to be executed + without actually executing them. +
    + +
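    + For example (the installation directory is hypothetical), the following
    + calls, issued from the directory containing the h5cc script, first preview
    + and then apply the update after HDF5 has been installed under /opt/hdf5:
    +     h5redeploy -prefix=/opt/hdf5 -show
    +     h5redeploy -prefix=/opt/hdf5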
    + + +
    +
    Tool Name: h5cc
    Syntax:
    h5cc @@ -1537,7 +1950,7 @@ to convert files from HDF4 format to HDF5 format and vice versa.
    Description:
    h5cc can be used in much the same way MPIch is used to compile an HDF5 program. It takes care of specifying where the - HDF5 header files and libraries are on the commandline. + HDF5 header files and libraries are on the command line.

    h5cc supercedes all other compiler scripts in that if you've used them to compile the HDF5 library, then @@ -1571,9 +1984,9 @@ to convert files from HDF4 format to HDF5 format and vice versa.

    Show all the shell commands executed.
    -prefix=DIR
    Use the directory DIR to find the HDF5 - lib/ and include/ subdirectories. + lib/ and include/ subdirectories.
    - Default: prefix specified when configuring HDF5. + Default: prefix specified when configuring HDF5.
    -show
    Show the commands without executing them.
    -shlib @@ -1582,7 +1995,7 @@ to convert files from HDF4 format to HDF5 format and vice versa.
    Compile using static HDF5 libraries [default].
    <compile line>
    The normal compile line options for your compiler. - h5cc uses the same compiler you used to compile HDF5. + h5cc uses the same compiler you used to compile HDF5. Check your compiler's manual for more information on which options are needed.
    @@ -1619,7 +2032,7 @@ to convert files from HDF4 format to HDF5 format and vice versa.

    h5fc can be used in much the same way MPIch is used to compile an HDF5 program. It takes care of specifying where the - HDF5 header files and libraries are on the commandline. + HDF5 header files and libraries are on the command line.

    h5fc supercedes all other compiler scripts in that if you've used them to compile the HDF5 Fortran library, then @@ -1638,7 +2051,7 @@ to convert files from HDF4 format to HDF5 format and vice versa. An example of how to use h5fc to compile the program hdf_prog, which consists of modules prog1.f90 and prog2.f90 - and uses the HDF5 Fortran library, would be as follows: + and uses the HDF5 Fortran library, would be as follows:

             # h5fc -c prog1.f90
             # h5fc -c prog2.f90
    @@ -1653,16 +2066,16 @@ to convert files from HDF4 format to HDF5 format and vice versa.
                 
    Show all the shell commands executed.
    -prefix=DIR
    Use the directory DIR to find HDF5 - lib/ and include/ subdirectories -
    + lib/ and include/ subdirectories +
    Default: prefix specified when configuring HDF5.
    -show
    Show the commands without executing them.
    <compile line>
    The normal compile line options for your compiler. - h5fc uses the same compiler you used + h5fc uses the same compiler you used to compile HDF5. Check your compiler's manual for - more information on which options are needed. + more information on which options are needed.
    Environment Variables:
    When set, these environment variables override some of the built-in @@ -1682,6 +2095,79 @@ to convert files from HDF4 format to HDF5 format and vice versa. +
    +
    +
    Tool Name: h5c++ +
    Syntax: +
    h5c++ + [OPTIONS] <compile line> +
    Purpose: +
    Helper script to compile HDF5 C++ applications. +
    Description: +

    + h5c++ can be used in much the same way MPIch is used + to compile an HDF5 program. It takes care of specifying where the + HDF5 header files and libraries are on the command line. +

    + h5c++ supercedes all other compiler scripts in that + if you've used one set of compiler scripts to compile the + HDF5 C++ library, then h5c++ uses those same scripts. + For example, when compiling an MPIch program, + you use the mpiCC script. +

    + Some programs use HDF5 in only a few modules. It isn't necessary + to use h5c++ to compile those modules which don't use + HDF5. In fact, since h5c++ is only a convenience + script, you are still able to compile HDF5 C++ modules in the + normal way. In that case, you will have to specify the HDF5 libraries + and include paths yourself. +

    + An example of how to use h5c++ to compile the program + hdf_prog, which consists of modules + prog1.cpp and prog2.cpp + and uses the HDF5 C++ library, would be as follows: +

    +        # h5c++ -c prog1.cpp
    +        # h5c++ -c prog2.cpp
    +        # h5c++ -o hdf_prog prog1.o prog2.o
    +
    Options and Parameters: +
    +
    +
    -help +
    Prints a help message. +
    -echo +
    Show all the shell commands executed. +
    -prefix=DIR +
    Use the directory DIR to find HDF5 + lib/ and include/ subdirectories +
    + Default: prefix specified when configuring HDF5. +
    -show +
    Show the commands without executing them. +
    <compile line> +
    The normal compile line options for your compiler. + h5c++ uses the same compiler you used + to compile HDF5. Check your compiler's manual for + more information on which options are needed. +
    +
    Environment Variables: +
    When set, these environment variables override some of the built-in + defaults of h5c++. +
    +
    HDF5_CXX +
    Use a different C++ compiler. +
    HDF5_CXXLINKER +
    Use a different linker. +
    + +
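    + As an illustrative sketch (the compiler name is hypothetical and
    + system-dependent), an override can be supplied for a single invocation:
    +     HDF5_CXX=xlC h5c++ -c prog1.cpp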
    + + -Last modified: 30 May 2003 +Last modified: 11 June 2003 -- cgit v0.12