path: root/tools/h5repack/h5repack.sh.in
Commit history (message, author, date, files changed, lines removed/added):
* [svn-r17053] merge 17052 from trunk (Pedro Vicente Nunes, 2009-06-15, 1 file, -0/+38)
  Add a run to the h5repack shell script that reads a family file. The input file is located in the common tools testfiles directory, tools/testfiles, so the h5repack shell script was modified to read files from this location as well (h5repack normally reads its input files from a dedicated testfiles location in h5repack/testfiles). Changed the h5diff open file call to use h5tools_fopen, so that it can open all file drivers.
  Tested: linux
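  For reference, a minimal sketch of opening a family file through the HDF5 C API, which is the kind of driver handling h5tools_fopen wraps. This is illustrative only (not the tools-library code), and the file name pattern is hypothetical:

      #include <stdio.h>
      #include "hdf5.h"

      int main(void)
      {
          hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

          /* H5F_FAMILY_DEFAULT lets the library pick up the member size
           * recorded in the existing family of files. */
          H5Pset_fapl_family(fapl, H5F_FAMILY_DEFAULT, H5P_DEFAULT);

          /* The "%05d" in the name is expanded to the family member index. */
          hid_t fid = H5Fopen("test_family%05d.h5", H5F_ACC_RDONLY, fapl);
          if (fid < 0)
              printf("could not open the family file\n");
          else {
              printf("family file opened\n");
              H5Fclose(fid);
          }
          H5Pclose(fapl);
          return 0;
      }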
* [svn-r16802] Purpose: Fix bug 1516 (Neil Fortner, 2009-04-20, 1 file, -1/+8)
  Description: h5repack previously would not take named datatypes into consideration when copying datasets and attributes. This would cause extra anonymous datatypes in the target file at best, and cause errors halfway through the repacking at worst. h5repack should now always handle named datatypes correctly. Named datatypes are also now converted to the native type when -n is given.
  Tested: jam, linew, smirom (h5committest)
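  A small illustrative sketch of the case the fix above handles: a copy routine can use H5Tcommitted() to tell whether a dataset's datatype is a named (committed) datatype, which then has to be handled differently from a transient one. This is not the h5repack source; file and dataset names are hypothetical, and the 1.8-style H5Dopen2 call is used:

      #include <stdio.h>
      #include "hdf5.h"

      int main(void)
      {
          hid_t  fid = H5Fopen("in.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
          hid_t  did = H5Dopen2(fid, "/dset", H5P_DEFAULT);
          hid_t  tid = H5Dget_type(did);
          htri_t is_named = H5Tcommitted(tid);   /* >0 if the datatype is committed */

          if (is_named > 0)
              printf("dataset uses a named datatype; handle it as a named type in the output\n");
          else
              printf("dataset uses a transient datatype; copy it directly\n");

          H5Tclose(tid);
          H5Dclose(did);
          H5Fclose(fid);
          return 0;
      }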
* [svn-r16641] merge from trunk revs 16614, 16629 (Pedro Vicente Nunes, 2009-04-01, 1 file, -0/+3)
  1. #1501 (B1): tools bug if a dataset element is larger than the H5TOOLS_BUFSIZE limit.
  ISSUE: the tools use the following formula to read by hyperslabs: hyperslab_size[i] = MIN(dim_size[i], H5TOOLS_BUFSIZE / datum_size), where H5TOOLS_BUFSIZE is a constant defined as 1024K. This is fine as long as datum_size does not exceed 1024K; otherwise the hyperslab size becomes 0 (since 1024K / (anything greater than 1024K) = 0 in integer division). This affects h5dump, h5repack, and h5diff.
  SOLUTION: add a check for a 0 size and define it as 1 in that case.
  TEST FOR H5DUMP: defined a case in the h5dump test generator program with such a type (an array type of doubles with a large array dimension, which is the case the user reported). Since a written file committed to svn would be around 1024K, opted not to write the data (the part of the code where the hyperslab is defined is still executed, since h5dump always reads the files). Defined a macro WRITE_ARRAY to enable such writing if needed. Added a run to the h5dump shell script. Added 2 new files to svn: tools/testfiles/tarray8.ddl, tools/testfiles/tarray8.h5. NOTE: considered adding this dataset case to an existing file instead, but that would add the large array output to those files (the ddls), and the file list is already increasing.
  TEST FOR H5DIFF: for h5diff the threshold for reading by hyperslabs is H5TOOLS_MALLOCSIZE (128 * H5TOOLS_BUFSIZE), or 128 MB. That makes it not possible to add such a file to svn, so the same method as h5dump was used (only write the dataset if WRITE_ARRAY is defined). As opposed to h5dump, the hyperslab code is NOT executed when the dataset is empty (the dataset is not read). Added the new dataset to existing files and a shell run (tools/h5diff/testfiles/h5diff_dset1.h5 and tools/h5diff/testfiles/h5diff_dset2.h5, output in tools/h5diff/testfiles/h5diff_80.txt).
  TEST FOR H5REPACK: similar issue as h5diff, with the difference that the hyperslab code is run. Added a run to the shell script (with a filter, otherwise the code uses H5Ocopy).
  FURTHER ISSUES: the type in question ("double") has different output across platforms (e.g., on liberty some garbage number is printed at some array locations). SOLUTION: defined an "int" type for this test. However, printing such an array still had bogus output on at least one platform (FreeBSD), so the test run was eliminated altogether and a bug report was filed on this.
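  A compact sketch of the buffer-sizing logic and the zero-size guard described above. It is not the tools-library code; the variable names are illustrative and the 1024K value is taken from the commit message:

      #include <stdio.h>

      #define H5TOOLS_BUFSIZE (1024 * 1024)          /* 1024K, per the commit message */
      #define MIN(a, b)       ((a) < (b) ? (a) : (b))

      int main(void)
      {
          unsigned long long dim_size[1] = { 1000 };
          unsigned long long datum_size  = 2 * 1024 * 1024;  /* element larger than the buffer */
          unsigned long long hyperslab_size[1];

          /* Original formula: integer division yields 0 when datum_size > H5TOOLS_BUFSIZE */
          hyperslab_size[0] = MIN(dim_size[0], H5TOOLS_BUFSIZE / datum_size);

          /* The fix: never let the hyperslab collapse to zero elements */
          if (hyperslab_size[0] == 0)
              hyperslab_size[0] = 1;

          printf("hyperslab size = %llu element(s)\n", hyperslab_size[0]);
          return 0;
      }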
* [svn-r16402] (Quincey Koziol, 2009-02-03, 1 file, -0/+5)
  Description: Bring r16401 back from trunk: correct error introduced in r16353 with layout version, and add a test so it gets caught earlier.
  Tested on: FreeBSD/32 6.3 (duty). Too minor to require h5committest.
* [svn-r15871] Eliminate the -c option, make that behavior the default, and return 2 instead of -1 as the error status (Pedro Vicente Nunes, 2008-10-15, 1 file, -2/+2)
  Tested: windows, linux
* [svn-r15750] move h5repack test files to /tools/h5repack/testfiles (Pedro Vicente Nunes, 2008-10-01, 1 file, -6/+6)
  Tested: linux
* [svn-r15558] (Pedro Vicente Nunes, 2008-08-29, 1 file, -1/+4)
  1) There are 2 new command line parameters for h5repack. Together they allow a call to H5Pset_alignment to be made (see the sketch below):
     -t T, --threshold=T   Threshold value for H5Pset_alignment
     -a A, --alignment=A   Alignment value for H5Pset_alignment
  2) Bug fix: the printing of the dataset name was not done for references (verbose mode).
  Tested: windows, linux
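  A minimal sketch of what the -t/-a pair maps to on the library side, assuming the values end up in a call to H5Pset_alignment on a file access property list; the file name and the 1 KB / 4 KB values are arbitrary examples:

      #include "hdf5.h"

      int main(void)
      {
          hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

          /* Any object of 1 KB or more gets aligned on a 4 KB boundary in the file */
          H5Pset_alignment(fapl, (hsize_t)1024, (hsize_t)4096);

          hid_t fid = H5Fcreate("out.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

          H5Fclose(fid);
          H5Pclose(fapl);
          return 0;
      }

  On the command line this would correspond to something like "h5repack -t 1024 -a 4096 in.h5 out.h5" (hypothetical file names).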
* [svn-r15533] #1184 (Pedro Vicente Nunes, 2008-08-26, 1 file, -0/+5)
  Add a userblock to an HDF5 file during the repack. The user gives a filename and a userblock size as command line parameters to h5repack, and the contents of that file are stored in the userblock of the HDF5 file created by h5repack. New flags -u and -b handle this.
  Tested: windows, linux
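  A sketch of the mechanism the userblock size presumably maps to: reserving a userblock by setting its size on the file creation property list. This is not the h5repack implementation; copying the bytes of the -u file into the reserved space would be plain file I/O done outside the HDF5 API, and the file name and size are arbitrary examples:

      #include "hdf5.h"

      int main(void)
      {
          hid_t fcpl = H5Pcreate(H5P_FILE_CREATE);

          /* Userblock size must be 0 or a power of two of at least 512 bytes */
          H5Pset_userblock(fcpl, (hsize_t)2048);

          hid_t fid = H5Fcreate("out.h5", H5F_ACC_TRUNC, fcpl, H5P_DEFAULT);

          H5Fclose(fid);
          H5Pclose(fcpl);
          return 0;
      }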
* [svn-r15433] http://bugzilla.hdfgroup.uiuc.edu/show_bug.cgi?id=1170 (Pedro Vicente Nunes, 2008-08-05, 1 file, -2/+2)
  Summary: when using h5diff to compare the results of h5repack (or other tools that copy one HDF5 file to another), a new option is needed to allow h5diff to make an "absolute" comparison of the 2 files. This is the "contents" mode explained in the usage. If this mode is present, objects in both files must match (must be exactly the same). If this does not happen, the tool returns an error code of 1 (instead of the success code of 0).
  Changes to the h5repack test script: the call to h5diff was changed to include -c (maintaining the previous -q).
  Tested: windows, linux
* [svn-r15053] Add a test for the 1.6.7 "-i infile -o outfile" syntax for backward compatibility (Pedro Vicente Nunes, 2008-05-21, 1 file, -0/+37)
  Used the TOOLTEST function from 1.6.7.
  Tested: linux
* [svn-r15024] backward compatibility for the old "-i infile -o outfile" options (Pedro Vicente Nunes, 2008-05-16, 1 file, -2/+9)
  If these are detected, that syntax is used; otherwise the syntax shown in the usage is used. There was another -i option:
     -i L2, --indexed=L2   Minimum number of links in the indexed format
  That was changed to -d:
     -d L2, --indexed=L2   Minimum number of links in the indexed format
  Tested: windows, linux
* [svn-r14309] bug fix: add a check for the presence of a filter in the call to the test of long switch names; add a test for multiple global filters (Pedro Vicente Nunes, 2007-11-29, 1 file, -4/+15)
  Tested: linux, solaris
* [svn-r14308] new feature test: add a test for the long switch names (Pedro Vicente Nunes, 2007-11-29, 1 file, -1/+9)
  Tested: linux, solaris
* [svn-r14274] new feature: add some new syntax tests for h5repack (Pedro Vicente Nunes, 2007-11-20, 1 file, -2/+2)
  Tested: linux
* [svn-r14264] new features: 1) new usage for h5diff, "a la" h5dump format; 2) new usage for h5repack and new command line parsing using the tools library parsing code (Pedro Vicente Nunes, 2007-11-16, 1 file, -2/+2)
  Tested: windows, linux, solaris
* [svn-r14260] bug fix: modified the h5repack script to call h5diff with its brand new syntax (Pedro Vicente Nunes, 2007-11-14, 1 file, -2/+2)
  Tested: linux
* [svn-r14158] (Quincey Koziol, 2007-09-27, 1 file, -23/+13)
  Description: Add regression test for h5repack with userblock.
  Tested on:
     FreeBSD/32 6.2 (duty) in debug mode
     FreeBSD/64 6.2 (liberty) w/C++ & FORTRAN, in debug mode
     Linux/32 2.6 (kagiso) w/PGI compilers, w/C++ & FORTRAN, w/threadsafe, in debug mode
     Linux/64 2.6 (smirom) w/default API=1.6.x, w/C++ & FORTRAN, in production mode
     Solaris/32 2.10 (linew) w/deprecated symbols disabled, w/C++ & FORTRAN, w/szip filter, in production mode
     AIX/32 5.3 (copper) w/FORTRAN, w/parallel, in production mode
     Mac OS X/32 10.4.10 (amazon) in debug mode
* [svn-r13451] (Pedro Vicente Nunes, 2007-03-05, 1 file, -1/+1)
  h5diff bug fix: attribute differences were not being counted toward the total differences. Revision of the H5Ocopy call in h5repack.
* [svn-r13441] (Pedro Vicente Nunes, 2007-03-01, 1 file, -1/+1)
  Make 1.7 h5repack more similar to 1.6 h5repack.
* [svn-r13261] Updated copyright notices (Albert Cheng, 2007-02-07, 1 file, -2/+3)
  Tested: visual inspection, as they are all just comments.
* [svn-r12917] (Pedro Vicente Nunes, 2006-11-15, 1 file, -51/+72)
  h5repack revision:
  1. Added a new test due to the introduction of H5Ocopy in the copy of objects (a compressed dataset with references, which must still go through a second sweep of the file to be regenerated).
  2. Moved all the source files from the h5repack test program to a new file, h5repacktst.c, and removed the old ones (testh5repack*.c).
  3. Renamed the binary files from test*.h5 to h5repack*.h5 for easy reference.
  4. Modified the shell script to use variables for file names instead of hard-coded names.
* [svn-r12815] (Pedro Vicente Nunes, 2006-10-25, 1 file, -1/+4)
  1) Added a new parameter to the h5diff function diff_array that contains the beginning position of the hyperslab, so that the total position in the array is printed correctly when reading by hyperslabs.
  2) Added a new test to h5diff that reads and diffs by hyperslabs. The test reads a 1 GB dataset, into which a 1 KB hyperslab with differences was written.
  3) Added the generation of 2 files to the generator program to test the h5diff hyperslab read.
  4) Changed the h5diff pre-generated binary file names to be more descriptive (e.g., instead of file1.h5, h5diff_basic1.h5).
  5) Changed the name of the h5repack options text file to info.h5repack.
* [svn-r12784] (Pedro Vicente Nunes, 2006-10-19, 1 file, -0/+4)
  Fixes for bugs 676 and 228.
  676: both h5repack and h5diff use H5Dread. In the case of a "big" dataset, read/write by hyperslabs, the same way h5dump does. An arbitrary value of 1 GB was defined for "big", i.e., if the dataset is greater than 1 GB, then read/write by hyperslabs (see the sketch below).
  228: use the file type in read/write by default. A new switch -n was introduced for when the user wants to use a native type, which was previously the default behavior.
  Added a new test for h5repack that repacks a 1 GB dataset.
  Tested: heping (serial, parallel), sol, copper
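  An illustrative sketch of reading a large dataset by hyperslabs instead of a single H5Dread, as described for bug 676 above. This is not the tools code; the dataset name, element type, and the one-million-element piece size are assumptions, and the 1.8-style H5Dopen2 call is used:

      #include <stdlib.h>
      #include "hdf5.h"

      int main(void)
      {
          hid_t   fid    = H5Fopen("big.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
          hid_t   did    = H5Dopen2(fid, "/dset", H5P_DEFAULT);
          hid_t   fspace = H5Dget_space(did);
          hsize_t dims[1];
          H5Sget_simple_extent_dims(fspace, dims, NULL);

          hsize_t piece = 1024 * 1024;                  /* elements per read */
          int    *buf   = malloc((size_t)piece * sizeof(int));

          for (hsize_t start = 0; start < dims[0]; start += piece) {
              hsize_t count[1] = { piece };
              if (start + count[0] > dims[0])
                  count[0] = dims[0] - start;           /* last, shorter piece */

              hsize_t off[1] = { start };
              H5Sselect_hyperslab(fspace, H5S_SELECT_SET, off, NULL, count, NULL);

              hid_t mspace = H5Screate_simple(1, count, NULL);
              H5Dread(did, H5T_NATIVE_INT, mspace, fspace, H5P_DEFAULT, buf);
              H5Sclose(mspace);
              /* ... process buf[0 .. count[0]-1] here ... */
          }

          free(buf);
          H5Sclose(fspace);
          H5Dclose(did);
          H5Fclose(fid);
          return 0;
      }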
* [svn-r12726] (Pedro Vicente Nunes, 2006-10-06, 1 file, -10/+5)
  Added calls for the scale-offset filter to the h5repack test script.
* [svn-r11712] Purpose: New feature (Quincey Koziol, 2005-11-15, 1 file, -1/+1)
  Description: Check in the baseline for compact group revisions, which radically revises the source code for managing groups and object headers.
  WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!!
  This initiates the "unstable" phase of the 1.7.x branch, leading up to the 1.8.0 release. Please test this code, but do _NOT_ keep files created with it - the format will change again before the release and you will not be able to read your old files!!!
  WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!! WARNING!!!!
  Solution: There are too many changes to really describe them all, but some of them include:
  - Stop abusing the H5G_entry_t structure and split it into two separate structures for non-symbol-table-node use within the library: H5O_loc_t for object locations in a file and H5G_name_t to store the path to an opened object. H5G_entry_t is now only used for storing symbol table entries on disk.
  - Retire H5G_namei() in favor of a more general mechanism for traversing group paths and issuing callbacks on objects located. This gets us out of the business of hacking H5G_namei() for new features, generally.
  - Revised H5O* routines to take an H5O_loc_t instead of an H5G_entry_t.
  - Lots more...
  Platforms tested: h5committested and maybe another dozen configurations... :-)
* [svn-r11043] Purpose: new test for h5repack, to synchronize tests between unix and windows; it requires a new file to be added (Pedro Vicente Nunes, 2005-07-07, 1 file, -1/+2)
  Platforms tested: linux
* [svn-r10067] Purpose: feature (Robert E. McGrath, 2005-02-23, 1 file, -1/+30)
  Description: h5repack support for scale-offset compression. Checking in early to help debug the filter.
  Solution: Added messages and command line handling for the new scale-offset filter.
  Note: TESTS ARE DISABLED FOR NOW. The filter is not complete, and repack tests may fail due to known problems. PLEASE DO NOT MESS WITH THE SCALEOFFSET TESTS AT THIS TIME. They will be enabled when the filter is ready.
  Platforms tested: verbena, copper, shanti
  Misc. update: MANIFEST
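  For reference, a minimal sketch of applying the scale-offset filter through the C API, which is what the new h5repack filter handling ultimately drives; the dataset shape, chunk size, and file name are arbitrary examples, and the 1.8-style H5Dcreate2 call is used:

      #include "hdf5.h"

      int main(void)
      {
          hsize_t dims[1]  = { 1000 };
          hsize_t chunk[1] = { 100 };

          hid_t fid   = H5Fcreate("so.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
          hid_t space = H5Screate_simple(1, dims, NULL);
          hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);

          H5Pset_chunk(dcpl, 1, chunk);
          /* Integer scale-offset; H5Z_SO_INT_MINBITS_DEFAULT lets the filter
           * compute the minimum number of bits itself. */
          H5Pset_scaleoffset(dcpl, H5Z_SO_INT, H5Z_SO_INT_MINBITS_DEFAULT);

          hid_t did = H5Dcreate2(fid, "dset_soffset", H5T_NATIVE_INT, space,
                                 H5P_DEFAULT, dcpl, H5P_DEFAULT);

          H5Dclose(did);
          H5Pclose(dcpl);
          H5Sclose(space);
          H5Fclose(fid);
          return 0;
      }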
* [svn-r10009] Purpose: feature (Robert E. McGrath, 2005-02-15, 1 file, -1/+26)
  Description: support for nbit compression in h5repack.
  Platforms tested: verbena, copper, shanti
  Misc. update: manifest
* [svn-r9885] Purpose: bug fix (Pedro Vicente Nunes, 2005-01-29, 1 file, -1/+2)
  Description: when calling h5diff from the h5repack test script under an mpicc build, the path of one of the files was not found.
  Solution: inserted the full path in the script.
  Platforms tested: linux (with mpicc and gcc)
* [svn-r9496] Purpose: Fix SZIP filter to dynamically detect the encoder (Robert E. McGrath, 2004-11-02, 1 file, -4/+9)
  Solution: See http://hdf.ncsa.uiuc.edu/RFC/SZIP/Szip_dynamic_12_Oct.pdf
  Changes to the h5repack tests, contingent on detecting the SZIP encoder. Note the new program testh5repack_detect_szip: it checks for the encoder and prints out "yes" or "no". It is used by h5repack.sh to detect the encoder and can also be used for the windows tests. This is only used as part of the tests. Had to modify the Makefile to build and clean this program.
* [svn-r9137] Purpose: new test (Pedro Vicente Nunes, 2004-08-23, 1 file, -0/+2)
  Description: added a test that generates and copies a file with a dataset with a fill value (this is to test the property list function H5Pequal).
  Platforms tested: linux, solaris, aix
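  A small sketch of the kind of check H5Pequal enables for this test: comparing the dataset creation property lists (which carry the fill value) of an original and a copied dataset. File and dataset names are hypothetical, and this is not the test program's source:

      #include <stdio.h>
      #include "hdf5.h"

      int main(void)
      {
          hid_t f1 = H5Fopen("orig.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
          hid_t f2 = H5Fopen("copy.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
          hid_t d1 = H5Dopen2(f1, "/dset_fill", H5P_DEFAULT);
          hid_t d2 = H5Dopen2(f2, "/dset_fill", H5P_DEFAULT);
          hid_t p1 = H5Dget_create_plist(d1);
          hid_t p2 = H5Dget_create_plist(d2);

          if (H5Pequal(p1, p2) > 0)
              printf("creation property lists (fill value included) match\n");
          else
              printf("creation property lists differ\n");

          H5Pclose(p1); H5Pclose(p2);
          H5Dclose(d1); H5Dclose(d2);
          H5Fclose(f1); H5Fclose(f2);
          return 0;
      }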
* [svn-r9106] Purpose: bug fix (Pedro Vicente Nunes, 2004-08-17, 1 file, -1/+0)
  Description: the option CHUNK:NONE (remove chunking) was not setting the layout to contiguous.
  Platforms tested: linux, solaris, AIX
* [svn-r8904] Purpose: h5diff and h5repack changes (Pedro Vicente Nunes, 2004-07-20, 1 file, -7/+7)
  Description: h5diff introduced the following four modes of output:
     Normal mode:  print the number of differences found and where they occurred
     Report mode:  print the above plus the differences
     Verbose mode: print the above plus a list of objects and warnings
     Quiet mode:   do not print output (h5diff always returns an exit code of 1 when differences are found)
  h5repack added an extra parameter for the SZIP filter (the coding method). The new syntax is -f SZIP=<pixels per block,coding>, where pixels per block is an even number in 2-32 and the coding method is 'EC' or 'NN'. Example of use: ./h5repack -i file1 -o file2 -f SZIP=8,NN -v
  Updated usage messages, test scripts and files accordingly (see the sketch below for what the SZIP parameters map to in the library).
  Platforms tested: linux, AIX, solaris
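  A sketch of what "-f SZIP=8,NN" corresponds to at the library level: H5Pset_szip with nearest-neighbor coding and 8 pixels per block on a chunked dataset's creation property list. Shapes and names are arbitrary examples, and the 1.8-style H5Dcreate2 call is used:

      #include "hdf5.h"

      int main(void)
      {
          hsize_t dims[2]  = { 200, 200 };
          hsize_t chunk[2] = { 50, 50 };

          hid_t fid   = H5Fcreate("szip.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
          hid_t space = H5Screate_simple(2, dims, NULL);
          hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);

          H5Pset_chunk(dcpl, 2, chunk);
          /* NN coding with 8 pixels per block; EC coding would use
           * H5_SZIP_EC_OPTION_MASK instead. */
          H5Pset_szip(dcpl, H5_SZIP_NN_OPTION_MASK, 8);

          hid_t did = H5Dcreate2(fid, "dset_szip", H5T_NATIVE_INT, space,
                                 H5P_DEFAULT, dcpl, H5P_DEFAULT);

          H5Dclose(did);
          H5Pclose(dcpl);
          H5Sclose(space);
          H5Fclose(fid);
          return 0;
      }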
* [svn-r8869] Purpose: h5repack changes (Pedro Vicente Nunes, 2004-07-13, 1 file, -136/+198)
  Description: there were some requests to change some minor h5repack features. h5repack only printed a warning about a non-available filter in verbose mode (-v); without -v it kept silent, and users sometimes missed this warning. The request was that it should print this warning always. So the new behavior is, e.g.:
     ./h5repack -i test_szip.h5 -o out.h5
     Warning: dataset </dset_szip> cannot be read, SZIP filter is not available
  Due to this, and to avoid a lot of these messages in the shell test script, the script h5repack.sh was modified so that it detects the presence of all filters in the environment (previously it only detected SZIP). The test files were also divided into more files, to make the script code easier to follow.
  Platforms tested: linux, AIX (no szip), solaris (no szip, no gzip)
* [svn-r8781] (James Laird, 2004-07-01, 1 file, -0/+298)
  Purpose: HDF5 now supports SZIP with no encoder.
  Description: SZIP can be configured to have both encoder and decoder or just to have the decoder. HDF5 can now query the configuration of any filter, and will throw errors if users try to write using a filter with encoding disabled.
  Solution: Added H5Zget_filter_info function, changed API for H5Pget_filter and H5P_get_filter_by_id. See SZIP RFC.
  Platforms tested: Copper (fortran, C++, parallel), Sleipnir (C++), Arabica (fortran, C++), Verbena (fortran, C++)
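  A minimal sketch along the lines of the testh5repack_detect_szip helper mentioned in r9496 above: query the library for the SZIP filter configuration with H5Zget_filter_info and print "yes" or "no" depending on whether the encoder is enabled. This is not the helper's actual source:

      #include <stdio.h>
      #include "hdf5.h"

      int main(void)
      {
          unsigned int config = 0;

          if (H5Zfilter_avail(H5Z_FILTER_SZIP) <= 0) {
              printf("no\n");                     /* SZIP not present at all */
              return 0;
          }
          if (H5Zget_filter_info(H5Z_FILTER_SZIP, &config) < 0) {
              printf("no\n");
              return 0;
          }
          if (config & H5Z_FILTER_CONFIG_ENCODE_ENABLED)
              printf("yes\n");                    /* encoder available: can write SZIP data */
          else
              printf("no\n");                     /* decode-only build */
          return 0;
      }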