Commit messages

Tested: local linux autotools
cleaned. CMake update to clean testfiles.
Tested: local cmake, autotools, and h5committest
Reviewed in H5T-61
Tested: local linux - cmake and autotools
Reverted the code change and changed the default from 1024 to 0. Changed the limit test to use h5dump to compare the repacked file instead of h5diff.
Corrected test scripts for the test folder path.
Tested: h5committest and local linux with CMake
HDFFV-8214 - h5repack failed converting small chunked dataset (size < 1K) to contiguous layout.
Description:
h5repack failed converting small chunked dataset (size < 1K) to contiguous layout.
The first case was when a chunk dim is bigger than the dataset dim (at least one); h5repack failed and displayed error stacks.
The other case is when a chunk dim is smaller than the dataset dim; h5repack failed to change the layout.
Tested:
jam (linux32-LE), koala (linux64-LE), ostrich (linuxppc64-BE), emu (solaris-BE),fred (mac64-LE), Windows (32-LE cmake), cmake (jam)
Tested: local linux
HDFFV-8012 - h5repack changes max dims and causes a failure if only "-f none" is used without changing the layout for a chunked dataset when a chunk dim is bigger than a dataset dim
Description:
The "h5repack -f <obj>:NONE <file.h5> out.h5" command failed if the source file contains a chunked dataset and a chunk dim is bigger than a dataset dim.
Another issue is that the command changed max dims if a chunk dim is smaller than the dataset dim.
These issues occurred when the dataset size is smaller than 64k (the compact size limit).
Fixed them.
Tested:
jam (linux32-LE), koala (linux64-LE), ostrich (linuxppc64-BE), tejeda (mac32-LE), linew (solaris-BE), Windows (32-LE cmake), cmake (jam)
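For context, a minimal sketch of one way to guard against a chunk dimension exceeding the current dataset dimension when rebuilding the dataset creation property list; the helper and its clamping policy are illustrative assumptions, not the fix that was actually committed.

```c
/* Hypothetical guard, not the committed fix: clamp each chunk dimension to
 * the current dataset dimension before applying it to the output dataset's
 * creation property list, so an oversized chunk cannot force h5repack to
 * alter the layout or the max dims of a small dataset. */
#include "hdf5.h"

static herr_t apply_clamped_chunk(hid_t dcpl_id, int rank,
                                  const hsize_t *dset_dims,
                                  const hsize_t *chunk_dims)
{
    hsize_t clamped[H5S_MAX_RANK];

    for (int i = 0; i < rank; i++)
        clamped[i] = (dset_dims[i] > 0 && chunk_dims[i] > dset_dims[i])
                         ? dset_dims[i]   /* chunk dim bigger than dataset dim */
                         : chunk_dims[i];

    return H5Pset_chunk(dcpl_id, rank, clamped);
}
```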
autotools TESTS var; rename configure.in to configure.ac; convert test scripts from hard-coded *.sh files to configure-managed *.sh.in files.
Tested: h5committest
Fix for HDFFV-8107 - testh5diff will fail if built/tested in the HDF5 source tree
Description:
This is a sub-task of "HDFFV-8105 testh5diff.sh uses the wrong operator (-a) in an if statement."
After the HDFFV-8105 update, the h5diff test failed if build & test are performed in the HDF5 source tree, because 'cp' tries to copy the test files onto themselves.
This is addressed by skipping the copy if cp's source and destination directories are the same.
This was also applied to all other tools under the src/tools dir.
No change to the CMakeLists.txt files, because CMake cautions/demands that in-source builds be avoided.
Tested:
jam (linux32-LE), koala (linux64-LE), ostrich (linuxppc64-BE), tejeda (mac32-LE), linew (solaris-BE), some manual tests as well
Fix for HDFFV-7993 - h5repack fails with error "chunk size must be <= maximum dimension size for fixed-sized dimensions"
Description:
Fixed a failure when changing the chunk size of a specified chunked dataset with unlimited max dims.
Also took care of converting the dataset to contiguous and compact layouts.
Test cases were added and tagged with the JIRA number.
Tested:
jam (linux32-LE), koala (linux64-LE), ostrich (linuxppc64-BE), tejeda (mac32-LE), linew (solaris-BE), Windows (32-LE cmake), Cmake (jam)
Added a test to verify that the h5repack --metadata option produces a larger file.
Tested: h5committest (koala, jam, ostrich).
Correct typo in test file name.
Tested on:
Mac OSX/64 10.7.3 (amazon) w/debug
Add new "metadata block size" command line option ('-M <x>' or
'--metadata_block_size=<x>') for h5repack.
Tested on:
Mac OSX/64 10.7.3 (amazon) w/debug
(h5committest upcoming)
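As a rough illustration, this is how such an option typically maps onto the HDF5 file access property list used for the output file; the helper and its error handling are assumptions, not h5repack's actual code.

```c
/* Illustrative sketch: build a file access property list that carries the
 * user-supplied metadata block size.  H5Pset_meta_block_size() sets the
 * minimum size, in bytes, of metadata block allocations for the file. */
#include "hdf5.h"

static hid_t make_output_fapl(hsize_t meta_block_size)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

    if (fapl < 0)
        return -1;

    if (meta_block_size > 0 && H5Pset_meta_block_size(fapl, meta_block_size) < 0) {
        H5Pclose(fapl);
        return -1;
    }

    return fapl; /* pass to H5Fcreate() when creating the repacked file */
}
```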
non-executable files.
Fix for HDFFV-7840 - h5repack: memory leak over one of the h5diff test files
Description:
It turned out that there were two causes of memory leaks:
1. handling variable-length strings in attributes.
2. handling compound types with non-reference members.
The first issue is fixed in copy_attr(), which is updated to use h5tools_detect_vlen to take care of vlen strings as well as vlen data.
The second is fixed in the compound-handling code of copy_refs_attr().
Tested:
jam (linux32-LE), koala (linux64-LE), heiwa (linuxppc64-BE), tejeda (mac32-LE), linew (solaris-BE), Windows (32-LE), Cmake (jam)
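For background, a minimal sketch of the usual HDF5 pattern for avoiding this class of leak when reading variable-length data; it shows the general library calls only, not the committed copy_attr() change, and the helper name is illustrative.

```c
/* General pattern: H5Aread makes the library allocate memory for each
 * variable-length element, which must be handed back with H5Dvlen_reclaim
 * before the read buffer is freed.  Variable-length strings need the same
 * treatment as vlen sequences, which is the case the fix above covers via
 * h5tools_detect_vlen.  This is background, not h5repack's actual code. */
#include <stdlib.h>
#include "hdf5.h"

static herr_t read_vlen_attr(hid_t attr_id)
{
    hid_t    ftype   = H5Aget_type(attr_id);
    hid_t    mtype   = H5Tget_native_type(ftype, H5T_DIR_DEFAULT);
    hid_t    space   = H5Aget_space(attr_id);
    hssize_t npoints = H5Sget_simple_extent_npoints(space);
    void    *buf     = calloc((size_t)npoints, H5Tget_size(mtype));
    herr_t   status  = H5Aread(attr_id, mtype, buf);

    /* ... use the data ... */

    /* Reclaim library-allocated vlen memory, including vlen strings. */
    if (H5Tdetect_class(mtype, H5T_VLEN) > 0 ||
        (H5Tget_class(mtype) == H5T_STRING && H5Tis_variable_str(mtype) > 0))
        H5Dvlen_reclaim(mtype, space, H5P_DEFAULT, buf);

    free(buf);
    H5Sclose(space);
    H5Tclose(mtype);
    H5Tclose(ftype);
    return status;
}
```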
h5dump was used in a test script but was not invoked by RUNSERIAL.
This does not work on batch machines like Blue Gene at LLNL.
Solution:
Added $RUNSERIAL to invoke $H5DUMP_BIN.
Tested:
LLNL BlueGene (udawn)
dimension scales.
Tested:
jam (linux32-LE), koala (linux64-LE), heiwa (linuxppc64-BE)
Purpose:
Work for HDFFV-7602 - HDF5 command tools: Provide framework for reusable
test files among tools
Description:
Provide a framework to share test files among tools for the tools tests.
Tested:
jam (linux32-LE), koala (linux64-LE), heiwa (linuxppc64-BE)
Fix for Bug1896 h5repack - changing layout to COMPACT does not work
Description:
Make h5repack able to convert a layout to COMPACT for small datasets by default. Also add verification of layout changes in our test script.
Tested:
jam, amani, heiwa
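For reference, a minimal sketch of the HDF5 call involved in such a conversion; the helper and its size threshold handling are illustrative assumptions, not h5repack's actual logic.

```c
/* Illustrative sketch: switch a dataset creation property list to COMPACT
 * layout when the raw data is small enough to fit in the object header.
 * The 64 KB figure is HDF5's compact-storage limit (also mentioned in the
 * HDFFV-8012 entry above); the helper itself is an assumption. */
#include "hdf5.h"

#define COMPACT_LIMIT ((hsize_t)(64 * 1024))

static herr_t maybe_make_compact(hid_t dcpl_id, hsize_t raw_data_size)
{
    if (raw_data_size >= COMPACT_LIMIT)
        return 0; /* too big for compact storage; leave the layout alone */

    /* Compact layout stores the raw data in the dataset's object header. */
    return H5Pset_layout(dcpl_id, H5D_COMPACT);
}
```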
Corrected changes to the test script for verbose filters.
Tested: local linux
Bring changes from Coverity branch back to trunk:
r19079 & 19080:
[BZ1942] When h5dump -u is used to generate XML, it does not respect the -m option.
The XML version of the dump_data function didn't check for use of the fp_format variable.
Added a new expected-output test file for bug 1942.
r19103, 19104 & 19105:
[BZ1821] h5repack -v did not display correct output for a selected compression. Needed a new test for comparing the output of the -v option.
Added a new test file for the solution to BZ1821.
BZ1821 - Bring test changes from the shell script actually used.
Tested on:
Mac OS X/32 10.6.4 (amazon) debug & production
(h5committested on branch)
Add test cases for bug1797 - h5repack doesn't handle references in
compound and vlen datatypes for attributes
Description:
Add test cases ahead of time while waiting for H5Acopy to complete the fix.
1. obj references in attr of compound type
2. region references in attr of compound type
3. obj references in attr of vlen type
4. region references in attr of vlen type
NOTE:
This test is skipped for now and will be included when the code part is completed
(H5Acopy() replaces copy_attr()).
The file and code can be used for the library-portion test.
Tested:
jam, amani, linew
Fix for bug1726 - NPOESS: h5repack loses attributes for datasets of
type H5T_REFERENCE.
Description:
Include test cases, including cases for attributes with object and region references.
Tested:
jam, amani, linew
Fix for bug1814 - NPOESS: h5repack doesn't handle references to the groups
as an element of a dataset
Description:
Handles object references to named datatypes as well.
Add test cases.
Tested:
jam, amani, linew
Unify srcdir handling for test executables and allow them to use the srcdir
setting from configure time without requiring the 'srcdir' environment variable
be set (although you still can, to override the built in setting). Attempted
to get this right for Windows builds also.
Also add dependency between src/H5Tinit.c and src/libhdf5.settings, so
that the test/testcheck_version.sh script works correctly.
Tested on:
Linux/32 2.6 (jam)
Mac OS X/32 10.6.2 (amazon)
Description:
Fixed exit codes (sometimes the return code in main) to follow the HDF5 standards.
Tested:
H5committested plus a serial test on jam.
The TFLOPS machine was retired long ago. Removed all code specific to its
support.
Test:
h5committested.
The file used for input is located in the common tools source location for test files, tools/testfiles.
Modified the h5repack shell script to read files from this location (h5repack reads its input files from a dedicated testfiles location in h5repack/testfiles).
Changed the h5diff file-open call to use h5tools_fopen, so that it can open files with any of the file drivers.
Tested: windows, linux, solaris
Description:
h5repack previously would not take named datatypes into consideration when copying
datasets and attributes. This would cause extra anonymous datatypes in the target file
at best, and cause errors halfway through the repacking at worst. h5repack should now
always handle named datatypes correctly. Named datatypes are also now converted to the
native type when -n is given.
Tested: jam, linew, smirom (h5committest)
H5TOOLS_BUFSIZE limit.
ISSUE: the tools use the following formula to read by hyperslabs: hyperslab_size[i] = MIN( dim_size[i], H5TOOLS_BUFSIZE / datum_size), where H5TOOLS_BUFSIZE is a constant defined as 1024K. This is OK as long as the datum_size does not exceed 1024K; otherwise we get a hyperslab size of 0 (since 1024K / (greater than 1024K) = 0). This affects h5dump, h5repack, and h5diff.
SOLUTION: add a check for a 0 size and define it as 1 if so.
TEST FOR H5DUMP: Defined a case in the h5dump test generator program of such a type (an array type of doubles with a large array dimension, which was the case the user reported). Since the written file committed in svn would be around 1024K, opted for not writing the data (the part of the code where the hyperslab is defined is still executed, since h5dump always reads the files). Defined a macro WRITE_ARRAY to enable such writing if needed. Added a run to the h5dump shell script. Added 2 new files to svn: tools/testfiles/tarray8.ddl, tools/testfiles/tarray8.h5. NOTE: while doing this I thought of adding this dataset case to an existing file, but that would add the large array output to those files (the ddls). The issue is that the file list is increasing.
TEST FOR H5DIFF: for h5diff the check for reading by hyperslabs is H5TOOLS_MALLOCSIZE (128 * H5TOOLS_BUFSIZE), or 128 MB. This makes it not possible to add such a file to svn, so used the same method as h5dump (only write the dataset if WRITE_ARRAY is defined). As opposed to h5dump, the hyperslab code is NOT executed when the dataset is empty (the dataset is not read). Added the new dataset to existing files and the shell run (tools/h5diff/testfiles/h5diff_dset1.h5 and tools/h5diff/testfiles/h5diff_dset2.h5, with output in tools/h5diff/testfiles/h5diff_80.txt).
TEST FOR H5REPACK: similar issue as h5diff, with the difference that the hyperslab code is run. Added a run to the shell script (with a filter, otherwise the code uses H5Ocopy).
tested: linux (h5committest failed; apparently it did not detect the code changes in /tools/lib that fix the bug: the error is an assertion on the hyperslab size of 0. I am sure that running h5committest --distclean will detect the new code, but I don't want to wait 3 more hours :-) )
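A minimal sketch of the sizing formula and the zero-size guard described above; the function and variable names are illustrative, not the tools-library code verbatim.

```c
/* Sketch of the hyperslab sizing described above: each dimension of the
 * read buffer is capped by how many elements of size `datum_size` fit in
 * H5TOOLS_BUFSIZE.  When a single element is larger than the buffer, the
 * division yields 0, so the guard forces a minimum of one element per
 * dimension. */
#include <stdint.h>

#define H5TOOLS_BUFSIZE (1024 * 1024) /* 1024K, per the description above */
#define MIN(a, b) ((a) < (b) ? (a) : (b))

static void compute_hyperslab(int rank, const uint64_t *dim_size,
                              uint64_t datum_size, uint64_t *hyperslab_size)
{
    for (int i = 0; i < rank; i++) {
        hyperslab_size[i] = MIN(dim_size[i], H5TOOLS_BUFSIZE / datum_size);
        if (hyperslab_size[i] == 0)
            hyperslab_size[i] = 1; /* datum bigger than the buffer: read one element at a time */
    }
}
```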
Correct error introduced in r16353 with layout version, and add test
so it gets caught earlier.
Tested on:
FreeBSD/32 6.3 (duty)
Too minor to require h5committest
Eliminate the -c option, make that behavior the default, and return 2 instead of -1 as the error status.
tested: linux
that was changed to "--minimum" because "--threshold" now refers to the threshold parameter for H5Pset_alignment
tested: linux
Together they allow a call to H5Pset_alignment to be made
-t T, --threshold=T Threshold value for H5Pset_alignment
-a A, --alignment=A Alignment value for H5Pset_alignment
2) Bug fix:
the printing of the dataset name was not done for references (verbose mode).
tested: windows, linux
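A hedged sketch of how the two values map onto the HDF5 API; the helper name, and the assumption that both values go on the output file's access property list, are illustrative.

```c
/* Illustrative only: apply the -t/--threshold and -a/--alignment values to
 * the file access property list for the output file.  `threshold` and
 * `alignment` stand for the parsed command-line values; this is not the
 * tool's actual option-handling code. */
#include "hdf5.h"

static herr_t apply_alignment(hid_t fapl_id, hsize_t threshold, hsize_t alignment)
{
    /* Any object of at least `threshold` bytes is placed at a file address
     * that is a multiple of `alignment`. */
    return H5Pset_alignment(fapl_id, threshold, alignment);
}
```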
Add a userblock to an HDF5 file during the repack. The user gives
a filename and userblock size as command line parameters to
h5repack, and the contents of that file are stored in the
userblock of the HDF5 file created by h5repack.
New flags to handle this: -u and -b.
Tested : windows, linux
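A minimal sketch, under stated assumptions, of the HDF5 side of this feature: reserving a userblock in the output file and copying the user's file into it. The helper, buffer handling, and omission of error checks are illustrative; this is not h5repack's actual implementation.

```c
/* Illustrative sketch: create the output file with a userblock of the
 * requested size, then write the contents of the user-supplied file into
 * the first `ub_size` bytes of the HDF5 file (the superblock starts after
 * the userblock).  Error handling is trimmed for brevity. */
#include <stdio.h>
#include "hdf5.h"

static hid_t create_with_userblock(const char *h5name, const char *ub_file, hsize_t ub_size)
{
    hid_t  fcpl = H5Pcreate(H5P_FILE_CREATE);
    hid_t  fid;
    FILE  *ub;
    FILE  *out;
    char   buf[4096];
    size_t n;

    /* The userblock size must be 0 or a power of 2 that is at least 512. */
    H5Pset_userblock(fcpl, ub_size);
    fid = H5Fcreate(h5name, H5F_ACC_TRUNC, fcpl, H5P_DEFAULT);
    H5Pclose(fcpl);
    H5Fclose(fid); /* close so the raw bytes can be written with stdio */

    /* Copy the user's file into the reserved space at the start of h5name. */
    ub  = fopen(ub_file, "rb");
    out = fopen(h5name, "r+b");
    while ((n = fread(buf, 1, sizeof(buf), ub)) > 0)
        fwrite(buf, 1, n, out);
    fclose(ub);
    fclose(out);

    return H5Fopen(h5name, H5F_ACC_RDWR, H5P_DEFAULT);
}
```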
Summary: when using h5diff to compare the results of h5repack (or other tools that copy one HDF5 file to another), a new option is needed to allow h5diff to make an "absolute" comparison of the 2 files. This is the "contents" mode explained in the usage below.
If this mode is present, objects in both files must match (must be exactly the same). If this does not happen, the tool returns an error code of 1 (instead of the success code of 0)
Changes to the h5repack test script: the call to h5diff was changed to include -c (maintaining the previous -q).
tested: windows, linux, solaris
tools/h5repack/testfiles
tested: linux
compatibility
Used the function TOOLTEST from 1.6.7
Tested: linux
if these are detected, this syntax is used; otherwise the one in the usage is used.
There was another -i option:
-i L2, --indexed=L2 Minimum number of links in the indexed format
That was changed to -d
-d L2, --indexed=L2 Minimum number of links in the indexed format
Tested: windows, linux
to test long switch names.
Add a test for multiple global filters.
tested: linux, solaris
tested: linux, solaris
tested: linux
2) new usage for h5repack and new command line parsing using the tools library parsing code
tested: windows, linux, solaris
brand new syntax
tested: linux
Add regression test for h5repack with userblock
Tested on:
FreeBSD/32 6.2 (duty) in debug mode
FreeBSD/64 6.2 (liberty) w/C++ & FORTRAN, in debug mode
Linux/32 2.6 (kagiso) w/PGI compilers, w/C++ & FORTRAN, w/threadsafe,
in debug mode
Linux/64 2.6 (smirom) w/default API=1.6.x, w/C++ & FORTRAN,
in production mode
Solaris/32 2.10 (linew) w/deprecated symbols disabled, w/C++ & FORTRAN,
w/szip filter, in production mode
AIX/32 5.3 (copper) w/FORTRAN, w/parallel, in production mode
Mac OS X/32 10.4.10 (amazon) in debug mode
h5diff bug fix: attribute differences were not being counted toward the total
differences.
Revision of the H5Ocopy call in h5repack.
make 1.7 h5repack more similar to 1.6 h5repack
Tested: visual inspection as they are all just comments.
h5repack revision:
1. Added a new test due to the introduction of H5Ocopy in the copying of objects (a compressed dataset with references, which still must go through a second sweep of the file to be regenerated).
2. Moved all the source files from the h5repack test program to a new file h5repacktst.c and removed the old ones (testh5repack*.c).
3. Renamed the binary files from test*.h5 to h5repack*.h5 for easy reference.
4. Modified the shell script to use variables for file names instead of hard-coded names.