New Fortran wrappers added.
tested: linux
unused functions
tested: windows, linux
and for global filters): verify that all requested filters in the array FILTER obtained from user input are present in the property list PID obtained from the output file. All the filter comparison of client data values is done here instead of in the previous filtcmp.
TO DO: szip, nbit and scale offset
NOTE: the symbol H5Z_SHUFFLE_TOTAL_NPARMS was made public
Tested: windows, teragrid with icc 8.1, linux (kagiso), solaris (linew)
match for each individual dataset
Following the new h5repack feature that allows multiple filters for all datasets, and the new function has_filters that checks whether the repacked file has all the requested filters, I added a new function
has_filters_obj
that does the same for each dataset. The previous function only checked whether the user input filters were in the output dataset. This new function does that, but checks that the filters are exactly the same. The current behavior of h5repack is to delete all filters present in the input file (dataset) and replace them with the requested ones, so they must match exactly.
We might consider adding other logical operations, like keeping the existing ones.
Additionally, the function also checks whether the filter parameters match.
While doing this I noticed that for the shuffle filter the values returned do not match, and the same holds for the N-bit and scale-offset filters.
The new function that checks the filter values then fails, so I commented out the h5repack tests that do this for the N-bit and scale-offset filters (previously, for the same bug on the shuffle filter, I added special code in the filter comparison function, but this is temporary until I find the issue).
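A sketch of the kind of per-dataset pipeline comparison involved (illustrative only, written against the modern H5Pget_filter2 call; the name filters_match is hypothetical, not the actual h5repack code):

    /* Illustrative: return 1 if two DCPLs carry identical filter pipelines. */
    static int filters_match(hid_t dcpl1, hid_t dcpl2)
    {
        int n = H5Pget_nfilters(dcpl1);
        if (n != H5Pget_nfilters(dcpl2))
            return 0;                              /* pipeline lengths differ */
        for (int i = 0; i < n; i++) {
            unsigned     flags1, flags2, cd1[16], cd2[16];
            size_t       ne1 = 16, ne2 = 16;
            H5Z_filter_t id1 = H5Pget_filter2(dcpl1, (unsigned)i, &flags1,
                                              &ne1, cd1, 0, NULL, NULL);
            H5Z_filter_t id2 = H5Pget_filter2(dcpl2, (unsigned)i, &flags2,
                                              &ne2, cd2, 0, NULL, NULL);
            if (id1 != id2 || ne1 != ne2)
                return 0;                          /* filter id or nparms differ */
            for (size_t j = 0; j < ne1; j++)
                if (cd1[j] != cd2[j])
                    return 0;                      /* parameter values differ */
        }
        return 1;
    }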
tested: windows, linux, solaris
values, for H5Z_COMMON_CD_VALUES,
which is defined in the library
tested: windows, linux, solaris
and in the requested list.
Used to verify that the requested filters are present in the output dataset
tested: windows, linux, solaris
h5repack verify routines
tested: windows, linux
filters
usage is to repeat the -f option (see the example below)
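An illustrative invocation (the file names and filter choices are only an example):
    ./h5repack -i file1.h5 -o file2.h5 -f GZIP=6 -f SHUF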
tested: windows, linux, solaris
Remove all plain calls to H5Gopen() from the source, replacing them with
H5Gopen2().
Add a test for H5Gopen1().
Reformatted several pieces of code to clean them up.
Tested on:
FreeBSD/32 6.2 (duty)
FreeBSD/64 6.2 (liberty)
Linux/32 2.6 (kagiso)
Linux/64 2.6 (smirom)
Solaris/32 5.10 (linew)
Mac OS X/32 10.4.10 (amazon)
file format.
Add a test for the options to the daily test.
Minor tunings to the verbose output messages:
1) when there is no filter request, do not print the message saying the filter was not applied because the dataset was too small
2) avoid printing the message listing the objects to modify when there are none
Tested: linux
h5repack code cleaning (required reconfigure)
tested: linux (32, 64, parallel), solaris
make 1.7 h5repack more similar to 1.6 h5repack
copyright notice.
Tested platform:
Kagiso only, since this is only a comment-block change. If it works on one
machine, it should work on all, I hope. Still need to check the parallel
build on copper.
h5repack support for H5Ocopy in the copying of objects. The old method
for recreating references was dropped (references are now recreated in a second
traversal of the file).
The logic for using H5Ocopy or not is:
IF the input DCPL has filters or a non-default layout, OR these are
requested by the user, THEN
use the old h5repack read / write
ELSE
use H5Ocopy
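A minimal sketch of that decision in C (the function and variable names are illustrative, not the actual h5repack source):

    /* Illustrative: choose between H5Ocopy and the read / write path. */
    static int use_h5ocopy(hid_t dcpl_in, int user_asked_filter_or_layout)
    {
        int          nfilters = H5Pget_nfilters(dcpl_in);  /* input filters */
        H5D_layout_t layout   = H5Pget_layout(dcpl_in);    /* input layout  */

        if (nfilters > 0 || layout != H5D_CONTIGUOUS || user_asked_filter_or_layout)
            return 0;   /* old h5repack read / write */
        return 1;       /* copy the object wholesale with H5Ocopy() */
    }

When the copy path is taken, the object can be moved with a single call such as H5Ocopy(fid_in, name, fid_out, name, H5P_DEFAULT, H5P_DEFAULT).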
Fixes for bugs 676, 228
676: both h5repack and h5diff use H5Dread. In the case of a "big"
dataset, read/write by hyperslabs the same way h5dump does. An
arbitrary value of 1GB was defined for "big", i.e., if the dataset is
greater than 1GB, then read/write by hyperslabs (see the sketch below).
228: use the file type in read/write by default. A new switch -n was
introduced for users who want a native type, which was the
previous default.
Added a new test for h5repack that repacks a 1GB dataset
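A minimal sketch of the hyperslab pass for a 1-D dataset (the pass size and names are illustrative, and dset_in/dset_out are assumed open; the actual tools handle arbitrary ranks):

    /* Illustrative 1-D hyperslab copy; needs hdf5.h and <stdlib.h>. */
    hid_t   ftype      = H5Dget_type(dset_in);  /* file type: the new default (bug 228) */
    hid_t   fspace_in  = H5Dget_space(dset_in);
    hid_t   fspace_out = H5Dget_space(dset_out);
    hsize_t dims[1], offset[1] = {0}, count[1];
    hsize_t pass = 1024 * 1024;                 /* elements per pass (example) */
    void   *buf  = malloc((size_t)pass * H5Tget_size(ftype));

    H5Sget_simple_extent_dims(fspace_in, dims, NULL);
    while (offset[0] < dims[0]) {
        count[0] = (dims[0] - offset[0] < pass) ? (dims[0] - offset[0]) : pass;
        hid_t mspace = H5Screate_simple(1, count, NULL);
        H5Sselect_hyperslab(fspace_in,  H5S_SELECT_SET, offset, NULL, count, NULL);
        H5Sselect_hyperslab(fspace_out, H5S_SELECT_SET, offset, NULL, count, NULL);
        H5Dread (dset_in,  ftype, mspace, fspace_in,  H5P_DEFAULT, buf);
        H5Dwrite(dset_out, ftype, mspace, fspace_out, H5P_DEFAULT, buf);
        H5Sclose(mspace);
        offset[0] += count[0];
    }
    free(buf);
    H5Sclose(fspace_in); H5Sclose(fspace_out); H5Tclose(ftype);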
Tested: heping (serial, parallel), sol, copper
Add "use the latest format" support for dataspace object header encode/
decode routines and clean up format a bit for the latest format (new to 1.8.x
releases)
Remove storing 'perm' parameter for array datatypes in memory and the file,
and add test to make certain that if any user applications are attempting to
store them, we get some reports back. (Should be unlikely, since the RefMan
says that the parameter is not implemented and is unsupported).
Carry those changes into the tests, etc.
Clean up a bunch more compiler warnings.
Tested on:
FreeBSD/32 4.11 (sleipnir) w/threadsafe
Linux/32 2.4 (heping) w/FORTRAN & C++
Linux/64 2.4 (mir) w/enable-1.6-compat
Code cleanup
Description:
Trim trailing whitespace in Makefile.am and C/C++ source files to make
diffing changes easier.
Platforms tested:
None necessary, whitespace only change
new feature
Description
some more check-ins related to the printing of compression ratios: print warning messages after printing the dataset name and compression
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
new feature
Description:
some more formatting for the new printout of compression ratios
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
bug fix
Description:
h5repack was not dealing with family files
Solution:
use the tools-library function h5tools_open to open the file in h5repack, instead of H5Fopen
Platforms tested:
linux
solaris
AIX
Misc. update:
Code cleanup
Description:
Check in some of the code cleanups from working on the external link
support. (This doesn't include any of the external link features.)
Platforms tested:
FreeBSD 4.11 (sleipnir)
Mac OSX.4 (amazon)
Linux 2.4
Description: VMS doesn't like file names with more than one "."
Some h5repacktst output file names were of the form
<name>.out.h5, causing h5repacktst to choke.
Solution: Renamed the output files to be of the form <name>out.h5
Platforms tested: heping, unnamed VMS machine
Misc. update:
new features
Description:
added support for the scale/offset filter
there is a new filter symbol 'SOFF'
-f SOFF=<scale_factor,scale_type>
scale_factor = integer
scale_type = 'IN' or 'DS'
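An illustrative invocation (the file names and the scale factor of 4 are only an example):
    ./h5repack -i in.h5 -o out.h5 -f SOFF=4,IN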
Solution:
Platforms tested:
Linux
SunOS
Misc. update:
bug fix
Description:
during the generation of some test files, H5Fclose was not called;
during the #ifdef detection of the scale offset filter, a wrong macro symbol was used
Solution:
Platforms tested:
linux
Misc. update:
Code cleanup
Description:
Trim trailing whitespace, which is making 'diff'ing the two branches
difficult.
Solution:
Ran this script in each directory:
foreach f (*.[ch] *.cpp)
sed 's/[[:blank:]]*$//' $f > sed.out && mv sed.out $f
end
Platforms tested:
FreeBSD 4.11 (sleipnir)
Too minor to require h5committest
Bug fix
Description:
The GASS VFL driver header file was bringing in the <string.h> header file,
which several other source code modules also needed but weren't including
explicitly themselves.
Solution:
Add includes for <string.h> to files which actually need them.
Platforms tested:
FreeBSD 4.11 (sleipnir) w/C++ as CC
Configuration not tested by h5committest...
feature
Description:
h5repack support for scaleoffset compression
Checking in early to help debug the filter.
Solution:
Added messages and command line to handle new scale offset filter.
Note: TESTS ARE DISABLED FOR NOW. The filter is not
complete; repack tests may fail due to known problems.
PLEASE DO NOT MESS WITH THE SCALEOFFSET TESTS AT THIS TIME.
They will be enabled when the filter is ready.
Platforms tested:
verbena,copper,shanti
Misc. update:
MANIFEST
feature
Description:
support for nbit compression in h5repack
Solution:
Platforms tested:
verbena,copper,shanti
Misc. update:
manifest
bug fix
Description:
one case was not handled in the combination of input options (layout and filters)
Solution:
redo the algorithm that handles all cases
Platforms tested:
linux
Misc. update:
bug fix
Description:
when specifying both an input object, e.g. -f mydset:GZIP=1, and a defined chunk, -l CHUNK=20x20,
the filter used a predefined default chunk instead of the one specified
Solution:
add a check for the input chunk
Platforms tested:
linux (small change)
Misc. update:
new test
Description:
added a test that generates and copies a file with a dataset with a fill value
(this is to test the property list function H5Pequal)
Solution:
Platforms tested:
linux
solaris
aix
Misc. update:
new feature
Description:
added a check that the chunk size must be smaller than the pixels per block in a SZIP request;
if the condition is not met, a message is printed and the program exits
Solution:
Platforms tested:
linux
aix
solaris
Misc. update:
h5diff and h5repack changes
Description:
h5diff
introduced the following four modes of output:
Normal mode: print the number of differences found and where they occurred
Report mode: print the above plus the differences
Verbose mode: print the above plus a list of objects and warnings
Quiet mode: do not print output (h5diff always returns an exit code of 1 when differences are found)
h5repack
added an extra parameter for SZIP filter (coding method)
the new syntax is
-f SZIP=<pixels per block,coding>
(pixels per block is an even number in the range 2-32 and the coding method is 'EC' or 'NN')
Example of use:
./h5repack -i file1 -o file2 -f SZIP=8,NN -v
updated usage messages, test scripts and files accordingly
Solution:
Platforms tested:
linux
AIX
solaris
Misc. update:
bug fix, new feature
Description:
fixed a bug in the parse function:
cases where we have an already inserted name but there is also a new name
example:
-f dset1:GZIP=1 -l dset1,dset2:CHUNK=20x20
dset1 is already inserted, but dset2 must also be (it was not)
added a CHECK_SZIP symbol to enable/disable checking of library-related szip parameters
added printing of the filter name in verbose mode (confirms visually that the filter was applied)
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
h5repack changes
Description:
there were some requests to change some minor h5repack features.
h5repack only issued a warning about an unavailable filter in verbose mode ( -v );
without -v it kept silent, and users sometimes missed this warning.
The request was that it should always print this warning. So the new format is, e.g.
./h5repack -i test_szip.h5 -o out.h5
Warning: dataset </dset_szip> cannot be read, SZIP filter is not available
Due to this, and to avoid a lot of these messages in the shell test script, I modified
the script h5repack.sh so that it detects the presence of all filters in the environment
(previously it only detected SZIP).
The test files were also divided into more files, to make the script code easier to
follow.
Solution:
Platforms tested:
linux
AIX (no szip)
solaris (no szip, no gzip )
Misc. update:
Code cleanup
Description:
Clean up lots of warnings based on those reported from the SGI compilers
as well as gcc.
Platforms tested:
SGI O3900, IRIX64 6.5 (Cheryl's SGI machine)
FreeBSD 4.9 (sleipnir) w/ & w/o parallel
h5committest
new test
Description:
add a test that generates a file with "holes" and then uses h5repack to reclaim
the empty space (see the example below). It works.
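An illustrative run (the file names are hypothetical):
    ./h5repack -i holes.h5 -o packed.h5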
Solution:
Platforms tested:
linux
Misc. update:
new tests for h5repack
Description:
added tests that do layout-to-layout conversion in a 3x3 matrix of combinations among compact, contiguous and chunked layouts (see the example below)
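An illustrative conversion (the dataset and file names are hypothetical; the -l syntax appears elsewhere in this log):
    ./h5repack -i in.h5 -o out.h5 -l mydset:CHUNK=20x20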
Solution:
Platforms tested:
linux
afs has problems; I could not telnet to sol and copper, arabica is really slow (meaning
waiting 1 minute for a typed character), and writing a file gave an error:
arabica 181% afs: failed to store file (145)
afs: failed to store file (145)
Misc. update:
bug fix
Description:
the syntax of the h5repack input contained double quotes and spaces, which
were causing parsing problems on AIX parallel
Solution:
replaced the spaces with =
that is, instead of -f "GZIP 6"
we now have
-f GZIP=6
Platforms tested:
linux
solaris
AIX parallel
Misc. update:
1) new function for tools library
2) new test script for h5repack
Description:
1) currently none of the tools (h5dump, h5diff, etc.) check whether a filter is available
for reading a dataset that might have a filter not available in the current configuration (the behaviour
of the tools until now was to trigger a library error, saying that the dataset cannot be read
due to the lack of the filter)
Solution:
1) added a new function h5tools_canreadf that checks whether a dataset can be read,
depending on the availability of filters.
This function was added in calls for h5diff and h5repack.
Instead of triggering the library error, a message is printed saying that the dataset
cannot be read (the print is optional; it is on in verbose mode). See the sketch below.
2) added a shell script that tests the command line behaviour of h5repack.
The script does a series of runs of h5repack with several options on the same file (this file, test4.h5,
was added to the testfiles dir).
Then it runs the h5diff tool with the input and output files in each run.
The goal of the test is also to check item 1). The binary file was saved with filters
that might not be available in other configurations.
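A sketch of the kind of availability check involved (illustrative only, written against the modern H5Pget_filter2 call; this is not the actual h5tools_canreadf source):

    /* Illustrative: return 1 when every filter in a dataset's pipeline
     * is available in the current library configuration. */
    static int filters_available(hid_t dcpl_id)
    {
        int nfilters = H5Pget_nfilters(dcpl_id);
        for (int i = 0; i < nfilters; i++) {
            unsigned     flags;
            size_t       cd_nelmts = 0;
            H5Z_filter_t filt = H5Pget_filter2(dcpl_id, (unsigned)i, &flags,
                                               &cd_nelmts, NULL, 0, NULL, NULL);
            if (filt < 0 || H5Zfilter_avail(filt) <= 0)
                return 0;   /* unknown filter, or not compiled in */
        }
        return 1;
    }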
Platforms tested:
linux (all filters enabled)
linux (some filters disabled)
solaris (some filters disabled)
AIX (some filters disabled)
windows (all filters on and off )
Misc. update:
bug fix in H5Zshuffle.c
add more tests to h5repack that exposed the bug
Description:
when creating a dataset with the shuffle filter and duplicating it in a new dataset (file),
the call to H5Z_set_local_shuffle failed. This is because the value of cd_nelmts in the filter
structure is set to 1 (H5Z_SHUFFLE_TOTAL_NPARMS) when the original dataset is created, but when
the new dataset is created there is a check that fails if the value of
cd_nelmts is not 0 (its original value, H5Z_SHUFFLE_USER_NPARMS).
Solution:
just remove that check condition, since the value of cd_nelmts is not used anyway.
If we decide that the value of cd_nelmts is necessary, then the H5O_pline_copy function
must be changed to update this value (a different update for each filter).
Platforms tested:
linux
solaris
AIX
Misc. update:
new feature, bug fix, changed function
Description:
1) implemented the option that says: if the dataset is too small, do not compress it (see the sketch after this list)
2) bug fix in the SZIP checking: only apply szip to atomic datatypes
3) made the apply_filters function more compact
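A sketch of the small-dataset check from item 1) (all names here are hypothetical, not the actual h5repack source; only H5Tget_size is real API):

    /* Illustrative: report whether a dataset's raw data falls below a
     * user-chosen compression threshold (min_comp is hypothetical). */
    static int too_small_to_compress(hsize_t nelmts, hid_t ftype, hsize_t min_comp)
    {
        return (nelmts * H5Tget_size(ftype)) < min_comp;   /* bytes of raw data */
    }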
Solution:
Platforms tested:
linux
AIX
solaris
Misc. update:
h5repack new feature
Description:
in the SZIP settings, when the requested pixels-per-block parameter does not conform
to the SZIP specification, instead of returning without applying the filter,
attempt to set this parameter to a valid value, issuing a warning in the process (see the sketch below)
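A sketch of the snapping logic (the exact rules h5repack applies may differ; the even 2-32 range follows the SZIP syntax described elsewhere in this log):

    #include <stdio.h>

    /* Illustrative: coerce a requested SZIP pixels-per-block value into a
     * legal one (an even number between 2 and 32), warning on adjustment. */
    static unsigned valid_szip_ppb(unsigned requested)
    {
        unsigned ppb = requested;
        if (ppb % 2)  ppb--;           /* must be even */
        if (ppb < 2)  ppb = 2;         /* lower bound  */
        if (ppb > 32) ppb = 32;        /* upper bound  */
        if (ppb != requested)
            fprintf(stderr, "warning: pixels per block adjusted from %u to %u\n",
                    requested, ppb);
        return ppb;
    }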
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
bug fix
Description:
avoid reading and writing data when one of the dimensions is 0 (the attributes case)
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
added h5repack and h5diff support for copying, and diffing, references to dataset regions
modified the behavior of the attribute diff when a difference in name is detected:
in the attribute cycle (over the number of attributes of the object), instead of exiting the
cycle, continue
Description:
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update: