Bug fix.
Description:
If the H4DUMP (hdp) is not executable for some reason (e.g., not
in $PATH), it produced zero-sized files. The test script did not
detect the abnormality; it compared the zero-sized output files,
found them identical, and concluded that the test passed.
Solution:
Test whether H4DUMP can produce valid output. If not, print warning
messages. Also check the size of the output files to make sure they
are reasonably valid.
Platform tested:
Linux.
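
A minimal sketch of the kind of sanity check described above, assuming
the dumper is reachable through an H4DUMP variable; the `list'
subcommand and the test file name are only illustrative, not the
actual testh5toh4 code:

    # Run the dumper once on a known-good HDF4 file before trusting
    # any output comparisons.
    H4DUMP=${H4DUMP:-hdp}
    $H4DUMP list tvattr.hdf > hdp_check.out 2>/dev/null
    if [ $? -ne 0 ] || [ ! -s hdp_check.out ]; then
        # A failed or empty dump would make two zero-sized files
        # compare equal, so warn instead of reporting a bogus pass.
        echo "**WARNING: $H4DUMP is not executable or produced no output"
        echo "**WARNING: h5toh4 conversion results cannot be verified"
        exit 0
    fi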
Changes since 19990615
----------------------
./README
Version number synchronized with library.
./bin/h5vers
If the version number of the library is changed, then the first
line of the README file is also changed to something like
This is hdf5-1.2.3 currently under development
(a sketch of the idea follows this list). The `release' script
(which also gets run by `snapshot') changes that line to include
the release date but keeps the version number the same. The net
effect is that the version numbers in README and H5public.h should
now always stay synchronized.
./bin/snapshot
The CVS checkin comment includes the version number for the
snapshot that was just made.
./tools/testh5toh4
Changed `*-SKIP-*' to `-SKIP-' to be consistent with the other
tests.
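
The README update can be pictured with a small shell sketch; this is
only an illustration of the idea, not the actual h5vers script, and
the version number shown is made up:

    # Rewrite the first line of README to carry the current version.
    version=1.2.3    # in reality taken from the version in H5public.h
    sed "1s/.*/This is hdf5-$version currently under development/" README \
        > README.tmp && mv README.tmp README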
(Solaris,
IRIX) misinterpreted it to mean deleting the letter 'n'.
Tested on IRIX, Solaris, and FreeBSD machines.
hdp before HDF 4.1r3 does not handle Vgroup loops correctly.
Solution:
Added code to detect the library version of the hdp command and to
skip the loop tests when an older hdp is detected. Tested on an old
Solaris machine with HDF 4.1r2, on Solaris 2.6, and on Hawkwind.
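
A rough sketch of the version check; how the version string (e.g.
"4.1r2") is obtained from hdp is left out, and the helper name and
HDP_VERSION variable are made up:

    # Return success if the given HDF version string is 4.1r3 or
    # newer, i.e. an hdp that handles Vgroup loops.
    hdp_handles_loops() {
        ver=$1                                   # e.g. "4.1r2"
        major=`echo "$ver" | cut -d. -f1`
        minor=`echo "$ver" | cut -d. -f2 | cut -dr -f1`
        rel=`echo "$ver" | cut -dr -f2`
        [ "$major" -gt 4 ] && return 0
        [ "$major" -lt 4 ] && return 1
        [ "$minor" -gt 1 ] && return 0
        [ "$minor" -lt 1 ] && return 1
        [ "$rel" -ge 3 ]
    }

    if hdp_handles_loops "$HDP_VERSION"; then
        :   # run the Vgroup loop tests
    else
        echo "-SKIP- Vgroup loop tests (hdp older than HDF 4.1r3)"
    fi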
output.
the generated hdf files against saved output of hdp. This did not
work well: whenever hdp changed its output format, the tests failed
unnecessarily. The tests also failed if the test machine used a
different version of the HDF library from the HDF5 development
machine.
Changed the algorithm to compare the generated HDF files against
saved HDF files (first by a simple cmp; if that fails, compare the
output of the host machine's hdp on both HDF files).
Tested on Hawkwind (FreeBSD) with the srcdir option and on Baldric
(Solaris) without the srcdir option.
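
A rough sketch of the two-stage comparison described above; the
helper name, file names, and the hdp `list' subcommand are
placeholders rather than the actual script:

    # Compare a freshly converted HDF4 file against a saved reference.
    # Try a byte-for-byte cmp first; if that differs, compare what the
    # local hdp reports about each file instead, so differences caused
    # merely by library version or file layout do not fail the test.
    compare_hdf4() {
        actual=$1 ; expected=$2
        cmp -s "$actual" "$expected" && return 0
        $H4DUMP list "$actual"   > actual.dump
        $H4DUMP list "$expected" > expected.dump
        diff actual.dump expected.dump
    }

    compare_hdf4 tvattr.hdf testfiles/tvattr.hdf || echo "*FAILED*"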
while-loop. Prepare it for the next revision, in which hdp will be
applied to both the converted hdf4 file and a preserved hdf4 file,
and the hdp output from both compared to check the correctness of
the h5toh4 converter.
test. This would signal to make that something is incorrect.
New feature
Problem:
The h5toh4 converter tester, testh5toh4, is set up to place output
files in the same directory as the input files. A difficulty comes up
when the input files come off write-protected media, such as a CDROM.
Solution:
Rather than using "cd" to change directory and referencing files by
short filename only, "input directory" and "output directory" are
defined explicitly, and files are always referenced with pathnames
included. For cases when the converter generates the output filename,
a copy of the input file is first placed in the "output directory".
The copied input file is used by h5toh4 and then removed.
On Solaris2.5, the following sequence of commands seemed to work fine:
$ gunzip < hdf5-1.1.72.tar.gz | tar xf -
$ chmod -R ugo-w hdf5-1.1.72
$ mkdir build
$ cd build
$ ../hdf5-1.1.72/configure --enable-production --disable-debug \
>     --with-hdf4=... --with-zlib=...
$ make check
This change should allow the tester to be used when the hdf5 source
is on read-only media such as a CDROM.
Platform tested:
Solaris2.5
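
A sketch of the copy-then-convert workaround described above; the
directory variables and file name are placeholders, and it assumes
that an h5toh4 call with only an input argument derives the output
name from it and writes next to it:

    INDIR=$srcdir/testfiles      # possibly read-only (e.g. a CDROM)
    OUTDIR=./testfiles           # always writable

    # Copy the input beside where the output will be generated,
    # convert the copy, then remove it so only the converted file
    # remains.
    cp "$INDIR/tvattr.h5" "$OUTDIR/tvattr.h5"
    h5toh4 "$OUTDIR/tvattr.h5"
    rm -f "$OUTDIR/tvattr.h5"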
----------------------
./configure.in
./configure [REGENERATED]
./config/commence.in
A few tweaks to the makefile rules for rebuilding makefiles.
./src/H5detect.c
Fixed a really stupid mistake: resetting the signal handler
after a longjmp(). This should fix Bob's SIGBUS on Solaris.
New feature
Solution:
Add string conversion testers
Platform tested:
Solaris2.5
Conform more closely to the other tests' print-out.
Solution:
Added "All h5dump tests passed." statement to output
of testh5dump.sh when appropriate. Similarly, added
"All h5toh4 tests passed." statement to output of
testh5toh4 when appropriate.
Also, added the testing of converting H5 files with
loop pathways into H4 files with recursive references.
Platforms tested:
Solaris2.5, Digital Unix 4.0
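
The summary line can be pictured with a fragment like the following;
the failure-counter name is only illustrative:

    nerrors=0
    # ...each individual test bumps the counter on failure:
    #     nerrors=`expr $nerrors + 1`

    if test $nerrors -eq 0 ; then
        echo "All h5toh4 tests passed."
    fi
    exit $nerrors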
Conform more closely to the other tests' print-out.
Problem:
Multiple "PASSED" statements were printed for a single test when the
test involves the conversion of multiple files. There should be a
single "PASSED" statement per test.
Solution:
A "*FAILED*" statement is given if the conversion fails for any file,
along with the differences between the expected and actual results
for each file that fails. A "PASSED" statement is given only when the
conversion is successful for all files.
Platform tested:
Solaris2.5
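
A sketch of the single-verdict reporting; the file list and the
two-argument h5toh4 call are placeholders:

    status=0
    for f in tvattr.h5 tdset.h5 tgroup.h5; do
        out=`basename "$f" .h5`.hdf
        if h5toh4 "$f" "$out" && cmp -s "$out" "expected/$out"; then
            :                        # this file converted correctly
        else
            status=1                 # remember the failure...
            # ...and report the differences for this file here.
        fi
    done
    # Exactly one verdict for the whole test.
    if [ $status -eq 0 ]; then
        echo " PASSED"
    else
        echo "*FAILED*"
    fi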
New feature, Documentation
Solution:
Changed h5toh4 testing output to more closely resemble other HDF5
testing output.
Also added documentation to describe the purposes of the
conversion test files.
Platform tested:
Solaris2.5
Problems:
There were three separate bugs in the h5toh4 converter that were
worked on. They were:
1) When a loop was detected, the H4 file was missing references
to paths which were available in the H5 file.
2) When an H4 SDS or Vdata was created from an H5 dataset, the H4
object was referenced in the root group instead of in the correct
Vgroup.
3) The FIRST path taken to an object for the h5toh4 conversion
could not involve a SOFTLINK.
Solutions:
The bug fixes were:
1) All of the associated references to paths available in the H5
file now appear in the H4 file.
2) After an H4 SDS or Vdata is created from an H5 dataset, the H4
object is tag/ref'ed in the appropriate Vgroup.
3) The FIRST path taken to an H5 object for the h5toh4 conversion
may involve a HARDLINK, a SOFTLINK, or neither. The same is
true of any additional paths to the same object.
Platform tested:
Solaris2.5
New feature
Solution:
Testing of extendable dataset support in the h5toh4 converter when
the extendable dimension is the first dimension.
Platform tested:
Solaris2.5
New Feature
Solution:
This is a tester for the h5toh4 converter. It runs the converter
on a number of HDF5 files to produce HDF4 files. After each HDF4
file is produced, the file is run through the HDF4 dumper, hdp.
The dumper results are compared with the expected results and any
differences are noted.
Note: The "/hdf4/bin" directory needs to be part of the tester's
$PATH.
Platform tested:
Solaris2.5, HP10.20
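
The overall shape of the tester might be sketched as follows; the
file names and the hdp `list' subcommand are placeholders, not the
real script:

    # The HDF4 utilities must be reachable, e.g.:
    PATH=/hdf4/bin:$PATH
    export PATH

    # Convert, dump the result with hdp, and diff the dump against a
    # saved expected dump.
    h5toh4 tvattr.h5 tvattr.hdf
    hdp list tvattr.hdf > tvattr.out
    if diff tvattr.out expected/tvattr.out ; then
        echo " PASSED"
    else
        echo "*FAILED*"
    fi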