Commit messages
|
Updates Windows thread-safe code in H5TS.c to use _beginthread instead of CreateThread.
Tested on 64-bit Windows 7 with Visual Studio 2010 using CMake. Both 32- and 64-bit builds were tested.
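For context, here is a minimal sketch of the pattern this change adopts (an illustrative example, not the H5TS.c code itself): _beginthread from the C runtime is used instead of the raw Win32 CreateThread so that per-thread CRT state is initialized and released correctly.

    #include <windows.h>
    #include <process.h>    /* _beginthread */
    #include <stdint.h>
    #include <stdio.h>

    /* _beginthread expects a void (__cdecl *)(void *) start routine. */
    static void __cdecl worker(void *arg)
    {
        printf("worker received %d\n", *(int *)arg);
        /* The handle returned by _beginthread is closed automatically on exit. */
    }

    int main(void)
    {
        int value = 42;
        uintptr_t h = _beginthread(worker, 0 /* default stack size */, &value);
        if (h == (uintptr_t)-1L)
            return 1;       /* thread creation failed */
        Sleep(100);         /* crude wait; real code would synchronize properly */
        return 0;
    }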
|
Add test case to the Unix test script for "HDFFV-2567 - added test for BE-generated files with at least 9 attributes at root". It was committed previously as r21812 along with the CMake test.
Tested:
jam (linux32-LE), koala (linux64-LE), heiwa (linuxppc64-BE), tejeda (mac32-LE)
|
attributes at root
Tested: local linux
|
standard 2.8.6
|
Tested: durandal
|
Back out r21782 while I figure out what the problem is with the change.
Tested on:
Daily tests... :-/
|
Description:
When shrinking a chunked dataset, the library fills in the unused parts of
chunks that have been shrunk. The fill value buffer allocated for this purpose
had a maximum size of 1 MB, but the fill was performed in a single operation.
Therefore, if the amount of unused space in a chunk after being shrunk was
greater than 1 MB, the library would read off the end of the fill value buffer.
Changed the maximum fill buffer size to be equal to the chunk size.
Tested: durandal; jam, koala, heiwa (h5committest)
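To make the failure mode concrete, here is a rough sketch of the kind of operation that exercised this path (the file name, dataset name, and sizes below are invented for illustration): a dataset with a multi-megabyte chunk is shrunk with H5Dset_extent, leaving more than 1 MB of the chunk to be re-filled with the fill value.

    #include "hdf5.h"
    #include <stdlib.h>

    int main(void)
    {
        hsize_t dims[1]     = {1048576};   /* 1M ints: the single chunk holds 4 MB */
        hsize_t chunk[1]    = {1048576};
        hsize_t new_dims[1] = {1024};      /* shrink so > 1 MB of the chunk is unused */
        int     fill        = -1;
        int    *buf         = calloc(dims[0], sizeof(int));

        hid_t file  = H5Fcreate("shrink.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(1, dims, dims);
        hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 1, chunk);
        H5Pset_fill_value(dcpl, H5T_NATIVE_INT, &fill);

        hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_INT, space,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);
        H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

        /* Shrinking re-fills the now-unused tail of the chunk with the fill value;
         * with the old 1 MB cap on the fill buffer this could read past its end. */
        H5Dset_extent(dset, new_dims);

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        free(buf);
        return 0;
    }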
|
Tested: local linux
|
CMAKE_BUILD_TYPE is defined.
Also correct use of CMAKE_ANSI_FLAGS for passing into sub-projects.
|
Rearrange checks for reasons why we break collective I/O back to independent
I/O into "global" and "local" sections. We should try to minimize the checks
in the "local" section...
Tested on:
Mac OS X/32 10.7.2 (amazon) w/parallel
(too minor to require h5committest)
|
Added test for h5dget_space_status_f.
Tested: jam (gnu, intel, pgi)
|
Changed the INTENT of flag from IN to OUT for h5dget_space_status_f; the value gets changed by the routine, so it should be intent(OUT). It is already correct in the documentation.
Tested: jam (gnu, intel, pgi)
|
external libraries.
Tested: local linux
|
fortran/src and hl/fortran/src and the install command.
|
internally. Corrected path component in EXTERNAL_ZLIB_LIBRARY macro.
Tested: linux
|
Revert part of r21275 (F2003 configure change) which unintentionally
removed a line from configure.in that sets FC=no when Fortran
is not enabled. This ensures that configure doesn't run
compiler checks on a Fortran compiler when it won't be used
(running those checks can cause failures in configure when no Fortran
compiler is present, as well as issues with the resulting src/Makefile
when building DLLs on Cygwin).
Tested:
h5committest; manually on jam & bangan (Cygwin).
|
between ASCII and UTF8. I added more test cases to the previous commit. It now covers conversion from UTF8 to ASCII, ASCII to UTF8, VL and fixed-length strings, and H5Tconvert.
Tested on jam, koala, linew.
|
between ASCII and UTF8. I corrected it by adding a condition check in H5T_conv_s_s and H5T_conv_vlen to report an error in this situation.
Tested on jam, koala, linew.
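As a rough illustration of the situation described in these two entries (an assumed standalone example, not the library's test code), converting string data between two datatypes that differ only in character set is now expected to be reported as an error by H5Tconvert:

    #include "hdf5.h"
    #include <string.h>

    int main(void)
    {
        /* Fixed-length, 16-byte string types that differ only in character set. */
        hid_t ascii_type = H5Tcopy(H5T_C_S1);
        hid_t utf8_type  = H5Tcopy(H5T_C_S1);
        H5Tset_size(ascii_type, 16);
        H5Tset_size(utf8_type, 16);
        H5Tset_cset(ascii_type, H5T_CSET_ASCII);
        H5Tset_cset(utf8_type, H5T_CSET_UTF8);

        char buf[16];
        memset(buf, 0, sizeof(buf));
        strcpy(buf, "hello");

        /* H5Tconvert converts the element in place; a negative return value
         * means the character-set mismatch was rejected, as intended here. */
        herr_t status = H5Tconvert(ascii_type, utf8_type, 1, buf, NULL, H5P_DEFAULT);

        H5Tclose(ascii_type);
        H5Tclose(utf8_type);
        return (status < 0) ? 0 : 1;   /* success when the conversion is refused */
    }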
|
Description: removed the temporary patch of
RUNSERIAL=${RUNSERIAL="env LDR_CNTRL=MAXDATA=0x20000000@DSA"}
and similarly for RUNPARALLEL, since the h5repack test failure
was fixed and this patch is no longer needed. (IBM also advises
not to hard-set MAXDATA if possible.)
Tested: NASA G-ADA AIX machine, both 32- and 64-bit modes.
(No parallel test because not able to build or run MPI executables yet.)
|
CentOS 4.6 system (compiled with gcc 4.6.2), we will no longer link to the bsd-compat library. Tested on jam (32-bit linux) and koala (64-bit linux). Threadsafe and parallel were also tested on both platforms.
|
Renamed the HDF5 file dsetf.h5 to dsetf_F03.h5 to avoid a possible conflict with the same file created by the fortranlib_test.f90 test.
Tested: jam (pgi)
|
Removed the use of INT() on an already INTEGER variable in the input argument to the verify routine; it was causing wrong verification results for the two integers on a Mac.
Tested: fred (gnu) in production mode.
|
Increase CTEST SUBMITRETRY TIME from 5 to 20
|
files updated to link with fortran mpi libs.
Updated cacheinit.cmake to set num of procs to 3 for hdf testing.
Tested: local linux and on windows 7
|
MPI_XXX to MPI_C_XXX.
Tested: local windows
|
in fortranlib_test.f90. I think it was causing the
make check -j8 race condition failures in daily tests.
Tested jam (gfortran, intel)
|
> make failed:
> "../../../hdf5/fortran/test/fortranlib_test_1_8.f90", line 642.15:
> 1513-041 (S) Arguments of the wrong type were specified for the
> INTRINSIC procedure "mod".
Fixed by making both arguments of MOD integers of kind size_t.
Tested: jam (gfortran, intel)
|
*Hostname: nsipada0X:
"../../../hdf5/fortran/src/H5Ef.c", line 301.40: 1506-280 (E) Function argument
assignment between types "int(*)(int,void*)" and "int(*)(int,struct {...}*)" is
not allowed.
Fixed by casting as H5E_auto2_t.
Tested: jam (gfortran, intel)
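The same kind of explicit cast appears in ordinary application code; as a small hedged example (not the actual H5Ef.c change): H5Eprint2 takes (hid_t, FILE *) rather than the exact H5E_auto2_t signature (hid_t, void *), so strict compilers such as IBM XL want the function pointer cast explicitly.

    #include "hdf5.h"
    #include <stdio.h>

    int main(void)
    {
        /* Re-install the default error printer; the explicit cast avoids the
         * "assignment between types ... is not allowed" diagnostic shown above. */
        H5Eset_auto2(H5E_DEFAULT, (H5E_auto2_t)H5Eprint2, stderr);

        /* Trigger an error on purpose: closing an invalid identifier. */
        H5Dclose(-1);

        return 0;
    }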
|
No test needed.
|
Don't check dataset storage size for compressed datasets with region
reference datatypes. (The address of the region reference type in the file
varies and affects the compressed size)
Tested on:
Mac OS X/32 10.7.2 (amazon) w/debug & production + check-vfd
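For context, a sketch of the kind of dataset involved (names and sizes below are invented): a deflate-compressed dataset of dataset-region references. The stored references encode file addresses, which can differ between otherwise identical files, so the compressed storage size is not a stable value to assert on.

    #include "hdf5.h"
    #include <stdio.h>

    int main(void)
    {
        hid_t file = H5Fcreate("regref.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

        /* A small source dataset for the region reference to point into. */
        hsize_t dims[1]  = {10};
        int     data[10] = {0};
        hid_t   src_space = H5Screate_simple(1, dims, NULL);
        hid_t   src = H5Dcreate2(file, "src", H5T_NATIVE_INT, src_space,
                                 H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Dwrite(src, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

        /* Build a region reference to a hyperslab of "src". */
        hsize_t start[1] = {2}, count[1] = {4};
        hdset_reg_ref_t ref;
        H5Sselect_hyperslab(src_space, H5S_SELECT_SET, start, NULL, count, NULL);
        H5Rcreate(&ref, file, "src", H5R_DATASET_REGION, src_space);

        /* Store the reference in a chunked, deflate-compressed dataset. */
        hsize_t rdims[1] = {1}, rchunk[1] = {1};
        hid_t   rspace = H5Screate_simple(1, rdims, NULL);
        hid_t   dcpl   = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 1, rchunk);
        H5Pset_deflate(dcpl, 6);
        hid_t   refs = H5Dcreate2(file, "refs", H5T_STD_REF_DSETREG, rspace,
                                  H5P_DEFAULT, dcpl, H5P_DEFAULT);
        H5Dwrite(refs, H5T_STD_REF_DSETREG, H5S_ALL, H5S_ALL, H5P_DEFAULT, &ref);

        /* The value printed here depends on the addresses encoded in the
         * reference, so tests should not compare it against a fixed number. */
        printf("storage size: %llu\n",
               (unsigned long long)H5Dget_storage_size(refs));

        H5Dclose(refs);
        H5Pclose(dcpl);
        H5Sclose(rspace);
        H5Dclose(src);
        H5Sclose(src_space);
        H5Fclose(file);
        return 0;
    }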
|
the size of compound data type through H5Tset_size immediately
after the type was created. I fixed it in this commit.
Tested on jam, linew, and koala.
|
No test needed.
|
OPTION command for solution folder and no packaging.
Tested: local linux