author	Joseph Myers <joseph@codesourcery.com>	2014-03-04 14:16:25 +0000
committer	Joseph Myers <joseph@codesourcery.com>	2014-03-04 14:16:25 +0000
commit	45adef3cf2057aa1f7e2b7479e5f1bcb7506140c (patch)
tree	f66231bbd415777b1b5de7a3f2181d332d0076d4 /ports/sysdeps
parent	d4b17258bba38f206079fbae1e7255779db1b74c (diff)
download	glibc-45adef3cf2057aa1f7e2b7479e5f1bcb7506140c.tar.gz
Fix libm-test.inc:print_complex_max_error handling of some error cases.
When regenerating ulps incrementally with "make regen-ulps", the resulting diffs should only increase existing ulps, never decrease them. This allows successive uses of "make regen-ulps" on different hardware or with different compiler configurations to accumulate ulps that are sufficient for tests to pass in a variety of configurations. However, sometimes changes that decrease ulps are wrongly generated; thus, when applying <https://sourceware.org/ml/libc-alpha/2014-02/msg00605.html> I had to remove such changes manually.

The problem is print_complex_max_error. If the ulps for either the real or the imaginary part of a function are out of range, this function prints the maximum ulps seen for both parts, which then replace those previously in libm-test-ulps. So if the ulps for one part are bigger than recorded before, but those for the other part are smaller, the diffs reduce existing ulps. This patch fixes the logic so that only increased ulps get printed.

Tested x86_64 ("make math/tests", and "make regen-ulps" in a situation with ulps manually modified so one part would go up and the other down, to confirm the changes have the intended effect there).

	* math/libm-test.inc (print_complex_max_error): Check separately
	whether real and imaginary errors are within allowed range and
	pass 0 to print_complex_function_ulps instead of value within
	allowed range.
Diffstat (limited to 'ports/sysdeps')
0 files changed, 0 insertions, 0 deletions