integer: remove explicit casts from _MIN definitions

The OpenCL C spec (version 1.2, section 6.12.3) says:
  The macro names given in the following list must use the values
  specified. The values shall all be constant expressions suitable
  for use in #if preprocessing directives.

This commit addresses the second part of that statement.
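
For example (a minimal sketch, not taken from the spec or from libclc), a
preprocessor guard such as the following only works when INT_MIN expands to
a plain integer constant expression, with no casts or type names in it:

  #if INT_MIN < -2147483647
  /* two's complement: INT_MIN is one below -INT_MAX */
  #endif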

Reviewed-by: Jan Vesely <jan.vesely@rutgers.edu>
Reviewed-by: Tom Stellard <tom@stellard.net>
CC: Moritz Pflanzer <moritz.pflanzer14@imperial.ac.uk>
CC: Serge Martin <edb+libclc@sigluy.net>
llvm-svn: 249445
Author: Aaron Watry
Date:   2015-10-06 19:12:12 +00:00
Commit: 23faa5a1f9 (parent 017bfee456)
1 file changed, 3 insertions(+), 3 deletions(-)


@@ -1,14 +1,14 @@
 #define CHAR_BIT 8
 #define INT_MAX 2147483647
-#define INT_MIN ((int)(-2147483647 - 1))
+#define INT_MIN (-2147483647 - 1)
 #define LONG_MAX 0x7fffffffffffffffL
 #define LONG_MIN (-0x7fffffffffffffffL - 1)
 #define CHAR_MAX SCHAR_MAX
 #define CHAR_MIN SCHAR_MIN
 #define SCHAR_MAX 127
-#define SCHAR_MIN ((char)(-127 - 1))
+#define SCHAR_MIN (-127 - 1)
 #define SHRT_MAX 32767
-#define SHRT_MIN ((short)(-32767 - 1))
+#define SHRT_MIN (-32767 - 1)
 #define UCHAR_MAX 255
 #define USHRT_MAX 65535
 #define UINT_MAX 0xffffffff
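
Why the casts matter for #if (an illustrative sketch, not part of the changed
file; the INT_MIN_OLD/INT_MIN_NEW names are made up for this example): inside
an #if expression the preprocessor replaces any remaining identifier,
including the type name "int", with 0, so the cast form expands to an invalid
expression, while the uncast form stays a plain integer constant expression.

  /* Hypothetical macro names, for illustration only. */
  #define INT_MIN_OLD ((int)(-2147483647 - 1))   /* pre-patch form  */
  #define INT_MIN_NEW (-2147483647 - 1)          /* post-patch form */

  /* A plain integer constant expression is usable in #if: */
  #if INT_MIN_NEW < 0
  #define MIN_IS_NEGATIVE 1
  #endif

  /* The cast form is not: after macro expansion, "int" is replaced by 0,
   * leaving ((0)(-2147483647 - 1)), which is not a valid #if expression.
   * Enabling the next two lines triggers a preprocessing error:
   */
  /* #if INT_MIN_OLD < 0 */
  /* #endif */

  int main(void) { return 0; }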