When ASP's CInt( ( 0.3 - 0.2 ) * 10 ) gives you 1 but PHP's (int)( ( 0.3 - 0.2 ) * 10 ) gives you 0,
and when MySQL's SELECT CAST( 3 * ( 1.0 / 3 ) AS SIGNED ) gives you 1 but MSSQL's SELECT CAST( 3 * ( 1.0 / 3 ) AS INT ) gives you 0,
how accurate a result can you expect from your web application?
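
The discrepancy is not in the arithmetic but in the cast. In IEEE 754 doubles, ( 0.3 - 0.2 ) * 10 comes out just under 1, and the decimal result of 3 * ( 1.0 / 3 ) is likewise just under 1; CInt and MySQL's CAST round that value to the nearest integer, while PHP's (int) and MSSQL's CAST drop the fractional part. A minimal JavaScript sketch of the same divide, with Math.round and Math.trunc standing in for the rounding and truncating casts:

    // JavaScript uses the same IEEE 754 doubles: the intermediate value
    // falls just short of the exact answer, so everything hinges on
    // whether the cast rounds or truncates.
    const v = (0.3 - 0.2) * 10;   // 0.9999999999999998, not 1

    console.log(Math.round(v));   // 1: a cast that rounds to nearest, like ASP's CInt
    console.log(Math.trunc(v));   // 0: a cast that truncates toward zero, like PHP's (int)
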
When your web application is built on top of the myth of floating point, you can't even expect 0.1 from JavaScript's (0.3 - 0.2); it gives you 0.09999999999999998…
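
That last number is not a JavaScript bug: 0.3, 0.2, and 0.1 simply have no exact binary representation, so the usual workarounds are to compare within a tolerance or to keep the arithmetic in integer units. A small sketch, assuming a tolerance of 1e-9 is acceptable (the nearlyEqual helper name is just for illustration):

    console.log(0.3 - 0.2);                    // 0.09999999999999998

    // Compare within a tolerance instead of expecting exact equality.
    function nearlyEqual(a, b, eps) {
      return Math.abs(a - b) < (eps || 1e-9);
    }
    console.log(nearlyEqual(0.3 - 0.2, 0.1));  // true

    // Or do the arithmetic on integers (here: tenths) and divide once at the end.
    console.log((3 - 2) / 10);                 // 0.1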