Apple Swift version 4.1 (swiftlang-902.0.48 clang-902.0.37.1)
Target: x86_64-apple-darwin17.5.0
Running on MacBook Pro (13-inch, 2016, Two Thunderbolt 3 ports) with macOS 10.13.4 (17E202); Xcode: Version 9.3.1 (9E501)
Additional Detail from JIRA
Votes: 0
Component/s: Foundation, Standard Library
Labels: Bug
Assignee: None
Priority: Medium
md5: eaf4c498641b6112ee4c7dc40fff5a78
Issue Description:
There's been a change in how Swift handles casting of numeric values, and it's either really buggy or unintuitive. Please see the examples below. All testing was done with:
Apple Swift version 4.1 (swiftlang-902.0.48 clang-902.0.37.1)
Target: x86_64-apple-darwin17.5.0
On one hand it looks as though casting between numeric types has been disabled entirely, whether I'm widening to a type that can hold every value of the source (e.g. casting Int8 to Int16) or going the other way (which might reasonably be blocked, since the result depends on the runtime value).
On the other hand, conversions of floating-point literals (which I believe default to Double in Swift) to Float, as in the second and third examples, work without a problem.
I can't really understand the logic behind `0.1 as Float` and `0.1 as? Float` producing completely different results.
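For instance, in a plain Swift file with no `import Foundation` (the variable names here are illustrative), a conditional cast between two unrelated integer types always fails at runtime, while the labeled initializer is the supported conversion path:

```swift
let small: Int8 = 13

// A conditional cast between unrelated numeric types fails here; the
// compiler even warns "cast from 'Int8' to unrelated type 'Int16' always fails".
let widened = small as? Int16      // nil

// The supported way to convert is the initializer:
let converted = Int16(small)       // 13
```

(With Foundation imported, `as?` between numeric types can instead route through NSNumber bridging, which is part of the confusion described below.)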
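The difference can be reproduced in two lines (a minimal sketch, assuming no Foundation import, so no NSNumber bridging is involved):

```swift
// `as` types the literal directly: the compiler builds a Float from "0.1".
let direct = 0.1 as Float          // 0.1, already a Float

// `as?` is a runtime cast, so the literal first becomes its default type,
// Double, and that Double is then conditionally cast to Float -- which fails.
let conditional = 0.1 as? Float    // nil
```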
The problem extends to NSNumber objects, which behave really strangely when it comes to casting.
Seeing these tests in action, I can't really tell when it's possible to cast an NSNumber to Float. Why is it possible to cast an NSNumber with a value of 13 to Float, while an Int64 with the same value isn't castable?
Can someone please let me know what's going on here?
This is correct behavior. In Swift, literals don't have an intrinsic type, just a default one. So when you write `1.0 as Float` or even `10 as Float`, the compiler knows the value should be a Float; it doesn't start with a Double or an Int and then convert. `as?`, however, is only used for converting between runtime values, so the compiler falls back to the default type for the literal. And indeed, the basic number types cannot be cast to one another in Swift.
NSNumber does come along and confuse things. It's mostly just there for compatibility with Objective-C, but it can hold any of the basic number types. Because of that, it's possible to convert to and from NSNumber.
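To sketch that compatibility role (with Foundation imported; the values here are illustrative): NSNumber can wrap any of the basic numeric types and hand the value back in whichever representation you ask for.

```swift
import Foundation

// NSNumber can wrap any of the basic numeric types...
let fromInt = NSNumber(value: 42)        // wraps an Int
let fromDouble = NSNumber(value: 3.5)    // wraps a Double

// ...and convert back to whichever type you request.
let asDouble = fromInt.doubleValue       // 42.0
let asInt = fromDouble.intValue          // 3 (truncated toward zero)
```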
NSNumber does provide a `floatValue`, but the object itself cannot be cast to a Float.
The problem mostly arises when working with JSON-based APIs: the data format carries no type information, so parsers use NSNumber to make sure every value can be represented. Then it turns out I run into a lot of problems, because values may or may not be castable to Float or other numeric types, and all of this depends on what data the server actually sent.
@phausler can probably explain this better than me, but the basic idea is that a cast to Float will fail whenever doing so would lose precision. Since "0.1" (a value written in JSON) can't be precisely represented by a binary floating-point value, both the Float and Double representations are approximations. The JSON parser in Foundation uses Double for all floating-point values because it's a better approximation.
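That rule can be sketched as follows (assuming Swift 4.1+ with Foundation; a cast out of NSNumber is expected to succeed only when the stored value round-trips through the target type unchanged):

```swift
import Foundation

// 13 is exactly representable as a Float, so the cast succeeds.
let exact = NSNumber(value: 13) as? Float        // Optional(13.0)

// 0.1 arrives from the JSON parser as a Double; Float(0.1) is a different
// approximation, so the cast would lose precision and therefore fails.
let inexact = NSNumber(value: 0.1) as? Float     // nil

// Converting deliberately, accepting the precision loss, always works:
let truncated = NSNumber(value: 0.1).floatValue
```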