
[SR-8756] NSDecimalNumber(string:) regression in Xcode 10 GM (Returns NaN). #51264

Open
swift-ci opened this issue Sep 14, 2018 · 1 comment
Labels
access control, bug, compiler, regression, swift 4.2

Comments

@swift-ci
Collaborator

Previous ID: SR-8756
Radar: None
Original Reporter: freak4pc (JIRA User)
Type: Bug
Environment:

Xcode 10 GM
Xcode 9.4.1

Additional Detail from JIRA
Votes: 0
Component/s: Compiler
Labels: Bug, 4.2Regression
Assignee: None
Priority: Medium

md5: 215ce993c53eac04883adc50b9b63c05

Issue Description:

While running our unit tests, we spotted a weird regression in a conversion to `NSDecimalNumber` that behaves very oddly, specifically in Xcode 10.

I reduced the issue to as few lines as possible.

Given the following, seemingly unrelated extension:

extension Dictionary where Key == String {
    func value<T>(for key: Key, or: T) -> T {
        return self[key] as? T ?? or
    }
}

The following code behaves irregularly (see Example 2). It seems related to type inference, but I'm not entirely sure how, since the `String` return type is explicit; this wasn't an issue in Xcode 9.

import Foundation

let json = ["x": "5", "b": "12"]

// ### Example 1 ###
// #################
// Returns 5, as expected, on all versions of Xcode
print(#line, NSDecimalNumber(string: json.value(for: "x", or: "0")))

// ### Example 2 ###
// #################
// Xcode 9.4.1, Swift 4.1: Prints 0.
// Xcode 10 GM, Swift 4.2 mode: Should be 0, but prints NaN.
// Xcode 10 GM, Swift 4 mode: Should be 0, but prints NaN.
print(#line, NSDecimalNumber(string: json.value(for: "y", or: "0")))

// ### Example 3 ###
// #################
// Same, but with an explicit cast; works correctly on all versions of Xcode
print(#line, NSDecimalNumber(string: json.value(for: "y", or: "0") as String))

// ### Example 4 ###
// #################
// Same, but extracting the string into a separate constant first.
// Works as expected on all versions of Xcode
let value = json.value(for: "y", or: "0") // Returns "0" on all versions of Xcode
print(#line, NSDecimalNumber(string: value))
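
For completeness, here is a variant of the helper (hypothetical, not part of the code under test) that should keep Example 2 at 0 regardless of which type the compiler picks for T: unwrap the lookup before casting, so a missing key always falls back to the default.

extension Dictionary where Key == String {
    // Hypothetical variant (would replace the original value(for:or:)):
    // unwrap the lookup before casting, so a missing key always returns the
    // fallback, even when T is inferred as an optional type such as String?.
    func value<T>(for key: Key, or fallback: T) -> T {
        guard let raw = self[key], let typed = raw as? T else { return fallback }
        return typed
    }
}

With that body, the nil lookup for "y" never reaches the cast, so both Xcode 9 and Xcode 10 print 0 for Example 2.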

Appreciate your help!
Shai.

@belkadan
Contributor

`NSDecimalNumber.init(string:)` takes an optional `String`, so `T` is getting inferred as `String?` rather than `String` in Xcode 10. That matches the dictionary lookup directly, so the nil from the missing key survives the `as?` cast and the `??` fallback, and you end up passing nil to the initializer.

@rudkx, we have another bug for this behavior change, right?
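
For illustration, here is a self-contained sketch of that inference difference, reusing the extension from the report; the explicit `String?`/`String` annotations stand in for what Xcode 10 and Xcode 9 infer and are not part of the original code:

import Foundation

extension Dictionary where Key == String {
    func value<T>(for key: Key, or: T) -> T {
        return self[key] as? T ?? or
    }
}

let json = ["x": "5", "b": "12"]

// T == String? (matches NSDecimalNumber.init(string: String?), the Xcode 10 inference):
// json["y"] is nil, the conditional cast to String? still succeeds as .some(nil),
// and ?? only unwraps the outer optional, so the "0" fallback is never used.
let inferredOptional: String? = json.value(for: "y", or: "0")
print(NSDecimalNumber(string: inferredOptional))   // NaN

// T == String (the Xcode 9 inference): the cast of the nil lookup fails,
// so ?? supplies "0" and the conversion succeeds.
let inferredString: String = json.value(for: "y", or: "0")
print(NSDecimalNumber(string: inferredString))     // 0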

swift-ci transferred this issue from apple/swift-issues on Apr 25, 2022
AnthonyLatsis added the regression swift 4.2 and access control labels and removed the 4.2 regression label on Nov 19, 2022