
[SR-12974] JSONEncoder encoding non fractional double value as Int64 #3253

Closed

LucianoPAlmeida opened this issue Jun 10, 2020 · 3 comments

@LucianoPAlmeida (Contributor)

Previous ID SR-12974
Radar None
Original Reporter @LucianoPAlmeida
Type Bug
Status Closed
Resolution Invalid

Attachment: Download

Environment

Xcode 11.5

Additional Detail from JIRA
Votes 0
Component/s Foundation
Labels Bug
Assignee None
Priority Medium

md5: 24b93375887ee6f6f34bb2f0a444e4da

Issue Description:

Example:

```swift
import Foundation

class B: Codable {
  var value: Double?
}

class A: Codable {
  init(_ value: Double) {
    self.a = value
    self.b = B()
    self.b?.value = value
  }
  var a: Double?
  var b: B?
}

extension Encodable {
  var dictionary: [String: Any]? {
    let encoder = JSONEncoder()
    guard let data = try? encoder.encode(self) else { return nil }
    return (try? JSONSerialization.jsonObject(with: data, options: .allowFragments)).flatMap { $0 as? [String: Any] }
  }
}

let a = A(20.00)
print(a.dictionary)
```

When encoding the value, JSONEncoder turns a Double into an Int64 when it has no fractional part.

As seen in the screenshot below, the encoder uses the double type when the value has a fractional part:

![](Screen Shot 2020-06-10 at 19.22.46.png)

But when encoding a value that doesn't have a fractional part, it loses the type information and encodes the value as an Int64:

![](Screen Shot 2020-06-10 at 19.21.24.png)

The question is whether this is intended behavior (an optimization) or indeed a problem. If it's intended, that's fine, because in the end it represents the same value, but I'm just a bit curious to know why the encoder does that 🙂
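The behavior can be reproduced without the wrapper classes. The sketch below (a minimal reproduction using a hypothetical `Payload` type, not code from this report) encodes a whole-number Double and inspects what JSONSerialization hands back on the round trip:

```swift
import Foundation

// Hypothetical minimal type for the reproduction.
struct Payload: Codable {
    var value: Double
}

let data = try JSONEncoder().encode(Payload(value: 20.00))
// The serialized text carries no fractional part, e.g. {"value":20}.
print(String(data: data, encoding: .utf8)!)

// Round-tripping through JSONSerialization yields an NSNumber; for a
// digit string with no decimal point the parser may back it with an
// integer type rather than a Double.
let object = try JSONSerialization.jsonObject(with: data) as! [String: Any]
print(type(of: object["value"]!))
```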

@LucianoPAlmeida (Contributor, Author)

cc @spevans

@spevans (Collaborator)

spevans commented Jun 12, 2020

It is due to how JSONSerialization works internally.

For A(20.01) the JSON produced looks like:

```json
{"a":20.010000000000002,"b":{"value":20.010000000000002}}
```

As you can see, converting a Double to a String is not precise, but that's not important here.

Encoding A(20.00) to JSON gives:

```json
{"a":20,"b":{"value":20}}
```

It's important to note that JSON doesn't differentiate between floating-point numbers and integers; all numbers are just represented as a string of digits that may optionally have a decimal point and an exponent. It is up to the decoder to decide how to store them.

So given "20.010000000000002", the only type that can hold it is a Double, so it is stored as such. But given "20", an Int64 is a better choice, since it can store a large range of integers. The actual value is stored as an NSNumber, since it lives inside an NSDictionary, which only holds reference types.
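This parsing choice can be observed directly. The sketch below is based on the behavior described here, not on a documented contract; it parses both digit strings and reads the resulting NSNumbers:

```swift
import Foundation

// A digit string with no decimal point can be backed by an integer,
// while one with a fractional part must be stored as a Double.
let whole = try JSONSerialization.jsonObject(with: Data("[20]".utf8)) as! [Any]
let fractional = try JSONSerialization.jsonObject(with: Data("[20.01]".utf8)) as! [Any]

let wholeNumber = whole[0] as! NSNumber
let fractionalNumber = fractional[0] as! NSNumber

// NSNumber bridges to whichever Swift type you ask for:
print(wholeNumber.int64Value)       // 20
print(fractionalNumber.doubleValue) // 20.01
```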

The other thing to note is that JSONSerialization doesn't know about JSONDecoder, and that ultimately the value will be converted to a Double when creating the A and B instances. It only generates structures consisting of Dictionaries, Arrays, Strings and Numbers.
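That conversion back to Double is why the integer representation is harmless for round-tripping. A quick check (a sketch with a hypothetical `Wrapper` type mirroring the `a` property from the report):

```swift
import Foundation

// The reported JSON stores 20.00 as the digit string "20", but
// JSONDecoder asks the container for a Double, so decoding still
// produces a Double value.
struct Wrapper: Codable {
    var a: Double
}

let decoded = try JSONDecoder().decode(Wrapper.self, from: Data(#"{"a":20}"#.utf8))
print(decoded.a)            // 20.0
print(type(of: decoded.a))  // Double
```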

There is some further discussion about the parsing of JSON numbers in the swift-corelibs-foundation version of JSONSerialization in #1655 and #1657.

@LucianoPAlmeida (Contributor, Author)

Interesting, that makes sense. Thank you for the detailed explanation, @spevans 🙂

@swift-ci swift-ci transferred this issue from apple/swift-issues Apr 25, 2022
@shahmishal shahmishal transferred this issue from apple/swift May 5, 2022
This issue was closed.