
[SR-7054] JSONDecoder Decimal precision error #4255

Open
swift-ci opened this issue Feb 22, 2018 · 46 comments

@swift-ci
Contributor

Previous ID: SR-7054
Radar: rdar://problem/33491336
Original Reporter: tiborbodecs (JIRA User)
Type: Bug
Environment

Xcode 9.2 (9C40b)
Apple Swift version 4.0.3 (swiftlang-900.0.74.1 clang-900.0.39.2)

Same output with 4.1 swift-DEVELOPMENT-SNAPSHOT-2018-02-20-a-ubuntu16.04.

Additional Detail from JIRA
Votes: 38
Component/s: Foundation
Labels: Bug
Assignee: bendjones (JIRA)
Priority: Medium

md5: 4d586dd9920250512d8b01ea1b3597c9

Issue Description:

Decoding decimals from a JSON structure returns incorrect numbers.

See the following example:

#!/usr/bin/env swift

import Foundation

struct DoubleItem: Codable {
    var name: String
    var price: Double
}

struct DecimalItem: Codable {
    var name: String
    var price: Decimal
}

let jsonString = """
{ "name": "Gum ball", "price": 46.984765 }
"""
let jsonData = jsonString.data(using: .utf8)!

let decoder = JSONDecoder()
do {
    let doubleItem = try decoder.decode(DoubleItem.self, from: jsonData)
    print(doubleItem)

    let decimalItem = try decoder.decode(DecimalItem.self, from: jsonData)
    print(decimalItem)
} catch {
    print(error)
}

Output:

DoubleItem(name: "Gum ball", price: 46.984765000000003)

DecimalItem(name: "Gum ball", price: 46.98476500000001024)

Expected result for the DecimalItem should be: 46.984765

I know that floating-point values cannot always be represented precisely, but decimals should keep their original values, am I wrong? Is this a Swift Foundation bug or intended behavior? Note that encoding the same value with JSONEncoder produces the correct value in the JSON.

My actual problem is that the JSONEncoder and JSONDecoder classes are working inconsistently.

If the encoder outputs the value as a number, I'd expect the decoder to decode the exact same value from it when I use the Decimal type in my model. My other idea is that the encoder could store the value as a string inside the JSON instead of the current representation; this could be supported with a decimal coding strategy.

let encoder = JSONEncoder()
encoder.decimalEncodingStrategy = .string  // outputs as string: "46.984765"
encoder.decimalEncodingStrategy = .precise // outputs as number: 46.984765
encoder.decimalEncodingStrategy = .lossy   // current output like: 46.984765


let decoder = JSONDecoder()
decoder.decimalDecodingStrategy = .string  // decoded value from json string: 46.984765
decoder.decimalDecodingStrategy = .precise // decoded value from json number: 46.984765
decoder.decimalDecodingStrategy = .lossy   // current value from number: 46.98476500000001024

Please share your thoughts about this idea.

@belkadan

cc @itaiferber

@itaiferber
Contributor

This is a known issue that's been discussed at length in the past — I'll try to dig up some prior discussion about this. The core of the matter is that the implementations of JSONEncoder and JSONDecoder sit atop JSONSerialization, which provides the actual serialization to and from JSON data.

At the JSONSerialization layer, there's no type hint available to know whether the user expects to decode the value as a Double or an Int or a Float or a Decimal; a representation needs to be chosen for the value at deserialization time. JSONDecoder can then get that value and convert it to the requested result, e.g. Decimal. The issue is that a string of digits like 46.984765 has differing representations as a Double or a Decimal. Since the value can't be represented precisely as a Double, stuffing the value into a Double gets you 46.98476500000000300 in the full expanded representation (which is what you see as the DoubleItem above). On the other hand, Decimal can represent that string of digits precisely, but when you go to use the value (say, by getting the decimal's doubleValue), the calculated result can differ from what you expect, due to the propagation of many floating-point precision losses. (The specific result you get, BTW, will further differ depending on whether the Decimal is created from a string or from an existing Double value.)

You can see this with the following ObjC code:

#import <Foundation/Foundation.h>

NS_FORMAT_FUNCTION(1, 2)
static void print(NSString *format, ...) {
    va_list arguments;
    va_start(arguments, format);
    puts([[NSString alloc] initWithFormat:format arguments:arguments].UTF8String);
    va_end(arguments);
}

int main(int argc, char *argv[]) {
    @autoreleasepool {
        double d = 46.984765;
        print(@"%.*f\n", DBL_DECIMAL_DIG, d); // => 46.98476500000000300
        
        NSDecimalNumber *number = [NSDecimalNumber decimalNumberWithString:@"46.984765"];
        print(@"%@", number); // => 46.984765
        print(@"%.*f\n", DBL_DECIMAL_DIG, number.doubleValue); // => 46.98476499999999589
        
        number = [NSDecimalNumber numberWithDouble:d];
        print(@"%@", number); // => 46.98476500000001024
        print(@"%.*f\n", DBL_DECIMAL_DIG, number.doubleValue); // => 46.98476500000002432
    }
}

All of these results are "correct" for some definition of "correct". If JSONSerialization always produced Decimal values, folks grabbing the doubleValue from the results in order to perform calculations might be surprised by the inexactness of the result; in the same vein, if we prefer to produce accurate Double values, the original string of digits is lost and we can only create a Decimal from the Double, further losing precision.

At the end of the day, we preferred to always accurately round-trip Double values, i.e. if you encode a Double with JSONSerialization and decode it, you should always get the same exact (bitwise-equal) value back. This means that for values which lose precision when converted to a Double, the decimal value you see will be slightly different.
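
A minimal Swift sketch of that tradeoff, using the value from this report (a sketch only; `Wrapper` and `DecimalWrapper` are illustrative names, and the exact trailing digits may vary by platform and SDK version):

import Foundation

struct Wrapper: Codable { let price: Double }
struct DecimalWrapper: Codable { let price: Decimal }

// Encoding a Double and decoding it back is bitwise-exact...
let original = Wrapper(price: 46.984765)
let data = try! JSONEncoder().encode(original)
let roundTripped = try! JSONDecoder().decode(Wrapper.self, from: data)
print(roundTripped.price.bitPattern == original.price.bitPattern) // true

// ...but decoding the same payload as Decimal goes through the Double
// representation first, so the expanded digits show up:
let asDecimal = try! JSONDecoder().decode(DecimalWrapper.self, from: data)
print(asDecimal.price) // e.g. 46.98476500000001024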

@itaiferber
Contributor

There are a few ways we could address this, potentially requiring changes in JSONSerialization itself:

  1. Add a flag for JSONSerialization which requests always producing a Decimal on its end; JSONDecoder can then make the decision about how to process and return that

  2. Add a similar flag which requests returning all numbers as strings, to really defer the problem to a different layer

  3. Move JSONEncoder and JSONDecoder off of JSONSerialization and onto a custom serializer

Each of these approaches has its tradeoffs that we'd need to weigh if we want to address this.

@swift-ci
Contributor Author

Comment by Tibor Bödecs (JIRA)

@itaiferber thank you for the deep explanation; I'm aware of the root cause of the issue.

If I take the following example:

let x = Decimal(46.984765)

print(x) // 46.98476500000001024

let y = Decimal(string: "46.984765")!

print(y) // 46.984765

Both values are "correct", but from my point of view, if I encode, decode, then re-encode the second decimal number to JSON, the final value is going to be an "incorrect" decimal value (46.98476500000001024). That's clearly not the expected representation, because of the underlying JSONSerialization problem. I would love to see a better solution, because the current behaviour of the coder classes can be misleading if you are working with decimals. It would also be nice to save decimals to JSON as numbers instead of strings, because currently a string is the only way to hold the original value.
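
A short Swift illustration of that round trip, assuming the behavior described in this thread (the printed values come from the comments above):

import Foundation

struct Item: Codable { var price: Decimal }

// Start from the precise decimal value:
let item = Item(price: Decimal(string: "46.984765")!)

// Encoding produces the expected JSON number...
let data = try! JSONEncoder().encode(item)
print(String(data: data, encoding: .utf8)!) // {"price":46.984765}

// ...but decoding goes through Double inside JSONSerialization:
let decoded = try! JSONDecoder().decode(Item.self, from: data)
print(decoded.price) // 46.98476500000001024, not 46.984765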

@itaiferber
Contributor

tiborbodecs (JIRA User) I agree on all points — this should be something that "just works" and it's unfortunate that we leak the implementation details here in this way. This is one of the many improvements to our implementations we'd like to make.

@swift-ci
Contributor Author

Comment by Sergej Jaskiewicz (JIRA)

@itaiferber I see the following in the current _JSONDecoder implementation:

fileprivate func unbox(_ value: Any, as type: Decimal.Type) throws -> Decimal? {
    guard !(value is NSNull) else { return nil }
    
    // Attempt to bridge from NSDecimalNumber.
    if let decimal = value as? Decimal {
        return decimal
    } else {
        let doubleValue = try self.unbox(value, as: Double.self)!
        return Decimal(doubleValue)
    }
}

If we change it to

fileprivate func unbox(_ value: Any, as type: Decimal.Type) throws -> Decimal? {
    guard !(value is NSNull) else { return nil }
    
    // Attempt to bridge from NSDecimalNumber.
    if let decimal = value as? Decimal {
        return decimal
    } else if let number = value as? NSNumber,
        number !== kCFBooleanTrue,
        number !== kCFBooleanFalse {

        return number.decimalValue
    } else {
        let doubleValue = try self.unbox(value, as: Double.self)!
        return Decimal(doubleValue)
    }
}

then everything is fine — the JSON data is decoded correctly.

For example, this test from the test suite:

lazy var decimalValues: [Decimal] = [
    Decimal.leastFiniteMagnitude,
    Decimal.greatestFiniteMagnitude,
    Decimal.leastNormalMagnitude,
    Decimal.leastNonzeroMagnitude,
    Decimal(),
    Decimal(string: "67.52")!, // This was added

    // Decimal.pi does not round-trip at the moment.
    // See rdar://problem/33165355
    // Decimal.pi,
]

func test_Decimal_JSON() {
    for decimal in decimalValues {
        // Decimal encodes as a number in JSON and cannot be encoded at the top level.
        expectRoundTripEqualityThroughJSON(for: TopLevelWrapper(decimal))
    }
}

was failing after I had added `Decimal(string: "67.52")!` to the `decimalValues` array. However, after the aforementioned fix, it passes and no other test fails.

Or am I missing something?

I would be happy to create a PR and write some tests for it.

@swift-ci
Contributor Author

Comment by Tibor Bödecs (JIRA)

broadway_lamb (JIRA User) sounds good, it'd be great to see a possible fix for this issue! 😉

@itaiferber
Contributor

broadway_lamb (JIRA User) I meant to respond to this earlier — this fix might work for some numbers, but not all:

import Foundation

let decimal = Decimal(string: "46.984765000000003")!
print(decimal) // => 46.984765000000003

let number = NSNumber(value: 46.984765000000003)
print(number.stringValue) // => 46.984765
print(String(format: "%0.17f", number.doubleValue)) // => 46.98476500000000300

print(number.decimalValue == decimal) // => false

number.decimalValue gets the (imprecise) string value of the number, then creates a Decimal from that. What you're seeing is essentially Decimal(string: "46.984765000000003") != Decimal(string: "46.984765").

I think this gets us closer to the right answer, but is inefficient and still doesn't provide the correctness we need, unfortunately.

@swift-ci
Contributor Author

Comment by Petr Dvorak (JIRA)

Hello @itaiferber, I was briefly looking at the source code (I admit that building Swift is more than I can currently chew - you guys are doing a great job!) and I was wondering: why does the [Decimal type use the Double conversion for unboxing](https://github.com/apple/swift/blob/master/stdlib/public/SDK/Foundation/JSONEncoder.swift#L2417) and not String? Is this for performance reasons (and if so, is the performance hit really unacceptable in typical scenarios)?

Solving this issue would help us a lot. We have APIs that send correct double values in JSON and all our other projects (usually built in Java, leveraging the veteran JSON mapper Jackson) work correctly (they map the JSON value to BigDecimal via String representation). We only encounter issues with Swift...

@itaiferber
Contributor

petrdvorak (JIRA User) I think that my original comment on this explains things best — specifically, that there is unfortunately no correct answer here, because Double and Decimal are not interchangeable. Due to the numeric representation of Decimal, even if we always preferred it over Double, it would produce different results in mathematical operations than Double would, because of differences in magnitude and precision. This means that even if you got the full range of values in Decimal, it could potentially not be as useful mathematically as Double. This is a hairy problem that doesn't have a clear solution because everyone's use case is different.

In a perfect world, my preferred solution would be to use a true BigDecimal-like type if we had it, because it would preserve the representation almost perfectly in all cases. We've got some thoughts internally on doing things in this area (or at least offering higher-precision/magnitude representation types) to alleviate this problem, but for the moment, we're constrained by backwards compatibility on a lot of things. The fact that we don't yet have a BigDecimal-like type in either Obj-C or Swift hampers a more complete solution.

/cc @stephentyrone for some visibility on this in light of some of our previous discussions.

@swift-ci
Contributor Author

Comment by Petr Dvorak (JIRA)

@itaiferber Thank you for the comment - I completely forgot there is no BigDecimal-like type in Obj-C/Swift. I got a bit rusty since the last time I did some iOS coding... 🙁

To provide yet another hint of a possible direction... Maybe having some "hooks" - along the lines of the Jackson Streaming API - would allow for more precise (de)serialization?

(EDIT: I am not suggesting making a streaming API - instead, I would like to have the callback when reading attributes so that I have some control over the writing / parsing...)

@itaiferber
Contributor

petrdvorak (JIRA User) That's certainly another way of solving the issue that we've considered — but it quickly hits the fact that parsing is done up-front by JSONSerialization, which doesn't itself have hooks for this sort of thing right now. This precision deficiency is a leaky implementation detail that we want to resolve, which can be done either by having JSONSerialization add hooks for this sort of thing (and JSONEncoder/JSONDecoder allow you to use those hooks), or by using a different serializer (also something that's been considered). At the moment, we don't have a great answer, but we have given serious thought to moving off of JSONSerialization and onto an internal serializer, which would give us flexibility and allow us to parse with the developer's intent known to us.

@swift-ci
Contributor Author

Comment by Eddie LukeAtmey (JIRA)

Hi @itaiferber, I just saw this pull request that has been merged into `apple:master`. I'd like to know if the pull request has fixed the issue. If so, when will the fix be distributed?

@itaiferber
Contributor

eddiemarvin (JIRA User) I believe that change should make it into the Swift 5 release. However, please keep in mind that (a) this change is in swift-corelibs-foundation, and thus will not affect Darwin platforms, and (b) even that fix will not resolve the issue; although it improves parsing of values on non-Darwin platforms, the base issue remains: parsing numbers as Decimal leads to precision loss on conversion to Double, and parsing numbers as Double leads to precision loss on conversion to Decimal (or up-front on parsing). As long as JSONDecoder relies on JSONSerialization making some decision up-front about how to parse numbers, some use cases will still see imprecise values.
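
Both directions of that loss can be seen directly in Swift (the printed values below are taken from the examples earlier in this thread):

import Foundation

// Parsing as Double up front loses the decimal digits on conversion:
let viaDouble = Decimal(46.984765)
print(viaDouble) // 46.98476500000001024

// Parsing as Decimal up front keeps the digits...
let viaString = Decimal(string: "46.984765")!
print(viaString) // 46.984765

// ...but converting that Decimal back to Double is itself imprecise:
let backToDouble = NSDecimalNumber(decimal: viaString).doubleValue
print(String(format: "%0.17f", backToDouble)) // 46.98476499999999589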

@swift-ci
Contributor Author

Comment by Charles Maria Tor (JIRA)

Anyone adversely affected by this issue like I was (I work for a bank, precision is very important) can use this custom JSONDecoder for the time being. Attached are tests written with SwiftCheck that prove the issue exists in JSONDecoder and does not exist in my replacement Decoder.

@swift-ci
Contributor Author

Comment by Charles Maria Tor (JIRA)

@itaiferber is anything being done to replace JSONSerialization with a non-lossy parser or is this issue going to be left indefinitely because it requires huge changes to legacy code?

@tkrajacic

I am a bit puzzled by this as well. It is clearly a bug that using these types basically corrupts your data, and no one really seems to care. I would have imagined that this is something that requires an urgent fix in the next possible dot-release, even.

@itaiferber
Contributor

@tkrajacic I don't think it's fair to characterize this as nobody caring — I'd love to get this resolved, but replacing the implementation of JSONEncoder/JSONDecoder on top of a new JSON serialization requires a significant engineering investment that we have not had the bandwidth for among other high-priority tasks. Contributions here are welcomed and encouraged, but so far, no one has had the opportunity to do this. (Believe me, I care a lot about this!)

chazza (JIRA User) We have no intention of leaving this indefinitely; given bandwidth (or contributions), we should absolutely work to fix this in a generalized way. Your JSON decoder is an excellent workaround for this in the meantime.

@tkrajacic

@itaiferber You are absolutely right, that was unfair. I humbly and truly apologize. You are giving so much helpful feedback on the forums as well and I would say that you (and basically most other people concerned with Swift) are incredibly helpful and supportive. I spoke out of frustration.

I am writing an application that handles monetary amounts, with quite a large data model, and even "unit count"s are decimals of course. Double and Float should not even exist in programming languages anymore, because they are broken for most anything that doesn't allow for "meh, if it is approximately this value, that's fine".

Now I have to write EVERY SINGLE Codable conformance myself because the encoders/decoders that Swift provides are broken. That's where my frustration comes from. And simply encoding/decoding Decimals as String would solve all of this. (Well, the other thing is that the compiler is stupid enough to treat the literals used to initialize Decimals as Doubles, so it never correctly initializes a Decimal value, but that's another story… and another bug.)
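
For anyone unfamiliar with that literal pitfall: Decimal is ExpressibleByFloatLiteral, and float literals are Doubles, so a fractional literal is routed through the lossy Double initializer. A quick sketch (the imprecise value matches the Decimal(46.984765) example earlier in this thread):

import Foundation

// The float literal becomes a Double first, so this is lossy:
let fromLiteral: Decimal = 46.984765
print(fromLiteral) // 46.98476500000001024

// The string initializer preserves the digits:
let fromString = Decimal(string: "46.984765")!
print(fromString) // 46.984765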

Now I do my best to help out. I write plenty of bug reports here and include sample projects with almost all of them, which takes a considerable amount of time distilling them from a big project. Sadly, I don't have the C++ expertise to fix bugs in the compiler and related projects.

I really do not understand, though, how this particular bug can be ranked as anything but top-priority, as it basically corrupts user data. This is a critical bug, in my humble opinion.

@swift-ci
Contributor Author

Comment by Tiago Janela (JIRA)

We also faced this issue on an iOS application we are working on which involves monetary amounts. We are currently using the referenced JSON Decoder with satisfactory results.

I'm currently inclined to say that, from our use case perspective, a number of hooks at the JSONDecoder level would help overcome this issue with an acceptable tradeoff. This is a pattern that works well, given my experience with other JSON Coders/Decoders such as Newtonsoft's JSON.NET or FasterXML's Jackson.

Happy to contribute in any way possible.

@swift-ci
Contributor Author

Comment by Boris Yurkevich (JIRA)

tiborbodecs (JIRA User) thank you for your ticket.

I have run your snippet in an Xcode 10.2.1 playground and what I got is different.

DoubleItem(name: "Gum ball", price: 46.984765)
DecimalItem(name: "Gum ball", price: 46.98476500000001024)

Looks like Double doesn't have this problem any more and we can use it. The problem now affects only Decimal.

Update:

I've been thinking a lot about this. I don't believe this is a platform issue, but rather an issue of us using the Swift platform in an incorrect way.

When dealing with financial problems, following this simple rule will keep your code accurate: use String for encoding and decoding.

I have not found any issues converting between String and Decimal. I have tried the "broken decimals" from chazza (JIRA User)'s workaround.

If your server API does not use String, encode with Double; do not attempt to encode with Decimal. You can still do math using Decimal. Then, when you're ready to send it to the server, convert it to Double. Apply `NSDecimalRound` when dealing with long numbers.

If the server gives you a numeric JSON value, it means they are already using a primitive type on the backend. The right solution to avoid data loss is to change the code on the server to use a String type for JSON models.
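
A minimal sketch of that String-based approach (the model and field names are illustrative, assuming the server sends { "price": "46.984765" }):

import Foundation

struct Item: Codable {
    var price: Decimal

    private enum CodingKeys: String, CodingKey { case price }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Decode the raw digits as a String, then parse losslessly:
        let raw = try container.decode(String.self, forKey: .price)
        guard let value = Decimal(string: raw) else {
            throw DecodingError.dataCorruptedError(
                forKey: .price, in: container,
                debugDescription: "Invalid decimal string: \(raw)")
        }
        price = value
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        // Encode back as a string to keep the exact digits:
        try container.encode("\(price)", forKey: .price)
    }
}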

@swift-ci
Contributor Author

Comment by Tiago Janela (JIRA)

boris.y (JIRA User) I disagree with the premise of "using the Swift platform in an incorrect way".
The behavior we are seeing is, by no means, expected. As a simple proof of concept, you can try a round-trip encoding/decoding of a problematic decimal. You'll see that the decoded value is different when compared to the value you encoded. This behavior is unexpected, and arguably defective, in the sense that this type of encoder/decoder should be "lossless."

I believe the correct way to handle this would be to provide functionality much like DateDecodingStrategy, only for numbers, such as a NumberDecodingStrategy. This functionality could provide a default strategy that keeps backward compatibility, and a way for those who need control to have it.
Sadly, it seems that the problem is rooted in the Decoder, so I'm not sure even this would provide enough control over how numbers are parsed.
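
A rough sketch of how such a strategy could look, mirroring JSONDecoder.DateDecodingStrategy (purely hypothetical; none of this exists in Foundation):

import Foundation

// Hypothetical API, for discussion only:
enum NumberDecodingStrategy {
    /// Today's behavior: numbers are parsed into Double up front.
    case double
    /// Parse the digits directly into Decimal, preserving "46.984765" exactly.
    case decimal
    /// Hand the raw digit string to a custom closure, deferring the choice.
    case custom((String) throws -> Decimal)
}

// Imagined usage (not real API):
// let decoder = JSONDecoder()
// decoder.numberDecodingStrategy = .decimal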

Happy to discuss further.

@swift-ci
Contributor Author

swift-ci commented Jun 1, 2019

Comment by Boris Yurkevich (JIRA)

tjanela (JIRA User) I am aware of the issue with Decimal encoding and decoding. Ideally it should not conform to the Codable protocol. This would prevent wrong assumptions. I am sure there’s a good reason to allow this.

@swift-ci
Contributor Author

swift-ci commented Jun 2, 2019

Comment by Charles Maria Tor (JIRA)

boris.y (JIRA User) Removing Codable support isn’t going to happen for obvious reasons, and I’ve already demonstrated that this problem can be solved with different methods of JSON parsing, so I don’t know why you’re advocating for the worst solution to this problem.

@swift-ci
Contributor Author

swift-ci commented Jun 4, 2019

Comment by Tiago Janela (JIRA)

And on that note, chazza (JIRA User), how can one go about integrating your suggested changes to the JSON parsing method into the codebase? What are the drawbacks of the current implementation of MonadicJSON from your point of view?

@swift-ci
Contributor Author

swift-ci commented Jun 6, 2019

Comment by Boris Yurkevich (JIRA)

I have been doing more testing, and I have to retract some of my previous statements. A plain, short-value Double also suffers from this behaviour. It seems like MonadicJSON is a necessary workaround.

import Foundation

// This is the correct number which we send
let package: Double = 7067.51
let input: [String: Any] = ["Key": package]
let data = try! JSONSerialization.data(withJSONObject: input, options: [])
print(String(data: data, encoding: .utf8)!)

Output:

{"Key":7067.5100000000002}

@itaiferber
Contributor

boris.y (JIRA User) What you're seeing for Double is not loss of precision, but potentially excess precision. Double values are produced with as much precision as necessary to absolutely guarantee safe round-tripping back to Double, a process which can output more digits than you might expect. The imprecise decoding you see with Decimal should not happen for you with Double.
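
One can verify the safe round-tripping claim by comparing bit patterns (a small sketch based on the snippet above):

import Foundation

let package: Double = 7067.51
let data = try! JSONSerialization.data(withJSONObject: ["Key": package])
print(String(data: data, encoding: .utf8)!) // {"Key":7067.5100000000002}

// Parsing those extra digits back yields exactly the same Double:
let parsed = try! JSONSerialization.jsonObject(with: data) as! [String: Double]
print(parsed["Key"]!.bitPattern == package.bitPattern) // true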

@swift-ci
Contributor Author

swift-ci commented Oct 1, 2019

Comment by Charles Maria Tor (JIRA)

tjanela (JIRA User) My suggested changes require a huge change to how numbers are currently treated by `JSONSerialization`. At the moment they're first initialised into an NSNumber, but we need an intermediary container that preserves the original characters until they need to be converted into their final numeric representation. Going from String to NSNumber to Decimal is where we get precision issues; going directly from String to Decimal is the solution.

This could theoretically be implemented without changing the API surface of JSONSerialization, but would require changes to NSNumber (to store the raw string) and would have huge implications for the entire Apple ecosystem by requiring conversion at extraction time. Memoization of the call would be needed but even then the performance implications can't be overlooked.
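
A toy sketch of that intermediary-container idea (hypothetical, not a real Foundation type): the parser would keep the original characters and convert only at extraction time:

import Foundation

// Hypothetical: what the parser would hold on to instead of eagerly
// building an NSNumber from the digits.
struct JSONNumberToken {
    /// The exact characters from the JSON document, e.g. "46.984765".
    let literal: String

    // Conversion happens on demand, straight from the digits:
    var decimalValue: Decimal? { Decimal(string: literal) }
    var doubleValue: Double? { Double(literal) }
}

let token = JSONNumberToken(literal: "46.984765")
print(token.decimalValue!) // 46.984765 (no trip through Double)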

@swift-ci
Contributor Author

swift-ci commented Oct 1, 2019

Comment by Tiago Janela (JIRA)

EDIT: Added the fact that the author suggested using a strategy approach to handle this
Thanks for your take on this, chazza (JIRA User). I trust your analysis and completely agree with what you state in terms of things that must not be overlooked, as well as the possible approaches to overcome this.
Somehow I can't wrap my head around having such a dangerous behavior lying around in the platform's JSON serialization infrastructure.
I think the most important piece of your last comment really is — without changing the API surface of JSONSerialization.
Wouldn't a pattern like JSONDecoder.NonConformingFloatDecodingStrategy work here, like the author of this thread suggested?

@tkrajacic

I also have a hard time grasping how such an obvious case of data corruption can be ignored for so long.

7067.5100000000002 is NOT excess precision. It is NOT "more correct" than the 7067.51 that was fed into the serialization… That attitude is mind-boggling.

@weissi
Member

weissi commented Oct 17, 2019

CC @millenomi

@swift-ci
Contributor Author

Comment by Tiago Janela (JIRA)

Hey bendjones (JIRA User), let me know if you want to debate possible approaches or need any help.
Thanks!

@spevans
Collaborator

spevans commented Feb 4, 2021

For Linux (swift-corelibs-foundation) there is a potential solution here: #2980

@swift-ci
Contributor Author

Comment by Adriano Gonçalves (JIRA)

Hello. I was wondering if there's any update on this?

We are facing the same issue on our current project and will try to fix it using chazza (JIRA User)'s implementation of a custom decoder, but would love to see this being fixed in the default JSONDecoder.

@swift-ci
Contributor Author

Comment by Jason Bobier (JIRA)

This is now biting me too. In particular, I'm converting JSON numbers to rational numbers via a library that I wrote. I was expecting to read them in as Decimals and convert to my rationals, easy... no problem. Except I can't read numbers in as Decimals accurately. sigh

28.35 becomes 28 49258120924365/140737488355328 instead of 28 7/20. This is for display to an end user and no one wants to see that.

Obviously, this is even more dangerous for monetary transactions.

@swift-ci
Contributor Author

Comment by Doug Sjoquist (JIRA)

It is not JSONDecoder's fault (or not just its fault). The conversion from double to decimal is flawed. This:

import Foundation

print(Decimal(46.984765))
print(Decimal(46.98476))
print(Decimal(46.9847))
print(Decimal(46.984))
print(Decimal(46.98))
print(Decimal(46.9))

produces this output:

46.98476500000001024
46.98476
46.9847
46.98400000000001024
46.98
46.9

@swift-ci
Contributor Author

Comment by Tiago Janela (JIRA)

Following your train of thought, dwsjoquist (JIRA User), this would make it the compiler's fault, since parsing and storing a Double literal is its responsibility. We write a double literal; the compiler parses and stores a double. The Decimal constructor that receives a double can't do much more than what it is doing right now. The information it gets is a double value that, due to the way doubles are encoded in memory, is not a precise number in certain cases. Therefore, that particular constructor has no issues and is behaving accordingly.

Furthermore, if you use Decimal(string: "46.984765") you'll notice that you get adequate results.

In fact, I believe the problem lies exactly there. Somewhere in JSONDecoder's implementation, the receiving type should be inspected. If it is a Decimal, then the constructor that receives a string should be used.

@swift-ci
Contributor Author

Comment by Jason Bobier (JIRA)

I do believe that it is JSONDecoder's issue. A number in JSON is an arbitrary-length signed decimal number. The decoder needs to be able to handle this without loss of accuracy, or, if it is unable to handle the given number (for example, if it has too many digits), throw an error. If requested, this arbitrary-length decimal number could be converted to a given output type (such as Double or Decimal) with loss of precision, if that is required to store it in that format. But that loss of precision needs to be explicitly asked for by the caller.

@davidpasztor

This is neither the compiler's nor JSONDecoder's fault (or at least not directly). The problem is caused by NSJSONSerialization parsing all floating-point numbers as Double, and by JSONDecoder using NSJSONSerialization under the hood. If JSONDecoder were more than a simple wrapper around the legacy NSJSONSerialization type, it could be written properly to fix the Decimal parsing problem by parsing floating-point values directly into Decimal when asked for (rather than using Double as the intermediate type). Or, as an alternative, the issue could be fixed at the root cause, in NSJSONSerialization itself.

@swift-ci
Contributor Author

Comment by Leonardo Savio Dabus (JIRA)

dwsjoquist (JIRA User)

If you need to preserve precision when initializing a Decimal, you need to use the string initializer; otherwise you would be using the double initializer, and that's the problem.

init?(string: String, locale: Locale)
init(_ value: Double)

What you need is:

print(Decimal(string: "46.984765")!)

@swift-ci swift-ci transferred this issue from apple/swift-issues Apr 25, 2022
@shahmishal shahmishal transferred this issue from apple/swift May 5, 2022
@guillaumealgis

guillaumealgis commented Jan 19, 2023

Unless I'm mistaken, the JSONDecoder issue has been resolved? (EDIT: Nope, see the following comments.)

Starting with commit 5ed74d3, the decoder no longer uses Decimal(doubleValue) (with doubleValue coming from JSONSerialization) but unwrapDecimal(), which handles the data correctly by keeping it represented as a string and passing it to Decimal(string:), as one would expect.

I'm not quite sure when this commit landed in the SDKs, but using Xcode 14.2 today, I don't get any loss of precision from the Gum Ball example:

DoubleItem(name: "Gum ball", price: 46.984765)
DecimalItem(name: "Gum ball", price: 46.984765)

Initializing Decimal from a Double literal is still problematic, but that's another issue (#3658).

@scoreyou

scoreyou commented Jan 19, 2023

Sadly, this issue is still present.

EDIT: my observation is based on the Gum Ball example, but with another number: { "name": "Gum ball", "price": 0.07 }
Result produced by Xcode 14.2 (14C18):

DoubleItem(name: "Gum ball", price: 0.07)
DecimalItem(name: "Gum ball", price: 0.07000000000000001)

@davdroman
Copy link

@guillaumealgis @scoreyou bear in mind this is corelibs-foundation, not Apple Foundation. It might've been fixed in this one but not Apple's. You could try running Gum Ball on Linux to verify.

Soon enough it could be fixed in both though, as Foundation is being rewritten in Swift, open sourced, and shared across all platforms.

@guillaumealgis

@scoreyou @davdroman thank you both for clarifying :)

Indeed, it doesn't seem to be fixed in Apple's Foundation yet, sadly :(

@kober32

kober32 commented Jan 26, 2023

@guillaumealgis

This seems to be fixed on iOS 15 and newer.

Deserializing 0.0070918203981 as Decimal:

on iOS 14, the result is 0.007091820398099999744
on iOS 15, the result is 0.0070918203981

Sadly, we target iOS 13+ in our applications and SDKs, so we need to wait a few years, and then we can drop the "string from the backend" approach 😬

@bencallis

bencallis commented Mar 30, 2023

@kober32 I've tested a few decimals on iOS 15 and 16 and some still seem to fail.

16684.24 results in 16684.240000000004096
