
[SR-6421] Double precision are wrong #48971

Closed
swift-ci opened this issue Nov 17, 2017 · 10 comments
Labels
bug A deviation from expected or documented behavior. Also: expected but undesirable behavior.

Comments

@swift-ci
Collaborator

Previous ID SR-6421
Radar None
Original Reporter vincentsaluzo (JIRA User)
Type Bug
Status Resolved
Resolution Invalid
Environment

OS: macOS 10.13.1

Swift version: 4.0.2

Additional Detail from JIRA
Votes 1
Component/s
Labels Bug
Assignee None
Priority Medium

md5: 06d539c30e2de101f0ad6b9414a8ddd1

Issue Description:

I encountered a problem with Double. I found it when using ObjectMapper to decode a JSON file into a class, and discovered that some values retrieved from the JSON lost their precision.

I investigated more deeply, especially in the Swift REPL, to determine whether the problem came from ObjectMapper or from Swift itself.

(I tried the same steps in a Swift Playground, but the problem doesn't occur there.)

Here is an example of the problem:

Welcome to Apple Swift version 4.0.2 (swiftlang-900.0.69.1 clang-900.0.38). Type :help for assistance.
1> let c = 7.78
c: Double = 7.7800000000000002
2> c * 7.78
$R0: Double = 60.528400000000005
3> c * 7.78 * 1000000
$R1: Double = 60528400.000000007
4> c * 7.78 * 1000000000000
$R2: Double = 60528400000000.008
5> c * 7.78 * 1000000000000000
$R3: Double = 60528400000000008

As you can see, declaring a simple Double already loses a tiny amount of precision. But if you perform operations that multiply the result by powers of 10, the loss of precision has more impact.

I've worked around the problem by using Decimal instead of Double, but I don't think this is normal behavior.

@belkadan
Contributor

This is how floating-point numbers work: some values can't be represented exactly. You can see this by printing even more decimal places than you do at first:

  1> let c = 7.78
c: Double = 7.7800000000000002
  2> import Foundation
  3> NSLog("%.64lf", c) 
2017-11-17 08:54:19.124766-0800 repl_swift[11135:1224777] 7.7800000000000002486899575160350650548934936523437500000000000000
  4> NSLog("%.64lf", c * 7.78)
2017-11-17 08:56:07.516474-0800 repl_swift[11135:1224777] 60.5284000000000048657966544851660728454589843750000000000000000000
  5> NSLog("%.64lf", c * 7.78 * 1000_000)
2017-11-17 08:56:17.142104-0800 repl_swift[11135:1224777] 60528400.0000000074505805969238281250000000000000000000000000000000000000

Note that to represent a higher number, there are fewer bits left to store the fractional part of the value. It looks like it's changing by a lot, but that's actually as close as it can get. You can use the ulp property on any Double value to tell you the smallest interval that can be represented for a number of that magnitude.
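For illustration (not part of the original comment), a minimal sketch of how `ulp` exposes that spacing; the approximate values in the comments assume a standard 64-bit `Double`:
```
let c = 7.78
print(c.ulp)                        // ~8.9e-16: spacing between adjacent Doubles near 7.78
print((c * 7.78 * 1_000_000).ulp)   // ~7.5e-9: spacing near 6.05e7
print((c * 7.78 * 1e12).ulp)        // 0.0078125: spacing near 6.05e13
```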

@belkadan
Contributor

If you do need perfect decimal behavior, using Decimal is a perfectly good answer.
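For reference, a minimal sketch (not from the original comment) of the Decimal route, with the values constructed from strings so they never pass through a binary floating-point literal:
```
import Foundation

// Decimal(string:) keeps the decimal digits exactly, so 7.78 really is 7.78.
let c = Decimal(string: "7.78")!
let product = c * Decimal(string: "7.78")!
print(product)                       // 60.5284
print(product * Decimal(1_000_000))  // 60528400
```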

@swift-ci
Collaborator Author

Comment by Christophe Braud (JIRA)

@belkadan I use Swift 4 and Codable, and I have the same issue as Vincent when I receive some JSON. The data sent to me contains many numbers with just 2 decimal places, like 24.35, 53.47 or 123.07. The 8 bytes of a Double are even more than I really need; I could use Float, which takes only 4 bytes. But that doesn't change anything, because Float has the same problem as Double. Yes, we can use Decimal instead of Double or Float, but Decimal requires 20 bytes in memory (Float => MemoryLayout<Float>.size => 4, Double => MemoryLayout<Double>.size => 8, Decimal => MemoryLayout<Decimal>.size => 20).
So if I understand correctly, I have the choice between using 5 times (compared to Float) or 2.5 times (compared to Double) more memory for my data with Decimal, or getting inaccurate data with Double or Float, knowing that calculations with Double and Float will then accumulate errors because of the inaccurate data extracted from the JSON.
It's a difficult choice. It's a bit like choosing between plague and cholera. 😛
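For reference, the sizes quoted above can be checked directly; a minimal sketch (results as reported here, on a 64-bit Apple platform):
```
import Foundation

print(MemoryLayout<Float>.size)    // 4
print(MemoryLayout<Double>.size)   // 8
print(MemoryLayout<Decimal>.size)  // 20
```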

@belkadan
Contributor

That's how floating-point numbers work in all languages, including JavaScript, C, and Python.

Python 2.7.10 (default, Nov 10 2017, 18:24:51) 
[GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.15.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 7.78
7.78
>>> c = 7.78
>>> c * 7.78
60.528400000000005
>>> c * 7.78 * 1000000
60528400.00000001
>>> "%.64lf" % (c * 7.78 * 1000000)
'60528400.0000000074505805969238281250000000000000000000000000000000000000'

It's not something Swift can do differently. Binary just can't perfectly represent all decimal numbers, just like decimal can't perfectly represent all rational numbers.

@swift-ci
Collaborator Author

Comment by Christophe Braud (JIRA)

@belkadan Swift is a new language. I had hoped it would do better than reproduce the same limitations as other languages. I had hoped for better. I'm disappointed. 🙁

@swift-ci
Collaborator Author

Comment by Michael D. Morris (JIRA)

It's not a language limitation though, it's a physical limitation. All data in a standard digital computer is stored in binary, and fixed-width binary fractions can't exactly represent all decimal fractions. That's why we have things like the Decimal type and BCD.

@swift-ci
Collaborator Author

Comment by Christophe Braud (JIRA)

mdmorris (JIRA User) I know about the physical limitation, but all the numbers I get from the JSON have just 2 decimal places, not 10 or more. They never exceed the capacity of a Double. For example, when I have 5.69 in the JSON, the Double that I get is 5.6900000000000004.

@swift-ci
Collaborator Author

Comment by Michael D. Morris (JIRA)

It doesn't matter how many bits you have in the mantissa; 5.69 can't be exactly expressed in binary. You're probably confused because printing a Double value usually rounds off the error. If you debugPrint(5.69), it will print 5.6900000000000004, because that's actually how 5.69 is stored in 64-bit binary floating point.
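For illustration (not part of the original comment), a minimal sketch of how to see the full stored value regardless of how printing rounds it, using Foundation's String(format:):
```
import Foundation

// The nearest representable Double to 5.69 carries a tiny binary rounding error.
print(String(format: "%.20f", 5.69))   // 5.69000000000000039080
print(5.69 == 5.6900000000000004)      // true: both literals map to the same stored Double
```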

@swift-ci
Collaborator Author

Comment by Vincent Saluzzo (JIRA)

I totally understand your point: the logic used in floating point can't be totally precise. If the problem were only in the debug print, that could be OK. But, for example, if we use JSONSerialization with a Double, the lost precision is exported into the JSON.

In our case we used Double in a JSON object that is used for basket computations. In the Android application, using Double in the same object causes no problems. But in the Swift application, especially since moving to Swift 4 (any idea if something changed in that version?), the lost precision was exported into the JSON by JSONSerialization.

Here is an example from the REPL:

10> c
$R3: Double = 66.483000000000004
11> let data = JSONSerialization.data(withJSONObject: ["c": c])
data: Data = {
_backing = {
_bytes = (_rawValue = 0x0000000101808a00)
_length = 24
_capacity = 0
_needToZero = false
_deallocator = nil
_backing = immutable {
immutable = 4320168448 bytes
}
_offset = 0
}
_sliceRange = 0..<24
}
12> String(data: data, encoding: .utf8)
$R4: String? = "{\"c\":66.483000000000004}"

Of course we could use Decimal, and that is what we do in our application now, but looking at the Swift documentation on Apple's website, I've seen no mention of Decimal. And for developers coming from another language, Double should work the way it does in those other languages.

So maybe the problem isn't Double itself, but rather JSONSerialization when a Double is used. At the very least, a mention of this in the Swift documentation would be appreciated 🙂


In my example, the variable c comes from this computation:

1> let a = 7.47
a: Double = 7.4699999999999998
2> let b = 8.9
b: Double = 8.9000000000000004
3> b
$R0: Double = 8.9000000000000004
4> let c = a * b
c: Double = 66.483000000000004

@tbkka
Contributor

tbkka commented Feb 3, 2021

There could be two very different things going on here. It would help if you could explain what results you expected to see so we can better understand your concern.

You are using the Swift REPL for your examples here. The Swift REPL does print out more digits than necessary, which can be confusing. The Swift REPL should be using Swift's `Double.debugDescription` property to print these values. Starting with Swift 4.2, `description` and `debugDescription` print values that most people would find easier to understand:
```
4> let c = a * b
c: Double = 66.483000000000004
5> c.description
$R2: String = "66.483"
6> c.debugDescription
$R3: String = "66.483"
```
(The `description` and `debugDescription` values shown here are just as accurate as the other values; they just print the minimum number of necessary digits.)

JSONSerialization also emits more digits than necessary. The important point for JSONSerialization is that if you decode the JSON document, you should get the exact same value back again:
```
3> let c = 66.483
4> let data = JSONSerialization.data(withJSONObject: ["c": c])
5> let q = JSONSerialization.jsonObject(with: data)
6> ((q as! Dictionary<AnyHashable,Any>)["c"] as! Double) == c
$R5: Bool = true
```
As long as this is true, the exact text used in the JSON document is not particularly important.
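Since the original report also mentioned Codable, here is a minimal sketch (not from the thread; the Payload type is hypothetical) of the same round-trip check through JSONEncoder/JSONDecoder:
```
import Foundation

struct Payload: Codable {   // hypothetical type, for illustration only
    let c: Double
}

let original = Payload(c: 66.483)
let data = try! JSONEncoder().encode(original)
print(String(data: data, encoding: .utf8)!)    // the text may carry extra digits, depending on the Foundation version
let decoded = try! JSONDecoder().decode(Payload.self, from: data)
print(decoded.c == original.c)                 // true: the Double round-trips exactly
```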

swift-ci transferred this issue from apple/swift-issues on Apr 25, 2022
This issue was closed.