Swift / SR-8409

Decimal initialized with Double loses precision


    Details

    • Type: Bug
    • Status: Open
    • Priority: Medium
    • Resolution: Unresolved
    • Component/s: Foundation
    • Labels: None

      Description

      Decimal precision is a bit strange for the number 456.789. When you initialize a Double with 456.789 it holds the number, so it is representable, but if you convert it to a decimal with Decimal(456.789) the result is 456.7889999999998976, whereas Decimal(string: "456.789")! produces the correct decimal value 456.789.
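
      A minimal Swift snippet reproducing the comparison (the values in the comments are the ones quoted above):

      import Foundation

      let fromDouble = Decimal(456.789)             // reported result: 456.7889999999998976
      let fromString = Decimal(string: "456.789")!  // reported result: 456.789
      print(fromDouble)
      print(fromString)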

      I also tried Objective-C, and it works correctly there:

      NSDecimalNumber *number = [[NSDecimalNumber alloc] initWithDouble:456.789];
      NSLog(@"Number: %f", number.doubleValue);
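
      A rough Swift counterpart of that check, sketched with NSDecimalNumber's string initializer (the Objective-C code above goes through initWithDouble: instead); the printed value follows from the %f formatting:

      import Foundation

      // Build the decimal from its textual form, which the report describes as
      // the path that preserves 456.789, then read it back as a Double the way
      // the NSLog call above does.
      let number = NSDecimalNumber(string: "456.789")
      print(String(format: "Number: %f", number.doubleValue))   // Number: 456.789000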
      

              People

               Assignee: Unassigned
               Reporter: Tomáš Linhart (TomasLinhart)
               Votes: 0
               Watchers: 3
