[SR-15132] Decimal.significand gives wrong sign for negative values #3361

Closed

xwu opened this issue Aug 29, 2021 · 1 comment

xwu commented Aug 29, 2021

Previous ID: SR-15132
Radar: None
Original Reporter: @xwu
Type: Bug
Status: Resolved
Resolution: Done
Additional Detail from JIRA
Votes: 0
Component/s: Foundation
Labels: Bug
Assignee: @xwu
Priority: Medium

md5: 9fe3b53ff06ecf828b17b88ad46a4d30

relates to:

  • SR-15228 Inconsistent behaviour of Decimal.significand on Darwin and Linux
  • SR-15134 Decimal.init(sign:exponent:significand:) gives incorrect values when the exponent overflows or underflows

Issue Description:

The significand of a floating-point value should never be negative. Decimal, despite mimicking FloatingPoint APIs, returns negative significands for negative values:

import Foundation

let x = -42 as Double
x.significand.sign // plus
let y = -42 as Decimal
y.significand.sign // minus (!)
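
For reference, the FloatingPoint protocol defines the decomposition so that the sign is carried separately and the significand is nonnegative, which is what lets Double(sign:exponent:significand:) round-trip exactly. A minimal sketch of that invariant follows; the Decimal lines illustrate how the leaked sign breaks the same reassembly (the exact value produced there depends on the Foundation version, given the related SR-15134):

import Foundation

let d = -42 as Double
// FloatingPoint factors the sign out: the significand is nonnegative,
// and the value reassembles exactly from its three parts.
d.significand                                                                 // 1.3125
Double(sign: d.sign, exponent: d.exponent, significand: d.significand) == d   // true

let z = -42 as Decimal
// With the sign leaking into Decimal's significand, rebuilding the
// magnitude from the parts cannot work; the exact result also depends
// on the initializer behavior tracked in SR-15134.
Decimal(sign: .plus, exponent: z.exponent, significand: z.significand)        // not 42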

This issue was discovered while trying to fix the implementation of Decimal.ulp, where it would have been nice to be able to use this API but for the incorrect sign.
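
As context for the ulp connection, here is a hypothetical sketch of a decomposition-based Decimal.ulp; sketchULP is an illustrative name, not Foundation API, and this is not the change referenced in #3068 below. The idea is that one unit in the last place of a finite Decimal's stored representation is 10 ** exponent, and building it via init(sign:exponent:significand:) is only trustworthy once the decomposition, including a nonnegative significand, is correct:

import Foundation

// Hypothetical sketch, assuming ulp is phrased in terms of the stored
// sign/exponent/significand decomposition.
extension Decimal {
    var sketchULP: Decimal {
        guard isFinite else { return .nan }
        // Build 1 * 10^exponent with an explicit .plus sign. Any such
        // magnitude-only reasoning on the decomposition presupposes that
        // significand never carries the value's sign.
        return Decimal(sign: .plus, exponent: self.exponent, significand: 1)
    }
}

(-42 as Decimal).sketchULP   // 1, since -42 is stored as 42 * 10^0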


xwu commented Aug 31, 2021

#3068

@swift-ci swift-ci transferred this issue from apple/swift-issues Apr 25, 2022
@shahmishal shahmishal transferred this issue from apple/swift May 5, 2022
This issue was closed.