The JSON standard does not require that numbers be decoded as binary floating point (Float or Double). Because the conversion between decimal and binary floating point is lossy, it would be very useful to be able to avoid it and thus preserve precision.
The following playground:
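The playground code is not included above; a minimal reconstruction along the lines the report describes might look like this (the literal `[0.1]` and the exact decode calls are assumptions, not the original playground):

```swift
import Foundation

// A JSON array containing the decimal number 0.1.
let json = "[0.1]".data(using: .utf8)!

// Decoding as Double succeeds (after a lossy binary conversion).
do {
    let doubles = try JSONDecoder().decode([Double].self, from: json)
    print(doubles)
} catch {
    print("Double case failed:", error)
}

// Decoding as Decimal fails, at the time of writing, with:
// "Expected to decode Dictionary<String, Any> but found a number instead."
do {
    let decimals = try JSONDecoder().decode([Decimal].self, from: json)
    print(decimals)
} catch {
    print("Decimal case failed:", error)
}
```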
produces a correctly decoded number (0.1) for the Double case, but for the Decimal case it throws an error: "Expected to decode Dictionary<String, Any> but found a number instead.", seemingly because Decimal only supports keyed encoding.
I understand that while Decimal's implementation of Encodable can detect a JSONEncoder, there's no mechanism to access the raw values from within the JSON structure. One alternative is to add such a mechanism. Another is to special-case Decimal: a requirement `public func decode(_ type: Decimal.Type) throws -> Decimal` could be added to SingleValueDecodingContainer, with corresponding methods on the other container protocols.
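A sketch of what the special-casing alternative could look like, shown as hypothetical additions to the container protocols (these declarations do not exist in the standard library; the names simply mirror the existing primitive overloads such as `decode(_: Double.Type)`):

```swift
import Foundation

// Hypothetical new requirements. A JSONDecoder conforming container could
// satisfy decode(_: Decimal.Type) by parsing the number's original text
// directly into a Decimal, never passing through a binary Double.
public protocol DecimalDecodingAdditions {
    func decode(_ type: Decimal.Type) throws -> Decimal
}

public protocol DecimalEncodingAdditions {
    mutating func encode(_ value: Decimal) throws
}
```

Mirroring the existing primitive overloads would let `Decimal` opt out of its keyed-archiving representation only when the coder can represent decimal numbers natively, as JSON can.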
My reasons for considering this a bug are:
- Decoding to and encoding from the customary decimal arithmetic type is supported by similar JSON handling in popular libraries for other languages.
- It is hard to work around without transforming the JSON data, which is either invasive or impossible.
- The way in which it fails is surprising, since the reason for the keyed archiving behavior for Decimal is not obvious in the context of JSON.
- The potentially user-actionable workaround of declaring a Codable type that encodes or decodes the decimal value as a Double works only by forcing the conversion to binary floating point, which defeats the purpose of using Decimal in the first place.
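The last point can be made concrete with a sketch of that Double-based workaround (the `Price` wrapper type is an illustrative invention, not an API from the report):

```swift
import Foundation

// Workaround: decode a Double, then convert it to Decimal.
// The value still round-trips through binary floating point, so any
// number without an exact binary representation (such as 0.1) may
// arrive with rounding artifacts before it ever reaches Decimal.
struct Price: Decodable {
    let amount: Decimal

    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        // The lossy step: the JSON text has already been parsed into
        // a binary Double by the time we see it.
        amount = Decimal(try container.decode(Double.self))
    }
}
```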