
[SR-12397] Debugger computes different value than the code in generic context #54835

Open
swift-ci opened this issue Mar 23, 2020 · 24 comments
Labels: bug (A deviation from expected or documented behavior. Also: expected but undesirable behavior.)

@swift-ci (Collaborator)

Previous ID: SR-12397
Radar: rdar://problem/60832873
Original Reporter: cukier (JIRA User)
Type: Bug
Status: Reopened
Attachment: Download

Environment: Xcode Version 11.4 beta 3 (11N132i)

Additional Detail from JIRA
Votes: 0
Labels: Bug
Assignee: None
Priority: Medium

md5: 93052e86832ce609caffca74f1babb33

Issue Description:

If you have the following code:

func f<T>(_ foo: T) {
    let test = foo != nil
    print(test)
}
let x: Int? = nil
f(x)

and put a breakpoint on the line with `print(test)`, the value of `foo != nil` computed by the debugger will differ from the value of `test`:

(lldb) po foo != nil
false

(lldb) po test
true
@beccadax (Contributor)

@swift-ci create

@vedantk (Member) commented Apr 2, 2020

I think this is behaving as expected:

main.swift:2:17: warning: comparing non-optional value of type 'T' to 'nil' always returns true
let test = foo != nil
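
For reference, the same diagnostic fires for any concrete non-optional type, because the left-hand value is implicitly wrapped in an Optional before the comparison. A minimal standalone sketch (not from the original report):

```swift
// The compiler effectively evaluates `.some(n) != nil`, which can never be false.
let n: Int = 5
let alwaysTrue = n != nil   // warning: comparing non-optional value of type 'Int' to 'nil' always returns true
print(alwaysTrue)           // true
```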

@swift-ci (Collaborator, Author) commented Apr 2, 2020

Comment by cukr (JIRA)

The diagnostic says that the comparison always returns true.
The debugger returns false.

Is that expected? Do we live in a world where true is the same as false?


If you don't like code examples with diagnostics, here's another one, this time with no diagnostics:

func f<T>(_ foo: T) {
    let test = foo as Any? == nil
    print(test)
}
let x: Int? = nil
f(x)

and in the debugger:

(lldb) po foo as Any? == nil
true

(lldb) po test
false

Can you see how the first line printed by the debugger is `true`, and the second line printed by the debugger is `false`?
Can you see how `true` is different from `false`?
Both values were generated from the same piece of code: `foo as Any? == nil`.

@masters3d (Contributor)

You are comparing two different things. Optionals in Swift are backed by an enum called Optional; nil is just magical dust so people don't have to write down `.none`.

https://forums.swift.org/t/the-purpose-of-expressiblebynilliteral-as-a-public-api/12570/6

public struct _OptionalNilComparisonType: ExpressibleByNilLiteral {

In the generic context below, nil never compares against the inner optional value; only if you overload the function for Optional<T> does the optional comparison happen.

The confusing part here is that we are allowed to compare to nil even though the type is not Optional.

```swift
// Assumed declarations so the calls below match the printed results:
let x: Int? = nil
let y: Int = 0

// T Generic (results below are with only this version in scope)
func f<T>(_ foo: T) {
    let isNil = foo == nil
    debugPrint("isNil: \(isNil), type: \(T.self)")
}

f(x) // "isNil: false, type: Optional<Int>"
f(y) // "isNil: false, type: Int"

// Optional Overload (preferred over the generic version for optional arguments)
func f<T>(_ foo: Optional<T>) { // Same as T?
    let isNil = foo == nil
    debugPrint("isNil: \(isNil), type: \(T.self)")
}

f(x) // "isNil: true, type: Int"
f(y) // "isNil: false, type: Int"
```
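
To make the inner/outer distinction above concrete, here is a small standalone sketch (an added illustration, with T substituted by Int? as in the report):

```swift
// `nil` is only a literal; what it denotes depends on the type it is asked to become.
let inner: Int? = nil        // Optional<Int>.none
let outer: Int?? = nil       // Optional<Optional<Int>>.none
let wrapped: Int?? = inner   // .some(Optional<Int>.none): the outer optional is not nil

print(inner == nil)    // true
print(outer == nil)    // true
print(wrapped == nil)  // false: the outer Optional holds a value (which happens to be .none)
```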

@swift-ci (Collaborator, Author)

Comment by cukr (JIRA)

You can make the debugger behave differently even without using `nil`, or without comparing anything at all:

func f<T>(_ foo: T) {
    print(foo as Any?)
    switch foo as Any? { case Optional<Any>.some: print("some"); case Optional<Any>.none: print("none"); }
}
let x: Int? = nil
f(x)

The program will print:

Optional(nil)
some

while the debugger will print:

nil
none

@swift-ci (Collaborator, Author)

Comment by cukr (JIRA)

Another example, without using Optional

The code prints "A"; the debugger prints "B":

protocol Foo { }
extension Foo {
    func test() -> String { "A" }
}
struct S: Foo {
    func test() -> String { "B" }
}
func x<T: Foo>(_ x: T) {
    print(x.test())
}
x(S())

@masters3d (Contributor)

I am not seeing that in either example. Please post a picture.

@swift-ci (Collaborator, Author)

Comment by cukr (JIRA)

[Screenshot attachment: Screen Shot 2020-04-30 at 10.37.24 PM.png]

[Screenshot attachment: Screen Shot 2020-04-30 at 10.35.49 PM.png]

@swift-ci (Collaborator, Author)

Comment by cukr (JIRA)

Would you like for me to highlight in different colors what is being printed by the running program, and what by the debugger, so that it's easier to see the difference?

@augusto2112 (Contributor)

@vedantk The issue is marked as resolved, but I believe that's a mistake, right? I can reproduce it on my system:

(lldb) po foo != nil
false

(lldb) po test
true

@vedantk (Member) commented May 13, 2020

There have been a couple more examples discussed since the initial resolution; I think this is worth another look.

@augusto2112 (Contributor)

@vedantk @adrian-prantl

Here are swiftc's and LLDB's ASTs; this is the relevant part.
LLDB:

(binary_expr type='Bool' location=<EXPR>:6:5 range=[<EXPR>:6:1 - line:6:8] nothrow
  (dot_syntax_call_expr implicit type='(Int?, _OptionalNilComparisonType) -> Bool' location=<EXPR>:6:5 range=[<EXPR>:6:5 - line:6:5] nothrow
    (declref_expr type='(Optional<Int>.Type) -> (Int?, _OptionalNilComparisonType) -> Bool' location=<EXPR>:6:5 range=[<EXPR>:6:5 - line:6:5] decl=Swift.(file).Optional extension.!= [with (substitution_map generic_signature=<Wrapped> (substitution Wrapped -> Int))] function_ref=unapplied)
    (type_expr implicit type='Optional<Int>.Type' location=<EXPR>:6:5 range=[<EXPR>:6:5 - line:6:5] typerepr='Optional<Int>'))
  (tuple_expr implicit type='(Optional<Int>, _OptionalNilComparisonType)' location=<EXPR>:6:1 range=[<EXPR>:6:1 - line:6:8]
    (declref_expr type='Optional<Int>' location=<EXPR>:6:1 range=[<EXPR>:6:1 - line:6:1] decl=__lldb_expr_1.(file).$__lldb_expr(_:).foo@<EXPR>:2:62 function_ref=unapplied)
    (nil_literal_expr type='_OptionalNilComparisonType' location=<EXPR>:6:8 range=[<EXPR>:6:8 - line:6:8] initializer=Swift.(file)._OptionalNilComparisonType.init(nilLiteral:)))))

swiftc:

(binary_expr type='Bool' location=./example.swift:2:19 range=[./example.swift:2:15 - line:2:22] nothrow
  (dot_syntax_call_expr implicit type='(T?, _OptionalNilComparisonType) -> Bool' location=./example.swift:2:19 range=[./example.swift:2:19 - line:2:19] nothrow
    (declref_expr type='(Optional<T>.Type) -> (T?, _OptionalNilComparisonType) -> Bool' location=./example.swift:2:19 range=[./example.swift:2:19 - line:2:19] decl=Swift.(file).Optional extension.!= [with (substitution_map generic_signature=<Wrapped> (substitution Wrapped -> T))] function_ref=unapplied)
    (type_expr implicit type='Optional<T>.Type' location=./example.swift:2:19 range=[./example.swift:2:19 - line:2:19] typerepr='Optional<T>'))
  (tuple_expr implicit type='(T?, _OptionalNilComparisonType)' location=./example.swift:2:15 range=[./example.swift:2:15 - line:2:22]

    // notice this line!
    (inject_into_optional implicit type='T?' location=./example.swift:2:15 range=[./example.swift:2:15 - line:2:15]
      (declref_expr type='T' location=./example.swift:2:15 range=[./example.swift:2:15 - line:2:15] decl=output.(file).fooooo(_:).foo@./example.swift:1:18 function_ref=unapplied))
    (nil_literal_expr type='_OptionalNilComparisonType' location=./example.swift:2:22 range=[./example.swift:2:22 - line:2:22] initializer=Swift.(file)._OptionalNilComparisonType.init(nilLiteral:)))))

The only difference I see is the "inject_into_optional implicit type='T?'". So my guess is that since the type of T for the example is Int?, the compiler converts it to Int?? before the comparison, and LLDB doesn't, which would explain why swiftc evaluates the comparison to true and LLDB to false.

So for LLDB to work like swiftc, I think it'd have to somehow know that it has to wrap values in optionals (even if the type is already an optional) when the "original" type is generic. What do you think, is this the way to go? And if yes, how would I go about achieving this?
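
A small standalone sketch of that hypothesis (added for illustration, with T substituted by Int? as in the original example):

```swift
let foo: Int? = nil

// What the compiled generic code effectively evaluates: `foo` is first injected
// into another Optional (the inject_into_optional node), so the outer value is .some(nil).
let injected: Int?? = foo
print(injected != nil)   // true, matching the running program

// What LLDB effectively evaluates: the already-substituted Int? compared to nil.
print(foo != nil)        // false, matching the debugger
```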

@adrian-prantl (Member)

Is the concrete instance of f that the expression is injected into a generic function f<T> or a specialized function f<Int?>? I.e., what's the demangled symbol name? The upper AST makes it look like the latter.

@augusto2112 (Contributor)

@adrian-prantl

I believe that LLDB's mangled function name is $s13__lldb_expr_103$__a1_B0yySpyypGF and the compiled function name is $s6output1fyyxlF.

For some reason, swift-demangle hangs indefinitely when trying to demangle those symbols; am I doing something wrong?

Looking at the IR of the compiled code, the concrete type of T is being injected into the function:

Signature:

define hidden swiftcc void @"$s6output1fyyxlF"(%swift.opaque* noalias nocapture %0, %swift.type* %T) 

Call:

  %11 = call %swift.type* @__swift_instantiateConcreteTypeFromMangledName({ i32, i32 }* @"$sSiSgMD") #10, !dbg !52
  call swiftcc void @"$s6output1fyyxlF"(%swift.opaque* noalias nocapture %10, %swift.type* %11), !dbg !52

LLDB's version seems to receive just a pointer as a parameter:

 define hidden swiftcc void @"$s13__lldb_expr_103$__a1_B0yySpyypGF"(i8* %0) #0 {

@adrian-prantl (Member)

```
$ xcrun swift-demangle -expand s6output1fyyxlF
Demangling for $s6output1fyyxlF
kind=Global
  kind=Function
    kind=Module, text="output"
    kind=Identifier, text="f"
    kind=LabelList
    kind=Type
      kind=DependentGenericType
        kind=DependentGenericSignature
          kind=DependentGenericParamCount, index=1
        kind=Type
          kind=FunctionType
            kind=ArgumentTuple
              kind=Type
                kind=DependentGenericParamType
                  kind=Index, index=0
                  kind=Index, index=0
            kind=ReturnType
              kind=Type
                kind=Tuple
```

@adrian-prantl (Member)

I'm confused now. If the function is generic, where did the AST you posted get the Int? type from?

@augusto2112 (Contributor)

@adrian-prantl sorry, I should've been clearer about what I did.

I got the first AST from the logs of evaluating "foo != nil" in LLDB (`log enable lldb expr`); the second one is from the actual compiled code.

@adrian-prantl (Member)

My bad. Your post did say that! What is the result of `v foo` and `p foo`?

@augusto2112 (Contributor)

Do you mean just the result of the commands?

(lldb) v foo
(Int?) foo = nil
(lldb) p foo
(Int?) $R0 = nil

@augusto2112 (Contributor)

Hi @vedantk and @adrian-prantl, I'd like to update you on where I am with this issue.

So I found out that there seems to be some support for evaluating generics when dealing with classes/structs that doesn't work for free functions. For example, trying to evaluate `e var t: T? = nil` does not work for the example in the description, failing with the error:

error: <user expression 0>:1:8: use of undeclared identifier 'T'
var t: T? = nil

But it does work with:

class A<T> {
    func f(_ foo: T) {
        let test = foo != nil
        print(test)
    }
}
let x: Int? = nil
A().f(x)

The way this seems to work is by extending class A with a conditional conformance, so the whole generated code looks more or less like this:

extension A where T == Optional<Int> {
  @LLDBDebuggerFunction final func $__lldb_wrapped_expr_0(_ $__lldb_arg: UnsafeMutablePointer<Any>) {
    do {
      /*__LLDB_USER_START__*/
      var t: T? = nil
      /*__LLDB_USER_END__*/
    } catch (let __lldb_tmp_error) {
      var $__lldb_error_result = __lldb_tmp_error
    }
  }
}

func $__lldb_expr(_ $__lldb_arg : UnsafeMutablePointer<Any>) {
  do {
    $__lldb_injected_self.$__lldb_wrapped_expr_0(
      $__lldb_arg
    )
  }
}

Unfortunately, foo != nil still evaluates to false in this case as well, since T is "already" Int?.

So, firstly, do you think we can somehow reuse this functionality to address the bug? I'm not very confident, since it requires the function to be inside some class, and the generic type is specialized (is this the correct term?) when the code is generated, so we still don't get the upgraded optional we see in the compiled code.

I took inspiration from the aforementioned example and came up with this possible solution:

func $__lldb_expr(_ $__lldb_arg : UnsafeMutablePointer<Any>) {
    @LLDBDebuggerFunction func $__lldb_inner_expr<T>(_ $__lldb_arg: UnsafeMutablePointer<Any>, foo: T, bar: SomeActualType) {
      do {
        /*__LLDB_USER_START__*/
        foo != nil
        /*__LLDB_USER_END__*/
      } catch (let __lldb_tmp_error) {
        var $__lldb_error_result = __lldb_tmp_error
      }
    }
  do {
    $__lldb_inner_expr($__lldb_arg, foo: foo, bar: bar)
  }
}

So basically: create an inner function, `$__lldb_inner_expr`, pass all variables as parameters to it, and make generic the types of those that were originally generic. What do you think?

Right now, I'm looking for a way to find out whether a given variable's type was originally generic by the time we're evaluating the user's expression (more specifically, in SwiftASTManipulator::AddExternalVariables).

@vedantk (Member) commented May 26, 2020

That's a really nice investigation. I'm wary of using the conditional conformance trick to evaluate expressions that aren't in a method context, as that would require setting up a new fake type. (Btw: we use "substitute" to describe when we fill in a generic type with a non-generic type, and "specialize" when we make a copy of a generic function with all its generic types substituted with non-generic types.)

The gadget you shared for `$__lldb_inner_expr` is interesting, but it seems a bit tricky to set up: calling the inner expression func would require a fair amount of extra work. Instead, could we piggy-back on the "argument struct" (`$__lldb_arg`), and change the type of the "foo" field of that struct to `Swift.Optional<Swift.Optional<Swift.Int>>`?

Today, we add:
```
Variables:
[name=test, type = Swift.Bool]
[name=foo, type = Swift.Optional<Swift.Int>]
[name=$τ_0_0, type = Builtin.RawPointer]
...
```

@augusto2112 (Contributor)

@vedantk @adrian-prantl I got the version with the inner function working! 😃

There's one caveat: I had to add a function GenericContext::setGenericParams to the Swift side. Is this ok? Was there a reason for this not to exist? As an alternative, I could try to create a new function instead and clone all the fields.

I'm cleaning up the code right now as I had some things hard-coded to make it run, but I should be able to open a PR today.

@augusto2112 (Contributor)

I've opened a pull request to solve this issue, although the solution is not complete yet.

The overall idea is to introduce a second function, $__lldb_inner_expr, and pass all the variables in the current environment as arguments to it, making them generic if their type was substituted.

We start by setting up $__lldb_expr and $__lldb_inner_expr:

func $__lldb_inner_expr() throws {
  /*__LLDB_USER_START__*/
  foo != nil // user expression inserted here
  /*__LLDB_USER_END__*/
}
@LLDBDebuggerFunction 
func $__lldb_expr(_ $__lldb_arg : UnsafeMutablePointer<Any>) {
  do {
    try $__lldb_inner_expr()
  } catch (let __lldb_tmp_error) {
    var $__lldb_error_result = __lldb_tmp_error
  }
}

We update $__lldb_inner_expr's signature to receive all the variables in $__lldb_expr, and update the call site to pass those parameters along. This depends on adding a function GenericContext::setGenericParams to the swift repo, which currently doesn't exist. This is done in SwiftASTManipulator::SetupParametersForInnerFunction. For the example expression, let's say that foo is a variable with substituted type Int? and bar has type String:

func $__lldb_inner_expr<T0>(foo: T0, bar: String) throws {
  /*__LLDB_USER_START__*/
  foo != nil
  /*__LLDB_USER_END__*/
}
@LLDBDebuggerFunction 
func $__lldb_expr(_ $__lldb_arg : UnsafeMutablePointer<Any>) {
  do {
    try $__lldb_inner_expr(foo: foo, bar: bar)
  } catch (let __lldb_tmp_error) {
    var $__lldb_error_result = __lldb_tmp_error
  }
}

After type-checking is done, we update the return type of $__lldb_inner_expr to the correct type and return the last expression. We should also update the call site to expect a value of that type; this is the part that's still missing.

func $__lldb_inner_expr<T0>(foo: T0, bar: String) throws -> Bool {
  /*__LLDB_USER_START__*/
  return foo != nil
  /*__LLDB_USER_END__*/
}
@LLDBDebuggerFunction 
func $__lldb_expr(_ $__lldb_arg : UnsafeMutablePointer<Any>) {
  do {
    // expect a value of type `Bool` here, currently working on this
    try $__lldb_inner_expr(foo: foo, bar: bar)
  } catch (let __lldb_tmp_error) {
    var $__lldb_error_result = __lldb_tmp_error
  }
}

After that, evaluation continues as normal.

@augusto2112 (Contributor)

I'm unassigning myself from this issue since I wasn't able to find a robust solution. I'll summarize the problem, my attempted solution, and the problem with this solution, in the hopes this will help out whoever picks this up next. The example I'll be referring to in the rest of this comment is the one in the description.

The problem

The problem is that, in the compiled code, when comparing an optional to a value whose type might not be optional (in the example, T), the latter gets wrapped in an optional to allow for the comparison (so, in the example, we compare two values of type Int??); but, when evaluating the expression in LLDB, since the type is already substituted to an optional type (Int?), this wrapping doesn't happen, and we end up with a different expression. I think the more general problem is that LLDB always uses the substituted types when evaluating expressions (which makes sense from a practical standpoint), and sometimes this conflicts with the compiled code in these edge cases.

Attempted Solution

The attempted solution was to take into account the generic types by using an inner generic function. When evaluating an expression, we'd generate, more or less, the following code:

func $__lldb_expr(_ $__lldb_arg : UnsafeMutablePointer<Any>) {
  @LLDBDebuggerFunction // t1, t2 and t3 are variables in the environment with generic types
  func $__lldb_wrapped_expr<T1, T2, T3>(_ $__lldb_arg : UnsafeMutablePointer<Any>, t1: T1, t2: T2, t3: T3) {
   // user expression here
  }
  $__lldb_wrapped_expr($__lldb_arg, t1: t1, t2: t2, t3: t3)
}

In essence, for every variable in the environment, we'd "re-generify" it by setting up a new generic argument in $__lldb_wrapped_expr, and passing the variable as an argument (I tried a couple of variations of this, but that's the gist of the solution). It should be noted that LLDB used to have something like this (as seen in this commit from 2015, in SwiftASTManipulator.cpp, lines 320-342), but this was dropped for some reason (I was unable to find a clear commit where this was dropped, and why).

Problem with the solution

The problem with this solution is that, in the vast majority of cases, we want to use the substituted type of variables in a generic context. For example, consider the following code:

struct A {
    func someFunc() {}
}
func f<T>(_ foo: T) -> T {
    return foo // breakpoint here and run "e foo.someFunc()"
}

let x = A()
_ = f(x)

Unfortunately, in the solution presented, evaluating foo.someFunc() would fail, because type T has no function someFunc, which would be very inconvenient for LLDB users.

This is the PR I opened to solve the issue; there's some good discussion there that might prove useful to whoever wants to tackle this next.

@swift-ci swift-ci transferred this issue from apple/swift-issues Apr 25, 2022