
[SR-13229] Derivative function protocol witness SILGen verification failure #55669

Closed
dan-zheng opened this issue Jul 16, 2020 · 1 comment
Labels: AutoDiff, bug, compiler


Previous ID: SR-13229
Radar: None
Original Reporter: @dan-zheng
Type: Bug
Status: Closed
Resolution: Duplicate

Additional Detail from JIRA:
Votes: 0
Component/s: Compiler
Labels: Bug, AutoDiff
Assignee: None
Priority: Medium

md5: 8c1c5d54937a6c1bd09b69b2b98624b1

is duplicated by:

Issue Description:

Reproducer:

import _Differentiation

protocol P {}

struct Tensor<Scalar>: Equatable {
  @differentiable(where Scalar: P)
  static func +(lhs: Self, rhs: Self) -> Self { return lhs }
}
extension Tensor: Differentiable where Scalar: P {}

protocol Addable: Differentiable {
  @differentiable
  static func +(lhs: Self, rhs: Self) -> Self
}
extension Tensor: Addable where Scalar: P {}
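
To make the signature gap concrete, the derivative that @differentiable(where Scalar: P) promises can be written out by hand as an explicit @derivative(of:) registration, roughly like the sketch below (hypothetical and not part of the reproducer; _jvpAdd is an invented name). The point is that the derivative only exists under the constrained signature <Scalar where Scalar: P>, while the witnessed + itself is unconstrained (<Scalar>).

// Hypothetical sketch, not part of the reproducer.
extension Tensor where Scalar: P {
  @derivative(of: +)
  static func _jvpAdd(lhs: Self, rhs: Self)
    -> (value: Self, differential: (TangentVector, TangentVector) -> TangentVector) {
    // The original `+` returns `lhs`, so the differential forwards the first tangent.
    return (lhs + rhs, { dlhs, _ in dlhs })
  }
}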

Verification failure:

$ swiftc sr-13229.swift
substitution map's generic signature: <Scalar>
callee's generic signature: <τ_0_0 where τ_0_0 : P>
SIL verification failed: Substitution map does not match callee in apply instruction: false
Verifying instruction:
     %4 = load [trivial] %1 : $*Tensor<τ_0_0>    // user: %10
     %5 = load [trivial] %2 : $*Tensor<τ_0_0>    // user: %10
     %6 = metatype $@thin Tensor<τ_0_0>.Type     // user: %10
     %9 = differentiable_function_extract [jvp] %8 : $@differentiable @convention(method) <τ_0_0> (Tensor<τ_0_0>, Tensor<τ_0_0>, @noDerivative @thin Tensor<τ_0_0>.Type) -> Tensor<τ_0_0> // user: %10
->   %10 = apply %9<τ_0_0>(%4, %5, %6) : $@convention(method) <τ_0_0 where τ_0_0 : P> (Tensor<τ_0_0>, Tensor<τ_0_0>, @thin Tensor<τ_0_0>.Type) -> (Tensor<τ_0_0>, @owned @callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (τ_0_0, τ_0_1) -> τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector>) // user: %11
     (%11, %12) = destructure_tuple %10 : $(Tensor<τ_0_0>, @callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (τ_0_0, τ_0_1) -> τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector>) // users: %13, %14
In function:
// AD__$s4main6TensorVyxGAA7AddableA2A1PRzlAaEP1poiyxx_xtFZTW_jvp_SSU
sil private [transparent] [thunk] [ossa] @AD__$s4main6TensorVyxGAA7AddableA2A1PRzlAaEP1poiyxx_xtFZTW_jvp_SSU : $@convention(witness_method: Addable) <τ_0_0 where τ_0_0 : P> (@in_guaranteed Tensor<τ_0_0>, @in_guaranteed Tensor<τ_0_0>, @thick Tensor<τ_0_0>.Type) -> (@out Tensor<τ_0_0>, @owned @callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (@in_guaranteed τ_0_0, @in_guaranteed τ_0_1) -> @out τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector>) {
// %0                                             // user: %13
// %1                                             // user: %4
// %2                                             // user: %5
bb0(%0 : $*Tensor<τ_0_0>, %1 : $*Tensor<τ_0_0>, %2 : $*Tensor<τ_0_0>, %3 : $@thick Tensor<τ_0_0>.Type):
  %4 = load [trivial] %1 : $*Tensor<τ_0_0>       // user: %10
  %5 = load [trivial] %2 : $*Tensor<τ_0_0>       // user: %10
  %6 = metatype $@thin Tensor<τ_0_0>.Type        // user: %10
  // function_ref static Tensor.+ infix(_:_:)
  %7 = function_ref @$s4main6TensorV1poiyACyxGAE_AEtFZ : $@convention(method) <τ_0_0> (Tensor<τ_0_0>, Tensor<τ_0_0>, @thin Tensor<τ_0_0>.Type) -> Tensor<τ_0_0> // user: %8
  %8 = differentiable_function [parameters 0 1] [results 0] %7 : $@convention(method) <τ_0_0> (Tensor<τ_0_0>, Tensor<τ_0_0>, @thin Tensor<τ_0_0>.Type) -> Tensor<τ_0_0> // user: %9
  %9 = differentiable_function_extract [jvp] %8 : $@differentiable @convention(method) <τ_0_0> (Tensor<τ_0_0>, Tensor<τ_0_0>, @noDerivative @thin Tensor<τ_0_0>.Type) -> Tensor<τ_0_0> // user: %10
  %10 = apply %9<τ_0_0>(%4, %5, %6) : $@convention(method) <τ_0_0 where τ_0_0 : P> (Tensor<τ_0_0>, Tensor<τ_0_0>, @thin Tensor<τ_0_0>.Type) -> (Tensor<τ_0_0>, @owned @callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (τ_0_0, τ_0_1) -> τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector>) // user: %11
  (%11, %12) = destructure_tuple %10 : $(Tensor<τ_0_0>, @callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (τ_0_0, τ_0_1) -> τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector>) // users: %13, %14
  store %11 to [trivial] %0 : $*Tensor<τ_0_0>    // id: %13
  %14 = convert_function %12 : $@callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (τ_0_0, τ_0_1) -> τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector> to $@callee_guaranteed (Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector) -> Tensor<τ_0_0>.TangentVector // user: %16
  // function_ref thunk for @escaping @callee_guaranteed (@unowned Tensor<A><A>.TangentVector, @unowned Tensor<A><A>.TangentVector) -> (@unowned Tensor<A><A>.TangentVector)
  %15 = function_ref @$s4main6TensorVA2A1PRzlE13TangentVectorVyx_GA2GIegyyd_A3GIegnnr_AaDRzlTR : $@convention(thin) <τ_0_0 where τ_0_0 : P> (@in_guaranteed Tensor<τ_0_0>.TangentVector, @in_guaranteed Tensor<τ_0_0>.TangentVector, @guaranteed @callee_guaranteed (Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector) -> Tensor<τ_0_0>.TangentVector) -> @out Tensor<τ_0_0>.TangentVector // user: %16
  %16 = partial_apply [callee_guaranteed] %15<τ_0_0>(%14) : $@convention(thin) <τ_0_0 where τ_0_0 : P> (@in_guaranteed Tensor<τ_0_0>.TangentVector, @in_guaranteed Tensor<τ_0_0>.TangentVector, @guaranteed @callee_guaranteed (Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector) -> Tensor<τ_0_0>.TangentVector) -> @out Tensor<τ_0_0>.TangentVector // user: %17
  %17 = convert_function %16 : $@callee_guaranteed (@in_guaranteed Tensor<τ_0_0>.TangentVector, @in_guaranteed Tensor<τ_0_0>.TangentVector) -> @out Tensor<τ_0_0>.TangentVector to $@callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (@in_guaranteed τ_0_0, @in_guaranteed τ_0_1) -> @out τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector> // user: %18
  return %17 : $@callee_guaranteed @substituted <τ_0_0, τ_0_1, τ_0_2> (@in_guaranteed τ_0_0, @in_guaranteed τ_0_1) -> @out τ_0_2 for <Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector, Tensor<τ_0_0>.TangentVector> // id: %18
} // end sil function 'AD__$s4main6TensorVyxGAA7AddableA2A1PRzlAaEP1poiyxx_xtFZTW_jvp_SSU'

The verification failure occurs for a protocol witness thunk generated by SILGenFunction::emitProtocolWitness, for a derivative function witness #Tensor."+"!jvp.SSU.<Self where Self : Addable>.

In SILGenFunction::emitProtocolWitness, witnessSubs (the witness substitution map, whose generic signature is <Scalar>) does not match the generic signature of witnessFnRef (the callee returned by getWitnessFunctionRef), which carries the additional differentiability requirement Scalar: P.
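
A minimal sketch of one possible rearrangement, assuming the problem is that the witness and its derivative have different generic signatures: declare + directly in the constrained Addable extension, so that the witness itself carries <Scalar where Scalar: P>. This is untested against this crash and not a confirmed workaround.

import _Differentiation

protocol P {}

struct Tensor<Scalar>: Equatable {}
extension Tensor: Differentiable where Scalar: P {}

protocol Addable: Differentiable {
  @differentiable
  static func +(lhs: Self, rhs: Self) -> Self
}

// The witness is now declared in the constrained extension, so it and its
// derivative share the generic signature <Scalar where Scalar: P>.
extension Tensor: Addable where Scalar: P {
  @differentiable
  static func +(lhs: Self, rhs: Self) -> Self { return lhs }
}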


ProfFan commented Sep 28, 2023

Just an FYI that this is still not resolved.

This issue was closed.