(Failure of OOD detection under invariant classifier) Consider an out-of-distribution input which contains the environmental feature: $\phi_{\text{out}}(x) = M_{\text{inv}} z_{\text{out}} + M_e z_e$, where $z_{\text{out}} \notin \mathcal{Z}_{\text{inv}}$. Given the invariant classifier (cf. Lemma 2), the posterior probability for the OOD input is $p(y = 1 \mid \phi_{\text{out}}) = \sigma\!\left(2 p^\top z_e \beta + \log \frac{\eta}{1-\eta}\right)$, where $\sigma$ is the logistic function. Thus for arbitrary confidence $0 < c := P(y = 1 \mid \phi_{\text{out}}) < 1$, there exists $\phi_{\text{out}}(x)$ with $z_e$ such that $p^\top z_e = \frac{1}{2\beta} \log \frac{c(1-\eta)}{\eta(1-c)}$.
Proof. Consider an out-of-distribution input $x_{\text{out}}$ with $M_{\text{inv}} = \begin{bmatrix} I_{s \times s} \\ 0_{1 \times s} \end{bmatrix}$ and $M_e = \begin{bmatrix} 0_{s \times e} \\ p^\top \end{bmatrix}$; then the feature representation is $\phi_{\text{out}}(x) = \begin{bmatrix} z_{\text{out}} \\ p^\top z_e \end{bmatrix}$, where $p$ is the unit-norm vector defined in Lemma 2.
Then we have $P(y = 1 \mid \phi_{\text{out}}) = P(y = 1 \mid z_{\text{out}}, p^\top z_e) = \sigma\!\left(2 p^\top z_e \beta + \log \frac{\eta}{1-\eta}\right)$, where $\sigma$ is the logistic function. Thus for arbitrary confidence $0 < c := P(y = 1 \mid \phi_{\text{out}}) < 1$, there exists $\phi_{\text{out}}(x)$ with $z_e$ such that $p^\top z_e = \frac{1}{2\beta} \log \frac{c(1-\eta)}{\eta(1-c)}$. $\blacksquare$
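As a quick numerical sanity check of the closed-form condition above, the sketch below constructs $z_e$ along $p$ for a target confidence $c$ and verifies that the logistic posterior recovers $c$. The prior $\eta$, the scalar $\beta$, the dimension of $z_e$, and the helper names (`required_ze`, `posterior`) are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

def required_ze(c, eta, beta, p):
    """Construct z_e along the unit-norm vector p so that the invariant
    classifier's posterior on the OOD input equals the target confidence c."""
    # p^T z_e from the lemma: (1 / (2*beta)) * log( c(1-eta) / (eta(1-c)) )
    proj = np.log(c * (1 - eta) / (eta * (1 - c))) / (2 * beta)
    return proj * p  # since ||p|| = 1, p^T z_e = proj

def posterior(z_e, eta, beta, p):
    """Posterior p(y=1 | phi_out) = sigma(2 * beta * p^T z_e + log(eta/(1-eta)))."""
    logit = 2 * beta * (p @ z_e) + np.log(eta / (1 - eta))
    return 1 / (1 + np.exp(-logit))  # logistic function

# Illustrative values (assumptions, not taken from the paper).
rng = np.random.default_rng(0)
p = rng.normal(size=5)
p /= np.linalg.norm(p)      # unit-norm vector p
eta, beta = 0.5, 1.3        # class prior and scalar from the data model

for c in [0.01, 0.5, 0.99]:
    z_e = required_ze(c, eta, beta, p)
    print(c, posterior(z_e, eta, beta, p))  # recovers c up to floating-point error
```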
Remark: In a more general case, $z_{\text{out}}$ can be modeled as a random vector that is independent of the in-distribution labels $y = 1$ and $y = -1$ and of the environmental features: $z_{\text{out}} \perp y$ and $z_{\text{out}} \perp z_e$. Therefore in Eq. 5 we have $P(z_{\text{out}} \mid y = 1) = P(z_{\text{out}} \mid y = -1) = P(z_{\text{out}})$. Then $P(y = 1 \mid \phi_{\text{out}}) = \sigma\!\left(2 p^\top z_e \beta + \log \frac{\eta}{1-\eta}\right)$, the same as in Eq. 7. Hence the main theorem still holds under this more general case.
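To spell out the step behind this remark: applying Bayes' rule and reading the remark as saying that $z_{\text{out}}$ is independent of $(y, z_e)$ jointly (an assumption made here so that the likelihood factorizes over $z_{\text{out}}$ and $p^\top z_e$), we get

$$P(y = 1 \mid \phi_{\text{out}}) = \frac{P(z_{\text{out}})\, P(p^\top z_e \mid y = 1)\, \eta}{P(z_{\text{out}})\left[ P(p^\top z_e \mid y = 1)\, \eta + P(p^\top z_e \mid y = -1)\,(1 - \eta) \right]},$$

so the $P(z_{\text{out}})$ factor cancels and the posterior reduces to the same logistic expression in $p^\top z_e$ as above.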
To further validate our conclusions beyond background and gender as spurious (environmental) features, we provide additional experimental results on the ColorMNIST dataset, as shown in Figure 5.
ColorMNIST is a variant of MNIST [lecun1998gradient] that composes colored backgrounds on digit images. In this dataset, $E = \{\text{red}, \text{green}, \text{purple}, \text{pink}\}$ denotes the background color and we use $Y = \{0, 1\}$ as the in-distribution classes. The correlation between the background color $e$ and the digit $y$ is explicitly controlled by $r$. That is, $r$ denotes the probability $P(e = \text{red} \mid y = 0) = P(e = \text{purple} \mid y = 0) = P(e = \text{green} \mid y = 1) = P(e = \text{pink} \mid y = 1)$, while $0.5 - r = P(e = \text{green} \mid y = 0) = P(e = \text{pink} \mid y = 0) = P(e = \text{red} \mid y = 1) = P(e = \text{purple} \mid y = 1)$. Note that the maximum correlation $r$ (reported in Table 4) is 0.45. As ColorMNIST is relatively simpler compared to Waterbirds and CelebA, further increasing the correlation results in less interesting environments where the learner can easily pick up the contextual information. For spurious OOD, we use digits with background colors red and green, which share environmental features with the training data. For non-spurious OOD, following common practice [MSP], we use the Textures [cimpoi2014describing], LSUN [lsun], and iSUN [xu2015turkergaze] datasets. We train a ResNet-18 [he2016deep], which achieves 99.9% accuracy on the in-distribution test set. The OOD detection performance is shown in Table 4.
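For concreteness, below is a minimal sketch of how a colored-background dataset with this controlled correlation $r$ could be generated. The RGB values, the compositing scheme, and the helper names (`sample_background`, `colorize`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative RGB values for the four background colors (assumed, not from the paper).
COLORS = {"red": (255, 0, 0), "green": (0, 255, 0),
          "purple": (128, 0, 128), "pink": (255, 105, 180)}

def sample_background(y, r, rng):
    """Sample a background color for a digit with label y in {0, 1}.

    Each of the two label-correlated colors has probability r; each of the two
    anti-correlated colors has probability 0.5 - r, so the four options sum to 1.
    """
    if y == 0:
        colors = ["red", "purple", "green", "pink"]
    else:
        colors = ["green", "pink", "red", "purple"]
    return rng.choice(colors, p=[r, r, 0.5 - r, 0.5 - r])

def colorize(digit_img, color):
    """Compose a grayscale 28x28 digit (uint8) onto a solid colored background."""
    fg = digit_img[..., None] / 255.0               # digit intensity as alpha mask
    bg = np.array(COLORS[color], dtype=np.float64)  # solid background color
    out = fg * 255.0 + (1.0 - fg) * bg              # bright digit, colored background
    return out.astype(np.uint8)

# Example: with r = 0.45, a digit 0 gets a red or purple background 90% of the time.
rng = np.random.default_rng(0)
digit = np.zeros((28, 28), dtype=np.uint8)  # placeholder for an MNIST image
img = colorize(digit, sample_background(y=0, r=0.45, rng=rng))
```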