
One paper has been accepted by Pattern Recognition.

Our paper entitled "IW-ViT: Independence-Driven Weighting Vision Transformer for Out-of-Distribution Generalization" has been accepted by Pattern Recognition.

Title: IW-ViT: Independence-Driven Weighting Vision Transformer for Out-of-Distribution Generalization

Authors: Weifeng Liu*, Haoran Yu, Yingjie Wang, Baodi Liu, Dapeng Tao, Honglong Chen

Abstract: The Vision Transformer (ViT) has shown excellent performance in a variety of computer vision applications under the independent and identically distributed (i.i.d.) assumption. However, when the test distribution differs from the training distribution, model performance drops significantly. To address this problem, we propose independence-driven sample weighting to improve the model's out-of-distribution (OOD) generalization ability. The method learns a set of sample weights that eliminate spurious correlations between irrelevant features and labels by removing dependencies among the features. Previous work on independence sample weighting learns sample weights only from the final output of the feature extractor. In contrast, we account for how spurious correlations differ across layers of the feature extraction process. Combining the modular architecture of ViT with independence sample weighting, we propose the Independence-Driven Weighting Vision Transformer (IW-ViT) for OOD generalization. IW-ViT is built from specialized encoder blocks, called IW-Blocks, each of which incorporates an independence sample weighting module. Every IW-Block learns its own set of sample weights and produces a weighted loss, allowing spurious correlations to be eliminated differentially at each depth. We conduct extensive experiments on a range of datasets; the results demonstrate that IW-ViT significantly outperforms previous work across different OOD generalization settings.
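The abstract describes each IW-Block as a ViT encoder block paired with an independence sample weighting module that yields a per-block weighted loss. The sketch below is a minimal, hypothetical PyTorch rendering of that idea, not the authors' released code: the names `IWBlock`, `weight_head`, `independence_penalty`, and `block_loss` are all assumptions, and the weighted-covariance penalty is a simple linear stand-in for whatever independence criterion the paper actually uses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IWBlock(nn.Module):
    """A ViT encoder block plus a per-sample weighting head (sketch)."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )
        # Hypothetical head that predicts one positive weight per sample.
        self.weight_head = nn.Linear(dim, 1)

    def forward(self, x):
        # Standard pre-norm transformer encoder computation.
        h = self.norm1(x)
        x = x + self.attn(h, h, h)[0]
        x = x + self.mlp(self.norm2(x))
        # Pool tokens, then derive normalized per-sample weights.
        feats = x.mean(dim=1)                             # (B, dim)
        w = F.softplus(self.weight_head(feats)).squeeze(-1)
        w = w / w.mean()                                  # mean weight == 1
        return x, w, feats


def independence_penalty(feats, w):
    """Off-diagonal weighted covariance: a simple linear proxy for
    statistical dependence between feature dimensions."""
    wsum = w.sum()
    mean = (w.unsqueeze(-1) * feats).sum(dim=0) / wsum
    centered = feats - mean
    cov = centered.T @ (w.unsqueeze(-1) * centered) / wsum
    off_diag = cov - torch.diag(torch.diagonal(cov))
    return (off_diag ** 2).sum()


def block_loss(logits, labels, w, feats, lam=0.1):
    """Per-block objective: sample-weighted cross-entropy plus the
    independence penalty that the weights are trained to reduce."""
    ce = F.cross_entropy(logits, labels, reduction="none")  # (B,)
    return (w * ce).mean() + lam * independence_penalty(feats, w)


# Example: one block on a batch of 8 sequences of 16 tokens.
block = IWBlock(dim=64)
head = nn.Linear(64, 10)            # hypothetical per-block classifier head
x = torch.randn(8, 16, 64)
tokens, w, feats = block(x)
loss = block_loss(head(feats), torch.randint(0, 10, (8,)), w, feats)
```

In the architecture the abstract describes, such blocks would be stacked and each block's weighted loss would contribute to the overall objective; the stand-alone block above only illustrates the per-block weighting mechanism.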
