Xu Luo
UESTC | Shanghai AI Laboratory
Rectifying the shortcut learning of background for few-shot learning
X Luo, L Wei, L Wen, J Yang, L Xie, Z Xu, Q Tian
NeurIPS 2021, 2021
Channel importance matters in few-shot image classification
X Luo, J Xu, Z Xu
ICML 2022, 2022
Boosting few-shot classification with view-learnable contrastive learning
X Luo*, Y Chen*, L Wen, L Pan, Z Xu
ICME 2021, 2021
A closer look at few-shot classification again
X Luo*, H Wu*, J Zhang, L Gao, J Xu, J Song
ICML 2023, 2023
Alleviating the sample selection bias in few-shot learning by removing projection to the centroid
J Xu, X Luo, X Pan, Y Li, W Pei, Z Xu
NeurIPS 2022, 2022
DETA: Denoised task adaptation for few-shot learning
J Zhang, L Gao, X Luo, H Shen, J Song
ICCV 2023, 2023
Exploring category-correlated feature for few-shot image classification
J Xu, X Pan, X Luo, W Pei, Z Xu
arXiv preprint arXiv:2112.07224, 2021
Concatenated tensor networks for deep multi-task learning
M Wang, Z Su, X Luo, Y Pan, S Zheng, Z Xu
ICONIP 2020, 2020
3DAxiesPrompts: Unleashing the 3D Spatial Task Capabilities of GPT-4V
D Liu, X Dong, R Zhang, X Luo, P Gao, X Huang, Y Gong, Z Wang
arXiv preprint arXiv:2312.09738, 2023
Lumina-T2X: Transforming Text into Any Modality, Resolution, and Duration via Flow-based Large Diffusion Transformers
P Gao*, L Zhuo*, D Liu*, R Du*, X Luo*, L Qiu*, Y Zhang, C Lin, R Huang, ...
arXiv preprint arXiv:2405.05945, 2024
CoIN: A Benchmark of Continual Instruction tuNing for Multimodel Large Language Model
C Chen, J Zhu, X Luo, H Shen, L Gao, J Song
arXiv preprint arXiv:2403.08350, 2024
Less is More: On the Feature Redundancy of Pretrained Models When Transferring to Few-shot Tasks
X Luo, D Zou, L Gao, Z Xu, J Song
arXiv preprint arXiv:2310.03843, 2023