Sangmin Bae
Graduate School of AI, KAIST Email: bsmn0223xkxkxk@kaist.ac.kr / bsmn0223xkxkxk@gmail.com Google Scholar, CV, Github, Linkedin, X
Jun. 2024: 🎊 A paper on 'Contrastive Decoding for Multi-modal Models' accepted at ICMLW 2024.
May. 2024: 💻 Starting an internship at Google DeepMind.
May. 2024: 🎊 A paper on 'Hard Prompt Optimization with RL' accepted at ACL 2024.
Apr. 2024: 🎊 A paper on 'Respiratory Sound Classification' accepted at EMBC 2024.
Mar. 2024: 🎊 A long paper on 'Evolving Question-Answering Benchmark' accepted at NAACL 2024.
Jan. 2024: 🥈 Silver Award from Samsung Humantech Paper Awards.
Education
Publications Google Scholar *: 1st co-authors, †: corresponding authors, C: conferences, J: journals, W: workshops, P: preprints
[P6] Stephen Cha, Minchan Jeong, Sangmin Bae, Beren Millidge, Se-Young Yun†. Continual Learning by Activation Projection for Large Language Models. Preprint 2024.
[P5] Felix den Breejen*, Sangmin Bae, Stephen Cha, Se-Young Yun†. Why In-Context Learning Transformers are Tabular Data Classifiers. Preprint 2024. [pdf] [code]
[P4] Yongjin Yang*, Sihyeon Kim*, Hojung Jung, Sangmin Bae, SangMook Kim, Se-Young Yun†, Kimin Lee†. Less but Better: Efficient Alignment of Text-to-Image Diffusion Models using Golden Human Data. Preprint 2024.
[P3] Namgyu Ho*, Sangmin Bae*, Taehyeon Kim, Hyunjik Jo, Yireun Kim, Tal Schuster, Adam Fisch, James Thorne†, Se-Young Yun†. Block Transformer: Global-to-Local Language Modeling for Fast Inference. Preprint 2024. [pdf] [code]
[W7] Sihyeon Kim*, Boryeong Cho*, Sangmin Bae, Sumyeong Ahn†, Se-Young Yun†. VACoDe: Visual Augmented Contrastive Decoding. International Conference on Machine Learning Workshop on Trustworthy Multi-modal Foundation Models and AI Agents (ICMLW) 2024.
[P2] Sungnyun Kim*, Kangwook Jang*, Sangmin Bae, Hoirin Kim†, Se-Young Yun†. Learning Video Temporal Dynamics with Asymmetric Cross-Modal Attention for Robust Audio-Visual Speech Recognition. Preprint 2024.
[C10] Yunseon Choi, Sangmin Bae, Seonghyun Ban, Minchan Jeong, Chuheng Zhang, Lei Song, Li Zhao, Jiang Bian, Kee-Eung Kim†. Hard Prompts Made Interpretable: Sparse Entropy Regularization for Prompt Tuning with RL. Annual Meeting of the Association for Computational Linguistics (ACL) 2024. Oral Presentation.
[C9] June-Woo Kim, Miika Toikkanen, Sangmin Bae, Minseok Kim†, Ho-Young Jung†. RepAugment: Input-Agnostic Representation-Level Augmentation for Respiratory Sound Classification. International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2024. [pdf]
[C8] Yujin Kim, Jaehong Yoon, Seonghyeon Ye, Sangmin Bae, Namgyu Ho, Sung Ju Hwang†, Se-Young Yun†. Carpe diem: On the Evaluation of World Knowledge in Lifelong Language Models. Conference of the North American Chapter of the Association for Computational Linguistics (NAACL) Long Paper 2024. [pdf] [code]
[C7] June-Woo Kim, Sangmin Bae, Won-Yang Cho, Byungjo Lee, Ho-Young Jung†. Stethoscope-guided Supervised Contrastive Learning for Cross-domain Adaptation on Respiratory Sound Classification. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2024. [pdf] [code]
[W6] June-Woo Kim, Chihyeon Yoon, Miika Toikkanen, Sangmin Bae, Ho-Young Jung†. Adversarial Fine-tuning using Generated Respiratory Sound to Address Class Imbalance. Neural Information Processing Systems Workshop on Deep Generative Models for Health (NeurIPSW) 2023. [pdf] [code]
[W5] Felix den Breejen, Sangmin Bae, Stephen Cha, Tae-Young Kim, Seoung-Hyun Koh, Se-Young Yun†. Exploring the Retrieval Mechanism for Tabular Deep Learning. Neural Information Processing Systems Workshop on Table Representation Learning (NeurIPSW) 2023. [pdf]
[C6] Sangmin Bae*, Jongwoo Ko*, Hwanjun Song†, Se-Young Yun†. Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding. Conference on Empirical Methods in Natural Language Processing (EMNLP) Long Paper 2023. [pdf] [code]
[C5] Sangmin Bae*, June-Woo Kim*, Won-Yang Cho, Hyerim Baek, Soyoun Son, Byungjo Lee, Changwan Ha, Kyongpil Tae, Sungnyun Kim†, Se-Young Yun†. Patch-Mix Contrastive Learning with Audio Spectrogram Transformer on Respiratory Sound Classification. Conference of the International Speech Communication Association (INTERSPEECH) 2023. [pdf] [code]
[C4] Sungnyun Kim*, Sangmin Bae*, Se-Young Yun†. Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023. [pdf] [code]
[C3] Sangmook Kim*, Sangmin Bae*, Hwanjun Song†, Se-Young Yun†. Re-thinking Federated Active Learning based on Inter-class Diversity. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023. [pdf] [code]
[C2] Sangmin Bae*, Sungnyun Kim*, Jongwoo Ko, Gihun Lee, Seungjong Noh, Se-Young Yun†. Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network. AAAI Conference on Artificial Intelligence (AAAI) 2023. Oral Presentation. [pdf] [code]
[C1] Gihun Lee*, Minchan Jeong*, Yongjin Shin, Sangmin Bae, Se-Young Yun†. Preservation of Global Knowledge by Not-True Distillation in Federated Learning. Neural Information Processing Systems (NeurIPS) 2022. [pdf] [code]
[W4] Sungnyun Kim*, Sangmin Bae*, Se-Young Yun†. Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning. Neural Information Processing Systems Workshop on Self-Supervised Learning: Theory and Practice (NeurIPSW) 2022. [pdf]
[W3] Sangmook Kim*, Sangmin Bae*, Hwanjun Song†, Se-Young Yun†. LG-FAL: Federated Active Learning Strategy using Local and Global Models. International Conference on Machine Learning Workshop on Adaptive Experimental Design and Active Learning in the Real World (ICMLW) 2022. [pdf]
[W2] Sungnyun Kim*, Gihun Lee*, Sangmin Bae*, Se-Young Yun†. MixCo: Mix-up Contrastive Learning for Visual Representation. Neural Information Processing Systems Workshop on Self-Supervised Learning: Theory and Practice (NeurIPSW) 2020. [pdf] [code]
[P1] Taehyeon Kim*, Sangmin Bae*, Jin-woo Lee, Se-Young Yun†. Accurate and Fast Federated Learning via Combinatorial Multi-Armed Bandits. Preprint 2020. [pdf]
[W1] Gihun Lee*, Sangmin Bae*, Jaehoon Oh, Se-Young Yun†. SIPA: A Simple Framework for Efficient Networks. IEEE International Conference on Data Mining Workshop on Big Data Analysis for Smart Energy (ICDMW) 2020. [pdf] [code]
Patents
Awards and Honors
Research Experience
Research Projects
Services
© 2023 Sangmin Bae. Thanks to Dr. Hwanjun Song for the template.