Sangmin Bae


Research Scientist, OSI Lab

Graduate School of AI, KAIST
85 Hoegi-ro, Dongdaemun-gu, Seoul, Korea

Email: bsmn0223xkxkxk@kaist.ac.kr / bsmn0223xkxkxk@gmail.com
CV, LinkedIn, Twitter, GitHub

Welcome to my page! I am a Research Scientist striving to become a versatile, T-shaped expert in AI. While I have primarily focused on
Computer Vision, I have also explored other AI domains, including NLP, audio, tabular data, and video, to broaden my knowledge and expertise.

My research interests lie in Efficient AI: exploring training- and data-efficient approaches to make AI more accessible and sustainable.
My research areas include Self-Supervised Learning, Federated Learning, Generative AI, and Multimodal Learning.

News

Oct. 2023: 🎊 A Long Paper on 'Fast and Robust Early-Exiting Framework' accepted at EMNLP 2023.

Jun. 2023: 🚀 Two poster sessions at CVPR 2023 in Vancouver.

May 2023: 🎊 A Paper on 'Respiratory Sound Classification' accepted at INTERSPEECH 2023.

Feb. 2023: 🎊 Two Papers on 'Self-Supervised Learning' and 'Federated Active Learning' accepted at CVPR 2023.

 

Education
  • Ph.D. student in Graduate School of AI, KAIST. Advised by Prof. Se-Young Yun.   Mar. 2021 - Present
  • M.S. in Industrial and Systems Engineering, KAIST. Advised by Prof. Se-Young Yun.   Mar. 2019 - Feb. 2021
  • B.S. in Industrial and Systems Engineering.   Mar. 2014 - Feb. 2019

    Publications (Google Scholar)
    *: co-first authors, : corresponding authors, C: conferences, J: journals, W: workshops, P: preprints

    2023
    [W6] June-Woo Kim, Chihyeon Yoon, Miika Toikkanen, Sangmin Bae, Ho-Young Jung. Adversarial Fine-tuning using Generated Respiratory Sound to Address Class Imbalance. Neural Information Processing Systems Workshop on Deep Generative Models for Health (NeurIPSW) 2023.
    [W5] Felix den Breejen, Sangmin Bae, Stephen Cha, Tae-Young Kim, Seoung-Hyun Koh, Se-Young Yun. Exploring the Retrieval Mechanism for Tabular Deep Learning. Neural Information Processing Systems Workshop on Table Representation Learning (NeurIPSW) 2023.
    [P2] June-Woo Kim, Sangmin Bae, Won-Yang Cho, Byungjo Lee, Ho-Young Jung. Stethoscope-guided Supervised Contrastive Learning for Cross-domain Adaptation on Respiratory Sound Classification. Preprint 2023.
    [C6] Sangmin Bae*, Jongwoo Ko*, Hwanjun Song, Se-Young Yun. Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding. Conference on Empirical Methods in Natural Language Processing (EMNLP) Long Paper 2023. [pdf] [code]
    [C5] Sangmin Bae*, June-Woo Kim*, Won-Yang Cho, Hyerim Baek, Soyoun Son, Byungjo Lee, Changwan Ha, Kyongpil Tae, Sungnyun Kim, Se-Young Yun. Patch-Mix Contrastive Learning with Audio Spectrogram Transformer on Respiratory Sound Classification. Conference of the International Speech Communication Association (INTERSPEECH) 2023. [pdf] [code]
    [C4] Sungnyun Kim*, Sangmin Bae*, Se-Young Yun. Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023. [pdf] [code]
    [C3] Sangmook Kim*, Sangmin Bae*, Hwanjun Song, Se-Young Yun. Re-thinking Federated Active Learning based on Inter-class Diversity. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023. [pdf] [code]
    [C2] Sangmin Bae*, Sungnyun Kim*, Jongwoo Ko, Gihun Lee, Seungjong Noh, Se-Young Yun. Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network. AAAI Conference on Artificial Intelligence (AAAI) 2023. Oral Presentation. [pdf] [code]

    2022
    [C1] Gihun Lee*, Minchan Jeong*, Yongjin Shin, Sangmin Bae, Se-Young Yun. Preservation of Global Knowledge by Not-True Distillation in Federated Learning. Neural Information Processing Systems (NeurIPS) 2022. [pdf] [code]
    [W4] Sungnyun Kim*, Sangmin Bae*, Se-Young Yun. Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning. Neural Information Processing Systems Workshop on Self-Supervised Learning: Theory and Practice (NeurIPSW) 2022. [pdf]
    [W3] Sangmook Kim*, Sangmin Bae*, Hwanjun Song, Se-Young Yun. LG-FAL: Federated Active Learning Strategy using Local and Global Models. International Conference on Machine Learning Workshop on Adaptive Experimental Design and Active Learning in the Real World (ICMLW) 2022. [pdf]

    2020
    [W2] Sungnyun Kim*, Gihun Lee*, Sangmin Bae*, Se-Young Yun. MixCo: Mix-up Contrastive Learning for Visual Representation. Neural Information Processing Systems Workshop on Self-Supervised Learning: Theory and Practice (NeurIPSW) 2020. [pdf] [code]
    [P1] Taehyeon Kim*, Sangmin Bae*, Jin-woo Lee, Se-Young Yun. Accurate and Fast Federated Learning via Combinatorial Multi-Armed Bandits. Preprint 2020. [pdf]
    [W1] Gihun Lee*, Sangmin Bae*, Jaehoon Oh, Se-Young Yun. SIPA: A Simple Framework for Efficient Networks. IEEE International Conference on Data Mining Workshop on Big Data Analysis for Smart Energy (ICDMW) 2020. [pdf] [code]

     

    Patents
  • Se-Young Yun, Seongyoon Kim, Woojin Chung, Sangmin Bae. Toward Enhanced Representation for Federated Re-Identification by Not-True Self-Knowledge Distillation. Korea Patent Application.   Aug. 2022
  • Jaehoon Oh, Sangmook Kim, Se-Young Yun, Sangmin Bae, Jaewoo Shin, Seongyoon Kim, Woojin Chung. Federated Learning System for Performing Individual Data Customized Federated Learning, Method for Federated Learning, and Client Apparatus for Performing Same. Korea and US Patent Application.
    Jun. 2022, Oct. 2022
  • Gihun Lee, Minchan Jeong, Se-Young Yun, Sangmin Bae, Jaeyeon Ahn, Seongyoon Kim, Woojin Chung. System, Method, Computer-Readable Storage Medium and Computer Program for Federated Learning of Local Model based on Learning Direction of Global Model. Korea and US Patent Application.
    Jun. 2022, Oct. 2022

    Awards and Honors
  • Two Best Presentation Awards from Korea Computer Congress (KCC).   Aug. 2022
  • Best Paper Award (5th Place) from Korean AI Association and LG AI Research (JKAIA).   Nov. 2021
  • MicroNet Challenge 4th Place at NeurIPS Workshop.   Oct. 2019
  • Alumni Scholarship from KAIST.   Mar. 2017 - Feb. 2019
  • Dean's List (Top 3%), Faculty of Engineering, KAIST.   Spring 2017

    Research Projects
  • [NIER] Short-term Prediction of Particulate Matter via Artificial Intelligence. Project Manager.   Mar. 2023 - Present
  • [KT] Neural Architecture Search for Detecting Communication Network Failure. Project Manager.   Apr. 2022 - Feb. 2023
  • [ETRI] Lightweight Edge Device Technology via Federated Learning. Project Manager.   Mar. 2021 - Sep. 2022
  • [SK Hynix] Semantic Segmentation to Detect Errors in Wafer Process.   Feb. 2021 - Sep. 2021
  • [ETRI] Data-efficient Unsupervised Representation Learning.   Mar. 2020 - Dec. 2020
  • [ETRI] Model Compression for Big Data Edge Analysis.   Jun. 2019 - Oct. 2019
  • [Hankook Tire and Technology] Compound Prediction with Artificial Intelligence and Auto-ML.   Mar. 2019 - Feb. 2020

    Research Experience
  • Research Collaboration with Seoul National University Bundang Hospital.   Jul. 2023 - Present
  • Research Collaboration with MODULABS.   Sep. 2022 - Present
  • Research Collaboration with NAVER AI, advised by Hwanjun Song.   Jan. 2022 - Jan. 2023
  • Research Internship at Kakao Recommendation Team.   Sep. 2018 - Feb. 2019
  • Research Internship at Optimization and Statistical Inference Lab, KAIST.   Jul. 2018 - Aug. 2018
  • Research Internship at Human Factors and Ergonomics Lab, KAIST.   Dec. 2017 - Jun. 2018
  • Exchange Student at Linköping University, Sweden.   Jul. 2017 - Aug. 2017

    Services
  • Server Manager at KAIST AI.   Mar. 2021 - Feb. 2023
  • Student Leader at OSI Lab, KAIST.   Mar. 2021 - Mar. 2022
  • Teaching Assistant.
  • Instructor for DL and ML courses.

  • © 2023 Sangmin Bae. Thanks to Dr. Hwanjun Song for the template.