Invited Talks


  A Tutorial on ML for Next-Generation LEO Satellite Communications

Prof. Changhee Lee
Assistant Professor
Korea University
Abstract
Low Earth Orbit (LEO) satellite mega-constellations are revolutionizing global connectivity, yet their operational environment is defined by unprecedented challenges: extreme mobility, dynamic network topologies, frequent handovers, and variable link quality. Traditional network management paradigms are ill-equipped to handle this complexity at scale. This tutorial provides an expert-level overview of how Machine Learning (ML) is being harnessed to create next-generation LEO satellite communications. We will begin by dissecting the core challenges of LEO communications. The tutorial will then delve into three critical pillars of ML application: 1) Time-Series Forecasting for proactive resource management, 2) Anomaly Detection for ensuring operational resilience, and 3) Explainable AI (XAI) for building human trust in autonomous network operations.
Biography
Changhee Lee is a machine learning and deep learning expert with a research focus on time-series analysis, particularly performance improvement and interpretable AI for generating actionable insights. He is an Assistant Professor at Korea University, where he leads the Actionable Intelligence Lab, conducting research on integrating multimodal data, developing individualized models of care through prognostic and causal inference, extracting scientific knowledge from data, interpreting black-box machine learning models, and scaling advanced ML methods for real-world impact. His work has been published in top-tier AI venues, including NeurIPS, ICML, ICLR, and AISTATS. Before joining Korea University, he was an Assistant Professor in the Department of Artificial Intelligence at Chung-Ang University. He earned his Ph.D. from the University of California, Los Angeles.

  AI-RAN and the Future of Resource Management

Prof. Hyun Jong Yang
Associate Professor
Seoul National University
Abstract
The rapid evolution of wireless communications has led to increasing demands for intelligent, adaptive, and efficient resource allocation. Recently, the concept of AI-RAN has emerged as a key enabler for addressing these challenges. By integrating machine learning and AI-driven optimization into wireless systems, AI-RAN enables dynamic spectrum management, energy-efficient operations, and user-centric resource allocation. This talk will introduce recent trends in AI-RAN-based resource management, highlighting state-of-the-art approaches, current challenges, and future research directions that are shaping the next generation of wireless communications.
Biography
Dr. Hyun Jong Yang is an Associate Professor in the Department of Electrical and Computer Engineering at Seoul National University (SNU), Seoul, Korea, where he has been since Sept. 2024. He received the B.S. degree in electrical engineering from the Korea Advanced Institute of Science and Technology (KAIST), Korea, in 2004, and the M.S. and Ph.D. degrees in electrical engineering from KAIST in 2006 and 2010, respectively. From Aug. 2010 to Aug. 2011, he was a research fellow at the Korea Research Institute of Ships & Ocean Engineering (KRISO), Korea. From Oct. 2011 to Oct. 2012, he was a post-doctoral researcher in the Electrical Engineering Department at Stanford University, Stanford, CA. From Oct. 2012 to Aug. 2013, he was a Staff II Systems Design Engineer at Broadcom Corporation, Sunnyvale, CA, where he developed physical-layer algorithms for LTE-A MIMO receivers and served as a Broadcom delegate to 3GPP RAN1 standardization meetings. From Sept. 2013 to July 2020, he was an assistant and then associate professor in the School of Electrical and Computer Engineering at UNIST, Korea, and from July 2020 to Aug. 2024 he was an associate professor in the Department of Electrical Engineering at Pohang University of Science and Technology (POSTECH), Pohang, Korea.

  Space-Time Beamforming for Satellite Communications: Enabling Extremely Narrow Beams

Prof. Namyoon Lee
Professor
POSTECH
Abstract
Inter‑beam interference is a central challenge in low‑Earth‑orbit (LEO) satellite communications, driven by dense constellations with overlapping beams and aggressive frequency reuse. We introduce space–time beamforming, a paradigm that exploits the composite space–time channel vector—parameterized by angle of arrival (AoA) and relative Doppler—to jointly optimize beamforming between a moving satellite and a ground user. By synthesizing a virtual array‑of‑subarrays across successive transmissions, the method effectively expands the aperture and forms ultra‑narrow beams, sharply suppressing leakage to neighboring users. This spatial selectivity comes with a controllable rate trade‑off due to temporal repetition. In this talk, I will present the principles of space–time beamforming and show, through analysis and simulation, how it mitigates inter‑beam interference in dense LEO downlinks and enables robust Direct‑to‑Cell satellite communications.
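The aperture-expansion idea above can be sketched numerically. The toy model below (all parameters, and the linear angle-Doppler coupling `nu = 0.3*sin(theta)`, are illustrative assumptions, not taken from the talk) builds a composite space-time steering vector as the Kronecker product of a temporal Doppler vector and a spatial steering vector, then compares the half-power beamwidth of the resulting beam against a spatial-only beam:

```python
import numpy as np

N = 8   # satellite antennas (assumed)
T = 4   # repeated transmissions, i.e., temporal snapshots (assumed)

def spatial_steer(n, theta):
    # Half-wavelength ULA steering vector at angle theta
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

def temporal_steer(t, nu):
    # nu: normalized Doppler phase progression per snapshot
    return np.exp(1j * 2 * np.pi * nu * np.arange(t))

def space_time_steer(theta, nu):
    # Composite space-time vector: Kronecker of temporal and spatial parts
    return np.kron(temporal_steer(T, nu), spatial_steer(N, theta))

def norm_gain(w, v):
    # Normalized beam gain in [0, 1]
    return np.abs(np.vdot(w, v)) / (np.linalg.norm(w) * np.linalg.norm(v))

# Steer at boresight. For a LEO pass, a user's relative Doppler is coupled to
# its angle, so off-axis users also see offset Doppler (linear coupling assumed).
w_st = space_time_steer(0.0, 0.0)
w_sp = spatial_steer(N, 0.0)
thetas = np.linspace(-np.pi / 2, np.pi / 2, 2001)
nus = 0.3 * np.sin(thetas)  # assumed angle-Doppler coupling

gains_st = np.array([norm_gain(w_st, space_time_steer(th, nu))
                     for th, nu in zip(thetas, nus)])
gains_sp = np.array([norm_gain(w_sp, spatial_steer(N, th)) for th in thetas])

def hpbw(g, th):
    # Half-power beamwidth: extent of the region above -3 dB
    mask = g >= np.sqrt(0.5)
    return th[mask][-1] - th[mask][0]

print(f"spatial-only HPBW: {np.degrees(hpbw(gains_sp, thetas)):.2f} deg")
print(f"space-time  HPBW: {np.degrees(hpbw(gains_st, thetas)):.2f} deg")
```

Because the N·T-element virtual array has a larger effective aperture than the N-element physical array, the space-time beam is measurably narrower, which is the mechanism the abstract credits for suppressing leakage to neighboring users, at the cost of T-fold temporal repetition.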
Biography
Namyoon Lee (Senior Member, IEEE) received his Ph.D. degree from The University of Texas at Austin in 2014. He was with the Communications and Network Research Group at the Samsung Advanced Institute of Technology, South Korea, from 2008 to 2011, and later worked at Wireless Communications Research, Intel Labs, Santa Clara, CA, USA, from 2015 to 2016. He has been a Professor at Pohang University of Science and Technology (POSTECH) since 2016. His research interests include communications theory, with a focus on advanced multi-antenna communications and error-correction coding technologies.
Dr. Lee has received several prestigious awards, including the 2016 IEEE ComSoc Asia-Pacific Outstanding Young Researcher Award, the 2020 IEEE Best Young Professional Award (Outstanding Nominee), the 2021 IEEE-IEIE Joint Award for Young Engineer and Scientist, and the 2021 KICS Haedong Young Engineering Researcher Award. He has actively contributed to IEEE journals and conferences, serving as an Associate Editor for IEEE Communications Letters from 2018 to 2020 and IEEE Transactions on Vehicular Technology from 2021 to 2023. He was also a Guest Editor for IEEE Communications Magazine in the Special Issue on Near-Field MIMO Technologies Toward 6G. Since 2021, he has been an Associate Editor for IEEE Transactions on Wireless Communications and IEEE Transactions on Communications.

  Agentic AI-based Mobile Network Management for 6G

Dr. Haneul Ko
Associate Professor
Kyung Hee University
Abstract
With the emergence of 6G, there is growing interest in leveraging Agentic AI for autonomous and intelligent network management. In this talk, I introduce the concept of Agentic AI-based mobile network management for 6G. I also provide an overview of recent studies on LLM-based mobile network optimization and management. To evaluate the effectiveness of Agentic AI, I introduce a dedicated testbed. On this testbed, I present a representative network function (NF) scaling use case where multiple AI agents coordinate to monitor traffic patterns, predict future demand, and determine optimal scaling actions in real time.
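The monitor-predict-scale pipeline described above can be sketched as a minimal control loop. Everything in this sketch is hypothetical: the class names, the linear-extrapolation predictor standing in for an LLM/ML forecaster, and the per-NF capacity and headroom parameters are illustrative assumptions, not details of the testbed.

```python
import math
from collections import deque

class TrafficMonitor:
    """Agent 1: collects a sliding window of observed traffic load."""
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)
    def observe(self, load):
        self.samples.append(load)
    def history(self):
        return list(self.samples)

class DemandPredictor:
    """Agent 2: forecasts the next-interval demand.

    Stand-in for an LLM/ML forecaster: linear extrapolation of the
    last two samples."""
    def predict(self, history):
        if len(history) < 2:
            return history[-1] if history else 0.0
        return history[-1] + (history[-1] - history[-2])

class ScalingAgent:
    """Agent 3: picks the NF replica count for the predicted demand."""
    def __init__(self, capacity_per_nf=100.0, headroom=0.2):
        self.capacity = capacity_per_nf
        self.headroom = headroom
    def decide(self, predicted_load):
        # Keep per-replica utilization below (1 - headroom)
        target = predicted_load / (self.capacity * (1 - self.headroom))
        return max(1, math.ceil(target))

monitor, predictor, scaler = TrafficMonitor(), DemandPredictor(), ScalingAgent()
for load in [50, 80, 120, 170, 230]:   # rising traffic pattern
    monitor.observe(load)
replicas = scaler.decide(predictor.predict(monitor.history()))
print(f"scale to {replicas} NF replicas")
```

With the rising trace above, the predictor extrapolates a load of 290, and at 100 units of capacity per replica with 20% headroom the scaler requests 4 replicas; in a real deployment the decision would be handed to an orchestrator rather than printed.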
Biography
Haneul Ko is an Associate Professor in the Department of Electronic Engineering at Kyung Hee University, Korea. He received his B.S. and Ph.D. degrees in Electrical Engineering from Korea University in 2011 and 2016, respectively. He received the IEEE ComSoc APB Outstanding Young Researcher Award (2022) and the KICS Haedong Young Engineer Award (2023). His research interests include 5G/6G networks, network automation, mobile cloud computing, and SDN/NFV.

  Space radiation effects in GaN-based devices

Dr. Dong-Seok Kim
Senior Researcher
Korea Atomic Energy Research Institute
Abstract
GaN-based devices are attractive as components of radiation-hardened applications due to their high atomic displacement threshold energy. Because protons are abundant in space, the proton irradiation effects on AlGaN/GaN HEMTs have been studied by many research groups; however, most studies have focused on the difference in performance before and after proton irradiation. To accurately evaluate the space radiation effects on electronics operating in space radiation environments, the real-time effects on AlGaN/GaN HEMTs should be studied under operating bias conditions. We therefore investigated the synergistic effect between biasing and proton fluence, as well as the real-time drain and gate currents of devices under different bias conditions during 100 MeV proton irradiation.
Biography
Dong-Seok Kim received the B.S., M.S., and Ph.D. degrees in electronics engineering from Kyungpook National University (KNU), Republic of Korea, in 2008, 2010, and 2015, respectively. His doctoral research concerned the growth of III-nitride epitaxial structures and the fabrication of GaN-based devices. He is currently a senior researcher at the Korea Atomic Energy Research Institute (KAERI). His current research focuses on radiation effects on semiconductor devices such as HEMTs, MOSFETs, and TFTs, and on radiation-utilized applications such as nuclear batteries and sensors.