BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:America/Denver
X-LIC-LOCATION:America/Denver
BEGIN:DAYLIGHT
TZOFFSETFROM:-0700
TZOFFSETTO:-0600
TZNAME:MDT
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0600
TZOFFSETTO:-0700
TZNAME:MST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260422T000713Z
LOCATION:506
DTSTART;TZID=America/Denver:20231113T090500
DTEND;TZID=America/Denver:20231113T093000
UID:submissions.supercomputing.org_SC23_sess452_misc279@linklings.com
SUMMARY:ISAV23 Invited Keynote – Progress in In-Situ Analysis and Visualiz
 ation in the Fusion Exascale Code XGC
DESCRIPTION:Choongseok Chang (Princeton Plasma Physics Laboratory, Princet
 on University); Scott Klasky, David Pugmire, Jong Youl Choi, and Ana Gaina
 ru (Oak Ridge National Laboratory (ORNL)); and John K Wu and Junmin Gu (La
 wrence Berkeley National Laboratory (LBNL))\n\nExascale computers are beco
 ming a playground for scientific discovery. Using the extreme-scale kineti
 c fusion PIC code XGC as a proxy, this presentation will demonstrate the c
 hallenges and opportunities of in situ analysis, reduction, and visualizat
 ion in our high-performance computing ecosystem, with new contributions we
 have made therein. The first topic is enabling HPC science studies th
 at have been difficult due to the gap in memory size relative to FLOP
 S. Often, first-principles-level scientific analysis requires deep-le
 vel identification and indexing of high-dimensional simulation object
 s, which amplify the memory requirements to an impractically high lev
 el. Developing in situ approaches for our time and phase-space analys
 is and visualization, which consider specific features, minimizes th
 e node-memory requirement and enables such studies. Another concern i
 s addressing the growing gap between compute speed and I/O bandwidt
 h. Our data is analyzed, visualized, and compressed while being gener
 ated, without first storing it to a file system. This enables faste
 r scientific discovery and quicker feedback to next-day experimenta
 l or simulation inputs. We also consider the potential for increase
 d accuracy, where fine temporal and phase-space sampling of transien
 t analysis might expose complex behavior missed by the coarse sampli
 ng that is often necessitated by an offline approach. There is als
 o the possibility of assessing the error and uncertainty in the pre
 dictability of the target science in parallel with the simulation, w
 hich could enable automated/AI-assisted simulation steering. Finall
 y, we discuss building more complete databases for AI/ML training vi
 a automated identification and healing of scientific simulation dat
 a sets in the phase spaces where previous simulation or experimenta
 l data do not exist.\n\nTag: Data Analysis, Visualization, and Stora
 ge, Large Scale Systems, Performance Measurement, Modeling, and Tool
 s\n\nRegistration Category: Workshop Reg Pass\n\nSession Chairs: E. W
 es Bethel (San Francisco State University, Lawrence Berkeley Nationa
 l Laboratory (LBNL)); Nicola Ferrier (Argonne National Laboratory (A
 NL), University of Chicago); Axel Huebl (Lawrence Berkeley National
 Laboratory (LBNL)); Tom Vierjahn (Westphalian University of Appli
 ed Sciences); and Sean Ziegeler (US Department of Defense High Perf
 ormance Computing Modernization Program (DoD HPCMP))\n\n
END:VEVENT
END:VCALENDAR
