BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:America/Denver
X-LIC-LOCATION:America/Denver
BEGIN:DAYLIGHT
TZOFFSETFROM:-0700
TZOFFSETTO:-0600
TZNAME:MDT
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0600
TZOFFSETTO:-0700
TZNAME:MST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260422T000605Z
LOCATION:DEF Concourse
DTSTART;TZID=America/Denver:20231116T100000
DTEND;TZID=America/Denver:20231116T170000
UID:submissions.supercomputing.org_SC23_sess300_spostu121@linklings.com
SUMMARY:How Much Noise Is Enough: On Privacy, Security, and Accuracy Trad
 e-Offs in Differentially Private Federated Learning
DESCRIPTION:Adhishree Kathikar (Indiana University)\n\nCentralized machine
  learning techniques have raised privacy concerns for users. Federated
  Learning (FL) mitigates this as a decentralized training system in
  which no raw data are communicated across the network to a centralized
  server. Instead, the machine learning model is trained locally on each
  device, and each device sends its locally trained model weights to a
  central server for aggregation. However, FL faces critical challenges.
  Security issues such as model poisoning via label flipping plague FL.
  Additionally, privacy concerns arise from data leakage through
  reconstruction of model weights. In this work, we apply differential
  privacy (which adds noise to the model weights before they are sent
  across the network) as an added measure to protect sensitive data from
  being reconstructed. Through this research, we study the effects of
  differential privacy on FL with respect to security and privacy
  trade-offs.\n\nRegistration Category: Tech Program Reg Pass, Exhibits
  Reg Pass\n\n
END:VEVENT
END:VCALENDAR
