BEGIN:VCALENDAR
X-WR-TIMEZONE:America/Chicago
PRODID:-//University of Iowa//Events 1.0//EN
VERSION:2.0
CALSCALE:GREGORIAN
BEGIN:VEVENT
DTSTAMP:20240329T122955Z
DTSTART:20230405T090000
SUMMARY:Final Exam - From Distributionally Robust Optimization to Broader Machine Learning Applications
DESCRIPTION:PhD Candidate: Dixian Zhu\n\nAbstract\n\nFor training machine learning models\, it is conventional to define a differentiable loss function for each data sample and optimize the averaged empirical loss. However\, researchers have questioned whether this is the only way to train a model. In the last decade\, an alternative approach based on a natural philosophy has been proposed: trading off between optimizing the averaged individual loss and the maximal individual loss. By focusing more on the harder data samples\, this approach can be more robust and perform better.\n\nInspired by this philosophy\, we propose to apply this high-level idea to various machine learning applications. For instance\, it can be used not only to guide model training but also to query labeled data under the active learning paradigm. Additionally\, we have discovered that the hard-attention mechanism can naturally adapt to optimizing the partial Area Under the ROC Curve\, which is especially significant for machine learning on imbalanced datasets such as medical and healthcare data. We have also found that this philosophy is related to the multi-class classification problem and its commonly used loss functions. To this end\, we have proposed a unified loss function and investigated its properties to enhance multi-class classification performance.
Lastly\, we propose to apply this philosophy to multiple instance learning\, where we aim to classify bags of data in which only a limited number of instances exhibit the patterns of interest.\n\nAdvisor: Tianbao Yang\n\nPlease contact Dixian Zhu for further details if you wish to join his exam.\n\nhttps://events.uiowa.edu/78550
LOCATION:null\, null
UID:edu.uiowa.events-prod-78550
END:VEVENT
END:VCALENDAR