Final Exam - From Distributionally Robust Optimization to Broader Machine Learning Applications

Apr 5, 2023

09:00 AM


Virtual Event

PhD Candidate: Dixian Zhu


For training machine learning models, the conventional practice is to define a differentiable loss function for each data sample and minimize the averaged empirical loss. However, this is not the only way to train a model. In the last decade, an alternative philosophy, rooted in distributionally robust optimization, has been proposed: it trades off between optimizing the averaged individual loss and the maximal individual loss. By placing more weight on harder data samples, this approach can be more robust and achieve better performance.
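One common way to make this trade-off concrete is a log-sum-exp (softened max) aggregation of the per-sample losses, where a temperature parameter interpolates between the average loss and the maximal loss. The sketch below is an illustration of that general idea, not the candidate's specific formulation; the function name and the temperature parameter `tau` are my own choices.

```python
import numpy as np

def aggregated_loss(losses, tau):
    """Log-sum-exp aggregation of per-sample losses.

    tau -> infinity recovers the average loss;
    tau -> 0 recovers the maximal loss.
    """
    scaled = losses / tau
    # Shift by the max for numerical stability before exponentiating.
    m = np.max(scaled)
    return tau * (m + np.log(np.mean(np.exp(scaled - m))))

losses = np.array([1.0, 2.0, 4.0])
print(aggregated_loss(losses, 1e6))   # close to the mean, ~2.33
print(aggregated_loss(losses, 1e-6))  # close to the max, ~4.0
```

Intermediate values of `tau` give partial emphasis on the hardest samples, which is the robustness knob the abstract alludes to.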

Inspired by this philosophy, we propose to apply this high-level idea to various machine learning applications. For instance, it can be used not only to guide model training but also to query labeled data under the active learning paradigm. Additionally, we have discovered that the hard-attention mechanism naturally adapts to optimizing the partial area under the ROC curve (pAUC), which is especially significant for machine learning on imbalanced datasets such as medical and healthcare data. We have also found that this philosophy is related to the multi-class classification problem and its commonly used loss functions. To this end, we have proposed a unified loss function and investigated its properties to enhance multi-class classification performance. Lastly, we propose to apply this philosophy to multiple instance learning, where we aim to classify bags of data in which only a limited number of instances exhibit the pattern of interest.
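The connection between hard attention and pAUC mentioned above can be sketched as follows: restricting attention to the top-k highest-scoring (hardest) negatives corresponds to optimizing one-way pAUC over a range of false positive rates. This is a minimal illustration assuming a pairwise squared-hinge surrogate; the function name and the margin are illustrative choices, not the thesis's actual objective.

```python
import numpy as np

def one_way_pauc_loss(pos_scores, neg_scores, k, margin=1.0):
    """Pairwise squared-hinge surrogate over the k hardest negatives.

    Keeping only the top-k scoring negatives is the hard-attention
    step; averaging the pairwise losses over those negatives targets
    the partial AUC in the low false-positive-rate region.
    """
    hard_negs = np.sort(neg_scores)[-k:]  # k highest-scoring negatives
    diffs = margin - (pos_scores[:, None] - hard_negs[None, :])
    return np.mean(np.maximum(diffs, 0.0) ** 2)

pos = np.array([2.0])
neg = np.array([0.0, 1.5])
print(one_way_pauc_loss(pos, neg, k=1))  # only the hardest negative counts
print(one_way_pauc_loss(pos, neg, k=2))  # all negatives, i.e. full AUC surrogate
```

Shrinking `k` focuses the surrogate on the hardest negatives, mirroring how the averaged-vs-maximal-loss trade-off focuses training on hard samples.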

Advisor: Tianbao Yang

Please contact Dixian Zhu for further details if you wish to join his exam.

Individuals with disabilities are encouraged to attend all University of Iowa–sponsored events. If you are a person with a disability who requires a reasonable accommodation in order to participate in this program, please contact in advance at