Currently, my main objective is to obtain final sets of data to analyze, so much of this week was spent making final adjustments to my models. Based on a suggestion from Dr. Tang, I looked at reducing the size of the model embeddings, which would allow the model to train faster. To my surprise, doing so resulted in a model with similar levels of accuracy, which will help me out a lot if I need to retrain a set of models. An unfortunate part of doing ML research is the large amount of training time necessary to obtain a good set of results. Because the data a model is trained on drastically affects its performance, best practice is to report a model’s performance as an average over several models trained on different data splits. As a result, I’ve been running a lot of models in the background as I work on the literature review for my writeup.
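
To give a sense of what that averaging looks like in practice, here is a minimal sketch of evaluating a model over several random train/test splits and reporting the mean score. This is just an illustration using a scikit-learn toy dataset and an off-the-shelf classifier, not my actual models or data.

```python
# Minimal sketch: average performance across several data splits,
# assuming a scikit-learn-style workflow (placeholder dataset and model).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)

accuracies = []
for seed in range(5):
    # Each seed produces a different train/test split of the same data.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=seed
    )
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    accuracies.append(accuracy_score(y_test, model.predict(X_test)))

# Report the mean (and spread) across splits rather than a single run.
print(f"accuracy: {np.mean(accuracies):.3f} ± {np.std(accuracies):.3f}")
```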

Outside of work, I’ve been spending my time trying to enjoy local spots I feel I’ve missed out on due to travelling. I visited several parks this week just to relax and tried a number of new food spots, one of which is where I got the incredibly tasty kimbap and dumplings I’m holding in the photo below.