Week 5: Time to move on
March 29, 2024
Hey folks, welcome back to my blog. This post will be a bit shorter than previous ones because, as of yesterday, I am in Taiwan. Nevertheless, I still made great progress, so let's get into it.
The program I implemented last week to solve exponential ODEs (using the numpy autograd library), which computes a loss between the estimated solution and its gradient, was tested on trigonometric and polynomial functions. In both cases, the algorithm worked to a high degree of accuracy. For reproducibility, I tried to reimplement this model in PyTorch Lightning, and this is where I got stuck. I could not get this type of model to work in PyTorch because of differences between PyTorch autograd and numpy autograd. It was difficult to debug because PyTorch Lightning automates the training process, abstracting away the parts of the code that might be causing the problem. Assuming both autograd libraries worked the same way was wishful thinking; I was comparing apples to oranges. Though I am not giving up on reimplementing it in PyTorch, I have to move on.
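To make the idea concrete, here is a rough sketch of what the autograd version looks like for the exponential case u' = u. The network architecture, names, and hyperparameters below are illustrative placeholders, not my exact code:

```python
import autograd.numpy as np
from autograd import grad, elementwise_grad
from autograd.misc.optimizers import adam

def u(params, x):
    # One-hidden-layer network as the trial solution u(x); the real architecture may differ.
    w1, b1, w2, b2 = params
    h = np.tanh(np.outer(x, w1) + b1)
    return (h @ w2 + b2).flatten()

du_dx = elementwise_grad(u, 1)  # derivative of the network output with respect to x

def loss(params, x):
    residual = du_dx(params, x) - u(params, x)        # enforce the ODE u' = u
    ic = (u(params, np.array([0.0]))[0] - 1.0) ** 2   # enforce the initial condition u(0) = 1
    return np.mean(residual ** 2) + ic

# Train with autograd's built-in Adam optimizer.
x_train = np.linspace(0.0, 2.0, 50)
init = [np.random.randn(16) * 0.1, np.zeros(16),
        np.random.randn(16, 1) * 0.1, np.zeros(1)]
params = adam(grad(lambda p, i: loss(p, x_train)), init,
              step_size=0.01, num_iters=2000)
```

The key point is that the trained network itself is the solution: the loss never sees true y values, only the mismatch between the network's derivative and the ODE's right-hand side.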
Using PyTorch Lightning, I implemented my initial approach of simulating x and y values and asking the model to fit the data. I failed to get this approach to work two weeks ago, but this week I was able to find the right hyperparameters. This model also worked on various functions to a high degree of accuracy, which I am happy about. The fix was as simple as adjusting the sample size, maximum epochs, and batch size. The training process was also faster.
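Here is a minimal sketch of that data-fitting setup in PyTorch Lightning. Again, the architecture and the sample size, batch size, and max_epochs values are placeholders, not the exact hyperparameters I landed on:

```python
import torch
import pytorch_lightning as pl
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

class ODEFitter(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # Small MLP mapping x to y; the real network may be sized differently.
        self.net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        # Plain supervised regression against the simulated (x, y) pairs.
        return nn.functional.mse_loss(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Simulated data for the exponential solution y = exp(x).
x = torch.linspace(0.0, 2.0, 1000).unsqueeze(1)
y = torch.exp(x)
loader = DataLoader(TensorDataset(x, y), batch_size=64, shuffle=True)

trainer = pl.Trainer(max_epochs=200)
trainer.fit(ODEFitter(), loader)
```

Unlike the autograd version, this one needs the true solution values up front, which is why sample size matters so much here.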
Moving forward with the project, I've decided to implement systems of ODEs in both of the approaches above, while reimplementing the autograd method in PyTorch Lightning on the side. Once I get a simple system of ODEs to work, I will move on to the Lorenz system, sketched below.
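For reference, the Lorenz system is a set of three coupled ODEs famous for chaotic behavior. Here it is as a small Python function, using the classic parameter values (which I haven't committed to for this project yet):

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([
        sigma * (y - x),    # dx/dt
        x * (rho - z) - y,  # dy/dt
        x * y - beta * z,   # dz/dt
    ])
```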
That is all the progress I made this week. See you guys next time!