Research Journal: Oct 2 - Oct 8
Facing an old project again
October 6: Facelift
I started reading through Jari Turkia’s R code so that I can convert it into a more user-friendly interface and make simulations easier to run. I named the new package mebnlearn in reference to the bnlearn package that I originally worked with. Must remember to credit Jari fully when the package is in a more advanced form.
Sonia recommended that we stick to the BN project for the January talk. That means this project needs to be accelerated so that there is something to present at the conference. This comes at a good time, since the Platform-of-1 project is blocked on running all of those simulations. Hopefully Sama’s laptop gets those done a lot faster; I’ll need that laptop to help with the BN facelift as well.
Some pressing questions that I need to answer with this facelift:
What are a few critical situations that should be simulated as test cases?
How should the N-of-1 context fit into this new mixed-effect Bayesian Network?
How does Jari’s code work?
What basic functionality needs to be in mebnlearn before it can be used to simulate?
What metrics should be used in the simulations?
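On the metrics question, one candidate I keep coming back to is the structural Hamming distance (SHD) between the learned and true DAGs, which bnlearn itself provides for R graphs. A minimal sketch of the idea in Python, assuming networks are represented as sets of directed edges (the `shd` helper here is hypothetical, not part of mebnlearn or bnlearn):

```python
# Hypothetical helper illustrating structural Hamming distance (SHD):
# the number of edge additions, deletions, and reversals needed to
# turn the learned DAG into the true DAG.

def shd(true_edges, learned_edges):
    true_edges = set(true_edges)
    learned_edges = set(learned_edges)
    missing = true_edges - learned_edges   # edges the learner failed to find
    extra = learned_edges - true_edges     # edges the learner added wrongly
    # A reversed edge should count once, not as a deletion plus an addition.
    reversals = {(b, a) for (a, b) in missing} & extra
    return len(missing) + len(extra) - len(reversals)

# Toy check: A->B is recovered, B->C is reversed (1), A->C is extra (1).
true_dag = {("A", "B"), ("B", "C")}
learned_dag = {("A", "B"), ("C", "B"), ("A", "C")}
print(shd(true_dag, learned_dag))  # 2
```

Whether SHD alone is enough for the mixed-effect setting (where the parameters matter as much as the structure) is exactly what the simulations need to answer.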
On the internship front, I need to use this third year to learn the basic deep learning architectures. I also need to develop solid knowledge of at least one deep learning framework. I think it’s going to end up being PyTorch, but I need to do some research into what’s used by Google since they’re the target.
Today was mostly a rest day. I boiled the scope of what I need to worry about for the January conference down to the slides that I’ll need to present:
An introduction to Bayesian networks
Extension to mixed-effect Bayesian network
How it was implemented
Simulation performance on N-of-1 data
Results on the Just Walk data
The full paper’s simulations and writing can wait; I can focus on these particular slides before January.
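For the mixed-effect extension slide, my current reading of Jari’s formulation (to be verified against his paper and code; the symbols below are my own) is that each node’s local distribution becomes a linear mixed-effects model, with the random effects carrying the person-level variation, which is where the N-of-1 context should plug in:

```latex
% Sketch of one node's local model in a mixed-effect BN (my notation):
%   y_{ij}: node value for person i at observation j
%   x_{ij}: values of the node's parents
%   beta:   fixed effects shared across people
%   b_i:    random effects for person i (the N-of-1 part)
y_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta}
       + \mathbf{z}_{ij}^{\top}\mathbf{b}_i
       + \varepsilon_{ij},
\qquad
\mathbf{b}_i \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_b),
\quad
\varepsilon_{ij} \sim \mathcal{N}(0, \sigma^2)
```

If this reading holds, it gives a clean one-slide bridge from the plain BN introduction to the N-of-1 simulation results.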
Temporarily shifting my time priority to writing and to finishing the ML course, so that all of my attention can then be dedicated to the other course and the Bayesian network project. Plus a small weekend break.