Learning feature crosses, regularization, and logistic regression!
- Kavan Mehta
- Sep 13, 2021
- 2 min read
- Updated: Oct 25, 2022
During the past week, I continued my Google Developers course and learned about more machine learning topics: feature crosses, regularization to minimize overfitting, and the beginnings of logistic regression.

For feature crosses, I learned how we can turn non-linear problems into linear ones simply by crossing features, multiplying two or more features (or even a feature with itself) into a new synthetic feature (see the first sketch below).

Another topic I got the opportunity to learn about was regularization to minimize overfitting: preferring a simpler model over one that would overfit the data, while still maintaining good accuracy on the training and validation data. For this to happen, I learned that one must minimize loss + (lambda * complexity) rather than the loss alone. The complexity term is the sum of the squares of all the weights, and lambda is a coefficient that controls the trade-off between how simple the model is and how closely it fits the data. This newly gained knowledge showed me how I can use this formula to train future models properly (the second sketch below shows the idea).

I also just started learning about logistic regression and found it very interesting, so I hope to learn more as I continue my machine learning pathway. I still have questions about neural networks that I hope to get answered this coming week.
My goal for this week is to finish the Google Developers course by the end of the week, so that I am proficient with both linear models and more advanced ones, including neural networks and other advanced algorithms. Furthermore, I hope to start using TensorFlow to create some very basic and simple projects of my own in the next 2-3 weeks. I also think I will be doing a lot more theory as I continue learning machine learning. So see you next week, same place, same time.
