Hacker News

All machine learning is just renormalization, which in turn is convolution in a Hopf algebra. That's why you see superposition.
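For anyone unfamiliar with the term: "convolution" in a Hopf algebra is a precise operation, not a metaphor. Given a coproduct Δ, two linear functionals f, g convolve as (f * g) = m ∘ (f ⊗ g) ∘ Δ. A minimal sketch on the binomial Hopf algebra K[x], where Δ(xⁿ) = Σₖ C(n,k) xᵏ ⊗ xⁿ⁻ᵏ (the function `hopf_convolve` and the basis-list encoding are my own illustration, not from the comment):

```python
from math import comb

# Represent a linear functional on K[x] by its values on the basis 1, x, x^2, ...
# as a list: f[n] = f(x^n).

def hopf_convolve(f, g):
    """Convolution product induced by the coproduct
    Delta(x^n) = sum_k C(n,k) x^k (x) x^(n-k):
        (f * g)(x^n) = sum_k C(n,k) * f(x^k) * g(x^(n-k))."""
    n = min(len(f), len(g))
    return [sum(comb(m, k) * f[k] * g[m - k] for k in range(m + 1))
            for m in range(n)]

# Example: f(x^n) = 1 for all n corresponds to the exponential generating
# function e^t; convolving it with itself gives e^(2t), i.e. values 2^n.
f = [1] * 5
g = [1] * 5
print(hopf_convolve(f, g))  # [1, 2, 4, 8, 16]
```

Under the EGF correspondence, Hopf convolution of functionals is just multiplication of their generating functions, which is one way to see why it behaves like an ordinary convolution.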

"In physics, wherever there is a linear system with a "superposition principle", a convolution operation makes an appearance."

I'm working this out in more detail, but it's uncanny how well it works out.

I have a Discord if you want to discuss this further:

https://discord.cofunctional.ai



Do you mean all ML or just large neural networks? Where is renormalization in a tree model? What superposition are you referring to?


Renormalization is all about this kind of symmetric partitioning: coarse-graining degrees of freedom into blocks and rescaling so the blocks play the role the original variables did.
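To make "symmetric partitioning" concrete, here is a minimal sketch of the block-spin coarse-graining step from real-space renormalization (the `block` helper and the example chain are my own illustration, not from the comment):

```python
def block(chain, b):
    """One coarse-graining step: partition a 1D chain into blocks of b sites
    and replace each block by a single effective site (its average).
    Trailing sites that don't fill a block are dropped."""
    usable = len(chain) - len(chain) % b
    return [sum(chain[i:i + b]) / b for i in range(0, usable, b)]

spins = [1, 1, -1, 1, -1, -1, 1, 1]
print(block(spins, 2))  # [1.0, 0.0, -1.0, 1.0]
```

Iterating this map and asking which interactions survive is the renormalization-group flow; the analogy being drawn in this thread is to pooling / depth in deep networks.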


I suppose we should be cautious; the human mind is capable of overfitting too.


You have no clue what you are talking about.



