I recently wrote a post of quotes from Albert Einstein on my blog. As I was researching, my curiosity was piqued regarding the Theory of Everything and how Data Science might be applied to take a step toward advancing it. I thought to myself that this would make an interesting thought exercise!
These quotes from Einstein particularly caught my attention:
- “Quantum mechanics is certainly interesting. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the ‘old one’. I, at any rate, am convinced that He is not playing at dice”
- “I cannot seriously believe in it [quantum theory] because the theory cannot be reconciled with the idea that physics should represent a reality in time and space, free from spooky actions at a distance”
- “People like us, who believe in physics, know that the distinction between past, present and future is only a stubbornly persistent illusion”
- “I believe in intuitions and inspirations. I sometimes feel that I am right. I do not know that I am”
And then I thought that perhaps some principles from Data Science could help. Here are the Data Science constructs I will draw on:
- Logistic Regression
- Correlation Matrix
- Logarithmic Transformation
- Linear Regression
We start with our equations E = mc² and E = hf (which can also be written E = hc/λ). We know E = mc² applies to large masses, while E = hc/λ relates to infinitesimally small particles and waves. This is a perfect case for Logistic Regression, because the two equations form a discrete, binary classification: either 1 or 0. They also represent boundary conditions, so we can set each one to zero in turn. The E = mc² part of the equation stands alone and is correct as-is: for large masses it takes a value, and for infinitesimally small masses it goes to zero. The E = hc/λ part takes a value for infinitesimally small masses and waves; for large masses, however, the wavelength goes to zero and creates a division-by-zero problem, so we need a mass-dependent coefficient alongside hf that can drive the term to zero for large masses. We will also have to apply a Logarithmic Transformation before analyzing the data, since it is skewed across many orders of magnitude. We can then use a Correlation Matrix to test different mathematical construct candidates and, finally, run a Linear Regression on the best candidate.
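Sketching what that pipeline could look like helps me keep the steps straight. Below is a minimal Python sketch of the classification step; the SI units, the synthetic mass range, and the arbitrary 1 kg cutoff used to label the two regimes are all my own assumptions for illustration, not anything dictated by the physics:

```python
# Minimal sketch: log transform + logistic regression over two mass regimes.
# Assumptions: SI units, synthetic mass grid, arbitrary 1 kg regime cutoff.
import numpy as np
from sklearn.linear_model import LogisticRegression

c = 2.998e8                        # speed of light, m/s
mass = np.logspace(-17, 14, 200)   # masses spanning ~31 orders of magnitude, kg

# Logarithmic Transformation: raw values are far too skewed to model directly.
log_mass = np.log10(mass).reshape(-1, 1)

# Logistic Regression: treat the two regimes as a binary (1/0) classification.
regime = (mass >= 1.0).astype(int)        # 1 = "large mass", 0 = "small mass/wave"
clf = LogisticRegression().fit(log_mass, regime)

print(clf.predict(np.log10([[1e-12], [1e6]])))   # expected: [0 1]
```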
After some time exploring candidates, I came upon a promising one for the coefficient: 1 / (1 − Log(mc²))². I used a small data set in which I varied the mass from 1E-17 to 1E+14. I plotted the logarithmic transformation of the coefficient in Tableau and got an R-squared of 0.85 and a p-value of 0.0004, which I think is pretty good for a hunt for a needle in a haystack.
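For the curious, here is a rough sketch of how that coefficient search could be reproduced in Python instead of Tableau. The base-10 logarithm, SI units, and 200-point mass grid are my assumptions, so the statistics it prints are illustrative rather than the exact figures above:

```python
# Rough sketch: evaluate the candidate coefficient 1 / (1 - log10(m*c^2))^2
# over the same mass range, log-transform it, and run a linear regression.
import numpy as np
from scipy.stats import linregress

c = 2.998e8                        # speed of light, m/s
mass = np.logspace(-17, 14, 200)   # 1E-17 to 1E+14, as in the post

coeff = 1.0 / (1.0 - np.log10(mass * c**2))**2   # candidate coefficient
log_coeff = np.log10(coeff)                      # logarithmic transformation

# Linear Regression of the log-transformed coefficient against log-mass.
fit = linregress(np.log10(mass), log_coeff)
print(f"R-squared = {fit.rvalue**2:.3f}, p-value = {fit.pvalue:.2e}")
```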
Update 2/22/22: After listening to chapter 9 of Frank Wilczek’s audiobook “The Lightness of Being”, my thinking is that, for further consideration, a construct of [(1/√2) · probabilistic amplitude(up, down)]² · [(1/√2) · probability of observing the spin(up, down)]² needs to be investigated, since it is similar in nature to the construct I found.
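To make the 1/√2 construct a little more concrete, here is a tiny numerical illustration, assuming an equal superposition of spin up and spin down; the product at the end is just my reading of the construct above, not a standard textbook quantity:

```python
# Tiny illustration: squaring the 1/sqrt(2) amplitude gives the observation
# probability (Born rule); the final product mirrors the construct in the update.
import numpy as np

amplitude = 1 / np.sqrt(2)        # probability amplitude for each spin state
prob_up = abs(amplitude)**2       # probability of observing spin up  -> 0.5
prob_down = abs(amplitude)**2     # probability of observing spin down -> 0.5

print(prob_up, prob_down)         # 0.5 0.5 (they sum to 1)
print(prob_up * prob_down)        # 0.25, the product-style construct
```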
Update 4/6/22: An interesting quote from Nikola Tesla dovetails with this thought exercise; see my blog post Nikola Tesla | Wisdom from the Warehouse on Patreon.
Update 8/25/22: After reading “The God Equation: The Quest for a Theory of Everything” by Michio Kaku, I have a few more observations. As per Nikola Tesla, progress will be made when we look to vibrations and non-physical phenomena. String theory, with its 10 dimensions, fits the bill! Plus, since dimensions 5-10 are non-physical, their component in the final equation will go to zero.