Smallish Post 1

Big Brain Abstraction
Sep 15, 2020

I’ve been using math all my life, but one thing that sticks out to me in particular is factoring. When factoring in math, you take a common value, such as an integer or a variable, and factor it out to simplify an expression without changing its value. The idea of factoring could also be applied to computers. Since computers are binary and their values are determined by sequences of 1s and 0s, factoring could technically apply, but instead of factoring out a common value, you could factor out a common sequence. This could potentially make storing data in memory more efficient, and perhaps less cumbersome. Factoring in math also exposes critical points, so perhaps factoring in programming could expose constraints, if there were any.
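The idea above resembles what compression schemes call dictionary substitution: a repeated sequence is stored once and each occurrence is replaced by a short reference. Here is a minimal sketch in Python (the function names `factor_out` and `expand` are my own, purely illustrative) showing how "factoring out" a common byte sequence shortens the data without losing information, just as factoring simplifies an expression without changing it:

```python
def factor_out(data: bytes, pattern: bytes, token: bytes = b"\x00") -> tuple[bytes, dict]:
    """Replace every occurrence of `pattern` with a short `token`,
    recording the mapping so the original data can be rebuilt."""
    assert token not in data, "token must not already appear in the data"
    factored = data.replace(pattern, token)
    return factored, {token: pattern}

def expand(factored: bytes, table: dict) -> bytes:
    """Undo the factoring: substitute each token back with its pattern."""
    for token, pattern in table.items():
        factored = factored.replace(token, pattern)
    return factored

data = b"ABABAB-HELLO-ABABAB"
factored, table = factor_out(data, b"ABABAB")
assert expand(factored, table) == data   # lossless, like algebraic factoring
assert len(factored) < len(data)         # shorter representation
```

Real compressors (e.g. the LZ77 family) generalize this by discovering the common sequences automatically rather than being told which pattern to factor out.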
