Posts

Showing posts from March, 2026

The Long Road to Understanding - Part III

If the first phase was curiosity, and the second was learning how to think, the third has been about learning how to work. The change was not dramatic. There was no moment where everything suddenly became easy or clear. The way I approached problems stopped depending on who was guiding me and started becoming something I could reproduce on my own. I began to notice patterns in how I made progress. When something felt confusing, it was usually because I had skipped a definition or was trying to move too quickly. Slowing down helped. Writing things out helped. Returning to first principles helped. These were simple ideas, but applying them consistently made a difference. Over time, this became a method. Not a rigid system, but a way of working that I could rely on. I would start by grounding myself in the basic objects of a subject, understanding what they were and how they behaved. From there, I would work through problems, not just to get answers, but to see how the ideas moved. When s...

Reasoning with sticks

One of the recurring patterns in my number theory study has been getting trapped in circular reasoning without noticing it. I read an explanation, restate it in my own words, and feel as if I have understood something. But when I try to justify the idea, the entire argument often turns out to be feeding itself. This happened most clearly when I was trying to understand why Euclid’s Algorithm works. Textbooks say, “Replacing a pair (a, b) with (b, a mod b) does not change the gcd.” I kept trying to explain this using statements like “the gcd stays the same when you subtract multiples,” or “the algorithm preserves common divisors.” Both sounded correct, but both were really just the conclusion restated in slightly different language. Nothing in the reasoning actually began from a place that did not already assume the result. The loop broke the moment I pictured the numbers as two sticks: one long stick of length a and a shorter stick of le...

Daily Fantasy Sports

My ex played daily fantasy sports professionally. Not recreationally, but as a serious, full-time pursuit. I spent a lot of time watching how lineups were constructed, how models were adjusted day to day, and how results were evaluated across long horizons rather than individual slates. I also spent time around other professionals, where the language was not about teams or narratives, but about distributions, leverage, and long-run expectation. At the time, I didn’t quite have the mathematical vocabulary to describe what I was seeing. Now it feels natural to return to it and write about daily fantasy sports as what it really is: an optimisation problem with partial information. This post focuses on introducing DFS. Future posts will dive into the specific math and techniques in basketball, baseball and (American) football. A daily fantasy slate begins with a finite player set P = {p_1, …, p_n}. Each player has a salary s_i and a random fantasy score X_i. Th...
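The setup in the excerpt (players with salaries and projected scores, chosen under a cap) is a constrained selection problem. A toy sketch, with made-up names, salaries, and projections purely for illustration, that brute-forces the best lineup by expected score:

```python
from itertools import combinations

# Hypothetical player pool: (name, salary, projected mean fantasy score).
# All values are invented for illustration; real slates are far larger
# and add positional constraints.
players = [
    ("A", 9000, 45.0),
    ("B", 7500, 38.0),
    ("C", 6000, 30.0),
    ("D", 5000, 24.0),
    ("E", 4500, 20.0),
    ("F", 3000, 12.0),
]
SALARY_CAP = 20000
LINEUP_SIZE = 3

def best_lineup(pool, cap, size):
    """Return the size-player lineup maximizing total projected score
    subject to the salary cap, by exhaustive search."""
    best, best_score = None, float("-inf")
    for lineup in combinations(pool, size):
        salary = sum(p[1] for p in lineup)
        score = sum(p[2] for p in lineup)
        if salary <= cap and score > best_score:
            best, best_score = lineup, score
    return best, best_score

lineup, score = best_lineup(players, SALARY_CAP, LINEUP_SIZE)
print([p[0] for p in lineup], score)  # → ['A', 'C', 'D'] 99.0
```

Brute force only works for toy pools; real slates need integer programming or knapsack-style dynamic programming, which is presumably where the promised follow-up posts go.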

Why Coding Came Easy (And What That Says About Math)

My coding journey started at 14, when I took computer science as an optional subject in school and learned BASIC. For my final project, I tried to build a simple car racing game. I got stuck trying to generate random obstacle positions and detect collisions and eventually switched to a Hangman game instead. Even then, something about programming clicked. After school, I moved toward biology, but coding kept reappearing. I took a C++ course in college, then a bioinformatics course and a computational cognitive neuroscience course in graduate school. Over time, programming shifted from being an interest to a necessity. Modern biology, especially bioinformatics and large-scale data analysis, depends heavily on code. I taught myself Python and R, and during my postdocs in computational biology, both became tools I used fluently. At that point, I was not just using them for my own work but also helping others. Coding had become a natural way for me to think and solve problems. What has been...

Learning Math: What Worked, What Didn't

Like everyone, I started learning math in school. I wasn't bad at it, but I was no prodigy either. I decided my path lay in biology and only returned to math in 2016. You can read about that journey in my previous post. Today I want to write about the how of learning math: what worked, what didn't, and what I wish I'd known earlier. I think there is something I'd call a mathematical gaze, a style of thinking that some people access easily and others don't. Everything in learning math, at its core, revolves around developing this gaze. Another word for it is intuition. What is mathematical intuition, exactly? Does it have anything to do with the real world? For a long time I was convinced that solving enough problems would develop it automatically. The truth is more complicated. Problem-solving alone isn't enough. You need a teacher who already has intuition and can show you how to think like them. That transmission from one mind to another is the only reliable ...

My Mathematical Journey

Around early 2016, I decided to learn math. The trigger was a comment I came across about someone who had struggled with math but was able to rebuild their understanding through proofs. That idea stayed with me. I had always been comfortable with biology, and it became clear that a deeper command of mathematics would open up entirely new ways of thinking about it. I started with enthusiasm but very little structure. Like many beginners, I jumped too far ahead and picked up a book on stochastic methods. That attempt did not go far. At the time, I was finishing graduate school, and the effort faded. The real shift began during my first postdoc, where I worked with both a physicist and a biologist. My physicist advisor suggested I learn linear algebra and pointed me to Strang’s lectures. I supplemented this with visual material like 3Blue1Brown. That phase gave me an initial intuition for the subject, but it was still shallow. I was beginning to see patterns, but without enough precision ...

From Hartley to Shannon

Shannon asked: “How much information can a channel carry?” He wanted a function that measures “how surprised will I be by the outcome?” For a fair coin, you’re more surprised than by a coin that always lands heads. He wrote down some reasonable properties any such function should have and derived the only formula that satisfies them. Start with the simplest case: s equally likely outcomes, each with probability 1/s. Call the uncertainty A(s). We just need to know: what properties must A(s) have? Property 1: Monotonicity. More choices = more uncertainty. So A(s_1) < A(s_2) if s_1 < s_2. Property 2: Consistency. Imagine choosing 1 letter from an alphabet of s^m symbols. That should be equivalent to choosing m letters one at a time from an alphabet of s symbols. So: A(s^m) = m · A(s). This single equation forces A(s) to be a logarithm. He...
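The standard way to see why these two properties force a logarithm (a sketch of the usual argument, not spelled out in the excerpt) is to compare powers of two different bases:

```latex
% Fix integers s, t > 1 and a large m; pick n with t^n <= s^m < t^{n+1}.
\[
t^{n} \le s^{m} < t^{n+1}
\quad\Longrightarrow\quad
\frac{n}{m} \le \frac{\log s}{\log t} < \frac{n+1}{m}.
\]
% Monotonicity plus A(s^m) = m·A(s) give the same bounds for A:
\[
n\,A(t) \le m\,A(s) < (n+1)\,A(t)
\quad\Longrightarrow\quad
\frac{n}{m} \le \frac{A(s)}{A(t)} < \frac{n+1}{m}.
\]
% Both ratios lie in an interval of width 1/m, so letting m grow,
\[
\frac{A(s)}{A(t)} = \frac{\log s}{\log t}
\quad\Longrightarrow\quad
A(s) = K \log s \ \text{for some constant } K > 0.
\]
```

The constant K only fixes the unit: K = 1/log 2 gives bits.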

Why Information is Logarithmic: Hartley’s 1928 Insight

In 1928, a researcher at Bell Labs named Ralph Hartley published a paper that would change the world. At the time, "information" was a vague, psychological concept that we couldn't measure. The growth from that starting point to the modern field of information theory has been exponential. Out of curiosity I went back and read "Transmission of Information" by Hartley. What I'm going to do here is talk about the paper. From the abstract: "A quantitative measure of information is developed which is based on physical as contrasted with psychological considerations." That is the only sentence from the abstract I am going to write about; the rest of the paper deals with engineering applications that I won't cover. On the first page he gives an overview of the paper and explains what he is attempting to do. On the second page he begins with the measurement of information, where he talks about the considerations involved in com...

The Library Bar and Sherlock Holmes

The Library Cafe and Bar, Madison, WI, USA

Back in grad school, I used to frequent a bar called “The Library.” They had books, food, and drinks, and the music was always great. There was nothing I enjoyed more than showing up there after a long day at work, finding a spot under the bright lights, and reading Sherlock Holmes. The walls were lined with actual bookshelves, but I usually stuck to the hard wooden chairs with my own worn paperbacks. If you had told me back then that I would eventually be seriously attempting to learn math that wasn't essential for my work, I would’ve laughed in your face. My way of thinking felt completely incompatible with the rigid world of mathematics. I saw myself as a person of prose and puzzles, not proofs and polynomials. My perspective has changed over the last five years. I have slowly begun to realize that the thrill of a Holmesian deduction is not so different from the thrill of a mathematical discovery.