Entropy Explained, With Sheep

Why do everyday events like an egg breaking or an ice cube melting only ever happen in one direction, never in reverse?

You might’ve heard an explanation that goes like this: whenever you drop an egg, or melt an ice cube, or shatter a wine glass, you’ve increased the entropy of the world. You might also have heard the phrase, “entropy always increases”. In other words, things are only allowed to happen in one direction — the direction in which entropy increases.

But this doesn’t answer the question, it just replaces it with a new set of questions.

What is entropy, really? Why does it always keep increasing? Why don’t eggshells uncrack, or wine glasses unshatter? In this piece, my goal is to give you the tools to answer these questions.

Going down this road leads us to some of the biggest unanswered questions about the cosmos: how did our universe begin, how will it end, and why is our past different from our future?


A founder’s perspective on 4 years with Haskell

In 2012 I co-founded Better, a startup building a new type of enterprise e-learning platform. Our goal was to make it faster and cheaper for large companies to develop, deliver, and analyse adaptive, cross-platform, multi-language, online courses.

From day one, we decided to use Haskell as our main language, and it remained the only language we used on the back-end as our team grew to 10 developers.

After a period of experimentation and development, Better grew from $0 to $500k+ in annual recurring revenue in the space of a few months, with companies including American Express and Swissport as customers. However, the distribution model proved challenging to grow beyond that and we eventually sold to GRC Solutions, an Australian compliance company.

Though interest in Haskell appears to be growing steadily, its use in production is still rare. Some have the mistaken impression it’s an academic language and nothing more. In this post I’ll give my perspective on what it’s like to use Haskell in a startup. Can you get stuff done? Does it hold up in practice? Can you hire developers? Should more companies use it?

The short answer to those questions is yes. It’s not a fit for all problems or for all teams, but it is worth serious consideration. For building server-side software, Haskell might be the closest thing to a secret weapon you’ll find today.


List of 50+ Machine Learning APIs

This article was posted on Mashape by Chris Ismael, a developer evangelist with experience in technical evangelism, business development, and software development in the mobile industry. Wikipedia defines machine learning as “a branch of artificial intelligence that deals with the construction and study of systems that can learn from data.”


Habits of highly mathematical people

The most common question students have about mathematics is “when will I ever use this?” Many math teachers would probably struggle to give a coherent answer beyond “it makes you very good at following precise directions.” They will say “critical thinking,” but not much else concrete. Meanwhile, the same teachers must, with a straight face, tell their students that the derivative of arccosine is important. (It goes beyond calculus, in case you were wondering.)


Random notes on improving the Redis LRU algorithm

Redis is often used for caching, in a setup where a fixed maximum amount of memory to use is specified. When new data arrives, we need to make space by removing old data. The efficiency of Redis as a cache depends on how well it decides what data to evict: deleting data that is going to be needed soon is a poor strategy, while deleting data that is unlikely to be requested again is a good one.

In other terms, every cache has a hit/miss ratio, which is, in qualitative terms, just the percentage of read queries that the cache is able to serve. In most workloads, accesses to the keys of a cache are not distributed evenly across the data set: often a small percentage of keys gets a very large percentage of all the accesses. Moreover, the access pattern often changes over time, which means that as time passes certain keys that were heavily requested may no longer be accessed often, and conversely, keys that once were not popular may turn into the most accessed keys.
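As a concrete illustration of the hit/miss ratio just described, here is a minimal sketch in Python (the function name and interface are my own, not part of Redis):

```python
def hit_ratio(hits: int, misses: int) -> float:
    """Fraction of read queries the cache was able to serve."""
    total = hits + misses
    return hits / total if total else 0.0

# A cache that served 9,000 of 10,000 reads has a 90% hit ratio.
print(hit_ratio(9000, 1000))  # 0.9
```

A better eviction policy keeps the hot keys resident, which pushes this ratio up without adding memory.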

So, in general, what a cache should try to do is retain the keys with the highest probability of being accessed in the future. From the point of view of an eviction policy (the policy used to make space to allow new data to enter), this translates into the opposite: the key with the lowest probability of being accessed in the future should be removed from the data set. There is only one problem: Redis and other caches are not able to predict the future.
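LRU eviction approximates that ideal by using recency of access as a stand-in for future probability of access: the least-recently-used key is assumed to be the least likely to be needed again. As a sketch of the general idea only (Redis does not implement exact LRU this way; it approximates LRU by sampling a few keys per eviction), a toy exact-LRU cache in Python might look like:

```python
from collections import OrderedDict


class LRUCache:
    """Toy exact-LRU cache: evicts the least-recently-accessed
    key when capacity is exceeded. Illustrative only, not how
    Redis implements eviction internally."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()  # insertion order = recency order

    def get(self, key):
        if key not in self.data:
            return None  # cache miss
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used


cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # touch "a": now "b" is the LRU key
cache.put("c", 3)    # evicts "b"
```

Keeping an exact recency ordering like this has a cost per key, which is one reason a production cache holding millions of keys may prefer an approximation.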