Nobel Prize-winning physicist Richard Feynman once said ‘If you think you understand quantum mechanics, you don’t understand quantum mechanics.’ As people who’ve never professed to understand quantum mechanics, The Fence find this statement quite presumptuous, but decided we’d ask some of the world’s biggest boffins how much they do or don’t know about their jobs, life and everything else besides.
TF: Can you do long division?
David MacMillan: Yes, anyone can do long division… it’s getting it right that’s the hard bit.
How much bullshit hides in your text? Discover now at:
(PR experts, politicians, ad writers and scientists need to be strong here!)
Spend a large amount of money on a “rebrand” you could have made yourself in five minutes:
(a web joke by Tom Scott)
A collection of bothersome UI/UX decisions, like:
Newsletters; or, an enormous rant about writing on the web that doesn’t really go anywhere and that’s okay with me:
(a beautifully designed essay)
...or bringing down the tyranny of reciprocal space
(a really awesome blog about condensed matter physics)
A group of professional nature recordists from around the globe have collaborated to develop Nature Soundmap, an enjoyable and interactive way of exploring the natural sounds of our planet. Combining high-quality field recordings with the latest satellite imagery, the project brings together some of nature’s most beautiful, interesting and inspiring sounds.
1. Entropy doesn’t measure disorder; it measures likelihood.
Really, the idea that entropy measures disorder is not helpful at all. Suppose I make dough: I break an egg and dump it on the flour, add sugar and butter, and mix until the dough is smooth. Which state is more orderly, the broken egg on flour with butter over it, or the final dough?
I’d go for the dough. But that’s the state with the higher entropy. And if you opted for the egg on flour, how about oil and water? Is the entropy higher when they’re separated, or when you shake them vigorously so that they’re mixed? In this case the better-sorted state has the higher entropy.
Entropy counts the number of “microstates” that give the same “macrostate” (strictly, it is proportional to the logarithm of that number). Microstates contain all the details about a system’s individual constituents. The macrostate, on the other hand, is characterized only by general information, like “separated into two layers” or “smooth on average”. There are a lot of states for the dough ingredients that will turn into dough when mixed, but very few states that will separate into egg and flour when mixed. Hence, the dough has the higher entropy. It’s a similar story for oil and water: easy to unmix, hard to mix, hence the unmixed state has the higher entropy.
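The counting argument above can be made concrete with a toy model (my own illustration, not from the quoted piece): take N two-state particles, let a microstate be the full sequence of states, and let a macrostate record only how many particles are in state 1. The mixed macrostate is realized by vastly more microstates than the sorted one, so it has the higher entropy.

```python
import math

N = 100  # number of particles; arbitrary choice for this sketch

def entropy(k, n=N):
    """Entropy (in units of k_B) of the macrostate 'k of n particles
    are in state 1': the log of the number of microstates, ln C(n, k)."""
    return math.log(math.comb(n, k))

# Fully sorted macrostate: only one microstate realizes it, so entropy 0.
print(entropy(0))       # → 0.0
# Evenly mixed macrostate: the most microstates, hence the highest entropy.
print(entropy(N // 2))
```

The point the excerpt makes falls out directly: entropy is highest for whichever macrostate the most microstates happen to look like, which for dough is “mixed” and for oil and water is “separated”.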
Technology is supposed to work for people. From the lever to the Internet, tools designed by people have saved us time and energy and allowed us to accomplish things that were impractical or completely impossible before. While no technology is without its downsides, in general it is difficult to dispute that technology working for people has transformed human society for the better.
Unfortunately, sometimes people end up working for technology instead. This is a strange state of affairs, but it happens just about daily for most of us (...)
Confronting the climate crisis will require something more radical than just making data greener. That’s why we should put another tactic on the table: making less data. We should reject the assumption that our built environment must become one big computer. We should erect barriers against the spread of “smartness” into all of the spaces of our lives. To decarbonize, we need to decomputerize.