Rationality

https://www.youtube.com/watch?v=w4RLfVxTGH4

The Baloney Detection Kit: Carl Sagan’s Rules for Bullshit-Busting and Critical Thinking #article - "Necessary cognitive fortification against propaganda, pseudoscience, and general falsehood. Carl Sagan (November 9, 1934–December 20, 1996) was many things — a cosmic sage, voracious reader, hopeless romantic, and brilliant philosopher. But above all, he endures as our era’s greatest patron saint of reason and critical thinking, a master of the vital balance between skepticism and openness. In The Demon-Haunted World: Science as a Candle in the Dark (public library) — the same indispensable volume that gave us Sagan’s timeless meditation on science and spirituality, published mere months before his death in 1996 — Sagan shares his secret to upholding the rites of reason, even in the face of society’s most shameless untruths and outrageous propaganda."

Center for Applied Rationality - "Developing clear thinking for the sake of humanity’s future"

Charles Murray vs Milo, and the "good faith" principle #video - "I explain why I think the "good faith" principle implies that we should give people like Charles Murray, but not Milo, a platform to share their ideas. (Not saying Milo should be legally banned from speaking, just that we shouldn't be going out of our way to give him opportunities to speak.) I also argue that the "good faith" principle is inherently subjective and that this doesn't negate its importance."

Danny's Agonizing Guide to Agonizing Decisions #article - "Suppose you’re feeling super torn about whether to, say, order soup or salad. Or maybe whether or not to have children. Something with a seemingly overwhelming list of pros and cons on both sides."

An Honest Liar #documentary - "The incredible true story of the renowned magician turned skeptic and exposer of frauds and hoaxes, James "The Amazing" Randi." #skepticism

How Do You Evaluate Your Own Predictions? #article - "This post provides a comprehensive summary of the technique that Tetlock and Gardner present in Superforecasting. It is divided into two parts: part one (this week) is about the rules and evaluation criterion that GJP uses to evaluate forecasters. I've presented this in the context of an interested practitioner; I'm assuming you might want to improve forecasting decisions in your own life, but you don't have the time to implement a rigorous forecasting tournament in your organisation."
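
The evaluation criterion Superforecasting's Good Judgment Project used is the Brier score; here is a minimal sketch of applying it to your own binary forecasts (the example forecasts below are invented for illustration, not taken from the article):

```python
# Minimal sketch: Brier-scoring binary forecasts.
# Each record is (stated probability the event happens, whether it happened).
# Lower is better: 0.0 is a perfect forecast; always saying 50% scores 0.25.

def brier_score(forecasts):
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in forecasts) / len(forecasts)

# Hypothetical forecasts, for illustration only.
my_forecasts = [
    (0.9, True),   # "90% sure the incumbent wins" -- they did
    (0.6, False),  # "60% sure it rains Saturday" -- it didn't
    (0.2, False),  # "20% sure the project slips" -- it didn't
]

print(f"Brier score: {brier_score(my_forecasts):.3f}")
```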

How Good Are FiveThirtyEight Forecasts? #article - "Forecasts have always been a core part of FiveThirtyEight’s mission. They force us (and you) to think about the world probabilistically, rather than in absolutes. And making predictions, whether we’re modeling a candidate’s chance of being elected or a team’s odds of making the playoffs, improves our understanding of the world by testing our knowledge of how it works — what makes a team or a candidate win."

https://www.youtube.com/watch?v=B36MkF-QU1s

In Defense of Polling #article - November 7, 2020. "How I earned $50,000 on election night using polling data and some Python code."

Is Most Published Research Wrong? #video - "Mounting evidence suggests a lot of published research is false."

Mental Models: The Best Way to Make Intelligent Decisions (109 Models Explained) #article - "Mental models are how we understand the world. Not only do they shape what we think and how we understand but they shape the connections and opportunities that we see. Mental models are how we simplify complexity, why we consider some things more relevant than others, and how we reason."

PredictionBook - "Rather than just recording what you think will happen, PredictionBook allows you to record just how sure you are that it will happen. You can distinguish between those things you think will probably occur, and those things you're really really sure will occur. If you're properly calibrated, you'll be able to see that about 60% of the things you're 60% sure will happen, do happen, and 90% of the things you're 90% sure will happen, do happen."
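
The calibration check PredictionBook describes is easy to reproduce for your own records; a minimal sketch in Python (the predictions and confidence levels below are hypothetical, and this is not PredictionBook's own code or API):

```python
# Minimal sketch: group predictions by stated confidence and check how often
# each group actually came true -- calibration as PredictionBook describes it
# ("about 60% of the things you're 60% sure will happen, do happen").
from collections import defaultdict

# Hypothetical (stated confidence, did it happen?) records.
predictions = [
    (0.6, True), (0.6, False), (0.6, True),   # "60% sure" claims
    (0.9, True), (0.9, True), (0.9, False),   # "90% sure" claims
]

buckets = defaultdict(list)
for confidence, happened in predictions:
    buckets[confidence].append(happened)

for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"Stated {confidence:.0%} -> happened {hit_rate:.0%} "
          f"({len(outcomes)} predictions)")
```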

https://twitter.com/blader/status/1330631848842588161

Rationalism #article - Wikipedia. "In philosophy, rationalism is the epistemological view that "regards reason as the chief source and test of knowledge"[1] or "any view appealing to reason as a source of knowledge or justification".[2] More formally, rationalism is defined as a methodology or a theory "in which the criterion of the truth is not sensory but intellectual and deductive".[3]"

Rationality #article - Wikipedia. "Rationality is the quality or state of being rational – that is, being based on or agreeable to reason.[1][2] Rationality implies the conformity of one's beliefs with one's reasons to believe, and of one's actions with one's reasons for action. "Rationality" has different specialized meanings in philosophy,[3] economics, sociology, psychology, evolutionary biology, game theory and political science."

Scientific skepticism #article - Wikipedia. "Scientific skepticism or rational skepticism (also spelled scepticism), sometimes referred to as skeptical inquiry, is an epistemological position in which one questions the veracity of claims lacking empirical evidence. In practice, the term most commonly references the examination of claims and theories that appear to be beyond mainstream science, rather than the routine discussions and challenges among scientists. Scientific skepticism differs from philosophical skepticism, which questions humans' ability to claim any knowledge about the nature of the world and how they perceive it, and the similar but distinct methodological skepticism, which is a systematic process of being skeptical about (or doubting) the truth of one's beliefs.[7]"

The Signal and the Noise: Why So Many Predictions Fail - But Some Don't by Nate Silver | Goodreads #book - "Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the "prediction paradox": The more humility we have about our ability to make predictions, the more successful we can be in planning for the future."

Spurious Correlations

Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts #book - "Even the best decision doesn't yield the best outcome every time. There's always an element of luck that you can't control, and there is always information that is hidden from view. So the key to long-term success (and avoiding worrying yourself to death) is to think in bets: How sure am I? What are the possible ways things could turn out? What decision has the highest odds of success? Did I land in the unlucky 10% on the strategy that works 90% of the time? Or is my success attributable to dumb luck rather than great decision making?"

What is "rationality"? #video - "What do people mean by rationality? Julia Galef from http://measureofdoubt.com discusses the various ways people understand the word 'rationality' and how the different meanings relate to each other."

What exactly is the "Rationality Community?" #article - "I used to use the phrase "Rationality Community" to mean three different things. Now I only use it to mean two different things, which is... well, a mild improvement at least. In practice, I was lumping a lot of people together, many of whom neither wanted to get lumped together nor had much in common."

Why you shouldn't try to "change your mind" #video - "In which I describe how I've changed my mind about the usefulness of "changing one's mind"."

Monte Carlo Simulations

Monte Carlo Simulation #video - "A Monte Carlo simulation is a randomly evolving simulation. In this video, I explain how this can be useful, with two fun examples of Monte Carlo simulations: The first model shows how pi can be determined with Monte Carlo sampling, and in the second part of the video we will take a look at how animations can be rendered with Monte Carlo path tracing."
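
The pi example from the video fits in a few lines; a rough sketch (the sample count is arbitrary, and this is not the video's own code):

```python
# Minimal sketch of Monte Carlo estimation of pi: sample random points in the
# unit square and count how many fall inside the quarter circle of radius 1.
# That fraction approaches pi/4 as the number of samples grows.
import random

def estimate_pi(samples=1_000_000):
    inside = sum(1 for _ in range(samples)
                 if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4 * inside / samples

print(estimate_pi())  # typically prints a value close to 3.14
```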

Sources

Astral Codex Ten #blog - "ACX started as my personal blog, Slate Star Codex. It grew out of “the rationalist community”, a mostly-online subculture of people trying to work together to figure out how to distinguish truth from falsehood using insights from probability theory, cognitive science, and AI."

Farnam Street - "Farnam Street (FS) helps you master the best of what other people have already figured out. Together we will develop the mental models to understand how the world works, make better decisions, and live a more meaningful life."

Julia Galef #article - Wikipedia. "Julia Galef (/ˈɡeɪləf/; born July 4, 1983) is co-founder of the Center for Applied Rationality.[1] She is a writer and public speaker on the topics of rationality, science, technology, and design. She hosts Rationally Speaking, the official podcast of New York City Skeptics, which she has done since its inception in 2010, sharing the show with co-host and philosopher Massimo Pigliucci until 2015.[2][3]" YouTube. Podcast. Personal website. Newsletter.

LessWrong #blog - "LessWrong is a community blog devoted to the art of human rationality."

Slate Star Codex

Slate Star Codex on Reddit
