Nate Silver Famous Quotes
You can build a statistical model and that's all well and good, but if you're dealing with a new type of financial instrument, for example, or a new type of situation - then the choices you're making are pretty arbitrary in a lot of respects.
I've just always been a bit of a dork.
One of the pervasive risks that we face in the information age, as I wrote in the introduction, is that even if the amount of knowledge in the world is increasing, the gap between what we know and what we think we know may be widening.
The story the data tells us is often the one we'd like to hear, and we usually make sure that it has a happy ending.
I've become invested with this symbolic power. It really does transcend what I'm actually doing and what I actually deserve.
The more interviews that an expert had done with the press, Tetlock found, the worse his predictions tended to be.
Successful gamblers - and successful forecasters of any kind - do not think of the future in terms of no-lose bets, unimpeachable theories, and infinitely precise measurements. These are the illusions of the sucker, the sirens of his overconfidence.
In baseball you have terrific data and you can be a lot more creative with it.
Good innovators typically think very big and they think very small. New ideas are sometimes found in the most granular details of a problem where few others bother to look. And they are sometimes found when you are doing your most abstract and philosophical thinking, considering why the world is the way that it is and whether there might be an alternative to the dominant paradigm. Rarely can they be found in the temperate latitudes between these two spaces, where we spend 99 percent of our lives.
Essentially, the frequentist approach toward statistics seeks to wash its hands of the reason that predictions most often go wrong: human error.
When human judgment and big data intersect there are some funny things that happen.
We're not that much smarter than we used to be, even though we have much more information - and that means the real skill now is learning how to pick out the useful information from all this noise.
What makes him successful is the way that he analyzes information. He is not just hunting for patterns. Instead, Bob combines his knowledge of statistics with his knowledge of basketball in order to identify meaningful relationships in the data.
An admonition like "The more complex you make the model, the worse the forecast gets" is equivalent to saying "Never add too much salt to the recipe." How much complexity, how much salt, did you begin with? If you want to get good at forecasting, you'll need to immerse yourself in the craft and trust your own taste buds.
I know it's cheaper to fund an op-ed columnist than a team of reporters, but I think it confuses the mission of what these great journalistic brands are about.
The key to making a good forecast is not in limiting yourself to quantitative information.
I have to think about how to not spread myself too thin. It's a really great problem to have.
The public is even more pessimistic about the economy than the most bearish economists are.
A lot of journalism wants to have what they call objectivity without them having a commitment to pursuing the truth, but that doesn't work. Objectivity requires belief in and a commitment toward pursuing the truth - having an object outside of our personal point of view.
I guess I don't like the people in politics very much, to be blunt.
I actually buy the paper version of The New York Times maybe once or twice a week.
The signal is the truth. The noise is what distracts us from the truth.
New ideas are sometimes found in the most granular details of a problem where few others bother to look.
We want to get 80%-85% of predictions right, not 100%. Otherwise, we've calibrated our estimates the wrong way.
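The point of the quote above is calibration: when a forecaster says "80 percent," the event should occur roughly 80 percent of the time. A minimal sketch of that check, using entirely hypothetical forecast data:

```python
def hit_rate(forecasts):
    """forecasts: list of (stated_probability, outcome) pairs,
    all issued at the same stated probability.
    Returns the observed frequency of the outcome."""
    hits = sum(1 for _, outcome in forecasts if outcome)
    return hits / len(forecasts)

# Ten hypothetical forecasts, each issued at an 80% chance.
# If about 8 of the 10 events occur, the forecaster is well calibrated.
sample = [(0.8, True)] * 8 + [(0.8, False)] * 2
print(hit_rate(sample))  # 0.8
```

In practice one would bucket forecasts by stated probability and compare each bucket's hit rate to its label; a forecaster who is right 100 percent of the time when claiming 80 percent is miscalibrated in the other direction.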
I view my role now as providing more of a macro-level skepticism, rather than saying this poll is good or this poll is evil.
He does not depend on insider tips, crooked referees, or other sorts of hustles to make his bets. Nor does he have a "system" of any kind. He uses computer simulations, but does not rely upon them exclusively.
Every day, three times per second, we produce the equivalent of the amount of data that the Library of Congress has in its entire print collection, right? But most of it is like cat videos on YouTube or 13-year-olds exchanging text messages about the next Twilight movie.
Where our enemies will strike us is predictable: it's where we least expect them to.
Remember, the Congress doesn't get as many opportunities to make an impression with the public.
If you're keeping yourself in the bubble and only looking at your own data or only watching the TV that fits your agenda then it gets boring.
If you hold there is a 100 percent probability that God exists, or a 0 percent probability, then under Bayes's theorem, no amount of evidence could persuade you otherwise.
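The quote above describes a mathematical property of Bayes's theorem: a prior of exactly 0 or 1 can never be moved by evidence. A small sketch of a single Bayesian update illustrating this (the function and numbers are my own illustration, not from the source):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """One application of Bayes's theorem for a binary hypothesis.
    prior: P(H); likelihood_if_true: P(E|H); likelihood_if_false: P(E|not H).
    Returns the posterior P(H|E)."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# A moderate prior moves with the evidence:
print(bayes_update(0.5, 0.9, 0.1))   # 0.9
# A degenerate prior does not, no matter how strong the evidence:
print(bayes_update(1.0, 0.01, 0.99)) # 1.0
print(bayes_update(0.0, 0.99, 0.01)) # 0.0
```

With prior 1, the `(1 - prior)` term vanishes and the posterior is 1 regardless of the likelihoods; with prior 0, the numerator vanishes and the posterior stays 0.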
By playing games you can artificially speed up your learning curve to develop the right kind of thought processes.
The Protestant Reformation had a lot to do with the printing press, where Martin Luther's theses were reproduced about 250,000 times, and so you had widespread dissemination of ideas that hadn't circulated in the mainstream before.
People don't have a good intuitive sense of how to weigh new information in light of what they already know. They tend to overrate it.
Expert estimates of probability are often off by factors of hundreds or thousands. [ ... ] I used to be annoyed when the margin of error was high in a forecasting model that I might put together. Now I view it as perhaps the single most important piece of information that a forecaster provides. When we publish a forecast on FiveThirtyEight, I go to great lengths to document the uncertainty attached to it, even if the uncertainty is sufficiently large that the forecast won't make for punchy headlines.
I have the same friends and the same bad habits.
You get steely nerves playing poker.
Basically, books were a luxury item before the printing press.
I don't think that somebody who is observing or predicting behavior should also be participating in the 'experiment.'
We need to stop, and admit it: we have a prediction problem. We love to predict things - and we aren't very good at it.
Wherever there is human judgment there is the potential for bias.
In politics people build whole reputations off of getting one thing right.
Caesar recognized the omens, but he didn't believe they applied to him.
All I know is that I have way more stuff that I want to write about than I possibly have time to.
We're living in a world where Google beats Gallup.
The instinctual shortcut that we take when we have "too much information" is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.
People attach too much importance to intangibles like heart, desire and clutch hitting.
Data scientist is just a sexed up word for statistician.
When we advance more confident claims and they fail to come to fruition, this constitutes much more powerful evidence against our hypothesis. We can't really blame anyone for losing faith when this occurs.
Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent noise or signal.
Data Is Useless Without Context
Whenever you have dynamic interactions between 300 million people and the American economy acting in really complex ways, that introduces a degree of almost chaos theory to the system, in a literal sense.
I think people feel like there are all these things in our lives that we don't really have control over.
I have to make sure that I make good choices and that if I put my name on it, it's a high-quality endeavor and that I have time to be a human being.
If there's a major foreign policy event, the President gets on TV, the Congress doesn't.
If you compare the number of children who are diagnosed as autistic to the frequency with which the term autism has been used in American newspapers, you'll find that there is an almost perfect one-to-one correspondence (figure 7-4), with both having increased markedly in recent years.
To the extent that you can find ways where you're making predictions, there's no substitute for testing yourself on real-world situations that you don't know the answer to in advance.
A lot of news is just entertainment masquerading as news.
As the statistician George E. P. Box wrote, "All models are wrong, but some models are useful." What he meant by that is that all models are simplifications of the universe, as they must necessarily be. As another mathematician said, "The best model of a cat is a cat."
...
The key is in remembering that a model is a tool to help us understand the complexities of the universe, and never a substitute for the universe itself.
When you try to predict future E.R.A.'s with past E.R.A.'s, you're making a mistake.
We look at all the polls, not just the Gallup Poll. So, it's kind of like if you have, you know, four out of five doctors agree that reducing cholesterol reduces your risk of a heart attack, Gallup is like the fifth doctor.
The most calamitous failures of prediction usually have a lot in common. We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.
Actually, one of the better indicators historically of how well the stock market will do is just a Gallup poll, when you ask Americans whether they think it's a good time to invest in stocks, except it goes the opposite direction of what you would expect. When the market's going up, it in fact makes it more prone toward decline.
I'm a pro-horserace guy.
What a well-designed forecasting system can do is sort out which statistics are relatively more susceptible to luck; batting average, for instance, is more erratic than home runs.
Sometimes the only solution when the data is very noisy - is to focus more on process than on results.
People gravitate toward information that implies a happier outlook for them.
I love South American food, and I haven't really been down there. I really need a vacation.
Almost everyone's instinct is to be overconfident and read way too much into a hot or cold streak.
You don't want to treat any one person as oracular.
You don't want to influence the same system you are trying to forecast.
When you get into statistical analysis, you don't really expect to achieve fame. Or to become an Internet meme. Or be parodied by 'The Onion' - or be the subject of a cartoon in 'The New Yorker.' I guess I'm kind of an outlier there.
There was "nothing new under the sun," as the beautiful Bible verses in Ecclesiastes put it - not so much because everything had been discovered but because everything would be forgotten.
Any one game in baseball doesn't tell you that much, just as any one poll doesn't tell you that much.
For-profit weather forecasters rarely predict exactly a 50 percent chance of rain, which might seem wishy-washy and indecisive to consumers. Instead, they'll flip a coin and round up to 60, or down to 40, even though this makes the forecasts both less accurate and less honest.
If I had a spreadsheet on my computer, it looked like I was busy.
I think there's space in the market for a half-dozen kind of polling analysts.
Accountability doesn't mean apologizing.
Risk, as first articulated by the economist Frank H. Knight in 1921, is something that you can put a price on. Say that you'll win a poker hand unless your opponent draws to an inside straight: the chances of that happening are exactly 1 chance in 11. This is risk. It is not pleasant when you take a "bad beat" in poker, but at least you know the odds of it and can account for it ahead of time. In the long run, you'll make a profit from your opponents making desperate draws with insufficient odds. Uncertainty, on the other hand, is risk that is hard to measure. You might have some vague awareness of the demons lurking out there. You might even be acutely concerned about them. But you have no real idea how many of them there are or when they might strike. Your back-of-the-envelope estimate might be off by a factor of 100 or by a factor of 1,000; there is no good way to know. This is uncertainty. Risk greases the wheels of a free-market economy; uncertainty grinds them to a halt.
If you aren't taking a representative sample, you won't get a representative snapshot.
Every four years in the presidential election, some new precedent is broken.
Even if you fly twenty times per year, you are about twice as likely to be struck by lightning.
Herd immunity - the biological equivalent of a firewall in which the disease has too few opportunities to spread and dies out.
On average, people should be more skeptical when they see numbers. They should be more willing to play around with the data themselves.
A forecaster should almost never ignore data, especially when she is studying rare events like recessions or presidential elections, about which there isn't very much data to begin with. Ignoring data is often a tip-off that the forecaster is overconfident, or is overfitting her model - that she is interested in showing off rather than trying to be accurate.
If political scientists couldn't predict the downfall of the Soviet Union - perhaps the most important event in the latter half of the twentieth century - then what exactly were they good for?
Midterm elections can be dreadfully boring, unfortunately.
Well, you know, you're not going to have 86 percent of Congress voted out of office.
The things that people associate with expertise, authoritativeness with a capital 'A,' don't correlate very well with who's actually good at making predictions.
Voters' memories will fade some.
Stories of prediction are often those of long-term progress but short-term regress.
Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge.
One conceit of economics is that markets as a whole can perform fairly rationally, even if many of the participants within them are irrational. But irrational behavior in the markets may result precisely because individuals are responding rationally according to their incentives.
It is not really "artificial" intelligence if a human designed the artifice.