What do you mean by “you”?
People ask me: “What do you think about X?” It might be the path of interest rates or stock prices or the rate of inflation. I might respond, “Well, the market is predicting . . . ” They reply: “No, what do YOU think about X?”
I can only respond: what do you mean by “you”?
Most people probably find me to be rather annoying, as I don’t believe in lots of things that are widely accepted, like asset bubbles, objective truth, and personal identity. That last one is especially hard to explain.
It’s not so much that I have multiple personalities; rather, my single personality has multiple aspects. Yes, I may have a hunch as to where some economic variable is headed, and my hunch may differ from the market forecast. But unlike most people, I don’t privilege that hunch over my rational belief in the efficient markets hypothesis (EMH). It makes no sense to speak of what I “really believe”, as I have multiple beliefs, located in different parts of my brain.
A few years back, I got into hot water for a post where I said I thought another pundit’s ideas seemed crazy, but on the other hand some extremely brilliant economists thought her views were quite interesting, so I had to accept that my gut reaction might not be correct.
Because most people do believe in the concept of personal identity, they took my gut reaction as my “real belief” and my reference to what other people believe as just so much window dressing. But that is not how I look at things. If my life were on the line, I’d go with the market forecast, not my “hunch” as to where an asset price was headed.
There are many issues on which people should not hold strong views. I’ve read many articles on AI risk. I’ve listened to podcasts. It’s obvious that the most brilliant minds in the world have reached absolutely no consensus on the question of whether AI poses an existential threat. It’s also obvious to me that many of these experts are much smarter than me. Not just smarter in the sense that they know more about AI whereas I know relatively more about the Fed (although that’s true), but smarter in the sense that they have far higher IQs.
So if someone far smarter and far better informed than me tells me that AI is extremely dangerous, and he or she has studied the issue intensively for their entire adult life, on what possible basis could I tell them that they’re wrong? Do you think I have a counterargument that they’ve never heard? Do you think that they aren’t smart enough to understand my arguments?
“Previous fears over technology proved to be groundless, or wild overreactions.”
“Oh really? And which of those technologies involved creating entities far smarter than humans?”
At the same time, there are equally bright people who don’t see AI as a threat to humanity. Do you think I know something that they don’t know? Of course not. So I’m agnostic on the AI risk question. You should be too.
When it comes to personal beliefs, people tend to have big egos. They are overconfident in their views on politics, religion, economics, sociology, even sports.
On almost any issue, you should have at least two views: an inside view, which represents your hunch as to what is true, and an outside view, which represents the rational forecast of the truth given both your hunch and a weighted average of the views of experts. When it comes to AI risk, I put a weight of roughly zero on my inside view, and 100% on the experts. The problem here, of course, is that even the experts don’t agree.
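To make the arithmetic concrete, here is a minimal sketch of that weighted blend. Everything in it is illustrative: the probabilities, the weights, and the function itself are invented for the example, not drawn from any actual forecast.

```python
# Illustrative sketch: an "outside view" formed by blending a personal hunch
# with a weighted average of expert estimates. All numbers are invented.

def outside_view(inside_prob, expert_probs, expert_weights, inside_weight):
    """Blend a hunch with a weighted average of expert probability estimates."""
    expert_avg = (sum(p * w for p, w in zip(expert_probs, expert_weights))
                  / sum(expert_weights))
    return inside_weight * inside_prob + (1 - inside_weight) * expert_avg

# On AI risk, the post puts roughly zero weight on the inside view:
result = outside_view(inside_prob=0.01,                 # the "hunch"
                      expert_probs=[0.50, 0.05, 0.20],  # three hypothetical experts
                      expert_weights=[1.0, 1.0, 1.0],   # equal credibility weights
                      inside_weight=0.0)                # all weight on the experts
print(round(result, 3))  # 0.25 -- just the experts' average
```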
PS. Wait, didn’t I say I don’t believe in truth? No, I said I didn’t believe in objective truth. When I say something is true, I merely mean that I predict it will be the consensus view in the very long run. Since we never reach the end of time, there is no objective truth. Truth is always provisional: things are regarded as true with varying levels of confidence.
PPS. Think about these questions:
1. What would an entity with an IQ of 250 be capable of doing?
2. What would an entity with an IQ of 250 choose to do?
You think you know the answer to either question? Oh really? What’s your IQ?
BTW, I’m agnostic on the question of whether we could even build something this smart.
29. March 2024 at 13:36
“The problem here, of course, is that even the experts don’t agree.”
I agree with your general sentiment here but, in the case of AI risk, I think that the fact that the “experts” don’t agree indicates that there are no experts. (Which is a really good reason to remain agnostic.) I know some people are experts in AI itself, e.g., computer scientists and mathematicians, but I don’t even know what it would mean for someone to have “studied [AI Risk] intensively for their entire adult life.” I know some people are affiliated with organizations that call themselves “AI Safety” orgs or something similar. But, as far as I know, there is no such thing as AI Risk Science or, if there is, I don’t think its practitioners have learned very much at this point. What observations would they have used as the basis of their theories and models?
Similarly, I’m not sure anyone is an “expert” on the science of 250+ IQ entities.
29. March 2024 at 13:45
People get confused. They think that if they are good at building AI machines, they are also good at predicting the societal impact of AI machines.
29. March 2024 at 14:47
“I don’t believe in lots of things that are widely accepted, like asset bubbles, objective truth, and personal identity.”
I don’t think that agrees with academic philosophers’ consensus (and a number of philosophers forming that consensus are smarter than you).
“have reached absolutely no consensus on the question of whether AI poses an existential threat.”
It clearly does, unless, as I believe, God is constantly active in the world preventing global catastrophes.
29. March 2024 at 15:25
BC and Scott, when I speak of “experts”, I don’t mean people building AI machines; I mean people who have studied AI risk.
E. Harding, Ah, but how many are smarter than Rorty or Schopenhauer?
29. March 2024 at 15:32
What does the market say about AI risk? Or is that too hard to glean?
29. March 2024 at 15:38
Pietro, I recall that Basil Halperin and two others did a paper on that:
https://basilhalperin.com/papers/agi_emh.pdf
But it may be difficult to distinguish between, say, a 3% chance of extinction and a zero chance. If you told me there was a 3% chance of extinction in 20 years, I’m not sure it would significantly affect my attitude toward consumption and investment. I’m far more likely to die of cancer.
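A rough back-of-the-envelope illustrates why. Assuming a constant annual hazard, and assuming (as a loose approximation of the mechanism in that paper) that the hazard passes roughly one-for-one into discount rates, a 3% cumulative risk over 20 years works out to a very small annual number:

```python
# Back-of-the-envelope: annualize a 3% cumulative extinction risk over 20
# years under a constant hazard. The pass-through to real interest rates
# is model-dependent; the hazard just sets the order of magnitude.

cumulative_risk = 0.03
years = 20

annual_survival = (1 - cumulative_risk) ** (1 / years)
annual_hazard = 1 - annual_survival
print(f"{annual_hazard:.4%} per year")  # ~0.1522% -- about 15 basis points,
# plausibly too small to pick out of the noise in market interest rates
```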
30. March 2024 at 04:10
Scott, thank you!
30. March 2024 at 07:01
This is mostly word salad.
But the fact that you don’t believe in objective truth is pretty scary. That’s a recipe for tyranny. And I think that’s what you want, right? Because you’ve stated that you hate populist liberty movements like MAGA. You love that top-down governance, especially if you’re in charge.
But let me kindly remind you what you said about Covid. You said Peter McCullough and Robert Malone were anti-scientists.
In this article, you state that “experts” don’t agree.
Well, it’s nice that you finally admit that.
But if you truly believe that, then why did you call Malone and McCullough anti-scientists?
Are they not highly educated experts who just happen to disagree with other groups of experts?
You cannot have your cake and eat it too.
30. March 2024 at 07:35
You are a Bayesian. You have a prior (gut) and a posterior (prior × likelihood). Your inside view is your gut, and your posterior is your outside view.
1.) Why don’t expert opinions outperform market aggregates that include the views of dummies (less well-informed gut feelings)?
2.) An AI will constantly tell you it does not have an opinion or a gut view… what is a gut view, then, without a personal identity? Just a random seed?
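For concreteness, here is a minimal sketch of that prior-times-likelihood update. The numbers are invented purely for illustration:

```python
# Illustrative Bayes update: posterior is proportional to prior * likelihood.
# All numbers are invented; this only shows the mechanics described above.

prior = 0.10                # "gut" probability that some claim is true
likelihood_if_true = 0.80   # P(evidence | claim is true)
likelihood_if_false = 0.30  # P(evidence | claim is false)

posterior = (prior * likelihood_if_true) / (
    prior * likelihood_if_true + (1 - prior) * likelihood_if_false
)
print(round(posterior, 3))  # 0.229 -- the evidence raises the gut estimate
```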
30. March 2024 at 09:11
Yep, objective truth is just a myth.
2+2 doesn’t equal 4. It equals 5.
Men can get pregnant.
Only “my truth” exists.
You talk about a “weighted average of the views of experts”. How do you weight it?
Subjectively, of course. LOL. If objective truth doesn’t exist, then the weight must be subjective.
Wow.
You’re a real “intellectual” Scott.
So amazingly profound. You believe everything and nothing at the very same time.
However, I would argue there is indeed one objective truth that we can all agree to. And that is that we all know your I.Q. is well short of 250. In fact, it’s well short of 150. Probably 120. Not bad. Not genius.
Keep trying, though. It’s funny.
30. March 2024 at 10:56
The weighted average of experts believed that DDT was harmless.
The weighted average of experts believed that smoking was harmless.
The weighted average of experts believed that GMOs were harmless.
The weighted average of experts cannot even isolate a virus using the standard, dictionary definition of isolation, yet they pretend they know everything when they know almost nothing…
The weighted average of experts has believed, since WW2, that coercion and force are the best way for America to “win friends and influence people”.
The weighted average of experts believes that dumping a surplus of food into other countries in the form of aid is helpful, despite placing millions of foreign farmers out of business.
The weighted average…
We can go on and on.
So-called “experts” are, historically, almost always wrong. From a sun-centered universe, to DDT, to phrenology, to shutting down the economy: if academia has proven to be good at anything, it is promulgating pseudoscience.
Real humility is recognizing that you aren’t an expert. You’d like to think that you are, because you spent all your life studying one topic. But it’s more likely than not that you are wrong. That you will never be remembered. Your name will not be in the history books as some magnificent hero who shaped and swayed humankind, or who made some fabulous invention. You are just an average dude who spent his entire life on one subject, understood it insofar as he could with the available information, and whose conclusions will almost certainly be wrong.
If every academic approached science that way, we’d be a lot better off.
31. March 2024 at 10:10
Only the very extreme right tail is smarter than you by any significant margin. You went to UC grad school, so your IQ is in the big blob for elite schools. There is a right tail, and those people take Math 55 at Harvard. Spread across the elite schools, there are probably 10-100 of those types per class. These are your Bill Gates types and, less famously, your Hal Finney types.
Most of the people talking about AI, including, say, Sam Altman, have no significant IQ difference from you.
31. March 2024 at 21:48
Sean, I’m not talking about people like Sam Altman (who I don’t trust); I’m talking about people with far higher IQs than me, such as Eliezer Yudkowsky, Scott Aaronson, Scott Alexander, Tyler Cowen, Zvi Mowshowitz, Robin Hanson, etc.
1. April 2024 at 22:09
An entity with an IQ of 250 would be capable of discerning objective truth. (Sorry, you and I are not up to it.)