Some points on betting
Jul 3, 2013
4 minute read

See Adam Gurri for a review of where we are so far.

  • As I told Kevin Grier a few days ago, I usually lose arguments with Tyler, and even with my model of “Tyler” in my head. It would not be very meta-rational, therefore, for me to “attack Tyler” as Bryan wants GMU bloggers to do. Even writing this post, I have moved somewhat in Tyler’s direction.
  • Everyone takes Noah’s point that an individual bet does not necessarily reveal beliefs, because through clever hedging strategies, it is possible to engineer a portfolio that has a different exposure to the original proposition than the individual bet. Everyone also takes Tyler’s point that a big part of your “portfolio” is always going to be non-traded assets.
  • One reading of Tyler is that to a considerable extent, we are all a (mostly) non-tradeable part of our own portfolios. I am betting on my own success, because what choice do I have? What portfolio theory would suggest is that in terms of optimal risk allocation we are all too exposed to ourselves. We should try to find ways to hedge against ourselves that don’t undermine alpha. Which is hard to do. Ideally, we would simply “bet against ourselves” to reduce aggregate risk, but psychologically it is difficult to bet on your own failure without reducing your probability of success. Instead, we need to self-deceive to some extent, to find ways to hedge against ourselves without realizing that that’s what we’re doing. Betting on one’s beliefs is therefore counterproductive in two ways. First, it actually increases portfolio variance because we are already exposed to ourselves. Second, precisely because of the qualities that Alex, Bryan, and Robin laud, betting one’s beliefs makes self-deception, and therefore optimal portfolio allocation, harder.
  • Nevertheless, these drawbacks of betting should often be outweighed by the expectation of actually winning the bet. Therefore, refusing to bet still has informational content for those who observe the refusal.
  • Furthermore, Tyler says that he disagrees with “virtually every sentence in the second paragraph” of Alex’s post, but that level of disagreement does not follow from my reading of Tyler. As Alex says, because the number of possible states of the world exceeds linear combinations of assets, portfolios cannot reveal a complete set of beliefs. In the presence of non-tradeable assets and psychological limitations on hedging, portfolios a fortiori don’t reveal beliefs. Bets are new assets that are difficult or costly to otherwise synthesize, and therefore add to a portfolio’s capacity to express beliefs. This is true even if we accept that bets have implications for bettor psychology; as long as bets covary with the rest of the portfolio in a unique way, they increase the information content of portfolios.
  • “Internal mental accounting” is simply not consistent with truth-seeking plus meta-rationality. Tyler knows this, which is why we need to do some creative reading of Tyler. He is nevertheless right that betting does pay off in terms of pride in betting.
  • Tyler says to Bryan: “You seek arrangements which maximize your glee.” In my view, this is both true and an argument in favor of betting. Bryan’s glee brings me joy and amusement. If it is relatively cheap to make Bryan gleeful, why not do it?
  • Tyler’s fame relative to the other participants in this discussion provides some escape. Tyler has more than 20 times as many Twitter followers as I do, and of course that metric probably overstates my relative influence. Nevertheless, I am frequently bombarded with belligerent or offbeat challenges to my claims. I imagine that for Tyler, receiving many times more of these than I do, this is somewhat exhausting.
  • Alex says “a bet is a tax on bullshit.” Despite some concessions to Tyler, I still think that’s basically right. However, for the sake of completeness, someone should consider whether bullshit is all bad. Maybe society advances through the adjudication of competing bullshit claims. I call this dynamic the bullshit dialectic. Bullshit claims slow down our convergence to a reflective equilibrium because they present more (poorly founded) possibilities that we must consider, but they might speed up our convergence to the truth because, simply in virtue of there being so many of them, some bullshit claims end up being true. A betting norm probably cows people into parroting conventional views so that they will not be asked to bet, and that chilling effect is bad to the extent that truth-seeking is not merely an activity for atomistic elites.
  • I invite Straussian readings of Tyler in the comments.
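
The portfolio-variance point above can be made concrete with a small numeric sketch. The payoff numbers are entirely hypothetical, chosen only to illustrate the identity Var(self + bet) = Var(self) + Var(bet) + 2·Cov(self, bet): when a bet on your own beliefs covaries positively with your existing non-tradeable “self” exposure, adding it raises total variance.

```python
# Illustrative sketch: adding a bet that covaries positively with an
# existing "self" exposure increases portfolio variance.
# All payoff numbers are hypothetical.

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def var(xs):
    return cov(xs, xs)

# Four equally likely states of the world: payoffs of the non-tradeable
# "self" asset and of a bet on one's own beliefs.
self_payoff = [10.0, 4.0, -2.0, -6.0]   # career success varies by state
bet_payoff  = [ 2.0, 1.0, -1.0, -2.0]   # the bet wins exactly when "self" does

combined = [s + b for s, b in zip(self_payoff, bet_payoff)]

print(var(self_payoff))                  # variance of self alone
print(var(combined))                     # strictly higher than self alone
print(cov(self_payoff, bet_payoff) > 0)  # positive covariance drives this
```

Betting against yourself would flip the sign of the covariance term and lower total variance, which is the hedge the bullet says is psychologically hard to hold.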
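
Alex’s spanning point, that a bet can be a payoff pattern not synthesizable from existing assets, can also be sketched as a rank check. This is a toy three-state economy with hypothetical payoff vectors, not anything from the actual discussion.

```python
# Toy sketch of the spanning argument: with more states of the world than
# assets, existing portfolios cannot express every payoff pattern, and a
# bet can add a payoff vector outside the current span.
# All payoff vectors are hypothetical.
import numpy as np

# Rows: assets; columns: three possible states of the world.
existing_assets = np.array([
    [1.0, 1.0, 1.0],   # a risk-free claim pays the same in every state
    [2.0, 1.0, 0.0],   # a risky asset
])

# A bet that pays off only in state 3 (a proposition true in that state
# alone); no combination of the two assets above replicates it.
bet = np.array([[0.0, 0.0, 1.0]])

rank_before = np.linalg.matrix_rank(existing_assets)
rank_after = np.linalg.matrix_rank(np.vstack([existing_assets, bet]))

print(rank_before)  # 2: two independent assets, three states
print(rank_after)   # 3: the bet expands the span of expressible payoffs
```

The rank jump is the sense in which a bet “adds to a portfolio’s capacity to express beliefs”: it is a new linearly independent payoff vector.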