June 29, 2010
Two EC Papers I Enjoyed

EC 2010 (nominally stands for “Electronic Commerce”) was recently hosted by a dedicated team at Harvard. I had a great time as an undergrad there and it was nice to be back for a few days, revisiting places like Felipe’s and People’s Republik. The conference itself took place in the skylighted underground piazza of the Northwest Science Labs, which was really the ideal space for a small conference. The building itself is an enduring symbol of the Summers era at Harvard — shiny, science-focused, and intrinsically endowed with the hubris that comes from constructing such a costly structure without even bothering to secure a naming donation.

My favorite talk was Ramesh Johari presenting “Congestible services and network effects” (joint work with Sunil Kumar). Unfortunately, the paper does not seem to be openly available online, and even the ACM Digital Library copy is just a one-page “extended abstract” (UPDATE: the paper is available here). Despite the awfully throwaway title, the work is a clever analysis of two opposing forces in networks: network effects, where people benefit from the presence of others (think Facebook), versus congestion effects, where your utility decreases as more people use the service (think traffic). I disliked the reliance on Nash equilibrium in the paper, but from what I could tell these were potential games, so simple dynamics should converge to those equilibria. A robust model drawn in broad strokes and with qualitative insights — to me, the ideal theory paper.
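That last remark deserves a tiny illustration. Congestion games are the textbook example of potential games: every improving move by a single player decreases one shared potential function, so best-response dynamics cannot cycle and must settle at a pure equilibrium. Below is a toy sketch of that convergence in Python; the two-service setup and the quadratic cost are made up for illustration and are not the model from the Johari–Kumar paper.

```python
# A toy congestion game (not the Johari-Kumar model): N players each pick one of
# two services, and a service's per-user cost grows with its load. Because this
# is a potential game, best-response dynamics reach a pure Nash equilibrium.

def cost(load):
    """Per-user cost of a service used by `load` people (the congestion effect)."""
    return load ** 2

def best_response_dynamics(n_players=10, max_rounds=100):
    choice = [0] * n_players                 # everyone starts on service 0
    for _ in range(max_rounds):
        moved = False
        for i in range(n_players):
            loads = [choice.count(0), choice.count(1)]
            current, other = choice[i], 1 - choice[i]
            stay = cost(loads[current])      # my cost if I stay put
            switch = cost(loads[other] + 1)  # my cost if I switch (adding my load there)
            if switch < stay:
                choice[i] = other
                moved = True
        if not moved:                        # no one wants to deviate: equilibrium
            return choice
    return choice

print(best_response_dynamics())              # settles at a 5/5 split for these costs
```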

The other talk I particularly enjoyed was Sharad Goel presenting his paper (with Dan Reeves, Duncan Watts, and Dave Pennock) “Prediction Without Markets,” an analysis of just how well prediction markets do their job, measured against simpler alternatives. I just really liked how much data went into this project — the authors evaluated many distinct datasets and went into a good deal of depth contrasting various predictive techniques. I thought the prediction vs. probability graphs, with circles sized to reflect the number of samples, were a clever way to present a complicated data analysis cleanly.
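For anyone who hasn’t seen that style of plot, here is a sketch of the idea with synthetic data (not the paper’s): bin the predicted probabilities, plot the observed outcome frequency in each bin, and size each circle by how many events land in the bin, so the well-sampled points visually dominate.

```python
# A sketch of a calibration plot with synthetic data: predicted probability vs.
# observed frequency, with circle area reflecting the number of events per bin.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
prices = rng.uniform(0, 1, 5000)               # predicted probabilities
outcomes = rng.random(5000) < prices           # events resolve with those odds

bins = np.linspace(0, 1, 11)
centers = (bins[:-1] + bins[1:]) / 2
idx = np.digitize(prices, bins) - 1
counts = np.array([(idx == b).sum() for b in range(10)])
freqs = np.array([outcomes[idx == b].mean() for b in range(10)])

plt.scatter(centers, freqs, s=counts)          # circle size = sample count
plt.plot([0, 1], [0, 1], linestyle="--")       # perfect-calibration line
plt.xlabel("predicted probability")
plt.ylabel("observed frequency")
plt.show()
```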

May 5, 2010
Foundational Paper: The Economist as Engineer

Al Roth and Peter Coles’ class on Market Design met from 9am until noon on Fridays during the spring semester of my junior year. I’ve never been a morning person, and my mornings that spring started at 5:30 or 6:30 to row House Crew (go Adams!). It’s a testament to the teaching and the material that I didn’t drop the damn thing, and instead had probably my best undergraduate academic experience. This paper comes closest, I think, to summarizing the ideas that kept me from falling asleep.

The Economist as Engineer is, in effect, a summary paper of two different strains of research. The most prominent is Al’s work on the National Resident Matching Program. The NRMP matches MD graduates to their first jobs, and when Al took over responsibility for it, the thing was breaking — students were beginning to go outside the match, taking exploding offers and hurting the process. Al’s redesign of the match has held up to this day. It’s such an incredible thing to see an economist take something broken, diagnose it, and fix it. The use of computational experiments to supplement the theoretical analysis is extensive and heartening, and leaves you with the impression of reading a well-documented solution to a challenging problem.
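The redesigned match is, at its core, an applicant-proposing deferred acceptance algorithm (the production version layers on real-world complications like couples who want positions in the same city). Here is a minimal sketch of that core procedure; the hospital names, preference lists, and capacities below are made up.

```python
# A minimal sketch of applicant-proposing deferred acceptance. Offers are held
# tentatively; nothing is final until no applicant has anywhere left to propose.

def deferred_acceptance(applicant_prefs, hospital_prefs, capacities):
    """applicant_prefs: applicant -> ranked list of hospitals
       hospital_prefs:  hospital  -> ranked list of applicants
       capacities:      hospital  -> number of positions"""
    rank = {h: {a: i for i, a in enumerate(prefs)} for h, prefs in hospital_prefs.items()}
    next_choice = {a: 0 for a in applicant_prefs}   # next hospital each applicant proposes to
    held = {h: [] for h in hospital_prefs}          # tentatively accepted applicants
    free = list(applicant_prefs)
    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue                                 # exhausted list: stays unmatched
        h = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[h]:
            free.append(a)                           # hospital did not rank this applicant
            continue
        held[h].append(a)
        held[h].sort(key=lambda x: rank[h][x])       # hospital keeps only its favorites
        if len(held[h]) > capacities[h]:
            free.append(held[h].pop())               # least-preferred applicant is released
    return held

# Hypothetical toy instance:
applicants = {"ann": ["mgh", "bwh"], "bob": ["mgh", "bwh"], "cal": ["bwh", "mgh"]}
hospitals  = {"mgh": ["bob", "ann", "cal"], "bwh": ["ann", "cal", "bob"]}
print(deferred_acceptance(applicants, hospitals, {"mgh": 1, "bwh": 1}))
```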

To be frank, the second half of the paper — on combinatorial mechanisms for wireless spectrum auctions — does not hold up as well. That is a function, I think, of the first half being so strong, and of the fact that how to run a good spectrum auction remains an open question to this day.

I can’t help but feel that section was added as an attempt to stretch “Design Economics” into a field, rather than an isolated example. (What I’ve dubbed “Constructive Economics” could fairly be regarded as equivalent to the “Design Economics” of this paper, but I prefer constructive because it feels more accessible and highlights the non-constructiveness of the bulk of the discipline.) In addition to all of the very sound and well-reasoned research ideas in the work (nearly all of them backed up by the kind of nit-picky, rigorous, time-consuming studies that don’t end up meriting anything more than a footnote), the paper takes on a remarkably strident and revivalist tone for something published in Econometrica. I mean, these are the second and third sentences of the abstract:

Market design involves a responsibility for detail, a need to deal with all of a market’s complications, not just its principle features. Designers therefore cannot work only with the simple conceptual models used for theoretical insights into the general working of markets.

Al closes his paper by quoting himself, and I’ll close this post by quoting him quoting himself:

Just as chemical engineers are called upon not merely to understand the principles which govern chemical plants, but to design them, and just as physicians aim not merely to understand the biological causes of disease, but their treatment and prevention, a measure of the success of microeconomics will be the extent to which it becomes the source of practical advice, solidly grounded in well tested theory, on designing the institutions through which we interact with one another.

It’s a stunning vision of the future, better than coffee early on Friday mornings.

Roth — The Economist as Engineer: Game Theory, Experimentation, and Computation as Tools for Design Economics (2002)

April 6, 2010
Foundational Paper: Consensus of Subjective Probabilities: The Pari-mutuel Method

This is the oldest paper I plan on posting, and it’s a far more unconventional selection than the other logical candidates for a first paper — von Neumann and Morgenstern’s Theory of Games and Economic Behavior, which is virtually unreadable, and Nash’s 1950 paper, which would really be useful only as a non-constructive foil.

Instead, I think that the distinction should go to Eisenberg and Gale’s four-page gem from 1959, which considers how a collection of individual probabilities could be pooled into a single probability estimate. The paper is written in a rigorous but conversational manner that has aged well, making it an enjoyable short read more than fifty years later. But what really sets it apart is this:

[The existence of a solution] can be proved by means of fixed-point theorems. We prefer, however, to prove existence in an elementary manner using a variational method which seems to be of interest in itself.

This “variational method” defines the solution to the consensus problem as the solution to a convex optimization problem, which is so much more heartening to a computer scientist than seeing the solution arrive via a Brouwer fixed-point argument.

Old paper, new results

Eisenberg and Gale’s paper was brought to the forefront by recent work by Kamal Jain and Vijay Vazirani. Essentially, because the solution of the E-G process can be expressed as a convex optimization, and convex optimization is computationally easy, things that “behave like” E-G can be solved well in practice. In the face of a plethora of recent results showing that Economic thinking about equilibria has been built on the unstable ground of computational intractability, it’s nice to have some classic equilibrium-finding algorithms that actually work in the real world.
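To make that concrete, here is a sketch of the Eisenberg–Gale program in the Fisher-market form that Jain and Vazirani work with, written with cvxpy (my choice of tool, not anything from the papers) and made-up budgets and beliefs. Each bettor buys claims on outcomes to maximize expected payoff, the program maximizes the budget-weighted sum of log utilities, and the consensus probabilities fall out as the normalized dual variables on the supply constraints.

```python
# A sketch of the Eisenberg-Gale convex program. Bettor i has budget b[i] and
# subjective probabilities P[i, j] over outcomes j; the consensus probabilities
# are read off as the normalized duals of the supply constraints.
import cvxpy as cp
import numpy as np

b = np.array([2.0, 1.0, 1.0])                  # budgets (made up)
P = np.array([[0.7, 0.3],                      # subjective probabilities (made up)
              [0.4, 0.6],
              [0.5, 0.5]])

x = cp.Variable(P.shape, nonneg=True)          # x[i, j]: claims on outcome j bought by bettor i
utility = cp.sum(cp.multiply(P, x), axis=1)    # each bettor's expected payoff
supply = cp.sum(x, axis=0) <= 1                # each outcome pays out one unit in total
problem = cp.Problem(cp.Maximize(b @ cp.log(utility)), [supply])
problem.solve()

prices = supply.dual_value                     # equilibrium prices of the outcomes
print(prices / prices.sum())                   # consensus probabilities
```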

Finally, Manski’s result on equilibrium in prediction markets — which implies the long-shot bias that shows up in actual market data — can be derived in a straightforward manner from E-G, reading the paper from the perspective of beliefs being continuously distributed.

Eisenberg and Gale — Consensus of Subjective Probabilities: The Pari-mutuel Method (1959)

April 2, 2010
Foundational Paper: Incentives Build Robustness in BitTorrent

It is to our profound and lasting discredit as a field that the most important paper in multi-agent systems in the last decade came not from the academic community, but from an outsider with some of the least auspicious academic credentials imaginable (dropping out of SUNY-Buffalo).

For every bit downloaded, someone must be uploading. But the two sides are inherently asymmetric: people are selfish and want to download as fast as possible without uploading anything. BitTorrent solves this problem by making uploading the key to fast downloading.

A Blend of Theory and Practice

There’s a strong real-world focus in this paper, like you’d expect from a system that has been implemented and refined based on real usage. Take the section on the “last piece” problem — what happens if the final part of the file you’re downloading is from someone with a slow connection? These are the kinds of issues you don’t know you have until you build something.

At the same time, you also see an attempt to engage with Economic theory. From Section 3.1:

Well known economic theories show that systems which are pareto efficient, meaning that no two counterparties can make an exchange and both be happier, tend to have all of the above properties [utilizing all available resources, providing consistent download rates, and being “somewhat resistant” to people downloading but not uploading] … BitTorrent’s choking algorithms attempt to achieve pareto efficiency using a more fleshed out version of tit-for-tat than that used to play prisoner’s dilemma.

Sure, okay, you can scoff at that, and the explanation isn’t quite right, fine. It’s obviously the product of an intelligent autodidact, rather than an ensconced academic. But it evinces an obvious effort to derive practical work from theoretical first principles. It’s an effort to make it real.
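To make the choking idea concrete, here is a toy sketch in Python: each peer periodically unchokes (uploads to) the handful of peers that have recently given it the best download rates, plus one randomly chosen peer as an optimistic unchoke to probe for better trading partners. The constants below are illustrative rather than the client’s exact values.

```python
# A toy sketch of BitTorrent-style choking/tit-for-tat: reciprocate with the
# peers that upload to you fastest, and optimistically unchoke one stranger.
import random

def choose_unchoked(download_rates, n_regular=3, rng=random):
    """download_rates: peer_id -> bytes/sec recently observed from that peer."""
    by_rate = sorted(download_rates, key=download_rates.get, reverse=True)
    unchoked = set(by_rate[:n_regular])          # reciprocate the best uploaders
    choked = [p for p in by_rate if p not in unchoked]
    if choked:
        unchoked.add(rng.choice(choked))         # optimistic unchoke: give a stranger a chance
    return unchoked

peers = {"peer_a": 120_000, "peer_b": 45_000, "peer_c": 90_000, "peer_d": 0, "peer_e": 3_000}
print(choose_unchoked(peers))                    # e.g. {'peer_a', 'peer_c', 'peer_b', 'peer_d'}
```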

Impact

It would be difficult to overstate the significance of BitTorrent to the Internet. Estimates are really shaky on this, but BitTorrent probably accounts for somewhere between 20% and 50% of all Internet traffic.

If we were to take all the CS papers published in the 2000s and rank them in terms of real-world impact, BitTorrent is likely in the top three — I would say the other two are Google’s MapReduce and Luis von Ahn’s work on CAPTCHAs.

Finally, like any good economic system, BitTorrent has been the subject of further analysis. I plan to write about BitTyrant, a really clever client and an academic effort to hack BitTorrent’s economic protocol to enable massive downloads with minimal uploads. BitTyrant, of course, also did not come from the multi-agent systems community. Just what is it we do again?

Cohen — Incentives Build Robustness in BitTorrent (2003)