If a prediction market is not liquid enough, it can be manipulated with relatively small sums of money (thus, for example, a political candidate could boost his price by buying a bunch of shares). Presumably this could be useful: if you pump up your market price, you might induce donors to contribute to the apparently winning cause, or help attract endorsements.
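As a quick illustration of the thin-market effect, here is a sketch using Hanson's logarithmic market scoring rule (LMSR), a standard automated-market-maker design whose liquidity is set by a parameter b; the dollar amounts and b values below are hypothetical, not from any actual market.

```python
import math

def lmsr_price_after_spend(m, b):
    """Price of 'yes' after spending m dollars buying 'yes' shares in a
    two-outcome LMSR market that starts at 50/50, with liquidity parameter b.

    The LMSR cost function is C(q) = b * log(sum_i exp(q_i / b)).
    Solving C((x, 0)) - C((0, 0)) = m for x gives exp(x/b) = 2*exp(m/b) - 1,
    and the new price exp(x/b) / (exp(x/b) + 1) simplifies to the line below.
    """
    return 1 - math.exp(-m / b) / 2

# The same $100 purchase moves a thin market far more than a thick one:
for b in (50, 500, 5000):
    print(f"b={b:5d}: price moves from 0.500 to {lmsr_price_after_spend(100, b):.3f}")
```

With b = 50 the $100 buy pushes the price above 0.93, while with b = 5000 it barely moves past 0.51, which is the sense in which illiquid markets are cheap to manipulate.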
At the other extreme, if the market is too liquid, there's a potential "moral hazard": if you've already placed a large bet against your own side, you have a motivation to throw the election, purposely hurting your cause in order to make money on the point spread.
Now here's my question: there's clearly a sense in which a prediction market can be too small (too illiquid) to be trusted, and, conversely, if it is too large (too liquid) you get problems in the other direction. Is there an intermediate zone in which the market is liquid enough that it can't be easily manipulated, but not so liquid that it motivates point-shaving? Or do the zones of "too illiquid" and "too liquid" actually overlap, so that no market size does the job?
I imagine the answer would depend on some external parameters, such as the ease or difficulty of enforcing insider-trading restrictions. Possibly there's some theoretical work in this area. Justin? Robin?
P.S. I'm raising the questions above in all sincerity. This post is not intended to be a devastating argument that shoots down prediction markets; I'd just like to know if these issues have been considered and resolved in some way. A lot of the casual discussions of prediction markets have been of the "they're cool" or "they're silly" variety, but I imagine the researchers in this area have considered ways of assessing the problems arising from the issues noted above.
P.P.S. This paper by Robin Hanson (see comment below) discusses the first of these points, presenting theory and evidence that low-volume markets are hard to manipulate and thus implying that there is an intermediate zone where the markets can work well.