Tom Lee's Techdirt Comments

Latest Comments (15)

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 07 Oct, 2008 @ 11:18pm

    Re: One more thing Tom

    Well... yes. They're going to try to maximize profits and squeeze as many users onto a given network segment as possible. Is this a surprise? I hope not.

    None of it changes the fact that resources are constrained and consequently the ISPs must find ways to allocate them efficiently. If competition exists -- admittedly a big if -- prices will be forced down, providing an optimal allocation of bandwidth for minimal price.

    I still haven't heard a coherent argument for how users could expect to use as much bandwidth as they care to on an indefinite basis. In fact, aside from Mike's discussion of transaction costs, I haven't heard an explanation of why they ought to be able to, either.

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 06 Oct, 2008 @ 08:36am

    Re: Re: Re: Tom or whoever is reading this post, I think you've been misled by talk of

    I think this is a fair criticism -- disconnecting users is not really a sufficient response to the need to discourage high bandwidth use. Overage charges are more appropriate, and something that Comcast considered but rejected, apparently fearing that they would confuse or scare off customers. It makes sense -- right now there aren't great ways for the average user to track their monthly bandwidth use, and getting an unexpectedly large bill is a great way to anger customers.

    But other ISPs are implementing overage charges, and I think Comcast probably will, too, as time goes on. Metered bandwidth is where we need to end up (with, perhaps, an initial pool of "free" bandwidth associated with the base monthly fee). Right now the ISPs are afraid that customers will be turned off by that; I think that fear is misplaced. So long as they can resist viewing a switch in pricing schemes as an opportunity to sneakily jack up prices, I think customers would be willing to go along.
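    As a rough illustration of how such a scheme bills out, here's a minimal sketch; the base fee, allowance, and per-gigabyte rate are made-up figures, not anything an actual ISP has announced:

```python
def monthly_bill(gb_used, base_fee=40.0, included_gb=250, per_gb=0.50):
    """The base monthly fee covers an initial pool of 'free' bandwidth;
    usage beyond that allowance is billed per gigabyte."""
    overage_gb = max(0, gb_used - included_gb)
    return base_fee + overage_gb * per_gb

# A light user pays only the base fee; a heavy user pays for what they use.
print(monthly_bill(80))   # 40.0
print(monthly_bill(400))  # 40.0 + 150 * 0.50 = 115.0
```

    Under a scheme like this, light users never see an overage line item at all, which goes some way toward addressing the worry about surprise bills.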

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 04 Oct, 2008 @ 01:01pm

    Re: Re: Tom or whoever is reading this post, I think you've been misled by talk of

    Sorry -- "their cost per customer is not non-zero" s/b "their cost per customer is not zero".

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 04 Oct, 2008 @ 12:55pm

    Re: You had me up until...

    "It's just a matter of the ISP willing to install the gear at the nodes."

    "Willingness" in this case isn't just about them getting off their lazy butts -- it's about paying to do it. That's what makes bandwidth limited.

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 04 Oct, 2008 @ 12:53pm

    Re: Tom or whoever is reading this post, I think you've been misled by talk of

    I feel as though a lot of the comments here amount to people simply asserting that there is no shortage of bandwidth facing ISPs, or that the costs of a full pipe versus an empty one are the same. In a sense this is true -- networks can always be upgraded, and peering charges are relatively small. But eventually demand for bandwidth will outstrip a segment's capacity, and it will have to be replaced or otherwise improved. This costs money, and happens more frequently as your per-customer bandwidth use increases.
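    To make that economics concrete, here's a toy model (all numbers hypothetical) of how rising per-customer use pulls the next upgrade forward on a fixed-capacity segment:

```python
def years_until_upgrade(capacity_mbps, users, per_user_mbps, annual_growth=0.30):
    """Count the years until peak-hour demand (users times average
    per-user draw, compounding annually) outstrips segment capacity."""
    years = 0
    demand = users * per_user_mbps
    while demand <= capacity_mbps:
        demand *= 1 + annual_growth
        years += 1
    return years

# Same segment, same growth rate: doubling the average per-user draw
# moves the upgrade date meaningfully closer.
print(years_until_upgrade(1000, 200, 0.5))  # 9 years
print(years_until_upgrade(1000, 200, 1.0))  # 7 years
```

    The point isn't the specific numbers; it's that the upgrade bill arrives sooner as per-customer use climbs, which is exactly the cost the caps are trying to manage.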

    The question of whether our ISPs are pricing their services fairly is, to some extent, a separate question. Could Comcast accept a smaller profit margin per customer? Is the lack of broadband competition in most markets leading to higher prices than we'd otherwise have? The answer to both of these questions is almost certainly yes.

    That doesn't mean that they're acting avariciously or in bad faith when it comes to the caps, though. Consider Comcast's approach to those who violate their cap (either when it was unstated or now that it's explicit). They're not implementing overage charges -- they're simply disconnecting customers who go over the cap. How is this consistent with the idea of bandwidth caps representing a cynical way to extract more money from each customer?

    The only way this policy makes sense is if heavy bandwidth users represent a cost to Comcast -- whether through peering charges or, more plausibly, through their impact on other customers' service: the more heavy users, the sooner upgrades have to be paid for in order to maintain an acceptable level of service.

    Now, look: Comcast is a lousy company, and so are most of their fellow ISPs. Their service is expensive and customer support is often terrible. They're in the business of making money, not of making friends. But it should be obvious that 1) their cost per customer is not non-zero and 2) heavy bandwidth users represent a greater cost to them than do light users, primarily due to the network upgrades they make necessary.

    In other words, heavy users consume more resources -- and if someone consumes more resources, it's reasonable to ask that they pay more, too. To claim that everyone should pay the same regardless of what they use is unfair; to claim that the heaviest users are somehow doing everyone else a favor, as Al-Chalabi does, is ludicrous.

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 03 Oct, 2008 @ 02:16pm

    Re: Mental Transaction Costs

    I certainly agree that this is a reasonable position, but I do disagree about the significance of those transaction costs. There are many services that have succeeded despite relying upon metered sale of quantities that aren't easy to keep track of -- consider electricity or gas use, or the mobile industry outside the US.

    I agree that having a basic initial bucket of data has utility for customers. But I disagree that a one-size-fits-all, all-you-can-eat billing model is appropriate. It certainly isn't very fair.

  • Let's Be Honest About Bandwidth Rationing

    Tom Lee (profile), 03 Oct, 2008 @ 02:12pm

    Re: Is it possible...?

    I suppose it's possible that Al-Chalabi contributed to the article -- it's had many contributors -- but the original version is by David Wheeler. Consequently a citation is called for even if Al-Chalabi had something to do with the article's revisions and additions.

    But I doubt he did. I'd encourage you to read the relevant section of the whitepaper -- the concepts discussed by the Wikipedia text are not expanded upon or placed into context. There's also the fact that various other parts of the Wikipedia article are rewritten into the section (a reference to Albert-László Barabási, and a John Robb paper from the references section that discusses terrorist networks).

  • The First Step Is For Microsoft To Admit It Has A Problem

    Tom Lee (profile), 22 Aug, 2008 @ 04:16pm

    I think you're continuing to insist on the comparison being perfectly symmetric when clearly it's not. By way of analogy, I might've said that Microsoft discourages the use of OpenGL because it would prefer that people use DirectX. The two libraries are clearly not functionally equivalent -- DX does a ton more stuff than OGL -- but the functionality of one is fully implemented by the other, and people might turn to one or the other interchangeably for many simple, common tasks. As a result, OGL's market share clearly has an impact on DX's, even though it, by itself, can't do nearly as much as DX.

    I hope this clears things up. I apologize if the initial comparison was unclear, but at this point it seems a bit as though you're willfully misunderstanding me.

    To your other point: I understand that HTML5 is a draft. But as others pointed out, canvas is already supported in several browsers and is obviously here to stay. The specific state of various proposals at W3C and WHATWG is not only deadly boring, it is, in this case, largely irrelevant. And it's a complete straw man to imply that progress on a browser needs to halt until existing standards are fully implemented. Should progress on Mozilla stop until it passes ACID3? Clearly not.

  • The First Step Is For Microsoft To Admit It Has A Problem

    Tom Lee (profile), 22 Aug, 2008 @ 03:04pm

    And, incidentally, while it's true that WebKit has its roots in KHTML, it's now pretty obviously its own project -- look, there's a website and everything! Apple's involvement makes it a much more appropriate comparison in this case than the Konqueror project would be, which is why I mentioned it. The point is not that Apple wrote the original code, but rather that they have had success using an open source model to sustain the rendering engine powering their free-with-OS browser.

  • The First Step Is For Microsoft To Admit It Has A Problem

    Tom Lee (profile), 22 Aug, 2008 @ 02:59pm

    I think folks are reading a bit too much into the Silverlight/canvas comparison. I'm quite aware that Silverlight is vastly more powerful than canvas -- obviously no one is going to be doing video with a Javascript codec anytime soon, for instance (at least, I hope they aren't).

    But it should be clear that the canvas, audio and video tags represent the beginnings of standards-based efforts to sidestep proprietary rich client libraries like Silverlight and Flash. My point was simply that, to the extent that it can, Microsoft is likely to do its best to push its users toward its own proprietary solutions rather than supporting and working to improve open standards.

  • The Airlines' Ongoing Struggle With Price Aggregation Sites

    Tom Lee (profile), 29 Jul, 2008 @ 11:56am

    Thanks for the thoughtful comments, everybody.  In particular thanks to Keith from Kayak for explaining what's been going on between his company and AA -- my analysis was based on guesswork already done by TechCrunch as to why American is going down this path.  It appears that their motives are going to remain opaque, at least for now.

    Now for the criticisms raised in this thread:

    First, Kayak does use a GDS -- see here. They use QPX from ITA Software (Wikipedia also indicates that they use SABRE, but I have not been able to confirm this).  QPX isn't one of the old-guard GDSes, it's true, but the fact remains that it's a fare aggregation firm that sells its services to the airlines.  The economics underlying my argument remain the same: to the extent that the airlines can extricate themselves from these middlemen, they will.

    As to Eric's comment re: the legality of screen-scraping -- he's right, and my phrasing was poor.  As the post indicated, the airlines certainly would be able to file suit or take technological countermeasures against scrapers.  However, the matter is not as settled as Eric indicates.  See here (PDF) for what seems like a decent overview.  In short: trespass to chattel has been successfully used to shut down some screenscraping operations (e.g. eBay v. Bidder's Edge) but has failed in others (Ticketmaster v. Tickets.com).  The issue does not appear to have been definitively resolved -- and, as you might imagine, we here at TechDirt feel that trespass to chattel will ultimately be found to be inapplicable to these sorts of cases, as the doctrine was created with physical property in mind, not computer networks.  Other attempts to suppress these sorts of activities are also less-than-bulletproof, as the withdrawal of DMCA claims (linked in the original post) indicates.

    So the upshot: Eric may be overstating the viability of legal options available to airlines being scraped, but I certainly understated them, and I apologize for that.  Certainly a well-lawyered airline could make plenty of legal trouble for someone scraping its website.

    It remains an open question, of course, whether the airlines would pursue such remedies against scraping operations if those operations' upshot is to make the airlines more money.  And with the continuing rise of mashup-like technologies, it seems like only a matter of time before the law stops recognizing attempts to shut down the sharing of facts.  There are various companies (e.g. Wesabe) who are making a go of this sort of business model (and raising significant capital to do so) despite the possibility of the sorts of legal claims that Eric points to.

    Finally, Sean's point regarding similar litigation in Europe is interesting, but not relevant to American companies like Kayak.  The laws surrounding the copyright protections afforded databases are very different outside the U.S., for one thing.

  • Would Congress Withhold Financial Aid From Colleges That Don't Offer A Subscription To Napster?

    Tom Lee (profile), 13 Nov, 2007 @ 08:13am

    1: As I said, this bill is a bad idea. The case against it can be made on the merits -- I don't think it needs to be overstated. Nor do I find the Democrats' behavior with respect to the entertainment lobby to be anything less than despicable. One of the biggest problems with these issues is that *both* parties are representing corporate rather than consumer interests.

    6: If you agree with the point, I'm not sure why you're insisting that we re-argue it. Anyone who's paid any attention to the content industry over the last half-decade is acutely aware of the trouble it's had adapting to the internet age. I don't see why it's necessary to endlessly rehash that point.

    14: On what basis do you think the RIAA/MPAA will be able to file suits against schools?

  • Advertising: The Revolution Won't Be Through Friend Requests

    Tom Lee (profile), 08 Nov, 2007 @ 07:17am

    Re: I don't think ...

    Derek: In fact I am familiar with Facebook's current permissioning system for FB apps. I'm sure that something similar will accompany Beacon. But questions remain to be answered: will all of your purchases on, say, Amazon, be added by default? Will there be a per-purchase opt in? Or will it be opt out? If you accidentally post an embarrassing purchase to your feed, which site do you go to in order to delete it? Can recurring payments (e.g. Flickr membership) post without the user taking deliberate action each time? If each purchase has to be deliberately added, how meaningful is this feature, anyway?

    It'd be foolish to think that Facebook isn't considering these issues and taking steps to assuage concerns about privacy. But just because there'll doubtless be some sort of control given to the user doesn't mean that there aren't privacy implications -- witness the furor over the introduction of the mini-feed, which exposed no non-public information but dramatically changed the way user information is provided, detected and consumed by other users.

  • Fooling Computers With Optical Illusions Is A Step In The Right Direction

    Tom Lee (profile), 03 Oct, 2007 @ 12:02pm

    that's also wrong, killer_tofu

    Sorry, I don't mean to just come to this thread and be a jerk to everyone. But the Wii sensor bar actually contains two infrared LEDs, which emit light but don't detect anything. The Wiimote has a single IR-sensitive camera in it. Have a look at this video, in which two candles (which also emit infrared light) are used to replace the sensor bar.

    The fact that the position of those two IR LEDs can be assumed to remain constant (and their orientation can be assumed to be horizontal) allows the Wiimote to calculate where it's pointing relative to the sensor bar. The spatial relationship between the TV and Wii is calibrated by the user when they set up the console.
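    A hedged sketch of that calculation (the blob-coordinate convention and helper name here are mine, not Nintendo's, though the 1024x768 IR sensor resolution is the commonly reported figure):

```python
import math

def pointer_from_blobs(blob_a, blob_b, cam_w=1024, cam_h=768):
    """Estimate pointing offset and roll from the pixel positions of the
    two sensor-bar LEDs as seen by the Wiimote's IR camera."""
    (ax, ay), (bx, by) = blob_a, blob_b
    # Midpoint of the two LEDs: centered in the image when the remote
    # points straight at the middle of the sensor bar.
    mx, my = (ax + bx) / 2, (ay + by) / 2
    # Pointing offset, normalized to [-1, 1]; the image shifts opposite
    # to the remote's motion.
    dx = (cam_w / 2 - mx) / (cam_w / 2)
    dy = (cam_h / 2 - my) / (cam_h / 2)
    # The LEDs are assumed horizontal, so any tilt of the line between
    # the blobs is the remote's own roll.
    roll = math.atan2(by - ay, bx - ax)
    return dx, dy, roll

# Blobs centered and level: pointing dead ahead with no roll.
print(pointer_from_blobs((412, 384), (612, 384)))  # (0.0, 0.0, 0.0)
```

    This is all the two fixed dots buy you: an offset and a roll angle, with distance roughly inferable from the blobs' separation.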

    A one-eyed man staring at two dots painted on the wall is a closer analogy to how the Wii actually works.

  • Fooling Computers With Optical Illusions Is A Step In The Right Direction

    Tom Lee (profile), 03 Oct, 2007 @ 09:46am

    actually, this IS a physiological illusion

    Although when talking about neurons this distinction is fairly meaningless. Nevertheless, it's the case that this effect occurs before the signal ever leaves your retina, which most people would regard as a physiological issue. The reason has to do with a contrast-enhancing property afforded by a layer of inhibitory neurons. See here and read the part about "lateral inhibition".
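    A minimal sketch of lateral inhibition (illustrative only -- real retinal circuitry is far more elaborate): each receptor's output is its own input minus a fraction of its neighbors' inputs, which exaggerates contrast at a boundary.

```python
def lateral_inhibition(signal, k=0.2):
    """Each cell's output is its input minus a fraction k of each
    immediate neighbor's input -- a crude one-dimensional model of the
    retina's inhibitory layer."""
    out = []
    for i, x in enumerate(signal):
        left = signal[i - 1] if i > 0 else x
        right = signal[i + 1] if i < len(signal) - 1 else x
        out.append(x - k * (left + right))
    return out

# A step edge: a uniform dark region meeting a uniform bright region.
# The cells flanking the edge undershoot and overshoot their neighbors,
# exaggerating the apparent contrast at the boundary (Mach bands).
print(lateral_inhibition([10, 10, 10, 20, 20, 20]))
```

    Run on a step edge, the cell just inside the dark side dips below its neighbors and the cell just inside the bright side peaks above -- the same over/undershoot that produces the illusion in question.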

    It's interesting that they've managed to get their system to fail in some of the same ways that the human visual system fails, but it's silly to take this as a signal that computer vision will mirror that of people. The human visual system is very good at some things and machines *will* have to replicate those tricks. But determining luminance is not one of those things -- robots have got us pretty well beat when it comes to that, and there's no reason they should regress on that score. This research is probably intended more to shed light on human biology than on future technology.