

June 13, 2007

Listed below are links to weblogs that reference The Spectrum Auction for Dummies (and by Dummies):

Comments

Should companies truly "own" the wireless spectrum? How about leasing it, and allowing the feds to gradually adjust prices over the long-term as needed?

I also believe part of this extremely valuable spectrum should be reserved for public use - specifically for wireless networks provided by local governments

Interesting. It had not occurred to me that the strength of the signal had anything to do with the frequency. In my mind it had simply become a pile of 'layers' and there suddenly were some layers available (we already had auctions of free frequency bands).

Isn't the development in protocols more important than the frequency though? Just like we now have much higher speed over copper wires than we thought possible a few years ago?

Wireless has the advantage that you can pay for a service that you have at a variety of places. There are places (cafés, restaurants) over here where free wifi access is offered, and lots of people go there to work on their laptops. I like the idea of being able to work wherever I want, and laptops are getting close to giving the same price/performance as desktops.

At the same time: in two years' time the whole town I live in will have access to fiber. It's hard to compete with that even if you have the 'stronger' frequencies.

Great, great post, publius. It's my understanding from my brief involvement in radio and my college studies that few federal agencies have been as subject to regulatory capture as the FCC has. Is there any reason to believe that this spectrum will be made available to small carriers, rather than that the big few will get exactly what they want? We all saw who the DTV frequencies went to as soon as they were made available.

Two points:

* wireless broadband recently really took off in Austria, with all 4 networks offering deals similar to 1.5 GB/month for ~20 EUR. This is UMTS with speeds up to a few megabit/s.

They have proven to be a real competitor for wireline services for a) low-use customers and b) people not served by wireline DSL/cable.

* Have a look at e.g. http://blog.tomevslin.com/2007/06/time_to_write_t.html

By far the best utilized spectrum is the unlicensed 2.4 GHz range. No commercially used band even comes close.

There is thus not only a question to whom you sell the spectrum in which bundles and under what conditions, but also the option NOT to sell it all and let parts of it be free for everybody. Imagine how good private mesh networks and WLANs could be if they had access to more spectrum.

The 2.4 GHz range is a huge success. Why not build on that?

Good, useful post and not at all boring. Is there any indication which way this will go? If it is broken down into smaller pieces, will there be some sort of anti-trust provision to prevent a post auction buy-up of all the smaller licences by the giants? And finally, what are the chances that this will become an issue to which the public will pay attention or is it likely to be one of those under the radar deals which give less consideration to the public good than the lawmakers' and corporate good?
Thanks for the heads up, in any case.

Thank you! We don't get cell phone service at my house, and Verizon has let us know it has no intention of ever offering DSL there. It's amazing how much of the internet is realistically unavailable to people on dial-up. I can't even update my daughter's Wii. This information affects me directly, and I'm glad I have a better handle on what to write my representatives now.

Truly excellent post and great public service. Thank you for making it all comprehensible. Beautifully well written.

Dang. I've commented a few times on this, and had it fail every time.

There’s likely a tradeoff between range and spectrum width that I don’t see anyone considering. Basically, when you decrease the number of towers per square mile of urban area, you increase the number of calls that a given tower has to handle, because you’ve increased the number of customers a given tower will have to service.

So it’s possible that a given chunk of bandwidth at 700 MHz will be worth less than the same bandwidth up at over 2GHz, or that a good chunk of the advantage (less infrastructure cost) will be wiped out. A given chunk of spectrum can support N simultaneous calls, after all, in a given cell.

And of course it should go without saying that the 36 MHz chunk of bandwidth in question isn’t really what would be considered to be a fat information pipe. For voice, it's quite valuable, but for data, you're talking perhaps a few megabits per second in a given cell. Total.

So I’d guess that the way things might move is we’d see the same number of towers at this frequency, and the selling point would be longer battery life due to reduced transmit power.

But, as I said, I’m not a telecommunications type.
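The geometry behind that tradeoff is easy to put in rough numbers. A minimal sketch (all figures are made up for illustration: the user density, and the assumption that a 700 MHz cell reaches roughly three times as far as a 2.1 GHz cell, are hypothetical, not measured network parameters):

```python
import math

def users_per_cell(cell_radius_km, user_density_per_km2):
    """Users a single cell must serve, given its circular coverage area."""
    return math.pi * cell_radius_km ** 2 * user_density_per_km2

density = 1000.0          # users per square km (hypothetical urban figure)
r_high, r_low = 1.0, 3.0  # km: hypothetical cell radii at 2.1 GHz vs 700 MHz

high = users_per_cell(r_high, density)
low = users_per_cell(r_low, density)
print(f"2.1 GHz cell: {high:.0f} users; 700 MHz cell: {low:.0f} users "
      f"({low / high:.0f}x the load on the same slice of spectrum)")
```

The point being: triple the radius and each tower inherits roughly nine times the users, all contending for the same fixed chunk of spectrum.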

Thank you thank you thank you. I had been sort of mystified by this, thinking: if I were a better person, I'd try to figure it all out, but... but...

This makes all the difference in the world. Thanks.

Great post, publius. A sequel would be good, too!

Fair warning: the current FCC chairman, Kevin J. Martin, has shown a very strong tendency to favor big users.

Martin is young-ish (40, if memory serves) and had little background in telecoms before becoming chairman of the FCC. He was part of the Bush-Cheney 2000 legal team; politics junkies with good memories may remember him from the Florida recount. His wife was counsel to VP Cheney and, before that, to John Cornyn.

FCC Commissioners are appointed for 5-year terms, and by law, no more than three can be from the same political party. So there are currently 2 Democrats and 3 Republicans. (Chairman Martin counts as one.)

The Commissioner to watch is probably Jonathan Adelstein, a former Tom Daschle staffer who has been a vocal critic of FCC policy under Martin. He'll likely throw out the first ball.


Doug M.

thanks doug - i didn't know that. actually, i think the commissioner to watch is mcdowell. martin, in addition to what you say, is a wholly owned subsidiary of verizon. mcdowell (the new republican) is emerging as the swing vote, the justice kennedy if you will. b/c he chose to abstain from the AT&T merger, we got the first sort of net neutrality like conditions (fleeting, but a good precedent). Tate (the 3d republican) does what Martin and the Bells say.

Slart - I don't think that's really a trade-off. You need to think of the infrastructure involved in setting up a network - building a tower, getting permits, running lines, installing equipment on the tower. towers are really the key. once you've built the tower, run the lines, got the permits, etc., it's much easier to slap on a new piece of technology to handle more traffic. By analogy, it's easier to add another door if you've already constructed the building.

Now that we've seen digital cellular evolve, we can see the mistakes that have been made over the past quarter-century. What worked and made sense back in the '80s, with A-side and B-side bands and roaming charges for anything outside your local service area, is obsolete, but geographically small areas are still the method of allocating spectrum, and some of them are not being served at all.

My approach would be to guarantee AT&T, Sprint, T-Mobile, Verizon and any other prospective national carrier a license for the entire country in the higher-band spectrum, as long as they guaranteed to provide coverage in all areas with more than a low minimum density, say 1 person per square mile. The nationals would be allowed to pool their receiver/transmitters in other low-density areas and would be required to make any pooled receiver/transmitters or towers available to others for use.

Local, captive, government-sponsored, regional and other services would then have the lower-band spectrum available for their use for voice or data services.

These words from my Communications Theory professor always stuck with me: "The world is, for the most part, a low-pass filter." Then he said something about Nyquist and I fell asleep.

I started along the same path as Slarti, but then realized that even if you need smaller cells to manage users, the towers for those smaller cells will be cheaper to build or modify for lower-frequency use than they would be for higher-frequency use. And the indoor coverage will be better.

I'm not sure why we should have to decide whether or not it is 'too concentrated' now. Divide it up into small enough chunks that small guys have the opportunity to buy it.

If the economies of scale are enough that it is much better to be large, let the big ones buy up the smaller ones. (I know, I'm hopelessly pro-market).

I would tend to think that the stable equilibrium would be 2-4 large carriers nationwide with an additional 1 or 2 smaller ones in huge or super-concentrated markets. But I'm not wedded enough to my concept to force it with the rules.

This question illustrates the split between pro-market Republicans and pro-business Republicans.

I think publius is saying that the towers are the expensive part, hsh.

Which reinforces my point, I think, rather than discounting it: if the cost savings comes from fewer transmitters (whether the cost is in the tower itself or in the antenna and transceiver isn't all that crucial, imo), then you're going to take a hit at some point; that point being where operating at the lower tower density limits your call volume per tower per megahertz of spectrum.

Now, it could be that conventional cellular networks never, ever operate anywhere near peak capacity; I'd be surprised if that were the case, though. Usually you design to worst case, then add margin.

Again, I'm not a telecommunications guy, but I have taken a course in communications theory, once upon a time. And occasionally use what I learned, from time to time. You might say I'm long on signal communications theory (modulation, in other words), but short on network theory.

I guess I'm partly agreeing with you and partly with Publius, but on different levels. I would say it's cheaper to build a lower-frequency network, not because you need fewer towers, but because you need less of a tower for each cell. So I agree with you that you might need the same number of towers to manage your users on a per-cell basis, but I agree with Publius' larger point that a lower-frequency network is cheaper, if for a somewhat different reason. Of course, I could be completely wrong about all of this.

I'm not sure why we should have to decide whether or not it is 'too concentrated' now. Divide it up into small enough chunks that small guys have the opportunity to buy it.

Sounds OK at first glance, but I wonder if you get into holdout or squatter problems. I don't know enough about the technology to know if a small owner has the ability to create problems for a possibly more sensible large-scale provider.

Maybe that can be solved by requiring winning bidders to actually put what they buy to use. Maybe slarti or hsh or someone else who actually understands this stuff could tell us whether that's a problem or not.

Technically you are obligated to use the frequencies you are licensed for. If another party wants some piece of spectrum you're licensed for but not using, that party can petition to have it taken from you and granted to them if they can show non-use. I don't know if the pending auction will follow this same logic, but that's typically how radio licensing works in my experience.

Now, it could be that conventional cellular networks never, ever operate anywhere near peak capacity; I'd be surprised if that were the case, though. Usually you design to worst case, then add margin.

Slarti, this seems contradictory to me. If you design for worst case and add margin, you should expect the network not to operate near capacity. Or are you saying that "margin" is small and "worst case" is common, therefore you will sometimes be somewhat near peak capacity?

I don't know about cell networks, but I know that traditional telephone trunks are not designed for worst case (e.g. Mother's Day). They design for regularly encountered heavy use and leave you to try a few times to get through on Mother's Day. "We're sorry, but all circuits are busy..."

They design for regularly encountered heavy use and leave you to try a few times to get through on Mother's Day. "We're sorry, but all circuits are busy..."

I wonder if this is true, or if over time we have grown beyond the expected maximum use at the time of installation.

Excellent and highly informative post publius. This not something I have been paying attention to but I will certainly start now. Thanks.

Think about how you would allocate resources if you were a phone company. You would want your service to be reliable enough to keep customers and get their calls through so you could bill them. But, on the other hand, you wouldn't want to make a capital investment that will be under-utilized. It's an optimization problem. Google the word "Erlang" for a bit of insight.

femdem: areas without an (A)DSL connection in the Netherlands usually go for satellite broadband. Isn't that possible in your area too?

"Barring some technical advances, wires will always beat wireless"

This puzzles me: "always" is a terribly long time. I'm seriously doubtful that wires will have a significant advantage over wireless, in voice communications, in 5000 years. Or even in 500 years. Or even in 50 years.

I'm even doubtful about that as regards more complex and higher-bandwidth-requiring data communications. In 50 years, or 60, or 70, or 100, nothing will change? Why are "some technical advances" unlikely, rather than absolutely inevitable?

Always? Always?

I'm seriously doubtful that we're going to be installing interstellar wires in a few hundred years, though I certainly could be wrong, and would be thrilled to see the sight of those wires linking solar systems.

Oh, and word to the wise: hexapodia.

Google the word "Erlang" for a bit of insight.

In an earlier life I wrote software that dealt with this stuff. You generally establish a service standard which is something like "5% chance of a call being blocked at peak demand time." Then you put in enough capacity to meet that. The work we did was for private companies, not telephone utilities, mostly, so they may do things differently, but that kind of approach is the general idea.

The principles are very well understood and apply to a broad range of activities.
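For anyone who Googled "Erlang" as suggested upthread: the classic Erlang B formula gives the blocking probability, and sizing to a service standard like the 5% figure above is just a search over trunk counts. A quick sketch (the 20-erlang offered load is an arbitrary illustrative number, not a real cell's traffic):

```python
def erlang_b(offered_load, trunks):
    """Blocking probability for `offered_load` erlangs on `trunks` circuits,
    computed with the standard iterative Erlang B recurrence."""
    b = 1.0
    for n in range(1, trunks + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

def trunks_needed(offered_load, target_blocking):
    """Smallest number of circuits keeping blocking at or below the target."""
    n = 1
    while erlang_b(offered_load, n) > target_blocking:
        n += 1
    return n

# Hypothetical cell: 20 erlangs of peak-hour traffic, 5% blocking standard.
print(trunks_needed(20.0, 0.05), "circuits")
```

Note the shape of the curve: blocking falls off very steeply once capacity exceeds the offered load, which is why a modest margin over "regularly encountered heavy use" is usually enough.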

Slarti, this seems contradictory to me. If you design for worst case and add margin, you should expect the network not to operate near capacity. Or are you saying that "margin" is small and "worst case" is common, therefore you will sometimes be somewhat near peak capacity?

More like: design margin is likely going to be the same for a 700 MHz network as it is for, say, a 4 GHz network. So the network that serves a larger number of callers is going to be closer to the edge of failure. That's how I see it, anyway.

Think about how you would allocate resources if you were a phone company. You would want your service to be reliable enough to keep customers and get their calls through so you could bill them. But, on the other hand, you wouldn't want to make a capital investment that will be under-utilized. It's an optimization problem.

This is pretty much the point I was attempting to convey.

I would say it's cheaper to build a lower-frequency network, not because you need fewer towers, but because you need less of a tower for each cell.

Could be, but it also could be that tower cost is driven more by footprint than height. Anyway, way past the limit of what I usefully know or suspect, here.

Say, has anyone heard about the storm we had here this afternoon? My pool screen, which stayed intact during a direct hit from 100mph winds, is now full of larger-than-designed holes.

Designed by? That might have been the way it was ultimately intended to be. :^)

Publius: On second reading it is still good and I still thank you for writing it.

So the network that serves a larger number of callers is going to be closer to the edge of failure.

Depends on what you mean by "failure." If it's a blocked call, and the network is designed as I described, then this doesn't hold.

Slarti -- yikes. We had a petite thunderstorm here, but it sounds like nothing compared to yours. Plus, my tomato plants are all refreshed ;)

How about "failure to meet the design criterion"?

I know: circular reference. But yes, the failure probably won't be catastrophic, unless there's something that excess connection requests might do to bring a subnet down. Again: way past any semblance of expertise I might have, now.

Still: the performance of a larger cell in a densely-populated area is likely (almost certainly) to be worse than that of a smaller cell, if the smaller cell was sized properly to begin with.

We had a 20-minute-long hailstorm here, evidently, and my tomatoes are somewhat the worse for wear. I've got a friend whose house I'm keeping an eye on (while he's on vacation); he's got half a dozen Sweet 100 plants that are lying prostrate on the ground.

Sweet 100, for the uninitiated, are the only cherry tomato plants worth growing in this part of the country. Possibly the entire world.

thanks OCSteve -- do i get some brownie points that i can spend on future snark now? :)

Snark points? Sure.

OTOH, now that I know you can write such an informative snark-free post….

How's that for ungrateful? Give me free ice cream and I whine for sprinkles… ;)

Lower frequencies don't inherently result in longer range due to some fundamental law of physics. In free space, with no trees or buildings in the way, electromagnetic energy becomes more diffuse with the inverse square of the distance from the transmitter, independent of frequency. What changes is that a receiving antenna of a given gain shrinks at higher frequencies, capturing less energy. Increase the antenna size to capture the same amount of energy and free-space range stays constant; however, the antennas become much more directional. http://en.wikipedia.org/wiki/Free-space_loss

In real world non-line-of-sight (NLOS) signal paths, higher frequencies tend to suffer higher losses for the part of the signal path where radio signals get absorbed and reflected by trees and building materials. In the suburbs, this effect can be reduced by using relatively tall towers and lots of antenna downtilt to reduce the distance radio signals travel through the trees.

Various smart antenna techniques take advantage of physically small antennas at higher frequencies to achieve better spectral efficiency than would be possible at lower frequencies where arrays of antennas become too big. The tradeoff is cost and complexity go up with multiple transmitters, receivers, and antennas operating in parallel.
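The frequency-independence of free-space spreading, and the shrinking receive aperture that accounts for the extra path loss at higher frequencies, can be checked numerically. A small sketch (the 1 km distance and unity antenna gain are arbitrary illustrative choices):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def free_space_loss_db(freq_hz, dist_m):
    """Free-space path loss in dB between fixed-gain (isotropic) antennas:
    20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

def effective_aperture_m2(gain_linear, freq_hz):
    """Receive aperture of an antenna with the given gain: A = G*lambda^2/(4*pi)."""
    wavelength = C / freq_hz
    return gain_linear * wavelength ** 2 / (4 * math.pi)

d = 1000.0  # 1 km, arbitrary
for f in (700e6, 2.4e9):
    print(f"{f / 1e6:6.0f} MHz: path loss {free_space_loss_db(f, d):.1f} dB, "
          f"unity-gain aperture {effective_aperture_m2(1.0, f) * 1e4:.1f} cm^2")
```

The dB gap between the two frequencies exactly matches the shrinkage of the unity-gain aperture: hold the physical aperture constant instead of the gain, and the received power at a given distance is the same at both frequencies.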

"Sounds OK at first glance, but I wonder if you get into holdout or squatter problems."

Educated buyers know how to get around these problems (at least those related to opportunistic holdouts rather than crankiness): options. When the pieces are worth much less than the whole, you don't buy piece by piece. Instead, you buy options to pieces. Holding out then becomes a less profitable strategy than selling.

It's the only way they lay pipelines and railways these days (unless there are "eminent domain" laws in place that force sale at below market price, of course!).

