Timothy Butler has an interesting post over at Open for Business (OFB) titled Fewer Bars in More Places: AT&T Network Upgrades Degrade Service for 2G Phones. Here are a couple of quotes from Butler:
In an act affecting owners of 2G cell phones on AT&T Mobility’s network, including the highly visible, and originally highly expensive first generation iPhone, Open for Business has learned that AT&T has been quietly sacrificing 2G signal strength in an effort to speed up the build out of its next generation 3G network. The first generation iPhone was trumpeted by the company as recently as seven months ago; many 2G phones continue to be sold by the Dallas-based company today.
According to Butler, until recently AT&T primarily relied on the 850 MHz frequency band, which offers better indoor reception, for its 2G EDGE service. He says AT&T technicians confirmed to OFB that transmitters for the 2G signal used by the original iPhone and most other handsets, including most AT&T-offered BlackBerry and RAZR models, have been shifted to the weaker 1900 MHz band in some areas.
Cellularguru.net has a good frequency FAQ that describes the difference between the 850 and 1900 MHz bands:
What is the difference between the 850 and 1900 bands? They are the two different wireless bands available in North America. 850 was the original cellular band, and it was split into two, the "A" band and the "B" band. The "B" band was for the wireline phone company and the "A" band was for a non-wireline provider. The 850 band has been around for 15+ years, and the systems are very well built out. The FCC mandated that a certain amount of land be covered by a signal. The 1900 band was placed in operation several years after 850. The 1900 band is also known as PCS, and the two terms are used interchangeably, which can be confusing when trying to follow a conversation. There are 6 bands, A through F, and some of those can be split into others. The buildout requirements for 1900 were not as strict as for 850: only a certain percentage of the population needed coverage (67% IIRC). That means building out the urban areas pretty much met the entire FCC buildout requirement for a given area. From what I can gather, those rules have been relaxed even further.
Which is "better"? Here's more from
Cellularguru.net:
Which is better, 850 or 1900? In general, you are going to get more performance out of 850 than you are going to get out of 1900 for several reasons.
1. As mentioned earlier, back when the 850 licenses were issued, they had to cover a certain amount of land area. This required carriers to deploy their systems throughout many rural areas (not ALL, though). 1900 licenses only need to cover up to 67% of the population, and in many cases they don't even have to meet that.
2. The higher the frequency, the shorter the usable range. You need approximately twice as many 1900 MHz towers to cover a given area as 850 MHz towers. Most 1900 MHz towers are in urban and suburban areas. A properly built 1900 system will work as well as a properly built 850 system, but it will likely cost more to deploy and operate. Sometimes 1900 will work better in a city, because 1900 MHz signals tend to work better in the middle of a city with large buildings: the shorter wavelength allows the signal to go around corners more easily. Also, due to network loading, 850 towers have to be "turned down" in urban areas so as not to overload, so the playing field is leveled.
3. Leaving the technical details aside, it seems that 850 MHz signals penetrate most modern buildings better than 1900 MHz signals. There are many factors involved, such as the material of the walls, the proximity of the local cell towers, and various others. The fact that 850 MHz carriers have been in operation longer and have optimized their coverage is an important factor to consider. If there is a window nearby, chances are that either system will work, assuming there is some sort of signal available at the window! The bottom line is this: when you try out a service, make sure you bring your phone to all the areas where you'll be using it, to make sure it works where you need it.
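The frequency-versus-range tradeoff in point 2 above can be roughly quantified. Here's a minimal Python sketch using the textbook free-space path-loss formula; the 5 km distance is just an illustrative assumption, and real carriers plan coverage with empirical models that account for terrain and buildings, so treat these numbers as ballpark only:

```python
import math

C = 299_792_458  # speed of light, m/s

def wavelength_m(freq_mhz: float) -> float:
    """Wavelength in meters for a frequency in MHz."""
    return C / (freq_mhz * 1e6)

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.44 dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

d = 5.0  # assumed tower-to-handset distance in km (illustrative only)

for f in (850, 1900):
    print(f"{f} MHz: wavelength {wavelength_m(f) * 100:.0f} cm, "
          f"free-space loss at {d} km = {fspl_db(d, f):.1f} dB")

# The gap is independent of distance: 20*log10(1900/850) ~ 7 dB.
print(f"Extra loss at 1900 MHz: {fspl_db(d, 1900) - fspl_db(d, 850):.1f} dB")

# For equal free-space loss, range scales inversely with frequency,
# so a 1900 MHz cell reaches only about 45% as far as an 850 MHz cell.
print(f"Equal-loss range ratio: {850 / 1900:.2f}")
```

A roughly 7 dB deficit in free space is already significant, and indoors the gap tends to widen because building materials generally attenuate higher frequencies more, which lines up with the FAQ's third point and with the weaker indoor signal Butler describes after the shift to 1900 MHz.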
Here's more from the Butler piece:
OFB was able to confirm this situation for itself using multiple devices in St. Louis, MO, and also obtained information on similar cases across the country. Reports suggested the problem started to appear as AT&T ramped up its 3G network in preparation for the iPhone 3G in early 2008. Each AT&T technician OFB talked to concerning this problem offered the same solution: that the customer should purchase new, 3G-enabled equipment at the customer’s own expense.
I'm still one of those first-generation $400 iPhone users, so Butler's accusation concerns me. His piece goes on:
AT&T’s executive director of analyst relations, Mark Siegel, “categorically” denied to OFB that AT&T was advising customers to dump 2G equipment such as the iPhone for 3G versions. In a follow-up message, Siegel added that the company was not requiring anyone to switch to 3G equipment. Although that is technically true, customers in affected areas are all but required to upgrade due to the dramatic signal strength drop over the last few months.
Where's Apple on this?
Butler writes that OFB also attempted to reach Apple for comment, but had not received a response from the company by press time.