Tuesday, March 11, 2008

Campus Internet Access: Shall We Seek Another Way?

Over the past 5 or 6 years I've been to a lot of different campuses scattered around the country. I can't think of one that I've been to recently that was not struggling with bandwidth issues. Students, faculty and staff on college campuses are like sponges when it comes to bandwidth - we soak up as much as the provider can supply.

Accessing bandwidth hungry applications during peak usage times can be very frustrating - especially if that application is part of a lecture or exam. In addition to the cost of bandwidth, colleges and universities are also responsible for installation, 24/7 maintenance and upgrading of the network infrastructure.

Perhaps it's time to consider another way. I've written in the past about successful public/private partnerships and today came across an interesting press release from AT&T. Here's a piece from that press release:

The University of Houston and AT&T Inc. (NYSE:T) today announced the nation's first planned deployment of AT&T U-verseSM services into student housing on a college campus. The cutting-edge TV and high speed Internet services will be included in every room of a 547,000-square-foot residence hall under construction for graduate and professional students.

These kinds of relationships make a lot of sense to me - a public university contracting with a private company to provide services. AT&T will be responsible for installing, maintaining and upgrading its network, while the University of Houston will be responsible for teaching and student learning. I would think it also shifts a lot of BitTorrent/copyright liability from the University of Houston to AT&T. It makes sense for both the university and AT&T to go this way for a number of reasons.

Also, from AT&T's perspective, it gets their products into students' hands. An impressed and satisfied student is a future satisfied residential/business/wireless customer.
Here's more from the press release:


"We are delighted that University of Houston students will be able to enjoy the same advanced AT&T U-verse services as an ever-expanding number of consumers across the Houston area," said Ed Cholerton, AT&T vice president and general manager for the Houston area. "We share the university's commitment to the best communications and entertainment technology."

What will be next? I'm figuring on a wireless access option for students via an AT&T wireless network at the University of Houston.

It will be interesting to see how many other academic institutions and providers move in this direction.

Peer-to-Peer File Sharing

[Here's a recent piece I wrote for my monthly technology column in La Prensa, a Western Massachusetts Latino newspaper. To read a few of my previous La Prensa technology columns go here.]

Peer-to-peer (commonly referred to as “P2P” or “PtP”) networks are widely used to share music and video files on the Internet. Much of the illegal file sharing you hear about in the news happens over P2P networks. These networks are also used for legal file sharing but, in some ways, they have gotten a bad name because of the sharing of copyrighted materials.

You may recall the early version of Napster, a software program developed by Northeastern University student Shawn Fanning in 1999. Napster worked using a variation of a P2P network (some call it hybrid P2P) that used a centralized server to maintain a list of who was online and which MP3 music files each user had for sharing. Because Napster used a centralized server, it was easy to trace users and effectively shut the service down - which happened in July 2001, following a lawsuit filed against Napster by the Recording Industry Association of America (RIAA).
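Napster's centralized-index design can be sketched in a few lines of Python. This is an illustrative toy - the class, peer addresses and file names below are all made up, and it is not Napster's actual protocol - but it shows why a central server made the service easy to locate and shut down:

```python
# Toy sketch of Napster-style "hybrid P2P": one central index maps
# file names to the peers that have them; the actual file transfers
# then happen directly between peers.

class CentralIndex:
    def __init__(self):
        self.files = {}  # song name -> set of peer addresses

    def register(self, peer, songs):
        """A peer announces which songs it is sharing."""
        for song in songs:
            self.files.setdefault(song, set()).add(peer)

    def search(self, song):
        """Return the peers known to share a song."""
        return sorted(self.files.get(song, set()))

index = CentralIndex()
index.register("10.0.0.5", ["free_song.mp3"])
index.register("10.0.0.9", ["free_song.mp3", "other.mp3"])
print(index.search("free_song.mp3"))  # both peers are listed
```

Take the index server offline and no peer can find any other peer - which is exactly what happened to Napster.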

As Napster was going through its legal battle, programmers were working on file sharing programs that did not depend on a centralized server. Gnutella, released in 2000, was among the first fully decentralized protocols, and BitTorrent, created by Bram Cohen and first released in 2001, introduced the idea of downloading pieces of a file from many peers at once.

Hundreds of additional P2P programs have been created since. Some of the more common ones include Gnutella clients such as BearShare and LimeWire, FastTrack clients such as Kazaa and Morpheus, and a wide range of BitTorrent clients.

BitTorrent-type programs are true P2P programs: peers make ad-hoc connections directly to each other, so no central server handles the file transfers. Every computer running a P2P program contributes storage space, bandwidth and processing. As more people install and run the program, more files are uploaded and downloaded and more computers participate in the file sharing process.

Here's how a P2P download works. Let's say you want to download a song (one that can be legally distributed) and you have a P2P program installed on your machine. You start the program and type the name of the song into a search box. The program then goes out and looks for other users sharing that song. As users are found, the song starts to download to your computer. As more users sharing the same song are found, additional connections are made (the group of computers sharing a file is called a swarm, and each participant is a peer) and the download speed to your computer increases. Also, as you download the song, you start sharing the pieces you already have with other peers.

Popular songs and videos can have hundreds of peers participating in a single swarm.
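The scaling described above - more peers generally means a faster download - can be illustrated with a toy calculation. This is a deliberate simplification (real swarms are limited by your own connection speed, piece availability and per-peer upload caps), and the function and numbers below are made up for illustration:

```python
# Toy model of swarm download speed: each connected peer uploads
# pieces to us in parallel, so the rates add up.

def download_time(file_mb, peer_upload_mbps, num_peers):
    """Estimated seconds to fetch a file of file_mb megabytes when
    num_peers each upload to us at peer_upload_mbps megabits/sec."""
    file_megabits = file_mb * 8
    total_rate_mbps = peer_upload_mbps * num_peers
    return file_megabits / total_rate_mbps

# A 5 MB song, with each peer uploading at 0.5 Mbps:
for peers in (1, 5, 20):
    print(f"{peers:2d} peers -> {download_time(5, 0.5, peers):.0f} seconds")
```

With one peer the song takes 80 seconds; with twenty peers, only 4 - which is why popular files in large swarms often download fastest.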

If you use P2P programs, you need to be very careful to download only content that can be legally shared. If you download copyrighted content illegally, you can be caught, and lately some huge fines have been handed out to violators.

Also, be sure you are running up-to-date antivirus software and scan your system for spyware regularly - P2P networks can be used to spread malicious software.

Sunday, March 9, 2008

The iPhone Software Development Kit Podcast

Mike Q and I recorded "The iPhone Software Development Kit" podcast today. Below are the show notes. You can listen directly by turning up your speakers and clicking here. If you have iTunes installed you can get this one, listen to others and subscribe to our podcasts by following this link.
If you don't have iTunes and want to listen to other podcasts and read shownotes you can click here.

Shownotes:

Intro: On Thursday, March 6, 2008, Apple released the iPhone Software Development Kit (SDK) beta along with the App Store, a place where iPhone users will be able to get applications written for the iPhone. Apple also launched the Enterprise Beta Program.

Gordon: Mike, can you give us a quick rundown on what Apple released on Thursday?
Sure, much of our discussion today is based on an excellent post at macworld.com titled The iPhone Software FAQ. Macworld editors Jason Snell, Jonathan Seff, Dan Moren, Christopher Breen, and Rob Griffiths contributed to this article. They also thank Glenn Fleishman, Craig Hockenberry, and Daniel Jalkut for their feedback and contributions.

Here's how Macworld answered the question:

The SDK is a set of tools that lets independent programmers and software companies design, write, and test software that runs on the iPhone. Right now there's a beta version for developers, but a final version of the iPhone software that supports the installation of new programs written by independent programmers is due in late June.

As a part of the announcement, Apple introduced a new iPhone program, App Store, through which you'll be able to purchase, download, and update iPhone software. That will be available as part of the new iPhone Software 2.0 update in late June. That's when you'll be able to add third-party apps to your iPhone for the first time, at least via official channels.

Gordon: You blogged about your experience with the SDK - can you tell us about your first experience?
I downloaded the new iPhone SDK and wrote about my first impressions. I did quite a bit of FORTRAN programming more than 10 years ago, but haven't done a whole lot lately. The SDK took a long time to download (about 2 GB) over my wireless connection, and about 45 minutes to install. I also downloaded a couple of the sample applications Apple provides (roughly 1 MB each). In about 15 minutes - it would have been shorter if I knew what I was doing - I was able to open a sample, compile it and run it on the simulator Apple provides.

I have no doubt that this is going to have a huge impact on mobile application development. It's really easy and really cool. If you teach programming, I suggest you download the SDK today, install it in your labs, and have your kids developing and running native iPhone apps by Monday afternoon. Get the SDK here. Even better, download Jing and have your students record the simulator running their iPhone apps, then embed the videos in your department or faculty web page - great for marketing! Wish I was 20 again!

Gordon: And you actually compiled a little Kalimba (African Thumb Piano) app. Where can we have a look?
You can go to my blog at http://q-ontech.blogspot.com/2008/03/iphone-sdk.html

Gordon: Apple is taking 30% of what is sold from the App Store - will shareware apps be available or will we have to pay for everything?

That's a good question and one that was sort of answered in the macworld.com post. Macworld assumes Apple won’t let you sell a “free” program that requires an unlock code. However, there are some other scenarios we expect to see. First, donationware: People will probably sell “free” programs that request that you make a donation if you want to keep the project going. We don’t think Apple will have any problem with that, since the donation would be voluntary. Second, it’s possible that you’ll see two versions of various iPhone programs: a free “lite” version that’s a good advertisement for a more feature-rich for-pay version.

Macworld also mentions Iconfactory’s Twitterrific, a Mac program that is free, but contains ads. For an “upgrade” fee, users can shut off the ads. Whether Apple would allow this to be handled within the program or there would need to be two separate versions of an iPhone version of Twitterrific remains to be seen.

Gordon: On Thursday, five companies demo'ed applications - can you give us a brief summary of what was shown?
From Macworld: Five companies showed off what they were able to put together with two weeks of engineering work and very few people involved. There were games from Electronic Arts (Spore) and Sega (Super Monkey Ball), an AIM client from AOL, medical software from Epocrates, and business software from Salesforce.com. The programs took advantage of the iPhone’s built-in accelerometer, Multi-Touch capabilities, interface elements, and more.

Gordon: I'm going to go back to the Macworld post again and take some questions directly from that FAQ for you to answer:

1. What kind of stuff does Apple say it won’t allow developers to create?
2. What if someone writes a malicious program?
3. What’s a “bandwidth hog?”
4. Can I buy these programs on my Mac, or just on the iPhone?
5. What about software updates?
6. What if you’ve synced your phone on one computer and then restore it on another? Do you lose your apps until you sync to the original?
7. If I buy a program for my iPhone, can I also transfer it to my significant other’s iPhone?
8. Can I download programs off the Web, or any place other than the App Store and iTunes?
9. What about internal, “private” software? What about beta testing?
10. Can I try the iPhone SDK and how could it be used in the classroom?

Gordon: Apple posted a roadmap video - can you tell us a little bit about that?
You can watch Steve Jobs' presentation and see what's ahead at http://www.apple.com/quicktime/qtv/iphoneroadmap

We hope you enjoy this 48 minute podcast!

Thursday, March 6, 2008

Internet Protocol Version 6 (IPv6): An Excellent White Paper

Yesterday, 3G Americas published an excellent white paper titled Transitioning to IPv6. The white paper is aimed specifically at wireless providers and includes a lot of good content on making the transition. Here's a quote from a 3G Americas press release about the white paper.

The white paper by 3G Americas addresses the problems that will occur when new IPv4 address blocks are no longer available. Service providers will face increasing capital expenses and numerous challenges when attempting to operate their networks efficiently on a limited number of IPv4 addresses. Not only does transitioning to IPv6 solve the address exhaustion problem, it will likely enable new services perhaps impossible in an IPv4-only world. The 3G Americas’ white paper strongly recommends that rather than wait for the inevitable difficulties to arise, service providers should begin planning their transition to IPv6 as soon as possible.
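To get a feel for the scale of the exhaustion problem the white paper describes, it helps to compare the raw sizes of the two address spaces - a quick back-of-the-envelope calculation of my own, not taken from the white paper:

```python
# IPv4 addresses are 32 bits long; IPv6 addresses are 128 bits.
# The difference in address space is staggering.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(f"IPv4: {ipv4_space:,} addresses (about 4.3 billion)")
print(f"IPv6: about {ipv6_space:.2e} addresses")
```

About 4.3 billion IPv4 addresses for a world of billions of people - and soon billions of wireless devices - versus roughly 3.4 x 10^38 IPv6 addresses. It's easy to see why the transition can no longer be delayed.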

The white paper takes a good look at how wireless providers will move their networks to IPv6 and uses three detailed case studies:

Case Study 1: Video Share service
Case Study 2: Gaming services
Case Study 3: Blackberry service

Using these case studies, the white paper provides recommendations on:

1. Developing a transition plan;
2. Using a phased approach;
3. Developing a solution for IPv4-IPv6 internetworking; and
4. Addressing security considerations.

Chris Pearson, President of 3G Americas, is quoted in the press release:

The need to transition to IPv6 is upon us. The Internet continues to expand at a rapid pace, with wireless devices becoming major users of IP addresses. Transitioning to IPv6 will take significant effort, but it can no longer be delayed.

The white paper is 23 pages long (including a great glossary) and provides some excellent reading/classroom material - I'll be using it in the advanced telecom course I'm teaching this semester. You can download it here.

Tuesday, March 4, 2008

U.S. Fiber to the Home (FTTH) Ranking = Eighth in World

Last week, the FTTH Council North America, Europe and Asia-Pacific released a world rankings document titled Fiber to the Home Deployment Spreads Globally As More Economies Show Market Growth.
The report lists 14 economies in the world that currently have more than 1 percent of households directly connected to fiber networks. According to the release:

The global ranking follows the unified definition of FTTH terms announced by the three councils last year, and which has formed the basis for recent market research by each council. For completeness and accuracy the ranking includes both FTTH and FTTB (fiber-to-the-building) figures, while copper-based broadband access technologies (DSL, FTT-Curb, FTT-Node) are not included.

The United States has doubled its penetration rate to 2.3 percent over the past year, moving up three places to eighth position. This doubling is no doubt driven by the Verizon FiOS rollout in this country.

Joe Savage, President of the FTTH Council North America, is quoted as follows:

“We’re delighted to see the U.S. moving up the global ranking, indicating a good beginning is underway. FTTH leadership, demonstrated by those leading countries, shows full national deployment is achievable. The future belongs to those countries that satisfy the broadband consumer’s need for speed. Our members – the FTTH equipment vendors and the service providers – are ready to help make it happen on a wide scale across North America.”

Here's a quote from Schoichi Hanatani, President of the FTTH Council Asia-Pacific:

"It is no accident that Asia-Pac continues to be the fastest growing region for FTTH in the world, with more subscribers connected on fiber than all other regions combined. The rollout of FTTH has been encouraged by forward-looking governments and regulators in the Asia-Pac region for several years now. They understand that FTTH is a key strategic national infrastructure."

Read the full release and get more information on the FTTH Council web site at www.ftthcouncil.org

Monday, March 3, 2008

Google's Trans-Pacific Fiber Optic Cable Project

Last week, on February 26, Google and five other international companies announced the Unity consortium. The group has agreed to build a five-pair, 10,000-kilometer fiber optic communications cable connecting the United States and Japan. According to a Google press release, each fiber pair will be capable of handling up to 960 Gigabits per second (Gbps), and the cable system will allow expansion to up to eight fiber pairs.

At 5 pairs: (5 pairs) x (960 Gbps/pair) = 5 x 960 x 10^9 bps = 4.8 x 10^12 bps = 4.8 Terabits per second (Tbps)

At 8 pairs: (8 pairs) x (960 Gbps/pair) = 8 x 960 x 10^9 bps = 7.68 x 10^12 bps = 7.68 Terabits per second (Tbps)
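The same arithmetic as a quick script, handy for trying other configurations (the 960 Gbps figure comes from the press release; the function name is mine):

```python
# Aggregate Unity cable capacity: each fiber pair carries up to
# 960 Gbps, per Google's press release.
GBPS_PER_PAIR = 960

def total_tbps(pairs):
    """Aggregate cable capacity in terabits per second."""
    return pairs * GBPS_PER_PAIR / 1000

print(total_tbps(5))   # initial five-pair build
print(total_tbps(8))   # fully expanded eight-pair system
```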

The Unity consortium companies are:

Bharti Airtel - India's leading integrated telecommunications services provider.

Global Transit - A South Asian IP Transit network provider

Google - You know who they are!

KDDI - A Japanese information and communications company offering all communications services, from fixed to mobile.

Pacnet - An Asian company that owns and operates EAC-C2C, Asia's largest privately-owned submarine cable network at 36,800 km with design capacity of 10.24 Tbps.

SingTel - Asia's leading communications group providing a portfolio of services including voice and data services over fixed, wireless and Internet platforms.

By partnering with these providers, Google will be extending its reach into Asian markets - combined, Bharti Airtel and SingTel have more than 232 million mobile and landline customers. In addition, the system will connect into other Asian cable systems to reach even more customers. Here's more from the Google press release:

According to the TeleGeography Global Bandwidth Report, 2007, Trans-Pacific bandwidth demand has grown at a compounded annual growth rate (CAGR) of 63.7 percent between 2002 and 2007. It is expected to continue to grow strongly from 2008 to 2013, with total demand for capacity doubling roughly every two years.
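It's worth working out what those growth figures compound to - my own arithmetic, not from the report:

```python
# TeleGeography's reported trans-Pacific demand growth, 2002-2007.
cagr = 0.637   # 63.7 percent compound annual growth rate
years = 5      # 2002 through 2007
growth = (1 + cagr) ** years
print(f"Demand multiplied by about {growth:.1f}x over {years} years")

# "Doubling roughly every two years" going forward corresponds to
# a CAGR of 2**0.5 - 1, i.e. about 41 percent per year:
forward_cagr = 2 ** 0.5 - 1
print(f"Implied forward CAGR: {forward_cagr:.1%}")
```

At 63.7 percent per year, demand grew nearly twelve-fold in five years - which puts the scale of a 7.68 Tbps cable into perspective.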

It's interesting to see competing Asian market providers partnering in a system within a system, with each having ownership and management of individual fiber pairs - a testament to the power and influence Google has.

NEC Corporation and Tyco Telecommunications will build and install the system with completion by the first quarter of 2010.

Sunday, March 2, 2008

Motivated and Committed People = Outstanding Work

I've been back and forth to Dallas a couple of times the last two weeks - first for a futures conference presentation and this past week for a two day National Science Foundation (NSF) Advanced Technological Education (ATE) Convergence Technology Center at Collin College visiting committee meeting.

At the futures conference I spoke on globalization - specifically, how college courses need to morph to properly prepare students for today's and tomorrow's work. The reception, hospitality and quality of the event were outstanding, and I am so thankful I get invited to these kinds of events. I learn so much listening to other speakers and talking with attendees.

Last week was the two day visiting committee meeting - larger National Science Foundation grants are required to appoint a National Visiting Committee (NVC) that meets once a year. According to the NVC Handbook published by the Evaluation Center at Western Michigan University, these committees are groups of advisors that work with grantees and NSF to help them achieve their goals and objectives. They assess the plans and progress of the project and report to NSF and the project leadership. Committee members also provide advice to the project staff and may serve as advocates for effective projects.

At the NVC meeting, among many things, we had a lot of excellent discussion about the current and future state of converged communications and networks - what many are now calling unified communications/networking. I'd like to especially thank President Cary A. Israel and Executive Vice President Toni Jenkins from Collin College, along with Director Ann Beheler, Ann Blackman, Helen Sullivan and many others from the Convergence Technology Center at Collin College, for their hospitality, commitment, work, understanding and dedication to their students. It's always wonderful to see excellent work being done - especially when it is funded with taxpayer dollars.

Here's a photo of the NVC student lunch presenters (click to enlarge) taken on Thursday - each with a different story, and each absolutely EXCELLENT. You can check out my iPhone Tumblr photoblog of both events (and a lot of other events) at http://gsnyder.tumblr.com/ - scroll down to see all photos.

I'll get back on my five per week (or so) blog schedule this week - I've got a bunch of them started and I'm not going anywhere for the next couple of weeks!

Thanks again to all at Collin College in Texas.