Archive for the ‘Uncategorized’ Category


It is a different time we live in, one where we obsolete our tools long before they are obsolete. Cell phones are routinely replaced long before the two-year contract that many sign with their providers has ended. This has little to do with functionality; even though battery life and new features may play a role in the decision, it has more to do with the cool factor.

And for those who are sporting a two-year-old Motorola Razr, the new iPhone 3G looks pretty sexy, if sexy is what turns you on. I’ve heard that some people actually use many of the new functions available while they’re impressing their friends. For me, being the near Luddite that I am, I want to be able to make a phone call without being disconnected and occasionally send a text message, even though an IM would be infinitely more desirable.

But all of that is about to change, and change in a major way.

For those of us who believe communications is important, you know, the kind of people who actually care about spelling and try to make sure we are clear in what we write, we have long understood the shortcomings of the written word. Heck, most of us are envious of people like Joseph Heller or Kurt Vonnegut for their incredible ability to sculpt the written word, elevating writing to an actual art form. We have long known that communicating in writing largely removes the ability to convey emotions or even sarcasm – that aspect is stripped away, leaving the cold, written word to stand on its own.

The telephone was an improvement; at least the inflection in the voice would make its way from one end to the other, but the inability to look someone in the eye, to pick up the non-verbal cues, or to “read” body language still left one at a disadvantage.

But all of that is about to end…

We now are at the beginning of a new age, one where telepresence allows one to have a “face to face” conversation from halfway around the world, a world where no matter where you are, you’re there (if you want to be) and a world where the ability to hide your true emotions behind a voice call may slowly disappear. The author makes no judgment as to whether this is a good or bad thing; it simply is, and that change will need to be accounted for, as well as gotten used to – because, like it or not, our world is about to be transformed.

It should be pointed out that this is not just a change in communications, it is a change in access. A shift to where a firefighter will have the information necessary to know what the floor plan in a building looks like and where dangerous chemicals might be stored. As technology progresses he might have real-time information as to who remains in the building, right down to what room they are in and even the health of the person trapped there – all provided at a cost to our privacy.

This will be a time when all the information anyone would need would be right there with them, wherever they go, accessible as necessary – obviously for a fee and only to those who can afford it.

And that may be where we will need to examine what we have built. With all the potential for good this may do, have we built something that is useful to society? Will this further exchange, where we have given up even more of our privacy, be worth it when the advertising we receive knows where we are, that it has been six hours since we last ate, and that we are probably hungry, as an unfeeling device pushes a list of restaurants that serve our favorite foods, all the while cross-checking our bank account to screen out the restaurants we cannot afford? From there, will it report to our doctor that we cheated on our diet, or even prevent the restaurant from serving us what we ordered in favor of something healthier? Or will this service become more mercenary as it targets the most vulnerable among us and feeds them directly to the sharks most likely to take advantage of them?

After all, this is a for profit service…

On Tuesday, June 17th, Slashdot linked to this article, which reports that a private investment group, Network Acquisition Company LLC, is assuming control of what used to be Earthlink’s network in Philadelphia.

When I tried to check out who Network Acquisition Company LLC is, I ran across this article which details the founders as being Derek Pew, Mark Rupp, and Richard Rasansky. While I don’t recognize either Mark Rupp or Richard Rasansky, Derek Pew was part of the original Wireless Philadelphia team and someone who actually has a background in telecommunications and networking. In fact, Mr. Pew might have been one of the only people associated with the project originally who actually had any experience in this field. It is my understanding that Mr. Pew left Wireless Philadelphia, if I had to guess, out of frustration, and I am glad to see that he has reentered this project; this is a man who has the ability to save this project from ruin.

With all that said, I am going to offer some advice to the new owners of this project and hope that the advice is taken in the spirit it is offered.

    1.) You’re in for a forklift upgrade, maybe not right away, but sooner rather than later.

    2.) Using the tried and abandoned page view revenue model is not going to work.

    3.) You get one chance to establish a reputation; if you miss that, you are all done.

Let me expand on those three suggestions a little further.

1.) If you believe you are going to fire up the rest of this network and use it for a year or two, I would like to direct you to the Slashdot comments as well as the comment posted by Beth on the WirelessPhiladelphia site.

Both of these users of this network are telling you that their experience was miserable, and the old business wisdom applies: if one customer calls to tell you that you have a problem, there are a hundred more that didn’t call.

I am also going to go on record as saying that the $200 repeater which was mentioned in this article is not going to fix the problem.

2.) With respect to the advertising revenue being a component of your bottom line, the reality is that the commonly discussed method is not even worth the time to set it up. If you need proof of that, I would be glad to point you to several network operators that have tried it, and they will tell you that you would be better off spending your time walking the streets of Philadelphia and picking up pennies.

That said, this does not mean that there aren’t other ways of generating revenue from advertising. I am working with another network operator on a failed citywide wireless project and we have developed a method of leveraging advertising that will generate some healthy revenue.

3.) Reputation – this is something I know you understand and I am sure you realize that you are starting behind the proverbial eight ball picking this network up at the point you are. I think it is well understood that you will be granted a honeymoon period by the city – but that honeymoon will be viewed by a cynical spouse. By banking that you are going to be able to “fix” the existing network or that the remaining portion of the network can be brought on-line with better results than the portion that is already operating, I’m going on record as stating that this will be time wasted.

Good luck Philadelphia, we’re pulling for you.

On May 7th, this article appeared in the Philly Metro saying,

“Earthlink, which stopped accepting new customers last week, has given the city until tomorrow to come up with a plan to take over the system or it could begin to take down the network, according to sources close to discussions. An original deadline of last Wednesday came and passed.”

On May 13th, this article states,

“The company said it is providing customers in the area with service through a 30-day transition period that ends June 12. It is getting in touch with these customers to give them information about the termination and help move them to other EarthLink Internet services.”

In the interest of making sure that analyzing what took us from a brilliantly conceived concept to the shambles we see today doesn’t turn out to be the most valuable asset of this entire affair, I would like to present the following perspective.

What happened when –

At the onset, no one could have predicted that inquiring how much it would cost to build a wireless network covering all of Philadelphia would spark a nationwide debate and give birth to a new industry. What went from a proposal to build the network for $10 million, then a $5 million counter proposal, to Earthlink proposing to build the network at no cost to Philadelphia, has provided an exciting ride.

When the details of Earthlink’s plan were first announced, I went on record saying that the network wouldn’t work, it couldn’t work. This realization had nothing to do with me having any particular prophetic powers, perish the thought; the basic numbers just didn’t add up. In fact, what few numbers were being released not only didn’t add up, they appeared to show that no one was paying attention to the business model. While that aspect was disturbing, when it came to the network engineering, the originally announced design caused waves of outright derision from the Wireless Internet Service Provider community – the very people Wireless Philadelphia and Earthlink should have been engaging in this discussion.

For those of us who had actually built these networks, what was publicly being stated was simply not possible to deliver. Of course, this may seem easy to say from today’s vantage point, but the record is out there, preserved in the internet archives for anyone who cares enough to read through it all.

Certainly, many things went right and this, in and of itself, is astounding considering that there was no blueprint to work from, no established baseline or best practices to adopt. The idea that Digital Inclusion would need to be an integral part of this network was something that hit the mainstream, as did the incredible job of holding public meetings in an attempt to educate the entire city as to what they might expect. The work that Dianah Neff did was nothing short of Herculean and regardless of the outcome, moving forward, she should be lauded for managing to accomplish what she did. There were also countless others, including Karen Archer-Perry, to name only one, who worked miracles to move this project along and there needs to be recognition for their work, as well.

At the same time, Wireless Philadelphia’s reliance on Earthlink “taking care of everything,” while incredibly seductive, was naive, to put it nicely. And to be completely blunt, if the mantle of failure has to be accepted by anyone, Earthlink needs to step up and face the facts. Yes, the network design was handed to Motorola (who chose to partner with Tropos) and while we should be looking at these two companies as having some responsibility in this mess, Earthlink’s name was on the project and they should have made sure that their project would do what they were promising the world. There was also a pilot project built using Alvarion equipment that performed very well but was not chosen; why, I’m not sure we’ll ever know.

We’re here. What now? -

Certainly, everyone has an opinion. Ethos published a report, which can be read here, describing their vision and how they would address the problem, a solution I do not hold as being desirable or sustainable.

While I will not attempt to put forth a 64 page report here, or even try to explain why I don’t believe Ethos is on the right track, I will state, very simply, that there is a workable plan, based on technology available today, that could take this misstep, learn from the mistakes and build out a world class network from there.

First off, let’s clearly articulate four mandates that must be adhered to if this, or any such project, is to succeed.

    1.) The network must be technologically sustainable.
    2.) The business model must make financial sense.
    3.) There must be compelling reasons for the customers to buy in.
    4.) The plan must include a long term process for continual improvement.

Technological Sustainability -

There is an accepted wisdom in this industry that bandwidth demands will continue to increase, at an ever-increasing rate, for the foreseeable future. Based on that wisdom, any network built, regardless of technology employed, will be temporary and as newer products are introduced they will need to be integrated into these networks. In the license exempt wireless industry, the useful lifespan of any equipment is generally considered to be three years. With respect to the fiber industry, this period of usefulness is extended to a decade or more, which should be expected based on the cost differential between these two technologies.

One fact needs to be clearly understood: in order for any network to continue to be indispensable, there will always be two distinct components, “fixed” and “mobile” connectivity.

Business Model -

Borrowing from a century’s worth of experience that the telecommunications industry has accrued, one well documented reality must be adhered to – businesses will pay more than residential customers for a business class service. This means that if you want to build a business model that will work, you need to have services that the business community will seek out and purchase. This also means that “best effort” WiFi technology is not going to meet those requirements, nor is the promise of the same services offered by everybody else in the industry going to differentiate your offerings from the pack.

In today’s competitive world, the bar is set pretty high and any new competitor in this arena MUST offer not only competitively priced services but services that cannot be purchased from anyone else.

Specifically, any such venture must provide a speed/price component that noticeably beats the competition while meeting or exceeding the reliability metrics that anyone else can offer. At this time, the bar has been set so that any entity looking to penetrate this market must be able to deliver a service that equals Verizon’s Fios, which currently delivers 30Mbps symmetrical to those who wish to pay for it. That’s right, if you think you’re going to make a huge hit in any market by offering 5Mbps (forget 1.5Mbps – or dialup 2.0, as I like to think of it), you need to find a new business. However, this is the entry point for fixed communications; to succeed you will also need to be able to deliver a five 9s reliable, competitively priced connection that can reach multi-gigabit speeds, and this service must be installable within a very short window of time – as in three business days.
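
To put “five 9s” in perspective, here is the quick arithmetic behind that availability figure – a minimal sketch, nothing more:

    # 99.999% availability leaves only a handful of minutes of downtime per year,
    # which is the bar a business class service has to clear.
    minutes_per_year = 365.25 * 24 * 60

    for nines, availability in [(3, 0.999), (4, 0.9999), (5, 0.99999)]:
        downtime = minutes_per_year * (1 - availability)
        print(f"{nines} nines: about {downtime:.1f} minutes of downtime per year")

    # 3 nines: ~525.6 minutes, 4 nines: ~52.6 minutes, 5 nines: ~5.3 minutes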

Additionally, make this service portable, meaning that wherever the customer goes, the service is available, albeit with the understanding that multi-gigabit speeds will not follow them everywhere. So, when customer “A” leaves their office, their wireless device will continue to keep them connected to the network. When this same customer heads home (assuming that they live inside the network’s 100% citywide cloud) their service follows them there.

There is another component to the revenue stream that must be included in order for this business to thrive: applications. While an entire book could be written about this subject alone, it is the applications that are provided on these networks that create the demand – and if you aren’t supplying more and better applications than the competition, why would your potential customers seek out your services?

Compelling motivations for adoption -

As mentioned above, applications that are only available on your network will oftentimes sway the customer in your direction and may be an even more persuasive argument than price/performance alone.

Without getting too far into the subject, applications come in two distinct categories: revenue producing and quality of life enhancing. Three revenue producing applications that are commonly paid for currently are voice (telephone/cell phone), Internet service and television/video entertainment. It is the author’s belief that in the next 24 to 36 months voice and video will cease to produce revenue (in any meaningful manner) and will become part of the value added when one subscribes to Internet service. Please adjust your revenue projections accordingly. In place of those applications, I see video security monitoring, remote medical diagnostics and monitoring, as well as mobile finance coupled with location based advertising, as being some of the revenue producing applications which will supplant voice and video revenue streams.

Turning to the quality of life applications, the ability for a parent who cannot be physically present at their child’s activities, such as a baseball game, recital or school play, to attend these events remotely is a compelling argument for adoption of a service. This same rationale holds true for a parent who would like to be able to check in on their child at day care or check on an elderly parent at home.

To highlight a valuable business application that could be implemented (and is not available on the competition’s network), I heard a presentation given by a VoIP provider who offers a service that allows small businesses, ones that might only have two or three lines, to “stack calls” during times when their line capacity is exceeded. As an example, he mentioned that in a college town, he had a customer that ran a pizza delivery service. Under normal business loads two phone lines were all that the customer required, but when there was a sporting event or on weekend nights, just before closing, the call volume went through the roof. To answer this need, this provider created a service where, when a customer called and all the lines were in use, instead of receiving a busy signal or being shunted off to an answering service, the call was kept alive with “on hold” music until a live person could answer. This allowed the business to handle more volume than they could otherwise and minimized the occurrence of callers that would have normally dialed the competitor. The cost for this service? A mere quarter per call! While this was a service that was certainly a bargain to the business owner, it also provided a healthy revenue stream to the service provider, especially when all of the competing businesses in the area subscribed to this service.
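
For those who like to see the mechanics, here is a minimal sketch of the idea – not the provider’s actual system, just an illustration of holding calls instead of returning a busy signal and billing a quarter for each held call:

    from collections import deque

    LINES = 2                     # the pizza shop's two phone lines
    FEE_PER_STACKED_CALL = 0.25   # the quarter-per-call charge mentioned above

    active = set()
    hold_queue = deque()
    stacked_calls = 0

    def call_arrives(caller):
        global stacked_calls
        if len(active) < LINES:
            active.add(caller)          # a line is free, connect immediately
        else:
            hold_queue.append(caller)   # all lines busy, keep the call alive on hold
            stacked_calls += 1

    def call_ends(caller):
        active.discard(caller)
        if hold_queue:
            active.add(hold_queue.popleft())  # the next held caller gets the freed line

    # A busy Friday night: ten callers arrive before any call finishes.
    for c in range(10):
        call_arrives(c)
    for c in range(10):
        call_ends(c)

    print(f"Calls held instead of hearing a busy signal: {stacked_calls}")
    print(f"Provider revenue from stacking: ${stacked_calls * FEE_PER_STACKED_CALL:.2f}")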

One more application that is network agnostic but should be adopted into the offerings presented by any savvy network operator: Telepresence. While you may not be familiar with this term, eventually this will be a must have service for any business whose people routinely traveled in the past. The ability to hold a face to face meeting with someone without leaving your office (or perhaps, only having to travel to your friendly, local Telepresence center) will become more commonplace as the cost of travel rises – and I don’t know anyone that is predicting the costs will significantly decrease in the longer term.


Continued improvement -

And now we get to the least palatable aspect of this discussion.

Simply building a network and marketing its services is not going to cut it, in case you might have thought otherwise.

This network, or any network, for that matter, will need a plan that will balance revenue coming in with the forecast expenditures which will be ongoing forever.

While a detailed plan could also take up more space than I am willing to dedicate here, the reality is this will be a phased project with the next decade planned out.

Phase one –

The network will be built using a fiber backbone that delivers connectivity to all areas of the city with more than one fiber connection provided to each area for redundancy’s sake. This fiber construction has already begun in Philadelphia, and will not stop, remaining ongoing until every single building in the city is eventually connected, perhaps over the next decade.

From the initial fiber buildout, rings of multi-gigabit wireless (and/or Free Space Optics) devices will be deployed to become the next layer of the network. This series of rings will be engineered to allow companies/organizations that want ultra high speed service to be connected FIRST, as they are the premium customers.

Further down, this is the stage where the next layer of wireless rings will deliver connectivity measuring in the hundreds of megabits. From this layer we will then connect the next group of customers that are looking for 30 to 100Mbps service – as they are also a premium class of customers.

At this point, we have reached the level where the “edge” network can now connect to the main network backbone. Please note – this is what Earthlink/Motorola did, only at far lower speeds. It should be noted that nowhere in this design is Mesh Technology specified and that the equipment I would spec out to connect at the edge would be very different from what exists currently.
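
To make the layering easier to picture, here is a rough sketch of the phase one tiers – the capacity numbers are illustrative placeholders, not an engineering specification:

    # Each entry is one tier of the proposed phase one build, from the fiber
    # backbone down to the wireless edge. Capacities are placeholder figures.
    phase_one_layers = [
        {"layer": "fiber backbone",            "capacity_mbps": 10_000, "serves": "redundant feeds to every area of the city"},
        {"layer": "multi-gigabit rings",       "capacity_mbps": 2_000,  "serves": "ultra high speed premium customers"},
        {"layer": "hundreds-of-megabit rings", "capacity_mbps": 300,    "serves": "30 to 100Mbps premium customers"},
        {"layer": "wireless edge",             "capacity_mbps": 50,     "serves": "fixed residential and mobile users"},
    ]

    # Sanity check: every tier should be fed by a faster tier above it.
    for upper, lower in zip(phase_one_layers, phase_one_layers[1:]):
        assert upper["capacity_mbps"] > lower["capacity_mbps"], (
            f"{lower['layer']} cannot be fed by the slower {upper['layer']}"
        )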

As things stand, this phase of the project would be projected to take eighteen months to two years, assuming that all political roadblocks were removed and adequate resources were dedicated to the task.

Phase two –

Complementing the first phase’s fiber construction, the fiber network is now extended to areas with the highest demand. In this phase we begin to see the highest-demand users (small business and residential) now being connected directly to the fiber backbone, freeing up the ultra high capacity wireless layer to deliver more connectivity to the mobility layer while also remaining in place as backup to the fiber.

Concurrently, more ultra high capacity equipment is now put in place so as to add total capacity to the next layer down, upgrading it from hundreds of megabits of throughput to gigabit (or multi gigabit) as needed.

At the edge, distribution is now increased to a shared 100Mbps allowing the residential users the ability to connect (fixed) at speeds burstable to 50 Mbps while still maintaining the WiFi connectivity which allows many mobile users to continue using mobile VoIP/PDA solutions. Depending on market conditions, we may also be looking at deploying WiMAX or potentially another technology, (LTE?) at this time.

With the conclusion of this phase, approximately 15% of all buildings in Philadelphia will now be connected directly to fiber and fiber is now available at every block. This phase is expected to last a total of three years, not including the work done in phase one.

Phase three -

At this time, work begins in earnest to connect every single building to the fiber backbone, with the ultra high capacity wireless being kept in place for backup and for areas where it would be difficult or prohibitively expensive to install fiber. This would include problem areas such as rivers, railroads, or highways. Upon completion of this phase, at the edge, every single wireless access point is now connected directly to fiber with a wireless connection for redundant backup.

While all this is happening, the originally deployed wireless edge equipment will be upgraded or replaced, depending on demand. By the end of phase three it should be expected that almost all of the existing edge wireless gear will now be experiencing its third complete upgrade. And in order to now service the expected demand, we will also be replacing the layers that feed the edge with state of the art wireless equipment which at that time would probably consist of 10Gbps at the uppermost ring (now dedicated almost exclusively to backup duty) with the multi gigabit radios that used to be utilized in the uppermost ring being redeployed to the next layer down, and the gear removed from that ring finding a use farther down the wireless backbone.

At the conclusion of phase three, three years after its inception, 85% of all buildings in Philadelphia will now be directly connected to fiber and the continued evolution of the wireless cloud will have kept pace with the state of the art.

Phase four -

Now we have hit the home stretch, with only a small percentage of cleaning up to do. However, while the percentage of total numbers is small, these will be the most difficult builds and we have allocated two full years to complete this phase.

While this will conclude the fiber portion of the buildout, all 139 miles of Philadelphia is completely connected to the network by fiber, we are still dependent on a constant upgrading process to keep the mobility segment of this network alive.

There are several aspects of this network proposal that are not illuminated here, nor could they be: administration, pricing, cost for each phase, adoption rates and the justification for those predictions, for instance – but all of that information can be made available.

It’s still all about the money –

So, how does this get paid for? Strangely, this is actually the easy part.

I would point your attention to the following quote, taken from here.

“Atlanta-based EarthLink has given up on its Wi-Fi plans across the country. In Philadelphia, the Internet service provider said it could not find a buyer for the network it spent $17 million to build, and talks to donate the network to the city or a nonprofit organization failed, even after it sweetened the offer with $1 million in cash.”

Hey – here’s a tip – TAKE THE NETWORK AND THE MONEY!
(…and let Earthlink out of their contract; they don’t want the network and nobody wants a lawsuit)

And this is what you do with it…

First off, use the $1 million to deinstall every single Tropos device and sell it, heck, eBay it, for let’s say $1,000 each. I would suggest that the market might not support the $1K/unit price tag, so lower it to $500/unit and dump it on any third world country that will take it. Speaking in round numbers, we can estimate that 4,000 Tropos units were put into service in Philadelphia, which at $500 each would provide $2 million to replace that edge equipment with. I could make specific recommendations as to which manufacturer’s equipment I would replace the Tropos gear with but this is not the time or place for that discussion.
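
The back-of-the-envelope math, for anyone who wants to check it (the unit count and resale price are the rough figures above, not audited numbers):

    # The $1 million from Earthlink is assumed to cover the deinstallation labor.
    tropos_units = 4_000   # rough estimate of units deployed in Philadelphia
    resale_price = 500     # per unit, after marking down from the $1,000 ask

    replacement_budget = tropos_units * resale_price
    print(f"Resale proceeds available for new edge gear: ${replacement_budget:,}")   # $2,000,000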

Once the network is operational, the job becomes one of marketing to the business community as well as the residential customers. The idea that Earthlink had only signed up 5,942 customers shows that the service they were providing was not interesting enough to attract any more customers. But, for the sake of this discussion, let’s assume we can double that number to 12,000 customers within six months with an average ARPU (Average Revenue Per User) of $23.00/month. That would now give us $276,000/month in revenue with no debt other than our operational expenses. Year two allows for doubling of the customer base again, with a rise in ARPU to $30.00/month (a goal I would be disgusted if I couldn’t surpass), giving us $720,000/month. Years three and four double the subscriber base while also increasing the ARPU, which should ensure a healthy, viable, and sustainable network in perpetuity.
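
A minimal model of that growth curve looks like this – the year three and four ARPU figures are my own placeholders, since the plan above only says ARPU keeps rising:

    subscribers = 12_000   # six-month target, roughly double Earthlink's 5,942
    arpu_by_year = {1: 23.00, 2: 30.00, 3: 33.00, 4: 36.00}   # years 3 and 4 assumed

    for year, arpu in arpu_by_year.items():
        monthly_revenue = subscribers * arpu
        print(f"Year {year}: {subscribers:,} subscribers x ${arpu:.2f} = ${monthly_revenue:,.0f}/month")
        subscribers *= 2   # the plan assumes the base doubles each year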

There should also be a component where any service provider can purchase transport or provide services (as in the case of technical support, email, video, etc.) across this resource, and yes, that would include the local telephone company and the cable providers. This would also make a very inexpensive infrastructure for the cell phone providers to increase their footprints without a huge capital expense.

And yes, a full business plan can be submitted should anyone seriously want one.

Now, on a personal note – WAKE UP PHILADELPHIA, you stumbled, you didn’t die. At one point you led the way in this country and probably created more of a stir than at any time since Ben Franklin’s time.

You have taken the initiative to be the first, you started an entire industry and now have the choice of giving up or moving forward. I believe that given the right justification, as well as a sound plan, your city will be able to lead this country into this new millennium – but you have to want it.

I’m pulling for you, I know you can do it and I believe that deep down you know you can do it too. Now the question is, do you have the leadership to get the job done? If not, don’t worry, there are other cities that will answer that challenge, cities that will be able to make the claim that they did it, that they were forward looking enough to make this infrastructure work, and they shall be the ones that reap the yet to be realized benefits of economic development, digital inclusion, and the ability to show the rest of this country how it is done.

Your choice.

John F. Kennedy once said, “Victory has a hundred fathers, and defeat is an orphan.”

Earthlink’s Rolla Huff indicated that EarthLink will stay focused on serving people using dial-up Internet service and casual Internet surfers who want an economical plan.

One theme common in this industry has always been to watch as the same old thing is trotted out, lauded as revolutionary, advertised as the “solution” that will “fix” the problem, bought by the many who need their shortcomings addressed, and then banked on as though this innovation will be the cure-all forever.

This is not something that I point out only in others; I have caught myself doing this same thing all too many times. In spite of the fact that back in 1995, more users were signing up for Internet service every day and an increasing number of web pages were being added to the web, many of this industry’s professionals simply chose to ignore the fact that the web was growing at an unprecedented rate. Sure, dialup ISPs understood this and the ILECs were dancing for joy at the number of PRIs they were deploying, not to mention T1 lines, but the realization of what was actually happening was lost on almost all of us, including yours truly. The reality was, the way we all communicate was about to go through a dramatic change.

Even more to the point, many of these tools were terms that most of the people in the world were not familiar with. While almost everyone had heard of Email, services like Instant Messaging, RSS, and Social Networking sites were and ARE still foreign to many of the people now using the net. Heck, I would be willing to bet that most of the people my age have no idea what Twitter is or how it would be used to their benefit. But these are the end users that I am referring to – not the professionals.

I expect better from the people we entrust our most critical infrastructures to – communications being at the top of that list, along with transportation, energy, and food and water distribution.

But the communications industry has managed to create such a level of groupthink that the damage they may cause, if this ignorance is not addressed, will take decades to undo.

Any child that ever played with a Walkie Talkie understands there are inherent disadvantages to using a half duplex communications tool in a full duplex application – like voice. The very idea of having to use a word like “over” to give the other party a clue that you had finished talking and that they could reply seems absurd and inefficient in today’s world – yet a portion of the Internet was built to reflect this type of usage. Now, no matter what you think about the POTS network, one thing it did do very well was to provide a full duplex voice communications experience – one that became the standard we used to measure everything else by.

Looking back, asymmetrical and half duplex communications were fine for a lot of the applications that were being run at the time. I would request a web page and it would be delivered to me, all in its own sweet time, but there was no real need for two way dialog during those transactions. Email – same thing.

Then came P2P, and VoIP with VoIP’s big brother Telepresence – and the world changed. Did our service providers? No, not really, there was a LOT of money invested and making any real change would have meant going back to the investors and telling them you were wrong – something nobody relishes having to do. One very notable exception is Verizon, and their commitment to Fios. While I honestly didn’t believe they could pull this transition off, they have more than met my expectations and done what seemed like the impossible. I still don’t believe they are out of the woods yet, this is a huge investment and I believe they have predicated their ROI on numbers that might not be there three to five years out but let’s give credit where credit is due.

Too bad we can’t say the same for many of the other providers.

Lately we have been discussing Comcast’s “network management” as they try to come to grips with the reality that their network cannot handle all the traffic that is being thrown at it. This is not a problem inherent only to Comcast’s network; AT&T seems to be having the same problems, and don’t look too closely at the Cell Phone providers – that’s an area not for the timid.

In the final analysis, much of this can be attributed to greed – any decent network engineer would tell management to stop adding new customers once the network appears to become overloaded, but most CEOs are more focused on the bottom line in the short term, and market share is more important than having a solid network.

But now the invoice we will need to pay for such foolishness is about to come due, and while the amount is going to cause pause, the pain is going to be spread all over. Only a fool could have misinterpreted the fact that demand for bandwidth was growing year over year. Only someone intentionally remaining ignorant would misjudge what the addition of high-definition video and applications like Telepresence were going to do to the demands for more bandwidth, and it appears that these people are about to pay for that incompetence.

You want to claim that P2P is the problem and you NEED to block it – why, you go right ahead. In fact, why don’t you go to the FCC and tell them you need relief – and that if you don’t get it, your network is quite possibly going to crash. I invite you to do this because P2P is only one of your problems; no, actually, it is more of a symptom than anything else. The real problem is that you built your network in such a way that you cannot keep up with what was well known to be an ever-increasing demand.

In plain English, you failed – or more correctly will fail, and probably quite soon.

Let’s turn to the License Exempt Wireless Industry for a look at a group that has built networks using inexpensive, half duplex radios and is now crying that P2P is a scourge on their networks. According to Om Malik, P2P isn’t anywhere near the problem many people are making it out to be.

The P2P stats are the ones that came as a complete surprise. Like you, I have read many reports that suggest P2P applications account for the majority of the traffic on high-speed networks. But McPherson’s data suggests otherwise:

  • 20 percent of traffic is P2P applications.
  • During peak-load times, 70 percent of subscribers use http while 20 percent are using P2P.
  • Http still makes up the majority of the total traffic, of which 45 percent is traditional web content that includes text and images. Streaming video and audio content from services like YouTube accounts for nearly 50 percent of the http traffic.
  • It shouldn’t come as a surprise to anyone – streaming TV shows from Hulu and videos from YouTube have been on a major upswing, as noted by our colleagues over on NewTeeVee.
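
Run the numbers and the point becomes obvious. The quote doesn’t say exactly what share of total traffic is http (only “the majority”), so the 60 percent figure below is my assumption:

    p2p_share_of_total = 0.20        # from the quoted figures
    http_share_of_total = 0.60       # assumed: "the majority of the total traffic"
    streaming_share_of_http = 0.50   # "nearly 50 percent of the http traffic"

    streaming_share_of_total = http_share_of_total * streaming_share_of_http
    print(f"Streaming as a share of all traffic: ~{streaming_share_of_total:.0%}")  # ~30%
    print(f"P2P as a share of all traffic:       ~{p2p_share_of_total:.0%}")        # 20%
    # Even with a conservative http share, streaming alone outweighs all P2P traffic.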

During the plague, it is said that cats and dogs were exterminated in massive numbers as it was believed that they spread the disease. A later theory proposed that this action exacerbated the situation as the cats and dogs were the natural enemies of the rats who were carrying the fleas that did pass on the infection.

In today’s version of this level of misrepresentation, we have ISPs blaming P2P traffic when the fault lies in the artificial scarcity of the upstream connection, making it too expensive for this traffic to pass – well that and the fact that many have invested their all into building networks that are no longer capable of doing the job required of them.

As many of you know, we made the decision to close up our ISP, and one of the foremost reasons was that we believed the business model could not support what needed to be supplied to the customer in the longer term. That was a painful decision, and one that lets me empathize with anyone who will eventually have to admit the same thing to themselves.

Change or fade away – there are no other choices…

For a number of reasons – some having to do with the mechanics of how I constructed this blog (having based many of the articles on links that were not permanent) and a web hosting company that disappeared without warning, among them – I decided to discontinue writing this blog.

A lot has changed and with the help of my wife, Dawn, this blog has been reconstructed (no small feat) and I now start again, bringing the slanted views that I feel are necessary for consideration and rebuttal.

This industry, or more correctly, this world is a quickly evolving place and what seems like a technological disruption today becomes obsolete and passé tomorrow.

We have all heard that every journey begins with a single step, but in the search to define the “best practices” of the Municipal Broadband Network industry, the universal answer is, “We don’t know.”

The fact is there are no citywide network models that have been functioning for any long period of time that we could look to as examples of how other networks should be built and managed. We do know of many cities that are struggling to figure out how to construct a business model that will work in a sustainable manner over the long term, how to design such a project to take advantage of this field’s ever-evolving technology, and how this type of infrastructure can best benefit all of the different segments of their population.

As a real world example, I would like to introduce you to Adam Heller, the IT director for Bridgeport, CT as he attempts to gather as much information as he can over a three day period during the recent Muni Wireless event in Minneapolis. In this case, I am acting as a careful observer of this process with additional commentary provided by several of the knowledgeable people Adam met and interacted with at the show.

From Adam’s perspective, Bridgeport is in need of an overall WAN upgrade. Bridgeport’s connectivity between its various municipal facilities is substandard and as more enterprise applications are being implemented the WAN is not able to maintain consistent quality or throughput. As a result, Adam has begun to document what the city’s current infrastructure consists of and what alternatives exist to alleviate the extreme difficulties of accessing network resources.

In order to define this process, Adam inventoried his current application environment, paying particular attention to how to go about upgrading them and, inevitably, had to consider what impact future growth was going to have on the picture. In doing so, he framed the situation using the following questions:

  • What will be our future needs?
  • What applications are not currently being used that will be after an upgrade?
  • Are Public Safety needs being met?
  • Are we going to choose to add voice to this upgraded environment?
  • What is the city environment like? Is there a “public need” to be able to access municipal resources?

After careful examination of these questions Adam came to the conclusion that a network consisting of a solely wired solution is not a viable option. As the city develops, with more construction occurring within the city, public safety communications is also becoming an issue. For some specific examples, traffic congestion is increasing, as seen at most stop lights, and access to parking is becoming increasingly difficult. Another area that needs improvement is the building inspection and building permit process, which should be streamlined to make the whole thing more efficient. Additionally, there is also the very real need for mobile access to the Internet for the business community, as well as for the communities at the lowest end of the economic scale, many of which do not have access to the Internet at all. This is all part of the myriad issues that arise when one looks closely at the future needs of a city network that, as of today, cannot meet present requirements.

The question now becomes, if the city has the need for all of these extra resources, what technologies are available to meet these demands? This is what brings Adam to the MuniWireless event allowing him the opportunity to research the municipal wireless industry. Being in the unique position of planning an overall infrastructure upgrade, Adam felt that now is the time to explore what a municipal wireless network is and how it could be designed so as to incorporate an eventual deployment into Bridgeport’s plans, as well as avoid potential pitfalls as the city moves forward.

The first task is to define what a municipal wireless network is – specifically to their city. Is it the “wave of the future” or is it going to be yet another maintenance nightmare? What is the purpose of deployment? Is it going to serve the needs of the community as a whole? If not, what segments of the population will it serve? If so, what priorities will Adam have to set on deployment? Does Bridgeport go with an exclusively wireless network or with a hybrid wireless/wired combination? What security issues will the city face, and is it opening itself up to a host of issues in that respect?

After deliberating all of these issues Adam has defined his concept for Bridgeport’s Municipal Wireless Network as follows:

A municipally owned and operated network that will provide access to municipal resources as well as providing Internet access to the community including those that may not currently have access. In addition, this network will provide for new services to enhance public safety and enhance the experience of everyone living or working in our city while encouraging community and economic development.

With that being said, how would one accomplish this task? The first thing to do is to find out what other communities are following this same route. The second thing to do is find a way to speak to industry experts to see how they define the criteria and to hear what they have experienced. As opportunity does seem to favor the prepared, it was at this time that Adam discovered that there is a conference designed to provide him with an opportunity to discuss municipal wireless with his peers, as well as to discuss these issues with people in the industry.

Adam’s first session was hosted by Dewayne Hendricks (Dandin Group) and focused on bandwidth. One of the fundamental decisions that needs to be addressed in any municipal deployment is how much throughput the network will deliver to the user. This is one of the more hotly debated questions and the fact is there really is no one right answer. This is a determination each city must make based on what they believe is right for their situation.

Dewayne Hendricks believes that “Life Begins at 100Mbps” and is now in the process of deploying just such a network in Sandoval County, New Mexico to demonstrate that concept. For Dewayne, the very idea that the United States is near the bottom of the world’s industrialized nations as far as broadband speeds is abhorrent – especially since the issue isn’t a technological limitation but instead caused by a combination of failed policy and an artificial scarcity of bandwidth created by the incumbents.

While showing that the technology exists and, more importantly, at an acceptable price, Dewayne is also working towards lowering the wholesale cost of bandwidth. If we look at the cost of bandwidth in San Francisco versus the cost for the same amount of bandwidth in just about every small town or city in the US, it is easy to see why high speed broadband service has such a wide disparity from location to location. If you want to carry that comparison further, one only need look to Asia, where wholesale broadband costs significantly less, making exceptionally high speed broadband service substantially less expensive to the end user. For a specific example, in Tokyo the cost of a 100Mbps connection is on par with a 3Mbps DSL connection in New York City, and one can easily understand how this disparity will have a detrimental effect on any business that is broadband based or heavily depends on connectivity.
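
In per-megabit terms the gap is stark. The monthly price below is a hypothetical placeholder – the point of the comparison above is only that the two connections cost roughly the same:

    monthly_price = 40.00        # hypothetical monthly price in USD, assumed equal for both
    tokyo_mbps, nyc_mbps = 100, 3

    tokyo_per_mbps = monthly_price / tokyo_mbps
    nyc_per_mbps = monthly_price / nyc_mbps
    print(f"Tokyo: ${tokyo_per_mbps:.2f}/Mbps, New York: ${nyc_per_mbps:.2f}/Mbps")
    print(f"The New York customer pays roughly {nyc_per_mbps / tokyo_per_mbps:.0f} times more per megabit")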

Attending the same session but taking a somewhat different view is Damien Fox (Wireless Nomad) who became interested in broadband deployment as a way of equalizing opportunity among his neighbors. As someone who never thought he would be interested in broadband technologies Damien was attracted to this industry when he realized that the digital divide had a very real effect on a person’s ability to get ahead.

As an example, let’s look at two children, perhaps classmates at the same school, who are given a history assignment on D-Day. The child on the dialup connection would be relegated to waiting for his computer to connect, then waiting for a search engine page to load (taking the better part of a minute), then clicking on a link (again, taking another minute for that page to load), and if we assume that this page has the necessary information the child was looking for, this child’s entire educational experience would be reading a page of text with a grainy picture or two to illustrate the subject.

Conversely, the child on broadband would immediately see the search engine page load, would be able to quickly browse through several different sites and then could participate in a rich multimedia experience which might include actual footage taken at the battle or audio recordings of interviews from veterans who participated in this event.
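
The difference is easy to quantify. The page size here is an assumption, not a measured figure, but it shows why one child waits a minute per page while the other barely waits at all:

    page_size_bytes = 500 * 1024       # assume a ~500 KB page with a few images
    dialup_bps = 56_000                # 56k modem, best case
    broadband_bps = 10_000_000         # a modest 10 Mbps connection

    for label, bps in [("dial-up", dialup_bps), ("broadband", broadband_bps)]:
        seconds = page_size_bytes * 8 / bps
        print(f"{label}: about {seconds:.1f} seconds per page")

    # dial-up: ~73 seconds per page, broadband: ~0.4 seconds per page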

The stark reality is that one child is going to have far better educational opportunities than the other, with the tragedy being that the real cost to provide this resource to both children is negligible – but only if we choose to do so.

One point Damien did emphasize in the follow up conversation was that there’s no such thing as a free network, and one should not underestimate the cost of providing real access to information resources rather than just totaling up the cost of a monthly DSL line and an obsolete computer. It is important to keep in mind that despite all the costs of bridging the digital divide, the cost of not doing so is far greater in the long term, and the bottom line is that to a child that has a dialup connection (or no connection at all) even a 1Mbps broadband connection is a godsend.

As you can see, for Adam, this is exactly the conversation he was hoping to become part of when he decided to book this trip.

In a subsequent conversation that took place later that evening, the subject of what should be the minimum connection throughput a municipal network should provide was again brought up. Jay Barnell (Barnell Technology Services) submitted this observation for consideration, “What we really need to continually do is hold ourselves up to the international community for comparison as opposed to each community looking at the neighboring city up the interstate to gauge how we are all doing.” This reinforces the point that Dewayne is making that we need to be setting our goals high enough to make sure we are relevant as we move forward, even though there is no arguing that if the financial resources are the overall constraint, providing something is always better than nothing.

As the discussion continued the next day, an informal, ad hoc group formed to tackle some of the other questions that make up the foundation of a Municipal project. Drew Lentz (Meshtek) hammered the point home that the three most important things to remember were “Design, design, design.” Over the years Drew has repeatedly run across instances of network deployments where the builder pushed the equipment’s specifications beyond where it should be realistically expected to go. As one would expect, this inevitably leads to a network that cannot live up to its expectations.

The other side of this issue is where the network’s specifications were not clearly spelled out or explained properly, creating the same scenario of failure where the network does not live up to the buyer’s (or the end user’s) expectations. One such issue is any type of promise which tries to specify what percentage of end users will be able to connect without the need of additional equipment (CPE). In dense residential areas, where buildings are made of wood, brick, stone, and stucco, the individual user experience will vary in ways that are impossible to predict with any accuracy. Unless the entire user base is provided with a well organized educational campaign, there will surely be some people who will feel slighted as they will need to spend a significant amount of money to get connected where their neighbor will not.

To complicate matters further, there may also be an eventual failure that does not manifest itself immediately, as there will always be a considerable lag between the network launch and the time when the critical mass of users become part of the network. Couple that with the knowledge that the services end users are now demanding will require more bandwidth, as in the case of YouTube, and you have a disaster just waiting to happen.

Perhaps the most insidious subject to come up in the conversation was that of dealing with network security. Ash Dyer, a recent graduate of MIT and now part of the Cambridge Public Internet project, brought up several very important points that many Municipal Network managers might not even be aware of. While many of the security issues wireless networks face are also problems on wired networks, they are exacerbated by the omnidirectional nature of wireless. Incredibly, something as simple as a rogue access point added to the network without proper protection could potentially compromise thousands of people’s data. Drew Lentz added that off-the-shelf programs widely available as a free download (Ethereal as an example) could allow anyone to intercept anything from regular user account information (including passwords) to credit card numbers and banking information. Ash responded that the potential for compromising sensitive municipal data is also likely in these cases, and a well-publicized security breach on one of these networks could have serious ramifications across the entire industry.

For someone like Adam, this is the stuff that nightmares are made of. However, if we accept that knowledge is the best prevention in staving off these types of failures, Adam will tell you that the best three days he could have spent learning what he needed to know were at the MuniWireless event, and I don’t think it would be presumptuous to expect we will be seeing him at the Dallas show in March.

Without going into specifics, a number of interesting discussions have crossed my desk in the last few weeks which have forced me to look a little deeper into what is going on in this industry.

Let’s look back for a minute and check out one of the parallels that we can point to for a glimpse of what is really going on here.

Back in the early 1980s, IBM introduced their Personal Computer and within a matter of months the industry grew to a point where there was a severe shortage of technicians to build, install and service these computers. This was a problem for several years moving forward as more of these systems sold and the ability to train people could not keep pace. Were there problems because of this? Absolutely! In one case a furniture retailer went out of business (and we all know that never happens!) and when the autopsy was complete it turned out that their accounting system had a “bug” that mistakenly showed more money in the bank than was actually there. One can see where that might become a problem very quickly. Had there been one sharp network engineer in the company, he might have caught the glitch and saved the company.

You know, for want of a nail…

What has happened in our industry now seems to be taking a parallel course, as we now have municipalities and extremely large deployments planned, yet the people who should be employed to design and build these networks are not participating in this work. In the last three months I have listened to some pretty serious examples of what would be considered malpractice if it were a medical case, as the details of failed deployments are relayed to me, usually with a request to fix them – as though a wave of a magic wand could fix designs that should have been questioned before a decision to purchase the equipment was made, let alone before the full deployment was built out.

Where this gets to be interesting is that this shows a complete failure of the entire process, from design through final approval. Of course, one would have to ask what good the review board is if they have little to no functional experience in this field – heck, many of these people have never even heard the term WISP before. I can’t tell you how many times I have run into committees made up of a few ex-telecom employees (downsized out of a job) complemented by the office computer expert, with some “networking” expert (I haven’t quite figured out what the qualifications for that position are yet) thrown in to round out the experience. As best I can figure, they probably should have invited a proctologist to completely round out their combined skill sets.

Whether we are talking about “designers” that believed one could engineer a network where over a dozen 400Kbps video streams could reliably be pushed down a 5.5Mbps WiFi connection, or that multipath would somehow not be a factor in their deployments, many of the mistakes being made are the same mistakes that some of the early WISPs learned the hard way. I was one of those people, having made nearly all of the mistakes one could make – or so I would like to think. Yet, this process repeats itself and in this iteration we see the fingers being pointed in every direction except where they should be – right back at the designers.
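
For anyone wondering why that video design was doomed, the arithmetic is short. The usable-throughput factor below is a rough rule of thumb for 802.11 overhead, not a measurement:

    streams = 12
    per_stream_kbps = 400
    link_rate_mbps = 5.5
    usable_fraction = 0.6        # assumed MAC/protocol overhead on an 802.11b link

    offered_load_mbps = streams * per_stream_kbps / 1000     # 4.8 Mbps of video
    usable_mbps = link_rate_mbps * usable_fraction            # roughly 3.3 Mbps of real throughput

    print(f"Offered load: {offered_load_mbps:.1f} Mbps")
    print(f"Realistic throughput: ~{usable_mbps:.1f} Mbps")
    print("The design only works on paper" if offered_load_mbps > usable_mbps else "OK")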

The reality is that designing a wireless infrastructure isn’t as easy as deploying radios every X feet and then turning them on. Incredibly, this is the mindset of some of these “project engineers” whose extensive experience consists of setting up Linksys boxes in their homes to share an Internet connection with two notebooks – or so one might think. How some of these bigger metropolitan networks ever thought they were going to roll out adequate bandwidth based on a 512Kbps to 1Mbps connection to the end user shows a level of comprehension as to how the Internet actually works on par with Senator Stevens [R-Clogged Tubes].

We all know there is a widening gap in broadband happening in the industrialized world as many countries have outstripped the US in connecting their communities, and I have heard all the excuses. If we dismiss all the excuses as nothing more than, well, excuses, the bottom line is that we are not providing adequate connectivity for our businesses and individuals to effectively compete with the rest of the world – even though we are well ahead of many developing countries. GO TEAM!

Well, where is the real problem? That answer is actually easier to come by than we might want to believe. It is time for the people who made this industry to step forward and be employed in this endeavor. As I look back over the better part of the last decade I have seen many people struggle to learn how the pieces of this all fit together. Many dove in as complete novices, built networks while teaching themselves everything from RF theory through marketing and business management. There were some spectacular failures, as we all know that one person companies are not going to be successful at being all things to all people. But that doesn’t mean the lessons learned by many of these people aren’t valuable – to anyone trying to build one of these networks out, they should actually be invaluable.

Instead, I see large companies investing in people that have zero knowledge aside from what they read in an owner’s manual or what a manufacturer taught them in a two day class – with results that equal the effort and investment made.

If you really want to know where the value is in this industry, it is the people – many of them the original WISPs or, more likely, the hobbyists who first started experimenting with this technology – that is where everyone should be looking, not at the rocket surgeons that claim they have a clue. The funny thing is that there really aren’t a large number of these people, and out of that number many of them never actually “got it” in any kind of real way. This is where the near future is going to get very interesting – if you consider massive failures, with blame being pointed at the technology or anything else that can be found to complete the CYA mandate, as being interesting.

So, where are the assets in this industry?

The people.

Until then I will keep answering the phone and answering the same questions. Whether it is the large corporations or the municipal network people, I keep hearing the same things asked of me, and to be honest with you, if you have to ask where the ignition key goes you probably shouldn’t be driving a car.

For the last three years I have sat in this chair, read voraciously and tried to predict what the near-term future of this convoluted industry might look like two or three years out. Having no fear of publicly making an idiot out of myself, I have published these projections here and elsewhere for all to read, knowing full well that if I am wrong, I will hear about it – and I have been wrong more than once.

That’s why, when I read something from a credible source that seems to corroborate what I believe, I tend to react with a small bit of pride along with a healthy dose of trepidation, knowing that whatever source I have found could be just as wrong as I have been at times.

Perhaps this is nothing more than the old adage that misery loves company…

Yesterday was a red-letter day for me in this regard, as two different credible sources seem to have backed up what I have been saying for quite some time now! Taken separately, either of these reports would show some pretty interesting trends that might indicate serious changes are coming down the pike, but read together they show the real impact this wireless revolution is about to wreak on our society.

As you’ve heard me say before, VoWiFi is going to cause the Cell Providers some discomfort. This article from Cellular News paints a mixed picture as it discusses how the cell phone industry might be able to leverage the License Exempt wireless infrastructure.

“This trend is likely to occur globally as operators seek to increase roaming usage as a boost to declining voice revenues. Visiongain believes that price reductions by operators will succeed in driving usage, allowing operators to tap into the 95% of subscribers who currently do not use roaming services whilst abroad.

VoIP through Wi-Fi will become an increasingly attractive alternative to mobile voice calls whilst roaming due to the disparity in price. Visiongain found that a typical voice call whilst roaming over Wi-Fi costs $0.02 per minute, compared with a typical cost of $1.25 per minute through mobile.

The increase in Wi-Fi hotspots world-wide is creating more opportunity for travelers to utilize VoIP services, therefore threatening mobile roaming revenues. In addition, visiongain believes that Nokia’s entry into the Wi-Fi market with its converged GSM / Wi-Fi handset, the 6136, is significant because it legitimizes the technology’s entry into the mobile handset market.”

Now, you do need to understand that this article is written with a European perspective in mind, so there is certainly a pricing differential to be taken into consideration. However, the message is the same: cell phone providers are going to have to modify their pricing structure, and this is being driven by the UMA (Unlicensed Mobile Access) end of the industry.
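Just to make that gap concrete, here is a tiny sketch using nothing but the per-minute figures quoted above; the 120 minutes of roaming calls is a hypothetical number I picked, and real tariffs obviously vary by operator and country:

```python
# Quick cost comparison using the per-minute figures quoted in the article;
# treat the call volume as a purely illustrative assumption.
ROAMING_CELL_PER_MIN = 1.25   # USD/min, typical mobile roaming (per the article)
ROAMING_WIFI_PER_MIN = 0.02   # USD/min, VoIP over Wi-Fi (per the article)

minutes = 120  # hypothetical month of calls while traveling

cell_cost = minutes * ROAMING_CELL_PER_MIN
wifi_cost = minutes * ROAMING_WIFI_PER_MIN

print(f"Cellular roaming: ${cell_cost:.2f}")
print(f"VoIP over Wi-Fi:  ${wifi_cost:.2f}")
print(f"Savings: ${cell_cost - wifi_cost:.2f} "
      f"({cell_cost / wifi_cost:.0f}x cheaper over Wi-Fi)")
```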

I also found this short article that provides a little more information about the Nokia 6136 courtesy of Engadget.

While we are watching the pioneers take their first steps into this new field, the ramifications for an industry that, financially speaking, cannot withstand an onslaught like this are going to be pretty interesting to watch unfold.

Well, where might this push the cell phone industry to pick up new revenue? There is the move to deliver “LiveTV” to mobile users, as I wrote about here. But there is also the newest wrinkle being used in Japan, dubbed the “Mobile Wallet”, which may also help the cell phone providers – if they don’t get beaten out by the WiFi industry first.

Okay, one down and one more very powerful one to go.

Next week, a study will be released by Broadband Advisory Services (Pike & Fischer) stating that “City-run broadband networks could eventually cut into commercial service provider revenues by as much as 48%”, which, no matter how you look at it, is going to change the landscape dramatically.

This report can be purchased here.

The question remains: what happens when large corporations that do not have 33% profit margins see a significant decrease in their revenues? Even more to the point, what happens when the cash cow (the densely populated areas of our country) migrates away from their very expensive services due to the introduction of less expensive, equivalent services? At what point does their business model suffer? More importantly, at what point does their business model cease to be able to sustain itself?
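As a rough sketch of why that last question matters, consider a toy calculation that assumes costs stay put while revenue falls. The 33% margin and the 48% decline are simply the figures floated above, used here for illustration rather than as data about any particular carrier:

```python
# Simplistic sketch of what a 48% revenue decline does to an operator whose
# margin is no better than 33%. Assumes costs stay fixed, which is of course
# an oversimplification.
baseline_revenue = 100.0                      # index the old revenue to 100
margin = 0.33                                 # assumed profit margin
fixed_costs = baseline_revenue * (1 - margin)

new_revenue = baseline_revenue * (1 - 0.48)   # 48% of revenue lost
new_profit = new_revenue - fixed_costs

print(f"Old profit: {baseline_revenue - fixed_costs:.0f}")   # 33
print(f"New profit: {new_profit:.0f}")                       # -15 (a loss)
```

Under those (admittedly simplistic) assumptions, a carrier that was pocketing 33 cents on the dollar ends up losing money outright.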

I guess we’ll find out.

Hang in there, the opportunities are coming at us faster than we can recognize and react to them.

For three days last week (June 19th through the 21st) the MuniWireless organization held their show in Santa Clara, California. Attendance was (once again) up from their previous show in Atlanta, and there is a reason for that – a well produced show, excellent topics and a list of great speakers that I was proud to be a part of.

By focusing on the specific niche that this show targets, it is possible to entice an interesting mix of attendees that spans the spectrum from free community wireless groups and non-profit organizations trying to better the world, like Green-WiFi, to some of the largest corporations in the world, like IBM, Northrop Grumman and General Dynamics, as well as representatives of some of the larger cities that are considering building municipal networks.

It is exactly this mix of people that makes for the kind of cross-educational exchange that many of us find so valuable. While the sessions covered a wide range of interesting topics, most of my time was spent networking with people outside of the sessions. It is really difficult to fully take in any exceptionally well put together show, so we are all forced to decide what we will participate in and what we will forgo in order to get what we believe will be the best experience for us individually. In this particular case, I chose not to attend many excellent sessions that I am sure I would have learned an incredible amount in so that I could interact with people from, quite literally, all over the world. In all honesty, if the show had been scheduled for an entire week I am not sure I would have had enough time to take in everything that was offered.

This show also had a larger selection of vendors than the previous show. While many of the names that were there are well known to us all, there were a few interesting additions that I had not seen before. I am hoping to do a dedicated piece on a few of these manufacturers like Netistix and Wavion in the near future as both of these companies offer products that are outside of the norm.

There is a deeper issue I would like to bring up, one that should be discussed more often but is frequently overlooked when we talk about an event like MuniWireless, and one whose benefit I am not sure many people in attendance fully appreciate: the alliances that are formed, the ideas that are spawned, and the dramatic, longer-term effects on society that a gathering of minds like this creates – effects we can barely begin to understand. In different discussions I was part of, I heard plans to help connect people in India, plans to integrate automobiles into our communications platforms, as well as serious concepts for fixing communications after disasters. The diverse brain power, coupled with the incredible energy that resonated at this show, was unmistakable. One very important point was made clear: individuals, businesses both small and huge, organizations and governments are looking at problems – real problems – and doing something about them.

If we amplify this thought, we see that profit is no longer the sole motivator; many of the people there were representing non-profit organizations. What we are seeing is a melding of business and private groups coming together to address problems and provide solutions that the majority can accept. We see the Electronic Frontier Foundation discussing privacy issues with Google in an attempt to find a way that both sides can live with. We get introduced to organizations such as Wireless Harlem presenting their vision, along with groups like Seakay working to find the right mix of partnership to make their corner of the world a better place. Perhaps most of all, in the center of it all, is one woman, Esme Vos, whose vision, energy and determination have driven this once unheard of slice of the wireless industry straight into the public spotlight, and we can now all clearly see what she has known for quite some time: that this is only a start.

The best is yet to come.

Over the last week a couple of news articles caught my eye with respect to the distribution of video across the Internet. In the past few articles I have written, I have primarily focused on the real impact network operators might be facing as video becomes a dominant use of the connectivity infrastructure.

Take, as a specific example, the cell phone providers struggling to build out and upgrade the infrastructure necessary to supply LiveTV to their users – just one way we can see the push to accommodate user demand for this application.

The same holds true for cable or DSL as we move towards an all-HDTV video standard and the user does not want to wait for …Buffering…Buffering…Buffering…Buffering… content to become viewable, or have their viewing experience interrupted just as they are starting to enjoy themselves – not that it has ever happened to me or anything.
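For anyone who has wondered why the buffering happens at all, the arithmetic is not complicated. Here is a minimal sketch; the HD bitrate and the sustained link throughput are numbers I assumed for illustration, not figures from any provider:

```python
# Rough illustration of why "...Buffering..." happens: if the stream's bitrate
# exceeds what the connection actually delivers, the player has to stall to
# catch up. Bitrates below are assumptions, not measurements.
STREAM_MBPS = 8.0       # assumed HD stream bitrate
LINK_MBPS = 6.0         # assumed sustained download throughput

video_minutes = 60
data_megabits = STREAM_MBPS * video_minutes * 60        # total bits to deliver
download_minutes = data_megabits / (LINK_MBPS * 60)     # time needed to deliver them

stall_minutes = max(0.0, download_minutes - video_minutes)
print(f"To watch {video_minutes} min of video you wait roughly "
      f"{stall_minutes:.0f} extra min buffering.")
```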

While all of that is certainly true and important to our discussions, there is a slightly deeper impact that seems to be crawling out into the light, one that is going to profoundly affect the way our society moves forward and change the way we learn and communicate. There is a time in the not too distant future when we will no longer have to settle for average.

What do I specifically mean when I say that we will no longer have to settle for average?

If we were to look at any given profession, we would find that the overwhelming majority of the time we are dealing with average performance – as they say, DUH, that is the definition of average. However, what happens when we have a communications-based society where only the very best see mass distribution of their work? I am not talking about the very best in the sense of people, even though that will undoubtedly have an impact, but rather about only the cream of the crop ever making the mainstream distribution channels. What happens to this society when only the most inspired lectures are granted the right to be distributed across the Internet? Even more important, how do we define the best? Will there be a user feedback section where, if 99% of the viewers leave excellent ratings, the next group of people will only view that one particular video out of all the choices?

If we were to use the example of an on-line class on any given subject, we could envision a scenario where many professors would record their course and release it for viewing. I would suggest that, as time progresses, the students who watched the course would then rate the content for ease of understanding, charisma and organization, among other criteria. Even though we would have several excellent people all trying to present this course material, the one or perhaps two who were most effective (as rated by the students) would eventually become the “standard” until someone else managed to produce a “better” video – assuming they could overcome the momentum built up from several thousand positive feedback ratings on the standard.
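To illustrate that momentum problem, here is a toy model: every new student watches whichever lecture currently ranks highest and rates it, so the incumbent keeps piling up ratings while a slightly better challenger never gets seen. The quality scores and rating counts are invented purely for illustration:

```python
# Toy model of rating "momentum": viewers always pick the top-rated lecture,
# so an incumbent with thousands of ratings keeps its lead even when a better
# challenger appears. All numbers are illustrative.
import random

random.seed(1)

videos = {
    "incumbent":  {"sum": 4.6 * 5000, "count": 5000, "true_quality": 4.6},
    "challenger": {"sum": 0.0,        "count": 0,    "true_quality": 4.8},
}

def rating_of(v):
    # Average rating; an unrated video scores zero, so it never gets picked.
    return v["sum"] / v["count"] if v["count"] else 0.0

for _ in range(10_000):
    # Each new student watches whichever lecture currently ranks highest...
    choice = max(videos.values(), key=rating_of)
    # ...and rates it near its true quality.
    choice["sum"] += random.gauss(choice["true_quality"], 0.3)
    choice["count"] += 1

for name, v in videos.items():
    print(f"{name}: {v['count']} ratings, average {rating_of(v):.2f}")
```

Obviously a real recommendation system would try to correct for this cold-start problem, but the sketch shows how easily a simple average-rating ranking locks in whoever got there first.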

The implication is that professionally produced content, made with an eye toward capturing the audience’s attention and conveying the message, will at some point displace the rest of the people providing the same subject material. At that point there will be only one option left for the content providers deemed less than best of category: release their material into the public domain. This creates an interesting problem of its own, because if there is parity in the quality of the material, the viewing audience will almost always gravitate toward the free content (or advertising-supported content, as long as the ads don’t degrade the viewing experience), leaving less of a paying audience for the previous provider.

What does this say if we apply this scenario to the education industry? Are we moving towards a society that doesn’t need hundreds of thousands of educators? Will we at some point reach a time when only a few very professional content producers manufacture every lecture we will ever need to keep up with our education? Will we reach a point where teachers are reduced to content writers and the face on the screen merely reads the content, possibly without even fully comprehending what is being said?

Even more important, what will be the overall effect on a society that only sees one perspective or one presentation of any given subject? Could this happen in the technological future we might foresee, or will this very mechanism allow for the rapid distribution of content – and, since we now have an almost instantaneous communications infrastructure to get the message across, the ability to comment and produce even better content to displace what came before?

To be honest with you, I don’t know. At the same time, I do see a time in the not too distant future when the mechanism of how we learn (or exchange our information, news and entertainment) will morph into a very different stream. As TV shows like The Daily Show start to displace the Evening News, one has to wonder if education, entertainment and news will all become one as we move forward.

I do know that no longer will we have to settle for an average day from the average speaker as being acceptable, and I, for one, welcome that change.