
I recently attended a ribbon-cutting ceremony that was very different from any other event I have had the displeasure of suffering through. I was expecting to be forced to sit politely through meaningless speeches from politicians who were only there to take credit and garner votes; instead, I was fascinated by several very well-informed people casually discussing how connectivity was going to change the geopolitical landscape.

During a speech made by Aris Melissaratos, secretary of Maryland’s Department of Business and Economic Development, I was introduced to a term I was unfamiliar with: “Virtual Adjacency.” At the time I didn’t put too much weight on the phrase, as each speech contained an enormous amount of excellent information delivered by some very well-educated people.

Later that night, I was lying in bed when the full impact of the phrase hit me. Virtual Adjacency: this was something that somehow resonated with me. As I continued to think about it, I decided to get up and do some reading on the concept. A quick Google search turned up several results, including some from Cisco, but nothing that defined the phrase as I was beginning to interpret it.

The next day I was chatting with Kory Mohr and I told him about the phrase. I explained that it was my belief this phrase was the quintessential definition of where the Internet was going to evolve to. In his usual no-nonsense way Kory asked, “Why is this different from what we have now?”

In trying to formulate an answer to this question, this is the thought process I followed. To my way of thinking, the Internet is some sort of loosely defined entity often depicted in diagrams as a cloud. (Don’t get me started on how ridiculous I find that image. It makes the whole thing seem like magic rather than the incredibly expensive grouping of high-end hardware that actually makes it all happen.) If I think back to the early days of dialup, we would connect, navigate somewhere (later, by using a search engine), find what we were looking for, and maybe download it to our local computer for archiving. Certainly there were other purposes, including email, reading the news, or whatever else we needed to grab for information, but the Internet was largely a tool that was out there: we temporarily became a part of it and then left.

A few years ago, broadband allowed us to connect and stay connected. Several things started to happen that changed the way we interacted at that time. Many of us stopped downloading information and saving it locally, instead opting to “leave” the information “out there” knowing that we could find it again (probably faster) if we needed it at a later time.

Lately, I am beginning to notice another behavior becoming accepted in our usage: a tying together of distant locations in a near-permanent connection between people. Yes, I know the phrase “virtual network” has been around for a while, and that it accurately describes how two locations can be treated as one LAN, but this is now evolving into something entirely different. I see it morphing into a virtual office, one where I can attend meetings, share data, and work collaboratively on a project as well as or better than I could if I were actually present at the site! In other words, I am “virtually adjacent” to my coworkers. In fact, I can be adjacent to my coworkers in China, India, California, and my home town without ever leaving my living room.

As technology progresses we can look forward to an even more realistic simulation of our adjacency. Take a look at this announcement. If (and when) this technology hits the mainstream, we could expect to almost believe we are sitting right next to a coworker as we share virtual documents, collaborate on projects, or simply discuss whatever the subject of importance might be that day.

As the cost of fuel (and transportation in general) climbs, our ability to work effectively with others remotely will become even more critical. The term telecommuting has been in use for decades, but our ability to telecommute is now making more sense than ever. According to this link, we have an estimated 129,000,000 commuters in the US. To put that into perspective, if each commuter uses an average of 10 gallons of fuel every week for commuting (I think we can all agree that is a low figure) we find a staggering 1.29 billion gallons of fuel being consumed weekly by commuters! Even if we could lower that by just one quarter, we would be saving roughly 322 million gallons of fuel per week. With gasoline hovering around the $3/gallon mark, the estimated saving in fuel alone approaches $1 billion per week, and that does not take into account wear and tear and depreciation on vehicles, along with the incredible waste of time we are talking about. Couple that with the money we spend on roads, including maintenance and cleaning up after the weather, and I believe we can all see that the fuel expense is only a small percentage of the overall cost of maintaining this system.
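The arithmetic above can be restated as a quick script (using the same assumed figures: 129 million commuters, 10 gallons a week each, $3/gallon):

```python
# Back-of-the-envelope telecommuting fuel savings, using the assumed
# figures from the text (these are rough estimates, not measured data).
commuters = 129_000_000
gallons_per_week = 10          # assumed average per commuter
price_per_gallon = 3.00

weekly_gallons = commuters * gallons_per_week       # 1.29 billion gallons
quarter_saved = weekly_gallons / 4                  # ~322.5 million gallons
weekly_savings = quarter_saved * price_per_gallon   # ~$967.5 million

print(f"Weekly consumption: {weekly_gallons / 1e9:.2f} billion gallons")
print(f"Saved at 25% telecommuting: {quarter_saved / 1e6:.1f} million gallons")
print(f"Fuel savings per week: ${weekly_savings / 1e6:.1f} million")
```

Even at a quarter participation, the fuel line item alone approaches a billion dollars a week before counting vehicle wear, time, and road maintenance.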

Recently, I had the pleasure of meeting Chuck Wilsker and Jack Heacock of TelCoa. It didn’t take very long for the three of us to realize that we were pretty much in complete agreement as to the benefits of telecommuting (or Telework, as they like to call it), but I was astounded at the specific examples they brought up that explain exactly why this business model makes sense.

For example, in this country we have a mounting problem dealing with an aging population (myself included) and the associated medical issues (not to mention the staggering costs) we are facing. Complicating this issue is the fact that we don’t have enough health care providers to deal with it adequately. One of the most dramatic deficits is our shortage of nurses, especially experienced ones. Ironically, as nurses get older we see health problems that prevent them from performing the physically demanding tasks we need them to do, such as lifting patients. We also need a way for people to communicate with health care professionals without choking doctors’ offices and emergency rooms everywhere.

It makes sense to set up a facility that allows people to get on the phone and have access to an experienced nurse. This is one way we can provide a rudimentary form of health care to our population inexpensively. However, a traditional “call center” may not be the best way to handle this situation. For a nurse who is disabled (say, with back problems), the added commute along with continued time spent sitting in a chair is not conducive to their needs, effectively locking them out of gainful (and useful) employment while denying the rest of us the benefit of their extensive experience.

The answer: telecommuting! In fact, I believe that VoIP coupled with telecommuting will be one of the next big partnerships between ISPs and business. As high-speed connectivity becomes ubiquitous we will be able to employ many people who would otherwise be incapable of working. People who cannot drive temporarily (or cannot drive any longer), people who have loved ones at home that need care, and single mothers with pre-school children will all be able to work from their homes. Add to that the huge number of people living in rural areas who do not have the options available to their urban counterparts. More importantly, since these people have removed the ever-increasing expense of commuting, they will effectively take home more money, making them better compensated without costing their employers any more.

On the employer’s side of the equation there is the added benefit of not having to rent or own a facility, let alone pay for heat, light, maintenance, and so on. Since employees are better compensated, and people who previously had few or no realistic work options can now hold a professional job, job churn is less likely to be a factor for either the employer or the employee. This cost alone is something that should be looked at seriously. The savings from not having to advertise, sift through resumes, answer the phone, interview, check references, and then train a newly hired applicant are substantial. Additionally, when someone relocates they can keep their job by simply plugging their SIP router into their new broadband connection, almost without any interruption.

So, how does a business model like this work technically? Simple: we create a virtual PBX that routes calls in real time to the next available operator. From the operator’s perspective, they can choose to be “available” or “unavailable” and the PBX will route calls accordingly. This allows the single mother to feed her baby and put it down for a nap, then go back to work. The same holds true for an older person who has to attend to a sick loved one, personal needs, or simply a nap. This is something that would be impossible in a traditional job where commuting is mandatory.
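A minimal sketch of that availability-based routing, assuming a hypothetical virtual PBX (the class and method names here are illustrative only, not any real product’s API):

```python
from collections import deque

class VirtualPBX:
    """Toy model of a virtual PBX: operators toggle availability,
    and incoming calls go to the next available operator round-robin."""

    def __init__(self):
        self.available = deque()   # operators ready to take calls
        self.unavailable = set()

    def set_available(self, operator):
        self.unavailable.discard(operator)
        if operator not in self.available:
            self.available.append(operator)

    def set_unavailable(self, operator):
        try:
            self.available.remove(operator)
        except ValueError:
            pass  # operator was not in the available pool
        self.unavailable.add(operator)

    def route_call(self, caller):
        """Return the next available operator, or None if nobody is free
        (a real PBX would queue the call or play hold music)."""
        if not self.available:
            return None
        operator = self.available.popleft()
        self.available.append(operator)  # rotate for round-robin fairness
        return operator

pbx = VirtualPBX()
pbx.set_available("nurse-1")
pbx.set_available("nurse-2")
print(pbx.route_call("caller-a"))  # nurse-1
pbx.set_unavailable("nurse-1")     # e.g. stepping away to feed the baby
print(pbx.route_call("caller-b"))  # nurse-2
```

The point is that availability is entirely under the operator’s control: toggling “unavailable” simply removes them from the routing pool without any commute or schedule negotiation.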

An even more fascinating aspect of this business model is what it is going to do to the traditional workplace. From computer tech support to telephone answering services, this is going to disrupt the norm. While it is nearly impossible to estimate exactly the savings realized by switching from a brick-and-mortar telecenter to a home-based virtual telecenter model, I think it is apparent that the reduction in cost will be dramatic.

As we all know, those who choose to adapt will displace those who don’t. This “Darwinism of technology application” is going to happen at a greatly accelerated rate. Disruptive technology is one aspect of business any manager worth his salary should be losing sleep over, but the correct application of these disruptive technologies is where the magic lies.

After a miserable five week “adventure” we are finally settled in. We have our home set up and while we are not totally unpacked (heck, we never fully unpacked from our previous move 12 years ago) we are comfortable.

I’d like to take the time to thank Tim Wolfe (and family) who put us up for a few days while we were trying to decide on where we wanted to settle.

I now return you to your regularly scheduled blog reading.

And in the words of the immortal Jackie Gleason, And away we go…

As with everything in this world, things change.

My wife and I have decided to leave our home of over a decade and move to a new area of the country. In all honesty, this is a little unnerving and somewhat frightening but we are also equally excited.

Unfortunately, this will cause a pause in my regular posts of a somewhat indeterminate length of time.

I look forward to a quick return and wish you all the best in our absence.

The dirty little secret of the ISP world is our ability to oversubscribe traffic on our networks. If we examine the typical WISP business model, we find that quite a few concurrent users can be “pushed” down a single T1 line. If we then take into account that at no point in the day will every single customer be using the network at once, we find that one T1 line (with a capacity of 1.544 Mbps) can carry far more broadband users (broadband as defined by speeds of 200 Kbps) than the straight math would lead you to suspect.

1544 Kbps / 200 Kbps ≈ 7.7 concurrent users

Even if the numbers above were close to reality and we could count on a 10-to-1 oversubscription rate, we find that the number of customers per T1 would max out somewhere around seventy-seven! If we then assume we are billing each customer $39.95/month, total gross revenue would come in at a little over $3K/month. Considering that in many parts of the country a T1 line is now well below $1K/month, everything would look pretty rosy! Of course, there are other costs to account for (equipment, payroll, insurance), but no matter.
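The back-of-the-envelope math can be written out as a short script (the 10:1 ratio and $39.95 plan are the assumptions from the text):

```python
# T1 oversubscription math: how many 200 Kbps "broadband" customers
# fit on one 1.544 Mbps T1 at an assumed 10:1 oversubscription ratio.
T1_KBPS = 1544
PLAN_KBPS = 200
OVERSUBSCRIPTION = 10          # assumed ratio
PRICE_PER_MONTH = 39.95

concurrent = T1_KBPS / PLAN_KBPS                    # ~7.7 truly simultaneous users
max_customers = int(concurrent * OVERSUBSCRIPTION)  # ~77 billable customers
gross_revenue = max_customers * PRICE_PER_MONTH

print(f"Concurrent users on the wire: {concurrent:.2f}")
print(f"Customers at 10:1 oversubscription: {max_customers}")
print(f"Gross revenue: ${gross_revenue:,.2f}/month")
```

Against a sub-$1K/month T1, that revenue looks comfortable, which is exactly why the model is so sensitive to the oversubscription ratio holding up.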

What happens when the oversubscription rate drops?

Two well-documented trends in end users’ online usage are the length of time they stay connected and the amount of bandwidth consumed in a 24-hour period.

Among the many factors driving bandwidth usage is streaming media, as Internet radio moves into the mainstream and video advertising soars to astronomical heights. Granted, streaming music doesn’t consume an enormous amount of bandwidth per listener (though the push for higher quality will raise that rate), but the number of people using it is growing quickly. It isn’t unusual to see a large proportion of people in any given workplace listening to Internet radio, and while each individual user consumes relatively little bandwidth, the organization as a whole can easily maintain a continuous 200 Kbps of Internet radio traffic alone. Add VoIP calls (a continuous stream of roughly 100 Kbps per call, and rapidly being adopted by the business mainstream) along with the other day-to-day Internet traffic, and you can easily see how a small to medium sized company will saturate a T1 line for an entire 10- to 12-hour workday.
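A rough saturation estimate under the stream rates mentioned above (the number of simultaneous VoIP calls is a hypothetical figure chosen for illustration):

```python
# How much of a T1 is left once continuous streams are accounted for,
# using the per-stream rates assumed in the text.
T1_KBPS = 1544

radio_kbps = 200               # continuous office-wide Internet radio load
voip_calls = 6                 # hypothetical simultaneous calls
voip_kbps = voip_calls * 100   # ~100 Kbps per call

remaining = T1_KBPS - radio_kbps - voip_kbps
print(f"Bandwidth left for everything else: {remaining} Kbps "
      f"({remaining / T1_KBPS:.0%} of the T1)")
```

With only a handful of calls and the radio streams running, less than half the T1 remains for web browsing, email, and file transfers, and it stays that way all workday rather than in bursts.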

But what happens when everyone goes home at night?

The study, conducted in partnership with Frank N. Magid Associates, surveyed 27,841 Internet users aged 13 and over on 25 different publisher Web sites. It found 51 percent of respondents watch online video at least once a month; 27 percent watch Internet video at least once a week; and five percent watch it on a daily basis.

The full article this quotation is from can be found here:

This follows up on reports that web pages themselves are becoming significantly larger, containing more graphics and advertisements per page, which contributes to much longer load times.

I think we can clearly infer that the trend is towards using more bandwidth and using it in very different ways than we could count on in the past.

From a historical perspective, we used to be able to count on a “burst and release” pattern, where users might grab a web page or download a relatively small file and then stop for a few minutes, freeing up that portion of their connection for someone else to use. Those days are rapidly coming to a close.

What does this mean to an ISP or more importantly a WISP?

The immediate effect is that the infrastructure needs to be designed to be easily scalable. This is critical to continued growth while keeping the existing user base satisfied. The second requirement is that the access points/base stations deployed MUST be able to handle a continuous stream of small packets, something that most equipment cannot do.

Let’s take a look at some of the specifications WiMAX is planning to provide. According to this article from Daily Wireless, Alvarion has officially released information about their WiMAX equipment.

As you can see from the chart below, this is going to be some pretty impressive equipment.

WiMAX System Performance

Range                                    | < 4 miles                           | 4-6 miles  | > 6 miles
Base-station cost (’04 pricing)          | $5k-$20k WISP class; $20k+ carrier  | same       | same
CPE price                                | < $300                              | same       | same
Adaptive modulation scheme               | 64 QAM                              | 16 QAM     | 1/2 QPSK up to 16 QAM
Data throughput (20 MHz channel*)        | 75 Mbit/s                           | 50 Mbit/s  | 17 to 50 Mbit/s, depending on link quality
No. of business users (T1 level)*        | 206                                 | 138        | 46 to 138
No. of residential users (512 kbit/s)*   | 1,552                               | 1,035      | 345 to 1,035

Source: Intel
* Assumes two 10 MHz bands in the base station as a benchmark for comparison. The oversubscription rate is 5x for business and 12.5x for residential. Also takes into account overhead (efficiency), which for 802.16 is 85%, independent of the number of users.

Chart courtesy of Daily Wireless

What I have a slight problem with is the claims as to how many users can be served from a single base station. According to the chart above, within the four-mile range we should see an impressive 75 Mbps of throughput. I am not sure how that translates into 206 T1-class business users, given the usage patterns I see rapidly approaching. In fact, if you believe we are nearing a one-to-one subscription rate, the best we could expect would be somewhere in the neighborhood of 50 T1-class users per base station at the theoretical best.

Somehow I don’t think the equipment will be able to handle the flood of continuous packets these businesses will expect to pass, but I will say that Alvarion has surprised me on many occasions before, and this might be one more time to add to my list. As for the claim that one of these base stations will be capable of handling 1,552 residential users (at 512 Kbps service levels), that is something I find very hard to accept. I have no idea what oversubscription rates are being used to support that figure, but the term “optimistic” certainly comes to mind, in much the same way 802.11b will deliver 11 Mbps.
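Working backwards from the chart’s own footnote (85% efficiency, 5x business and 12.5x residential oversubscription), the quoted user counts can in fact be reproduced, alongside the roughly 50-user figure a 1:1 rate would imply:

```python
# Reconstructing the chart's user counts from its stated assumptions,
# then the ~1:1 figure argued for in the text.
THROUGHPUT_MBPS = 75      # < 4 mile range, 20 MHz channel
EFFICIENCY = 0.85         # 802.16 overhead per the footnote
T1_MBPS = 1.544
RES_MBPS = 0.512

usable = THROUGHPUT_MBPS * EFFICIENCY             # ~63.75 Mbps after overhead
business_users = usable * 5 / T1_MBPS             # ~206, matching the chart
residential_users = usable * 12.5 / RES_MBPS      # ~1,556, close to the 1,552 quoted
one_to_one = THROUGHPUT_MBPS / T1_MBPS            # ~48.6, roughly the 50 in the text

print(f"Business users at 5:1 oversubscription: {business_users:.0f}")
print(f"Residential users at 12.5:1: {residential_users:.0f}")
print(f"T1 users at 1:1: {one_to_one:.0f}")
```

So the chart’s numbers are internally consistent; the real question is whether 5:1 and 12.5:1 oversubscription will survive the shift to continuous streaming traffic.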

I can’t wait to see what kind of effect IPTV and devices like the SlingBox are going to have on this model, but I can tell you that as we move forward our customers are going to demand more bandwidth, delivered continuously.

This is what Lightreading has charted and is predicting for the future.

Will you be ready?

There is the one, single constant we can most assuredly count on – everything changes.

There was a time when we, as a nation, felt protected from most of the rest of the world by two oceans. This illusion was shattered sometime around World War II or shortly thereafter. History has shown that this change in the way we viewed ourselves took quite some time for us to adjust to – not that we would expect much more based on what we know about people in general.

Now, a very different dynamic is taking place, one that is outside the “real world” for all intents and purposes. The “virtual world” is now crossing into our reality with blinding ferocity and once again we seem ill-prepared to accept this. Accept? Heck, we don’t seem capable of adjusting the way we view everything to take it into account.

As an example, I would like to cite this story courtesy of the BBC. On-line gambling is illegal in this country. Incredibly, according to the article I linked to (above) this law is being ignored by a significant number of people and it appears the US government is powerless to do anything about it or has chosen not to.

Online gambling is banned in America, so Partygaming, which was set up by an American, is based in Gibraltar with no assets in the US.

Its prospectus concedes: “In many countries, including the United States, the group’s activities are considered to be illegal by the relevant authorities.”

But, it adds the crucial clause: “Partygaming and its directors rely on the apparent unwillingness or inability of regulators generally to bring actions against businesses with no physical presence in the country concerned”.

In other words, even if Partygaming were illegal, what could the authorities do?

Compulsive pleasure

Not that Americans are exactly shunning the website.

It’s estimated that nine out of every 10 of its dollars last year came from the US.

At $600m those revenues are hefty and generated a profit of $350m in 2004.

Talk about thumbing your nose at authority!

Next up, we have the recently “postponed” act known as 2257. While the law was passed to reduce the number of underage models being recruited by the adult entertainment industry, the reality is that there was no practical way to enforce the legislation, especially since many of these sites were simply planning to move their hosting facilities offshore to skirt the law.

The third point I would like to bring up is the recent Supreme Court ruling in the Grokster case. I don’t want to enter into the debate on whether file sharing is right or wrong, but instead focus on whether this ruling has any effect in the greater scheme of things. Even if the entire Western world made this technology illegal, with many of the servers located in countries we have little or no control over, the net effect (pun intended) is that we really can’t do anything meaningful to change the situation.

At this point I think it is safe to say that we will need to rethink what we can and cannot legislate. If we are to assume that we have some form of control over the Internet that doesn’t exist what does this get us? Do we not look like jackasses when we pass laws we cannot enforce? At what point does our inability to understand what we have control over and what we don’t dilute our authority over everything else?

Here’s the thing: if you try to put the fear of the law into people when you can’t enforce it, eventually there is no fear of the law. Whether or not you believe in legislating morality has no bearing on this discussion. The question is what we can exercise control over and what we cannot. The problem is that the line is now a moving boundary, as opposed to one that used to seem carved in stone.

We will learn and adapt to this new dynamic or lose control over our own destiny.

Om Malik has another excellent blog entry that discusses where Chairman Martin’s vision is taking us.

Parity is the new catch phrase, learn it, live it, love it and if you’re an independent ISP you are going to grow to hate it.

“What we have here is a failure to communicate.”

It’s no secret that the United States has a serious deficiency when it comes to deploying broadband even though the extent of the problem is debatable. One thing that is clear is that a fair portion of this country has limited (if any) choice as far as broadband connectivity is concerned. In my last commentary I took a hard look at what our options are and made an attempt to show that we need a mix of connectivity specialties in order to ignite broadband deployment in this country. Instead, we now can look forward to a very different landscape.

Here’s the deal: line sharing is dead. Maybe not right this minute, but take this to the bank, it will be. In this very short statement released on the FCC’s web site, commenting on the Supreme Court’s decision in the Brand X case, FCC Chairman Kevin J. Martin said,

“This decision provides much-needed regulatory clarity and a framework for broadband that can be applied to all providers. We can now move forward quickly to finalize regulations that will spur the deployment of broadband services for all Americans.”

This is what you need to know: for quite some time now, the lobbyists for the telecommunications industry have been consistently repeating the same message to the FCC. Reduced to its most simplistic form, the message is that competition, as mandated by the current interpretation of the Telecommunications Act of 1996, is causing the problems the US is facing with respect to broadband deployment. The argument has some validity (all lies have a kernel of truth, right?) if we follow the logic.

Independent ISPs are only interested in deploying in the areas that will make them money (DUH), and there is no parity (there’s that word again) in the mandate that universal coverage be provided. The telecommunications world uses a model in which densely populated areas subsidize the rural (read: less profitable) areas, so if independent ISPs are allowed to usurp that revenue, the telecommunications industry cannot provide service universally.

Let me translate that for you in case you’re having trouble with the telecommunications industry’s ability to communicate.

We need the entire market all to ourselves in order to make our business model work.

Certainly, we can see that this is something that would be great for this country based on over a century’s worth of empirical data we have collected. After all, the American public was completely satisfied with the way we were all treated when we had a monopoly telephone company in the past.

(The sarcasm I felt when I typed those last two sentences was nothing short of scalding.)

Let’s not talk about the fact that the telecommunications industry wanted no part of this “Intraweb Thingy” when it was first announced. They completely ignored developing any business in this new experiment, leaving it instead to the private sector. Let’s also not discuss the deal struck when the Telecommunications Act of 1996 was formed: the “Baby Bells” would be granted access to the revenue generated from long distance if they opened up their networks. We also need to forget the promises Verizon (then Bell Atlantic) made to deploy broadband in Pennsylvania in exchange for huge tax breaks, promises that were never kept. As long as we follow the above recommendations, we can all go to bed and sleep comfortably knowing that our friends at the telephone company have everything in hand and under control.

What is the bottom line?

Here’s the scoop: the Internet is made up of quite literally millions of connections, all exchanging information with each other. At the risk of stating the obvious, the more each one of these connections costs, the more expensive it becomes for all of us. (DUH)

Any of us with a long-term memory that stretches back to the telecom monopoly days can easily remember when $0.25/minute was a “good price” for a long-distance call. The reason we all enjoy the inexpensive rates we do today is solely that the monopoly was broken up and competition was allowed to drive down the cost. Need further proof? When was the last time you received a notice from your friendly local telephone company informing you that your rates were going down? What? You can’t remember? That’s funny; I’m sure you can remember the several increases in your phone bill. I certainly can. Now, anyone care to guess what might happen if the telephone company were to become the sole source of Internet connectivity at retail rates?

This is what line sharing is all about. The Telecommunications Act of 1996 allowed independent companies to get reduced (the term is wholesale) rates from the telephone company to provide services over the telephone network. The telephone companies were a little less than cooperative, with numerous reports from too many companies to count alleging “questionable” behavior on the telephone company’s part. Most recently, SBC has dropped rates on its DSL service in many of the markets it serves, but the wholesale line, as provided to independent providers, is billed at such a rate that independent ISPs claim there is no way a provider buying at wholesale can match that retail price. How can that be? Isn’t that the definition of antitrust, you ask?

“U.S. legislation designed to prevent businesses from price-setting or other secret or illegal collaborations that circumvents the natural forces of a free market economy and gives those engaging in the anti-trust conduct a covert competitive edge.”

-Definition courtesy of

All of that is extraneous to this conversation; instead, we should be looking at what the direction Chairman Martin has declared might mean to us.

I would submit that we all benefit from the fact that there is competition in this industry. It stands to reason that if there were no competitive pressure on the telecommunications industry, we would expect to see their pricing structure rise, as history has repeatedly shown us. What happens if line sharing is taken away? Where is the competition going to come from? Certainly not the independent ISPs that provide service using wholesale circuits.

What about wireless? Isn’t this a great thing for the WISP community? The short answer as I see it is NO!

Let’s take a look at that short-sighted answer, shall we?

Where do most WISPs get their connections? The answer is complicated, because WISPs tend to get their connection from the least expensive source possible. That is usually not the telephone company, as they are almost never the least expensive option.

If we eliminate the telephone company we have a list that is made up largely of CLECs (these are usually the very same people who purchase their connectivity and delivery from the telephone company at wholesale pricing) and the independent bandwidth providers like Cogent that have agreements to interconnect with the carriers.

Anyone else is irrelevant at this point because they probably connect to one or the other choice listed above.

What happens when the CLECs are legislated out of business by the removal of their ability to purchase services at wholesale rates? I think it is safe to assume the replacement cost of connectivity is not going to be as cheap as what WISPs had. But that leaves the independent providers, right? It most certainly does! If you were one of those providers and a large portion of your competition were removed from your market, what would you do? If you answered that question honestly, you probably said you would raise your prices.

Yes, I know that many of you have contracts, and that affords you some protection, right? I’m sorry, but did you read that contract right down to the very small print that allows your upstream provider to terminate your contract any time they choose? Every single contract I have ever read or heard about contains just such a clause. Add to that the reality that if the FCC removes your upstream provider’s ability to deliver connectivity, then you are out of luck, period.

So, want to celebrate how good this decision is for you Mr. WISP?

Then think of this…

Frank Muto of the WBIA tells me that the FCC looks at it this way: the independent ISPs are one homogeneous group. We will stand together or fall together, but we are one solidified group in their eyes.

We just watched the decimation of our larger, more established half and we are next in line.

We do live in interesting times.

This New York Times article (registration required) provides a short summary of this excellent article by Thomas Bleha published at Foreign Affairs.


In the first three years of the Bush administration, the United States dropped from 4th to 13th place in global rankings of broadband Internet usage. Today, most U.S. homes can access only “basic” broadband, among the slowest, most expensive, and least reliable in the developed world, and the United States has fallen even further behind in mobile-phone-based Internet access. The lag is arguably the result of the Bush administration’s failure to make a priority of developing these networks. In fact, the United States is the only industrialized state without an explicit national policy for promoting broadband.

It did not have to be this way. Until recently, the United States led the world in Internet development. In the late 1960s and 1970s, the Department of Defense’s Advanced Research Projects Agency conceived of and then funded the Internet. In the 1980s, the National Science Foundation partially underwrote the university and college networks — and the high-speed lines supporting them — that extended the Internet across the nation. After the World Wide Web and mouse-driven browsers were developed in the early 1990s, the Internet was ready to take off. President Bill Clinton and Vice President Al Gore showed the way by promoting the Internet’s commercialization, the National Infrastructure Initiative, the Telecommunications Act of 1996, and remarkable e-commerce, e-government, and e-education programs. The private sector did the work, but the government offered a clear vision and strong leadership that created a competitive playing field for early broadband providers. Even though these policies had their share of detractors — who claimed that excessive hype was used to sell wasteful projects and even blamed the Clinton administration for the dot-com bust — they kept the United States in the forefront of Internet innovation and deployment through the 1990s.

As you know, this is a subject I am seriously concerned about, not so much from the standpoint of who is to blame but of how we can change policy to deliver quality broadband everywhere in our nation as quickly as possible.

From my perspective, there is no acceptable excuse for holding this deployment back. The rewards are far too great for us to even consider putting this off, and the potential for economic devastation is far too great to ignore.

What has happened? Where did we divert our attention from this goal and become one of the worst-connected countries in the world? Is this simply a matter of policy? Can anyone seriously believe that the administration created roadblocks to impede the national deployment of broadband, or is this simply a matter of not understanding the dynamics? Either way, the issue needs to be refocused from assessing blame to how we can not only fix the problem but also do it in a timely fashion, before the damage becomes devastating.

To that end, let’s take a look at the potential alternatives.

The first and most obvious is the ILECs, who have been lobbying for as close to complete control of this infrastructure as possible for as long as anyone would care to remember. If we wish to take this option seriously, we need to ask ourselves two serious questions. Do we have any faith that the ILECs could deploy a national network in a reasonable period of time? The second question (and one that is certainly more important) is do we honestly believe that the ILECs are a sustainable entity, capable of being viable in the long term?

The history of this industry shows that the ILECs have almost never met their own projections, in terms of not only cost but also timelines. That is a seriously damning piece of evidence. While a case could be made that the ILECs rolled out a telecommunications network and kept it running for the better part of a century, we must also say that if this solution had satisfied the majority of Americans, there never would have been a backlash strong enough to force the monopoly to be broken up and competition to be introduced. With the exception of the ILECs themselves, I believe the opinion is nearly universal that this decision was the right move, and when you can get that many people to agree on anything, one has to wonder if it shouldn’t be considered a fact.

One more point to consider: how long would it take the ILECs to deploy complete broadband coverage over the entire US? I am not sure anyone could reliably provide us with a number. I can tell you that if this project were to be mandated, the lead time for the necessary hiring, equipment acquisition, and the actual buildout would probably be measured in decades rather than years.

The next most viable candidate is the cable industry. If we take a look at the cable industry, we find that it faces many of the same challenges the ILECs do. I don’t believe that even the most optimistic predictions would give this industry a reasonable time frame to connect the entire country. Let’s face it, there are simply too many miles of cable to be strung and too few customers to collect revenue from for this industry to succeed without an incredible amount of subsidization – something I am sure we would all like to avoid if possible. Please note – I cannot see how the ILECs could deploy any kind of universal network without similar levels of subsidization either.

Moving down the list, we have the satellite providers. As the technology sits, we have an incredibly expensive investment with almost no return from a state-of-the-art point of view. As long as satellites are stationed 22,000 miles out in orbit, we will have to deal with a lag time that prevents the institution of many of the services the next generation of the net will need. Couple that with the bit caps all of these companies now put in place, and the reality is that this is simply the choice of last resort – one that continues the digital divide.
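To put that lag in perspective, here is a back-of-the-envelope sketch of the physical minimum round-trip time to a geostationary satellite. The altitude and speed-of-light figures are standard approximations I am supplying for illustration, not numbers from this post:

```python
# Back-of-the-envelope round-trip latency to a geostationary satellite.
# Figures are illustrative approximations: ~22,236 miles altitude,
# light traveling ~186,282 miles per second in a vacuum.
ALTITUDE_MILES = 22_236
SPEED_OF_LIGHT_MPS = 186_282  # miles per second

# A request/response crosses the up-and-down path twice
# (user -> satellite -> ground station, then back), so four legs total.
one_way_s = ALTITUDE_MILES / SPEED_OF_LIGHT_MPS
round_trip_ms = 4 * one_way_s * 1000

print(f"Minimum round-trip latency: {round_trip_ms:.0f} ms")
```

Nearly half a second before a single bit of processing or terrestrial routing is counted – which is exactly why interactive services suffer on geostationary links no matter how much bandwidth is provisioned.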

The LEO (Low Earth Orbit) satellite technology is advertised to cure the latency issue (and should), but there is still the issue of total bandwidth available, coupled with the total cost to deploy and maintain this type of system. Still, this might be a technology we will have to move forward with as an intermediate step for some of the most rural areas. If so, this will be one of the worst examples of a Band-Aid solution I can think of. Hold your nose and put up with it if you must, but the reality is I look at this technology as I used to look at AM radio – is it better to have an AM radio and leave it off, or not have one to begin with?

For the sake of sanity I am going to lump the ideas for blimps and other continuously airborne aircraft in with the satellite technology. While I admit this is not entirely fair – the latency and aggregate-bandwidth issues aren’t as pronounced with this technology – the cost to build and continually maintain such a system makes it one I have a hard time accepting. I am sure that at some point the total cost of operation crosses the line where a fiber deployment is a better investment.

Then we have the WISP industry – one that for the sake of fairness I will break up into two segments. Instead of the usual division of licensed and license exempt I am going to instead look at what I call the professional and amateur deployments.

For professional deployments I would like to use three very different companies as examples: Clearwire, Towerstream, and Verizon.

The reality is that each of the companies mentioned above has some very interesting things going for it and also some serious deficiencies. However, since we are talking in terms of universal broadband coverage, I believe it is safe to say that none of them will be able to meet that requirement. In fact, all of them depend on a minimum population density, which would force them to ignore most of the rural areas of this country.

Let’s look at the amateur WISPs (as I have labeled them) in this respect. This is a very interesting group, as they have already shown that they can effectively provide service to rural areas in a sustainable manner. However, in my opinion, the three things preventing this industry from moving forward are the cost of upstream connectivity, the outright cost of equipment to build their networks, and the fact that WISPs are relegated to a few slices of near-worthless spectrum cluttered with interference generated by all kinds of devices.

The good news is that the FCC has provided WISPs with a slice of spectrum (3650 MHz) that might finally allow them to deploy reliable infrastructure and provide service in these areas. The downside is that in order for this to be successful we need equipment that is very inexpensive. Since equipment for this band is not produced in the kinds of quantities that Wi-Fi gear is, the reality is that anything manufactured for it is probably going to be too expensive for WISPs to utilize. If the cost of the equipment can be dropped (the promise of WiMAX?) along with the cost to connect WISPs, we might actually see a viable WISP industry start to answer the problem of providing real connectivity to the most rural areas of the country.

Is there one right technology that we should be counting on to make universal broadband access a reality in this country? I don’t believe so. If we are to move this goal into high gear we are probably going to need all of these technologies to work together.

Here is what I would suggest.

We change the priorities of the ILECs to the middle mile (setting up ultra-high-capacity backbones to everywhere in the country) instead of their current focus of trying to own the customer right down to the last foot. We also encourage the cable industry to build out high-capacity pipes into rural areas. As these pipes are highly profitable, I believe this is perhaps the best way for these companies to take advantage of the opportunities this kind of push for universal broadband might offer them. Once the entire country has the high-capacity backbone in place, we can then reinstitute the race for the last-mile customer.

At the same time we need to encourage the wireless providers to deploy high-capacity pipes to connect areas that otherwise would be left for last. This could be done with ease if the powers that be wanted it. Let’s put together guaranteed funding for projects like this, coupled with mandated connectivity to the first mile at subsidized prices. Let’s make the equipment less expensive by pushing demand through the roof, so that economies of scale kick in, forcing competition up and prices down.

Finally, there is one more thing we need to do – get the idiots out of the decision making positions. I’m sorry to say this but if you don’t know what you’re talking about, you shouldn’t be part of this most critical decision process.

Yes, in case you had any doubts, I’m looking squarely at you over there!

This is not something you pick up in a fifteen minute lecture. This is a critical infrastructure and requires people who understand this complicated subject to be given the authority and freedom to do what is right. If you don’t know that you qualify for this, you most certainly don’t. If that is the case I respectfully ask that you give up any position of authority you may have and step aside, yielding to those of us that do.

This is our country, our future and probably the single biggest challenge we will face in this generation. This is no place for well-intentioned amateurs.

Yet another study has been released that seems to back up many of the earlier ones I have been reading. Emarketer’s study details several important trends and makes a number of predictions as to what we might expect in the coming three years.

Not surprisingly we find steady growth predicted in many of the important areas that we think of as broadband indicators including adoption of on-line banking, E-commerce, content purchases and VoIP adoption.

This is something we need to seriously consider from several standpoints, probably the most important being the infrastructure we will need to accommodate this forecast increase in traffic. From a wireless perspective I see a developing problem as VoIP and on-line content become more popular. It is now (based on current-year technology) a pretty widely accepted fact that an individual access point can only handle a maximum of roughly 10 concurrent VoIP sessions before it maxes out. As we approach that maximum, two things occur: the quality of service degrades and, eventually, the access point becomes unable to distribute any additional traffic, such as bandwidth-intensive on-line multimedia content.
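The roughly-ten-calls ceiling can be sketched with some simple airtime arithmetic. All of the inputs below – the G.711 payload rate, the overhead multiplier for small voice packets, and the usable throughput of an 11 Mbps access point – are illustrative assumptions of mine, not figures from the studies discussed here:

```python
# Rough sketch of why an early-2000s access point tops out around ten
# concurrent VoIP calls. All figures are illustrative assumptions.
CODEC_PAYLOAD_KBPS = 64       # G.711 voice payload, one direction
PACKET_OVERHEAD_FACTOR = 3.0  # RTP/UDP/IP headers + 802.11 framing
                              # dominate on tiny 20 ms voice packets
DIRECTIONS = 2                # a call both sends and receives
AP_USABLE_KBPS = 4000         # ~4 Mbps of real throughput on an 11 Mbps AP

per_call_kbps = CODEC_PAYLOAD_KBPS * PACKET_OVERHEAD_FACTOR * DIRECTIONS
max_calls = AP_USABLE_KBPS // per_call_kbps

print(f"Each call consumes roughly {per_call_kbps:.0f} kbps of airtime")
print(f"Estimated ceiling: {max_calls:.0f} concurrent calls")
```

The striking part is how little of the ceiling comes from the voice itself: the per-packet overhead, not the 64 kbps payload, is what eats the channel.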

The problem doesn’t stop there. The flood of packets will also wreak havoc on the backhaul equipment (assuming this is a pure wireless infrastructure) possibly degrading other WiPOPs in the process. In other words, as the technology stands at this moment we have to wonder at what point the entire technology ceases to be able to meet our needs.

To a certain extent the same holds true for the competing technologies (cable and DSL), as they are asymmetrical, limited in aggregate bandwidth, and also have a point of saturation that needs to be examined. Some of the providers are now looking to fiber as the technology of the future, but I would question which will happen first: the widespread adoption of these newer services or the deployment of fiber infrastructure. I don’t think even the most optimistic pundits doubt that fiber will fall well short of 100% deployment in the US by the end of 2008 without the immediate institution of a “Manhattan Project”-like initiative. The reality is that even if such an initiative were embarked on this morning, the delay in our legislative and regulatory process, coupled with the time needed for the industry to ramp up and their hardware manufacturers’ lead times, would consume most of the time we forecast as having left before these changes hit.

We can simply draw the conclusion that the consumer market will adopt these services at a rate that exceeds our ability to provide them. If that is the case it might be logical to assume that in many places we will have a situation where the Internet has services available where the local network cannot deliver them – regardless of demand. This is not substantially different from the situation many Americans find themselves in now in areas where dialup is the only realistic connection to the net.

This now creates a division that reminds me of the historic “wrong side of the tracks” description but one brought into today’s age as more of the “wrong side of the digital super highway” instead. Will this lead to emigration from many of the areas that do not have the necessary infrastructure? Will parents be forced to leave a geographic region to provide their children with the opportunities that will be required in the coming future? Will the value of a location now be determined by the quality of connectivity available? Will this happen? I submit it already is happening but that the effect will be significantly more pronounced in the next few years as the services that are offered become more valuable to our society.

During the last Presidential campaign both parties pledged to work on this problem with President Bush pledging to bring broadband to 100% of this country by the end of 2007.

“Bush said the entire country should have access to high speed Internet access by 2007. He has ordered federal agencies to make it easier for broadband providers to get access to federal land, and he supports banning any Internet taxes. Bush is also pushing for increasing spectrum for wireless broadband.”

As we cross the midpoint of 2005 (leaving a little more than 30 months before the end of 2007) how are we doing in real, measurable terms toward the goal of ubiquitous broadband throughout this country? Does anyone believe we will even come close to this campaign promise? With a huge portion of the country (geographically speaking) having next to no broadband connectivity how could this goal be realized? The answer is (as best I can figure) we will not be able to realize this goal and while the demand will be there the infrastructure most certainly will not.

There are several interesting unknowns that may help alleviate this problem: the newly introduced 3650 MHz band; WiMAX (if the price and availability issues are dealt with in a timely fashion); the rise of independent fiber companies like Jaguar Communications; and perhaps the biggest unknown, the ability of technology to innovate faster than many of us can predict.

I look forward to that possibility as the alternative seems rather bleak in contrast. We do, indeed, live in interesting times.

A pair of somewhat unrelated news articles caught my attention forcing me to look at what the near future might hold in store for us.

First, we have a quote from this press release,

“ResearchChannel, in partnership with the University of Washington and the Poznan Supercomputing and Networking Center (PSNC), demonstrated High Definition video transmitted across IP networks today at the TERENA 2005 conference in Poznan, Poland. Attendees observed HD programming transmitted from ResearchChannel’s DigitalWell digital asset management system over advanced networks from Seattle, Wash., to Poznan, Poland, at a rate of 270 megabits per second.

‘As a new member of the ResearchChannel consortium we are pleased to cooperate with the University of Washington to deliver high quality HD transmission and present it at the TERENA Conference. This enables us to showcase the demanding application which utilizes optical networking technologies to deliver HD quality streams over IP,’ said Maciej Stroinski, Ph.D., technical director of PSNC. ‘We expect a rapid growth of such services, enabled through developments in optical networking and IP, which will open new application possibilities in the multimedia industry. We are committed to cooperate with ResearchChannel in further research of this subject.'”

A 270 Mbps video stream? What would a technology like this hold for high-definition video communications? If this service were inexpensive enough, would Grammy subscribe to it so she could see her grandchildren in living color? Is this a service for which wide adoption could be expected? Would this kind of infrastructure finally make it possible for real telecommuting to take off? What kind of competitive advantages would it provide our country once this type of infrastructure was installed everywhere – like telephone service is now?
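It is worth pausing on what 270 Mbps actually means in raw data terms. The stream rate comes from the press release above; the comparison to a 1.5 Mbps DSL line is an assumption I am adding as a typical consumer connection of the day:

```python
# What a 270 Mbps HD stream means in raw data terms.
STREAM_MBPS = 270        # rate from the ResearchChannel demonstration
TYPICAL_DSL_MBPS = 1.5   # assumed typical consumer DSL line, for contrast

bytes_per_second = STREAM_MBPS * 1_000_000 / 8
gigabytes_per_hour = bytes_per_second * 3600 / 1e9
dsl_ratio = STREAM_MBPS / TYPICAL_DSL_MBPS

print(f"{bytes_per_second / 1e6:.2f} MB every second")
print(f"{gigabytes_per_hour:.1f} GB per hour of video")
print(f"{dsl_ratio:.0f}x the capacity of a typical DSL line")
```

Over a hundred gigabytes per hour, on a connection nearly two hundred times faster than what most broadband subscribers have today – which is exactly the gap between what the Internet can offer and what the local network can deliver.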

There is also this news story today discussing the inability of even our newest not yet ratified standard (802.11n) to meet our projected needs moving forward.

“‘There is a common opinion throughout academia, industry and business that the current wireless technology fulfills neither current nor future demands,'” according to those behind the Wireless Gigabit with Advanced Multimedia (WIGWAM) project. This German-led consortium of corporate and university researchers says 100Mbps is the bare minimum needed for the future of wireless. It is using the 108Mbps 802.11n and MIMO technology as the starting point for bringing 1Gbps wireless into offices and homes.”

Yes, I know that the article is talking about indoor LANs but the reality is that we are only an extension of that technology. I also know that people who have real high speed LANs in their office would never go back to a 10baseT network or an original Token Ring network.

Like it or not, we are moving towards connecting everything into a worldwide LAN (oxymorons, anyone?) where data of any kind (voice, video, etc.) can be exchanged in the blink of an eye.

To me, the question isn’t if this will find its way into the mainstream but when.
Will wireless be able to handle these kinds of extremes?
Only if we demand it.

What an opportunity!

In the never-ending battle to spark high-speed Internet deployment in the United States, a new piece of legislation has been introduced. Senate Bill S. 1147, introduced by Senators John D. Rockefeller IV (D-W.V.) and Olympia J. Snowe (R-Maine), provides for companies that deploy broadband infrastructure to expense 50 percent of their investments in current-generation technology and 100 percent in next-generation technology. Here is the exact text of the proposed legislation for anyone who would like to examine it closely.
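A quick sketch of why the expensing percentages matter: immediate expensing pulls the tax deduction into year one instead of spreading it over a long depreciation schedule. The dollar amount and the 35% corporate rate below are assumptions for illustration, not figures from the bill:

```python
# Illustrative first-year tax effect of the expensing provision.
# The $1M investment and 35% corporate rate are assumptions for this
# sketch; the non-expensed remainder would normally be depreciated
# over many years rather than deducted immediately.
TAX_RATE = 0.35
INVESTMENT = 1_000_000

def first_year_tax_savings(expense_fraction):
    """Tax saved in year one from immediately expensing a fraction
    of the investment, rather than depreciating it over time."""
    deduction = INVESTMENT * expense_fraction
    return deduction * TAX_RATE

current_gen = first_year_tax_savings(0.50)   # 50% expensing
next_gen = first_year_tax_savings(1.00)      # 100% expensing

print(f"Current-generation gear: ${current_gen:,.0f} saved in year one")
print(f"Next-generation gear:    ${next_gen:,.0f} saved in year one")
```

Under these assumed figures, next-generation equipment frees up twice the year-one cash of current-generation gear, which is precisely the subtle push toward better infrastructure described below.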

The overall impact of this bill is probably not easily comprehended at first glance. Yes, it motivates ISPs/WISPs to deploy equipment, but in a significantly more subtle way it provides substantial incentive for broadband providers to roll out next-generation infrastructure. This, of course, has an upward-spiraling effect, as it puts pressure on manufacturers to develop realistic equipment that would meet whatever this bill defines “Next Generation” to be. Another benefit this piece of legislation brings to the table is the effect it will have on the investment community.

This legislation is near brilliant, in my opinion, as it manages to accomplish several necessary things while avoiding the pitfalls of doling out grant monies with questionable results.

There is one slight problem (and this is where you come in) – this is proposed legislation – not law.

This is something you need to know. In this one case it doesn’t matter whether you are a huge ILEC, a large ISP, a WISP, or simply someone who wants to see affordable broadband everywhere in the US – the benefit to us all is equal. Since the cost of hardware would be reduced for the infrastructure providers, the cost to the end user would, we would like to think, also come down. In short, if you are reading this, you should be enthusiastically supporting this bill. I mean it: you need to let your legislators know that this bill is a real win/win for everyone.

In an attempt to make your interaction a little easier, I am providing this link to assist anyone who needs the direct contact information for their Senator.

One thing you should be aware of: no action you take is wasted. For every single letter, email, fax, or telephone call, the interpretation is that there are many more people who support the same viewpoint but did not take the time to voice their opinions. This IS a numbers game, and the action you take will have a far greater effect than casting one vote.