An interesting wrinkle with cognitive radio actually being tested in the real world…

A Cell-Phone Network without a License

By Tom Simonite

As excerpted from the link above,

While most radios can only use frequencies that are completely clear, xG’s radios can unlock more free space by analyzing channels whose use varies over time, Rotondo says. Signals can then be inserted in between bursts of activity from a device using that channel.

“Where a more conventional radio would see a wall of signals, we are able to put our packets in between them and move around between those gaps,” he explains. “Using that method, we find that even in an urban area, the 900-megahertz band is really only around 15 percent occupied at any time.”
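To make the gap-finding idea concrete, here is a minimal sketch of how such a radio might estimate channel occupancy and locate transmit opportunities. This is not xG’s actual algorithm – the energy-detection threshold and the sample values are illustrative assumptions:

```python
# A minimal sketch of the idea described above, not xG's actual
# algorithm: sample a channel's received power over time, mark the
# busy intervals, and estimate how much of the band is actually
# occupied. The threshold and sample values are illustrative
# assumptions.
BUSY_THRESHOLD_DBM = -85  # assumed energy-detection threshold

# Simulated received-power samples (dBm) for one 900 MHz channel.
samples = [-97, -96, -60, -58, -95, -99, -62, -98, -97, -96]

busy = [power > BUSY_THRESHOLD_DBM for power in samples]
occupancy = sum(busy) / len(busy)
print(f"Channel occupied {occupancy:.0%} of sampled slots")  # 30% here

# A cognitive transmitter would queue its packets for the idle slots:
idle_slots = [i for i, is_busy in enumerate(busy) if not is_busy]
print(f"Transmit opportunities at slots: {idle_slots}")
```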

Here’s another snippet from an expert for whom I have an incredible amount of respect,

Craig Mathias, an analyst with the Farpoint Group, which specializes in the wireless industry, has inspected the Fort Lauderdale network. “It really is just like using a regular cellular system, even though the technology is so different,” he says.

The potential for cognitive radio to make better use of spectrum has motivated many companies and academic labs to work on the technology in recent years, says Mathias. “The real advance of xG’s system is that it can be deployed in exactly the same way as a conventional cell-phone network,” he says. But exactly how xG will bring the technology to market is unclear. “One option may be for a carrier to use this in an area or market where they don’t have spectrum, or to serve rural areas without coverage.”

As a personal commentary on this topic, we live in an age where our technology is beginning to eclipse our society’s ability to govern it. From my perspective, this isn’t necessarily a good or bad thing; it simply is.

The technology being discussed in the above article would have been impossible (or perhaps impractical) a half century ago, certainly from a cost-effectiveness standpoint.

With CPU processing power increasing while cost and size are decreasing, the ability for us to leverage computing power in ways never before thought of is quickly becoming a reality.

Arun Mehta mentioned on his awesome mailing list that WiFi was given “garbage spectrum, and it came up gold,” which is a fairly accurate assessment. Conversely, in my not-so-humble opinion, it is also uncontrollable, and that lack of control makes it difficult for any investor to monetize it.

I submit that it isn’t a technology issue we are discussing here; it is a policy and business model that needs to be reworked or reinvented. I would further posit that inventing and deploying the technology will be far easier than changing a political system – forget trying to change entrenched money.

But there is cause for hope, ironically brought to us courtesy of this economic downturn and the inability of big business to turn a large enough profit to keep their greed satiated. The evolution in communications will more likely be sparked by the business community not being able to justify their investments while the need for communications infrastructure remains.

Therefore, to my unorthodox way of thinking, the question we should be asking is, how do we accelerate the oncoming telecommunications crash so that we can implement the solution we all know is coming?

(Hat Tip to David Isenberg’s Fast Fail policy)

Listen up!

No, the sky is not falling but that sound you just heard is the first tiny snowball in what is eventually going to be an avalanche. And like all avalanches that preceded it, this will not be the end of life on the planet, or even the end of our society as we know it, but it will lead to carnage somewhere on the level of the Telecommunications Crash of 2000-2001 with the distinct possibility of requiring a massive government bailout.

What the hell? One in eight consumers will eliminate or scale back their cable, satellite or other pay-TV service this year and this is a problem even worthy of mention?

Why yes, yes it is, and even if you didn’t ask, I’m still going to explain.

Recently, Verizon announced that they would be adding 3M FiOS homes in 2010 but have now cut that to 1M for budget reasons. While the rationale provided was “budget reasons,” one might want to think about exactly what that reasoning actually means. Is it safe to assume that if the FiOS network were reasonably profitable we should logically expect to see Verizon continue to build out this network? To expand on the previous question, if the FiOS network were wildly profitable, wouldn’t it stand to reason that Verizon would be actively courting investors to raise the cash necessary to expand the FiOS footprint well past what they had initially set out to do?

Well, what can we assume caused the construction of what should be considered the next generation of telecommunications and information infrastructure to be suspended?

How about a faulty business model?

As with any venture, a business plan needs to be created, one that includes financial projections outlining both expenses and revenues for a set period of time. For those unfamiliar with this process, in layman’s terms, this means that someone is going to have to make an educated guess at those numbers. If you were to ask anyone who is intimately aware of how this process is tackled, without exception, they would tell you that the further out the financial projections are forecast, the more likely they are to be inaccurate. Add to that formula the uncertainty that high tech brings to the mix, and trying to guess both revenue and expenses several years out is an exercise in psychic abilities.

Starting somewhere around the beginning of this millennium, the term “Triple Play” was introduced. This concept made the claim that customers would be looking to secure voice communications (defined as fixed or landline service), Internet, and Television (or, more correctly, Video Entertainment), and that any provider who was serious about providing any of these services had better provide all of them. A later variation, the Quad Play, followed up on this by adding mobile voice (and now data) to the mix. While no claim was ever made that 100% of the customer base would subscribe to all of the services, it was expected that virtually 100% of available customers would sign up for at least one service.

So, what happens when ambitious businesses decide to embark on the construction of a business plan that would bring them into alignment with this new model? Well, the short answer is that the plan is constructed using four revenue streams, weighting each one according to the percentage of total revenue that market demand seemed to indicate it would provide. On the other side of the spreadsheet came the cost to build out this project, based on the total demand these services were expected to generate.

Great – the numbers look fantastic, let’s do this thing.

And then reality sets in.
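Before walking through each stream, here is a toy sketch of how that divergence plays out on the spreadsheet. Every figure is a made-up assumption for illustration, not data from any carrier’s actual plan; the point is simply that small per-stream forecasting misses compound over a multi-year horizon:

```python
# A toy version of the quad-play spreadsheet logic described above
# (all figures are made-up assumptions): four weighted revenue
# streams, each drifting from plan by its own annual "reality" factor.
streams = {  # stream: (share of year-1 revenue, actual annual change)
    "voice":    (0.30, 0.80),   # landline revenue erodes 20% a year
    "video":    (0.30, 0.92),   # pay-TV slips 8% a year
    "internet": (0.25, 1.05),   # broadband grows modestly
    "mobile":   (0.15, 1.10),   # mobile grows, but stresses the network
}
plan_total = 100.0  # planned year-1 revenue, in arbitrary units

for year in range(1, 6):
    actual = sum(plan_total * share * change ** (year - 1)
                 for share, change in streams.values())
    print(f"Year {year}: planned {plan_total:.0f}, actual {actual:.1f}")
```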

Voice? – Yes, about that. Well, we didn’t forecast that a high percentage of our customers would be dropping that voice line at home in favor of using a cell phone exclusively. And, well, we really didn’t correctly anticipate companies like Vonage dropping the price of voice service to almost zero. And then there was that Google thing that actually did drop voice calling to zero. Ah well, it’s only one revenue stream, we still have three others.

Television? – Wait, what? The number of people that watch TV is dropping? Okay, we forecast that into our model, but maybe we didn’t expect that number to drop quite as fast as it did. Add to that the issue that many of these customers would elect to use an outside supplier for their video content, like Netflix, which would eat away at our projections. How bad is it? Netflix: 55 percent of subscribers now streaming movies. Yes, that’s got to hurt. Still, we’re making money, right? And we’ll still be making money after one in eight of our customers either downgrades or eliminates our pay TV service – just not as much.

Internet? – Ah, the one bright, shiny spot in the business plan that we got right. After all, we did predict that we would see some pretty serious adoption rates and that seems to be holding true. And we did build out a network that could handle the bandwidth demands, right? So, all is well – with the exception that those bandwidth demands are climbing, and while we built a network capable of servicing that requirement, there are escalating costs that we might have underestimated a bit.

Mobile Communications! – That’s going to save the day. What could possibly go awry in that revenue stream? I mean, it’s a cash cow that keeps on giving, between low bandwidth applications like text messaging and voice, we’re good for years and years, right?

And then the iPhone hit. Well, to be fair, that obscures the load that all of the previous smartphones had already added to the network. And then the iPad hit. Next up, the 3G iPad and an increase in people watching video on their portable devices – across a cellular network that was designed for low bandwidth applications.

I wonder if this might be why we see Verizon moving their financial resources from FiOS to cell network expansion. Could it be that Verizon now believes that this will be the one leg out of the Quad Play that will fill in the gaps?

If so, what happens when mobile voice communications move over to the data side? And what if the data side then moves to license exempt (WiFi) connections, where they are available?

To be perfectly fair, this article has used Verizon to illustrate the situation when Verizon is probably the most capable company out there to handle this insanity. AT&T under Ed Whitacre didn’t seem to show the foresight to handle this onslaught – but thankfully, Mr. Whitacre has moved on to saving General Motors – a company that needs vision, probably more than anything else. Qwest? Who?

So, will we hear the other shoe dropping? If so, what will it sound like? To be quite honest with you, I don’t know – but I’m looking forward to experiencing it. There’s a killer app coming; in fact, it’s overdue. It’s another Napster, but I’m not sure what shape it will take or what that manifestation will actually encompass – but it’s coming.

The question is, will the walled garden collapse when the other shoe drops? And if it does, what will replace it?

Stay tuned, we’ll have that information for you after a word from our sponsors – if you’re still paying to hear it when we return.

Well, if you follow the disruptive technology rumor mill you can’t swing a dead cat without hearing about the Google Phone just about everywhere you turn.

Image courtesy of Cory O’Brien

Andy Abramson did a great job listing out all of the ways he believes Google will monetize this device, and I would like to add one more.

With the advent of universally accepted mobile payment, the Google Phone will combine with Google Checkout to decimate the credit card industry while bringing more money to Google than one might want to think about.

It’s good to be search king.

Over the years, as reflected in the writings here, the quest to try to outguess where technology will usher us has been one challenge that has never ceased to entertain me. And as with these writings, there are many other places where the health of our communications infrastructure has been written about and argued over by yours truly.

Recently, the discussion turned to what is happening with AT&T’s network as their customers make the transition from using their cell phones as mobile voice platforms (with a little texting on the side) to real data clients.

According to John Donovan, CTO of AT&T, the carrier’s wireless data traffic has increased 4,932 percent during the past dozen quarters. “I know what you’re thinking: iPhone. And you’re right, but only partially right,” Donovan said, explaining that Research In Motion’s BlackBerry devices and messaging-centric feature phones have also contributed to the increase in traffic.
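For scale, here is a quick back-of-envelope unpacking of that 4,932 percent figure (a dozen quarters being three years):

```python
# The quoted 4,932 percent increase over a dozen quarters, unpacked
# into the compound growth rates it implies.
total_multiple = 1 + 4932 / 100          # traffic grew ~50x overall
quarters = 12
quarterly = total_multiple ** (1 / quarters) - 1
annual = (1 + quarterly) ** 4 - 1
print(f"~{quarterly:.0%} per quarter, ~{annual:.0%} per year")  # ~39% and ~269%
```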

Apple Insider republishes data from a recent AdMob report that makes the claim, “In the worldwide market, AdMob notes that Apple advanced its lead in smartphone traffic share from 43% last month to an even 50%.”

Okay, so what’s next?

How about Fring making it possible to have Skype video calls on your Nokia S60-powered cell phone? If that catches on, I wonder what the data usage statistics will look like a year from now.

But Mr. Donovan isn’t finished providing a wakeup call yet. Check out this gem: “If you look at 2008 for us it was unprecedented in terms of the work we did in the backbone,” Donovan said. “The capacity we carried in 2008 five years out will be a rounding error.” Donovan added that AT&T’s 2 gigabit backbone lasted seven years, their 10 gigabit backbone lasted five, and the 40 gigabit will last three years. He then asked rhetorically, “How long will a 100 gigabit network last? At 400 gigabits I think our routers melt, I think finance likes liquid assets, but I don’t think that’s what they had in mind,” Donovan quipped.
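Donovan leaves his own question hanging, but the lifetimes he quotes invite a naive extrapolation. This is purely back-of-envelope reasoning on the quoted figures, not anything AT&T has said:

```python
# Backbone generations quoted above: 2 Gbps lasted 7 years, 10 Gbps
# lasted 5, 40 Gbps is expected to last 3. Each generation has lasted
# two years less than the one before; if that linear pattern holds,
# the 100 gigabit generation gets about one year.
lifetimes = {2: 7, 10: 5, 40: 3}  # capacity (Gbps) -> years in service
step = 2                          # observed drop per generation
estimate_100g = lifetimes[40] - step
print(f"Naive estimate for a 100 Gbps backbone: ~{estimate_100g} year(s)")
```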

Mr. Donovan also provided this bit of wisdom, “We have to rethink how we’re carrying traffic in our networks and I don’t think you can stop at just the cost per bit. We need to back out of that and fundamentally rethink how we interoperated, how networks are constructed, how routing is done and how we move content.”

Not all that long ago, there was a time when our telecommunications network was the envy of the rest of the world. It was a time when you were almost assured that when you picked up the handset there would be dialtone, and when you placed the call it would be connected, with perhaps only a busy signal preventing this from happening. On the very rare occasion when this wasn’t the case, any kind of large-scale outage was newsworthy.

In contrast, Verizon proudly advertises that they are aggressively seeking out network shortcomings using an army of people trained to repeatedly ask, “Can you hear me now?” This is actually a good thing, as Ivan Seidenberg, the chief executive of Verizon Communications, announced that they have no interest in continuing to offer landline service, instead opting to focus on cell phone service – even though telephone will be offered as an option provided across their FiOS (fiber optic) network.

Why would this be a problem? After all, the number of people dropping their landlines in favor of cell phones is increasing every year. And if this is what the public wants, isn’t Verizon doing what any good business should do – listening to their customers?

John Donovan, CTO/AT&T, tells The New York Times, “Overnight we’re seeing a radical shift in how people are using their phones,” later adding, “There’s just no parallel for the demand.”

There’s no reason to worry though, as AT&T is aggressively addressing this problem, as detailed in this press release dated September 1, 2009: “AT&T* today announced a substantial strengthening of its 3G mobile broadband wireless network where it has deployed spectrum in the 850 MHz band across large portions of metro New York City, Long Island and New Jersey.”

John Donovan, CTO/AT&T, also candidly admitted to Fortune magazine, “3G networks were not designed effectively for this kind of usage.” This leads me to wonder if AT&T has a meaningful dialog going with Mr. Donovan, as it appears there may be some internal disagreement.

In a report [pdf] recently released by Cisco, we find the following statements:

  • “Globally, mobile data traffic will double every year through 2013, increasing 66x between 2008 and 2013. Mobile data traffic will grow at a CAGR (Compound Annual Growth Rate) of 131 percent between 2008 and 2013, reaching over 2 exabytes per month by 2013.
  • “Almost 64 percent of the world’s mobile data traffic will be video by 2013. Mobile video will grow at a CAGR of 150 percent between 2008 and 2013.
  • “Mobile broadband handsets with higher than 3G speeds and laptop aircards will drive over 80 percent of global mobile traffic by 2013. A single high-end phone (such as an iPhone or Blackberry) generates more data traffic than 30 basic-feature cell phones. A laptop aircard generates more data traffic than 450 basic-feature cell phones.”
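The first bullet’s two figures are internally consistent, which makes for a quick sanity check when reading reports like this one: a 66x increase over the five years from 2008 to 2013 implies exactly the quoted compound annual growth rate.

```python
# Sanity check on the Cisco figures quoted above: a 66x increase over
# five years implies a CAGR of 66^(1/5) - 1.
growth_multiple = 66
years = 5
cagr = growth_multiple ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.0%}")  # ~131%, matching the quoted figure
```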

Businessweek adds this information about AT&T to the discussion, “Many of its 60,000 cell towers need to be upgraded. That could cost billions of dollars, and AT&T has kept a lid on capital spending during the recession—though it has made spending shifts to accommodate skyrocketing iPhone traffic. Even if the funds were available now, the process could take years due to the hassle and time needed to win approval to erect new towers and to dig the ditches that hold fiber-optic lines capable of delivering data. And time is ticking.”

How bad could this problem realistically become? Perhaps understanding what occurred in Austin, Texas, earlier this year, as described by Fortune Magazine, can help drive the urgency home.

At the South by Southwest music, film, and interactive fest in Texas earlier this year, the iPhone was all the rage — and not in a good way.

The device proved so popular with Internet-addicted attendees that AT&T’s wireless network in the city of Austin buckled under the strain, all but shutting down both voice and data service for many customers.

The good news is that alternatives exist. AT&T says its free Wi-Fi initiative isn’t a response to a recent avalanche of complaints from iPhone users that they cannot connect via 3G. Still, Jeff Bradley, the company’s senior vice president of devices, said that if more AT&T users shifted to Wi-Fi, the performance of the 3G network should improve.

That’s right, please continue to pay for the 3G network service, but if you could find it in your heart to use a WiFi access point whenever possible, it will help improve AT&T’s 3G network performance. One might assume that AT&T would appreciate it if you would use someone else’s WiFi connection, perhaps while you’re grabbing a cup of coffee somewhere.

What you need to understand is that this problem isn’t AT&T’s fault, nope, no way. As John Donovan, CTO/AT&T, explains, AT&T’s wireless data traffic has increased by more than 18 times over the past two years, and he expects this trend to continue as the company offers more smartphones and 3G netbooks.

I mean it’s not like anyone could have predicted this increased demand in traffic, surely not a telecommunications company with a few decades of experience, right?

Perhaps the most powerful tool we have to avert this crisis is fear. As no one wants to be in charge (or holding political office) the day the communications network goes down, that fear is our strongest ally.

If you’re like most of us, you’re all but drowning in too much information – well, no, not information really, but a cry to get someone else’s crap into your focus. We all know the advertisers do this; hell, they’ve been doing this for longer than anyone here has been alive. But there’s something new, something insidious happening here, something that could be quite literally dangerous if we let it continue.

The crazy people are being given the pulpit.

There was a time when that guy (You know that guy, right? Everyone knows that guy.) who stood on the corner with a handmade cardboard sign claiming that the world was coming to an end on Tuesday (even though he never said which Tuesday, which, come to think of it, still gives him a one in seven chance of being right) was largely ignored. At best he managed to collect a passing disbelieving stare or the ridicule of a few children – but that was it.

Not today, nope – now this guy is a political commentator, probably with his own radio show or at least regular guest appearances on one TV show or another. We’ve reached an age where instead of marginalizing the lunatic fringe, we are embracing them, inviting them into our homes (figuratively, to be sure) and holding their opinions up to the same standards we seem to apply to scientists.

Wait. Scientists? Well, those that believe in evolutionism are just part of that religion. And the rest of us, well, don’t you worry, this is a country where the majority rules and we can simply vote that nasty evolution out of our sight.

Courtesy of Pew Research

We are interviewing people who will stare straight into the camera and tell you that they don’t want government involved with their Medicaid/Medicare. We see has-been TV actors getting exposure on prime-time television shows and saying, “I’ve been on food stamps and welfare, did anybody help me out? No.” and this is allowed to stand without the ridicule it justly deserves.


This isn’t a free speech issue and this is not something that the government should regulate. This is an issue where each and every one of us is required to be an active participant in the running of our country.

You have a right to say anything you want and make as big a jackass out of yourself as you can – and too many Americans have sacrificed too much to ensure you keep that right. At the same time, I too have a right: the right to deride and mercilessly mock you for publicly saying something so stupid that any decent human being wouldn’t let their face see the light of day for the rest of their lives after committing such a stupidity.

This isn’t a Democrat/Republican issue, it isn’t a conservative/liberal issue; it is a contest of the rational and intelligent versus the lunatic fringe – and many of us are concerned that they are winning.

I’ll leave it to you to figure out who is “us” and who is “them,” hopefully before it’s too late.

Because, in the words of our previous Commander in Chief, “You’re either with us or against us,” even though, admittedly, sometimes it’s hard to tell the sides apart.

The New America Foundation has invited people to post their opinions and ideas regarding what a national broadband strategy should entail – in 250 characters or less. One would hope that should the New America Foundation ever have to seek medical advice they will up the character count.

To more fully address this extremely complicated issue, this article will attempt to cover many different aspects of the subject, outlining how this program should be administered – in my opinion, of course.

Preliminary steps -

Legislate that all broadband backbone assets, defined as fiber and high speed RF data transportation devices, be reported so they can be collated, mapped and used to build a picture of what is available for use. To increase the usefulness of this map, all vertical assets should be included: towers that meet code with heights that exceed 100 feet, rooftops that would be useful in providing both RF backhaul sites as well as wireless distribution, and mountaintops where it would be feasible to locate either RF backhaul or end user distribution. Any company found not complying with the reporting mandates will be fined, and these fines will be added to the broadband infrastructure funding pool.

This information will be used to create a map of what areas have high speed backbone connections already in place and which locations do not. For the purposes of this discussion high speed backbone is defined as 10 Gbps or above.

Now, create a map overlay that shows where the existing backbone infrastructure could be easily extended using the vertical assets that are immediately available. It is understood that deploying RF transportation over already existing vertical assets will be significantly less expensive and substantially quicker than building fiber.

Next, create another map overlay that shows how these proposed RF backbone locations can also be equipped with wireless last mile connectivity as well as mobile connectivity where either technology is applicable.
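As a sketch of what the overlay computations above might look like in practice, consider flagging which reported vertical assets sit within plausible RF range of the existing backbone. The 30 km hop range, the coordinates, and the simple great-circle test are all illustrative assumptions, not part of the proposal itself:

```python
# A minimal sketch of the overlay step: flag vertical assets (towers,
# rooftops, mountaintops) that sit within an assumed RF backhaul range
# of the existing fiber backbone points of presence.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

MAX_LINK_KM = 30  # assumed maximum RF backhaul hop

fiber_pops = [(40.71, -74.01), (42.36, -71.06)]      # backbone points of presence
assets = {"tower_a": (40.9, -74.2), "rooftop_b": (44.0, -70.0)}

reachable = {
    name: any(haversine_km(alat, alon, flat, flon) <= MAX_LINK_KM
              for flat, flon in fiber_pops)
    for name, (alat, alon) in assets.items()
}
print(reachable)  # {'tower_a': True, 'rooftop_b': False}
```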

Conduct a full audit of the spectrum allocation table with the aim of identifying unused and underused bands. In addition, all commercial bands currently in use will be reexamined as to the continued need for exclusive licenses, as opposed to reabsorbing the spectrum and releasing it for use in broadband infrastructure. A specific example of this is the frequency used by taxis in this country, which could instead utilize the 802.11p short range vehicle communications band. As a national strategy, let’s move towards a complete revamping of the way spectrum is allocated: all of the tiny slices of spectrum are recalled, critical licensed bands are moved out of the large swaths of spectrum we will assemble, and those large swaths are divided into licensed and license exempt (if you’ll pardon the bastardization of the term) ultrawideband slices of spectrum which will power the next generation of Internet based communications for our country. As a bonus, this move will create a huge demand for all kinds of equipment to replace the now-obsolete radio devices that are no longer usable due to the change in spectrum allocation.

Any company that operates in backbone transportation services will be prohibited from supplying end user services. This mandate will include both wired and wireless connection providers, including cell phone service. A company will be required to either provide data transportation or end user connectivity services but cannot provide both. This will also hold true for owners and investors, who will be allowed to participate in only one segment of the industry.

Phase one

Using the newly created maps, all broadband infrastructure funding to be handed out must be used to maximize the number of unserved people who will now receive broadband service. This is a crucial metric to apply to this first round of broadband funding so as to assure that the greatest efficiency will be achieved.

Eminent domain should be aggressively employed to seize (with fair market compensation) unused fiber strands owned by any private entity so as to immediately put these assets into use. In cases where fiber has been abandoned, or written off as fully depreciated, as in the case of railroad deployments or utility companies, these assets will revert to the US government’s ownership, where they will be tested and then put up for sale, with the resulting funds used exclusively to develop more broadband infrastructure.

Phase two

The second phase of this construction will entail the building out of fiber to every location in the US, with a specific population density yet to be defined. The wireless network that is already in place will be maintained and continue to operate as a secondary network, primarily used for mobile connectivity but also as a failover.


With respect to paying for this new infrastructure construction, no new taxes or direct fees will be shouldered by individuals or corporations during the first construction phase. The money has already been allocated through the ARRA bill.

The funding for Phase two and any subsequent phases will be generated based on the taxes paid by the people who were newly hired due to this construction. As such, the IRS will be required to track every new (non-replacement) hire that is added to the Internet infrastructure industry, including the manufacturers of all components used in this construction, and report the money paid into the federal tax coffers. This money will now go exclusively towards the building of more infrastructure projects.

Congress will pass legislation that creates a $10 per vehicle assessment on every new automobile, beginning in 2010. These vehicles will be equipped with 802.11p wireless communications capability, and this assessment will be collected to provide funding for the 802.11p network infrastructure. In addition, all federal funding for any RF or communications networks of any kind, E-911 included, will now be channeled into the Internet infrastructure funding pool, as will all fees collected from licensing spectrum. Universal Service Funds will now be paid into the Internet funding pool, and all funds will be disbursed to ensure that every American will have suitable high speed access (defined as 20 Mbps symmetrical) at a target rate that will not exceed the equivalent of two hours of the current minimum wage for basic connectivity (at the $7.25 federal minimum wage of the day, roughly $14.50), with optional services provided at going market rates.

And to paraphrase Groucho Marx, these are my beliefs – and if you don’t like them, I have others.

The Information Superhighway. What promise that term engendered, at least for those of us who treasure good, solid information. It is important to note that nowhere in that term was there ever any suggestion that the information being delivered down that superhighway would be of any quality level or have anything to do with quality at all.

According to Russell Ackoff, there exists a ‘Knowledge Hierarchy’ or ‘Knowledge Pyramid’, more correctly referred to as an inverted pyramid, where a progression is formed by taking the lowest component (Data) and, through an aggregation process, moving through Information, then Knowledge, on to Understanding, and eventually Wisdom.

Another pyramid, one that attempts to describe Maslow’s hierarchy of needs, sees self-actualization as the pinnacle of development, whereby people will “embrace reality and facts rather than denying truth.”

Whether or not one subscribes to either of these theories, it should be readily apparent that information which is corrupt will pollute the decision making process to the detriment of all. If we are to assume that the DIKW hierarchy is correct, what happens when the data stream is full of bad data? Can we apply the GIGO (Garbage In Garbage Out) theory of computing to this question? And, if so, what happens when we add the concept of Cognitive Dissonance, otherwise known as Belief Persistence, to this process? To further complicate this question, what happens when the information being injected into the data stream is intentionally falsified so as to manipulate the conclusions arrived at? What effect would this have on society and society’s development?

When I buy a bottle of water I expect the container to contain as close to 100% pure water as can be reliably delivered, not some portion of water mixed with who knows what. And yet, when I seek out information on the superhighway, while there is no lack of quantity, the quality of this fundamental building block of wisdom is certainly in question. Certainly, critical thinking is a skill that must be applied to anything we come across, but how does one teach this subject? An even better question might be how one can teach critical thinking when there is no apparent baseline of good information to draw from.

Instead of prizing and rewarding excellent sources of this most precious commodity, we are now bent headlong into minimizing these sources as our media devolves into a catering service aimed at the lowest common denominator, filled with intentional misrepresentations chosen to confuse and confound anyone working to better themselves – more often than not, for political gain.

At what point do we understand that while a person is entitled to their opinion, the delusional psychopath prophesying that the return of the Hale-Bopp comet mandates that we should all commit suicide has no standing with the researcher who can show credible evidence that we are harming our environment?

This is the Information Quality Quotient or IQQ.

To say that we are in a state of flux goes beyond the definition of understatement. Chrysler declared bankruptcy last month, sticking it to Lee Iacocca – perhaps the one man who didn’t deserve it. Yesterday, General Motors announced they were declaring bankruptcy, leading to them being delisted from the Dow Jones Industrial Average and, perhaps with some unintended foreshadowing, replaced by Cisco, a manufacturer of Internet hardware.

The Boston Globe recently dodged a bullet, thanks to concessions from their employees, which loosely translates into employees making less money in return for their work. On the other side of the spectrum, Craig’s List has seen incredible growth in its advertising revenue over the last decade.

To illustrate this change, Pew Research Center recently released a report showing that online advertising is rising while print advertising is plummeting.

“Nearly half (49 percent) of Internet users say they have ever used online classified sites,” the Pew Center said in the report. In 2005, the percentage was 22 percent.

How bad is the carnage? Well, to borrow a “wisdom” from the US print media back in their heyday, it is said that a picture is worth a thousand words.

Is it just me or does the angle of trajectory in the graph above (from 2000 forward) look somewhat similar to the following picture?


Attempting to ascertain exactly why this is happening could be an interesting study, but from this author’s perspective it is not really relevant to our immediate future, and even less so when we look at the longer term projections. However, let’s look at what we do know, or in the words of that pillar among pillagers, “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know.”

Well, I honestly can’t add anything to Donald Rumsfeld’s masterful explanation of reality, but I can share with you a few interesting facts that seem pertinent.

In today’s world, people don’t seem to have the time to read. In fact, in just the last 25 years we have seen a dramatic decline in the amount of reading we are willing to do as a society. In the 1980s, President Reagan demanded that every issue presented to him, regardless of complexity, be reduced to a single page; in our day and age, Twitter limits these thoughts to 140 characters.

This seems to be the intellectual equivalent of what the advertising industry has termed “rightsizing,” which apparently began around the time when coffee ceased to be packaged in a one-pound can, even though many of the companies cheerfully printed on their cans that their smaller cans made as much coffee as their larger cans used to. I suspect that this is what led Dunkin Donuts to advertise that they were selling Kahlua-flavored syrup which could be added to your coffee, ostensibly to give it a coffee flavor.

In a recent grocery trip I noticed a box of cereal that contained 8.75 ounces, leading me to suspect that the manufacturer had left just a little more space in the box to accommodate the settling of its contents during transportation.

But I digress…

What can we extrapolate from the declining revenues in print media?

The obvious explanation is that the cost is perceived as being too high for the value received. If I look at my local newspaper and the number of readers that might see my ad, factoring in their advertisement cost, and then look at the same advertisement on Craig’s List (at no cost), somehow the chances of me spending money on my local paper are slim to none. With the recent trend of governments moving towards using their websites to publish public notices as well as job listings, two distinct ramifications are occurring. The first (and probably most obvious) impact is that what used to be considered the cash cow of print advertising revenue is disappearing. The second, and more insidious, shift is that those Americans who, for whatever reason, are not “plugged into the net” are becoming further disenfranchised, leaving them to the mercy of the televised media for all of their information.

This creates a new dynamic, one that needs to be closely examined, which asks where this online information, as well as what remains in the print and television media, gets created. The obvious answer is that someone somewhere writes this material, which is then printed or spoken in the respective media outlets. One might then ask, what effect does “rightsizing” have on both of those informational resources? Well, that leads us back to the Boston Globe story, where we are forced to wonder if the quality of their writing will remain now that they are paying less for their content. My guess is that the Globe may one day have on its masthead that this newspaper wraps as much fish as our old newspaper did – but again, I digress.

Another shift that can be clearly seen is that consumers don’t appear to be satisfied with the one-way information flow that has traditionally been how radio, TV and the print media have always done business. Sure, some radio station formats allowed for people to call in and newspapers would publish a few letters to the editor, but the idea of an ongoing dialog fell flat in both of those venues.

Today, sites like Slashdot and Fark post news stories, but the real action is in the discussion. The implementation of this type of dialog by the print media, primarily known as their comments section, falls indescribably short of what online sites are doing – and it shows.

Another wrinkle in this ever-shrinking demand for quality content has to be the content creation companies. The need for content is soaring while the price the content commands is falling off. What used to be dollars per word written has now become pennies per word, and not very many pennies at that. Many of these sites don’t actually pay their content creators but instead opt for an advertising revenue split – leaving one to wonder what the quality of their content can actually be if the author is left starving while waiting for page views and advertisement click-throughs to generate small change per day. If this is the case, and it certainly appears to be, what kind of quality control can these businesses be applying to the content they publish, given that they must regularly aggregate fresh content in order to attract return readers?

Perhaps the most troubling question about this entire issue is that if we do truly understand that an educated electorate is required for a representative democracy to function, what are we doing to our intellectual underpinnings? If the quality of the information we are being fed is taking the same trajectory as the train in the picture above, at what point will our political system be incapable of functioning?

Even more to the point, does anyone actually know if we haven’t already passed that point?

Over the decade-plus I have been involved in different aspects of providing broadband service, plus the many years prior spent experimenting with modems, one constant has never left me – we aren’t going fast enough. Oh sure, the fact that I could get online and retrieve settings for a 20 megabyte hard drive in less time than it took me to call tech support, remain on hold, chat with the engineer, then write down the information before I actually could get any “real” work done was an improvement, but it still took too long.

The equivalent today might be waiting the 17 minutes required to download the latest ISO of whatever open source variant I wish to experiment with, because just like every other impatient 53-year-old child, I want it and I want it now.
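For the curious, the arithmetic behind that wait is simple enough; the CD-sized ISO here is an illustrative assumption:

```python
# Rough arithmetic behind a 17-minute ISO download, assuming a ~700 MB
# CD-sized image (purely for illustration).
iso_megabytes = 700
minutes = 17
sustained_mbps = (iso_megabytes * 8) / (minutes * 60)  # megabits / seconds
print(f"Implied throughput: {sustained_mbps:.1f} Mbps")  # ~5.5 Mbps
```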

Along the way, the argument was at one time framed as defining the “good enough” network, which to my way of thinking was the functional equivalent of building a house just big enough to live in with all of my possessions while providing no accommodation for what I would buy and bring home tomorrow.

Well, what will we bring home tomorrow? It seems that this would certainly be dependent on how much room we have to spare.

The history of mankind has been one where we continually add to this storeroom of innovation, at times pausing for various reasons along the way, but in the longer term the body of knowledge increases as we move forward.

Great, now I need a bigger house – again.

We make the assumption that communications began with hand gestures, perhaps punctuated by the odd sound, and might have also included crude pictures drawn in dirt to get our thoughts across. From there we moved to cave art and a somewhat more involved set of sounds, then cuneiform and papyrus, with the Gutenberg press, radio and television coming sometime later. And this entire process was to facilitate getting an idea from one mind to another or, in many cases, from one mind to many. In each graduation from one technological stage to the next, the previous iteration always appeared crude as it was obsoleted.

Certainly there is nothing new under the sun with this latest communications platform, the Internet. Sure, the underlying technology is all bright and shiny, even though the modem banks have all been sent to the scrap heaps, but the motivation remains the same: the ability to convey information from one to one or one to many.

But isn’t it curious that every one of these technologies has one inherent drawback: we communicate in a two dimensional manner while living in a three dimensional reality. And that is about to change.

Let me direct you to this link, which includes an interesting technological wrinkle known as the Open Source 3D Printer.

Image courtesy of Fab@Home

While this device is still in the experimental stage, the potential seems pretty clear. The day is not far off when each home will have one of these devices and hard goods will be delivered as a stream of bits that this printer will translate into a tangible object.

But the progression along this path isn’t likely to stop or even slow here; if anything, this device will lead to even more interesting devices, ones that stretch our imaginations even further.

So what happens when we take this crude 3D printer and push its limits by employing the capability of a polymerase chain reaction (PCR) device? What is a PCR device, you ask? I’ll let this next link explain that to you.

Image courtesy of Medgadget

A vivid imagination immediately jumps at the possibility to “print out” DNA, which could then conceivably be engineered into food, possibly pets, or (be still my beating heart) even replacement parts for some of us aging folks. If this proves possible, what happens when we expand on the idea and suggest that a high speed printer might be built that could print out an entire human being instantaneously? Combine that with the ability to transmit all the necessary data that makes up this human being, and I’ll let you all draw your own conclusions from there.

But none of this is possible without a network that can lift this load.

We pride ourselves on developing higher and higher resolution graphics while leaving aside that a 3D representation exponentially increases the amount of information. In a relatively short period of time data has gone from being measured in bytes, to kilobytes, then megabytes, now the commonly accepted gigabytes, with terabyte becoming the new yardstick and petabyte being recognized on the horizon.
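A small worked example shows why that third dimension is so punishing; the resolution chosen is an arbitrary assumption:

```python
# Holding sample density constant, each added dimension multiplies the
# data volume by the full length of a side.
side = 1_000
image_samples = side ** 2   # a 2D image: one million samples
volume_samples = side ** 3  # the 3D volume at the same resolution: one billion
print(f"The 3D representation is {volume_samples // image_samples}x larger")  # 1000x
```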

All while our networks proudly measure themselves in megabits per second.

Occasionally, the roadblocks need to be pointed out to us before they become obvious.