Last week I had an extended conversation on the future of postal service and wanted to share some of my thoughts on potential scenarios 10 years from today. These are clearly quick sketches. The future – as is often the case – will likely be an amalgamation of these scenarios.  

Scenario 1: Traditional Mail Ceases to Exist, Small Parcel the Only Thing Delivered

Small parcel volume is growing.  I recall a recent statistic from Fred Smith of FedEx suggesting small parcels represent some 15% of FedEx's total shipment volume. This category of mail is driven by online retail sales and consumer-to-consumer transactions – both of which continue to increase.  Consumer-to-consumer transactions are on the rise as sites like eBay continue to gain in popularity and are used more frequently for a wider assortment of goods. Online sales represent only about 5% of retail sales today, but this share is clearly growing as well. It wouldn't be surprising to find online sales representing a quarter of all retail sales within five or six years. As these transactions increase, small parcel post naturally follows.
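As a rough sanity check on that prediction, going from a 5% share to a 25% share of retail in five years implies something like 38% annual growth in online's share – a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope: implied annual growth rate if online's share of
# retail rises from 5% to 25% over five years (the scenario sketched above).
implied_annual_growth = (0.25 / 0.05) ** (1 / 5) - 1  # fivefold over five years
print(f"{implied_annual_growth:.0%} per year")  # roughly 38%
```

Aggressive, but not unheard of for a category still early in its adoption curve.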

There are a few trends playing out in the technology sector that will also affect the rate at which small parcel post increases. First, as the retail sector has become more challenging, manufacturers are increasingly looking to sell directly to consumers.  This isn't unique to technology companies, but is playing out across a host of categories. A second element evident in technology is the rapid acceleration of product launches, the speed at which companies are attempting to bring these products to market, and the swiftness with which information about new products is disseminated to potential consumers.  Manufacturers are building less inventory over a shorter period of time before bringing a given product to market. Seeking to fill a broad supply chain in a shorter window will force manufacturers to rely increasingly on expedited, small parcel post – regardless of whether they are selling directly to consumers or through more traditional retail channels.

Scenario 2: The Death of Direct Mail

Today, direct mail represents roughly half of all mail sent. According to a report from advertising and marketing consulting firm Winterberry

John Battelle writes about Color, a new social photo app. Color creates a visual (user-generated photos), public (anyone sharing photos through Color) timeline of any given location (using a proximity algorithm). (It is worth noting Dave Winer suggested the need for a "social camera" four years ago.)  Battelle suggests Color matters because of location ("Color has the opportunity to be the first breakout application fueled by the concept of 'augmented reality'").  Fred Wilson gives his take on Color – suggesting it has promise as a social graph because it is implicit.  In other words, location defines the social graph, which minimizes the manual curating users have to do.  Proximity is clearly a key value of services – especially mobile services.  But proximity should be defined broadly.  It is location – physical proximity.  It is also time – proximity to now.  It can also be less tangible – proximity to my interests.  I often see individuals looking at these in isolation.  They focus on location as the killer aspect of services – as in LBS. Or on time – as in "real-time" recommendations.  Or on proximity to interests – as in recommendation algorithms.  Nearness to what matters includes place, time, order, or occurrence. If Color – and other services and apps like it – shows real value to users, it isn't solely because of location.

Color also has potential because it makes the social graph linear.  Sensors (cameras, microphones, GPS, etc.) are becoming ubiquitous, which in turn is enabling mass data collection (photos, sounds, location, temperature, etc.).  Because Moore's Law drives the cost of data retention toward zero, we will increasingly see these data archived.  Economists love long time series, and we are beginning to create a myriad of them.  What we'll do with these long time series is just beginning to be uncovered.  Take, for example, MIT researcher Deb Roy's work on language acquisition. Roy wanted to explore how his son learned language, so he filled his house with video cameras to catch every moment.  This exploration was only made possible by inexpensive sensors and the computing power to parse that information (you can see Roy's TED talk on the topic here: http://s.dbr.vc/fzXPn6).

One of the key elements of Roy's work on how language acquisition progresses is his reliance on the linear nature of these data.  His work could have implications for the learning process, which could help inform and improve our education methods.  This is just one example.  I'm seeing increasing instances of linear data creation, and I suspect we have just begun to see the ways in which linear data will be leveraged and analyzed.

For Color, proximity clearly matters. But more than that, the linear nature of these data is being overlooked and might in the end represent the most promising aspect of the service.

Senate Majority Leader Harry Reid and fellow Senators Chuck Schumer, Frank Lautenberg, and Tom Udall recently wrote to Apple, Google, and RIM asking them to exclude apps that allow users to identify, among other things, drunk driving checkpoints.  In the request, the Senators write, "we appreciate the technology that has allowed millions of Americans to have information at their fingertips, but giving drunk drivers a free tool to evade checkpoints, putting innocent families and children at risk, is a matter of public concern."

While RIM has already committed to pulling the app, I'd argue these types of apps should make law enforcement more effective, not less.

These apps – and their underlying services – rely on driver-generated information.  They place private knowledge into the public domain. With this information public, law enforcement gains a better understanding of what the public knows – or thinks it knows.  Once a checkpoint is registered in the system, law enforcement can simply target a new location that has not yet been logged. In this way, law enforcement has an information advantage and should be able to enforce the law more effectively.
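That information advantage amounts to a simple filter: enforcement sees both its own candidate sites and everything the public has logged, and picks from the difference. A hypothetical sketch (the function and location names are invented for illustration, not any app's actual logic):

```python
def unlogged_locations(candidates, publicly_logged):
    """Checkpoint locations drivers have not yet reported in the app.

    Law enforcement keeps its edge by choosing from these: it sees both
    its own candidate list and everything the public has logged.
    """
    logged = set(publicly_logged)
    return [loc for loc in candidates if loc not in logged]

candidates = ["Main & 1st", "Oak & 5th", "Elm & 9th"]  # hypothetical sites
reported = ["Main & 1st"]  # drivers already flagged this one in the app
print(unlogged_locations(candidates, reported))  # ['Oak & 5th', 'Elm & 9th']
```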

Law enforcement can use public dissemination of private information to enforce other traffic violations more effectively as well.  Take speeding, for example. Users log known speed traps, which then alert future users to the potential of those speed traps.  Those alerts will naturally result in drivers slowing down – at least temporarily.  Enough alerts on a given road would likely keep drivers' speeds down along the entire route.  Law enforcement could even log areas as speed traps themselves where they want to control speeding. Enforcement can be done with less manpower and cost. Inevitably, average speeds will drop and the desired results will be achieved.

Information dissemination and market mechanisms – as in this case – force market participants to compete more aggressively, and ultimately more effectively.  The argument against this class of apps is that they dissolve information asymmetries, and that those asymmetries are what make law enforcement effective. On the contrary, I argue these apps and their underlying services exacerbate information asymmetries – in law enforcement's favor – and thereby should improve enforcement effectiveness.

Can Super Bowl commercials predict bubbles? I don't know. Here are the commercials from the past 13 years if you want to find out: http://superbowl-ads.com. We did seem to have a myriad of dot-com-related commercials prior to that implosion. Signs of bubbles from this year? Perhaps deal-of-the-day sites.

Both Lessien and John Gruber take on the topic of market share and I think they miss some of the nuances.  

Lessien applies the basic business school approach: 

Large market share attracted developers who built software exclusively for the dominant platform. That software, in turn, created further lock-in as users grew accustomed to the workflows and proprietary data formats that emerged. Typified by Microsoft’s “embrace and extend” strategy, market leadership yielded a nearly permanent advantage, which suffocated competing platforms and deprived customers of choice. Essentially, the historical advantage of dominant market share has been the ability to raise (discriminately) the switching cost of competing platforms.

In other words, there are massive network effects in technology.  These network effects can lead to monopoly rents.   

While she doesn't do so explicitly, Lessien goes on to suggest these network effects aren't applicable to mobile. Even with a commanding market share, monopoly rents can't and won't be created, and therefore market share is largely irrelevant.

Gruber takes a slightly different tack by suggesting market share and profitability are loosely correlated, but that this correlation has been minimized in the world of mobile. In paraphrasing Lessien, Gruber states, “profit share seems a better indicator of success than market share — both today, and historically.”

I think the three of us agree that the most desired applications from a user perspective – the "table stakes applications" that represent the top 85-90 percent of desired applications – will be ubiquitous across all platforms. I think we also agree that the perceived horse race, created by pundits shouting every month when the market share metrics drop, is overdone.  Everyone wants to catch the inflection points, but these inflection points will never materialize as a single number.

But let's take some of this to its limit.  Developers are constrained.  Big players do have the ability to ensure their services are ubiquitous across all platforms, and thanks to web applications the "table stakes applications" are (or will be) available on the leading mobile platforms. But with what lag? Even a short lag creates network effects. Consumers, knowing that their horse will always finish but never first (a sports analogy for Gruber), will be inclined to change their bet to a platform that gets the newest and next table stakes applications first.

This year's State of the Union was about 400 words shorter than last year's address. Likely due to the changes in Congress, this year's address also drew nearly 30 fewer rounds of applause (77 compared to 106 in 2010) and about a third of the laughter (4 compared to 12 in 2010). The word clouds below illustrate some of the key themes of the two speeches.


I also did some text analysis of this year's speech compared to last year's, which provides some interesting signs of the times.
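For the curious, this kind of comparison is straightforward to reproduce. A minimal sketch in Python – the snippets below are placeholders standing in for the full transcripts, and the stopword list is deliberately tiny:

```python
import re
from collections import Counter

STOPWORDS = frozenset({"the", "a", "of", "and", "to", "in", "we", "our"})

def top_terms(text, n=5):
    """Return the n most common non-stopword terms in a transcript."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(n)

# Illustrative snippets standing in for the full 2010 and 2011 transcripts.
speech_2010 = "jobs jobs economy economy economy health care deficit"
speech_2011 = "future future future jobs innovation innovation deficit"

print(top_terms(speech_2010, 3))
print(top_terms(speech_2011, 3))
```

Run on the real transcripts, the shifts in the top terms line up with the word clouds above.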

In October I said I expected 80+ tablet launches at 2011 CES.  As CES approached, it was clear 2011 was going to be the year of the tablet, and a few days before the show I said my 80+ estimate was looking conservative.  I updated my expectations, saying I wouldn't be surprised by 100+ tablet launches at CES – and by my count we saw over one hundred. Here is a draft list of the launches we saw:

Check-in services like Foursquare and Gowalla have grown significantly in the last year. Foursquare hit 500K users in March 2010 and surpassed 5M users in the last few weeks.

Building off this momentum, other services have entered the check-in fray like WeReward (pays you to check into places), GetGlue (Entertainment), Miso (TV), Philo (TV)  and of course even Facebook entered the LBS mix with Facebook Places.  

Device capabilities like integrated GPS, larger screens, improved operating systems, and better connectivity began to reach critical mass over the last two years – making LBS technically more feasible.  But it was arguably the gaming aspects of the LBS offerings that provided early motivation – enough to finally jump-start a category. Gaming does well in the mobile environment because it naturally fills the voids created by boredom. Besting a friend to become mayor of your favorite restaurant or capturing badges provided just the right level of gaming to fill small amounts of time and boredom.