Thursday, December 24, 2009

Why Google entering the smart phone market matters to you

A post about the significance of Google's announcement that it will enter the smart phone market, published here with Ricoh Innovations. Please enjoy.

Saturday, November 14, 2009

"If I had asked people what they wanted, they would have said faster horses." Henry Ford

I admit it: I first heard this quote only the other day, and I loved it. Why? Because it gives me the perfect opportunity to talk about the trade-offs of innovation. I will explain what I mean by that in a minute.

The danger of following Henry's advice

What Henry is telling us is that if you base your innovation methodology only on the inputs you get from your potential customers, you may never leap. Sometimes you have to just go with your vision, hoping you have done enough homework to convince yourself and your stakeholders that you are not insane. Indeed, imagine what the people around Henry thought when he announced he was going to build this weird, expensive and smelly new horseless carriage and that everybody would buy one. In reality Henry was not that much of a visionary, in that the technology and the product already existed in another incarnation. Gottlieb Daimler had already invented the petrol engine and the automobile; what Henry did was realize that the time was ripe for a new positioning of the product. That was daring, visionary and indeed risky. In the quest to realize his dream Henry did end up innovating the industry, introducing the assembly line and so on. But I digress. The point here is that there is a trade-off between timing, risk and knowledge. Before diving into that, let's talk about the converse approach.

The danger of *not* following Henry's advice

Not doing anything that is not supported by evidence from your target audience has the advantage of reducing your risk considerably. Think, for example, of a tire manufacturer that introduces a new tire that is marginally more expensive but claims to save you $200 a year in gas. In this economy that tire manufacturer knows very well that this will give them a competitive advantage at low risk. It is a relatively safe bet. Now imagine another tire company that introduces a new type of tire that looks blue instead of black and has no sidewall (incidentally, such a tire does exist, though it is not blue but brown, see Michelin here). The company claims that this new tire lasts 4 times longer than the old ones and never goes flat (I am making this up). Despite the significant benefits of this product, this tire company will be taking a much higher risk introducing it, since it has to convince its customer base to get over the fact that the new tire looks so very different and yet does what it promises. It is indeed a leap of faith where the only asset the tire manufacturer can spend is its own credibility with its core customers and early adopters. This is what Amazon did with the Kindle. High risk, big reward. Note that Michelin is not as brave, and this disruptive new tire has yet to see production. We will skip the discussion as to why that is, but the author agrees with Michelin: the new tire is indeed innovative, but not enough to take the risk.

The literature calls following Henry's advice disruptive innovation and not following it evolutionary innovation. Obviously you can make or break your company with either, so you need to be careful when engaging in both. Further, the hidden danger of evolutionary innovation is the lack of challenge it presents to the R&D organization, which can eventually become completely incapable of disruptive innovation.

Let's go back to the automotive industry. Clearly this industry has thrived on evolutionary innovation for more than a century; let's face it, the gasoline engine is very old. But the party is over: the worldwide financial crisis and the public's renewed attention to gas mileage numbers are nearly killing many of the big players. Some (Toyota, Honda) saw the end of the gasoline engine in sight and started working on the next paradigm (hybrids, fuel cells). Others (GM, Ford) are playing catch-up: they missed the warning signs, and their electric car trials were just a marketing stunt, with no company realignment behind them at all. Meanwhile the whole industry was so comfortable with evolutionary innovation that it left the door wide open for new players to come in with nearly no barrier to entry: enter Atom, Tesla and many other big and small ventures trying to be the next Toyota or GM. The lesson to be learned here is that every product has a life cycle during which you do need evolutionary innovation, but your organization should always encourage disruptive innovation too, or you will end up like Pontiac and the other GM divisions now on the chopping block.

That is nice, but how does it help my business?

It should be clear by now that you need both types of innovation and that there is no single recipe to help you figure out how much of each you need. Here are some common-sense guidelines that may help:
  1. know where your product line-up is on the life cycle curve. Are you in the ramp-up? Are you still able to charge a premium and enjoying the plateau at the top? Or are you starting the downward slope to commodity? You want to stop working on evolutionary innovation some time during the plateau and redirect your efforts to disruptive innovation en masse, then and there. Depending on the size of your company and your industry you may need anywhere from 2 to 5 years for your next big thing to hit the market, so you had better make sure you have that much time left in the lifetime of the current line-up.
  2. know your talent pool. Do you have individuals in your organization with a track record for evolutionary innovation? How about disruptive innovation? Depending on (1), you want to take the necessary action to rebalance or retrain your team, if possible. And do so with plenty of notice: smart people have big egos and they fiercely resist change to their day-to-day life, no matter how creative they are at what they do. That used to include the author, I admit it.
  3. innovation should be always on. Innovation is not an event; it is a process that starts the moment the founders start a company and ends when the company dies. Or some time before that, and that is probably the reason why the company died in the first place! Moral: keep innovation going at all times.
  4. encourage employees to discuss innovation. In particular, listen to your support organization and your development/manufacturing organization. They talk to your users all the time and they know what the problems (aka opportunities) are. Most importantly, have these two organizations talk to each other and to your R&D organization... every day if you can! One caveat follows.
  5. do not listen to your sales organization! Sorry to anybody in sales, but this is the single most harmful thing you can do when trying to keep track of what your customers and potential customers want or need. The sales organization is trained to talk, not to listen. They may think they know the problems facing your customers, but they do not. They are not there when your product is deployed and its flaws or missing features emerge. Support is. Development and manufacturing are. They know because they have to deal with screaming users and customers. They may not understand the full picture of how your customers experience the daily interaction with your product or service, but they are your best and lowest-cost tool to feed the innovation process. Further, your sales organization most often does not talk to the people using your product or service but to the people buying it: depending on your line of business these may be very different people. One example will help here: say you manufacture a new medical device for blood testing. Your users are nurses. Your buyers are controllers and office managers, and your stakeholders and decision makers, who authorize your juicy PO, are doctors. Note how the decision makers and buyers in this example will never use the product, and yet those are the only people sales will talk to. QED, do not listen to sales. Of course I do not mean it literally, but you get my point: if your VP of sales tells you it is time to make pink elephants, tell him/her "good idea! we will look into it" and keep going. But if your support organization (or the data collected by it) tells you that more and more of your customers are painting the elephants they bought from you, it is time to call them up to find out why and what color they would like.

Friday, November 13, 2009

Apple and the Giant that came from the North

A few days ago analysts published a report on worldwide market share for smart phones, and Apple is firmly in third position.

Here is a link for your convenience.

What is relevant here is a couple of things. First, the trend: the "Giant" (AKA Nokia) is losing market share to RIM (the silver medalist in the race) and to Apple. While at first glance it is somewhat surprising that RIM is still growing, this can be explained by market differences and specifically by different business models. In Europe, for example, the modus operandi for mobile phones is that the end user purchases the device outright. That places the iPhone on the €599 shelf, considerably higher than most RIM devices. If you are reading this from the US, do not bother converting that to dollars, you may faint. This obviously limits the market penetration of the device and places it at the same level as the most exclusive and ugly mobile phones from Ericsson and Nokia I have ever seen. Hey, I am being honest: they look like bricks and they cost an arm and a leg because they are feature-jammed and can probably rotate your tires. But I digress. The other point here is the trend over time: Apple keeps grabbing market share, and the critical but often overlooked detail is that it is doing so with... one device. Have you jumped out of your chair yet? No? Well, let's work on that, shall we.

My point?


Considering that the iPhone is heavily subsidized by AT&T in the US, a 17% worldwide market share is something that Nokia and RIM should fear, and understand so that they can fight it. Yes, I do want competition for the iPhone; that benefits everybody.

De facto, Apple has a 17% worldwide market share with *one* device, as mentioned above. Nokia has 39% with a ton of different models, and RIM is at 20% with a somewhat more organized mess of distinct devices, but still a far cry from *one*. Getting the picture yet?

Two things are happening

Number one, Apple's rivals are plagued by product line-up fragmentation and the burden of multiple platforms. Big problem, why? Because it costs more than a pretty penny to maintain and develop multiple hardware and software product lines. More importantly, it takes some serious guts for the executives and senior management at Nokia and RIM to push the top-down, company-wide mentality change needed to move to one platform so they can compete. Not to mention you need to reorg like there is no tomorrow and possibly lay off people, including some of the managers whose help you need to push the policy forward.

Good luck with that.

Number two, the mobile industry is all about software now. Sure, one can foresee a future where consumers can purchase a more rugged iPhone for the outdoorsy audience vs. a super-thin one for the trendy bunch. But the hardware inside will be the same, and so will the software. What has happened is a paradigm shift in customization and personalization. Consumers can now project their identity onto their devices in a richer way than ever before. We went from expressing one's individuality via the physical features of the device (who can forget the Motorola Razr vs. LG Chocolate battle?) to the customization of both exterior and interior, thanks to more iPhone cases than you can count and... software. An iPhone is an iPhone, but if you grab mine and my wife's, for example, you will be looking at two completely different user experiences. That is the future. In fact I would expect to see many more ways to customize via software, like desktop themes and so on, and more ways to personalize the exterior via choices of colors, even more cases and, why not, custom paint schemes.

Wait a minute, said the Android crowd

I know, I am overlooking the Android platform. But for a good reason. I really wish Android well, but until someone finally realizes that the only product design that will work is a de facto clone of the iPhone or better, we are not going anywhere fast. Read more here, if you like. Moreover, chances are that Android will become a success story in an adjacent market: web-enabled devices other than mobile phones! Have you noticed how many such devices are running Android? We will cover that another time...

What does this mean for the players in the industry?

I would start selling Nokia and RIM stock sooner rather than later. Sure, they have a long way to fall, but I have yet to see signs that they are serious about reshaping their business to align with the new paradigm of mobile phone consumption. And if you think a corporate mentality shift is not that big a deal, you have probably never lived through one at your company: if you do not execute it right, or at all... this is how great companies die.

The good news is that it can be done. Ask IBM or indeed Apple, they know a few things about that ;-)

Friday, November 6, 2009

Droid, the new smart phone straight from the ...90s?

A slide out keyboard.

Really? Motorola guys, are you kidding me? Do you ever get out of your office and look around? Have you not seen what is happening in the consumer electronics marketplace? Even laptops and PCs use touch screens, let alone a ton of mobile devices. And what do you do? You release yet another hyped iPhone killer with a UX from the 90s. Wow, talk about living in denial.

The iPhone needs competition!
This one goes out to all mobile device manufacturers: please get your act together, accept what the iPhone did to the industry and... move on! You want to build an iPhone killer? Here is what you need to focus on:
  1. get over memory concerns. Solid state memory is getting cheaper and cheaper; put plenty of it on the device and get over it, you will not make your money playing nickel-and-dime on memory. Why? Because that ship has sailed: today you need to at least match the iPhone, and I am ready to bet 90% of smart phone users could not care less about having an SD card in their phone. Have a look at the iPhone demographics, who do you think the audience is? The cashier at the local grocery store who saves his paychecks for months to buy the flat screen TV at Walmart, that's who. That is how you make the big numbers.
  2. get over offering a ton of features that make your hardware not backward or forward compatible with Android releases. Fragmentation of your product offering is your death sentence. An original iPhone (EDGE) today can run the same OS as an iPhone 3GS. Beat that. Can't? Good, get a clue and get busy. That means you, Ericsson!
  3. superior UX. The device should have the same user guide as an iPhone: none. It should be so intuitive to use that you do not need one. Don't know how? Hire interaction designers and get with it.
  4. establish a supported development community, and the keyword here is *supported*. You cannot have one store for provider A and one for provider B and... no, no, no! Developers will not come. They have to spend their time (= money) to embrace your platform; you have to make it as painless as possible!
  5. a way for app developers to digitally sign their software so piracy is not possible! Sharing is all fine and dandy, but at the end of the day money has to change hands; developers need to pay rent too. Apple got it, and that is why there is everything from million-dollar software houses to high school students making money selling apps. That is how it is done, pay attention. It is really not that hard.
It will cost millions of dollars to do the above, but the more time goes by, the more it will cost to close the gap with Apple. Stop spending money on hype (the Droid campaign surely cost more than a pretty penny) and get busy building a valuable alternative, or get out of the business.
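The signing idea in point 5 can be sketched in a few lines. To be clear, this is a deliberately simplified illustration, not how any real app store works: production platforms use public-key signatures (e.g. RSA or ECDSA over a hash of the app bundle), whereas this sketch uses a keyed hash from Python's standard library as a stand-in, and all the names here (`STORE_KEY`, `sign_app`, `device_will_install`) are hypothetical.

```python
# Simplified sketch of app signing: the store computes a signature over the
# app bundle, and the device refuses to install anything whose signature
# does not verify. An HMAC stands in for a real public-key signature so
# this runs with the standard library alone.
import hashlib
import hmac

STORE_KEY = b"store-secret-key"  # hypothetical; a real store holds a private key

def sign_app(bundle: bytes) -> bytes:
    """The store signs the app bundle before it is released."""
    return hmac.new(STORE_KEY, bundle, hashlib.sha256).digest()

def device_will_install(bundle: bytes, signature: bytes) -> bool:
    """The device recomputes the signature and compares in constant time."""
    expected = hmac.new(STORE_KEY, bundle, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

app = b"original app binary"
sig = sign_app(app)
print(device_will_install(app, sig))                 # True: untouched copy
print(device_will_install(b"tampered binary", sig))  # False: modified copy
```

The point of the design is that a pirated or modified binary no longer matches its signature, so the device simply refuses it, which is exactly what makes the paid-app economy possible.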

[Update] rumor has it Google is working on a branded Android-based phone that will not have a physical keyboard and will not have software compatibility issues or product fragmentation. Now that *is* the way to go! Let's hope the rumors are true in this case; please see this link.

Tuesday, November 3, 2009

Hard Disks are dead, so are Blu-rays and DVDs

We all knew it was just a matter of time, and it is finally happening. Solid state storage is a consolidated technology, and its pricing will start the downward slope towards commodity very soon. The result? Hard disks will be displaced and become a thing of the past. The interesting side effect is that, wait for it..., DVDs and Blu-rays will become collateral damage in the process, disappearing from the market in the next 36 months.

At the beginning it was Air
Let's go in order. The first clear sign of things to come was Apple's MacBook Air. A clear statement that you no longer needed a CD/DVD drive. A closer look would have also told you that, if your pockets were deep enough, you did not need a hard disk either. In fact the only "Air" worthy of that name, because it was indeed lightweight, was the most expensive version, which used solid state storage instead of a hard disk. While this was done to save weight, it was still a clear sign of things to come.

Here come the net-books
Someone finally listened to consumers and realized that the time was ripe for a portable computer that does not weigh more than the kitchen sink. Enter the finally small and lightweight net-books. Yet another proof that web-enabled devices (loosely defined as gizmos that are always online) do not need physical storage media and that solid state storage is clearly the way to go. Case in point: the latest laptops now have memory stick ports. So long DVD, it has been nice knowing you.

A moment of silence for Blu-rays
The possibly unintended consequence of the rise of solid state storage is the de facto death sentence handed to Blu-rays and DVDs. The convergence of always-online devices and solid state storage makes these media obsolete as well. As a technologist I feel sorry for Blu-ray: it was clearly destined to a short shelf life, but I was hoping (and so was Sony) for a fast and furious tale instead of a stagnant and boring one due to the battle with Toshiba and the HD DVD gang. Who won? Nobody. Not even us consumers, as we had to wait for things to settle before enjoying the benefits of high definition media, and now it is basically over. Lately I give Sony credit for not crying over spilt milk and moving on already: this week they announced they will partner with Netflix to deliver movies via IP on the PS3. Yep, that is the way to go, boys. Let's all hold hands and let physical media rest in peace.

What is next?
Obviously we can look forward to more lightweight computing devices using solid state storage solutions. We can also expect that multimedia content will be sold on memory sticks for the foreseeable future, until everybody and their grandma has internet access 24x7. The advantages of this technology are many, but the one that marketing and sales will embrace is that it is cheaper to ship and package, since it is considerably smaller and lighter. Just last week I read a press release about one of the Hollywood studios working on this already. It is coming, do not say I did not warn you, and stop buying DVDs already ;-) Oh, and you may want to make sure you do not own stock in companies that did not get the picture...

Tuesday, October 20, 2009

Alex reader, how good intentions translate into a bad idea

[update: B&N announced their e-reader. It is dual-screen like Alex, but they did improve the UX a little. Most if not all of what follows still stands.]

Well, I am sorry for the engineers and designers who worked on Alex, but that is the way I feel. In case you have not seen photos of Alex, an epaper reader built to compete with the Kindle, it has two screens, one on top of the other. The top one has pretty much the same size and characteristics as the Kindle's, and the bottom one is an LCD of sorts. Frankly, the technical details are irrelevant. The problem is the UX (user experience).

Why Alex's design is wrong:
  • people in Western cultures read horizontally. Stacking information vertically goes against the grain of how people consume information. If you need to teach people how to consume information via your device... you may have a problem, a big one in fact.
  • our vision is wired to continually move the eyes in a scanning motion. Overall, the brain tends to focus on one area (i.e. the paragraph you are reading) while still capturing cues from what goes on around you. Introduce two potential main attention grabbers and you have just given yourself a headache. This is, sadly, the main principle behind web-based advertisement: how many of you enjoy being distracted by an animated ad while you are reading the news? Well, that is exactly what Alex can do. Oops. So the brain will try to focus on one of the panels while actively trying to ignore the other. Not exactly the relaxing experience of reading a book.
  • the UX model e-readers replace is a book. If you think about it carefully, that is hard enough; there is no need to get creative adding more features. A book is easy to access, share and browse, and it is fault tolerant. Drop it on a concrete floor, pick it up and keep reading. Try any of the above with an electronic device and you will see why I said it is hard to replace a book. Amazon was smart about that: they realized they should tackle the problem one feature at a time. That is why the Kindle is successful; it replicates one aspect of a book (reading from paper) while still lacking others. But users forgave them and bought the device en masse. By lacking others I mean, for example, that browsing a book or browsing your own library is much more complex on a Kindle than with physical books, but people deal with it because a Kindle weighs less than carrying around your library. So what's wrong with Alex? Books have one page per sheet, not two. The interaction with a page does not change from top to bottom; a page is a page. Ditto for the Kindle. Alex introduces split features: the top and the bottom screen are capable of doing different things, and that has nothing to do with the experience of reading a book, browsing a magazine or consulting a manual. All experiences that I am sure Alex will try to replace. See my point at the very beginning on having to teach consumers how to use your device...
  • manufacturing costs: two screens means more inventory, more electronics, more assembly costs etc. You had better have a really good reason to introduce a competing product that costs more to manufacture.
  • simplicity: consider the following. The Kindle has one screen, a keyboard and a bunch of buttons. Alex has two screens, a touchscreen keyboard (or so I hope) and a bunch of buttons. That is, now you need to learn how to use this thing, for example how to navigate between screens. Sounds unnecessarily complicated, because it is.
Wait a minute, what if Alex is a disruptive innovation?
Well, I am sure it is possible that I am missing something here, but my point is that there is a better way to build a device aimed at replacing printed media that can surpass the Kindle. That is, a tablet PC with touchscreen controls a la iPhone. One screen, one interaction model very close to paper, no new mental model required. If anything, I think B&N will help Apple take over this market if Apple ever decides to introduce the rumored iTable or iPad or whatever they will decide to call it. In fact Amazon and B&N may very well be solving the pricing problem for them (see previous post here). More in a moment.

So why hasn't anyone done it right?
Hold on a minute, someone has done it partially right: the Kindle is a good starting point. Why is it not better? Because the technology is not quite there yet; we are close, but not close enough. The main problems are:
  • refresh rate vs. power consumption of the screen, namely paper-like displays vs. your favorite flavor of LCD
  • color vs. B&W (see above)
  • weight vs. battery life
  • and last but not least, positioning and revenue model
The last one is a showstopper for many players. Before we go there, notice that Alex is a compromise on all of the above: instead of solving the problem, they doubled the solution. Good idea? You decide.

Back to the business model. Think of Sony: they were the first to introduce a capable e-reader, and yet they are now trailing behind Amazon. The latter outsold them because they own the content and they waited for the convergence of technologies needed to build a better user experience (paper-like display, affordable 3G and small, lightweight batteries). More importantly, Amazon created an ecosystem in which the Kindle makes sense. They learned from iTunes and applied that lesson to books. But if you are Acme Inc. and you manufacture devices and have no access to thousands of e-books, you do not have a business model, hence you do not have a product.

Now, B&N does have access to the content, but their attempt to one-up the Kindle looks ill-advised because of the factors I highlighted above. Not to mention that eventually e-readers will have to take it one step further and replace notepads. And that is a whole other ballgame, with a whole other set of technical and financial challenges that Alex just made harder to tackle.

Thursday, October 1, 2009

Google Wave

It looks like Google is going in the right direction, at least at the conceptual level. A dear friend of mine who works there sent me an invitation as soon as it was released, so I had a chance to play with it for a few minutes.
In a nutshell, they are giving us (the users) a new communication tool that is somewhat of a hybrid between a simplified Google Doc, a chat room and email. While I see the benefit of using it to collaborate in the strictest sense of the term, I am a bit disappointed. I was hoping for a slightly different approach, focused more on the business end. The focus seems to be on social activities, and we all know we do not need another social network to worry about.
What I like about it
It is reasonably simple: you start a wave (a context container that can handle multimedia) and start dragging and dropping people in. You can use it to manage the invite to a party as much as a project brainstorm. Not bad, well done.
What I do not like about it
Everybody can do anything to any content. While that sounds terrific on paper, in reality it poisons the very spirit of collaboration, since only two outcomes are possible:
  1. everybody will edit everybody else's content without a common path (note I did not say without a common goal; this is a tool problem, not a vision problem). This is otherwise known as chaos :-)
  2. nobody touches anything and nothing happens. If people are not engaged in whatever it is that the "wave" represents, they have very little motivation to do anything and a lot of motivation not to. Why? Google Wave *can* easily be mistaken for another social network, and people will ask themselves how this is different from Facebook, Evite, MySpace etc.
I am hoping that my first impression is wrong and that I will discover more value. The reader should notice that I purposely omitted the part where you and all the people you want to interact with have to be registered users. Take a moment here to let that sink in.

It is easy to argue that this small *detail* will hinder adoption until the case for it becomes clear to everybody. Right now I expect that non-Gmail users will resist signing up, thinking, somewhat correctly, that they do not need to deal with yet another account. Sigh.

Friday, September 25, 2009

Shaking things up: a new paradigm of online ad consumption breaks the status quo.


I hate ads. They are annoying. Yet we cannot live without them. Who would want to read a magazine without ads? Case in point: no respectable fashion magazine would ever go to press without advertisement. Ads are part of the content; they are expected. We want them. Why? They *can* be informative about new products, new trends etc., and we dare not be the last to know about the latest and greatest. We do not want to be left out.

The same cannot be said for the online ecosystem. The online alter egos of those printed ads are horribly invasive. The way they are pushed onto us consumers is just plain horrible.

So I tried an experiment with one of my start-ups. We built a simple app called "Just Ads". As the name clearly suggests, this (iPhone) app does one thing and one thing only: it shows a list of location-aware ads. That is it.

What is so innovative about that?

Well, there is no other content but ads. That is, the user is not in the midst of consuming some other content (like reading this blog ;-) so the ads are no longer competing for attention. They are the content. The mind switch is what is innovative. Users of Just Ads willingly open an application that shows them (somewhat) targeted ads of things they may want to have. They are in the state of mind of consuming advertisement. Better yet, as if by magic, we have transformed advertisement into content that people want to consume.

Do not rush to the App Store to download this little magic box though; AdMob (our ad provider) had to pull the plug on us in just 24 hrs, so we had to remove the app from the store. Why?

Too much traffic.

You would think that generating a lot of eyeballs and clicks would be a good thing, but when you do it in an innovative fashion, the status quo is often unable to cope and stares back at you like a deer in the headlights. Obviously we hit on something good, so we will try to get it back online soon. Wish me luck.

[update] negotiations with AdMob failed; they refuse to serve ads to this app, claiming that it does not provide an enjoyable experience for users and advertisers. What do you think?

[update] we signed up with MobClix; JustAds is on its way back to a device near you. Given the difference in content (ad format) we had to make some modifications and rethink the product. We hope you will enjoy it once it is out.

Thursday, September 10, 2009

Why Apple cannot build just an iTablet

There are a lot of rumors about a tablet (let's call it iTablet) coming from Apple. I have a problem taking them at face value, and I question what else is there, that we all missed, that would motivate Apple to build such a device.

Tablet PCs are nothing new; there are many on the market and they are not selling well. Why? Positioning and pricing. There is not enough room between a smart phone (iPhone, Pre) and a laptop to justify a tablet PC. And by "room" I mean pricing. $90 (and a two-year contract) today gets you the phone + portable computer AKA iPhone 3G. $500 will get you a laptop from a name brand, other than Apple of course. So let's say those are our boundaries:

>$90 and <$500. Well, it gets worse. Now we need to add the Kindle to the equation. After all, I would expect that if I buy an iTablet, I can read books with it. The reader should note that this opens a whole new problem with display technology that we will omit for the sake of brevity. Let it just be said that paper-like displays (like the Kindle's e-ink) consume a lot less energy and have no refresh rate, making them as comfortable as paper to read. Back to our superficial market analysis. As I was saying, we are not done yet: the dagger in the heart comes from net-books. That virtually shrinks our range for an iTablet to >$90 and <$200 (with a data plan...). That is a tough place to make money in, considering the amount of hardware that must go into one of these devices. Namely, I would expect that an iTablet must...
  • have comparable screen size to a Kindle DX
  • have comparable battery life (in read mode) to a Kindle
  • have wireless access (3G/wifi)
  • have all the bells and whistles of an iPod Touch
  • use touch and stylus (so you can actually write on it like you would on a paper notebook)
  • and last but not least, have access to a ton of books, movies, music and, let's not forget (drum roll...), games! See yesterday's comments during their PR event in San Francisco positioning the iPod Touch as a PSP competitor.
Perhaps that is too much and some features should be dropped, but let's assume for the sake of argument that they can pull all of that off and still not become another Sony/PS3 story. Well, if they can do that, they may have a killer product that does not replace a smart phone and does not replace a laptop but has its own niche. A niche that is currently occupied mostly by netbooks, which would instantly become obsolete. So far so good. Who pays for all of this, though?
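The price-band squeeze above is simple arithmetic, and a tiny sketch makes it explicit. The dollar figures are the post's own 2009 numbers; the function and variable names are mine, invented purely for illustration:

```python
# Back-of-the-envelope price band for a hypothetical iTablet.
# 2009 figures from the post; names are illustrative only.
IPHONE_3G_FLOOR = 90    # subsidized iPhone 3G: anything cheaper competes with it
LAPTOP_CEILING = 500    # entry-level name-brand laptop
NETBOOK_CEILING = 200   # netbooks squeeze the ceiling down further

def viable_band(floor, *ceilings):
    """Return the (low, high) street-price band left after each competitor
    squeezes the ceiling down."""
    return (floor, min(ceilings))

low, high = viable_band(IPHONE_3G_FLOOR, LAPTOP_CEILING, NETBOOK_CEILING)
print(f"Viable iTablet street price: > ${low} and < ${high}")  # > $90 and < $200
```

The point of the sketch is just that each new competitor category only ever lowers the ceiling, never raises it, which is why the band keeps shrinking.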

Note: they have done something like this already with the iPod, as it virtually killed all MP3 players already on the market by addressing the same need with a superior user experience. Can they do this for the tablet PC market? Maybe. The question is how they make money. Let's explore.

A quick glance back at what I just wrote should convince you that this is a monumental challenge, as you must do all of the above and still end up in the $200 ballpark, street price. Now, we all know that Apple excels at convincing its buyers that its products are worth a premium, so let's say they can position it at $500. Let's also say that they pull another AT&T partnership so they can effectively subsidize the iTablet, forcing buyers to subscribe to 3G service. That would be a masterful plan. The question still stands: why go through all of this trouble given the high cost and risks? Where is the money?

My take is as good as anybody's, but I hope that this brief post convinced the reader that if Apple does this, they must have more up their sleeve than just a better tablet PC. They must have come up with a better overall experience or even a new ecosystem that perhaps displaces more than just netbooks. Perhaps what this is about is a better way to work, play and interact with a digital device that becomes part of our daily lives, much as they managed to do with smart phones.

I for one would be very happy to be proven wrong, and if they do build it, I look forward to being one of the first to adopt it, use it and exploit the benefits of a well-executed tablet PC platform.

Thoughts and comments are appreciated. Tell me, what did I miss?

Wednesday, September 9, 2009

Innovation in ubiquitous computing

If you missed the news from Apple PR today, they have added mic and camera capabilities to their iPod lineup. That is a quiet storm brewing if I ever saw one.
Why? You may ask.
Well, considering that more than half of App Store downloads come from iPod Touches (see the AdMob report on the matter), I am starting to see a trend emerging. Granted, the iPhone already does what the iPod and iPod Touch do, but the message from consumers is clear: "we want more capabilities and we cannot afford to pay $70 per month to get them!"
So what is the storm I see coming? With the introduction of push notifications and VOIP, there is really little that an iPod Touch has to envy an iPhone and...it costs $0 per month. If you are in an urban environment where you can pick up a WiFi signal, well, chances are you really do not miss having an iPhone at all. You can call with VOIP, you can IM instead of SMS, and now you have your camera too. The best part: it costs you nada. If you are a student or a teen, that is the right price for you and your parents, is it not?
So, is this the dawn of a new class of devices? Is the iPod Touch the next "pager," in that it is a more limited device but considerably cheaper than a full-fledged smart phone?
Food for thought.

NOTE: Why do I always talk about the iPod or iPhone and never the competitors? Because they are the ones breaking new ground. With that said, assume I am referring to an "iPod Touch-like" device and an "iPhone-like" device from now on. Sooner or later Apple will do what Microsoft did, license the iPhone OS, and then history will repeat itself (see what happened to the PC in the 1980s).

Wednesday, July 15, 2009

Stirring things up: Human Learning Paradox

I recently started a fascinating discussion with Brian Hennessy (more on this soon, see here) on the matter of how much the brain influences our ability to innovate. In other words, how much is the brain actually trying to find patterns that mimic what it can model, as opposed to creating completely new constructs? Let's explore this.

I bet Jeff Hawkins (see numenta.com) would say that the brain's ability to model is quite substantial, as this organ is nothing but a classifier. Hold that thought; I argue there is a limit to what our classifier can do.

An example of what I mean by the brain trying to model what it knows may help here. Take the iPod, and let's oversimplify things for the sake of this discussion. Let's say that one or more brains belonging to employees at Apple realized that there was a path between MP3 technology, handheld devices and music. That is, we can relatively safely argue that the iPod as an abstract concept is the result of a brain connecting the dots of what was already out there. That leads me to quote William Gibson: "the future is already here, it is just unevenly distributed."

Perhaps what William should have said is that the future of humans is already here, it is just unevenly distributed. That is, humans seem to innovate by connecting dots. A fascinating question would be how non-humans would innovate, but until we build something capable of passing the Turing test or we meet other intelligent life forms, this would be a rather fruitless discussion.

Back to us. A researcher in the natural sciences may disagree with my assertion that innovation = connecting dots, but let's face it, all forms of research use modeling (mathematics, anyone?) and prior knowledge to build upon. So even innovations like nuclear fusion or lithium-polymer batteries built on the past. Hence, let's just agree that there is something about the notion of "connecting the dots" (notwithstanding how complex those dots may be) that strongly relates to our ability to innovate. The further apart the dots, the bigger the leap in innovation. Fair enough?

This all seems to suggest that Brian may be on to something. The brain is indeed influencing how we innovate; the question is how much. An even more interesting question is what are we missing in terms of innovation, since we are constrained by our brain? Sadly, that is a question without an answer, since our brain is all we have that demonstrates the ability to abstract and, yes Jeff, classify.

What does that imply for humans? It seems to imply that every cognitive process we undertake is influenced in some form by the brain's physiological and morphological characteristics. Brian goes so far as to ponder whether things like SaaS and cloud computing are so successful because they resemble a neocortex at the morphological level. What he is suggesting is that we feel unconsciously or consciously comfortable with the idea of cloud computing because so is our brain, literally. Wild theory for sure, but how far off is he? I say not much. Let me stretch the vision even more and propose the

Human Learning Paradox

"We will never fill the gap between our understanding at the cellular level and our understanding at the functional level with respect to the neocortex, since the neocortex itself is what we are using to attempt this cognitive exercise. This is a paradox in that the neocortex would need to be capable of representing itself."

This is bad news if you make a living in computational neuroscience (or perhaps it is job security?) but it is good news for innovation. Why? Because if we embrace the fact that our brain influences how we think, and thus how we innovate, then we have a chance to help that process occur more often, faster and better. Now that is good news indeed.

Saturday, June 20, 2009

“Open Innovation – Corporate Venturing And Monetizing IP”




Thursday, June 18th, 2009
Open Innovation – Corporate Venturing And Monetizing IP
Fremont Hills Country Club
12889 Viscaino Place
Los Altos, CA

Heidi Mason, Bell Mason
Paul Greco, Ocean Tomo
Jim Anderson, President SVB Analytic
Timothy Lohse, DLA Piper
Andrea Mariotti, R&D Manager, Ricoh Innovations
Debra Baker, Deloitte & Touche

Moderator: CJ Koomen, Band of Angels, Former President Philips Semiconductor North America

Description: The old corporate R&D paradigm has been shifting at an accelerating pace in the past 10 years or so. The world is moving to open innovation, where new ideas can come from start-ups, spin-ins, corporate venturing, and M&A. Corporate R&D increasingly has the task of helping to sift the good ideas from the bad ones. Corporate structure is changing too. The panel will address the various aspects of open innovation, such as the new corporate innovation structure, the role of IP in consortia, how to monetize and protect IP in an open innovation environment, and the role of start-ups.

For more information, please see here

My thoughts
The topics we covered in this panel and that were most interesting to me were:
  • the corporate innovation problem: this is a widespread challenge for big and medium-size corporations alike, and Patty Burke, speaking on behalf of Heidi Mason, had a terrific cartoon capturing the point. The cartoon showed a medieval army intent on fighting an open-field battle with spears and swords when a guy in a white coat approached the general, showing him a machine gun. The caption read something along the lines of "We do not have time for new things, I am fighting a battle here!" This is a common scenario in most companies, especially those dealing with the manufacturing of goods. It is often the case that the current development organization has no bandwidth to take what comes out of the company labs, game-changing as it may be. So what can companies do about it? One option is to perform an acquisition and de facto spin off your new product line that way. I am sure many other strategies could also be considered that are more appropriate for specific scenarios. My point? Awareness that your company is experiencing this problem is half the solution.
  • product maturity and cost of research: this was actually my presentation. In a nutshell, every product life cycle presents a crossover point where the cost of doing R&D to continue to improve the product no longer yields the desired increase in market value. In other words, no matter how much money you spend on research, your product's price is on the downward slope toward becoming a commodity. Apple is a master at preventing this from happening; just look at their laptop lineup and compare it to the competition. They are investing millions in research and they can still charge a premium for their laptops, whereas Dell or HP have to sell theirs for half the price. My point? You can stretch the life span of a product that has reached maturity by continuing to invest in R&D, but it is risky and you had better be as good as Apple at it. The reader should note that sooner or later your product will become a commodity; it is unavoidable. The message here is to make sure you are aware of where your crossover point is and redirect your R&D resources elsewhere before you spend your way out of business. Case in point: Nikon in the digital camera industry. They almost spent their way out of business as they thought they could prevent (their) digital cameras from becoming commodities. Today they have scaled back considerably and moved into a niche market of high-end products only.
  • IP issues for the US market: each market is different when it comes to IP protection. One word of wisdom from the panelists was to make sure you have your inventors sign a release of ownership for everything that they work on. Failure to do so can be very expensive, as we were told stories of disgruntled employees holding their former employer for ransom during an M&A. This is particularly important for start-ups, where it is easy to overlook these matters with potentially disastrous financial consequences. Ye be warned.
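The crossover point from the product-maturity bullet can be sketched as a toy model. Everything here is hypothetical: the function names, the 3x return multiplier and the decay constant k are made up for illustration; only the shape (diminishing market value per R&D dollar as the product matures) comes from the argument above.

```python
import math

def marginal_value(spend, k=200_000.0):
    """Extra market value gained per additional R&D dollar.

    Models diminishing returns: each dollar buys less improvement as the
    product matures. The 3.0 multiplier and constant k are illustrative.
    """
    return 3.0 * math.exp(-spend / k)

def crossover_spend(k=200_000.0, step=1_000.0):
    """First cumulative spend level at which another R&D dollar
    returns less than a dollar of market value."""
    spend = 0.0
    while marginal_value(spend, k) >= 1.0:
        spend += step
    return spend

# Past this point, every R&D dollar yields less than $1 of market value,
# so the money is better redirected to the next opportunity.
print(f"Crossover at roughly ${crossover_spend():,.0f} of cumulative R&D spend")
```

The design choice is deliberate: the model does not say R&D stops working, only that its marginal return eventually dips below cost, which is exactly the point at which the post argues you should redirect resources.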
A proposal for sustainable innovation and growth
Given the above I advocate that a sustainable model for innovation and growth is as follows:
  1. use evolutionary user-centric innovation (more on this in an upcoming post) to generate new business opportunities
  2. channel the outputs from (1) into existing product lines if applicable. If not, identify adjacent players in the field of the new business opportunity you wish to pursue and consider M&A after your IP is protected
  3. rinse and repeat. Note that as some of your product lines reach commodity status, you now have a strategy in place to keep a pipeline of new opportunities thanks to (1) and (2).
[special thanks to ACG's Micky Robledo for the photos]

Monday, June 15, 2009

The art of innovation

What do I mean by "the art of innovation"? Let's face it, innovation is another word for lucky guess. Sure, there is a lot of hard work that can be done to make the guessing more likely to be "lucky," but it is still guessing. First, allow me to share what I mean by innovation.

Innovation: an innovation is a better solution to a solved problem or a solution to a yet unsolved problem that can be monetized.

At this point I raise two controversial points:
- I compared innovation with lucky guessing
- I stated that innovation is only something that can be monetized.

Let's start with the guessing claim. Why is innovation so difficult? After all, if all you are doing is solving a problem, it is just a matter of applying science to the problem. Be that as it may, the guessing part is not necessarily in solving the problem but in finding the problem in the first place. That is, unless your goal is to build a better mousetrap, but that case has limited interest IMHO.

From this point on, the reader should assume that I am focusing any further postings on the challenge of innovation according to the second half of the definition: a new solution to an unsolved problem. Therefore you have to identify the problem first.

Case in point? Twitter. Twitter has a problem: it cannot (as I write) monetize its service yet. The problem is well understood but yet unsolved. You can solve it by trying existing solutions (build a better mousetrap) or you can come up with a new approach. This is where the lucky guessing comes into play. Say that you have an idea on how to make Twitter profitable. It is innovative per the latter half of the definition; therefore you are breaking new ground. How do you evaluate whether your solution is going to be innovative? The short answer is that you cannot. The best you can do is remove as much uncertainty as you can, re-evaluate, and eventually face the dilemma of whether you want to spend resources to try it out and get the feedback from your users, buyers and stakeholders. That is the only thing that matters.

Sure, I am oversimplifying things, and sure, you can make the claim that I am only thinking of web 2.0 models, but what product or business today is not tied to it in some way? Think about it. A barber shop can hardly afford not to have an online presence, let alone any other type of business. Not to mention online entities such as Yelp that give your business an online persona whether you want it or not.

So Twitter's monetization problem shows us the challenge: you cannot completely remove uncertainty from your innovation process within the confines of your lab, or you would not be innovating in the first place. The answers are indeed out there.

Does this mean that Twitter is not innovative? So far it is not, at least according to the definition I proposed above. It has the potential to be, but they are not there yet. Why not? Simple: nobody knows what problem they are solving exactly. You do not believe me? Try to explain what Twitter is to someone who has never seen it before. Now try to explain what Facebook or LinkedIn is to someone who has never seen it before. See the difference?

Is Facebook innovative? Well, Facebook built a better mousetrap. MySpace was the innovative idea, and they did have revenues and a well-defined business model. As such, both MySpace and Facebook are innovative according to the definition herein, although as a researcher I find it more interesting to study how MySpace came into existence than how Facebook stole the show. Perhaps more importantly, how did we all miss it? The all-too-famous "why did I not think of it?"

Let's do one more: was YouTube innovative? According to the definition, yes, by all means. They solved a well-known problem, and they did not just build a better mousetrap: they allowed the cost of sharing copyrighted material to drop to $0. As such, they created value for their users and in turn opened the door to monetization strategies, including acquisition, as it were. In other words, they broke the rules but gave users what they asked for.

The reader should take a moment to ponder that innovation often *is* about breaking rules. We can have a whole other conversation as to which rule-breaking behavior is ethical and socially acceptable and which is not. Case in point: the financial "creativity" in the real estate market we experienced over the last several years, where banks, brokers, realtors *and* buyers broke best practices (and common sense) for quick and easy profit.

This is a good point to address the second controversy, monetization. Sure, there are plenty of researchers in academia who may be displeased with my statement, but the fact of the matter is that there is no "pure" research, and therefore no "pure" innovation (as in "not tied to monetization"), as long as you look closely enough at its dynamics of resource allocation.

All pure research is paid for by someone: a government grant, a private grant, tuition from students, etc. In any of those scenarios, someone (or multiple someones) at some point in time sat around a table and decided which proposal to fund based on which one would bring more progress to humanity, whether in the form of knowledge or technology, or perhaps they decided based on some other parameters. My point here is that there was a decision process involved in resource allocation. In a nutshell, as long as you operate under scarcity, you will run into money matters. Let's see how.

Chances are that, regardless of the parameters used, the research so funded will add value to the lives of human beings. How much value depends on the specific case, of course, but in case you did not notice, we just proved my point with respect to monetization. How do you measure value objectively? With money, of course. This has been the very basis of human social interaction since currency was introduced, in that we are all decision agents operating under scarcity. We all have to decide what we are willing to give up from our quota of goods (tangible or not) in order to acquire something new. Example: how much money am I willing to part with in order to have the new iPhone 3GS, and what else will I not be able to afford as a consequence of depleting my pool of cash? For most of us money is a finite resource, so every choice we make is driven by which goods we want to acquire/use/enjoy and which we are willing to part with or do without for a sensible amount of time.

My point? Even pure research is subject to the monetization model, whether explicitly or implicitly. So let's just embrace the fact that monetization matters and make it explicit.

Given the above, innovation in the private sector should never be disjoint from monetization. Sounds trivial, but you would be amazed at how many companies are ignoring this simple fact. Case in point: Google (sorry guys, I love you but you need to hear this).

Google claims they are user-driven when it comes to innovation. Sadly, they are not. What they do is a lot of technological innovation driven by statistical user analysis. That is a whole other story than being driven by users.

They claim they have so much data about what people do that they can ascertain whether a new technology or product is worth developing. Yet, more than ten years since they started, Google has tons of products and virtually only one revenue stream. That is, they are still a one-trick pony. Take away the search traffic, and thus the ad-serving business, and they are gone. How can that be?!? Google has more PhDs than NASA! Well, simply put, doing innovation based on statistical user behavior is like learning how to drive by looking at parked cars. It has its limitations. Sure, you can tell a lot about the habits of drivers, but that still does not teach you how they drive or, more importantly, why they drive.

In conclusion we have established a constrained definition of innovation and we have shown how it actually encompasses research in both the private and academic settings. Further, we have shown that monetization must be one of the ingredients of innovation or you are very likely to engage yourself in an exercise in futility that can be quite expensive.

Why this blog

After sixteen years in the high-tech industry and more than ten in Silicon Valley, it was inevitable that I actually learned a few things. This blog is an outlet to share what I have learned, educated opinions and wild guesses with respect to the art of innovation. Please enjoy!