A while back, I started hearing about how IBM is working to standardize communication in the world of M2M (through the promotion of the MQTT protocol), and it got me excited.  I have written numerous blog posts about how this fragmented market needs to adopt a standard protocol to truly succeed (just as the PC world did when it adopted protocols such as IP).  I stopped hearing about it for a while, but was glad to see it brought back to the forefront by the recent announcement from OASIS that it will standardize the open MQTT protocol for the M2M/IoE space.  IBM has gone a step further, donating its JavaScript source to Eclipse.

Does having standardized M2M communications matter?

This is definitely a question I have heard a few times, and one can make the argument that it may not be needed at all, at least at this stage.  The first argument I hear is that since the lack of standardization does not seem to be hurting the momentum of M2M, how important can it really be?  The answer is simple: for M2M to grow to its full potential, we need to be able to mix and match components from different vendors, as open systems allow for better compatibility and a wider selection of products.  Imagine if you could only use HP peripherals with an HP laptop. That would clearly have reduced competition in the market, ultimately reducing the incentive to introduce new technologies, and it would likely have slowed the adoption of the entire market space.

The next argument I hear is, “It works for companies like Apple, and it allows us to have a better level of control and quality over our solutions.”  Well, first, there is only one Apple.  Dozens of other companies have tried, and failed miserably, to copy its all-inclusive model.  And as successful as Apple has been of late, it hasn’t always been so, and may not always be.  Apple missed out on the majority of the PC market because it was a closed-off system, and that nearly put the company into bankruptcy in the 1990s.  The part about quality control does have some merit, though. As a dedicated Mac user, I can attest that controlling both hardware and software allows for a higher level of integration and generally a better overall experience than in the PC world.

To counter that argument, I remind them that while Apple may run a closed-off system, its devices still use all of the standard PC protocols to communicate, which allows Mac machines to talk to Windows machines for common applications like printing, email and file sharing.  This lets the user choose based on platform features, device costs and other factors, without having to worry about compatibility.  The same should be true for the world of M2M.  A user could select the hardware that makes the most sense, then look for the middleware/application-enablement software package that best meets their needs today and in the future, rather than being limited to the one available from the hardware manufacturer.  It would also simplify the steps required to integrate the overall solution into the customer’s back-end systems, whether those are hosted in the cloud or locally.

Will this move towards a common protocol be successful?

First off, my initial answer is “I hope so”.  I believe this is one of the pieces truly missing for our market to take off.  A common protocol would invite more competitors into the space and allow for a much faster mass-adoption rate for M2M.  It would also allow customers to choose any platform/hardware combination they wish, which would only foster more innovation and bring pricing down.

So, will it succeed?  If it were any other company behind this, I might say no.  However, IBM has a history of moving the market forward when it comes to standardizing technology and forcing the issue of interoperability, so I believe they have both the experience and the clout behind them to pull this off.

Big Blue won’t be able to succeed alone, though; it will no doubt have to get some of the other players on board.  Given the breadth of M2M, that would likely mean some of the large hardware players in the IT world (Intel, Qualcomm, TI?), and it may also require an alliance with one or more of the large ERP players whose systems will consume much of this data (Oracle, SAP?).

In the end, if these parties get together, it will be tough for the market not to move in the direction of standardization.  There are signs that this is coming about, with SAP, Cisco, Red Hat, TIBCO, NIST and half a dozen others having joined the OASIS MQTT standard effort in the past two months – an initiative that has been picked up by the NY Times, Wall Street Journal and dozens of other publications.

Bottom line

Hats off to Big Blue for their efforts to standardize our market.  In the end, there will no doubt be a standard protocol for all M2M communication. If we are ever going to move forward and hit some of the lofty M2M projections that are being made, I believe the influencers in this space should align themselves behind what IBM is trying to do.

As always, let Novotech know how we can help with your M2M needs. Take a moment to check out our products and solutions now. As well, feel free to reach out to me directly at larry(@)novotech.com.  You can also follow us on Twitter (@NovotechM2M) and you can follow me personally (@LBNovotechM2M).