Posted in: Backup
Author: yobitech (March 4, 2014)
Far too many times I have bought something with much anticipation, only to be disappointed. If it wasn't the way it looked or what it promised to do, it was something else that fell short of my expectations. The few companies that go beyond my expectations are the ones I keep going back to. The one I most frequently talk about is Apple. Their products often surprise me (in a good way), with intangible touches that bring a deep satisfaction far beyond what is advertised. The "new drugs" for me are Samsung and Hyundai (cars).
American marketing plays the leading role in setting these expectations. Marketing has become part of "American" culture: the "must have" for the newest, coolest, and flashiest toys that define who we are. Unfortunately, the marketing of these products almost always overpromises relative to the actual product. We all hang on the hope that these products will exceed our expectations, which is why "un-boxing" videos are so popular on YouTube. Product reviews and blogs are also a good way to keep companies honest and to help us with our "addictions" to our toys. This marketing culture is not limited to personal electronics; it holds true for products in the business enterprise as well.
Marketing in the Business Enterprise
The Backup Tape
I remember having to buy backup tapes for my backups, and I often wondered how vendors could advertise twice the native capacity of the tape. How can they make that claim? For example, an SDLT320 tape is really a 160GB tape (native capacity). How do they know that customers can fit 320GB on a 160GB tape? After doing some research, the conclusion I came to was that they really don't know! It surprised me that they could make such a claim based on speculation. How do they get away with it? Easy. It is what I call the "Chaos Factor": when someone or something takes advantage of a situation to further their cause.
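The arithmetic behind the claim is simple: the advertised "compressed" capacity assumes a 2:1 hardware compression ratio, but the ratio you actually get depends entirely on your data. A minimal sketch (the ratios other than the vendor's 2:1 are illustrative assumptions, not measurements):

```python
NATIVE_CAPACITY_GB = 160  # SDLT320 native capacity

def effective_capacity_gb(native_gb, compression_ratio):
    """Capacity that actually fits on the tape at a given compression ratio."""
    return native_gb * compression_ratio

# The advertised 320GB only appears at a full 2:1 ratio;
# already-compressed data (JPEG, ZIP, media) barely shrinks at all.
for ratio, workload in [(2.0, "vendor-assumed compressible data"),
                        (1.3, "typical mixed file-server data"),
                        (1.0, "already-compressed media files")]:
    print(f"{workload}: {effective_capacity_gb(NATIVE_CAPACITY_GB, ratio):.0f} GB")
```

In other words, the 320GB figure is a best case, not a promise.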
In the case of the backup tapes, they capitalize on a few things that facilitate the Chaos Factor:
1. The Backup Software and
2. The Business Requirements.
The Backup Tape “Chaos Factor”
1. The Backup Software
Tape manufacturers know this all too well. Backup software is very complex, and virtually all backup administrators are far too busy worrying about one thing: completing the backups successfully. Checking whether tapes are being utilized to their advertised capacity is not something anyone thinks about in day-to-day operation. In fact, the only time tape utilization ever comes up is when management asks for it, and when it is requested, it is usually a time-consuming exercise because backup software does not have good reporting facilities to compile this information readily. Tape utilization is simply not a concern.
2. The Business Requirements
The other reason is how backup software uses tapes. Tape backups are scheduled as jobs, and most jobs complete before the tape is filled. Depending on the company's policy, most tapes are then ejected and stored off-site, so tapes are rarely ever filled because of this policy! This is normal for backup jobs; leaving tapes in the drive(s) just to fill them up goes against why companies do backups in the first place. Backup tapes are meant to be taken off-site to protect against disaster. A backup larger than a single tape is really the ONLY case in which a tape is ever fully utilized.
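The effect on utilization is easy to see with some back-of-the-envelope numbers. The job sizes below are hypothetical, but the shape of the result is the point: tapes ejected at end-of-job sit well below native capacity.

```python
NATIVE_CAPACITY_GB = 160  # SDLT320 native capacity

# Assumed (hypothetical) amounts written to each tape before it was
# ejected and sent off-site at the end of its backup job
job_sizes_gb = [42, 95, 110, 60, 30]

utilizations = [size / NATIVE_CAPACITY_GB for size in job_sizes_gb]
avg_utilization = sum(utilizations) / len(utilizations)
print(f"Average tape utilization: {avg_utilization:.0%}")
# Well under native capacity, and nowhere near the advertised 2x figure
```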
This Chaos Factor is also at work in the business of data storage. The SAN market is another area where the protection of data trumps our ability to efficiently manage the storage. The SAN market is full of dirty secrets, which I will outline below.
The SAN “Chaos Factor”
A dirty secret of the storage industry is the use of marketing benchmark papers. Benchmark papers are designed to give the impression that a product performs as advertised. For the paper itself that may be true, but sometimes these tests are "rigged" to give the product favorable results; sometimes the numbers are impossible in the real world. Let me illustrate. I can type about 65 words per minute. Many people would view that as average, but if I wanted to "bend the truth," I could say I type 300 words per minute. I can technically type "at" 300+ words per minute, but in the real world I don't type like that. What good is a book with one word ("at") printed on 300 pages? That kind of claim holds no water, but it is the same technique and concept used in some of these technical papers. When the results are touted, keep the vendor honest by asking what their customers are actually seeing in day-to-day operation.
Here is another technique commonly used by vendors, what I call "smoke and mirror" marketing. It is a tactic used to mimic a new technology, feature, or product that is hot. The main goal is to deliver the feature at the best possible price and downplay the side effects: deliberately engineering around a feature set at the expense of existing features. Here is an example. I bought a new Hyundai Sonata last year. I love the car, but I am not crazy about the ECO feature that comes with it. I was told I would save gas in this mode, and although I think I get a few more miles on a tank, the cost I pay in lost power, torque, and responsiveness is not worth using the feature at all. I believe this feature, along with a smaller gas tank, eventually led to a class-action lawsuit over Hyundai's gas mileage claims. For a vendor to incorporate new features, they sometimes have to leverage existing infrastructure and architectures, because that is what they already have. In doing so, they end up with an inferior product that emulates new features while masking or downplaying the side effects. Prospective customers are not going to know the product well enough to understand the impact of these nuances; they often just see the feature set in a side-by-side comparison with other vendors and make decisions based on that. The details are in the fine print, which is almost never read before the sale. As a seasoned professional, I always do my due diligence to research vendors' claims. I am writing this to help you avoid these mistakes by asking questions and doing research before making a major investment for your company.
Here are some questions you should ask:
• What trade magazines have you been featured in over the last year?
• What benchmarking papers are available for review?
• How does that benchmark compare to real-world workloads?
• What reference architectures are available?
• What customers can I talk to on specific feature set(s)?
Here are some things to do for research:
• Look through the Administrator's Guide for "Notes" and fine-print details. These will usually tell you what is impacted and/or restricted as a result of implementing the features.
• Invite the vendors for a face-to-face meeting and talk about their features
• Have the vendor present their technologies and how they differ from the competition
• Have the vendor white-board how their technology will fit into your environment
• Ask the vendor to present the value of their technology in relation to your company’s business and existing infrastructure
• If something sounds too good to be true, ask them to provide proof in the form of a customer testimonial
I hope this is good information for you, because I have seen, time after time, companies buy into something that isn't the right fit. Then they are stuck with it for 3-5 years. Remember, the best price isn't always the best choice.
Posted in: Backup
Author: yobitech (November 5, 2013)
It is human nature to assume that if it looks like a duck, quacks like a duck, and sounds like a duck, then it must be a duck. The same could be said about hard drives. They only come in 2.5" and 3.5" form factors, but when we dig deeper, there are distinct differences and developments in the storage industry that will define and shape the future of storage.
The Rotating Disk or Spinning Disk
There were many claims in the 90s that "the mainframe is dead," but the reality is that the mainframe is alive and well. In fact, many corporations still run on mainframes and have no plans to move off of them. There are many other factors that may not be apparent on the surface, but they are reason enough to continue with the technology: it provides a "means to an end".
Another claim, from the mid 2000s, was that "tape is dead," but again, tape is very much alive and kicking. Although there have been many advances in disk and tape alternatives, tape IS the final line of defense in data recovery. It is slow, cumbersome, and expensive, but it is also a "means to an end" for companies that can't afford to lose ANY data.
When it comes to rotating or spinning disks, many are rooting for their disappearance. Some will even say the spinning disk is going the way of the floppy disk, but just when you think there isn't anything left to develop, some amazing new developments appear. The latest is…
The 6TB Helium Filled hard drive from HGST (a Western Digital Company).
Yes, this is no joke. It is a hermetically sealed, waterproof hard drive packed with more platters (seven) to run faster and more efficiently than a conventional spinning hard drive, once again injecting new life into the spinning disk industry.
What is fueling this kind of innovation in a supposedly "dying" technology? For one, solid state drives (SSDs) are STILL relatively expensive. Their cost has not dropped (as much as I would have hoped) the way most traditional electronic components' costs do, which keeps the spinning disk alive. The million dollar question is, "How long will it be around?" It is hard to say, because when we look deeper into the drives there are differences, and they fulfill that "means to an end" purpose for most users. Here are some differences…
1. Capacity
As long as there are ways to keep increasing capacity and keep the delta between SSDs and spinning disks wide enough, the appetite for SSDs will stay diluted. This trumps the affordability factor, because it is about value, or "cost per gigabyte." We are now up to 6TB in a 3.5" form factor while SSDs are around 500GB. This is the single biggest factor hindering SSD adoption.
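The cost-per-gigabyte argument can be made concrete. The prices below are assumed, era-appropriate street prices for illustration only, not quotes:

```python
def cost_per_gb(price_usd, capacity_gb):
    """Dollars per gigabyte of usable capacity."""
    return price_usd / capacity_gb

# Hypothetical 2014-era prices, for illustration only
hdd_cost = cost_per_gb(299.0, 6000)  # assumed price of a 6TB spinning disk
ssd_cost = cost_per_gb(350.0, 500)   # assumed price of a 500GB SSD

print(f"HDD: ${hdd_cost:.3f}/GB")
print(f"SSD: ${ssd_cost:.3f}/GB")
print(f"SSD premium: {ssd_cost / hdd_cost:.0f}x per gigabyte")
```

Even with generous assumptions for the SSD, the per-gigabyte gap is an order of magnitude, which is exactly the "value" delta the paragraph above describes.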
2. Performance Requirements
Most applications do not need high-performance storage. Most home-user storage holds digital pictures, home movies, and static PDF files and documents, and these files are perfectly fine on large 7.2k multi-terabyte drives. The business world is actually quite similar: most companies' data is somewhat static. On average, about 70% of all data is hardly ever touched again once it is written; I have personally seen customers with 90% of their data static after it is written for the first time. Storage vendors have been offering storage tiering (Dell EqualLogic, Compellent, HP 3PAR) that automates the movement of data based on usage characteristics, without any user intervention. This type of virtualized storage management maximizes the ROI (Return on Investment) and TCO (Total Cost of Ownership) of spinning disk in the enterprise, and it has extended the life of spinning disk by exploiting the performance characteristics of both spinning disk and SSDs.
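The core idea of automated tiering can be sketched in a few lines. This is a minimal illustration of the policy, not any vendor's actual algorithm; the 30-day threshold and the block names are assumptions:

```python
import datetime

IDLE_THRESHOLD_DAYS = 30  # assumed policy: demote data idle this long

def assign_tier(last_access, now):
    """Place data on fast or slow media based on how recently it was used."""
    idle_days = (now - last_access).days
    return "ssd tier" if idle_days < IDLE_THRESHOLD_DAYS else "7.2k hdd tier"

now = datetime.datetime(2014, 3, 1)
blocks = {
    "active-database-block": datetime.datetime(2014, 2, 27),  # touched days ago
    "old-photo-archive":     datetime.datetime(2013, 6, 1),   # static for months
}
for name, last_access in blocks.items():
    print(f"{name} -> {assign_tier(last_access, now)}")
```

Hot blocks stay on expensive fast media while the static 70-90% of data settles onto cheap capacity disks, which is where the ROI comes from.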
3. Mean Time Between Failures (MTBF)
All drives have an MTBF rating. I don't know exactly how vendors come up with these numbers, but they do. It is a rating of how long a device is expected to be in service before it fails. I wrote in a past blog post, "The Perfect Storm," about how SATA drives can fail in bunches: many of these drives are put into service in massive quantities at the same time, doing virtually the same thing all of the time. MTBF is a theoretical number; depending on how the drives are used, "mileage will vary." MTBF ratings for these drives are so high that most drives that run for a few years will continue to run for many more. In general, a defective drive will fail fairly early in its operational life, which is why there is a "burn-in" period for drives; I personally run them for a week before putting them into production. Drives that last for years eventually make it back onto the resale market, only to run reliably for many more. MTBF for an SSD, on the other hand, is different. Although SSDs are rated for a lifetime like spinning disks, their failure characteristics differ: the cells in an SSD physically degrade with each write (wear made worse by write amplification), and are eventually rendered unusable, though wear-leveling software compensates for this. Compared to a spinning disk, which has no such wear mechanism, an SSD is measurably predictable as to when it will fail. This is both a good and a bad thing: good for predicting failure, bad for reusability. If you can measure the remaining life of a drive, that directly affects the value of the drive.
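The "Perfect Storm" concern is easier to see from the fleet's point of view. A back-of-the-envelope sketch (the MTBF and fleet size are assumed, typical-looking numbers, not any vendor's rating): an MTBF rating implies an annualized failure rate, and across many identical drives deployed at once, even a huge MTBF predicts a steady trickle of failures.

```python
HOURS_PER_YEAR = 8766  # 365.25 days

def annualized_failure_rate(mtbf_hours):
    """Approximate AFR implied by an MTBF rating (valid for AFR << 1)."""
    return HOURS_PER_YEAR / mtbf_hours

def expected_failures_per_year(fleet_size, mtbf_hours):
    """Expected drive failures per year across a fleet of identical drives."""
    return fleet_size * annualized_failure_rate(mtbf_hours)

mtbf = 1_200_000  # assumed vendor-quoted MTBF, in hours
print(f"Implied AFR: {annualized_failure_rate(mtbf):.2%}")
print(f"Expected failures per year in a 1000-drive fleet: "
      f"{expected_failures_per_year(1000, mtbf):.1f}")
```

And because the drives were all installed together and worked identically, those failures tend to cluster in time rather than spread out evenly, which is the "bunches" effect described above.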
In the near future, it is safe to say that the spinning disk is going to be around for a while. Even if the cost of SSDs comes down, there are other factors that meet the needs of storage users. In the same way that other factors have kept mainframe and tape technologies around, the spinning disk has earned its place.
Long live the spinning hard drive!
Posted in: Backup
Author: yobitech (August 13, 2013)
If you are amazed by technology today, you should know that the speed of technological advancement has been hindered for many years by many factors.
Things like copyright laws, marketing, and corporate profits often constrain the speed of new products and innovations. We as a collective human race could develop and advance much faster if these obstacles were removed. The counter-force to these constraints has come from the "Open Source" community, with Linux, other open source operating systems, and open standards; hobbyists, enthusiasts, and hackers (ethical and unethical) have brought us great benefits and improvements to the devices we all love so much. But with these leaps in technology comes a new inhibitor… increased regulation and compliance.
From the days of the Enron scandal to the insider trading of Martha Stewart, a number of regulatory and compliance rules have come down on businesses. The Sarbanes-Oxley (SOX) Act, PCI DSS (Payment Card Industry Data Security Standard), and HIPAA (Health Insurance Portability and Accountability Act) are just a few examples.
These compliance rules do not directly inhibit innovation and advancement in technology, but they do slow it down. They force technology to stay around longer than intended, and they "shift" innovation toward data preservation and accessibility. Regulations that span many years, as in the financial sector, typically require seven years' worth of unaltered and verifiable financial data, with the possibility of expanding that to ten years or more. The medical industry is moving toward retention of 50+ years of unaltered and verifiable records, and the portability of medical history is driving retention to exceed the life of a person, in some cases 100+ years depending on the treatment type. Finally, there are the data "packrats": although they may not be mandated by regulation yet, some institutions have self-imposed retention of "forever."
Reality? Yes and No.
Yes, in that we can set these rules today; but the reality is no, or at least not proven yet. It is a work in progress. There are some innovative products designed to keep data past 100+ years, but most companies' IT departments are not looking that far ahead. They are not looking to spend lots of money on unproven technology on the prospect of being able to keep data that long; they have more immediate problems to solve. So companies face the challenge of keeping up with innovations and leading-edge technologies while supporting data retention compliance, and much of that data sits on old tapes, optical platters, and old disk storage systems.
Fortunately, there are aftermarket niche resellers that specialize in repurposed gear. These businesses provide an essential service in these unique situations. Companies are making their storage subsystems last longer, usually past the vendor's EOL (End of Life) support. Some resort to eBay and user groups to track down hard-to-find items, with varying degrees of success. One IT manager says, "When I buy my stuff from eBay for my workplace, I am playing Russian roulette. I prefer to go to these niche resellers because I know what I am getting, and it's not from some guy in a garage somewhere." EOL disks in archival systems with compliance metadata, older servers with specific BIOS support for older drives, and SANs with EOL disks requiring a specific interface or firmware all generate a steady demand for these components. Until it hurts enough for companies to invest in ultra-long-term, compliance-ready archival solutions, they will endure the pain and leverage older equipment to preserve their data.
Posted in: General
Author: yobitech (July 1, 2013)
Although the iPad set the standard for tablets, the tablet market is still ripe for the picking. In my opinion, we are fortunate to be part of the most exciting time in the history of mankind, well, at least for us guys. As many may think, Apple's "golden days" appear to be behind them, but they are still a very innovative and trend-setting company; I have to say that with the loss of Steve Jobs, it just hasn't been the same. As we look behind us, we see the tablet path riddled with failures. A lot of people may not know this, but Apple had its own failure with the Newton back in the early 90s, followed by stumbles like the Palm Pilot and the HP iPAQ. To some degree, those failures set up the iPad for tremendous success. Like a volleyball game, Apple was primed to "spike" the ball; it took the failures of the past to learn from and design the iPad.
The iPad is not failing or doomed; it is simply losing its freshness. With iOS 7 coming, there doesn't seem to be the same "BANG" as past announcements. Although we don't yet know how it will work, it appears to be more of the same. Meanwhile, some new and exciting tablets are now innovating where Apple used to. For example, Dell has the convertible XPS 12, a laptop running full Windows 8 (Home or Pro) on an i5 or i7 processor, with a flip 12" screen that turns it into a full 10-hour tablet. There is also the Dell XPS 18, a tablet that doubles as a desktop, mimicking the Apple iMac while giving the consumer the ability to use it as a large tablet: a full 18" tablet! Samsung is also leading the way with its phone-tablet hybrid, the "Note" series, a 5" dual-core device with handwriting capabilities, a true man's man tablet. There is a "Wow" factor here, like the same "heart-thumping" excitement of unboxing the previous Apple products.
These are just some of the new innovations taking place in the tablet evolution. Apple opened up a can of whoop@$$ a few years back, but what goes around comes around. It's about time others learned from Apple's example and returned the favor.
Posted in: General
Author: yobitech (June 5, 2013)
I am constantly inspired by the technologies that come into the marketplace. Although most new products are mediocre, there are some that are really deserving of praise.
Occasionally a disruptive product shakes the world. Oh, how I look forward to those times… There are also products that combine multiple technologies, leveraging their strengths and covering their weaknesses, so that they complement each other toward a common goal.
Each technology may not be great on its own, but combined they are amazing. Take, for example, the hybrid car. When the gasoline shortage of the 70s happened, it was a shock to most people that gas could actually run out, and that prompted the development of alternative fuels for cars. While most people thought of a one-to-one fuel replacement, a combination of fuels in a single vehicle was radical. Gas prices stayed low after that, and the gas shortage scare became a distant memory.
The concept of an alternative-fuel car was put on low priority. I have seen attempts at a natural gas car, the electric car, and even the vegetable oil car (usually running on oil collected from restaurants after cooking).
All were valiant efforts worthy of merit, but the hybrid car was the one that made it into production. The hybrid car uses electric power for low-speed, stop-and-go local driving and the gasoline engine for high-speed driving; each technology is used where it is most efficient. The end product: a car capable of 90+ miles per gallon.
The same thing has now been done with SSDs. There are two kinds of SSD drives: MLC and SLC. MLC is the cheaper, consumer-grade SSD, with a lower MTBF (mean time between failures) than SLC; it is not write-optimized, but it does a good job at reads and is affordable. SLC is the more expensive, enterprise-grade drive; it is write-optimized and has a higher MTBF. Given these attributes and limited IT budgets, it was only a matter of time before a hybrid SSD SAN became available.
Dell Compellent was one of the first to apply the hybrid concept, using multiple RAID levels and multiple drive speeds to offer uniquely efficient SAN storage. The hybrid SSD SAN is the next generation of storage virtualization and tiering, bringing ultra-fast performance at a relatively reasonable cost.
We have to give credit where credit is due: we owe shrinking IT budgets and stormy economic conditions for the birth of innovations such as these.
It is the rainy days that make the sunny days so much more enjoyable.
Posted in: General
Author: yobitech (April 24, 2013)
There used to be something called the "Encyclopedia": a set of books containing a collection of reliable knowledge based on hundreds of thousands of articles, biographies, images, and more. It was considered a neutral and accurate source of information; in fact, much of the information was based on canonical material and analytical processes that produced solid, consistent information and interpretations. The days of the encyclopedia are numbered, mainly because of the Internet and the growing popularity of Wikipedia.
When I first heard of Wikipedia, I thought it was a stupid name. I also thought its information-gathering process was based on a flawed concept: having anyone and everyone assemble a database of reference information seemed strange. Call me old-school, but I like to open up an encyclopedia and look things up, knowing that real companies, with real staff and real reputations, put thousands of hours of research into publishing it. Today, Wikipedia has virtually replaced the encyclopedia and is now the gold standard for information. Not because it is better, but because it is what the new generation knows. Convenient, accessible, and free, combined with the loose standards of today's society, makes Wikipedia the popular choice, and popularity over time eventually dominates. With the extinction of the encyclopedia, there will be little to no accountability for what gets defined as information as we know it.
Consider this scenario… Say a group of people who believe strongly in a cause decide to define some new terms and post them to Wikipedia. Then other, like-minded people cite and endorse the article or definition. At some point, these terms become accepted as valid and are left searchable by all. The problem is that Wikipedia has the potential to be a "sliding scale" for information, held together by a conglomerate of users, contributors, and supporters. Given the credibility it enjoys, the danger of that "sliding scale" is that it can change and morph a society. I am not saying Wikipedia is evil, but we should be careful about assigning it undue credibility. Wikipedia has its place, but our society gravitates toward convenience. If we can define (or redefine) something simple, it is just a matter of time before we can redefine the bigger things.
Posted in: General
Author: yobitech (March 17, 2013)
There was once a time, not too long ago, when I had to actually study books for certifications and pass exams. I still do, but answers are now easier to find on the web. There is no doubt that a good, solid education is important, but we live in an era where information is abundant and access to it is instantaneous. Learning has become different and condensed with the content available to us via the web.
With the advent of Google, the entire search engine industry was redefined. Google's "secret sauce" is what separates them from all other search engines; the competition is a distant second. Google's ability to "crawl" the entire World Wide Web and index all the content on its massive server farms is only the first phase. Google then applies its algorithms to sort the results in relevant ways, and that makes it the crown jewel of the Internet. The searches are seemingly "spot on," or at least relevant to what we are looking for, so much so that we have become spoiled by having such a good tool. There is even a website called "Let Me Google That For You" (http://lmgtfy.com/) that generates a link you can email to someone, walking them through how to find the answer to their own question. Google's success is due largely to the fact that everyone else just sucked. They set the new standard, the "bar" raised high.
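The two phases described above, crawling/indexing and then ranking, can be illustrated with a toy inverted index. This is a deliberately tiny sketch with made-up pages; Google's actual algorithms (PageRank and beyond) are far more sophisticated than the crude term-frequency ranking here.

```python
# Hypothetical "crawled" pages: URL -> page text
pages = {
    "page1": "storage backup tape backup",
    "page2": "cloud storage pricing",
}

# Phase 1: build an inverted index mapping each word to the
# pages that contain it, with occurrence counts
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, {}).setdefault(url, 0)
        index[word][url] += 1

# Phase 2: rank matching pages by a crude relevance score (term frequency)
def search(word):
    hits = index.get(word, {})
    return sorted(hits, key=hits.get, reverse=True)

print(search("backup"))
```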
This tool is great for personal use, but many people have come to leverage Google as a professional tool. There are times when I do not have the answer to a technical question, and in that situation the first thing I do is Google it. I come off as an expert because of my background, but I maintain that level because I know where to find the answers (and fast). It amazes me how many other people rely on Google as their right hand to become the expert or consultant. It isn't always about what we know, but how we are perceived. Google is our best friend. I have known people who left psychology to become computer experts, and cab drivers who became web coders, self-taught by Googling whatever challenges and questions came up along the way. We are so fortunate in this era. What would we do without Google? I can't even think about it…
I am tempted to experiment with launching a second career in something I have never done before, relying solely on Google for the answers I need to succeed. I know I can succeed, because I am sure I am not the first to do so.
Posted in: General
Author: yobitech (February 16, 2013)
Sometimes the solution is right in front of us, but most people fail to find it because of perspective. Even the best and smartest people often fail as well. It is hard to see the solution when it is buried inside a sea of data.
What am I talking about? Business opportunities, cures for diseases, new scientific discoveries, inventions, and more. Google's success began because there were no really good search engines; most were mediocre at best, and when you did a web search, most of the results were unusable or irrelevant. Google had the "secret sauce": they could take your query, "mine" the vast sea of data called the Internet, and return a list of mostly relevant results. Google was able to do what nobody else could: pinpoint and uncover data on the Internet. This concept is just the cornerstone of the future.
Most companies do not understand this concept and are constantly looking for answers in the little data they have on hand. It isn't the availability of data but the ability to look outside for answers that is the key to a competitive edge. I recently did some consulting for a company that built its business on this concept. It takes vast amounts of information about people's cell phone activity and uses software to look for trends and patterns. If the trends and patterns fall outside a (user-defined) "norm," the activity is flagged as unusual. At some point, these calling patterns can identify possible fraud. This is great stuff, but imagine taking it one step further: I believe that if this data were correlated with justice department records, it could link unsolved crimes or give criminal investigators an edge in finding and prosecuting existing crimes. This is just one example of "Big Data."
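The flagging idea described above can be sketched in miniature. The threshold, baseline, and call counts below are all hypothetical illustrations of a user-defined "norm", not anything from the actual product:

```python
def flag_unusual(calls_today, baseline_mean, tolerance=3.0):
    """Flag activity more than `tolerance` times the subscriber's baseline."""
    return calls_today > tolerance * baseline_mean

baseline = 12  # assumed average daily calls for this subscriber

for calls_today in [10, 15, 80]:
    status = "FLAGGED" if flag_unusual(calls_today, baseline) else "ok"
    print(f"{calls_today} calls/day: {status}")
```

Real fraud systems use far richer features (call destinations, times, locations) and statistical models, but the principle is the same: define a norm, then surface the outliers.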
Big Data is a term describing the ability to take vast amounts of data and, using hardware and software, massage and mine it for new and valuable information. This is trail-blazing territory, with discoveries yet to be uncovered; that is the benefit of investing in a Big Data infrastructure. It is all possible because of large storage devices and HPC (High Performance Computing). Big Data is currently used in geophysical and nuclear research, but why limit the concept to those industries? As Big Data mining is slowly adopted, we are just scratching the surface of what can be uncovered.
Believing that there is order in the midst of chaos is where it all begins.
Posted in: Cloud
Author: yobitech (January 15, 2013)
Technology changes so fast that if I were ever sequestered for a court case for more than a year, I would have to change careers or start over. So many changes happen on a daily basis that it is mind-boggling. I have seen professionals come and go, and only a few have remained in the tech industry. Those with the keen ability to stay one step ahead and reinvent themselves are the ones that thrive. I personally love this fast-paced, ever-changing field, and I use it to my advantage: it keeps me fresh and sets me apart from my competition. Most would see this as a negative aspect of their career, but I see it as an opportunity.
I recently wrote about "cannibalization" and companies' willingness to engage in it purposely. It is a necessary evil for survival in this industry. As invincible as Microsoft may have seemed in the early 2000s, they are showing the signs of a big company that has lost its agility. They are getting hit on all fronts: the Open Source community, Apple, Android, Google Chrome, and Amazon are just a few names. Slapped around for the past 6-8 years, Microsoft has become a company with an identity crisis. Just look at their latest products, Windows 8 and Windows Server 2012. What are they? Is Microsoft a business solutions company or a consumer products company? Is it a PC or a tablet? Is it a server or a desktop? It will be interesting to see where Microsoft goes. Will they reinvent themselves, or go the way of the typewriter?
A few game-changers come along in technology and disrupt "life" as we know it. Today, that disruptive technology is the "Cloud." The Cloud sounds dark, but it is a shining star: a game-changer and a breath of fresh air in this economy. The Cloud is really nothing more than remote computing; the details of what each cloud company offers are what set them apart. Remote computing, or telecommuting, is the earliest form of cloud computing. In the mid 80s there were "PC Anywhere" and dial-in RAS (Remote Access Server) servers. Then came dedicated remote sites, usually connected via dedicated T1 and ISDN lines. Then came the Internet… Al Gore changed everything. It opened up many more possibilities with IP tunneling and encryption, and with mobile and tablet computing, BYOD (Bring Your Own Device) has become the preferred MO (modus operandi) for end-user computing.

Cloud technology has been around for a while, but it never really caught on until now; the skies are now dark with some serious cloud offerings. Driven by a tight economy, it is all about the bottom line. Companies that once thought about buying servers and building out data centers are now looking at paying a monthly subscription for the same computing resources without the burden of managing them. What does this mean for the traditional IT guy with tons of certifications and experience? He will find himself outmatched and outwitted if he doesn't reinvent himself. For those looking to get into the tech industry, this is good news: the cloud is a great entry point as it starts to take off. IT is a revolving door, with just as many experienced folks leaving (voluntarily and involuntarily) as newcomers getting in. So whether you are a veteran like me or a newbie trying to break in, this is a great time to do so. The future looks cloudy, and that is good.
Posted in: General
, Author: yobitech (November 8, 2012)
Cannibalization is a harsh term when it comes to tech companies. With technology moving at such a rapid pace, cannibalization is a reality that a company must face, sometimes in order to survive.
“Cannibalization” is the strategic research, development and release of a product or service that may take away from an existing product or service that a company already sells.
Back in the early days of “Ma Bell” (short for “Mother Bell”, aka the Bell System) there was no competition. It did what it wanted whenever it wanted. Not only was there no competition, there wasn’t anything “Ma Bell” was doing to hurt its own “bread and butter”… its Telephone Business.
This went unchallenged for many years, until 1984, when “Ma Bell” was broken up into 8 smaller companies such as AT&T, Bell Atlantic and NYNEX. I am sure this was painful for those who had benefited from the “Ma Bell” empire, but as we look back, it spurred competition as well as innovation.
The breakup of “Ma Bell” created the “Baby Bells”, which have since diversified into wireless technology, IP telephony, Internet service and even cable television.
Cannibalization became an essential part of survival. It provided forward movement of technology for the industry and for us… the consumer.
On the other hand, a sad example of a company that was averse to cannibalization is StorageTek. StorageTek was formed by 4 former IBM engineers in the early 70s. They developed tape libraries and tape backup devices. They rode the “gravy train” for many years and made lots of money. Competition was easy, since they owned many of the patents around tape library robotics and design. StorageTek was way ahead of every competitor and put virtually all other tape library companies to shame. Yet it was an unlikely competitor that didn’t even make or design tape drives or libraries, and didn’t have robotics at all, that started the downfall of StorageTek: a software company that made disk storage arrays.
The company was EMC. EMC, a software disk array company, developed and mass-marketed the first successful VTL (Virtual Tape Library). StorageTek was unable to develop and launch a VTL to the mass market, largely because of its resistance to cannibalization. Fortunately, StorageTek was acquired by Sun Microsystems and is now part of Oracle. StorageTek’s robotic technology is still used in other applications, and they are still selling tape libraries. Just not like they used to.
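The trick behind a VTL is that software presents a tape-like interface (write sequentially, rewind, read back) while the bytes actually land on disk, so existing backup software doesn’t know the difference. Here is a tiny sketch of that idea; the class and method names are my own illustrations, not any vendor’s API, and it keeps data in memory where a real VTL would use disk arrays.

```python
# Minimal sketch of the idea behind a VTL: emulate a sequential tape device
# on top of random-access storage. Illustrative only, not a vendor API.
import io

class VirtualTape:
    def __init__(self):
        self._store = io.BytesIO()   # disk-backed in a real VTL; in-memory here

    def write_block(self, data: bytes):
        self._store.write(data)      # sequential append, like a physical tape drive

    def rewind(self):
        self._store.seek(0)          # "rewinding the tape" is just a seek

    def read_block(self, size: int) -> bytes:
        return self._store.read(size)

tape = VirtualTape()
tape.write_block(b"backup-set-1")
tape.rewind()
print(tape.read_block(12))  # b'backup-set-1'
```

Because the “tape” is really disk, rewinds and restores that take minutes on physical media happen in milliseconds, which is why EMC’s VTL was such a threat to StorageTek’s robotics.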
As we look at companies today, cannibalization is a natural part of moving forward, but it needs to be done gracefully and must be strategically timed. Apple, for example, does this with grace, keeping every consumer on his heels, waiting on long lines as if our lives depended on it. The release of the iPad mini will indeed cannibalize iPad sales, but Apple strategically holds back certain features, like the Retina screen. This strategic limiting of features, together with the marketing of the iPad mini, actually counters the other 7″ tablets. It comes at the optimal time, when people are holiday shopping and are perhaps tired of their 7″ Kindles or 7″ Galaxy tablets; what a great excuse to jump onto the Apple bandwagon without giving up the footprint they are so used to.
Cannibalization is now a tool for moving forward, rather than a reason to look for an exit strategy.