The Software Defined World

Posted in: Cloud, General, Author: yobitech (September 23, 2014)

Before computers were around, things were typically done with pencil and paper. The introduction of the computer revolutionized the world, from the way we do business to how we entertain ourselves. It ranks as one of the greatest inventions ever, up there in my book with the automobile, the microwave and the air conditioner.

From a business standpoint, computers gave companies an edge. The company that leverages technology best has the greatest competitive edge in its industry. In a similar fashion, on a personal level, the person with the newest and coolest toys is the one who can take advantage of the latest software and apps, getting the job done more efficiently while attaining bragging rights in the process.

The Computer Era has seen some milestones. I have listed some highlights below.

The PC (Personal Computer)
The mainframe was the dominant computing platform, used mainly by companies with huge air-conditioned rooms because the machines themselves were huge. Mainframes were not personal; no one at home had the room or the money for one. Aside from businesses, access to a mainframe was mostly limited to specialized schools, colleges, libraries and government agencies.

The industry was disrupted by a few entries into home computing.

TANDY TRS-80 Model 1
Tandy Corporation made the TRS-80 Model 1, powered by the BASIC language. It had marginal success, with most of its buyers in schools. There wasn't much software for it, but it was a great educational tool for learning a programming language.

TIMEX SINCLAIR
The Timex Sinclair was another attempt, but the underpowered device with its flat, tactile-feel keyboard was very limited. It had a black-and-white screen and used an audio tape player as its storage device. There was more software available for it, but it never took off.

COMMODORE VIC 20/ COMMODORE 64
The Commodore VIC-20 and Commodore 64 were a different kind of computer. They had software titles available at launch, displayed in color, and offered "sprite" graphics that allowed for detailed, smooth animations. They were a hit, helped by an affordable 5.25" floppy disk drive as the storage device.

APPLE AND IBM
Apple and IBM paved the way into homes not just because they had better hardware, but because they offered access to software such as word processing, spreadsheets and databases (and not just games). This was the entry of the Personal Computer. There were differences between the two: IBM was not user friendly and largely text based, while Apple took a graphical route, offering a mouse and a menu-driven operating system that made the computer "friendly".

Commoditization for Progress
Once the home computer age began, the commoditization of the industry started shortly thereafter. With vendors like Gateway, Lenovo, HP and Dell, these computers became cheap and plentiful. With computers so affordable and plentiful, HPC (High-Performance Computing), or "grid" computing, became possible. HPC/grid computing is basically the use of two or more computers in a logical group, sharing resources to act as one large computing platform. Trading firms, hedge funds, geological-survey and genetic/bio research companies are just some of the places that use HPC/grid computing. The WLCG (Worldwide LHC Computing Grid) is a global project that connects more than 170 computing centers in 40 countries to provide resources to store, distribute and analyze the 30+ petabytes of data generated by the Large Hadron Collider (LHC) at CERN on the Franco-Swiss border. As you can see, commoditization enables new services and sometimes "disruptive" technologies (i.e. HPC/grid). Let's take a look at other disruptive developments…

The Virtual Reality Age
With PCs and home computers entering the home, the world of "virtual reality" was the next wave. Multimedia-capable computers made it possible to dream. Fictional movies like Tron and The Matrix gave us a glimpse into the world of virtual reality. Although virtual reality has had limited success over the years, it wasn't a disruptive technology until recently. With 3D movies making a comeback, 3D TVs widely available and wearable "glass" cameras on the market, virtual reality is more HD than ever and is still being defined and redefined today.

The Internet
No need to go into extensive detail about the Internet. We all know it is a major disruption in technology, because none of us knows what to do with ourselves when it is down. I'll just recap its inception. It started as a government/military network (mostly text based) and stayed that way for many years; adoption for public and consumer use took hold in the early 90s. Demand for computers with better processing and graphics capabilities was pushed further as the World Wide Web and gaming applications became more popular. With better processing came better gaming and better web technologies. Java, Flash and 3D rendering made online gaming such as Call of Duty and Battlefield possible.

BYOD (Bring Your Own Device)
The latest disruptive trend is BYOD, as the lines between work and personal computing devices blur. Most people (not me) prefer to carry one device. Since the mobile phone revolutionized three industries at once (Internet, phone, music player), it was only natural that the smartphone would become the device of choice. With high-definition cameras integrated into phones, we find less and less reason to carry a separate device just to take pictures, or a separate device for anything. As the saying goes today, "There is an app for that." With cameras commoditized as well, Nokia has a phone with a 41-megapixel camera built in. With all that power in the camera, other challenges arise, like the bandwidth and storage needed to keep and share such huge pictures and videos.

The Software Defined Generation
There is a new trend that has been around for a while but is finally going to take off. This disruptive technology is called Software Defined "X", the "X" being whatever the industry is. One example of Software Defined "X" is 3D printing. It was once science fiction to be able to just think up something and bring it into reality, but now you can simply define it in software (CAD/CAM) and the printer will make it. What makes 3D printing possible is the cost of the printer and the printing materials, thanks to commoditization. It wasn't that we lacked the technology to do this earlier; it was just cost prohibitive. Affordability has brought 3D printing into our homes.

Software Defined Storage
Let's take a look at Software Defined Storage. Storage first started out as a floppy disk or a hard drive. Then it evolved into a logical storage device consisting of multiple drives bound together in a RAID set for data protection. This RAID concept was then scaled into the SANs that store most of our business-critical information today. RAID has since been commoditized and is now a building block for Software Defined Storage. Software Defined Storage is not a new concept; it just was not cost effective. Now that high-performance networking and processing have become affordable, Software Defined Storage is a reality.

Software Defined Storage technology takes the RAID concept and virtualizes small storage nodes (appliances, or mini SANs), grouping them together as one logical SAN. Because this storage is made up of many small components, those components can sit anywhere in the architecture (including the cloud). With networking moving to 10Gb and 40Gb speeds, Fibre Channel at 16Gb, WANs (Wide Area Networks) running at LAN speeds, processors reaching 12+ cores per physical socket and memory responding in sub-millisecond times, Software Defined Storage can live virtually anywhere. It can even be stretched across a campus, or between cities or countries.
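
To make the idea concrete, here is a minimal sketch (not any vendor's implementation; all names are hypothetical) of how a software layer can present many independent storage nodes as one logical volume: data is split into fixed-size chunks, each chunk is written to two different nodes, and the mapping lives entirely in software.

    # Minimal software-defined storage sketch: a logical volume built in
    # software on top of independent storage nodes. Real products add
    # failure handling, rebalancing, caching, etc.
    CHUNK_SIZE = 4096   # bytes per chunk
    REPLICAS = 2        # each chunk is written to two different nodes

    class StorageNode:
        def __init__(self, name):
            self.name = name
            self.chunks = {}                      # chunk_id -> bytes

        def write(self, chunk_id, data):
            self.chunks[chunk_id] = data

        def read(self, chunk_id):
            return self.chunks[chunk_id]

    class LogicalVolume:
        """Presents many small nodes as one logical SAN volume."""
        def __init__(self, nodes):
            self.nodes = nodes
            self.chunk_map = {}                   # chunk_id -> nodes holding it

        def write(self, chunk_id, data):
            # Place replicas on different nodes, chosen round-robin in software.
            targets = [self.nodes[(chunk_id + i) % len(self.nodes)]
                       for i in range(REPLICAS)]
            for node in targets:
                node.write(chunk_id, data)
            self.chunk_map[chunk_id] = targets

        def read(self, chunk_id):
            # Any surviving replica can satisfy the read.
            return self.chunk_map[chunk_id][0].read(chunk_id)

    nodes = [StorageNode(n) for n in ("nodeA", "nodeB", "nodeC")]
    vol = LogicalVolume(nodes)
    vol.write(0, b"x" * CHUNK_SIZE)
    assert vol.read(0) == b"x" * CHUNK_SIZE

Because the node list and the chunk map are just software, the nodes themselves could be local appliances, remote sites or cloud instances, which is exactly the point of the "software defined" approach.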

In a world where everything is being commoditized, the "Software Defined" era is here.



Protecting Your Personal Data

Posted in: Backup, Cloud, General, Author: yobitech (July 17, 2014)

Being in the IT industry for over 20 years, I have worn many hats in my day. People rarely know exactly what I do; they just know I do something with computers. So by default, I have become my family's (extended family included) support person for anything that runs on batteries or plugs into an outlet. In case you don't know, I am a data protection expert, and these days I am not often troubleshooting or setting up servers anymore. In fact, I spend most of my days visiting customers and making blueprints with Microsoft Visio. I have consulted on, validated and designed data protection strategies and disaster recovery plans for international companies, major banks, government, military and private-sector entities.

Those who ARE familiar with my occupation often ask me, "So what does a data protection expert do to protect his personal data?" Since I help companies protect petabytes of data, I should have my own data protected as well. I am probably one of the few professionals who actually protect their own data to this extreme. It is sometimes a challenge, because I have to find a balance between cost and realistic goals; it is always easier to spend other people's money to protect their data. There's an old saying that "the shoemaker's son has no shoes," and there is some truth in that. I know people in my field who have lost their own data while being paid to protect others'.

Now welcome to my world. Here is what I do to protect my data.

1. Backup, Backup and Backup – Make sure you back up, and often. Daily backups are too tedious, even for a paranoid guy like me, and unrealistic too. Weekly or bi-weekly is perfectly sufficient. But there are other things that need to be done as well.

2. External Drives – External drive backups are not only essential, they are the only practical way to survive, since keeping pictures and home videos only on your laptop or desktop is not realistic. Backing up to a single external drive is NOT recommended; that is a single point of failure, as the drive can fail with no other backup around. I use a dual (RAID 1) external drive, one enclosure that writes to 2 separate drives at the same time, so there are always 2 copies. I also keep 2 other copies on 2 separate USB drives (a minimal scripted sketch of this copy-to-multiple-drives habit follows the list). Be cautious with all-in-one NAS backup appliances, as they add an additional layer of complexity. When they fail, they fail miserably; often the NAS controller is not recoverable and the data is stranded on the drives. At that point a data recovery specialist may have to be engaged, which can cost thousands of dollars.

3. Cloud Backup – There are many different cloud services out there and most of them are great. I use one that has no cap on how much can be backed up, so all of my files go to the cloud whenever my external drives are loaded with new data.

4. Cloud Storage – Cloud storage is different from cloud backup in that this service runs on the computers I use. Whenever I add a file to my hard drive, it is instantly replicated to the cloud service. I use Dropbox at home and Microsoft SkyDrive for work, so files are saved in the cloud as well as on all my computers. I also have access to my files from my smartphone or tablet; in a pinch, I can get to them from any Internet browser. This feature has saved me on a few occasions.

5. Physical Off-Site Backup – Once a year I copy my files onto one more external hard drive. That drive goes to my brother-in-law's house; you could also use a safe deposit box. That way, if there is a flood or my house burns down, I still have a physical copy off-site.
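
For those who want to automate step 2, here is a minimal sketch of the idea, assuming two USB drives mounted at hypothetical paths. It simply mirrors a source folder onto both targets so there are always at least two copies; a real setup should also verify the copies and log failures.

    # Minimal sketch: mirror one source folder onto two external drives.
    # The mount points are hypothetical placeholders.
    import shutil
    from pathlib import Path

    SOURCE = Path.home() / "Pictures"
    TARGETS = [Path("/Volumes/BackupA"), Path("/Volumes/BackupB")]

    def mirror(source: Path, target_root: Path) -> None:
        for src in source.rglob("*"):
            if not src.is_file():
                continue
            dst = target_root / src.relative_to(source)
            # Copy only new or changed files (simple timestamp check).
            if not dst.exists() or dst.stat().st_mtime < src.stat().st_mtime:
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)

    for target in TARGETS:
        if target.exists():              # skip drives that are not plugged in
            mirror(SOURCE, target)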

Data is irreplaceable and should be treated as such. My personal backup plan may sound a bit extreme, but I can sleep well at night. You don't have to follow my plan, but a variation of it will certainly enhance whatever you are already doing.



Guys with the Fastest Cars Don’t Always Win

Posted in: General, SAN, Author: yobitech (July 10, 2014)

We (us men) are wired internally to achieve greatness. Whether it is having the fastest car or the "bad-est" laptop, it is in us to want it.

Owning these high-tech gadgets and fast toys doesn’t necessarily make us better or faster. Most of the time it just makes us “feel” better.

In the storage business, SLC SSDs, or Enterprise Flash drives, are the "crème de la crème" of all drives. Customers pay a premium for them, sometimes more than a well-equipped BMW 3 Series per drive. Some SAN vendors use them as cache augmentation or cache extensions, while others use them as ultra-fast Tier 0 disk for the "killer app" that needs ultra-low latency. Regardless, SSDs have captured the hearts of the power- and speed-hungry. What isn't always discussed is that the fastest drives in the SAN don't always mean the fastest performance.

There are a few factors that can slow down your SAN. Here are a few tips to make sure you are optimized:

1. Plumbing – Just like the plumbing in your house, water flow is always at the mercy of the smallest pipe. If you have a 5" pipe coming into the house and a thin tube feeding your bathtub, it will take a long time to fill that tub. Be sure to optimize throughput by using the fastest rated network speed available end to end (a quick transfer-time comparison follows this list).

2. Firmware – Hardware devices have software too, not just your computers. This thin layer of software written specifically for a hardware device is called firmware. Make sure you are on the latest code, and read the README file(s) included with the release notes.

3. Drivers – Devices also have software inside the operating system, called drivers. Even though devices have firmware, drivers are what enable the operating system to use them. To put firmware vs. drivers in perspective: firmware is like the BIOS of the computer, the black screen you see when you turn the machine on that loads its basic settings. Drivers are like the operating system; just as Windows 8 or OS X loads on top of the hardware, a driver loads on top of the device's firmware.

4. Drive Contention – Contention is when you over-utilize a device or drive. A common mistake is to put everything (applications and hosts) on the SAN and then run backups back onto the same SAN. Although it may seem logical and economical, it does a disservice to the users' data. First, all the data is in one place, so a SAN failure means losing both data and backups. Second, the data first has to be read from the drives, then written back onto the same SAN (usually the same set of drives). This can cause a massive slowdown of the SAN, regardless of what drives you have in the system.

5. User Error – The most common and least talked about factor is user error, probably because nobody ever wants to admit mistakes. Misconfiguration of the SAN or application is a common fault. Most people, men in general, will not read the manual and will install by trial and error; the attitude is that if it works, it is good. This gives a false sense of security, especially with systems becoming more and more complex. A misconfigured setting may never show up as a problem until much later, and catastrophic failures can sometimes be the result of overlooked mistakes.
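
On the plumbing point, the effect of link speed is easy to work out. The sketch below uses illustrative, best-case numbers to compare how long it takes to move 1 TB over a few common link speeds, ignoring protocol overhead.

    # Rough transfer-time arithmetic for the "smallest pipe" point.
    # Best case only: real throughput is lower due to protocol overhead.
    TERABYTE_BITS = 1e12 * 8           # 1 TB expressed in bits

    for name, gbps in [("1Gb Ethernet", 1), ("10Gb Ethernet", 10), ("16Gb FC", 16)]:
        seconds = TERABYTE_BITS / (gbps * 1e9)
        print(f"{name:>13}: {seconds / 3600:.2f} hours to move 1 TB")
    # 1Gb ~2.22 hours, 10Gb ~0.22 hours, 16Gb ~0.14 hours
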
If you follow these simple steps to tighten up the SAN, you will achieve greatness through your existing investment.



The Balancing Act of Managing Data Today

Posted in: General, Security, Author: yobitech (June 25, 2014)

Did you hear? eBay was among the latest companies to be compromised by hackers. Who exactly are these hackers? Hackers are essentially anyone with malicious intent to cause disruption or harm to a system, an application or its data. I think the word "hacker" has received a bad rap over the years; hacking can actually be a good thing. By definition, the act of hacking is merely to reverse engineer an application or system with the intent of improving it. There are college courses dedicated to ethical hacking, as well as certification levels. To be certified to "hack" for the greater good sounds almost paradoxical; I think if you asked most people whether ethical hacking is possible, most would say no. With data being compromised almost daily, companies have taken serious measures to safeguard their data through encryption.

Encryption is the deliberate scrambling of data so that it can only be unscrambled with a unique digital encryption key. With data growing exponentially, companies have over the years bought into storage-saving technologies such as deduplication and compression, to better manage and protect data (backup and restore). To summarize, deduplication is the process by which duplicate data blocks within a system are not written over and over again; a single instance stands in for many. For example, if a 5MB food menu is distributed to 1,000 employees and every copy is stored, it consumes about 5GB of disk space in total. In a deduplicated system it consumes only 5MB, regardless of how many employees, because the system stores one instance of the menu and points the 1,000 references to that one instance. With compression added, that single 5MB instance can potentially be reduced up to 20x more. Imagine this process over terabytes of data: a tremendous space saving across the enterprise.
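
Here is a minimal sketch of the idea, assuming a simple content-addressed block store (the names are hypothetical): identical blocks are detected by their hash and stored once, while the logical view still sees 1,000 copies.

    # Toy block-level deduplication: identical content is stored once,
    # keyed by its hash; every logical copy is just a reference to it.
    import hashlib

    block_store = {}                    # hash -> actual bytes (stored once)
    file_table = {}                     # filename -> list of block hashes

    def store_file(name, data, block_size=4096):
        hashes = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            digest = hashlib.sha256(block).hexdigest()
            block_store.setdefault(digest, block)    # write only if new
            hashes.append(digest)
        file_table[name] = hashes

    menu = b"daily specials..." * 100_000            # ~1.7MB stand-in for the menu
    for employee in range(1000):
        store_file(f"menu_copy_{employee}", menu)

    logical = sum(len(hashes) for hashes in file_table.values()) * 4096
    physical = sum(len(block) for block in block_store.values())
    print(f"logical ~{logical / 1e6:.0f} MB, physical ~{physical / 1e6:.1f} MB")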

With security becoming a top priority, and companies already employing deduplication and compression, what implications does encrypting data have on these datasets? The answer: MAJOR.

Encryption randomizes data, so the duplication that deduplication relies on is purposely eliminated, and compression is limited if it works at all. It is almost counterproductive. So what are companies doing? Welcome to the information data management balancing act. This balancing act is, by nature, an enabler: it pushes the industry to make better tools, to innovate new technologies and to do more with less. As budgets shrink and systems become more complex, it is exceedingly important to have proper training to maintain these systems. Many do train properly and do it well, but some cut corners; those companies do themselves an injustice and put their data at risk, and they usually fall in catastrophic fashion.
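
The conflict is easy to demonstrate. The sketch below uses the third-party cryptography package as a stand-in for any modern encryption layer: the same block encrypted twice produces different ciphertext, so a downstream deduplication engine no longer sees any duplicates.

    # Why encryption defeats deduplication: identical plaintext blocks
    # produce different ciphertexts (a random IV per encryption), so a
    # dedup engine comparing block hashes sees no duplicates at all.
    import hashlib
    from cryptography.fernet import Fernet    # pip install cryptography

    block = b"the same menu block" * 100
    f = Fernet(Fernet.generate_key())

    plain_hashes = {hashlib.sha256(block).hexdigest() for _ in range(1000)}
    cipher_hashes = {hashlib.sha256(f.encrypt(block)).hexdigest() for _ in range(1000)}

    print(len(plain_hashes))    # 1    -> fully dedupable
    print(len(cipher_hashes))   # 1000 -> every copy looks unique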

Target is still trying to quantify the damages from its 2013 data breach, uncovering more and more as the investigation deepens. It is important not to fortify the front door while leaving the back door wide open.



The top 5 things you should never do when buying a SAN

Posted in: General, SAN, Author: yobitech (March 14, 2014)

As a technical sales veteran in the storage field, I see buyers of all kinds: people who make wise decisions based on real-world expectations, and people who buy on impulse. You might say that I am biased because I work for a vendor, and although that might be true, I was also a consultant before I was in sales.

I operate under a different set of “rules” where my customer’s best interest comes before my quota. Here is a collection of the top 5 things you should never do when buying a SAN.

5. Buy out of a “good feeling”

Sales people are in the business of selling. That’s what they do, that’s their “prime directive”. It is their job to make you “feel good”. Make sure you do your homework and check every feature and quote item. This is so that you know what you are buying. A common practice is that sales people will put something in the quote thinking you may need it, but in reality, it may not ever be used. Make sure you talk to the technical sales person without the sales guy. Ask him for the honest opinion but be objective. Ask about his background as well so you know his perspective. A technical sales person is still a sales person, but he is more likely to give you the honest, technical facts.

4. Buy a SAN at the beginning of the vendor’s quarter

SAN vendors are in business to make money, and they operate under the same sales cycles. If the company is publicly traded, you can look up when its quarters begin and end; some align to a calendar year and some are fiscal. Here is the fact… you WILL always get the best deal at the end of a quarter, and if possible, the absolute best deal at the end of the year (4th quarter). Since buying a SAN is usually a long process, you should align your research, as well as your buying approval, with those quarters. This will get you the best deal for your money.

3. Believe what a vendor tells you

I write this with caution, because at some point you need to believe someone. Just keep in mind that the salespeople courting you during the process have a quota to retire. The ones willing to back up their claims with objective facts and real-world customer references are the ones most likely to live up to expectations.

2. Buy a SAN without checking out their support

As a prospective SAN customer, once you are down to your final contenders, take some time to call their support, perhaps at 2am on a Sunday or 6am on a Tuesday, and see what kind of experience you get. A common mistake is to buy a SAN while things are running well and all is good; an outage, when you are scrambling to get support on the phone, is not the time to test their response. Check also what the industry says about their support. Other questions to ask: where is the support center located? Is it US based? Does it follow the sun?

1. Buy a SAN from a startup

I am a big fan of new and disruptive technologies; it is part of what makes us a great nation. But just as startups can pop up overnight, they can also disappear overnight. Startups are great, but for a SAN that will hold my company's "bread and butter," they are not such a wise choice. I say this from experience, having seen startups come and go. The ones that stay are usually the ones bought by bigger companies; the others are just hoping to be bought. Five years is a good milestone for a SAN company to pass, because by then the customers who invested in its products will be back in the market to refresh. If a vendor makes it past 5 years, there is a good chance it will be around.



Read the “Fine Print”

Posted in: Backup, General, SAN, Author: yobitech (March 4, 2014)

Far too many times I have bought something with great anticipation only to be disappointed. If it wasn't the way it looked or what it promised to do, it was something else that fell short of my expectations. The few companies that go beyond my expectations are the ones I keep going back to. The one I talk about most often is Apple; their products often surprise me (in a good way), with intangible qualities that bring a deep satisfaction beyond what is advertised. The "new drugs" for me are Samsung and Hyundai (cars).

American marketing plays the leading role in setting these expectations. Marketing has become part of "American" culture: the "must have" for the newest, coolest and flashiest toys that define who we are. Unfortunately, the actual product almost always falls short of its marketing, yet we all hang on the hope that these products will exceed our expectations. This is why "unboxing" videos are so popular on YouTube. Product reviews and blogs are also a good way to keep companies honest and to help us with our "addiction" to our toys. This marketing culture is not limited to personal electronics; it is also true of products in the business enterprise.

Marketing in the Business Enterprise

The Backup Tape

I remember having to buy backup tapes for my backups and often wondering how vendors could advertise 2x the native capacity of the tape. How can they make that claim? For example, an SDLT320 tape is really a 160GB tape (native capacity); how do they know customers can fit 320GB on a 160GB tape? After doing some research, the conclusion I came to was that they really don't know. It was surprising to me that they can make such a claim based on speculation. How do they get away with it? It is easy… it is what I call the "Chaos Factor": when someone or something takes advantage of a situation to further their cause.
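
The 2x figure simply assumes the data compresses by about half before it hits the tape, and the real ratio depends entirely on the data. A quick sketch, with zlib standing in for the drive's hardware compression:

    # The "2x native capacity" claim assumes roughly 2:1 compression.
    # zlib stands in for the tape drive's hardware compression here; the
    # actual ratio depends entirely on what you back up.
    import os, zlib

    samples = {
        "log/text data": b"2014-03-04 INFO backup job completed ok\n" * 50_000,
        "already-compressed data": os.urandom(2_000_000),   # e.g. JPEG, video, zip
    }

    for name, data in samples.items():
        ratio = len(data) / len(zlib.compress(data))
        print(f"{name}: {ratio:.1f}:1")
    # Repetitive text compresses far better than 2:1; media and encrypted
    # data barely compress at all, so a "320GB" tape may hold little more
    # than its 160GB native capacity.
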
In the case of backup tapes, the manufacturers capitalize on a few things that facilitate the Chaos Factor:

1. The Backup Software and

2. The Business Requirements.

The Backup Tape “Chaos Factor”

1. The Backup Software

Tape manufacturers know this all too well. Backup software is very complex, and virtually all backup administrators are far too busy worrying about one thing: completing the backups successfully. Checking whether tapes are being utilized to their advertised capacity is not something that even comes up in day-to-day operation. In fact, the only time tape utilization comes up is when management asks for it, and when it is requested, it is usually a time-consuming exercise because backup software does not have good reporting facilities to compile this information readily. Tape utilization is simply not a concern.

2. The Business Requirements

Another reason is how backup software uses tapes. Tape backups are scheduled as jobs, and most jobs complete before the tape is filled. Depending on the company's policy, most tapes are then ejected and stored off-site, so tapes are rarely ever filled up. This is normal for backup jobs; leaving tapes in the drive(s) just to fill them up goes against why companies do backups in the first place. Backup tapes are meant to be taken off-site to protect against disaster. A backup larger than a single tape is really the only case in which a tape is actually fully utilized.

This Chaos Factor is also at work in the business of data storage. The SAN market is another area where the protection of data trumps our ability to efficiently manage the storage, and it is full of dirty secrets, as I will outline below.

The SAN “Chaos Factor”

A dirty secret of the storage industry is the use of marketing benchmark papers. Benchmark testing papers are designed to give the impression that a product can perform as advertised. For the specific test in the paper, that may be true, but sometimes these tests are "rigged" to give the product favorable results; sometimes the performance numbers are impossible in the real world. Let me illustrate: I can type about 65 words per minute, which many people would view as average. But if I wanted to "bend the truth," I could say I type 300 words per minute. I can technically type the word "at" 300+ times per minute, but in the real world I don't type like that. What good is a book with one word ("at") printed on 300 pages? This kind of claim holds no water, but the same technique and concept is used in some of these technical papers. When the results are touted, keep vendors honest by asking what their customers see in performance in day-to-day operation.

Here is another technique commonly used by vendors, what I call "smoke and mirrors" marketing. It is a tactic used to mimic a new technology, feature or product that is hot. The main goal is to deliver the feature at the best possible price and downplay the side effects: deliberately engineering the feature set at the expense of existing features. Here is an example. I bought a new Hyundai Sonata last year. I love the car, but I am not crazy about the ECO feature that comes with it. I was told I would save gas with this mode, and although I think I get a few more miles per tank, the cost I pay in lost power, torque and responsiveness is not worth using the feature at all. I believe this feature, along with a smaller gas tank, eventually led to a class-action lawsuit over Hyundai's gas mileage claims.

So when vendors incorporate new features, they sometimes have to leverage existing infrastructures and architectures, because that is what they already have. In doing so, they end up with an inferior product that emulates new features while masking or downplaying the side effects. Prospective customers are not going to know the product well enough to understand the impact of these nuances; they often just see the feature set in a side-by-side comparison with other vendors and make decisions based on that. The details are in the fine print, which is almost never looked at before the sale. As a seasoned professional, I do my due diligence and research these claims, and I am writing this to help you avoid the same mistakes by asking questions and doing research before making a major investment for your company.

Here are some questions you should ask:

• What trade magazines have you been featured in lately? (last year)
• What benchmarking papers are available for review?
• How does that benchmark compare to real-world workloads?
• What reference architectures are available?
• What customers can I talk to on specific feature set(s)?

Here are some things to do for research:

• Look through the Administrator’s Guide for “Notes” and fine print details. This will usually tell you what is impacted and/or restricted as a result of implementing the features
• Invite the vendors for a face-to-face meeting and talk about their features
• Have the vendor present their technologies and how they differ from the competition
• Have the vendor white-board how their technology will fit into your environment
• Ask the vendor to present the value of their technology in relation to your company’s business and existing infrastructure
• If something sounds too good to be true, ask them to provide proof in the form of a customer testimonial

I hope this is good information for you, because time after time I have seen companies buy into something that isn't the right fit and then be stuck with it for 3-5 years. Remember, the best price isn't always the best choice.



Where is the Storage Industry Going?

Posted in: Backup, General, SCSI, SSD, Author: yobitech (November 5, 2013)

It is human nature to assume that if it looks like a duck, quacks like a duck and sounds like a duck, then it must be a duck. The same could be said about hard drives. They only come in 2.5" and 3.5" form factors, but when we dig deeper, there are distinct differences and developments in the storage industry that will define and shape the future of storage.

The Rotating Disk or Spinning Disk

There were many claims in the 90s that the "mainframe is dead," but the reality is that the mainframe is alive and well. In fact, many corporations still run on mainframes and have no plans to move off of them. Factors that may not be apparent on the surface are reason enough to continue with the technology: it provides a "means to an end."

Another claim, in the mid 2000s, was that "tape is dead," but again, the reality is that tape is very much alive and kicking. Although there have been many advances in disk and tape alternatives, tape IS the final line of defense in data recovery. Although it is slow, cumbersome and expensive, it is also a "means to an end" for companies that can't afford to lose ANY data.

When it comes to rotating or spinning disk, many are rooting for its disappearance. Some will even say it is going the way of the floppy disk, but just when you think there is nothing more to develop for the spinning disk, there are amazing new developments. The latest is…

The 6TB Helium Filled hard drive from HGST (a Western Digital Company).

Yes, this is no joke. It is a hermetically sealed, waterproof hard drive packed with more platters (7 of them) so it runs faster and more efficiently than a conventional spinning hard drive, once again injecting new life into the spinning disk industry.

What is fueling this kind of innovation in a supposedly "dying" technology? For one, solid state drives, or SSDs, are STILL relatively expensive. Their cost has not dropped (as much as I would have hoped) the way most electronic components' costs do, and that is keeping the spinning disk breed alive. The million-dollar question is, "How long will it be around?" It is hard to say, because when we look deeper into the drives there are differences, and they also fulfill that "means to an end" purpose for most users. Here are some of the differences…

1. Capacity
As long as there are ways to keep increasing capacity and keep the delta between SSDs and spinning disk wide enough, the appetite for SSDs will be diluted. This trumps the affordability factor because it comes down to value, or "cost per gigabyte." We are now up to 6TB in a 3.5" form factor, while SSDs are around 500GB. This is the single biggest hindrance to SSD adoption.
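
The cost-per-gigabyte point is easy to put in numbers. The prices below are illustrative assumptions for the era, not quotes; the ratio, not the exact dollars, is the point.

    # Illustrative cost-per-gigabyte comparison (assumed street prices).
    drives = {
        "6TB 7.2k HDD": (6000, 300.0),   # (capacity in GB, assumed price in USD)
        "500GB SSD":    (500,  350.0),
    }
    for name, (gb, price) in drives.items():
        print(f"{name}: ${price / gb:.2f}/GB")
    # Roughly $0.05/GB for the HDD vs $0.70/GB for the SSD: an order of
    # magnitude apart, which is the capacity/value delta described above.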

2. Applications
Most applications do not need high-performance storage. Most home-user storage holds digital pictures, home movies and static PDF files and documents, and those files are perfectly fine on large 7.2k multi-terabyte drives. In the business world or enterprise it is actually quite similar: most companies' data is somewhat static. In fact, on average about 70% of all data is hardly ever touched again once it is written, and I have personally seen customers with 90% of their data static after being written for the first time. Storage vendors have been offering storage tiering (Dell EqualLogic, Compellent, HP 3PAR) that automates the movement of data based on its usage characteristics, without any user intervention. This type of virtualized storage management maximizes the ROI (Return on Investment) and improves the TCO (Total Cost of Ownership) of spinning disk in the enterprise, and it has extended the life of spinning disk by playing to the performance characteristics of both spinning disk and SSDs.
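
None of the vendors publish their tiering engines, but the basic idea can be sketched in a few lines: track how often each block is touched, then periodically promote hot blocks to SSD and demote cold ones to the 7.2k tier. A toy version, purely illustrative:

    # Toy automated-tiering loop: hot blocks go to SSD, cold blocks to 7.2k
    # disk. Real arrays use far more sophisticated heuristics; this only
    # illustrates the principle.
    from collections import Counter

    SSD_SLOTS = 2                        # how many blocks fit on the fast tier
    access_counts = Counter()            # block_id -> accesses this period
    placement = {}                       # block_id -> "ssd" or "hdd"

    def record_access(block_id):
        access_counts[block_id] += 1

    def rebalance():
        hot = [b for b, _ in access_counts.most_common(SSD_SLOTS)]
        for block in set(access_counts) | set(placement):
            placement[block] = "ssd" if block in hot else "hdd"
        access_counts.clear()            # start a new measurement period

    # Simulate one period: block 7 is hammered, the rest are mostly idle.
    for _ in range(100):
        record_access(7)
    for b in (1, 2, 3):
        record_access(b)
    rebalance()
    print(placement)   # block 7 (plus one runner-up) on "ssd", the rest on "hdd"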

3. Mean Time Between Failures (MTBF)
All drives have an MTBF rating. I don't know exactly how vendors come up with these numbers, but they do; it is a rating of how long a device is expected to be in service before it fails. I wrote in a past blog called "The Perfect Storm" about SATA drives failing in bunches because of MTBF: many of these drives are put into service in massive quantities at the same time, doing virtually the same thing all of the time. MTBF is a theoretical number, and depending on how the drives are used, "mileage will vary." MTBF ratings are so high that most drives that run for a few years will continue to run for many more. In general, if a drive is defective it will fail fairly early in its operational life; that is why there is a "burn-in" period for drives. I personally run them for a week before putting them into production. Drives that last for years eventually make it back onto the resale market, only to run reliably for many more. MTBF for an SSD is different. Although SSDs are rated for a lifespan like spinning disks, their wear characteristics differ: the flash cells physically degrade as they are written and erased (an effect compounded by write amplification), and they are eventually rendered unusable, although wear-leveling software compensates for this. So compared to a spinning disk, where writes do not wear out the media this way, an SSD's end of life is measurably predictable. That is both good and bad: good for predicting failure, bad for reusability. If you can measure out the remaining life of a drive, it directly affects the drive's value.
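
Because flash wear is countable, an SSD's usable life can be estimated from its write workload, which is what makes it "measurably predictable." A rough sketch of that arithmetic, with assumed, purely illustrative numbers:

    # Rough SSD endurance arithmetic with assumed, illustrative numbers.
    # Real drives publish an endurance rating (e.g. TBW or DWPD) instead.
    capacity_gb = 500            # drive capacity
    pe_cycles = 3000             # assumed program/erase cycles per MLC cell
    write_amplification = 2.0    # assumed physical writes per host write
    host_writes_gb_per_day = 100

    total_host_writes_gb = capacity_gb * pe_cycles / write_amplification
    lifetime_days = total_host_writes_gb / host_writes_gb_per_day
    print(f"~{total_host_writes_gb / 1000:.0f} TB of host writes, "
          f"~{lifetime_days / 365:.1f} years at {host_writes_gb_per_day} GB/day")
    # ~750 TB of host writes, ~20.5 years at this (light) write rate; a heavier
    # write workload shortens the predicted life proportionally.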

For the near future, it is safe to say the spinning disk is going to be around for a while. Even if the cost of SSDs comes down, other factors still meet the needs of storage users. Just as other factors have kept the mainframe and tape around, the spinning disk has earned its place.

Long live the spinning hard drive!



Technology Headwinds

Posted in: Backup, General, Author: yobitech (August 13, 2013)

If you are amazed by technology today, you should know that the speed of technological advancement has been hindered for many years by many factors.

Things like copyright laws, marketing and corporate profits often constrain the speed of new products and innovations. We as a collective human race could develop and advance much faster if these obstacles were removed. The counter-force to these constraints has come from the "Open Source" community, with Linux, other operating systems and open standards; hobbyists, enthusiasts and hackers (ethical and unethical) have brought great benefits and improvements to the devices we all love so much. But with these leaps in technology comes a new inhibitor… increased regulation and compliance.

From the days of the Enron scandal to Martha Stewart's insider trading, a wave of regulatory rules and compliance requirements has come down on businesses. The Sarbanes-Oxley (SOX) Act, PCI DSS (Payment Card Industry Data Security Standard) and HIPAA (Health Insurance Portability and Accountability Act) are just a few examples.

These compliance rules do not directly inhibit innovation and advancement in technology, but they do slow it down. They force technology to stay around longer than it was intended to, and they "shift" innovation toward data preservation and accessibility. Retention regulations span many years: in the financial sector it is typically 7 years of unaltered and verifiable financial data, with the possibility of expanding to 10 years or more. The medical industry is moving to retention of 50+ years of unaltered and verifiable records, and the portability of medical history is driving retentions that exceed the life of a person, in some cases 100+ years, depending on the treatment type. Finally, there are the data "packrats": although not mandated by regulation yet, some institutions have self-imposed retention periods of "forever."

Reality? Yes and no.

Yes, in that we can set these rules today; but the reality is no… at least it is not proven yet. It is a work in progress. There are innovative products designed to keep data for 100+ years, but most companies' IT departments are not looking that far ahead. They are not going to spend a lot of money on unproven technology just for the prospect of keeping data that long; they have more immediate issues to solve. So companies face the challenge of keeping up with innovation and leading-edge technology while still supporting data retention compliance, with much of that data sitting on old tapes, optical platters and old disk storage systems.

Fortunately, there are aftermarket niche resellers that specialize in repurposed gear, and these businesses provide an essential service for these unique situations. Companies are making their storage subsystems last longer, usually past the vendor's EOL (End of Life) support for the products. Some resort to eBay and user groups to find hard-to-locate items, with varying degrees of success. One IT manager says, "When I buy my stuff from eBay for my workplace, I am playing Russian roulette. I prefer to go to these niche resellers because I know what I am getting and it's not from some guy in a garage somewhere." EOL disks in archival systems with compliance metadata, older servers with specific BIOS support for older drives, and SANs with EOL disks needing a specific interface or firmware all generate steady demand for these components. But until it hurts enough for companies to invest in ultra-long-term, archival/compliance-ready solutions, they will endure the pain and keep leveraging older equipment to preserve their data.



Tablets Are Up For Grabs

Posted in: General, Author: yobitech (July 1, 2013)

Although the iPad has set the standard for tablets, the tablet market is still ripe for the picking. In my opinion, we are fortunate to be part of one of the most exciting times in the history of mankind, well, at least for us guys. As many may think, Apple's "golden days" appear to be behind them, but they are still a very innovative and trend-setting company; I have to say that with the loss of Steve Jobs, it just hasn't been the same. As we look behind us, we see the tablet path riddled with failures. A lot of people may not know this, but Apple had a failure back in the early 90s with the Newton, and later came the Palm Pilot and the HP iPAQ. To some degree these failures set up the iPad for tremendous success. Like a volleyball game, Apple was primed to "spike" the ball; it took the failures of the past to learn from and design the iPad.

The iPad is not failing or doomed, it is simply losing its freshness. With iOS 7 coming, it doesn't seem to have the same "bang" as past announcements; although we don't know how it is going to work yet, it appears to be more of the same. Meanwhile, there are new and exciting tablets out there now innovating where Apple used to. For example, Dell has the convertible XPS 12, a laptop with full Windows 8 (Home or Pro) running on an i5 or i7 processor, equipped with a flip 12" screen so it becomes a full 10-hour tablet. There is also the Dell XPS 18, a tablet that doubles as a desktop, mimicking the Apple iMac while giving the consumer the ability to use it as a large tablet: a full 18" tablet! Samsung is also leading the way with its phone/tablet hybrid, the "Note" series, with a 5" screen: a true man's tablet with a dual-core processor and handwriting capabilities. There is a "wow" factor here, the same heart-thumping excitement as unboxing earlier Apple products.

These are just some of the innovations taking place in the tablet evolution. Apple opened up a can of whoop@$$ a few years back, but what goes around comes around. It's about time others learned from Apple's mistakes and are now returning the favor.



Hardship Drives Innovations

Posted in: General, SSD, Author: yobitech (June 5, 2013)

I am constantly inspired by the technologies that come into the marketplace. Although most new products are mediocre, there are some that are really deserving of praise.

Occasionally a disruptive product shakes the world. Oh, how I look forward to those times… However, there are also products that deliberately combine multiple technologies, leveraging their strengths and weaknesses to complement each other toward a common goal.

Each technology on its own may not be great, but combined they can be amazing. Take the hybrid car, for example. When the gasoline shortage of the 70s happened, it was a shock to most people that gas could actually run out, and that prompted the development of alternative fuels for cars. While most people thought of a one-to-one fuel replacement, combining fuels in a single vehicle was radical. Gas prices stayed low after that, and the gas shortage scare became a distant memory.

The concept of an alternative-fuel car was given low priority. I have seen attempts at natural gas cars, electric cars and even vegetable oil cars (the oil usually collected from restaurants' leftover cooking oil).

All were valiant efforts worthy of merit, but the hybrid car was the one that made it into production. The hybrid uses the electric motor for low-speed, stop-and-go local driving and the gasoline engine for high-speed driving; each technology is used where it is most efficient. The end product: a car capable of 90+ miles per gallon.

The same thing has now been done with SSDs. There are 2 kinds of SSD drives, MLC and SLC. MLC is the cheaper, consumer-grade SSD with a lower MTBF (mean time between failures) than SLC; it is not write-optimized, but it does a good job at reads and is affordable. SLC is the more expensive, enterprise-grade drive; it is write-optimized and has a higher MTBF. With these attributes and limited IT budgets, it was only a matter of time before an SSD hybrid SAN became available.

Dell Compellent was one of the first to take the hybrid concept and use multiple RAID levels and multiple drive speeds to offer some of the most efficient SAN storage on the market. The SSD hybrid SAN is the next generation of storage virtualization and tiering, bringing ultra-fast performance at a relatively reasonable cost.

We have to give credit where credit is due: we owe the birth of innovations like these to shrinking IT budgets and stormy economic conditions.

It is the rainy days that make the sunny days so much more enjoyable.
