The “Wiki” effect

Posted in: General, Author: yobitech (April 24, 2013)

There used to be something called the “Encyclopedia”. It was a set of books that collected reliable knowledge in hundreds of thousands of articles, biographies, images, and more. It was considered to be a neutral and accurate source of information. In fact, much of that information was based on canonical material and analytical processes that produced solid and consistent information and interpretations. The days of the encyclopedia are numbered, mainly because of the Internet and the growing popularity of “Wikipedia”.

When I first heard of Wikipedia, I thought it was a stupid name. I also thought that its information-gathering process was based on a flawed concept. The concept of having anyone and everyone put together a database of information for the purpose of reference was strange. Call me old-school, but I like to open up an encyclopedia and look things up, knowing that there are real companies, real staff, with real reputations that put thousands of hours of research into publishing the Encyclopedia. Wikipedia today has virtually replaced the encyclopedia and is now the gold standard for information. Not because it is better, but because it is what the new generation knows. Convenient, accessible and free, combined with the loose standards of today’s society, Wikipedia has become the popular choice. Popularity over time will eventually dominate. With the extinction of the encyclopedia there will be little to no accountability for what is being defined as information as we know it.

Consider this scenario… Let’s say a group of people believed strongly in a cause and decided to define some new terms. They post them to Wikipedia. Then others that are “like-minded” decide to cite and endorse the article or definition. At some point, these terms become valid and are left searchable by all. The problem here is that Wikipedia has the potential to be a “sliding scale” for information, held together by a conglomerate of users, contributors and supporters. Given the credibility it has, the danger it poses as a “sliding scale” is that it can change and morph a society. I am not saying that Wikipedia is evil, but we should be careful in assigning undue credibility to it. Wikipedia has its place, but our society gravitates to convenience. Imagine: if we can define (or redefine) something simple, it is just a matter of time before we can define the bigger things.

Comments Off on The “Wiki” effect


Perception is Everything

Posted in: General, Author: yobitech (March 17, 2013)

There was once a time, not too long ago, that I had to actually study books for certifications and pass exams. I still do, but answers are now easier to find with the web. There is no doubt that a good solid education is important, but we live in an era where information is abundant and access to it is instantaneous. Learning has become different and condensed with the content available to us via the web.

With the advent of Google, the entire search engine industry was redefined. The “secret sauce” of Google is what separates them from all other search engines. The competition is a distant second and not even close. Google’s ability to “crawl” the entire World Wide Web and index all the content on Google’s massive server farms is only the first phase. Google then applies their “secret sauce” algorithms to sort results in relevant ways, and that is what makes them the crown jewel of the Internet. The searches are seemingly “spot on”, or at least relevant to what we are looking for. So much so that we have become spoiled with having such a good tool. There is even a website called “Let Me Google That For You” (http://lmgtfy.com/). It creates a link you can email to someone that walks them through searching Google for the answer to their question. Google’s success is due in large part to the fact that everyone else just sucked. They have set a new standard and raised the “bar” high.
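The two phases described above — index everything first, then rank by relevance — can be sketched with a toy example. This is an illustration only; Google’s real crawler and ranking algorithms are vastly more sophisticated and not public, and the “pages” here are made up:

```python
# Toy sketch of index-then-rank: build an inverted index over a few
# fake "pages", then rank matches by how many query words they contain.
from collections import defaultdict

pages = {
    "page1": "cheap flights to paris from new york",
    "page2": "paris travel guide best hotels in paris",
    "page3": "new york pizza guide",
}

# Phase 1: map every word to the set of pages that contain it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# Phase 2: score pages by query-word matches and sort best-first.
def search(query):
    scores = defaultdict(int)
    for word in query.split():
        for url in index.get(word, set()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("paris hotels"))  # page2 ranks first: it matches both words
```

The real “secret sauce” is in phase 2: everyone could build an index, but ranking results so the relevant ones come first is what set Google apart.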

This tool is great for personal use, but many people have come to leverage Google as a professional tool. I have to say that there are times when I may not have the answer to a technical question, but when I am in that situation, the first thing I do is Google it. I come off as an expert because of my background, but I get to maintain that level because I know where to find the answers (and fast). It amazes me how many other people rely on Google as their right hand to become the expert or consultant. It isn’t always about what we know, but how we are perceived. Google is our best friend. I have known people who left psychology to become computer experts, and cab drivers who became web coders. Self-taught by Googling any challenges and questions that came up in the process. We are so fortunate in this era. What would we do without Google? I can’t even think about it…

I am tempted to experiment with launching a second career in something I have not done before, relying solely on Google to give me all the answers I need to succeed. I know I can succeed because I am sure I am not the first to do so.

Comments Off on Perception is Everything


Looking for Order in Chaos

Posted in: General, Author: yobitech (February 16, 2013)

Sometimes the solution is right in front of us, but most people fail to find it because of perspective. Even the best and smartest people often fail. It is hard to see the solution because the solution is buried inside a sea of data.

What am I talking about? I am talking about business opportunities, cures for diseases, new scientific discoveries, inventions, etc. The success of Google began because there were no really good search engines. Most were mediocre at best. When you did a web search, most results returned were unusable or irrelevant. Google had the “secret sauce”. They were able to take your query, “mine” this vast sea of data called the Internet, and return a mostly relevant list of search results. Google was able to do what nobody else could: pinpoint and uncover data on the Internet. This concept is just the cornerstone of the future.

Most companies do not understand this concept and are constantly looking for answers in the little data they have available. It isn’t the availability of data but the ability to look outside for answers that is the key to having a competitive edge. I recently did some consulting for a company that built its business on this concept. It takes vast amounts of information about people’s cell phone activities and uses its software to look for trends and patterns. If the trends and patterns fall outside of the “norm” (user defined), they are flagged as unusual activity. At some point, these calling patterns can identify possible fraud. This is great stuff, but imagine taking it one step further: I believe if this data were correlated with the justice department, it could link unsolved crimes or give our criminal investigators an edge in finding and prosecuting existing crimes. This is just one example of “Big Data”.

Big Data is a term that describes the ability to take vast amounts of data and, using technology (hardware and software), massage and mine it for new and valuable information. This is “trail-blazing”, and discoveries are yet to be uncovered. That is the benefit of investing in a “Big Data” infrastructure. It is all possible because of large storage devices and HPC (High-Performance Computing). This is currently being used in geophysical and nuclear research, but why limit the concept to those industries? As “Big Data” mining is slowly adopted, we are just scratching the surface of what can be uncovered.

Believing that there is order in the midst of chaos is where it all begins.

Comments Off on Looking for Order in Chaos


The Future is Cloudy

Posted in: Cloud, General, Author: yobitech (January 15, 2013)

Technology changes so fast that if I were ever sequestered for a court case for more than a year, I would have to change careers or start over. There are so many changes that happen on a daily basis that it is mind boggling. I have seen professionals come and go, and only a few have remained in the tech industry. Those with the keen ability to stay one step ahead and reinvent themselves are the ones that are thriving. I personally love this fast-paced, ever-changing field, and I use it to my advantage. It keeps me fresh and sets me apart from my competition. Most would see this as a negative aspect of their career, but I see it as an opportunity.

I recently wrote about “Cannibalization” and the willingness of companies to engage in it purposely. It is a necessary evil to survive in this industry. As invincible as Microsoft may have seemed in the early 2000s, they are showing the signs of a big company that has lost its agility. They are getting hit from all fronts… the Open Source community, Apple, Android, Google Chrome and Amazon are just a few names. Having been slapped around for the past 6-8 years, Microsoft has become a company with an identity crisis. Just look at their latest products… Windows 8 and Windows Server 2012. What are they???? Is Microsoft a business solutions company or a consumer products company? Is it a PC or a tablet? Is it a server or a desktop? It will be interesting to see where Microsoft goes. Will they be able to reinvent themselves or go the way of the typewriter?

There are a few game-changers that come along in technology and disrupt “life” as we know it. Where we are today, there is a disruptive technology called the “Cloud”. The “Cloud” sounds dark, but it is a shining star. It is a game-changer, a breath of fresh air in this economy. The “Cloud” is really nothing more than remote computing; the details of what each cloud company offers are what set them apart.

Remote computing, or telecommuting, is the earliest form of cloud computing. In the mid 80s there was “PC Anywhere” and dial-in using RAS (Remote Access Servers). Then came dedicated remote sites, usually hooked up via dedicated T1s and ISDN lines. Then came the Internet… Al Gore changed everything. This opened up many more possibilities with IP tunneling and encryption. With mobile and tablet computing, BYOD (Bring Your Own Device) has become the preferred MO (Modus Operandi, or Method of Operation) for end user computing.

Cloud technology has been around for a while, but it just never really caught on until now. The skies are now dark with some serious Cloud offerings. Driven by a tight economy, it is all about the bottom line. Companies that once thought about buying servers and building out data centers are now looking at paying a monthly subscription for the same computing resources, but without the burden of managing them.

What does this mean for the traditional IT guy with tons of certifications and experience? They will find themselves outmatched and outwitted if they don’t reinvent themselves. For those looking to get into the tech industry, this is good news. The cloud is a great entry point as it is starting to take off. IT is a revolving door; there are just as many experienced folks leaving (voluntarily and involuntarily) as there are getting in. So whether you are a veteran like me or a newbie trying to get in, this is a great time to do so. The future looks cloudy, and that is good.

Comments Off on The Future is Cloudy


Cannibalization

Posted in: General, Author: yobitech (November 8, 2012)

Cannibalization is a harsh term when it comes to tech companies. With technology moving at such a rapid pace, cannibalization is a reality that a company must face, sometimes in order to survive.

“Cannibalization” is the strategic research, development and release of a product or service that may take away from an existing product or service that a company already sells.

Back in the early days of “Ma Bell” (short for “Mother Bell”, aka the Bell System) there was no competition. It did what it wanted whenever it wanted. Not only was there no competition, there wasn’t anything that “Ma Bell” was doing to hurt their own “bread and butter”… their Telephone Business.

This went unchallenged for many years, until 1984, when “Ma Bell” was broken up into eight companies: AT&T plus seven regional companies such as Bell Atlantic and NYNEX. I am sure this was painful for those who had benefited from the “Ma Bell” empire, but as we look back, it spurred competition as well as innovation.

The breakup of “Ma Bell” created the “Baby Bells”, which have since diversified into wireless technology, IP telephony, Internet service and even cable television service.

Cannibalization became an essential part of survival. It provided forward movement of technology for the industry and for us… the consumer.

On the other hand, a sad example of a company that was averse to cannibalization was StorageTek. StorageTek was formed by four former IBM engineers in the early 70s. They developed tape libraries and tape backup devices. They rode the “gravy train” for many years and made lots of money. Competition was easy since they owned many patents around tape library robotics and design. StorageTek was way beyond any competitor and put virtually all other tape library companies to shame. Then came an unlikely competitor that didn’t even make or design tape drives or libraries. They didn’t even have robotics. It was a company that made disk storage arrays that started the downfall of StorageTek.

The company was EMC. EMC, a disk array company, developed and mass marketed the first successful VTL (virtual tape library). StorageTek was unable to develop and launch a VTL to the mass market, largely because of its resistance to cannibalization. Fortunately, StorageTek was acquired by Sun Microsystems and is now part of Oracle. StorageTek’s robotic technology is still used in other applications, and Oracle still sells tape libraries. Just not like they used to.

As we look at companies today, cannibalization is a natural part of moving forward, but it needs to be done gracefully and must be strategically timed. For example, Apple does this with grace, keeping every consumer on his heels, waiting on long lines as if our lives depended on it. The release of the iPad mini will indeed cannibalize iPad sales, but Apple strategically holds back certain features like the Retina screen. This strategic limiting of features and marketing of the iPad mini actually counters other 7″ tablets. It comes at the optimal time, when people are holiday shopping and are perhaps tired of their 7″ Kindles or 7″ Galaxy tablets; what a great excuse to jump onto the Apple bandwagon. Apple can do this while not giving up the footprint they are so used to.

Cannibalization is now a tool to move forward instead of looking for an exit strategy.

Comments Off on Cannibalization


Bigger Isn’t Always Better

Posted in: General, Author: yobitech (October 22, 2012)

I blogged earlier about hardware vendors like Dell, EMC, and HP looking to strategically become the “Walmart” of the Data Center.

While this was all happening on the “corporate level”, there was similar movement in the parallel universe on the “consumer level”.

Google, Apple and Amazon are moving discreetly and rapidly to captivate the consumer market. You may not have noticed, but there seem to be a lot of interesting products in the past few years that have really become pillars of our daily lives.

From Google Docs to Google Voice to watching videos on Amazon Prime… these are just a few examples of how a search engine company has now become our partner in life and how an online bookstore has now become our entertainment outlet and shopping superstore.

So what’s this all about? It is all about owning the “Ecosystem”. The “Walmart” philosophy of the “one-stop” shop goes beyond just making it convenient for us, the consumer, but it is also about making us dependent on them, the company… the “Ecosystem”.

The more of the ecosystem a company can own, the more likely we will be stuck patronizing them. This is not necessarily a bad thing, but it puts the squeeze on the smaller, more specialized market segments.

While creating competition on a macro-level, this also creates unintended consequences such as steamrolling industries in the process.

While I am a big fan of Apple and their endless barrage of tantalizing toys and entertainment, I can’t lock myself into just one company. I love the fact that I can turn on my ASUS Android tablet and enjoy the same benefits as with Apple, while seeing how each product and service is presented and executed uniquely.

Owning “the Ecosystem” is a wonderful concept, but let’s not forget to help out the “little guys”: the small businesses, the boutique shops and the niche players.

I remember going to a local tool rental shop to rent a compactor for a paving job in my backyard. I could have gone to Home Depot and rented it there, but I chose to support my local business. They were so appreciative of my business that they gave me the attention I could not have received from a hardware superstore.

Let’s support our local and small businesses as this is a great way to keep everyone honest, including the “Big Boys”. When we all do this we can all focus on the more important things in life, our friends and families.

Comments Off on Bigger Isn’t Always Better


Flash Evolution

Posted in: General, SSD, Author: yobitech (August 21, 2012)

I LOVE my MacBook Air. Hands down the best computer I have ever used. Elegant and lightweight, but in every way a real computer.

What impressed me most is not so much the size and sleekness of the MacBook Air, but Apple’s uncanny timing and ingenuity in “trail-blazing” on-board flash memory chips as hard drive storage. No other manufacturer would make such a bold and gutsy move. It’s just way too risky and costly (from a support perspective).

Apple saw something different. They saw an opportunity to rewrite a page in solid state drive storage while creating the thinnest laptop in the world. It also came at the perfect time, as Apple took advantage of the fact that the SSD market was still evolving. Consumers were not picky about what kind or type of SSD was used; ANY kind of RAM or flash-based storage would be awesome.

This is like the early days of flat screen televisions. About 7-8 years ago, it didn’t matter if Whirlpool made a flat screen TV; people would buy it just because it was flat and cheap. As SSDs continue to take shape, it is unquestionable that Apple has clearly set a new standard. A concept once thought not economically feasible became their very advantage over every competitor.

Apple’s keen ability to operate and think “outside the box” keeps us, the consumers, always wanting more! Just when we think Apple has run out of ideas, they always seem to surprise us again. This is the stuff that gets us out of bed early to wait on long lines just to BUY an Apple product. Hats off to a company that almost went bankrupt a few times and is now the biggest company that ever existed.

Comments Off on Flash Evolution


Is Fibre Channel Dead?

Posted in: General, Author: yobitech (May 16, 2012)

The discussion of which is the better choice, Fibre Channel (FC) or iSCSI, will get different answers from different people.

It is certainly arguable that Fibre Channel is more expensive, but the case can also be made that it has become less expensive over the years.

With the masses moving into the 10Gb iSCSI market, the cost difference is not so great anymore.

So is FC dead? It is probably as dead as the mainframe computer.

Many predicted the death of the mainframe over 20 years ago, but the mainframe is very much alive and thriving today. In fact, server virtualization hypervisors like VMware, Microsoft Hyper-V and Citrix Xen are foundationally based upon mainframe technologies.

FC has been and will remain a solid contender as a transport for data. It operates at a lower layer of the stack than iSCSI, so from a foundational level, FC inherently has an advantage.

iSCSI has come a long way since it first came out. With many enhancements like Jumbo Frames and DCB (Data Center Bridging), iSCSI is a formidable solution, even with the overhead of the IP stack. With 10Gb iSCSI going mainstream and competing with 8Gb FC, the lines are blurred, with differences in some pros and cons. It becomes more an argument of preference and what each establishment sees as an advantage.
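One concrete way to see why Jumbo Frames help iSCSI is to count the header overhead paid per frame. A rough back-of-the-envelope sketch, assuming typical IPv4/TCP headers with no options and standard Ethernet framing (illustrative only, not any vendor’s numbers):

```python
# Per-frame payload efficiency: how much of the wire time carries data
# versus Ethernet/IP/TCP framing overhead, at two common MTU sizes.
ETH_OVERHEAD = 18 + 20    # Ethernet header + FCS, plus preamble and inter-frame gap
IP_TCP_HEADERS = 20 + 20  # IPv4 + TCP headers, no options

def efficiency(mtu):
    payload = mtu - IP_TCP_HEADERS     # bytes of actual data per frame
    wire_bytes = mtu + ETH_OVERHEAD    # bytes consumed on the wire
    return payload / wire_bytes

print(f"standard 1500 MTU: {efficiency(1500):.1%}")  # roughly 95%
print(f"jumbo    9000 MTU: {efficiency(9000):.1%}")  # roughly 99%
```

The bigger win in practice is fewer frames per megabyte, which means fewer interrupts and less per-packet CPU work; FC avoids much of this bookkeeping by design, which is the “lower layer of the stack” advantage mentioned above.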

So is FC going to die?
Not in the near future, for there are enough of us out there that are “old school” like me. Don’t get me wrong, I am a big proponent of iSCSI, but there are just some applications that I will reserve for FC. With the next release of FC going to 16Gb and iSCSI going to 40Gb, knowing your application should determine your transport type.

Other than that, it’s really a question of, “Will it be Chocolate or Vanilla?”

Comments Off on Is Fibre Channel Dead?


The Perfect Storm

Posted in: Backup, General, RAID, SAN, SAS, Author: yobitech (February 25, 2012)

As you may remember, when SATA drive technology came around several years ago, it was a very exciting time. This new low-cost, high-capacity, commodity disk drive revolutionized home computer data storage.

This fueled the age of the digital explosion. Digital photos and media quickly, and affordably, filled hard drives around the world. This digital explosion propelled companies like Apple and Google into the hundreds of billions in revenue. It also propelled the explosive data growth in the enterprise.

The SAN industry scrambled to meet this demand. SAN vendors such as EMC, NetApp and others saw the opportunity to move into a new market using these same affordable high-capacity drives to quench the thirst for storage.

The concept of using SATA drives in a SAN went mainstream. Companies that once could not afford a SAN can now buy a SAN with larger capacities for a fraction of the cost of a traditional SAN. This was so popular that companies bought SATA based SANs by the bulk, often in multiple batches at a time.

As time progressed, these drives started failing. SATA drives were known for their low MTBF (mean time between failures). SATA SANs employed RAID 5 at first, which provides protection against a single drive failure, but not a dual drive failure.

Companies then started to employ RAID 6 technology so that even a dual drive failure would not result in data loss.

The “Perfect Storm” even with RAID 6 protection looks like this…

– Higher capacity drives = longer rebuild times: The industry has released 3TB drives. Rebuild times will vary depending on the SAN vendor; I have seen 6 days for a rebuild of a 2TB drive.

– Denser array footprint = increased heat and vibration: Dramatically reduces MTBF.

– Outsourced drive manufacturing to third world countries = increased rate of drive failures, particularly in batches or series: Quality control and management are lacking in outsourced facilities, resulting in mass defects.

– Common MTBF in mass numbers = drives fail around the same time: This is a statistical game. For example, a 3% failure rate for a SAN in a datacenter is acceptable, but when there are mass quantities of these drives, 3% will approach and/or exceed the fault tolerance of the RAID level.

– Virtualized storage = complexity in recovery: Most SAN vendors now have virtualized storage, but recovery will vary depending on how they do their virtualization.

– Media errors on drives = failure to successfully rebuild RAID volumes: The larger the drive, the greater the chance of media errors. Media errors are errors on the drive that render small bits of data unreadable. Rebuilds of RAID volumes may be compromised or fail entirely because of these errors.
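The media-error point deserves some back-of-the-envelope math. Assuming the common consumer-SATA spec of one unrecoverable read error (URE) per 10^14 bits read, the odds of hitting at least one URE while re-reading the surviving drives during a RAID 5 rebuild grow quickly with capacity (a rough illustration, not any vendor’s published figures):

```python
# Probability of at least one URE during a rebuild, where every bit on
# the surviving drives must be re-read to reconstruct the failed one.
URE_RATE = 1e-14       # errors per bit read (typical consumer SATA spec)
BITS_PER_TB = 8e12     # 1 TB = 8e12 bits

def p_error_during_rebuild(tb_read):
    bits = tb_read * BITS_PER_TB
    return 1 - (1 - URE_RATE) ** bits

# RAID 5 sets of 3TB drives: re-read (n-1) * 3TB to rebuild one drive.
for drives in (4, 8, 12):
    tb = (drives - 1) * 3
    p = p_error_during_rebuild(tb)
    print(f"{drives} x 3TB drives: {p:.0%} chance of a URE during rebuild")
```

With a dozen 3TB consumer-class drives the rebuild is more likely than not to stumble on a URE, which is exactly why RAID 6 became necessary and why even RAID 6 alone is not enough.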

Don’t be fooled into a false sense of security by having just RAID 6. Employ good backups and data replication as an extension of a good business continuity or disaster recovery plan.

As the industry moves to different technologies other new and interesting anomalies will develop.

In technology, life is never a dull moment.

Comments Off on The Perfect Storm


Lots of Cache isn’t always the answer

Posted in: General, Author: yobitech (December 19, 2011)

What is Cache (pronounced like “cash”)?

Cache is essentially memory that is used to enhance or to accelerate performance in a system. Cache is generally bolted onto storage devices to help facilitate bursts in data movement in a disk or a SAN system.

Cache does not, in any way, change the storage device but merely adds a “cushion” or “shock absorber”. The larger the cache, the more the system can absorb “spikes” and “peaks” in demands on the storage device.

The result… a better end user experience. But is that a good thing? Of course, but understanding what cache is and using it properly will result in the optimal use of such a tool. Ultimately, a balanced system should be the goal of anyone using data storage today.

Throwing Cache at Your Storage Problems

I have seen this happen all too often… It sounds cliché, but throwing “cache” at all your problems won’t make them go away. In any storage system there is a point where adding more cache reaches “diminishing returns”. That point will vary depending on the storage device configuration (speed of disks, RAID level, connection type).

Cache can buffer up to a certain point, but eventually the data still needs to land on the actual storage system. Adding cache is like using a credit card. It is great when shopping because it helps us feel better and we get things so much easier and faster. But storage systems with a lot of cache eventually have to pay for it on the backend. This cost comes in the form of write penalties (from RAID levels), lack of IOPS (too few disk spindles or slow disks) and quasi-virtualized storage (typically emulated through a file system). The big problems come when the stored data is used heavily and consistently.
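The write penalty mentioned above is standard RAID arithmetic and easy to sketch. Once the cache has to de-stage to disk, each front-end write costs multiple backend I/Os. The spindle count, per-disk IOPS and workload mix below are generic illustrative figures, not any vendor’s spec:

```python
# Effective front-end IOPS once writes land on disk, for common RAID
# levels: RAID 10 costs 2 backend I/Os per write, RAID 5 costs 4
# (read data, read parity, write data, write parity), RAID 6 costs 6.
def effective_iops(spindles, iops_per_disk, write_penalty, write_fraction):
    raw = spindles * iops_per_disk
    # Weighted backend cost: writes are amplified, reads are not.
    return raw / (write_fraction * write_penalty + (1 - write_fraction))

# 24 x 15K-rpm spindles at ~180 IOPS each, 70% write workload:
for raid, penalty in (("RAID 10", 2), ("RAID 5", 4), ("RAID 6", 6)):
    print(f"{raid}: {effective_iops(24, 180, penalty, 0.7):.0f} IOPS")
```

The same 24 spindles deliver very different sustained performance depending on RAID level, which is the bill the backend eventually presents no matter how much cache sits in front of it.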

Cache reaches the point of diminishing returns and shifts your problems to the backend. So remember, a good storage system shows its merits when it can boast the use of less cache. An example of this is Compellent. Compellent’s truly virtualized storage can handle data issues upfront with little dependence on cache.

With the growing popularity of SSDs because of their robust speeds, some disk vendors like EMC are using them as a way to augment traditional cache. This helps, but essentially it pushes the point of diminishing returns further down the road. SSDs are also not the same as cache: they are great for read operations, but write performance is reduced by comparison. Be careful of this tactic, because the backend still needs to process the workload.

So before you go out and spend tons of cash, ask your storage vendor how much cache is in their system and why. Don’t invest in a storage system that resembles our national debt… remember, we eventually have to pay in the end.

Comments Off on Lots of Cache isn’t always the answer