Wide Open

From the Vault – The Computing of Business, October 2014

Did you know that in the wide world of open source software there is an application for analyzing seismic data? If I had only known that a few weeks ago, I could have thrown a portable seismograph into my carry-on for a recent trip to California.

Since I was rudely awakened during my stay by what FEMA now calls a major earthquake, I could easily have done a quick data reduction and submitted the results to the open source Global Quake Model. But alas, during the event I was thinking more about what might happen to the roof than contributing to science. At 40 miles from the epicenter it was certainly a unique experience and the building held together nicely—check that one off the bucket list.

So open source is everywhere. Good developers are developing good, creative software and distributing it for all to use under a variety of different licenses.

In a recent IT-related university classroom experience, Phil McCullough, a fellow COMMON Director, noted that the entire curriculum centered on open source. Open source operating systems, open source databases, open source development tools, open source applications: the entire gamut. The message was loud and clear: open source will be a big part of our computing future.

But software, by its very nature, contains defects. In many cases those defects are not what we traditionally consider a problem. For example, as the programmable point-of-sale system industry was developing a couple of decades ago, I cannot imagine anyone considered leaving a credit card number unencrypted in RAM a problem. It turns out it was.

Recently my employer sold some software we had developed to a very large company. Parts of that software contain open source components like frameworks and other mechanisms that helped develop a very complex set of code. Before the deal could close the buyer required substantial documentation of every open source component, and the version of that component. Additionally, the entire base of object code was scanned for any known vulnerabilities. All went well and the deal was completed.

Open source operating systems like Linux, and large open source applications like SugarCRM, have the benefit of substantial support organizations. Having dedicated people watching for problems is an advantage many smaller projects do not, and cannot possibly, have. The likelihood is very real that the bad guys—and there are an awful lot of them—will stumble onto something they can exploit in almost any open source project. If you run some or all of a smaller, targeted code base, it then becomes simply a matter of a bad guy (they are voracious sharers of exploit information) finding your system and deploying one of the myriad ways of injecting malware onto it.

Of course there needs to be something worthwhile to steal. Credit card numbers are the currency du jour, though certain kinds of pictures bring a bigger bang. Regardless, any information about your company or your customers has value to someone.

A number of scanning solutions, like OpenLogic and OpenVAS, can attempt to locate problematic software. But using such tools seems to me a bit reactive. The veritable cat may have already left its bag.

A more proactive approach is to know where in your software, and on your systems, the open source stuff (and anything else with known vulnerabilities) is. Trust me, the stuff is there. Complete, accurate application inventories are more important than ever. An inventory, coupled with appropriate monitoring of the threat landscape, will keep you ahead of the bad guys. Yes, this is something more to do with your constantly shrinking resource base. But staying ahead of the bad guy sure beats the alternative.
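The inventory-plus-monitoring idea above can be sketched in a few lines. This is a minimal illustration, not a real tool: the component names, versions, and vulnerable-version list are all invented for the example, and a production process would pull the watch list from a vulnerability feed rather than a hard-coded dictionary.

```python
# Hypothetical sketch: checking a simple component inventory against a
# watch list of known-vulnerable versions. All names and versions below
# are illustrative, not real advisories.

inventory = {
    "web-framework": "2.3.1",
    "xml-parser": "1.0.4",
    "crypto-lib": "0.9.8",
}

# Versions with published vulnerabilities (invented data for the sketch)
known_vulnerable = {
    "xml-parser": {"1.0.3", "1.0.4"},
    "crypto-lib": {"0.9.7", "0.9.8"},
}

def flag_vulnerable(inventory, known_vulnerable):
    """Return components whose installed version appears on the watch list."""
    return sorted(
        name for name, version in inventory.items()
        if version in known_vulnerable.get(name, set())
    )

print(flag_vulnerable(inventory, known_vulnerable))
# prints ['crypto-lib', 'xml-parser']
```

The hard part in practice is not the comparison but keeping the inventory complete and the watch list current, which is exactly why the column stresses both.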

For many the 2014 South Napa earthquake was a disaster. For me it was an interesting experience.

For some the open source landscape will be a disaster. Reading about those problems should be the only experience you want.

About the Author: Randy Dufault, CCBCP

Randy is the Director of Solution Development for Genus Technologies, a Midwestern consultancy dealing primarily with enterprise content management systems. His experience with content management dates back 25 years, to when he helped develop what ultimately became IBM's Content Manager for iSeries. He has also developed and integrated a number of advanced technologies including document creation, character recognition, records management, and workflow management. Randy is a member of the COMMON North America Board of Directors and was active in the development of COMMON's Certification program.

Read Randy’s Computing of Business column in COMMON.CONNECT.

What Is Software Defined Storage?

One of the major trends in IT storage today is the accelerating growth of software defined storage (SDS). According to market research firm MarketsandMarkets, the market for SDS products will grow from $4.72 billion in 2016 to $22.56 billion by 2021. That’s an outstanding compound annual growth rate (CAGR) of 36.7%.

But many IT leaders, even storage professionals, remain unsure of exactly what the term SDS really signifies.

Actually, that confusion is not surprising. As with many new technologies that begin to expand their market share, some storage vendors have taken the opportunity to break out the software component of existing products and call it SDS. And, of course, their definition of SDS just happens to precisely match the feature set of the product they are trying to sell.

Contrary to what some skeptics claim, however, SDS is much more than just the latest marketing buzzword. In fact, many proponents see it as the vanguard of a revolutionary advance in how enterprise storage is managed and delivered.

SDS Defined

The Storage Networking Industry Association (SNIA) defines SDS as “virtualized storage with a service management interface.”

Although storage virtualization has been in use for some time, SDS takes it to a new level. The distinctive feature of SDS is the decoupling of the intelligence of the storage system from the underlying hardware. This means SDS is storage-agnostic – it isn’t tied to any particular type of hardware or media. Instead it treats all the devices it controls, whether spinning disks, flash memory arrays, or even entire SAN or NAS subsystems, as part of a single storage pool. Users and applications (via standard APIs) can access storage through a consistent software interface without needing to have any knowledge of what hardware is actually storing the data.
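The decoupling described above can be made concrete with a toy model: callers use one interface while the pool decides which device actually holds the data. Everything here, the class names, the round-robin placement, the backend labels, is an assumption made for illustration; real SDS products expose far richer APIs.

```python
# Minimal sketch of the storage-agnostic idea: one software interface in
# front of heterogeneous devices. Class names and placement policy are
# invented for this example.

class Backend:
    """A stand-in for any device: flash array, disk, or a whole NAS share."""
    def __init__(self, name):
        self.name = name
        self.blocks = {}

class StoragePool:
    """Single pool over heterogeneous devices; callers never see which one."""
    def __init__(self, backends):
        self.backends = backends
        self.placement = {}          # key -> backend holding that data
        self._next = 0

    def write(self, key, data):
        # Simple round-robin placement; a real system would be far smarter.
        backend = self.backends[self._next % len(self.backends)]
        self._next += 1
        backend.blocks[key] = data
        self.placement[key] = backend

    def read(self, key):
        # The caller asks the pool, not a device, for its data.
        return self.placement[key].blocks[key]

pool = StoragePool([Backend("flash-array"), Backend("sata-disk"), Backend("nas-share")])
pool.write("invoice-001", b"scanned invoice bytes")
data = pool.read("invoice-001")      # caller needn't know where it landed
```

The point of the sketch is the shape of the interface: `write` and `read` never mention hardware, which is precisely what "storage-agnostic" means.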

One of the major benefits of the storage heterogeneity SDS allows is that costly, specially designed storage appliances are not required (though, of course, they can be used if desired). Instead, inexpensive commodity hard drives attached to x86 hosts can be used, mixed with higher-performance technologies such as flash memory arrays as necessary. The SDS software has the intelligence to use tiering and caching functions to dynamically assign particular sets of data to the appropriate storage devices based on the performance demands of the workload being run.
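The tiering behavior just described can be sketched as a simple promotion rule: new data lands on the cheap tier, and data that is read often enough gets moved to the fast one. The threshold, tier names, and policy here are invented for the sketch; real SDS tiering engines weigh many more signals than a read counter.

```python
# Hedged sketch of tiering: promote frequently read data to a fast tier,
# leave cold data on a cheap one. Threshold and tier names are assumed
# values for illustration only.

from collections import Counter

HOT_THRESHOLD = 3   # reads before data counts as "hot" (invented value)

class TieredStore:
    def __init__(self):
        self.fast = {}               # e.g. flash memory array
        self.slow = {}               # e.g. commodity SATA disk
        self.reads = Counter()       # per-key access counts

    def write(self, key, data):
        self.slow[key] = data        # new data starts on the cheap tier

    def read(self, key):
        self.reads[key] += 1
        if key in self.fast:
            return self.fast[key]
        data = self.slow[key]
        if self.reads[key] >= HOT_THRESHOLD:
            # Promote hot data so later reads hit the fast tier.
            self.fast[key] = self.slow.pop(key)
        return data

store = TieredStore()
store.write("report", b"q3 figures")
for _ in range(3):
    store.read("report")             # third read triggers promotion
print("report" in store.fast)        # prints True
```

Demotion of data that cools off would work the same way in reverse; the essential idea is that placement decisions live in software, not in any one device.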

The result of hiding all the storage hardware behind the SDS software interface is that flexibility, scalability, and control are maximized, while costs for hardware, maintenance, and storage management are minimized.

The FinTech Future of IBM i

With fintech companies moving toward the creation of new transaction models for blockchain support of payment and lending transactions, IBM has launched new developer tools, software, and training programs targeted at financial services industry software developers. Version 7.3 of IBM i, released in April 2016, requires little to no onsite IT administration during standard operations, making blockchain programming endeavors practical on the platform.

IBM Bluemix Garage developers are using the Bluemix PaaS (platform as a service) capabilities to test cloud network solutions designed to unlock the potential of blockchain. The Hyperledger Project, set up to advance blockchain technology as a cross-industry, enterprise-level open standard for distributed ledgers, will be critical to the development of the latest fintech IaaS (infrastructure as a service) technologies as they emerge.

The collaboration of software developers on blockchain framework and platform projects stands to promote the transparency and interoperability of fintech IaaS. Providing the support required to bring blockchain technologies into mainstream commercial adoption, Bluemix Garage developers are keen on IBM i database software programming as a turnkey solution for operating systems on Power Systems and PureSystems servers.

The recent release of fintech and blockchain courses by the IBM Learning Lab offers training and use cases for financial operations analysts and developers. Through partnerships with blockchain education programs and coding communities, IBM is engaging the best in cognitive developer talent to capture ideas for the next generation of APIs, artificial intelligence apps, and business process solutions from the IBM i community.

Considerations for Selecting the Right IT Vendor

As always, the field of information technology is growing exponentially. Along with that ever-expanding growth comes a plethora of players, all seeming to offer the perfect IT solution for your company. Here are a few tips to consider when selecting an IT vendor that can meet both current and future technology needs.

Does the vendor believe in the product?

One of the most telling signs of whether an IT vendor really believes in their products is whether they actually use what they are selling. If they do, it speaks volumes about their confidence in their own products. If they don't, it is a tough sell to tout those products to others. Most vendors promote their products with promises of reduced costs and increased productivity. If you want to determine whether that is actually the case, take a look at the financial reports of the vendor itself, or of any company they mention as a reference; the promised results should show up there.

Does the product “play nice” with others?

Another important consideration is interoperability. Do the products an IT vendor offers play well with others? Although homogeneous products offer the advantage of initial simplicity, they can narrow your options and reduce the opportunity for diverse expansion and growth. Many companies these days see the value of solutions with built-in flexibility and are shying away from getting locked into homogeneous solutions.

Will the vendor be there for you?

Of course, service after the sale is important as well. When problems crop up, it's important to know your IT vendor provides stability and continuity in its workforce, so you have a solid relationship you can rely on. For honest reviews of a particular company, you can check out sites like Glassdoor. If the employee turnover rate for a particular IT vendor is high, chances are you will spend a fair amount of time re-making initial connections with reps instead of interacting with someone who knows you and your company well.

Conclusion: Do Your Research

In essence, selecting a good IT vendor is not only about listening to sales presentations and then picking the one with the most appeal. It also involves conducting further research to verify sales materials, and understanding that the relationship with the vendor after the sale is just as important as the product itself.