Project Management in IT

Software upgrade rollouts. Database and server migrations. Security protocol change implementation. Hardware replacements and upgrades.

At any given time, your IT team is facing a list of projects that need to be completed. (Even more so at smaller companies where the “IT team” is one or two people trying to play catch-up when they’re not wearing one of their other hats.) At times, projects even seem to get added to your to-do list faster than you can cross them off.

Such is the nature of virtually any modern business. Staying current on security, delivering your customers the type of experience they demand, and equipping your employees with the tools they need to succeed requires you to be every bit as proactive about technological upgrades and process improvements as you are about system maintenance and monitoring.

The days when a calendar on your wall and sticky notes on your desk were adequate tools for managing the types of projects your company demands are in the past. Today, if you are going to have any chance of staying ahead of the curve on IT projects (and, hopefully, maintain your sanity), you will need a much more deliberate approach: know how to work on multiple projects simultaneously, sequence them properly, and complete them in ways that minimize interruptions to your company's work and your customers' experience.

There are myriad tools designed to help you manage technical projects. There are, of course, classic tools like Microsoft Project, which remains the solution of choice for many companies and is a reasonable standard against which to measure any other potential solution. Some companies have signed large contracts with rapidly expanding firms offering cloud-based project management platforms to simplify collaborative management and leverage shared data. Still others embrace open source project management systems and platforms (whether installed locally or in the cloud) to access robust functionality without committing to a single vendor.

No matter what route your company decides to go, it is imperative that you approach technical projects with the degree of intentionality that these solutions are designed to support. Successful project management is dependent on defining a clear scope of work, assigning the necessary resources, carefully and accurately documenting the work that needs to be done and the work that has been done, and following through.

Integrating PowerVC in an IBM i Shop

By Dana Boehler

The speed of business has never been faster. Product release cycles have shrunk to timelines inconceivable in the past. Some fashion retailers are now releasing new product every two weeks, a cycle that historically only happened 4-8 times a year, and certain retailers even have product available immediately after it is displayed on the runway.

The demand for immediate insight into the state of sales numbers, ad campaigns, and other business functions has made the continuous aggregation of data commonplace. And if those factors weren’t pressure enough, the threat of ever-evolving security hazards is generating mountains of updates, code changes, and configuration adjustments — all of which need to be properly vetted before entering a production environment.

All of this activity needs to run on infrastructure that administrators like ourselves must manage, often with fewer coworkers to assist. Thankfully, for those of us running IBM i on IBM Power Systems, IBM has provided a robust cloud management tool that allows us to quickly spin up and spin down systems: PowerVC.

PowerVC allows users to manage existing IBM Power System partitions, create images from those partitions, and deploy new partitions based on those images. More recent versions of PowerVC support IBM i management and deployment (earlier versions did not).

Over the past year, I have been using PowerVC to greatly reduce the amount of time it takes to bring a system into the environment. Typically, creating a new system would take several hours of hands-on keyboard work over the course of a few days of hurry-up-and-wait time. The first time I deployed a partition from PowerVC, however, I was able to reduce that to about an hour, and after more refinements in my deployment, images, and process, I am now down to under 25 minutes. That’s 25 minutes to have a fully deployed, PTF’d system up and running.

The full implications of this may not be readily apparent. Obviously, net new systems can be deployed much more quickly. More importantly, though, new modes of development can be more easily supported. PowerVC supports self-service system provisioning, which enables teams to create their own systems for development, test, and QA purposes, and then tear them down when no longer needed. Since these systems are focused on the task at hand, they do not need the resources a fully utilized production environment would.

There’s more: Templates can be created in PowerVC to give the self-service users different CPU and memory configurations, and additional disk volumes can be requested as well. Post-provisioning scripts are supported for making configuration changes after a deployed system is created. In our environment, we are taking this a step further by integrating PowerVC with Red Hat’s Ansible automation software, which has given us greater flexibility in pre- and post-provisioning task automation.

In practice, using PowerVC removes many of the barriers to efficient development inherent in traditional system deployment models and permits continuous deployment strategies. Using PowerVC, a developer tasked with fixing a piece of code can spin up a clean test partition with the application and datasets already installed, create the new code fix, spin up a QA environment that has all the scripted tests available for testing the code, and then promote the code to production and delete the partitions that were used for development and testing.
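Because PowerVC is built on OpenStack, the deploy step in the workflow above can be scripted against its OpenStack-compatible REST API. The Python sketch below only constructs the JSON body for a server-create request; the image, flavor, and network identifiers are hypothetical placeholders, not real PowerVC objects.

```python
import json

# A minimal sketch of scripting a PowerVC deployment. PowerVC is built on
# OpenStack, so a new partition is created with an OpenStack-style
# "create server" request; this function only builds the JSON body.
# All IDs and names below are hypothetical placeholders.
def build_deploy_request(name, image_id, flavor_id, network_id):
    """Return the JSON body for an OpenStack-style POST /servers call."""
    return {
        "server": {
            "name": name,                        # partition name to create
            "imageRef": image_id,                # captured IBM i image
            "flavorRef": flavor_id,              # CPU/memory template
            "networks": [{"uuid": network_id}],  # network to attach
        }
    }

body = build_deploy_request(
    name="qa-ibmi-01",
    image_id="image-uuid-placeholder",
    flavor_id="small-ibmi-template",
    network_id="network-uuid-placeholder",
)
payload = json.dumps(body)  # what a real script would POST to the host
```

A real script would send this payload to the PowerVC management host with an authentication token and then poll the new partition's status until it is active; post-provisioning scripts or Ansible playbooks pick up from there.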

You do have to make some changes to the environment in order to support this model. Code needs to be stored in a repository, so it can be kept in sync between all systems involved. The use of VIOS is also required. Additionally, note that when using this type of environment, the administrator’s role becomes more centered around image/snapshot maintenance (used for deployment templates) and automation scripting rather than the provisioning and maintenance of systems.

For full information on the product and its installation, I recommend visiting IBM’s knowledge center.

Guest Blogger

Dana Boehler is a Systems Engineer and Security Analyst at Rocket Software, specializing in IBM i.

Introducing the POWER9 Server Family

POWER9 is here. As many in our community will be looking to upgrade, we want to provide information on what these new servers offer you and your business.

According to IBM, POWER9-based servers are built for data-intensive workloads, are enabled for cloud, and offer industry-leading performance.

As you have experienced, Power Systems have a reputation for reliability, and the POWER9-based servers are no exception. POWER9 gives you the reliability you’ve come to trust from IBM Power Systems, the security you need in today’s high-risk environment, and the innovation to propel your business into the future. They truly provide an infrastructure you can bet your business on. From a Total Cost of Ownership (TCO) standpoint, IBM’s calculations show that moving to POWER9 can yield savings of 50% over three to five years.

When compared to other systems, POWER9 outperforms the competition. IBM reports:

  • 2x performance per core on POWER9 vs. X86
  • Up to 4.6x better performance per core on POWER9 vs. previous generations

Learn more about POWER9 by visiting the new landing page. For more detailed data regarding POWER9 performance, be sure to click on the Meet the POWER9 Family link.

Attending the COMMON Fall Conference & Expo? Be sure to attend the POWER Panel session on POWER9. This will be your opportunity to learn more about the servers from experts.

Minimize Employee Use of Local Storage

Saving files in local folders, and even on the desktop, is an easy option. Whenever you open a new file or download an attachment, it saves to a local ‘Downloads’ folder by default, and edited files try to save themselves in ‘My Documents.’ But relying on local storage on individual devices can slow down your business.

Why Should You Reduce Local (Device-based) Storage?

Central or cloud-based storage is beneficial for multiple reasons: easier security, universal access, and consistent backups, to name a few. Local storage offers none of these.

Only the Employee and the System Administrator Have Access

Locally stored files are easy for an employee to save and open, but only for that specific employee. No one else has easy access, including managers or co-workers involved in the project; only a network administrator with remote access to the drive can reach the files. Not only is this inconvenient if the employee is out of the office that day, it also provides no protection against long-term loss of access. If the employee leaves the company and the drive is wiped (or the employee was using a personal device), any progress is lost. A hard drive malfunction can likewise destroy files that have no backup and cannot be repaired.

There Is No Version Control

If you’ve recently emailed a large group of people, the conversation probably splintered into a couple of different email threads. That can be tricky to get back on track, and it usually ends with not everyone having all the information they need. This is even more true of in-progress documents. If one employee is making updates to a local file, other parties can’t see the changes until the file is manually shared. If two employees make separate changes, some work will be irreparably lost, or there will be confusion and frustration down the line. But if files are stored in shared working software, where changes are made live and saved continuously (especially if edits are marked by author), there is more collaboration and less overwriting or wasted effort.
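The lost-update problem described above fits in a tiny simulation (Python, with a dictionary standing in for a shared drive): two employees copy the same file, edit independently, and save back whole files, so the second save silently erases the first edit.

```python
# Simulate the "no version control" failure mode: two people copy a shared
# document locally, edit independently, and save whole files back. The
# last writer wins, and the first edit is silently lost.
shared = {"report.docx": "Q3 draft"}

alice_copy = shared["report.docx"]   # Alice downloads the file
bob_copy = shared["report.docx"]     # Bob downloads the same version

alice_copy += " + sales figures"     # Alice's local edit
bob_copy += " + marketing notes"     # Bob's local edit

shared["report.docx"] = alice_copy   # Alice saves back first
shared["report.docx"] = bob_copy     # Bob saves back, overwriting Alice

lost = "sales figures" not in shared["report.docx"]  # Alice's work is gone
```

Live collaborative storage avoids this by merging edits into one shared copy instead of replacing the whole file on every save.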


Cloud Technologies – Containerizing Legacy Apps

Information technologies are continually in a state of transition and organizations often need tools to help them transition from one platform to another, especially with regard to legacy apps. Many companies either still find value in these apps or simply cannot make the transition to Cloud technologies fast enough due to budgetary concerns or other reasons. For these organizations, IBM is now offering the Cloud Private platform, which allows businesses to embrace the Cloud by not only containerizing their legacy apps but also containerizing the platform itself, along with other IBM tools and many of the notable open source databases.

Providing Bridges

Through its Cloud Private platform, IBM provides a bridge between current cloud services and an organization’s on-premises data infrastructure. In essence, it allows a company’s legacy apps to interact with cloud data. IBM understands the value of making a platform accessible to other technologies, and it applied that philosophy to the Cloud Private tools as well: whether an organization uses Microsoft’s Azure cloud platform or Amazon Web Services, IBM’s Cloud Private is flexible enough to work with either.

A Comprehensive Package

IBM’s Cloud Private platform offers a comprehensive package of tools to help companies mix and mingle their legacy apps with other cloud services and data. The Cloud Private toolset includes components for:

  • Cloud management automation
  • Security and data encryption
  • The core cloud platform
  • Infrastructure options
  • Data and analytics
  • Support for applications
  • DevOps tools

A comprehensive transitioning tool such as this should help companies make the most of the investment in their legacy apps. In addition, it gives them the time buffer they need before eventually making a full transition to the Cloud.

Artificial Intelligence – The Number One IT Career Path

There is good news for those who have decided to acquire artificial intelligence skills as part of their IT career path. IBM Watson gurus will be pleased to learn that AR and VR skills hold the top spot among in-demand skills for at least the next couple of years. According to IDC, total spending on products and services that incorporate AR and/or VR concepts will soar from the $11.4 billion recorded in 2017 to almost $215 billion by 2021, phenomenal growth that will require a steady stream of IT professionals who can fill the need in these rapidly expanding fields and others. Read on to learn more about the top five IT careers that show nothing less than extreme promise for anyone willing to reach for the rising IT stars.

Computer Vision Engineer

According to the popular job search site Indeed, the top IT position most in demand for the next few years goes to computer vision engineers. These types of positions will require expertise in the creation and continued improvement in computer vision and machine learning algorithms, along with analytics designed to discover, classify, and monitor objects.

Machine Learning Engineer

If vision engineers are responsible for envisioning new ideas, then machine learning engineers are responsible for the actual creation and execution of the resulting products and services. Machine learning engineers actually develop the AI systems and machines that will understand and apply their built-in knowledge.

Network Analyst

As AI continues to grow exponentially, so does the IoT. This means an increased demand for network analysts who can apply their expertise to the expansion of networks required to support a variety of smart devices.

Security Analyst

As AR and VR configurations become more sophisticated, and as a growing fleet of smart devices creates more opportunities for exploitation, cyber attacks will grow more sophisticated as well. Security analysts will need strong skills in AR, VR, and the IoT in order to protect an organization’s valuable assets.

Cloud Engineer

Behind the scenes lies the question of how these newer concepts will affect cloud services. The current expectation is that solutions will require a mixture of in-house technology and outside sources. Cloud engineers will need to familiarize themselves thoroughly with AR and VR concepts in order to support them.

IBM i and the API Economy

By Dan Magid

The API economy is a new frontier for companies seeking to get the most out of their data. With the right API, you can transform rigid workflows into the intuitive web-based and mobile experiences that today’s users demand, using modern programming languages to access real-time transactional data. This way, users can harness the value of information from the transactional systems that fuel their businesses without unnecessary time, cost, and risk.

According to Coleman Parkes Research, 88% of all businesses employ APIs. But our internal research shows that only 38% of IBM i users are enjoying the benefits of the API economy. Perhaps this disparity is due to the lack of API expertise among IBM i shops, where most developers use RPG and COBOL rather than modern tools like Node.js or Go, or to the time and resource constraints that are common among IBM i shops.

However, the fact is that APIs are the most effective way for IBM i shops to expose real-time transactional data for access by modern web, mobile and IoT applications, so bridging the knowledge gap between modern application development and the languages that power your critical IBM i applications should be a top priority. But it must be done without hurting productivity, blowing the IT budget or risking catastrophic application errors.

These goals—building and deploying APIs without negatively affecting organizational performance—may appear to be contradictory, but they don’t have to be. How do you do it? I recommend a two-pronged approach:

  • Look for tools that build APIs directly from the host-based applications that run your business, without needing to modify their codebase
  • Ensure that your API functionality is easily tested against any other host-based application changes that occur, vastly reducing the resources required to keep APIs working in a world of ever-changing user and IT requirements
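As a sketch of the first prong, the Python fragment below maps an incoming REST route onto a call descriptor for an existing host program, leaving the program itself untouched. The program name GETCUST, the library APPLIB, and the parameter layout are hypothetical examples; a real API tool would generate this mapping and perform the actual IBM i program call.

```python
# Sketch of the "wrap, don't modify" approach: translate a web route into
# a descriptor for an existing host program call. GETCUST, APPLIB, and the
# parameter layout are hypothetical examples, not a real interface.
def to_program_call(path):
    """Map GET /customers/<id> onto a host program call descriptor."""
    parts = path.strip("/").split("/")
    if len(parts) != 2 or parts[0] != "customers":
        raise ValueError("unsupported route: " + path)
    return {
        "program": "GETCUST",   # existing RPG program (hypothetical)
        "library": "APPLIB",    # its library (hypothetical)
        "parms": [{"name": "CUSTID", "value": parts[1]}],
    }

call = to_program_call("/customers/10042")  # descriptor for one request
```

The second prong then reduces to a regression suite of such route-to-call mappings that can be re-run automatically whenever the host application changes.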

Following this approach will enable you to not only extend the value of your IBM i platform, but also unlock the value of your applications to a world of new uses. And because no source code needs to be modified, it will also help you bridge the knowledge gap between your IBM i applications and the web and mobile applications that need their functionality.

I’ll finish with the results of a recent study by Quark and Lepton, which showed that the IBM i platform typically costs about one-third as much to operate over a three-year period as Oracle/Linux or SQL Server/Windows platforms. IBM i is by far the most cost-effective of the three, and with the right API approach, it can serve you effectively for years to come.

Guest Blogger

Daniel Magid is Managing Director of Rocket Software’s Application Lifecycle Management & DevOps lab, and is a recognized authority on helping leading organizations achieve compliance through ALM solutions and DevOps best practices. He has written a variety of articles for leading IT publications and is a regular speaker at technology conferences.

Cloud Technologies and Handling Ransomware

Cloud computing is one of the best technologies to have in the workplace. Not only can you store your data quickly and efficiently, it’s also easier to access that data from anywhere. But when it comes to your business’s security, and especially to the malicious tool known as ransomware, why are cloud services so important?

Cloud = Virtual Storage

One reason is that cloud computing stores your data virtually, over the Internet, which puts it largely out of reach of a local disaster. Let’s say a ransomware attack hit your device and affected the data on your hard drive. Your virtual data would remain untouched, since the local machine, not the cloud copy, is what the attacker is banking on. But since ransomware locks your computer, you wouldn’t be able to access any of your virtual files, right? As a matter of fact, you can. Cloud computing not only keeps your files safe in the event of a disaster; your data is also accessible from any device with an Internet connection. Whether it’s another computer in the workplace or even your mobile phone, the sky’s the limit to where you can access your data.

For more information about cloud computing, COMMON offers educational opportunities throughout the year. Stay in touch to see when the next cloud-related sessions become available.

The Role of IT in the Retail Industry – Adapting to Trends

IT has a substantial role to play in the modern retail industry. This has been the case for a long time. However, the world of information technology changes quickly enough that individuals working in retail have had to adapt to various trends.

Mobile Apps for Retail Stores

Many experts are now urging all retailers to create apps that pertain specifically to their products, and even to their store locations themselves. This is one of the biggest IT trends in the retail industry. These apps vary widely, however.

Some makeup stores offer apps that enable users to test products in advance in a whole new way. Other apps offer customers convenient discounts right at their fingertips.

People can use apps in order to get a sense of where different items are located and whether or not they’re currently available. As such, there are apps that will truly help people overcome some of the most frustrating parts of shopping in the first place.

Cloud Computing in Retail

Thanks to cloud computing, it is much easier for retailers to consistently monitor their inventory. They can quickly get a sense of what they have in stock and what’s going on with orders. Losing large amounts of data is much less likely in the era of cloud computing. While this makes the technical side of the equation easier, it also means that retail workers can place more emphasis on customer service.

IT helps improve efficiency in the retail industry. The retail industry may also be able to fulfill more of its primary objectives thanks to modern information technology trends.

Demand for Cloud Technologies

Those who have skills and experience with cloud technologies are going to be much in demand in the next few years. According to Tech.Co, the use of cloud computing technologies is expected to quadruple in the near future. Estimates are that cloud data centers will manage a whopping 92 percent of all workloads.

So who are the biggest contributors to this massive migration to the cloud? The biggest players are the IoT (Internet of Things) and big data centers. Most of the growth will occur in public cloud data centers, with the use of private clouds beginning to decline. Interestingly, predictions are that infrastructure as a service (IaaS) will decline somewhat, as many organizations focus on improving their own corporate infrastructures, including both data storage for sensitive information and acquisition of their own high-speed connections.

In addition, a recent study conducted by Dresner Advisory Services, the “2017 Cloud Computing and Business Intelligence Market Study,” notes that as organizations turn to public clouds, they are also looking for cloud-based business intelligence tools such as dashboards, advanced visualization, ad hoc queries, data integration and data quality, end-user self-service, and reporting features. The study goes on to note that the growing demand for cloud-based BI services is largely driven by smaller organizations. Social media and streaming analytics, however, did not make the across-the-board “must have” list for BI services, although they remain important in certain industries.

Trust in the cloud is not just increasing among businesses. Consumers are also expected to demand more from the cloud. Estimates are that the share of consumers using personal cloud storage will increase from 47 to 59 percent. That may not sound like a huge jump, but globally the increase represents about a billion more users.

The future looks bright in the cloud, supported by both business and consumer demand. Anyone interested in applying their technology skills to this trend will most likely have a bright future as well.