
INDUSTRY INSIGHT

Timing the leap from cutting-edge to LPTA technology

Technology changes fast, so fast that what was considered leading edge even five years ago is often already a commodity today. The rapid evolution from best value to commoditization is a key driver behind the lowest price technically acceptable (LPTA) evaluation policy that governs many federal IT purchases.

Solid state and flash drives are a good example. Formerly expensive and selectively used, they are now cheap and ubiquitous. Converged architecture — combining networking, compute and storage into a ready-built blueprint — is an example of more complex systems evolving toward a simplified “single SKU” offering. At first bug-prone and difficult to scale, converged architectures are now stable, mature and easily integrated into agency IT environments.

So what evolving and emerging technologies should we expect to make the LPTA leap in the next few years? Here’s what I believe will be some of the most prominent:

Cloud: One of the biggest changes over the last five years has been the widespread adoption of cloud computing. While public cloud platforms offer an attractive lower-cost alternative to traditional on-prem data centers, many agencies have concerns about their diminished administration rights, which are limited to the software layer of the cloud environment only. The same goes for increasingly popular as-a-service applications. Those concerns are compounded by worries about putting sensitive data in the public cloud, where stringent security can feel out of an agency’s reach.

Consequently, the next three to five years will undoubtedly see broad adoption of hybrid or micro-cloud models. Some data will stay in on-premises systems, where incorporating smaller private clouds into the architecture can offer a cost-effective and powerful solution. Other data will live in the economical and safeguarded public cloud space. For instance, public cloud may be used to bolster long-term preservation and redundancy of on-prem technology for disaster recovery and business continuity purposes. With hybrid or micro-cloud, federal agencies will be in full control of what goes where.
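To make that split concrete, here is a minimal sketch of a hybrid-cloud placement policy. The tier names, sensitivity labels and retention rule are hypothetical placeholders for illustration; an agency's actual classification scheme and storage targets would differ.

```python
# Minimal sketch of a hybrid-cloud placement policy. Sensitivity labels,
# tier names and the retention threshold are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Record:
    name: str
    sensitivity: str      # e.g. "public", "internal", "sensitive"
    retention_years: int


def choose_tier(record: Record) -> str:
    """Decide where a record should live under a simple hybrid model."""
    if record.sensitivity == "sensitive":
        # Sensitive data stays on the agency-controlled private cloud.
        return "on-prem-private-cloud"
    if record.retention_years >= 7:
        # Long-term preservation and DR copies go to low-cost public cloud storage.
        return "public-cloud-archive"
    return "public-cloud-standard"


if __name__ == "__main__":
    samples = [
        Record("citizen-pii.db", "sensitive", 10),
        Record("press-releases", "public", 20),
        Record("daily-metrics", "internal", 1),
    ]
    for r in samples:
        print(f"{r.name} -> {choose_tier(r)}")
```

The point of such a policy layer is that the agency, not the cloud provider, decides what goes where, which is exactly the control a hybrid or micro-cloud model is meant to preserve.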

Automation and orchestration: Enterprise automation and orchestration are rapidly proliferating, with widely adopted platforms like Ansible and ServiceNow offering open application programming interfaces that automate workflow provisioning. However, such tools presently involve significant manual integration — installing, configuring and connecting to the right API. Five years into the future, these tools will instead use automatic, prewritten connectors to APIs, making provisioning plug-and-play. Artificial intelligence and machine learning will drive major advances in other tools as well. For example, consider a technical support chatbot that integrates with an orchestration platform. Using AI and ML, the bot can be continually improved, learning to identify problems faster by building on what it has already learned.
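The "manual integration" step today typically looks like the sketch below: a script that authenticates against an orchestration platform's REST API and submits a provisioning request. The endpoint URL, payload fields and token are hypothetical placeholders, not the actual Ansible or ServiceNow API; the prewritten connectors described above would remove the need to hand-build this glue.

```python
# Minimal sketch of provisioning a workflow through an orchestration
# platform's REST API. The endpoint, payload fields and token are
# hypothetical placeholders for illustration only.
import requests

ORCHESTRATOR_URL = "https://orchestrator.example.gov/api/v1/workflows"
API_TOKEN = "replace-with-real-token"


def provision_workflow(name: str, template: str, params: dict) -> dict:
    """Submit a provisioning request and return the platform's response."""
    payload = {"name": name, "template": template, "parameters": params}
    resp = requests.post(
        ORCHESTRATOR_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    result = provision_workflow(
        name="new-analyst-workstation",
        template="standard-vm",
        params={"cpu": 4, "memory_gb": 16, "network": "restricted"},
    )
    print(result)
```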

Blockchain: Currently best known as the foundation of the Bitcoin cryptocurrency, this distributed ledger technology, which ensures data immutability, is still in its infancy. That is quickly changing. Blockchain is particularly significant for archival purposes because it enforces a move-forward strategy; while new data may be added to an archive, existing data cannot be changed. Because it is microprocessor-enabled, it can be applied to databases or any other archival repository. Within a few years, blockchain will become the predominant archival strategy, as the long-time standard magnetic tape moves toward obsolescence.
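The append-only property comes from chaining each entry to the hash of the one before it: new records can always be added, but altering an existing record breaks every hash that follows it. The sketch below illustrates that mechanism with an illustrative in-memory ledger; real blockchain platforms add distribution and consensus on top of the same idea.

```python
# Minimal sketch of the append-only, hash-chained property blockchain brings
# to archival data. Class and field names are illustrative only.
import hashlib
import json


def entry_hash(prev_hash: str, payload: dict) -> str:
    """Hash an entry together with the hash of the entry before it."""
    data = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()


class ArchiveLedger:
    def __init__(self):
        self.entries = []  # list of (prev_hash, payload, this_hash)

    def append(self, payload: dict) -> str:
        prev = self.entries[-1][2] if self.entries else "0" * 64
        h = entry_hash(prev, payload)
        self.entries.append((prev, payload, h))
        return h

    def verify(self) -> bool:
        """Recompute the chain; any tampered entry makes verification fail."""
        prev = "0" * 64
        for stored_prev, payload, stored_hash in self.entries:
            if stored_prev != prev or entry_hash(prev, payload) != stored_hash:
                return False
            prev = stored_hash
        return True


if __name__ == "__main__":
    ledger = ArchiveLedger()
    ledger.append({"doc": "contract-2020-001.pdf"})
    ledger.append({"doc": "audit-log-q3.csv"})
    print("chain valid:", ledger.verify())       # True
    # Tamper with the first entry's payload while keeping its stored hash.
    prev, _, h = ledger.entries[0]
    ledger.entries[0] = (prev, {"doc": "tampered"}, h)
    print("after tampering:", ledger.verify())   # False
```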

Big data analytics: While big data analytics adoption has grown rapidly, the data science algorithms used for interpretation and analytics remain unique to each customer and mission. What is important to one will likely not be important across all agencies. What’s more, algorithms change frequently. These factors cast doubt on whether this technology can ever evolve beyond the heavy touch currently required and reach a more user-friendly, plug-and-play status. With advances in AI and ML, perhaps that will change. This is an area to watch carefully.

How should agencies be approaching these technologies now?

Anticipating potential technology needs five or even three years out may feel like long-term forecasting, but in mission terms it’s not. Many government IT teams are eager to adopt new advanced capabilities to support leading-edge missions in intelligence, defense, health care, agriculture, environmental protection and more.  

While the long acquisition process for large, very costly contracts like the Defense Department’s Joint Enterprise Defense Infrastructure cloud contract can impede progress, mid- to long-term strategic planning must keep moving ahead, factoring in what technologies will be LPTA-ready before long. That must be predicated on a clear and very specific set of technical infrastructure requirements for supporting the mission long term.

For example, agencies will need to know what the data workflow is and where data will live in the future. What analytics will an agency require? What data needs to be fully owned and controlled by the agency, and what is acceptable for public cloud storage? Treating cloud capacity, rather than the data center, as the general storage target will accelerate adoption and deployment of new scalable gateway models that won’t require the replication and redundancy that are commonplace today.

While platform specifics of a five-year plan may be outdated in six months, the long-term goal should remain constant. For instance, if the goal is automating 100% of the IT environment, it may mean using a particular orchestration and automation tool over the next year, but shifting to something more suitable to the team’s needs 18 months down the road.

A flexible and dynamic architecture is also critical to preparing for future unexpected scenarios, a prime example being this year’s rapid shift to remote work. While thinking about disruption is not always comfortable, the current situation has proved it’s no longer optional. Agency IT leaders now understand the urgency of building solid strategies centered on what’s leading-edge today so they can be prepared for the LPTA future that’s fast approaching.

About the Author



Dave McCarty is data intelligence business lead with Hitachi Vantara Federal.
