
Hybrid AI Will Go Mainstream in 2022

Analysts predict an AI boom, driven by new possibilities and record funding. While challenges remain, a hybrid approach that combines the best of both realms may finally send AI sailing into the mainstream.

Artificial intelligence (AI) is becoming the dominant trend in data ecosystems around the world, and by all counts, it will accelerate as the decade unfolds. The more the data community learns about AI and what it can do, the faster it empowers IT systems and structures. This is primarily why IDC predicts the market will top $500 billion as early as 2024, with penetration across virtually all industries driving a wealth of applications and services designed to make work more effective. In fact, CB Insights Research reported that at the close of Q3 2021, funding for AI companies had already surpassed 2020 levels by roughly 55%, setting a global record for the fourth consecutive quarter.

In 2022, we can expect AI to become better at solving the practical problems that hamper processes driven by unstructured language data, thanks to improvements in complex cognitive tasks such as natural language understanding (NLU). At the same time, there will be increased scrutiny of how and why AI does what it does, such as the ongoing efforts by the U.S. National Institute of Standards and Technology (NIST) aimed at more explainable AI. This will require greater transparency into AI's algorithmic functions without diminishing performance or raising costs.

You shall know a word by the company it keeps

Of all the challenges that AI must cope with, understanding language is one of the toughest. While most AI solutions can crunch massive volumes of raw numbers or structured data in the blink of an eye, the multitude of meanings and nuances that language takes on in context is another matter entirely. More often than not, words are contextual: they convey different meanings in different circumstances. Something easy and natural for our brains is not so easy for any piece of software.


This is why the development of software that can interpret language correctly and reliably has become a critical factor in the development of AI across the board. Achieving this level of computational prowess would open the floodgates of AI development by allowing systems to access and ingest virtually any kind of knowledge.

NLU is a vital piece of this puzzle by virtue of its ability to leverage the wealth of language-based information. Language inhabits all aspects of enterprise activity, which means that an AI approach cannot be complete without extracting as much value as possible from this type of data.

A knowledge-based, or symbolic, AI approach leverages a knowledge graph, which is an open box. Its structure is created by humans and is understood to represent the real world, where concepts are defined and related to each other by semantic relationships. Thanks to knowledge graphs and NLU algorithms, AI can read and learn from any text out of the box, and users gain a true understanding of how data is being interpreted and how conclusions are drawn from that interpretation. This is similar to how we as humans create our own specific, domain-oriented knowledge, and it enables AI projects to link their algorithmic results to explicit representations of knowledge.
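As a purely hypothetical illustration (the concepts, relations and function names below are invented for this sketch, not drawn from any real product or ontology), a knowledge graph can be modeled as a set of subject-relation-object triples that software can both query and explain:

```python
# A minimal sketch of a knowledge graph: concepts linked by named
# semantic relationships, stored as (subject, relation, object) triples.
# All concepts and relations here are illustrative placeholders.
triples = [
    ("python", "is_a", "programming_language"),
    ("programming_language", "is_a", "formal_language"),
    ("python", "used_for", "machine_learning"),
]

def related(concept, relation):
    """Return all objects linked to `concept` by `relation`."""
    return [o for s, r, o in triples if s == concept and r == relation]

def ancestors(concept):
    """Walk 'is_a' links transitively. Because the graph is an open box,
    every inference step can be inspected and explained."""
    out = []
    for parent in related(concept, "is_a"):
        out.append(parent)
        out.extend(ancestors(parent))
    return out

print(ancestors("python"))  # ['programming_language', 'formal_language']
```

The point of the sketch is the transparency: each answer traces back to explicit, human-readable triples rather than to opaque model weights.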

In 2022, we should see a definitive shift toward this kind of approach: hybrid AI, which combines different techniques to improve overall results and better tackle complex cognitive problems. Hybrid AI is an increasingly popular approach for NLU and natural language processing (NLP). Bringing together the best of knowledge-based (symbolic) AI and learning models (machine learning, ML) is the most effective way to unlock the value of unstructured language data with the accuracy, speed and scale required by today’s businesses.

Not only will the use of knowledge, symbolic reasoning and semantic understanding produce more accurate results and a more efficient, effective AI environment, it will also reduce the need for cumbersome and resource-intensive training based on huge volumes of documents processed on expensive, high-speed data infrastructure. Domain-specific knowledge can be added by subject matter experts and/or by machine learning algorithms that analyze small, pinpointed training sets to produce highly accurate, actionable results quickly and efficiently.
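To make the hybrid idea concrete, here is a deliberately toy sketch (the lexicon, labels and training samples are all made up, and no vendor's actual method is implied): explainable symbolic rules fire first, and a small statistical model trained on a pinpointed data set handles whatever the rules don't cover.

```python
# Illustrative hybrid text classifier: a hand-curated symbolic lexicon
# handles known domain terms; a tiny statistical layer (word counts from
# a small labeled sample) scores everything else. All data is invented.
from collections import Counter

# Symbolic layer: explicit, human-curated knowledge (an "open box").
LEXICON = {"refund": "billing", "invoice": "billing", "login": "support"}

# Learning layer: built from a small, pinpointed set of examples.
SAMPLES = [("cannot reset my password", "support"),
           ("charge appeared twice on my card", "billing")]
counts = {"support": Counter(), "billing": Counter()}
for text, label in SAMPLES:
    counts[label].update(text.lower().split())

def classify(text):
    words = text.lower().split()
    # 1) Symbolic rules fire first and are fully explainable.
    for w in words:
        if w in LEXICON:
            return LEXICON[w], "rule"
    # 2) Otherwise fall back to the learned word statistics.
    scores = {label: sum(c[w] for w in words) for label, c in counts.items()}
    return max(scores, key=scores.get), "model"

print(classify("please process my refund"))    # ('billing', 'rule')
print(classify("password reset not working"))  # ('support', 'model')
```

Each technique can also be evaluated on its own, which is part of the appeal: you can measure how often the rules fire, and how well the model does when they don't.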

The world of hybrid AI

But why is this transition happening now? Why hasn’t AI been able to harness language-based knowledge previously? We have been led to believe that learning approaches can solve any of our problems. In some cases, they can, but just because ML does well with certain needs and specific contexts doesn’t mean it is always the best method. And we see this all too often when it comes to the ability to understand and process language. Only in the past few years have we seen significant advancements in NLU based on hybrid (or composite) AI approaches.

Rather than throwing one form of AI, with its limited set of tools, at a problem, we can now utilize multiple different approaches. Each can target the problem from a different angle, using different models, to evaluate and solve the issue in a multi-contextual way. And since each of these techniques can be evaluated independently of the others, it becomes easier to determine which ones deliver the best outcomes.

With the enterprise already having gotten a taste of what AI can do, this hybrid approach is poised to become a strategic initiative in 2022. It produces significant time and cost benefits, while boosting the speed, accuracy and efficiency of analytical and operational processes. To take just one example, the process of annotation is currently performed by select experts, in large part due to the difficulty and expense of training. By combining the proper knowledge repositories and graphs, however, the training can be vastly simplified so that the process itself can be democratized among the knowledge workforce.

More to Come

Of course, research in all forms of AI is ongoing. But we will see particular focus on expanding the knowledge graph and automating ML and other techniques because enterprises are under constant pressure to leverage vast amounts of data quickly and at low cost.

As the year unfolds, we will see steady improvements in the way organizations apply these hybrid models to some of their most core processes. Business automation in the form of email management and search is already in sight. The current keyword-based approach to search, for instance, is inherently incapable of absorbing and interpreting entire documents, which is why it can only extract basic, largely non-contextual information. Likewise, automated email management systems can rarely penetrate meaning beyond simple product names and other points of information. In the end, users are left to sort through a long list of hits trying to find the salient pieces of knowledge. This slows down processes, delays decision-making and ultimately hampers productivity and revenue.

Empowering NLU tools with symbolic comprehension under a hybrid framework will give all knowledge-based organizations the ability to mimic the human ability to comprehend entire documents across their intelligent, automated processes.

Published on March 2, 2022 in Artificial Intelligence


Born in the Cloud: The Next Generation of Cloud Services: New Approaches


"Born in the cloud" is a new category of cloud services poised to make an impact on enterprises.




Years ago, we were talking about cloud native design as the lodestar for modern workload systems.

Now, we see the cloud as one more stepping stone toward even newer technologies that make data even more versatile and transferable.

Let's look at four of these technologies: how they work and how they intersect in the next generation of services, moving beyond the cloud age.

Distributed Cloud Solutions

With the evolution of peer-to-peer systems, the emergence of the Internet of Things and the decentralization of the blockchain, cloud systems may be moving to a new place in a type of setup called the "distributed cloud." Here, a distributed peer-to-peer hardware framework runs services at the network edge instead of in a centralized environment, reducing latency and congestion on the network.

Like distributed computing, distributed cloud makes use of those individual hardware nodes that are 'out in the field.' Like the blockchain, it decentralizes certain types of control and management of system operations.


When we talked about no-hardware designs with "born in the cloud" systems nearly a decade ago, we were mainly talking about moving physical infrastructure from on-premises systems to off-site locations in a vendor's network.

People talked a lot about the obvious savings for businesses that no longer have to maintain their own server rooms.

What's happened since then, though, is that virtualization has brought the next step – completely untethering hardware pieces from a physical footprint and co-locating them in larger data centers.

In other words, virtual machines don't ‘sit’ anywhere. They don't have physical connections. You don't have to get inside their guts to deal with CPU and storage capacity and other allocations.

Virtualization and the practice of using containers became one of the next big trends after companies started moving all sorts of data and operations to the cloud. It remains one of the big trends modernizing business systems. (Read also: 10 Ways Virtualization Can Improve Security)

NoSQL Data Storage

Here's another interesting trend that's been happening over the same time period: the way we approach data storage.

First, the cloud became joined by the acronym SaaS (software as a service). There was a further innovation toward making all kinds of data operations remote and sourcing them off-site from vendor offices. (Read also: Redefining IT Decision-Making in the Age of SaaS.)

At the same time, people were figuring out better ways of retrieving data from its archived location.

When people talk about modern business data centers and data warehouses, they're not talking about traditional relational database design. At least that's the trend: away from old relational database table technology and toward a variety of approaches called NoSQL.

In NoSQL systems, one of the big fundamental changes is that data is not identified by its particular location in a table. Instead, it's defined by its attributes, using key-value pairs, schemas or other types of innovations.

In other words, the data identifiers allow it to roam free in a less structured database environment, which leads to more capable queries and retrieval practices.
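As a rough illustration (the collection and field names below are invented for this sketch, not taken from any particular database), a document-style NoSQL record can be located purely by its attributes, with no fixed schema or table position:

```python
# Sketch: in a document-style NoSQL store, records are identified by
# their attributes (key-value pairs), not by row/column position in a
# fixed table. Documents in one collection need not share a schema.
orders = [
    {"id": "a1", "customer": "Acme", "total": 120.0, "rush": True},
    {"id": "a2", "customer": "Binco", "total": 75.5},  # no 'rush' key
    {"id": "a3", "customer": "Acme", "total": 300.0, "notes": "split shipment"},
]

def find(collection, **attrs):
    """Return documents whose attributes match every key-value filter."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in attrs.items())]

print([d["id"] for d in find(orders, customer="Acme")])  # ['a1', 'a3']
```

Because matching is attribute-based, a query like `find(orders, rush=True)` simply skips documents that lack the field, rather than failing against a rigid table definition.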

Web 3

As we talk about this third trend, let's also talk about cryptocurrency, which became much more of an integrated presence in our lives throughout the past four or five years.

The first cryptocurrency to make a splash was Bitcoin, and people tried to figure out how to get their heads around the concept of digital currency and blockchain technology.

Then all sorts of other cryptocurrencies started to emerge, including smart contract-handling chains like Ethereum that were able to use tokens to handle data on the blockchain.

Along with that, there was a move toward something called Web 3, or the semantic web.

The idea here is that data can move from a simple cloud approach to a more refined place where it exists within semantic structures and NoSQL environments, and perhaps moves through blockchain-oriented processes.

These new trends also mingle with one another.

For instance, BrightStar has developed a resource that is billed by its makers as an “ACID-compliant RDF triple store” that uses a data object layer and semantic web standards to approach data in a whole new way.

Part of the similarity with blockchain and semantic web systems is the use of data objects instead of basic exploration of data locations. Some people describe semantic web as a mapping of the Internet, and others talk about decentralized approaches to networking that complement the decentralization of cryptocurrencies like Bitcoin.

In addition to blockchain technology and cryptocurrency, an emerging aspect of this new web is the metaverse. Described by Mark Zuckerberg as an "embodied internet" where the user is actually part of the experience, this new cloud-born tech has been making waves and inspiring many predictions about how exactly it will impact the world. By improving the virtual reality experience, the metaverse is poised to make exciting waves in many lifestyle, gaming and ecommerce sectors, and beyond. (Read also: Gaming, Fashion, Music: The Metaverse Across Industries.)


Virtualization, distributed cloud systems, immutable blockchains and NoSQL data environments are continually being refined. They are an integral part of what's going to help our data world evolve beyond what was born in the cloud several years ago, as they continue to change in an effort to anticipate and meet the needs of the enterprise.


" data-original-title="Written by">Justin Stoltzfus | Reviewed by 
" data-original-title="Reviewed by">Kuntal ChakrabortyCheckmark
Published: April 8, 2022