4 predictions for NoSQL technologies in 2016
NoSQL technologies have seen growing adoption over the last 12 months – but will 2016 be the year these platforms move into the mainstream?
Allied Market Research predicts that the global NoSQL market will reach $4.2 billion by the end of 2020, growing at a compound annual growth rate (CAGR) of 35.1% from 2014 to 2020.
The reason for this growth is that companies are looking for database technologies that offer greater flexibility, scalability and customisation for their applications.
NoSQL technologies are designed for the needs of big data and to support the requirements of mobile and Internet of Things (IoT) applications.
In the age of IoT and big data, a database needs to support these new requirements. Traditional relational databases can no longer handle the volume and variety of data created, so enterprises are looking for alternative options.
The shift towards NoSQL as a core piece of enterprise IT can seem a daunting task. However, relative growth in job trends from Indeed.com shows that demand for NoSQL positions is rising.
At the same time, NoSQL providers are making it easier than ever to switch from relational to non-relational databases from both technology and skills perspectives.
As companies look at their new application requirements, NoSQL is often the first choice for building those applications.
In 2016, evidence suggests that the popularity of NoSQL technologies will continue to rise. Here are four predictions for the coming year.
1. IoT won’t be the cool kid on the block anymore
The IoT is becoming more of a commodity, rather than something distinct and standalone. In 2016, IoT data will become part of the overall data pipeline and merge into wider company strategies for serving customers. The development of new standalone IoT applications and devices will slow in favour of IoT data being built into services already in use.
Through current infrastructure, organisations are collecting massive amounts of data. New and improved databases and storage strategies will mean that data won’t have to be discarded straight away. Instead, companies can use all this data in a more efficient manner and increase the speed at which it is analysed.
To achieve this, there will be tighter integration between analytics and databases, enabling analysis at speeds that aren't achievable today.
While some companies have found ways to profit from some IoT applications, many still have no true ROI. However, with these faster, more integrated analytics platforms, businesses should be able to find better ways to profit from IoT as it runs in the background of the rest of their applications.
2. Cloud adoption is inevitable
Cloud computing will continue its adoption within enterprises. In 2016, there will be a large influx of companies moving to the cloud that would previously have been considered unsuitable for it.
More elements of financial services and banking, insurance and public sector organisations will move to using cloud as the technology catches up with their data security and governance requirements.
On the flip side of this, customers will demand better experiences from the companies that they buy from. And this increase in expectations will drive more demand for scalability and speed within IT.
This, in turn, fits the cloud model for IT delivery better than more traditional on-premises deployments, and will likely lead to an increase in deployments of distributed database systems like Apache Cassandra in the cloud. More traditional approaches will simply not take these companies where they need to go.
3. Graph databases will grow in use
Graph databases can model and store data while also querying the relationships between data points as first-class entities. Traditional SQL databases and most NoSQL databases don't handle this kind of relationship query well. This has led to growing interest in how to use and develop applications based on graph databases.
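To make the idea concrete, here is a minimal in-memory sketch in Python (not any particular graph database's API) of the kind of relationship traversal that graph databases are built to do efficiently at scale. The node names and the "knows" relationship are invented for illustration.

```python
from collections import deque

# Toy graph: nodes are people, edges are "knows" relationships,
# stored as a plain adjacency list purely for illustration.
knows = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["dave", "erin"],
    "dave": [],
    "erin": [],
}

def within_hops(start, max_hops):
    """Return every node reachable from `start` in at most `max_hops` edges."""
    seen = {start}
    frontier = deque([(start, 0)])
    reachable = set()
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for neighbour in knows.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                reachable.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return reachable

print(sorted(within_hops("alice", 2)))  # ['bob', 'carol', 'dave', 'erin']
```

Expressing the same "friends of friends" question in SQL typically means one self-join per hop, which is exactly the pattern that relational and most non-graph NoSQL stores struggle with as the depth grows.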
While those within the graph industry agree that graph will be big in 2016, graph will only reach this potential as it becomes part of the larger ecosystem of databases. Businesses will increasingly look to graph as a piece of a larger database solution that better addresses their needs.
4. The term NoSQL will become too generic
As more companies turn to NoSQL for their database solutions, the term NoSQL will not be accurate enough to describe the more specialised database types that are being used.
For example, search services using tools like Solr or Elasticsearch should be considered differently from transactional systems like Apache Cassandra. The same could be said for graph databases too.
The term ‘NoSQL’ tends to be applied liberally and starts the conversation, but the answer to a company’s data problem is normally more specific than this.
As companies become more mature in their use of new database technologies, there will be more use of best-of-breed tools. At the same time, these implementations will eventually have to move into production.
The multi-model approach to databases may work from a technical and development perspective, but CIOs also want to know how these platforms will be supported when things go wrong. Bringing more types of database – operational, analytic, graph, etc. – under one roof from a support perspective would therefore make a lot of sense.
NoSQL provides cloud-ready and highly scalable solutions to enterprises. With the driving force of big data, more companies will make the move to implementing non-relational database platforms within their critical applications.
Sourced from Patrick McFadin, chief evangelist for Apache Cassandra, DataStax