
Managing Big Data Projects Beyond Hadoop

By Redapt Marketing | Posted on October 9, 2019 | Posted in Data Management and Analytics

For more than 10 years, Hadoop has been the go-to solution for building out a scalable data processing framework to support extremely large data workloads.

While Hadoop has been shining bright, other companies have been making a lot of progress in the arena. Organizations like Cloudera and MapR (the latter recently acquired by HPE) are making major inroads with their own services, and the industry is clearly trending toward a broader mix of platforms.

Yet, that doesn’t mean Hadoop is going away any time soon. It just means there are more options than ever when it comes to Big Data projects. So let’s take a look at the current state of Hadoop, as well as a high-level look at the Big Data playing field as it now stands.

Rumors of Hadoop’s demise have been greatly exaggerated

For a solution that’s more than a decade old, Hadoop still has a very large footprint. For many organizations, including SaaS providers, hyperscale customers, and ad tech companies, the platform is still a vital part of their data model. And Hadoop’s flexibility and scale, in particular, remain unmatched, as does its focus on the developer.

But the competition is growing stronger. MapR, for one, has received a major shot in the arm through its acquisition by HPE, and we expect to see some compelling platforms emerge that integrate MapR with the rest of the HPE portfolio and ecosystem. Plus, many of the workloads that once required Hadoop can now run on a number of SQL and NoSQL platforms, which is also fueling worries about the legacy platform’s future.

Still, Hadoop remains alive and kicking. And while some teams may move to Cloudera, MapR, or open-source alternatives, that fact doesn’t appear to be changing any time soon.

Despite the platform’s Java-based MapReduce paradigm, most BI/BA teams continue to use SQL and API services alongside Hadoop. Meanwhile, GCP, Azure, and AWS all offer Hadoop as a service, letting you run ephemeral clusters and use only the tools you want.
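
To make that concrete, here is a minimal PySpark sketch of the kind of SQL-first workflow a BI/BA team might run on such a cluster (ephemeral or otherwise) instead of hand-writing Java MapReduce. The paths, table, and column names are hypothetical placeholders, not part of any specific deployment.

```python
# Minimal sketch: run a SQL query against data stored in HDFS on a
# Hadoop/Spark cluster (e.g., a managed or ephemeral one), rather than
# writing Java MapReduce by hand. Paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bi-style-sql-example").getOrCreate()

# Load raw events from HDFS (could just as easily be s3://, gs://, or abfs://)
events = spark.read.parquet("hdfs:///data/events/2019/")
events.createOrReplaceTempView("events")

# Analysts work in SQL; the engine handles the distributed execution underneath
daily_counts = spark.sql("""
    SELECT event_date, event_type, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date, event_type
    ORDER BY event_date
""")

daily_counts.show(20)
spark.stop()
```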

When tools party together, good things happen

At Redapt, we’re agnostic when it comes to which platform is the preferred one. Every company has different needs, and our job is to take a holistic look at each client and help find the right solution for them.

For most of our clients, it’s a mixture of tools that end up working best. There are a lot more options for how to store petabytes of data than there were 10 years ago. After all, while Hadoop helped break the storage box barrier, it’s no longer the only game in town.

Beyond the aforementioned SQL and NoSQL solutions, companies can now pair a large object store or software-defined storage (SDS) filer with a tool like Kubernetes to run whatever processing framework they prefer against very large data sets. And again, Hadoop is still in wide use, with unparalleled flexibility and scale and an ongoing focus on being developer-friendly.
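
As a rough sketch of that pattern, here is a small Spark job that reads directly from an S3-compatible object store rather than HDFS. The same driver code runs whether the executors are scheduled on YARN, on Kubernetes (for example via spark-submit with a k8s:// master URL), or on a managed service; the bucket, endpoint, and field names below are hypothetical, and the s3a connector jars are assumed to be on the cluster.

```python
# Minimal sketch: a Spark job reading from an S3-compatible object store
# instead of HDFS. Where the executors run (YARN, Kubernetes, managed
# service) is a submission detail; the driver code stays the same.
# Bucket name, endpoint, and field names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("object-store-analytics")
    # Point the Hadoop s3a connector at an S3-compatible endpoint
    .config("spark.hadoop.fs.s3a.endpoint", "https://objectstore.example.com")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

# Read raw data straight from the object store
logs = spark.read.json("s3a://analytics-bucket/raw-logs/2019/")

# Standard DataFrame work from here on
errors_by_service = (
    logs.filter(logs.level == "ERROR")
        .groupBy("service")
        .count()
        .orderBy("count", ascending=False)
)

errors_by_service.show()
spark.stop()
```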

Yes, Hadoop’s decade as the only game in town has ended. But that’s not a bad thing, not for Hadoop and certainly not for developers.

More options on the market mean more ways for companies to tackle challenges. And for Redapt, where we pride ourselves on having the knowledge and expertise to solve problems regardless of the tools, the fun’s just beginning.

Want to learn more? Reach out to our team and schedule a call today. Or find out where your organization sits in the technical maturity model by downloading our free eBook, The Redapt Technical Maturity Framework.
