MySQL 8 for Big Data
Effective data processing with MySQL 8, Hadoop, NoSQL APIs, and other Big Data tools

Product type: Paperback
Published in: Oct 2017
Publisher: Packt
ISBN-13: 9781788397186
Length: 296 pages
Edition: 1st Edition
Authors (4): Challawala, Jaydip Lakhatariya, Mehta, Patel
Table of Contents (17)

Title Page
Credits
About the Authors
About the Reviewers
www.PacktPub.com
Customer Feedback
Preface
1. Introduction to Big Data and MySQL 8
2. Data Query Techniques in MySQL 8
3. Indexing your data for High-Performing Queries
4. Using Memcached with MySQL 8
5. Partitioning High Volume Data
6. Replication for building highly available solutions
7. MySQL 8 Best Practices
8. NoSQL API for Integrating with Big Data Solutions
9. Case study: Part I - Apache Sqoop for exchanging data between MySQL and Hadoop
10. Case study: Part II - Real time event processing using MySQL applier

Case study overview


MySQL is a proven solution for storing transactional data while maintaining ACID properties during write operations. Starting with MySQL 5.6, it also includes the NoSQL Memcached API for InnoDB, which improves performance for high-volume data ingestion. Hadoop is used to store huge amounts of data (on the order of petabytes) and process it in many scenarios, such as working with archived or historical data. Analytical processing of this data was traditionally handled offline and was not an integrated part of the data pipeline. However, the technology has evolved, and Hadoop is now an active part of data flows in many use cases that require real-time data processing and provisioning of data to the user.
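
To give a feel for the Memcached API mentioned above, here is a minimal sketch of what it looks like from a client's point of view. It assumes the daemon_memcached plugin has been enabled and is listening on the default memcached port (11211), with a table mapped through innodb_memcache.containers; the pymemcache library, the key, and the value are illustrative choices for this sketch, not values from the book.

# Minimal sketch: reading and writing InnoDB data over the memcached protocol.
# Assumes the daemon_memcached plugin is enabled and a table is mapped
# in innodb_memcache.containers (placeholder key/value used here).
from pymemcache.client.base import Client

# The plugin listens on the standard memcached port by default.
client = Client(("127.0.0.1", 11211))

# set/get bypass the SQL parser and write straight to the mapped InnoDB
# table, which is why high-volume ingestion is faster than equivalent
# INSERT statements.
client.set("user:1001", "hello-from-innodb")
print(client.get("user:1001"))  # b'hello-from-innodb'

Because the get and set calls map directly to rows of the InnoDB table configured in innodb_memcache.containers, the same data remains available to ordinary SQL queries as well.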

We can use MySQL to store the transactional data and Hadoop to store huge amounts of data and process it easily using the MapReduce algorithm. We can take advantage of both technologies to unlock Big Data analysis. There are various use cases where we need to...
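
Since this case study is about exchanging data between MySQL and Hadoop with Apache Sqoop, the sketch below wraps a typical sqoop import invocation in a small Python script. The connection URL, credentials, table name, target directory, and mapper count are placeholder values chosen for illustration, not values from the case study.

# Minimal sketch: pulling a MySQL table into HDFS with Apache Sqoop.
# Host, credentials, table, and target directory are placeholders.
import subprocess

sqoop_import = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://mysql-host:3306/salesdb",
    "--username", "sqoop_user",
    "--password-file", "/user/sqoop_user/mysql.password",  # keeps the password off the command line
    "--table", "orders",
    "--target-dir", "/data/salesdb/orders",
    "--num-mappers", "4",  # parallel map tasks reading from MySQL
]

# Run the import and fail loudly if Sqoop returns a non-zero exit code.
subprocess.run(sqoop_import, check=True)

Sqoop splits the source table on its primary key and runs one bounded SELECT per mapper, so --num-mappers controls how much parallel read load is placed on the MySQL server during the import.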
