What is Big Data and Apache Hadoop?
What is Big Data?
Whenever the term Big Data comes to mind, people usually think of "data in huge amounts" or "data in GBs, TBs, PBs or more". That is only one aspect of Big Data. Even a smaller amount of data can be considered Big Data if it cannot be processed using traditional methods.
Let's take some real-life examples of that kind of data. Have you ever tried to attach a 100 MB file as a single attachment in Gmail? You probably haven't, because you know that much data cannot be sent through Gmail. For Gmail, that data is Big Data.
For another example, imagine you have 10 GB of images stored on your hard drive, each one, let's say, 25 MB in size, and you have to apply some image enhancement to all of them. Your computer can process that data, but how long will it take? Very long, of course. That is Big Data for your computer. Real-life Big Data problems go on and on.
In short, Big Data is any amount of data that cannot be processed by traditional means in the time available.
Who is generating Big Data?
Who generates data? In the old days, only a handful of a company's employees would generate data and store it locally. Today, people generate data on all kinds of websites, such as Facebook, Twitter, Gmail, and LinkedIn. All you have to do is create an account and store your personal information in it.
Now the problem arises: how do we store that much data, and how do we process it? That is where Hadoop comes into the picture.
What is Apache Hadoop?
One way of overcoming these problems was to add more processing power to traditional systems, and that is what was done. Hadoop was created in 2005 by Doug Cutting, who went on to work at Yahoo. He named it after his son's toy elephant; yes, that is the secret behind the logo.
Hadoop is a framework that allows us to divide Big Data into smaller parts and store them in a distributed file system. Once the data is distributed, the machines in the cluster can each process their own part in parallel. Hadoop is also responsible for managing this parallel processing.
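The idea of "divide, distribute, process in parallel, then combine" can be sketched in a few lines of Python. This is a hypothetical toy illustration, not Hadoop's actual API: the names are our own, and the 16-byte block size stands in for HDFS's real default of 128 MB.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy block size; HDFS defaults to 128 MB per block.
BLOCK_SIZE = 16

def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE):
    """Divide the data into fixed-size blocks, the way HDFS splits large files."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def process_block(block: bytes) -> int:
    """A stand-in task each worker runs on its own block:
    count occurrences of the letter 'a'."""
    return block.count(b"a")

data = b"hadoop stores big data as blocks and processes them in parallel"
blocks = split_into_blocks(data)

# Each block is processed independently, mimicking many nodes working at once.
with ThreadPoolExecutor() as pool:
    partial_counts = list(pool.map(process_block, blocks))

total = sum(partial_counts)  # combine the per-block results into one answer
```

The key point the sketch shows is that each block can be handled without knowing anything about the other blocks, which is what makes spreading the work across a cluster possible.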
Strictly speaking, it is HDFS's master node (the NameNode) that keeps the "table of contents": it records which block of which file is stored on which server. MapReduce, in turn, splits a job into tasks and assigns them to the servers in the cluster, so each server can work on the data it already holds. Hadoop uses HDFS together with MapReduce to store and process Big Data.
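MapReduce itself boils down to two user-written functions plus a shuffle step that the framework runs in between. The classic example is counting words. The sketch below is a single-machine imitation in Python, with hypothetical names of our own, not Hadoop's Java API:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map: emit a (word, 1) pair for every word in a line of input."""
    for word in line.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework
    does between the map and reduce phases."""
    groups = defaultdict(list)
    for word, count in pairs:
        groups[word].append(count)
    return groups

def reduce_phase(groups):
    """Reduce: combine the grouped values into one result per key."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["Big Data needs Hadoop", "Hadoop processes Big Data"]
pairs = chain.from_iterable(map_phase(line) for line in lines)
word_counts = reduce_phase(shuffle(pairs))
# word_counts["hadoop"] == 2, word_counts["big"] == 2
```

In a real cluster, many map tasks and many reduce tasks run on different servers at once, but the division of labor is exactly this: map emits key-value pairs, the framework shuffles them, and reduce aggregates per key.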
Yahoo is the originator of and a major contributor to Hadoop, along with well-known companies such as Facebook, IBM, and Twitter.
What are the Scope and Demand of Big Data / Hadoop?
According to 'The Big Data Executive Survey 2013', 90% of the companies that participated were doing something with Hadoop. This suggests that Hadoop is in great demand, because the data at any organization keeps growing over time. Organizations need experts to manage all that data, and that is exactly what Hadoop experts do. Many leading companies are looking for Hadoop experts.
Leading marketers are using Big Data to deliver greater value and relevance to their customers.
Why Join I-Medita's Trainings (IBNC)?
In a statement, Shravan Goli, president of Dice, said, "Companies are betting big that harnessing data can play a major role in their competitive plans, and that is leading to high pay for critical skills." So there are huge opportunities available for Big Data / Hadoop engineers.
We, at I-Medita, focus on both theory and practice so that students lack nothing when facing interviews. We offer Big Data Hadoop training inside the college campus for students and faculty, and during summer and winter vacations we provide 10-15 day training programs in metro cities such as Pune, Bangalore, Chennai, Hyderabad, Kolkata, and Mangalore. Our main motto is to provide enough knowledge that students can kick-start their careers in Big Data and Hadoop. They will be able to understand, as well as explain, the concepts of Big Data and Hadoop to anyone. We have Hadoop experts who will teach you all about Hadoop and answer your queries anytime.
So if you are interested in making a career in this field, join our Big Data Hadoop training program. After completing this program, you will be able to work with your own Big Data and manage it using Hadoop.