In technical parlance, data refers to the characters, quantities, symbols, etc., on which a computer performs operations, and which can be transmitted or stored as electrical signals using optical, magnetic or mechanical recording devices.

  1. What is Big Data?
  2. Uses of Big Data
  3. History of Big Data

1. What is Big Data?

What is Big Data? Big Data is the term for huge datasets that have grown exponentially with time, reaching data volumes well beyond the capacity of traditional data-management tools to store or process.

Big Data can be classified into three types:

  1. Unstructured data: Raw data with unknown structure and form, which makes it challenging to derive value from it in any output format. For example: Google search results in different formats.
  2. Structured data: Data with a fixed format that can be stored, processed and accessed.
  3. Semi-structured data: Data that combines both unstructured and structured forms.
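As an illustration, the contrast between structured and semi-structured data can be sketched in Python; the sample records below are hypothetical, with CSV standing in for structured data (fixed schema) and JSON for semi-structured data (shared tags, varying fields):

```python
import csv
import io
import json

# Structured: fixed schema, every row has exactly the same columns.
structured = "id,name,age\n1,Alice,34\n2,Bob,29\n"
rows = list(csv.DictReader(io.StringIO(structured)))

# Semi-structured: records share tags, but fields may vary per record.
semi = '[{"id": 1, "name": "Alice", "tags": ["admin"]}, {"id": 2, "name": "Bob"}]'
records = json.loads(semi)

print(rows[0]["name"])             # a structured field is always present
print(records[1].get("tags", []))  # a semi-structured field needs a default
```

The practical consequence is visible in the last two lines: structured fields can be accessed directly, while semi-structured fields must be read defensively because any given record may omit them.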

Big Data has the following four typical characteristics:

  • Variety: The sources can be heterogeneous, and the nature of the data can be of different types. Initially, data was restricted to databases and spreadsheets; presently it can contain photos, emails, monitoring-device output, videos, PDFs, music, audio files, etc. Variety makes data hard to mine, store or analyse.
  • Volume: Big Data is synonymous with huge volumes of data; its sheer size is what defines it as Big Data.
  • Velocity: The speed at which the raw data is generated.
  • Variability: A measure of the inconsistencies in the data that make it hard to store or analyse.

2. Uses of Big Data

What is the use of Big Data? Big Data processing and analytics offer huge benefits. Some of them are given below.

  • Business Intelligence uses the insights, predictions and forecast patterns derived from Big Data in making business strategies and decisions.
  • Social data from sites and search engines like Twitter and Facebook helps in customer retention, product decisions, and marketing or business-strategy decisions drawn from data insights.
  • Customer service issues can quickly be resolved as the newer Big Data technologies and systems have NLP features (natural language processors) to evaluate customer satisfaction, resolve problems, provide simple product information etc.
  • Early error identification reduces the risk to the services or products offered.
  • Improved operational efficiency results are produced when the huge volumes of data are well-analyzed and used to tweak products, services, risk mitigation, security issues etc.
  • Big Data technologies are used to create large data warehouses that integrate multiple sources, technologies and processes, and to store infrequently used data.

3. History of Big Data

How did Big Data begin? In 1663, John Graunt used statistical analysis of large volumes of data to study the bubonic plague in Europe. By the 1800s, work had grown beyond statistical analysis to include the analysis of data collected at regular intervals. In 1881, Herman Hollerith of the U.S. Census Bureau created the Hollerith Tabulating Machine, based on punch cards, allowing census tasks that had needed years to be completed in months. In 1927, Austrian-German engineer Fritz Pfleumer devised magnetic tape storage; magnetic tapes and strips replaced the technology that used wire recordings.

Where did Big Data technology originate? In 1943, during World War II, the British Colossus was used to crack Nazi codes with a pattern-scanning machine running at 5,000 characters per second. In 1945, John von Neumann's EDVAC paper heralded the advent of the Electronic Discrete Variable Automatic Computer. By 1952, under President Truman, computers could automatically and independently process the data they collected, which the US National Security Agency (NSA) used to decrypt messages during the Cold War.

On Oct 29, 1969, ARPANET started with a message sent from the host computer at UCLA to Stanford's computer. By 1973 it had connected to the Norwegian Seismic Array via transatlantic satellite. As the infrastructure turned obsolete and the system grew too slow compared to newer networks like NSFNET, it was shut down in 1990. The origin of the internet is hence attributed to ARPANET.

In 1965, the US built a data centre (later closed down) that began with the storage of tax returns and fingerprint datasets. Personal computers arrived in 1977 with the introduction of microcomputers, pushing the evolution of the internet and the storage of Big Data, albeit on a time-sharing basis for the early computers in organisations.

In 1989, computer scientist Tim Berners-Lee introduced the concept of the World Wide Web (WWW), using URLs to identify data and web resources. Data on the internet was defined with hypertext links and could carry video, audio and image files. A year later, in 1990, Berners-Lee at CERN introduced the first IT rules and commands of the WWW. They are:

  • Uniform Resource Locator (URL): provides an “address” for a unique web resource; also called a Uniform Resource Identifier (URI).
  • HyperText Markup Language (HTML): the web’s formatting language.
  • Hypertext Transfer Protocol (HTTP): the mode of retrieving resources across, and linked to, the web.
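These three building blocks can be illustrated with a short Python sketch: splitting a hypothetical URL into its parts with the standard library, then assembling the plain-text HTTP/1.1 request a client would send to retrieve the resource (which would typically be an HTML document):

```python
from urllib.parse import urlparse

# A hypothetical URL naming a unique web resource.
url = "http://example.com/docs/index.html?lang=en"
parts = urlparse(url)

print(parts.scheme)  # 'http' -> the protocol used to retrieve the resource
print(parts.netloc)  # 'example.com' -> the host serving the resource
print(parts.path)    # '/docs/index.html' -> the resource's location on the host

# The plain-text HTTP/1.1 request a client would send to fetch the resource:
request = f"GET {parts.path}?{parts.query} HTTP/1.1\r\nHost: {parts.netloc}\r\n\r\n"
print(request)
```

The URL thus does the naming, HTTP does the fetching, and HTML describes what comes back.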

With this, in 1993, CERN released the World Wide Web software as an open-source resource. The Internet of Things concept of 1999 started the evolution of IoT technology, which by 2013 had come to include the internet, multiple technologies, embedded systems, wireless communications, micro-electromechanical systems (MEMS), etc., transmitting data from user devices such as GPS signals and IoT support systems. In October 2016, a major hack crippled parts of the internet, spurring further development of Artificial Intelligence, Machine Learning and Deep Learning neural networks. The term Big Data itself, as defined by Roger Mougalas in 2005, referred to huge data volumes and huge datasets.

The open-source Nutch project evolved into the Hadoop software framework, which adopted Google's MapReduce model to flexibly process all kinds of data in large volumes. Storage also grew exponentially: magnetic storage slowly evolved through floppy disks, hard drives and large-volume storage computers; 1999's Salesforce offered Software-as-a-Service (SaaS); and finally came cloud storage, the present-day rage for its near-infinite scalability, easy access and secure services.


The above article provides an introduction to Big Data: its definition, its uses, its history, and the meaning of Big Data analytics. Big Data as a concept is growing exponentially and is present in all walks of life. It drives improvements not only in the fields where it is used but also in storage devices, data-handling technologies, AI, ML and more.

If you are interested in making a career in the Data Science domain, our 11-month in-person Postgraduate Certificate Diploma in Data Science course can help you immensely in becoming a successful Data Science professional. 



Are you ready to build your own career?