What are the Characteristics of Big Data? (10 V's of Big Data)

Mohan asked 5 years ago

 
10 V's of Big Data

Volume: Big data first and foremost has to be "big," and size in this case is measured as volume. Data used to be created mainly by employees; now it is generated by machines, networks and human interaction on systems like social media, so the volume of data to be analyzed is massive. Yet, Inderpal states that the volume of data is not as much of a problem as other V's like veracity.
Variety: It refers to the many sources and types of data, both structured and unstructured. We used to store data from sources like spreadsheets and databases. Now data comes in the form of emails, photos, videos, monitoring devices, PDFs, audio, etc. This variety of unstructured data creates problems for storing, mining and analyzing data. Jeff Veis, VP Solutions at HP Autonomy, presented how HP is helping organizations deal with big data challenges, including data variety. With big data technology we can now analyze and bring together data of different types, such as messages, social media conversations, photos, sensor data, and video or voice recordings.
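To make the variety problem concrete, here is a minimal Python sketch that normalizes records from different source types into one common shape before analysis. The source names, field names and records are hypothetical, not from any real feed:

```python
# Minimal sketch: normalizing heterogeneous records into one common shape.
# The sources, field names and records below are hypothetical examples.

def normalize(record: dict) -> dict:
    """Map records from different sources onto a shared schema."""
    if record["source"] == "email":
        return {"source": "email", "text": record["body"], "ts": record["received_at"]}
    if record["source"] == "sensor":
        return {"source": "sensor", "text": str(record["reading"]), "ts": record["timestamp"]}
    if record["source"] == "social":
        return {"source": "social", "text": record["message"], "ts": record["posted_at"]}
    raise ValueError(f"unknown source: {record['source']}")

incoming = [
    {"source": "email", "body": "Quarterly report attached", "received_at": "2024-01-05T09:00"},
    {"source": "sensor", "reading": 21.7, "timestamp": "2024-01-05T09:01"},
    {"source": "social", "message": "Great launch!", "posted_at": "2024-01-05T09:02"},
]

# After normalization, all records can be stored and analyzed together.
for row in [normalize(r) for r in incoming]:
    print(row)
```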
Velocity: Big data velocity deals with the pace at which data flows in from sources like business processes, machines, networks and human interaction with things like social media sites, mobile devices, etc. Just think of social media messages going viral in seconds. Technology now allows us to analyze the data while it is being generated (sometimes referred to as in-memory analytics), without ever putting it into databases.
Velocity is also the speed at which data is created, stored, analyzed and visualized. In the past, when batch processing was common practice, it was normal to receive an update from the database every night or even every week, since computers and servers required substantial time to process the data and update the databases.
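As a rough illustration of the streaming-versus-batch difference, the sketch below computes a running average the moment each event arrives, instead of waiting for a nightly batch job. The event stream here is simulated, not a real data source:

```python
# Minimal sketch: processing events as they arrive (streaming style)
# instead of accumulating them for a nightly batch job.
# The event stream is simulated for illustration.

def event_stream():
    """Stand-in for a live feed such as social media messages or sensor readings."""
    for value in [3, 5, 4, 8, 2, 7]:
        yield value

count, total = 0, 0.0
for value in event_stream():
    count += 1
    total += value
    # The running average is available immediately, while data is still flowing in.
    print(f"event {count}: value={value}, running average={total / count:.2f}")
```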
Veracity: It refers to the biases, noise and abnormality in data. Is the data that is being stored and mined meaningful to the problem being analyzed? Inderpal feels veracity is the biggest challenge in data analysis when compared to things like volume and velocity. In scoping out your big data strategy, you need your team and partners to help keep your data clean, and processes in place to keep 'dirty data' from accumulating in your systems.
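One simple way to keep dirty data from accumulating is to screen out obviously bad records at ingestion time. A minimal sketch, with made-up fields and validity thresholds:

```python
# Minimal sketch: screening out noisy or abnormal records at ingestion.
# The fields, thresholds and records are made up for illustration.

def is_clean(record: dict) -> bool:
    """Reject records with missing fields or out-of-range values."""
    if record.get("user_id") is None:
        return False                      # missing identifier
    temp = record.get("temperature")
    if temp is None or not (-40 <= temp <= 60):
        return False                      # sensor noise / abnormal reading
    return True

records = [
    {"user_id": 1, "temperature": 21.5},
    {"user_id": None, "temperature": 19.0},   # missing id -> dirty
    {"user_id": 2, "temperature": 999.0},     # abnormal value -> dirty
]

clean = [r for r in records if is_clean(r)]
print(f"kept {len(clean)} of {len(records)} records:", clean)
```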
Validity: Similar to veracity is the issue of validity, meaning: is the data correct and accurate for the intended use? Clearly, valid data is key to making the right decisions. Phil Francisco, VP of Product Management at IBM, spoke about IBM's big data strategy and the tools they offer to help with data veracity and validity.
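Validity checks can be written as explicit rules that data must pass before it is used. A small sketch, with hypothetical rules for a customer record:

```python
# Minimal sketch: rule-based validity checks before data is used for decisions.
# The field names and rules are hypothetical.
import re

RULES = {
    "email": lambda v: isinstance(v, str) and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v),
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 130,
}

def invalid_fields(record: dict) -> list:
    """Return the names of fields that fail their validity rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

print(invalid_fields({"email": "mohan@example.com", "age": 34}))   # [] -> valid
print(invalid_fields({"email": "not-an-email", "age": 200}))       # ['email', 'age']
```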
Volatility: It refers to how long data is valid and how long it should be stored. In this world of real-time data, you need to determine at what point data is no longer relevant to the current analysis. Big data clearly deals with issues beyond volume, variety and velocity, extending to other concerns like veracity, validity and volatility.
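In practice, volatility is often handled with a retention policy: records older than a cutoff are dropped or archived. A minimal sketch, assuming a hypothetical 30-day retention window:

```python
# Minimal sketch: enforcing a data retention window (hypothetical 30 days).
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)

def still_relevant(record: dict, now: datetime) -> bool:
    """Keep only records younger than the retention window."""
    return now - record["created_at"] <= RETENTION

now = datetime(2024, 2, 1)
records = [
    {"id": 1, "created_at": datetime(2024, 1, 25)},   # 7 days old   -> keep
    {"id": 2, "created_at": datetime(2023, 11, 1)},   # ~3 months old -> expire
]

active = [r for r in records if still_relevant(r, now)]
print("active records:", [r["id"] for r in active])
```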
Venue: Distributed, heterogeneous data from multiple platforms and from different owners' systems, with different access and formatting requirements, in private vs. public clouds.
Varifocal: Big data and data science together allow us to see both the forest and the trees.
Varmint: As big data gets bigger, so can software bugs.
Vocabulary: Schema, data models, semantics, ontologies, taxonomies, and other content- and context-based metadata that describe the data's structure, syntax, content, and provenance.
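To illustrate, such vocabulary metadata is often kept alongside a dataset in a machine-readable form. A small sketch; the dataset, fields and taxonomy below are hypothetical:

```python
# Minimal sketch: content- and context-based metadata describing a dataset.
# The dataset name, fields and taxonomy are hypothetical.

dataset_vocabulary = {
    "name": "customer_events",
    "provenance": "exported nightly from the CRM system",    # context metadata
    "schema": {                                              # structure and syntax
        "event_id":  {"type": "string",   "description": "unique event identifier"},
        "timestamp": {"type": "datetime", "description": "when the event occurred"},
        "channel":   {"type": "string",   "taxonomy": ["email", "web", "mobile"]},
    },
}

# Downstream tools can interpret the data consistently using this vocabulary.
for field, meta in dataset_vocabulary["schema"].items():
    print(f"{field}: {meta}")
```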
