India's Aadhaar project is one of the most ambitious projects relying on Big Data ever undertaken. Its goal is the collection, storage and utilization of the biometric details of a population that crossed the billion mark years ago. Needless to say, a project of such epic proportions presents tremendous challenges, but it also gives rise to an incredible opportunity, according to MapR, the company providing the technology behind the execution of this project.
Aadhaar is, in essence, a 12-digit number assigned to an individual by the UIDAI, the abbreviated form of "Unique Identification Authority of India". The project was born in 2009, with Infosys co-founder and former CEO Nandan Nilekani as its first chairman and the architect of this grand project, which demanded a great deal in terms of the technology involved.
The intention is to make it a unique identifier for all Indian citizens and to prevent the use of false identities and fraudulent activities. MapR, which is headquartered in California and is a developer and distributor of Apache Hadoop, has been applying its extensive experience in integrating web-scale enterprise storage and real-time database technology to this project.
According to John Schroeder, CEO and co-founder of MapR, the project presents multiple challenges, including analytics, storage, and making sure the data involved remains accurate and secure across the millions of authentications performed each passing day. Each individual is issued their number, and an iris scan or fingerprint is taken so that their identity can be proved, queried against the database backbone, and matched to a headshot photo of the person. Each day witnesses over a hundred million identity verifications, and all of this needs to be done in real time, in about 200 milliseconds.
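The flow described above can be sketched in a few lines. Everything here is a toy illustration under stated assumptions: the record layout, field names and in-memory dictionary are invented for the example and are not UIDAI's or MapR's actual API; a real deployment matches biometric templates probabilistically against a distributed store, not by string equality.

```python
import time

# Illustrative latency budget from the article: ~200 ms per verification.
LATENCY_BUDGET_S = 0.200

# Toy stand-in for the "database backbone": Aadhaar number -> enrolled record.
# (Hypothetical data; real templates are biometric feature vectors.)
ENROLLED = {
    "123456789012": {"fingerprint": "f1a2b3", "photo": "headshot_001.jpg"},
}

def verify(aadhaar_number: str, fingerprint_scan: str) -> bool:
    """Return True if the submitted scan matches the enrolled template."""
    start = time.monotonic()
    record = ENROLLED.get(aadhaar_number)
    matched = record is not None and record["fingerprint"] == fingerprint_scan
    elapsed = time.monotonic() - start
    # A production system must answer within the real-time budget.
    assert elapsed < LATENCY_BUDGET_S
    return matched

# Back-of-the-envelope load implied by the figures above:
# 100 million verifications spread over 86,400 seconds in a day.
avg_requests_per_second = 100_000_000 / 86_400  # roughly 1,157 requests/s
```

Note that the hundred-million-per-day figure works out to only about 1,157 requests per second on average; the engineering difficulty lies in the peaks, the 200 ms tail-latency target, and doing the biometric match at scale.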
A large share of India's population is rural, much of it yet to be connected to the digital grid. As Schroeder continues, the solution therefore had to be economical and reliable even under low-bandwidth conditions, and the technology behind it needed to be resilient enough to work in areas with low levels of connectivity.
Interested in a career as a Data Analyst?
To learn more about Machine Learning Using Python and Spark – click here.
To learn more about Data Analyst with Advanced Excel course – click here.
To learn more about Data Analyst with SAS Course – click here.
To learn more about Data Analyst with R Course – click here.
To learn more about Big Data Course – click here.