Google processes a trillion searches per year. Over 300 hours of video are uploaded to YouTube every minute. 90% of the world's data has been created in the last two years, and it's estimated that by 2020, every person on the planet will be creating about 1.7 MB of new information every second.
That's a lot of data. This volume of information, a product of our shift to digital lives, is unprecedented in history.
For the optimists out there, this information presents extremely exciting possibilities. With the proliferation of Big Data tools, we can now use incredibly large data sets to solve the world’s most difficult problems:
Byron Reese, in his book “Infinite Progress,” talks about Big Data contributing to the end of ignorance, disease, poverty, hunger, and war.
Viktor Mayer-Schonberger and Kenneth Cukier similarly discuss companies and governments that are already solving incredibly difficult problems and saving lives using big data in their best-selling book “Big Data – A Revolution That Will Transform How We Live, Work and Think”.
Indeed, companies like Apple have already changed our lives with features like Autocorrect and Siri (I hold these products particularly close to my heart since I cannot type on a phone).
The possibilities are very exciting for higher education as well. Personalized learning experiences tailored to every student’s unique needs, virtual student assistants (chatbots), an improved feedback loop with the university: the future of education presents incredible opportunities for learning.
But on the flip side of all this progress lies a clear and present danger: what are the implications of all this data for individual privacy? Are most individuals aware of who has access to their data and how it could be used? What potential is there for data to be used in nefarious ways? Certainly, in the wake of data breaches and the Facebook scandal, these are important questions to ask.
No one knows what new regulation (if any) could be coming in the US over data management, but given the gaps and ambiguity in existing US governance, the potential to abuse consumer privacy has clearly existed. Companies not pushing the boundaries of data collection probably feel like baseball players in the '90s who weren't on steroids. But I believe it's time for companies (and in particular, universities) to double down on consumer data protection before there is a true day of reckoning.
To do this, we all need to buy into the idea that individuals, at the very least, control the destiny of their data (if not own it explicitly, as in the GDPR model). Whether it be through refusing consent, declining to adopt a technology, using a privacy tool, or pushing back on third parties collecting data, control of a person's data rightfully belongs to that person. For an individual to willingly exchange their data, there needs to be an incentive and a high level of trust.
Trust is the single most important aspect of any relationship or transaction. That's why people are so upset over Facebook's scandal. Not only was users' data compromised; there's also a feeling that Facebook misled users about how their data was being used and didn't care enough to enforce its own privacy policies. Whether or not that's true, Facebook has lost some trust from a portion of its users.
If the question is how to build and maintain your students' trust, the answer is transparency. To me, transparency means specifically spelling out how and when data will be used (and how it won't be used), providing alternatives to exchanging data, and enforcing whatever policies you publish. If you give individuals the opportunity to participate, or not participate, in a transparent data exchange, then no matter the outcome, there will be no surprises. It's one of the core tenets of GDPR, and it should be the focus of any future legislation regarding US data protection.
There will always be a debate about how and when data should be used, but the reality is, we now live in a data-driven world. Universities will depend on data more and more as they discover its value. But at the end of the day, students control the destiny of their data, so I think we all need to remember the old maxim: "Don't bite the hand that feeds you." It's time to double down on student data protection.
Gary Garofalo / About the Author
Gary is the co-founder and Chief Revenue Officer at Degree Analytics, where he focuses on developing new business relationships and partner success. He has spent his entire career using data and analytics to improve business operations and strategy. Gary believes the concept of the "Smart Campus" will be pivotal for universities adapting to the future of education, and he is passionate about delivering products that enhance the student experience.