SOME WAYS YOUR GOOGLE SEARCH AND SOCIAL MEDIA AFFECT YOUR OPPORTUNITIES IN LIFE

Whether or not you realise it, big data can affect you and how you live your life.

The data we create when we use social media, browse the internet and wear fitness trackers are all collected, categorised and used by businesses and the state to build profiles of us. These profiles are then used to target advertisements at the people most likely to buy the products and services on offer, or to inform government decisions.

Big data enables states and companies to access, combine and analyse our information and build revealing, but incomplete and potentially inaccurate, profiles of our lives. They do so by identifying correlations and patterns in data about us, and about people with similar profiles to us, to make predictions about what we might do.

Just because big data is based on algorithms and statistics does not mean that it is accurate, neutral or inherently objective. And while big data may provide insights into group behaviour, these are not necessarily a reliable way to predict individual behaviour.
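
To make this concrete, here is a minimal Python sketch of the kind of "people like you" prediction described above. Everything in it, from the field names to the data, is invented for illustration; real profiling systems are far larger, but the basic logic is similar.

```python
# A minimal sketch, with made-up data, of predicting behaviour from "similar" profiles.
from statistics import mean

# Hypothetical profiles built from browsing, social media and fitness-tracker data.
profiles = [
    {"age": 29, "daily_steps": 4000, "bought_gym_membership": 1},
    {"age": 31, "daily_steps": 3500, "bought_gym_membership": 1},
    {"age": 55, "daily_steps": 9000, "bought_gym_membership": 0},
    {"age": 52, "daily_steps": 8500, "bought_gym_membership": 0},
]

def predict(new_person, k=2):
    """Guess a behaviour by averaging the k most similar known profiles."""
    def distance(p):
        return abs(p["age"] - new_person["age"]) + abs(p["daily_steps"] - new_person["daily_steps"]) / 100
    neighbours = sorted(profiles, key=distance)[:k]
    return mean(p["bought_gym_membership"] for p in neighbours)

# The result only reflects patterns in a small, possibly unrepresentative group;
# it says nothing certain about this individual.
print(predict({"age": 30, "daily_steps": 3800}))  # 1 -> every similar profile bought a membership
```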

Some of these methods can, in fact, open the door to discrimination and threaten people's human rights; they could even be working against you.

Here are four examples of how big data analytics can lead to injustice:

1. Calculating credit scores:

Big data can be used to make decisions about credit eligibility, affecting whether you are granted a mortgage or how high your car insurance premiums should be. These decisions may be informed by your social media posts and data from other apps, which are taken to indicate your level of risk or reliability.

However, data such as your education background or where you live may not be relevant or reliable for such assessments. This kind of data can act as a proxy for race or socioeconomic status, and using it to make decisions about credit risk could result in discrimination.
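
To see how a proxy can creep into a score, here is a deliberately simplified Python sketch. The weights, postcodes and the very idea of a "high-risk postcode" list are invented for illustration, not taken from any real lender.

```python
# A minimal sketch, with invented weights and data, of a proxy sneaking into a credit score.

HIGH_RISK_POSTCODES = {"E13", "B8"}  # hypothetical "risky" areas, not real lender data

def credit_score(applicant):
    score = 500
    score += 2 * applicant["years_in_education"]
    # Postcode is not a measure of creditworthiness, but it often correlates with
    # race and socioeconomic status, so penalising it can amount to discrimination.
    if applicant["postcode"] in HIGH_RISK_POSTCODES:
        score -= 150
    return score

applicant_a = {"years_in_education": 16, "postcode": "SW1"}
applicant_b = {"years_in_education": 16, "postcode": "E13"}

# Identical education, very different scores, purely because of where they live.
print(credit_score(applicant_a), credit_score(applicant_b))  # 532 382
```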

 
2. Job Searches:

Some job advertisements can be targeted at particular age groups, such as 25 to 36-year-olds, and big data can also be used to determine who sees a certain job advertisement, or even who gets shortlisted for an interview.

Automation is also used to make the filtering, sorting and ranking of candidates more accurate and more efficient. This screening process may exclude people on the basis of indicators such as the distance of their commute.

Employers might suppose that those with a longer commute are less likely to remain in a job long-term, but this can actually discriminate against people living further from the city centre due to the location of affordable housing.
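
Here is a simplified Python sketch of how such an automated shortlist might work. The candidates, fields and cut-off are invented; the point is how a commute rule silently removes people before a human ever sees them.

```python
# A minimal sketch, with invented data, of automated shortlisting that filters by commute distance.

candidates = [
    {"name": "A", "skills_match": 0.92, "commute_km": 38},
    {"name": "B", "skills_match": 0.88, "commute_km": 6},
    {"name": "C", "skills_match": 0.95, "commute_km": 41},
]

MAX_COMMUTE_KM = 30  # an arbitrary cut-off an employer might choose

shortlist = sorted(
    (c for c in candidates if c["commute_km"] <= MAX_COMMUTE_KM),
    key=lambda c: c["skills_match"],
    reverse=True,
)

# The strongest candidate ("C") never reaches a human reviewer.
print([c["name"] for c in shortlist])  # ['B']
```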

3. Parole and bail decisions:

In the US and UK, big data risk assessment models are used to help officials decide whether people are granted parole or bail, or are referred to rehabilitation programmes. They can also be used to assess how much of a risk an offender poses to society, which is one factor a judge might consider when deciding the length of a sentence.

It is unclear what data is used to help make these assessments, but as the move toward digital policing gathers pace, it is increasingly likely that these programmes will incorporate open source information such as social media activity (if they don't already).
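
Risk assessment tools vary and their details are rarely published, but the basic idea is a weighted score over a handful of factors. The following Python sketch uses invented factors and weights purely to illustrate the mechanism, not any real tool.

```python
# A minimal sketch, with invented factors and weights, of an actuarial-style risk score.

WEIGHTS = {
    "prior_offences": 0.5,
    "age_under_25": 0.3,
    "unstable_housing": 0.2,
}

def risk_score(person):
    """Weighted sum of factors, expressed as a number between 0 and 1."""
    raw = (
        WEIGHTS["prior_offences"] * min(person["prior_offences"], 5) / 5
        + WEIGHTS["age_under_25"] * (1 if person["age"] < 25 else 0)
        + WEIGHTS["unstable_housing"] * (1 if person["unstable_housing"] else 0)
    )
    return round(raw, 2)

person = {"prior_offences": 1, "age": 23, "unstable_housing": False}
print(risk_score(person))  # 0.4 - a single number that may sway a parole or sentencing decision
```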

4. Vetting visa applications:

Last year, the United States' Immigration and Customs Enforcement agency (ICE) announced that it wanted to introduce an automated "extreme visa vetting" programme.

This programme would automatically and continuously scan social media accounts to assess whether applicants would make a "positive contribution" to the United States and whether any national security issues might arise.

As well as presenting risks to freedom of thought, opinion, expression and association, there were significant risks that this programme would discriminate against people of certain nationalities or religions. Commentators characterised it as a "Muslim ban by algorithm".
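
The following Python sketch is not ICE's system; it is an invented example of the kind of crude keyword matching critics worried about, showing how easily such a scan flags people for their religion or associations rather than for any genuine security risk.

```python
# A minimal, purely illustrative sketch of naive keyword screening of social media posts.

FLAG_TERMS = {"mosque", "protest", "visa overstay"}  # invented watchlist terms

def flag_applicant(posts):
    """Return the posts that contain any watchlisted term."""
    return [p for p in posts if any(term in p.lower() for term in FLAG_TERMS)]

posts = [
    "Volunteered at my local mosque food bank this weekend",
    "Excited to start my research fellowship in Boston",
]

# An innocuous post about religious volunteering triggers a flag.
print(flag_applicant(posts))
```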

There’s no question that big data analytics works in ways that can affect individuals’ opportunities in life. However, the lack of transparency about how big data is collected, used and shared makes it difficult for people to know what information is used, and how, when and where.

Big data analytics is simply too complicated for individuals to protect their data from inappropriate use on their own. Instead, states and companies must make, and follow, regulations to ensure that their use of big data does not lead to discrimination.

 

