CRASHCOURSE TWO. TECHNOLOGY IMPACT ON HUMAN VALUES

About the crashcourses
There are 10 online crashcourses, linked to the Technology Impact Cycle Tool (tict.io). This free online tool, powered by Fontys University, helps you design, invent, deploy or use technology that makes a positive impact on society. It does this by offering you quick scans, improvement scans and full questionnaires, organised into 10 different categories.

To use the tool, it helps to be informed about the different categories and, even more, to be inspired by them. That is the goal of this crashcourse: to inform and inspire you on IMPACT ON HUMAN VALUES, so you are better equipped to assess the impact of technology. All crashcourses take about an hour to complete.

About this crashcourse
This online crashcourse is number two: impact on human values. This course, like every course, has at least one mandatory assignment to help you understand the topic. Along the way we also offer all kinds of optional suggestions for further reading and watching, plus optional assignments for those who crave more!

The goal of this course is to educate you! To inform you! To inspire you! To entertain you! To dazzle you! To make you think! That is why we did not design a crashcourse for beginners, or a basic course with all the theory. We cherry-picked our way through the topics with the aim to interest, inform and inspire you! If you think we failed you, no problem, we are open to improvements. Just send us an e-mail at: info@technofilosofie.com.

Some time management directions 
Again: it will take you approximately one hour to complete this course. This course consists of text, articles, videos and assignments. Every section lists reading time, viewing time and mandatory assignment time, so you can plan accordingly. If it takes longer than one hour, maybe that means you're slow, maybe it means we calculated poorly. We'll let you figure that out yourself.

ONLINE CRASHCOURSE TWO. TECHNOLOGY IMPACT ON HUMAN VALUES

This online crashcourse of approximately one hour consists of the following sections/topics:

  1. Our choice in human values (8 minutes)
  2. The attention economy (23 minutes)
  3. Surveillance Capitalism (17 minutes)
  4. Technology addiction (15 minutes)
  5. Quantified Self (7 minutes)

Learning outcomes: The goal of these crashcourses is to help you judge or assess a certain (digital) technology. After this one-hour course you will understand a little better how technology influences our human values. We do this by exploring the way technology reduces or strengthens our autonomy (our ability to make choices) and the way technology contributes to our happiness. The focus will mainly be on modern, digital technology. This insight helps you determine in what way a certain technology influences your human values.

This crashcourse consists of text, videoclips, optional and mandatory assignments and suggestions for further reading, viewing and practicing.

OUR MOST IMPORTANT HUMAN VALUES

Reading time: 1 minute / viewing time: 7 minutes

There are literally millions of books, websites, articles and videos on our main human values. Usually these are values such as ‘giving’, ‘health’, ‘energy’, ‘gratitude’, ‘connection’, ‘happiness’ and so on. They all have a connection with technology in some way. After all, in crash course one we saw how technology changes people. In this crash course, however, we focus on two important values, which we explore further and which revolve around two crucial questions.

Note: these values are our choice, because we believe the connection with technology is strongest in these values.

  • Human Value One: Autonomy. We think that being human is about making decisions as autonomously as possible. The central question is then: does technology enhance the ability to make autonomous choices, or does technology make choices for you? We think technology should enable you to make better choices, not make choices for you;
  • Human Value Two: Happiness. We believe that technology in general should contribute to positive values. Technology should strive to make you healthier, give you more confidence, increase your connection with other people, in short, make you happier. Technology should not make you unhealthier, more afraid, more insecure, lonely, in short, miserable. The central question then is, does technology make you happier or not?

We discuss the above questions in this crash course and try to find some answers, based on some dominant themes (attention economy, surveillance capitalism, technology addiction, quantified self). This is not a complete picture; it is just our choice of human values and themes. As a crash course, however, this will be more than enough to make you think about the impact of technology on your values and the way you assess technology going forward.

Spoiler alert: our answers will not be conclusive. But maybe we can make you think about it a little better. What do you actually expect from your technology?

To get your mind ready for this question, let’s start with this video from Gary Turk (5 minutes).

And to make you think harder, and doubt a little, watch this parody (Look Down – 2 minutes).

Further optional suggestions:

  • Werner Herzog made this documentary about smartphones in traffic. Not a positive person, our Werner, but a very good documentary (30 minutes);
  • Lindsey Lee Johnson wrote a book called The Most Dangerous Place on Earth. The story unfolds in an American high school, in a world filled with rich kids trying to find their place. Smartphones, social media, (the lack of) true connections, image, distraction. Lindsey Lee Johnson paints a disturbing picture.
  • Nolen Gertz talks (12 minutes) about nihilism and technology. If making choices and being responsible is human, does technology dehumanize us? If you like the talk, go read the book.

Key takeaways:

  • We have chosen two human values with the strongest connection to technology: autonomy and happiness;
  • We can assess technologies by looking at them along these values.

THE ATTENTION ECONOMY

Reading time: 4 minutes / viewing time: 9 minutes / mandatory assignment: 10 minutes

In this section we would like to introduce the business model of the attention merchant. First, watch this video of Tim Wu, who coined the term in his book. If you prefer not to watch videos, you can also read the transcript (both 9 minutes).

Okay, so now you know what an attention merchant is and how important it is to control where you direct your attention. After all, you are what you give your attention to.

It is important to make a clear distinction between the attention merchant’s business model and that of other parties who also want your attention! For example, Facebook is an attention merchant. They offer a product “free”, and your attention is then sold to someone else (usually someone who wants to sell something). Netflix also wants your attention, but is not an attention merchant. After all, you pay a fixed amount per month, and for that you get films and series. The distinction between attention merchants and other organisations that want your attention is not always black and white. After all, when you pay for the cinema or a pay-TV channel, you still get commercials. And maybe Netflix also does product placement. They do smoke a lot on Netflix.

Apple, for example, is largely not an attention merchant. They sell expensive devices. Google’s Android very much is. That is good to know if you are buying a phone or want to invest. It is also why Apple is committed to privacy. They like to advertise that, too.
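
To make that difference concrete, here is a toy calculation in Python. All numbers (ad load, CPM, subscription fee) are invented for illustration; the point is structural: only one of the two models earns more money for every extra minute of your attention.

# Toy comparison (all numbers invented) of the two business models.
# An attention merchant's revenue grows with every extra minute of your
# attention; a subscription service's revenue does not.

def attention_merchant_revenue(minutes_watched, ads_per_minute=1, cpm=5.0):
    """Revenue from selling ad impressions (CPM = price per 1000 impressions)."""
    impressions = minutes_watched * ads_per_minute
    return impressions * cpm / 1000

def subscription_revenue(minutes_watched, monthly_fee=10.0):
    """A flat monthly fee: extra minutes watched earn nothing extra."""
    return monthly_fee

for minutes in (100, 1000, 10000):
    print(f"{minutes:>6} min: merchant {attention_merchant_revenue(minutes):7.2f}, "
          f"subscription {subscription_revenue(minutes):5.2f}")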

Technology makes it a lot easier to become a very good attention merchant. After all, by collecting data about you, it becomes easier to grab your attention and sell it to someone else. The main issue is that every time your attention is grabbed, it does something to your human values. It affects your ability to make conscious choices, but also your mental health and happiness.

Let’s look at some examples.

The Race to the Bottom
Tim Wu talked about the first attention merchant: Benjamin Day. He stuffed his newspapers full of stories about aggressive, sex-hungry creatures because he knew they drew attention. We saw the same on (shock) radio and on TV. There is often a race to the bottom, although it seems to stop when people get saturated. Jerry Springer, a talk show host who became popular with kung-fu hillbillies and men who were married to their horses, is really outdated by now. At the moment a lot of people are getting fed up with the quality of the internet. The internet, you may not believe it, was once really seen as a place where people could exchange opinions and learn from each other. It is now often a place full of clickbait and cats that look like Hitler (Kitler!).

The problem is that the information that is presented to you also shapes your opinion. This sometimes has very serious consequences. Imagine you are obsessed with black-on-white crime and you search for it on Google: you will not see official statistics but inflammatory sites, because of the way the algorithms work. This confirms your existing image, and that sometimes has very real, tragic consequences in the real world. It is not Google’s intention, but the result of optimizing to keep attention and sell things. Another famous example is that – when Obama was president – searching Google Maps for “Nigger King” led to the White House.

Polarisation
Attention merchants want your attention. The first trick, as we saw above, is to provide you with information that is just a little more emotionally charged. Just a little sharper, a bit more black and white, just a little more sensational. The problem is that this shapes our opinion, which becomes less nuanced as a result. In the mandatory assignment (below) we will see how YouTube does this.

A second way to get your attention is to provide you with information that confirms your opinion: the so-called filter bubbles, or echo chambers. Filter bubble is a term coined by Eli Pariser, and it means that you are profiled and always see information from your own circle. Information that confirms your views, information that strengthens polarization. On the other hand, there are plenty of studies indicating that this also played a role in the pre-internet days and that, despite the filter bubbles, people online may encounter more diverse opinions than before.
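
For readers who like code, below is a minimal sketch of the mechanism behind the filter bubble, with made-up items and clicks. A feed that ranks purely on predicted engagement keeps pushing up whatever you clicked before, and everything else sinks.

from collections import Counter

# Made-up catalogue of items, each tagged with a topic.
ITEMS = [
    {"id": 1, "topic": "politics-left"},
    {"id": 2, "topic": "politics-right"},
    {"id": 3, "topic": "cats"},
    {"id": 4, "topic": "politics-left"},
    {"id": 5, "topic": "science"},
]

def predicted_engagement(item, click_history):
    """Toy engagement model: you probably click topics you clicked before."""
    return Counter(click_history)[item["topic"]]

def rank_feed(click_history):
    """Rank purely on predicted engagement - the filter-bubble recipe."""
    return sorted(ITEMS,
                  key=lambda item: predicted_engagement(item, click_history),
                  reverse=True)

clicks = ["politics-left", "politics-left", "cats"]
print([item["topic"] for item in rank_feed(clicks)])
# ['politics-left', 'politics-left', 'cats', 'politics-right', 'science']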

What You See is All There is!
In the time before the internet, if a bus driver in (for example) the Dutch town of Lemmer was beaten by a passenger, we would never know. It may have been an item in the local newspaper, but nobody in other parts of the Netherlands reads that newspaper. Today you will receive the video on your social media channels in no time. You click, you look, and you think: the world is getting more brutal! Despite the fact that you never actually experience anything ‘brutal’ yourself. This phenomenon is called ‘what you see is all there is’, and it was described by Daniel Kahneman. Think about it: if you hear a lot about shark attacks, the chance of a shark attack does not increase, but the chance that you will swim around quite uncomfortably does! This is happening on a large scale.

There are more issues with the business model of the attention merchant. The attention merchant is amoral. The attention merchant is only interested in time spent on their platforms, not in time well spent. The attention merchant will try to make you dependent and insecure, and will play on your fears and need for control with only one goal: to grab your attention.

The essence is – ultimately – that the values of the attention merchant are central to the design of the technology. It is not about your human values. It also means that when technology companies/attention merchants (such as Facebook) talk about their problems, those are not really their problems. They are symptoms. Cambridge Analytica, privacy, incitement, extremism, filter bubbles, election interference: these are all symptoms of the real problem, the companies’ business model as attention merchants. And it means that when Facebook talks about privacy, you have to hear it the way you hear the mob talk about security. The mob can protect you from anyone. Except from the mob itself.

Mandatory Assignment (10 minutes)
Under this link is a PowerPoint that is an empty canvas of YouTube. We would like you to use a search engine (our suggestion: DuckDuckGo) to fill out the canvas. In doing so you will gain insight into the way YouTube works and its problems, and you will also get some first ideas about the possibilities of building a better YouTube. A YouTube that has your values at its core, and not the values of the technology company.

Further suggestions:

  • The book The Attention Merchants by Tim Wu;
  • Nicholas Carr wrote The Shallows, about the impact of the attention economy on our minds;
  • How the recommender systems of YouTube, Amazon and Netflix work, in this great online paper;
  • Jaron Lanier tells us how to remake the internet in this talk (15 minutes);

Key takeaways:

  • The business model of the attention merchant is to grab your attention and sell it to someone else;
  • The business model is not new, but technology has opened dangerous new possibilities;
  • There are a lot of potential negative effects on your human values (making conscious choices, being happy);
  • If you assess a technology, understanding the underlying values and business model is key.

SURVEILLANCE CAPITALISM

Reading time: 12 minutes / viewing time: 5 minutes

Above we talked about companies grabbing our attention. To be effective at this they need data. To get data they need to watch us. That is why these companies – Facebook, Google, Microsoft, Amazon – are often called surveillance capitalists. The business model of the surveillance capitalist threatens our human values, because these companies do not only want to predict our behavior but also to shape and determine it. In this way technology threatens our ability to make our own choices, to be human, in more profound ways than ever before.

That is why it is crucial that you read this 12-minute summary (PDF) of The Age of Surveillance Capitalism by Shoshana Zuboff. If you prefer not to read, you can also check the video (under further suggestions).

After that, please watch the video below by Jaron Lanier, who offers a solution (5 minutes).

Further suggestions:

  • A video of 30 minutes in which Shoshana Zuboff explains her book;
  • Tristan Harris, ex-Google, runs the Center for Humane Technology (check it out!) and wrote about technology hijacking our minds in this brilliant essay;
  • A TED Talk by Tristan Harris where he explains how attention merchants operate (17 minutes) – highly recommended;

Key takeaways:

  • Surveillance capitalism is real and aims to manipulate our behavior;
  • This is mostly still in the future, but a threat to our human values;
  • A deep understanding of this phenomenon helps to assess technology.

TECHNOLOGY ADDICTION

Reading time: 1 minute / viewing time: 14 minutes

Okay, so now we have learned that our core human values are threatened by technology, especially if we are not aware of mechanisms like the attention merchant and the surveillance capitalist. This all comes together in the most personal device the world has ever seen (our smartphone), and on that smartphone in a select number of apps. I won’t state that these apps are addictive, but I do know that they are designed for addiction. They are all designed the same way: according to the hook model. In the video below you will learn how this works (12 minutes).
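
As a rough illustration, here is the hook model’s loop (trigger, action, variable reward, investment) as a small Python sketch. The probabilities and behaviors are invented, not taken from any real app; the structure of the loop is what matters.

import random

def run_hook_cycle(user):
    """One pass through the hook model: trigger -> action -> variable
    reward -> investment. Each pass deepens the habit a little."""
    user["opens"] += 1                 # trigger: a notification pulls you in
    user["scrolls"] += 1               # action: the easiest possible behavior
    if random.random() < 0.3:          # variable reward: the unpredictable
        user["rewards"] += 1           # payoff (likes, new posts) is the hook
    user["posts"] += 1                 # investment: your own content raises
                                       # the cost of leaving the platform

user = {"opens": 0, "scrolls": 0, "rewards": 0, "posts": 0}
for _ in range(10):
    run_hook_cycle(user)
print(user)  # e.g. {'opens': 10, 'scrolls': 10, 'rewards': 3, 'posts': 10}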

Maybe you doubt whether these apps are really that addictive. Well, they are. You only have to look at the average usage statistics. Check the video below, and remember this is a video from 2016… (2 minutes)

Okay, so our apps are designed for addiction and apparently it is working. But is that a bad thing? We honestly do not know. It all depends: on usage, on the person, on the profile, etc. There are a lot of people warning about the negative effects of our technology addiction: our apps, especially our social media apps, are making us lonely and miserable. And they are unhealthy. In the further suggestions, we posted some convincing thinkers on this. On the other hand, people are happier than ever; research so far has not shown a decrease in happiness since the introduction of the smartphone. So, we honestly do not know. The smartphone and all those apps simply have not been around long enough.

The most important lesson in this crashcourse, for now, is that you know how those apps are designed, so you can determine your own position and assess new technologies.

Oh, and ask yourself this question: you can choose not to use your phone, but can you still choose not to want to use your phone?

Further optional assignment (1) – Secondhand App-usage
One of the most annoying aspects of our smartphone and app use is passive app-usage: the idea that you become an involuntary “victim” of someone else’s app-usage. It is an important concept; just look at the history of smoking, where secondhand smoke ultimately played a hugely important role in changing our attitude towards smoking. During this exercise you will consider which forms of secondhand or passive app-usage there are. Determine when you yourself were the victim of passive app-usage, and how you felt about that. Also consider what you can do about it. Make a list that you share with your fellow students.

Learning outcome: You learn about the effects of app-usage on your environment. You do not have to stop using your apps at all, but it is nice if you are not an “ass” with your smartphone. But what exactly is that? What do others experience and what can you do about it?

Further optional assignment (2) – Improve an App
Tristan Harris is the founder of Time Well Spent. He is a design ethicist who believes we should design our apps better. He believes we should have apps that put our values at the center, not the values of the technology company. Snapchat, for example, has the snapstreak. Is that fine, or emotional blackmail? In this exercise you improve an existing app. What do you find important? Which values are central to you? Try to find out what is really important to you and consider whether the app complies with it. What do you like about the app, and what do you dislike? What would you improve about the app so that your values are central? Create a mock-up (a digital or drawn representation of the new app).

Learning outcome: You get to know your apps. You see which tricks your apps pull and learn how to deal with them better. You think about how you use your apps and (perhaps) how you can use them better. How you can put your values at the core.

Further optional assignment (3) – Rules of conduct
There are many publications on rules of conduct regarding the use of apps and smartphones. The most common-sense rule is: “don’t be an ass.” But what do you think? Which rules work and which do not? What discussion can you have about this? Research possible rules of conduct for using the smartphone. Which rules do you like, and which don’t you? And why? Think of rules around education, work, meetings, the pub, concerts, dining out, at home, the bedroom, etc. Choose the three most important rules and present them with your reasoning.

Learning outcome: You learn to think about rules. The experience is also that you will surprise yourself (and each other), because maybe you often opt for very strict rules, while you would not expect that from yourself, especially because you have an intense relationship with your phone too.

Further optional suggestions:

  • Sherry Turkle has a great TED Talk about the influence of social media (apps) on young people: Connected, but alone? (20 minutes);
  • The Atlantic published an article about how smartphones have destroyed an entire generation;
  • Jaron Lanier wrote a book about deleting all your social media accounts;
  • Natasha Dow Schull wrote a book on the relation between gambling addiction and app-addiction;
  • I wrote a book about the relationship between apps and smoking; there is an English essay here;
  • Adam Alter wrote Irresistible, on the rise of addictive technologies;
  • A video of 4 minutes on how our phone is changing us;

We also really recommend this video of 2.35 minutes. We are especially fans of the compulsive weather-app-opening function.

Finally, we would like to introduce the TEDx Talk below, which gives you inspiration for a simple exercise (sharing your smartphone) and a great project called Caught in the App (11 minutes).

Key takeaways:

  • Our apps are designed for addiction;
  • This design is very, very successful;
  • This can have a negative effect on our happiness, but we are not sure;
  • Knowing how this design works helps you determine your position and assess other technologies.

QUANTIFIED SELF

Reading time: 7 minutes

So, we talked about technology trying to capture your attention, even if it means making you miserable. We also talked about technology trying to manipulate you and weaken your autonomy. And we talked about technology trying to be as addictive as possible. All of these can have negative effects on the human values that we have marked as important (being happy, being able to make your own decisions). But of course, understanding this helps you to determine your own position and assess the technology you use.

The question we want to explore in this last part is whether technology can also improve our autonomy, wellbeing and happiness. That is a broad question, but we have narrowed it down a bit by using the perspective of the Quantified Self.

Quantified Self. What is it?
It has been possible to quantify yourself for a long time. To express yourself in numbers. Think of your age, your weight, your salary, the number of children you have, your bank balance, the number of points on your air-miles card, the number of years in service, the number of years married, and so on. It is pretty easy to express yourself in numbers. Quantified Self is a term coined by Kevin Kelly and Gary Wolf, who enthusiastically built on this idea. Quantified Self is a movement that uses technology to measure things about yourself in order to gain more insight. The movement is still relatively small, but make no mistake: Quantified Self and its associated measure-everything devices are a billion-dollar industry, powered by companies like Apple and Google. Measurement is becoming easier because the technology is increasingly available and increasingly cheap.

Let’s look at some examples.

Wearables -> These are portable devices that keep track of you. For example, the number of steps you take. More and more people know and wear the Apple Watch, Jawbone or Fitbit. Right now these devices usually keep track of things like movement, heart rate (variation), sleep, temperature and skin resistance. But that is changing quickly. First, wearables will become smaller, invisible, and become part of your clothing or, for example, your wedding ring. Second, they will measure better and they will measure more. If you have a pedometer now, you know approximately how many steps you have walked. Soon you will measure that more precisely, and also your stress levels, your blood values, your blood sugar, your cholesterol, etc. Google has already experimented with lenses that measure such things in your eye fluid.

Tracker Apps -> There are many apps you can use to keep track of yourself. For example, how much you eat, how many calories you consume, how much you interact with your friends, how much time you spend in your car and so on. There are apps in which you can keep track of how many and which books you read, films you watch, etc. This is called life logging. And of course there are the well-known sports apps for running (Runkeeper) and cycling (Strava). There are also meta-apps, like the apps that keep track of how much time you spend on your mobile phone. Nowadays this is standard on most devices.

Other Applications -> You probably use a lot of applications for very different purposes than quantifying yourself, but if you know how you use an application, you also know a lot about yourself. How often are you on Netflix? How often do you listen to Spotify? What is your surfing behavior? And, interestingly, how do you use your e-mail? Or WhatsApp? There are tools that can keep track of how often you check your mailbox (are you mailbox-driven?), how much you meet, and whether you check your e-mail during meetings. Think of the studies that indicate a relationship between compulsive e-mail checking and depression. There are even experiments to see whether the signs of depression can be recognized from the way people use the keyboard on their smartphone.

Tools -> There are plenty of good tools available that collect information about you. Think of a smart scale (nowadays they prefer to be addressed as a Body Analyzer) or a sleep tracker. But also consider the increasingly cheap DNA analyses (23andMe) or devices that measure the quality of the air.

There are virtually no limits to what can be measured. Sensors are getting cheaper, popping up everywhere and getting better. More and more data is being produced. The real challenge, of course, is to convert that data into information, that information into knowledge, and that knowledge into wisdom, according to the well-known DIKW pyramid. However, maybe this classic pyramid is no longer workable. After all, how can you convert so much data into information? Only algorithms can do that. And can you trust an algorithm?
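
To show how small the first step of that climb already is, here is a minimal Python sketch with invented step counts that turns data into information and something resembling knowledge. The 7,000-step threshold is an illustrative assumption, not a medical standard.

from statistics import mean

raw_steps_per_day = [4200, 5100, 3900, 8800, 4600, 4100, 5000]  # data

daily_average = mean(raw_steps_per_day)            # information: a summary

GUIDELINE = 7000                                   # illustrative threshold,
                                                   # not a medical standard
verdict = "below" if daily_average < GUIDELINE else "at or above"  # knowledge

print(f"Average: {daily_average:.0f} steps/day, {verdict} the guideline")
# The wisdom step - deciding what, if anything, to change - is exactly
# the part the text argues you cannot simply outsource to an algorithm.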

At the moment, the Quantified Self is mainly for the nerds. It is not always easy to extract the data from the tools, and it is certainly not easy to convert this data into information, let alone knowledge and wisdom. Some people research the relationship between their periods and the color of their web purchases, or their Tinder behavior, but they are still rare. In the future, however, the technology will continue to improve and disappear into the background. Standards will be created and a new profession may arise: the Personal Data Coach. Someone (or, in the long term, an algorithm) who looks at your data and provides you with advice.

The underlying idea of the Quantified Self is of course not only that you gain insight into yourself, but also that you do something with that insight. That you improve yourself. Upgrade yourself. Biohackers, for example, measure, make a change, and investigate the impact of the change. This ranges from adjusting their food and stimulating their brain waves to modifying their DNA or implanting technology.

You may say: sounds good. Everyone gains insight into themselves, everyone can improve. Everyone will soon be happy and live to be 1000 years old. Or maybe not?

Let’s look at some disadvantages.

Disadvantages to data -> First, there are a lot of disadvantages to data itself. These are discussed in crash course number four.

Privacy -> Privacy is discussed in crash course number three. Still, we have some remarks that are directly connected to the Quantified Self movement. Many of the apps, wearables and tools you need to measure yourself don’t take privacy too seriously. For example: the Fitbit, a wristband that keeps track of how much you sleep, how much you move and what your heart rate is. That information is completely private, because it is about you. However, you can only activate the Fitbit if you first allow your data to be shared with the (American) company. The company does that in accordance with Fitbit’s terms and conditions, but there are many concerns and loopholes. Certainly as a consumer, it is not easy to find out exactly what happens to the data of a device that measures your personal data.

Think about that for a moment. Who actually measures whom? While you think you measure yourself, you are simultaneously being measured by someone else. While you measure your sleeping behavior and experiment with improvements, someone else measures you too. While you watch Netflix, Netflix watches you. While you listen to Spotify, Spotify listens to you. Just like the white mice in The Hitchhiker’s Guide to the Galaxy, who conduct experiments on humans. We should therefore seriously work on apps, tools and wearables of which we ourselves are 100% the owner of the data. Maybe we would pay a little more. That would mean paying for our right to privacy. Absurd, but better than the current situation.

Data Ownership -> A third danger is of course that our bosses and our teachers also want access to our data. The Dutch Data Protection Authority recently banned the processing of employee health data by employers, but there is much more data. Your employer often already keeps track of how much you work (time for an old-fashioned word: the time clock). Why wouldn’t they keep track of how much you e-mail, surf pointlessly, work from home, meet and so on? The same applies to students. Nowadays, it is increasingly recorded how a student behaves in the e-learning environment (login behavior, what have you viewed, what have you submitted and when?). We call this Learning Analytics. But you can also keep track of how much time someone spends on campus. And what they do there. And how many times students check their phones during class. The danger is that it is no longer about the Quantified Self, but that someone else is quantifying you. Call it the Quantified You! Or Big Data. Quantified Self should be designed as a Big Mother (you get your data, which you can interpret yourself and which leads to suggestions for improvement) and not as a Big Brother (someone else looks at your data and draws their own conclusions).

Solidarity -> And what about solidarity? If the rise of the Quantified Self leads to the Quantified You, this offers the opportunity to attach conclusions to that data. For example, if you share your data with your health insurer (and you live healthily), you will receive a discount. Many Dutch people are open to that. The same principle applies to your driving behavior. It sounds fair. Why should you, a perfectly healthy young person, have to pay for someone who neglects their health? But there are three problems with this type of insurance based on (big) data from the Quantified Self.

Problem One: The infinite complexity of reality.
Health or safe driving behavior is much more complex than the data presents it. This means that the data collected is not only a bad indicator, but also leads to an insatiable need for more data. The core of the argument is that data leads to simplification. If you are too fat, you should walk more and eat less: a ‘solution’ that ignores the millions of other health, economic and cultural factors contributing to obesity.

Problem Two: The average becomes the norm.
If you measure something, there will be an average. And if there is an average, there will be a wrong and a right. And that is a big problem, because this way Big Data and the Quantified Self lead to mediocrity. Instead of taking advantage of the uniqueness of the Quantified Self (it’s about you, the unique you!), the opposite happens. Oh, the irony. Oh yes, and of course there are people who will game the system. There is already a device to which you can attach your Fitbit so that it simulates steps.

Problem Three: Solidarity is under pressure.
The insurer’s system is based on the healthy young person paying for the less healthy older person. How will that system continue to work if the premium is linked to Quantified Self data? And what does that mean for people who do not show risk-averse behavior? For innovators, adventurers, entrepreneurs who work 18 hours a day, etc.? Will boringness be rewarded?

At the same time, the Quantified Self movement can also be a driver of solidarity. This movement is also known as the Quantified Us. The idea is that groups of people share their Quantified Self data and (automatically) learn from each other. Suppose you have diabetes, epilepsy or poor sleep, and you share your data with others. That enables you to find solutions for your problem together. The intriguing thing, of course, is that on the one hand there are enormous concerns about the security and privacy of the digital patient file, while on the other hand platforms are popping up everywhere on which patients seemingly carelessly share their data.

Further suggestions:

  1. Quantified Self TED Talk by Gary Wolf (5 minutes);
  2. The official Quantified Self website;
  3. Article on Wired.com about the Quantified Us;
  4. TED Talk by Giorgia Lupi on the Quantified Self (12 minutes).

Key takeaways:

  • Technology can help you improve. Technology can make you happier and give you more autonomy;
  • The Quantified Self is an example;
  • However, the Quantified Self also comes with a lot of pitfalls to be aware of.

A VERY SHORT SUMMARY OF THIS CRASHCOURSE

Congratulations! You have completed crashcourse number two, so you have had a very small taste of thinking about technology and its impact on human values. An appetizer, if you will. Maybe you did some further reading, so you have started on the soup. Good for you. Remember: technology impacts human values. If you assess or judge a technology, you also have to look at its impact on human values.

In this crashcourse we looked at the way technology influences our autonomy (our ability to make decisions) and our happiness. Both are challenged by the attention economy, addictive technologies and surveillance capitalism. This means we need to be more active in putting our human values at the core. But even when technology promises to give us more autonomy or make us happier, like the Quantified Self, we still have to be aware of the pitfalls.

Remember: technology should make us more human; technology should not try to be more like humans.