
PHIL202:
PHILOSOPHY, TECHNOLOGY AND
THE FUTURE OF HUMANITY
A/Prof Paul Formosa
Paul.Formosa@mq.edu.au

TODAY’S LECTURE
• Teaching staff
• What will we be looking at?
• Unit outline
• Unit guide
• Learning outcomes
• Assessment
• Readings
• How the seminars will run
• What you need to do!

TEACHING STAFF
• Unit convenor and co-Lecturer:
A/Prof Paul Formosa

• Co-Lecturer:
• Dr Alex Gillett

WHAT IS
TECHNOLOGY?
• “Technology” comes from the Greek “techne”, meaning “craft” or “art”, and “logia”, meaning “study of”.
• Roughly: the study of how to apply or do things.
• The word gained prominence in English with the Industrial Revolution.
• E.g. “study of mechanical and industrial arts” (1859) (the Century Dictionary, 1895, gives as examples "spinning, metal-working, or brewing"). (From etymonline.com).
• Now: application of scientific knowledge for
practical, applied or industrial purposes.

WHY THINK ABOUT
TECHNOLOGY?
• Technological progress changes society and
changes us.
• Technology changes faster than humans and
human societies.
• The pace of technological progress is increasing. (E.g. Kids
grow up with screens now).
• We are amidst a digital or information
revolution in terms of technology.
• What is this doing to us? (E.g. Think about
massive changes smartphones have had on
our lives).

WHAT QUESTIONS DOES
IT RAISE?
• 1) How does technology change us?
• 2) How does technology change our societies?
• 3) What should we do about it?
• Task: A) Think of some important recent, new or emerging technology. B) Try to answer the above three questions.

E.G. BIG DATA
• Big data: Kitchin (2014) defines big data as data sets and databases that are ‘big’ along three lines: volume, velocity, and variety (the 3Vs). (Hoffmann 2017)
• Massive in size (petabytes or even zettabytes), rapid in production (e.g. data generated by social networking sites), and diverse in type (Kitchin 2014).
• Data is scalable, searchable and combinable (Hoffmann 2017).
• Combine with machine learning algorithms, AI, etc. to
find patterns that we can’t see that can predict human
behaviour and reveal information about us.
• (E.g. Infer sexuality via Facebook likes).
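The kind of prediction described above can be sketched in a toy example. Everything here is invented for illustration (the page names, the labels, and the training data are all hypothetical); real systems use far larger data sets and more sophisticated models, but the basic idea of scoring a hidden attribute from which pages a user has liked is the same.

```python
# Toy sketch (hypothetical data): inferring a hidden attribute from "likes"
# using simple Laplace-smoothed co-occurrence counts.
from collections import defaultdict

# Invented training data: (set of liked pages, known attribute) pairs.
training = [
    ({"page_a", "page_b"}, "group1"),
    ({"page_a", "page_c"}, "group1"),
    ({"page_d", "page_e"}, "group2"),
    ({"page_d", "page_b"}, "group2"),
]

# Count how often each page co-occurs with each attribute value.
counts = defaultdict(lambda: defaultdict(int))
label_totals = defaultdict(int)
for likes, label in training:
    label_totals[label] += 1
    for page in likes:
        counts[page][label] += 1

def predict(likes):
    """Score each label by summed, smoothed per-page frequencies."""
    scores = defaultdict(float)
    for label, total in label_totals.items():
        for page in likes:
            # Laplace-smoothed frequency of this page among users with this label.
            scores[label] += (counts[page][label] + 1) / (total + 2)
    return max(scores, key=scores.get)

print(predict({"page_a"}))  # prints group1
```

The point of the sketch is philosophical rather than technical: the user never discloses the attribute directly, yet it can be inferred from seemingly innocuous data, which is why consent and privacy questions arise.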

E.G. BIG DATA


• Data blind spots – what is missed? E.g. The elderly?
• Problems of representation – who decides how data is
represented? (E.g. What are the gender options?)
• Biases in algorithms – how are algorithms trained? E.g. Facial recognition systems recognise Black faces less accurately.
• Privacy – did we consent? What did we consent to?
What does it reveal about ourselves and others? E.g.
Infer sexuality from Facebook likes and friends.
• Inaccurate predictions – what are these being used for?
E.g. Chance of reoffending in sentencing hearing.
• Manipulation – using data to get people to buy things or
monopolise their attention on an app. What does this do
to human relationships?

E.G. BIG DATA


• How does this technology change us?
• How does this technology change our societies?
• What should we do about it?

TECHNOLOGY AND
HUMANITY
• 1. Technology is always good for us and makes us better.
• 2. Technology is always bad for us and makes us worse.
• 3. Technology is simply neutral, and we can make good or bad uses of it.
• Which (if any) of these is right? (Note: 3 is common!).

TECHNOLOGY AND
HUMANITY
• Clearly the same technology (e.g. nuclear technology)
can be put to good (clean energy) or bad (bombs)
uses.
• But technology is not value neutral. Why?
• Values are often embedded in technology. E.g. Think
how privacy is embedded in software – this embeds a
way of thinking about the importance of privacy. Are
there lots of privacy settings? Easy to access? Etc.
• And artificial agents may have to make ethical decisions or value judgments. E.g. AVs and trolley problems. Technology as a moral agent.

TECHNOLOGY AND
HUMANITY
• And some technologies and uses of technologies might
be good for some (e.g. rich white people) and bad for
others (e.g. poorer minorities).
• This raises questions of fairness and justice.
• Or good in some respects and bad in others.
• E.g. Automation might give us more free time but rob
our lives of some meaning.
• Or simply change us in complex ways. E.g. Think of the
way that social media has changed how we think
about privacy, sharing, friendship, etc. Overall, is this
good or bad? How do we even answer that?

TECHNOLOGY AND
HUMANITY
• We often think of technology as simply a tool we use.
• But this is also becoming too simplistic.

• Algorithms and AIs are making decisions for us. They can
act independently of us in ways we don’t understand.
• Social robots could become our friends or colleagues.
You don’t become attached to a hammer in the same
way.
• Technology could become part of us – or we could
merge with an AI (Human-Brain interfaces) or even
upload ourselves into virtual environments.

UNIT OUTLINE
• MIND, BODIES AND TECHNOLOGY
• W2 – What is technology? Optimist and pessimist views of technology. (AG)
• We will look at what technology is and how it has been understood by philosophers.
• We will look at optimist views (technology is good!) and pessimist views (technology is bad!) and the arguments for these views.

UNIT OUTLINE
• W3 - Artificial Intelligence (AI). (AG)
• What is intelligence? What is artificial intelligence?
• What are the different forms of AI?
• How does AI work?
• Together W2 and W3 provide a good background for the issues we examine later in the unit.

UNIT OUTLINE
• W4 - Mind and technology: co-evolution of mind and technology (AG)
• What role has technology already played in making us who we are?
• How did we become modern humans with the help of technology?

UNIT OUTLINE
• W5 - The Singularity and Mind-uploading: Will humanity survive? (PF)
• What if an AI could be self-improving? Then an AI could build a better AI. And that better AI could build an even better AI … and so on … until we get an intelligence explosion (the singularity).
• Would that be the end of humanity? What would be left for us? Would the AIs let us live?
• Or would we survive by uploading ourselves into virtual worlds?

UNIT OUTLINE
• ETHICAL AND SOCIAL ASPECTS OF TECHNOLOGY
• W6 – Artificial moral agents: Can robots be persons? (PF)
• Can and should we make artificial moral agents? Are there different types of ethical robots we could make?
• Can we off-load ethical decisions to AIs and robots?
• Should we off-load ethical decisions to AIs and robots?
• What happens if/when social robots look and act just like us (e.g. Westworld)? Would they really be moral agents then?

UNIT OUTLINE
• W7 - Autonomous Vehicles and Carebots: How to live with machines (PF)
• AVs will need to make ethical decisions. How should they do that? Should there be an individual ethical setting or a mandatory ethical setting? Who should decide?
• Carebots could take on much of our everyday care work. What are the implications of this? Will we lose some moral skills in the process?

UNIT OUTLINE
• W8 – Videogames and morality: Do virtual actions matter? (PF)
• We live more of our lives in virtual worlds, such as videogames.
• Do our virtual actions matter? Why is it ok to murder people in videogames but not in the “real” world? What is the difference?
• How do videogames impact us? Do they make us better or worse – or both? How can videogames be designed to be morally and ethically engaging?

UNIT OUTLINE
• W9 – Privacy on the Internet: Do we have any and should we care? (PF)
• What is privacy?
• Why do we care about privacy?
• Do we have any privacy on the internet? Should we
care how much privacy we have on the internet?
• How can we respect people’s privacy?
• What are the impacts of a lack of privacy?

UNIT OUTLINE
• TECHNOLOGY AND THE FUTURE OF HUMANITY
• W10 – Economy and politics of cognitive capitalism (AG)
• What are some of the political implications of the digital/information revolution?
• Many platforms are designed to maximize the amount
of your attention they can get (the “attention
economy”) to sell advertising.
• What are the political implications of this? How does it
change how we think about capitalism and work?

UNIT OUTLINE
• W11 – Automation: dangers and solutions (AG).
• A lot of work previously done by humans is being automated.
• What are some of the social and political implications
of this?
• Where does this leave the world of human work?
• What does automation do to our skills?

UNIT OUTLINE
• W12 - Human enhancement. (AG)
• Are we already cyborgs? Is technology part of us? Has it always been?
• Should we “enhance” ourselves? What are some of the
dangers and benefits of making ourselves “better”?
• What will humanity become in the future?

WEEKLY READINGS
• Required readings: available via Leganto.
• See the unit guide and iLearn for details about the
required readings each week.
• Read these BEFORE the seminar.
• No printed unit reader or textbook to buy.

WEEKLY SEMINARS
• We are a bit too big now for a single seminar (especially an online one).
• So we will:
• 1) prerecord some lecture-style content (about 1 hour or less).
• 2) break up into two 50-minute tutorial groups for the special circumstances (synchronous mode) students. These will run from 4:05-4:55 (for A-K surnames) and 5:05-5:55 (for L-Z) on Wednesdays (i.e. during our 4-6pm seminar slot).
• 3) You must attend your assigned group (special circumstances/synchronous mode students only).

ASSESSMENT

ASSESSMENT
• 2000-word research essay. Worth 40%.
• Essay questions and rubric are out now.
• Due by 11:59PM on November 5.
• Weekly online quizzes. Worth 15% in total: 10 quizzes, each worth a maximum of 1.5%. They start in week 3.
• Quizzes cover content from the required readings and/or seminars. Quizzes open after the seminar and all close Friday November 4 at 11:59PM. One attempt only. 5 questions. 10 minutes. Don’t leave it to the last minute!

ASSESSMENT
• Participation – worth 15%
• Internal/Special circumstances/synchronous mode
students: participate in synchronous zoom discussion.
Attendance will be noted.
• EXTERNAL/online/asynchronous mode students: Post at
least TWO contributions to the appropriate forum within
a week of the relevant lecture/seminar.
• Your contribution should include BOTH a direct
response to the discussion questions for each week
AND a contribution that seeks to engage in a dialogue
with the views of other students.

ASSESSMENT
• Length: Your weekly post/s should total roughly 200-300 words.
• There is no strict upper limit on the length or number of weekly contributions.
• Due date: on-going. All forums will close one week after
the final lecture in week 12.
• Late forum posts (i.e. posted more than 7 days after the
relevant lecture/seminar) will lose marks for a lack of
timeliness. Keep up with the discussion!
• I’ll post discussion questions after the lecture content
goes up.

ASSESSMENT
• Reflective Blog – worth 30%
• Due Date: 11/9/2020.
• Submission: ‘Reflective Blog’ link

ASSESSMENT
• Format and length: Each blog post should take the form of a short video/audio discussion. You can also include a poster, slide, image, or links as part of your post.
• Each entry should be strictly a maximum of 3 minutes (so three entries of up to 3 minutes each).
• You can be creative if you want to!
• Note: if you cannot record video or audio, you can
upload a written blog post of equivalent length.

ILEARN SITES
• Please log in and have a look around

QUESTIONS

WHAT YOU NEED TO DO!


• Download the unit guide from iLearn. Read it! Before you
ask a question, check the unit guide.
• Download the required readings for next week from
Leganto (via iLearn). Do the readings BEFORE the next
seminar.
• Download the lecture slides. Note: next week’s lecture
slides/notes and pre-recorded content will be uploaded
before the next seminar.
• Special circumstances/synchronous mode students – find out which group you are in (4-5pm for A-K or 5-6pm for L-Z). Note the weekly zoom link (the same each week for each group).
• Post your first entry to the online forums – start by
introducing yourself.

NEXT LECTURE
• W2 – What is technology? Optimist and pessimist views
of technology (AG)
• Reading 1: Mary Tiles and Hans Oberdiek, “Conflicting
Visions of Technology,” in Living in a Technological
Culture (London: Routledge, 1995), pp. 12–31.
• Reading 2: Andrew Feenberg, “What is the Philosophy of Technology?”, in Defining Technological Literacy: Towards an Epistemological Framework, J. Dakers (ed.) (Palgrave Macmillan, 2006), pp. 5-16.

REFERENCES
• https://www.etymonline.com/word/technology
• Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. Thousand Oaks, CA: Sage.
• Hoffmann, A. (2017). “Data, Technology, and Gender: Thinking About (and From) Trans Lives”, in Spaces for the Future: A Companion to Philosophy of Technology, Pitt & Shew (eds.). Routledge.