White Hat UX
Trine Falbe
Kim Andersen
Martin Michael Frederiksen
1. Discover
White hat, grey hat and black hat UX
A brief introduction to UX
Privacy Zuckering
Design patterns
Anti patterns
Dark patterns
UX gone wrong
2. Evaluate
Competition is a single click away
Simple and efficient test methods
Test reporting and release notes
3. Benchmark
Quality assurance and quality control
Baseline
4. Build
Understanding application types
From dark pattern to white hat UX
How to engage your development team
UX and scrum
5. Grow
Shitstorms and candystorms
From dark pattern to white hat UX
Improve
It takes time to grow
Strategic white hat UX
PRIVACY ZUCKERING
Systems and interfaces designed to harvest your personal data
Businesses are under pressure to act with more transparency. Consumers want openness and honesty, and there is even a growing focus on sustainability in America nowadays. At the same time, it has never been easier for businesses to harvest, store and exploit their customers’ personal data and online behaviour.
• For every privacy policy update, Facebook tells the media that it is now better, easier and faster to control your own content. The reality is that the settings you need to tweak are now spread out over so many pages that it is hard to keep count of them.
BENCHMARK
QUALITY ASSURANCE AND QUALITY CONTROL
Quality Assurance (QA) is essentially when you perform quality testing on a product or a service during the development process, before release.
Quality Control (QC) is the opposite: you test the quality of the finished product after it has been launched. People who work with QA argue that quality must be built into the product, rather than smeared over the product after it has been made.
The same holds true when it comes to UX. Some say that UX must
never be the icing on a bad cake. Or put differently: “we don’t put
lipstick on a pig”.
Sprint 1    22    11     0    32    12    = 77
Sprint 2    21    13     3    37    11    = 85
Sprint 3    57    21     1     3     3    = 85
Sprint 4    55    24     3     0     0    = 82
Here we have the same errors as before, but this time their value is stated as their monetary cost.
Sprint 1    22    11     0    32    12    = 77
Sprint 2    21    13     3    37    11    = 85
Sprint 3    57    21     1     3     3    = 85
Sprint 4    55    24     3     0     0    = 82
           $55   $120   $75    $0    $0   = $250
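The arithmetic behind the cost row can be made explicit. A minimal sketch: the per-error costs of $1, $5 and $25 for the first three columns follow from dividing the book’s cost row by Sprint 4’s error counts ($55/55, $120/24, $75/3); the costs for the last two columns are illustrative assumptions, since Sprint 4 caught no errors there.

```python
# Error counts per sprint. The five columns are the phases in which
# an error was found, cheapest fix first (assumption); the book's
# last column is the row total.
sprints = {
    "Sprint 1": [22, 11, 0, 32, 12],
    "Sprint 2": [21, 13, 3, 37, 11],
    "Sprint 3": [57, 21, 1, 3, 3],
    "Sprint 4": [55, 24, 3, 0, 0],
}

# Per-error fix cost by phase. $1/$5/$25 follow from the book's cost
# row; 125 and 625 are assumed values for the later, pricier phases.
cost_per_phase = [1, 5, 25, 125, 625]

for name, counts in sprints.items():
    total = sum(n * c for n, c in zip(counts, cost_per_phase))
    print(f"{name}: {sum(counts)} errors, fix cost ${total}")
    # Sprint 4 comes out at $250, matching the book's cost row.
```

Even though every sprint has roughly the same number of errors, the total fix cost collapses once errors are caught in the early, cheap phases.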
With roughly the same number of errors, the cost of correcting them drops as you move forward in development. This argument is well known in every test department, and it also holds true in the real world. The key thing to understand is that you can save a lot of money by prioritising UX work, which includes iterative usability testing, and by identifying and solving all problems as early in the development process as possible.
At the same time, we make sure that black or grey hat methods that have slipped into the project by mistake can be caught in an early UX test. Imagine the cost of overlooking a problem, delivering the project to the customers (via code, test and release) and then being hit by a shitstorm from those same customers. The costs of handling a crisis, developing a new version, running follow-up tests and releasing the product again are high. On top of that come the scratches to your image that follow, and a bad image does not disappear by magic.
No excuses. UX must be prioritized. Not because it feels good, but because it is too expensive if you choose not to. Get started.
FROM DARK PATTERN TO WHITE HAT
Make it easy for users to leave when they want to. Making it easy to leave makes their motivation to return that much greater.