ABOUT THE SPEAKER
Sougwen Chung - Artist, researcher
Sougwen 愫君 Chung is an artist and researcher whose work explores the dynamics between humans and systems.

Why you should listen
Sougwen Chung's work explores the mark-made-by-hand and the mark-made-by-machine as an approach to understanding the dynamics of humans and systems. Chung is a former research fellow at MIT’s Media Lab and a pioneer in the field of human-machine collaboration. In 2019, she was selected as the Woman of the Year in Monaco for achievement in the Arts & Sciences.
 
In 2018 she was an inaugural E.A.T. Artist in Residence in partnership with the New Museum and Bell Labs, and was awarded a commission for her project Omnia per Omnia. In 2016, Chung received Japan Media Arts' Excellence Award for her project Drawing Operations. She has been awarded Artist in Residence positions at Google, Eyebeam, Japan Media Arts and Pier 9 Autodesk. Her speculative critical practice spans performance, installation and drawing, and has been featured in numerous exhibitions at museums and galleries around the world.
More profile about the speaker
Sougwen Chung | Speaker | TED.com
TED@BCG Mumbai

Sougwen Chung: Why I draw with robots

Filmed:
160,983 views

What happens when humans and robots make art together? In this awe-inspiring talk, artist Sougwen Chung shows how she "taught" her artistic style to a machine -- and shares the results of their collaboration after making an unexpected discovery: robots make mistakes, too. "Part of the beauty of human and machine systems is their inherent, shared fallibility," she says.


00:12 Many of us here use technology in our day-to-day.
00:16 And some of us rely on technology to do our jobs.
00:19 For a while, I thought of machines and the technologies that drive them
00:23 as perfect tools that could make my work more efficient and more productive.
00:28 But with the rise of automation across so many different industries,
00:31 it led me to wonder:
00:33 If machines are starting to be able to do the work
00:35 traditionally done by humans,
00:37 what will become of the human hand?
00:40 How does our desire for perfection, precision and automation
00:44 affect our ability to be creative?
00:46 In my work as an artist and researcher, I explore AI and robotics
00:50 to develop new processes for human creativity.
00:54 For the past few years,
00:55 I've made work alongside machines, data and emerging technologies.
01:00 It's part of a lifelong fascination
01:02 about the dynamics of individuals and systems
01:04 and all the messiness that that entails.
01:07 It's how I'm exploring questions about where AI ends and we begin
01:12 and where I'm developing processes
01:13 that investigate potential sensory mixes of the future.
01:17 I think it's where philosophy and technology intersect.
01:20 Doing this work has taught me a few things.
01:23 It's taught me how embracing imperfection
01:26 can actually teach us something about ourselves.
01:29 It's taught me that exploring art
01:31 can actually help shape the technology that shapes us.
01:35 And it's taught me that combining AI and robotics
01:38 with traditional forms of creativity -- visual arts in my case --
01:41 can help us think a little bit more deeply
01:44 about what is human and what is the machine.
01:47 And it's led me to the realization
01:49 that collaboration is the key to creating the space for both
01:52 as we move forward.
01:54 It all started with a simple experiment with machines,
01:57 called "Drawing Operations Unit: Generation 1."
02:00 I call the machine "D.O.U.G." for short.
02:02 Before I built D.O.U.G.,
02:04 I didn't know anything about building robots.
02:07 I took some open-source robotic arm designs,
02:10 I hacked together a system where the robot would match my gestures
02:13 and follow [them] in real time.
02:15 The premise was simple:
02:16 I would lead, and it would follow.
02:19 I would draw a line, and it would mimic my line.
02:22 So back in 2015, there we were, drawing for the first time,
02:26 in front of a small audience in New York City.
02:28 The process was pretty sparse --
02:31 no lights, no sounds, nothing to hide behind.
02:35 Just my palms sweating and the robot's new servos heating up.
02:38 (Laughs) Clearly, we were not built for this.
02:41 But something interesting happened, something I didn't anticipate.
02:45 See, D.O.U.G., in its primitive form, wasn't tracking my line perfectly.
02:49 While in the simulation that happened onscreen
02:52 it was pixel-perfect,
02:53 in physical reality, it was a different story.
02:56 It would slip and slide and punctuate and falter,
02:59 and I would be forced to respond.
03:01 There was nothing pristine about it.
03:03 And yet, somehow, the mistakes made the work more interesting.
03:06 The machine was interpreting my line but not perfectly.
03:09 And I was forced to respond.
03:10 We were adapting to each other in real time.
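The lead-and-follow premise described above can be caricatured in a few lines of code. This is purely an illustrative sketch, not Chung's actual control loop: the sine-wave "gesture" and the follow rate are invented for the example. It shows how a follower that repeatedly moves toward the leader's current position always trails it slightly -- a toy analogue of the imperfect tracking the talk describes.

```python
import numpy as np

# The "human" traces a gesture (here, one period of a sine curve).
t = np.linspace(0, 2 * np.pi, 200)
human = np.sin(t)

# The "robot" follows with a first-order lag: each step it moves
# a fraction of the way toward the human's current position.
robot = np.zeros_like(human)
alpha = 0.2  # follow rate: lower = laggier, more visible deviation
for i in range(1, len(t)):
    robot[i] = robot[i - 1] + alpha * (human[i] - robot[i - 1])

# The follower never matches the leader exactly -- there is always
# a small tracking error, the seed of the "mistakes" in the drawing.
error = np.abs(human - robot)
print(f"max tracking error: {error.max():.3f}")
```

In the physical system the deviations came from servos, friction and latency rather than a tidy lag filter, but the qualitative effect is the same: the machine's line is an interpretation, not a copy.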
03:13
And seeing this taught me a few things.
63
181496
1937
03:15
It showed me that our mistakes
actually made the work more interesting.
64
183457
4880
03:20
And I realized that, you know,
through the imperfection of the machine,
65
188663
4249
03:24
our imperfections became
what was beautiful about the interaction.
66
192936
3705
03:29
And I was excited,
because it led me to the realization
67
197650
3087
03:32
that maybe part of the beauty
of human and machine systems
68
200761
3650
03:36
is their shared inherent fallibility.
69
204435
2738
03:39
For the second generation of D.O.U.G.,
70
207197
1820
03:41
I knew I wanted to explore this idea.
71
209041
2307
03:43
But instead of an accident produced
by pushing a robotic arm to its limits,
72
211372
4418
03:47
I wanted to design a system
that would respond to my drawings
73
215814
2897
03:50
in ways that I didn't expect.
74
218735
1833
03:52
So, I used a visual algorithm
to extract visual information
75
220592
3849
03:56
from decades of my digital
and analog drawings.
76
224465
2978
03:59
I trained a neural net on these drawings
77
227467
2055
04:01
in order to generate
recurring patterns in the work
78
229546
2865
04:04
that were then fed through custom software
back into the machine.
79
232435
3476
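The talk doesn't specify the network behind D.O.U.G._2, so this is only a loose, hypothetical sketch of the general idea -- train a network on a corpus of drawings so it can compress and regenerate their recurring patterns. The stand-in "drawings," the sizes, and the one-hidden-layer autoencoder are all assumptions for illustration, not Chung's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a digitized drawing corpus:
# 200 tiny 8x8 "sketches" flattened to 64-dim vectors.
drawings = rng.random((200, 64))

# Minimal autoencoder: encode each drawing into a 16-dim code,
# then decode it back. The learned codes capture recurring
# structure, from which new pattern-like outputs can be decoded.
dim, hidden, lr = 64, 16, 0.05
W1 = rng.normal(0, 0.1, (dim, hidden))
W2 = rng.normal(0, 0.1, (hidden, dim))

losses = []
for epoch in range(200):
    h = np.tanh(drawings @ W1)      # encode
    out = h @ W2                    # decode (reconstruction)
    err = out - drawings
    losses.append(float((err ** 2).mean()))
    # Backpropagate the reconstruction error through both layers.
    gW2 = h.T @ err / len(drawings)
    gh = (err @ W2.T) * (1 - h ** 2)
    gW1 = drawings.T @ gh / len(drawings)
    W1 -= lr * gW1
    W2 -= lr * gW2

print(f"reconstruction loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In a setup like the one described, decoded outputs would then be converted by custom software into motion targets for the robotic arm; that bridge is outside this sketch.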
04:07 I painstakingly collected as many of my drawings as I could find --
04:12 finished works, unfinished experiments and random sketches --
04:16 and tagged them for the AI system.
04:18 And since I'm an artist, I've been making work for over 20 years.
04:22 Collecting that many drawings took months,
04:24 it was a whole thing.
04:25 And here's the thing about training AI systems:
04:28 it's actually a lot of hard work.
04:31 A lot of work goes on behind the scenes.
04:33 But in doing the work, I realized a little bit more
04:35 about how the architecture of an AI is constructed.
04:39 And I realized it's not just made of models and classifiers
04:42 for the neural network.
04:43 But it's a fundamentally malleable and shapable system,
04:47 one in which the human hand is always present.
04:50 It's far from the omnipotent AI we've been told to believe in.
04:54 So I collected these drawings for the neural net.
04:56 And we realized something that wasn't previously possible.
05:00 My robot D.O.U.G. became a real-time interactive reflection
05:05 of the work I'd done through the course of my life.
05:07 The data was personal, but the results were powerful.
05:11 And I got really excited,
05:13 because I started thinking maybe machines don't need to be just tools,
05:17 but they can function as nonhuman collaborators.
05:21 And even more than that,
05:23 I thought maybe the future of human creativity
05:25 isn't in what it makes
05:27 but how it comes together to explore new ways of making.
05:31 So if D.O.U.G._1 was the muscle,
05:33 and D.O.U.G._2 was the brain,
05:35 then I like to think of D.O.U.G._3 as the family.
05:38 I knew I wanted to explore this idea of human-nonhuman collaboration at scale.
05:43 So over the past few months,
05:44 I worked with my team to develop 20 custom robots
05:47 that could work with me as a collective.
05:49 They would work as a group,
05:51 and together, we would collaborate with all of New York City.
05:54 I was really inspired by Stanford researcher Fei-Fei Li,
05:57 who said, "If we want to teach machines how to think,
05:59 we need to first teach them how to see."
06:01 It made me think of the past decade of my life in New York,
06:04 and how I'd been all watched over by these surveillance cameras around the city.
06:08 And I thought it would be really interesting
06:10 if I could use them to teach my robots to see.
06:12 So with this project,
06:14 I thought about the gaze of the machine,
06:16 and I began to think about vision as multidimensional,
06:20 as views from somewhere.
06:22 We collected video
06:24 from publicly available camera feeds on the internet
06:27 of people walking on the sidewalks,
06:28 cars and taxis on the road,
06:30 all kinds of urban movement.
06:33 We trained a vision algorithm on those feeds
06:35 based on a technique called "optical flow,"
06:38 to analyze the collective density,
06:40 direction, dwell and velocity states of urban movement.
06:44 Our system extracted those states from the feeds as positional data
06:48 that became paths for my robotic units to draw on.
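The talk names optical flow but not the implementation. As an illustrative stand-in only, the sketch below recovers a single global motion vector between two frames via phase correlation -- a classic frequency-domain relative of flow estimation -- and turns it into the kind of direction and velocity states described. The synthetic "frames," the known shift, and the method choice are all assumptions for the example, not the project's actual vision pipeline (dense optical flow over real camera feeds would estimate a whole field of such vectors).

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic "camera frames": frame2 is frame1 shifted by a known
# (dy, dx) = (2, 3) pixels, standing in for motion on a city street.
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, (2, 3), axis=(0, 1))

# Phase correlation: normalize the cross-power spectrum so that only
# phase (i.e. shift) information remains; the inverse FFT then peaks
# at the displacement between the two frames.
f1, f2 = np.fft.fft2(frame1), np.fft.fft2(frame2)
cross = np.conj(f1) * f2
corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

# Wrap large indices to negative shifts (correlation is circular).
if dy > frame1.shape[0] // 2:
    dy -= frame1.shape[0]
if dx > frame1.shape[1] // 2:
    dx -= frame1.shape[1]

speed = float(np.hypot(dy, dx))                    # "velocity" state
direction = float(np.degrees(np.arctan2(dy, dx)))  # "direction" state
print(f"shift=({dy}, {dx})  speed={speed:.2f}  direction={direction:.1f} deg")
```

States like these, aggregated over many feeds, are the sort of positional signal that could drive drawing paths for a group of robots.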
06:51 Instead of a collaboration of one-to-one,
06:54 we made a collaboration of many-to-many.
06:57 By combining the vision of human and machine in the city,
07:01 we reimagined what a landscape painting could be.
07:03 Throughout all of my experiments with D.O.U.G.,
07:06 no two performances have ever been the same.
07:08 And through collaboration,
07:10 we create something that neither of us could have done alone:
07:13 we explore the boundaries of our creativity,
07:15 human and nonhuman working in parallel.
07:19 I think this is just the beginning.
07:22 This year, I've launched Scilicet,
07:24 my new lab exploring human and interhuman collaboration.
07:29 We're really interested in the feedback loop
07:31 between individual, artificial and ecological systems.
07:36 We're connecting human and machine output
07:38 to biometrics and other kinds of environmental data.
07:41 We're inviting anyone who's interested in the future of work, systems
07:45 and interhuman collaboration
07:47 to explore with us.
07:48 We know it's not just technologists that have to do this work
07:52 and that we all have a role to play.
07:54 We believe that by teaching machines
07:56 how to do the work traditionally done by humans,
07:59 we can explore and evolve our criteria
08:02 of what's made possible by the human hand.
08:04 And part of that journey is embracing the imperfections
08:08 and recognizing the fallibility of both human and machine,
08:12 in order to expand the potential of both.
08:14 Today, I'm still in pursuit of finding the beauty
08:17 in human and nonhuman creativity.
08:19 In the future, I have no idea what that will look like,
08:23 but I'm pretty curious to find out.
08:25 Thank you.
08:26 (Applause)
Translated by Ivana Korom
Reviewed by Camille Martínez



Data provided by TED.
