SimOnAir Ep. 5 – The Ego of Metrics (Ben Grosser)
In this 2019 conversation, 4 years before LLMs gained mainstream popularity, Sim dives deep into the world of tech critique and the ego of metrics with Ben Grosser, an artist and scholar whose work focuses on the cultural, social, and political effects of software. They discuss the nuances of critiquing technology, the role of art in this domain, and how software shapes user behavior and vice versa.
Ben Grosser creates interactive experiences, machines, and systems that examine the cultural, social, and political implications of software. His works have been featured in The New Yorker, Wired, The Atlantic, and many other publications. Ben is also an Associate Professor at the University of Illinois, Urbana-Champaign.
- Ben shares insights into the world of tech critique and the importance of understanding the implications of software on society.
- The conversation touches upon the role of art in critiquing technology and how it can offer unique perspectives.
- They discuss the challenges and rewards of creating art that critiques tech, and how such art can lead to broader discussions and reflections.
- Ben also mentions his experiences as an educator and the importance of fostering critical thinking in students.
Sim (host): Ben, are you originally from Illinois?
Ben Grosser (guest): I am, I grew up in Champaign-Urbana.
Okay. Okay. And still around.
Still around… It's managed to hold me that, that long.
Okay. And then you decided… Did you know you wanted to go to school for music eventually?
I did. So my focus before college was in jazz.
So Miles Davis was my idol, I played trumpet. At whatever point I decided I wanted or knew I wanted to go to college, I knew that music… was the thing I wanted to study in college.
Okay. So you were set with your mind on that. And then eventually you got a couple of master's degrees [laughs].
One in Music.
One in Music.
Music Composition, I think.
Music Composition… where my focus was on computer music… both as a method, but also in terms of using computers not only to generate sound but also as an assistant in the composition process.
So kind of working with the computer as a collaborator. And then eventually I left school. I started a doctorate, but left, and we can talk about where I went in the interim… if it's of interest. Eventually I ended up going back and getting an MFA in New Media Art.
Let's go straight to it. Where did you go in the interim [laughs], since you've mentioned it?
Right. So I spent a number of years at the Beckman Institute… which is a big multidisciplinary science research center at the University of Illinois, where I directed the imaging technology group. My work there was both in kind of state-of-the-art central facilities for visualization and imaging and video and 3D printing and microscopy, these kinds of modes of analysis and investigation, but also in remote instrumentation. So working, at that time, to take scientific instruments and figure out ways to put them on the web so that there could be more kind of democratic use of these high-dollar $500,000 microscopes and this kind of thing.
Finding ways to allow crowds to use expensive instrumentation they would not have had access to. Okay.
That's right.
So the signature project there, which is still happening, was this project we worked on called Bugscope, where kids would send us bugs in the mail and we would put the bug in the $500,000 microscope. And then they could log on from their classroom and control the microscope and look at the bugs.
Oh, okay, okay. That's, that's interesting, yeah. When you mentioned that center, I thought, "Oh, you must have been involved in JPEG2000 standards." [laughs].
No, no [laughs]. But yes, very much, sure. I would say finding ways to utilize a variety of what were at that point, you know, new media technologies… in order to better communicate scientific ideas, in order to investigate or augment existing scientific methods.
Across all your work, there is this kind of keen interest toward the human condition and how technology affects it. And you're not simply a person who goes, like, "Here are two physical parts that I put together, and this is metaphorical of something, and I call it art." You take a technique, you take something that is very scientific, very quantified, and eventually you recompose it or repurpose it, and that becomes an expression of something deeper. And so if anything, I'm surprised that I don't see more of that, because we are at such a historical point, technologically and socially, that I would guess that what you do should be the standard of content.
Well, no, I don't mean it as a… but I'm surprised that it's not a standard, because I think that you-
There's a number of things that kind of play into it, disciplinary boundaries and the ways in which, whether in the humanities or the sciences or the arts (there are people working against this now, working to change this), there's still a lot of: you learn this discipline, and that discipline has its series of methods… and the way to advance in that discipline is to master the methods and possibly inch them forward in some way. Sampling across disciplines and kind of purposefully pushing the boundaries, or ignoring the boundaries, I think is less common, 'cause education often doesn't work that way.
Yeah. So for me, I don't know, I guess it's having come from music, where my interest was in technology as a medium, moving into the sciences, where I spent a lot of time at the Beckman Institute trying to advocate for the inclusion of arts as a research method in that space and never had very much success… and then kind of moving back into visual art now, but bringing all of that history of moving across these boundaries. Maybe that's where it comes from. But, you know, as someone who teaches at the university… I'm increasingly interested in trying to create spaces for students that bring them across these boundaries in ways that allow them to practice the methods they've learned, but also show how merging those methods with other people's methods and getting into those conversations can change what they end up doing.
So going a little bit the opposite way of specialization, trying to cross-pollinate. Mm-hmm [affirmative].
Yeah, without specialization we wouldn't have a lot of the things that we have. But there's an increasing need for those people who kind of resist too much specialization, who live in those middle spaces, who can be translators across the boundaries… who are familiar enough with how software is designed and created, and who creates it and why they do what they do, so that they can think about it critically, not just from a detached position but from a position that, you know, also has a more nuanced understanding of who and why and what choices got made and how things end up the way they do. I just did this big project looking at every public video appearance of Mark Zuckerberg from age 19 to age 34. So from the first public appearance all the way up to the end of 2018.
So I've watched a lot of video of him talking about what Facebook is and, you know, why he did it. And I will say that you learn things by watching that story… watching him tell that story over and over again. I'm not sure I ever really get to what he really thinks, 'cause there's a lot of repetition that is kind of suggestive of practice, an attempt to craft a very particular presentation. But if you look at some of the earliest videos… there is-
Yeah. Yeah. I think, what was the mission statement of Facebook until 2017, I think, when they felt like they needed to change it, was to make the world more open and connected. So if you accept that as the mission statement for Facebook, then the question is, why is that a mission? One thing that Zuckerberg talks about a lot is community. And it's increasingly become his buzzword, even more so now than it was before, as there's been a lot of critique over the ways in which algorithmic newsfeeds, you know, proliferate disinformation. But Zuckerberg, his idea of community, from my analysis, is very much about numbers and not about individual relationships. You know, he talks about the community as the entirety of everyone on Facebook; that's the community. To me, I think of communities as subsets of individuals with common interests. It seems to me maybe Facebook is the common interest, what makes the community of Facebook.
We all like to breathe [laughing].
How a piece of software is designed has significant effects on what kind of an experience the users of that software have… and who designs it has just as much of an effect on what gets done with it. It's kind of like the way a lot of tools get created: somebody had a need. Somebody needed to cut a piece of paper, so they figured out how to come up with something that would make that easier than it was previously. Somebody decided they wanted to make a piece of software that would allow musicians to create music. They start from their own conception of the music they want to create… and then they start to slowly assemble a tool that makes the way they want to create easier, more intuitive, streamlined, et cetera. If you think about a program like Photoshop… somebody thinks, I want to try to take what I do in the darkroom and replicate it in this digital space, and then they start adding to it and they tweak it. There's nothing wrong with this approach, but over time you end up with a tool that was designed by an individual or a group of individuals that pushes the user in particular directions, whether they realize it or not. So from my perspective, if we think about a software platform like Facebook, just to take as an example: the way in which the news feed is always reporting how many seconds ago something happened, or how many minutes ago something happened. Someone decided at one point that we need to know what just happened, right? And we need to know how far away we are from the present if we're looking at this piece of content. Whoever that person was who made that decision, they're a lot more concerned about extreme specificity of time than I am [laughing]. I don't need to know that my friend ate her banana 19 seconds ago versus 48 seconds ago- … but the design of the platform suggests I should care about it.
And I would argue that over time the more we use a platform like that, especially as it becomes ubiquitous, it starts to make us care too. So we could then say, well, why do they care? Like what is the, what is the benefit or what is the goal of making me more hyper aware of how old something is?
And an argument that I and other scholars would make in response to this is: well, there's an advantage for Facebook. The more I see value in the new and less value in the old… and the more they can redefine the old, not as 20 years ago but as two minutes ago, or two hours ago, or two days ago… Because the goal of Facebook is to make the world more open and connected, but it's the "connected" and the "more" that I would focus on in that statement, rather than the-
It's the more, you know. They want every human on the planet to be part of the system. They want every human on the planet to be generating content for the system as frequently as possible… because that's where their money gets made. That's where profit happens. The more data we produce for the system, the more money they make, the easier it is to sell to us and to sell our data through advertising and this kind of thing.
Mm-hmm [affirmative]. So milk the cow, and the cow makes more milk [laughs].
Yes, right [laughs].
Yeah… so that's just one example where the way the software platform is designed encourages, or maybe promotes, a reconfiguration or reconsideration, or maybe just a reframing, of what's old and what's new… and how new something needs to be to have more value than something that's old. And thus it leads us to want to keep producing new content, and to preference, with our reactions, our likes, our loves, our hahas, our wows, our comments, these kinds of things, newer content versus older content.
How do you feel about that?
About the reframing of time that I'm talking about?
No, no, the reframing of time is, I think, one way to express a bigger picture. It's just one implementation, one aspect, of the whole mindset that is behind, you know, more production.
It doesn't have to be time; as we said, it's more about "production". So how does it make you feel-
So as a user-
Yeah, as a user of Facebook, it makes me feel anxious. It makes me feel like I need to keep checking. I need to keep seeing what's going on.
'Cause what if I miss something?
What if there's something important? What if… You know, there's a way in which, in the newsfeed, there's always new content, there's always something new, and the old stuff kind of falls away. I mean, I think a lot of people have probably had some experience of this, where a whole thing happened on Facebook and you weren't looking at it for those two hours.
How dare you [laughs].
Right. So what's the answer to that? Well, be a little bit more aware of what might be happening on Facebook, or turn on notifications so that you maybe get alerted if something like that happens. So, I mean, this is the heart, a little bit, of why I even made Facebook Demetricator in the first place. As a user of Facebook, I realized that I was paying more attention to the numbers… in many cases, than the content that the numbers were, you know, reporting about. I wanted to see how many likes I got… and I might pay more attention to that than who liked it. Or I wanted to see how much it was shared, rather than who had shared it or what they had said. And once I kind of gained that self-awareness… this is like 2011, I found that quite disturbing [laughs].
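(Editor's note: the demetrication idea can be sketched as a small text transform. The function below is a hypothetical illustration, not the code of Grosser's actual extension, which rewrites the metric labels in the live Facebook page; the function name and patterns are invented for this example.)

```javascript
// Hypothetical sketch of demetrication: strip numeric counts from
// interface labels, so "12 likes" reads simply as "likes".
function demetricate(label) {
  return label
    .replace(/\b\d[\d.,]*[KkMm]?\s*/g, "") // drop counts like "12" or "1.2K"
    .replace(/\s{2,}/g, " ")               // collapse leftover spacing
    .trim();
}

console.log(demetricate("12 likes"));    // "likes"
console.log(demetricate("1.2K shares")); // "shares"
console.log(demetricate("3 mins ago"));  // "mins ago"
```

Applied to a timestamp, "19 seconds ago" becomes "seconds ago": the specificity goes, while the recency framing stays.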
So it was a while ago for you, because I was about to ask you, when did this happen for you? Because the aha moment about this for me was about April 2015. I remember the moment that I realized it, and I'm still not off of it, but it's, you know-
… but I, it's the moment that I was like, "Oh, this is what's happening [laughs] to me."
Right. Right. Yeah. It was 2011, it might be slightly earlier. There's a piece I did before Facebook Demetricator.
Mm-hmm [affirmative], when you started to notice the tendency within yourself and within these mechanisms and these platforms-
Yes, somewhere in the 2011 range.
And I'd say maybe the first place I noticed it wasn't even the likes, but the notification numbers. So the red and white notification numbers. And I started to realize, this is back in 2011, I didn't necessarily always have a tab open to Facebook. So I would maybe close the tab and then open a new tab… when I wanted to see Facebook again. And I realized that the first place I would look was that red and white notification number. I needed to see, had someone reacted to me… and if there was a red and white notification number waiting for me, then there'd be just the slightest little bit of kind of anticipatory happiness… until I clicked the number and then…
Right. And if I logged in and there wasn't a red and white notification number waiting for me, there was just this slightest little bit of disappointment. and so that kind of freaked me out. I was worried like, well, what does that do? How is that working?
But some people don't care. Do you think that they don't care, or do you think that it is harder to realize, at least consciously…
I mean, I think it's kind of both.
So some people who don't care don't care because they maybe are aware of how these numbers are working, to some extent, but it just feels so familiar, part of how things are supposed to happen. You know, we learn from a young age to pay attention to your score… and to use numbers as a guide. So for some people, and I've had these conversations, people are like, "Well, why would I want to hide the number? It doesn't make any sense. I mean, that's how I know what's important, or that's how I know if I'm doing well."
In this way you are arguing against the whole number system of scoring and judging in school [laughs]. You're going there [laughs].
I, I am going there. Absolutely. and-
Yeah, let's talk about school [laughing]. Yeah, let's go there [laughing].
I mean, so I do this line, which is, you know: is there any level of quantification that's acceptable in the world, once you start looking around at the effects of quantifying everything? I mean, as someone who teaches school-
… students are reactive to the numbers that they receive. Of course-
I mean I cried over numbers when I was a kid.
Yeah. Right. You know, we learn, and we're taught, to pay extreme attention to the numbers. And as with school, I would say it's a little bit similar in these software platforms as it is with, you know, education systems. They're both systems… that we are embedded in. Both of those systems have conditioned the users of those systems to pay attention to the numbers. So on Facebook, part of the reason, you know, some people are like, oh, the numbers are great, I'll pay attention to the numbers, and maybe they aren't critical. But other people, they've just become so conditioned by how the software is designed that they don't know what else to pay attention to. You know, they've learned to watch how many followers someone has on Twitter as a way of knowing if that person has authority… as opposed to, say, looking at their bio, reading what their posts are, trying to understand who they are. Same with, you know, likes, or any kind of social metric within these systems.
They've taught us to pay attention to the number. And so in education, in some ways, it becomes the same thing. Students want to know, how can I get the score that I want to get? And what are the ways in which I can get it? In any of these systems, it produces a focus on optimizing for the metric. So optimizing one's activities to gain success metrically.
I think every person is a creator in some way. And that brings to the point: if we are all creators, consciously or not, then how much of our creation is true and honest, a search for honesty? And how much is the fruit of algorithmic optimization at this point? Because we need it, we need to reach that. And that's a super interesting question, because we need to snap out of that mechanism. And even if you do, it's not easy, because if the rest around you works in that way, it just means that you're going to lose, it just means that you're eventually late.
If I wanted to cut my social media presence completely, for example, it may be a thing, but if I work in promotion, marketing, or something like that, I cannot afford to cut off a verified 10,000-follower Twitter account. It's just, I can't, I mean I can-
… but I need to give up completely a whole universe. Or, you know, even for this podcast: if I shoot a newsletter on Saturday morning, if I do it with one emoji inside of the headline, how many percentage points of difference is it going to make in openings? Is it going to be 51% versus 50%? And this is kind of how Facebook works; all these tweakings in the interface and these kinds of things are because, if only 1% more of the users open it more and use it more, eventually that brings more profits.
[laughs] Yeah, no, it's super interesting. Like the thing that you were saying: how much is adjusted to produce optimized content for the metric, and how much is true to yourself? How do we break this tendency? Should we break this tendency? Because what are the implications and the damages that we might have if we keep optimizing for the metric and not for the content?
So yeah, there's a couple of different ways we could look at that. In terms of Facebook and Twitter and social, I think we've seen now what optimization for the metric does. I think the Trump election, I think the Brexit disinformation campaigns, reveal that when optimization for the metric is how the system is designed… eventually someone will come along and optimize it in a way that drives things in directions we maybe don't want as a society [laughs]. Because when we take, say, qualitative evaluation… out of the equation of, do I want to be reading this, do I care who this is, should I follow this person? When the presence of the numbers starts to automate the way we react, it makes us less critical of the information that comes next.
Isn't optimization for the metric just another way to call growth?
Of course… so that's the other way in which, of course-
I'm just wondering, isn't this just a phenomenon of how everything actually works around us?
Yes. I mean, this is the root of capitalism: its dependence on and fetishization of growth… for survival. So if we think back to metrics on Facebook or followers on Twitter, it's not easy to find someone who would say, I'd really rather have fewer followers… or I'd rather have fewer likes. And this is, in some of the theoretical thinking and explorations I've done around what metrics do, why they activate us as users, and how they are useful for the designers of software in terms of promoting the production of content for systems by users. The way I think about it is, I'm looking back at our evolutionarily developed need for esteem. In Maslow's hierarchy of needs, esteem is high up in the pyramid. We have a need that we've developed over our history as a species to feel valued… whether by ourselves or by others. That's part of how we survived and became who we are. But now, in contemporary life, that need plays out in the context of capitalism, where value is quantifiable and growth is a constant requirement for success. And so I feel like the intersection of that need for esteem… and capitalism's need for endless growth… emerges such that it produces what I talk about as a desire for more. When a platform like Facebook is reporting a metric evaluation of our sociality to us… it's hard for us to not want those numbers to be larger.
So, in other words, it's kind of a micro-encouragement for growth in terms of our own sociality. More likes is better than less, more friends is better than less, more tweets, more followers, et cetera, et cetera. But here's one way to talk about it, which is: have you ever liked a post… that you didn't really read?
I'm sure I have.
We all have.
I mean, I think, I mean, there might be some-
… unusual outlier out there… but we dole out likes, we add people, we follow people. We increase metrics in one way or another for a lot of different reasons. Which means that when you see that some famous person has… 500,000 or 2 million followers on Twitter, what does that number mean? That's a reasonable question to ask. Like, what can we ascertain from the existence of that large number, that 2 million is somehow more important than 1.5 million in terms of followers? I mean, we know, there's been some good reporting in the last two, three years on how even famous people feel compelled to purchase, you know, tens or hundreds of thousands of followers, because they need their numbers to go up… even though they already had a million followers or whatever it is, you know?
[laughs] I need 1.1 [laughs].
Right. It becomes a competition.
Mm-hmm [affirmative]. My precious [laughing]. Hmm. I think, after I realized that these things influenced me a lot, I am in a constant fight… I'm in a constant fight with myself, under different aspects, to be less dependent on it. And sometimes my fetish becomes the battle with myself to not be too attached to that.
So I become obsessed by the optimization of unoptimizing the social value [laughing]. It becomes like… it's an ouroboros, I just bite my own tail.
So I ended up doing that. But I remember I never cared, I think, more than four or five minutes about things like the inbox zero technique, where you have zero emails. 'Cause my thinking has always been the opposite: I try to do the infinite inbox.
I just make it go up. I have like 30,000 unread, which is equal to zero-
… or at some point it will be the same number. I use Facebook Demetricator, I use Twitter Demetricator, I use ad blockers for Facebook. I use like a stabilizer for the timeline, so it would clean up and remove some kinds of things, like the "maybe you missed this" kind of story, that kind of thing. I use the reverse chronological order.
I need all of that. And then I was like, this is still not working for me. So I was just able to go off it a few years ago, but I also played with the idea. I started to ask myself… and you did that with an actual piece of work called Safebook.
Mm-hmm [affirmative]. Yeah.
And like, what would a social network look like that does not influence you through dark patterns, psychologically and behaviorally, with numbers, with colors, with content, with placement, with design? In my own little play, I ended up with almost a blank black and white page of text [laughs]-
… just like, but what about Safebook? How did you approach that project?
So Safebook, I would say, kind of came out of the summer of 2018… and watching how there was still plenty of reporting coming out about continued disinformation campaigns meant to influence the midterm election. So here we are, almost two years after the 2016 presidential election in the United States, and very little had changed in terms of the platforms, in terms of accountability. I was thinking about this even before, like, Zuckerberg was asked to testify before Congress… and this kind of thing. And I had already conducted a number of experiments at this point… Facebook Demetricator… being the most long-term one. Another one that I've done a lot of work with is this work called Go Rando, which obfuscates your emotional reactions: it selects randomly one of the six reactions for you, so that you don't inadvertently build an emotion profile of yourself for Facebook. I had made this work right after the 2016 election called Textbook-
… where I didn't even really try to get anybody else to use it. It was mostly just for me. I was thinking about, what was the role of the image… in the disinformation campaigns leading up to the election? And so I made a version of Facebook that just hid all images. And so I tried all these different things, maybe a little bit like your own experiments, where I'm kind of removing and modifying and tweaking versions of the interface to see, what am I still left with? How does the design of that interface still compel me in certain ways? And it just got me thinking, what would it take [laughs]-
… to make a social network that was completely "safe", and I'm putting "safe" in air quotes right now. You know, is that what safety is, right? I don't really think communication media without any content at all- is the-
[laughs] that's the safest way.
… is the best way to go. But, so I made this work called Safebook, which just hides all content on the site: all audio, video, text… And in some ways it was such a funny gesture-
… like, this is what it takes. But the result was a lot more intriguing to me, because I realized how easily I could still use Facebook, even with all of the content hidden… that I could still write a status, I could still like all my friends' posts, I could still, you know-
… click in these different ways. And yeah, I couldn't see the content, so it's not really the same kind of use. But it revealed to me just how ingrained the design of the interface and its interaction patterns had become in my brain, that I can navigate this complex piece of software without seeing any of the content at all.
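(Editor's note: the core move of Go Rando, mentioned above, picking one of Facebook's six reactions uniformly at random so that no emotion profile accumulates, can be sketched in a few lines. The names below are illustrative; the actual extension intercepts reaction clicks in the live page.)

```javascript
// Hypothetical sketch of Go Rando's core move: choose one of the six
// Facebook reactions uniformly at random, so that over time a user's
// reaction history carries no emotional signal.
const REACTIONS = ["Like", "Love", "Haha", "Wow", "Sad", "Angry"];

function randomReaction() {
  // Math.random() is in [0, 1), so the index is always 0..5.
  return REACTIONS[Math.floor(Math.random() * REACTIONS.length)];
}
```

Over many posts, each reaction appears roughly one-sixth of the time, which is exactly the point: the aggregate profile is noise.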
That's also an interesting point, because I don't know about you, but at some point, summer 2012 or so, I was starting to use Instagram. I went on a trip on my own in California for a few weeks and was just going around. And it's the first time that I noticed I was taking pictures not for myself or for the memory, but for, I dunno which crowd, but not for myself. I noticed I was not taking them for myself, and I thought it was exhausting.
It was this exhausting process of going somewhere and, instead of enjoying the moment, being focused on creating the memory, crafting the memory of the moment for somebody else. And at the end of it, it was like, "I had a good time, but how much good time did I miss?" [laughs].
Right. It's a question. I mean, I think that experience you're describing, a lot of people have had an experience like that; they just don't all come to an awareness of it. Nathan Jurgensen, I don't know if you're familiar with his work. He's a theorist who runs this conference called Theorizing the Web. And he has a new book out that I haven't had a chance to read yet; it's on my pile. But he talks about the way in which the design of something like Instagram starts to compel you to see your world as a future Instagram post. My version of that with Facebook is that I start to think about my current experience in terms of its future metric potential as a Facebook post. You know, if I take a photo of this and write a certain kind of status, what kind of like count might this achieve within the platform? So we're walking around, we're having our experiences, but it's very difficult, I think for you, I think for me, I think for a lot of us, to not have running in the back of our head how this could possibly look… within that social network… ecosystem.
Mm-hmm [affirmative]. You're the publisher of yourself all the time [laughs].
But you become the content editor of your life.
Right.. And, and that editor piece, I think it's, right, it's like…
Yeah. You become the director and editor of your life. It's a lot.
It's a limit before [laughs].
Right. Absolutely. And of, you know, everyone else's perception of your life. Yeah. So you're not just thinking, how is this going to look as a post, but how is this going to be perceived about me by others, and how might it relate to previous posts… and future posts, and across networks.
Yeah. Older versions of myself, previous, past, present [laughs].
That's a lot. I was playing around also with the idea of, you mentioned the way that I was interacting with Facebook changed, because I was still able to publish something but not necessarily to see somebody else's. And that's also something that's important. As soon as I… I think the title is The Order of Magnitude.
Order of Magnitude, yeah.
Okay. Order without "the". Okay. In Order of Magnitude, it's about 45 minutes of clips from Zuckerberg. And analyzing it, the first concept he starts with is numbers, numbers, lots of numbers, like percentages, like 1 million users, 1 billion users… In this 45 minutes of Mark Zuckerberg, what are the most interesting patterns that you saw coming through about the way he sees Facebook?
Yeah. It's an extract of anytime he talks about more… anytime he talks about growth… and anytime he speaks of metrics. So it's a supercut of… of all of those from all of his appearances. In terms of patterns, there's a few. One is that he is, with a few minor exceptions in kind of the earliest years… he is extremely consistent in the way he tells the story of Facebook. From the first interview, or first or third interview maybe, in the first year… all the way up through, he still told the same stories in front of Congress. You know, it made sense when he was 19 or 20 or 21 and he's in Palo Alto… getting interviewed for the second or third time, saying, "Well, I started this company in my dorm room at Harvard and, you know, this is a small little thing with my friends and now it's really grown to, you know, 30 schools" or whatever it was. But he tells the same story in front of Congress: "I started this company in my dorm room," and it's like, you're a trillion dollar corporation, I'm not sure why we're still talking about your dorm room.
So he's consistent with that. He just plays out this, you know, this situation: I was able to pull it off and this startup has grown like that…
… has grown like that and now we connect people! There's kind of a magic to it. Like it's still the garage myth.
Right. It's his version of, of the Steve Jobs' garage.
Right. Or, you know, or Bill Gates. I don't, I don't remember…
Sure, we'll have the origin story for the myth, yeah.
… but why the origin story comes out as part of answering when being asked to be accountable for producing a platform that was used by foreign governments to influence the election… seems a little disconnected to me. I think another thing that I see a real consistency on, that I've been thinking about a lot after having watched all this footage for months and months, is he was extremely proud, all the way up until after the election. He was extremely proud and bragged over and over about the ratio of having very few engineers at Facebook compared to having so many users. It was something he regularly would say: "Look, we only have, you know, a thousand engineers and we've got 300 million users. Who else is doing that?"… you know, kind of thing. Now, of course, he's put in the position of talking about all the people that he's hiring… to deal with all, I mean he says it in front of Congress: "I'm hiring 10,000 more people to do this, 20,000 more people to do that," you know, adding, adding, adding. All of a sudden that ratio that was really important to him… became a liability in how he talks about Facebook.
So he had to reverse that part of the story of the framework.
Another thing that I saw was there's a progression of spreading. So he starts at Harvard… and then it opens up to a few more elite universities, and then eventually it starts opening up to all universities, and then eventually it opens up to everybody. But pretty quickly a lot of people get on Facebook and then maybe there aren't so many people ready to just connect. So it starts to become this mission of connecting the world and getting out in the world. And that's when you hear him speak the word billion more and more… in the film, because it starts to be an argument about, well, we may have 2 billion people connected, but there's still 5 billion people left in the world who aren't connected and we need to get those people connected too.
so he starts going out, you know, around the world and going to Africa, going to Europe, going to Asia, and going-
To Myanmar [laughs].
Well, I've seen one there [crosstalk]. But, you know, with this, this is kind of the emergence of his internet.org-
… initiative in 2012 or 2013, where he starts imagining these drones that could fly over, you know, less developed… areas and kind of distribute internet connectivity to everyone, as long as, you know, it all comes through Facebook.
And when I watched the film, maybe this is a consistency, but there's a consistency in a certain way, right? It's a consistency of continually wanting to add… and kind of the way he ends up going out around the world, spreading the ideology of connection: you need to be connected, you could gain so much if you were connected to the network, we need to add these people and get them on the network. So yeah, I mean Order of Magnitude for me, first of all, it was this exercise in listening to and watching everything Mark Zuckerberg has said over and over again for his entire career.
And I get that. … I get that reaction a lot and I find myself a little worried about myself occasionally when I admit that I don't really, I became more accustomed to it and it's almost like I feel like maybe Mark and I, we're not friends- … but of course-
Yeah. Yeah. But there is a connection at this point-
… in the way of like in the understanding of the thinking.
Yeah. You know, I literally have become so attuned to the particular sing-songy nature of his voice and… the way he speaks and the words he uses. And there's a very kind of robotic nature to it sometimes that people made fun of.
Yeah. He said that, "When I was human."
Right [laughing]. But I think the thing that's kind of played out, what that work has done for myself, is it's really gotten me to think more broadly about how the language we use to describe our goals and aspirations… drives how we evaluate what we've done and what we plan for the future, and our relationship to other humans and other systems. And, you know, Zuckerberg is not a popular CEO like a Steve Jobs was… or even a Jeff Bezos at times or a Bill Gates at times… but he's a dominant CEO of one of the world's largest corporations. And I'm interested in how the piece can be a lens on the way that these companies grow and kind of what that says about what we're all doing as a society, I guess. The fetishization of growth, the desire for more, the attention we pay to using numbers as argument.
Mm-hmm [affirmative]. How many hours of material have you gone through? Because 43 minutes of a supercut means at least 10 times that, at the very least.
Yeah. So I should have done some metrics on it for you. I didn't.
Tell me more. How much did you, well, you actually took something and then you distilled it-
… so you've had the opposite of growth.
I did. I did. So, I mean, it's hundreds of videos.
Okay. I think some of those came from these Zuckerberg Files. Yeah.
So I used… yeah, I use the Zuckerberg Files primarily for transcripts.
So the Zuckerberg Files, which comes out of the University of Wisconsin-Milwaukee, is an attempt to kind of collect all of his public utterances in any medium. They have collected a lot of his videos, but the videos they've collected end up being low-res copies. They're kind of the smallest, lowest-res you could grab… and make an archive of. So what they've done that was most useful for me was they commissioned transcripts of a lot of it. So for the ones they have commissioned transcripts for, I was able to use the transcripts to find-
Yeah. Quick search, huh?
As, as a, as a start.
Yeah. Yeah. Yeah.
You know, I mean the time codes on those transcripts are, say, once every two minutes. So for an hour-and-a-half-long video… it's still-
… a crazy amount of work to go through. You know, thinking about how much I had to look at, another big part of the effort was how much work it takes to go and find a good copy of these videos.
Right. I remember you were looking for some of those and only screenshots were available within an article on, I don't remember what it was, Fast Company or Financial Times, whatever it was.
So sometimes you would just go on Twitter and you were like, "Help [laughs]. This is what he said, this is kind of when he said it. Anybody?"
This is a screenshot.
It's a screenshot. And I remember I tried for like 10 or 20 minutes, I was like, "I am going to try to find this one," and I did not succeed. So I was like, "Oh, I'm just going to give up" [crosstalk] [laughs].
Yeah. No, I mean, the best example I'd say, or a great example, is that Zuckerberg's first keynote at the first Facebook developer conference in 2007… is nowhere on an English language website. And this is a video that clearly was produced by Facebook. And the reason I say that is because it's a multi-camera production… and it's integrated seamlessly with a feed from his slides. So it seems like something that Facebook produced.
Yeah. Only people internally would have the resources to create it.
And it's nowhere online.
You weren't able to find it?
I got it on a Chinese language website, using a tip from a filmmaker friend, Elena Rossini… she's an Italian filmmaker… living in Paris. She told me: what you want to do is take words that might describe the video and translate them into the foreign language and then use those terms-
This is a very smart idea.
… to search for it on, like, Chinese, you know, YouTube clones.
And so I found a really crappy-
… copy of that video and when I watched the video, you can see like Zuckerberg's young 20-year-old smugness is just kind of oozing from his pores.
And that's my best guess about why they have scrapped it.
They had to, like, remove it… Okay. They took it off. Yeah. Well, he used to have, I don't think it's a rumor, I think people have it. He used to have two business cards. One was like, "Facebook CEO" and one was like, "Facebook, CEO, bitch!"
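The search trick described above, translating descriptive keywords into the target language and querying local video platforms, can be sketched roughly like this; the translation table and search URL below are illustrative placeholders, not real terms or sites from the conversation:

```javascript
// Sketch of the keyword-translation search trick: map descriptive
// English terms to translated equivalents, then build search URLs
// for foreign-language video platforms. The mapping and base URL
// below are illustrative assumptions.
function buildForeignSearchUrls(keywords, translations, searchBases) {
  // Translate each keyword, falling back to the original term
  // (names and years often need no translation).
  const translated = keywords.map((k) => translations[k] || k);
  const query = encodeURIComponent(translated.join(" "));
  // Produce one search URL per candidate video site.
  return searchBases.map((base) => `${base}${query}`);
}

// Hypothetical usage: hunting for the 2007 keynote on a Chinese video site.
const urls = buildForeignSearchUrls(
  ["Zuckerberg", "keynote", "2007"],
  { Zuckerberg: "扎克伯格", keynote: "主题演讲" },
  ["https://video.example.cn/search?q="]
);
```

Each resulting URL would then be opened and checked by hand, which mirrors the manual digging Ben describes.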
[laughs] that was like-
He used to have that.
I'm CEO bitch!
I'm CEO, bitch. Yes, it was something like that.
He's just a beer pong bro dude [laughs].
Totally. Totally. So, you know, in another case, the one I think you saw me post on Twitter about, the presentation he gave at the launch of Facebook Beacon, which was… his earliest attempt at an advertising platform-
… you know, that was widely disparaged and that they abandoned within a year… I've never found it. I could never find it…
Yeah, that's, I think, that's the one that I was looking for. And I tried, I honestly said, "Oh, I'm smart. I'll do it." My ego played over that. And I was like, "I'll find it." And then I was like, "Oh, hey, I really can't find it." I ended up looking for public FTP backups-
Mm-hmm [affirmative]. Nice, yeah.
… and those kinds of things. And at some point I even tried some regex expressions [laughs].
Yeah. But it didn't work.
So it was like I failed. But, yeah, hopefully we'll find it [laughs].
I'll augment the piece if we find it.
Yeah. Yeah. [laughs].
I mean, the larger point I think it reveals is that these are significant moments in how this moment in history is playing out.
Mm-hmm [affirmative]. For sure.
And as with all of the other humanistic data that's getting recorded by computational systems through platforms like Facebook and Twitter, all this data is in the hands of private corporations… and this is public history. So the fact that I can't find a copy of that video, or the fact that another video I can only find through that kind of weird circuitous route-
… because the corporation has decided nobody should get to see it.
Yeah, it's anthropologically bad-
… and socially bad.
So let's archive everything but in decentralized systems [laughing].
Yes it is. Exactly.
Never like that.
Okay. No, it's a really interesting video. I have not watched it all yet. So I'll also watch it.
Yeah, let me know if you are able to watch it all. So it's funny, you know, some people said: "I can't watch it. I tried five minutes and I couldn't do it." And-
I think I'm perverse enough [laughs] to watch.
So my recommendation is that people do it with a friend or with a group.
So I've shown it in group kind of settings, you know, and not just on the laptop but on a big screen. And there is a lot of humor I think that's kind of spread throughout the piece.
And that comes out more in those group settings.
When is a tool helping and empowering a person? What are the characteristics? When is a tool good for you, and when is a tool not good for you anymore?
I mean for me it comes down to a question of agency, I guess.
So trying to think about, do I as a user of a tool have agency over what this tool does, how it works-
… what it makes possible and what it doesn't make possible? A tool like Facebook, you know, Zuckerberg would say early on, it's just about making the world more open and connected, right? It's about facilitating communication-
… and for the longest time, and in some ways they still talk about it as a neutral platform… but of course we've already talked about ways-
… in which the design of that system is not neutral, promotes certain ways of looking at things. It's fairly unavoidable for a tool, especially a technological tool, not to have some ways of thinking embedded in it-
Mm-hmm [affirmative]. Mm-hmm [affirmative].
… that people should be connected across the world, and these ways are embedded in many of the tools that we use, for example. In some cases those ways of thinking maybe aren't that hard to see. But I think the more complicated a tool becomes… and the more things it does… the easier it is for some of those ideologies to kind of hide out in the cracks and crevices.
Google, I talk about this with my students a lot, right? We think of Google as providing access to the world's information, and that in fact has been their motto or, you know-
… at some point, maybe still is, organizing the world's information.
Yeah. Organizing the world's information, yeah [laughs].
Right. Which makes it easy for someone to think, "Oh, well, if I can't find it in Google, then it must not exist. Or nobody's talked about it, or it's not out there."
But of course, Google isn't looking at everything. It's looking at the things that it can look at-
In the way they want to look at it.
Yeah. In the way it wants to, and it preferences certain kinds of information over others. And somebody designed the algorithm… to work in those ways. So one way of looking at this is algorithmic transparency. Some people push for transparency in algorithms as a way of increasing the potential for someone to have an understanding of the implications of the system they're using. But the truth is, in a lot of cases, the people making these systems don't even necessarily know exactly how they work [laughs]. With artificial intelligence, deep learning, neural networks, these kinds of new approaches, we have general understandings… but exactly how it does what it does is complicated and can be hard to ascertain. So if the maker of a tool doesn't even know exactly how it works, then of course the user of the tool really doesn't know exactly how it works. The less we understand how something works, the more easily we might be influenced by the tool in ways we aren't aware of or trying to watch for.
Mm-hmm [affirmative]. When something gets too complicated, it might be harder to discern when it stops being useful for us, when we're using the tool versus the tool using us [laughs]. Yeah.
I mean, I think that's… Yeah.
I mean that's a big part of it.
And just a way in which code-based technologies, software in particular, I think have accelerated… the distance between the user and the creator of the system.
Most of the times I can't look at the code.
So that is coherent with what you were saying, because especially for technology where you can't look at the code, you cannot see how it works. So you do not know the effects, or how it was originally built, or why it was built, and it adds a layer of opacity to it. But would transparency be a solution? If Facebook's source code was open, do you think people would care?
Hmm. I mean some people would-
Some people would.
I do think what it would enable, the average user of course isn't going to go look at the Facebook source code and glean something… particularly useful from it if they don't have some experience with code. And even then, depending on how willing they are to dig and-
… how much research they are willing to do, but it would open up the systems to scrutiny by those who do understand it… and who might have motivation to explain it to people who don't.
Mm-hmm [affirmative]. So some transparency could add value-
Sure. Well, and, and transparency would also change what they write in the first place.
[laughs] Okay. It's kind of reversing the tool, like, you know, I'm influenced by knowing how many seconds ago my friend published something, so I project myself and I prepare myself to behave in a certain way. So knowing that my code would be scrutinized would make me behave in a certain way… [laughs].
I mean, it's a reverse of… right. Like surveillance, you know, changes what we do.
Yeah. For sure.
And so if Facebook all of a sudden knew that anyone could scrutinize all the source code of Facebook… then that might change, you know, what options they code into the advertising tool, for example- … for how you can target. Part of what I do in my own work is a lot of reading, paying a lot of attention to Silicon Valley culture. Partially because I'm interested in it from an anthropological standpoint, partially because I'm just interested in the technology. And I think a lot of people from that point of view would hear any discussion of enforced transparency around these systems as an extreme assault on the potential for innovation. I think they would think about it as ruining the incentive structure to produce innovation… because if anybody can see the source code, then anybody else can make their own version of Facebook in a different way, and it gets at these kinds of protection-of-IP arguments. And of course on the flip side of that is the open source software movement, which looks at this in a really different way.
Well, one could argue that effective total transparency would also be behavioral, with cameras. So we could also be [laughs], you know, if everything is transparent, whether transactions blockchain-wise or camera-wise, everything is recorded all the time.
And that could also be the downside of total transparency.
Absolutely. I mean the NSA, this is kind of-
… you know, the National Security Agency would love to have everything we've ever done online in order to mine it-
… and run predictive algorithms on it. But there's a difference between personal privacy… and understanding the full implications of a system that is responsible for more social communication than any other technology in the history of humanity. And, you know, I think that's the thing: when I watch everything Zuckerberg has said since he was age 19, it's that he didn't ever really stop to think about the implications of putting everybody into one kind of walled garden… where everybody gets all their news from the same algorithm.
What is some fun technology for you to use now? Some technology that you use and you don't go, "Oh my God, this is"-
"… I need, I need to create a piece of art about this?" [laughing]. You just use it.
I mean it might be, it might be a good knife that you're using it in the kitchen [laughs].
Oh my gosh. That's, yes… It may be an occupational hazard at this point that I have trouble… purely enjoying a piece of technology… without worrying about or trying to critically analyze it.
You were thinking about it so much, I really [laughs] I touched-
You know, it could be a, it could be a light bulb, right? [laughs]-
Yes. Yes. I've tried to think about a good example here, in terms of new technologies… Funny. I mean, I'd say the piece of technology, maybe the software technology… I enjoy the most right now in my life is probably something like Spotify. Yeah.
So you like streaming music.
I like the ability to see or hear something out in the world and then dig deeper… into it.
So that's nice-
So the subscription model works in that way. At the same time, I can't help but be critical of the ways in which it encourages just going all over the place and not necessarily digging deeply.
I'm probably one of the weirdos who use Spotify to listen to albums… as opposed to tracks. So I'm always going like, I find a track I like and I want to listen to the whole album and see what albums that artist has produced. Maybe that is a common thing, I don't know. But I think of a lot of streaming services now as supporting the kind of, you assemble your own playlist, or algorithmically generated playlists based on taste-
I think you're on to something, because the moment that I started to listen again to single albums is when I decided, in 2018, to see what is out there as an MP3 player, and apparently Sony's making great high-definition audio players-
… and that's when I started to get old CDs and kind of started to listen to albums again. But you're right, when I was with Apple Music and Spotify, I was like, "Ooh, this is my playlist for today." [laughs].
Right, right, right.
But you liked the opportunity-
I like the opportunity.
… to, to just like to explore and go like, "Oh, what, I like that. Let me, I can know more about that right now easily."
Okay. That's great.
I also have a record player and a record collection, so that does lead me to purchase albums… once I've decided I like an album. Strangely enough, I find the algorithmic nature of Spotify at least to be almost useless for me. However they profile me, it's not accurate [laughing]. I put on the, you know, tracks-for-you or discovery playlists, whatever it is, I don't remember the names of them… and just two or three tracks in I'm like -
You don't get me algorithm, you don't understand [laughs].
That's right. Yeah.
So you enjoy that. Yeah. There are those kinds of innovations. Like, for example, as long as it doesn't send your data all over, having a thermostat that can control the home, I think that's a very good piece of innovation [laughs].
Yeah. Yeah. Sure, sure.
Yeah. You know, as long as it doesn't also become a piece of data about how much people warm up their homes, and how much can I sell them because they warm up this much [laughs].
Exactly. High-definition video, I mean, these kinds of things. Certainly, I'm kind of overly sensitive to resolution and… fidelity and these kinds of things. So-
You are the tech guy, you were saying [laughs].
Yeah. And I think the artist too. It's-
Yeah, that's true. You want to see better.
I want to see more [laughs].
So as an artist, what matters to you to transmit to other people?
I think it's that we are embedded in a series of systems that are really hard to see and really hard to be aware of. It takes a certain level of criticality about our experience, about our environment, to be able to see any of it at all. And when it comes to technology, we're not just talking about things that are designed to push us in certain ways, I mean they're pushing us in ways that nobody intended. So-
So there is not a conscious evolution of the systems to push you in a certain way.
The systems push themselves in certain ways at this point. If you design a database to acquire and catalog information, the database itself becomes more useful the more information it has. And so there's a way in which I would say the database now wants more all by itself.
And yeah, and the more it has the more it adjusts for optimization.
This is a little bit of a cybernetic situation [laughs].
Sure, and maybe a Latourian kind of network thing too. I think about metrics very specifically in this way. So to isolate it to something a little more concrete: to put a count of the number of likes on an interface, on a piece of content in a media feed, I would argue that the metric itself wants to be larger. Yeah, I want it to be larger because it reflects on my sociality. But the metric itself is also perhaps performing better, or doing more of what it can, if it can continually grow.
Isn't that ego?
How so? Or which part?
The, the metric that pushes itself to be more.
You mean, does the metric have its own ego?
Is it an expression of ego? You can isolate it in nature, but, yeah, I mean, yes. Isn't it a particle of ego?
I think that's a reasonable way to think about it.
How do you be the best metric there is?
Like if I'm a metric, do I want to be the metric that got consigned to only counting the likes on a single post? Or do I want to be the metric that counts Katy Perry's followers on Twitter? I mean-
I want to have a show now on, on TV. It's just called Metrics. It's Metrics [laughing].
Just having conversations! That's it!
Like metrics come home at the end of the day.
Like, I only went up to 3 today and I'm really unhappy, you know-
Yeah. A radio drama Metrics [laughs].
That's right. Your…
Confessions of an algorithm.
I had a great day, man. I got assigned to this post and it just went viral and [laughing]-
Metrics. I don't know which network [laughs]. You want to make people aware of where they are and what they do.
Make people aware isn't the way I would say it.
The way I would say it is I want to give people the opportunity to examine their own interactions with technology.
I want to, I write code, I write software to investigate the cultural effects of software.
But what I do with that most of the time is I write software for other people to… to download and then to experience their everyday technological experience in a tweaked way, in a slightly different way… so that they can start to get a lens on what that software might be doing for them, against them, with them, et cetera.
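A minimal sketch of the kind of tweak he's describing, in the spirit of his Facebook Demetricator, might strip the counts from metric labels so the activity stays visible but the number does not. The function and label formats here are illustrative assumptions, not his actual code:

```javascript
// Demetrication sketch: remove the leading count from interface labels
// like "1,234 likes" or "3.2K shares", leaving only the activity word.
// A real browser-extension version would apply this to DOM text nodes.
function demetricate(label) {
  // Strip leading digits (with "," or "." separators) plus an optional
  // abbreviation suffix like "K" or "M", then trim leftover whitespace.
  return label.replace(/^[\d.,]+\s*[KkMm]?\s*/, "").trim();
}

demetricate("1,234 likes");  // → "likes"
demetricate("3.2K shares");  // → "shares"
demetricate("87 comments");  // → "comments"
```

Because the pattern is anchored to a leading number, non-metric labels such as "Write a comment" pass through unchanged.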
In all of this technology, you put a big emphasis on the emotional and human part. Do you believe in anything that you can't explain?
Boy, that's a good question. I don't, I don't sit around with a list of things that I think or I believe in that I can't explain, but I don't have any problem talking about there being plenty of things I can't explain.
Ah. What is one thing you can't explain, the most mysterious to you?
Oh boy. that's a hard question too, it's not a question I sit around thinking about.
Mm-hmm [affirmative]… That's fine.
Yeah. I'll give you an answer. It certainly isn't the most qualified-
Don't, don't, don't, don't, don't worry about it.
Maybe I did it in the wrong way… Maybe I…
No, no, no.
Don't worry about what I want to hear.
I certainly do think about, I mean it gets at this thing of the agency of technology or technological systems themselves. So the ways in which technology does things we don't expect and we can't explain. When you assemble complex systems, there are some ways in which they start to take on a life of their own. I see you smiling, and I feel like a lot of people could say, you know, have some personal experience of, "My computer did this, or my phone did that. I don't understand why it would do that." And we kind of often are dismissive of that… moment as a bug or a glitch or something like that. I guess I'm a little bit more intrigued by those moments with technology where they do something that isn't what we thought it would do or could do or should do. And that's what gets me to think about and talk about systems, code, metrics.
That's a very romantic view of it, in a literary meaning [laughs]. Like it's, romanticistic [laughs].
Yeah. I suppose so.
So one thing you can't explain is glitches, but you just don't necessarily think glitches are a dismissible error.
Well, I guess I would say it's not that I can't explain glitches, it's that I question whether a glitch is a glitch.
Okay. What is a glitch if it's not a glitch?
The effect of a system exerting agency. Yeah. I love to think about, what if that isn't just an error? What if it isn't just a flipped bit somewhere because of the physics of computational memory systems? What if that's an expression of intentionality? Or, yes. Something, yeah, something like that.
So deja vu is not a deja vu is not a deja vu?
Well, yeah. I mean it's the sign of when they change something in the matrix.
Mm-hmm [affirmative]. That's what it is. That's when the computer doesn't start. Right?
Fast forward, I don't know how old you are, I don't care, you're in a range close to me. Fast forward seven years, 200 years, you look back and you go, "This is what really mattered to me." What does really matter to you? What do you want to look back at and think, "I'm glad that I knew that this really mattered to me"?
Personal relationships… really matter to me. My relationship with my spouse, my relationship with my cat… friends and colleagues. I would say the other thing that really matters to me is the ability, or, maybe this is the way to put it: listening to my own desire to work through questions in artistic ways. To find myself asking a question and then to take an unexpected… artistic route to try to answer that question, especially when I've no idea if the thing I'm doing is going to produce any kind of answer at all.
[laughs] How do you live with that? Do you like that? That moment of exploration, of the unknown.
Well, just to be clear, you asked me what would I like looking back [laughs].
Sometimes I'm in the middle of it, it drives me nuts-
… but I guess I, I've developed a comfort in that space of uncertainty.
So you will recognize the value of that. Like even if it drives you nuts in the moment, you recognize that that has value.
I, yeah, I guess… I guess I've done it enough times now that I know some percentage of them… I will look back on and it's not just that I'm happy I did that or something. It's more like it, it's set up the conditions for change… in some way or another.
… for, for understanding the system differently or understanding a, a relation differently.
Mm-hmm [affirmative]. What matters to you now?
I mean, what matters to me now is having time and space to think.
Mm-hmm [affirmative]. Okay… That's what you want now?
That's, yeah, you know, also, I teach at a university, I'm a faculty member. So summer's the time we get to do… that kind of thinking. And we're in summer right now, so maybe that's… part of it.
So all your projects are on your website?
Bengrosser.com. But you're also on social networks. You are on Twitter.
I am on Twitter, it's @bengrosser. Yeah.
And, and, your latest, work, Order of Magnitude is on Vimeo.
Is on Vimeo, yeah.
Okay. The supercut is on Vimeo.
Thank you so much.
Yeah, that was a pleasure.