President’s Speaker Series 2018-19: Safiya U. Noble, Ph.D.

August 20, 2019 | By Stanley Isaacs


[ Music ]>>Hello, good afternoon. I am Cal State Monterey Bay
President Eduardo Ochoa, and I’m happy to welcome
you to the first event in this year’s President’s
Speaker Series. I would like to begin by thanking the California
Faculty Association and the Otter Cross
Cultural Center for cosponsoring today’s event. I greatly appreciate
their involvement. Please join me in a round of
applause for these two groups. [ Applause ] The theme for this year’s President’s Speaker Series is technology and its impact. Through these events we will examine the ways the internet and related technologies
have changed how we interact with our everyday environment
and with one another. Perhaps, the most ubiquitous of
these technologies is Google, the search engine that
overwhelms all others when it comes to market share. Google and its mysterious
algorithms that guide the search functions
have an unmatched power to shape our perceptions
of the world around us. And, considering today’s
topic I was curious to know how long we have
been googling things. So, of course, I googled it. It turns out the verb to google
was added to the Oxford English and Merriam-Webster
collegiate dictionaries in 2006. So, many of today’s college
students have been googling things pretty much since
they’ve learned how to read. And, if you’re looking
for a fact, anything from how long you
need to cook a hardboiled egg to which sides fought
the Punic Wars, Google is an invaluable tool. But, the searches, of
course, go far beyond quests for recipes or historical facts. As today’s speaker,
Dr. Safiya Umoja Noble, writes in her new book “Algorithms of Oppression: How Search Engines Reinforce Racism,” and I quote, “We are
increasingly being acculturated to the notion that
digital technologies, particularly search engines,
can give us better information than other human beings can. People will often take
complex questions to the web and do a Google search rather
than going to the library or taking a class
on the subject. The idea is that an answer
can be found in 0.3 seconds to questions that have been
debated for thousands of years.” Dr. Noble shows us that Google and other search engines
are not value-neutral tools. Their results are
shaped by humans who create the algorithms, by businesses promoting
their products, and by political operatives
whose goals are anything but value-neutral. The impact of the political
bots that spread their messages through Google, Facebook,
Twitter, and other social media during
the last presidential campaign is only now coming into focus. Meanwhile, the flaws exposed by these misleading messages are
only beginning to be addressed. A reviewer in the New York
Journal of Books wrote, “Algorithms of Oppression is a
wakeup call to bring awareness to the biases of the internet and should motivate
all concerned people to ask why those biases
exist and who they benefit.” Dr. Noble is an associate
professor at UCLA in the Departments of
Information Studies and African American Studies
and a visiting faculty member at the University of Southern California’s Annenberg School for Communication. In 2019 she will join the
Oxford Internet Institute at the University of Oxford
as a senior research fellow. And, she’s also a CSU
graduate from Fresno State. We’re pleased that she
was able to make time in her busy schedule
to talk with us today. Following her presentation, we will hold a question
and answer period. Please write any questions you
have on the cards provided, and CSUMB staff members
circulating through the audience will
bring them up to the stage. Now, please join me in
welcoming Dr. Safiya Noble. [ Applause ]>>Thank you so much, Dr. Ochoa. I appreciate you. Hi. Everybody just
wandered in from the rain, just didn’t have anywhere
else to be except to be dry. Thank you so much
for coming today. I am really pleased to be
here, and I have to say that it’s really a thrill to — whenever I get to
speak at a CSU campus, because I’m a graduate
of Fresno State. I also went to San
Jose State for a while, and this is fine,
just leave that. But, I remember that when
I was at Fresno State, I was really involved
in the campus politics and campus activism but
also in student government and was very active with the ASI and then eventually the
California Statewide Student Association, which is
really where I learned to just love the California
Faculty Association. And, so I’m really truly
grateful to have the invitation by Dr. Ochoa and the CFA and
the Cross Cultural Center, because I know the work of the CSU, the way in which world-class scholars come to both work and teach here, and also the kinds of students that come to the CSU. They’re unparalleled,
and I don’t just say that because I was one of them, but I say that because
I know what it takes to get a degree in
the CSU system. This is not an easy place, and
also, you know, you never know where you’re going to end up
when you finish your schooling. And, so I would just
say to the students, to the undergraduate students here today: don’t underestimate that graduating from CSU Monterey Bay might take you all over the world. And, I certainly know that when
I was a student at Fresno State, I couldn’t have ever imagined that I would have the
career that I’ve had. I was really just
trying to figure out how do I pass these classes
and get out of here, and, you know, it all worked out. So, that’s really — that’s
what I’m here to tell you. All right. Let’s talk a bit, thank you,
about Algorithms of Oppression. This isn’t my latest
book, and, really, it’s just so nerve-wrecking
to think about giving a talk about Google here so
close to the valley. So, I’m going to need
security at the doors just to make sure we can get through
the next couple of hours, and it’s going to be fine. I will tell you that, you know,
when I first started writing about search engine bias or
technology bias, you know, the bias of digital
media systems, a lot of people were not
interested in hearing about it. In fact, this is another great
thing about being a student and then, you know,
growing up and graduating is that you will be vindicated
on a lot of great ideas that you have right now. And, when I was writing
about this, you know, almost ten years ago, I remember
a couple of key conversations. The first one was
when I was trying to get my dissertation
committee to approve me talking about the way in which
Google biases information. And, it was difficult,
because a lot of people back in those days, this is,
like, almost ten years ago, said it’s impossible for
technology to hold any type of bias or value system,
because it’s just math. It’s just computer science, and math is subjective,
and it’s neutral. And, of course, now
fast-forward, many of us have been doing
research and trying to document, and I’m going to share with you
some of the evidence that shows, in fact, that there’s
a whole host of values that get imbued into
these systems. The other thing that
was interesting, when I started writing the
book and I got a contract with New York University
Press for this book, I said — I argued, and I said I want to
have the book called Algorithms of Oppression, and it was
really important to me. And, my editor, Eileen
Keller, she’s amazing. She said, “Safiya,
there’s no way, because nobody knows
what an algorithm is.” So, here we are. All right, and I will say that,
you know, it’s amazing to see, like, my father-in-law
say, you know, “Hey, Saf, what’s happening with
these algorithms?” You know, I mean just these
kinds of things, and, you know, I think part of the reason why
we’re talking about algorithms is because certainly journalists
have done a great job over the last few years
of translating the work of researchers, my
own research included. I loved that, you
know, that, you know, 2017 was the year we fell
out of love with algorithms, and I definitely am trying to
take credit for some of that. And, part of that is,
because we’ve started to see the unravelings or maybe
a little bit of the logics or — and the slippage and the
seepage of some of the ideas that have been programmed
and that we have been kind of encouraged to trust. And, certainly we are in the
midst of what I would call at tech clash where, you
know, the people are starting to use — see, whether it’s
Cambridge Analytica, which, you know, I think for those
of us who’ve been working in the field of digital media
studies for a while, you know, we’re kind of surprised
that the public wasn’t aware that Cambridge Analytica
is just one of many, many, many companies upon which the entire commercial internet is founded, which is about
the kind of buying and selling and trading of data about us. We are the products in these
platforms, and so I’m going to talk a little bit about
that maybe toward the end. But, let’s kind of, you know,
think about what does it mean to interrogate an algorithm. So, an algorithm, just
for definitional purposes so we’re all kind of level
set on what that is, you know, the way I characterize it in
my work is that it’s, you know, it’s a set of instructions. It’s a decision tree. It’s a — if these
conditions are present, then these decisions should
be kind of automated, and, you know, we think of them
as being kind of black boxed and hard to understand,
and in some cases they are, certainly in the case of Google. Google’s own reporting about its algorithms for its search technologies says that there are 200 different factors, for example, that go into how it
automates decision-making. But, you know, as I talk
today, I’m going to be talking about algorithms, and
I’m going to be talking about artificial intelligence. But, what you can hear
in that place is kind of automated decision-making. How does automated
decision-making happen? How does it get encoded,
and what is it saying? And, how we can best, I think,
understand algorithms is to look at their output. That tells us a lot about
what kinds of values and ideas have been
programmed into them.
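To make that definition concrete, here is a minimal sketch, in Python, of the kind of automated decision-making being described. Every signal, weight, and threshold below is invented purely for illustration (none of it is Google’s actual logic); the point is that whoever writes the conditions encodes a set of values, and those values surface in the output.

```python
# A hypothetical ranking rule, invented for illustration only.
# It shows how "it's just math" can quietly encode value judgments.

def rank_score(result):
    """Return a relevance score for one search result."""
    score = result["keyword_match"]        # looks like neutral math...
    if result["is_paying_advertiser"]:     # ...but this branch decides
        score *= 1.5                       # whose interests get amplified
    if result["click_popularity"] > 0.8:   # popularity is not credibility
        score *= 1.2
    return score

results = [
    {"name": "library archive", "keyword_match": 0.9,
     "is_paying_advertiser": False, "click_popularity": 0.3},
    {"name": "sponsored page", "keyword_match": 0.6,
     "is_paying_advertiser": True, "click_popularity": 0.9},
]

# The weaker keyword match wins because of the advertiser branch:
# the output reveals the values programmed into the rule.
for r in sorted(results, key=rank_score, reverse=True):
    print(r["name"], round(rank_score(r), 2))
# sponsored page 1.08
# library archive 0.9
```

Looking at the output, as she suggests, is how you catch this: nothing in the arithmetic announces a bias, but the ranking does.

All right, so here’s one of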
our first kind of examples. In the book I have dozens
and dozens of examples. I’m not going to get through all
of them today, but I am going to give you kind of a little bit
of a taste of some of the kinds of things that I captured. This was a tweet that went
out by DeRay Mckesson. Some of you might know DeRay. He became well-known
around the Ferguson uprising after the death of Mike Brown, and he was very active
on Twitter. Some people associate him with
the Black Lives Matter movement, but, really, he’s just kind
of a Twitter personality. And, DeRay, one day — also,
DeRay became very famous on Twitter, because
Beyoncé followed him. So, I always feel
obliged to just say if anyone here knows
Beyoncé and you could get her to follow me, it would really
amplify the work, okay? Just for the work. All right, so DeRay tweets out, “If you google map the N-word
house, this is what you’ll find. America.” And, what was
happening at that time is that if you went into
Google Maps and you typed in the N-word house
or the N-word king, Google Maps during — this is
during the Obama administration — would take you
to the White House. And, this story went
viral fairly quickly, and within 24 hours or so
the Google spokesperson says, “Some inappropriate results
are surfacing in Google Maps that should not be,
and we apologize for any offense this
may have caused. Our teams are working to
fix this issue quickly.” Now, there are a couple of things
that are happening here in this kind of apology
non-apology. The first is I’ll just say that,
you know, when my husband says, “I’m sorry if you
were offended.” I don’t actually
feel apologized to. So, there’s an interesting kind of non-apology apology
happening. But, more importantly,
the more important part of the statement is that
our teams are working to fix this issue quickly. And, this is one of the ways
that large tech companies, not just Google, certainly
Snap and others respond when this kind of output
happens in their platforms, which is to recharacterize
or recast the problem as a technical glitch
in the system that otherwise operates
perfectly. And, this is one of the things
that I think is a little bit of a sleight of hand that
happens by large tech companies when these kinds
of things happen. Now, here was a great story. Jessica Guynn who is a wonderful
tech reporter for USA Today. You should definitely follow her
reporting on the tech sector. She carries the story
here about Kabir Ali. Now, Kabir Ali was a teenager
who had a friend video him on his phone, and he
types into Google Images “three black teenagers”. And, then he says,
“Yo, look at this. Let’s see what happens
when we change one word, black to white.” And, so you can see here
that when he does this search on three black teenagers,
we get mostly kind of these criminal
mugshot kinds of images of African-American teenagers. But, then when we change
the word black to white, when he changes it,
he gets, like, some interesting complex
maybe Getty Images of white teenagers
playing, I think, multiple sports at one time. It’s confusing. It’s just not really like a
sports picture, but, you know, we’ve got these, like, perfectly
curated kinds of white teenagers on a white background,
because many of us know that the white background
follows us around, and we’re ready for
that perfect shot. So, this is an interesting
set of images, and it also, again, it’s a seepage. It’s some way for us to see
into, like, some of the logics or ideas that are behind what
the automated decision-making is or what the automated
output is in these systems. Now, this story went
viral on Twitter, which is both a blessing and a curse for anybody who ever goes viral on Twitter. The next day, unlike in
the case of the White House and the mapping to the
Obama administration and the White House, Google actually just subtly
changes the algorithm. And, here’s Bob’s
Burgers guy who sees that when he does the
search, now we have a picture of a young white man who is, you know, in a criminal sentencing or some type of court hearing.
in, like, a few pictures of young African-American
women holding a volleyball or maybe, like, a church group. I’m not sure what’s happening,
but here’s an interesting part of the logic that is not
necessarily apparent. The way that the
equalization happens is to put a criminalized
image of a young white man, because that somehow maps onto
the truth or the truthiness of the logic of African-American
teenagers being criminals. So, this disruption is, again, predicated upon some
interesting values. What you don’t see and it
may not be obvious to you is that this young man is
Deryl Paul Dedmon, and he was sentenced to 50 years in a federal prison after pleading guilty to the capital murder of James Craig Anderson in
Jackson, Mississippi, in 2011. So, here’s a young
white man, the only kind of the characterization
of this criminalization is that he was involved in
this heinous hate crime where he murdered an
African-American man. So, these are some of
the ways in which, again, there are sets of logics
that undergird the kind of results that we get. And, in the book there are
many, many different examples, and people often ask me about,
you know, my relationship with Google and does
Google, you know, call me and ask me what I think
about different things. And, I will just say we have
a weird silent relationship where I think they
watch my talks. Maybe someone’s logged
in right now and watching or they’re going to
watch it tomorrow. I’m not sure what happens, but
sometimes I make suggestions in my talks, and then I
find, a few weeks later, those suggestions
have been implemented. So, maybe in your Q and A we
could make some suggestions that that might get heard. Okay, so there are
other kinds of examples that I might point
to around concepts. For example, for many
years, if you did a search on the word beautiful,
you would get, again, images almost exclusively of
white women who were very thin, kind of a hegemonic beauty
standard if you will in lingerie or bikinis and this
kind of thing. And, that was the proxy
for the word beautiful. Now, I have to tell
you that when I think of the word beautiful and I’m
going to go to Google Images and do a search on beautiful,
I think we might get the ocean, like a beach and nature scene. I mean, because that seems to be a more universally
understood conception of something that might be
beautiful, not, you know, like, almost naked white women. So, there’s like —
and you don’t have to add almost naked white
women to beautiful, you know, but that still might
be what you get. For a long time I
would give that example when I would give book
talks, and then — now when you search for
beautiful, you get nature. So, I — this is
what I’m saying. You know what I mean? I don’t know what we’re doing. All right, so this is the — this is actually the search
that started the study, and in 2010 I was, 2009,
2010, I was kind of thinking about how could I talk
about the political economy of the internet. And, so what does that mean? How can I talk about the way in
which capital or money resources and policy come into practice
in large digital media systems that really benefit
the most powerful? And, that’s what I was
thinking about with respect to the internet, and so,
you know, I had this search that I had done at the
prompting of a colleague. And, then I had to kind
of repeat it again, thinking about my daughter,
you know, my stepdaughter and my nieces, and that search
was on the keyword black girls. Now, I assumed that, you
know, I had been a black girl. I’m a black woman
now, but I, you know, really didn’t expect what I
found back in 2010 and 2011. In 2000 — okay, this is also — I’m sorry if there are
any children present. This is a time you might
want to do just, like, do a cover if there’s any young
people here, although parents, you don’t even know
what they’re seeing. So, I — this isn’t
going to shock them. All right. So, in 2011, when
you did a search on the keywords black girls, the first hit was
sugaryblackpussy.com, and this was stunning to
me, because you didn’t have to add the words porn. You didn’t have to
add the word sex. Black girls, black
children, black teenagers, adolescents were
synonymous with pornography, and this is really what launched
the study that became the book. For many years, black girls were
synonymous with pornography, and to this day, still if
you search on Asian girls and Latina girls,
you are most likely to get hypersexualized
content or pornography. So, it’s really not
faring much better. What I found is that girls
of color, who I consider to be a very vulnerable group
of people in our society, who don’t have a lot of money,
right, and don’t have a lot in terms of like numerical
majorities in the United States, were not able to
control the ways in which they’re represented. And, to me this was the —
kind of the launch to thinking about a whole host of
questions like what does it mean when you are in a
numerical minority, let’s say an ethnic minority
in the United States. And, also, what does it mean if
you are poor or working-class or you’re a part — you
come from a community that is not generationally
wealthy, how can you control the
narratives and the images and the ideas associated
with you, with your identity or the communities
that you belong to. And, this, to me, is one of
the most important questions that we kind of have at
stake right now is this idea about not just representation
but how power and money and certain types of authority
get to control the narrative, the informational
landscape online. And, this is at a time when,
you know, I’ll tell you, I know this is not like the students here at CSUMB, but I will just say my students
at UCLA and USC say things like, “I could never write
a research paper in college without Google.” I know that’s none of you. I know none of you are writing
your papers out of Google, but these are the
kinds of questions that we have to ask about. Well, what is the authority
mechanism of Google Search? And, one of the things we
know, for example, is that more than 80 percent of people
who use search engines, especially in the mobile
environment, use Google Search. Google Search is the
dominant search engine. People ask me all the time, “Why
don’t you study Bing or Yahoo?” and I’m just like, “Because
no one’s really using them.” So, it’s important to
study the monopoly leader, the company that has
the most control, because everyone
else is really trying to replicate what
they’ve done anyway. So, if you have one search
engine that’s really controlling the information landscape,
this is a place where you have to start asking questions,
especially as it starts to displace other kinds of
information organizations like libraries, schools,
teachers, professors, and other — even parents, who
might have been other types of touchpoints for
expertise and knowledge. And, now, in fact,
what we have are — as President Ochoa shared,
we have people thinking and becoming acculturated
to the idea that one, there is an answer, and it
can be found in 0.3 seconds. And, that really belies the
idea of what a university or what a school
is, quite frankly. I mean, you know, I often tell
my students it’s, like, okay, let me just find a gadget here. Okay, look, here’s some water. All right, here’s
a glass of water. In a university environment,
you know, the design department
might look at this and talk about the conception of the
design, the kind of ergonomics of this, what this means. In a chemistry class
we might talk about this glass of
water differently. In a history class we might
think about the history of clean water and the
politics of clean water, right? There are a whole host of ways that we would conceptualize
this glass of water, which I’m just going to
take a drink of, because, I mean, I got it right on. I can’t just hold
it and not drink it. Okay, so this is, again,
really important when we think about framing and
conceptualizing knowledge and multiple vantage
points and, of course, long histories of expertise. But, what a search
engine does is kind of strip away those
expectations that we would have multiple
points of view, multiple frames of reference to objects, to
communities, to concepts. And, it instead operates as
kind of an authority system or, as [inaudible] might talk
about kind of a symbol — a system of symbols that
confer authority or knowledge. And, this is one of
the reasons why I’m so particularly concerned
about the way that search engines
work in our society. Now, these ideas about black
women and girls as kind of hypersexualized
sexual objects are not new. This doesn’t start with Google. It actually, significantly,
predates Google by several hundred years
and part of the reason — I mean, if you look here,
I’ll just say if you want to learn a great history of
racist stereotypes, particularly in the US context,
the Jim Crow Museum of Racist Memorabilia
is a fantastic resource that you can go to. One of the things there
they talk about is kind of this jezebel stereotype
of black women, and I will just give you a
quick shortcut which is to say that racist stereotypes
arise at different kinds of historical moments. And, they work in service of
particular political projects. So, at the moment, historically, that the Transatlantic
slave trade is outlawed, and we can no longer,
in the United States, import African people
and enslave them legally. The only way in which
the enslaved labor force of African peoples can be
reproduced is by birth, if black women give birth to children who then become the property of their masters. And, so we need, then, the
invention of a racist trope that black women like to have
sex, a lot of it, and like to have a lot of kids. And, that’s how we start to understand and unpack racist stereotypes: they always work in some type of political
service, often against people who are oppressed and
in service of elites. And, so this is a
really important kind of way then of understanding. Well, the pornography
industry, which has more money than anyone just so you know — in fact, you can thank the porn
industry for a lot of the kind of technical digital
interventions, credit card processing, video
compression, all of these kinds of things that really worked in
service of the porn industry. But, we have to look and
see, well, at whose expense does that kind of power come. And, of course, what we find
often is that that comes at the expense of women
and girls of color. Now, you guys know that every
academic has to have one slide that has too many words. That’s it. This is it. This is strictly
for the students. This is — again, if you
are interested academically in thinking about these
kinds of fields of inquiry about all kinds of things that
have to do with the internet, I would just say there
are three key fields that really inform my work. One is this kind of idea
of the social construction of technology, and many
of you who take courses in ethnic studies and
gender studies, sociology, you’re very comfortable
with the idea of kind of the social construction
of gender or the social construction
of race. We also have a field in science and technology studies
that’s concerned with the way in which technologies
are also made by people, and people imbue
their values in them. And, power is an essential
part of those value systems. When I started writing
about Google, really, there were other people
writing about Google, but people were not thinking
from kind of this black feminist or critical race perspective
where we center, again, people of color. And, one of the things that
happens when you use feminist and critical race
theories in your work is that you ask different
questions. So, I started asking
different questions about Google like,
“What does it mean? How are vulnerable people
contending in this kind of space, and what’s
the kind of output? And, what are potential
interventions that we could be
thinking about?” And, then I would
just say, you know, critical information studies
is a really interesting field that you could think
about for graduate school. And, if you’re interested in
that, you can always reach out, and we can talk about that. All right. So, when I started writing
about Google, you know, and black girls and what was happening to girls of color, people were very sympathetic.
happens to the black girls. It’s terrible. But, then when the exact same
mechanisms threw a presidential election, then everybody
got, like, super interested. So, it’s unfortunate that
the practice of gamification of search engines
actually came on the backs of vulnerable people, and
now its influence could’ve mainstreamed politics. Here’s an example of how
this has come to the fore. There were two great
researchers, Epstein and Robertson, who did a
study in 2013, and they argued that democracy was at risk, because manipulating search
engine rankings could shift voters’ preferences
substantially without their awareness. And, basically, what
they found in their study was that the tenor of stories about a political
candidate would determine how people voted. And, after they did the
study and they, you know, they showed a controlled group
of people negative stories about a candidate,
and people said, “I’m not voting for
that person.” And, then they showed
positive results about stories about a candidate,
and people said, “I’m definitely voting
for that person,” because one of the things that’s happening in these spaces is that people are oriented to thinking of search engines as vetting engines, providing credible, curated, thought-through, complex decisions — smarter people, right, have
helped me sift through the trash on the internet to find
the golden nuggets. And, what Epstein and
Robertson argued is that it’s actually money that
influences the kind of stories that occur on the first page
of search engine results. And, of course, the first
page is the most important, because most people don’t
go past the first page of results that they get. Now, this was consistent
with Matthew Hindman, who wrote a book called “The
Myth of Digital Democracy” where he showed that it’s really
political action committees that have the most money. They’re able to influence what
shows up on the first page. Often, the choices that are made
available did not reflect broad participation of web users, but
rather prioritized the interests of media conglomerates
and elites. And, again, this is the kind of
evidence that flies in the face of the idea that what we
find on the first page or in a search engine simply
reflects what’s most popular. And, this, of course, is
one of the prevailing myths about how these systems work, which is that they just
purely reflect back to us what we’re
doing as a collective. And, I just have to say that I
haven’t given up on the public in that way yet,
and I don’t hold such a cynical view
of the public. We certainly know that capital and money play a
significant role in the kinds of things we find. Now, Epstein and Robertson —
it was, like, a foreshadowing in 2013, their study, because
here we have the week following the 2016 US presidential
election, the first result in Google Search when looking for final presidential
election results takes you to a disinformation
site that reports out that Donald Trump
won the popular vote. So, that’s not an
alternative fact. That’s a lie. I’m really — I think we have to
be specific with our language. So, here we have the first
result, and this, to me, is incredibly interesting,
because we wonder why part of the country, who supported
Donald Trump for president and believes in a
certain type of legitimacy of his popular support, is in
sharp contrast to those of us who know better or
know differently. So, this is, again, one
of the kinds of things that I think we have
to ask ourselves about how search engines
legitimate certain ideas and then again how the
public starts to relate. Now, the question has often been
posed to Google and other — and certainly social
media platforms about why don’t they
intervene upon their algorithms or tweak their algorithms to
keep certain kinds of falsehoods or disinformation out of sight. And, there was a great
example of this where often — and by great I mean terrible. An example of this is when
white supremacist, white power organizations have taken over
certain types of keywords. Now, this has been written
about by many people. For example, for many years, if
you searched on the word Jews, the first page of results in Google would take you to Holocaust denial
sites, antisemitic sites that looked kind of as if they
might be legitimate resources. They’re what Jessie Daniels
who wrote a great book called “Cyber Racism” calls
cloaked websites. So, they look like
they’re one thing, but you just scratch the surface and it’s really a white supremacist website. When white nationalists
took over the keywords Boasian
anthropology, there was a kind of a whole, you know,
series of questions about how can this happen
that this key term would be — would lead you to white
supremacist websites. And, the Inquisitr writes, and they kind of capture Gizmodo’s interview. They say when Google
co-founder Sergey Brin was asked about adjusting the algorithm
to prevent such results from reaching the top
levels of search results, he said that would be quote
“bad technology practice”. Sometimes, I cannot read
this without laughing. I mean I’m trying. The Gizmodo article continues. Quote, “An important part of
our values as a company is that we don’t edit
the search results,” Brin said. “What our algorithms produce,
whether we like it or not, are the search results. I think people want to know we
have unbiased search results.” Except for when we’re in Germany and France, where, by law, we have to curate out
antisemitic content. So, you see this is where we
have a really interesting set of contradictions about
what is allowed to circulate on the internet and
what is taken down. Now, we have the foremost
authority, quite frankly, on a practice called commercial content moderation here in the room with us today, my colleague Sarah Roberts, who’s here from UCLA and who looks
completely mortified right now that I’ve called her out. But, she’s a person who has
written an amazing new book called “Behind the Screen”
that’s about the practice of a whole hidden labor force of
hundreds of thousands of people around the world who are
constantly scanning content online and taking down
things that are illegal or that violate the terms of
service for different platforms. And, certainly, Google
and its properties, YouTube in particular, have people who are constantly
looking and making decisions about the kinds of content
that we are able to see. It’s one of the reasons why,
for example, in a chapter that she wrote for a
book that I edited called “The Intersectional Internet”
one of her informants, Max, works for a big company
doing content moderation. And, he reported out. He said, “You know, it was
really interesting to me that I couldn’t understand
what the decisions or the value systems were
and what got to stay up and what we had to take down. You know, we would, for example,
have to take down beheadings or violent images from the
drug wars in Juarez, Mexico, but then we could leave
beheadings in Baghdad.” It begs the question, you
know, why do we take down — why are people taking
down videos of animals being mutilated,
but blackface stays up? And, Max was struggling,
as he reported out in Professor Roberts’s
chapter that these kinds of value decisions were
constantly being thought through and refined. And, he had very little power,
quite frankly, to affect those, because those were policy
decisions being made in the companies
where he worked. So, we know that the — this
new explosion of visibility about these workers
that largely stems from Professor Roberts’s
work has helped us to see that maybe the kinds of things
that we’re engaging with online, maybe it’s not a free for all. Maybe it’s not a free speech
space where anything goes. Maybe, in fact, there are
a whole host of decisions and different power
players who have the ability to have certain types
of content propagate and other content
that comes down. Now, I recently wrote about
this for Time Magazine, and one of the things
that I argued is that tech companies have been
slow to respond to the way that their platforms have
been used to amplify hate. Anonymity, in particular on social media platforms, often makes it difficult to identify the right-wing
radicalization that’s happening to some Americans online,
exposing users to violent and often racist disinformation. So, one of the things that
I argue in the book is that we need new business
practices and policies that address the public harm that’s propagated on these media tech platforms, particularly as bad actors use
these platforms to enact violence on others. And, I think that we
haven’t even begun to scratch the surface of
thinking about what kinds of values are being mediated,
but we certainly know, and we see with the intense
calls for Facebook and Google in particular to be held
to account for the kinds of information that move
through their systems, that this is not
going to be an issue that goes away any time soon. And, this is a really important
time for us to be thinking about algorithms and automated
decision-making systems that really are not
sophisticated enough to recognize certain
types of threats. We often find ourselves,
those of us who do kind of digital media research,
in meetings, in conferences with people who will
say that AI is going to solve these problems, AI is
going to recognize the threats or the disinformation, and
we’ll be able to automate. I think, in fact, we heard
that from Mark Zuckerberg in his testimony to Congress
a few months ago, and yet, you know, we know that AI is
actually still trying to figure out if this podium is a podium. That’s where we’re
at with AI, okay? So, I’m not really sure
how we’re going to get to these more complex types
of decisions through AI, but we’re certainly a
long ways away from that. Now, let me just share
one of the cases, as we kind of move toward
the close of this talk. This was one of the cases
that I found most interesting, as I was kind of
finishing up the book. You may or may not know
about Dylann Storm Roof, but on the evening of June 17th
in 2015 Dylann Roof opened fire in a church in Charleston,
South Carolina, and killed nine African
Americans who were worshipping, in kind of one of the more recent hate crimes. Although, I have to say that since 2015 we have seen a very significant uptick in religious and
racial hate crimes. What was interesting to me
about the Dylann Roof case in particular is that many of us
instantly kind of went online that day, trying to figure out who Dylann Roof was. And, you know, once again
Twitter provided the answers. It’s weird, because there are — everybody on Twitter
is kind of on the case. There’s a lot of investigative
journalists and just nosy people on Twitter that are always
trying to figure things out. And, so here we had,
within 24 hours, someone on Twitter found
Dylann Roof’s online manifesto at thelastrhodesian.com,
and this was the part, I went through his manifesto,
and this was the part that really jumped out to me. He says, “The event that truly
awakened me was the Trayvon Martin case. I kept hearing and
seeing his name, and eventually I
decided to look him up. I read the Wikipedia article,
and right away I was unable to understand what
the big deal was. It was obvious Zimmerman
was in the right. But, more importantly,
this prompted me to type in the words black on
white crime into Google, and I have never been
the same since that day. The first website I
came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black
on white murders. I was in disbelief. At this moment I realized
that something was very wrong. How could the news be blowing up the Trayvon Martin case
while hundreds of these black on white murders got ignored? From this point I
researched deeper and found out what was happening
in Europe. I saw that the same things were
happening in England and France and in all the other
Western European countries. Again I found myself
in disbelief. As an American we’re
taught to accept living in the melting pot, and black
and other minorities have just as much right to be here as we
do, since we’re all immigrants. But, Europe is the
homeland of white people, and in many ways the
situation is even worse there. From here I found out
about the Jewish problem and other issues facing our
race, and I can say today that I am completely
racially aware.” Now, this is Dylann Roof
who is trying to make sense of something happening in the
news, the case of Trayvon Martin and George Zimmerman,
and I think to myself, “How many of us turn
to Google to make sense of something in the world?” You know, one of the reasons
why it’s complicated is, because if you’re
looking for, let’s say, the closest coffeeshop
and you turn to Google or to Google Maps, it will
fairly reliably tell you about the closest coffeeshops. Now, it might prioritize
its clients first. So, you might find all the
Starbucks first before you find the local small business
coffeeshop, but you — we become so acculturated
and accustomed to trusting search engines,
because when we’re looking for banal kinds of information,
it’s fairly reliable. It’s when we turn and ask
more complex social questions of a search engine
that we start to fall into incredibly dangerous
territory. Now, what’s interesting to
me is that when Google — when Dylann Roof searches
for black on white crime, what he’s not given back are
things like FBI statistics that show that violent
crime is actually an intraracial phenomenon. So, I’m sure most
people here are familiar with the phrase black
on black crime. You’ve heard that before,
lots of times invoked. I’m giving you another
phrase today. It’s a gift, which is
white on white crime, because the majority of white
Americans are actually killed by other white Americans. So, see, there’s — it’s interesting how the FBI
statistics or the reframing of the question is actually
not available or accessible. Instead, when Dylann
Roof types in this kind of racist red herring,
black on white crime, he’s led to a whole host of
white supremacist websites, many of which look like just
conservative news aggregators. We might argue that, in
fact, as Jessie Daniels says, it’s quite difficult to parse
whether these are disinformation sites or not. What he doesn’t get access
to is research, for example, that shows how violent
crime occurs. It also doesn’t give a
history of a kind of a context of these types of racist red
herrings, but it does serve up legitimation for him. And, we know, unfortunately,
how this ends. So, I think that one of the
things that this brings to mind for me is kind of what the
implications are of turning to search engines
and not turning to other forms of information. And, certainly, my friends, my
librarian friends, say that, you know, if Dylann Roof had
come to the reference desk and asked for information on
black on white crime, you know, we hope — we imagine that that
might have been reframed for him or unpacked a little bit. We certainly know that those — that kind of language might be
used in a whole host of kind of racist texts, and if you went
and looked for it in the stacks on the shelf, it would be in a
context of those kinds of texts. And, so even that would
give him a possibility or give anyone the
possibility for a context for these kinds of questions. So, I think, ultimately,
there are real questions about who owns identity and
identity markers in cyberspace and whether racial
identities or other race and gender markers are
ownable property rights that can be contested. One of the problems with the way in which the commercial
internet works right now is that property rights are
really kind of the logic that undergird how these kind
of intricate systems work. So, for example, if you
are Stormfront and you go and buy the URL
martinlutherkingjunior.org, you can run a propaganda
site against Dr. King and the Civil Rights Movement,
which, in fact, has been up and operating for a long time. They recently got
doxed and taken down, but this is something that the
King estate themselves could not intervene upon, because the
property ownership model has been really important. And, of course, that means
those who have the most money, the most capital always win
in that type of environment. One of the things
that we also know is that media representations and
misrepresentations always work in service of certain
types of stereotypic — certain types of projects, also,
that rely upon stereotypes. And, so I could’ve put it in today, but it was actually just a little too difficult. But, you know, I did searches in Google Images on immigrant, for example,
and you can do these yourself, and you can see the types
— the ways in which, like, who gets characterized as
an immigrant, who gets — which borders get policed harder
and how search engines work to kind of legitimate,
again, a certain world view about who people are and what
we should be concerned about. So, let me just share a
quick excerpt from the book, because I think it gives you,
also, a little bit of a sense of the kinds of conversation
that this book can open up. This was an excerpt. It’s very short. It’s from Wired Magazine. So, you can see it there, and in it I say an
app will not save us. We will not sort out social
equality lying in bed, staring at smartphones. It will not stem from simply
sending emails to people in power, one person at a time. New, neoliberal conceptions
of individual freedoms, especially in the realm
of technology use, are oversupported
in direct opposition to protections realized
through large-scale organizing to ensure collective rights. This is evident in
the past 30 years of active antilabor
policies put forward by several administrations and in increasing
hostility toward unions and twenty-first-century
civil rights organizations such as Black Lives Matter. These pro-individual, anti-community ideologies
have been central to the anti-democratic,
anti-affirmative-action, antiwelfare, anti-choice,
and anti-race discourses that place culpability
for individual failure on moral failings
of the individual, not policy decisions
and social systems. Discussions of institutional
discrimination and systemic marginalization
of whole classes and sectors of society have been
shunted from public discourse for remediation and
have given rise to viable presidential
candidates such as Donald Trump, someone with a history of misogynistic violence
toward women and anti-immigrant schemes. Despite resistance to
this kind of vitriol in the national electoral
body politic, society is also moving
toward greater acceptance of technological processes
that are seemingly benign and decontextualized, as if these projects
are wholly apolitical and without consequence too. Collective efforts to regulate or provide social safety
nets through public or governmental intervention
are rejected. In this conception of society,
individuals make choices of their own accord in the free
market, which is normalized as the only legitimate
source of social change. So, this is the dimension that we really have to understand: we are increasingly abdicating our collective power to think about the systems that are kind of
ruling our lives. And, this is one of the things that I think the
entire experiment of the internet has been
about, which is about kind of hyper individualism, an elite
technocracy that knows better, that uses science and math and computer science to make judgments that are wholly humanistic, that actually can’t just be coded, and that really require complexity and our kind of intense engagement and commitment. The reliability of public
information online is in the context of real lived
experiences of Americans who are increasingly entrenched
in the shifts that are occurring in the information age. An enduring feature of the American experience
is gross systemic poverty, whereby the largest percentages of people living below the
poverty line suffering from un- and underemployment are
women and children of color. The economic crisis continues to disproportionately
impact poor people of color, especially Black and African-American
women, men, and children. Furthermore, the
gap between black and white wealth has become
so acute that a recent report by Brandeis University found
that this gap quadrupled between 1984 and 2007, making
whites five times richer than blacks in the
United States. This is not the result
of moral superiority. This is directly linked
to the gamification of financial markets through
algorithmic decision making. It is linked to the
exclusion of blacks, Latinos, and Native Americans
from the high-paying jobs in technology sectors. It is a result of digital
redlining and the re-segregation of the housing and
educational markets, fueled by seemingly innocuous
big-data applications, the kinds of applications
that allow the public to set tight parameters
on their searches for housing and schools. Never before has it been so
easy to set a school rating in a digital real estate
application such as zillow.com to preclude the possibility
of going to low-rated schools, using data that reflects
the long history of separate but equal, underfunded
schools in neighborhoods where African Americans and low-income people live; these are simply screened out of view.
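As a toy illustration of that screening effect (a sketch invented for this transcript, not drawn from zillow.com’s actual code or data), a single innocuous-looking filter can remove whole neighborhoods before a person ever sees them:

```python
# Invented listings; the ratings echo a long history of unequal school funding.
listings = [
    {"neighborhood": "Affluent Heights", "school_rating": 9},
    {"neighborhood": "Midtown", "school_rating": 5},
    {"neighborhood": "Historic Eastside", "school_rating": 3},
]

# The user sets one "tight parameter": schools rated 8 or better.
MIN_SCHOOL_RATING = 8

visible = [home for home in listings
           if home["school_rating"] >= MIN_SCHOOL_RATING]

# Entire neighborhoods disappear silently, before any human choice is made.
print([home["neighborhood"] for home in visible])  # ['Affluent Heights']
```

These data-intensive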
applications that work across vast data sets do
not show the microlevel interventions that are
being made in racially and economically
integrated schools to foster educational equity. These are the kinds
of schools I went to where people took a chance on
each other and said, you know, maybe our kids should go
to integrated schools, because a different
set of values and possibilities
might come from it. And, certainly the CSU system
I see as part of that legacy. That’s not in the book. I just said that, because
I want you to know. They simply make it easy
to take for granted data about good schools that
almost exclusively map to affluent, white
neighborhoods. We need more intense
attention on how these types of artificial intelligence,
under the auspices of individual freedom to make
choices, forestall the ability to see what kinds of
choices we are making and the collective
impact of these choices in reversing decades of
struggle for social, political, and economic equality. Digital technologies are
implicated in these struggles. These dramatic shifts
are occurring in an era of US economic policy that
has accelerated globalization, moved real jobs offshore, and
decimated labor interests. Claims that the society is
moving toward greater social equality are undermined by data
that show a substantive decrease in access to home ownership,
education, and jobs. And, this is one of those
things that’s really difficult, of course. You know, we’re fortunate
if we go to the CSU system that maybe we won’t leave
with a bachelor’s degree and 100,000 dollars in loans, but
certainly that is not typical at many universities around
the country right now. In the midst of a changing social and legal environment, inventions of terms
and ideologies of colorblindness disingenuously
purport a more humane and nonracist worldview. This is exacerbated by
celebrations of multiculturalism and diversity that obscure
structural and social oppression in fields such as education
and information science, for sure where I work, and that
are certainly shaping these technological practices. So, you can’t raise a generation
of students, for example, on colorblind ideology, tell
them that it’s wrong to think about race, it’s wrong to think
about gender, that if they think about those things, they
themselves are racist or sexist, and then wonder why the kinds
of output that they create in their professional
lives is not clear to them. They can’t recognize the kind
of racist or sexist dimensions in the kinds of evidence
that I’ve tried to provide for you today. So, I think these are
the kinds of things that are really important
to think about. We know that people, for example, embrace colorblindness, which is a very important
value in Silicon Valley. This idea of not seeing race and not seeing gender
only exacerbates the kind of deep inequalities in the
labor market in Silicon Valley and I think is certainly
implicated in the kind of output that we get from
these tech companies. And, you know, one of the things that I think is particularly
interesting about kind of the rise of the
technocracy and the rise of our deep investment
in digital systems is that at the very moment in the
1960s when we have a deployment of a series of state and
federal interventions to diversify the workforce,
where we made it illegal to discriminate against women
and underrepresented minorities in management and in leadership
positions in our society, that is the moment where we
have the rise of a belief in computerization and computers
to make better decisions than the fledgling multiracial
democracy that we’re building. And, we have to ask ourselves what it means and what is lost that we believe that somehow these systems can
mediate or make better decisions in our democracy than we
can when we have and live in a truly representative
society where we have the goal
of social equality. And, so these are some of the
things that I think that we need to be taking up, as we interrogate whether or not artificial intelligence
should be a meaningful part of our future. I’ll just say this as I wrap,
but, you know, we have more data and technology than ever,
and we also have more social, political, and economic
inequality and injustice to go with it. The last statistics that I saw were from 2016: we had greater global wealth inequality than at any point since we’d been keeping records about it, and we also have more
technology than ever. And, the future of my work
is really looking at the way in which technical systems and automated decision-making
systems are simply going to sort the haves
from the have-nots. We saw a study that came out
last spring from the UK House of Commons that said that by
2030 the top one percent will own two thirds of
the world’s wealth. And, I say, you know, I was
talking to a friend of mine. She’s in her late 20s, and I
said, you know, “What’s going to happen, you know, everybody
is just going to watch videos of other people eating?” and she was like, “Yeah,
girl, I do that all the time.” and then I was like
this is not it. Okay, that’s not what
— that’s not the goal. Okay. So, I think these are
the kinds of things, again, that are in front of us,
and, of course, you know, we see the rise of
social credit systems, these ideas that our ability
to move about freely in society or not will be determined by
our social creditworthiness. And, of course, those of us who
study the history of political and technical systems know
that these systems never work in service of the
most vulnerable. And, so I think I’ll leave
it there as maybe an opening for our Q and A, and I
just can’t thank you enough for your time and
attention today. [ Applause ]>>Thank you so much.>>You’re welcome.>>Let’s have a seat over there.>>Okay.>>Well, that was a
very stimulating talk, and I couldn’t help but think that starting out with the fairly focused topic of algorithms and search engines, we ended up connecting that to everything that’s happening in our society, which was kind of inevitable when you’re a social scientist. I think sooner or
later that happens.>>Eventually you’re
going to find that everything’s messed up. It’s true.>>And, I was — it, you know,
stimulated a number of thoughts in my mind that I
wanted to just kick around a little bit before
we turn to Q and A. It seems that on one level,
you’ve made a point that I think it’s been known
to scholars for a while that technology is
not a neutral product. It is constructed within a
social context, you know, using science, but you’re using
it to accomplish human ends, and so what precisely
gets shaped with that scientific
knowledge depends very much on the social context.>>You bet.>>And, I think people have
pointed out, for example, that in some ways,
you know, Bill Gates and his peers were
basically the nerds who constructed a new world
in which they could, you know, rule as opposed to the one that they were dealing
with in high school.>>Yeah, that’s right. I mean, they really showed
those football players and athletes, didn’t they? They really got them. I mean, I think that there
is certainly, I mean, we know these kind of
histories of the libertarian, techno-utopian kind of
early internet users. I mean, I’m an early
internet user. I was — I saw a picture
from the apartment, you know, that I lived in when
I was in high school. And, you know, we had
an Apple II computer. It was like 1986, and you
know, I think that there — the early idea of
the utopian promises of technology has been hard to kill. I mean, it’s been difficult
to dislodge this idea that we could program our
way to a better world. Or, that no governance or
this kind of, you know, anarchistic idea of,
like, borderless space and place online would
be the ultimate goal. But, you know, for
many people, you know, being organized, being connected, and being part of a community or a nation has
been a really important pathway to civil rights, human rights,
and you know, different kinds of liberatory possibilities. So, there’s already a kind of ideological orientation
happening about what’s in the best interest
of human beings. And, that’s really great
when you don’t have to think about oppression as
something you’ve got to organize resistance to. So, I think that of course it’s
shaped the future, you know, that we have inherited now. And, of course there were great
people who have written about, you know, what happened when
all the women were forced out of programming. For example, Marie Hicks
has this great new book “Programmed Inequality” where
she talks about the UK’s history of computer science and
computer programming and how, on the ground floor of computer programming, it was women. And, then the UK, you know, the
state saw that it was really, like, a women’s field
and they were like, “That’s not really
that important.” And, they, you know, they killed
off their own computing industry because the embodied — the people who are invested
might also be invested in a whole host of ways. So, I think, you know,
these are some of the things that we should be
thinking about. And, you know, I know there
are a lot of people who think if we just get black girls to
code and Latina girls to code, you know, then these
problems get solved. And, that’s one of the other kinds of ways that the Valley responds
to the kinds of evidence that I provide, which
is to say, “Well, if black girls were coding then
they wouldn’t be pornified,” as if it’s the five- and six-year-old girls’ fault because they don’t know how to
work the computer or something. I mean, so there’s a whole host
of logics that come, I think, from a very narrow
band of people, from those who were tinkering in
their garage to those who are, you know, the tech giant
mogul leadership now. And, it’s in a narrow
historical framework and a narrow lived experience
that really precludes a lot of other possibilities.>>I remember there was this
document that was written by a male engineer at
one of the big companies that got a big uproar and–>>That’s James Damore.>>Yeah.>>Yeah.>>And, I’m reading that,
I mean it was, you know, it was interesting because
I could see how his mind was working in sort of a
windowless room, kind of trying to follow his own logic and
kind of with blind spots. And, is that mindset, I mean, what’s your thought
about that mindset?>>Well, you know,
it’s interesting. So, James Damore was a search — he was an engineer at Google who worked in search. And, he wrote this manifesto that argued that women were basically biologically incapable of doing programming work and that minorities, ethnic minorities, should not be hired because they basically don’t have the capacity to do this type of work. And, it was a very
intense, ahistorical, uninformed, ill-informed screed. And, you know, I
find that that way of thinking is not
really implausible when you spend four years of
high school tracked in STEM, some type of STEM field where
you AP test out of English and all your humanities courses and all your social
science kind of courses. You enter as a freshman and
you only take computer science courses and you don’t
really take any humanities or social science courses. You don’t really have — I mean,
a lot of people who are working in the Valley on computer
science degrees, you know, might have never taken a
college level humanities course, or maybe one or two. So, of course the literacy and the well-formed thinking
can go sideways, okay? And, I think this is one
of the dangers of being so profoundly pro-STEM
to the exclusion of the humanities
and social sciences. And, one of the things I argue
in the book is that, you know, I have this line I always
say, you know, I say — and I say this to my
computer science students who accidentally
wander into my classes and then they hate their lives and they’re like,
“How did I get here?” and, “What just happened?” And, then they’re like, tears
are flowing and they’re like, “How can I have done four years
of a computer science degree and no one’s ever said
these things to me?” But, what I say to
them and I say in the book is you cannot
design technology for society and you know nothing
about society. You just can’t. [Applause] So, I think
these are — you know. I mean, had they taken courses from you and, you know, become political economists, they would really — they probably wouldn’t
be computer scientists. So, I don’t know, maybe
that would happen, too.>>Well, I confess,
I had a second life as an engineer after I–>>Did you?>>Yeah. So– So I can actually see both.>>Yeah. Might go
downhill from here then. Sorry. [Laughter] I apologize.>>Okay, we’ve got some
new questions here. I had some filler
questions in my pocket, but we didn’t need them. So–>>They filled up.>>I’m going to go
with the audience here. All right–>>Sort out some easy ones first.>>They’re fairly long
and dense, so I’m–>>Thank you.>>I’m going to have to — I’m just going to
dive in here without–>>I bet they’re giving me the super hard questions.>>I’ll read it in real-time.>>Okay.>>So, this may be a good one
or it may not be, I don’t know. Google enables us to carry in
our pockets the greatest library that the world has ever seen. Do you think that Google’s
pros outweigh the cons? For example, access
to useful information versus oppression
through algorithms?>>Okay, so the first thing
is that this isn’t fair because I didn’t say
this at the beginning. We didn’t announce
that I have a Ph.D. in library and information science. So, that’s unfair because I
of course am going to tell you that Google is not the library. I mean, that’s just
like flat out. It’s not the library, all right? It’s not. Google is
an advertising engine. It’s not an information
retrieval platform. People need to really
understand this, that the way that Google makes its
money is that people pay — people and companies pay Google to optimize their
content, make it visible. You know, those who pay a lot
get a lot more visibility. This is well-documented. This is — you can just go and
look and see how AdWords works, which is their primary
mechanism for making money. There’s a real-time auction running 24/7. People are outbidding, companies are outbidding; they’ll pay X amount to have these keywords optimized and linked up and to make their content available.
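(A quick illustration of the mechanics being described: keyword ad auctions of this kind are usually modeled as a generalized second-price auction, where bids are weighted by a quality score and each winner pays just enough to hold their slot. The sketch below is a minimal textbook version with invented advertisers, bids, and quality scores; it is not Google’s actual implementation.)

```python
# Minimal sketch of a generalized second-price (GSP) keyword auction,
# the textbook model for real-time ad slot bidding. All advertisers,
# bids, and quality scores are invented for illustration.

def run_auction(bids, slots):
    """Rank bidders by bid * quality score; each winner pays the minimum
    price needed to keep their rank (a second-price-style rule)."""
    ranked = sorted(bids, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i in range(min(slots, len(ranked))):
        name, bid, quality = ranked[i]
        if i + 1 < len(ranked):
            _, next_bid, next_quality = ranked[i + 1]
            # Pay just enough to outscore the next-ranked bidder.
            price = next_bid * next_quality / quality
        else:
            price = 0.01  # nominal reserve price
        results.append((name, round(price, 2)))
    return results

# Hypothetical advertisers bidding on the keyword "hair care":
bids = [("BrandA", 2.50, 0.9), ("BrandB", 3.00, 0.6), ("BrandC", 1.00, 0.8)]
print(run_auction(bids, slots=2))
# The highest score-weighted payers get the visible slots.
```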
So, that’s really different than the library, all right? In the library we have — first of all, if we’re lucky and we have a good library, we have thousands of years of knowledge accumulated. We have a different organizational context that helps us, again, contextualize the items that we find in a library. So, you know, for example, I often have my students
go walk the stacks. And, they’re like, “What are the stacks?” And, then I’m like, “This is why I hate everybody, because I don’t know what we’re doing. I mean, what, how, what?” So, we learn about the stacks
and we learn about books, and information, and that
information’s in books. And, that’s also really
different than a webpage because those of us who are
scholars in the room know that it is, you know,
well-researched information. I mean, there’s a whole process. I could give a different talk
about scholarly publishing, but there’s a whole process
by which knowledge is vetted. It’s not perfect. It’s also a problematic system, but there are a lot more
possibilities for truthfulness and fact to show up in
a book or in a magazine or in investigative
journalism, for example, than in a website necessarily. And, so this becomes very
difficult I think for people to kind of conceptualize. Certainly, you can fact
check a couple things, you know, quickly. But, if you’re fact checking
who won the popular vote in the US presidential
election, then you might see that that’s really wrong
and you would assume that would be a fact that
would be easily verifiable. So, I think the trust that
people have, and of course one of the things I didn’t
mention is that there’s a whole
gray market of actors, search engine optimizers,
companies, and individuals who are interested in
gaming the content. And, Google spends a lot — I
mean, social media platforms as well, spend a lot of time
and energy trying to sort out disinformation
from good information. And, that is — we are
in the nascent stages of being good at that. So, it’s very difficult
to trust, I think, the kinds of things
that you find in search engines
unless you can verify it.>>I — as you were saying
that I couldn’t help but think of something I read today,
which is a little bit sideways but very much along the same lines, you know, about really making sure what source of information you’re looking at. The Times Higher Education Supplement does a world ranking of universities. I just got an email, they’ve
set up a consulting firm to help you spruce
up your rankings.>>Right.>>So, they’re going to
be making money off you after they do the ranking.>>You bet. Right, and of course
if they’re going — they’re going to say
listen President Ochoa, you’ve got to get these keywords
in your strategic plan, man, because those words are
the words that get you seen by the raters, you see? I mean, the reviewers,
it’s a real interesting way that we have reoriented
ourselves to these kind of success models that are, you
know, about intense gamification from data mining and
the kinds of reports that come in on universities. And, you know, I feel like we
could have a good conversation about the gaming of
university rankings, but I know that’s not what
you want to talk about. [Applause]>>So, here’s a couple
of interesting questions. This one is from — probably
from a faculty member. We are developing more statistics and data science
programs here at CSUMB. How should we respond
and teach differently as we train the very
people who may work for these companies developing
algorithms of the future?>>Right, okay. I can’t stop you because
you’ve already started. I don’t want to stop you. I think if you train data
scientists, you know, maybe it should be a
dual degree in sociology and data science, right? [Applause] I say that as a
person with a bachelor’s degree in sociology, but I’m just
saying I think — or history. Here is an example. I remember a student
once saying — he was an actuarial sciences
freshman in my class, actuarial science major. I was like, “What
counselor advised this? This is intense for
an 18-year-old.” And, in class we were talking
about the mortgage crisis, and I had the students read
Ta-Nehisi Coates’s “The Case for Reparations”, which is
an article from the Atlantic that I highly recommend. And, they were also
reading that against more traditional, what-is-data-science kind of work. And, he says, “I don’t
understand why we have to read this. You know, what is this? Like, what is this
mortgage crisis?” And, I was like, “You’re
going into actuarial science. You’re going to be
making financial models for the insurance
industry and you have to understand what
redlining is, what the history of redlining is.” And, he was like, “What is redlining?” And, he was not at all interested. He was like, “That has nothing
to do with financial modeling.” So, this is why if you do a data
science program, it’s just — to me it has to be required
that you have this kind of historical and
social training. Otherwise, you’re modeling and you’re gamifying things you don’t understand. You’re gambling with
people’s lives, like in the mortgage crisis where people were building
financial models to bet against the American public
and make money off of it. And, I bet you there were plenty
of young computer scientists who were just trying to
optimize the numbers, not understanding they were,
like, optimizing grandma out of her house all
over the country. And, I think those are the
kinds of things that we have to put data science
in conversation with. [ Applause ] I feel like this is my crowd. I feel like these are my people. I’m feeling good about it.>>So, if the algorithms
for Google or any other search
engine were to be altered, what would you like it to look like?>>Well, one of the things
that in the concluding chapter of the book, I talk about imagining information
differently. So, you know, it’s not that
I’m particularly interested in not having commercial
search engines. I’m more interested in
kind of putting them in their proper place and
having a lot more kinds of ways of thinking about
accessing information than just one kind
of monopoly leader. So, I think that’s
the first premise. The second is there are lots of different ways we could
design access to information. So, I kind of put a hypothetical
out there in the book. If you’re interested in it and you know where I could get some money to play with it, I feel like, let’s do it. I mean, it’s hard
because this kind of stuff takes a lot of money. But, you know, I use the visual
of a color picker tool, right, with the whole color spectrum. And, I think, you know, it’d be
one thing if I put my search box in the red part of the
color picker tool metaphor, knowing that’s the red light
district of the internet and I’m going to
get the porn there. But, maybe I put that in green and I get the commercial
products, that’s like the hair
care products which is what I’m looking for. And, you know, if it’s in blue
maybe it’s research or studies or government kind of information, right? But, there’s more context.>>Yeah.
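(A toy rendering of the color-picker hypothetical she describes, for readers who want to see the idea concretely: the searcher picks a context, and the same query is routed to a separately labeled index rather than one merged ranking. The context names, indexes, and results below are invented for illustration.)

```python
# Toy sketch of the color-picker search hypothetical: the searcher
# chooses a context, and the query is answered from a separately
# labeled index instead of a single merged, ranked list.
# All context names, indexes, and results are invented.

CONTEXTS = {
    "red":   "adult content",
    "green": "commercial products",
    "blue":  "research, studies, and government information",
}

# Stand-in indexes; a real system would query separate curated corpora.
INDEXES = {
    "green": {"hair care": ["shampoo retailers", "product reviews"]},
    "blue":  {"hair care": ["dermatology studies", "cosmetic safety guidance"]},
    "red":   {},
}

def contextual_search(query, context):
    """Return results labeled with the context the searcher chose."""
    return {
        "query": query,
        "context": CONTEXTS[context],
        "results": INDEXES[context].get(query, []),
    }

print(contextual_search("hair care", "green"))
# The point: context travels with the results, instead of one opaque ranking.
```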
>>And, that’s one of the things that we don’t have right now: context. The trouble with PageRank, or any kind of ranking indexing system, is that our cultural context in the United States is, if it’s first, what? It’s number one. What? I mean, we have a big foamy finger that says number one for a reason, you know? It’s not 1,300,000, right? I mean, so the cultural
context of being in a rank order signals
credibility and authority from
one to infinity. And, that is also a really flawed model for providing information, to me. And, libraries are guilty of this too, and this is something we’re working through and thinking about.
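(For reference on what the ranking model in question computes: below is a minimal textbook power-iteration sketch of PageRank on an invented four-page link graph. This is the classic published algorithm in simplified form, not Google’s current production ranking system.)

```python
# Textbook PageRank by power iteration on a toy link graph.
# The graph is invented; this simplified form of the classic published
# algorithm is not Google's current production ranking system.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(links)
# Every page is reduced to one score and sorted into a single ordered
# list -- exactly the rank-order presentation being questioned here.
print(sorted(ranks.items(), key=lambda kv: kv[1], reverse=True))
```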
So, I think we need more search engines, not fewer. We need more ways to
think about the context of what we’re finding. We need more ways to
separate the advertising from other things. I’ve certainly argued to
librarians that, you know, if they put their skills to
use in indexing the open web and not just scholarly
information that typically is behind a
university paywall or maybe is in a public library, but could
help us kind of sort what’s out on the open web, that
would be valuable as well. But, you know, ultimately I’m a
person who argues for regulation and I think that there are a
variety of consumer protections that the public must have
with the technologies that we have right now. And, we need to be thinking
about the kind of public harms that can come from unregulated
digital media technologies. And, that’s really to me
a really important site of intervention at
the policy level. You can’t — Professor Roberts and I have this conversation all
the time where we say, you know, you can’t on one hand talk
about regulating social media and search engines and
trying to get that right but simultaneously
defund public education, defund public higher ed, defund public libraries,
defund public media. You can’t do that. You’ve got to bolster
those institutions, those public institutions, otherwise all we have
left is the private, corporate information sphere and I think that’s
dangerous for our society. [ Applause ] I’m coming back here
as many times as you invite me
because of this crowd. I need all their
numbers and I want to invite them to anything I do.>>So, I think one of your
fans is asking have you done a TED Talk yet?>>You saw me with a mic, right? That’s what inspires you to ask? I did, like, a TEDx talk at
the University of Illinois. Please don’t go watch it. It’s on the internet
and it’s horrible. I thought I had the flu
while I was doing it because the minute I walked
off stage I vomited profusely and you might be able to tell. I’m just telling you the truth. It was terrible, and one of the
reasons why it was terrible is because it was kind of
early in my academic career. It was like my first year
as an assistant professor. And, you know, there’s nothing
worse, now I don’t really care because I have tenure,
but then I super cared. It’s very stressful for
academics to give a short, pithy talk and not cite
all of our colleagues who are also doing things,
you know, because you feel like you’re just like,
“I have all these ideas.” But, all your ideas are
in dialogue with all kinds of other people, right? And, for academics that is
really, really a tough genre to do a ten- or 15-minute
talk and not, like, name check where you
got all your ideas from. So, I was sick. I actually made myself ill from
stress when I gave that talk. So, I think I’d like to give
it again now when I don’t care and I feel like I
could just be like, you guys already know
all these people. Or, I could just put a slide up and just say read all
these other people, too. I was a little inexperienced
when I gave that talk. So, yes, I have done
one and it’s out there. [Laughter]>>So, one of our audience
members asks how can we make a difference at the local level, in the community
about these issues?>>It’s so great. I mean, one of the things I
think we can do, you know, we’re writing a new
book right now. I keep referencing Professor
Roberts because she came with me because we’re under, like,
this massive deadline to our publisher for a book
we’re writing right now. And, part of that book is going
to be about kind of what kinds of interventions we can
make at different levels. And, one of the things that I
think we need are policy makers, people we elect should have
technology in their five-point plan, their ten-point plan: how it’s going to be used, who it’s going to be used for and against, what kinds of investments are being made in it that are being shifted from other places. There’s a whole host of
ways that city councils and municipal governments in particular are deeply
investing in, you know, let’s say making space for a
series of tech companies to come to your town, right, and
giving public, federal dollars or state dollars or municipal
dollars to entice that. And, at what cost
will that come? And, those are the
kinds of questions that I absolutely think should
be at the forefront of the kinds of conversations we’re having
as we go into the 2020 election.>>Like the Amazon
headquarters thing.>>What’s that?>>The dance that Amazon made
with a couple of kind of–>>Correct, with
Queens and Virginia. And, of course, you know, you see the real questions
are being asked, like, say by housing rights advocates in New York, who are saying: how do we give, you know, Amazon three billion dollars in subsidies when we have 96,000, I think was the number, or 69,000, people who are experiencing homelessness because they lost their jobs. So, I think, like, this
allocation of resources in government — by
governments is really important and that’s something that we can
hold politicians to account for. We should be running. I feel like this is a good
crowd to source some people to run for office from. So, you know, those are
the kinds of interventions that I think we can think about. I mean, I know there are people
who say, like, delete Facebook, don’t use Google, use
DuckDuckGo, use these other kinds of technologies. I don’t think we’ll solve these
problems at an individual level. I think that’s the neoliberal
discourse, is to convince us that it’s a private,
singular individual problem, but these are public concerns. And, just as much as we care
about clean air and clean water and affordable housing we should
care about credible information and the way in which our
information environment is undermining democracy
or bolstering it. [ Applause ]>>So, now, I will end with
a lighthearted question. What’s–>>I’m a Capricorn. [Applause] [Inaudible]>>What’s your favorite
search engine?>>I use DuckDuckGo mostly on my
mobile phone, which is the place where I use my search
engines a lot, mostly because it
doesn’t, like, track me. And, so I think, like, not being tracked is important. Of course, I use Google
a lot because I want to see what Google’s offering up and sometimes I do
some comparisons. You know, I have to say I mostly
use search engines for, like, commercial, like,
consumer things. I don’t often use it
for, like, real things. Does that make sense? Like, for real information
I try to — I’ve been quite active in trying
to get more women represented in Wikipedia, especially
critical scholars and women who are doing important
work in our field so that they are findable
and accessible because one of the things — Helen
Nissenbaum, a professor at NYU, says, you know,
basically to be indexed by a search engine is to exist. And, if you don’t —
if you’re not indexed by a search engine
you don’t exist. And, I also heard —
I was at a conference and I heard the director of
the CIA say that everybody in the eyes of the state
is their data profile and that is actually how
we’re being related to, which everyone here should probably feel ill about. So, I think that, you know,
for me I think about the way that I use search
engines, you know, for trivial kinds of things. But, I also think about how
I like to use search engines to look up things
that are provocative and that I don’t know
much about to kind of get a little hit on it. And, I know those things are
also adding to this data profile that exists about me
that I don’t know. I don’t know what
the data profile is. I’ve looked up a lot of things. All of us have these data
profiles, by the way. Everything you’ve ever
looked at on the internet, and now people are really sick. They’re going to need to —
they’re going to need wine. They’re going to
need wine after this. But, all those things, you know,
to me again are really important questions about whether we should have that freedom. You know, in the library community we say, you know, the ALA says you
you want to read. It’s one of the reasons
why librarians have been at the forefront of, like, not
tracking everything we read. They really just track that
you’ve checked out this book and when you bring it back they
get rid of the record because, you know, you might remember
John Ashcroft coming in and demanding, in the period after 9/11, you know, requiring that librarians
hand over patron records of everything people have ever
read looking for terrorists. So, this is one of the ways
that, you know, the government, the state looks for
terrorists, too, is by looking at our data profiles, all
the places we’ve been, our digital traces online. And, you know, for that reason,
you know, I’m mindful of that when I’m looking for
things on the internet and in any search engine. And, I think those are
things that, again, people should have the freedom to know and to inquire about without being afraid that those inquiries will be held against them at some point.>>Well, thank you very much
for a very stimulating talk and thank you all
for these questions–>>Thank you all.>>Please join us for
the reception outside. [ Music ]