The age of state surveillance

Wall Street Journal reporters Josh Chin and Liza Lin discuss their new book on technological surveillance, from China's state systems that track Uyghurs in Xinjiang, to the lack of Chinese public awareness on privacy issues, and much more.

Illustration for The China Project by Derek Zheng

Below is a complete transcript of the Sinica Podcast with Josh Chin and Liza Lin.

Kaiser Kuo: Welcome to the Sinica Podcast, a weekly discussion of current affairs in China, produced in partnership with The China Project. Subscribe to Access from The China Project to get, well, access to not only our great daily newsletter, but all the original writing on our website at thechinaproject.com. We’ve got reported stories, essays, and editorials, great explainers and trackers, regular columns, and of course, a growing library of podcasts. We cover everything from China’s fraught foreign relations to its ingenious entrepreneurs, from the ongoing repression of Uyghurs and other Muslim peoples in China’s Xinjiang region, to Beijing’s ambitious plans to shift the Chinese economy onto a post-carbon footing. It’s a feast of business, political, and cultural news about a nation that is reshaping the world. We cover China with neither fear nor favor.

I’m Kaiser Kuo, coming to you from Chapel Hill, North Carolina.

A couple of episodes back, Bloomberg Chief Economist Tom Orlik recommended a book on our podcast, and he said that he thought it really stood out from the pack of China books he’s read of late: Surveillance State, by Wall Street Journal reporters Josh Chin and Liza Lin. I echoed him then, and I mentioned that I’d be talking to them about the book, and I am delighted that that’s exactly what we will be doing on the show this week. Surveillance State: Inside China’s Quest to Launch a New Era of Social Control is a book that is at once very powerful, and scary, and moving, and perhaps surprisingly, really quite subtle. It complicates and often challenges the narrative that, in my estimation, has taken root in American and Western thinking on China and its immense surveillance apparatus.

It goes way beyond just static descriptions of the systems China has deployed and raises deep, difficult, and really profound questions about cultural differences, about the mindset and the assumptions behind Chinese technocracy and the urge toward social engineering, about the interaction between China and the West and unintended consequences, and really a whole lot more. The book also has just a really great narrative pace. It introduces really memorable characters with fascinating personal stories. It is reported from really all over the world. And of course, it is beautifully written. Josh Chin, Liza Lin, welcome to Sinica. Welcome back to you, Josh. And congratulations to both of you for this truly excellent book.

Josh Chin: Thanks Kaiser. It’s always a pleasure. And thank you for the very kind words.

Liza Lin: Thanks a lot, Kaiser.

Kaiser: Yeah. So, let’s jump right in. Josh, Liza, I suspect that there will be readers of your book who might somehow read it as a sophisticated species of whataboutism — that your inclusion of chapters both exploring how surveillance is sometimes abused by American law enforcement, and linking in various ways the rise of China’s surveillance to American actors, is really meant to subtly undermine and dilute our righteous criticism of Chinese excesses, especially in Xinjiang. Josh, how would you answer that? And perhaps you could also explain why you opted to complicate the whole narrative on Chinese surveillance tech rather than just report on technological oppression across China.

Josh: Right. Well, our approach to this story was actually pretty simple: we really just wanted to develop and then describe the most comprehensive picture of China’s surveillance state that we could. Obviously, some of our reporting for this book began with the Wall Street Journal, and that included some early reporting on what was happening in Xinjiang and the extremely disturbing and dystopian elements that we discovered there. But as we looked more deeply into it, what we discovered was, yes, the Chinese surveillance state waves a deeply, terrifyingly dystopian stick, but it also dangles a pretty appealing carrot. Its aim is to make life simpler, more convenient, safer, more predictable for Chinese citizens through the power of data and AI. And it is, in fact, using a lot of American technology to do that.

Kaiser: So, the book is reported from, as I say, many very different places. The opening chapters focus on Xinjiang, but then there are chapters about Hangzhou, for example, and safe city policing there, and things like that. There are also chapters set in Uganda, and lots in Silicon Valley and in the United States more generally, in New York. How did you guys divide the labor? Who should I, in other words, direct the questions to about which parts? Liza?

Liza: Sure. So, Josh has done a ton of prior reporting in Xinjiang. The way the book was split up was that Josh took a lot of the more sinister aspects of state surveillance, while I dug into what we thought, and then discovered, to be the more attractive and alluring aspects of the surveillance state. So, the same systems that were used in Xinjiang were also being used in other major Chinese cities in ways that residents actually found beneficial. That was kind of the division of labor. I’ve always been a corporate reporter, so a lot of the corporate supply chain stuff came down to me.

Kaiser: Okay. What about New York? Who did those sections in New York about the public defender who took on the TJ Maxx case that we’ll talk about?

Josh: That would’ve been me.

Kaiser: Okay.

Josh: Yeah. So, I did Xinjiang, Uganda, New York, Shenzhen stuff, and most of the elite politics stuff. And then Liza was tech, Chinese American tech, and Hangzhou.

Kaiser: Okay. Fantastic. Fantastic.

Liza: Yeah. And then we split some of the chapters, like the ones on privacy or the panopticon-type findings.

Kaiser: Josh, you open your book, as I said, with a discussion of Xinjiang, focusing on a guy named Tahir Hamut, a Uyghur documentary filmmaker and poet from Ürümqi, who managed to get out with his family before the jaws really closed, and who told you quite a bit about his experience. And it’s through his eyes that we see the way that Xinjiang became, after 2017, really the apotheosis of the technological surveillance regime that you describe. He and his wife, Marhaba, go through the whole gamut, from facial recognition to fingerprinting to having blood drawn, presumably for DNA and other biometrics. He even recounts the intake form from the Integrated Joint Operations Platform. We actually had a show all about that when that information was leaked a couple of years ago.

Can you talk about some of what he experienced and how you reported it out? I mean, for instance, were the details of that from his memory, or did he take a picture of it, or did you get a copy of that form? Because you have it in real detail.

Josh: Right. So, Tahir is someone we were put in touch with after our first trip to Xinjiang. I had gone there in late 2017 with a colleague, Clément Bürge, who does video. We had gone to Xinjiang, and we really had no idea what was happening there. We had sort of heard rumors that they were rolling out tons of cutting-edge, sci-fi sort of surveillance equipment. But we didn’t know why, or what the whole situation was, or whether that was actually even true. So, we drove in and we discovered it was true, and it was actually much darker than we imagined it would be, and that obviously there was a real story there.

So, we came back to Beijing and we started looking around for people who could help tell us what was happening there. At the time, there were only a handful of people. In fact, Tahir might have been the only person we could find who had witnessed the rollout of all of this, but then got out and could speak freely. We’d been put in touch with him by some Uyghur activists who knew him, and he told us his story. At first, he actually didn’t want to use his name, because he was really worried about what was going to happen to his family members if he told us the story under his name. But then he later reconsidered; he realized that the story would be much more powerful with his name attached to it.

So, of his own accord, he let us use his name. As for what happened to him: some people describe him as the greatest living Uyghur poet. He’s certainly highly respected and really well known. When all of this was being rolled out in Xinjiang in late 2016 and early 2017, he’d actually been planning to try to leave and go to the U.S., because he’d seen what was coming down the pike. He’d been around for a long time; he’s a very savvy guy. He had been a student leader in the 1989 protests. But he kind of didn’t get out in time, right?

So, this vise started to close around Uyghurs, especially people like him: intellectuals, people who had passports, who had traveled abroad. One day he got called in by the police. They said they wanted to take his fingerprints, which, to him and his wife, Marhaba, was obviously ridiculous, because they’d given their fingerprints more times than they could count by then. But they didn’t really have a choice. So, they went in and basically had to do the full biometric profile. They had their fingerprints taken, but also their blood drawn. They were asked to read an article from a Uyghur newspaper so that their voices could be recorded. They had 3D scans made of their faces and the sides of their heads so that facial recognition cameras could identify them more easily. And it slowly dawned on Tahir that something really terrible was happening, or was about to happen.

And around this time, he was hearing rumors, at the time again, only rumors that people were being sent off to school. No one really knew what that meant. But at a certain point, one of his closest friends, who had actually been in the U.S. and come back to Xinjiang in the middle of this, was taken, disappeared, and that’s when he sort of realized exactly how serious it was for him. He started sleeping with clothes next to his bed because he’d heard stories about Uyghurs just being taken out in their underwear. Anyway, so he got out, and I don’t want to give away the whole story, but we got in touch with him. When we were talking to him, he actually had brought a copy of that population data collection form with him. For those who don’t know, one of the ways that the police were collecting data on Uyghurs in the early days of the surveillance rollout was just with these paper forms that every Uyghur had to fill out. And it had details about obviously-

Kaiser: Dietary things. Right.

Josh: But yeah, prayer. How often they pray, where they pray, people who they knew abroad, that sort of thing.

Kaiser: Yeah. Frightening. It’s a really harrowing account, and it’s really well done. Liza, you obviously reported a lot of the non-Xinjiang sections of this, from cities like Hangzhou. Can you talk about the awareness, if any, among people in other parts of China of the extent of Xinjiang surveillance? Do they know? Do they care?

Liza: Yeah. So, what we discovered is that people outside of Xinjiang are pretty much oblivious to what is happening in Xinjiang itself. And that’s through no fault of their own. It’s really the Chinese Communist Party’s propaganda machinery and system of internet censorship. A lot of Western media outlets have been aggressively reporting on Xinjiang, but China’s Great Firewall has essentially blocked almost every Western media outlet out there. For this reason, if you are a Chinese person in Shanghai, or Beijing, or Guangdong, what you’re consuming and seeing on the Chinese internet is really the Party’s message about Xinjiang, what it wants you to hear. It’s the same sort of message they’re putting out now: that in Xinjiang it’s a reeducation campaign, and that what the campaign is doing is empowering Uyghurs by teaching them Mandarin, helping them understand Chinese law better, or giving them jobs.

Conveniently leaving out that these reeducation centers sometimes amount to forcible assimilation, or that, in some cases, many Uyghurs have no choice in what job they’re sent to do, the forced labor aspect of it.

Kaiser: Yeah. I think for some people who might be properly horrified by the level of surveillance and other means of repression in Xinjiang, who might even be willing to use the word totalitarian to describe that regime there, there is still this belief that this is very specific to that region or perhaps to other “restive non-Han majority areas” of China like Tibet, but others are quite sure that Xinjiang is just a proving ground for these new technologies and these new policies, and that this approach will inevitably get rolled out to the rest of China. How do the two of you come down on that question? Do you think that Xinjiang is sui generis or do you think that it is just the beginning?

Josh: Well, I mean, it’s actually already happened. If you look at the pandemic response, COVID-zero. One of the most striking experiences I had just before I was expelled from China in early 2020, right at the beginning of the pandemic, was the way that the Party rolled out its COVID controls. My apartment complex in Beijing, like every apartment complex, is large and sprawling, with a bunch of entrances. All those entrances were closed off so that there was only one way in and out, which is a method that was pioneered in Xinjiang. It’s known as fēngbì shì guǎnlǐ 封闭式管理, closed-off management. The idea being that you want to be able to know who’s going in and out, and they were giving us passes to go out and come back, which is something that Uyghurs are familiar with.

Now, you basically have a situation in which, like in Xinjiang, an entire population is being tracked, right? It used to be unique to Xinjiang that every single member of a group was being intensively surveilled. Now that is happening across the country with COVID. A lot of the methods that you saw implemented in Xinjiang first are being experienced by Chinese people across the country.

Kaiser: Yeah. My colleague, Jeremy, described it as a biosecurity state in the making. Liza?

Liza: To add onto what Josh said, let’s not forget the health check booths that have sprung up in every Chinese city, at every street corner. They’re really similar to the police stations, checkpoints, metal detectors, and barriers that had sprung up in Xinjiang several years earlier, just to verify your identity. In this case, outside of Xinjiang, the wrong color code could lead you to detention, whereas in Xinjiang, the wrong predictive-policing AI assessment could lead you to a reeducation camp.

Kaiser: And do you think that the dry run in Xinjiang was responsible for the rapidity with which they were able to roll out these sorts of public health-focused surveillance efforts?

Liza: It definitely set a model that was much faster and easier to replicate. In the same vein, you also saw police outside of Xinjiang use AI-enabled surveillance cameras to track the close contacts, or the trails, of COVID patients — people who tested positive — trying to determine where a person had been and whom he had met. It’s the same sort of spying and surveillance techniques that you saw in Xinjiang, rolled out on a much broader scale. And I think just having that trial in Xinjiang definitely brought a lot of understanding of how to make these systems work better and avoid mistakes.

Kaiser: So, we’ve seen it spread now from Xinjiang to the rest of China. There’s no question at all that some of these technologies and approaches developed in China are now also finding their way to other countries. You guys have an excellent chapter, I’ve flicked at it, about the struggles of the opposition leader Bobi Wine as he goes up against Museveni and the surveillance apparatus that Museveni now has in Uganda, courtesy of Huawei. But one question, and this is one that I’ve often wondered about myself, is whether China is actively pushing this approach and this enabling tech to other countries, or whether it’s more a matter of pull. First, does it matter whether it’s push or pull? And is this a matter of enterprises wanting the sales, and not Beijing actively encouraging them to sell this stuff? Is Beijing facilitating this, or is it some combination, Josh?

Josh: Yes. I think that is actually the answer, right? I think it is all of those things. It is push, it is pull. It is being driven by the incredible profit motive of companies like Huawei, but also by Beijing’s desire to have these systems out there. Take Uganda as an example. Huawei has been in Uganda, like it’s been in lots of places in Africa, for years. Around 2015, they had started to develop what they call safe city systems, which are just state surveillance systems, and they were seeding the ground in Uganda. They actually gifted a sort of starter kit to Museveni in 2015, just a few cameras, to whet his appetite.

And then they didn’t really do anything after that. But a year or two later, Museveni found he was having trouble with the opposition leader, Bobi Wine, who you mentioned, this very charismatic singer who was really rallying the youth, and Museveni felt threatened. So, he tasked his security chief with finding him a system that would help exert more political control. It wasn’t just Huawei that bid. There was a Canadian company that also bid on the project. So, he didn’t go straight to Huawei, but he obviously knew Huawei had these systems. In the bidding process, the Chinese ambassador got involved and acted as a sort of auxiliary salesman. Once things got to a certain point, he invited the Ugandan police to send a delegation out to China. They visited Huawei’s headquarters, but they also visited the Ministry of Public Security building next to Tiananmen Square. And they were given a tutorial, or at least a demonstration, in how these systems work. Then, very shortly after that, everyone went back to Uganda and they signed the deal.

Kaiser: Yeah. So, there’s elements of push and pull in there, for sure, and facilitation by Beijing, for sure. Yeah.

Josh: Yeah, absolutely. Absolutely.

Kaiser: Toward what end, though, Josh? I mean, Jessica Chen Weiss, who was on this program a couple of weeks ago, had this memorable phrase that she used as the title of a piece she published in Foreign Affairs a couple of years ago, which we interviewed her about: making the world safe for autocracy. Do you largely agree that that’s what we’re really seeing here? That it’s not that China wants the whole world to become autocratic, it’s just that it wants tolerance for autocratic regimes?

Josh: Right. I mean, that’s a question I think we wrestled with as we were looking at all of this. If you read what Xí Jìnpíng 习近平 says, he does talk in grand terms about the Chinese people being willing to contribute new forms of governance to humanity and that sort of thing. You can see how someone would read that and think, oh, Xi Jinping wants to remake the world order with China’s system on top. It’s hard to say for sure, obviously, what’s in the mind of Xi Jinping. But I think if you look at the way they act and the way they speak, generally, it’s socialism with Chinese characteristics, right? It’s not the sort of evangelical “workers of the world unite” that the Soviet Union was espousing, or the democracy and freedom for everyone that the United States promotes.

Right. It’s not a world-conquering ideology. And if you examine the way that China sells these systems abroad, they don’t really care how governments use them. They’re not coming in and saying, “You have to do it this way.” They offer an example. They’ll give you a tutorial. They’ll tell you how they do it. But I think what they ultimately want… I think Jessica’s argument holds a lot of water, in the sense that what they want is for it to be okay for authoritarian governments to use these technologies to exert control, right? For it to be accepted as a model. It’s similar, actually, in many ways; it mirrors their arguments about internet sovereignty, right?

Kaiser: Yeah.

Josh: The idea is not that everyone has to censor the internet the way China does, but that if governments want to censor, they should be able to.

Kaiser: Exactly. Yeah.

Liza: Yeah. So, what Jessica said is a very big statement, and I don’t have the answer to that. But what I can tell you from our reporting is that every time China manages to sell one of these systems abroad, whether to a democracy or not, it helps the Communist Party legitimize its own system to its own citizens, because you always see state media playing up such reporting. It’s played up as Chinese technological innovation going global. It’s all part of the building of national pride, and of building China up as a strong, big country in all areas, including tech. On the corporate front, and you flicked at it earlier as well, there is a real need for China to have these systems succeed overseas. Because if you think about it, the world’s biggest surveillance camera makers are Chinese, like Hikvision and Dahua. And at some point, because China already has more than 400 million cameras in the country, demand for cameras is going to peak and saturate.

And these companies are going to have to find export markets to keep their revenue and profit humming. That’s a reason why it’s really to China’s advantage to be pushing these systems overseas, regardless of governance models.

Kaiser: Liza, you guys have a fantastic chapter on the smart city solutions that Chinese companies are promoting, not only in cities in China but also abroad. It sets things up really nicely by presenting a dilemma, a double-edged sword: obvious and very real value on the one hand, and a lot of riding roughshod over the privacy rights of individuals on the other. Can you illustrate this with a couple of examples from your reporting?

Liza: Sure. When Josh and I first started looking into the topic, we had come in with the assumption that all state surveillance was negative and nefarious in nature. But the more we dug into it, the more we discovered that there are Chinese citizens who really think these systems are very beneficial. To illustrate this point, we traveled to a city called Hangzhou, which is on the eastern coast of China, two hours away from Shanghai. Hangzhou is well known for being one of China’s seven ancient capitals, but in the last few decades, its growth, like that of every city on the east coast, has been breakneck. Hangzhou has seen its resident population triple in the last 10 years alone, from 3.6 million to almost 11 million in 2020. Even though the city’s population has tripled, Hangzhou still has legacy road infrastructure.

That means the road network through the city has not changed. Instead of 3.6 million people using the same roads, you have 11 million people, which has become a real problem for Hangzhou residents. If you went to Hangzhou in 2016 or 2017, you got caught in jams on the highways for hours, barely moving, and you realized what a big problem it was. So, Hangzhou has been very embracing of technology to try to overcome some of its issues in city governance. This is where Hangzhou has really gone all in on the smart city model. Essentially, the same systems that are being used in Xinjiang, a lot of the data mining and the surveillance cameras, are used in Hangzhou to do things like optimize traffic. You take the feeds from security cameras installed at road intersections across Hangzhou, and you combine that with the GPS data of the cars traveling on the road.

And you use that to make sure traffic lights are green during peak hours, when traffic flow is heavy. Or you use the security camera system to make sure the traffic police get automated alerts when a traffic accident happens, so they can rush to the scene and clear the flow, and then traffic gets going again.

Kaiser: Right. Ambulances. I remember one example from your book about an ambulance having all the lights turn green as it rushed an injured person to the hospital.

Liza: Exactly. Traffic seems like a very small thing until it comes to a life-and-death situation, but that’s the other thing the system can do. We spoke to a man whose mother had fallen into the river in Hangzhou, and a couple of passersby had fished her out. Once she was out, the ambulance taking her to the hospital essentially flipped a switch on its system that was linked to this AI traffic management platform. What it did for the ambulance was turn all the lights green on the way to the hospital. And that meant she got medical attention in half the time it would’ve taken the ambulance to get her there if there had been no system at all. And it’s not just traffic management, right? We’ve seen Hangzhou police use the same sort of systems for law and order: spotting criminal suspects on the street, drug pushers, for example. All the types of people that, as parents or as residents of the city, you might not want to see, or might not want walking next to you.

Kaiser: There is this idea that threads throughout the book that there might be a difference between Western and Chinese ideas about privacy, about surveillance, about the rights of individuals and the interests of broader society. Various characters throughout the book are in dialogue about this question. This is something I’ve also wrestled with, and I’m sure you guys have, certainly since the late 1990s, when I started working in the internet sector in China. I remember one of your characters, I think it was a guy in Hangzhou, said something to the effect of, ‘Chinese have a different idea of privacy. Most don’t understand it. And it’s hard to get worked up about privacy if you don’t understand it.’ The artist Xú Bīng 徐冰, who you spoke with, obviously started off with a very similar assumption. More about him and the film project that you guys wrote about, made entirely from surveillance footage, in just a bit.

But we’ve also seen huge reactions to privacy violations, or even to suggestions like the one made by my former boss at Baidu, Robin Li (李彦宏 Lǐ Yànhóng), that Chinese people are generally willing to trade privacy for increased convenience. And obviously, as Josh talked about, COVID has forced this into the bigger conversation. People like you and me, all of us, are conditioned to be very wary of these kinds of sweeping generalizations. And yet, sometimes it’s hard not to conclude that expectations around privacy, and the zealousness with which people guard it, might be quite different between different countries. Of course, there are reasons for this. Maybe it’s because Chinese people are just used to a more intrusive state, and not because of some innate cultural trait. But anyway, how did all of this shake out for the two of you? Did you move in one direction or the other in the course of reporting this book? And where did you end up?

Liza: Maybe I’ll start and then Josh can end off as well.

Kaiser: Sure.

Liza: I don’t think the fundamental notion of privacy is different between Chinese and Western societies. I mean, no one likes to be watched, right? It’s eerie, it’s creepy, it’s not a great feeling. What is different between the West and China is the baseline of privacy awareness. The definitions of privacy in the West have been well established for a while: you had the right to privacy from Brandeis in the 1890s. It’s only in the last couple of decades that Chinese people have begun to understand what privacy means. Because of that, there is a different level of awareness in the two societies, leading to different reactions and attitudes. On top of that, most of the people who really value their privacy in China are in the first-tier cities, the larger cities. Half of China’s population is not in cities yet, or not in the large cities.

That also kind of means that their focus in life, if you think about Maslow’s hierarchy of needs, the pyramid with physiological needs at the bottom and self-actualization at the top, is still on the lower rungs of that pyramid. If you’re constantly thinking about putting food on your plate, or your security, or where your next paycheck is coming from, thinking about privacy is the last thing on your mind.

Kaiser: Josh, I mentioned the artist Xu Bing and this film. Xu Bing is really one of the best-known contemporary artists that China has produced. This hyperreality video project, a film called Dragonfly Eyes, is one he made, or was able to complete, because of a product that the internet company Qihoo had released called Water Droplet Livestream. Can you talk about that phenomenon, and about the film, and what it shows?

Josh: Yeah. This story in the book was one of the most surprising in the way it unfolded. I’d actually run into Xu Bing at a Columbia University event in Beijing. I guess he had spent some time there. This was in, I believe, 2016. I was just asking what he was working on, and he mentioned, oh, I’m doing this film made out of surveillance footage. At the time, we weren’t really reporting on surveillance, so I just filed it away in the back of my mind as something interesting. Then, when we started doing this surveillance reporting, I remembered Xu Bing, so I looked him up and asked him about it. Basically, what happened was that for a long time he had it in his mind that he wanted to make a fictional film out of surveillance footage.

Because he just thought that the way people appeared in surveillance footage was natural in a way that actors never could be, and he was intrigued by that idea. As an artist, he’s really into this idea of transformation, right? A sort of recontextualizing of images. He always wanted to do this, but he never could, because it was just impossible to get enough of that footage. He had friends at CCTV and he had friends in the police departments in Beijing, and they would give him some surveillance footage, but it was just never enough to make it work.

You just need immense amounts of this to pull it off. And so, one day, one of his research assistants stumbled on this website called Water Droplet Livestream, Shuǐdī zhíbò 水滴直播, which was basically a web platform that Qihoo had set up for its consumer home security cameras, right? They’re internet-connected, and the idea was that you could stream the video footage from your security cameras online, so you could check it remotely. Right?

Kaiser: Right.

Josh: I mean, other companies do this too. But when other companies do it, it’s in a secure, non-public way, so only you, as the owner of that camera, can see the footage. With Qihoo, the default setting was public. A lot of people, either intentionally or unintentionally, were broadcasting their security footage onto this platform. It was fascinating. I looked at it myself, and it’s hard to tear yourself away. You just had this amazing view into thousands and thousands of scenes: apartment buildings, businesses, yoga studios, dance studios all over China. It was incredibly creepy, but also fascinating. He immediately started downloading all this footage en masse and used it to make his film. At the time, talking about privacy, in my mind this was an example of exactly how Chinese people don’t care about privacy, right? Because here they were, broadcasting this stuff; it had basically become a form of entertainment.

Kaiser: Right. Where voyeurs can meet exhibitionists, right?

Josh: Yeah, yeah, exactly. But what was really fascinating is, not too long after Xu Bing finally finished his film and was taking it around to film festivals, a woman went online with this post on Weibo that just excoriated Qihoo for putting this site up. And she cited all these examples of really creepy stuff, like cameras showing young girls in dance classes and that sort of thing. And it blew up. It became this huge controversy. And eventually, Qihoo had to shut it down. To me, it was really eye-opening. It was one of those moments that just shows you, no matter how much you think you know China, how many years you’ve spent there, you really don’t. And that did actually force us to start rethinking how we thought about privacy in China.

Kaiser: Was that, do you think, an evolution toward that, or do you think it was something that had always been present? Was it just the accretion, or maybe the filtering in, of ideas about privacy from outside of China? What do you think is responsible for this?

Josh: Yeah. I think it’s tough to say exactly and tough to trace it. I mean, there definitely is, as Liza said, in the first-tier cities, amongst well-educated Chinese people, especially people who were educated overseas, that sense of privacy that people have picked up and are kind of importing to China in a way. I mean, as we note in the book, the word for privacy, yǐnsī 隐私, didn’t even appear in the official Xinhua Chinese dictionary until the 1990s, right? So, it is a newish concept.

Kaiser: See, I mean, I had always heard that and thought it was just one of those BS old canards: oh, there was no Chinese word for privacy. Really, it wasn’t in the dictionary until the ‘90s?

Josh: Not until the ‘90s. I think it was 1997. I mean, it is a new, fresh concept. I think Liza’s right in the sense that once people become aware of it, they start thinking about it, and they experience it in the same way, right?

Kaiser: Sure.

Josh: At that fundamental sort of reptile brain level, where you feel like creeped out and violated. But I think what’s interesting in China actually, and I mean, when you were at Baidu, I’m sure you experienced this vividly after the Robin Li comments, but …

Kaiser: Actually, it happened after I left, but…

Josh: Oh, was it after you left?

Kaiser: Yeah.

Josh: Okay. Oh man. You dodged a bullet there.

Kaiser: I’ve dodged a bunch of bullets, but hey. Or maybe the bullet struck because I wasn’t there.

Josh: Yeah. Maybe, very possible. But I think what we discovered was, although people do experience privacy fundamentally at the same level, I think, in China, what’s really fascinating is it has almost exclusively been focused on companies, right? On the company sort of abuse of personal data. And the government has really, I think, deftly, sort of defined privacy as applying to companies, but not to the government. I mean, there are definitely exceptions.

Kaiser: Liza, do you think that might be because, well, there’s an element of fear in there? You can’t talk about governmental privacy intrusion, whereas you can direct the ire toward companies. CCTV does it. Every March 15, they call out some company for riding roughshod over people’s privacy.

Liza: Again, I think it boils down to government censorship of what happens, like data breaches due to cyber leaks or rogue employees selling data, right? That happens in both government and corporations. The difference is, when it happens to a company like Alibaba or Baidu, it’s well played up in state media. Whereas, if it happens to the Shanghai police, for example… And we saw that recently with the Shanghai police, where someone on the dark web was essentially selling a billion people’s worth of personal data.

Kaiser: Yeah, I saw that.

Liza: The data was basically from an unsecured database belonging to the Shanghai Public Security Bureau.

Kaiser: Wow.

Liza: That wasn’t reported in the state press at all. CCTV would, in the past, play up stories about caches of data that were stolen or that could be bought off the dark web. That has everything from your DiDi trail, like where you go on your DiDi rides, your home, your office, to what you buy on Taobao, or who your contacts are on WeChat. All this was happily played up by CCTV. Whereas with the government data breaches, typically there isn’t a sound about it in the press. To your question earlier about why now on the privacy awareness and this uptick, it really did kind of coincide with the rise of China becoming this cheap and excellent place to manufacture hardware. Then, all of a sudden, you had millions of these water drop cameras available for like 80 kuài 块 or 100 kuài that you could just install.

Also, it coincided with the proliferation of internet apps, like Taobao and WeChat, where whenever you use them, you leave a data trail, and that’s just more information for the government to collect.

Kaiser: Absolutely. I’m going to stay with the kind of deep questions for now. One that you got into, which I think really helped your book to stand out, was about the relationship of Chinese people to technology itself. And this propensity, I don’t think we can really deny, so often in evidence, for Chinese leaders, especially in post-Mao China, to try to apply a kind of engineering mindset to social or political problems. It’s all very James Scott, very Seeing Like a State, this kind of scientistic or mechanistic mentality. I think it’s pretty endemic to Chinese technocracy. There is one approach in particular that you emphasize in this book, which is systems engineering. Your book introduces us to cybernetics and its founding figure, Norbert Wiener, and the links between him and Qián Xuésēn 钱学森, who many listeners will know as the U.S.-trained protege of Theodore von Kármán and the father of modern Chinese rocketry and the nuclear program. And also Sòng Jiàn 宋健, who many credit, or blame, maybe more accurately, for the one-child policy. Josh, can you talk about how all of this connects to the surveillance state that we see in China today?

Josh: Right. I think probably one of the most fascinating figures we discovered in the course of doing this book is Qian Xuesen, right? He is best known as the father of China’s rocket program. And he’s doubly famous because, prior to that, he was a very promising American, or at least U.S.-based, missile scientist who was accused by the McCarthy-era FBI of being a Chinese spy, and who then eventually, as a result of that, went back to China. The interesting thing about Qian, from the point of view of state surveillance, is that during the period when the FBI was investigating him, he was confined to his home in Los Angeles. And he basically couldn’t do anything except hang out in his library and read books. One of the books he read was Cybernetics by Norbert Wiener, a sort of child prodigy who invented this new field. And it’s incredibly difficult to explain; I’m not going to try to get into it.

Kaiser: It’s all about feedback loops and how complex systems self-correct and how you can sort of program autonomous… Well, yeah, you’re right, it’s hard to explain. I’m obviously proving that just now.

Josh: Exactly. Yeah, I mean, I think the simplest way to explain it is that it’s basically the science of how information is used to exert control. The simplest idea is to take biological concepts and methods and apply them in a technological way. The biggest example of this is modeling. The idea is that if you have a system or a situation, like, say, trying to shoot down an airplane, what you can do is collect data on the ways that airplanes, fighter jets, fly. If you collect enough of that data, you can model how a pilot will potentially fly in a certain situation. And you can use that model to shoot them down, basically.

Kaiser: To train your system to know how to respond, right?

Josh: Exactly. So, Qian Xuesen got deeply into this concept, thinking in terms of systems and how information affects systems and the ability to control systems. He used that in his work on rockets, but then when he went back to China, his ultimate ambition was to use this on society. I mean, society was the ultimate engineering challenge to him. We can get into how accurate his application of these systems actually was. Some people did partly blame him for the Great Leap famine because he misapplied his approach to agriculture, a topic he knew nothing about. But anyway, he had a lot of cachet in China. A lot of Communist Party leaders held him in high regard. And so, they did eventually listen to him. One really striking discovery we had was that on the same day that Hú Yàobāng 胡耀邦 died, the reformist Communist Party leader whose death eventually set off the Tiananmen Square protests, Qian Xuesen published an essay basically laying out his theory of social systems.

And he said that society is made up of ideological, political, and economic subsystems, and that all of those need to be in harmony. If they’re not, the entire thing will fall into chaos. Literally, months later, you have China, a country whose economic subsystem was battered by inflation, whose ideological subsystem was confused and struggling with how to incorporate, or repudiate, ideas about democracy, and whose politics was fractured, and chaos ensued. Anyway, his ideas kind of took hold and worked their way into the Communist Party mindset. Over time, party leaders started to think in terms of systems science. I think they started to think of society as an engineering problem.

Kaiser: Do you see at work what you might call cultural reflexes or some deep structures in the Chinese psyche that maybe bend Chinese elites toward an embrace of this very hubristic, social-engineering mentality? Do you think that it’s specific to Chinese communism maybe, or just to the post-Mao leadership? Either way, how would you plead to those who, I’m going to say inevitably, might accuse you of a kind of techno-Orientalism when you’re talking about this mentality?

Josh: Right. I mean, it’s interesting you bring that up because that actually never really entered my mind. If you think of where these ideas came from, I mean, Norbert Wiener was an American. And a lot of the utopian drive that I think animates leaders like Xi Jinping right now has forebears in Soviet Russia and East Germany. East Germany obviously drove one of the major evolutions of state surveillance. They certainly had a very utopian idea. They just didn’t have the tools, the technology, to pull it off the way that China has. I mean, it’s very difficult to say whether there’s something about China, Chinese culture, or Chinese politics that makes them more likely to adopt this, but certainly, this is not in any way unique to China.

Kaiser: Yeah. No, that’s a great answer. Liza, we touched on this earlier, but I want to drill down a little bit and get you to say where you actually, well, both you and Josh, come down on this suggestion that Silicon Valley bears some culpability for what we’re seeing today in China. I mean, accusations that Cisco, for example, and other network equipment manufacturers from the U.S. might have helped China create its legendary Great Firewall under the Golden Shield program, that’s been around for a very long time. We’ve all seen them hauled out in front of Congress and called on the carpet there. What other ways, though, do you think that U.S. companies have maybe enabled surveillance in more recent years? And then, what’s the extent of their culpability? Where do you come down on this?

Liza: Yeah, maybe I’ll set the scene first before I jump into my answer. What we found was that U.S. tech companies have been involved with China’s surveillance state right from the start. At the turn of the century, around 2000, 2001, you already had some really good surveillance researchers writing about how, at one of China’s first public security expos, you saw a ton of Western names, all eager to sell to Chinese police. This included early Silicon Valley pioneers like Sun Microsystems and Cisco, Canada’s Nortel Networks, which doesn’t exist anymore, and Germany’s Siemens. All of them were there and eager to sell equipment to Chinese police. Ultimately, Sun Microsystems did sell a system to Chinese police as well. It helped China build its first national fingerprint database. Fast forward 20 years, and you’re seeing U.S. tech company involvement in the Chinese surveillance state in an equally deep way.

But in this case, they’re not selling systems, they’re selling components. And we found everything from low-end components, such as hard drives, which are basically the bedrock of data storage in every surveillance system in China (hard drives from Seagate and Western Digital, both American names), to chips, which power the heart of applications such as facial recognition and image recognition, the type of applications that would distinguish a Uyghur from a Han Chinese on the street. The chips that these systems use are often sold by Nvidia or Intel, simply because China doesn’t have a domestic equivalent that is powerful enough. U.S. company involvement in China’s surveillance state runs very deep, and it still does.

Beyond the supply chain relationship, you have the financial relationships too. Let’s take a company like SenseTime, for example. SenseTime is China’s most valuable AI surveillance company. Some of its earliest investors, before its IPO, were U.S. names like Fidelity, Silver Lake, Qualcomm, and IDG. All these U.S. investors were in it early and were there to support its development. On the question of culpability, the way I would describe it is, when U.S. companies first entered China several decades ago, what they had was blind optimism about the market, right? They were conveniently ignoring the risk. At that point, China was such an easy and quick place for potential profit that every boardroom was discussing how fast you could expand into China.

Fast forward to today, in the last five years, we’ve seen a ton of reporting out in the foreign press about the atrocities in Xinjiang. In more recent years, the U.S. government has taken a much stronger stance against U.S. companies selling technology to surveillance states through the entity list or like investment blacklist. That’s really almost shaken the companies and woken them up. So, instead of purely chasing commercial motivations and commercial priorities in China, they’re starting to realize and starting to get cautious about the regulatory and the supply chain risk to dealing with China. Now, the company boardrooms, when they discuss China, no longer is it, how fast can we grow our profits? It’s, let’s comb through our supply chain to see if there’s any involvement of Xinjiang forced labor, for example, anything to trace any of our supply chain components to Xinjiang.

Kaiser: Right. So, it’s hard, though, to draw the line. I mean, ideally, and I’m not talking about the investment aspect of this, just about the hardware, you do want to target companies that are involved directly in facial recognition, maybe voice printing, other biometric gathering, all these really intrusive practices being done in Xinjiang. But the thing is that very few of these technologies are specific to those uses at all. So, when you go after broad categories rather than specific end users, it becomes really tough. I mean, does it make sense, for example, to ban, say, Nvidia from selling GPUs to China? Because those aren’t specific. All deep learning neural nets use GPUs, right? It’s not specific to voice recognition. It’s not specific to facial recognition.

Or Seagate, right? I mean, they make hard drives, who knows? I mean, those are just commodities essentially of, who knows whether it’s going to be used to store surveillance data, surveillance video, or something completely innocuous? That’s tough. What’s the approach?

Liza: So, you hit on a very good point. Enforcement is tough. In the past, when the U.S. used the trade blacklist against, say, military companies, or end users that would supply military equipment or supercomputers, you had technologies that were single-use or possibly dual-use. So, it was a lot easier to figure out who the end user was and who was buying it. Now, when you deal with something like chips, they’re multi-use. The GPUs that you just mentioned don’t just go into AI surveillance. They were originally created for gaming, to speed up games if you’re a gamer. So, that’s the challenge. These components are no longer that easy to regulate. In fact, something like the Nvidia chip that was put on a blacklist last week can be bought off Taobao, for example.

Because there are so many uses for it. When I was writing the story last week, I remember going on Taobao, and when I searched for that chip, there were multiple sellers selling various iterations of it. That’s how accessible it is. Beyond the fact that there are multiple uses for these components, there are other drawbacks and challenges to enforcement, for example, transshipment or reselling. It’s very hard for these companies to figure out who the end user is. That said, it doesn’t mean they shouldn’t try. And I think what could be done is perhaps more resources put into enforcement on the U.S. side, or even just multilateral agreements with allies. It can’t just be the U.S. that’s not selling these chips to Chinese surveillance companies, because often Chinese surveillance companies can just buy them from a European country or some other middleman destination.

Kaiser: So, do you mean enforcement so that no Nvidia GPUs end up in China? Is that the end game here?

Liza: I think it’s more to go through your supply chain and know who your end customers are better. Because for companies, it’s so easy for them to say, “I don’t know who my end customers are. These are multiple use chips.” But that’s a bit of a cop-out because you can try to figure out who your end customers are. Supply chains are opaque, but it doesn’t mean you can’t have better visibility. You just have to try.

Kaiser: That’s fair enough. Hey, Josh, let me turn to you. You guys wrote about a Bronx public defender named Kaitlin Jackson, a total hero, by the way, who had one of the very first, if not the first, facial recognition cases in the American legal system. A man was accused of stealing a bundle of socks from a TJ Maxx, and then brandishing a box cutter at the security guard who supposedly confronted him. An image from a security camera was run through some facial recognition software, which resulted in a man being charged. So, I won’t spoil the ending here, but as I read this account, I kept wondering something about China, actually. Even though I was reading about the Bronx, I was thinking about law enforcement in China and about the Chinese criminal justice system, and it occurred to me that it might not be about technology at all. That it really doesn’t matter.

It might not be just that Chinese people have a different relationship with technology than maybe we do in the West. It reminded me of something that I talked about with Rachel Stern from UC Berkeley and Ben Liebman from Columbia, who are both scholars of law in China. It’s something that they once told me, which was that, basically, the faith in things like mandatory minimums or algorithmic sentencing grows out of a lack of faith in the human part of the justice system. They think what makes it unjust is the susceptibility of people in the legal system to corruption, to guanxi, or whatever. They don’t have faith in an arbitrary and highly personalistic legal system. So, they prefer these kinds of algorithmic outcomes. I was thinking that might be the case in China, and why it seems to be less hung up on facial recognition as part of law enforcement. What do you think of that idea?

Josh: I mean, I think you’re onto something there. I’ll preface this by saying the appeal of the technological solution is kind of universal, right? That is the case in the U.S., that one of the reasons police departments love facial recognition is that it’s easy. It’s like a machine told me. It sort of takes responsibility out of your hands, right?

Kaiser: Yeah.

Josh: But I do think, when you’re talking about China… I mean, we’ve all spent a lot of time in China, and I’m sure lots of listeners of this podcast have as well. There is just a yawning lack of faith in the human beings running the legal system in China. Sometimes it’s not even their fault. It’s just that the system itself is corrupt. If you’re a judicial official in China, you’re subject to immense pressures of all kinds, right?

Kaiser: Oh, for sure. Yeah.

Josh: Not just political, but also just more straight-up corruption. I think, given that situation, it’s not surprising that some Chinese people would actually prefer to have algorithmic systems involved. Because they perceive them, wrongly often, but they perceive them as being fair, objective, and impartial. So, yeah, if you’re confronting a system like that, this idea that a machine is weighing in instead of a human being, a supposedly incorruptible machine, you can understand why people would find that attractive.

Kaiser: Still, I find that in China, you’re not seeing people writing best-selling books like Weapons of Math Destruction, which I think is like the best pun of all time.

Josh: Yeah, yeah. And a great book.

Kaiser: Yeah. I mean, there are quite a few books in this genre that are alerting us to the inherent biases in a lot of these supposedly impartial algorithms. And that’s happening a lot in the United States. We’re very skeptical of this. I think anyone in America would read that TJ Maxx story and root for Kaitlin Jackson. I’m just imagining it finding a different audience, landing differently, in China. Anyway, who knows?

Josh: Well, I mean, in China, they actively develop AI that treats ethnic groups differently, right?

Kaiser: Yeah. God. Hey, so Liza, this is something I’m really curious to ask you about. How do we make sense of China’s new Personal Information Protection Law? I mean, is it just rank hypocrisy, or is there a way to understand it? Same with the Data Security Law, or the new algorithm regulations? Because, on the face of it, these things seem kind of enlightened, right? Demanding a lot of transparency in the way that algorithms are deployed. And this is something we haven’t really seen yet, and probably really desperately need, in the United States. I know Kendra Schaefer, who is just a brilliant writer on technology and policy issues, was actually arguing that the Data Security Law, and all of that, should be understood as China taking steps to create viability in the data market.

That you need to have that kind of order and security. You need to have it all locked down tightly before you can actually make a market of it. I thought that was a really interesting possibility as to why we’re seeing this approach. But it feels like China, in some ways, is ahead of us in this regard. So, should we think of these things just in terms of the CCP wanting to exert control for control’s sake, or how should we understand this new spate of laws regarding digital data?

Liza: I see it a little differently. I think it’s very practical for the Chinese government to introduce such laws. I mean, the Party runs an authoritarian state, but it still has to answer to its citizens, right? It still has to stay in power. Yes, there are no elections, but you can’t have riots or huge protests against you. Privacy issues, and data privacy and data security issues, have festered for a long time. We flicked at it earlier, the incidents, the data leaks, that led to this whole rise of privacy awareness in the bigger cities. So, China really knows that. China knows that it needs to do something, and that’s why it has this regulation. In some ways, it’s a leg up on other countries too. Not every country has a Personal Information Protection Law.

I mean, kudos to China for introducing such laws. At the same time, though, they’re really smart, because the introduction of such laws doesn’t change the political dynamic in China. It doesn’t change the fact that if national security or state security wants certain data, they can have it. Nor are there guardrails or checks and balances preventing them from having it, right?

Kaiser: Right.

Liza: In a way, they’re having their cake and eating it.

Kaiser: Exactly. There’s a quote you have from Jamie Horsley that says it doesn’t change the status quo as far as government access to data at all.

Liza: I can break that down. The Personal Information Protection Law protects your personal information as a Chinese consumer from greedy corporations, or real estate agents who scan your face as you enter a showroom to try and pitch you a flat, or to decide how to serve you. All these abuses are being prevented. The other thing the law does is hold government agencies accountable for data leaks and for cybersecurity. It makes government agencies aware of the need for cybersecurity, to reduce all the data leaks, the data leaks that we talked about earlier.

Kaiser: Yeah, like the Shanghai leaks.

Liza: But when it comes to state security and national security agencies, that’s an exception. There’s nothing there stopping them from gathering data if needed. And if anything, it just makes things more opaque because these agencies don’t really have to… They don’t have to be transparent about when they’re taking data or what sort of data they’re taking.

Kaiser: Thanks. That really clears things up a lot. Josh, unsurprisingly, you guys devote a whole chapter to the social credit system, a subject about which there’s a lot of confusion out there, and not just outside of China. I mean, I’ve done a show with Jeremy Daum about this. He gets a big shout-out in your book. And clearly, his pushback on the initial reporting about the system had a pretty big impact on the discourse, and on your writing, I suppose. I don’t think we need to get into this in too much detail, but just for those who didn’t hear that episode, let me erect a confessedly exaggerated straw man of the popular understanding of social credit, and you can tell me what it gets completely wrong.

The popular understanding is that the social credit system is a nationwide system that evaluates citizens’ trustworthiness and assigns them a single numeric score based on their online behavior: their behavior on social media, their ideological purity, their conformity to the law, their reliability, and the reliability of their friends and acquaintances. They lose points for spending too much time playing games, for criticizing the Party or the government, obviously, and for minor crimes like jaywalking. What does that get wrong?

Josh: Actually, I mean, it is remarkable, the degree to which that story still does hold. In fact, it’s like the one example of Chinese surveillance that your sort of average American can access, right?

Kaiser: Right.

Josh: You say state surveillance, and it’s like, oh, social credit. I mean, that picture is very exaggerated, although it essentially captures the first version of the story that we all encountered back in, I think, 2015. What does it get wrong? It gets a lot of things wrong. Obviously, there is no single score. There is no single system, actually. It’s a very fragmented system. It doesn’t evaluate your friends. It’s not algorithmic. In fact, it doesn’t even really use AI at all. Essentially, the social credit system certainly has ambitions. It was originally intended as a fairly comprehensive social control mechanism. It was trying to address what was undeniably a sort of crisis of credibility in Chinese society, especially in the economy. During the Hú Jǐntāo 胡锦涛 era, I think a lot of people in China felt that there were just people who were-

Kaiser: No trust. Yeah.

Josh: Yeah. No one trusted anyone else. And how do you have a society like that? It is a system that attempts to expand the notion of credit and credibility broadly, beyond the simple financial measures you have traditionally in the United States. But the way it goes about it is very haphazard, very piecemeal. It’s mostly based, in fact, on court blacklists. The courts developed a system of blacklists for people who didn’t pay their debts, who didn’t fulfill judgments against them. And those have been expanded, right? So that now, if you are found misbehaving in some way, like not paying a bill, not only will you be punished by the court, but you may not be able to buy a plane ticket. You may not be able to get a high-speed rail ticket or stay at a nice hotel.

Sometimes, if it’s bad enough, someone will call your phone and get an automated message saying that they’re calling an untrustworthy person. I mean, it is a system. It is real. It is just not quite as comprehensive or as Black Mirror-esque as it originally seemed.

Kaiser: Yeah. I don’t actually blame a lot of the early incorrect reporting on the reporters themselves, but this is one of those phenomena where I feel like these ideas were actually reinforced, even if they were completely wrong, by the efforts of local governments to try to talk up and show off their capabilities. And this is something that you guys talk about in this great chapter about Potemkin… Well, what’d you call it?

Josh: Potemkin AI.

Kaiser: Potemkin, yeah. Right. I mean, it reminds me of the way that so many people now think of China as monolithic. And I’m always saying, “You know, no, China’s really not monolithic.” But when you think about it, the Party always wants to appear monolithic. So, how are you going to blame people for thinking that China’s monolithic? You guys argue that it really doesn’t matter whether a lot of these AI systems actually do everything they purport to do. What matters is that the Party is able to convince enough people, with Potemkin AI or whatever it has, of its omniscience. What’s your sense of how well they’ve done, at least in this regard? Liza, do you have a sense of whether your average Chinese person now believes that they are pretty thoroughly surveilled, and has this pretty big inducement to good behavior?

Liza: Yeah. I’m pretty sure regular Chinese people have no idea about the propaganda aspect of the state surveillance system in China. I can’t blame them, because Josh and I were completely duped at the start as well. The only reason we discovered that maybe the state surveillance system wasn’t all it was cracked up to be was that we were trying to chase down state media reports about surveillance systems finding missing children, for example, or surveillance systems in a certain city providing a ton of benefits to residents, for example, helping people with dementia find their way home, because they would be spotted on the surveillance cameras. Even though they couldn’t find their way home, the police would be able to identify where they went, or in which direction, and still be able to send them home.

We were surprised in our reporting that, in a couple of instances, when we went down to the very place where a state media report claimed something had happened thanks to the surveillance system, we found that either the story was only half true, half-baked, or we couldn’t find any evidence of it at all. I’ll give you one example. There was a state media report we saw about a Hangzhou xiǎoqū 小区, a residential compound, with a new facial recognition system that was helping find missing children and senior citizens with dementia. I went down and talked to a ton of residents, trying to figure out: do you know a family who has been helped by the surveillance system?

I spent an entire day there talking to everyone. After that, I walked to the management office and the security office within the compound itself, and I said, I saw this amazing report. While we were talking, I saw, playing on loop on a screen on the wall, a promotional video about the facial recognition system helping to find a lost child and bring that person home. I was stunned when the lady at the counter said, “Oh, that was me. And we were acting. We were just trying to show people in the residents’ compound the benefits of the facial recognition system.”

Kaiser: Oh my God. You exposed it completely. That’s amazing. Wow.

Liza: It was one of those aha moments when I realized, okay, the state surveillance system doesn’t actually have to work the way you think it does, the way the Chinese Communist Party keeps telling everyone it works. People just have to believe that it works that way.

Kaiser: Either the good parts or the bad parts, right? Yeah. Amazing.

Josh: Yeah.

Liza: Yes.

Josh: I mean, I was going to say, I don’t know if you had this experience too, Kaiser, but the degree to which people internalize the notion of being watched when they’re living in China is really fascinating. It’s this kind of buzzing paranoia in the back of your brain that you don’t even know is there. I mean, I only really became totally aware of it after I got kicked out and landed in the Tokyo airport, realizing I was probably never going back to China, at least not in the near future, and feeling this immense weight lift off my shoulders of not having to think, am I being listened to? Are the things I’m saying in my apartment going to end up on a recording somewhere? Is someone tracking my movements? I think for a lot of people who are aware of what’s happening, aware of the surveillance in China, it just sits there in the back of their minds.

Kaiser: Yeah. I mean, of course, you are in a particularly sensitive profession, but I think that’s broadly applicable. Hey, so ultimately, maybe this is the final question for you guys. What can people who are deeply concerned about rampant surveillance, especially people like me or many of the listeners who have a foot in the West and a foot in China, and who don’t like how things are going in either the West or in China in terms of surveillance, what can we do with respect to shrinking privacy?

Josh: Yeah. Honestly, I don’t know, on an individual level, how much power anybody has to affect what’s happening in China. The Communist Party has… This is a huge priority for them. And they’re going to do what they’re going to do. I think the best way democracies can fight back against that is to actually figure out what they believe and to develop an alternative vision of a future with these technologies in existence. I mean, state surveillance is here, right? It’s not going away. It exists in the United States. China has a very clear vision about how that looks in authoritarian countries. But there isn’t yet a really clear notion in the democratic world about what that looks like.

Kaiser: We can’t agree whether Edward Snowden is a hero or a traitor.

Josh: Exactly. If you look at the EU, it has generally had a really strict regulatory approach to data. A few years ago, when they passed their first really big piece of privacy legislation, everyone was saying they were going to kill innovation because they weren’t allowing tech companies to trade data freely. And the EU now has a draft law that would permanently ban the use of real-time surveillance by governments, or maybe by anyone. Whereas the U.S. is kind of schizophrenic: some places have instituted facial recognition bans, others are embracing the technology. I think what it really comes down to is people, as Americans, for example, being aware of these issues, thinking about how they affect you, and then voting accordingly.

To give a very clear example of how this could affect people: if you live in a state with an abortion ban, American police have immense power to request data from companies like Google, or they can buy it from data brokers, about who is visiting abortion clinics, who is searching about abortion. That is a way in which state surveillance is going to directly affect, I think, a lot of people in this country. There hasn’t really been a serious public discussion about that yet.

Kaiser: Well, let’s hope this book helps to spark one. Liza, any last thoughts on that question?

Liza: On my end, I think it’s very important for countries globally to figure out a globally accepted standard for regulating new technologies like AI-enabled surveillance cameras, because I do believe that China wants to be a responsible global player. The reason things have skewed off in the direction they have is that there is no globally accepted regulation of these technologies, and that’s allowed China to experiment. And we know China’s great at moving fast and breaking things, even though that’s Mark Zuckerberg’s model. So, I think it’s very important to just come together and decide how we want to use these technologies, what sort of future we want to build with them, and to try to get China a seat at the table, get them involved, and try to institute change that way.

Kaiser: Yeah. I’m not optimistic, because when you think about all the really disruptive technologies that, sitting here now, it’s blindingly obvious are going to be massively disruptive: AI, genetic engineering, CRISPR, and all that stuff.

Liza: Metaverse.

Kaiser: Yeah. And self-driving cars and all these things. There are two countries in the world that lead everyone else by a huge stretch, and those are China and the United States. And these are the two that are least communicative with each other right now. So, it’s, well, kind of horrifying. Well, on that cheery note, let me thank you both, Liza Lin and Josh Chin, for taking the time. And what a great book you’ve written. Once again, the book is called Surveillance State: Inside China’s Quest to Launch a New Era of Social Control. It is on sale now. It is subtle, really wide-ranging, and beautifully written. So, congrats to both of you.

Josh: Thanks Kaiser. I really appreciate it.

Liza: Thanks Kaiser.

Kaiser: Let’s move on now to recommendations. First, a super quick reminder that the Sinica Podcast is part of The China Project. And if you want to support the work that we do with Sinica and all the other great shows in the Sinica Network, then please do subscribe to Access, which gets you all sorts of fabulous perquisites, not the least of which is, of course, our daily newsletter. It would be a bargain if that were the only thing you got, but it also unlocks the formidable paywall and gets you this podcast early, usually on Mondays instead of on Thursdays. So, sign up at thechinaproject.com. All right, let’s march on to recommendations. Liza, what you got for us?

Liza: Kaiser, I wish I had something more intelligent, but recently, everything I’ve been reading is maternity related.

Kaiser: There’s nothing wrong with that.

Liza: So, I guess my recommend-

Kaiser: That’s amazing.

Liza: Yeah, but everyone knows that motherhood is something you can’t read up on to prepare for. So, the book I’m recommending today is the Mayo Clinic Guide to a Healthy Pregnancy. It essentially chronicles the nine months of your pregnancy: what’s going to happen, what you should look out for in checkups, how you should eat, diet and nutrition stuff. So, it’s probably applicable to only half of your listeners out there, but I hope that’s helpful.

Kaiser: Well, no, it certainly is. I actually read a bunch of those, like What to Expect When You’re Expecting. My parents grew up on that kind of Dr. Benjamin Spock stuff, Baby and Child Care. Every time I talked to them, they’d say, “You need to trust your instincts. Countless millennia of humans have been doing this, and we wouldn’t be here but for that. They knew how to do it, so you do too.” But I still read the books. I wouldn’t have known to have my wife take so much folic acid otherwise. Good stuff. Thanks, Liza. So, it’s the Mayo Clinic’s book. Got it. And Josh, what about you? What you got?

Josh: Well, first, I just want to say Spock babies represent.

Kaiser: All right. Yeah, you were a Spock baby too. Yeah.

Josh: Yeah. I have two recommendations. The first one is The Backstreets, a short novel by Perhat Tursun, who’s a Uyghur writer from Xinjiang. He’s actually very close with Tahir Hamut, the character who leads off our book.

Kaiser: Didn’t Darren Byler translate that?

Josh: It’s translated by Darren Byler.

Kaiser: Yeah. Darren who writes a column for us, yeah? That’s the one.

Josh: Right. Tursun, I think, disappeared into the camps in 2018. I believe this is the first modern Uyghur novel translated into English. I have to confess, I’m still only partway through it, but it is really striking. It tells the story of a Uyghur migrant traveling to Ürümqi to escape poverty, which, of course, is an extremely common story. It has shades of Camus and Kafka; it’s sort of existentialist. It uses winter pollution as a very multilayered metaphor for what life is like for Uyghurs in modern China. It’s not an easy read, but it is, I think, a really rewarding one. It feels like one of those novels that, 20 years from now, will become part of the literary canon for this era of Communist Party rule. And then my second recommendation is The Wok by J. Kenji López-Alt.

Kaiser: Great chef.

Josh: Excellent chef, science-based chef. I don’t know if I’ve ever said this on a podcast or talked about it, but I was a sous chef in a previous life.

Kaiser: Oh, you were?

Josh: I was, at a-

Kaiser: Have you seen The Bear yet?

Josh: I have not. No, it’s on my-

Kaiser: Oh, you need to see The Bear.

Josh: It’s on my list.

Kaiser: On your list. Good.

Josh: I wasn’t necessarily a good sous chef, but hope springs eternal. One of the things I’ve always sort of struggled with, especially being half Chinese, is that I’ve been terrible at wok cooking. I could never get…

Kaiser: Really?

Josh: Well, I could do it, but I could never get wok hei, the elusive magic of wok cooking. I could never get it. And so, Kenji López-Alt has done this really… This book covers everything about wok cooking, but in his typical style, he breaks it down scientifically and explains how the extremely high temperatures and the thinness of the wok work to cook food, and how the wok toss basically cooks food in the steam that rises up around the sides of the wok. Then there are various hacks for getting that wok hei flavor that-

Kaiser: Oh, I love that. I live for that. We did a cabbage like that recently that achieved that. And I was delighted.

Josh: Yeah. The holy grail.

Kaiser: Liza’s recommendation was about the beginning of a very important life stage. Mine is about one that comes about 18 years later. I just paid a ghastly sum of money in out-of-state tuition for my daughter a couple of days ago. Then I started reading this great new book by the journalist Will Bunch: After the Ivory Tower Falls: How College Broke the American Dream and Blew Up Our Politics and How to Fix It. I haven’t gotten to the how-to-fix-it part yet, but the blowing-up part is just making me shake my little fist in rage at the truly ghastly sums of money I just paid out to the University of Wisconsin-Madison. But it’s a great school. I love it up there, and so she’s in good hands. I’ve got lots of friends up there, so I’m sure it’ll be worth it.

She’s already thriving. Anyway, a really good book. I highly recommend it. He’s a really good stylist, and there are deep reflections on the whole journey from the G.I. Bill to our present predicament with college. So, check that out. Liza, thanks so much for joining.

Liza: Thank you for having us. It’s our pleasure.

Kaiser: Josh, great to see you again, man.

Josh: Yeah. Always great talking, Kaiser.

Kaiser: How long are you going to be in the States?

Josh: I think we’re here basically through the end of the month. And then we have to head back to the Asia time zone to prep for the Party Congress.

Kaiser: All right. Well, I’m going to try to manage to see you at some point, I hope.

Josh: Yeah. That’d be great.

Kaiser: All right. That’d be great. All right, thanks so much.

The Sinica Podcast is powered by The China Project and is a proud part of the Sinica Network. Our show is produced and edited by me, Kaiser Kuo. We would be delighted if you would drop us an email at Sinica at thechinaproject.com or just give us a rating and review on Apple Podcasts as this really does help people discover the show. Meanwhile, follow us on Twitter or on Facebook at @thechinaproj, no E-C-T. And be sure to check out all the other shows in the Sinica Network. Thanks for listening, and we’ll see you next week. Take care.