
External Affairs Minister participates in The Sydney Dialogue panel discussion on Democracies and Global Technology Governance

November 19, 2021

Laura Jayes: Hello and thank you all for joining us at the Sydney Dialogue for this session on global technology governance. I'm Laura Jayes, host of AM Agenda on Sky News Australia. It is a great honour to be speaking with such an eminent panel today from various corners of the globe, of course, and thank you all for being here. I'm also pleased to announce that this session will be streamed as part of the Bengaluru Tech Summit, Asia's largest technology event. And let me start by introducing our panel: Nick Clegg, the Vice President of Global Affairs and Communications at Meta. From 2010 to 2015, Nick also served as the Deputy Prime Minister of the United Kingdom. In 2018, he was awarded a knighthood in recognition of his political and public service. Dr. Subrahmanyam Jaishankar, India's Minister for External Affairs. Dr. Jaishankar was President of Global Corporate Affairs at Tata Sons Private Limited from May 2018. He has also previously held the position of Foreign Secretary and has been ambassador to the US and China at different times, of course, among a number of other diplomatic missions. Marise Payne, Australia's Minister for Foreign Affairs and Minister for Women. Senator Payne also served as Defence Minister for three years, the first woman to hold that cabinet portfolio. Please welcome our panellists.

And I remind our audience that this is an open discussion and we encourage you to be a part of it on Twitter as well. So please use the hashtag #TSD2021. Towards the end of this discussion, we will also take some pre-submitted questions from our delegates. But let's get straight into it. Thank you to our panellists once again. Before we get into the detail of how we best shape global technology, should we start by acknowledging the biggest emerging challenges when it comes to tech governance and maintaining democratic values? Minister Payne, first to you, what do you see as the biggest challenges here?

Marise Payne: Thanks, Laura. And can I acknowledge the other participants today, and also I thank SV very much for the invitation to participate in this very important dialogue. I think there are a number of challenges that we've been focused on in recent times, and particularly currently the use, or abuse, of technology for disinformation. That is where we see governments that might want to undermine stability elsewhere for their own strategic purposes, but also, and dangerously so, non-state actors and individuals, for example, engaging in COVID and vaccine disinformation, which we know in a pandemic literally costs lives. I'm also very focused in both of my roles on online harms against women and girls, whether that is abuse, or stalking or harassment, and on finding ways for governments to address this, but also ensuring that our technology companies assume their own responsibilities in this area. We've been very determined in taking strong action in this particular policy space. And finally, there is the misuse of technology by more authoritarian regimes and actors, including the oppression of people in terms of mass surveillance, or foreign interference, where we've seen the use of cyber and social media, for example, in elections. So we're all very alert to that. And those are, I think, three of the challenges that I can see.

Laura Jayes: Yeah, the challenges are many and varied. Dr. Jaishankar, do India's views diverge at all from what has just been outlined by the Minister?

Dr Jaishankar: First of all, Laura, good to join the panel. And Marise, congratulations on the great victory yesterday. I think, as the other cricketing nations on this panel know, you're just dominating. Look, by and large, I think I agree with what Marise just stated. But if you look at it, not with a sense of immediacy but over a period of time, the fact is technology has always mattered. Technology has always been a double-edged sword. It has brought good, but with every good that it has brought, it has brought new vulnerabilities and new challenges. And this is not unique to the digital age; it has always been so. Now what is different today? I think what is different today is that the impact it has on our daily lives, on our culture, on our psychology, on our behaviour is of a totally different order. And what it has done is, at one level, raise issues of privacy, of accountability, but as you know, two of us today are foreign ministers, and I just want to share with you a sort of international relations perspective of what big tech means today. If you look at it, the market cap of Apple is more than the GDP of Italy. Microsoft is the size of Canada. Amazon is the size of South Korea. Google is the size of Australia. Facebook is bigger than Turkey. So you have private companies of a scale and a size which human history hasn't seen. And their implications for what happens within societies, between societies, are something very, very profound, and it's something we should be debating. So apart from the issues which Marise raised, I think there are questions about regulations, about practices, I would say even about culture. And I was just reading something which Nick Clegg had written, I assume for the dialogue, where the need for regulation and finding the right balance has been addressed. And I think that's a very, very legitimate issue.

Laura Jayes: Mr. Clegg, I think profound is the right word used there by Dr. Jaishankar. I assume, you know, you're coming to this panel with a different perspective than the other two foreign ministers. How do you see the challenges, and I assume you don't think they're insurmountable?

Nick Clegg: Can I also join the others by thanking you, Laura, for hosting this. And it's a great honour to be on this panel with Minister Payne and Minister Jaishankar. And since you mentioned cricket, I feel like I just can't help but mention rugby, since I'm a rugby fan. But I won't dwell on it any more than that; it would be undiplomatic for me to do so, much as I relish the result myself. I think the things that have been touched on already are clearly some of the key issues. There are the issues of sovereignty: you have these huge private sector companies which are making adjudications on their own, if you like, in a regulatory vacuum, on issues which touch on very significant social, ethical, political and cultural matters in jurisdictions around the world. And quite understandably, governments around the world (I was in politics myself for 20 years), quite understandably, the political classes around the world want to rebalance that and want to make sure that the public sphere is framing and placing guardrails, through law and through regulation, within which these private sector companies operate. There is the balance between all the good content that flows on the internet and, obviously, the bad content. I mean, in the case of the company that I work at, Meta, you know, we have a third of the world's population. And, you know, however hard we try, we've now reduced hate speech on our platform such that there are only three items of hate speech out of every 10,000 pieces of content that you see. So it's down to 0.03%. We've reduced the prevalence of hate speech by half over the last year. But even 0.03% is still a big number when you're dealing with 3 billion people on the platform; that is another issue. But I would probably single out the following issue as, in my view, the most important of all, and that is simply trying to keep the internet itself open. There's a great deal of talk about the global Internet. The global Internet doesn't exist. It's a fiction. We have an open Internet, which is enjoyed by most democracies around the world. And then we have a completely different internet, pioneered in other jurisdictions in the world, which is based on heavy surveillance and very little individual privacy, in which walls are built around the internet. And I think, as Minister Payne said, there is a real danger that we look back on these days as the heyday of the open internet. In the coming years, you'll see increasing use of the Internet by authoritarian or semi-authoritarian governments to censor speech, surveil individuals, and insist that data cannot flow openly from one jurisdiction to another. And I think that would be a great shame. My own view is that in the long run, we need something close to a sort of Bretton Woods moment for the Internet, to create the international institutions to really underpin the principles of openness, transparency, accountability, and privacy which have made the internet such an extraordinary phenomenon in recent years.

Laura Jayes: If we're looking for guiding principles, the recent Quad statement on tech is probably a good starting point as well. It talks to every phase of design, development, governance, and use needing to be shaped by "shared democratic values and respect for universal rights". That, I think, is a really big shift from even a decade ago. So Mr. Clegg, you touched on it there, but how do you see the competition between democracies and authoritarian states playing out, and what sort of issues does that raise for you? You talk about striking the right balance and the chilling effect that you could see from authoritarian states, but there needs to be a balance there, doesn't there? Any idea where that is?

Nick Clegg: So clearly there needs to be a balance between the ability of the private sector to continue to innovate, to provide these extraordinary products to billions of people around the world, and the right of sovereign governments to establish their own rules. But internationally, and Dr. Jaishankar talked about the international perspective, I do think there is an increasing standoff between the open internet, which is widely shared amongst open democratic societies around the world, and a completely different internet paradigm. And you know, President Biden is hosting, I think it's called the Tech for Democracy Summit, I think next month. I very much hope that summits like that will lead to a push to create the institutional and legal underpinnings such that the open Internet is preserved going forward. So yes, you need regulation. Yes, of course, the companies need to act with greater accountability and transparency in their operations so that they can be held to account, because with size and success come responsibility and accountability. But above all, the open data flows and the fundamental principles of privacy and free expression, I do think, are probably under greater threat than many people appreciate.

Laura Jayes: Dr. Jaishankar, what do you think of that principle, that above all the free speech element and the openness of an open global internet need to be upheld? I want to also touch on what you said last year, that "technology is very political". What were you talking about when you said that?

Dr Jaishankar: You know, again, I understand the contrast you're making between democratic societies and non-democratic societies. But if we are to do well, it's important that democratic societies find the right balance, because at the end of the day, democracy will advance when it is clear that democracy delivers. Now, we can't have the tech world, the data world, essentially run on sort of 19th century principles of capitalism. So yes, on the one hand, we need freedom, we need openness, and we need the flows. But on the other hand, there have to be basic regulations, a sense of equity, a sense of fairness. I mean, you can't have data pillaging as a basis for a global business. There are countries who would obviously like to build their own businesses; there are people who want to have control over their own data. So, I think those also need to be factored in and, you know, we are in a transition right now, every society is. I mean, we follow the debates, for example, in Australia, on content. There are regulations, there are countries: Singapore has passed a law, Japan has passed a law. There are debates; we all saw the congressional hearings on the responsibilities of big tech. So, I don't see these as negatives, I see these as natural practices in our democracies: we have the ability to talk, to differ, to argue. And I'm very, very confident, because I'm confident about the democracy, I'm confident that this too will eventually be handled in the right way. Now, the remark about tech being political, I think, was made in a certain context. And it is political in the sense that it is used by players to advance goals, some democratic, some non-democratic; they use it very differently. We are discussing the tough side of tech. I think in fairness, we should also acknowledge right up front that technology has empowered in a way which was earlier inconceivable. I'd be happy to talk about that later. Technology has created efficiencies in a way which we couldn't have imagined. So there's a lot going for it as well. And when you say democracy delivers, it is also technology in democracies which helps deliver.

Laura Jayes: Indeed, we are talking about the challenges, absolutely, when it comes to tech, and sometimes we can forget about the wonderful things technology and a connected world bring to our lives. But Minister Payne, Dr. Jaishankar is right. When you talk about big tech and look at governments, these are fundamentally democratic organisations, and these are democratic principles, but I guess the concern is how these principles and free speech are upheld and not exploited by authoritarian means. I mean, where is the balance there? Dr. Jaishankar says we've probably got time to work that out. But how long?

Marise Payne: I think you used the right word, Laura; I think the right word is 'balance'. And as for the amount of time available to us, my suggestion is we should be starting yesterday, which in fact is what we are doing in terms of the sorts of conversations and discussions that are being had. And you mentioned the Quad principles. I think, broadly speaking, the principles about supporting universal values, building trust, integrity, resilience, and a fair and open marketplace are axiomatic, if you like, for this cyber world. But we do have to be very, very clear that the rule of law that applies offline has to apply online, and rules of the road are what enable road users to stay safe. The same goes for users online. And so, in that process, being clear about identifying the difference between free speech and malicious disinformation or online harassment is important. There are rules about how we should act. We've been engaging with multiple partners for some time on these issues, including multilaterally, through the UN processes in the working groups there, which are about agreeing the rules for how states can and can't behave in cyberspace. And our Ambassador for Cyber and Critical Technologies, Dr. Tobias Feakin, is Australia's lead on that. We have multi-stakeholder discussions, and those discussions include governments, the tech companies and civil society, who look at how we actually address the questions of technical management and governance of the internet and digital technologies. From Australia's perspective, we see this as necessary for all, not just for the privileged few. So we work across the Indo-Pacific as well, to invest in capacity building for countries in our region, to ensure that they understand the technology landscape and that they can address those challenges. When you think about how important some of the platforms are for fundamental communications, including news communications, in the Pacific, for example, being able to understand that, to understand the rules of the road, so to speak, is essential for governments, for civil society and for individuals. So we have multi-stakeholder governance processes, and that's what we need to be working through together. And I think we can do that without undermining our core principles, which is absolutely essential, because it's the principles that allow us to build the fabric that will hold all of this together.

Laura Jayes: Well, the Quad is a really good place to start, fundamental democratic principles being brought into the tech world and then shaping that governance. Mr. Clegg, do you have any big ideas that the private sector can contribute here, through the Quad and other security partnerships, that might give, you know, that confidence in democratic values in new technologies?

Nick Clegg:
Yeah, I think the private sector has a very, I hope, productive role to play in proposing ideas about how the governance of the online world and how new rules of the online world can be developed. And you know, we're not sitting around. As a company, Meta has published whitepapers on how you might govern content issues online. We've created our own independent 'Oversight Board', which is a wholly independent entity made up of former prime ministers and Nobel Prize laureates, scholars, former journalists, and they adjudicate independently on content disputes on our platform. We work together with governments, as Minister Payne said, governments and other players on the internet. You know, we helped create and fund the Global Internet Forum to Counter Terrorism, for instance, which is now an extremely sophisticated cross-industry, multi-stakeholder entity to help us share technologies and share data when terrorist incidents occur. We were a founding member of the Technology Coalition, which is a global alliance of leading tech firms that come together to build tools and advance programmes to protect children from online sexual exploitation and abuse. There's something called the 'Digital Trust & Safety Partnership', which has been established, again, across the industry to establish a safer and more trustworthy internet. And, you know, we've been very open for a good two years now that areas like privacy, data portability, election integrity and content are issues which private sector companies should not be left to decide on themselves. I mean, it is clearly not right that people like me, or sometimes Mark Zuckerberg and others, are having to make agonising decisions about whether a piece of content can or can't stay up or stay down according to our own rules, when those rules are, of course, the rules of a private sector company; these are rules which should be adjudicated upon by legislators. But as I know myself from many years in politics, legislation takes a long time and tech moves very quickly. And I do want to be clear, we really do believe that new regulation needs to be introduced. I mean, not all regulation is good; some of it we agree with, some of it we don't agree with, but of course new regulation needs to be introduced. This is a very young industry. Meta, which was then known as Facebook, was founded, I think, the day after or the week after Roger Federer became number one in men's tennis. Which shows just how young it is. The Federer era is longer than the Facebook/Meta era. And yet, you know, a third of the world's population now use it. And yet it sometimes takes 15 years for legislation to finally find its way onto the statute book. So we are not sitting around as an industry; we really are innovating a lot. I think the Oversight Board is a particularly striking new innovation. No one's tried this before in the industry. We're trying to be more transparent. Every 12 weeks, we publish all the data on all the hate speech we take down, all the violence we take down, all the government orders we get. We're going to submit that data soon, the first company to do so, to an independent audit from EY, so that people don't feel that we're marking our own homework. But of course, there's more to do. And we very much hope we can work in responsible partnership with governments who are trying to come up with their own rules, because it's tricky, it's very tricky to decide where to draw these lines.
And something I've discovered is, no one ever agrees. You know, I operate in a company where we are shouted at every day by one set of people saying you're not taking down enough content and by other people saying you're censoring too much content. It's a highly polarised kind of debate.

Laura Jayes: Well, Mr Clegg, I know after 20 years in politics you will know about the painfully slow pace at which legislation can move. So, I know you fully appreciate that. But to play devil's advocate for just a moment, without every jurisdiction, or the major democratic jurisdictions in the world, agreeing on a uniform set of regulations, is Meta or Facebook allowed or able to maintain that governance power? You say, you know, it's not an enviable position to be in, but without, you know, global agreement on some level, is the status quo going to remain?

Nick Clegg: I think we're going to see over the next few years new legislation in India, in Australia, in the UK, in the EU, possibly in the US too. I think that's just obvious. It's all on the books, it's in the pipeline. As I say, some of it I think is really good; some of it I just happen to think is not very good. Some of it is rushed through in anger, and something, again, I learned over 20 years is that legislating in anger almost always leads to unintended consequences that don't really work out in practice. But a lot of it is sensible and smart and so on. Your question is, will that then in turn lead to a degree of fragmentation as everybody comes up with their own rules? I do think that is a very real danger. And that's why I said earlier, I do think the internet is now owed its sort of Bretton Woods moment, because I think there are some foundational principles amongst democracies, on privacy, on transparency, on accountability, on free expression, on open data flows, which I think are essential to keep the global Internet as innovative as it is, and I say that particularly in the Asia Pacific region, where the sheer growth and dynamism online is astonishing. I read somewhere the other day that since the beginning of the pandemic, more new people have, sort of, migrated from the offline to the online economy in the Asia Pacific region than the total population of the United Kingdom, where I come from. You know, in 2012, in the whole of Asia, there were only two tech unicorns worth more than a billion dollars. Now there are more than 170. It is growing; I think we will live in a world one day where the digital economy will be as big as, if not bigger than, the real-world economy. And so I think that the more the tech democracies can work together, including, and notably, democracies like Australia and India, the better new rules can be developed on a multilateral basis.

Laura Jayes: You were still thinking about the cricket there; we forgive you, Mr Clegg. But while we're talking about the Asia Pacific, let's talk about China. China is a major player in invention and innovation. Minister Payne, with that innovation and invention, such as AI and quantum technologies, coming from China, does that pose problems or complications for governments in a democracy, do you think?

Marise Payne: Laura, I think I will go back to the question about how the rules of the road operate. And it doesn't matter where the technology is derived from. If we can come together to agree on the sort of rules of the road that we should be adopting in cyberspace, so online, in that way, then the genesis of the technology should not be the issue. Now, I'm not naive, I understand that is perhaps easier said than done in some ways. And from Australia's perspective, we are very clear about where we think the lines of responsibility lie, very focused on ensuring that we, in our approach to technology, do walk within those agreed lines. And the work that we do in our multilateral fora, in mini-laterals like the Quad itself, and of course bilaterally, is key to that. Making sure that technology is not abused is essential. And the three challenges that I identified in response to your earlier questions will remain in the context of the development of new technologies. So as quantum technologies grow, as the use of artificial intelligence grows, and so on, they will all be consistent through that process, or should be consistent, I should say, through that process.

Laura Jayes: Dr. Jaishankar, what do you think about the genesis of such tech? Does geographical location matter? And if I could get you to respond to what Minister Payne was saying, as well as Mr. Clegg about this fragmented approach that we're likely to see?

Dr Jaishankar: You know, I am not sure I would put it in terms of geography. I would put it in terms of, in a sense, principles and rules. Look, what are we talking about? We're talking about an era where, I mean, what's the first thing we all do when we get up in the morning? We first reach out for our phone and see what happened while we were sleeping, how many messages came. So much of us is out there as a consequence of this technology revolution that we have seen in the last decade. It's done much more than Roger Federer has done in tennis. And the point is, it then raises, to my mind, and not just to my mind but to a whole lot of other people's minds and interests, issues of trust, issues of transparency. You know, if there's so much of me out there because I did it in good faith, I did it as my normal routine, is that being harvested for, all said and done, legitimate reasons, business reasons, or for more than that? So I think today, when we are looking at technology choices, technology capabilities, technology partnerships, technology practices, these words, you know, trust, transparency, resiliency, reliability, acquire a very, very strong meaning and relevance. So, I do think we cannot shy away from that. I now fall back from a tech debate into a classic foreign policy debate. You know, once upon a time we pretended all nations are the same, that what happens inside a nation doesn't matter. It doesn't matter what you believe; when we come to the table, we are all the same. Now, the fact is, in a much more interpenetrated world, a much more globalised world, what you do at home, what you think, how you practise, matters to me. So I do think values matter. I do think practices matter. I do think issues of trust and transparency matter. And I think that's a very big issue of our era.

Laura Jayes: Dr Jaishankar, coming back to the Quad, I keep coming back to this because recently it was agreed that technology "shouldn't be misused or abused for authoritarian surveillance and oppression". I think we can all agree that we want democratic values to indeed govern the tech world; it's an obvious guiding principle. But have we seen mission creep, for want of a better term? We've seen the lines drawn on GPS tracking, facial recognition and mass surveillance, but do we need to redraw them?

Dr Jaishankar: Again, in a way, the answers are political and social answers. I think in all our societies there will be debates; there will be debates about finding that right balance. There will be debates on how the government itself uses information and controls data, what it allows, what it doesn't, what it regulates, how far into privacy the government should or should not intervene. This is how human societies have evolved, particularly democracies. So, I don't think you're going to get a one-time answer; you're not going to get a fix. I think it will be an evolving process. But I am very confident, living in a democratic society as a member of a democratic parliament; I think, for all the slowness, the agonising, there's a value in all of that. Politics may be slow, but you know, slow is another way of describing deliberation. So, I think there is value to that. And, again, my sense is, as democracies today, we should be confident about ourselves, we should be confident about our ability to find the right answers, to get the right balance. And I think technology is one of those issues, because I think what Nick said is very important. I mean, COVID has accelerated digitization; it has shown how much better delivery can be with the tools we have. Now, if you have better digital delivery, you also have greater digital risk. And again, we have to find balances for that. So I would hope very much to see a debate, I would hope very much to see a confident debate, not an apprehensive debate.

Laura Jayes: And you're right, this is so relevant given the last two years we've had with COVID. Minister Payne, we've been increasingly asked to give up some privacies for our own protection, it's argued, and that has been true in practice. But some of these measures, if you look at them in isolation, are not so distinct from practices we've seen in authoritarian states, are they?

Marise Payne: Well, I know this is a contentious area, and COVID has exacerbated many of these challenges, as Dr. Jaishankar has just said. But I do think that we're coming from quite different perspectives in the broad, in countries like Australia, than perhaps in authoritarian states. We have a range of oversight mechanisms that are embedded in our democratic institutions, and where governments are thought to have overreached, I'm acutely aware of the parliamentary processes that are available to oppositions or members and senators in our case, whether that's an estimates process or a questions process or inquiries, whatever that might be. I'm a senator in the Australian Parliament, and I'm a great advocate of the work that is done through these processes and have been for over two decades. We also have the operation of the rule of law, which provides a level of accountability in countries like Australia. There are penalties under the laws that pertain in our countries for those who misuse technology. We are also required to be, and have endeavoured to be, transparent about how government uses technology. As I said, robust public debates through the parliament, through our free press, through free speech. So I think there are clear distinctions there. And I would also say that we as governments have not advocated the use of the cover of COVID, if you like, to entrench stronger security measures, and we would reject that being done. Where there are contact tracing phone apps, for example, or QR code registration apps, the process there is to be single-purpose, to be proportionate, and importantly, to be temporary. And I know there are discussions about how long those sorts of measures continue here in Australia as we transition through the worst difficulties of the COVID period into a different context in Australia. So I do think there are quite different underpinnings in the two systems that you've raised.

Laura Jayes: Mr. Clegg, perhaps it's not as extreme or as overt as GPS tracking, facial recognition or mass surveillance, but there are even things like controlling your own Facebook feed. I mean, perhaps Minister Payne is suggesting Facebook needs its own parliament?

Nick Clegg: Well, as I say, we're not quite a parliament. But we do have something which has binding and independent powers now over some of these sensitive content decisions that we have to adjudicate on, namely this Oversight Board. And no one's tried this before, it's early days, but I have to say, a year or so after it first launched, I think it's a really promising initiative, pending, perhaps, fuller legislation to ensure these companies are held to account. But to the point that you just made, and also what Dr. Jaishankar said, about the importance of people feeling they have confidence and trust in their own online experience: we've talked a lot about regulation, governments, private sector, accountability, and so on and so forth, but we mustn't forget one of the most important things is to give individuals as much control as possible over their own experience. And I do think that's where companies like Meta are on a journey. I mean, if you're using the Facebook app, you can now override the algorithm. You can block certain ads, you can go on these little three dots which you see on the top right-hand corner of every post, "why are you seeing this?", which explains why you're seeing the ads, why you see that post, and you can basically compose your own newsfeed by singling out the pages that are your favourites. So there is actually quite a lot of control; then there's a debate about whether the controls are easy enough to use, whether they are visible enough, and so on. But my own view is that for the future of the online world, not only do you require regulation, not only do you need to have greater transparency from the companies on everything from their algorithmic ranking systems to the kind of, you know, debate about hate speech that we were talking about earlier, but you also need to give individuals really granular controls. And I would love to see, one day, Facebook users more or less being able to kind of dial up or down whether they want to see more sport or less sport, more politics or less politics, more cooking or less cooking. I don't know whether those are the right categories, but I do think, given social media is a highly personalised experience, every single person's Facebook newsfeed is individual to them. It's like a sort of digital fingerprint. No newsfeed is the same as another, which is why it's so utterly different to the traditional media. You're not reading the same front page; you're essentially, almost, creating your own. And I think over time, what you'll see, I hope, is that people who experience social media will have more and more control over exactly how that's curated, how that's architected, and how that's designed.

Laura Jayes: Perhaps cooking is one of the least controversial parts of a Facebook feed, but I understand what you're saying. Now, to all of you, where does this lead us when it comes to our relationship with China? I mean, do you expect China is going to change its tune? Does it need to change its tune, the way things are at the moment? Dr. Jaishankar, what do you think about this and China's view of tech governance?

Dr Jaishankar: Well, obviously, they have practices which are different from those followed by many, at least the countries on the screen that we're looking at. I think that's reflective of a larger system. So I don't think you're asking a tech question. You're asking a systemic question.

Laura Jayes: Minister Payne, you've said that Australia and India are like-minded countries. And this is a quote from you, I wish to read it back to you: "will lead the international discourse on critical technologies, advocate for standards and norms that reflect interests and values, and define what is and what is not acceptable." What are the things that are not acceptable? Do you have a list of red-line issues?

Marise Payne: Laura, I think there are a few that fall into that category, and some of them we have already discussed in our exchanges today. Where there are arbitrary incursions on liberties, where there's the use of dangerous disinformation, where there's the theft of intellectual property, or malicious cyber activity and cyber behaviour to undermine stability, they are pretty clearly unacceptable. There are red lines, I guess, that are broadly agreed: technology being used to perpetrate disinformation, social disharmony, or as a propaganda tool. But I think the best antidote to disinformation, frankly, is sunlight. And I'm a strong advocate of the application of sunlight, but also of the production and the promotion of accurate and transparent information from credible sources. And I do note, exactly in the same terms as Mr. Clegg has outlined, the takedowns that have occurred during the COVID period in particular, and I made some remarks on this earlier last year; the takedowns that have occurred in the COVID period have been immensely important. The discussions led at the United Nations, including by countries like Latvia, who have focused on this, have been immensely important in the application of sunlight as well. So in my department, we've established a task force that works through the Department of Foreign Affairs and Trade on disinformation, to pre-empt it, to deter it, and to respond, because we know exactly how dangerous it can be. And we know we need to help to build regional resilience against it. And vaccines have been the best and worst example of this that I have seen, and certainly during the last months, as vaccines have been a subject of great discussion, I've made sure that we're very focused on that.

Laura Jayes: Indeed. Mr. Clegg, in this discussion you've described Facebook as being on a journey, as I think we all are when it comes to the emergence of technology. Is part of the problem for Facebook that it's a virtual monopoly without serious competition, that it needs to rely heavily on governance, but democracies aren't keeping up with those advancements? So in many ways, you're a victim of your own success.

Nick Clegg: Meta is clearly highly successful in the sense that a lot of people use WhatsApp and Messenger and Instagram and Facebook around the world. But there is a dizzying array of choices. I mean, just look at the explosive growth of TikTok over the last couple of years, really far more rapid than Facebook or Instagram grew as social media apps. Or look at the way YouTube is far, far bigger in terms of video consumption than any of our apps. Or in messaging, you know, iMessage is a far bigger messaging app in the United States, where I'm sat at the moment, than any of our messaging apps. So, whether it's photos, videos or messaging, many of them, of course, you can use with no upfront charge because, in our case, they are paid for by advertising. So I think it's actually a highly fluid and dynamic sector, because, as the TikTok example has shown, you can erupt on the scene and the network effects can allow new market entrants to grow very rapidly. But that does, of course, contrast, since you asked, with the Chinese market. We're not allowed to operate in the Chinese market, so I can't speak for it; you know, people in China can't use Facebook, Instagram, and so on and so forth. But, you know, let's be clear that the Chinese paradigm of the internet is exceptionally successful, but exceptionally different to the open Internet we're talking about. It's exceptionally successful because they've created their own... I mean, Alibaba, Tencent, all of these are rivalling the big Silicon Valley giants and doing so in an exceptionally sort of forceful way. But it's based on a completely different intellectual paradigm. It's based on keeping apps, including, you know, the apps that we have, out of the market. It's based on really heavy surveillance inside the market. And I can see why other jurisdictions, and particularly authoritarian or semi-authoritarian jurisdictions, might find it a more attractive paradigm. But that's perhaps another reason why the great sort of techno-democracies in the world, I do think, need to cooperate more and more together in order to create multilateral international standards which can withstand the pressures and the challenge of that wholly different paradigm.

Laura Jayes: Okay, we've got some questions from our delegates now that go to a lot of things we've already discussed on the panel today. This one is from Andrew in New York: "Industries that have previously required wholesale regulation, such as tobacco, air travel, motor vehicles, food safety, etc., have found it a bumpy road," he says. "It's in everyone's best interest to have a strong and innovative tech sector and effective regulation. How do you see a reset that moves beyond the catchphrases of accountability, transparency and governance to put some of these principles into practice?" Again, it's something that we have explored here today. But Dr. Jaishankar, why don't you kick it off?

Dr Jaishankar: Look, I'm not sure I agree with even the premise of that question. I mean, just think about it. What would happen to air travel if there wasn't regulation in air travel? What would happen to shipping if there wasn't regulation in shipping? In fact, these are exactly the examples which make the case for why you need a larger global understanding and, more than that, some kind of efficient regulation, rules of the road, if you would. And I don't think accountability, transparency and governance are catchphrases. I think they are real issues out there. So I made the point earlier: look, we can't have a laissez-faire, sort of 'let whatever happens, happen' kind of approach to something that is today so important in our lives.

Laura Jayes: Minister Payne, your view on that?

Marise Payne: I think we're in warm agreement on these issues. International standard-setting bodies have been very important, particularly in some of the examples that the questioner has used in their proposition, and it might be a bumpy road, but that doesn't make it unimportant. And it doesn't make it not worth travelling to achieve an outcome. So, in those sectors, we have most definitely, as Dr. Jaishankar says, seen the benefit of regulation. We need to ensure balance. I think, in all of our comments today, one or other of us has at some time used the word 'balance'; we recognise that, but there is still a responsibility that we have to address these questions from the perspective of government, from the perspective of technology platforms, and of course from the views of civil society and the community as well.

Laura Jayes: Mr. Clegg, this one is for you, from Evelyn. The question: "How can technology platforms increase public transparency, to improve government and society's understanding of their impact?"

Nick Clegg: Through transparency. I think we need to release more and more data. We need to provide more access to researchers to do their own research using our data. We have 1,000 PhDs working at Meta, for instance, and I think they've already published over 400 peer-reviewed or academic papers, scientific papers, during the course of this year alone. We're cooperating, for instance, with I think 17 academics, giving them unparalleled access to our data to examine, for instance, how social media played a role in voter behaviour in the run-up to the 2020 U.S. presidential election. So, more research, more data. As I say, we now publish all the data on hate speech, on nudity, on violence, all the things that break our rules: what do we identify, what do we catch, what do we not catch, how are we doing in different categories. We've recently started publishing a new report which lists the most viewed content on Facebook and Instagram, because there's a lot of ongoing commentary and claim and counterclaim about what people are actually seeing on these social media apps, which often doesn't actually bear very much relationship to the reality. So I think we as a company have an incentive to be more transparent. I think legislation, by the way, and rules and regulation will help in this respect. I'll give you a very specific example: one of the tensions we find is that academics want to get hold of data that we have to do their own research. But of course, the more data that leaves the building, the greater the risk there is to individual privacy, because the more data moves around, the more, candidly, there's a risk that it gets lost. We can't, as a private sector company, make that adjudication between the value of data portability versus the risk to privacy, which undoubtedly then increases; that has to be done by lawmakers and legislators in democratic legislatures. So that'd be a great example where we could do more, we should do more, and actually having some rules of the road would help us to do more of what I think everybody wants us to do more of.

Laura Jayes: Dr Jaishankar, is that a solution that would be within India's reach, to be able to provide for private researchers to get that data within a framework that you can set out?

Dr Jaishankar: You know, first of all, I just wanted to add to what Nick Clegg said. He mostly addressed how technology platforms are getting more transparent. We should also recognise that they are helping governmental processes and social processes get more transparent. Look, I'm giving you the Indian experience: we do a very, very large number of financial transfers, we run big programmes where either you put money into people's accounts or you pay them for work which is done by them. In the past, when you were doing it the old-fashioned way, frankly, a lot of these transfers used to disappear, and it was generally said that the recipient got 50% of what the government sent out to them. Today, because of technology platforms, there are direct benefits which are moving seamlessly. Through the entire COVID period, we were able to do financial transfers, and even food support, to literally hundreds of millions of people because of technology platforms. We vaccinated more than a billion people. I mean, we can pull up for you, literally, say region by region, block by block, village by village, maybe even street by street, who's vaccinated and when the second shot is due, and send them a reminder saying your second shot is due. So it has had an absolutely amazing impact on, actually, the quality of governance. And I would say, particularly in democracies, where it's very important that the credibility of the democracy is based on its ability to deliver good governance, this is something which has been hugely helpful. So, I do want to make the point that technology platforms have actually strengthened democracies, provided democracies know how to use them.

Laura Jayes: Minister Payne, that's something, of course, we've experienced here in Australia as well with COVID-19 and the vaccination rollout; what Dr. Jaishankar is talking about is the lived experience here in Australia. So do you agree with that premise, that even through COVID-19 these platforms and this technology have shown that, with mass amounts of data, there is a responsibility that can be upheld?

Marise Payne: And that's the second Quad principle about trust, integrity and resilience, which we've been articulating right through this discussion. Yes.

Laura Jayes: Another question we've got here is from Miah in Canberra. The question is: "Do democracies need to form alliances, multilateral forums, to promote essential democratic values for global technology governance?" Now, we've talked about the Quad, we've talked about the G20. Is what Miah is saying separate from these bodies that we already have, you know, strong relationships within? Can we do it within these groups, Minister Payne? Or does there need to be a separate system set up?

Marise Payne: Well, I think we can, and we do it within those groups. But there are also UN-based working groups which are broadly engaged on these issues, the Open-Ended Working Group and one other in particular. And that is the work that we must continue to do. And we must use multiple approaches, because I don't think one single approach covers enough ground or absorbs enough players. So that is to use multilateral, as I said before, mini-lateral and bilateral arrangements. Australia and the Republic of Korea signed an MOU on cyber just in the last six weeks, when I was in Seoul recently. And then ensuring that we keep these contemporary as well. I mean, Nick Clegg made the observation about the pace of change that we deal with across platforms. I know that one of the estimates floating around is that half of the next billion Internet users will be Indian alone. That in and of itself is a very powerful statistic about where critical mass is, what change looks like, how fast it is being driven and in what volumes. So staying contemporary through all of these engagements is essential. And being dynamic, therefore, and responsive to change, and doing that across multiple levels is, as I said, essential.

Laura Jayes: Yeah. Dr. Jaishankar, what do you think about the best multilateral approach here? Is there an established body? Marise Payne suggested the working groups, and I think you'd agree that there's not one body in which you can get all of this done. But is there a group which you would suggest?

Dr Jaishankar: Look, I'm in full agreement with her. I think the issue is too complicated for it to be dealt with by a small group or a particular body. I think we're looking at a sort of 'all of the above' answer. And I would even say this: look, there may be countries and societies whose practices could be very different. But that doesn't mean that you necessarily get into a sharply confrontational situation. I think somewhere we have to find the right domestic balances, and I think we also have to find ways of enlarging this conversation and trying to see how we can address this larger issue in a much more responsible way. I know it's much easier said than done. But I do think it's something which needs to be tried.

Laura Jayes: Mr. Clegg, a question here from Swetha: "How can governments better understand the kinds of interventions and governance that are needed, given much of what occurs in technology companies is opaque to the outside?" We've talked about this; Facebook and Meta in particular are trying to become more transparent, but perhaps you could talk to the ways and the plans Meta has to remove some of that opacity, if you like.

Nick Clegg: Well, I think I'd be at risk of repeating myself. I just think it's: provide more data to researchers, publish more data. We now publish data in exactly the same cadence as our financial results, so every 12 weeks we publish all the data on all the content that we've taken down, and we publish the rate at which our own systems, principally our AI and automated systems, identify that content before it's reported to us. So, for instance, I think over 99% of terrorist content is now identified by our own AI systems before it's reported to us by a human being. That is quite different to what it was five or seven years ago, and we're getting much, much better at, for instance, reducing the prevalence of hate speech. I mentioned earlier that the prevalence of hate speech now stands at 0.03%. The quarter before that it was 0.05%, and the quarter before that it was 0.07%. I don't think it's ever going to be completely 0%. And so we try to hold ourselves to account in that way. And I suspect, in keeping with the new generation of regulation that is likely to hit the statute books in many jurisdictions in the way that we've talked about, there'll be further requirements, for instance, on greater algorithmic transparency. My own view is that a lot of interesting things are said about algorithms and a lot of very silly things are said about algorithms. They don't sort of burrow into our neural pathways and make us do, feel and say things that we otherwise wouldn't. I mean, sometimes I see our ranking algorithms described as sort of lizard-like entities that force us to do things we otherwise wouldn't do. It's not really that kind of science fiction. But I do think companies like Meta should be obliged to provide greater clarity on how we design the ranking algorithms, what signals we use, what outputs we are looking for in using those algorithms. I think all of that is to come. Can I just say one final thing on the previous question? I think we've got a number of things. You've got standard-setting institutions like the ITU and ICANN. I think, by the way, the OECD and the G20 have done a great job recently in pioneering new tax rules for the digital world, to make sure that companies like Meta, that we, pay our fair share in tax around the world. But I do think there is a gap. I really do think there's a multilateral gap. That's why I keep going back to this sort of Bretton Woods spirit. I don't think we have the institutions and the machinery, or the institutional machinery, globally to really introduce proper governance of the online space on a multilateral basis. So my view is that, at some point, we will need to revisit the kind of international institutions we require, because they don't really exist at the moment.

Laura Jayes: Mr. Clegg, question without notice: Was it more difficult navigating the British Parliament back in 2010 to 2015, or your job now?

Nick Clegg: I think probably British politics remains uniquely ferocious.

Laura Jayes: Before we go down that rabbit hole, I have one last question from our delegates, and it goes to the entire panel. It's from Michelle: "Should technology governance aim to meet the minimum standards or improve society?" And she has an example; she says, for example, should it not replicate bias against racial groups, or should it decrease it? Marise Payne, first to you.

Marise Payne: Thanks, Laura, this has been a really interesting discussion, so I very much appreciate the participation of all of our friends on screen and your hosting, Laura. I think we should always be asking ourselves what should be done with technology, not just what can be done with technology. When we use technologies to uphold and protect our values in the observance of international law, they can add to safer, more secure and more inclusive societies that are able to drive greater economic growth and greater innovation. I think we need to maintain that bias of any kind is unacceptable, whether it's on race or gender or socio-economic position; there are multiple areas in which it is unacceptable. We need to continually address that, from governments, from business, from civil society and from individuals alike. It is, of course, a difficult area of endeavour. One of the things that concerns me about the pace of technology and engagement is where it breaks down the usual boundaries of courtesy and respect. And I fear that a lot of that is lost. And I don't say that because, you know, I'm so old that all I can do is hark back to the concepts of respect and boundaries, but because it's actually what gives us a civilised society. Genuinely a civilised society, and we should be able to expect a civilised society online, as we should expect it offline, frankly.

Laura Jayes: As my mother would say, manners never go out of fashion. Dr. Jaishankar, this final question, if I could get you to respond to it?

Dr Jaishankar: Well, since they don't go out of fashion, let me begin by joining Marise in thanking you for hosting us today. Look, I just think that today technology is such a powerful force of change in our lives that we shouldn't have a minimalistic view of how to approach it and its consequences. On the contrary, it should be something much more ambitious, much more maximalist, if you would. The fact that we are today debating these issues, issues of bias, issues of sovereignty, issues of privacy, issues of transparency, I think is a very good sign. The world has always progressed whenever there's been a big change; the argumentation of the world has evolved along with it, and the governance of the world has evolved along with it. I think that's what we are going to see. I do think, I mean, we are in for exciting times, very energetic, even ferocious debates. But I think that's the way to go.

Laura Jayes: We love a ferocious debate. We've had one here today, and furious agreement, it must be said, as well. Mr. Clegg, finally to you.

Nick Clegg: Well, I think technology, certainly communications technology, and certainly social media, allows for the more rapid circulation of the good, but also the bad. I mean, if you look at virality on social media, that is something that didn't exist before. And thankfully, the vast, vast majority of content that is on social media is good: it's innocent, it's playful, it's positive, it's small businesses making a living, it's people organising their daughter's football games, it's, you know, local communities getting together. But of course, the problem is that that ease of circulation is also available to people who have bad intent. And so the job of both the industry and government is to mitigate the bad. You'll never eliminate it; you'll never eliminate it. In the same way that in a free society you're not going to eliminate crime, but you want to try and get it down to the absolute minimum level that's sort of acceptable in society, you have to try and do that online, whilst amplifying the good. And, you know, that's where a lot of the really difficult ethical and cultural and social debates come into play, because who is to decide what is good and bad? Most of the debate that I'm involved in isn't actually about illegal speech at all. It's perfectly legal speech, but speech that people just deem to be unpleasant or hateful or offensive. And it's a moral minefield, because you're asking, certainly in the case of a company like Meta, you're asking a private sector company to wade deep into basically moderating and acting against content that is perfectly legal. And that's where you get a lot of the tensions, because, you know, companies like Meta are asked, "Why didn't you do more to take down X, Y, Z, to make everything nicer, and to make society better?" And we're saying, "Well, yeah, fine, but that's kind of not what the law of the land says; do you really want us to take the law into our own hands?" And that's where, I think, back to the theme of the whole event, the more that we have clear, democratically agreed guardrails that we all agree on, the better for society at large.

Laura Jayes: Indeed, it is a minefield, and I thank you all for entering it with me today. Mr. Clegg, thank you. Dr. Jaishankar and Minister Payne, really appreciate your time. Hopefully, we can do this again soon to see where we all get up to. Appreciate it. And that is all we have time for today. I want to thank all of our brilliant panellists for their contributions and thank our Sydney Dialogue audience for tuning in. I'm also pleased to announce that this session is being streamed at the Bengaluru Tech Summit (BTS), Asia's largest technology event. The convergence of dates, themes and sessions between BTS and the Sydney Dialogue speaks to deepening bilateral ties between Australia and India and our shared interest in working with like-minded countries to shape the norms and the rules underpinning emerging technologies. Don't forget, you can continue the conversation; we urge you to do so on Twitter. Use the hashtag #TSD2021. Thank you so much for joining us again today, and I hope we see you again soon.
