CETaS Annual Showcase 2024

In Conversation with NCSC - AI and Cybersecurity

Prof Tim Watson (Science and Innovation Director, Defence and National Security, The Alan Turing Institute) joins Ollie Whitehouse (Chief Technology Officer, NCSC) at the CETaS Annual Showcase 2024 for a conversation on emerging AI and cybersecurity trends, challenges, risks and opportunities.

Keynote Speech - Bella Powell (Cyber Director, Government Security Group, Cabinet Office) 

Bella Powell (Cyber Director, Government Security Group, Cabinet Office) provides a keynote address on the key cyber challenges facing the UK Government over the next 5 years at the CETaS Annual Showcase 2024.

“Morning everyone. Thanks so much for inviting me to be part of this fantastic event today. The work CETaS does is such a critical enabler to our national security and resilience mission, and today's showcase is a real testament to the amount of value that this research brings to us within government but also across the UK more broadly. My topic for today is the cyber challenges facing government over the next five years. You might be expecting me to focus the discussion on the threats we face from Nation States, organised criminal groups, and the proliferation of cyber weapons. We've seen some very recent examples of publicly attributed Nation State activity, including the Deputy Prime Minister's recent announcements in December and February of attacks attributed to China and Russia, and we've also seen CISA recently announce that US critical national infrastructure has been compromised. We've also seen lots of quite significant examples of ransomware incidents, including those affecting the British Library and The Western Isles Council... 

“You might also be expecting me, in addition to that broader threat environment, to be talking about the threats and opportunities posed by emerging technologies, particularly given the conversation today: the threats and opportunities from automation, from large language models and from quantum computing as well. But if you'll allow me, I'd quite like to treat this question slightly differently… what I've just described, from my perspective, is the operating context. So, the challenge for government, and specifically for my role in delivering the government cyber security strategy, is not just that operating context but how we deliver within that context, how we respond, and how we promote National Security and prosperity given those broader challenges and the broader environment that we have to work within.

“I'm going to pick out three themes where within government security we see real opportunity but also really significant risk over the coming months and years. And then I'd be really interested to get your thoughts on those as well, more generally but also as part of the panel that we have later on. So, the first theme I'm going to pick out is risk calculus. How do we change the risk calculus around cyber resilience at both organisational and national levels? How do we ensure that cyber risk is considered at all levels of decision-making in a balanced, well-informed and quantified manner, and that leaders are fully enabled to make decisions on the basis of that risk calculus? Part of the challenge we face is about measuring and communicating cyber resilience in a way that enables leaders to make balanced and informed decisions but also that holds them to account. We've made really strong progress, I would say, in this space through the rollout of our new GovAssure scheme, which is a new cyber security assurance regime for the whole of government. GovAssure is based on the NCSC’s Cyber Assessment Framework, and it also builds on best practice from broader critical national infrastructure sectors, and the ability it gives us to objectively measure system- and organisational-level resilience is a really critical step in embedding cyber risk into decision-making at all levels of government. Over time we'll be building this into core governance mechanisms as well as using the data to diagnose systemic vulnerabilities and to target interventions, which is really critical for us. But even if we're able to articulate that clearly, making the leap from levels of resilience through to levels of risk reduction and overarching benefit is really challenging. We still, collectively as a profession, struggle to quantify the impact of cyber incidents, and the end benefit that investment in cyber security will deliver is also a really challenging thing for us to try and quantify.

“One of the biggest challenges and the biggest gaps that I think we see is the lack of data to accurately and convincingly calculate the expected impact of cyber incidents. I spent a lot of time on this both in private sector previously but also within government and whilst there are some really interesting examples of innovative approaches to tackling that challenge and to quantifying impact, I don't think we're at the point yet where we can bank that and where we can build that effectively into our risk calculus.

“The second theme I'll draw out is agility. How do we increase our agility to respond at a macro level to advances in technology and the broader geopolitical environment, but also more operationally to specific threats and incidents? If we take generative AI (a hot topic for today) as an example, the rapid development of that field means that it's amplifying a range of existing risks as well as generating new ones. It could both empower hostile actors and significantly increase their ability to do harm. We know that threat actors are already using AI to increase the efficiency and the effectiveness of aspects of cyber operations such as reconnaissance and phishing. But there are also huge opportunities for AI to improve organisations’ ability to defend themselves as well as enabling broader productivity benefits. For example, we know that AI can improve the detection and triage of cyber-attacks and identify malicious emails and phishing campaigns. It can also reduce the amount of time staff spend analysing data to inform good decision-making.

“Responding effectively to that challenge, particularly within government, means being much bolder, and it means getting comfortable with taking more delivery risk and adopting new ways of working. The work of colleagues in the Central Digital and Data Office to develop the generative AI framework for government is, I think, a really great example of that. They’ve put their heads above the parapet, and they've worked really rapidly to develop guidance and support to help government to adopt, exploit and manage the risks from AI effectively. At a more granular operational level, our recently soft-launched Government Cyber Coordination Centre is enabling us to take a more data-driven and collaborative approach to responding to serious threats and vulnerabilities across government. GC3 is a joint venture between the Cabinet Office and the NCSC, and it's already having a really positive impact on our ability to respond to threats and vulnerabilities across government.

“Autonomous cyber defence – our topic shortly – poses some really interesting opportunities in this space particularly for governments, and at both organisational and cross-government level, I think there are some really interesting threads for us to draw through, and I'm really looking forward to hearing from our panel shortly on the impact they think this will have on our ability to act in a more agile manner.

“And then the final challenge that I'll raise is one that you'll probably expect me to, actually… it’s probably one of the most prosaic but it's also one of the most fundamental from a government perspective, and that's people. How do we build the right access to skills and knowledge to enable us to respond to threats and emerging opportunities successfully? This is a really complex problem, and it's not exclusive to either the public sector or even the UK. We're facing a global shortage of critical skills in cyber, AI and associated technologies, and this is felt particularly acutely, I think, by the public sector. We also know that the entire tech sector, and specifically cybersecurity, suffers from a chronic lack of diversity. We're missing out on massive pools of potential talent and we're failing to ensure that all those with the potential to succeed in this domain are given the access and the support that they need to do so. Addressing the cyber skills gap is therefore, absolutely rightly, a critical priority in the National Cyber Strategy and also the Government Cyber Security Strategy.

“And I should say that colleagues in the NCSC have already begun to tackle this challenge head on at a national level with the CyberFirst scheme, in my personal opinion a really brilliant example of their work. CyberFirst focuses on increasing skills and diversity, and this work has delivered some really fantastic results, with over 56,000 girls from across the country involved in the CyberFirst Girls Competition since its inception, and almost 600 schools now attaining CyberFirst recognition for helping to develop the UK cyber ecosystem. Both of those are testament to that broader work. Within wider government, we've also established cyber apprenticeship schemes and fast stream programmes, and we were recently recognised amongst the top 100 apprenticeship employers in the UK.

“Collaboration with industry, academia and research is also absolutely critical, as I expect everyone here today will agree. That's why the work of CETaS in particular is so important, because it brings together colleagues from an enormous range of disciplines to deliver research that then has direct applicability to policy and specifically to government policy. We've also seen really great examples of knowledge sharing and collaboration through the AI Safety Institute, working with frontier AI labs to assess new AI models and really understand what the impact of those will be. And we're also increasingly being supported by industry, academia and research through our National Cyber Advisory Board and Government Cyber Advisory Board, both of which are delivering really fantastic benefit, particularly in the delivery of the Government Cyber Security Strategy.

“But clearly, we need to do more to tackle the scale of this problem, and the key, from my perspective, is in significantly increasing the diversity of the workforce that we're bringing into government and that we're attracting and retaining to help us with some of these broader challenges. So, within government, we have some pretty big challenges ahead, I think it's fair to say, and a very complex operating environment to work within. We can't forget in our risk calculus that, as government and as individuals, our role is to serve the public and to deliver services, and often those services are being delivered to the most vulnerable people in society. Our risk calculus needs to bear in mind that the resilience of those services is absolutely critical, but we also need to ensure that security doesn't slow down or impede their delivery. We need to increase our agility to respond to new threats and opportunities, but we also need to use security and emerging technology to enable broader transformation and unlock productivity gains, and we need to recognise that people are the enabler of all this. People with the right knowledge and skills working towards a common purpose. And that's really what gets me up and out of bed in the morning, despite having to wipe off the peanut butter stains that my children have put all over me; what gets me into work is working with really inspiring colleagues on a very challenging mission and really focusing on how we can better protect public services for the broader good.

“I'm really proud of the progress that we're making together, and I'm also extremely grateful for the partnership and collaboration of colleagues within CETaS and also within wider research in achieving that shared mission, and I look forward to even greater collaboration in future. Thank you.”

Keynote Speech - Prof Jennifer Rubin (Chief Scientific Adviser and Director General, Science, Technology, Analysis and Research (STAR), Home Office) 

Professor Jennifer Rubin (Chief Scientific Adviser and Director General, Science, Technology, Analysis and Research (STAR), Home Office) provides a keynote address on research, innovation and partnerships in UK national security at the CETaS Annual Showcase 2024.

“It's really wonderful to be here today with all of you and how fitting that we're in the Institution of Engineering and Technology which was of course granted a charter over a hundred years ago for the purpose of advancing engineering and technology and their applications – a super important role. The institution's been an asset in assisting government and policy areas to inform decisions and improve lives which is crucial to all of us...

“So, it's a privilege to be here to speak to this diverse audience of government officials and partners from the private sector, academia and beyond with a shared interest in this incredibly important mission, and particularly around emerging technology and national security. As the Home Office's Chief Scientific Advisor and DG for STAR (as we call it back where I work), my role is to promote evidence and scientific advice in decision-making. I'm extremely grateful to CETaS for blazing a trail in supporting this mission.

“I'd like to start just by applauding the incredibly policy-focused work that CETaS has already published in the short time since it was launched in 2022. It's really impressive, and an awareness of emerging technology is crucial to ensuring that our policies, strategies and operations are resilient to what might happen in the future – and CETaS has provided insights in many areas relevant to our work in the Home Office already.

“Being invited to speak about research, innovation and partnerships is a particular privilege for me because I spent my career leading and funding research programmes and institutes and a major driver for me has always been to inform policy and operational decisions with the expertise and rigour available across the research landscape. After my PhD and a few years as an academic, I went to work at RAND for many years and then returned to academia to take up a professorship at King’s College London, leading the policy institute at King’s as universities began to focus ever more on impact and on that pull through of research and innovation. From King’s I was seconded to be the Chief Executive and then the Executive Chair of the Economic and Social Research Council as it joined UK Research and Innovation. 

“And now as DG for Science, Technology, Analysis and Research and CSA (or Chief Scientific Advisor), I lead a command of scientists, data and technical experts to deliver against the Department's policy missions. This, of course, as many of you will be well aware, includes reducing the risk to the UK from terrorism, State threats, and economic and cyber-crime as an important part of that work. In these roles I'm responsible for building and enhancing the science, innovation and technology system in the Home Office. This means looking right across the department for how science, innovation and technology can help address challenges. It also means drawing in the knowledge and expertise from across the department, across wider government, across academia and industry and beyond, and to our partners internationally to see where we can learn and where we can leverage and contribute to each other's work as well. 

“Of course, this also supports our wider cross-government ambition to cement the UK as a science and technology superpower enabled by the UK's Science and Technology framework published last year. To achieve this, our partnerships should be informed by an awareness of areas where we have strong capability and some competitive advantage already, and other areas where we should be leading. This latter group of course includes the five critical technologies set out in the framework, identified as AI, engineering biology, future telecommunications, semiconductors, and quantum technologies.

“We have a really exemplary partner in CETaS, and I'll say a little bit about this because it's a helpful way of thinking about partnership more widely. We have opportunities and governance structures that enable us to advise on research projects and help prioritise topics, aligning these with work across the National Security community. Many of the latest projects and ideas should deliver useful findings for our policy areas in the next year. CETaS also shares new publications with us, inviting us to briefings and workshops, all great opportunities to learn about the latest research and share ideas, which is something that we really value. Most recently we invited CETaS into the Home Office to host an exhibition in one of our building's thoroughfares. This was a new thing to do; it was an excellent chance for Home Office colleagues to have their curiosity piqued and to ask questions as they passed. I actually know that this has already sparked a number of new areas of possible collaboration; there's nothing like a little bit of serendipitous encounter and conversation.

“We were also pleased that CETaS was able to join us in our Innovation Zone at the Annual Home Office Security and Policing Exhibition in Farnborough last month. This was attended by 8,700 people or more with delegates from 40 countries. This brought together industry, policy, and operational decision-makers and the research community, and again gave a much wider audience an opportunity to understand the breadth of areas of interest of all of the exhibitors including CETaS; it was really good to see that represented there. 

“We also benefit from strong links more widely with The Alan Turing Institute, where alongside other government partners we fund the Applied Research Centre for Defence and Security. We're really delighted that this group of data scientists, software engineers and others focusing on the application of cutting-edge technology to problems within the defence and security sector allows us to draw on excellent partners and expertise to support our work. Alongside our partnerships with the ATI, we're continuing to explore how we can foster innovation and collaboration across the research community more widely, and to realise efficiencies, where possible, around shared interests both across the UK and overseas to be a more porous organisation. We deliver an ongoing programme of research and innovation with a range of partners, including public sector research establishments, such as the Defence, Science and Technology Laboratory (Dstl).

“A few of our wider links into academia also include the cross-government intelligence community postdoctoral programme; projects directly linked with individual universities; the Centre for Research and Evidence on Security Threats, or CREST; the Network for Security Excellence and Collaboration, which some of you may know as Academic Risk; engagement with UKRI; and our Home Office Science Advisory Council. I think each of these is probably represented in the audience as well, so ‘hello’ and ‘thank you’ for all that you do, as well as our five other science advisory committees.
 
“We also communicate our challenges externally in collaboration with partners, like the Ministry of Defence’s Defence and Security Accelerator. We often access industry partnerships through innovator functions such as the Joint Security and Resilience Centre, through frameworks such as the Accelerated Capability Environment, and directly with individual suppliers. For example, we're procuring a renewed horizon-scanning framework on emerging technology for the Home Office and our government partners. This will provide an ongoing feed of evidence to build resilience into policymaking. We're looking at going to market using the Crown Commercial Service Framework, and I'd encourage any suppliers to join so that you have access to bid for these opportunities.

“In addition to the collaborations already noted, we have a range of international partnerships, which include for example the Five Research and Development, or 5RD, which runs a Technology Foresight Network that allows us to exchange horizon-scanning findings and best practices with our closest allies, and further activities with France, Singapore, and the US Department of Homeland Security, amongst others. Our most recent bilateral Science and Technology Event with the US Department of Homeland Security included a deep dive into artificial intelligence, which was hosted at the ATI offices, and thanks very much for that, it was a hugely successful and much appreciated event.

“With all of that, the work of policy and decision-making is not getting any easier, given the pace of change and technological development in particular. It's only through building and drawing on strong partnerships that we’ll successfully place science, technology and innovation at the centre of the Home Office’s efforts to deliver a safer, fairer, and more prosperous UK for the public. We're currently developing a Home Office Science, Innovation and Technology Strategy to support us in realising this vision. It will highlight, amongst other things, the need to position futures and horizon-scanning as an integral part of robust and resilient decision-making and R&D programmes. To help the department harness novel technologies and deliver innovative solutions to the challenges our operational teams and partners face, the strategy will outline our ambition to build science, innovation and technology capabilities across a wide range of disciplines, with a clear focus, however, on the safety and ethics of science and technology, and in particular with better outcomes for end-users and citizens always in mind.

“We'll be building this approach across our work on futures and emerging technology, data analytics, AI and more. And to enhance our delivery it will drive us to consider novel and innovative ways to collaborate on science, innovation and technology with all of our partners. We'll launch the strategy in the coming months, and we'll do so alongside the publication of our refreshed areas of research interest. I'm really looking forward to working with CETaS and this wider community to achieve the ambition that the Science, Innovation and Technology Strategy will set out.

“So that's a really quick canter through, and hopefully it has given you a sense of the importance of partnerships, innovation and futures thinking in how we're going to go about building more partnerships and in the strategy itself. I'm really delighted to have had the chance to be here today and to see some of the posters in the other room, which suggest there's even more exciting work going on than I was already aware of. While I'll wrap up in just a moment, I know that I have Home Office colleagues here in the room who will be around in the next break, and I'm very keen that you take the opportunity to talk to some of them about the forthcoming strategy, about the areas of research interest, about the areas of work that we're prioritising and focusing on, and to tell us about areas that you think we could do better, areas we should be focusing on and more.

“I'm sure you're all aware that publishing strategies and having mechanisms and procurement for collaboration are really just the scaffolding. What really gets collaborations underway is the human interactions, the discussions, the invitations to get to know each other better so we begin to fully understand each other's priorities, capabilities, interests and needs. So, I'm really hoping that you'll take the opportunity to do so, and I look forward to having further opportunities to speak with all of you in various fora in the future. Thanks very much for having me here today, and enjoy the rest of the afternoon. Thank you.”

Keynote Speech - Gaven Smith CB FREng

Gaven Smith CB FREng (former Director General for Technology, GCHQ) provides a keynote address on the future of data and technology for the UK Intelligence Community at the CETaS Annual Showcase 2024.

“Good afternoon. Thank you for inviting me. It's been brilliant to be a delegate as well so I'll get on to that in a minute but thank you. For those of you that don't know me, I'm Gaven Smith and until late last year I was the Director General for Technology and Chief Technology Officer at GCHQ. It was the best job in government for an engineer I maintain, because I got to work on the cutting edge of technology. I got to do it for reasons that really matter, and I got to work with brilliant people to do that. Now, I'm an academic, I'm a technology advisor, I'm a non-executive director, and I'm an advocate for online safety and I still get to follow my passion for technology for reasons that really matter and to work with brilliant people...  

“Today's been fantastic because I've got to meet a whole bunch of friends I know well [and] to meet a whole bunch of new people. So, it's been brilliant to be a delegate as well as a speaker. I thought I might offer you a few words around the importance of emerging tech in National Security, my view on what that might mean in the short to medium term, and what that might mean to us as a community. Anybody that knows me well knows that I can only count to three, so I'm going to offer you three reasons for why emerging tech is important to us. You won't remember the detail of what I'm about to say for the next 8 minutes, but I hope you will at least remember that I made you think, and that I said three things. So, thing one: emerging technology is strategically important, vital I would add, because technology and security are rapidly becoming one and the same thing. Thing two: emerging technology is becoming operationally urgent for us to use in the work that we do. It's critical in the time sense because some of that technology is creating the very threat that we need to use it to mitigate. But let's not lose heart, because the opportunity from emerging technology I would argue has never been better; it's never been more accessible; it's never been more relevant, and that makes it never more exciting. Now clearly that technology won't make itself useful on its own, and so there's a really important underpinning point here about the vital importance of learning communities, empowered communities, and some of that came out in that last session, and about them being inclusive and diverse communities too, and again we talked about that in the session just before tea.

“So let me unpack that in just a little more detail: why is this so strategically vital (thing one)? Well, you don't need to take my word for this: the Integrated Review and its Refresh last year was really clear about this. We live in a contested and volatile world. And part of our policy response and part of our operational response to that is to secure a technology advantage for the UK. Now, like lots of you, I participated in that debate as that piece of policy work was being done, so of course I'm a fan of it, but I think it is really important to see technology recognised in a piece of policy work like that. But it's not just in policy work that we would recognise the importance – strategic importance – of emerging tech; you only need to open your news feed to see it all around you, whether it is the war in Ukraine, the crisis in the Middle East, State and non-State cyber threats, deepfakes, online harms, some of that being accelerated by AI as we talked about this morning. Technology has never been more relevant to risk and never been more relevant to us doing something about that risk. So that's thing one, which leads me rapidly on to thing two, around the operational urgency of this, and if you look at that list of threats and how technology is a core part of those threats, then you can very quickly get to the operational urgency of this.

“I'm still a big fan of the five Vs of data. So: velocity, variety, volume, veracity, value. And I still think they have massive currency operationally for us as a community if you think about the data that we hold in trust in National Security. And it's really important that we derive every ounce of value from that data even though it's getting more and more complex around us. I suggest there are another couple of Vs that we need to worry about: the first is verticals, and that's about sharing the data we have across boundaries. Some of the most rewarding work I got to do in National Security was to bring disparate data communities together to build capability together and share best practice. So, I think our ability to work across boundaries is super important and becoming more so. But so is public trust and ethics and holding ourselves to the highest compliance standards, so maybe that's visible, and that's also not just virtue – sorry, I'm on my 8th V now – it's really important that we do real things to show that we are prepared to be visible, like this conference, like GCHQ publishing its AI ethics guide and the code that it uses to do its model management in a thing called Bailo, like every piece of well researched and expert NCSC advice, and like the way we published the AI risks ahead of the Bletchley Conference. I think it's super important that we are visible and transparent when we can be. And, as a community, we have taken great strides on that; we still have more to do.

“My third reason, and this might be the one that's a slightly bold claim, is around the technical opportunity. The opportunity for us to do this has never been better. It's never been easier or more accessible for us as a community to get our hands on this cutting edge tech. And I think increasingly so, we can bring private investment and private capital towards that, and the work of the National Security Strategic Investment Fund is great in showing us how we can secure the investment and efforts of other organisations to do some of our work with us and for us. There's never been a better example in our recent history for me than the work we did during COVID. Like every other community, the National Security community had to learn how to work in a hybrid way and work from home during the pandemic. That was unusual for some of my colleagues in National Security, but it has fundamentally changed the way a lot of us work. I think it generated two really important network effects. The first was around actually doing capability build differently, using cloud native technologies and experiencing the power of open source; those things really sped up during COVID and we really used the imperative to do that well. The second network effect was around inclusion: we've generated a whole range of different working patterns by working differently, and that's made us a more accessible community to work with, whether you work in academia or industry or you're a return-to-work parent who wants a different working pattern. So, for me that's digital transformation. That is technology enabling radical ways of working, and different ways of working. We need to be doing this inside National Security because our partners are doing this, and yes, our adversaries are doing this.

“And at this point you'd probably expect me to say something about AI and I am not going to disappoint you. It has been the defining issue since we all could consume generative AI through a web interface last December. We've talked a lot today about the opportunity and risk and it is super important that we've done both. I think there is a really important pattern for us to establish around understanding risk, getting our hands on the technology and using it, sharing what we know as practitioners and then getting the basics right around all of that, and you've heard some powerful stories of us doing that today. A little bit of well-placed FOMO (the fear of missing out) around the power of AI in our community is super important but it has to be well-placed, we have to get the security and safety right too.

“Which brings me back to inclusion. We all know that diverse, inclusive teams do better work. We've seen it in the analysis work we've done, in the tools that we build and the capabilities we develop, and it's as true now as it ever was. In the generative AI age, it is going to be one of the best defences we have for mitigating the effects of bias in the way that we train models and the way that we use them. So, inclusion has never been more important.

“So, with your permission, I'm going to add a fourth reason to my list of three. I know I've broken the rule of three horribly in doing so, but powerful, inclusive communities are what will put this technology in the hands of the right people to do the things we need them to do. I know CETaS have published their work on that, and I would wager that you wouldn't be in this audience if you didn't at least believe some of it. So, emerging technology: it is strategically vital; it is operationally urgent that we start using it to get after the threat; and the opportunity to do that is with us now, but only if we build a diverse and inclusive community and we're open in the way that we do that. So, I hope that wherever you work in this community on emerging tech, you feel that you are part of something much bigger, because it's communities that are going to help us make a difference with it. Thank you.”

Introductory Remarks - AI and Strategic Decision-Making - Madeleine Alessandri (Chair of the Joint Intelligence Committee)

Madeleine Alessandri (Chair of the Joint Intelligence Committee) introduces the panel discussion focusing on AI and Strategic Decision-Making at the CETaS Annual Showcase 2024 (01:09-08:16).

“Brilliant. Well, a bit like Gaven [Smith] said, it's great to see so many familiar faces in this audience today, and some old friends here, so thank you for the invitation. I'm really excited about the work that we've been doing with CETaS, which, as has just been said, is getting quite a lot of really good coverage in the media today. As Chair of the Joint Intelligence Committee, which is my job at the moment, I am the one responsible for making sure that our ministers and senior officials across the UK Government, when they're facing those high-stakes National Security moments and those high-stakes decision moments, have the best possible assessment of the situation...

“And in many ways, today we find ourselves in the strongest position we've ever been in to meet that task. There are vast amounts of data available to us. It comes from a huge range of sources, open and secret, and any of these, either alone or in combination, could shed light on some of the threats we face or give us early warning of what may be coming. But the sheer volume and velocity (and all the other Vs that Gaven mentioned) of these available data in itself creates its own challenges, and the issue facing my analysts today, some of whom are in this room with us now, is how to absorb and make sense of that data: how to work out what's most relevant and most important, spotting patterns, spotting anomalies, and putting the whole jigsaw puzzle together to get a picture.

“Now, advances in artificial intelligence are offering us huge and really exciting potential to help surface new intelligence insights and to boost the productivity of my already very hardworking analysts. And we've got to harness that potential, because if we don't, we face the prospect of drowning in the data and going under. We face a real risk that we will fail to spot emerging risks or trends as a result. But at the same time as bringing help, artificial intelligence also brings us some new challenges in how we produce intelligence and how we assess it. My business is a business of uncertainty. We have to test the evidence base, bring a sceptical eye to scrutinising all of the intelligence that is put in front of us, constantly scan for bias in the open-source datasets available to us, and know as much as we possibly can about the accuracy and consistency of the lines of insight that we're getting over time. And these questions of bias, robustness and validation, which have been around for a while, apply just as much to AI systems as they do to the more traditional sources of insight that we have, both secret and open.

“So, it's brilliant that CETaS have helped us think really deeply about what it will take to successfully integrate artificial intelligence into our work, and how we can bring the same professional standards and healthy scepticism to evaluating the reliability and robustness of AI-enriched insights as we do with other sources. But the question of trust and accountability is really the critical thing here. We need to take real care to ensure that when decision-makers choose to act, they know how much confidence to have in the evidence base, and that we can convey the reasons for that level of confidence in very clear and consistent ways. A human analyst can explain their reasoning: I quite often sit down with my analysts, and we go backwards and forwards on a particular issue so that I understand their reasoning for reaching a particular analytical conclusion. An AI system may not. So, we will need to be very clear about the accuracy and consistency of the AI systems that we deploy in the intelligence process, and assure that they're used appropriately, because rightly, ultimately, we remain accountable for the accuracy of our analysis, even when more of it is assisted by AI.

“So, what CETaS have helped us do is understand much better what it will take to build the processes and the systems we need to deploy AI wisely in the intelligence analysis process. It underlines for us the real need to train and equip our analysts to use AI with wisdom and with confidence, and to consider what support our decision-makers might need when facing higher-stakes choices, especially when AI systems have played a significant role in shaping our understanding. Doing this new research has been really groundbreaking for us: groundbreaking in how we bring together professional expertise from across academia and government to consider some really hard problems. I was part of a tabletop exercise that formed part of this research, and it really brought home just how valuable it is to have a diverse mix of minds and skills in the room when considering how much weight to put on the outputs of AI systems. In our exercise we had a really rich mixture: we had data scientists; we had technologists; we had foreign policy experts; we had lawyers; we had good old-fashioned ‘securocrats’ – it was a real mix and, to use Gaven's phrase, a powerful, inclusive community, and a fascinating conversation.

“It also demonstrated to me the massive range of knowledge around the table about artificial intelligence, and how we need to really invest in upskilling everyone in AI so we can fully harness its power and build our intelligence edge by using it. I'm going to pause now because the panel are going to come in, but as a final thing from me: a massive shout-out and thank you to the CETaS team, colleagues at GCHQ and my own team in the JIO for doing the groundbreaking work that's led to this report. It is going to make us much stronger, and AI is going to be a fantastic tool in our armoury going forward. Thank you.”

Media Coverage

UK elections and AI misinformation

Sam Stockwell is interviewed by BBC World Service (Arabic) on the topic of the impact of AI on political mis- and disinformation following the publication of our Briefing Paper 'AI-Enabled Influence Operations: The Threat to the UK General Election'.