Foreword from Stela Solar

As the inaugural Director of the National Artificial Intelligence Centre (NAIC), my objective is to establish the foundations, connections and paths to responsible AI adoption and innovation that benefits business and community, both today and in the decades to come. The National AI Centre Listening Tour, which spanned June to October 2022, was a fantastic start to ensuring our efforts are grounded in the real experiences and expertise of Australia's industry and the broader AI ecosystem.

The listening tour was inspiring and reaffirmed that Australia is home to world-leading AI expertise that our industry is eager to leverage for positive impact that benefits business, customers and communities. Thank you to the 135 organisations across industry, academia, government and community who shared their insights with us during the listening tour. Thank you also to the organisations that hosted us during the tour, including ARM Hub, QLD AI Hub, CSIRO's Data61, Monash University, RMIT, the Australian Institute for Machine Learning, The University of Western Australia, and Cicada Innovations.

The round table discussions spanned topics such as commercialisation, industry adoption of AI, unique SME needs, regulation and standards, skills and education, research-to-industry connection, community understanding, responsible AI and trust, and Australia's AI strengths. We heard that AI technology can be a catalyst for positive business and community outcomes such as differentiated products and services, increased customer and employee satisfaction, improved productivity, and improved access to services. We also heard that SMEs face challenges with AI adoption and innovation, that there is a need for greater investment in skills and community understanding, and that there is a greater need for robust guidelines for responsible AI. We saw Australia's AI ecosystem creating world-leading outcomes with an awareness of trustworthy, responsible AI – a critical global need and a differentiating capability for Australia that is grounded in our core values of 'a fair go'.

The insights from the discussions became the foundations of the National AI Centre strategy, programs and initiatives, and helped us form our mission: to accelerate positive AI adoption and innovation that benefits Australia's business and community. They also inspired the formation of the National AI Centre Think Tanks, through which we continue to listen to representatives across industry, academia, government and community.

Stela Solar
Director, National Artificial Intelligence Centre
CSIRO

Listening tour at a glance

[DATA BOX AND PULL QUOTE]
27 roundtable discussions
135 organisations
AI is estimated to add $22.17 trillion to the global economy by 2030, with Australia expected to see an economic boost of $315 billion by 2028 from digital technologies including AI.

What do we mean by AI?

AI systems embrace a family of technologies that can bring together computing power, scalability, networking, connected devices and interfaces, and data. AI systems can be trained to perform a specific task such as reasoning, planning, natural language processing, computer vision, audio processing, interaction, prediction and more. With machine learning, AI systems can continue to improve on a specific task according to a set of human-defined objectives. AI systems can be designed to operate with varying levels of autonomy.

Six messages from the listening tour
1. Empower Industry and Support SMEs

Australia's commercial sector sees AI as an opportunity to differentiate, compete and scale, offering the chance to level up impact and use technology to gain a market advantage. AI can be a gamechanger, but more needs to be done to help organisations of all sizes, particularly small and medium businesses, find trusted pathways to AI adoption and innovation success.

Some sectors have embraced AI, seeing it as a way to improve productivity, reduce costs, create additional value and accelerate innovation. The resources and mining industry is a good example, with companies using AI to inspect and operate equipment in remote or harsh environments (including offshore), automate operations in dangerous environments, connect data sources, and draw new insights for operations. Another example is manufacturing, where predictive maintenance, AI and automation are recognised as delivering benefits in areas such as quality assurance, production speed and scale, safety and efficiency.

Consistently, we heard that industry needs help to understand how AI can holistically benefit their organisations, along with clarity on the tools and technologies available, advice on potential use cases, guidance on building and accessing AI capability, an understanding of how AI can be implemented at scale, and clearer pathways to find and engage with the AI ecosystem.

For SMEs, AI is seen as a great equaliser that can enable them to reach scale and compete with larger organisations. However, there is less certainty about how SMEs can implement AI in a cost-effective way – and whether smaller businesses can effectively build AI capability themselves and successfully compete for the skills needed to plan, assess, test and deploy AI initiatives. While the perceived complexity of implementing new tools and learning new skills might act as a barrier for SMEs, it also represents an opportunity for the AI ecosystem to work in partnership with business, as well as an opportunity for finished solutions that can be rapidly deployed without the need for significant AI capability development. Participants asked for greater guidance on which use cases can and should be custom-built, what can be achieved using off-the-shelf AI tools, and whom to work with in each of these cases. We heard that SMEs who are developing AI solutions and technologies would value the opportunity to test and verify AI systems with experts and in trusted sandboxes to enable responsible innovation, and that some innovations were slowed by this uncertainty.

2. Broaden Skills, Opportunities and Community Understanding

Australia has a highly educated workforce, but there remains a gap between the number of AI-skilled workers available and surging demand. Beyond the need to build a skilled and diverse workforce, it became apparent that the level of community understanding of AI can hinder or accelerate AI adoption and innovation in the commercial sector. Greater community understanding of AI could also open up more career pathways into AI and bring more multidisciplinary backgrounds and greater diversity to the AI-trained workforce.

The skill shortage seen in the AI ecosystem is not simply an absence of technical capacity. It is a more nuanced challenge, with gaps in specific skills, in the combined application of AI capability and industry-centric knowledge, and in AI practitioners who can speak the language of both AI and industry.
We heard that AI skills were so challenging to find that technology start-ups were expanding their teams internationally to fill the need. In addition to the need to support skill development at the start of careers, there were striking examples of AI talent struggling to re-enter the AI field. Several women skilled in AI who took career breaks to have children reported struggling to re-enter the industry due to the perception that the technology landscape had changed too much since their previous employment.

Participants told us that diversity is needed in Australia's AI workforce – from the need for multidisciplinary skill sets to work collaboratively on AI solutions, to the importance of embedding AI into subject matter across university degrees, to the value of a workforce that brings a broader set of perspectives and backgrounds to the development and use of AI technology that is fit for purpose for customers and communities. In fact, industry and institutions say AI is improved when multiple perspectives support its development, testing and use.

Universities feel pressure to evolve their offerings towards an applied and industry-centric focus, but some worry this comes at the expense of teaching students critical thinking skills. There is also a growing workload for higher education providers and training institutions, and increased expectations of teachers, lecturers and professors, who may struggle to keep their own skills up to date due to time pressures. While interdisciplinary study is important to AI, it is not easy to achieve within degree structures and Field of Research (FOR) codes.

Tour participants stressed that AI skills should not be limited to STEM and technical training. Just as digital technologies improve the understanding of every area of life, AI offers the opportunity to advance the arts, social sciences, humanities and other non-technical fields. We also heard of the value of earlier industry-integrated learning for university students and graduates, as well as better promotion of industry opportunities and career pathways. For students in primary and secondary education, participants observed that the earlier students see technology applied to things they care about – whether AI for climate mitigation or AI in sport – the more likely they felt students would be to pursue the field.

(CASE STUDY BOX)
Human-AI teams put to the test in submarine simulator

Training personnel in a submarine is expensive and comes with certain risks, but testing human performance with submarine technology in high-pressure and complex scenarios is critical for safety. To address this problem, the Control Room Use Simulation Environment (CRUSE) at The University of Western Australia uses technology to simulate and optimise human-machine team performance in a submarine environment. Developed by the Australian Government's Defence Science and Technology Group, the approach allows scientists to consider participant psychology and monitor the dynamics that affect team performance when working with emerging technology, including AI. Sensors are attached to participants during the simulation to track stress and measure levels of trust towards the AI automation, while communications are monitored and eye movements tracked to assess performance and effectiveness. The CRUSE model helps to build understanding of human-machine teaming and is being used to optimise performance in high-pressure submarine situations.
3. Accelerate Commercialisation

Successful commercialisation can be challenging – a number of elements need to align for companies to be able to research, innovate, develop solutions and products, find appropriate markets, scale, export and succeed. One challenge raised was the difficulty of achieving economies of scale or a sufficient return on investment within the Australian market alone, which points to an opportunity for start-ups to plan for global markets from the beginning of their operations.

One enabling factor is the sharing of data. Some organisations shared that they have non-sensitive data which may be of benefit to other organisations. Others spoke of the difficulty for individual organisations or research groups to scale or share data easily, asking for standards, governance and methods to enable ethical data sharing and assurance. Organisations want to safely share and access data sets to drive next-generation research and solve pressing community and business challenges, provided they can work within community and regulator expectations around privacy and good governance.

Similarly, start-ups expressed the need for guidance around legality, copyright, IP, licences, insurances and regulatory requirements in relation to AI, machine learning models and new technologies, as well as where accountability and responsibility sit throughout the value chain that comes together to deliver AI systems. IP continues to be a hot topic, with start-ups seeking streamlined and integrated approaches to IP agreements between the university and industry sectors.

We heard from start-ups that it was challenging to win early customers in Australia and that greater support in this area, through industry collaboration and government efforts, would help retain some start-ups in Australia. For example, greater industry collaboration could benefit Australia's active robotics and deep tech sector, which imports parts required for engineering and development. Because each organisation currently negotiates price points independently, participants shared that collaboration could achieve greater efficiencies in international procurement.

(CASE STUDY BOX)
Robot fruit-picker a world-first solution

A robotic solution has been developed to assist farmers across regional Queensland to identify, sort and package fruit and vegetables. LYRO Robotics is a deep-tech start-up that has developed the world's first pattern-packing robot for produce, capable of lifting delicate produce from a conveyor belt and packing it into boxes. The technology is empowering farmers to optimise operations. The robot, which can be fitted into existing operations and installed in less than an hour, helps farmers to optimise operating margins, reduce food wastage, increase efficiency and mitigate labour shortage impacts. LYRO Robotics, an industry collaboration with Monash University researchers, now plans to manufacture hundreds of robots for farmers, after receiving $100,000 in Advance Queensland Ignite Ideas funding, as well as further investment from the AI ecosystem. The company is also in talks with international venture capital companies about deploying its robots into international markets.

4. Connect Ecosystem

Australia has a world-leading, skilled AI ecosystem of researchers and innovators who are largely unknown to industry.
With many organisation types needed along the AI value chain, finding the right collaborators is important but challenging, with industry participants sharing that they go to who they know, and that determines their experience. Most participants across the ecosystem asked for a unified vision of Australia's AI sector in order to strengthen the impact Australia's ecosystem can achieve. Throughout the Listening Tour discussions, we heard there is a need for an overarching vision for AI in Australia – something the ecosystem can rally around, aligned with Australian values, and that inspires action from the AI ecosystem.

We heard that the AI ecosystem is strong but not always visible, reducing the opportunity for partnerships and engagement with Australian business. A visible, connected AI ecosystem would strengthen industry ties and keep Australian AI front of mind for decision makers, industry and the community. Participants also suggested platforms and initiatives to connect industry with the research and AI ecosystem, especially around common opportunities and needs.

Greater visibility and discoverability of the AI ecosystem is seen as a way to help SMEs who might be looking to leverage AI but may not have the capability. We heard from SMEs that they generally go to who they know. For example, if an SME works with a technology vendor, that takes them down one path; if they work with a research or academic organisation, that takes them down a research path. More support for SMEs means being grounded in their needs and demystifying the various paths they can take to AI.

Having a stronger, more visible AI community – with greater transparency about the types of organisations and capabilities that are part of the sector – could also improve community and industry understanding of the opportunities AI presents and create a balanced view of AI that empowers informed perspectives. Listening Tour participants told us that they would value a trusted, neutral voice to lead a balanced AI discussion and support industry to navigate and engage with Australia's AI ecosystem – enabling connection and clarity, and establishing a simplified, common language between the industry, research and technology sectors. Finally, we heard from deep tech start-ups that winning early customers in Australia was more challenging than internationally. Support through industry matchmaking may alleviate this challenge and prevent some start-ups from being forced to move overseas.

(CASE STUDY BOX)
HIVE minds patients using early-alert AI

Two Perth hospitals have deployed a unique system that helps ease pressure on overstretched staff and accelerates response times for scaling up patient support through the use of AI. The East Metropolitan Health Service teamed up with remote operations specialist AROSE to create a service hub called Health in a Virtual Environment (HIVE) to assist in monitoring patients at Royal Perth Hospital and Armadale Hospital. HIVE is a patient risk-level alert system linked to a remote command centre, using AI technology that monitors patients' vital signs around the clock. When the AI system identifies a patient whose condition is at risk of deteriorating, doctors and nurses in the command centre are alerted and can provide immediate support, resulting in faster response times. The East Metropolitan Health Service team established a community advisory group to co-design the patient monitoring approaches in HIVE.
Changes during the machine learning lifecycle, such as data drift, prompt a need to reconvene the group. It is a major change in operational procedures, but the HIVE solution has empowered both doctors and nursing staff by simplifying patient monitoring and enabling more effective treatments.

5. Cultivate Trust

Throughout the Listening Tour, we heard from industry that their organisation's ability to innovate with AI was also related to their ability to win trust from the customers and communities they serve. We also observed that many of our host organisations had investments in trustworthy, ethical or responsible AI initiatives or researchers – a space where Australia has an early mover advantage with the launch of Australia's AI Ethics Principles, though the principles were not widely known among our participants. Participants shared that there is a need for more practical guidance on implementing the principles.

We heard from organisations who have been successful with their AI adoption and innovation that co-designing innovative AI solutions with customers and communities builds trust by bringing them along the journey with transparency and openness. It also supported improved outcomes and designs of AI solutions. 'Supply chain resonance' of responsible AI practices was seen as key: practices and values must echo through, and be implemented across, an organisation's supply chain and AI technology value chain if they are to deliver responsible AI outcomes. Even when an organisation implemented responsible AI governance or practices, it found it challenging to have visibility of, or influence over, responsible AI approaches across its supply chain, including privacy, traceability, sustainability, anti-slavery and ethics. It was highlighted that there is plenty of discussion of principles and ethics, but more work needs to be done on concrete tools, techniques and methodologies to ensure the responsible use of AI.

A challenge that applies across sectors is the need for clarity on the regulation relevant to AI, with more information sought on trusted guardrails for the use of the technology. Particular examples shared included healthcare use cases, autonomous systems, privacy and facial recognition, as well as clarity around insurances, liability and organisational accountability for AI services and systems.

There was high variance, and some confusion, in how organisations develop and govern their AI strategy. Some organisations appointed technology leaders to spearhead the strategy. Others highlighted what is important to do: being open about use cases, emphasising holistic system design, consulting with the community, using community governance, and adopting other approaches that encourage responsible AI practices. Because AI systems are sometimes dynamic and built on changing data, outcomes are not always predictable, so agile and 'always on' governance models become important, with clarity on where accountability sits for AI systems.

(HIGHLIGHT BOX)
Australia's AI Ethics Principles at a glance

Human, societal and environmental wellbeing: AI systems should benefit individuals, society and the environment.
Human-centred values: AI systems should respect human rights, diversity, and the autonomy of individuals.
Fairness: AI systems should be inclusive and accessible and should not involve or result in unfair discrimination against individuals, communities or groups.
Privacy protection and security: AI systems should respect and uphold privacy rights and data protection, and ensure the security of data.
Reliability and safety: AI systems should reliably operate in accordance with their intended purpose.
Transparency and explainability: There should be transparency and responsible disclosure so people can understand when they are being significantly impacted by AI and can find out when an AI system is engaging with them.
Contestability: When an AI system significantly impacts a person, community, group or environment, there should be a timely process to allow people to challenge the use or outcomes of the AI system.
Accountability: People responsible for the different phases of the AI system lifecycle should be identifiable and accountable for the outcomes of the AI systems, and human oversight of AI systems should be enabled.

6. Amplify Strengths

Australia should be proud of its strengths in AI and its world-leading capability, from field robotics, remote operations and computer vision through to unique investments across Australia in responsible AI. Participants shared that more needs to be done to amplify Australia's AI innovations for both commercial and community benefit. Little is known across industry and the AI ecosystem about Australia's AI strengths, so promotion and showcasing could contribute to increased industry collaboration, adoption and innovation, and may attract talented practitioners from around the world.

During the tour we saw world-leading examples of robotics across Australia, with applications spanning fruit picking, search and rescue, remote operations, field robotics and health. With mature industries across mining, agriculture and resources, Australia has developed strength in remote operations, autonomous systems, computer vision, earth observation and resource management.

We observed a strong connection between Australia's relatively small population for its land mass and the innovation strengths it has developed. Our farmers manage stations larger than some countries. Our resources industry effectively operates, at a distance, in some of the most remote locations in the world. Our vast land has enabled some unique innovation advantages in the AI technologies and methods used to navigate and manage vast distances. This includes strengths in field robotics, intelligent edge, remote operations both on land and in space, computer vision, earth observation and resource management. A participant shared that one of the reasons Australia is a world leader in satellite calibration is that our land features are so clear. In a global context where climate change, harsh environments, environmental catastrophes and space frontiers are a reality, Australia holds a unique advantage that can benefit the world.

We also heard persistent high regard for, and insistence on, ensuring AI systems are developed to be safe, fair and transparent. A participant proudly highlighted that we have the world's first eSafety Commissioner and invented the black box flight recorder as well as the evacuation slide. While Australia has an early mover advantage with the development of Australia's AI Ethics Principles, through the Listening Tour we realised that the topic of AI was bringing out core values in our participants. Responsible AI looked a lot like a digital version of 'a fair go' – a core Australian value that many participants felt was critical to uphold during the accelerating AI wave.
We observed that most of our host organisations had invested in responsible, trustworthy or ethical AI researchers. The early mover advantage in AI ethics, our core values, and this investment in responsible AI research are ingredients that could further build Australia's advantage in responsible AI.

(CASE STUDY BOX)
Adelaide AI researchers using face mapping technology to create visual effects for Hollywood

An innovative South Australian collaboration contributed to one of the biggest Hollywood blockbusters of 2021. The University of Adelaide's Australian Institute for Machine Learning (AIML) partnered with Rising Sun Pictures to create visual effects using artificial intelligence for Marvel Studios' Shang-Chi and the Legend of the Ten Rings. This cutting-edge technology mapped the faces of stunt performers in combat scenes onto those of actors. The Adelaide-based team used a 'deep fake' method in which an identikit of both the stunt double's and the actor's faces was created, alongside a shared 'dictionary' of facial features, to develop a new way of shooting high-intensity action scenes. In work traditionally done with 2D and 3D face-mapping tools, AIML technologists used about 30,000 face images across five characters, training five principal machine learning models over more than 4 million training iterations. These models were used for 51 face replacements across six key scenes. The new method enabled greater efficiency and access to a once expensive and time-consuming technique, resulting in incredibly realistic and believable visual effects.

You spoke, we listened

The feedback from experts across the AI ecosystem during the listening tour has been invaluable in informing our next steps, which include the development of the following industry initiatives:
1. Australia's AI Ecosystem Discoverability Portal, to help Australian businesses find capability to support their AI journey.
2. National AI Centre LinkedIn Page, to amplify Australia's AI ecosystem strengths.
3. Responsible AI Network, to be established as the central gateway where businesses can learn responsible AI practices and access curated tools and guidance.
4. Insights and Guides workstream, which has already delivered Australia's AI Ecosystem Report and the AI Infrastructure and Service Availability Infographic, with more to come.
5. Industry Pathways, an initiative in development that will enable SMEs to ramp up their understanding and practice of AI.

One of the ways the Centre will continue to listen to the experiences and expertise of industry, academia, community, government and the broader ecosystem is through our Think Tanks. The Think Tanks will provide considered recommendations to the National AI Centre on:
• Responsible AI: Delivering gold-standard principles, tools and practices to help business develop, adopt and operationalise responsible and ethical AI.
• AI for diversity and inclusion: Driving diversity in talent and delivering principles and tools to ensure AI systems are inclusive.
• AI at scale: Developing practical pathways to enhance the ability of Australian businesses to implement AI and move from pilot AI projects to full-scale production implementation.

We welcome collaborators from government, industry and the research sector to boost AI understanding, adoption and innovation in Australia. Visit the NAIC website to learn more: www.csiro.au/naic

We asked Listening Tour participants for their National AI Centre wish list – items that could help the AI ecosystem. This word cloud represents a summary of the 215 asks we received.
BACK COVER: LINK TO FIND OUT MORE AND NAIC LOGO/SOCIALS/WEB ADDRESS

EXAMPLE OF HOW THIS CAN BE USED FOR AMPLIFICATION. TO BE DONE ON CONFIRMATION OF FULL TEXT.

POWERPOINT SLIDE
Lessons from the AI Listening Tour
[set of slides – this would be a slide specific to this finding]
Australia's commercial sector sees AI as an opportunity to differentiate, compete and scale. AI can be a gamechanger, but more needs to be done to help organisations understand how AI can benefit them and how AI adoption can then be scaled across their organisation.
The key takeaway: Empower industry and support SMEs to deliver on the AI opportunity –
• Clarity about the tools and technologies available
• Advice on potential use cases
• Guidance on accessing capabilities
• Pathways to engage with the AI ecosystem.
[IMAGE ON SLIDE CAN BE CASE STUDY IMAGE OR CASE STUDY CAN SIT AS SECOND SLIDE]

LINKEDIN POST
If we want Australian businesses to adopt AI, we have to empower industry and support SMEs. That's a key finding of our national listening tour, which brought together experiences and expertise from across Australia to understand our strengths, opportunities, challenges and aspirations in the country's journey with AI.
We heard that Australia's commercial sector sees AI as an opportunity to differentiate, compete and scale. But although AI is a gamechanger, many felt more needs to be done to bridge the gap between understanding AI and adopting it.
What does this mean? If we want Australian SMEs and bigger businesses to use AI to its full potential, they need the tools and advice that will empower them to do so. That includes providing:
• Clarity about the tools and technologies available
• Advice on potential use cases
• Guidance on accessing capabilities
• Pathways to engage with the AI ecosystem.
Our goal at NAIC is to help connect Australia's ecosystem capability with businesses that want to adopt and innovate with AI responsibly. We will be taking the lessons from the listening tour on board, and we look forward to continuing and deepening these conversations in the months and years to come.