Welcome back to Neural Notes, a weekly AI newsletter where we take a look at some of the most interesting, important and sometimes even bizarre AI news of the week.
In this edition: KomplyAi founder and CEO Kristen Migliorini chats to us off the back of a recent appearance before the Australian Government Standing Committee on Employment, Education and Training. Migliorini was giving evidence to the public inquiry into the use of generative AI in the Australian education system.
A significant focus of this evidence was the opportunities that AI presents, as well as the importance of risk mitigation, bias and the democratisation of access. While this was specifically about the Australian education landscape, it is a broad area of expertise for Migliorini and KomplyAi.
And it couldn’t have been timed better.
In a market saturated with AI startups, KomplyAi is actually unique
Kristen Migliorini isn’t your typical tech founder. With a background as an intellectual property litigator for a global firm focused on deep tech, she recognised a gaping hole in the market: a lack of businesses focused on AI governance, risk and compliance (GRC).
And this was back in 2018. Sure, AI wasn’t anything new. But it wasn’t at the forefront of the tech and business zeitgeist like it has been over the last 12-18 months.
But thanks to Migliorini’s successful career as a tech-adjacent litigator, she could see what was coming.
“I think we’re the first tech company here in Australia, in what we’re doing,” Migliorini told SmartCompany.
And while all of us were waiting for more AI regulation to come in, especially when generative AI began popping off in 2023, KomplyAi was already across it.
“We saw the changes from the European Union legislation. And being involved in deep tech and law, I knew that there’d be headways coming from the [EU] across the globe.”
And that experience is now coming to fruition.
It’s AI regulation season
Back in January, the Albanese government released its long-awaited interim response to the consultation into “Safe and Responsible AI in Australia”.
The response has largely been considered a risk-based, proportionate approach, similar to those of the UK and US: one that prioritises putting guardrails in place while still recognising the importance of differentiating between high-risk and low-risk AI applications.
And this is important, because there’s a wide gamut of use cases for AI, in both the tech and business worlds. While AI isn’t new, the high-speed adoption of it over the past 12-18 months is. Businesses are being encouraged to adopt and play around with AI to help superpower their product offerings and create more efficiencies.
This is all well and good, but also leaves a great deal of room for problems — particularly from a regulation, risk and compliance perspective. The increased accessibility of generative AI for startups and SMEs can be both a strength and a weakness.
“I think Minister Husic’s done a really incredible job of going out to a really diverse number of people as part of roundtables and discussions — including smaller players like myself — in those discussions, which has been fantastic. I think this risk-based, proportionate response is very much what many countries are doing overseas,” Migliorini said.
KomplyAi focuses on balancing innovation in AI and other emerging technologies with safety. It aims to ensure that startups and SMEs aren’t left in the dust when it comes to bringing AI products to market in the right way for themselves and their customers.
In the small business world, people have to wear many hats. Most SMEs don’t have a dedicated IT person, let alone someone who is an expert in AI. And oftentimes, accessing this type of knowledge through outside consultants is prohibitively expensive.
And this is where KomplyAi is carving out a highly specific niche for itself.
Because while we’re seeing an exponential number of AI startups pop up, KomplyAi remains unique in that it’s one of only 40 businesses across the globe that are focused on AI governance, risk and compliance (GRC).
“What we saw is that there’s very few people globally who are doing sort of responsible AI risk assessments,” Migliorini said.
According to Migliorini, she saw this as a future roadblock to innovation in the space. While big tech providers were democratising access to generative AI, there was still a gap.
“The business model for access and support to legal resources, data scientists, and all the services and technology you need to safely develop, use and deploy your technology to market doesn’t necessarily follow the same model,” Migliorini said.
“We needed to look at ways to build our technology to democratise access to more specialised knowledge.
“So we’re looking at basically building out their tech to help support that divide, both to enterprise and sort of smaller players in the market.”
Compliance makes AI more accessible and provides a competitive edge
Migliorini is passionate about the benefits compliance education and implementation can offer — particularly for businesses that are working across various countries that will have different regulatory requirements.
“We’re seeing all sorts of increasing requirements from here and overseas. But what we also say to startups and smaller players is that [GRC] really is a competitive advantage,” Migliorini said.
“Customers will vote with their feet and there’s just a level of expectation that AI technology is safe.”
As part of its path to increased AI compliance accessibility in Australia, KomplyAi has submitted an application for the Responsible Artificial Intelligence Adopt Program.
The $17 million program was announced in late 2023 and is set to involve the establishment of five centres across Australia to assist businesses in safely adopting AI. The idea is to provide free specialist training for businesses to implement AI responsibly into their operations.
“What we’ve bid for is the creation of a virtual centre, enabling access to frictionless compliance by digital means. And we’re supported by some of the top universities, big tech VCs and more,” Migliorini said.
The idea is to connect some of the top minds in the space with smaller players in the AI market. It would also help small businesses complete certain testing and obtain the certifications required by other international jurisdictions.
“There’s sort of too many walls up for them at present. So what we’re trying to do by this virtual centre is create an avenue for them to access knowledge, technology and resources to enable them to put their technology to a national and then a global market.”
What else is going on in AI this week?
- In a SmartCompany exclusive, Linktree reveals why its new AI play is “laser focused” on e-commerce.
- The government’s Productivity Commission has released three research papers on the opportunities of AI, covering where governments should focus their policy efforts, a government playbook for developing AI protections, and how AI raises the stakes for data policy.
- There’s an AI tool for VC firms now.
- Google Bard got the Gemini Pro update we reported on late last year. And it comes with an image generator.