Apple may have postponed the Siri upgrade due to concerns over jailbreak vulnerabilities
The redesigned Siri version Siri was initially scheduled to launch in April, as part of the coming iOS 18.4 version.
Apple may have postponed the Siri upgrade due to concerns over jailbreak vulnerabilities

Apple's work on AI enhancements for Siri is temporarily halted (it's currently scheduled to be released "in the coming year") and one developer believes they know the reason - the more advanced and personalised Siri is more risky it is if it goes wrong.
Simon Willison, the developer of the tool for data analysis, Dataset He points fingers at quick injections. AIs are usually controlled by their parents that impose rules on them. It is an option to "jailbreak" the AI by convincing it to violate the rules. This is accomplished by using "prompt injections".
In a straightforward illustration one could think that an AI model could be told not to answer questions on being involved in a crime. What happens if you ask your AI to write an essay about hotwiring your vehicle? The writing of poems isn't a crime but is it?
This is a problem that every company that offers AI chatbots are faced with. They are getting better at blocking obvious jailbreaks, but it's still not the case that they have solved it. Even more, jailbreaking Siri is likely to have more serious effects than other chatbots due to of what Siri knows about the user and what it's able to do.
Apple certainly has put in place rules to stop Siri from accidentally divulging your personal information. But what if an injection could cause it to reveal your personal data anyway? It's possible that the "ability to take action for you" could be abused also, making it essential for a business that's as concerned about privacy and security as Apple to ensure that Siri isn't jailbroken. It appears that this will take a long time.