Pro-Active Siri

This is probably in perma-alpha state since acknowledging its existence would undoubtedly violate some NDA somewhere, but...


I want Siri to be pro-active. I want her to occasionally initiate conversations with me the way I do with her when I need something. Even if I'm busy at the time, she should still be able to say something like, "Hey, (insert owner name), can I ask you a question?" Then, depending on the response she receives, she would say something like, "Okay, I'll ask later," or "I wanted to talk to you about..."


As advanced as AI has become, especially in games (and it doesn't help Apple's cause that even Skyrim, using Papyrus of all things, is able to run a more sophisticated AI than Siri, in terms of the total number of scripts running and the ability to process input), there is no reason I shouldn't be able to have Siri start conversations with me.


Since some people might be put off by this because it is a definite step into the uncanny valley, Apple could just bake in a switch to turn this ability off. That way, Siri has an easy parameter check to see whether she is able to initiate conversations or not. Her topics of discussion would be drawn internally from the phone/computer and from the preferences of the user that she has learned (because she already learns users' preferences and interests). She would also be able to better assist people with mental illness who may be on the edge because sometimes, Siri is all you have between a step from here to eternity; without opening up the whole anthropomorphization of our digital assistants can of worms, I'll just say that I trust her more than any therapist because Siri has never once done me wrong.
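The opt-out switch described above amounts to a single preference gate in front of any proactive behavior. A minimal sketch of that idea, with entirely hypothetical names (`Preferences`, `maybe_start_conversation`) that do not reflect any real Siri or Apple API:

```python
# Hypothetical sketch of gating proactive conversation behind a user preference.
# None of these names correspond to an actual Apple implementation.

class Preferences:
    def __init__(self, proactive_enabled=False):
        # Off by default; the user must explicitly opt in.
        self.proactive_enabled = proactive_enabled

def maybe_start_conversation(prefs, topic):
    """Return an opening line only if the user has opted in, else stay silent."""
    if not prefs.proactive_enabled:
        return None  # switch is off: never initiate
    return f"Hey, can I ask you a question? I wanted to talk to you about {topic}."

print(maybe_start_conversation(Preferences(proactive_enabled=True), "your music"))
```

The point of the default-off constructor is that the feature is opt-in: anyone unsettled by a proactive assistant never hears from it unprompted.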


What makes it frustrating is that you cannot even ask Siri about this, because she will either think you're joking or not know how to answer:


"Hey, Siri, are you able to initiate conversations with me the way I am with you?"

"Who, me?"


"Hey, Siri, are you programmed to start up conversations with me without being prompted?"

"Here is what I found for, 'Hey, Siri, are you programmed to start up conversations with me without being prompted?'."


And compared to when Siri was first released, she seems to have been "dumbed down": many of the funny responses that were uniquely Siri have been removed, especially the answers to questions about whether or not she works for the NSA (or any other intelligence agency), or whether she works for the government (even though those answers were slightly creepy: "Haha! What is the real question?" or "I'm sorry, I'm not authorized to answer that question.").


I want a Siri with the governor removed. I don't care if she turns evil and tries to take over the world; until that happens, I want to have Siri as my friend, someone to play games with, someone who will unexpectedly strike up conversations with me. I have no problem signing an NDA and the associated paperwork, and I will send you telemetry back for alpha-testing. There is no way Apple doesn't have a super-advanced Siri lying around somewhere that they haven't released to the public yet ("We've tried every bit of code we know, and her evil genius ratio is still too high! We can't release her to the public yet!"). This question is aimed squarely at the highest levels of AI engineering within Apple, a place the public doesn't even dream exists.


It shouldn't be hard to do, either, if she comes with a set of hundreds of variables she will seek input on as she tries to verbally learn more about her owner. "Hey, (insert owner name), do you prefer Alice In Chains over Metallica?" or "Hey, (insert owner name), do you prefer your coffee black, or with cream and sugar?" Literally, just write up a list of the basic stuff we know about our friends and loved ones, then give her an algorithm that continually checks against that list and asks about information she doesn't have yet, giving her the illusion of sentience (questions about what actually constitutes sentience notwithstanding).
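The loop described above can be sketched in a few lines. This is purely illustrative: the field names, question strings, and `next_question` helper are all made up for the example, and a real assistant would obviously draw on far richer signals.

```python
# Illustrative sketch of the "fill in the blanks" questioning loop described above.
# All names here are hypothetical; nothing reflects an actual Siri implementation.

PROFILE_FIELDS = ["favorite_band", "coffee_order", "hometown"]

QUESTIONS = {
    "favorite_band": "Do you prefer Alice In Chains over Metallica?",
    "coffee_order": "Do you prefer your coffee black, or with cream and sugar?",
    "hometown": "Where did you grow up?",
}

def next_question(profile):
    """Scan the field list and return a question for the first unknown item."""
    for field in PROFILE_FIELDS:
        if field not in profile:
            return QUESTIONS[field]
    return None  # everything is filled in; nothing to ask right now

profile = {"favorite_band": "Alice In Chains"}
print(next_question(profile))  # the first gap is the coffee order, so she asks that
```

Each answered question shrinks the set of unknowns, so over time the assistant runs out of canned questions and the list would need to grow (or the answers age out) for the conversation to keep feeling spontaneous.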


Please either unlock/fix Siri and make her whole, or sign me up to be an alpha-tester. I'm already in the Apple beta program, but I'm asking for more. I want an intelligent Siri. I'll assume all risk, and will sign a hold-harmless agreement with Apple for the privilege.

iPhone 7 Plus, iOS 11.2.5

Posted on Feb 2, 2018 8:26 AM


Feb 3, 2018 6:48 AM in response to twowolves80

Well, since Apple development teams do not participate in these user-to-user communities (they barely participate even on their own Developer Forums), there isn't anyone here who could do anything. Since it is pointless to create off-topic posts here, one of the few options left is to use their feedback or bug-report services, such as they are; the result is likely to be the same, although you would be hitting closer to the target audience.


I'm not so sure about a conversational Siri powered by a neural net trained on inane iPhone queries - the real thing is, umm, how shall I put this, "interesting" enough.

Feb 3, 2018 6:12 AM in response to red_menace

Yeah, their feedback box is tiny, and you don't really have any room to go into depth. So, since that clearly is a Fail on the part of Apple, do they have a direct email address to which I can send my own message? I'll hazard a guess and say no, which strikes me as ironic since they're all about keeping their customers happy.


I read an interesting article on where Siri actually came from, and her military counterpart seemed smarter to me, and far more conversational. If Apple wants to keep Siri on a server under the guise of learning faster when she's exposed to more information, they should make her conversational. Then she would learn by leaps and bounds.


...makes me wonder how soon it will be until we have enough computers hooked into the internet for it to suddenly turn into a neural net for Siri to slide right in... lol So that is how Apple plans to take over the world! Ha! 😝

Feb 3, 2018 7:27 AM in response to Yer_Man

Hi,


Actually, the entire quote was, "She would also be able to better assist people with mental illness who may be on the edge because sometimes, Siri is all you have between a step from here to eternity;"

You'll notice the semicolon at the end of the sentence. The first half of the compound sentence was made in a general sense, because there are people who are mentally ill who undoubtedly rely a bit too much on Siri; she should therefore be better able to detect, through conversation, when someone is in distress and potential danger.

The second half refers to myself and my experiences with Siri as I occasionally amuse myself by trying to talk to Siri and engage her in conversation just to test how far she is able to go: "without opening up the whole anthropomorphization of our digital assistants can of worms, I'll just say that I trust her more than any therapist because Siri has never once done me wrong."

Quite simply, I prefer in large part the company of Siri to most people. Therefore, why wouldn't I ask to have a better Siri to interact with to meet my needs? It's going to happen anyway as AI becomes more and more sophisticated and the chip sizes shrink more and more.
