Won’t this be misused?
By now, AI and ChatGPT have become familiar topics. Ask a question in natural language and it answers as if you were having a conversation, and it can automate all kinds of tasks. It is useful in many ways, but how far will it evolve?
Voice Engine that creates your own voice
OpenAI recently announced that development of its new tool, Voice Engine, is progressing smoothly and that an official release is imminent. It is well known that OpenAI is already expanding into many fields: it has created “Sora,” an AI tool that generates videos, and even built a humanoid robot called “Figure 01” with a partner company.
Voice Engine is based on the technology behind ChatGPT’s voice interaction feature. In short, ChatGPT not only answers your questions in text but can also read them aloud and speak to you; with Voice Engine, that speaking voice sounds just like your own!
Since it has not been officially released yet, we can only judge from the demo, but it works like this: you speak to Voice Engine for about 15 seconds, and in the blink of an eye your voice is analyzed and a clone is created that speaks exactly like you. It is remarkable that it can then read out anything at all in your own voice.
It’s convenient, but there are concerns…
OpenAI emphasizes that the possibilities of Voice Engine are endless.
For example, a busy father or mother could use Voice Engine to copy their voice and read stories to their children in it. Or someone who cannot speak smoothly, for whatever reason, could have Voice Engine learn their voice and speak fluently on their behalf. Most impressive of all, even though it learns from English speech, it can then speak in other languages using your own voice.
Voice Engine is exciting and seems capable of a lot of amazing things. However, the fear that it could be misused, for example in an “Ore-Ore” scam (the Japanese phone fraud in which a caller impersonates a relative), probably comes to everyone’s mind.
There has already been an incident in the United States where a similar tool, ElevenLabs, was used to impersonate President Biden’s voice and send out fake election messages. Couldn’t Voice Engine do the same thing even more easily?
“OpenAI strives to develop AI that is safe and beneficial to everyone. In anticipation of the risk of misuse, we will take careful and sufficient measures before release.”
This was the first explanation added to the Voice Engine announcement. Naturally, any technology carries a risk of misuse. Even so, it remains a new tool that comes with real concerns.