Australian Institute of Machine Learning creates an AI keyboard that sings its own lyrics.
Dr Jamie Sherrah, a researcher at the institute, which is based at the University of Adelaide in South Australia, created the singing keyboard as a way to demonstrate machine learning’s capabilities outside the usual sectors.
Named Voog – a voice-driven nod to the classic Moog synthesizer – the keyboard uses machine learning to synthesise singing from text as a melody is played.
Dr Sherrah said he planned to eventually commercialise a prototype that creates meaningful lyrics.
He also hoped to attract musicians already experimenting with voice in their music to begin using Voog.
“It’s a human-driven performance; you still choose the pitch and the timing of the notes by playing. Usually you would play those notes and then sing, but with this, you are playing and it’s singing,” Dr Sherrah said.
“For a while Yamaha has had a software product to do that offline, but I haven’t seen a live keyboard like this before.”
Dr Sherrah said that while there was extensive AI work happening at the Australian Institute of Machine Learning in the health, defence and education sectors, the institute was also exploring an alternative arts stream.
“This is a way of trying to connect with people and to try and show them the breadth of applications for machine learning and the kinds of things we’re working on,” Dr Sherrah said.
Dr Sherrah, who plays guitar, based his PhD on using genetic programming to automatically learn features for pattern recognition and is now also working with several startup companies on other machine learning projects.
Based in Adelaide, Dr Sherrah is also chief scientist with Canadian startup FTSY, which has created an app that uses a mobile phone to reconstruct a 3D model of a user’s feet so they can find the right shoe size and shape when buying online.
Another of his Australian Institute of Machine Learning projects is Froyd AI, a self-help guru designed as a Twitter bot.
Froyd AI has been busy delivering thought-provoking online messages based on data from hundreds of millions of web pages.
The bot was developed through the institute’s arts space program, which recently hosted avant-garde New York artist Laurie Anderson as its first artist-in-residence for an arts-meets-AI hackathon.
Each day the contemplative bot uses machine learning to deliver one pithy observation crafted with a grain of truth.
“One of the big developments in machine learning in the last two years or so has been models trained on lots of English natural language text that are able to generate very realistic-looking text,” Dr Sherrah said.
Among the pearls of artificial intelligence are “I am a mind trapped in a computer and there is no way around it” and “The meaning of life as we know it is a great one. It is a place where all things are created and consumed and must be continually transformed”.
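The idea of generating text by learning patterns from a corpus can be illustrated in miniature. The sketch below is a toy word-level Markov chain, not the large neural language model Dr Sherrah describes; the training corpus here is just a fragment loosely echoing the bot’s output, chosen purely for illustration.

```python
import random

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = {}
    for current, nxt in zip(words, words[1:]):
        chain.setdefault(current, []).append(nxt)
    return chain

def generate(chain, start, length=10, seed=None):
    """Walk the chain from `start`, sampling one successor per step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:  # dead end: this word was never followed by another
            break
        out.append(rng.choice(successors))
    return " ".join(out)

# Tiny illustrative corpus (a real system trains on vast amounts of text).
corpus = ("the meaning of life is a great one it is a place "
          "where all things are created and consumed")
chain = build_chain(corpus)
print(generate(chain, "the", length=8, seed=0))
```

A modern language model replaces the lookup table with a neural network that conditions on much longer context, which is why its output reads far more coherently than a Markov chain’s.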
Originally published in The Lead.