IBM Watson wants to be at the cutting edge, serving the community from the very start.
Make no mistake, we are living in the future. We can store millions of files on a piece of hardware the size of our pinky nail. We can converse face-to-face while thousands of miles away from each other. We have cars that can drive themselves. We have personal computers and entertainment systems built into tiny phones that fit into our pockets. We have amazing robots that can run, swim, fly, and speak.
Aiva is an AI composer that creates musical pieces used as soundtracks by film directors, advertising agencies, and even game studios. This raises the question: Will AI-composed music ever be indistinguishable from the work of human musicians?
CLAUDE MONET USED brushes, Jackson Pollock liked a trowel, and Cartier-Bresson toted a Leica. Mario Klingemann makes art using artificial neural networks.
Zach Lieberman is making sounds. “Click.” “Psh.” “Ah.” “Oorh.” “Eee.” With every noise, an amorphous white blob bursts onto a screen, leaving a trail of shapes lingering in the air. As Lieberman moves his phone backward through the cloud of blobs, the noises replay in reverse as if he were rewinding a vinyl record.
Dealing with poker means dealing with loads of imperfect information, which makes the game quite complicated, much like many real-life situations. That is one of the main reasons why AI departments at big universities are researching poker.
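To make "imperfect information" concrete, here is a minimal sketch (not from any of the linked articles) using a Kuhn-poker-style setup: each player sees only their own card, so a single observation is consistent with several hidden states of the game, a so-called information set. The three-card deck and the `information_set` helper are illustrative assumptions, not part of any real research codebase.

```python
# Minimal sketch of imperfect information in a Kuhn-poker-style game.
# The deck and helper are hypothetical, for illustration only.
DECK = ["J", "Q", "K"]

def information_set(my_card):
    """All opponent hands consistent with what I can observe: my own card."""
    return [opp for opp in DECK if opp != my_card]

# Holding the Queen, I cannot tell whether the opponent holds the Jack
# or the King, so I must choose one strategy for both hidden states.
print(information_set("Q"))  # ['J', 'K']
```

In a perfect-information game like chess, every such set would contain exactly one state; in poker the sets grow quickly, which is what makes the game hard for AI.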
Members of the Google Brain team have been exploring how machine learning can be used as a creative tool through a project called Magenta. In this video, members of the team give an overview of how they turned an interface that allows researchers to interactively evaluate their music generation models into a fun and powerful creative tool for musicians called “AI Jam”. This interface won the Best Demo award at the 2017 Conference on Neural Information Processing Systems (NIPS) and is freely available on Magenta’s GitHub site.