A player piano isn't really impressive anymore, unless you're a little kid seeing one for the first time, and then it's kind of like magic. But imagine the piano is playing a song it wrote on its own. Does that change anything? What if you couldn't tell the difference between songs the piano had written and ones that were written by a living, breathing human being? 

Arne Eigenfeldt is a composer of acoustic and electro-acoustic music and a professor at Simon Fraser University in Vancouver, but recently he's been spending a lot of time with robots, specifically through the Musical Metacreation project he co-created with scientist and professor Philippe Pasquier.

In 2009, he and Pasquier were given a CA$488,000 grant to "explore the boundaries" of the Metacreation project and the idea that machines could become composers if they were given enough data -- if they were fed, for example, hundreds of songs that their programmers personally enjoyed. Maybe they could even play along, in real time, with human virtuosos.
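The idea of a machine learning to compose from a pile of example songs can be sketched, in deliberately simplified form, as a first-order Markov chain over notes: count which note tends to follow which in the training songs, then walk those probabilities to generate a new melody. (The toy note sequences and the `compose` function below are invented for illustration; this is not the Metacreation project's actual algorithm.)

```python
import random
from collections import defaultdict

# Toy training data: pitch sequences standing in for "hundreds of songs
# the programmers personally enjoyed." Purely illustrative.
songs = [
    ["C", "E", "G", "E", "C", "D", "E"],
    ["C", "D", "E", "G", "E", "D", "C"],
    ["E", "G", "A", "G", "E", "D", "C"],
]

# Build a first-order Markov model: for each note, record every note
# that was observed to follow it in the training songs.
transitions = defaultdict(list)
for song in songs:
    for current, nxt in zip(song, song[1:]):
        transitions[current].append(nxt)

def compose(start="C", length=8, seed=None):
    """Generate a new melody by randomly walking the learned transitions."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:  # dead end: this note was never followed by anything
            break
        melody.append(rng.choice(options))
    return melody

print(compose(seed=42))
```

Because the choices are weighted by how often each transition appeared in the corpus, the output tends to resemble the training material without copying any one song -- the same broad intuition behind feeding a program music its author admires.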

"In the past, the computer was always an assistant. It would generate ideas, but the composer was always there to shape those ideas," Eigenfeldt told HuffPost. "My goal is a much larger one -- to apply A.I. techniques to a higher level."

Certainly computers have taken on a greater role in popular music of late. They've been integral to the creation of ambitious DJ sets and thumping, auto-tuned popular songs -- just listen to the latest Rihanna hit for proof that computers are alive and well -- but up to this point, computers haven't been trusted to compose concertos all by themselves. Eigenfeldt, along with others in his field, is trying to challenge that: first with classical music, and next, dubstep. He's planning a dance party, written entirely by a computer program, for the end of this year.

"Can a computer be creative?" he asked. Why can't a computer -- if it has been programmed and trained just like a person -- be considered "a real artist?"

The question isn't new. Music professor, author and composer David Cope has famously been experimenting with computerized composers for years, dating back a few decades to when a particularly debilitating bout of writer's block got him thinking: Could a computer get my own brain working again? Not only did the computer knock out his writer's block, it inspired him to launch a field of his own.

Out of that blockage came a program he affectionately named "Emmy," and its first songs debuted to the public in 1987 at the University of Illinois at Urbana-Champaign. According to Pacific Standard, Emmy played a few pieces that could have been mistaken for Bach chorales, and "they were met with stunned silence" from audiences and critics alike. People were almost angry, as they couldn't tell which pieces were real Bach and which were composed by the computer. 

In the years after Emmy's premiere performance, Cope struggled to get her pieces performed by professional musicians, and he couldn't get a classical label to release Emmy's music, despite its obvious prowess. Though documentaries were being made about the work -- including the 25-minute video "Bach Lives!… At David Cope's House" -- scientists were more receptive than artists.

"It was certainly a lonely field at the time," Cope said of his early experiments with computer-made compositions. He noted that very intelligent people would "react almost violently" to the idea of computers as creative beings. 

"But now that's changing," he said.

Today, programs like Eigenfeldt and Pasquier's Metacreation can draw classically trained musicians ready and willing to collaborate with computer-based counterparts. Their project has put on multiple performances in recent years, where musicians play along live to compositions that the computer program writes on the fly, using its own algorithmic brain. 

Timothy Van Cleave, a percussionist trained in classical and jazz, said he actually enjoyed playing along with a robot, and suggested that in 20 years, perhaps every band or ensemble will have a robot that plays with them.

"It's a wonderful opportunity to push the boundaries of music," he said. "It's really exciting to know that someone has come up with an algorithm or some kind of software application that could listen to you and play and react." 

Eigenfeldt is aware of the skeptics, just as Cope was back in 1987. He has experimented with live shows involving only robot performers -- with visible gears and robot arms -- and others with robot and human performers playing together in sync. He said audiences are much more receptive to the work when they see human beings alongside the robot; when the machines play the music alone, the performance comes across as more lifeless, less dynamic.

He remembered the first time he brought humans in with his computer program and watched them play together. 

"They were nervous and they overplayed," he said, referring to the humans. "They didn't trust the computer's work so they filled every silence. They weren't willing to give the computer a chance." 

Eigenfeldt likened the experience to playing with a musician from a different culture who speaks a different language. At first you have to figure out how to gel -- to feel each other's subtle cues and sideways glances -- but eventually the music can transcend cultural barriers.

Eigenfeldt remembers one recent performance when he watched a huge grin spread across Van Cleave's face as he played with the robot. Then he watched Van Cleave shut his eyes, and he knew: "He and [the robot] had just hit it off. Maybe the computer wasn't smiling back, but they were listening, and they just had to expect that the computer was listening to them."

At the recent International Conference on Computational Creativity in Dublin, Eigenfeldt and his collaborators presented a paper, "Evaluating Musical Metacreation in a Live Performance Context," which essentially stated that audiences couldn't tell the difference between their computer-created pieces and pieces written by humans, no matter how complex the piece or how experienced the audience member was with classical music. 

Cope learned something similar back in 1987. Though his Bach-emulating program has since been scrapped, his new program, affectionately named "Emily Howell" (the two programs "are definitely related," he says), has created some gorgeous original compositions of her own. Howell's newest album, Breathless, will be released next month.

The program's job, Cope said, is to create new styles of music by combining elements of his favorite composers. Every piece of music that has ever been recorded has its inspiration in another form, so why, Cope asks, is it so wrong if a computer program takes the same approach? 

"Emily Howell is an interactive partner," Cope said. "We sort of speak to one another."

He says it's not about making computers operate on their own; it's about taking what we know as human beings about music -- and the emotional effect a piece of music can have on an audience -- and applying it to the power of computer science. There's no "ghost in the machine," he said. Computers are still tools, "very elegant tools," he calls them, but still tools controlled by their human makers. That doesn't mean we can't keep expanding their minds.

"It's about getting in and trying to see how you can recreate and model human behavior in the musical world," he added. "That, to me, is real computer research."

The next Musical Metacreation performances can be seen at Simon Fraser University, starting June 21 in the Audain Gallery. You can read more about Cope's work at his UCSC faculty page.

WATCH: A Musical Metacreation performance with live musicians and robots

WATCH: A video feature about David Cope from 2007