11-09-2014 04:24 AM
Hi,
I'm an engineering student from Belgium (so please excuse my bad English), and for our project we have to build a robot.
Our idea was to build a robot that reads a stave, detects which music notes it has to play, and sends this information to another robot that actually plays the piano.
The robot that plays the piano is working, but we're stuck on the first part:
we were advised to use NI Vision Assistant to detect the music notes on a stave, but I have no clue how to even start.
First of all, I have to detect everything manually while it should be done automatically. My second problem is how to connect this with LabVIEW: how can I put what I've done in NI Vision Assistant into a LabVIEW program?
Thank you in advance for your time and help,
Lisa
11-09-2014 07:25 AM
Hi,
Do you have images of the music notes? That would be the first step.
The notes will be black on a white background, which will hopefully not be too difficult to locate.
Ensure that the image is of good quality (uniform lighting, no light drift).
Here are a couple of algorithm ideas to try out:
- Threshold to extract the notes from the background.
- Morphology Opening to clean up the image and get rid of the staff lines that the notes sit on.
- Particle Analysis to locate the position of the notes from the binary image.
I recommend you spend a little bit of time reading the NI Vision Concepts manual to get familiar with the common image processing techniques. You can start by looking at the sections that I mentioned above.
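To make those three steps concrete, here is a rough sketch of the same pipeline in Python with OpenCV. It is only an illustration of the idea, not the NI Vision code itself; the file name, threshold value, and kernel size are assumptions you would have to tune for your own images.

```python
import cv2

# Load the stave image as grayscale (hypothetical file name).
img = cv2.imread("stave.png", cv2.IMREAD_GRAYSCALE)

# 1. Threshold: the notes are dark on a light background, so use an
#    inverted threshold to make the notes the white foreground pixels.
_, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY_INV)

# 2. Morphological opening with a small square kernel erases the thin
#    staff lines while keeping the thicker note heads.
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

# 3. "Particle analysis": connected-component statistics give one
#    centroid per surviving blob; the vertical position of a centroid
#    relative to the staff lines tells you which note it is.
n, labels, stats, centroids = cv2.connectedComponentsWithStats(opened)
for i in range(1, n):  # label 0 is the background
    x, y = centroids[i]
    print(f"note candidate at x={x:.0f}, y={y:.0f}")
```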
> how can I put what I've done in NI Vision Assistant in a LabVIEW program?
From LabVIEW, you can drop the Vision Assistant Express VI, located in the Vision Express palette.
This will launch Vision Assistant that you can use to prototype the image processing algorithm.
When you're done and close the UI, LabVIEW code will be generated inside the Express VI to implement the algorithm in LabVIEW.
You can double-click the Express VI again to fine-tune the algorithm.
Hope this will help you get started.
Christophe
11-09-2014 08:03 AM
I don't want to be a defeatist here, but while typing this I'm looking at the score "A Short Story" by Dmitri Kabalevsky.
There is much more than just pattern recognition here!
For piano you have two staves, one for each hand: the right hand mostly starting in the treble (sol) clef and the left hand mostly in the bass (fa) clef.
But this can change even in the middle of a stave.
Then there are the signs for flats and sharps ("mollen" and "kruisen" in Dutch).
Then you have the duration of each individual note.
Bottom line: a lot is going on in a score...
Question: Should this reading/playing happen in real time?
11-09-2014 09:30 AM
@Alain_S wrote:
I don't want to be a defeatist here, but while typing this I'm looking at the score "A Short Story" by Dmitri Kabalevsky.
There is much more than just pattern recognition here! For piano you have two staves, one for each hand: the right hand mostly starting in the treble (sol) clef and the left hand mostly in the bass (fa) clef.
But this can change even in the middle of a stave.
Then there are the signs for flats and sharps ("mollen" and "kruisen" in Dutch).
Then you have the duration of each individual note.
Bottom line: a lot is going on in a score...
Question: Should this reading/playing happen in real time?
I agree. It is quite an ambitious project, especially since the same duration for a particular note can be expressed in several different ways. Not to mention dynamics.
Also, I can't see it happening in real time, because the notes will not be spaced evenly, so your "sampling" speed will vary.
Or maybe we are talking about a robot that plays a simple melody with one finger?
11-09-2014 10:19 AM
We're planning on playing simple songs like Twinkle, Twinkle, Little Star, and songs where the note durations don't vary that much. It shouldn't be a problem if all the notes are read first and then sent to the next program.
The idea was to build some sort of robot with a webcam that moves over the music sheet, detects the music notes, and maybe puts them in an array in LabVIEW?
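A quick sketch of what such an array could hold, using Python tuples as a stand-in for a LabVIEW array of clusters (the pitches and durations are just an example, the first phrase of Twinkle, Twinkle, Little Star):

```python
# One (pitch, beats) pair per detected note; values are illustrative.
notes = [
    ("C4", 1.0),  # twin-
    ("C4", 1.0),  # -kle
    ("G4", 1.0),  # twin-
    ("G4", 1.0),  # -kle
    ("A4", 1.0),  # lit-
    ("A4", 1.0),  # -tle
    ("G4", 2.0),  # star
]

# The playing robot would then walk through the array in order:
for pitch, beats in notes:
    print(f"play {pitch} for {beats} beat(s)")
```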
11-09-2014 11:38 AM
I could see maybe a producer reading the music a measure at a time and the consumer playing it. The visual recognition would be tricky because you have to note (pun not intended) both the duration of the note and its position on the staff. (I have no experience with the technical side of doing the vision stuff, though.)
I would stick to one type of time signature - preferably 4/4 - so that quarter note = 1/4, half note = 1/2, etc. I'd also select songs that have no sharps or flats.
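In text form, the producer/consumer idea could look something like the Python sketch below (in LabVIEW it would be two loops sharing a queue). The score content and tempo are made up:

```python
import queue
import threading
import time

measures = queue.Queue()

def producer():
    # Stand-in for the vision code: deliver one measure at a time.
    score = [
        [("C4", 0.25), ("C4", 0.25), ("G4", 0.25), ("G4", 0.25)],
        [("A4", 0.25), ("A4", 0.25), ("G4", 0.5)],
    ]
    for measure in score:
        measures.put(measure)
    measures.put(None)  # sentinel: no more measures

def consumer():
    # Stand-in for the playing robot. In 4/4, a quarter note is 1/4
    # of a whole note, so durations scale from the whole-note length.
    whole_note_s = 2.0  # assumed tempo
    while True:
        measure = measures.get()
        if measure is None:
            break
        for pitch, fraction in measure:
            print("play", pitch)
            time.sleep(fraction * whole_note_s)

threading.Thread(target=producer).start()
consumer()
```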
11-10-2014 06:42 AM
Hi Lisa,
Here is a very basic example that shows you how to get started with NI Vision Assistant in a VI.
https://decibel.ni.com/content/docs/DOC-6236
How do you plan to send the acquired data from the first robot to the second robot that is playing the music?
Best Regards,
11-10-2014 03:37 PM
@ShazilM wrote:
Hi Lisa,
Here is a very basic example that shows you how to get started with NI Vision Assistant in a VI.
https://decibel.ni.com/content/docs/DOC-6236
How do you plan to send the acquired data from the first robot to the second robot that is playing the music?
Best Regards,
Hi,
Thank you very much for the example; it will be a big help!
We're not sure yet how to send the data to the second robot, but we were thinking that maybe we could put it into an array and use that array in the program that drives the motors of the second robot?
11-11-2014 03:10 AM
Hi,
No worries, hope it helps. Yes, an array sounds fine; however, I meant to ask how you are physically communicating between the robots. Are they connected via Ethernet, for example?
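For example, if the two robots do end up connected over Ethernet, the note array could be sent as plain text over a TCP socket, one note per line. A minimal Python sketch, where the address, port, and message format are all placeholder assumptions:

```python
import socket

# The note array produced by the vision side (illustrative values).
notes = [("C4", 1.0), ("C4", 1.0), ("G4", 1.0), ("G4", 1.0)]

# Address and port are placeholders for the playing robot's listener.
with socket.create_connection(("192.168.0.2", 5000)) as sock:
    for pitch, beats in notes:
        sock.sendall(f"{pitch},{beats}\n".encode())
    sock.sendall(b"END\n")  # tell the receiver the song is complete
```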
11-11-2014 05:17 AM
Now I'm a bit confused 😞
In your first post you wrote that the "playing" robot is working.
Reading this, one can assume that you give this robot some input (data) to play something, so the format of the data is known, right?
In message 8 you wrote that you aren't sure how to send the data from the "reading" robot to the "playing" robot.
So, like I said, I'm a bit confused.
You also wrote that real-time reading/playing is not required, right?
So you can put the data to play in a standard text file.
Making a file manually allows you to debug the "playing" robot.
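For example, the file could be as simple as one "pitch,beats" line per note. A minimal Python sketch (the format and file name are just assumptions), which also shows how a hand-written file lets you test the playing side on its own:

```python
def write_song(path, notes):
    # One "pitch,beats" line per note, e.g. "C4,1.0".
    with open(path, "w") as f:
        for pitch, beats in notes:
            f.write(f"{pitch},{beats}\n")

def read_song(path):
    # Parse the file back into (pitch, beats) pairs.
    notes = []
    with open(path) as f:
        for line in f:
            pitch, beats = line.strip().split(",")
            notes.append((pitch, float(beats)))
    return notes

# A file like this can also be written by hand to debug the playing robot.
write_song("twinkle.txt", [("C4", 1.0), ("C4", 1.0), ("G4", 1.0)])
print(read_song("twinkle.txt"))
```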
Are both robots controlled by one and the same computer or not?
You also wrote: "The idea was to build some sort of robot with a webcam that moves over the music sheet ..."
Why not scan the whole music sheet at once and "translate" it into a file that the "playing" robot can play?
The word "moves" scares me a bit 😞
Do you have a motor and a motion controller? What happens at the end of the staff? How do you ensure parallel travel over the staff? ...