One of the limitations of MusicXML is that it does not explicitly encode time information: notes and rests are simply stored in an ordered list, so you cannot extract any particular note or rest from the data and determine its exact position in time.
To solve this I have created a node module that takes a MusicXML file and converts it into MusicJSON! This provides all the time information you would expect: the tempo, the absolute location of each note or rest, its location within its measure, the location of the beginning of each measure, and an ISO 8601-compliant timestamp in milliseconds calculated from the location and tempo information.
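To give a feel for the timestamp calculation, here is a minimal sketch of how a note's position (in quarter notes) and a tempo (in BPM) combine into a millisecond offset. The function names are illustrative only, not the module's actual API:

```javascript
// Milliseconds taken by one quarter note at a given tempo (BPM).
// 60,000 ms per minute divided by beats per minute.
function msPerQuarter(bpm) {
  return 60000 / bpm;
}

// Convert an absolute position, measured in quarter notes from the
// start of the piece, into a millisecond offset at a fixed tempo.
function positionToMs(positionInQuarters, bpm) {
  return positionInQuarters * msPerQuarter(bpm);
}

// Example: a note on the third beat (position 2.0 in quarter notes)
// at 120 BPM falls exactly one second into the piece.
console.log(positionToMs(2, 120)); // 1000
```

In a real score the tempo can change partway through, so the module's output has to accumulate these offsets segment by segment rather than using a single multiplication; the sketch above shows only the core arithmetic.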
Having this kind of information encoded in JSON makes data visualisation far simpler, and it also makes the data set well suited to a range of machine learning and machine intelligence applications.
I have included a piano-roll visualization Express app in this node module (it generates the visualization below), and it is the test I have found most handy. You can check it all out on my GitHub at https://github.com/jgab3103/musicXML2Json