I've completed the first version of my first Python application, one my high school intern started for me. Its purpose is to read data from a binary file into memory, then promptly save it out to a Matlab mat file. The data was recorded in real time, frame by frame, and is read in the same way. The interesting thing about this little application is that I have versions of it written in C++ (not by me) and Matlab (me).
So I thought I would share some timing information.
If this is the sort of thing that floats your boat, by all means click through the jump and share in the geekery.
Did you ever see the Powers of Ten video by Charles and Ray Eames? It's a fascinating glimpse into different scales of matter in our universe. Dudes made some groovy chairs in their day too.
Anyway, the timing data I obtained brought it to mind. Although I can't claim to offer any insight into the universe in which we live, I can perhaps reinforce some stereotypes about the relative performance of Python and C++. I don't know if there are any big surprises here, but let's get to it.
I tested each of the three implementations on six files ranging in size from 15.65 MB to 556.7 MB, which spans the typical range of sizes we see for these files. Run times for the C++ application were obtained with GSTimer; for Python I used the timeit module, and in Matlab old reliable tic/toc served its purpose. The resulting times are plotted in Figure 1 below.
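In case it's useful, here is a minimal sketch of the kind of timeit harness I mean; the module name reader, the function read_binary_file, and the file name are hypothetical stand-ins, not the actual application's names.

```python
import timeit

# Minimal timing sketch; 'reader', 'read_binary_file', and the file name
# are hypothetical stand-ins for the real application.
elapsed = timeit.timeit(
    stmt="read_binary_file('data_small.bin')",
    setup="from reader import read_binary_file",
    number=1,  # a single pass per file; these runs take seconds to minutes
)
print("elapsed: %.2f s" % elapsed)
```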
Figure 1. Run times for the three implementations on six data files
The performance is linear in file size for all three versions. If you're not used to semilog plots, just know that a straight line in this figure would correspond to an exponential (specifically 10^x) on linear axes. I lose roughly an order of magnitude in performance moving from each version to the next. It's actually kind of amazing how regular it is.
C++
The performance of this one may have surprised me the most of the three, not because I wasn't aware of the speed benefits of C++, but because for several years I used a modified version of it, compiled into a mex file and called directly from the Matlab command prompt, to open all my files. While it could be very fast, at times it ran quite slowly; the mex file's performance varied quite a bit for whatever reason.
While the mex implementation was convenient, I had to get in and recompile when we upgraded our Matlab version, after it began requiring compiled mex files to use the extensions mexw32 and mexw64 instead of just dll. Then there were some annoying bugs that limited the size or the number of variables that could be read. Then some customers installed 64-bit versions of Matlab and I had to recompile for them. Then I spent a couple of hours futzing with it on my laptop for this post and couldn't even get it to work. Grrrrr. This mex app and I have some history, and I guess it biased me against the C++ implementation in general.
Putting up numbers like this, though, makes it all water under the bridge. I forgive you. I will just use you to convert files into Matlab format really, really fast.
Python
It seems a pretty common assumption that Python loses up to an order of magnitude in performance against a C or C++ implementation, though sometimes the gap is smaller, and projects like PyPy hope to demolish it completely. That rule of thumb definitely held here. I found Python nice and easy to program, with many similarities to Matlab scripting. I haven't gotten the hang of debugging Python yet, but I can see how I'll eventually approach the ease of interactive debugging that I currently enjoy in Matlab.
My solution reads each frame's worth of data and stores each variable in a dictionary entry of nested lists, appending as I go. At the end, I convert the dictionary entries to numpy arrays and use scipy's savemat function to complete the job. There is a missing point on the Python curve because the heap ran out of memory while loading the largest file; I would have to compile a 64-bit version of Python to get around that, and since I don't think it will seriously affect me, I probably won't go through the hassle.
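Here's a minimal sketch of that flow, not the actual application; read_frames is a made-up stand-in for the real binary reader, and the variable names are fabricated.

```python
import numpy as np
from scipy.io import savemat

# Stand-in for the real frame reader; it just fabricates a few frames,
# each a dict mapping variable names to that frame's values.
def read_frames(path):
    for i in range(3):
        yield {'position': [i, i + 1, i + 2], 'status': [i]}

data = {}
for frame in read_frames('capture.bin'):
    for name, values in frame.items():
        data.setdefault(name, []).append(values)  # append lists, don't grow arrays

# Convert the nested lists to numpy arrays once, at the end, then save.
arrays = dict((name, np.array(rows)) for name, rows in data.items())
savemat('capture.mat', arrays)
```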
I also experimented with using numpy arrays the whole way through, enlarging them with vstack as I went. Don't ever do this: it was slower than Matlab. Do numpy arrays require preallocation to avoid performance hits, like Matlab matrices do? I didn't think so, but it seems the likely explanation.
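To make the difference concrete, here is a small sketch contrasting the two strategies on fabricated data. Growing an array with vstack copies every existing row on each call, so the loop's total cost is quadratic; appending to a list is amortized constant time, with a single copy at the end.

```python
import numpy as np

frames = [np.random.rand(100) for _ in range(2000)]  # fabricated frames

# Slow: each vstack allocates a new array and copies all previous rows.
grown = np.empty((0, 100))
for f in frames:
    grown = np.vstack((grown, f))

# Fast: list.append is amortized O(1); convert to an array once at the end.
collected = []
for f in frames:
    collected.append(f)
result = np.array(collected)
```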
Matlab
Due to my, ahhh, history with mex functions, I decided to just write a script in Matlab. It works. It's slow. It's what I've been using for a while, though I'm now going to switch to either the C++ or Python version.
Conclusion
My first foray into Python has been a success, and I'm having fun with it. Along the way I've played with unit testing, timing, and profiling, and found each easy to get into. I'm also enjoying the Spyder IDE with IPython, though it has some idiosyncrasies if you run your program in the default interpreter instance instead of opening a dedicated one. I'd recommend the latter.
So, jump in, the water is fine, and the discoveries are cosmic.
You mention debugging environments. In Python there are at least two commercial IDEs that support interactive debugging like Matlab's editor: WingIDE and PyCharm. Apparently there are now some open-source ones too (IEP and PTK) (Spyder?) that I haven't yet tried.
Yes, numpy arrays do require a preallocated size, like in Matlab. If you don't know how large your data will be, then appending items to a list and converting the list to an array is sensible.
Not sure the "(Spyder?)" comment was clear. The intended meaning was "You obviously use Spyder; does that have an integrated debugger?" ;-)
When you debug in Spyder, it just starts a pdb session. I haven't used a command-line debugger in some time, so I'll have to play with it. I'm not sure if it will let me work with variables interactively at breakpoints like I can in Matlab. There is a menu option for Winpdb... I guess I need to download it first.
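For what it's worth, here's a bare-bones sketch of hitting a pdb breakpoint; the function is a made-up example, not the converter.

```python
import pdb

# Made-up example function, just to show a breakpoint.
def running_total(n):
    total = 0
    for i in range(n):
        total += i
        pdb.set_trace()  # drops to the pdb prompt; inspect total and i, 'c' continues
    return total

running_total(3)
```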
What I can do is use F9 just like I do with Matlab scripts. For example, I can load a function into my interactive session using F9, then try stuff out. I may outgrow this environment eventually, but I think I'll get pretty far with it.