Use Python to time C++ binaries from the command line

Often I want to benchmark C++ code to see whether a change has made things faster or slower. There are no doubt better and more accurate measures out there, but I have yet to find a straightforward and clear answer to this question. It turns out to be trivially easy in Python, using timeit.

This assumes you have a compatible version of Python installed (I have only tested this on 2.7). Assuming your C++ binary is called “a.out” and is located in the directory you’re currently in, you can run:

$ python -m timeit -n 1 'import subprocess; subprocess.call("./a.out", shell=True)'

What I know:
0. Respect the quotes.
1. The option “-n 1” tells timeit the number of times to run the statement per loop. Even with n=1, timeit runs the loop 3 times by default and reports the best time of those 3.
2. The Python function we’re using here is in the subprocess module and is called “call”, invoked as subprocess.call.
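The same measurement can be wrapped in a small script instead of a one-liner. Here is a minimal sketch using timeit.repeat; time_command is a hypothetical helper name, and you would pass your own binary (e.g. "./a.out") as the command:

```python
import subprocess
import timeit

def time_command(cmd, repeat=3):
    """Return the best-of-`repeat` wall-clock time, in seconds, for a
    single run of `cmd`, mirroring what `python -m timeit -n 1` reports."""
    return min(timeit.repeat(
        lambda: subprocess.call(cmd, shell=True),
        number=1,      # one run of the command per loop
        repeat=repeat  # repeat the loop and keep the best time
    ))

# e.g. print(time_command("./a.out"))
```

Because the import and function definition happen once, only the subprocess.call itself is inside the timed region, just as with the command-line version.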

A few real notes:
0. Note that this will not purely benchmark your code, as there is technically some overhead from import subprocess and from running subprocess.call itself. If you care about this, then you’ll need another solution. You might be able to get a sense of the overhead on your machine by running:

$ python -m timeit -n 1 'import subprocess; subprocess.call("", shell=True)'

This runs in 2.5 ms on my laptop. You could assume linearity and just subtract off this time, if you don’t mind the back-of-the-envelope nature of the estimate.

1. There may be a hundred caveats here with respect to subprocess and shell=True, details I have yet to fully figure out. But again, this is a quick-and-dirty benchmark, good enough for my uses and possibly useful to others.
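One way to get a rough sense of the shell=True caveat is to compare a run through the shell with a direct invocation. A sketch, using the no-op command “true” as a stand-in for a real binary:

```python
import subprocess
import timeit

# Best-of-3 time for one run through the shell: shell=True spawns
# /bin/sh, which then runs the command.
with_shell = min(timeit.repeat(
    lambda: subprocess.call("true", shell=True),
    number=1, repeat=3))

# Best-of-3 time for invoking the program directly via an argument
# list, which skips the intermediate shell entirely.
without_shell = min(timeit.repeat(
    lambda: subprocess.call(["true"]),
    number=1, repeat=3))

print("with shell:    %.3f ms" % (with_shell * 1e3))
print("without shell: %.3f ms" % (without_shell * 1e3))
```

The difference between the two numbers approximates the shell’s contribution on your machine; for binaries that run for more than a few milliseconds it is usually negligible.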

Installing NEURON, MPI, and Python on OS X Lion

I’ve recently run into the problem of trying to compile the source for NEURON 7.3a with support for parallel NEURON (using MPI) and Python on Mac OS X Lion. Using a number of helpful web resources, I wanted to cobble together an “as-of-this-writing” recipe for a working, kitchen-sink install of all components.

UPDATED 01-Sept-2012: Added instructions for mpi4py

Because I don’t have the resources to test many version combinations, etc., this assumes OS X 10.7.4 on a Retina MacBook Pro 10,1 (Mid 2012), with a working copy of Xcode 4.4.1 installed (available for free from the Mac App Store). I suspect but cannot verify that this will work with many different versions of all of these components. Note: MacPorts is particularly sensitive to very new Xcode and OS releases, so right after a new release, things don’t always work right away.

Finally this assumes you have admin access to the computer on which you are installing things and that you will use sudo for good and not evil.

And super-finally, this is basically an aggregate of web sources, some of which I had to modify to get it working. Sources are inline, below. Much is duplicated here because of the transience of web links.

Continue reading

Installing Xcode on OS X Lion and Mountain Lion

This is a series of posts leading toward a complete installation of the NEURON neural simulation environment on OS X. All of this information is relevant for other purposes as well.

There are many places with instructions on how to install Xcode. For Lion and Mountain Lion, you go to the Mac App Store and download and install the Xcode installer. If you previously had a version of Xcode installed in a directory called /Developer, excellent directions on how to use xcode-select can be found on MacPorts.
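For reference, a minimal sketch of the xcode-select step mentioned above, assuming the conventional App Store install location (adjust the path if your copy of Xcode lives elsewhere):

```shell
# Point the command-line toolchain at the App Store Xcode bundle.
# This is the conventional Xcode 4.3+ location.
sudo xcode-select -switch /Applications/Xcode.app/Contents/Developer

# Confirm which developer directory is now active.
xcode-select -print-path
```

MacPorts relies on this setting to find the compiler, so it is worth verifying before installing anything.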

Command line tools are now treated as a separate download. There are (at least) two ways of getting them.

1. Open Xcode. Go to the Xcode application menu and select Preferences. From there, go to Downloads and select the button for Command Line Tools.

2. Go to the Apple Developer website. You’ll need to be a (free) registered developer to use this website. From there, go to the Mac Developer downloads section and find the relevant Command Line Tools package for your OS. I’ve not tested whether you can install these without Xcode, but I’ve heard that you may be able to, either directly or with a little Homebrew hackery. (Note: I still recommend and use MacPorts over both Fink and Homebrew, but they all have strengths and weaknesses.)

Forcing line breaks in LaTeX titles using maketitle

I had occasion, while using the typesetting program LaTeX, to force specific line breaks in my very long title. In order to do this, you must protect the line break command: \\ is a “fragile” command, and \protect keeps it intact when the title is moved around internally. In any case, it works with no apparent ill effects for me:

\title{Insert very long and informative title \protect\\ in this space with considerations of \protect\\ strange automatic line breaks}

Where your title used to break wherever LaTeX chose, it will now break exactly where you placed each \protect\\.
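For context, here is a minimal complete document using this title, assuming the standard article class (the author and date are placeholders):

```latex
\documentclass{article}

% \protect shields the fragile \\ command when the title is
% stored and later reused by \maketitle.
\title{Insert very long and informative title \protect\\
       in this space with considerations of \protect\\
       strange automatic line breaks}
\author{Author Name}
\date{\today}

\begin{document}
\maketitle
\end{document}
```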


Ten days of code

Open code in academic computational science should be the standard. After all, the scientific ideal is to share information toward progress. This is at least an idealistic view of why we publish so competitively, with standards that demand we share our findings with our peers and with the public. And while the computational methods of my most recent joint experimental/computational paper far exceeded our experimental methods in length, the rather large step of implementing our model in a numerical scheme is non-trivial (at least I’d like to think so).

Continue reading


I often like to say that, in my line of work in computational neuroscience, they don’t let me deal directly with other humans. This is jokingly said to reflect the fact that I have no interpersonal skills, or rather to suggest that the results from my mathematical models are narrow in scope and unlikely to have direct implications for human health, for a variety of reasons. One reason is that the data I use to constrain my models come primarily from animal experiments and are often used in a qualitative fashion. This is because I am often more interested in general dynamical principles of networks that may exist in the brain, since the real problem of full understanding is quite a bit less tractable (but not necessarily impossible). Another reason is that all computational models — from neuroscience to genetics to economics — have numerous simplifying assumptions that must be understood carefully in order to interpret the model results appropriately. There is a great deal of responsibility that accompanies the communication of model results, since these explicit and implicit assumptions must be carefully specified.

Yet, despite these reasons for not marketing my work in a way that suggests it has direct implications for human health, I recognize that the study of brains and nervous systems has wide-ranging implications for so many issues that will impact human health fundamentally. Perhaps even more so than genetics, brain science is widely believed to be addressing many secrets of who we are as humans and as individuals. While both are complex molecular machines, the implications of altering brains seem to have an immediacy that may be unique. I believe that neuroscientists have a professional responsibility to understand how the greater public, policy makers, health care providers, businesses, and the law will use and interpret our findings, and to help ensure that these societal decisions are supported by science. Since we admittedly know so little (as a field, we are full of factoids), we currently need to be clear when there is not enough evidence to support these decisions.

The issues, some of which are unique to neuroscience, include cognitive enhancement; incidental findings of maladies in research participants; the use of neuroscience-based evidence in court; predispositions toward certain decisions or actions suggested by a profile of a person’s brain; a patient’s rights over, and access to, brain-related health care records; and a host of other privacy issues touching on core aspects of personality that may be exploited one day for marketing. This is not at all an exhaustive list.

There is an emerging field of neuroethics that helps to address many of these issues, with organizations attempting to establish guidelines that help researchers and the public make good decisions. In fact, there exists a Presidential Commission for the Study of Bioethical Issues that in part focuses specifically on neuroscience. It just finished up a meeting that facilitated discussion of several of these issues. My hope is that all of us as professionals take this responsibility seriously and that we engage our friends and the public in discussing these important issues openly in order to educate each other.

Love in the time of swine flu

I have an unabashed love for the city of Boston. I’ve explored its neighborhoods intimately on foot, weaving through the city on long, ambling runs. I wander the MFA and MOS regularly and use the Boston Public Library’s main branch at Copley Square as a source of much of my non-work related reading. Today I was spending some time with a friend EF while taking photos of our town; I wanted a picture of the snow-caked frozen river, which is the most pristine expanse of snow I’ve really ever seen. As we made our way down by the river to the Arthur Fiedler footbridge and through the Commons, I was reminded that EF hadn’t seen the library at Copley, which is one of my favorite things to show to visitors to Boston, since its grandeur is matched only by its decidedly democratic accessibility to education. Never before had anyone been underwhelmed by the magnificence of the architecture and art of the original McKim building.

To heighten the effect, one enters from Boylston Street through the relatively modern revolving doors set into the generic façade of the 1970’s library extension. Walking quickly through reveals several floors of stacks, which is what one pictures in any completely impassive book lender. However, as you walk up a set of short stairs and through the perpetually opened double doors, you pass through a vestibule into a section that immediately exudes a heightened sense of purpose and presentation. Through another set of doors lies the atrium, and on the other side of the atrium is the beautiful old McKim building that is the foundation of our magnificent public resource. In the summer, the atrium is teeming with readers and researchers who opted for sunlight and fresh air over the artificial fluorescents and musty stacks that can be found indoors. But there are far more table vacancies in the cold parts of winter, especially on grayer days like today.

As we made our way into the atrium from the industrial side of the library headed toward the enlightened side, several people as if on cue cleared the area and disappeared into one building or the other. There was one couple remaining, however, of whom EF and I immediately took notice. It was slightly conspicuous, since the guy was — did we see that right — down on one knee. Since I was still playing photojournalist from the rest of our day’s excursions, a brief conference with EF prodded me into action, and I employed the full length of the zoom in order to try and capture this rare moment onto silicon. I admit that I don’t often use people as subjects for photos out of respect for their privacy, but clearly we were hoping to share these photos with this particular couple. As I fumbled around with the camera attempting to coax clear photos out of it, I advanced my position not enough to be intrusive but enough to improve my angle. As they embraced in a clear sign of the engagement’s consummation, we recorded the events from afar.

We wanted them to enjoy their moment alone, so we waited off to the side for a moment to avoid interruption. As soon as there was a free moment, we sheepishly interrupted to explain ourselves, take a proper photo of the lovely couple, and offer to email the results, which they happily accepted.

As we continued our tour, we noted that no building could really usurp the moment we’d just witnessed. Thinking back on the happy couple, we realized that from our second-floor vantage point overlooking the atrium we could get a nice photo, so we did. I wish them a prosperous and loving life together, and I’m thankful that we were able to be a small part of their happy occasion.