Profiling Python scripts (3): stl2ps
This is the third in a series of articles that covers analyzing and improving
performance bottlenecks in Python scripts.
The performance of stl2ps
is the topic of this article.
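As a reminder of the basic approach used throughout this series, a minimal profiling sketch with the standard cProfile and pstats modules could look like this. The work function is a hypothetical stand-in for the script under investigation:

```python
import cProfile
import io
import pstats

def work():
    # Hypothetical stand-in for the script being measured.
    return sum(i * i for i in range(100_000))

profiler = cProfile.Profile()
profiler.enable()
work()
profiler.disable()

# Print the five most expensive calls, sorted by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Sorting by cumulative time makes it easy to spot which calls dominate the run time.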
This is the second in a series of articles that covers analyzing and improving
performance bottlenecks in Python scripts.
This second article looks at the performance of stlinfo.
This is the first in a series of articles that covers analyzing and improving performance bottlenecks in Python scripts.
Installing Python scripts (as opposed to modules) is too involved when using distutils/setuptools. Those tools do not take zipped archives or scripts using a GUI toolkit into account. The latter is a problem on ms-windows.
So I wrote my own setup scripts to do things differently.
These scripts are now available on github as setup-py-script.
It is rare to see a piece of software considered “done”. The TeX typesetting engine is one of the few examples I could name. Why isn’t this more common?
This article covers some aspects of using SSH keys with github that are left out of the original documentation on github.
It assumes that you’ve been using HTTPS with a password for remote access to github.
This article documents how I set up Python and the syslog daemon so that Python programs can log to syslogd.
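A minimal sketch of the Python side of such a setup, using the standard library's SysLogHandler. The address used here is an assumption for illustration: the handler's UDP default of localhost:514; a local daemon may instead listen on a Unix socket such as /dev/log on Linux.

```python
import logging
import logging.handlers

logger = logging.getLogger("example")
logger.setLevel(logging.INFO)

# UDP to localhost:514 is the SysLogHandler default; a local syslogd may
# instead listen on a Unix socket such as "/dev/log" (Linux).
handler = logging.handlers.SysLogHandler(address=("localhost", 514))
handler.setFormatter(logging.Formatter("myscript: %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("hello from Python")
```

The formatter prefix (here the made-up name "myscript") is what lets you find the program's messages back in the syslog output.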
When you are exploring a problem, it is generally best to first write a command-line program whenever possible.
It takes less effort to write than a full-blown GUI.
Recently I wrote a program to remove the protection from ms-excel files.
The original version was written as a command-line program. Later I re-used the relevant code for a GUI program for use on ms-windows. This was mainly for the benefit of some colleagues who are not comfortable with using the command-line.
In this article I want to contrast the two programs.
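One way to make that kind of re-use easy is to keep the actual work in a plain function and keep the command-line handling separate, so a GUI front-end can call the same function later. A sketch under assumed names; process here is a hypothetical stand-in for the real logic:

```python
import argparse

def process(data: bytes) -> bytes:
    """Hypothetical stand-in for the real work, e.g. rewriting a file."""
    return data

def main(argv=None):
    """Command-line front-end; a GUI would call process() directly."""
    parser = argparse.ArgumentParser(description="process a file")
    parser.add_argument("path", help="file to process")
    args = parser.parse_args(argv)
    with open(args.path, "rb") as f:
        result = process(f.read())
    print(f"processed {len(result)} bytes")
```

Because main() only parses arguments and does I/O, the GUI program does not have to duplicate any of the processing code.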
There are basically two ways in which one can make use of a modern CPU with multiple cores for computationally intensive work: multiple threads within a single process, or multiple cooperating processes.
In the first case, all data is implicitly shared. In the second case, data must be explicitly shared or communicated.
The first option is often said to be more convenient. I would like to make the case that it usually makes the task more difficult, because all the shared data has to be managed.
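A minimal sketch of the second approach with the standard multiprocessing module; square is a hypothetical stand-in for the real per-item computation, and the input numbers are made up for illustration:

```python
import multiprocessing as mp

def square(n):
    # Hypothetical per-item computation; runs in a worker process.
    return n * n

def parallel_squares(numbers):
    # Each item is sent to a worker process explicitly; nothing is
    # shared implicitly between the processes.
    with mp.Pool() as pool:
        return pool.map(square, numbers)

if __name__ == "__main__":
    print(parallel_squares(range(10)))
```

Since the workers only receive their input and return their output, there is no shared state to guard with locks.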