Generally, when passing the size of something around, it's good to use size_t, and should that something be a blob (in the binary-large-object sense) it probably wants to be a void*.
However, the cstring extension to SWIG uses int for sizes and char* for data. For example:
%cstring_output_withsize(parm, maxparm): This macro is used to handle bounded character output functions where both a char * and an int * pointer are passed. Initially, the int * parameter points to a value containing the maximum size. On return, this value is assumed to contain the actual number of bytes. As input, a user simply supplies the maximum length. The output value is a string that may contain binary data.
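A minimal interface sketch of how that macro is applied (the module name and get_data function are hypothetical, not part of cstring.i itself):

```
%module example

%include "cstring.i"

// Apply the macro to the (result, resultsize) pair of get_data().
%cstring_output_withsize(char *result, int *resultsize);

%inline %{
#include <string.h>

/* Hypothetical C function: fills 'result' with up to '*resultsize'
 * bytes of possibly binary data and writes the actual count back. */
void get_data(char *result, int *resultsize) {
    const char payload[] = "raw\0bytes";
    int n = (int)sizeof payload;
    if (n > *resultsize) n = *resultsize;
    memcpy(result, payload, (size_t)n);
    *resultsize = n;
}
%}
```

From Python, the wrapped function then takes the maximum length as its only argument and returns a string that may contain embedded NULs.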
You could potentially create your own typemaps to handle this, re-writing large parts of the SWIG cstring interface, but the point would be moot: by the time the data gets back to the Python API it has to be an int anyway. For example, calls like PyObject* PyString_FromStringAndSize(const char *v, int len) take an int for the length (Py_ssize_t in Python 2.5 and later, which is still a signed type rather than size_t). Since Python strings support binary data, everything should be a char* too. This is less critical, but if you want to build with -Wall -Werror, as you should, you'll need to make sure the types match.
I would recommend not following the SWIG instructions about doing your own typedef for size_t. That seems fraught with danger, and you're only going to be calling Python API functions that expect an int anyway. Be aware that if you really need to pass something around whose size doesn't fit in an int, you'll have some work to do; otherwise, design your API with the right types for the Python API.