I’m currently developing a C++ library for Windows which will be distributed as a DLL. My goal is to maximize binary interoperability; more precisely, the functions in my DLL must be usable from code compiled with multiple versions of MSVC++ and MinGW without having to recompile the DLL. However, I’m confused about which calling convention is best.
Sometimes I hear statements like “the C calling convention is the only one guaranteed to be the same across compilers”, which contrasts with statements like “there are some variations in the interpretation of cdecl, particularly in how values are returned”. This doesn’t seem to stop certain library developers (such as libsndfile) from using the C calling convention in the DLLs they distribute, without any visible problems.
On the other hand, the stdcall calling convention seems to be well-defined. From what I’ve been told, all Windows compilers are basically required to follow it because it is the convention used for Win32 and COM (the assumption being that a Windows compiler without Win32/COM support would not be very useful). A lot of code snippets posted on forums declare functions as stdcall, but I can’t find a single post that clearly explains why.
There’s too much conflicting information out there, and every search I run gives me different answers, which doesn’t really help me decide between the two. I’m looking for a clear, detailed, well-argued explanation of why I should choose one over the other (or why the two are equivalent).
Note that this question applies not only to “classic” functions, but also to virtual member function calls, since most client code will interface with my DLL through “interfaces”: pure virtual classes (following patterns described e.g. here and there).
I just did some real-world testing (compiling DLLs and applications with MSVC++ and MinGW, then mixing them). As it turns out, I had better results with the cdecl calling convention.
More specifically: the problem with stdcall is that MSVC++ mangles names in the DLL export table, even when using extern "C" — for example, _MyFunction@4. This only happens when using __declspec(dllexport), not when using a DEF file; however, DEF files are a maintenance hassle in my opinion, and I don’t want to use them.
The MSVC++ name mangling poses two problems:
- GetProcAddress on the DLL becomes slightly more complicated;
- MinGW by default doesn’t prepend an underscore to the decorated names (e.g. MinGW will use MyFunction@4 instead of _MyFunction@4), which complicates linking. It also introduces the risk of “non-underscore versions” of DLLs and applications popping up in the wild which are incompatible with the “underscore versions”.
I’ve tried the cdecl convention: interoperability between MSVC++ and MinGW works perfectly, out of the box, and names stay undecorated in the DLL export table. It even works for virtual methods. For these reasons, cdecl is a clear winner for me.
The biggest difference between the two calling conventions is that __cdecl places the burden of balancing the stack after a function call on the caller, which allows for functions with a variable number of arguments. The __stdcall convention is “simpler” in nature, but less flexible in this regard.
Also, I believe managed languages use the stdcall convention by default, so anyone using P/Invoke on your library would have to state the calling convention explicitly if you go with cdecl.
So, if all of your function signatures are going to be statically defined, I would probably lean toward stdcall; if not, cdecl.
In terms of security, the __cdecl convention is “safer” because it is the caller that deallocates the stack. What may happen in a __stdcall library is that the developer might have forgotten to deallocate the stack properly, or an attacker might inject code by corrupting the DLL’s stack (e.g. by API hooking), which is then not checked by the caller.
I don’t have any CVE examples that show my intuition is correct.