In my CGI application, something goes wrong when I pass a POST request with a content length of 0. Here is my code:
char* len_ = getenv("CONTENT_LENGTH");
char* type_ = getenv("REQUEST_METHOD");
if (len_ != NULL)
{
    // The code hangs somewhere here
    long int len = strtol(len_, NULL, 10);
    char* postdata = (char*)malloc(len + 1);
    if (!postdata) { exit(EXIT_FAILURE); }
    //fgets(postdata, len + 1, stdin);
    string temp = "";
    fstream ff;
    string fileName = string(XML_DATA_DIRECTORY) + string("data.xml");
    ff.open(fileName.c_str(), ios::in | ios::out | ios::trunc);
    // ff.open(fileName.c_str());
    if (ff)
    {
        // Modified: to handle newlines in the XML request
        while (fgets(postdata, len + 1, stdin) != NULL)
        {
            temp.append(postdata);
        }
        ff << temp;
    }
    else
    {
        // Error on the fstream
    }
    ff.close();
    //free(postdata);
}
else
{
    // No data
}
else
{
// No Data
}
I test my application with the HTTP Requester add-on for Firefox. When I send a POST request with no data, the application seems to enter a loop and never responds. A GET request works fine, because len_ is NULL and execution skips the if block. A POST request with data also works fine: the body is received and saved correctly. The case I can't figure out is a POST with CONTENT_LENGTH = 0. How do I handle that case? I tried strlen(len_), but it did not work. Thanks for the help.
Answer
Check if getenv returns NULL, e.g.:
char* len_;
long int len;

len_ = getenv("CONTENT_LENGTH");
if (len_ && sscanf(len_, "%ld", &len) == 1) {
    if (len > 0) {
        // safe to read `len` bytes of POST data here
    }
}
Note that (as pointed out by @Deduplicator) it is better to declare len as unsigned long or size_t, because CONTENT_LENGTH (the number of bytes sent by the client) is never negative.