Sockets: setsockopt SO_SNDTIMEO does not set the send timeout.
Hi everybody!
I'm trying to set a 1-second "send timeout" on a SOCKET using setsockopt.
Apparently it works, because the return value of setsockopt is non-zero.
But when I check the value of the "send timeout" via getsockopt, it is still
0, both before and after calling setsockopt.
You can see the code below.
Does anyone know what may be happening?
Thank you very much!
Best regards,
Ricardo Vázquez.
Madrid, Spain.
--CODE-----------------------
int nRet1 = -1;
int nLen1 = sizeof(nRet1);
if (getsockopt(m_listenSocket, IPPROTO_TCP, SO_SNDTIMEO, (char*)&nRet1,
               &nLen1) != 0)
{
    CString sLog;
    sLog.Format("Send Timeout = %d", nRet1);
    g_logSystem.logNormal(0, sLog);
}
int nVal2 = 1000;
if (setsockopt(m_listenSocket, IPPROTO_TCP, SO_SNDTIMEO, (char*)&nVal2,
               sizeof(nVal2)) == 0)
{
    CString err;
    err.Format("[esvr] setsockopt(%ld SNDTIMEO) error: %ld",
               m_listenSocket, WSAGetLastError());
    g_logSystem.logWarning(0, err);
}
int nRet2 = -1;
int nLen2 = sizeof(nRet2);
if (getsockopt(m_listenSocket, IPPROTO_TCP, SO_SNDTIMEO, (char*)&nRet2,
               &nLen2) != 0)
{
    CString sLog;
    sLog.Format("Send Timeout = %d", nRet2); // --> HERE IT OUTPUTS 0, INSTEAD OF 1000!
    g_logSystem.logNormal(0, sLog);
}
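
In case it helps, this is the minimal pattern I am trying to reproduce, based
on my reading of the Winsock documentation (just a sketch, not my real code:
the helper name SetSendTimeout is made up, and error handling is reduced to
logging). The docs describe SO_SNDTIMEO as a SOL_SOCKET-level option whose
value on Windows is a DWORD holding the timeout in milliseconds.
--SKETCH---------------------
// Minimal, self-contained sketch (Win32 / Winsock2), assuming a socket that
// was already created elsewhere (WSAStartup done, etc.).
#include <winsock2.h>
#include <cstdio>
#pragma comment(lib, "ws2_32.lib")

bool SetSendTimeout(SOCKET s, DWORD milliseconds)
{
    // SO_SNDTIMEO lives at the SOL_SOCKET level; on Windows the option value
    // is a DWORD in milliseconds. setsockopt returns 0 on success and
    // SOCKET_ERROR on failure.
    if (setsockopt(s, SOL_SOCKET, SO_SNDTIMEO,
                   reinterpret_cast<const char*>(&milliseconds),
                   sizeof(milliseconds)) == SOCKET_ERROR)
    {
        printf("setsockopt(SO_SNDTIMEO) failed: %d\n", WSAGetLastError());
        return false;
    }

    // Read the value back to confirm it was actually stored.
    DWORD readBack = 0;
    int   len      = sizeof(readBack);
    if (getsockopt(s, SOL_SOCKET, SO_SNDTIMEO,
                   reinterpret_cast<char*>(&readBack), &len) == SOCKET_ERROR)
    {
        printf("getsockopt(SO_SNDTIMEO) failed: %d\n", WSAGetLastError());
        return false;
    }

    printf("Send timeout is now %lu ms\n", readBack);
    return readBack == milliseconds;
}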