RS-232 - help!
Hello,
At this point, if anyone has used the CSerialWnd class from The Code Project, I
am desperately in need of a code sample that performs a read from RS-232.
I have been exploring this subject with some of the most credible fellows
on this newsgroup, and I fully appreciated their support and wish to
continue. I am restarting the thread so as not to clutter the previous post,
and to give the full story in case I didn't post it clearly before.
Right now I have the following code:
===========================================
// Global object, declared once
CSerialWnd MySerial;

LRESULT CALLBACK WndProc_CW1(HWND hwnd, UINT message,
                             WPARAM wParam, LPARAM lParam)
{
    static DWORD dwBytesRead = 23;
    static BYTE  abBuffer[23];
    static TCHAR szBuffer[49];

    MySerial.Open(TEXT("COM1"), hwnd, WM_NULL, lParam, 0, 0);

    // Serial port monitor
    if (message == CSerialWnd::mg_nDefaultComMsg)
    {
        // A serial message occurred
        const CSerialWnd::EEvent eEvent = CSerialWnd::EEvent(LOWORD(wParam));
        const CSerialWnd::EError eError = CSerialWnd::EError(HIWORD(wParam));
        MySerial.Setup(CSerial::EBaud9600, CSerial::EData8,
                       CSerial::EParNone, CSerial::EStop1);
        MySerial.SetupReadTimeouts(CSerial::EReadTimeoutBlocking);
        switch (eEvent)
        {
        case CSerialWnd::EEventRecv:
            MySerial.Read(abBuffer, sizeof(abBuffer), &dwBytesRead);
            //abBuffer[dwBytesRead] = 0; // Null character is now included
                                         // in the received message
            break;
        default:
            break;
        }
        MultiByteToWideChar(CP_ACP, 0, (LPCSTR)abBuffer, 23, szBuffer, 49);
        MySerial.Close();
        return 0;
    }
    return DefWindowProc(hwnd, message, wParam, lParam);
}
=================================================
I have been explicitly told about SetCommTimeouts, and I don't know how
else to set this... I tried it with the
MySerial.SetupReadTimeouts(CSerial::EReadTimeoutBlocking);
line, and that's the closest I have to controlling the comm timeouts.
Unless, obviously, some other method has to be set with some sort of byte
limit? Please see the code above, for I am not sure if I am using the comm
timeout correctly!
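For reference, the raw Win32 way to control this, which the CSerial wrapper drives underneath, is SetCommTimeouts with a COMMTIMEOUTS structure. This is only a sketch (Windows-only; the handle and the particular millisecond values are just an assumed example, not anything from my program):

```cpp
#include <windows.h>

// Sketch: put a bounded read timeout on an already-opened port.
// hComm would come from CreateFile(TEXT("COM1"), ...); error handling omitted.
void SetReadTimeout(HANDLE hComm)
{
    COMMTIMEOUTS ct = { 0 };
    ct.ReadIntervalTimeout        = 50;   // max ms allowed between two bytes
    ct.ReadTotalTimeoutMultiplier = 10;   // ms added per requested byte
    ct.ReadTotalTimeoutConstant   = 500;  // ms added per ReadFile call
    SetCommTimeouts(hComm, &ct);          // ReadFile now returns after the
                                          // timeout even with fewer bytes
}
```

With values like these, ReadFile returns as soon as the requested bytes arrive, or once the computed timeout expires, whichever comes first.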
Here is what's happening:
To start, I always place a break point on the following line of my code:
MySerial.Read(abBuffer,sizeof(abBuffer),&dwBytesRead); //Break point here
So, the moment I send my message to the PC's RS-232 port, I stop at the
break point. I then step one line to invoke the Read function, and then
continue stepping through until the following statement:
return 0;
The correct message was returned. However, immediately after, VC++ shows my
program going back again to the break point a second time without me having
sent another message to the port. I guess this event is caused by some junk
left over in the serial buffer. So I step to the Read statement.... and then
VC++ hangs thereafter!
First, I don't know if my comm timeout is well set...? Obviously, if it
were, then the Read function would have returned... right?
What I understood from previous posts, thanks to a credible newsgroup
fellow, is that the Read function will read data coming in on the port and,
when finished, return after the specified timeout. If the comm timeout is not
specified, then Read should never return... right?
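To make sure I have those semantics straight, here is a portable analogy in standard C++ (this is not the CSerial API, just an illustration of blocking versus timed waiting): a timed wait gives up and returns empty-handed when the timeout expires, while the untimed form would block forever if no data ever arrives.

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>

// Analogy for the two read-timeout modes (not the CSerial API).
// Returns true if "data" arrived within timeoutMs, false on timeout.
bool timedWaitForData(int timeoutMs)
{
    std::mutex m;
    std::condition_variable cv;
    bool dataArrived = false;          // nothing ever arrives in this sketch

    std::unique_lock<std::mutex> lock(m);
    // With a timeout, the wait gives up and returns false -- no hang.
    // The no-timeout form, cv.wait(lock, ...), would block forever here,
    // which is exactly the hang I am seeing on the second Read.
    return cv.wait_for(lock, std::chrono::milliseconds(timeoutMs),
                       [&] { return dataArrived; });
}
```

If my understanding above is right, a correctly set comm timeout should make the second Read behave like the timed wait here, not like the untimed one.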
However, where this doesn't add up is that the first time around, the Read
function returns without a problem, comm timeout specified or not! Anyway,
good for me if it does! It's probably because I have a null character in my
message... I guess; what the hell do I know anyway!
What I am trying to prevent is its coming back to read the port a
second time. And if it does, due to some leftover junk in the buffer, well,
then this is where the comm timeout should cut the Read short and return
gracefully!
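On the leftover-junk theory: at the raw Win32 level there is a call that simply discards whatever is pending in the driver's receive buffer, which might be a cleaner fix than relying on the timeout. A sketch (Windows-only; hComm is an assumed handle, and whether the CSerial wrapper exposes an equivalent helper is something I would need to check in its source):

```cpp
#include <windows.h>

// Sketch: discard any leftover bytes in the receive buffer so a stale
// event cannot hand junk to the next Read.
void DiscardPendingInput(HANDLE hComm)
{
    PurgeComm(hComm, PURGE_RXCLEAR);   // clear the driver's input queue
}
```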
Thanks for all your help; indeed, guidance would be much appreciated.
Thanks!
--
Best regards
Roberto