I’m developing an application for a development board (BeagleBone Black) that will send some data over the UART peripheral. The board runs Linux (a Debian distribution with a 3.8.x kernel).
For sending and receiving data over UART I use the standard UNIX API: the open(), read(), and write() family of functions.
For setting the communication parameters (baud rate, stop/start bits, parity, etc.) I use the termios structure (from termios.h).
This is the relevant code sequence where I make the I/O settings:
fd_debug = open("output.out", O_CREAT | O_WRONLY, S_IRUSR | S_IWUSR);
fd_write = open(port.c_str(), O_WRONLY | O_NOCTTY | O_SYNC);
std::cout << std::endl << "I opened: " << port;

struct termios settings;
tcgetattr(fd_write, &settings);

cfsetospeed(&settings, B19200);          /* baud rate */
settings.c_cflag &= ~PARENB;             /* no parity */
settings.c_cflag &= ~CSTOPB;             /* 1 stop bit */
settings.c_cflag &= ~CSIZE;
settings.c_cflag |= CS8 | CLOCAL;        /* 8 bits */
settings.c_lflag = ICANON;               /* canonical mode */
settings.c_oflag &= ~OPOST;              /* raw output */

tcsetattr(fd_write, TCSANOW, &settings); /* apply the settings */
tcflush(fd_write, TCOFLUSH);
There I opened two file descriptors:
- fd_debug: linked to a file, for debugging purposes.
- fd_write: linked to the UART peripheral (/dev/ttyO4 in my particular case).
This is the function that is executed when I want to send one byte over UART:
int UARTIOHandler::write(uchar8 byte) {
    auto tp = std::chrono::steady_clock::now();
    std::cout << std::endl << "[write] Timestamp: "
              << std::chrono::duration_cast<std::chrono::milliseconds>(
                     tp.time_since_epoch()).count();
    ::write(fd_debug, &byte, 1);
    return ::write(this->fd_write, &byte, 1);
}
For checking whether the data I send is received correctly over UART, I have connected the TX and RX pins on my board (a loopback test, because I want to be able to receive back the data that I send), and I run minicom on that particular UART port:
minicom -D /dev/ttyO4 -b 19200 -C test.test
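For reference, this is roughly how the looped-back bytes could also be read directly in the application instead of capturing them with minicom. This is only a sketch: fd_read is a hypothetical second descriptor opened on the same port (e.g. with O_RDONLY | O_NOCTTY) and configured like above, but with canonical mode disabled so that raw bytes are returned as they arrive.

#include <unistd.h>
#include <cstdio>

/* Sketch: read back whatever arrived on the loopback and print it as hex.
 * fd_read is assumed to be an already opened and configured descriptor
 * for the same port (raw, non-canonical input). */
void dumpLoopback(int fd_read, size_t expected)
{
    unsigned char buf[64];
    size_t total = 0;
    while (total < expected) {
        ssize_t n = ::read(fd_read, buf, sizeof(buf));
        if (n <= 0)
            break;                        /* error or nothing more to read */
        for (ssize_t i = 0; i < n; ++i)
            std::printf("%.2x ", buf[i]); /* one hex byte at a time */
        total += static_cast<size_t>(n);
    }
    std::printf("\n");
}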
After sending some data, I compared the two files (the debugging file and the output file generated by minicom, which should contain the data received over UART). The problem is that the data is not the same!
This is the actual data sent (in hexadecimal):
55 33 02 04 06 08 0a 0c d5 55 0b 01 03 05 07 ef 55 3f 07 06 05 04 03 02 01 e3 55 16 01 02 03 04 05 06 07 08 db 55 3f 01 02 03 04 05 06 07 e3
This is the data received in the debug file (it’s the same, so this confirms that there’s some UART problem):
55 33 02 04 06 08 0a 0c d5 55 0b 01 03 05 07 ef 55 3f 07 06 05 04 03 02 01 e3 55 16 01 02 03 04 05 06 07 08 db 55 3f 01 02 03 04 05 06 07 e3
And this is the data received by the minicom tool (which was set to listen on the same UART port with the same settings: baud rate, parity, etc.):
55 33 02 04 06 08 0a d5 55 01 03 4d 69 6e 69 63 6f 6d 32 2e 36 2e 31 07 ef 55 3f 07 06 4d 69 6e 69 63 6f 6d 32 2e 36 2e 31 04 03 02 01 e3 55 16 01 02 03 04 4d 69 6e 69 63 6f 6d 32 2e 36 2e 31 06 07 08 db 55 3f 01 02 03 04 4d 69 6e 69 63 6f 6d 32 2e 36 2e 31 06 07 e3
As can be observed, from a certain point on the data gets corrupted, and there are more bytes received than were sent.
For checking the actual data in the output files I used hexdump like this:
hexdump -ve '1/1 "%.2x "' test.test
What could be the problem?
Answer
This seems to be Minicom responding to the ENQ character (0x05) and recording its response in the session capture. The additional data is Minicom2.6.1, which is not corruption; it is substituted for every 0x05 in your stream.
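You can see this by decoding the extra bytes as ASCII: 0x05 is the ENQ control character, and the run 4d 69 6e 69 63 6f 6d 32 2e 36 2e 31 that appears in place of each 0x05 spells out Minicom's answerback string. A minimal check (sketch only):

#include <cstdio>

int main()
{
    /* The run of extra bytes that shows up in the minicom capture
     * wherever a 0x05 was sent. */
    const unsigned char extra[] = { 0x4d, 0x69, 0x6e, 0x69, 0x63, 0x6f,
                                    0x6d, 0x32, 0x2e, 0x36, 0x2e, 0x31 };

    /* Printed as characters, these bytes read "Minicom2.6.1". */
    for (unsigned char b : extra)
        std::putchar(b);
    std::putchar('\n');

    return 0;
}

Reading the looped-back bytes directly with read() on the port (as in the sketch in the question) instead of capturing through minicom should keep the answerback out of the comparison.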