[Development] [I/O, Core] Deferring signal emission: is it possible?
Denis Shienkov
denis.shienkov at gmail.com
Sun Feb 9 20:26:29 CET 2014
Hi all.
While developing QtSerialPort we ran into an unpleasant problem: high CPU
load during I/O.
Short results can be seen here:
https://bugreports.qt-project.org/browse/QTBUG-36684
The load is very high on Linux (up to 30% at 115200 baud), even when using
a single serial port instance.
Of course, a lot depends on the device driver, and maybe we missed
something in the QtSerialPort implementation, but either way the overall
tendency is not good.
For comparison (see the link above), results using Boost::asio are given;
it does twice as well as Qt.
It seems that the select() call which QSocketNotifier/the dispatcher uses
implicitly is the bottleneck. That is very strange, because the serial
port is a very slow device compared to others. In theory, serial-port I/O
should not affect overall performance or CPU load at all.
So, I thought a possible workaround would be to use "deferred" signals.
That is, an event from a device descriptor (fd) would be processed not
immediately, but with some delay, which could be expressed as a number of
event-loop cycles. This parameter could be configured in
QObject::connect(...), for example, or something similar. It would allow
more incoming data to accumulate in the device driver's input FIFO buffer,
so the data could be read less often, reducing the load on the Qt
event loop and the CPU.
Or maybe there are other solutions to this problem? Can anyone advise
something?
Best regards,
Denis