[PySide] Re: [PyQt] user interface freezed when using concurrent.futures.ThreadPoolExecutor

iMath 2281570025 at qq.com
Thu Dec 11 06:14:24 CET 2014


Of course, that's just an example.
What I want to do is get the final URL of a few URLs that involve redirects; it would be better if this could be done asynchronously.
Currently I am using concurrent.futures.ThreadPoolExecutor, Requests, and QThread (to run the work asynchronously so the interface doesn't freeze).
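A minimal sketch of the approach described above, using only the standard library (urllib.request in place of Requests; the function names are made up for illustration): each lookup follows redirects and reports the final URL, and a ThreadPoolExecutor runs the lookups concurrently.

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def resolve_final_url(url):
    # urllib.request follows HTTP redirects automatically; the
    # response's .url attribute is the address we actually landed on.
    with urllib.request.urlopen(url) as response:
        return response.url

def resolve_all(urls, max_workers=8):
    # Run the lookups concurrently; pool.map preserves input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(resolve_final_url, urls))
```

Note that `resolve_all` still blocks the thread that calls it until every lookup finishes, which is exactly the problem discussed below: in a GUI it must not be called from the main thread, or the results must be delivered via callbacks instead.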



------------------ Original Message ------------------
From: "michael h" <michaelkenth at gmail.com>
Date: Thursday, December 11, 2014, 7:53 AM
To: "iMath" <2281570025 at qq.com>
Cc: "pyqt" <pyqt at riverbankcomputing.com>; "pyside" <pyside at qt-project.org>
Subject: Re: [PyQt] user interface freezed when using concurrent.futures.ThreadPoolExecutor




On Wed, Dec 10, 2014 at 12:07 AM, iMath <2281570025 at qq.com> wrote:
I think the user interface shouldn't freeze when using concurrent.futures.ThreadPoolExecutor here, but it doesn't meet my expectations. Can anyone explain why? Are there any other solutions that keep the user interface from freezing?


The code is here:
http://stackoverflow.com/questions/27393533/user-interface-freezed-when-using-concurrent-futures-threadpoolexecutor

_______________________________________________
 PyQt mailing list    PyQt at riverbankcomputing.com
 http://www.riverbankcomputing.com/mailman/listinfo/pyqt



It appears that concurrent.futures.as_completed yields futures as they complete, so the main thread stays blocked looping over it until every future has finished.
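A minimal illustration of the alternative (plain Python, no Qt; the task and variable names are made up): instead of looping over as_completed on the calling thread, attach a callback to each future with Future.add_done_callback. The submitting thread, which would be the GUI thread in a Qt app, returns immediately, and each callback fires on a worker thread as its future completes.

```python
from concurrent.futures import ThreadPoolExecutor
import threading
import time

def slow_task(n):
    time.sleep(0.1)  # stand-in for a network request
    return n * n

results = []
lock = threading.Lock()
all_done = threading.Event()

def on_done(future):
    # Runs on a worker thread as each future completes. In a Qt app
    # you would hop back to the GUI thread here (e.g. by emitting a
    # signal) rather than touching widgets directly.
    with lock:
        results.append(future.result())
        if len(results) == 3:
            all_done.set()

pool = ThreadPoolExecutor(max_workers=3)
for n in (1, 2, 3):
    pool.submit(slow_task, n).add_done_callback(on_done)

# Submission returned immediately: no for-future-in-as_completed(...)
# loop blocking this thread. The wait below exists only so this
# script can print the results before exiting.
all_done.wait()
pool.shutdown()
print(sorted(results))  # [1, 4, 9]
```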



You could use Qt's QNetworkAccessManager / QNetworkRequest, which issue requests asynchronously and report completion via signals, or perhaps something like Scrapy if you're trying to crawl websites (it could probably be integrated with a Qt app using qt4reactor).



What are you building?



- MH

