[Development] Building additional components with Conan for Qt 6

Konstantin Tokarev annulen at yandex.ru
Thu Oct 15 11:45:08 CEST 2020



14.10.2020, 09:14, "Heikki Halmet" <heikki.halmet at qt.io>:
> Hi,
>
> I'm not the expert when it comes to Conan, so I'm asking what the main gain would be from using Conan instead of the scripts we currently have..? A list of packages and versions in one place? From a quick review this would take a lot of effort, and honestly I'm pretty skeptical about the gain we would get versus the time we would spend making the needed changes.

Hi,

I guess we have a bit of a misunderstanding here.

My proposal is to use Conan for building the libraries which are currently bundled with Qt. Rationale:

1. Libraries bundled with qtbase cause trouble for downstream projects and other Qt modules. There should be a clean way to use the same dependency versions as the qtbase binaries, and an easy way to rebuild everything with a newer dependency version (see the sketch below).
2. A custom bundling system requires additional maintenance and replaces the original upstream build system with qmake or cmake. By comparison, Conan packages from conan-center-index are maintained by the community and tend to use the original upstream build system, which simplifies updates.
3. As we are introducing Conan as a way of using 3rd-party libraries from Qt modules, it would make sense to standardize on that instead of using different mechanisms for different libraries.
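
For illustration, a consumer conanfile pinning the third-party versions used by qtbase could look roughly like this (a rough sketch in Conan 1.x style; the package names and versions below are just examples of what conan-center-index provides, not a final list):

    # Sketch: one place that lists the dependency versions used by qtbase
    # builds; names/versions are illustrative examples only.
    from conans import ConanFile

    class QtBaseThirdParty(ConanFile):
        settings = "os", "compiler", "build_type", "arch"
        requires = (
            "zlib/1.2.11",
            "libpng/1.6.37",
            "pcre2/10.35",
            "sqlite3/3.33.0",
        )

Downstream projects and other Qt modules could then require the same recipe (or a matching lockfile) and get exactly the versions the qtbase binaries were built with; bumping a dependency becomes a one-line change followed by a rebuild.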

Also, if we go this way, it would make sense to use Conan for libraries like icu and openssl, which are currently not bundled but built in provisioning scripts. This would reduce the complexity of provisioning (you are right that it is done by moving the complexity into conanfiles, but those can be easily tested and maintained without involving Coin) and, again, would give us one way of dealing with all Qt dependencies.

As for replacing all provisioning scripts with Conan, I don't think it makes much sense; however, Toni was interested, so I described in detail how it could be done.

Anyway, even if qtbase bundling and building dependencies in Coin are left as is, we still need to test the use of Conan packages in CI for the modules where it is going to be supported (qtnetworkauth, qt3d, ...) to prevent regressions. To avoid network downloads in CI jobs, it would be better to install the necessary packages during the provisioning stage. On Windows this could be done by reusing the existing conan.ps1 machinery (adjusting it if needed).
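
For example, a provisioning step could simply warm up the local Conan cache so that CI jobs never touch the network (a sketch; the helper and the package references are hypothetical, the real list would come from the modules' conanfiles):

    # Hypothetical provisioning helper: prefetch Conan packages into the
    # local cache during image provisioning.
    import subprocess

    PACKAGES = ["openssl/1.1.1h@", "icu/67.1@"]  # example references

    for ref in PACKAGES:
        # --build=missing downloads a prebuilt binary when one exists and
        # builds from source otherwise, so CI jobs find everything cached.
        subprocess.run(["conan", "install", ref, "--build=missing"],
                       check=True)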


>
>>>>  2) Build stuff from sources at provisioning time, sometimes in a sophisticated way (e.g. https://code.qt.io/cgit/qt/qt5.git/tree/coin/provisioning/common/windows/android-openssl.ps1). These can be (and IMO should be) replaced with Conan, to reduce provisioning time and complexity.
>>>  Of course, it's also possible to use Conan packages for category (1), however this may require writing quite a few custom recipes (see the manual [1]), possibly containing more boilerplate than the existing ps1 snippets. However, this would allow us to have a nice list of all provisioned packages with their versions in one place.
>
> You mean reducing provisioning time when everything would already be prebuilt? This is something we could already change in the existing scripts. The script could check the local cache for the prebuilt stuff, build and add it to the cache if it's missing, and use the prebuilt package next time.


So adding a new package to CI would require going through all kinds of pain to make it build in the Coin environment without access to the VM, instead of just testing it on a local machine and/or public CI and uploading the final result. Also, the build dependencies of the package will reside in the provisioning image, possibly affecting other scripts (e.g. the msys2 brought in by android-openssl.ps1).

With Conan, a project is guaranteed to find the required dependencies in the expected place. There is no such issue as "how do I prevent CMake from finding this header or library in a place I don't want", which easily happens in Coin and is hard to solve without introducing Coin-specific hacks.
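
For example, with the cmake_paths and cmake_find_package generators the lookup is pinned to the Conan cache (a sketch, Conan 1.x syntax; the dependency is just an example):

    from conans import ConanFile

    class Example(ConanFile):
        settings = "os", "compiler", "build_type", "arch"
        requires = "zlib/1.2.11"  # example dependency
        # cmake_paths writes conan_paths.cmake, which puts the Conan package
        # folders at the front of CMAKE_PREFIX_PATH / CMAKE_MODULE_PATH, and
        # cmake_find_package generates Find<pkg>.cmake files there, so
        # find_package() resolves to the Conan-provided copy rather than to
        # whatever happens to be installed on the provisioned VM.
        generators = "cmake_paths", "cmake_find_package"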


> Currently, e.g. with the gcc installation, the script will use the prebuilt package if it exists, and if not, it will be built from sources. But in some cases we have just prebuilt the package and added it to the cache without a source-build option in the script.
> With installers I don't think there's any point in starting to put those under Conan packages. I think the only thing we would gain here is the listing.
> And when it comes to complexity - if we started using Conan for everything, we would still need to start creating quite complex conanfiles for it, right? I guess the gain in some cases would be that we could use the same conanfile on Unix and Windows.
>
>>>>  In the current implementation the packages are downloaded from the Internet, but all of them are verified against manifests stored in the provisioning repo. But you can also set up an internal Conan repository by running an instance of Artifactory CE.
>
> Yes, we would have to use our own internal Conan repo, and the repo should be kept up to date automatically.
> Do we know what kind of reliability issues Conan repos have had? Is there a risk that some package is not reachable, etc.? E.g. with brew we have stumbled into connection problems, or their repositories haven't been available.


I don't know if brew uses any kind of CDN, but Bintray is quite reliable in my experience. 


>
> Back in the day we used Puppet for provisioning. With Puppet we stumbled into problems which were quite hard to debug. The reason we dumped it and moved to shell/PowerShell scripts was to simplify things and make debugging easier.
>
> We definitely could improve the readability of our provisioning scripts, but it's all a matter of prioritization.


-- 
Regards,
Konstantin


