[Web] Restrictive robots.txt file

Tero Kojo tero.kojo at qt.io
Wed Oct 12 11:03:12 CEST 2016


Hi David,

The qt-project.org server has been offline for quite a while now.
Right now the domain is used only as a redirect server with no content whatsoever.

Apparently the robots.txt file was already there in 2013. We can't change that retroactively.

I can ask if we have a dump of the server somewhere on tape.

Tero
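
For what it's worth, a permissive robots.txt along these lines would let the Internet Archive's crawler (ia_archiver) index and serve pages again. This is only a sketch of the kind of change being asked about; the directives and the crawler name here are assumptions, not the actual file that was on qt-project.org:

    User-agent: ia_archiver
    Allow: /

    User-agent: *
    Disallow: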

> -----Original Message-----
> From: Web [mailto:web-bounces+tero.kojo=qt.io at qt-project.org] On Behalf
> Of David Boddie
> Sent: Wednesday, 12 October 2016 1.01
> To: web at qt-project.org
> Subject: [Web] Restrictive robots.txt file
> 
> In an attempt to recover as much of the Qt Quarterly content as possible, I'm
> accessing the pages linked to from
> 
> https://doc.qt.io/archives/qq/index.html
> 
> Since some of those pages were hosted on qt-project.org I now have to try
> and find their contents via archive.org and this is where I fall foul of the
> robots.txt file for that site:
> 
> https://web.archive.org/web/20130702155050/http://qt-project.org/quarterly/view/using_cmake_to_build_qt_projects
> 
> Please could someone tweak the file to make it possible for us to
> access those pages again?
> 
> Alternatively, or even additionally, could I get access to the repository that
> held the Qt Quarterly documents and examples? That would be even better
> than having to scrape old archives of the site.
> 
> Thanks,
> 
> David
> _______________________________________________
> Web mailing list
> Web at qt-project.org
> http://lists.qt-project.org/mailman/listinfo/web
