Goutte scrape hangs on a specific website until it times out



I'm playing around with Goutte and can't get it to connect to one particular website. Every other URL I've tried works perfectly, and I'm trying to understand what is stopping it from connecting. It hangs until it times out after 30 seconds; if I raise the timeout, the same thing happens after 150 seconds.

Things I've noticed:

  • The timeout/hang only happens with tesco.com, as far as I've found. asda.com, google.com, etc. work fine and return results.
  • The site loads instantly in a web browser (Chrome), so it isn't IP- or ISP-related.
  • If I make a GET request to the same URL in Postman, I get a result back.
  • It isn't related to the user agent.
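Since Postman succeeds while Goutte hangs, a plausible explanation (an assumption, not something the trace proves) is that the server holds connections whose request headers don't look browser-like. One way to test that theory outside Goutte/Guzzle entirely is a bare ext-curl sketch with a short timeout; the header set below mimics Chrome and is purely illustrative:

```php
<?php
// Diagnostic sketch (assumption: the hang is header-dependent, not Goutte-specific).
// Builds a Chrome-like header set and prepares a curl handle with a short timeout.

function browserHeaders(): array
{
    return [
        'accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
        'accept-encoding: gzip, deflate, br',
        'accept-language: en-GB,en-US;q=0.9,en;q=0.8',
        'upgrade-insecure-requests: 1',
        'user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.96 Safari/537.36',
    ];
}

$ch = curl_init('https://www.tesco.com/');
curl_setopt_array($ch, [
    CURLOPT_HTTPHEADER     => browserHeaders(),
    CURLOPT_TIMEOUT        => 10,   // fail fast instead of hanging for 30s
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_ENCODING       => '',   // let curl decode gzip/deflate/br transparently
]);
// curl_exec($ch) would send the request; comparing a run with and without
// browserHeaders() shows whether the headers are what makes the difference.
```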
```php
<?php

namespace App\Http\Controllers;

use Goutte\Client;
use GuzzleHttp\Client as GuzzleClient;

class ScraperController extends Controller
{
    public function scrape()
    {
        $goutteClient = new Client();
        $goutteClient->setHeader('user-agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.96 Safari/537.36');

        $guzzleClient = new GuzzleClient([
            'timeout' => 30,
            'verify'  => true,
            'debug'   => true,
        ]);
        $goutteClient->setClient($guzzleClient);

        $crawler = $goutteClient->request('GET', 'https://www.tesco.com/');
        dump($crawler);

        /*$crawler->filter('.result__title .result__a')->each(function ($node) {
            dump($node->text());
        });*/
    }
}
```

Here is the debug output, including the error:

```
* Trying 104.123.91.150:443...
* TCP_NODELAY set
* Connected to www.tesco.com (104.123.91.150) port 443 (#0)
* ALPN, offering http/1.1
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
    CApath: /etc/ssl/certs
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
* ALPN, server accepted to use http/1.1
* Server certificate:
*  subject: C=GB; L=Welwyn Garden City; jurisdictionC=GB; O=Tesco PLC; businessCategory=Private Organization; serialNumber=00445790; CN=www.tesco.com
*  start date: Feb 4 11:09:23 2020 GMT
*  expire date: Feb 3 11:39:21 2022 GMT
*  subjectAltName: host "www.tesco.com" matched cert's "www.tesco.com"
*  issuer: C=US; O=Entrust, Inc.; OU=See www.entrust.net/legal-terms; OU=(c) 2014 Entrust, Inc. - for authorized use only; CN=Entrust Certification Authority - L1M
*  SSL certificate verify ok.
> GET / HTTP/1.1
Host: www.tesco.com
user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.96 Safari/537.36
* old SSL session ID is stale, removing
* Operation timed out after 30001 milliseconds with 0 bytes received
* Closing connection 0
```
```
GuzzleHttp\Exception\ConnectException
cURL error 28: Operation timed out after 30001 milliseconds with 0 bytes received (see https://curl.haxx.se/libcurl/c/libcurl-errors.html)
http://localhost/scrape
```
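Reading the trace: DNS, TCP, the TLS handshake, and certificate verification all succeed, and the request goes out; the timeout then fires with 0 bytes received, meaning the server accepts the connection and simply never answers. To make that distinction explicit, Guzzle's `connect_timeout` request option can be set separately from `timeout` (a configuration sketch; both option names come from Guzzle's documented request options):

```php
<?php
// Sketch: split Guzzle's timeout budget so a silent server shows up as a
// read timeout rather than a generic cURL error 28.
$options = [
    'connect_timeout' => 5,    // DNS + TCP + TLS handshake only
    'timeout'         => 30,   // entire transfer, including the response body
    'verify'          => true,
];
// new \GuzzleHttp\Client($options) would apply these; with this split, a hang
// like the one above expires in the 'timeout' phase, confirming the
// connection itself is fine and the server is withholding the response.
```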

Does anyone know why I'm not getting any response back?

I managed to fix this by adding some extra headers:

```php
<?php

namespace App\Http\Controllers;

use Goutte\Client;
use GuzzleHttp\Client as GuzzleClient;

class ScraperController extends Controller
{
    public function scrape()
    {
        $goutteClient = new Client();
        $goutteClient->setHeader('accept', 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9');
        $goutteClient->setHeader('accept-encoding', 'gzip, deflate, br');
        $goutteClient->setHeader('accept-language', 'en-GB,en-US;q=0.9,en;q=0.8');
        $goutteClient->setHeader('upgrade-insecure-requests', '1');
        $goutteClient->setHeader('user-agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.96 Safari/537.36');
        $goutteClient->setHeader('connection', 'keep-alive');

        $guzzleClient = new GuzzleClient([
            'timeout' => 5,
            'verify'  => true,
            'debug'   => true,
            'cookies' => true,
        ]);
        $goutteClient->setClient($guzzleClient);

        $crawler = $goutteClient->request('GET', 'https://www.tesco.com/');
        dump($crawler);

        /*$crawler->filter('.result__title .result__a')->each(function ($node) {
            dump($node->text());
        });*/
    }
}
```
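Since the fix is just a run of `setHeader()` calls, the same header set can be factored out and reused for other scrape targets. The sketch below is illustrative (`applyBrowserHeaders` and the `$headers` map are my names, not part of Goutte's API); it works against anything exposing a `setHeader(string, string)` method, demonstrated here with a minimal recording stand-in:

```php
<?php
// Sketch: apply a reusable Chrome-like header map to any client that exposes
// setHeader(). Names (applyBrowserHeaders, $headers) are hypothetical.

function applyBrowserHeaders(object $client, array $headers): void
{
    foreach ($headers as $name => $value) {
        $client->setHeader($name, $value);
    }
}

$headers = [
    'accept'                    => 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'accept-encoding'           => 'gzip, deflate, br',
    'accept-language'           => 'en-GB,en-US;q=0.9,en;q=0.8',
    'upgrade-insecure-requests' => '1',
    'user-agent'                => 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.96 Safari/537.36',
    'connection'                => 'keep-alive',
];

// Minimal stand-in with the same setHeader() signature, for demonstration;
// in the controller above the real Goutte\Client would be passed instead.
$recorder = new class {
    public array $seen = [];
    public function setHeader(string $name, string $value): void
    {
        $this->seen[$name] = $value;
    }
};

applyBrowserHeaders($recorder, $headers);
echo count($recorder->seen), " headers applied\n"; // prints "6 headers applied"
```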
