2026-05-11 15:00:50 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: competitor_spider)
2026-05-11 15:00:50 [scrapy.utils.log] INFO: Versions: lxml 6.1.0.0, libxml2 2.14.6, cssselect 1.4.0, parsel 1.11.0, w3lib 2.4.1, Twisted 25.5.0, Python 3.11.15 (main, Apr 7 2026, 02:24:41) [GCC 14.2.0], pyOpenSSL 26.0.0 (OpenSSL 3.5.6 7 Apr 2026), cryptography 46.0.7, Platform Linux-6.12.30+-x86_64-with-glibc2.41
2026-05-11 15:00:50 [scrapy.addons] INFO: Enabled addons:
[]
2026-05-11 15:00:50 [scrapy.extensions.telnet] INFO: Telnet Password: e3833dcf423a759b
2026-05-11 15:00:50 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.closespider.CloseSpider',
'scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.logstats.LogStats']
2026-05-11 15:00:50 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'competitor_spider',
'CLOSESPIDER_PAGECOUNT': 300,
'CONCURRENT_REQUESTS': 2,
'CONCURRENT_REQUESTS_PER_DOMAIN': 1,
'DEPTH_LIMIT': 5,
'DOWNLOAD_DELAY': 2,
'DOWNLOAD_TIMEOUT': 60,
'LOG_FILE': '/var/lib/scrapyd/logs/competitors/competitor/159d528e4d4a11f1b1460242ac1f0002.log',
'LOG_LEVEL': 'INFO',
'NEWSPIDER_MODULE': 'competitor_spider.spiders',
'RETRY_HTTP_CODES': [500, 502, 503, 504, 408, 429],
'ROBOTSTXT_OBEY': True,
'SPIDER_MODULES': ['competitor_spider.spiders'],
'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
'USER_AGENT': 'ChalknPencilsBot/1.0 (+https://chalknpencils.com)'}
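The "Overridden settings" block above corresponds to a project settings.py roughly like the following sketch. The values are copied from the log; the module layout and comments are assumptions.

```python
# Hedged reconstruction of the settings.py behind the log's
# "Overridden settings" dump. Values come from the log itself.

BOT_NAME = "competitor_spider"
SPIDER_MODULES = ["competitor_spider.spiders"]
NEWSPIDER_MODULE = "competitor_spider.spiders"

# Politeness: the log shows robots.txt being fetched before the first page.
ROBOTSTXT_OBEY = True
USER_AGENT = "ChalknPencilsBot/1.0 (+https://chalknpencils.com)"
DOWNLOAD_DELAY = 2
CONCURRENT_REQUESTS = 2
CONCURRENT_REQUESTS_PER_DOMAIN = 1

# Crawl bounds: stop after 300 pages or 5 link hops from the seeds.
CLOSESPIDER_PAGECOUNT = 300
DEPTH_LIMIT = 5
DOWNLOAD_TIMEOUT = 60
RETRY_HTTP_CODES = [500, 502, 503, 504, 408, 429]

# scrapy-playwright requires the asyncio reactor.
TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"
```

Note that `DOWNLOAD_TIMEOUT = 60` applies to Scrapy's own downloader; as the traceback further down shows, Playwright navigations are governed by Playwright's separate 30 s default.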
2026-05-11 15:00:50 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
'scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-05-11 15:00:50 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-05-11 15:00:50 [scrapy.middleware] INFO: Enabled item pipelines:
['competitor_spider.pipelines.WebhookPipeline']
2026-05-11 15:00:50 [scrapy.core.engine] INFO: Spider opened
2026-05-11 15:00:50 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-05-11 15:00:50 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6031
2026-05-11 15:00:50 [scrapy-playwright] INFO: Starting download handler
2026-05-11 15:00:50 [scrapy-playwright] INFO: Starting download handler
2026-05-11 15:00:51 [scrapy-playwright] INFO: Launching 1 startup context(s)
2026-05-11 15:00:51 [scrapy-playwright] INFO: Launching browser chromium
2026-05-11 15:00:51 [scrapy-playwright] INFO: Browser chromium launched
2026-05-11 15:00:51 [scrapy-playwright] INFO: Startup context(s) launched
2026-05-11 15:00:51 [scrapy-playwright] INFO: Launching 1 startup context(s)
2026-05-11 15:00:51 [scrapy-playwright] INFO: Launching browser chromium
2026-05-11 15:00:51 [scrapy-playwright] INFO: Browser chromium launched
2026-05-11 15:00:51 [scrapy-playwright] INFO: Startup context(s) launched
2026-05-11 15:00:55 [competitor] INFO: Starting crawl: PeiPer Arts School @ https://www.peiperschool.com/
2026-05-11 15:01:28 [scrapy-playwright] WARNING: Closing page due to failed request: exc_type=<class 'playwright._impl._errors.TimeoutError'> exc_msg=Page.goto: Timeout 30000ms exceeded.
Call log:
- navigating to "https://www.peiperschool.com/", waiting until "load"
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 403, in _download_request
return await self._download_request_with_page(request, page, spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 432, in _download_request_with_page
response, download = await self._get_response_and_download(request, page, spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 532, in _get_response_and_download
response = await page.goto(url=request.url, **page_goto_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/playwright/async_api/_generated.py", line 8985, in goto
await self._impl_obj.goto(
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_page.py", line 551, in goto
return await self._main_frame.goto(**locals_to_params(locals()))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_frame.py", line 145, in goto
await self._channel.send("goto", locals_to_params(locals()))
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 61, in send
return await self._connection.wrap_api_call(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 528, in wrap_api_call
raise rewrite_error(error, f"{parsed_st['apiName']}: {error}") from None
playwright._impl._errors.TimeoutError: Page.goto: Timeout 30000ms exceeded.
Call log:
- navigating to "https://www.peiperschool.com/", waiting until "load"
2026-05-11 15:01:28 [scrapy.core.scraper] ERROR: Error downloading <GET https://www.peiperschool.com/>
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 1853, in _inlineCallbacks
result = context.run(
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/twisted/python/failure.py", line 467, in throwExceptionIntoGenerator
return g.throw(self.value.with_traceback(self.tb))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/middleware.py", line 68, in process_request
return (yield download_func(request, spider))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 1257, in adapt
extracted: _SelfResultT | Failure = result.result()
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 403, in _download_request
return await self._download_request_with_page(request, page, spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 432, in _download_request_with_page
response, download = await self._get_response_and_download(request, page, spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 532, in _get_response_and_download
response = await page.goto(url=request.url, **page_goto_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/playwright/async_api/_generated.py", line 8985, in goto
await self._impl_obj.goto(
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_page.py", line 551, in goto
return await self._main_frame.goto(**locals_to_params(locals()))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_frame.py", line 145, in goto
await self._channel.send("goto", locals_to_params(locals()))
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 61, in send
return await self._connection.wrap_api_call(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 528, in wrap_api_call
raise rewrite_error(error, f"{parsed_st['apiName']}: {error}") from None
playwright._impl._errors.TimeoutError: Page.goto: Timeout 30000ms exceeded.
Call log:
- navigating to "https://www.peiperschool.com/", waiting until "load"
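The TimeoutError above comes from Playwright's default 30 s navigation timeout, not from Scrapy's `DOWNLOAD_TIMEOUT` of 60 s. A minimal sketch of the per-request fix, using scrapy-playwright's documented meta keys (`playwright`, `playwright_page_goto_kwargs`, whose dict is passed straight to `page.goto()`); the specific values here are illustrative choices, not the project's:

```python
# Hedged sketch: loosening the navigation timeout that fired above.
meta = {
    "playwright": True,
    "playwright_page_goto_kwargs": {
        # Wait only for DOMContentLoaded instead of the full "load" event,
        # which slow third-party assets can stall past any timeout.
        "wait_until": "domcontentloaded",
        "timeout": 60_000,  # milliseconds, up from Playwright's 30 s default
    },
}
# Project-wide alternative in settings.py:
# PLAYWRIGHT_DEFAULT_NAVIGATION_TIMEOUT = 60_000
```

The meta dict would be attached when building the request, e.g. `scrapy.Request(url, meta=meta)`.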
2026-05-11 15:01:29 [scrapy.core.engine] INFO: Closing spider (finished)
2026-05-11 15:01:29 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (0 items) in: file:///var/lib/scrapyd/items/competitors/competitor/159d528e4d4a11f1b1460242ac1f0002.jl
2026-05-11 15:01:29 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 1,
'downloader/exception_type_count/playwright._impl._errors.TimeoutError': 1,
'downloader/request_bytes': 480,
'downloader/request_count': 2,
'downloader/request_method_count/GET': 2,
'downloader/response_bytes': 1335,
'downloader/response_count': 1,
'downloader/response_status_count/200': 1,
'elapsed_time_seconds': 38.23917,
'feedexport/success_count/FileFeedStorage': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2026, 5, 11, 15, 1, 29, 91713, tzinfo=datetime.timezone.utc),
'httpcompression/response_bytes': 494,
'httpcompression/response_count': 1,
'items_per_minute': None,
'log_count/ERROR': 1,
'log_count/INFO': 22,
'log_count/WARNING': 1,
'memusage/max': 79147008,
'memusage/startup': 79147008,
'playwright/context_count': 2,
'playwright/context_count/max_concurrent': 1,
'playwright/context_count/persistent/False': 2,
'playwright/context_count/remote/False': 2,
'playwright/page_count': 1,
'playwright/page_count/closed': 1,
'playwright/page_count/max_concurrent': 1,
'playwright/request_count': 1,
'playwright/request_count/method/GET': 1,
'playwright/request_count/navigation': 1,
'playwright/request_count/resource_type/document': 1,
'playwright/response_count': 1,
'playwright/response_count/method/GET': 1,
'playwright/response_count/resource_type/document': 1,
'response_received_count': 1,
'responses_per_minute': None,
'robotstxt/request_count': 1,
'robotstxt/response_count': 1,
'robotstxt/response_status_count/200': 1,
'scheduler/dequeued': 1,
'scheduler/dequeued/memory': 1,
'scheduler/enqueued': 1,
'scheduler/enqueued/memory': 1,
'start_time': datetime.datetime(2026, 5, 11, 15, 0, 50, 852543, tzinfo=datetime.timezone.utc)}
2026-05-11 15:01:29 [scrapy.core.engine] INFO: Spider closed (finished)
2026-05-11 15:01:29 [scrapy-playwright] INFO: Closing download handler
2026-05-11 15:01:29 [scrapy-playwright] INFO: Closing browser
2026-05-11 15:01:29 [scrapy-playwright] INFO: Closing download handler
2026-05-11 15:01:29 [scrapy-playwright] INFO: Closing browser
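The run ends with `finish_reason: 'finished'` and 0 items because the single seed request timed out and nothing handled the failure. A common mitigation is an errback that retries a timed-out navigation once with a larger budget. The function name and the `timeout_retried` meta flag below are inventions for illustration; in a spider this would be a method wired up via `Request(url, errback=self.retry_on_timeout)`.

```python
# Hedged sketch: retry a failed navigation once with a longer timeout,
# so one slow homepage does not end the crawl with 0 items.
def retry_on_timeout(failure):
    # Scrapy attaches the originating request to the Failure it passes
    # to errbacks.
    request = failure.request
    if request.meta.get("timeout_retried"):
        return None  # already retried once; give up on this URL
    new_meta = dict(request.meta)
    new_meta["timeout_retried"] = True
    goto_kwargs = dict(new_meta.get("playwright_page_goto_kwargs", {}))
    goto_kwargs["timeout"] = 90_000  # milliseconds
    new_meta["playwright_page_goto_kwargs"] = goto_kwargs
    # Request.replace() returns a copy; dont_filter bypasses the dupe filter
    # so the rescheduled request is not dropped as already-seen.
    return request.replace(meta=new_meta, dont_filter=True)
```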