No spiders found on ScrapeOps Platform

Thread starter: finnohhh (Guest)
I'm new to the ScrapeOps platform. I have already used Scrapy to scrape some websites, and all the scripts run fine when I run them one by one on my machine. To schedule my spiders automatically, I'm trying to deploy them through ScrapeOps on my server. I successfully connected my GCP server (Ubuntu 22.04) and cloned the repository from GitHub by following the ScrapeOps guidance. However, no spiders are found. I tried to find tutorials, but nobody talks about this spider-discovery issue. I think ScrapeOps has changed its deployment process, so things might be different now compared to last year.

Here are some pictures of the ScrapeOps platform and the terminal log: Servers & Deployment (https://i.sstatic.net/T2AIV.png)

It's weird: it shows 'Spider discovered' when I click the 'Find Scrapy Spiders' button, but it still reports that no spiders were found at all. Servers & Deployment > Cloned Repos: Credit_All_In_One (https://i.sstatic.net/SZao0.png)

And the command below seems to work fine:

Code:
scrapy list

Output of the scrapy list command (https://i.sstatic.net/Raln9.png)

Here is my scrapy.cfg:

Code:
[settings]
default = credit_card_scraper.settings
shell = ipython

[deploy]
#url = http://localhost:6800/
project = credit_card_scraper
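
For completeness: as far as I know, Scrapy discovers spiders through the SPIDER_MODULES setting in settings.py, and mine keeps the standard generated entries for this layout. A rough sketch of the relevant excerpt (abbreviated, not my full file):

Code:
# credit_card_scraper/settings.py -- relevant excerpt (sketch, assumed standard)
BOT_NAME = "credit_card_scraper"

# Spider discovery: both `scrapy list` and any external tool that loads the
# project rely on these module paths.
SPIDER_MODULES = ["credit_card_scraper.spiders"]
NEWSPIDER_MODULE = "credit_card_scraper.spiders"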

And here is my GitHub repository structure (I have 24 spiders, but I'm showing just two of them to illustrate the layout):

Code:
Credit_All_In_One
├── .env
├── .gitignore
├── requirements.txt
├── scrapy.cfg
└── credit_card_scraper
    ├── __init__.py
    ├── database.py
    ├── items.py
    ├── middlewares.py
    ├── pipelines.py
    ├── settings.py
    └── spiders
        ├── __init__.py
        ├── american.py
        ├── cathay.py
        ├── ...
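
In case it helps with debugging, the spiders can also be listed programmatically with Scrapy's own SpiderLoader, which mirrors what scrapy list does. A minimal sketch (the file name check_spiders.py is just illustrative; it has to be run from the repo root, where scrapy.cfg lives, or with SCRAPY_SETTINGS_MODULE set):

Code:
# check_spiders.py -- minimal sketch
from scrapy.spiderloader import SpiderLoader
from scrapy.utils.project import get_project_settings

# get_project_settings() locates the project via scrapy.cfg (or the
# SCRAPY_SETTINGS_MODULE environment variable) and loads settings.py.
settings = get_project_settings()

# SpiderLoader walks the packages listed in SPIDER_MODULES and collects
# every Spider subclass that defines a `name` -- the same set that
# `scrapy list` prints.
loader = SpiderLoader.from_settings(settings)
for spider_name in sorted(loader.list()):
    print(spider_name)

If this prints all the spiders on the server too, the project layout should be fine, and the problem is presumably in how ScrapeOps invokes its discovery step.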

Please help me fix this problem... Much appreciated!