[ Can't run Scrapy program ]
I have been learning how to work with Scrapy from the following link:
http://doc.scrapy.org/en/master/intro/tutorial.html
When I try to run the code from the Crawling section (scrapy crawl dmoz), I get the following error:
AttributeError: 'module' object has no attribute 'Spider'
However, when I changed "Spider" to "spider" I got nothing but a new error:
TypeError: Error when calling the metaclass bases
module.__init__() takes at most 2 arguments (3 given)
I'm so confused; what is the problem? Any help would be highly appreciated. Thanks. By the way, I am using Windows.
EDIT (source added):
First, I created a project with Scrapy by going to a directory and running the following commands in cmd:
cd #DIRECTORY PATH#
scrapy startproject tutorial
This will create a folder named tutorial in the given directory. The tutorial folder contains:
tutorial/
    scrapy.cfg
    tutorial/
        __init__.py
        items.py
        pipelines.py
        settings.py
        spiders/
            __init__.py
            ...
Then I defined my items:
import scrapy

class DmozItem(scrapy.Item):
    title = scrapy.Field()
    link = scrapy.Field()
    desc = scrapy.Field()
Afterwards, I created the spider:
import scrapy

class DmozSpider(scrapy.Spider):
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]

    def parse(self, response):
        filename = response.url.split("/")[-2]
        with open(filename, 'wb') as f:
            f.write(response.body)
Then, when running the code, the error above is shown. I am using Windows 7 64-bit along with Python 2.7 32-bit.
EDIT 2:
I tried uninstalling Scrapy and installing another version, but it didn't work. Here is the log:
C:\Users\Novin Pendar\Desktop\FS\tutorial>scrapy crawl dmoz
2015-03-26 17:48:29+0430 [scrapy] INFO: Scrapy 0.16.5 started (bot: tutorial)
2015-03-26 17:48:29+0430 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\__init__.pyc
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\cmdline.py", line 156, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\cmdline.py", line 131, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\cmdline.py", line 76, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\cmdline.py", line 138, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\commands\crawl.py", line 43, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\command.py", line 33, in crawler
    self._crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\crawler.py", line 40, in configure
    self.spiders = spman_cls.from_crawler(self)
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\spidermanager.py", line 35, in from_crawler
    sm = cls.from_settings(crawler.settings)
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\spidermanager.py", line 31, in from_settings
    return cls(settings.getlist('SPIDER_MODULES'))
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\spidermanager.py", line 22, in __init__
    for module in walk_modules(name):
  File "C:\Python27\lib\site-packages\scrapy-0.16.5-py2.7.egg\scrapy\utils\misc.py", line 65, in walk_modules
    submod = __import__(fullpath, {}, {}, [''])
  File "tutorial\spiders\dmoz_spider.py", line 3, in <module>
    class DmozSpider(scrapy.Spider):
AttributeError: 'module' object has no attribute 'Spider'
EDIT 3:
The problem is solved. I downloaded and installed the latest version of Scrapy (0.24) and everything worked great. I just wanted to mention this for people who run into the same problem I had, so it saves them a lot of time. Thanks.
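For reference, a minimal sketch of how such an upgrade can be done (assuming pip is installed and on the PATH; these exact commands are an illustration, not the precise steps I ran):
pip uninstall Scrapy
pip install --upgrade Scrapy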
Answer 1
If your installation is correct, try this: check for a scrapy.py or scrapy.pyc file in the working folder. If one exists, rename it. Don't change Spider to spider.
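To see whether a local file is shadowing the real package, one quick check (a hypothetical diagnostic, not part of the original answer) is to print which file Python actually imports as scrapy; if the path points into your project folder instead of site-packages, a stray scrapy.py or scrapy.pyc is the culprit:
import scrapy
print(scrapy.__file__)  # should point at site-packages, not at your project folder
Also note that changing Spider to spider makes the base a module object rather than a class, which is exactly what produces the "Error when calling the metaclass bases" message.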
Answer 2
Use this definition: class DmozSpider(scrapy.spider.BaseSpider):
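For example, here is a sketch of the tutorial spider written against that older API (assuming a Scrapy version, such as the 0.16.x shown in the log, where BaseSpider lives in scrapy.spider):
from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]

    def parse(self, response):
        # Save each page body to a file named after the second-to-last URL segment
        filename = response.url.split("/")[-2]
        with open(filename, 'wb') as f:
            f.write(response.body)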