
attributeerror found #3

Open
brucedong2016 opened this issue Jun 24, 2017 · 0 comments

Comments

@brucedong2016

When I use the example linkgenerator, or define a new spider, I get the same error messages. Please help me: is there any file that needs to be re-configured? Thanks.

2017-06-25 07:33:44 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method ?.spider_closed of <AdminDemo1Spider 'admin_demo1' at 0x1117ed890>>
Traceback (most recent call last):
  File "/Users/Bruce/git/Distributed-Multi-User-Scrapy-System-with-a-Web-UI/venv/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
    result = f(*args, **kw)
  File "/Users/Bruce/git/Distributed-Multi-User-Scrapy-System-with-a-Web-UI/venv/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "/private/var/folders/fx/jgdx72z90b5dsq_l39v1q00m0000gn/T/admin_demo1-2-sl3cQm.egg/admin_demo1/spiders/admin_demo1.py", line 28, in spider_closed
AttributeError: 'AdminDemo1Spider' object has no attribute 'statstask'
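The `AttributeError` above means `spider_closed` reads `self.statstask`, but nothing ever assigned that attribute before the spider shut down (here, startup already failed with the `KeyError` below, so any setup code probably never ran). A minimal sketch of a defensive pattern, assuming a hypothetical stats task object since the real `admin_demo1.py` is not shown:

```python
class AdminDemo1Spider(object):
    """Hypothetical reconstruction -- the real spiders/admin_demo1.py is not shown."""

    def __init__(self):
        # Defining the attribute up front guarantees spider_closed can
        # never raise AttributeError, even if the task was never started.
        self.statstask = None

    def start_stats_task(self, task):
        # Called somewhere during spider setup (assumed name).
        self.statstask = task

    def spider_closed(self, spider=None):
        # getattr(..., None) additionally guards instances whose setup
        # was skipped entirely (e.g. when startup fails early).
        task = getattr(self, "statstask", None)
        if task is not None:
            task.stop()
        return task
```

With this guard, calling `spider_closed()` on a spider whose stats task was never created is a no-op instead of a crash.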
2017-06-25 07:33:44 [twisted] CRITICAL: Unhandled error in Deferred:
2017-06-25 07:33:44 [twisted] CRITICAL: 
Traceback (most recent call last):
  File "/Users/Bruce/git/Distributed-Multi-User-Scrapy-System-with-a-Web-UI/venv/lib/python2.7/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
    result = g.send(result)
  File "/Users/Bruce/git/Distributed-Multi-User-Scrapy-System-with-a-Web-UI/venv/lib/python2.7/site-packages/scrapy/crawler.py", line 95, in crawl
    six.reraise(*exc_info)
  File "/Users/Bruce/git/Distributed-Multi-User-Scrapy-System-with-a-Web-UI/venv/lib/python2.7/site-packages/scrapy/crawler.py", line 79, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
KeyError: 'username'
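The `KeyError: 'username'` raised while opening the spider suggests the project code does a bare `['username']` lookup on the spider arguments or settings, and the key was never supplied (e.g. the spider was scheduled without it). Where exactly that lookup lives is not shown; as a sketch, a small hypothetical helper that turns the bare `KeyError` into an actionable message:

```python
def get_required(mapping, key, where="spider arguments"):
    """Look up a required key, failing with a descriptive error
    instead of a bare KeyError (hypothetical helper)."""
    try:
        return mapping[key]
    except KeyError:
        raise ValueError("missing required key %r in %s" % (key, where))
```

Scheduling the spider with the expected `username` argument (however this project passes it) should make the startup `KeyError` disappear, which in turn avoids the `AttributeError` in `spider_closed` above.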