
How to store screen output of Scrapy into a file in Linux

I often use

self.logger.info('Parse function called on %s', response.url)

to debug a Scrapy project. However, when a lot of items are crawled, these log messages get buried in a mass of irrelevant output.

What I want is to store the screen output in a file, so that I can search it, edit it, and so on.

I tried:

scrapy crawl xxx_spider >> my_log 

but it doesn’t work: the log output still appears on the screen rather than in the file.

Answer

Your log output is most likely going to stderr rather than stdout: Scrapy uses Python’s logging module, which writes to stderr by default, so >> (which redirects only stdout) never captures it. Try:

    $ scrapy crawl xxx_spider &> test_log
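
Note that &> is a Bash shortcut that redirects both stdout and stderr. If you are in a POSIX shell such as sh or dash, the sketch below shows the portable equivalent, which redirects stdout to the file and then duplicates stderr onto it:

    # portable form: send stdout to test_log, then point stderr at stdout
    $ scrapy crawl xxx_spider > test_log 2>&1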

If the output is indeed going to stderr, you can redirect stderr to stdout as above whenever you run the spider, or change where the logger writes so that no shell redirection is needed.
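
Alternatively, Scrapy can write its log to a file by itself. As a sketch, assuming a reasonably recent Scrapy version, you can pass the LOG_FILE setting on the command line with -s, or use the equivalent --logfile option:

    # pass the LOG_FILE setting for this run only
    $ scrapy crawl xxx_spider -s LOG_FILE=my_log

    # or use the built-in --logfile option
    $ scrapy crawl xxx_spider --logfile my_log

Setting LOG_FILE in the project’s settings.py has the same effect for every run, and the resulting file can then be searched and edited like any other text file.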
