Use Proxy in your Spider
Using a proxy, you can minimize the chance of your crawlers/spiders getting blocked. Let me show you how to use a proxy IP address in your Python spider. First, load the list of proxies from a file:
# Read the proxy list (one proxy per line), stripping trailing newlines
fileproxylist = open('proxylist.txt', 'r')
proxyList = [line.strip() for line in fileproxylist.readlines()]
fileproxylist.close()
indexproxy = 0
totalproxy = len(proxyList)
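Each entry in proxyList is passed straight to urllib2.ProxyHandler in the function below, so I am assuming proxylist.txt holds one proxy per line in plain host:port form, something like these placeholder entries:

111.222.111.222:8080
111.222.111.223:3128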
Now, for each proxy in the list, call the following function:
import urllib2

def get_source_html_proxy(url, pip):
    # Route the request through the given proxy ('host:port')
    proxy_handler = urllib2.ProxyHandler({'http': pip})
    opener = urllib2.build_opener(proxy_handler)
    # Send a browser-like User-Agent header
    opener.addheaders = [('User-agent', 'Mozilla/5.0')]
    urllib2.install_opener(opener)
    # Fetch the page through the proxy and return its HTML
    req = urllib2.Request(url)
    sock = urllib2.urlopen(req)
    data = sock.read()
    sock.close()
    return data
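To tie the two pieces together, the per-proxy loop might look roughly like the sketch below. It is only a minimal sketch: it assumes a dead proxy surfaces as a urllib2.URLError, and the target URL is a placeholder you would replace with the page you actually want to crawl:

data = None
for pip in proxyList:
    try:
        # fetch the page through the current proxy
        data = get_source_html_proxy('http://example.com/', pip)
        break  # got the page, stop trying further proxies
    except urllib2.URLError:
        # this proxy did not respond or refused the connection,
        # so move on to the next one in the list
        continue
if data is None:
    print 'All of the proxies failed'

If many proxies are dead, you can also pass a timeout (in seconds) to urllib2.urlopen (available since Python 2.6) so that a hung proxy does not stall the whole loop.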
Hope your spidering experience will be better with proxies :-)
You can use this list to test your crawler.
Comments
Very interesting article!
How do I know if a proxy is not responding? I mean, is there a simple way to set a timeout?
Thanks!
Nicola
Thanks.
I'm definitely trying that.
Thanks again
Nicola
I don't get the def get_source_html_proxy(url, pip) part.
I want to scrape a page using Python, but the problem is that when I want to go to the next page I have to submit a form. This form is submitted with 10 hidden values, so how do I submit the form programmatically? The link is "https://www.jobs.lbhf.gov.uk/paplve_webrecruitment/wrd/run/ETREC106GF.display_srch_all?WVID=52561500BT&LANG=USA". Please give me some suggestions or guidance.
Thanks
Sushanta