python - Getting cookies with requests
When I try to access Tor sites through the .cab web proxy using a browser, I first get a disclaimer page from the .cab proxy, and after clicking a button I get through to the actual .onion site. I think the site uses cookies to determine whether the disclaimer has been clicked, because when I delete the cookies in my browser, I get the disclaimer again when I try to access the sites.
However, when I try to access the sites with requests, I don't get any cookies:
>>> r = requests.get(address)
>>> r.cookies
<RequestsCookieJar[]>
I've tried using sessions, but the same thing happens. How can I get the cookies using Python requests?
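The session version was essentially this (a sketch of what I tried; address is the same URL as below, and the jar comes back just as empty):

import requests

# A Session should persist cookies across requests,
# but the jar stays empty here as well.
s = requests.Session()
r = s.get(address)
s.cookies  # <RequestsCookieJar[]>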
The URL I'm trying is "https://qzbkwswfv5k2oj5d.onion.cab/". I've tried both with no headers and with the headers Chrome sends:
Host: qzbkwswfv5k2oj5d.onion.cab
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.124 Safari/537.36
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-GB,en-US;q=0.8,en;q=0.6
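In dict form, that attempt was along these lines (a sketch; requests sets Host and Connection on its own, so I only pass the rest):

import requests

headers = {
    "Cache-Control": "max-age=0",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.124 Safari/537.36",
    "Accept-Encoding": "gzip, deflate, sdch",
    "Accept-Language": "en-GB,en-US;q=0.8,en;q=0.6",
}
r = requests.get("https://qzbkwswfv5k2oj5d.onion.cab/", headers=headers)
print(r.cookies)  # still empty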
I believe you'll have to fake the user-agent.

Example:
from requests import get

headers = {
    "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36"
}

response = get(url, headers=headers)
response.raise_for_status()  # raise if the server rejects the request
response.cookies
This is a typical Google Chrome user-agent; I got it here.
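Beyond that, here is a minimal sketch of the click-through with a session, assuming the proxy sets its disclaimer cookie on the first response (I haven't verified this):

import requests

url = "https://qzbkwswfv5k2oj5d.onion.cab/"
headers = {
    "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36"
}

with requests.Session() as s:
    # First request: the proxy can set its disclaimer cookie here.
    first = s.get(url, headers=headers)
    print(s.cookies)
    # The session resends stored cookies automatically, which is what
    # the browser does after the disclaimer button is clicked.
    second = s.get(url, headers=headers)
    print(second.status_code)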