
Response:{u'reason': 110, u't': u'refresh', u'seq': 0} #25

Open

martynhoyer opened this issue Feb 20, 2016 · 4 comments

Comments

@martynhoyer

So when I run python fetcher.py I just repeatedly get this line in the console:

Response:{u'reason': 110, u't': u'refresh', u'seq': 0}

Can anyone shed any light on what it is and how to fix it? Nothing is going into the log directory at all.

If I fiddle around with the SECRETS.txt file and make something invalid, it fails for different reasons, so I can only assume the cookie and IDs are working OK.

This is on Windows 7 if that makes a difference...

@logicabrity

Same problem here, Ubuntu 14.04 LTS with python 2.7.6.

@worc

worc commented Feb 27, 2016

I've hit the same snag on Windows 10, Python 2.7.10.

I've been trying cookies from different pull requests (the requests to the /pull endpoint in the browser's network tab), and that might be part of the issue: not every pull request there has a full response object. Some of them are just heartbeats and some don't carry any data at all.

I changed cookies until I stopped getting 403s, and made sure to grab a cookie from a pull request that had a full response, but now I get a 200 with the same issue:

response_obj = {Response} <Response [200]>
 _content = {str} 'for (;;); {"t":"refresh","reason":110,"seq":0}'
 _content_consumed = {bool} True
   apparent_encoding = {str} 'ascii'
 connection = {HTTPAdapter} <requests.adapters.HTTPAdapter object at 0x03857C10>
 content = {str} 'for (;;); {"t":"refresh","reason":110,"seq":0}'
 cookies = {RequestsCookieJar} <RequestsCookieJar[]>
 elapsed = {timedelta} 0:00:00.177000
 encoding = {NoneType} None
 headers = {CaseInsensitiveDict} {'Content-Length': '46', 'Access-Control-Allow-Credentials': 'true', 'Connection': 'keep-alive', 'Pragma': 'no-cache', 'Cache-Control': 'private, no-store, no-cache, must-revalidate', 'Date': 'Sat, 27 Feb 2016 19:27:23 GMT', 'X-Frame-Options': 'DENY', 'Con
 history = {list} []
 is_permanent_redirect = {bool} False
 is_redirect = {bool} False
 links = {dict} {}
 ok = {bool} True
 raw = {HTTPResponse} <requests.packages.urllib3.response.HTTPResponse object at 0x038AE690>
 reason = {str} 'OK'
 request = {PreparedRequest} <PreparedRequest [GET]>
 status_code = {int} 200
 text = {unicode} u'for (;;); {"t":"refresh","reason":110,"seq":0}'
 url = {unicode} u'https://5-edge-chat.facebook.com/pull?qp=y&partition=-2&msgs_recv=0&uid=<USER ID>&seq=0&format=json&cb=2qfi&isq=173180&cap=8&state=active&clientid=<CLIENT ID>&idle=0&wtc=171%252C170%252C0.000%252C171%252C171&viewer_uid=<USER ID>&sticky_pool=atn2c06_ch

A couple of things look off to me. There are the unicode prefixes when printing to the console, while the Facebook response has an apparent_encoding of ascii:

Response:{u'reason': 110, u't': u'refresh', u'seq': 0}

And there's the seq number always being zero. I don't think it's exactly a nonce, but if you watch your own Facebook network requests it does seem to increment periodically. Edit to add: it's the seq in Facebook's response that's always 0, not in the request we're sending; changing the request's sequence number doesn't seem to fix the issue.
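For reference, here's a minimal sketch of what parsing that payload involves (a hypothetical helper, not something from fetcher.py). Facebook prefixes the JSON body with for (;;); as a JSON-hijacking guard, so it has to be stripped before it will parse:

import json

raw = 'for (;;); {"t":"refresh","reason":110,"seq":0}'  # response.text from the dump above

# drop the "for (;;);" prefix, then parse the remaining JSON object
payload = json.loads(raw[raw.index("{"):])
print(payload)  # -> {u'reason': 110, u't': u'refresh', u'seq': 0} on Python 2

if payload.get("t") == "refresh":
    # this is the branch everyone in this thread keeps hitting: no message data
    # is delivered, and the seq in the payload stays at 0
    pass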

@klvs
Contributor

klvs commented May 23, 2016

I think you need a new cookie. I had this issue and it was solved after I grabbed a new cookie.

@JacobValdemar

Putting a new cookie into secrets.txt made it work for me.
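If it helps anyone debug, here's a rough, hypothetical sanity check for a freshly copied cookie (the URL and the trimmed-down params are taken from the response dump above; the real request also carries uid, clientid, sticky_pool, etc., so this alone may not be enough):

import requests

# paste the full Cookie header value from a /pull request in your browser's network tab
COOKIE = "<paste cookie here>"

resp = requests.get(
    "https://5-edge-chat.facebook.com/pull",
    params={"seq": 0, "partition": -2, "msgs_recv": 0, "state": "active", "format": "json"},
    headers={"Cookie": COOKIE, "User-Agent": "Mozilla/5.0"},
)
print(resp.status_code)  # 403 usually means the cookie is stale or invalid
print(resp.text)         # a 200 with the "refresh" payload is the symptom in this thread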
