# I am the Watcher. I am your guide through this vast new twtiverse.
#
# Usage:
# https://watcher.sour.is/api/plain/users View list of users and latest twt date.
# https://watcher.sour.is/api/plain/twt View all twts.
# https://watcher.sour.is/api/plain/mentions?uri=:uri View all mentions for uri.
# https://watcher.sour.is/api/plain/conv/:hash View all twts for a conversation subject.
#
# Options:
# uri Filter to show a specific user's twts.
# offset Start index for the query.
# limit Count of items to return (going back in time).
#
# twt range = 1 9
# self = https://watcher.sour.is/conv/dr2jfzq
@quark Alright, I can’t tell when I’ll be able to do a screen sharing thingy. So let’s try this the old-fashioned way first. Please try to reproduce the issue with the branch quark-trace that I pushed recently. It’ll create a /tmp/jenny.log file (it will get *large*). When you see duplicate twts, try to find them in that log.
Reasons, in theory, why we could see dups:
1) jenny doesn’t detect your feed’s URL correctly.
2) python-dateutil doesn’t parse your twt’s timestamp correctly. Or rather, it parses it differently depending on some env vars? Cronjobs often have this pitfall where some env var differs from your normal environment (see the sketch right after this list).
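Here’s a minimal sketch of what I mean (made-up timestamp and zones, *not* jenny’s actual code): a timestamp without an offset stays naive after python-dateutil parses it, and attaching the local zone afterwards depends on TZ, which is exactly the kind of thing that differs between a cronjob and an interactive shell.
```python
# Hypothetical demo, not jenny's code: the same naive timestamp ends up
# with two different offsets depending on the TZ env var.
import os
import time
from dateutil import parser

raw = "2024-05-01 10:00:00"  # made-up twt timestamp without an offset

for tz in ("UTC", "America/New_York"):
    os.environ["TZ"] = tz
    time.tzset()                  # POSIX only: re-read TZ for this process
    naive = parser.parse(raw)     # identical naive datetime in both runs
    aware = naive.astimezone()    # fills in whatever the local zone is *now*
    print(tz, aware.isoformat())  # two different strings -> two different hashes
```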
Actually … I can’t think of anything else. 🤔 You don’t see dups *all the time*, it only happens for your *own* twts, and you said that the twt hash *mismatches*. That already narrows it down to something in make_twt_hash(). 🤔
Let’s see if that trace file helps. If it doesn’t, we can add more trace() calls.
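For reference, this is roughly how I understand the hash to be built (a sketch along the lines of the Twt Hash extension, *not* a copy of jenny’s make_twt_hash(), and the exact URL/timestamp normalization may differ): feed URL, RFC 3339 timestamp and content get joined and hashed, so a differently detected URL or a differently parsed timestamp is already enough for a mismatch.
```python
# Rough sketch of the twt hash idea. Assumptions: blake2b, base32, last 7
# characters, as described by the Twt Hash extension; jenny's real
# make_twt_hash() may normalize its inputs differently. The URL is made up.
import base64
import hashlib

def sketch_twt_hash(url: str, created: str, content: str) -> str:
    payload = f"{url}\n{created}\n{content}".encode("utf-8")
    digest = hashlib.blake2b(payload, digest_size=32).digest()
    b32 = base64.b32encode(digest).decode("ascii").lower().rstrip("=")
    return b32[-7:]

# Same URL, same content -- only the timestamp's offset differs:
print(sketch_twt_hash("https://example.org/twtxt.txt",
                      "2024-05-01T10:00:00+00:00", "hello"))
print(sketch_twt_hash("https://example.org/twtxt.txt",
                      "2024-05-01T10:00:00-04:00", "hello"))
```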
@movq OK, will work on it tonight. Thank you!
@quark This screenshot doesn't load for me. I'm getting HTTP 504 Gateway Timeout.
@quark Alright, what’s in between those lines? There should be more info, like the resulting hash and if jenny was able to find the file on disk, etc.