I'm on it.
Gonna drop in here when it's ready
New url should be
https://qanon.news/deltas.html when I'm done
This is old
https://qanon.news/posts.html
https://qanon.news/Q is the new one.
If you wanted to work something out as well this has all the data you need in json
https://qanon.news/api/smash
for xml
https://qanon.news/api/smash?xml=true
Here's my first pass at what I think y'all are looking for.
https://qanon.news/deltas.html
I left the negative deltas in because some of the data in there seems relevant, plus it helps account for lag in posting times. I can remove or reduce that if needed.
It's currently highlighting in red any delta that is mod 5, that is, any delta divisible by 5, so … [-15] [-10] [-5] [0] [5] [10] [15] …
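The highlight rule boils down to a one-liner. A minimal sketch (function name is mine, not the site's actual code):

```javascript
// Highlight rule: a delta qualifies when it's divisible by 5.
// Note: in JS the % operator keeps the sign of the dividend,
// so -15 % 5 evaluates to -0, which still compares equal to 0.
function isMarkerDelta(deltaMinutes) {
  return deltaMinutes % 5 === 0;
}
```

That covers the negative deltas too, without any special casing.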
Some of the Kanji POTUS dropped in the tweets were screwing up my character count. I'll see if I can come up with a fix for that, but that's why the formatting looks weird in those posts.
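One possible cause, just a guess on my part: JS `String.prototype.length` counts UTF-16 code units, so any character outside the Basic Multilingual Plane (emoji, some rare Kanji) counts as 2. Counting code points instead sidesteps that:

```javascript
// Count characters by Unicode code point rather than UTF-16 code units.
// Array.from iterates a string by code point, so surrogate pairs
// (e.g. emoji, rare CJK characters) count as 1 instead of 2.
function countChars(text) {
  return Array.from(text).length;
}
```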
I'm not showing any posts/rows where there is no delta (POTUS tweet) within the hour. I'll check back in later.
Ya man - what are you thinking?
Nearly!
#2665 is a NEGATIVE [-5] delta in minutes…
#2664 is a [0]. Q posted 7 seconds before POTUS
Yes, I agree. I'll have to look into why that's happening. I thought I'd fixed that, but maybe I reverted it. It was set up to look at 30 minutes on either side of a Q post. Either way, I'll check that out.
>I think Anons need the ability to create complex infographics and share complex ideas.
Yes. It sounds like we've been thinking about many of the same things. I've been playing around with d3.js. I've been trying to get the time to put something like this together
http://mbostock.github.io/d3/talk/20111116/force-collapsible.html
Sounds like a breddy big project! I can do some of the js/css/html if ya need it.
That network map on qmap wasn't working for me.
Yeah I found this issue and have it fixed. I had left a 61 minute range in where I was testing it. I changed the range to 60 and it looks to be working as designed now. I'll do an update - likely to kill off the site for a couple minutes unfortunately.
5:5
Yeah anon, I can work that out. Gotta do some RL work, but it shouldn't take too long.
EX:
[00:07] = [00:07]
[01:10] = [01]
[05:45] = [05]
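The examples above amount to: keep mm:ss when the delta is under a minute, otherwise show minutes only. A quick sketch (my naming, not the actual site code):

```javascript
// Display rule from the examples: under a minute, keep [mm:ss];
// a minute or more, truncate to [mm].
function formatDelta(minutes, seconds) {
  const mm = String(minutes).padStart(2, "0");
  const ss = String(seconds).padStart(2, "0");
  return minutes === 0 ? `[${mm}:${ss}]` : `[${mm}]`;
}
```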
Anything else while I'm in there?
Clambaker? Not me. I just liked the clam.
Still working on this.
I like this idea a lot and will def try to get this done.
Going to try and do a 'hide delta seconds' checkbox I think.
I remembered that there is a gap in the TrumpTwitterArchive where there were a couple markers, a [5] and a [10] as I recall. From 1-1-2018 to 1-9-2018 the TrumpTwitterArchive wasn't working. I have a shiny new Twitter API key I'm going to play around with to see if I can do something about it, unless anybody else has a great idea.
Have I been doing these wrong all along? Should I only be doing math on the minutes not the actual time deltas?
5:5
Gonna add another column so we'll have [delta] and [timediff]. [delta] is just the math on minutes. [timediff] is the time difference in {mm:ss} between the 2 drops.
OK, I think I have this sorted out now. It required some backend changes, so I'm running a small update.
Here is what changed: The Delta column now shows the math on minutes only. Hover this value to see the exact timespan difference.
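For anyone curious, the two values could be computed like this. A sketch under my own naming, not the actual backend code: truncate both timestamps to the minute for the delta, and use the raw second-level difference for the hover value.

```javascript
// Compute the two columns from epoch-millisecond timestamps:
// delta    = minute-truncated difference ("math on minutes only")
// timespan = exact difference formatted as mm:ss (the hover value)
function deltaColumns(qPostMs, tweetMs) {
  const MIN = 60 * 1000;
  const delta = Math.floor(tweetMs / MIN) - Math.floor(qPostMs / MIN);
  const diffSec = Math.abs(tweetMs - qPostMs) / 1000;
  const mm = String(Math.floor(diffSec / 60)).padStart(2, "0");
  const ss = String(Math.floor(diffSec % 60)).padStart(2, "0");
  return { delta, timespan: `${mm}:${ss}` };
}
```

So Q at 23:44:38 and POTUS at 23:45:13 gives delta [1] with a hover of {00:35}.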
I also incorporated a dayfilter dropdown where you can filter all the Q drops with deltas by day of the week, i.e. [Sunday], [Tuesday]… Selecting [NotSet] will reset to no filter.
Thanks for the help in working out this delta issue everybody.
Check #2664
>Sometimes making a connection leads to uncovering ……
[0] {00:07}
[-7] {07:27} https://twitter.com/realDonaldTrump/status/1082268365081767936
>….The Fake News Media in our Country is the real Opposition Party. It is truly the Enemy of the People! We must bring honesty back to journalism and reporting!
Latest link = https://qanon.news/deltas.html
Hmm, same day deltas? I'm not really sure what you mean by that, so I'd say no. By default the system ignores any tweets more than 60 minutes before or after a Q drop.
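That window rule is trivial to sketch (my naming, and the 60-minute default is the one described above):

```javascript
// Default window rule: a tweet only counts if it lands within
// windowMinutes of the Q drop, on either side.
function inWindow(qPostMs, tweetMs, windowMinutes = 60) {
  return Math.abs(tweetMs - qPostMs) <= windowMinutes * 60 * 1000;
}
```

Using 61 instead of 60 here is exactly the kind of off-by-one that caused the earlier bug.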
Yep. Using a dropdown or a textbox? If dropdown what deltas specifically?
Updated. I still need to implement a 'combo filter' where you can filter by delta + day if needed. I added a few extra deltas to the dropdown. It's only finding the (+) deltas, even though if you were filtering on [1], sometimes there's a [-1] associated with that drop.
>draw.io
Hive mind. I looked into that very thing this morning and then had to do other things.
I think I see where you are going. I'll have to look more into what I can do with it.
Do you have the ability to put that frontend together?
I'm rejiggering now. My code was making the 3rd [0] yesterday (Q#2772) a [1] because they were posted in different minutes, but still within 60 seconds.
Just using minutes this is a [1]
Q: 23:44:38
P: 23:45:13
I'm incorporating some logic to allow for a 60 second difference - for [0] deltas only.
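That tweak could look something like this. A sketch with my own naming, not the actual backend: the delta is still the minute-truncated difference, but anything within 60 seconds of the Q post, before or after, is treated as a [0].

```javascript
// Minute-truncated delta with a 60-second tolerance for [0]:
// a tweet within 60 seconds of the Q drop (either side) is a [0]
// even if the two posts fall in different clock minutes.
function computeDelta(qPostMs, tweetMs) {
  if (Math.abs(tweetMs - qPostMs) < 60 * 1000) return 0;
  const MIN = 60 * 1000;
  return Math.floor(tweetMs / MIN) - Math.floor(qPostMs / MIN);
}
```

With the Q#2772 example above (Q 23:44:38, P 23:45:13), straight minute math says [1] but the 35-second gap makes it a [0].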
Would have fixed this last night but was hangin with the fam. I want to look into one other fix that needs to be done related to looking up posts that Q references and I'll make an update.
My latest tweak has now changed what was a [-1] into [0] on many drops. I think I still either need to tweak it or we need to allow for a negative delta. Here's what I found.
Left side of image:
Just using the 'Anything posted within 60 seconds of a Q post, before or after (-/+), is still a [0]' rule I'm seeing 37 TOTAL [0]. Note that this 37 includes Q#2511 twice where there are 2 POTUS tweets around a single Q drop. So you can call it 36 or 37.
Right side of image:
If we change that rule to eliminate the negative deltas, I'm seeing a total of 18 now.
The column in between the 2 times in the image is the delta in SECONDS (+/-)
Should I eliminate the negative deltas? How do I account for network lag if I'm doing that?
I agree. Anybody else want to offer an opinion before I make a change?
Maybe, but I'd have to think about how to consider lag. IE: if Q posts twice in a single bread, how do I know which one of those posts experienced lag?
Regardless, I made the update to move the negative [-0]'s back to a [-1]. Showing 19 [0]'s now, including the 2 rapid fire POTUS tweets around Q#2511
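The rule as it now stands could be sketched like this (function name is mine, not the site's actual code): only a tweet posted within 60 seconds AFTER the Q drop counts as [0]; one within 60 seconds BEFORE it goes back to being a [-1].

```javascript
// Final rule: [0] only for tweets up to 60s after the Q drop;
// tweets up to 60s before it are a [-1]; everything else falls
// back to the minute-truncated difference.
function finalDelta(qPostMs, tweetMs) {
  const MIN = 60 * 1000;
  const diff = tweetMs - qPostMs;
  if (diff >= 0 && diff < MIN) return 0;
  if (diff < 0 && diff > -MIN) return -1;
  return Math.floor(tweetMs / MIN) - Math.floor(qPostMs / MIN);
}
```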
Correct
You can dig around here
https://8ch.net/cbts/res/2300.html#2300
https://qanon.news/archives/x/2300
So drop it. What'd ya come up with?
I'm gonna go ahead and throw the fake and gay flag on this one.
That is to say, a [-1] that was < 60 seconds.
Yeah My host is trying to fuck me.
I'll tell the story since I'm here anyways. So around the end of April something happened at my Host. My site was shut down and I couldn't log in to see what was happening. I called support and they said it was a known issue affecting hosting customers. They said to give them 24-48 hrs and they'd have it working again.
While I was waiting, I decided to go ahead and roll out a new feature on the site. It's now collecting the bakers' notables from each bread and collating them into a daily 'Notables' thread. It's not really new - the site had been collecting/collating them for nearly 6 months at this point - it's just new access to those threads. Before rolling out, I ran a backup of the site to make sure I had a point to restore to, planning to do the upgrade the next day.
Checked in the next day and my account had been suspended. I can't log in, I can't FTP. My hosting account is jacked again. Call in to support. They explain I have gone over my allowed storage limit, so they have suspended my account. I explain to them: my backup last night must have caught the 50GB++ in images here and run me over my limit. Delete that backup and we'll be good.
They won't do it. They can't explain why FTP isn't working - and suggest I use FTP to delete the Backup, or login to delete the backup. I explain that I can't login because my account is suspended. They struggle to comprehend. They suggest that by upgrading to unlimited storage, then it will solve the problem. $++.
At first I said no, but then just agreed. Lets do it. They upgrade me and 24hrs later the site comes back online. I make the upgrade for notables and tweak the site for the next couple weeks.
On 5/16 I see the site has crashed and I attempt to log in to restart it. Account suspended again.
Nice. I call support and they tell me that I have way too many files on my site and need to remove files. I explained that I have unlimited storage. They say there is a 250k file limit and I need to delete files. Again I explain that I can't do anything because my account is suspended. They need to unlock my account so that I can fix it. They suggest upgrading to VPS hosting ($$$++). I say no, we need to figure out what's going on with my account. They say that the only option is for me to upgrade to VPS since it's a big site. The intimation is that if I want my content, I'll have to upgrade. Anger ensues. I manage to talk them into temporarily unlocking my account so that I can FTP in, download my content, and try and figure out what's going on.
Support starts to say that the real reason my site is down is because it uses so much resources. CPU and mem are over limits. I ask what my limits are and how can I monitor that. They tell me the limits and that I'm using too much, but say I have no way of monitoring it.
So now they've done something to the site where I can't start the application pool to run the site. Everything is turned off. It won't even stay up for more than 5 minutes serving static pages. None of their techs have been smart enough to figure it out yet, after hours on the phone with them daily since the 16th. My plan is to give them a couple more days to try and work it out, then I'm moving to a new host.
Bottom line - I think they have now discovered the site and one of three things is happening:
1) My site is big and they reserve the right to forcefully push me off into something more expensive.
2) Someone in the tech chain I've spoken to is SJW/NeverTrump and they're doing it on purpose.
3) Both 1 and 2.
4) Incompetence. They jacked all the permissions somehow so it wouldn't run.
Site is working now, although it's several days behind. Working out either a more efficient or less detectable scraping option now.
Yeah that's what I'm thinking. I'm running a test to see if I can make it more efficient. Fallback plan is a tiered system like you suggested.
I'd rather not say who my host is right now. Part of the problem is the process that goes through and does the scrapes. Each scrape takes about 5-8 minutes. That's archiving/formatting/generating around 50 breads. There's not much of anything going on with the UI, just some simple formatting stuff. I've added new logic to try and make that scrape a lot faster.