you think anyone wanna sit here and hunt for dubs just to have a little look around your LULZ archive database, which you've probably got backed up anyway?
first one's on the house
weak numbers btw, do better LULZ
select op and insert >her into reddit
how surprising
DELETE * FROM *
>only 200 instances of nagger on LULZ
looks like you've been archiving for a week? am i in the ballpark
well there you go, he got dubs, your thread is over.
IF he has posted valid SQL, that is. But alas, newfags don't know how to truncate
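for the record it's one statement, and it removes every row in one go instead of logging row-by-row deletes like DELETE does:
TRUNCATE TABLE posts;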
What does null mean?
delete everything and go to sleep
>delete everything and go to sleep
DELETE DEEZ NUTS
drop posts
damnit, rolling again
drop posts
I'll do you a solid here
test
your test worked homie
replace all country names with 'CN'
select * from posts order by time limit 10
I changed it up a bit, need to use substring since com can get quite large and it messes with the output formatting
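the trimmed version was something like this, the 100-char cut is arbitrary:
select board, time, substring(com, 1, 100) as com from posts order by time limit 10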
you've only been archiving the last 5 days? how big is the data?
>tar.zst
oh yeah have fun browsing that
SELECT 1
delete from posts;
>unironically using SQL instead of tar.zst for archival purposes
nagger alert
DROP TABLE posts;
at the moment I prune the table periodically if/when it gets too large, >100M rows let's say
I plan, instead of deleting those rows, to store them compressed somewhere, in some index file or something
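rough sketch of the idea, assuming MySQL's INTO OUTFILE (the 90-day cutoff and the path are placeholders, and secure_file_priv has to allow that directory); the dump then gets compressed with zstd outside the DB:
-- dump the old rows to a TSV, compress the file with zstd afterwards
SELECT * INTO OUTFILE '/var/lib/mysql-files/posts_old.tsv'
FROM posts
WHERE timestamp < NOW() - INTERVAL 90 DAY;
-- then drop them from the live table
DELETE FROM posts
WHERE timestamp < NOW() - INTERVAL 90 DAY;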
>unironically using zst instead of bz2 for text
wew lad, don't be THAT guy
select * from posts order by time desc limit 10
here you go friend
not that big, ~50 MB I think, but I don't archive everything, I don't want/need that much garbage
that wasn't anon's query
don't fix it for us
if I tried to post the result of his query it would be unreadable, too much gibberish from the com field; what I posted is a decent compromise
SELECT
    board,
    DATE_FORMAT(timestamp, '%Y-%m') AS month_year,
    COUNT(commentId) AS comment_count
FROM posts
GROUP BY board, month_year
ORDER BY board, month_year;
since I only have posts from a couple of days ago until today this won't return anything interesting
just from your choices of boards to archive i can tell you wouldn't be a long-term reliable archiver. otherwise i would try to roll for you to buy a domain and make it compatible with some imageboard
archiving is not my main purpose, as I said I don't even try to get every thread or post
What program are you using in that screenshot?
it's called mycli, it's a mysql client
https://github.com/dbcli/mycli
it's in the arch user repos and is also a python module you can install with pip i believe
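pip install mycli should be all it takes, then you run it like the stock mysql client, something like mycli -u youruser yourdb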
Looks really neat, and thanks for replying. Have fun with the posts.
new scrape just came in my dudes