#editorial | Logs for 2020-03-05
[00:31:00] <Bytram> #yt you can't get there from here
[00:31:00] <MrPlow> https://www.youtube.com -- You Can't Get There From Hear with Charls Carroll 11
[00:31:08] <Bytram> #yt you can't get there from here bert
[00:31:09] <MrPlow> https://www.youtube.com -- Bert and I
[02:11:08] -!- AzumaHazuki [AzumaHazuki!~hazuki@the.end.of.time] has joined #editorial
[07:29:04] -!- AzumaHazuki has quit [Remote host closed the connection]
[10:49:08] -!- EWBtCiaST3 [EWBtCiaST3!~EWBtCiaST@199.229.rum.ski] has joined #editorial
[10:51:15] -!- EWBtCiaST has quit [Ping timeout: 244 seconds]
[10:51:15] EWBtCiaST3 is now known as EWBtCiaST
[11:42:34] <Bytram> https://www.isitdownrightnow.com
[11:42:35] <systemd> ^ Sylnt.us Down or Just Me ? ( https://www.isitdownrightnow.com )
[11:42:36] <exec> └─ Sylnt.us - Is Sylnt Down Right Now?
[20:05:38] <carny> =submit from the clean-all-the-things dept https://www.epa.gov for the next update +https://www.epa.gov/sites/production/files/2020-03/documents/sars-cov-2-list_03-03-2020.pdf
[20:05:39] <systemd> carny, submit failed: Parse failed
[20:05:50] <carny> =submit from the clean-all-the-things dept https://www.epa.gov for the next update
[20:05:51] <systemd> Submitting "List N: Disinfectants for Use Against SARS-CoV-2"...
[20:06:13] <systemd> ✓ Sub-ccess! "List N: Disinfectants for Use Against SARS-CoV-2" (7 paragraphs) -> https://soylentnews.org
[20:07:06] <carny> =submit file for the previous sub https://www.epa.gov https://www.epa.gov
[20:07:06] <systemd> carny, submit failed: Parse failed
[20:07:21] <carny> i guess you can't submit pdf links
[20:07:40] <carny> well somebody can add it to the story
[21:45:46] <Bytram> carny: I'm on it! Thanks for the (vigorous!) attempt! (My *guess* is that the bot does not know quite what to do with a PDF file.)
[21:46:38] <Bytram> =g BroadCom
[21:46:39] <systemd> https://www.broadcom.com - Broadcom Inc. | Connecting Everything
[21:51:38] <carny> i guess it tries to fetch a url even if it's just a + to be added at the bottom
[22:19:19] <Bytram> One would think not, but it does seem to be that way. :/
[22:20:55] <chromas> It has to, to get the title
[22:21:07] <chromas> or site name. whatever
[22:21:22] <Bytram> can it get the title of a PDF?
[22:21:27] <chromas> no
[22:21:39] <chromas> but it tries to get info from +urls
[22:21:52] <chromas> far as I know, PDFs don't have title metadata
[22:22:00] <chromas> It'd be nice if they did, though
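A minimal sketch of how a submission bot could sort out a +url before trying to scrape an HTML title, assuming the Python requests library; the helper name is hypothetical and this is not the actual bot's code:

```python
# Sketch: decide how to handle a +url before looking for an HTML <title>.
# Assumes the `requests` library; probe_url() is an illustrative name only.
import requests

def probe_url(url: str) -> str:
    """Return 'pdf', 'html', or 'other' based on the Content-Type header."""
    resp = requests.head(url, allow_redirects=True, timeout=10)
    ctype = resp.headers.get("Content-Type", "").lower()
    if "application/pdf" in ctype or url.lower().endswith(".pdf"):
        return "pdf"    # skip <title> scraping; hand off to a PDF path instead
    if "text/html" in ctype:
        return "html"   # safe to fetch the page and parse out <title>
    return "other"
```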
[22:22:12] <Bytram> WWGD?
[22:23:11] <chromas> What Would Jif Do
[22:23:26] <Bytram> chromas++ Good one!!!!
[22:23:26] <Bender> karma - chromas: 121
[22:24:06] <chromas> Looks like they can have a title but don't
[22:25:08] <chromas> That one does though. Hm
[22:27:14] <chromas> Looks like they embed a little xml, so I could look for that or execute pdfinfo
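A sketch of the two approaches mentioned above, assuming pdfinfo (poppler-utils) is installed; the function name is illustrative, not the bot's code:

```python
# Option 1: pdfinfo prints a "Title:" line when the PDF's Info dictionary has one.
# Option 2: scan the raw file for the embedded XMP packet (a small XML blob)
#           and pull dc:title out of it.
import re
import subprocess
from typing import Optional

def pdf_title(path: str) -> Optional[str]:
    out = subprocess.run(["pdfinfo", path], capture_output=True, text=True)
    for line in out.stdout.splitlines():
        if line.startswith("Title:"):
            title = line.split(":", 1)[1].strip()
            if title:
                return title
    with open(path, "rb") as fh:
        data = fh.read()
    m = re.search(rb"<dc:title>.*?<rdf:li[^>]*>(.*?)</rdf:li>", data, re.DOTALL)
    if m:
        return m.group(1).decode("utf-8", "replace").strip()
    return None
```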
[22:31:06] <Bytram> =g pdftotext
[22:31:07] <systemd> https://en.wikipedia.org - pdftotext - Wikipedia
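For PDFs that have no title at all, pdftotext (also from poppler-utils) can dump the first page to stdout and the first non-blank line is often a usable stand-in; the flags are real pdftotext options, the wrapper is hypothetical:

```python
# Sketch: grab page 1 as plain text; "-" sends the output to stdout.
import subprocess

def first_page_text(path: str) -> str:
    out = subprocess.run(
        ["pdftotext", "-f", "1", "-l", "1", path, "-"],
        capture_output=True, text=True,
    )
    return out.stdout

# Example: use the first non-blank line as a rough title guess.
# lines = [l.strip() for l in first_page_text("list_n.pdf").splitlines() if l.strip()]
# title_guess = lines[0] if lines else None
```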
[22:38:29] <Bytram> My connection has been stuttering all afternoon. Start a wget or go to refresh a page in my browser... things get started and then... nada
[22:39:21] <chromas> Time to get some cable internet
[22:40:36] <Bytram> think I'll toggle my phone (that I use as a hotspot to get online) into airplane mode, wait a couple minutes, and then re-enable connections... see if I can get a better router port or something
[22:40:39] <Bytram> afk; biab