
653 points by thunderbong | 2 comments
boomboomsubban No.36908788
I'm surprised there aren't more full tapings of '90s television available, as in entire blocks of broadcasting with all the commercials intact. That was how most recording would have happened, and with the launch of TV Land the networks should have been able to predict there'd be a market for it 30 years later.
replies(5): >>36908845 #>>36908922 #>>36912076 #>>36914980 #>>36915509 #
LeonardoTolstoy No.36914980
It might be apocryphal, but I vaguely remember reading an article from the late '90s or early '00s where television executives were shocked that people wanted TV box sets. The logic was: why would people want to watch reruns? Whole swaths of soap opera episodes were totally lost, the masters taped over, occasionally turning up later in a box at some remote TV station.

I have a small personal project of cataloging all the movies that played on television in the '90s. There are tons of television shows that are not only unavailable on DVD or VHS but that seemingly no one has at all. That goes double for cartoons; tons are just totally unavailable. It is sad.

replies(2): >>36915761 #>>36918488 #
awiesenhofer No.36918488
> I have a small personal project of cataloging all the movies that played on television in the 90s.

Any plans to publish this list? Would surely make a super interesting git repo for example...

replies(1): >>36920614 #
1. LeonardoTolstoy No.36920614
I've been working on it for about a year. At the moment it exists as a git repo (as you say), but for it to be of any use you also need the corresponding SQLite DB (~70 MB), and for it to be really fun (and to work with the frontend) you need the listing pages themselves (high-res, ~50 GB). Neither of those is in the git repo; I keep them separate.
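
If you just want to poke at the data, the SQLite DB is the easy entry point. Roughly, querying it looks something like this in Python (a minimal sketch only: the table and column names below are made up for the example, not the real schema):

    import sqlite3

    # Placeholder table/column names for illustration -- not the actual schema.
    conn = sqlite3.connect("listings.db")
    query = """
        SELECT title, channel, air_date, start_time, duration_min
        FROM listings
        WHERE air_date BETWEEN '1993-01-01' AND '1993-12-31'
        ORDER BY air_date, start_time
    """
    for title, channel, air_date, start_time, duration_min in conn.execute(query):
        print(f"{air_date} {start_time} {channel}: {title} ({duration_min} min)")
    conn.close()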

If you message me privately I'd be happy to share the data. The git repos are:

https://github.com/patsmad/nyt-listings
https://github.com/patsmad/nyt-listings-app

I use them for curation at the moment, so the READMEs leave ... something to be desired. I hope to have a read-only version up and running by the end of August, although without a Wikipedia-like effort I don't see how I could curate it fully, so it will probably always be a little touch and go as to what data is available.

The stats I have from curating: 369,345 individual movie "listing boxes" (I'd guess around 98% accuracy, although if I had to hazard a guess the actual number that should be there is probably 400K), of which 321,308 are matched to a movie, and 296,941 of those are confirmed unique. Overall, 202,203 have channel + time + duration matched up using the VCR listings (which The New York Times conveniently published starting around November 20, 1990, and the Internet Archive very helpfully has the program the VCRs used to encode/decode those codes). There are 21,530 unique movies at the moment.
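
To put those counts in rough proportion (just back-of-the-envelope arithmetic on the numbers above, nothing more):

    # Rates derived from the curation stats quoted above.
    listing_boxes  = 369_345   # movie "listing boxes" found so far
    matched        = 321_308   # matched to a specific movie
    unique_matched = 296_941   # matched and confirmed unique
    vcr_matched    = 202_203   # channel + time + duration via the VCR listings

    print(f"matched: {matched / listing_boxes:.1%}")         # ~87.0%
    print(f"unique:  {unique_matched / listing_boxes:.1%}")  # ~80.4%
    print(f"vcr:     {vcr_matched / listing_boxes:.1%}")     # ~54.7%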

If I understand The New York Times correctly, none of this can be commercialized, since I scraped the core data (the pages themselves) from the TimesMachine, so this really is a personal project, which I'm happy to share. I've made a few Letterboxd lists from the data, for example a series of lists with all of the movies (and play times) shown on September 1 in particular, e.g. https://letterboxd.com/patsmad/list/television-films-septemb... It's quite consistent, around 100 films a day: for 1990-1999 the counts were 106, 118, 74, 74, 89, 99, 98, 110, 97, 93. As is probably obvious, I can talk about this for days.
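
The "around 100 a day" is literal, by the way; averaging those ten September 1 counts:

    # September 1 film counts for 1990-1999, from the Letterboxd lists above.
    counts = [106, 118, 74, 74, 89, 99, 98, 110, 97, 93]
    print(sum(counts) / len(counts))  # 95.8

so just under 96 films a day on average across the decade.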

I'm not sure of the best way to do private messages; my email is associated with this account, but I have no idea whether you can see it. I usually just lurk on HN.

replies(1): >>36923072 #
2. awiesenhofer No.36923072
Wow, that sounds awesome! Definitely do a "Show HN" when you feel it's right!