392 points _kush | 54 comments
1. p0w3n3d ◴[] No.44394429[source]
Ok, so it might be a long shot, but I would say that

1. the browsers were inconsistent in the 1990s-2000s, so we started using JS to make them behave the same

2. meanwhile the only things we actually needed were good CSS styles (which weren't there yet) and consistent behaviour

3. over the years the browsers started behaving the same (mainly because of Highlander rules - there can be only one - but Firefox is also coping well)

4. but we had already got used to having frameworks that would make pages look the same in all browsers. Also, the paradigm switched to rendering JSON data on the client

5. with current technology we could get by with server-generated old-school web pages, because they would have a low footprint, work faster, and require less memory.

Why do I say that? Recently we started working on a migration from a legacy system. It looks like the 2000s standard of one page per HTTP request. Every action, like add, remove, etc., requires an HTTP refresh. However, it works much faster than our React system. Because:

1. Nowadays the internet is much faster

2. Phones have a lot of memory, which is wasted by JS frameworks

3. in the backend it's almost the same old story - CRUD, CRUD and CRUD (+ pagination, + transactions)

replies(4): >>44394607 #>>44394696 #>>44395199 #>>44395273 #
2. em-bee ◴[] No.44394607[source]
> at the current technology we could cope with server generated old-school web pages because they would have low footprint, work faster and require less memory.

unless you have a high latency internet connection: https://news.ycombinator.com/item?id=44326816

replies(1): >>44394642 #
3. p0w3n3d ◴[] No.44394642[source]
however, when you have a high-latency connection, the "thick client" JSON-filled webapp only has an advantage if most of the business logic happens in the browser. E.g. Google Docs - great, and much better than it used to be in the 2000s design style. An application that searches for apartments to rent? Not really, I would say.

-- edit --

by the way, in 2005 I programmed using a very funny PHP framework, PRADO, which sent every change in the UI to the server. Boy, was it slow and server-heavy. That was a direction we should never have gone...

replies(2): >>44394747 #>>44395025 #
4. viraptor ◴[] No.44394696[source]
That timeline doesn't sound right to me. JS was rarely used to standardise behaviour - we had lots of user-agent detection and relying on quirks ordering to force the right layout. JS really was for the interactivity at the beginning - DHTML and later AJAX. I don't think it even had easy access to layout-related things? (I may be mistaken though) CSS didn't really make things more consistent either - once it became capable it was still a mess. Sure, CSS Zen Garden was great and everyone was so impressed with semantic markup while coding tables everywhere. It took ages for anything to actually pass the first two Acid tests. I'm not sure frameworks ever really impacted the "consistent looks" side of things - by the time we grew out of jQuery, CSS was the looks thing.

Then again, it was a long time. Maybe it's me misremembering.

replies(2): >>44394769 #>>44395233 #
5. em-bee ◴[] No.44394747{3}[source]
> Application that searches the apartments to rent? Not really I would say.

not a good example. i can't find it now, but there was a story/comment about a realtor app that people used to sell houses. often when they were out with a potential buyer they had bad internet access and loading new data and pictures for houses was a pain. it wasn't until they switched to using a frontend framework to preload everything with the occasional updates that the app became usable.

high latency affects any interaction with a site. even hackernews is a pain to read over a high-latency connection and would improve if new comments were loaded in the background. the problem creeps up on you faster than you think.

replies(1): >>44395762 #
6. jonwinstanley ◴[] No.44394769[source]
For me, jQuery was the thing that fixed the browser inconsistencies. If you used jQuery for everything, your code worked in all the browsers.

This was maybe 2008?

replies(5): >>44394819 #>>44394973 #>>44394982 #>>44395232 #>>44395621 #
7. jbverschoor ◴[] No.44394819{3}[source]
Probably 2005.

In 2002, I was using "JSRS" and returning HTTP 204/No Content, which causes the browser to NOT refresh/load the page.

Just for small interactive things, like a start/pause button for scheduled tasks. The progress bar etc.
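
A minimal sketch of that 204 trick in modern terms (Node's built-in http module; the /start-task path and the task function are made up for illustration):

    // the page links to /start-task with a plain <a href>; the handler does
    // its work and answers 204 No Content, so the browser stays where it is
    const http = require('http');

    function startScheduledTask() { /* placeholder for the real work */ }

    http.createServer((req, res) => {
      if (req.url === '/start-task') {
        startScheduledTask();
        res.writeHead(204);   // success, no body, no navigation in the browser
        res.end();
      } else {
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<a href="/start-task">Start task</a>');
      }
    }).listen(8080);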

But yeah, in my opinion we lost about 15 years of proper progress.

"The network is the computer" came true.

The SUN/JEE model is great.

It’s just that monopolies stifle progress and better standards.

Standards are pretty much dead, and everything is at the application layer.

That said... I think XSLT sucks, although I haven't touched it in almost 20 years. On the projects I was on, there was this designer/XSLT guru. He could do anything with it.

XPath is quite nice though

replies(1): >>44395000 #
8. JimDabell ◴[] No.44394973{3}[source]
jQuery in ~2008 was when it kinda took off, but jQuery was itself an outgrowth of work done before it on browser compatibility with JavaScript. In particular, events.

Internet Explorer didn’t support DOM events, so addEventListener wasn’t cross-browser compatible. A lot of people put work in to come up with an addEvent that worked consistently cross-browser.
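
The wrappers people converged on looked roughly like this (a sketch of the general pattern, not any particular blogger's version):

    // use the standard DOM API where present, fall back to IE's attachEvent,
    // and patch over the differences in `this` and the event object
    function addEvent(el, type, handler) {
      if (el.addEventListener) {                    // standards-based browsers
        el.addEventListener(type, handler, false);
      } else if (el.attachEvent) {                  // old Internet Explorer
        el.attachEvent('on' + type, function () {
          handler.call(el, window.event);           // normalise `this` and the event
        });
      } else {
        el['on' + type] = handler;                  // last-resort fallback
      }
    }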

The DOMContentLoaded event didn’t exist, only the load event. The load event wasn’t really suitable for setting up things like event handlers because it would wait until all external resources like images had been loaded too, which was a significant delay during which time the user could be interacting with the page. Getting JavaScript to run consistently after the DOM was available, but without waiting for images was a bit tricky.

These kinds of things were iterated on in a series of blog posts from several different web developers. One blogger would publish one solution, people would find shortcomings with it, then another blogger would publish a version that fixed some things, and so on.

This is an example of the kind of thing that was happening, and you’ll note that it refers to work on this going back to 2001:

https://robertnyman.com/2006/08/30/event-handling-in-javascr...

When jQuery came along, it was really trying to achieve two things: firstly, incorporating things like this to help browser compatibility; and second, to provide a “fluent” API where you could chain API calls together.

9. benediktwerner ◴[] No.44394982{3}[source]
Wasn't it more about inconsistencies in JS though? For stuff which didn't need JS at all, there also shouldn't be much need for jQuery.
replies(1): >>44395297 #
10. JimDabell ◴[] No.44395000{4}[source]
> But yeah, in my opinion we lost about 15 years of proper progress.

Internet Explorer 6 was released in 2001 and didn’t drop below 3% worldwide until 2015. So that’s a solid 14 years of paralysis in browser compatibility.

replies(1): >>44395703 #
11. catmanjan ◴[] No.44395025{3}[source]
Lol you'd hate to see what blazor is doing then
replies(1): >>44395072 #
12. Tade0 ◴[] No.44395072{4}[source]
Or Phoenix.LiveView for that matter.
replies(1): >>44400907 #
13. bob1029 ◴[] No.44395199[source]
> at the current technology we could cope with server generated old-school web pages because they would have low footprint, work faster and require less memory

I've got a .NET/Kestrel/SQLite stack that can crank out SSR responses in no more than ~4 milliseconds. Average response time is measured in hundreds of microseconds when running release builds. This is with multiple queries per page, many using complex joins to compose view-specific response shapes. Getting the data in the right shape before interpolating HTML strings can really help with performance in some of those edges like building a table with 100k rows. LINQ is fast, but approaches like materializing a collection per row can get super expensive as the # of items grows.

The closer together you can get the HTML templating engine and the database, the better things will go in my experience. At the end of the day, all of that fancy structured DOM is just a stream of bytes that needs to be fed to the client. Worrying about elaborate AST/parser approaches when you could just use StringBuilder and clever SQL queries has created an entire pointless, self-serving industry. The only arguments I've ever heard against using something approximating this boil down to arrogant security hall monitors who think developers can't be trusted to use the HTML escape function properly.
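
The commenter's stack is .NET, but the general shape of the approach looks something like this (a sketch in JavaScript; `db.query`, the table names, and the columns are all made up):

    // assume an escaping helper and a synchronous query API (both hypothetical)
    const escapeHtml = (s) => String(s)
      .replace(/&/g, '&amp;').replace(/</g, '&lt;')
      .replace(/>/g, '&gt;').replace(/"/g, '&quot;');

    function renderOrdersPage(db, customerId) {
      // shape the data in SQL so each result row maps straight onto one table row
      const rows = db.query(
        'SELECT o.id, o.placed_at, SUM(i.qty * i.price) AS total ' +
        'FROM orders o JOIN order_items i ON i.order_id = o.id ' +
        'WHERE o.customer_id = ? GROUP BY o.id ORDER BY o.placed_at DESC',
        [customerId]);

      // then it is just string building: escape, interpolate, concatenate
      const body = rows.map(r =>
        '<tr><td>' + r.id + '</td><td>' + escapeHtml(r.placed_at) +
        '</td><td>' + r.total + '</td></tr>').join('');
      return '<table><tr><th>#</th><th>Placed</th><th>Total</th></tr>' + body + '</table>';
    }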

replies(1): >>44396470 #
14. middleagedman ◴[] No.44395233[source]
Old guy here. Agreed- the actual story of web development and JavaScript’s use was much different.

HTML was the original standard, not JS. HTML was evolving early on, but the web was much more standard than it is today.

Early-mid 1990s web was awesome. HTML served over HTTP, and pages used header tags, text, hr, then some background color variation and images. CGI in a cgi-bin dir was used for server-side functionality, often written in Perl or C: https://en.m.wikipedia.org/wiki/Common_Gateway_Interface

Back then, if you learned a little HTML, you could serve up audio, animated gifs, and links to files, or Apache could just list files in directories to browse like a fileserver without any search. People might get a friend, or their university, to give them access to a server and put content up on it. You might be on a server where they had a cgi-bin script or two to email people or save/retrieve from a database, etc. There was also mailto in addition to href for the a (anchor) tag for hyperlinks, so you could just put your email address there.

Then a ton of new things were appearing. PHP on the server side. JavaScript came out but wasn't used much except for a couple of party tricks. ColdFusion on the server side. Around the same time was VBScript, which was nice but just for IE/Windows, though it was big. Perl and then PHP were also big on the server side. If you installed Java you could use applets, which were neat little applications on the page. Java Web Server came out server-side and there were JSPs. Java Tomcat came out on the server side. ActionScript came out to basically replace VBScript but do it on the server side with ASPs. VBScript support went away.

During this whole time, JavaScript had just evolved into more party tricks and things like form validation. It was fun, but it was PHP, ASP, JSP/Struts/etc. server-side in the early 2000s, with Rails coming out and ColdFusion mostly going away. Facebook was PHP mid-2000s, and LAMP stack, etc. People were breaking up images using tables, and CSS was coming out with slow adoption. It wasn't until the mid to late 2000s that JavaScript started being used much for UI, with Google's fostering of it and development of V8 making it taken more seriously, because it was slow before then. And when it finally got big, there were an awful several years of framework-after-framework super-JavaScript ADHD, which drove a lot of developers to leave web development, because of the move from server-side to client-side, along with NoSQL DBs and seemingly stupid things happening like client-side credential storage, ignoring ACID for data, etc.

So- all that to say, it wasn’t until 2007-2011 before JS took off.

replies(2): >>44395652 #>>44400876 #
15. Cthulhu_ ◴[] No.44395232{3}[source]
Before jQuery there was Prototype.js, part of early AJAX support in RoR, which fixed inconsistencies in how browsers could fetch data, especially in the era between IE 5 and 7 (native JS `XMLHttpRequest` was only available from IE 7 onwards, before that it was some ActiveX thing. The other browsers supported it from the get go). My memory is vague, but it also added stuff like selectors, and on top of that was script.aculo.us which added animations and other such fanciness.

jQuery took over very quickly though for all of those.
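
Under the hood, those compatibility layers (Prototype.js, and later jQuery) boiled down to something like this for creating the request object (a sketch):

    // feature-detect the native object, fall back to the IE ActiveX controls
    function createXHR() {
      if (window.XMLHttpRequest) {
        return new XMLHttpRequest();                    // IE 7+, Firefox, Opera, Safari
      }
      try {
        return new ActiveXObject('Msxml2.XMLHTTP');     // newer IE 5/6 installs
      } catch (e) {
        return new ActiveXObject('Microsoft.XMLHTTP');  // older IE installs
      }
    }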

replies(1): >>44395318 #
16. ozim ◴[] No.44395273[source]
AJAX and updating the DOM weren't there just to "make things faster"; they were implemented to change the paradigm of "web sites" or "web documents" - because the web was for displaying documents. Full page reloads make sense if you are working in a document paradigm.

It works well here on HN for example as it is quite simple.

There are a lot of other examples where people most likely should do a simple website instead of using JS framework.

But "we could all go back to full page reloads" is not true, as there really are proper "web applications" out there for which full page reloads would be a terrible UX.

To summarize there are:

"websites", "web documents", "web forms" that mostly could get away with full page reloads

"web applications" that need complex stuff presented and manipulated while full page reload would not be a good solution

replies(2): >>44395947 #>>44396791 #
17. dspillett ◴[] No.44395297{4}[source]
jQuery, along with a number of similar attempts and more single-item-focused polyfills¹, was as much about DOM inconsistencies as JS ones. It was also about making dealing with the DOM more convenient² even where it was already consistent between commonly used browsers.

DOM manipulation of that sort is JS dependent, of course, but I think considering language features and the environment, like the DOM, to be separate-but-related concerns is valid. There were less kitchen-sink-y libraries that only concentrated on language features or specific DOM features. Some may even consider a few parts in a third section: the standard library, though that feature set might be rather small (not much more than the XMLHTTPRequest replacement/wrappers?) to consider its own thing.

> For stuff which didn't need JS at all, there also shouldn't be much need for JQuery.

That much is mostly true, as it by default didn't do anything to change non-scripted pages. Some polyfills for static HTML (for features that were inconsistent, or missing entirely in, usually, old-IE) were implemented as jQuery plugins though.

--------

[1] Though I don't think they were called that back then, the term coming later IIRC.

[2] Method chaining³, better built-in searching and filtering functions⁴, and so forth.

[3] This divides opinions a bit though was generally popular, some other libraries did the same, others tried different approaches.

[4] Which we ended up coding repeatedly in slightly different ways when needed otherwise.
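
As a concrete illustration of [2] and [4], the kind of convenience being described (real jQuery calls, assuming jQuery is loaded; the selectors and class names here are made up):

    // built-in searching/filtering plus method chaining: each call returns
    // the matched set, so the steps compose without temporary variables
    $('#comments li')
      .filter(':contains("jQuery")')        // narrow the set with a filter
      .addClass('highlight')                // style every remaining element
      .on('click', function () {            // attach a handler to the same set
        $(this).toggleClass('collapsed');
      });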

18. arkh ◴[] No.44395318{4}[source]
> native JS `XMLHttpRequest` was only available from IE 7 onwards, before that it was some ActiveX thing.

Almost sure it was available on IE6. But even if not, you could emulate it using hidden iframes to call pages which embedded some javascript interacting with the main page. I still have fond memories of using mootools for lightweight nice animations and less fond ones of dojo.

replies(1): >>44395830 #
19. viraptor ◴[] No.44395621{3}[source]
I wasn't clear: jQuery was definitely used for browser inconsistencies, but in behaviour, not layout. It had just a small overlap with CSS functionality (at first, until it all got exposed to JS).
20. nasduia ◴[] No.44395652{3}[source]
Though much less awesome were all the Flash, RealPlayer and other plugins required.
replies(1): >>44396161 #
21. jbverschoor ◴[] No.44395703{5}[source]
Time flies when you’re having fun
22. _heimdall ◴[] No.44395762{4}[source]
Prefetching pages doesn't require a frontend framework though. All it takes is a simple script to preload all or specific anchor links on the page, or you could get fancier with a service worker and a site manifest if you want to preload pages that may not be linked on the current page.
replies(1): >>44396533 #
23. JimDabell ◴[] No.44395830{5}[source]
In Internet Explorer 5–6 it was the ActiveX control. Then other browsers implemented XMLHttpRequest based on how that ActiveX control worked, then Internet Explorer 7 implemented it without ActiveX the same way as the other browsers, and then WHATWG standardised it.

Kuro5hin had a dynamic commenting system based on iframes like you describe.

24. alerighi ◴[] No.44395947[source]
Yes, of course for web applications you can't do full page reloads (you didn't back in the day either, when web applications existed in the form of Java applets or Flash content).

Let's face it, most uses of JS frameworks are for blogs or things where you wouldn't even notice a full page reload: nowadays browsers are advanced and only redraw the screen when they have finished loading the content, meaning that out of the box they mostly do what React does (only repaint the DOM elements that changed), so a reload of a page that only changes one button at the UI level does not result in a flicker or a visible reload of the whole page.

BTW, even React now suggests running the code server-side where possible (it's the default in Next.js), since it makes the project easier to maintain, debug, and test, as well as getting a better SEO score from search engines.

I'm still a fan of the "old" MVC model of classical frameworks such as Laravel, Django, Rails, etc., which to me makes projects that are overall easier to maintain, because all the code runs in the backend (except maybe some jQuery animation client-side), the model is well separated from the view, there is no API to maintain, etc.

25. sim7c00 ◴[] No.44396161{4}[source]
Realplayer. christ, forgot all about that one.... thanks... frozenface
replies(1): >>44396315 #
26. p0w3n3d ◴[] No.44396315{5}[source]
ah the feelings. those were the times
replies(1): >>44396371 #
27. viraptor ◴[] No.44396371{6}[source]
If your site didn't have a flash animated menu, was it even a real website at that time?
28. chriswarbo ◴[] No.44396470[source]
> arrogant security hall monitors who think developers can't be trusted to use the HTML escape function properly.

Unfortunately, they're not actually wrong though :-(

Still, there are ways to enforce escaping (like preventing "stringly typed" programming) which work perfectly well with streams of bytes, and don't impose any runtime overhead (e.g. equivalent to Haskell's `newtype`)
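
A sketch of what that kind of enforcement can look like in JavaScript terms (a tagged template standing in for Haskell's `newtype`; all names are illustrative):

    // values are escaped on the way in unless they are already SafeHtml,
    // so there is simply no code path for inserting a raw, unescaped string
    function SafeHtml(s) { this.value = s; }

    const escapeHtml = (s) => String(s)
      .replace(/&/g, '&amp;').replace(/</g, '&lt;')
      .replace(/>/g, '&gt;').replace(/"/g, '&quot;');

    function html(strings, ...values) {
      let out = strings[0];
      values.forEach((v, i) => {
        out += (v instanceof SafeHtml ? v.value : escapeHtml(v)) + strings[i + 1];
      });
      return new SafeHtml(out);
    }

    const row = html`<td>${'<script>alert(1)</script>'}</td>`;
    // row.value === '<td>&lt;script&gt;alert(1)&lt;/script&gt;</td>'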

29. chriswarbo ◴[] No.44396533{5}[source]
It shouldn't need any scripts https://en.wikipedia.org/wiki/Link_prefetching

It can also be imposed by the client, e.g. via a https://en.wikipedia.org/wiki/Web_accelerator

replies(1): >>44396589 #
30. _heimdall ◴[] No.44396589{6}[source]
Yep, that works as well. I'll reach for a script still if I want more logic around when to prefetch, like only prefetching on link hover or focus. A script is also needed for any links that you need to preload but aren't included on the current page.
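
For example, hover/focus-triggered prefetching only needs a few lines (a sketch, no framework assumed):

    // inject a <link rel="prefetch"> the first time a same-origin link is
    // hovered or keyboard-focused, so the navigation that follows is warm
    const prefetched = new Set();
    function prefetch(url) {
      if (prefetched.has(url)) return;
      prefetched.add(url);
      const link = document.createElement('link');
      link.rel = 'prefetch';
      link.href = url;
      document.head.appendChild(link);
    }
    function maybePrefetch(e) {
      const a = e.target.closest && e.target.closest('a[href]');
      if (a && a.origin === location.origin) prefetch(a.href);
    }
    document.addEventListener('mouseover', maybePrefetch);
    document.addEventListener('focusin', maybePrefetch);
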
31. alganet ◴[] No.44396791[source]
> full page reloads

grug remember ancestor used frames

then UX shaman said frame bad all sour faced frame ugly they said, multiple scrollbar bad

then 20 years later people use fancy js to emulate frames grug remember ancestor was right

https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...

replies(1): >>44397326 #
32. kbolino ◴[] No.44397326{3}[source]
Classic frames were quite bad. Every frame on a page was a separate, independent, coequal instance of the browser engine. This is almost never what you actually want. The header/footer/sidebar frames are subordinate and should not navigate freely. Bookmarks should return me to the frameset state as I left it, not the default for that URL. History should contain the frameset state I saw, not separate entries for each individual frame.

Even with these problems, classic frames might have been salvageable, but nobody bothered to fix them.

replies(3): >>44397511 #>>44397653 #>>44399351 #
33. bmacho ◴[] No.44397511{4}[source]
> Every frame on a page was a separate, independent, coequal instance of the browser engine. This is almost never what you actually want.

Most frames are used for a menu, navigation, a frame for the data, a frame for additional information about the data. And they are great for that. I don't think frames are different instances of the browser engine(?), but that doesn't matter in the slightest(?). They are fast and lightweight.

> The header/footer/sidebar frames are subordinate and should not navigate freely.

They have the ability to navigate freely, but obviously they don't do that; they navigate other frames instead.

replies(1): >>44397616 #
34. kbolino ◴[] No.44397616{5}[source]
With a frameset page:

History doesn't work right

Bookmarks don't work right -- this applies to link sharing and incoming links too

Back button doesn't work right

The concept is good. The implementation is bad.

replies(2): >>44397720 #>>44397992 #
35. alganet ◴[] No.44397653{4}[source]
You can see frames in action on the POSIX spec:

https://pubs.opengroup.org/onlinepubs/9799919799/

They can navigate targeting any other frame. For example, clicking "System Interfaces" updates the bottom-left navigation menu, while keeping the state of the main document frame.

It's quite simple, it just uses the `target` attribute (target=_blank remains popular as a vestigial limb of this whole approach).

This also worked with multiple windows (yes, there were multi-window websites that could present interactions that handled multiple windows).

The popular iframe is sort of salvaged from frame tech; it is still used extensively and not deprecated.

replies(1): >>44397665 #
36. kbolino ◴[] No.44397665{5}[source]
An iframe is inherently subordinate. This solves one of the major issues with classic frames.

Classic frames are simple. Too simple. Your link goes to the default state of that frameset. Can you link me any non-default state? Can I share a link to my current state with you?

37. bmacho ◴[] No.44397720{6}[source]
Yup, they are not enough for an SPA, not without javascript. And if you have javascript to handle history, URL, bookmarks and all that, you can just use divs without frames.
replies(1): >>44397787 #
38. kbolino ◴[] No.44397787{7}[source]
This has nothing to do with SPAs.

Take the POSIX specs linked in a sibling comment.

Or take the classic Javadocs. I am currently looking at the docs for java.util.ArrayList. Here's a link to it from my browser's URL bar: https://docs.oracle.com/javase/8/docs/api/

But you didn't go to the docs for java.util.ArrayList, you went to the starting page. Ok, fine, I'll link you directly to the ArrayList docs, for which I had to "view frame source" and grab the URL: https://docs.oracle.com/javase/8/docs/api/java/util/ArrayLis...

Ok, but now you don't see any of the other frames, do you? And I had one of those frames pointing at the java.util class. So none of these links show you what I saw.

And if I look in my history, there is no entry that corresponds to what I actually saw. There are separate entries for each frame, but none of them load the frameset page with the correct state.

These are strongly hyperlinked reference documents. Classic use of HTML. No JavaScript or even CSS needed.

replies(2): >>44397914 #>>44397982 #
39. ◴[] No.44397914{8}[source]
40. bmacho ◴[] No.44397982{8}[source]
This is exactly what I wrote? But let me rephrase it: frames alone are not enough for an SPA; they can't keep state, you need javascript or a dynamic webserver for that.

> Ok, fine, I'll link you directly to the ArrayList docs, for which I had to "view frame source" and grab the URL:

You could've just right-clicked on the "frames" link and copied the URL: https://docs.oracle.com/javase/8/docs/api/index.html?java/ut... . They use javascript to navigate based on the search params in the URL. It's not great, it should update the URL as you navigate; maybe you can send them a PR for that. (And to change the state of the boxes on the left too.)

Also browser history handling is really messy and hard to get right, regardless of frames.

> And if I look in my history, there is no entry that corresponds to what I actually saw.

? If you write a javascript +1 button that updates a counter, there won't be a corresponding entry in your history for the actual states of your counter. I don't see how that is a fundamental problem with javascript(?).

replies(1): >>44398067 #
41. alganet ◴[] No.44397992{6}[source]
> History doesn't work right

> Bookmarks don't work right -- this applies to link sharing and incoming links too

> Back button doesn't work right

Statements that apply to many JS webpages too.

pushState/popState came years after frames lost popularity. These issues are not related to their downfall.

Relax, dude. I'm not claiming we should use frames today. I'm saying they were simple good tools for the time.

replies(1): >>44398111 #
42. kbolino ◴[] No.44398067{9}[source]
It's cool that they have that link. Most frame sites didn't. JS actually isn't necessary to make that work, they could have just interpolated the requested page server-side. But it only correctly points to one frame. It's the most important frame, to be fair, but it doesn't do anything for the other two frames.

I don't understand how pre-HTML5, non-AJAX reference docs qualify as an "SPA". This is just an ordinary web site.

replies(1): >>44398166 #
43. kbolino ◴[] No.44398111{7}[source]
They were never good. They were always broken in these ways. For some sites, it wasn't a big deal, because the only link that ever mattered was the main link. But a lot of places that used frames were like the POSIX specs or Javadocs, and they sucked for anything other than immediate, personal use. They were not deprecated because designers hated scrollbars (they do hate them, and that sucks too, but it's beside the point).

And, ironically, the best way to fix these problems with frames is to use JavaScript.

replies(1): >>44398296 #
44. ◴[] No.44398166{10}[source]
45. alganet ◴[] No.44398296{8}[source]
> They were never good

They were good enough.

> For some sites, it wasn't a big deal

Precisely my point.

> POSIX specs or Javadocs

Hey, they work for me.

> the best way to fix these problems with frames is to use JavaScript.

Some small amounts of javascript. Mainly, proxy the state for the main frame to the address bar. No need for virtual dom, babel, react, etc.
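
Something along these lines, presumably (a sketch; assumes a same-origin frameset with a content frame named "main", and all names are illustrative):

    // keep the top-level URL hash in sync with the content frame, so the
    // address bar, bookmarks and shared links capture the frame's state
    document.addEventListener('DOMContentLoaded', () => {
      const frame = document.querySelector('frame[name="main"]');

      // restore the content frame from the hash when a shared link is opened
      const saved = decodeURIComponent(location.hash.slice(1));
      if (saved) frame.contentWindow.location.replace(saved);

      // mirror every later navigation of the frame back into the address bar
      frame.addEventListener('load', () => {
        const loc = frame.contentWindow.location;
        history.replaceState(null, '',
          '#' + encodeURIComponent(loc.pathname + loc.search));
      });
    });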

--

_Again_, you're arguing like I'm defending frames for use today. That's not what I'm doing.

Many websites follow a "left navigation, center content" overall layout, in which the navigation stays more or less stationary and the content is updated. Frames were broken, but they were a step in the right direction. You're nitpicking the ways they were broken instead of seeing the big picture.

replies(1): >>44398458 #
46. kbolino ◴[] No.44398458{9}[source]
Directionally correct but badly done can poison an idea. Frames sucked and never got better.

Along with other issues, this gave rise to AJAX and SPAs and JS frameworks. A big part of how we got where we are today is because the people making the web standards decided to screw around with XHTML and "the semantic web" (another directionally correct but badly done thing!) and other BS for about a decade instead of improving the status quo.

So we can and often should return to ancestor but if we're going to lay blame and trace the history, we ought to do it right.

replies(1): >>44398806 #
47. alganet ◴[] No.44398806{10}[source]
Your history is off, and you are mixing different eras and browser standards with other initiatives.

Frames gave way to (the incorrect use of) tables. The table era was way worse than what we have today. Transparent gif spacers, colspan... it was all hacks.

The table era gave birth to a renewal of web standards. This ran mostly separately from the semantic web (W3C is a consortium, not a single central group).

The table era finally gave way to the jQuery era. Roughly around this time, browser standards got their shit together... but vendors didn't.

Finally, the jQuery era ended with the rise of full JS frameworks (backbone first, then ember, winjs, angular, react). Vendors operating outside standards still dominate in this era.

There's at least two whole generations between frames and SPAs. That's why I used the word "ancestor", it's 90s tech I barely remember because I was a teenager. All the other following eras I lived through and experienced first hand.

The poison on the frames idea wore off ages ago. The fact that websites not made with frames still resemble their use is proof of that; they just don't share the same implementation. The "idea" is seen with kind eyes today.

replies(1): >>44398996 #
48. kbolino ◴[] No.44398996{11}[source]
I feel like we're mostly in violent agreement.

The key point about frames in the original context of this thread as I understood it was that they allowed a site to only load the content that actually changes. So accounting for the table-layout era doesn't really change my perspective: frames were so bad, that web sites were willing to regress to full-page-loads instead, at least until AJAX came along -- though that also coincides with the rise of the (still ongoing) div-layout era.

I agree wholeheartedly that the concept of partial page reloading in a rectilinear grid is alive and well. Doing that with JavaScript and CSS is the whole premise of an SPA as I understand it, and those details are key to the difference between now and the heyday of frames. But there was also a time when full-page-loading was the norm between the two eras, reflecting the disillusionment with frames as they were implemented and ossified.

The W3C (*) spent a good few years working on multiple things most of which didn't pan out. Maybe I'm being too harsh, but it feels like a lot of their working groups just went off and disconnected from practice and industry for far too long. Maybe that was tangential to the ~decade-long stagnation of web standards, but that doesn't really change the point of my criticism.

* = Ecma has a part in this too, since JavaScript was standardized by them instead of W3C for whatever reason, and they also went off into la-la land for roughly the same period of time

replies(1): >>44399384 #
49. p0w3n3d ◴[] No.44399351{4}[source]
Iframes are no longer the thing? I must have slept through this scene
replies(1): >>44399389 #
50. alganet ◴[] No.44399384{12}[source]
> I feel like we're mostly in violent agreement.

Probably, yes!

> So accounting for the table-layout era doesn't really change my perspective: frames were so bad, that web sites were willing to regress to full-page-loads instead

That's where we disagree.

From my point of view, what brought sites to full page loads were designers. Design folk wanted to break out of the "left side navigation, right content" mold and make good looking visual experiences.

This all started with sites like this:

https://www.spacejam.com/1996/

This website is an interstitial fossil between frames and the full table nightmare. The homepage represents what (at the time) was a radical way of experiencing the web.

It still carries vestiges of frames in other sections:

https://www.spacejam.com/1996/cmp/jamcentral/jamcentralframe...

However, the home is their crown jewel and it is representative of the years that followed.

This new visual experience was enough to discard partial loading. And for a while, it stayed like this.

JS up to this point was still a toy. DHTML, hover tricks, trinkets following the mouse cursor. It was unthinkable to use it to manage content.

It was not until CSS zen garden, in 2003, that things started to shift:

https://csszengarden.com/pages/about/

Now, some people were saying that you could do pretty websites without tables. By this time, frames were already forgotten and obsolete.

So, JS never killed frames. There was a whole generation in between that never used frames, but also never used JS to manage content (no AJAX, no innerHTML shenanigans, nothing).

Today, websites look more like the POSIX spec (in structure and how content is loaded) than the SpaceJam website that defined a generation. The frames idea is kind of back in town. It doesn't matter that we don't use the same 90s tech, they were right about content over style, right about partial loading, right about a lot of structural things.

replies(1): >>44399711 #
51. kbolino ◴[] No.44399389{5}[source]
By "classic frames", I mean <frameset> not <iframe>. Though iframes have some of the same problems, they don't have all of the same problems. They also tend to be used differently, though you can certainly create a frameset-like experience using only iframes.
52. kbolino ◴[] No.44399711{13}[source]
I appreciate looking at things from a different perspective! I can see your line of argument now.

I should clarify. I don't think JS killed frames, that's not what I meant. If anything, I think JS could have saved frames. But the failure of frames left a gap that eventually JS (esp. with AJAX) filled. Lots of other stuff was going on at this time too, including alternative tech like Java, Flash, and ActiveX, all of which were trying to do more by bypassing the "standard" tech stack entirely.

I think the ossification of web standards from ca. 1999 to 2012, combined with the rapidly growing user base, and with web developers/designers aggressively pushing the envelope of what the tech could do, put the standard stuff on the back foot pretty badly. Really, I'm talking about the whole ecosystem and not just the standards bodies themselves; there was an era where e.g. improving HTML itself was just not the active mentality. Both inside and outside of W3C (etc.), it seemed that nobody cared to make the standard stuff better. W3C focused on unproductive tangents; web devs focused on non-standard tech or "not the intended use" (like tables for layout).

So I think we can say that <frameset> frames died a somewhat unfair death, caused partly by their initial shortcomings, partly by people trying to break outside of the (literal) boxes they imposed, and partly by the inability of the standard tech to evolve and address those shortcomings in a timely fashion. But just as there was a reason they failed, there was a reason they existed too.

53. p0w3n3d ◴[] No.44400876{3}[source]
I must contradict. In 2005-6 my PRADO development was happening; it was already present on the market as a framework that extensively used javascript (mimicking Microsoft's ASP.NET forms) to make AJAX requests and regenerate the state of components that were placed on the web page using the DOM.

The thing was that it was really hard to write code that did the same DOM + placement on all the browsers, and if a framework could do that, it was a great help. I started my webpage development in 2000-ish with if (`document.forms /* is ie */`) ... and was finding a way to run IE on my Linux computer to test the webpage rendering there. And CSS 2 was released in 1998 and could have changed everything; it was the Deus Ex Machina everyone expected, except it didn't work, especially on IE (which had the majority of the market, and especially if you developed a business application, you had to count it as the majority of all your clients, if not the only ones). So in CSS 2 you could __allegedly__ do things you really needed, like placing things together or in a related position, instead of calculating browser sizes etc., but it didn't work correctly, so you had to fall back to javascript `document.getElementById().position = screenWidth/2 etc`.

So according to my memory, (1) these were the dark times, mainly because of m$ being lazy and abusing their market position, (2) we used javascript to position elements, colorize them, make complicated bevels, borders, etc., (3) this created a gap for Google that they could use to gain power (and we admired them at that time as the saviours of the web), (4) Opera was the thing and a Resistance icon (boasting of fulfilling all standards and being fast, but they failed a few times too)

also DSL, LAN internet sharing and AOL (in Poland 0202122 ppp/ppp), tshshshshshs, tidutidumtidum, tshshshshshsh ...

54. p0w3n3d ◴[] No.44400907{5}[source]
I have no hate/love relationship with the matter. Tbh I don't care, but my phone gets hot when it has to load another 5/10/20/100MB Single Page Application that displays a few lines of nicely formatted text, an animated background, and a "subscribe" button.

By the way, GWT did it before.