Google no longer serving up Firefox 2 compatible search
It seems as of today Google is no longer serving up search pages that look right in Firefox 2/SeaMonkey 1.1.x/RetroZilla. It is still possible to search, but the layout is all crap.
Last month they started messing things up, but only for Firefox 2.x specifically, so I could work around it by fiddling with the user agent in the other browsers. Now even that does nothing.
Balls.
Comments
Does DuckDuckGo HTML search work?
Last summer, I spent an evening with SeaMonkey 1.1.19 (?) and RetroZilla on Win95 on my PIII-500 box. Out of all of my usual sites (10-15), only four ranged from perfect to slightly broken. I couldn't access WinWorld's forum from the main page. And the fanciest ones took a couple of minutes to load!
In the past year, it seems like a lot of sites have suddenly gone for super-JavaScripty versions of themselves with no improvement in functionality, and a move to lock out everything older than a Core 2 Duo.
Does Google Advanced Search work? I use it with SeaMonkey due to some layout problem.
Another option is to override the UA for Google to that of a very old browser, unless they broke search for those browsers too.
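For reference, I mean the global pref in user.js (or about:config); a sketch, with a placeholder UA string, since I don't know which old strings Google still treats kindly:

    // Global UA override -- the only form old Gecko (Firefox 2/SeaMonkey 1.x)
    // supports. The IE 6 string below is just an example value; substitute
    // whatever Google still serves a plain page to.
    user_pref("general.useragent.override", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)");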
This JavaScriptiness is a combination of laziness, ignorance, and a ploy by Google. JavaScript was meant to be a tool to improve web page functionality, never to be the web page. So when you wind up with scripts that daisy-chain-load other scripts, which daisy-chain-load still more, you get poor performance even on modern machines a year or two old with Ryzen Threadrippers or i9s.
Which brings us to modern browsers. The best browser today isn't the one with the best standards support; it's the one that can handle dozens and dozens of scripts at once.
Mix that with the fact that many of those support scripts are built by Google, and you can expect they optimized the scripts for their browser and vice versa.
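Purely as an illustration of the pattern (the file names and CDN are made up), this is all it takes for one loader to fan out into dozens more, each of which can do the same:

    // Illustrative loader: one script injecting many more. Each injected
    // file can repeat the pattern, which is how 40 scripts become 200.
    ["analytics.js", "ads.js", "framework.js" /* ...and so on */].forEach(function (src) {
      var s = document.createElement("script");
      s.src = "https://cdn.example.com/" + src; // hypothetical CDN
      document.head.appendChild(s);
    });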
But I digress; it looks like old systems are being sunset for no reason. How much space did the nearly pure HTML version take? Maybe a meg or two?
Okay, did some more testing with old versions of browsers:
Mozilla 1.5 on Windows 10 seems to default to a mobile layout and elements are badly misplaced. I recall seeing something like this if you blocked Google with NoScript.
IceWeasel 3.0 on Windows 10 seems to have the same problem.
SeaMonkey 2.9.1 on Windows 2000 seems to have a minor problem. It mostly works, but the actual input box is misaligned with the input box graphic. Later versions of SeaMonkey seem to have the same problem without a UA override.
Firefox 3.6 on Windows 2000 still defaults to the mobile page, but it does not seem to have the massive layout problems older versions have.
Firefox 2.0 on Windows NT 4.0 again defaults to the mobile page but seems to work better.
RetroZilla 2.2 on Windows NT 4.0 has the same result as Firefox 2.0.
Camino 2.1.2 on Mac OS X 10.6.8 has the same problems as Mozilla 1.5.
Mosaic-CK 2.7ck11 on Mac OS X 10.6.8 has a bunch of garbage between the input box and the actual results, but it actually does seem to work.
Firefox 10 on Solaris 10 seems to have the same problems newer versions of SeaMonkey do.
Mozilla 1.7 on Solaris 10 has the same problems Mozilla 1.5 has.
Lynx 2.8.6 on Mac OS X 10.6.8 gets to the results page, but the actual links to the results do not appear as hyperlinks.
And Google was once a site that I tested older browsers with. Guess I'll have to find another site to do that.
Internet Explorer 6 on Windows 2000 is also displaying the mobile-like version, and they dropped the SSL requirement. It seems to work okay, but it looks weird. Oddly enough, Google Images still looks normal, at least for now.
The mobile site seems to be an improvement for Netscape 4. It previously rendered the menu bar vertically, on top of the search results, making it useless. Now you can actually click on the search results. It does look terrible, though.
I will totally crap myself if/when Google forces me to use that ridiculous grow-to-infinity, make-the-scroll-bar-useless version of their image search. Even ignoring compatibility, the "new" stuff just sucks.
I've even started noticing a lot of sites breaking in New Moon 27.9 (the build that works on lower-SSE and non-SSE systems).
Yeah, not very many sites look right in Firefox 2/SeaMonkey 1.1.x. There is the vintage computing forum, and soylentnews.org.
With a little bit of stuff in userContent.css, the WinWorldPC forum is readable, but creating posts does not work well and the post editing menu is not available:
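Something along these lines; the selectors are illustrative guesses at the kind of rules involved, not the exact ones:

    /* Hypothetical userContent.css rules: hide the script-driven widgets
       old Gecko mangles and let the post text flow normally. */
    @-moz-document domain(winworldpc.com) {
      .post-editor-toolbar, .js-only-widget { display: none !important; }
      .post-body { position: static !important; width: auto !important; }
    }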
@SomeGuy
In my experience, script-heavy sites crash New Moon on Server 2003 SP2 no matter which build (SSE or not).
Even with hardware acceleration disabled.
One of the New Moon builds from a month or so ago seemed to have a serious bug that crashed on a lot of sites (the build before it was not crashing). But I'm running 27.9.7 now, and those particular sites are working fine.
I've had crashes with New Moon/Mypal 28/Serpent 52 on sites such as YouTube, MSFN, IBM's login page, and Alibaba. The common denominator is a fault at 0x77f87eeb (RtlEnterCriticalSection) in ntdll.dll. But those only happened under Windows 2000, not XP/2003.
Unfortunately, BWC seemingly wasn't able to find a solution.
Huh, I've not had any problems with Mypal crashing on YouTube or MSFN. I'm on 28.7.2, Windows 2000.
Hmmm, in regular Google search (using New Moon), it recently started showing this "...." thing by each entry; clicking on it shows the actual URL and the site's encryption status. Seeing the actual URL can be useful, but the way it is done just seems like something designed by a committee.
It is "fun" watching the web fall apart. eBay finally logged me out of my Seamonkey 1.1.x browser, and the current login page requires crazy scripting. Although that site has been too slow to use for ages, payment broke a way while back, and scripts for viewing additional item images broke earlier this year.
I get your sarcasm.
It's awfully sad how sites are turning into massive JavaScript applets.
Browsers are no longer focused on standards, just on how well they can handle the branching (one script loads 20 more scripts, those load other scripts, and 40 scripts quickly turns into something like 200).
Just so organizations don't have to hire proper web designers, they can go to Wix or some other garbage "drag-and-drop" site builder and make a "website" that looks like the hundreds of other cookie-cutter sites that are slow and painful to use.
I'm waiting for the day designers realize this is a major security disaster waiting to happen.
YouTube often uses 100% of my Core Duo T2400 when loading a 360p video. 480p is completely unwatchable.
A 720p MP4 video in WMP 9 using K-Lite Codec Pack uses 30% or less of the same CPU.
And some other sites are ridiculous nowadays, too. There was one that didn't like a certain password; it met all of their requirements, but I'd keep getting told it was wrong. I changed it to a different, similar one and it worked. And a forum in a completely different field likes to force a mobile version half the time, even though the desktop version is perfectly fine and readable at 1024x768.
Oh yeah, those sites that use "responsive design." Rather than designing a proper site for each environment that ensures equal functionality, "designers" instead add a few lines to the page code that change the layout based on device resolution. Quick, dirty, and lazy, especially when the layout changes have a tendency to break things. Then users get angry when essential content they're looking for is stripped out or made hard to find, and there's nothing they can do about it.
The proper method would be to detect the mobile or tablet flags in the user agent sent with the request. But this is the modern web; that would take too much effort and would actually work.
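For what it's worth, the server-side version is about this simple (a bare-bones Node.js sketch; the match patterns and the /m/ path are made up for illustration):

    // Minimal server-side UA detection: read the User-Agent header the
    // client already sends and route mobile clients to a separate tree.
    const http = require("http");

    http.createServer(function (req, res) {
      const ua = req.headers["user-agent"] || "";
      const isMobile = /Mobile|Android|iPhone/i.test(ua); // illustrative flags
      if (isMobile && req.url.indexOf("/m/") !== 0) {
        res.writeHead(302, { Location: "/m" + req.url }); // hand off to the mobile site
        return res.end();
      }
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end("<p>Full desktop page</p>");
    }).listen(8080);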
Huh, and I had 480p working on a 1.6GHz Pentium M just the other day in Mypal browser.
I agree that it is the proper method, but the problem is that a lot of designers are being told that user agent sniffing is not the correct method, and that it should be done with responsive design.
I know I'd much rather have a separate mobile site for those users that just have to look at the web on their toy phones, rather than have a monstrosity like Walmart's new website. It's one of the worst I've seen in a while.
And that's how we know modern web design has taken a turn for the worse.
It's not "sniffing." User agents are there for a reason.
If sites weren't so desperate to become easy cookie-cutter copies of everything else, with their heavy reliance on scripts, we wouldn't be having so many problems.
And this is the evidence. If it weren't for all that scriptedness, a standards-based older browser like Firefox 2 would have no problem browsing the web; it's just sites wanting to use so many scripts to avoid writing plain HTML.
Instead, browser code is outdated the moment it's compiled, due to the rapidly changing script design in web pages. It's outrageous.
EDIT:
Those early Core Duos were absolute garbage. They had poor performance no matter where you looked.
And yes, I too have had a 1.7 GHz Pentium M monumentally outperform a Core 2 Duo U7700, despite having a third of the memory and poor graphics. It was quite sad, really.
This is why I only use simple HTML for my entire website.
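Roughly this shape, purely for illustration; there is nothing here for any browser, old or new, to choke on:

    <!-- A minimal static page: no scripts, no web fonts, no framework. -->
    <!DOCTYPE html>
    <html>
    <head><title>My site</title></head>
    <body>
    <h1>My site</h1>
    <p>Plain text and links, nothing else.</p>
    <p><a href="software.html">Software</a> | <a href="about.html">About</a></p>
    </body>
    </html>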
Seriously, a few years ago you could get around in the oldest, most obscure WWW browsers, and now several sites complain if you use the latest SeaMonkey!
The worst part is that most sites pretend text browsers don't exist, require browsers to display and process zillions of scripts, and then people wonder why 98 percent of the web doesn't work in Lynx.
Funnily enough, Gopher is still around after all this time, and doesn't have any of this 'modern' garbage.
Time for Gopher to make a resurgence then?
I also noticed the other day that Wikipedia is requiring newer HTTPS encraption now. Not too surprising. It will actually load in RetroZilla, which includes some of the newer encryption, but what really ticks me off is this haughty error message they show to other browsers:
Yeah, I'm starting to notice sites throwing unknown decryption errors in Pale Moon and Firefox. Not even Opera (Chrome-based) works.
Amazingly, the only browser that doesn't throw off those errors is...
Tada! Google Chrome!
The whole Wikipedia encryption thing has been pissing me off for a while. It's one thing to require it for logins, but you used to be able to just browse the site with basic http. We don't need encryption on basic informational sites, and requiring it just limits access to the information.
It's just the prestige thing.
That, and to cater to Chrome, which screams bloody murder if you type anything into a form on a page that isn't encrypted.
And I've complained about this before...
As for the topic itself, I don't use an ancient browser to surf the web (though I did use Firefox 12 on my Win2000 VM, until it one day complained about HTTPS/encryption).
I've been noticing lately that tons of sites complain even with the latest Firefox ESR, and tell me that I MUST use Chrome to do anything.
I've gotten around that a couple times using a User Agent Switcher add-on for Firefox. It's incredibly sad that we have to do that, though. I think I mentioned once before that Chrome is becoming the new IE 6.
I use Firefox ESR, and that hasn't happened to me (but then again, I keep to a familiar circle of sites). I guess developers just love Chrome that much and hate everything else.
No, it's that they look at those browser market-share figures and see, "Hey, most people use Chrome. We should give them the best experience. Since most people use Chrome, we don't have to bother with other browsers."
Today it seems DuckDuckGo has started requiring "higher" encryption.
Well, I mean, it is a search engine hell-bent on "privacy."
How does it work on other browsers?
Same on other browsers. I got an error in Internet Explorer 6 until I enabled TLS 1.2.
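For anyone else fighting this: instead of clicking through Internet Options, the same switch can be flipped with the WinInet SecureProtocols bitmask in the registry. A sketch (0x80/0x200/0x800 are the TLS 1.0/1.1/1.2 bits; note that a stock, unpatched system ignores the 1.1/1.2 bits, so this assumes the later TLS updates are installed):

    Windows Registry Editor Version 5.00

    ; 0xA80 = TLS 1.0 (0x80) + TLS 1.1 (0x200) + TLS 1.2 (0x800)
    [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
    "SecureProtocols"=dword:00000a80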