[rescue] tired of current GUIs / a rant about the daily garbage we put up with

John Floren john at jfloren.net
Wed Oct 23 09:14:45 CDT 2019


On 10/23/2019 7:53 AM, Patrick Giagnocavo wrote:
> Just delete this if you don't like my opinion, but ...
> 
> we now have multi-GHz, multi-core, multi-GB-RAM systems with GUIs that companies have literally spent $100 million+ on, and yet the state of GUIs has not advanced; the UIs struggle to keep up with the glacially slow humans typing at them (input latency on an Apple IIe and early Macs was lower than it is today in many cases).
> 
> Things people actually accomplished real work on:
> 
> NeXTstation running NS3.3 - hardware we now would not even view as a competitor to a Raspberry Pi: 33 MHz 68040, 32 MB RAM, 1120x832 resolution.
> 
> SPARCstation 10 - 50 MHz CPU with up to 1 MB cache
> 
> I don't think it is "all Microsoft's fault" in that OS X and Linux haven't really shown themselves to be that much more compact. You would really have to struggle to cut a Linux GUI system down to 512 MB of RAM, for instance.
> 
> Did people just become lazy? Did everyone being able to afford a computer result in a dumbing-down or lowest-common-denominator approach? Why does so much of computing these days just seem like a total crapfest?
> 
> I have been seriously thinking about using the Coherent Unix-alike, or various Z80 CP/M boards, running on actually low-end hardware.
> 
> What's the minimum you need, for actual hacking?
> 
> /rantoff,
> Patrick

I wonder about the same thing frequently. I think there are a couple of 
things at play:

1. New programming languages may be more pleasant to work with or 
provide a more featureful standard library, but can drag along a pretty 
big runtime. I love programming in Go, but its memory management can be 
pretty "greedy"; I don't leak memory like I did in C, but the runtime 
likes to hold on to space it's garbage-collected for later re-use.

2. We're dealing with bigger/more complex content. One reason (not the 
only reason) Firefox now chews up 3 GB of memory is that each page is 
pretty fat. Some of this is due to lazy web programmers loading up on 
external JavaScript, fonts, and so on, but some of it is just due to the 
fact that images / videos / etc. have to look better these days. In 1999 
we may have been happy with 8-bit dithered 400x300 pixel images on web 
pages, because we were viewing them on an 800x600 monitor, but today our 
screens are bigger and we demand better-looking content. Also, with 
faster network connections, we don't worry so much about sending a giant 
image and just scaling it on the client side--but it still means we've 
got a big fat image sitting in memory.
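The back-of-the-envelope math here is worth spelling out: however well the file compresses on the wire, once decoded the browser typically holds it as raw RGBA at 4 bytes per pixel (a common case; actual browsers vary). A quick sketch:

```go
package main

import "fmt"

// decodedBytes estimates the in-memory size of a decoded image,
// assuming 4 bytes per pixel (RGBA). The on-disk JPEG may be tiny;
// the decoded bitmap is not.
func decodedBytes(width, height int) int {
	return width * height * 4
}

func main() {
	// A 1999-era web image vs. a modern full-resolution photo.
	fmt.Printf("400x300:   %d KB\n", decodedBytes(400, 300)/1024) // 468 KB
	fmt.Printf("4000x3000: %d MB\n", decodedBytes(4000, 3000)>>20) // 45 MB
}
```

So one scaled-down-on-the-client photo can cost as much memory as an entire 1999 browser session.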

3. As always, developers have the latest and greatest machines which can 
hide a lot of programming sins. I've got one of the new AMD Ryzen 
processors, 64 GB of RAM, and an SSD in my dev box. On something like 
that, you might not even notice that your application is constantly 
pinning 3 cores... but your users on an old i3 sure will!

4. It's always sucked. I remember spending a LOT of time sitting waiting 
while the disk 'ticked' back in the old days. How long did it take to 
start Emacs on a VAX?

If you want something really basic for just responsive text editing, 
just install a baseline Debian distro on whatever less-than-a-decade-old 
hardware you've got, install X and a lightweight window manager like 
FVWM, and only run a text editor like Emacs or Vim or Acme. You'll do 
fine with LaTeX and PDF viewers if you need them, but resist the temptation 
to run a web browser! My netbook from 2010 works fine as a ham radio 
box, EXCEPT when I open a web browser.

If you really want to make an appliance, check out https://u-root.tk/; 
it's a busybox-esque thing made by a friend of mine.


john

