Are there any GUI Mr. Miyagis out there who can share their wisdom? How can we make the GUI better?
Literally by making it possible to make the GUI better.
Many don't know or have forgotten, but when the PARC team showed Jobs their new mouse-first GUI, Jobs carried essentially nothing back to the Mac except its superficial appearance, and only a pale shadow of the features left behind ever got copied back in. Preface example: I'm assuming you're using NT-Windows or Linux rather than OSX, so it's not really relevant, but the original Alto had a three-button mouse and made heavy use of chorded presses and keyboard shortcuts to combinatorially explode what the mouse could do at any one time. Alan Kay had been in the audience for Engelbart's NLS demo and saw that he was far more effective with his hands either on the keyboard or on a mouse *and macro pad*; the PARC team even built and used their own version to make pointing an operation that actually compared in power to what you want to do with it. The LispM was similar: with its three mouse buttons, four bucky keys, and small swarm of mode modifiers, Symbolics' electronics engineers could reportedly operate their in-house chip design suite faster than any competing product and almost entirely without keying in commands. Years later, AutoCAD for DOS would reach legendary status on the LispM's shoulders by letting you do the same style of commanding, at a fraction of the effectiveness, through the keyhole interface of the AutoLISP command line. The Mac, though? One mouse button, and deliberately as few modifiers as possible, to keep the power of the machine from scaring away Jobs' target market. Microsoft and the Unixen of the time improved mildly on Apple's intentionally hobbled design, but got not even halfway to filling in Jobs' tar pit. You don't hate GUIs; you hate GUIs designed for emotional children.
But this would matter much less if Jobs had copied over any of the rest of what made the Alto worth copying, such as the actual thing PARC invented and wanted to show people: the Smalltalk-80 live object GUI model. In the Smalltalk "operating system" (technically the thing that would become Mesa was the real OS, but its job at that point was just to run the optimizing, disc-paging interpreter Smalltalk-80 sat on top of), everything you can interact with is a live, almost-pure message-passing object system, and so can be modified and improved by the user on the fly. For example, while Ingalls was showing Jobs their office suite, Jobs commented that he didn't like the look of text selection, which at the time inverted the color of the selected block so you got a black "hole" in the paper-like screen with white text. Ingalls spent a few seconds modifying how the selection class messaged and positioned the rectangle display class and immediately had a new behavior where selected text was surrounded by a black outline, across the whole system, with no recompilation and no efficiency loss thanks to the Alto's bytecode-optimized custom CPU. Smalltalk's true-object design made it trivial to do this with any element at any time. LispMs were similar: the user could point the Lisp Listener at any element of any data on-screen and get back a readout of its sexp data as it sat in memory, together with hyperlinks to all of its calling source provided by the integrated debugger. After the user modified it any which way, the Listener would inject itself into the links into and out of the corresponding binaries and simulate the new behavior alongside the rest of the code on the fly, while the incremental compiler quickly and automatically replaced just what the Listener needed with native-speed object code. Again, as the description implies, this applied to literally everything. Want to hose the system builtins? Get the right permissions and it's a few commands away.
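The live-patching move Ingalls pulled relies on late binding: callers look the method up at call time, so rebinding it on the live class changes behavior everywhere at once. A minimal Python analogy (the `Selection` class and its `render` method are hypothetical stand-ins for illustration, not anything from Smalltalk-80 itself):

```python
# Toy stand-in for a system-wide selection object: imagine every widget
# asks Selection.render() how to draw the selected region.
class Selection:
    def __init__(self, text):
        self.text = text

    def render(self):
        # Original behavior: invert the block (Jobs' black "hole").
        return f"[inverted]{self.text}[/inverted]"

sel = Selection("hello")
print(sel.render())            # [inverted]hello[/inverted]

# "A few seconds at the keyboard": rebind the method on the live class.
# Every existing and future Selection picks this up immediately --
# no restart, no recompilation of any caller.
def outline_render(self):
    return f"[outline]{self.text}[/outline]"

Selection.render = outline_render
print(sel.render())            # [outline]hello[/outline]
```

Because the method is resolved through the class at each call, even the pre-existing `sel` object switches behavior on the spot; that is the mechanism behind the instant system-wide change, though Smalltalk went much further by making *everything* on screen such an object.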
Want to *repair* the system builtins after a bad patch hoses them, while everything's live in memory, so it's like nothing ever happened and nothing is lost? There are war stories of lispers attaching a fresh Listener and doing exactly that while the rest of the world thought core dumps were a valuable and innovative step forward. Point is, it's not PARC's fault that Jobs hadn't seen a GUI before he went to visit them and didn't get what made this new breed so much more notable than what came before.
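You can mimic the shape of that repair-in-place trick from any live REPL; a toy Python sketch, with `format_price` as a hypothetical "builtin" that a bad patch has hosed (a real LispM repair worked on the running image's actual function cells, which this only gestures at):

```python
# A "system builtin" the rest of the live code depends on.
def format_price(cents):
    return f"${cents / 100:.2f}"

good = format_price  # the attached Listener still holds the working code

# A bad patch hoses the builtin...
def format_price(cents):
    return f"${cents}"  # oops: forgot to convert cents to dollars

# ...and instead of a core dump and a restart, we put the working
# definition back from inside the live session, all state intact.
format_price = good
print(format_price(1999))   # $19.99
```

Since other code reaches `format_price` through a name lookup rather than a frozen address, restoring the binding heals every caller at once; nothing was lost because nothing was ever shut down.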
If the point of a GUI is to make your work more efficient, your OS has done half a job, no matter what it's like, until it gives you the complete power to tailor it to the efficiency of your particular workflow quickly, on the fly, and reversibly/harmlessly, so everyone can experiment freely. As Alan Kay describes it, when they built the Alto GUI they would do their systems work in the morning, go off to a Palo Alto garden bar and get reasonably drunk at lunch, and come back to build and A/B-test each other on tens to hundreds of interface modifications and prototypes until close. Later they built a full 2,000 Altos, each costing as much as a car, and a whole networked lab to run them in, to bring in crowds of people and observe how they used the system in order to make improvements. Not only did it work (MS Office was invented at PARC, the first standard graphical circuit builder and simulator was written by a teenager on an Alto, half of Adobe's software suite today started as PARC systems, etc.), it probably can't be done another way. No matter who my vendor is, they have no clue in what particular or peculiar ways I want their OS to enable me to use my hardware, so they can't hope to make their GUI specific enough to enable me without it also being too bloated to stay out of everyone else's way, just as the user in the OP's video experienced. Far better to build it out of a system that everyone can efficiently modify and streamline than to keep putting everyone in the same Mac box because it's "sleek" and "standard" and "efficient". Completeness beats simplicity, and interface simplicity beats implementation simplicity.