Re: Necessity of multi-level error propagation
* James Kanze:
On Mar 16, 3:07 am, "Alf P. Steinbach" <al...@start.no> wrote:
* Alf P. Steinbach:
That's the Unix heritage:-). Also adopted by Windows, so
regardless of what one might think of it (and a lot of
people like it), we've got to live with it.
Maybe. :-)
Regarding this, the extreme backwardness and sheer
wrongheadedness of treating interactive keyboard input like a
buffered stream that could be a file, it's not just a problem
with the C++ iostream design.
My take on this: a long, long time ago, the model of all IO
being a sequential stream was probably valid. At that time,
however, most OS's didn't handle them that way: outputting to
the printer was a different system request than outputting to a
mag tape or a file, for example. After a while (but still
before Unix came along), it was realized that the OS should
support device independent sequential stream output; this
evolution took place at different times, depending on the
system, and even as late as 1981, MS-DOS 1.0 had separate
requests for outputting to a printer or to a disk file.
Regretfully, about the time the OS's finally started adopting
the sequential stream model, it became a bit obsolete, as disk
files (with random access) and interactive terminals
(particularly screens---even non-graphic screens support a full
screen mode) became the dominant devices. So we have all of the
OS's rushing to be "up to date" by implementing a model which is
no longer current.
It's a problem with our physical machines, where this less
than intelligent choice has been hardwired, and it's been
annoying me since 1979 or thereabouts. Many a time the itch
has been so strong that I've started writing an article about
it. But then, there's so little to say, it's not the stuff of
an article.
Here's the way Things Work on a typical PC or workstation:
PHYSICAL EVENTS -> AUTO-REPEATER -> BUFFER -> DECODING -> PROGRAM
1. Under the keys there's a matrix of some sort. A microcontroller in the
keyboard scans this matrix, detects key down and key up. On key down or
key up a series of bytes denoting the EVENT is sent to the computer.
2. When the microcontroller detects that a key is held down for a while it
initiates a kind of REPEAT action, sending the same key down sequence
repeatedly to the computer.
3. In the computer, receiving hardware+software logic BUFFERS it all.
The problem isn't hardware. At the lowest level, there are two
possible modes: the OS receives an interrupt for each key
stroke, at which time it can read the scan code of the key and
the state of the "modifier" keys (shift, alt, etc.). Or the OS
receives an interrupt for each key down and key up event
(including those of the modifier keys), and manages everything
itself. In the second case, autorepeat is handled by the OS.
No, autorepeat is independent of undecoded versus decoded.
On the PC, the keyboard is undecoded but with autorepeat in the physical keyboard.
The former (undecoded) is good; the latter, an independent feature (autorepeat
in the physical keyboard, at the front of the data flow chain), is ungood in the extreme.
I'm not sure (it's been a very long time since I worked at this
low level), but I think X usually uses the second possibility.
At any rate, it's certainly possible to e.g. map the tab key to
behave like control (which obviously doesn't auto-repeat,
because that doesn't have any meaning for it).
I'm also pretty sure that you've simplified things greatly.
There are several buffers: the one the application sees is after
the auto repeat, but the lower level ones aren't.
Yes, yes, I've simplified. :)
One consequence is that when e.g. the Firefox beast grinds to
a near halt, as it's wont to do now and then, then you can
"type ahead" but you don't see what you're typing, and so you
don't see when the buffer is full such that further keystrokes
are ignored (in earlier times it beeped, but no more), nor do
you see the effect of edit and arrow keys, which can really
mess up things.
Note that this could easily be handled in Firefox itself. All
it has to do is purge its input each time it finishes a screen
update. Whether this is a good idea, however, depends
because...
No, an application, being at the end of the chain, can only do some limited
things. As you say it can try to empty the buffer always -- when it has the
chance to read from the buffer. That can alleviate some problems to some degree
(and should therefore be done as a matter of course, but few apps do), but it
can't remove the problems entirely, being at the end of the chain.
Happily we old dinosaurs know better than pressing backspace
repeatedly when nothing happens, at least in editing (the
dreaded "delayed delete everything").
We old dinosaurs understand that just because we haven't seen
the echo doesn't mean that the command hasn't been taken into
account. So we occasionally "type ahead" a sequence of
commands, even if the system doesn't seem to be responding to
them.
Yes. Typing ahead is problematic with the current universal backward design. It
wouldn't be problematic with a more rational design.
But most users aren't that sophisticated, they have no mental
model of the data flow shown above, and think that what they
see on the screen is what goes on.
Another consequence is that in programs that provide "smooth"
scrolling, e.g. again a web browser, the scrolling is far
from smooth. For the program can't easily detect that a key is
being held down. What it can easily do is to react to
individual synthesized keystrokes resulting from a key being
held down, and so the effect is jumpy no matter how much the
program tries to smooth it out.
One rational way to do things could instead be
PHYSICAL EVENTS -> BUFFER -> DECODING -> AUTO-REPEATER -> PROGRAM
I don't know any system that works that way, however. Though I
suspect that early machines at Xerox PARC did, because those
folks were very keen on having "non-decoded" keyboard events
and, in particular, having time-stamped events.
I'm not sure what difference this would make.
I'm not that bad at explaining things, am I?
Anyway, it makes a huge difference for all aspects of keyboard handling.
With auto-repeats being synthesized as necessary on demand, instead of being
buffered, they're not being buffered. All problems associated with that (like
"delayed delete all") therefore gone. And the application then has a different
default model where e.g. it doesn't react to arrow key characters but instead to
arrow key state, whether an arrow key is currently down or not.
The reason Xerox
PARC could do things differently was that it intervened at a
lower level.
No, they were exploring the fundamentals, on what would be needed for a
reasonable personal workstation. In addition to timestamped events and keyboard
handling they were focusing on things such as blitter chips, as of 2009 known as
graphics cards. They were arguing (yes, arguing) that any personal workstation
should have an undecoded keyboard and a blitter chip, that those were essential.
Today we have the graphics cards. And we have managed to get the undecoded
keyboards, *but* with the dataflow all messed up, *wrong order* of processing...
The "problem" with this rational way is that it's not
compatible with the buffered everything-as-file-like-stream
i/o concept, which the C and C++ standard i/o facilities are
built on. That "problem" is a problem with the i/o facilities.
Except that you don't use the sequential stream interface for
GUI I/O. You use specific functions in the X or MS Windows
libraries.
No and yes. No, that backwards model is not only with sequential stream
interface, instead, it's embodied in the hardware and OS but it seems to be
associated with the stream i/o point of view: to the degree that it makes any
sense at all, it wouldn't make sense without the stream i/o view. And yes, this
is how it is via the OS API, although e.g. Windows "on the side" provides a not
quite reliable current keyboard key state map (which is possible because a PC's
keyboard differentiates between the first actual key-down event and later
synthesized key-down events for auto-repeat; it's unreliable because the
generating logic is in the physical keyboard, at the wrong end, and most
keyboards aren't able to handle the situation with 4 or more keys pressed).
And I suspect that there is a connection, that the hardware
has been adapted to what was easy to handle within that less
than practical i/o model.
I don't think there's a problem with the hardware.
Well, perhaps amend that conclusion after my clarifying comments above? :-)
It's the hardware.
And it's the OS interface to that hardware.
But the OS's
have been designed to use it in a way that isn't necessarily
appropriate in today's world.
Yes, it's all backwards... :-(
Cheers,
- Alf
--
Due to hosting requirements I need visits to <url: http://alfps.izfree.com/>.
No ads, and there is some C++ stuff! :-) Just going there is good. Linking
to it is even better! Thanks in advance!