I beg to differ. I think Windows 95 was so much better for end users than Windows NT (even with NewShell), and certainly better than Win 3.1. I think this was the most significant end-user version of Windows yet. Windows 98 was not bad, but Windows ME shouldn't have happened.
Windows XP was a welcome merging of technologies...
"Joseph M. Newcomer" <newcomer@flounder.com> wrote in message
All Microsoft operating systems since 1988 have been based on Windows NT. I exclude Windows 95, 98, and Me from the description of "operating systems" because they were merely file management software with a windowing system grafted on (MS-DOS with a 32-bit version of Win16 running). All changes in name, such as "Windows 2000", "Windows XP", "Windows Server 2003", and "Windows Vista", are merely marketing labels on the fundamental Windows core code. It has morphed from some relatively simple code in 1988 to the code of Vista today, but the growth path is direct. Device drivers, for example, in the forthcoming Server 2008 are identical in all fundamental ways to the first device drivers written for Windows NT 3.1, which was released in 1993. The API interfaces are a superset of the original API interfaces. Windowing works the same way. The GDI is easily used by someone who went through a time machine and emerged in 2007 having only programmed Windows NT 3.1. The file system works the same way. Threading works the same way, with the same APIs, and so on.
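To see what "the same APIs" means concretely, here is a minimal sketch (my illustration, not from the original post): the Win32 threading calls below have kept the same signatures and semantics since NT 3.1, so code like this builds against an early-1990s SDK or today's without change.

    #include <windows.h>
    #include <stdio.h>

    /* Thread entry point: the LPTHREAD_START_ROUTINE contract is
       unchanged since NT 3.1. */
    DWORD WINAPI Worker(LPVOID param)
    {
        printf("worker running, arg=%p\n", param);
        return 0;
    }

    int main(void)
    {
        DWORD tid;
        /* CreateThread has taken these same six parameters since NT 3.1. */
        HANDLE h = CreateThread(NULL, 0, Worker, NULL, 0, &tid);
        if (h != NULL)
        {
            WaitForSingleObject(h, INFINITE); /* same wait API, same semantics */
            CloseHandle(h);
        }
        return 0;
    }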
Don't be fooled by marketing relabeling of the binary executables; the source code is a direct descendant of Windows NT 3.1. More secure, cleaned up a lot, added to immensely, but a programmer who worked on the original Windows project would readily adapt twenty years later to the current coding practices (which have become more stringent; for example, the removal of strcpy and strcat from the repertoire of valid library calls) and would even recognize most of the module names.
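Concretely, a sketch of that stricter practice (assuming the Microsoft secure CRT, where the _s variants take an explicit destination-buffer size):

    #include <string.h>

    /* The classic unbounded call is deprecated by the Microsoft
       compiler (warning C4996); the secure-CRT replacement checks
       the destination size instead of silently overflowing. */
    void copy_name(char *dst, size_t dstSize, const char *src)
    {
        /* strcpy(dst, src);  -- old style: no bounds check, now flagged */
        strcpy_s(dst, dstSize, src);  /* fails if src does not fit in dst */
    }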
On Sun, 20 May 2007 16:52:58 GMT, MrAsm <mrasm@usa.com> wrote:
On Sun, 20 May 2007 12:12:40 +0100, Daniel James
<wastebasket@nospam.aaisp.org> wrote:
In article news:<lseu43pipdqngiqongtcko2lhdc3j0kmje@4ax.com>, MrAsm wrote:
... since Win2K and especially WinXP, Windows robustness and quality are very, very high ...
I've never had a BSOD in any NT version that couldn't be traced to faulty
hardware or buggy third-party device drivers.
I've never used NT. However, I read that the Win2K and XP kernel and architecture are NT-based; those WinNT engineers must have laid a very good foundation: very robust, high-quality work.
Things like plug-and-play are getting much better in Linux, and seem to me to be handled in a much more flexible way than on Windows. I mostly use Gentoo, which is a fairly tech-friendly, user-hostile distro (as these things go), but recent releases from distros like Ubuntu and Mandriva have been really very easy to set up -- even for completely non-technical users -- on anything but the wackiest or most bleeding-edge hardware.
Maybe I'll try again with a new Linux distro... But I remember that last year I tried Ubuntu on my notebook (a modern Asus machine): everything was all right with Windows, but Ubuntu Linux crashed during the install process, and some searches on the Internet showed that it was due to lack of support for the accelerated ATI graphics card in the notebook.
Then people say: you have the source, fix the bug. Are we joking? I believe that developing (and debugging) a device driver for an accelerated 3D hardware card is not a trivial task (OK, I can program in C, but you must also know the Linux kernel architecture, how to develop device drivers, and also the *graphics hardware*; it's too much).
****
People told me, "You know compiler technology. Don't use the Microsoft compiler, use GNU. That way, if you hit a compiler bug, you can fix it!"
They are deeply out of contact with any form of reality I know of.
Nobody pays me to maintain the GNU C compiler. If I take time to maintain it, I can't bill my clients for it. Assuming my fixes make the cut and get included in the compiler, it might be a year before they are distributed at all, and several years before a version of gcc that compiles my source successfully is universally available. So, in order to compile my program, I now have to distribute gcc with it. And convince clients that MY gcc is a BETTER gcc than the gcc they are now using. And what if there is another bug I *didn't* fix, because it didn't bother me, but after they make changes in my source, it won't compile with my compiler, and it also won't compile with the regular gcc they use (which doesn't have my fixes in it yet)?
What if the change I make requires a change in the Linux kernel as well? What if it CHANGES the Linux kernel? The cascading of disasters is not something I want to deal with.
Now, do I want to trust the future of my business to a compiler, operating system, and infrastructure managed by people of who-knows-what skill? A 17-year-old I never heard of is changing a key tool I depend on. At least at MS, we have SOME hope that the quality of the compiler and OS is increasing (even if they destroy the IDE), but as someone who spent many years in Unix and had to maintain cross-platform source code, it is a world of total chaos which I prefer to avoid. Already there are known problems with "X Linux", "Y Linux", and "Z Linux" having incompatible tool suites, so a build I create on one can't execute on the other (I have lots of friends who still believe in Linux, in spite of massive evidence to the contrary). Do I have the same shell? With the same bug fixes? Will the script I write use tools whose behavior is the same on all versions? I left that world in the mid-1980s and do not ever want to go back.
In the 1990s, I made a brief visit back to the Unix world. Gnome and KDE: mutually incompatible, conflicting, and non-interoperable features (the clipboard being the most noticeable). "Unix is stable," I was assured, but crashing X Windows would destroy my work as surely as a BSOD. And I could crash it by sending it valid command strings! "But *UNIX* is stable," they would claim, ignoring the fact that "stable" does not mean "the core operating system doesn't crash".
Unix was the only OS I worked on where a system crash could destroy every trace of a file I had been editing, requiring reconstruction of hours' worth of work. The only OS I worked on where the OS had to be shut down at 2am to do backups (all real operating systems, including Windows at the time, could do backups while supporting real users at the same time). The only OS I worked on where I could not tell, when I logged in each morning, whether anything I'd done the day before would continue to work (sysadmins would change default search paths, change utilities overnight, etc.). The only OS I worked on where I *had* to have admin privileges because I had to restart the printer spooler six times a day. The only OS I worked on where having admin privileges on my machine meant that I could modify ANY machine in the network, and read ANY file (including the director's private email). When this defect in NFS was discovered, the solution was to remove admin privileges from everyone, and it was the only OS I ever worked on where I could trivially regain them without even attempting a crack, in under 2 minutes (mostly waiting for the machine to reboot: pull the network cable, boot standalone, log in as admin, change the admin password, plug the network cable back in, reboot). Security was and still is a joke on Unix and Linux; the only reason they aren't exposed as the frauds they are is that there is so little usage of them that they are not a desirable target. (Ditto the Macintosh, in spite of the misleading TV ads; it is highly vulnerable.)
But Microsoft has lost sight of its goal and has begun the "we know better than you the way you should think" cycle, and every vendor that tried that has eventually failed in the marketplace. Look at the widespread popularity of IBM mainframes in offices today to see what happens when you lose sight of your customers' needs.
joe
I also posted the problem to ATI (like others did) but, of course, I got no feedback. And this hardware runs just fine on the Windows XP partition of the same notebook.
There are other examples, like when I was trying to develop in C++ under Linux using KDevelop: this IDE used to crash during development, and, of course, it was very annoying! And again: you have the source, fix the problem. It's utopian: it's hard to hack several hundred thousand lines of code with no idea of the general architecture of the system, the module subdivisions, documentation about data structures, internal protocols, etc.
Moreover, when you develop for Linux, for *which* Linux are you developing? For SuSE? For Knoppix? For Ubuntu? For the KDE or Gnome desktop? I know that there are differences between the various Linux distros... it seems like real chaos compared to the Windows platform.
And finally, I think that the open-source thing is a kind of "snake oil". You must have a real job and then code as a hobby, after work? It's hard to build quality, robust, useful software like Microsoft Office "as a hobby project". You can't compete with full-time, well-paid, high-level software engineers working for Microsoft (or other software companies).
Having the source code could be useful if you're interested in, e.g., investigating how an operating system kernel works or is implemented, but IMHO it's impossible to build a real, solid business on open-source software.
However, we're straying off-topic a bit here ...
Yes, you're right :)
Cheers,
MrAsm
Joseph M. Newcomer [MVP]
email: newcomer@flounder.com
Web: http://www.flounder.com
MVP Tips: http://www.flounder.com/mvp_tips.htm