PICList Thread
'[OT]: that filesystem royalty problem again'
2005\03\21@075722 by Peter


According to this article:

http://www.theregister.co.uk/2005/03/19/microsoft_charges_victims/

anyone making a device (like a USB-connected programmer or data
logger whose driver can access or store configuration files with long
filenames on an M$ OS) needs to pay $80 per user (or per server) in
licensing fees. While this may not be entirely exact (numerically), it
shows a situation that is not good. If I am not wrong, many mp3 players
that will accept long filenames sell for well under $80 each, retail.

Peter

2005\03\21@093814 by Paul Hutchinson

Actually, the requested royalty is $0.25 per unit produced, capped at 1
million units ($250K maximum; the rest are free). Dataloggers and other
non-multimedia devices may not require the royalty; however, you must
contact Microsoft to negotiate for these types of devices. My impression is
that Microsoft won't want to be bothered with collecting royalties for
devices that sell fewer than 10K units per year. The official Microsoft site
for this information is:
http://www.microsoft.com/mscorp/ip/tech/fat.asp
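
(For a quick sense of scale, here is a minimal sketch in C of those quoted
terms -- $0.25 per unit, capped at 1 million units. The figures are only the
ones cited in this post, not the actual license text.)

/* Sanity check of the quoted FAT royalty figures: $0.25/unit,
   units beyond 1 million are free.  Illustration only. */
#include <stdio.h>

static double fat_royalty(long units)
{
    const double per_unit = 0.25;       /* quoted rate, USD per unit  */
    const long   cap      = 1000000;    /* units beyond this are free */
    long billable = (units < cap) ? units : cap;
    return billable * per_unit;
}

int main(void)
{
    printf("10K units : $%.2f\n", fat_royalty(10000));    /* $2,500   */
    printf("5M units  : $%.2f\n", fat_royalty(5000000));  /* $250,000 */
    return 0;
}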

One of the patents has been reviewed and thrown out see:
http://www.pubpat.org/Microsoft_517_Rejected.htm

However, that still leaves two patents, so Microsoft hasn't given up yet. A
good overview of the situation is here:
http://www.answers.com/topic/file-allocation-table

Paul

> {Original Message removed}

2005\03\21@103409 by ThePicMan

At 14.57 2005.03.21 +0200, you wrote:

>According to this article:
>
>http://www.theregister.co.uk/2005/03/19/microsoft_charges_victims/
>
>anyone making an implement (like a USB connected programmer or data logger whose driver can access or store long filename-named configuration files on a m$ os) needs to pay $80 per user (or per server) in licensing fees. While this may not be entirely exact (numerically), it shows a situation that is not good. If I am not wrong, many mp3 players that will accept long filenames are well under $80 each, retail.

Microsoft is right; after all, it is well known that it has produced the greatest inventions of all time.

Moreover, Bill Gates is the best programmer in the world. Did you know he has the highest IQ of all people ever born, since IQs have been measured?

Also, he has got the longest pe*is of all of us, too.

So he deserves to have us pay him $80 in royalties for every $10 product.

Heil Bill!!



>Peter


2005\03\21@111104 by Lindy Mayfield

>
> Moreover, Bill Gates is the best programmer in the world. Did you know he
> has the highest IQ of all people ever born, since IQs have been measured?
>

Didn't he like write a basic interpreter for some machine while on the airplane while he was flying to the demonstration, and then when he got there they typed it in and it ran perfectly the first time?  And they didn't have assemblers back then; everything was written in 1's and 0's.  





2005\03\21@152959 by William Couture

On Mon, 21 Mar 2005 17:10:02 +0100, Lindy Mayfield
<Lindy.Mayfield@eur.sas.com> wrote:
> >
> > Moreover, Bill Gates is the best programmer in the world. Did you know he
> > has the highest IQ of all people ever born, since IQs have been measured?
> >
>
> Didn't he like write a basic interpreter for some machine while on the
> airplane while he was flying to the demonstration, and then when he got
> there they typed it in and it ran perfectly the first time?  And they
> didn't have assemblers back then; everything was written in 1's and 0's.

He wrote the loader for the interpreter on the plane, not the
interpreter itself.

Bill

2005\03\21@162205 by Jinx

> > Moreover, Bill Gates is the best programmer in the world. Did you
> > know he has the highest IQ of all people ever born, since IQs have
> > been measured?

That is so true. For kicks he stares at the backs of people's heads until
they throw up and fall over

> Didn't he like write a basic interpreter for some machine while on the
> airplane while he was flying to the demonstration, and then when he got
> there they typed it in and it ran perfectly the first time?  And they
> didn't have assemblers back then; everything was written in 1's and 0's.

And could cut his own hair with the other hand

2005\03\21@175318 by Russell McMahon

>> > Moreover, Bill Gates is the best programmer in the world. Did you
>> > know he
>> > has the highest IQ of all people ever born, since IQs have been
>> > measured?

According to a 'press release' (source at end)

__________________


REDMOND, WA -- Microsoft head Bill Gates, already considered by many to
be among the most powerful men in the world, further increased his
powers Monday, augmenting several of his key statistics to
near-immortal levels.

Among the most striking increases were a +2 raise in dexterity to 18,
and an overwhelming charisma increase to an above-human score of 20,
placing Gates in the realm of deities and demigods.

"I am pleased to announce that I have boosted my already impressive
statistics," Gates said in a statement to shareholders Monday. "As we
develop the technological framework that will dominate the 21st
century, these augmentations -- and others to follow -- will be powerful
wards against competition from the likes of Netscape, Oracle and
Melkor who is named Morgoth."

"Microsoft is the software-industry leader today, and tomorrow it will
also dominate the realm of information access, as well as the content
being accessed," Gates said. "The continued growth of our Corbis Media
archive, the successful development and launch of MSNBC, and my
mastery of the shield spells of the Elven King Lagolin are only the
beginning for Microsoft."

Gates, who raised his intelligence to 20 in 1990, is fast becoming the
most powerful CEO in American media. Experts place him above Fox's
Rupert Murdoch and Disney's Michael Eisner, both of whom hold over 1.2
million hit points. Gates is also rumored to be in possession of a bag
of holding containing one terabyte of information, as well as over 100
billion gold and silver pieces.

Analysts see Monday's statistical boost as extremely beneficial to
Gates in an increasingly competitive marketplace.

"This is a very shrewd move on Gates' part," PC Magazine columnist
John C. Dvorak said. "His vastly increased charisma -- the prime stat of
a chaotic evil executive -- will help him tremendously in his ongoing
struggle to convince skeptical Microsoft stockholders that his
ventures into television and his massive content-buying spree will pay
off in the long run. The extra CHR will also assist him greatly in
dealing with wary CEOs of companies he wishes to invest in and cast
spells over, like Comcast."

"It hardly seems fair, but he will now be capable of near-invisibility
in behind-the-scenes business dealings," Dvorak added, referring to
the stealth augment which comes with a dexterity gain. "And at the
same time, he'll wear Mordekainen's Spectacles of True Sight, which
provide +6 insight gains into long-term Windows marketing strategies."

While few question the wisdom behind Gates' stat increases, there
remains a possibility that the Federal Trade Commission, which in 1996
ruled his licensing agreement with computer manufacturers to be in
violation of anti-trust laws, will challenge the move. Even if the FTC
rules against Gates, however, industry analysts believe that he should
easily recover, thanks to his above-average 15 constitution.

Gates' rivals expressed frustration over his ability to achieve
invulnerability in a supposedly competitive market. "Combining this
augmentation with last month's purchase of the Polo Shirt of
Thalkettoth, which grants a +5 saving throw against anti-trust
litigation, Gates should now be seen as operating outside the law,"
Apple CEO Dr. Gilbert Amelio said Tuesday. "One more sorcerous potion
of Gain Market Share, and we might as well declare bankruptcy."

"Anyone can be a Santa Claus DM and give out unearned stats," Oracle
president Larry Ellison said. "I'm surprised he didn't just go ahead
and give himself a 20 in everything."

With overpowering statistics in all six ability categories, with the
exception of strength, Gates is widely considered to be primed for the
Kingship.

"Certainly his campaign could be crushed if he made a mistake," ABC
computer correspondent Geena Smith said. "But let's be realistic. He's
got 40 million experience points dating back to when he dropped out of
Harvard. His party has done nothing but kill and acquire for 22 years.
He knows when to cast versus when to hack-and-slash. He will be the
emperor lich of 21st century media."

___________


Source - The Onion.

2005\03\21@175326 by Russell McMahon

> Didn't he like write a basic interpreter for some machine while on
> the airplane while he was flying to the demonstration, and then when
> he got there they typed it in and it ran perfectly the first time?

No. That was Seymour Cray, who keyed in the bootloader for the 1st CDC
Super Computer (tm) from memory using the console binary switches.

> And they didn't have assemblers back then; everything was written in
> 1's and 0's.

No that was me (and a few zillion other people) when we first started
with microprocessors :-)



       RM

2005\03\22@033814 by Howard Winter

Russell,

On Tue, 22 Mar 2005 10:22:17 +1200, Russell McMahon wrote:

>...<
> > And they didn't have assemblers back then; everything was written in
> > 1's and 0's.
>
> No that was me (and a few zillion other people) when we first started
> with microprocessors :-)

Yup, been there, done that.  Wrote the program on a coding sheet (Z80 assembler), then hand-assembled into
hex, and typed it in as such (I had built a simple terminal, and the Z80 was running under a little monitor
program called MinMon, a few hundred bytes long, that accepted the hex and stored it).
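
(The core of such a monitor really is tiny. Below is a minimal sketch in C
of the idea -- read hex digit pairs, store the bytes, stop at a "go" command.
It is only an illustration of the principle, not MinMon itself, which was a
few hundred bytes of Z80 code.)

#include <stdio.h>
#include <ctype.h>

static unsigned char mem[256];          /* stand-in for the target RAM   */

static int hexval(int c)                /* '0'..'9','A'..'F' -> 0..15    */
{
    return isdigit(c) ? c - '0' : toupper(c) - 'A' + 10;
}

int main(void)
{
    int c, hi = -1, n = 0;

    /* Accept hex digit pairs until 'G' ("go") or end of input. */
    while ((c = getchar()) != EOF && toupper(c) != 'G') {
        if (!isxdigit(c))
            continue;                   /* ignore spaces, newlines, etc. */
        if (hi < 0) {
            hi = hexval(c);             /* first nibble of the byte      */
        } else {
            if (n < (int)sizeof mem)
                mem[n] = (unsigned char)((hi << 4) | hexval(c));
            n++;
            hi = -1;
        }
    }
    printf("stored %d bytes\n", n);
    /* A real monitor would now jump to the entry point in mem[];
       here we just stop. */
    return 0;
}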

Ah, those were the days!  Tell that to the kids of today and they don't believe you...

Cheers,


Howard Winter
St.Albans, England


2005\03\22@071255 by Peter


On Mon, 21 Mar 2005, Lindy Mayfield wrote:

>> Moreover, Bill Gates is the best programmer in the world. Did you know he
>> has the highest IQ of all people ever born, since IQs have been measured?
>
> Didn't he like write a basic interpreter for some machine while on the
> airplane while he was flying to the demonstration, and then when he
> got there they typed it in and it ran perfectly the first time?  And
> they didn't have assemblers back then; everything was written in 1's
> and 0's.

While the details about where exactly he wrote it may vary (one cannot
say that it was snowing and that he walked uphill both ways while flying
in an airplane, so no-one is saying that), I think that the part about
his writing some of it on the way may be true. But I don't buy the part
about 'it ran perfectly the first time' if it had more than 300 lines of
code, no matter who wrote it, where, or when, or with what, unless he
used a highlighting IDE with inline popup context help and other tools.
And there was no such thing at the time, afaik. I'm also pretty sure that
basic interpreters are much longer than 300 lines, even if written in C.
The record for brevity is held by that obfuscated C contest winning
entry, and even it has more than 300 lines when expanded (more exactly:
when all its parts are expanded - I once spent half a day
disassembling(!) that C(!) code). That code implements a basic
interpreter, and it was not written by B.

The push-it-out-of-the-door-at-the-last-minute technology appears to be
a feature of theirs. In his book, 'The Road Ahead' (1995) he wrote that
he used to do that in college. The other person who he said was also
doing that was Steve Ballmer. You have to admit that it is a brilliant
idea. You get so many *paying* beta testers that way! Otoh, this
'technology' is not so rare in the business world where sometimes
prototypes built with duct tape and chewing gum holding the parts inside
are sold to desperate clients as is. And I do not like that at all.

Peter

2005\03\22@072646 by Joe McCauley

You were lucky! I had to enter the electrons by hand........

Joe

-----Original Message-----
From: piclist-bounces@mit.edu [piclist-bounces@mit.edu] On Behalf Of
Howard Winter
Sent: 22 March 2005 08:38
To: Microcontroller discussion list - Public.
Subject: Re: [OT]: that filesystem royalty problem again


Russell,

On Tue, 22 Mar 2005 10:22:17 +1200, Russell McMahon wrote:

>...<
> > And they didn't have assemblers back then; everything was written in
> > 1's and 0's.
>
> No that was me (and a few zillion other people) when we first started
> with microprocessors :-)

Yup, been there, done that.  Wrote the program on a coding sheet (Z80
assembler), then hand-assembled into
hex, and typed it in as such (I had built a simple terminal, and the Z80 was
running under a little monitor
program called MinMon that was a few hundred bytes long, that accepted the
hex and stored it).

Ah, those were the days!  Tell that to the kids of today and they don't
believe you...

Cheers,


Howard Winter
St.Albans, England


2005\03\22@075022 by Lindy Mayfield


I was of course being flippant along with the original post, but it is interesting that just the evening before this thread started I was having a discussion with the wife about a similar thing.  She was saying that he "invented" windows, and such.  I disagreed.  I say he "stole" the idea from something he'd seen before (from Sun? I forget.)  I said he simply was a shrewd businessman who happened to get very lucky and be in the right place at the right time. And don't forget IBM's little mistake with PC licensing that changed the world. (-;

Ok, I'll give him this.  Microsoft does put out some of the best low-end feature-filled software for the price.  I say "low-end" because whenever I've tried to create a large document in Word, for example, I've ended up spending more time getting around the bugs and quirks of the program than I did typing it in.

The late and wonderful Douglas Adams has this to say about Mr. Gates and Microsoft:  http://www.gksoft.com/a/fun/dna-on-microsoft.html

I personally think the microcomputer world would be a better place without Microsoft, and that is just my opinion.  If there were "killer" apps and applications with as many features at the price available for Linux (that also would convert seamlessly between M$ and other formats), then I'd definitely ditch my Win OS's.  

I think their shoddy get-it-out-the-door-as-soon-as-possible time-to-market strategy goes further than just flooding the market with un-secure, easily hackable, and unsafe software.  It affects other companies who have to do the same thing just to stay competitive and in business.  In other words, they're bringing the rest of us down to their level in a way.  My humble opinions.


> {Original Message removed}

2005\03\22@075202 by Rolf

I am reminded of an old "thing".

Real programmers can:
C:\ > copy CON command.com


Rolf

Peter wrote:

{Quote hidden}

2005\03\22@104356 by William Chops Westfield

On Mar 22, 2005, at 1:54 AM, Peter wrote:

> if it had more than 300 lines of code

Someone else lowered the ante from "a basic interpreter" to "the loader
for the B.I.", which seems much more reasonable.   IIRC, back in those
days such things could be extremely simple, and it's not unthinkable
that someone could write one on a plane flight, especially if they
had seen something similar before.  And 300 instructions would have
been very long for such a beast; recall that the basic interpreter
itself probably fit in 4k or so, some systems had 1K of memory (or
less total), and when toggling things in on a front panel, brevity
was definitely a virtue.  No flight-simulator easter eggs in THOSE
days!

BillW

2005\03\22@144601 by Peter


On Tue, 22 Mar 2005, William Chops Westfield wrote:

> On Mar 22, 2005, at 1:54 AM, Peter wrote:
>
>> if it had more than 300 lines of code
>
> Someone else lowered the ante from "a basic interpreter" to "the loader
> for the B.I.", which seems much more reasonable.   IIRC, back in those
> days such things could be extremely simple, and it's not unthinkable
> that someone could write one on a plane flight, especially if they
> had seen something similar before.  And 300 instructions would have
> been very long for such a beast; recall that the basic interpreter
> itself probably fit in 4k or so, some systems had 1K of memory (or
> less total), and when toggling things in on a front panel, brevity
> was definitely a virtue.  No flight-simulator easter eggs in THOSE
> days!

You are probably right. The smallest basic interpreter I know of was the
4k tiny basic (including the character generator!) implemented in the
Sinclair ZX80. It was not a strict interpreter since it allowed only
tokens to be entered (each key entered a keyword). Math was integer only,
on 16 bits. It shared program memory with the display memory (total =
1k), so one had either a fancy display and short code, or vice versa ;-).
The smallest proper basic interpreter still in use is probably the one
in the 8052 variants (8k ROM).

4096 bytes of code at an average of 2 bytes/instruction is at least 2048
lines of assembly, or 34 pages at 60 lines/page. I don't know what the
world record in flight duration is, but I guess it would take longer than
that to write them, never mind the error rate required (about 10
characters/assembly line = 20480 characters, which, if error free, even
only from the point of view of typos, would make the man's correctness
rate 99.995% ;-).

As to imperfect loaders, I once wrote floppy boot sector code for the pc,
and it was *very* hard to make it both work and fit in the allowed 512
bytes minus the disk geometry table and header (my boot code read and
understood FAT12, searched for a specific file by name in the root
directory, and loaded and ran it at 1000:0100h, so it could be compiled
as a .com program and it would work). All this using int 10h (BIOS
screen output) and int 13h (disk sector level access) *only*. The fit
was so tight that I had to shorten some of the already short diagnostic
message strings. I wrote this in 1995 or 96.
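
(For anyone who has not poked at FAT12: the root-directory search is the
straightforward part. Here is a minimal sketch in C of the standard 32-byte
directory entry and the name match; the struct and function names are
illustrative only, not the actual boot-sector code described above.)

#include <stdint.h>
#include <string.h>

/* One 32-byte FAT12/FAT16 root directory entry (on-disk layout). */
#pragma pack(push, 1)
typedef struct {
    char     name[8];        /* space-padded base name               */
    char     ext[3];         /* space-padded extension               */
    uint8_t  attr;           /* 0x0F here would mean an LFN entry    */
    uint8_t  reserved[10];
    uint16_t time, date;
    uint16_t first_cluster;  /* where the file's data chain starts   */
    uint32_t size;           /* file size in bytes                   */
} dir_entry_t;
#pragma pack(pop)

/* Find "NAME    EXT" (11 chars, space padded) among n root entries;
   returns its first cluster, or 0 if not found. */
static uint16_t find_file(const dir_entry_t *root, int n, const char name11[11])
{
    for (int i = 0; i < n; i++) {
        if ((uint8_t)root[i].name[0] == 0x00)      /* end of directory */
            break;
        if ((uint8_t)root[i].name[0] == 0xE5)      /* deleted entry    */
            continue;
        if (memcmp(root[i].name, name11, 11) == 0)
            return root[i].first_cluster;
    }
    return 0;
}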

Peter

2005\03\23@065224 by Russell McMahon

> ... She was saying that he "invented" windows, and such.  I
> disagreed.  I say he "stole" the idea from something he'd seen
> before (from Sun? I forget.)

Xerox.

> I think their shoddy get-it-out-the-door-as-soon-as-possible
> time-to-market strategy goes further than just flooding the market
> with un-secure, easily hackable, and unsafe software.  It affects
> other companies who have to do the same thing just to stay
> competitive and in the business.  In other words, they're bringing
> the rest of us down to their level in a way.  My humble opinions.

This is completely standard "market forces" / capitalism, motherhood,
apple pie, truth beauty and the A  ... .
What's not to like?


       RM

2005\03\23@072246 by Russell McMahon

> Real programmers can:
> C:\ > copy CON command.com

Even I can do that.
It's what happens after that that I have trouble with.

Now, copy con xxx.bat is (sometimes) OK.
I can almost certainly no longer EDLIN ...
I'm pleased to say ! :-)


       RM

2005\03\23@075620 by Howard Winter

Russell,

On Wed, 23 Mar 2005 23:51:49 +1200, Russell McMahon wrote:

> > ... She was saying that he "invented" windows, and such.  I
> > disagreed.  I say he "stole" the idea from something he'd seen
> > before (from Sun? I forget.)
>
> Xerox.

Indeed, at the Palo Alto Research Centre (PARC) in the late seventies.  I saw a presentation given by someone
who had seen it himself and he predicted that mice and "desktop" screens (having the screen horizontal under
glass acting as the desk surface) would be The Thing one day.   Score 1 out of 2 that man!

> > I think their shoddy get-it-out-the-door-as-soon-as-possible
> > time-to-market strategy goes further than just flooding the market
> > with un-secure, easily hackable, and unsafe software.  It affects
> > other companies who have to do the same thing just to stay
> > competitive and in the business.  

It also means that the average user has come to expect computers to be unreliable, and to accept that rather
than expecting and getting improving standards of reliability, availability, and security.  This, beyond all
else, is why I detest Microsoft's business practices.

> > In other words, they're bringing
> > the rest of us down to their level in a way.  My humble opinions.

Mine too, except I would never stoop anywhere near as low as their level!

> This is completely standard "market forces" / capitalism, motherhood,
> apple pie, truth beauty and the A  ... .
> What's not to like?

All of the above!

Cheers,


Howard Winter
St.Albans, England


2005\03\23@075718 by Howard Winter

Russell,

On Wed, 23 Mar 2005 23:53:47 +1200, Russell McMahon
wrote:

> I can almost certainly no longer EDLIN ...
> I'm pleased to say ! :-)

Windows XP, from the people who brought you EDLIN!  
:-)))

Cheers,



Howard Winter
St.Albans, England


2005\03\23@082901 by Gerhard Fiedler

Lindy Mayfield wrote:

> I personally think the microcomputer world would be a better place
> without Microsoft, and that is just my opinion.  

It probably wouldn't be if it weren't for Microsoft. Like most everybody,
I don't like many of their attitudes, especially making big profits while
still putting out buggy stuff that could be made better using some of that
profit; but without them unifying the PC market, it wouldn't exist as such.
Remember when you had to "tune" CP/M for every computer you wanted to run
it on? It's not been so much IBM who standardized the market, it's been
Microsoft.

So in a way, it was necessary to go through Windows to get to Linux... :)

> I think their shoddy get-it-out-the-door-as-soon-as-possible
> time-to-market strategy goes further than just flooding the market with
> un-secure, easily hackable, and unsafe software.  It affects other
> companies who have to do the same thing just to stay competitive and in
> the business.  In other words, they're bringing the rest of us down to
> their level in a way.  My humble opinions.

Well, I think it's not so much Microsoft, it's Microsoft's customers. Over
the past two decades, the software companies that had huge success usually
were the ones who came out bristling with features, albeit buggy ones, not
the ones with a more conservative product cycle, focusing on the essential
features and making them stable rather than being in a constant alpha-beta
cycle and never getting to a solid product. It's the buyers that vote with
their dollars (since that market is dominated by the US, that's actually
dollars, not just a figure of speech :) -- and that's why Microsoft is the
biggest in the market. Not because they are evil, but because most people
actually _like_ them and their products, and prefer the shiny over the
solid.

Gerhard

2005\03\23@090334 by Bob Ammerman

It is interesting(?) to come up with printable ascii strings so that:

echo blahblahblah>myprog.com

Creates a useful program.

IIRC, this technique was at one time used for bootstrapping LapLink onto a
new computer. One would simply issue a mode command to redirect console
input/output to a serial port and then start the cloning procedure from the
machine already running LapLink. It would use the above trick to get a
simple bootstrap loaded, and then build from that.
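
(The constraint that makes the trick work is that every byte of the program
has to survive being typed or echoed. A small sketch in C that checks a
binary against that constraint -- the set of characters treated as unsafe
here is just a conservative guess, and the file name is only an example.)

/* Sketch only: check whether a binary could survive the
   "echo ... > prog.com" trick, i.e. every byte is plain printable
   ASCII with none of the characters the shell or console would eat. */
#include <stdio.h>

int main(int argc, char **argv)
{
    FILE *f = fopen(argc > 1 ? argv[1] : "myprog.com", "rb");
    int c, ok = 1;

    if (!f) { perror("open"); return 1; }
    while ((c = fgetc(f)) != EOF) {
        /* printable ASCII only, and no redirection/metacharacters */
        if (c < 0x20 || c > 0x7E ||
            c == '<' || c == '>' || c == '|' || c == '%') {
            printf("byte 0x%02X would not make it through echo\n", c);
            ok = 0;
        }
    }
    fclose(f);
    puts(ok ? "all bytes are echo-safe" : "not echo-safe");
    return ok ? 0 : 2;
}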

Bob Ammerman
RAm Systems

{Original Message removed}

2005\03\23@112655 by William Chops Westfield


On Mar 23, 2005, at 3:51 AM, Russell McMahon wrote:

>> ... She was saying that he "invented" windows, and such.  I
>> disagreed.  I say he "stole" the idea from something he'd seen before
>> ...

> Xerox.

I dunno.  Microsoft Windows was pretty clearly stolen from Apple.
Apple stole it from Xerox.  Of course, MS claimed it was stolen
from Xerox, because Xerox was more willing to license them the
"technology."

I've actually sat in front of an Alto.  Once or twice.  Alas, they
were mostly gone from Stanford by the time I worked there (replaced
by SUNs.)  There were a lot of window-based things floating around
CS departments and the like for rather a long time before Microsoft
had a viable product (including, for instance, the Stanford "V"
system and several LISP machines.)

BillW

2005\03\23@113521 by William Chops Westfield

On Mar 23, 2005, at 5:59 AM, Bob Ammerman wrote:

> It is interesting(?) to come up with printable ascii strings so that:
>    echo blahblahblah>myprog.com
> Creates a useful program.
>
I always liked the procedures that used "debug" as a cheap means
of creating executable .COM files (from hex code, usually.)

BillW

2005\03\23@130658 by Peter


On Wed, 23 Mar 2005, Gerhard Fiedler wrote:

> It probably wouldn't be if it weren't for Microsoft. I don't like many of
> their attitudes just like most everybody, especially making big profits
> while still putting out buggy stuff that could be made better using some of
> that profit, but without them unifying the PC market, it wouldn't exist as
> such. Remember when you had to "tune" CP/M for every computer you wanted to
> run it on? It's not been so much IBM who standardized the market, it's been
> Microsoft.

Yes, but the standardization occurred at the lowest common denominator.
And you still have to 'tune' XP for every computer you want to run it
on, and unlike with CP/M, writing 50 lines of assembly to fix it is not
an available option.

> So in a way, it was necessary to go through Windows to get to Linux... :)

No, they could have gone straight there if they had developed Xenix
instead of DOS and built Windows on top of a Xenix descendant. That
would have given them, 20 years ago, the leverage Apple's OS X has now.
They took a wrong turn in history, and bet so much money on the dead
horse that it has become immortal and floats on a bubble of publicity
and PR.

Xenix was a Unix variant that was developed and marketed by Microsoft at
about the time when DOS was beginning to crash less. It was in roughly
the same league as Coherent, another Unix variant from that time.

Peter

2005\03\23@131322 by William Couture

On Wed, 23 Mar 2005 12:57:14 +0000 (GMT), Howard Winter wrote:
>
> Windows XP, from the people who brought you EDLIN!
> :-)))

Unfortunately, EDLIN predates Microsoft's DOS.  It came from 86-DOS by
Seattle Computer Products (which was purchased by the infant Microsoft
and became IBM-DOS and MS-DOS).

And, FWIW -- at one time I was the proud owner of 86DOS version 0.1,
serial number 11.

Bill

--
Psst...  Hey, you... Buddy...  Want a kitten?  straycatblues.petfinder.org

2005\03\23@173758 by Lindy Mayfield

I loved that!  I remember always going to that section of PC Magazine first.  Then they dropped it and it was all downhill for me from there.

I work on IBM mainframes for a living.  Like the TV series CSI, there is always evidence of the crime, of what happened.  Always.  If you can read the dump and put things together, you can always figure out what caused the abend or problem to happen.  Alas, with windows, it is just "Did you reboot?  Reinstall?"

{Original Message removed}

2005\03\23@183623 by Bob Axtell

Bill Gates will be remembered in history for just ONE THING: he
defrauded a BILLION PEOPLE worldwide - and got away without
a scratch. He makes P.T. Barnum look like a Sunday-school teacher.

--Bob

Lindy Mayfield wrote:

>I loved that!  I remember always going to that section of PC Magazine first.  Then they dropped it and it was all downhill for me from there.
>
>I work on IBM mainframes for a living.  Like the TV series CSI, there is always evidence of the crime, of what happened.  Always.  If you can read the dump and put things together, you can always figure out what caused the abend or problem to happen.  Alas, with windows, it is just "Did you reboot?  Reinstall?"
>
>{Original Message removed}

2005\03\24@091643 by Gerhard Fiedler

Peter wrote:

>> So in a way, it was necessary to go through Windows to get to Linux... :)
>
> No, they could have gone straight there if they had developed Xenix
> instead of DOS and built Windows on top of a Xenix descendent.

I don't think Xenix was an option (in terms of resource needs) for all the
early PC owners that began to make the PC /the/ standard platform.

Gerhard

2005\03\25@055930 by Peter



On Thu, 24 Mar 2005, Gerhard Fiedler wrote:

> Peter wrote:
>
>>> So in a way, it was necessary to go through Windows to get to Linux... :)
>>
>> No, they could have gone straight there if they had developed Xenix
>> instead of DOS and built Windows on top of a Xenix descendent.
>
> I don't think Xenix was an option (in terms of resource needs) for all the
> early PC owners that began to make the PC /the/ standard platform.

Xenix ran fine (albeit slowly) on average hardware. By the time everyone
had an i386 machine and a graphics card, Xenix could have taken over from
DOS. At the same time, X workstations ran on hardware marginally better
than the i386 PCs everyone had, with some form of Unix OS as a base.
Instead, they built Windows on DOS, and later they parted on unfriendly
terms from OS/2. We know the rest of the story. It took 20 years for DOS
to become XP (via NT, which was not DOS based).

Peter

2005\03\25@095421 by Herbert Graf

On Thu, 2005-03-24 at 21:21 +0200, Peter wrote:
{Quote hidden}

And the reason is VERY clear: backwards compatibility. Look at the x86
architecture: we are still using CPUs that can run code developed for
the first IBM PC. The fact that the OS has remained as compatible for so
long is NO surprise.

Given the alternatives at the time, I'd say DOS was the better choice; it
was resource-friendly enough to be able to run on very minimal hardware.
The fact that everything later was an evolution or kludge of DOS was no
surprise; backwards compatibility is king in the world of PCs, and THAT
is a GOOD thing. TTYL


-----------------------------
Herbert's PIC Stuff:
http://repatch.dyndns.org:8383/pic_stuff/

2005\03\25@115807 by Dave Tweed

Herbert Graf <mailinglist2@farcite.net> wrote:
> And the reason is VERY clear: backwards compatibility. Look at the x86
> architecture, we are still using CPUs that can run code developed for
> the first IBM PC. The fact that the OS has remained as compatible for so
> long is NO surprise.
>
> Given the alternatives at the time I'd say DOS was the better choice, it
> was resource friendly enough to be able to run on very minimal hardware.
> The fact that everything later was an evolution or kludge of DOS was no
> surprise, backwards compatibility is king in the world of PCs, and THAT
> is a GOOD thing. TTYL

To a point. But Windows has carried so much baggage forward from DOS that
it has really hindered its development. I'd say that Microsoft has set the
PC industry back a good 20 years from where it would otherwise be -- look
how long it took them to decide that virtual memory and preemptive
multitasking were Good Things. These were known in the 1970s (on mainframes
and minicomputers), microprocessors had the hardware to implement them in
the early 1980s, but we didn't get them in Windows until the mid-to-late
1990s.

There are many ways to provide backwards compatibility for old
applications -- compatibility libraries (e.g., Wine on Linux), emulators
(DOSEMU) -- all the way up to simply continuing to run the older operating
system (on real hardware or on something like VMware) instead of upgrading.

-- Dave Tweed

2005\03\25@130825 by Peter Johansson

Dave Tweed writes:

> To a point. But Windows has carried so much baggage forward from DOS that
> it has really hindered its development. I'd say that Microsoft has set the
> PC industry back a good 20 years from where it would otherwise be -- look
> how long it took them to decide that virtual memory and preemptive
> multitasking were Good Things. These were known in the 1970s (on mainframes
> and minicomputers), microprocessors had the hardware to implement them in
> the early 1980s, but we didn't get them in Windows until the mid-to-late
> 1990s.

The problem with Windows today is much more than these technical
limitations.  DOS was never designed for a multi-user or networked
environment, and the security model was based around the physical
security of the machine.  And even when people started networking PCs,
the networks were physically isolated, and little serious design
thought was given to developing a proper security architecture.  Even
worse, the general design philosophy was "open by default" to keep
administration simple.  It wasn't until XP's SP2 that M$ started to
get serious about security.  (Although a lot of the so-called
security in SP2 is really nothing more than evil hacks.)

Just look at how many desktop boxes still run with administrator
privileges!  That's just completely absurd.  A Unix user wouldn't even
think about running a desktop as root.  That most basic security
policy is so ingrained in me that the only time I log in as
Administrator on my own boxes is to install software or make
configuration changes.  I was most pleased when I saw that the XP
installer actually, for the first time, prompted for the creation of
actual user accounts.  I thought, "wow, M$ finally got something
right" but then I realized with shock and horror that these user
accounts were given Administrator access by default!  It was all I
could do to keep from screaming.

But it doesn't end there.  I've found *far* too many software packages
that expect to be run with Administrator privs, sometimes for reasons
as silly as writing configuration data or temporary files into the "Program
Files" directory, which is something they shouldn't even be doing in
the first place.  Then there's the fact that most applications can't
even be installed without Administrator access, even when there is
absolutely no reason for them to require administrator access -- the
programmers were simply too lazy to think about supporting anything
other than the "do everything as administrator" philosophy.

Here is another good example of the absurd nature of Windows.
When a program wants to talk on the Internet, it uses the IP protocol
stack, which, barring code bugs, will only generate legitimate IP
packets.  However, there is another interface known as "raw sockets"
that can generate *any* type of packet, legitimate or illegitimate.
These illegitimate or rogue packets can wreak all sorts of havoc, from
exploiting flaws in remote machines, to denial-of-service attacks, to
screwing with routing tables.  Under Unix, only the root
(administrator) user can access the raw socket interface.  It's not a
perfect way to keep the internet free of illegitimate packets, but it is
a reasonable solution.  Under Windoze, just about any machine is
capable of generating these packets, and this is exactly how infected
Windows machines have nearly crippled the internet in the past.
Microsoft came up with a brilliant (not!) solution to this problem for
SP2.  Rather than try to secure the raw socket interface, they simply
removed the raw socket interface from the operating system!  Never
mind the fact that since the user-cum-administrator users can still
talk directly to the hardware, it's simply a matter of application
programs including their own raw socket libraries to get right back to
the same problem state.
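
(A minimal sketch in C of the Unix side of that: on Linux or the BSDs,
opening a raw socket requires root -- or CAP_NET_RAW on modern Linux --
and an ordinary user simply gets EPERM.)

#include <stdio.h>
#include <errno.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>

int main(void)
{
    /* Ask the kernel for a raw ICMP socket -- the kind of interface
       that lets a program hand-craft packets. */
    int fd = socket(AF_INET, SOCK_RAW, IPPROTO_ICMP);

    if (fd < 0) {
        /* Non-root users land here with EPERM ("Operation not permitted"). */
        printf("raw socket refused: %s\n", strerror(errno));
        return 1;
    }
    printf("raw socket granted (running as uid %d)\n", (int)getuid());
    close(fd);
    return 0;
}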

But wait, it gets even more absurd.  Even the strongest supporters of
Windows criticize its security weaknesses.  Yet those very same
supporters started whining and complaining about all the things that
XP-SP2 "broke."  I'm no Windows guru, but it's pretty darn obvious
that SP2 didn't break these things, and that this fallout is simply
part of the process of cleaning up the security problems.  The sad
part is that SP2 was just the tip of the iceberg and doesn't even
begin to address the serious, fundamental problems.  Given how bad
things were with SP2, I can't begin to imagine the uproar that will
result from a serious overhaul...

-p.

2005\03\25@164827 by Martin McCormick

Dave Tweed writes:
>look
>how long it took them to decide that virtual memory and preemptive
>multitasking were Good Things. These were known in the 1970s (on mainframes
>and minicomputers), microprocessors had the hardware to implement them in
>the early 1980s, but we didn't get them in Windows until the mid-to-late
>1990s.

       One of the problems in the late seventies and early eighties
was that the available hardware was still slow enough that ideas such
as UNIX-like operating systems with true multitasking, while
theoretically possible, would have been too slow to be acceptable even
to people who had no yardstick yet for what is and isn't excessive
slowness.  That is one of the reasons why software developers
immediately began to bypass the standardized output mechanisms built
in to the Apple II and, later, MSDOS worlds.  If text was slow, imagine
what high-resolution graphics were like.  The Apple II had them and
games certainly used them, but even then, people criticized the
slowness.

       UNIX has been around since 1968 or 1969, when it was developed
at Bell Labs, but the speed of the available hardware of the late
seventies, plus its limitations, plus the fact that not many people
would have seen the value of having a UNIX box in one's home, probably
made it almost inevitable that it would not be popular for another
twenty years, which is pretty much what happened.

       What we did get were all the bad habits stemming from trying to
squeeze every microsecond out of video management systems which, by
today, are fast enough to still look good while operating under a more
standardized environment.  Look at X in UNIX and the Macs of today.

Martin McCormick 405 744-7572   Stillwater, OK
Information Technology Division
Network Operations Group

2005\03\25@182139 by William Chops Westfield


On Mar 25, 2005, at 8:58 AM, Dave Tweed wrote:

> I'd say that Microsoft has set the PC industry back a good 20 years
> from where it would otherwise be -- look how long it took them to
> decide that virtual memory and preemptive multitasking were Good
> Things. These were known in the 1970s (on mainframes and
> minicomputers),

It wasn't just microsoft, of course.  The microcomputer industry
in general completely ignored a lot of stuff that had been learned
on mainframes much earlier.  Arguably, microcomputers set back the
state of computing 20+ years, and it wasn't microsoft...  (of course,
it would have been nice if microsoft, as the industry leader, was
more aggressive about implementing known technology.)

OTOH, we all tend to neglect the things that microsoft has done well.
The degree to which windows will install and run on a HUGE variety
of hardware without expert hacking is amazing.  The amount of backward
compatibility is amazing.  Their support of hardware vendors is amazing.


> microprocessors had the hardware to implement them in the early 1980s,
> but we didn't get them in Windows until the mid-to-late 1990s.

Early attempts at virtual memory in microprocessors were pretty
bad.  For some reason, hardware vendors were implementing segments
even though most 'real' OSes would have rather had paging.  The only
paging micros in the early 80s I can think of were the early SUNs,
and that was all in external logic and memory.  The 80386 and 68030
were about the first micros with decent MMUs, and those were pretty
much LATE 1980s.  (And don't forget the famous bug in the 68000 that
had would-be unix workstation vendors doing things like using an extra
CPU to handle page faults; and that was just for interfacing to EXTERNAL
MMUs.)

2005\03\26@052158 by Howard Winter

Herbert,

On Fri, 25 Mar 2005 09:54:21 -0500, Herbert Graf wrote:
>...<
> backwards compatibility is king in the world of PCs, and THAT is a
> GOOD thing

Indeed, and it's high time that Microsoft started to provide it!

Cheers,


Howard Winter
St.Albans, England


2005\03\26@073606 by Gerhard Fiedler

William ChopsWestfield wrote:

> Dave Tweed wrote:
>> I'd say that Microsoft has set the PC industry back a good 20 years
>> from where it would otherwise be ...

The "would be" argument is a very virtual one. If it had been easy to do, I
wonder why no one did it. If no one did it, why did they set the PC
industry back? Maybe they made a slower progress than somebody else could
have done, but then, this somebody else didn't do it either.

In large (and smaller) companies, they had running DOS programs until quite
recently. The ability to do so from the very beginning was probably one of
the major ingredients of the success of the Wintel platform.

>> ... -- look how long it took them to decide that virtual memory and
>> preemptive multitasking were Good Things. These were known in the 1970s
>> (on mainframes and minicomputers),
>
> It wasn't just microsoft, of course.  The microcomputer industry
> in general completely ignored a lot of stuff that had been learned
> on mainframes much earlier.  

As I was told (I had very little exposure), pre-OS X Apples didn't even do
such simple multitasking as compiling or rendering something in the
background while editing another file in the foreground.

Gerhard

2005\03\26@081801 by Gerhard Fiedler

Martin McCormick wrote:

>        What we did get were all the bad habits stemming from trying to
> squeeze every microsecond out of video management systems which by
> today, are fast enough to be able to still look good while operating
> under a more standardized environment.  Look at X in UNIX and the Macs
> of today.

Right. Or the journaling file systems... How many of you guys here have
something like cvs/cvsnt running for at least development work? It's not
the same as when versioning is built into the file system itself, but it
comes pretty close for many applications. It's not that the stuff didn't
exist, it's just that most don't want to go the extra step to use it.

Gerhard

2005\03\26@082131 by Wouter van Ooijen

> As I was told (I had very little exposure), pre-OS X Apples
> didn't even do
> such simple multitasking as compiling or rendering something in the
> background while editing another file in the foreground.

Macs did multitasking from the very beginning, but it was *cooperative*
only. Dunno whether that was a good choice.

Wouter van Ooijen

-- -------------------------------------------
Van Ooijen Technische Informatica: http://www.voti.nl
consultancy, development, PICmicro products
docent Hogeschool van Utrecht: http://www.voti.nl/hvu


2005\03\26@083501 by Dave Tweed

Gerhard Fiedler <lists@connectionbrazil.com> wrote:
> Dave Tweed wrote:
> > I'd say that Microsoft has set the PC industry back a good 20 years
> > from where it would otherwise be ...
>
> The "would be" argument is a very virtual one. If it had been easy to do,
> I wonder why no one did it. If no one did it, why did they set the PC
> industry back? Maybe they made a slower progress than somebody else could
> have done, but then, this somebody else didn't do it either.

Not virtual at all! I myself was designing external MMUs for 68020-based
engineering workstations at Apollo Computer in 1984, and others had been
doing it for the 68010-based machines before me. As someone else pointed
out, Sun was doing the same thing, in even higher volumes, shortly
thereafter.

So the technology was definitely available, and a high-volume PC
manufacturer could have made it available at a lower cost than what we
could do in relatively low-volume, high-end engineering workstations, but
there was no demand for it from either end users or from Microsoft.

The introduction of the 80386 in the late 1980s made the hardware expense
issue go away altogether for Intel-based platforms, and we still didn't get
virtual memory or preemptive multitasking in Windows. Others *were* doing
it, but by then Microsoft was such a juggernaut that no one paid any
attention to any of the other OS vendors, except in tiny niche markets.

It became a bit of a catch-22, because the end users were already used to
the primitive capabilities of the original PC and didn't know that there
were better things out there, and Microsoft wasn't inclined to put the
effort into developing software for it and telling them about it. So we
ended up with a PC "dark ages" that has arguably lasted about 20 years.

> In large (and smaller) companies, they had running DOS programs until
> quite recently. The ability to do so from the very beginning was probably
> one of the major ingredients of the success of the Wintel platform.

Like I said, there were ways to accomplish that without holding back the
introduction of better software technology for the mainstream development
of new applications.

-- Dave Tweed

2005\03\26@143106 by Lindy Mayfield

Straight up dude.  (-:

The very first program written for the IBM 360 computer in 1964 will run on the latest and greatest IBM mainframe today.  The point being, it can be done if you really design things correctly.  

{Original Message removed}

2005\03\26@161518 by William Chops Westfield

>>> I'd say that Microsoft has set the PC industry back a good 20 years
>>> from where it would otherwise be ...
>
> The "would be" argument is a very virtual one.

Yah.  Note that the world is STILL feeling out exactly what a "personal"
computer ought to be, especially compared to the mainframes of old in a
business environment.  While microcomputers failed to adapt some of the
good things from mainframes, in some sense they were popular BECAUSE
they
also failed to adapt some of the policies viewed as BAD from mainframes.
(most obviously, that CPU cycles were rare and expensive things to be
used ONLY for doing actual work-like things, not wasted on flipperies
like a user interface :-)

It certainly isn't helping that recent legislation aimed at ensuring
financial good conduct is being widely interpreted as requiring large
amounts of centralized control over the content of people's personal
computers.  There is a constant and continuous flamewar at work over
the amount of "nannyware" that the IT department likes to have, and
would like to INSIST on having, on our laptops (and recently PDAs.)

Guess who's good at providing nannyware?

BillW

2005\03\27@041619 by Morgan Olsson

Lindy Mayfield 13:49 2005-03-22:
>I personally think the microcomputer world would be a better place without Microsoft, and that is just my opinion.

Let us go in that direction

>If there were "killer" apps and applications with as many features at the price available for Linux (that also would convert seamlessly between M$ and other formats), then I'd definitely ditch my Win OS's.

MS keep their file formats secret so it is hard to seamlessly convert their documents, but I am really impressed by OpenOffice / StarOffice in that regard.  The programs are also (at least for me) more logical and easier to work with.  Plus they have a decent vector drawing program I use for simple CAD... Try http://www.openoffice.org for Windows/Linux/Solaris, and the variant NeoOffice for Mac.

There are other really competent cross-platform programs for other things, like:
Eagle CAD from Cadsoft for PCB design.
For web browsing: Firefox and Opera
For mail: Thunderbird and Opera
Images: GIMP

etc...

/Morgan

--
Morgan Olsson, Kivik, Sweden

2005\03\27@044037 by Lindy Mayfield

Your point being, I think, and also mine:  I cannot replace M$ apps and do my work the same way I do it now.  Someone sends me a M$Doc, XLS, powerpoint, etc. file and I can be sure to read it.  Yeah, Mr. Gates is an entrepreneur, I'll give him that.  But genius, he ain't.  Also, he didn't do anything to make the world a better place for the rest of us.  

{Original Message removed}

2005\03\27@051626 by William Chops Westfield

On Mar 27, 2005, at 1:39 AM, Lindy Mayfield wrote:

> Mr. Gates is an entrepreneur, I'll give him that.  But genius, he
> ain't.

This is perhaps the most annoying thing about Gates: the way he is
held to be a super-geek who made good doing geeky things, when in
fact it looks to me like microsoft owes most of its success to
successful, aggressive, and frequently less than ethical marketing
and business practices.

BillW

2005\03\27@053531 by Morgan Olsson

Lindy Mayfield 10:39 2005-03-27:
>Your point being, I think, and also mine:  I cannot replace M$ apps and do my work the same way I do it now.  Someone sends me a M$Doc, XLS, powerpoint, etc. file and I can be sure to read it.

I have not yet gotten an MS document I could not read.  (Though some formatting may not be as the author intended.)   If that ever happens I can download a free reader from Microsoft (if I use their OS).

There are tens of versions of every MSOffice format, and to read them all you need to buy the latest version.  I have even read on another discussion list that on handheld computers a third-party office suite (I forget which) is more compatible with Windows MS Word than that handheld version of MS Word is.

And I have helped convert MS documents for a friend using not MS but *OpenOffice*: opened a Word97/2000 document and saved it in Word95 format so he could open it (he was on an old computer and modem so could not get OpenOffice easily).  But if you have broadband access, do try OpenOffice - it is free and gratis.  You can even have both OOo 1.1.4 and the 2.0 beta installed without uninstalling MSOffice (unless you have disk-full problems...)

/Morgan
--
Morgan Olsson, Kivik, Sweden

2005\03\27@064001 by Howard Winter

Lindy,

On Sun, 27 Mar 2005 11:39:32 +0200, Lindy Mayfield wrote:

> Your point being, I think, and also mine:  I cannot replace M$ apps and do my work the same way I do it now.  

Well actually you could...

> Someone sends me a M$Doc, XLS, powerpoint, etc. file and I can be sure to read it.  

And if someone sends you something in some non-MS format that you can't read, do you rush out and buy the
software they used?

> Yeah, Mr. Gates is an entrepreneur, I'll give him that.  But genius, he ain't.

No, he somehow persuaded the population that his stuff is the only game in town, so people expect to be able
to send documents in a proprietary format and that people at the other end will be able to read them.  The
solutions include telling people to send you things in non-proprietary formats (.RTF for word-processed
documents, for example).  The firm I used to work for had problems because customers would send things in Word format, but
from various versions of Word, which didn't render as intended in other versions.  Astoundingly, instead of
asking them to send in a single (lowest common denominator) format, people started setting up different
versions of Word on different machines (because they can't co-exist) and having to sit at the appropriate desk
to read a particular document!

> Also, he didn't do anything to make the world a better place for the rest of us.

Amen to that!  In fact I agree with the opinion that the industry has been crippled, and held back severely
from where it could have been if a proper free market existed.

Cheers,

Howard Winter
(sending this from a non-MS operating system (OS/2) and a non-MS mailer (PMMail/2)


2005\03\27@070657 by Bob Ammerman

----- Original Message -----
From: "Gerhard Fiedler" <listsspamspam_OUTconnectionbrazil.com>
To: "Microcontroller discussion list - Public." <@spam@piclistKILLspamspammit.edu>
Sent: Saturday, March 26, 2005 7:35 AM
Subject: Re: [OT]: that filesystem royalty problem again


{Quote hidden}

I large suite of DOS applications that I developed starting in 1984 is still
being used today by one of my customers. It is a very critical part of their
business. I have several times provided them with the potential of upgrading
them to a Win environment, but they are happy with them and see no reason to
spend the $$. However, they have had a in-house development project going on
since 1999 to build a replacement for my system, but delivery of that system
is now 4+ years late!

Bob Ammerman
RAm Systems



2005\03\27@070704 by Bob Ammerman


----- Original Message -----
From: "Wouter van Ooijen" <KILLspamwouterKILLspamspamvoti.nl>
To: "'Microcontroller discussion list - Public.'" <RemoveMEpiclistTakeThisOuTspammit.edu>
Sent: Saturday, March 26, 2005 8:21 AM
Subject: RE: [OT]: that filesystem royalty problem again


>> As I was told (I had very little exposure), pre-OS X Apples
>> didn't even do
>> such simple multitasking as compiling or rendering something in the
>> background while editing another file in the foreground.
>
> Macs did multitasking from the very beginning, but it was *cooperative*
> only. Dunno whether that was a good choice.
>
> Wouter van Ooijen

As did Windows. It was probably the only reasonable choice for the time. Of
course this exposes another Apple/MSoft myth. Microsoft was actually _way_
ahead of Apple at providing true preemptive multitasking: Win95 vs. OS X!!

Bob Ammerman
RAm Systems


2005\03\27@085723 by Jake Anderson

Not to defend his business practices, but I know of no other single person
who has done more to help humanity.
He is personally matching the entire UN funding on AIDS; that to me is
pretty hard to argue with.

> {Original Message removed}

2005\03\27@093710 by Bob Axtell

Morgan Olsson wrote:

>Lindy Mayfield 10:39 2005-03-27:
>  
>
>>Your point being, I think, and also mine:  I cannot replace M$ apps and do my work the same way I do it now.  Someone sends me a M$Doc, XLS, powerpoint, etc. file and I can be sure to read it.
>>    
>>
>
>I have yet not gotten a MS document i could not read.  (though some formatting may not be as author intended)   If that ever happens i can download free reader from Microsoft (if i use their OS)
>
>There are tens of versions of every MSOffice format, and to read all you need to buy the latest version.  I have even on another discusson list read that on handheld computers Third party office (forgot which) suite is more windows-MSWord-compatible than that version of windows-MSword.
>
>And i do have helped converting MS documents for a friend using not MS but *Openoffice*: Opened a Word97/2000 document and saved in Word95 so he could open it (he was on an old computer and modem so could not get Openoffice easily)  But if you have wideband acess, do try Openoffice - it is free and gratis.  You can even have both OOo1.1.4 and 2.0 beta installed without uninstalling MSOffice (unless you have disk full problems...)
>
>/Morgan
>--
>Morgan Olsson, Kivik, Sweden
>
>  
>
OpenOffice: I agree, it's a wonderful program.

--Bob

--
Note: To protect our network,
attachments must be sent to
attach@engineer.cotse.net .
1-866-263-5745 USA/Canada
http://beam.to/azengineer

2005\03\27@121446 by Peter Johansson

Bob Ammerman writes:

> As did Windows. It was probably the only reasonable choice for the time. Of
> course this exposes another Apple/MSoft myth. Microsoft was actually _way_
> ahead of Apple at providing true preemptive multitasking: Win95 vs. OS X!!

The reason most people don't know of this is that Macs actually worked
very well as desktop machines with cooperative multitasking, unlike
Windows boxen of the time which BSODed or needed to be rebooted
several times each day.

-p.

2005\03\28@054252 by Gerhard Fiedler

Peter Johansson wrote:

> The reason most people don't know of this is that Macs actually worked
> very well as desktop machines with cooperative multitasking, unlike
> Windows boxen of the time which BSODed or needed to be rebooted
> several times each day.

How come, then, that some Mac web developer acquaintances of mine had never
heard of having a development web server running on their machines in pre-OS X
times? Something that was quite common among Windows web developers
(working on NT type systems). Did they just not know?

Gerhard

2005\03\28@111903 by Peter



> And the reason is VERY clear: backwards compatibility. Look at the x86
> architecture, we are still using CPUs that can run code developed for
> the first IBM PC. The fact that the OS has remained as compatible for so
> long is NO surprise.
>
> Given the alternatives at the time I'd say DOS was the better choice, it
> was resource friendly enough to be able to run on very minimal hardware.
> The fact that everything later was an evolution or kludge of DOS was no
> surprise, backwards compatibility is king in the world of PCs, and THAT
> is a GOOD thing. TTYL

What you people are missing is, Xenix ran DOS binaries just fine ;-)

Peter

2005\03\28@122525 by Martin McCormick

"Howard Winter" writes:
>No, he somehow persuaded the population that his stuff is the only game in town

       That's the problem right there.  There have been public
documents in which Microsoft has said that their goal is to
"decommoditize" the Internet by making all the free and open
applications such as Email, web browsing, etc, either break or not
work well so that the public will trash all that stuff and make the
cash registers ring in Redmond.

       Part of my day job is to look after our domain name servers
and dhcp servers.  The ones I administer run on UNIX and are as solid
as rocks, our master having been up 384 days as of today.

       The only sore spot is that Microsoft has convinced enough
people that only their domain name servers, equipped with their special
magic dust, can support Active Directory, so we were basically forced
to allow some of these things into parts of our name space, and they
don't play nice with the UNIX-based DNSs.
The end result is degraded service for both camps.

       Instead of special sauce or magic dust, the best solution
would have been to not reinvent an already very good wheel but make
one's applications work properly with the kind of UNIX DNS's that have
been on the Internet for almost 25 years.

       As someone said in a radio editorial I once heard, in
Redmond they want you to pay the Bill bill every time you do anything
that has to do with computing.

       Well, they're gonna' have to wait a long time for me to pay
any Bill bill.  I have a Debian Linux work station connected via ssh
to a FreeBSD server using Sendmail to send this message to the
PIClist.  At home, it's all Debian Linux and it works well except for
when I improve it to the point of breaking it.:-)
Even then, I was able to fix things without totally starting over
again and I usually learned useful knowledge that kept me out of
trouble later.

Martin McCormick WB5AGZ  Stillwater, OK
OSU Information Technology Division Network Operations Group

2005\03\28@151009 by Peter



On Mon, 28 Mar 2005, Gerhard Fiedler wrote:

{Quote hidden}

Afaik on Macs before OS X there was a strong division between system and
user tasks. There was only one user task running at any one time, but
several system tasks could run concurrently. To have a server running,
one had to make it a system task (I'm probably using the wrong
terminology for the Mac). Example: file sharing worked in parallel with
whatever the user was doing, with no visible side effects.

Peter

2005\03\28@154114 by Howard Winter

Peter,

On Sun, 27 Mar 2005 20:26:01 +0200 (IST), Peter wrote:

> What you people are missing is, Xenix ran DOS binaries
just fine ;-)

And OS/2 still does!  :-)

(but under preemptive multitasking, of course)

Cheers,



Howard Winter
St.Albans, England


2005\03\28@155905 by Peter Johansson

Gerhard Fiedler writes:

> Peter Johansson wrote:
>
> > The reason most people don't know of this is that Macs actually worked
> > very well as desktop machines with cooperative multitasking, unlike
> > Windows boxen of the time which BSODed or needed to be rebooted
> > several times each day.
>
> How come then that some Mac web developer acquaintances of mine never heard
> of having a development web server running on their machines in pre-OS X
> times? Something that was quite common among Windows web developers
> (working on NT type systems). Did they just not know?

The question here is, "what is a development server?"  Are you talking
about a server for static content, or are are you talking about a
server with dynamic capabilities?  For static content, there is no
reason you can't simply use the local filesystem as your "development
server."  The problem really arises from the lack of a server for
dynamic content from pre-OSX Macs.

I ran a small web company back in the mid '90s.  All of our servers
ran Apache on some flavor of Unix.  Just about all of our artists and
designers worked on Macs, and since we didn't care where anyone
worked, most of them worked from home.  Once they had a design they
liked, they'd toss it up on one of our servers for client approval.
Once the customer approved the design, then we'd cut everything up
into templates and let PHP generate the final pages.  Of course, this
is ancient technology now, but back then when everyone else was
cutting and pasting pages all day long, we were *extremely*
profitable.

Since I never did any server work on anything aside from Unix I can't
say for sure, but while it seems correct that full server development
on the desktop did come to Windows prior to MacOS, MacOS wasn't really
all that far behind.

-p.

2005\03\29@022248 by ThePicMan

At 14.21 2005.03.26 +0100, you wrote:
>> As I was told (I had very little exposure), pre-OS X Apples
>> didn't even do
>> such simple multitasking as compiling or rendering something in the
>> background while editing another file in the foreground.
>
>Macs did multitasking from the very beginning, but it was *cooperative*
>only. Dunno whether that was a good choice.

The Amiga did preemptive multitasking from the very start.

Now THAT was (is) a wonderful computer..

2005\03\29@022928 by ThePicMan


Virtual memory is the "modern" thing I hate most.

Virtual memory is one of those things that should be handled
cooperatively.. unlike CPU time (through multitasking), where
a cooperative system would work badly (preemptive, plus ways to
forbid task-switching for a certain guaranteed time, would be
great).

2005\03\29@030358 by Robert Rolf


ThePicMan wrote:

> At 14.21 2005.03.26 +0100, you wrote:
>
>>>As I was told (I had very little exposure), pre-OS X Apples
>>>didn't even do
>>>such simple multitasking as compiling or rendering something in the
>>>background while editing another file in the foreground.
>>
>>Macs did multitasking from the very beginning, but it was *cooperative*
>>only. Dunno whether that was a good choice.
>
>
> The Amiga did preemptive multitasking since the very start.
>
> Now THAT was (is) a wonderful computer..

Yes, it was quite amazing to be formatting floppies,
downloading files via modem, printing, and editing
all at the SAME TIME! with minimal performance hits.

And the  Amiga OS will soon be available running on power PC.
http://www.amiga.com/
http://www.amiga.com/amigaos/
"Now scheduled for commercial release in early 2005,
AmigaOS 4.0 moves the Amiga Operating System to the
modern Power PC chips enabling the easy use of
off-the-shelf components from third party vendors.
True to its heritage, AmigaOS 4.0 remains small, fast and robust."

2005\03\29@032254 by ThePicMan


[talking about Mr. Gates]

At 23.57 2005.03.27 +1000, you wrote:
>not to defend his business practices but I know of no other single person
>who has done more to help humanity.
>he is personally matching the entire UN funding on aids, that to me is
>pretty hard to argue with.

You mean that a very small fraction of the money He steals from us goes back
(hopefully) to some who need it, and thus He gets a lot of publicity in return? *g*

2005\03\29@033251 by Wouter van Ooijen

> Virtual memory is one of those things that should be handled
> cooperatively.. unlike cpu time (through multitasking), where
> a cooperative system would work bad (preemptive plus ways to
> forbid task-switching for a certain guaranteed time would be
> great).

I disagree. Cooperative memory use enables one application to crash the
whole system, and worst of all, without a hint of which application
caused this. A nightmare for both user and developer.

Wouter van Ooijen

-- -------------------------------------------
Van Ooijen Technische Informatica: http://www.voti.nl
consultancy, development, PICmicro products
docent Hogeschool van Utrecht: http://www.voti.nl/hvu


2005\03\29@042024 by Gerhard Fiedler
Peter Johansson wrote:

>> How come then that some Mac web developer acquaintances of mine never heard
>> of having a development web server running on their machines in pre-OS X
>> times? Something that was quite common among Windows web developers
>> (working on NT type systems). Did they just not know?
>
> The question here is, "what is a development server?"  Are you talking
> about a server for static content, or are are you talking about a
> server with dynamic capabilities?  

As you correctly stated, for static content you don't really need a server,
you can just browse your local files. I was talking about dynamic content
and highly functional pages that require a /real/ web server for
development, and usually also a database server. It was so normal for me to
have all this on my local system present that I was really surprised to
hear that these Mac guys couldn't do this -- with all the hype about the
Mac being ages ahead.

> Since I never did any server work on anything aside from Unix I can't
> say for sure, but while it seems correct that full server development
> on the desktop did come to Windows prior to MacOS, MacOS wasn't really
> all that far behind.

It seems that either it did it, or it didn't do it... and from what you
seem to confirm, it didn't do it (until OS X, of course).

Gerhard

2005\03\29@042443 by Gerhard Fiedler

Peter wrote:

> On Mon, 28 Mar 2005, Gerhard Fiedler wrote:
>
>> Peter Johansson wrote:
>>
>>> The reason most people don't know of this is that Macs actually worked
>>> very well as desktop machines with cooperative multitasking, unlike
>>> Windows boxen of the time which BSODed or needed to be rebooted
>>> several times each day.
>>
>> How come then that some Mac web developer acquaintances of mine never heard
>> of having a development web server running on their machines in pre-OS X
>> times? Something that was quite common among Windows web developers
>> (working on NT type systems). Did they just not know?
>
> Afaik on Macs before OS X there was a strong division between system and
> user tasks. There was only one user task running at any one time,

This was what Mac guys told me. Don't know whether that's correct, but it
is quite different from what Peter Johansson wrote.

> but several systems tasks could run concurrently. To have a server
> running one had to make it a system task (I'm probably using the wrong
> terminology for the Mac). Example: file sharing worked in parallel with
> whatever the user was doing with not visible side effects.

This doesn't sound too much like "cooperative multitasking". It sounds more
like DOS-style TSR programs... :)

Gerhard

2005\03\29@043611 by ThePicMan

At 10.30 2005.03.29 +0200, you wrote:
>> Virtual memory is one of those things that should be handled
>> cooperatively.. unlike cpu time (through multitasking), where
>> a cooperative system would work bad (preemptive plus ways to
>> forbid task-switching for a certain guaranteed time would be
>> great).
>
>I disagree. Cooperative memory use enables one application to crash the
>whole system, and worst of all, without a hint of which application
>caused this. A nightmare for both user and developer.

I will answer with these two points:

1) Since when have all those "virtual memory", "protected mode", "preemptive
  multitasking" and "protected memory" features (some of which I like anyway),
  introduced with Windows 95, made the whole system reliable?
  You NEED bug-free software; otherwise the race (OS-side) to make things
  safe will mean having a computer MUCH MUCH slower than it ought to be
  (and it will *still* crash, as we experience every day..).
  They'd better concentrate on making better debugging systems than on slowing
  down their already slow OSes even more. User applications must not always be
  seen as enemies. What we really need is better debugging tools;
  certification of known-to-be-reliable applications (via digital signatures);
  and anal security rules for unknown applications. This is the way to go.
  Instead we have lame debugging tools; security rules that slow down the
  whole system; and exploits and viruses that enter through services we didn't
  even know existed, or through IE, or when we simply open a JPG file!
  This situation is ridiculous.

2) We mean two different things by "cooperative memory" here. I simply
  mean that virtual memory must be an option for an application, and not
  the rule. Virtual memory has promoted abusive allocation and use of memory,
  making systems MUCH slower than they ought to be. And all of this
  to have, on average, just twice the addressable readable/writable memory
  space (usually one doesn't have a 200GB swap file on a 512MB RAM system,
  also because that's simply not very useful on 32-bit ~Intel CPUs even
  when using 36-bit paging). However, I see a lot of use for virtual memory
  for specialized scientific applications, for example (I am thinking of
  file mapping; see the sketch after this list), but such applications must
  be designed properly. When I ask for the right
  to allocate *physical* memory, I also want a way to tell the OS that I
  don't need to access it for a certain timeframe (so that it may swap it
  to disk, if necessary). This is one of the few things that were normal to
  do on 16-bit Windows, which was cooperative in a lot of ways. Now we
  have applications that think that memory is unlimited, and they make the
  worst possible use of it.
  It's simply ridiculous that a Windows XP system is much faster (read:
  less slow) when you add a 128MB stick to your 128MB PC. RAM is RAM, CPU
  is CPU. The CPU is slowed down terribly by virtual memory as we know it. It
  is nonsense.
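
(On the file-mapping aside in item 2, a sketch: on POSIX systems the mechanism is mmap(), which lets the kernel page a large file in and out on demand instead of the program reading it all into allocated RAM up front. A minimal example in C, assuming a POSIX system and an existing input file; "data.bin" is a made-up name:)

    #include <stdio.h>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        const char *path = "data.bin";         /* hypothetical input file */
        int fd = open(path, O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }

        /* Map the whole file; pages are faulted in from disk only when touched. */
        unsigned char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED) { perror("mmap"); return 1; }

        unsigned long sum = 0;
        for (off_t i = 0; i < st.st_size; i++)
            sum += p[i];                       /* the kernel pages the file in as we go */
        printf("checksum: %lu\n", sum);

        munmap(p, st.st_size);
        close(fd);
        return 0;
    }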

How come the modern Windows OS I'm using crashes at boot time much more often
than my old Amiga box (which lacks memory protection and much more) with lotsa
applications running?

Perhaps Microsoft & Co. are concentrating their efforts in the wrong directions..
for example, promoting .NET as a solution to reliability and security problems.
They'd better spend that money on debugging tools and better training for their
in-house programmers..

When I must be scared of opening an HTML file, viewing a JPG file, or reading
a DOC text document because I'm likely to get a virus, while everything is virtualized
and protected from my possible "coding abuses", something must be really wrong..

--
TPM

2005\03\29@060322 by William Chops Westfield


On Mar 29, 2005, at 2:37 AM, ThePicMan wrote:

> We mean two different things with "Cooperative Memory" here. I simply
> mean that virtual memory must be an option for an application, and not
> the rule. Virtual memory has promoted abusive allocation/use of memory,

You're talking about on-demand page creation, and perhaps actual paging
to disk. That's only a small part of what I consider to be lumped
together into 'virtual memory.'  The most important part is memory
protection; I can't access another process's memory without explicit
permission...

BillW

2005\03\29@072258 by Wouter van Ooijen

> >I disagree. Cooperative memory use enables one application
> to crash the
> >whole system, and worst of all, without a hint of which application
> >caused this. A nightmare for both user and developer.
>
> I will answer with these two points:
>
> (snip)

You provide convincing arguments against a few points that I did not
make (mainly: Windows is good). I still hold that virtual memory
(especially the isolation of user processes from each other) is a good
thing.

Do you really know how memory sharing in the old Macs was implemented?
Ever heard of double indirection? And how difficult that is to debug? I
don't say that it was bad for what was available at that moment; it was
the only way to realise memory sharing. But it was a major PITA for both
application developers and users.
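
(For readers who never saw it: the double indirection in question is the classic Mac handle scheme, where an application holds a pointer to a master pointer so the memory manager can move or purge the underlying block; you had to lock a handle before caching a raw pointer to its contents. A toy illustration in C of the idea only, not the real Toolbox API, and with made-up names:)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Toy "handle": a pointer to a master pointer that the memory manager may
       rewrite whenever it moves the block to defragment the heap. */
    typedef char **Handle;

    static Handle toy_new_handle(size_t n)
    {
        Handle h = malloc(sizeof *h);
        if (h != NULL)
            *h = malloc(n);
        return h;
    }

    /* Simulate heap compaction: the block moves and the master pointer is
       updated, so any raw pointer cached across this call is now stale. */
    static void toy_compact(Handle h, size_t n)
    {
        char *moved = malloc(n);
        memcpy(moved, *h, n);
        free(*h);
        *h = moved;
    }

    int main(void)
    {
        Handle h = toy_new_handle(32);
        if (h == NULL || *h == NULL)
            return 1;
        strcpy(*h, "hello");

        char *raw = *h;            /* dangerous: caching the dereferenced pointer */
        toy_compact(h, 32);        /* the block moves; raw is now dangling */

        printf("via handle: %s\n", *h);    /* correct: always dereference twice */
        /* printf("%s\n", raw);               would be a use-after-free */

        (void)raw;
        free(*h);
        free(h);
        return 0;
    }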

Wouter van Ooijen

-- -------------------------------------------
Van Ooijen Technische Informatica: http://www.voti.nl
consultancy, development, PICmicro products
docent Hogeschool van Utrecht: http://www.voti.nl/hvu


2005\03\29@075219 by Bob Ammerman

> Afaik on Macs before OS X there was a strong division between system and
> user tasks. There was only one user task running at any one time, but
> several systems tasks could run concurrently. To have a server running one
> had to make it a system task (I'm probably using the wrong terminology for
> the Mac). Example: file sharing worked in parallel with whatever the user
> was doing with not visible side effects.
>
> Peter

AFAIK, and I have done a fair bit of Mac OS <X code, the only preemptive
multitasking on pre OS-X was provided by code that ran off the 'vertical
blanking interrupt'. Such code was strictly constrained in what it could do
(most system calls were off limits to it). So, unless file sharing were
driven off the VBI, it would have been constrained to cooperative
multitasking only.

It is interesting to note that there is a slider control in one of the
preference panels for the OS to choose between good application and good
file-sharing performance.

Bob Ammerman
RAm Systems


2005\03\29@075456 by Howard Winter

On Tue, 29 Mar 2005 01:04:00 -0700, Robert Rolf wrote:

> AmigaOS 4.0 remains small, fast and robust

...and continues to miss the marketing opportunity of
calling it "Amigos" !

At least when it was called AmigaDOS you could pronounce
it without a hiccup in the middle...

Cheers,


Howard Winter
St.Albans, England


2005\03\29@075935 by Howard Winter

Jake,

On Sun, 27 Mar 2005 23:57:20 +1000, Jake Anderson wrote:

> not to defend his business practices but I know of no other single person
> who has done more to help humanity.
> he is personally matching the entire UN funding on aids, that to me is
> pretty hard to argue with.

I shall argue with it - he has more money than the UN (and in fact more than quite a few countries!).  The
amount he gives away (tax-deductible of course) is a pittance compared to what he already has, and continues
to earn.  And notice that he doesn't exactly keep it a secret - it produces just the reaction you've given
above:  "He may be a carnivorous businessman but he does give to charity" is pretty good marketing!

Cheers,


Howard Winter
St.Albans, England


2005\03\29@080545 by olin_piclist

ThePicMan wrote:
> Virtual memory is the "modern" thing I hate most.

Yeah, I really hate how my program keeps running in 64Mb even though it
requires 70Mb of RAM.  And it really sucks how I don't have to carefully
calculate RAM usage to make sure that a program runs on a particular
machine.  The Atlas ruined all the fun.


*****************************************************************
Embed Inc, embedded system specialists in Littleton Massachusetts
(978) 742-9014, http://www.embedinc.com

2005\03\29@082730 by olin_piclist

William Chops Westfield wrote:
> The most important part is memory
> protection; i can't access another process' memory without explict
> permission...

That's orthogonal to virtual memory.  You could in theory have a virtual
memory system where all processes see the same address space without
inter-process memory access restrictions.


*****************************************************************
Embed Inc, embedded system specialists in Littleton Massachusetts
(978) 742-9014, http://www.embedinc.com

2005\03\29@091605 by Wouter van Ooijen

> > The most important part is memory
> > protection; i can't access another process' memory without explict
> > permission...
>
> That's orthogonal to virtual memory.  You could in theory
> have a virtual
> memory system where all processes see the same address space without
> inter-process memory access restrictions.

But you can also have a virtual memory system where each process is
granted exactly the amount of memory (or even less!) that is physically
available.

Wouter van Ooijen

-- -------------------------------------------
Van Ooijen Technische Informatica: http://www.voti.nl
consultancy, development, PICmicro products
docent Hogeschool van Utrecht: http://www.voti.nl/hvu


2005\03\29@092534 by ThePicMan

At 08.05 2005.03.29 -0500, you wrote:
>ThePicMan wrote:
>>Virtual memory is the "modern" thing I hate most.
>
>Yeah, I really hate how my program keeps running in 64Mb even though it
>requires 70Mb of RAM.  And it really sucks how I don't have to carefully
>calculate RAM usage to make sure that a program runs on a particular
>machine.  The Atlas ruined all the fun.

Watch out, you're running out of swap-file hard disk space! Panic!

2005\03\29@092541 by ThePicMan

At 03.03 2005.03.29 -0800, you wrote:

>On Mar 29, 2005, at 2:37 AM, ThePicMan wrote:
>
>>We mean two different things with "Cooperative Memory" here. I simply
>>mean that virtual memory must be an option for an application, and not
>>the rule. Virtual memory has promoted abusive allocation/use of memory,
>
>You're talking about on-demand page creation, and perhaps actual paging
>to disk. That's only a small part of what I consider to be lumped
>together into 'virtual memory.'  The most important part is memory
>protection; i can't access another process' memory without explict
>permission...

And that's a good thing, especially when there are means to get that
permission. :-)

But memory protection doesn't even require virtual memory, if by
the latter we mostly mean "paging".

Even the AMD/Intel P6+ generation CPUs (perhaps even some P5?) have
MTRRs (Memory Type Range Registers), implemented as MSRs, which (conceptually)
would allow memory protection (and more) even without paging.
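
(Whether a given x86 CPU implements MTRRs at all can be checked from user space with CPUID: leaf 1 reports the feature in EDX bit 12. A minimal sketch in C, assuming a GCC or Clang toolchain that provides <cpuid.h>; actually programming the MTRR MSRs needs ring 0 and is not shown here:)

    #include <stdio.h>
    #include <cpuid.h>

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;

        /* CPUID leaf 1: feature flags come back in EDX; bit 12 = MTRR support. */
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("CPUID leaf 1 not supported\n");
            return 1;
        }
        printf("MTRRs %s\n", (edx & (1u << 12)) ? "present" : "absent");
        return 0;
    }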

--
TPM

2005\03\29@150406 by Peter


On Tue, 29 Mar 2005, Bob Ammerman wrote:

>> Afaik on Macs before OS X there was a strong division between system and
>> user tasks. There was only one user task running at any one time, but
>> several systems tasks could run concurrently. To have a server running one
>> had to make it a system task (I'm probably using the wrong terminology for
>> the Mac). Example: file sharing worked in parallel with whatever the user
>> was doing with not visible side effects.
>>
>> Peter
>
> AFAIK, and I have done a fair bit of Mac OS <X code, the only preemptive
> multitasking on pre OS-X was provided by code that ran off the 'vertical
> blanking interrupt'. Such code was strictly constrained in what it could do
> (most system calls were off limits to it). So, unless file sharing were
> driven off the VBI, it would have been constrained to cooperative
> multitasking only.
>
> It is interesting to note that there is a slider control in one of the
> preference panels for the OS to choose between good application and good
> file-sharing performance.

I think that the file sharing (at least some part of it) was driven by
the interrupt system in the OS, probably the ethernet card interrupt
source, together with the TCP/IP stack callbacks and a couple of other
things that run in the background on any internet- and multimedia-capable
machine. I did not *look* at the code, but the machines work that way (at
least iMacs above OS 9.x). I don't know how they do it, but they do it.

Peter

2005\03\29@150419 by Peter


On Tue, 29 Mar 2005, ThePicMan wrote:

> Virtual memory is the "modern" thing I hate most.
>
> Virtual memory is one of those things that should be handled
> cooperatively.. unlike cpu time (through multitasking), where
> a cooperative system would work bad (preemptive plus ways to
> forbid task-switching for a certain guaranteed time would be
> great).

And which part of those is missing from recent kernels? You can lock
pages into RAM and you can change the scheduler to whatever you want.
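
(On the page-locking half of that: on POSIX systems it is the mlock()/munlock() pair; Win32 has VirtualLock() with a per-process quota. A minimal sketch in C, assuming a POSIX system and an RLIMIT_MEMLOCK large enough to permit the lock:)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/mman.h>

    int main(void)
    {
        size_t page = (size_t)sysconf(_SC_PAGESIZE);
        size_t len  = 256 * page;              /* a page-aligned buffer, ~1MB on 4K pages */
        void  *buf;

        if (posix_memalign(&buf, page, len) != 0)
            return 1;

        /* Pin the buffer so the kernel will not page it out. */
        if (mlock(buf, len) != 0) {
            perror("mlock");                   /* often EPERM/ENOMEM if RLIMIT_MEMLOCK is small */
            free(buf);
            return 1;
        }

        memset(buf, 0, len);                   /* use the locked memory */

        munlock(buf, len);
        free(buf);
        return 0;
    }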

Virtual memory is the best thing since the electric bulb was invented. It
allows the programmer to painlessly write code that uses huge memory
allocations without any problems, even if the physical memory is a
hundredth of what he will use. It also provides the isolation between
tasks that preemptive timesharing requires, and makes systems robust (a
crashing task won't take the system down).

Peter

2005\03\29@150427 by Peter


On Tue, 29 Mar 2005, Gerhard Fiedler wrote:

{Quote hidden}

No, imho it sounds very much like multitasking focused on user
experience. The Windows scheduler does something very similar afaik: it
pushes the foreground task's priority way up so the response to user input
is snappier. Background tasks (services etc.) are not affected by this
(noticed how sluggish Windows is when it is doing something in the
background via a service? Linux is not like that at all; you have to
push the system load beyond 1.5 to see any slowdown, and it is still
usable).

The default Linux scheduler is more democratic and splits CPU
time evenly across all tasks (including servers) unless told otherwise
(in the simplest instance by using renice as root for the task to be bumped
up). Imho they (M$) had to do this because the process table of a
default XP installation looks like the only thing missing from it is a
pink elephant. If all those processes competed democratically for
CPU, then a 3GHz Pentium would feel like a 150MHz CPU to the user (in
fact it *does* feel like that under certain circumstances).

On a normal Linux desktop machine with DNS, web server, GUI and network
services running, plus several open consoles and applications running in
them, the process count is around 60, but the individual processes are
mostly swapped out and take less than 0.1% CPU each (run 'top' to see the
relevant data). The process table of a bare XP installation with no servers
on, but an antivirus installed, outnumbers this afaik. I have limited
experience with the Mac, but the way Mac OS (9.x) was made, made a 400MHz
iMac feel like a 1.5GHz Windows machine.
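
(The renice mentioned above is just the command-line front end to the setpriority() call; a minimal sketch in C, assuming a POSIX system, with root needed only if you want a negative nice value:)

    #include <stdio.h>
    #include <sys/resource.h>
    #include <sys/time.h>
    #include <unistd.h>

    int main(void)
    {
        pid_t pid = getpid();

        /* Be "nicer": nice values run from -20 (highest priority) to 19 (lowest);
           raising priority (negative values) normally requires root. */
        if (setpriority(PRIO_PROCESS, (id_t)pid, 10) != 0) {
            perror("setpriority");
            return 1;
        }
        printf("process %ld now runs at nice 10\n", (long)pid);
        return 0;
    }

(The shell equivalent for an already-running task would be something like "renice -n 10 -p <pid>".)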

Peter

2005\03\29@154237 by Bradley Ferguson

On Tue, 29 Mar 2005 11:37:55 +0100, ThePicMan <TakeThisOuTthepicmanEraseMEspamspam_OUTinfinito.it> wrote:
{Quote hidden}

That reminds me of my use of MacOS up to about System 7.3 or so.
Virtual memory was off by default and I never turned it on on any of
the machines I had because it slowed everything down so much.  You
could set the minimum and maximum RAM usage for each application and I
would tailor that to my needs.  That careful use of the Mac is still
with me today when I use a PC.  I am fairly careful to avoid opening
and closing applications out of order to avoid memory fragmentation.
That was more important in Windows 95 than it is today with improved
virtual memory handlers.  It would, at times, be nice to have more
control over that functionality, though.

As to stability, I recall setting up a relative's computer to use a
ram disk for the Netscape Navigator cache because it would crash so
often and always damage the file system.  Once I did that, there was
no appreciable difference in the experience, but the crashes wouldn't
hose the file system.  That was a Performa series, IIRC.

Bradley

2005\03\29@204349 by Jake Anderson

If he had no money he would have none to give.

OS wars seem fairly trivial when compared to AIDS or starvation.

That said, I am trying to move over to Linux,
but dangit, Windows is just so much easier (often).

> {Original Message removed}

2005\03\30@024940 by ThePicMan

At 20.27 2005.03.29 +0200, you wrote:

>On Tue, 29 Mar 2005, ThePicMan wrote:
>
>>Virtual memory is the "modern" thing I hate most.
>>
>>Virtual memory is one of those things that should be handled
>>cooperatively.. unlike cpu time (through multitasking), where
>>a cooperative system would work bad (preemptive plus ways to
>>forbid task-switching for a certain guaranteed time would be
>>great).
>
>And which part of those is missing from recent kernels ? You can lock pages into ram and you can change the sheduler to whatever you want.

In Windows you can (legally, without hacks) lock pages into RAM? NO you can't.

And there's nothing that will guarantee that you won't be interrupted for a specified timeframe (which, of course, even in an "ideal OS" from my point of view, should have a time limit anyway).


>Virtual memory is the best thing since electric bulbs were invented. It allows the programmer to painlessly write code that uses huge memory allocations without any problems,

"any problems"? You call "any problems" the slowness of a Windows system all the time swapping to disk? I don't.

Virtual memory has created a class of irresponsible programmers on Win32.


>even if the physical memory is a hundredth of what he will use.

Sure.. and hard disk accesses are in the nanosecond range. *g*


>It also allows proper isolation between tasks required for preemptive timesharing

You can get perfect isolation even without having to swap inefficient programs to hard disk all the time, and you can get perfect isolation even with a shared address space, for that matter (not that I necessarily want it in 32-bit systems).


> and makes systems robust (a crashing task won't take the system down).

Ehmm... but a crashing OS will take ALL tasks down. *g*

Anyway, having protected memory (a good thing) has nothing to do with "virtual memory" in the meaning we're using it (a swap file).



>Peter
>-

2005\03\30@055642 by Gerhard Fiedler

Peter wrote:

> No, imho it sounds very much like multitasking focused on user
> experience.

(Snipped lots of stuff about Linux vs. Windows. Just as a reminder: my
point was not a comparison of Windows with all other systems out there. It
was that pre-OS X Macs didn't seem to be able to run applications
concurrently effectively, in the sense that there weren't really any server
apps to speak of for pre-OS X Macs. And that tasks like compiling or
rendering in the background don't seem to have been known to Mac people
before OS X.)

To make this story short... show me a pre-OS X Mac with a web server
running PHP and MySQL or something similar out there (I should be able to
connect to one, shouldn't I? Lots of Win2k servers I can connect to) and
I'll call myself corrected :)  

Gerhard

2005\03\30@080433 by olin_piclist

face picon face
Jake Anderson wrote:
> OS wars seem fairly trivial when compared to AIDS or starvation.

Not when you've got a computer and food in front of you, and aren't infected
with aids.


*****************************************************************
Embed Inc, embedded system specialists in Littleton Massachusetts
(978) 742-9014, http://www.embedinc.com

2005\03\30@132427 by William Chops Westfield

On Mar 30, 2005, at 5:04 AM, Olin Lathrop wrote:

>> OS wars seem fairly trivial when compared to AIDS or starvation.

A true technophile ought to be confident that lots of the major
problems of the world could be solved by appropriate application
of technology, if only it could be PROPERLY applied.  Instead,
we get spam, porn, and Microsoft.

(Consider your used computer, like my 90MHz Pentium.  You'd think
that there would be lots of people who could do something
useful with it; have their lives improved.  However, thanks to
the amount of churn in technology and software, the only people
likely to be able to do anything with it are geeks who already
have more computers than they know what to do with...)

BillW

2005\03\30@151516 by Peter



On Wed, 30 Mar 2005, Gerhard Fiedler wrote:

{Quote hidden}

Ok, you win. But you can connect to an iMac over ethernet and ping it
and share files while the user is using it.

Peter

2005\03\31@023306 by ThePicMan

At 08.04 2005.03.30 -0500, you wrote:
>Jake Anderson wrote:
>>OS wars seem fairly trivial when compared to AIDS or starvation.
>
>Not when you've got a computer and food in front of you, and aren't infected
>with aids.

LOL =)

2005\03\31@034725 by Howard Winter

Bob,

On Sun, 27 Mar 2005 07:02:53 -0500, Bob Ammerman wrote:

> A large suite of DOS applications that I developed starting in 1984 is still
> being used today by one of my customers. It is a very critical part of their
> business. I have several times offered them the option of upgrading
> them to a Windows environment, but they are happy with them and see no reason to
> spend the $$. However, they have had an in-house development project going on
> since 1999 to build a replacement for my system, but delivery of that system
> is now 4+ years late!

Ah, I know that feeling!  I've had two systems that I've designed and built (with a team working for me) which
were later (for political reasons) said to be "old technology" and projects to replace them were put in place.  
Estimates of 6 months to implement the system using "the latest technology" (how we laughed!).  I know that
one had failed to be implemented after 5 years; I never did hear what happened to the other (I moved on), but
it was at least 18 months late, if it ever did go live.  Both of these were line-of-business systems,
basically running the companies that used them, and were designed for exactly the way they work.  I think the
problem is that some people in our business get too tangled in the technology, and forget that the requirement  
is to do what the users want, not to get clever with the technology!  Or worse, to try to make the problem fit
the solution that they like...

...for example one system (distributed around the country in a number of branches) used ISDN for comms,
dialing between the centre and each branch twice a day to swap data.  The Brave New Solution proposed by the
new people used a central database (I forget which, but one of the big names) so had to have leased lines
installed to each location, and each transaction had to refer back to the central database.  This was before ADSL so
we're talking a *lot* of money!  The users didn't need live data for each location, just for their own, so
having a central database rather than one at each site wasn't needed, but they had to pay for it anyway
because the new IT people wanted to use the solution they had used before.  I don't know if it ever worked,
and I know it would have been slower than the system I put in.

It sometimes disgusts me the way some IT people seem to see their job as using the technology, rather than
solving the problem!   (/rant)

Cheers,


Howard Winter
St.Albans, England


2005\03\31@111024 by Bradley Ferguson

On Thu, 31 Mar 2005 09:47:17 +0100 (BST), Howard Winter
<RemoveMEHDRWspamTakeThisOuTh2org.demon.co.uk> wrote:
{Quote hidden}

How about the local restaurant that replaced their completely usable
proprietary system (which they picked up used) with a system based off
of... wait for it... Windows 98.  And poorly implemented to boot.
Windows 98, surely operating as designed, would crash 4 or 5 times per
day--usually when load was high.  In addition, the installers had one
of the order entry stations plugged in under a table where customers
would kick the plug out of the wall several times a day.  All of this
because the owner's son, who figured himself tech-savvy, was duped by
the company hawking their state-of-the-art/oooh-it-uses-Windows-98
system.  I also seem to recall that you have/had to navigate 5 menu
levels to put onions on a hamburger--this being a burger and malt
restaurant where 80% of the burgers have onions.  Of course, they
talked to no one who actually worked in the restaurant when developing
the "business logic."  I find this to be a very common problem and not
limited to IT.  Specifications are king.

Bradley


'[OT]: that filesystem royalty problem again'
2005\04\01@013040 by Peter


On Wed, 30 Mar 2005, William Chops Westfield wrote:

> (Consider your used computer, like my 90MHz Pentium.  You'd think
> that there would be lots of people who could do something
> useful with it; have their lives improved.  However, thanks to
> the amount of churn in technology and software, the only people
> likely to be able to do anything with it are geeks who already
> have more computers than they know what to do with...)

That is *so* true. Thanks for writing that.

Peter

2005\04\04@110027 by Hazelwood Lyle

>And the  Amiga OS will soon be available running on power PC.
>http://www.amiga.com/
>http://www.amiga.com/amigaos/
>"Now scheduled for commercial release in early 2005,
>AmigaOS 4.0 moves the Amiga Operating System to the
>modern Power PC chips enabling the easy use of
>off-the-shelf components from third party vendors.
>True to its heritage, AmigaOS 4.0 remains small, fast and robust."

Yes, it is. While not yet in "full release", OS4 on the new AmigaOne
hardware is running nicely for many hundreds of developers, beta testers,
and guinea pigs around the world.
It is such an engaging system to develop for that it took me a week to
reply to this message, as I've been busy with my Amiga instead. :-)

Lyle

2005\04\04@112042 by ThePicMan

At 10.59 2005.04.04 -0400, you wrote:
>>And the  Amiga OS will soon be available running on power PC.
>>http://www.amiga.com/
>>http://www.amiga.com/amigaos/
>>"Now scheduled for commercial release in early 2005,
>>AmigaOS 4.0 moves the Amiga Operating System to the
>>modern Power PC chips enabling the easy use of
>>off-the-shelf components from third party vendors.
>>True to its heritage, AmigaOS 4.0 remains small, fast and robust."
>
>Yes, it is. While not yet in "full release", OS4 on the new AmigaOne
>hardware is running nicely for many hundreds of developers, beta testers,
>and guinea pigs around the world.
>It is such an engaging system to develop for that it took me a week to
>reply to this message, as I've been busy with my Amiga instead. :-)

Please call it whatever you like (even "AmigaONE") but do not call it
"Amiga". Some people may get offended here.


>Lyle
>
>-

2005\04\05@073645 by Gerhard Fiedler

ThePicMan wrote:

> Please call it whatever you like (even "AmigaONE") but do not call it
> "Amiga". Some people may get offended here.

Who? and why?

Gerhard

2005\04\05@081434 by Hazelwood Lyle



>-----Original Message-----
>From: Gerhard Fiedler [listsEraseMEspam.....connectionbrazil.com]

>>ThePicMan wrote:

>> Please call it whatever you like (even "AmigaONE") but do not call it
>> "Amiga". Some people may get offended here.

>Who? and why?

The "Who" would be me. I was the original poster.
The "Why" probably has to do with the difficult history of the
Amiga computer line. It has changed hands many times in the last ten years,
and during the latter part of that time the Amiga camp has been split into
factions, with much dissent between them.

I find the alienation within the Amiga groups to be painful enough, as we
really weren't a thundering majority to begin with. :-) However, in the
interest of promoting peace, I will gladly refer to the computer I am using
now as an AmigaOne, thus differentiating it from the other brands of computers
that have followed the classic Amiga designs.

Of course I'm only assuming that these are the reasons behind the comment made.
If I'm incorrect or inaccurate, please let me know.

Lyle

2005\04\05@115144 by ThePicMan

At 08.13 2005.04.05 -0400, you wrote:
>
>
>>-----Original Message-----
>>From: Gerhard Fiedler [EraseMElistsspamconnectionbrazil.com]
>
>>>ThePicMan wrote:
>
>>> Please call it whatever you like (even "AmigaONE") but do not call it
>>> "Amiga". Some people may get offended here.
>
>>Who? and why?
>
>The "Who" would be me. I was the original poster.

And me. I was the one that wrote the above "Some people may get offended here".


>The "Why" probably has to do with the difficult history of the
>Amiga computer line. It has changed hands many times in the last ten years,
>and during the latter part of that time the Amiga camp has been split into
>factions, with much dissent between them.

In another life I was an Amiga hardware and software designer. It's not a
religious thing at all, but the Amiga that Jay Miner and all the other
engineers originally designed has nothing, I say nothing to do with this
lame new business improperly called "Amiga".

Please note that I've got nothing against a radical change of hardware. In
fact, if Ed Hepler's "Hombre" project (a totally new, redesigned Amiga computer
with completely different custom chips and an HP PA-RISC CPU) had taken off, I would
have loved that as a new "Amiga". I'm not tied to the past, I mean. But not
all new things can be accepted as a continuation.


>I find the alienation within the Amiga groups to be painful enough, as we
>really weren't a thundering majority to begin with. :-) However, in the
>interest of promoting peace, I will gladly refer to the computer I am using
>now as an AmigaOne, thus differentiating it from the other brands of computers
>that have followed the classic Amiga designs.

Thank you. ;)


>Of course I'm only assuming that these are the reasons behind the comment made.
>If I'm incorrect or inaccurate, please let me know.
>
>Lyle
>
>-

2005\04\06@071750 by Gerhard Fiedler

Hazelwood Lyle wrote:

>>Who? and why?
>
> The "Who" would be me. I was the original poster.

The question was meant to be who would be offended... :) But you answered
this, too.

> It has changed hands many times in the last ten years, and during the
> latter part of that time the Amiga camp has been split into factions,
> with much dissent between them.

I didn't know that. I wrongly assumed that such a small group would be
tightly held together by the common enemy, and that the internal wars would
only break out once the enemy is brought to its knees :)

Gerhard

2005\04\06@114824 by ThePicMan

At 08.17 2005.04.06 -0300, you wrote:
>
>> It has changed hands many times in the last ten years, and during the
>> latter part of that time the Amiga camp has been split into factions,
>> with much dissent between them.
>
>I didn't know that. I wrongly assumed that such a small group would be
>tightly held together by the common enemy, and that the internal wars would
>only break out once the enemy is brought to its knees :)

The Amiga is old, but not as old as Ancient Greece. ;)
