PICList Thread
'[PIC] using BREAK in 'C''
2009\07\01@122946 by Dave Tweed

Olin Lathrop wrote:
> Gerhard Fiedler wrote:
> > Turbo Pascal was very popular. At some point, it probably was the
> > most popular development environment on CP/M and MS-DOS. I think
> > the main reason why it didn't "take over" like C was the lack of
> > standardization and the proliferation of dialects.
>
> Neither of those reasons makes sense. Turbo Pascal was a single language
> that was well defined. There were no dialects.

That's the wrong argument to make. Turbo Pascal was only ever available
on two architectures: Z80 and x86 -- and, AFAIK, these weren't strictly
compatible with each other. If you wanted Pascal on any other platform,
you had to switch to a different vendor's dialect.

> I think the reasons C eventually dominated were because there were
> several free or low cost C compilers out there for a wide range of
> systems, it tagged along with the rise of Unix, and there were (even
> more than today) a large group of programmers lacking the maturity,
> discipline, and experience to see the advantages of a tightly typed
> language like Pascal.

There is also a body of "mature, disciplined and experienced" programmers
who see the limitations of a straightjacket language like Pascal. They get
an awful lot of useful programming done in other languages, including C.

> "Hacker" meant a different thing back then and was sort of an honorary
> title. Unfortunately it also included connotations of writing what we
> now call bad code, flagrant misuse of data typing, and using all manner
> of cutesy tricks that were a side effect of the language syntax. It was
> actually cool to write tiny maximally obfuscated programs. C is the
> perfect language for this, and I think this had some influence over the
> rise of C.

I seriously doubt that. Where's your objective evidence?

Sure, C programmers like to divert themselves with activities like the
"obfuscated C contest", but that hardly applies to the use of the language
in real applications.

> Eventually C got past the critical mass stage where you had to use it.
> I think we all agree that's where we are today. All my point is that
> while we may be forced to use C today, we should complain about it
> whenever possible. Changing the world won't be easy or quick, but it
> can't happen until we start to try.

The point that everyone has been making to you, Olin, is that no one is
forced to use C, not on any platform -- there's always an alternative
available.*

But the bigger point for you is that you need to do a better job of picking
your battles. Complaining to us here on the PIClist every time someone
*else* mentions C is not at all productive in the sense that you want, and
is completely counterproductive to everyone else who ends up participating
in the discussion -- and mildly counterproductive to all other readers who
have to skip over it. Annoying everyone here in this way does not help your
cause.

-- Dave Tweed

* Trivially true, of course, because you can always drop into assembly
 language.** Or get a different job.

** Except ... I have yet to find an easy way to write a nontrivial program
 in assembly for the PIC32. Microchip *really* wants you to program these
 chips in C!

2009\07\01@130534 by olin piclist

Dave Tweed wrote:
> There is also a body of "mature, disciplined and experienced"
> programmers who see the limitations of a straightjacket language like
> Pascal. They get an awful lot of useful programming done in other
> languages, including C.

So show me something reasonable and useful you can do in C that you can't in
Pascal (again, I'm using Pascal only as an example because I happen to know
it well).

> I seriously doubt that. Where's your objective evidence?
>
> Sure, C programmers like to divert themselves with activities like the
> "obfuscated C contest", but that hardly applies to the use of the
> language in real applications.

Unfortunately I think there is some spillover, especially in earlier days.
Of course I can't prove or disprove that, so it's my opinion only.  Back in
college (1974-1980), in certain cases it was a matter of pride how few lines
of code someone could use to perform some required task.  You might say they
would never do that except for such class assignments, but I'm not so sure.
These were not assignments where the program was the result to hand in, but
where you had to write a program to get a result.

> The point that everyone has been making to you, Olin, is that no one is
> forced to use C, not on any platform -- there's always an alternative
> available.*

Perhaps technically, but not in practice.  Compare the number of C compilers
offered by Microchip for their chips versus the non-C compilers they offer.
Look at the same statistics for third party vendors.  There are a few non-C
offerings out there, but they are by far in the minority.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\07\01@133754 by Vitaliy

Olin Lathrop wrote:
>> The point that everyone has been making to you, Olin, is that no one is
>> forced to use C, not on any platform -- there's always an alternative
>> available.*
>
> Perhaps technically, but not in practice.  Compare the number of C
> compilers offered by Microchip for their chips versus the non-C
> compilers they offer.  Look at the same statistics for third party
> vendors.  There are a few non-C offerings out there, but they are by
> far in the minority.

It would be interesting to get a compiler vendor's opinion on why there are
more C compilers. Is it simply because there's more demand for them, or
perhaps they're easier to write/port?

Don't we have a compiler vendor on this list?

Vitaliy

2009\07\01@145551 by Dave Tweed

Olin Lathrop wrote:
> Dave Tweed wrote:
> > There is also a body of "mature, disciplined and experienced"
> > programmers who see the limitations of a straightjacket language
> > like Pascal. They get an awful lot of useful programming done in
> > other languages, including C.
>
> So show me something reasonable and useful you can do in C that you
> can't in Pascal (again, I'm using Pascal only as an example because
> I happen to know it well).

That isn't what we're debating here. I'm sure that all useful Pascal
dialects are at some level Turing-complete.

I've used Pascal in the past (including Apollo Pascal), but I'm hardly
an expert. As I recall, the biggest difficulties were with I/O, where
you want to treat an arbitrary chunk of data as just a stream of bytes
or bits, possibly embedded in another data structure.

Marshalling data (and executable code) across an I/O interface is
nontrivial in any language, but strongly-typed languages make it
particularly tedious -- and arguably no less error-prone.

> > The point that everyone has been making to you, Olin, is that no
> > one is forced to use C, not on any platform -- there's always an
> > alternative available.*
>
> Perhaps technically, but not in practice. Compare the number of C
> compilers offered by Microchip for their chips versus the non-C compilers
> they offer. Look at the same statistics for third party vendors. There
> are a few non-C offerings out there, but they are by far in the minority.

So, you're saying that you're not unhappy because alternatives are not
available (they are), but unhappy because they're not as popular as C?
Geez, this discussion is more pointless than I thought!

-- Dave Tweed

2009\07\01@151849 by olin piclist

Dave Tweed wrote:
> Marshalling data (and executable code) across an I/O interface is
> nontrivial in any language, but strongly-typed languages make it
> particularly tedious -- and arguably no less error-prone.

That's why I/O is usually not type checked.  Unless it's higher-level I/O,
like lines of text, it's usually just a sequence of bytes in most languages
and OS calls.

> So, you're saying that you're not unhappy because alternatives are not
> available (they are), but unhappy because they're not as popular as C?
> Geez, this discussion is more pointless than I thought!

You said there's always an alternative, but there really isn't.  There are
very few non-C compilers for PICs, for example, but that doesn't help when
you have to link in with libraries defined with an existing C compiler's
calling conventions and read its definitions from .h files.  And none of
them are directly supported by Microchip.

So yes, there is an alternative or two, but only for small values of
"alternative".  Actually the only one I can think of off the top of my head
that supposedly is more than a toy is XCSB.  I know more work is being done
on JAL.  I don't know if it's far enough along to be production ready,
though.  I have never used either, so I don't know for sure.



2009\07\01@172814 by M.L.


On Wed, Jul 1, 2009 at 3:21 PM, Olin Lathrop wrote:
> So yes, there is an alternative or two, but only for small values of
> "alternative".  Actually the only one I can think of off the top of my head
> that supposedly is more than a toy is XCSB.  I know more work is being done
> on JAL.  I don't know if it's far enough along to be production ready,
> though.  I have never used either, so I don't know for sure.


There's at least one Pascal for PICs:
<http://www.mikroe.com/en/compilers/>

As I see it there are at least three issues with moving away from assembly
language in a real-time application: timing control, non-optimal compiled
code, and dependence on the compiler vendor.

-
Martin

2009\07\01@183723 by William \Chops\ Westfield


On Jul 1, 2009, at 10:07 AM, Olin Lathrop wrote:

> So show me something reasonable and useful you can do in C that you
> can't in Pascal (again, I'm using Pascal only as an example because I
> happen to know it well).

How about:
    /*
     * Our console uart is memory mapped
     */
    struct uart_type *console = (struct uart_type *)0x801000;

Please provide a pascal example that works with at least two different  
vendors' pascal compilers.  Of course, this is extremely "dangerous",  
but I think it's a fine example of exactly the sort of thing that  
caused less dangerous languages to be dismissed from consideration for  
"systems" programming.

BillW

2009\07\01@191813 by William \Chops\ Westfield


On Jul 1, 2009, at 9:29 AM, Dave Tweed wrote:

> Turbo Pascal was only ever available on two architectures: Z80 and x86

Turbo Pascal also ran on the 68k based Macs.

BillW


2009\07\01@192512 by Benjamin Grant

Can everyone seriously stop indulging Olin in his unpopular views of C? If
he doesn't see the purpose of using it, let him be. Who cares? You're never
going to win against him, and he loves having this conversation, so just
stop responding to it.

On Wed, Jul 1, 2009 at 5:37 PM, William "Chops" Westfield wrote:
2009\07\01@195805 by Tamas Rudnai

I don't think it's a win-or-lose conversation. I love these conversations
too, as many times some very important thoughts get pulled out of someone's
head that would never be mentioned otherwise.

Tamas


On Thu, Jul 2, 2009 at 12:12 AM, Benjamin Grant wrote:

2009\07\01@200019 by olin piclist

William "Chops" Westfield wrote:
>     /*
>      * Our console uart is memory mapped
>      */
>     struct uart_type *console = (struct uart_type *)0x801000;

I guess this defines CONSOLE as a pointer to a record of type UART_TYPE and
sets the pointer pointing to address 801000h?  A comment or two would help.

Anyway, to show the flavor, I created a whole program in Pascal that
pretends to use memory-mapped hardware registers, which I think was your
point.  I also wrote it more in the style and naming conventions I would
have used for such a thing:

program x;

const
 adr_uart = 16#801000;         {address of memory mapped UART registers}

type
 uart_p_t = ^uart_t;
 uart_t = record               {memory mapped UART hardware layout}
   stuff: integer32;
   morstuff: integer32;
   end;

var
 console_p: uart_p_t;          {points to console output HW registers}

begin
 console_p := uart_p_t(adr_uart); {point to the console output regs}
 console_p^.stuff := 27;       {send value to console}
 end.

I did a whole program so that I could actually build it to make sure I
didn't mess up something.  This program builds and runs on Windows.  Not
surprisingly, it results in a popup about the program performing an illegal
operation when run.  Since my Pascal is implemented as a source-to-source
translator and the output is for MSVC on Windows, this also created a
temporary C file:

extern void string_cmline_set (
 int,
 char *,
 char *);

/*****************************
**
**   Start of program X.
*/
#define adr_uart_k 8392704
typedef struct uart_t {
 int stuff;
 int morstuff;
 } uart_t;

typedef uart_t *uart_p_t;

__declspec(dllexport) int main (
   int argc,
   char * argv) {

 static uart_p_t console_p;

 static int int_1 = 8392704;
 /*
 **   Executable code for program X.
 */
 string_cmline_set (argc, argv, "x");
 console_p = *(void * *)&int_1;       /* 8392704 */
 console_p->stuff = 27;
 return 0;

 #undef adr_uart_k
 }

Ignore the STRING_CMLINE_SET stuff.  That is implicit initialization added
to the start of all top level programs in my environment on Windows.  The
__DECLSPEC is unique to the Windows target platform.  Also keep in mind that
the above C was written by a machine for a machine, so its appearance and
style may look a bit strange and awkward, and it should really be judged by
the machine code a reasonable C compiler would produce from it.

> Please provide a pascal example that works with at least two different
> vendors' pascal compilers.

No, you're missing the point again.  We're talking about computer science
concepts, not particular implementations.  The point is that what you want
to do is possible in a more tightly typed language like Pascal.  A single
instance is proof enough.  Anyone can install my Windows host build
environment and verify this for themselves if they think I'm cooking the
data somehow.

> but I think it's a fine example of exactly the sort of thing that
> caused less dangerous languages to be dismissed from consideration for
> "systems" programming.

I don't see why.  A variant of Pascal (call it Pascal-like if you prefer)
was used to write the Apollo Aegis operating system.  Someone else pointed
out that a similar language was used to write the original Apple OS.

People seem to forget that tight type checking only means that typing rules
are enforced unless you explicitly say otherwise, as I did in using the data
type UART_P_T as a "type transfer function" in the first executable
statement.  I could have also used UNIV_PTR instead, but that would have
turned off more type checking than necessary.  UNIV_PTR is roughly like a C
void*.



2009\07\01@203045 by Tamas Rudnai

On Wed, Jul 1, 2009 at 11:37 PM, William "Chops" Westfield wrote:

> How about:
>     /*
>      * Our console uart is memory mapped
>      */
>     struct uart_type *console = (struct uart_type *)0x801000;
>
> Please provide a pascal example that works with at least two different
> vendors' pascal compilers.  Of course, this is extremely "dangerous",


I do not know any Pascal other than Turbo Pascal, but with that you could
declare a variable right on top of any memory location you desire:

var myloc : byte absolute $891000;

you could do things like this:

reg.ax:=$0003;
intr($10,reg);

or

mem[$A000: y*320+x]:=c;

you have pointers and binary operators -- but you do not have bitfields or
printf format strings. Those two things, plus the C for loop, were about
all I liked better in C. What was hard to learn was why I couldn't just
define a specific index range for an array, and why the switch..case offers
so few possibilities in C. Also that there is no equivalent of the 'with'
statement in C, and that I could not declare functions inside other
functions. Actually I felt that C was a step backward after Turbo Pascal
and a pain to use, but that was the stream you had to swim in.

Tamas




2009\07\02@054316 by Alan B. Pearce

>So yes there is a alternative or two, but only for small values
>of "alternative".  Actually the only one I can think of off the
>top of my head that supposedly is more than a toy is XCSB.

Well, this does seem to be a reasonably comprehensive set of alternative
tools.

http://www.mikroe.com/en/compilers/

At least two more languages than Microchip provides ...

There are other providers of BASIC for PICs, in various forms, as well. I
haven't had a chance to 'play' with any of these alternative languages, so
I cannot comment on the quality of any of them.

I had the impression that JAL is a reasonably mature item as well, even
though you seem to write it off.

2009\07\02@062031 by Rob Hamerling


Hi Alan,

Alan B. Pearce wrote:

> I had the impression that JAL is a reasonably mature item as well, even
> though you seem to write it off.

Glad you bring it up. Jal (JalV2) is alive and well: not only is the
compiler itself still maintained and further developed, but there is
also a group of people building a set of libraries. Details:
compiler: http://www.casadeyork.com/jalv2
libraries: http://code.google.com/p/jallib

Regards, Rob.

--
Rob Hamerling, Vianen, NL (http://www.robh.nl/)

2009\07\02@073000 by Michael Rigby-Jones

Olin Lathrop wrote:
> William "Chops" Westfield wrote:
> > Please provide a pascal example that works with at least two different
> > vendors' pascal compilers.

I think you are missing Bill's point.  Can you take the code you have just
written and be sure it will compile and work on any arbitrary Pascal
compiler?  Would you really want to have a different implementation of this
code for each target you want to use it on?

When I last used Pascal (a long time ago, admittedly), I could find no way
of defining static variables within a function, so ended up using globals,
which is very ugly.  Has this situation changed?

2009\07\02@081327 by Alan B. Pearce

>I think you are missing Bill's point.  Can you take the code
>you have just written and be sure it will compile and work
>on any arbitrary Pascal compiler?  Would you really want to
>have a different implementation of this code for each
>target you want to use it on?

Have you looked at the conditionals inside Microchip code to cover
differences between C18 and C30/C32 ???

And these are compilers from a single source ...

2009\07\02@090114 by olin piclist

Michael Rigby-Jones wrote:
> I think you are missing Bill's point.  Can you take the code you have
> just written and be sure it will compile and work on any arbitrary
> Pascal compiler?  Would you really want to have a different
> implementation of this code for each target you want to use it on?

That would be a relevant question if we were choosing a compiler from
today's available choices.  I was only pointing out that C is a really bad
language, and used some examples based on Pascal to show there are other
ways.  I'm not advocating Pascal either, but its concepts are useful as
counterexamples to C, especially since I'm pretty familiar with a language
based on those concepts.

> When I last used Pascal (a long time ago, admittedly), I could find no
> way of defining static variables within a function, so ended up using
> globals, which is very ugly.  Has this situation changed?

There are several ways to define a static variable.  One is to put
variables at the module level, outside any function:

module xxx;

var
 ii: integer32;  {static variable with module scope}

procedure aaa;
begin
 end;

procedure bbb;
begin
 end;

Subroutines AAA and BBB can both reference the static variable II.  This is
useful when a subsystem needs private state that several of its routines
need to access.

Another way is to define the variable in a specifically named section:

procedure aaa;
var (mysection)
 ii: integer32;  {static variable with subroutine scope}
begin
 end;

Of course the choice of section name may be dependent on linker conventions
and definitions in other non-code files, such as the MPLINK control file on
PIC targets.



2009\07\02@090423 by Gerhard Fiedler

Olin Lathrop wrote:

William "Chops" Westfield wrote:
>> Please provide a pascal example that works with at least two
>> different vendors' pascal compilers.
>
> No, you're missing the point again.  We're talking about computer
> science concepts, not particular implementations.  The point is that
> what you want to do is possible in a more tightly typed language like
> Pascal.  

I'm not sure I follow you. I at least never doubted that almost
everything that can be done with C can be done with any reasonable
Pascal dialect. I did much of it with Turbo Pascal... so this is not
really the question. And it's not the question Bill asked.

(FWIW, I think the main thing that C has that's missing from even decent
Pascal dialects is the precompiler. This is a quite useful tool for many
things the normal language can't do efficiently, but of course you can
use 3rd party preprocessors to help with that.)

But you're talking about popularity, availability. This is not a
computer science concept, this is a market concept. And for any language
to beat C in popularity in the (small micro embedded) market, it's not
enough that it is "better" in terms of computer science; the market
doesn't follow computer science concepts.

This is the thing you seem to get confused. The reason why C is more
popular has nothing to do with computer science concepts, and the reason
why Pascal almost completely lost its edge (don't forget that it had its
shot, was almost there, then squandered it) has nothing to do with
computer science concepts either.

As long as you dismiss the popularity of C as a big conspiracy of
idiots, you'll fail to learn the lesson from it (and to learn the lesson
from the fall into oblivion of Pascal). It could show you what is needed
for a language to become popular in a variety of markets. Bill's
question about code that works with two vendors' compilers goes in this
direction (and the "two vendors" part /is/ essential), and obviously has
nothing to do with computer science -- just as today's popularities of C
and Pascal have nothing to do with computer science.

Gerhard

2009\07\02@092628 by olin piclist

Gerhard Fiedler wrote:
> I'm not sure I follow you. I at least never doubted that almost
> everything that can be done with C can be done with any reasonable
> Pascal dialect.

Others do seem to doubt it though.  One of the technical arguments that
several people have made here recently is that you can "do anything" in C,
implying you can't do it in other languages.

> (FWIW, I think the main thing that C has that's missing from even
> decent Pascal dialects is the precompiler.

I agree.  A preprocessor can be a powerful addition to any language.

> But you're talking about popularity, availability.

No, I'm specifically not.  I understand how C got where it is and that today
you have to use it regardless of technical merits.  The part I'm trying to
point out is that C sucks technically, because too many sheeple use C and
don't see a problem with that.  I'm not saying there is a handy solution
because I don't see one either.  But the realization that it would be nice
if there was one is something too many people haven't made yet, and that's
what I'm trying to get them to see.  There are still people in this world
that actually *like* C (and I don't mean in the business sense), as
evidenced by several of the responses here.

> This is the thing you seem to get confused. The reason why C is more
> popular has nothing to do with computer science concepts,

No, in fact I've been trying to point that out.



2009\07\02@172915 by Gerhard Fiedler

picon face
Olin Lathrop wrote:

>> But you're talking about popularity, availability.
>
> No, I'm specifically not.  I understand how C got where it is and
> that today you have to use it regardless of technical merits.  

Didn't you write earlier in this thread:

:: All I'm looking for is some outcry from the minority rest of us to
:: keep pointing out the faults in C and complain about wanting
:: something better. It's not easy to change such entrenched thinking,
:: but if we keep beating on it more and more people may slowly realize
:: that C is a really bad idea.


> The part I'm trying to point out is that C sucks technically, because
> too many sheeple use C and don't see a problem with that.  I'm not
> saying there is a handy solution because I don't see one either.

Well, in the absence of a better solution, it seems that C is the best
solution -- now, in some (many) settings. This is probably the reason
why most people use it, and it has nothing to do with them being
sheeple. (Well, I'm one of them, so it's kind of obvious that I would
think so :)


> But the realization that it would be nice if there was one is
> something too many people haven't made yet, and that's what I'm
> trying to get them to see.  There are still people in this world that
> actually *like* C (and I don't mean in the business sense), as
> evidenced by several of the responses here.

You may not believe it, but there are people who actually *like*
assembly :)  Not that different from liking C, IMO.


>> This is the thing you seem to get confused. The reason why C is more
>> popular has nothing to do with computer science concepts,
>
> No, in fact I've been trying to point that out.

Then we agree on that. IMO it's usually a mixture of long-term
availability, support for many targets, being well specified (with a
single spec), popularity, and other such criteria -- but not its merits
as a well-designed programming language, in the academic sense.

In a business setting the latter has some impact, but the other criteria
have much more. And those other criteria may make it a sound decision to
use C -- despite its shortcomings. The shortcomings of the alternatives
are even worse. And that's why C is not, all in all, "bad".

Gerhard

2009\07\02@190916 by Tamas Rudnai

Just installed FreePascal on my Linux box and it's great :-) It even has
the old Borland Pascal style IDE, in a text user interface with the very
same menu layout and keyboard mappings. I had some fun remembering the old
WordStar hotkeys, but it seems that after 10 years or even more I can still
remember ^KB & ^KK :-)

Anyway, my Pascal knowledge has faded a bit, so I had to refresh it, and I
made some tests of auto and static procedure/function variables just for
fun. Basically 'test' is a recursive procedure, so that the staticness of
variables can be tracked down easily. The idea is that an auto variable
keeps its value at the same recursion level, so when returning from the
higher level we can see whether the variable changed or not.

As you can see, the variables declared with 'const' are really static
(like the 'static' keyword in C) while the ones declared with 'var' are
'auto' variables (i.e., they sit on the stack). The interesting part is
that if you locate a 'var' on top of a 'const' then it becomes static, as
you have just told the compiler to place the variable in the static data
area instead of on the stack.

Here is the source of this test:

program test;
uses crt;

const
 globConst: Integer = 999;


procedure test ( par: Integer );
const
 v1: Integer = 0;  { it's a static variable }
var
 v2: Integer;      { it's an auto variable }
 v3: Integer absolute globConst;   { supposed to be an auto variable, but
                                     it binds to a global static instead }
begin
 v1 := par;
 v2 := par;
 v3 := par;

 write('  test1 = ', v1);
 write('  test2 = ', v2);
 writeln('  test3 = ', v3);
 writeln('--------------');

 if par < 5 then test( par + 1 );

 write('  test1_exit = ', v1);
 write('  test2_exit = ', v2);
 writeln('  test3_exit = ', v3);
 writeln('--------------');
end;


begin
 writeln('variable test');
 test(1);
end.

---RESULTS---

variable test
 test1 = 1  test2 = 1  test3 = 1
--------------
 test1 = 2  test2 = 2  test3 = 2
--------------
 test1 = 3  test2 = 3  test3 = 3
--------------
 test1 = 4  test2 = 4  test3 = 4
--------------
 test1 = 5  test2 = 5  test3 = 5
--------------
 test1_exit = 5  test2_exit = 5  test3_exit = 5
--------------
 test1_exit = 5  test2_exit = 4  test3_exit = 5
--------------
 test1_exit = 5  test2_exit = 3  test3_exit = 5
--------------
 test1_exit = 5  test2_exit = 2  test3_exit = 5
--------------
 test1_exit = 5  test2_exit = 1  test3_exit = 5
--------------

Tamas




2009\07\02@193327 by Harold Hallikainen

MANY years ago, I worked on a product designed with Turbo Pascal under
CP/M. The final hardware had the code running out of EPROM, variables in
RAM, and no OS. We made no OS calls and did system initialization (setting
up the stack pointer, etc.) before starting to execute our code (I think
it was at 0x100). This was all with a Z80. Worked pretty well. This
must've been 25 to 30 years ago...

Harold

--
FCC Rules Updated Daily at http://www.hallikainen.com - Advertising
opportunities available!

2009\07\02@200447 by Tamas Rudnai

Yes, I remember that even in DOS they were compiled to the COM file format
-- which is not even a format, just a 64K-minus-0x100 memory image on disk.
Before the code there was the PSP with the FCB and some other CP/M
heritage -- eh, good old times :-)
Tamas


On Fri, Jul 3, 2009 at 12:44 AM, Harold Hallikainen wrote:

2009\07\03@072616 by Gerhard Fiedler

Tamas Rudnai wrote:

> As you can see the variables declared with 'const' are really static
> (like the 'static' directive in C) while the ones with the 'var' are
> 'auto' variables (aka sits on the stack).

I find this at least not intuitive. I'd expect a const to be constant.

This thing with v3 is akin (in C) to using a pointer to non-const to point
to a const. Looks like a nice hole in the typing :)

Is this really the only way to create a static variable? Can't you have
a global (that is, outside of a procedure) var section?

Gerhard

2009\07\03@074715 by Isaac Marino Bavaresco

Gerhard Fiedler wrote:
> You may not believe it, but there are people who actually *like*
> assembly :)  Not that different from liking C, IMO.

I like assembly. I also like C and Pascal, but I'm way more productive
with C.

The main reason C is more productive than assembly is that with it
one needs to write *much* less code and worry about *much* fewer details
than with assembly. That leaves much more time to think about the real
problem. It is also easier to reuse code and to port it to different
architectures.

Pascal can generate code as efficient as C and the source is much
"prettier", but the number of architectures the code can run on is much
more restricted.

Regards,

Isaac


2009\07\03@080129 by olin piclist

Gerhard Fiedler wrote:
>> As you can see the variables declared with 'const' are really static
>> (like the 'static' directive in C) while the ones with the 'var' are
>> 'auto' variables (aka sits on the stack).
>
> I find this at least not intuitive. I'd expect a const to be constant.

In the Pascals I am familiar with CONST defines compile time symbolic
constants only.  Certainly my version works this way, although it often
realizes these constants as static read-only variables.

Note that Tamas' code doesn't prove how exactly the compiler interpreted the
CONST symbols.  Either interpretation would have resulted in the same
output.  For that matter his code doesn't even prove the constants are
static if they are implemented as values in memory locations.  Since you can
only read them and then you always read the same value, you can't tell
whether you are reading a different memory location each time or whether the
compiler is substituting the symbol's value on the fly.

The only way to tell would be to take the address of a symbolic constant.
If you get a compile error, then it's just a symbolic constant.  If it
works, you still don't know whether the compiler created a literal only
because you asked for the address.  You can pass a symbolic constant to a
subroutine's pass-by-reference parameter and see if the address changes
with nesting level.  If so, then these things aren't variables at all,
since the compiler is creating the argument anew on each call.

It's very tricky to tell the difference, which is also why you don't
need to care in most cases.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\07\03@140805 by Peter

C constants trace their ancestry back to when C had no types. Normally the
const is a kludge that is not enforced; in other words, it is a normal
memory storage location in which data is stored, and the 'const' attribute
exists only in the compiler. Writing code that tries to write to a const
variable (!) causes at most a compiler warning, and that can be overridden.
This is sometimes used to create code that ensures that any writes occur
only at controlled locations (generating specified, tolerated warnings on
compile), and nowhere else.

(Aside: this is an easy way to find out, while debugging code, whether
something is changing a variable that should not do so - obviously it will
not catch an access through a pointer with the const cast away, e.g.
ptr = (int *)(const int *)&x, but the cast itself should be caught by the
compiler. /aside)

The other implication is that the compiler 'knows' that a const will not
change and may use this to generate optimized or reordered code (even
optimizing the variable away entirely into a compile-time-only constant
symbol - I believe gcc -O3 does that). That can be prevented with certain
compiler flags or qualifiers around the section where the const variable is
declared. On embedded systems some kinds of const (ones located at a fixed
address with @, or linked into a segment mapped to EEPROM, etc.) can be
desirable, to get read-only values or non-volatile memory that is updated
by external factors. Here is an interesting discussion about const used
with volatile (!):

http://publications.gbdirect.co.uk/c_book/chapter8/const_and_volatile.html

C is fun ! <G>

Peter


2009\07\03@172646 by Gerhard Fiedler

Isaac Marino Bavaresco wrote:

> Gerhard Fiedler escreveu:
>> You may not believe it, but there are people who actually *like*
>> assembly :)  Not that different from liking C, IMO.
>
> I like assembly.

This wasn't meant as offensive to any assembly programmer :)

I know it can be fun -- it's just that I personally don't like it very
much... and there's this history of conversations between Olin and me.

Gerhard

2009\07\03@185950 by Gerhard Fiedler

Olin Lathrop wrote:

>>> As you can see the variables declared with 'const' are really static
>>> (like the 'static' directive in C) while the ones with the 'var'
>>> are 'auto' variables (aka sits on the stack).
>>
>> I find this at least not intuitive. I'd expect a const to be constant.
>
> In the Pascals I am familiar with CONST defines compile time symbolic
> constants only.  Certainly my version works this way, although it
> often realizes these constants as static read-only variables.
>
> Note that Tamas' code doesn't prove how exactly the compiler
> interpreted the CONST symbols.  Either interpretation would have
> resulted in the same output.  For that matter his code doesn't even
> prove the constants are static if they are implemented as values in
> memory locations.  Since you can only read them and then you always
> read the same value, you can't tell whether you are reading a
> different memory location each time or whether the compiler is
> substituting the symbol's value on the fly.

:: const
::   globConst: Integer = 999; {<<<}
::
:: procedure test ( par: Integer );
:: const
::   v1: Integer = 0;
:: var
::   v2: Integer;
::   v3: Integer absolute globConst; {<<<}
:: begin
::   v1 := par;
::   v2 := par;
::   v3 := par; {<<<}

It seems the compiler assigned a memory location to the constant
globConst, because v3 is pointed at that memory location. It also seems
that in the line where v3 is pointed at that memory location, the
compiler didn't really care that globConst was defined as a constant; it
allowed pointing a variable at it, which later was written to.

{Quote hidden}

This seems to assume that a constant remains constant, which in the
above isn't the case. He didn't try to access globConst under that name,
but if the "absolute" directive does what I think it does, its value
will have changed after the last cited line above.

As he later explains, constants defined like that aren't constants, more
like static variables in C. Bad naming; there's nothing constant about a
static variable. And the 'real' constants don't seem to have a type,
similar to preprocessor #defines in C. Not good either; in a strongly
typed language, constants should also have a type. Well... nothing's
perfect :)

Gerhard

2009\07\04@072643 by sergio masci



On Thu, 2 Jul 2009, Tamas Rudnai wrote:

{Quote hidden}

Actually I agree with Tamas.

I find that trying to discuss languages with most people these days tends
to boil down to: use C, use C++ or use Java. Not very productive if you're
trying to do something better.

Remember, the point of a programming language is to allow the user to
describe the problem so that the compiler can generate the best code to
solve that problem. It is not to micro manage the generation of an
executable such that the compiler becomes nothing more than a glorified
assembler.

Regards
Sergio Masci

2009\07\04@105331 by Gerhard Fiedler

sergio masci wrote:

> On Thu, 2 Jul 2009, Tamas Rudnai wrote:
>> I think it is not a win or loose conversation. I love these
>> conversations too as many times some very important thoughts pulled
>> out of someones head that would never mentioned otherwise.
>>
>
> Actually I agree with Tamas.
>
> I find that trying to discuss languages with most people these days
> tends to boil down to: use C, use C++ or use Java. Not very
> productive if you're trying to do something better.

I remember that when I first read about Ada, I thought that this would
be the future. It didn't become the future... Ada compilers are too
heavyweight, too complicated, too expensive, too "niche". There's
probably no way to implement one efficiently for a small 8-bit micro.

> Remember, the point of a programming language is to allow the user to
> describe the problem so that the compiler can generate the best code
> to solve that problem.

One thing that's missing in this statement is that it's not typically
about "the compiler can generate the best code". Typically it's foremost
about describing the problem in an efficient way -- for the programmer.
The compiler doesn't have to produce the best code possible; it has to
produce code that's good enough. But it has to make the programmer's job
easy; that is its main job.

It seems a certain part of the engineering population thinks that ladder
logic is an adequate way to describe their control problems, yet ladder
logic compilers are definitely unpopular with the micro crowd. There
seem to be groups that have different views of what is the easiest way
to describe a problem.

Another thing that's IMO missing in the statement above is that it's
not about "the" problem, it's about "a relevant subset of all
engineering problems". And here is where the "glorified assembler" (and
assembler) comes in, with the flexibility to address problems in
not-yet-thought-of domains.


> It is not to micro manage the generation of an executable such that
> the compiler becomes nothing more than a glorified assembler.

Hm... With "nothing more than a [...] assembler" you're probably
correct. OTOH, I don't know of any language in widespread use that is
"nothing more". (We could probably disagree on what "glorified" means :)

But if you substitute "nothing more" with "not much more", things are
not so simple anymore. It is a tricky balance between adaptable and
standardized, lightweight and powerful, flexible and supporting, almost
requiring, good practices, universal and highly optimized, allowing
solutions adequate for every domain and including yet unknown domains,
...


One thing I wonder is why we haven't seen a programming language on
silicon. Not a Basic interpreter in firmware, but something more
substantial and efficient, along the lines of the Java micros (that
never seem to take off). Is this because it produces faster code with
the same amount of silicon when a compiler optimizes assembly than when
microcode executes high-level constructs directly?

Gerhard

2009\07\04@132922 by sergio masci



On Sat, 4 Jul 2009, Gerhard Fiedler wrote:

{Quote hidden}

You forgot the "conformance" aspect. Ada is a registered trademark of the
US DoD. To write an Ada compiler, the vendor needs (needed?) to jump
through DoD hoops. The compiler needed to be certified, and anyone using
the compiler could not do DoD work if the certification had expired (they
had to be recertified every year) without special DoD exemption.

With all this BS going on, it's no wonder Ada never really took off.

{Quote hidden}

Yes, I can see where you are coming from. So what would you say to a
programmer who points at your generated code and laughingly says "I could
have done that in half the number of machine code instructions" using C?
If your C "successor" is going to stand any chance of competing with C,
it's got to generate code that's at least on a par with it.

> It seems a certain part of the engineering population thinks that ladder
> logic is an adequate way to describe their control problems, yet ladder
> logic compilers are definitely unpopular with the micro crowd. There
> seem to be groups that have different views of what is the easiest way
> to describe a problem.

Yes, but most people who like ladder logic are not expert programmers.

>
> Another thing that's IMO missing in the affirmation above is that it's
> not about "the" problem, it's about "a relevant subset of all
> engineering problems". And here is where the "glorified assembler" (and
> assembler) comes in, with the flexibility to address problems in not yet
> thought-of domains.

But most problems can be broken down into much smaller well-defined
processes such as search, sort, append, remove etc., which a language
should be able to apply to basic data types such as lists, sets, heaps,
arrays, strings etc. For any really complex not-yet-thought-of domains we
have libraries. I am so fed up with the notion that we need libraries for
everything, and that the language therefore needs to be so ridiculously
simple as to not understand types such as lists, strings, and dynamic
arrays, when these types are so common and well understood.

{Quote hidden}

I think if you look at a high level language like Java or even the
executable produced by something like a C++ compiler, you will find that
the native machine code being executed is not the problem - the problem is
all the dynamic memory management that is going on. Objects being created
and destroyed just so that trivial operations can be performed without
impacting on other objects.

Personally I think if you really want to optimise a processor to execute
high-level code more efficiently then it needs an evaluation stack (kind
of like FORTH) where you can say in one instruction "load the stack with
this sequence and evaluate it", then let the CPU orchestrate the memory
accesses to best optimise the fetching of values. You could even help it by
organising variables so that they are near each other in memory.
Two stacks would probably be best, one for manipulating data and one for
addresses. Often you don't need to recalculate an address if you can make
sure a side effect of the calculation is to put it somewhere where it can
be used again with minimal effort (e.g. do the calculation in an index
register and use a different index register for the next address if
possible).

Friendly Regards
Sergio Masci

2009\07\04@230342 by Gerhard Fiedler

sergio masci wrote:

> You forgot the "conformance" aspect. Ada is a registered trademark of
> the US DoD.

Yup, didn't think of that. Definitely a killer.


> Yes I can see where you are comming from. So what would you say to a
> programmer who points at your generated code and laughing says "I
> could have done that in half the number of machine code instructions"
> using C. If your C "successor" is going to stand any chance of
> competeing with C, it's got to generate code that's at least on a par
> with it.

If it's competing on its turf, then yes. If I can say "I have developed
the application in 10% of the time it would have taken to develop it in
C", then it depends. That's probably the reason why C (and assembly) is
not that popular with bigger systems, notwithstanding its popularity
with smaller systems.


>> It seems a certain part of the engineering population thinks that
>> ladder logic is an adequate way to describe their control problems,
>> yet ladder logic compilers are definitely unpopular with the micro
>> crowd. There seem to be groups that have different views of what is
>> the easiest way to describe a problem.
>
> Yes but most people that like ladder logic are not expert programmers.

Most people, and even most programmers, are not expert programmers (by
definition... :)

There is a place for languages that can be used by domain experts
efficiently, so that they don't need expert programmers to solve their
problems. What we do when we solve a problem with a microcontroller and
C (or assembly, or Pascal, or even XSBC :) is often a "hack", in the
absence of a better solution; the real solution would be something that
allows the domain expert to do it herself.


> But most problems can be broken down into much smaller well defined
> processes such as search, sort, append, remove etc. which a language
> should be able to apply to basic data types such as lists, sets,
> heaps, arrays, strings etc. For any really complex not yet thought-of
> domains we have libraries. I am so fed up with the notion that we
> need libraries for everything, that the language needs to be so
> rediculasly simply as to not be able to understand types such as
> lists, strings, dynamic array etc. when these types are so common and
> well understood.

I for one never understood this "war" between language and libraries. I
don't really care that much whether C++ has a decent array type built
into the language or whether there are the C-style arrays for backwards
compatibility and as a decent solution there is std::vector from the
standard library. The standard libraries are, at least with C and C++,
part of the language specification.

Then there are those cases where I still code my own containers, because
for the one or other requirement the standard containers don't fit my
bill. Probably no language built-in container can do it all; there are
too many different sets of requirements with different solutions.

You probably have a point with strings (in C and C++), but there is the
dreaded backwards compatibility that sooner or later hits everything
that is used for long enough, and so C is pretty much stuck with the
half-assed string implementation it has. In C++ there is std::string,
which works for many cases well enough (and again I don't really care
that much that this is from a library).


> I think if you look at a high level language like Java or even the
> executable produced by something like a C++ compiler, you will find
> that the native machine code being executed is not the problem - the
> problem is all the dynamic memory management that is going on.
> Objects being created and destroyed just so that trivial operations
> can be performed without impacting on other objects.

You're probably right. Heap management and garbage collection, pointer
and reference mechanics, support for polymorphism and virtual functions
-- all in microcode, directly executed, wouldn't that be nice? But by
judging from the

> Personally I think if you really want to optimise a processor to
> execute high level code more efficiently then it needs an evaluation
> stack (kind of like FORTH) [...]

FORTH again... shouldn't it have had more success? :)

Gerhard

2009\07\05@085046 by olin piclist

Gerhard Fiedler wrote:
>> Personally I think if you really want to optimise a processor to
>> execute high level code more efficiently then it needs an evaluation
>> stack (kind of like FORTH) [...]
>
> FORTH again... shouldn't it have had more success? :)

It had its chance and didn't emerge as a winner, for good reason I think.
Back in the early 1980s I thought Forth might be a reasonable solution for
field or customer programmability in some embedded systems.  I even wrote a
few Forth interpreters, one on a general purpose computer for testing and
investigating concepts, and one in a display controller box.  Forth in the
box did exactly what it was intended to do.  It took remarkably little
memory to hold programs for doing math on coordinates, dealing with input
devices, etc., and the speed was adequate.

Then I tried to write some more complex programs.  I got them to work and
they worked well, were fast enough, and took little memory, but were a real
pain to write.  It was far more difficult to write good Forth programs than
I had imagined.  The company finally decided, and I agreed, that Forth was
just too difficult even for expert programmers, that making it available in
the product would cost way more in support than any additional revenue
gained by having user programming capability.

One place Forth-like concepts survive today is in PostScript.  I think that
works because all the benefits of Forth are still there, but humans don't
write PostScript.  Forth is actually not that hard a language to compile to,
especially when you're not trying to do arbitrary computations and are just
describing the layout and other specifics of a document.

I did manually write some PostScript once to support a reasonably compact
image file format in EPS (encapsulated PostScript).  Most of the image file
was compressed data for the PostScript program.  The program would maximally
scale and center the image on the page and decompress and print it.  Once
again, that relatively simple program took much longer to write than it
would have in a more traditional language.  It did work very nicely though.
Our EPS image files were not much bigger than ordinary image files, and a
lot smaller than everyone else's EPS image files.  Apparently nobody else
back then thought of using the programmability of PostScript to decompress
image data stored in the same file.



2009\07\05@104924 by olin piclist

Gerhard Fiedler wrote:
> This seems to assume that a constant remains constant, which in the
> above isn't the case. He didn't try to access globConst under that name,
> but if the "absolute" directive does what I think it does, its value
> will have changed after the last cited line above.

I don't know what "absolute" is supposed to do, but V3 looks like it's
being defined as a variable and only initialized with the constant value,
which is very different from being a constant such as globConst.

> And the 'real' constants don't seem to have a type, similar to
> the preprocessor #defines in C. Not good either; in a strongly typed
> language, constants should also have a type. Well... nothing's perfect
> :)

I don't know the definition of his language, but in my Pascal CONSTs
certainly have a type.  They have whatever type the constant expression
resolves to.  For example:

const
 a = 'z';
 b = 27;
 c = 3.14;

A will have a type of character, B of integer, and C of floating point.  The
exact format or size of integer and floating point doesn't matter since they
are symbolic constants only and have no inherent realization.  The compiler
can of course create literals for them when needed, but there is no
guarantee that the same memory location will be used for the literal each
time.  The constant value could even be encoded in the immediate value of
an instruction.



2009\07\05@130157 by sergio masci



On Sun, 5 Jul 2009, Gerhard Fiedler wrote:

{Quote hidden}

Yes, but the problem is: are you prepared to take the risk and try a new
programming language just to discover its strengths and weaknesses, to see
if it really will save you a great deal of time in both development and
maintenance? And if you are, is your boss?

It's all well and good reading about a language or paradigm that will save
you tons of money and time, but having read all these over-enthusiastic
claims in the past and found them to be false (if not downright lies),
you tend to treat new claims (that you "hear" about) with a good deal of
scepticism.

{Quote hidden}

Ok, ok, back to the 99% :-)

Let me change that then to: most people who make a living as
professional programmers.

>
> There is a place for languages that can be used by domain experts
> efficiently, so that they don't need expert programmers to solve their
> problems. What we do when we solve a problem with a microcontroller and
> C (or assembly, or Pascal, or even XSBC :) is often a "hack", in the
> absence of a better solution; the real solution would be something that
> allows the domain expert to do it herself.
>

I agree, which is why I developed the meta-CASE tool IPAD-Pro, with which
the domain expert, together with an expert programmer, puts together a
system whereby the domain expert can draw a diagram of his/her
requirements in a way that is natural to their domain. The tool then
generates code according to the diagram. It even lets code be simulated on
the diagram, so that the diagram, code and requirements can be debugged
together.

BTW I guess by XSBC you actually mean XCSB :-)

{Quote hidden}

Yes this point seems to elude most programmers.

Think of it like this: when the compiler sees a sequence of language
keywords, it can analyse the sequence to look for the "meaning" of the
sequence rather than just the statements that the sequence is made of.

e.g.
this is a loop

       for (j=0; j<10; j++)
       {        arr[j] = 0;
       }

here the compiler can optimise the loop in all kinds of ways.
* It knows that j is not modified within the loop
* it knows that the loop is to be repeated 10 times
* it knows that each item in the array is referenced simply relative to j
* it knows that each item of the array is set to a value that is not
 dependent on the order in which the array is processed
* it can also scan ahead of the loop to see if subsequent statements are
 dependent on the value of j once the loop exits

The compiler can actually change this loop to the more optimal form

        for (j = 10; j != 0; j--)
        {        arr[j-1] = 0;
        }

which counts down to a compare with zero and could in turn generate a much
more optimised executable depending on the target CPU (think of Z80 or x86
with native load-and-repeat instructions). BTW this is one type of
optimisation that XCSB does.


Now if we introduce a function the above optimisations break
e.g.
       for (j=0; j<10; j++)
       {        arr[j] = sin(j);
       }

This is because the compiler doesn't really know anything about the
function. It can analyse the function and depending on how complex the
function is it might be able to optimise the loop a little but in the case
of a well optimised sine function optimisation becomes very limited.

If however the compiler "knows" about the sin function (because the
compiler writer has given it special attributes that the compiler can
use) it could optimise this loop by building a lookup table at compile
time as:
static double
        own_sin_tbl[] = {sin(0), sin(1), ...

       for (j=0; j<10; j++)
       {        arr[j] = own_sin_tbl[j];
       }

The important point here is not to focus on the sine function itself but
on the fact that a function is involved. Using functions (and this is
primarily what libraries provide) hinders the compiler's ability to
analyse the intent of the programmer, whereas having the compiler
understand more of the source allows it to better analyse that intent.

Going a bit further, if I wrote a simple search function such as

struct MY_STRUCT
       {        int        id;
               int        mask1, mask2;
       };

int search(int x, int cnt, const struct MY_STRUCT arr[])
{
       int        j;

       for (j=0; j<cnt; j++)
       {
               if (x == arr[j].id)
               {        return j;
               }
       }

       return -1;
}

then I could use this in C as:

static const struct MY_STRUCT predef_arr[] =
       {
               { 5, 0xfe, 0x11},
               { 7, 0xfc, 0x22},
               {11, 0xe1, 0x33},
       };

static const int predef_cnt = 3;

main()
{
...

       mask1 = 0xff;
       mask2 = 0x00;

       res = search(7, predef_cnt, predef_arr);

       if (res != -1)
       {
               mask1 = predef_arr[res].mask1;
               mask2 = predef_arr[res].mask2;
       }
...
}

now if search were actually part of the language as in:

       res = SEARCH ARRAY=predef_arr LENGTH=predef_cnt WITH KEY=id FOR 7;

then the compiler would be in a position to simply generate the
equivalent of:

main()
{
...

       mask1 = 0xfc;
       mask2 = 0x22;
...
}

Ok, so from all this you might get the impression that what I'm talking
about is purely optimisation related; it's not. It's about making it
easier for the programmer to write correct code and for the compiler to
understand the code and so catch mistakes at compile time - good
optimisation is actually a side effect.

In the above example LENGTH was given as predef_cnt, but this could easily
have been omitted, since the compiler can see the length of the array,
thus eliminating the need for predef_cnt (and the need to maintain it if
predef_arr actually changes). If the array were of integers, SEARCH
wouldn't even have needed a WITH KEY component :-)

Having built-in types such as LIST, STACK and STRING greatly enhances the
compiler's ability to track correct usage, and reduces source code size,
making it easier for the programmer to see the wood for the trees.

{Quote hidden}

???

>
> > Personally I think if you really want to optimise a processor to
> > execute high level code more efficiently then it needs an evaluation
> > stack (kind of like FORTH) [...]
>
> FORTH again... shouldn't it have had more success? :)

As Olin points out, large complex programs are very difficult to write and
maintain in FORTH. But I wasn't actually advocating FORTH itself, just the
evaluation stack.

Friendly Regards
Sergio Masci

2009\07\06@022741 by William \Chops\ Westfield


On Jul 5, 2009, at 1:09 PM, sergio masci wrote:

> Yes but the problem is: are you prepared to take the risk and try a  
> new programming language just to discover its strengths and  
> weaknesses, to see it really will save you a great deal of time in  
> both development and maintinance? And if you are, is your boss?

It's worse than that.  Since a big point of using C now is that it
allows code to be (mostly) portable between different CPUs, any would-be
replacement has to appear nearly simultaneously on at least a half-dozen
platforms to even begin to be taken seriously.  (This doesn't
reflect on Olin's point that C "should have been better."  It just
reflects the way things ARE, now.)

(Hmm.  I wonder if I can claim the reverse?  That languages that had
ANY standardization effort AFTER "C" became known should have paid
more attention to why C was gaining popularity?  Why didn't Pascal, AS
A STANDARDIZED LANGUAGE, pick up features that would have made it more
acceptable as a systems programming language?  People were still
writing operating-system code in assembly language well into the
1980s, right?  Was it VMS that was mostly in Bliss?  (Now there was
another language failure.))

BillW

2009\07\06@025830 by Tamas Rudnai

Maybe it is just the time people start to realize that few operators like
++, << or += that virtually makes C as a "system programming language" is
not everything.

Tamas



On Mon, Jul 6, 2009 at 7:27 AM, William "Chops" Westfield <KILLspamwestfwKILLspamspammac.com>wrote:

{Quote hidden}


2009\07\06@061128 by Alan B. Pearce

>> I remember that when I first read about Ada, I thought that
>> this would be the future. It didn't become the future... Ada
>> compilers are too heavyweight, too complicated, too expensive,
>> too "niche". Probably no way to implement a compiler for a
>> small 8bit micro efficiently.
>
>You forgot the "conformance" aspect. Ada is a registered trademark
>of the US DoD. To write an Ada compiler, the vendor needs (needed?)
>to jump through DoD hoops. The compiler needed to be certified and
>anyone using the compiler could not do DoD work if the certification
>had expired (they had to be recertified every year) without special DoD
> >excemption.
>
>With all this BS going on, it's no wonder Ada never really took off.

IIRC one of the things about producing an Ada compiler was that you could
not make a compiler that dealt with only a subset of the language. That is
what really killed it for small processors - who needed all that floating
point etc. functionality that would never actually be used?

>Yes but most people that like ladder logic are not expert programmers.

Doesn't this come down to a comment someone made earlier about C? The tool
for the job that does what is needed, and can deal with the problem the
programmer specifies? AIUI, situations where PLCs with ladder logic are used
tend to be ones where blinding speed isn't required anyway; it is
generally better to be sure this valve has closed before opening that one,
that type of stuff.

2009\07\06@081955 by M.L.

On Mon, Jul 6, 2009 at 2:58 AM, Tamas Rudnai <tamas.rudnai@gmail.com> wrote:
> Maybe it is just the time people start to realize that few operators like
> ++, << or += that virtually makes C as a "system programming language" is
> not everything.
>
> Tamas



Can you translate this for me?

--
Martin K.

2009\07\06@082039 by Gerhard Fiedler

sergio masci wrote:

>>> Yes I can see where you are coming from. So what would you say to a
>>> programmer who points at your generated code and laughingly says "I
>>> could have done that in half the number of machine code
>>> instructions" using C. If your C "successor" is going to stand any
>>> chance of competing with C, it's got to generate code that's at
>>> least on a par with it.
>>
>> If it's competing on its turf, then yes. If I can say "I have
>> developed the application in 10% of the time it would have taken to
>> develop it in C", then it depends. That's probably the reason why C
>> (and assembly) is not that popular with bigger systems,
>> notwithstanding its popularity with smaller systems.
>
> Yes but the problem is: are you prepared to take the risk and try a
> new programming language just to discover its strengths and
> weaknesses, to see if it really will save you a great deal of time in
> both development and maintenance? And if you are, is your boss?

People (and their bosses :) do it all the time. C# came out of (almost)
nowhere a few years ago. Ruby is quite recent, too. Both are reasonably
popular now. I'm sure you can find other examples.

> It's all well and good reading about a language or paradigm that will
> save you tons of money and time but having read all these over
> enthusiastic claims in the past and found them to be false (if not
> downright lies), you tend to treat new claims (that you "hear"
> about) with a good deal of skepticism.

Of course. This is what I've written about before: it's not so much
about the "computer science quality" of the language, but about a number
of other factors. I'm not sure what it is, but it's pretty obvious that
it's not the mere programming quality of a programming language that
makes it popular (or not).


{Quote hidden}

But still... they get work done. And apparently they (and their bosses
:) don't think that learning C (or whatever other programming language)
would help them get it done more efficiently -- or they would hire
programmers.


> BTW I guess by XSBC you actually mean XCSB :-)

Yes. You can send this to your marketing department -- I never can
remember which way it is :)


{Quote hidden}

Here you have a point, but I'd say that this code (as C code for a small
micro) is badly written. Really badly. arr[j] should be initialized with
compile-time constants, not at run-time. You can say that this is the
compiler's job, but the problem is, again, that there are so many
different ways to calculate sin(). The compiler doesn't know which one I
think is adequate for my purposes; do I need 2 digits precision, or 6,
or 12? The code size and/or execution time of the function varies
substantially with the required precision.


> The important point here is not to focus on the sine function itself
> but on the fact that a function is involved. Using functions (and
> this is primarily what libraries use) hinders the compiler's ability
> to analyse the intent of the programmer. Whereas having the compiler
> understand more of the source allows it to better analyse the intent.

Yes, I get that. But the compiler can't know a few things about my
requirements, and I may be better able to choose which function
implementation is more adequate.

Regarding the intent, a good definition of a library function (think the
C++ standard library) explains quite clearly the intent. It doesn't nail
down the implementation details of the functions, though. C doesn't do
it, but there's nothing in the above that would prevent a compiler from
actually calling the library function at compile time to calculate that
array. It doesn't have to be built in for that; it just has to know
which library to use (at compile time).


{Quote hidden}

This doesn't require that SEARCH is part of the compiler; it merely
requires that the semantics of SEARCH are well-defined. It could just as
well be a library function with semantics that are specified so that the
compiler can know what to do.

But again, like above with the sin() example, code like this should
actually be compile-time constants, not run-time calculations. The
program writer should clearly distinguish between values that are known
before the program runs and values that are dependent on run-time
events. The examples you showed are only relevant if the programmer
doesn't do this.


> Ok so from all this you might get the impression that what I'm talking
> about is purely optimisation related, it's not. It's about making it
> easier for the programmer to write correct code and the compiler to
> understand the code and so catch mistakes at compile time - good
> optimisation is actually a side effect.

I get this, but I think a library with clearly defined semantics goes a
long ways towards that. And a programmer who clearly distinguishes
between stuff that's known at compile time and stuff that can only be
known when the program runs also helps.


> Having built-in types such as LIST, STACK, STRING greatly enhance the
> compilers ability to track correct usage and reduce source code size
> making it easier for the programmer to see the wood for the trees.

Right, but again... The C++ standard library, for example, has quite
good implementations of standard containers, yet I find myself creating
my own containers or using different containers from other libraries,
for specific requirements that the standard containers don't satisfy. A
list or stack is a concept, with many different possible implementations
that satisfy different requirements. And I suspect that the smaller the
micro it runs on, the more limited the resources are, the more important
are the differences between different implementations.

Gerhard

2009\07\06@083452 by olin piclist

William "Chops" Westfield wrote:
> (Hmm.  I wonder if I can claim the reverse?  That languages that had
> ANY standardization effort AFTER "C" became known should have paid
> more attention to why C was gaining popularity?  Why didn't Pascal, AS
> A STANDARDIZED LANGUAGE, pick up features that would have made it more
> acceptable as a systems programming language?

First, Pascal didn't come after C; it preceded it.  However, that wouldn't
have prevented others from trying to standardize their variant later.  I
think the reason is that this wasn't how things were done and how people
were thinking back then.  Remember, this was back when every computer
manufacturer had their own operating system and unique CPU.  About all you
could count on was that a Fortran compiler was available, with everything
else pretty much proprietary.  And even then, every vendor had their
proprietary extensions to Fortran to give themselves a competitive
advantage and hopefully lock in their customers a bit.  Those that
developed their own useful Pascal variants saw them as intellectual
property that gave their platform a competitive advantage.  This was the
prevailing mindset until the late 1980s, when Sun overtook the workstation
market on the strength of openness.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\07\06@083706 by Gerhard Fiedler

Tamas Rudnai wrote:

> Maybe it is just the time people start to realize that few operators like
> ++, << or += that virtually makes C as a "system programming language" is
> not everything.

Or maybe it's time for some people to start realizing that the success
of C is based on a bit more than operators like ++ or << :)

I really don't think the language features (as in "computer science")
have much to do with its success. As long as people only look at the
language (programmer's view), they're missing the bigger picture -- the
part of it that makes or doesn't make a language successful.

C isn't successful in all areas. It probably was the first language for
server-side creation of dynamic content for web sites, but it has been
replaced by PHP, Perl, Java, Ruby, ASP.

Gerhard

2009\07\06@084628 by M.L.

On Mon, Jul 6, 2009 at 8:20 AM, Gerhard
Fiedler <lists@connectionbrazil.com> wrote:
> People (and their bosses :) do it all the time. C# came out of (almost)
> nowhere a few years ago. Ruby is quite recent, too. Both are reasonably
> popular now. I'm sure you can find other examples.


Perl, Python, Haskell, ...
Any of which can be compiled so should not be limited by being called
"scripting" languages.
--
Martin K.

2009\07\06@194117 by Gerhard Fiedler

Olin Lathrop wrote:

> William "Chops" Westfield wrote:
>> (Hmm.  I wonder if I can claim the reverse?  That languages that had
>> ANY standardization effort AFTER "C" became known should have paid
>> more attention to why C was gaining popularity?  Why didn't Pascal,
>> AS A STANDARDIZED LANGUAGE, pick up features that would have made it
>> more acceptable as a systems programming language?
>
> First, Pascal didn't come after C, it preceeded it.  However, that
> wouldn't have prevented others from trying to standardize their
> variant later.  I think the reason is that this wasn't how things
> were done and how people were thinking back then.  Remember, this was
> back when every computer manufacturer had their own operating system
> and unique CPU.  

ANSI started standardizing C in 1983. It became an ISO
standard in 1990.

> Those that developed their own useful Pascal variants saw them like
> intellectual property that gave their platform a competitive
> advantage.  

Looking at how things went, it seems like this was a bad choice. Maybe
K+R et al weren't that incapable after all...

> This was the prevailing mindset until the late 1980s when Sun overtook
> the workstation market on the strength of openess.

No, it wasn't. For some people, maybe, but there was an ANSI standard
for FORTRAN that dates back to 1965. (This and the backing of IBM were
probably two of the major factors for the popularity of FORTRAN.) If
ANSI took up standardization efforts for C in 1983, there was a push to
do this that started way before 1983. Something like this doesn't happen
out of the blue. So the mindset to standardize was there.

There is also an ISO standard for the original Pascal that dates back to
1983. But it seems that this standard wasn't usable; for a number of
reasons, everybody implemented their own versions. Which again is the
likely cause of the lack of popularity of each of them. From what it
seems, these Pascal dialects didn't only add features to the original
language, they also changed features and omitted others. This of course
doesn't help.

(BTW, you sound almost like the ones who try to defend C's
shortcomings as features in the language wars :)

Gerhard

2009\07\07@024211 by William \Chops\ Westfield


> I remember that when I first read about Ada, I thought that
> this would be the future. It didn't become the future... Ada
> compilers are too heavyweight, too complicated, too expensive,
> too "niche". Probably no way to implement a compiler for a
> small 8bit micro efficiently.

One of the problems is that compiler writers (perhaps more accurately:
"language designers") and the sort of low-level programmers who wrote
and propagated a language like C don't seem to like each other, or
even talk to each other very much.  So the language designers go off
and do something, and the device driver folk, whether they're working on
8-bit micros or mainframes, sorta shrug and go off and do their thing in
assembler, or invent their own language.

I don't find C's lack of stronger typechecking any more surprising
than Pascal's lack of boolean math, for example (no, sets are not the
same thing).

You see this sort of dichotomy all the time.  On the one side you have  
these expensive and huge languages (PL/1, Smalltalk, Ada, etc) with  
the CompSci PhD touting their strong type checking and semantic  
elegance and blah blah blah.  On the other side you have dangerous,  
simplistic, and hard to read languages (Assembler, C, Forth) with some  
hardcore developers mumbling about speed, power, fewer keystrokes.  
Sigh.

BillW

2009\07\07@082336 by olin piclist

Gerhard Fiedler wrote:
>> Those that developed their own useful Pascal variants saw them like
>> intellectual property that gave their platform a competitive
>> advantage.
>
> Looking at how things went, it seems like this was a bad choice. Maybe
> K+R et al weren't that incapable after all...

K+R were in a research setting where they didn't have to care about profit
or competitive advantage.  They had nothing to lose by letting everyone at
it, so you can't say whether they did it through deliberate thought or just
lucked into it.  I suspect the latter.

>> This was the prevailing mindset until the late 1980s when Sun
>> overtook the workstation market on the strength of openess.
>
> No, it wasn't. For some people, maybe, but there was an ANSI standard
> for FORTRAN that dates back to 1965. (This and the backing of IBM were
> probably two of the major factors for the popularity of FORTRAN.) If
> ANSI took up standardization efforts for C in 1983, there was a push
> to do this that started way before 1983. Something like this doesn't
> happen out of the blue. So the mindset to standardize was there.

Of course there were standards, but companies opening their proprietary
operating systems and large programs like compilers was certainly not the
prevailing way things were done.

> There is also an ISO standard for the original Pascal that dates back
> to 1983. But it seems that this standard wasn't usable; for a number
> of reasons, everybody implemented their own versions.

I'm guessing that didn't work because it was too late.  By 1983 a number of
proprietary Pascal versions were already well developed.  A lot of code
existed in each version, so I'm guessing nobody wanted to have their code
orphaned.  Sometimes standards are set up as smoke screens.  Remember OSF?



2009\07\07@091711 by olin piclist

William "Chops" Westfield wrote:
> I don't find C's lack of stronger typechecking any more surprising
> than Pascals lack of boolean math, for example (no, sets are not the
> same thing.)

I thought even the original teaching language had the boolean type and
boolean operators like AND, OR, and NOT.  Maybe they left out XOR because
that's implemented as an intrinsic function in Apollo Pascal instead of an
operator.  There aren't that many ways to combine 2 bits.  I'm not sure what
more you want.



2009\07\08@093939 by Gerhard Fiedler

Olin Lathrop wrote:

> K+R were in a research setting where they didn't have to care about
> profit or competitive advantage.  They had nothing to lose by
> letting everyone at it,

They were in a commercial venue, porting Unix from the PDP-7 to the PDP-11.
This is the reason why they created C, as a sort of portable assembler.
(This wasn't a research project, it was commercial -- they wanted to
sell the PDP-11 :) Wirth created Pascal at the ETH in Zürich; not more
commercial than AT&T, it seems, perhaps less :)

I don't think there's a big difference in the balance between research and
commercial compiler writers for C and Pascal. I think it wasn't
until the early '80s that the first commercial Pascal compilers showed
up (Watcom, Apple, Borland). Until then, it seems, they were all created
by universities. And after that, there were just as many commercial C
compilers.

> Of course there were standards, but companies opening their
> proprietary operating systems and large programs like compilers was
> certainly not the prevailing way things were done.

Right. But IMO this hasn't much to do with C's popularity or Pascal's
lack thereof. I think the main reason is that there was, from the '80s
on, pretty much /one/ C but many (incompatible) Pascals. A compiler
writer didn't have a clear road to follow for compatibility. The
standard wasn't suitable and not popular anyway, and then there were the
many proprietary and diverging dialects.

>> There is also an ISO standard for the original Pascal that dates back
>> to 1983. But it seems that this standard wasn't usable; for a number
>> of reasons, everybody implemented their own versions.
>
> I'm guessing that didn't work because it was too late.  

You probably didn't read up on the history, or on the standard itself.
C wasn't standardized until later, so 1983 wasn't too late. The thing is
that there doesn't seem to have been a consensus about what makes a "good"
Pascal, so there wasn't /a/ Pascal, there were many Pascals -- all
incompatible with each other. The only consensus, it seems, was that the
standard wasn't good enough, so Pascal programmers never seem to have
cared about standard conformance. Which is different from C; for the
typical C programmer, standard conformance was always an issue. (First
the K+R "standard", then the C89 standard. C99 exists, but hasn't quite
arrived yet. Most compilers don't support it, so the unofficial standard
is still C89.)

> A lot of code existed in each version, so I'm guessing nobody wanted
> to have their code orphaned.  

If this was the case, it was a bad decision, it seems -- rather than
getting together, making some compromises, creating a common and usable
standard and living on, they wanted only their proprietary piece and died.
This may be a lesson...

Gerhard

2009\07\08@114236 by Tamas Rudnai

On Wed, Jul 8, 2009 at 2:39 PM, Gerhard Fiedler
<lists@connectionbrazil.com> wrote:

> Wirth created Pascal at the ETH in Zürich; not more
> commercial than AT&T, it seems, perhaps less :)


The original Pascal was only intended to teach the logic of structured
programming. It is just a coincidence that the language turned out to be
quite friendly, so with some improvements it could be used for commercial
programming as well. The incompatibility only came from these
trial-and-error efforts: everyone wanted to make their commercialized
implementation better than the others -- the same thing you can see now
with the C compilers for PIC: each of them differs where it can, in
pragmas, fuses, types, libraries etc. You need to put in some effort, but
you can still port your code to a C compiler other than the one it was
originally written for -- same as with Pascal... I can't see much of a
difference?

Tamas




{Quote hidden}


2009\07\08@162256 by William \Chops\ Westfield


On Jul 8, 2009, at 6:39 AM, Gerhard Fiedler wrote:
>> K+R were in a research setting where they didn't have to care about
>> profit or competitive advantage.  They had nothing to lose by
>> letting everyone at it,
>
> They were in a commercial venue, porting Unix from PDP-7 to PDP-11.  
> This is the reason why they created C, as a sort of a portable  
> assembler. (This wasn't a research project, it was commercial --  
> they wanted to sell the PDP-11 :)

K&R were at Bell Labs, weren't they?   While Bell Labs wasn't quite so  
bad as (say) Xerox PARC, they were pretty far away from being  
"commercial", especially in those days!  (Look at how long it took  
before Unix became a commercial offering.)

And "AT&T wanted to sell [DEC] PDP11s"??  Huh??


> Wirth created Pascal at the ETH in Zürich; not more commercial than  
> AT&T, it seems, perhaps less :)

Ah, you know those university profs.  Their "product" is publication  
("publish or perish"); especially nice because commercial viability is  
less of an issue...  :-)

BillW

2009\07\09@091556 by Gerhard Fiedler

William "Chops" Westfield wrote:

>>> K+R were in a research setting where they didn't have to care about
>>> profit or competitive advantage.  They had nothing to lose by
>>> letting everyone at it,
>>
>> They were in a commercial venue, porting Unix from PDP-7 to PDP-11.
>> This is the reason why they created C, as a sort of a portable
>> assembler. (This wasn't a research project, it was commercial --
>> they wanted to sell the PDP-11 :)
>
> K&R were at Bell Labs, weren't they? While Bell Labs wasn't quite so
> bad as (say) Xerox PARC, they were pretty far away from being
> "commercial", especially in those days! (Look at how long it took
> before Unix became a commercial offering.)

But didn't AT&T (owner of Bell Labs) sell Unix licenses? Not to the
general public, but still sell them?

> And "AT&T wanted to sell [DEC] PDP11s"??  Huh??

Right... got that mixed up. They wanted to sell Unix for those PDP-11s
-- or not?


>> Wirth created Pascal at the ETH in Zürich; not more commercial than  
>> AT&T, it seems, perhaps less :)
>
> Ah, you know those university profs.  Their "product" is publication
> ("publish or perish"); especially nice because commercial viability
> is less of an issue...  :-)

Exactly. They don't have to worry about commercial success, so they
could do it right. Which they apparently didn't, in this case :)

Gerhard

2009\07\09@092644 by Gerhard Fiedler

Tamas Rudnai wrote:

> The original Pascal was only intended to teach the logic of
> structured programming. It is just a coincidence that the language
> turned out to be quite friendly, so with some improvements it could
> be used for commercial programming as well. The incompatibility only
> came from these trial-and-error efforts: everyone wanted to make
> their commercialized implementation better than the others -- the
> same thing you can see now with the C compilers for PIC: each of
> them differs where it can, in pragmas, fuses, types, libraries etc.
> You need to put in some effort, but you can still port your code to
> a C compiler other than the one it was originally written for --
> same as with Pascal... I can't see much of a difference?

You need to look at the bigger picture. C didn't become popular because
of PICs. There are many C programs that compile (and run) on any
standard-compliant compiler, on quite different platforms. While
platform differences sometimes need some conditional parts in the code,
it is possible (and common) to write portable C code that compiles on
gcc, VC++ (in C mode) and a number of other compilers and runs on Linux,
Unix, Solaris, Windows, Mac OSX and others (and depending on the amount
of resources and OS support required for the specific program, also on a
bare-metal PIC).

Also, while with Pascal you may be able to port your code from one
Pascal to another, from one platform to another, you then have
two (or more) code bases. With C, even if you have to make adjustments
for a specific platform, you usually integrate them into the code -- and
end up with code that compiles and works on the previously supported
platforms and on the new one.

Try that with (any) Pascal. The lack of standardization (and of a
standard preprocessor, for the sometimes necessary platform adjustments)
makes this pretty much impossible.

(I'm not trying to say that "C is better than Pascal" :)

Gerhard

2009\07\09@111721 by PPA



Gerhard Fiedler wrote:
{Quote hidden}

Let me laugh a bit...
I'm jumping into this funny discussion since I'm an old Pascal'er among
other activities...
Maybe all of this WAS quite true a while ago, but nowadays we can see that
there is a LOT of #IFDEF in C sources to support multiple platforms. Nobody
(I think) is coding in "pure" C (or "pure" Pascal) - whatever that means -
because everybody wants the bells and whistles of the compiler toolchain
they have chosen; if they don't manage toolchain specifics (a big job),
when they want to change there is always an even bigger job adapting their
supposedly "portable" sources...
In Pascal this is the same, a lot of $IFDEF and so on... No less, no more.
This is far from "impossible", as you said. I have done it every day since
TP3. Some of my old libraries still compile in both TP3 and the latest
Delphi. So they are "portable".
What you qualify as "Pascal dialects" is for me nothing different from "C
dialects", and now there are a few surviving Pascals whose differences are
very tight. The more boring work is in adapting libraries from one
environment to another, not in the language itself.
Pascal (and its inheritors) is alive and still growing, thanks. OK, not as
popular as C, but it is far from RIP.

Gerhard Fiedler wrote:
>
> (I'm not trying to say that "C is better than Pascal" :)
>
Just a bit ;-) ...



-----
Best regards,

Philippe.

http://www.pmpcomp.fr Pic Micro Pascal for all!

2009\07\09@120252 by olin piclist

PPA wrote:
> Nobody (I think) is coding in "pure" C (or "pure" Pascal) - whatever
> that means - because everybody wants the bells and whistles of the
> compiler toolchain they have chosen; if they don't manage toolchain
> specifics (a big job), when they want to change there is always an
> even bigger job adapting their supposedly "portable" sources...
> In Pascal this is the same, a lot of $IFDEF and so on... No less, no
> more.

I do it a different way.  First, my Pascal has deliberate portability
features built in.  This includes data types that are defined to map in
certain ways to the target hardware or OS and are therefore different on
different implementations.  The rest is done through our libraries.  We have
abstracted operations that underlying systems do for you, but that each does
in a different way.  The real job of porting applications between systems is
in porting this layer once to each system.  The applications then just work.

Since the libraries are designed to be ported, it's not that hard.  I just
checked, and the unique code for the Win32 implementation is under 8500
lines.  This includes I/O, system functions like multi-threading, thread
interlocks, time, etc., except graphics.  There's a whole separate library
for graphics.  This environment has been to Apollo Aegis, 4 different
flavors of Unix, and Windows.  All this time the applications themselves
haven't changed and just work on all these different systems with virtually
no compile-time customizations.



2009\07\09@203704 by Gerhard Fiedler

PPA wrote:

{Quote hidden}

This is always good, (almost) no matter the cause :)

> I jump into this so funny discussion since I'm an ol'Pascal'er among
> other activities... Maybe all of this WAS quite true a while ago, but
> nowadays we can see that there is a LOT of #IFDEF in C sources to
> support multiple platforms.

Of course there are. (BTW, it's #ifdef in C.)

Even though many things are standardized, many things can't be in order
to be able to run on different platforms. For those people have to use
the system APIs -- and these are different on different platforms. This
has always been this way, back then and is still so.

> Nobody (I think) is coding in "pure" C (or "pure" Pascal) - whatever
> that means -

You might be surprised. Not sure about Pascal, but there are many people
who do most of their work in standard C or C++, and in such cases only a
small part of an application (if any) deals with the platform
differences and encapsulates them. This part, of course, tends to have a
lot of preprocessor constructs around the system APIs. But the rest is
plain, standard C (or C++) -- which is probably what you call "pure" C
(or C++). So yes, lots of people code in this. I currently spend much of
my day doing just this.

> because everybody wants the bells and whistles of the compiler toolchain
> they have chosen; if they don't manage toolchain specifics (a big
> job), when they want to change there is always an even bigger job
> adapting their supposedly "portable" sources...

If you don't write with portability in mind, this may be the case. And
in some cases, it makes sense not to write with portability in mind --
for example, if you don't anticipate that the code will have to be
ported.

I can have all the bells and whistles of the VC++ or gcc toolchain and
code most of the time in standard C or C++. (When I say "most of the
time" I mean in the 99% of the application that are not platform or
compiler specific.) I don't see what the toolchain has to do with
standard or non-standard code.

Of course, if you mean a GUI library, then you're pretty much hosed if
the one you chose doesn't support all your target platforms. If you
started out with MFC and need Linux support, then all compiler
compatibility doesn't help because the GUI support on the different
platforms is different enough to make portability a real challenge.
That's why there are portable GUI libraries that provide the same
interface on different platforms. You better choose one that supports
all your target platforms right from the beginning... independently of
the language you're programming in.


> In Pascal this is the same, a lot of $IFDEF and so on... No less, no
> more. This is far from "impossible", as you said. I have done it every
> day since TP3. Some of my old libraries still compile in both TP3 and
> the latest Delphi. So they are "portable".

I'm not sure we mean the same thing with "portable". My C and C++
Windows sources, for example, compile on any standard C or C++ compiler
for Windows -- and there are a bunch, from different vendors. Same goes
for Linux, Unix, Solaris, Apple OSX.

For me, being able to compile the program in different versions of a
compiler of the same vendor is the lowest level; I wouldn't even call
this "portability", it's more like version compatibility. The next level
is being able to compile on compilers from different vendors (on the
same operating system); this maybe I'd start to call portability. Then
come different operating systems and different target processors; here
we get into what I call portability.

So how does this aspect look for Pascal? (This is not a rhetorical
question; I really want to know from someone who uses Pascal portably.)

One application I work on has several hundred kLOC that compile fine on gcc
and VC++ for Linux and Windows, with a very small and isolated part
being compiler and platform specific. Recently we tested the Intel C++
compiler for Windows, and the code compiled without any changes. (We
dropped it, though, because the performance advantages were small and
the compile times were horrible.) Is this possible with Pascal -- taking
such a complex application, picking another vendor's compiler, and having
it "just work"?

> What you qualify as "Pascal dialects" for me is nothing different than
> "C dialects"

Besides some odd vendors of small-micro compilers, there aren't really
any C dialects. The few there are are really niche thingies, mostly
restricted to the PIC world (and maybe one or another small micro, but
I think the PIC compilers are worse than most others in this respect).
In the grand scheme of C, they play almost no role at all. Every
compiler that wants to be worth its money has to be standard compliant,
and therefore allows programming in standard C (or C++). Which is more
common than you seem to think.


>> (I'm not trying to say that "C is better than Pascal" :)
>>
> Just a bit ;-) ...

No, I'm not. If you're reading this into my messages, maybe it's because
you're defensive about Pascal. Fact is that Pascal is almost dead in the
professional world. Another fact is that I started serious PC
programming with Pascal. And yet another fact is that I had to give it
up because Pascal is almost dead for most professional purposes. And
part of this thread is about why this is.

I really, really didn't like giving it up. Turbo Pascal was miles ahead
of anything else. The IDE was snappy and functional, compilation was fast
and the executables were, too; it was affordable and powerful. But it didn't
make it into the big league, and my guess is that it's about
standardization. Microsoft was and is big enough to be able to make a
vendor-specific language popular (see VB and C#) without a
vendor-independent standard, Borland wasn't and isn't, Delphi
notwithstanding.

It doesn't help to pretend that Pascal isn't dead in the professional
world, or to make this into yet another "this language is better than that
language" pissing contest... It's about finding out why Pascal, with all
the perceived advantages, didn't make it -- and, arguably, at this
point, has little chance of making it ever.

Gerhard

2009\07\10@082008 by Tony Smith

{Quote hidden}

I still use TurboPascal when fiddling with TurboCNC.

That said, the differences are often just perception, BASIC is for little
kiddies, Pascal for quiche-eating professors, and 'Real Men' (TM) use C.

Years ago I'd write apps for people using either QuickC or QuickBasic (actually
PDS which could do overlays).  If they didn't get the source code I'd do it in
Basic, otherwise they got to choose.  Many insisted on C, even though I then
charged at least twice as much and took twice the time to do it.

The joke was, of course, that once compiled you couldn't tell the difference as
to whether it was in C or Basic.  (I wasn't writing ray-tracers or such things
where C would have been a better choice).

Microsoft wised up with Windows C & Basic, so VB didn't get a compiler, thus
letting the 'Real Men' (TM) get on with the job of doing whatever it is they
do.  Probably telling the VB weenies their language of choice sucks - "Pfft, it
doesn't even compile".

No-one needs the 'power' that C gives you these days, it's all libraries, web
apps and front-ends for SQL databases.  You don't need C for that.

Tony

2009\07\10@085233 by M.L.


On Fri, Jul 10, 2009 at 8:20 AM, Tony Smith<RemoveMEajsmithspamTakeThisOuTbeagle.com.au> wrote:
> Years ago I'd write apps for people using either QuickC or QuickBasic (actually
> PDS which could do overlays).  If they didn't get the source code I'd do it in
> Basic, otherwise they got to choose.  Many insisted on C, even though I then
> charged at least twice as much and took twice the time to do it.
>
> The joke was, of course, that once compiled you couldn't tell the difference as
> to whether it was in C or Basic.  (I wasn't writing ray-tracers or such things
> where C would have been a better choice).
>
> Microsoft wised up with Windows C & Basic, so VB didn't get a compiler, thus
> letting the 'Real Men' (TM) get on with the job of doing whatever it is they
> do.  Probably telling the VB weenies their language of choice sucks - "Pfft, it
> doesn't even compile".


I was under the impression that QuickBASIC "compiled" the ASCII source
code to byte-code and included an interpreter in the EXE?
This is sort of what compiled Perl does now - though there is a real
Perl -> C translator back-end. (B::C) - not that I would compare QB to
Perl.
--
Martin K.

2009\07\10@090030 by olin piclist

Tony Smith wrote:
> No-one needs the 'power' that C gives you these days, it's all
> libraries, web apps and front-ends for SQL databases.

That's blatantly wrong, of course.  Sure there are a lot of web apps and
such where CPU power is not the limiting factor.  But to say that this is
never the case is ridiculous.  Maybe you don't ever run programs that you
have to wait on because of the sheer number of calculations required, but
that doesn't make it true for everyone.  Programs in that category for me
include the Eagle auto router and some image processing utilities.

I recently wrote a program for a customer that performs 50000 relaxation
passes on 10s to 100s of polar grids.  The total run time can be 10s of
minutes or more, depending on the number of thingies installed at a
particular site.  While an hour per customer site installation once is
acceptable, I'm sure the field folks would love it to be faster.

Those are just a few examples of the many out there, although a single one
suffices to disprove your blanket statement.

> You don't need C for that.

True.  The language is not the issue as long as it's a truly compiled one
intended for general purpose from the start.  As long as you understand what
goes on underneath and don't do something stupid, most truly compiled
languages with modern compilers will give you about the same performance.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\07\11@091207 by Gerhard Fiedler

Tony Smith wrote:

>> It doesn't help pretending that Pascal isn't dead in the professional
>> world, or make this into yet another "this language is better than
>> that language" pissing contest... It's about finding out why Pascal,
>> with all the perceived advantages, didn't make it -- and, arguably,
>> at this point, has little chance of making it ever.
>
> I still use TurboPascal when fiddling with TurboCNC.

Which probably doesn't make up a big chunk of your income.

> That said, the differences are often just perception, BASIC is for
> little kiddies, Pascal for quiche-eating professors, and 'Real Men'
> (TM) use C.

Not sure where the people are that perceive it like this, but not around
where I make my money. VB /is/ a professional tool, so is Delphi. And
nobody in his right mind would use C (or pay for someone using C) for
the things these two are usually used for.

But compare the "bucks made" with Delphi and other Pascal variants with
the "bucks made" with other languages, and you'll probably see why I
said "dead in the professional world".

This isn't a judgment of the "quality" of the language. It's a fact of
the market.


> Microsoft wised up with Windows C & Basic, so VB didn't get a
> compiler, thus letting the 'Real Men' (TM) get on with the job of
> doing whatever it is they do.  

It's been a long time since Microsoft last promoted a C compiler.
They do sell a C++ compiler that also (as a rarely used side effect) can
compile C sources. But their heart (and their money) lies with VB and C#
(both running on the same engine), not with C and not with C++.

> Probably telling the VB weenies their language of choice sucks -

No, not really. VB is probably more popular than C++ in the professional
Windows world. C++ is treated like a stepchild; many of
the useful RAD features that VB and C# have for GUI development (in the
Visual Studio IDE) don't work with C++. It's probably still there mostly
because of legacy applications that need to be supported, and because
some people develop highly optimized applications where the additional
control a low-level language like C++ provides may be helpful (and they
don't need the GUI RAD features for that).

> No-one needs the 'power' that C gives you these days, it's all
> libraries, web apps and front-ends for SQL databases.  You don't need
> C for that.

Nobody really uses it (professionally) for that. But then, check out
what the SQL databases are written in... Chances are it's not a BASIC or
Pascal dialect. (Which again is not to say they couldn't be; just that
they aren't.)

Gerhard

2009\07\11@111156 by Tony Smith

{Quote hidden}

VB did the byte-code & interpreter, as do the .NET languages now.  QB always compiled to a real .exe file.  Or at least a .com that pretended to be a .exe.  I know some Basics did store source files like that, but we're back in the days of the Apple ][ now.

Dropping the compiler from VB meant Microsoft got to sell you both VB & C: VB for the UI, and C for the hard stuff.  That was the theory, but most people picked one or the other, causing themselves all sorts of grief.

The PDS variant (Professional Development System) of QuickBasic did a few things QB didn't, but I've forgotten exactly what.  It optimised code better, could create .obj files and did overlays, so you'd have trouble convincing people that yes, that 4mb exe was in fact written in Basic.  Everyone knew only C could do that, Basic had to CHAIN stuff together.

PDS became VB-DOS, and you could use its compiler on your existing stuff.  I think it moved strings from near to far memory, which was helpful.

Tony


2009\07\11@115429 by Tony Smith

> > No-one needs the 'power' that C gives you these days, it's all
> > libraries, web apps and front-ends for SQL databases.
>
> That's blatantly wrong, of course.  Sure there are a lot of web apps and
> such where CPU power is not the limiting factor.  But to say that this is
> never the case is ridiculous.  Maybe you don't ever run programs that
> you
{Quote hidden}

> one
> disproves your blanket statement.
>
> > You don't need C for that.
>
> True.  The language is not the issue as long as it's a truly compiled one
> intended for general purpose from the start.  As long as you understand
> what
> goes on underneath and don't do something stupid, most truly compiled
> languages with modern compilers will give you about the same performance.


I originally wrote 'almost no-one', but changed it to see who'd bite.

'Blatantly' is a bit much; sure, Eagle needs to run fast, but you're not the
one writing the auto-router.  Out of the thousands of Eagle users only one
needs C, and that's the author.

It's also been true for a while that it's cheaper to buy more CPU power than
to spend days fiddling with something to make it faster.  That offends people
too, but a new PC divided by your hourly rate is a small number.  Which is
better?

MS Office is a perfect example of utilising CPU power: rather than document
saves being basically a memory dump with various abominations tacked on
over the years, now it's a bunch of XML files zipped up.  That's better
PCs, not so much better coding.

C is rarely needed these days.

PICs are the same: you have "slow cheap PIC + assembler" vs "faster $ PIC +
compiler".  Not everyone needs assembler for PICs.

Tony

(and in reference to the subject line, BREAK is probably the dumbest idea
ever; every other language got switches right)

2009\07\11@121009 by Tony Smith

> I still use TurboPascal when fiddling with TurboCNC.

Which probably doesn't make up a big chunk of your income.


Yeah, not a big earner  :)


> > That said, the differences are often just perception, BASIC is for
> > little kiddies, Pascal for quiche-eating professors, and 'Real Men'
> > (TM) use C.
>
> Not sure where the people are that perceive it like this, but not around
> where I make my money. VB /is/ a professional tool, so is Delphi. And
> nobody in his right mind would use C (or pay for someone using C) for
> something these two are usually used.


Oh, they're alive and well (and probably posting on the list).  The comment
made earlier when someone said a product sold by a list member should be
called !C instead of Basic shows it's still there.  Add the 'M$ windoze'
folk to the list.  We know it sucks so get a new joke (one you made yourself
would be better).  Then there's the 'dot net sucks' bandwagon crowd too.
Nothing changes.  


> > No-one needs the 'power' that C gives you these days, it's all
> > libraries, web apps and front-ends for SQL databases.  You don't need
> > C for that.
>
> Nobody really uses it (professionally) for that. But then, check out
> what the SQL databases are written in... Chances are it's not a BASIC or
> Pascal dialect. (Which again is not to say they couldn't be; just that
> they aren't.)


Sure, hands up all those here who write SQL databases.  Uh-huh.  Those
people might need C (or assembler) but not the rest of us.  I still come
across people developing applications in C++ because 'C is better'.  Never
mind you could probably replicate their app in a spreadsheet.

Tony

2009\07\11@123755 by olin piclist

Tony Smith wrote:
> I originally wrote 'almost no-one', but changed it to see who'd bite.

I can only respond to what you wrote, not what you considered writing.

> 'Blatantly' is a bit much,

No, you made an absolute statement which a single counter example can
disprove.  There are many counter examples of which I listed a couple.  If
you'd said something like "most programs don't need a compiled language
because speed is not the issue" I wouldn't have argued and even agreed, but
that's very different from what you actually said.

> sure Eagle needs to run fast, but you're
> not the one writing the auto-router.  Out of the thousand of Eagle
> users only one needs C, and that's the author.

And only one example disproves your blanket statement.

> It's also been true for a while that it's cheaper to buy more CPU
> power then spend days fiddling with something to make it faster.
> That offends people too, but a new PC divided by your hourly rate is
> a small number.  Which is better?

Again you're making an absolute claim, which of course is false.  Have you
really never run into programs that took longer to run than you wished, even
if you had the latest computer?  Even if you haven't, are you really saying
you've never heard of any or can't imagine any?  If so, then there's a big
world of computing you've managed to miss somehow.

The Eagle autorouter is a good example.  Some runs I've done have taken a
couple of hours, some overnight.  Let's say for sake of argument I have a
real problem that takes my current PC 1 hour to run.  Of course I'd like it
to be instant, but let's say 10 seconds would put it into the good enough
range so that I wouldn't have to alter my workflow around the program.
That's a factor of 360.  My PC is a few years old.  Maybe I could get 4x out
of a current mainstream model.  Let's be generous and say 8x.  That brings
the one hour down to 7.5 minutes.  That's still a "long" time in my context
and I'd definitely want it to be faster.

Thru the history of computing there have always been problems that taxed the
computers of the time where people had to wait inconveniently long for the
computer to finish or compromises were made to limit the problem.  As
computers got faster some problems could be solved fast enough so that
nobody cared anymore, but others could now be solved with a few less
compromises but are still a long way from good enough.  Generally the ones
that remain pushing computing limits today have to do with many passes over
large arrays.  Surely you've at least heard of some?

> C is rarely needed these days.

That's better, although I think you meant to say the extra performance of a
compiled language is rarely needed these days.  I agree with that, but note
that this doesn't make a compiled language a bad idea, only that it makes
other alternatives viable.  There are still plenty of reasons for using a
particular language when performance isn't the issue and a wider choice is
therefore available.  Familiarity and investment in a particular toolchain
is probably top of the list.  For example, for PC programming I use my
Pascal to C translator with C compiler environment.  For some things I do, I
definitely need the power of a compiled language.  For most things I don't.
However, having everything written to the same toolchain is a huge
advantage.  It wouldn't make sense to write the programs that aren't CPU
critical in Java, for example, just because I could, keeping in mind that
occasionally I'd still have to go back to writing truly compiled code to get
the performance.

> PICs are the same, you have "slow cheap Pic + Assembler" vs "faster $
> PIC + compiler".  Not everyone needs assembler for PICs.

Right, but some do.  Even if you haven't encountered them, there are high
volume projects out there where $.05 microcontroller cost is significant.

> (and in reference to the subject line, BREAK is probably the dumbest
> idea ever, every other language got switches right)

I agree with that.  The C switch statement could have been designed better,
with no more burden on the compiler or loss of machine code performance, but
with less likelihood of a programmer screwup.  Worse, the same thing can be
said about too many other aspects of C.



2009\07\12@014321 by Tony Smith
> Tony Smith wrote:
> > I originally wrote 'almost no-one', but changed it to see who'd bite.
>
> I can only respond to what you wrote, not what you considered writing.
>
> > 'Blatantly' is a bit much,
>
> No, you made an absolute statement which a single counter example can
> disprove.  There are many counter examples of which I listed a couple.  If
> you'd said something like "most programs don't need a compiled language
> because speed is not the issue" I wouldn't have argued and even agreed,
> but
> that's very different from what you actually said.


Surely you've managed you spot the trolls these days.


> > It's also been true for a while that it's cheaper to buy more CPU
> > power then spend days fiddling with something to make it faster.
> > That offends people too, but a new PC divided by your hourly rate is
> > a small number.  Which is better?
>
> Again you're making an absolute claim, which of course is false.  Have you
> really never run into programs that took longer to run than you wished,
> even
> if you had the latest computer?  Even if you haven't, are you really
> saying
> you've never heard of any or can't imagine any?  If so, then there's a big
> world of computing you've managed to miss somehow.


Hardware is cheap, software isn't.  A PC is a few hours of your hourly
rate.  How much auto-router code does that buy?


> The Eagle autorouter is a good example.  Some runs I've done have taken a
> couple of hours, some overnight.  Let's say for sake of argument I have a
> real problem that takes my current PC 1 hour to run.  Of course I'd like
> it
> to be instant, but let's say 10 seconds would put it into the good enough
> range so that I wouldn't have to alter my workflow around the program.
> That's a factor of 360.  My PC is a few years old.  Maybe I could get 4x
> out
> of a current mainstream model.  Let's be generous and say 8x.  That brings
> the one hour down to 7.5 minutes.  That's still a "long" time in my
> context
> and I'd definitely want it to be faster.
>
> Thru the history of computing there have always been problems that taxed
> the computers of the time where people had to wait inconveniently long for
> the computer to finish or compromises were made to limit the problem.  As


Y'know, if I'd written that last bit, you'd have probably replied "Duh, go
buy a faster PC.  Why wait until some C guy finally knocks out a few more
bugs and makes the autorouter run 5% faster?  A new PC will halve your run
time right now.".  There's a limit to the gains you can get with code, but
you can buy cheap CPU cycles right now.  And even more next year.

So you're happy to wait until the next Eagle release, fork over your licence
upgrade fee, and get something that may run a little bit faster?  I'll take
the PC.  No wait, you're saying buy a new PC.  Hmm, you appear to be arguing
in full agreement with me.  I think.  Or are you waiting for the C
programmer to save the day with v4.56b?


> > C is rarely needed these days.
>
> That's better, although I think you meant to say the extra performance of
> a
> compiled language is rarely needed these days.  I agree with that, but
> note
> that this doesn't make a compiled language a bad idea, only that it makes


No, I meant C.  C & an HLL can both compile to something that's
pretty much indistinguishable.  Microsoft showed that with QuickC &
QuickBasic years ago.  Why write in a language that creates more coding
errors?  Which is, of course, what you're complaining about.  Ya gotta love
these threads.

Tony


2009\07\12@090246 by olin piclist

Tony Smith wrote:
> Surely you've managed you spot the trolls these days.

Is this the pot calling the pot black?

> Hardware is cheap, software isn't.  A PC is a few hours of your
> hourly rate.  How much auto-router code does that buy?

I'm not sure if you're just trolling or somehow really didn't get the point,
so I'll give you the benefit of the doubt.  I was using the Eagle autorouter
merely as an example of something that consumes a lot of CPU time on
machines because you claimed no such programs existed.

As for your side point, show me a machine for $2000 that can complete any of
the Eagle auto routes I've done in under a minute and I'll buy it.  In the
mean time I'll keep upgrading my PC occasionally.  I'm not going to run out
and get a new PC every 6 months though just because it would be 25% faster.
Even ignoring the purchase price, there is a cost in down time of installing
all the software, getting everything set up just the way I like, etc.  It
usually takes a day or even more just to switch to a new computer.  Anyway,
this is irrelevant to the point at hand.



2009\07\12@110115 by Gerhard Fiedler

Tony Smith wrote:

>> Again you're making an absolute claim, which of course is false.  Have
>> you really never run into programs that took longer to run than you
>> wished, even if you had the latest computer?  Even if you haven't,
>> are you really saying you've never heard of any or can't imagine
>> any?  If so, then there's a big world of computing you've managed to
>> miss somehow.
>
> Hardware is cheap, software isn't.  A PC is a few hours of your
> hourly rate.  How much auto-router code does that buy?

Let's see... Imagine an auto-router that's used by 10'000 users. Let's
say "a few hours" is 10 hours. That's 10 hours per user, so it's 100'000
hours. How much auto-router code does that buy again? :)


> There's a limit to the gains you can get with code, but you can buy
> cheap CPU cycles right now.  

Isn't there also a limit how many cheap CPU cycles you can buy right
now? Where do you go when you have bought all the cheap CPU cycles you
can buy and still are not there where you want to be?

Of course, from the viewpoint of a single user this may be the solution
-- if no other solution is available. But between two autorouters with
similar functionality, one may be twice as fast, and only a bit more
expensive. So if such an option is available, which one would you pick?
The slower product?


> Or are you waiting for the C programmer to save the day with v4.56b?

You're good at setting up strawmen, or at least are trying to be. But it
may be that you don't have to wait... there's competition out there,
even for autorouters.


>>> C is rarely needed these days.
>>
>> That's better, although I think you meant to say the extra
>> performance of a compiled language is rarely needed these days.  I
>> agree with that, but note that this doesn't make a compile language
>> a bad idea, only that it makes
>
>>> No, I meant C.  C & an HLL can both compile to something
> that's pretty much indistinguishable.  

Indistinguishable in what sense?


> Microsoft showed that with QuickC & QuickBasic years ago.  

What did they show? I tried to find something about this, but didn't
find any relevant findings :)


> Why write in a language that creates more coding errors?  

Like for example assembly? Because in some cases and instances the last
bit of control or optimization that a compiler can't give or do may be
important.

Gerhard

2009\07\12@141515 by Tony Smith

> >> any?  If so, then there's a big world of computing you've managed to
> >> miss somehow.
> >
> > Hardware is cheap, software isn't.  A PC is a few hours of your
> > hourly rate.  How much auto-router code does that buy?
>
> Let's see... Imagine an auto-router that's used by 10'000 users. Let's
> say "a few hours" is 10 hours. That's 10 hours per user, so it's 100'000
> hours. How much auto-router code does that buy again? :)


Not much.  So you get a 100-man-strong team together and they code away for 6
months.  In machine code, because that's the 'in' thing.  More speed!
Ludicrous speed even!  They develop a whole new algorithm that's 10% faster!
Meanwhile I'd have upgraded my 4-year-old PC (like Olin has) and had a
better speed increase than 10% in that time.  So: new PC with instant
results now, or wait 6 months for something that's 100% machine code, 10%
faster and buggy.  Let me think it over...


{Quote hidden}

If the slower product has fewer bugs and better support, then yeah.  But
that's irrelevant; of course you pick the faster one if everything else
is equal.  And buy a new PC to run it on.


> > Or are you waiting for the C programmer to save the day with v4.56b?
>
> You're good in setting up strawmen, or at least are trying to be. But it
> may be that you don't have to wait... there's competition out there,
> even for autorouters.


But the customers are locked in to their existing product.  No-one will
switch as you need to do all your libraries again.  But that's irrelevant
too.


{Quote hidden}

To the user.


> > Microsoft showed that with QuickC & QuickBasic years ago.
>
> What did they show? I tried to find something about this, but didn't
> find any relevant findings :)


I used to write in both.  The compiled QB programs ran just as fast as
compiled C ones, and TurboC & TurboPascal probably did too.  (Hence MS
dropping the compiler in VB to make C++ the 'better' choice.)  Sure it was
hard to write image manipulation stuff in QB as it lacks the bit-twiddling
stuff C has, but hardly anyone did that, despite many claiming to.  Write
that little bit in C and the rest in something easier to use.  

Or just buy a library where some other poor sod has done the hard work for
you.  Like Olin said, you don't write it yourself, but you still need to
know how to make the right choice.  Sorting routines are a good example.

Reminds me that Olin mentioned he had to write a program that took a long
time to do a lot of complicated stuff, but never actually said what he
wrote it in.  Pascal?  Ada?  GWBasic?


> > Why write in a language that creates more coding errors?
>
> Like for example assembly? Because in some cases and instances the last
> bit of control or optimization that a compiler can't give or do may be
> important.


More than one company went broke waiting for the assembler programmers to
finish while the HLL (even C counts there) guys beat them to the market.
Fast doesn't matter much anymore unless it's fast to market.  Some people
want their stuff to actually work too, which is rather novel.

The car analogy for this thread is arguing over whether your car needs a
rear wing.

Tony

2009\07\12@142317 by Tony Smith

> > Hardware is cheap, software isn't.  A PC is a few hours of your
> > hourly rate.  How much auto-router code does that buy?
>
> I'm not sure if you're just trolling or somehow really didn't get the
> point,
> so I'll give you the benefit of the doubt.  I was using the Eagle
> autorouter
> merely as an example of something that consumes a lot of CPU time on
> current
> machines because you claimed no such programs existed.
>
> As for your side point, show me a machine for $2000 that can complete any
> of
> the Eagle auto routes I've done in under a minute and I'll buy it.  In the
> mean time I'll keep upgrading my PC occasionally.  I'm not going to run
> out
> and get a new PC every 6 months though just because it would be 25% faster.
> Even ignoring the purchase price, there is a cost in down time of
> installing
> all the software, getting everything set up just the way I like, etc.  It
> usually takes a day or even more just to switch to a new computer.
> Anyway,
> this is irrelevant to the point at hand.


Getting a 25% speed increase by doing something as trivial as buying a new PC
sounds good to me; it beats waiting for the next software release (which is
very unlikely to be that good).

You complain something is slow when there's an easy solution at hand?

Disk cloning programs are pretty good these days.  Why build from scratch?

Tony

2009\07\12@142844 by Tony Smith

> > Surely you've managed you spot the trolls these days.
>
> Is this the pot calling the pot black?


I wasn't implying you were a troll, Olin, just merely replying to one.  Oh,
I see, there's an odd typo.  That 2nd 'you' should be 'to'.  Bedtime, me
thinks.

Cheers,

Tony

2009\07\12@182812 by olin piclist

Tony Smith wrote:
> If the slower product has fewer bugs and better support, then yeah.
> But that's irrelevant; of course you pick the faster one if
> everything else is equal.  And buy a new PC to run it on.

It's funny how when someone refutes one statement you make, you simply
switch to an orthogonal argument.

Let's keep in focus that your original statement was:

 "No-one needs the 'power' that C gives you these days, it's
 all libraries, web apps and front-ends for SQL databases.
 You don't need C for that."

Looking at that in hindsight it was apparently just trolling.

> Reminds that Olin mentions he had to write a program that took a long
> time to do a lot of complicated stuff, but never actually mentioned
> what he wrote it in.  Pascal?  ADA?  GWBasic?

I wrote it in Pascal, which gets translated to C as part of the build
process, then compiled with the MSVC compiler for PCs.

> More than one company went broke waiting for the assembler
> programmers to finish while the HLL (even C counts there) guys beat
> them to the market. Fast doesn't matter much anymore unless it's fast
> to market.  Some people want their stuff to actually work too, which
> is rather novel.

But low unit cost certainly does matter.  I've got a high volume project
that needs to be cheap, cheap, cheap.  This thing has to periodically send an
RF message that includes its 32 bit ID, a few status bits, and a CRC
checksum.  It also has to send a signature via IR, handle a user button and
a LED, compute life left and display it on the LED when asked, and a
proprietary wrinkle or two I don't want to get into.  This runs on a 10F202
using all 24 bytes of RAM, although a few bits here and there are not used.
Stuff that in your C compiler and see what you'd get.  No product, because it
costs too much, I suspect.



2009\07\12@183213 by olin piclist

Tony Smith wrote:
> Getting a 25% speed increase by doing something as trivial as buying a
> new PC sounds good to me, beats waiting for the next software release
> (which is very unlikely to be that good).

In the example I used to refute your original statement, I needed a few 100x
for the speed increase to be really useful.  But that was just one example.

Do you really not see that there are problems out there for which no amount
of affordable CPU power is enough, or is this just trolling?



2009\07\12@212622 by Gerhard Fiedler

Tony Smith wrote:

> Not much.  So you get a 100-man-strong team together and they code away for 6
> months.  In machine code, because that's the 'in' thing.

If you really are into the 'in' things, that's a problem you should work
on.

> But that's irrelevant, of course you pick the faster one if everything
> else is equal.

And in the meantime the company with the slower product goes out of
business. A few made a bit of money; society as a whole has to pick up
the tab. For some, a viable life model... :)


>>> Or are you waiting for the C programmer to save the day with v4.56b?
>>
>> You're good at setting up strawmen, or at least are trying to be.
>> But it may be that you don't have to wait... there's competition out
>> there, even for autorouters.
>
> But the customers are locked in to their existing product.  No-one
> will switch as you need to do all your libraries again.  But that's
> irrelevant too.

Autorouters are often stand-alone apps that integrate with many layout
programs, because the ones that come with layout programs often aren't
considered that good. So there's very little lock-in with autorouters.

Where are you going with all this?


>>> No, I meant C.  C & an HLL can both compile to something
>>> that's pretty much indistinguishable.
>>
>> Indistinguishable in what sense?
>
> To the user.

In some cases. GUI apps that don't do much and most of the time wait for
user input are such cases. But not anything that does some real
computing. Try to decrypt, sort and encrypt gigabyte files in
QuickBasic... and try to get your users to actually wait for the result.


> I used to write in both.  The compiled QB programs ran just as fast as
> compiled C ones, and TurboC & TurboPascal probably did too.  

FWIW, I think (even though I don't have any numbers anymore) that it was
generally easier to write fast programs with Turbo Pascal. AFAIR this
compiler was really good.

> (Hence MS dropping the compiler in VB to make C++ the 'better'
> choice.)  

This sounds as if you just made it up.

It didn't make much sense to make a VB compiler... the language was not
designed for ultimate speed anyway; the goal was different. And for
Microsoft, neither C nor C++ was a real focus at any time. As you can
see now... they're all about VB and C#, and for neither is there a
native compiler; C++ (for which there is a native compiler) is a real
stepchild, for example in terms of IDE support.


> Or just buy a library where some other poor sod has done the hard work
> for you.  

See, there are people happily making their living with this sort of
thing. Would you say they should do it in QuickBasic?

> The car analog for this thread is arguing whether your car needs a
> rear wing on it.

What's your point, really? You say "no need for C", but then you say
"whenever you need something fast, buy something that's written in C by
someone else", which seems to imply that there is a need for C (it doesn't
matter who has to write it). Then you come up with odd analogies...

So what's your point? Too much time on your hands?

Gerhard

2009\07\13@025803 by Marechiare

> Getting a 25% speed increase by doing something as trivial as
> buying a new PC sounds good to me, beats waiting for the
> next software release (which is very unlikely to be that good).

I'm almost sure you are trolling; you were told that reinstalling all
the software would take a day's work. That's not cheap, and it could be
rather tricky because of compatibility issues: hardware drivers and a new
Windows version (in case his old Windows was OEM).

And most probably he won't get a 25% speed increase by replacing a Core 2
system (2 years old) with a newer Intel Core i7 system. For a reasonably
priced Core i7 configuration he'll get perhaps a 10%-15% speed increase at
most on processor-intensive tasks, and no speed increase on other tasks.

> You complain something is slow when there's an easy
> solution at hand?
>
> Disk cloning programs are pretty good these days.
> Why build from scratch?

How is "Disk cloning" related to the reinstalling the software on a
new hardware set?

2009\07\13@061936 by cdb

:: How is "Disk cloning" related to the reinstalling the software on a
:: new hardware set?

Some disk cloning software does allow 'foreign' bare metal restore,
which, if it works right, will install the correct drivers etc. for the new
hardware - the only time lost is what it takes to clone and possibly
install the drive.

Colin
--
cdb, colin at btech-online.co.uk on 13/07/2009

Web presence: http://www.btech-online.co.uk  

Hosted by:  http://www.1and1.co.uk/?k_id=7988359

2009\07\13@100449 by Gerhard Fiedler

cdb wrote:

>:: How is "Disk cloning" related to the reinstalling the software on a
>:: new hardware set?
>
> Some disk cloning software does allow 'foreign' bare metal restore,
> which, if it works right, will install the correct drivers etc. for the
> new hardware - the only time lost is what it takes to clone and
> possibly install the drive.

Does this work reliably when switching the whole system (motherboard,
processor, disk controller, ...)? Won't the system load the wrong
drivers before it can recognize that it needs other drivers and start
to install them?

Gerhard

2009\07\13@103629 by Carl Denk

If we are talking Windows PCs, I have several times used Maxtor
(Seagate) software to clone HDs to replace drives, and/or changed out
motherboards, graphics cards, etc., with no problems. It will find
the right drivers if they were previously installed or are part of Windows.
If it can't find the drivers, though, tracking down the right ones can be
time-consuming. Kubuntu seemed to handle it too, but I don't have
enough experience with the dual boot to comment.


2009\07\13@161229 by William "Chops" Westfield


On Jul 12, 2009, at 11:58 PM, Marechiare wrote:

>> Getting a 25% speed increase by doing something as trivial as
>> buying a new PC sounds good to me, beats waiting for the
>> next software release (which is very unlikely to be that good).
>
> I'm almost sure you are trolling, you were told that reinstalling all
> the software would take a day work. That's not cheap and it could be
> rather tricky for the compatibility issues

You know, I can see both sides of this pretty easily. On the one hand,
upgrading hardware isn't that easy, especially if you're talking about
multiple users in a corporate environment with a large portfolio of "other"
applications.  This is the scope of IT departments, and I think I recall
seeing studies that IT departments have NOT shrunk much since the mainframe
days, despite each user having their own machine.

On the other hand, you see people struggling with ANCIENT computer
setups, somehow thinking that limping along with modern software on
their aging Pentium 3 systems is saving them money vs dropping $500 on
a new Dell (or whatever).

Still, it'll be nice when Cadsoft comes out with the multi-core aware  
version of EAGLE and its autorouter.  It'd be a nice carrot for the  
commercial versions, too...  (although frankly I'm not convinced that  
some SW changes wouldn't help more.)

BillW

2009\07\13@163829 by cdb

:: Does this work reliably when switching the whole system
:: (motherboard, processor, disk controller, ...)?

Define reliably! :)  For most workstations this works fine; servers
need a bit of planning, and home PCs can be a nightmare.

:: Won't the system load the wrong
:: drivers before it can recognize that it needs other drivers and
:: starts to install them?

The cloning software I'm most familiar with has a utility that allows
drivers to be preloaded.

W7, and to some extent Vista, have a better database of drivers built in,
and at the very least will boot and just require the new motherboard CD
to be inserted.

W7 beta/RC1 loaded onto my 4-year-old system with no problem; only the
sound card wasn't recognised.

Colin
--
cdb, colin at btech-online.co.uk on 14/07/2009


2009\07\13@164314 by Bob Blick


On Mon, 13 Jul 2009 13:12:24 -0700, "William Chops Westfield"
<westfw at mac.com> said:
>
> Still, it'll be nice when Cadsoft comes out with the multi-core aware  
> version of EAGLE and its autorouter.  It'd be a nice carrot for the  
> commercial versions, too...  (although frankly I'm not convinced that  
> some SW changes wouldn't help more.)

After seeing Eagle take half an hour to do a poorer job of routing than
15 seconds in SPECCTRA, I have to agree that the software has something
to do with it :)

Cheers,

Bob

--
http://www.fastmail.fm - A fast, anti-spam email service.

2009\07\13@165427 by Tamas Rudnai

You just have to install the host operating system and a virtualised one, and
install all your applications on the guest OS -- on the virtual machine the
"hardware" does not change ;-)

(Someone was talking about multiply-virtualised hardware: the virtualisation
works on the virtual memory feature of the CPU + host OS, then the guest OS
has its own virtual memory manager... and then you run a .NET or Java app on
it, etc. :-)  But who cares? We have enough CPU power, and as mentioned
earlier we are able to do the same thing as with the Apple II, just in a
slightly more complicated way :-) )

Tamas


On Mon, Jul 13, 2009 at 9:38 PM, cdb <colin at btech-online.co.uk> wrote:


2009\07\13@182513 by olin piclist

Tamas Rudnai wrote:
> You just have to install the host operating system and a virtualised
> one and install all your application on the guest OS -- on the
> virtual machine the "hardware" does not change ;-)
>
> (someone was talking about multi-virtualised hardware, so the
> virtualisation works on the virtual memory feature of the CPU+HostOS,
> then the GuestOS has it's own VirtualMemory Manager... and then you
> run a .NET or Java app on it etc :-)  But who cares? We have enough
> CPU power and as mentioned earlier we are able to do the same thing
> as with the Apple II but with a bit complicated way :-) )

It's really funny how threads evolve sometimes.  Upgrading to a new PC was
mentioned as a way of getting a 25% speed increase in your application
software, so running a new OS in a virtual machine on an existing physical
machine isn't going to help.  Of course upgrading the PC was only brought up
as a smoke screen for Tony Smith's original ridiculous (now apparently
trolling) statement that no software is speed-critical anymore.



2009\07\13@205408 by Marechiare

> :: How is "Disk cloning" related to the reinstalling the
> :: software on a new hardware set?
>
> Some disk cloning software does allow ' foreign ' bare
> metal restore,

For me, it's a totally unacceptable idea to use third-party software
to mess with the OS at that depth.

There are a lot of problems even with the standard installation procedure
on the newest hardware, not to mention the security issues.

2009\07\14@034830 by Dario Greggio

<http://blogs.techrepublic.com.com/programming-and-development/?p=1372&tag=nl.e055>

just an article I found today..

2009\07\14@145425 by Gerhard Fiedler

Dario Greggio wrote:

> <blogs.techrepublic.com.com/programming-and-development/?p=1372&tag=nl.e055>
>
> just an article I found today..

The first paragraph of this has been true for at least ten years, IMO. I
do a lot of C++ work, but I wouldn't use it for anything where it's not
really advantageous. And given the investment you have to put in to make
C++ actually useful, that leaves only a few, typically big,
high-performance applications -- as he says.

Gerhard
