Discussion:
[9fans] caveat... optimizer? the `zero and forget' thread on HN
dexen deVries
2012-10-29 09:45:33 UTC
Permalink
http://news.ycombinator.com/item?id=4711346

9fans says, ``no room in the compiler world for amateurs''. what's your take
on the above fubar?
--
dexen deVries

[[[↓][→]]]
t***@polynum.com
2012-10-29 10:12:42 UTC
Permalink
Post by dexen deVries
http://news.ycombinator.com/item?id=4711346
9fans says, ``no room in the compiler world for amateurs''. what's your take
on the above fubar?
That when one does programming, one tries not to have fuzzy behavior;
that is, to know exactly what the code does.

With some compilers, a programmer can make assumptions that are valid
in the sources but no longer valid after compiler acrobatics, first of
all optimizations (and I thought volatile was precisely there to say:
don't try to guess; leave this alone).

The worst is the GCC example, where one has to explicitly disable some
acrobatics that are done by default. The correct behavior would be to
switch on "features" only when the compiler is explicitly instructed to
do so (or, if some actions are done by default because it is proved
that they do not invalidate code assumptions, there should be a limited
number of rules explicitly stated in the man pages; the plan9 man
pages, and Ken Thompson's style, are remarkable for this: short
text/sources, stating what is done and what is not guaranteed).

The man to texinfo transition has not improved the information but, on
the contrary, increased its size and complexity, decreasing the
signal-to-noise ratio.

The first principle of safety (the theme of the article, about zeroing
critical memory) is the smallest and clearest sources. These are not
the bazaar's aim. Nor the compilers'.
--
Thierry Laronde <tlaronde +AT+ polynum +dot+ com>
http://www.kergis.com/
Key fingerprint = 0FF7 E906 FBAF FE95 FD89 250D 52B1 AE95 6006 F40C
erik quanstrom
2012-10-29 13:43:04 UTC
Permalink
Post by t***@polynum.com
The man to texinfo transition has not improved the information but, on
the contrary, the size and complexity of informations, decreasing the
ratio signal/noise.
from lions, i get only 7889 lines of code in the v6 kernel; the gcc
man page is 13000+ lines, last i checked.

- erik
erik quanstrom
2012-10-29 13:35:00 UTC
Permalink
Post by dexen deVries
http://news.ycombinator.com/item?id=4711346
9fans says, ``no room in the compiler world for amateurs''. what's your take
on the above fubar?
any sort of "advanced" code-moving optimization is confusing. but the
way c/c++ are used in linux, bsd & osx, there is a noticeable benefit to
optimizing calls away. it takes smarts to optimize away those recursive
wrapper macros. so they're in a bit of a pickle.

i just wonder if this isn't a race to the bottom. tricks like this require
programmers to think harder about non-problems, and probably write
defensive inlines, etc. the compiler writers won't accept this slowdown
and will try to outwit the defensive inlines on the theory that programmers
aren't that keen.

LWN carries stories on these sorts of self-inflicted wounds every so often.
the last one was on __access_once or something like that.

it goes without saying, i think a compiler that largely does what you
ask it to optimizes the scarce resource: developer time.

i'm sure you'll find smarter opinions from smarter people, though. :-)

- erik
Bakul Shah
2012-10-29 22:35:41 UTC
Permalink
Post by erik quanstrom
Post by dexen deVries
http://news.ycombinator.com/item?id=4711346
9fans says, ``no room in the compiler world for amateurs''. what's your take
Post by dexen deVries
on the above fubar?
any sort of "advanced" code-moving optimization is confusing. but the
way c/c++ are used in linux, bsd & osx, there is a noticeable benefit to
optimizing calls away. it takes smarts to optimize away those recursive
wrapper macros. so they're in a bit of a pickle.
It has nothing to do with "how" C/C++ are used in linux, bsd &
osx -- you forgot windows! The C standard allows a lot of
leeway in optimization. Consider this:

void foo(void) {
	...
	int x;
	int y[10];
	...
	memset(y, 0, sizeof y);
}

If x is never referred to, a correct program will never notice
if it is taken away. If the address of x is never taken, it can be
allocated in a register (if there is a free one) to improve
performance. The same reasoning can be used to elide the memset()
on y. Where do you draw the line? Any line will be arbitrary.

What the blog writer wanted to do (clearing memory after use)
is *not* guaranteed by the C standard, so he can't expect help.
But it is easy to fool compilers into doing what he wanted -- just
calling a user-defined function that in turn calls memset did
the trick with gcc.
Post by erik quanstrom
it goes without saying, i think a compiler that largely does what you
ask it to optimizes the scarce resource: developer time.
That is a separate issue.
Charles Forsyth
2012-10-29 22:47:02 UTC
Permalink
He can fool it once, but can he fool it twice? Can he recompile?
Post by Bakul Shah
But it is easy to fool compilers to do what he wanted
Bakul Shah
2012-10-29 23:05:35 UTC
Permalink
Post by Charles Forsyth
He can fool it once, but can he fool it twice? Can he recompile?
Why not? Compilers never get wise to the ways of sneaky programmers!
erik quanstrom
2012-10-29 23:07:59 UTC
Permalink
Post by Bakul Shah
Post by Charles Forsyth
He can fool it once, but can he fool it twice? Can he recompile?
Why not. Compilers never get wise to the ways of sneaky programmers!
feedback-based optimization.

- erik
erik quanstrom
2012-10-29 23:20:50 UTC
Permalink
Post by David Leimbach
Call me crazy, but I always felt compilers were there to emit code that
reflected what I wrote, not what it thinks it can do a better job writing
for me. People complain that Go is not a good systems language due to the
garbage collector. Maybe C isn't a good language due to all the places
where optimizers are allowed to break code.
the vodka is strong, but the meat is rotten.

- erik
Kurt H Maier
2012-10-29 23:59:57 UTC
Permalink
Post by erik quanstrom
the vodka is strong, but the meat is rotten.
Post by andrey mirtchovski
wait, you're saying this as if it's a bad thing‽
check your syslog for messages about references whizzing past your
terminal
andrey mirtchovski
2012-10-29 23:53:17 UTC
Permalink
Post by erik quanstrom
the vodka is strong, but the meat is rotten.
wait, you're saying this as if it's a bad thing‽
David Leimbach
2012-10-29 23:15:35 UTC
Permalink
Post by erik quanstrom
On Mon, 29 Oct 2012 22:47:02 -0000 Charles Forsyth <
Post by Charles Forsyth
He can fool it once, but can he fool it twice? Can he recompile?
Why not. Compilers never get wise to the ways of sneaky programmers!
feedback-based optimization.
- erik
Call me crazy, but I always felt compilers were there to emit code that
reflected what I wrote, not what it thinks it can do a better job writing
for me. People complain that Go is not a good systems language due to the
garbage collector. Maybe C isn't a good language due to all the places
where optimizers are allowed to break code.
erik quanstrom
2012-10-29 23:10:55 UTC
Permalink
Post by Bakul Shah
Post by erik quanstrom
Post by dexen deVries
http://news.ycombinator.com/item?id=4711346
9fans says, ``no room in the compiler world for amateurs''. what's your take
Post by dexen deVries
on the above fubar?
any sort of "advanced" code-moving optimization is confusing. but the
way c/c++ are used in linux, bsd & osx, there is a noticable benefit to
optimizing calls away. it takes smarts to optimize away those recursive
wrapper macros. so they're in a bit of a pickle.
It has nothing to do with "how" C/C++ are used in linux, bsd &
osx -- you forgot windows! The C standard allows a lot of
my point was that the attitude that every optimization allowed is required is not
helpful and is in the end counterproductive.

actually, to be a bit cute about it, i should announce the first
international obfuscated c compiler contest. the goal of the contest
is to write a c99-compliant compiler that breaks every program in /sys/src/cmd.
the winner will be chosen based on highest percentage of programs broken,
with the tie going to the most devious tricks for remaining standards compliant
while missing the spirit completely.
Post by Bakul Shah
Post by erik quanstrom
it goes without saying, i think a compiler that largely does what you
ask it to optimizes the scarce resource: developer time.
That is a separate issue.
actually, i think it *is* the issue.

- erik
Bakul Shah
2012-10-29 23:26:52 UTC
Permalink
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
Post by dexen deVries
http://news.ycombinator.com/item?id=4711346
9fans says, ``no room in the compiler world for amateurs''. what's your take
Post by dexen deVries
on the above fubar?
any sort of "advanced" code-moving optimization is confusing. but the
way c/c++ are used in linux, bsd & osx, there is a noticable benefit to
optimizing calls away. it takes smarts to optimize away those recursive
wrapper macros. so they're in a bit of a pickle.
It has nothing to do with "how" C/C++ are used in linux, bsd &
osx -- you forgot windows! The C standard allows a lot of
my point was that the attitude that every optimization allowed is required is not
helpful and is in the end counter productive.
No disagreement there on "requiring" optimization. But my
point was that a programmer should understand the standard
rather than complain when he gets "surprised" due to his lack
of knowledge.
Post by erik quanstrom
actually, to be a bit cute about it, i should announce the first
international obfuscated c compiler contest. the goal of the contest
is to write a c99-compliant compiler that breaks every program in /sys/src/cmd.
the winner will be chosen based on highest percentage of programs broken,
with the tie going to the most devious tricks for remaining standards compliant
while missing the spirit completely.
/sys/src/cmd follows plan9 c, not c99, right? But pick a
similar set of programs. If this happens, I claim it would be
because programs assume something not guaranteed by the
compiler.
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
it goes without saying, i think a compiler that largely does what you
ask it to optimizes the scarce resource: developer time.
That is a separate issue.
actually, i think it *is* the issue.
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
erik quanstrom
2012-10-29 23:36:16 UTC
Permalink
Post by Bakul Shah
No disagreement there on "requiring" optimization. But my
point was that a programmer should understand the standard
rather than complain when he gets "surprised" due to his lack
of knowledge.
i agree that one should know the language. but i'm not sure i'll
say it's the programmer's fault when the compiler does things to
the programmer. that to me goes a bit too far.
Post by Bakul Shah
/sys/src/cmd follows plan9 c, not c99, right? But pick a
similar set of programs. If this happens, I claim it would be
because programs assume something not guaranteed by the
compiler.
and i can design a standards-compatible compiler that will break
most any c program.

this is similar to saying that i can design a namespace that will
break almost any plan 9 program.

what i think has broken down is common sense among compiler
writers. they're too focused on being fast, and not focused enough
on being *useful*.
Post by Bakul Shah
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
it goes without saying, i think a compiler that largely does what you
ask it to optimizes the scarce resource: developer time.
That is a separate issue.
actually, i think it *is* the issue.
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
this is a spurious argument, since we are in fact talking about c.

- erik
Charles Forsyth
2012-10-29 23:58:23 UTC
Permalink
"But my point was that a programmer should understand the standard"

But suppose the standard does not evidently aim to be understood, in the
generally understood meaning of "understood",
or there are more words in the standard than will ever appear in the
programmer's own programs?
Worse! "Standard" doesn't imply a fixed point ("oh, that syntax/semantics
is so last year!").
I think looking into memset and deciding it's not worthwhile calling is
perhaps overly enthusiastic. Actually, it's wrong, because it overlooks
the side-effect, and an optimiser for a language with side-effects
should take that into account.
Bakul Shah
2012-10-30 00:52:48 UTC
Permalink
Post by Charles Forsyth
"But my point was that a programmer should understand the standard"
But suppose the standard does not evidently aim to be understood, in the
generally understood meaning of "understood",
or there are more words in the standard than will ever appear in the
programmer's own programs?
The C standard is not too hard to understand. For something
worse try one of those ITU standards! Try IEEE 802 standards!
I have had to read the Bridging standard many many more times
(compared to the C standard) to make sense of it. The
standards *shouldn't* be so horrible but they are. And one
does what is needed to get the job done.
Post by Charles Forsyth
Worse! "Standard" doesn't imply a fixed point ("oh, that syntax/semantics
is so last year!").
I think looking into memset and deciding it's not worthwhile calling is
perhaps overly enthusiastic.
I ask again. Who decides where the line is drawn? I think that
in a competitive environment the only thing that can restrain
people is the standard. Unfortunately.
Post by Charles Forsyth
Actually, it's wrong, because it overlooks the side-effect, and an
optimiser for a language with side-effects
should take that into account.
They put in "volatile" to ensure side-effects happen. Hasn't
worked too well.
erik quanstrom
2012-10-30 01:01:52 UTC
Permalink
Post by Bakul Shah
The C standard is not too hard to understand. For something
worse try one of those ITU standards! Try IEEE 802 standards!
I have had to read the Bridging standard many many more times
(compared to the C standard) to make sense of it. The
standards *shouldn't* be so horrible but they are. And one
does what is needed to get the job done.
this is a logical fallacy.

the fact that there are larger and more obtuse standards does not
mean that the c standard is readable or understandable, by the
normal definitions of those words.
Post by Bakul Shah
Post by Charles Forsyth
Actually, it's wrong, because it overlooks the side-effect, and an
optimiser for a language with side-effects
should take that into account.
They put in "volatile" to ensure side-effects happen. Hasn't
worked too well.
that's incorrect. it was a hack to try to sneak a memory
model in the side door. side effects are something else entirely.

- erik
t***@polynum.com
2012-10-30 07:55:28 UTC
Permalink
Post by Charles Forsyth
But suppose the standard does not evidently aim to be understood, in the
generally understood meaning of "understood",
or there are more words in the standard than will ever appear in the
programmer's own programs?
Do you mean as if standards were designed or "improved" by committees?

Please: stop spreading FUD! (Where are my drops? I have tachycardia
again!)
--
Thierry Laronde <tlaronde +AT+ polynum +dot+ com>
http://www.kergis.com/
Key fingerprint = 0FF7 E906 FBAF FE95 FD89 250D 52B1 AE95 6006 F40C
Bakul Shah
2012-10-30 00:35:01 UTC
Permalink
Post by erik quanstrom
Post by Bakul Shah
No disagreement there on "requiring" optimization. But my
point was that a programmer should understand the standard
rather than complain when he gets "surprised" due to his lack
of knowledge.
i agree that one should know the language. but i'm not sure i'll
say it's the programmer's fault when the compiler does things to
the programmer. that to me goes a bit too far.
Not a question of fault, but IMHO a programmer, like a
carpenter or anyone who does real work, has to experiment and
learn the strengths and weaknesses of the tools of his trade
if he wants to become competent.

The language standard can only mandate what you can do within
the language. For instance, if the blog writer had done this:

...
memset(a, 0, sizeof a);
for (i = 0; i < sizeof a; i++)
assert(a[i] == 0);
return 0;
}

He would have seen that every a[i] was indeed 0. But if you
use means external to the language (such as using a debugger or
looking at the assembly) to check side-effects, you get
to see the underside of optimization. Ugly, confusing and
possibly surprising. If you want to do something not covered
by the standard, learn what the compiler does and find a
compiler-dependent way.
Post by erik quanstrom
Post by Bakul Shah
/sys/src/cmd follows plan9 c, not c99, right? But pick a
similar set of programs. If this happens, I claim it would be
because programs assume something not guaranteed by the
compiler.
and i can design a standards-compatible compiler that will break
most any c program.
Care to give an example? I am genuinely interested. The C
standard is far from perfect but it has worked well enough.
[Where is doug gwyn when we need him?]
Post by erik quanstrom
what i think has broken down is common sense among compiler
writers. they're too focused on being fast, and not focused enough
on being *useful*.
I agree, but I am afraid they do that because that is what the
market demands. Pretty much all of the software that the
Internet runs on + used by most people + businesses uses C or
C++, so C/C++ and their standards have been *useful* (but see
below). Fast does matter when you scale up (a big company may
need 18K servers instead of 20K, and that results in savings
and less heat generation).

If you want to whine, whine about the fact that *we* (all of
us) have lost common sense in pretty much every field. Food,
medicine, media, politics, religion, consumption, you name it.
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
it goes without saying, i think a compiler that largely does what you
ask it to optimizes the scarce resource: developer time.
That is a separate issue.
actually, i think it *is* the issue.
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
this is a spurious argument, since we are in fact talking about c.
Sorry but IMHO "saving developer time" & C don't mix well. I
am just pointing out what modern C is and how to survive, not
what it should be. I gave up on that a long time ago!
c***@gmx.de
2012-10-30 01:46:34 UTC
Permalink
m(

--
cinap
erik quanstrom
2012-10-30 01:10:41 UTC
Permalink
Post by Bakul Shah
Not a question of fault but IMHO a programmer, like a
carpenter or anyone who does real work, has to experiment and
learn the strengths and weaknesses of his tools of his trade
if he wants to become competent.
The language standard can only mandate what you can do within
...
memset(a, 0, sizeof a);
for (i = 0; i < sizeof a; i++)
assert(a[i] == 0);
return 0;
}
He would have seen that every a[i] was indeed 0. But if you
use means external to the language (such as used a debugger or
look at the assembly language) to check side-effects, you get
to see the underside of optimization. Ugly, confusing and
possibly surprising. If you want to do something not covered
in the standard, learn what the compiler does and find a
compiler dependent way.
you are arguing for a cartoon hammer that runs away when you're
not looking at it.
Post by Bakul Shah
Post by erik quanstrom
and i can design a standards-compatable compiler that will break
most any c program.
Care to give an example? I am genuinely interested. The C
standard is far from perfect but it has worked well enough.
[Where is doug gywn when we need him?]
oh, there are a number of obvious under-handed tricks. let's see,
for an intel compiler, change the structure padding to something bizarre
like 3 bytes. that will break a large number of linux drivers. (the 57711
for sure!) rearrange structure elements other than the first.
you can change from signed-preserving to unsigned-preserving.
that'll catch a lot of folks out. do ones-complement arithmetic
(i believe that's still legal). have 10-bit bytes, 20-bit shorts, 40-bit
ints, 45-bit longs and 80-bit vlongs. (i'm not sure that's completely
legal, but you get the point.) make pointers to different types different
sizes. the list goes on. the combinations are silly, too. default
unsigned characters with unsigned-preserving behavior. good luck
debugging that!

and since you would allow compilers to mess with side effects, we could
have a genuine field day.
Post by Bakul Shah
Post by erik quanstrom
what i think has broken down is common sense among compiler
writers. they're too focused on being fast, and not focused enough
on being *useful*.
I agree but I am afraid they do that because that is what the
market demands. Pretty much most of the software that the
Internet runs on + used by most people + businesses uses C or
C++ so C/C++ and their standards have been *useful* (but see
below). Fast does matter when you scale up (a big company may
need 18K servers instead of 20K and that results in savings
and less heat generation).
i don't understand your use of the word "useful". to me, the c
standard has become like a political issue. it's something to
game for personal advantage.

- erik
Bakul Shah
2012-10-30 03:06:34 UTC
Permalink
Post by erik quanstrom
you are arguing for a cartoon hammer that runs away when you're
not looking at it.
That is an excellent definition of optimization! Typical
optimizations:

- putting a variable in a register
- caching a value in a register
- moving a computation outside a loop
- inlining
- not doing something that is not going to be used
- reusing a register (so the old variable no longer exists)
- moving a computation to callsite from the called function
(partial inlining)
- partial evaluation
- caching a result
- don't allocate a variable if not used

In all these cases, the generated code will look different
compared to when *no* optimization is done.

The issue again is, who draws the line? And where? And is it
the same for every processor model? [An optimization on one
can be a pessimization on another]. Note that even the
simple flat memory model of C is an illusion.

The "stalin" Scheme to C compiler can do whole program
optimization and produces amazingly efficient programs --
often better than hand written C code. Letting it do the
optimization saves a lot of developer time, and more
importantly, one can write clear understandable code and not
have to hand optimize it into spaghetti code (where some
obscure bug can hide). Now the compiler is slow so I would
only use it if I am going to run something for a very long
time.

I would argue that one should not put limits on what can be
optimized but should keep the language simple and clearly pin
down semantics as precisely as possible.
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
and i can design a standards-compatable compiler that will break
most any c program.
Care to give an example? I am genuinely interested. The C
standard is far from perfect but it has worked well enough.
[Where is doug gywn when we need him?]
oh, there are a number of obvious under-handed tricks. let's see,
for an intel compiler, change the structure padding to something bizarre
like 3 bytes. that will break a large number of linux drivers. (the 57711
for sure!) rearrange structure elements other than the first.
you can change from signed-preserving to unsigned preserving.
that'll catch a lot of folks out. do ones-complement arithmetic
(i believe that's still legal). have 10-bit bytes. 20 bit shorts, 40 bit
ints, 45-bit longs and 80-bit vlongs. (i'm not sure that's completely
legal, but you get the point.) make pointers to different types different
sizes. the list goes on. the combinations are silly, too. default
unsigned characters with unsigned preserving behavior. good luck
debugging that!
My head hurts! Clearly programs that break won't be standard
compliant :-)
Post by erik quanstrom
Post by Bakul Shah
Post by erik quanstrom
what i think has broken down is common sense among compiler
writers. they're too focused on being fast, and not focused enough
on being *useful*.
I agree but I am afraid they do that because that is what the
market demands. Pretty much most of the software that the
Internet runs on + used by most people + businesses uses C or
C++ so C/C++ and their standards have been *useful* (but see
below). Fast does matter when you scale up (a big company may
need 18K servers instead of 20K and that results in savings
and less heat generation).
i don't understand your use of the word "useful". to me, the c
standard has become like a political issue. it's something to
game for personal advantage.
gcc etc. are used to deliver a lot of code that is used in the
real world. And without a standard there would've been a lot
less interoperability and far more bugs.

You seem to be arguing for K&R as the standard or something,
but we already tried that until 1989. A standard was needed
due to the success of C, with independent implementations
interpreting unwritten things in different ways from K&R.
I doubt Ritchie and co wanted to take an active and central
role (and I am not familiar with the history), but my guess
is that only that could've kept the standard simple and readable.
Corey Thomasson
2012-10-30 03:16:01 UTC
Permalink
Post by Bakul Shah
gcc etc. are used to deliver a lot of code that is used in the
real world. And without a standard there would've been a lot
less interoperability and far more bugs.
Most interoperability delivered by gcc comes from the fact that gcc is
widespread, not that the standard is effective. If it was we wouldn't need
to port gcc to everything.
Post by Bakul Shah
You seem to be arguing for K&R as the standard or something,
but we already tried that until 1989. A standard was needed
due to the success of C, with independent implementations
interpreting unwritten things in different ways from K&R.
I doubt Ritchie and co wanted to take an active and central
role (and I am not familiar with the history), but my guess
is that only that could've kept the standard simple and readable.
erik quanstrom
2012-10-30 13:08:11 UTC
Permalink
Post by Corey Thomasson
Post by Bakul Shah
gcc etc. are used to deliver a lot of code that is used in the
real world. And without a standard there would've been a lot
less interoperability and far more bugs.
Most interoperability delivered by gcc comes from the fact that gcc is
widespread, not that the standard is effective. If it was we wouldn't need
to port gcc to everything.
even clang got forced into bug-for-bug compatibility mode.

- erik
a***@skeeve.com
2012-10-30 15:15:08 UTC
Permalink
Post by erik quanstrom
Post by Corey Thomasson
Post by Bakul Shah
gcc etc. are used to deliver a lot of code that is used in the
real world. And without a standard there would've been a lot
less interoperability and far more bugs.
This remains true. It is possible and not that difficult to write
code that can be successfully and correctly compiled by multiple compilers
(like, oh, i dunno, maybe, GNU awk :-).
Post by erik quanstrom
Post by Corey Thomasson
Most interoperability delivered by gcc comes from the fact that gcc is
widespread, not that the standard is effective. If it was we wouldn't need
to port gcc to everything.
even clang got forced into bug-for-bug compatibility mode.
Also the Intel compiler, ICC.

Like many things, standards seem to obey a bell curve. The problem is that
we're on the descending side of so many of them... (Dare I say it? "POSIX".
Feel free to run screaming in horror. :-)

Having lived through the Unix wars of the late 80s and early 90s, I think
that overall standards are a good thing. It just seems that more recently
the committees keep adding stuff in order to justify their continued existence,
instead of solving real problems.

Arnold

Bakul Shah
2012-10-29 23:31:19 UTC
Permalink
Post by Bakul Shah
/sys/src/cmd follows plan9 c, not c99, right? But pick a
similar set of programs. If this happens, I claim it would be
because programs assume something not guaranteed by the
compiler.
Oops. Meant to say "not guaranteed by the standard".
Kurt H Maier
2012-10-30 01:21:02 UTC
Permalink
Post by Bakul Shah
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
Two problems with this:

1) Developer time is not worth saving, because developers are cheap and
they don't use their time very well in the first place.

2) C was a high-level language during the era where developer time was
worth saving.
t***@polynum.com
2012-10-30 08:07:07 UTC
Permalink
Post by Bakul Shah
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
The HLL will have to have a compiler. If the compiler emits, finally,
machine instructions, how does one guarantee that it does exactly what
is intended on different architectures? And if the next-level language,
near the machine but high level, is used---it's C---you are relying on
what you wanted to escape from...

The only means to be sure is to have a HLL that allows only HL
instructions, just as the macroscopic world, due to its complexity,
escapes the uncertainty of quantum effects. This is to say, the
language will not allow you to do everything.

The practical solution is to have a language that is near the machine,
sufficiently abstracted from the machines that assembly is scarcely
needed, with some small set of primitives, a small manual, a clear
(once again: small) set of what is guaranteed and caveats about
what is not, and let the programmer decide without the compiler
"guessing" or improving---and a way of writing sources so that a
programmer can optimize his code at this level (this is literate
programming). C was very near a perfect practical
solution; C89 was not bad; C99 has improved some things but added
too many things at the border (where there is already uncertainty
about what has to be done because "it depends").
--
Thierry Laronde <tlaronde +AT+ polynum +dot+ com>
http://www.kergis.com/
Key fingerprint = 0FF7 E906 FBAF FE95 FD89 250D 52B1 AE95 6006 F40C
Richard Miller
2012-10-30 10:26:52 UTC
Permalink
Post by Bakul Shah
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
C was created as a language to do bit fiddling in - a tool for writing
assembly language a bit more productively than doing it by hand. The
original Unix C compiler was a tool for writing PDP11 assembly
language. Later when the prospect of Unix on other architectures
began to emerge, the 7th edition Unix "portable C compiler" was a
family of tools for writing assembly language on different machines.

C was designed by and used by systems programmers who were intimately
familiar with their machine's instruction set, and the code generation
was so simple and straightforward that to such programmers it would be
obvious what instructions would correspond to a given bit of C. The
low-level directness of C made it easy to write efficient code, so
there was no need for the compiler itself to try to do clever
"optimisations".

Such a language was and still is just what is needed for writing
device drivers and the lowest layers of OS kernels where things like
memory layout and instruction ordering are important. It's not such a
good match for writing the more abstract upper layers of an OS, and
even less so for applications programming. I think it's unfortunate
that C ended up being so widely (mis)used for applications
programming, and by applications programmers who have never learned
assembly language, causing pressure for compilers to be obfuscated
with semantics-perverting "optimisations" and, in an attempt to
compensate for this, the language to be defaced with "features" like
volatile and __attribute__ and whatnot.

I think inferno got it about right: write the kernel in C, and
the applications in a high level language (limbo).
dexen deVries
2012-10-30 10:39:19 UTC
Permalink
Post by Richard Miller
Post by Bakul Shah
Best way to save developer time is to program in a HLL and not
worry about bit fiddling. C is not a HLL.
C was created as a language to do bit fiddling in - a tool for writing
assembly language a bit more productively than doing it by hand. (..)
slightly branching off:

true that. however, people still manage medium-to-high-level code with a bit
of macros and advanced stuff.

the other day i browsed http://9hal.ath.cx/usr/cinap_lenrek/cifsd.tgz -- looks
high-level enough for me.

putty uses home-brew coroutines in portable C, as per
http://www.chiark.greenend.org.uk/~sgtatham/coroutines.html
--
dexen deVries

[[[↓][→]]]


How often I found where I should be going
only by setting out for somewhere else.
-- R. Buckminster Fuller