Discussion
sourcegrift: Assert early, assert often!
ziyao_w: Random anecdote: Mr. Hoare (yep, not a Dr.) has always been one of my computing heroes. Mr. Hoare gave a talk back during my undergrad, and for some reason, despite being totally checked out of school, I attended, and it is one of my formative experiences. AFAICR it was about proving program correctness. After it finished, during the Q&A segment, one student asked him his opinion of the famous Brooks essay "No Silver Bullet", and Mr. Hoare's answer was... total confusion. Apparently he had not heard of the concept at all! It could be a lost-in-translation thing, but I don't think so, since I remember understanding the phrase "silver bullet", which otherwise did not make any sense to me. And now Mr. Hoare and Dr. Brooks are two of my all-time computing heroes.
groos: I've had the good fortune to attend two of his lectures in person. Each time, he effortlessly derived provably correct code from the conditions of the problem and made it seem all too easy. Ten minutes after leaving the lecture, my thought was "Wait, how did he do it again?"

RIP, Sir Tony.
arch_deluxe: One of the greats. Invented quicksort and concurrent sequential processes. I always looked up to him because he also seemed very humble.
madsohm: They were never concurrent, they were communicating. https://en.wikipedia.org/wiki/Communicating_sequential_proce...
pjmlp: Rest in peace. Sadly, the industry hasn't changed.

"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to - they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."

-- C.A.R. Hoare, "The 1980 ACM Turing Award Lecture"
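The discipline Hoare describes survives in today's memory-safe languages. In Rust, for instance, every slice subscript is checked at run time, and `get` even surfaces the check in the type system. A minimal sketch:

```rust
fn main() {
    let scores = [10, 20, 30];

    // `get` surfaces the bounds check in the type: it returns an
    // Option instead of trapping, so the caller must handle absence.
    assert_eq!(scores.get(1), Some(&20));
    assert_eq!(scores.get(7), None);

    // Plain indexing is also checked "on every occasion", as Hoare
    // describes: an out-of-bounds subscript aborts the program (panics)
    // instead of silently corrupting memory. `scores[7]` would panic.
    let i = 2;
    assert_eq!(scores[i], 30);
}
```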
krylon: Rest in peace.
adrian_b: That is indeed the correct title, but the processes were concurrent. However, they were not just concurrent, but also communicating.
paul: One of my favorite quotes: “There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies.”

I think about this a lot because it’s true of any complex system or argument, not just software.
pradn: He came to give a lecture at UT Austin, where I did my undergrad. I had a chance to ask him a question: "what's the story behind inventing QuickSort?". He said something simple, like "first I thought of MergeSort, and then I thought of QuickSort" - as if it were just natural thought. He came across as a kind and humble person. Glad to have met one of the greats of the field!
shaunxcode: Absolutely the GOAT of concurrency. May his ring never die.
embit: Talking about Quicksort, John Bentley’s deep dive in Quicksort is quite illuminating. https://m.youtube.com/watch?v=QvgYAQzg1z8
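For reference, Hoare's original partition scheme (two indices walking inward, swapping elements that sit on the wrong side of the pivot) can be sketched roughly as follows; this is an illustrative Rust version, not Bentley's tuned one:

```rust
// Quicksort with a variant of Hoare's partition scheme.
fn quicksort(a: &mut [i32]) {
    if a.len() <= 1 {
        return;
    }
    let p = partition(a);
    let (left, right) = a.split_at_mut(p + 1);
    quicksort(left);
    quicksort(right);
}

fn partition(a: &mut [i32]) -> usize {
    let pivot = a[0];
    let mut i = 0;
    let mut j = a.len() - 1;
    loop {
        // Walk i right past elements already smaller than the pivot,
        // and j left past elements already larger than it.
        while a[i] < pivot { i += 1; }
        while a[j] > pivot { j -= 1; }
        if i >= j {
            // Everything in a[..=j] is <= everything in a[j+1..].
            return j;
        }
        a.swap(i, j);
        i += 1;
        j -= 1;
    }
}

fn main() {
    let mut v = vec![5, 3, 8, 1, 9, 2, 7, 3];
    quicksort(&mut v);
    assert_eq!(v, vec![1, 2, 3, 3, 5, 7, 8, 9]);
}
```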
znpy: oh man, google tech talks. what a throwback. there was a time, 10-15 years ago, when they were super cool. at some point they """diluted""" the technical content and the calibre of guests, and they faded into irrelevance.
nemo44x: How many jobs were won or lost based on a candidate's ability to implement his algorithms?
rramadass: ACM published this book in 2021: Theories of Programming: The Life and Works of Tony Hoare - https://dl.acm.org/doi/book/10.1145/3477355

Somebody needs to contact ACM and have them make the above book freely available now; there can be no better epitaph.

Tony Hoare's lecture in honour of Edsger Dijkstra (2010): What can we learn from Edsger W. Dijkstra? - https://www.cs.utexas.edu/~EWD/DijkstraMemorialLectures/Tony...

Somebody needs to now write a similar one for Tony Hoare.
withoutboats3: This is indeed a great quote (one of many gems from Sir Tony) but I think the context that follows it is also an essential insight:> The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.(All from his Turing Award lecture, "The Emperor's Old Clothes": https://www.labouseur.com/projects/codeReckon/papers/The-Emp...)
Plasmoid: Fun story - at Oxford they like to name buildings after important people. Dr Hoare was nominated to have a house named after him. This presented the university with a dilemma of having a literal `Hoare house` (pronounced whore).I can't remember what Oxford did to resolve this, but I think they settled on `C.A.R. Hoare Residence`.
cucumber3732842: Shame the university takes itself so seriously. The illustrative example of overloading would have been pertinent to his subject of expertise.
riazrizvi: Cowards.
baruchel: Yes, but don't forget his formal work also (Hoare logic).
rramadass: To me, this is his most important contribution; everybody else built on top of it.

Hoare Logic - https://en.wikipedia.org/wiki/Hoare_logic
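For readers who haven't met it: a Hoare triple {P} C {Q} asserts that if precondition P holds before command C runs (and C terminates), postcondition Q holds afterwards. The assignment axiom, with a small instance:

```
Assignment axiom:              { Q[E/x] }  x := E  { Q }

Instance (Q is x > 0, E is x + 1):
                               { x + 1 > 0 }  x := x + 1  { x > 0 }
```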
lkuty: Rediscovering it through the Dafny programming language. Brings back memories of a 1994 University course.
jefffoster: I remember attending a tech event at MSR Cambridge, where a speaker made some disparaging comment about older developers not being able to keep up in this modern world of programming.

An older gentleman stood up and politely mentioned that he knew a thing or two. That was Tony Hoare.
ontouchstart: I watched this video a few months ago.

Virtual HLF 2020 – Scientific Dialogue: Sir C. Antony R. Hoare/Leslie Lamport
https://www.youtube.com/watch?v=wQbFkAkThGk
adrian_b: He also invented many other things, like enumeration types, optional types, and constructors. He popularized the "unions" introduced by McCarthy, which were later implemented in ALGOL 68, from where a crippled form of them was added to the C language.

Several keywords used in many programming languages come from Hoare, who either coined them himself or took them from another source, but all later programming language designers took them from Hoare. One example is "case", though here only the keyword comes from Hoare, because a better form of the "case" statement had been proposed first by McCarthy many years earlier, under the name "select".

Another example is "class", which Simula 67, and then all object-oriented languages, took from Hoare. However, in this case the keyword was not first used by Hoare, because he took "class", together with "record", from COBOL.

Another keyword popularized by Hoare is "new" (which Hoare took from Wirth, but everybody else took from Hoare), later used by many languages, including C++. In Hoare's proposal, the counterpart of "new" was "destroy", hence the name "destructor", used first in C++.

The paper "Record Handling", published by C.A.R. Hoare in November 1965, was a major influence on many programming languages. It prompted significant changes in the IBM PL/I programming language, including the introduction of pointers. It was also the source of many features of the SIMULA 67 and ALGOL 68 languages, from where they spread to many later programming languages.

The programming language Occam was designed mainly as an implementation of the ideas described by Hoare in the "Communicating Sequential Processes" paper published in August 1978. OpenMP also inherits many of those concepts, and some of them are also in CUDA.
EdNutting: And, of course, the Go programming language.
linhns: I would not say he invented Go, although Go is probably the only relevant implementation of CSP nowadays.
EdNutting: I was adding Go to the list at the very end of the comment:>OpenMP also inherits many of those concepts, and some of them are also in CUDA.
eitally: Reminds me of this Pascal quote: "I would have written a shorter letter, but I did not have the time."
https://www.npr.org/sections/13.7/2014/02/03/270680304/this-...
EdNutting: "Sir", not "Mr.", if you're going to be pedantic about titles ;)

Edit: Oh, and he has multiple honorary doctorates (at least 6!), so he would be just as much "Dr." too!
tialaramex: It is not usual to call people with an honorary doctorate "Doctor" except in the context of the awarding institution. Most likely the awarding institutions will have actually specified that the recipient should not give anybody the false impression and I can't imagine Tony is the type to do otherwise.
randomtools: Rest in peace
srean: As Prof. Dijkstra was preparing for his end of life, organizing his documents and correspondence with other researchers became an important task. Cancer had snuck up on Dijkstra and there was not much time.

One senior professor who was helping out with this asked him what was to be done with the correspondence. The professor, quite renowned himself, relates that Dijkstra told him from his hospital bed to keep the ones with "Tony" and throw out the rest.

The professor adds, with a dry wit, that his own correspondence with Dijkstra was in the discard pile too.
jonstewart: John Backus had some correspondence with Dijkstra that's worth a read: https://medium.com/@acidflask/this-guys-arrogance-takes-your...
fidotron: There's that immortal Alan Kay line "arrogance in computer science is measured in nano Dijkstras".
rramadass: Alan Kay himself said this quote is taken out-of-context and so people need to stop repeating it - https://news.ycombinator.com/item?id=11799963
justin66: [delayed]
skybrian: I mean, I like puns but they're a flash in the pan. Jokes get old after a while and you don't want to embed them in something fairly permanent like a building name.
yborg: This particular word for the oldest profession goes back to Old English. I am fairly sure it would outlive the building.
skybrian: If the problem is when the joke lives on amusing undergrads long after you've tired of it, that just makes it worse.
wood_spirit: And regretful inventor of the null reference! His “billion dollar mistake”:
https://www.infoq.com/presentations/Null-References-The-Bill...
adrian_b: The null reference was invented by Hoare as a means to implement optional types, which works regardless of their binary representation.Optional types were a very valuable invention and the fact that null values have been handled incorrectly in many programming languages or environments is not Hoare's fault.
tialaramex: Having "Optional types" only makes sense if your type system is powerful enough. There are two ways this might happen; both will solve the Billion Dollar Problem, but I think one is the clear winner.

The first way is explicit optionality, often retrofitted to languages. For example, in C# the difference between the types Goose and Goose? is that (in a suitable C# project enabling this rule) the first one is always a Goose while the second might be null instead.

The second way is Sum types: you can just add "or it's null" to the type. I think sum types are better because they pass my "three purposes" rule, where I can think of not one (Option<T> replaces optionality) or two (Result<T,E> for error handling) but at least three (ControlFlow<B, C> reifies control flow) distinct problems I no longer need separate solutions for if I have this feature.

If your type system is too weak, you suffer the Billion Dollar Problem with Hoare's idea; perhaps if this "feature" had never been invented, we'd all have migrated to languages with better type systems decades ago.
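The three purposes named here really are the same sum-type machinery; a sketch in Rust, whose standard library provides all three types mentioned:

```rust
use std::ops::ControlFlow;

// Optionality: Option<T> is an ordinary two-variant sum type.
fn first_even(xs: &[i32]) -> Option<i32> {
    xs.iter().copied().find(|x| x % 2 == 0)
}

// Error handling: Result<T, E> is the same machinery, with a payload
// in the failure variant.
fn parse_port(s: &str) -> Result<u16, String> {
    s.parse::<u16>().map_err(|e| e.to_string())
}

// Reified control flow: ControlFlow<B, C> says "stop with B" or
// "keep going with C".
fn step(x: i32) -> ControlFlow<i32, i32> {
    if x > 10 { ControlFlow::Break(x) } else { ControlFlow::Continue(x * 2) }
}

fn main() {
    assert_eq!(first_even(&[1, 3, 4]), Some(4));
    assert_eq!(first_even(&[1, 3, 5]), None);
    assert_eq!(parse_port("8080"), Ok(8080));
    assert!(parse_port("nope").is_err());
    assert_eq!(step(3), ControlFlow::Continue(6));
    assert_eq!(step(20), ControlFlow::Break(20));
}
```

In each case the compiler forces the caller to handle both variants before reaching the payload, which is exactly what banishes the billion-dollar null.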
adrian_b: I agree that for the correct use of both Optional types and Sum types (a.k.a. Union types) a type system that is powerful enough is essential. Moreover, a convenient syntax is also important.In my opinion, besides being passed as arguments of functions whose parameters are declared as having the corresponding Optional or Sum type, there is only one other permissible use of values of such types.Variables of an Optional type shall be allowed in the Boolean expression of an "if" or equivalent conditional statement/expression, while variables of a Sum type shall be allowed in an expression that tests which is the current type in a select/case/switch or whatever is the name used for a conditional statement or expression with multiple branches.Then in the statements or expressions that are guarded by testing the Optional- or Sum-type variable, that variable shall be used as having the corresponding non-optional type or the type among those possible for a Sum type that has been determined by the test.This syntax ensures that such variables will not be misused, while also avoiding the excessive and unhelpful verbosity that exists in some languages.
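Rust's `if let` and `match` come close to the discipline described here: the narrowed value exists only inside the guarded branch. A sketch (the `Shape` type is hypothetical, for illustration):

```rust
// Hypothetical sum type for illustration.
enum Shape { Circle(f64), Rect(f64, f64) }

// Optional type: the binding `n` has the non-optional type i32 and
// exists only inside the branch guarded by the test.
fn describe(opt: Option<i32>) -> String {
    if let Some(n) = opt {
        format!("got {n}")
    } else {
        "nothing".to_string()
    }
}

// Sum type: the multi-branch test; each arm sees the value at its
// variant's own type, and the compiler checks exhaustiveness.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle(r) => std::f64::consts::PI * r * r,
        Shape::Rect(w, h) => w * h,
    }
}

fn main() {
    assert_eq!(describe(Some(3)), "got 3");
    assert_eq!(describe(None), "nothing");
    assert_eq!(area(&Shape::Rect(2.0, 3.0)), 6.0);
}
```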
robot: "Communicating Sequential Processes" by Tony Hoare: https://www.cs.cmu.edu/~crary/819-f09/Hoare78.pdf

It had intrigued me with its promise of designing lock-free concurrent systems that can (I think) also be proven deadlock-free. The way it works is that processes don't share data; they pass and modify data through synchronized IPC. It seemed a foundational piece for designing reliable systems that incorporate concurrency.
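Rust's standard channels can model this directly: a zero-capacity `sync_channel` gives the rendezvous (fully synchronized) communication of CSP, with no shared data between threads. A minimal sketch:

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    // Capacity 0 makes the channel a rendezvous point: each send blocks
    // until the receiver is ready, as in CSP's synchronized communication.
    let (tx, rx) = mpsc::sync_channel::<i32>(0);

    let producer = thread::spawn(move || {
        for i in 1..=3 {
            // No shared state: the data itself moves across the channel.
            tx.send(i * 10).unwrap();
        }
        // Dropping tx closes the channel, ending the receiver's loop.
    });

    let received: Vec<i32> = rx.iter().collect();
    producer.join().unwrap();
    assert_eq!(received, vec![10, 20, 30]);
}
```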
Insanity: RIP.

His presentation on his billion dollar mistake is something I still regularly share, as a fervent believer that using null is an anti-pattern in _most_ cases. https://www.infoq.com/presentations/Null-References-The-Bill...

That said, his contributions greatly outweigh this 'mistake'.
fooker: Anti patterns are great, they act as escape hatches or pressure release valves. Every piece of mechanical equipment has some analogue for good reason.Without things like null pointers, goto, globals, unsafe modes in modern safe(r) languages you can get yourself into a corner by over designing everything, often leading to complex unmaintainable code.With judicious use of these anti-patterns you get mostly good/clean design with one or two well documented exceptions.
tialaramex: The "goto" in languages like C or C++ is de-fanged and not at all similar to the sequence break jump in "Go To Statement Considered Harmful". That doesn't make it a good idea, but in practice today the only place you'll see the unstructured feature complained of is machine code/ assembly language.You just don't need it but it isn't there as some sort of "escape hatch" it's more out of stubbornness. Languages which don't have it are fine, arguably easier to understand by embracing structure more. I happen to like Rust's "break 'label value" but there are plenty of ways to solve even the trickier parts of this problem (and of course most languages aren't expression based and wouldn't need a value there).
bell-cot: "Hoare House" would trigger millions of idiots, from rude little children to pontificating alpha ideologues. In perpetuity. The University was correct in saying "nope" to the endless distractions, misery, and overhead of having to deal with that.
Insanity: That relies on a programmer doing the right thing and knowing when to use the escape valve. From the codebases I've seen, I don't trust humans in doing the right thing and being judicious with this. But it's a good point, knowing when to deviate from a pattern is a strong plus.
Milpotel: I'm pretty sure that this is not true. I talked to Bud Lawson (the inventor of the pointer) and he claimed that they had implemented special behaviour for null pointers earlier. When I talked to Tony later about it, he said he had never heard of Bud Lawson. So probably both invented them independently, but Bud came first.
elch: If we start playing the "who was first" game, then for the Soviet machine Kiev (Kyiv), an "address language" with a "prime operation" was created in 1957-59.

The prime operation and address mapping: the prime operation defines a certain single-argument function. Its symbol (a prime mark) is written above and to the left of the argument: 'a = b, where a is the argument and b is the result of the operation. This is read as "prime a equals b" (or "b is the contents of a"). The argument a is called an address, and the function value b is called the contents of the address. The prime function ' defines a mapping from the set of addresses A to the set of contents B, which we will call an address mapping.

Page 36, chapter III: https://torba.infoua.net/files/kateryna-yushchenko/Vychislit...
Milpotel: Nice, and was that implemented, and does it qualify as a high-level language?
davidhunter: There's the Tony Hoare Room [1] in the Robert Hooke Building. We held our Reinforcement Learning reading group there.[1] https://www.cs.ox.ac.uk/people/jennifer.watson/tonyhoare.htm...
2001zhaozhao: I had countless lectures and classes there
fooker: That's why code reviews exist, it's good process to make code reviews mandatory.
mrkeen: It's too much of a stretch to call null an escape hatch, or to pretend that code reviews will somehow strip it out.The OpenJDK HashMap returns null from get(), put() and remove(), among others. Is this just because it hasn't been reviewed enough yet?
srean: That's a famous quote, and age might have mellowed him. But he was not like that at all in person with his students. He did insist that one be precise with one's words. The origin of the quote may have more to do with cultural differences between the Dutch and Americans.
blast: That's a great point which never occurred to me about Dijkstra, even though I knew where he came from. My father in law used to like this joke: "He was Dutch and behaved as such."
draygonia: Reminds me of this quote... “A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work.”
dang: https://en.wikipedia.org/wiki/John_Gall_(author)#Gall's_law

The book is well worth reading.
https://news.ycombinator.com/item?id=9948767
jgrahamc: Imagine being a world-famous computer scientist and dying and one of the top threads in a discussion of your life is juvenile crap about how your name sounds like "whore".
john_strinlai: https://news.ycombinator.com/item?id=47316880 (249 points by nextos, 16 hours ago, 61 comments)
dang: Thanks! We'll merge those comments hither.
adrian_b: Pointers and indirect addressing were used in assembly languages and machine languages much earlier than that, perhaps even in some relay-based computers. In any case, by 1954 most or all electronic computers already used them. The only priority question is which high-level programming languages were the first to use pointers.

In my opinion, the first language having pointers with implicit dereferencing was CPL, published in August 1963, and the first language having pointers with explicit dereferencing was Euler, published completely in January 1966, though this feature had already been published in November 1965. The first mainstream programming language with a large installed base to have pointers was the revised IBM PL/I, starting with its version from July 1966.
fuzzylightbulb: Imagine being an adult human but not being able to extract a tiny chuckle from such a silly thing.
jgrahamc: Well, I do have a rather special last name which makes me susceptible.
Pxtl: Good thing we now have technology that allows us to crank out complex software at rates never-before seen.
hinkley: We are poorer for his having waited until his Turing Award acceptance speech to drop that sentence. I use it all the time. Tony might be my favorite computer scientist.
HerbManic: One of the policies of The Rhinoceros Party in Canada was to increase the complexity of the taxation system so much that nobody could find the loopholes to exploit.
dboreham: Yes.
1vuio0pswjnm7: "No committee will ever do this until it is too late."

The software I like best was not written by "teams". I prefer small programs written by individuals that generally violate memes like "software is never finished" and "all software has bugs".

(End-user perspective, not a developer's)
hinkley: One of my biggest accomplishments was shipping a suite of 5 apps from four divisions, where three of them resented each other’s existence and seemed bound and determined to build rules into the system that made sure the other two couldn’t function. Which made no goddamn sense, because it was a pipeline, and you can’t get anything out one end if it gets jammed in the middle.

I was brought in to finish building the interchange format; the previous guy was not up to snuff. The architect I worked for was (with love) a sarcastic bastard who eventually abdicated about 2 rings of the circus to me. He basically took some of the high-level meetings and tapped in when one of us thought I might strangle someone.

Their initial impression was that I was a prize to be fought over, like a child in a divorce. But the guy who gives you your data has you by the balls, if he is smart enough to realize it, so it went my way nine times out of ten. It was a lot of work threading that needle (I’ve never changed the semantics of a library so hard without changing the syntax), but it worked out for everyone. By the time we were done, the gap between the way things worked and the way they each wanted them to work was on the order of twenty lines of code on their end, which I essentially spoonfed them, so they didn’t have a lot of standing to complain. And our three teams always delivered within 15% of estimates, which was about half of anyone else’s error bar, so we slowly accreted responsibilities.

I ended up as principal on that project (during a hiring/promotion freeze on that title). I felt bad for leaving within a year because someone had pulled strings for that, but I stayed until I was sure the house wouldn’t burn down after I left, and I didn’t have to do that. I must have said “compromise means nobody gets their way” about twenty times in or between meetings.
ghoshbishakh: His paper on communicating processes was a great read when I was new to computer science research.
awesome_dude: It's the committee vs the dictator issue - a small driven individual (or group) can achieve a lot, but they can also turn into tyrants.A committee forms when there's widespread disagreement on goals or priorities - representing stakeholders who can't agree. The cost is slower decisions and compromise solutions. The benefit is avoiding tyranny of a single vision that ignores real needs.
sfn42: J. Graham Cunth?
andyjohnson0: Seconded. Someone once described Systemantics as the book that system designers read under the covers at night with a torch.
rramadass: No, it is not my sentiment, nor am I being censorious. It can be inferred from Kay's own words; it probably was just tongue-in-cheek fun-poking of the kind often seen among larger-than-life figures.

John Backus called Edsger Dijkstra arrogant, since the latter was highly critical of the former's research in functional programming (not the substance but the hyping). Kay was probably riffing off of that.

The problem is that a lot of noobs/kids/oldies-who-should-know-better keep dismissing(!) Dijkstra's work because of this silly quote. Thus, in this case, a "nice story" is actually an obstacle to people reading Dijkstra.
marxisttemp: It seems that with vibe coding our industry has finally, permanently embraced the latter approach. RIP Tony.
justin66: [delayed]
jongjong: The part about 'genius' being slow and about wrestling with difficult problems resonates. The idea of 'genius', or in fact 'intelligence', being about speed isn't just a Hollywood thing; it's been a Silicon Valley thing as well, and it's why most big-tech interviews are time-constrained.

Over the years, I've also heard many tech CEOs say things along the lines of "There's only one type of intelligence" or "All intelligent people are intelligent in the same way." These kinds of statements raised my eyebrows, but now, with LLMs able to solve most puzzle problems rapidly while struggling with complex problems, it's completely obvious that this is not the case.

What it says is frightening. The CEOs of big companies have been giving positions to people who have the same thinking style as them. Quick puzzle-solving tech tests are literal discrimination against the neurodivergent and also against geniuses. They've been embracing wordcels and rejecting shape rotators. I think a guy like Tony Hoare would struggle to find a job these days.
jdironman: From the linked lecture, which I printed out to read as part of a new less-is-more screen-time regime (I print out longer-form writing for reading), I found this very interesting tidbit, in the context of Tony having made a delivery miscalculation and his team failing to deliver one of their products - which is where I think a lot of people are today with LLMs:

"Each of my managers explained carefully his own theory of what had gone wrong and all the theories were different. At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me. "You know what went wrong?" he shouted -- he always shouted -- "You let your programmers do things which you yourself do not understand." I stared in astonishment."
rockinghigh: It can also be used to simplify existing code bases.
kittikitti: I am greatly saddened by the passing of Tony Hoare. His work has affected me deeply; in personal, academic, and professional life. Without visionaries like him, I would not find the love in computer science as I do now. It would be a great honor to have accomplished a fraction of what he did. My condolences to his close friends and family.
tombert: I actually had two PhD advisors [1]; Jim Woodcock and Simon Foster.Both of them are legitimately wonderful and intelligent humans that I can only use positive adjectives to describe, but the one I was referring to in this was Jim Woodcock [2]. He had many, many nice things to say about Tony Hoare.[1] Just so I'm not misleading people, I didn't finish my PhD. No fault at all of the advisor or the school.[2] https://en.wikipedia.org/wiki/Jim_Woodcock
paddybyers: I remember Jim Woodcock as really inspirational - he was working with my PhD supervisor in 1987. We were working on a variant of Z for specifying what, today, we would call CRDTs. I was also lucky enough to meet Tony Hoare the same year and discuss those concepts.
coffeemug: Incredible letters, thanks for sharing. I wish some of this correspondence was published in physical books. What a joy it would be to read.
PaulRobinson: That's a wild ride of passive aggressive academia in a field I know something about. A rare treat. Thanks for sharing!
robot: BTW Rob Pike designed the Go language channels inspired by this work: https://go.dev/tour/concurrency/2
jdswain: Our Graphics Lab at University used to be in an old house opposite a fish and chip shop. The people at the fish and chip shop were suspicious of our lab as all they saw was young men (mostly) entering and leaving at all hours of the night. We really missed an opportunity to name it "Hoare House" after one of our favourite computer scientists.
robotresearcher: His title at Oxford was 'Professor', and he was addressed as 'Tony'.He made incoming DPhil (PhD) students a cup of tea individually in his office at the Computing Laboratory. It was a small group, but still I appreciated this personal touch.
tialaramex: I never met Tony, but I liked his work. I'm not much of a one for tea, but I don't think either of my PhD supervisors ever bought me a drink - I didn't finish (got cancer, I'm fine now†, some cancers are very curable, but frankly I was struggling anyway so it was a good excuse to quit) and I'm sure it's traditional to buy something a bit harder than a cup of tea if you pass, but I didn't get that far.Anyway my point here was just a PSA that honorary degrees "don't count". If somebody only has an honorary doctorate but insists on being called "Doctor" they're an asshole. In fact, even outside University I know a lot of MDs and PhDs and in most contexts if they insist on the title "Doctor" they're an asshole even though they're entitled.† Well not fine, I'm old but I think that's an inevitable side effect of surviving so the alternative was worse.
tialaramex: I disagree. Only true is true and only false is false. Some(1234) isn't true and None isn't false, for the same reason the string "A" isn't the ASCII byte 'A' and the 8-bit unsigned integer 10 isn't the 64-bit floating point number 10.0.

When you muddle these things it makes life very slightly easier, very briefly, and then you introduce impossible-to-find bugs and ruin your life. People tend to imagine that OK, maybe C went too far with coercions, but "I can handle smaller things, that'll be fine", and in my experience they're wrong.

I like the fact that in Rust I'm expected to write if opt.is_none() rather than treat it as if it were a boolean when it isn't. The resulting machine code isn't different, so we're talking about communicating our intent to human programmers, such as our future selves or our colleagues, and I'm certain opt.is_none() communicates that intent better than coercing it to a boolean.

I don't like the other idea you propose, but it's less of a footgun, so I don't mind that e.g. C# programmers often write this way. I wouldn't choose it myself in many cases, but I don't write "No, fix this" in reviews of such code.
chr15m: I was lucky enough to see Sir Tony Hoare speak at EuroPython in 2009. This was his last slide:

One Day
- Software will be the most reliable component of every product which contains it.
- Software engineering will be the most dependable of all engineering professions.
- Because of the successful interplay of research:
  - into the science of programming;
  - and the engineering of software.

Come on people, we have a lot of work to do.
adrian_b: You should provide a citation for where Bud Lawson published his invention.

The use of pointers in assembly language does not count as an invention, as it had been practiced since the earliest automatic computers. The use of implicit reference variables, which cannot be manipulated by the programmer, as in FORTRAN IV (1962), does not count as pointers. The method of forcing another level of evaluation of a variable with a "$" prefix, introduced in SNOBOL in January 1964 and inherited by the UNIX shell and its derivatives, does not count as a pointer either.

The term "pointer" was introduced in a revision of the IBM PL/I language, published in July 1966. In all earlier publications that I have ever seen, the term used was "reference", not "pointer".

There are 2 high-level programming languages that were the first to introduce explicit references (i.e. pointers). One was Euler, published in January 1966 by Niklaus Wirth and Helmut Weber. However, Hoare knew about this language before the publication, so he mentioned it in his paper from November 1965, where he discussed the use of references (i.e. pointers). The other was the language CPL, which had references already in August 1963. The difference between how CPL used references and how Euler used references is that in Euler pointer dereferencing was explicit, like later in Pascal or in C.
On the other hand, in CPL (the ancestor of BCPL), dereferencing a pointer was implicit, so you had to use a special kind of assignment to assign a new value to a pointer, instead of assigning to the variable pointed to by the pointer.

Looking now at Wikipedia, I see a claim that Bud Lawson invented pointers in 1964, but there is no information about where he published this or about which high-level programming language used his pointers. If the pointers of Bud Lawson were of the kind with explicit dereferencing, they would precede the Euler language by a year. On the other hand, if his pointers were of the kind with implicit dereferencing, then they came a year after the British programming language CPL.

Therefore, in the best case for Bud Lawson, he could have invented an explicit dereferencing operator, like the "*" of C, though this would not have been a great invention, because dereferencing operators were already used in assembly languages; they were missing only from high-level languages. The use of references a.k.a. pointers in a high-level programming language had already been published in August 1963, in the article "The main features of CPL" by Barron, Buxton, Hartley, Nixon and Strachey.

Until I see any evidence for this, I consider any claim that Bud Lawson invented pointers to be wrong. He might have invented pointers in his head, but if he did not publish it and it was not used in a real high-level programming language, whatever he invented is irrelevant.

I see on the Internet a claim that he might have been connected with the pointers of IBM PL/I. This claim appears to be contradicted by the evidence. If Bud Lawson had invented pointers in 1964, then the preliminary version of PL/I would have had them. In reality, the December 1964 version of PL/I did not have pointers.
Moreover, the first PL/I version used in production, from the middle of 1965 also did not have pointers.The first PL/I version that has added pointers was introduced only in July 1966, long enough after the widely-known publications of Hoare and of Wirth about pointers. That PL/I version also added other features proposed by Hoare, so there is no doubt that the changes in the language were prompted by the prior publications.So I think that the claim that Bud Lawson has invented pointers is certainly wrong. He might have invented something related to pointers, but not in 1964.PL/I had one original element, the fact that pointer dereferencing was indicated by replacing "." with "->". This has later been incorporated in the language C, to compensate its mistake of making "*" a prefix operator.The "->" operator is the only invention of PL/I related to pointers, so that is a thing that has been invented by an IBM employee, but I am not aware of any information about who that may be. In any case, this was not invented in 1964, but in 1966.
elch: He can only point to his paper from 1967 and the fact that in 1964 he was asked to join the PL/I team due to his earlier published work on linked lists.

https://dl.acm.org/doi/epdf/10.1145/363332.363344
EdNutting: There's having an honorary degree... and then there's having 6 of them, plus numerous other awards and all the achievements to back them up :)

Regardless, I've met people with only honorary doctorates, and it's a mixed bag when it comes to preferred titles. Often, though, the ones that really care soon acquire a 'superior' title anyway, so it ends up becoming a moot point.
bouncycastle: ACEEE IN PRST
practal: Just two days ago I was curious about the PhD advisor of my PhD advisor and so on, and discovered that I am actually an academic great-grandson of Hoare (shame on me, I should have realised that earlier), and joked, "Wow, they are all still alive!". Then I saw the news yesterday on HN.
brian_herman: Needs a black bar!
als0: Finally. The black bar is there.
jgrahamc: He was the professor in the Programming Research Group (known universally as the PRG) at Oxford when I was doing my DPhil, and he interviewed me for the DPhil. I spent quite a bit of time with him and, of course, spent a lot of time doing stuff with CSP, including my entire DPhil.

Sad to think that the TonyHoare process has reached STOP.

RIP.
tombert: I think I and most people had hoped that he would DIV instead.
tombert: Jim is an amazing guy. One of the rare people who are absolutely brilliant in their field and equally good at teaching it. He's also just a really kind, nice person who is delightful to chat with, though that's true of pretty much anyone in York [1].

I also think his book "Software Engineering Mathematics" [2] is an extremely approachable book for any engineer who wants to learn a bit more theory.

As I said, my dropped PhD is not a failure in any capacity on the part of my advisors or the school, mostly just life-juggling stuff.

[1] I don't know why exactly, but of all the places I've been, York has the highest percentage of "genuinely nice" people. It's one of my favorite spots in the UK as a result.

[2] https://a.co/d/02M25LcY, not a referral link.
pocksuppet: Complex software full of very obvious deficiencies that nobody bothered to look for.
gerdesj: Had to look them up (WP), wasn't disappointed. We have the Monster Raving Loony Party in the UK.

One of the Rhino Party's policies stands out. Are you sure Trump wasn't born a Canuck, stolen at birth by raccoons, and smuggled down south?

"Annexing the United States, which would take its place as the third territory in Canada's backyard (after the Yukon and the Northwest Territories—Nunavut did not yet exist), in order to eliminate foreign control of Canada's natural resources"
lokelow: I was introduced to him pretty late in the game, in this interview where he, Joe Armstrong, and Carl Hewitt talked concurrency! It was interesting hearing them discuss their different thoughts and approaches.

https://www.youtube.com/watch?v=37wFVVVZlVU
asimpletune: Haha, I was there too. I remember he made thinking clearly seem so simple. What a humble man.

If I remember correctly, his talk was about how the world of science (the pure pursuit of truth) and the world of engineering (the practical application of solutions under constraints) had to learn from each other.
pradn: I'm glad you remember it as well! I didn't think to see if there was a recording or something of this talk, until now. It looks like the text of the talk was published here: https://www.cs.utexas.edu/~EWD/DijkstraMemorialLectures/Tony...

And the talk wasn't a random talk, but a memorial talk for Dijkstra: "The 2010 Edsger W. Dijkstra Memorial Lecture". I forgot this aspect as well!
srean: Happy to meet you. I was there and I remember that question being asked. I think it was 2010.

If I remember correctly, he had two immediate ideas: his first was bubble sort, the second turned out to be quicksort.

He was already very frail by then, yet his clarity of mind was undiminished. What came across in that talk, in addition to his technical material, was his humor and warmth.
pradn: That's right - it was bubble sort first. Absolutely - frail, yet sharp. I'm happy to hear several of us didn't forget this encounter with him.
gsanghani: I remember this vividly! I believe he said that he thought of _Bubble Sort_ first, but that it was too slow, so he came up with QuickSort next
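For reference, the algorithm he arrived at can be sketched in a few lines. This is a simple functional sketch in TypeScript, not Hoare's original in-place partitioning scheme, which swaps elements within the array:

```typescript
// Minimal quicksort sketch: pick a pivot, partition the rest into
// smaller and not-smaller elements, then recurse on each side.
// Clear but not in-place; Hoare's formulation avoids the extra arrays.
function quicksort(xs: number[]): number[] {
  if (xs.length <= 1) return xs;
  const [pivot, ...rest] = xs;
  const smaller = rest.filter((x) => x < pivot);
  const larger = rest.filter((x) => x >= pivot);
  return [...quicksort(smaller), pivot, ...quicksort(larger)];
}
```

The contrast with bubble sort is the recursion: each partition step does linear work, but the problem is split roughly in half, giving O(n log n) on average instead of bubble sort's O(n^2).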
pradn: Good to hear from you after a while, Gaurav (I think?!).
Stratoscope: Here is my favorite visualization of quicksort, by a group of Hungarian dancers:https://www.youtube.com/watch?v=3San3uKKHgg
jcgrillo: > permanently

don't bet on it
Lasang: RIP
YorickPeterse: Is this why software transactional memory is so prevalent today and the actor model is barely used?
eru: Locks are even more prevalent today. And so was leaded gasoline.(To be less snarky, locks are one way you can implement both actor model and Transactional Memory. But just like JMP instructions in your CPU, it's probably better to provide programmers with a higher level of abstraction.)
messe: > It can be inferred from Kay's own words. He probably was just poking fun in a tongue-in-cheek manner often seen amongst larger-than-life figures.

...is that not obvious from the original quote? Maybe it's a cultural difference (I'm from Ireland), but that's how I've always interpreted it, and it's never occurred to me that people took it seriously or as anything other than tongue in cheek.
rramadass: The problem is with folks who don't know Dijkstra or have never (seriously) read him.

For example, every time somebody posts something about Dijkstra on HN/etc., somebody will trot out this silly quote and then others pile on (since it requires no effort) and derail any interesting conversation.

It is human nature to have an opinion on everything, and mediocrity often takes great pleasure in tearing down the greats (i mean the true ones) in order to soothe its own ego (since they know they don't measure up), i.e. "see? the great one is as flawed/mundane as us and i am showing him up".

And Dijkstra was Dutch, and the Dutch are famously blunt, which is often perceived as arrogance by others :-)
rramadass: Yeah, i knew of the video. That somewhat proves the point i make here - https://news.ycombinator.com/item?id=47331352

People only focus on that phrase since it makes a nice "talking point" and ignore all the other interesting things from Kay's talk. For example, i never knew that most of Euler's proofs were wrong by today's standards of rigor!
dilawar: "At first I hoped that such a technically unsound project would collapse but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way-- and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay."

This explains quite a lot, actually!
criddell: Tony's "An Axiomatic Basis for Computer Programming" [1] is the first academic paper I read that I was able to understand, as an undergrad. I think it unlocked something in me, because before that I never believed I would be able to read and understand scientific papers.

That was 35ish years ago. I just pulled up the paper now and I can't read the notation anymore... This might be something I try applying an AI to: get it to walk me through the paper paragraph by paragraph until I get back up to speed.

[1]: https://dl.acm.org/doi/10.1145/363235.363259
aembleton: I can recommend NotebookLM [1] for reading through scientific papers. You can then ask it questions and even get a podcast generated.

1. https://notebooklm.google/
vermilingua: Please don't, can't we just have one thread without this shit
Gibbon1: I feel there is a tension between computer science is math and computer science is plumbing.
bittercynic: Why not both?

Some seem to think that math is somehow above plumbing, but modern society couldn't exist without both, and I'd argue that modern plumbing is more critical to our health and well-being than modern math.
dang: How sad that we all missed when Carl Hewitt submitted that video to HN: https://news.ycombinator.com/item?id=19202209.I had no idea that a discussion between the three of them even existed.
antonvs: I’d want to see an example of Dijkstra’s “arrogance” that wasn’t justified.

The “truths that might hurt” essay is a great example. Yeah, the truth hurts for many people. People don’t like being called out on their folly, particularly if it’s something they don’t personally control. That doesn’t make it “arrogant” to point it out.

Also, Alan Kay is overrated. Object orientation is one of those painful truths.
strken: I'm less concerned about "justified" and more about "useful". If you behave offensively to everyone around you, then you have become your own worst enemy in the war of ideas.Ignaz Semmelweis was right. He also died in an asylum, having utterly failed to convince doctors to wash their hands between patients.
dang: (btw the "confusion" here was confusion about whether he had actually died. this comment was originally posted to https://news.ycombinator.com/item?id=47316880, which was the thread we merged hither)
robotresearcher: You’re right. And ‘Professor’ is a job title that comes and goes with the job, independent of degrees held.
tharkun__: The plumber knows how many inches per foot the pipe has to drop for the poop to flow away and not get stuck in the pipe. It's easy enough either not to drop it enough, so everything gets stuck, or to drop it too much, so the water flows away but the poop stays in place. And they're the ones who actually make it happen, and their clients really do care about that in the end. Without knowing this, the plumber is nothing. They don't necessarily need to know the why, and especially don't need to calculate it out!

Some mathematician can probably calculate that properly. Some mathematician probably first calculated it out to prove it. I'm not entirely certain that a mathematician was the reason we know what drop we need. A lot of things in "real life" were "empirically discovered" and used and done for centuries before a mathematician proved them.

Exceptions prove the rule, like when we calculate(d) things out for space travel before ever attempting it ;)
squirrellous: Can’t argue with the quote. However my current boss has been pushing this to the extreme without much respect for real-world complexities (or perhaps I’m too obtuse to think of a simple solution for all our problems), which regrettably gives me a bit of pause when hearing this quote.
senderista: much like google doodle
STELLANOVA: I uploaded the lecture to Claude and asked it to create a skill using the principles described. I guess we shall see if AI can actually follow them. :)
Milpotel: > the first language having pointers with explicit dereferencing was Euler, published completely in 1966-01

I could only find a manual for PDP-10 Euler with references. Do you have a source for an Euler with pointers?
adrian_b: "Reference" was the original term used in the languages derived from ALGOL for what is now called "pointer". The distinction that exists in C++ between "reference" and "pointer" is something very recent; in the past the two terms were synonymous.

The term "pointer" was introduced by IBM PL/I in July 1966, where it replaced "reference". PL/I introduced many terms that replaced previously used ones. For example:

  reference => pointer
  record => structure
  process => task

and a few others that I do not remember right now.

"Pointer" and "structure" became dominant after they were taken from PL/I by the C language and C became extremely popular. Previously "reference" and "record" were more frequently used.
Milpotel: But the "references" in Euler seem to be close to references nowadays. There is no access to the address, no pointer arithmetic etc. such as in PL/I.
mike_kamau: RIP
jcattle: What is the equivalent of correspondence today?

I guess back then each letter had a cost, in (delivery) time and money, so you'd better make it count. My guess is that these correspondences were often interesting to read because they had to be worthwhile to send, given the associated cost.
Lio: Sometimes I feel completely separated from mainstream culture. That the death of Sir Tony Hoare has been completely ignored by mainstream news is one of those times.

Not a peep anywhere for a man who put a big dent in reality.
bazoom42: The mistake was not null references per se. The mistake was having all references be implicitly nullable.

He states around minute 25 that the solution to the problem is to represent null explicitly in the type system, so nullable pointers are explicitly declared as such. But it can be complex to ensure that non-nullable references are always initialized to a non-null value, which is why he chose the easy solution of just letting every reference be nullable.
deathanatos: I think even having references that aren't necessarily null is only part of it. Imagine that your language supports two forms of references, one nullable, one not. Let's just borrow C++ syntax here:

  &ref_that_cannot_be_null
  *ref_that_can_be_null

The latter is still a bad idea, even if it isn't the only reference form, and even if it isn't the default, if it lets you do this:

  ref_that_can_be_null->thing()

where only things that are, e.g., type T have a `thing` attribute. Nulls are "obviously" not T, but a good number of languages' type systems that permit nullable references, or some form of them, allow treating what is in actuality T|null as if it were just T, usually leading to some form of runtime failure if null is actually used, ranging from UB (in C, C++) to panics/exceptions (Go, Java, C#, TS).

It's an error that can be caught by the type system (any number of other languages demonstrate that), and null pointer derefs are one of those bugs that just plague the languages that have them.
bazoom42: TypeScript actually supports nulls through type unions, exactly as Hoare suggests. It will not let you dereference a possibly-null value without a check.

C# also supports null safety, although less elegantly.
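The union-type approach looks like this in practice (a minimal sketch, assuming a tsconfig with strictNullChecks enabled; the function is illustrative):

```typescript
// With strictNullChecks, a `string | null` value must be narrowed
// before any string method can be called on it. Calling
// name.toUpperCase() before the null check is a compile error.
function greet(name: string | null): string {
  if (name === null) {
    return "hello, stranger";
  }
  // Inside this branch the type has narrowed from string | null to string.
  return `hello, ${name.toUpperCase()}`;
}
```

This is exactly the "explicitly represent null in the type system" design Hoare describes in the talk: nullability is opt-in per type, and the compiler forces the check at every use site.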
a96: Reminds me of another good one: Make everything as simple as possible, but not simpler. (-- probably not Einstein)
orthoxerox: TIL his first publication was in Russian, published in the USSR where he spent a year as an exchange student. I wonder if Igor Mel’čuk [0] remembers him.[0] - https://olst.ling.umontreal.ca/static/melcuk/
axelfontaine: I had the privilege to attend his "Billion Dollar Mistake" talk in person at QCon London 2009. Little did I know that this talk would go down in software development history! What an honor to have witnessed this live!

https://www.infoq.com/presentations/Null-References-The-Bill...
casey2: I didn't know him, but every time I've read one of his papers I've learned something and changed my viewpoint. I'm never worried that my effort will be betrayed when reading Hoare (especially rare in a field that moves as fast as applied CS).
lionkor: Mainstream news have not been presenting much actual relevant content beyond reporting on wars.
tialaramex: > If enabled, it won’t let you dereference a possibly-null reference.

It will moan, but it doesn't stop you from doing it. Our C# software is littered with intentional "Eh, take my word for it, this isn't null" and, even more annoying, "Eh, it's null but I swear it doesn't matter" code that the compiler moans about but will compile.

The C# ecosystem pre-dates nullable reference types and so does much of our codebase, and the result is that you can't reap all the benefits without disproportionate effort. Entity Framework, the .NET ORM, is an example.
bazoom42: You can certainly set unsafe null dereferences to be a compiler error. It is just not the default, for reasons of backwards compatibility.
kevthecoder: A near neighbour described being interviewed by Tony Hoare for his first job after graduating (he got the job!). Sounds like the interview process in those days was a chat over lunch rather than coding exercises. https://news.ycombinator.com/item?id=43592201
justyy: RIP
mrsmrtss: Add WarningsAsErrors for the project and you are done.
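Concretely, this is a hedged sketch of the csproj properties being referred to (property names per the .NET SDK documentation; exact behavior may vary by SDK version):

```xml
<PropertyGroup>
  <!-- Turn on nullable reference type analysis -->
  <Nullable>enable</Nullable>
  <!-- Promote the nullable warnings (CS86xx) to build errors -->
  <WarningsAsErrors>nullable</WarningsAsErrors>
</PropertyGroup>
```

With this in place, the "Eh, take my word for it" dereferences mentioned above fail the build instead of merely producing warnings, unless explicitly suppressed.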
dizhn: Thank you for this!
rramadass: You might find this paper interesting (uses examples in Dafny) - https://news.ycombinator.com/item?id=47334375
rramadass: Followup on the above with these two classics:

Retrospective: An Axiomatic Basis for Computer Programming. This was written 30 years after "An Axiomatic Basis for Computer Programming" to take stock of what was proven right and what was proven wrong - https://cacm.acm.org/opinion/retrospective-an-axiomatic-basi...

How Did Software Get So Reliable Without Proof? A more detailed paper on the above theme (pdf) - https://6826.csail.mit.edu/2020/papers/noproof.pdf
ontouchstart: Thanks for the recommendation. I downloaded both social.pdf and noproof.pdf on my Kindle Scribe to read them carefully and revisited the discussions of EWD638 and EWD692.

It is very interesting to see how Sir Tony diverged from EWD: one was right in the theoretical sense but cynical about human fallibility and about society heading toward ever more wasteful complexity; the other chose to live with it and stay optimistic.

There is a proverb in Chinese Taoism:

小隱隱於野，大隱隱於市

A small recluse hides in the wild, while a great recluse hides in the city.
dang: or should do!
ontouchstart: Reading it now on Kindle Scribe without the help of AI ;-).When I am done, I might simply replace SOUL.md with it and move on.
pramodbiligiri: Leslie Lamport hosted a chat with him a few years back: https://www.youtube.com/watch?v=wQbFkAkThGk. They spoke about general CS stuff and some aspects of concurrency.
vanderZwan: The quote makes much more sense as an in-joke between two like-minded people, because Alan Kay isn't exactly humble himself nor does he avoid provocative statements (and like Dijkstra has pretty much earned the right to do both).
mghackerlady: Rest in peace to a real one, we've lost one of the brightest minds of our century
westurner: Hoare logic: https://en.wikipedia.org/wiki/Hoare_logic

> The central feature of Hoare logic is the Hoare triple. A triple describes how the execution of a piece of code changes the state of the computation. A Hoare triple is of the form {P}C{Q} where P and Q are assertions and C is a command.

> P is named the precondition and Q the postcondition: when the precondition is met, executing the command establishes the postcondition. Assertions are formulae in predicate logic.

> Hoare logic provides axioms and inference rules for all the constructs of a simple imperative programming language. [...]

> Partial and total correctness

> Using standard Hoare logic, only partial correctness can be proven. Total correctness additionally requires termination, which can be proven separately or with an extended version of the While rule
rramadass: The actual context is in this video where Kay makes the comment - https://news.ycombinator.com/item?id=47328782
vanderZwan: I have seen that presentation, but it still does not give the full context. At least, I don't think it is obvious from the video alone whether this remark was a friendly jab between friends, or a stereotypically vicious academic back-and-forth between two big names in a field.
rramadass: I think this is the sequence that led to the quote:

1) People are miffed with Dijkstra due to his abrasive style.
2) John Backus has a back-and-forth with Dijkstra where he calls him arrogant.
3) The community knows of the above.
4) Dijkstra writes a paper comparing Computer Science approaches in Europe vs. the USA in his usual sharp style.
5) American scientists perceive the above as dissing them and take umbrage.
6) Alan Kay writes a paper rebutting Dijkstra's, pointing out that most of the software is written on the American side.
7) Alan Kay then disses Dijkstra with this quote, half-seriously/half-tongue-in-cheek.
biscuits1: There was an article posted here a few weeks ago titled "Nobody Gets Promoted for Simplicity."I've been thinking about it a lot, and now, in turn, the memory of Mr. Hoare.
rramadass: Nice comparison of Hoare vs. Dijkstra. Hoare was more focused and diplomatic while Dijkstra was more of a free-ranging philosopher.

I still remember the first time i came across Hoare Logic/Triples and Dijkstra's GCL/weakest precondition, understanding nothing and feeling like a complete dolt. As a young'un i thought knowing the syntax of a language and learning some idioms/patterns was all you needed for programming. Reading Hoare/Dijkstra showed me where mathematical theory met programming practice.
adrian_b: Euler had both an address-of operator, which was the prefix "@", and an indirect-addressing a.k.a. pointer-dereferencing operator, which was a postfix middle dot. So it had everything C has, except pointer arithmetic.

Only a subset of the programming languages that have pointers also allow pointer arithmetic, because many believe that whenever address arithmetic is needed, only indices should be used, not pointers, since with indices it is much easier for the compiler to determine the range of addresses that may be accessed.
pdamoc: Also sad that now all three are gone.
dang: I had the same thought.
jonjacky: > Mr. Hoare (yep not a Dr.)

Hoare's degree from Oxford was in Literae Humaniores, nicknamed Greats: ancient Rome, ancient Greece, Latin, Ancient Greek, and philosophy. According to Wikipedia, "It is an archetypal humanities course."