by russell
a follow-on, perhaps, to a side discussion in the beautiful thread about software and the people who write it.
from the Atlantic Monthly, a discussion of complexity and risk in software, and how folks are trying to deal with it.
the technical lead of my development group circulated this, as being of interest to us, because we are in the middle of developing a visual logic editor for stuff we do where I work.
a long, but interesting, read.
enjoy!
Yesterday, when that discussion about writing code was going on, I thought there should be a generic method for developing algorithms before any code was written in any given language. And, not being in the business, I figured this was too prosaic or mundane an idea to mention.
I've never written code professionally - just in the 3 or 4 classes EEs had to take way back when I was in school. (Now that I think about it, I took one that was an elective, so there were only 2 or 3 we had to take.)
I don't know if it was because I was in an engineering curriculum rather than a CS curriculum, but my recollection is that, whenever I was writing a program, I mapped out the basic logic of it, at least mentally, before I did anything.
At any rate, now there's this TLA+ thing, so ... good! I'm kind of surprised this is a newsworthy thing this far into the ongoing history of software. I'm not in the business, but that just seems weird.
Posted by: hairshirthedonist | October 05, 2017 at 03:59 PM
non-code design formalisms have a long history, at least 40 years, and can be especially useful in huge projects with large teams, or in the design of computer languages.
In many projects, data design is more challenging than algorithm design.
I'm terribly out of date; the last design formalism I had to learn to do my job was "data-flow diagrams" that supported the Booch approach to object-oriented programming. That was over 20 years ago.
These days I mostly just talk to myself (and to the eventual inheritor of my code) in the comments in the code I write, explaining the basic abstractions, and why I made particular design decisions, and under what circumstances those decisions should be changed.
Posted by: joel hanes | October 05, 2017 at 04:23 PM
Yesterday, when that discussion about writing code was going on, I thought there should be a generic method for developing algorithms before any code was written in any given language.
In many ways, I think this misses the point. I think the real blind spot is that programmers are too generic. That is, you are a programmer, so you can write programs for any company in any field.
That flexibility is wonderful, in the sense that your business doesn't have to worry about running out of people who can work on your software. But it has a downside: the people writing your software don't know your business. Worse, they frequently don't care about your business per se.
I have lost track of the number of times, over the years, that I have found myself saying to some part of the IT staff: "We're trying to run a business here!" Because they were all about what was a cool new way to do programming, rather than about getting something built that would run reliably and do the job that needed doing for the business. The latter just wasn't on their mental radar at all.
Posted by: wj | October 05, 2017 at 04:50 PM
Isn't this where the frontier pushes back? If employers are going to treat employees as fungible, won't the employees look at what benefits being fungible brings?
Posted by: liberal japonicus | October 05, 2017 at 05:57 PM
I've looked at TLA+ a bit, because I'm working in a cloud / distributed environment, and it's apparently a really good tool for that world. If Lamport is involved, my confidence level is pretty high.
You need to have some discrete math chops to work with it, which I really don't have, but I can sort of play the home game.
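for a flavor of what that kind of tool buys you, here's a toy sketch of the basic idea (in python, not TLA+, and the little lock protocol is entirely made up): enumerate every reachable state of the system and check an invariant in all of them, instead of hoping your tests happen to hit the bad interleaving.

# not TLA+, just a toy python sketch of the same idea: exhaustively
# explore every reachable state of a tiny (made-up) protocol and check
# an invariant in each one. here, two processes share a single lock;
# each process is either idle, waiting, or in its critical section.

def replace(procs, i, value):
    """Return a copy of the process tuple with slot i changed."""
    return procs[:i] + (value,) + procs[i + 1:]

def steps(state):
    """Yield every state reachable in one step from `state`."""
    procs, lock = state
    for i, p in enumerate(procs):
        if p == "idle":
            yield (replace(procs, i, "waiting"), lock)
        elif p == "waiting" and lock is None:
            yield (replace(procs, i, "critical"), i)   # acquire the lock
        elif p == "critical":
            yield (replace(procs, i, "idle"), None)    # release the lock

def check():
    init = (("idle", "idle"), None)                    # both idle, lock free
    seen, frontier = {init}, [init]
    while frontier:
        state = frontier.pop()
        procs, _ = state
        # the invariant: never two processes in the critical section at once
        assert procs.count("critical") <= 1, f"invariant violated in {state}"
        for nxt in steps(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    print(f"invariant holds in all {len(seen)} reachable states")

check()

TLA+ works over specifications rather than code, and far more cleverly than this, but that's the home-game version of the idea.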
The avionics and medical devices folks have been all about rigorous design and planning for a while now. Mainly because errors there mean somebody probably gets hurt.
But it is kind of remarkable how much of modern daily life is run by machines that nobody really understands.
Posted by: russell | October 05, 2017 at 06:01 PM
wj's 4:50 brings up issues familiar to me. In fact, probably the major way in which I'm still valuable to my company, even though I'm an utter dinosaur technically, is that I know the business and the history of our software and the kinds of things we do for our clients, *and* I can talk to technical people and more or less get what they need and what they're going to do as they write new stuff and bring old stuff almost into the 21st century. (Yeah.)
One of my colleagues, the guy in charge of machines and security, recently called our company "one big bespoke suit" -- it's an extremely niche business, and it's pretty much impossible to pick up anyone off the street who knows anything about it. So new coders (or, as we call them, developers) have a pretty long lead time before they're usefully immersed and don't need tons of handholding.
And yes, the importance of "cool new way to do a program..." compared to "this will make things easier for users" drives me crazy. Lucky for them, I'm not directly supervising the people who think that way. ;-)
I did a lot of scut data entry work when I was young. I wish all programmers had to spend six months using their own programs before anyone else did. We'd get a big improvement in user-friendliness in a hurry, IMHO.
Posted by: JanieM | October 05, 2017 at 09:39 PM
Can I complain once again about the number of hours per week that I spend waiting for smart-ass devices to boot up, initialize, configure themselves, or whatever the f#%k they're doing before they give me their undivided attention?
I know, I know: it can't be helped. So what the hell: stay on my lawn.
--TP
Posted by: Tony P. | October 05, 2017 at 09:42 PM
I know, I know: it can't be helped.
Except, of course, it can be helped. It just takes a system that only starts the stuff you really need. Instead of starting everything that the seller thinks you might possibly decide to use, maybe at some point in the misty future.
Only loading what the user will actually use. What a concept!
Posted by: wj | October 05, 2017 at 10:05 PM
There's a whole genre of YouTube videos of the "boot linux in 15 seconds" variety. And (IIRC) that was not with SSDs.
But trivial it's not.
Posted by: Snarki, child of Loki | October 05, 2017 at 11:17 PM
I think the real blind spot is that programmers are too generic. That is, you are a programmer, so you can write programs for any company in any field.
But what I meant was that there should be a generic means of developing algorithms, where "generic" perhaps means agnostic with regard to the subsequent programming language.
I would expect that to be orthogonal to the experience of the programmer.
Posted by: hairshirthedonist | October 06, 2017 at 12:53 AM
> I did a lot of scut data entry work when I was young. I wish all programmers had to spend six months using their own programs before anyone else did.
that didn't work. programmers (or managers) didn't always understand what the accounting or sales guys were doing. if you put programmers to work in accounting or sales, you have to train them as accountants or salespeople too. It's impossible to maintain cross-training in too many fields.
we can't train programmers as pharmacists just for designing medicine programs. in reverse, training pharmacists to program is also impossible.
our society is simply too complex to have people NOT specialize. the price of specialization simply must be paid.
Posted by: PhilippeO | October 06, 2017 at 02:50 AM
even if you, a programmer, have the domain knowledge down cold, you still run into the problem that programming is not the job of solving a problem in that domain, it's the job of telling a computer how to work on an abstraction of the domain problem.
that's the real trick - coming up with an analogy of the domain problem that can be programmed. if you get the analogy right, the code can be easily repaired and extended and modified as users demand more. if you get it wrong, the code and the users will disagree about what the problem really is. and that makes it hard to adapt to future requests.
that's one of the reasons i cringe when i hear people talk about 'patterns' and frameworks - it's programmers forcing domain problems into problems they've already solved. that can work for a while - you can deliver something that does what the user was browbeaten into asking for. but eventually, users are going to start using it, and they will chafe at the bits where the analogy doesn't fit their reality.
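a made-up illustration (hypothetical domain and names): say the users think in terms of library loans, and the framework at hand only knows generic "transactions".

from datetime import timedelta

# the pattern-first version: cram the loan into a generic Transaction.
# "amount" is meaningless for a loan, the due date gets smuggled into a
# catch-all field, and when users ask for renewals there's nowhere
# natural for that to live.
class Transaction:
    def __init__(self, item_id, amount, timestamp, extra=None):
        self.item_id, self.amount = item_id, amount
        self.timestamp, self.extra = timestamp, extra   # extra = due date?!

# the domain-first version: model the thing the users actually talk about.
class Loan:
    def __init__(self, item_id, borrower, due):
        self.item_id, self.borrower, self.due = item_id, borrower, due
        self.renewals = 0

    def renew(self, limit=2):
        """Extend the due date, in the words the circulation desk uses."""
        if self.renewals >= limit:
            raise ValueError("renewal limit reached")
        self.due += timedelta(weeks=2)
        self.renewals += 1

when the next request comes in (holds, overdue notices, whatever), the Loan version has an obvious place for it to live; the Transaction version fights you every step.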
Posted by: cleek_with_a_fake_beard | October 06, 2017 at 09:30 AM
...you can deliver something that does what the user was browbeaten into asking for. but eventually, users are going to start using it, and they will chafe at the bits where the analogy doesn't fit their reality.
You haven't heard about our SAP implementation, have you?
Posted by: hairshirthedonist | October 06, 2017 at 09:51 AM
cleek has it.
a small number of insights in analyzing the problem and choosing abstractions are much of the difference between elegant, maintainable software that performs well and does what the users really want in a way they understand, and brittle, opaque, frustrating crapware.
hsh: for certain problem domains, such as the design of software state machines or language compilers, algorithm frameworks exist. I would argue that the standard Unix command-line tools (find, awk, sed, grep, lex, yacc, make, cut, wc, and friends) are in fact chunks of generic algorithm that one can easily string together to do amazing things in one line of code.
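for instance (a made-up example, in python rather than shell, but the composition idea carries over): small generic pieces, each doing one job, strung together the way you'd string together grep, cut, sort, and uniq.

# a rough python analogue of a classic unix pipeline like
#     grep -i error app.log | cut -d' ' -f1 | sort | uniq -c
# each function is a small generic chunk of algorithm; composing them
# does the actual work. "app.log" and its format are invented here.

import sys
from collections import Counter

def grep(lines, needle):
    """Keep only the lines containing `needle` (case-insensitive)."""
    return (ln for ln in lines if needle.lower() in ln.lower())

def cut(lines, field):
    """Pull one whitespace-separated field out of each line."""
    return (ln.split()[field] for ln in lines if len(ln.split()) > field)

def uniq_c(items):
    """Count occurrences of each distinct item, like sort | uniq -c."""
    return Counter(items).most_common()

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        for item, count in uniq_c(cut(grep(f, "error"), 0)):
            print(f"{count:7d} {item}")

none of those pieces knows anything about logs or errors, which is the whole point.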
and again: data design is often much more challenging than algorithm design.
Posted by: joel hanes | October 06, 2017 at 10:04 AM
we can't train programmers as pharmacists just for designing medicine programs.
That is not remotely what I was talking about. I was talking about simple user-friendliness features, like not making users have to click unnecessarily, layout of interfaces, etc.
Plus -- in reality, these things could be taught in a fairly generic way, and apparently are not, at least for the people I've run into.
Posted by: JanieM | October 06, 2017 at 10:10 AM
if you put programmers to work in accounting or sales, you have to train them as accountants or salespeople too. It's impossible to maintain cross-training in too many fields.
Train them to be experts? No. (Although you might be surprised at how much they can pick up. At least if they aren't changing jobs every two years.) But train them in the basics -- the sort of stuff you would learn in the first course or two in the subject? That's quite do-able.
And even if they don't acquire a lot of subject area knowledge, they will come away with a much better awareness of the fact that what they are doing is working on something that somebody will actually be trying to use. That alone will make for a huge step forward in user-friendliness.
Posted by: wj | October 06, 2017 at 10:33 AM