• I just read the latest article by Joel Spolsky over on Joel on Software (a site you really should bookmark). Joel reminisces about the time he had his First BillG Interview, where he had to present the spec for what would become Visual Basic for Applications to Bill Gates. It’s a good read, and Joel’s writing style is very entertaining. While Joel and I have differing opinions on some things, I certainly respect his experience and knowledge, and I suspect that the things we disagree about come down to him taking some concepts too literally (especially with regard to Agile concepts such as no BDUF, but that’s for another post).

    Towards the end of the post, Joel makes this observation:

    Bill Gates was amazingly technical. He understood Variants, and COM objects, and IDispatch and why Automation is different than vtables and why this might lead to dual interfaces. He worried about date functions. He didn’t meddle in software if he trusted the people who were working on it, but you couldn’t bullshit him for a minute because he was a programmer. A real, actual, programmer.

    Watching non-programmers trying to run software companies is like watching someone who doesn’t know how to surf trying to surf.

    "It’s ok! I have great advisors standing on the shore telling me what to do!" they say, and then fall off the board, again and again. The standard cry of the MBA who believes that management is a generic function. Is Ballmer going to be another John Sculley, who nearly drove Apple into extinction because the board of directors thought that selling Pepsi was good preparation for running a computer company? The cult of the MBA likes to believe that you can run organizations that do things that you don’t understand.

    Hear! Hear!

    You know, it’s not that I don’t respect management as a discrete set of skills, a separate discipline if you will. Managing anything successfully requires a particular mindset and approach that is quite specific to the task of management, and the actions a manager performs (and the skill required to carry them out effectively) are specific and distinct from those needed in other endeavours (like, oh, I don’t know… developing software, for example).

    That does not mean, however, that those skills are all that a manager needs in order to be effective.

    I guess it is a theoretical possibility that somewhere there is an activity that can be managed by someone who has no understanding of that activity. (I am not saying that there is such an activity, just that there might be.)

    But developing software is not it.

    It is simply not possible to manage a software development business without a reasonable understanding of software development. I am not going to try to justify that statement here, because quite frankly either you know that this is true, or you can’t possibly be convinced that it is. What I will say, after 28 years in this game, is that whenever I have had to work with a "manager" that does not understand software development, the end result has invariably been sub-optimal. And by "sub-optimal" I mean a disaster, except where this was avoided by those who did understand and who went far beyond what could be expected of them and worked around that manager to get things done.

    While it is not necessary (indeed, it may not even be desirable) for a manager in a software business to be the most technically competent developer, it is an absolute requirement that he or she be able to understand if something is easy or hard, if it is high risk or low risk, if it is reasonable or unreasonable, if it is obviously wrong, obviously right, or just plain not known. He or she needs to understand whether an estimate is reasonable, or whether it is too optimistic or too conservative.

    If the manager can’t do these things, then how can he or she manage? Can you imagine this scenario? If I were asked to manage a banana farm (an activity I know absolutely nothing about), here is how a conversation with the farm workers might go:

    Me: The buyers want bananas that are more uniform in size. How can we do this?

    Workers: You can’t. Bananas have a certain variation in size. Indeed, the particular species we grow is internationally recognised as the most uniform in size.

    OK. Now what? Is this true? What do I do next?

    Worse still, the previous day, in the meeting with the buyer, the conversation had gone like this:

    Buyer: Our consumers are complaining about the variation in the size of the bananas. We need them to be far more uniform.

    Me: Oh, I’m sure that is not going to be a problem. After all, we employ world best practice farming techniques. I am sure we can do something about that. So if we add a clause to that effect into the contract, you will sign an order today?

    Buyer: Yes, but only if you can assure me you can have a smaller variation in the size.

    Me: No problem. Sign here, please.

    If this sounds ridiculous to you, welcome to my world…

    I don’t remember quite how long ago I first read this book, but today I just had to go and pull it off the shelf in my office at work, where it normally lives. Very near the front, it tells of one study where the authors took two pieces of corporate writing, one in typical corporate-speak and one straight-talking and clear. The identities of the companies were not evident or otherwise discernible from the content.

    They took these two pieces and showed them to a number of people in the local (to them, Atlanta) Starbucks, and asked them to select from a list of 30 words the ones that they would associate with the companies involved. There were 15 "positive" and 15 "negative" words in the list. Interestingly, the Starbucks crowd didn’t like the bull, so the four words most strongly associated with the writer of the corporate-speak were obnoxious, rude, stubborn and unreliable. And none of the 15 "good" words were associated with this company’s literature.

    The other piece fared much better — it was associated with the words: likable, energetic, friendly, inspiring and enthusiastic. None of the "negative" words were associated with it.

    Let me quote from the book:

    The short story is that people find straight talkers likable, and that’s a big deal. In his book “The Power of Persuasion”, Robert Levine, a professor of psychology, says:

    If you could master just one element of personal communication that is more powerful than anything … it is the quality of being likable. I call it the magic bullet, because if your audience likes you, they’ll forgive just about everything else you might do wrong. If they don’t like you, you can hit every rule right on target and it doesn’t matter.

    The authors also note that two of the words included in the list were "intelligent" and "educated". There was no statistical difference between the straight-talk sample and the bull sample. This means that an attempt to appear smart by (as they put it) using fifty-cent words to make five-cent points is pointless — there is simply no payoff for the verbosity.

    Quoting again:

    The bottom line: Bullshit eats away at your personal capital, while straight talk pays dividends. Invest wisely.

    Amen to that!

    Today I have endured more double-speak and, well, absolute nonsense than anyone should ever need to be exposed to, because of some fear of being absolutely clear in some communications. A futile attempt at stealth management.

    I’ll feel better soon.

    Really I will.

    My daughter forwarded me an email just today, to which she had simply added a single line at the beginning. She said:

    I LOVE YOU DAD !!!

    And here is the rest of the email, which she had obviously forwarded from one of her friends who had sent it to her:

    When you were 8 years old, your dad handed you an ice cream.
    You thanked him by dripping it all over your lap.

    When you were 9 years old, he paid for piano lessons.
    You thanked him by never even bothering to practice.

    When you were 10 years old he drove you all day, from soccer, to gymnastics, to one birthday party after another.
    You thanked him by jumping out of the car and never looking back.

    When you were 11 years old, he took you and your friends to the movies.
    You thanked him by asking to sit in a different row.

    When you were 12 years old, he warned you not to watch certain TV shows.
    You thanked him by waiting until he left the house.

    When you were 13, he suggested a haircut that was becoming.
    You thanked him by telling him he had no taste.

    When you were 14, he paid for a month away at summer camp.
    You thanked him by forgetting to write a single letter.

    When you were 15, he came home from work, looking for a hug.
    You thanked him by having your bedroom door locked.

    When you were 16, he taught you how to drive his car.
    You thanked him by taking it every chance you could.

    When you were 17, he was expecting an important call.
    You thanked him by being on the phone all night.

    When you were 18, he cried at your high school graduation.
    You thanked him by staying out partying until dawn.

    When you were 19, he paid for your college tuition, drove you to campus and carried your bags.
    You thanked him by saying good-bye outside the dorm so you wouldn’t be embarrassed in front of your friends.

    When you were 25, he helped to pay for your wedding, and he cried and told you how deeply he loved you.
    You thanked him by moving halfway across the country.

    When you were 50, he fell ill and needed you to take care of him.
    You thanked him by reading about the burden parents become to their children.

    And then, one day, he quietly died.

    And everything you never did came crashing down like thunder on your heart.

    If you love your dad, send this to as many people as you can. And if you don’t… then shame on you!!!

    I was just so touched, it moved me to tears.

    I was touched because my daughter thought of me, and acknowledged that I am at least trying to do what is right for her.

    But I was also touched because I don’t think that I can honestly say I got the same attention from my father. He was — is — unarguably a good man. He always cared and provided for his family, and his devotion to my mother, now that she is in a nursing home with Alzheimer’s disease, is truly inspirational. But in all my memories, he was in the background, working at his day job, or on one of his investments, in a never-ending battle to ensure that we wanted for nothing, and instead managing to deprive us of the most precious gift he had to offer — his time, his experience, his wisdom, his companionship.

    His friendship.

    And now, when he is in his eighties and despite living with us, his sense of duty means that he is not spending time with his grandchildren, again robbing them of the precious gifts he has, and robbing himself of the joy that they bring.

    But maybe there is one more gift he has given me. He has taught me that just doing your duty, earning money and looking after your family, although admirable and even necessary, simply is not enough. You need to give of your time, your attention. Your self.

    So Dad, I love you. Thank you for what you have done. I do understand it was the best you could do.

    I just wish I had been given the chance to know you better as a person.
    The following is a post that I made over on the PowerBasic programmer forums, in response to Paul Pank’s post asking for dietary advice. Why he thought to ask about diet on a programming forum is beyond me. I have edited out the references to other posts, but the entire thread can be seen at the PowerBasic Forum.

    First, let me get some credibility here, then make a disclaimer, then express my opinion.

    I am 47 and have been overweight pretty much all my life. My dad’s side of the family is very prone to Diabetes. In 1994, I was diagnosed with diabetes. At the time, I weighed about 130 kg (about 295 lb and I am 177 cm tall). I led a very sedentary lifestyle — my thinking was that exercise doesn’t really make you live longer, it just feels that way .

    When diagnosed with diabetes, I went through the brainwashing — err, pardon me — nutrition education. I was on 4 insulin injections a day, and had the whole food pyramid thing drummed into me. You know the one — lots of complex carbs with fibre (like vegetables and grains), far less proteins and even less fats.

    I followed the advice. Strictly. Under supervision. Really.

    And I ended up in 2003 weighing nearly 160 kg (360 lb)!

    So I took matters into my own hands, educated myself and did what I needed to do.

    Today, I weigh in at around 105 kg (240 lb: yes, that is over 50 kg/110 lb lost). I take no insulin or any other diabetic medication. My blood pressure is 110/70 and my bloodwork is terrific.

    Disclaimer: do your own research and make your own decisions. This may NOT be right for you!

    I read everything I could find. I went through Atkins and thought he had part of the picture. Then I found out about Glycemic Index (GI) and that completed the picture for me.

    Here’s how I understand it. Insulin is the hormone that allows sugars to cross over the cell membrane and be used for energy. If you don’t have enough, or it is somehow flawed, you are diabetic — the sugars accumulate in your blood while your cells slowly starve.

    But insulin does something else, too: it causes excess sugars in your blood to be stored as fat.

    This is important: it is the excess SUGARS (ie carbs) that are stored as fat, and this is done in the presence of insulin.

    To avoid this, you need to ensure that the food you eat does not cause a spike in insulin production. Foods that cause a rapid rise in insulin levels have a high GI. Glucose has the maximum of 100, and pure water has 0, with everything else inside that range. If you eat low to medium GI foods at each meal (so that the combined GI is low to medium) then your body’s ability to store fat is severely curtailed, so that even if you do eat a bit more than you absolutely need, most of the excess is not stored. That’s not to say you can pig out, but it does mean you don’t need to live your life counting calories or fat-grams or whatever.

    As for eating fats, the actual evidence of their effect on things like cholesterol is flimsy. Recently, here in Australia, the National Heart Foundation said words to the effect of “we were wrong: go ahead and eat the egg yolk too — it’s good for you and does not raise cholesterol”. My personal experience is that the level of cholesterol is largely genetic, and the environmental factors that influence it are far more likely to be related to simple starches (high GI foods) than fats. Consider this: what foods are high in fats and not also high in simple carbs? Pretty much nothing. If you eat a fast-food burger, the bun is not only white bread, it has added sugar. Fries? They’re potato! Chocolate? Sugar. Think about it.

    Calorie intake is important, but don’t get too hung up on it. If you restrict your calorie intake too much, your metabolism slows down, and stores fat more aggressively. It is better to eat regularly throughout the day, and DON’T SKIP BREAKFAST whatever you do! I try to eat six small meals every day. It is also good to give yourself a regular “free” day, where you don’t do any workouts and you eat freely. For me, that is Sunday. Nothing is off-limits on that day. The first few weeks, you go a bit nuts, but the novelty soon wears off, and knowing that you can have that chocolate “on Sunday” seems to make it easier to not have it the rest of the week. It is important to have more calories on these days, because you want to use it to kick-start your metabolism again.

    Hydrogenated oils are really bad karma. The evidence against them may not be concrete, but the anecdotal evidence is pretty overwhelming. I’ll eat butter, but not margarine.

    Finally, BMI is the biggest load of codswallop ever. It is a formulaic representation of the old height/weight charts, which have been discredited for decades. You see, a given volume of muscle weighs more than the equivalent volume of fat. So if you have a low percentage of body fat, you will weigh MORE than a person with exactly the same dimensions but a higher percentage of body fat. Elite athletes have TERRIBLE BMI scores — they are HEAVY for their height because they are very lean.

    And you want to be lean (and therefore heavy for your size) because that means that you need to burn more food just to live. As well as being stronger, both in terms of muscles but also in terms of calcium retention in bones, you will be able to eat more without putting on weight. So while aerobic exercise is good for you, losing fat really requires you to add muscle mass, and that needs strength training. Someone mentioned martial arts in a previous post — I would strongly recommend that, even if you are not particularly young. I started at 45, and am now a 1st Kye Brown Belt in the Kempo style. Find a good, family-friendly school. You will find that martial arts training is a really good mix of strength (resistance) and stamina (aerobic) work. Aikido is great — that is next on my list after I achieve Black in Kempo.

    To summarise (with all the disclaimers assumed):

    • eat low and medium GI foods. If you eat anything with a high GI, include something low GI with it at the same meal.
    • don’t stress about fat intake (be sensible here).
    • avoid hydrogenated oils
    • get into a regular, weight-bearing exercise regime
    • do NOT starve yourself
    • try to eat more, smaller meals
    • forget your weight — instead, focus on your belt size, the only figure you really need to care about
    • enjoy life – the point of looking after yourself is to enjoy yourself

    Again, do your own research.
    The wiki is now online at https://jimako.com/wiki, where I hope to maintain more of a presence than I have in this blog. The bottom line is that, most of the time, I am just too busy to journal what I am up to in the blog, and most of my ramblings are probably not terribly interesting to most people.

    So, if you do want to keep in touch, check out the wiki, which will probably be updated more frequently than this blog.

    What I would really love to see is more of an integration between the blog and the wiki. I have tried to tinker with the PmWiki software to see if I can get a blog-like experience from it, but while you can do SOME sort of blog-like activities, it is just not designed for that. If someone could skin WordPress to look like my wiki, well, that would be just too cool.
    I have been reading a great book over the past couple of days. It is called The Best Software Writing I and I picked it up at the local Borders when I dropped in to browse, as I am wont to do on the odd occasion. Reading it fired up the creative juices, so I thought I would put finger to keyboard about something that has been gnawing at me for a while now: the fact that we seem to be making things more complex than they need to be because of some misguided attempt to model the "real world".

    Now, I am the first to admit to being an old-world developer. I learnt how to program in the late 70s, at a time when OOP was simply unheard of out in the wild. I clearly remember the famous hot air balloon cover on the issue of Byte that introduced Smalltalk. Yes, not only was I alive then, I was old enough to buy Byte and read it.

    I have worked through the evolution of our craft and the changes in the way we approach the development of software. Because I learnt to program before there was OOP, initially I had a tough time really understanding how OOP worked, simply because there was a lot of unlearning that I had to do. It took me a while, and there were many times when I told myself that now I ‘got’ OOP, only to admit a little while later that I really didn’t get it at all.

    If that sounds at all negative about OOP, it isn’t meant to. I am actually a big proponent of the benefits of OOP and think that the binding of data with the procedures that operate on it is A Good Thing.

    What I am finding, however, is that a new generation of programmers who don’t know anything except OOP are losing track of the objective. When you learn OOP, you are inevitably given examples of real-world objects that you try to model. I think that if I ever see another example of an Animal class, with Dog and Cat descendants, I will scream, even though I actually wrote a student manual for a VB4 course that used these self-same classes. We learn about inheritance and its use to create polymorphic behaviour by having Cat.Speak emit a miaow while Dog.Speak emits a woof. Or we have traffic lights control their own sequencing by sending messages to each other. All very cool.

    Then off we go to do commercial programming. Except, there is a real disconnect at this point. ‘EDP’ (to use an outdated but nonetheless expressive term) is not the same as traffic control. In most business data manipulation, the entities we are dealing with are nothing but data points. The only reason the system exists is to maintain a repository of data, furnish a controlled view into that repository and provide a mechanism to allow a predefined set of transformations of that data to take place in a controlled manner. There are no dogs or cats in the wild that we are trying to model. What exactly does an account do when it is not being "modelled" by a piece of code? The answer, of course, is nothing at all — it just sits quietly in a database somewhere.

    I have seen too many systems that model an Account object with all sorts of complex behaviours. From one perspective, this can be useful — we can encapsulate all the things that we can do to an account as methods of the Account object, thereby binding the data with the code and (at least in theory) hiding the implementation details. I have no problem with this; in fact, this is exactly what makes sense.

    The problem arises when we refuse to acknowledge that the account lives in a database with a whole lot of other accounts. Worse still, we refuse to acknowledge that the database exists at all. Instead, we abstract it away with a "factory" that magically produces Account objects. It can give us a particular Account object if we can give it some distinguishing attribute, like an ID. But heaven forbid that we ask for any complex filtering — why, that’s almost like SQL, and we wouldn’t want to pollute our neat, abstract, HashMap in the Sky with any of that old "relational stuff".

    So instead of crafting a simple SQL statement that reflects the set of data we actually need, we write code that does all but the simplest filtering on the client side. You don’t want all the columns of data?  Who cares? Just don’t reference those attributes of the objects. You don’t want all the rows? Well, use one of the simple factory methods to do a first cut at the filtering, then just iterate through the list of objects that are returned and remove the ones you don’t want.

    Does anybody else see a problem here?

    Decades ago, we came up with a way to manage operations on tabular data. A whole relational algebra was developed that provided a clean set of operations that could be requested by a client program, so that only the data it actually wanted would be returned by the database. The SQL language provided a reasonably straightforward way to express those operations, and databases got really, really good at processing SQL, so that, at least in most cases, they could quickly and efficiently get just this required data together. That’s what they are designed to do, and they do it very well. And when they have done this, just that data that is actually required goes over the wire to the program that requested it, which in turn can be simpler because it doesn’t need to do the whole post-processing dance.
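
    To make the contrast concrete, here is a rough sketch in Ruby (the account class, the factory, the query helper and branch_id are all invented for this illustration, not taken from any real system):

    # Client-side filtering: pull whole object graphs, then throw most of them away.
    accounts = AccountFactory.find_by_branch(branch_id)   # hypothetical factory call
    overdrawn = accounts.select { |a| a.balance < 0 }      # every column of every row crossed the wire
    names = overdrawn.map { |a| a.customer_name }

    # Letting the database do its job: ask for exactly the rows and columns we need.
    sql = "SELECT customer_name FROM account WHERE branch_id = ? AND balance < 0"
    names = db.query(sql, branch_id)                       # hypothetical query helper

    The second version ships a fraction of the data over the wire and leaves the filtering to the engine that was built to do it.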

    Today, OOP developers think that their code has somehow become impure if it contains an SQL statement, or acknowledges the existence of a database, or a table, or a row or column. Yet these are the actual big-O Objects that are being manipulated. The row in the table (OK, the rows in related tables in the database) are the account. That is the object we are dealing with. Why do we feel the need to abstract it away?

    It’s not like a cat. It is not practical for a program to deal with an actual Cat object. The Cat interface is not well understood, and the Cat Query Language is still in its infancy. When dealing with cats, it makes sense to create a simpler abstraction of a Cat, that models all the attributes of a real cat that we are interested in, and use that as a surrogate for a real cat in our program. Besides, cat fur really clogs up the fan vent in a computer case.

    An account, on the other hand, like a large number of the "objects" we deal with in business software development, is quite easily accessible directly.

    We don’t need no steenking abstractions.

  • ROFLMAO! I just installed WordPress on my own (hosted) server. It gave me an option to import my Blogger data, which I did, and it seemed to work pretty well.

    I will now try to blog more frequently.

    Yeah, right!

    Sometimes I just crack myself up!

    I got here after fooling around on Google and thought — what the heck, I’ll create a ‘blog. Maybe this time I’ll start using it. I was surprised to find that the user name “karabatsos” was taken — maybe Joanna had created it? Hmmm. How long has this blogger thing been going? Could it have been me and I had forgotten? Sure enough, I tried a few of my old passwords and bingo! I was in. The first and only post dated back to 2001! So much for blogging…

    I am still not sure that I will follow up with this, but I will make an effort to post regularly for at least a little while to see if it helps me at all, or helps anyone else, I guess, although I don’t really see how this would interest anyone else.

    If I really get interested, I might create a few special-topic blogs that might be a kind of virtual publication. I would love to resurrect the AVDF community, but more focussed on the types of applications that I (and most programmers I deal with) develop nowadays: Java web apps.

    That’s all for the first post. It’s late and I have an early start tomorrow.

    Added later — I added a photo here so I can link to it from the profile.
    I recently bought myself a little Mac iBook through eBay. It was a good deal and I have been feeling for a while that I needed to get across the whole Apple culture.

    I am going to try to use the iBook a lot over the next little while. I know that I am going to be less productive than I am on Windows, and I will still use Windows for my development work at IBS, but I will try to set up my iBook for development too so that I can see whether it is really a viable alternative as a development workstation.

    My main motivation, however, is that I am seriously considering moving the kids over to Apples next year — I am so sick of all the down-time that I seem to have fixing one problem after another with viruses, trojans, adware and other miscellaneous malware. The Apple is not totally immune to that sort of thing of course, but it is a lot less susceptible.

    So I am using the iBook for Office applications (I installed MS Office, because I really need seamless compatibility and so will the kids) and I am really impressed with the level of functionality. Honestly, for a user of Office, the Mac versions are better than the Windows versions. Entourage rocks, and the Notebook feature in Word is very nice — I am continuously finding new uses for it.

    I am also reading up about Ruby On Rails, and am in the process of getting it all set up on the iBook.

    All in all, the experience has been positive. It has not been painless, but that is because I have a lot of Windows knowledge that I don’t have the equivalent for in the Mac world. In truth, I think it is harder for a techie to swap than a non-techie, and I really think that the next person who asks me for advice on what they should buy might well find that he or she is being pointed to a Mac. At the end of the day, they are going to come back to me for support, so it is in my interest too.
    It’s been a couple of weeks since I last blogged, and in that time I have been using the iBook a lot. I have to say that I really like this little computer. While I am still more productive using Windows when my Windows box is actually working OK, the fact is that I have to spend a LOT of time keeping my Windows machine ticking over. I’m pretty careful, and I run anti-virus and anti-spyware all the time. But that only means that I go for many months before something slips through and I need to rebuild my machine yet again. The kids — well, I’m sure that they try, but I seem to be rebuilding their machines every few weeks.

    Joanna is a case in point. I bought her a new Windows (Acer) laptop recently. Clean install of Windows XP Pro, latest service packs, all updates applied and auto updates enabled, anti-virus and 3 — yes, three! — anti spyware tools. Yesterday, she was having a problem saving a Powerpoint presentation and asked for help. There in her task bar was a little dog, saying she could get paid to surf the web. She has no idea where that came from.

    And did I mention that Powerpoint couldn’t save? Anywhere? Not locally, not on the network, not on a flash drive — nowhere? So, I tried a voodoo cure and did a “Save as Powerpoint 95”. This worked after warning me that some features might be lost. A quick bounce of Powerpoint, re-load the file (which triggers a conversion that took forever) and all is well with the world again. Except there goes a good 20 minutes of my time, not counting the interruption overhead.

    Compare that to the Apple. Now admittedly, I have only had it a few weeks, but I am using it as my main machine for everything I do except the actual coding at work (where we use the supplied Windows/Intel clones) and — yes, I know this phrase is becoming a cliche — it just works. This seems to be the comment I am hearing from everyone who is moving across to the Mac platform from Windows, and I am now joining the ranks.

    Is it perfect? Of course not. I couldn’t connect to my networked printer (an Officejet G55 hanging off a Dell XP Pro workstation whose role in life is to be our server). Turns out that there is some bug or configuration error (is there a distinction there I am not aware of?) in Tiger that prevents the authentication from working right. Don’t get me wrong, it only took a few minutes on Google to find a workaround, but it shows that even Apples have issues sometimes.

    But overall, well, I really like this computer. It really gets over 4 hours of use on a charge even with a WiFi link active, so I actually use it on the battery. I’ve never done that with a laptop before, because the Wintel laptops I have owned can’t reliably get more than about an hour and a half (although I believe that the Centrinos are pretty good).

    My current thinking is that I will leave Joanna and Peter with the Windows laptops for the rest of the year, and give Costa this iBook. It is perfect for me except that it doesn’t have Bluetooth built in, and I really want that. I like the 12 inch form factor — after all, the whole point of a laptop is that it is portable. I would really like a PC Card slot, because that would give me the ability to run the iBurst card, but the smallest Apple that has one of those is the 15 inch Powerbook. And the price — well, more than I want to spend right at the moment. I will probably end up buying myself a new 12 inch iBook with minimum RAM (because 3rd party RAM is much cheaper and easy to install) but with the biggest hard drive I can get and with the built-in Bluetooth. That comes to under $1900 delivered, probably just over $2K by the time I add 1G of third-party RAM. I’ll check whether I can get a better price online. Then over the next year or so, I can cycle my new one down to the kids while I upgrade as funds allow.

    Apples also seem to have a longer useful life, so I actually think that the cost of running Apples will be lower when taken over their effective life, even if they do cost more to start with. Time will tell.

    That’s all for now. I really need to learn some lines for tomorrow night, and it is getting late.
    I’m going to get off the topic of the Apple for today — not that nothing has happened, but because in reading over the blog I sound like some Mac fanatic. Today, Chris, a good friend of mine, showed me his new HP laptop. Huge, 17″ monster, very powerful, but battery life of about an hour, and he couldn’t get it set up to access the network. Sigh!

    But I said, no Apple today.

    Over the past couple of weeks, I have been working on a particular Java application, and I needed to extract a whole bunch of data into flat files for a particular client requirement. Cutting a long story short, I ended up writing a set of scripts to generate an XML specification of an extract that is going to be used to control the total extract process, and this gave me a chance to try my hand at Ruby.

    Now, I have heard a lot of good things about Ruby, but had not really used it before. Everyone I knew, who I respected as a programmer, and who had tried Ruby, raved about it. So, even though I knew Python, I made a point of nutting my way round Ruby.

    Obviously, it took me a little while to get moving — there is always a bit to learn when starting with a new language. But I bought a PDF copy of the Pickaxe book and zoomed through the highlights. I have to say, I like Ruby a LOT.

    Ruby is OO to the core. Everything is an object, and it has a remarkably convenient set of built-in functionality. I am not going to put together a tutorial on Ruby, at least not here, but here are a few examples to whet your appetite.

    In Java, to define a class with a set of accessor methods, you do something like this:

    public class Dog {
        private String name;

        public Dog(String name) {
            super();
            this.name = name;
        }

        public String getName() {
            return name;
        }

        public void setName(String value) {
            name = value;
        }
    }
    Here’s the same thing in Ruby:

    class Dog
      attr_accessor :name

      def initialize(name)
        @name = name
      end
    end
    Creating an instance in Java:

    Dog dog = new Dog("Rover");

    and in Ruby:

    dog = Dog.new("Rover")

    so the classes are pretty much equivalent, except that the Ruby one is (a) much shorter and (b) eliminates the need to write a whole lot of plumbing, no-brain code. Now I know that any modern IDE generates this boilerplate code for you, but it is still there and needs to be navigated and mentally discounted while you work on the stuff that DOES matter. In Ruby, the only code you write is what you need for the application — well, most of the time anyway 🙂
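
    Just to show what attr_accessor gives you, here is the generated reader and writer in use (a trivial sketch):

    dog = Dog.new("Rover")
    puts dog.name        # reader generated by attr_accessor => Rover
    dog.name = "Fido"    # writer generated by attr_accessor
    puts dog.name        # => Fido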

    Here’s a really cool thing you can do in Ruby. When you call a function, as well as passing a number of arguments to it, you can also, optionally, attach a code block to it. A code block is delimited either by the keywords do and end, or by braces (they’re the same). Inside the called function, the code can determine whether a code block has been attached to it and, if so, essentially call that block any number of times. Here is an example:

    def send(message)
      if block_given?
        yield "connecting"
      end
      connect(...)
      if block_given?
        yield "sending"
      end
      send(message)
      if block_given?
        yield "sent"
      end
      disconnect(...)
      if block_given?
        yield "done"
      end
    end
    This is a dummy, skeletal procedure. We assume that it sends a message somewhere, and there are several steps — connecting, sending and disconnecting.

    If you call it like this:

    send("Hello world")

    it just does its thing. But you can optionally attach a code block like this:

    send("Hello world") {|stage| puts "... now #{stage}" }

    Let’s look at this line. The braces define a code block — the convention seems to be that short blocks like this use braces, while long, multi-line blocks use do/end. The two vertical bars delineate a parameter list; here, the parameter is called “stage”. The single line inside the code block uses puts to display a string. I’ll get to the string in a moment, but for now just accept that this results in the following printout:

    ... now connecting
    ... now sending
    ... now sent
    ... now done

    The string that is displayed is delimited by double-quote characters, which means that the string is processed by Ruby. One of the effects of this is that the #{x} construct embedded in the string is replaced with the value of the variable x — this works everywhere, not just in these attached code blocks.
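
    For instance, any expression can be embedded this way, not just a simple variable:

    stage = "connecting"
    puts "... now #{stage}"      # => ... now connecting
    puts "2 + 2 is #{2 + 2}"     # expressions are evaluated and interpolated too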

    This code-block mechanism is used to implement a really simple, generic and pervasive style of iteration. For example, to allow arrays to be iterated, the Array built-in class implements a method “each” which, you guessed it, takes a code block. So, to iterate over an array, you use this sort of code:

    my_array.each {|element| puts element }

    The beauty of this is that any object can exhibit this behaviour — just implement an “each” method that expects a code block, and “yield” once for each element your object contains. There is no need to be in any other way related to an array.
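
    As a quick sketch of what that looks like (the Playlist class is made up purely for illustration):

    class Playlist
      def initialize(*songs)
        @songs = songs
      end

      # yield each contained song to the attached code block
      def each
        @songs.each { |song| yield song }
      end
    end

    Playlist.new("Song A", "Song B").each { |song| puts song }
    # => Song A
    # => Song B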

    Which leads me to the topic of Duck Typing. This is the Ruby philosophy about object typing. While Ruby does implement a single-inheritance object hierarchy model, you can actually use unrelated objects polymorphically as long as they implement a common subset of methods. The idea is that if it walks like a duck, and looks like a duck, and quacks like a duck, then it can be treated like a duck. Yes, this is NOT as bullet-proof as a strongly-typed language like Java, but in reality I don’t actually end up assigning a Debit object instance to an Animal object reference very often, and if I do, I will rely on my tests to pick that up. In return, I save myself a lot of unnecessary casting and fiddling in perfectly good code just to tell the compiler what I already know.
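
    Here is a small, contrived example of what I mean — Screen and Logger are invented classes that share no common ancestor, only a write method:

    class Screen
      def write(text)
        puts text
      end
    end

    class Logger
      def write(text)
        File.open("app.log", "a") { |f| f.puts(text) }
      end
    end

    # report neither knows nor cares what it is given, as long as it quacks like a writer
    def report(output)
      output.write("report complete")
    end

    report(Screen.new)
    report(Logger.new)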

    Ruby also has mix-ins, called modules. A module is a bit like an interface and a bit like an abstract class. Like an interface, a module aggregates a set of methods — these are included by classes that want to, regardless of their position in the object inheritance hierarchy. But unlike interfaces, modules have code in them too — implemented methods. These methods become part of any class that includes the module, and have access to class methods, exactly as if the code had been copied and pasted into that class. Also like interfaces, a single class can mix in, or include, any number of modules.

    Like an abstract class, it implements some code, and through access to non-coded variables and methods, can set up an expectation on the classes that include it, but unlike an abstract class, the class that includes it does NOT need to descend from it (indeed, it can’t do so, because modules are not classes per se).
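
    A tiny, made-up illustration: the Greeter module below implements greet itself, but relies on whatever class includes it to provide a name method.

    module Greeter
      def greet
        "Hello, I am #{name}"      # name is expected to come from the including class
      end
    end

    class Dog
      include Greeter
      attr_accessor :name

      def initialize(name)
        @name = name
      end
    end

    puts Dog.new("Rover").greet    # => Hello, I am Rover

    Ruby’s own Comparable module works the same way: it mixes in <, >, between? and friends, and expects the including class to supply just <=>.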

    Very powerful indeed, and I don’t claim to fully appreciate all the implications of how these can be used, but just intuitively it seems to be really useful. And just plain cool.

    Anyway, that’s more than enough for one post. Tomorrow is going to be a busy day. Toodles.
    Sometimes the pressures at work distract us from the other parts of our lives that are important but, well, never seem to be urgent. It struck me over the weekend that I had spent an incredible amount of time either at work, or thinking about work, over the past few months. It’s one of those traps that are always there, ready to spring, and I seem to get lured into it time and time again.

    This time, the realisation came to me during a meeting that we were holding at our home. It was a meeting of the Social Committee of the Lodge that I am a member of, and we had twenty or so people attend, made up of members of the lodge and some of their partners. The Social Committee is an opportunity for our women folk, who are not able to participate in the ritual of the Craft, to take an active part in the charitable and social work that we do, and to meet the other members of the Lodge and their families.

    After the business part of the meeting was over, and everyone had partaken of the food and drink, a core group of people stayed on for a few hours to catch up and socialise in an informal setting. Maria (my decidedly better half) and a number of the ladies had a great time. Much sparkling wine was consumed by the ladies, a few snifters of various flavours of spirits by the guys, and all of us had a chance to catch up and spin a few stories. It was a really nice time, and we got to talk about a wide variety of things.

    As it inevitably does in these circumstances, the conversation turned to why it is that men today are not drawn to Freemasonry. Now, the membership numbers are not in the absolute free fall that they were a decade or so ago, and indeed the latest word is that the membership numbers are pretty much stable. The reality, however, is that there are far fewer members of the Craft today than there were in past generations. The same is true of service organisations like Rotary and Lions. It wasn’t all that long ago that a young man with ambition would have thought it absolutely part of his future to join a local Lodge, but today many (probably most) of them don’t consider the option at all. We all had our theories as to why this was the case, but most of us agreed that one of the reasons is that they are all too busy!

    Today, the 40 hour week is a distant memory for most working people. Sure, you may only be in the office for 37.5 hours, but when you add in the ever-increasing commute and the inevitable “just one small thing” that you take home with you, or the research or background reading that you do at home, I think that fifty hours is closer to the average work week. Then there are all the structured activities that we do for the kids — every night, someone has to drive one child or another to a music or dance or drama lesson, a sport event, a school function or some other event that, somehow, has become a totally necessary part of life today (even though we all grew up without any of them).

    So in the hustle and bustle of everyday life, we lack that moment of stillness that is necessary to realise that something might be missing.

    For anyone reading this that is not a Freemason, let me give you a little bit of information. In the three levels, or degrees, of the Craft, a lesson is taught using a combination of role-playing and oral recitation. The lessons are on many levels, and the whole point of the lessons is to have each individual interpret them in a way that will benefit himself, will help him to become a better person in all aspects of his life. There is no masonic dogma — none at all. The whole point is for you to use your own mind and intellect to better yourself, not to adopt the ideas of someone else.

    All of which is necessary to understand that when I talk about “the lessons” of each degree, I am talking about ONE way that they can be interpreted, and by definition anything that I say is going to be coloured by my own personality and perception. Neither am I divulging any secrets here — there are surprisingly few secrets in Freemasonry. So, with that out of the way, here’s what the three degrees teach. Or perhaps I should say here’s what I learnt.

    In the first degree, you are taught that you are here for a reason, and that you, yourself, are going to discern that reason. You are taught that you are inextricably connected to the rest of humanity, and that as a member of the family of man you would do well to assist others to the best of your ability, while always being mindful that your charitable activities must not be allowed to impact negatively on the welfare of your family and other responsibilities. You are taught that you need to be ever industrious, that you are expected to take care of your own needs and the needs of those who depend on you, and not to rely on the charity of others except when there are no alternatives open to you.

    The second degree teaches that you also need to work on yourself. You need to balance your time between working at your profession, craft or employment, and working on improving your mind, your body and your spirit. Without doing this, your ability to contribute to the world is going to be limited, both to the world at large and to those nearest and dearest to you.

    Finally, the third degree brings you face to face with the fact of your own mortality, emphasising that you don’t have eternity to do what you want to do — you may not even have tomorrow — so do what you need to do now, and be ever mindful of your priorities.

    I suspect by now you know where I am going with this. I seem to be regressing time and again to the first degree, being industrious, giving freely of my time and energy to try to benefit all those around me, both at work and at home. But I seem to lose track of the time, to forget (or ignore) the need all living things have to improve themselves, to grow, to realise their potential. And I’m not getting any younger.

    So with this posting, I reaffirm my commitment to strike a balance, to work daily on improving myself in some small way, to take the time out from my schedule to THINK about what I am doing and where I am going.

    I’m sure I’ll slip again. But that’s OK. I hope it will be longer before the next time I slip back into old habits, and that I’ll pick up on it in a more timely manner when I do so.

    That, in itself, will be a small step forward in self improvement.

    I just read an interesting article over on Java Lobby by Dennis Forbes titled Out of Bounds : Avoiding Career Protection Faults.

    The article really rang some bells in my mind. I didn’t really identify with the concerns about job and career security — I guess that I have been fortunate to work with development groups that are fairly self-confident, and managed by people who can see past the little bits of political posturing that do take place. No, what caught my attention was more related to the behaviour of the new generation of programmers when it comes to anything but the newest code and techniques.

    I guess I need to be a bit more concrete. A very good friend of mine, who will remain nameless to protect the innocent, was involved in a web application (yeah, I know, another one). They were using Hibernate to provide an “object” view of the database.

    Now, before everyone jumps all over me, let me state quite clearly that I like Hibernate. I like it a lot. This is not a criticism of Hibernate. Are we all clear on that? OK, moving right along…

    Recently, they needed to implement a report. Nothing fancy, just your usual run-of-the-mill business report, pulling a few thousand line items and doing some basic totalling and so on. It came back from the dev team, with a note that it now works, so the story is complete, but probably needs to be optimised.

    Boy does it need to be optimised. It takes over 4 hours to produce the report.

    Now, these are not dumb developers — nothing could be further from the truth. They are very bright, and they write first-class code. But they are young and idealistic. I’m going to sound like my father for saying this, but the reality is that they haven’t had enough real-world production experience to make the right decisions on gut instinct. Therefore, in the absence of that, they always apply the pearls of wisdom gathered from the latest “best practices” guru or article without filtering it through some basic sanity and appropriateness checks.

    Let’s get back to the report in question as a case in point. There are two reasons that the report was running so slowly. The first is a general reluctance to use the database for what it does so well — querying data. Instead, they use Hibernate to get object graphs. And while Hibernate is pretty good at doing sensible things with lazy loading, there is a limit to how well it can optimise this sort of thing and it seems, more often than not in these sorts of scenarios, that it is pulling more data than it needs to out of the database. Quite frankly, I think that creating reports, or lists of items to display in a list view, is generally better done using a simple SQL query that returns just an ID with the set of columns actually required for displaying the list or report. Hibernate even has a mechanism for doing essentially exactly this, using report queries and HQL, so I am prepared to accept that as a viable alternative, but I would prefer SQL because, quite frankly, it is better understood.
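
    To make that concrete, here is a minimal sketch of the kind of report query I mean. The Invoice entity, its properties and the method name are all made up for illustration; the point is simply that the query hands back only the scalar columns the report needs, instead of hydrating full object graphs:

    // Sketch only: Invoice and its properties are hypothetical.
    // The query returns just the columns the report needs; each row
    // comes back as an Object[] rather than a fully loaded entity.
    public List loadReportRows(Session session, Date from, Date to) {
        return session.createQuery(
                "select i.id, i.invoiceDate, i.customer.name, i.totalAmount " +
                "from Invoice i where i.invoiceDate between :from and :to " +
                "order by i.invoiceDate")
            .setParameter("from", from)
            .setParameter("to", to)
            .list();
    }

    The equivalent plain SQL (a simple SELECT over the invoice rows with the same WHERE clause) would do the same job; as I said above, I lean towards SQL simply because more people understand it.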

    Hibernate is actually pretty well documented, especially for an open-source project. Indeed, it is pretty well documented, period. Many commercial packages could learn a thing or two from Hibernate’s documentation. But compared to the huge body of knowledge that exists about SQL, and the decades of real-world experience, frankly, it just doesn’t cut it. Not unless you happen to have a Hibernate guru on hand.

    So, I think that using tools such as Hibernate, and refusing to entertain using raw SQL under any circumstances, is part of the problem, but only a very small part. I doubt you could make Hibernate take 4 hours to do the report even if you were trying very hard. No, that is not where I think the problem lies.

    We can get closer to the problem by stepping back a little. The reason that they can’t just run a query to get the report data is that the data is not stored, but calculated on demand. The system stores transactional data, and all balances are calculated as required using a bunch of (often nested) calculators that walk the database. So if I want a balance, I walk all the transactions that affect the balance (and there are many types of transactions in several tables with various dependencies) and do a calculation, often quite complex.

    From a coding perspective, it’s a one-line call to a calculator method to get the data I want, so it looks quite elegant, and the calculators are tested separately and I know that they are correct.

    The calculators do some basic filtering of the transactions that they process, but they still need to read a lot of them, and sometimes they end up using some lazy fetching of related data. It’s just the way the data mapping is done, because there are conflicting requirements for different functions in the system. The end result is that doing a calculation is quite expensive, which is not a problem when you are doing just a single calculation.

    But what if I want to get month-end balances for the past year? Simple, just loop through the required dates, pass them to the calculator, and get back the balance on that date. It’s really easy for the calculator to just stop processing transactions when it gets to a particular date.

    And if I need DAILY balances? Do you see where this is going?

    Inside what looks to the developer like a simple loop, there is a huge amount of database activity going on, and done in such a way that the database has no opportunity to do anything but act as a dumb file store. That is what is bringing the database (and the system) to its knees when this report is run.
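
    To make the shape of the problem concrete, here is a rough sketch of what that report loop looks like. All of the names (BalanceCalculator, BalanceRow, balanceAt and so on) are hypothetical, but the structure is the one described above: a handful of innocent-looking lines, with every single iteration walking the transaction tables again underneath.

    // Hypothetical names, for illustration only. Each call to balanceAt()
    // re-reads and re-processes the underlying transactions (with lazy
    // fetches along the way), so a daily report over a year turns this
    // innocent-looking loop into hundreds of expensive database walks.
    public List<BalanceRow> buildBalanceReport(BalanceCalculator calculator,
                                               Long accountId,
                                               List<Date> reportingDates) {
        List<BalanceRow> rows = new ArrayList<BalanceRow>();
        for (Date date : reportingDates) {
            BigDecimal balance = calculator.balanceAt(accountId, date);
            rows.add(new BalanceRow(date, balance));
        }
        return rows;
    }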

    Now, at this time, gentle reader, you are probably thinking to yourself: why was the database structured that way in the first place? Do you remember, back in school, being told not to store something in the database if it can be calculated? Well, that’s what blindly following that advice ultimately leads to.

    Now, I am not advocating that we forget about things like normalisation and removal of redundancy. But all general rules need a context. Just because something can be calculated does not mean that it should never, under any circumstances, be stored.

    Take this calculated balance as a case in point. Calculating it is expensive, so storing it should at least be considered, and implemented if the time taken to recalculate it makes it impractical to use recalculation whenever it is needed. Even more importantly, some data items, even if they can be calculated, have a distinct “point-in-time” value and therefore need to be stored regardless of the time it takes to recalculate them.

    Let me explain that, because it is important.

    Let’s say that a calculation involves a set of input values, transactions, exchange rates, interest rates, taxation rates… you get the idea. It’s more than just the initial and transactional data. Calculating a value means you need to get, for each transaction, the applicable rates that were in effect when that transaction occurred, so you need to keep historical data for each of them. OK, that much is obvious, and there is a clear requirement to store that history if you are going to recalculate as required.

    What most people forget, however, is that there is another component to the calculation that can change over time — the code of the calculation itself. If there is a different way to calculate sales tax, for example, or a new type of tax is introduced, or simply a business rule changes, then the calculation logic itself will change and it will generate different results for the same input data.

    So if you want to recalculate an invoice amount for a past date, you better have a copy of the correct version of the calculator code around too, and you better have an infrastructure in the code to handle identifying and instantiating the correct version of the calculator. And it’s even more complex than that, because if the output of a given calculation feeds into the next period’s data, then you need to identify, instantiate and use the correct version of the calculation logic for each period as you are looping forward.
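
    One way to handle that (this is only a sketch of the idea, with made-up class names, not a claim about how any particular system does it) is to keep a registry that maps each rule change to the date it took effect, and have the recalculation code ask that registry for the calculator version in force for the period it is processing:

    // Sketch only: TaxCalculator and TaxCalculatorRegistry are hypothetical.
    // Each calculator version is registered against the date its business
    // rules came into effect, and forDate() returns the version that was in
    // force on a given date. (Uses java.util.Date, NavigableMap and TreeMap.)
    public class TaxCalculatorRegistry {
        private final NavigableMap<Date, TaxCalculator> versions =
                new TreeMap<Date, TaxCalculator>();

        public void register(Date effectiveFrom, TaxCalculator calculator) {
            versions.put(effectiveFrom, calculator);
        }

        public TaxCalculator forDate(Date periodDate) {
            Map.Entry<Date, TaxCalculator> entry = versions.floorEntry(periodDate);
            if (entry == null) {
                throw new IllegalStateException("No calculator version in force on " + periodDate);
            }
            return entry.getValue();
        }
    }

    Looping forward through the periods then becomes a matter of asking the registry for the right version at each step, rather than baking a single set of rules into the report.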

    You’d better be able to handle this correctly, because if you don’t, your client-facing staff will see one figure on the screen while the customer will have a different figure on the invoice. Trust me when I say that this is not what you want.

    Whenever a figure has a meaning at a point in time, it is a data point all by itself. In my mind, it is a no-brainer: it needs to be stored. In a product order line item, for example, you don’t store the extended price, because you can always calculate it as unit price times number of units.

    But you do store the unit price and description, even if you could always look them up from the products table. Why? Because they can change, and the line item is a point-in-time data value. In this case, we are shielding the point-in-time data value from changes in the data inputs. We are not concerned about the extended price, because the point-in-time data value’s data inputs (ie the unit price and number of units) are captured at that point in time, and we do not foresee (and will not support) any changes to the trivial calculation logic. If the calculation logic were ever to change, then I would see no real alternative but to store the extended price in the line item.
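
    In code, that snapshot is nothing more than copying the values across when the line item is created. A minimal sketch, with hypothetical names (Product, OrderLineItem and their properties are made up for illustration):

    // Hypothetical names. The line item copies the price and description at
    // the moment of ordering, so later changes to the product record cannot
    // change what the customer was actually charged. The extended price
    // stays derived, because its inputs are already point-in-time values
    // captured on this object.
    public class OrderLineItem {
        private final String productCode;
        private final String description;
        private final BigDecimal unitPrice;
        private final int quantity;

        public OrderLineItem(Product product, int quantity) {
            this.productCode = product.getCode();
            this.description = product.getDescription();
            this.unitPrice   = product.getUnitPrice();
            this.quantity    = quantity;
        }

        public BigDecimal getExtendedPrice() {
            return unitPrice.multiply(BigDecimal.valueOf(quantity));
        }
    }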

    Could someone have pointed this out earlier? Yes. Did anyone do so? Yes. Did anyone listen? Unfortunately, no.

    You see, that was “old world” thinking. We have a calculator! Why would we want to store the data value when we can always calculate it?

    This post is long enough. If anyone ever reads it, I fully expect to be flamed. Just before you hit that comment button, however, please take the time to understand the point I am trying to make. I am not trying to discount modern best practices, or the tools that are currently in vogue. I actually pride myself as being pretty good at picking up new ideas and technology, and leading rather than following in their adoption. I am simply pointing out that, in my opinion, we need to apply a real-world filter over all the stuff that we are constantly being bombarded with, and realise that there are no absolutes — no rule can be applied blindly without some thought.

    And sometimes, just sometimes, we older folks might know a thing or two that can be useful.

  • ROFLMAO I just installed WordPress on my own (hosted) server. It gave me an option to import my Blogger data, which I did, and it seemed to work pretty well.

    I will now try to blog more frequently.

    Yeah, right!

    Sometimes I just crack myself up!

    I got here after fooling around on Google and thought — what the heck, I’ll create a ‘blog. Maybe this time I’ll start using it. I was surprised to find that the user name “karabatsos” was taken — maybe Joanna had created it? Hmmm. How long has this blogger thing been going? Could it have been me and I had forgotten? Sure enough, I tried a few of my old passwords and bingo! I was in. The first and only post dated back to 2001! So much for blogging…

    I am still not sure that I will follow up with this, but I will make an effort to post regularly for at least a little while to see if it helps me at all, or helps anyone else, I guess, although I don’t really see how this would interest anyone else.

    If I really get interested, I might create a few special-topic blogs that might be a kind of virtual publication. I would love to resurrect the AVDF community, but more focussed on the types of applications that I (and most programmers I deal with) develop nowadays: Java web apps.

    That’s all for the first post. It’s late and I have an early start tomorrow.

    Added later — I added a photo here so I can link to it from the profile.
    [iBook photo]

    I recently bought myself a little Mac iBook through eBay. It was a good deal and I have been feeling for a while that I needed to get across the whole Apple culture.

    I am going to try to use the iBook a lot over the next little while. I know that I am going to be less productive than I am on Windows, and I will still use Windows for my development work at IBS, but I will try to set up my iBook for development too so that I can see whether it is really a viable alternative as a development workstation.

    My main motivation, however, is that I am seriously considering moving the kids over to Apples next year — I am so sick of all the down-time that I seem to have fixing one problem after another with viruses, trojans, adware and other miscellaneous malware. The Apple is not totally immune to that sort of thing of course, but it is a lot less susceptible.

    So I am using the iBook for Office applications (I installed MS Office, because I really need seamless compatibility and so will the kids) and I am really impressed with the level of functionality. Honestly, for a user of Office, the Mac versions are better than the Windows versions. Entourage rocks, and the Notebook feature in Word is very nice — I am continuously finding new uses for it.

    I am also reading up about Ruby On Rails, and am in the process of getting it all set up on the iBook.

    All in all, the experience has been positive. It has not been painless, but that is because I have a lot of Windows knowledge that I don’t have the equivalent for in the Mac world. In truth, I think it is harder for a techie to swap than a non-techie, and I really think that the next person who asks me for advice on what they should buy might well find that he or she is being pointed to a Mac. At the end of the day, they are going to come back to me for support, so it is in my interest too.

    It’s been a couple of weeks since I last blogged, and in that time I have been using the iBook a lot. I have to say that I really like this little computer. While I am still more productive using Windows when my Windows box is actually working OK, the fact is that I have to spend a LOT of time keeping my Windows machine ticking over. I’m pretty careful, and I run anti-virus and anti-spyware all the time. But that only means that I go for many months before something slips through and I need to rebuild my machine yet again. The kids — well, I’m sure that they try, but I seem to be rebuilding their machines every few weeks.

    Joanna is a case in point. I bought her a new Windows (Acer) laptop recently. Clean install of Windows XP Pro, latest service packs, all updates applied and auto updates enabled, anti-virus and 3 — yes, three! — anti spyware tools. Yesterday, she was having a problem saving a Powerpoint presentation and asked for help. There in her task bar was a little dog, saying she could get paid to surf the web. She has no idea where that came from.

    And did I mention that Powerpoint couldn’t save? Anywhere? Not locally, not on the network, not on a flash drive — nowhere? So, I tried a voodoo cure and did a “Save as Powerpoint 95”. This worked after warning me that some features might be lost. A quick bounce of Powerpoint, re-load the file (which triggers a conversion that took forever) and all is well with the world again. Except there goes a good 20 minutes of my time, not counting the interruption overhead.

    Compare that to the Apple. Now admittedly, I have only had it a few weeks, but I am using it as my main machine for everything I do except the actual coding at work (where we use the supplied Windows/Intel clones) and — yes, I know this phrase is becoming a cliche — it just works. This seems to be the comment I am hearing from everyone who is moving across to the Mac platform from Windows, and I am now joining the ranks.

    Is it perfect? Of course not. I couldn’t connect to my networked printer (an Officejet G55 hanging off a Dell XP Pro workstation whose role in life is to be our server). Turns out that there is some bug or configuration error (is there a distinction there I am not aware of?) in Tiger that prevents the authentication from working right. Don’t get me wrong, it only took a few minutes on Google to find a workaround, but it shows that even Apples have issues sometimes.

    But overall, well, I really like this computer. It really gets over 4 hours of use on a charge even with a WiFi link active, so I actually use it on the battery. I’ve never done that with a laptop before, because the Wintel laptops I have owned can’t reliably get more than about an hour and a half (although I believe that the Centrinos are pretty good).

    My current thinking is that I will leave Joanna and Peter with the Windows laptops for the rest of the year, and give Costa this iBook. It is perfect for me except that it doesn’t have Bluetooth built in, and I really want that. I like the 12 inch form factor — after all, the whole point of a laptop is that it is portable. I would really like a PC Card slot, because that would give me the ability to run the iBurst card, but the smallest Apple that has one of those is the 15 inch Powerbook. And the price — well, more than I want to spend right at the moment. I will probably end up buying myself a new 12 inch iBook with minimum RAM (because 3rd party RAM is much cheaper and easy to install) but with the biggest hard drive I can get and with the built-in Bluetooth. That comes to under $1900 delivered, probably just over $2K by the time I add 1G of third-party RAM. I’ll check whether I can get a better price online. Then over the next year or so, I can cycle my new one down to the kids while I upgrade as funds allow.

    Apples also seem to have a longer useful life, so I actually think that the cost of running Apples will be lower when taken over their effective life, even if they do cost more to start with. Time will tell.

    That’s all for now. I really need to learn some lines for tomorrow night, and it is getting late.

    I’m going to get off the topic of the Apple for today — not because nothing has happened, but because in reading over the blog I sound like some Mac fanatic. Today, Chris, a good friend of mine, showed me his new HP laptop. Huge, 17″ monster, very powerful, but battery life of about an hour, and he couldn’t get it set up to access the network. Sigh!

    But I said, no Apple today.

    Over the past couple of weeks, I have been working on a particular Java application, and I needed to extract a whole bunch of data into flat files for a particular client requirement. Cutting a long story short, I ended up writing a set of scripts to generate an XML specification of an extract that is going to be used to control the total extract process, and this gave me a chance to try my hand at Ruby.

    Now, I have heard a lot of good things about Ruby, but had not really used it before. Everyone I knew, who I respected as a programmer, and who had tried Ruby, raved about it. So, even though I knew Python, I made a point of nutting my way round Ruby.

    Obviously, it took me a little while to get moving — there is always a bit to learn when starting with a new language. But I bought a PDF copy of the Pickaxe book and zoomed through the highlights. I have to say, I like Ruby a LOT.

    Ruby is OO to the core. Everything is an object, and it has a remarkably convenient set of built-in functionality. I am not going to put together a tutorial on Ruby, at least not here, but here are a few examples to whet your appetite.

    In Java, to define a class with a set of accessor methods, you do something like this:

    public class Dog
    {
        private String name;

        public Dog(String name)
        {
            super();
            this.name = name;
        }

        public String getName()
        {
            return name;
        }

        public void setName(String value)
        {
            name = value;
        }
    }
    Here’s the same thing in Ruby:

    class Dog
      attr_accessor :name

      def initialize(name)
        @name = name
      end
    end
    Creating an instance in Java:

    Dog dog = new Dog("Rover");

    and in Ruby:

    dog = Dog.new("Rover")

    so the classes are pretty much equivalent, except that the Ruby one is (a) much shorter and (b) eliminates the need to write a whole lot of plumbing, no-brain code. Now I know that any modern IDE generates this boilerplate code for you, but it is still there and needs to be navigated and mentally discounted while you work on the stuff that DOES matter. In Ruby, the only code you write is what you need for the application — well, most of the time anyway 🙂

    Here’s a really cool thing you can do in Ruby. When you call a function, as well as passing a number of arguments to it, you can also, optionally, attach a code block to it. A code block is delimited either by the keywords do and end, or by braces (they’re the same). Inside the called function, the code can determine whether a code block has been attached to it and, if so, essentially call that block any number of times. Here is an example:

    def send(message)
      if block_given?
        yield "connecting"
      end
      # connect(...)        placeholder: open the connection
      if block_given?
        yield "sending"
      end
      # transmit(message)   placeholder: actually transmit the message
      if block_given?
        yield "sent"
      end
      # disconnect(...)     placeholder: tear the connection down
      if block_given?
        yield "done"
      end
    end
    This is a dummy, skeletal procedure. We assume that it sends a message somewhere, and there are several steps — connecting, sending and disconnecting.

    If you call it like this:

    send("Hello world")

    it just does its thing. But you can optionally attach a code block like this:

    send("Hello world") {|stage| puts "... now #{stage}" }

    Let’s look at this line. The braces define a code block — the convention seems to be that short blocks like this use braces, while long, multi-line blocks use do/end. The two vertical bars delineate a parameter list; here, the parameter is called “stage”. The single line inside the code block uses puts to display a string. I’ll get to the string in a moment, but for now just accept that this results in the following printout:

    ... now connecting
    ... now sending
    ... now sent
    ... now done

    The string that is displayed is delimited by double-quote characters, which means that the string is processed by Ruby. One of the effects of this is that the #{x} construct embedded in the string is replaced with the value of the variable x — this works everywhere, not just in these attached code blocks.

    This mechanism is used to implement a really simple, generic and pervasive iterator-like mechanism. For example, to allow arrays to be iterated, the Array built-in class implements a method “each” which, you guessed it, takes a code block. So, to iterate over an array, you use this sort of code:

    my_array.each {|element| puts element }

    The beauty of this is that any object can exhibit this behaviour — just implement an “each” method that expects a code block, and “yield” once for each element your object contains. There is no need to be in any other way related to an array.
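
    For example, here is a completely made-up class that is not an Array and does not inherit from one, but because it implements each (yielding once per element) it can be iterated with exactly the same block syntax:

    # Made-up example: Playlist is unrelated to Array, but implements each
    # by yielding once per song, so callers can iterate it the same way.
    class Playlist
      def initialize(*songs)
        @songs = songs
      end

      def each
        @songs.each { |song| yield song }
      end
    end

    playlist = Playlist.new("So What", "Blue in Green", "All Blues")
    playlist.each { |song| puts song }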

    Which leads me to the topic of Duck Typing. This is the Ruby philosophy about object typing. While Ruby does implement a single-inheritance object hierarchy model, you can actually use unrelated objects polymorphically as long as they implement a common subset of methods. The idea is that if it walks like a duck, and looks like a duck, and quacks like a duck, then it can be treated like a duck. Yes, this is NOT as bullet-proof as a strongly-typed language like Java, but in reality I don’t actually end up assigning a Debit object instance to an Animal object reference very often, and if I do, I will rely on my tests to pick that up. In return, I save myself a lot of unnecessary casting and fiddling in perfectly good code just to tell the compiler what I already know.
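
    A contrived illustration of the idea: the two classes below share no ancestor apart from Object, yet any code that just wants something that can speak is perfectly happy with either of them.

    # Contrived example: Duck and Robot are unrelated classes, but both
    # respond to speak, so announce can be handed either one.
    class Duck
      def speak
        "Quack!"
      end
    end

    class Robot
      def speak
        "Beep boop."
      end
    end

    def announce(speaker)
      puts speaker.speak    # no type check, no shared interface, just a method call
    end

    announce(Duck.new)
    announce(Robot.new)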

    Ruby also has mix-ins, called modules. A module is a bit like an interface and a bit like an abstract class. Like an interface, a module aggregates a set of methods — these are included by classes that want to, regardless of their position in the object inheritance hierarchy. But unlike interfaces, modules have code in them too — implemented methods. These methods become part of any class that includes the module, and have access to class methods, exactly as if the code had been copied and pasted into that class. Also like interfaces, a single class can mix in, or include, any number of modules.

    Like an abstract class, a module implements some code and, by relying on methods and variables that it does not itself define, can set up expectations on the classes that include it. Unlike an abstract class, though, the class that includes it does NOT need to descend from it (indeed, it can’t do so, because modules are not classes per se).
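
    A small, made-up illustration of both halves of that: the module below carries real code (describe), but writes it in terms of a name method that it does not define itself, and the class that includes it simply supplies one.

    # Made-up example: the module implements describe, relying on the
    # including class to provide name. Dog does not descend from the
    # module; it just mixes it in.
    module Describable
      def describe
        "This is #{name}"
      end
    end

    class Dog
      include Describable

      attr_reader :name

      def initialize(name)
        @name = name
      end
    end

    puts Dog.new("Rover").describe    # prints: This is Rover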

    Very powerful indeed, and I don’t claim to fully appreciate all the implications of how these can be used, but just intuitively it seems to be really useful. And just plain cool.

    Anyway, that’s more than enough for one post. Tomorrow is going to be a busy day. Toodles.

 
