Thursday, July 26, 2012
The Economic Failure of Public Cloud
Monday, February 13, 2012
On the Use, and Misuse, of Software Test Metrics
Consider a test you take in school. The goal of the test is to show that you understand a topic, by asking you questions about it. Depending on the subject, the questions may be specific and fact-based (When did the USSR launch Sputnik?); they may be logic-based (Sputnik orbits the earth every 90 minutes, at an altitude of 250 km. How fast is it moving?); or they may be interpretative (Why did the Soviet Union launch the Sputnik satellite?)
Or they can be just evil: write an essay about Sputnik. Whoever provides the longest answer will pass the test.
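For the curious, the logic-based Sputnik question above has a quick back-of-the-envelope answer. A minimal sketch, assuming a circular orbit and a mean Earth radius of 6,371 km:

```python
# Rough orbital speed for a satellite at 250 km altitude with a 90-minute period.
import math

EARTH_RADIUS_KM = 6371                 # mean Earth radius (assumed)
altitude_km = 250
period_hours = 90 / 60                 # 90 minutes

circumference_km = 2 * math.pi * (EARTH_RADIUS_KM + altitude_km)
speed_km_per_hour = circumference_km / period_hours

print(f"{speed_km_per_hour:,.0f} km/h")        # about 27,700 km/h
print(f"{speed_km_per_hour / 3600:.1f} km/s")  # about 7.7 km/s
```

About 27,700 km/h, or roughly 7.7 km/s: exactly the kind of answer a logic-based question is looking for.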
Note that by asking different kinds of questions we learn about the student's capabilities in different dimensions. So when a piece of software shows up, the purpose of testing should not be to find out what it does (a never-ending quest) but to find out whether it does what it is supposed to do (conformance to requirements). The requirements may be about specific functions (Does the program correctly calculate the amount of interest on this loan?); about operational characteristics (Does the program support 10,000 concurrent users submitting transactions at an average rate of one every three minutes, while providing response times under 1.5 seconds for 95 percent of those users as measured at the network port?); or about infrastructural characteristics (Does the program support any W3C-compliant browser?)
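To make the contrast concrete, here is a minimal sketch of a requirements-based check. The function name, its signature, and the interest rule it encodes are invented for illustration; they are not from any particular product:

```python
# A requirements-based test: it encodes one stated requirement rather than
# exploring everything the program might do.
import unittest

def monthly_interest(principal: float, annual_rate: float) -> float:
    """Hypothetical function under test."""
    return principal * annual_rate / 12

class LoanInterestRequirement(unittest.TestCase):
    def test_ten_thousand_at_six_percent(self):
        # Assumed requirement: a $10,000 loan at 6% APR accrues $50.00 per month.
        self.assertAlmostEqual(monthly_interest(10_000, 0.06), 50.00, places=2)

if __name__ == "__main__":
    unittest.main()
```

The operational and infrastructural requirements get the same treatment, just with load generators and browser matrices instead of unit tests.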
These metrics follow from the program's intended use. Management may use other metrics to evaluate the staff: How many bugs did we find? Who found the most? How much time does it take, on average, to find a bug? How long does it take to fix one? Who created the most bugs?
The problem with these metrics is that they generally misinform managers and lead to perverse behaviors. If I am rated on the number of bugs I write, then I have a reason to write as little code as possible and to stay away from the hard stuff entirely. If I am rated on the number of bugs I find, then I have a reason to discourage innovations that would improve the quality of new products and leave me with fewer bugs to find. So management must focus on the metrics that serve the wider goal: producing high quality, low defect code, on time.
Software testing takes a lot of thinking: serious, hard, detailed, clear, patient, logical reasoning. Metrics are not testing; they are a side effect, and they can have unintended consequences if used unwisely. Taylor advised care when picking any metric: the line so often misquoted as "you can't manage what you do not measure" was meant as a warning, not an endorsement. Lord Kelvin said "You cannot calculate what you do not measure," but he was talking about physical science, not management. Choose your metrics with care.
Friday, February 10, 2012
Beyond Risk Quantification
Now suppose the crowd starts walking over a bridge. How would you derive the total load on the structure? You might estimate the average weight of the people in the crowd, and multiply that by the estimated number of people on the bridge. So you estimate there are 2,000 people, and that the average weight is 191 pounds for men and 164.3 pounds for women, and pull out the calculator. (These numbers come from the US Centers for Disease Control, and refer to 2002 data for adult US citizens.)
So let's estimate that half the people are men. That gives us 191,000 pounds, and for the women, another 164,300 pounds. So the total load is 355,300 pounds. Right?
No. Since the least precise estimate (2,000 people) has only one significant digit, the calculated result must be rounded off to 400,000 pounds.
In other words, you cannot invent precision, even when some of the numbers are more precise than others.
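Here is a small sketch of that bridge arithmetic, keeping only as much precision as the crowd estimate supports:

```python
# The calculator happily produces 355,300, but the crowd estimate (2,000)
# has one significant digit, so that is all the precision the answer has.
from math import floor, log10

def round_to_sig_figs(x: float, figs: int) -> float:
    """Round x to the given number of significant figures."""
    return round(x, -int(floor(log10(abs(x)))) + (figs - 1))

people = 2_000                          # one significant digit
men = women = people // 2
load_lbs = men * 191 + women * 164.3    # the calculator says 355,300

print(f"calculator: {load_lbs:,.0f} lbs")
print(f"honest answer: {round_to_sig_figs(load_lbs, 1):,.0f} lbs")   # 400,000 lbs
```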
The problem gets even worse when the estimates are widely different in size. The odds of a very significant information security problem are vanishingly small, while the impact of a very significant information security problem can be inestimably huge. When you multiply two estimates of such low precision, and such widely different magnitudes, you have no significant digits: None at all. The mathematical result is indeterminate, unquantifiable.
Another way of saying this is that the margin of error exceeds the magnitude of the result.
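A toy calculation shows how that happens. The probability and impact ranges below are invented for illustration; they are not estimates of any real risk:

```python
# Multiply a very uncertain probability by a very uncertain impact and see
# how wide the resulting 'expected loss' interval becomes.
low_p, high_p = 1e-6, 1e-4           # assumed odds, two orders of magnitude apart
low_impact, high_impact = 1e6, 1e9   # assumed cost in dollars, three orders apart

low_loss = low_p * low_impact        # $1
high_loss = high_p * high_impact     # $100,000

print(f"expected annual loss: somewhere between ${low_loss:,.0f} and ${high_loss:,.0f}")
```

The interval spans five orders of magnitude, so any point estimate pulled from its middle carries no significant digits at all.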
What are the odds that an undersea earthquake would generate a tsunami of sufficient strength to knock out three nuclear power plants, causing (as of 2/5/12) 573 deaths? Attempting that calculation wastes time. (For more on that number, see http://bangordailynews.com/2012/02/05/news/world-news/573-deaths-certified-as-nuclear-crisis-related-in-japan/?ref=latest)
The correct approach is to ask, if sufficient force, regardless of origin, could cripple a nuclear power plant, how do I prepare for such an event?
In information security terms, the problem is compounded by two additional factors. First, information security attacks are not natural phenomena; they are often intentional, focused acts with planning behind them. And second, we do not yet understand whether the distribution of intentional acts of varying complexity (both in design and in execution) follows a bell curve, a power law, or some other distribution. This calls into question the value of analytical techniques - including Bayesian analysis.
The core issue is quite simple. If the value of the information is greater than the cost of getting it, the information is not secure. Properly valuing the information is a better starting place than attempting to calculate the likelihood of various attacks.
Thursday, December 9, 2010
The Coming Data Center Singularity: How Fabric Computing Must Evolve
Wednesday, December 8, 2010
The Software Product Lifecycle
Sunday, September 26, 2010
The Blanchard Bone and the Big Bang of Consciousness

Found in a cave in southwestern France, the Blanchard Bone is a curious artifact. It appears to be between 25,000 and 32,000 years old. It is only four inches long. It is engraved with about 69 small figures, arranged in a sequence shaped like a flattened figure eight. Archeologists tell us that the carving required twenty-four changes of point or stroke.
What is it? Looking closely at the carving, it seems that the 69 images represent the phases of the moon over two lunar months (image courtesy of Harvard University, Peabody Museum). It isn’t writing: That was still 20,000 to 27,000 years – over four hundred generations and two ice ages – in the future.
What would the night sky mean to our ancestors so long ago? The Sun was directly responsible for heat and light, and defined the rhythm of days. But the Moon moved so slowly, in comparison. What was it? What did our fathers and mothers think they were looking at, when the Moon rose and traveled across the sky, changing its shape from day to day, but always following the same pattern?
Yet the Moon’s travels had implications and meanings: The Sea responded to the Moon in its tides – or did the tides somehow pull the Moon along? How did that happen? What was going on between the Moon and the Sea?
The ancient artist/scientist/priest who carved this artifact carved what she saw – and more. The artifact was useful, for knowing when to plant, when the birds, the herds, or the fish were migrating, when it might be a good time to find a warm cave. The Moon measured fertility and gestation. When people speculated on this, they began to think – about what they saw, and what it meant.
Some wondered if the bone might be magically linked to the Moon and the Sea. Who among them could not be perplexed by the gymnastics, by the dance, of the Moon?
What would that inevitable nighttime procession inspire? How many nights could people look at the slow but predictable change, observe its correlations, and not be challenged to wonder? The first instances of human reasoning could have been inspired by this persistent phenomenon.
In “Hamlet’s Mill: An Essay Investigating the Origins of Human Knowledge and Its Transmission Through Myth,” Giorgio de Santillana and Hertha von Dechend propose that the myths, as Aristotle taught, are about the stars. The authors trace the myth of Hamlet back to Amlodhi, who owned a magical mill. Once it ground out peace, but it fell off its axle and on the beach ground out sand, and now it has fallen into the Sea where it grinds out salt. In their essay, the authors argue that this myth is a story designed to capture and preserve the observation of the precession of the equinoxes. This is a 25,950-year-long cycle, during which the Earth’s North Pole traces a great circle through the heavens. Now the North Pole points to the Pole Star in Ursa Minor, but in 13,000 years it will point to Vega. Only a medium as persistent as a story could span the ages, capturing and preserving this observation.
When the Blanchard bone was formed, the sky was much as it will appear tonight. Between then and now we have passed through one Great Year. When the North Pole pointed to Vega last, our species was beginning to colonize the Western hemisphere, the ice age was capturing water and lowering the seas, and the Blanchard bone had been lost for ten thousand years.
Let us remember that ancient scientist/artist/priest, let us regard her qualities of observation, synthesis, and imagination with wonder: her discovery in the sky urged us to consciousness, communication, and endless wonders beyond.
Sunday, August 23, 2009
Goulash Recipe
"GULYASUPPE" GOULASH SOUP SERVES 4
INGREDIENTS
Beef chuck, cut in 1/4" cubes, 1 lb.
Olive oil, 1 Tbsp
Butter, 1 Tbsp
Onions, diced 1/4", 1/2 cup
Flour, 2 oz.
Ground cumin, 1 tsp
Chili powder, 1/2 tsp
Paprika, 2 Tbsp
Cayenne pepper, 1 tsp
Fresh ground pepper & salt, 1 tsp (to taste)
Chicken stock, 5 cups
Boiled potato, in 1/4" dice, 3 medium
Sour cream, 2 oz
Egg noodles, 1 lb.
Method
Sauté the beef in some of the oil and butter till lightly browned. Set aside and keep warm. Add oil to the pan, add the onions, and sauté till golden and translucent. Set aside with the beef. Add the remaining oil and butter to the pan, then add the flour, cumin, chili powder, paprika, salt, pepper, and cayenne. Blend 5 or 6 minutes over low heat. Remove from heat and mix in the meat and onions. Add 2 cups of stock and mix until smooth; let sit for 10 minutes off the heat. Cover the pan and return it to the heat; simmer for 45 minutes or until the meat is tender. Add the remaining stock and the potatoes. Skim the fat and correct the seasonings (add salt and pepper to taste). Serve over egg noodles.
To hold the soup from afternoon till dinner, let a film of melted butter cover the top. You may add a little tomato paste to thicken, if you like. When serving, top with a dollop of sour cream or some chives.
Sunday, April 5, 2009
My Journey Through Total Knee Replacement
The repair left the meniscus tapered, thinner on the outside than on the inside. Over the years, that unevenness increased, and in December 2008 the discomfort passed my threshold of tolerance. I spoke with a new orthopaedist (the other doctor had retired), Dr. John Crowe, of Orthopaedic and Neurological Services, who informed me that a knee replacement can last 20 years or more – not the seven to ten I’d learned from some inaccurate article floating around the Internet. I asked him if he were busy next Tuesday. He laughed and said that he was booked for the next few weeks, so we scheduled the surgery for after the turn of the year. On January 19th I drove myself to Greenwich Hospital, registered, and had the TKA (total knee arthroplasty). Greenwich Hospital has the lowest rate of post-operative infection of all hospitals in the state of Connecticut. This is especially important for any kind of implant surgery: things get bleak if there is an infection involving an implant.
The path to the surgery took a good deal of time and effort. The hospital offered a two hour class on the day before Christmas to guide people facing a knee or hip replacement. After the class, I arranged with the local Red Cross to donate a pint of blood on Dec. 30th that the hospital would hold for the surgery. I started a course of exercise to strengthen the muscles around the knee. Getting the insurance company aligned took exceptional patience, but eventually they were able to provide some measure of support for my impending adventure. I was very fortunate to arrange post-operative recovery and rehabilitation through Waveny Care Center, very near my apartment.
Monday morning was cold. I was up well before dawn, as I had to drive to Greenwich and check in, bringing a bag with clothing and my laptop. The kind folks at the registration desk processed my arrival as courteously and efficiently as at a high-class hotel. I was brought to a pre-op room where I stowed my bags and changed into the hospital gown. I lay down on the hospital bed and a nurse started the IV. After I talked with the OR nurse, I was wheeled into the operating room. The surgery began shortly after 7:00 AM. I remember the bright lights and very cool air in the OR – the temperature suppresses infection, I was told. The anesthesiologist told me that I was about to go under and then I went out – in the middle of a word.
In the course of the operation the surgeon replaces the lower joint surface of the femur (thigh bone), the upper joint surface of the tibia (shin bone), and the back of the patella (kneecap). The operation could more accurately be called a knee resurfacing, but the common name is Total Knee Replacement or Total Knee Arthroplasty (TKA). As part of the operation, the surgeon removes the anterior cruciate ligament (ACL), since its attachment point is replaced by the mechanical bearing surface. Its function is replaced by elements of the implant. In my case, the surgeon selected the Zimmer Legacy System LPS-flex.
I have not posted my pre-op or post-op X-rays. I might at some future time. Suffice it to say that the surgery was necessary, as noted in this excerpt from the surgeon’s report:
“… There were advanced degenerative changes noted. There was bone exposed on both the lateral femoral condyle and lateral tibial plateau with grooving of the bone….”
I awoke in recovery three hours later, feeling disoriented but not that bad. A nurse helped me stand for a moment to show me that I could bear weight on my new knee, then guided me back into bed. The pain medicine seemed to not be working, which caused me a sense of panic, but the nurse was able to increase the dose a bit which helped somewhat. Some time later I noticed that my left leg was in a mechanical apparatus (continuous passive motion, or CPM, machine) that gently bent my left knee and then extended it, to keep it mobile while the wound healed. I used that machine for more than an hour each day, increasing the flex a few degrees at each session. The machine’s action was not painful: The discomfort from the surgery was not localized, and seeing my leg actually move was comforting. I did feel some discomfort when the physical therapist increased the range of motion.
By Thursday Jan. 22nd I was beginning to feel more alert, at least enough to complain that the pain medicine was clearly not working, and could they try something else? I was moved to Darvocet, then Vicodin (the drug of choice for my anti-hero Dr. Gregory House). Later I learned that the drugs were working just fine, the problem was that the operation is, frankly, painful. I remained somewhat light-headed and my blood pressure wasn’t coming up very rapidly, so the doctor decided to transfuse my pint of blood back. This helped: my vital signs improved. With the exception of the CPM sessions, my left leg was immobilized using a fabric-lined thigh-to-ankle sleeve, open along the front, strengthened with four or five very firm plastic strips, and secured with five Velcro bands. The first few days, it was hard to close – my leg was quite swollen. Once I got moving, and the swelling began to subside, it closed quite easily. I used the immobilizer into my first week at Waveny.
I brought my laptop with me. That Thursday I wrote my first post-op e-mail. It contained 24 words, and I got seven of them wrong. At the time I thought I was quite lucid. I later forwarded it to my friend and asked him if he could figure out what I might have meant. The pain-killers were stronger than I thought. I wonder what the hospital staff thought I was saying: I felt I was being very clear. That day, I achieved 76 degrees of flex, not quite the 80 degrees the physical therapist at Greenwich Hospital had hoped for, but close enough.
On Friday the 23rd I was helped into a wheelchair and driven from the hospital to Waveny in New Canaan. The driver was a Spanish immigrant, and we talked about Madrid and the opportunities that brought him to the US during the short ride. I got to Waveny in the early afternoon and they had already set a lunch aside for me. The dessert was a home-made cream puff – it was so tasty! In fact, all the food was great. I hadn’t eaten so well since my last good vacation – and the meals were healthy and the portions just right. One of the physical therapists – Trisha – visited me that first afternoon. Her concern and sensitivity to my discomfort and uncertainty were profoundly comforting. On Saturday one of the occupational therapists – Gus – stopped by to see how I was doing. I said that I’d been working out with free weights at home and hadn’t had a chance to do anything for a while – could he get me a small weight I could use? He left for a moment and brought me a weight, then he sat alongside me and we talked while I did some upper body work, lying in my recovery bed, happy to feel that something was unchanged and more would be getting back on track. It is hard to express how emotionally moving it was, and still is in reflection, to feel that genuine compassion and care. I had not felt such a pervasive sense of concern for my well-being since I lived at home as a young boy.
Dr. Crowe said that I should remain in the immobilizer until I could do a straight leg lift. That took into the weekend, six or so days after surgery. Friday was discharge day from the hospital and admittance day at Waveny. Saturday they offered one hour of rehab, but I don’t recall doing much. Sunday was for rest, but I was able to get out of bed unassisted that evening.
On Monday Jan 26th, I began my regimen of two daily one-hour physical therapy sessions, the first at 9:00 AM, the second at 1:00 PM. My physical therapist, Wrenford, wore a shirt labeled “Physical Terrorist” asserting his determination and gusto. And so the work began. It seemed that every day I achieved another milestone. In a few days I started using a cane rather than the walker.
On the evening of Wednesday Jan 28th, after dinner, my daughters visited me with their Mom and stayed till 8:30 PM. When I walked with them to the exit, one remarked with surprise that my legs were straight! I had begun looking at my toes on my left foot anew. Even though my knee was still swollen, it was properly aligned over my ankle, just where it was supposed to be. For years I had grown used to my left foot being a bit further out to the side, and here it was right next to the right foot, where it belonged. I thought, “I love my new knee!”
Initially I took my meals in my room, watching TV or working on the computer. Encouraged by the nursing staff, I began walking to the cafeteria. During meals I got to know others who had knee or hip replacements. There were eight or nine of us, and we would usually sit together at two or three tables. One male patient was a schoolteacher from Westchester County. He was a gregarious NY sports fan. One female patient was joined at lunch by her husband, a cultured, charming gentleman from central Europe. They had raised six girls, and he had authored two books: one on gardening and one on the gardens of Moravia – with insightful commentary on the history and politics of the region.
I did not expect to get involved in occupational therapy, but the Waveny Care Center wants its patients to get along after returning home. The difference between OT and PT was put simply: PT is for the waist down and OT is from the waist up. I asked what the specific goals of OT were, and it turned out that I could meet them by using their kitchen. So I took the opportunity to make a batch of my Black Bean Soup (the recipe is posted elsewhere in this blog). One of the OTs bought the ingredients! I cooked it up and it was pretty good. I was able to use the stove, blender, tools and sink; reach items in cabinets overhead and load the dishwasher; and cook without getting off-balance or fumbling with the cane.
On Tuesday February 3rd, during the morning PT session, I was able to get 90 revolutions on the stationary bike, and - for the first time since the surgery - I walked without a cane. I was still apprehensive on the stairs, fearing that I might get my toe caught and trip. But each day I would do a little better, breaking down the motion of walking up a step into its components: minor weight shift (but don’t rock the hips), lift up from the knee, move the heel back then up, place the ball of the foot squarely on the next tread, shift weight (but don't rock the hip), lift with the quadriceps, bring the other leg to the next step, keeping the knee pointing forward. Repeat.
On Wednesday, I was able to achieve 86 degrees of flex in the knee. Stairs were challenging, I didn’t have the strength to climb normally but with the cane I could make my way up and down, one step at a time, haltingly. Thursday February 5th was my check-out from Waveny.
New Canaan has a program called GetAbout – residents can request transportation within the town by phoning in a few days in advance. They have a small bus and a van, and I used both. I had PT three times a week. On Tuesday the 10th, I got a ride with my ex to see our daughters’ choir concert in Weston. I was able to ride in the passenger seat both ways, and the girls were surprised and delighted to see me – and I was so proud of their performances! I used a cane to walk from the car to the auditorium, and got a spot on the end of a row so I could stretch out my leg. I never got too good with the cane. My goal was not to get good with the cane, but to get rid of the cane.
That weekend I picked up my car from Greenwich Hospital and drove home – freedom! My first stop was a car wash: Four weeks in the garage had left a remarkably thick layer of dust over the whole car, and someone had drawn a bit of art in the window-panes. I stopped at the grocery store and picked up a few things. Peapod was an enormous help during my immobile phase.
Over the next few sessions I documented my knee's progress on Facebook:
Friday, Feb 13th: 91 degrees.
Monday, Feb 16th: 99 degrees. My primary physical therapist, Jane, noticed that I was very tight on the outside of my left leg. Prior to the surgery I had become knock-kneed by over 10 degrees in my left leg. Now that my leg was straight, the muscles and tendons on the outside of my left leg were stretched taut. She recommended calf stretches and a particular massage across the tendons. It was acutely uncomfortable for about 55 seconds – and then it felt great. I started doing the massage at home. I was never able to get as much relief as that first time, but every time helped a bit more.
Wednesday, Feb 18th: 105 deg (but was only able to get to 104 on Friday Feb 20th).
I started walking up and down the stairs in my apartment, haltingly.
Tuesday, Feb 24th: 108 degrees.
Wednesday, Feb 25th: 110 degrees. This is an important milestone – once I was able to get 110 degrees I could move my foot enough to safely go up and down stairs.
On Tuesday, March 3, six weeks post-op, I had a follow-up visit with Dr Crowe. I had achieved 110 degrees of flex and in the office, with no warm-up, I was past 105 degrees. Dr. Crowe advised me that my goal was to reach 120 degrees, so I was well along. This was a significant relief – I had assumed that I was behind schedule on my way to 135 degrees. It turns out I was on schedule for 120, and all was well.
Friday, March 20: 118 degrees.
Over time the swelling in my left ankle diminished rapidly, while my calf took longer. I still have swelling around my knee, and I’m told that will persist into the summer. On Wednesday, March 25, I achieved 120 degrees of flex. The discomfort as of April 5 is minor, mostly associated with the swelling and weakness around the knee. I am able to go up and down stairs with just a minor, diminishing halt to the downstairs gait. The biggest problem I have in day-to-day life now is remembering to get up and walk a bit every twenty minutes or so. By the end of the day, my knee is sometimes a bit stiff. I’m told that by June I should be able to golf.
I graduated from physical therapy on Tuesday, March 31. Wrenford (while I was an in-patient), Jane (my lead physical therapist while I was an out-patient), Hillary and Trisha and the nursing, occupational therapy, and support staff at Waveny were profoundly helpful, supportive, understanding, and positive. You are all amazing people and make a superior team!
If you want to understand the surgical procedure involved in a total knee replacement, see this lecture by Dr. Seth Leopold of the University of Washington in which he discusses both the total knee and uni-compartmental knee surgery, and also discusses conventional hip and minimally invasive hip repair. His lecture includes a brief edited video showing elements of the procedure.
I met three others during my stint at Waveny who had both knees done simultaneously. I could not imagine that degree of discomfort – but they each said they wanted to get through it. One said that if she hadn’t done both at the same time, she probably wouldn’t have had the courage to get the second one done at all. On the other hand, a neighbor of mine had one knee done last fall and the other a few weeks before I had my surgery. He and I met at the pre-Christmas class at Greenwich Hospital. He’s doing very well.
If you want to talk about your TKA please post to this blog and I’d be happy to hear your story, or share more about mine. I’m done with the drugs, except for an occasional ibuprofen and some ice. I took a walk around the block this afternoon and it felt great! It’s been years since I was last able to do that. I'm looking forward to golfing this summer with my daughters and my doctors.
Sunday, October 26, 2008
The Danger of Microsoft Flight Simulator
Some time ago I flew from the east coast to
Why is it that all airline pilots sound like they were raised in west
Our pilot, Billy Roy, continued: “The on-board computer seemed to think that the flaps weren’t balanced, so it automatically retracted the flaps. We’re going to run a quick diagnostic and we’ll have you on the ground right away.”
Why is it that the notion of being on the ground right away is supposed to inspire confidence? There are times when I’d be more confident if I knew we could stay up in the air until everything was fixed.
After a few moments, Billy Roy got back on the PA system: “So the flaps are up and the computer is sure that they aren’t balanced, so we’re just gonna scoot up to LAX and land there. They’ve got real long runways so we’ll be just fine.”
At this point everyone in the front of the plane, where I happened to be for this trip, got very nervous. We had all played with Microsoft Flight Simulator, and we all knew what happened when you tried to land with zero flaps. Basically, the plane can’t slow as much as the pilot would like, because the flaps provide extra lift at lower speeds. If you try to slow down too much without any flaps, your aircraft will stall and fall out of the sky. So when you land with no flaps, you hit the runway about 40 or 50 knots faster than you would like. This puts extra stress on the brakes, which might fail. Even if the brakes hold up, you’ll take up a lot of tarmac before you get to a stop. Hopefully not all of it.
Billy Roy got back on the mic: “So we’re gonna land here at LAX and just as a routine procedure you might notice some equipment along the runway, but again this is simply a routine procedure and we’ll be fine. Once we get to the gate we’ll get this all sorted out and we’ll just get on down to
We came in at about 240 knots, and sure enough there was some equipment along the runway: Ambulances, fire trucks, and a couple of other vehicles I couldn’t name, although I thought I’d seen them in the final scene of the movie Airplane!
When we did come to a stop, about 15 yards from the end of the 2-mile long runway, Billy Roy got back on the air: “We’ll be transferring your luggage to busses for the short drive to
Wednesday, September 24, 2008
Eureka, a 21st Century Morality Play
In vivo fertilization? Transmigration of the soul? The collapse of the Communist ideology followed two decades later by the near-collapse of the bastion of capitalist economic theory? Is government bad or good? Is more government necessary or dangerous? Should business seek less regulation to pursue profit maximization, or endure more to mitigate investor risk? When, if ever, is property theft?
Café Diem’s food is free – but Vincent, the café owner, does not trade on that munificence to accumulate political or personal power, rather he serves everyone anything they want, regardless of their behavior, character, or status in the community. This conviviality is economically unsustainable, so must be interpreted symbolically. (It would trivialize the story to interpret it as political economics.) Manna, water from the Rock, a boundless gift.
Henry’s Garage fixes everything without counting the cost. Who else but Henry would officiate at weddings, become mayor by acclamation, and speak truth to power? His defiance of Eva Thorne is emblematic: he refused to participate in a morally ambiguous activity, not because it was evil but because he did not have sufficient information to determine whether it was evil or good. His wise pragmatism, a counterbalance to naïve enthusiasm, makes him an ideal confidant and teacher to the Sheriff’s late-Jobian incomprehension and acceptance of the mystery and power of Science, the symbolic manifestation of the Deity in our pragmatic 21st Century.
We each have our Vincent, our Henry, and our Eva. We each face demands for moral choice in the face of ambiguous but powerful forces beyond our comprehension. How to find a trusted wise counselor, and avoid a con man? Each day we awake to a new world, trusting in some of our gifts, assaying our strengths and weaknesses, reflecting on the path we have trod so far, contemplating our next steps. Sheriff Carter’s gumption and plain common sense in the midst of chaos offer a healing presence, a promise that we can make the right choice.
Sunday, July 20, 2008
FlowerPower Foundation Experience

At 3:30 I turned off the Mets at the Reds (tied), picked up the map I’d Googled last night, and walked out to the car. Man it was hot! The AC kicked in soon and I rolled down the Merritt towards New York City. At 4:50 I was parking outside Butler Hall on West 119th. The guard told me that any elevator would go to the top floor, just push “R” for restaurant. I stepped out into a smallish alcove and met M., the event planner. She was expecting me. I asked her if there might be a food service cart of some kind. She said that the back was already closed and everyone had gone home, so, no, there wasn’t anything available.
She pointed to a beautiful floral place setting with purple iris, hydrangea, and some white and blue flowers I didn’t recognize. “There’s that centerpiece, and there are 15 table arrangements over there.” These were described as 6x6 – they were 6” tall glass cylinders, 6” across, stuffed with the same types of flowers as the main piece, but without the hydrangea. They were nearly full of water, which meant they each weighed a bit more than I’d expected, but pouring out the water would have risked the flowers all wilting on their journey, and that would have made the trip less worthwhile. So I took them down to the car two at a time. My biceps got a fair workout! M. had a helpful suggestion: She would hold the elevator at the restaurant while I loaded it up with arrangements, and the guard in the lobby would hold the elevator while I unloaded them. I thanked her for the idea and followed that plan. Much better! The flowers all fit nicely in the back of the car.
I drove down Amsterdam Ave. and took a left at 114th St. There was no place to park, though; so I tooled around the block until I saw a space open up on the northbound side of Amsterdam Ave. I carried the first two arrangements into the lobby at St Luke’s hospital, signed in, and asked the guard how to get to 9 West - the geriatric ward. Up the elevator to 9, then turn left when you get to the corridor. 9 West is at the end. I thanked him then asked if he might have some kind of cart or even a spare wheelchair. (My forearms were feeling the burn.) He looked around but nothing was available. With his permission I left the first two arrangements behind his desk then walked back to the car to get another pair.
After the fourth trip, he found a cart – a nice one, with two decks. I rolled it out to the car, thanking the inventor of the wheel, put the large centerpiece on the top, and filled the base with the remaining seven arrangements. The ride had seemed smooth, but most of the arrangements had splashed a bit, and their sides were slippery. I did not want to drop one and have the glass shards scatter all over the floor! That would be a déclassé introduction. But every piece made it up to the ward safely. When I came down the corridor with my cart, every nurse stopped to say how beautiful the flowers were! I said, thanks – it gets better. I asked them if I could put the large arrangement on their station, and they were very happy about that. Then I picked up one of the arrangements and walked into a patient’s room.
“Hi, I brought this for you. Where would you like it?”
The elderly woman in the bed had a visitor, a man leaning back in a chair. He offered to take the vase but I told him that it was a bit slippery and heavy, so I would just put it on the window if that was okay. She asked, “How much does it cost?” Nothing, there was a wedding a few blocks from here and they asked if I could bring the flowers to you. They are already paid for.
Her room-mate was alone and seemed introspective. I told her that I’d brought her some flowers and where would she like them? She was shocked and exclaimed that she was beginning to feel a bit depressed but this certainly snapped her out of that! Then she recited a lengthy prayer in rhyme. We said Amen, and I thanked her for the blessing, and wished her a happy Sunday. As I was leaving, she reminded me to thank the people who donated the flowers.
Down the hall, the elderly man in the breathing mask didn’t want any flowers, so I turned to his room-mate, who said that he did not want the whole arrangement, but that he would like a single purple flower. Purple was his favorite color. He asked how long it might last, and I said that if we put it in a bit of water it should be good for a few days. I went to the nurse’s station and asked them if they might have an empty water bottle or something to use as a vase. Patient C. didn’t want the whole arrangement, he just wanted one iris. A nurse produced a glass vase and C. got his one purple flower for his bed-table.
I took the cart back downstairs and filled it with the remaining arrangements. One of the assistants took some grief from a nurse who asked him why he never brought her some flowers. I said in a stage whisper that I’d put his name on a gift card in the next batch –
By the time I got back, the nurses had picked out where the rest of the arrangements would go. Many, many smiles and thanks. I brought the cart back to the lobby, thanked the guard (after telling him about the dialog between the nurse and the assistant) and drove home, feeling very good.
I snapped this picture of the guard's desk with the last batch of arrangements at St Luke's:
The FlowerPower Foundation takes donations of flowers from weddings, funerals, and corporate events. Volunteers re-purpose these flowers into vases and deliver them to people in hospices, long term care facilities, and, as today, geriatric wards. There are chapters in New York and Los Angeles. If you would like to donate your time, flowers, or funds to FlowerPower, please visit their web site at http://www.flowerpowerfoundation.org
Saturday, July 5, 2008
Regular PC Maintenance
How does this happen? When you edit a file, the operating system finds the next available space on the hard disk to hold the changed part of the file. When you finish, the operating system marks the space where the old copy of that part resided as deleted. Over time, those old pieces accumulate, and the area around the file fills with small pockets of unused space. The file is fragmented, with pieces scattered all over the hard disk. The system runs more slowly because it spends extra time finding and reassembling the fragments of the file. Defragmenting the file means moving the in-use pieces back together and consolidating the freed space into large contiguous blocks. This process is also called "defragging" the file.
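Here is a toy model of that process. It does not touch a real disk; it just shuffles labels around a Python list to show how an edited file ends up scattered:

```python
# Toy model of fragmentation: new pieces of a file land in the next free
# blocks, and the blocks holding the old pieces are freed afterwards.
disk = ["free"] * 10

def write(label, count):
    """Place `count` blocks of `label` into the first free slots found."""
    placed = 0
    for i, slot in enumerate(disk):
        if slot == "free" and placed < count:
            disk[i], placed = label, placed + 1

write("A", 3)                # file A -> blocks 0, 1, 2
write("B", 2)                # file B -> blocks 3, 4
write("A(edit)", 2)          # the changed part of A -> next free blocks, 5 and 6
disk[1] = disk[2] = "free"   # the old copies of A's edited blocks are released
print(disk)
# ['A', 'free', 'free', 'B', 'B', 'A(edit)', 'A(edit)', 'free', 'free', 'free']
# A's live pieces (blocks 0, 5, 6) are scattered. Defragmenting moves them back
# together and consolidates the free space into one contiguous run.
```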
First: Update Windows for any Microsoft updates and security fixes. Start -> Programs -> Microsoft Update and say “Yes” if it asks if you want to update the download agent. Select all critical updates – some might have to be installed by themselves but most can go together as a bunch. You will probably have to restart your computer after the update completes.
Second: Update your anti-virus software. Each program has a “Live Update” feature to get the latest list of bad code that needs to be prevented from running on your computer. You might have to restart after this, as well.
Third: Run a virus scan. Start your anti-virus software program and run it. Depending on your computer this can take from five to 30 minutes.
Fourth: Scan for spyware. Download a copy of “Spybot – Search and Destroy” from http://www.Majorgeeks.com/. After you have updated that program, pick “Immunize” to block spyware from attaching itself, and then run a scan to identify and delete spyware.
There are other spyware programs out there. Microsoft Antispyware is available for free off the Microsoft home page. Webroot sells for $30 from Circuit City and is pretty good; LavaSoft is free to individuals, Spywareblaster is also free (and will accept donations like Spybot does). Norton and McAfee both have additions for spyware and AOL has some code that works, too. I use all of them, because each has its strengths and they don’t interfere or consume excessive resources. Trend Micro is pretty good, too.
Fifth: Remove temporary files. Start -> Search -> For files and folders and pick “All files or folders.” Search for TEMP, and once you’ve found the folders, open each one and delete all their contents. Some contents may not be deletable – they are in use, and that’s okay. Skip those and get rid of the rest. Close the search window, go to the recycle bin, and empty it. Then find all files and folders that have “*.tmp” in their names. Select them all and delete them. Again, some of them may not be deletable because they are in use; skip them and get rid of the rest.
After this, go to the recycle bin again and empty it. That actually marks the space the files occupied as available.
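If you would rather script the temp-file step than click through Search, here is a minimal Python sketch. It only touches the user’s TEMP folder and quietly skips anything that is locked because it is in use:

```python
# Delete what can be deleted from the TEMP folder, skipping files in use.
import os
import tempfile

temp_dir = tempfile.gettempdir()
removed = skipped = 0

for root, dirs, files in os.walk(temp_dir, topdown=False):
    for name in files:
        try:
            os.remove(os.path.join(root, name))
            removed += 1
        except OSError:          # in use or protected - leave it alone
            skipped += 1

print(f"Removed {removed} files, skipped {skipped} still in use.")
```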
Finally: Go to Start -> Programs -> Accessories -> System Tools -> Disk Defragmenter and run that utility. The first time you do this it may take a long time – an hour or more.
Do this every month or two, depending on the amount you use the computer. Also, after you install a new program, a major upgrade or a big security update, you might go through this again too. These directions apply to Windows XP but the same process works for Vista.
Sunday, March 30, 2008
Black Bean Soup
Ingredients:
Dried black beans, 1 # bag
Butter, 1 tsp
Olive oil, 1 Tbsp
Small sweet onion in small dice
Garlic, 1 Tbsp, minced
Salt, 1 tsp
Crushed pineapple, drained, 1 oz
Method:
Sort the beans and soak overnight. (Some may think this is excessive, but I find it makes the soup creamier.)
Heat a sauce pan, then add the oil and butter. Sweat the garlic and onion until translucent. Add the drained black beans and stir thoroughly over medium-high heat till any remaining water has evaporated. Add 1 tsp salt and the crushed pineapple.
Just before beans begin to sizzle, add 8 cups water and reduce heat to medium. Let the beans cook until softened, about 2 hours, stirring occasionally. You can crush the beans a bit with a wooden spoon, but the stirring should be enough.
Serve with a dollop of sour cream or plain yogurt.
Saturday, February 16, 2008
Daddy, What Does a Chief Technology Officer Do?
Let’s begin by talking about what a CTO should not do. The CTO should not manage developers. The head of development spends his or her time working to keep the development team on track against a set of product plans. Inside the development organization, this Director attends to staffing, training, workload and productivity metrics, budget, and scheduling. Working with the customer organizations, the Director keeps up to date on shifting priorities, changes in product requirements, and new potential opportunities that the developers may need to supply. This is a full time job. The performance plan for the Director of Development is quite simple: Deliver high quality programs that meet or exceed customer requirements on time and within budget.
A CTO should not manage a hardware team or an infrastructure group. The CTO might have a lab (for test purposes, not production or QA). But the CTO does not own a production facility and should not be measured against that criterion. Functional strategies (productivity, headcount, floor space, training, power and cooling, etc.) should rest with a COO; the CTO role is a research and advanced-technology discipline in the strategic planning domain.
The Chief Technology Officer matches new technological capabilities with business needs, and documents that match so the business can decide whether to use the new technology or not. The CTO is not an advocate, but a strategic planner and thinker. A business that sells information technology uses the CTO to articulate how the new technology can address business needs for its prospects. So the CTO needs to understand his firm’s capabilities and something of the business processes of his firm’s target market. A business that uses information technology needs its CTO to select potentially useful new technologies for use in its internal business processes. This CTO should understand a good deal about a broad range of new technologies and must have a deep sense of the business’s core processes and goals. The CTO should not be an advocate, but must be unbiased. The CTO needs to understand the abstract potential that a new technology might offer, and must know the underlying architecture of the firm’s business processes.
The CTO must have a high degree of professional integrity – there will be times when the CTO will be the only person that the senior leadership team can turn to for an unbiased and well-grounded assessment of a potentially valuable new technology. A vendor CTO whose primary function is outbound marketing does a disservice to the vendor for whom he or she works. A user CTO whose bias is towards always trying new things adds no value to the firm looking for a sustainable, cost-effective competitive edge.
Consider how firms today confront Web 2.0 – the combination of blogs, wikis, and social networking technologies sprouting up. A user organization that wants to interact with consumers may already be all in. Coca-Cola runs over 500 web sites for consumers, and sponsors videos on YouTube; even IBM has space on Second Life. Other firms may shy away from the uncontrolled side of these technologies. Publicly-traded firms and others facing regulatory scrutiny may fear the consequences of an unguarded comment on a quasi-official channel, and rather than manage that risk they opt to deny employees the ability to participate at all. Of course, this draconian measure does not work; employees can blog under another name, or contribute to a wiki pseudonymously. The CTO would look at the potential strengths and liabilities of each medium and present the firm with a view of the potential benefits (closer interaction with customers and partners), costs (incremental IT investment, potential lost productivity on other tasks by bloggers), and risks (uncensored commentary reaching the public). The CTO’s performance plan is simple: to evaluate for the executive leadership team potentially useful new technologies – showing how they might fit into specific business processes to the firm’s benefit.
Could that job be done today by another function within the organization? The IT project office might render an opinion about investing in Web 2.0, but that could be characterized as self-serving. The marketing department might argue that Web 2.0 will give them a competitive edge, but that could be marginalized as just the goofy marketing guys wanting more toys to play with. Without a CTO, these organizations might choose to spend money covertly to test the technology, potentially placing the organization in jeopardy. The CTO alone must offer an unbiased, insightful analysis of the potential of the new technology.
How does the CTO improve? A good CTO isn’t just lucky, although never underestimate the value of good luck. Rather, a good CTO describes the environment in which the new technology may fit, and then defines how that fit might occur. If the projection is correct, the CTO celebrates. But if it’s wrong, the CTO has solid documentation to review. By using that documentation, the CTO can learn which element of the current environment he missed or mis-characterized, or what step in the chain of reasoning was flawed. Through this process of self-evaluation and learning, a good CTO gets better over time.
Some companies need a CTO more than others. Firms that tend to adopt leading edge technology not only need a CTO to understand the capabilities on offer (most vendors of leading edge tools don’t know what they are actually for), but they need other processes to manage that raucous environment. The firm’s purchasing department needs to understand how to negotiate with start-ups. The firm’s development team must be able to integrate primitive, early-stage technologies. The firm’s operations area may have to cope with poorly documented, unstable products. But the benefit could include being the first to open and capture a new market.
Companies that deal with established major vendors will spend much less time and effort dealing with these teething pains. But they will have to wait. Microsoft’s Internet Explorer was years behind Netscape. Some of the firms that jumped on Netscape early established dominance over their target markets – eBay and Amazon.com, for instance. In both of those companies’ cases, the CTO was the CEO. Sam Walton’s vision of a frictionless supply chain drove Wal-Mart’s very early use of e-commerce (predating the term by a decade or more) with its suppliers. Middle-of-the-pack firms don’t leverage their CTO much; they use him for insurance, not strategic planning.
Lagging companies adopt technology after the market figures out its parameters. These firms try to grab a bit of profit by squeezing in under the dominant player’s margins – selling hardware more cheaply than Dell, or audit services at lower rates than the Big Four. Picking up nickels in front of a steam-roller is a dangerous game. Larger vendors will always be willing to sacrifice a few margin points to protect market share, so a successful laggard risks extinction. Trailing-edge firms don’t need a CTO; they need a sharp financial team.
So my daughter got more than she expected, and her class got a peek at how the various functions in a strong, self-aware corporation align with the firm’s goals and vision. How does your firm use its CTO? How might it?
Friday, February 1, 2008
PCI DSS Class Thoughts
The standard came about as a result of the efforts of the then-CISO at Visa, whom I’ll name if he wishes. In the late 1990s he was concerned that merchants weren’t protecting their customers’ credit and debit card data sufficiently, so he floated the idea that merchants should follow a code of good practice: Use a firewall, use anti-virus software and keep it current, encrypt card data both when it’s stored and when it’s in flight, restrict access to systems that process card data, have a security policy that informs people that they should keep card data safe, and so on.
The idea caught on, and in 2000 Visa announced its Cardholder Information Security Program (CISP). Shortly afterward, MasterCard, American Express, Discover, and the rest all launched their own versions of the standard. At that point merchants became dismayed that they would have to follow a handful of similar standards with annual inspections from each, so the various firms providing payment cards banded together into the Payment Card Industry Security Standards Council, which released its first standard in January 2005.
The threat landscape continues to evolve rapidly. In the 1990s merchants were worried that a hacker might capture a single card in transit. Now the bad guys can hire a botnet to scan millions of firms for vulnerabilities. The Atlanta-based start-up Damballa maintains statistics on botnets, and they are frightening. At present more than 1 in 7 PCs on the Internet is infected with some form of malware. The Storm botnet seems to have over 50 million zombies (Internet-connected PCs that are receiving and responding to commands from its control infrastructure). Estimates vary but there are now about 800 million PCs connected to the Internet, with the total expected to pass 1 billion machines by 2010.
Traditional information security measures are necessary but not sufficient. Someone once said that using basic information security was like putting a locking gas cap on your car. It may slow someone down, but it won’t keep a determined thief from punching a hole in your tank and draining the gas out. While that is true, for a long time we took a modicum of comfort in the thought that a thief in a hurry would see the locking gas cap and move on to the next car. But in this new threat model, the thieves use stealthy automation, have lots of time, and need almost no effort to undetectably siphon off sensitive data from everyone.
Now there is a whole industry around this standard: about 1,400 merchants globally are so large that they must have annual examinations. There are dozens of firms that are certified to perform those exams, and another slew of firms that are certified to perform the quarterly scans the standard requires. The PCI council certifies both examiners and scanning firms. Note that they don’t certify products; they certify a company’s skill and methodology. So if a scanning vendor uses tool A for certification and switches to tool B, they need to be re-certified.
Certification is valid for one year only. But certification doesn’t guarantee that a merchant won’t get ripped off. TJX suffered the largest breach known so far, with 94 million credit and debit cards stolen. During the 17 months that the bad guys were prowling around TJX’s systems, the firm successfully passed two full examinations and five quarterly scans, all performed by large and reputable vendors. The exam is an audit, not a forensic investigation. And the bad guys are more persistent, diligent, and motivated than the examiners. Some firms believe that since they passed an exam, they must be secure. All that passing the test means is that the firm is meeting minimum requirements. Creative, persistent, diligent information security measures, proactively applied by the firm itself, are the only way any firm will have a chance of finding the bad guys and shutting them down.
The class helps firms that handle credit and debit cards understand their obligations under the standard and, more importantly, what additional measures they might take to keep bad things from happening. We look at the TJX breach in depth, reconstructing the apparent chain of events to highlight the tenacity and dedication of the bad guys. Remember that information security is entirely about economics: if the value of the information is greater than the cost of getting it, the information is not secure. For more information about the economics of information security, check out the Workshop on the Economics of Information Security (WEIS).
If you use a credit card, be aware of small but unexpected charges. The thieves can get a million dollars just as easily by taking one dollar from each of a million users as they can from taking ten thousand dollars from each of one hundred users. The difference is that nobody complains about losing a buck. The thieves are evolving into endemic, chronic, annoying parasites. Being a 21st century cyber-crook may not be glamorous, but it is lucrative, low risk, steady work.
Sunday, January 20, 2008
5, 6, 8, 12, 19, 23
It turns out there were twenty other winning tickets! So each ticket is worth only $15.6 million, or about $781,000 per year for twenty years. After taxes that’s about $470,000. The ex gets half, so I’m down to $235,000. It’s hardly worth turning the ticket in.
On the plus side, I’ll be able to pay off my debts, and get the car fixed. It’s time for a new car, anyway. And I’ll be able to get to St. Warm for a long weekend in the sun. I haven’t had a real vacation for years. I’ll bring the kids – they will have a great time. They both like fresh fish, and love to swim.
I can make up for the lame presents I was able to get them last Christmas. They both want computers, and now I can get them the laptops they’ve picked out on-line. Birthdays will be bountiful this year! Better, they will have their college all set.
I hope they don’t get spoiled.
[Postscript: This is a work of fiction. I have never won the lottery. In fact, I don't know anyone who has. Statistically speaking, I never will. This fantasy was intended to play with the idea of winning the lottery; and I hope it was enjoyable.]
Saturday, January 12, 2008
Shall I Check the Tires, Sir?
Why does the tire pressure matter? An underinflated tire experiences higher rolling resistance. This excess friction generates excess heat in the tread. This has three consequences. First, excess heat increases wear – the tire gets old faster. Second, excess heat compromises traction. Finally, underinflated tires use more gas. The difference is significant. By raising the tire pressure from 24 psi to 30 psi, the car’s mileage will improve by 3% to 4%. See the US Department of Energy site on fuel economy here. And most cars are not running at the correct tire pressure. To verify this, check the pressure on the next rental car you use. You will find that the tires are usually low. This increases road comfort – most Americans like a soft squishy ride. The rental car companies don’t care – the cost of replacing tires is part of normal maintenance and already figured into their operating expense. Most users refill the gas tank rather than pay the high charge the rental car companies impose.
Three to four percent may not seem like much, but that matches the total contribution that the Arctic National Wildlife Refuge would provide were it exploited to capacity. But more pragmatically, what can individuals do? By checking tires, each of us benefits individually by spending a little less on fuel, driving a little more safely, and having the tires last a little longer. Could manufacturers do anything? Yes, and they already have. Many newer model cars have tire pressure sensors built into the rims, so the driver doesn’t have to stop at a service station and scuttle around with a tire pressure gauge, getting road dirt on their fingers and clothing. Should newer models have a warning light to alert the driver? Should states require tire pressure checks as part of the annual safety inspection? Or should responsibility remain with the car owner, as sovereign?
This is a particularly interesting test case in that the benefits to the individual and to society are perfectly aligned. By keeping tires at the optimal running pressure, the individual gets a safer, longer lasting, more economical car, and society gets safer traffic and reduced fossil fuel consumption. The only losers in the bargain are the tire manufacturers, who sell fewer replacement tires, and the gas companies, who sell less gasoline. Tire manufacturers like being known for safe, long-lasting, economical tires, and all offer tips to improve these qualities, such as Goodyear and Michelin. Tire manufacturers grade their tires on three parameters: wear, traction, and temperature resistance. The US Department of Transportation describes this grading system here.
States are free to determine whether to inspect cars for safety, emissions, or neither, and how frequently – annually, on sale only, or at some other interval. About ten states require emissions testing only in metropolitan areas, such as Atlanta, GA, whose state program helpfully summarizes inspection programs nationally here.
A tire pressure gauge is inexpensive. Serviceable models cost under $5 at any car parts store, top of the line digital models cost $15 or so. They fit in the glove compartment. Checking the tire pressure takes a few minutes and will save a few dollars.
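To put rough numbers on “save a few dollars,” here is a sketch. The annual mileage, baseline fuel economy, and gas price are assumptions for illustration, not measurements; only the 3 to 4 percent figure comes from the discussion above:

```python
# Estimated annual fuel savings from a 3-4% mileage improvement.
annual_miles = 12_000        # assumed
baseline_mpg = 25            # assumed
gas_price = 3.50             # assumed, dollars per gallon

baseline_gallons = annual_miles / baseline_mpg
for improvement in (0.03, 0.04):
    saved_gallons = baseline_gallons * improvement / (1 + improvement)
    print(f"{improvement:.0%} better mileage saves about "
          f"{saved_gallons:.0f} gallons, or ${saved_gallons * gas_price:.0f} per year")
```

Under those assumptions, the habit is worth roughly fifty to sixty-five dollars a year per car, before counting the longer tire life.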
Thursday, December 13, 2007
The Way of the Dinosaur
The dinosaurs became extinct following a global environmental catastrophe. There was no war between the mammals and the dinosaurs. The mammals did not out-compete dinosaurs in any ecological niche. An external event radically changed the environment, eliminating the dinosaurs, which had successfully dominated the planet for 180 million years. The dinosaurs had weathered an earlier environmental catastrophe 200 million years before: only half of all dinosaur species survived it, while a lineage that showed both mammalian and saurian characteristics failed, leaving true mammals to evolve in the shadow of the dinosaurs.
And so how does this inform our understanding of the competition between the PCs and the mainframes? The mainframes did dominate the corporate landscape for generations; they were big and capital-intensive. PCs provided an alternative processing mechanism initially for spreadsheets and ultimately many traditionally host-based processes. PCs learned how to connect for client/server computing and eventually to use the Internet for browser-based information access and analysis. So PCs competed successfully for a presence in a series of ecological niches that mainframes had once dominated.
But there is another environmental catastrophe forming. This shock will transform the computing landscape. The transformation is the green revolution. Fossil fuels are becoming a costly and undependable energy source. While the transformation to alternative energy sources is underway, conservation is now an imperative. Business will consider any measure to reduce energy consumption. Building design, telecommuting, and outsourcing all shift the energy burden away from the core business. IT consumes significant power. Measures that IT can take to reduce energy consumption get high marks, are scalable, and have measurable impact on a business’s overall energy use.
Personal computers consume a significant amount of energy. A desktop computer consumes 200 watts, and generates additional energy costs in removing its heat from the building environment. A thin client workstation – and the energy in the data center to support its computing – consumes 25 watts. Converting from desktop computing to thin client computing cuts energy costs by a factor of eight. While one physical server can host five to ten virtual servers (based on typical CPU utilization), that same physical server can host a hundred or more virtual desktops. Virtualization can reduce energy consumption by an order of magnitude or more in a medium to large enterprise.
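As a rough illustration, the sketch below uses the 200-watt and 25-watt figures from the paragraph above; the fleet size, hours of use, and electricity price are assumptions, not data from any real deployment:

```python
# Ballpark annual energy cost of desktops vs. thin clients for one fleet.
seats = 1_000                   # assumed number of users
hours_per_year = 2_500          # assumed: ~10 hours/day, 250 working days
price_per_kwh = 0.12            # assumed electricity price, $/kWh

desktop_watts, thin_client_watts = 200, 25   # figures from the paragraph above

def annual_cost(watts_per_seat):
    kwh = watts_per_seat * seats * hours_per_year / 1_000
    return kwh * price_per_kwh

saving = annual_cost(desktop_watts) - annual_cost(thin_client_watts)
print(f"Estimated client-side energy saving: ${saving:,.0f} per year")
# Roughly $52,500 per year for 1,000 seats, before counting the reduced cooling load.
```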
What other benefits and risks does the organization face when converting from desktop personal computers to thin clients? On the benefit side, the data remains in the data center, so data loss because of a desktop error does not happen. The firm can deploy comprehensive backup and disaster recovery mechanisms. No user’s data would be lost because they forgot to connect and run a backup job. Also, support and maintenance costs drop. The firm will not need to keep a spare inventory of parts for multiple generations of PCs, and the tech support staff will have all the diagnostic information in the data center. Users won’t have to bring their PCs to the help desk to get software installed, and they won’t have to run virus scans or software updates to stay secure or remain current. These processes can be built into the virtual desktop environment inside the data center. Information cannot be stolen from a thin client, since it does not leave the data center. No user can insert a USB drive and download files, or lose a laptop with a hard disk full of customer records.
What risks does a firm face when migrating towards virtual desktops? There are some applications that don’t play well when virtualized – heavy graphics and 3D modeling, for example. These need an unencumbered host with a huge amount of available capacity and may not render well across the link between the virtual desktop and the screen. If the user works at locations where connectivity is problematic, he may need the entire project on his laptop, a fat client device.
More importantly, when firms virtualize their networks they may not have as much visibility into the network activity between virtual desktops and servers. And, in this virtual network, they may not be able to track which users are where. Users’ virtual desktops may move to balance load or recover from an interruption in service. Most importantly, the traditional reliance on a perimeter, whether for security, systems management, capacity planning, or compliance, vanishes in the virtual world. This requires clarity in defining business service level objectives. Policy cannot be embedded in network topology as it was in the 1980s and 1990s.
So the battle between the mainframes and the PCs may turn out a bit differently than that between the dinosaurs and the mammals. The impending environmental catastrophe threatens the power-hungry PCs, and the large hosts, which efficiently parcel out computing, storage, and bandwidth, across a broad population of users, may prove to be the more adaptable and responsive creatures in this cyber landscape.