Friday, April 29, 2011

Considering a gas-powered standby generator

I've been wearing my Bo Williams Risk Analysis™ cap a fair amount for the past couple of days, and this post started life as a comment on his post Friday miscellanea, post-Alabama tornado super-outbreak edition.

Our house has gas heat and a gas set in the fireplace. I've been letting the latter slide into a state of disrepair. I recognize and accept this BWRA™ Demerit. It will be repaired before winter 2011, which will give us redundant sources of heat. You ought to see my related GTD project list!

The option seems obvious in hindsight, but it took a conversation with my dad for me to learn about permanent standby generators that run on natural gas and periodically test themselves. I priced them online, and for around $5k I may be able to purchase a unit that would power most of my house, perhaps running heavy loads such as the HVAC and dryer in staggered operation, with none of the headaches of managing a gasoline reserve or making just-in-time fuel purchases.

The last week-long outage in the area was nearly forty years ago, but sidestepping even hours-long outages due to storms, ice on power lines, drunks hitting poles, and so on is awfully appealing too. I emptied the refrigerator and freezer, and temperatures in the Huntsville area are projected to be mild. My one power-related worry is my 55-gallon cichlid tank. I turned on a battery-powered air pump, but that's about the best I could do. I think I was a week overdue for a water change when we lost power, so staying on top of aquarium chores in the future will help there too, assuming optimistically that these hardy critters make it.

Other questions:
  • Where can I find comprehensive records of electrical and gas outages, so I can tell whether I'd be trading one set of problems for a more-or-less equivalent one?
  • How expensive would backup mode be? (A rough sketch follows this list.)
  • Fossil fuels in general aren't getting any cheaper, so at what gas price does such a generator become a total dud?
  • What about purification of, say, rain water?
  • I expect a crazy run on generators when we're no longer part of the third world. What are good value metrics?
  • What other issues do I need to consider?
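
To put a rough shape on the cost question, with numbers invented purely for illustration: if a unit burns about 2 therms of natural gas per hour under load and gas costs $1 per therm, a full day of backup power runs about 2 therms/hour × $1/therm × 24 hours = $48/day. Even doubling both figures, a multi-day outage costs a small fraction of the generator's $5k price; the real economics are in amortizing the purchase against outages measured in hours per year.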

Saturday, April 23, 2011

Are coaches to blame for the awful new taunting rule?

In What if you were an SEC Official For a Day? (Part One), Clay Travis blames NCAA coaches for the new rule that allows officials to take points off the board and even eject players for behavior deemed to be taunting.
But officials believe that the coaches, who are the ones who actually make the rules, have foisted the enforcement of taunting onto the officials to relieve pressure on themselves. That is, coaches can point to the officials as the bad guys who are keeping the players from taunting. Effectively, multi-million dollar coaches are ducking the obligation to discipline their teams onto part-time officials. 
As Steve Shaw notes, “Coaches write the rules. They want it. They control it.” … 
The outrage should go to the coaches who implemented this rule. They did it because they don’t want to be the sheriffs. They want to blame the officials for not letting the players celebrate. And, by the way, do you know who will lead the charge in criticizing officials when this call is made to his team’s detriment? 
Yep, the head coach who voted to make this a rule in the first place.
I don't buy this for a second. I'd like to see a roll-call vote of head coaches at major NCAA programs on whether they actually want this awful rule. These guys risk permanent facial disfigurement screaming at players who invite stupid showboating flags in critical situations; coaches who do that are already policing showboating themselves, which hardly looks like a consensus endorsement of handing the job to officials.

In defense of players, controlling emotion can be really difficult. Any fan has found himself hugging the guy next to him after a big stop in a tense game. As for the player sweating and bleeding in the trenches, of course he's going to be elated when he breaks the go-ahead TD or comes up with a pick deep in his own territory, especially in big, high-pressure games. The players' passion is a wonderful distinguishing feature of NCAA football.

The biggest flaw of the rule is that it conflates celebration and taunting. Yes, there are cases where players have engaged in unsportsmanlike taunting meant to demean opponents, but I'd be surprised if these were anything other than a tiny minority. A study segregating celebration flags into two piles, behavior intended to deride the guys on the other side versus spontaneous joy of accomplishment, would be an interesting exercise, however subjective. Anyone interested in collaborating on tauntingornot.com?

This looks like another hook to give officials even more influence over games' outcomes. Some pencil-necked geek on the make at the corrupt NCAA is behind this.

Monday, February 28, 2011

Extracting comma-separated integers with Perl

A friend writes asking whether
my @data = ( $data =~ m|(-?\d+),(-?\d+),(\-?\d+)\r| );
or
($data) = (split "\r", $data);
my @data = split ',', $data;
would be better for extracting integers from a line of input, and my reply is below.

The best approach depends on a few factors. Who generates the data you're processing? How much slop in the input format do you need to accommodate? How flexible do you need to be with respect to future changes in the input, e.g., extra fields, different types, and so on?

The CR (\r) in your input is a red flag. Are you running on a Unix platform with input generated on Windows? Can you be more specific about “possibly some other stuff” after the comma-separated numbers?

Perl's $/ special variable can handle oddball line endings. Its default value varies with what makes sense for the current platform: e.g., \n on Unix and \r\n on Windows. Macs introduce another twist (see Newlines in the perlport documentation), but I assume you're not using that platform.

On Windows for files opened in text mode (the default), the C library silently transforms CR followed by LF into LF, so if this matches your setup, I'm surprised you're seeing the \r at all.
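
If you can't control which platform produces or consumes the file, one belt-and-suspenders option (my own habit, not anything specific to your case) is to forget about $/ and strip either style of line ending explicitly:
while (defined(my $data = <>)) {
  $data =~ s/\r?\n\z//;   # remove LF or CRLF, whichever is present
  ...;
}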

Say your input is plain text generated on Windows, and you're running on Linux. Then you'd process it with code of the form
$/ = "\r\n";

while (defined($data = <>)) {
  chomp;
  ...;
}
Remember that chomp removes the value of $/ from the end of the target.
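For instance, a quick illustration with a made-up value:
$/ = "\r\n";              # CRLF is now the record separator
my $line = "1,2,3\r\n";
chomp $line;              # removes the whole "\r\n", not just the "\n"
# $line is now "1,2,3"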

As for extracting the data, Randal Schwartz (author of Learning Perl, a.k.a. the llama book) has a rule of thumb:
Use capturing or m//g when you know what you want to keep.
Use split when you know what you want to throw away.
I first saw this useful guideline in “Regular Expression Mastery” by Mark Dominus.

If this is a quick-and-dirty utility, I'd be inclined to write
my @data = split /\s*,\s*/, $data;
This allows for and removes any whitespace around the commas.
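For example, with input invented for illustration:
my $data = "10, -3 ,42";
my @data = split /\s*,\s*/, $data;
# @data is ("10", "-3", "42")
One caveat: split only looks between fields, so whitespace at the very start of the line survives; a leading $data =~ s/^\s+//; guards against that.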

If it's important to nail down the format (maybe as a sanity check that you're in the section of the config file where you think you are), you could write
if ($data =~ /^\s*(-?\d+)\s*,\s*(-?\d+)\s*,\s*(-?\d+)\s*$/) {
  my($x,$y,$z) = ($1,$2,$3);
  ...;
}
else {
  die "$0: $ARGV:$.: unexpected format";
}
Note the use of $1 and friends inside the conditional only. Always, always, always protect uses of capture variables with conditionals.
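The reason, shown with a contrived example: capture variables keep their values from the last successful match, so an unguarded $1 can silently hand you stale data.
"abc" =~ /(b)/;    # succeeds; $1 is "b"
"xyz" =~ /(q)/;    # fails; the capture variables are left untouched
print "$1\n";      # still prints "b" -- stale data from the earlier match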

The pattern is just at the annoyance threshold of repetition and illegibility. Perl version 5.10 opens up nice possibilities with named capture buffers:
#! /usr/bin/perl

use strict;
use warnings;

use 5.10.0;

my $record = qr/
  ^
  (?&ws)
  (?<num>(?&n)) (?&sep) (?<num>(?&n)) (?&sep) (?<num>(?&n))
  (?&ws)
  $

  (?(DEFINE)
    (?<n> -? \d+)
    (?<ws> \s* )
    (?<sep> (?&ws) , (?&ws))
  )
/x;

while (<DATA>) {
  if (/$record/) {
    my($x,$y,$z) = @{ $-{num} };
    print "got: x=$x, y=$y, z=$z\n";
  }
  else {
    warn "$0: line $.: no match\n";
  }
}

__DATA__
1,2,3
4,5,6
7,8
Its output:
got: x=1, y=2, z=3
got: x=4, y=5, z=6
./prog: line 3: no match
Notice the use of the special %- hash, which records all captures named “num” in this case. With (DEFINE), subpatterns get meaningful names, and the /x modifier allows for judicious use of whitespace inside the pattern.
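
As a closing aside (a minimal sketch of my own, not part of the reply above): the more familiar %+ hash keeps only one value per name, while %- keeps them all, which is what makes the repeated “num” groups work.
if ("1,2,3" =~ /(?<num>\d+),(?<num>\d+),(?<num>\d+)/) {
  print $+{num}, "\n";          # "1" -- the leftmost capture named "num"
  print "@{ $-{num} }\n";       # "1 2 3" -- all captures named "num"
}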