Posted on March 31, 2009 in Uncategorized by adam

Here’s a trick you can use to improve the quality of messages that arrive in your general inquiry mailbox. As the person who triages ours, I find anything that improves it worth the effort.

Most spammers harvest their addresses using bots that just parse your site’s HTML and apply some simple regex to determine what looks like an email address. This is why people do ‘clever’ things like changing the ‘@’ to ‘at’ or putting spaces around it, etc. While this might work, I think it looks rather unprofessional; it shows you are lazy. And do you really think the people writing the kits for these harvesting scripts don’t know these tricks?

A better solution is to obfuscate the email address from the bots but still display it in the browser to humans. Here is what we do on our site:

<p class="office_contact">
  <span>Email: <script type="text/javascript">eval(unescape('%64%6f%63%75%6d%65%6e%74%2e%77%72%69%74%65%28%27%3c%61%20%68%72%65%66%3d%22%6d%61%69%6c%74%6f%3a%69%6e%66%6f%40%7a%65%72%6f%66%6f%6f%74%70%72%69%6e%74%2e%6e%65%74%22%3e%69%6e%66%6f%40%7a%65%72%6f%66%6f%6f%74%70%72%69%6e%74%2e%6e%65%74%3c%2f%61%3e%27%29%3b'))</script></span>
</p>

The magic here is that the email address is displayed only to things that are running a JavaScript engine, and most spam bots don’t. At least not currently. This is a classic arms-race problem, and this too will be pointless in a couple of years, but for now it’s a working solution.

Right around now is when the geeks reading this say something like ‘JavaScript is insecure! I never have it turned on due to the crazy-high level of paranoia I operate at!’ Okay, sure. I bet you are also ‘experiencing’ the web without cookies. But fine. Here is something for you too, to at least indicate that there is something you are missing. Well, a lot of somethings, but this something specifically.

<script src="/javascripts/jquery-1.3.2.min.js" type="text/javascript"></script>
<p class="office_contact">
  <span>If you had Javascript turned on, you would find out how to contact us</span>
</p>
<script type="text/javascript">
  var email = unescape('%3C%61%20%68%72%65%66%3D%22%6D%61%69%6C%74%6F%3A%69%6E%66%6F%40%7A%65%72%6F%66%6F%6F%74%70%72%69%6E%74%2E%6E%65%74%22%3E%69%6E%66%6F%40%7A%65%72%6F%66%6F%6F%74%70%72%69%6E%74%2E%6E%65%74%3C%2F%61%3E');
  $('.office_contact span').html('Email: ' + email);
</script>

As for how you get the magic bit of encoding, well, I used this JavaScript ASCII Converter. Just put your string in the first field and change the delimiter to %. (You will also need to add a % to the beginning of the encoded string.)
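
If you would rather not depend on an online converter, here is a minimal sketch of generating the %-escaped payload yourself in plain JavaScript. The percentEncode helper and the example address are mine, purely for illustration:

function percentEncode(str) {
  var out = '';
  for (var i = 0; i < str.length; i++) {
    // Pad to two hex digits so unescape() can decode every character.
    var hex = str.charCodeAt(i).toString(16);
    out += '%' + (hex.length < 2 ? '0' + hex : hex);
  }
  return out;
}

// Placeholder address; swap in your own.
var markup = '<a href="mailto:info@example.com">info@example.com</a>';
var payload = percentEncode("document.write('" + markup + "');");
// payload can now be dropped straight into eval(unescape('...')) in a <script> tag.

Run it once in Firebug’s console and paste the result into the snippet above.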

Posted on March 29, 2009 in Uncategorized by adam

On February 16, 2009, Dr. Peter Jensen was on the local radio station (FAN590) to pump his new book Igniting the Third Factor (Buy from Amazon). I was in the car at the time and was struck by how much great advice there was in it for team leads (keeping with my ongoing theme of coaching sport and leading teams). I emailed the station to get a copy of the interview and they sent me one, which goes to show the power of asking for something. I’m not sure if I am allowed to post the audio of it online, but here it is. And here are my notes:

  • Sports psychology looks at what makes athletes successful. ‘Regular’ psychology looks at what makes people dysfunctional
  • Take the intangible, and figure out how to make it tangible
  • Most athletic performance, at the high levels, is mental
  • Athletes taper their mental and physical preparation in opposition to each other while getting ready to compete
  • Mental preparation takes the place of physical aspect leading up to the event
  • You can’t tell how good a coach is based on immediate or current performance. The labels of success or failure can only be applied years later as a result of what the people they have coached accomplish (or don’t)
  • Past players of successful coaches don’t talk about the sport they were coached in, but about the influence the coach had on them
  • Successful coaches grow not only gold medals, but exceptional people
  • True coaching is a long-term commitment
  • High Performance coaches:
    • Manage themselves well
    • Build trust intentionally
    • Imagery – Find out what the athlete is thinking
    • Find out what is blocking their athletes and work to remove it
    • Embrace adversity – if we were prepared to go to the next level then there wouldn’t be adversity in the first place
  • Don’t force talent into a system; design a system for what you’ve got
  • If you are paid in terms of wins vs. losses or making the playoffs, you have to skip some of the developmental stuff listed above. Again, make sure your motivational and reward system garners you the desired result
  • Confidence vs. Competence is a 60/40 split 4 months before a major event (like the Olympics). By the time they are about to compete, the ratio is 100% Confidence
  • Your strengths taken to the extreme become your weakness
  • The lessons for coaches at the professional level vs. the house league level are the same but the application is totally different
  • Confidence is very fragile. Don’t set people up for major failure — let them scrape their knees, but don’t let them break their leg
Posted on March 29, 2009 in Uncategorized by adam

Part of what I recommend to people when they are staffing a test team is to get as diverse a variety of people in it as possible, the logic being that they can compensate for each other’s flaws. The implication is that flaws are bad.

So last week I was listening to podcasts on the way to work, as I usually do, and something was said that got me thinking that flaws are not bad; in fact, they might actually be desirable. I tried to find the podcast and the quote, but couldn’t.

With that idea starting to form in my head, Johanna timed her post, Hire for diversity of all kinds, really well. In it she says:

It’s too easy if you have insufficient diversity to achieve group-think without meaning to. If you’re hiring for problem-solving skills, which is what we do in high tech, you want diversity of all kinds: personality, schooling, race, culture, to name just a few. Insufficient diversity leads to an inability to generate other and different solutions

The first two sentences still work in the flaws-are-good mindset. Tweaking the last one a bit, you end up with ‘Insufficient flaw diversity leads to an inability to generate other and different solutions.’ Suddenly the standard interview question of ‘what is your greatest flaw?’ packs some relevance instead of just being something that gets asked due to a lack of originality on the part of the interviewer.

Posted on March 29, 2009 in Uncategorized by adam

I’ve been (essentially) the lead (only) developer for our company’s widget over the last couple of months. And while that is actually pretty scary, I now know way more about jQuery, JavaScript and AJAX than I did before. One thing that stumped me for a bit this week was JSONP.

JSONP is ‘JSON with Padding’, which was originally proposed in this blog post. The most important part from an environmental and testing perspective is that it ‘requires a little bit of server-side cooperation’.
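
To make the ‘padding’ concrete, here is a small sketch of what that cooperation produces; handleOffsets and the sample data are made-up names for illustration, not anything from the original proposal:

// The page defines a callback function first (the name is arbitrary).
function handleOffsets(data) {
  console.log(data.offsets[0].name);
}

// A plain endpoint would return bare JSON:
//   {"offsets": [{"name": "Methane Offset", "cost_per_unit": 14}]}
// A JSONP endpoint instead wraps that JSON in the callback name the client
// asked for, so the response is itself executable script:
handleOffsets({"offsets": [{"name": "Methane Offset", "cost_per_unit": 14}]});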

From a testing perspective, you should check that the app behaves in a predictable manner if the server-side call doesn’t work or the format of the response is incorrect (I cannot express how sick I am of seeing ‘invalid label’ show up in Firebug).

And from an environmental-setup perspective, you need to get your servers dishing out the correct messages. Here is a Perl script which will respond correctly to jQuery’s getJSON method when ?callback=? is appended to the request URL.

#!/usr/bin/perl
use strict;
use warnings;

use CGI qw(:standard);
use JSON;

# UPDATE THIS SECTION
my $raw_json = <<'END_JSON';
{
  "offsets": [
    {
      "name": "Methane Offset",
      "unit": "ton",
      "currency": "USD",
      "cost_per_unit": 14,
      "source": "http://www.thisoffsetssource.com"
    }
  ]
}
END_JSON

# DO NOT CHANGE ANYTHING BELOW

# Round-trip the text through the JSON module so malformed JSON fails here
# instead of being handed to the browser as a mystery 'invalid label' error.
my $json      = JSON->new;
my $json_text = $json->encode($json->decode($raw_json));

print "Content-type: application/json\n\n";

my $callback = param('callback');
if (defined $callback && length($callback) > 0) {
    print $callback . "(" . $json_text . ")";
}

It should be generic enough that all you need to do to use this in your environment is replace the JSON stored in $raw_json and the rest should just work.
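
For completeness, here is a minimal sketch of the client side that exercises the script above. The URL is a made-up example and it assumes jQuery is already loaded on the page:

// The trailing callback=? tells jQuery to issue the request as JSONP, which is
// what makes the script's param('callback') branch fire on the server.
$.getJSON('/cgi-bin/offsets.pl?callback=?', function (data) {
  // data arrives as a decoded object built from whatever is in $raw_json.
  console.log(data.offsets[0].name + ' costs ' + data.offsets[0].cost_per_unit);
});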


Posted on March 25, 2009 in Uncategorized by adam

I’ve seen a couple of articles recently in various business magazines and the newspaper about properly aligning your reward system to motivate employees. Or, more specifically, about how the actions that get measured drive behavior toward earning the reward rather than toward the outcome the system is actually trying to achieve.

So how do you motivate testers or, more importantly, what measurements do you use to determine whether a reward is given? Here are some of the ones I’ve had inflicted on me.

  • Number of Bugs – This has all sorts of flaws, not the least of which is flooding the system with duplicate or way less than useful bugs
  • Number of Bugs that make it to Production – There are all sorts of reasons why bugs don’t get caught — or released. Testers don’t have the final say on when a release should go, so this is out of their control
  • Test Cases Executed – Not only does this not factor in things like Exploratory Testing, but it also has no bearing on end quality
  • Product shipped on-time – Again, testers don’t control when the ship happens. And if they did, they could skimp on depth of testing to make the date

None of those felt useful then, and they certainly don’t feel useful now. What criteria do people use on their teams to measure tester success?

Posted on March 23, 2009 in Uncategorized by adam

February’s Fortune magazine has an interview with Jim Collins, who is best known for his books Built to Last and Good to Great. In the interview they talk a bit about how to not only survive, but better position yourself in this economy. Now, being a management ‘guru’, he was talking at the corporate scale, but the lessons apply equally at the team or individual level.

  • Those who panic, die on the mountain. You don’t just sit on the mountain. You either go up or you go down, but you don’t sit and wait to get clobbered.
  • What really matters is that you actually have core values — not what they are
  • The more challenged you are, the more you have to have your values
  • It is the caliber of the people that get you through tough times; not technology
  • Have a hedge against scariness
  • Don’t zoom in during crisis, zoom out
  • The right people don’t need to be managed. The moment you feel you need to tightly manage someone, you’ve made a hiring mistake
  • The right people don’t have a job: they have responsibilities
  • People who take credit in good times and blame external factors in bad do not deserve to lead
  • Turbulence is your friend
  • If you are managing for the quarter century, [you are disciplined] so that when the bad times come you’re ferocious
  • What can we do to not only survive, but turn this into a defining moment of history?
  • The goal is not to survive, but to prevail
Posted on March 23, 2009 in Uncategorized by adam

Hoffman has his own method of gauging controllers’ stress levels: Check the angles their spines make with the seats of their chairs. At 100-plus degrees—leaning back—the work is easy; straight up, things are getting interesting; once they cross the 90-degree threshold and begin to perch forward, the sky is roiling chaos. Most of the controllers at the simulation never crossed the 90-degree mark.

That is from an article in Wired called Air Repair, which is about redesigning the flight paths around New York City and the cascading effect that has on the rest of North America’s air space. It’s an interesting article, but the bit I quoted was by far the most important for testers.

In doing simulations they had all sorts of checks to see whether the test was successful or not, but the one that involved observing humans was likely right up there in terms of importance. Human beings don’t really work well under stressful conditions; it’s the whole fight-or-flight response doing weird things to brain chemistry. It is far too easy to just rely on what your unit / system / integration / other test results are telling you and ignore what is in front of you. It’s not quite the same as the Narcotic Effect of the green bar; it’s more of a Spotlight Effect.

When someone is on stage and all the lights are out except for a lone spotlight on one person, it is really hard to make out the other details and events going on elsewhere on stage at the same time. But someone with a trained eye, who knows to be looking, can still pick up important information.

Would you or I have noticed the angle of the chair? I wouldn’t have. I would have been concentrating on the locations of the planes during the actual test, then likely would have asked the controllers afterwards how they felt during it. But that introduces all sorts of biases and internal justifications which would likely taint the findings.

The Spotlight Effect is something to be wary of when determining your oracles. Especially when some of your success criteria are linked to internal states of people.

Posted on March 23, 2009 in Uncategorized by adam

Fortune has released its Most Admired Companies list for 2009. As usual with these lists, the value is not in the list itself, but in the information gleaned about commonalities and differences between the companies featured. Here are the best lines from the print edition article that sets up the list.

Most important is a strong, stable strategy, which confers important benefits in unstable times. Companies that change strategies must usually change organizational structures as well and making that change in a recession is a heavy burden just when corporations can bear it least. It forces employees to focus inward rather than outward and becomes a giant sink of time and energy.

… less admired companies change structures far more often than the Most Admired, the reason being a strategy shift.

By contrast, Most Admired “are more confident in their strategies and as a result are more likely to use this opportunity for rapid expansion and a chance to take market share,” says Mel Stark…

Says [Coca-Cola] CEO Muhtar Kent: Crises offer you the best opportunity to communicate with consumers because the airwaves are cleaner….

… you may wonder what the magical winning structure is. Turns out there isn’t one. … the Most Admired have every type of structure … They’ll even do the same things differently in different parts of their company.

What the Most Admired do share is a focus on identifying and developing talent globally.


The question now becomes just how big your reach is in terms of quality. Are you responsible for just a single component? A product? A product line? A division? A country? The whole shebang?

From either working for them or knowing people who do, I’d say a common failure mode of a lot of startups is a lack of confidence in their strategy, largely because they have realized, or are starting to realize, that it is not as high quality as they once thought. Sometimes switching mid-course works. Take Flickr, for instance, which started out as a game and then commercialized (with great success) one tiny aspect of the product.

On the flip side, companies that will not succeed in spite of themselves change strategy with the current winds. Consumer facing? Nah, that’s so last week; we’re targeting the enterprise now. But what about the six products that are consumer oriented? Well, we’re going to keep selling those to anyone who wants them, but they aren’t going to get any development resources…

In testing terms, that sort of strategic confusion is a violation of the Consistency Oracle and would have a pretty big bug logged against it. I’m not sure how many CEOs would appreciate a Critical bug filed against Strategy showing up in their mailbox, though.

Posted on March 22, 2009 in Uncategorized by adam

Before I get into the things I found interesting in this issue of Software Test & Performance, a bit of commentary. This issue was 36 pages, and 15 of those were dedicated to the STP Spring conference brochure. As a result there was less actual content than usual. That said, the content that made it in was actually pretty good.

The Performance Tester’s Survival Guide by Justin Callison is a series of bullets around things he has learned in his career. Though aimed at performance testers according to the title, they are good things to keep in mind for your testing career.

  • Be a Technical Rosetta Stone
  • Learn To Translate
  • Find One Hole and Dig Deep
  • Automation is Software Development!
  • Get to know the Architect
  • Admit When You Do Not Understand
  • Perfect is the Opposite of Good
  • Tools are Just Tools
  • Extract, Aggregate, and Visualize
  • Understand Reliability vs. Validity
  • Think Like a User
  • Get Agreement on Goals and Objectives
  • Develop Process and Methodology
  • Data is King
  • Learn From Production
  • Performance with Manual Testing

Ross Collard also contributed the first in a series of articles, which included a list of Types of Data:

  • Test data means data used in testing.
  • Live data means data encountered in live operation.
  • Captured data is a copied sample of transactions from live traffic workflows.
  • Extracted data is a copied sample of stored records from databases and files.
    • Extracted data may be in loaded forms (e.g., in a test database ready for use), and unloaded forms (e.g., data is ready for a database build effort).
    • Only live data that is captured or extracted can become test data.
  • Expected, valid data means the manicured, legitimate data that we expect to encounter.
  • Real data or actual data is what we could feasibly encounter, regardless of whether it is considered legitimate, e.g., data extracted from a corrupted database.
  • Impossible data cannot occur.
  • Atomic data is fundamental, and cannot be derived from other data.
  • Derived data is not fundamental.
    • First-order data is derived directly from atomic data only
    • Higher-order data is derived from atomic and lower-order data
  • Metadata is data about data, e.g., metrics

He then ends it with three questions we need to consider when thinking about using Live data in our testing:

  1. What live data should we utilize?
  2. How do we capture and manipulate this data?
  3. How do we use it in testing?

And to end this list of lists, there are Ten Steps To Automated Validation by Matthew Hoffman. I don’t think I disagree with any of them.

  • Your automated tester should have some software development experience.
  • Plan out your test scenarios. It is crucial to design the tests before recording the scripts.
  • Organize your tests.
  • Scripts should clean up the data they create.
  • Be certain of your assertions.
  • Scripts should test for negative scenarios.
  • Tests should execute on various environments.
  • Use configuration management.
  • Schedule regular test runs.
  • Report and distribute results.
Posted on March 15, 2009 in Uncategorized by adam

I’ve known about this whole ‘Agile thing’ since around the signing of the manifesto, and I’ll openly admit I was pretty anti-Agile as it seemed to legitimize cowboy coding and push testers out of the picture. The community has generally softened its stance on dedicated testers, and my stance towards Big-A Agile has softened too, though I am much more a proponent of little-a agile. Part of my transition into an agile tester was joining the agile-testing mailing list. Two of the frequent posters there, Lisa Crispin and Janet Gregory, have recently put their collective wisdom together and produced a nice book on agile testing called, appropriately enough, Agile Testing – A Practical Guide for Testers and Agile Teams (Amazon).

My first impression was that it is a lot bigger than I had expected, coming in at over 500 pages. That’s a lot of content.

Its heft is divided into 6 sections:

  1. Introduction
  2. Organizational Challenges
  3. The Agile Testing Quadrants
  4. Automation
  5. An Iteration in the Life of a Tester
  6. Summary

Before going too deep into the review, I’m pretty sure I’m not the target audience of this book as I’ve been doing agile-ish testing for a number of years now, so I’m going to try and juggle the new and familiar personas.

The first two sections were actually the ones I found the most useful and have the most markup in them. In them the authors describe how testing and testers fit into an Agile team, as well as giving the mandatory primer on what makes up an Agile team. The Cultural Challenges (chapter 3) and Transitioning Typical Processes (chapter 5) chapters tackle very real challenges traditional teams and testers need to overcome to have any chance of succeeding with a switch to Agile.

The Agile Testing Quadrants are a series of chapters which describe four types of tests Lisa and Janet have identified.

  • Test-Driven Development
  • Business Facing Tests
  • Customer Facing Tests
  • Technology Facing Tests

Of the four quadrants, I found the Business Facing Tests the most informative, with its discussion of tools and techniques for determining what the software is supposed to do and how it is to do it. These techniques can quickly turn into test data. It was also nice to see that I either currently use or have used them all. Exploratory Testing and other UAT-ish activities fit into Customer Facing Tests, with Michael Bolton contributing a nice piece on it.

This is a good time to say I really liked how they used outside ‘experts’ to flesh out their sections. It’s nice to see people who aren’t afraid to tap other resources when they don’t have the expertise to convey what they want. They also pepper the chapters with Lisa’s Story or Janet’s Story sidebars, which lend credibility to things, as they clearly practice what they preach.

The last three sections I actually breezed through, which is why I mentioned before that I likely wasn’t the exact audience for the book. My copy has only a couple of things noted in the Automation section, and I’ve developed my own way of handling an iteration, so that section was more of a compare-and-contrast with what I’ve figured out over the last couple of years.

If I could come up with a complaint about the book, it would be its apparent focus on the XP variant of Agile, which seems to be the least likely version to be picked up by transitioning organizations. It is pretty easy to overlook that, though, and pull out the juicy bits.

I’ve more-or-less drunk the agile Kool-Aid and think this is the way testing needs to evolve in order to be successful. If you have been around Agile for a while, this might not be the book for you; instead, you might be better off getting a book dedicated to a specific topic of interest (like Business Facing Tests or TDD or Continuous Integration). However, if you are new to testing in general, or to Agile testing specifically, then Agile Testing deserves a spot on your bookshelf as it discusses all the key points in just enough detail to morph you into an Agile tester.
