Saturday, July 14, 2012

Testing Definitions for testing terms

Here are some testing definitions for testing terms. Whenever you are asked about these in interviews, feel free to use them (but don't blame me if you don't land the job). I personally have come up with these definitions based on the testing that I have done over time.

Testing Definition ---- Here's a definition for the term "testing definition" itself: any testing terminology that is used as the basis for a hiring decision in an interview is defined as a "testing definition" :).
Author's note:- Decided to have a definition for "testing definition" itself :).

Edge Case or Corner case --- Cases that the tester comes up with; when everyone else in the project realizes that they have missed those test cases, they tend to call them "corner cases". They also apply this term to use cases that would be executed by only a very small set of users. That also helps their management understand why these people missed those test cases.

Smoke testing --- Imagine that you are testing an Android app on an Android phone. Now charge the phone for some time. If you see smoke coming from the phone while you are testing, then the tests that you execute to cause the smoke to come out are defined as smoke testing.

Sanity Testing --- Imagine running a set of test cases again and again; there comes a point in your life where you think you would lose your sanity by the end of a testing session. Such testing that makes you question your sanity is defined as sanity testing.

BVT --- Most popularly known as Build Verification Testing. (Now, don't start questioning why it's called "Build Verification Testing" instead of "Build Verification". The experts have decided that it's BVT and not BV.....) The term "BVT" itself has so many flavors --- it can be expanded as "Build Validation Testing", "Bugs Validation Testing", "Bugs Verification Testing", "Blind Verification Testing" where you try to explore the product blindly, "Bug Validation Testing" wherein you say that you are trying to validate bugs from a previous build, etc. etc...... When someone asks you the meaning of "BVT", ask them what it expands to.

In case the interviewer is hell-bent on stating that BVT stands only for Build Verification Testing and nothing else, try to pick one of the above definitions.

BAT --- Now, this is defined as the equipment that most people want to batter me with. But in most companies today, it is defined as "Build Acceptance Testing". It means that you are accepting a build in its current form, with its list of bugs, so that the build is ready to be tested. You can also define it as "Bug Acceptance Testing", wherein you try to accept the bug since you happen to belong to the department of "bug-savers".

Stress Testing --- This is the kind of test that causes you to get "pretty stressed" when asked to speak about it; more often than not, people ask you about "stress" testing and "load" testing together. "Stress Testing is defined as your mental state when someone asks you about stress testing"; "Load Testing is defined as any definition that cleverly defines load testing as a form of testing that has nothing to do with stress testing".

Exploratory Testing --- This is very important to remember; "Exploratory testing" is defined as "any kind of testing that has the words 'uncharted', 'unexplored', 'no requirements', 'no test cases' embedded into the definition". Now, that's the most popular definition. What the world seems to have forgotten is the fact that Cem Kaner has already defined exploratory testing.

Rapid Exploratory Testing --- "Rapid Exploratory Testing" is defined as "any kind of testing that has the words 'speed', 'uncharted', 'unexplored', 'no requirements', 'no test cases' embedded into the definition itself". Again, that kind of definition seems most popular these days.

Usability Testing --- This is simple. When you try to define "Usability Testing", try to make use of the words "end-user", "customer", "last user", etc. being involved in some kind of testing.

Now, most of the above, as you know, are fake definitions. So what's a true definition? A true definition is one that states the intent of a type of testing and clearly clarifies its objective. A false definition is one that tries to differentiate the "defined type of testing" from another and tries to call out how it is advantageous. In my humble opinion, any time you ask for the difference between two forms of testing, you plant the seed for such "fake definitions". Every definition is true as long as it clarifies the intent. As a great person said, any verb prefixed to the word "testing" results in some form of testing; and there begins a "world of definitions".....

Monday, February 6, 2012

Automation/Requirements Document/Process/Certification cannot find bugs

Product being tested --- Breath analyzer to detect alcohol!!!

Objective of product --- Analyze the air to identify whether the person blowing air has had alcohol or not.

What the product did not do --- Analyze whether the person being tested has actually blown his air or not.

And the test case --- Get drunk. Totally drunk. Get analyzed by the breath analyzer, but don't blow air into the equipment.

And the test case result --- Failed, since the breath analyzer does not detect whether you actually blew air into the equipment or not.

And what's the bug? --- Expected behavior is that the system should detect if the person is blowing air into the equipment or not. Actual behavior is that it does not detect this.
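The bug above can be sketched as a small unit test. The `Breathalyzer` class below is entirely hypothetical, invented just to model the flaw described here: the device produces a reading without ever checking that a breath sample was received.

```python
# A minimal sketch of the breathalyzer bug, under stated assumptions:
# the Breathalyzer class is hypothetical and exists only to illustrate
# the flaw — it reports an alcohol level regardless of whether any air
# was actually blown into it.

class Breathalyzer:
    def __init__(self):
        self.air_received = False  # no breath sample yet

    def blow(self):
        # Called when a subject actually blows into the device.
        self.air_received = True

    def read_alcohol_level(self):
        # Bug: returns a reading even when air_received is False.
        # With no sample, it just measures ambient air: 0.0 ("sober").
        return 0.0


def test_drunk_subject_who_never_blows():
    device = Breathalyzer()
    # The drunk subject holds the device but never calls blow().
    reading = device.read_alcohol_level()
    # Expected behavior: the device should refuse to report a result
    # without a sample. Actual behavior: it reports 0.0, so the drunk
    # subject walks away looking sober.
    assert device.air_received is False
    assert reading == 0.0


test_drunk_subject_who_never_blows()
```

Any reasonable fix would make `read_alcohol_level` raise an error while `air_received` is false; the point of the sketch is that no requirements-driven suite would think to assert that in the first place.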

And you won't find this test case in the requirements document; not in boundary value analysis, equivalence partitioning, or any such technique; no testing certification can help you detect this flaw; no Six Sigma or CMMi process can help you find this test; and no automation suite can help you catch it.

In spite of all of the above, this bug has been around in breath-analyzing equipment for a long, long time. That proves the theory that there are more fake testers than me around :). Anyway, the point I was trying to make was that testing is best left to humans and not to automated suites, processes, or methodologies. The best tester is still the man, and not the machine!!!

Wednesday, January 18, 2012

SOPA, wikipedia and black days...

SOPA --- This term is doing the rounds these days, and a lot has been written about it already. Today, Wikipedia has termed it a black day for itself.

I interviewed myself today; the objective was to execute only one test case against the SOPA act once it gets implemented, and to break it on the first try. My test case is listed below:-

Test Case --- Search for a wiki page that has blacklisted material and has been in existence for a few years; confirm that the material is blacklisted on the wiki; do a Google search, visit the Google cache, and check whether that information is available. My guess is that it will be available. (I posted a blog post two years back and deleted it a year and a half back, and that post is still visible in the Google cache.)

Does that mean that there will be a Google Black Day too with Google users protesting to protect their data, if SOPA were to be implemented? :)

Sunday, January 8, 2012

Corporate Lies and Timesheets

All testers have filled out timesheets; most of us fill out timesheets stating that we worked 8 hours in a day. That is today's biggest corporate lie. We all know that it is never possible to work for exactly 8 hours, 0 minutes, and 0 seconds; it would obviously be somewhat more or less than that. When questioned, the project manager would cleverly counter that claim by stating that he did not work for 8 hours, but that he did 8 hours' worth of work that day. The argument goes that he might have taken somewhat more or less time, but the work that he did was worth 8 hours. That becomes the 2nd biggest lie.

If he had the ability to do 8 hours' worth of work in less than that time, then how could it be 8 hours' worth of work? To answer this, the senior project manager would point to the development of components that reduce working time and improve productivity. And then he would bring in the magic word "automation" to claim that they were able to automate away that much time.

That's the 3rd biggest lie; most of the automation that's been developed would be screen-capture components. The 3rd question is: if it reduces the working hours, then why does it not reduce the billing time passed on to the client? To answer that, they would most probably say that they will reduce billing time, but the tool that's being used was created for intellectual usage and the company has to pay for that tool's usage.

And the conversation goes on... The conversation, which started with a focus on quality, ends due to money. In the end, money wins and quality loses!!!