Saturday, March 20, 2010

Continuing the saga of UI Testing – The Sequel (Part 2)…..

Ever wondered what would happen to all the keyboards of the world if a new letter were added to the English alphabet? Well, I definitely don't know what'd happen... and will not attempt to answer that question in this post... but this would have been identified in the previous century, had there existed a UI tester who tested the keyboard's UI design with an eye on scalability... That would be one true hallmark of a person with the mind of a "True Tester"!!! But we never had any testers in the profession in the Middle Ages, did we?

When I read the above paragraph back, I think it's highly likely that I sound like an idiot... why on earth would someone want to add a new letter to the English alphabet? But then again, to answer that... "Why not???"

This is my 2nd post in the never-ending saga of UI Testing - and personally speaking, I'd classify the 12 pillars of UI Testing as follows - Accessibility, Alignment, Appearance, Compatibility, Error Handling, Hot Keys, Menus, Navigation, Security, Style sheets, Toolbars and Client-side UI Validations.

What? Did I read that right? Client-side UI Validations? Yes, of course. I believe these UI validations should also be an integral part of UI testing... mainly because an integral pre-requisite to UI testing is understanding very specific customer requirements, and these often take the form of client-side validations - which is why I am including this section over here.

Now, please note that the perfect UI classification would be found only in the world of the "Fake Software Tester"... In the true tester's world, nothing and nobody is perfect... neither is this post... If you disagree, please feel free to email those rotten tomatoes and your set of curses, along with the areas of disagreement, so that we can debate it and arrive at a logical conclusion...

Below, I am listing a few UI practices that I have been allowed to share with you... I am listing only the ones that I have seen and heard in my life... And beware, BE AWARE... Practice the below at your own peril, for these are certified fake practices and can kill the customer...

1) Not Understanding the Difference between Functional and UI Testing
I asked a friend, whom I perceive as a "Fake Software Tester", to do some UI testing on Facebook... and below is what she did...
a) Created an account for herself and logged into Facebook
b) Went ahead and added a few friends, added a few applications
c) Checked out the Privacy settings of Facebook......
d) Started to send messages and check if the messages were reaching people…
e) Started testing the wall feature…
f) And went on and on and on……

Sadly, this friend of mine did not understand the difference between functionality and UI. Everything she exercised - privacy settings, messages, the wall - was functionality; none of it examined the UI itself (alignment, appearance, navigation and so on). Understanding this difference is very important so that you can focus your testing skills on the right areas.

2) Ignoring the client bandwidth while doing UI Testing for Web-based applications
There was this customer for whom a friend's team was doing the UI testing. After the usual project life-cycle, the project went live. But customers started complaining of slowness, which was routed to the performance teams. The project team insisted that the application's stability was just fine. The problem --- most of the clients were accessing the application over 14 kbps low-bandwidth networks.

Sadly, when the test team did the UI testing, they did not understand the network bandwidth of the customer. There were so many unwanted GIFs and JPEGs, and far too much text being downloaded. It was a typical example where the test team could have suggested the use of an "HTML shrinker" type utility, but this suggestion was never made. Some of you might disagree, stating that this is a design problem, but the example re-iterates the fact that we need to understand customer networks while doing our testing.
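As a rough illustration of the kind of check that could have caught this, here is a minimal sketch - assuming Python with the requests and beautifulsoup4 packages, and a placeholder URL - that weighs a page plus its images and estimates the download time on a 14 kbps line:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE_URL = "https://example.com/app/home"   # hypothetical page under test
LINK_KBPS = 14                              # the client's low-bandwidth line

page = requests.get(PAGE_URL, timeout=30)
total_bytes = len(page.content)

# Add the weight of every image the page pulls in (GIFs, JPEGs, ...)
soup = BeautifulSoup(page.text, "html.parser")
for img in soup.find_all("img", src=True):
    asset = requests.get(urljoin(PAGE_URL, img["src"]), timeout=30)
    total_bytes += len(asset.content)

# 14 kbps = 14,000 bits/second = 1,750 bytes/second
seconds = total_bytes / (LINK_KBPS * 1000 / 8)
print(f"Page weight: {total_bytes} bytes, ~{seconds:.0f}s at {LINK_KBPS} kbps")

If the estimate comes out at a minute per page, no amount of "application stability" will save you from the slowness complaints.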

3) Understanding the true needs of the customer - A "timely" check
There is this story of a testing team testing a clock feature. The entire team tested it using their local clock settings, and the entire feature was tested with a DD-MM-YYYY format. But the client was based in a locale that preferred a YYYY-MM-DD format. This issue was detected very late, only in the UAT phase, when the UI testing should have caught it.
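A minimal sketch of the kind of check that was missing - plain Python, with render_date as a hypothetical stand-in for however the application formats dates for display:

from datetime import date

def render_date(d: date, fmt: str) -> str:
    # Stand-in for the application's real date rendering
    return d.strftime(fmt)

CLIENT_FORMAT = "%Y-%m-%d"   # the client's preferred YYYY-MM-DD
sample = date(2010, 3, 7)    # day <= 12, so a swapped format looks plausible

rendered = render_date(sample, CLIENT_FORMAT)
assert rendered == "2010-03-07", f"Wrong format for client locale: {rendered}"
print("Client date format honoured:", rendered)

Dates with a day of 12 or less are the dangerous ones - 07-03-2010 and 03-07-2010 both look perfectly valid, so the swap sails through casual testing.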

4) Understanding any specific Client Requirements - An Appearance Incident
For an online form to be used by an eye hospital, missing fields were highlighted in RED. As per the specified requirements, the test team verified that the fields were highlighted in RED. But when the application went live, many of the customers - being color blind - could not distinguish the red highlighting and were unable to work out which fields were missing.
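A minimal sketch of a check that the error indication is not conveyed by color alone - assuming Python with the selenium package, plus a hypothetical URL and CSS selectors:

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/patient-form")   # hypothetical form
driver.find_element(By.ID, "submit").click()     # submit with blanks

# Color alone is not enough; insist on an accompanying text cue
for field in driver.find_elements(By.CSS_SELECTOR, ".field-error"):
    message = field.find_element(By.CSS_SELECTOR, ".error-text").text
    assert message.strip(), f"{field.get_attribute('id')} relies on color only"

driver.quit()

The deeper lesson, of course, is that "tested as per the spec" is not the same as "tested for the customer".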

5) A cold story of Hot-Keys
An ex-boss of mine narrated an interesting incident. His team did a lot of research and hard work to develop a text editor component for their application. When they created this text editor, they decided to define their own hot keys for it. What they had forgotten was that users are accustomed to "Ctrl+C" for copying and "Ctrl+V" for pasting; without those combinations, users simply assumed that copy-paste was unavailable in the text editor. Sadly, for all their hard work, the text editor remained unused.

The creators of the application insisted that the hot keys were listed in the Help Manual. Sad but true, in today's world most of our users do not read the manual; whatever appears in it remains mostly unread. And the dev team forgot that most of us are comfortable with the Windows-standard hotkey combinations.

6) When Compatibility testing takes a vacation…
Once, Compatibility went on holiday. There was a very smart Project Manager (smart because he considered himself very smart) who decided he could reduce testing time and project cost by assuming that if one page worked on most browser/screen-resolution combinations, all the other pages would too. The biggest problem with this assumption was that the "Search" and "Registration" pages were never tested on the other screen-resolution combinations. When the app went live, the product owners realized that the most commonly used fields of the application were hidden, and the user had to do a lot of scrolling.

Compatibility testing means that you test all features of the application on all the browser/screen-resolution combinations your users actually have. Such incidents still occur today, wherever an inexperienced team of project managers handles the delivery of the application.

7) Error-handling
Do the following error messages make any sense?
i. Error 2003- Please contact the admin.
ii. Error -03841769-Error Occurred.
iii. Error 80184436-Duplicate is existing.

The error messages above are meaningless. The tester should try to trigger every possible error message and verify that each one passes meaningful information on to the reader. This is one place where the manual tester scores over automation, since an automated script cannot really judge when an error message is stupid.
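That said, a crude lint can at least flag the blatantly contentless ones - a minimal sketch in plain Python (the patterns are illustrative; judging whether a message truly helps still needs a human):

import re

USELESS_PATTERNS = [
    r"^error\s*-?\d+\s*-?\s*$",      # a bare error code and nothing else
    r"error occurred\.?$",           # says nothing about what happened
    r"contact the admin\.?$",        # gives no hint of what went wrong
]

def looks_useless(message: str) -> bool:
    text = message.strip().lower()
    return any(re.search(p, text) for p in USELESS_PATTERNS)

for msg in ["Error 2003- Please contact the admin.",
            "Error -03841769-Error Occurred.",
            "Please enter the date as YYYY-MM-DD, e.g. 2010-03-20."]:
    print(f"{'SUSPECT' if looks_useless(msg) else 'ok':7} | {msg}")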

Moving on... let me say my bit about Error Handling, Compatibility and Hot Keys in the world of UI Testing...

Compatibility - What do you most commonly look for while doing Compatibility Testing?
What is compatibility testing? My answer's very simple... ensure the application behaves itself on every browser, operating system and screen resolution the client actually uses. When you do compatibility testing, ensure the following:-
1) Make a list of all browsers that would be used to access your application
2) Make a list of all Operating systems, or other dependant frameworks or software
3) Make a list of all screen resolutions
Then build the cross-product of these lists and test every possible combination (a sketch follows below). And do not be a Fake Tester and limit yourself to these lists alone... each application is unique and demands that you build a combination matrix specific to that application.
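A minimal sketch of that cross-product in plain Python - the entries are illustrative placeholders, so substitute what your client actually uses:

from itertools import product

browsers    = ["IE 8", "Firefox 3.6", "Chrome 4"]      # placeholders
oses        = ["Windows XP", "Windows 7"]
resolutions = ["800x600", "1024x768", "1280x1024"]

matrix = list(product(browsers, oses, resolutions))
for browser, os_name, res in matrix:
    print(f"Test run: {browser} on {os_name} at {res}")
print(f"{len(matrix)} combinations to cover")

Eighteen combinations even for these short lists - which is exactly why the "test one page and assume the rest" shortcut is so tempting, and so dangerous.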

Error Handling - What do you most commonly look for while doing "Error Handling" Testing?
1) Consistency - Error Handling has to be consistent. The same error condition should throw the same error messages across the application.
2) Meaningful - The error message should be meaningful and the customer should be able to understand it.
3) Educative - Error messages should assume that the user does not know a lot about the application, and should tell the user what the mistake was and what the rectifying steps are.
4) Returning Tab Focus - Also check whether the tab focus returns to the erroneous field when there is an error (see the sketch below).
5) Working closely with Business - Work with the business user and make a list of all error messages that can occur. Ensure that you test for all the error message combinations and for consistent error messages.
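A minimal sketch of points 1 and 4 - assuming Python with selenium, and a hypothetical page, selectors and message text:

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/register")        # hypothetical page

email = driver.find_element(By.ID, "email")
email.send_keys("not-an-email")
driver.find_element(By.ID, "submit").click()

# Consistency: the same bad input must always yield the same message
message = driver.find_element(By.CSS_SELECTOR, ".error-text").text
assert message == "Please enter a valid e-mail address.", message

# Returning tab focus: after the error, focus belongs on the bad field
assert driver.switch_to.active_element == email, "Focus did not return"
driver.quit()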


Hot Keys - What do you most commonly look for while doing "Hot Keys" testing?
What are hot keys? We all know about them... but the fake tester does not. Let's simply say hot keys are key combinations provided by Windows (or whatever the OS is) to execute OS-specific commands. It is not right for an application to hijack those combinations for its own features; a program should never over-write an operating system command. What do you look for while doing "Hot Key" testing?
1) Over-riding - Hot keys provided by the operating system must not be over-ridden (all standard hot keys such as Ctrl+C, Ctrl+V etc. keep working as expected)
2) Cancel & Escape - The functionality of the Cancel button and the Escape key needs to be the same
3) Duplication - No duplication of hot keys
4) F1 - F1 always takes me to Help, from anywhere in the application
5) Alt-F4 - Alt+F4 triggers the window-close event and is never swallowed by the application (a sketch of checks 1 and 3 follows below)
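A minimal sketch of checks 1 and 3 in plain Python - the hot-key map is a hypothetical structure standing in for however your application registers its bindings:

from collections import Counter

RESERVED = {"Ctrl+C", "Ctrl+V", "Ctrl+X", "Alt+F4", "F1"}

app_hotkeys = [                    # illustrative bindings under test
    ("Ctrl+C", "clear canvas"),    # BAD: shadows the OS copy key
    ("Ctrl+B", "bold"),
    ("Ctrl+B", "bookmark"),        # BAD: bound twice
]

for combo, action in app_hotkeys:
    if combo in RESERVED:
        print(f"OVERRIDE: {combo} ({action}) shadows a reserved key")

counts = Counter(combo for combo, _ in app_hotkeys)
for combo, n in counts.items():
    if n > 1:
        print(f"DUPLICATE: {combo} is bound {n} times")

Had the text editor team above run even this much against their custom bindings, a Ctrl+C that didn't copy would never have shipped.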

Cut to the Chase
During my limited time so far in the world of testing, I admit to having been a "Fake Tester" for a very long time. And in some aspects, I still am one. I have interviewed/questioned many fake testers on their understanding of UI testing, and some of the stories listed above come from what I heard in those interviews.

It is very important for the tester to understand the 12 pillars of UI testing and plan the tests around them. I have talked about the three A's earlier, and about the next three - Compatibility, Error Handling and Hot Keys - over here.

The list of items under these classifications is huge... and I believe the classification list is endless. What I have put in here are only my own thoughts, so feel free to comment/write to me and let me know whatever I have missed.

And saying adios with the promise to write about the last 6 pillars -- Menus, Navigation, Security, Style sheets, Toolbars and Client-side UI Validations -- in a subsequent post..... Have a great week ahead!!!

Sunday, March 7, 2010

Beginner's Navigation Guide to the Galaxy of "Report Testing" in Software...

Every application has reports... but are they mostly overlooked by testers? Do we really devote enough thought to planning the testing of reports?

Well, I could not find enough material in the first few pages returned by Google for the search "Tips and Tricks for testing software reports", which prompted me to write this post... Actually, there's another reason too... Reports are one part of the system that cannot be neglected: project teams tend to undermine their importance, yet they form a very crucial part of the system... Why so? Mainly because it is on the information presented in reports that businesses make a lot of their decisions. This is something that the "Fake Testers" in business today will not understand.

Trying to broadly classify the reports that are part of every software solution, I have come across 2 kinds - "Operational" and "Historical" reports. (If you can think of a broader classification, which I am sure you can, then please write to me, so that I can be proved a member of the Fake Software Testing Community.)

Operational Reports - Reports that drive the day-to-day operations of the application and the business.

Historical Reports - Reports built from historical data retrieved from the database, which form the basis for key management decisions.

I am sure you would have encountered both types... (and you can refine this classification even further...)

The top 6 aspects that are mostly overlooked by the fake software tester when planning the testing of software reports are, in my humble opinion, as follows:-

1) Data Preparation for Testing Reports
a. Importance of Test Data - It's impossible to prepare test data for testing software reports in a single day!!! You will need to understand the business context and create all your test data accordingly, as close to real-life data as possible. The fake software tester has this irritating habit of creating a "data dump" with whatever data comes into his head, without checking whether it is anywhere close to like-live data. This would rank at No. 1 in the "Most Horrible Practices of Software Testing" list. The data being prepared should be close to like-live data, and it is one of the best practices to have the business users take a quick look at the data that will be tested with.

b. Life Span of Application and Frequency of Report Generation - Mostly, you will need to create/generate your own data, so please understand the life span of the application being tested before coming up with like-live data for testing. Also understand the frequency with which the reports will be generated, so that your data preparation matches it.

c. Sourcing for Reports - If the data can come from an external data feed, please work with the corresponding systems to get the data in place for your testing. If the data is going to change almost on a daily basis, there would be situations wherein you will have to change dates or other attributes in the system. See if you can have an automated program setup to generate this data for you.

Please plan for data preparation for testing Reports as part of your "Data Preparation" Activity.

It is very difficult for me to imagine a fake tester dedicating so much time and planning to the above-mentioned activities as part of testing. To quote an example, I remember seeing data prepared by a fake tester that contained "Sundays", when the report was on stock movement over a 5-week period. The fake software tester does not realize that the stock market is closed on Sundays, mostly!!!
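A minimal sketch of like-live data for exactly that report - plain Python, trading days only (real markets also close on exchange holidays, which a production generator would take from an exchange calendar):

from datetime import date, timedelta
import random

start = date(2010, 2, 1)               # illustrative 5-week window
days = [start + timedelta(days=i) for i in range(5 * 7)]
trading_days = [d for d in days if d.weekday() < 5]   # Monday-Friday only

random.seed(42)                        # reproducible test data
rows = [(d.isoformat(), round(random.uniform(95.0, 105.0), 2))
        for d in trading_days]

for day, price in rows[:3]:
    print(day, price)
print(f"{len(rows)} trading days generated - no Sundays in sight")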

2) NFRs for Reports
a. Response Times - Do reports need NFRs? This is one question that is asked by the fake software tester. Another set would state that the business never gave them any NFRs. Sample this: if the business user expects a report to be generated in 4 seconds, then that is something you need to test. A true tester would question the NFRs for reports and would ensure that testing for them is incorporated into the test plan.

b. Report Data Volumes - Another thing that the fake tester does not ask is how many records get processed for a particular report - at least a range. There is no point trying to test with 200 trillion records if the business expectation of data is only around, say, 40381 records :)!!!

Response Time and Record Volumes need to be understood before starting to test reports.
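A minimal sketch that ties the two together - plain Python, with generate_report as a hypothetical stand-in for the real report call against seeded test data:

import time

RESPONSE_BUDGET_S = 4.0    # agreed with the business user
EXPECTED_ROWS = 40_381     # a realistic volume, not 200 trillion

def generate_report(row_count: int) -> list:
    # Stand-in for the real report call
    return [("row", i) for i in range(row_count)]

start = time.perf_counter()
rows = generate_report(EXPECTED_ROWS)
elapsed = time.perf_counter() - start

assert len(rows) == EXPECTED_ROWS
assert elapsed <= RESPONSE_BUDGET_S, (
    f"Report took {elapsed:.2f}s against a budget of {RESPONSE_BUDGET_S}s")
print(f"Report generated in {elapsed:.2f}s for {len(rows)} rows")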

3) SQL Related Testing of Reports
This is one area that I believe the fake tester is completely blind to.

a. SQL Profiler - After executing the report, it is a good habit to use a database profiler tool (SQL Profiler for SQL Server; I am not sure of the equivalents for other databases) to run a trace and check for performance improvements. Handing the developer a trace from the test systems tells him exactly what tweaking he needs to do to make the report execute in much quicker time.

b. Dynamic SQL Query - A lot of freshers miss this. I was once asked to check why reports were taking a long time to respond. When I looked into the sproc, I realized there was a lot of static SQL where the need was for it to be dynamic. There was no point in me merely raising a defect stating that the query takes a long time; pointing at the cause was far more useful. Though you are the developer's "enemy", please remember that it is such small actions that go a long way in bridging the gap between you and the development teams.

c. Overnight reports - Now, in case the data is not based on daily transactions and the report is required during the day, it makes a lot of sense to have the data archived into a separate table, with a separate sproc to fetch data from that table for the reports. The SQL gurus of the world may well point their guns at me if this is incorrect, but again, please remember that I have a lot of traits of the Fake Software Tester and am currently in the transition phase :)!!!
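A minimal sketch of the tracing habit, using Python's built-in sqlite3 as a stand-in (SQL Profiler itself is a SQL Server tool; this is only an analogue): time the report query and inspect its plan for full scans that an index would remove.

import sqlite3, time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (day TEXT, symbol TEXT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("2010-03-01", "ACME", 100.0 + i % 7) for i in range(50_000)])

query = "SELECT day, AVG(price) FROM trades WHERE symbol = ? GROUP BY day"

# The plan is the trace to hand to the developer: a SCAN here says an
# index on symbol is missing
for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("ACME",)):
    print(row)

start = time.perf_counter()
conn.execute(query, ("ACME",)).fetchall()
print(f"Report query ran in {time.perf_counter() - start:.3f}s")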

4) Print Functionality of Reports
Almost every report will eventually be printed onto paper. Though all the "Go-Green" people of the world will hate me for this, a true tester has to be "Non-Go-Green" and will, I believe, waste a lot of paper testing this functionality.

A true tester, before designing his tests for reports, would go the extra mile to understand the following:
a. Which printer is used when the application is like-live
b. What the printer configuration is when the application is like-live
c. What kind of paper is used for printing (A4, A3, A2, etc... any other specific configuration)

Importance of the type of Paper - There's a reason for understanding the kind of paper. I remember a client of a peer of mine who printed all reports on "ruled paper", so that each report's data had to land in the corresponding pre-printed columns (for some reason - I remember it was an incompatibility with other applications - they had all reports printed on that paper). Now, this type of "special paper" was never used by the test team, and that resulted in a lot of surprises when the application went live.

Importance of Printer Configuration - Also, if the customer is using landscape printing on A2 for a type of report, then what value does the test team add by testing those reports on A4 sheets using portrait configurations?

Importance of trying to print many times before Go-LIVE - And please take as many print-outs as you can, to see how the print-outs actually look!!! Only by taking print-outs can you fix problems such as the printed copy running past the paper borders, totals printed on a separate page, page alignment of the report header and footer, alignment of the summary, font-related issues, etc.

This area, sadly, is also mostly neglected by the fake tester.

5) Import and Export Functionalities/Search Criteria/Localization
Every report has import and export possibilities. Export, definitely yes, but import???

Of course... I have seen cases where the user wants to import the query from a text file. So, the possibility of "Import" always exists, though it is very minimally used.

Now, we need to understand the various applications - and versions of those applications - into which the export must work, and test with all of them... I do know of a case where the user was on a very old version of Excel, and the application refused to export into that version of the Office software. Sadly, by the time this was caught, there had been a lot of escalations questioning delivery capability!!!

The search criteria usually do get tested, but they also need to be tested to ensure that the user cannot search with what is not possible. Search can throw up a lot of defects. And would I want to save a search?
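A minimal sketch of driving the export check from the full target list - assuming Python with pytest, and a hypothetical export_report function standing in for the application's real export call:

import pytest

EXPORT_TARGETS = [
    ("excel", "97-2003 (.xls)"),   # the old version users still run
    ("excel", "2007 (.xlsx)"),
    ("csv", "-"),
    ("pdf", "1.4"),
]

def export_report(report_id: str, fmt: str, version: str) -> bytes:
    # Stand-in for the application's real export call; wire in the real one
    raise NotImplementedError

@pytest.mark.parametrize("fmt,version", EXPORT_TARGETS)
def test_export_produces_output(fmt, version):
    payload = export_report("stock-movement", fmt, version)
    assert payload, f"Empty export for {fmt} {version}"

The point is the list, not the code: the old Excel version only makes it into EXPORT_TARGETS if somebody asked the users what they actually run.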

6) And understanding the business context and business logic behind the reports...
And as I had stated at the beginning, please understand the business logic and the business context behind the report that you are testing. Ask the following questions
"Why do you need the report for business?"
"What data is shown, and what decisions can be made from this report?"
"Why do you feel some columns are not required while some are required?"
"Why do you think that this Report makes sense to business and if not, why do you think that this report is not required"

And to re-iterate, there are a lot of other aspects of software reports that need to be tested - not just these 6. What I have listed above are the aspects I feel are most overlooked and need a re-look. And since nobody's perfect, neither am I... If you really feel there is more to add, please keep those comments coming in...

Very unfortunately, the fake software tester does not realize the importance of testing reports - which can indirectly lead to a very wrong business decision, which can potentially bring down the entire company or cause it huge losses... It is my wish that a day comes when every tester understands the business context of the reports he tests, and tests them from the business perspective...

And until this day arrives, the ranting of this fake software tester would definitely continue.....!!!