Jan 14, 2010

Don't punish your customers for your incompetencies

There is a common practice amongst huge organizations of dictating their terms and conditions to the customer. And this is not done with a baton or a whip, but through soft talk from the sales engineer. Have a bunch of IIM grads do the talking and they can sell crap, claiming it to be cheese.

For instance, a software vendor gets a requirement from a customer to design a product according to their specifications. And since this SV has a crappy server that is not selling well in the market and wants to make some more revenue, they decide to do their development on this platform. The end result: the developers are a frustrated lot, since they have spent close to a year making the product work on this platform, and the remaining part of their time in the company searching for a better option in a saner organization. Then you have the customer. Ah, the customer, the fall guy in all this mayhem. In most cases, the customer would be an innocent employee in a non-IT organization who requested this product thinking it would make his life easier. The end result: to use this product, he has to:
1. Hire a consultant who would charge him by the hour for telling him he needs to upgrade all his systems to a better configuration.
2. Hire a consultant who would procure the systems.
3. Hire a consultant who would setup the landscape on the systems.
4. Hire a consultant who would deploy the backend on the landscape.
5. Hire a consultant who would raise messages to the IT support team about things not working.
6. Hire a consultant to supervise these consultants.

And just when the non-IT chap thinks he has it all sorted out and things are just shaping up, a new patch for the server arrives which brings down the entire landscape.

This is the apathy towards customers that causes many a giant to fall on its face. When we design a product for a customer, we need to remember that one day we might be the customer using that same product. Design with that attitude, and success will follow.

Jan 7, 2010

My Rantings on Flex (Part 2)

I am responsible for setting up the unit test framework for my project, and had set up this beautiful FlexUnit TestHarness and TestSuite which displayed the results in the jazzy and colorful TestRunner. Things were working fine with the dummy test cases I had added and I was happy. So far so good. Today, the developers started contributing their tests to the test suite. All they did was this:

package SomePackage
{
    import flexunit.framework.TestCase;
    import flexunit.framework.Assert;

    public class MyTestCase extends TestCase
    {
        public function MyTestCase()
        {
        }

        public function getNameAndDate():void
        {
            var obj:Object = null;
            Assert.assertNotNull(obj);
        }
    }
}
And then add it to the suite with TestSuite.addTestCase(MyTestCase) and run the test runner.

The testRunner showed a weird test case called "WarningTestCase" with some empty results in it. At first look, I asked my dev if he had created a test case called WarningTestCase (I am a novice to FlexUnit, hence this query) and he said he had not. Just to make sure (not that I do not trust my developers...they are the best in the industry), I searched for any instance of WarningTestCase and did not find any. Now I was flabbergasted. There should be some information on this somewhere on the net. So I Googled for WarningTestCase FlexUnit and got some half-baked information from Adobe that WarningTestCase means that some of the tests within one of my test cases have failed. (This was the worst kind of error message I have ever seen in my Goddamn career as a software engineer.) I am sure Adobe has some brilliant minds who could at least write well-described, well-defined error messages.

OK. So I went to find out which test case was failing. I tried all the available test cases, and they all worked fine except this one. I was almost on the verge of giving up when I tried one last step: I prefixed all the functions with the term test. So now getNameAndDate would go by testgetNameAndDate. I ran the test case again, and voila, the test runner showed me all the test results with the appropriate assert status.
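In hindsight, the convention is the whole trick: FlexUnit, like JUnit 3 on which it is modelled, discovers test methods by name, running only public functions whose names start with test. A minimal sketch of the corrected class, reusing the hypothetical MyTestCase and SomePackage names from above:

package SomePackage
{
    import flexunit.framework.TestCase;
    import flexunit.framework.Assert;

    public class MyTestCase extends TestCase
    {
        // FlexUnit only picks up methods whose names begin with "test"
        public function testgetNameAndDate():void
        {
            var obj:Object = null;
            Assert.assertNotNull(obj);
        }
    }
}

With the test prefix in place, the failing assertion shows up as a normal failed test in the runner instead of the cryptic WarningTestCase.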

The least I would have expected is well-defined error return codes, and documentation written so that dummies like myself could understand what went wrong.