Thursday, December 6, 2012

Methodologies, Standards, Maturity Models, and Project Success


'Methodologies, Standards, Maturity Models, and Project Success' is the topic of my session at the International Conference on Software Engineering and Mobile Application Modelling and Development (ICSEMA 2012), 19-20 Dec 2012, Chennai, India.

Several software development methodologies, standards, maturity models, etc., have emerged in our industry over the past decades. The plethora of such entities has created several opportunities as well as challenges. Gone are those days when monolithic IT systems were developed and maintained by exclusive communities of IT professionals confined to technology-savvy regions of the world. The challenges of software engineering during the 21st century are quite different and multifold because of factors such as globalization and technology evolution. Global software engineering (GSE), which involves virtually distributed teams working across time zones, is a growing area of practice and research. While GSE enables our industry to leverage the skills and competencies of software professionals across the globe, it poses several challenges with regard to time differences, communication, coordination, and cultural issues.

Meanwhile, the evolution of IT in terms of programming languages, databases, tools, platforms, devices and methodologies demands that IT professionals cope with emerging technology areas and paradigms while also holding down their day jobs. Developments during the past decade have presented new platforms and paradigms such as virtualization, service orientation, cloud computing, and agile software development. With the popularity of GSE and evolutionary methodologies, it has become imperative for IT professionals to stay up to date with a very high level of team spirit and a collaborative attitude in order to deliver desired results time and again.

In this complex context, how can we deliver successful projects? The objective of this session is to explore the meaning of and the relationships among these entities, and to understand how they contribute to project success.

Let me ask you this.  How do you deliver successful projects? Any thoughts?

Tuesday, November 20, 2012

Technical Debt and Challenges in Distributed Agile


This is about my article ‘Distributed Agile, Agile Testing and Technical Debt’ published in IEEE Software (Nov/Dec 2012). In this article, I have presented an interview I conducted with Johanna Rothman and Lisa Crispin in March 2012, when I met with them at Belgium Testing Days 2012.

The goal of my interview was to understand their perspectives on technical debt in agile teams with specific reference to agile testing and distributed teams. Here is the list of questions I wanted to ask in my interview.
  1. ‘Technical Debt’ is a common term used by agile teams. What are the implications of ‘Technical Debt’ for Agile Testing Teams? Or what does it mean (or how does it matter) to Agile Testing Teams?
  2. What is the impact of inadequate technical debt management (by everyone else in the project) on Agile Testing Teams?
  3. Should Agile Testing Teams be aware of ‘Technical Debt’ and participate in discussions related to Technical Debt? How can they contribute?
  4. Test Automation is one of the core focus areas of Agile Testing Teams. How can they understand and manage ‘Technical Debt’ related to automation design or scripts? Is it happening in the industry? What has been your experience?
  5. In this context, what challenges do you see in distributed teams? How can these challenges be addressed? Any experiences to share?
My interview with them lasted about 40 minutes. I had a memorable interaction with them during the conference as well as over email until April 2012 to finalize their responses.

In this article, I have included my thoughts on the need for geographically distributed teams to become more aware and aligned to effectively manage technical debt.

You can download the PDF file of this article from https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6336723

Write to me if you have any difficulties in getting this article.

Tuesday, October 30, 2012

The Power of Inquiry: Coaching Tips for You! - Part 4



Tips to Ask Powerful Questions:    Here are some tips to get started and master the art of asking powerful questions.

1. Prepare your questions
2. Eliminate the questions that are less powerful
3. Rehearse
4. Go through the checklist
5. Apply
6. Experience the process of inquiry and improve

The Greatest Enemy:  Many of us conduct our work life (as well as personal life) with an ‘I-know-how-to’ attitude. That is good as well as bad. Good when we are ready to learn from multiple sources, inspect and adapt. Bad when we are clouded by illusion. Our greatest enemy is this illusion. When we stop learning, our questions can become damaging. This is when managers become damagers. Their questions are direct, critical, offensive, acerbic and vicious. We cannot afford to live with the illusion of knowledge. Don’t you agree?


Conversations and Questions:  Asking and answering questions is a significant part of our daily conversations. We ask questions because of the reasons we discussed in Part-1 of this blog series.

Questions can be of different types. Some questions help us open up or start a conversation. Some questions help us probe. Sometimes we ask hypothetical questions and seek answers. Some of our questions can be reflective in nature. Finally, to end a conversation we ask closing questions. Here are some examples.

Initiate: How have you been? What are we going to discuss today? How was your meeting yesterday with the customer? What are your concerns about our project?

Probe: Can you explain why this tool is important? When did we first observe this? What were the observations?

Create: (Create a hypothetical situation) If we get a database expert, how will that help our project?

Reflect: Would you prioritize the top 3 or 5 issues first so that we can involve our team in finding collective solutions?

Close: What is our action plan? What is our next step? When will you talk to our customer about this?


Do you follow similar models?



The Power of Inquiry: Coaching Tips for You! - Part 3



When we ask questions, we reveal their scope. In every question there is a context, and there is a scope, either implicit or explicit. Why does the scope of our question matter? It matters because it makes the question fit its context. It clarifies the purpose and hence adds more energy or weight to the question. Let us examine the following three questions.

1. How can we educate everyone in our organization in writing high quality code?
2. Why don’t you take ‘code quality’ as an initiative at our business unit level?
3. As a team member how can you write high quality code so that we can meet our goal of delighting our customer?

Depending on the context, the scope of our questions has to be made appropriate. Otherwise, it can result in a shocking experience. For example, the scope of the first or second question is not right for someone who is struggling to ensure code quality at the project level.

In our questions we embed our assumptions too. In clear and powerful questions, assumptions do surface. Here are some examples.

• Can we do something to produce good quality code? (assumes that nobody in the team has written good quality code)
• How can we learn from the other project team about writing unit tests and adopting TDD? (assumes that nobody in your project can contribute)
• Why is it not working? Why has it crashed? 
• Can you help me understand the situation?

Questions reveal team spirit and your intention. Which of these two questions is better? Why?

1. Why do we receive these customer complaints? Who is responsible? What did we do wrong? Can someone explain?
2. What can we learn from our customer’s email and the current situation? What options do we have? How can we help each other and serve our customer better? Any ideas?

How about these?

• How can we improve quality and do things faster as compared to the other team?
• How can we collaborate with the other team and understand which of their practices will work for us and provide benefits?

The first question induces competition whereas the second question nurtures collaboration.

Becoming Collaborative Coaches:   The first step to becoming a collaborative coach is to be genuine. Ask genuine questions. Let them be powerful questions. When genuine questions are powerful, they invoke genuine answers. That is a virtuous cycle!  Yes. Let me repeat!

• Collaboration can be nurtured through genuine questions
• Genuine questions, when powerful, will result in genuine answers
• That is a virtuous cycle!

Checklist to Formulate Powerful Questions:  Here is a checklist that can help us formulate powerful questions.  Try this out!

1. Is this question relevant?
2. Is it genuine?
3. What do we want to accomplish with this question? What kind of questions, conversations or emotions can be triggered when we ask this?
4. Will this question invite fresh thinking/feeling?
5. What beliefs and assumptions are hidden here?
6. Will this question increase our focus on problems and shortcomings? Or will this question generate hope, engagement, collaboration, action and new possibilities?
7. Does this question leave room for new and different questions to be raised as the initial question is explored?

Adapted from Sally Ann Roth, Public Conversations Project, c. 1998


Friday, October 26, 2012

The Power of Inquiry: Coaching Tips for You! - Part 2



How do we construct powerful questions? I was reading the white paper ‘The Art of Powerful Questions: Catalyzing Insight, Innovation and Action’ written by Eric E. Vogt, Juanita Brown and David Isaacs. Written in 2003, this paper mentions that questions starting with ‘Which’ are less powerful, and so are closed-ended or Yes/No questions. Who, when and where add some power to questions, whereas why, how and what help us construct powerful questions! In most circumstances this holds good!

A word of caution! Sometimes why, how and what questions can be damaging. Here are three examples.

a) Why do we have unfinished stories?
b) What makes our folks stay on internet messenger all the time?
c) How can we even think about such a bad design?

I hope you got the point! Let me move on.

We do see ups and downs in our projects. It happened in one of my projects too. We came across code quality issues reported by a customer. The mail from the customer reached our project manager. He wanted to have a team meeting to figure out the situation and find a solution.

When you put yourself in the shoes of this project manager, which of the following questions will you prefer to ask?

a) Are we satisfied with the quality of code we deliver?
b) When have we been most satisfied with what we deliver? How did we accomplish that?
c) What is it about our way of writing code that you find most satisfying?
d) Why might it be that the feedback on our code quality has had its ups and downs?

Or, when you want to involve one of your programmers in addressing this situation, will you choose

1. As a team member how can you write high quality code so that we can meet our goal of delighting our customer?
or
2. With your experience in writing high quality code, how can we enable our team in writing similar code?

By the way, do you think Jim could have been a better coach? Don’t you think Jim could have asked Sailesh the second question and helped Sailesh understand his true potential?

I am sure you have related these examples to your experience and incidents from your projects. Have you thought about the scope of questions and underlying assumptions? Yes, there is a scope, implicit or explicit, in every question. Also, there are underlying assumptions.

Part-3 of this blog post will address these two aspects in detail.


The Power of Inquiry: Coaching Tips for You! - Part 1



‘The Power of Inquiry: Coaching Tips for You!’ was the topic I chose for my 45-minute keynote at Agile Tour 2012, Chennai. It happened last week (20th Oct). The venue (Hotel RainTree, Anna Salai) was great and the delegates were superb! I am writing this blog post to share the salient aspects of my session.

Let me begin with the word ‘inquire’. Inquire means explore, probe, investigate, examine, analyze, review or enquire. It is about seeking information about something or doing a formal investigation. The word ‘inquiry’ means exploration, probing, investigation, examination, analysis, review or enquiry. Inquiry or enquiry is one of the powerful means of coaching. Agile coaches and Scrum Masters can make a positive impact on their teams by understanding the power of inquiry.

Effective inquiry consists of powerful questions. We can learn the importance of asking questions or the power of inquiry from what Albert Einstein said – “If I had an hour to solve a problem, and my life depended on the solution, I would spend the first 55 minutes determining the proper question to ask, for once I know the proper question, I could solve the problem in less than five minutes.”

In her book “The 7 Powers of Questions”, Dorothy Leeds says, “Questions 1) demand answers, 2) stimulate thinking, 3) put us in control, 4) get people to open up, 5) give us valuable info, 6) lead to quality listening, and 7) get people to persuade themselves.” Interesting! Isn’t it?

Having set this context with the delegates, I shared the agenda of my session. The agenda was this set of questions!

a) Why powerful questions?
b) What are powerful questions?
c) How do we go about asking powerful questions? and
d) How can we retain the takeaways, stay connected, and share our coaching experiences?

The delegates were curious and very attentive.

Why powerful questions? Powerful questions a) initiate reflective and productive conversations, b) surface assumptions, c) generate enthusiasm and energy, d) focus attention and inquiry, and e) induce more questions.

Powerless questions do the opposite! They do not initiate reflective and productive conversations. They hide assumptions. They sap energy. They demotivate people!

All of us do have the ability to distinguish powerful questions from powerless questions. What do you think about the following questions? Which ones are powerful? Which ones are not so powerful?

a) Are we doing well in this iteration?
b) Which user story are you working on?
c) Did you do unit testing?
d) What does it mean to provide quality deliverables to our testers?
e) What risks exist that we have not thought of yet?
f) What is the possibility we see now?

The first two are obviously weak questions. You are the Scrum Master or Agile Coach. You know what is happening in the project. You attend daily stand-up meetings! In spite of all this, do you ask the first two questions? Do you stop there? Or do you continue your dialogue with powerful questions to make your questions accomplish what you want them to accomplish?

The third question is a closed-ended (Yes/No) question. All of us agree that the last three questions are high quality questions. They are powerful questions! These are the questions that make you think, participate and find answers.

How do we construct powerful questions? Part-2 of this blog post answers this question with several examples.

Thursday, October 11, 2012

Scrum Masters and Coaching



A couple of days ago, I went through ‘The Scrum Guide’ developed and sustained by Jeff Sutherland and Ken Schwaber. I focused on the ‘coaching’ aspects of the Scrum Master role and found the following.

The Scrum Master serves the Development Team in several ways, including:
       *  Coaching the Development Team in self-organization and cross-functionality
       * Coaching the Development Team in organizational environments in which Scrum is not yet fully adopted and understood.

The Scrum Master serves the organization in several ways, including:
       *  Leading and coaching the organization in its Scrum adoption;

Thinking through these aspects, I started recollecting a decade-old incident. We were new to agile methods at that time. Scrum was not yet very popular in India or in other parts of the world. We were learning to do iterative development and trying to understand and follow agile principles. XP was well known.

Our team was not fully self-organized. Jim, one of the senior project managers in the organization, was responsible for building the project team, working with them and making sure that we delivered. He had prior experience in executing projects using RUP (Rational Unified Process). He was a wonderful person, a seasoned manager, a knowledge seeker and a mentor.

To me, the role Jim played appears similar to that of a Scrum Master.

During the early days of this project, Jim noticed that Sailesh, one of our team members, used to come to work late (by an hour or two, or sometimes even three), complete his tasks and go home. Jim was open-minded. He believed in flexible work hours. With no urge to make any judgment, Jim was not bothered as long as Sailesh was able to deliver and meet his commitments. Sailesh was a very good programmer who wrote high quality code and took charge of complex features.

After a month or so, Jim found that one of the team members needed some support from Sailesh in solving a technical issue. Sailesh was not around. As usual, he arrived late that day and then started concentrating on his work. Obviously, every day Sailesh had just enough time to take care of his own tasks at work. How could his daily schedule have provided him time for collaboration, hand-holding or mutual help? He felt self-sufficient because of his skills and experience. He did not need help from his team mates. As you may guess, he was not showing any signs of a collaborative attitude.

That was an impediment. Having observed similar incidents with Sailesh, Jim was concerned and called for a meeting the next day at 9.00 am. Jim wanted me to join the meeting because he was preparing me to take over his role over the next few months.

The next morning Sailesh came in late. He entered the meeting room at 9.40 am with a quick smile and a casual remark, “Hi Jim, I reached just now. Shall we start?” That was a 40-minute delay!

Not expecting anything more than that, Jim responded, “Sailesh, it is 9.40! How come you got delayed?”

“I went to bed an hour past midnight and got up late!”

“We had scheduled this meeting yesterday. You accepted and you went home on time yesterday. So, I was wondering this morning and worried why you did not reach by 9.00 to start this meeting.”

“True. But somehow I am used to starting my day a little late. Today I had to fix my flat tire, which I did not expect! I am sorry.”

I was listening to the conversation.  I was shocked. No doubt, Sailesh was not organized. He was focused on his tasks alone. He did not value the time of his coworkers.

The meeting continued for 10 more minutes and ended with a stern remark from Jim. He said, “Sailesh, you need to be available at work on time as per our corporate work hours. If you are going to be late by 30 minutes or an hour, it is okay as long as you are consistent and all of us in the team know your availability. It is about teamwork. We are not working in this team as individual contributors.”

Sailesh left the meeting room, and Jim talked to me about two options. The first option was to talk to Sailesh, coach him and help him understand his strengths and improvement areas. The second, if the first did not work, was to move him out of our project for further counseling or action.

Eventually, Sailesh resigned after a couple of months. It appeared to me that he wanted to remain an individual contributor and specialize in software architecture. I wasn’t sure of his professional success because of his lack of collaborative spirit!

Looking back, I wonder if Jim and I could have handled it differently. Were we reactive? Did we fail to pay attention early or bond with Sailesh early? If we come across a similar situation now, what will we do?

Have you come across a soft issue like this in your projects? If yes, what was your approach?

I believe an incident like this has to be analyzed in light of the coaching role of Scrum Masters. Next week, on Saturday, 20th October, I am delivering a session, ‘The Power of Inquiry: Coaching Tips for Scrum Masters’, at Agile Tour 2012 to present similar incidents and discuss my thoughts on how to turn them around with powerful questions. If you are in Chennai, don’t miss this event! More info: http://isec.co/event_detail_info.php?event_id=12.

To know more on this read the 4-part blog series 'The Power of Inquiry: Coaching Tips for You!'
Note: For confidentiality reasons, I have changed the names of all characters in this story.

Thursday, September 20, 2012

Agile Evolution and Academic Imperatives


This is the topic I had chosen for an hour-long session at the Agile Goa 2012 conference, conducted on 25th and 26th Aug 2012.

My objective was to set the context on the current state of engineering education in India, relate it to Agile Evolution and proceed with discussions on what needs to be done in order to reduce the industry-academia gap. In this session, I made an attempt to present the following recommendations to students and teachers.

1. Learn programming in C (do not teach it in the classroom; promote self-learning; solve programming exercises that involve significant code, 100+ lines each).

2. Learn data structures through programming experience.

3. Do OO programming in Java and/or C++. Experience C before learning Java or C++ or similar OO languages. Learn ‘Design Principles’, ‘Programming Principles’, etc.

4. Use .Net, J2EE and similar frameworks. Experience Client-Server Architecture instead of limiting yourself to standalone programs.

5. Learn web development. Use JavaScript, XML, SSL etc. Create a website with adequate complexity.

6. Retire subjects that are not relevant for industry track or research track. Merge or consolidate subjects that are relevant but are not standalone candidates. Retain fundamental subjects.

7. Learn about Software Testing and Software Quality Assurance. Get introduced to Performance Testing, Security Testing, etc.

8. Include DW/BI concepts in DBMS courses.

9. Include contemporary networking concepts (such as Wi-Fi and VoIP) in the Computer Networking course.

10. Study websites and systems such as Amazon, eBay, Google, and Railway/Airline Reservation Systems to analyze and understand requirements and designs.

11. Introduce case study based learning.

12. Carry out software maintenance projects by taking up projects developed by other groups.

13. Develop viable products, prototypes and applications that do not retire immediately after your coursework.

14. Use evolutionary software development methodologies such as agile and lean methods.

15. Conduct software exhibitions to showcase the applications or products developed by groups of students.

Obviously, when we enhance our learning methods, we will enable our undergraduates and graduates to fit the research track or the industry track very well. Reactive measures or short-term fixes won’t help much.

The slide deck used for this session is available at http://www.slideshare.net/RajaBavani/agile-evolution-and-academic-impreatives-22861423

Reference:

  1. Rethinking About Computer Science Curriculum - Long Overdue, H.N.Mahabala, IIIT, Bangalore



Thursday, August 16, 2012

Webinar on 'Force of Habit: Seven Essentials of 21st-Century IT Professionals'



Gone are those days when monolithic IT systems were developed and maintained by exclusive communities of IT professionals confined to technology-savvy regions of the world. The challenges of software engineering during the 21st century are quite different and multifold because of factors such as globalization and technology evolution. In order to face these challenges, 21st-century IT professionals will need to transform successful practices into habits so that these practices become second nature.

I am conducting a webinar on this topic on 22nd August 2012 at 3 pm IST. In this webinar, I am going to discuss the seven habits that are essential for today’s IT professionals. These seven habits empower IT professionals to prepare, act, collaborate, optimize, and influence effectively in their careers. The webinar will also help project managers understand the importance of leadership skills and building high-performance teams.

Registration Link: http://www.techgig.com/webinars/Project-Management-Leadership-Series-Session-14-Force-of-Habit-Seven-Essentials-for-21st-Century-IT-Professionals-203

White Paper (PDF) Download: http://www.mindtree.com/insights/thought-posts/articles/force-habit-seven-essentials-21st-century-it-professionals




Thursday, August 2, 2012

Requirements Engineering: Lessons from 5 Unusual Sources


Requirements Engineering continues to be one of the challenging aspects of software engineering. Delivering working software in short iterations requires intense communication and coordination among agile teams in order to refine requirements and identify dependencies and conflicts. In my experience with agile teams in both collocated and distributed environments, an interesting aspect I have observed is that successful agile teams can learn from unusual sources. Here are five such unusual sources.
  1. Restaurants & Waiters
  2. Airports and Flights
  3. Families and Children
  4. Schools and Teachers
  5. Ant Colonies
In my article titled ‘Distributed Agile: Steps to Improve Quality before Design’, I wrote that quality is a journey that starts from the early stages of projects. When we open our eyes and ears to the world around us and learn from unusual sources, we get an opportunity to apply such lessons and understand how simple things make big differences.
What can we learn from these 5 unusual sources? To know more, read this article 'Agile Requirements: Lessons from Five Unusual Sources' published in Agile Record.

Sunday, July 15, 2012

Distributed Agile Podcast


Tom Cagley (http://tcagley.wordpress.com/) interviewed me on distributed agile sometime during April 2012. Tom asked me very interesting questions and engaged me in a great conversation. The interview happened over the phone, across continents: he was in the US (East Coast) and I was in India (Pune).

Tom has composed this interview and made it available at http://spamcast.libsyn.com/webpage/s-pa-mcast-190-raja-bavani-distributed-agile.

Tom runs the 'Software Process and Measurement' cast (SPaMCast) regularly. The Software Process and Measurement Cast provides a forum to explore the varied world of software process improvement and measurement. The SPaMCast covers topics that deal with the challenges of how work is done in information technology organizations as they grow and evolve. In a nutshell, the cast provides advice for and from practitioners, methodologists, pundits and consultants!

I encourage you to visit his sites and listen to as many casts as you can. You will find impressive interviews by several industry leaders in Software Engineering.

Sunday, June 17, 2012

Changing Requirements: You Have a Role to Play!


Do requirements in your project change late in the game? Probably you asked questions late in the game, or someone in your team made early assumptions and did not validate them with business users. Or you resisted changes in some way or other. One way to resist change is to avoid collaboration with business users, assuming that collaboration can induce changes. The truth is that when you resist changes to requirements, the changes persist!

“In life, what you resist, persists”, says Werner Erhard. How true!
In Requirements Engineering there are several weak links!  Are you a developer, a tester, or a team member who receives requirements specifications or user stories from business analysts or a product owner? Do you expect business analysts or product owners to provide you with clear requirements, detailed enough for you to code, test and deliver software? Think practically! You have a role to play!

You have to understand what business analysts or product owners provide you. You have to ask questions as early as you can. You have to think in terms of test scenarios and test data. You have to validate your thoughts and assumptions whenever you are in doubt. You have to think about related user stories and conflicting requirements. Instead of doing all this, if you remain a passive consumer of the inputs received from business analysts or product owners, I am sure you are seeding issues. Late in the game, when you ask questions, you will receive answers. These answers will manifest as ‘last minute changes in requirements’. Do you want to think, collaborate and elicit requirements early? Or do you want to blame changing requirements?

Think! You have a role to play! You have to ask context-free questions as well as context-specific questions. You can’t be a passive consumer of requirements who wakes up late in the game plagued with changing requirements! When you become an active participant and exhibit a collaborative spirit, you will embrace change comfortably and ensure quality before design!

Scrum teams do backlog grooming. Is it working for you? What else do you do to elicit and refine requirements?

By the way, do you know how the tea bag was invented? It happened by accident. Yes, it was an accidental innovation! Can we let our projects mature by accident? Read this blog post for more information. I have linked my recent podcast on distributed agile to this blog post.

Monday, June 11, 2012

Can Software Projects Mature by Accident?



More than 100 years ago, tea importers in New York used to send samples of tea to their customers in small tin boxes. In those days, the cost of tin boxes was soaring. Thomas Sullivan, a tea importer in New York, wanted to avoid the use of tin boxes and try a cost-effective way of packing tea samples. In 1908, Mr. Sullivan started using hand-made silk bags. One of his customers dunked a silk bag of tea into hot water by accident and found that it brewed a cup of tea! It was lovely! This accident gave birth to the idea called the ‘tea bag’ or ‘dip tea’. Customers started liking tea bags more than tin boxes. This is how the tea bag was invented. It happened by accident!

There are several ideas and innovations around us. They were created by accident not by design!

Can software projects mature by accident? Or can we let software projects mature by accident? When software projects mature by accident, the end result is a negative impact on stakeholders. Accidents in software projects erode customer satisfaction and hence can impact the brand value of businesses. Can we make software projects mature by design?

Watts Humphrey wrote, “Quality products are not produced by accident. While most software professionals claim to value quality, they take no specific steps to manage it. In every other technical field, professionals have learned that quality management is essential to get consistently high quality products on competitive schedules. They have also learned that quality management is impossible without quality measures and quality data. As long as software people try to improve quality without measuring and managing quality, they will make little or no progress.”

Distributed agile has gained popularity in our industry. However, we cannot afford to mature distributed agile projects by accident. We need to focus on continuous improvement so that projects mature by design. For more information on distributed agile and maturity of distributed agile projects, listen to this podcast produced by Tom Cagley.

Distributed Agile Podcast: http://www.spamcast.libsyn.com/webpage/s-pa-mcast-190-raja-bavani-distributed-agile

Distributed Agile: The Maturity Curve, article published in Agile Record

Thursday, May 17, 2012

Achieving Benefits from Distributed Agile Teams



This blog post is about a report on ‘Distributed Agile’ by Elizabeth Harrin. This report, titled 'Distributed Agile Teams: Achieving the Benefits', is based on a survey conducted by Elizabeth during January 2012 among 340 distributed agile practitioners. It is an informative report, and it is free!

Sometime during Nov/Dec 2011, Elizabeth collaborated with me in collecting my views and opinions on distributed teams. She has shared some of my thoughts in several places in this report.

I think surveys like this need to happen at regular intervals, maybe every year. Also, the number of respondents per survey must increase. When thousands of practitioners participate in a survey, the results can benefit analysts and researchers. A good idea would be to conduct the next round of this survey in destinations that provide software services and analyze the responses along with the inputs provided by customers who consume software services. A survey like this can provide an analysis and consolidation of the views of both providers and consumers.

Elizabeth’s report is a great one to begin with. Here are the links to the report and a couple of reviews of it.

Main Report (38-Page PDF) -  http://www.projectsatwork.com/whitePapers/Agile-Distributed-Teams-Achieving-The-Benefits-.html

Reviews:
http://www.projectwizards.net/en/macpm/project-management/distributed-agile-teams-white-paper-review
http://www.gantthead.com/blog/Agility-and-Project-Leadership/5264/

Blog Posts:
http://www.pm4girls.elizabeth-harrin.com/2012/04/agile-and-distributed-teams-research-results/
http://talkingwork.com/2012/04/28/agile-and-distributed-teams-research-results/

Friday, May 4, 2012

The Benefits of Pair Debugging


In my blog post ‘Debugging – Critical Questions’ I wrote, “Great developers need not be successful debuggers. Sometimes we find software test engineers becoming very successful in debugging as compared to developers.  More interestingly, when a developer spends several hours on debugging an issue he or she naturally collaborates or joins hands with the corresponding software test engineer to get it straight.”

Pair debugging is a very powerful technique. There are several benefits of pair debugging. Let me share five of them in this blog post.

1) Efficient Defect Reproduction: When a developer and a software test engineer collaborate, the time spent on reproducing a defect is far less than with the usual back and forth between these two individuals.

2) The Power of Diverse Mindsets: All said and done, the mindsets of these two individuals are different. When they come together, they continue to think in their own ways. This is the power we get in pair debugging. They get to think in multiple dimensions: about different environments, across multiple layers, multiple modules and similar components, and about related modules and regression effects. In other words, they get to consider the many ways in which a similar defect could be present elsewhere in the system. This thinking is required to initiate a holistic approach to debugging. When two individuals think differently, there is an opportunity to stay away from ‘quick fixes’ and commit to thorough fixes.

3) Collaborative Learning: Pair debugging is an opportunity to learn. Software test engineers get to learn about the architecture, design and coding elements of the system. Developers get to learn about the nuances of debugging and testing.

4) Preventive Maintenance: Pair debugging provides an opportunity to identify some of the weak modules or components that need preventive maintenance. When team members practice pair debugging over several months, they gather enough information to suggest ‘top-n preventive maintenance tasks’ that can reduce long-term maintenance overheads.

5) Effective Defect Verification: Pair debugging results in effective defect verification because software test engineers learn more when they work along with developers. This provides them an opportunity to think beyond the obvious and create test scenarios with the right test data set.
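
One tangible outcome of a pair-debugging session is a regression test that captures the reproduction. Here is a minimal sketch in Python; the defect and the parse_amount function are hypothetical, purely to illustrate the idea:

    # test_amount_parsing.py -- written jointly during a pair-debugging session:
    # the test engineer supplied the failing input seen in the field, and the
    # developer pinned down and fixed the parsing logic.
    # parse_amount() is a made-up function used only for illustration.

    def parse_amount(text):
        """Parse a money amount such as '1,234.50' into a float."""
        return float(text.replace(",", "").strip())

    def test_parse_amount_with_thousands_separator():
        # The original defect: '1,234.50' was rejected before the fix.
        assert parse_amount("1,234.50") == 1234.50

    def test_parse_amount_with_surrounding_spaces():
        # A related scenario the pair added while thinking across modules.
        assert parse_amount("  250.00 ") == 250.00

Once such a test is in the suite, the defect stays fixed and the reproduction knowledge is no longer locked in two people's heads.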

Have you tried pair debugging? Have you seen additional benefits?

Saturday, April 28, 2012

Pair Programming and Mentoring: Which Comes First?


Is there a difference between Pair Programming and Mentoring? Yes. Pair Programming is one of the XP practices. http://www.extremeprogramming.org/rules/pair.html says “One thing pair programming is not is mentoring. A teacher-student relationship feels very different from two people working together as equals even if one has significantly more experience”.

In project organizations, mentoring is crucial. Think about inducting graduate trainees or fresh graduates into software projects. In this situation, in addition to an induction program, identifying mentors and investing in mentoring programs provides a lot of value. Also think about experienced new joiners; they may need some mentoring too. When mentoring sessions include hands-on coding and discussions on ‘code quality’ and ‘design quality’, team members get an opportunity to learn to write good quality code.

Pair Programming is not practiced in all agile projects. It is a common practice in XP teams, but in many projects it remains on the wish list. Pairing requires rapport, trust and an excellent working relationship. One way to start is to pair for an hour to a couple of hours per day. This can provide positive results.

Mentoring takes a different approach. Identifying the need for mentoring on a case-by-case basis has to be one of the top considerations in software projects. Mentoring (when required) cannot be ignored or forgotten. However, mentoring alone is not sufficient to produce good quality code. Practices such as pair programming help a lot.

Above all, self-review of code cannot be ignored. Programmers can start self-reviews in simple ways. All they have to do is ensure that they eliminate a fixed list of programming errors (say 20 to 25 items) through self-reviews. This, in my opinion, is the first step in the journey to improve code quality. With these steps, practices such as pair programming or peer reviews will yield better results.
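
To make this tangible, here is a minimal sketch in Python of a self-review helper that flags a few common slips before code goes out for peer review. The three checks are made-up examples; every team would maintain its own list of 20 to 25 items:

    import re
    import sys

    # An illustrative subset of a personal self-review checklist.
    CHECKS = [
        (r"pdb\.set_trace\(", "leftover debugger breakpoint"),
        (r"\bexcept\s*:", "bare 'except:' that hides errors"),
        (r"\bTODO\b", "unresolved TODO comment"),
    ]

    def self_review(path):
        """Print checklist violations found in one source file."""
        with open(path, encoding="utf-8") as handle:
            for lineno, line in enumerate(handle, start=1):
                for pattern, message in CHECKS:
                    if re.search(pattern, line):
                        print(f"{path}:{lineno}: {message}")

    if __name__ == "__main__":
        for filename in sys.argv[1:]:
            self_review(filename)

Even a tiny script like this turns the checklist into a habit rather than a document.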

Mentoring is a great opportunity to instill the importance of self-reviews and self-review checklists in teams. Also, it is a great mechanism for bonding new team members into teams.

Understanding the importance of mentoring and investing in mentoring programs comes first. When you do this, you will know how (and when) to introduce appropriate engineering practices in software project teams.


Tuesday, April 17, 2012

Requirements Specification: The Weak Links


Most of us believe that the Software Requirements Specification documents handed over to project teams enable them to design, code, test and deliver the end product. We trust, in every project we get into, that the business analysts and user representatives collaborated well in order to create such documents. We must be positive and think optimistically. We must trust. We must believe. However, our positive thinking, trust and belief can take us only so far unless we do our homework right.

Yesterday, I was reading a paper titled ‘What’s Wrong with Requirements Specification? An Analysis of the Fundamental Failings of Conventional Thinking about Software Requirements, and Some Suggestions for Getting it Right’ written by Tom Gilb. Tom is the author of nine books and hundreds of papers on topics related to Software Engineering. He has been a keynote speaker at dozens of international conferences and has delivered guest lectures at universities all over the world.

In this paper, Tom has outlined ten key principles for creating successful requirements specification documents. He has provided very good examples to illustrate these principles. I found it very interesting as it helped me realize several weak links in the way we gather and specify requirements. Let me share a set of takeaways from Tom’s paper in this blog post. I encourage readers to download this paper (It is free! No registration required) and read it.

1) Check if the requirements specification helps you understand the top-level project objectives. Make sure that these objectives are not verbose and qualitative. They need to be detailed enough, yet fit into a single page. Tom says that a good first draft of the top ten critical objectives can be made in a day’s work, assuming that the key management personnel are available for discussions.

The top-level project objectives remain weak links if you are not able to translate them into success criteria that help you measure the success of your project.

2) Understand the value delivered to stakeholders by means of satisfying the specified requirements. More than the defined functionality or user stories, the value delivered is what counts. Traditionally, we are not used to considering value when we think about requirements specifications. The ability to prioritize requirements based on value delivery is critical. Product Owner (Scrum), Business Analysts, Marketing Managers, and Product Managers need to focus on this function. When you implement a requirement (or satisfy a requirement) validate if you are delivering value to stakeholders.

3) Quantify requirements. Qualitative descriptions or textual representations of requirements do not help. Quantification enables measurement, and measurement is necessary if you want to improve. (A small illustration follows this list of principles.)

4) Specify the value you want to deliver to stakeholders clearly. Do not mix this up with how you want to achieve the value (the 'design'). When you are able to specify your needs clearly, the solutions to your requirements will naturally follow. Do not let the solutions change your real needs.

5) Instead of focusing on each functionality, focus on system quality. Great products go to market because of this focus.

6) Make the specifications rich with several elements such as historical data, test data, scenarios, etc.

7) Inspections can help you improve the quality of requirements. So, consider Specification Quality Control (SQC).
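
Returning to principle 3, here is a made-up illustration (loosely in the spirit of Gilb's Planguage, not an excerpt from his paper) of how a vague statement such as 'the system must be easy to learn' can be quantified:

    Learnability:
      Scale: Minutes for a newly hired back-office user to complete a first order entry without assistance.
      Meter: Timed sessions with five new users during acceptance testing.
      Past [current release]: 45 minutes.
      Goal [next release]: 15 minutes.

A requirement written this way can be measured, prioritized by value and tested; 'easy to learn' cannot.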

Finally, requirements do change and evolve. If you spend a lot of effort hardening all requirements in advance, it is probably a waste of energy. Embrace change!

When you read Tom’s paper, you will realize the weak links in the current state of creating Requirements Specifications. Also, you will get an opportunity to learn from his experience and thoughts!

Monday, April 9, 2012

Who Cares About Software Quality?



This blog post is about my observations and interpretation of the keynote address ‘The Future of Quality’ delivered by Goranka Bjedov at Belgium Testing Days 2012. A moment of discontent or restlessness in her professional life, related to the state of the industry and engineering practices during 2008, triggered several events that culminated in a realization in 2009: ‘Quality is Dead’. This is because there were (and are) software products and applications out there in the market, and most of them are buggy. Customers and end-users accept such products. Buggy products become a source of revenue as they fetch maintenance and upgrade fees. According to her, practitioners went wrong with the thought that testing is all about checking by means of executing tests and finding defects. She emphasized that testing is more than this and that there are several dimensions to testing.

Her question “What happens when the value of quality is less than the price of quality?” triggered additional questions. Is functional testing adequate? Do we do too much functional testing (and hence spend a lot)? Is that cost justified? Do end-users pay more and get less? She emphasized that it is imperative for testing teams to ensure that nothing escapes, to provide information about the current state of quality, and to stop the release of bad products. This is possible when testers understand product quality. Testers have to carry the right purpose and approach to make this happen.

We have seen software malfunction and failures in many industries. Businesses have paid huge sums of money to settle related charges and court verdicts. The keynote included several examples.

With these facts in front of us, she introduced ‘Productivity Testing’. According to her, 'Productivity Testing' is all testing that is focused on faster development. Its main purpose is preventing developers from checking in bad code, and allowing changes to the code with a certain "peace of mind".

Unit tests, micro-benchmarks, "sanity" tests, etc. are examples of Productivity Testing. According to her, these tests are small, fast, cheap to write and maintain, analysis-free (when they fail, it is clear where and why), perfect for automation, fantastic for gaming the system, loved by managers (code coverage, metrics), and "technical" (are we using mocks, fakes, doubles, or something else?).
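
To make the idea concrete, here is a minimal sketch of such a productivity test in Python (my own illustration, not from the keynote; apply_discount is a made-up function):

    # A tiny "productivity test": small, fast, cheap to maintain, and
    # analysis-free -- when it fails, the test name and the assertion say
    # exactly where and why.
    import pytest

    def apply_discount(price, percent):
        """Return the price after applying a percentage discount."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100.0), 2)

    def test_apply_discount_happy_path():
        assert apply_discount(200.0, 10) == 180.0

    def test_apply_discount_rejects_invalid_percent():
        with pytest.raises(ValueError):
            apply_discount(200.0, 150)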

In order to stress agility, she shared this quote from Brian Marick:

"The lore of testing is full of people who spent weeks improving test automation libraries without ever, you know, quite getting around to automating any tests. The trick is to make improvements in small steps while simultaneously continuing to frequently deliver the business value that makes the project worth funding. There's a real skill to moving gradually and continuously and simultaneously toward several larger goals."

This keynote revealed or reemphasized points such as

1) There is no point in citing the number of test cases executed, the coverage, the number of test cases automated, or the types of tools and frameworks used, unless we demonstrate that we do the right things to add value to stakeholders at regular intervals.

2) Customers do not want everything right in a product to begin with (unless the product is going to impact their lives or critical money). They want products faster. They want the base quality right. Everything else does not matter. They want products free (and are ready to pay for services and upgrades).

3) When you have the base functionalities in place, it is ok if you deal with bugs as and when you find them. That is how many products are sailing through their life cycle in our industry.

4) We do get into denial mode and hide behind the claim that we are in this state because of the high complexity of our products.

I liked the way she concluded. She hinted that the current state of quality is the consequence of our thought process, practices and attitude. In order to make the future of quality brighter, we have to think about what we can do better. We can reduce development efforts (and costs) by writing productivity tests. We can build in quality by adding smart system tests in the right places. We must think about performance, scalability, usability, and optimization (virtualization, cloud). Also, we must understand the economics of software and demonstrate the value of our work (in dollar amounts) to stakeholders.

Publishing volumes of test standards and guidelines does not ensure positive results unless practitioners understand the basics and apply context-specific practices.

I agree. Our industry is evolving. We have seen a small number of top-notch products and services in this evolution. Our industry will continue to evolve, and we will continue to see top-notch products and services. Either directly or indirectly, customers define what is acceptable and what is not. Keynotes and discussions like these make us stop, think and revalidate our beliefs. Testers (and everyone in project teams, for that matter) can enable the delivery of high-quality products. Customers can demand quality. When both parties accept the unacceptable, we will continue to see poor quality products, and bugs that result in litigation and huge penalties.

When I shared this blog with Goranka, she added the following points.

1. All paid work should be about creating value - either in the short or in the long term.

2. Created value can be demonstrated in multiple different ways - but being able to attach monetary amounts to the value of work performed is the hardest one to argue with or dismiss.

3. Quality does not have a linear relationship with value. Once good enough is reached, the rest is obtained with effort that could have been better spent elsewhere.

4. ‘What is good enough’ is determined by the customer and not by internal test or QA department.

Additional Links:

     Goranka Bjedov‘s Keynote Abstract and Bio

     Brian Marick’s ‘Two Forgotten Agile Values: Ease and Joy’

     Goranka’s Keynote at STANZ 2011 (Video)

Tuesday, April 3, 2012

Belgium Testing Days 2012

CPGT served to speakers at BTD 2012!

Last month (March 12-14) I was in Brussels for a speaking session at Belgium Testing Days (BTD) 2012! BTD is a confluence of international speakers and audience, and BTD 2012 was no exception. We had not only a great speaker lineup but also a striking list of topics. Overall it was a well-knit program! Thanks to the organizers!

Let me share some of the takeaways from BTD 2012 in this blog post based on the sessions I attended.

1) The testing community can no longer afford to do more and more manual testing or functional test automation. Focus on specialized testing such as performance testing and security testing is critical. Context-driven testing is the need of the hour.

2) Decisions on release dates, product quality and release criteria are business decisions. Testers do their job by providing the right inputs to the business; testers are not the decision makers.

3) Partnership with the development team is critical to success. Testing teams cannot afford to function in silos.

4) Consider multi-dimensional metrics. Standalone metrics don’t make sense for decision making.

5) Introducing a ‘Test Assurance’ role in the organization is valuable. This role requires testers to transform by acquiring additional skills.

6) Quality does not come from testing. Quality needs to be built into all life cycle activities. When we fail to do this, we start depending on ‘testing’ to provide us quality – this is a wrong notion.

7) Test engineers must understand regulatory compliance requirements and do adequate compliance testing and assurance. This can avoid business risks.

8) Software testing professionals need to plan their career. It is a self-initiated activity. No one external to you can tell you what you should do next or move you to the next level.

9) Testers can leverage open source tools to build great test frameworks and solutions.

10) Quality is critical. End users expect a certain level of quality in products. We cannot afford to ignore this fact.


The venue, refreshments and food at the conference were sumptuous. The dinner hosted by the conference organizers for all speakers on 12th March was fabulous! The dessert, CPGT, was impressive and delicious. Wondering what CPGT is? It is Chocolate Parfait and Ginger Tiramisu. Yes. That is in the first picture!

Do you want to read more blogs on BTD 2012? Here you go!

1) Johanna Rothman's Blog
2) Lisa Crispin's Blog
3) Karen Johnson's Blog
4) Sigge Birgisson's Blog


If you post a blog on BTD 2012 or come across any new blogs on BTD 2012, please let me know. I will add the link here.

Wednesday, March 21, 2012

Data Discipline: Why Should We Care?



Data Discipline is a subject that matters not only to database administrators, product support engineers and MIS personnel but also to developers and testers. Software engineers often neglect the importance of using good quality data and understanding the critical role of Data Discipline in their profession. Here are five reasons that emphasize the importance of Data Discipline.


1) Data Discipline is the way to optimize the effort spent in gathering and creating good quality test data. Remember the notorious way of testing applications with ‘aaaa’ or ‘1234’ or ‘Mr.XYZ’ as field values? Let us avoid that! The next step is to automate test bed setup.

2) Data Discipline will help you think and be organized so that you direct all test emails to a bunch of ‘test-only’ mailboxes instead of flooding the inboxes of real users!

3) Data Discipline will ensure that you mask sensitive fields in order to maintain privacy and security. Using production data without masking fields such as phone numbers and other sensitive data is not the right approach. We all know that! (A small masking sketch follows this list.)

4) Data Discipline will make sure that you review all configuration files and use the correct parameter values. Otherwise, you may contaminate data used by other systems or applications.

5) Data Discipline will make sure that you avoid data loss at any cost. In order to accomplish this, you will put adequate measures in place to ensure that you do not run test bed clean-up scripts on a production database.
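
As an illustration of point 3, here is a minimal data-masking sketch in Python. It is only a sketch of the idea, not a production-grade anonymizer, and the field names are made up:

    import hashlib
    import re

    def mask_record(record):
        """Return a copy of a record with sensitive fields masked.
        The field names ('name', 'phone', 'email') are hypothetical."""
        masked = dict(record)
        if "phone" in masked:
            # Keep only the last two digits so testers can still tell rows apart.
            masked["phone"] = re.sub(r"\d(?=\d{2})", "x", masked["phone"])
        if "email" in masked:
            # Replace the real address with a deterministic, test-only address.
            digest = hashlib.sha256(masked["email"].encode()).hexdigest()[:8]
            masked["email"] = f"user_{digest}@test.example.com"
        if "name" in masked:
            masked["name"] = "Test User"
        return masked

    print(mask_record({"name": "Asha Rao", "phone": "9876543210",
                       "email": "asha@example.com"}))

Routing masked email addresses to a test-only domain also takes care of point 2: no real inbox ever receives a test email.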

Everyone who deals with data can benefit from understanding and following Data Discipline. Let us not think that this is an area of concern for DBAs or Production Support Teams only!



Monday, February 6, 2012

Cloud Computing: Will Standards Ensure Compliance?

In 1979, Relational Software, Inc. (which later became Oracle Corporation) introduced the first commercially available implementation of SQL. Several other vendors, such as Ingres (based on the Ingres project led by Michael Stonebraker and Eugene Wong at the University of California, Berkeley during the late 70s) and Sybase, also introduced database products with SQL. The American National Standards Institute (ANSI) announced the SQL standard (also known as ANSI-SQL) in 1986. The International Organization for Standardization (ISO) announced another standard (known as ISO-SQL) in 1987.

However, database vendors could not follow any of these standards completely, for various reasons. A prominent reason was that the standards were effectively ‘guidelines’, lacking clarity and leaving room for multiple interpretations. Consequently, the portability of applications and data migration from one database product to another remained a challenge; even something as basic as limiting the number of rows returned by a query still varies across dialects (LIMIT, TOP, ROWNUM). After several years of product evolution, some of the database vendors announced compliance with SQL standards. However, the portability and migration challenges have not been fully addressed yet.

The problem with cloud-computing standardization, as elaborated by Sixto Ortiz Jr. in his article published in IEEE Computer (July 2011), is significant. The lack of standards means limited adoption, because it limits our ability to compare and evaluate offerings and creates a potential danger of vendor lock-in.

The good news is that last month (Jan 2012) The Open Group published the first technical standard for the cloud. It is called SOCCI (Service Oriented Cloud Computing Infrastructure framework) and it outlines the concepts and architectural building blocks necessary to support SOA and cloud initiatives.

We are going to see new standards related to Cloud Computing over the next five years. Will standards ensure compliance? We need to wait and see!

Useful Links:
  1. NIST (National Institute of Standards and Technology) Definition of Cloud Computing: http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf
  2. NIST Cloud Home Page: http://www.nist.gov/itl/cloud/
  3. Critique on NIST Definition: http://blogs.technet.com/b/yungchou/archive/2011/12/19/an-inconvenient-truth-of-the-nist-definition-of-cloud-computing-sp-800-145.aspx
  4. The First Cloud Computing Technical Standard:  http://blog.opengroup.org/2012/01/24/1st-technical-standard-for-cloud-computing-socci/
  5. HP Blog on First Standard: http://h30499.www3.hp.com/t5/Grounded-in-the-Cloud/Cloud-Computing-gets-its-first-technical-standard-SOCCI-from-The/ba-p/5502543
  6. SOCCI (Service Oriented Cloud Computing Infrastructure framework) Announcement: http://www3.opengroup.org/news/press/open-group-publishes-new-standards-soa-and-cloud
  7. Article on SOCCI: http://cloudcomputing-365.info/news_full.php?id=20929&title=The-Open-Group-publishes-new-standards-for-SOA-and-cloud
  8. The Problem with Cloud-Computing Standardization by Sixto Ortiz Jr.: http://www.infoq.com/articles/problem-with-cloud-computing-standardization
  9. The Problem with Cloud-Computing Standards, Paul Strassman’s Blog: http://pstrassmann.blogspot.in/2011/08/problem-with-cloud-computing-standards.html
  10. Cloud-Computing’s Vendor Lock-In Problem: Why the Industry is Taking a Step Backward? http://www.forbes.com/sites/joemckendrick/2011/11/20/cloud-computings-vendor-lock-in-problem-why-the-industry-is-taking-a-step-backwards/
  11. Perspectives on Cloud-Computing and Standards, Peter Mell, Tim Grance, NIST, Information Technology Laboratory: http://www.omg.org/news/meetings/tc/dc/special-events/Cloud_Computing/NIST.pdf

Friday, January 27, 2012

Agile Teams: Try Drumming for Team Building!

My previous blog mentioned a little bit about the “5th Indira International Innovation Summit”  presented by Indira College of Engineering and Management.  One of the sessions in this event was “Building Teams One Beat at a Time” by Ms. Aliya Hasal, CEO, Drum Café International.

15 minutes before the start of this session, event volunteers swiftly distributed Djambe drums to almost all 400 attendees in the auditorium. When the anchor Dr. Raju Bhatia invited the presenters on stage, the audience started beating the Djambe drums whimsically. I could feel a sudden desire or change of mind among all to keep beating the drums until the presenters acknowledged the enthusiasm and calmed down everyone!

We were in complete silence. Dr. Vinod Hasal (Director, Drum Café, India) started leading the performance on stage and miraculously gave us very simple instructions to follow the beats! Within minutes, the entire auditorium was in rhythm. We could feel unity, team-building, relaxation, creativity, learning and fun! He was a great coach! He engaged the team of 400 enthusiastic, novice drummers very well! The result was rhythmic drum beats! The patterns changed every three minutes and the audience followed without missing the beats! A vast majority of us had never touched a Djambe drum before! However, we could join the team on stage and follow their rhythm. As a result, this session rejuvenated us and made us believe in the power of co-creation.


I strongly believe that drumming sessions like this one can benefit organizations and communities. These sessions can help us build Agile teams. No doubt about it! My experience in this session reemphasized the fact that we must trust our team, give them the necessary infrastructure and tools, and provide them an open environment. This is because the quality of our workday counts! When we do this our teams become creative and innovative.


Above all, we must identify an agile coach. This is the foundation of agile teams. With a good coach, team members will align and start performing! When they perform together over multiple iterations there will be continuous improvement and high quality.

Have you been through a ‘drumming’ session for team building? What do you think?

Tuesday, January 24, 2012

Application of TRIZ in Software Engineering


Last year at the SPIN Chennai Annual Conference (SPICON 2011), I delivered a talk on “Inventive Problem Solving for Customer Value Creation”. My talk was based on a real-life problem-solving experience. When you go through the presentation, you will get a broad understanding of this talk.

This year, I got an invitation to speak at the “5th Indira International Innovation Summit” presented by Indira College of Engineering and Management. During this event, the organizers honored me with the “Excellence in Innovation Award”.

It was a house-full event with more than 400 participants. The planning and execution of this event was very impressive. The selection of topics and speakers on both days offered a good balance between serious discussions and fun-filled learning.

In Software Engineering and Software Services Industry there are numerous opportunities to innovate. What we have accomplished is just the beginning. We have a long way to go! Do you have an experience to share? Feel free to write to me.

Wednesday, January 18, 2012

Do You Follow Defect Prevention Techniques?

The term 'Defect Prevention' (DP) relates to defect analysis and preventive action planning for defects found in various streams of project activities. Typically, defects are review defects or testing defects (there are other types as well; for example, configuration defects or other defects that are not captured during reviews or testing).

What can be our approach to DP? It can be something that includes a systematic way of collecting and analyzing defects and planning preventive actions. This approach also includes the frequency of performing DP and a clear definition of ownership so that DP happens as planned. The objective of DP is to make sure that a significant number of those defects do not recur in the future.

Creating and executing test cases, test automation, and code quality checks are meant for ‘Defect Detection’, not for ‘Defect Prevention’.

So, how do we do DP? Well, we can do DP by gathering data on all defects, categorizing them, analyzing them and deriving conclusions. Based on these conclusions, we decide on preventive actions and implement them.

If we do not derive and implement preventive actions, all we accomplish is pure 'Defect Analysis' (also known as Bug Analysis). DP requires you to go the extra mile (identifying and implementing a preventive action plan) after you complete bug analysis.

The 80/20 principle is adopted as one of the DP techniques. When you analyze test defects, you perform ‘Pareto Analysis’ to understand the distribution of defects across categories. Based on your findings, you come up with action items that help you prevent 80% of the defects. There are other techniques too.
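
Here is a minimal Pareto analysis sketch in Python; the defect categories and counts are made up for illustration:

    from collections import Counter

    # Hypothetical defect log: one category label per defect found during testing.
    defects = (["boundary-condition"] * 18 + ["null-handling"] * 12 +
               ["configuration"] * 9 + ["ui-validation"] * 4 +
               ["concurrency"] * 3 + ["documentation"] * 2)

    counts = Counter(defects)
    total = sum(counts.values())

    # Walk categories from most to least frequent until ~80% of defects are covered.
    cumulative = 0
    for category, count in counts.most_common():
        cumulative += count
        percent = 100.0 * cumulative / total
        print(f"{category:<20}{count:>6}{percent:>10.1f}%")
        if percent >= 80:
            break  # the 'vital few' categories to target with preventive actions

The categories printed before the loop stops are the ones where preventive actions (checklists, design reviews, automated checks) will pay off most.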

Agile teams go through iteration-end retrospectives. How can DP techniques help them perform better? What has been your experience? Do you follow DP techniques? Let us discuss.