Sunday, March 31, 2019

Developing sustainable Page Object Model UI automation

Today, I'd like to talk about the Page Object Model. It's a very well-known design for E2E UI automation frameworks, and you can find lots of articles about it on the web. What I'm trying to discuss today is more about its API and how it evolves.

The Page Object Model (POM) by itself has pretty good attributes as a framework design. I've been using it for my E2E UI automation for web and mobile applications. I've found some useful things to consider when you start UI automation from scratch using POM and want to maximize its benefits.

1. Maintain a loosely coupled structure
I think it is critical to keep Page Objects loosely coupled from test code. Creation of a Page Object should be handled elsewhere; the only coupling test code should have is calling its functions. This loosely coupled structure makes it easy for Page Object classes to adopt a new device type or browser type. The Strategy pattern with a factory method is very common for this. This structure ensures a new device type or browser type does not impact test case code. It also helps handle UI differences across locales and markets/countries by subclassing Page Objects. I think this is one of the key ways test automation adapts to business requirements.
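To make the idea concrete, here is a minimal Python sketch of that structure. All the names (LoginPage, AndroidLoginPage, create_page) are hypothetical, and the actual driver/locator details are left out:

```python
# Hypothetical sketch (names are made up): a factory keeps Page Object
# creation out of test code, so tests only call page methods.

class LoginPage:
    """Default web implementation."""
    def login(self, user, password):
        return f"web login as {user}"

class AndroidLoginPage(LoginPage):
    """Overrides only what differs on Android."""
    def login(self, user, password):
        return f"android login as {user}"

# Strategy selection table: (page name, platform) -> concrete class.
PAGE_REGISTRY = {
    ("login", "web"): LoginPage,
    ("login", "android"): AndroidLoginPage,
}

def create_page(name, platform):
    """Factory method: the test asks for a page by name; the platform
    decides which concrete class it gets."""
    return PAGE_REGISTRY[(name, platform)]()

# Test code stays the same for every platform:
page = create_page("login", "android")
page.login("jae-jin", "secret")
```

The test never calls a constructor or checks a platform flag; supporting a new device type means registering one more subclass, not editing test cases.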

2. Do not remove Mix and Match capability
As the automation progresses, it's very common to refactor repeating code. But this refactoring should not block the flexibility of POM, which is mix and match. I see this as a very common issue in using POM: "Well, it's repeated in many places, so we should do it in one place." In many cases this actually makes the test code harder to maintain. I recommend suite-level refactoring rather than test-base-class refactoring (refactor within a similar group of test cases). Similar test cases can definitely have duplicate steps, and that refactoring should happen within the suite, since similar steps may be required there. If we put the refactored code in a test base class for any test to consume, it normally gets more and more complex as new cases and logic are added, and it becomes a maintenance burden. The beauty of POM is mix and match. Refactoring comes after mix and match is done and repeating code/steps are identified. A good indicator of this issue is the speed of writing a new test case. That speed should remain pretty much the same when Page Objects are well defined. Heavily refactored code makes test-case writing slower and vulnerable to unintended side effects.
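Here is a tiny hypothetical sketch of what I mean by mix and match (page names and steps are invented for illustration): each test freely composes the page objects it needs, in the order it needs, instead of inheriting a shared flow from a test base class.

```python
# Hypothetical sketch of mix and match: each test composes the page
# objects it needs; shared steps, if any, get refactored within this
# suite only, not pushed into a test base class.

class SearchPage:
    def search(self, term):
        return [f"result for {term}"]

class CartPage:
    def add(self, item):
        return f"added {item}"

class CheckoutPage:
    def pay(self):
        return "order placed"

def test_buy_first_search_result():
    # This test mixes three page objects in the order it needs.
    results = SearchPage().search("keyboard")
    CartPage().add(results[0])
    assert CheckoutPage().pay() == "order placed"

def test_add_two_items():
    # A sibling test reuses the same pages in a different combination.
    cart = CartPage()
    assert cart.add("mouse") == "added mouse"
    assert cart.add("mat") == "added mat"
```

If both tests later share identical setup steps, a helper inside this suite keeps the flexibility; a base class shared by every suite would not.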

3. Use IKEA furniture building strategy for Page Object APIs
I got this furniture building strategy from this video: https://www.youtube.com/watch?v=llGgO74uXMI&t=3369s. I think it has a very interesting benefit, especially in the early phase of developing UI automation using POM. In that early phase, Page Object functions (methods) are implemented based on the test cases selected at that time. Sometimes it's hard to tell whether a function is specific to those test cases or generic. And early on, you can see some commonalities and duplication among Page Objects. We tend to want to refactor whenever we see duplication in the code; it just feels uncomfortable not to refactor when we know duplicate code exists. I would say take a deeeeeeeep breath and let it go for a while. I know this can be hard. But try to get enough test cases covered and the Page Object API implemented from those cases. When you think most of the important and most-used APIs (functions) for each Page Object are implemented, now go back to the Page Object class and try to refactor. You'll be surprised how clearly the refactoring falls out, with very clear logic. It's hard to say how long you should wait; it's your call. But refactoring a Page Object class API after a while really makes it straightforward and effective.

The furniture building strategy is a metaphor the speaker uses in the YouTube link I added. Basically, you do not want to tighten one side all the way when you build (maybe more correctly, ASSEMBLE) the furniture. You leave it wiggly for a while and tighten both ends little by little. And when it is pretty stable and aligned, you tighten both ends all the way.

I like sharing three points per blog.  Have a nice day!   

Monday, January 1, 2018

No more Development testing or QA testing. Just testing

Well.. it is 2018. It's been a while since I wrote something here. I'm still working as an SDET. There has been lots of progress in software development and software testing. Let me go over the changes I've noticed over the several years since my last post (around 2014).

Even though different companies adopt different development processes and strategies, I still believe testing itself is much needed in the development process (code, test, and deliver). I still see the value of unit tests, integration tests, E2E tests, load/stress tests, etc. Agile development is pretty much the norm, and on top of that many teams want to go Continuous Integration/Deployment in their development strategy. And there are many success stories about not having a QA organization as part of the software development process.

I think the important question to ask is NOT who should do which kind of testing, but rather which testing mechanism is placed at which part of the development process. This seems obvious, but I have not seen many cases where the development team leader goes through the entire development process with the team and identifies what kinds of tests are required where.

Here is what I think the team should do.

First, the team (engineering leadership) should decide what kind of development process or strategy they are going to use, including tests. This covers everything from the code being developed to the code being deployed to production. All the necessary testing should be identified and decided.

Second, the team needs to confirm that all the test infrastructure is available for them to write and execute tests. Check if there is a way for developers to test all their new features and bug fixes confidently. Check if proper regression suites are defined and executed properly.

Third, the team should follow the process and do all the things defined for the development effort and the testing effort. And at the end of the cycle (whichever cadence the team decides), evaluate and make improvements if necessary.

Now, who gets to do what is up to the team. If a QA team exists, you can allocate the work that the QA team can do. But the main point is that the development process has to come first, and then the dev team and QA team can take on the necessary work to complete it.

Testing is important. All the necessary testing needs to be done before going into production. It is just testing, not dev team testing or QA team testing.

What about finding bugs? What about regressions caused by bug fixes or new features? I'll cover that in the next blog. But the development process and the testing associated with it need to be defined and applied first.


Saturday, May 10, 2014

Thoughts around test automation framework

Today, I'd like to share my thoughts around test automation.

Test automation is something that I'm passionate about. I like to keep learning and making myself better at designing, implementing, and maintaining test automation. Throughout my career, I've seen good, successful test automation, and yes, I've seen some bad ones as well. And yes, I've made lots of mistakes and bad decisions. I also have some success stories. Here are some things I consider worth noting.

The goal of test automation is the value added to the project team, not test automation itself. At the beginning of my career as an SDET, I was very passionate about writing a good automation framework. I read many books about coding. And I watched lots of YouTube videos about coding practices and design. I was just crazy about being good at writing code. I was obsessed with design patterns. I felt like I knew what good test automation should be. I challenged senior SDETs on the existing automation framework and loved having serious design discussions with test architects at the company.

As I gained more experience, I started to realize that it's not just about writing well-designed, maintainable, scalable, and beautiful code. It's actually about understanding the role of test automation in a project or a company and providing maximum value out of it. I started to consider various things when I design a test automation framework, such as the project timeline, short-term and long-term solutions, coding skills of other SDETs, area of focus, what the dev team needs, context of the application or system, testability, lab test infrastructure, etc. Sometimes I had to come up with automation from scratch in a couple of weeks, with coverage of priority 1 test cases. Sometimes I had to modify an existing test automation framework to make it easy for inexperienced SDETs or even manual testers to use. That work was indeed the right choice for those given situations. It's possible to write an easy-to-use automation framework. It's possible to build test automation starting as a short-term solution and transform it into a long-term solution without major design changes. The true masters of test automation understand how to write well-designed, maintainable, and scalable code. But that's just a foundation. Their adaptability and execution can bring maximum value to the company in any given situation.

Now I'm getting some feedback on my test automation framework from other SDETs. "Jae-Jin, I believe we should never use hard-coded values in our automation," or "Jae-Jin, why don't you refactor that repeating code?" "We should use XML for all inputs." Well, I can imagine how those Sr. SDETs felt when I challenged them. My response is, "Well, let's discuss that..." Fun.. Fun.. Fun

Don’t forget the “framework” part of test automation framework. So what is a framework? To me, a framework is an agreement: an agreement on a certain development style or convention the team will use to implement the software. Of course, this agreement is mostly introduced by architects or more experienced engineers. Then what are the benefits of having a framework? Obviously, the engineers can be on the same page when it comes to implementing a feature. It helps communication among engineers, like in code review. And it’s hard for newcomers to make mistakes, since the framework defines what code goes where. The most important outcome of using a framework, to me, is this: as more and more features (for dev) or test cases (for test) come in, the volume of the code will increase, but the complexity of the code will remain the same. This is the beauty of using a framework.

A good example of a development framework would be the MVC framework. When developers create a new feature, they follow the MVC framework and put code in the right place: Model for data representation, View for the presentation layer, and Controller for orchestration and actual business logic. So when you design a test automation framework, you should have some sort of agreement that everyone understands. If you don’t have this “framework” nature in your test automation, you might not have a test automation framework; you just have test scripts. Take a look at your test automation. Is it a true framework?
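As a sketch of what such an agreement could look like in test automation (class and locator names here are made up, not from a real library), a base page class can define where locators and actions live:

```python
# Hypothetical sketch of a framework "agreement" for test automation:
# every page object declares its locators in one place and exposes only
# intent-level actions, so everyone knows what code goes where.

class BasePage:
    locators = {}  # each subclass fills this in; tests never touch it

    def find(self, name):
        # A real implementation would call the UI driver here.
        return f"element at {self.locators[name]}"

class ProfilePage(BasePage):
    locators = {"save_button": "#save"}

    def save(self):
        # Tests call save(), never a raw selector.
        return self.find("save_button")
```

With a convention like this, every new page object looks the same, and the code complexity stays flat as more pages are added.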

Writing a test automation framework requires discipline. Why? It's a bit different from writing production code. Just think about it: for production code, there are dedicated testers and test teams testing that code. But there is no other test team testing the test automation framework code. We're writing code to test other code, which means test automation has to be even more correct. How can we achieve this without someone testing the test automation code? This is actually a big challenge for SDETs out there.

OK, then what's the advantage? Generally speaking, test automation code does not carry expectations as high as production code in performance, algorithmic efficiency (big O), and memory utilization. Test automation is not always required to handle exceptions or errors gracefully. And normally, test case code has a specific intent and a specific expected outcome, so test execution code can sometimes ignore things that are not in the scope of that particular test case. We should be able to take advantage of these facts when writing a test automation framework.

Here are my disciplines. I discipline myself not to be fancy with my test automation code. I know I passed all those crazy interview questions to join the company. I know I am capable of writing complicated code with very efficient algorithms. But when I write test automation code, I discipline myself not to be fancy and to go simple. For example, let's say I can implement an n-squared solution and an n-log-n solution for a given problem. If the n-squared solution is more straightforward to implement and easier to understand, I will go for the n-squared solution. Yes, this is really hard for me too. But it is important that I write less error-prone code.
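A toy illustration of that trade-off (the example problem, checking a list of test IDs for duplicates, is invented for this post):

```python
# Toy illustration: duplicate detection two ways.

def has_duplicate_simple(ids):
    # O(n^2), but obvious at a glance; fine for test-sized inputs.
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if ids[i] == ids[j]:
                return True
    return False

def has_duplicate_fast(ids):
    # O(n log n) via sorting; faster, but a little more to reason about.
    s = sorted(ids)
    return any(a == b for a, b in zip(s, s[1:]))
```

In test automation code, I would pick the first version unless the input size actually makes it a problem.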

I discipline myself to minimize the usage of conditionals (if, switch statements). Again, it is to reduce the complexity of the code. I've seen a crazy test automation method that takes 7-8 booleans, enums, and objects as parameters. Oh man, the complexity of that code got worse and worse. I'd rather have two separate methods than one method with boolean parameters. It looks fine at the beginning, but as time goes by, boolean parameters become enum parameters, and one parameter becomes 2-3 additional parameters. There are some cases where it is necessary, but in most cases I choose redundancy over complexity. And if you try to write code without conditionals, it naturally becomes more object-oriented.
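A hypothetical before/after sketch of that discipline (the checkout example is made up):

```python
# Before: one method steered by a boolean flag. The flag tends to grow
# into an enum, then into 2-3 more parameters, over time.
class CheckoutPageWithFlag:
    def place_order(self, as_guest):
        if as_guest:
            return "guest order"
        return "member order"

# After: two explicit methods. A little redundancy, no branching at the
# call site and none inside the page object.
class CheckoutPage:
    def place_order_as_guest(self):
        return "guest order"

    def place_order_as_member(self):
        return "member order"
```

The test reads better too: `page.place_order_as_guest()` states the intent, while `page.place_order(True)` forces the reader to look up what the flag means.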

I discipline myself to be open-minded. I let test cases drive the automation structure, not my preference. I discipline myself not to be obsessed with my own design. When new test cases come in and they do not fit the current automation design, I will not force the test cases to fit the current automation. I will change the automation structure to fit the new test cases. In other words, I will not let the complexity of the code grow because of new test cases. I will change the automation structure or design to keep the complexity at about the same level.

Well, I'm getting sleepy.. happy testing!  

Saturday, April 19, 2014

5 minute software testing tips

Today, for the first time, I created a video about software testing on YouTube!
I don't know how this will work out, but hey.. why not.

5-minute software testing tips.. :)
Will post more and more~~



Thursday, April 10, 2014

Dealing with emotion, people, and yourself in software testing

Today, I'd like to write about stuff around the workplace that's somewhat related to testing.

We love testing! Don't you love doing testing all day long? I do. I really do. Well, but since it is a job, we all have crappy days and good days. Have you experienced these? A developer resolves a bug as by-design even though you mentioned all the details of the issue? Someone keeps bouncing a bug back to you even though it's not your issue? Someone blames you for something via email and cc's his manager, your manager, and some other important people? You made a huge mistake, but it got overlooked because of another big issue? You are just totally lost and simply take all the blame? You find a huge bug by accident? You've been keeping your test automation or test cases in really good shape, but the one time your manager sees your work, everything goes wrong? You troubleshoot an issue in a few seconds and find the root cause? Someone you believe is not a very good developer/PM saves you from disaster? You give or take some condescending comments? You'd be so happy if that guy did not work here? You'd be so happy if we didn't need to do this? You get so angry about peer performance reviews?

I think we're all so human. Don't you think? It's not only you. It's everybody.. Right?
Here are things I think you might consider when you're dealing with people and emotion.

Do not criticize anyone publicly
Nobody likes criticism. Even if you're 100% absolutely right, do not criticize anyone in public. The other person will never appreciate your criticism. He will become very defensive. He'll try to find justification for himself and for others to believe. And one day he will criticize you in public in return. I had several incidents around this. I had someone criticize me and cc my manager or a group alias. Oh man, it was hard to take. I came up with 100 reasons why it happened, to justify the issue to myself. But I have also criticized some people over email. I thought I was cool. I emailed back with my awesome criticism of a senior engineer's original email and, yes, REPLY ALL. I thought I had confidence in myself and opinions about things, like a philosophy of testing. As a result, I got a very bad review that year. I could sense people did not like me when I raised my voice to debate. I'm not saying you need to be nice to other people to get a good review. I'm saying that being a confident, strongly opinionated person and criticizing other people publicly are two different things. You can be confident and have strong opinions about everything without hurting others' feelings. When you receive the dumbest email or an unreasonable claim or blame from others, think about how you can point out the issue without hurting their feelings.

Leave your ego under your desk when you argue over a bug
I still get surprised by how people interpret and react to bugs. Some fight over the priority. Some doubt the severity of the bug. I've fought over bugs many, many times. I have used "user perspective" cards or "terrible potential risk" cards to convince people. And yeah... sometimes I argued just for my EGO. I respectfully(?) respond to "You tester, stop breaking things!" with "I don't break things. I find things YOU broke!" But bugs are not our babies. A bug is a statement that indicates an issue which may or may not have a serious impact on the product or customers. It can definitely be interpreted differently by different people. Finding a bug is our job, but how to react to the bug is the project team's responsibility. The project management/business perspective has to be understood, and the development impact or effort should be understood as well. When I talk about a bug, I always try to remember what James Bach said in his talk: "Hey developer, don't think of bugs as your mistakes. We testers do not fight for a bug to criticize your work. We want you to shine. It's like your mother saying 'you have mustard on your mouth' when you leave the house for a date. We care about you. We want you to wipe the mustard off your mouth and shine." If you leave your ego under the desk, discuss the facts around the issue, and try to understand other people's perspectives, it gets much smoother and easier.

Think big. Think big.
When I was a junior SDET, I was really scared of making mistakes. I get offended by people keep pointing out my mistake. I made some  automation framework changes to make it much more effective. Overall, it was great change, but it was not perfect. There is always someone point out flaws and criticize my work. I was a bit discourage to try new initiatives or change. Sometimes I got to the point "why bother." But as I gain more experience, I started to ignore those criticism. I've done many more presentations even if there are always people find flaws in my idea. I've brought so many new successful initiatives at my work even if there are people in doubts. Dr. Russell Ackoff, who is my favorite modern theorist in Systems thinking, mentioned about errors of commission and errors of omission. Error of commission is a mistake that consists of doing something wrong. Error of omission is a mistake that consists of not doing something you should have done. At work, errors of commission are visible, but errors of omission are not visible at all. But I think we should more concern about errors of omission. We should think big. Do not worry too much about small mistake you make when you doing great work. And don't worry about people's doubt on your idea if you believe it would work. We are living in this advanced and modern world, not because of people who are in doubt, but because of people who challenged status quo.