Test automation requires ongoing operation:
- Keep test automation in sync with the latest specification.
- Test automation is fragile; failures need investigation.
- Monitor test automation performance and improve it when it degrades.
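The monitoring point above can be made concrete. A minimal sketch in Python, assuming a simple run-record format and thresholds of my own choosing (the record tuples, field names, and limits are illustrative, not from the deck):

```python
# Flag slow or flaky automated tests from run records.
# Record format: (test_name, passed: bool, duration_seconds).
from collections import defaultdict

def summarize(runs, max_seconds=60.0, min_pass_rate=0.9):
    stats = defaultdict(lambda: {"passed": 0, "total": 0, "time": 0.0})
    for name, passed, seconds in runs:
        s = stats[name]
        s["total"] += 1
        s["passed"] += passed
        s["time"] += seconds
    report = {}
    for name, s in stats.items():
        pass_rate = s["passed"] / s["total"]
        avg_time = s["time"] / s["total"]
        report[name] = {
            "pass_rate": pass_rate,
            "avg_seconds": avg_time,
            "flaky": pass_rate < min_pass_rate,  # needs investigation
            "slow": avg_time > max_seconds,      # needs performance work
        }
    return report

runs = [("login", True, 12.0), ("login", False, 90.0), ("search", True, 5.0)]
report = summarize(runs)
```

Feeding every CI run into such a summary is one way to notice degrading tests before they erode trust in the suite.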
This slide deck was presented at JaSST'22 Tohoku.
We faced three challenging issues with test automation and resolved them.
This deck explains each issue, how we resolved it, and the benefits we gained.
1st issue: process issue
2nd issue: scripting issue
3rd issue: maintenance issue
This document discusses unit testing and its benefits. It begins by outlining some questions about unit testing, then compares unit tests to UI tests. Unit tests are faster, test individual functions, and make code easier to change and refactor. The document provides an example of unit testing a password validation function in PHP Laravel. It discusses that while test coverage is a quality metric, high coverage alone does not guarantee high quality. It argues that internal quality through practices like unit testing does not require tradeoffs with development speed, and can actually improve productivity by reducing unnecessary tasks and lead time. Maintaining clean code through practices like unit testing is important for both quality and speed.
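The password-validation example described above used PHP/Laravel; a hedged re-sketch of the same idea in Python, with validation rules I invented for illustration:

```python
# A unit test for a password validation function (rules are assumed:
# at least 8 characters, containing a letter and a digit).
import re
import unittest

def is_valid_password(password: str) -> bool:
    return (len(password) >= 8
            and re.search(r"[A-Za-z]", password) is not None
            and re.search(r"\d", password) is not None)

class PasswordTest(unittest.TestCase):
    def test_accepts_valid_password(self):
        self.assertTrue(is_valid_password("abc12345"))

    def test_rejects_short_password(self):
        self.assertFalse(is_valid_password("a1"))

    def test_rejects_password_without_digit(self):
        self.assertFalse(is_valid_password("abcdefgh"))
```

Tests like these run in milliseconds with no browser or server, which is why unit tests sit below UI tests in the pyramid.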
This document provides an overview of test automation from the perspective of a test automation engineer. It discusses key topics like the test automation pyramid, reporting, design considerations, and deployment. The test automation pyramid emphasizes unit testing, integration testing, and end-to-end testing from the bottom to top. Reporting and metrics are important for understanding test results and efficiency. Design focuses on aspects like data-driven testing, robustness, and repeatability. Deployment involves piloting automation, maintaining scripts, and supporting evolving environments. The goal is to improve testing in areas like coverage, speed, and cost while maintaining quality.
Test Automation Improvement by Machine Learning (JaSST'21 Tokyo), Sadaaki Emura
We had a challenging issue in test automation operation.
Automated tests fail both because of real bugs and for reasons that are not bugs.
One of the main non-bug reasons is a temporary accident (a transient environment or timing problem).
When a test fails because of such a temporary accident, it usually succeeds when run again.
This re-run operation is a boring and heavy task in test automation operation.
To eliminate this kind of issue and improve operation, we built a system that categorizes failures with machine learning and re-runs a test only when the predicted failure reason is a temporary accident.
In this session, I cover the following:
- What the test automation issues are in daily operation
- How we resolved them: collecting data, training the model, the system architecture, etc.
- Actual improvement results
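The flow described above can be sketched minimally. Here a keyword rule stands in for the trained machine-learning classifier, and all names (`predict_reason`, `handle_failure`, the signal strings) are illustrative, not from the talk:

```python
# Re-run a failed test only when the predicted failure reason
# is a temporary accident; otherwise report it for investigation.
def predict_reason(failure_log: str) -> str:
    """Stand-in for the ML model: classify a failure log."""
    transient_signals = ("timeout", "connection reset", "element not found yet")
    if any(s in failure_log.lower() for s in transient_signals):
        return "temporary_accident"
    return "needs_investigation"

def handle_failure(failure_log, rerun, max_retries=2):
    for _ in range(max_retries):
        if predict_reason(failure_log) != "temporary_accident":
            return "report_to_engineer"
        if rerun():  # re-execute the failed test
            return "recovered"
    return "report_to_engineer"

# Simulated re-run that succeeds on the second attempt.
attempts = []
def flaky_rerun():
    attempts.append(1)
    return len(attempts) >= 2

result = handle_failure("HTTP timeout while loading page", flaky_rerun)
```

Capping retries matters: without a limit, a genuinely broken test predicted as transient would loop forever.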
20200630 Rakuten QA meetup #2 "Improve test automation operation", Sadaaki Emura
Test automation brings benefits but also pain points, such as a large number of failure reports. Currently, temporary unstable failures make up 73.4% of issues, and investigating them wastes time. The presenter proposes an auto-recover system that automatically re-runs tests predicted to have failed due to temporary issues. This could reduce wasted monthly operation time from 75 hours to around 25 hours by automating recovery of the most common failure type. The system uses OCR, screenshots, and previous data to predict failure reasons and automatically re-run tests, improving test automation efficiency.
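A quick back-of-envelope check of the claimed savings, using only the figures in the summary (the interpretation of "best case" is mine):

```python
# 73.4% of failures are temporary accidents; operation costs 75 h/month.
monthly_hours = 75.0
temporary_share = 0.734
# If every temporary failure were auto-recovered, only the remainder
# would still need manual investigation.
best_case = monthly_hours * (1 - temporary_share)  # roughly 20 hours
```

The claimed "around 25 hours" is consistent with this best case plus some residual overhead for imperfect prediction.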
This document discusses test automation, including the purpose of test automation, the test automation process, and the test automation pyramid. The key points are:
1. Test automation aims to improve test efficiency, provide wider test coverage, reduce costs, and speed up testing.
2. The test automation process involves defining the test scope, designing tests, coding tests, setting up the test environment, running tests, and maintaining automation over time.
3. The test automation pyramid illustrates that unit tests should form the base, as they are quick to write and run, while user interface tests are at the top as they are more complex and time-consuming.
Struggles and Challenges in STLC, in Ques No.13, Sadaaki Emura
The document discusses struggles and challenges in software testing lifecycle (STLC) at Rakuten. It describes two main challenges - difficulty in managing requirements and test cases, and difficulty completing automation tasks on time. It then introduces requirement traceability matrix (RTM) as a solution to address the first challenge by linking requirements, test cases, and defects to ensure full test coverage and easy identification of impacted test cases when requirements change. RTM allows tracking of changes by onsite and offsite teams.
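The RTM idea described above is essentially a linking data structure; a minimal sketch, with a data model (dicts of ids) and requirement/test names invented for illustration:

```python
# Requirement traceability matrix: link requirements to test cases
# and defects, then answer the two questions RTM exists for.
rtm = {
    "REQ-1": {"tests": ["TC-1", "TC-2"], "defects": ["BUG-7"]},
    "REQ-2": {"tests": ["TC-3"], "defects": []},
    "REQ-3": {"tests": [], "defects": []},  # a coverage gap
}

def impacted_tests(changed_requirements):
    """Test cases to re-check when the given requirements change."""
    tests = set()
    for req in changed_requirements:
        tests.update(rtm.get(req, {}).get("tests", []))
    return sorted(tests)

def uncovered_requirements():
    """Requirements with no linked test case."""
    return [req for req, links in rtm.items() if not links["tests"]]
```

Keeping this matrix shared between onsite and offsite teams is what makes requirement changes traceable to their affected tests.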
The document discusses lessons learned from analyzing requirements documents for testing purposes. It summarizes three approaches: 1) James Bach's Heuristic Test Strategy Model which classifies requirement information to guide testing, 2) Yoshio Shimizu's Universal Specification Describing Manner which stresses documenting the reason for requirements, and 3) Kazuhiro Ishihara's requirement matrix analysis which considers what should, must, and never happen for users. The key lessons are to analyze requirements from the user perspective and consider cases beyond what is explicitly specified to improve testing.
This document discusses improving automation testing speed at Rakuten. It describes how the author's team previously faced bottlenecks like long setup times and late test feedback. They created a "Mobile Labo" solution using Appium and Selenium to run tests concurrently on multiple devices from a single script. This sped up testing significantly. However, new issues emerged around device clashes and locating devices. The author aims to resolve such problems to further improve speed in the future.
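The concurrent multi-device idea behind "Mobile Labo" can be sketched with a thread pool. `run_on_device` here is a stand-in for starting an Appium/Selenium session per device; the device names and result format are invented:

```python
# Run one test script against many devices concurrently.
from concurrent.futures import ThreadPoolExecutor

DEVICES = ["Pixel-7", "iPhone-14", "Galaxy-S23"]

def run_on_device(device: str) -> tuple:
    # Placeholder for: create a driver session for `device`,
    # execute the shared test script, collect the verdict.
    return (device, "passed")

with ThreadPoolExecutor(max_workers=len(DEVICES)) as pool:
    results = dict(pool.map(run_on_device, DEVICES))
```

Allocating devices through a lock or a queue, rather than letting scripts grab them directly, is one way to avoid the device-clash problem the author mentions.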
4.
Success Factors in Test Automation
• The automation approach differs by test level (unit, integration, ...)
• The automation architecture differs by the system's architecture (interfaces, programming language, ...)
• The automation configuration differs by the system's complexity (including external integrations)
[Figure: V-model showing test levels (Unit Test, Integration Test, GUI) and automation tools (phpUnit, Selenium/Appium, SoapUI)]
What to keep in mind when introducing test automation
5.
Design for Testability and Automation
• Observability: provide an interface (I/F) for judging the system's state
• Control(lability): provide an interface for operating the system
• Clearly defined architecture: the above interfaces can be uniquely identified
What is testability?
• A mechanism is needed so that a machine (the automation) can judge pass/fail
• The machine needs a means (I/F, objects) to operate the system
• It must be possible to uniquely identify the means the machine operates
[Figure: example GUI with Button A, Button B, Button C, text, and a "Process is success!" message, annotated: easy to validate, operable, objects uniquely identifiable]
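The three properties on this slide can even be checked mechanically. A minimal sketch using only the standard library, with an invented HTML page standing in for the GUI in the figure:

```python
# Check testability of a page: objects are uniquely identifiable
# (every id occurs once) and the result state is machine-readable.
from html.parser import HTMLParser

PAGE = """
<button id="button-a">Button A</button>
<button id="button-b">Button B</button>
<button id="button-c">Button C</button>
<p id="status">Process is success!</p>
"""

class IdCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.ids = []
    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id":
                self.ids.append(value)

collector = IdCollector()
collector.feed(PAGE)
ids_unique = len(collector.ids) == len(set(collector.ids))  # identifiable
observable = 'id="status"' in PAGE  # state exposed for pass/fail judgment
```

A Selenium script can then locate `status` directly and judge success without guessing, which is exactly the testability the slide asks for.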
6.
The Generic Test Automation Architecture
Structure of a test automation architecture
[Figure: the gTAA layers. Test Generation Layer: Manual Design, Test Models. Test Definition Layer: Test Conditions, Test Cases, Test Procedures, Test Data, Library. Test Execution Layer: Test Execution, Test Logging, Test Reporting. Test Adaptation Layer: GUI, API, Protocols, Database, Simulator]
Test Adaptation Layer:
• Control of the test harness
• Monitoring of the system
• Simulation/emulation of the system
Test Execution Layer:
• Automated execution of test cases
• Logging of test case execution
• Reporting of test results
Test Definition Layer:
• Define test cases, test data, and procedures
• Define the test scripts that execute the test cases
• Use test libraries (keyword-driven)
Test Generation Layer:
• Design test cases manually
• Generate or acquire test data
• Create test cases automatically from models
Note: most automation is designed and implemented starting from the lower layers, but the overall picture must be kept in mind.
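The keyword-driven test libraries mentioned for the Test Definition Layer can be sketched minimally: test cases become data, and keywords map to implementation functions. All names here are illustrative:

```python
# Keyword-driven sketch: a test case is (keyword, argument) pairs,
# executed against a small keyword library.
state = {"logged_in": False, "last_result": None}

def login(user):
    state["logged_in"] = True  # placeholder for a real login step

def search(term):
    state["last_result"] = f"results for {term}"

def verify_result(expected_substring):
    assert expected_substring in state["last_result"]

KEYWORDS = {"Login": login, "Search": search, "VerifyResult": verify_result}

# The test case itself is pure data, so non-programmers can write it.
TEST_CASE = [("Login", "alice"), ("Search", "shoes"), ("VerifyResult", "shoes")]

def run(test_case):
    for keyword, arg in test_case:
        KEYWORDS[keyword](arg)
    return "passed"

outcome = run(TEST_CASE)
```

Separating the test definition (data) from the execution and adaptation code is exactly the layering the gTAA figure describes.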