Published (Last): 6 September 2013
PDF File Size: 6.83 MB
ePub File Size: 1.46 MB
Price: Free (free registration required)
ISTQB Glossary
Maintenance based on change requests raised by users. Missing terms used in the Advanced Level syllabus added. Some inconsistencies resolved.
Updates to support the new version of the Advanced Level syllabi. Verbiage in 0. Existing terms have not been changed; hence, other syllabi are not affected.

Introduction to this Glossary

At the time of Glossary version 2., the working group included Lucjan Stapp. Erik van Veenendaal created the initial version of this Glossary, maintained it, and led the Glossary working group from its inception until March. The editors would like to thank him for his pioneering work and major contributions.
Many more people, who are not mentioned here by name, have contributed to former versions of this Glossary. The editors would like to thank them all for their contributions. Moreover, the professional or technical use of these terms often varies, with different meanings attributed to them. In compiling this glossary, the working group sought the views and comments of a broad spectrum of opinion in industry, commerce, and government bodies and organizations, with the aim of producing an international testing standard that would gain wide acceptance.
Total agreement will rarely, if ever, be achieved in compiling a document of this nature. Contributions to this glossary have been received from testing communities from all over the world. The standard was initially developed with a bias toward component testing, but, since its publication, many comments and proposals for new definitions have been submitted to both improve and expand the standard to cover a wider range of software testing. It focuses on terms that have a specific meaning in testing.
Related non-testing terms are also included if they play a major role in testing, such as terms used in software quality assurance and software lifecycle models. However, most terms of other software engineering disciplines that are used in different ISTQB syllabi are not covered in this document. For instance, the terms commonly used in Agile software development are not included in this document. The Foundation Extension Agile Tester syllabus refers to a number of well-accepted Internet resources that provide appropriate definitions.
Some terms are preferred over synonymous ones; in such cases, the definition of the preferred term appears, and the synonyms refer to it. For example, structural testing is a synonym for the preferred term white box testing. Cross-references assist the user in quickly navigating to related terms. Some terms are included because they were used in a previous version of a syllabus and the principle of backward compatibility is being applied.
However, probably the most important terms are the examinable keywords that are explicitly identified by one or more ISTQB syllabi. To support testing professionals who are preparing for exams, the keywords for each syllabus are identified. The syllabus is indicated to the left of the term. The principle of inheritance is applicable, e.g.

Definitions

A

abstract test case: See high level test case.
F-AT acceptance criteria: The exit criteria that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity. ATA accuracy testing: The process of testing to determine the accuracy of a software product.
See also accuracy. ATT adaptability: The capability of the software product to be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered. F-AT agile manifesto: A statement on the values that underpin agile software development.
The values are:
- individuals and interactions over processes and tools
- working software over comprehensive documentation
- customer collaboration over contract negotiation
- responding to change over following a plan.
F-AT EITP agile software development: A group of software development methodologies based on iterative incremental development, where requirements and solutions evolve through collaboration between self-organizing cross-functional teams. EITP agile testing: Testing practice for a project using agile software development methodologies, incorporating techniques and methods such as extreme programming (XP), treating development as the customer of testing and emphasizing the test-first design paradigm.
See also test-driven development. Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing. ETM analytical testing: Testing based on a systematic analysis of e. ATT analyzability: The capability of the software product to be diagnosed for deficiencies or causes of failures in the software, or for the parts to be modified to be identified. ATM anomaly: Any condition that deviates from expectation based on requirements specifications, design documents, user documents, standards, etc.
Anomalies may be found during, but not limited to, reviewing, testing, analysis, compilation, or use of software products or applicable documentation. ETAE API testing: Testing performed by submitting commands to the software under test using programming interfaces of the application directly.
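The "API testing" entry above can be illustrated with a minimal sketch. Everything here is hypothetical: `BankAPI`, `create_account`, and `get_balance` are invented stand-ins for an application's programming interface, used to show tests driving that interface directly rather than going through a user interface.

```python
class BankAPI:
    """Hypothetical stand-in for an application's programming interface."""

    def __init__(self):
        self._accounts = {}

    def create_account(self, owner, initial_balance=0):
        if owner in self._accounts:
            raise ValueError("account exists")
        self._accounts[owner] = initial_balance

    def get_balance(self, owner):
        return self._accounts[owner]


# API-level test: submit commands through the interface and check responses.
api = BankAPI()
api.create_account("alice", 100)
assert api.get_balance("alice") == 100

# Negative case: the interface should reject a duplicate account.
try:
    api.create_account("alice")
    raised = False
except ValueError:
    raised = True
assert raised
```

The same idea applies to HTTP or RPC interfaces; the defining feature is that the test exercises the programmatic interface of the software under test directly.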
EITP assessment report: A document summarizing the assessment results, e.g. See also process assessment. EITP assessor: A person who conducts an assessment; any member of an assessment team. ATT atomic condition: A condition that cannot be decomposed, i.e. one that does not contain two or more single conditions joined by a logical operator such as AND or OR. F attack: A directed and focused attempt to evaluate the quality, especially the reliability, of a test object by attempting to force specific failures to occur.
See also negative testing. See also attack. ATA attractiveness: The capability of the software product to be attractive to the user. This facilitates defect analysis and allows a process audit to be carried out. Often expressed as a percentage. See also corporate dashboard, scorecard. Note: A node in a control flow graph represents a basic block.
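The "atomic condition" entry can be illustrated with a short sketch. The function and test values below are invented for illustration: `x > 0` and `y < 10` are atomic conditions (no logical operators inside them), while the full expression joining them with `and` is not.

```python
def accept(x, y):
    # "x > 0" and "y < 10" are each atomic conditions; the compound
    # expression "x > 0 and y < 10" can be decomposed and is not atomic.
    return x > 0 and y < 10


# Exercising each atomic condition as both true and false:
cases = [(1, 5), (1, 20), (-1, 5), (-1, 20)]
outcomes = [accept(x, y) for x, y in cases]
assert outcomes == [True, False, False, False]
```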
The opposite is off-the-shelf software. Beta testing is often employed as a form of external acceptance testing for off-the-shelf software in order to acquire feedback from the market.
F black box testing: Testing, either functional or non-functional, without reference to the internal structure of the component or system. This process is repeated until the component at the top of the hierarchy is tested. See also integration testing. F ATA boundary value analysis: A black box test design technique in which test cases are designed based on boundary values. See also boundary value. See also buffer.
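The boundary value analysis entry can be made concrete with a small sketch. The rule being tested is invented for illustration (valid ages are 18 to 65 inclusive); the technique is to pick values on and on either side of each boundary.

```python
def is_valid_age(age):
    # Hypothetical rule under test: valid ages are 18..65 inclusive.
    return 18 <= age <= 65


def boundary_values(low, high):
    """Values just below, on, and just above each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]


tests = boundary_values(18, 65)          # [17, 18, 19, 64, 65, 66]
results = [is_valid_age(a) for a in tests]
assert results == [False, True, True, True, True, False]
```

The values just outside each boundary (17 and 66) are the ones most likely to expose off-by-one defects, which is why the technique concentrates test cases there.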
F bug: See defect. It is an industry practice when a high frequency of build releases occurs e.