The test data challenge for database-driven applications.

Haller, 2010

Document ID: 14262676158030204960
Author: Haller K
Publication year: 2010
Publication venue: DBTest

Snippet

Business applications rely typically on databases for storing and processing their data (database-driven applications, or DBAPs). Testing DBAPs requires testing the application logic plus the interaction between the application logic and the database. Thus, DBAP test …
Full text available at klaushaller.net (PDF).
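
As a rough illustration of the interaction the snippet describes (not taken from the paper), a test for database-backed application logic has to put the database into a known state, exercise the logic, and then check both the returned behaviour and the resulting database contents. The sketch below uses Python's sqlite3 and unittest; the transfer_funds function and the accounts table are hypothetical examples, not anything from Haller's work.

# Minimal sketch of a DBAP-style test: prepare test data, run the
# application logic, and verify the database state afterwards.
# All application names here (transfer_funds, accounts) are hypothetical.
import sqlite3
import unittest


def transfer_funds(conn, src, dst, amount):
    """Hypothetical application logic that reads and writes the database."""
    cur = conn.cursor()
    cur.execute("SELECT balance FROM accounts WHERE id = ?", (src,))
    (balance,) = cur.fetchone()
    if balance < amount:
        raise ValueError("insufficient funds")
    cur.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src))
    cur.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst))
    conn.commit()


class TransferTest(unittest.TestCase):
    def setUp(self):
        # The test data problem in miniature: every test needs the
        # database in a known state before the application logic runs.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
        self.conn.executemany(
            "INSERT INTO accounts (id, balance) VALUES (?, ?)", [(1, 100), (2, 0)]
        )
        self.conn.commit()

    def test_transfer_moves_money(self):
        transfer_funds(self.conn, src=1, dst=2, amount=40)
        # Check the effect on the database, not just the return value.
        balances = dict(self.conn.execute("SELECT id, balance FROM accounts"))
        self.assertEqual(balances, {1: 60, 2: 40})

    def tearDown(self):
        self.conn.close()


if __name__ == "__main__":
    unittest.main()

Using an in-memory database keeps each test's data isolated and reproducible; the harder, shared-test-database version of this problem is what the paper's title alludes to.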

Classifications

    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06F 11/368: Test management for test version control, e.g. updating test cases to a new software version
    • G06F 11/3696: Methods or tools to render software testable
    • G06F 11/263: Generation of test inputs, e.g. test vectors, patterns or sequences; with adaptation of the tested hardware for testability with external testers
    • G06F 17/30289: Database design, administration or maintenance
    • G06F 11/362: Software debugging
    • G06F 11/3604: Software analysis for verifying properties of programs
    • G06F 11/2205: Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing, using arrangements specific to the hardware being tested
    • G06F 11/25: Testing of logic operation, e.g. by logic analysers
    • G01R 31/318385: Random or pseudo-random test pattern
    • G01R 31/31903: Tester hardware, i.e. output processing circuit tester configuration

Similar Documents

Silva et al. A systematic review on the use of Definition of Done on agile software development projects
Weske et al. A reference model for workflow application development processes
Felderer et al. Using defect taxonomies to improve the maturity of the system test process: Results from an industrial case study
Majdalawieh et al. Intra/inter process continuous auditing (IIPCA), integrating CA within an enterprise system environment
Bierig et al. Essentials of Software Testing
Alferez et al. Bridging the gap between requirements modeling and behavior-driven development
US20160132797A1 (en) Business process model analyzer and runtime selector
Bass et al. A comparison of requirements specification methods from a software architecture perspective
Haller The test data challenge for database-driven applications.
Alsmadi Advanced Automated Software Testing: Frameworks For Refined Practice: Frameworks For Refined Practice
Szívós et al. The role of data authentication and security in the audit of financial statements
Khannur Software Testing: Techniques and Applications
Essebaa et al. Model-based testing from model driven architecture: A novel approach for automatic test cases generation
Almeida et al. SS-BDD: automated acceptance testing for spreadsheets
Platonov et al. Development of a methodology for cost optimization of software testing for the automatically tests generation
Hill A smarter model risk management discipline will follow from building smarter models: An abbreviated guide for designing the next generation of smart models
Machado Fault model-based variability testing
Jayaswal et al. The Analytic Hierarchy Process (AHP) in Software Development (Digital Short Cut)
Lee et al. An empirical study of quality and cost based security engineering
Gupta et al. Non Performance Testing in Software Development Life Cycle
Dharmapalan Burger Shop
Heaton Business intelligence cookbook: A project lifecycle approach using Oracle technology
Com Syntax Master
Baltaretu et al. Implementing a plagiarism detection system
Isma et al. CashClass as an effective and practical classroom cash financial management application for students in an educational environment