

Machine learning to guide performance testing: An autonomous test framework

Moghadam et al., 2019

Document ID: 7858850731284999192
Authors: Moghadam M, Saadatmand M, Borg M, Bohlin M, Lisper B
Publication year: 2019
Publication venue: 2019 IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW)


Snippet

Satisfying performance requirements is of great importance for performance-critical software systems. Performance analysis to provide an estimation of performance indices and ascertain whether the requirements are met is essential for achieving this target. Model …
Full text available at www.diva-portal.org (HTML).
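The snippet's central point, estimating performance indices and ascertaining whether stated requirements are met, can be sketched minimally as below. This is a hypothetical illustration only, not the framework described by Moghadam et al.; the chosen index (95th-percentile response time), the 250 ms threshold, and the sample measurements are all assumptions for the example.

# Hypothetical sketch (not from the paper): check a measured performance
# index against a stated requirement.
from statistics import quantiles

# Assumed requirement: 95th-percentile response time must stay below 250 ms.
REQUIRED_P95_MS = 250.0

# Assumed measurements collected during one load-test run (milliseconds).
response_times_ms = [120, 135, 142, 158, 170, 185, 199, 210, 232, 260]

def p95(samples):
    """Estimate the 95th percentile of the observed response times."""
    # quantiles(..., n=100) returns the 1st..99th percentile cut points.
    return quantiles(samples, n=100)[94]

observed_p95 = p95(response_times_ms)
requirement_met = observed_p95 <= REQUIRED_P95_MS
print(f"observed p95 = {observed_p95:.1f} ms, requirement met: {requirement_met}")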

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F 11/3414: Workload generation, e.g. scripts, playback
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3676: Test management for coverage analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3604: Software analysis for verifying properties of programs
    • G06F 11/3612: Software analysis for verifying properties of programs by runtime analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3466: Performance evaluation by tracing or monitoring
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06Q: DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/06: Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
    • G06Q 10/063: Operations research or analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/50: Computer-aided design
    • G06F 17/5009: Computer-aided design using simulation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 2201/00: Indexing scheme relating to error detection, to error correction, and to monitoring
    • G06F 2201/875: Monitoring of systems including the internet
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 2201/00: Indexing scheme relating to error detection, to error correction, and to monitoring
    • G06F 2201/86: Event-based monitoring
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for programme control, e.g. control unit
    • G06F 9/06: Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06N: COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 99/00: Subject matter not provided for in other groups of this subclass
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06N: COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 5/00: Computer systems utilising knowledge based models

Similar Documents

Moghadam et al. Machine learning to guide performance testing: An autonomous test framework
Draheim et al. Realistic load testing of web applications
Feldt et al. Searching for cognitively diverse tests: Towards universal test diversity metrics
Bernardino et al. Canopus: A domain-specific language for modeling performance testing
Bertolino et al. DevOpRET: Continuous reliability testing in DevOps
Abbas et al. ASPLe: A methodology to develop self-adaptive software systems with systematic reuse
Koziolek et al. Performance and reliability prediction for evolving service-oriented software systems: Industrial experience report
Esparcia-Alcázar et al. Using genetic programming to evolve action selection rules in traversal-based automated software testing: results obtained with the TESTAR tool
Lamghari et al. A set of indicators for BPM life cycle improvement
Fanjiang et al. Search based approach to forecasting QoS attributes of web services using genetic programming
Brosch Integrated software architecture-based reliability prediction for it systems
Emam et al. Test case prioritization using extended digraphs
Moghadam et al. An autonomous performance testing framework using self-adaptive fuzzy reinforcement learning
Kumara et al. FOCloud: feature model guided performance prediction and explanation for deployment configurable cloud applications
Westermann et al. Efficient experiment selection in automated software performance evaluations
EP3734460B1 (en) Probabilistic software testing via dynamic graphs
Akpinar et al. Web application testing with model based testing method: case study
Moghadam et al. Poster: Performance testing driven by reinforcement learning
Solomon et al. Business process performance prediction on a tracked simulation model
Klinaku et al. Architecture-based evaluation of scaling policies for cloud applications
Malik et al. CHESS: A Framework for Evaluation of Self-adaptive Systems based on Chaos Engineering
Wert Performance problem diagnostics by systematic experimentation
Calinescu et al. Analysis and Refactoring of Software Systems Using Performance Antipattern Profiles.
Bures et al. Prioritized process test: An alternative to current process testing strategies
Grohmann et al. The vision of self-aware performance models