Architectural Run-time Models for Performance and Privacy Analysis in Dynamic Cloud Applications

Published: 25 February 2016

Abstract

Building software systems by composing third-party cloud services promises many benefits, such as flexibility and scalability. At the same time, it leads to major challenges, such as limited control over third-party infrastructures and run-time changes that mostly cannot be foreseen during development. While previous research has focused on automated adaptation, the increased complexity and heterogeneity of cloud services, as well as their limited observability, make it evident that operators (humans) must be allowed to engage in the adaptation process. Models are useful for involving humans and for conducting analyses, e.g. of performance and privacy. During operation, a system often drifts away from its design-time models. Run-time models, in contrast, are kept in sync with the underlying system. However, typical run-time models are close to an implementation level of abstraction, which impedes understandability for humans. In this vision paper, we present the iObserve approach to address the aforementioned challenges while considering operation-level adaptation and development-level evolution as two mutually interwoven processes. Central to this perception is an architectural run-time model that is usable for automated adaptation and is simultaneously comprehensible for humans during evolution. The run-time model builds upon a technology-independent monitoring approach. A correspondence model maintains the semantic relationships between monitoring outcomes and architecture models. As an umbrella, a megamodel integrates design-time models, code generation, monitoring, and run-time model updates. Currently, iObserve covers the monitoring and analysis phases of the MAPE control loop. We present a roadmap for including the planning and execution activities in iObserve.
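The role of the correspondence model described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not the actual iObserve implementation: all class and method names (`MonitoringRecord`, `CorrespondenceModel`, `update_runtime_model`, the `BookingService` example) are invented here to show how technology-level monitoring records might be mapped onto elements of an architectural run-time model during the Monitor/Analyze phases of the MAPE loop.

```python
# Hedged sketch: a correspondence model that resolves code-level monitoring
# records (e.g. from an instrumentation probe) to architectural components,
# keeping the run-time model in sync with observations. All names are
# hypothetical and chosen for illustration only.
from dataclasses import dataclass, field


@dataclass
class MonitoringRecord:
    """A technology-level event, e.g. one probe measurement."""
    operation_signature: str   # e.g. "BookingService.reserve()"
    response_time_ms: float


@dataclass
class ArchComponent:
    """An element of the architectural run-time model."""
    name: str
    observed_response_times: list = field(default_factory=list)

    def mean_response_time(self) -> float:
        times = self.observed_response_times
        return sum(times) / len(times) if times else 0.0


class CorrespondenceModel:
    """Maintains the semantic mapping from code-level operation
    signatures to architecture-level components."""
    def __init__(self, mapping: dict):
        self._mapping = mapping  # signature -> ArchComponent

    def resolve(self, record: MonitoringRecord) -> ArchComponent:
        return self._mapping[record.operation_signature]


def update_runtime_model(cm: CorrespondenceModel, records) -> None:
    """Monitor/Analyze step: feed observations into the run-time model."""
    for rec in records:
        cm.resolve(rec).observed_response_times.append(rec.response_time_ms)


# Usage: two observations of one operation update the component's statistics.
booking = ArchComponent("BookingService")
cm = CorrespondenceModel({"BookingService.reserve()": booking})
update_runtime_model(cm, [
    MonitoringRecord("BookingService.reserve()", 120.0),
    MonitoringRecord("BookingService.reserve()", 80.0),
])
print(booking.mean_response_time())  # mean of the two observations
```

The point of the indirection is that monitoring stays technology-specific (operation signatures) while analysis works on architecture-level abstractions, which is what keeps the run-time model comprehensible for human operators.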




Published In

ACM SIGMETRICS Performance Evaluation Review, Volume 43, Issue 4
March 2016, 61 pages
ISSN: 0163-5999
DOI: 10.1145/2897356
Editor: Nidhi Hegde
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 25 February 2016
Published in SIGMETRICS Volume 43, Issue 4


Author Tags

  1. Architectural Run-time Model
  2. Palladio Component Model
  3. Performance Model
  4. Privacy
  5. Usage Profile

Qualifiers

  • Research-article

Article Metrics

  • Downloads (Last 12 months): 6
  • Downloads (Last 6 weeks): 1
Reflects downloads up to 02 Oct 2024

Cited By

  • (2024) Enhancing Software Architecture Adaptability: A Comprehensive Evaluation Method. Symmetry 16(7):894. DOI: 10.3390/sym16070894. Online publication date: 13-Jul-2024
  • (2023) RPCover: Recovering gRPC Dependency in Multilingual Projects. Proceedings of the 38th IEEE/ACM International Conference on Automated Software Engineering, pages 1930-1939. DOI: 10.1109/ASE56229.2023.00108. Online publication date: 11-Nov-2023
  • (2022) Runtime Software Architecture-Based Reliability Prediction for Self-Adaptive Systems. Symmetry 14(3):589. DOI: 10.3390/sym14030589. Online publication date: 16-Mar-2022
  • (2022) A Tracing Based Model to Identify Bottlenecks in Physically Distributed Applications. 2022 International Conference on Information Networking (ICOIN), pages 226-231. DOI: 10.1109/ICOIN53446.2022.9687217. Online publication date: 12-Jan-2022
  • (2022) Visualizing Microservice Architecture in the Dynamic Perspective: A Systematic Mapping Study. IEEE Access 10:119999-120012. DOI: 10.1109/ACCESS.2022.3221130. Online publication date: 2022
  • (2021) Models@Runtime: The Development and Re-Configuration Management of Python Applications Using Formal Methods. Applied Sciences 11(20):9743. DOI: 10.3390/app11209743. Online publication date: 19-Oct-2021
  • (2021) Evaluation of Software Architectures under Uncertainty. ACM Transactions on Software Engineering and Methodology 30(4):1-50. DOI: 10.1145/3464305. Online publication date: 3-Aug-2021
  • (2021) Enabling Consistency between Software Artefacts for Software Adaption and Evolution. 2021 IEEE 18th International Conference on Software Architecture (ICSA), pages 1-12. DOI: 10.1109/ICSA51549.2021.00009. Online publication date: Mar-2021
  • (2021) A procedural and flexible approach for specification, modeling, definition, and analysis for self-adaptive systems. Software: Practice and Experience 51(6):1387-1415. DOI: 10.1002/spe.2962. Online publication date: 5-Mar-2021
  • (2020) Context-Based Confidentiality Analysis for Industrial IoT. 2020 46th Euromicro Conference on Software Engineering and Advanced Applications (SEAA), pages 589-596. DOI: 10.1109/SEAA51224.2020.00096. Online publication date: Aug-2020
