The Incremental Advantage: Evaluating the Performance of a TGG-based Visualisation Framework

  • Conference paper
  • Published in: Graph Transformation (ICGT 2016)
  • Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9761)

Abstract

Triple Graph Grammars (TGGs) are best known as a bidirectional model transformation language, which might give the misleading impression that they are wholly unsuitable for unidirectional application scenarios. We believe that it is more useful to regard TGGs as just graph grammars with “batteries included”, meaning that TGG-based tools provide simple, default execution strategies, together with algorithms for incremental change propagation. Especially in cases where the provided execution strategies suffice, a TGG-based infrastructure may be advantageous, even for unidirectional transformations.

In this paper, we demonstrate these advantages by presenting a TGG-based, read-only visualisation framework, which is an integral part of the metamodelling and model transformation tool eMoflon. We argue for the advantages of using TGGs in this visualisation application scenario, and provide a quantitative analysis of the runtime complexity and scalability of the realised incremental, unidirectional transformation.

Notes

  1. http://www.graphviz.org.

  2. An abstract TGG rule serves, e.g., to extract commonalities of multiple TGG rules, but cannot itself be applied. TGG rules may refine other (abstract or non-abstract) rules to reuse common elements; the purpose of refinement is roughly comparable to that of inheritance in object-oriented programming languages. See [3] for more details.

  3. A match of a rule \(r: L \rightarrow R\) in a graph G is an occurrence \(m: L \rightarrow G\) of the left-hand side L of the rule in G; a small worked example is given after these notes.

  4. http://www.emoflon.org.

  5. https://github.com/eMoflon/paper-icgt2016/releases/tag/icgt2016-v1.0.0.

  6. http://projects.ikv.de/qvt.

  7. http://www.emoflon.org/.
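
The following worked example is not part of the original paper and uses an arbitrarily chosen pattern and host graph; it merely instantiates the definition of a match from Footnote 3. Let the left-hand side \(L\) consist of the nodes \(\{1, 2\}\) and the single edge \(1 \rightarrow 2\), and let the host graph \(G\) consist of the nodes \(\{a, b, c\}\) and the edges \(a \rightarrow b\) and \(b \rightarrow c\). Then \(L\) has exactly two matches in \(G\):

\[ m_1 = \{\, 1 \mapsto a,\; 2 \mapsto b \,\}, \qquad m_2 = \{\, 1 \mapsto b,\; 2 \mapsto c \,\}. \]

Each \(m_i\) is a graph morphism, i.e., it maps the edge \(1 \rightarrow 2\) of \(L\) to an existing edge of \(G\); the mapping \(\{1 \mapsto a,\, 2 \mapsto c\}\) is not a match, because \(G\) contains no edge \(a \rightarrow c\).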

References

  1. Pérez Andrés, F., de Lara, J., Guerra, E.: Domain specific languages with graphical and textual views. In: Schürr, A., Nagl, M., Zündorf, A. (eds.) AGTIVE 2007. LNCS, vol. 5088, pp. 82–97. Springer, Heidelberg (2008)

  2. Anjorin, A.: Synchronization of Models on Different Abstraction Levels using Triple Graph Grammars. PhD thesis, Technische Universität Darmstadt (2014)

  3. Anjorin, A., Saller, K., Lochau, M., Schürr, A.: Modularizing triple graph grammars using rule refinement. In: Gnesi, S., Rensink, A. (eds.) FASE 2014 (ETAPS). LNCS, vol. 8411, pp. 340–354. Springer, Heidelberg (2014)

  4. Blouin, D., Plantec, A., Dissaux, P., Singhoff, F., Diguet, J.-P.: Synchronization of models of rich languages with triple graph grammars: an experience report. In: Di Ruscio, D., Varró, D. (eds.) ICMT 2014. LNCS, vol. 8568, pp. 106–121. Springer, Heidelberg (2014)

  5. Bottoni, P., Guerra, E., de Lara, J.: Enforced generative patterns for the specification of the syntax and semantics of visual languages. JVLC 19(4), 429–455 (2008)

  6. Cheney, J., McKinna, J., Stevens, P., Gibbons, J.: Towards a repository of Bx examples. In: Workshops of EDBT/ICDT 2014. CEUR Workshop Proceedings, vol. 1133, pp. 87–91. CEUR-WS.org (2014)

  7. Diskin, Z., Wider, A., Gholizadeh, H., Czarnecki, K.: Towards a rational taxonomy for increasingly symmetric model synchronization. In: Di Ruscio, D., Varró, D. (eds.) ICMT 2014. LNCS, vol. 8568, pp. 57–73. Springer, Heidelberg (2014)

  8. Giese, H., Hildebrandt, S., Neumann, S.: Model synchronization at work: keeping SysML and AUTOSAR models consistent. In: Engels, G., Lewerentz, C., Schäfer, W., Schürr, A., Westfechtel, B. (eds.) Nagl Festschrift. LNCS, vol. 5765, pp. 555–579. Springer, Heidelberg (2010)

  9. Greenyer, J., Rieke, J.: Applying advanced TGG concepts for a complex transformation of sequence diagram specifications to timed game automata. In: Schürr, A., Varró, D., Varró, G. (eds.) AGTIVE 2011. LNCS, vol. 7233, pp. 222–237. Springer, Heidelberg (2012)

  10. Hermann, F., Gottmann, S., Nachtigall, N., Ehrig, H., Braatz, B., Morelli, G., Pierre, A., Engel, T., Ermel, C.: Triple graph grammars in the large for translating satellite procedures. In: Di Ruscio, D., Varró, D. (eds.) ICMT 2014. LNCS, vol. 8568, pp. 122–137. Springer, Heidelberg (2014)

  11. Hermann, F., Nachtigall, N., Braatz, B., Engel, T., Gottmann, S.: Solving the FIXML2Code-case study with HenshinTGG. In: TTC 2014. CEUR Workshop Proceedings, vol. 1305, pp. 32–46. CEUR-WS.org (2014)

  12. Hildebrandt, S., Lambers, L., Giese, H., Rieke, J., Greenyer, J., Schäfer, W., Lauder, M., Anjorin, A., Schürr, A.: A survey of triple graph grammar tools. In: BX 2013. ECEASST, vol. 57. EASST (2013)

  13. Kulcsár, G., Leblebici, E., Anjorin, A.: A solution to the FIXML case study using triple graph grammars and eMoflon. In: TTC 2014. CEUR Workshop Proceedings, vol. 1305, pp. 71–75. CEUR-WS.org (2014)

  14. Lauder, M., Anjorin, A., Varró, G., Schürr, A.: Efficient model synchronization with precedence triple graph grammars. In: Ehrig, H., Engels, G., Kreowski, H.-J., Rozenberg, G. (eds.) ICGT 2012. LNCS, vol. 7562, pp. 401–415. Springer, Heidelberg (2012)

  15. Leblebici, E.: Towards a graph grammar-based approach to inter-model consistency checks with traceability support. In: BX 2016. CEUR Workshop Proceedings, vol. 1571. CEUR-WS.org (2016)

  16. Leblebici, E., Anjorin, A., Schürr, A.: Developing eMoflon with eMoflon. In: Di Ruscio, D., Varró, D. (eds.) ICMT 2014. LNCS, vol. 8568, pp. 138–145. Springer, Heidelberg (2014)

  17. Leblebici, E., Anjorin, A., Schürr, A., Taentzer, G.: Multi-amalgamated triple graph grammars. In: Parisi-Presicce, F., Westfechtel, B. (eds.) ICGT 2015. LNCS, vol. 9151, pp. 87–103. Springer, Heidelberg (2015)

  18. Leblebici, E., Anjorin, A., Schürr, A., Hildebrandt, S., Rieke, J., Greenyer, J.: A comparison of incremental triple graph grammar tools. In: GT-VMT 2014. ECEASST, vol. 67. EASST (2014)

  19. Mougenot, A., Darrasse, A., Blanc, X., Soria, M.: Uniform random generation of huge metamodel instances. In: Paige, R.F., Hartman, A., Rensink, A. (eds.) ECMDA-FA 2009. LNCS, vol. 5562, pp. 130–145. Springer, Heidelberg (2009)

  20. Peldszus, S., Kulcsár, G., Lochau, M.: A solution to the Java refactoring case study using eMoflon. In: TTC 2015. CEUR Workshop Proceedings, vol. 1524, pp. 118–122. CEUR-WS.org (2015)

  21. Scheidgen, M.: Generation of large random models for benchmarking. In: BigMDE 2015. CEUR Workshop Proceedings, vol. 1406, pp. 1–10. CEUR-WS.org (2015)

  22. Schleich, A.: Skalierbare und effiziente Modellgenerierung mit Tripel-Graph-Grammatiken [Scalable and efficient model generation with triple graph grammars]. Master's thesis, TU Darmstadt, Germany (2015)

  23. Schürr, A.: Specification of graph translators with triple graph grammars. In: Mayr, E.W., Schmidt, G., Tinhofer, G. (eds.) WG 1994. LNCS, vol. 903, pp. 151–163. Springer, Heidelberg (1995)

Acknowledgements

This work has been funded by the German Research Foundation (DFG) as part of project A01 within the Collaborative Research Centre (CRC) 1053 – MAKI.

Author information

Corresponding author

Correspondence to Roland Kluge.

Appendix: Examples from the eMoflon Handbook

We show concrete examples of visualised source models taken from the eMoflon handbook,Footnote 7 whose illustrative example is Leitner's learning box, a system for, e.g., language learning. This system works by creating cards, sorted into sequential partitions, with a front face showing the known word (e.g., "hello" in English) and a back face showing the to-be-learnt word (e.g., "Hallo" in German). While exercising, the learner takes a card from a partition, tries to guess the back-face word based on the front-face word, and, if successful, may move the card to the next partition. A so-called fast card contains easy-to-learn words and may be moved to the last partition immediately upon success.
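
As a purely illustrative aside (not taken from the handbook), the following plain-Java sketch captures the structure just described. The actual handbook model is an EMF/Ecore metamodel, so all class and field names here are our own assumptions.

import java.util.ArrayList;
import java.util.List;

// Hypothetical plain-Java rendering of the learning box; the handbook model is
// an EMF/Ecore metamodel, so names and structure here are illustrative only.
class Card {
    final String front;   // known word, e.g. "hello"
    final String back;    // to-be-learnt word, e.g. "Hallo"
    final boolean fast;   // fast cards contain easy-to-learn words

    Card(String front, String back, boolean fast) {
        this.front = front;
        this.back = back;
        this.fast = fast;
    }
}

class Partition {
    final List<Card> cards = new ArrayList<>();
    Partition next;       // partitions are ordered sequentially; null for the last partition
}

class Box {
    final List<Partition> partitions = new ArrayList<>();

    Partition lastPartition() {
        return partitions.get(partitions.size() - 1);
    }
}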

The story diagram in Fig. 4a shows the logic of checking a card: if the answer is correct (story pattern checkCard) and the card is a so-called fast card (story pattern isFastCard), then the card is promoted to the last partition, as shown in the story pattern in Fig. 4b.
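
A minimal Java sketch of this control flow, again using the hypothetical classes from the previous listing rather than the code eMoflon generates from the story diagram, could look as follows.

// Builds on the hypothetical Card/Partition/Box classes sketched above.
class CardChecker {

    // Story pattern checkCard: does the learner's answer match the back face?
    boolean checkCard(Card card, String answer) {
        return card.back.equals(answer);
    }

    // Story patterns isFastCard/promoteFastCard: a correctly answered fast card
    // jumps to the last partition immediately; an ordinary card advances by one.
    void handleCorrectAnswer(Box box, Partition current, Card card) {
        if (card.fast) {
            current.cards.remove(card);
            box.lastPartition().cards.add(card);
        } else if (current.next != null) {
            current.cards.remove(card);
            current.next.cards.add(card);
        }
    }
}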

Another task in the eMoflon handbook is to synchronise (using TGGs) a learning box with a dictionary, whose entries can be thought of as simple key-value pairs. Figure 4c shows the precedence graph resulting from translating the sample box in the handbook into a dictionary. The root node BoxToDictionaryRule 0 indicates that the box is first translated into an empty dictionary before all cards are translated to dictionary entries. Finally, Fig. 4d depicts the triple match that corresponds to CardToEntryRule 5 in Fig. 4c. This match shows that the card containing "Question One" is mapped to the entry with content "One : Eins".
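
The order of rule applications visible in the precedence graph can be approximated by the following batch-style Java sketch: one step that maps the box to an empty dictionary (mirroring BoxToDictionaryRule), followed by one step per card that adds a key-value entry (mirroring CardToEntryRule). This is an illustrative assumption only; the actual transformation is generated from the TGG rules and executed incrementally.

import java.util.LinkedHashMap;
import java.util.Map;

// Builds on the hypothetical Card/Partition/Box classes sketched above.
class BoxToDictionaryTranslator {

    // Mirrors BoxToDictionaryRule: the box is mapped to an (initially empty) dictionary.
    Map<String, String> translateBox(Box box) {
        Map<String, String> dictionary = new LinkedHashMap<>();
        // Mirrors CardToEntryRule, applied once per card: the front face becomes
        // the key and the back face the value of a dictionary entry.
        for (Partition partition : box.partitions) {
            for (Card card : partition.cards) {
                dictionary.put(card.front, card.back);
            }
        }
        return dictionary;
    }
}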

Fig. 4. Visualisations of sample models: (a) SDM checkCard, (b) SP checkCard::promoteFastCard, (c) PG of the box-to-dictionary synchronisation, (d) TM BoxToDictionaryRule 0.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Kluge, R., Anjorin, A. (2016). The Incremental Advantage: Evaluating the Performance of a TGG-based Visualisation Framework. In: Echahed, R., Minas, M. (eds.) Graph Transformation. ICGT 2016. Lecture Notes in Computer Science, vol. 9761. Springer, Cham. https://doi.org/10.1007/978-3-319-40530-8_12

  • DOI: https://doi.org/10.1007/978-3-319-40530-8_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-40529-2

  • Online ISBN: 978-3-319-40530-8

  • eBook Packages: Computer Science, Computer Science (R0)
