DOI: 10.1109/ICSE-NIER.2017.19

In the field monitoring of interactive applications

Published: 20 May 2017

Abstract

Monitoring techniques can extract accurate data about the behavior of software systems. When used in the field, they can reveal how applications behave in real-world contexts and how programs are actually exercised by their users. Nevertheless, since monitoring may require significant storage and computational resources, it may interfere with users' activities, degrading the quality of the user experience.
While the impact of monitoring has typically been studied by measuring the overhead it introduces in a monitored application, little is known about how monitoring solutions actually affect the user experience and to what extent users notice their presence.
In this paper, we present our investigation of how collecting data in the field may impact the quality of the user experience. Our initial results show that users can tolerate non-trivial overhead, depending on the kind of activity being performed. This opens interesting research opportunities for monitoring solutions, which could be designed to opportunistically collect data based on the kind of activities users are performing.
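
To make the idea of opportunistic data collection more concrete, the following Java sketch shows one possible shape of such a monitor: lightweight probes record events at all times, while the costlier persistence step is deferred until the user is performing an overhead-tolerant activity. This is only a minimal sketch under simple assumptions; the class name, activity categories, and flushing policy are illustrative and are not the implementation evaluated in the paper.

    import java.util.Queue;
    import java.util.concurrent.ConcurrentLinkedQueue;

    // Illustrative sketch of opportunistic field monitoring: probes always
    // record events cheaply, while the expensive work (serialization, I/O)
    // is deferred until the user is in an overhead-tolerant activity.
    // Class name, activity categories, and policy are assumptions made
    // for illustration, not the implementation studied in the paper.
    public class OpportunisticMonitor {

        // Coarse classification of what the user is currently doing.
        public enum Activity { TYPING, NAVIGATING, IDLE }

        private final Queue<String> buffer = new ConcurrentLinkedQueue<>();
        private volatile Activity currentActivity = Activity.IDLE;

        // Called by (hypothetical) UI instrumentation when the activity changes.
        public void setActivity(Activity activity) {
            this.currentActivity = activity;
        }

        // Cheap probe: enqueue the event without blocking the UI thread.
        public void record(String event) {
            buffer.add(System.nanoTime() + " " + event);
        }

        // Expensive phase: flush buffered events only during activities where
        // higher overhead is assumed to be tolerated (e.g., idle time).
        public void maybeFlush() {
            if (currentActivity == Activity.TYPING) {
                return; // latency-sensitive activity: postpone the costly work
            }
            String event;
            while ((event = buffer.poll()) != null) {
                persist(event);
            }
        }

        private void persist(String event) {
            // Placeholder for writing to local storage or a remote collector.
            System.out.println("flushed: " + event);
        }

        public static void main(String[] args) {
            OpportunisticMonitor monitor = new OpportunisticMonitor();
            monitor.setActivity(Activity.TYPING);
            monitor.record("menu-opened");
            monitor.maybeFlush();             // deferred: the user is typing
            monitor.setActivity(Activity.IDLE);
            monitor.maybeFlush();             // buffered event is now persisted
        }
    }

A real monitor would additionally need to bound the buffer, measure its own overhead, and obtain a reliable signal about the current user activity; the sketch only illustrates the deferral policy suggested by the results above.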

Cited By

  • (2024) "A family of experiments about how developers perceive delayed system response time," Software Quality Journal, vol. 32, no. 2, pp. 567-605. Online publication date: 1 June 2024. DOI: 10.1007/s11219-024-09660-w
  • (2017) "Flexible in-the-field monitoring," in Proceedings of the 39th International Conference on Software Engineering Companion (ICSE-C), pp. 479-480. Online publication date: 20 May 2017. DOI: 10.1109/ICSE-C.2017.37


Published In

ICSE-NIER '17: Proceedings of the 39th International Conference on Software Engineering: New Ideas and Emerging Results Track
May 2017
75 pages
ISBN:9781538626757

Publisher

IEEE Press

Author Tags

  1. dynamic analysis
  2. monitoring
  3. user experience

Qualifiers

  • Research-article

Conference

ICSE '17
