DOI: 10.1145/2534248
WORKS '13: Proceedings of the 8th Workshop on Workflows in Support of Large-Scale Science
ACM 2013 Proceedings
Publisher:
Association for Computing Machinery, New York, NY, United States
Conference:
SC13: International Conference for High Performance Computing, Networking, Storage and Analysis, Denver, Colorado, 17 November 2013
ISBN:
978-1-4503-2502-8
Published:
17 November 2013
Sponsors:
SIGHPC, SIGARCH, IEEE CS
Abstract

WORKS '13 was the eighth workshop in the WORKS series. The call for papers attracted sixteen submissions from Asia, Europe, and North and South America. The quality of the papers, peer reviewed by the program committee, was exceptional this year; overall, thirteen papers were accepted, covering a variety of topics including provenance and metadata, workflow models, workflow task characterization, workflow engine scalability, and distributed computing performance.

research-article
On assisting scientific data curation in collection-based dataflows using labels

Thanks to the proliferation of computational techniques and the availability of datasets, data-intensive research has become commonplace in science. Sharing and re-use of datasets is key to scientific progress. A critical requirement for enabling data ...

research-article
Static compiler analysis for workflow provenance

Data provenance is the lineage of an artifact or object. Provenance can provide a basis upon which data can be regenerated, and can be used to determine the quality of both the process and provenance itself. Provenance capture from workflows is ...

research-article
On specifying and sharing scientific workflow optimization results using research objects

Reusing and repurposing scientific workflows for novel scientific experiments is nowadays facilitated by workflow repositories. Such repositories allow scientists to find existing workflows and re-execute them. However, workflow input parameters often ...

research-article
Semantics and provenance for processing element composition in Dispel workflows

Dispel is a scripting language for constructing workflow graphs which can then be executed by some other computational infrastructure. It facilitates construction of abstract components (called Processing Elements, or PEs) that can be instantiated in ...

research-article
Execution time prediction for grid infrastructures based on runtime provenance data

An accurate performance prediction service can be very useful for resource management and scheduling services, helping them make better resource utilization decisions by providing better execution time estimates. In this paper we present a novel ...

research-article
Toward fine-grained online task characteristics estimation in scientific workflows

Task characteristics estimations such as runtime, disk space, and memory consumption, are commonly used by scheduling algorithms and resource provisioning techniques to provide successful and efficient workflow executions. These methods assume that ...

research-article
Understanding workflows for distributed computing: nitty-gritty details

Scientific workflow management is heavily used in our organization. After six years, a large number of workflows are available and regularly used to run biomedical data analysis experiments on distributed infrastructures, mostly on grids. In this paper ...

research-article
Open Access
A framework for dynamically generating predictive models of workflow execution

The ability to accurately predict the performance of software components executing within a Cloud environment is an area of intense interest to many researchers. The availability of an accurate prediction of the time taken for a piece of code to execute ...

research-article
Time-bound analytic tasks on large datasets through dynamic configuration of workflows

Domain experts are often untrained in big data technologies and this limits their ability to exploit the data they have available. Workflow systems hide the complexities of high-end computing and software engineering by offering pre-packaged analytic ...

research-article
Automated packaging of bioinformatics workflows for portability and durability using Makeflow

Dependency management remains a major challenge for all forms of software. A program implemented in a given environment typically has many implicit dependencies on programs, libraries, and other objects present within that environment. Moving ...

research-article
Distributed tools deployment and management for multiple Galaxy instances in Globus Genomics

Workflow systems play an important role in the analysis of the fast-growing genomics data produced by low-cost next generation sequencing (NGS) technologies. Many biomedical research groups lack the expertise to assemble and run the sophisticated ...

research-article
The demand for consistent web-based workflow editors

This paper identifies the high value to researchers in many disciplines of having web-based graphical editors for scientific workflows and draws attention to two technological transitions: good quality editors can now run in a browser and workflow ...

research-article
Scalable script-based data analysis workflows on clouds

Data analysis workflows are often composed by many concurrent and compute-intensive tasks that can be efficiently executed only on scalable computing infrastructures, such as HPC systems, Grids and Cloud platforms. The use of Cloud services for the ...

Contributors
  • CNRS National Centre for Scientific Research
  • Cardiff University

Acceptance Rates

WORKS '13 Paper Acceptance Rate: 13 of 16 submissions, 81%
Overall Acceptance Rate: 30 of 54 submissions, 56%
Year        Submitted  Accepted  Rate
WORKS '17          25         8   32%
WORKS '15          13         9   69%
WORKS '13          16        13   81%
Overall            54        30   56%