Multiple workflows scheduling in multi-tenant distributed systems: A taxonomy and future directions
Workflows are an application model that enables the automated execution of multiple
interdependent and interconnected tasks. They are widely used by the scientific community …
Reducing energy footprint in cloud computing: a study on the impact of clustering techniques and scheduling algorithms for scientific workflows
The concept of scientific workflow makes it possible to link and control different tasks to carry
out complex processing. The complicated workflow is generated by scientific distributed …
A data-aware scheduling strategy for executing large-scale distributed workflows
Task scheduling is a key component for the efficient execution of data-intensive
applications in distributed environments, by which many machines must be coordinated to …
Hercules: scalable and network portable in-memory ad-hoc file system for data-centric and high-performance applications
J Garcia-Blas, G Sanchez-Gallegos, C Petre… - … Conference on Parallel …, 2023 - Springer
The growing demands for data processing by new data-intensive applications are putting
pressure on the performance and capacity of HPC storage systems. The advancement in …
A data-aware scheduling strategy for workflow execution in clouds
F Marozzo, F Rodrigo Duro… - Concurrency and …, 2017 - Wiley Online Library
As data-intensive scientific computing systems become more widespread, there is a
need to simplify the development, deployment, and execution of complex data …
IMSS: in-memory storage system for data intensive applications
Computer applications are growing in terms of data management requirements. In both
scientific and engineering domains, high-performance computing clusters tend to …
Exploiting data locality in Swift/T workflows using Hercules
The ever-increasing power of supercomputer systems is both driving and enabling the
emergence of new problem-solving methods that require the efficient execution of many …
Exploiting in-memory storage for improving workflow executions in cloud platforms
The Data Mining Cloud Framework (DMCF) is an environment for designing and
executing data analysis workflows in cloud platforms. Currently, DMCF relies on the default …
Flexible data-aware scheduling for workflows over an in-memory object store
This paper explores novel techniques for improving the performance of many-task workflows
based on the Swift scripting language. We propose novel programmer options for automated …
Improving spectrum-based fault localization using proximity-based weighting of test cases
A Bandyopadhyay - 2011 26th IEEE/ACM International …, 2011 - ieeexplore.ieee.org
Spectrum-based fault localization techniques such as Tarantula and Ochiai calculate the
suspiciousness score of a program statement using the number of failing and passing test …