Dimensioning Scientific Computing systems to improve performance of Map-Reduce based applications

GG Castañé, A Núñez, R Filgueira… - Procedia Computer Science, 2012 - Elsevier
Map-Reduce is a programming model widely used for processing large data sets on scientific clusters. Most research efforts focus on enhancing the model proposed by Google and alleviating its drawbacks. The requirements of Map-Reduce based applications are often unclear because of the difficulty of satisfying the overall system throughput while exploring alternatives that yield a good tradeoff between the performance of basic subsystems such as storage, networking and CPU. In this paper we present a comparative evaluation of the performance of scaling up scientific computing systems using a Map-Reduce application model. This work focuses specifically on medium-size multi-core systems, frequently used by researchers to run scientific applications. The scaling process targets the three main resources: computing power, communications and storage. By performing an extensive set of simulations with the iCanCloud simulator, we also show that the main bottlenecks of such applications executed on cluster systems lie in the storage and network subsystems. Hence, to increase the overall performance of these applications, the computing power must be scaled up proportionally with the network and storage systems.
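The Map-Reduce model the abstract refers to can be illustrated with a minimal single-process sketch (a word count); the function names and data here are illustrative, not from the paper. In a real cluster deployment, each phase runs distributed across nodes, which is why the abstract's storage and network bottlenecks matter during the shuffle step.

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs from each input document (the "map" step)
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    # Group intermediate values by key (the "shuffle" step);
    # on a cluster this is the network- and storage-heavy part
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values (the "reduce" step)
    return {key: sum(values) for key, values in groups.items()}

docs = ["map reduce on clusters", "map large data sets"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # word -> occurrence count
```

Because the intermediate key-value pairs must be materialized and moved between phases, scaling only CPU leaves the shuffle bound by I/O, matching the paper's conclusion that network and storage must scale alongside computing power.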