135 results for BIN-PACKING

in Queensland University of Technology - ePrints Archive


Relevance: 60.00%

Abstract:

Improving energy efficiency has become increasingly important in data centers in recent years in order to curb their rapidly growing electricity consumption. The power dissipation of the physical servers is the root cause of the power usage of other systems, such as cooling systems. Many efforts have been made to make data centers more energy efficient. One of them is to minimize the total power consumption of these servers through virtual machine consolidation, which is implemented by virtual machine placement. The placement problem is often modeled as a bin packing problem. Due to the NP-hard nature of the problem, heuristic solutions such as the First Fit and Best Fit algorithms are often used and generally give good results. However, their performance leaves room for further improvement. In this paper we propose a Simulated Annealing (SA) based algorithm, which aims to improve further on any feasible placement. This is the first published attempt to use SA to solve the VM placement problem for power optimization. Experimental results show that this SA algorithm can generate better results, saving up to 25 percent more energy than First Fit Decreasing in an acceptable time frame.
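As a rough illustration of the approach described in this abstract, the following is a minimal sketch of a simulated-annealing loop for power-aware VM placement. It assumes a one-dimensional (CPU-only) bin-packing model and a simple linear host power model; the power constants, neighbourhood move and cooling schedule are invented for illustration and need not match the paper's algorithm.

```python
# Minimal sketch: simulated annealing for power-aware VM placement,
# starting from a First Fit placement (any feasible placement works).
import math
import random

P_IDLE, P_MAX = 100.0, 250.0   # watts per active host (assumed values)

def power(placement, demand, capacity):
    """Total power of all hosts that run at least one VM; empty hosts are off."""
    load = {}
    for vm, host in placement.items():
        load[host] = load.get(host, 0.0) + demand[vm]
    return sum(P_IDLE + (P_MAX - P_IDLE) * (l / capacity) for l in load.values())

def first_fit(demand, capacity, n_hosts):
    """Feasible starting placement: put each VM on the first host that fits."""
    placement, used = {}, [0.0] * n_hosts
    for vm, d in demand.items():
        for h in range(n_hosts):
            if used[h] + d <= capacity:
                placement[vm], used[h] = h, used[h] + d
                break
        else:
            raise ValueError(f"no feasible host for {vm}")
    return placement

def anneal(demand, capacity, n_hosts, t0=50.0, cooling=0.995, steps=20000):
    placement = first_fit(demand, capacity, n_hosts)
    cost, t, vms = power(placement, demand, capacity), t0, list(demand)
    for _ in range(steps):
        t *= cooling
        vm, new_host = random.choice(vms), random.randrange(n_hosts)
        # keep the move only if the target host still has room for this VM
        load = sum(d for v, d in demand.items()
                   if placement[v] == new_host and v != vm)
        if load + demand[vm] > capacity:
            continue
        candidate = dict(placement)
        candidate[vm] = new_host
        new_cost = power(candidate, demand, capacity)
        # accept improvements always, worse moves with Boltzmann probability
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / t):
            placement, cost = candidate, new_cost
    return placement, cost

if __name__ == "__main__":
    demand = {f"vm{i}": random.uniform(0.1, 0.4) for i in range(30)}
    print(anneal(demand, capacity=1.0, n_hosts=30)[1])
```

Starting from the First Fit solution mirrors the point made above: SA only needs some feasible placement to start from and then searches for a lower-power one.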

Relevance: 60.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that puts a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
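For context on the bin-packing view taken above, the sketch below shows the standard First Fit Decreasing heuristic packing map/reduce tasks onto machines, treating each machine as a bin with a fixed resource capacity. This is a generic baseline only; the paper's own heuristic is not described here in enough detail to reproduce, and the task demands and capacity are illustrative values.

```python
# Baseline bin-packing heuristic: First Fit Decreasing over task demands.
from typing import Dict, List

def first_fit_decreasing(task_demand: Dict[str, float],
                         machine_capacity: float) -> List[List[str]]:
    """Assign tasks to as few machines as possible, largest demand first."""
    machines: List[List[str]] = []   # tasks placed on each machine
    residual: List[float] = []       # remaining capacity of each machine
    for task in sorted(task_demand, key=task_demand.get, reverse=True):
        demand = task_demand[task]
        for i, free in enumerate(residual):
            if demand <= free:
                machines[i].append(task)
                residual[i] -= demand
                break
        else:  # no existing machine fits: open a new one
            machines.append([task])
            residual.append(machine_capacity - demand)
    return machines

# Example: six mappers and two reducers with different resource demands.
tasks = {"map0": 0.3, "map1": 0.3, "map2": 0.2, "map3": 0.2,
         "map4": 0.2, "map5": 0.1, "reduce0": 0.5, "reduce1": 0.5}
print(first_fit_decreasing(tasks, machine_capacity=1.0))
```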

Relevance: 60.00%

Abstract:

The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics. Also, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement. The comparison results show that the computation using our mapper/reducer placement is much cheaper while still satisfying the computation deadline.
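To make the cost comparison concrete, the back-of-the-envelope sketch below contrasts a conventional placement that puts a fixed number of tasks on every machine with a placement that packs machines to capacity. All figures (task count, tasks per machine, machine price and runtime) are invented for illustration and are not the paper's benchmark data.

```python
# Illustrative cost comparison: fixed-per-machine placement vs. packed placement.
import math

n_tasks = 200                      # identical map tasks (assumed)
price_per_machine_hour = 0.40      # assumed on-demand price per machine
runtime_hours = 3.0                # assumed job duration, within its deadline

# Conventional placement: always two tasks per machine.
conventional_machines = math.ceil(n_tasks / 2)

# Packed placement: each task needs a quarter of a machine, so four fit.
packed_machines = math.ceil(n_tasks / 4)

for label, m in (("conventional", conventional_machines), ("packed", packed_machines)):
    cost = m * price_per_machine_hour * runtime_hours
    print(f"{label}: {m} machines, ${cost:.2f}")
```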

Relevance: 60.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and also eliminates the inversion operator, which is an essential operator in the original algorithm. The new grouping genetic algorithm is evaluated by experiments, and the experimental results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
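The following is a minimal sketch of a Falkenauer-style grouping genetic algorithm for bin packing, included only to illustrate the group-based encoding the abstract refers to. The paper's own coding scheme and operators differ (notably, it removes the inversion operator), and the capacity, population size and other parameters here are arbitrary assumptions.

```python
# Generic grouping GA sketch: chromosomes are partitions of tasks into machines.
import random

CAP = 1.0  # machine capacity, assumed

def first_fit(items, demand, bins=None):
    """Insert items into existing bins by first fit, opening new bins as needed."""
    bins = [list(b) for b in (bins or [])]
    for it in items:
        for b in bins:
            if sum(demand[i] for i in b) + demand[it] <= CAP:
                b.append(it)
                break
        else:
            bins.append([it])
    return bins

def crossover(parent_a, parent_b, demand):
    """Inject some of B's groups into A, drop clashing groups, repack leftovers."""
    injected = random.sample(parent_b, k=max(1, len(parent_b) // 2))
    taken = {i for g in injected for i in g}
    kept = [g for g in parent_a if not taken & set(g)]
    leftovers = [i for g in parent_a for i in g
                 if i not in taken and not any(i in g2 for g2 in kept)]
    return first_fit(leftovers, demand, kept + injected)

def mutate(bins, demand):
    """Dissolve one random group and reinsert its items by first fit."""
    bins = [list(b) for b in bins]
    victim = bins.pop(random.randrange(len(bins)))
    return first_fit(victim, demand, bins)

def grouping_ga(demand, pop_size=30, generations=200):
    items = list(demand)
    pop = [first_fit(random.sample(items, len(items)), demand)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=len)                       # fewer machines is better
        a, b = random.sample(pop[:pop_size // 2], 2)
        pop[-1] = mutate(crossover(a, b, demand), demand)  # replace the worst
    return min(pop, key=len)

demand = {f"t{i}": random.uniform(0.1, 0.6) for i in range(40)}
print(len(grouping_ga(demand)), "machines used")
```

The key design point of the grouping encoding is that crossover and mutation operate on whole groups (machines) rather than on individual task genes, which keeps offspring feasible and preserves well-filled machines.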

Relevance: 20.00%

Abstract:

Prawns are a substantial Australian resource but presently are processed in a very labour-intensive manner. A prototype system has been developed for automatically grading and packing prawns into single-layer 'consumer packs' in which each prawn is approximately straight and has the same orientation. The novel technology includes a machine vision system that has been specially programmed to calculate relevant parameters at high speed and a gripper mechanism that can acquire, straighten and place prawns of various sizes. The system can be implemented on board a trawler or in an onshore processing facility. © 1993.

Relevance: 20.00%

Abstract:

This book examines public worrying over 'ethnic crime' and what it tells us about Australia today. How, for instance, can the blame for a series of brutal group sexual assaults in Sydney be so widely attributed to whole ethnic communities? How is it that the arrival of a foundering boatload of asylum-seekers mostly seeking refuge from despotic regimes in 'the Middle East' can be manipulated to characterise complete cohorts of applicants for refuge 'and their immigrant compatriots' as dangerous, dishonest, criminally inclined and inhuman? How did the airborne terror attacks on the USA on 11 September 2001 exacerbate existing tendencies in Australia to stereotype Arabs and Muslims as backward, inassimilable, without respect for Western laws and values, and complicit with barbarism and terrorism? Bin Laden in the Suburbs argues that we are witnessing the emergence of the 'Arab Other' as the pre-eminent 'folk devil' of our time. This Arab Other functions in the national imaginary to prop up the project of national belonging. It has little to do with the lived experiences of Arab, Middle Eastern or Muslim Australians, and everything to do with a host of social anxieties which overlap in a series of moral panics. Bin Laden in the Suburbs analyses a decisive moment in the history of multiculturalism in Australia. 'Unlike most migrants, the Arab migrant is a subversive will ... They invade our shores, take over our neighbourhood and rape our women. They are all little bin Ladens and they are everywhere: Explicit bin Ladens and closet bin Ladens; Conscious bin Ladens and unconscious bin Ladens; bin Ladens on the beach and bin Ladens in the suburbs, as this book is aptly titled. Within this register ... even a single Arab is a threat. Contain the Arab or exterminate the Arab? A 'tolerable' presence in the suburbs, or caged in a concentration camp? ... The politics of the Western post-colonial state is constantly and dangerously oscillating between these tendencies today. It is this dangerous oscillation that is so lucidly exposed in this book'.

Relevance: 20.00%

Abstract:

We investigate, using scanning tunnelling microscopy, the adsorption of pentacene on Ni(111) at room temperature and the behaviour of these monolayer films with annealing up to 700 °C. We observe the conversion of pentacene into graphene, which begins at temperatures as low as 220 °C with the coalescence of pentacene molecules into large planar aggregates. Then, by annealing at 350 °C for 20 minutes, these aggregates expand into irregular domains of graphene tens of nanometers in size. On surfaces where graphene and nickel carbide coexist, pentacene shows preferential adsorption on the nickel carbide phase. The same pentacene-to-graphene transformation was also achieved on Cu(111), but at a higher activation temperature, producing large graphene domains that exhibit a range of moiré superlattice periodicities.

Relevance: 10.00%

Abstract:

The role of sustainability in urban design is becoming increasingly important as Australia’s cities continue to grow, putting pressure on existing infrastructure such as water, energy and transport. To optimise an urban design, many different aspects such as water, energy, transport and cost need to be taken into account in an integrated way. Integrated software applications that assess urban designs on a large variety of aspects are scarce. With the upcoming next generation of the Internet, often referred to as the Semantic Web, data can be made more machine-interpretable by developing ontologies that support the development of integrated software systems. Software systems can use these ontologies to perform intelligent tasks such as assessing an urban design on a particular aspect. When the ontologies of different applications are aligned, the applications can share information, resulting in interoperability. Inference, such as compliance checking and classification, can support the alignment of the ontologies. A proof-of-concept implementation has been made to demonstrate and validate the usefulness of machine-interpretable ontologies for urban design.

Relevance: 10.00%

Abstract:

This study aimed to identify the aptitudes required of allied health professionals working in three different service delivery models serving remote locations in northern tropical Australia. Eighteen allied health professionals, from dietetics, diabetes education, occupational therapy, physiotherapy, psychology, podiatry, social work and speech pathology, participated in this exploratory study using a narrative approach. A range of aptitudes were identified and themed under the following headings: (1) being organized but flexible, (2) cooperation and mediation, (3) culturally aware and accepting communicators, (4) knowing the community, (5) resourcefulness and resilience, and (6) reflectivity. Limiting factors were also identified. Three of the themes are discussed in this paper. The study found that allied health professionals working in remote settings identified personal attributes as important that are not necessarily valued in metropolitan settings. Recruitment processes and education programs need to recognize the importance of personal attributes as well as professional skills.

Relevance: 10.00%

Abstract:

Research on social networking sites like Facebook is emerging but sparse. This exploratory study investigates the value users derive from self-described ‘cool’ Facebook applications, and explores the features that either encourage or discourage users from recommending applications to their friends. The concepts of value and cool are explored in a social networking context. Our qualitative data reveal that consumers derive a combination of functional value along with either social or emotional value from the applications. Female Facebook users indicate self-expression as an important motivator, while males tend to use Facebook applications to compete socially. Three broad categories of application features emerged: symmetrical features, which can either encourage or discourage recommendation; polar features, where different levels of the same feature encourage or discourage it; and uni-directional features, which only encourage or only discourage, but not both. Recommending or not recommending an application tends to be the result of a combination of features and context, rather than of one feature in isolation.

Relevance: 10.00%

Abstract:

The challenges of maintaining a building such as the Sydney Opera House are immense and depend upon a vast array of information. The value of information can be enhanced by its currency, its accessibility and the ability to correlate data sets (integration of information sources). A building information model correlated with various information sources related to the facility is used as the definition of a digital facility model. Such a digital facility model would give transparent, integrated access to an array of datasets and would clearly support Facility Management processes. In order to construct such a digital facility model, two state-of-the-art information and communication technologies are considered: an internationally standardized building information model called the Industry Foundation Classes (IFC) and a variety of advanced communication and integration technologies often referred to as the Semantic Web, such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). This paper reports on some technical aspects of developing a digital facility model focusing on the Sydney Opera House. The proposed digital facility model enables IFC data to participate in an ontology-driven, service-oriented software environment. A proof-of-concept prototype has been developed demonstrating the usability of IFC information in collaborating with Sydney Opera House’s specific data sources using Semantic Web ontologies.
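As a small illustration of the kind of linkage described above, the sketch below uses the rdflib Python library to express one building element and one maintenance record as RDF triples and connect them. The namespaces, property names and identifiers are invented for illustration; they are not the Sydney Opera House model, the IFC schema or the ifcOWL vocabulary.

```python
# Minimal sketch: correlating a building-model entity with a facility-management
# record as RDF, so both can be queried through one ontology-driven interface.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

IFC = Namespace("http://example.org/ifc#")   # stand-in for an IFC/ifcOWL namespace
FM = Namespace("http://example.org/fm#")     # stand-in facility-management vocabulary

g = Graph()
g.bind("ifc", IFC)
g.bind("fm", FM)

# One space taken from the building information model (identifiers made up).
g.add((IFC.Space_0042, RDF.type, IFC.IfcSpace))
g.add((IFC.Space_0042, IFC.globalId, Literal("2O2Fr-t4X7Zf8NOew3FL0B")))

# ...correlated with a record from a separate maintenance data source.
g.add((FM.WorkOrder_981, RDF.type, FM.WorkOrder))
g.add((FM.WorkOrder_981, FM.concernsSpace, IFC.Space_0042))
g.add((FM.WorkOrder_981, FM.status, Literal("open")))

print(g.serialize(format="turtle"))
```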

Relevance: 10.00%

Abstract:

The paper seeks to continue the debate about the need for professionals in the library and information services (LIS) sector to continually engage in career-long learning to sustain and develop their knowledge and skills in a dynamic industry. Aims: The neXus2 workforce study was funded by ALIA and the consortium of National and State Libraries Australasia (NSLA). It builds on earlier research (the neXus census) that looked at the demographic, educational and career perspectives of individual library and information professionals, to critically examine institutional policies and practices associated with the LIS workforce. The research aims to develop a clearer understanding of the issues impacting on workforce sustainability, workforce capability and workforce optimisation. Methods: The research methodology involved an extensive online survey conducted in March 2008 which collected data on organisational and general staffing; recruitment and retention; staff development and continuing professional education; and succession planning. Encouragement to participate was provided by key industry groups, including academic, public, health, law and government library and information agencies, with the result that around 150 institutions completed the questionnaire. Results: The paper will specifically discuss the research findings relating to training and professional development: measuring the scope and distribution of training activities across the workforce, considering the interrelationship between the strategic and operational dimensions of staff development in individual institutions, and analysing the common and distinctive factors evident in the different sectors of the profession. Conclusion: The neXus2 project has successfully engaged LIS institutions in the collection of complex industry data that is relevant to future education and workforce strategies for all areas of the profession. Cross-sector forums such as Information Online 2009 offer the opportunity for stimulating professional dialogue on these key issues.