18 results for biophysical throughput

in Deakin Research Online - Australia


Relevance:

60.00%

Publisher:

Abstract:

This paper proposes a framework to evaluate post-growth policy instruments which gauges their capacity to lessen the pressure for growth emanating from the labour market and the state’s contradictory legitimisation and accumulation imperatives, whilst increasing societal well-being and reducing the biophysical throughput of the economy. It is argued that the most effective policies to do this are measures to reduce average working hours, expand low-productivity sectors and reduce inequality. Specific policy instruments include public sector expansion and the promotion of cooperatives, the introduction of citizens’ basic income schemes, environmental tax reform, the abolition of fossil fuel subsidies, reforms to monetary policy, financial regulatory reform and the introduction of alternative measures of progress to gross domestic product.

Relevance:

20.00%

Publisher:

Abstract:

The growing computational power requirements of grand challenge applications have promoted the need for merging high throughput computing and grid computing principles to harness computational resources distributed across multiple organisations. This paper identifies the issues in resource management and scheduling in the emerging high throughput grid computing context. We also survey and study the performance of several space-sharing and time-sharing opportunistic scheduling policies that have been developed for high throughput computing.
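The opportunistic scheduling the abstract surveys can be illustrated with a simplified matchmaking step in the style of high throughput computing systems such as Condor: queued jobs are placed on currently idle machines whose resources satisfy the job's requirements. This is a minimal sketch, not any specific policy from the paper; the field names (`mem`, `idle`) are illustrative assumptions.

```python
def matchmake(jobs, machines):
    """Opportunistic matchmaking sketch (simplified): each queued job is
    placed on the first idle machine that satisfies its memory requirement;
    jobs with no match stay in the queue."""
    assigned, waiting = {}, []
    idle = [m for m in machines if m["idle"]]
    for job in jobs:
        match = next((m for m in idle if m["mem"] >= job["mem"]), None)
        if match:
            assigned[job["id"]] = match["name"]
            idle.remove(match)  # machine is now busy
        else:
            waiting.append(job["id"])
    return assigned, waiting
```

A real opportunistic scheduler would additionally revoke (preempt) jobs when a machine's owner reclaims it, which is where the space-sharing versus time-sharing distinction becomes relevant.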

Relevance:

20.00%

Publisher:

Abstract:

Biophysical investigations of estuaries require a diversity of tasks to be undertaken by a number of disciplines, leading to a range of data requirements and dataflow pathways. Technology changes relating to data collection and storage have led to the need for metadata systems that describe the vast amounts of data now able to be stored electronically. Such a system is described as the first step in the creation of an efficient data management system for biophysical estuarine data.

Relevance:

20.00%

Publisher:

Abstract:

Network coding has shown the promise of significant throughput improvement. In this paper, we study the throughput of two-hop wireless network coding and explore how the maximum throughput can be achieved under a random medium access scheme. Unlike previous studies, we consider a more practical network where the structure of overhearing status between the intended receivers and the transmitters is arbitrary. We present a formal analysis of the network throughput using network coding, based on the concept of network coding cliques (NCCs). The analysis shows that the maximum normalized throughput, subject to a fairness requirement, is n/(n+m), where n is the number of transmitters and m is the number of NCCs in a 2-hop wireless network. We have also found that this maximum throughput can be achieved under a random medium access scheme when the medium access priority of the relay node is equal to the number of NCCs in the network. Our theoretical findings have been validated by simulation as well.
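The stated bound can be captured directly in code. The sketch below simply evaluates the abstract's closed-form expression n/(n+m); the function name and input validation are our additions, not from the paper.

```python
def nc_max_throughput(n, m):
    """Maximum normalized throughput of a 2-hop network-coded wireless
    network with n transmitters and m network coding cliques (NCCs),
    subject to a fairness requirement: n / (n + m).  Per the abstract,
    this is achieved when the relay's medium access priority equals m."""
    if n <= 0 or m <= 0:
        raise ValueError("need at least one transmitter and one NCC")
    return n / (n + m)
```

Note that for fixed n the bound decreases as the number of NCCs grows, since each clique adds a competing coded transmission at the relay.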

Relevance:

20.00%

Publisher:

Abstract:

The scheduling of metal to different casters in a casthouse is a complicated problem, attempting to find the balance between pot-line, crucible carrier, furnace and casting machine capacity. In this paper, a description will be given of a casthouse modelling system designed to test different scenarios for casthouse design and operation. Using discrete-event simulation, the casthouse model incorporates variable arrival times of metal carriers, crucible movements, caster operation and furnace conditions. Each part of the system is individually modelled and synchronised using a series of signals or semaphores. In addition, an easy-to-operate user interface allows for the modification of key parameters and analysis of model output. Results from the model will be presented for a case study, which highlights the effect different parameters have on overall casthouse performance. The case study uses past production data from a casthouse to validate the model outputs, with the aim of performing a sensitivity analysis on the overall system. Along with metal preparation times and caster strip-down/setup, the temperature evolution within the furnaces is one key parameter in determining casthouse performance.
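The core discrete-event mechanism behind such a model can be sketched in a few lines: events are kept in a time-ordered queue, crucibles arrive at variable intervals, and each queues for a casting machine. This is a deliberately minimal illustration of the technique, not the paper's model; all parameter names and the single-caster simplification are our assumptions.

```python
import heapq
import random

def simulate_casthouse(n_crucibles=20, mean_arrival=30.0, cast_time=25.0, seed=42):
    """Minimal discrete-event sketch: crucibles arrive at exponentially
    distributed intervals and are processed FIFO by one casting machine.
    Returns (number cast, makespan)."""
    rng = random.Random(seed)
    events = []  # min-heap of (time, kind)
    t = 0.0
    for _ in range(n_crucibles):
        t += rng.expovariate(1.0 / mean_arrival)  # variable arrival times
        heapq.heappush(events, (t, "arrive"))
    caster_free_at = 0.0
    finish_times = []
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrive":
            start = max(now, caster_free_at)  # wait if the caster is busy
            caster_free_at = start + cast_time
            finish_times.append(caster_free_at)
    return len(finish_times), finish_times[-1]
```

A fuller model in the paper's spirit would add furnaces, crucible carriers and caster strip-down/setup as separate processes synchronised by semaphores, which is where discrete-event frameworks earn their keep.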

Relevance:

20.00%

Publisher:

Abstract:

Expressed Sequence Tags (ESTs) are short DNA sequences generated by sequencing cDNAs derived from expressed genes. They can provide significant functional, structural and evolutionary information and thus are a primary resource for gene discovery. EST annotation basically refers to the analysis of unknown ESTs, which can be performed by database similarity search for possible identities and database search for functional prediction of translation products. This kind of annotation typically consists of a series of repetitive tasks that should be automated, customizable and amenable to using distributed computing resources. Furthermore, processing of EST data should be done efficiently using a high performance computing platform. In this paper, we describe an EST annotator, EST-PACHPC, which has been developed for harnessing HPC resources, potentially from Grid and Cloud systems, for high throughput EST annotations. The performance analysis of EST-PACHPC has shown that it provides substantial performance gain in EST annotation.
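The repetitive, embarrassingly parallel nature of EST annotation is what makes it amenable to distributed resources. A minimal sketch of the pattern, assuming a placeholder `annotate_est` function in place of a real database similarity search (e.g. a BLAST invocation), and having nothing to do with EST-PACHPC's actual internals:

```python
from concurrent.futures import ThreadPoolExecutor

def annotate_est(seq):
    """Placeholder annotation task: a real pipeline would run a database
    similarity search here; we just report length and GC content."""
    gc = sum(seq.count(base) for base in "GC") / len(seq)
    return {"length": len(seq), "gc": round(gc, 3)}

def annotate_batch(seqs, workers=4):
    """Fan the independent per-sequence tasks out across workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(annotate_est, seqs))
```

On a grid or cloud platform the same fan-out is done across nodes rather than threads, but the structure (independent tasks, scatter, gather) is the same.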

Relevance:

20.00%

Publisher:

Abstract:

Cooperative communication (CC) offers an efficient and low-cost way to achieve spatial diversity by forming a virtual antenna array among single-antenna nodes that cooperatively share their antennas. It has been well recognized that the selection of relay nodes plays a critical role in the performance of CC. Most existing relay selection strategies focus on optimizing the outage probability or energy consumption. To fill the gap in research on throughput improvement via CC, we study the relay selection problem with the objective of optimizing the throughput in this paper. For unicast, the problem is in P, and an optimal relay selection algorithm is provided with a correctness proof. For broadcast, we show the challenge of relay selection by proving it NP-hard. A greedy heuristic algorithm is proposed to effectively choose a set of relay nodes that maximize the broadcast throughput. Simulation results show that the proposed algorithms can achieve high throughput under various network settings.
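The general shape of a greedy heuristic for NP-hard relay-set selection can be sketched as a set-cover-style loop: repeatedly pick the relay that reaches the most still-uncovered receivers. This is an illustrative pattern only; the paper's heuristic optimizes broadcast throughput, not raw coverage, so the objective here is a stand-in.

```python
def greedy_relay_selection(coverage, receivers):
    """Greedy sketch: `coverage` maps each candidate relay to the set of
    receivers it can reach.  Repeatedly select the relay covering the most
    uncovered receivers until all are covered or no relay helps."""
    uncovered = set(receivers)
    chosen = []
    while uncovered:
        best = max(coverage, key=lambda r: len(coverage[r] & uncovered),
                   default=None)
        if best is None or not coverage[best] & uncovered:
            break  # remaining receivers unreachable via any candidate relay
        chosen.append(best)
        uncovered -= coverage[best]
    return chosen, uncovered
```

Greedy selection of this kind carries a well-known logarithmic approximation guarantee for set cover, which is one reason it is a natural heuristic template for NP-hard selection problems.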

Relevance:

20.00%

Publisher:

Abstract:

Network coding has shown the promise of significant throughput improvement. In this paper, we study the network throughput using network coding and explore how the maximum throughput can be achieved in a two-way relay wireless network. Unlike previous studies, we consider a more general network with an arbitrary structure of overhearing status between receivers and transmitters. To efficiently utilize the coding opportunities, we introduce the concept of network coding cliques (NCCs), upon which a formal analysis of the network throughput using network coding is elaborated. In particular, we derive the closed-form expression of the network throughput under certain traffic load in a slotted ALOHA network with basic medium access control. Furthermore, the maximum throughput as well as the optimal medium access probability at each node is studied under various network settings. Our theoretical findings have been validated by simulation as well.
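As background for the slotted ALOHA setting, the classic per-slot success probability makes the trade-off behind an "optimal medium access probability" concrete: with n nodes each transmitting independently with probability p, a slot succeeds when exactly one node transmits. The paper's closed form additionally accounts for NCCs and traffic load; this is only the textbook baseline.

```python
def aloha_success_prob(n, p):
    """Probability that exactly one of n nodes transmits in a slot of a
    basic slotted ALOHA network, each node transmitting independently
    with probability p: n * p * (1 - p)**(n - 1).  Maximized at p = 1/n."""
    return n * p * (1 - p) ** (n - 1)
```

As n grows, the maximum at p = 1/n tends to 1/e ≈ 0.368, the familiar throughput ceiling of slotted ALOHA without coding; network coding raises the effective ceiling by letting the relay serve several flows in one coded transmission.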