922 results for Household resource allocation


Relevance: 90.00%

Abstract:

This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic scheduling problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns assigning time and resources to a set of activities that are to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-Flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for computing a maximum-throughput mapping of applications specified as SDF graphs onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a constraint programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
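
As a rough illustration of the modular view of cyclic precedences mentioned above (generic cyclic-scheduling notation, assumed for illustration rather than taken from the work itself), a precedence between activities i and j in a schedule repeated with period \lambda can be written as

\[
  s_j \;\ge\; s_i + d_i - \delta_{ij}\,\lambda,
\]

where s_i is the start time of activity i within one iteration, d_i its duration, \lambda the period and \delta_{ij} the number of iterations separating the two occurrences. When both \delta_{ij} and \lambda are decision variables, their product makes the model non-linear, which is why many traditional approaches fix the period and search over its values in a generate-and-test fashion.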

Relevance: 90.00%

Abstract:

Dynamic spectrum access (DSA) aims at exploiting spectral opportunities in both the time and frequency domains at any given location, opportunities which arise due to variations in spectrum usage. Recently, cognitive radios (CRs) have been proposed as a means of implementing DSA. In this work we focus on resource management in overlaid cognitive radio networks (CRNs). We formulate resource allocation strategies for CRNs as mathematical optimization problems, focusing on two key problems in resource management: sum rate maximization and maximization of the number of admitted users. Since both problems are NP-hard due to the presence of binary assignment variables, we propose novel graph-based algorithms to solve them optimally. Further, we analyze the impact of location awareness on the network performance of CRNs by considering three cases: full location awareness, partial location awareness and no location awareness. Our results clearly show that location awareness has a significant impact on the performance of overlaid CRNs and leads to an increase in spectrum utilization efficiency.
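
A generic sketch of how a sum-rate maximization of this kind is often cast (the notation below is assumed for illustration and is not taken from the work itself): with a binary variable x_{ij} assigning secondary user i to channel j,

\[
  \max_{x}\; \sum_{i}\sum_{j} x_{ij}\,\log_2\!\bigl(1+\gamma_{ij}\bigr)
  \quad \text{s.t.} \quad
  \sum_{j} x_{ij} \le 1 \;\;\forall i, \qquad
  \sum_{i} x_{ij} \le 1 \;\;\forall j, \qquad
  x_{ij} \in \{0,1\},
\]

where \gamma_{ij} is the SINR of secondary user i on channel j; a complete model would add power constraints and interference-protection constraints for primary users. The binary assignment variables x_{ij} are the source of the combinatorial hardness referred to above.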

Relevance: 90.00%

Abstract:

According to climate models, drier summers are to be expected more frequently in Central Europe during the coming decades, which may influence plant performance and competition in grassland. The overall source–sink relations in plants, especially the allocation of solutes to above- and below-ground parts, may be affected by drought. To investigate solute export from a given leaf of broadleaf dock, a solution containing 57Co and 65Zn was introduced through a leaf flap. The export from this leaf was detected by analysing radionuclide contents in various plant parts. Less label was allocated to new leaves and more to roots under drought. The observed alterations of source–sink relations in broadleaf dock were reversible during a subsequent short period of rewatering. These findings suggest an increased resource allocation to roots under drought, improving the functionality of the plants.

Relevance: 90.00%

Abstract:

Performing tasks that are perceived as unnecessary or unreasonable (illegitimate tasks) represents a relatively new stressor concept referring to assignments that violate the norms associated with the role requirements of professional work. Research has shown that illegitimate tasks are associated with stress and counterproductive work behaviour. The purpose of this study was to provide insight into how characteristics of the organization contribute to the prevalence of illegitimate tasks in the work of frontline and middle managers. Using the Bern Illegitimate Task Scale (BITS) in a sample of 440 local government operations managers in 28 different organizations in Sweden, this study supports the theoretical assumptions that illegitimate tasks are positively related to stress and negatively related to satisfaction with work performance. Results further show that 10% of the variance in illegitimate tasks can be attributed to the organization where the managers work. Multilevel referential analysis showed that the more an organization was characterized by competition for resources between units, unfair and arbitrary resource allocation, and an obscure decision-making structure, the more illegitimate tasks its managers reported. These results should be valuable for strategic-level management, since they indicate that illegitimate tasks can be counteracted through the way work is organized.

Relevance: 90.00%

Abstract:

Most commercial project management software packages include planning methods to devise schedules for resource-constrained projects. Since it is proprietary information of the software vendors which planning methods are implemented, the question arises how the packages differ in quality with respect to their resource-allocation capabilities. We experimentally evaluate the resource-allocation capabilities of eight recent software packages using 1,560 instances with 30, 60, and 120 activities from the well-known PSPLIB library. In some of the analyzed packages, the user may influence the resource allocation by means of multi-level priority rules, whereas in other packages only a few options can be chosen. We study the impact of various complexity parameters and priority rules on the project duration obtained by the software packages. The results indicate that the resource-allocation capabilities of these packages differ significantly. In general, the relative gap between the packages grows with increasing resource scarcity and with an increasing number of activities. Moreover, the selection of the priority rule has a considerable impact on the project duration. Surprisingly, when a priority rule is selected in the packages that allow it, both the mean and the variance of the project duration are in general worse than for the packages that do not offer the selection of a priority rule.
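
For readers unfamiliar with priority rules in this context, the following minimal Python sketch (illustrative only; it is not code from the study or from any of the evaluated packages) shows a serial schedule-generation scheme driven by a single priority rule, which is the kind of mechanism such packages expose:

def serial_sgs(durations, demands, capacity, preds, priority):
    """Serial schedule-generation scheme for one renewable resource.
    durations[j], demands[j]: duration and per-period resource demand of activity j;
    capacity: resource capacity; preds[j]: set of predecessors of j;
    priority[j]: priority-rule value (lower value = scheduled earlier)."""
    start = {}
    usage = {}  # usage[t] = resource units consumed in period t
    unscheduled = set(range(len(durations)))
    while unscheduled:
        # activities whose predecessors are all scheduled are eligible
        eligible = [j for j in unscheduled if preds[j].issubset(start)]
        j = min(eligible, key=lambda a: priority[a])  # apply the priority rule
        t = max((start[p] + durations[p] for p in preds[j]), default=0)
        # shift right until the resource profile can accommodate the activity
        while any(usage.get(t + k, 0) + demands[j] > capacity
                  for k in range(durations[j])):
            t += 1
        for k in range(durations[j]):
            usage[t + k] = usage.get(t + k, 0) + demands[j]
        start[j] = t
        unscheduled.remove(j)
    return start

# Three activities on one resource of capacity 4; activity 2 must wait for 0 and 1.
print(serial_sgs(durations=[3, 2, 4], demands=[2, 2, 3], capacity=4,
                 preds=[set(), set(), {0, 1}], priority=[2, 1, 3]))
# -> {1: 0, 0: 0, 2: 3}

Different priority vectors generally yield different project durations, which is exactly the effect the experiment above measures.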

Relevance: 90.00%

Abstract:

Executing the activities of a project requires one or several resources, which are in general scarce. Many resource-allocation methods assume that an activity's usage of these resources is constant during execution; in practice, however, the project manager may vary the resource usage of individual activities over time within prescribed bounds. This flexibility gives rise to the project scheduling problem of allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated to each activity equals its prescribed work content, and precedence and various work-content-related constraints are met.
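
A stylized formulation of this problem (assumed generic notation, not the authors' exact model) reads

\[
  \min\; S_{n+1}
  \quad \text{s.t.} \quad
  \sum_{t} r_j(t) = w_j \;\;\forall j, \qquad
  \sum_{j} r_j(t) \le R \;\;\forall t, \qquad
  r_j(t) \in \{0\} \cup [\underline{r}_j, \overline{r}_j],
\]

where S_{n+1} denotes the project completion time, w_j the prescribed work content of activity j, R the capacity of the renewable resource and r_j(t) the time-varying resource usage of activity j in period t, bounded between a minimum and a maximum usage while the activity is in execution; precedence constraints on the activities' start and completion times complete the model.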

Relevance: 90.00%

Abstract:

Due to the huge increase in digital data volumes in recent years, a new parallel computing paradigm for processing big data efficiently has arisen. Many of the systems based on this paradigm, also called data-intensive computing systems, follow Google's MapReduce programming model. The main advantage of MapReduce systems is the idea of sending the computation to where the data resides, aiming to provide scalability and efficiency. In failure-free scenarios, these frameworks usually achieve good results. However, most scenarios in which they are deployed are characterized by the presence of failures, so these platforms incorporate fault-tolerance and dependability techniques as built-in features. On the other hand, dependability improvements are known to imply additional resource costs. This is reasonable, and providers offering these infrastructures are aware of it. Nevertheless, not all approaches provide the same trade-off between fault-tolerance capabilities (or, more generally, reliability capabilities) and cost. This thesis addresses the coexistence of reliability and resource efficiency in MapReduce-based systems through methodologies that introduce minimal cost while guaranteeing an appropriate level of reliability. To achieve this, we propose: (i) a formalization of a failure detector abstraction; (ii) an alternative solution to the single points of failure of these frameworks; and (iii) a novel feedback-based resource allocation system at the container level. These generic contributions have been instantiated for the Hadoop YARN architecture, which is nowadays the reference framework in the data-intensive computing systems community. The thesis demonstrates how all of these contributions outperform Hadoop YARN in terms of both reliability and resource efficiency.
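
As an illustration only (the controller form and all parameters below are assumptions, not the design proposed in the thesis), feedback-based resource allocation at container granularity can be as simple as a proportional controller that resizes a container pool to track a utilization target:

def adjust_containers(current, observed_utilization, target=0.7,
                      gain=0.5, min_containers=1, max_containers=64):
    """Proportional controller: grow or shrink the container pool so that the
    observed utilization tracks the target (all values are illustrative)."""
    error = observed_utilization - target
    delta = round(gain * error * current)  # positive error -> allocate more containers
    return max(min_containers, min(max_containers, current + delta))

# Example: 10 containers running at 90% utilization are scaled up to 11.
print(adjust_containers(10, 0.9))  # 11

A real scheduler would combine such feedback with admission control and per-application fairness; the point here is only the closed loop between observed usage and the allocation decision.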

Relevance: 90.00%

Abstract:

This paper presents a theoretical model for the analysis of decisions regarding farm household labour allocation. The agricultural household model is selected as the most appropriate theoretical framework: a model based on the assumption that households behave so as to maximise utility, which is a function of consumption and leisure, subject to time and budget constraints. The model can be used to describe the role of government subsidies in farm household labour allocation decisions; in particular, the impact of decoupled subsidies on labour allocation can be examined. Decoupled subsidies are a labour-free payment and as such represent an increase in labour-free income or wealth. An increase in wealth allows farm households to work less while maintaining consumption. On the other hand, decoupled subsidies represent a decline in the return to farm labour and may lead to a substitution effect, i.e. farmers may choose to substitute non-farm work for farm work. The theoretical framework proposed in this paper allows these two conflicting effects to be examined.
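
A stylized version of such a model (with generic symbols assumed for illustration rather than taken from the paper) makes the two effects explicit:

\[
  \max_{c,\,l,\,t_f,\,t_o}\; U(c, l)
  \quad \text{s.t.} \quad
  c = \pi(t_f) + w\,t_o + D, \qquad t_f + t_o + l = T,
\]

where c is consumption, l leisure, t_f and t_o farm and off-farm labour time, \pi(t_f) farm profit, w the off-farm wage, D the decoupled payment and T the total time endowment. Because D enters the budget constraint as labour-free income, a larger D raises the demand for leisure (the wealth effect); if decoupling simultaneously lowers the marginal return to farm labour \pi'(t_f), the household substitutes off-farm work or leisure for farm work (the substitution effect).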

Relevance: 90.00%

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance: 90.00%

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance: 90.00%

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance: 90.00%

Abstract:

This study examined whether the effectiveness of human resource management (HRM) practices is contingent on organizational climate and competitive strategy. The concepts of internal and external fit suggest that the positive relationship between HRM and subsequent productivity will be stronger for firms with a positive organizational climate and for firms using differentiation strategies. Resource allocation theories of motivation, on the other hand, predict that the relationship between HRM and productivity will be stronger for firms with a poor climate, because employees working in these firms should have the greatest amount of spare capacity. The results supported the resource allocation argument.

Relevance: 90.00%

Abstract:

This study examined whether the effectiveness of human resource management (HRM) practices is contingent on organizational climate and competitive strategy. The concepts of internal and external fit suggest that the positive relationship between HRM and subsequent productivity will be stronger for firms with a positive organizational climate and for firms using differentiation strategies. Resource allocation theories of motivation, on the other hand, predict that the relationship between HRM and productivity will be stronger for firms with a poor climate because employees working in these firms should have the greatest amount of spare capacity. The results supported the resource allocation argument. © 2005 Southern Management Association. All rights reserved.

Relevance: 90.00%

Abstract:

In future massively distributed service-based computational systems, resources will span many locations, organisations and platforms. In such systems, the ability to allocate resources in a desired configuration, in a scalable and robust manner, will be essential. We build upon a previous evolutionary market-based approach to achieving resource allocation in decentralised systems by considering heterogeneous providers. In such scenarios, providers may be said to value their resources differently. We demonstrate how, given such valuations, the outcome allocation may be predicted. Furthermore, we describe how the approach may be used to achieve a stable, uneven load balance of our choosing. We analyse the system's expected behaviour and validate our predictions in simulation. Our approach is fully decentralised; no part of the system is weaker than any other. No cooperation between nodes is assumed; only self-interest is relied upon. A particular desired allocation is achieved transparently to users, as no modification to the buyers is required.
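
As a toy illustration only (this is not the mechanism of the cited approach; the pricing rule and parameters are assumptions), heterogeneous valuations in a simple posted-price market already produce an uneven but predictable load split when buyers are purely self-interested:

def simulate(valuations, jobs=10_000):
    """Each provider prices its resource in proportion to its own valuation and
    its current load; every buyer simply picks the cheapest offer."""
    load = [0] * len(valuations)
    for _ in range(jobs):
        prices = [v * (1 + load[i]) for i, v in enumerate(valuations)]
        load[min(range(len(prices)), key=prices.__getitem__)] += 1
    return [x / jobs for x in load]

# Two providers, the second valuing its resources twice as highly:
print(simulate([1.0, 2.0]))  # roughly [0.67, 0.33], i.e. load inversely proportional to valuation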

Relevance: 90.00%

Abstract:

Purpose – The purpose of this research is to study the perceived impact of certain factors on the resource allocation processes of Nigerian universities and to suggest a framework that will help practitioners and academics understand and improve such processes.

Design/methodology/approach – The study adopted an interpretive qualitative approach aimed at an 'in-depth' understanding of the resource allocation experiences of key university personnel and their perceptions of the contextual factors affecting such processes. The analysis of individual narratives from each university established the conditions and factors impacting the resource allocation processes within each institution.

Findings – The resource allocation process issues in Nigerian universities may be categorised into people (core and peripheral units' challenges, and politics and power); process (resource allocation processes); and resources (critical financial shortage and resource dependence response). The study also provides the insight that resourcing efficiency in Nigerian universities appears strongly constrained by rivalry among the resource managers. The efficient resource allocation process (ERAP) model is proposed to resolve the identified resourcing deficiencies.

Research limitations/implications – The research does not aim to provide generalizable observations but rather an 'in-depth' account of perceived factors and their impact on the resource allocation processes in Nigerian universities. The study is limited to internal resource allocation issues within the universities and excludes external funding factors. The resource managers' responses to the identified factors may affect their internal resourcing efficiency. Further research using larger empirical samples is required to obtain more widespread results and draw implications for all universities.

Originality/value – This study contributes a fresh literature framework for resource allocation processes focusing on 'people', 'process' and 'resources'. A middle-range theory triangulation is also developed to support a better understanding of resourcing process management. The study will be of interest to university managers and policy makers.