61 results for resource matching


Relevance: 20.00%

Abstract:

Different methods have been proposed to dynamically provide execution environments for scientific applications while hiding the complexity of the underlying distributed and heterogeneous infrastructures. Recently, virtualization has emerged as a promising technology to provide such environments. Virtualization abstracts away the details of physical hardware and provides virtualized resources to high-level scientific applications. It offers a cost-effective and flexible way to use and manage computing resources, and such an abstraction is appealing in Grid computing and Cloud computing for better matching jobs (applications) to computational resources. This work applies the virtualization concept to the Condor dynamic resource management system, using the Condor Virtual Universe to harvest existing virtual computing resources to their maximum utility. It allows computing resources to be provisioned dynamically at run-time by users, based on application requirements, rather than statically at design-time, thereby laying the basis for efficient use of the available resources.
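
As a rough illustration of the requirement-driven matching described above, the following Python sketch pairs job requirements with advertised virtual-machine resources. The Job and Resource classes and the match() function are hypothetical stand-ins and do not reflect Condor's actual ClassAd matchmaking mechanism.

# Minimal sketch (not Condor's matchmaker): pair job requirements with
# advertised virtual-machine resources; all names are illustrative.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    cpus: int
    memory_mb: int

@dataclass
class Job:
    name: str
    req_cpus: int
    req_memory_mb: int

def match(job, resources):
    """Return the first resource that satisfies the job's requirements."""
    for r in resources:
        if r.cpus >= job.req_cpus and r.memory_mb >= job.req_memory_mb:
            return r
    return None  # no suitable resource; the job stays queued

pool = [Resource("vm-small", 2, 2048), Resource("vm-large", 8, 16384)]
print(match(Job("render", 4, 8192), pool).name)  # -> vm-large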

Relevance: 20.00%

Abstract:

This is an introductory session focusing on the most general aspects of the new standard: the conceptual framework on which RDA is based (the FRBR and FRAD models), its objectives, the organization of the rules, the new vocabulary, the continuity with AACR2 and the points of divergence, the benefits expected from the new code, etc. Its main goal is to provide a first introduction to the rules in order to make cataloguers aware of the changes that will take place in the medium term.

Relevance: 20.00%

Abstract:

A parts-based model is a parametrization of an object class using a collection of landmarks that follow the object structure. Matching parts-based models is one of the problems where pairwise Conditional Random Fields have been successfully applied. The main reason for their effectiveness is tractable inference and learning, due to the simplicity of the graphs involved, usually trees. However, these models do not consider possible statistical patterns among sets of landmarks, and thus they suffer from using overly myopic information. To overcome this limitation, we propose a novel structure based on hierarchical Conditional Random Fields, which we explain in the first part of this report. We build a hierarchy of combinations of landmarks, where matching is performed taking the whole hierarchy into account. To preserve tractable inference we effectively sample the label set. We test our method on facial feature selection and human pose estimation on two challenging datasets: Buffy and MultiPIE. In the second part of this report, we present a novel approach to multiple kernel combination that relies on stacked classification. This method can be used to evaluate the landmarks of the parts-based model approach. Our method is based on combining the responses of a set of independent classifiers, one for each individual kernel. Unlike earlier approaches that linearly combine kernel responses, our approach uses them as inputs to another set of classifiers. We show that we outperform state-of-the-art methods on most of the standard benchmark datasets.
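
The stacked kernel combination idea can be sketched as follows in Python (a minimal sketch assuming scikit-learn is available; the kernels, data and second-level classifier are placeholders, not the paper's actual setup): one classifier is trained per kernel, and their decision values, rather than a linear combination of kernel responses, become the inputs of a second-level classifier.

# One SVM per kernel; their responses feed a second-level classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_base, X_stack, y_base, y_stack = train_test_split(X, y, test_size=0.5, random_state=0)

kernels = ["linear", "rbf", "poly"]          # stand-ins for the individual kernels
base = [SVC(kernel=k).fit(X_base, y_base) for k in kernels]

# Responses of the independent per-kernel classifiers become stacked features.
Z = np.column_stack([clf.decision_function(X_stack) for clf in base])
meta = LogisticRegression().fit(Z, y_stack)
print("stacked accuracy:", meta.score(Z, y_stack))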

Relevance: 20.00%

Abstract:

The aim of this final degree project (TFC) is to develop an application that allows, on the one hand, the definition of an offer of resources; on the other hand, for user-consumers to be able to sign up for those offers; and, finally,

Relevance: 20.00%

Abstract:

In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QoS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QoS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning to each a VP with dedicated resources (buffers and links). However, resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
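
The convolution idea can be illustrated with a short Python sketch (illustrative only, not the paper's exact method or traffic model): each on/off source needs its peak rate with some activity probability, the per-connection distributions are convolved to obtain the aggregate demand distribution, and the congestion probability is the mass above the link capacity.

# Convolve per-connection bandwidth distributions; PC = P(demand > capacity).
import numpy as np

def aggregate_demand(sources):
    """sources: list of (peak_rate_in_units, activity_probability)."""
    dist = np.array([1.0])                      # P(total demand = 0) = 1
    for peak, p in sources:
        single = np.zeros(peak + 1)
        single[0], single[peak] = 1 - p, p      # source is off, or on at peak rate
        dist = np.convolve(dist, single)        # add an independent source
    return dist

def congestion_probability(dist, capacity_units):
    return dist[capacity_units + 1:].sum()      # P(demand > capacity)

dist = aggregate_demand([(2, 0.3)] * 20)        # 20 identical bursty sources
for c in (10, 14, 18):
    print(c, congestion_probability(dist, c))   # PC shrinks as capacity grows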

Relevance: 20.00%

Abstract:

This paper proposes MSISpIC, a probabilistic sonar scan matching algorithm for the localization of an autonomous underwater vehicle (AUV). The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead reckoning using a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method is an extension of the pIC algorithm. An extended Kalman filter (EKF) is used to estimate the robot path during the scan, in order to reference all the range and bearing measurements, as well as their uncertainty, to a scan-fixed frame before registering. The major contribution consists of experimentally proving that probabilistic sonar scan matching techniques have the potential to improve DVL-based navigation. The algorithm has been tested on an AUV guided along a 600 m path within an abandoned marina underwater environment, with satisfactory results.
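
The dead-reckoning part of such a pipeline can be sketched as a simple prediction step in Python (a simplified sketch only: DVL body-frame velocity plus MRU heading propagate the pose and its covariance; the actual MSISpIC EKF, scan grabbing and pIC registration are considerably more elaborate, and the noise values below are assumed).

# Dead-reckoning prediction of a 2D position with covariance propagation.
import numpy as np

def predict(x, P, v_body, yaw, dt, Q):
    """x = [x, y]; v_body = [surge, sway] from the DVL; yaw from the MRU."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])             # body-to-world rotation
    x_new = x + dt * R @ v_body                 # dead-reckoned position
    P_new = P + dt**2 * R @ Q @ R.T             # grow uncertainty with motion
    return x_new, P_new

x, P = np.zeros(2), np.eye(2) * 0.01
Q = np.diag([0.02, 0.02])                       # assumed DVL velocity noise
x, P = predict(x, P, np.array([0.5, 0.0]), np.deg2rad(30), 0.1, Q)
print(x, np.diag(P))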

Relevance: 20.00%

Abstract:

This paper studies cooperation in a political system dominated by two opportunistic parties competing in a resource-based economy. Since a binding agreement as an external solution might be difficult to enforce, due to the close association between the incumbent party and the government, the paper explores the extent to which cooperation between political parties that alternate in office can rely on self-enforcing strategies to provide an internal solution. We show that, for appropriate values of the probability of re-election and the discount factor, cooperation in maintaining the value of a state variable is possible, but fragile. Another result is that, in such a political framework, debt decisions contain an externality element linked to electoral incentives that creates a bias towards excessive borrowing.
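
As a generic illustration (not the paper's exact model) of why both the re-election probability p and the discount factor \beta matter, a standard trigger-strategy argument makes cooperation self-enforcing only when the one-period gain from deviating relative to cooperating, G, is outweighed by the expected discounted loss of future cooperation surplus:

G \le \frac{p\,\beta}{1 - p\,\beta}\,\bigl(V^{C} - V^{P}\bigr),

where V^{C} and V^{P} denote the per-period payoffs on the cooperative and punishment paths. The condition tightens as p or \beta falls, which is one way to see why cooperation is possible but fragile.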

Relevance: 20.00%

Abstract:

Next Generation Access Networks (NGAN) are the new step forward to deliver broadband services and to facilitate the integration of different technologies. It is plausible to assume that, from a technological standpoint, the Future Internet will be composed of long-range high-speed optical networks; a number of wireless networks at the edge; and, in between, several access technologies, among which, the Passive Optical Networks (xPON) are very likely to succeed, due to their simplicity, low-cost, and increased bandwidth. Among the different PON technologies, the Ethernet-PON (EPON) is the most promising alternative to satisfy operator and user needs, due to its cost, flexibility and interoperability with other technologies. One of the most interesting challenges in such technologies relates to the scheduling and allocation of resources in the upstream (shared) channel. The aim of this research project is to study and evaluate current contributions and propose new efficient solutions to address the resource allocation issues in Next Generation EPON (NG-EPON). Key issues in this context are future end-user needs, integrated quality of service (QoS) support and optimized service provisioning for real time and elastic flows. This project will unveil research opportunities, issue recommendations and propose novel mechanisms associated with the convergence within heterogeneous access networks and will thus serve as a basis for long-term research projects in this direction. The project has served as a platform for the generation of new concepts and solutions that were published in national and international conferences, scientific journals and also in book chapter. We expect some more research publications in addition to the ones mentioned to be generated in a few months.
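
To make the upstream-scheduling problem concrete, here is a generic Python sketch of a limited-service dynamic bandwidth allocation cycle in the spirit of IPACT; it only illustrates the kind of grant allocation at stake and is not any specific NG-EPON proposal from this project.

# Each ONU is granted its requested bytes, capped at a fixed maximum per cycle.
def limited_dba(requests_bytes, max_grant_bytes):
    grants, start = [], 0
    for req in requests_bytes:
        size = min(req, max_grant_bytes)
        grants.append((start, size))            # (start offset, grant size)
        start += size                           # next ONU transmits afterwards
    return grants

print(limited_dba([1500, 9000, 0, 4000], max_grant_bytes=6000))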

Relevance: 20.00%

Abstract:

This paper proposes a heuristic for the scheduling of capacity requests and the periodic assignment of radio resources in geostationary (GEO) satellite networks with star topology, using the Demand Assigned Multiple Access (DAMA) protocol in the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) in the physical layer.
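
As a hedged sketch of the kind of allocation involved, the Python snippet below greedily places capacity requests onto an MF-TDMA frame (carriers x timeslots); the paper's heuristic, which also accounts for ACM, is more sophisticated, and all names here are illustrative.

# Greedy first-fit assignment of per-terminal slot requests onto the frame.
def assign(requests_slots, n_carriers, n_slots):
    frame = [[None] * n_slots for _ in range(n_carriers)]
    for term_id, need in requests_slots:
        for c in range(n_carriers):
            free = [t for t in range(n_slots) if frame[c][t] is None]
            if len(free) >= need:               # place the whole request here
                for t in free[:need]:
                    frame[c][t] = term_id
                break
    return frame

print(assign([("T1", 3), ("T2", 5), ("T3", 4)], n_carriers=2, n_slots=6))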

Relevance: 20.00%

Abstract:

This paper points out an empirical puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, both sticky wages and match-specific productivity shocks help the model reproduce the stylized facts: both make the firm's flow of surplus more procyclical, thus making hiring more procyclical too.
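
For reference, the job matching function invoked here is usually the standard Cobb-Douglas aggregate of unemployment u_t and vacancies v_t (a generic sketch; the paper's exact specification and calibration are not reproduced here):

M_t = \mu\, u_t^{\alpha} v_t^{1-\alpha}, \qquad
f_t = \frac{M_t}{u_t} = \mu\, \theta_t^{1-\alpha}, \qquad
q_t = \frac{M_t}{v_t} = \mu\, \theta_t^{-\alpha}, \qquad
\theta_t \equiv \frac{v_t}{u_t},

so more procyclical hiring operates through the job-finding rate f_t rising with market tightness \theta_t.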

Relevance: 20.00%

Abstract:

According to Ljungqvist and Sargent (1998), high European unemployment since the 1980s can be explained by a rise in economic turbulence, leading to greater numbers of unemployed workers with obsolete skills. These workers refuse new jobs due to high unemployment benefits. In this paper we reassess the turbulence-unemployment relationship using a matching model with endogenous job destruction. In our model, higher turbulence reduces the incentives of employed workers to leave their jobs. If turbulence has only a tiny effect on the skills of workers experiencing endogenous separation, then the results of Ljungqvist and Sargent (1998, 2004) are reversed, and higher turbulence leads to a reduction in unemployment. Thus, changes in turbulence cannot provide an explanation for European unemployment that reconciles the incentives of both unemployed and employed workers.
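
As a reminder of how endogenous job destruction typically operates in such matching models (a standard Mortensen-Pissarides-style condition, not necessarily the paper's exact formulation): a match with idiosyncratic productivity x survives only while its joint surplus is non-negative,

S(x) \ge 0 \iff x \ge R, \qquad S(R) = 0,

so separations are governed by the reservation productivity R, and turbulence affects unemployment through its effect on R and on workers' outside options.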

Relevance: 20.00%

Abstract:

This paper generalizes the original random matching model of money by Kiyotaki and Wright (1989) (KW) in two aspects: first, the economy is characterized by an arbitrary distribution of agents who specialize in producing a particular consumption good; and second, these agents have preferences such that they want to consume any good with some probability. The results depend crucially on the size of the fraction of producers of each good and the probability with which different agents want to consume each good. KW and other related models are shown to be parameterizations of this more general one.
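
A hedged illustration of how the two primitives named above enter the model: in a random meeting, the probability that an agent of type i faces a producer of a good she wants (a single coincidence of wants) can be written as

\Pr(\text{single coincidence for type } i) = \sum_{j} \theta_j \, \pi_{ij},

where \theta_j is the fraction of agents producing good j and \pi_{ij} is the probability that a type-i agent wants to consume good j; the notation is illustrative and not taken from the paper.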

Relevance: 20.00%

Abstract:

The emphasis on integrated care implies new incentives that promote coordination between levels of care. Considering a population as a whole, the resource allocation system has to adapt to this environment. This research aims to design a model that allows for morbidity-related prospective and concurrent capitation payment. The model can be applied in publicly funded health systems and managed competition settings.

Methods: We analyze the application of hybrid risk adjustment versus either prospective or concurrent risk adjustment formulae in the context of funding total health expenditures for the population of an integrated healthcare delivery organization in Catalonia during the years 2004 and 2005.

Results: The hybrid model reimburses integrated care organizations while avoiding excessive risk transfer and maximizing incentives for efficiency in the provision. At the same time, it eliminates incentives for risk selection for a specific set of high-risk individuals through the use of concurrent reimbursement, in order to assure a proper classification of patients.

Conclusion: Prospective risk adjustment is used to transfer the financial risk to the health provider and therefore provide incentives for efficiency. Within the context of a National Health System, such a transfer of financial risk is illusory, and the government has to cover the deficits. Hybrid risk adjustment is useful to provide the right combination of incentives for efficiency and an appropriate level of risk transfer for integrated care organizations.
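
The hybrid idea can be sketched in a few lines of Python (a hedged sketch only: prospectively predicted cost for most of the population, concurrent same-year cost for a flagged high-risk group; the variable names and figures are illustrative, not the paper's estimation model).

# Hybrid capitation: concurrent reimbursement for high-risk individuals,
# prospective reimbursement for everyone else.
def hybrid_capitation(people, high_risk_ids):
    total = 0.0
    for p in people:
        if p["id"] in high_risk_ids:
            total += p["concurrent_cost"]       # reimburse on observed morbidity
        else:
            total += p["prospective_cost"]      # reimburse on predicted cost
    return total

people = [
    {"id": 1, "prospective_cost": 900.0,  "concurrent_cost": 950.0},
    {"id": 2, "prospective_cost": 1200.0, "concurrent_cost": 8700.0},  # high risk
]
print(hybrid_capitation(people, high_risk_ids={2}))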

Relevance: 20.00%

Abstract:

This paper analyzes the problem of matching heterogeneous agents in a Bayesian learning model. One agent gives a noisy signal to another agent, who is responsible for learning. If production has a strong informational component, a phase of cross-matching occurs, so that agents of low knowledge catch up with those of higher knowledge. It is shown that: (i) a greater informational component in production makes cross-matching more likely; (ii) as the new technology is mastered, production becomes relatively more physical and less informational; (iii) a greater dispersion of the ability to learn and transfer information makes self-matching more likely; and (iv) self-matching leads to more self-matching, whereas cross-matching can make less productive agents overtake more productive ones.
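
A generic Gaussian sketch of the noisy-signal learning step (illustrative only; the paper's information structure may differ): if the learner's prior over the unknown parameter \theta is N(m, 1/\tau) and the partner's signal is s = \theta + \varepsilon with \varepsilon \sim N(0, 1/\tau_s), the posterior after one match is

m' = \frac{\tau m + \tau_s s}{\tau + \tau_s}, \qquad \tau' = \tau + \tau_s,

so lower-knowledge agents (low \tau) gain the most from matching with well-informed partners, which is the force behind cross-matching.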