882 results for Profit allocation


Relevance: 20.00%

Abstract:

We consider a cooperative relaying network in which a source communicates with a group of users in the presence of one eavesdropper. We assume that there are no direct source-user links, so the group of users receives only the retransmitted signal from the relay, whereas the eavesdropper receives both the original and retransmitted signals. Under these assumptions, we exploit user selection to enhance the secrecy performance. We first find the optimal power allocation strategy when the source has full channel state information (CSI) of all links. We then evaluate the security level through i) the ergodic secrecy rate and ii) the secrecy outage probability when only statistical knowledge of the CSI is available.
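
For reference (the abstract does not spell out the metric definitions), the instantaneous secrecy rate of a selected user $k$ is conventionally written as

$$C_s = \left[\log_2(1+\gamma_k) - \log_2(1+\gamma_e)\right]^+,$$

where $\gamma_k$ and $\gamma_e$ are the received SINRs at user $k$ and at the eavesdropper, and $[x]^+ = \max(x,0)$. The two metrics evaluated above are then the ergodic secrecy rate $\mathbb{E}[C_s]$ and the secrecy outage probability $\Pr\{C_s < R_s\}$ for a target secrecy rate $R_s$.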

Relevance: 20.00%

Abstract:

This paper examines the integration of a tolerance design process within the Computer-Aided Design (CAD) environment, having identified the potential to create an intelligent Digital Mock-Up (DMU) [1]. The tolerancing process is complex in nature, and as such, reliance on Computer-Aided Tolerancing (CAT) software and domain experts can create a disconnect between the design and manufacturing disciplines. It is necessary to implement the tolerance design procedure at the earliest opportunity to integrate both disciplines and to reduce the workload in tolerance analysis and allocation at critical stages in product development, when production is imminent.
The work seeks to develop a methodology that will allow for a preliminary tolerance allocation procedure within CAD. An approach to tolerance allocation based on sensitivity analysis is implemented on a simple assembly to review its contribution to an intelligent DMU. The procedure is developed using Python scripting for CATIA V5, with analysis results aligning with those in the literature. A review of its implementation and requirements is presented.
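
As a rough illustration of a sensitivity-based allocation step (the dimensions, sensitivities and budget below are invented, and the CATIA V5 scripting layer is omitted), one simple rule scales each component tolerance by the inverse of its sensitivity so that the root-sum-square stack-up meets the assembly requirement:

```python
import numpy as np

# Hypothetical linearised gap stack: gap = housing - shaft - washer.
# Sensitivities are the partial derivatives of the gap w.r.t. each dimension.
sensitivities = np.array([1.0, -1.0, -1.0])   # assumed values, for illustration
assembly_tol = 0.20                            # allowable gap variation (mm), assumed

# Equal-influence allocation: each contributor receives the same share of the
# root-sum-square (RSS) budget, scaled by the inverse of its |sensitivity|.
n = len(sensitivities)
component_tols = assembly_tol / (np.sqrt(n) * np.abs(sensitivities))

# Check: the RSS stack-up of the allocated tolerances equals the assembly budget.
rss = np.sqrt(np.sum((sensitivities * component_tols) ** 2))
print(component_tols, rss)                     # rss == assembly_tol by construction
```

A production procedure would pull the sensitivities from the CAD model via the scripted assembly constraints rather than hard-coding them.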

Relevance: 20.00%

Abstract:

We investigate the performance of dual-hop two-way amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase imbalance (IQI) at the relay node. In particular, the effective signal-to-interference-plus-noise ratio (SINR) at both sources is derived. These SINRs are used to design an instantaneous power allocation scheme, which maximizes the minimum SINR of the two sources under a total transmit power constraint. The solution to this optimization problem is analytically determined and used to evaluate the outage probability (OP) of the considered two-way AF relaying system. Both analytical and numerical results show that IQI can create fundamental performance limits on two-way relaying, which cannot be avoided by simply improving the channel conditions.
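
A minimal sketch of the max-min power split under a total power constraint is given below; the paper's IQI-aware SINR expressions are not reproduced, so `sinr_a` and `sinr_b` are placeholders assumed to be monotone in the power assigned to the first source:

```python
# Bisection on the balance point sinr_a(p) = sinr_b(p): under the stated
# monotonicity assumption, the minimum of the two SINRs is maximised there.
def max_min_split(sinr_a, sinr_b, p_tot, iters=60):
    lo, hi = 0.0, p_tot
    for _ in range(iters):
        p = 0.5 * (lo + hi)
        if sinr_a(p) < sinr_b(p):   # first link is the bottleneck: give it more power
            lo = p
        else:
            hi = p
    p = 0.5 * (lo + hi)
    return p, min(sinr_a(p), sinr_b(p))

# Toy placeholder SINR functions (not the paper's expressions).
p_opt, worst_sinr = max_min_split(lambda p: p / 2.0,
                                  lambda p: (10.0 - p) / 3.0,
                                  p_tot=10.0)
```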

Relevance: 20.00%

Abstract:

In this thesis, we analysed the course of a process of municipalization of the health system carried out in Rio Grande do Norte (RN), one of the federated states of north-eastern Brazil. Taking into account the historical contexts of implementation, we focused our attention on the contribution of the actors involved in this process, especially in the allocation of the system's financial resources. Beliefs, perceptions, expectations, representations, knowledge and interests, the whole set of factors that contribute to the constitution of these actors' cognitive capacities, foster reflexivity about their actions and the definition of diverse strategies for pursuing their objectives within the health system. They are thus seen as competent and reflexive agents, capable of appropriating the structural properties of the health system (rules and resources) so as to take position in the social space of this system and to favour either change or the persistence of the status quo. During the structuring of the Brazilian Unified Health System (SUS), municipalization was the most developed axis of a health reform project. Faced with contextual constraints and the complex dynamics of the social spaces of health, the reformist actors could not follow the path of the idealized utopia; some detours were taken. In RN, the municipalization of health was a very complex process in which the triad centralization/decentralization/recentralization ran its course amid negotiations, conflicts, alliances, disputes, cooperation and competition. Despite the constraints of successive contexts, of the system's structural properties and of the social dynamics within the health system, some changes did occur: the construction of collective leadership; the emergence of a culture of negotiation; the creation of structures and social spaces within the system that foster encounters among actors in each municipality and at the state level; collective learning about the structuring of the SUS; strong growth in primary care services, opening the prospect of reversing the prevailing service delivery model; and the first steps toward breaking with the system's bureaucratic culture. The SUS remains captive to several issues institutionalized in this health system: dependence on the private sector and on certain professional groups; insufficient and unstable financing; and the situation of human resources. The changes that have occurred are convergent, incremental and slow; they result from normative, deliberate, formalized actions. They also arise from the unexpected, the informal and the paradoxical; some are more localized, others more widespread, and they last for shorter or longer periods.

Relevance: 20.00%

Abstract:

Macroeconomic models based on the Phillips Curve predict that as the unemployment rate declines toward the long-run natural rate, the pace of wage and price growth accelerates and inflation rises [1]. In this paper I analyze the profitability prospects for the U.S. hotel industry in today's relatively volatile economic environment, keeping in mind the Phillips Curve's general principle that inflation and employment have an inverse but relatively stable short-term relationship. Although employment and economic growth in the U.S. have been uneven in recent months, the unemployment rate has declined to less than 5 percent, which many economists believe is close to the natural rate. Growth in wages and salaries, as measured by the Employment Cost Index, has concurrently been moving upward, between 2.5 and 3.0 percent over the past 12 months. At the same time, general inflation remains below levels that might typically be expected this late in the cycle, although core inflation is bumping up against the Federal Reserve's 2-percent target. If the inflation rate continues to move upward as predicted by Phillips Curve models (and encouraged by the Federal Reserve), rising labor costs and other expenses will exert downward pressure on U.S. business profits. Movement back up the Phillips Curve (toward greater inflation) coincides with an expanding economy. In that scenario, prices of goods and services will also rise in real terms if their supply cannot keep up with demand and producers have the ability to raise prices (absent fixed-price contracts such as leases).
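
For context, a standard expectations-augmented Phillips Curve (the specific models cited above may differ in detail) can be written as

$$\pi_t = \pi_t^{e} - \beta\,(u_t - u_n) + \varepsilon_t, \qquad \beta > 0,$$

so that when the unemployment rate $u_t$ falls below the natural rate $u_n$, inflation $\pi_t$ is pushed above expected inflation $\pi_t^{e}$; this is the mechanism behind the rising labor costs and margin pressure discussed in the paper.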

Relevance: 20.00%

Abstract:

A gulf has tended to develop between the adoption and usage of information technology by different generations, at the heart of which are different ways of experiencing and relating to the world around us. This research idea is currently being developed following data collection, and feedback is sought on ways forward to enable impact. The research focuses on information technology in the form of multimedia. Multimedia here means 'media' and 'content' that use a combination of different content forms: electronically integrated communication engaging all or most of the senses (e.g. graphic art, sound, animation and full-motion video presented by way of computer or other electronic means), mainly through presentational technologies. Although multimedia is not new, some organizations, particularly those in the non-profit sector, do not always have the technical or financial resources to support such systems and consequently may struggle to adopt multimedia and support its usage amongst different generations. However, non-profit organizations are being forced to pay more attention to the way they communicate with markets and the public due to the professionalism of communication everywhere in society. The case study used for this research is a church circuit comprising 15 churches in the Midlands region of the United Kingdom, selected because of the diverse age groups catered for within this type of non-profit organization. Participants in the study also had a range of skills, experiences and backgrounds, which adds to the diversity of the population studied. Data gathered focused on the attitudes and opinions of different age groups towards the adoption and use of multimedia. 395 questionnaires were distributed, comprising 11 opinion questions and 4 demographic questions. 83% of the questionnaires were returned, representing 35% of the total circuit membership. Three people from each of the following age categories were also interviewed: 1920–1946 (Matures); 1947–1964 (Baby Boomers); 1965–1982 (Generation X); 1983–2004 (Net Generation). Results of the questionnaire and comments from the interviews did not tally with the widespread assumption that the younger generation is more attracted to the use of multimedia than the older generation. The highest proportion of those who said that they gain more from a service enhanced by multimedia was among the Baby Boomers. Comments from interviews suggested that 'we need to embrace multimedia if we are to attract and retain the younger generation' and that 'multimedia often helps children to remain focused and clarifies the objective of the service'. However, because the younger generations' world tends to be dominated by computer technology, the questionnaire showed that they are more likely to have higher standards for the use of multimedia, identifying higher levels of equipment failing to work and annoying use of sounds compared with the older age groups. By comparison, the Matures age group reported the highest percentage of difficulty with the size of letters, the colour of letters and background, and sound that was not loud enough, which is to be expected. Since every organization is unique, any multimedia adopted and used should be specific to its needs, its stakeholders and the physical building, in order to enhance that uniqueness.
Giving thought to whether the chosen type of multimedia is the best method for communicating the message to the particular audience, alongside how technical and financial resources are best used, can assist in accommodating the different age groups that need to be catered for.

Relevance: 20.00%

Abstract:

Background and problem – As a result of financial crises and the realization of a broader stakeholder network, recent decades have seen an increase in stakeholder demand for non-financial information in corporate reporting. This has led to a situation of information overload, where separate financial and sustainability reports have grown in length and complexity independently of each other. Integrated reporting has been presented as a solution to this problematic situation. The question is whether the corporate world believes this to be the solution and whether the development of corporate reporting is heading in this direction. Purpose – This thesis aims to examine and assess to what extent companies listed on the OMX Stockholm 30 (OMXS30), as of 2016-02-28, comply with the strategic content element of the <IR> Framework, and how this disclosure has developed since the framework's pilot project and official release, by using a self-constructed disclosure index based on its specific items. Methodology – The purpose was fulfilled through an analysis of 104 annual reports from 26 companies over the period 2011-2014. The annual reports were assessed using a self-constructed disclosure index based on the <IR> Framework content element Strategy and Resource Allocation, where one point was given for each disclosed item. Analysis and conclusions – The study found that the OMXS30-listed companies to a large extent comply with the strategic content element of the <IR> Framework and that this compliance has grown steadily throughout the researched time span. There is still room for improvement, however, with a total average framework compliance of 84% for 2014. Although many items are being reported on, there are indications that companies generally miss out on the core values of Integrated Reporting.
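
The scoring rule described above (one point per disclosed item) reduces to a simple ratio; a minimal sketch follows, with placeholder item names rather than the thesis's actual checklist for the Strategy and Resource Allocation content element:

```python
# Self-constructed disclosure index, assumed scoring: compliance = points / items.
ITEMS = ["strategic_objectives", "resource_allocation_plans",
         "link_to_business_model", "time_horizons"]        # placeholder items

def compliance(disclosed: set) -> float:
    points = sum(1 for item in ITEMS if item in disclosed)
    return points / len(ITEMS)

print(f"{compliance({'strategic_objectives', 'time_horizons'}):.0%}")  # 50%
```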

Relevance: 20.00%

Abstract:

In many areas of simulation, a crucial component of efficient numerical computation is the use of solution-driven adaptive features: locally adapted meshing or re-meshing, and dynamically changing computational tasks. The full advantages of high performance computing (HPC) technology can thus only be exploited when efficient parallel adaptive solvers are realised. The resulting requirement for HPC software is dynamic load balancing, which for many mesh-based applications means dynamic mesh re-partitioning. The DRAMA project has been initiated to address this issue, with a particular focus on the requirements of industrial Finite Element codes, although codes using Finite Volume formulations will also be able to make use of the project results.
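
As a toy illustration of why dynamic load balancing is needed (this is not the DRAMA library's cost model or interface), a re-partitioning step is typically triggered once the most loaded process exceeds the average load by some tolerance:

```python
# Illustrative trigger for dynamic mesh re-partitioning (assumed threshold).
def imbalance(elements_per_partition):
    avg = sum(elements_per_partition) / len(elements_per_partition)
    return max(elements_per_partition) / avg      # 1.0 means perfectly balanced

def needs_repartition(elements_per_partition, threshold=1.05):
    # Re-partition when one process carries more than 5% above the average.
    return imbalance(elements_per_partition) > threshold

print(needs_repartition([1000, 1020, 1400, 980]))  # True: one partition ~27% over
```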

Relevance: 20.00%

Abstract:

This paper presents a methodology to explore the impact on poverty of public spending on education. The methodology consists of two approaches: Benefit Incidence Analysis (BIA) and a behavioral approach. BIA considers the cost and use of the educational service and the distribution of the benefits among income groups. For the behavioral approach, we use a Probit model of school attendance in order to determine the influence of public spending on the probability that the poor attend school. As a complement, a measurement of targeting errors in the allocation of public spending is included in the methodology.
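
A hedged sketch of the behavioral step is shown below: a Probit of school attendance on the public education spending received by the household plus a control, using statsmodels. The variable names and the synthetic data are placeholders, not the paper's dataset or specification:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "attends_school": rng.integers(0, 2, n),       # 1 if the child attends school
    "public_spending": rng.gamma(2.0, 100.0, n),   # education benefit received (toy units)
    "household_income": rng.lognormal(7.0, 0.5, n),
})

X = sm.add_constant(df[["public_spending", "household_income"]])
probit = sm.Probit(df["attends_school"], X).fit(disp=False)
print(probit.get_margeff().summary())  # marginal effect of spending on Pr(attend)
```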

Relevance: 20.00%

Abstract:

Abstract not available

Relevance: 20.00%

Abstract:

Deployment of low power basestations within cellular networks can potentially increase both capacity and coverage. However, such deployments require efficient resource allocation schemes for managing interference from the low power and macro basestations that are located within each other's transmission range. In this dissertation, we propose novel and efficient dynamic resource allocation algorithms in the frequency, time and space domains. We show that the proposed algorithms perform better than the current state-of-the-art resource management algorithms. In the first part of the dissertation, we propose an interference management solution in the frequency domain. We introduce a distributed frequency allocation scheme that shares frequencies between macro and low power pico basestations, and guarantees a minimum average throughput to users. The scheme seeks to minimize the total number of frequencies needed to honor the minimum throughput requirements. We evaluate our scheme using detailed simulations and show that it performs on par with the centralized optimum allocation. Moreover, our proposed scheme outperforms a static frequency reuse scheme and the centralized optimal partitioning between the macro and picos. In the second part of the dissertation, we propose a time domain solution to the interference problem. We consider the problem of maximizing the alpha-fairness utility over heterogeneous wireless networks (HetNets) by jointly optimizing user association, wherein each user is associated to any one transmission point (TP) in the network, and the activation fractions of all TPs. The activation fraction of a TP is the fraction of the frame duration for which it is active, and together these fractions influence the interference seen in the network. To address this joint optimization problem, which we show is NP-hard, we propose an alternating optimization based approach wherein the activation fractions and the user association are optimized in an alternating manner. The subproblem of determining the optimal activation fractions is solved using a provably convergent auxiliary function method. On the other hand, the subproblem of determining the user association is solved via a simple combinatorial algorithm. Meaningful performance guarantees are derived in either case. Simulation results over a practical HetNet topology reveal the superior performance of the proposed algorithms and underscore the significant benefits of the joint optimization. In the final part of the dissertation, we propose a space domain solution to the interference problem. We consider the problem of maximizing system utility by optimizing over the set of user and TP pairs in each subframe, where each user can be served by multiple TPs. To address this optimization problem, which is NP-hard, we propose a solution scheme based on a difference of submodular functions optimization approach. We evaluate our scheme using detailed simulations and show that it performs on par with a much more computationally demanding difference of convex functions optimization scheme. Moreover, the proposed scheme performs within a reasonable percentage of the optimal solution. We further demonstrate the advantage of the proposed scheme by studying its performance with variation in different network topology parameters.
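
For reference, the alpha-fairness utility maximized in the time-domain part is conventionally defined as

$$U_\alpha(x) = \frac{x^{1-\alpha}}{1-\alpha} \ (\alpha \ge 0,\ \alpha \neq 1), \qquad U_1(x) = \log x,$$

where $x$ is a user's long-term rate; $\alpha = 0$ recovers sum-rate maximization, $\alpha = 1$ proportional fairness, and $\alpha \to \infty$ approaches max-min fairness. The dissertation's exact formulation over user associations and activation fractions is not reproduced here.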

Relevance: 20.00%

Abstract:

Background: Anthropogenic disturbance of old-growth tropical forests increases the abundance of early successional tree species at the cost of late successional ones. Quantifying differences in terms of carbon allocation and the proportion of recently fixed carbon in soil CO2 efflux is crucial for addressing the carbon footprint of creeping degradation. Methodology: We compared the carbon allocation pattern of the late successional gymnosperm Podocarpus falcatus (Thunb.) Mirb. and the early successional (gap-filling) angiosperm Croton macrostachyus Hochst. ex Del. in an Ethiopian Afromontane forest by whole-tree 13CO2 pulse labeling. Over a one-year period we monitored the temporal resolution of the label in the foliage, the phloem sap, the arbuscular mycorrhiza, and in soil-derived CO2. Further, we quantified the overall losses of assimilated 13C with soil CO2 efflux. Principal Findings: 13C in leaves of C. macrostachyus declined more rapidly, with a larger fast pool (64% vs. 50% of the assimilated carbon) having a shorter mean residence time (14 h vs. 55 h), than in leaves of P. falcatus. Phloem sap velocity was about 4 times higher for C. macrostachyus. Likewise, the label appeared earlier in the arbuscular mycorrhiza of C. macrostachyus and in the soil CO2 efflux than in the case of P. falcatus (24 h vs. 72 h). Within one year, soil CO2 efflux amounted to a loss of 32% of assimilated carbon for the gap-filling tree and 15% for the late successional one. Conclusions: Our results showed clear differences in carbon allocation patterns between tree species, although we caution that this experiment was unreplicated. A shift in tree species composition of tropical montane forests (e.g., by degradation) accelerates carbon allocation belowground and increases respiratory carbon losses by the autotrophic community. If ongoing disturbance keeps early successional species in dominance, the larger allocation to fast cycling compartments may deplete soil organic carbon in the long run.
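
As a reading aid (the fitted model is not given in the abstract), the fast/slow pool sizes and mean residence times reported above are consistent with a two-pool exponential description of the leaf label,

$$C(t) = a\,e^{-k_f t} + (1-a)\,e^{-k_s t}, \qquad \mathrm{MRT}_f = 1/k_f,$$

where $a$ is the fast-pool fraction (64% for C. macrostachyus vs. 50% for P. falcatus) and $1/k_f$ its mean residence time (14 h vs. 55 h).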

Relevance: 20.00%

Abstract:

In the half-duplex relay channel with the decode-and-forward protocol, the relay introduces energy into the channel, as observed at the destination, only over random time intervals. Consequently, during simulation the average signal power seen at the destination becomes known only at run-time. Therefore, in order to obtain specific performance measures at the signal-to-noise ratio (SNR) of interest, strategies are required to adjust the noise variance during the simulation run. It is necessary that these strategies yield the same performance as measured under real-world conditions. This paper introduces three noise power allocation strategies and demonstrates their applicability using numerical and simulation results.
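
One plausible run-time strategy (not necessarily one of the paper's three) is to track the average received signal power with a running estimate and set the noise variance so that the realised average SNR matches the target:

```python
import numpy as np

def noise_std(received_power_history, target_snr_db):
    p_avg = np.mean(received_power_history)      # run-time estimate of average signal power
    snr_lin = 10.0 ** (target_snr_db / 10.0)
    sigma2 = p_avg / snr_lin                     # per-sample noise variance
    return np.sqrt(sigma2)

# Toy power trace: the relay is silent in some intervals, so instantaneous power varies.
rx_power = [1.0, 0.0, 1.0, 1.0, 0.0]
sigma = noise_std(rx_power, target_snr_db=10.0)
```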

Relevance: 20.00%

Abstract:

Previous studies of greenhouse gas emissions (GHGE) from beef production systems in northern Australia have been based on models of ‘steady-state’ herd structures that do not take into account the considerable inter-annual variation in liveweight gain, reproduction and mortality rates that occurs due to seasonal conditions. Nor do they consider the implications of flexible stocking strategies designed to adapt these production systems to the highly variable climate. The aim of the present study was to quantify the variation in total GHGE (t CO2e) and GHGE intensity (t CO2e/t liveweight sold) for the beef industry in northern Australia when variability in these factors was considered. A combined GRASP–Enterprise modelling platform was used to simulate a breeding–finishing beef cattle property in the Burdekin River region of northern Queensland, using historical climate data from 1982–2011. GHGE was calculated using the method of the Australian National Greenhouse Gas Inventory. Five different stocking-rate strategies were simulated: fixed stocking at moderate and high rates, and three flexible strategies in which the stocking rate was adjusted annually by up to 5%, 10% or 20%, according to the pasture available at the end of the growing season. Variation in total annual GHGE was lowest in the ‘fixed moderate’ (~9.5 ha/adult equivalent (AE)) stocking strategy, ranging from 3799 to 4471 t CO2e, and highest in the ‘fixed high’ strategy (~5.9 ha/AE), which ranged from 3771 to 7636 t CO2e. The ‘fixed moderate’ strategy had the least variation in GHGE intensity (15.7–19.4 t CO2e/t liveweight sold), while the ‘flexible 20’ strategy (up to 20% annual change in AE) had the largest range (10.5–40.8 t CO2e/t liveweight sold). Across the five stocking strategies, the ‘fixed moderate’ stocking-rate strategy had the highest simulated perennial grass percentage and pasture growth, highest average rate of liveweight gain (121 kg/steer), highest average branding percentage (74%) and lowest average breeding-cow mortality rate (3.9%), resulting in the lowest average GHGE intensity (16.9 t CO2e/t liveweight sold). The ‘fixed high’ stocking-rate strategy (~5.9 ha/AE) performed the poorest in each of these measures, while the three flexible stocking strategies were intermediate. The ‘fixed moderate’ stocking strategy also yielded the highest average gross margin per AE carried and per hectare. These results highlight the importance of considering the influence of climate variability on stocking-rate management strategies and herd performance when estimating GHGE. The results also support a body of previous work that has recommended the adoption of moderate stocking strategies to enhance the profitability and ecological stability of beef production systems in northern Australia.
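
The two quantities reported above reduce to simple calculations; the sketch below uses a toy adjustment rule and placeholder numbers, not the GRASP–Enterprise model:

```python
def ghge_intensity(total_ghge_t_co2e, liveweight_sold_t):
    # GHGE intensity in t CO2e per tonne of liveweight sold.
    return total_ghge_t_co2e / liveweight_sold_t

def flexible_stocking(current_ae, pasture_supported_ae, max_change=0.20):
    # 'Flexible 20'-style rule: move toward the herd size the end-of-season
    # pasture can support, but change the stocking rate by at most 20% a year.
    ratio = pasture_supported_ae / current_ae
    ratio = max(1.0 - max_change, min(1.0 + max_change, ratio))
    return current_ae * ratio

print(ghge_intensity(4200.0, 250.0))      # 16.8 t CO2e / t liveweight sold (toy numbers)
print(flexible_stocking(1000, 1500))      # capped at 1200 AE
```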

Relevance: 20.00%

Abstract:

A decision-maker, when faced with a limited and fixed budget to collect data in support of a multiple attribute selection decision, must decide how many samples to observe from each alternative and attribute. This allocation decision is of particular importance when the information gained leads to uncertain estimates of the attribute values as with sample data collected from observations such as measurements, experimental evaluations, or simulation runs. For example, when the U.S. Department of Homeland Security must decide upon a radiation detection system to acquire, a number of performance attributes are of interest and must be measured in order to characterize each of the considered systems. We identified and evaluated several approaches to incorporate the uncertainty in the attribute value estimates into a normative model for a multiple attribute selection decision. Assuming an additive multiple attribute value model, we demonstrated the idea of propagating the attribute value uncertainty and describing the decision values for each alternative as probability distributions. These distributions were used to select an alternative. With the goal of maximizing the probability of correct selection we developed and evaluated, under several different sets of assumptions, procedures to allocate the fixed experimental budget across the multiple attributes and alternatives. Through a series of simulation studies, we compared the performance of these allocation procedures to the simple, but common, allocation procedure that distributed the sample budget equally across the alternatives and attributes. We found the allocation procedures that were developed based on the inclusion of decision-maker knowledge, such as knowledge of the decision model, outperformed those that neglected such information. Beginning with general knowledge of the attribute values provided by Bayesian prior distributions, and updating this knowledge with each observed sample, the sequential allocation procedure performed particularly well. These observations demonstrate that managing projects focused on a selection decision so that the decision modeling and the experimental planning are done jointly, rather than in isolation, can improve the overall selection results.
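
A hedged sketch of the uncertainty-propagation idea is given below: sampling uncertainty in each attribute estimate is pushed through an additive value model by Monte Carlo, and the probability that each alternative is truly best is read off. The weights, attributes and sample data are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
weights = np.array([0.5, 0.3, 0.2])       # additive MAV weights (assumed)

# samples[name][j] = observed measurements of attribute j for that alternative
samples = {
    "system_A": [rng.normal(0.70, 0.05, 8), rng.normal(0.60, 0.10, 8), rng.normal(0.55, 0.05, 8)],
    "system_B": [rng.normal(0.65, 0.05, 8), rng.normal(0.70, 0.10, 8), rng.normal(0.50, 0.05, 8)],
}

def value_draws(attr_samples, n_draws=10_000):
    # Draw each attribute value from the sampling distribution of its mean
    # (normal approximation), then combine with the additive value model.
    draws = [rng.normal(np.mean(s), np.std(s, ddof=1) / np.sqrt(len(s)), n_draws)
             for s in attr_samples]
    return weights @ np.vstack(draws)

values = {name: value_draws(s) for name, s in samples.items()}
stacked = np.vstack(list(values.values()))
best = np.argmax(stacked, axis=0)          # winner in each Monte Carlo draw
for i, name in enumerate(values):
    print(name, f"P(best) ~ {np.mean(best == i):.2f}")
```

Allocation procedures such as the sequential one described above would then direct the next sample toward the attribute and alternative whose reduction in uncertainty most increases this probability of correct selection.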