994 results for arm’s length price methodology


Relevance:

20.00%

Publisher:

Abstract:

The emergence of Web 2.0 technologies in recent years has changed the way people interact with knowledge. Services for cooperation and collaboration have placed the user at the centre of a new knowledge-building space. The development of new second-generation learning environments can benefit from the potential of these Web 2.0 services when applied to an educational context. We propose a methodology for designing learning environments that relates Web 2.0 services to the functional requirements of these environments. In particular, we concentrate on the design of the KRSM system to discuss the components of this methodology and its application.

Relevance:

20.00%

Publisher:

Abstract:

The feeder animal price is a derivative in the sense that its value depends upon the price of animals for the consumption market. It also depends upon the biological growth technology and feed costs. Daily maintenance costs are of particular interest to the husbander because they can be avoided through accelerated feeding. In this paper, the optimal feeding path under equilibrium feeder animal prices is established. This analysis is used to gain a better understanding of feeding decisions, regulation in feedstuff markets, and the consequences of genetic innovations. It is shown that days on feed can increase or decrease with a genetic innovation or other improvement in feed conversion efficiency. The structure of comparative prices for feeder animals at different weights, the early slaughter decision, and equilibrium in feeder animal markets are also developed. Feeder animal prices can increase over a weight interval if biological feed efficiency parameters are low over the interval.
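
The optimal days-on-feed logic can be illustrated with a deliberately simple numerical sketch; the growth curve, prices and cost figures below are illustrative assumptions, not parameters from the paper. Feeding stops where one more day of weight gain no longer covers the daily feed and maintenance cost.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper)
p_out = 2.8          # slaughter price per kg live weight
feed_cost = 1.7      # daily ration cost
maint_cost = 0.6     # daily maintenance cost component
w0, w_max, k = 250.0, 650.0, 0.012   # initial weight, mature weight, growth rate

def weight(t):
    """Simple asymptotic growth curve: w(t) = w_max - (w_max - w0) * exp(-k t)."""
    return w_max - (w_max - w0) * np.exp(-k * t)

def profit(t):
    """Revenue at slaughter minus cumulative feed and maintenance costs."""
    return p_out * weight(t) - (feed_cost + maint_cost) * t

days = np.arange(0, 400)
t_star = days[np.argmax(profit(days))]
print(f"optimal days on feed: {t_star}, slaughter weight: {weight(t_star):.0f} kg")
```

Which way a genetic improvement shifts the optimum in this toy model depends on whether it raises the growth rate or the mature weight, echoing the abstract's point that days on feed can move in either direction.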

Relevance:

20.00%

Publisher:

Abstract:

Several superstructure design methodologies have been developed for low volume road bridges by the Iowa State University Bridge Engineering Center. However, to date no standard abutment designs have been developed. Thus, there was a need to establish an easy-to-use design methodology in addition to generating generic abutment standards and other design aids for the more common substructure systems used in Iowa. The final report for this project consists of three volumes.

The first volume summarizes the research completed in this project. A survey of the Iowa County Engineers was conducted, from which it was determined that while most counties use similar types of abutments, only 17 percent use some type of standard abutment designs or plans. A literature review revealed several possible alternative abutment systems for future use on low volume road bridges, in addition to two separate substructure lateral load analysis methods: a linear and a non-linear method. The linear analysis method was used for this project because of its relative simplicity and the relative accuracy of the maximum pile moment when compared with values obtained from the more complex non-linear analysis method. The resulting design methodology was developed for single span stub abutments supported on steel or timber piles, with a bridge span length ranging from 20 to 90 ft and roadway widths of 24 and 30 ft; other roadway widths can be designed using the foundation design template provided. The backwall height is limited to a range of 6 to 12 ft, and the soil type is classified as cohesive or cohesionless. The design methodology was developed using the guidelines specified by the American Association of State Highway and Transportation Officials Standard Specifications, the Iowa Department of Transportation Bridge Design Manual, and the National Design Specification for Wood Construction.

The second volume (this volume) introduces and outlines the use of the various design aids developed for this project. Charts for determining dead and live gravity loads based on the roadway width, span length, and superstructure type are provided. A foundation design template was developed in which the engineer can check a substructure design by inputting basic bridge site information. Tables published by the Iowa Department of Transportation that provide values for estimating pile friction and end bearing for different combinations of soils and pile types are also included. Generic standard abutment plans were developed for which the engineer can provide the necessary bridge site information in the spaces provided. These tools enable engineers to design and detail county bridge substructures more efficiently.

The third volume provides two sets of calculations that demonstrate the application of the substructure design methodology developed in this project. These calculations also verify the accuracy of the foundation design template. The printouts from the foundation design template are provided at the end of each example. Several tables also provide various foundation details for a pre-cast double tee superstructure with different combinations of soil type, backwall height, and pile type.

Relevance:

20.00%

Publisher:

Abstract:

The alignment between competences, teaching-learning methodologies and assessment is a key element of the European Higher Education Area. This paper presents the efforts carried out by six Telematics, Computer Science and Electronic Engineering Education teachers towards achieving this alignment in their subjects. In joint work with pedagogues, a set of recommended actions was identified. A selection of these actions was applied and evaluated in the six subjects. The cross-analysis of the results indicates that the actions allow students to better understand the methodologies and assessment planned for the subjects, facilitate (self-)regulation and increase students’ involvement in the subjects.

Relevance:

20.00%

Publisher:

Abstract:

Computer-based training and distance education are facing dramatic changes with the advent of standardization efforts, some of them concentrating on maximal reuse. This is of paramount importance for a sustainable, cost-affordable production of educational materials. Reuse in itself should not be a goal, though, since many methodological aspects might be lost. In this paper we propose two content production approaches for the InterMediActor platform under a competence-based methodology: either a bottom-up approach, where content is designed from scratch, or a top-down methodology, where existing material can be gradually adapted to fulfil the requisites for use with maximal flexibility within InterMediActor.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this study is to examine the literature and identify the most salient outcomes of early postnatal discharge for women, newborns and the health system. An electronic search strategy was designed, including the following sources: Web of Science, Scopus, ProQuest and PubMed/MEDLINE, using the following terms: (early AND discharge) OR (length AND stay) AND (postpartum OR postnatal) AND (effect* OR result OR outcome). Content analysis was used to identify and summarise the findings and methods of the research papers. The available evidence is not sufficient either to reject or to support the practice of early postnatal discharge; different studies have reported different outcomes for women and newborns. The need for systematic clinical research is discussed.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND STUDY AIMS: Appropriate use of colonoscopy is a key component of quality management in gastrointestinal endoscopy. In an update of a 1998 publication, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II) defined appropriateness criteria for various colonoscopy indications. This introductory paper therefore deals with methodology, general appropriateness, and a review of colonoscopy complications. METHODS: The RAND/UCLA Appropriateness Method was used to evaluate the appropriateness of various diagnostic colonoscopy indications, with 14 multidisciplinary experts using a scale from 1 (extremely inappropriate) to 9 (extremely appropriate). Evidence reported in a comprehensive updated literature review was used for these decisions. Consolidation of the ratings into three appropriateness categories (appropriate, uncertain, inappropriate) was based on the median and the heterogeneity of the votes. The experts then met to discuss areas of disagreement in the light of existing evidence, followed by a second rating round, with a subsequent third voting round on necessity criteria, using much more stringent criteria (i.e. colonoscopy is deemed mandatory). RESULTS: Overall, 463 indications were rated, with 55%, 16% and 29% of them judged appropriate, uncertain and inappropriate, respectively. Perforation and hemorrhage rates, as reported in 39 studies, were in general <0.1% and <0.3%, respectively. CONCLUSIONS: The updated EPAGE II criteria constitute an aid to clinical decision-making but should in no way replace individual judgment. Detailed panel results are freely available on the internet (www.epage.ch) and will thus constitute a reference source of information for clinicians.
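
As a rough illustration of how the panel votes might be consolidated, the sketch below applies a simplified reading of the RAND/UCLA rules: the median drives the category, and marked disagreement downgrades an indication to "uncertain". The tercile-based disagreement rule and the cut-offs are common textbook choices, not necessarily the exact ones used by EPAGE II.

```python
import statistics

def categorize(ratings, disagreement_fraction=1/3):
    """Consolidate panel ratings (1-9) for one indication into a category.

    Simplified RAND/UCLA-style rule: median 7-9 -> appropriate, 4-6 -> uncertain,
    1-3 -> inappropriate; if at least a third of the panel sits in each extreme
    tercile (an illustrative disagreement rule), the result is 'uncertain'.
    """
    n = len(ratings)
    median = statistics.median(ratings)
    low = sum(r <= 3 for r in ratings)
    high = sum(r >= 7 for r in ratings)
    if low >= disagreement_fraction * n and high >= disagreement_fraction * n:
        return "uncertain"
    if median >= 7:
        return "appropriate"
    if median >= 4:
        return "uncertain"
    return "inappropriate"

# Example: a 14-expert panel rating one hypothetical indication
print(categorize([8, 7, 9, 8, 6, 7, 8, 9, 7, 8, 5, 7, 8, 9]))  # -> appropriate
```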

Relevance:

20.00%

Publisher:

Abstract:

IMPORTANCE: The clinical benefit of adding a macrolide to a β-lactam for empirical treatment of moderately severe community-acquired pneumonia remains controversial. OBJECTIVE: To test noninferiority of a β-lactam alone compared with a β-lactam and macrolide combination in moderately severe community-acquired pneumonia. DESIGN, SETTING, AND PARTICIPANTS: Open-label, multicenter, noninferiority, randomized trial conducted from January 13, 2009, through January 31, 2013, in 580 immunocompetent adult patients hospitalized in 6 acute care hospitals in Switzerland for moderately severe community-acquired pneumonia. Follow-up extended to 90 days. Outcome assessors were masked to treatment allocation. INTERVENTIONS: Patients were treated with a β-lactam and a macrolide (combination arm) or with a β-lactam alone (monotherapy arm). Legionella pneumophila infection was systematically searched and treated by addition of a macrolide to the monotherapy arm. MAIN OUTCOMES AND MEASURES: Proportion of patients not reaching clinical stability (heart rate <100/min, systolic blood pressure >90 mm Hg, temperature <38.0°C, respiratory rate <24/min, and oxygen saturation >90% on room air) at day 7. RESULTS: After 7 days of treatment, 120 of 291 patients (41.2%) in the monotherapy arm vs 97 of 289 (33.6%) in the combination arm had not reached clinical stability (7.6% difference, P = .07). The upper limit of the 1-sided 90% CI was 13.0%, exceeding the predefined noninferiority boundary of 8%. Patients infected with atypical pathogens (hazard ratio [HR], 0.33; 95% CI, 0.13-0.85) or with Pneumonia Severity Index (PSI) category IV pneumonia (HR, 0.81; 95% CI, 0.59-1.10) were less likely to reach clinical stability with monotherapy, whereas patients not infected with atypical pathogens (HR, 0.99; 95% CI, 0.80-1.22) or with PSI category I to III pneumonia (HR, 1.06; 95% CI, 0.82-1.36) had equivalent outcomes in the 2 arms. There were more 30-day readmissions in the monotherapy arm (7.9% vs 3.1%, P = .01). Mortality, intensive care unit admission, complications, length of stay, and recurrence of pneumonia within 90 days did not differ between the 2 arms. CONCLUSIONS AND RELEVANCE: We did not find noninferiority of β-lactam monotherapy in patients hospitalized for moderately severe community-acquired pneumonia. Patients infected with atypical pathogens or with PSI category IV pneumonia had delayed clinical stability with monotherapy. TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00818610.
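
The headline result can be approximately reproduced from the counts given above with a normal-approximation sketch; the Wald-type interval below is an assumption about the exact method used by the trial statisticians.

```python
from math import sqrt
from statistics import NormalDist

# Patients not reaching clinical stability at day 7 (counts from the abstract)
x_mono, n_mono = 120, 291      # beta-lactam monotherapy
x_comb, n_comb = 97, 289       # beta-lactam plus macrolide

p1, p2 = x_mono / n_mono, x_comb / n_comb
diff = p1 - p2                 # about 7.7 points here; the paper reports 7.6% (41.2% - 33.6%)
se = sqrt(p1 * (1 - p1) / n_mono + p2 * (1 - p2) / n_comb)
upper = diff + NormalDist().inv_cdf(0.90) * se   # upper limit of the 1-sided 90% CI

print(f"difference = {diff:.3f}, one-sided 90% CI upper limit = {upper:.3f}")
# upper ≈ 0.128 (the paper reports 13.0%), above the 8% margin,
# so noninferiority of monotherapy is not shown.
```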

Relevance:

20.00%

Publisher:

Abstract:

The infinite slope method is widely used as the geotechnical component of geomorphic and landscape evolution models. Its assumption that shallow landslides are infinitely long (in a downslope direction) is usually considered valid for natural landslides on the basis that they are generally long relative to their depth. However, this is rarely justified, because the critical length/depth (L/H) ratio below which edge effects become important is unknown. We establish this critical L/H ratio by benchmarking infinite slope stability predictions against finite element predictions for a set of synthetic two-dimensional slopes, assuming that the difference between the predictions is due to error in the infinite slope method. We test the infinite slope method for six different L/H ratios to find the critical ratio at which its predictions fall within 5% of those from the finite element method. We repeat these tests for 5000 synthetic slopes with a range of failure plane depths, pore water pressures, friction angles, soil cohesions, soil unit weights and slope angles characteristic of natural slopes. We find that: (1) infinite slope stability predictions are consistently too conservative for small L/H ratios; (2) the predictions always converge to within 5% of the finite element benchmarks by an L/H ratio of 25 (i.e. the infinite slope assumption is reasonable for landslides 25 times longer than they are deep); but (3) they can converge at much lower ratios depending on slope properties, particularly for low cohesion soils. The implication for catchment scale stability models is that the infinite length assumption is reasonable if their grid resolution is coarse (e.g. >25 m). However, it may also be valid even at much finer grid resolutions (e.g. 1 m), because spatial organization in the predicted pore water pressure field reduces the probability of short landslides and minimizes the risk that predicted landslides will have L/H ratios less than 25. Copyright (c) 2012 John Wiley & Sons, Ltd.
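
For reference, the quantity such models evaluate at each point is the infinite slope factor of safety; the standard form below (with pore pressure u acting on the failure plane) is given for illustration, and the parameter values are assumptions rather than cases from the paper.

```python
from math import radians, sin, cos, tan

def infinite_slope_fs(c, gamma, depth, beta_deg, phi_deg, pore_pressure):
    """Factor of safety for an infinitely long planar slide (standard form).

    c: effective cohesion (kPa), gamma: soil unit weight (kN/m^3),
    depth: vertical depth to the failure plane (m), beta_deg: slope angle (deg),
    phi_deg: effective friction angle (deg), pore_pressure: u on the plane (kPa).
    """
    beta, phi = radians(beta_deg), radians(phi_deg)
    normal_stress = gamma * depth * cos(beta) ** 2 - pore_pressure
    resisting = c + normal_stress * tan(phi)        # shear strength on the plane
    driving = gamma * depth * sin(beta) * cos(beta)  # downslope shear stress
    return resisting / driving

# Illustrative low-cohesion slope (all values assumed): FS close to 1
print(infinite_slope_fs(c=2.0, gamma=19.0, depth=1.5,
                        beta_deg=30, phi_deg=32, pore_pressure=5.0))
```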

Relevance:

20.00%

Publisher:

Abstract:

We study optimal public rationing of an indivisible good and private sector price responses. Consumers differ in their wealth and costs of provision. Due to a limited budget, some consumers must be rationed. Public rationing determines the characteristics of consumers who seek supply from the private sector, where a firm sets prices based on consumers' cost information and in response to the rationing rule. We consider two information regimes. In the first, the public supplier rations consumers according to their wealth information. In equilibrium, the public supplier must ration both rich and poor consumers. Supplying all poor consumers would leave only rich consumers in the private market, and the firm would react by setting a high price. Rationing some poor consumers is optimal, and it implements a price reduction in the private market. In the second information regime, the public supplier rations consumers according to both their wealth and cost information. In equilibrium, consumers are allocated the good if and only if their costs are below a threshold. Wealth information is not used. Rationing based on cost results in higher equilibrium total consumer surplus than rationing based on wealth.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: While the prices of pharmaceuticals are relatively low in Greece, expenditure on them is growing more rapidly than almost anywhere else in the European Union. OBJECTIVE: To describe and explain the rise in drug expenditures through decomposition of the increase into the contribution of changes in prices, in volumes and a product-mix effect. METHODS: The decomposition of the growth in pharmaceutical expenditures in Greece over the period 1991-2006 was conducted using data from the largest social insurance fund (IKA) that covers more than 50% of the population. RESULTS: Real drug spending increased by 285%, despite a 58% decrease in the relative price of pharmaceuticals. The increase in expenditure is mainly attributable to a switch to more innovative, but more expensive, pharmaceuticals, indicated by a product-mix residual of 493% in the decomposition. A rising volume of drugs also plays a role, and this is due to an increase in the number of prescriptions issued per doctor visit, rather than an increase in the number of visits or the population size. CONCLUSIONS: Rising pharmaceutical expenditures are strongly determined by physicians' prescribing behaviour, which is not subject to any monitoring and for which there are no incentives to be cost conscious.
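
The decomposition logic can be sketched as follows; the exact index formulas used in the paper are not given in the abstract, so the Laspeyres-type price index and simple volume count below are illustrative choices, as are the toy data.

```python
# Multiplicative decomposition of expenditure growth:
#   E_t / E_0 = price_index * volume_index * product_mix_residual
# The residual captures the shift toward newer, more expensive products.

def decompose(base, current):
    """base/current: dict mapping drug -> (price, quantity)."""
    e0 = sum(p * q for p, q in base.values())
    e1 = sum(p * q for p, q in current.values())
    common = base.keys() & current.keys()
    # Laspeyres price index over products present in both periods (one common choice)
    price_index = (sum(current[d][0] * base[d][1] for d in common) /
                   sum(base[d][0] * base[d][1] for d in common))
    # Volume index as the growth in total units dispensed (another simple choice)
    volume_index = (sum(q for _, q in current.values()) /
                    sum(q for _, q in base.values()))
    mix_residual = (e1 / e0) / (price_index * volume_index)
    return e1 / e0, price_index, volume_index, mix_residual

base = {"A": (10.0, 100), "B": (5.0, 200)}
current = {"A": (9.0, 80), "B": (4.0, 150), "C": (40.0, 60)}  # costlier new drug C enters
growth, price_idx, vol_idx, mix = decompose(base, current)
print(f"spending x{growth:.2f} despite prices x{price_idx:.2f}; product mix x{mix:.2f}")
```

In this toy example spending rises even though prices and volumes fall, because the product-mix residual dominates, which mirrors the pattern the abstract reports for Greece.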

Relevance:

20.00%

Publisher:

Abstract:

Diagnosis Related Groups (DRG) are frequently used to standardize the comparison of consumption variables, such as length of stay (LOS). In order to be reliable, this comparison must control for the presence of outliers, i.e. values far removed from the pattern set by the majority of the data. Indeed, outliers can distort the usual statistical summaries, such as means and variances. A common practice is to trim LOS values according to various empirical rules, but there is little theoretical support for choosing between alternative procedures. This pilot study explores the possibility of describing LOS distributions with parametric models which provide the necessary framework for the use of robust methods.
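
One way such a parametric, robust description could look is sketched below: fit a lognormal model to LOS using median- and MAD-based estimates (which extreme stays barely influence) and derive a trim point from the fitted model. The lognormal choice, the 99th-percentile cut-off and the data are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic LOS data for one DRG: mostly lognormal, plus a few extreme stays
los = np.concatenate([rng.lognormal(mean=1.6, sigma=0.5, size=500),
                      [60.0, 75.0, 90.0]])

# Robust lognormal fit: location and scale of log(LOS) from the median and the
# median absolute deviation (scaled to be consistent with a normal sigma).
log_los = np.log(los)
mu = np.median(log_los)
sigma = 1.4826 * np.median(np.abs(log_los - mu))

upper = np.exp(mu + stats.norm.ppf(0.99) * sigma)   # model-based 99th-percentile trim point
outliers = los[los > upper]
print(f"trim point ≈ {upper:.1f} days, {outliers.size} stays flagged as high outliers")
```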

Relevance:

20.00%

Publisher:

Abstract:

General Introduction: This thesis can be divided into two main parts: the first one, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part - the fourth chapter - is concerned with Anti-Dumping (AD) measures. Despite the wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements - whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA - it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent.

Part I: In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Pr. Olivier Cadot, Céline Carrère and Pr. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it to the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, it shows, in the case of PANEURO, that the R-index is useful to summarize how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs.

The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore changing all instruments into an MFC would bring improved transparency, pretty much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines (a stylized sketch of this three-step exercise appears after this summary). This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and a uniform tariff preference will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests.

The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the potential model for future EU-centered PTAs. First, I have studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "Single List" RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function, in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs - outside sensitive sectors - was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession.

Part II: The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced in the Anti-Dumping Agreement (ADA) a mandatory "sunset-review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there was a serious risk of resumption of injurious dumping. The last chapter, written with Pr. Olivier Cadot and Pr. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count data analysis and survival analysis.
First, using Poisson and Negative Binomial regressions, the count of AD measures' revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on measures' initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
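
A stylized version of the chapter 2 three-step MFC exercise (estimate, invert line by line, aggregate with trade weights) might look like the sketch below; the logistic functional form, the variable names and the synthetic data are assumptions made for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic tariff-line data (illustrative): preference margin, an MFC-like
# restrictiveness measure in [0, 1] (higher = laxer), trade value and utilization.
n = 500
pref = rng.uniform(0.0, 0.15, n)
mfc = rng.uniform(0.1, 0.9, n)
value = rng.lognormal(3, 1, n)
true_u = 1 / (1 + np.exp(-(-1.0 + 25 * pref + 3 * mfc)))
util = np.clip(true_u + rng.normal(0, 0.05, n), 0.01, 0.99)

# Step 1: estimate utilization as a function of preference margin and MFC
# (logit-transformed linear model, a simple stand-in for the actual estimator).
X = sm.add_constant(np.column_stack([pref, mfc]))
b0, b1, b2 = sm.OLS(np.log(util / (1 - util)), X).fit().params

# Step 2: invert line by line - the MFC that reproduces each observed
# utilization rate given the line's own preference margin.
sim_mfc = (np.log(util / (1 - util)) - b0 - b1 * pref) / b2

# Step 3: trade-weighted average simulated MFC, a candidate uniform instrument.
print(f"trade-weighted simulated MFC ≈ {np.average(sim_mfc, weights=value):.2f}")
```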
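
The chapter 4 count-data step can be sketched in a similarly stylized way; the synthetic panel, the variable names and the bare-bones specification below (the chapter also fits a Negative Binomial and includes further controls) are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic country-year panel: revocations of AD measures, initiations lagged
# five years, and a dummy for the post-1995 sunset-review agreement period.
n = 300
df = pd.DataFrame({
    "initiations_lag5": rng.poisson(6, n),
    "post_agreement": rng.integers(0, 2, n),
})
rate = np.exp(0.2 + 0.05 * df.initiations_lag5
              + 0.10 * df.post_agreement * df.initiations_lag5)
df["revocations"] = rng.poisson(rate)

# Poisson regression of revocations on lagged initiations, letting the
# coefficient differ after the agreement (the interaction term).
poisson = smf.glm("revocations ~ initiations_lag5 * post_agreement",
                  data=df, family=sm.families.Poisson()).fit()
print(poisson.params)
# A strict five-year cycle would make revocations track initiations lagged
# five years one for one; the thesis finds coefficients well short of that.
```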

Relevance:

20.00%

Publisher:

Abstract:

Sale price decisions are among the most important functions managers face in the business environment, given their strategic character for the success of companies. The price formation (FP) process is fundamental for companies, although it involves a degree of difficulty and complexity because it depends on many factors. The first stage of price formation is the correct estimation of costs, needed to arrive at suitable prices. Beyond costs, other internal and external factors must be analysed so that decisions can be taken correctly and a competitive price obtained. Managers therefore need accurate information in order to make decisions with confidence, taking into account all aspects relevant to price formation. The main objectives of this work are to identify an adequate price formation policy, to discover which factors companies consider most relevant when setting prices, and to determine which price formation methods are used by the import companies of São Vicente. The methodology consisted first of a bibliographical and exploratory study. Data were collected through multiple-choice questionnaires with closed questions, administered to managers and to those responsible for cost estimation and price formation, and complemented by an interview with a specialist in the field. The data were analysed with qualitative and quantitative techniques applied to the questionnaire responses. The results show that the sale price formation method adopted by the companies of São Vicente is based on both cost and the market (a mixed method): the price is defined on the basis of costs, but it is then adjusted with reference to competitors' prices.
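
The mixed cost-and-market method reported in the findings can be sketched as follows; the mark-up and weighting values are illustrative assumptions, not figures from the survey.

```python
def mixed_method_price(unit_cost, markup, competitor_price, market_weight=0.4):
    """Mixed pricing: start from a cost-plus price, then pull it part of the way
    toward the competitor's price. market_weight=0 is pure cost-plus pricing,
    market_weight=1 is pure market-based pricing (the weight is illustrative).
    """
    cost_plus = unit_cost * (1 + markup)
    return (1 - market_weight) * cost_plus + market_weight * competitor_price

# Example: unit cost 800, 30% target mark-up, main competitor sells at 950
print(mixed_method_price(unit_cost=800, markup=0.30, competitor_price=950))
# -> 1004.0, below the pure cost-plus price of 1040 and closer to the market price
```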