829 results for Economic based allocation


Relevance: 30.00%
Publisher:
Abstract:

Starting from Avner Offer's comment that the First World War was not only a war of steel and gold, but also of bread and potatoes (1989: 1), and drawing on my own research into British and Australian preparations for economic warfare as well as on sources from the Entente, the Central Powers, the United States, Canada and Australia, my presentation will focus on the interdependence of the measures taken by Entente and Central Power authorities in the second half of 1916. Already a year earlier, both sides had become aware that this war would not be decided on the battlefield alone, but that access to primary as well as secondary resources would be decisive. Accordingly, measures that could strike the enemy in this field were increasingly discussed and put into place, and this at a time when weather conditions reduced harvests across Europe, North America and Argentina.

Relevance: 30.00%
Publisher:
Abstract:

This paper deals with the difference between risk and insecurity from an anthropological perspective, using a critical New Institutionalist approach. Risk refers to the ability to reduce undesirable outcomes based on the information actors have about possible outcomes; insecurity refers to the lack of such information. In the neo-liberal setting of a resource-rich area in Zambia, Central Africa, local actors – men and women – face risk and insecurity in market constellations between rural and urban areas. They attempt to cope with risk through technical means and the diversification of livelihood strategies. But as common-pool resources have been transformed from common property institutions to open access, which also makes competitors and partners in "free" markets unpredictable, actors rely on magic options to reduce insecurity and transform it into risk-assessing strategies as an adaptation to modern times.
Keywords: risk, insecurity, institutional change, neo-liberal market, common-pool resources, livelihood strategies, magic, Zambia.

Relevance: 30.00%
Publisher:
Abstract:

The main aim of the methodology presented in this paper is to provide a framework for a participatory process for the appraisal and selection of options to mitigate desertification and land degradation. The methodology is being developed within the EU project DESIRE (www.desire-project.eu/) in collaboration with WOCAT (www.wocat.org) and is used to select promising conservation strategies for test implementation in each of the 16 degradation and desertification hotspot sites in the Mediterranean and around the world. It consists of three main parts. In a first step, prevention and mitigation strategies already applied at the respective DESIRE study site are identified and listed during a workshop with representatives of different stakeholder groups (land users, policy makers, researchers). This participatory and process-oriented approach initiates a mutual learning process among the stakeholders by sharing knowledge and jointly reflecting on current problems and solutions related to land degradation and desertification. In the second step, the identified, locally applied solutions (technologies and approaches) are assessed with the help of the WOCAT methodology: comprehensive questionnaires and a database system have been developed to document and evaluate all relevant aspects of technical measures as well as implementation approaches by teams of researchers and specialists, together with land users. This ensures a systematic assessment and compilation of local information, together with specific details about the environmental and socio-economic setting. The third part consists of another stakeholder workshop where promising strategies for sustainable land management in the given context are selected, based on the WOCAT best-practices database, including the locally applied strategies evaluated at the DESIRE sites. These promising strategies will then be assessed with the help of a selection and decision support tool and adapted for test implementation at the study site.

Relevance: 30.00%
Publisher:
Abstract:

This paper focuses on two regions in the United States that have emerged as high-technology regions in the absence of major research universities. The case of Portland's Silicon Forest is compared to Washington, DC. In both regions, high-technology economies grew because of industrial restructuring processes. The paper argues that in both regions other actors—such as firms and government laboratories—spurred the development of knowledge-based economies and catalysed the engagement of higher education institutions in economic development. The paper confirms and advances the triple helix model of university–government–industry relationships and posits that future studies have to examine degrees of university-region engagement.

Relevance: 30.00%
Publisher:
Abstract:

Perceived profitability is a key factor in explaining farmers' decision to adopt or not adopt sustainable land management (SLM) technologies. Despite this importance, relatively little is known about the economics of SLM. This paper contributes to the literature by analysing data on costs and perceived cost/benefit ratios of SLM technologies. Data are taken from the World Overview of Conservation Approaches and Technologies technology database and cover 363 case studies conducted in a variety of countries between 1990 and 2012. Based on an in-depth descriptive analysis, we determine what costs accrue to local stakeholders and assess perceived short-term and long-term cost/benefit ratios. Our results show that a large majority of the technologies in our sample are perceived as being profitable: 73% were perceived to have a positive or at least neutral cost/benefit ratio in the short term, while 97% were perceived to have a positive or very positive cost/benefit ratio in the long term. An additional empirical analysis confirms that economic factors are key determinants of land users' decisions to adopt or not adopt SLM technologies. We conclude that a wide range of existing SLM practices generate considerable benefits not only for land users, but for other stakeholders as well. High initial investment costs associated with some practices may, however, constitute a barrier to their adoption; short-term support for land users can help to promote these practices where appropriate.
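
The descriptive analysis summarized above amounts to tabulating perceived cost/benefit ratings across case studies. The sketch below shows that tabulation in minimal form for a handful of hypothetical WOCAT-style records; the field names, technologies, and rating scale are illustrative assumptions, not the database's actual schema.

```python
# Hypothetical WOCAT-style records: each technology carries a perceived
# short-term and long-term cost/benefit rating on an ordinal scale.
RATINGS = ["very negative", "negative", "neutral", "positive", "very positive"]

records = [
    {"technology": "terracing",    "short_term": "negative", "long_term": "very positive"},
    {"technology": "agroforestry", "short_term": "neutral",  "long_term": "positive"},
    {"technology": "mulching",     "short_term": "positive", "long_term": "positive"},
    {"technology": "check dams",   "short_term": "negative", "long_term": "very positive"},
]

def share_at_least(records, field, threshold):
    """Share of records whose rating on `field` is at least `threshold`."""
    cutoff = RATINGS.index(threshold)
    hits = sum(1 for r in records if RATINGS.index(r[field]) >= cutoff)
    return hits / len(records)

# Counterparts of the shares reported in the abstract (73% and 97%):
print(f"short-term >= neutral : {share_at_least(records, 'short_term', 'neutral'):.0%}")
print(f"long-term  >= positive: {share_at_least(records, 'long_term', 'positive'):.0%}")
```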

Relevance: 30.00%
Publisher:
Abstract:

The democratic deficit of evidence-based policymaking and the little attention the approach pays to values and norms have repeatedly been criticized. This article argues that direct-democratic campaigns may provide an arena for citizens and stakeholders to debate the belief systems inherent in evidence. The study is based on a narrative analysis of Programme for International Student Assessment (PISA) reports, as well as of newspaper coverage and governmental information referring to PISA in Swiss direct-democratic campaigns on a variety of school policy issues. The findings show that PISA reports are discursive instruments rather than 'objective evidence'. The reports promote a narrative of economic progress through educational evidence that is adopted without scrutiny by governmental coalitions in direct-democratic campaigns to justify school policy reforms. Yet the dominant PISA narrative is contested in two counter-narratives, one endorsed by numerous citizens, the other by a group of experts. These counter-narratives question how PISA is used by an 'expertocracy' to prescribe reforms, as well as the performance ideology inherent in it. Overall, these findings suggest that direct-democratic campaigns may make more transparent how evidence is produced and used according to existing belief systems. Evidence, on the other hand, may be a stimulus for democratic discourse by feeding the debate with potential policy problems and solutions. Thus, direct-democratic debates may reconcile the normative positions of citizens with the desire to base decisions on empirical evidence.

Relevance: 30.00%
Publisher:
Abstract:

INTRODUCTION: Dexmedetomidine was shown in two European randomized double-blind double-dummy trials (PRODEX and MIDEX) to be non-inferior to propofol and midazolam in maintaining target sedation levels in mechanically ventilated intensive care unit (ICU) patients. Additionally, dexmedetomidine shortened the time to extubation versus both standard sedatives, suggesting that it may reduce ICU resource needs and thus lower ICU costs. Considering resource utilization data from these two trials, we performed a secondary, cost-minimization analysis assessing the economics of dexmedetomidine versus standard care sedation.
METHODS: The total ICU costs associated with each study sedative were calculated on the basis of total study sedative consumption and the number of days patients remained intubated, required non-invasive ventilation, or required ICU care without mechanical ventilation. The daily unit costs for these three consecutive ICU periods were set to decline toward discharge, reflecting the observed reduction in mean daily Therapeutic Intervention Scoring System (TISS) points between the periods. A number of additional sensitivity analyses were performed, including one in which the total ICU costs were based on the cumulative sum of daily TISS points over the ICU period, and two further scenarios with declining direct variable daily costs only.
RESULTS: Based on pooled data from both trials, sedation with dexmedetomidine resulted in lower total ICU costs than the standard sedatives, with a difference of €2,656 in the median total ICU costs, €11,864 (interquartile range €7,070 to €23,457) versus €14,520 (€7,871 to €26,254), and of €1,649 in the mean total ICU costs. The median (mean) total ICU costs with dexmedetomidine were €1,292 (€747) lower than with propofol and €3,573 (€2,536) lower than with midazolam. The result was robust, indicating lower costs with dexmedetomidine in all sensitivity analyses, including those in which only direct variable ICU costs were considered. The likelihood of dexmedetomidine resulting in lower total ICU costs compared with pooled standard care was 91.0% (72.4% versus propofol and 98.0% versus midazolam).
CONCLUSIONS: From an economic point of view, dexmedetomidine appears to be a preferable option compared with standard sedatives for providing light to moderate ICU sedation exceeding 24 hours. The savings potential results primarily from the shorter time to extubation.
TRIAL REGISTRATION: ClinicalTrials.gov NCT00479661 (PRODEX), NCT00481312 (MIDEX).
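
As a rough illustration of the costing logic described in the methods, the sketch below computes a per-patient total ICU cost as the sedative cost plus the days spent in each of the three consecutive ICU periods multiplied by declining daily unit costs. All figures and names are hypothetical placeholders, not the unit costs used in the study.

```python
# Hypothetical daily unit costs (EUR) for the three consecutive ICU periods,
# declining toward discharge as described in the methods.
DAILY_COST = {
    "intubated": 1500.0,
    "non_invasive_ventilation": 1100.0,
    "icu_no_ventilation": 800.0,
}

def total_icu_cost(sedative_cost, days_by_period):
    """Sedative acquisition cost plus period days weighted by daily unit costs."""
    care_cost = sum(DAILY_COST[period] * days for period, days in days_by_period.items())
    return sedative_cost + care_cost

# Example patient: 3 days intubated, 1 day NIV, 2 days in the ICU without ventilation.
example = {"intubated": 3, "non_invasive_ventilation": 1, "icu_no_ventilation": 2}
print(f"Total ICU cost: EUR {total_icu_cost(sedative_cost=250.0, days_by_period=example):,.0f}")
```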

Relevance: 30.00%
Publisher:
Abstract:

As often pointed out in the literature on the European debt crisis, the policy programme of austerity and internal devaluation imposed on countries in the Eurozone's periphery exhibits a lack of democratic legitimacy. This article analyses the consequences these developments have for democratic support at both the European and national levels. We show that through the policies of economic adjustment, a majority of citizens in crisis countries has become ‘detached’ from their democratic political system. By cutting loose the Eurozone's periphery from the rest of Europe in terms of democratic legitimacy, the Euro has divided the union, instead of uniting it as foreseen by its architects. Our results are based on aggregated Eurobarometer surveys conducted in 28 European Union (EU) member states between 2002 and 2014. We employ quantitative time-series cross-sectional regression analyses. Moreover, we estimate the causal effect of economic adjustment in a comparative case study of four cases using the synthetic control method.
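
For readers unfamiliar with the synthetic control method mentioned above, the sketch below shows its core weight-finding step in minimal form: non-negative donor weights summing to one are chosen to reproduce the treated unit's pre-treatment outcome path, and the weighted donor average then serves as the post-treatment counterfactual. This is a bare-bones illustration under assumed toy data, without predictor weighting or cross-validation, and not the estimator used in the article.

```python
import numpy as np
from scipy.optimize import minimize

def synthetic_control_weights(y_treated_pre, Y_donors_pre):
    """Weights on donor units (columns) that best match the treated unit's
    pre-treatment outcomes, constrained to the unit simplex."""
    n_donors = Y_donors_pre.shape[1]

    def loss(w):
        return np.sum((y_treated_pre - Y_donors_pre @ w) ** 2)

    res = minimize(
        loss,
        x0=np.full(n_donors, 1.0 / n_donors),
        bounds=[(0.0, 1.0)] * n_donors,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    )
    return res.x

# Toy example: 5 pre-treatment periods, 3 donor countries.
rng = np.random.default_rng(0)
Y_donors_pre = rng.normal(50, 5, size=(5, 3))
y_treated_pre = Y_donors_pre @ np.array([0.6, 0.3, 0.1]) + rng.normal(0, 0.5, 5)

w = synthetic_control_weights(y_treated_pre, Y_donors_pre)
print("donor weights:", np.round(w, 2))
# Post-treatment counterfactual = Y_donors_post @ w (not shown here).
```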

Relevance: 30.00%
Publisher:
Abstract:

Throughout the last millennium, mankind was affected by prolonged deviations from the climate mean state. While periods like the Maunder Minimum in the 17th century have been assessed in greater detail, earlier cold periods such as the 15th century received much less attention due to the sparse information available. Based on new evidence from different sources ranging from proxy archives to model simulations, it is now possible to provide an end-to-end assessment about the climate state during an exceptionally cold period in the 15th century, the role of internal, unforced climate variability and external forcing in shaping these extreme climatic conditions, and the impacts on and responses of the medieval society in Central Europe. Climate reconstructions from a multitude of natural and human archives indicate that, during winter, the period of the early Spörer Minimum (1431–1440 CE) was the coldest decade in Central Europe in the 15th century. The particularly cold winters and normal but wet summers resulted in a strong seasonal cycle that challenged food production and led to increasing food prices, a subsistence crisis, and a famine in parts of Europe. As a consequence, authorities implemented adaptation measures, such as the installation of grain storage capacities, in order to be prepared for future events. The 15th century is characterised by a grand solar minimum and enhanced volcanic activity, which both imply a reduction of seasonality. Climate model simulations show that periods with cold winters and strong seasonality are associated with internal climate variability rather than external forcing. Accordingly, it is hypothesised that the reconstructed extreme climatic conditions during this decade occurred by chance and in relation to the partly chaotic, internal variability within the climate system.

Relevance: 30.00%
Publisher:
Abstract:

Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as an improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When parameterizing the algorithm based on the retrospective analysis of 6 years of historic data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
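
To make the detection step concrete, here is a minimal quasi-Poisson sketch in the spirit of the improved Farrington approach: a Poisson GLM with trend and yearly seasonality is fitted to monthly condemnation counts, overdispersion is estimated from the Pearson chi-square, and a count is flagged when it exceeds the upper 0.975 normal-approximation bound around the fitted mean. The real algorithm additionally downweights past outbreaks and uses a more careful threshold transformation; the data here are simulated placeholders.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def detect_outbreaks(counts, quantile=0.975):
    """Flag months whose count exceeds the upper bound of a quasi-Poisson
    baseline model with yearly seasonality (simplified Farrington-style check)."""
    n = len(counts)
    t = np.arange(n)
    X = sm.add_constant(np.column_stack([
        t,                            # linear trend
        np.sin(2 * np.pi * t / 12),   # yearly seasonality (monthly data)
        np.cos(2 * np.pi * t / 12),
    ]))
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale="X2")
    mu = fit.fittedvalues
    phi = fit.scale               # Pearson-based overdispersion estimate
    upper = mu + norm.ppf(quantile) * np.sqrt(phi * mu)
    return counts > upper

# Toy example: 6 years of monthly condemnation counts with one injected outbreak.
rng = np.random.default_rng(1)
baseline = 30 + 10 * np.sin(2 * np.pi * np.arange(72) / 12)
counts = rng.poisson(baseline).astype(float)
counts[60] += 80                  # simulated outbreak month
print("flagged months:", np.where(detect_outbreaks(counts))[0])
```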

Relevance: 30.00%
Publisher:
Abstract:

Many countries treat income generated via exports favourably, especially when production takes place in special zones known as export processing zones (EPZs). EPZs can be defined as specific, geographically defined zones or areas that are subject to special administration and that generally offer tax incentives, such as duty-free imports when producing for export, exemption from other regulatory constraints linked to import for the domestic market, sometimes favourable treatment in terms of industrial regulation, and the streamlining of border-clearing procedures. We describe a database of WTO Members that employ special economic zones as part of their industrial policy mix. It is based on WTO notifications and monitoring through the WTO's trade policy review mechanism (TPRM), supplemented with information from the ILO, the World Bank, and primary sources. We also provide some rough analysis of the relationship between the use of EPZs and the carbon intensity of exports, and of relative levels of investment across countries with and without special zones.
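
The "rough analysis" mentioned above amounts to comparing outcome variables across countries with and without special zones. A minimal sketch follows, assuming a hypothetical country-level table with an EPZ indicator, a carbon-intensity-of-exports measure, and an investment share; the column names and values are illustrative, not the database's actual fields.

```python
import pandas as pd

# Hypothetical country-level table; the real database is built from WTO
# notifications and trade policy reviews, supplemented by ILO/World Bank data.
df = pd.DataFrame({
    "country":                 ["A", "B", "C", "D", "E", "F"],
    "has_epz":                 [True, True, False, True, False, False],
    "export_carbon_intensity": [0.42, 0.55, 0.31, 0.60, 0.28, 0.35],  # kg CO2 per USD
    "investment_share_gdp":    [24.0, 27.5, 19.0, 30.0, 21.0, 20.5],  # percent
})

# First-pass descriptive comparison: group means by EPZ status.
print(df.groupby("has_epz")[["export_carbon_intensity", "investment_share_gdp"]].mean())
```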

Relevance: 30.00%
Publisher:
Abstract:

The doctrine of fair use allows unauthorized copying of original works of art, music, and literature for limited purposes like criticism, research, and education, based on the rationale that copyright holders would consent to such uses if bargaining were possible. This paper develops the first formal analysis of fair use in an effort to derive the efficient legal standard for applying the doctrine. The model interprets copies and originals as differentiated products and defines fair use as a threshold separating permissible copying from infringement. Application of the analysis to several key cases (including the recent Napster case) shows that this interpretation is consistent with actual legal reasoning. The analysis also underscores the role of technology in shaping the efficient scope of fair use.
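
Because the abstract describes the model only verbally, the display below gives a stylized threshold condition of the kind such an analysis produces; the notation ($s$, $s^{*}$, $B$, $D$, $W$) is assumed for illustration and is not the paper's actual formulation. Copying with substitution degree $s$ is treated as fair use below a cutoff $s^{*}$ chosen to maximize welfare, trading users' access gains against the harm to the copyright holder.

```latex
% Stylized welfare-maximizing threshold (illustrative notation only, not the paper's model):
\[
  \text{fair use} \iff s \le s^{*},
  \qquad
  s^{*} \in \arg\max_{\bar{s}} \; W(\bar{s}) = B(\bar{s}) - D(\bar{s}),
\]
% B(\bar{s}) = aggregate benefit to users from copying permitted up to \bar{s};
% D(\bar{s}) = copyright holder's lost profit (and hence weakened creation incentives);
% at an interior optimum the efficient standard satisfies B'(s^{*}) = D'(s^{*}).
```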

Relevance: 30.00%
Publisher:
Abstract:

The doctrine of fair use allows limited copying of creative works based on the rationale that copyright holders would consent to such uses if bargaining were possible. This paper develops a formal model of fair use in an effort to derive the efficient legal standard for applying the doctrine. The model interprets copies and originals as differentiated products and defines fair use as a threshold separating permissible copying from infringement. The analysis highlights the role of technology in shaping the efficient standard. Discussion of several key cases illustrates the applicability of the model.

Relevance: 30.00%
Publisher:
Abstract:

Monte Carlo simulation was conducted to investigate parameter estimation and hypothesis testing in some well-known adaptive randomization procedures. The four urn models studied are the Randomized Play-the-Winner (RPW), Randomized Pólya Urn (RPU), Birth and Death Urn with Immigration (BDUI), and Drop-the-Loser Urn (DL). Two sequential estimation methods, sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of the simple difference (RSIHR), the relative risk (ORR), and the odds ratio (OOR), respectively. The log likelihood ratio test and three Wald-type tests (simple difference, log of relative risk, log of odds ratio) are compared across the adaptive procedures. Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. Compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics under SMLE have better normality and a lower type I error rate, and the power of hypothesis testing is more comparable with that of equal randomization. Usually, RSIHR has the highest power among the three optimal allocation ratios. However, the ORR allocation has better power and a lower type I error rate when the log of the relative risk is the test statistic, and the expected number of failures under ORR is smaller than under RSIHR. It is also shown that the simple difference of response rates has the worst normality among the four test statistics, and the power of the hypothesis test is always inflated when the simple difference is used. On the other hand, the normality of the log likelihood ratio test statistic is robust against the change of adaptive randomization procedure.
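
As a concrete illustration of the kind of urn scheme being simulated, the sketch below implements a basic Randomized Play-the-Winner rule for two treatments with Bernoulli responses: a ball is drawn to assign each patient, a ball of the same treatment is added after a success, and a ball of the opposite treatment after a failure. It is a minimal single-trial version with hypothetical success probabilities, not the dissertation's full simulation study.

```python
import random

def rpw_trial(p_success, n_patients, initial_balls=1, seed=None):
    """Simulate one two-arm trial under the Randomized Play-the-Winner rule.

    p_success -- dict mapping the two arm names to true success probabilities.
    Returns the number of patients assigned to each arm and the total failures.
    """
    rng = random.Random(seed)
    arms = list(p_success)
    urn = {arm: initial_balls for arm in arms}   # RPW(initial_balls, 1) urn
    assigned = {arm: 0 for arm in arms}
    failures = 0

    for _ in range(n_patients):
        # Draw an arm with probability proportional to its balls in the urn.
        arm = rng.choices(arms, weights=[urn[a] for a in arms])[0]
        assigned[arm] += 1
        if rng.random() < p_success[arm]:
            urn[arm] += 1                        # success: reward the winning arm
        else:
            other = arms[0] if arm == arms[1] else arms[1]
            urn[other] += 1                      # failure: add a ball for the other arm
            failures += 1

    return assigned, failures

assigned, failures = rpw_trial({"A": 0.7, "B": 0.4}, n_patients=200, seed=42)
print(assigned, "failures:", failures)
```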

Relevance: 30.00%
Publisher:
Abstract:

Objective. To explore issues in the current literature concerning possible social and economic ramifications of pharmacogenomic research. Design. Review of the literature. Data sources. Academic Search Premier, Blackwell Synergy, PubMed and Social Sciences Citation Index. Review methods. Articles dealing with the social and economic ramifications of pharmacogenomic research were selected. The articles discussed at least one of 5 areas (race, privacy/confidentiality, ethics, insurance, and research and development). Some restrictions were placed on the articles chosen to narrow them down to a relevant, manageable number. Results. Approximately 219 articles were selected for review; 159 were fully reviewed and found to be relevant to the issues; and 33 were cited. Conclusion. Insurance and research and development decisions are led by the free-market system with limited intervention from government. Race/ethnicity, privacy/confidentiality, and ethics continue to be debated with no clear answer. However, government regulation based on current laws provides some compromise on these issues.