975 results for "Comprehensive economic and financial analysis"



Heavy pig breeding in Italy is mainly oriented toward the production of high-quality processed products. Of particular importance is dry-cured ham production, which is strictly regulated and requires specific carcass characteristics correlated with green (fresh) leg characteristics. Furthermore, as pigs are slaughtered at about 160 kg live weight, the Italian pig breeding sector faces severe problems of production efficiency related to all biological aspects linked to growth, feed conversion, fat deposition, and so on. It is well known that production and carcass traits are in part genetically determined. Therefore, as a first step towards understanding the genetic basis of traits that could have a direct or indirect impact on dry-cured ham production, a candidate gene approach can be used to identify DNA markers associated with parameters of economic importance. In this thesis, we investigated three candidate genes for carcass and production traits (TRIB3, PCSK1, MUC4) in pig breeds used for dry-cured ham production, using different experimental approaches to find molecular markers associated with these parameters.
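The candidate gene strategy ultimately reduces to testing whether genotype at a marker predicts a production trait. As a minimal sketch of such a single-marker test (the thesis's actual statistical models are not reproduced here; all names and numbers below are hypothetical), a linear regression of a trait on allele dosage looks like this:

```python
# Minimal sketch of a single-marker candidate gene association test:
# a linear regression of a production trait on allele dosage (0/1/2).
# All data below are hypothetical illustration, not values from the thesis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 300                                   # hypothetical number of pigs
dosage = rng.integers(0, 3, size=n)       # copies of the candidate allele
# Simulated trait: back fat thickness (mm) with a small additive effect.
trait = 20.0 + 0.8 * dosage + rng.normal(0.0, 2.5, size=n)

res = stats.linregress(dosage, trait)
print(f"additive effect per allele copy: {res.slope:.2f} mm")
print(f"p-value for association:         {res.pvalue:.2e}")
```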


Climate change has been acknowledged as a threat to humanity. Most scholars agree that to avert dangerous climate change and to transform economies into low-carbon societies, deep global emission reductions are required by the year 2050. Under the framework of the Kyoto Protocol, the Clean Development Mechanism (CDM) is the only market-based instrument that encourages industrialised countries to pursue emission reductions in developing countries. The CDM aims to pay the incremental finance necessary to operationalize emission reduction projects which are otherwise not financially viable. According to the objectives of the Kyoto Protocol, the CDM should finance projects that are additional to those which would have happened anyway, contribute to sustainable development in the countries hosting the projects, and be cost-effective. To enable the identification of such projects, an institutional framework has been established by the Kyoto Protocol which lays out responsibilities for public and private actors. This thesis examines whether the CDM has achieved these objectives in practice and can thus be considered an effective tool to reduce emissions. To complete this investigation, the book applies economic theory and analyses the CDM from two perspectives. The first perspective is the supply dimension, which answers the question of how, in practice, the CDM system identified additional, cost-effective, sustainable projects and generated emission reductions. The main contribution of this book is the second perspective, the compliance dimension, which answers the question of whether industrialised countries effectively used the CDM for compliance with their Kyoto targets. The application of the CDM in the European Union Emissions Trading Scheme (EU ETS) is used as a case study. Where the analysis identifies inefficiencies within the supply or the compliance dimension, potential improvements of the legal framework are proposed and discussed.
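To illustrate the additionality logic described above: a project qualifies when it is not financially viable on its own but becomes viable once carbon credit revenue is included. The sketch below shows this with invented cash flows and discount rate; it is not the CDM's formal investment analysis procedure.

```python
# Hedged sketch of the financial-additionality idea behind the CDM:
# a project is "additional" if it is not viable on its own but becomes
# viable once Certified Emission Reduction (CER) revenue is included.
# All figures are invented for illustration.

def npv(cashflows, rate):
    """Net present value of a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

capex = -1_000_000.0                 # hypothetical upfront investment
operating = [120_000.0] * 10         # hypothetical yearly net revenue
cer_revenue = [40_000.0] * 10        # hypothetical yearly carbon credit sales
rate = 0.08                          # hypothetical discount rate

base = npv([capex] + operating, rate)
with_cers = npv([capex] + [o + c for o, c in zip(operating, cer_revenue)], rate)
print(f"NPV without CERs: {base:,.0f}")        # negative -> not viable alone
print(f"NPV with CERs:    {with_cers:,.0f}")   # positive -> additional project
```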


This research primarily represents a contribution to the lobbying regulation research arena. It introduces an index which, for the first time, attempts to measure the direct compliance costs of lobbying regulation. The Cost Indicator Index (CII) offers a brand-new platform for qualitative and quantitative assessment of adopted lobbying laws and proposals for such laws, in both the comparative and the sui generis dimension. The CII is not only the sole new tool introduced in the last decade; it is also the only tool available for comparative assessment of the costs of lobbying regulations. Besides the quantitative contribution, the research introduces an additional theoretical framework for complementary qualitative analysis of lobbying laws. The Ninefold theory allows a more structured assessment and classification of lobbying regulations by indicating both benefits and costs. Lastly, this research introduces the Cost-Benefit Labels (CBL). These labels might improve ex-ante impact assessment of lobbying regulation, primarily in the sui generis perspective. In its final part, the research focuses on four South East European countries (Slovenia, Serbia, Montenegro and Macedonia), brings them into the discussion for the first time, and calculates their CPI and CII scores. The special focus of the application was Serbia, whose proposed Law on Lobbying is analysed extensively in qualitative and quantitative terms, taking into consideration the country's specific political and economic circumstances. Although the obtained results are indicative in nature, the CII will probably find its place within the academic and policymaking arena and will hopefully contribute to a better understanding of lobbying regulations worldwide.
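To make the idea of a compliance cost index concrete, the hypothetical sketch below aggregates itemized cost indicators into a single weighted score. The indicator names, weights, and scores are invented; the actual CII methodology is the one defined in the research itself.

```python
# Minimal sketch of how a cost-type index like the CII could aggregate
# itemized compliance-cost indicators into a single country score.
# The indicator names, weights, and scores are hypothetical; the actual
# CII methodology is defined in the thesis itself.

indicators = {                     # score of each cost indicator, 0-100
    "registration burden": 60,
    "reporting frequency": 40,
    "sanctions exposure": 30,
}
weights = {
    "registration burden": 0.5,
    "reporting frequency": 0.3,
    "sanctions exposure": 0.2,
}

cii_score = sum(indicators[k] * weights[k] for k in indicators)
print(f"aggregate cost score: {cii_score:.1f}")   # 48.0 with these numbers
```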


Aerosol particles are important actors in the Earth's atmosphere and climate system. They scatter and absorb sunlight, serve as nuclei for water droplets and ice crystals in clouds and precipitation, and are a subject of concern for public health. Atmospheric aerosols originate from both natural and anthropogenic sources, and emissions resulting from human activities have the potential to influence the hydrological cycle and climate. An assessment of the extent and impacts of this human influence requires a sound understanding of the natural aerosol background. This dissertation addresses the composition, properties, and atmospheric cycling of biogenic aerosol particles, which represent a major fraction of the natural aerosol burden. The main focal points are: (i) studies of the autofluorescence of primary biological aerosol particles (PBAP) and its application in ambient measurements, and (ii) X-ray microscopic and spectroscopic investigations of biogenic secondary organic aerosols (SOA) from the Amazonian rainforest.

Autofluorescence of biological material has received increasing attention in atmospheric science because it allows real-time monitoring of PBAP in ambient air; however, it is associated with high uncertainty. This work aims at reducing that uncertainty through a comprehensive characterization of the autofluorescence properties of relevant biological materials. Fluorescence spectroscopy and microscopy were applied to analyze the fluorescence signatures of pure biological fluorophores, potential non-biological interferences, and various types of reference PBAP. Characteristic features and fingerprint patterns were found and provide support for the operation, interpretation, and further development of PBAP autofluorescence measurements. Online fluorescence detection and offline fluorescence microscopy were jointly applied in a comprehensive bioaerosol field measurement campaign that provided unprecedented insights into PBAP-linked biosphere-atmosphere interactions in a North American semi-arid forest environment. Rain showers were found to trigger massive bursts of PBAP, including high concentrations of biological ice nucleators that may promote further precipitation and can be regarded as part of a bioprecipitation feedback cycle in the climate system.

In the pristine tropical rainforest air of the Amazon, most cloud and fog droplets form on biogenic SOA particles, but the composition, morphology, mixing state and origin of these particles are hardly known. X-ray microscopy and spectroscopy (STXM-NEXAFS) revealed distinctly different types of secondary organic matter (carboxyl- vs. hydroxy-rich) with internal structures that indicate a strong influence of phase segregation and of cloud and fog processing on SOA formation and aging. In addition, nanometer-sized potassium-rich particles emitted by microorganisms and vegetation were found to act as seeds for the condensation of SOA. Thus, the influence of forest biota on the atmospheric abundance of cloud condensation nuclei appears to be more direct than previously assumed. Overall, the results of this dissertation suggest that biogenic aerosols, clouds and precipitation are indeed tightly coupled through a bioprecipitation cycle, and that advanced microscopic and spectroscopic techniques can provide detailed insights into these mechanisms.


The aim of this study was to investigate treatment failure (TF) in hospitalised community-acquired pneumonia (CAP) patients with regard to initial antibiotic treatment and economic impact. CAP patients were included in two open, prospective multicentre studies assessing the direct costs of in-patient treatment. Patients received either moxifloxacin (MFX) or a nonstandardised antibiotic therapy. Any change in antibiotic therapy after >72 h of treatment to a broadened antibiotic spectrum was considered TF. Overall, 1,236 patients (mean ± SD age 69.6 ± 16.8 yrs; 691 (55.9%) male) were included. TF occurred in 197 (15.9%) subjects and led to longer hospital stays (15.4 ± 7.3 days versus 9.8 ± 4.2 days; p < 0.001) and increased median treatment costs (€2,206 versus €1,284; p < 0.001). 596 (48.2%) patients received MFX and experienced less TF (10.9% versus 20.6%; p < 0.001). After controlling for confounders in multivariate analysis, the adjusted risk of TF was clearly reduced with MFX compared with β-lactam monotherapy (adjusted OR for MFX 0.43, 95% CI 0.27-0.68) and was comparable with a β-lactam plus macrolide combination (BLM) (OR 0.68, 95% CI 0.38-1.21). In hospitalised CAP, TF is frequent and leads to prolonged hospital stay and increased treatment costs. Initial treatment with MFX or BLM is a possible strategy to prevent TF and may thus reduce treatment costs.
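The crude odds ratio behind the reported effect can be reconstructed approximately from the abstract's own figures (596 MFX patients with 10.9% TF and, by implication, 640 comparator patients with 20.6% TF). The sketch below computes that unadjusted OR with a standard log-OR confidence interval; it does not reproduce the paper's multivariate adjustment, whose comparator was β-lactam monotherapy specifically.

```python
# Sketch of the crude (unadjusted) odds ratio behind the reported MFX effect,
# reconstructed from the abstract's counts: 596 MFX patients with 10.9% TF
# (~65 events) versus 640 comparator patients with 20.6% TF (~132 events).
# The adjusted OR of 0.43 in the paper comes from a multivariate model
# against beta-lactam monotherapy, which this sketch does not reproduce.
import math

a, b = 65, 596 - 65     # TF / no TF under moxifloxacin (approximate counts)
c, d = 132, 640 - 132   # TF / no TF under comparator therapy

or_ = (a / b) / (c / d)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)            # SE of the log odds ratio
lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
print(f"crude OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```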


With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to biopharmaceutical processes brings some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been designed especially for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked by importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings some difficulties, because biopharmaceutical processes differ from processes in the mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been designed especially for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, helping pharmaceutical companies to identify aspects with high potential risk and to react accordingly to improve the safety of medicines.
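As a concrete illustration of how such ratings can drive ranking, the sketch below assumes the conventional FMEA risk priority number, RPN = occurrence × severity × detectability, on the article's 1-to-10 scales. The process parameters and ratings are invented; the article's own rating table and aggregation rules should be consulted for real use.

```python
# Minimal sketch of ranking process parameters by FMEA risk, assuming the
# conventional risk priority number RPN = occurrence x severity x detectability
# on the 1-to-10 scales described in the article. Parameter names and ratings
# are invented for illustration.

failure_modes = [
    # (process parameter, occurrence, severity, detectability)
    ("bioreactor pH excursion",        3, 8, 4),
    ("column load density too high",   5, 6, 2),
    ("buffer conductivity off-spec",   2, 4, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, o, s, d in ranked:
    print(f"RPN {o * s * d:3d}  {name}")
```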


Objectives: To assess the proportion of patients lost to programme (died, lost to follow-up, transferred out) between HIV diagnosis and the start of antiretroviral therapy (ART) in sub-Saharan Africa, and to determine factors associated with loss to programme. Methods: Systematic review and meta-analysis. We searched the PubMed and EMBASE databases for studies in adults. Outcomes were the percentage of patients dying before starting ART, the percentage lost to follow-up, the percentage with a CD4 cell count, the distribution of first CD4 counts and the percentage of eligible patients starting ART. Data were combined using random-effects meta-analysis. Results: Twenty-nine studies from sub-Saharan Africa including 148,912 patients were analysed. Six studies covered the whole period from HIV diagnosis to ART start. Meta-analysis of these studies showed that of 100 patients with a positive HIV test, 72 (95% CI 60-84) had a CD4 cell count measured, 40 (95% CI 26-55) were eligible for ART and 25 (95% CI 13-37) started ART. There was substantial heterogeneity between studies (P < 0.0001). Median CD4 cell count at presentation ranged from 154 to 274 cells/μl. Patients eligible for ART were less likely to become lost to programme (25% vs. 54%, P < 0.0001), but eligible patients were more likely to die (11% vs. 5%, P < 0.0001) than ineligible patients. Loss to programme was higher in men, in patients with low CD4 cell counts and low socio-economic status, and in recent time periods. Conclusions: Monitoring and care in the pre-ART period need improvement, with greater emphasis on patients not yet eligible for ART.
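For readers unfamiliar with the pooling method named above, the sketch below implements standard DerSimonian-Laird random-effects pooling of proportions on invented study data; the review's actual data and software are not reproduced here.

```python
# Hedged sketch of random-effects pooling (DerSimonian-Laird), the method the
# review uses to combine percentages across studies. Study proportions and
# sizes below are hypothetical, not the review's data.
import math

# hypothetical per-study (proportion starting ART, sample size)
studies = [(0.13, 800), (0.25, 1200), (0.37, 500), (0.22, 2000)]

p = [pi for pi, n in studies]
v = [pi * (1 - pi) / n for pi, n in studies]       # variance of a proportion
w = [1 / vi for vi in v]                           # fixed-effect weights

p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(p) - 1)) / c)            # between-study variance

w_re = [1 / (vi + tau2) for vi in v]               # random-effects weights
p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"pooled proportion: {p_re:.2f} "
      f"(95% CI {p_re - 1.96 * se_re:.2f}-{p_re + 1.96 * se_re:.2f})")
```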


Background: Falls of elderly people may cause permanent disability or death. Particularly susceptible are elderly patients in rehabilitation hospitals. We systematically reviewed the literature to identify falls prediction tools available for assessing elderly inpatients in rehabilitation hospitals. Methods and Findings: We searched six electronic databases using comprehensive search strategies developed for each database. Estimates of sensitivity and specificity were plotted in ROC space graphs and pooled across studies. Our search identified three studies which assessed the prediction properties of falls prediction tools in a total of 754 elderly inpatients in rehabilitation hospitals. Only the STRATIFY tool was assessed in all three studies; the other identified tools (PJC-FRAT and DOWNTON) were each assessed by a single study. For a STRATIFY cut-score of two, pooled sensitivity was 73% (95% CI 63 to 81%) and pooled specificity was 42% (95% CI 34 to 51%). An indirect comparison of the tools across studies indicated that the DOWNTON tool has the highest sensitivity (92%), while the PJC-FRAT offers the best balance between sensitivity and specificity (73% and 75%, respectively). All studies presented major methodological limitations. Conclusions: We did not identify any tool which had an optimal balance between sensitivity and specificity, or which was clearly better than simple clinical judgment of the risk of falling. The limited number of identified studies, together with their major methodological limitations, impairs sound conclusions on the usefulness of falls risk prediction tools in geriatric rehabilitation hospitals.
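The sensitivity/specificity arithmetic underlying these comparisons is straightforward; the sketch below computes both from a hypothetical 2x2 table whose counts are chosen to land near the pooled STRATIFY estimates (73% and 42%).

```python
# Sketch of the sensitivity/specificity arithmetic behind tools like STRATIFY:
# given a 2x2 table of predicted risk (at a cut-score of two) against observed
# falls, compute both properties. Counts are hypothetical.

tp, fn = 44, 16    # fallers flagged / missed by the tool
fp, tn = 140, 100  # non-fallers flagged / correctly cleared

sensitivity = tp / (tp + fn)   # proportion of fallers the tool catches
specificity = tn / (tn + fp)   # proportion of non-fallers it clears
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```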


The main goal of this project was to propose appropriate methods of analysing the effects of the privatisation of state-owned enterprises; the methods were then tested on a limited sample of 16 Polish and 8 German enterprises privatised in 1992. A considerable amount of information was collected covering the six-year period 1989-1994 and relating to most aspects of the companies' activities. The effects of privatisation were taken to be those changes within the enterprises which were the result of privatisation, in such areas as production, the productivity of labour and fixed assets, investments and innovations, employment and wages, economic incentives (especially for top managers), financing (internal and external sources), bad debts and economic effects (financial analysis). A second important goal was to identify the main factors which represent methodological obstacles in surveys of the effects of privatisation during a period of fundamental transformation of the entire economic system. The list of enterprises for the research was compiled in such a way as to allow for the differentiation of the ownership structures of privatised firms and to permit (at least to a certain extent) the empirical verification of some hypotheses regarding the privatisation process. The enterprises selected were divided into the following three groups representing (as far as possible) various types of ownership structure or control: (1) enterprises controlled by strategic investors (domestic or foreign), (2) enterprises controlled by employees (employee-owned companies), and (3) enterprises controlled by managers. Formal methods, such as econometric models with varying parameters, were used to separate pure privatisation effects from other factors which influence various aspects of an enterprise's operation, including the productivity of labour and capital, average wages, and the remuneration of top managers. While the group admits that their findings and conclusions cannot be treated as representative of all privatised enterprises in Poland and Germany, they found considerable convergence between their findings and those of other surveys conducted on a wider scale. The main hypotheses confirmed included that privatisation (especially in companies controlled by large investors and managers) leads to a significant increase in the effectiveness of the production process, and to growing pay differentials both between different employee groups (e.g. between executives and rank-and-file employees) and between different jobs and positions within particular professional groups. They also confirmed the growing importance, among the incentives offered to top executives, of incentives linked to the company's economic results (particularly profit-related incentives), long-term incentives and the capital market.
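The identification problem named above (separating pure privatisation effects from background trends) can be illustrated, in heavily simplified form, by a regression with a post-privatisation dummy alongside a time trend. The data below are simulated, and the project's actual varying-parameter econometric models are considerably richer.

```python
# Very rough sketch of the identification idea: regress a performance measure
# on a time trend plus a post-privatisation dummy, so that the dummy separates
# the privatisation effect from the underlying trend. Data are simulated;
# the project's actual varying-parameter econometric models are richer.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1989, 1995)
post = (years >= 1992).astype(float)          # privatised in 1992
productivity = 100 + 2.0 * (years - 1989) + 8.0 * post + rng.normal(0, 1, 6)

X = np.column_stack([np.ones_like(years, dtype=float), years - 1989, post])
beta, *_ = np.linalg.lstsq(X, productivity, rcond=None)
print(f"estimated privatisation effect: {beta[2]:.1f} productivity points")
```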


The group presents an analysis of the development of Czech society and the economy during the 1990s. They believe that the Czech neo-liberal strategy of transformation led to a partial and uneven modernisation, and that this strategy is unable to provide a firm basis for a complex process of modernisation. The increasing developmental problems encountered during 1996-1999 can be seen as empirical evidence of the inadequacy of the neo-liberal transformation strategy. These problems are connected to institutional shortcomings arising from the excessive speed of privatisation, from its particular form, with certain important Czech innovations (particularly the voucher method and an attempt to resuscitate Czech national capital), and from the neglect of the legal framework and its enforcement. The overly hasty privatisation created a type of 'recombinant property' which lacks the economic order necessary to stimulate efficiency in an atmosphere of prevailing social justice. A second reason for the present difficulties is the long-term lag behind the civilisational and cultural standards typical of the advanced European countries. The first steps of the Czech transformation concentrated mainly on changes in the institutions important for the distribution of power and wealth, and largely neglected the necessity of deep-reaching modernisation of Czech society and the economy. The neo-liberal strategy created conditions conducive to predatory and speculative behaviour at the expense of creative behaviour. Inherited principles of egalitarianism, combined with undeserved economic privileges, survived and were reinforced by important new developments in the same direction. This situation hinders the assertion of meritocratic motivations. The group advocates the development and implementation of a complex strategy of modernisation based on deliberate reforms, institutional changes and restructuring on the basis of strategic planning, and structural and regional policies which stress the cultivation of the institutional order and of the most important factors of economic growth and development.


BACKGROUND: Trauma care is expensive. However, reliable data on the exact lifelong costs incurred by a major trauma patient are lacking. Discussion usually focuses on direct medical costs, underestimating consequential costs resulting from absence from work and permanent disability. METHODS: Direct medical costs and consequential costs of 63 major trauma survivors (ISS >13) at a Swiss trauma center from 1995 to 1996 were assessed 5 years posttrauma. The following cost evaluation methods were used: the correction cost method (direct cost of restoring an original state), the human capital method (indirect cost of lost productivity), the contingent valuation method (human cost as the lost quality of life), and macroeconomic estimates. RESULTS: Mean ISS (Injury Severity Score) was 26.8 +/- 9.5 (mean +/- SD). In all, 22 patients (35%) were disabled, incurring discounted average lifelong total costs of USD 1,293,800, compared with 41 patients (65%) who recovered without any disabilities and incurred costs of USD 147,200 (average of both groups: USD 547,800). Two thirds of these costs were attributable to loss of production, whereas only one third resulted from correction costs. Primary hospital treatment (USD 27,800 +/- 37,800) was only a minor fraction of the total cost, less than the estimated cost of police and the judiciary. Loss of quality of life led to considerable intangible human costs, similar in magnitude to the real costs. CONCLUSIONS: Trauma costs are commonly underestimated. Direct medical costs make up only a small part of the total costs. Consequential costs, such as lost productivity, are well in excess of the usual medical costs. Mere cost averages give a false estimate of the costs incurred by patients with and without disabilities.
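The reported group averages can be checked directly: weighting the two groups' mean lifelong costs by their shares of the 63 patients reproduces the quoted overall average of roughly USD 547,800 (up to rounding in the source figures).

```python
# Arithmetic check of the reported averages: weighting the two groups' mean
# lifelong costs by their share of the 63 patients reproduces the overall
# average of roughly USD 547,800 quoted in the abstract.

n_disabled, cost_disabled = 22, 1_293_800   # 35% of patients
n_recovered, cost_recovered = 41, 147_200   # 65% of patients
n_total = n_disabled + n_recovered

mean_cost = (n_disabled * cost_disabled + n_recovered * cost_recovered) / n_total
print(f"weighted mean lifelong cost: USD {mean_cost:,.0f}")  # ~547,600
```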


ABSTRACT: Nanotechnology in its widest sense seeks to exploit the special biophysical and chemical properties of materials at the nanoscale. While the potential technological, diagnostic or therapeutic applications are promising, there is a growing body of evidence that the special technological features of nanoparticulate material are associated with biological effects formerly not attributed to the same materials at a larger particle scale. Therefore, studies that address the potential hazards of nanoparticles to biological systems, including human health, are required. Due to its large surface area, the lung is one of the major sites of interaction with inhaled nanoparticles. One of the great challenges of studying particle-lung interactions is the microscopic visualization of nanoparticles within tissues or single cells, both in vivo and in vitro. Once a certain type of nanoparticle can be identified unambiguously using microscopic methods, it is desirable to quantify the particle distribution within a cell, an organ or the whole organism. Transmission electron microscopy provides an ideal tool to perform qualitative and quantitative analyses of particle-related structural changes of the respiratory tract, to reveal the localization of nanoparticles within tissues and cells, and to investigate the 3D nature of nanoparticle-lung interactions. This article provides information on the applicability, advantages and disadvantages of electron microscopic preparation techniques and several advanced transmission electron microscopic methods, including conventional, immuno- and energy-filtered electron microscopy as well as electron tomography, for the visualization of both model nanoparticles (e.g. polystyrene) and technologically relevant nanoparticles (e.g. titanium dioxide). Furthermore, we highlight possibilities to combine light and electron microscopic techniques in a correlative approach. Finally, we demonstrate a formal quantitative, i.e. stereological, approach to analyzing the distributions of nanoparticles in tissues and cells. This comprehensive article aims to provide a basis for scientists in nanoparticle research to integrate electron microscopic analyses into their study design and to select the appropriate microscopic strategy.
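One common stereological measure for such distributions compares the observed fraction of particles in each tissue compartment with the fraction expected from that compartment's volume, yielding a relative deposition index (RDI). The sketch below illustrates the computation with invented counts and compartment names; whether this exact estimator matches the article's procedure is an assumption.

```python
# Hedged sketch of the stereological logic for particle distributions: compare
# the observed fraction of particles in each tissue compartment with the
# fraction expected from the compartment's volume, giving a relative
# deposition index (RDI > 1 means preferential accumulation). Counts and
# compartment names are hypothetical.

observed = {"epithelium": 120, "connective tissue": 40, "capillary lumen": 40}
volume_fraction = {"epithelium": 0.40, "connective tissue": 0.35,
                   "capillary lumen": 0.25}

total = sum(observed.values())
for comp, count in observed.items():
    expected = total * volume_fraction[comp]
    rdi = count / expected
    print(f"{comp:18s} observed {count:3d}  expected {expected:5.1f}  RDI {rdi:.2f}")
```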


As an experimental flow measurement technique in fluid dynamics, particle image velocimetry (PIV) provides significant advantages over other measurement techniques in its field. In contrast to temperature- and pressure-based probe measurements or other laser diagnostic techniques, including laser Doppler velocimetry (LDV) and phase Doppler particle analysis (PDPA), PIV is unique in its whole-field measurement capability, non-intrusive nature, and ability to collect a vast amount of experimental data in a short time frame, providing both quantitative and qualitative insight. These properties make PIV a desirable measurement technique for studies encompassing a broad range of fluid dynamics applications. However, as an optical measurement technique, PIV also requires substantial technical understanding and application experience to acquire consistent, reliable results. Both a technical understanding of particle image velocimetry and practical application experience were gained by applying a planar PIV system at Michigan Technological University's Combustion Science Exploration Laboratory (CSEL) and Alternative Fuels Combustion Laboratory (AFCL). Here, a PIV system was applied to non-reacting and reacting gaseous environments to make two-component planar PIV as well as three-component stereoscopic PIV flow field velocity measurements, in conjunction with chemiluminescence imaging in the case of reacting flows. This thesis outlines near-surface flow field characteristics in a channel lined with tumble strips, three-component velocity profiles of non-reacting and reacting swirled flow in a swirl-stabilized, lean, premixed/prevaporized-fuel model gas turbine combustor operating on methane at 5-7 kW, and two-component planar PIV measurements characterizing the AFCL's 1.1 liter closed combustion chamber under dual-fan-driven turbulent mixing flow.
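At the core of every planar PIV evaluation is a cross-correlation of interrogation windows from two successive frames; the displacement of the correlation peak gives the local velocity once divided by the inter-frame time. The sketch below demonstrates the FFT-based version on a synthetic image pair with a known shift; production PIV codes add windowing, sub-pixel peak fitting, and outlier validation.

```python
# Minimal sketch of the core PIV computation: the displacement of particles
# between two interrogation windows is found at the peak of their FFT-based
# cross-correlation. Real PIV adds windowing, sub-pixel peak fitting, and
# outlier validation; images here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
frame_a = rng.random((32, 32))                           # synthetic particle pattern
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))   # known displacement

# Cross-correlation via FFT (correlation theorem).
corr = np.fft.ifft2(np.fft.fft2(frame_a).conj() * np.fft.fft2(frame_b)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
# Map peak indices to signed shifts in [-16, 16).
dy, dx = [(d + 16) % 32 - 16 for d in (dy, dx)]
print(f"recovered displacement: dy={dy}, dx={dx}")  # expect dy=3, dx=-2
```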


This Ph.D. research comprises three major components: (i) a characterization study to analyze the composition of defatted corn syrup (DCS) from a dry corn mill facility, (ii) hydrolysis experiments to optimize the production of fermentable sugars and an amino acid platform from DCS, and (iii) sustainability analyses. Analyses of DCS included total solids, ash content, total protein, amino acids, inorganic elements, starch, total carbohydrates, lignin, organic acids, glycerol, and the presence of functional groups. Total solids content was 37.4% (± 0.4%) by weight, and the mass balance closure was 101%. Total carbohydrates [27% (± 5%) wt.] comprised starch (5.6%), soluble monomer carbohydrates (12%) and non-starch carbohydrates (10%). Hemicellulose components (structural and non-structural) were xylan (6%), xylose (1%), mannan (1%), mannose (0.4%), arabinan (1%), arabinose (0.4%), galactan (3%) and galactose (0.4%). Based on the measured physical and chemical components, a biochemical conversion route with subsequent fermentation to value-added products was identified as promising. DCS has the potential to serve as an important fermentation feedstock for bio-based chemicals production. In the sugar hydrolysis experiments, reaction parameters such as acid concentration and retention time were analyzed to determine the optimal conditions that maximize monomer sugar yields while keeping inhibitors to a minimum. Total fermentable sugars produced can reach approximately 86% of the theoretical yield with dilute acid pretreatment (DAP). DAP followed by subsequent enzymatic hydrolysis was most effective for the 0 wt% acid hydrolysate samples and least efficient for the 1 and 2 wt% acid hydrolysate samples. From an industry point of view, the best hydrolysis scheme for DCS is a standalone 60-minute dilute acid hydrolysis at 2 wt% acid concentration. The combined effects of hydrolysis reaction time, temperature, and enzyme-to-substrate ratio were studied to develop a hydrolysis process that optimizes the production of amino acids from DCS. Four key hydrolysis pathways were investigated for the production of amino acids from DCS. The first pathway is amino acid analysis using DAP. The second pathway is DAP of DCS followed by protein hydrolysis using proteases [Trypsin, Pronase E (Streptomyces griseus) and Protex 6L]. The third pathway investigated standalone protease treatment (Trypsin, Pronase E, Protex 6L, and Alcalase) of DCS without any pretreatment. The final pathway investigated the use of Accellerase 1500® and Protex 6L to simultaneously produce fermentable sugars and amino acids over a 24-hour hydrolysis reaction time. The three key objectives of the techno-economic analysis component of this Ph.D. research were: (i) development of a process design for the production of both the sugar and amino acid platforms via DAP of DCS, (ii) a preliminary cost analysis to estimate the initial capital cost and operating cost of this facility, and (iii) a greenhouse gas analysis to understand the environmental impact of this facility. A conceptual process design was constructed using Aspen Plus®. Both the Aspen Plus Economic Analyzer® and SimaPro® software were then employed to conduct the cost analysis and the carbon footprint analysis of this process facility, respectively. Another section of this Ph.D. research focused on the life cycle assessment (LCA) of commonly used dairy feeds in the U.S. A greenhouse gas (GHG) emissions analysis was conducted for the cultivation, harvesting, and production of common dairy feeds used for the production of dairy milk in the U.S. The goal was to determine the carbon footprint [grams CO2 equivalents (gCO2e)/kg of dry feed] on a regional basis in the U.S., identify key inputs, and make recommendations for emissions reduction. The final section of this Ph.D. research was an LCA of a single dairy feed mill located in Michigan, USA. The primary goal was to conduct a preliminary assessment of dairy feed mill operations and ultimately determine the GHG emissions per kilogram of milled dairy feed.
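As a quick consistency check on the reported composition, the carbohydrate sub-fractions quoted in the abstract can be summed and compared against the stated total:

```python
# Arithmetic check on the reported carbohydrate breakdown: the starch,
# soluble monomer, and non-starch fractions quoted in the abstract should
# recombine to roughly the 27 wt% total carbohydrates reported for DCS.

fractions = {"starch": 5.6, "soluble monomer carbohydrates": 12.0,
             "non-starch carbohydrates": 10.0}

subtotal = sum(fractions.values())
print(f"carbohydrate subtotal: {subtotal:.1f} wt% (reported total: 27 +/- 5 wt%)")
```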