920 results for Sensitivity-analysis


Relevance:

70.00%

Publisher:

Abstract:

A cost-benefit analysis of the treatment of bovine subclinical mastitis caused by S. aureus was carried out. Two hundred and seventy udder quarters, either with subclinical mastitis or healthy, were assigned to four groups according to lactational stage and whether or not they were treated. Group 1 comprised treated animals 10 to 60 days in milk; group 2, treated animals from 61 days in milk until two months before the end of lactation; group 3, untreated animals 10 to 60 days in milk; group 4, untreated animals from 61 days in milk until two months before the end of lactation. Treatment consisted of intramammary doses of gentamicin (150 mg), once a day, after sensitivity testing. The mammary quarters were re-evaluated after 30 days. Treatment costs were calculated considering an S. aureus prevalence of 5%, antibiotic expenses, milk losses, sensitivity tests and labour. There was a loss of income of 2% and 14% in groups 1 and 2, respectively, compared with the income before treatment. Therefore, the treatment of bovine subclinical mastitis caused by S. aureus during lactation was not economically practicable.
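The herd-level bookkeeping behind such a cost-benefit conclusion can be sketched numerically. All monetary values below are invented placeholders (only the 5% prevalence comes from the study), so this illustrates the structure of the calculation, not the study's actual figures:

```python
# Illustrative herd-level cost-benefit sketch for treating subclinical
# mastitis. All monetary values are invented; only the 5% S. aureus
# prevalence is taken from the study.
herd_size = 100
prevalence = 0.05               # S. aureus subclinical mastitis (from the study)
treated = herd_size * prevalence

cost_per_cow = {
    "antibiotic": 12.0,         # assumed cost of intramammary gentamicin
    "sensitivity_test": 8.0,    # assumed
    "labor": 5.0,               # assumed
    "discarded_milk": 30.0,     # withdrawal-period milk loss, assumed
}
treatment_cost = treated * sum(cost_per_cow.values())

milk_gain_per_cow = 40.0        # assumed value of recovered production
benefit = treated * milk_gain_per_cow

net = benefit - treatment_cost
print(f"cost={treatment_cost:.0f}, benefit={benefit:.0f}, net={net:.0f}")
# A negative net mirrors the study's conclusion that lactational
# treatment was not economically practicable.
```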

Relevance:

70.00%

Publisher:

Abstract:

It is well established that nitrate is a potent inhibitor of nodulation and nitrogen fixation in legumes. The objective of this study was to demonstrate the relative insensitivity of these processes to nitrate in Calopogonium mucunoides, a perennial tropical South American legume native to the cerrado (savannah) region. Nodule number was reduced by about half in the presence of high levels of nitrate (15 mM), but nodule growth (total nodule mass per plant) and nitrogen fixation (acetylene reduction activity and xylem sap ureide levels) were not affected. Other sources of N (ammonium and urea) were likewise without effect at these concentrations. At even higher concentrations (30 mM), nitrate did cause significant inhibition (ca. 50%) of acetylene reduction activity, but no significant reduction in xylem sap ureides was found. The extraordinary insensitivity of nodulation and N2 fixation to nitrate in C. mucunoides suggests that this species should be useful in studies aimed at elucidating the mechanisms of nitrate inhibition of these processes. © 2010 Springer Science+Business Media B.V.

Relevance:

70.00%

Publisher:

Abstract:

This paper describes a simple, environmentally friendly and rapid quantitative spot-test procedure for the determination of captopril (CPT) in bulk drug and in pharmaceutical formulations using diffuse reflectance spectroscopy. The proposed method is based on reflectance measurements of the orange compound (λmax = 490 nm) produced by the spot-test reaction between CPT and p-chloranil (CL). Under optimal conditions, calibration curves were obtained for CPT by plotting the optical density of the reflectance signal (AR) against the log of the molar concentration, from 6.91×10⁻³ to 1.17×10⁻¹ mol L⁻¹, with a good coefficient of determination (R² = 0.9992). The common excipients used as additives in pharmaceuticals do not interfere with the proposed method. The method was applied to determine CPT in commercial pharmaceutical formulations. The results obtained by the proposed method compared favorably with those obtained by an official procedure at the 95% confidence level. The method validation results showed that the sensitivity and selectivity of the method were adequate for drug monitoring in industrial quality control laboratories. © 2011 Moment Publication.
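The calibration step, fitting the reflectance optical density against the log of molar concentration and reporting a coefficient of determination, can be sketched on synthetic data. The slope, intercept and noise level below are assumed, not the study's values; only the concentration range comes from the abstract:

```python
import numpy as np

# Illustrative calibration: A_R vs log10(concentration / mol L^-1)
# over the reported range 6.91e-3 to 1.17e-1 mol L^-1.
conc = np.logspace(np.log10(6.91e-3), np.log10(1.17e-1), 8)
log_c = np.log10(conc)

# Hypothetical linear response with small noise (slope/intercept assumed).
rng = np.random.default_rng(0)
a_r = 0.65 * log_c + 1.8 + rng.normal(0, 0.005, log_c.size)

# Least-squares fit and coefficient of determination R^2.
slope, intercept = np.polyfit(log_c, a_r, 1)
pred = slope * log_c + intercept
r2 = 1 - np.sum((a_r - pred) ** 2) / np.sum((a_r - np.mean(a_r)) ** 2)
print(f"slope={slope:.3f}, intercept={intercept:.3f}, R^2={r2:.4f}")
```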

Relevance:

70.00%

Publisher:

Abstract:

Computational fluid dynamics (CFD) is becoming an essential tool for predicting the hydrodynamic loads and flow characteristics of underwater vehicles in manoeuvring studies. However, when applied to the manoeuvrability of autonomous underwater vehicles (AUVs), most studies have focused on the determination of static coefficients without considering the effects of control surface deflection. This paper analyses the hydrodynamic loads generated on an AUV considering the combined effects of control surface deflection and angle of attack, using CFD software based on the Reynolds-averaged Navier–Stokes formulation. CFD simulations are also conducted independently for the AUV bare hull and for the control surface, to better identify their individual and interference loads and to validate the simulations against experimental results obtained in a towing tank. Several simulations of the bare-hull case were conducted to select the k–ω SST turbulence model with the viscosity approach that best predicts its hydrodynamic loads. Mesh sensitivity analyses were conducted for all simulations. For the flow around the control surfaces, the CFD results were analysed using two methodologies, standard and nonlinear. The nonlinear regression methodology predicts stall at the control surface better than the standard methodology. The flow simulations showed that the occurrence of control surface stall depends on a linear relationship between the angle of attack and the control surface deflection. This type of information can be used in designing the vehicle's autopilot system.
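The linear stall boundary in the (angle of attack, deflection) plane can be illustrated with a toy check an autopilot might perform; the stall threshold and the deflection weighting below are invented coefficients, not the paper's fitted values:

```python
# Toy illustration of a linear stall boundary in (angle of attack,
# control surface deflection): stall occurs when the effective angle
# exceeds a threshold. Both coefficients are invented for illustration.
alpha_stall_deg = 15.0   # assumed stall threshold for the fin section
k = 0.8                  # assumed weighting of deflection vs. angle of attack

def fin_stalls(alpha_deg, delta_deg):
    # Linear combination of angle of attack and deflection vs. threshold.
    return alpha_deg + k * delta_deg > alpha_stall_deg

for alpha, delta in [(0, 10), (10, 5), (10, 10), (5, 15)]:
    print(alpha, delta, fin_stalls(alpha, delta))
```

An autopilot can use such a boundary to limit commanded fin deflection as the vehicle's angle of attack grows.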

Relevance:

70.00%

Publisher:

Abstract:

Dengue is considered one of the most important vector-borne infections, affecting almost half of the world's population with 50 to 100 million cases every year. In this paper, we present one of the simplest models that can encapsulate all the important variables related to vector control of dengue fever. The model considers the human population, the adult mosquito population and the population of immature stages, which includes eggs, larvae and pupae. It also accounts for vertical transmission of dengue in the mosquitoes and for seasonal variation in the mosquito population. From this basic model of dengue infection dynamics, we deduce thresholds for avoiding the introduction of the disease and for its elimination. In particular, we deduce a Basic Reproduction Number for dengue that includes parameters related to the immature stages of the mosquito. Neglecting seasonal variation, we calculate the equilibrium values of the model's variables. We also present a sensitivity analysis of the impact of four vector-control strategies on the Basic Reproduction Number, on the Force of Infection and on the human prevalence of dengue. Each strategy was studied separately from the others. The analysis allows us to conclude that, of the available vector-control strategies, adulticide application is the most effective, followed by reduction of exposure to mosquito bites, locating and destroying breeding places and, finally, larvicides. Current vector-control methods concentrate on the mechanical destruction of mosquito breeding places. Our results suggest that reducing the contact between vectors and hosts (the biting rate) is as efficient as adult mosquito control, which is very effective but logistically difficult.
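The sensitivity of a Basic Reproduction Number to control parameters can be sketched with normalized sensitivities (elasticities) on a generic Ross-Macdonald-style R0. This is not the paper's model, whose R0 additionally includes immature-stage parameters, and all parameter values below are illustrative:

```python
# Generic Ross-Macdonald-style R0 for a vector-borne disease; the paper's
# R0 also includes immature mosquito stages. All values are illustrative.
def r0(a, b, c, m, mu, gamma):
    # a: biting rate, b/c: transmission probabilities, m: mosquitoes per
    # human, mu: adult mosquito mortality, gamma: human recovery rate
    return (m * a**2 * b * c) / (mu * gamma)

base = dict(a=0.5, b=0.4, c=0.4, m=2.0, mu=0.1, gamma=0.14)

# Normalized sensitivity (elasticity): (p / R0) * dR0/dp, central difference.
def elasticity(param):
    h = 1e-6 * base[param]
    up, dn = dict(base), dict(base)
    up[param] += h
    dn[param] -= h
    return base[param] * (r0(**up) - r0(**dn)) / (2 * h * r0(**base))

for p in base:
    print(p, round(elasticity(p), 2))
# The biting rate enters squared (elasticity 2), so per unit relative
# change it is twice as influential as mosquito density m (elasticity 1);
# adulticides act through mu (elasticity -1) and, in practice, through m.
```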

Relevance:

70.00%

Publisher:

Abstract:

This work focuses on saltwater intrusion in coastal aquifers and, in particular, on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both strongly aleatory, that should be considered for optimal management of the territory and of water resources. Given the uncertainty of the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique based on Polynomial Chaos Expansion, which provides an accurate description of the model response without a large computational burden. When the assumptions of classical analytical models are not met, as often happens in real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
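The global sensitivity analysis step can be sketched with plain Monte Carlo first-order Sobol indices on a toy sharp-interface response. The paper instead builds a Polynomial Chaos Expansion surrogate; the response function and parameter ranges below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20000

# Toy sharp-interface response: a proxy for the saltwater toe position as
# a function of hydraulic conductivity K, net recharge W and pumping Q.
# (Illustrative surrogate, not the paper's model or its PCE surrogate.)
def toe(K, W, Q):
    return Q / (K * W + Q)

def sample(n):
    K = rng.uniform(5.0, 50.0, n)    # m/d, assumed range
    W = rng.uniform(1e-4, 1e-3, n)   # m/d, assumed range
    Q = rng.uniform(1e-3, 1e-2, n)   # m^2/d per unit coastline, assumed
    return np.column_stack([K, W, Q])

A, B = sample(N), sample(N)
fA = toe(*A.T)
var = fA.var()

# First-order Sobol index S_i via the pick-freeze estimator: fix column i
# from A, take all other inputs from B.
S = {}
for i, name in enumerate(["K", "W", "Q"]):
    AB = B.copy()
    AB[:, i] = A[:, i]
    S[name] = np.mean(fA * (toe(*AB.T) - toe(*B.T))) / var
    print(name, round(S[name], 2))
```

A PCE surrogate yields the same indices analytically from the expansion coefficients, avoiding the large Monte Carlo sample used here.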

Relevance:

70.00%

Publisher:

Abstract:

Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it only requires a subset of the working models to be correct for consistency; furthermore, the analyst need not know which subset of working models is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.

Relevance:

70.00%

Publisher:

Abstract:

As an initial step in establishing mechanistic relationships between environmental variability and recruitment in Atlantic cod Gadus morhua along the coast of the western Gulf of Maine, we assessed transport success of larvae from major spawning grounds to nursery areas with particle tracking using the unstructured grid model FVCOM (finite volume coastal ocean model). In coastal areas, the dispersal of the early planktonic life stages of fish and invertebrate species is highly dependent on the regional dynamics and its variability, which has to be captured by our models. With state-of-the-art forcing for the year 1995, we evaluated the sensitivity of particle dispersal to the timing and location of spawning, the spatial and temporal resolution of the model, and the vertical mixing scheme. A 3 d frequency for the release of particles is necessary to integrate the effect of circulation variability into an averaged dispersal pattern of the spawning season. The analysis of sensitivity to model setup showed that a higher resolution mesh, tidal forcing, and current variability do not change the general pattern of connectivity, but do tend to increase within-site retention. Our results indicate strong downstream connectivity among spawning grounds and higher chances of successful transport from spawning areas closer to the coast. The model run for January egg release indicates 1 to 19% within-spawning-ground retention of initial particles, which may be sufficient to sustain local populations. A systematic sensitivity analysis still needs to be conducted to determine the minimum mesh and forcing resolution that adequately resolves the complex dynamics of the western Gulf of Maine. Other sources of variability, e.g. large-scale upstream forcing and the biological environment, also need to be considered in future studies of the interannual variability in the transport and survival of the early life stages of cod.

Relevance:

70.00%

Publisher:

Abstract:

AIMS Metformin use has been associated with a decreased risk of some cancers, although data on head and neck cancer (HNC) are scarce. We explored the relation between the use of antidiabetic drugs and the risk of HNC. METHODS We conducted a case-control analysis in the UK-based Clinical Practice Research Datalink (CPRD) of people below the age of 90 years with incident HNC between 1995 and 2013. Six controls per case were matched on age, sex, calendar time, general practice and number of years of active history in the CPRD prior to the index date. Other potential confounders, including body mass index (BMI), smoking, alcohol consumption and comorbidities, were also evaluated. The final analyses were adjusted for BMI, smoking and diabetes mellitus (or diabetes duration in a sensitivity analysis). Results are presented as odds ratios (ORs) with 95% confidence intervals (CIs). RESULTS Use of metformin was not associated with a statistically significantly altered risk of HNC overall (1-29 prescriptions: adjusted OR 0.87, 95% CI 0.61-1.24; ≥30 prescriptions: adjusted OR 0.80, 95% CI 0.53-1.22), nor was long-term use of sulphonylureas (adjusted OR 0.87, 95% CI 0.59-1.30) or any insulin use (adjusted OR 0.92, 95% CI 0.63-1.35). However, we found a (statistically non-significant) decreased risk of laryngeal cancer associated with long-term metformin use (adjusted OR 0.41, 95% CI 0.17-1.03). CONCLUSIONS In this population-based study, the use of antidiabetic drugs was not associated with a materially altered risk of HNC. Our data suggest a protective effect of long-term metformin use against laryngeal cancer.
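For reference, the form of an odds ratio with a 95% Wald confidence interval can be shown for an unadjusted 2×2 table. The counts below are invented for illustration; the study's estimates were adjusted and based on matched case-control sets, which require conditional methods:

```python
import math

# Unadjusted odds ratio with 95% Wald CI from a 2x2 table.
# Counts are purely illustrative, not from the CPRD study.
exposed_cases, unexposed_cases = 40, 200          # metformin use among cases
exposed_controls, unexposed_controls = 275, 1200  # among controls

or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Standard error of log(OR) via the sum of reciprocal cell counts.
se_log_or = math.sqrt(1/exposed_cases + 1/unexposed_cases
                      + 1/exposed_controls + 1/unexposed_controls)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```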

Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND Endoscopic carpal tunnel release (ECTR) is a minimally invasive approach for the treatment of carpal tunnel syndrome. There is scepticism regarding the safety of this technique, based on the assumption that it is a rather "blind" procedure and on the high number of severe complications that have been reported in the literature. PURPOSE To evaluate whether there is evidence supporting a higher risk after ECTR in comparison to conventional open carpal tunnel release (OCTR). METHODS We searched MEDLINE (January 1966 to November 2013), EMBASE (January 1980 to November 2013), the Cochrane Neuromuscular Disease Group Specialized Register (November 2013) and CENTRAL (2013, issue 11, in The Cochrane Library). We hand-searched the reference lists of included studies. We included all randomized or quasi-randomized controlled trials (e.g. studies using alternation, date of birth, or case record number) that compare any ECTR with any OCTR technique. Safety was assessed by the incidence of major and minor complications, the total number of complications, recurrences, and re-operations. The total time needed before return to work or to daily activities was also assessed. We synthesized data using a random-effects meta-analysis in STATA. We conducted a sensitivity analysis for rare events using a binomial likelihood. We judged the conclusiveness of the meta-analysis by calculating its conditional power. CONCLUSIONS ECTR is associated with less time off work or away from daily activities. The assessment of major complications, reoperations and recurrence of symptoms does not favor either intervention. There is an uncertain advantage of ECTR with respect to total minor complications (more transient paresthesia but fewer skin-related complications). Future studies are unlikely to alter these findings because of the rarity of the outcome. A learning-curve effect might be responsible for the reduced recurrences and reoperations with ECTR in more recent studies, although formal statistical analysis failed to provide evidence for such an association. LEVEL OF EVIDENCE I.
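The random-effects pooling step can be sketched with the DerSimonian-Laird method on invented study data; the review used STATA and a binomial likelihood for rare events, so this is only the basic mechanics, and all effect sizes and variances below are illustrative:

```python
import math

# DerSimonian-Laird random-effects pooling of study log odds ratios.
# Effect sizes and within-study variances are invented for illustration.
y = [math.log(0.5), math.log(1.4), math.log(0.6), math.log(1.2)]
v = [0.05, 0.08, 0.12, 0.04]

# Fixed-effect (inverse-variance) pooled estimate and Cochran's Q.
w = [1/vi for vi in v]
fixed = sum(wi*yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi*(yi - fixed)**2 for wi, yi in zip(w, y))

# Method-of-moments between-study variance tau^2.
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / c)

# Random-effects weights and pooled estimate.
w_re = [1/(vi + tau2) for vi in v]
pooled = sum(wi*yi for wi, yi in zip(w_re, y)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled OR={math.exp(pooled):.2f}, "
      f"95% CI {math.exp(pooled - 1.96*se):.2f}-{math.exp(pooled + 1.96*se):.2f}, "
      f"tau^2={tau2:.3f}")
```

With heterogeneous studies, tau² > 0 widens the confidence interval relative to a fixed-effect analysis.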

Relevance:

70.00%

Publisher:

Abstract:

The Advisory Committee on Immunization Practices (ACIP) develops written recommendations for the routine administration of vaccines to children and adults in the U.S. civilian population, and it is the only entity in the federal government that makes such recommendations. The ACIP elaborates on the selection of its members and rules out concerns regarding its integrity, but fails to provide information about the importance of economic analysis in vaccine selection. ACIP recommendations can have large health and economic consequences, and the emphasis on economic evaluation in health is a likely response to severe pressures on federal and state health budgets. This study describes the economic aspects considered by the ACIP when sanctioning a vaccine, and reviews the economic evaluations (our economic data) provided for vaccine deliberations. A five-year study period from 2004 to 2009 was adopted, using publicly available data from the ACIP web database. The checklist of Drummond et al. (2005) served as a guide to assess the quality of the economic evaluations presented. Because this checklist is comprehensive, it is unrealistic to expect every ACIP deliberation to meet all of its criteria; for practical purposes we selected seven criteria from Drummond et al. that we judged most significant. Twenty-four data points (economic evaluations) were obtained over the five-year period. Of these twenty-four, only five received a score of six, that is, six of the seven items on the list were met. None received a perfect score of seven. Seven of the twenty-four data points received a score of five, and only one received the minimum score of two. The type of economic evaluation, the model criterion and the ICER/QALY criterion were each met at a rate of 0.875 (87.5%), the highest rates among the seven criteria studied. The perspective criterion was met at 0.583 (58.3%), followed by the source and sensitivity analysis criteria, both at 0.541 (54.1%); the discount factor criterion was met at 0.250 (25.0%). Economic analysis is not a novel concept for the ACIP: it has been practiced and presented at these meetings on a regular basis for more than five years. The ACIP's stated goal is to utilize good-quality epidemiologic, clinical and economic analyses to help policy makers choose among the alternatives presented and thus reach better-informed decisions. As seen in our study, the economic analyses over the years are inconsistent. The large variability, coupled with the lack of a standardized format, may compromise the utility of the economic information for decision-making. While making recommendations, the ACIP takes into account all available information about a vaccine; it is therefore vital that standardized, high-quality economic information be provided at ACIP meetings. Our study may serve as a call for the ACIP to further investigate deficiencies within the system and thereby to improve the economic evaluation data presented.

Relevance:

70.00%

Publisher:

Abstract:

A sensitivity analysis of the multiplication factor, keff, to the cross section data has been carried out for the MYRRHA critical configuration in order to identify the most relevant reactions. With these results, a further analysis of the 238Pu and 56Fe cross sections has been performed, comparing the evaluations provided for these nuclides in the JEFF-3.1.2 and ENDF/B-VII.1 libraries. The effect in MYRRHA of the differences between evaluations is then analysed, and the sources of the differences are presented. On the basis of these results, recommendations for the 56Fe and 238Pu evaluations are suggested. The calculations were performed with SCALE6.1 and MCNPX-2.7e.

Relevance:

70.00%

Publisher:

Abstract:

In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the building code in our country (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geográfico Nacional (IGN) and the Technical University of Madrid (UPM), the different phases of the work being supervised by a committee of national experts from public institutions involved in the subject of seismic hazard. The PSHA method (Probabilistic Seismic Hazard Assessment) has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) a project catalogue updated and homogenized to Mw; 2) proposed zoning models and source characterization; 3) Ground Motion Prediction Equations (GMPEs) calibrated with actual data, and a local model developed with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on the hazard results was carried out in order to establish criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes to quantify the uncertainties corresponding to: 1) the hazard estimation method (zoning and zoneless); 2) the zoning models; 3) the GMPE combinations used; and 4) the regression method for estimating source parameters. In addition, the aleatory uncertainties in the magnitude of the events, the recurrence parameters and the maximum magnitude for each zone were also considered through probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975 and 2475 years. Maps of the coefficient of variation (COV) are also presented to indicate the zones where the dispersion among results is highest and the zones where the results are robust.
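The combination of weighted logic-tree branches (epistemic uncertainty) with Monte Carlo sampling of ground-motion variability (aleatory uncertainty), summarized by a mean and a COV, can be sketched as follows. Branch weights, median PGAs and sigma values are invented placeholders, and the hazard integration over sources is omitted entirely:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy logic-tree / Monte Carlo sketch: each branch is a (weight, median
# PGA in g, lognormal sigma) triple standing in for a GMPE/zonation
# combination. All values are illustrative, not from the Spanish study.
branches = [
    (0.3, 0.12, 0.50),
    (0.2, 0.10, 0.55),
    (0.3, 0.15, 0.50),
    (0.2, 0.13, 0.60),
]
N = 5000

# Sample each branch in proportion to its weight, drawing aleatory
# ground-motion variability from the branch's lognormal distribution.
samples = []
for w, med, sig in branches:
    n = int(round(w * N))
    samples.append(np.exp(np.log(med) + sig * rng.standard_normal(n)))
pga = np.concatenate(samples)

print(f"mean PGA={pga.mean():.3f} g, COV={pga.std()/pga.mean():.2f}")
```

Mapping the COV site by site, as the study does, flags where the logic-tree branches disagree most.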

Relevance:

70.00%

Publisher:

Abstract:

Advanced control techniques like V2, Vout hysteresis or V2Ic can strongly reduce the required output capacitance in PowerSoC converters. Techniques for analyzing power converters based on the frequency response are not suitable for ripple-based controllers, which use fast-scale dynamics to control the power stage. This paper shows that discrete modeling together with Floquet theory is a very powerful tool for modeling such systems and deriving stable-region diagrams for sensitivity analysis. The approach is applied to V2Ic control, validating experimentally that Floquet theory accurately predicts subharmonic oscillations. The method is then applied to several ripple-based controllers, providing higher accuracy than other techniques based on the frequency response. The paper experimentally validates the usefulness of discrete modeling and Floquet theory on a 5 MHz buck converter with V2Ic control.
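In the discrete-modeling approach, stability is judged from the Floquet multipliers, the eigenvalues of the monodromy matrix over one switching period; the system is stable when all multipliers lie inside the unit circle. A minimal open-loop buck-converter sketch is shown below. Component values are assumed, and the V2Ic feedback loop and the saltation matrices at the switching instants, which a closed-loop analysis like the paper's must include, are omitted:

```python
import numpy as np
from scipy.linalg import expm

# Open-loop buck converter state x = (iL, vC); Floquet multipliers of the
# one-period map. L, C, R, fsw and the duty cycle are assumed values, and
# the duty cycle is fixed (the V2Ic modulator itself is not modeled here).
L, C, R = 1e-6, 10e-6, 1.0        # 1 uH, 10 uF, 1 ohm
fsw, D = 5e6, 0.5                 # 5 MHz switching, 50% duty
T = 1.0 / fsw

# For an ideal buck the state matrix is identical in both switch states
# (only the input vector changes), so the monodromy matrix reduces to
# Phi = expm(A*(1-D)*T) @ expm(A*D*T) = expm(A*T).
A = np.array([[0.0, -1.0 / L],
              [1.0 / C, -1.0 / (R * C)]])
Phi = expm(A * (1 - D) * T) @ expm(A * D * T)

multipliers = np.linalg.eigvals(Phi)
print("Floquet multiplier moduli:", np.abs(multipliers))
print("stable:", bool(np.all(np.abs(multipliers) < 1.0)))
```

With the controller included, a multiplier crossing the unit circle through -1 signals the subharmonic oscillation that the stable-region diagrams delimit.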

Relevance:

70.00%

Publisher:

Abstract:

Ripple-based controls can strongly reduce the required output capacitance in PowerSoC converters thanks to a very fast dynamic response. Unfortunately, these controls are prone to subharmonic oscillations, and several parameters affect the stability of these systems. This paper derives and validates a simulation-based modeling and stability analysis of a closed-loop V2Ic control applied to a 5 MHz buck converter, using discrete modeling and Floquet theory to predict stability. This allows the derivation of sensitivity analyses for designing robust systems. The work is extended to different V2 architectures using the same methodology.