979 results for "Least cost"
Abstract:
[Excerpt] Bioethanol from lignocellulosic materials (LCM), also called second-generation bioethanol, is considered a promising alternative to first-generation bioethanol. An efficient production process for lignocellulosic bioethanol involves an effective pretreatment of LCM to improve the accessibility of cellulose and thus enhance enzymatic saccharification. One interesting approach is to use the whole slurry from the pretreatment, since it offers economic and industrial benefits: washing steps are avoided, water consumption is lower, and the sugars from the liquid phase can be used, increasing the ethanol concentration [1]. However, during the pretreatment step some compounds (such as furans, phenolic compounds and weak acids) are produced. These compounds have an inhibitory effect on the microorganisms used for hydrolysate fermentation [2]. To overcome this, the use of a robust industrial strain together with agro-industrial by-products as nutritional supplementation was proposed to increase ethanol productivities and yields. (...)
Abstract:
Master's dissertation in Sociology (area of specialization: Organizations and Work)
Abstract:
OBJECTIVE: This study was performed to observe the number of pacemakers that had never been reprogrammed after implantation, and the effect of optimised output programming on the estimated longevity of pulse generators in patients with pacemakers. METHODS: Sixty patients with Teletronics Reflex pacemakers were evaluated in a pacemaker clinic from the beginning of its activities, in June 1998, until March 1999. Telemetry was performed during the first clinic visit, and we observed how many pulse generators retained the manufacturer's nominal output settings, indicating the absence of reprogramming until that date. After evaluation of the capture threshold, the pacemakers were reprogrammed with a safety margin of 2 to 2.5:1, and we compared the estimated longevity based on battery current at the manufacturer's settings with that based on the settings achieved after reprogramming. RESULTS: In 95% of the cases, the original programmed setting had never been changed before the patients attended the pacemaker clinic. Reprogramming the pacemaker prolonged estimated pulse generator life by 19.7±15.6 months (35.5%). CONCLUSION: The majority of the pacemakers evaluated had never been reprogrammed. Estimated pulse generator longevity can be prolonged significantly using this simple, safe, efficacious, and cost-effective procedure.
Abstract:
OBJECTIVE: To analyze the reasons given by patients for interrupting their pharmacological treatment of hypertension. METHODS: We carried out an observational cross-sectional study, in which a questionnaire was applied and blood pressure was measured in 401 patients in different centers of the state of Bahia. The patients selected had been diagnosed with hypertension and had not been on antihypertensive treatment for at least 60 days. Clinical and epidemiological characteristics of the groups were analyzed. RESULTS: Of the 401 patients, 58.4% were female, 55.6% of whom were white; 60.5% of the males were white. The major reasons alleged for not adhering to treatment were as follows (for males and females, respectively): normalization of blood pressure (41.3% and 42.3%); side effects of the medications (31.7% and 24.8%); forgetting to use the medication (25.2% and 20.1%); cost of medication (21.6% and 20.1%); fear of mixing alcohol and medication (23.4% and 3.8%); ignorance of the need for continuing the treatment (15% and 21.8%); use of an alternative treatment (11.4% and 17.1%); fear of intoxication (9.6% and 12.4%); fear of hypotension (9.6% and 12%); and fear of mixing the medication with other drugs (8.4% and 6.1%). CONCLUSION: Our data suggest that most factors concerning the abandonment of the treatment of hypertension are related to lack of information, and that, despite the advancement in antihypertensive drugs, side effects still account for most abandonments of treatment.
Abstract:
Background: Statins have proven efficacy in the reduction of cardiovascular events, but the financial impact of their widespread use can be substantial. Objective: To conduct a cost-effectiveness analysis of three statin dosing schemes from the perspective of the Brazilian Unified National Health System (SUS). Methods: We developed a Markov model to evaluate the incremental cost-effectiveness ratios (ICERs) of low-, intermediate- and high-intensity dose regimens in one secondary and four primary prevention scenarios (5%, 10%, 15% and 20% ten-year risk) of cardiovascular events. Regimens with an expected low-density lipoprotein cholesterol reduction below 30% (e.g. simvastatin 10 mg) were considered low dose; between 30-40% (atorvastatin 10 mg, simvastatin 40 mg), intermediate dose; and above 40% (atorvastatin 20-80 mg, rosuvastatin 20 mg), high-dose statins. Effectiveness data were obtained from a systematic review with 136,000 patients. National data were used to estimate utilities and costs (expressed as International Dollars, Int$). A willingness-to-pay (WTP) threshold equal to the Brazilian gross domestic product per capita (circa Int$11,770) was applied. Results: Low dose was ruled out by extended dominance in the primary prevention scenarios. In the five scenarios, the ICER of the intermediate dose was below Int$10,000 per QALY. The ICER of the high versus intermediate dose comparison was above Int$27,000 per QALY in all scenarios. In the cost-effectiveness acceptability curves, the intermediate dose had a probability above 50% of being cost-effective, with ICERs between Int$9,000-20,000 per QALY in all scenarios. Conclusions: Considering a reasonable WTP threshold, intermediate-dose statin therapy is economically attractive and should be a priority intervention in the prevention of cardiovascular events in Brazil.
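The comparison at the heart of the abstract above, an ICER measured against a WTP threshold, can be sketched as follows. This is a minimal illustration: the function name and the regimen figures are placeholders of ours, not the study's data.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained when moving from the old to the
    new regimen. Returns infinity if no QALYs are gained (dominated)."""
    d_qaly = qaly_new - qaly_old
    if d_qaly <= 0:
        return float("inf")
    return (cost_new - cost_old) / d_qaly

WTP = 11_770  # Int$ per QALY (Brazilian GDP per capita, per the abstract)

# Hypothetical lifetime cost (Int$) and QALYs per patient for two regimens
low = (4_000, 10.00)
intermediate = (4_900, 10.10)

ratio = icer(*intermediate, *low)
print(f"ICER: Int${ratio:,.0f}/QALY ->",
      "cost-effective" if ratio <= WTP else "not cost-effective")
```

A regimen is deemed economically attractive when its ICER against the next-cheapest non-dominated option falls below the WTP threshold, which is the comparison the acceptability curves in the study summarize.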
Abstract:
Background: Polypharmacy is a significant economic burden. Objective: We tested whether using reverse auction (RA), as compared with commercial pharmacy (CP), to purchase medicine results in lower pharmaceutical costs for heart failure (HF) and heart transplantation (HT) outpatients. Methods: We compared the costs via RA versus CP in 808 HF and 147 HT patients followed from 2009 through 2011, and evaluated the influence of clinical and demographic variables on cost. Results: The monthly cost per patient for HF drugs acquired via RA was $10.15 (IQ 3.51-40.22) versus $161.76 (IQ 86.05-340.15) via CP; for HT, those costs were $393.08 (IQ 124.74-774.76) and $1,207.70 (IQ 604.48-2,499.97), respectively. Conclusion: RA may reduce the cost of prescription drugs for HF and HT, potentially making HF treatment more accessible. Clinical characteristics can influence the costs and benefits of RA. RA may be a new health policy strategy to reduce the costs of prescribed medications for HF and HT patients, reducing the economic burden of treatment.
Abstract:
Background: Exercise is essential for patients with heart failure, as it leads to a reduction in morbidity and mortality as well as improved functional capacity and oxygen uptake (v̇O2). However, the need for an experienced physiologist and the cost of the exam may render the cardiopulmonary exercise test (CPET) unfeasible. Thus, the six-minute walk test (6MWT) and step test (ST) may be alternatives for exercise prescription. Objective: The aim was to correlate heart rate (HR) during the 6MWT and ST with HR at the anaerobic threshold (HRAT) and peak HR (HRP) obtained on the CPET. Methods: Eighty-three patients (58 ± 11 years) with heart failure (NYHA class II) were included, and all subjects had been on optimized medication for at least 3 months. Evaluations involved CPET (v̇O2, HRAT, HRP), 6MWT (HR6MWT) and ST (HRST). Results: The participants exhibited severe ventricular dysfunction (ejection fraction: 31 ± 7%) and low peak v̇O2 (15.2 ± 3.1 mL.kg-1.min-1). HRP (113 ± 19 bpm) was higher than HRAT (92 ± 14 bpm; p < 0.05) and HR6MWT (94 ± 13 bpm; p < 0.05). No significant difference was found between HRP and HRST. Moreover, a strong correlation was found between HRAT and HR6MWT (r = 0.81; p < 0.0001), and between HRP and HRST (r = 0.89; p < 0.0001). Conclusion: These findings suggest that, in the absence of CPET, exercise prescription can be performed by use of the 6MWT and ST, based on HR6MWT and HRST.
Abstract:
The main object of the present paper is to give formulas and methods which enable us to determine the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all the processes is essentially the following. Knowing the frequency p of the desired events and q of the undesired events, we may calculate the frequency of all possible combinations to be expected in n repetitions by expanding the binomial (p+q)^n. Determining which of these combinations we want to avoid, we calculate their total frequency and select the exponent n of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision: n! p^n [ (q/p)^n / n! + (q/p)^(n-1) / (1!(n-1)!) + (q/p)^(n-2) / (2!(n-2)!) + (q/p)^(n-3) / (3!(n-3)!) + ... ] <= P_lim ----(1b). There is no absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1, 56) two relative values, one equal to 1/5n as the lowest value of probability and the other equal to 1/10n as the highest value of improbability, leaving between them what may be called the "region of doubt". However, these formulas cannot be applied in our case, since the number n is precisely the unknown quantity. Thus we have to use, instead of the more exact values of these two formulas, the conventional limits P_lim equal to 0.05 (precision 5%), 0.01 (precision 1%) and 0.001 (precision 0.1%). The binomial formula as explained above (cf. formula 1, pg. 85), however, is of rather limited applicability owing to the excessive calculation required, and we thus have to procure approximations as substitutes.
We may use, without loss of precision, the following approximations: a) the normal or Gaussian distribution when the expected frequency p has any value between 0.1 and 0.9 and when n is at least greater than ten; b) the Poisson distribution when the expected frequency p is smaller than 0.1. Tables V to VII show, for some special cases, that these approximations are very satisfactory. The practical solution of the problems stated in the introduction can now be given: A) What is the minimum number of repetitions necessary in order to avoid that any one of a treatments, varieties, etc. may accidentally always be the best, or the best and second best, or the first, second and third best, or finally one of the m best treatments, varieties, etc.? Using the first term of the binomial, we have the following equation for n: n = log P_lim / log(m/a) = log P_lim / (log m - log a) ----(5). B) What is the minimum number of individuals necessary in order that a certain type, expected with frequency p, may appear in at least one, two, three or a = m+1 individuals? 1) For p between 0.1 and 0.9, using the Gaussian approximation, we have: np - δ √(np(1-p)) = a - 1 = m; b = δ √((1-p)/p) and c = m/p ----(7); n = {(b + √(b² + 4c)) / 2}², n' = 1/p, n(cor) = n + n' ----(8). We have to use the correction n' when p has a value between 0.25 and 0.75. The Greek letter delta represents in the present case the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09, respectively. If we are only interested in having at least one individual, m becomes equal to zero and the formula reduces to: c = 0 for a = 1; n = {(b + b)/2}² = b² = δ² (1-p)/p; n' = 1/p, n(cor) = n + n' ----(9). 2) If p is smaller than 0.1 we may use table 1 in order to find the mean m of a Poisson distribution and determine n = m/p. C) What is the minimum number of individuals necessary for distinguishing two frequencies p1 and p2?
1) When p1 and p2 are values between 0.1 and 0.9 we have: n = {δ (√(p1(1-p1)) + √(p2(1-p2))) / (p1 - p2)}², n' = 1/(p1 - p2), n(cor) = n + n' ----(13). We again have to use the unilateral limits of the Gaussian distribution. The correction n' should be used if at least one of the values p1 or p2 lies between 0.25 and 0.75. A more complicated formula may be used in cases where we want to increase the precision: n(p1 - p2) - δ √(n p1(1-p1) + n p2(1-p2)) = m; b = δ √(p1(1-p1) + p2(1-p2)) / (p1 - p2) and c = m/(p1 - p2); n = {(b + √(b² + 4c)) / 2}², n' = 1/(p1 - p2) ----(14). 2) When both p1 and p2 are smaller than 0.1 we determine the quotient (p1 : p2) and procure the corresponding number m2 of a Poisson distribution in table 2. The value of n is found by the equation: n = m2/p2 ----(15). D) What is the minimum number necessary for distinguishing three or more frequencies p1, p2, p3? 1) If the frequencies p1, p2, p3 are values between 0.1 and 0.9 we have to solve the individual pairwise equations and use the highest value of n thus determined: n(1,2) = {δ (√(p1(1-p1)) + √(p2(1-p2))) / (p1 - p2)}² ----(16). Delta now represents the bilateral limits of the Gaussian distribution: 1.96, 2.58 and 3.29. 2) No table was prepared for the relatively rare cases of a comparison of three or more frequencies below 0.1; in such cases extremely high numbers would be required. E) A process is given which serves to solve two problems of an informatory nature: a) if a special type appears among n individuals with a frequency p(obs), what may be the corresponding ideal value of p(esp); or b) if we study samples of n individuals and expect a certain type with a frequency p(esp), what may be the extreme limits of p(obs) in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use table 3.
To solve the first question we select the respective horizontal line for p(obs), determine which column corresponds to our value of n, and find the respective value of p(esp) by interpolating between columns. In order to solve the second problem we start with the respective column for p(esp) and find the horizontal line for the given value of n, either directly or by approximation and interpolation. 2) For frequencies smaller than 0.1 we have to use table 4 and transform the fractions p(esp) and p(obs) into numbers of the Poisson series by multiplication with n. In order to solve the first problem, we verify in which line the lower Poisson limit is equal to m(obs) and transform the corresponding value of m into the frequency p(esp) by dividing through n. The observed frequency may thus be a chance deviate of any value between 0.0... and the value given by dividing the value of m in the table by n. In the second case we first transform the expectation p(esp) into a value of m and procure, in the horizontal line corresponding to m(esp), the extreme values of m, which must then be transformed, by dividing through n, into values of p(obs). F) Partial and progressive tests may be recommended in all cases where there is a lack of material or where the loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that the minimum number really represents at the same time a maximum number, necessary only if one takes into consideration essentially the unfavorable variations; smaller numbers may frequently already give satisfactory results. For instance, by definition, we know that a frequency of p means that we expect one individual in every total of 1/p. If there were no chance variations, this number 1/p would be sufficient, and if there were favorable variations a still smaller number might yield one individual of the desired type.
Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above, and increase the total until the desired result is obtained, which may well be before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetical experiments with maize.
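The minimum-number calculations described in the abstract above can be sketched numerically. This is a hedged reconstruction: the function names and example values are ours, formula (5) is applied in its simplest case (m = 1, avoiding one treatment being accidentally always the best), and the exact binomial tail from formula (1b) is used only as a check.

```python
import math

def min_repetitions_best(a, p_lim, m=1):
    """Formula (5): smallest n such that the chance of one treatment out
    of `a` accidentally ranking among the m best in all n repetitions,
    (m/a)**n, does not exceed the precision limit p_lim."""
    return math.ceil(math.log(p_lim) / math.log(m / a))

def min_individuals_one(p, delta):
    """Formula (9): minimum n so that a type of frequency p appears at
    least once, n = delta**2 * (1-p)/p, plus the correction n' = 1/p
    when 0.25 < p < 0.75."""
    n = delta**2 * (1.0 - p) / p
    if 0.25 < p < 0.75:
        n += 1.0 / p
    return math.ceil(n)

def tail_probability(n, p, a=1):
    """Exact binomial check: probability that the desired type
    (frequency p) appears fewer than `a` times in n repetitions."""
    q = 1.0 - p
    return sum(math.comb(n, k) * p**k * q**(n - k) for k in range(a))

n1 = min_repetitions_best(4, 0.05)   # 4 treatments, precision 5% -> 3
n2 = min_individuals_one(0.5, 1.64)  # p = 0.5, unilateral delta -> 5
print(n1, (1 / 4) ** n1)             # (1/4)**3 = 0.015625 <= 0.05
print(n2, tail_probability(n2, 0.5)) # 0.5**5  = 0.03125  <= 0.05
```

The checks illustrate the paper's point that the computed minimum is conservative: the exact tail probabilities fall comfortably below the 5% precision limit, which is why smaller numbers may, with luck, already suffice in a progressive test.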
Abstract:
Aspects related to the longevity and oviposition of Ophyra albuquerquei Lopes, 1985 are studied. Males had a longer mean lifespan than females (40.24 vs. 33.15 days, respectively), while also showing qualitative advantages during this period. O. albuquerquei females were fed powdered milk, fish flour, refined sugar and water, and were provided fish flour and moistened sawdust for oviposition. The length of the oviposition period for females was 46.75 days, and most of the egg deposition occurred in the first days of the colony. Females completed 50% of their egg deposition by day 16, seven days before the last large mortality peak and about 12 days after the first oviposition in the colony. Females deposited an average of 184 eggs per individual. Mortality of males, unlike that of females, was low until the 28th day and increased thereafter. It was demonstrated that it is possible to maintain colonies of this species under laboratory conditions for at least 28 days with high fertility and low cost.
Abstract:
This paper investigates the selection of governance forms in interfirm collaborations, taking into account the predictions from transaction costs and property rights theories. Transaction costs arguments are often used to justify the introduction of hierarchical controls in collaborations, but the ownership dimension of going from "contracts" to "hierarchies" has been ignored in the past, and with it the so-called "costs of ownership". The theoretical results, tested with a sample of collaborations in which Spanish firms participate, indicate that the costs of ownership may offset the benefits of hierarchical controls and therefore limit their diffusion. Evidence is also reported of possible complementarities between reputation effects and forms of ownership that go together with hierarchical controls (i.e. joint ventures), in contrast with the generally assumed substitutability between the two.
Abstract:
We study the relation between the number of firms and price-cost margins under price competition with uncertainty about competitors' costs. We present results of an experiment in which two, three and four identical firms repeatedly interact in this environment. In line with the theoretical prediction, market prices decrease with the number of firms, but on average stay above marginal costs. Pricing is less aggressive in duopolies than in triopolies and tetrapolies. However, independently from the number of firms, pricing is more aggressive than in the theoretical equilibrium. Both the absolute and the relative surpluses increase with the number of firms. Total surplus is close to the equilibrium level, since enhanced consumer surplus through lower prices is counteracted by occasional displacements of the most efficient firm in production.
Abstract:
Description of a costing model developed by a digital production librarian to determine the cost of putting an item into the Claremont Colleges Digital Library at the Claremont University Consortium. This case study includes variables such as material types and funding sources, data collection methods, and formulas and calculations for analysis. The model is useful to digital project coordinators and digital library projects for grant applications, cost allocation, and budgeting.