101 results for Lower cost

at Université de Lausanne, Switzerland


Relevance:

70.00%

Publisher:

Abstract:

AIM: The study aimed to compare the rate of success and cost of anal fistula plug (AFP) insertion and endorectal advancement flap (ERAF) for anal fistula. METHOD: Patients receiving an AFP or ERAF for a complex single fistula tract, defined as involving more than a third of the longitudinal length of the anal sphincter, were registered in a prospective database. A regression analysis was performed of factors predicting recurrence and contributing to cost. RESULTS: Seventy-one patients (AFP 31, ERAF 40) were analysed. Twelve (39%) recurrences occurred in the AFP group and 17 (43%) in the ERAF group (P = 1.00). The median length of stay was 1.23 and 2.0 days (P < 0.001), respectively, and the mean cost of treatment was €5439 ± €2629 and €7957 ± €5905 (P = 0.021), respectively. On multivariable analysis, postoperative complications, underlying inflammatory bowel disease and fistula recurring after previous treatment were independent predictors of de novo recurrence; a length of hospital stay ≤ 1 day was the most significant independent contributor to lower cost (P = 0.023). CONCLUSION: Anal fistula plug and ERAF were equally effective in treating fistula-in-ano, but AFP offered a mean cost saving of €2518 per procedure compared with ERAF. The higher cost of ERAF is due to a longer median length of stay.
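The headline saving follows directly from the two reported mean costs; a quick arithmetic check (figures taken from the abstract, illustrative only):

```python
# Mean treatment costs reported in the abstract (EUR).
afp_mean_cost = 5439.0   # anal fistula plug (AFP)
eraf_mean_cost = 7957.0  # endorectal advancement flap (ERAF)

# Mean cost saving per procedure when using AFP instead of ERAF.
saving = eraf_mean_cost - afp_mean_cost
print(saving)  # 2518.0
```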

Relevance:

60.00%

Publisher:

Abstract:

Atrial fibrillation (AF) is the most common arrhythmia and among the leading causes of stroke and heart failure in Western populations. Despite the increasing size of clinical trials assessing the efficacy and safety of AF therapies, the outcomes achieved have not always matched expectations. Considering that AF is a symptom of many possible underlying diseases, clinical research on this arrhythmia should take their respective pathophysiology into account. Accordingly, the definition of the study populations to be included should rely on established as well as new classifications of AF and take advantage of a differentiated look at the AF electrocardiogram and of an increasingly large spectrum of biomarkers. Such an integrated approach could bring researchers and treating physicians one step closer to the ultimate vision of personalized therapy, which, in this case, means an AF therapy based on refined diagnostic elements in accordance with scientific evidence gathered from clinical trials. By applying clear-cut patient inclusion criteria, future studies will be smaller and thus lower in cost. In addition, the findings from such studies will be of greater predictive value at the individual patient level, allowing for pinpointed therapeutic decisions in daily practice.

Relevance:

60.00%

Publisher:

Abstract:

The drug discovery process has recently been deeply transformed by the use of computational ligand-based or structure-based methods, which help with lead compound identification and optimization, and ultimately deliver new drug candidates more quickly and at lower cost. Structure-based computational methods for drug discovery mainly involve ligand-protein docking and rapid binding free energy estimation, both of which require force field parameterization of many drug candidates. Here, we present a fast force field generation tool, called SwissParam, able to generate, for an arbitrary small organic molecule, topologies and parameters based on the Merck molecular force field, but in a functional form that is compatible with the CHARMM force field. Output files can be used with CHARMM or GROMACS. The topologies and parameters generated by SwissParam are used by the docking software EADock2 and EADock DSS to describe the small molecules to be docked, while the protein is described by the CHARMM force field, allowing them to reach success rates ranging from 56 to 78%. We have also developed a rapid binding free energy estimation approach, using SwissParam for ligands and CHARMM22/27 for proteins, which requires only a short minimization to reproduce the experimental binding free energies of 214 ligand-protein complexes involving 62 different proteins, with a standard error of 2.0 kcal mol⁻¹ and a correlation coefficient of 0.74. Together, these results demonstrate the relevance of using SwissParam topologies and parameters to describe small organic molecules in computer-aided drug design applications, together with a CHARMM22/27 description of the target protein. SwissParam is available free of charge for academic users at www.swissparam.ch.
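The two accuracy metrics quoted above (error of the free-energy estimate and correlation with experiment) can be computed in a few lines. The arrays below are hypothetical, not the 214 complexes from the study, and the error is sketched here as a root-mean-square deviation; the paper's exact error definition may differ:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def rms_error(pred, exp):
    """Root-mean-square deviation of predicted vs experimental values."""
    return math.sqrt(sum((p - e) ** 2 for p, e in zip(pred, exp)) / len(pred))

# Hypothetical predicted vs experimental binding free energies (kcal/mol).
pred = [-8.1, -6.3, -9.5, -7.2, -5.8]
exp = [-7.5, -6.9, -10.2, -6.8, -5.1]
r = pearson_r(pred, exp)
err = rms_error(pred, exp)
```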

Relevance:

60.00%

Publisher:

Abstract:

Aim: The imperfect detection of species may lead to erroneous conclusions about species-environment relationships. Accuracy in species detection usually requires temporal replication at sampling sites, a time-consuming and costly monitoring scheme. Here, we applied a lower-cost alternative based on a double-sampling approach to incorporate the reliability of species detection into regression-based species distribution modelling.

Location: Doñana National Park (south-western Spain).

Methods: Using species-specific monthly detection probabilities, we estimated the detection reliability as the probability of having detected the species given the species-specific survey time. Such reliability estimates were used to account explicitly for data uncertainty by weighting each absence. We illustrated how this novel framework can be used to evaluate four competing hypotheses as to what constitutes the primary environmental control of amphibian distribution: breeding habitat, aestivating habitat, spatial distribution of surrounding habitats and/or major ecosystem zonation. The study was conducted on six pond-breeding amphibian species over a 4-year period.

Results: Non-detections should not be considered equivalent to real absences, as their reliability varied considerably. The occurrence of Hyla meridionalis and Triturus pygmaeus was related to a particular major ecosystem of the study area, where suitable habitat for these species seemed to be widely available. Characteristics of the breeding habitat (area and hydroperiod) were of high importance for the occurrence of Pelobates cultripes and Pleurodeles waltl. Terrestrial characteristics were the most important predictors of the occurrence of Discoglossus galganoi and Lissotriton boscai, along with the spatial distribution of breeding habitats for the last species.

Main conclusions: We did not find a single best-supported hypothesis valid for all species, which stresses the importance of multiscale and multifactor approaches. More importantly, this study shows that estimating the reliability of non-detection records, an exercise previously seen as a naïve goal in species distribution modelling, is feasible and could be promoted in future studies, at least in comparable systems.
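The reliability weighting can be sketched as follows. The closed form below assumes independent monthly surveys with a constant species-specific detection probability, which is an assumption on my part, since the abstract does not give the exact estimator:

```python
def detection_reliability(p_monthly, n_months):
    """Probability the species would have been detected at least once,
    given that it is present, after n_months of independent monthly surveys
    (assumed form: 1 - (1 - p)^t)."""
    return 1.0 - (1.0 - p_monthly) ** n_months

# Weight each non-detection (pseudo-absence) by its reliability: a site
# surveyed longer, or an easily detected species, yields a more
# trustworthy absence record for the regression model.
weight = detection_reliability(p_monthly=0.4, n_months=6)
```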

Relevance:

60.00%

Publisher:

Abstract:

The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture design is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation subject to competition between the incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of complex-system global software support. Thirdly, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

60.00%

Publisher:

Abstract:

Long synthetic peptides (LSPs) have a variety of important clinical uses as synthetic vaccines and drugs. Techniques for peptide synthesis were revolutionized in the 1960s and 1980s, after which efficient techniques for purification and characterization of the product were developed. These improved techniques allowed the stepwise synthesis of increasingly longer products at a faster rate, greater purity, and lower cost for clinical use. A synthetic peptide approach, coupled with bioinformatics analysis of genomes, can tremendously expand the search for clinically relevant products. In this Review, we discuss efforts to develop a malaria vaccine from LSPs, among other clinically directed work.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Intravenously administered antimicrobial agents have been the standard choice for the empirical management of fever in patients with cancer and granulocytopenia. If orally administered empirical therapy were as effective as intravenous therapy, it would offer advantages such as improved quality of life and lower cost. METHODS: In a prospective, open-label, multicenter trial, we randomly assigned febrile patients with cancer who had granulocytopenia that was expected to resolve within 10 days to receive empirical therapy with either oral ciprofloxacin (750 mg twice daily) plus amoxicillin-clavulanate (625 mg three times daily) or standard daily doses of intravenous ceftriaxone plus amikacin. All patients were hospitalized until their fever resolved. The primary objective of the study was to determine whether there was equivalence between the regimens, defined as an absolute difference in the rates of success of 10 percent or less. RESULTS: Equivalence was demonstrated at the second interim analysis, and the trial was terminated after the enrollment of 353 patients. In the analysis of the 312 patients who were treated according to the protocol and who could be evaluated, treatment was successful in 86 percent of the patients in the oral-therapy group (95 percent confidence interval, 80 to 91 percent) and 84 percent of those in the intravenous-therapy group (95 percent confidence interval, 78 to 90 percent; P=0.02). The results were similar in the intention-to-treat analysis (80 percent and 77 percent, respectively; P=0.03), as were the duration of fever, the time to a change in the regimen, the reasons for such a change, the duration of therapy, and survival. The types of adverse events differed slightly between the groups but were similar in frequency. CONCLUSIONS: In low-risk patients with cancer who have fever and granulocytopenia, oral therapy with ciprofloxacin plus amoxicillin-clavulanate is as effective as intravenous therapy.
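The trial's equivalence criterion (an absolute difference in success rates of 10 percentage points or less) can be illustrated with a generic two-proportion confidence-interval check. This is a textbook Wald-interval sketch, not the trial's actual interim-analysis procedure, and the group sizes below are hypothetical:

```python
import math

def equivalence_check(succ1, n1, succ2, n2, margin=0.10, z=1.96):
    """Return the ~95% Wald CI for the difference in success rates and
    whether it lies entirely within the +/- margin equivalence band."""
    p1, p2 = succ1 / n1, succ2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    lo, hi = diff - z * se, diff + z * se
    return lo, hi, (-margin < lo and hi < margin)

# Hypothetical arms: 86% vs 84% success in two groups of 155 patients.
lo, hi, equivalent = equivalence_check(133, 155, 130, 155)
```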

Relevance:

60.00%

Publisher:

Abstract:

The relative importance of molecular biology in clinical practice is often underestimated. However, numerous procedures in clinical diagnosis and new therapeutic drugs have resulted from basic molecular research. Furthermore, understanding of the physiological and physiopathological mechanisms underlying several human diseases has been improved by the results of basic molecular research. For example, cloning of the gene encoding leptin has provided spectacular insights into the understanding of the mechanisms involved in the control of food intake and body weight maintenance in man. In cystic fibrosis, the cloning and identification of several mutations in the gene encoding the chloride channel transmembrane regulator (CFTR) have resolved several important issues in clinical practice: cystic fibrosis constitutes a molecular defect of a single gene. There is a strong correlation between the clinical manifestations or the severity of the disease (phenotype) with the type of mutations present in the CFTR gene (genotype). More recently, identification of mutations in the gene encoding a subunit of the renal sodium channel in the Liddle syndrome has provided important insight into the physiopathological understanding of mechanisms involved in this form of hereditary hypertension. Salt retention and secondary high blood pressure are the result of constitutive activation of the renal sodium channel by mutations in the gene encoding the renal sodium channel. It is speculated that less severe mutations in this channel could result in a less severe form of hypertension which may correspond to patients suffering from high blood pressure with low plasma renin activity. Several tools, most notably PCR, are derived from molecular research and are used in everyday practice, i.e. in prenatal diagnosis and in the diagnosis of several infectious diseases including tuberculosis and hepatitis. 
Finally, the production of recombinant proteins at lower cost and with fewer side effects is used in everyday clinical practice. Gene therapy remains an extraordinary challenge in correcting severe hereditary or acquired diseases. The use of genetically modified animal cell lines producing growth factors, insulin or erythropoietin, which are subsequently encapsulated and transferred to man, represents an attractive approach for gene therapy.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups.
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
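The headline figure combines the annual net cost with the QALYs gained via the incremental cost-effectiveness ratio; the QALY total below is back-calculated from the abstract's numbers, not stated in the source:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

# $3.6 billion net annual cost at $42,000/QALY implies roughly
# 3.6e9 / 42,000 ~ 85,700 QALYs gained per year (back-calculated).
implied_qalys = 3.6e9 / 42_000
```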

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: According to recent guidelines, patients with coronary artery disease (CAD) should undergo revascularization if significant myocardial ischemia is present. Both cardiovascular magnetic resonance (CMR) and fractional flow reserve (FFR) allow for a reliable ischemia assessment, and in combination with the anatomical information provided by invasive coronary angiography (CXA), such a work-up sets the basis for a decision to revascularize or not. The cost-effectiveness ratios of these two strategies are compared. METHODS: Strategy 1) CMR to assess ischemia followed by CXA in ischemia-positive patients (CMR + CXA); Strategy 2) CXA followed by FFR in angiographically positive stenoses (CXA + FFR). The costs, evaluated from the third-party payer perspective in Switzerland, Germany, the United Kingdom (UK), and the United States (US), included public prices of the different outpatient procedures and costs induced by procedural complications and by diagnostic errors. The effectiveness criterion was the correct identification of hemodynamically significant coronary lesion(s) (= significant CAD) complemented by full anatomical information. Test performances were derived from the published literature. Cost-effectiveness ratios for both strategies were compared for hypothetical cohorts with different pretest likelihoods of significant CAD. RESULTS: CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 65% in Germany, 83% in the UK, and 82% in the US, with costs of CHF 5'794, € 1'517, £ 2'680, and $ 2'179 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. CONCLUSIONS: The CMR + CXA strategy is more cost-effective than CXA + FFR below a CAD prevalence of 62%, 65%, 83%, and 82% for the Swiss, German, UK, and US health care systems, respectively. These findings may help to optimize resource utilization in the diagnosis of CAD.
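The effectiveness criterion, cost per patient correctly diagnosed, can be sketched as a function of pretest likelihood. The sensitivities, specificities and per-strategy costs below are hypothetical placeholders, not the study's published values:

```python
def cost_per_correct_diagnosis(pretest_prob, strategy_cost, sens, spec):
    """Expected cost per correctly classified patient for one strategy:
    cost divided by the probability of a correct result (true positive
    or true negative) at the given pretest likelihood of disease."""
    p_correct = pretest_prob * sens + (1 - pretest_prob) * spec
    return strategy_cost / p_correct

# Sweep pretest likelihood to locate the crossover between two strategies
# (all parameters hypothetical).
for p in (0.3, 0.5, 0.7, 0.9):
    cmr_cxa = cost_per_correct_diagnosis(p, 1200.0, sens=0.89, spec=0.87)
    cxa_ffr = cost_per_correct_diagnosis(p, 1500.0, sens=0.93, spec=0.95)
```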

Relevance:

30.00%

Publisher:

Abstract:

Background: CMR has recently emerged as a robust and reliable technique to assess coronary artery disease (CAD). A negative perfusion CMR test predicts low event rates of 0.3-0.5%/year. Invasive coronary angiography (CA) remains the "gold standard" for the evaluation of CAD in many countries.
Objective: To assess, from a health care payer perspective, the costs of two strategies in the European CMR registry for the work-up of known or suspected CAD: 1) CA for all patients, or 2) CA only for patients diagnosed positive for ischemia on a prior CMR.
Method and results: Using data from the European CMR registry (20 hospitals, 11'040 consecutive patients), we calculated the proportion of patients with known or suspected CAD (n=2'717) who were diagnosed positive (20.6%), uncertain (6.5%), or negative (72.9%) after the CMR test. No other medical test was performed on patients who were negative for ischemia. Patients with a positive diagnosis underwent coronary angiography. Those with an uncertain diagnosis had additional tests (84.7% stress echocardiography, 13.1% CCT, 2.3% SPECT); these costs were added to the CMR strategy costs. Cost information for tests in Germany and Switzerland was used. A sensitivity analysis was performed for inpatient CA. For costs, see the figure ("Results - costs").
Discussion: The CMR strategy costs less than the CA strategy for the health insurance systems in both Germany and Switzerland. While lower in cost, the CMR strategy is non-invasive, involves no radiation exposure, and yields additional information on cardiac function, viability, valves, and great vessels. Using CMR instead of CA might thus reduce costs while offering superior patient safety and comfort, and better utilization of resources at the hospital level. Record added: 01.12.2011

Relevance:

30.00%

Publisher:

Abstract:

Two non-mutually exclusive hypotheses can explain why divorce is an adaptive strategy to improve reproductive success. Under the 'better option hypothesis', only one of the two partners initiates divorce, to secure a higher-quality partner, and increases reproductive success after divorce. Under the 'incompatibility hypothesis', the partners are incompatible, and hence both may increase reproductive success after divorce. In a long-term study of the barn owl (Tyto alba), we address the question of whether one or both partners derive fitness benefits from divorcing. Our results support the hypothesis that divorce is adaptive: after a poor reproductive season, at least one of the two divorcees increases breeding success up to the level of faithful pairs. By breeding together more often, faithful pairs improve coordination and thereby become more efficient at producing successful fledglings. Males would divorce to obtain a compatible mate rather than a mate of higher quality: a heritable melanin-based signal of female quality did not predict divorce (indicating that absolute female quality may not be the cause of divorce), but the new mate of divorced males was less melanic than their previous mate. This suggests that, at least for males, a cost of divorce may be securing a lower-quality but compatible mate. The better option hypothesis could not be formally rejected, as only one of the two divorcing partners commonly succeeded in obtaining higher reproductive success after divorce. In conclusion, incompatible partners divorce to restore reproductive success, and by breeding together more often, faithful partners improve coordination.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: To assess whether patients' characteristics, healthcare resource consumption and costs differed between native and migrant populations in Switzerland. METHODS: All adult patients followed up in the Swiss HIV Cohort Study in our institution during 2000-2003 were considered. Patients' characteristics were retrieved from the cohort database. Hospital and outpatient resource use were extracted from individual charts and valued with 2002 tariffs. RESULTS: The 66 migrants were younger (29 ± 8 years versus 37 ± 11, p < 0.001), less often male (38% versus 70%, p < 0.001), predominantly infected via heterosexual contact (87% versus 52%, p < 0.01), and had a lower mean CD4 level at enrollment (326 ± 235 versus 437 ± 305, p = 0.002) than their 200 native counterparts. Migrants had fewer hospitalizations, more frequent outpatient visits and laboratory tests, and a lower total cost of care per year of follow-up (€ 2'215 ± 4'206 versus € 4'155 ± 12'304, p = 0.037). Resource use and costs were significantly higher in people with CD4 cell counts < 200 in both groups. CONCLUSIONS: The migrant population had more advanced disease and more outpatient visits but fewer hospitalizations, resulting in lower costs of care compared with the native population.

Relevance:

30.00%

Publisher:

Abstract:

This study tested whether the lower economy of walking in healthy elderly subjects is due to greater gait instability. We compared the energy cost of walking and gait instability (assessed by stride-to-stride changes in stride time) in octogenarians (G80, n = 10), 65-yr-olds (G65, n = 10), and young controls (G25, n = 10) walking on a treadmill at six different speeds. The energy cost of walking was higher for G80 than for G25 across the different walking speeds (P < 0.05). Stride time variability at preferred walking speed was significantly greater in G80 (2.31 ± 0.68%) and G65 (1.93 ± 0.39%) than in G25 (1.40 ± 0.30%; P < 0.05). There was no significant correlation between gait instability and energy cost of walking at preferred walking speed. These findings demonstrate greater energy expenditure and increased gait instability in healthy elderly subjects while walking. However, no relationship was noted between these two variables. The increase in energy cost is probably multifactorial, and our results suggest that gait instability is probably not the main contributing factor in this population. We thus conclude that other mechanisms, such as the energy expenditure associated with walking movements and related to mechanical work, or neuromuscular factors, are more likely involved in the higher cost of walking in elderly people.
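Gait instability here is stride-to-stride variability of stride time; a common operationalization (assumed here, as the abstract does not spell out its estimator) is the coefficient of variation of the stride-time series:

```python
import math

def stride_time_cv(stride_times):
    """Stride-to-stride variability as the coefficient of variation (%):
    sample standard deviation of stride times divided by their mean."""
    n = len(stride_times)
    mean = sum(stride_times) / n
    sd = math.sqrt(sum((t - mean) ** 2 for t in stride_times) / (n - 1))
    return 100.0 * sd / mean

# E.g. stride times in seconds recorded on the treadmill (hypothetical).
cv = stride_time_cv([1.02, 0.99, 1.01, 0.98, 1.00])
```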

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary theory predicts that the rate of extrinsic (i.e. age- and condition-independent) mortality should affect important life history traits such as the rate of ageing and maximum lifespan. Sex-specific differences in mortality rates due to predation may therefore result in the evolution of important differences in life history traits between males and females. However, quantifying the role of predators as a factor of extrinsic mortality is notoriously difficult in natural populations. We took advantage of the unusual prey-caching behaviour of the barn owl Tyto alba and the tawny owl Strix aluco to estimate the sex ratio of their five most common prey species. For all prey species, there was a significant bias in the sex ratio of remains found in the nests of both owls. A survey of the literature revealed that sex-biased predation is a common phenomenon. These results demonstrate that predation, a chief source of extrinsic mortality, was strongly sex-biased. This may select for alternative life history strategies in males and females, and may account for male lifespan frequently being shorter than female lifespan in many animal species.