946 results for incremental computation


Relevance: 10.00%

Publisher:

Abstract:

Background: The incremental prognostic value of plasma C-reactive protein (CRP) measurement, relative to the GRACE score, has not been established in patients with non-ST-elevation acute coronary syndromes (ACS). Objective: To test the hypothesis that CRP measured at admission adds prognostic value to the GRACE score in patients with ACS. Methods: We studied 290 individuals consecutively admitted for ACS, whose plasma was collected at admission for CRP measurement by a high-sensitivity method (nephelometry). Cardiovascular outcomes during hospitalization were defined as the composite of death, nonfatal myocardial infarction or nonfatal refractory angina. Results: The incidence of cardiovascular events during hospitalization was 15% (18 deaths, 11 infarctions, 13 episodes of angina), with CRP showing a C-statistic of 0.60 (95% CI = 0.51 - 0.70; p = 0.034) for the prediction of these outcomes. After adjustment for the GRACE score, elevated CRP (defined by the best cutoff point) showed a trend toward association with in-hospital events (OR = 1.89; 95% CI = 0.92 - 3.88; p = 0.08). However, adding the elevated-CRP variable to the GRACE model did not significantly increase the C-statistic, which went from 0.705 to 0.718 (p = 0.46). Likewise, there was no significant risk reclassification with the addition of CRP to the predictive model (net reclassification = 5.7%; p = 0.15). Conclusion: Although CRP is associated with in-hospital outcomes, this inflammatory marker does not add prognostic value to the GRACE score.
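
The C-statistic reported above measures discrimination: the probability that a randomly chosen patient with an event has a higher marker value than a patient without one. A minimal sketch of that calculation, using hypothetical CRP values rather than the study's data:

```python
# Illustrative sketch (not the study's code): the C-statistic ranks every
# event/non-event pair and counts how often the event patient scores higher.
def c_statistic(marker, event):
    """Concordance (C) statistic: P(marker_event > marker_nonevent),
    counting ties as 1/2."""
    cases = [m for m, e in zip(marker, event) if e]
    controls = [m for m, e in zip(marker, event) if not e]
    concordant = 0.0
    for c in cases:
        for k in controls:
            if c > k:
                concordant += 1.0
            elif c == k:
                concordant += 0.5
    return concordant / (len(cases) * len(controls))

# Hypothetical admission CRP values (mg/L) and in-hospital outcomes:
crp = [12.0, 3.1, 8.4, 1.2, 2.0, 2.2, 6.7, 0.9]
outcome = [1, 0, 1, 0, 1, 0, 0, 0]
print(round(c_statistic(crp, outcome), 3))  # → 0.8
```

A value of 0.5 means no discrimination; the study's 0.60 for CRP alone is only modestly better than chance.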

Abstract:

Object-oriented simulation, mechatronic systems, non-iterative algorithm, electric components, piezo-actuator, symbolic computation, Maple, Sparse-Tableau, Library of components

Abstract:

Background: Statins have proven efficacy in the reduction of cardiovascular events, but the financial impact of their widespread use can be substantial. Objective: To conduct a cost-effectiveness analysis of three statin dosing schemes from the perspective of the Brazilian Unified National Health System (SUS). Methods: We developed a Markov model to evaluate the incremental cost-effectiveness ratios (ICERs) of low-, intermediate- and high-intensity dose regimens in one secondary and four primary prevention scenarios (5%, 10%, 15% and 20% ten-year risk) of cardiovascular events. Regimens with an expected low-density lipoprotein cholesterol reduction below 30% (e.g. simvastatin 10 mg) were considered low dose; between 30-40% (atorvastatin 10 mg, simvastatin 40 mg), intermediate dose; and above 40% (atorvastatin 20-80 mg, rosuvastatin 20 mg), high-dose statins. Effectiveness data were obtained from a systematic review with 136,000 patients. National data were used to estimate utilities and costs (expressed as International Dollars, Int$). A willingness-to-pay (WTP) threshold equal to the Brazilian gross domestic product per capita (circa Int$11,770) was applied. Results: Low dose was dominated by extended dominance in the primary prevention scenarios. In the five scenarios, the ICER of the intermediate dose was below Int$10,000 per QALY. The ICER of the high- versus intermediate-dose comparison was above Int$27,000 per QALY in all scenarios. In the cost-effectiveness acceptability curves, the intermediate dose had a probability above 50% of being cost-effective, with ICERs between Int$9,000-20,000 per QALY in all scenarios. Conclusions: Considering a reasonable WTP threshold, intermediate-dose statin therapy is economically attractive and should be a priority intervention in the prevention of cardiovascular events in Brazil.
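
The ICERs quoted above come from a simple ratio of incremental cost to incremental effectiveness between adjacent strategies. A hedged sketch of that arithmetic, with hypothetical per-patient totals rather than the study's model outputs:

```python
# Illustrative ICER arithmetic (hypothetical numbers, not the study's results).
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient lifetime totals (cost in Int$, QALYs):
low = (1200.0, 9.50)
intermediate = (2100.0, 9.60)
high = (4900.0, 9.70)

# Each strategy is compared against the next-cheapest non-dominated one:
print(icer(*intermediate, *low))   # Int$ per QALY, intermediate vs low
print(icer(*high, *intermediate))  # Int$ per QALY, high vs intermediate
```

A strategy is considered cost-effective when its ICER falls below the willingness-to-pay threshold, here circa Int$11,770 per QALY.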

Abstract:

Background: Guidelines recommend that in suspected stable coronary artery disease (CAD), a clinical (non-invasive) evaluation be performed before coronary angiography. Objective: We assessed the efficacy of patient selection for coronary angiography in suspected stable CAD. Methods: We prospectively selected consecutive patients without known CAD referred to a high-volume tertiary center. Demographic characteristics, risk factors, symptoms and non-invasive test results were correlated with the presence of obstructive CAD. We estimated the probability of CAD based on available clinical data and the incremental diagnostic value of previous non-invasive tests. Results: A total of 830 patients were included; the median age was 61 years, 49.3% were male, 81% had hypertension and 35.5% were diabetic. Non-invasive tests were performed in 64.8% of the patients. At coronary angiography, 23.8% of the patients had obstructive CAD. The independent predictors of obstructive CAD were: male gender (odds ratio [OR] 3.95; 95% confidence interval [CI] 2.70 - 5.77), age (OR per 5-year increment 1.15; 95% CI 1.06 - 1.26), diabetes (OR 2.01; 95% CI 1.40 - 2.90), dyslipidemia (OR 2.02; 95% CI 1.32 - 3.07), typical angina (OR 2.92; 95% CI 1.77 - 4.83) and a previous non-invasive test (OR 1.54; 95% CI 1.05 - 2.27). Conclusions: In this study, fewer than a quarter of the patients referred for coronary angiography with suspected CAD had the diagnosis confirmed. Better clinical and non-invasive assessment is necessary to improve the efficacy of patient selection for coronary angiography.

Abstract:

The authors prove some approximate formulas for the computation of the mean and the standard error of quotients of two variates, correlated or uncorrelated, with not too high coefficients of variation. The formulas obtained are subsequently applied to some data on measurements of horses of the Brazilian breed Mangalarga, taken by the eclectic system of LESBRE. The results obtained directly by actual computation of the quotients, as well as by means of the formulas with the aid of statistics of the numerators and the denominators, are given in Table 3, showing excellent agreement.
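
The classical delta-method approximations for the mean and standard error of a quotient of two correlated variates can be sketched as below; the exact formulas proved in the paper may differ in higher-order terms, and the horse measurements used here are hypothetical:

```python
import math

# Sketch of the second-order delta-method approximations for X/Y
# (an assumption about the paper's formulas, not a transcription of them).
def ratio_mean_se(mx, sx, my, sy, rho=0.0):
    """Approximate mean and standard error of X/Y given the means (mx, my),
    standard deviations (sx, sy) and correlation rho of X and Y."""
    cvx, cvy = sx / mx, sy / my          # coefficients of variation
    r = mx / my
    mean = r * (1.0 + cvy**2 - rho * cvx * cvy)                # bias-corrected mean
    se = abs(r) * math.sqrt(cvx**2 + cvy**2 - 2.0 * rho * cvx * cvy)
    return mean, se

# Hypothetical horse measurements: withers height X (cm) vs body length Y (cm).
mean, se = ratio_mean_se(mx=150.0, sx=6.0, my=160.0, sy=8.0, rho=0.5)
print(mean, se)
```

The approximation is only reliable when the coefficient of variation of the denominator is small, which matches the paper's "not too high coefficient of variation" restriction.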

Abstract:

The aim of this paper is to unify the points of view of three recent and independent papers (Ventura 1997; Margolis, Sapir and Weil 2001; Kapovich and Miasnikov 2002), where similar modern versions of a 1951 theorem of Takahasi were given. We develop a theory of algebraic extensions for free groups, highlighting the analogies and differences with respect to the corresponding classical field-theoretic notions, and we discuss in detail the notion of algebraic closure. We apply that theory to the study and the computation of certain algebraic properties of subgroups (e.g. being malnormal, pure, inert or compressed, or being closed in certain profinite topologies) and the corresponding closure operators. We also analyze the closure of a subgroup under the addition of solutions of certain sets of equations.

Abstract:

White micas in carbonate-rich tectonites and a few other rock types from large thrusts in the Swiss Helvetic fold-and-thrust belt have been analyzed by Ar-40/Ar-39 and Rb/Sr techniques to better constrain the timing of Alpine deformation in this region. Incremental Ar-40/Ar-39 heating experiments on 25 weakly metamorphosed (anchizone to low greenschist) samples yield plateau and staircase spectra. We interpret most of the staircase release spectra as resulting from variable mixtures of syntectonic (neoformed) and detrital micas. The range in dates obtained within individual spectra depends primarily on the duration of mica nucleation and growth, and on the relative proportions of neoformed and detrital mica. Rb/Sr analyses of 12 samples yield dates of ca. 10-39 Ma (excluding one anomalously young sample). These dates are slightly younger than the Ar-40/Ar-39 total gas dates obtained for the same samples. The Rb/Sr dates were calculated using initial Sr-87/Sr-86 ratios obtained from the carbonate-dominated host rocks, which are higher than normal Mesozoic carbonate values due to exchange with fluids of higher Sr-87/Sr-86 ratios (and lower O-18/O-16 ratios). Model dates calculated using Sr-87/Sr-86 values typical of Mesozoic marine carbonates more closely approximate the Ar-40/Ar-39 total gas dates for most of the samples. The similarities of the Rb/Sr and Ar-40/Ar-39 total gas dates are consistent with limited amounts of detrital mica in the samples. The delta(18)O values range from 24-15‰ (VSMOW) for 2-6 μm micas and 27-16‰ for the carbonate host rocks. The carbonate values are significantly lower than their protolith values due to localized fluid-rock interaction and fluid flow along most thrust surfaces. Although most calcite-mica pairs are not in oxygen isotope equilibrium at temperatures of ca. 200-400 °C, their isotopic fractionations are indicative of either 1) partial exchange between the minerals and a common external fluid, or 2) growth or isotopic exchange of the mica with the carbonate after the carbonate had isotopically exchanged with an external fluid. The geological significance of these results is not easily or uniquely determined, which exemplifies the difficulties inherent in dating very fine-grained micas of highly deformed tectonites in low-grade metamorphic terranes. Two generalizations can be made regarding the dates obtained from the Helvetic thrusts: 1) samples from the two highest thrusts (Mt. Gond and Sublage) have all of their Ar-40/Ar-39 steps above 20 Ma, and 2) most samples from the deepest Helvetic thrusts have steps (often accounting for more than 80% of the Ar-39 release) between 15 and 25 Ma. These dates are consistent with the order of thrusting in the foreland-imbricating system and with increasing proportions of neoformed relative to detrital mica in the more metamorphosed hinterland and deeply buried portions of the nappe pile. Individual thrusts accommodated the majority of their displacement during their initial incorporation into the foreland-imbricating system, and some thrusts remained active or were reactivated down to 15 Ma.

Abstract:

The project "Analysis of the RTLinux operating system and implementation of a development environment for real-time tasks" examines the feasibility of building a development environment for real-time tasks so that complex control systems can be created, entirely with free software. The work starts with a study of the concept of real time; the real-time operating system RTLinux is then chosen, and the development environment is built using the Tcl/Tk programming language. A set of applications (for computational control) is created to assess the viability of building the desired environment and to ease the work of the end user. The project opens up many possible lines of further work: remote communication, implementation of schedulers, study of device drivers, etc.

Abstract:

This project aims to take part in the RSA Laboratories challenge of breaking the proposed RC5-32-12-9 cryptosystem. To do so, a brute-force attack was chosen, carried out through distributed computing and, more specifically, through Public Resource Computing. The chosen platform is the Berkeley Open Infrastructure for Network Computing (BOINC), known for its use in large projects such as SETI@home. In this project the infrastructure is set up and the applications needed to start the computations that should allow the cryptosystem to be broken are developed.
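
A brute-force key search suits public-resource computing because the keyspace splits trivially into independent work units, each of which a volunteer client can test in isolation. A toy sketch of that partitioning (not the project's actual BOINC application; the key check is a stand-in for an RC5 trial decryption, and the 8-bit keyspace is far smaller than the real 72-bit one):

```python
# Hypothetical work-unit partitioning for a distributed brute-force search.
def work_units(key_bits, unit_bits):
    """Yield (start, end) key ranges, each covering 2**unit_bits candidate keys."""
    total = 1 << key_bits
    step = 1 << unit_bits
    for start in range(0, total, step):
        yield start, start + step

def search_unit(start, end, try_key):
    """Test every candidate key in [start, end); try_key returns True on success."""
    for k in range(start, end):
        if try_key(k):
            return k
    return None

# Toy stand-in for the real check, which would decrypt the challenge
# ciphertext with candidate key k and compare against the known plaintext:
secret = 0x2A
found = None
for lo, hi in work_units(key_bits=8, unit_bits=4):
    found = search_unit(lo, hi, lambda k: k == secret)
    if found is not None:
        break
print(found)  # → 42
```

In a BOINC deployment, the server-side work generator would emit these ranges as work units and a validator would check the returned results.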

Abstract:

This project carries out a study to determine whether a benchmark application can be simulated with a reduced simulation time. To reduce the simulation time, a set of significant fragments of the application's execution is selected. The goal is to obtain a simulation result as close as possible to that of the full simulation, but in less time. The method we use is called incremental and consists of dividing the simulation into intervals of one million instructions. Once divided, we simulate in steps: at each step intervals are added, and the simulation stops when the difference between the result of the current step and that of the previous one falls below a value chosen at the outset. An improvement is then proposed, implemented, and its results are shown. The improvement consists of simulating a small interval immediately before each significant interval to improve the result.
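
The incremental method described above can be sketched as follows, using a hypothetical per-interval metric (IPC) and a hypothetical tolerance value:

```python
# Sketch of the incremental simulation method: add one-million-instruction
# intervals step by step and stop once the running estimate stabilizes.
def incremental_simulation(interval_results, tolerance):
    """interval_results: per-interval metric (e.g. the IPC of each
    1M-instruction chunk). Returns (estimate, intervals_used)."""
    total, prev_estimate = 0.0, None
    for n, value in enumerate(interval_results, start=1):
        total += value
        estimate = total / n   # running mean over the intervals simulated so far
        if prev_estimate is not None and abs(estimate - prev_estimate) < tolerance:
            return estimate, n
        prev_estimate = estimate
    return prev_estimate, len(interval_results)

# Hypothetical per-interval IPC values from a benchmark run:
ipc = [1.40, 1.10, 1.25, 1.22, 1.23, 1.21, 1.24]
estimate, used = incremental_simulation(ipc, tolerance=0.01)
print(estimate, used)  # stops after 3 of 7 intervals here
```

The trade-off is visible in the tolerance: a smaller value means more intervals simulated and a result closer to that of the full simulation.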

Abstract:

BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.

Abstract:

Research report produced during a stay at the Department of Computer and Information Science of the Norwegian University of Science and Technology (NTNU), Norway, between September and December 2006. The use of software components known as Commercial-Off-The-Shelf (COTS) in component-based systems development poses several challenges. One of them is the lack of adequate, available information to support the process of selecting the components to be integrated. To deal with these problems, an ongoing thesis proposes a method called GOThIC (Goal-Oriented Taxonomy and reuse Infrastructure Construction). The method is aimed at building a reuse infrastructure to ease the search for and reuse of COTS components. The stay at NTNU reported in this document had as its primary objective the improvement of the method and the gathering of empirical data to support it. Among the main results were empirical data supporting the use of the method in industrial COTS component selection settings, as well as a new strategy for achieving, in a feasible and incremental way, the federation and reuse of the various existing efforts to find, select and maintain COTS and Open Source (OSS) components - commonly called Off-The-Shelf (OTS) components - in a structured form.

Abstract:

INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD of all women, and one measuring BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost effective than "screen women at risk." For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratio of these two strategies compared with the reference was 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. 
Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on the cost-effectiveness ratios of the usual screening strategies for osteoporosis.
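
The decision-tree comparison described above reduces, at its core, to expected costs and effects weighted over branch probabilities, from which an incremental cost-effectiveness ratio follows. An illustrative sketch with hypothetical numbers (not the EPIDOS/SEMOF/OFELY-based estimates):

```python
# Hypothetical decision-tree arithmetic for a screening comparison.
def expected(strategy):
    """Expected cost and expected effect over the tree's terminal branches."""
    cost = sum(p * c for p, c, _ in strategy)
    effect = sum(p * e for p, _, e in strategy)
    return cost, effect

# Branches as (probability, cost in euros, fracture-free years), toy values:
no_screening = [(0.90, 0.0, 10.0), (0.10, 20000.0, 6.0)]
screen_all   = [(0.92, 600.0, 10.0), (0.08, 20600.0, 6.0)]

c0, e0 = expected(no_screening)
c1, e1 = expected(screen_all)
print((c1 - c0) / (e1 - e0))  # ≈ 2,500 euros per fracture-free year gained
```

The real model layers a Markov process over such a tree so that fracture risk and costs can evolve over the 10-year follow-up, but the incremental-ratio logic is the same.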

Abstract:

Although extended secondary prophylaxis with low-molecular-weight heparin was recently shown to be more effective than warfarin for cancer-related venous thromboembolism, its cost-effectiveness compared to traditional prophylaxis with warfarin is uncertain. We built a decision analytic model to evaluate the clinical and economic outcomes of a 6-month course of low-molecular-weight heparin or warfarin therapy in 65-year-old patients with cancer-related venous thromboembolism. We used probability estimates and utilities reported in the literature and published cost data. Using a US societal perspective, we compared strategies based on quality-adjusted life-years (QALYs) and lifetime costs. The incremental cost-effectiveness ratio of low-molecular-weight heparin compared with warfarin was 149,865 dollars/QALY. Low-molecular-weight heparin yielded a quality-adjusted life expectancy of 1.097 QALYs at the cost of 15,329 dollars. Overall, 46% (7,108 dollars) of the total costs associated with low-molecular-weight heparin were attributable to pharmacy costs. Although the low-molecular-weight heparin strategy achieved a higher incremental quality-adjusted life expectancy than the warfarin strategy (difference of 0.051 QALYs), this clinical benefit was offset by a substantial cost increment of 7,609 dollars. Cost-effectiveness results were sensitive to variation of the early mortality risks associated with low-molecular-weight heparin and warfarin and the pharmacy costs for low-molecular-weight heparin. Based on the best available evidence, secondary prophylaxis with low-molecular-weight heparin is more effective than warfarin for cancer-related venous thromboembolism. However, because of the substantial pharmacy costs of extended low-molecular-weight heparin prophylaxis in the US, this treatment is relatively expensive compared with warfarin.

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computational assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be quick and easy to use for routine activities, including by non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
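
The Bayesian calculation that such programs automate balances a population prior against the patient's measured concentration. A toy sketch of a maximum a posteriori (MAP) estimate of clearance under a deliberately simplified steady-state model (all parameter values are illustrative and not drawn from any of the reviewed programs):

```python
import math

# Hedged sketch of the MAP step behind TDM dose individualization, using a
# toy steady-state infusion model C = R / CL (R = infusion rate, CL = clearance).
def map_clearance(c_obs, rate, cl_pop, omega, sigma):
    """Grid-search MAP estimate of clearance: trade off the population prior
    (log-normal with spread omega) against the measured concentration
    (residual error sigma)."""
    best_cl, best_obj = None, float("inf")
    for i in range(1, 2001):
        cl = cl_pop * i / 500.0            # grid up to 4x the typical value
        c_pred = rate / cl                 # model-predicted steady-state level
        obj = ((math.log(cl) - math.log(cl_pop)) / omega) ** 2 \
            + ((c_obs - c_pred) / sigma) ** 2
        if obj < best_obj:
            best_cl, best_obj = cl, obj
    return best_cl

# Hypothetical patient: measured 12 mg/L on 60 mg/h, population CL 4 L/h.
cl_hat = map_clearance(c_obs=12.0, rate=60.0, cl_pop=4.0, omega=0.3, sigma=1.5)
new_rate = 10.0 * cl_hat   # infusion rate that would target Css = 10 mg/L
print(round(cl_hat, 2), round(new_rate, 1))
```

The estimate lands between the population value (4 L/h) and the naive single-point value (60/12 = 5 L/h), which is exactly the shrinkage behavior Bayesian TDM tools exploit; production programs use multi-compartment models and proper optimizers rather than a grid.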