994 results for Developing Software
Abstract:
In 2008 the International Society of Physical and Rehabilitation Medicine (ISPRM) started an initiative to systematically develop its capacity and its internal and external policy agenda. This paper sums up the achievements of this ISPRM initiative as well as pending issues and strategies to address them. The paper covers the following: ISPRM's policy agenda in collaboration with the World Health Organization (WHO), research capacity in functioning and rehabilitation, ISPRM world conferences, relationships with regional societies of Physical and Rehabilitation Medicine (PRM), and ISPRM's membership and governance structure.
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of a Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for following vital signs and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can cause dangerous desensitization of users. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can support decision making as well as indicate possible diagnoses and treatments. Methods: A system was designed that allowed flexibility in the definition of alerts, combining individual alarms of the monitored parameters to generate a more elaborate alert system. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of monitored parameters, verifying the occurrence of smart alerts defined in the user interface. Results: With this system a potential 92% reduction in alarms was observed. We observed that, in most situations that did not generate alerts, the individual alarms did not represent risk to the patient. Conclusions: Implementation of such software can allow integration of the monitored data to generate information, such as possible diagnoses or interventions. A substantial potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than analysis of isolated parameters.
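The alarm-combination logic described in this abstract can be sketched as a small rule engine: each smart alert fires only when all of its constituent individual alarms are active. The parameter names, thresholds, and the example rule below are illustrative assumptions, not the system actually built by the authors.

```python
# Minimal sketch of combining individual monitor alarms into a "smart alert".
# Parameter names and thresholds are hypothetical, for illustration only.

def smart_alert(params, rules):
    """Return the names of combined alerts whose every condition holds."""
    fired = []
    for name, conditions in rules.items():
        if all(cond(params) for cond in conditions):
            fired.append(name)
    return fired

# Hypothetical combined rule: hypotension together with tachycardia may
# suggest hypovolemia, whereas either individual alarm alone might not
# warrant an alert -- this is how combining alarms can reduce their number.
rules = {
    "possible_hypovolemia": [
        lambda p: p["systolic_bp"] < 90,   # individual hypotension alarm
        lambda p: p["heart_rate"] > 120,   # individual tachycardia alarm
    ],
}

print(smart_alert({"systolic_bp": 85, "heart_rate": 130}, rules))
# -> ['possible_hypovolemia']; with only one alarm active, the list is empty.
```

Because an alert fires only when the whole combination holds, many isolated alarms that carry no real risk never reach the user, which is consistent with the large reduction in alarms the study reports.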
Abstract:
Background. Chagas disease is caused by the protozoan parasite Trypanosoma cruzi. Among T. cruzi-infected individuals, only a subgroup develops severe chronic Chagas cardiomyopathy (CCC); the majority remain asymptomatic. T. cruzi displays numerous ligands for the Toll-like receptors (TLRs), an important component of innate immunity that leads to the transcription of proinflammatory cytokines by nuclear factor-kappa B. Because proinflammatory cytokines play an important role in CCC, we hypothesized that single-nucleotide polymorphisms (SNPs) in the genes that encode proteins in the TLR pathway could explain differential susceptibility to CCC among T. cruzi-infected individuals. Methods. For 169 patients with CCC and 76 T. cruzi-infected, asymptomatic individuals, we analyzed SNPs by polymerase chain reaction-restriction fragment length polymorphism analysis for the genes TLR1, TLR2, TLR4, TLR5, TLR9, and MAL/TIRAP, which encodes an adaptor protein. Results. Heterozygous carriers of the MAL/TIRAP variant S180L were more prevalent in the asymptomatic group (24 [32%] of 76 subjects) than in the CCC group (21 [12%] of 169) (χ² = 12.6; P = .0004 [adjusted P (Pc) = .0084]; odds ratio [OR], 0.31 [95% confidence interval {CI}, 0.16-0.60]). Subgroup analysis showed a stronger association when asymptomatic patients were compared with patients who had severe CCC (i.e., left-ventricular ejection fraction <= 40%) (χ² = 11.3; P = .0008 [Pc = .017]; OR, 0.22 [95% CI, 0.09-0.56]) than when they were compared with patients who had mild CCC (i.e., left-ventricular ejection fraction > 40%) (χ² = 7.7; P = .005 [Pc = .11]; OR, 0.33 [95% CI, 0.15-0.73]). Conclusion. T. cruzi-infected individuals who are heterozygous for the MAL/TIRAP S180L variant, which decreases signal transduction upon ligation of TLR2 or TLR4 to their respective ligands, may have a lower risk of developing CCC.
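The headline odds ratio can be checked directly from the carrier counts the abstract reports (24 of 76 asymptomatic subjects vs 21 of 169 CCC patients). A quick sketch, with variable names of our own choosing:

```python
# Reproduce the odds ratio for CCC among S180L heterozygotes from the
# 2x2 table implied by the abstract's counts.
ccc_carrier, ccc_noncarrier = 21, 169 - 21       # 21 carriers vs 148 non-carriers
asym_carrier, asym_noncarrier = 24, 76 - 24      # 24 carriers vs 52 non-carriers

# OR = (a * d) / (b * c) for the carrier-by-outcome 2x2 table
odds_ratio = (ccc_carrier * asym_noncarrier) / (ccc_noncarrier * asym_carrier)
print(round(odds_ratio, 2))  # -> 0.31, matching the reported OR of 0.31
```

An OR below 1 here means carrying the variant is associated with lower odds of CCC, consistent with the protective interpretation in the conclusion.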
Abstract:
Background The development of products and services for health care systems is one of the most important phenomena to have occurred in the field of health care over the last 50 years. It generates significant commercial, medical and social results. Although much has been done to understand how health technologies are adopted and regulated in developed countries, little attention has been paid to the situation in low- and middle-income countries (LMICs). Here we examine the institutional environment in which decisions are made regarding the adoption of expensive medical devices into the Brazilian health care system. Methods We used a case study strategy to address our research question. The empirical work relied on in-depth interviews (N = 16) with representatives of a wide range of actors and stakeholders that participate in the process of diffusion of CT (computerized tomography) scanners in Brazil, including manufacturers, health care organizations, medical specialty societies, health insurance companies, regulatory agencies and the Ministry of Health. Results The adoption of CT scanners is not determined by health policy makers or third-party payers of the public and private sectors. Instead, decisions are primarily made by administrators of individual hospitals and clinics, strongly influenced by both physicians and sales representatives of the medical industry, who act as change agents. Because this process is not properly regulated by public authorities, health care organizations are free to decide whether, when and how they will adopt a particular technology. Conclusions Our study identifies problems in how health care systems in LMICs adopt new, expensive medical technologies, and suggests that a set of innovative approaches and policy instruments is needed in order to balance the institutional and professional desire to practise modern and expensive medicine in a context of health inequalities and basic health needs.
Abstract:
Objectives: To analyze mortality rates of children with severe sepsis and septic shock in relation to time-sensitive fluid resuscitation and treatments received, and to define barriers to the implementation of the American College of Critical Care Medicine/Pediatric Advanced Life Support guidelines in a pediatric intensive care unit in a developing country. Methods: Retrospective chart review and prospective analysis of septic shock treatment in a pediatric intensive care unit of a tertiary care teaching hospital. Ninety patients with severe sepsis or septic shock admitted between July 2002 and June 2003 were included in this study. Results: Of the 90 patients, 83% had septic shock and 17% had severe sepsis; 80 patients had preexisting severe chronic diseases. Patients with septic shock who received less than a 20-mL/kg dose of resuscitation fluid in the first hour of treatment had a mortality rate of 73%, whereas patients who received more than a 40-mL/kg dose in the first hour of treatment had a mortality rate of 33% (P < 0.05). Patients treated less than 30 minutes after diagnosis of severe sepsis and septic shock had a significantly lower mortality rate (40%) than patients treated more than 60 minutes after diagnosis (P < 0.05). Controlling for the risk of mortality, early fluid resuscitation was associated with a 3-fold reduction in the odds of death (odds ratio, 0.33; 95% confidence interval, 0.13-0.85). The most important barriers to adequate severe sepsis and septic shock treatment were lack of adequate vascular access, lack of recognition of early shock, shortage of health care providers, and nonuse of goals and treatment protocols. Conclusions: The mortality rate was higher for children older than years, for those who received less than 40 mL/kg in the first hour, and for those whose treatment was not initiated in the first 30 minutes after the diagnosis of septic shock.
The acknowledgment of existing barriers to a timely fluid administration and the establishment of objectives to overcome these barriers may lead to a more successful implementation of the American College of Critical Care Medicine guidelines and reduced mortality rates for children with septic shock in the developing world.
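As a worked illustration of the odds scale used in this abstract, the crude (unadjusted) odds ratio implied by the two group mortality rates can be computed as below. Note that the study's reported OR of 0.33 is adjusted for mortality risk, so it need not equal this crude value; the sketch only shows how mortality proportions convert to odds.

```python
# Convert group mortality rates to odds and form a crude odds ratio.
# The abstract's OR of 0.33 is risk-adjusted; this crude value differs.
def odds(p):
    """Odds corresponding to a probability p."""
    return p / (1 - p)

low_fluid_mortality = 0.73   # < 20 mL/kg in the first hour of treatment
high_fluid_mortality = 0.33  # > 40 mL/kg in the first hour of treatment

crude_or = odds(high_fluid_mortality) / odds(low_fluid_mortality)
print(round(crude_or, 2))  # -> 0.18 (crude, i.e. before risk adjustment)
```

On this scale, the reported adjusted OR of 0.33 corresponds to the "3-fold reduction in the odds of death" stated in the abstract, since 1/0.33 ≈ 3.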
Abstract:
Little is known about the effect of clinical characteristics, parental psychopathology, family functioning, and environmental stressors on the response to methylphenidate in children with attention-deficit/hyperactivity disorder (ADHD) followed up in a naturalistic setting. Data from cultures outside the United States are extremely scarce. This is a longitudinal study using a nonrandom-assignment, quasi-experimental design. One hundred twenty-five children with ADHD were treated with methylphenidate according to standard clinical procedures and followed up for 6 months. The severity of ADHD symptoms was assessed by the Swanson, Nolan, and Pelham rating scale. In the final multivariate model, ADHD combined subtype (P < 0.001) and comorbidity with oppositional defiant disorder (P = 0.03) were both predictors of a worse clinical response. In addition, the levels of maternal ADHD symptoms were also associated with worse prognosis (P < 0.001). Of the several adverse psychosocial factors assessed, only undesired pregnancy was associated with poorer response to methylphenidate in the final comprehensive model (P = 0.02). Our study provides evidence for the involvement of clinical characteristics, maternal psychopathology, and environmental stressors in the response to methylphenidate. Clinicians may consider adjuvant strategies when negative predictors are present to increase the chances of success with methylphenidate treatment.
Abstract:
Associations between social anxiety disorder (SAD) and numerous outcomes (age of onset, persistence, severity, comorbidity, treatment) were examined. Additional analyses examined associations with the number of performance fears versus the number of interactional fears. Results: Lifetime social fears are quite common in both developed (15.9%) and developing (14.3%) countries, but lifetime SAD is much more common in the former (6.1%) than the latter (2.1%). Among those with SAD, persistence, severity, comorbidity, and treatment have dose-response relationships with the number of social fears, with no clear nonlinearity in these relationships that would support a distinction between generalized and non-generalized SAD. The distinction between performance fears and interactional fears is generally not important in predicting these same outcomes. Conclusion: No evidence is found to support subtyping SAD on the basis of either the number of social fears or the number of performance fears versus interactional fears. Depression and Anxiety 27:390-403, 2010. © 2009 Wiley-Liss, Inc.
Abstract:
Conventional karyotyping detects anomalies in 3-15% of patients with multiple congenital anomalies and mental retardation (MCA/MR). Whole-genome array screening (WGAS) has been consistently suggested as the first-choice diagnostic test for this group of patients, but it is very costly for large-scale use in developing countries. We evaluated the use of a combination of Multiplex Ligation-dependent Probe Amplification (MLPA) kits to increase the detection rate of chromosomal abnormalities in MCA/MR patients. We screened 261 MCA/MR patients with two subtelomeric kits and one microdeletion kit. This would theoretically detect up to 70% of all submicroscopic abnormalities. Additionally, we computed the de Vries score for 209 patients in an effort to find a suitable cut-off for MLPA screening. Our results reveal that chromosomal abnormalities were present in 87 (33.3%) patients, but only 57 (21.8%) were considered causative. Karyotyping detected 15 abnormalities (6.9%), while MLPA identified 54 (20.7%). Our combined MLPA screening more than tripled the number of pathogenic imbalances detected compared with conventional karyotyping. We also show that using the de Vries score as a cut-off for this screening would only be suitable under financial restrictions. A decision-analytic model was constructed with three possible strategies: karyotype, karyotype + MLPA, and karyotype + WGAS. The karyotype + MLPA strategy detected anomalies in 19.8% of cases, which accounts for 76.45% of the expected yield of karyotype + WGAS. The incremental cost-effectiveness ratio (ICER) of MLPA is three times lower than that of WGAS, which means that, for the same cost, we obtain three additional diagnoses with MLPA but only one with WGAS. We list all causative alterations found, including rare findings such as reciprocal duplications of the regions deleted in Sotos and Williams-Beuren syndromes. 
We also describe imbalances that were considered polymorphisms or rare variants, such as the new SNP that confounded the analysis of the 22q13.3 deletion syndrome. (C) 2011 Elsevier Masson SAS. All rights reserved.
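The strategy comparison in the abstract above rests on the incremental cost-effectiveness ratio, ICER = (cost_new - cost_base) / (yield_new - yield_base). A minimal sketch follows: the detection yields come from the abstract (except the karyotype + WGAS yield, which is back-calculated from the stated 76.45%), while the costs are placeholders invented for illustration, not the study's figures.

```python
# ICER sketch for the three diagnostic strategies in the abstract.
# Yields are taken from the abstract; costs are PLACEHOLDERS, not study data.

def icer(cost_base, yield_base, cost_new, yield_new):
    """Extra cost per additional diagnosis when moving from base to new strategy."""
    return (cost_new - cost_base) / (yield_new - yield_base)

yield_karyotype = 0.069   # 6.9% detection with karyotype alone
yield_mlpa = 0.198        # 19.8% with karyotype + MLPA (causative findings)
yield_wgas = 0.259        # back-calculated: 0.198 is 76.45% of this yield

# Hypothetical per-patient costs, chosen only to illustrate the comparison.
cost_karyotype, cost_mlpa, cost_wgas = 100.0, 300.0, 1200.0

icer_mlpa = icer(cost_karyotype, yield_karyotype, cost_mlpa, yield_mlpa)
icer_wgas = icer(cost_karyotype, yield_karyotype, cost_wgas, yield_wgas)
print(icer_mlpa, icer_wgas)  # cost per additional diagnosis for each upgrade
```

With the study's actual costs, the abstract reports that MLPA's ICER is three times lower than that of WGAS, i.e., the cost of one extra WGAS diagnosis buys three extra MLPA diagnoses.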