883 results for Pre-clinical tests
Abstract:
Image-guided, computer-assisted neurosurgery has emerged to improve localization and targeting, to provide a better anatomic definition of the surgical field, and to decrease invasiveness. Usually, in image-guided surgery, a computer displays the surgical field in a CT/MR environment, using axial, coronal or sagittal views, or even a 3D representation of the patient. Such a system forces the surgeon to look away from the surgical scene to the computer screen. Moreover, this kind of information, being pre-operative imaging, cannot be updated during the operation, so it remains valid for guidance only in the first stage of the surgical procedure, and mainly for rigid structures such as bones. To address these two constraints, we are developing an ultrasound-guided surgical microscope. Such a system takes advantage of the fact that surgical microscopes and ultrasound systems are already used in neurosurgery, so it does not add complexity to the surgical procedure. We have integrated an optical tracking device into the microscope, together with an augmented-reality overlay system, so that the surgeon no longer needs to look away from the scene and is provided with correctly aligned surgical images with sub-millimeter accuracy. In addition to the standard CT and 3D views, we are able to track an ultrasound probe and, using a prior calibration and registration of the imaging, the acquired ultrasound image is correctly projected onto the overlay system, so the surgeon can always localize the target and verify the effects of the intervention. Several tests of the system have already been performed to evaluate its accuracy, and clinical experiments are currently in progress to validate the clinical usefulness of the system.
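To make the coordinate chain behind such a tracked-ultrasound overlay concrete, the sketch below is a minimal Python illustration (not the authors' implementation; all transform values are hypothetical placeholders) of mapping a point from ultrasound-image coordinates through the probe calibration, the tracker-reported probe pose, and the microscope registration into overlay space.

import numpy as np

def hom(R=np.eye(3), t=np.zeros(3)):
    """Build a 4x4 homogeneous transform from a rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical transforms (placeholders), normally obtained from ultrasound
# calibration, optical tracking, and patient/microscope registration:
T_image_to_probe = hom(t=np.array([0.0, 0.01, 0.0]))           # US calibration: image -> probe
T_probe_to_tracker = hom(t=np.array([0.10, 0.20, 0.30]))       # tracked pose of the probe
T_tracker_to_microscope = hom(t=np.array([-0.05, 0.0, 0.40]))  # registration to the microscope

def ultrasound_point_to_overlay(p_image):
    """Map a 3D point in ultrasound-image coordinates (metres) into microscope/overlay space."""
    T = T_tracker_to_microscope @ T_probe_to_tracker @ T_image_to_probe
    return (T @ np.append(p_image, 1.0))[:3]

print(ultrasound_point_to_overlay(np.array([0.02, 0.03, 0.0])))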
Abstract:
Pre-eclampsia, a pregnancy-specific disorder, contributes substantially to the perinatal morbidity and mortality of both mother and newborn. An increasing number of biochemical agents have been evaluated as markers for predicting pre-eclampsia, but none has yet been proven to be of clinical value. Much effort has been put into assessing novel potential markers and their combination with other screening methods such as Doppler sonography. The purpose of this review is to summarize current knowledge of serum markers for predicting pre-eclampsia. So far, the most promising serum markers are placental protein 13 (PP-13), as well as soluble fms-like tyrosine kinase-1 (sFlt-1), placental growth factor (PlGF) and soluble endoglin (sEng). These markers allow screening at a relatively early stage and, most importantly, show relatively high predictive values and improved diagnostic performance when combined with first-trimester Doppler sonography. Large-scale prospective studies assessing these markers are important to justify their clinical use with a view to early intervention to prevent pre-eclampsia in the future.
Abstract:
OBJECTIVE: To consider the reasons and context for test ordering by doctors when faced with an undiagnosed complaint in primary or secondary care. STUDY DESIGN AND SETTING: We reviewed any study of any design that discussed factors that may affect a doctor's decision to order a test. Articles were located through searches of electronic databases, authors' files on diagnostic methodology, and reference lists of relevant studies. We extracted data on study design, type of analysis, setting, topic area, and any factors reported to influence test ordering. RESULTS: We included 37 studies. We carried out a thematic analysis to synthesize the data. Five key groupings arose from this process: diagnostic factors, therapeutic and prognostic factors, patient-related factors, doctor-related factors, and policy and organization-related factors. To illustrate how the various factors identified may influence test ordering, we considered the symptom of low back pain and the diagnosis of multiple sclerosis as examples. CONCLUSIONS: A wide variety of factors influence a doctor's decision to order a test. These are integral to understanding diagnosis in clinical practice. Traditional diagnostic accuracy studies should be supplemented with research into the broader context in which doctors perform their work.
Abstract:
BACKGROUND: Epidemiological data for south Asian children in the United Kingdom are contradictory, showing a lower prevalence of wheeze, but a higher rate of medical consultations and admissions for asthma compared with white children. These studies have not distinguished different asthma phenotypes or controlled for varying environmental exposures. OBJECTIVE: To compare the prevalence of wheeze and related health-service use in south Asian and white pre-school children in the United Kingdom, taking into account wheeze phenotype (viral and multiple wheeze) and environmental exposures. METHODS: A postal questionnaire was completed by parents of a population-based sample of 4366 white and 1714 south Asian children aged 1-4 years in Leicestershire, UK. Children were classified as having viral wheeze or multiple trigger wheeze. RESULTS: The prevalence of current wheeze was 35.6% in white and 25.5% in south Asian 1-year-olds (P<0.001), and 21.9% and 20.9%, respectively, in children aged 2-4 years. Odds ratios (ORs) (95% confidence interval) for multiple wheeze and for viral wheeze, comparing south Asian with white children, were 2.21 (1.19-4.09) and 1.43 (0.77-2.65) in 2-4-year-olds after controlling for socio-economic conditions, environmental exposures and family history. In 1-year-olds, the respective ORs for multiple and viral wheeze were 0.66 (0.47-0.92) and 0.81 (0.64-1.03). Reported GP consultation rates for wheeze and hospital admissions were greater in south Asian children aged 2-4 years, even after adjustment for severity, but the use of inhaled corticosteroids was lower. CONCLUSIONS: South Asian 2-4-year-olds are more likely than white children to have multiple wheeze (a condition with many features of chronic atopic asthma), after taking into account ethnic differences in exposure to some environmental agents. Undertreatment with inhaled corticosteroids might partly explain their greater use of health services.
Abstract:
OBJECTIVES: We sought to determine both the procedural performance and safety of percutaneous implantation of the second (21-French [F])- and third (18-F)-generation CoreValve aortic valve prosthesis (CoreValve Inc., Irvine, California). BACKGROUND: Percutaneous aortic valve replacement represents an emerging alternative therapy for high-risk and inoperable patients with severe symptomatic aortic valve stenosis. METHODS: Patients with: 1) symptomatic, severe aortic valve stenosis (area <1 cm²); 2) age ≥80 years with a logistic EuroSCORE ≥20% (21-F group) or age ≥75 years with a logistic EuroSCORE ≥15% (18-F group); or 3) age ≥65 years plus additional prespecified risk factors were included. Introduction of the 18-F device enabled the transition from a multidisciplinary approach involving general anesthesia, surgical cut-down, and cardiopulmonary bypass to a truly percutaneous approach under local anesthesia without hemodynamic support. RESULTS: A total of 86 patients (21-F, n = 50; 18-F, n = 36) with a mean valve area of 0.66 ± 0.19 cm² (21-F) and 0.54 ± 0.15 cm² (18-F), a mean age of 81.3 ± 5.2 years (21-F) and 83.4 ± 6.7 years (18-F), and a mean logistic EuroSCORE of 23.4 ± 13.5% (21-F) and 19.1 ± 11.1% (18-F) were recruited. Acute device success was 88%. Successful device implantation resulted in a marked reduction of aortic transvalvular gradients (mean pre 43.7 mm Hg vs. post 9.0 mm Hg, p < 0.001) with aortic regurgitation grade remaining unchanged. Acute procedural success rate was 74% (21-F: 78%; 18-F: 69%). Procedural mortality was 6%. Overall 30-day mortality rate was 12%; the combined rate of death, stroke, and myocardial infarction was 22%. CONCLUSIONS: Treatment of severe aortic valve stenosis in high-risk patients with percutaneous implantation of the CoreValve prosthesis is feasible and associated with a lower mortality rate than predicted by risk algorithms.
Abstract:
AIMS: Recent studies of drug-eluting stents for unprotected left main coronary artery (LMCA) disease have been encouraging. We examined the performance of sirolimus-eluting stents (SES) for this indication. METHODS AND RESULTS: This retrospective study included 228 consecutive patients (mean age 68 ± 11 years, 80.6% men, 26.3% diabetics) who underwent implantation of SES for de novo LMCA stenoses. The mean additive and logistic EuroSCOREs were 5.2 ± 3.9 and 8.2 ± 13.2, respectively. The main objective of this study was to measure the rate of major adverse cardiac events (MACE), including death, myocardial infarction and target lesion revascularisation (TLR), at 12 months. Other objectives were to measure the rates of in-hospital MACE and 12-month TLR. Outcomes were compared in 143 patients with (BIF+ group) versus 84 patients without (BIF- group) involvement of the bifurcation. The pre-procedural percent diameter stenosis (%DS) was 60.1 ± 11.2% in the BIF+ versus 54.7 ± 12.2% in the BIF- group (p=0.008), and decreased to 18.0 ± 9.7% and 13.9 ± 11.3%, respectively (ns), after SES implantation. The overall in-hospital MACE rate was 3.5% and was similar in both subgroups. The 1-year MACE rate was 14.5% overall, 16.8% in the BIF+ and 10.7% in the BIF- subgroup (ns). CONCLUSIONS: SES implants in high-risk patients with LMCA stenoses were associated with a low 1-year MACE rate. Stenting of the bifurcation was associated with a significant increase in neither mortality nor the 1-year MACE rate.
Clinical presentation of celiac disease and the diagnostic accuracy of serologic markers in children
Abstract:
There has been growing recognition of a changing clinical presentation of celiac disease (CD), with the manifestation of milder symptoms. Serologic testing is widely used to screen patients with suspected CD and populations at risk. The aim of this retrospective analysis was to evaluate the clinical presentation of CD in childhood, assess the diagnostic value of serologic tests, and investigate the impact of IgA deficiency on diagnostic accuracy. We evaluated 206 consecutive children with suspected CD on the basis of clinical symptoms and positive serology results. Ninety-four (46%) had biopsy-proven CD. The median age at diagnosis of CD was 6.8 years; 15% of the children were <2 years of age. There was a higher incidence of CD in girls (p = 0.003). Iron deficiency and intestinal complaints were more frequent in children with CD than those without CD (61% vs. 33%, p = 0.0001 and 71% vs. 55%, p = 0.02, respectively), while failure to thrive was less common (35% vs. 53%, p = 0.02). The sensitivity of IgA tissue transglutaminase (IgA-tTG) was 0.98 when including all children and 1.00 after excluding children with selective IgA deficiency. The specificity of IgA-tTG was 0.73 using the recommended cut-off value of 20 IU, and this improved to 0.94 when using a higher cut-off value of 100 IU. All children with CD and relative IgA deficiency (IgA levels that are measurable but below the age reference [n = 8]) had elevated IgA-tTG. In conclusion, CD is frequently diagnosed in school-age children with relatively mild symptoms. The absence of intestinal symptoms does not preclude the diagnosis of CD; many children with CD do not report intestinal symptoms. While the sensitivity of IgA-tTG is excellent, its specificity is insufficient for the diagnostic confirmation of a disease requiring life-long dietary restrictions. Children with negative IgA-tTG and decreased but measurable IgA values are unlikely to have CD.
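To illustrate the cut-off dependence reported above, the following Python sketch (using hypothetical IgA-tTG titres and biopsy outcomes, not the study's data) shows how raising the positivity cut-off trades sensitivity for specificity.

def sens_spec_at_cutoff(titres, has_disease, cutoff):
    """Classify titre >= cutoff as positive; return (sensitivity, specificity)."""
    tp = sum(t >= cutoff and d for t, d in zip(titres, has_disease))
    fn = sum(t < cutoff and d for t, d in zip(titres, has_disease))
    tn = sum(t < cutoff and not d for t, d in zip(titres, has_disease))
    fp = sum(t >= cutoff and not d for t, d in zip(titres, has_disease))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical IgA-tTG titres (IU) and biopsy outcomes (True = biopsy-proven CD).
titres = [5, 12, 25, 40, 60, 90, 120, 150, 200, 300]
has_disease = [False, False, False, False, True, True, True, True, True, True]

for cutoff in (20, 100):
    sens, spec = sens_spec_at_cutoff(titres, has_disease, cutoff)
    print(f"cut-off {cutoff} IU: sensitivity={sens:.2f}, specificity={spec:.2f}")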
Abstract:
OBJECTIVES: To examine the ambiguity tolerance, i.e. the ability to perceive new, contradictory and complex situations as positive challenges, of pre-lingually deafened adolescents who received a cochlear implant after their eighth birthday, and to identify those dimensions of ambiguity tolerance which correlate significantly with specific variables of their oral communication. DESIGN AND SETTING: Clinical survey at an academic tertiary referral center. PARTICIPANTS AND MAIN OUTCOME MEASURES: A questionnaire concerning communication and subjectively perceived changes compared with the pre-cochlear-implant situation was completed by 13 pre-lingually deafened patients aged between 13 and 23 years, who had received their cochlear implants between the ages of 8 and 17 years. The results were correlated with the 'Inventory for Measuring Ambiguity Tolerance'. RESULTS: The patients showed lower ambiguity tolerance (total score 134.5) than the normative group (143.1). There was a positive correlation between the total score for ambiguity tolerance and the frequency of 'use of oral speech', as well as between the subscale 'ambiguity tolerance towards apparently insoluble problems' and all five areas of oral communication that were investigated. Comparison of the two variables of oral communication that showed a significant difference pre- and postoperatively yielded a positive correlation with the subscale 'ambiguity tolerance towards the parental image'. CONCLUSIONS: Pre-lingually deafened juveniles with cochlear implants who increasingly use oral communication seem to regard the limits of a cochlear implant as an interesting challenge rather than an insoluble problem.
Abstract:
Through alternative splicing, multiple different transcripts can be generated from a single gene. Alternative splicing represents an important molecular mechanism of gene regulation in physiological processes such as developmental programming as well as in disease. In cancer, splicing is significantly altered. Tumors express a different collection of alternative spliceoforms than normal tissues. Many tumor-associated splice variants arise from genes with an established role in carcinogenesis or tumor progression, and their functions can be oncogenic. This raises the possibility that products of alternative splicing play a pathogenic role in cancer. Moreover, cancer-associated spliceoforms represent potential diagnostic biomarkers and therapeutic targets. G protein-coupled peptide hormone receptors provide a good illustration of alternative splicing in cancer. The wild-type forms of these receptors have long been known to be expressed in cancer and to modulate tumor cell functions. They are also recognized as attractive clinical targets. Recently, splice variants of these receptors have been increasingly identified in various types of cancer. In particular, alternative cholecystokinin type 2, secretin, and growth hormone-releasing hormone receptor spliceoforms are expressed in tumors. Peptide hormone receptor splice variants can fundamentally differ from their wild-type receptor counterparts in pharmacological and functional characteristics, in their distribution in normal and malignant tissues, and in their potential use for clinical applications.
Abstract:
Rapid diagnostic tests (RDT) are sometimes recommended to improve the home-based management of malaria. The accuracy of an RDT for the detection of clinical malaria and the presence of malarial parasites has recently been evaluated in a high-transmission area of southern Mali. During the same study, the cost-effectiveness of a 'test-and-treat' strategy for the home-based management of malaria (based on an artemisinin-combination therapy, ACT) was compared with that of a 'treat-all' strategy. Overall, 301 patients, of all ages, each of whom had been considered a presumptive case of uncomplicated malaria by a village healthworker, were checked with a commercial RDT (Paracheck-Pf). The sensitivity, specificity, and positive and negative predictive values of this test, compared with the results of microscopy and two different definitions of clinical malaria, were then determined. The RDT was found to be 82.9% sensitive (with a 95% confidence interval of 78.0%-87.1%) and 78.9% (63.9%-89.7%) specific compared with the detection of parasites by microscopy. In the detection of clinical malaria, it was 95.2% (91.3%-97.6%) sensitive and 57.4% (48.2%-66.2%) specific compared with a general practitioner's diagnosis of the disease, and 100.0% (94.5%-100.0%) sensitive but only 30.2% (24.8%-36.2%) specific when compared against the fulfillment of the World Health Organization's (2003) research criteria for uncomplicated malaria. Among children aged 0-5 years, the cost of the 'test-and-treat' strategy, per episode, was about twice that of 'treat-all' (U.S.$1.0 vs. U.S.$0.5). In older subjects, however, the two strategies were equally costly (approximately U.S.$2/episode). In conclusion, for children aged 0-5 years in a high-transmission area of sub-Saharan Africa, use of the RDT was not cost-effective compared with presumptive treatment of malaria with an ACT. In older patients, use of the RDT did not reduce costs. The question remains whether either of the strategies investigated can be made affordable for the affected population.
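The comparison between the two strategies amounts to a simple expected-cost calculation per presumptive episode; the Python sketch below only illustrates that arithmetic, with hypothetical unit costs and RDT-positivity rate rather than the study's figures.

def cost_per_episode(strategy, cost_rdt, cost_act, p_rdt_positive):
    """Expected cost of managing one presumptive malaria episode."""
    if strategy == "treat_all":
        return cost_act                               # every episode receives the ACT
    if strategy == "test_and_treat":
        return cost_rdt + p_rdt_positive * cost_act   # ACT only if the RDT is positive
    raise ValueError(f"unknown strategy: {strategy}")

# Hypothetical inputs: RDT unit cost, ACT course cost (U.S.$), and the
# proportion of tested episodes expected to be RDT-positive.
for strategy in ("treat_all", "test_and_treat"):
    cost = cost_per_episode(strategy, cost_rdt=0.6, cost_act=0.5, p_rdt_positive=0.8)
    print(f"{strategy}: U.S.${cost:.2f} per episode")

In a high-transmission setting, where most tested episodes are RDT-positive, adding the test raises the cost per episode while averting few treatments, which is the pattern reported above for young children.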
Abstract:
The purpose of this study was to assess whether delayed gadolinium-enhanced MRI of cartilage using post-contrast T1 (T1Gd) alone is sufficient for evaluating cartilage damage in femoroacetabular impingement, without using non-contrast values (T10). T1Gd and ΔR1 (1/T1Gd - 1/T10), which incorporates the non-contrast T1 measurement, were studied in two grades of osteoarthritis and in a control group of asymptomatic young-adult volunteers. Differences between T1Gd and ΔR1 values for femoroacetabular impingement patients and volunteers were compared. There was a very high correlation between T1Gd and ΔR1 in all study groups: r was -0.95 in the cohort with Tönnis grade 0, -0.89 with Tönnis grade 1, and -0.88 in asymptomatic volunteers, statistically significant (P < 0.001) for all groups. For both T1Gd and ΔR1, a statistically significant difference was noted between patients and the control group. A significant difference was also noted for both T1Gd and ΔR1 between the patients with Tönnis grade 0 osteoarthritis and those with grade 1 changes. Our results demonstrate a linear correlation between T1Gd and ΔR1, suggesting that T1Gd assessment alone is sufficient for the clinical use of delayed gadolinium-enhanced MRI of cartilage in this setting and that the additional, time-consuming T10 evaluation may not be needed.
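For reference, the relaxation-rate difference quoted in this abstract can be written explicitly as

\Delta R_1 = \frac{1}{T_{1\mathrm{Gd}}} - \frac{1}{T_{10}}

where T1Gd is the post-contrast and T10 the non-contrast T1 relaxation time; computing ΔR1 therefore requires an additional pre-contrast T1 measurement, which is the step the results suggest may be unnecessary.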
Abstract:
The present study evaluated gingival recession 1 year following apical surgery of 70 maxillary anterior teeth (central and lateral incisors, canines, and first premolars). A visual assessment of the mid-facial aspect of the gingival level and of papillary heights of treated teeth was carried out using photographs taken at pre-treatment and 1-year follow-up appointments. In addition, changes in the gingival margin (GM) and clinical attachment levels (CAL) were calculated with the use of clinical measurements, that is, pre-treatment and 1-year follow-up pocket probing depth and level of gingival margin. Changes in GM and CAL were then correlated with patient-, tooth-, and surgery-related parameters. The following parameters were found to significantly influence changes in GM and CAL over time: gingival biotype (P < 0.05), with thin biotype exhibiting more gingival recession than thick biotype; pre-treatment pocket probing depth (PPD) (P < 0.03), with cases of pre-treatment PPD < 2.5 mm demonstrating more attachment loss than cases of PPD ≥ 2.5 mm; and type of incision (P < 0.01), with the submarginal incision showing considerably less gingival recession compared with the intrasulcular incision, papilla-base incision or papilla-saving incision. The visual assessment using pre-treatment and 1-year follow-up photographs did not demonstrate significant changes in gingival level or papillary height after apical surgery. In conclusion, gingival biotype, pre-treatment PPD, and type of incision may significantly influence changes in GM and CAL following apical surgery in maxillary anterior teeth.
Abstract:
OBJECTIVES: To evaluate the potential improvement of antimicrobial treatment by utilizing a new multiplex polymerase chain reaction (PCR) assay that identifies sepsis-relevant microorganisms in blood. DESIGN: Prospective, observational, international, multicenter trial. SETTING: University hospitals in Germany (n = 2), Spain (n = 1), and the United States (n = 1), and one Italian tertiary general hospital. PATIENTS: 436 sepsis patients with 467 episodes of antimicrobial treatment. METHODS: Whole blood for PCR and blood culture (BC) analysis was sampled independently for each episode. The potential impact of reporting microorganisms by PCR on the adequacy and timeliness of antimicrobial therapy was analyzed. The number of gainable days on early adequate antimicrobial treatment attributable to PCR findings was assessed. MEASUREMENTS AND MAIN RESULTS: Sepsis criteria, days on antimicrobial therapy, antimicrobial substances administered, and microorganisms identified by PCR and BC susceptibility tests. RESULTS: BC diagnosed 117 clinically relevant microorganisms; PCR identified 154. Ninety-nine episodes were BC positive (BC+); 131 episodes were PCR positive (PCR+). Overall, 127.8 days of clinically inadequate empirical antibiotic treatment were observed in the 99 BC+ episodes. Utilization of PCR-aided diagnostics corresponds to a potential reduction of 106.5 clinically inadequate treatment days. The ratio of gainable early adequate treatment days to the number of PCR tests performed was 22.8 days/100 tests overall (confidence interval 15-31) and 36.4 days/100 tests in the intensive care and surgical ward populations (confidence interval 22-51). CONCLUSIONS: Rapid PCR identification of microorganisms may contribute to a reduction of early inadequate antibiotic treatment in sepsis.
Abstract:
The diagnosis of a drug hypersensitivity reaction (DHR) is a challenging task because multiple and complex mechanisms are involved. A better understanding of the immunologic pathomechanisms of DHRs, together with rapid progress in cell-based in vitro tests, can help tailor the correct diagnostic strategy to individual patients with different clinical manifestations of drug allergy. Thus, the diagnosis of drug hypersensitivity needs to rely on a combination of the medical history and different in vivo and in vitro tests. In this article, the authors discuss current in vitro techniques, the most recent findings, and promising new tools in the diagnosis of T-cell-mediated drug hypersensitivity.