53 results for Simplified and advanced calculation methods
Comparison of bacterial plaque samples from titanium implant and tooth surfaces by different methods
Abstract:
Studies have shown similarities in the microflora between titanium implant and tooth sites when samples are taken by gingival crevicular fluid (GCF) sampling methods. The purpose of the present study was to use the checkerboard DNA-DNA hybridization method to compare the microflora from curette and GCF samples in patients who had at least one oral osseo-integrated implant and who were otherwise dentate. Plaque samples were taken from tooth/implant surfaces and from sulcular gingival surfaces with curettes, and from gingival fluid using filter papers. A total of 28 subjects (11 females) were enrolled in the study. The mean age of the subjects was 64.1 years (SD +/-4.7). On average, the implants studied had been in function for 3.7 years (SD +/-2.9). The proportions of Streptococcus oralis (P<0.02) and Fusobacterium periodonticum (P<0.02) were significantly higher at tooth sites (curette samples). The GCF samples yielded higher proportions for 28/40 species studied (P-values varying between 0.05 and 0.001). The proportions of Tannerella forsythia (T. forsythensis) and Treponema denticola were both higher in GCF samples (P<0.02 and P<0.05, respectively) than in curette samples (implant sites). The microbial composition of gingival fluid samples taken at implant sites differed partly from that of curette samples taken from implant surfaces or from sulcular soft tissues, providing higher counts for most bacteria studied at implant surfaces, with the exception of Porphyromonas gingivalis. A combination of GCF and curette sampling methods might be the most representative sampling method.
Methods and representativeness of a European survey in children and adolescents: the KIDSCREEN study
Abstract:
BACKGROUND: The objective of the present study was to compare three different sampling and questionnaire administration methods used in the international KIDSCREEN study in terms of participation, response rates, and external validity. METHODS: Children and adolescents aged 8-18 years were surveyed in 13 European countries using either telephone sampling and mail administration, random sampling of school listings followed by classroom or mail administration, or multistage random sampling of communities and households with self-administration of the survey materials at home. Cooperation, completion, and response rates were compared across countries and survey methods. Data on non-respondents were collected in 8 countries. The population fraction (PF, respondents in each sex-age or educational level category, divided by the population in the same category from Eurostat census data) and population fraction ratio (PFR, ratio of PFs) and their corresponding 95% confidence intervals were used to analyze differences by country between the KIDSCREEN samples and a reference Eurostat population. RESULTS: Response rates by country ranged from 18.9% to 91.2%. Response rates were highest in the school-based surveys (69.0%-91.2%). Sample proportions by age and gender were similar to the reference Eurostat population in most countries, although boys and adolescents were slightly underrepresented (PFR <1). Parents in lower educational categories were less likely to participate (PFR <1 in 5 countries). Parents in higher educational categories were overrepresented when the school and household sampling strategies were used (PFR = 1.78-2.97). CONCLUSION: School-based sampling achieved the highest overall response rates but also produced slightly more biased samples than the other methods. The results suggest that the samples were sufficiently representative to provide reference population values for the KIDSCREEN instrument.
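The population fraction and PFR arithmetic described in the KIDSCREEN abstract can be sketched directly. The counts below and the Katz log-scale confidence interval are illustrative assumptions, not the study's actual data or its exact CI method:

```python
import math

def population_fraction(respondents: int, census_pop: int) -> float:
    """PF: survey respondents in a category divided by the census population
    in that category (hypothetical counts)."""
    return respondents / census_pop

def pfr_with_ci(a, n1, b, n2, z=1.96):
    """Population fraction ratio (a/n1)/(b/n2) with a Katz-style log-scale
    95% CI. Illustrative only; the paper's exact CI method may differ."""
    pfr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    return pfr, pfr * math.exp(-z * se), pfr * math.exp(z * se)

# Example: boys slightly underrepresented (PFR < 1), with made-up counts.
pfr, lo, hi = pfr_with_ci(450, 1000, 520, 1000)
```

A PFR below 1 with an upper confidence limit below 1 would indicate statistically significant underrepresentation of that category.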
Abstract:
In the present study, dose measurements were conducted following examination of the maxilla and mandible with spiral computed tomography (CT). The measurements were carried out with 2 phantoms, a head and neck phantom and a full body phantom. The analysis of applied thermoluminescent dosimeters yielded radiation doses for organs and tissues in the head and neck region between 0.6 and 16.7 mGy when 40 axial slices and 120 kV/165 mAs were used as exposure parameters. The effective dose was calculated as 0.58 and 0.48 mSv in the maxilla and mandible, respectively. Tested methods for dose reduction showed a significant decrease in radiation dose of 40% to 65%. Based on these results, the mortality risk was estimated according to calculation models recommended by the Committee on the Biological Effects of Ionizing Radiations and by the International Commission on Radiological Protection. Both models resulted in similar values. The mortality risk ranges from 46.2 x 10(-6) for 20-year-old men to 11.2 x 10(-6) for 65-year-old women. Using 2 methods of dose reduction, the mortality risk decreased by approximately 50 to 60%, to 19.1 x 10(-6) for 20-year-old men and 5.5 x 10(-6) for 65-year-old women. It can be concluded that a CT scan of the maxillofacial complex causes a considerable radiation dose when compared with conventional radiographic examinations. Therefore, a careful indication for this imaging technique and dose reduction methods should be considered in daily practice.
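The reported risk reductions follow from simple percentage arithmetic; a sketch using the mortality-risk values quoted in the abstract:

```python
def percent_reduction(before: float, after: float) -> float:
    """Percentage decrease from `before` to `after`."""
    return 100.0 * (1.0 - after / before)

# Values taken from the abstract (per-person mortality risk estimates):
men_20 = percent_reduction(46.2e-6, 19.1e-6)   # 20-year-old men, ~59%
women_65 = percent_reduction(11.2e-6, 5.5e-6)  # 65-year-old women, ~51%
```

Both values fall in the "approximately 50 to 60%" range the abstract reports.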
Abstract:
OBJECTIVE: Postmortem examination of chest trauma is an important domain in forensic medicine, which today is performed using autopsy. Since the implementation of cross-sectional imaging methods such as computed tomography (CT) and magnetic resonance imaging (MRI) in forensic medicine, a number of advantages over autopsy have been described. Within the scope of validating cross-sectional radiology in forensic medicine, postmortem imaging findings were compared with autopsy findings in chest trauma. METHODS: This retrospective study includes 24 cases of chest trauma that underwent postmortem CT, MRI, and autopsy. Two board-certified radiologists, blinded to the autopsy findings, evaluated the radiologic data independently. Each radiologist interpreted the postmortem CT and MRI data together for every case. The results of the radiologic assessment were compared with the autopsy findings, and interobserver discrepancy was calculated. RESULTS: Using combined CT and MRI, between 75% and 100% of the investigated findings were discovered, except for hemomediastinum (70%), diaphragmatic ruptures (50%; n=2), and heart injury (38%). Although sensitivity and specificity regarding pneumomediastinum, pneumopericardium, and pericardial effusion could not be calculated, as these findings were not mentioned at autopsy, they were clearly seen radiologically. The averaged interobserver concordance was 90%. CONCLUSION: The sensitivity and specificity of our results demonstrate that postmortem CT and MRI are useful diagnostic methods for assessing chest trauma in forensic medicine as a supplement to autopsy. Further radiologic-pathologic case studies are necessary to define the role of postmortem CT and MRI as a single examination modality.
Abstract:
AIM: The importance of ventilatory support during cardiac arrest and basic life support is controversial. This experimental study used dynamic computed tomography (CT) to assess the effects of chest compressions only during cardiopulmonary resuscitation (CCO-CPR) on alveolar recruitment and haemodynamic parameters in a porcine model of ventricular fibrillation. MATERIALS AND METHODS: Twelve anaesthetized pigs (26+/-1 kg) were randomly assigned to one of the following groups: (1) intermittent positive pressure ventilation (IPPV) both during basic life support and advanced cardiac life support, or (2) CCO during basic life support and IPPV during advanced cardiac life support. Measurements were acquired at baseline prior to cardiac arrest, during basic life support, during advanced life support, and after return of spontaneous circulation (ROSC), as follows: dynamic CT series, arterial and central venous pressures, blood gases, and regional organ blood flow. The ventilated and atelectatic lung area was quantified from dynamic CT images. Differences between groups were analyzed using the Kruskal-Wallis test, and p<0.05 was considered statistically significant. RESULTS: IPPV was associated with cyclic alveolar recruitment and de-recruitment. Compared with controls, the CCO-CPR group had a significantly larger mean fractional area of atelectasis (p=0.009), and significantly lower PaO(2) (p=0.002) and mean arterial pressure (p=0.023). The increase in mean atelectatic lung area observed during basic life support in the CCO-CPR group remained clinically relevant throughout the subsequent advanced cardiac life support period and following ROSC, and was associated with prolonged impaired haemodynamics. No inter-group differences in myocardial and cerebral blood flow were observed. CONCLUSION: A lack of ventilation during basic life support is associated with excessive atelectasis, arterial hypoxaemia and compromised CPR haemodynamics. Moreover, these detrimental effects remain evident even after restoration of IPPV.
Abstract:
BACKGROUND: Sound epidemiologic data on halitosis are rare. We evaluated the prevalence of halitosis in a young male adult population in Switzerland using a standardized questionnaire and clinical examination. METHODS: Six hundred twenty-six Swiss Army recruits aged 18 to 25 years (mean: 20.3 years) were selected as study subjects. First, a standardized questionnaire focusing on dental hygiene, self-reported halitosis, smoking, and alcohol consumption was filled out by all participants. In the clinical examination, objective values for the presence of halitosis were gathered through an organoleptic assessment of the breath odor and the measurement of volatile sulfur compounds (VSCs). Additionally, tongue coating, plaque index, and probing depths were evaluated for each recruit. RESULTS: The questionnaire revealed that only 17% of all included recruits had never experienced halitosis. The organoleptic evaluation (grades 0 to 3) identified eight persons with grade 3, 148 persons with grade 2, and 424 persons with grade 1 or 0. The calculation of the Pearson correlation coefficient to evaluate the relationship among the three methods of assessing halitosis revealed little to no correlation. The organoleptic score showed high reproducibility (kappa = 0.79). Tongue coating was the only influencing factor found to contribute to higher organoleptic scores and higher VSC values. CONCLUSIONS: Oral malodor seemed to pose an oral health problem for about one-fifth of 20-year-old Swiss males questioned. No correlation between self-reported halitosis and organoleptic or VSC measurements could be detected. Although the organoleptic method described here offers a high reproducibility, the lack of correlation between VSC values and organoleptic scores has to be critically addressed. For further studies assessing new organoleptic scores, a validated index should always be included as a direct control.
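The halitosis study's two agreement statistics, the Pearson correlation between assessment methods and the unweighted kappa for organoleptic reproducibility, can be sketched in pure Python. The data passed in below are hypothetical, not the study's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two measurement series
    (e.g. VSC values vs. organoleptic scores)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def cohen_kappa(a, b, categories):
    """Unweighted Cohen's kappa for two ratings of the same subjects
    (e.g. repeated organoleptic grading); illustrative sketch only."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n            # observed agreement
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)  # chance
    return (po - pe) / (1 - pe)
```

A kappa of 0.79, as reported for the organoleptic score, is conventionally read as substantial agreement.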
Abstract:
BACKGROUND: Periodontitis is the major cause of tooth loss in adults and is linked to systemic illnesses, such as cardiovascular disease and stroke. The development of rapid point-of-care (POC) chairside diagnostics has the potential for the early detection of periodontal infection and progression to identify incipient disease and reduce health care costs. However, validation of effective diagnostics requires the identification and verification of biomarkers correlated with disease progression. This clinical study sought to determine the ability of putative host- and microbially derived biomarkers to identify periodontal disease status from whole saliva and plaque biofilm. METHODS: One hundred human subjects were equally recruited into a healthy/gingivitis group or a periodontitis population. Whole saliva was collected from all subjects and analyzed using antibody arrays to measure the levels of multiple proinflammatory cytokines and bone resorptive/turnover markers. RESULTS: Salivary biomarker data were correlated to comprehensive clinical, radiographic, and microbial plaque biofilm levels measured by quantitative polymerase chain reaction (qPCR) for the generation of models for periodontal disease identification. Significantly elevated levels of matrix metalloproteinase (MMP)-8 and -9 were found in subjects with advanced periodontitis, with Random Forest importance scores of 7.1 and 5.1, respectively. The generation of receiver operating characteristic curves demonstrated that permutations of salivary biomarkers and pathogen biofilm values augmented the prediction of disease category. Multiple combinations of salivary biomarkers (especially MMP-8 and -9 and osteoprotegerin) combined with red-complex anaerobic periodontal pathogens (such as Porphyromonas gingivalis or Treponema denticola) provided highly accurate predictions of periodontal disease category. Elevated salivary MMP-8 and T. denticola biofilm levels displayed robust combinatorial characteristics in predicting periodontal disease severity (area under the curve = 0.88; odds ratio = 24.6; 95% confidence interval: 5.2 to 116.5). CONCLUSIONS: Using qPCR and sensitive immunoassays, we identified host- and bacterially derived biomarkers correlated with periodontal disease. This approach offers significant potential for the discovery of biomarker signatures useful in the development of rapid POC chairside diagnostics for oral and systemic diseases. Studies are ongoing to apply this approach to the longitudinal predictions of disease activity.
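An odds ratio with its Wald-type confidence interval, as reported for the MMP-8 plus T. denticola combination, comes from a 2x2 table. The counts below are hypothetical and chosen only to illustrate the calculation; the study's OR was 24.6 (95% CI: 5.2 to 116.5):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald log-scale 95% CI from a 2x2 table:
    a = disease+/marker+, b = disease+/marker-,
    c = disease-/marker+, d = disease-/marker-."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Hypothetical counts producing an OR in the same range as the study's:
or_, lo, hi = odds_ratio_ci(30, 10, 5, 40)
```

Note the wide interval that small off-diagonal counts produce, mirroring the study's 5.2-116.5 range.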
Abstract:
PURPOSE To explore whether population-related pharmacogenomics contribute to differences in patient outcomes between clinical trials performed in Japan and the United States, given similar study designs, eligibility criteria, staging, and treatment regimens. METHODS We prospectively designed and conducted three phase III trials (Four-Arm Cooperative Study, LC00-03, and S0003) in advanced-stage, non-small-cell lung cancer, each with a common arm of paclitaxel plus carboplatin. Genomic DNA was collected from patients in LC00-03 and S0003 who received paclitaxel (225 mg/m(2)) and carboplatin (area under the concentration-time curve, 6). Genotypic variants of CYP3A4, CYP3A5, CYP2C8, NR1I2-206, ABCB1, ERCC1, and ERCC2 were analyzed by pyrosequencing or by PCR restriction fragment length polymorphism. Results were assessed by Cox model for survival and by logistic regression for response and toxicity. RESULTS Clinical results were similar in the two Japanese trials, and were significantly different from the US trial, for survival, neutropenia, febrile neutropenia, and anemia. There was a significant difference between Japanese and US patients in genotypic distribution for CYP3A4*1B (P = .01), CYP3A5*3C (P = .03), ERCC1 118 (P < .0001), ERCC2 K751Q (P < .001), and CYP2C8 R139K (P = .01). Genotypic associations were observed between CYP3A4*1B for progression-free survival (hazard ratio [HR], 0.36; 95% CI, 0.14 to 0.94; P = .04) and ERCC2 K751Q for response (HR, 0.33; 95% CI, 0.13 to 0.83; P = .02). For grade 4 neutropenia, the HR for ABCB1 3435C>T was 1.84 (95% CI, 0.77 to 4.48; P = .19). CONCLUSION Differences in allelic distribution for genes involved in paclitaxel disposition or DNA repair were observed between Japanese and US patients. In an exploratory analysis, genotype-related associations with patient outcomes were observed for CYP3A4*1B and ERCC2 K751Q. This common-arm approach facilitates the prospective study of population-related pharmacogenomics in which ethnic differences in antineoplastic drug disposition are anticipated.
Abstract:
BACKGROUND: Efavirenz and lopinavir boosted with ritonavir are both recommended as first-line therapies for patients with HIV when combined with two nucleoside reverse transcriptase inhibitors. It is uncertain which therapy is more effective for patients starting therapy with an advanced infection. METHODS: We estimated the relative effect of these two therapies on rates of virological and immunological failure within the Swiss HIV Cohort Study and considered whether estimates depended on the CD4(+) T-cell count when starting therapy. We defined virological failure as either an incomplete virological response or viral rebound after viral suppression and immunological failure as failure to achieve an expected CD4(+) T-cell increase calculated from EuroSIDA statistics. RESULTS: Patients starting efavirenz (n=660) and lopinavir (n=541) were followed for a median of 4.5 and 3.1 years, respectively. Virological failure was less likely for patients on efavirenz, with the adjusted hazard ratio (95% confidence interval) of 0.63 (0.50-0.78) then multiplied by a factor of 1.00 (0.90-1.12) for each 100 cells/mm(3) decrease in CD4(+) T-cell count below the mean when starting therapy. Immunological failure was also less likely for patients on efavirenz, with the adjusted hazard ratio of 0.68 (0.51-0.91) then multiplied by a factor of 1.29 (1.14-1.46) for each 100 cells/mm(3) decrease in CD4(+) T-cell count below the mean when starting therapy. CONCLUSIONS: Virological failure is less likely with efavirenz regardless of the CD4(+) T-cell count when starting therapy. Immunological failure is also less likely with efavirenz; however, this advantage disappears if patients start therapy with a low CD4(+) T-cell count.
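The abstract's "base hazard ratio multiplied by a factor per 100 cells below the mean" is a multiplicative model that is easy to make concrete. A minimal sketch, using only the point estimates quoted in the abstract:

```python
def adjusted_hr(base_hr: float, per100_factor: float, cells_below_mean: float) -> float:
    """Hazard ratio scaled by the per-100-cells factor for a CD4+ T-cell
    count below the cohort mean (sketch of the abstract's model)."""
    return base_hr * per100_factor ** (cells_below_mean / 100.0)

# Virological failure: base HR 0.63, factor 1.00 -> CD4-independent advantage.
hr_viro = adjusted_hr(0.63, 1.00, 300)        # stays 0.63 at any CD4 count

# Immunological failure: base HR 0.68, factor 1.29 per 100 cells below mean.
hr_immuno_200 = adjusted_hr(0.68, 1.29, 200)  # 0.68 * 1.29**2, above 1.0
```

This reproduces the conclusion: the efavirenz advantage for immunological failure disappears (HR rises past 1) when therapy starts at a sufficiently low CD4+ count, while the virological advantage does not depend on the starting count.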
Abstract:
This study aimed to assess the performance of the International Caries Detection and Assessment System (ICDAS), radiographic examination, and fluorescence-based methods for detecting occlusal caries in primary teeth. One occlusal site on each of 79 primary molars was assessed twice by two examiners using ICDAS, bitewing radiography (BW), DIAGNOdent 2095 (LF), DIAGNOdent 2190 (LFpen), and the VistaProof fluorescence camera (FC). The teeth were histologically prepared and assessed for caries extent. Optimal cutoff limits were calculated for LF, LFpen, and FC. At the D(1) threshold (enamel and dentin lesions), ICDAS and FC presented higher sensitivity values (0.75 and 0.73, respectively), while BW showed higher specificity (1.00). At the D(2) threshold (inner enamel and dentin lesions), ICDAS presented higher sensitivity (0.83) and statistically significantly lower specificity (0.70). At the D(3) threshold (dentin lesions), LFpen and FC showed higher sensitivity (1.00 and 0.91, respectively), while higher specificity was presented by FC (0.95), ICDAS (0.94), BW (0.94), and LF (0.92). The area under the receiver operating characteristic (ROC) curve (Az) varied from 0.780 (BW) to 0.941 (LF). Spearman correlation coefficients with histology were 0.72 (ICDAS), 0.64 (BW), 0.71 (LF), 0.65 (LFpen), and 0.74 (FC). Inter- and intraexaminer intraclass correlation values varied from 0.772 to 0.963, and unweighted kappa values ranged from 0.462 to 0.750. In conclusion, ICDAS and FC exhibited better accuracy in detecting enamel and dentin caries lesions, whereas ICDAS, LF, LFpen, and FC were more appropriate for detecting dentin lesions on occlusal surfaces in primary teeth, with no statistically significant difference among them. All methods presented good to excellent reproducibility.
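The per-threshold sensitivity and specificity values above come from 2x2 counts against the histological gold standard. A sketch with hypothetical counts (chosen to land on round values, not the study's data):

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN) against diseased sites;
    specificity = TN/(TN+FP) against sound sites."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one method at one caries threshold:
sens, spec = sens_spec(tp=33, fn=11, tn=35, fp=0)
```

With these counts the method detects 75% of lesions and never flags a sound site, the trade-off pattern the abstract describes for ICDAS (high sensitivity) versus BW (specificity of 1.00) at D(1).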
Abstract:
Introduction: Lesotho was among the first countries to adopt decentralization of care from hospitals to nurse-led health centres (HCs) to scale up the provision of antiretroviral therapy (ART). We compared outcomes between patients who started ART at HCs and hospitals in two rural catchment areas in Lesotho. Methods: The two catchment areas comprise two hospitals and 12 HCs. Patients ≥16 years starting ART at a hospital or HC between 2008 and 2011 were included. Loss to follow-up (LTFU) was defined as not returning to the facility for ≥180 days after the last visit, no follow-up (no FUP) as not returning after starting ART, and retention in care as alive and on ART at the facility. The data were analysed using logistic regression, competing risk regression and Kaplan-Meier methods. Multivariable analyses were adjusted for sex, age, CD4 cell count, World Health Organization stage, catchment area and type of ART. All analyses were stratified by gender. Results: Of 3747 patients, 2042 (54.5%) started ART at HCs. Both women and men at hospitals had more advanced clinical and immunological stages of disease than those at HCs. Over 5445 patient-years, 420 died and 475 were LTFU. Kaplan-Meier estimates for three-year retention were 68.7 and 69.7% at HCs and hospitals, respectively, among women (p=0.81) and 68.8% at HCs versus 54.7% at hospitals among men (p<0.001). These findings persisted in adjusted analyses, with similar retention at HCs and hospitals among women (odds ratio (OR): 0.89, 95% confidence interval (CI): 0.73-1.09) and higher retention at HCs among men (OR: 1.53, 95% CI: 1.20-1.96). The latter result was mainly driven by a lower proportion of patients LTFU at HCs (OR: 0.68, 95% CI: 0.51-0.93). Conclusions: In rural Lesotho, overall retention in care did not differ significantly between nurse-led HCs and hospitals. However, men seemed to benefit most from starting ART at HCs, as they were more likely to remain in care in these facilities compared to hospitals.
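The Kaplan-Meier retention estimates above come from the standard product-limit construction. A minimal pure-Python sketch with toy data (distinct event times, no tie handling), not the study's analysis code:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. `events[i]` is 1 for the event
    (e.g. leaving care) and 0 for censoring. Returns (time, S(t)) steps."""
    at_risk = len(times)
    s = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e:                                  # event: survival drops
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        at_risk -= 1                           # censored or event: leaves risk set
    return curve

# Toy follow-up data: events at t=1, 3, 5; censoring at t=2, 4.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 1])
```

Censored subjects reduce the risk set without dropping the curve, which is why the estimator handles the study's 475 LTFU patients without treating them as deaths.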
Abstract:
AIMS AND BACKGROUND Tumor progression due to seeding of tumor cells after definitive treatment for squamous cell carcinomas of the head and neck is an uncommon condition that can considerably worsen the outcome of patients with head and neck cancer. METHODS AND STUDY DESIGN We report two cases of recurrence due to neoplastic seeding from oropharyngeal and oral cancer, respectively. We performed a literature review with MEDLINE as the main search engine. RESULTS Seeding was found to occur most often in tracheotomy scars and gastrostomy sites. The oral cavity, hypopharynx and oropharynx were the primary sites in most cases, and advanced tumor stage seemed to be a risk factor for seeding. Treatment options include salvage surgery, which requires thorough resections, radiotherapy when possible, and palliative management. The prognosis of such events is poor. CONCLUSION Although neoplastic seeding is a well-known phenomenon in cancer surgery, many questions remain unanswered, especially regarding preventive measures and management strategies.
Abstract:
PURPOSE To assess the clinical profile and prognostic factors in patients with adenosquamous carcinoma (ASC) of the head and neck treated by surgery and/or radiation therapy with or without chemotherapy. METHODS Data from 20 patients with stage I-II (n = 4), III (n = 5), or IVA (n = 11) head and neck ASC, treated between 1989 and 2010, were collected in a retrospective multicenter Rare Cancer Network study. Surgery was performed in 16 patients. Seventeen patients received combined modality treatment. RESULTS After a median follow-up of 15.5 months, 12 patients recurred. The 3-year and median overall survival, disease-free survival (DFS), and loco-regional control were 52% and 39 months, 32% and 12 months, and 47% and 33 months, respectively. In multivariate analysis, DFS was negatively influenced by the presence of extracapsular extension and advanced stage. CONCLUSION Overall prognosis of locoregionally advanced ASC remains poor. However, early stage ASC patients managed with combined modality treatment may have prolonged DFS.
Abstract:
The range of novel psychoactive substances (NPS), including phenethylamines, cathinones, piperazines, tryptamines, etc., is continuously growing. Therefore, fast and reliable screening methods for these compounds are essential and needed. The use of dried blood spots (DBS) for a fast, straightforward approach helps to simplify and shorten sample preparation significantly. DBS were produced from 10 µl of whole blood and extracted offline with 500 µl methanol, followed by evaporation and reconstitution in mobile phase. Reversed-phase chromatographic separation and mass spectrometric detection (RP-LC-MS/MS) was achieved within a run time of 10 min. The screening method was validated by evaluating the following parameters: limit of detection (LOD), matrix effect, selectivity and specificity, extraction efficiency, and short-term and long-term stability. Furthermore, the method was applied to authentic samples and results were compared with those obtained with a validated whole blood method used for routine analysis of NPS. LOD was between 1 and 10 ng/ml. No interference from matrix compounds was observed. The method was proven to be specific and selective for the analytes, although with limitations for 3-FMC/flephedrone and MDDMA/MDEA. Mean extraction efficiency was 84.6%. All substances were stable in DBS for at least a week when cooled. Cooling was essential for the stability of cathinones. Prepared samples were stable for at least 3 days. Comparison to the validated whole blood method yielded similar results. DBS were shown to be useful in developing a rapid screening method for NPS with simplified sample preparation.
Abstract:
OBJECTIVES: The aim of this study was to determine whether the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI)- or Cockcroft-Gault (CG)-based estimated glomerular filtration rates (eGFRs) performs better in the cohort setting for predicting moderate/advanced chronic kidney disease (CKD) or end-stage renal disease (ESRD). METHODS: A total of 9521 persons in the EuroSIDA study contributed 133 873 eGFRs. Poisson regression was used to model the incidence of moderate and advanced CKD (confirmed eGFR < 60 and < 30 mL/min/1.73 m(2), respectively) or ESRD (fatal/nonfatal) using CG and CKD-EPI eGFRs. RESULTS: Of 133 873 eGFR values, the ratio of CG to CKD-EPI was ≥ 1.1 in 22 092 (16.5%) and the difference between them (CG minus CKD-EPI) was ≥ 10 mL/min/1.73 m(2) in 20 867 (15.6%). Differences between CKD-EPI and CG were much greater when CG was not standardized for body surface area (BSA). A total of 403 persons developed moderate CKD using CG [incidence 8.9/1000 person-years of follow-up (PYFU); 95% confidence interval (CI) 8.0-9.8] and 364 using CKD-EPI (incidence 7.3/1000 PYFU; 95% CI 6.5-8.0). CG-derived eGFRs were equal to CKD-EPI-derived eGFRs at predicting ESRD (n = 36) and death (n = 565), as measured by the Akaike information criterion. CG-based moderate and advanced CKDs were associated with ESRD [adjusted incidence rate ratio (aIRR) 7.17; 95% CI 2.65-19.36 and aIRR 23.46; 95% CI 8.54-64.48, respectively], as were CKD-EPI-based moderate and advanced CKDs (aIRR 12.41; 95% CI 4.74-32.51 and aIRR 12.44; 95% CI 4.83-32.03, respectively). CONCLUSIONS: Differences between eGFRs using CG adjusted for BSA or CKD-EPI were modest. In the absence of a gold standard, the two formulae predicted clinical outcomes with equal precision and can be used to estimate GFR in HIV-positive persons.