953 results for Genealogical concordance
Abstract:
The subjective interpretation of dobutamine echocardiography (DBE) makes the accuracy of this technique dependent on the experience of the observer, and also poses problems of concordance between observers. Myocardial tissue Doppler velocity (MDV) may offer a quantitative technique for identification of coronary artery disease, but it is unclear whether this parameter could improve the results of less expert readers and in segments with low interobserver concordance. The aim of this study was to find whether MDV improved the accuracy of wall motion scoring in novice readers, experienced echocardiographers, and experts in stress echocardiography, and to identify the optimal means of integrating these tissue Doppler data in 77 patients who underwent DBE and angiography. New or worsening abnormalities were identified as ischemia, and abnormalities seen at rest as scarring. Segmental MDV was measured independently, and previously derived cutoffs were applied to categorize segments as normal or abnormal. Five strategies were used to combine MDV and wall motion score, and the results of each reader using each strategy were compared with quantitative coronary angiography. The accuracy of wall motion scoring by novice (68 +/- 3%) and experienced echocardiographers (71 +/- 3%) was lower than that of experts in stress echocardiography (88 +/- 3%, p < 0.001). Various strategies for integration with MDV significantly improved the accuracy of wall motion scoring by novices to between 75 +/- 2% and 77 +/- 5% (p < 0.01). Among the experienced group, accuracy improved to between 74 +/- 2% and 77 +/- 5% (p < 0.05), but in the experts, no improvement was seen over their baseline accuracy. Integration with MDV also reduced discordance in the basal segments. Thus, use of MDV in all segments, or MDV in all segments with wall motion scoring in the apex, offers an improvement in sensitivity and accuracy with minimal compromise in specificity. (C) 2001 by Excerpta Medica, Inc.
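For readers unfamiliar with how such integration strategies work in practice, the sketch below illustrates one strategy of the kind described above (MDV classification in all segments except the apex, where the visual wall motion score is retained). The cutoff value, segment naming and function are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of combining wall motion scoring (WMS) with a
# myocardial Doppler velocity (MDV) cutoff; values are illustrative only.

MDV_CUTOFF_CM_S = 5.5  # assumed peak systolic velocity cutoff, not from the paper

def classify_segment(segment_name, wms_abnormal, peak_mdv_cm_s):
    """Return True if the segment is read as abnormal.

    Strategy: use the MDV cutoff everywhere except apical segments,
    where the visual wall motion score is kept.
    """
    if segment_name.startswith("apical"):
        return wms_abnormal
    return peak_mdv_cm_s < MDV_CUTOFF_CM_S

# Example: a basal segment with a normal visual reading but a low velocity.
print(classify_segment("basal_inferior", wms_abnormal=False, peak_mdv_cm_s=4.8))  # True
```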
Abstract:
Theory predicts that in small isolated populations random genetic drift can lead to phenotypic divergence; however, this prediction has rarely been tested quantitatively in natural populations. Here we utilize natural repeated island colonization events by members of the avian species complex, Zosterops lateralis, to assess whether or not genetic drift alone is an adequate explanation for the observed patterns of microevolutionary divergence in morphology. Morphological and molecular genetic characteristics of island and mainland populations are compared to test three predictions of drift theory: (1) that the pattern of morphological change is idiosyncratic to each island; (2) that there is concordance between morphological and neutral genetic shifts across island populations; and (3) for populations whose time of colonization is known, that the rate of morphological change is sufficiently slow to be accounted for solely by genetic drift. Our results are not consistent with these predictions. First, the direction of size shifts was consistently towards larger size, suggesting the action of a nonrandom process. Second, patterns of morphological divergence among recently colonized populations showed little concordance with divergence in neutral genetic characters. Third, rate tests of morphological change showed that effective population sizes were not small enough for random processes alone to account for the magnitude of microevolutionary change. Altogether, these three lines of evidence suggest that drift alone is not an adequate explanation of morphological differentiation in recently colonized island Zosterops, and therefore we suggest that the observed microevolutionary changes are largely a result of directional natural selection.
Abstract:
Objectives: To determine (i) factors which predict whether patients hospitalised with acute myocardial infarction (AMI) receive care discordant with recommendations of clinical practice guidelines; and (ii) whether such discordant care results in worse outcomes compared with receiving guideline-concordant care. Design: Retrospective cohort study. Setting: Two community general hospitals. Participants: 607 consecutive patients admitted with AMI between July 1997 and December 2000. Main outcome measures: Clinical predictors of discordant care; crude and risk-adjusted rates of in-hospital mortality and reinfarction, and mean length of hospital stay. Results: At least one treatment recommendation for AMI was applicable for 602 of the 607 patients. Of these patients, 411 (68%) received concordant care, and 191 (32%) discordant care. Positive predictors at presentation of discordant care were age > 65 years (odds ratio [OR], 2.5; 95% CI, 1.7-3.6), silent infarction (OR, 2.7; 95% CI, 1.6-4.6), anterior infarction (OR, 2.5; 95% CI, 1.7-3.8), a history of heart failure (OR, 6.3; 95% CI, 3.7-10.7), chronic atrial fibrillation (OR, 3.2; 95% CI, 1.5-6.4), and heart rate greater than or equal to 100 beats/min (OR, 2.1; 95% CI, 1.4-3.1). Death occurred in 12.0% (23/191) of discordant-care patients versus 4.6% (19/411) of concordant-care patients (adjusted OR, 2.42; 95% CI, 1.22-4.82). Mortality was inversely related to the level of guideline concordance (P = 0.03). Reinfarction rates also tended to be higher in the discordant-care group (4.2% v 1.7%; adjusted OR, 2.5; 95% CI, 0.90-7.1). Conclusions: Certain clinical features at presentation predict a higher likelihood of guideline-discordant care in patients presenting with AMI. Such care appears to increase the risk of in-hospital death.
Abstract:
Aims To determine the degree of inter-institutional agreement in the assessment of dobutamine stress echocardiograms using modern stress echocardiographic technology in combination with standardized data acquisition and assessment criteria. Methods and Results Among six experienced institutions, 150 dobutamine stress echocardiograms (dobutamine up to 40 μg.kg(-1).min(-1) and atropine up to 1 mg) were performed on patients with suspected coronary artery disease using fundamental and harmonic imaging following a consistent digital acquisition protocol. Each dobutamine stress echocardiogram was assessed at every institution regarding endocardial visibility and left ventricular wall motion, without knowledge of any other data, using standardized reading criteria. No patients were excluded due to poor image quality or inadequate stress level. Coronary angiography was performed within 4 weeks. Coronary angiography demonstrated significant coronary artery disease (greater than or equal to 50% diameter stenosis) in 87 patients. Using harmonic imaging, an average of 5.2 +/- 0.9 institutions agreed on dobutamine stress echocardiogram results as being normal or abnormal (mean kappa 0.55; 95% CI 0.50-0.60). Agreement was higher in patients with no (equal assessment of dobutamine stress echocardiogram results by 5.5 +/- 0.8 institutions) or three-vessel coronary artery disease (5.4 +/- 0.8 institutions) and lower in one- or two-vessel disease (5.0 +/- 0.9 and 5.2 +/- 1.0 institutions, respectively; P=0.041). Disagreement on test results was greater in patients with only minor wall motion abnormalities. Agreement on dobutamine stress echocardiogram results was lower using fundamental imaging (mean kappa 0.49; 95% CI 0.44-0.54; P
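As a hedged illustration of the agreement statistics quoted above (mean number of concordant institutions and mean kappa), the sketch below uses made-up binary readings and mean pairwise Cohen's kappa; the data and the pairwise-kappa choice are assumptions, not the study's analysis.

```python
# Illustrative sketch (not the study's code): agreement among six readers on
# binary dobutamine stress echo results (1 = abnormal, 0 = normal).
from itertools import combinations

# Hypothetical readings: rows = patients, columns = institutions.
readings = [
    [1, 1, 1, 1, 0, 1],
    [0, 0, 0, 0, 0, 0],
    [1, 0, 1, 1, 1, 0],
]

# "Institutions agreeing" per patient = size of the majority vote.
agreeing = [max(sum(r), len(r) - sum(r)) for r in readings]
print(sum(agreeing) / len(agreeing))  # mean number of concordant institutions

def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)              # chance agreement
    return (po - pe) / (1 - pe) if pe < 1 else 1.0

pairs = combinations(range(6), 2)
kappas = [cohen_kappa([r[i] for r in readings], [r[j] for r in readings])
          for i, j in pairs]
print(sum(kappas) / len(kappas))  # mean pairwise kappa across institutions
```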
Abstract:
Background: Patients diagnosed with cancer are often treated with chemotherapy and radiotherapy with curative intent. The transition from curative to palliative intent involves re-evaluation of treatment, and has to take into account the attitudes, beliefs and life aims of the patient. Objective: To discuss the difficulties in determining when to cease chemotherapy and radiotherapy in patients with advanced cancer. Discussion: The concept of treatment evaluation using a ‘burden versus benefit’ paradigm is discussed. Treatment aims must be in concordance with those of the patient, which are often couched in functional terms or linked to future significant life events. Chemotherapy and radiotherapy can offer patients in the palliative phase of cancer illness benefits in terms of relief of symptoms and meaningful prolongation of life, and should be considered in appropriate circumstances. (author abstract)
Abstract:
Patients with chronic or complex medical or psychiatric conditions are treated by many practitioners, including general practitioners (GPs). Formal liaison between primary and specialist care is often assumed to offer benefits to patients. The aim of this study was to assess the efficacy of formal liaison of GPs with specialist service providers on patient health outcomes, by conducting a systematic review of the published literature in the MEDLINE, EMBASE, PsychINFO, CINAHL and Cochrane Library databases using the following search terms: 'family physicians'; synonyms of 'patient care planning', 'patient discharge' and 'patient care team'; and synonyms of 'randomised controlled trials'. Seven studies were identified, involving 963 subjects and 899 controls. Most health outcomes were unchanged, although some physical and functional health outcomes were improved by formal liaison between GPs and specialist services, particularly among patients with chronic mental illness. Some health outcomes worsened during the intervention. Patient retention rates within treatment programmes improved with GP involvement, as did patient satisfaction. Doctor (GP and specialist) behaviour changed, with reports of more rational use of resources and diagnostic tests, improved clinical skills, more frequent use of appropriate treatment strategies, and more frequent clinical behaviours designed to detect disease complications. Cost-effectiveness could not be determined. In conclusion, formal liaison between GPs and specialist services leaves most physical health outcomes unchanged, but improves functional outcomes in chronically mentally ill patients. It may confer modest long-term health benefits through improvements in patient concordance with treatment programmes and more effective clinical practice.
Abstract:
Two different doses of Ross River virus (RR) were fed to Ochlerotatus vigilax (Skuse), the primary coastal vector in Australia, and blood-engorged females were held at different temperatures for up to 35 d. After ingesting 10(4.3) CCID50/mosquito, mosquitoes reared at 18 and 25°C (and held at the same temperature) had higher body remnant and head and salivary gland titers than those held at 32°C, although infection rates were comparable. At 18, 25, and 32°C, respectively, virus was first detected in the salivary glands on days 3, 2, and 3. Based on a previously demonstrated 98.7% concordance between salivary gland infection and transmission, the extrinsic incubation periods were estimated as 5, 4, and 3 d, respectively, for these three temperatures. When Oc. vigilax reared at 18, 25, or 32°C were fed a lower dosage of 10(3.3) CCID50 RR/mosquito, and assayed after 7 d extrinsic incubation at these (or combinations of these) temperatures, infection rates and titers were similar. By 14 d, however, infection rates and titers of those reared and held at 18 and 32°C were significantly higher and lower, respectively. This process was reversible when the moderate 25°C was involved, and intermediate infection rates and titers resulted. These data indicate that, for the strains of RR and Oc. vigilax used, rearing temperature is unimportant to vector competence in the field, and that ambient temperature variations will modulate or enhance detectable infection rates only after 7 d of extrinsic incubation. Because of the short duration of extrinsic incubation, however, this will do little to influence RR epidemiology, because by this time some Oc. vigilax could be seeking their third blood meal, the latter two being infectious.
Abstract:
For zygosity diagnosis in the absence of genotypic data, or in the recruitment phase of a twin study where only single twins from same-sex pairs are being screened, or to provide a test for sample duplication leading to the false identification of a dizygotic pair as monozygotic, the appropriate analysis of respondents' answers to questions about zygosity is critical. Using data from a young adult Australian twin cohort (N = 2094 complete pairs and 519 singleton twins from same-sex pairs with complete responses to all zygosity items), we show that application of latent class analysis (LCA), fitting a 2-class model, yields results that show good concordance with traditional methods of zygosity diagnosis, but with certain important advantages. These include the ability, in many cases, to assign zygosity with specified probability on the basis of responses of a single informant (advantageous when one zygosity type is being oversampled); and the ability to quantify the probability of misassignment of zygosity, allowing prioritization of cases for genotyping as well as identification of cases of probable laboratory error. Out of 242 twins (from 121 like-sex pairs) for whom genotypic data were available for zygosity confirmation, only a single case of incorrect zygosity assignment by the latent class algorithm was identified. Zygosity assignment for that single case was identified by the LCA as uncertain (probability of being a monozygotic twin only 76%), and the co-twin's responses clearly identified the pair as dizygotic (probability of being dizygotic 100%). In the absence of genotypic data, or as a safeguard against sample duplication, application of LCA for zygosity assignment or confirmation is strongly recommended.
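A minimal sketch of the kind of two-class latent class model described above, fitted by expectation-maximization over binary questionnaire items, is given below; the data, item count and starting values are hypothetical, and this is not the authors' implementation.

```python
# Minimal two-class latent class analysis (LCA) sketch for binary zygosity items.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rows = twins, columns = yes/no zygosity questionnaire items.
X = rng.integers(0, 2, size=(200, 4))

K = 2
n, J = X.shape
pi = np.full(K, 1.0 / K)                 # class prevalences
p = rng.uniform(0.3, 0.7, size=(K, J))   # P(item = yes | class)

for _ in range(200):                     # EM iterations
    # E-step: posterior class membership for each respondent
    like = np.ones((n, K))
    for k in range(K):
        like[:, k] = np.prod(p[k] ** X * (1 - p[k]) ** (1 - X), axis=1)
    resp = like * pi
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: update prevalences and item-response probabilities
    pi = resp.mean(axis=0)
    p = (resp.T @ X) / resp.sum(axis=0)[:, None]

# Posterior probability that each twin belongs to class 0 (e.g., "MZ-like" answers);
# uncertain cases (probabilities far from 0 or 1) could be prioritized for genotyping.
posterior_class0 = resp[:, 0]
print(posterior_class0[:5].round(2))
```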
Abstract:
Cropp and Gabric [Ecosystem adaptation: do ecosystems maximise resilience? Ecology. In press] used a simple phytoplankton-zooplankton-nutrient model and a genetic algorithm to determine the parameter values that would maximize the value of certain goal functions. These goal functions were to maximize biomass, maximize flux, maximize the flux to biomass ratio, and maximize resilience. It was found that maximizing goal functions maximized resilience. The objective of this study was to investigate whether the Cropp and Gabric [Ecosystem adaptation: do ecosystems maximise resilience? Ecology. In press] result was indicative of a general ecosystem principle, or peculiar to the model and parameter ranges used. This study successfully replicated the Cropp and Gabric [Ecosystem adaptation: do ecosystems maximise resilience? Ecology. In press] experiment for a number of different model types; however, a different interpretation of the results is made. A new metric, concordance, was devised to describe the agreement between goal functions. It was found that resilience has the highest concordance of all goal functions trialled, for most model types. This implies that resilience offers a compromise between the established ecological goal functions. The parameter value range used is found to affect the parameter versus goal function relationships. Local maxima and minima affected the relationship between parameters and goal functions, and between goal functions. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
OBJECTIVES: To detect anti-Giardia lamblia serum antibodies in healthy children attending public day care centers and to assess serological tests as tools for estimating the prevalence of G. lamblia in endemic areas. METHODS: Three separate stool specimens and filter paper blood samples were collected from 147 children ranging from 0 to 6 years old. Each stool sample was processed using spontaneous sedimentation and zinc sulfate flotation methods. Blood samples were tested by indirect immunofluorescence (IIF) and enzyme-linked immunosorbent assay (ELISA) for Giardia IgG. RESULTS AND CONCLUSIONS: Of 147 individuals tested, 93 (63.3%) showed Giardia cysts in their feces. Using IIF and ELISA, serum antibodies were detected in 93 (63.3%) and 100 (68%) samples, respectively. Sensitivity of IIF and ELISA was 82% and 72%, respectively. However, ELISA proved to be less specific (39%) than IIF (70%). IIF also showed a higher concordance with microscopic examination than ELISA.
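For illustration, the sensitivity, specificity and crude concordance figures above can be derived from a two-by-two table against stool microscopy as the reference standard; the counts in the sketch below are only approximately reconstructed from the ELISA figures quoted in the abstract and are not the published cross-tabulation.

```python
# Illustrative derivation of diagnostic performance against a microscopy reference.

def diagnostic_performance(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)                    # positives detected among infected
    specificity = tn / (tn + fp)                    # negatives among non-infected
    concordance = (tp + tn) / (tp + fp + fn + tn)   # overall agreement with microscopy
    return sensitivity, specificity, concordance

# Approximate ELISA-style counts: 93 microscopy-positive, 54 negative, 100 ELISA-positive.
print(diagnostic_performance(tp=67, fp=33, fn=26, tn=21))  # ~ (0.72, 0.39, 0.60)
```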
Abstract:
OBJECTIVE: To evaluate the potential advantages and limitations of the use of the Brazilian hospital admission authorization forms database and the probabilistic record linkage methodology for the validation of reported utilization of hospital care services in household surveys. METHODS: A total of 2,288 household interviews were conducted in the county of Duque de Caxias, Brazil. Information on the occurrence of at least one hospital admission in the year preceding the interview was obtained from a total of 10,733 household members. The 130 records of household members who reported at least one hospital admission in a public hospital were linked to a hospital database with 801,587 records, using an automatic probabilistic approach combined with an extensive clerical review. RESULTS: Seventy-four (57%) of the 130 household members were identified in the hospital database. Yet only 60 subjects (46%) showed a record of hospitalization in the hospital database in the study period. Hospital admissions due to a surgery procedure were significantly more likely to have been identified in the hospital database. The low level of concordance seen in the study can be explained by the following factors: errors in the linkage process; a telescoping effect; and an incomplete record in the hospital database. CONCLUSIONS: The use of hospital administrative databases and probabilistic linkage methodology may represent a methodological alternative for the validation of reported utilization of health care services, but some strategies should be employed in order to minimize the problems related to the use of this methodology in non-ideal conditions. Ideally, a single identifier, such as a personal health insurance number, and the universal coverage of the database would be desirable.
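As a rough illustration of the probabilistic linkage idea referred to above, a Fellegi-Sunter-style match-weight calculation is sketched below; the linkage fields and the m- and u-probabilities are hypothetical assumptions, not those used in the study.

```python
# Hedged sketch of probabilistic record linkage via field-wise match weights.
import math

# m = P(fields agree | records refer to the same person)
# u = P(fields agree | records refer to different people)
FIELDS = {
    "name":          {"m": 0.95, "u": 0.01},
    "date_of_birth": {"m": 0.97, "u": 0.005},
    "mother_name":   {"m": 0.90, "u": 0.01},
}

def match_weight(agreements):
    """Sum log2 likelihood ratios over fields; higher = more likely a true match."""
    w = 0.0
    for field, probs in FIELDS.items():
        m, u = probs["m"], probs["u"]
        if agreements[field]:
            w += math.log2(m / u)
        else:
            w += math.log2((1 - m) / (1 - u))
    return w

# A candidate pair agreeing on name and date of birth but not mother's name;
# pairs above a chosen threshold would go to clerical review.
print(match_weight({"name": True, "date_of_birth": True, "mother_name": False}))
```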
Abstract:
Several antineoplastic drugs have been demonstrated to be carcinogenic or to have mutagenic and teratogenic effects. The greatest protection is achieved with the implementation of administrative and engineering controls and safety procedures. Objective: to evaluate the improvement in pharmacy technicians' work practices after the implementation of operational procedures related to individual protection, biological safety cabinet disinfection and cytotoxic drug preparation. Method: case study in a hospital pharmacy undergoing a certification process. Six pharmacy technicians were observed during their daily activities. Characterization of the work practices was made using a checklist based on ISOPP and PIC guidelines. The variables studied concerned cleaning/disinfection procedures, personal protective equipment and procedures for preparing cytotoxic drugs. The same work practices were evaluated after four months of operational procedure implementation. Concordance between work practices and guidelines was taken as a quality indicator (number of practices concordant with the guidelines / total number of practices x 100). Results: improvements were observed after the operational procedures were implemented. An improvement of 6.25% in personal protective equipment practice was achieved by changing the second pair of gloves every thirty minutes. The greatest improvement, 10%, was obtained in the disinfection procedure, where 80% of tasks are now performed according to the guidelines. So far, the drug preparation procedure has improved by only 1%, achieved by placing one cytotoxic drug at a time inside the biological safety cabinet; 85% of these practices now follow the guidelines. Conclusion: before the operational procedures were implemented, 80.3% of practices were in accordance with the guidelines, compared with 84.4% now. This indicates that procedures should be reviewed frequently in order to reduce the risks associated with handling cytotoxic drugs and to maintain drug specifications.
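A worked example of the quality indicator defined above (number of guideline-concordant practices divided by the total number of practices, times 100); the counts are illustrative only and are not taken from the study's checklist.

```python
# Quality indicator: (guideline-concordant practices / total practices) x 100.

def quality_indicator(concordant_practices, total_practices):
    return 100.0 * concordant_practices / total_practices

# Illustrative counts: 27 of 32 observed practices follow the guidelines.
print(round(quality_indicator(27, 32), 1))  # 84.4, comparable to the post-implementation figure above
```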
Abstract:
Postgraduate program in Psychology - FCLAS
Abstract:
Objective: To determine whether the recruitment pattern of the transversus abdominis/internal oblique, rectus abdominis and external oblique is altered in individuals with a history of acute lumbopelvic pain. Methods: An observational, analytical, cross-sectional study was carried out, with a sample consisting of one group of 15 individuals who had never had lumbopelvic pain and another group of 14 female individuals who had had at least one episode of acute lumbopelvic pain in the previous 6 months. The activity of the muscles listed above and of the deltoid was recorded and assessed by surface electromyography during rapid shoulder flexion, abduction and extension movements. The recruitment pattern of the abdominal muscles was analysed and compared between the two groups. The Mann-Whitney test was used to compare mean muscle activation time between the two groups. The Friedman test was performed to compare activation time among the muscles assessed in each movement direction (with a 5% significance level). Results: A delay in activation of the transversus abdominis was recorded during shoulder flexion and abduction, together with a loss of independent activation between the superficial and deep muscles studied. Conclusion: The recruitment pattern of the abdominal muscles associated with shoulder movement is altered in individuals with a history of acute lumbopelvic pain.
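As a hedged sketch of the statistical comparison described above (Mann-Whitney U between groups, Friedman test across muscles within one movement direction), the snippet below uses hypothetical onset latencies; it is not the study's data or code.

```python
# Illustrative group and within-subject comparisons of EMG onset latencies (ms,
# relative to deltoid onset); all values are made up for demonstration.
from scipy.stats import mannwhitneyu, friedmanchisquare

# Transversus abdominis onset latencies during rapid shoulder flexion.
control_group = [12, 18, 9, 15, 20, 14, 11, 16]
low_back_pain = [35, 42, 28, 50, 33, 46, 39, 31]

# Between-group comparison of activation time.
print(mannwhitneyu(control_group, low_back_pain, alternative="two-sided"))

# Onset latencies of three abdominal muscles for the same eight participants.
tra_io = [12, 18, 9, 15, 20, 14, 11, 16]   # transversus abdominis / internal oblique
oe     = [25, 30, 22, 28, 33, 27, 24, 29]  # external oblique
ra     = [40, 45, 38, 43, 50, 41, 39, 44]  # rectus abdominis

# Within-group comparison across muscles for one movement direction.
print(friedmanchisquare(tra_io, oe, ra))
```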
Abstract:
OBJECTIVE To propose a short version of the Brazilian Food Insecurity Scale. METHODS Two samples were used to test the results obtained in the analyses in two distinct scenarios. One of the studies was composed of 230 low-income families from Pelotas, RS, Southern Brazil, and the other was composed of 15,575 women, whose data were obtained from the 2006 National Survey on Demography and Health. Two models were tested, the first containing seven questions, and the second, the five questions that were considered the most relevant ones in the concordance analysis. The models were compared to the Brazilian Food Insecurity Scale, and the sensitivity, specificity and accuracy parameters were calculated, as well as the kappa agreement test. RESULTS Comparing the prevalence of food insecurity between the Brazilian Food Insecurity Scale and the two models, the differences were around 2 percentage points. In the sensitivity analysis, the short version of seven questions obtained 97.8% and 99.5% in the Pelotas sample and in the National Survey on Demography and Health sample, respectively, while specificity was 100% in both studies. The five-question model showed similar results (sensitivity of 95.7% and 99.5% in the Pelotas sample and in the National Survey on Demography and Health sample, respectively). In the Pelotas sample, the kappa test of the seven-question version totaled 97.0% and that of the five-question version, 95.0%. In the National Survey on Demography and Health sample, the two models presented a 99.0% kappa. CONCLUSIONS We suggest that the model with five questions should be used as the short version of the Brazilian Food Insecurity Scale, as its results were similar to the original scale with a lower number of questions. This version needs to be administered to other populations in Brazil in order to allow for the adequate assessment of the validity parameters.