872 results for Weights and measures.
Abstract:
Background The loose and stringent Asthma Predictive Indices (API), developed in Tucson, are popular rules to predict asthma in preschool children. To be clinically useful, they require validation in different settings. Objective To assess the predictive performance of the API in an independent population and compare it with simpler rules based only on preschool wheeze. Methods We studied 1954 children of the population-based Leicester Respiratory Cohort, followed up from age 1 to 10 years. The API and frequency of wheeze were assessed at age 3 years, and we determined their association with asthma at ages 7 and 10 years by using logistic regression. We computed test characteristics and measures of predictive performance to validate the API and compare it with simpler rules. Results The ability of the API to predict asthma in Leicester was comparable to that in Tucson: for the loose API, odds ratios for asthma at age 7 years were 5.2 in Leicester (5.5 in Tucson), and positive predictive values were 26% (26%). For the stringent API, these values were 8.2 (9.8) and 40% (48%). For the simpler rule 'early wheeze', corresponding values were 5.4 and 21%; for 'early frequent wheeze', 6.7 and 36%. The discriminative ability of all prediction rules was moderate (c statistic ≤ 0.7), and overall predictive performance was low (scaled Brier score < 20%). Conclusion The predictive performance of the API in Leicester, although comparable to that in the original study, was modest and similar to prediction based only on preschool wheeze. This highlights the need for better prediction rules.
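For readers unfamiliar with the measures reported above, the following is a minimal Python sketch, using purely hypothetical counts and probabilities rather than the cohort's data, of how a prediction rule's odds ratio, positive predictive value, and scaled Brier score can be computed.

```python
import numpy as np

# Hypothetical 2x2 table for a rule at age 3 vs asthma at school age:
#                 asthma   no asthma
# rule positive      tp        fp
# rule negative      fn        tn
tp, fp, fn, tn = 40, 60, 55, 845   # toy counts, not the cohort's data

odds_ratio = (tp * tn) / (fp * fn)   # strength of association
ppv = tp / (tp + fp)                 # positive predictive value

# Scaled Brier score = 1 - Brier / Brier_max, where Brier_max is the score of
# a non-informative model that always predicts the outcome prevalence.
y = np.array([1, 0, 0, 1, 0])              # observed outcomes (toy)
p = np.array([0.7, 0.2, 0.4, 0.6, 0.1])    # predicted probabilities (toy)
brier = np.mean((p - y) ** 2)
brier_max = y.mean() * (1 - y.mean())      # equals mean((prevalence - y)^2)
scaled_brier = 1 - brier / brier_max       # < 20% indicates low performance

print(f"OR={odds_ratio:.1f}  PPV={ppv:.0%}  scaled Brier={scaled_brier:.0%}")
```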
Abstract:
Victor Sazonov (Russia). Video Games and Aggression in Teenagers. Mr. Sazonov works as a psychologist at the Obninsk Linguistic College and worked on this research from July 1996 to June 1997. Mr. Sazonov conducted a survey of 200 tenth and eleventh graders in Moscow (94 boys and 106 girls), in which they were asked to estimate the total amount of time they spent playing video games each week and which games were the most popular. Aggression was also assessed using two measures, the first dealing with manifest physical aggression and the second with aggressive behavioural delinquency. The data collected showed that 62% of teenagers spend at least one hour a week playing video games, with 10% spending over seven hours on them. Girls tended to play less than boys (1.6 and 2.8 hours on average, respectively). Eight of the ten most popular games require the player to perform acts of a violent nature. Boys also scored higher on the index of aggressive behavioural delinquency, with a mean of 7.0 compared to 4.6 for girls. The results of the correlation analysis between time spent playing video games and measures of aggression were mixed. No relation was found between manifest physical aggression and time spent on the games, but in the case of aggressive behavioural delinquency the link was significant, which seems to indicate that aggressive teenagers spend more time playing video games. While the lack of a significant correlation between video games and manifest physical aggression suggests that video games may not in fact be as great a menace as their critics claim, Mr. Sazonov admits that these findings may be influenced by the high number of teenagers who do not play games at all or play relatively little. He also suggests that the abstract nature of the violence in games (often directed against aliens or spaceships) may make it less of a risk than the more realistic violence seen on television. In summary, however, he concludes that his results provide more support for theories holding that violent video games provide a stimulus to violent action than for those suggesting that they may help defuse violent tendencies.
Abstract:
The early detection of subjects with probable Alzheimer's disease (AD) is crucial for the effective application of treatment strategies. Here we explored the ability of a multitude of linear and non-linear classification algorithms to discriminate between the electroencephalograms (EEGs) of patients with varying degrees of AD and their age-matched control subjects. Absolute and relative spectral power, distribution of spectral power, and measures of spatial synchronization were calculated from recordings of resting eyes-closed continuous EEGs of 45 healthy controls, 116 patients with mild AD and 81 patients with moderate AD, recruited in two different centers (Stockholm, New York). The applied classification algorithms were: principal component linear discriminant analysis (PC LDA), partial least squares LDA (PLS LDA), principal component logistic regression (PC LR), partial least squares logistic regression (PLS LR), bagging, random forest, support vector machines (SVM) and feed-forward neural network. Based on 10-fold cross-validation runs, it could be demonstrated that although modern computer-intensive classification algorithms such as random forests, SVM and neural networks showed a slight superiority, the more classical classification algorithms performed nearly equally well. Using random forest classification, a sensitivity of up to 85% and a specificity of 78% were reached even for the detection of only mild AD patients, whereas for the comparison of moderate AD vs. controls, SVM and neural networks achieved a sensitivity of 89% and a specificity of 88%. Such performance demonstrates the value of these classification algorithms for clinical diagnostics.
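As a rough illustration of the evaluation scheme described, and not the authors' actual pipeline, here is a sketch of 10-fold cross-validated classification with random forest and SVM using scikit-learn; the feature matrix and labels are randomly generated stand-ins for the EEG-derived measures.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import confusion_matrix

# Toy stand-in: rows = subjects, columns = EEG measures (spectral power,
# synchronization); labels: 1 = mild AD (n=116), 0 = control (n=45).
rng = np.random.default_rng(0)
X = rng.normal(size=(161, 20))
y = np.array([1] * 116 + [0] * 45)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for clf in (RandomForestClassifier(random_state=0), SVC()):
    y_hat = cross_val_predict(clf, X, y, cv=cv)   # out-of-fold predictions
    tn, fp, fn, tp = confusion_matrix(y, y_hat).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    print(f"{clf.__class__.__name__}: sensitivity={sens:.2f} specificity={spec:.2f}")
```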
Abstract:
OBJECTIVE: To determine the association between the 3-dimensional (3-D) motion pattern of the caudal lumbar and lumbosacral portions of the canine vertebral column and the morphology of vertebrae, facet joints, and intervertebral disks. SAMPLE POPULATION: Vertebral columns of 9 German Shepherd Dogs and 16 dogs of other breeds with similar body weights and body conditions. PROCEDURE: Different morphometric parameters of the vertebral column were assessed by computed tomography (CT) and magnetic resonance imaging. Anatomic conformation and the 3-D motion pattern were compared, and correlation coefficients were calculated. RESULTS: Total range of motion for flexion and extension was mainly associated with the facet joint angle, the facet joint angle difference between levels of the vertebral column in the transverse plane on CT images, disk height, and lever arm length. CONCLUSIONS AND CLINICAL RELEVANCE: Motion is a complex process that is influenced by the entire 3-D conformation of the lumbar portion of the vertebral column. In vivo dynamic measurements of the 3-D motion pattern of the lumbar and lumbosacral portions of the vertebral column will be necessary to further assess biomechanics that could lead to disk degeneration in dogs.
Abstract:
Heart rate variability (HRV) and cardiorespiratory coordination, i.e. the temporal interplay between oscillations of heartbeat and respiration, reflect information related to the cardiovascular and autonomic nervous systems. The purpose of this study was to investigate the relationship between spectral measures of HRV and measures of cardiorespiratory coordination. In 127 subjects from a normal population, a 24 h Holter ECG was recorded. Average heart rate (HR) and the following HRV parameters were calculated: very low (VLF), low (LF) and high frequency (HF) oscillations and LF/HF. Cardiorespiratory coordination was quantified using average respiratory rate (RespR), the ratio of heart rate to respiratory rate (HRR), the phase coordination ratio (PCR) and the extent of cardiorespiratory coordination (PP). Pearson's correlation coefficient r was used to quantify the relationship between each pair of variables across all subjects. HR and HRR showed the strongest correlation during daytime (r = 0.89). LF/HF and PP showed a reasonably strong negative correlation (r = -0.69). During nighttime sleep these correlations decreased, whereas the correlations between HRR and RespR (r = -0.47) and between HRR and PCR (r = 0.73) increased substantially. In conclusion, HRR and PCR deliver considerably different information compared with HRV measures, whereas PP is partially and inversely linked to LF/HF.
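A minimal sketch of the kind of across-subject correlation analysis described, assuming hypothetical per-subject averages rather than the study's Holter-derived values:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical daytime averages for 127 subjects (toy data only).
rng = np.random.default_rng(1)
hr = rng.normal(75, 8, 127)     # average heart rate, beats/min
resp = rng.normal(15, 2, 127)   # average respiratory rate, breaths/min
hrr = hr / resp                 # ratio of heart rate to respiratory rate

# Correlate a pair of measures across subjects, as in the study.
r, p = pearsonr(hr, hrr)
print(f"HR vs HRR: r={r:.2f} (p={p:.3g})")
```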
Abstract:
About one-sixth of the world’s land area, that is, about one-third of the land used for agriculture, has been affected by soil degradation in the historic past. While most of this damage was caused by water and wind erosion, other forms of soil degradation are induced by biological, chemical, and physical processes. Since the 1950s, pressure on agricultural land has increased considerably owing to population growth and agricultural modernization. Small-scale farming is the largest occupation in the world, involving over 2.5 billion people, over 70% of whom live below the poverty line. Soil erosion, along with other environmental threats, particularly affects these farmers by diminishing yields that are primarily used for subsistence. Soil and water conservation measures have been developed and applied on many farms. Local and science-based innovations are available for most agroecological conditions and land management and farming types. Principles and measures developed for small-scale as well as modern agricultural systems have begun to show positive impacts in most regions of the world, particularly in wealthier states and modern systems. Much more emphasis still needs to be given to small-scale farming, which requires external support for investment in sustainable land management technologies as an indispensable and integral component of farm activities.
Abstract:
Animal production, hay production and feeding, winter forage composition changes, and summer pasture yields and nutrient composition of a year-round grazing system for spring-calving and fall-calving cows were compared to those of a conventional, minimal land system. Cows in the year-round and minimal land systems grazed forage from smooth bromegrass–orchardgrass–birdsfoot trefoil (SB-O-T) pastures at 1.67 and 3.33 acres, respectively, per cow in the summer. During the summer, SB-O-T pastures in the year-round grazing system also were grazed by stockers at 1.67 stockers per acre, and spring-calving and fall-calving cows grazed smooth bromegrass–red clover (SB-RC) and endophyte-free tall fescue–red clover (TF-RC) at 2.5 acres per cow for approximately 45 days in midsummer. In the year-round grazing system, spring-calving cows grazed corn crop residues at 2.5 acres per cow and stockpiled SB-RC pastures at 2.5 acres per cow; fall-calving cows grazed stockpiled TF-RC pastures at 2.5 acres per cow during winter. In the minimal land system, in winter, cows were maintained in a drylot on first-cutting hay harvested from 62.5–75% of the pasture acres during summer. Hay was fed to maintain a body condition score of 5 on a 9-point scale for spring-calving cows in both systems and a body condition score of 3 for fall-calving cows in the year-round system. Over 3 years, mean body weights of fall-calving cows in the year-round system did not differ from the body weights of spring-calving cows in either system, but fall-calving cows had higher (P < .05) body condition scores compared to spring-calving cows in either system. There were no differences among all groups of cows in body condition score changes over the winter grazing season (P > .05). During the summer grazing season, fall-calving cows in the year-round system and spring-calving cows in the minimal land system gained more body condition and more weight (P < .05) than spring-calving cows in the year-round grazing system. Fall calves in the year-round system had higher birth weights, lower weaning weights, and lower average preweaning daily gains compared to either group of spring calves (P < .05). However, there were no significant differences for birth weights, weaning weights, or average pre-weaning daily gains between spring calves in either system over the 3-year experiment (P > .05). The amount of total growing animal production (calves and stockers) per acre for each system did not differ in any year (P > .05). Over the 3-year experiment, 1.9 tons more hay was fed per cow and 1 ton more hay was fed per cow–calf pair in the minimal land system compared to the year-round grazing system (P < .05).
Abstract:
In the field of thrombosis and haemostasis, many preanalytical variables influence the results of coagulation assays, and measures should be taken to limit potential variations in results. To our knowledge, no paper describing the development and maintenance of a haemostasis biobank has previously been published. Our description of the biobank of the Swiss cohort of elderly patients with venous thromboembolism (SWITCO65+) is intended to facilitate the set-up of other biobanks in the field of thrombosis and haemostasis. SWITCO65+ is a multicentre cohort that prospectively enrolled consecutive patients aged ≥65 years with venous thromboembolism at nine Swiss hospitals from 09/2009 to 03/2012. Patients will be followed up until December 2013. The cohort includes a biobank with biological material from each participant taken at baseline and after 12 months of follow-up. Whole blood from all participants is assayed with a standard haematology panel, for which fresh samples are required. Two buffy coat vials, one PAXgene Blood RNA System tube and one EDTA-whole blood sample are also collected at baseline for RNA/DNA extraction. Blood samples are processed and vialed within 1 h of collection and transported in batches to a central laboratory, where they are stored in ultra-low temperature archives. All analyses of the same type are performed in the same laboratory in batches. Using multiple core laboratories increased the speed of sample analyses and reduced storage time. After recruiting, processing and analyzing the blood of more than 1,000 patients, we determined that the adopted methods and technologies were fit-for-purpose and robust.
Abstract:
IMPORTANCE This study addresses the value of patients' reported symptoms as markers of tumor recurrence after definitive therapy for head and neck squamous cell carcinoma. OBJECTIVE To evaluate the correlation between patients' symptoms and objective findings in the diagnosis of local and/or regional recurrences of head and neck squamous cell carcinomas in the first 2 years of follow-up. DESIGN Retrospective single-institution study of a prospectively collected database. SETTING Regional hospital. PARTICIPANTS We reviewed the clinical records of patients treated for oral cavity, oropharyngeal, laryngeal, and hypopharyngeal carcinomas between January 1, 2008, and December 31, 2009, with a minimum follow-up of 2 years. MAIN OUTCOMES AND MEASURES Correlation between symptoms and oncologic status (recurrence vs remission) in the posttreatment period. RESULTS Of the 101 patients included, 30 had recurrences. Pain, odynophagia, and dysphonia were independently correlated with recurrence (odds ratios, 16.07, 11.20, and 5.90, respectively; P < .001). New-onset symptoms had the best correlation with recurrences. Correlation was better between 6 and 12 months and between 18 and 21 months after therapy, and in patients initially treated unimodally (P < .05). Primary stage and tumor site had no effect. CONCLUSIONS AND RELEVANCE The correlation between symptoms and oncologic status is low during substantial periods within the first 2 years of follow-up. New-onset symptoms, particularly pain, odynophagia, or dysphonia, correlate better with tumor recurrence, especially in patients treated unimodally.
Abstract:
Health care providers face the problem of trying to make decisions with inadequate information and also with an overload of (often contradictory) information. Physicians often choose treatment long before they know which disease is present. Indeed, uncertainty is intrinsic to the practice of medicine. Decision analysis can help physicians structure and work through a medical decision problem, and can provide reassurance that decisions are rational and consistent with the beliefs and preferences of other physicians and patients. The primary purpose of this research project is to develop the theory, methods, techniques and tools necessary for designing and implementing a system to support solving medical decision problems. A case study involving "abdominal pain" serves as a prototype for implementing the system. The research, however, focuses on a generic class of problems and aims at covering theoretical as well as practical aspects of the system developed. The main contributions of this research are: (1) bridging the gap between the statistical approach and the knowledge-based (expert) approach to medical decision making; (2) linking a collection of methods, techniques and tools together to allow for the design of a medical decision support system, based on a framework that involves the Analytic Network Process (ANP), the generalization of the Analytic Hierarchy Process (AHP) to dependence and feedback, for problems involving diagnosis and treatment; (3) enhancing the representation and manipulation of uncertainty in the ANP framework by incorporating group consensus weights; and (4) developing a computer program to assist in the implementation of the system.
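As background on the framework mentioned, the sketch below shows the AHP special case (the ANP generalizes it with a supermatrix capturing dependence and feedback): priority weights are derived from a reciprocal pairwise-comparison matrix via its principal eigenvector, with Saaty's consistency ratio as a sanity check. The matrix entries are hypothetical judgments, not taken from the dissertation.

```python
import numpy as np

# Reciprocal pairwise-comparison matrix over three hypothetical diagnoses:
# A[i, j] = how strongly diagnosis i is preferred over diagnosis j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority weights

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3.
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```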
Abstract:
Metabolomics is the global and unbiased survey of the complement of small molecules (say, <1 kDa) in a biofluid, tissue, organ or organism, and measures the end-products of the cellular metabolism of both endogenous and exogenous substrates. Many drug candidates fail during Phase II and III clinical trials at an enormous cost to the pharmaceutical industry in terms of both time lost and financial resources. The constantly evolving model of drug development now dictates that biomarkers should be employed in preclinical development for the early detection of likely-to-fail candidates. Biomarkers may also be useful in the preselection of patients and through the subclassification of diseases in clinical drug development. Here we show with examples how metabolomics can assist in the preclinical development phases of discovery, pharmacology, toxicology, and ADME. Although not yet established as a clinical trial patient prescreening procedure, metabolomics shows considerable promise in this regard. We can be certain that metabolomics will join genomics and transcriptomics in lubricating the wheels of clinical drug development in the near future.
Abstract:
IMPORTANCE The discontinuation of randomized clinical trials (RCTs) raises ethical concerns and often wastes scarce research resources. The epidemiology of discontinued RCTs, however, remains unclear. OBJECTIVES To determine the prevalence, characteristics, and publication history of discontinued RCTs and to investigate factors associated with RCT discontinuation due to poor recruitment and with nonpublication. DESIGN AND SETTING Retrospective cohort of RCTs based on archived protocols approved by 6 research ethics committees in Switzerland, Germany, and Canada between 2000 and 2003. We recorded trial characteristics and planned recruitment from included protocols. Last follow-up of RCTs was April 27, 2013. MAIN OUTCOMES AND MEASURES Completion status, reported reasons for discontinuation, and publication status of RCTs as determined by correspondence with the research ethics committees, literature searches, and investigator surveys. RESULTS After a median follow-up of 11.6 years (range, 8.8-12.6 years), 253 of 1017 included RCTs were discontinued (24.9% [95% CI, 22.3%-27.6%]). Only 96 of 253 discontinuations (37.9% [95% CI, 32.0%-44.3%]) were reported to ethics committees. The most frequent reason for discontinuation was poor recruitment (101/1017; 9.9% [95% CI, 8.2%-12.0%]). In multivariable analysis, industry sponsorship vs investigator sponsorship (8.4% vs 26.5%; odds ratio [OR], 0.25 [95% CI, 0.15-0.43]; P < .001) and a larger planned sample size in increments of 100 (-0.7%; OR, 0.96 [95% CI, 0.92-1.00]; P = .04) were associated with lower rates of discontinuation due to poor recruitment. Discontinued trials were more likely to remain unpublished than completed trials (55.1% vs 33.6%; OR, 3.19 [95% CI, 2.29-4.43]; P < .001). CONCLUSIONS AND RELEVANCE In this sample of trials based on RCT protocols from 6 research ethics committees, discontinuation was common, with poor recruitment being the most frequently reported reason. Greater efforts are needed to ensure the reporting of trial discontinuation to research ethics committees and the publication of results of discontinued trials.
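For orientation, a crude unadjusted odds ratio with a Woolf (log-method) 95% CI can be computed from 2×2 counts as sketched below. The counts are back-calculated approximations from the reported percentages, and the paper's OR of 3.19 comes from a model adjusting for other factors, which this simple calculation does not reproduce.

```python
import math

# Approximate 2x2 counts for nonpublication (discontinued vs completed trials),
# reconstructed from the reported 55.1% of 253 and 33.6% of 764.
a, b = 139, 114   # discontinued: unpublished, published
c, d = 257, 507   # completed:   unpublished, published

or_crude = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # Woolf SE of log(OR)
lo, hi = (math.exp(math.log(or_crude) + z * se_log_or) for z in (-1.96, 1.96))
# Prints roughly 2.4; the reported 3.19 is from a multivariable model.
print(f"crude OR = {or_crude:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```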
Abstract:
Over the last 20 years, health literacy (German: Gesundheitskompetenz/health competency) has become a popular concept in research and health policy. Initially defined as an individual's ability to understand medical information, health literacy has quickly been redefined more broadly to describe individual-based resources for actions or conduct relevant to health in different socio-cultural or clinical contexts. Today, researchers and practice experts can draw on a wide variety of definitions and measurements. This article provides an overview of the definitions, briefly introduces the "structure and agency" approach as an example of theorizing health literacy, and shows different types of operationalization. The article presents the strengths and shortcomings of the available concepts and measures and provides starting points for future research in public health and health promotion.
Abstract:
IMPORTANCE Associations between subclinical thyroid dysfunction and fractures are unclear, and clinical trials are lacking. OBJECTIVE To assess the association of subclinical thyroid dysfunction with hip, nonspine, spine, or any fractures. DATA SOURCES AND STUDY SELECTION The databases of MEDLINE and EMBASE (inception to March 26, 2015) were searched without language restrictions for prospective cohort studies with thyroid function data and subsequent fractures. DATA EXTRACTION Individual participant data were obtained from 13 prospective cohorts in the United States, Europe, Australia, and Japan. Levels of thyroid function were defined as euthyroidism (thyroid-stimulating hormone [TSH], 0.45-4.49 mIU/L), subclinical hyperthyroidism (TSH <0.45 mIU/L), and subclinical hypothyroidism (TSH ≥4.50-19.99 mIU/L) with normal thyroxine concentrations. MAIN OUTCOMES AND MEASURES The primary outcome was hip fracture. Any fractures, nonspine fractures, and clinical spine fractures were secondary outcomes. RESULTS Among 70,298 participants, 4092 (5.8%) had subclinical hypothyroidism and 2219 (3.2%) had subclinical hyperthyroidism. During 762,401 person-years of follow-up, hip fracture occurred in 2975 participants (4.6%; 12 studies), any fracture in 2528 participants (9.0%; 8 studies), nonspine fracture in 2018 participants (8.4%; 8 studies), and spine fracture in 296 participants (1.3%; 6 studies). In age- and sex-adjusted analyses, the hazard ratio (HR) for subclinical hyperthyroidism vs euthyroidism was 1.36 for hip fracture (95% CI, 1.13-1.64; 146 events in 2082 participants vs 2534 in 56,471); for any fracture, HR was 1.28 (95% CI, 1.06-1.53; 121 events in 888 participants vs 2203 in 25,901); for nonspine fracture, HR was 1.16 (95% CI, 0.95-1.41; 107 events in 946 participants vs 1745 in 21,722); and for spine fracture, HR was 1.51 (95% CI, 0.93-2.45; 17 events in 732 participants vs 255 in 20,328). Lower TSH was associated with higher fracture rates: for TSH of less than 0.10 mIU/L, HR was 1.61 for hip fracture (95% CI, 1.21-2.15; 47 events in 510 participants); for any fracture, HR was 1.98 (95% CI, 1.41-2.78; 44 events in 212 participants); for nonspine fracture, HR was 1.61 (95% CI, 0.96-2.71; 32 events in 185 participants); and for spine fracture, HR was 3.57 (95% CI, 1.88-6.78; 8 events in 162 participants). Risks were similar after adjustment for other fracture risk factors. Endogenous subclinical hyperthyroidism (excluding thyroid medication users) was associated with HRs of 1.52 (95% CI, 1.19-1.93) for hip fracture, 1.42 (95% CI, 1.16-1.74) for any fracture, and 1.74 (95% CI, 1.01-2.99) for spine fracture. No association was found between subclinical hypothyroidism and fracture risk. CONCLUSIONS AND RELEVANCE Subclinical hyperthyroidism was associated with an increased risk of hip and other fractures, particularly among those with TSH levels of less than 0.10 mIU/L and those with endogenous subclinical hyperthyroidism. Further study is needed to determine whether treating subclinical hyperthyroidism can prevent fractures.
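The hazard ratios above come from Cox proportional hazards models fit to participant-level data. As a hypothetical sketch only, assuming the third-party lifelines library (not something the study states it used) and random stand-in data rather than the pooled cohorts, an age- and sex-adjusted model might look like:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumption: lifelines is installed

# Toy data standing in for pooled participant-level records.
rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "years": rng.exponential(10.0, n),        # follow-up time per participant
    "hip_fracture": rng.integers(0, 2, n),    # event indicator (toy)
    "subclin_hyper": rng.integers(0, 2, n),   # exposure vs euthyroid reference
    "age": rng.normal(70, 8, n),
    "female": rng.integers(0, 2, n),
})

# Fit the Cox model; exp(coef) for subclin_hyper is the adjusted hazard ratio.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="hip_fracture")
print(cph.summary.loc["subclin_hyper", "exp(coef)"])
```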
Abstract:
Studies assessing citizens' attitudes towards Europe have mostly used explicit concepts and measures. However, psychologists have shown that human behaviour is determined not only by explicit attitudes, which can be assessed via self-report, but also by implicit attitudes, which require indirect measurement. We combine a self-report questionnaire with an implicit Affective Misattribution Procedure, for the first time in an online environment, to estimate the reliability, validity and predictive power of this implicit measure for the explanation of European Union-skeptical behaviour. Based on a survey with a sample representative of Germany, we found evidence for good reliability and validity of the implicit measure. In addition, the implicit attitude had a significant incremental impact, beyond explicit attitudes, on citizens' proneness to engage in EU-skeptical information and voting behaviour.